Long-Term Stability of Radio Sources in VLBI Analysis
NASA Technical Reports Server (NTRS)
Engelhardt, Gerald; Thorandt, Volkmar
2010-01-01
Positional stability of radio sources is an important requirement for modeling only one source position over the complete VLBI data span, which presently exceeds 20 years. The stability of radio sources can be verified by analyzing time series of radio source coordinates. One approach is a statistical test for normal distribution of the residuals to the weighted mean for each radio source component of the time series. Systematic phenomena in the time series can thus be detected. Nevertheless, an inspection of the rate estimates and of the weighted root-mean-square (WRMS) variations about the mean is also necessary. On the basis of the time series computed by the BKG group in the frame of the ICRF2 working group, 226 stable radio sources with an axis stability of 10 μas could be identified. They include 100 ICRF2 axes-defining sources, which were determined independently of the method applied in the ICRF2 working group. A further 29 stable radio sources with a source structure index of less than 3.0 can also be used to augment the 295 ICRF2 defining sources.
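The stability check described above combines residuals about a weighted mean, a normality test on those residuals, and the WRMS. A minimal sketch follows; the Shapiro-Wilk test, the synthetic data, and the 0.05 threshold are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical time series of one source coordinate component (offsets in mas)
t = np.linspace(2000.0, 2010.0, 200)
coord = 0.05 * rng.standard_normal(200)   # position offsets
sigma = np.full(200, 0.05)                # formal errors

# Weighted mean and residuals about it
w = 1.0 / sigma**2
wmean = np.sum(w * coord) / np.sum(w)
resid = coord - wmean

# WRMS variation about the weighted mean
wrms = np.sqrt(np.sum(w * resid**2) / np.sum(w))

# Normality test on the residuals: a low p-value flags systematics
stat, p = stats.shapiro(resid)
is_stable = p > 0.05
```

A real analysis would additionally estimate a linear rate per component, as the abstract notes.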
Developing a comprehensive time series of GDP per capita for 210 countries from 1950 to 2015
2012-01-01
Background Income has been extensively studied and utilized as a determinant of health. There are several sources of income expressed as gross domestic product (GDP) per capita, but there are no time series that are complete for the years between 1950 and 2015 for the 210 countries for which data exist. It is in the interest of population health research to establish a global time series that is complete from 1950 to 2015. Methods We collected GDP per capita estimates expressed in either constant US dollar terms or international dollar terms (corrected for purchasing power parity) from seven sources. We applied several stages of models, including ordinary least-squares regressions and mixed effects models, to complete each of the seven source series from 1950 to 2015. The three US dollar and four international dollar series were each averaged to produce two new GDP per capita series. Results and discussion Nine complete series from 1950 to 2015 for 210 countries are available for use. These series can serve various analytical purposes and can illustrate myriad economic trends and features. The derivation of the two new series allows researchers to avoid any series-specific biases that may exist. The modeling approach used is flexible and will allow for yearly updating as new estimates are produced by the source series. Conclusion GDP per capita is a necessary tool in population health research, and our development and implementation of a new method have produced the most comprehensive time series known to date. PMID:22846561
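One step of the series-completion approach described above — an OLS regression against a complete reference series, fitted on the overlapping years and used to back-cast missing years — might look like this sketch. The reference series, growth rates, and noise levels are all hypothetical, and the published method also involves mixed effects models not shown here.

```python
import numpy as np

rng = np.random.default_rng(9)

years = np.arange(1950, 2016)
# Hypothetical complete reference GDP-per-capita series, and a second source
# observed only from 1980 onward
ref = 1000 * np.exp(0.02 * (years - 1950)) * np.exp(0.05 * rng.standard_normal(len(years)))
obs_mask = years >= 1980
partial = 1.3 * ref * np.exp(0.03 * rng.standard_normal(len(years)))
partial[~obs_mask] = np.nan

# OLS in log space on the overlap, then back-cast the missing years
X = np.column_stack([np.ones(obs_mask.sum()), np.log(ref[obs_mask])])
beta, *_ = np.linalg.lstsq(X, np.log(partial[obs_mask]), rcond=None)
filled = partial.copy()
filled[~obs_mask] = np.exp(beta[0] + beta[1] * np.log(ref[~obs_mask]))
```

Working in log space keeps the back-cast multiplicative, which is the natural scale for GDP per capita.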
NASA Astrophysics Data System (ADS)
Gattano, C.; Lambert, S.; Bizouard, C.
2017-12-01
In the context of selecting sources to define the celestial reference frame, we compute astrometric time series of all VLBI radio sources from observations in the International VLBI Service database. The time series are then analyzed with the Allan variance in order to estimate astrometric stability. From these results, we establish a new classification that takes into account information across all time scales. The algorithm is flexible in its definition of a "stable source" through an adjustable threshold.
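The Allan variance used above distinguishes noise types by how scatter behaves under increasing averaging time: for white noise it falls as 1/tau, while drifting (unstable) series show growth at long tau. A minimal non-overlapping implementation on synthetic data (not the authors' code):

```python
import numpy as np

def allan_variance(x, tau):
    """Non-overlapping Allan variance of series x at averaging length tau (samples)."""
    n = len(x) // tau
    means = x[:n * tau].reshape(n, tau).mean(axis=1)   # tau-averaged sub-series
    return 0.5 * np.mean(np.diff(means) ** 2)

rng = np.random.default_rng(1)
white = rng.standard_normal(4096)       # white noise: AVAR falls as 1/tau
drift = 0.1 * np.arange(4096)           # linear drift: AVAR grows with tau

avar_white = [allan_variance(white, m) for m in (1, 4, 16, 64)]
avar_drift = [allan_variance(white + drift, m) for m in (1, 4, 16, 64)]
```

Classifying a source then amounts to inspecting the slope of the Allan variance across tau and comparing it to a chosen stability threshold.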
About the Modeling of Radio Source Time Series as Linear Splines
NASA Astrophysics Data System (ADS)
Karbon, Maria; Heinkelmann, Robert; Mora-Diaz, Julian; Xu, Minghui; Nilsson, Tobias; Schuh, Harald
2016-12-01
Many of the time series of radio sources observed in geodetic VLBI show variations, caused mainly by changes in source structure. However, until now it has been common practice to consider source positions as invariant, or to exclude known misbehaving sources from the datum conditions. This may lead to a degradation of the estimated parameters, as unmodeled apparent source position variations can propagate to the other parameters through the least squares adjustment. In this paper we will introduce an automated algorithm capable of parameterizing the radio source coordinates as linear splines.
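Parameterizing a source coordinate as a linear spline means fitting connected straight-line segments between knots. A minimal least-squares sketch with a single fixed knot (the paper's algorithm places knots automatically; the data, knot location, and noise level here are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical right-ascension offsets (mas) with a slope change at t = 5
t = np.linspace(0.0, 10.0, 100)
truth = np.where(t < 5.0, 0.1 * t, 0.5 + 0.3 * (t - 5.0))
y = truth + 0.02 * rng.standard_normal(100)

# Linear spline with one knot at t = 5: design matrix [1, t, (t - 5)_+]
knot = 5.0
X = np.column_stack([np.ones_like(t), t, np.clip(t - knot, 0.0, None)])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
fit = X @ coef

# coef[1] is the slope before the knot; coef[1] + coef[2] is the slope after
```

The truncated-power basis (t - knot)_+ guarantees the fitted segments join continuously at the knot, which is the defining property of a linear spline.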
Simulation of time series by distorted Gaussian processes
NASA Technical Reports Server (NTRS)
Greenhall, C. A.
1977-01-01
A distorted stationary Gaussian process can be used to provide computer-generated imitations of experimental time series. A method of analyzing a source time series and synthesizing an imitation is shown, and an example using X-band radiometer data is given.
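One common reading of "distorted Gaussian process" is a memoryless transformation of a stationary Gaussian series so that the output keeps the correlation structure but acquires a non-Gaussian marginal. A sketch of that idea (the AR(1) model, the exponential target marginal, and all parameters are illustrative assumptions, not the report's method):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Stationary Gaussian AR(1) source process with unit variance
n, phi = 10000, 0.9
g = np.empty(n)
g[0] = rng.standard_normal()
for i in range(1, n):
    g[i] = phi * g[i - 1] + np.sqrt(1 - phi**2) * rng.standard_normal()

# Memoryless distortion: map Gaussian marginals onto a skewed (exponential)
# marginal; rank correlations of the source process are preserved
u = stats.norm.cdf(g)     # uniform marginals
x = stats.expon.ppf(u)    # distorted, non-Gaussian imitation series
```

Fitting the AR coefficients and the target marginal to a measured series (e.g. radiometer data) gives the "analyze then synthesize" workflow the abstract describes.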
NASA Astrophysics Data System (ADS)
Kelly, Brandon C.; Hughes, Philip A.; Aller, Hugh D.; Aller, Margo F.
2003-07-01
We introduce an algorithm for applying a cross-wavelet transform to analysis of quasi-periodic variations in a time series and introduce significance tests for the technique. We apply a continuous wavelet transform and the cross-wavelet algorithm to the Pearson-Readhead VLBI survey sources using data obtained from the University of Michigan 26 m paraboloid at observing frequencies of 14.5, 8.0, and 4.8 GHz. Thirty of the 62 sources were chosen to have sufficient data for analysis, having at least 100 data points for a given time series. Of these 30 sources, a little more than half exhibited evidence for quasi-periodic behavior in at least one observing frequency, with a mean characteristic period of 2.4 yr and standard deviation of 1.3 yr. We find that out of the 30 sources, there were about four timescales for every 10 time series, and about half of those sources showing quasi-periodic behavior repeated the behavior in at least one other observing frequency.
Gass, Katherine; Balachandran, Sivaraman; Chang, Howard H.; Russell, Armistead G.; Strickland, Matthew J.
2015-01-01
Epidemiologic studies utilizing source apportionment (SA) of fine particulate matter have shown that particles from certain sources might be more detrimental to health than others; however, it is difficult to quantify the uncertainty associated with a given SA approach. In the present study, we examined associations between source contributions of fine particulate matter and emergency department visits for pediatric asthma in Atlanta, Georgia (2002–2010) using a novel ensemble-based SA technique. Six daily source contributions from 4 SA approaches were combined into an ensemble source contribution. To better account for exposure uncertainty, 10 source profiles were sampled from their posterior distributions, resulting in 10 time series with daily SA concentrations. For each of these time series, Poisson generalized linear models with varying lag structures were used to estimate the health associations for the 6 sources. The rate ratios for the source-specific health associations from the 10 imputed source contribution time series were combined, resulting in health associations with inflated confidence intervals to better account for exposure uncertainty. Adverse associations with pediatric asthma were observed for 8-day exposure to particles generated from diesel-fueled vehicles (rate ratio = 1.06, 95% confidence interval: 1.01, 1.10) and gasoline-fueled vehicles (rate ratio = 1.10, 95% confidence interval: 1.04, 1.17). PMID:25776011
A scalable database model for multiparametric time series: a volcano observatory case study
NASA Astrophysics Data System (ADS)
Montalto, Placido; Aliotta, Marco; Cassisi, Carmelo; Prestifilippo, Michele; Cannata, Andrea
2014-05-01
The variables collected by a sensor network constitute a heterogeneous data source that needs to be properly organized in order to be used in research and geophysical monitoring. The term time series refers to a set of observations of a given phenomenon acquired sequentially in time; when the time intervals are equally spaced, one speaks of the sampling period or sampling frequency. Our work describes in detail a possible methodology for the storage and management of time series using a specific data structure. We designed a framework, hereinafter called TSDSystem (Time Series Database System), to acquire time series from different data sources and standardize them within a relational database. This standardization makes it possible to perform operations, such as querying and visualization, across many measures by synchronizing them on a common time scale. The proposed architecture follows a multiple-layer paradigm (Loaders layer, Database layer and Business Logic layer). Each layer is specialized in particular operations for the reorganization and archiving of data from different sources, such as ASCII, Excel, ODBC (Open DataBase Connectivity), and files accessible from the Internet (web pages, XML). In particular, the Loaders layer performs a security check of the working status of each running software component through a heartbeat system, in order to automate the discovery of acquisition issues and other warning conditions. Although the system has to manage huge amounts of data, performance is guaranteed by a smart table partitioning strategy that keeps the fraction of data stored in each database table balanced. TSDSystem also contains modules for the visualization of acquired data, which provide the ability to query different time series over a specified time range, or to follow signal acquisition in real time, according to a user data access policy.
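The synchronization step above — bringing heterogeneous sensor series onto a common time scale so they can be queried side by side — can be sketched with simple interpolation. The sensor values and the 0.5 s common grid are hypothetical; TSDSystem performs this inside a relational database rather than in memory.

```python
import numpy as np

# Two hypothetical sensor series sampled at different, irregular times
t_a = np.array([0.0, 1.1, 2.0, 3.2, 4.0])
v_a = np.array([10.0, 11.0, 12.0, 13.0, 14.0])
t_b = np.array([0.5, 1.5, 2.5, 3.5])
v_b = np.array([5.0, 6.0, 7.0, 8.0])

# Standardize onto a common, equally spaced time scale by linear interpolation
t_common = np.arange(0.0, 4.01, 0.5)
a_std = np.interp(t_common, t_a, v_a)
b_std = np.interp(t_common, t_b, v_b)

# Aligned measures, queryable side by side on one time axis
aligned = np.column_stack([t_common, a_std, b_std])
```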
A multidisciplinary database for geophysical time series management
NASA Astrophysics Data System (ADS)
Montalto, P.; Aliotta, M.; Cassisi, C.; Prestifilippo, M.; Cannata, A.
2013-12-01
The variables collected by a sensor network constitute a heterogeneous data source that needs to be properly organized in order to be used in research and geophysical monitoring. The term time series refers to a set of observations of a given phenomenon acquired sequentially in time; when the time intervals are equally spaced, one speaks of the sampling period or sampling frequency. Our work describes in detail a possible methodology for the storage and management of time series using a specific data structure. We designed a framework, hereinafter called TSDSystem (Time Series Database System), to acquire time series from different data sources and standardize them within a relational database. This standardization makes it possible to perform operations, such as querying and visualization, across many measures by synchronizing them on a common time scale. The proposed architecture follows a multiple-layer paradigm (Loaders layer, Database layer and Business Logic layer). Each layer is specialized in particular operations for the reorganization and archiving of data from different sources, such as ASCII, Excel, ODBC (Open DataBase Connectivity), and files accessible from the Internet (web pages, XML). In particular, the Loaders layer performs a security check of the working status of each running software component through a heartbeat system, in order to automate the discovery of acquisition issues and other warning conditions. Although the system has to manage huge amounts of data, performance is guaranteed by a smart table partitioning strategy that keeps the fraction of data stored in each database table balanced. TSDSystem also contains modules for the visualization of acquired data, which provide the ability to query different time series over a specified time range, or to follow signal acquisition in real time, according to a user data access policy.
Cohen, Michael X
2017-09-27
The number of simultaneously recorded electrodes in neuroscience is steadily increasing, providing new opportunities for understanding brain function, but also new challenges for appropriately dealing with the increase in dimensionality. Multivariate source separation analysis methods have been particularly effective at improving signal-to-noise ratio while reducing the dimensionality of the data and are widely used for cleaning, classifying and source-localizing multichannel neural time series data. Most source separation methods produce a spatial component (that is, a weighted combination of channels to produce one time series); here, this is extended to apply source separation to a time series, with the idea of obtaining a weighted combination of successive time points, such that the weights are optimized to satisfy some criteria. This is achieved via a two-stage source separation procedure, in which an optimal spatial filter is first constructed and then its optimal temporal basis function is computed. This second stage is achieved with a time-delay-embedding matrix, in which additional rows of a matrix are created from time-delayed versions of existing rows. The optimal spatial and temporal weights can be obtained by solving a generalized eigendecomposition of covariance matrices. The method is demonstrated in simulated data and in an empirical electroencephalogram study on theta-band activity during response conflict. Spatiotemporal source separation has several advantages, including defining empirical filters without the need to apply sinusoidal narrowband filters. © 2017 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.
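The two-stage procedure above can be sketched with a generalized eigendecomposition for the spatial filter and a time-delay-embedding matrix for the temporal filter. The simulated 10 Hz source, the shuffled-noise reference covariance, and the 50-sample embedding are toy stand-ins; the paper constructs its covariance matrices from task conditions and narrowband contrasts.

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(4)

# Simulated multichannel data: one 10 Hz source mixed into 8 channels plus noise
fs, n = 250, 5000
t = np.arange(n) / fs
source = np.sin(2 * np.pi * 10 * t)
mix = rng.standard_normal(8)
data = np.outer(mix, source) + 0.5 * rng.standard_normal((8, n))

# Stage 1: spatial filter from a generalized eigendecomposition of a
# "signal" covariance S against a "reference" covariance R (here a crude
# noise surrogate with matched per-channel variance)
S = np.cov(data)
noise = rng.standard_normal((8, n)) * data.std(axis=1, keepdims=True)
R = np.cov(noise)
evals, evecs = eigh(S, R)
w = evecs[:, -1]            # spatial filter with the largest eigenvalue
component = w @ data        # one spatially filtered time series

# Stage 2: time-delay embedding of the component; the dominant eigenvector
# of the embedded covariance is an empirically defined temporal filter
delays = 50
emb = np.array([component[d:n - delays + d] for d in range(delays)])
evals_t, evecs_t = np.linalg.eigh(np.cov(emb))
temporal_filter = evecs_t[:, -1]
filtered = temporal_filter @ emb
```

The temporal filter emerges from the data's own autocovariance, which is what lets the method avoid imposing sinusoidal narrowband filters.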
NASA Astrophysics Data System (ADS)
Escalas, M.; Queralt, P.; Ledo, J.; Marcuello, A.
2012-04-01
The magnetotelluric (MT) method is a passive electromagnetic technique currently used to characterize sites for the geological storage of CO2. These sites are usually located near industrialized, urban, or farming areas, where man-made electromagnetic (EM) signals contaminate the MT data. Identifying and characterizing the artificial EM sources that generate this so-called "cultural noise" is an important challenge in obtaining reliable results with the MT method. The polarization attributes of an EM signal (tilt angle, ellipticity, and phase difference between its orthogonal components) are related to the character of its source. In a previous work (Escalas et al. 2011), we proposed a method to distinguish natural signal from cultural noise in raw MT data. It is based on the polarization analysis of the MT time series in the time-frequency domain using a wavelet scheme. We developed an algorithm to implement the method and tested it with both synthetic and field data. In 2010, we carried out a controlled-source electromagnetic (CSEM) experiment at the Hontomín site (the Research Laboratory on Geological Storage of CO2 in Spain). MT time series were contaminated at different frequencies with the signal emitted by a controlled artificial EM source: two electric dipoles (1 km long, arranged in North-South and East-West directions). The analysis of the electric field time series acquired in this experiment with our algorithm was successful: the polarization attributes of both the natural and artificial signals were obtained in the time-frequency domain, highlighting their differences. In the present work, we have processed the magnetic field time series acquired in the Hontomín experiment. This new analysis of the polarization attributes of the magnetic field data has provided additional information for detecting the contribution of the artificial source in the measured data.
Moreover, the joint analysis of the polarization attributes of the electric and magnetic field has been crucial to fully characterize the properties and the location of the noise source. Escalas, M., Queralt, P., Ledo, J., Marcuello, A., 2011. Identification of cultural noise sources in magnetotelluric data: estimating polarization attributes in the time-frequency domain using wavelet analysis. Geophysical Research Abstracts Vol. 13, EGU2011-6085. EGU General Assembly 2011.
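The polarization attributes named above (tilt angle, ellipticity, phase difference) can be estimated from two orthogonal field components via their analytic signals. A broadband sketch on a synthetic elliptically polarized signal follows; the authors compute these attributes per wavelet scale in the time-frequency domain, and the particular formulas and test signal here are illustrative.

```python
import numpy as np
from scipy.signal import hilbert

fs, n = 100.0, 2000
t = np.arange(n) / fs

# Hypothetical orthogonal field components: elliptically polarized 5 Hz signal
ex = np.cos(2 * np.pi * 5 * t)
ey = 0.5 * np.sin(2 * np.pi * 5 * t)   # quarter-period lag -> ellipse

# Analytic signals give instantaneous amplitude and phase per component
ax_, ay_ = hilbert(ex), hilbert(ey)
phase_diff = np.angle(ay_ * np.conj(ax_))   # phase difference between components

# Tilt angle of the polarization ellipse
num = 2 * np.real(ax_ * np.conj(ay_))
den = np.abs(ax_) ** 2 - np.abs(ay_) ** 2
tilt = 0.5 * np.arctan2(num, den)

# Ellipticity (minor/major axis ratio, signed by rotation sense)
s = 2 * np.imag(ax_ * np.conj(ay_)) / (np.abs(ax_) ** 2 + np.abs(ay_) ** 2)
ellipticity = np.tan(0.5 * np.arcsin(np.clip(s, -1, 1)))
```

For this test signal the expected attributes are a phase difference of -90 degrees, zero tilt, and an ellipticity magnitude of 0.5; narrowband cultural noise typically shows up as strongly polarized patches in the time-frequency plane.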
2017-01-04
...configurations with a restrained manikin, was evaluated in four different test series. Test Series 1 was conducted to determine the materials and...5 ms TTP. Test Series 2 was conducted to determine the materials and drop heights required for energy attenuation of the seat pan to generate a 4 m
75 FR 22779 - FIFRA Scientific Advisory Panel; Notice of Public Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-30
... exposure factors to generate time series of exposure for simulated individuals. One-stage or two-stage...., built-in simple source-to- concentration module, user-entered time series from other models or field... FOR FURTHER INFORMATION CONTACT at least 10 days prior to the meeting to give EPA as much time as...
Development and methods for an open-sourced data visualization tool
USDA-ARS?s Scientific Manuscript database
This paper presents an open source on-demand web tool specifically aimed at scientists and researchers who are not experts in converting time series data into a time surface visualization. Similar to a GIS environment, the time surface shows time on two axes: time of day vs. day of year...
NASA Astrophysics Data System (ADS)
Reimer, Janet J.; Cai, Wei-Jun; Xue, Liang; Vargas, Rodrigo; Noakes, Scott; Hu, Xinping; Signorini, Sergio R.; Mathis, Jeremy T.; Feely, Richard A.; Sutton, Adrienne J.; Sabine, Christopher; Musielewicz, Sylvia; Chen, Baoshan; Wanninkhof, Rik
2017-08-01
Marine carbonate system monitoring programs often consist of multiple observational methods that include underway cruise data, moored autonomous time series, and discrete water bottle samples. Monitored parameters include all, or some, of the following: partial pressure of CO2 of the water (pCO2w) and air, dissolved inorganic carbon (DIC), total alkalinity (TA), and pH. Any combination of at least two of the aforementioned parameters can be used to calculate the others. In this study at the Gray's Reef (GR) mooring in the South Atlantic Bight (SAB) we: examine the internal consistency of pCO2w from underway cruise data, the moored autonomous time series, and values calculated from bottle samples (DIC-TA pairing); describe the seasonal to interannual pCO2w time series variability and air-sea flux (FCO2), as well as the potential sources of pCO2w variability; and determine whether the site is a source or sink for atmospheric CO2. Over the 8.5 years of the GR mooring time series, mooring-underway and mooring-bottle-calculated pCO2w correlate strongly, with r-values > 0.90. The pCO2w and FCO2 time series follow seasonal thermal patterns; however, seasonal non-thermal processes, such as terrestrial export, net biological production, and air-sea exchange, also influence variability. The linear trend of pCO2w over the time series is an increase of 5.2 ± 1.4 μatm y-1, with FCO2 increasing by 51-70 mmol m-2 y-1. The net FCO2 sign can switch interannually, with the magnitude varying greatly. Non-thermal pCO2w is also increasing over the time series, likely indicating that terrestrial export and net biological processes drive the long-term pCO2w increase.
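Extracting a secular trend like the 5.2 μatm y-1 figure above from a seasonal series can be sketched as a linear fit to a synthetic record; the seasonal amplitude, noise level, and monthly sampling here are hypothetical, and the published analysis also separates thermal from non-thermal contributions.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical monthly pCO2w series: seasonal cycle + secular trend + noise
years = 2006 + np.arange(102) / 12.0           # ~8.5 years of monthly data
pco2 = (380.0 + 5.2 * (years - 2006)
        + 15 * np.sin(2 * np.pi * years)       # seasonal thermal cycle
        + 3.0 * rng.standard_normal(years.size))

# Linear slope of the time series, in uatm per year
slope, intercept = np.polyfit(years - 2006, pco2, 1)
```

Because the record spans many full seasonal cycles, the seasonal term contributes little bias to the fitted slope.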
NASA Astrophysics Data System (ADS)
Lundgren, Paul; Nikkhoo, Mehdi; Samsonov, Sergey V.; Milillo, Pietro; Gil-Cruz, Fernando; Lazo, Jonathan
2017-07-01
Copahue volcano straddling the edge of the Agrio-Caviahue caldera along the Chile-Argentina border in the southern Andes has been in unrest since inflation began in late 2011. We constrain Copahue's source models with satellite and airborne interferometric synthetic aperture radar (InSAR) deformation observations. InSAR time series from descending track RADARSAT-2 and COSMO-SkyMed data span the entire inflation period from 2011 to 2016, with their initially high rates of 12 and 15 cm/yr, respectively, slowing only slightly despite ongoing small eruptions through 2016. InSAR ascending and descending track time series for the 2013-2016 time period constrain a two-source compound dislocation model, with a rate of volume increase of 13 × 106 m3/yr. They consist of a shallow, near-vertical, elongated source centered at 2.5 km beneath the summit and a deeper, shallowly plunging source centered at 7 km depth connecting the shallow source to the deeper caldera. The deeper source is located directly beneath the volcano tectonic seismicity with the lower bounds of the seismicity parallel to the plunge of the deep source. InSAR time series also show normal fault offsets on the NE flank Copahue faults. Coulomb stress change calculations for right-lateral strike slip (RLSS), thrust, and normal receiver faults show positive values in the north caldera for both RLSS and normal faults, suggesting that northward trending seismicity and Copahue fault motion within the caldera are caused by the modeled sources. Together, the InSAR-constrained source model and the seismicity suggest a deep conduit or transfer zone where magma moves from the central caldera to Copahue's upper edifice.
NASA Astrophysics Data System (ADS)
Butler, P. G.; Scourse, J. D.; Richardson, C. A.; Wanamaker, A. D., Jr.
2009-04-01
Determinations of the local correction (ΔR) to the globally averaged marine radiocarbon reservoir age are often isolated in space and time, derived from heterogeneous sources and constrained by significant uncertainties. Although time series of ΔR at single sites can be obtained from sediment cores, these are subject to multiple uncertainties related to sedimentation rates, bioturbation and interspecific variations in the source of radiocarbon in the analysed samples. Coral records provide better resolution, but these are available only for tropical locations. It is shown here that it is possible to use the shell of the long-lived bivalve mollusc Arctica islandica as a source of high-resolution time series of absolutely dated marine radiocarbon determinations for the shelf seas surrounding the North Atlantic ocean. Annual growth increments in the shell can be crossdated and chronologies can be constructed in precise analogy with the use of tree-rings. Because the calendar dates of the samples are known, ΔR can be determined with high precision and accuracy, and because all the samples are from the same species, the time series of ΔR values possesses a high degree of internal consistency. Presented here is a multi-centennial (AD 1593 - AD 1933) time series of 31 ΔR values for a site in the Irish Sea close to the Isle of Man. The mean value of ΔR (-62 14C yr) does not change significantly during this period, but increased variability is apparent before AD 1750.
NASA Astrophysics Data System (ADS)
Carter, W. E.; Robertson, D. S.; Nothnagel, A.; Nicolson, G. D.; Schuh, H.
1988-12-01
High-accuracy geodetic very long baseline interferometry measurements between the African, Eurasian, and North American plates have been analyzed to determine the terrestrial coordinates of the Hartebeesthoek observatory to better than 10 cm, to determine the celestial coordinates of eight Southern Hemisphere radio sources with milliarcsecond (mas) accuracy, and to derive quasi-independent polar motion, UT1, and nutation time series. Comparison of the earth orientation time series with ongoing International Radio Interferometric Surveying project values shows agreement at about the 1 mas level in polar motion and nutation and 0.1 ms of time in UT1. Given the independence of the observing sessions and the unlikeliness of common systematic error sources, this level of agreement serves to bound the total errors in both measurement series.
Multifractal detrended fluctuation analysis of sheep livestock prices in origin
NASA Astrophysics Data System (ADS)
Pavón-Domínguez, P.; Serrano, S.; Jiménez-Hornero, F. J.; Jiménez-Hornero, J. E.; Gutiérrez de Ravé, E.; Ariza-Villaverde, A. B.
2013-10-01
The multifractal detrended fluctuation analysis (MF-DFA) is used to verify whether or not the returns of time series of prices paid to farmers in origin markets can be described by the multifractal approach. By way of example, 5 weekly time series of prices of different breeds, slaughter weights and market differentiations from 2000 to 2012 are analyzed. Results obtained from the multifractal parameters and multifractal spectra show that the price series of livestock products are of a multifractal nature. The Hurst exponent shows that these time series are stationary signals, some of which exhibit long memory (Merino milk-fed in Seville and Segureña paschal in Jaen) or short memory (Merino paschal in Cordoba and Segureña milk-fed in Jaen), or even are close to uncorrelated signals (Merino paschal in Seville). MF-DFA is able to discern the different underlying dynamics, such as the degree and source of multifractality, that play an important role in different types of sheep livestock markets. In addition, the main source of multifractality in these time series is the broadness of the probability function, rather than long-range correlation properties between small and large fluctuations, which play a clearly secondary role.
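The core of MF-DFA is ordinary DFA: integrate the series, detrend it in windows of varying scale s, and read the Hurst exponent H from the scaling F(s) ~ s^H. A minimal DFA sketch on white noise follows (MF-DFA generalizes this by raising the segment fluctuations to q-th powers for a range of q; that loop is omitted here, and the data are synthetic).

```python
import numpy as np

def dfa(x, scales):
    """Detrended fluctuation analysis: F(s) ~ s**H for the profile of x."""
    y = np.cumsum(x - np.mean(x))              # profile (integrated series)
    F = []
    for s in scales:
        n = len(y) // s
        segs = y[:n * s].reshape(n, s)
        t = np.arange(s)
        f2 = []
        for seg in segs:                        # linear detrend each segment
            c = np.polyfit(t, seg, 1)
            f2.append(np.mean((seg - np.polyval(c, t)) ** 2))
        F.append(np.sqrt(np.mean(f2)))          # RMS fluctuation at scale s
    return np.array(F)

rng = np.random.default_rng(6)
x = rng.standard_normal(8192)                   # uncorrelated returns -> H ~ 0.5
scales = np.array([16, 32, 64, 128, 256])
F = dfa(x, scales)
H = np.polyfit(np.log(scales), np.log(F), 1)[0]
```

H above 0.5 indicates long memory and H below 0.5 short memory, matching the classification of the price series in the abstract; a q-dependent generalized exponent h(q) signals multifractality.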
CauseMap: fast inference of causality from complex time series.
Maher, M Cyrus; Hernandez, Ryan D
2015-01-01
Background. Establishing health-related causal relationships is a central pursuit in biomedical research. Yet, the interdependent non-linearity of biological systems renders causal dynamics laborious and at times impractical to disentangle. This pursuit is further impeded by the dearth of time series that are sufficiently long to observe and understand recurrent patterns of flux. However, as data generation costs plummet and technologies like wearable devices democratize data collection, we anticipate a coming surge in the availability of biomedically-relevant time series data. Given the life-saving potential of these burgeoning resources, it is critical to invest in the development of open source software tools that are capable of drawing meaningful insight from vast amounts of time series data. Results. Here we present CauseMap, the first open source implementation of convergent cross mapping (CCM), a method for establishing causality from long time series data (≳25 observations). Compared to existing time series methods, CCM has the advantage of being model-free and robust to unmeasured confounding that could otherwise induce spurious associations. CCM builds on Takens' Theorem, a well-established result from dynamical systems theory that requires only mild assumptions. This theorem allows us to reconstruct high dimensional system dynamics using a time series of only a single variable. These reconstructions can be thought of as shadows of the true causal system. If reconstructed shadows can predict points from opposing time series, we can infer that the corresponding variables are providing views of the same causal system, and so are causally related. Unlike traditional metrics, this test can establish the directionality of causation, even in the presence of feedback loops. Furthermore, since CCM can extract causal relationships from time series of, e.g., a single individual, it may be a valuable tool for personalized medicine.
We implement CCM in Julia, a high-performance programming language designed for facile technical computing. Our software package, CauseMap, is platform-independent and freely available as an official Julia package. Conclusions. CauseMap is an efficient implementation of a state-of-the-art algorithm for detecting causality from time series data. We believe this tool will be a valuable resource for biomedical research and personalized medicine.
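The CCM idea described above — reconstruct a shadow manifold by time-delay embedding, then test whether neighbors on one variable's manifold predict the other variable — can be sketched compactly. This is a simplified toy (fixed embedding, no convergence check over library sizes, hypothetical coupled logistic maps), not the CauseMap implementation, which is in Julia.

```python
import numpy as np

def embed(x, E, tau):
    """Time-delay embedding (Takens): rows are lagged coordinate vectors."""
    n = len(x) - (E - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(E)])

def ccm_skill(cause, effect, E=3, tau=1):
    """Cross-map 'cause' from the shadow manifold of 'effect'.
    High skill suggests cause -> effect influence (CCM convention)."""
    M = embed(effect, E, tau)
    target = cause[(E - 1) * tau:]
    preds = np.empty(len(M))
    for i in range(len(M)):
        d = np.linalg.norm(M - M[i], axis=1)
        d[i] = np.inf                          # exclude the point itself
        nn = np.argsort(d)[:E + 1]             # E+1 nearest neighbours
        w = np.exp(-d[nn] / max(d[nn][0], 1e-12))
        preds[i] = np.sum(w * target[nn]) / np.sum(w)
    return np.corrcoef(preds, target)[0, 1]

# Coupled logistic maps: x drives y, with no influence of y on x
n = 500
x = np.empty(n); y = np.empty(n)
x[0], y[0] = 0.4, 0.2
for i in range(n - 1):
    x[i + 1] = x[i] * (3.8 - 3.8 * x[i])
    y[i + 1] = y[i] * (3.5 - 3.5 * y[i] - 0.1 * x[i])

skill_xy = ccm_skill(x, y)   # x -> y coupling exists: skill should be high
skill_yx = ccm_skill(y, x)   # no y -> x coupling: skill should be lower
```

The asymmetry in cross-map skill is what gives CCM its directionality, even though both series come from one coupled system.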
Altiparmak, Fatih; Ferhatosmanoglu, Hakan; Erdal, Selnur; Trost, Donald C
2006-04-01
An effective analysis of clinical trials data involves analyzing different types of data, such as heterogeneous and high dimensional time series data. Current time series analysis methods generally assume that the series at hand are long enough to apply statistical techniques to them. Other ideal-case assumptions are that data are collected at equal-length intervals, and that time series being compared are of equal length. However, these assumptions are not valid for many real data sets, especially clinical trials data sets. In addition, the data sources differ from each other, the data are heterogeneous, and the sensitivity of the experiments varies by source. Approaches for mining time series data need to be revisited, keeping this wide range of requirements in mind. In this paper, we propose a novel approach for information mining that involves two major steps: applying a data mining algorithm over homogeneous subsets of data, and identifying common or distinct patterns over the information gathered in the first step. Our approach is implemented specifically for heterogeneous and high dimensional time series clinical trials data. Using this framework, we propose a new way of utilizing frequent itemset mining, as well as clustering and declustering techniques with novel distance metrics for measuring similarity between time series data. By clustering the data, we find groups of analytes (substances in blood) that are most strongly correlated. Most of the relationships found are already known and are verified by the clinical panels; in addition, we identify novel groups that need further biomedical analysis. A slight modification to our algorithm results in an effective declustering of high dimensional time series data, which is then used for "feature selection." Using industry-sponsored clinical trials data sets, we are able to identify a small set of analytes that effectively models the state of normal health.
Determine if analysis of lag structure from time series epidemiology, using gases, particles, and source factor time series, can contribute to understanding the relationships among various air pollution indicators. Methods: Analyze lag structure from an epidemiologic study of ca...
NASA Astrophysics Data System (ADS)
Luo, Qiu; Xin, Wu; Qiming, Xiong
2017-06-01
In extracting vegetation information from remote sensing data, phenological features are often ignored and the analysis algorithms perform poorly. To address this problem, a method for extracting vegetation information based on EVI time series and a decision-tree classification using multi-source branch similarity is proposed. First, to improve the stability of recognition accuracy over the time series, seasonal vegetation features are extracted based on the fitted span of the time series. Second, decision-tree similarity is assessed by adaptive selection of a path or a probability parameter of component prediction; as an index, it evaluates the degree of task association, decides whether to perform migration of the multi-source decision tree, and ensures the speed of migration. Finally, for Dalbergia hainanensis commercial forest, the accuracy of classification and recognition of pests and diseases reaches 87%-98%, significantly better than the 80%-96% accuracy of MODIS coverage in this area, verifying the validity of the proposed method.
Closed-Loop Optimal Control Implementations for Space Applications
2016-12-01
...through the analyses of a series of optimal control problems, several real-time optimal control algorithms are developed that continuously adapt to feedback on the...
NASA Astrophysics Data System (ADS)
Schultz, Michael; Verbesselt, Jan; Herold, Martin; Avitabile, Valerio
2013-10-01
Researchers who use remotely sensed data can spend half of their total effort on analysing prior data. If this data preprocessing does not match the application, the time spent on data analysis can increase considerably and can lead to inaccuracies. Despite the existence of a number of methods for pre-processing Landsat time series, each method has shortcomings, particularly for mapping forest changes under varying illumination, data availability and atmospheric conditions. Based on the requirements for mapping forest changes as defined by the United Nations (UN) Reducing Emissions from Deforestation and Forest Degradation (REDD) program, accurate reporting of the spatio-temporal properties of these changes is necessary. We compared the impact of three fundamentally different radiometric preprocessing techniques, Moderate Resolution Atmospheric TRANsmission (MODTRAN), Second Simulation of a Satellite Signal in the Solar Spectrum (6S) and simple Dark Object Subtraction (DOS), on mapping forest changes using Landsat time series data. A modification of Breaks For Additive Season and Trend (BFAST) Monitor was used to jointly map the spatial and temporal agreement of forest changes at test sites in Ethiopia and Viet Nam. The suitability of the pre-processing methods for the occurring forest change drivers was assessed using recently captured ground truth and high resolution data (1000 points). A method for creating robust generic forest maps used for the sampling design is presented. An assessment of error sources identified haze as a major source of commission error in the time series analysis.
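Of the three preprocessing techniques compared above, Dark Object Subtraction is simple enough to sketch: assume the darkest pixels in a band have near-zero surface reflectance, so the scene minimum estimates an additive haze/path-radiance offset to subtract. The synthetic band and offset below are hypothetical (real DOS variants operate on digital numbers or radiance per band).

```python
import numpy as np

rng = np.random.default_rng(8)

# Hypothetical Landsat band with an additive haze offset
surface = rng.uniform(0.0, 0.4, (100, 100))   # "true" surface reflectance
haze_offset = 0.06
observed = surface + haze_offset

# Simple Dark Object Subtraction: the scene minimum (the dark object)
# estimates the additive atmospheric contribution
dark_object = observed.min()
corrected = observed - dark_object
```

Its weakness, consistent with the haze finding above, is that a single scene-wide offset cannot capture spatially variable haze.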
1983-12-01
near the turbidity channels. Furthermore, Hastrup concludes, after an analysis of time series data taken from the Tyrrhenian abyssal plain, that the top... Bottom-Interacting Ocean Acoustics, edited by W. A. Kuperman and F. B. Jensen (Plenum Press, New York, 1980). 24. O. F. Hastrup, "Digital Analysis of
Investigation of a long time series of CO2 from a tall tower using WRF-SPA
NASA Astrophysics Data System (ADS)
Smallman, Luke; Williams, Mathew; Moncrieff, John B.
2013-04-01
Atmospheric observations from tall towers are an important source of information about CO2 exchange at the regional scale. Here, we have used a forward running model, WRF-SPA, to generate a time series of CO2 at a tall tower for comparison with observations from Scotland over multiple years (2006-2008). We use this comparison to infer the strength and distribution of sources and sinks of carbon, and ecosystem process information, at the seasonal scale. The specific aim of this research is to combine a high-resolution (6 km) forward running meteorological model (WRF) with a modified version of a mechanistic ecosystem model (SPA). SPA provides surface fluxes calculated from coupled energy, hydrological and carbon cycles. This closely coupled representation of the biosphere provides realistic surface exchanges to drive mixing within the planetary boundary layer. The combined model is used to investigate the sources and sinks of CO2 and to explore which land surfaces contribute to a time series of hourly observations of atmospheric CO2 at a tall tower at Angus, Scotland. In addition to comparing the modelled CO2 time series to observations, modelled ecosystem-specific (i.e. forest, cropland, grassland) CO2 tracers (e.g., assimilation and respiration) have been compared to the modelled land surface assimilation to investigate how representative tall tower observations are of land surface processes. The WRF-SPA modelled CO2 time series compares well with observations (R2 = 0.67, RMSE = 3.4 ppm, bias = 0.58 ppm). Through comparison of model-observation residuals, we have found evidence that non-cropped components of agricultural land (e.g., hedgerows and forest patches) likely have a significant and observable impact on the regional carbon balance.
Nowcasting influenza outbreaks using open-source media reports.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ray, Jaideep; Brownstein, John S.
We construct and verify a statistical method to nowcast influenza activity from a time series of the frequency of reports concerning influenza-related topics. Such reports are published electronically by both public health organizations and newspapers/media sources, and thus can be harvested easily via web crawlers. Since media reports are timely, whereas reports from public health organizations are delayed by at least two weeks, using timely, open-source data to compensate for the lag in "official" reports can be useful. We use morbidity data from networks of sentinel physicians (both the Centers for Disease Control's ILINet and France's Sentinelles network) as the gold standard of influenza-like illness (ILI) activity. The time series of media reports is obtained from HealthMap (http://healthmap.org). We find that the time series of media reports shows some correlation (~0.5) with ILI activity; further, this can be leveraged into an autoregressive moving average model with exogenous inputs (ARMAX model) to nowcast ILI activity. We find that the ARMAX models have more predictive skill than autoregressive (AR) models fitted to ILI data, i.e., it is possible to exploit the information content in the open-source data. We also find that when the open-source data are non-informative, the ARMAX models reproduce the performance of AR models. The statistical models are tested on data from the 2009 swine-flu outbreak as well as the mild 2011-2012 influenza season in the U.S.A.
Time-dependent source model of the Lusi mud volcano
NASA Astrophysics Data System (ADS)
Shirzaei, M.; Rudolph, M. L.; Manga, M.
2014-12-01
The Lusi mud eruption, near Sidoarjo, East Java, Indonesia, began in May 2006 and continues to erupt today. Previous analyses of surface deformation data suggested an exponential decay of the pressure in the mud source, but did not constrain the geometry and evolution of the source(s) from which the erupting mud and fluids ascend. To understand the spatiotemporal evolution of the mud and fluid sources, we apply a time-dependent inversion scheme to a densely populated InSAR time series of the surface deformation at Lusi. The SAR data set includes 50 images acquired on 3 overlapping tracks of the ALOS L-band satellite between May 2006 and April 2011. Following multitemporal analysis of this data set, the obtained surface deformation time series is inverted in a time-dependent framework to solve for the volume changes of distributed point sources in the subsurface. The volume change distribution resulting from this modeling scheme shows two zones of high volume change underneath Lusi at 0.5-1.5 km and 4-5.5 km depth, as well as another shallow zone 7 km to the west of Lusi, underneath the Wunut gas field. The cumulative volume change within the shallow source beneath Lusi is ~2-4 times larger than that of the deep source, whilst the ratio of the Lusi shallow source volume change to that of the Wunut gas field is ~1. This observation and model suggest that the Lusi shallow source played a key role in the eruption process and mud supply, but that additional fluids do ascend from depths >4 km on eruptive timescales.
Coupling detrended fluctuation analysis for analyzing coupled nonstationary signals.
Hedayatifar, L; Vahabi, M; Jafari, G R
2011-08-01
When many variables are coupled to each other, studying a single case cannot give us thorough and precise information. When these time series are stationary, various methods from random matrix analysis and complex networks can be used. But in nonstationary cases, the multifractal detrended cross-correlation analysis (MF-DXA) method was introduced for just two coupled time series. In this article, we have extended MF-DXA to the method of coupling detrended fluctuation analysis (CDFA) for the case when more than two series are correlated to each other. Here, we have calculated the multifractal properties of the coupled time series, and by comparing the CDFA results of the original series with those of the shuffled and surrogate series, we can estimate the source of multifractality and the extent to which our series are coupled to each other. We illustrate the method with selected examples from air pollution and foreign exchange rates.
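The detrended fluctuation analysis that MF-DXA and CDFA generalize can be sketched for a single series. The following is a simplified DFA-1 illustration with our own variable and function names, not the authors' code: integrate the mean-subtracted series into a profile, detrend each window with a least-squares line, and average the rms residuals.

```python
# Sketch of the DFA building block underlying MF-DXA/CDFA: the
# second-order fluctuation function F(s) of one series at window size s.
import math
import random

def _linfit_residuals(y):
    """Residuals of y about its best-fit line over x = 0..len(y)-1."""
    n = len(y)
    xs = list(range(n))
    xbar = sum(xs) / n
    ybar = sum(y) / n
    sxx = sum((x - xbar) ** 2 for x in xs)
    slope = sum((x - xbar) * (yi - ybar) for x, yi in zip(xs, y)) / sxx
    return [yi - (ybar + slope * (x - xbar)) for x, yi in zip(xs, y)]

def dfa_fluctuation(series, window):
    """Fluctuation function F(s) for a single window size s (DFA-1)."""
    mean = sum(series) / len(series)
    profile, total = [], 0.0
    for v in series:          # integrate the mean-subtracted series
        total += v - mean
        profile.append(total)
    n_win = len(profile) // window
    var_sum = 0.0
    for k in range(n_win):    # detrend each non-overlapping window
        seg = profile[k * window:(k + 1) * window]
        res = _linfit_residuals(seg)
        var_sum += sum(r * r for r in res) / window
    return math.sqrt(var_sum / n_win)

# For uncorrelated noise, F(s) grows roughly as s**0.5 (DFA exponent ~0.5).
random.seed(42)
noise = [random.gauss(0.0, 1.0) for _ in range(1024)]
f_small = dfa_fluctuation(noise, 8)
f_large = dfa_fluctuation(noise, 64)
```

The multifractal and cross-correlation variants replace the single-series variance in each window with q-th-order moments and cross-covariances of two (or, in CDFA, more) profiles, but the windowed detrending step is the same.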
Blind source separation problem in GPS time series
NASA Astrophysics Data System (ADS)
Gualandi, A.; Serpelloni, E.; Belardinelli, M. E.
2016-04-01
A critical point in the analysis of ground displacement time series, such as those recorded by space geodetic techniques, is the development of data-driven methods that allow the different sources of deformation to be discerned and characterized in the space and time domains. Multivariate statistics includes several approaches that can be considered data-driven methods. A widely used technique is principal component analysis (PCA), which allows us to reduce the dimensionality of the data space while retaining most of the explained variance of the dataset. However, PCA does not perform well in finding the solution to the so-called blind source separation (BSS) problem, i.e., in recovering and separating the original sources that generate the observed data. This is mainly because PCA minimizes the misfit calculated using an L2 norm (χ2), looking for a new Euclidean space where the projected data are uncorrelated. Independent component analysis (ICA) is a popular technique adopted to approach the BSS problem. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, we test the use of a modified variational Bayesian ICA (vbICA) method to recover the multiple sources of ground deformation even in the presence of missing data. The vbICA method models the probability density function (pdf) of each source signal using a mix of Gaussian distributions, allowing for more flexibility in the description of the pdf of the sources with respect to standard ICA, and giving a more reliable estimate of them. Here we present its application to synthetic global positioning system (GPS) position time series, generated by simulating deformation near an active fault, including inter-seismic, co-seismic, and post-seismic signals, plus seasonal signals and noise, and an additional time-dependent volcanic source.
We evaluate the ability of the PCA and ICA decomposition techniques to explain the data and to recover the original (known) sources. Using the same number of components, we find that the vbICA method fits the data almost as well as a PCA method, since the χ2 increase is less than 10% of the value calculated using a PCA decomposition. Unlike PCA, the vbICA algorithm is found to correctly separate the sources if the correlation of the dataset is low (<0.67) and the geodetic network is sufficiently dense (ten continuous GPS stations within a box of side equal to two times the locking depth of a fault where an earthquake of Mw >6 occurred). We also provide a cookbook for the use of the vbICA algorithm in analyses of position time series for tectonic and non-tectonic applications.
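The limitation of PCA for blind source separation noted above can be demonstrated on a toy example: two independent sinusoids are linearly mixed, a hand-rolled 2x2 PCA decorrelates the mixtures, yet each principal component still contains both sources. All signals and mixing weights below are invented for illustration; real decompositions would use a library and many stations.

```python
# Toy demonstration that PCA decorrelates but need not separate sources.
import math

def covariance(u, v):
    n = len(u)
    ub, vb = sum(u) / n, sum(v) / n
    return sum((a - ub) * (b - vb) for a, b in zip(u, v)) / n

# Two independent sources: a slow sine ("tectonic") and a fast sine ("seasonal").
t = [i / 100.0 for i in range(1000)]
s1 = [math.sin(2 * math.pi * 0.5 * x) for x in t]
s2 = [math.sin(2 * math.pi * 3.1 * x) for x in t]

# Linear mixing, as at two stations that each see both deformation sources.
x1 = [0.9 * a + 0.4 * b for a, b in zip(s1, s2)]
x2 = [0.3 * a + 0.8 * b for a, b in zip(s1, s2)]

# PCA by closed-form eigen-rotation of the 2x2 sample covariance matrix.
a, b, c = covariance(x1, x1), covariance(x2, x2), covariance(x1, x2)
theta = 0.5 * math.atan2(2 * c, a - b)
ct, st = math.cos(theta), math.sin(theta)
pc1 = [ct * u + st * v for u, v in zip(x1, x2)]
pc2 = [-st * u + ct * v for u, v in zip(x1, x2)]
```

The rotation makes cov(pc1, pc2) vanish by construction, but pc1 still correlates strongly with both s1 and s2: uncorrelatedness alone is too weak a condition, which is exactly why ICA-style independence constraints are brought in.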
Improved detection of radioactive material using a series of measurements
NASA Astrophysics Data System (ADS)
Mann, Jenelle
The goal of this project is to develop improved algorithms for the detection of radioactive sources that have low signal compared to background. The detection of low-signal sources is of interest in national security applications where the source may have weak ionizing radiation emissions, is heavily shielded, or the counting time is short (such as portal monitoring). Traditionally, to distinguish signal from background, the decision threshold (y*) is calculated by taking a long background count and limiting the false positive (alpha) error to 5%. Some problems with this method include: the background is constantly changing due to natural environmental fluctuations, and large amounts of data taken as the detector continuously scans are not utilized. Rather than looking at a single measurement, this work investigates a series of N measurements and develops an appropriate decision threshold for exceeding the single-measurement threshold n times in a series of N. This methodology is investigated for rectangular, triangular, sinusoidal, Poisson, and Gaussian distributions.
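The n-out-of-N rule described above can be sketched with a binomial tail. If a single background-only measurement exceeds y* with probability p, the chance of n or more exceedances in N independent measurements is a binomial sum, and the series-level rule chooses the smallest n that keeps that false-alarm rate below the target. This is a hedged sketch of the general idea, not the project's algorithm; numbers are illustrative.

```python
# Series-level decision rule: require n exceedances of the single-trial
# threshold among N measurements, with n chosen from the binomial tail.
import math

def binomial_tail(N, n, p):
    """P(at least n exceedances in N trials), each with probability p."""
    return sum(math.comb(N, k) * p**k * (1 - p)**(N - k) for k in range(n, N + 1))

def required_exceedances(N, p_single, alpha=0.05):
    """Smallest n such that n-of-N false alarms occur with rate < alpha."""
    for n in range(1, N + 1):
        if binomial_tail(N, n, p_single) < alpha:
            return n
    return N

# With a 5% single-trial false-alarm rate and N = 10 measurements, one
# exceedance is no longer rare (P ~ 0.40), so more must be demanded.
n_needed = required_exceedances(N=10, p_single=0.05, alpha=0.05)
```

The independence assumption matters: a drifting background (as the abstract notes) correlates successive counts and inflates the true false-alarm rate beyond the binomial value.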
Trends and Patterns in a New Time Series of Natural and Anthropogenic Methane Emissions, 1980-2000
NASA Astrophysics Data System (ADS)
Matthews, E.; Bruhwiler, L.; Themelis, N. J.
2007-12-01
We report on a new time series of methane (CH4) emissions from anthropogenic and natural sources developed for a multi-decadal methane modeling study (see the following presentation by Bruhwiler et al.). The emission series extends from 1980 through the early 2000s, with annual emissions for all countries, and has several features distinct from the source histories based on IPCC methods typically employed in modeling the global methane cycle. Fossil fuel emissions rely on seven fuel-process emission combinations and minimize reliance on highly uncertain emission factors. Emissions from ruminant animals employ regional profiles of bovine populations that account for the influence of variable age and size demographics on emissions, and are ~15% lower than other estimates. Waste-related emissions are developed using an approach that avoids the use of data-poor emission factors and accounts for the diversion of material from landfills through recycling and thermal treatment of waste, as well as CH4 capture at landfill facilities. Emissions from irrigated rice use rice-harvest areas under three water-management systems and a new historical data set that analyzes multiple sources for trends in water management since 1980. A time series of emissions from natural wetlands was developed by applying a multiple-regression model derived from the full process-based model of Walter, with analyzed meteorology from the ERA-40 reanalysis.
NASA Astrophysics Data System (ADS)
Shirzaei, Manoochehr; Walter, Thomas
2010-05-01
Volcanic unrest and eruptions are among the major natural hazards, alongside earthquakes, floods, and storms. It has been shown that many volcanic and tectonic unrest episodes are triggered by changes in the stress field induced by nearby seismic and magmatic activities. In this study, as part of a mobile volcano fast response system called "Exupery" (www.exupery-vfrs.de), we present an arrangement for assessing, in near real time, the stress field excited by volcanic activity. This system includes: (1) an approach called "WabInSAR" dedicated to advanced processing of the satellite data, providing an accurate time series of the surface deformation [1, 2]; (2) a time-dependent inverse source modeling method to investigate the source of volcanic unrest using observed surface deformation data [3, 4]; (3) the assessment of the changes in the stress field induced by magmatic activity at nearby volcanic and tectonic systems. This system is implemented in a recursive manner that allows handling large 3D data sets in an efficient and robust way, which is a requirement of an early warning system. We have applied and validated this arrangement on Mauna Loa volcano, Hawaii Island, to assess the influence of the time-dependent activities of Mauna Loa on earthquake occurrence in the Kaoiki seismic zone. References [1] M. Shirzaei and T. R. Walter, "Wavelet based InSAR (WabInSAR): a new advanced time series approach for accurate spatiotemporal surface deformation monitoring," IEEE, submitted, 2010. [2] M. Shirzaei and T. R. Walter, "Deformation interplay at Hawaii Island through InSAR time series and modeling," J. Geophys. Res., submitted, 2009. [3] M. Shirzaei and T. R. Walter, "Randomly Iterated Search and Statistical Competency (RISC) as powerful inversion tools for deformation source modeling: application to volcano InSAR data," J. Geophys. Res., vol. 114, B10401, doi:10.1029/2008JB006071, 2009. [4] M. Shirzaei and T. R. Walter, "Genetic algorithm combined with Kalman filter as powerful tool for nonlinear time dependent inverse modelling: Application to volcanic deformation time series," J. Geophys. Res., submitted, 2010.
Mittapalli, R K; Qhattal, H S Sha; Lockman, P R; Yamsani, M R
2010-11-01
The main objective of the present study was to develop an orally disintegrating tablet formulation of domperidone and to study the functionality differences of superdisintegrants, each obtained from two different sources, on the tablet properties. Domperidone tablets were formulated with different superdisintegrants by direct compression. The effect of the type of superdisintegrant, its concentration and its source was studied by measuring the in-vitro disintegration time, wetting time, water absorption ratios, drug release by dissolution and in-vivo oral disintegration time. Tablets prepared with crospovidone had lower disintegration times than tablets prepared from sodium starch glycolate and croscarmellose sodium. Formulations prepared with Polyplasdone XL, Ac-Di-Sol, and Explotab (D series) were better than formulations prepared with superdisintegrants obtained from other sources (DL series), which had longer disintegration times and lower water uptake ratios. The in-vivo disintegration time of formulation D-106 containing Polyplasdone XL was significantly lower than that of the marketed formulation Domel-MT. The results from this study suggest that the disintegration of orally disintegrating tablets depends on the nature of the superdisintegrant, its concentration in the formulation and its source. Even though a superdisintegrant meets USP standards, there can be variance among manufacturers in terms of performance. This is not limited to in-vitro studies but carries over to disintegration times in the human population.
NASA Astrophysics Data System (ADS)
Lindholm, D. M.; Wilson, A.
2010-12-01
The Laboratory for Atmospheric and Space Physics at the University of Colorado has developed an open-source, OPeNDAP-compliant, Java-servlet-based RESTful web service to serve time series data. In addition to handling OPeNDAP-style requests and returning standard responses, existing modules for alternate output formats can be reused or customized. It is also simple to reuse or customize modules to directly read various native data sources and even to perform some processing on the server. The server is built around a common data model based on the Unidata Common Data Model (CDM), which merges the NetCDF, HDF, and OPeNDAP data models. The server framework features a modular architecture that supports pluggable Readers, Writers, and Filters via the common interface to the data, enabling a workflow that reads data from its native form, performs some processing on the server, and presents the results to the client in its preferred form. The service is currently being used operationally to serve time series data for the LASP Interactive Solar Irradiance Data Center (LISIRD, http://lasp.colorado.edu/lisird/) and as part of the Time Series Data Server (TSDS, http://tsds.net/). I will present the data model and how it enables reading, writing, and processing concerns to be separated into loosely coupled components. I will also share thoughts on evolving beyond the time series abstraction and providing a general-purpose data service that can be orchestrated into larger workflows.
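The pluggable Reader, Filter, Writer workflow described above can be sketched against a minimal common data model. The class and method names below are our own illustration of the pattern, not the actual LASP/TSDS API (which is Java-servlet based).

```python
# Sketch of a Reader -> Filter -> Writer pipeline over a shared data model:
# here the model is simply an iterable of (time, value) pairs, so each
# component only depends on that interface, not on the others.

class ListReader:
    """Reads 'native' data, here just an in-memory list of (time, value)."""
    def __init__(self, samples):
        self.samples = samples
    def read(self):
        for t, v in self.samples:
            yield t, v

class ScaleFilter:
    """Server-side processing step: scale each value by a constant."""
    def __init__(self, factor):
        self.factor = factor
    def apply(self, stream):
        for t, v in stream:
            yield t, v * self.factor

class CsvWriter:
    """Presents the result in the client's preferred form, here CSV text."""
    def write(self, stream):
        return "\n".join(f"{t},{v}" for t, v in stream)

def serve(reader, filters, writer):
    stream = reader.read()
    for f in filters:          # filters compose because they share the model
        stream = f.apply(stream)
    return writer.write(stream)

csv_out = serve(ListReader([(0, 1.0), (1, 2.5)]), [ScaleFilter(2.0)], CsvWriter())
```

Because every component speaks the same (time, value) stream, readers for new native formats or writers for new output formats can be swapped in without touching the processing code, which is the design point of the modular architecture.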
NASA Astrophysics Data System (ADS)
Gualandi, Adriano; Serpelloni, Enrico; Elina Belardinelli, Maria; Bonafede, Maurizio; Pezzo, Giuseppe; Tolomei, Cristiano
2015-04-01
A critical point in the analysis of ground displacement time series, such as those measured by modern space geodetic techniques (primarily continuous GPS/GNSS and InSAR), is the development of data-driven methods that make it possible to discern and characterize the different sources that generate the observed displacements. A widely used multivariate statistical technique is Principal Component Analysis (PCA), which allows the dimensionality of the data space to be reduced while retaining most of the explained variance of the dataset. It reproduces the original data using a limited number of Principal Components, but it also shows some deficiencies, since PCA does not perform well in finding the solution to the so-called Blind Source Separation (BSS) problem. The recovery and separation of the different sources that generate the observed ground deformation is a fundamental task in order to provide a physical meaning to the possible different sources. PCA fails in the BSS problem since it looks for a new Euclidean space where the projected data are uncorrelated. Usually, the uncorrelation condition is not strong enough, and it has been proven that the BSS problem can be tackled by imposing independence on the components. Independent Component Analysis (ICA) is, in fact, another popular technique adopted to approach this problem, and it can be used in all those fields where PCA is also applied. An ICA approach enables us to explain the displacement time series imposing fewer constraints on the model, and to reveal anomalies in the data such as transient deformation signals. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, we use a variational Bayesian ICA (vbICA) method, which models the probability density function (pdf) of each source signal using a mix of Gaussian distributions. This technique allows for more flexibility in the description of the pdf of the sources, giving a more reliable estimate of them. Here we introduce the vbICA technique and present its application to synthetic data that simulate a GPS network recording ground deformation in a tectonically active region, with synthetic time series containing interseismic, coseismic, and postseismic deformation, plus seasonal deformation, and white and coloured noise. We study the ability of the algorithm to recover the original (known) sources of deformation, and then apply it to a real scenario: the Emilia seismic sequence (2012, northern Italy), an example of a seismic sequence that occurred in a slowly converging tectonic setting, characterized by several local to regional anthropogenic or natural sources of deformation, mainly subsidence due to fluid withdrawal and sediment compaction. We apply both PCA and vbICA to displacement time series recorded by continuous GPS and InSAR (Pezzo et al., EGU2015-8950).
NASA Astrophysics Data System (ADS)
Wziontek, Hartmut; Wilmes, Herbert; Güntner, Andreas; Creutzfeldt, Benjamin
2010-05-01
Water mass changes are a major source of variations in residual gravimetric time series obtained from the combination of observations with superconducting and absolute gravimeters. Changes in local water storage are the main influence, but global variations contribute significantly to the signal. For three European gravity stations, Bad Homburg, Wettzell and Medicina, different global hydrology models are compared. The influence of topographic effects is discussed, and thanks to the long-term stability of the combined gravity time series, inter-annual signals in model data and gravimetric observations are compared. Two sources of influence are discriminated: the effect of a local zone extending a few kilometers around the gravimetric station, and the global contribution beyond 50 km. Considering their coarse resolution and uncertainties, local effects calculated from global hydrological models are compared with the in-situ gravity observations and, for the station Wettzell, with local hydrological monitoring data.
Barry T. Wilson; Joseph F. Knight; Ronald E. McRoberts
2018-01-01
Imagery from the Landsat Program has been used frequently as a source of auxiliary data for modeling land cover, as well as a variety of attributes associated with tree cover. With ready access to all scenes in the archive since 2008 due to the USGS Landsat Data Policy, new approaches to deriving such auxiliary data from dense Landsat time series are required. Several...
ARIMA model and exponential smoothing method: A comparison
NASA Astrophysics Data System (ADS)
Wan Ahmad, Wan Kamarul Ariffin; Ahmad, Sabri
2013-04-01
This study compares the Autoregressive Integrated Moving Average (ARIMA) model and the Exponential Smoothing Method in making predictions. The comparison focuses on the ability of both methods to make forecasts with different numbers of data sources and different lengths of forecasting period. For this purpose, data on the Price of Crude Palm Oil (RM/tonne), the Exchange Rate of the Ringgit Malaysia (RM) against the Great Britain Pound (GBP), and the Price of SMR 20 Rubber (cents/kg), three different time series, are used in the comparison process. The forecasting accuracy of each model is then measured by examining the prediction error produced, using the Mean Squared Error (MSE), Mean Absolute Percentage Error (MAPE), and Mean Absolute Deviation (MAD). The study shows that the ARIMA model can produce a better prediction for long-term forecasting with limited data sources, but cannot produce a better prediction for a time series with a narrow range from one point to another, as in the Exchange Rates series. On the contrary, the Exponential Smoothing Method can produce better forecasts for the Exchange Rates, whose time series has a narrow range from one point to another, while it cannot produce a better prediction for a longer forecasting period.
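The scoring side of this comparison can be sketched directly: fit a one-step-ahead simple exponential smoothing forecaster and evaluate it with the three error measures named above. The data and smoothing constant below are made up; in practice the ARIMA half of the comparison would come from a statistics library rather than be coded by hand.

```python
# Simple exponential smoothing with one-step-ahead forecasts, scored by
# MSE, MAPE and MAD, the three accuracy measures used in the study.

def exp_smooth_forecast(series, alpha=0.3):
    """Return (forecasts, actuals): forecast[t] uses observations up to t-1."""
    level = series[0]
    forecasts = [level]                 # naive first forecast
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level
        forecasts.append(level)
    return forecasts[:-1], series[1:]   # align forecast[t] with actual[t]

def mse(actual, pred):
    return sum((a - p) ** 2 for a, p in zip(actual, pred)) / len(actual)

def mad(actual, pred):
    return sum(abs(a - p) for a, p in zip(actual, pred)) / len(actual)

def mape(actual, pred):
    return 100.0 * sum(abs((a - p) / a) for a, p in zip(actual, pred)) / len(actual)

prices = [100.0, 102.0, 101.0, 105.0, 107.0, 106.0, 110.0]  # illustrative series
pred, actual = exp_smooth_forecast(prices)
scores = {"MSE": mse(actual, pred), "MAPE": mape(actual, pred), "MAD": mad(actual, pred)}
```

Computing all three measures on the same held-out forecasts is what makes the ARIMA-versus-smoothing comparison fair; a model can look good on MSE yet poor on MAPE when the series level varies widely.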
Kurzeja, R. J.; Buckley, R. L.; Werth, D. W.; ...
2017-12-28
A method is outlined and tested to detect low-level nuclear or chemical sources from time series of concentration measurements. The method uses a mesoscale atmospheric model to simulate the concentration signature from a known or suspected source at a receptor, which is then regressed successively against segments of the measurement series to create time series of metrics that measure the goodness of fit between the signatures and the measurement segments. The method was applied to radioxenon data from the Comprehensive Test Ban Treaty (CTBT) collection site in Ussuriysk, Russia (RN58) after the Democratic People's Republic of Korea (North Korea) underground nuclear test on February 12, 2013 near Punggye. The metrics were found to be a good screening tool to locate data segments with a strong likelihood of origin from Punggye, especially when multiplied together to determine the joint probability. Metrics from RN58 were also used to find the probability that activity measured in February and April of 2013 originated from the February 12 test. A detailed analysis of an RN58 data segment from April 3/4, 2013 was also carried out for a grid of source locations around Punggye and identified Punggye as the most likely point of origin. Thus, the results support the strong possibility that radioxenon was emitted from the test site at various times in April and was detected intermittently at RN58, depending on the wind direction. The method does not locate unsuspected sources; instead, it evaluates the probability of a source at a specified location. However, it can be extended to include a set of suspected sources. Extension of the method to higher resolution data sets, arbitrary sampling, and time-varying sources is discussed, along with a path to evaluate uncertainty in the calculated probabilities.
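The screening step can be sketched as a sliding-window regression: a modelled concentration signature is slid along the measurement series and a goodness-of-fit metric is recorded at each offset. The data below are synthetic and the metric (squared correlation of a least-squares fit) is one plausible choice, not necessarily the metric used in the study.

```python
# Sliding-window goodness-of-fit between a simulated source signature and
# segments of a measurement series; peaks flag segments likely to
# originate from the suspected source.

def r_squared(signature, segment):
    """Squared correlation of a least-squares fit of segment on signature."""
    n = len(signature)
    sx, sy = sum(signature), sum(segment)
    sxx = sum(x * x for x in signature) - sx * sx / n
    syy = sum(y * y for y in segment) - sy * sy / n
    sxy = sum(x * y for x, y in zip(signature, segment)) - sx * sy / n
    if sxx == 0 or syy == 0:          # flat signature or flat segment
        return 0.0
    return (sxy * sxy) / (sxx * syy)

def scan(signature, measurements):
    """Metric time series: fit of the signature at every possible offset."""
    w = len(signature)
    return [r_squared(signature, measurements[i:i + w])
            for i in range(len(measurements) - w + 1)]

signature = [0.0, 1.0, 3.0, 2.0, 0.5]                 # simulated plume shape
measurements = [0.1] * 4 + [0.1 + s for s in signature] + [0.1] * 4
metrics = scan(signature, measurements)
best_offset = metrics.index(max(metrics))             # where the plume sits
```

Because each offset yields its own metric value, the output is itself a time series, which is what allows several metrics (or several suspected sources) to be multiplied into a joint screening probability as described above.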
77 FR 18872 - Availability of Electric Power Sources
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-28
... the first time that document is referenced. Revision 1 of Regulatory Guide 1.93 is available.... Introduction The NRC is issuing a revision to an existing guide in the NRC's ``Regulatory Guide'' series. This series was developed to describe and make available to the public information such as methods that are...
NASA Astrophysics Data System (ADS)
Auer, I.; Kirchengast, A.; Proske, H.
2009-09-01
The ongoing climate change debate focuses more and more on changing extreme events. Information on past events can be derived from a number of sources, such as instrumental data and residual impacts in the landscape, but also chronicles and people's memories. A project called "A Tale of Two Valleys", within the framework of the research program "proVision", made it possible to study past extreme events in two inner-alpine valleys from the sources mentioned above. Instrumental climate time series provided information for the past 200 years; however, great attention had to be given to the homogeneity of the series. To derive homogenized time series of selected climate change indices, methods such as HOCLIS and Vincent have been applied. Trend analyses of climate change indices inform about increases or decreases in extreme events. Traces of major geomorphodynamic processes of the past (e.g. rockfalls, landslides, debris flows) which were triggered or affected by extreme weather events are still apparent in the landscape and could be evaluated by geomorphological analysis using remote sensing and field data. Regional chronicles provided additional knowledge and covered longer periods back in time; however, compared to meteorological time series they carry a high degree of subjectivity, and intermittent recording cannot be ruled out. Finally, questionnaires and oral history complemented our picture of past extreme weather events. People were affected differently and have different memories of events. The joint analysis of these four data sources showed agreement to some extent, but also some reasonable differences: meteorological data are point measurements only, sometimes with too coarse a temporal resolution. Due to land-use changes and improved constructional measures, the impact of an extreme meteorological event may be different today compared to earlier times.
Principal component analysis of MSBAS DInSAR time series from Campi Flegrei, Italy
NASA Astrophysics Data System (ADS)
Tiampo, Kristy F.; González, Pablo J.; Samsonov, Sergey; Fernández, Jose; Camacho, Antonio
2017-09-01
Because of its proximity to the city of Naples and with a population of nearly 1 million people within its caldera, Campi Flegrei is one of the highest risk volcanic areas in the world. Since the last major eruption in 1538, the caldera has undergone frequent episodes of ground subsidence and uplift accompanied by seismic activity that has been interpreted as the result of a stationary, deeper source below the caldera that feeds shallower eruptions. However, the location and depth of the deeper source is not well-characterized and its relationship to current activity is poorly understood. Recently, a significant increase in the uplift rate has occurred, resulting in almost 13 cm of uplift by 2013 (De Martino et al., 2014; Samsonov et al., 2014b; Di Vito et al., 2016). Here we apply a principal component decomposition to high resolution time series from the region produced by the advanced Multidimensional SBAS DInSAR technique in order to better delineate both the deeper source and the recent shallow activity. We analyzed both a period of substantial subsidence (1993-1999) and a second of significant uplift (2007-2013) and inverted the associated vertical surface displacement for the most likely source models. Results suggest that the underlying dynamics of the caldera changed in the late 1990s, from one in which the primary signal arises from a shallow deflating source above a deeper, expanding source to one dominated by a shallow inflating source. In general, the shallow source lies between 2700 and 3400 m below the caldera while the deeper source lies at 7600 m or more in depth. The combination of principal component analysis with high resolution MSBAS time series data allows for these new insights and confirms the applicability of both to areas at risk from dynamic natural hazards.
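As a minimal illustration of a principal component decomposition applied to a displacement time series (synthetic data, not the MSBAS product), the mean-removed data matrix can be decomposed by SVD:

```python
import numpy as np

rng = np.random.default_rng(1)
n_times, n_pixels = 100, 500

# Two synthetic deformation sources: a slow linear uplift and a seasonal term.
t = np.linspace(0, 1, n_times)
temporal = np.stack([t, np.sin(2 * np.pi * 4 * t)])        # (2, n_times)
spatial = rng.standard_normal((2, n_pixels))               # source footprints
data = temporal.T @ spatial + 0.05 * rng.standard_normal((n_times, n_pixels))

# Principal component decomposition via SVD of the mean-removed data matrix.
demeaned = data - data.mean(axis=0)
U, sing, Vt = np.linalg.svd(demeaned, full_matrices=False)
explained = sing**2 / (sing**2).sum()
print("variance explained by first two PCs:", explained[:2].sum())
```

The two leading components capture nearly all of the variance, mirroring how a small number of principal components can delineate a deep and a shallow source in the real time series.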
INTEGRAL/SPI data segmentation to retrieve source intensity variations
NASA Astrophysics Data System (ADS)
Bouchet, L.; Amestoy, P. R.; Buttari, A.; Rouet, F.-H.; Chauvin, M.
2013-07-01
Context. The INTEGRAL/SPI X/γ-ray spectrometer (20 keV-8 MeV) is an instrument for which recovering source intensity variations is not straightforward and can constitute a difficulty for data analysis. In most cases, determining the source intensity changes between exposures is largely based on a priori information. Aims: We propose techniques that help to overcome the difficulty related to source intensity variations, making this step more rational. In addition, the constructed "synthetic" light curves should permit us to obtain a sky model that describes the data better and optimizes the source signal-to-noise ratios. Methods: For this purpose, the time intensity variation of each source was modeled as a combination of piecewise segments of time during which a given source exhibits a constant intensity. To optimize the signal-to-noise ratios, the number of segments was minimized. We present a first method that takes advantage of previous time series that can be obtained from another instrument on-board the INTEGRAL observatory. A data segmentation algorithm was then used to synthesize the time series into segments. The second method no longer needs external light curves, but solely SPI raw data. For this, we developed a specific algorithm that involves the SPI transfer function. Results: The time segmentation algorithms that were developed solve a difficulty inherent to the SPI instrument, which is the intensity variations of sources between exposures, and they allow us to obtain more information about the sources' behavior. Based on observations with INTEGRAL, an ESA project with instruments and science data centre funded by ESA member states (especially the PI countries: Denmark, France, Germany, Italy, Spain, and Switzerland), Czech Republic and Poland with participation of Russia and the USA.
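The segmentation step, modelling a light curve as piecewise-constant intensity segments, can be sketched with a generic dynamic-programming change-point algorithm (a stand-in for the authors' algorithm; the data and segment count below are illustrative):

```python
import numpy as np

def segment(y, k):
    """Optimal split of series y into k constant-intensity segments
    (minimum within-segment sum of squared errors), via dynamic programming."""
    n = len(y)
    csum = np.concatenate([[0.0], np.cumsum(y)])
    csum2 = np.concatenate([[0.0], np.cumsum(y**2)])

    def sse(i, j):  # SSE of fitting one constant level to y[i:j]
        s = csum[j] - csum[i]
        return (csum2[j] - csum2[i]) - s * s / (j - i)

    best = np.full((k + 1, n + 1), np.inf)
    cut = np.zeros((k + 1, n + 1), dtype=int)
    best[0][0] = 0.0
    for seg in range(1, k + 1):
        for j in range(seg, n + 1):
            for i in range(seg - 1, j):
                c = best[seg - 1][i] + sse(i, j)
                if c < best[seg][j]:
                    best[seg][j], cut[seg][j] = c, i
    # Backtrack the segment boundaries.
    bounds, j = [n], n
    for seg in range(k, 0, -1):
        j = cut[seg][j]
        bounds.append(j)
    return bounds[::-1]

# A toy "light curve" with three constant-intensity plateaus.
y = np.concatenate([np.full(30, 1.0), np.full(40, 5.0), np.full(30, 2.0)])
print(segment(y, 3))
```

On this noise-free toy series the recovered boundaries fall exactly at the plateau edges; on real data the number of segments would itself be minimized subject to a fit criterion, as the abstract describes.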
NASA Astrophysics Data System (ADS)
Gualandi, A.; Serpelloni, E.; Belardinelli, M. E.
2014-12-01
A critical point in the analysis of ground displacement time series is the development of data-driven methods that allow us to discern and characterize the different sources that generate the observed displacements. A widely used multivariate statistical technique is Principal Component Analysis (PCA), which reduces the dimensionality of the data space while retaining most of the explained variance of the dataset. It reproduces the original data using a limited number of Principal Components, but it also shows some deficiencies. Indeed, PCA does not perform well in finding the solution to the so-called Blind Source Separation (BSS) problem, i.e. in recovering and separating the original sources that generated the observed data. This is mainly due to the assumptions on which PCA relies: it looks for a new Euclidean space where the projected data are uncorrelated. Usually, the uncorrelatedness condition is not strong enough, and it has been proven that the BSS problem can be tackled by requiring the components to be independent. Independent Component Analysis (ICA) is, in fact, another popular technique adopted to approach this problem, and it can be used in all those fields where PCA is also applied. An ICA approach enables us to explain the time series while imposing fewer constraints on the model, and to reveal anomalies in the data such as transient signals. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, we use a variational Bayesian ICA (vbICA) method, which models the probability density function (pdf) of each source signal using a mixture of Gaussian distributions. This technique allows for more flexibility in the description of the pdf of the sources, yielding a more reliable estimate of them. Here we present the application of the vbICA technique to GPS position time series. 
First, we use vbICA on synthetic data that simulate a seismic cycle (interseismic + coseismic + postseismic + seasonal + noise), and study the ability of the algorithm to recover the original (known) sources of deformation. Secondly, we apply vbICA to different tectonically active scenarios, such as earthquakes in central and northern Italy, as well as the study of slow slip events in Cascadia.
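vbICA itself models each source pdf with a mixture of Gaussians; as a simpler hand-rolled illustration of the underlying blind source separation idea on position-like series, a FastICA-style fixed-point iteration recovers independent non-Gaussian sources from their mixtures (all signals and the mixing matrix below are invented):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2000
t = np.arange(n)

# Two independent, non-Gaussian sources: a sawtooth "seasonal" signal and a
# heavy-tailed "transient" signal, mixed into two observed position series.
s1 = (t % 100) / 100.0 - 0.5
s2 = rng.laplace(size=n)
S = np.stack([s1, s2])
A = np.array([[1.0, 0.5], [0.7, 1.0]])      # unknown mixing matrix
X = A @ S

# Whiten the observations.
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(X @ X.T / n)
Z = (E @ np.diag(d**-0.5) @ E.T) @ X

# FastICA fixed point with tanh nonlinearity and symmetric decorrelation.
W = rng.standard_normal((2, 2))
for _ in range(200):
    G = np.tanh(W @ Z)
    W_new = G @ Z.T / n - np.diag((1 - G**2).mean(axis=1)) @ W
    U, _, Vt = np.linalg.svd(W_new)
    W = U @ Vt                               # orthogonal polar factor
recovered = W @ Z

# Each recovered component should correlate strongly with one true source.
corr = np.abs(np.corrcoef(np.vstack([S, recovered]))[:2, 2:])
print(corr.round(2))
```

PCA of the same mixtures would only decorrelate them; the independence objective is what untangles the two sources, which is the point the abstract makes.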
Zhang, Xiao-Zheng; Bi, Chuan-Xing; Zhang, Yong-Bin; Xu, Liang
2015-05-01
Planar near-field acoustic holography has been successfully extended to reconstruct the sound field in a moving medium; however, the reconstructed field still contains the convection effect, which might lead to wrong identification of sound sources. In order to accurately identify sound sources in a moving medium, a time-domain equivalent source method is developed. In the method, the real source is replaced by a series of time-domain equivalent sources whose strengths are solved iteratively by utilizing the measured pressure and the known convective time-domain Green's function, and time averaging is used to reduce the instability in the iterative solving process. Since these solved equivalent source strengths are independent of the convection effect, they can be used not only to identify sound sources but also to model sound radiations in both moving and static media. Numerical simulations are performed to investigate the influence of noise on the solved equivalent source strengths and the effect of time averaging on reducing the instability, and to demonstrate the advantages of the proposed method on source identification and sound radiation modeling.
Deterministically estimated fission source distributions for Monte Carlo k-eigenvalue problems
Biondo, Elliott D.; Davidson, Gregory G.; Pandya, Tara M.; ...
2018-04-30
The standard Monte Carlo (MC) k-eigenvalue algorithm involves iteratively converging the fission source distribution using a series of potentially time-consuming inactive cycles before quantities of interest can be tallied. One strategy for reducing the computational time requirements of these inactive cycles is the Sourcerer method, in which a deterministic eigenvalue calculation is performed to obtain an improved initial guess for the fission source distribution. This method has been implemented in the Exnihilo software suite within SCALE using the SPN or SN solvers in Denovo and the Shift MC code. The efficacy of this method is assessed with different Denovo solution parameters for a series of typical k-eigenvalue problems including small criticality benchmarks, full-core reactors, and a fuel cask. Here it is found that, in most cases, when a large number of histories per cycle are required to obtain a detailed flux distribution, the Sourcerer method can be used to reduce the computational time requirements of the inactive cycles.
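The benefit of a better initial fission source can be mimicked with a toy power iteration: starting from the dominant eigenvector of a small positive operator (the "deterministic" guess) converges in far fewer cycles than starting from a flat guess. The 3×3 operator below is invented purely for illustration:

```python
import numpy as np

def inactive_cycles(F, guess, tol=1e-8):
    """Count power-iteration steps needed to converge the dominant
    eigenvector ("fission source") of operator F from a given initial guess."""
    src = guess / guess.sum()
    for n in range(1, 10_000):
        new = F @ src
        new /= new.sum()
        if np.abs(new - src).max() < tol:
            return n
        src = new
    return n

# Toy 3-region "reactor": a symmetric positive operator with a dominant mode.
F = np.array([[2.0, 0.4, 0.1],
              [0.4, 1.5, 0.4],
              [0.1, 0.4, 1.0]])

flat = np.ones(3)                      # uninformed starting source
w, v = np.linalg.eigh(F)
eigvec = np.abs(v[:, -1])              # "deterministically estimated" source

print(inactive_cycles(F, flat), inactive_cycles(F, eigvec))
```

The convergence rate is governed by the dominance ratio (second-to-first eigenvalue), so the closer the initial guess is to the true source, the fewer inactive cycles are wasted — the intuition behind Sourcerer.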
Modelling spatiotemporal change using multidimensional arrays
NASA Astrophysics Data System (ADS)
Lu, Meng; Appel, Marius; Pebesma, Edzer
2017-04-01
The large variety of remote sensors, model simulations, and in-situ records provides great opportunities to model environmental change. The massive amount of high-dimensional data calls for methods to integrate data from various sources and to analyse spatiotemporal and thematic information jointly. An array is a collection of elements ordered and indexed in arbitrary dimensions, which naturally represents spatiotemporal phenomena identified by their geographic locations and recording time. In addition, array regridding (e.g., resampling, down-/up-scaling), dimension reduction, and spatiotemporal statistical algorithms are readily applicable to arrays. However, the role of arrays in big geoscientific data analysis has not been systematically studied: How can arrays discretise continuous spatiotemporal phenomena? How can arrays facilitate the extraction of multidimensional information? How can arrays provide a clean, scalable and reproducible change modelling process that is communicable among mathematicians, computer scientists, Earth system scientists and stakeholders? This study emphasises detecting spatiotemporal change using satellite image time series. Current change detection methods using satellite image time series commonly analyse data in separate steps: 1) forming a vegetation index, 2) conducting time series analysis on each pixel, and 3) post-processing and mapping time series analysis results, which does not consider spatiotemporal correlations and ignores much of the spectral information. Multidimensional information can be better extracted by jointly considering spatial, spectral, and temporal information. To approach this goal, we use principal component analysis to extract multispectral information and spatial autoregressive models to account for spatial correlation in residual-based time series structural change modelling. 
We also discuss the potential of multivariate non-parametric time series structural change methods, hierarchical modelling, and extreme event detection methods to model spatiotemporal change. We show how array operations can facilitate expressing these methods, and how the open-source array data management and analytics software SciDB and R can be used to scale the process and make it easily reproducible.
Background/Question/Methods Bacterial pathogens in surface water present disease risks to aquatic communities and for human recreational activities. Sources of these pathogens include runoff from urban, suburban, and agricultural point and non-point sources, but hazardous micr...
NASA Astrophysics Data System (ADS)
Gica, E.
2016-12-01
The Short-term Inundation Forecasting for Tsunamis (SIFT) tool, developed by NOAA Center for Tsunami Research (NCTR) at the Pacific Marine Environmental Laboratory (PMEL), is used in forecast operations at the Tsunami Warning Centers in Alaska and Hawaii. The SIFT tool relies on a pre-computed tsunami propagation database, real-time DART buoy data, and an inversion algorithm to define the tsunami source. The tsunami propagation database is composed of 50×100km unit sources, simulated basin-wide for at least 24 hours. Different combinations of unit sources, DART buoys, and length of real-time DART buoy data can generate a wide range of results within the defined tsunami source. For an inexperienced SIFT user, the primary challenge is to determine which solution, among multiple solutions for a single tsunami event, would provide the best forecast in real time. This study investigates how the use of different tsunami sources affects simulated tsunamis at tide gauge locations. Using the tide gauge at Hilo, Hawaii, a total of 50 possible solutions for the 2011 Tohoku tsunami are considered. Maximum tsunami wave amplitude and root mean square error results are used to compare tide gauge data and the simulated tsunami time series. Results of this study will facilitate SIFT users' efforts to determine if the simulated tide gauge tsunami time series from a specific tsunami source solution would be within the range of possible solutions. This study will serve as the basis for investigating more historical tsunami events and tide gauge locations.
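The two comparison metrics named above, maximum tsunami wave amplitude and root mean square error, can be sketched for hypothetical source solutions (the series below are synthetic, not SIFT output, and the solution names are invented):

```python
import numpy as np

def compare(observed, simulated):
    """Score a simulated tide-gauge series against observations using
    root-mean-square error and the ratio of maximum amplitudes."""
    rmse = np.sqrt(np.mean((observed - simulated) ** 2))
    amp_ratio = simulated.max() / observed.max()
    return rmse, amp_ratio

t = np.linspace(0, 6 * np.pi, 500)
observed = 0.8 * np.sin(t) * np.exp(-t / 10)    # decaying tide-gauge signal

# Two hypothetical source solutions: one close, one overestimating amplitude.
good = 0.75 * np.sin(t) * np.exp(-t / 10)
bad = 1.6 * np.sin(t + 0.3) * np.exp(-t / 10)

for name, sim in [("solution A", good), ("solution B", bad)]:
    rmse, ratio = compare(observed, sim)
    print(f"{name}: RMSE={rmse:.3f} m, max-amplitude ratio={ratio:.2f}")
```

Ranking the candidate unit-source combinations by such scores is, in spirit, how a SIFT user would judge which of the many possible solutions best matches the tide-gauge record.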
A study of sound generation in subsonic rotors, volume 2
NASA Technical Reports Server (NTRS)
Chalupnik, J. D.; Clark, L. T.
1975-01-01
Computer programs were developed for use in the analysis of sound generation by subsonic rotors. Program AIRFOIL computes the spectrum of radiated sound from a single airfoil immersed in a laminar flow field. Program ROTOR extends this to a rotating frame, and provides a model for sound generation in subsonic rotors. The program also computes tone sound generation due to steady state forces on the blades. Program TONE uses a moving source analysis to generate a time series for an array of forces moving in a circular path. The resultant time series are then Fourier transformed to render the results in spectral form. Program SDATA is a standard time series analysis package. It reads in two discrete time series and forms auto and cross covariances and normalizes these to form correlations. The program then transforms the covariances to yield auto and cross power spectra by means of a Fourier transformation.
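The SDATA processing chain — covariances normalized to correlations, then Fourier transformed into spectra — can be sketched as follows (the two input series are synthetic, sharing a common tone):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1024
t = np.arange(n)

# Two discrete time series sharing a common tone at 0.05 cycles/sample.
common = np.sin(2 * np.pi * 0.05 * t)
x = common + 0.5 * rng.standard_normal(n)
y = np.roll(common, 7) + 0.5 * rng.standard_normal(n)

def xcov(a, b):
    """Cross-covariance of two series, normalized to a correlation."""
    a = a - a.mean()
    b = b - b.mean()
    return np.correlate(a, b, mode="full") / (n * a.std() * b.std())

cross_corr = xcov(x, y)

# Fourier transforming the covariance yields the cross power spectrum.
cross_spec = np.abs(np.fft.rfft(cross_corr))
peak_freq = np.fft.rfftfreq(len(cross_corr))[np.argmax(cross_spec)]
print(f"cross-spectrum peak near f={peak_freq:.3f} cycles/sample")
```

The cross-spectrum peaks at the shared tone frequency despite the noise, which is the point of forming cross covariances before transforming.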
A Proposed Data Fusion Architecture for Micro-Zone Analysis and Data Mining
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kevin McCarthy; Milos Manic
Data Fusion requires the ability to combine or "fuse" data from multiple data sources. Time Series Analysis is a data mining technique used to predict future values from a data set based upon past values. Unlike other data mining techniques, however, Time Series places special emphasis on periodicity and how seasonal and other time-based factors tend to affect trends over time. One of the difficulties encountered in developing generic time series techniques is the wide variability of the data sets available for analysis. This presents challenges all the way from the data gathering stage to results presentation. This paper presents an architecture designed and used to facilitate the collection of disparate data sets well suited to Time Series analysis as well as other predictive data mining techniques. Results show this architecture provides a flexible, dynamic framework for the capture and storage of a myriad of dissimilar data sets and can serve as a foundation from which to build a complete data fusion architecture.
Understanding the source of multifractality in financial markets
NASA Astrophysics Data System (ADS)
Barunik, Jozef; Aste, Tomaso; Di Matteo, T.; Liu, Ruipeng
2012-09-01
In this paper, we use the generalized Hurst exponent approach to study the multi-scaling behavior of different financial time series. We show that this approach is robust and powerful in detecting different types of multi-scaling. We observe a puzzling phenomenon where an apparent increase in multifractality is measured in time series generated from shuffled returns, where all time-correlations are destroyed, while the return distributions are conserved. This effect is robust and it is reproduced in several real financial data including stock market indices, exchange rates and interest rates. In order to understand the origin of this effect we investigate different simulated time series by means of the Markov switching multifractal model, autoregressive fractionally integrated moving average processes with stable innovations, fractional Brownian motion and Lévy flights. Overall we conclude that the multifractality observed in financial time series is mainly a consequence of the characteristic fat-tailed distribution of the returns, and that time correlations act to decrease the measured multifractality.
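The generalized Hurst exponent approach estimates H(q) from the scaling of q-th order moments of the increments, E|x(t+τ) − x(t)|^q ∝ τ^(qH(q)). A minimal sketch on Brownian motion, for which H(q) = 0.5 at every q, follows (the τ range is an arbitrary choice):

```python
import numpy as np

def generalized_hurst(x, q, taus=range(1, 20)):
    """Estimate the generalized Hurst exponent H(q) from the log-log slope of
    the q-th order structure function E|x(t+tau)-x(t)|^q against tau."""
    logk = [np.log(np.mean(np.abs(x[tau:] - x[:-tau]) ** q)) for tau in taus]
    slope = np.polyfit(np.log(list(taus)), logk, 1)[0]
    return slope / q

rng = np.random.default_rng(4)
# Brownian motion (cumulative sum of Gaussian returns): H(q) = 0.5 for all q.
bm = np.cumsum(rng.standard_normal(100_000))
print([round(generalized_hurst(bm, q), 2) for q in (1, 2, 3)])
```

A monofractal series like this yields a flat H(q); fat-tailed or multifractal series yield a q-dependent H(q), which is the diagnostic the paper exploits.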
NASA Astrophysics Data System (ADS)
Elangasinghe, M. A.; Singhal, N.; Dirks, K. N.; Salmond, J. A.; Samarasinghe, S.
2014-09-01
This paper uses artificial neural networks (ANN), combined with k-means clustering, to understand the complex time series of PM10 and PM2.5 concentrations at a coastal location of New Zealand based on data from a single site. Out of available meteorological parameters from the network (wind speed, wind direction, solar radiation, temperature, relative humidity), key factors governing the pattern of the time series concentrations were identified through input sensitivity analysis performed on the trained neural network model. The transport pathways of particulate matter under these key meteorological parameters were further analysed through bivariate concentration polar plots and k-means clustering techniques. The analysis shows that the external sources such as marine aerosols and local sources such as traffic and biomass burning contribute equally to the particulate matter concentrations at the study site. These results are in agreement with the results of receptor modelling by the Auckland Council based on Positive Matrix Factorization (PMF). Our findings also show that contrasting concentration-wind speed relationships exist between marine aerosols and local traffic sources resulting in very noisy and seemingly large random PM10 concentrations. The inclusion of cluster rankings as an input parameter to the ANN model showed a statistically significant (p < 0.005) improvement in the performance of the ANN time series model and also showed better performance in picking up high concentrations. For the presented case study, the correlation coefficient between observed and predicted concentrations improved from 0.77 to 0.79 for PM2.5 and from 0.63 to 0.69 for PM10 and reduced the root mean squared error (RMSE) from 5.00 to 4.74 for PM2.5 and from 6.77 to 6.34 for PM10. 
The techniques presented here enable the user to obtain an understanding of potential sources and their transport characteristics prior to the implementation of costly chemical analysis techniques or advanced air dispersion models.
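The k-means clustering step that supplies the ANN with a cluster-ranking input can be illustrated with a minimal hand-rolled k-means on synthetic wind/PM10 regimes (all numbers invented; the actual study clusters on several meteorological parameters):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 600

# Synthetic meteorology: a low-wind "local traffic" regime with high PM10
# and a high-wind "marine aerosol" regime with lower PM10.
wind = np.concatenate([rng.normal(2, 0.5, n), rng.normal(8, 1.0, n)])
pm10 = np.concatenate([rng.normal(30, 5, n), rng.normal(15, 5, n)])

# Minimal k-means (k=2) on wind speed to derive a cluster label feature.
centers = np.array([wind.min(), wind.max()])
for _ in range(20):
    labels = np.abs(wind[:, None] - centers[None, :]).argmin(axis=1)
    centers = np.array([wind[labels == k].mean() for k in (0, 1)])

# The cluster label separates the contrasting concentration-wind regimes.
print("mean PM10 per cluster:",
      [round(pm10[labels == k].mean(), 1) for k in (0, 1)])
```

Feeding such a regime label to a regression model is one way to encode the contrasting concentration-wind relationships the abstract describes, without the model having to learn the regime boundary itself.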
Application of blind source separation to real-time dissolution dynamic nuclear polarization.
Hilty, Christian; Ragavan, Mukundan
2015-01-20
The use of a blind source separation (BSS) algorithm is demonstrated for the analysis of time series of nuclear magnetic resonance (NMR) spectra. This type of data is obtained commonly from experiments, where analytes are hyperpolarized using dissolution dynamic nuclear polarization (D-DNP), both in in vivo and in vitro contexts. High signal gains in D-DNP enable rapid measurement of data sets characterizing the time evolution of chemical or metabolic processes. BSS is based on an algorithm that can be applied to separate the different components contributing to the NMR signal and determine the time dependence of the signals from these components. This algorithm requires minimal prior knowledge of the data, notably, no reference spectra need to be provided, and can therefore be applied rapidly. In a time-resolved measurement of the enzymatic conversion of hyperpolarized oxaloacetate to malate, the two signal components are separated into computed source spectra that closely resemble the spectra of the individual compounds. An improvement in the signal-to-noise ratio of the computed source spectra is found compared to the original spectra, presumably resulting from the presence of each signal more than once in the time series. The reconstruction of the original spectra yields the time evolution of the contributions from the two sources, which also corresponds closely to the time evolution of integrated signal intensities from the original spectra. BSS may therefore be an approach for the efficient identification of components and estimation of kinetics in D-DNP experiments, which can be applied at a high level of automation.
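A minimal stand-in for this kind of blind separation of a spectral time series (the paper's BSS algorithm differs; here nonnegative matrix factorization with multiplicative updates is applied to synthetic two-component spectra, exploiting the nonnegativity of the data):

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic time series of NMR-like spectra: two components (e.g. substrate
# and product), each a Gaussian peak, one decaying while the other grows.
freq = np.linspace(0, 10, 200)
def peak(c):
    return np.exp(-0.5 * ((freq - c) / 0.15) ** 2)
spectra_true = np.stack([peak(3.0), peak(7.0)])                  # (2, n_freq)
t = np.linspace(0, 1, 30)
kinetics_true = np.stack([np.exp(-3 * t), 1 - np.exp(-3 * t)])   # (2, n_t)
D = kinetics_true.T @ spectra_true + 0.01 * rng.random((30, 200))

# Blind separation by NMF, D ≈ C @ S: C holds kinetic traces, S the
# computed source spectra (multiplicative updates preserve nonnegativity).
C = rng.random((30, 2))
S = rng.random((2, 200))
for _ in range(500):
    S *= (C.T @ D) / (C.T @ C @ S + 1e-12)
    C *= (D @ S.T) / (C @ S @ S.T + 1e-12)

resid = np.linalg.norm(D - C @ S) / np.linalg.norm(D)
print(f"relative residual: {resid:.3f}")
```

No reference spectra were supplied: the factorization alone recovers a spectrum and a kinetic trace per component, which mirrors the "minimal prior knowledge" property highlighted in the abstract.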
Detection of anomalous signals in temporally correlated data (Invited)
NASA Astrophysics Data System (ADS)
Langbein, J. O.
2010-12-01
Detection of transient tectonic signals in data obtained from large geodetic networks requires the ability to detect signals that are both temporally and spatially coherent. In this report I will describe a modification to an existing method that estimates both the coefficients of a temporally correlated noise model and an efficient filter based on the noise model. This filter, when applied to the original time-series, effectively whitens (or flattens) the power spectrum. The filtered data provide the means to calculate running averages, which are then used to detect deviations from the background trends. For large networks, time-series of signal-to-noise ratio (SNR) can be easily constructed since, by filtering, each of the original time-series has been transformed into one that is closer to having a Gaussian distribution with a variance of 1.0. Anomalous intervals may be identified by counting the number of GPS sites for which the SNR exceeds a specified value. For example, during one time interval, if there were 5 out of 20 time-series with SNR>2, this would be considered anomalous; typically, one would expect at 95% confidence that there would be at least 1 out of 20 time-series with an SNR>2. For time intervals with an anomalously large number of high SNR, the spatial distribution of the SNR is mapped to identify the location of the anomalous signal(s) and their degree of spatial clustering. Estimating the filter that should be used to whiten the data requires modification of the existing methods that employ maximum likelihood estimation to determine the temporal covariance of the data. In these methods, it is assumed that the noise components in the data are a combination of white, flicker and random-walk processes and that they are derived from three different and independent sources. 
Instead, in this new method, the covariance matrix is constructed assuming that only one source is responsible for the noise and that source can be represented as a white-noise random-number generator convolved with a filter whose spectral properties are frequency (f) independent at its highest frequencies, 1/f at the middle frequencies, and 1/f² at the lowest frequencies. For data sets with no gaps in their time-series, construction of covariance and inverse covariance matrices is extremely efficient. Application of the above algorithm to real data potentially involves several iterations as small, tectonic signals of interest are often indistinguishable from background noise. Consequently, simply plotting the time-series of each GPS site is used to identify the largest outliers and signals independent of their cause. Any analysis of the background noise levels must factor in these other signals while the gross outliers need to be removed.
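The counting step — flagging epochs where more sites exceed SNR > 2 than chance predicts — can be sketched as follows (synthetic whitened series with an injected transient; the flagging threshold of 5 sites is illustrative):

```python
import numpy as np

rng = np.random.default_rng(7)
n_sites, n_epochs = 20, 300

# After whitening, each site's filtered series is ~N(0, 1). A spatially
# coherent transient is injected into 6 of the 20 sites over epochs 150-200.
snr = rng.standard_normal((n_sites, n_epochs))
snr[:6, 150:200] += 2.5

# Count, per epoch, the sites with SNR > 2; under the null one expects about
# 1 of 20 (the ~5% upper tail), so a count of 5 or more is anomalous.
counts = (snr > 2).sum(axis=0)
anomalous = np.where(counts >= 5)[0]
print(f"{len(anomalous)} anomalous epochs, first at epoch {anomalous.min()}")
```

Mapping which sites contribute to the high counts at those epochs would then locate the transient and its spatial clustering, as the abstract describes.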
2016-09-01
… (BET) Method. Scientific Operating Procedure Series: SOP-C. Jonathon Brame and Chris Griggs, Environmental Laboratory, U.S. Army Engineer Research and …
NASA Astrophysics Data System (ADS)
Reading, Michael J.; Santos, Isaac R.; Maher, Damien T.; Jeffrey, Luke C.; Tait, Douglas R.
2017-07-01
The oceans are a major source of the potent greenhouse gas nitrous oxide (N2O) to the atmosphere. However, little information is available on how estuaries and the coastal ocean may contribute to N2O budgets, and on the drivers of N2O in aquatic environments. This study utilised five time series stations along the freshwater to marine continuum in a sub-tropical estuary in Australia (Coffs Creek, Australia). Each time series station captured N2O, radon (222Rn, a natural submarine groundwater discharge tracer), dissolved nitrogen, and dissolved organic carbon (DOC) concentrations for a minimum of 25 h. The use of automated time series observations enabled spatial and tidal-scale variability of N2O to be captured. Groundwater was highly enriched in N2O (up to 306 nM) compared to the receiving surface water. Dissolved N2O supersaturation as high as 386% (27.4 nM) was observed in the upstream freshwater and brackish water areas which represented only a small (∼13%) proportion of the total estuary area. A large area of N2O undersaturation (as low as 53% or 3.9 nM) was observed in the mangrove-dominated lower estuary. This undersaturated area likely resulted from N2O consumption due to nitrate/nitrite (NOx) limitation in mangrove sediments subject to shallow porewater exchange. Overall, the estuary was a minor source of N2O to the atmosphere as the lower mangrove-dominated estuary sink of N2O counteracted groundwater-dominated source of N2O in the upper estuary. Average area-weighted N2O fluxes at the water-air interface approached zero (0.2-0.7 μmol m-2 d-1, depending on piston velocity model used), and were much lower than nitrogen-rich Northern Hemisphere estuaries that are considered large sources of N2O to the atmosphere. This study revealed a temporally and spatially diverse estuary, with areas of N2O production and consumption related to oxygen and total dissolved nitrogen availability, submarine groundwater discharge, and uptake within mangroves.
Adaptive Decomposition of Highly Resolved Time Series into Local and Non‐local Components
Highly time-resolved air monitoring data are widely being collected over long time horizons in order to characterize ambient and near-source air quality trends. In many applications, it is desirable to split the time-resolved data into two or more components (e.g., local and region...
NASA Astrophysics Data System (ADS)
Saturnino, Diana; Olsen, Nils; Finlay, Chris
2017-04-01
High-precision magnetic measurements collected by satellites such as Swarm or CHAMP, flying at altitudes between 300 and 800 km, allow for improved geomagnetic field modelling. An accurate description of the internal (core and crust) field must account for contributions from other sources, such as the ionosphere and magnetosphere. However, the description of the rapidly changing external field contributions, particularly during the quiet times from which the data are selected, constitutes a major challenge in the construction of such models. Our study attempts to obtain improved knowledge of ionospheric field contributions during quiet-time conditions, in particular during night local times. We use two different datasets: ground magnetic observatory time series (obtained below the ionospheric E-layer currents), and Swarm satellite measurements acquired above these currents. First, we remove from the data estimates of the core, lithospheric and large-scale magnetospheric magnetic contributions as given by the CHAOS-6 model, to obtain corrected time series. Then, we focus on the differences of the corrected time series: for a pair of ground magnetic observatories, we determine the time series of the difference, and similarly we determine time series differences at satellite altitude, given by the difference between the Swarm Alpha and Charlie satellites taken in the vicinity of the ground observatory locations. The resulting difference time series are analysed with regard to their temporal and spatial scales of variation, with emphasis on measurements during night local times.
Online Conditional Outlier Detection in Nonstationary Time Series
Liu, Siqi; Wright, Adam; Hauskrecht, Milos
2017-01-01
The objective of this work is to develop methods for detecting outliers in time series data. Such methods can become the key component of various monitoring and alerting systems, where an outlier may correspond to an adverse condition that needs human attention. However, real-world time series are often affected by various sources of variability present in the environment that may influence the quality of detection; they may (1) explain some of the changes in the signal that would otherwise lead to false positive detections, as well as (2) reduce the sensitivity of the detection algorithm, leading to an increase in false negatives. To alleviate these problems, we propose a new two-layer outlier detection approach that first tries to model and account for the nonstationarity and periodic variation in the time series, and then tries to use other observable variables in the environment to explain any additional signal variation. Our experiments on several data sets in different domains show that our method provides more accurate modeling of the time series, and that it is able to significantly improve outlier detection performance. PMID:29644345
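A minimal sketch of the two-layer idea on synthetic data (layer 1 removes nonstationarity and periodic variation; layer 2 flags residual outliers — here with a simple robust threshold rather than the paper's environmental covariates):

```python
import numpy as np

rng = np.random.default_rng(8)
n = 24 * 60  # 60 days of hourly observations

# Nonstationary series: slow drift + daily cycle + noise, with 3 injected outliers.
t = np.arange(n)
signal = 0.001 * t + np.sin(2 * np.pi * t / 24)
y = signal + 0.2 * rng.standard_normal(n)
outlier_idx = [500, 800, 1200]
y[outlier_idx] += 3.0

# Layer 1: model the nonstationarity and periodic variation with a small
# regression basis (intercept + trend + daily harmonic), then take residuals.
X = np.column_stack([np.ones(n), t,
                     np.sin(2 * np.pi * t / 24), np.cos(2 * np.pi * t / 24)])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta

# Layer 2: flag residuals beyond 4 robust (MAD-based) standard deviations.
mad = np.median(np.abs(resid - np.median(resid)))
flags = np.where(np.abs(resid) > 4 * 1.4826 * mad)[0]
print("flagged:", flags)
```

Without layer 1, the drift and the daily cycle themselves would trip a naive threshold; modeling them first is what keeps the false positive rate down, which is the paper's motivation.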
NASA Astrophysics Data System (ADS)
Donges, Jonathan F.; Heitzig, Jobst; Beronov, Boyan; Wiedermann, Marc; Runge, Jakob; Feng, Qing Yi; Tupikina, Liubov; Stolbova, Veronika; Donner, Reik V.; Marwan, Norbert; Dijkstra, Henk A.; Kurths, Jürgen
2015-11-01
We introduce the pyunicorn (Pythonic unified complex network and recurrence analysis toolbox) open source software package for applying and combining modern methods of data analysis and modeling from complex network theory and nonlinear time series analysis. pyunicorn is a fully object-oriented and easily parallelizable package written in Python. It allows for the construction of functional networks such as climate networks in climatology or functional brain networks in neuroscience, representing the structure of statistical interrelationships in large data sets of time series, and, subsequently, the investigation of this structure using advanced methods of complex network theory such as measures and models for spatial networks, networks of interacting networks, node-weighted statistics, or network surrogates. Additionally, pyunicorn provides insights into the nonlinear dynamics of complex systems as recorded in uni- and multivariate time series from a non-traditional perspective by means of recurrence quantification analysis, recurrence networks, visibility graphs, and construction of surrogate time series. The range of possible applications of the library is outlined, drawing on several examples mainly from the field of climatology.
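One of the constructions the abstract mentions, the visibility graph, is simple enough to show in plain Python. The sketch below builds a natural visibility graph without using pyunicorn itself (which provides an optimized implementation of the same construction); the function name is ours.

```python
def visibility_edges(y):
    """Natural visibility graph of a scalar time series (plain-Python sketch;
    pyunicorn ships an optimized version of this construction). Samples
    i < j are linked if every sample between them lies strictly below the
    line of sight connecting (i, y[i]) and (j, y[j]). O(n^3) worst case,
    for illustration only."""
    n = len(y)
    edges = set()
    for i in range(n - 1):
        for j in range(i + 1, n):
            if all(y[k] < y[j] + (y[i] - y[j]) * (j - k) / (j - i)
                   for k in range(i + 1, j)):
                edges.add((i, j))
    return edges
```

Adjacent samples are always mutually visible; a monotone ramp adds no long-range links, while a dip between two peaks does.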
NASA Astrophysics Data System (ADS)
Loredo, Thomas; Budavari, Tamas; Scargle, Jeffrey D.
2018-01-01
This presentation provides an overview of open-source software packages addressing two challenging classes of astrostatistics problems. (1) CUDAHM is a C++ framework for hierarchical Bayesian modeling of cosmic populations, leveraging graphics processing units (GPUs) to enable applying this computationally challenging paradigm to large datasets. CUDAHM is motivated by measurement error problems in astronomy, where density estimation and linear and nonlinear regression must be addressed for populations of thousands to millions of objects whose features are measured with possibly complex uncertainties, potentially including selection effects. An example calculation demonstrates accurate GPU-accelerated luminosity function estimation for simulated populations of $10^6$ objects in about two hours using a single NVIDIA Tesla K40c GPU. (2) Time Series Explorer (TSE) is a collection of software in Python and MATLAB for exploratory analysis and statistical modeling of astronomical time series. It comprises a library of stand-alone functions and classes, as well as an application environment for interactive exploration of time series data. The presentation will summarize key capabilities of this emerging project, including new algorithms for analysis of irregularly-sampled time series.
GRASS GIS: The first Open Source Temporal GIS
NASA Astrophysics Data System (ADS)
Gebbert, Sören; Leppelt, Thomas
2015-04-01
GRASS GIS is a full featured, general purpose Open Source geographic information system (GIS) with raster, 3D raster and vector processing support[1]. Recently, time was introduced as a new dimension that transformed GRASS GIS into the first Open Source temporal GIS with comprehensive spatio-temporal analysis, processing and visualization capabilities[2]. New spatio-temporal data types were introduced in GRASS GIS version 7 to manage raster, 3D raster and vector time series. These new data types are called space time datasets. They are designed to efficiently handle hundreds of thousands of time stamped raster, 3D raster and vector map layers of any size. Time stamps can be defined as time intervals or time instances in Gregorian calendar time or relative time. Space time datasets simplify the processing and analysis of large time series in GRASS GIS, since these new data types are used as input and output parameters in temporal modules. The handling of space time datasets is therefore analogous to the handling of raster, 3D raster and vector map layers in GRASS GIS. A new dedicated Python library, the GRASS GIS Temporal Framework, was designed to implement the spatio-temporal data types and their management. The framework provides the functionality to efficiently handle hundreds of thousands of time stamped map layers and their spatio-temporal topological relations. The framework supports reasoning based on the temporal granularity of space time datasets as well as their temporal topology. It was designed in conjunction with the PyGRASS [3] library to support parallel processing of large datasets, which has a long tradition in GRASS GIS [4,5]. We will present a subset of more than 40 temporal modules that were implemented based on the GRASS GIS Temporal Framework, PyGRASS and the GRASS GIS Python scripting library. These modules provide a comprehensive temporal GIS tool set.
The functionality ranges from space time dataset and time stamped map layer management over temporal aggregation, temporal accumulation, spatio-temporal statistics, spatio-temporal sampling, temporal algebra, temporal topology analysis, time series animation and temporal topology visualization to time series import and export capabilities with support for NetCDF and VTK data formats. We will present several temporal modules that support parallel processing of raster and 3D raster time series. [1] Neteler, M., Beaudette, D., Cavallini, P., Lami, L., Cepicky, J., 2008. GRASS GIS. In: Hall, G.B., Leahy, M.G. (Eds.), Open Source Approaches in Spatial Data Handling, Vol. 2, pp. 171-199. doi:10.1007/978-3-540-74831-19. [2] Gebbert, S., Pebesma, E., 2014. A temporal GIS for field based environmental modeling. Environ. Model. Softw. 53, 1-12. [3] Zambelli, P., Gebbert, S., Ciolli, M., 2013. Pygrass: An Object Oriented Python Application Programming Interface (API) for Geographic Resources Analysis Support System (GRASS) Geographic Information System (GIS). ISPRS Intl Journal of Geo-Information 2, 201-219. [4] Löwe, P., Klump, J., Thaler, J., 2012. The FOSS GIS Workbench on the GFZ Load Sharing Facility compute cluster. Geophysical Research Abstracts Vol. 14, EGU2012-4491, EGU General Assembly, Vienna, Austria. [5] Akhter, S., Aida, K., Chemin, Y., 2010. GRASS GIS on High Performance Computing with MPI, OpenMP and Ninf-G Programming Framework. ISPRS Conference, Kyoto, 9-12 August 2010.
NASA Astrophysics Data System (ADS)
Larnier, H.; Sailhac, P.; Chambodut, A.
2018-01-01
Atmospheric electromagnetic waves created by global lightning activity contain information about electrical processes of the inner and the outer Earth. Large signal-to-noise ratio events are particularly interesting because they convey information about electromagnetic properties along their path. We introduce a new methodology to automatically detect and characterize lightning-based waves using a time-frequency decomposition obtained through the application of continuous wavelet transform. We focus specifically on three types of sources, namely, atmospherics, slow tails and whistlers, that cover the frequency range 10 Hz to 10 kHz. Each wave has distinguishable characteristics in the time-frequency domain due to source shape and dispersion processes. Our methodology allows automatic detection of each type of event in the time-frequency decomposition thanks to their specific signature. Horizontal polarization attributes are also recovered in the time-frequency domain. This procedure is first applied to synthetic extremely low frequency time-series with different signal-to-noise ratios to test for robustness. We then apply it on real data: three stations of audio-magnetotelluric data acquired in Guadeloupe, an overseas French territory. Most of the analysed atmospherics and slow tails display linear polarization, whereas the analysed whistlers are elliptically polarized. The diversity of lightning activity is finally analysed in an audio-magnetotelluric data processing framework, as used in subsurface prospecting, through estimation of the impedance response functions. We show that audio-magnetotelluric processing results depend mainly on the frequency content of electromagnetic waves observed in processed time-series, with an emphasis on the difference between morning and afternoon acquisition.
Our new methodology based on the time-frequency signature of lightning-induced electromagnetic waves allows automatic detection and characterization of events in audio-magnetotelluric time-series, providing the means to assess quality of response functions obtained through processing.
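The time-frequency decomposition underlying this detection scheme can be illustrated with a direct continuous-wavelet-transform computation. The sketch below is not the authors' code: it only computes Morlet wavelet power and locates a transient as the power maximum, whereas the paper adds per-wave-type signatures and polarization attributes on top of such a decomposition. The function name and parameters are ours.

```python
import numpy as np

def morlet_cwt_power(x, fs, freqs, w0=6.0):
    """Wavelet power of signal x (sampling rate fs) at the given analysis
    frequencies, via correlation with a complex Morlet wavelet. Returns an
    array of shape (len(freqs), len(x)). Illustrative sketch only."""
    power = np.empty((len(freqs), len(x)))
    for i, f in enumerate(freqs):
        s = w0 * fs / (2.0 * np.pi * f)          # scale matching frequency f
        t = np.arange(-int(4 * s), int(4 * s) + 1)
        psi = np.exp(1j * w0 * t / s) * np.exp(-0.5 * (t / s) ** 2) / np.sqrt(s)
        # flipping the kernel turns convolution into correlation with psi
        W = np.convolve(x, np.conj(psi[::-1]), mode="same")
        power[i] = np.abs(W) ** 2
    return power
```

A short 100 Hz burst embedded in a quiet record shows up as a localized power maximum at the matching analysis frequency, which is the kind of time-frequency signature the detection relies on.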
Acoustic emission linear pulse holography
Collins, H. Dale; Busse, Lawrence J.; Lemon, Douglas K.
1985-01-01
Defects in a structure are imaged as they propagate, using their emitted acoustic energy as a monitored source. Short bursts of acoustic energy propagate through the structure to a discrete element receiver array. A reference timing transducer located between the array and the inspection zone initiates a series of time-of-flight measurements. The resulting time-of-flight measurements are then treated as aperture data and transferred to a computer for reconstruction of a synthetic linear holographic image. The images can be displayed and stored as a record of defect growth.
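The use of arrival times across a receiver array can be illustrated with a much simpler grid-search localization than the holographic reconstruction the patent describes: score each candidate source position by how well its predicted arrival-time pattern matches the measurements, up to an unknown emission time. This is a loose sketch under our own assumptions (known wave speed, point source, candidate grid), not the patented method.

```python
import numpy as np

def locate_source(tof, receivers, candidates, c=3000.0):
    """Locate an acoustic-emission source from arrival times at a receiver
    array. tof: measured arrival times (s); receivers, candidates: (n, 2)
    and (m, 2) coordinate arrays; c: wave speed (m/s). Returns the candidate
    whose predicted time-of-flight pattern, after removing the unknown
    emission time, best fits the data in the least-squares sense."""
    best, best_misfit = None, np.inf
    for g in candidates:
        pred = np.linalg.norm(receivers - g, axis=1) / c
        d = tof - pred
        d = d - d.mean()                 # remove the unknown emission time
        misfit = np.sum(d ** 2)
        if misfit < best_misfit:
            best, best_misfit = g, misfit
    return best
```

With four non-collinear receivers, the true source position yields exactly zero misfit even when every arrival time carries the same unknown offset.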
NASA Astrophysics Data System (ADS)
Adegoke, Oluwashina; Dhang, Prasun; Mukhopadhyay, Banibrata; Ramadevi, M. C.; Bhattacharya, Debbijoy
2018-05-01
By analysing the time series of RXTE/PCA data, the non-linear variabilities of compact sources have been repeatedly established. Depending on the variation in temporal classes, compact sources exhibit different non-linear features. Sometimes they show a low correlation/fractal dimension, but in other classes or intervals of time they exhibit a stochastic nature. This could be because the accretion flow around a compact object is a non-linear general relativistic system involving magnetohydrodynamics. However, the more conventional way of addressing a compact source is the analysis of its spectral state. Therefore, the question arises: how is non-linearity connected to the underlying spectral properties of the flow, given that the non-linear properties are related to the associated transport mechanisms describing the geometry of the flow? This work is aimed at addressing this question. Based on the connection between observed spectral and non-linear (time series) properties of two X-ray binaries, GRS 1915+105 and Sco X-1, we attempt to diagnose the underlying accretion modes of the sources in terms of known accretion classes, namely, Keplerian disc, slim disc, advection dominated accretion flow and general advective accretion flow. We explore the possible transition of the sources from one accretion mode to others with time. We further argue that the accretion rate must play an important role in the transition between these modes.
Vegetation Response to Climate Change in the Southern Part of Qinghai-Tibet Plateau at Basinal Scale
NASA Astrophysics Data System (ADS)
Liu, X.; Liu, C.; Kang, Q.; Yin, B.
2018-04-01
Global climate change has significantly affected vegetation variation in the third-polar region of the world - the Qinghai-Tibet Plateau. As one of the most important indicators of vegetation variation (growth, coverage and tempo-spatial change), the Normalized Difference Vegetation Index (NDVI) is widely employed to study the response of vegetation to climate change. However, a long-term series analysis cannot be achieved because a single data source is constrained by time sequence. Therefore, a new framework was presented in this paper to extend the product series of monthly NDVI, taking as an example the Yarlung Zangbo River Basin, one of the most important river basins in the Qinghai-Tibet Plateau. NDVI products were acquired from two public sources: Global Inventory Modeling and Mapping Studies (GIMMS) Advanced Very High Resolution Radiometer (AVHRR) and the Moderate-Resolution Imaging Spectroradiometer (MODIS). After having been extended using the new framework, the new time series of NDVI covers a 384-month period (1982-2013), 84 months longer than the previous NDVI product time series, greatly facilitating NDVI-related scientific research. In the new framework, the Gauss Filtering Method was employed to filter out noise in the NDVI product. Next, the standard method was introduced to enhance the comparability of the two data sources, and a pixel-based regression method was used to construct NDVI-extending models pixel by pixel. The extended series of NDVI fits well with the original AVHRR NDVI. With the extended time series, temporal trends and spatial heterogeneity of NDVI in the study area were studied, and the principal factors influencing NDVI were further determined. The monthly NDVI is highly correlated with air temperature and precipitation; the spatially averaged NDVI slightly increased in summer over the 32-year period, during which temperature increased and precipitation decreased.
The spatial heterogeneity of NDVI is in accordance with the seasonal variation of the two climate-change factors. All of these findings can provide valuable scientific support for water-land resources exploration in the third-polar region of the world.
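The pixel-based regression step described above reduces to fitting one linear model per pixel over the months where the two sensors overlap, then applying it to the months to be extended. The sketch below is our own minimal reading of that step (array names, shapes, and the plain least-squares model are assumptions; the paper additionally filters and standardizes the series first).

```python
import numpy as np

def extend_ndvi(modis, avhrr, modis_new):
    """Pixel-wise linear models mapping MODIS NDVI to the AVHRR record
    (illustrative sketch of a pixel-based regression extension).
    modis, avhrr: (T, H, W) arrays over the overlap months;
    modis_new: (T2, H, W) months to extend. Returns (T2, H, W)."""
    T, H, W = modis.shape
    out = np.empty((modis_new.shape[0], H, W))
    for r in range(H):
        for c in range(W):
            # one least-squares line per pixel over the overlap period
            slope, intercept = np.polyfit(modis[:, r, c], avhrr[:, r, c], 1)
            out[:, r, c] = slope * modis_new[:, r, c] + intercept
    return out
```

When the two records are exactly linearly related, the extension reproduces that relation pixel for pixel, which is the consistency one would check against the original AVHRR series.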
Sohrabpour, Abbas; Ye, Shuai; Worrell, Gregory A.; Zhang, Wenbo
2016-01-01
Objective Combined source imaging techniques and directional connectivity analysis can provide useful information about the underlying brain networks in a non-invasive fashion. Source imaging techniques have previously been used successfully either to determine the source of activity or to extract source time-courses for Granger causality analysis. In this work, we utilize source imaging algorithms both to find the network nodes (regions of interest) and to extract the activation time series for further Granger causality analysis. The aim of this work is to find network nodes objectively from noninvasive electromagnetic signals, extract activation time-courses and apply Granger analysis on the extracted series to study brain networks under realistic conditions. Methods Source imaging methods are used to identify network nodes and extract time-courses, and then Granger causality analysis is applied to delineate the directional functional connectivity of underlying brain networks. Computer simulation studies where the underlying network (nodes and connectivity pattern) is known were performed; additionally, this approach has been evaluated in partial epilepsy patients to study epilepsy networks from inter-ictal and ictal signals recorded by EEG and/or MEG. Results Localization errors of network nodes were less than 5 mm, and normalized connectivity errors were ~20%, in estimating underlying brain networks in simulation studies. Additionally, two focal epilepsy patients were studied, and the identified nodes driving the epileptic network were concordant with clinical findings from intracranial recordings or surgical resection. Conclusion Our study indicates that combining source imaging algorithms with Granger causality analysis can identify underlying networks precisely (both in terms of network node location and internodal connectivity).
Significance The combined source imaging and Granger analysis technique is an effective tool for studying normal or pathological brain conditions. PMID:27740473
Sohrabpour, Abbas; Ye, Shuai; Worrell, Gregory A; Zhang, Wenbo; He, Bin
2016-12-01
Combined source-imaging techniques and directional connectivity analysis can provide useful information about the underlying brain networks in a noninvasive fashion. Source-imaging techniques have previously been used successfully either to determine the source of activity or to extract source time-courses for Granger causality analysis. In this work, we utilize source-imaging algorithms both to find the network nodes [regions of interest (ROI)] and to extract the activation time series for further Granger causality analysis. The aim of this work is to find network nodes objectively from noninvasive electromagnetic signals, extract activation time-courses, and apply Granger analysis on the extracted series to study brain networks under realistic conditions. Source-imaging methods are used to identify network nodes and extract time-courses, and then Granger causality analysis is applied to delineate the directional functional connectivity of underlying brain networks. Computer simulation studies where the underlying network (nodes and connectivity pattern) is known were performed; additionally, this approach has been evaluated in partial epilepsy patients to study epilepsy networks from interictal and ictal signals recorded by EEG and/or magnetoencephalography (MEG). Localization errors of network nodes were less than 5 mm, and normalized connectivity errors were ∼20%, in estimating underlying brain networks in simulation studies. Additionally, two focal epilepsy patients were studied, and the identified nodes driving the epileptic network were concordant with clinical findings from intracranial recordings or surgical resection. Our study indicates that combining source-imaging algorithms with Granger causality analysis can identify underlying networks precisely (both in terms of network node location and internodal connectivity). The combined source imaging and Granger analysis technique is an effective tool for studying normal or pathological brain conditions.
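The Granger causality step applied to the extracted time-courses can be illustrated with a minimal bivariate version: compare the residual sum of squares of an autoregressive model of one series with and without lags of the other. This is a sketch under our own assumptions (plain least squares, fixed model order), not the source-imaging pipeline of the study; the function name is ours.

```python
import numpy as np

def granger_f(x, y, p=2):
    """F statistic for 'x Granger-causes y': does adding p lags of x to an
    AR(p) model of y reduce the residual sum of squares? Larger values mean
    the past of x helps predict y. Minimal illustrative sketch."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(y)
    Y = y[p:]
    lags = lambda z: [z[p - k:n - k] for k in range(1, p + 1)]
    restricted = np.column_stack([np.ones(n - p)] + lags(y))
    full = np.column_stack([np.ones(n - p)] + lags(y) + lags(x))
    def rss(A):
        beta, *_ = np.linalg.lstsq(A, Y, rcond=None)
        return np.sum((Y - A @ beta) ** 2)
    rss_r, rss_f = rss(restricted), rss(full)
    return ((rss_r - rss_f) / p) / (rss_f / (n - p - full.shape[1]))
```

Simulating a unidirectional coupling (x drives y) yields a large F in the driving direction and a small one in the reverse direction, which is the asymmetry used to orient network edges.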
We present a framework to compare water treatment costs to source water protection costs, an important knowledge gap for drinking water treatment plants (DWTPs). This trade-off helps to determine what incentives a DWTP has to invest in natural infrastructure or pollution reductio...
GIAnT - Generic InSAR Analysis Toolbox
NASA Astrophysics Data System (ADS)
Agram, P.; Jolivet, R.; Riel, B. V.; Simons, M.; Doin, M.; Lasserre, C.; Hetland, E. A.
2012-12-01
We present a computing framework for studying the spatio-temporal evolution of ground deformation from interferometric synthetic aperture radar (InSAR) data. Several open-source tools, including the Repeat Orbit Interferometry PACkage (ROI-PAC) and InSAR Scientific Computing Environment (ISCE) from NASA-JPL, and the Delft Object-oriented Repeat Interferometric Software (DORIS), have enabled scientists to generate individual interferograms from raw radar data with relative ease. Numerous computational techniques and algorithms that reduce phase information from multiple interferograms to a deformation time-series have been developed and verified over the past decade. However, the sharing and direct comparison of products from multiple processing approaches has been hindered by (1) the absence of simple standards for sharing estimated time-series products, (2) the use of proprietary software tools with license restrictions, and (3) the closed-source nature of the exact implementation of many of these algorithms. We have developed this computing framework to address all of the above issues. We attempt to take the first steps towards creating a community software repository for InSAR time-series analysis. To date, we have implemented the short baseline subset algorithm (SBAS), NSBAS and multi-scale interferometric time-series (MInTS) in this framework, and the associated source code is included in the GIAnT distribution. A number of the associated routines have been optimized for performance and scalability with large data sets. Some of the new features in our processing framework are (1) the use of daily solutions from continuous GPS stations to correct for orbit errors, (2) the use of meteorological data sets to estimate the tropospheric delay screen, and (3) a data-driven bootstrapping approach to estimate the uncertainties associated with estimated time-series products.
We are currently working on incorporating tidal load corrections for individual interferograms and propagation of noise covariance models through the processing chain for robust estimation of uncertainties in the deformation estimates. We will demonstrate the ease of use of our framework with results ranging from regional scale analysis around Long Valley, CA and Parkfield, CA to continental scale analysis in Western South America. We will also present preliminary results from a new time-series approach that simultaneously estimates deformation over the complete spatial domain at all time epochs on a distributed computing platform. GIAnT has been developed entirely using open source tools and uses Python as the underlying platform. We build on the extensive numerical (NumPy) and scientific (SciPy) computing Python libraries to develop an object-oriented, flexible and modular framework for time-series InSAR applications. The toolbox is currently configured to work with outputs from ROI-PAC, ISCE and DORIS, but can easily be extended to support products from other SAR/InSAR processors. The toolbox libraries include support for hierarchical data format (HDF5) memory mapped files, parallel processing with Python's multi-processing module and support for many convex optimization solvers like CSDP, CVXOPT etc. An extensive set of routines to deal with ASCII and XML files has also been included for controlling the processing parameters.
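The core of an SBAS-style inversion can be shown for a single pixel: each interferogram spanning epochs (i, j) equals the sum of the incremental displacements between those epochs, giving a linear system solved by least squares. This toy sketch (our own naming and simplifications, no rate weighting, no network quality checks) only illustrates the principle GIAnT implements in full.

```python
import numpy as np

def sbas_timeseries(pairs, ifgs, n_epochs):
    """Toy single-pixel SBAS inversion. pairs: list of (i, j) epoch indices
    per interferogram (i < j); ifgs: the corresponding unwrapped phase or
    displacement values; n_epochs: number of acquisition epochs. Solves
    G m = d for the incremental displacements m and integrates them into a
    displacement time series with the first epoch fixed at zero."""
    G = np.zeros((len(ifgs), n_epochs - 1))
    for k, (i, j) in enumerate(pairs):
        G[k, i:j] = 1.0                  # increments i..j-1 sum to ifg k
    m, *_ = np.linalg.lstsq(G, np.asarray(ifgs, float), rcond=None)
    return np.concatenate([[0.0], np.cumsum(m)])
```

A redundant network of consistent interferograms reproduces the underlying displacement history exactly; with noisy, inconsistent interferograms the same least-squares step returns the best-fitting history.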
Programmable Logic Application Notes
NASA Technical Reports Server (NTRS)
Katz, Richard
2000-01-01
This column will be provided each quarter as a source for reliability, radiation results, NASA capabilities, and other information on programmable logic devices and related applications. This quarter will continue a series of notes concentrating on analysis techniques with this issue's section discussing: Digital Timing Analysis Tools and Techniques. Articles in this issue include: SX and SX-A Series Devices Power Sequencing; JTAG and SX/SX-A/SX-S Series Devices; Analysis Techniques (i.e., notes on digital timing analysis tools and techniques); Status of the Radiation Hard Reconfigurable Field Programmable Gate Array Program; Input Transition Times; Apollo Guidance Computer Logic Study; RT54SX32S Prototype Data Sets; A54SX32A - 0.22 micron/UMC Test Results; Ramtron FM1608 FRAM; and Analysis of VHDL Code and Synthesizer Output.
NASA Astrophysics Data System (ADS)
Karbon, Maria; Heinkelmann, Robert; Mora-Diaz, Julian; Xu, Minghui; Nilsson, Tobias; Schuh, Harald
2017-07-01
The radio sources within the most recent celestial reference frame (CRF) catalog ICRF2 are represented by a single, time-invariant coordinate pair. The datum sources were chosen mainly according to certain statistical properties of their position time series. Yet, such statistics are not unconditionally applicable, and they are also ambiguous. Ignoring systematics in the source positions of the datum sources inevitably leads to a degradation of the quality of the frame and, therefore, also of derived quantities such as the Earth orientation parameters. One possible approach to overcome these deficiencies is to extend the parametrization of the source positions, similarly to what is done for the station positions. We decided to use the multivariate adaptive regression splines (MARS) algorithm to parametrize the source coordinates. It allows a great deal of automation by combining recursive partitioning and spline fitting in an optimal way. The algorithm autonomously finds the ideal knot positions for the splines and, thus, the best number of polynomial pieces to fit the data. With that we can correct the ICRF2 a priori coordinates for our analysis and eliminate the systematics in the position estimates. This also allows us to introduce special handling sources into the datum definition, leading to, on average, 30% more sources in the datum. We find that not only the celestial pole offsets (CPO) can be improved by more than 10% due to the improved geometry, but the station positions, especially in the early years of VLBI, also benefit greatly.
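The spline parametrization at the heart of MARS is built from truncated linear ("hinge") basis functions. The sketch below shows only the final least-squares fitting step with knots supplied by hand; MARS itself selects the knots and the number of pieces automatically, which is what the abstract relies on. Names are ours.

```python
import numpy as np

def hinge_fit(t, y, knots):
    """Least-squares fit of a coordinate time series with an intercept, a
    linear term, and one truncated-linear basis function max(t - k, 0) per
    knot - the building block of the MARS parametrization. Returns the
    fitted values and the coefficients."""
    X = np.column_stack([np.ones_like(t), t] +
                        [np.maximum(t - k, 0.0) for k in knots])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return X @ beta, beta
```

A position series with a slope change at a known epoch is represented exactly by one hinge at that epoch, which is how piecewise-linear systematics in source coordinates can be absorbed before datum definition.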
NASA Astrophysics Data System (ADS)
Lundgren, P.; Nikkhoo, M.; Samsonov, S. V.; Milillo, P.; Gil-Cruz, F., Sr.; Lazo, J.
2017-12-01
Copahue volcano, straddling the edge of the Agrio-Caviahue caldera along the Chile-Argentina border in the southern Andes, has been in unrest since inflation began in late 2011. We constrain Copahue's source models with satellite and airborne interferometric synthetic aperture radar (InSAR) deformation observations. InSAR time series from descending track RADARSAT-2 and COSMO-SkyMed data span the entire inflation period from 2011 to 2016, with their initially high rates of 12 and 15 cm/yr, respectively, slowing only slightly despite ongoing small eruptions through 2016. InSAR ascending and descending track time series for the 2013-2016 time period constrain a two-source compound dislocation model, with a rate of volume increase of 13 × 10^6 m^3/yr. They consist of a shallow, near-vertical, elongated source centered at 2.5 km beneath the summit and a deeper, shallowly plunging source centered at 7 km depth connecting the shallow source to the deeper caldera. The deeper source is located directly beneath the volcano tectonic seismicity, with the lower bounds of the seismicity parallel to the plunge of the deep source. InSAR time series also show normal fault offsets on the NE flank Copahue faults. Coulomb stress change calculations for right-lateral strike slip (RLSS), thrust, and normal receiver faults show positive values in the north caldera for both RLSS and normal faults, suggesting that northward trending seismicity and Copahue fault motion within the caldera are caused by the modeled sources. Together, the InSAR-constrained source model and the seismicity suggest a deep conduit or transfer zone where magma moves from the central caldera to Copahue's upper edifice.
A geodetic matched filter search for slow slip with application to the Mexico subduction zone
NASA Astrophysics Data System (ADS)
Rousset, B.; Campillo, M.; Lasserre, C.; Frank, W. B.; Cotte, N.; Walpersdorf, A.; Socquet, A.; Kostoglodov, V.
2017-12-01
Since the discovery of slow slip events, many methods have been successfully applied to model obvious transient events in geodetic time series, such as the widely used network strain filter. Independent seismological observations of tremors or low-frequency earthquakes and repeating earthquakes provide evidence of low-amplitude slow deformation but do not always coincide with clear occurrences of transient signals in geodetic time series. Here we aim to extract the signal corresponding to slow slips hidden in the noise of GPS time series, without using information from independent data sets. We first build a library of synthetic slow slip event templates by assembling a source function with Green's functions for a discretized fault. We then correlate the templates with postprocessed GPS time series. Once the events have been detected in time, we estimate their duration T and magnitude Mw by modeling a weighted stack of GPS time series. An analysis of synthetic time series shows that this method is able to resolve the correct timing, location, T, and Mw of events larger than Mw 6 in the context of the Mexico subduction zone. Applied on a real data set of 29 GPS time series in the Guerrero area from 2005 to 2014, this technique allows us to detect 28 transient events from Mw 6.3 to 7.2 with durations that range from 3 to 39 days. These events have a dominant recurrence time of 40 days and are mainly located at the downdip edges of the Mw>7.5 slow slip events.
A geodetic matched-filter search for slow slip with application to the Mexico subduction zone
NASA Astrophysics Data System (ADS)
Rousset, B.; Campillo, M.; Lasserre, C.; Frank, W.; Cotte, N.; Walpersdorf, A.; Socquet, A.; Kostoglodov, V.
2017-12-01
Since the discovery of slow slip events, many methods have been successfully applied to model obvious transient events in geodetic time series, such as the widely used network strain filter. Independent seismological observations of tremors or low frequency earthquakes and repeating earthquakes provide evidence of low amplitude slow deformation but do not always coincide with clear occurrences of transient signals in geodetic time series. Here, we aim to extract the signal corresponding to slow slips hidden in the noise of GPS time series, without using information from independent datasets. We first build a library of synthetic slow slip event templates by assembling a source function with Green's functions for a discretized fault. We then correlate the templates with post-processed GPS time series. Once the events have been detected in time, we estimate their duration T and magnitude Mw by modelling a weighted stack of GPS time series. An analysis of synthetic time series shows that this method is able to resolve the correct timing, location, T and Mw of events larger than Mw 6.0 in the context of the Mexico subduction zone. Applied on a real data set of 29 GPS time series in the Guerrero area from 2005 to 2014, this technique allows us to detect 28 transient events from Mw 6.3 to 7.2 with durations that range from 3 to 39 days. These events have a dominant recurrence time of 40 days and are mainly located at the down dip edges of the Mw > 7.5 SSEs.
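The detection step of the matched-filter search, correlating a synthetic slow-slip template against a GPS time series, can be sketched with a sliding normalized correlation. This is our own minimal rendering of that step (one station, one template, no stacking or magnitude estimation); names and the threshold are assumptions.

```python
import numpy as np

def matched_filter_detect(series, template, thresh=0.9):
    """Sliding normalized (Pearson) correlation of a slow-slip template with
    a detrended GPS time series. Returns the correlation trace and the
    indices of candidate event onsets where it exceeds thresh. Sketch of
    the detection step only."""
    m = len(template)
    tz = (template - template.mean()) / template.std()
    cc = np.empty(len(series) - m + 1)
    for i in range(len(cc)):
        w = series[i:i + m]
        sd = w.std()
        cc[i] = (tz * (w - w.mean())).sum() / (m * sd) if sd > 0 else 0.0
    return cc, np.flatnonzero(cc > thresh)
```

An arctangent-shaped transient buried in a flat record correlates perfectly (cc = 1) at its true onset, so the correlation maximum recovers the event timing; duration and magnitude would then be estimated from a weighted stack, as the abstract describes.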
NASA Astrophysics Data System (ADS)
Coburn, C. A.; Qin, Y.; Zhang, J.; Staenz, K.
2015-12-01
Food security is one of the most pressing issues facing humankind. Recent estimates predict that over one billion people do not have enough food to meet their basic nutritional needs. The ability of remote sensing tools to monitor and model crop production and predict crop yield is essential for providing governments and farmers with vital information to ensure food security. Google Earth Engine (GEE) is a cloud computing platform which integrates storage and processing algorithms for massive remotely sensed imagery and vector data sets. By providing the capabilities of storing and analyzing the data sets, it provides an ideal platform for the development of advanced analytic tools for extracting key variables used in regional and national food security systems. With the high-performance computing and storage capabilities of GEE, a cloud-computing-based system for near real-time cropland monitoring was developed using multi-source remotely sensed data over large areas. The system is able to process and visualize the MODIS time series NDVI profile in conjunction with Landsat 8 image segmentation for crop monitoring. With multi-temporal Landsat 8 imagery, the crop fields are extracted using the image segmentation algorithm developed by Baatz et al. [1]. The MODIS time series NDVI data are modeled by TIMESAT [2], a software package developed for analyzing time series of satellite data. The seasonality of MODIS time series data, for example, the start date of the growing season, length of growing season, and NDVI peak at the field level, is obtained for evaluating crop-growth conditions. The system fuses MODIS time series NDVI data and Landsat 8 imagery to provide information on near real-time crop-growth conditions through the visualization of MODIS NDVI time series and comparison of multi-year NDVI profiles. Stakeholders, i.e., farmers and government officers, are able to obtain crop-growth information at the crop-field level online.
This unique utilization of GEE in combination with advanced analytic and extraction techniques provides a vital remote sensing tool for decision makers and scientists, with a high degree of flexibility to adapt to different uses.
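The seasonality metrics mentioned above (start of season, length of season, NDVI peak) can be illustrated with a simple threshold rule on an annual NDVI profile. This is only in the spirit of TIMESAT's outputs; TIMESAT itself first fits smooth model functions (e.g. double logistics) to the series, a step this sketch skips. Names and the 50% amplitude threshold are our assumptions.

```python
import numpy as np

def season_metrics(ndvi, frac=0.5):
    """Threshold-based phenology metrics from one annual NDVI profile
    (illustrative sketch, not TIMESAT). Start/end of season are the
    first/last samples at or above base + frac*(peak - base); length is
    their inclusive span; peak is the profile maximum."""
    base, peak = float(ndvi.min()), float(ndvi.max())
    level = base + frac * (peak - base)
    above = np.flatnonzero(ndvi >= level)
    return {"start": int(above[0]), "end": int(above[-1]),
            "length": int(above[-1] - above[0] + 1), "peak": peak}
```

On a bell-shaped monthly profile the rule picks out the summer months around the NDVI maximum, which is the field-level information the monitoring system exposes.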
Miklius, Asta; Flower, M.F.J.; Huijsmans, J.P.P.; Mukasa, S.B.; Castillo, P.
1991-01-01
Taal lava series can be distinguished from each other by differences in major and trace element trends and trace element ratios, indicating multiple magmatic systems associated with discrete centers in time and space. On Volcano Island, contemporaneous lava series range from typically calc-alkaline to iron-enriched. Major and trace element variation in these series can be modelled by fractionation of similar assemblages, with early fractionation of titano-magnetite in less iron-enriched series. However, phase compositional and petrographic evidence of mineral-liquid disequilibrium suggests that magma mixing played an important role in the evolution of these series. -from Authors
Fahnline, John B
2016-12-01
An equivalent source method is developed for solving transient acoustic boundary value problems. The method assumes the boundary surface is discretized in terms of triangular or quadrilateral elements and that the solution is represented using the acoustic fields of discrete sources placed at the element centers. Also, the boundary condition is assumed to be specified for the normal component of the surface velocity as a function of time, and the source amplitudes are determined to match the known elemental volume velocity vector at a series of discrete time steps. Equations are given for marching-on-in-time schemes to solve for the source amplitudes at each time step for simple, dipole, and tripole source formulations. Several example problems are solved to illustrate the results and to validate the formulations, including problems with closed boundary surfaces where long-time numerical instabilities typically occur. A simple relationship between the simple and dipole source amplitudes in the tripole source formulation is derived so that the source radiates primarily in the direction of the outward surface normal. The tripole source formulation is shown to eliminate interior acoustic resonances and long-time numerical instabilities.
Multifractality Signatures in Quasars Time Series. I. 3C 273
NASA Astrophysics Data System (ADS)
Belete, A. Bewketu; Bravo, J. P.; Canto Martins, B. L.; Leão, I. C.; De Araujo, J. M.; De Medeiros, J. R.
2018-05-01
The presence of multifractality in a time series indicates different correlations at different time scales, as well as intermittent behaviour that cannot be captured by a single scaling exponent. The identification of a multifractal nature allows for a characterization of the dynamics and of the intermittency of the fluctuations in non-linear and complex systems. In this study, we search for a possible multifractal structure (multifractality signature) of the flux variability in the quasar 3C 273 time series for all electromagnetic wavebands at different observation points, and for the origins of the observed multifractality. This study is intended to highlight how the scaling behaves across the different bands of the selected candidate, which can serve as an additional technique to group quasars based on the fractal signatures observed in their time series and to determine whether quasars are non-linear physical systems. The Multifractal Detrended Moving Average (MFDMA) algorithm has been used to study the scaling in non-linear, complex and dynamic systems. To achieve this goal, we applied the backward (θ = 0) MFDMA method for one-dimensional signals. We observe weak multifractal (close to monofractal) behaviour in some of the time series of our candidate, except in the mm, UV and X-ray bands. The non-linear temporal correlation is the main source of the observed multifractality in the time series, whereas the heaviness of the distribution contributes less.
This paper presents a technique for determining the trace gas emission rate from a point source. The technique was tested using data from controlled methane release experiments and from measurement downwind of a natural gas production facility in Wyoming. Concentration measuremen...
SiGN-SSM: open source parallel software for estimating gene networks with state space models.
Tamada, Yoshinori; Yamaguchi, Rui; Imoto, Seiya; Hirose, Osamu; Yoshida, Ryo; Nagasaki, Masao; Miyano, Satoru
2011-04-15
SiGN-SSM is an open-source gene network estimation software package able to run in parallel on PCs and massively parallel supercomputers. The software estimates a state space model (SSM), a statistical dynamic model suitable for analyzing short and/or replicated time-series gene expression profiles. SiGN-SSM implements a novel parameter constraint that is effective in stabilizing the estimated models. Also, by using a supercomputer, it is able to determine the gene network structure by a statistical permutation test in a practical time. SiGN-SSM is applicable not only to analyzing temporal regulatory dependencies between genes, but also to extracting differentially regulated genes from time-series expression profiles. SiGN-SSM is distributed under the GNU Affero General Public License (GNU AGPL) version 3 and can be downloaded at http://sign.hgc.jp/signssm/. Pre-compiled binaries for some architectures are available in addition to the source code. The pre-installed binaries are also available on the Human Genome Center supercomputer system. The online manual and the supplementary information for SiGN-SSM are available on our web site. tamada@ims.u-tokyo.ac.jp.
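As an illustration of the kind of state space model SiGN-SSM estimates, the sketch below implements a plain linear-Gaussian Kalman filter in Python. This is not SiGN-SSM's actual estimation code (which fits the SSM with a parameter constraint and uses permutation tests on supercomputers); the matrices and values here are illustrative assumptions.

```python
import numpy as np

def kalman_filter(y, F, H, Q, R, x0, P0):
    """Kalman filter for the linear-Gaussian state space model
       x_t = F x_{t-1} + w_t,   y_t = H x_t + v_t,
    returning the filtered state estimates."""
    x, P = np.asarray(x0, float), np.asarray(P0, float)
    estimates = []
    for obs in y:
        # Predict step
        x = F @ x
        P = F @ P @ F.T + Q
        # Update step with the new observation
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (np.atleast_1d(obs) - H @ x)
        P = (np.eye(x.size) - K @ H) @ P
        estimates.append(x.copy())
    return np.array(estimates)

# Noisy observations of a constant hidden state (true value 5.0)
rng = np.random.default_rng(7)
y = 5.0 + rng.normal(0.0, 0.5, size=200)
F = H = np.array([[1.0]])
Q = np.array([[1e-6]])
R = np.array([[0.25]])
est = kalman_filter(y, F, H, Q, R, x0=[0.0], P0=[[1.0]])
```

In a gene-network SSM the hidden state is a low-dimensional module vector and F encodes the temporal regulatory dependencies; estimation then alternates a smoother with parameter updates (EM), which is beyond this sketch.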
Ocean time-series near Bermuda: Hydrostation S and the US JGOFS Bermuda Atlantic time-series study
NASA Technical Reports Server (NTRS)
Michaels, Anthony F.; Knap, Anthony H.
1992-01-01
Bermuda is the site of two ocean time-series programs. At Hydrostation S, the ongoing biweekly profiles of temperature, salinity and oxygen now span 37 years. This is one of the longest open-ocean time-series data sets and provides a view of decadal scale variability in ocean processes. In 1988, the U.S. JGOFS Bermuda Atlantic Time-series Study began a wide range of measurements at a frequency of 14-18 cruises each year to understand temporal variability in ocean biogeochemistry. On each cruise, the data range from chemical analyses of discrete water samples to data from electronic packages of hydrographic and optics sensors. In addition, a range of biological and geochemical rate measurements are conducted that integrate over time-periods of minutes to days. This sampling strategy yields a reasonable resolution of the major seasonal patterns and of decadal scale variability. The Sargasso Sea also has a variety of episodic production events on scales of days to weeks and these are only poorly resolved. In addition, there is a substantial amount of mesoscale variability in this region and some of the perceived temporal patterns are caused by the intersection of the biweekly sampling with the natural spatial variability. In the Bermuda time-series programs, we have added a series of additional cruises to begin to assess these other sources of variation and their impacts on the interpretation of the main time-series record. However, the adequate resolution of higher frequency temporal patterns will probably require the introduction of new sampling strategies and some emerging technologies such as biogeochemical moorings and autonomous underwater vehicles.
Characterizing and estimating noise in InSAR and InSAR time series with MODIS
Barnhart, William D.; Lohman, Rowena B.
2013-01-01
InSAR time series analysis is increasingly used to image subcentimeter displacement rates of the ground surface. The precision of InSAR observations is often affected by several noise sources, including spatially correlated noise from the turbulent atmosphere. Under ideal scenarios, InSAR time series techniques can substantially mitigate these effects; however, in practice the temporal distribution of InSAR acquisitions over much of the world exhibits seasonal biases, long temporal gaps, and insufficient acquisitions to confidently obtain the precisions desired for tectonic research. Here, we introduce a technique for constraining the magnitude of errors expected from atmospheric phase delays on the ground displacement rates inferred from an InSAR time series, using independent observations of precipitable water vapor from MODIS. We implement a Monte Carlo error estimation technique based on multiple (100+) MODIS-based time series that sample date ranges close to the acquisition times of the available SAR imagery. This stochastic approach allows evaluation of the significance of signals present in the final time series product, in particular their correlation with topography and seasonality. We find that topographically correlated noise in individual interferograms is not spatially stationary, even over short spatial scales (<10 km). Overall, MODIS-inferred displacements and velocities exhibit errors of similar magnitude to the variability within an InSAR time series. We examine the MODIS-based confidence bounds in regions with a range of inferred displacement rates, and find we are capable of resolving velocities as low as 1.5 mm/yr, with uncertainties increasing to ∼6 mm/yr in regions with higher topographic relief.
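The Monte Carlo idea described above can be sketched in a few lines: generate many noise-only time series sampled at the (irregular) acquisition times, fit a rate to each, and take the scatter of the fitted rates as the velocity confidence bound. The sampling dates, noise level, and signal rate below are illustrative assumptions, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(42)

# Irregular acquisition times (years) mimicking a sparse SAR archive
t = np.sort(rng.uniform(0.0, 8.0, size=30))
G = np.column_stack([t, np.ones_like(t)])     # design matrix: rate + offset

def fit_rate(d):
    """Least-squares displacement rate (mm/yr) from a time series d (mm)."""
    m, _, _, _ = np.linalg.lstsq(G, d, rcond=None)
    return m[0]

# Monte Carlo: 200 noise-only time series with 5 mm scatter (a stand-in for
# MODIS-derived atmospheric delay realizations) -> rate uncertainty
rates = [fit_rate(rng.normal(0.0, 5.0, size=t.size)) for _ in range(200)]
sigma_v = np.std(rates)        # 1-sigma confidence bound on the velocity

# A true 3 mm/yr signal plus the same noise level is recovered within bounds
d = 3.0 * t + rng.normal(0.0, 5.0, size=t.size)
v_hat = fit_rate(d)
```

In the actual technique each synthetic series is built from MODIS water-vapor maps near the SAR dates, so the noise carries realistic spatial and seasonal structure rather than being white.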
tsiR: An R package for time-series Susceptible-Infected-Recovered models of epidemics.
Becker, Alexander D; Grenfell, Bryan T
2017-01-01
tsiR is an open source software package implemented in the R programming language designed to analyze infectious disease time-series data. The software extends a well-studied and widely-applied algorithm, the time-series Susceptible-Infected-Recovered (TSIR) model, to infer parameters from incidence data, such as contact seasonality, and to forward simulate the underlying mechanistic model. The tsiR package aggregates a number of different fitting features previously described in the literature in a user-friendly way, providing support for their broader adoption in infectious disease research. Also included in tsiR are a number of diagnostic tools to assess the fit of the TSIR model. This package should be useful for researchers analyzing incidence data for fully-immunizing infectious diseases.
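A minimal deterministic sketch of the TSIR recursion that tsiR fits is given below (the package itself is written in R and also supports stochastic simulation and parameter inference); the seasonal contact rate, population sizes, and mixing exponent are illustrative assumptions.

```python
import numpy as np

def tsir_simulate(S0, I0, births, beta, alpha=0.97, steps=200):
    """Forward-simulate the (deterministic, mean-field) TSIR model:
        I[t+1] = beta[t mod m] * S[t] * I[t]**alpha / N
        S[t+1] = S[t] + births - I[t+1]
    beta is a seasonal contact-rate schedule of length m."""
    N = S0 + I0
    m = len(beta)
    S, I = [float(S0)], [float(I0)]
    for t in range(steps):
        new_I = beta[t % m] * S[-1] * I[-1] ** alpha / N
        new_I = min(new_I, S[-1] + births)   # cannot infect more than available
        S.append(S[-1] + births - new_I)
        I.append(new_I)
    return np.array(S), np.array(I)

# Illustrative measles-like setup: biweekly steps, seasonally forced contacts
beta = 17.0 * (1.0 + 0.15 * np.cos(2 * np.pi * np.arange(26) / 26.0))
S, I = tsir_simulate(S0=50_000, I0=10, births=120, beta=beta, steps=260)
```

The deterministic version can burn out to exactly zero infecteds, a known artifact; tsiR's stochastic runs draw I[t+1] from a negative binomial instead, which avoids this.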
Cosmogenic 36Cl in karst waters: Quantifying contributions from atmospheric and bedrock sources
NASA Astrophysics Data System (ADS)
Johnston, V. E.; McDermott, F.
2009-12-01
Improved reconstructions of cosmogenic isotope production through time are crucial to understanding past solar variability. As a preliminary step to derive atmospheric 36Cl/Cl solar proxy time series from speleothems, we quantify 36Cl sources in cave dripwaters. Atmospheric 36Cl fallout rates are a potential proxy for solar output; however, extraneous 36Cl derived from in-situ production in cave host rocks could complicate the solar signal. Results from numerical modeling and preliminary geochemical data presented here show that the atmospheric 36Cl source dominates in many, but not all, cave dripwaters. At favorable low-elevation, mid-latitude sites, 36Cl-based speleothem solar irradiance reconstructions could extend back to 500 ka, with a possible centennial-scale temporal resolution. This would represent a marginal improvement in resolution compared with existing polar ice core records, with the added advantages of a wider geographic range, an independent U-series-constrained chronology, and the potential for contemporaneous climate signals within the same speleothem material.
A Dashboard for the Italian Computing in ALICE
NASA Astrophysics Data System (ADS)
Elia, D.; Vino, G.; Bagnasco, S.; Crescente, A.; Donvito, G.; Franco, A.; Lusso, S.; Mura, D.; Piano, S.; Platania, G.; ALICE Collaboration
2017-10-01
A dashboard devoted to the computing in the Italian sites for the ALICE experiment at the LHC has been deployed. A combination of different complementary monitoring tools is typically used in most of the Tier-2 sites: this makes it somewhat difficult to figure out at a glance the status of the site and to compare information extracted from different sources for debugging purposes. To overcome these limitations a dedicated ALICE dashboard has been designed and implemented in each of the ALICE Tier-2 sites in Italy: in particular, it provides a single, interactive and easily customizable graphical interface where heterogeneous data are presented. The dashboard is based on two main ingredients: an open-source time-series database and a dashboard builder tool for visualizing time-series metrics. Various sensors, able to collect data from the multiple data sources, have also been written. A first version of a national computing dashboard has been implemented using a specific instance of the builder to gather data from all the local databases.
The plant phenological online database (PPODB): an online database for long-term phenological data
NASA Astrophysics Data System (ADS)
Dierenbach, Jonas; Badeck, Franz-W.; Schaber, Jörg
2013-09-01
We present an online database that provides unrestricted and free access to over 16 million plant phenological observations from over 8,000 stations in Central Europe between the years 1880 and 2009. Unique features are (1) a flexible and unrestricted access to a full-fledged database, allowing for a wide range of individual queries and data retrieval, (2) historical data for Germany before 1951 ranging back to 1880, and (3) more than 480 curated long-term time series covering more than 100 years for individual phenological phases and plants combined over Natural Regions in Germany. Time series for single stations or Natural Regions can be accessed through a user-friendly graphical geo-referenced interface. The joint databases made available with the plant phenological database PPODB render accessible an important data source for further analyses of long-term changes in phenology. The database can be accessed via
Mohammed, Emad A.; Naugler, Christopher
2017-01-01
Background: Demand forecasting is the area of predictive analytics devoted to predicting future volumes of services or consumables. Fair understanding and estimation of how demand will vary facilitates the optimal utilization of resources. In a medical laboratory, accurate forecasting of future demand, that is, test volumes, can increase efficiency and facilitate long-term laboratory planning. Importantly, in an era of utilization management initiatives, accurately predicted volumes compared to the realized test volumes can form a precise way to evaluate utilization management initiatives. Laboratory test volumes are often highly amenable to forecasting by time-series models; however, the statistical software needed to do this is generally either expensive or highly technical. Method: In this paper, we describe an open-source web-based software tool for time-series forecasting and explain how to use it as a demand forecasting tool in clinical laboratories to estimate test volumes. Results: This tool has three different models, that is, Holt-Winters multiplicative, Holt-Winters additive, and simple linear regression. Moreover, these models are ranked and the best one is highlighted. Conclusion: This tool will allow anyone with historic test volume data to model future demand. PMID:28400996
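Of the three models in the tool, the Holt-Winters additive method can be sketched compactly. The implementation below is an illustrative standalone version of the standard additive recursions, not the authors' code; the smoothing constants and the synthetic quarterly test volumes are assumptions.

```python
def holt_winters_additive(y, m, alpha=0.3, beta=0.1, gamma=0.2, h=4):
    """Additive Holt-Winters smoothing; returns h-step-ahead forecasts.
    y: observed series (at least two full seasons), m: season length."""
    # Initialize level, trend, and seasonal indices from the first two seasons
    season1 = sum(y[:m]) / m
    season2 = sum(y[m:2 * m]) / m
    level = season1
    trend = (season2 - season1) / m
    seasonal = [y[i] - season1 for i in range(m)]

    for t in range(m, len(y)):
        s = seasonal[t % m]
        new_level = alpha * (y[t] - s) + (1 - alpha) * (level + trend)
        trend = beta * (new_level - level) + (1 - beta) * trend
        seasonal[t % m] = gamma * (y[t] - new_level) + (1 - gamma) * s
        level = new_level

    n = len(y)
    return [level + (k + 1) * trend + seasonal[(n + k) % m] for k in range(h)]

# Synthetic quarterly test volumes with a linear trend plus seasonality
m = 4
y = [100 + 2.0 * t + [15, -5, -20, 10][t % m] for t in range(40)]
fcst = holt_winters_additive(y, m)
```

For real laboratory data the smoothing constants would be chosen by minimizing in-sample forecast error, which is how tools of this kind typically rank the competing models.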
BiGGEsTS: integrated environment for biclustering analysis of time series gene expression data
Gonçalves, Joana P; Madeira, Sara C; Oliveira, Arlindo L
2009-01-01
Background The ability to monitor changes in expression patterns over time, and to observe the emergence of coherent temporal responses using expression time series, is critical to advance our understanding of complex biological processes. Biclustering has been recognized as an effective method for discovering local temporal expression patterns and unraveling potential regulatory mechanisms. The general biclustering problem is NP-hard. In the case of time series this problem is tractable, and efficient algorithms can be used. However, there is still a need for specialized applications able to take advantage of the temporal properties inherent to expression time series, both from a computational and a biological perspective. Findings BiGGEsTS makes available state-of-the-art biclustering algorithms for analyzing expression time series. Gene Ontology (GO) annotations are used to assess the biological relevance of the biclusters. Methods for preprocessing expression time series and post-processing results are also included. The analysis is additionally supported by a visualization module capable of displaying informative representations of the data, including heatmaps, dendrograms, expression charts and graphs of enriched GO terms. Conclusion BiGGEsTS is a free open source graphical software tool for revealing local coexpression of genes in specific intervals of time, while integrating meaningful information on gene annotations. It is freely available at: . We present a case study on the discovery of transcriptional regulatory modules in the response of Saccharomyces cerevisiae to heat stress. PMID:19583847
Early warning by near-real time disturbance monitoring (Invited)
NASA Astrophysics Data System (ADS)
Verbesselt, J.; Zeileis, A.; Herold, M.
2013-12-01
Near real-time monitoring of ecosystem disturbances is critical for rapidly assessing and addressing impacts on carbon dynamics, biodiversity, and socio-ecological processes. Satellite remote sensing enables cost-effective and accurate monitoring at frequent time steps over large areas. Yet, generic methods to detect disturbances within newly captured satellite images are lacking. We propose a multi-purpose time-series-based disturbance detection approach that identifies and models stable historical variation to enable change detection within newly acquired data. Satellite image time series of vegetation greenness provide a global record of terrestrial vegetation productivity over the past decades. Here, we assess and demonstrate the method by applying it to (1) real-world satellite greenness image time series between February 2000 and July 2011 covering Somalia, to detect drought-related vegetation disturbances, and (2) Landsat image time series, to detect forest disturbances. First, the results illustrate that disturbances are successfully detected in near real-time while being robust to seasonality and noise. Second, major drought-related disturbances corresponding to the most drought-stressed regions in Somalia are detected from mid-2010 onwards. Third, the method can be applied to Landsat image time series having a lower temporal data density. Furthermore, the method can analyze in-situ or satellite data time series of biophysical indicators from local to global scales, since it is fast, does not depend on thresholds, and does not require time-series gap filling. While the data and methods used are appropriate for proof-of-concept development of global-scale disturbance monitoring, specific applications (e.g., drought or deforestation monitoring) mandate integration within an operational monitoring framework. Furthermore, the real-time monitoring method is implemented in an open-source environment and is freely available in the BFAST package for R software.
Information illustrating how to apply the method to satellite image time series is available at http://bfast.R-Forge.R-project.org/ and in the example section of the bfastmonitor() function within the BFAST package.
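The monitoring idea, fitting a season-trend model on a stable history period and flagging new observations that deviate from it, can be sketched as follows. This is a simplified Python stand-in for bfastmonitor, which is implemented in R and uses MOSUM tests rather than the assumed first-order harmonic model and 3-sigma rule used here.

```python
import numpy as np

def monitor(t_hist, y_hist, t_new, y_new, freq=1.0, k=3.0):
    """Fit trend + first-order harmonic season on the stable history,
    then flag new observations whose residuals exceed k * sigma."""
    def design(t):
        w = 2 * np.pi * freq * np.asarray(t)
        return np.column_stack([np.ones_like(w), np.asarray(t),
                                np.sin(w), np.cos(w)])
    X = design(t_hist)
    coef, _, _, _ = np.linalg.lstsq(X, y_hist, rcond=None)
    resid = y_hist - X @ coef
    sigma = resid.std(ddof=X.shape[1])
    pred = design(t_new) @ coef
    return np.abs(y_new - pred) > k * sigma

rng = np.random.default_rng(0)
# Stable 10-year NDVI-like history, ~23 observations per year
t_hist = np.arange(0, 10, 1 / 23)
y_hist = 0.5 + 0.2 * np.sin(2 * np.pi * t_hist) + rng.normal(0, 0.02, t_hist.size)
# New monitoring year with a disturbance (abrupt drop) part-way through
t_new = np.arange(10, 11, 1 / 23)
y_new = 0.5 + 0.2 * np.sin(2 * np.pi * t_new) + rng.normal(0, 0.02, t_new.size)
y_new[10:] -= 0.3                     # simulated disturbance
flags = monitor(t_hist, y_hist, t_new, y_new)
```

The detection is robust to seasonality here because the seasonal cycle is part of the fitted history model, mirroring the season-trend decomposition used by BFAST.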
The Mighty Atom? The Development of Nuclear Power Technology
ERIC Educational Resources Information Center
Harris, Frank
2014-01-01
The use of nuclear energy for the generation of electricity started in the 1950s and was viewed, at the time, as a source of virtually free power. Development flourished and some countries adopted the nuclear option as their principal source for producing electrical energy. However, a series of nuclear incidents and concern about the treatment of…
Earthquake forecasting studies using radon time series data in Taiwan
NASA Astrophysics Data System (ADS)
Walia, Vivek; Kumar, Arvind; Fu, Ching-Chou; Lin, Shih-Jung; Chou, Kuang-Wu; Wen, Kuo-Liang; Chen, Cheng-Hong
2017-04-01
For a few decades, a growing number of studies have shown the usefulness of data in the field of seismogeochemistry, interpreted as geochemical precursory signals for impending earthquakes, and radon is identified as one of the most reliable geochemical precursors. Radon is recognized as a short-term precursor and is being monitored in many countries. This study is aimed at developing an effective earthquake forecasting system by inspecting long-term radon time series data. The data are obtained from a network of radon monitoring stations established along different faults of Taiwan. The continuous time-series radon data for earthquake studies have been recorded, and some significant variations associated with strong earthquakes have been observed. The data are also examined to evaluate earthquake precursory signals against environmental factors. An automated real-time database operating system has been developed recently to improve the data processing for earthquake precursory studies. In addition, the study is aimed at the appraisal and filtration of these environmental parameters, in order to create a real-time database that helps our earthquake precursory study. In recent years, an automatic operating real-time database has been developed using R, an open-source programming language, to carry out statistical computation on the data. To integrate our data with our working procedure, we use the popular open-source web application solution AMP (Apache, MySQL, and PHP), creating a website that can effectively display and help us manage the real-time database.
National Health Expenditures, 1996
Levit, Katharine R.; Lazenby, Helen C.; Braden, Bradley R.; Cowan, Cathy A.; Sensenig, Arthur L.; McDonnell, Patricia A.; Stiller, Jean M.; Won, Darleen K.; Martin, Anne B.; Sivarajan, Lekha; Donham, Carolyn S.; Long, Anna M.; Stewart, Madie W.
1997-01-01
The national health expenditures (NHE) series presented in this report for 1960-96 provides a view of the economic history of health care in the United States through spending for health care services and the sources financing that care. In 1996 NHE topped $1 trillion. At the same time, spending grew at the slowest rate, 4.4 percent, ever recorded in the current series. For the first time, this article presents estimates of Medicare managed care payments by type of service, as well as nursing home and home health spending in hospital-based facilities. PMID:10179997
Information-Theoretical Analysis of EEG Microstate Sequences in Python.
von Wegner, Frederic; Laufs, Helmut
2018-01-01
We present an open-source Python package to compute information-theoretical quantities for electroencephalographic data. Electroencephalography (EEG) measures the electrical potential generated by the cerebral cortex, and the set of spatial patterns projected by the brain's electrical potential onto the scalp surface can be clustered into a set of representative maps called EEG microstates. Microstate time series are obtained by competitively fitting the microstate maps back into the EEG data set, i.e., by substituting the EEG data at a given time with the label of the microstate that has the highest similarity with the actual EEG topography. As microstate sequences consist of non-metric random variables, e.g., the letters A-D, we recently introduced information-theoretical measures to quantify these time series. In wakeful resting-state EEG recordings, we found new characteristics of microstate sequences, such as periodicities related to EEG frequency bands. The algorithms used are here provided as an open-source package and their use is explained in tutorial style. The package is self-contained and the programming style is procedural, focusing on code intelligibility and easy portability. Using a sample EEG file, we demonstrate how to perform EEG microstate segmentation using the modified K-means approach, and how to compute and visualize the recently introduced information-theoretical tests and quantities. The time-lagged mutual information function is derived as a discrete symbolic alternative to the autocorrelation function for metric time series, and confidence intervals are computed from Markov chain surrogate data. The software package provides an open-source extension to the existing implementations of the microstate transform and is specifically designed to analyze resting-state EEG recordings.
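The time-lagged mutual information for symbolic sequences described above can be sketched as follows; this is an illustrative re-implementation from the definition, not the package's code. For a perfectly periodic four-state sequence, the past symbol fully determines the future one, so the mutual information equals the 2-bit symbol entropy at every lag.

```python
import numpy as np
from collections import Counter

def lagged_mutual_information(symbols, lag):
    """Mutual information (bits) between the symbol at time t and at t+lag,
    estimated from the empirical joint distribution of symbol pairs."""
    pairs = list(zip(symbols[:-lag], symbols[lag:]))
    n = len(pairs)
    joint = Counter(pairs)
    px = Counter(x for x, _ in pairs)     # marginal counts of the earlier symbol
    py = Counter(y for _, y in pairs)     # marginal counts of the later symbol
    mi = 0.0
    for (x, y), c in joint.items():
        p_xy = c / n
        # p_xy * log2( p_xy / (p_x * p_y) ), with p_x = px/n, p_y = py/n
        mi += p_xy * np.log2(p_xy * n * n / (px[x] * py[y]))
    return mi

# A perfectly periodic microstate label sequence A,B,C,D,A,B,...
seq = list("ABCD") * 100
mi4 = lagged_mutual_information(seq, 4)
```

For real microstate sequences this estimator is biased upward for small samples, which is why the package compares the empirical values against Markov chain surrogate data.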
NASA Astrophysics Data System (ADS)
Eberle, J.; Schmullius, C.
2017-12-01
Increasing archives of global satellite data present a new challenge: handling multi-source satellite data in a user-friendly way. Any user is confronted with different data formats and data access services. In addition, the handling of time-series data is complex, as automated processing and execution of data processing steps are needed to supply the user with the desired product for a specific area of interest. In order to simplify access to the data archives of various satellite missions and to facilitate subsequent processing, a regional data and processing middleware has been developed. The aim of this system is to provide standardized, web-based interfaces to multi-source time-series data for individual regions on Earth. For further use and analysis, uniform data formats and data access services are provided. Interfaces to the data archives of the MODIS sensor (NASA) as well as the Landsat (USGS) and Sentinel (ESA) satellites have been integrated into the middleware. Various scientific algorithms, such as the calculation of trends and breakpoints in time-series data, can be carried out on the preprocessed data on the basis of uniform data management. Jupyter Notebooks are linked to the data, and further processing can be conducted directly on the server using Python and the statistical language R. In addition to accessing EO data, the middleware is also used as an intermediary between the user and external databases (e.g., Flickr, YouTube). Standardized web services as specified by OGC are provided for all tools of the middleware. Currently, the use of cloud services is being researched to bring algorithms to the data. As a thematic example, operational monitoring of vegetation phenology is being implemented on the basis of various optical satellite data and validation data from the German Weather Service. Other examples demonstrate the monitoring of wetlands, focusing on automated discovery and access of Landsat and Sentinel data for local areas.
High Resolution Geological Site Characterization Utilizing Ground Motion Data
1992-06-26
Hayward, 1992). Acquisition: The source characterization array was composed of 28 stations evenly distributed on the circumference of a... of analog anti-alias filters, no prefiltering was applied during acquisition. Results: We deployed 9 different sources within the source... calculated using a 1024-point Hamming window applied to the original 1000-point detrended and padded time series. These are then contoured as a
NASA Technical Reports Server (NTRS)
Biezad, D. J.; Schmidt, D. K.; Leban, F.; Mashiko, S.
1986-01-01
Single-channel pilot manual control output in closed-tracking tasks is modeled in terms of linear discrete transfer functions which are parsimonious and guaranteed stable. The transfer functions are found by applying a modified superposition time-series generation technique. A Levinson-Durbin algorithm is used to determine the filter which prewhitens the input, and a projective (least squares) fit of pulse response estimates is used to guarantee identified model stability. Results from two case studies are compared to previous findings, where the source data are relatively short records, approximately 25 seconds long. Time delay effects and pilot seasonalities are discussed and analyzed. It is concluded that single-channel time-series controller modeling is feasible on short records, and that it is important for the analyst to determine a criterion for best time-domain fit which allows association of model parameter values, such as pure time delay, with actual physical and physiological constraints. The purpose of the modeling is thus paramount.
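The Levinson-Durbin step can be illustrated with a minimal implementation of the recursion that solves the Yule-Walker equations for the prewhitening (AR) filter coefficients; the test autocorrelation below corresponds to an AR(1) process with coefficient 0.5 and is an assumption for demonstration.

```python
def levinson_durbin(r):
    """Solve the Yule-Walker equations for AR coefficients via the
    Levinson-Durbin recursion.
    r: autocorrelation sequence r[0..p]; returns (a, prediction_error),
    where a are the AR coefficients of the order-p predictor."""
    p = len(r) - 1
    a = [0.0] * p
    err = r[0]
    for k in range(p):
        # Reflection coefficient for the order-(k+1) predictor
        acc = r[k + 1] - sum(a[j] * r[k - j] for j in range(k))
        kappa = acc / err
        # Order-recursive coefficient update
        new_a = a[:]
        new_a[k] = kappa
        for j in range(k):
            new_a[j] = a[j] - kappa * a[k - 1 - j]
        a = new_a
        err *= (1.0 - kappa * kappa)
    return a, err

# AR(1) with coefficient 0.5 has autocorrelation r[k] = 0.5**k
a, err = levinson_durbin([1.0, 0.5, 0.25, 0.125])
```

The prewhitening filter is then the FIR filter with taps [1, -a[0], -a[1], ...] applied to the input record before identification.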
Ghalyan, Najah F; Miller, David J; Ray, Asok
2018-06-12
Estimation of a generating partition is critical for symbolization of measurements from discrete-time dynamical systems, where a sequence of symbols from a (finite-cardinality) alphabet may uniquely specify the underlying time series. Such symbolization is useful for computing measures (e.g., Kolmogorov-Sinai entropy) to identify or characterize the (possibly unknown) dynamical system. It is also useful for time series classification and anomaly detection. The seminal work of Hirata, Judd, and Kilminster (2004) derives a novel objective function, akin to a clustering objective, that measures the discrepancy between a set of reconstruction values and the points from the time series. They cast estimation of a generating partition via the minimization of their objective function. Unfortunately, their proposed algorithm is nonconvergent, with no guarantee of finding even locally optimal solutions with respect to their objective. The difficulty is a heuristic nearest-neighbor symbol assignment step. Alternatively, we develop a novel, locally optimal algorithm for their objective. We apply iterative nearest-neighbor symbol assignments with guaranteed discrepancy descent, by which joint, locally optimal symbolization of the entire time series is achieved. While most previous approaches frame generating partition estimation as a state-space partitioning problem, we recognize that minimizing the Hirata et al. (2004) objective function does not induce an explicit partitioning of the state space, but rather of the space consisting of the entire time series (effectively, clustering in a (countably) infinite-dimensional space). Our approach also amounts to a novel type of sliding block lossy source coding. Improvement, with respect to several measures, is demonstrated over popular methods for symbolizing chaotic maps. We also apply our approach to time-series anomaly detection, considering both chaotic maps and a failure application in a polycrystalline alloy material.
NASA Astrophysics Data System (ADS)
Wang, H.; Cheng, J.
2017-12-01
A method to synthesize natural electric and magnetic time series is proposed, whereby the time series at a local site are derived using an impulse response and a reference (STIR). The method is based on the assumption that the external source of the magnetic fields is uniform and that the electric and magnetic fields acquired at the surface satisfy a time-independent linear relation in the frequency domain. According to the convolution theorem, we can synthesize natural electric and magnetic time series using the impulse responses of inter-station transfer functions with a reference. Applying this method, two impulse responses need to be estimated: the quasi-MT impulse response tensor and the horizontal magnetic impulse response tensor. These impulse response tensors relate the local horizontal electric and magnetic components to the horizontal magnetic components at a reference site, respectively. Some clean segments of time series are selected to estimate the impulse responses using the least-squares (LS) method. STIR is similar to STIN (Wang, 2017), but STIR does not need to estimate the inter-station transfer functions, and the synthesized data are more accurate at high frequencies, where STIN fails when the inter-station transfer functions are severely contaminated. A test with good-quality MT data shows that the synthetic time series are similar to the natural electric and magnetic time series. For a contaminated AMT example, when this method is used to remove noise present at the local site, the scatter of the MT sounding curves is clearly reduced and the data quality is improved. *This work is funded by the National Key R&D Program of China (2017YFC0804105), the National Natural Science Foundation of China (41604064, 51574250), and the State Key Laboratory of Coal Resources and Safe Mining, China University of Mining & Technology (SKLCRSM16DC09).
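The core of the approach, estimating an impulse response from a clean segment by least squares and then synthesizing the local channel by convolving the reference with it, can be sketched with scalar channels (the method itself works with tensors relating horizontal field components); all signals below are synthetic assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# True impulse response relating a local channel to the reference channel
h_true = np.array([0.8, -0.3, 0.1])
L = len(h_true)

# Reference magnetic channel and the local channel it generates
b_ref = rng.normal(size=500)
e_loc = np.convolve(b_ref, h_true)[:500]

# --- Estimation: least squares on a "clean" segment -----------------------
# Build the convolution (Toeplitz) matrix of the reference signal
seg = slice(50, 300)
rows = [[b_ref[t - j] if t - j >= 0 else 0.0 for j in range(L)]
        for t in range(seg.start, seg.stop)]
A = np.array(rows)
h_est, _, _, _ = np.linalg.lstsq(A, e_loc[seg], rcond=None)

# --- Synthesis: convolve the reference with the estimated response --------
e_syn = np.convolve(b_ref, h_est)[:500]
```

When the local channel outside the clean segment is contaminated, the synthesized series e_syn can replace it, which is the noise-removal use demonstrated in the AMT example.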
NASA Astrophysics Data System (ADS)
Kuze, A.; Suto, H.; Kataoka, F.; Shiomi, K.; Kondo, Y.; Crisp, D.; Butz, A.
2017-12-01
Atmospheric methane (CH4) has an important role in global radiative forcing of climate but its emission estimates have larger uncertainties than carbon dioxide (CO2). The area of anthropogenic emission sources is usually much smaller than 100 km2. The Thermal And Near infrared Sensor for carbon Observation Fourier-Transform Spectrometer (TANSO-FTS) onboard the Greenhouse gases Observing SATellite (GOSAT) has measured CO2 and CH4 column density using sun light reflected from the earth's surface. It has an agile pointing system and its footprint can cover 87-km2 with a single detector. By specifying pointing angles and observation time for every orbit, TANSO-FTS can target various CH4 point sources together with reference points every 3 day over years. We selected a reference point that represents CH4 background density before or after targeting a point source. By combining satellite-measured enhancement of the CH4 column density and surface measured wind data or estimates from the Weather Research and Forecasting (WRF) model, we estimated CH4emission amounts. Here, we picked up two sites in the US West Coast, where clear sky frequency is high and a series of data are available. The natural gas leak at Aliso Canyon showed a large enhancement and its decrease with time since the initial blowout. We present time series of flux estimation assuming the source is single point without influx. The observation of the cattle feedlot in Chino, California has weather station within the TANSO-FTS footprint. The wind speed is monitored continuously and the wind direction is stable at the time of GOSAT overpass. The large TANSO-FTS footprint and strong wind decreases enhancement below noise level. Weak wind shows enhancements in CH4, but the velocity data have large uncertainties. We show the detection limit of single samples and how to reduce uncertainty using time series of satellite data. 
We will propose that the next generation instruments for accurate anthropogenic CO2 and CH4 flux estimation have improve spatial resolution (˜1km2 ) to further enhance column density changes. We also propose adding imaging capability to monitor plume orientation. We will present laboratory model results and a sampling pattern optimization study that combines local emission source and global survey observations.
Multifractal behavior of an air pollutant time series and the relevance to the predictability.
Dong, Qingli; Wang, Yong; Li, Peizhi
2017-03-01
Compared with the traditional method of detrended fluctuation analysis, which is used to characterize fractal scaling properties and long-range correlations, this research provides new insight into the multifractality and predictability of a nonstationary air pollutant time series using the methods of spectral analysis and multifractal detrended fluctuation analysis. First, the existence of significant power-law behavior and long-range correlations in such series is verified. Then, by employing shuffling and surrogating procedures and estimating the scaling exponents, the major source of multifractality in these pollutant series is found to be the fat-tailed probability density function. Long-range correlations also partly contribute to the multifractal features. The relationship between the predictability of the pollutant time series and their multifractal nature is then investigated with extended quantitative analyses, and it is found that the contribution of the multifractal strength of long-range correlations to the overall multifractal strength can affect the predictability of a pollutant series in a specific region to some extent. The findings of this comprehensive study can help to better understand the mechanisms governing the dynamics of air pollutant series and aid in performing better meteorological assessment and management. Copyright © 2016 Elsevier Ltd. All rights reserved.
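The detrended fluctuation analysis underlying the multifractal extension above can be sketched in a few lines. This minimal first-order (monofractal) version — window sizes and test series are illustrative, not the pollutant data — recovers the expected scaling exponents of roughly 0.5 for white noise and 1.5 for its cumulative sum:

```python
import numpy as np

def dfa(x, scales):
    """First-order detrended fluctuation analysis: returns the scaling
    exponent alpha from a log-log fit of fluctuation vs. window size."""
    y = np.cumsum(x - np.mean(x))          # integrated profile
    F = []
    for s in scales:
        nseg = len(y) // s
        t = np.arange(s)
        msq = []
        for i in range(nseg):
            seg = y[i * s:(i + 1) * s]
            coef = np.polyfit(t, seg, 1)   # local linear trend
            msq.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(msq)))
    alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
    return float(alpha)

rng = np.random.default_rng(1)
white = rng.standard_normal(20000)
alpha_white = dfa(white, [16, 32, 64, 128, 256])   # expected near 0.5

# A strongly correlated series: the cumulative sum of white noise
# (Brownian-like) has alpha near 1.5.
brown = np.cumsum(white)
alpha_brown = dfa(brown, [16, 32, 64, 128, 256])
```

The multifractal variant (MF-DFA) generalizes the second-moment fluctuation to q-th order moments, yielding a spectrum of exponents rather than a single alpha.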
NASA Astrophysics Data System (ADS)
Goodwell, Allison E.; Kumar, Praveen
2017-07-01
Information theoretic measures can be used to identify nonlinear interactions between source and target variables through reductions in uncertainty. In information partitioning, multivariate mutual information is decomposed into synergistic, unique, and redundant components. Synergy is information shared only when sources influence a target together, uniqueness is information only provided by one source, and redundancy is overlapping shared information from multiple sources. While this partitioning has been applied to provide insights into complex dependencies, several proposed partitioning methods overestimate redundant information and omit a component of unique information because they do not account for source dependencies. Additionally, information partitioning has only been applied to time-series data in a limited context, using basic pdf estimation techniques or a Gaussian assumption. We develop a Rescaled Redundancy measure (Rs) to solve the source dependency issue, and present Gaussian, autoregressive, and chaotic test cases to demonstrate its advantages over existing techniques in the presence of noise, various source correlations, and different types of interactions. This study constitutes the first rigorous application of information partitioning to environmental time-series data, and addresses how noise, pdf estimation technique, or source dependencies can influence detected measures. We illustrate how our techniques can unravel the complex nature of forcing and feedback within an ecohydrologic system with an application to 1-min environmental signals of air temperature, relative humidity, and wind speed. The methods presented here are applicable to the study of a broad range of complex systems composed of interacting variables.
NASA Astrophysics Data System (ADS)
Lohman, R. B.; Barnhart, W. D.
2011-12-01
We present interferometric synthetic aperture radar (InSAR) time series maps that span the eastern Zagros (Fars Arc) collisional belt and western Makran accretionary prism of Southern Iran. Given the upcoming availability of large volumes of SAR data from new platforms, such as Sentinel-1 and potentially DESDynI, we explore computationally efficient approaches for extracting deformation time series when the signal of interest is small compared to the level of noise in individual interferograms. We use 12 descending and 2 ascending multi-frame (2-4 frames) Envisat tracks and 2 ascending ALOS tracks spanning 2003-2010 and 2006-2010. We implement a linear inversion, similar to the Small Baseline Subset (SBAS) technique, to derive surface displacements at individual acquisition dates from trees of interferograms with perpendicular baselines less than 350 m for Envisat and 1500 m for ALOS pairs. This spatially extensive dataset allows us to investigate several attributes of interferometry that vary spatially and temporally over large distances, including changes in phase coherence relative to elevation and relief as well as land use. Through synthetic tests and observed data, we explore various sources of potential error in the calculation of time series, including variable coherence of pixels between interferograms in a single track, ambiguities in phase unwrapping, and orbital ramp estimation over scenes with variable correlated noise structure. We present examples of detected signals with both temporally variable characteristics and small magnitudes, including surface/subsurface salt deformation, aseismic deformation across the Minab-Zendan-Palami strike-slip zone, and subsidence due to hydrocarbon extraction.
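The SBAS-style linear inversion reduces, for a single pixel, to least squares on a design matrix that differences displacements at acquisition dates. A toy sketch (dates, interferogram pairs, displacements, and noise level are invented for illustration, not Envisat data):

```python
import numpy as np

# Hypothetical acquisition dates (days) and true cumulative displacement
# (mm) at one pixel; the first date is the reference (zero displacement).
dates = np.array([0, 35, 70, 105, 140, 175])
d_true = np.array([0.0, 1.2, 2.0, 3.1, 3.9, 5.0])

# Small-baseline interferogram pairs (indices into `dates`).
pairs = [(0, 1), (1, 2), (0, 2), (2, 3), (3, 4), (2, 4), (4, 5), (3, 5)]

# Each interferogram observes d[j] - d[i], plus noise.
rng = np.random.default_rng(2)
obs = np.array([d_true[j] - d_true[i] for i, j in pairs])
obs += 0.05 * rng.standard_normal(len(obs))

# Design matrix over the unknowns d[1..5] (d[0] fixed to zero).
G = np.zeros((len(pairs), len(dates) - 1))
for k, (i, j) in enumerate(pairs):
    if j > 0:
        G[k, j - 1] += 1.0
    if i > 0:
        G[k, i - 1] -= 1.0

# Least-squares displacement time series at the acquisition dates.
d_est, *_ = np.linalg.lstsq(G, obs, rcond=None)
```

The redundancy of the pair network is what lets the inversion average down per-interferogram noise; a disconnected network would make G rank-deficient, which is the practical motivation for the baseline thresholds quoted above.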
NASA Astrophysics Data System (ADS)
Brereton, Carol A.; Joynes, Ian M.; Campbell, Lucy J.; Johnson, Matthew R.
2018-05-01
Fugitive emissions are important sources of greenhouse gases and lost product in the energy sector that can be difficult to detect, but are often easily mitigated once they are known, located, and quantified. In this paper, a scalar transport adjoint-based optimization method is presented to locate and quantify unknown emission sources from downstream measurements. This emission characterization approach correctly predicted locations to within 5 m and magnitudes to within 13% of experimental release data from Project Prairie Grass. The method was further demonstrated on simulated simultaneous releases in a complex 3-D geometry based on an Alberta gas plant. Reconstructions were performed using both the complex 3-D transient wind field used to generate the simulated release data and using a sequential series of steady-state RANS wind simulations (SSWS) representing 30 s intervals of physical time. Both the detailed transient and the simplified wind field series could be used to correctly locate major sources and predict their emission rates within 10%, while predicting total emission rates from all sources within 24%. This SSWS case would be much easier to implement in a real-world application, and gives rise to the possibility of developing pre-computed databases of both wind and scalar transport adjoints to reduce computational time.
NASA Astrophysics Data System (ADS)
Donges, Jonathan; Heitzig, Jobst; Beronov, Boyan; Wiedermann, Marc; Runge, Jakob; Feng, Qing Yi; Tupikina, Liubov; Stolbova, Veronika; Donner, Reik; Marwan, Norbert; Dijkstra, Henk; Kurths, Jürgen
2016-04-01
We introduce the pyunicorn (Pythonic unified complex network and recurrence analysis toolbox) open source software package for applying and combining modern methods of data analysis and modeling from complex network theory and nonlinear time series analysis. pyunicorn is a fully object-oriented and easily parallelizable package written in the language Python. It allows for the construction of functional networks such as climate networks in climatology or functional brain networks in neuroscience representing the structure of statistical interrelationships in large data sets of time series and, subsequently, investigating this structure using advanced methods of complex network theory such as measures and models for spatial networks, networks of interacting networks, node-weighted statistics, or network surrogates. Additionally, pyunicorn provides insights into the nonlinear dynamics of complex systems as recorded in uni- and multivariate time series from a non-traditional perspective by means of recurrence quantification analysis, recurrence networks, visibility graphs, and construction of surrogate time series. The range of possible applications of the library is outlined, drawing on several examples mainly from the field of climatology. pyunicorn is available online at https://github.com/pik-copan/pyunicorn. Reference: J.F. Donges, J. Heitzig, B. Beronov, M. Wiedermann, J. Runge, Q.-Y. Feng, L. Tupikina, V. Stolbova, R.V. Donner, N. Marwan, H.A. Dijkstra, and J. Kurths, Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package, Chaos 25, 113101 (2015), DOI: 10.1063/1.4934554, Preprint: arxiv.org:1507.01571 [physics.data-an].
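Among the methods listed, the natural visibility graph has a particularly compact definition: two samples are linked when the straight line between them clears every intermediate sample. A minimal pure-Python sketch of the construction (independent of the pyunicorn implementation and API):

```python
def visibility_graph(series):
    """Natural visibility graph: nodes are time points; (a, b) are linked
    if every intermediate sample lies strictly below the a-b sightline."""
    n = len(series)
    edges = set()
    for a in range(n):
        for b in range(a + 1, n):
            visible = True
            for c in range(a + 1, b):
                # height of the a-b sightline at time c
                line = series[a] + (series[b] - series[a]) * (c - a) / (b - a)
                if series[c] >= line:
                    visible = False
                    break
            if visible:
                edges.add((a, b))
    return edges

ts = [3.0, 1.0, 2.0, 0.5, 4.0]
g = visibility_graph(ts)
```

Degree distributions of such graphs distinguish periodic, random, and fractal series, which is what makes the mapping useful as a bridge between time series analysis and network theory.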
David, J M; Pollari, F; Pintar, K D M; Nesbitt, A; Butler, A J; Ravel, A
2017-11-01
Campylobacteriosis, the most frequent bacterial enteric disease, shows a clear yet unexplained seasonality. The purpose of this study was to explore the influence of seasonal fluctuations in the contamination of, and in behavioural exposures to, two important sources of Campylobacter on the seasonality of campylobacteriosis. Time series analyses were applied to data collected through an integrated surveillance system in Canada in 2005-2010. Data included sporadic, domestically-acquired cases of Campylobacter jejuni infection, contamination of retail chicken meat and of surface water by C. jejuni, and exposure to each source through barbequing and swimming in natural waters. Seasonal patterns were evident for all variables, with a peak in summer for human cases and for both exposures, in fall for chicken meat contamination, and in late fall for water contamination. Time series analyses showed that the observed campylobacteriosis summer peak could only be significantly linked to behavioural exposures rather than to source contamination (swimming rather than water contamination, and barbequing rather than chicken meat contamination). The results indicate that the observed summer increase in human cases may be more the result of amplification through more frequent risky exposures than of an increase in Campylobacter source contamination.
New Insights into the Explosion Source from SPE
NASA Astrophysics Data System (ADS)
Patton, H. J.
2015-12-01
Phase I of the Source Physics Experiments (SPE) is a series of chemical explosions at varying depths and yields detonated in the same emplacement hole on the Climax stock, a granitic pluton located on the Nevada National Security Site. To date, four of the seven planned tests have been conducted, the last in May 2015, called SPE-4P, with a scaled depth of burial of 1549 m/kt^(1/3) in order to localize the source in time and space. Surface ground motions validated that the source medium did not undergo spallation, and a key experimental objective was achieved: SPE-4P is the closest of all tests in the series to a pure monopole source and will serve as an empirical Green's function for analysis against other SPE tests. A scientific objective of SPE is to understand mechanisms of rock damage for generating seismic waves, particularly surface and S waves, including prompt damage under compressive stresses and "late-time" damage under tensile stresses. Studies have shown that prompt damage can explain ~75% of the seismic moment for some SPE tests. Spallation is a form of late-time damage and a facilitator of damage mechanisms under tensile stresses, including inelastic brittle deformation and shear dilatancy on pre-existing faults or joints. As an empirical Green's function, SPE-4P allows the study of late-time damage mechanisms on other SPE tests that induce spallation and late-time damage, and I will discuss these studies. The importance for nuclear monitoring cannot be overstated, because new research shows that damage mechanisms can affect the surface wave magnitude Ms more than tectonic release and are a likely factor in the anomalous mb-Ms behavior of North Korean tests.
Comparison of ocean mass content change from direct and inversion based approaches
NASA Astrophysics Data System (ADS)
Uebbing, Bernd; Kusche, Jürgen; Rietbroek, Roelof
2017-04-01
The GRACE satellite mission provides an indispensable tool for measuring oceanic mass variations. Such time series are essential to separate global mean sea level rise into thermosteric and mass-driven contributions, and thus to constrain ocean heat content and (deep) ocean warming when viewed together with altimetry and Argo data. However, published estimates over the GRACE era differ, and not only because of the time window considered. Here, we will look into the sources of such differences with direct and inversion-based approaches. Deriving an ocean mass time series requires several processing steps: a GRACE (and altimetry and Argo) product must be chosen; data coverage, masks, and filters must be applied in either the spatial or the spectral domain; and corrections related to spatial leakage, GIA, and geocenter motion need to be accounted for. In this study, we quantify the effects of individual processing choices and assumptions of the direct and inversion-based approaches to derive ocean mass content change. Furthermore, we compile the different estimates from the existing literature and sources to highlight the differences.
Microforms in gravel bed rivers: Formation, disintegration, and effects on bedload transport
Strom, K.; Papanicolaou, A.N.; Evangelopoulos, N.; Odeh, M.
2004-01-01
This research aims to advance current knowledge on cluster formation and evolution by tackling some of the aspects associated with cluster microtopography and the effects of clusters on bedload transport. The specific objectives of the study are (1) to identify the bed shear stress range in which clusters form and disintegrate, (2) to quantitatively describe the spacing characteristics and orientation of clusters with respect to flow characteristics, (3) to quantify the effects clusters have on the mean bedload rate, and (4) to assess the effects of clusters on the pulsating nature of bedload. In order to meet the objectives of this study, two main experimental scenarios, namely, Test Series A and B (20 experiments overall), are considered in a laboratory flume under well-controlled conditions. Series A tests are performed to address objectives (1) and (2), while Series B is designed to meet objectives (3) and (4). Results show that cluster microforms develop in uniform sediment at 1.25 to 2 times the Shields parameter of an individual particle and start disintegrating at about 2.25 times the Shields parameter. It is found that during an unsteady flow event, the effects of clusters on bedload transport rate can be classified in three different phases: a sink phase where clusters absorb incoming sediment, a neutral phase where clusters do not affect bedload, and a source phase where clusters release particles. Clusters also increase the magnitude of the fluctuations in bedload transport rate, showing that clusters amplify the unsteady nature of bedload transport. A fourth-order autoregressive integrated moving average model is employed to describe the time series of bedload and provide a formula for predicting bedload at different periods.
Finally, a change-point analysis enhanced with a binary segmentation procedure is performed to identify the abrupt changes in the bedload statistical characteristics due to the effects of clusters and to detect the different phases in bedload time series using probability theory. The analysis verifies the experimental findings that three phases are detected in the bedload rate time series structure, namely, sink, neutral, and source. © ASCE / June 2004.
Flicker Noise in GNSS Station Position Time Series: How much is due to Crustal Loading Deformations?
NASA Astrophysics Data System (ADS)
Rebischung, P.; Chanard, K.; Metivier, L.; Altamimi, Z.
2017-12-01
The presence of colored noise in GNSS station position time series was detected 20 years ago. It has been shown since then that the background spectrum of non-linear GNSS station position residuals closely follows a power-law process (known as flicker noise, 1/f noise or pink noise), with some white noise taking over at the highest frequencies. However, the origin of the flicker noise present in GNSS station position time series is still unclear. Flicker noise is often described as intrinsic to the GNSS system, i.e., due to errors in the GNSS observations or in their modeling, but no such error source has been identified so far that could explain the level of observed flicker noise, nor its spatial correlation. We investigate another possible contributor to the observed flicker noise, namely real crustal displacements driven by surface mass transports, i.e., non-tidal loading deformations. This study is motivated by the presence of power-law noise in the time series of low-degree (≤ 40) and low-order (≤ 12) Stokes coefficients observed by GRACE; power-law noise might also exist at higher degrees and orders, but obscured by GRACE observational noise. By comparing GNSS station position time series with loading deformation time series derived from GRACE gravity fields, both with their periodic components removed, we therefore assess whether GNSS and GRACE both plausibly observe the same flicker behavior of surface mass transports/loading deformations. Taking into account GRACE observability limitations, we also quantify the amount of flicker noise in GNSS station position time series that could be explained by such flicker loading deformations.
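Flicker noise has a simple spectral characterization: power falls off as 1/f. A quick numerical sketch (pure synthesis, no GNSS or GRACE data) shapes white noise to a 1/f spectrum in the Fourier domain and verifies that the estimated spectral slope is near -1:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 2 ** 14

# Shape white noise to a 1/f (flicker) power spectrum in the Fourier domain.
white = rng.standard_normal(n)
W = np.fft.rfft(white)
freqs = np.fft.rfftfreq(n)
H = np.zeros_like(freqs)
H[1:] = freqs[1:] ** -0.5          # amplitude ~ f^(-1/2)  =>  power ~ 1/f
flicker = np.fft.irfft(W * H, n)

# Estimate the spectral slope from the periodogram (expected near -1).
P = np.abs(np.fft.rfft(flicker)) ** 2
slope = np.polyfit(np.log(freqs[1:]), np.log(P[1:]), 1)[0]
```

Fitting such a slope to position residuals (white noise gives 0, flicker noise -1, random walk -2) is the standard way the noise character of a GNSS time series is classified.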
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Erin A.; Robinson, Sean M.; Anderson, Kevin K.
2015-01-19
Here we present a novel technique for the localization of radiological sources in urban or rural environments from an aerial platform. The technique is based on a Bayesian approach to localization, in which measured count rates in a time series are compared with predicted count rates from a series of pre-calculated test sources to define likelihood. Furthermore, this technique is expanded by using a localized treatment with a limited field of view (FOV), coupled with a likelihood-ratio reevaluation, allowing for real-time computation on commodity hardware for arbitrarily complex detector models and terrain. In particular, detectors with inherent asymmetry of response (such as those employing internal collimation or self-shielding for enhanced directional awareness) are leveraged by this approach to provide improved localization. Our results from the localization technique are shown for simulated flight data using monolithic as well as directionally-aware detector models, and the capability of the methodology to locate radioisotopes is estimated for several test cases. This localization technique is shown to facilitate urban search by allowing quick and adaptive estimates of source location, in many cases from a single flyover near a source. In particular, this method represents a significant advancement over earlier methods such as full-field Bayesian likelihood, which is not generally fast enough to allow for broad-field search in real time, and highest-net-counts estimation, which has a localization error that depends strongly on flight path and cannot generally operate without exhaustive search.
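The comparison of measured count rates against predicted count rates from pre-calculated test sources can be sketched with a Poisson likelihood over a grid of candidate locations. The flight geometry, source strength, and background rate below are invented for illustration, and the real method's limited field of view and likelihood-ratio reevaluation are omitted:

```python
import numpy as np

rng = np.random.default_rng(5)

# Detector positions (m) along a straight flight line at standoff y = 20 m.
det_xy = np.column_stack([np.linspace(-50.0, 50.0, 41), np.full(41, 20.0)])

def expected_counts(src, strength, background=5.0):
    """Mean counts per sample: point source with 1/r^2 falloff plus background."""
    r2 = np.sum((det_xy - src) ** 2, axis=1)
    return background + strength / r2

# Simulate a measured count-rate time series for a true source at (12, 0).
src_true = np.array([12.0, 0.0])
counts = rng.poisson(expected_counts(src_true, 20000.0))

# Grid of pre-calculated test sources: Poisson log-likelihood per candidate
# (terms independent of the candidate dropped); with a flat prior, the
# maximum-likelihood cell is also the maximum-posterior cell.
best, best_ll = None, -np.inf
for x in np.arange(-40.0, 41.0, 2.0):
    for y in np.arange(-20.0, 11.0, 2.0):
        lam = expected_counts(np.array([x, y]), 20000.0)
        ll = np.sum(counts * np.log(lam) - lam)
        if ll > best_ll:
            best, best_ll = (x, y), ll
```

In the paper's setting, the same likelihood comparison is restricted to candidates inside the detector's field of view, which is what makes the computation fast enough for real-time search.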
Michalareas, George; Schoffelen, Jan-Mathijs; Paterson, Gavin; Gross, Joachim
2013-01-01
In this work, we investigate the feasibility of estimating causal interactions between brain regions based on multivariate autoregressive (MAR) models fitted to magnetoencephalographic (MEG) sensor measurements. We first demonstrate the theoretical feasibility of estimating source-level causal interactions after projection of the sensor-level model coefficients onto the locations of the neural sources. Next, we show with simulated MEG data that causality, as measured by partial directed coherence (PDC), can be correctly reconstructed if the locations of the interacting brain areas are known. We further demonstrate that, if a very large number of brain voxels is considered as potential activation sources, PDC as a measure to reconstruct causal interactions is less accurate. In that case the MAR model coefficients alone contain meaningful causality information. The proposed method overcomes the problems of model nonrobustness and large computation times encountered during causality analysis by existing methods, which first project MEG sensor time series onto a large number of brain locations and then build the MAR model on this large number of source-level time series. Instead, through this work, we demonstrate that by building the MAR model on the sensor level and then projecting only the MAR coefficients into source space, the true causal pathways are recovered even when a very large number of locations are considered as sources. The main contribution of this work is that with this methodology entire brain causality maps can be efficiently derived without any a priori selection of regions of interest. Hum Brain Mapp, 2013. © 2012 Wiley Periodicals, Inc. PMID:22328419
NASA Astrophysics Data System (ADS)
Saturnino, Diana; Langlais, Benoit; Amit, Hagay; Civet, François; Mandea, Mioara; Beucler, Éric
2018-03-01
A detailed description of the main geomagnetic field and of its temporal variations (i.e., the secular variation or SV) is crucial to understanding the geodynamo. Although the SV is known with high accuracy at ground magnetic observatory locations, the globally uneven distribution of the observatories hampers the determination of a detailed global pattern of the SV. Over the past two decades, satellites have provided global surveys of the geomagnetic field which have been used to derive global spherical harmonic (SH) models through some strict data selection schemes to minimise external field contributions. However, discrepancies remain between ground measurements and field predictions by these models; indeed the global models do not reproduce small spatial scales of the field temporal variations. To overcome this problem we propose to directly extract time series of the field and its temporal variation from satellite measurements as it is done at observatory locations. We follow a Virtual Observatory (VO) approach and define a global mesh of VOs at satellite altitude. For each VO and each given time interval we apply an Equivalent Source Dipole (ESD) technique to reduce all measurements to a unique location. Synthetic data are first used to validate the new VO-ESD approach. Then, we apply our scheme to data from the first two years of the Swarm mission. For the first time, a 2.5° resolution global mesh of VO time series is built. The VO-ESD derived time series are locally compared to ground observations as well as to satellite-based model predictions. Our approach is able to describe detailed temporal variations of the field at local scales. The VO-ESD time series are then used to derive global spherical harmonic models. For a simple SH parametrization the model describes well the secular trend of the magnetic field both at satellite altitude and at the surface. 
As more data become available, longer VO-ESD time series can be derived and consequently used to study sharp temporal variation features, such as geomagnetic jerks.
Land use change, and the implementation of best management practices to remedy the adverse effects of land use change, alter hydrologic patterns, contaminant loading and water quality in freshwater ecosystems. These changes are not constant over time, but vary in response to di...
NASA Astrophysics Data System (ADS)
Lieu, Richard
2018-01-01
A hierarchy of statistics of increasing sophistication and accuracy is proposed to exploit an interesting and fundamental arithmetic structure in the photon bunching noise of incoherent light of large photon occupation number, with the purpose of suppressing the noise and rendering a more reliable and unbiased measurement of the light intensity. The method does not require any new hardware; rather, it operates at the software level, with the help of high-precision computers, to reprocess the intensity time series of the incident light and create a new series with a smaller bunching-noise coherence length. The ultimate accuracy improvement of this method of flux measurement is limited by the timing resolution of the detector and the photon occupation number of the beam (the higher the photon number, the better the performance). The principal application is accuracy improvement in the bolometric flux measurement of a radio source.
Moody, George B; Mark, Roger G; Goldberger, Ary L
2011-01-01
PhysioNet provides free web access to over 50 collections of recorded physiologic signals and time series, and related open-source software, in support of basic, clinical, and applied research in medicine, physiology, public health, biomedical engineering and computing, and medical instrument design and evaluation. Its three components (PhysioBank, the archive of signals; PhysioToolkit, the software library; and PhysioNetWorks, the virtual laboratory for collaborative development of future PhysioBank data collections and PhysioToolkit software components) connect researchers and students who need physiologic signals and relevant software with researchers who have data and software to share. PhysioNet's annual open engineering challenges stimulate rapid progress on unsolved or poorly solved questions of basic or clinical interest, by focusing attention on achievable solutions that can be evaluated and compared objectively using freely available reference data.
NASA Astrophysics Data System (ADS)
Hansen, A. B.; Kendall, E.; Chew, B. N.; Chong, W. M.; Gan, C.; Hort, M. C.; Shaw, F.; Witham, C. S.
2017-12-01
Biomass burning in South East Asia causes intense haze episodes in Singapore that are of major concern to the local government and the population exposed to the haze. Using a Lagrangian dispersion model, we have studied haze in the seven most recent years (2010-2016) to gain a deeper understanding of intense haze in Singapore. In this study, modelled haze time series at one eastern and one western monitoring station in Singapore are compared to locally observed PM10 and PM2.5 air concentrations. The haze time series are broken down by season or month, source region, and monitoring location. The analysis, presented as time series and pie charts, illustrates the relative contribution to haze in Singapore from different regions, variations between seasons, and the correlation of impact with the combined timing of burning activity and meteorological patterns. Air history maps, showing where air arriving in Singapore originates from and/or has travelled through, are used to isolate the meteorological dependence of impacts. These show a strong monsoonal variation and help explain the inter-annual differences between sources and actual concentrations of biomass-burning PM in Singapore. For example, there is a strong correlation in 2013 between burning in Riau and haze in Singapore, but a weak correlation in other years, when a significant part of the haze originates from, e.g., Peninsular Malaysia, where reported emissions are seemingly negligible. We see that, in spite of the size of Singapore, there is a significant difference in concentrations and major contributing source regions between the two monitoring stations, annually and seasonally. The differences at the two monitoring stations are seen to varying degrees in the years 2011, 2012, 2014, and 2015, throughout different seasons. Although only biomass burning is considered in the simulations, our modelled results are in good agreement with local observations.
We have identified the source regions with the biggest contributions to haze in Singapore as Riau and Peninsular Malaysia, with secondary contributions from South Sumatra, Jambi, Central and West Kalimantan, the Riau Islands, and Bangka-Belitung. We show that both regional burning and regional weather have a significant impact on local haze conditions in Singapore.
A novel Bayesian approach to acoustic emission data analysis.
Agletdinov, E; Pomponi, E; Merson, D; Vinogradov, A
2016-12-01
The acoustic emission (AE) technique is a popular tool for materials characterization and non-destructive testing. Originating from the stochastic motion of defects in solids, AE is a random process by nature. A challenging problem arises whenever an attempt is made to identify specific points corresponding to changes in the trends of the fluctuating AE time series. A general Bayesian framework is proposed for the analysis of AE time series, aimed at automatically finding the breakpoints that signal a crossover in the dynamics of the underlying AE sources. Copyright © 2016 Elsevier B.V. All rights reserved.
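The essence of breakpoint detection in a fluctuating series can be sketched by scanning candidate breakpoints with a Gaussian likelihood. This toy uses synthetic data and profiles (rather than marginalizes) the segment means, so it is a simplified stand-in for the full Bayesian treatment, but it locates a shift in mean level the same way:

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic "AE activity" series with a change in mean level at t = 300.
x = np.concatenate([rng.normal(0.0, 1.0, 300), rng.normal(1.5, 1.0, 200)])
n = len(x)

def split_loglik(k):
    """Gaussian log-likelihood (unit variance, additive constants dropped)
    with separate means fitted on each side of a breakpoint at k."""
    a, b = x[:k], x[k:]
    return -0.5 * (np.sum((a - a.mean()) ** 2) + np.sum((b - b.mean()) ** 2))

# With a flat prior over breakpoints, ranking candidates by this
# likelihood ranks them by (profile) posterior as well.
ks = np.arange(20, n - 20)
k_hat = ks[np.argmax([split_loglik(k) for k in ks])]
```

A fully Bayesian version would integrate out the segment parameters under conjugate priors and report a posterior distribution over k rather than a single point estimate.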
Groundwater similarity across a watershed derived from time-warped and flow-corrected time series
NASA Astrophysics Data System (ADS)
Rinderer, M.; McGlynn, B. L.; van Meerveld, H. J.
2017-05-01
Information about catchment-scale groundwater dynamics is necessary to understand how catchments store and release water and why water quantity and quality varies in streams. However, groundwater level monitoring is often restricted to a limited number of sites. Knowledge of the factors that determine similarity between monitoring sites can be used to predict catchment-scale groundwater storage and connectivity of different runoff source areas. We used distance-based and correlation-based similarity measures to quantify the spatial and temporal differences in shallow groundwater similarity for 51 monitoring sites in a Swiss prealpine catchment. The 41-month-long time series were preprocessed using Dynamic Time Warping and a Flow-corrected Time Transformation to account for small timing differences and bias toward low-flow periods. The mean distance-based groundwater similarity was correlated with topographic indices, such as upslope contributing area, topographic wetness index, and local slope. Correlation-based similarity was less related to landscape position but instead revealed differences between seasons. Analysis of variance and partial Mantel tests showed that landscape position, represented by the topographic wetness index, explained 52% of the variability in mean distance-based groundwater similarity, while spatial distance, represented by the Euclidean distance, explained only 5%. The variability in distance-based similarity and correlation-based similarity between groundwater and streamflow time series was significantly larger for midslope locations than for other landscape positions. This suggests that groundwater dynamics at these midslope sites, which are important to understand runoff source areas and hydrological connectivity at the catchment scale, are most difficult to predict.
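Dynamic Time Warping, used above to absorb small timing differences between groundwater responses before computing similarity, can be sketched with the classic dynamic program. The two Gaussian-pulse series are illustrative, not the Swiss monitoring data:

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic-time-warping distance with the unit step pattern."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return float(D[n, m])

# Two groundwater-like responses: same shape, shifted by two time steps.
t = np.arange(30)
s1 = np.exp(-0.5 * ((t - 10) / 3.0) ** 2)
s2 = np.exp(-0.5 * ((t - 12) / 3.0) ** 2)

d_dtw = dtw_distance(s1, s2)
d_eucl = float(np.sum(np.abs(s1 - s2)))
# DTW absorbs the small timing offset, so d_dtw is far below d_eucl.
```

This is precisely why a warped distance makes sites with similar but slightly lagged responses (e.g., different travel times to the water table) register as similar, where a pointwise distance would not.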
NASA Astrophysics Data System (ADS)
Munekane, H.; Oikawa, J.; Kobayashi, T.
2014-12-01
Miyake-jima is an active basaltic stratovolcano located 200 km south of Tokyo, Japan. Its eruption in 2000 was remarkable in that a large caldera formed at the summit in approximately one month. During the caldera-forming stage, very-long-period (VLP) seismic pulse waves with a duration of about 50 s, accompanied by step-like inflation, were repeatedly recorded. Based on the broadband seismometer data, a piston model was proposed in which a vertical piston of solid material in the conduit is intermittently sucked into the magma chamber located 3-5 km beneath the edifice. In this study, we used the kinematic displacements from continuous GPS observations to obtain additional insights into the source mechanism of the pulse waves. We calculated the kinematic displacements of the 15 GPS stations on Miyake-jima that were in operation at that time at 30 s intervals. We then extracted the displacements associated with each event using a 20-hour time window centered at the occurrence of the event, and stacked the whole time series to obtain mean displacement time series. The obtained time series contain: 1) step-like displacements associated with the pulse waves, 2) an exponential decay following the events with a time constant of approximately half a day, and 3) steady linear displacements indicating continuous contraction of the edifice. The type-one displacements can be attributed to the simultaneous inflation of a Mogi-type spherical pressure source located at a depth of 3.6 km under the edifice and the opening of a nearby vertical dike whose top is at a depth of 2.3 km. The type-two displacements can be interpreted as pressure adjustment at the type-one source by outflow of magma driven by the pressure difference between the type-one source and the surrounding area. The type-three displacements can be interpreted as steady outflow of magma from the type-one source.
The above results support the ``piston model'' for the source of the pulse waves. However, it seems that the pressure increases caused by the collapse of the piston are adjusted not by steady magma outflow, as the ``piston model'' suggests, but by pressure-driven magma outflow. The steady magma outflow instead seems to be responsible for the long-term shrinkage of the edifice observed during that period.
Coral radiocarbon constraints on the source of the Indonesian throughflow
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moore, M.D.; Schrag, D.P.; Kashgarian, M.
1997-06-01
Radiocarbon variability in Porites spp. corals from Guam and the Makassar Strait (Indonesian Seaway) was used to identify the source waters contributing to the Indonesian throughflow. Time series with bimonthly resolution were constructed using accelerator mass spectrometry. The seasonal variability ranges from 15 to 60‰, with large interannual variability. Δ14C values from Indonesia and Guam have a nearly identical range. Annual mean Δ14C values from Indonesia are 50 to 60‰ higher than in corals from Canton in the South Equatorial Current [Druffel, 1987]. These observations support a year-round North Pacific source for the Indonesian throughflow and imply negligible contribution by South Equatorial Current water. The large seasonality in Δ14C values from both sites emphasizes the dynamic behavior of radiocarbon in the surface ocean and suggests that Δ14C time series of similar resolution can help constrain seasonal and interannual changes in ocean circulation in the Pacific over the last several decades. © 1997 American Geophysical Union
A Python-based interface to examine motions in time series of solar images
NASA Astrophysics Data System (ADS)
Campos-Rozo, J. I.; Vargas Domínguez, S.
2017-10-01
Python is considered a mature programming language and is widely accepted as an engaging option for scientific analysis in multiple areas, as will be presented in this work for the particular case of solar physics research. SunPy is an open-source Python library that has recently been developed to furnish software tools for solar data analysis and visualization. In this work we present a graphical user interface (GUI) based on Python and Qt to effectively compute proper motions for the analysis of time series of solar data. This user-friendly computing interface, which is intended to be incorporated into the SunPy library, uses a local correlation tracking technique and some extra tools that allow the selection of different parameters to calculate, visualize, and analyze vector velocity fields of solar data, i.e., time series of solar filtergrams and magnetograms.
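The core of a local correlation tracking step is finding, for each subfield, the displacement that maximizes the cross-correlation between consecutive frames. A minimal sketch on synthetic data (integer shifts only; real LCT uses apodized windows and subpixel interpolation):

```python
import numpy as np

def patch_shift(a, b):
    """Integer shift s such that a ~= np.roll(b, s), from the peak of the
    circular cross-correlation computed via FFT."""
    a = a - a.mean()
    b = b - b.mean()
    corr = np.fft.ifft2(np.fft.fft2(a) * np.conj(np.fft.fft2(b))).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    ny, nx = corr.shape
    # map wrap-around indices to signed shifts
    return (int((dy + ny // 2) % ny - ny // 2),
            int((dx + nx // 2) % nx - nx // 2))

rng = np.random.default_rng(5)
frame1 = rng.normal(size=(64, 64))                    # granulation-like field
frame2 = np.roll(frame1, shift=(3, -2), axis=(0, 1))  # pattern moved between frames
print(patch_shift(frame2, frame1))  # (3, -2)
```

Applying this to every subfield of a filtergram pair yields the vector velocity field the interface visualizes.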
NASA standard: Trend analysis techniques
NASA Technical Reports Server (NTRS)
1988-01-01
This Standard presents descriptive and analytical techniques for NASA trend analysis applications. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. Use of this Standard is not mandatory; however, it should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend Analysis is neither a precise term nor a circumscribed methodology, but rather connotes, generally, quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this Standard. The document presents the basic ideas needed for qualitative and quantitative assessment of trends, together with relevant examples. A list of references provides additional sources of information.
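The linear, quadratic, and exponential trend fits the Standard names can all be obtained with ordinary least squares; a small sketch on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(20, dtype=float)                  # e.g. 20 reporting periods
y = 5.0 + 0.8 * t + rng.normal(0, 0.5, t.size)  # noisy linear trend

lin = np.polyfit(t, y, 1)     # [slope, intercept]
quad = np.polyfit(t, y, 2)    # quadratic trend coefficients

# Exponential trend y = a*exp(b*t), fitted as a straight line in log space
# (valid only for strictly positive data).
b, log_a = np.polyfit(t, np.log(y), 1)

print(round(lin[0], 1))       # recovered slope, close to the true 0.8
```

Comparing residuals across the three candidate models is the usual way to decide which trend description the data support.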
Cost Analysis Sources and Documents Data Base Reference Manual (Update)
1989-06-01
M: Reference Manual; PRICE H: Training Course Workbook. 11. Use in Cost Analysis. Important source of cost estimates for electronic and mechanical... Nature of Data. Contains many microeconomic time series by month or quarter. 5. Level of Detail. Very detailed. 6. Normalization Processes Required... Reference Manual. Moorestown, N.J.: GE Corporation, September 1986. 64. PRICE Training Course Workbook. Moorestown, N.J.: GE Corporation, February 1986
Correlated errors in geodetic time series: Implications for time-dependent deformation
Langbein, J.; Johnson, H.
1997-01-01
Analysis of frequent trilateration observations from the two-color electronic distance measuring networks in California demonstrates that the noise power spectra are dominated by white noise at higher frequencies and power-law behavior at lower frequencies. In contrast, Earth scientists typically have assumed that only white noise is present in a geodetic time series, since a combination of infrequent measurements and low precision usually precludes identifying the time-correlated signature in such data. After removing a linear trend from the two-color data, it becomes evident that there are primarily two recognizable types of time-correlated noise present in the residuals. The first type is a seasonal variation in displacement which is probably a result of measuring to shallow surface monuments installed in clayey soil which responds to seasonally occurring rainfall; this noise is significant only for a small fraction of the sites analyzed. The second type of correlated noise becomes evident only after spectral analysis of line length changes and shows a functional relation at long periods between power and frequency of the form P(f) ∝ f^-α, where f is frequency and α ≈ 2. With α = 2, this type of correlated noise is termed random-walk noise, and its source is mainly thought to be small random motions of geodetic monuments with respect to the Earth's crust, though other sources are possible. Because the line length changes in the two-color networks are measured at irregular intervals, power spectral techniques cannot reliably estimate the level of f^-α noise. Rather, we also use here a maximum likelihood estimation technique which assumes that there are only two sources of noise in the residual time series (white noise and random-walk noise) and estimates the amount of each. From this analysis we find that the random-walk noise level averages about 1.3 mm/√yr and that our estimates of the white noise component confirm theoretical limitations of the measurement technique.
In addition, the seasonal noise can be as large as 3 mm in amplitude but typically is less than 0.5 mm. Because of the presence of random-walk noise in these time series, modeling and interpretation of the geodetic data must account for this source of error. By way of example we show that estimating the time-varying strain tensor (a form of spatial averaging) from geodetic data having both random-walk and white noise error components results in seemingly significant variations in the rate of strain accumulation; spatial averaging does reduce the size of both noise components but not their relative influence on the resulting strain accumulation model. Copyright 1997 by the American Geophysical Union.
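The distinction between white and random-walk noise can be illustrated by simulating both and fitting the spectral index α of P(f) ∝ f^-α over low frequencies (a simplified periodogram sketch, not the maximum likelihood method used in the paper; all values are synthetic):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8192
white = rng.normal(0.0, 1.0, n)            # measurement (white) noise
walk = np.cumsum(rng.normal(0.0, 0.1, n))  # monument wander, P(f) ~ f**-2

def spectral_index(series, fmax=0.01):
    """Slope alpha of P(f) ~ f**-alpha, fitted over low frequencies only."""
    f = np.fft.rfftfreq(len(series), d=1.0)
    P = np.abs(np.fft.rfft(series)) ** 2
    keep = (f > 0) & (f < fmax)
    return -np.polyfit(np.log(f[keep]), np.log(P[keep]), 1)[0]

print(round(spectral_index(white), 1))         # near zero: flat spectrum
print(round(spectral_index(white + walk), 1))  # pulled toward 2 by the random walk
```

The low-frequency restriction matters: at high frequencies the white component masks the random walk, which is exactly why infrequent, low-precision surveys miss it.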
Baroreflex Coupling Assessed by Cross-Compression Entropy
Schumann, Andy; Schulz, Steffen; Voss, Andreas; Scharbrodt, Susann; Baumert, Mathias; Bär, Karl-Jürgen
2017-01-01
Estimating interactions between physiological systems is an important challenge in modern biomedical research. Here, we explore a new concept for quantifying information common in two time series by cross-compressibility. Cross-compression entropy (CCE) exploits the ZIP data compression algorithm extended to bivariate data analysis. First, time series are transformed into symbol vectors. Symbols of the target time series are coded by the symbols of the source series. Uncoupled and linearly coupled surrogates were derived from cardiovascular recordings of 36 healthy controls obtained during rest to demonstrate suitability of this method for assessing physiological coupling. CCE at rest was compared to that of isometric handgrip exercise. Finally, spontaneous baroreflex interaction assessed by CCEBRS was compared between 21 patients suffering from acute schizophrenia and 21 matched controls. The CCEBRS of original time series was significantly higher than in uncoupled surrogates in 89% of the subjects and higher than in linearly coupled surrogates in 47% of the subjects. Handgrip exercise led to sympathetic activation and vagal inhibition accompanied by reduced baroreflex sensitivity. CCEBRS decreased from 0.553 ± 0.030 at rest to 0.514 ± 0.035 during exercise (p < 0.001). In acute schizophrenia, heart rate, and blood pressure were elevated. Heart rate variability indicated a change of sympathovagal balance. The CCEBRS of patients with schizophrenia was reduced compared to healthy controls (0.546 ± 0.042 vs. 0.507 ± 0.046, p < 0.01) and revealed a decrease of blood pressure influence on heart rate in patients with schizophrenia. Our results indicate that CCE is suitable for the investigation of linear and non-linear coupling in cardiovascular time series. CCE can quantify causal interactions in short, noisy and non-stationary physiological time series. PMID:28539889
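The compression idea can be sketched with zlib on synthetic data. This is a crude proxy for the shared information between two symbolized series, not the authors' exact CCE estimator (the delay, noise levels, and symbol count below are arbitrary):

```python
import zlib
import numpy as np

def symbolize(x, n_symbols=4):
    """Quantile-bin a series into a short byte-symbol alphabet."""
    edges = np.quantile(x, np.linspace(0, 1, n_symbols + 1)[1:-1])
    return bytes(np.digitize(x, edges).astype(np.uint8))

def compression_coupling(source, target):
    """Bytes saved when zlib compresses the target with the source prepended:
    a crude stand-in for how well source symbols 'code' the target."""
    s, t = symbolize(source), symbolize(target)
    c_t = len(zlib.compress(t))
    extra = len(zlib.compress(s + t)) - len(zlib.compress(s))
    return c_t - extra

rng = np.random.default_rng(2)
t = np.linspace(0, 20 * np.pi, 2000)
bp = np.sin(t) + 0.1 * rng.normal(size=t.size)  # "blood pressure"
hr = np.roll(bp, 40)                            # "heart rate": delayed copy
noise = rng.normal(size=t.size)                 # uncoupled surrogate

print(compression_coupling(bp, hr) > compression_coupling(bp, noise))  # True
```

A coupled target shares long symbol patterns with the source, so conditioning on the source saves many bytes; an uncoupled surrogate saves almost none, which mirrors the paper's surrogate comparison.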
van de Flierdt, T.; Frank, M.; Halliday, A.N.; Hein, J.R.; Hattendorf, B.; Gunther, D.; Kubik, P.W.
2003-01-01
The sources of non-anthropogenic Pb in seawater have been the subject of debate. Here we present Pb isotope time series that indicate that the non-anthropogenic Pb budget of the northernmost Pacific Ocean has been governed by ocean circulation and riverine inputs, which in turn have ultimately been controlled by tectonic processes. Despite the fact that the investigated locations are situated within the Asian dust plume, and proximal to extensive arc volcanism, eolian contributions have had little impact. We have obtained the first high-resolution and high-precision Pb isotope time series of North Pacific deep water from two ferromanganese crusts from the Gulf of Alaska in the NE Pacific Ocean, and from the Detroit Seamount in the NW Pacific Ocean. Both crusts were dated applying 10Be/9Be ratios and yield continuous time series for the past 13.5 and 9.6 Myr, respectively. Lead isotopes show a monotonic evolution in 206Pb/204Pb from low values in the Miocene (≈ 18.57) to high values at present day (≈ 18.84) in both crusts, even though they are separated by more than 3000 km along the Aleutian Arc. The variation exceeds the amplitude found in Equatorial Pacific deep water records by about three-fold. There is also a striking similarity in 207Pb/204Pb and 208Pb/204Pb ratios of the two crusts, indicating the existence of a local circulation cell in the sub-polar North Pacific, where efficient lateral mixing has taken place but only limited exchange (in terms of Pb) with deep water from the Equatorial Pacific has occurred. Both crusts display well-defined trends with age in Pb-Pb isotope mixing plots, which require the involvement of at least four distinct Pb sources for North Pacific deep water. The Pb isotope time series reveal that eolian supplies (volcanic ash and continent-derived loess) have only been of minor importance for the dissolved Pb budget of marginal sites in the deep North Pacific over the past 6 Myr.
The two predominant sources have been young volcanic arcs, one located in the northeastern part and one located in the northwestern part of the Pacific margin, from where material has been eroded and delivered to the ocean, most likely via riverine pathways. © 2003 Elsevier Science B.V. All rights reserved.
Mapping Wetlands of Dongting Lake in China Using Landsat and SENTINEL-1 Time Series at 30M
NASA Astrophysics Data System (ADS)
Xing, L.; Tang, X.; Wang, H.; Fan, W.; Gao, X.
2018-04-01
Mapping and monitoring the wetlands of Dongting Lake using optical sensor data has been limited by cloud cover. Open-access Sentinel-1 C-band data provide cloud-free SAR images with high spatial and temporal resolution, which offer new opportunities for monitoring wetlands. In this study, we combined optical data and SAR data to map the wetlands of the Dongting Lake reserves in 2016. Firstly, we generated two-monthly composited Landsat land surface reflectance, NDVI, NDWI, and TC-Wetness time series and Sentinel-1 (backscattering coefficient for VH and VV) time series. Secondly, we derived the surface water body at two-monthly frequency based on a threshold method using the Sentinel-1 time series. Permanent water and seasonal water were then separated by the submergence ratio. Other land cover types were identified with an SVM classifier using the Landsat time series. Results showed that (1) the overall accuracies and kappa coefficients were above 86.6 % and 0.8; (2) natural wetlands, including permanent water body (14.8 %), seasonal water body (34.6 %), and permanent marshes (10.9 %), were the main land cover types, accounting for 60.3 % of the three wetland reserves, while human-made wetlands, such as rice fields, accounted for 34.3 % of the total area. Overall, this study proposed a new workflow for wetland mapping in Dongting Lake by combining multi-source remote sensing data, and the use of the two-monthly composited optical time series effectively made up for data missing due to clouds and increased the possibility of precise wetland classification.
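The threshold-and-submergence-ratio step can be sketched as follows; the -20 dB VH threshold and the 0.9 permanent-water cutoff are illustrative assumptions, not the values calibrated in the study:

```python
import numpy as np

# Toy stack: six two-monthly Sentinel-1-like VH backscatter values (dB) for
# three pixels. Open water is radar-dark, so low backscatter maps as water.
WATER_DB = -20.0   # assumed VH threshold; real thresholds are scene-calibrated
stack = np.array([
    [-24.0, -23.5, -25.0, -24.2, -23.8, -24.5],   # inundated in all composites
    [-24.0, -12.0, -11.5, -24.5, -24.1, -13.0],   # inundated in 3 of 6
    [-11.0, -12.5, -10.8, -11.9, -12.2, -11.4],   # never inundated
])

water = stack < WATER_DB            # per-composite water mask
submergence = water.mean(axis=1)    # submergence ratio: fraction of year under water

labels = np.where(submergence >= 0.9, "permanent",
         np.where(submergence > 0.0, "seasonal", "non-water"))
print(labels.tolist())  # ['permanent', 'seasonal', 'non-water']
```

Pixels never flagged as water would then go to the optical SVM classifier for the remaining land cover classes.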
Visplause: Visual Data Quality Assessment of Many Time Series Using Plausibility Checks.
Arbesser, Clemens; Spechtenhauser, Florian; Muhlbacher, Thomas; Piringer, Harald
2017-01-01
Trends like decentralized energy production lead to an exploding number of time series from sensors and other sources that need to be assessed regarding their data quality (DQ). While the identification of DQ problems for such routinely collected data is typically based on existing automated plausibility checks, an efficient inspection and validation of check results for hundreds or thousands of time series is challenging. The main contribution of this paper is the validated design of Visplause, a system to support an efficient inspection of DQ problems for many time series. The key idea of Visplause is to utilize meta-information concerning the semantics of both the time series and the plausibility checks for structuring and summarizing results of DQ checks in a flexible way. Linked views enable users to inspect anomalies in detail and to generate hypotheses about possible causes. The design of Visplause was guided by goals derived from a comprehensive task analysis with domain experts in the energy sector. We reflect on the design process by discussing design decisions at four stages and we identify lessons learned. We also report feedback from domain experts after using Visplause for a period of one month. This feedback suggests significant efficiency gains for DQ assessment, increased confidence in the DQ, and the applicability of Visplause to summarize indicators also outside the context of DQ.
Focal mechanism of the seismic series prior to the 2011 El Hierro eruption
NASA Astrophysics Data System (ADS)
del Fresno, C.; Buforn, E.; Cesca, S.; Domínguez Cerdeña, I.
2015-12-01
The onset of the submarine eruption of El Hierro (10-Oct-2011) was preceded by three months of low-magnitude seismicity (Mw < 4.0) characterized by a well-documented hypocenter migration from the center to the south of the island. Seismic sources of this series have been studied in order to understand the physical process of magma migration. Different methodologies were used to obtain focal mechanisms of the largest shocks. Firstly, we estimated the joint fault plane solutions for 727 shocks using first-motion P polarities to infer the stress pattern of the sequence and to determine the time evolution of the principal axes orientation. Results show almost vertical T-axes during the first two months of the series and horizontal P-axes in the N-S direction coinciding with the migration. Secondly, a point-source MT inversion was performed with data from the 21 largest earthquakes of the series (M > 3.5). Amplitude spectra were fitted at local distances (< 20 km). Reliability and stability of the results were evaluated with synthetic data. Results show a change in the focal mechanism pattern within the first days of October, varying from complex sources with higher non-double-couple components before that date to a simpler strike-slip mechanism with horizontal tension axes in the E-W direction the week prior to the eruption onset. A detailed study was carried out for the 8 October 2011 earthquake (Mw = 4.0). The focal mechanism was retrieved using MT inversion at regional and local distances. Results indicate an important strike-slip component and a null isotropic component. The stress pattern obtained corresponds to horizontal compression in a NNW-SSE direction, parallel to the southern ridge of the island, and a quasi-horizontal extension in an E-W direction. Finally, a simple source time function of 0.3 s has been estimated for this shock using the empirical Green's function methodology.
2017-12-01
Date Cleared: 30 NOV 2017. Data analysis tools which operate on varied data sources including time series ... and raw detections from geo-located tweets: micro-paths (10M) (no distance/time filter), raw tracks (10M), raw detections (10M).
Solar Environmental Disturbances
2007-11-02
... like stars were examined, extending the previous 7–12 year time series to 13–20 years by combining Strömgren b, y photometry from Lowell Observatory ... explanations for how these physical processes affect the production of solar activity, both on short and long time scales. Solar cycle variation
NASA Astrophysics Data System (ADS)
Ozawa, T.; Miyagi, Y.
2017-12-01
Shinmoe-dake, located in SW Japan, erupted in January 2011, and lava accumulated in the crater (e.g., Ozawa and Kozono, EPS, 2013). The last Vulcanian eruption occurred in September 2011, and no eruption has occurred since. Miyagi et al. (GRL, 2014) analyzed TerraSAR-X and Radarsat-2 SAR data acquired after the last eruption and found continuous inflation in the crater. The inflation decayed with time but had not terminated by May 2013. Since the time series of the inflation volume change rate fitted well to an exponential function with a constant term, we suggested that lava extrusion had continued over the long term due to deflation of a shallow magma source and magma supply from a deeper source. To investigate the deformation after that period, we applied InSAR to Sentinel-1 and ALOS-2 SAR data. The inflation decayed further and had almost terminated by the end of 2016, meaning that this deformation continued for more than five years after the last eruption. We have found that the time series of the inflation volume change rate fits better to a double-exponential function than to a single-exponential function with a constant term. The exponential component with the short time constant almost settled within one year of the last eruption. Although the InSAR result from TerraSAR-X data of November 2011 and May 2013 indicated deflation of a shallow source under the crater, such deformation has not been obtained from recent SAR data. This suggests that this component was due to deflation of a shallow magma source with excess pressure. In this study, we found that the long-term component may also have decayed exponentially; this factor may be deflation of a deep source or delayed vesiculation.
NASA Astrophysics Data System (ADS)
Genty, Dominique; Massault, Marc
1999-05-01
Twenty-two AMS 14C measurements have been made on a modern stalagmite from SW France in order to reconstruct the 14C activity history of the calcite deposit. Annual growth laminae provide a chronology extending back to 1919 A.D. Results show that the stalagmite 14C activity time series is sensitive to modern atmospheric 14C activity changes such as those produced by the nuclear weapon tests. The comparison between the two 14C time series shows that the stalagmite time series is damped: its amplitude variation between pre-bomb and post-bomb values is 75% smaller, and the time delay between the two time series peaks is 16 ± 3 years. A model is developed using atmospheric 14C and 13C data, fractionation processes, and three soil organic matter components whose mean turnover rates differ. The linear correlation coefficient between modeled and measured activities is 0.99. These results, combined with two other published stalagmite 14C time series and compared with local vegetation and climate, demonstrate that most of the carbon transfer dynamics are controlled in the soil by soil organic matter degradation rates. Where vegetation produces debris whose degradation is slow, the fraction of old carbon injected into the system increases, the observed 14C time series is much more damped, and the lag time is longer than that observed under grassland sites. The same mixing model applied to the 13C shows good agreement (R2 = 0.78) between modeled and measured stalagmite δ13C and demonstrates that the Suess effect due to fossil fuel combustion in the atmosphere is recorded in the stalagmite, but with a damped effect due to the SOM degradation rate. The different sources of dead carbon in the seepage water are calculated and discussed.
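The damping-and-delay mechanism can be illustrated with a toy two-reservoir mixing model (all parameters below, including the bomb-spike shape, turnover time, and mixing fractions, are invented for illustration):

```python
import numpy as np

years = np.arange(1900, 2001)
# Crude bell-shaped "bomb spike" in atmospheric 14C peaking in 1964 (per mil).
atm = np.where(years < 1955, 0.0,
               700.0 * np.exp(-(years - 1964.0) ** 2 / 50.0))

tau = 25.0                       # assumed slow-reservoir turnover time (years)
slow = np.zeros_like(atm)
for i in range(1, years.size):
    slow[i] = slow[i - 1] + (atm[i] - slow[i - 1]) / tau   # first-order mixing

speleo = 0.4 * atm + 0.6 * slow  # assumed fast/slow soil-carbon mix

# The modeled drip-water signal is damped and its peak lags the atmosphere.
print(int(years[np.argmax(atm)]), int(years[np.argmax(speleo)]))
print(speleo.max() < atm.max())  # True
```

Increasing the slow fraction or the turnover time damps and delays the peak further, which is the qualitative behavior the paper attributes to slowly degrading soil organic matter.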
Mapping deforestation and forest degradation using Landsat time series: a case of Sumatra—Indonesia
Belinda Arunarwati Margono
2013-01-01
Indonesia experiences the second highest rate of deforestation among tropical countries (FAO 2005, 2010). Consequently, timely and accurate forest data are required to combat deforestation and forest degradation in support of climate change mitigation and biodiversity conservation policy initiatives. Remote sensing is considered as a significant data source for forest...
Long-Term Variations of the EOP and ICRF2
NASA Technical Reports Server (NTRS)
Zharov, Vladimir; Sazhin, Mikhail; Sementsov, Valerian; Sazhina, Olga
2010-01-01
We analyzed the time series of the coordinates of the ICRF radio sources. We show that part of the radio sources, including the defining sources, shows a significant apparent motion. The stability of the celestial reference frame is provided by a no-net-rotation condition applied to the defining sources. In our case this condition leads to a rotation of the frame axes with time. We calculated the effect of this rotation on the Earth orientation parameters (EOP). In order to improve the stability of the celestial reference frame we suggest a new method for the selection of the defining sources. The method consists of two criteria: the first one we call cosmological and the second one kinematical. It is shown that a subset of the ICRF sources selected according to cosmological criteria provides the most stable reference frame for the next decade.
NASA Astrophysics Data System (ADS)
Poupardin, A.; Heinrich, P.; Hébert, H.; Schindelé, F.; Jamelot, A.; Reymond, D.; Sugioka, H.
2018-05-01
This paper evaluates the importance of frequency dispersion in the propagation of recent trans-Pacific tsunamis. Frequency dispersion induces a time delay for the most energetic waves, which increases for long propagation distances and short source dimensions. To calculate this time delay, propagation of tsunamis is simulated and analyzed from spectrograms of time-series at specific gauges in the Pacific Ocean. One- and two-dimensional simulations are performed by solving either shallow water or Boussinesq equations and by considering realistic seismic sources. One-dimensional sensitivity tests are first performed in a constant-depth channel to study the influence of the source width. Two-dimensional tests are then performed in a simulated Pacific Ocean with a 4000-m constant depth and by considering tectonic sources of 2010 and 2015 Chilean earthquakes. For these sources, both the azimuth and the distance play a major role in the frequency dispersion of tsunamis. Finally, simulations are performed considering the real bathymetry of the Pacific Ocean. Multiple reflections, refractions as well as shoaling of waves result in much more complex time series for which the effects of the frequency dispersion are hardly discernible. The main point of this study is to evaluate frequency dispersion in terms of traveltime delays by calculating spectrograms for a time window of 6 hours after the arrival of the first wave. Results of the spectral analysis show that the wave packets recorded by pressure and tide sensors in the Pacific Ocean seem to be better reproduced by the Boussinesq model than the shallow water model and approximately follow the theoretical dispersion relationship linking wave arrival times and frequencies. Additionally, a traveltime delay is determined above which effects of frequency dispersion are considered to be significant in terms of maximum surface elevations.
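The theoretical dispersion relationship invoked here is the linear one, ω² = gk·tanh(kh); the travel-time delay of a given frequency relative to the non-dispersive shallow-water speed √(gh) follows from the group velocity. A sketch for a constant-depth 4000-m ocean:

```python
import numpy as np

g, h = 9.81, 4000.0            # gravity, ocean depth (m)
c0 = np.sqrt(g * h)            # non-dispersive shallow-water speed (~198 m/s)

def wavenumber(omega, h, g=9.81):
    """Solve omega**2 = g*k*tanh(k*h) for k with Newton's method."""
    k = omega / np.sqrt(g * h)             # shallow-water starting guess
    for _ in range(50):
        f = g * k * np.tanh(k * h) - omega ** 2
        df = g * np.tanh(k * h) + g * k * h / np.cosh(k * h) ** 2
        k -= f / df
    return k

def group_velocity(omega, h, g=9.81):
    k = wavenumber(omega, h, g)
    return 0.5 * (omega / k) * (1.0 + 2.0 * k * h / np.sinh(2.0 * k * h))

# Delay relative to the shallow-water arrival after 10,000 km of propagation:
# shorter-period waves travel slower, hence arrive later.
L = 1.0e7
for period in (2000.0, 600.0, 200.0):      # wave period in seconds
    omega = 2.0 * np.pi / period
    delay = L / group_velocity(omega, h) - L / c0
    print(f"T = {period:6.0f} s: delay ~ {delay:6.0f} s")
```

The increase of the delay with decreasing period and increasing distance is exactly the signature the spectrograms are used to detect.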
State Energy Price and Expenditure Estimates
2017-01-01
The State Energy Price and Expenditure Estimates provide data on energy prices in current dollars per million Btu and expenditures in current dollars, by state and for the United States, by energy source and by sector in annual time-series back to 1970
NASA Astrophysics Data System (ADS)
Benz, N.; Bartlow, N. M.
2017-12-01
The addition of borehole strainmeter (BSM) data to cGPS time series inversions can yield more precise slip distributions at the subduction interface during episodic tremor and slip (ETS) events in the Cascadia subduction zone. BSM data are traditionally very noisy and have not been easy to incorporate, but recent developments in noise processing, re-orientation of strain components, and removal of tidal, hydrologic, and atmospheric signals have made this additional source of data viable (Roeloffs, 2010). The major advantage of BSMs is their sensitivity to spatial derivatives of slip, which is valuable for investigating the ETS nucleation process and stress changes on the plate interface due to ETS. Taking advantage of this, we simultaneously invert PBO GPS and cleaned BSM time series with the Network Inversion Filter (Segall and Matthews, 1997) for slip distribution and slip rate during selected Cascadia ETS events. Stress distributions on the plate interface are also calculated from these inversion results to estimate the amount of stress change during an ETS event. These calculations are performed with and without the BSM time series, highlighting the role of BSM data in constraining slip and stress.
Nonlinear analysis of the occurrence of hurricanes in the Gulf of Mexico and the Caribbean Sea
NASA Astrophysics Data System (ADS)
Rojo-Garibaldi, Berenice; Salas-de-León, David Alberto; Adela Monreal-Gómez, María; Sánchez-Santillán, Norma Leticia; Salas-Monreal, David
2018-04-01
Hurricanes are complex systems that carry large amounts of energy. Their impact often produces natural disasters involving the loss of human lives and materials, such as infrastructure, valued at billions of US dollars. However, not everything about hurricanes is negative, as hurricanes are the main source of rainwater for the regions where they develop. This study shows a nonlinear analysis of the time series of the occurrence of hurricanes in the Gulf of Mexico and the Caribbean Sea from 1749 to 2012. The construction of the hurricane time series was carried out based on the North Atlantic basin hurricane database (HURDAT) and published historical information. The hurricane time series provides a unique historical record of ocean-atmosphere interactions. The Lyapunov exponent indicated that the system presents chaotic dynamics, and the spectral and nonlinear analyses of the hurricane time series showed behavior at the edge of chaos. One possible explanation for this edge-of-chaos behavior is the individual chaotic behavior of hurricanes, either by category or individually regardless of their category, together with their quasi-regular behavior.
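A positive largest Lyapunov exponent is the usual operational criterion for chaos. As an illustration of the estimator (run here on the logistic map, whose exponent is known analytically, rather than on the hurricane series):

```python
import numpy as np

def largest_lyapunov(n=100000, x0=0.3):
    """Largest Lyapunov exponent of the logistic map x -> 4x(1-x),
    estimated as the orbit average of log|f'(x)|."""
    x, total = x0, 0.0
    for _ in range(n):
        total += np.log(abs(4.0 - 8.0 * x))   # |f'(x)| = |4 - 8x|
        x = 4.0 * x * (1.0 - x)
    return total / n

lam = largest_lyapunov()
print(lam > 0)   # True: a positive exponent signals chaotic dynamics
# For r = 4 the exact value is ln 2 ~ 0.693, which the estimate approaches.
```

For observed series such as hurricane counts, where the map is unknown, the exponent is instead estimated from divergence of nearby trajectories in a delay-embedded phase space.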
NASA Astrophysics Data System (ADS)
Kis, A.; Lemperger, I.; Wesztergom, V.; Menvielle, M.; Szalai, S.; Novák, A.; Hada, T.; Matsukiyo, S.; Lethy, A. M.
2016-12-01
The magnetotelluric method is widely applied for the investigation of subsurface structures by imaging the spatial distribution of electric conductivity. The method is based on the experimental determination of the surface electromagnetic impedance tensor (Z) from surface geomagnetic and telluric registrations in two perpendicular orientations. In practical exploration the accurate estimation of Z necessitates the application of robust statistical methods for two reasons: 1) the geomagnetic and telluric time series are contaminated by man-made noise components, and 2) the non-homogeneous behavior of ionospheric current systems in the period range of interest (ELF-ULF and longer periods) results in systematic deviation of the impedance of individual time windows. Robust statistics mitigate both effects on Z for the purpose of subsurface investigations. However, accurate analysis of the long-term temporal variation of the first and second statistical moments of Z may provide valuable information about the characteristics of the ionospheric source current systems. Temporal variation of the extent, spatial variability, and orientation of the ionospheric source currents has specific effects on the surface impedance tensor. Twenty-year-long geomagnetic and telluric recordings of the Nagycenk Geophysical Observatory provide a unique opportunity to reconstruct the so-called magnetotelluric source effect and obtain information about the spatial and temporal behavior of ionospheric source currents at mid-latitudes. A detailed investigation of the time series of the surface electromagnetic impedance tensor has been carried out in different frequency classes of the ULF range. The presentation aims to provide a brief review of our results related to long-term periodic modulations, up to solar-cycle scale, and to eventual deviations of the electromagnetic impedance and thus the reconstructed equivalent ionospheric source effects.
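The impedance tensor estimation underlying the method is a bivariate regression E = Z B between the horizontal telluric and geomagnetic components. A minimal synthetic sketch with ordinary least squares (practical work replaces this with robust estimators, for the reasons the abstract gives):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5000
B = rng.normal(size=(n, 2))                 # horizontal geomagnetic components
Z_true = np.array([[0.1, 2.0],
                   [-1.8, -0.1]])           # synthetic impedance tensor
E = B @ Z_true.T + 0.05 * rng.normal(size=(n, 2))   # telluric field + noise

# Ordinary least-squares estimate of Z from the bivariate regression E = Z B.
Z_hat, *_ = np.linalg.lstsq(B, E, rcond=None)
Z_hat = Z_hat.T
print(np.allclose(Z_hat, Z_true, atol=0.01))  # True on this clean synthetic data
```

With outlier-contaminated windows, the least-squares estimate is biased; tracking how the per-window estimates drift over years is the source-effect analysis the abstract describes.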
Performance Assessment of Network Intrusion-Alert Prediction
2012-09-01
the threats. In this thesis, we use Snort to generate the intrusion detection alerts. 2. SNORT Snort is an open source network intrusion...standard for IPS. (Snort, 2012) We choose Snort because it is an open source product that is free to download and can be deployed cross-platform...Learning & prediction in relational time series: A survey. 21st Behavior Representation in Modeling & Simulation ( BRIMS ) Conference 2012, 93–100. Tan
PyEEG: an open source Python module for EEG/MEG feature extraction.
Bao, Forrest Sheng; Liu, Xin; Zhang, Christina
2011-01-01
Computer-aided diagnosis of neural diseases from EEG signals (or other physiological signals that can be treated as time series, e.g., MEG) is an emerging field that has gained much attention in past years. Extracting features is a key component in the analysis of EEG signals. In our previous works, we have implemented many EEG feature extraction functions in the Python programming language. As Python is gaining more ground in scientific computing, an open source Python module for extracting EEG features has the potential to save much time for computational neuroscientists. In this paper, we introduce PyEEG, an open source Python module for EEG feature extraction. PMID:21512582
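To give a flavor of the kind of waveform feature PyEEG provides, the sketch below reimplements the Petrosian fractal dimension from scratch; this is a self-contained illustration of the formula, not PyEEG's own code.

```python
import numpy as np

def petrosian_fd(x):
    """Petrosian fractal dimension of a 1-D signal: irregular signals with
    many sign changes in the first derivative score higher than smooth ones."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    diff = np.diff(x)
    n_delta = np.sum(diff[1:] * diff[:-1] < 0)  # sign changes in the derivative
    return np.log10(n) / (np.log10(n) + np.log10(n / (n + 0.4 * n_delta)))

t = np.linspace(0, 1, 1000, endpoint=False)
smooth = np.sin(2 * np.pi * 10 * t)                  # regular oscillation
noisy = np.random.default_rng(1).normal(size=1000)   # white noise
```

Applied to the two test signals, the noisy series yields a higher fractal dimension than the smooth sinusoid, as expected for this complexity measure.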
NASA Astrophysics Data System (ADS)
Feuerstein, Stefanie; Schepanski, Kerstin
2017-04-01
One of the world's largest sources of atmospheric dust is the Sahara: an estimated 55% of total global dust emission can be linked to the desert in northern Africa. Understanding the Saharan dust sources is therefore of great importance for estimating the total global dust load and its variability. One type of dust source in particular has gained attention in dust research in recent years: the emission of dust from sediments formed by hydrologic processes, so-called alluvial dust sources. These sediments were either formed in the past under the influence of a more humid paleoclimate or are deposited recently, e.g. during strong precipitation events when surficial runoff leads to the activation of wadi systems or to flash floods. The latter phenomenon in particular can deliver a huge amount of potentially erodible sediment. The research presented here focuses on the characterization of these alluvial dust sources, with special attention to their temporal variability in relation to wet and dry phases. A study area covering the Aïr Massif in Niger is analysed over a four-year time span from January 2013 to December 2016. The whole cycle from sediment formation to dust emission is illustrated using data from various satellite sensors that capture the processes taking place at the land surface as well as in the atmosphere: (1) The rainfall distribution for the study area is shown by time series of TRMM precipitation estimates. A catchment analysis of the area helps to estimate the amount of surficial runoff and to detect areas of potential sediment accumulation. (2) Changes in the sediment structure of the land surface are analysed using atmospherically corrected time series of NASA's Landsat-8 OLI satellite. A land cover classification shows the distribution of alluvial sediments over the area; fresh layers of alluvial deposits are detected.
Furthermore, the evolution of the vegetation cover, which inhibits dust emission, is analysed by calculating NDVI time series from the Landsat data. (3) The MSG Dust Product is used to determine the frequency of dust emission from the investigation area; the product also allows precise localization of the sources, so the alluvial sediments can be directly connected to dust emission. By combining the findings of these different satellite sensors, a profound analysis of alluvial dust sources on different levels is possible. The connection between the amount of precipitation and the supply of potentially erodible sediments is shown, which leads to a better understanding of the temporal evolution and importance of this source type.
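The NDVI used above is a simple band ratio of near-infrared and red reflectance; a minimal sketch with illustrative reflectance values (not actual Landsat data):

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red),
    bounded in [-1, 1]; high for dense vegetation, near zero for bare soil."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red)

# Dense vegetation reflects strongly in NIR; bare alluvial sediment does not.
veg = ndvi(0.45, 0.05)
bare = ndvi(0.30, 0.25)
```

Thresholding such values per pixel and per acquisition date yields the vegetation-cover time series described in the abstract.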
Providing web-based tools for time series access and analysis
NASA Astrophysics Data System (ADS)
Eberle, Jonas; Hüttich, Christian; Schmullius, Christiane
2014-05-01
Time series information is widely used in environmental change analyses and is also essential information for stakeholders and governmental agencies. However, a challenging issue is the processing of raw data and the execution of time series analyses. In most cases, data has to be found, downloaded, processed and even converted to the correct data format prior to executing time series analysis tools. Data has to be prepared for use in different existing software packages. Several packages like TIMESAT (Jönsson & Eklundh, 2004) for phenological studies, BFAST (Verbesselt et al., 2010) for breakpoint detection, and GreenBrown (Forkel et al., 2013) for trend calculations are provided as open-source software and can be executed from the command line. This is needed if data pre-processing and time series analysis are to be automated. To bring both parts, automated data access and data analysis, together, a web-based system was developed to provide access to satellite-based time series data and to the above-mentioned analysis tools. Users of the web portal are able to specify a point or a polygon and an available dataset (e.g., Vegetation Indices and Land Surface Temperature datasets from NASA MODIS). The data is then processed and provided as a time series CSV file. Afterwards the user can select an analysis tool that is executed on the server. The final data (CSV, plot images, GeoTIFFs) is visualized in the web portal and can be downloaded for further usage. As a first use case, we built a complementary web-based system with NASA MODIS products for Germany and parts of Siberia based on the Earth Observation Monitor (www.earth-observation-monitor.net). The aim of this work is to make time series analysis with existing tools as easy as possible, so that users can focus on the interpretation of the results. References: Jönsson, P. and L. Eklundh (2004). TIMESAT - a program for analysing time-series of satellite sensor data.
Computers and Geosciences 30, 833-845. Verbesselt, J., R. Hyndman, G. Newnham and D. Culvenor (2010). Detecting trend and seasonal changes in satellite image time series. Remote Sensing of Environment, 114, 106-115. DOI: 10.1016/j.rse.2009.08.014 Forkel, M., N. Carvalhais, J. Verbesselt, M. Mahecha, C. Neigh and M. Reichstein (2013). Trend Change Detection in NDVI Time Series: Effects of Inter-Annual Variability and Methodology. Remote Sensing 5, 2113-2144.
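The breakpoint idea behind tools like BFAST can be illustrated with a toy single-breakpoint detector: scan candidate split points and keep the one minimizing the residual error of a two-segment fit. This is a deliberate simplification for illustration, not the BFAST algorithm.

```python
import numpy as np

def best_mean_shift(y, min_seg=5):
    """Return the index minimizing the total squared error of a two-segment
    constant-mean fit: a toy version of the structural-change idea behind
    BFAST (which also handles trend and seasonal components)."""
    y = np.asarray(y, dtype=float)
    best_i, best_sse = None, np.inf
    for i in range(min_seg, len(y) - min_seg):
        sse = ((y[:i] - y[:i].mean()) ** 2).sum() + ((y[i:] - y[i:].mean()) ** 2).sum()
        if sse < best_sse:
            best_i, best_sse = i, sse
    return best_i

rng = np.random.default_rng(2)
# Synthetic series with an abrupt level shift at index 60
y = np.concatenate([rng.normal(0.0, 0.1, 60), rng.normal(1.0, 0.1, 40)])
bp = best_mean_shift(y)
```

On this synthetic series the detector recovers the shift location to within a few samples.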
Application of dynamic topic models to toxicogenomics data.
Lee, Mikyung; Liu, Zhichao; Huang, Ruili; Tong, Weida
2016-10-06
All biological processes are inherently dynamic. Biological systems evolve transiently or sustainably across sequential time points after perturbation by environmental insults, drugs and chemicals. Investigating the temporal behavior of molecular events has been an important subject for understanding the underlying mechanisms governing a biological system's response to perturbations such as drug treatment. The intrinsic complexity of time series data requires appropriate computational algorithms for data interpretation. In this study, we propose, for the first time, the application of dynamic topic models (DTM) for analyzing time-series gene expression data. A large time-series toxicogenomics dataset was studied. It contains over 3144 microarrays of gene expression data corresponding to rat livers treated with 131 compounds (most are drugs) at two doses (control and high dose) in a repeated schedule containing four separate time points (4-, 8-, 15- and 29-day). We analyzed, with DTM, the topics (each consisting of a set of genes) and their biological interpretations over these four time points. We identified hidden patterns embedded in these time-series gene expression profiles. From the topic distribution for each compound-time condition, a number of drugs were successfully clustered by their shared mode of action, such as PPARα agonists and COX inhibitors. The biological meaning underlying each topic was interpreted using diverse sources of information, such as functional analysis of the pathways and therapeutic uses of the drugs. Additionally, we found that sample clusters produced by DTM are much more coherent in terms of functional categories than those produced by traditional clustering algorithms. We demonstrated that DTM, a text mining technique, can be a powerful computational approach for clustering time-series gene expression profiles with a probabilistic representation of their dynamic features along sequential time frames. The method offers an alternative way of uncovering hidden patterns embedded in time-series gene expression profiles to gain an enhanced understanding of the dynamic behavior of gene regulation in the biological system.
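Dynamic topic models involve sequential Bayesian inference and are beyond a short sketch; as a much simpler stand-in for the "topics as gene groupings" idea, the example below factorizes a nonnegative expression-like matrix with multiplicative-update NMF. This is plainly not the authors' method, and the data are synthetic.

```python
import numpy as np

def nmf(V, k, iters=1000, eps=1e-9, seed=0):
    """Multiplicative-update NMF: factor a nonnegative matrix V
    (conditions x genes) into W (condition loadings) and H (k "topics"
    over genes). A simpler, static stand-in for topic modeling."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, k)) + 0.1
    H = rng.random((k, m)) + 0.1
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)   # update topic-gene weights
        W *= (V @ H.T) / (W @ H @ H.T + eps)   # update condition loadings
    return W, H

rng = np.random.default_rng(3)
A = rng.random((20, 4))
B = rng.random((4, 50))
V = A @ B                          # exactly rank-4 nonnegative "expression" matrix
W, H = nmf(V, k=4)
rel_err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

On an exactly low-rank nonnegative matrix the factorization reconstructs V closely, and both factors stay nonnegative, which is what makes the rows of H interpretable as additive "topics".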
Visual Analytics of integrated Data Systems for Space Weather Purposes
NASA Astrophysics Data System (ADS)
Rosa, Reinaldo; Veronese, Thalita; Giovani, Paulo
Analysis of information from multiple data sources obtained through high-resolution instrumental measurements has become a fundamental task in all scientific areas. The development of expert methods able to treat such multi-source data systems, with both large variability and measurement extension, is key to studying complex scientific phenomena, especially those related to systemic analysis in space and environmental sciences. In this talk, we present a time series generalization introducing the concept of a generalized numerical lattice, which represents a discrete sequence of temporal measures for a given variable. In this novel representation approach, each generalized numerical lattice carries post-analytical data information. We define a generalized numerical lattice as a set of three parameters representing the following data properties: dimensionality, size and a post-analytical measure (e.g., the autocorrelation, the Hurst exponent, etc.) [1]. From this generalized representation, any multi-source database can be reduced to a closed set of classified time series in spatiotemporal generalized dimensions. As a case study, we show a preliminary application to space science data, highlighting the possibility of a real-time expert analysis system. In this particular application, we have selected and analyzed, using detrended fluctuation analysis (DFA), several decimetric solar bursts associated with X-class flares. The association with geomagnetic activity is also reported. The DFA method is performed in the framework of a radio burst automatic monitoring system. Our results may characterize the evolution of the variability pattern by computing the DFA scaling exponent, scanning the time series with a short window before the extreme event [2]. For the first time, the application of systematic fluctuation analysis for space weather purposes is presented.
The prototype for visual analytics is implemented in a Compute Unified Device Architecture (CUDA) by using the K20 Nvidia graphics processing units (GPUs) to reduce the integrated analysis runtime. [1] Veronese et al. doi: 10.6062/jcis.2009.01.02.0021, 2010. [2] Veronese et al. doi:http://dx.doi.org/10.1016/j.jastp.2010.09.030, 2011.
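DFA, the method applied to the burst series above, can be sketched compactly: integrate the series, detrend it in windows of increasing size, and fit the log-log slope of the fluctuation function. This is a generic textbook implementation run on synthetic white noise, not the authors' monitoring code.

```python
import numpy as np

def dfa_alpha(x, scales=(8, 16, 32, 64, 128)):
    """Detrended fluctuation analysis scaling exponent alpha
    (~0.5 for white noise, ~1.5 for Brownian motion)."""
    y = np.cumsum(x - np.mean(x))          # integrated profile
    F = []
    for s in scales:
        n_seg = len(y) // s
        segs = y[:n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        ms = []
        for seg in segs:
            coef = np.polyfit(t, seg, 1)   # linear detrend per window
            ms.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(ms)))     # RMS fluctuation at this scale
    return np.polyfit(np.log(scales), np.log(F), 1)[0]

noise = np.random.default_rng(4).normal(size=8192)
alpha = dfa_alpha(noise)
```

For uncorrelated noise the exponent comes out near 0.5; persistent (long-memory) series yield larger values, which is how changes in the variability pattern before an extreme event can be tracked.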
Wilkinson, S N; Dougall, C; Kinsey-Henderson, A E; Searle, R D; Ellis, R J; Bartley, R
2014-01-15
The use of river basin modelling to guide mitigation of non-point source pollution of wetlands, estuaries and coastal waters has become widespread. Assessing and simulating the impacts of alternative land use or climate scenarios on river washload requires modelling techniques that represent sediment sources and transport at the time scales of system response. Building on the mean-annual SedNet model, we propose a new D-SedNet model which constructs daily budgets of fine sediment sources, transport and deposition for each link in a river network. Erosion rates (hillslope, gully and streambank erosion) and fine sediment sinks (floodplains and reservoirs) are disaggregated from mean annual rates based on daily rainfall and runoff. The model is evaluated in the Burdekin basin in tropical Australia, where policy targets have been set for reducing sediment and nutrient loads to the Great Barrier Reef (GBR) lagoon from grazing and cropping land. D-SedNet predicted annual loads with similar performance to that of a sediment rating curve calibrated to monitored suspended sediment concentrations. Relative to a 22-year reference load time series at the basin outlet derived from a dynamic generalized additive model based on monitoring data, D-SedNet had a median absolute error of 68% compared with 112% for the rating curve. RMS error was slightly higher for D-SedNet than for the rating curve due to large relative errors on small loads in several drought years. This accuracy is similar to that of existing agricultural system models used in arable or humid environments. Predicted river loads were sensitive to ground vegetation cover. We conclude that the river network sediment budget model provides some capacity for predicting load time series independent of monitoring data in ungauged basins, and for evaluating the impact of land management on river sediment load time series, which is challenging across large regions in data-poor environments.
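The sediment rating curve used as the benchmark above is conventionally a power law C = aQ^b fitted by least squares in log-log space; a minimal sketch on synthetic data (all parameter values illustrative):

```python
import numpy as np

# Synthetic discharge Q (m3/s) and suspended sediment concentration C (mg/L)
# following a power-law rating curve C = a * Q**b with lognormal scatter.
rng = np.random.default_rng(5)
a_true, b_true = 2.0, 1.3
Q = rng.lognormal(mean=2.0, sigma=0.8, size=300)
C = a_true * Q ** b_true * rng.lognormal(sigma=0.2, size=300)

# Fit the rating curve by ordinary least squares in log-log space:
# log C = log a + b log Q
b_hat, log_a_hat = np.polyfit(np.log(Q), np.log(C), 1)
a_hat = np.exp(log_a_hat)
```

Multiplying the fitted concentration by discharge and integrating over time gives a load time series, which is the kind of benchmark D-SedNet was compared against.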
MULTIPLE INPUT BINARY ADDER EMPLOYING MAGNETIC DRUM DIGITAL COMPUTING APPARATUS
Cooke-Yarborough, E.H.
1960-12-01
A digital computing apparatus is described for adding a plurality of multi-digit binary numbers. The apparatus comprises a rotating magnetic drum, a recording head, first and second reading heads disposed adjacent to the first and second recording tracks, and a series of timing signals recorded on the first track. A series of N groups of digit-representing signals is delivered to the recording head at time intervals corresponding to the timing signals, each group consisting of digits of the same significance in the numbers, and the signal series is recorded on the second track of the drum in synchronism with the timing signals on the first track. The multistage registers are stepped cyclically through all positions, and each of the multistage registers is coupled to the control lead of a separate gate circuit to open the corresponding gate at only one selected position in each cycle. One of the gates has its input coupled to the bistable element to receive the sum digit, and the output lead of this gate is coupled to the recording device. The inputs of the other gates receive the digits to be added from the second reading head, and the outputs of these gates are coupled to the adding register. A phase-setting pulse source is connected to each of the multistage registers individually to step the multistage registers to different initial positions in the cycle, and the phase-setting pulse source is actuated every N time intervals to shift a sum digit to the bistable element, where the multistage register coupled to the bistable element is operated by the phase-setting pulse source to that position in its cycle N steps before opening the first gate, so that this gate opens in synchronism with each of the shifts to pass the sum digits to the recording head.
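In software terms, the drum scheme amounts to bit-serial addition: digits of equal significance arrive as a group, the sum digit for that weight is recorded, and the excess is carried to the next weight. The sketch below is a loose Python analogue of that digit-serial flow, not a simulation of the patented hardware.

```python
def serial_add(numbers, width):
    """Add N binary numbers digit-serially, one significance level per step:
    each step consumes the group of same-weight digits plus the carry, records
    one sum digit, and carries the excess to higher weights."""
    carry = 0
    out_bits = []
    for k in range(width):
        total = carry + sum((n >> k) & 1 for n in numbers)  # group of k-th bits
        out_bits.append(total & 1)   # sum digit recorded for this weight
        carry = total >> 1           # remainder carried forward
    while carry:                     # flush remaining carry bits
        out_bits.append(carry & 1)
        carry >>= 1
    return sum(b << i for i, b in enumerate(out_bits))

nums = [0b1011, 0b1101, 0b0110, 0b1111]
result = serial_add(nums, width=4)
```

Note that, unlike a two-operand adder, the per-step carry here can span several bits, which mirrors why the hardware needs registers stepped to different phases rather than a single carry flip-flop.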
NASA Astrophysics Data System (ADS)
Kadlec, J.; Ames, D. P.
2014-12-01
The aim of the presented work is to create a freely accessible, dynamic and re-usable snow cover map of the world by combining snow extent and snow depth datasets from multiple sources. The examined data sources are: remote sensing datasets (MODIS, CryoLand), weather forecasting model outputs (OpenWeatherMap, forecast.io), ground observation networks (CUAHSI HIS, GSOD, GHCN, and selected national networks), and user-contributed snow reports on social networks (cross-country and backcountry skiing trip reports). For each type of dataset, an interface and an adapter are created. Each adapter supports queries by area, time range, or a combination of area and time range. The combined dataset is published as an online snow cover mapping service. This web service lowers the learning curve required to view, access, and analyze snow depth maps and snow time series. All data published by this service are licensed as open data, encouraging re-use of the data in customized applications in climatology, hydrology, sports and other disciplines. The initial version of the interactive snow map is on the website snow.hydrodata.org. The website supports view by time and view by site. In view by time, the spatial distribution of snow for a selected area and time period is shown. In view by site, time-series charts of snow depth at a selected location are displayed. All snow extent and snow depth map layers and time series are accessible and discoverable through internationally approved protocols including WMS, WFS, WCS, WaterOneFlow and WaterML. Therefore they can also easily be added to GIS software or third-party web map applications.
The central hypothesis driving this research is that the integration of user contributed data and/or social-network derived snow data together with other open access data sources will result in more accurate and higher resolution - and hence more useful snow cover maps than satellite data or government agency produced data by itself.
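The adapter layer described above can be sketched as a common query interface that every data source implements. All class and field names below are hypothetical illustrations of the pattern, not the project's actual API.

```python
from dataclasses import dataclass
from datetime import date
from typing import Protocol

@dataclass
class BBox:
    west: float
    south: float
    east: float
    north: float

    def contains(self, lon, lat):
        return self.west <= lon <= self.east and self.south <= lat <= self.north

class SnowSource(Protocol):
    """Interface each source adapter implements: query snow observations
    by area, time range, or both (names hypothetical)."""
    def query(self, bbox=None, start=None, end=None): ...

class InMemorySource:
    """Toy adapter over a list of (lon, lat, date, depth_cm) records, standing
    in for a remote-sensing, model, or station-network backend."""
    def __init__(self, records):
        self.records = records

    def query(self, bbox=None, start=None, end=None):
        out = self.records
        if bbox is not None:
            out = [r for r in out if bbox.contains(r[0], r[1])]
        if start is not None:
            out = [r for r in out if start <= r[2] <= (end or r[2])]
        return out

src = InMemorySource([(8.5, 47.4, date(2014, 1, 5), 30),
                      (-111.9, 43.8, date(2014, 1, 6), 12)])
```

Because every adapter answers the same query signature, the mapping service can merge heterogeneous sources behind one endpoint.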
Lehmann, Dietrich; Faber, Pascal L; Gianotti, Lorena R R; Kochi, Kieko; Pascual-Marqui, Roberto D
2006-01-01
Brain electric mechanisms of temporary, functional binding between brain regions are studied using computation of scalp EEG coherence and phase locking, sensitive to time differences of a few milliseconds. However, such results, if computed from scalp data, are ambiguous, since electric sources are spatially oriented. Non-ambiguous results can be obtained using calculated time series of the strength of intracerebral model sources. This is illustrated by applying LORETA modeling to EEG during resting and meditation. During meditation, time series of LORETA model sources revealed a tendency towards decreased left-right intracerebral coherence in the delta band, and towards increased anterior-posterior intracerebral coherence in the theta band. An alternative conceptualization of functional binding is based on the observation that brain electric activity is discontinuous, i.e., that it occurs in chunks of up to about 100 ms duration that are detectable as quasi-stable scalp field configurations of brain electric activity, called microstates. Their functional significance is illustrated in spontaneous and event-related paradigms, where microstates associated with imagery- versus abstract-type mentation, or with reading positive versus negative emotion words, showed clearly different regions of cortical activation in LORETA tomography. These data support the concept that complete brain functions of higher order, such as a momentary thought, might be incorporated in temporal chunks of processing in the range of tens to about 100 ms as quasi-stable brain states; during these time windows, subprocesses would be accepted as members of the ongoing chunk of processing.
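Coherence of the kind discussed here is commonly computed as magnitude-squared coherence averaged over epochs: |⟨Sxy⟩|² / (⟨Sxx⟩⟨Syy⟩), which is near 1 when the phase relation between two series is stable across epochs. The sketch below is a generic implementation on synthetic signals, not the LORETA pipeline.

```python
import numpy as np

def coherence(x, y, n_epochs=32):
    """Magnitude-squared coherence per frequency bin, averaged over epochs.
    A fixed delay between x and y keeps the cross-spectrum phase constant,
    so coherence stays high; independent signals average toward ~1/n_epochs."""
    L = len(x) // n_epochs
    X = np.fft.rfft(x[:n_epochs * L].reshape(n_epochs, L), axis=1)
    Y = np.fft.rfft(y[:n_epochs * L].reshape(n_epochs, L), axis=1)
    Sxy = np.mean(X * np.conj(Y), axis=0)
    Sxx = np.mean(np.abs(X) ** 2, axis=0)
    Syy = np.mean(np.abs(Y) ** 2, axis=0)
    return np.abs(Sxy) ** 2 / (Sxx * Syy)

rng = np.random.default_rng(6)
x = rng.normal(size=4096)
lagged = np.roll(x, 3)                 # same signal, constant few-sample delay
independent = rng.normal(size=4096)
```

Applying the function to the model-source time series rather than scalp channels is what removes the spatial-orientation ambiguity described in the abstract.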
NASA Astrophysics Data System (ADS)
Duan, Yixiang; Su, Yongxuan; Jin, Zhe; Abeln, Stephen P.
2000-03-01
The development of a highly sensitive, field-portable, low-powered instrument for on-site, real-time liquid waste stream monitoring is described in this article. A series of factors, such as system sensitivity and portability, plasma source, sample introduction, desolvation system, power supply, and instrument configuration, was carefully considered in the design of the portable instrument. A newly designed, miniature, modified microwave plasma source was selected as the emission source for spectroscopic measurement, and an integrated small spectrometer with a charge-coupled device detector was installed for signal processing and detection. An innovative beam collection system with optical fibers was designed and used for emission signal collection. Microwave plasma can be sustained with various gases at relatively low power, and it possesses high detection capabilities for both metal and nonmetal pollutants, making it desirable for on-site, real-time liquid waste stream monitoring. An effective in situ sampling system was coupled with a high-efficiency desolvation device for directly sampling liquid samples into the plasma. A portable computer control system is used for data processing. The new, integrated instrument can be easily used for on-site, real-time monitoring in the field. The system possesses a series of advantages, including high sensitivity for metal and nonmetal elements; in situ sampling; compact structure; low cost; and ease of operation and handling. These advantages significantly overcome the limitations of previous monitoring techniques and make great contributions to environmental restoration and monitoring.
Methodological uncertainties in multi-regression analyses of middle-atmospheric data series.
Kerzenmacher, Tobias E; Keckhut, Philippe; Hauchecorne, Alain; Chanin, Marie-Lise
2006-07-01
Multi-regression analyses have often been used recently to detect trends, in particular in ozone or temperature data sets in the stratosphere. The confidence in detecting trends depends on a number of factors which generate uncertainties. Part of these uncertainties comes from random variability, and this part is what is usually considered; it can be statistically estimated from residual deviations between the data and the fitting model. However, interference between different sources of variability affecting the data set, such as the Quasi-Biennial Oscillation (QBO), volcanic aerosols, solar flux variability and the trend, can also be a critical source of error. This type of error has hitherto not been well quantified. In this work an artificial data series has been generated to carry out such estimates. The sources of error considered here are: the length of the data series, the dependence on the choice of parameters used in the fitting model, and the time evolution of the trend in the data series. The curves provided here will permit future studies to test the magnitude of the methodological bias expected for a given case, as shown in several real examples. It is found that, if the data series is shorter than a decade, the uncertainties are very large, whatever factors are chosen to identify the sources of variability. The errors can, however, be limited when dealing with natural variability, provided a sufficient number of periods (for periodic forcings) is covered by the analysed dataset. When analysing the trend, however, the response to volcanic eruptions induces a bias, whatever the length of the data series. The signal-to-noise ratio is a key factor: doubling the noise increases the period for which data are required in order to obtain an error smaller than 10%, from 1 to 3-4 decades. Moreover, if non-linear trends are superimposed on the data, and if the length of the series is longer than five years, a non-linear function has to be used to estimate trends.
When applied to real data series, and when a breakpoint in the series occurs, the study reveals that data extending over 5 years are needed to detect a significant change in the slope of the ozone trends at mid-latitudes.
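A multi-regression trend fit of the kind analysed here can be sketched on an artificial series with a known trend plus solar-cycle and QBO-like proxies; all coefficients below are synthetic, chosen only to show the recovery.

```python
import numpy as np

rng = np.random.default_rng(7)
t = np.arange(360) / 12.0                      # 30 years of monthly values
solar = np.sin(2 * np.pi * t / 11.0)           # 11-year solar-cycle proxy
qbo = np.sin(2 * np.pi * t / 2.3)              # quasi-biennial proxy (~28 months)
y = 0.05 * t + 0.3 * solar + 0.2 * qbo + rng.normal(0.0, 0.1, t.size)

# Multi-regression: design matrix with constant, linear trend, and proxies
X = np.column_stack([np.ones_like(t), t, solar, qbo])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
```

With three decades of data the proxies are nearly orthogonal to the trend and all coefficients are recovered accurately; shortening `t` to a few years makes the terms interfere, which is exactly the methodological bias the study quantifies.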
Extragalactic Science With Kepler
NASA Astrophysics Data System (ADS)
Fanelli, Michael N.; Marcum, P.
2012-01-01
Although designed as an exoplanet and stellar astrophysics experiment, the Kepler mission provides a unique capability to explore the essentially unknown photometric stability of galactic systems at millimag levels using Kepler's blend of high precision and continuous monitoring. Time series observations of galaxies are sensitive to both quasi-continuous variability, driven by accretion activity from embedded active nuclei, and random, episodic events, such as supernovae. In general, galaxies lacking active nuclei are not expected to be variable with the timescales and amplitudes observed in stellar sources and are free of source motions that affect stars (e.g., parallax). These sources can serve as a population of quiescent, non-variable sources, which may be used to quantify the photometric stability and noise characteristics of the Kepler photometer. A factor limiting galaxy monitoring in the Kepler FOV is the overall lack of detailed quantitative information for the galaxy population. Despite these limitations, a significant number of galaxies are being observed, forming the Kepler Galaxy Archive. Observed sources total approximately 100, 250, and 700 in Cycles 1-3 (Cycle 3 began in June 2011). In this poster we interpret the properties of a set of 20 galaxies monitored during quarters 4 through 8, their associated light curves, photometric and astrometric precision and potential variability. We describe data analysis issues relevant to extended sources and available software tools. In addition, we detail ongoing surveys that are providing new photometric and morphological information for galaxies over the entire field. These new datasets will both aid the interpretation of the time series, and improve source selection, e.g., help identify candidate AGNs and starburst systems, for further monitoring.
NASA Astrophysics Data System (ADS)
Anderson, J.; Johnson, J. B.; Arechiga, R. O.; Edens, H. E.; Thomas, R. J.
2011-12-01
We use radio frequency (VHF) pulse locations mapped with the New Mexico Tech Lightning Mapping Array (LMA) to study the distribution of thunder sources in lightning channels. A least squares inversion is used to fit channel acoustic energy radiation with broadband (0.01 to 500 Hz) acoustic recordings using microphones deployed local (< 10 km) to the lightning. We model the thunder (acoustic) source as a superposition of line segments connecting the LMA VHF pulses. An optimum branching algorithm is used to reconstruct conductive channels delineated by VHF sources, which we discretize as a superposition of finely-spaced (0.25 m) acoustic point sources. We consider total radiated thunder as a weighted superposition of acoustic waves from individual channels, each with a constant current along its length that is presumed to be proportional to acoustic energy density radiated per unit length. Merged channels are considered as a linear sum of current-carrying branches and radiate proportionally greater acoustic energy. Synthetic energy time series for a given microphone location are calculated for each independent channel. We then use a non-negative least squares inversion to solve for channel energy densities to match the energy time series determined from broadband acoustic recordings across a 4-station microphone network. Events analyzed by this method have so far included 300-1000 VHF sources, and correlations as high as 0.5 between synthetic and recorded thunder energy were obtained, despite the presence of wind noise and 10-30 m uncertainty in VHF source locations.
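The non-negative least-squares step can be sketched with a dependency-free projected-gradient solver; scipy.optimize.nnls would be the standard tool, and the random kernel below is only a toy stand-in for the channel-to-microphone energy mapping.

```python
import numpy as np

def nnls_pg(A, b, iters=5000):
    """Minimize ||Ax - b|| subject to x >= 0 via projected gradient descent:
    take a gradient step, then clip negative components to zero."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2      # 1 / Lipschitz constant
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = np.maximum(0.0, x - step * A.T @ (A @ x - b))
    return x

rng = np.random.default_rng(8)
A = rng.random((40, 6))                             # toy channel-to-station kernel
x_true = np.array([0.0, 2.0, 0.0, 1.0, 0.5, 0.0])   # per-channel acoustic energies
b = A @ x_true                                      # synthetic thunder energy data
x_hat = nnls_pg(A, b)
```

The non-negativity constraint is what keeps the solution physical: a lightning channel cannot radiate negative acoustic energy, so channels not supported by the data are driven to exactly zero.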
Piezotube borehole seismic source
Daley, Tom M; Solbau, Ray D; Majer, Ernest L
2014-05-06
A piezoelectric borehole source capable of permanent or semipermanent insertion into a well for uninterrupted well operations is described. The source itself comprises a series of piezoelectric rings mounted to an insulative mandrel internally sized to fit over a section of well tubing, the rings encased in a protective housing and electrically connected to a power source. Providing an AC voltage to the rings will cause expansion and contraction sufficient to create a sonic pulse. The piezoelectric borehole source fits into a standard well, and allows for uninterrupted pass-through of production tubing, and other tubing and electrical cables. Testing using the source may be done at any time, even concurrent with well operations, during standard production.
A Generalized Wave Diagram for Moving Sources
NASA Astrophysics Data System (ADS)
Alt, Robert; Wiley, Sam
2004-12-01
Many introductory physics texts [1-5] accompany the discussion of the Doppler effect and the formation of shock waves with diagrams illustrating the effect of a source moving through an elastic medium. Typically these diagrams consist of a series of equally spaced dots, representing the location of the source at different times. These are surrounded by a series of successively smaller circles representing wave fronts (see Fig. 1). While such a diagram provides a clear illustration of the shock wave produced by a source moving at a speed greater than the wave speed, and also the resultant pattern when the source speed is less than the wave speed (the Doppler effect), the texts do not often show the details of the construction. As a result, the key connection between the relative distance traveled by the source and the distance traveled by the wave is not explicitly made. In this paper we describe an approach emphasizing this connection that we have found to be a useful classroom supplement to the usual text presentation. As shown in Fig. 2 and Fig. 3, the Doppler effect and the shock wave can be illustrated by diagrams generated by the construction that follows.
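The key relation of the construction (how far the source closes on its own wavefronts each period) gives the wavefront spacing ahead of and behind the source directly:

```python
def doppler_wavelength(wavelength, v_wave, v_source):
    """Wavefront spacing ahead of and behind a source moving at v_source
    through a medium with wave speed v_wave. In one period T, a front
    travels v_wave*T while the source closes v_source*T of that distance."""
    T = wavelength / v_wave
    ahead = (v_wave - v_source) * T     # fronts bunched ahead of the source
    behind = (v_wave + v_source) * T    # fronts stretched behind it
    return ahead, behind

# A source at half the wave speed: spacing halves ahead, grows 1.5x behind.
ahead, behind = doppler_wavelength(wavelength=2.0, v_wave=340.0, v_source=170.0)
```

When v_source reaches v_wave, the spacing ahead collapses to zero and the fronts pile up into the shock cone, which is exactly the limiting case the diagram construction makes visible.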
NASA Astrophysics Data System (ADS)
El Yazidi, Abdelhadi; Ramonet, Michel; Ciais, Philippe; Broquet, Gregoire; Pison, Isabelle; Abbaris, Amara; Brunner, Dominik; Conil, Sebastien; Delmotte, Marc; Gheusi, Francois; Guerin, Frederic; Hazan, Lynn; Kachroudi, Nesrine; Kouvarakis, Giorgos; Mihalopoulos, Nikolaos; Rivier, Leonard; Serça, Dominique
2018-03-01
This study deals with the problem of identifying atmospheric data influenced by local emissions that can result in spikes in time series of greenhouse gases and long-lived tracer measurements. We considered three spike detection methods known as coefficient of variation (COV), robust extraction of baseline signal (REBS) and standard deviation of the background (SD) to detect and filter positive spikes in continuous greenhouse gas time series from four monitoring stations representative of the European ICOS (Integrated Carbon Observation System) Research Infrastructure network. The results of the different methods are compared to each other and against a manual detection performed by station managers. Four stations were selected as test cases to apply the spike detection methods: a continental rural tower of 100 m height in eastern France (OPE), a high-mountain observatory in the south-west of France (PDM), a regional marine background site in Crete (FKL) and a marine clean-air background site in the Southern Hemisphere on Amsterdam Island (AMS). This selection allows us to address spike detection problems in time series with different variability. Two years of continuous measurements of CO2, CH4 and CO were analysed. All methods were found to be able to detect short-term spikes (lasting from a few seconds to a few minutes) in the time series. Analysis of the results of each method leads us to exclude the COV method due to the requirement to arbitrarily specify an a priori percentage of rejected data in the time series, which may over- or underestimate the actual number of spikes. The two other methods freely determine the number of spikes for a given set of parameters, and the values of these parameters were calibrated to provide the best match with spikes known to reflect local emissions episodes that are well documented by the station managers. 
More than 96 % of the spikes manually identified by station managers were successfully detected by both the SD and the REBS methods after the best adjustment of parameter values. At PDM, measurements made by two analyzers located 200 m from each other allowed us to confirm that the CH4 spikes identified in one of the time series but not in the other correspond to a local source: a sewage treatment facility in one of the observatory buildings. From this experiment, we also found that the REBS method underestimates the number of positive anomalies in the CH4 data caused by local sewage emissions. In conclusion, we recommend the SD method, which also appears to be the easiest to implement in automatic data processing, for the operational filtering of spikes in greenhouse gas time series at global and regional monitoring stations of networks such as the ICOS atmosphere network.
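The recommended SD approach lends itself to a compact implementation: a point is flagged as a spike when it exceeds the mean of the preceding background window by more than k background standard deviations. The window length and threshold below are illustrative placeholders, not the calibrated values used at the ICOS stations.

```python
import numpy as np

def sd_spike_filter(series, window=60, k=3.0):
    """Flag positive spikes that exceed the running background mean
    by more than k background standard deviations.

    Returns a boolean mask (True = spike); points inside the first,
    incomplete window are never flagged.
    """
    x = np.asarray(series, dtype=float)
    mask = np.zeros(x.size, dtype=bool)
    for i in range(window, x.size):
        background = x[i - window:i]
        mu, sigma = background.mean(), background.std()
        if sigma > 0 and x[i] > mu + k * sigma:
            mask[i] = True
    return mask
```

In operational use the background window would typically be restricted to previously accepted (non-spike) points; that refinement is omitted here for brevity.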
NASA Astrophysics Data System (ADS)
Veronesi, F.; Grassi, S.
2016-09-01
Wind resource assessment is a key aspect of wind farm planning since it allows estimation of long-term electricity production. Moreover, wind speed time-series at high resolution are helpful for estimating the temporal variability of electricity generation and are indispensable for designing stand-alone systems, which are affected by mismatches between supply and demand. In this work, we present a new generalized statistical methodology to generate the spatial distribution of wind speed time-series, using Switzerland as a case study. This research is based upon a machine learning model and demonstrates that statistical wind resource assessment can successfully be used for estimating wind speed time-series. In fact, this method obtains reliable wind speed estimates and propagates all the sources of uncertainty (from the measurements to the mapping process) efficiently, i.e. minimizing computational time and load. This allows not only accurate estimation but also the creation of precise confidence intervals to map the stochasticity of the wind resource for a particular site. The validation shows that machine learning can minimize the bias of the hourly wind speed estimates. Moreover, for each mapped location this method delivers not only the mean wind speed but also its confidence interval, which are crucial data for planners.
A method for analyzing temporal patterns of variability of a time series from Poincare plots.
Fishman, Mikkel; Jacono, Frank J; Park, Soojin; Jamasebi, Reza; Thungtong, Anurak; Loparo, Kenneth A; Dick, Thomas E
2012-07-01
The Poincaré plot is a popular two-dimensional, time series analysis tool because of its intuitive display of dynamic system behavior. Poincaré plots have been used to visualize heart rate and respiratory pattern variabilities. However, conventional quantitative analysis relies primarily on statistical measurements of the cumulative distribution of points, making it difficult to interpret irregular or complex plots. Moreover, the plots are constructed to reflect highly correlated regions of the time series, reducing the amount of nonlinear information that is presented and thereby hiding potentially relevant features. We propose temporal Poincaré variability (TPV), a novel analysis methodology that uses standard techniques to quantify the temporal distribution of points and to detect nonlinear sources responsible for physiological variability. In addition, the analysis is applied across multiple time delays, yielding a richer insight into system dynamics than the traditional circle return plot. The method is applied to data sets of R-R intervals and to synthetic point process data extracted from the Lorenz time series. The results demonstrate that TPV complements the traditional analysis: it can be applied more generally (including to Poincaré plots with multiple clusters) and more consistently than the conventional measures, and it can address questions regarding the potential structure underlying the variability of a data set.
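For context, the "conventional quantitative analysis" that TPV is contrasted with reduces the Poincaré plot of lagged pairs (x[n], x[n+delay]) to the dispersion-ellipse descriptors SD1 and SD2. The sketch below is the textbook formulation of those standard descriptors, not the authors' TPV code; the delay parameter is exposed to echo the multi-delay analysis described above.

```python
import numpy as np

def poincare_descriptors(series, delay=1):
    """SD1/SD2 of the Poincare plot built from pairs
    (series[n], series[n + delay]): SD1 measures dispersion
    perpendicular to the line of identity, SD2 along it."""
    x = np.asarray(series, dtype=float)
    a, b = x[:-delay], x[delay:]
    sd1 = np.sqrt(np.var(b - a) / 2.0)
    sd2 = np.sqrt(np.var(b + a) / 2.0)
    return sd1, sd2
```

For R-R intervals, SD1 is conventionally read as short-term variability and SD2 as long-term variability; TPV instead quantifies how the points distribute over time and across multiple delays.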
High suspended sediment concentrations (SSCs) from natural and anthropogenic sources are responsible for biological impairments of many streams, rivers, lakes, and estuaries, but techniques to estimate sediment concentrations or loads accurately at the daily temporal resolution a...
Two Machine Learning Approaches for Short-Term Wind Speed Time-Series Prediction.
Ak, Ronay; Fink, Olga; Zio, Enrico
2016-08-01
The increasing liberalization of European electricity markets, the growing proportion of intermittent renewable energy being fed into the energy grids, and also new challenges in the patterns of energy consumption (such as electric mobility) require flexible and intelligent power grids capable of providing efficient, reliable, economical, and sustainable energy production and distribution. From the supplier side, particularly, the integration of renewable energy sources (e.g., wind and solar) into the grid imposes an engineering and economic challenge because of the limited ability to control and dispatch these energy sources due to their intermittent characteristics. Time-series prediction of wind speed for wind power production is a particularly important and challenging task, wherein prediction intervals (PIs) are preferable results of the prediction, rather than point estimates, because they provide information on the confidence in the prediction. In this paper, two different machine learning approaches to assess PIs of time-series predictions are considered and compared: 1) multilayer perceptron neural networks trained with a multiobjective genetic algorithm and 2) extreme learning machines combined with the nearest neighbors approach. The proposed approaches are applied for short-term wind speed prediction from a real data set of hourly wind speed measurements for the region of Regina in Saskatchewan, Canada. Both approaches demonstrate good prediction precision and provide complementary advantages with respect to different evaluation criteria.
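The nearest-neighbors ingredient of the second approach can be illustrated with a deliberately simplified sketch: take the k training points closest to the query, use their target mean as the point forecast and their empirical quantiles as the prediction interval. The actual method couples extreme learning machines with the neighbors; the function name and parameters below are illustrative assumptions.

```python
import numpy as np

def knn_prediction_interval(X_train, y_train, x_query, k=5, alpha=0.1):
    """Point forecast and empirical (1 - alpha) prediction interval
    from the k nearest training neighbors of the query: the forecast
    is the neighbors' target mean, the interval their empirical
    quantiles."""
    X = np.asarray(X_train, dtype=float)
    y = np.asarray(y_train, dtype=float)
    d = np.linalg.norm(X - np.asarray(x_query, dtype=float), axis=1)
    neigh = y[np.argsort(d)[:k]]
    lo, hi = np.quantile(neigh, [alpha / 2.0, 1.0 - alpha / 2.0])
    return neigh.mean(), (lo, hi)
```

Unlike a point estimate, the returned interval conveys how confident the model is at that query, which is the property the abstract argues matters for wind power scheduling.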
A Source Term for Wave Attenuation by Sea Ice in WAVEWATCH III (registered trademark): IC4
2017-06-07
[Abstract not recovered; only report-form boilerplate and figure-caption fragments survive extraction. The recoverable caption fragments read: diamonds indicate active, moored AWACs; a circle indicates the location of R/V Sikuliaq; thick magenta and white lines indicate the ship's past and future track; a truncated Figure 10 caption refers to a time series.]
Cascading Oscillators in Decoding Speech: Reflection of a Cortical Computation Principle
2016-09-06
[Abstract not recovered; only fragments survive extraction: combining an experimental paradigm based on Ghitza and Greenberg (2009) for speech with the approach of Farbood et al. (2013) to timing in key... (Fuglsang, 2015); a model was developed which uses modulation spectrograms to construct an oscillating time series synchronized with the slowly varying...]
NASA Astrophysics Data System (ADS)
Bruni, S.; Zerbini, Susanna; Raicich, F.; Errico, M.; Santi, E.
2014-12-01
Global navigation satellite systems (GNSS) data are a fundamental source of information for achieving a better understanding of geophysical and climate-related phenomena. However, discontinuities in the coordinate time series might be a severe limiting factor for the reliable estimate of long-term trends. A methodological approach has been adapted from Rodionov (Geophys Res Lett 31:L09204, 2004; Geophys Res Lett 33:L12707, 2006) and from Rodionov and Overland (ICES J Marine Sci 62:328-332, 2005) to identify both the epoch of occurrence and the magnitude of jumps corrupting GNSS data sets without any a priori information on these quantities. The procedure is based on the Sequential t test Analysis of Regime Shifts (STARS) (Rodionov in Geophys Res Lett 31:L09204, 2004). The method has been tested against a synthetic data set characterized by typical features exhibited by real GNSS time series, such as linear trend, seasonal cycle, jumps, missing epochs and a combination of white and flicker noise. The results show that the offsets identified by the algorithm are split into 48 % true-positive, 28 % false-positive and 24 % false-negative events. The procedure has then been applied to GPS coordinate time series of stations located in the southeastern Po Plain, in Italy. The series span more than 15 years and are affected by offsets of different nature. The methodology proves to be effective, as confirmed by the comparison between the corrected GPS time series and those obtained by other observation techniques.
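The heart of an offset search of this kind is a sequential mean-shift test. The sketch below substitutes a plain two-sample t statistic on fixed-length windows before and after each epoch for the full STARS machinery (no regime-length bookkeeping, no red-noise prewhitening); the window length and critical value are illustrative.

```python
import numpy as np

def detect_offsets(series, window=20, t_crit=4.0):
    """Flag candidate coordinate jumps with a two-sample t statistic
    computed on `window` points before and after each epoch; a
    simplified stand-in for the full STARS regime-shift test."""
    x = np.asarray(series, dtype=float)
    hits = []
    for i in range(window, x.size - window):
        a, b = x[i - window:i], x[i:i + window]
        pooled = np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / window)
        if pooled > 0:
            t = abs(b.mean() - a.mean()) / pooled
            if t > t_crit:
                hits.append(i)
    return hits
```

Consecutive epochs around a true jump all exceed the threshold because their windows straddle the step, so a post-processing pass would normally collapse each cluster of hits to a single epoch and estimate the jump magnitude from the window means.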
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aziz, Azizan; Lasternas, Bertrand; Alschuler, Elena
The American Recovery and Reinvestment Act stimulus funding of 2009 for smart grid projects resulted in the tripling of smart meter deployment. In 2012, the Green Button initiative provided utility customers with access to their real-time energy usage. The availability of finely granular data provides an enormous potential for energy data analytics and energy benchmarking. The sheer volume of time-series utility data from a large number of buildings also poses challenges in data collection, quality control, and database management for rigorous and meaningful analyses. In this paper, we describe a building portfolio-level data analytics tool for operational optimization, business investment, and policy assessment using utility data at 15-minute to monthly intervals. The analytics tool is developed on top of the U.S. Department of Energy's Standard Energy Efficiency Data (SEED) platform, an open source software application that manages energy performance data of large groups of buildings. To support the significantly large volume of granular interval data, we integrated a parallel time-series database with the existing relational database. The time-series database improves on the current utility data input, focusing on real-time data collection, storage, analytics, and data quality control. The fully integrated data platform supports APIs for utility app development by third-party software developers. These apps will provide actionable intelligence for building owners and facilities managers. Unlike a commercial system, this platform is an open source platform funded by the U.S. Government, accessible to the public, researchers, and other developers, to support initiatives in reducing building energy consumption.
2017-08-14
[Abstract not recovered; only fragments survive extraction: site visits took place for two of the candidate technologies, the T-Series by ZeroBase and Sol-Char by the University of Colorado, within the planned timeframe of the SLB-STO-D master plan; the T-Series by ZeroBase appears to be the most mature of the industry candidates.]
The scheme and research of TV series multidimensional comprehensive evaluation on cross-platform
NASA Astrophysics Data System (ADS)
Chai, Jianping; Bai, Xuesong; Zhou, Hongjun; Yin, Fulian
2016-10-01
To address shortcomings of the traditional comprehensive evaluation system for TV programs, such as reliance on a single data source, neglect of new media, and the high time cost and difficulty of conducting surveys, a new evaluation of TV series is proposed in this paper, taking a cross-platform, multidimensional perspective on post-broadcast evaluation. This scheme treats data collected directly from cable television and the Internet as its research objects. Based on the TOPSIS principle, the data, after preprocessing and calculation, become primary indicators that reflect different profiles of TV series viewing. After reasonable weighting and summation by six methods (PCA, AHP, etc.), the primary indicators form composite indices for different channels or websites. The scheme avoids the inefficiency and difficulty of surveying and marking; at the same time, it not only reflects different dimensions of viewing but also combines TV media and new media, completing the multidimensional comprehensive evaluation of TV series across platforms.
NASA Astrophysics Data System (ADS)
Gunn, Grant; Duguay, Claude; Atwood, Don
2017-04-01
This study identifies the dominant scattering mechanism at C-, X- and Ku-band for bubbled freshwater lake ice in the Hudson Bay Lowlands near Churchill, Canada, using a winter time series of fully polarimetric ground-based (X- and Ku-band, UW-Scat) scatterometer and spaceborne (C-band) synthetic aperture radar (SAR, Radarsat-2) observations collected coincident with in-situ snow and ice measurements. Scatterometer observations identify two dominant backscatter sources in the ice cover: the snow-ice interface and the ice-water interface. Using in-situ measurements as ground truth, a winter time series of scatterometer and satellite acquisitions shows increases in backscatter from the ice-water interface prior to the timing of tubular bubble development in the ice cover. This timing indicates that scattering in the ice is independent of double-bounce scatter caused by tubular bubble inclusions. Concurrently, the co-polarized phase difference of interactions at the ice-water interface from both scatterometer and SAR observations is centred at 0° throughout the time series, indicating a scattering regime other than double bounce. A Yamaguchi three-component decomposition of C-band SAR acquisitions indicates a dominant single-bounce scattering regime, hypothesized to result from an ice-water interface that presents a rough surface or a surface composed of preferentially oriented facets. This study is the first to present a winter time series of coincident ground-based and spaceborne fully polarimetric active microwave observations of bubbled freshwater lake ice.
NASA Astrophysics Data System (ADS)
Peng, Xing; Shi, Guo-Liang; Gao, Jian; Liu, Jia-Yuan; HuangFu, Yan-Qi; Ma, Tong; Wang, Hai-Ting; Zhang, Yue-Chong; Wang, Han; Li, Hui; Ivey, Cesunica E.; Feng, Yin-Chang
2016-08-01
With real-time resolved data on particulate matter (PM) and its chemical species, understanding source patterns and chemical characteristics is critical for establishing PM controls. In this work, PM2.5 and chemical species were measured by corresponding online instruments with 1-h time resolution in Beijing. The Multilinear Engine 2 (ME2) model was applied to explore the sources, and four sources (vehicle emission, crustal dust, secondary formation and coal combustion) were identified. To investigate the sensitivity of the source contributions and chemical characteristics to time resolution, ME2 was run at four time resolutions (1-h, 2-h, 4-h, and 8-h). Crustal dust and coal combustion display large variation across the four time-resolution runs, with their contributions ranging from 6.7 to 10.4 μg m-3 and from 6.4 to 12.2 μg m-3, respectively. The contributions of vehicle emission and secondary formation range from 7.5 to 10.5 and from 14.7 to 16.7 μg m-3, respectively. The sensitivity analyses were conducted by principal component analysis-plot (PCA-plot), coefficient of divergence (CD), average absolute error (AAE) and correlation coefficients. For the four time-resolution runs, the source contributions and profiles of crustal dust and coal combustion were more unstable than those of other source categories, possibly due to the lack of key markers of crustal dust and coal combustion (e.g. Si, Al). On the other hand, the time series of source contributions for vehicle emission and crustal dust were more sensitive to the choice of time resolution. Findings in this study can improve our knowledge of source contributions and chemical characteristics at different time resolutions.
NASA Astrophysics Data System (ADS)
Ohlendorf, S. J.; Feigl, K.; Thurber, C. H.; Lu, Z.; Masterlark, T.
2011-12-01
Okmok Volcano is an active caldera located on Umnak Island in the Aleutian Island arc. Okmok, having recently erupted in 1997 and 2008, is well suited for multidisciplinary studies of magma migration and storage because it hosts a good seismic network and has been the subject of synthetic aperture radar (SAR) images that span the recent eruption cycle. Interferometric SAR can characterize surface deformation in space and time, while data from the seismic network provides important information about the interior processes and structure of the volcano. We conduct a complete time series analysis of deformation of Okmok with images collected by the ERS and Envisat satellites on more than 100 distinct epochs between 1993 and 2008. We look for changes in inter-eruption inflation rates, which may indicate inelastic rheologic effects. For the time series analysis, we analyze the gradient of phase directly, without unwrapping, using the General Inversion of Phase Technique (GIPhT) [Feigl and Thurber, 2009]. This approach accounts for orbital and atmospheric effects and provides realistic estimates of the uncertainties of the model parameters. We consider several models for the source, including the prolate spheroid model and the Mogi model, to explain the observed deformation. Using a medium that is a homogeneous half space, we estimate the source depth to be centered at about 4 km below sea level, consistent with the findings of Masterlark et al. [2010]. As in several other geodetic studies, we find the source to be approximately centered beneath the caldera. To account for rheologic complexity, we next apply the Finite Element Method to simulate a pressurized cavity embedded in a medium with material properties derived from body wave seismic tomography. This approach allows us to address the problem of unreasonably large pressure values implied by a Mogi source with a radius of about 1 km by experimenting with larger sources. 
We also compare the time dependence of the source to published results that used GPS data.
Rasmussen, Patrick P.; Gray, John R.; Glysson, G. Douglas; Ziegler, Andrew C.
2009-01-01
In-stream continuous turbidity and streamflow data, calibrated with measured suspended-sediment concentration data, can be used to compute a time series of suspended-sediment concentration and load at a stream site. Development of a simple linear (ordinary least squares) regression model for computing suspended-sediment concentrations from instantaneous turbidity data is the first step in the computation process. If the model standard percentage error (MSPE) of the simple linear regression model meets a minimum criterion, this model should be used to compute a time series of suspended-sediment concentrations. Otherwise, a multiple linear regression model using paired instantaneous turbidity and streamflow data is developed and compared to the simple regression model. If the inclusion of the streamflow variable proves to be statistically significant and the uncertainty associated with the multiple regression model results in an improvement over that for the simple linear model, the turbidity-streamflow multiple linear regression model should be used to compute a suspended-sediment concentration time series. The computed concentration time series is subsequently used with its paired streamflow time series to compute suspended-sediment loads by standard U.S. Geological Survey techniques. Once an acceptable regression model is developed, it can be used to compute suspended-sediment concentration beyond the period of record used in model development with proper ongoing collection and analysis of calibration samples. Regression models to compute suspended-sediment concentrations are generally site specific and should never be considered static, but they represent a set period in a continually dynamic system in which additional data will help verify any change in sediment load, type, and source.
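The stepwise procedure can be sketched as follows: fit the turbidity-only ordinary least-squares model first, and add streamflow as a second predictor only when the simple model's standard percentage error fails the criterion. The MSPE threshold and the untransformed (non-log) variables here are illustrative simplifications of the published USGS procedure.

```python
import numpy as np

def ols_fit(X, y):
    """Ordinary least squares with an intercept column; returns the
    coefficient vector and the fitted values."""
    A = np.column_stack([np.ones(len(y)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef, A @ coef

def choose_ssc_model(turb, flow, ssc, mspe_limit=20.0):
    """Stepwise model selection sketch: keep the simple turbidity-only
    model if its model standard percentage error (MSPE) meets the
    criterion, otherwise fall back to turbidity + streamflow. The
    mspe_limit value is illustrative, not the published criterion."""
    turb, flow, ssc = (np.asarray(v, dtype=float) for v in (turb, flow, ssc))
    coef, fitted = ols_fit(turb[:, None], ssc)
    mspe = 100.0 * np.std(ssc - fitted, ddof=2) / ssc.mean()
    if mspe <= mspe_limit:
        return "turbidity-only", coef
    coef2, _ = ols_fit(np.column_stack([turb, flow]), ssc)
    return "turbidity+streamflow", coef2
```

The chosen model would then be applied to the continuous turbidity (and streamflow) record to produce the concentration time series, with loads computed from concentration and streamflow.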
Kalman Filters for Time Delay of Arrival-Based Source Localization
NASA Astrophysics Data System (ADS)
Klee, Ulrich; Gehrig, Tobias; McDonough, John
2006-12-01
In this work, we propose an algorithm for acoustic source localization based on time delay of arrival (TDOA) estimation. In earlier work by other authors, an initial closed-form approximation was first used to estimate the true position of the speaker followed by a Kalman filtering stage to smooth the time series of estimates. In the proposed algorithm, this closed-form approximation is eliminated by employing a Kalman filter to directly update the speaker's position estimate based on the observed TDOAs. In particular, the TDOAs comprise the observation associated with an extended Kalman filter whose state corresponds to the speaker's position. We tested our algorithm on a data set consisting of seminars held by actual speakers. Our experiments revealed that the proposed algorithm provides source localization accuracy superior to the standard spherical and linear intersection techniques. Moreover, the proposed algorithm, although relying on an iterative optimization scheme, proved efficient enough for real-time operation.
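The core of the proposed update, with the speaker position itself as the filter state and the TDOA vector as the observation, can be sketched as follows. The microphone geometry, noise levels, and the random-walk prediction step in the test are assumptions for illustration, not the paper's configuration.

```python
import numpy as np

C = 343.0  # nominal speed of sound in air, m/s

def tdoa(pos, mics, pairs):
    """Predicted TDOAs (seconds) for microphone index pairs (i, j)."""
    d = np.linalg.norm(mics - pos, axis=1)
    return np.array([(d[i] - d[j]) / C for i, j in pairs])

def ekf_update(x, P, z, mics, pairs, r=1e-9):
    """One extended Kalman filter measurement update in which the
    state x is the source position and the observation z the vector
    of measured TDOAs; r is the TDOA measurement noise variance."""
    d = np.linalg.norm(mics - x, axis=1)
    u = (x - mics) / d[:, None]                  # unit vectors, mic to source
    H = np.array([(u[i] - u[j]) / C for i, j in pairs])  # Jacobian of tdoa()
    S = H @ P @ H.T + r * np.eye(len(pairs))     # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)               # Kalman gain
    x_new = x + K @ (z - tdoa(x, mics, pairs))
    P_new = (np.eye(x.size) - K @ H) @ P
    return x_new, P_new
```

Iterating the update with an inflated covariance (a random-walk motion model) behaves like a Gauss-Newton refinement and converges to the true position for well-conditioned geometries, without the closed-form initialization step the abstract says is eliminated.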
USGS GNSS Applications to Volcano Disaster Response and Hazard Mitigation
NASA Astrophysics Data System (ADS)
Lisowski, M.; McCaffrey, R.
2015-12-01
Volcanic unrest is often identified by increased rates of seismicity, deformation, or the release of volcanic gases. Deformation results when ascending magma accumulates in crustal reservoirs, creates new pathways to the surface, or drains from magma reservoirs to feed an eruption. This volcanic deformation is overprinted by deformation from tectonic processes. GNSS monitoring of volcanoes captures transient volcanic deformation and steady and transient tectonic deformation, and we use the TDEFNODE software to unravel these effects. We apply the technique on portions of the Cascades Volcanic arc in central Oregon and in southern Washington that include a deforming volcano. In central Oregon, the regional TDEFNODE model consists of several blocks that rotate and deform internally and a decaying inflationary volcanic pressure source to reproduce the crustal bulge centered ~5 km west of South Sister. We jointly invert 47 interferograms that cover the interval from 1992 to 2010, as well as 2001 to 2015 continuous GNSS (cGNSS) and survey-mode (sGNSS) time series from stations in and around the Three Sisters, Newberry, and Crater Lake areas. A single, smoothly-decaying ~5 km deep spherical or prolate spheroid volcanic pressure source activated around 1998 provides the best fit to the combined geodetic data. In southern Washington, GNSS displacement time-series track decaying deflation of a ~8 km deep magma reservoir that fed the 2004 to 2008 eruption of Mount St. Helens. That deformation reversed when it began to recharge after the eruption ended. Offsets from slow slip events on the Cascadia subduction zone punctuate the GNSS displacement time series, and we remove them by estimating source parameters for these events. This regional TDEFNODE model extends from Mount Rainier south to Mount Hood, and additional volcanic sources could be added if these volcanoes start deforming. 
Other TDEFNODE regional models are planned for northern Washington (Mount Baker and Glacier Peak), northern California (Mount Shasta, Medicine Lake, Lassen Peak), and Long Valley. These models take advantage of the data from dense GNSS networks, provide source parameters for volcanic and tectonic transients, and can be used to discriminate possible short- and long-term volcano-tectonic interactions.
NASA Astrophysics Data System (ADS)
Juckett, David A.
2001-09-01
A more complete understanding of the periodic dynamics of the Sun requires continued exploration of non-11-year oscillations in addition to the benchmark 11-year sunspot cycle. In this regard, several solar, geomagnetic, and cosmic ray time series were examined to identify common spectral components and their relative phase relationships. Several non-11-year oscillations were identified within the near-decadal range with periods of ~8, 10, 12, 15, 18, 22, and 29 years. To test whether these frequency components were simply low-level noise or were related to a common source, the phases were extracted for each component in each series. The phases were nearly identical across the solar and geomagnetic series, while the corresponding components in four cosmic ray surrogate series exhibited inverted phases, similar to the known phase relationship with the 11-year sunspot cycle. Cluster analysis revealed that this pattern was unlikely to occur by chance. It was concluded that many non-11-year oscillations truly exist in the solar dynamical environment and that these contribute to the complex variations observed in geomagnetic and cosmic ray time series. Using the different energy sensitivities of the four cosmic ray surrogate series, a preliminary indication of the relative intensities of the various solar-induced oscillations was observed. It provides evidence that many of the non-11-year oscillations result from weak interplanetary magnetic field/solar wind oscillations that originate from corresponding variations in the open-field regions of the Sun.
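Phase extraction of the kind described, pulling the phase of a common spectral component out of each series so that in-phase and anti-phase relationships can be compared, can be sketched with a discrete Fourier transform. The period-to-bin snapping below assumes the record length spans an integer number of periods; real solar and cosmic ray series would need tapering and interpolation.

```python
import numpy as np

def component_phase(series, period, dt=1.0):
    """Phase (radians) of the Fourier component whose frequency is
    closest to 1/period, taken from the DFT of the demeaned series."""
    x = np.asarray(series, dtype=float)
    freqs = np.fft.rfftfreq(x.size, d=dt)
    spec = np.fft.rfft(x - x.mean())
    k = int(np.argmin(np.abs(freqs - 1.0 / period)))
    return float(np.angle(spec[k]))
```

A phase difference near zero between two series indicates a common in-phase oscillation, while a difference near pi indicates the inverted relationship the abstract reports for the cosmic ray surrogate series.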
1991-03-21
[Abstract not recovered; only fragments survive extraction: following a discussion of spectral factorability and motivations for broadband analysis, the report is subdivided into four main sections; the multi-channel deconvolution method was developed to gain information about seismic sources, most notably nuclear...; methods with complex constraints for estimating the rupture history were applied mostly to data sets that also include strong-motion data.]
NASA Astrophysics Data System (ADS)
Sawant, S. A.; Chakraborty, M.; Suradhaniwar, S.; Adinarayana, J.; Durbha, S. S.
2016-06-01
Satellite-based earth observation (EO) platforms have proven capable of spatio-temporally monitoring changes on the earth's surface. Long-term satellite missions have provided a huge repository of optical remote sensing datasets, and the United States Geological Survey (USGS) Landsat program is one of the oldest sources of optical EO datasets. This historical and near-real-time EO archive is a rich source of information for understanding seasonal changes in horticultural crops. Citrus (Mandarin / Nagpur Orange) is one of the major horticultural crops cultivated in central India. Erratic rainfall and dependency on groundwater for irrigation have a wide impact on citrus crop yield. Wide variations are also reported in temperature and relative humidity, causing early fruit onset and an increase in crop water requirement. Therefore, there is a need to study crop growth stages and crop evapotranspiration at spatio-temporal scale for managing these scarce resources. In this study, an attempt has been made to understand citrus crop growth stages using Normalized Difference Vegetation Index (NDVI) time series data obtained from the Landsat archives (http://earthexplorer.usgs.gov/). A total of 388 Landsat 4, 5, 7 and 8 scenes (from 1990 to Aug. 2015) for Worldwide Reference System (WRS) 2, path 145 and row 45, were selected to understand seasonal variations in citrus crop growth. Considering the 30 m spatial resolution of Landsat, only orchards with crop cover larger than 2 hectares were selected to obtain homogeneous pixels. To account for changes in band wavelength ranges across the Landsat 4, 5, 7 and 8 sensors, NDVI was chosen to obtain a continuous, sensor-independent time series. The obtained crop growth stage information has been used to estimate the citrus basal crop coefficient (Kcb). Satellite-based Kcb estimates were used together with weather parameters observed by a proximal agrometeorological sensing system for crop ET estimation.
The results show that time series EO based crop growth stage estimates provide better information about geographically separated citrus orchards. Attempts are being made to estimate regional variations in citrus crop water requirement for effective irrigation planning. In the future, high-resolution Sentinel-2 observations from the European Space Agency (ESA) will be used to fill the time gaps and to obtain a better understanding of citrus crop canopy parameters.
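The index underlying the time series is a simple band ratio of near-infrared and red reflectance. A minimal sketch (array inputs and the eps guard are illustrative):

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index from near-infrared and
    red reflectance arrays; eps guards against division by zero."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)
```

Because NDVI is a normalized ratio, it is relatively insensitive to the differing band definitions of the Landsat 4, 5, 7 and 8 sensors, which is what makes a continuous multi-sensor series feasible.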
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lieu, Richard
A hierarchy of statistics of increasing sophistication and accuracy is proposed to exploit an interesting and fundamental arithmetic structure in the photon bunching noise of incoherent light of large photon occupation number, with the purpose of suppressing the noise and rendering a more reliable and unbiased measurement of the light intensity. The method does not require any new hardware, rather it operates at the software level with the help of high-precision computers to reprocess the intensity time series of the incident light to create a new series with smaller bunching noise coherence length. The ultimate accuracy improvement of this method of flux measurement is limited by the timing resolution of the detector and the photon occupation number of the beam (the higher the photon number the better the performance). The principal application is accuracy improvement in the signal-limited bolometric flux measurement of a radio source.
Time-frequency analysis of functional optical mammographic images
NASA Astrophysics Data System (ADS)
Barbour, Randall L.; Graber, Harry L.; Schmitz, Christoph H.; Tarantini, Frank; Khoury, Georges; Naar, David J.; Panetta, Thomas F.; Lewis, Theophilus; Pei, Yaling
2003-07-01
We have introduced working technology that provides for time-series imaging of the hemoglobin signal in large tissue structures. In this study we explored our ability to detect aberrant time-frequency responses of breast vasculature in subjects with Stage II breast cancer at rest and in response to simple provocations. The hypothesis being explored is that time-series imaging will be sensitive to the known structural and functional malformations of the tumor vasculature. Mammographic studies were conducted using an adjustable hemispheric measuring head containing 21 source and 21 detector locations (441 source-detector pairs). Simultaneous dual-wavelength studies were performed at 760 and 830 nm at a framing rate of ~2.7 Hz. Optical measures were performed on women lying prone with the breast hanging in a pendant position. Two classes of measures were performed: (1) a 20-minute baseline measure wherein the subject was at rest; (2) provocation studies wherein the subject was asked to perform simple breathing maneuvers. Collected data were analyzed to identify the time-frequency structure and central tendencies of the detector responses and those of the image time series. Imaging data were generated using the Normalized Difference Method (Pei et al., Appl. Opt. 40, 5755-5769, 2001). The results clearly document three classes of anomalies when compared to the normal contralateral breast: 1) breast tumors exhibit an altered oxygen supply/demand balance in response to an oxidative challenge (breath hold); 2) the vasomotor response of the tumor vasculature is mainly depressed and exhibits altered modulation; 3) the affected area of the breast wherein the altered vasomotor signature is seen extends well beyond the limits of the tumor itself.
Signal restoration through deconvolution applied to deep mantle seismic probes
NASA Astrophysics Data System (ADS)
Stefan, W.; Garnero, E.; Renaut, R. A.
2006-12-01
We present a method of signal restoration to improve the signal-to-noise ratio, sharpen seismic arrival onset, and act as an empirical source deconvolution of specific seismic arrivals. An observed time-series g is modelled as the convolution of a simpler time-series f with an invariant point spread function (PSF) h that attempts to account for the earthquake source process. The method is used on the shear wave time window containing SKS and S, whereby using a Gaussian PSF produces more impulsive, narrower signals in the wave train. The resulting restored time-series facilitates more accurate and objective relative traveltime estimation of the individual seismic arrivals. We demonstrate the accuracy of the reconstruction method on synthetic seismograms generated by the reflectivity method. Clean and sharp reconstructions are obtained with real data, even for signals with relatively high noise content. Reconstructed signals are simpler, more impulsive, and narrower, which allows highlighting of details of arrivals that are not readily apparent in raw waveforms. In particular, phases nearly coincident in time can be separately identified after processing. This is demonstrated for two seismic wave pairs used to probe deep mantle and core-mantle boundary structure: (1) the Sab and Scd arrivals, which travel above and within, respectively, a 200-300-km-thick, higher-than-average shear wave velocity layer at the base of the mantle, observable in the 88-92 deg epicentral distance range; and (2) SKS and SPdiffKS, which are core waves, the latter having short arcs of P-wave diffraction, and which are nearly identical in timing near 108-110 deg in distance. A Java/Matlab algorithm was developed for the signal restoration, which can be downloaded from the authors' web page, along with example data and synthetic seismograms.
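The convolution model g = f * h invites a frequency-domain restoration. The sketch below uses Wiener deconvolution with a Gaussian PSF to recover impulsive arrivals from a smoothed trace; the authors' restoration method is more sophisticated, and the SNR constant here is an illustrative regularization choice, not theirs.

```python
import numpy as np

def gaussian_psf(n, width):
    """Centered, unit-area Gaussian point spread function of length n."""
    t = np.arange(n) - n // 2
    h = np.exp(-0.5 * (t / width) ** 2)
    return h / h.sum()

def wiener_deconvolve(g, h, snr=100.0):
    """Frequency-domain Wiener deconvolution of g by the PSF h;
    the 1/snr**2 term regularizes the division so that noise is not
    amplified where the PSF spectrum is small."""
    G = np.fft.fft(g)
    H = np.fft.fft(np.fft.ifftshift(h))  # zero-phase PSF
    F = G * np.conj(H) / (np.abs(H) ** 2 + 1.0 / snr ** 2)
    return np.real(np.fft.ifft(F))
```

The regularization caps how much sharpening is possible: the recovered spikes are narrower and taller than the blurred input but remain band-limited, which mirrors the trade-off between onset sharpening and noise amplification discussed in the abstract.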
MEG/EEG Source Reconstruction, Statistical Evaluation, and Visualization with NUTMEG
Dalal, Sarang S.; Zumer, Johanna M.; Guggisberg, Adrian G.; Trumpis, Michael; Wong, Daniel D. E.; Sekihara, Kensuke; Nagarajan, Srikantan S.
2011-01-01
NUTMEG is a source analysis toolbox geared towards cognitive neuroscience researchers using MEG and EEG, including intracranial recordings. Evoked and unaveraged data can be imported to the toolbox for source analysis in either the time or time-frequency domains. NUTMEG offers several variants of adaptive beamformers, probabilistic reconstruction algorithms, as well as minimum-norm techniques to generate functional maps of spatiotemporal neural source activity. Lead fields can be calculated from single and overlapping sphere head models or imported from other software. Group averages and statistics can be calculated as well. In addition to data analysis tools, NUTMEG provides a unique and intuitive graphical interface for visualization of results. Source analyses can be superimposed onto a structural MRI or headshape to provide a convenient visual correspondence to anatomy. These results can also be navigated interactively, with the spatial maps and source time series or spectrogram linked accordingly. Animations can be generated to view the evolution of neural activity over time. NUTMEG can also display brain renderings and perform spatial normalization of functional maps using SPM's engine. As a MATLAB package, the end user may easily link with other toolboxes or add customized functions. PMID:21437174
True random bit generators based on current time series of contact glow discharge electrolysis
NASA Astrophysics Data System (ADS)
Rojas, Andrea Espinel; Allagui, Anis; Elwakil, Ahmed S.; Alawadhi, Hussain
2018-05-01
Random bit generators (RBGs) in today's digital information and communication systems employ high-rate physical entropy sources such as electronic, photonic, or thermal time series signals. However, the proper functioning of such physical systems is bound by specific constraints that make them in some cases weak and susceptible to external attacks. In this study, we show that the electrical current time series of contact glow discharge electrolysis, which is a dc voltage-powered micro-plasma in liquids, can be used for generating random bit sequences in a wide range of high dc voltages. The current signal is quantized into a binary stream by first using a simple moving average function, which makes the distribution centered around zero, and then applying logical operations, which enable the binarized data to pass all tests in the industry-standard randomness test suite of the National Institute of Standards and Technology. Furthermore, the robustness of this RBG against power supply attacks has been examined and verified.
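A minimal sketch of the moving-average-plus-logical-operations idea is shown below on a synthetic current trace. The window length, the XOR whitening step, and the trace model are assumptions for illustration; the paper's actual pipeline and the NIST test suite are far more involved.

```python
import numpy as np

def binarize_current(i_t, window=32):
    # centre the raw current with a simple moving average so the
    # residual distribution sits near zero, then threshold to bits
    kernel = np.ones(window) / window
    baseline = np.convolve(i_t, kernel, mode="same")
    residual = i_t - baseline
    bits = (residual > 0).astype(np.uint8)
    # crude whitening via XOR of adjacent bits (a simple logical step)
    return bits[:-1] ^ bits[1:]

# synthetic trace: dc level + slow drift + noise, standing in for the
# measured discharge current
rng = np.random.default_rng(0)
trace = 5.0 + 0.01 * np.cumsum(rng.normal(size=4096)) + rng.normal(size=4096)
bits = binarize_current(trace)
```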
NASA Astrophysics Data System (ADS)
Bliss, Donald; Franzoni, Linda; Rouse, Jerry; Manning, Ben
2005-09-01
An analysis method for time-dependent broadband diffuse sound fields in enclosures is described. Beginning with a formulation utilizing time-dependent broadband intensity boundary sources, the strength of these wall sources is expanded in a series in powers of an absorption parameter, thereby giving a separate boundary integral problem for each power. The temporal behavior is characterized by a Taylor expansion in the delay time for a source to influence an evaluation point. The lowest-order problem has a uniform interior field proportional to the reciprocal of the absorption parameter, as expected, and exhibits relatively slow exponential decay. The next-order problem gives a mean-square pressure distribution that is independent of the absorption parameter and is primarily responsible for the spatial variation of the reverberant field. This problem, which is driven by input sources and the lowest-order reverberant field, depends on source location and the spatial distribution of absorption. Additional problems proceed at integer powers of the absorption parameter, but are essentially higher-order corrections to the spatial variation. Temporal behavior is expressed in terms of an eigenvalue problem, with boundary source strength distributions expressed as eigenmodes. Solutions exhibit rapid short-time spatial redistribution followed by long-time decay of a predominant spatial mode.
Granger causal time-dependent source connectivity in the somatosensory network
NASA Astrophysics Data System (ADS)
Gao, Lin; Sommerlade, Linda; Coffman, Brian; Zhang, Tongsheng; Stephen, Julia M.; Li, Dichen; Wang, Jue; Grebogi, Celso; Schelter, Bjoern
2015-05-01
Exploration of transient Granger causal interactions in neural sources of electrophysiological activities provides deeper insights into brain information processing mechanisms. However, the underlying neural patterns are confounded by time-dependent dynamics, non-stationarity and observational noise contamination. Here we investigate transient Granger causal interactions using source time-series of somatosensory evoked magnetoencephalographic (MEG) responses elicited by air puff stimulation of the right index finger and recorded using 306-channel MEG from 21 healthy subjects. A new time-varying connectivity approach, combining renormalised partial directed coherence with state space modelling, is employed to estimate fast-changing information flow among the sources. Source analysis confirmed that the somatosensory evoked MEG was mainly generated from the contralateral primary somatosensory cortex (SI) and bilateral secondary somatosensory cortices (SII). Transient Granger causality shows a serial processing of somatosensory information, 1) from contralateral SI to contralateral SII, 2) from contralateral SI to ipsilateral SII, 3) from contralateral SII to contralateral SI, and 4) from contralateral SII to ipsilateral SII. These results are consistent with established anatomical connectivity between somatosensory regions and previous source modeling results, thereby providing empirical validation of the time-varying connectivity analysis. We argue that the suggested approach provides novel information regarding transient cortical dynamic connectivity, which previous approaches could not assess.
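The core Granger idea (does adding lagged x improve the prediction of y?) can be sketched with a stationary, time-invariant pairwise test, shown below on a simulated driven pair. This is a deliberate simplification: the paper uses renormalised partial directed coherence with state-space modelling to get *time-varying* estimates, which this sketch does not attempt.

```python
import numpy as np

def granger_gain(x, y, p=2):
    # ratio of residual variance of an AR(p) model of y without vs with
    # lagged x regressors; values well above 1 suggest x Granger-causes y
    n = len(y)
    rows = []
    for t in range(p, n):
        rows.append(np.concatenate(([1.0], y[t - p:t][::-1], x[t - p:t][::-1])))
    A = np.array(rows)
    target = y[p:]
    beta_full, *_ = np.linalg.lstsq(A, target, rcond=None)
    rss_full = np.sum((target - A @ beta_full) ** 2)
    A_red = A[:, : p + 1]                 # intercept + y lags only
    beta_red, *_ = np.linalg.lstsq(A_red, target, rcond=None)
    rss_red = np.sum((target - A_red @ beta_red) ** 2)
    return rss_red / rss_full

# toy system in which x drives y but not the reverse
rng = np.random.default_rng(6)
x = rng.normal(size=2000)
y = np.zeros(2000)
for t in range(1, 2000):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.normal()
```

Running `granger_gain` in both directions recovers the simulated asymmetry: large gain for x to y, near 1 for y to x.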
NASA Astrophysics Data System (ADS)
Blaen, Phillip; Khamis, Kieran; Lloyd, Charlotte; Krause, Stefan
2017-04-01
At the river catchment scale, storm events can drive highly variable behaviour in nutrient and water fluxes, yet short-term dynamics are frequently missed by low-resolution sampling regimes. In addition, nutrient source contributions can vary significantly within and between storm events. Our inability to identify and characterise time-dynamic source zone contributions severely hampers the adequate design of land use management practices to control nutrient exports from agricultural landscapes. Here, we utilise an 8-month high-frequency (hourly) time series of streamflow, nitrate concentration (NO3) and fluorescent dissolved organic matter concentration (FDOM) derived from optical in-situ sensors located in a headwater agricultural catchment. We characterised variability in flow and nutrient dynamics across 29 storm events. Storm events represented 31% of the time series and contributed disproportionately to nutrient loads (43% of NO3 and 36% of FDOM) relative to their duration. Principal components analysis of potential hydroclimatological controls on nutrient fluxes demonstrated that a small number of components, representing >90% of variance in the dataset, were highly significant model predictors of inter-event variability in catchment nutrient export. Hysteresis analysis of nutrient concentration-discharge relationships suggested that spatially discrete source zones existed for NO3 and FDOM, and that activation of these zones varied on an event-specific basis. Our results highlight the benefits of high-frequency in-situ monitoring for characterising complex short-term nutrient dynamics and unravelling connections between hydroclimatological variability, river nutrient export, and source zone activation under extreme flow conditions. These new process-based insights are fundamental to underpinning the development of targeted management measures to reduce nutrient loading of surface waters.
TIDE TOOL: Open-Source Sea-Level Monitoring Software for Tsunami Warning Systems
NASA Astrophysics Data System (ADS)
Weinstein, S. A.; Kong, L. S.; Becker, N. C.; Wang, D.
2012-12-01
A tsunami warning center (TWC) typically decides to issue a tsunami warning bulletin when initial estimates of earthquake source parameters suggest it may be capable of generating a tsunami. A TWC, however, relies on sea-level data to provide prima facie evidence for the existence or non-existence of destructive tsunami waves and to constrain tsunami wave height forecast models. In the aftermath of the 2004 Sumatra disaster, the International Tsunami Information Center asked the Pacific Tsunami Warning Center (PTWC) to develop a platform-independent, easy-to-use software package to give nascent TWCs the ability to process WMO Global Telecommunications System (GTS) sea-level messages and to analyze the resulting sea-level curves (marigrams). In response, PTWC developed TIDE TOOL, which has since steadily grown in sophistication to become PTWC's operational sea-level processing system. TIDE TOOL has two main parts: a decoder that reads GTS sea-level message logs, and a graphical user interface (GUI) written in the open-source, platform-independent graphical toolkit scripting language Tcl/Tk. This GUI consists of dynamic map-based clients that allow the user to select and analyze a single station or groups of stations by displaying their marigrams in strip-chart or screen-tiled forms. TIDE TOOL also includes detail maps of each station to show each station's geographical context, and reverse tsunami travel time contours to each station. TIDE TOOL can also be coupled to the GEOWARE™ TTT program to plot tsunami travel times and to indicate the expected tsunami arrival time on the marigrams. Because sea-level messages are structured in a rich variety of formats, TIDE TOOL includes a metadata file, COMP_META, that contains all of the information needed by TIDE TOOL to decode sea-level data as well as basic information such as the geographical coordinates of each station.
TIDE TOOL can therefore continuously decode these sea-level messages in real time and display the time-series data in the GUI as well. This GUI also includes mouse-clickable functions such as zooming or expanding the time-series display, measuring tsunami signal characteristics (arrival time, wave period and amplitude, etc.), and removing the tide signal from the time-series data. De-tiding of the time series is necessary to obtain accurate measurements of tsunami wave parameters and to maintain accurate historical tsunami databases. With TIDE TOOL, de-tiding is accomplished with a set of tide harmonic coefficients routinely computed and updated at PTWC for many of the stations in PTWC's inventory (~570). PTWC also uses the decoded time series files (previous 3-5 days' worth) to compute on-the-fly tide coefficients. The latter is useful in cases where a station is new and a long-term stable set of tide coefficients is not available or cannot be easily obtained due to various non-astronomical effects. The international tsunami warning system is coordinated globally by the UNESCO IOC, and a number of countries in the Pacific, Indian Ocean, and Caribbean regions depend on TIDE TOOL to monitor tsunamis in real time.
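The harmonic de-tiding step can be sketched as a linear least-squares fit of a few tidal constituents, shown below on a synthetic record with a short transient riding on the tide. The constituent set (M2, S2, O1, K1 periods) and all signal parameters are illustrative assumptions; TIDE TOOL itself is Tcl/Tk and uses routinely maintained coefficient sets.

```python
import numpy as np

def detide(t, eta, periods_hr=(12.42, 12.00, 25.82, 23.93)):
    # fit mean + cos/sin pairs for each assumed constituent period (hours)
    # by linear least squares, then subtract the fitted tide
    omegas = 2 * np.pi / np.asarray(periods_hr)
    cols = [np.ones_like(t)]
    for w in omegas:
        cols += [np.cos(w * t), np.sin(w * t)]
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, eta, rcond=None)
    return eta - A @ coef

t = np.arange(0, 72, 1 / 60.0)                    # 3 days at 1-min samples
tide = 0.8 * np.cos(2 * np.pi * t / 12.42) + 0.3 * np.sin(2 * np.pi * t / 23.93)
tsunami = 0.2 * np.exp(-(t - 40.0) ** 2 / 0.5)    # short transient signal
residual = detide(t, tide + tsunami)
```

The de-tided residual isolates the transient, from which arrival time and amplitude can then be measured.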
Experimental testing of the noise-canceling processor.
Collins, Michael D; Baer, Ralph N; Simpson, Harry J
2011-09-01
Signal-processing techniques for localizing an acoustic source buried in noise are tested in a tank experiment. Noise is generated using a discrete source, a bubble generator, and a sprinkler. The experiment has essential elements of a realistic scenario in matched-field processing, including complex source and noise time series in a waveguide with water, sediment, and multipath propagation. The noise-canceling processor is found to outperform the Bartlett processor and provide the correct source range for signal-to-noise ratios below -10 dB. The multivalued Bartlett processor is found to outperform the Bartlett processor but not the noise-canceling processor. © 2011 Acoustical Society of America
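For context, the baseline Bartlett processor that the noise-canceling processor is compared against evaluates w^H R w over candidate replica vectors w. The sketch below uses free-space plane-wave replicas on a line array rather than the waveguide replicas of a real matched-field experiment; the array size, angles, and noise level are assumptions.

```python
import numpy as np

def bartlett(replicas, data_cov):
    # Bartlett ambiguity surface: B = w^H R w for unit-norm replicas,
    # one column of `replicas` per candidate source position
    out = []
    for w in replicas.T:
        w = w / np.linalg.norm(w)
        out.append(np.real(np.conj(w) @ data_cov @ w))
    return np.array(out)

# toy example: 16-element half-wavelength array, source at 20 degrees
n, angles = 16, np.linspace(-60, 60, 121)
d = np.arange(n)
reps = np.exp(1j * np.pi * np.outer(d, np.sin(np.radians(angles))))
true = np.exp(1j * np.pi * d * np.sin(np.radians(20.0)))
R = np.outer(true, np.conj(true)) + 0.1 * np.eye(n)   # signal + noise
surface = bartlett(reps, R)
```

The surface peaks at the true bearing; a noise-canceling processor additionally models the noise covariance instead of treating it as white.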
Development and analysis of a meteorological database, Argonne National Laboratory, Illinois
Over, Thomas M.; Price, Thomas H.; Ishii, Audrey L.
2010-01-01
A database of hourly values of air temperature, dewpoint temperature, wind speed, and solar radiation from January 1, 1948, to September 30, 2003, primarily using data collected at the Argonne National Laboratory station, was developed for use in continuous-time hydrologic modeling in northeastern Illinois. Missing and apparently erroneous data values were replaced with adjusted values from nearby stations used as 'backup'. Temporal variations in the statistical properties of the data resulting from changes in measurement and data-storage methodologies were adjusted to match the statistical properties resulting from the data-collection procedures that have been in place since January 1, 1989. The adjustments were computed based on the regressions between the primary data series from Argonne National Laboratory and the backup series using data obtained during common periods; the statistical properties of the regressions were used to assign estimated standard errors to values that were adjusted or filled from other series. Each hourly value was assigned a corresponding data-source flag that indicates the source of the value and its transformations. An analysis of the data-source flags indicates that all the series in the database except dewpoint have a similar fraction of Argonne National Laboratory data, with about 89 percent for the entire period, about 86 percent from 1949 through 1988, and about 98 percent from 1989 through 2003. The dewpoint series, for which observations at Argonne National Laboratory did not begin until 1958, has only about 71 percent Argonne National Laboratory data for the entire period, about 63 percent from 1948 through 1988, and about 93 percent from 1989 through 2003, indicating a lower reliability of the dewpoint sensor. 
A basic statistical analysis of the filled and adjusted data series in the database, and of a series of potential evapotranspiration values computed from them using the computer program LXPET (Lamoreux Potential Evapotranspiration), was also carried out. This analysis indicates annual cycles in solar radiation and potential evapotranspiration that follow the annual cycle of extraterrestrial solar radiation, whereas the temperature and dewpoint annual cycles lag the solar cycle by about 1 month. The annual cycle of wind has a late-summer minimum and spring and fall maxima. At the annual time scale, the filled and adjusted data series and the computed potential evapotranspiration have significant serial correlation and possibly significant temporal trends. The inter-annual fluctuations of temperature and dewpoint are weakest, whereas those of wind and potential evapotranspiration are strongest.
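The fill-from-backup step described above can be sketched as a simple OLS regression between the primary and backup series over their common period, with the residual standard error retained as the uncertainty of filled values. All series and numbers below are synthetic illustrations, not the report's data.

```python
import numpy as np

def fit_backup_regression(primary, backup):
    # OLS fit primary ~ a + b * backup over the common non-missing period;
    # the residual standard error flags the uncertainty of filled values
    mask = ~np.isnan(primary) & ~np.isnan(backup)
    x, y = backup[mask], primary[mask]
    b, a = np.polyfit(x, y, 1)
    se = np.std(y - (a + b * x), ddof=2)
    return a, b, se

def fill_missing(primary, backup, a, b):
    filled = primary.copy()
    gaps = np.isnan(primary) & ~np.isnan(backup)
    filled[gaps] = a + b * backup[gaps]
    return filled

rng = np.random.default_rng(1)
backup = rng.normal(10.0, 3.0, 500)
primary = 2.0 + 0.9 * backup + rng.normal(0.0, 0.5, 500)
primary[50:80] = np.nan                       # simulated outage
a, b, se = fit_backup_regression(primary, backup)
series = fill_missing(primary, backup, a, b)
```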
An Algorithm Framework for Isolating Anomalous Signals in Electromagnetic Data
NASA Astrophysics Data System (ADS)
Kappler, K. N.; Schneider, D.; Bleier, T.; MacLean, L. S.
2016-12-01
QuakeFinder and its international collaborators have installed and currently maintain an array of 165 three-axis induction magnetometer instrument sites in California, Peru, Taiwan, Greece, Chile and Sumatra. Based on research by Bleier et al. (2009), Fraser-Smith et al. (1990), and Freund (2007), the electromagnetic data from these instruments are being analyzed for pre-earthquake signatures. This analysis consists of both private research by QuakeFinder and institutional collaborators (PUCP in Peru, NCU in Taiwan, NOA in Greece, LASP at the University of Colorado, Stanford, UCLA, NASA-ESI, NASA-AMES and USC-CSEP). QuakeFinder has developed an algorithm framework aimed at isolating anomalous signals (pulses) in the time series. Results are presented from an application of this framework to induction-coil magnetometer data. Our data-driven approach starts with sliding windows applied to uniformly resampled array data with a variety of lengths and overlaps. Data variance (a proxy for energy) is calculated on each window and a short-term average/long-term average (STA/LTA) filter is applied to the variance time series. Pulse identification is done by flagging time intervals in the STA/LTA-filtered time series which exceed a threshold. Flagged time intervals are subsequently fed into a feature extraction program which computes statistical properties of the resampled data. These features are then filtered using a Principal Component Analysis (PCA) based method to cluster similar pulses. We explore the extent to which this approach categorizes pulses with known sources (e.g. cars, lightning, etc.), so that the remaining pulses of unknown origin can be analyzed with respect to their relationship with seismicity. We seek a correlation between these daily pulse-counts (with known sources removed) and subsequent (days to weeks) seismic events greater than M5 within a 15-km radius. 
Thus we explore functions which map daily pulse-counts to a time series representing the likelihood of a seismic event occurring at some future time. These "pseudo-probabilities" can in turn be represented as Molchan diagrams. The Molchan curve provides an effective cost function for optimization and allows for a rigorous statistical assessment of the validity of pre-earthquake signals in the electromagnetic data.
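The STA/LTA flagging stage of such a pipeline can be sketched as below, applied to a synthetic windowed-variance series containing one pulse. Window lengths, the threshold, and the pulse itself are illustrative assumptions, not QuakeFinder's operational values.

```python
import numpy as np

def sta_lta(x, sta_len=5, lta_len=50):
    # classic short-term-average / long-term-average ratio on a
    # non-negative characteristic function (here, a variance series);
    # the LTA window immediately precedes the STA window
    csum = np.cumsum(np.concatenate(([0.0], x)))
    ratio = np.zeros(len(x))
    for i in range(sta_len + lta_len, len(x) + 1):
        sta = (csum[i] - csum[i - sta_len]) / sta_len
        lta = (csum[i - sta_len] - csum[i - sta_len - lta_len]) / lta_len
        ratio[i - 1] = sta / lta
    return ratio

rng = np.random.default_rng(2)
var_series = rng.uniform(0.5, 1.5, 1000)   # background window variance
var_series[300:305] += 20.0                # injected anomalous pulse
r = sta_lta(var_series)
flags = np.where(r > 5.0)[0]               # flagged time intervals
```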
NASA Astrophysics Data System (ADS)
Gowda, P. H.
2016-12-01
Evapotranspiration (ET) is an important process in ecosystems' water budget and is closely linked to their productivity. Therefore, regional-scale daily time series ET maps developed at high and medium resolutions have large utility in studying the carbon-energy-water nexus and managing water resources. There are efforts to develop such datasets on regional to global scales, but they often face the limitations of spatial-temporal resolution tradeoffs in satellite remote sensing technology. In this study, we developed frameworks for generating high and medium resolution daily ET maps from Landsat and MODIS (Moderate Resolution Imaging Spectroradiometer) data, respectively. For developing high resolution (30-m) daily time series ET maps with Landsat TM data, the series version of the Two Source Energy Balance (TSEB) model was used to compute sensible and latent heat fluxes of soil and canopy separately. Landsat 5 (2000-2011) and Landsat 8 (2013-2014) imagery for paths/rows 28/35 and 27/36 covering central Oklahoma was used. MODIS data (2001-2014) covering Oklahoma and the Texas Panhandle were used to develop medium resolution (250-m) time series daily ET maps with the SEBS (Surface Energy Balance System) model. An extensive network of weather stations managed by the Texas High Plains ET Network and the Oklahoma Mesonet was used to generate spatially interpolated inputs of air temperature, relative humidity, wind speed, solar radiation, pressure, and reference ET. A linear interpolation sub-model was used to estimate the daily ET between image acquisition days. Accuracy assessment of the daily ET maps was performed against eddy covariance data from two grassland sites at El Reno, OK. Statistical results indicated good performance by the modeling frameworks developed for deriving time series ET maps. Results indicated that the proposed ET mapping framework is suitable for deriving daily time series ET maps at the regional scale with Landsat and MODIS data.
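A common form of the between-acquisition interpolation step is to interpolate the ET fraction (ET divided by reference ET) linearly between image dates and rescale by daily reference ET. The sketch below assumes that form; the study's actual sub-model details and all numbers here are illustrative.

```python
import numpy as np

def daily_et(doy_images, etrf_images, ref_et_daily):
    # interpolate the ET fraction (ETrF = ET / reference ET) linearly
    # between image-acquisition days-of-year, then scale by daily
    # reference ET to recover a daily ET series (hypothetical sketch)
    days = np.arange(1, len(ref_et_daily) + 1)
    etrf = np.interp(days, doy_images, etrf_images)
    return etrf * ref_et_daily

ref_et = np.full(30, 6.0)                   # assumed 6 mm/day reference ET
et = daily_et([5, 25], [0.5, 0.9], ref_et)  # two image days, two ETrF values
```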
Modeling pollen time series using seasonal-trend decomposition procedure based on LOESS smoothing.
Rojo, Jesús; Rivero, Rosario; Romero-Morte, Jorge; Fernández-González, Federico; Pérez-Badia, Rosa
2017-02-01
Analysis of airborne pollen concentrations provides valuable information on plant phenology and is thus a useful tool in agriculture (for predicting harvests in crops such as the olive and for deciding when to apply phytosanitary treatments) as well as in medicine and the environmental sciences. Variations in airborne pollen concentrations, moreover, are indicators of changing plant life cycles. By modeling pollen time series, we can not only identify the variables influencing pollen levels but also predict future pollen concentrations. In this study, airborne pollen time series were modeled using a seasonal-trend decomposition procedure based on LOcally wEighted Scatterplot Smoothing (LOESS), known as STL. The data series (daily Poaceae pollen concentrations over the period 2006-2014) was broken up into seasonal and residual (stochastic) components. The seasonal component was compared with data on Poaceae flowering phenology obtained by field sampling. Residuals were fitted to a model generated from daily temperature and rainfall values, and daily pollen concentrations, using partial least squares regression (PLSR). This method was then applied to predict daily pollen concentrations for 2014 (independent validation data) using results for the seasonal component of the time series and estimates of the residual component for the period 2006-2013. The correlation between predicted and observed values was r = 0.79 for the pre-peak period (i.e., the period prior to the peak pollen concentration) and r = 0.63 for the post-peak period. Separate analysis of each of the components of the pollen data series enables the sources of variability to be identified more accurately than analysis of the original non-decomposed data series, and for this reason this procedure has proved to be a suitable technique for analyzing the main environmental factors influencing airborne pollen concentrations.
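The seasonal/residual split can be illustrated with a much cruder stand-in for STL: averaging all years at each day-of-year phase. The synthetic "pollen" series and all parameters below are assumptions; a real analysis would use a LOESS-based STL implementation.

```python
import numpy as np

def seasonal_decompose(x, period):
    # minimal seasonal/residual split: the seasonal component is the
    # mean over all cycles at each phase of the period (crude stand-in
    # for the LOESS-based STL procedure described above)
    n_cycles = len(x) // period
    folded = x[: n_cycles * period].reshape(n_cycles, period)
    seasonal = np.tile(folded.mean(axis=0), n_cycles)
    residual = x[: n_cycles * period] - seasonal
    return seasonal, residual

# synthetic daily series: a spring pollen peak plus noise, 8 "years"
t = np.arange(8 * 365)
pollen = np.maximum(0.0, 100.0 * np.sin(2 * np.pi * (t % 365) / 365))
rng = np.random.default_rng(3)
series = pollen + rng.normal(0.0, 5.0, len(t))
seasonal, resid = seasonal_decompose(series, 365)
```

The residual component is what would then be fed to the PLSR step against meteorological predictors.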
Self-calibrating multiplexer circuit
Wahl, Chris P.
1997-01-01
A time domain multiplexer system with automatic determination of acceptable multiplexer output limits, error determination, or correction comprises a time domain multiplexer, a computer, a constant current source capable of at least three distinct current levels, and two series resistances employed for calibration and testing. A two-point linear calibration curve defining acceptable multiplexer voltage limits may be constructed by the computer by determining the voltage output of the multiplexer in response to accurately known input signals developed from predetermined current levels across the series resistances. Drift in the multiplexer may be detected by the computer when the output voltage limits expected during normal operation are exceeded, or when the relationship defined by the calibration curve is invalidated.
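The two-point calibration and drift check can be sketched as below. The voltages and tolerance are invented for illustration; they are not values from the patent.

```python
def two_point_calibration(v_lo, v_hi, ref_lo, ref_hi):
    # two-point linear calibration: find (gain, offset) mapping raw
    # multiplexer readings back to the known reference values,
    # ref = gain * v + offset
    gain = (ref_hi - ref_lo) / (v_hi - v_lo)
    offset = ref_lo - gain * v_lo
    return gain, offset

def check_drift(v, v_expected, gain, offset, tol=0.01):
    # flag drift when a calibrated reading deviates from expectation
    return abs((gain * v + offset) - v_expected) > tol

# hypothetical readings: known currents across a sense resistor give
# reference voltages 1.000 V and 4.000 V; the mux reports 1.02 and 4.05
gain, offset = two_point_calibration(1.02, 4.05, 1.000, 4.000)
```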
Results from field tests of the one-dimensional Time-Encoded Imaging System.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marleau, Peter; Brennan, James S.; Brubaker, Erik
2014-09-01
A series of field experiments was undertaken to evaluate the performance of the one-dimensional time-encoded imaging system. Significant detection of a Cf252 fission radiation source was demonstrated at a stand-off distance of 100 meters. Extrapolations to different quantities of plutonium equivalent at different distances are made. Hardware modifications to the system for follow-on work are suggested.
Model Performance Evaluation and Scenario Analysis ...
This tool consists of two parts: model performance evaluation and scenario analysis (MPESA). The model performance evaluation consists of two components: model performance evaluation metrics and model diagnostics. These metrics provide modelers with statistical goodness-of-fit measures that capture magnitude-only, sequence-only, and combined magnitude and sequence errors. The performance measures include error analysis, the coefficient of determination, Nash-Sutcliffe efficiency, and a new weighted rank method. These performance metrics only provide useful information about the overall model performance. Note that MPESA is based on the separation of observed and simulated time series into magnitude and sequence components. The separation of time series into magnitude and sequence components, and the reconstruction back to time series, provides diagnostic insights to modelers. For example, traditional approaches lack the capability to identify whether the source of uncertainty in the simulated data is due to the quality of the input data or the way the analyst adjusted the model parameters. This report presents a suite of model diagnostics that identify whether mismatches between observed and simulated data result from magnitude- or sequence-related errors. MPESA offers graphical and statistical options that allow HSPF users to compare observed and simulated time series and identify the parameter values to adjust or the input data to modify. The scenario analysis part of the tool
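One simple way to realize the magnitude/sequence separation idea is to compare the *sorted* observed and simulated values (magnitude-only error, insensitive to timing) against the raw paired comparison (combined error). The sketch below is an interpretation of that idea, not MPESA's actual implementation.

```python
import numpy as np

def magnitude_sequence_errors(obs, sim):
    # combined error: RMSE of the raw paired series;
    # magnitude-only error: RMSE after sorting both series, which
    # discards sequencing and compares distributions of values
    total = np.sqrt(np.mean((obs - sim) ** 2))
    magnitude = np.sqrt(np.mean((np.sort(obs) - np.sort(sim)) ** 2))
    return total, magnitude

obs = np.array([1.0, 5.0, 2.0, 8.0, 3.0])
sim = np.array([5.0, 1.0, 2.0, 8.0, 3.0])   # right values, wrong order
total, mag = magnitude_sequence_errors(obs, sim)
```

Here the magnitude error is zero while the combined error is not, diagnosing a purely sequence-related mismatch.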
NASA Astrophysics Data System (ADS)
Charco, M.; Rodriguez Molina, S.; Gonzalez, P. J.; Negredo, A. M.; Poland, M. P.; Schmidt, D. A.
2017-12-01
The Three Sisters volcanic region, Oregon (USA), is one of the most active volcanic areas in the Cascade Range and is densely populated with eruptive vents. An extensive area just west of South Sister volcano has been actively uplifting since about 1998. InSAR data from 1992 through 2001 showed an uplift rate in the area of 3-4 cm/yr. The deformation rate then decreased considerably between 2004 and 2006, as shown by both InSAR and continuous GPS measurements. Once the magmatic system geometry and location are determined, a linear inversion of all available GPS and InSAR data is performed in order to estimate the volume changes of the source along the analyzed time interval. To do so, we applied a technique based on the Truncated Singular Value Decomposition (TSVD) of the Green's function matrix representing the linear inversion. Here, we develop a strategy to provide a cut-off for truncation that removes the smallest singular values without too much loss of data resolution, balanced against the stability of the method. Furthermore, the strategy gives us a quantification of the uncertainty of the volume change time series. The strength of the methodology resides in allowing the joint inversion of InSAR measurements from multiple tracks with different look angles and three-component GPS measurements from multiple sites. Finally, we analyze the temporal behavior of the source volume changes using a new analytical model that describes the process of injecting magma into a reservoir surrounded by a viscoelastic shell. This dynamic model is based on Hagen-Poiseuille flow through a vertical conduit that leads to an increase in pressure within a spherical reservoir and time-dependent surface deformation. The volume time series are compared to predictions from the dynamic model to constrain model parameters, namely the characteristic Poiseuille and Maxwell time scales, inlet and outlet injection pressure, and source and shell geometries. 
The modeling approach used here could be used to develop a mathematically rigorous strategy for including time-series deformation data in the interpretation of volcanic unrest.
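The stabilizing effect of truncating small singular values can be demonstrated on a generic ill-conditioned least-squares problem, as below. The matrix and noise levels are synthetic stand-ins for the geodetic Green's function matrix and data.

```python
import numpy as np

def tsvd_invert(G, d, k):
    # truncated SVD solution of G m = d, keeping the k largest singular
    # values; truncation trades resolution for stability
    U, s, Vt = np.linalg.svd(G, full_matrices=False)
    s_inv = np.zeros_like(s)
    s_inv[:k] = 1.0 / s[:k]
    return Vt.T @ (s_inv * (U.T @ d))

rng = np.random.default_rng(4)
G = rng.normal(size=(40, 10))
G[:, -1] = G[:, -2] + 1e-8 * rng.normal(size=40)   # near-degenerate column
m_true = rng.normal(size=10)
d = G @ m_true + 1e-3 * rng.normal(size=40)        # noisy "observations"
m_tsvd = tsvd_invert(G, d, k=9)                    # drop the tiny singular value
```

Keeping all ten singular values would amplify the noise along the near-degenerate direction, whereas the truncated solution stays well-behaved while still fitting the data.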
NASA Astrophysics Data System (ADS)
Soulsby, Chris; Birkel, Christian; Geris, Josie; Tetzlaff, Doerthe
2016-04-01
Advances in the use of hydrological tracers and their integration into rainfall-runoff models are facilitating improved quantification of stream water age distributions. This is of fundamental importance to understanding water quality dynamics over both short and long time scales, particularly as water quality parameters are often associated with water sources of markedly different ages. For example, legacy nitrate pollution may reflect deeper waters that have resided in catchments for decades, whilst more dynamic parameters from anthropogenic sources (e.g. P, pathogens) are mobilised by very young (<1 day) near-surface water sources. It is increasingly recognised that the age distribution of stream water is non-stationary over both the short term (i.e. event dynamics) and the longer term (i.e. in relation to hydroclimatic variability). This provides a crucial context for interpreting water quality time series. Here, we will use longer-term (>5 year), high-resolution (daily) isotope time series in modelling studies for different catchments to show how variable stream water age distributions can result from hydroclimatic variability, and the implications for understanding water quality. We will also use examples from catchments undergoing rapid urbanisation to show how the resulting age distributions of stream water change in a predictable way as a result of modified flow paths. The implications for the management of water quality in urban catchments will be discussed.
Implications on 1+1 D runup modeling due to time features of the earthquake source
NASA Astrophysics Data System (ADS)
Fuentes, M.; Riquelme, S.; Campos, J. A.
2017-12-01
The time characteristics of the seismic source are usually neglected in tsunami modeling, due to the difference in the time scales of the two processes. Nonetheless, only a few analytical studies have attempted to explain separately the roles of the rise time and the rupture velocity. In this work, we extend an analytical 1+1D solution for the shoreline motion time series from the static case to the dynamic case, by including both rise time and rupture velocity. Results show that the static case corresponds to the limiting case of null rise time and infinite rupture velocity. Both parameters contribute to shifting the arrival time, but the maximum run-up may be affected by very slow ruptures and long rise times. The analytical solution has been tested for the Nicaraguan tsunami earthquake, suggesting that the rupture was not slow enough to cause the wave amplification needed to explain the high run-up observations.
Development of Alabama Resources Information System (ARIS)
NASA Technical Reports Server (NTRS)
Herring, B. E.; Vachon, R. I.
1976-01-01
A formal, organized set of information concerning the development status of the Alabama Resources Information System (ARIS) as of September 1976 is provided. A series of computer source-language programs is presented, together with flow charts for each program to make future changes easier to perform. Listings of the variable names used in the various source-code programs, along with their meanings, and copies of the user manuals prepared up to this time are given.
Moore, Darrell; Van Nest, Byron N; Seier, Edith
2011-06-01
Classical experiments demonstrated that honey bee foragers trained to collect food at virtually any time of day will return to that food source on subsequent days with a remarkable degree of temporal accuracy. This versatile time-memory, based on an endogenous circadian clock, presumably enables foragers to schedule their reconnaissance flights to best take advantage of the daily rhythms of nectar and pollen availability in different species of flowers. It is commonly believed that the time-memory rapidly extinguishes if not reinforced daily, thus enabling foragers to switch quickly from relatively poor sources to more productive ones. On the other hand, it is also commonly thought that extinction of the time-memory is slow enough to permit foragers to 'remember' the food source over a day or two of bad weather. What exactly is the time-course of time-memory extinction? In a series of field experiments, we determined that the level of food-anticipatory activity (FAA) directed at a food source is not rapidly extinguished and, furthermore, the time-course of extinction is dependent upon the amount of experience accumulated by the forager at that source. We also found that FAA is prolonged in response to inclement weather, indicating that time-memory extinction is not a simple decay function but is responsive to environmental changes. These results provide insights into the adaptability of FAA under natural conditions.
A Kalman filter approach for the determination of celestial reference frames
NASA Astrophysics Data System (ADS)
Soja, Benedikt; Gross, Richard; Jacobs, Christopher; Chin, Toshio; Karbon, Maria; Nilsson, Tobias; Heinkelmann, Robert; Schuh, Harald
2017-04-01
The coordinate model of radio sources in International Celestial Reference Frames (ICRF), such as the ICRF2, has traditionally been a constant offset. While sufficient for a large fraction of radio sources at current accuracy requirements, several sources exhibit significant temporal coordinate variations. In particular, the group of so-called special handling sources is characterized by large fluctuations in source position. For these sources, and for several from the "others" category of radio sources, a coordinate model that goes beyond a constant offset would be beneficial. However, due to the sheer number of radio sources in catalogs like the ICRF2, and even more so with the upcoming ICRF3, it is difficult to find the most appropriate coordinate model for every single radio source. For this reason, we have developed a time series approach to the determination of celestial reference frames (CRF). We feed the radio source coordinates derived from single very long baseline interferometry (VLBI) sessions sequentially into a Kalman filter and smoother, retaining their full covariances. The estimation of the source coordinates is carried out with a temporal resolution identical to that of the input data, i.e. usually 1-4 days. The coordinates are assumed to behave like random walk processes, an assumption that has already been made successfully in the determination of terrestrial reference frames such as the JTRF2014. To apply the most suitable process noise value to every single radio source, their statistical properties are analyzed by computing their Allan standard deviations (ADEV). In addition to the determination of process noise values, the ADEV allows us to assess whether the variations in certain radio source positions deviate significantly from random walk processes. Our investigations also consider other means of source characterization, such as the structure index, in order to derive a suitable process noise model.
The Kalman filter CRFs resulting from the different approaches are compared among each other, to the original radio source position time series, as well as to a traditional CRF solution, in which the constant source positions are estimated in a global least squares adjustment.
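The ADEV-based tuning of the random-walk process noise can be sketched in a few lines. Below is a minimal overlapping Allan deviation in Python; this is an illustrative sketch, not the authors' software. The diagnostic it provides is the slope: white noise gives an ADEV that falls as the averaging window grows, while a random walk gives one that rises.

```python
import numpy as np

def allan_deviation(x, m):
    """Overlapping Allan deviation of an evenly sampled series x
    for an averaging window of m samples."""
    means = np.convolve(x, np.ones(m) / m, mode="valid")  # window averages
    diffs = means[m:] - means[:-m]                        # adjacent-window differences
    return np.sqrt(0.5 * np.mean(diffs ** 2))
```

Comparing `allan_deviation(x, 1)` with `allan_deviation(x, 100)` for a coordinate residual series then indicates whether its variations look more like white noise (ADEV decreasing) or a random walk (ADEV increasing), which is the distinction the abstract draws on when choosing process noise.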
Seasonal variability and degradation investigation of iodocarbons in a coastal fjord
NASA Astrophysics Data System (ADS)
Shi, Qiang; Wallace, Douglas
2016-04-01
Methyl iodide (CH3I) is considered an important carrier of iodine atoms from sea to air. The importance of other volatile iodinated compounds, such as very short-lived iodocarbons (e.g. CH2ClI, CH2I2), has also been demonstrated [McFiggans, 2005; O'Dowd and Hoffmann, 2005; Carpenter et al., 2013]. The production pathways of iodocarbons, and controls on their sea-to-air flux can be investigated by in-situ studies (e.g. surface layer mass balance from time-series studies) and by incubation experiments. Shi et al., [2014] reported previously unrecognised large, night-time losses of CH3I observed during incubation experiments with coastal waters. These losses were significant for controlling the sea-to-air flux but are not yet understood. As part of a study to further investigate sources and sinks of CH3I and other iodocarbons in coastal waters, samples have been analysed weekly since April 2015 at 4 depths (5 to 60 m) in the Bedford Basin, Halifax, Canada. The time-series study was part of a broader study that included measurement of other, potentially related parameters (temperature, salinity, Chlorophyll a etc.). A set of repeated degradation experiments was conducted, in the context of this time-series, including incubations within a solar simulator using 13C labelled CH3I. Results of the time-series sampling and incubation experiments will be presented.
NASA Astrophysics Data System (ADS)
Gong, W.; Meyer, F. J.; Lee, C.-W.; Lu, Z.; Freymueller, J.
2015-02-01
A 7 year time series of satellite radar images over Unimak Island, Alaska—site of Westdahl Volcano, Fisher Caldera, and Shishaldin Volcano—was processed using a model-free Persistent Scatterer Interferometry technique assisted by a numerical weather prediction model. The deformation-only signals were optimally extracted from atmosphere-contaminated phase records. The reconstructed deformation time series maps are compared with campaign and continuous Global Positioning System (GPS) measurements as well as Small Baseline Subset interferometric synthetic aperture radar (InSAR) results for quality assessment and geophysical interpretation. We observed subtle surface inflation at Westdahl Volcano that can be fit by a Mogi source located approximately 3.6 km north of Westdahl peak at a depth of about 6.9 km, consistent with the GPS-estimated depth for the 1998 to 2001 time period. The magma chamber volume change decayed during the period 2003 to 2010. The deformation field over Fisher Caldera shows steady subsidence over time. Its best-fit analytical model is a sill source about 7.9 km in length and 0.54 km in width, located about 5.5 km below sea level beneath the center of Fisher Caldera with a strike of N52°E. Very little deformation was detected near Shishaldin peak; however, a region approximately 15 km east of Shishaldin, as well as an area in the Tugamak Range about 30 km northwest of Shishaldin, shows evidence of movement toward the satellite, with a temporal signature correlated with the 2004 Shishaldin eruption. The cause of these movements is unknown.
NASA Astrophysics Data System (ADS)
Rajib, A.; Zhao, L.; Merwade, V.; Shin, J.; Smith, J.; Song, C. X.
2017-12-01
Despite the significant potential of remotely sensed earth observations, their application is still not widespread in water resources research, management and education. Inconsistent storage structures, data formats and spatial resolutions among different platforms/sources of earth observations hinder the use of these data. Available web services can help with bulk data downloading and visualization, but they are not sufficiently tailored to the degree of interoperability required for direct application of earth observations in hydrologic modeling at user-defined spatio-temporal scales. Similarly, the least ambiguous workflow for educators and watershed managers is to obtain a time series instantly for any watershed of interest, without spending time and computational resources on data download and post-processing. To address this issue, an open-access online platform named HydroGlobe has been developed that minimizes these processing tasks and delivers ready-to-use data from different earth observation sources. HydroGlobe provides spatially averaged time series of earth observations given the following inputs: (i) data source, (ii) temporal extent in the form of start/end dates, and (iii) geographic units (e.g., grid cell or sub-basin boundary) and extent in the form of a GIS shapefile. In its preliminary version, HydroGlobe simultaneously handles five data sources, including surface and root zone soil moisture from SMAP (Soil Moisture Active Passive), actual and potential evapotranspiration from MODIS (Moderate Resolution Imaging Spectroradiometer), and precipitation from GPM (Global Precipitation Measurement). This presentation will demonstrate the HydroGlobe interface and its applicability using a few test cases on watersheds from different parts of the globe.
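The core "spatially averaged time series" operation described above can be sketched as follows. This is a minimal illustration that assumes the observations have already been resampled onto a common (time, y, x) array and the sub-basin rasterized to a boolean mask; these inputs are hypothetical, not HydroGlobe's actual internals.

```python
import numpy as np

def spatial_average_series(cube, mask):
    """Spatially averaged time series from a (time, y, x) data cube over the
    grid cells selected by a 2-D boolean mask (e.g. a rasterized sub-basin),
    ignoring NaN cells such as unobserved pixels."""
    return np.nanmean(cube[:, mask], axis=1)  # (time, n_cells) -> (time,)
```

The same pattern extends to any of the gridded sources (soil moisture, evapotranspiration, precipitation) once they share a grid and time axis.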
NASA Technical Reports Server (NTRS)
Dong, D.; Fang, P.; Bock, F.; Webb, F.; Prawirondirdjo, L.; Kedar, S.; Jamason, P.
2006-01-01
Spatial filtering is an effective way to improve the precision of coordinate time series for regional GPS networks by reducing so-called common mode errors, thereby providing better resolution for detecting weak or transient deformation signals. The commonly used approach to regional filtering assumes that the common mode error is spatially uniform, which is a good approximation for networks hundreds of kilometers in extent but breaks down as the spatial extent increases. A more rigorous approach should drop the assumption of a spatially uniform distribution and let the data themselves reveal the spatial distribution of the common mode error. Principal component analysis (PCA) and the Karhunen-Loeve expansion (KLE) both decompose network time series into a set of temporally varying modes and their spatial responses, and therefore provide a mathematical framework for spatiotemporal filtering. We apply the combination of PCA and KLE to daily station coordinate time series of the Southern California Integrated GPS Network (SCIGN) for the period 2000 to 2004. We demonstrate that spatially and temporally correlated common mode errors are the dominant error source in daily GPS solutions. The spatial characteristics of the common mode errors are close to uniform for the east, north, and vertical components, which implies a very long wavelength source for the common mode errors compared to the spatial extent of the GPS network in southern California. Furthermore, the common mode errors exhibit temporally nonrandom patterns.
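A minimal PCA-based common-mode filter along these lines might look as follows. This is a sketch of the general idea, not the SCIGN processing code: the leading principal component of the demeaned network residual matrix is taken as the common mode and removed.

```python
import numpy as np

def pca_filter(X, n_modes=1):
    """Remove the leading n_modes principal components (common-mode error)
    from a (time, stations) matrix of coordinate residuals."""
    Xc = X - X.mean(axis=0)                            # demean each station
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)  # temporal modes x spatial responses
    common = (U[:, :n_modes] * s[:n_modes]) @ Vt[:n_modes]
    return Xc - common
```

Because the SVD returns temporally varying modes (columns of `U`) with spatial responses (rows of `Vt`), inspecting `Vt[0]` shows directly how uniform the common mode is across the network, which is the diagnostic discussed in the abstract.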
NASA Astrophysics Data System (ADS)
Sprigg, W. A.; Sahoo, S.; Prasad, A. K.; Venkatesh, A. S.; Vukovic, A.; Nickovic, S.
2015-12-01
Identification and evaluation of sources of aeolian mineral dust is a critical task in dust simulation. Recently, time series of space-based multi-sensor satellite images have been used to identify and monitor changes in land surface characteristics. Modeling of windblown dust requires precise delineation of mineral dust sources and their strength, which vary over a region as well as seasonally and inter-annually due to changes in land use and land cover. The southwest USA is one of the major dust-emission-prone zones of the North American continent, where dust is generated from low-lying, dried-up areas with bare ground surfaces that may be scattered or appear as point sources on high-resolution satellite images. In the current research, various satellite-derived variables have been integrated to produce a high-resolution dust source mask, at a grid size of 250 m, using data such as a digital elevation model, surface reflectance, vegetation cover, land cover class, and surface wetness. Previous dust source models have been adapted to produce a multi-parameter dust source mask using data from satellites such as Terra (Moderate Resolution Imaging Spectroradiometer - MODIS) and Landsat. The dust source mask model captures the topographically low regions with bare soil surfaces, dried-up river plains, and lakes, which form an important source of dust in the southwest USA. The study region is also one of the hottest regions of the USA, where surface dryness, land use (agricultural use), and vegetation cover change significantly, leading to major changes in the areal coverage of potential dust source regions. A dynamic high-resolution dust source mask has been produced to address intra-annual change in the areal extent of bare dry surfaces. Time series of satellite-derived data have been used to create these dynamic dust source masks.
A new dust source mask at 16-day intervals allows enhanced detection of potential dust source regions and can be employed in dust emission and transport pathway models for better estimation of dust emission during dust storms, and in particulate air pollution studies, public health risk assessment tools, and decision support systems.
Real-time inversions for finite fault slip models and rupture geometry based on high-rate GPS data
Minson, Sarah E.; Murray, Jessica R.; Langbein, John O.; Gomberg, Joan S.
2015-01-01
We present an inversion strategy capable of using real-time high-rate GPS data to simultaneously solve for a distributed slip model and fault geometry in real time as a rupture unfolds. We employ Bayesian inference to find the optimal fault geometry and the distribution of possible slip models for that geometry using a simple analytical solution. By adopting an analytical Bayesian approach, we can solve this complex inversion problem (including calculating the uncertainties on our results) in real time. Furthermore, since the joint inversion for distributed slip and fault geometry can be computed in real time, the time required to obtain a source model of the earthquake does not depend on the computational cost. Instead, the time required is controlled by the duration of the rupture and the time required for information to propagate from the source to the receivers. We apply our modeling approach, called Bayesian Evidence-based Fault Orientation and Real-time Earthquake Slip, to the 2011 Tohoku-oki earthquake, 2003 Tokachi-oki earthquake, and a simulated Hayward fault earthquake. In all three cases, the inversion recovers the magnitude, spatial distribution of slip, and fault geometry in real time. Since our inversion relies on static offsets estimated from real-time high-rate GPS data, we also present performance tests of various approaches to estimating quasi-static offsets in real time. We find that the raw high-rate time series are the best data to use for determining the moment magnitude of the event, but slightly smoothing the raw time series helps stabilize the inversion for fault geometry.
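The quasi-static offset estimation discussed at the end of the abstract can be sketched as below: compare the mean of a pre-event window against the mean of a lightly smoothed post-event window of a 1 Hz position series. The window lengths and smoothing width here are illustrative assumptions, not the authors' settings.

```python
import numpy as np

def static_offset(series, i_event, pre=60, gap=30, post=60, smooth=5):
    """Quasi-static coseismic offset from a 1 Hz position series:
    mean of a pre-event window versus the mean of a lightly smoothed
    post-event window (light smoothing stabilizes the estimate)."""
    pre_mean = series[i_event - pre:i_event].mean()
    seg = series[i_event + gap:i_event + gap + post]
    seg = np.convolve(seg, np.ones(smooth) / smooth, mode="valid")  # light smoothing
    return seg.mean() - pre_mean
```

This matches the abstract's finding in spirit: raw samples carry the full amplitude information, while a small amount of smoothing reduces the scatter passed into the geometry inversion.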
NASA Astrophysics Data System (ADS)
Sorge, J.; Williams-Jones, G.; Wright, R.; Varley, N. R.
2010-12-01
Satellite imagery is playing an increasingly prominent role in volcanology as it allows for consistent monitoring of remote, dangerous, and/or under-monitored volcanoes. One such system is Volcán de Colima (Mexico), a persistently active andesitic stratovolcano. Its characteristic and hazardous activity includes lava dome growth, pyroclastic flows, explosions, and Plinian to Subplinian eruptions, which have historically occurred at the end of Volcán de Colima’s eruptive cycle. Despite the availability of large amounts of historical satellite imagery, methods to process and interpret these images over long time periods are limited. Furthermore, while time-series InSAR data from a previous study (December 2002 to August 2006) detected an overall subsidence between 1 and 3 km from the summit, there is insufficient temporal resolution to unambiguously constrain the source processes. To address this issue, a semi-automated process for time-based characterization of persistent volcanic activity at Volcán de Colima has been developed using a combination of MODIS and GOES satellite imagery to identify thermal anomalies on the volcano edifice. This satellite time-series data is then combined with available geodetic data, a detailed eruption history, and other geophysical time-series data (e.g., seismicity, explosions/day, effusion rate, environmental data, etc.) and examined for possible correlations and recurring patterns in the multiple data sets to investigate potential trigger mechanisms responsible for the changes in volcanic activity. GOES and MODIS images are available from 2000 to present at a temporal resolution of one image every 30 minutes and up to four images per day, respectively, creating a data set of approximately 180,000 images. Thermal anomalies over Volcán de Colima are identified in both night- and day-time images by applying a time-series approach to the analysis of MODIS data. 
Detection of false anomalies caused by non-volcanic heat sources, such as fires or solar heating (in the daytime images), is mitigated by adjusting the MODIS detection thresholds, by comparing daytime and nighttime results, and by observing the spatial distribution of the anomalies on the edifice. Conversely, anomalies may go undetected due to cloud cover: clouds absorb thermal radiation, limiting or preventing the satellite's ability to measure thermal events, so the anomaly data are supplemented with a cloud cover time-series data set. Fast Fourier and wavelet transforms are then applied to continuous, uninterrupted intervals of satellite observation to compare and correlate the multiple time-series data sets. The result is a characterization of the behavior of an individual volcano over an extended time period. This volcano-specific, comprehensive characterization can then be used as a predictive tool in real-time monitoring of volcanic activity.
NASA Technical Reports Server (NTRS)
Ehhalt, D. H.; Fraser, P. J.; Albritton, D.; Cicerone, R. J.; Khalil, M. A. K.; Legrand, M.; Makide, Y.; Rowland, F. S.; Steele, L. P.; Zander, R.
1989-01-01
Source gases are defined as those gases that, by their breakdown, introduce into the stratosphere the halogen, hydrogen, and nitrogen compounds that are important in stratospheric ozone destruction. Given here is an update of the existing concentration time series for chlorocarbons, nitrous oxide, and methane. Information on halogen-containing species, and the use of these data for establishing trends, is also reviewed, as is evidence on trends in trace gases that influence tropospheric chemistry and thus the tropospheric lifetimes of source gases, such as carbon dioxide, carbon monoxide, and nitrogen oxides. Much of the information is given in tabular form.
Consistent modelling of wind turbine noise propagation from source to receiver.
Barlas, Emre; Zhu, Wei Jun; Shen, Wen Zhong; Dag, Kaya O; Moriarty, Patrick
2017-11-01
The unsteady nature of wind turbine noise is a major reason for annoyance. The variation of far-field sound pressure levels is not only caused by the continuous change in wind turbine noise source levels but also by the unsteady flow field and the ground characteristics between the turbine and receiver. To take these phenomena into account, a consistent numerical technique that models the sound propagation from the source to receiver is developed. Large eddy simulation with an actuator line technique is employed for the flow modelling and the corresponding flow fields are used to simulate sound generation and propagation. The local blade relative velocity, angle of attack, and turbulence characteristics are input to the sound generation model. Time-dependent blade locations and the velocity between the noise source and receiver are considered within a quasi-3D propagation model. Long-range noise propagation of a 5 MW wind turbine is investigated. Sound pressure level time series evaluated at the source time are studied for varying wind speeds, surface roughness, and ground impedances within a 2000 m radius from the turbine.
NASA Astrophysics Data System (ADS)
Eberle, J.; Hüttich, C.; Schmullius, C.
2014-12-01
Spatial time series data have been freely available around the globe from earth observation satellites and meteorological stations for many years, and they provide useful and important information for detecting ongoing changes in the environment; for end users, however, extracting this information from the original time series datasets is often too complex. This issue led to the development of the Earth Observation Monitor (EOM), an operational framework and research project providing simple access, analysis and monitoring tools for global spatial time series data. A multi-source data processing middleware in the backend is linked to MODIS data from the Land Processes Distributed Active Archive Center (LP DAAC) and Google Earth Engine, as well as daily climate station data from the NOAA National Climatic Data Center. OGC Web Processing Services are used to integrate datasets from linked data providers or external OGC-compliant interfaces into the EOM. Users can use either the web portal (webEOM) or the mobile application (mobileEOM) to execute these processing services and retrieve the requested data for a given point or polygon in user-friendly file formats (CSV, GeoTIFF). Besides data access tools, users can also run further time series analyses, such as trend calculations, breakpoint detection, or the derivation of phenological parameters from vegetation time series data. Furthermore, data from climate stations can be aggregated over a given time interval. Calculated results can be visualized in the client and downloaded for offline use. Automated monitoring and alerting for the time series data integrated by the user is provided by an OGC Sensor Observation Service coupled with an OGC Web Notification Service. Users decide which datasets and parameters are monitored with a given filter expression (e.g., precipitation above x millimeters per day, occurrence of a MODIS fire point, detection of a time series anomaly).
Datasets integrated in the SOS service are updated in near-real-time from the linked data providers mentioned above. An alert is automatically pushed to the user if the new data meet the conditions of the registered filter expression. This monitoring service is available on the web portal with alerting by email, and within the mobile app with alerting by email and push notification.
Wójcik, J.; Kujawska, T.; Nowicki, A.; Lewin, P.A.
2008-01-01
The primary goal of this work was to verify experimentally the applicability of the recently introduced Time-Averaged Wave Envelope (TAWE) method [1] as a tool for fast prediction of four-dimensional (4D) pulsed nonlinear pressure fields from arbitrarily shaped acoustic sources in attenuating media. The experiments were performed in water at a fundamental frequency of 2.8 MHz for spherically focused (focal length F = 80 mm) square (20 × 20 mm) and rectangular (10 × 25 mm) sources similar to those used in the design of 1D linear arrays operating with ultrasonic imaging systems. The experimental results, obtained with 10-cycle tone bursts at three excitation levels corresponding to linear, moderately nonlinear and highly nonlinear propagation conditions (0.045, 0.225 and 0.45 MPa on-source pressure amplitude, respectively), were compared with those yielded by the TAWE approach [1]. The comparison of the experimental results and numerical simulations has shown that the TAWE approach is well suited to predict (to within ±1 dB) both the spatial-temporal and spatial-spectral pressure variations in pulsed nonlinear acoustic beams. The results indicated that the TAWE approach shortened computation time in comparison with the time needed to predict the full 4D pulsed nonlinear acoustic fields using a conventional (Fourier-series) approach [2]. The reduction in computation time depends on several parameters, including the source geometry, dimensions, fundamental resonance frequency, excitation level, and the strength of the medium nonlinearity. For the non-axisymmetric focused transducers mentioned above, excited by tone bursts corresponding to moderately nonlinear and highly nonlinear conditions, the computations took 3 and 12 hours, respectively, using only a PC laptop with a 1.5 GHz 32-bit processor and 2 GB of RAM.
Such prediction of the full 4D pulsed field is not possible using the conventional Fourier-series scheme, as it would require increasing the RAM by at least two orders of magnitude. PMID:18474387
Source and transport of human enteric viruses in deep municipal water supply wells
Bradbury, Kenneth R.; Borchardt, Mark A.; Gotkowitz, Madeline; Spencer, Susan K.; Zhu, Jun; Hunt, Randall J.
2013-01-01
Until recently, few water utilities or researchers were aware of possible virus presence in deep aquifers and wells. During 2008 and 2009 we collected a time series of virus samples from six deep municipal water-supply wells. The wells range in depth from approximately 220 to 300 m and draw water from a sandstone aquifer. Three of these wells draw water from beneath a regional aquitard, and three draw water from both above and below the aquitard. We also sampled a local lake and untreated sewage as potential virus sources. Viruses were detected up to 61% of the time in each well sampled, and many groundwater samples were positive for virus infectivity. Lake samples contained viruses over 75% of the time. Virus concentrations and serotypes observed varied markedly with time in all samples. Sewage samples were all extremely high in virus concentration. Virus serotypes detected in sewage and groundwater were temporally correlated, suggesting very rapid virus transport, on the order of weeks, from the source(s) to wells. Adenovirus and enterovirus levels in the wells were associated with precipitation events. The most likely source of the viruses in the wells was leakage of untreated sewage from sanitary sewer pipes.
A Web-Based Framework For a Time-Domain Warehouse
NASA Astrophysics Data System (ADS)
Brewer, J. M.; Bloom, J. S.; Kennedy, R.; Starr, D. L.
2009-09-01
The Berkeley Transients Classification Pipeline (TCP) uses a machine-learning classifier to automatically categorize transients from large data torrents and provide automated notification of astronomical events of scientific interest. As part of the training process, we created a large warehouse of light-curve sources with well-labelled classes that serve as priors to the classification engine. This web-based interactive framework, which we are now making public via DotAstro.org (http://dotastro.org/), allows us to ingest time-variable source data in a wide variety of formats and store it in a common internal data model. Data is passed between pipeline modules in a prototype XML representation of time-series format (VOTimeseries), which can also be emitted to collaborators through dotastro.org. After import, the sources can be visualized using Google Sky, light curves can be inspected interactively, and classifications can be manually adjusted.
NASA Astrophysics Data System (ADS)
Ozawa, Taku; Ueda, Hideki
2011-12-01
InSAR time series analysis is an effective tool for detecting spatially and temporally complicated volcanic deformation. To obtain details of such deformation, we developed an advanced InSAR time series analysis using interferograms from multiple orbit tracks. For only right- (or only left-) looking SAR observations, the incidence directions of different orbit tracks lie mostly in a common plane, so slant-range changes in their interferograms can be expressed by two components in that plane. This approach estimates the time series of these components from interferograms of multiple orbit tracks by least squares analysis, and higher accuracy is obtained when many interferograms from different orbit tracks are available. Additionally, the analysis can combine interferograms with different incidence angles. In a case study of Miyake-jima, we obtained a deformation time series consistent with GPS observations from PALSAR interferograms of six orbit tracks. The accuracy obtained was better than that of the SBAS approach, demonstrating the method's effectiveness. Furthermore, higher accuracy would be expected if SAR observations were carried out more frequently on all orbit tracks. The deformation obtained in the case study indicates uplift along the west coast and subsidence with contraction around the caldera. The speed of the uplift was almost constant, but the subsidence around the caldera decelerated from 2009. A flat deformation source was estimated near sea level under the caldera, implying that the deceleration of subsidence was related to interaction between volcanic thermal activity and the aquifer.
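The two-component least squares step can be sketched as follows, assuming the per-track unit look-vector components in the common plane are known. The numbers in the test are hypothetical ascending/descending geometries, not PALSAR values.

```python
import numpy as np

def invert_two_components(los, look_east, look_up):
    """Least-squares estimate of the east and up deformation components
    from LOS range changes observed on several orbit tracks, assuming all
    look vectors lie (nearly) in a common east-up plane."""
    A = np.column_stack([look_east, look_up])  # design matrix, one row per track
    est, *_ = np.linalg.lstsq(A, los, rcond=None)
    return est                                 # (east, up)
```

With more than two tracks the system is overdetermined, which is why the abstract notes that accuracy improves as interferograms from more orbit tracks become available.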
Acoustic Full Waveform Inversion to Characterize Near-surface Chemical Explosions
NASA Astrophysics Data System (ADS)
Kim, K.; Rodgers, A. J.
2015-12-01
Recent high-quality atmospheric overpressure data from chemical high-explosive experiments provide a unique opportunity to characterize near-surface explosions, specifically estimating yield and the source time function. Typically, yield is estimated from measured signal features such as peak pressure, impulse, duration and/or arrival time of acoustic signals; the application of full waveform inversion to acoustic signals for yield estimation, however, has not been fully explored. In this study, we apply a full waveform inversion method to local overpressure data to extract accurate pressure-time histories of acoustic sources during chemical explosions. A robust and accurate inversion technique for the acoustic source is investigated using numerical Green's functions that take into account atmospheric and topographic propagation effects. The inverted pressure-time history represents the pressure fluctuation at the source region associated with the explosion and thus provides valuable information about acoustic source mechanisms and characteristics in greater detail. We compare the acoustic source properties (i.e., peak overpressure, duration, and non-isotropic shape) of a series of explosions with different emplacement conditions and investigate the relationship of the acoustic sources to the yields of the explosions. The time histories of acoustic sources may refine our knowledge of the sound-generation mechanisms of shallow explosions, and thereby allow for accurate yield estimation based on acoustic measurements. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
Water Column Variability in Coastal Regions
1997-09-30
Andrews, Woods, and Kester deployed a spar buoy at a central location in Narragansett Bay to obtain time-series variations at multiple depths (1, 4…
NASA Astrophysics Data System (ADS)
Witt, Thomas J.; Fletcher, N. E.
2010-10-01
We investigate some statistical properties of ac voltages from a white noise source measured with a digital lock-in amplifier equipped with finite impulse response output filters which introduce correlations between successive voltage values. The main goal of this work is to propose simple solutions to account for correlations when calculating the standard deviation of the mean (SDM) for a sequence of measurement data acquired using such an instrument. The problem is treated by time series analysis based on a moving average model of the filtering process. Theoretical expressions are derived for the power spectral density (PSD), the autocorrelation function, the equivalent noise bandwidth and the Allan variance; all are related to the SDM. At most three parameters suffice to specify any of the above quantities: the filter time constant, the time between successive measurements (both set by the lock-in operator) and the PSD of the white noise input, h0. Our white noise source is a resistor so that the PSD is easily calculated; there are no free parameters. Theoretical expressions are checked against their respective sample estimates and, with the exception of two of the bandwidth estimates, agreement to within 11% or better is found.
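The practical point of the abstract, that filter-induced correlations inflate the standard deviation of the mean, can be sketched numerically. The 4-tap averaging filter below is illustrative, not the lock-in's actual FIR filter; for a moving-average (MA) process the autocorrelation vanishes beyond the filter length, so only a few lags are needed:

```python
import numpy as np

rng = np.random.default_rng(0)

# White noise passed through a short FIR (moving-average) output filter,
# mimicking correlated successive readings from the lock-in amplifier.
h = np.ones(4) / 4.0
x = np.convolve(rng.normal(size=20000), h, mode="valid")

n = len(x)
xc = x - x.mean()
# Sample autocorrelations up to the filter length (an MA(q) process has
# zero autocorrelation beyond q - 1 lags).
rho = np.array([xc[:-k] @ xc[k:] / (xc @ xc) for k in range(1, len(h))])

# Naive SDM assumes independent samples; positive correlations inflate it.
sdm_naive = x.std(ddof=1) / np.sqrt(n)
inflation = np.sqrt(1.0 + 2.0 * np.sum((1 - np.arange(1, len(h)) / n) * rho))
sdm_corrected = sdm_naive * inflation
print(round(inflation, 1))   # near 2 for this 4-tap averaging filter
```

The inflation factor plays the same role as the ratio between the equivalent noise bandwidth and the naive bandwidth in the paper's treatment.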
NASA Astrophysics Data System (ADS)
Feigin, A. M.; Mukhin, D.; Volodin, E. M.; Gavrilov, A.; Loskutov, E. M.
2013-12-01
The new method of decomposition of the Earth's climate system into well separated spatial-temporal patterns ('climatic modes') is discussed. The method is based on: (i) a generalization of MSSA (Multichannel Singular Spectral Analysis) [1] for expanding vector (space-distributed) time series in a basis of spatial-temporal empirical orthogonal functions (STEOF), which takes into account delayed correlations between processes recorded at spatially separated points; (ii) expanding both real SST data and numerically generated SST data several times longer in the STEOF basis; (iii) use of the numerically produced STEOF basis to exclude 'too slow' (and thus not correctly represented) processes from the real data. By means of vector time series generated numerically by the INM RAS Coupled Climate Model [2], the method separates from real SST anomaly data [3] two climatic modes with noticeably different time scales: 3-5 and 9-11 years. Relations of the separated modes to ENSO and PDO are investigated. Possible applications of the spatial-temporal climatic pattern concept to forecasting climate system evolution are discussed. 1. Ghil, M., R. M. Allen, M. D. Dettinger, K. Ide, D. Kondrashov, et al. (2002) "Advanced spectral methods for climatic time series", Rev. Geophys. 40(1), 3.1-3.41. 2. http://83.149.207.89/GCM_DATA_PLOTTING/GCM_INM_DATA_XY_en.htm 3. http://iridl.ldeo.columbia.edu/SOURCES/.KAPLAN/.EXTENDED/.v2/.ssta/
NASA Astrophysics Data System (ADS)
Tominaga, M.; Tivey, M.; Sager, W.
2017-12-01
Two major difficulties have hindered improving the accuracy of the Late-Mid Jurassic geomagnetic polarity time scale: a dearth of reliable high-resolution radiometric dates and the lack of a continuous Jurassic geomagnetic polarity time scale (GPTS) record. We present the latest effort towards establishing a definitive Mid Jurassic to Early Cretaceous (M-series) GPTS model using three high-resolution, multi-level (sea surface [0 km], mid-water [3 km], and near-source [5.2 km]) marine magnetic profiles from a seamount-free corridor adjacent to the Waghenaer Fracture Zone in the western Pacific Jurassic Quiet Zone (JQZ). The profiles show a global coherency in magnetic anomaly correlations between two mid-ocean ridge systems (i.e., the Japanese and Hawaiian lineations). Their unprecedented data resolution documents detailed anomaly character (i.e., amplitudes and wavelengths). We confirm that this magnetic anomaly record shows a coherent anomaly sequence from M29 back in time to M42, as previously suggested from the Japanese lineation in the Pigafetta Basin. Especially noticeable is the M39-M41 Low Amplitude Zone defined in the Pigafetta Basin, which potentially defines the bounds of JQZ seafloor. We assessed the anomaly source with regard to the crustal architecture, including the effects of Cretaceous volcanism on crustal magnetization, and conclude that the anomaly character faithfully represents changes in geomagnetic field intensity and polarity over time, and that the original Jurassic magnetic remanence is mostly free of overprinting by later Cretaceous volcanism. We have constructed polarity block models (RMS <5 nT [normalized] between observed and calculated profiles) for each of the survey lines, yielding three potential GPTS candidate models with different source-to-sensor resolutions, from M19-M38, which can be compared to currently available magnetostratigraphic records.
The overall polarity reversal rates calculated from each of the models are anomalously high, which is consistent with previous observations from the Japanese M-series sequence. These anomalously high reversal rates during a period of apparently low field intensity suggest a unique period of geomagnetic field behavior in Earth's history.
An investigation of fMRI time series stationarity during motor sequence learning foot tapping tasks.
Muhei-aldin, Othman; VanSwearingen, Jessie; Karim, Helmet; Huppert, Theodore; Sparto, Patrick J; Erickson, Kirk I; Sejdić, Ervin
2014-04-30
Understanding complex brain networks using functional magnetic resonance imaging (fMRI) is of great interest to clinical and scientific communities. To utilize advanced analysis methods such as graph theory for these investigations, the stationarity of fMRI time series needs to be understood as it has important implications on the choice of appropriate approaches for the analysis of complex brain networks. In this paper, we investigated the stationarity of fMRI time series acquired from twelve healthy participants while they performed a motor (foot tapping sequence) learning task. Since prior studies have documented that learning is associated with systematic changes in brain activation, a sequence learning task is an optimal paradigm to assess the degree of non-stationarity in fMRI time-series in clinically relevant brain areas. We predicted that brain regions involved in a "learning network" would demonstrate non-stationarity and may violate assumptions associated with some advanced analysis approaches. Six blocks of learning, and six control blocks of a foot tapping sequence were performed in a fixed order. The reverse arrangement test was utilized to investigate the time series stationarity. Our analysis showed some non-stationary signals with a time varying first moment as a major source of non-stationarity. We also demonstrated a decreased number of non-stationarities in the third block as a result of priming and repetition. Most of the current literature does not examine stationarity prior to processing. The implication of our findings is that future investigations analyzing complex brain networks should utilize approaches robust to non-stationarities, as graph-theoretical approaches can be sensitive to non-stationarities present in data. Copyright © 2014 Elsevier B.V. All rights reserved.
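The reverse arrangement test used here is a standard nonparametric stationarity check; a minimal sketch on synthetic data (the series below are illustrative, not fMRI signals) counts pairs that appear in descending order and compares the count against its random-order mean and variance:

```python
import numpy as np

def reverse_arrangements(x):
    """Count pairs (i, j) with i < j and x[i] > x[j]."""
    x = np.asarray(x)
    return int(sum((x[i] > x[i + 1:]).sum() for i in range(len(x) - 1)))

def ra_test_z(x):
    """z-score of the reverse arrangement count against the stationary
    (random-order) null: mean N(N-1)/4, variance N(2N+5)(N-1)/72."""
    n = len(x)
    a = reverse_arrangements(x)
    mu = n * (n - 1) / 4.0
    var = n * (2 * n + 5) * (n - 1) / 72.0
    return (a - mu) / np.sqrt(var)

rng = np.random.default_rng(1)
stationary = rng.normal(size=200)
trending = stationary + np.linspace(0, 5, 200)   # time-varying first moment
print(round(ra_test_z(stationary), 2), round(ra_test_z(trending), 2))
```

A large |z| (e.g. beyond ±1.96) rejects stationarity; an upward trend, the "time varying first moment" flagged in the study, produces far fewer reverse arrangements than chance and hence a strongly negative z.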
The PRIMAP-hist national historical emissions time series
NASA Astrophysics Data System (ADS)
Gütschow, Johannes; Jeffery, M. Louise; Gieseke, Robert; Gebel, Ronja; Stevens, David; Krapp, Mario; Rocha, Marcia
2016-11-01
To assess the history of greenhouse gas emissions and individual countries' contributions to emissions and climate change, detailed historical data are needed. We combine several published datasets to create a comprehensive set of emissions pathways for each country and Kyoto gas, covering the years 1850 to 2014 with yearly values, for all UNFCCC member states and most non-UNFCCC territories. The sectoral resolution is that of the main IPCC 1996 categories. Additional time series of CO2 are available for energy and industry subsectors. Country-resolved data are combined from different sources and supplemented using year-to-year growth rates from regionally resolved sources and numerical extrapolations to complete the dataset. Regional deforestation emissions are downscaled to country level using estimates of the deforested area obtained from potential vegetation and simulations of agricultural land. In this paper, we discuss the data sources and methods used and present the resulting dataset, including its limitations and uncertainties. The dataset is available from doi:10.5880/PIK.2016.003 and can be viewed on the website accompanying this paper (http://www.pik-potsdam.de/primap-live/primap-hist/).
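The supplementing step, extending a country-level series using year-to-year growth rates from a regionally resolved source, can be illustrated with a toy example (all numbers hypothetical, not PRIMAP-hist values):

```python
# Country data end in 2011; regional year-on-year ratios fill the gap.
country = {2010: 100.0, 2011: 104.0}
region_growth = {2012: 1.03, 2013: 0.99}      # hypothetical ratios

for year in (2012, 2013):
    country[year] = country[year - 1] * region_growth[year]

print(round(country[2013], 2))   # 104.0 * 1.03 * 0.99 = 106.05
```

Scaling by growth rates rather than copying regional levels preserves the country's own magnitude while borrowing the region's trend.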
Phase correction and error estimation in InSAR time series analysis
NASA Astrophysics Data System (ADS)
Zhang, Y.; Fattahi, H.; Amelung, F.
2017-12-01
During the last decade several InSAR time series approaches have been developed in response to the non-ideal acquisition strategies of SAR satellites, such as large spatial and temporal baselines and irregular acquisitions. The small baseline tubes and regular acquisitions of new SAR satellites such as Sentinel-1 allow us to form fully connected networks of interferograms and simplify the time series analysis into a weighted least-squares inversion of an over-determined system. Such robust inversion allows us to focus more on understanding the different components in InSAR time series and their uncertainties. We present an open-source Python-based package for InSAR time series analysis, called PySAR (https://yunjunz.github.io/PySAR/), with unique functionalities for obtaining unbiased ground displacement time series, geometrical and atmospheric correction of InSAR data, and quantification of InSAR uncertainty. Our implemented strategy contains several features including: 1) improved spatial coverage using a coherence-based network of interferograms, 2) unwrapping error correction using phase closure or bridging, 3) tropospheric delay correction using weather models and empirical approaches, 4) DEM error correction, 5) optimal selection of the reference date and automatic outlier detection, 6) InSAR uncertainty due to the residual tropospheric delay, decorrelation and residual DEM error, and 7) the variance-covariance matrix of final products for geodetic inversion. We demonstrate the performance using SAR datasets acquired by COSMO-SkyMed, TerraSAR-X, Sentinel-1 and ALOS/ALOS-2, with application to the highly non-linear volcanic deformation in Japan and Ecuador (figure 1). Our result shows precursory deformation before the 2015 eruptions of Cotopaxi volcano, with a maximum uplift of 3.4 cm on the western flank (fig. 1b), with a standard deviation of 0.9 cm (fig. 1a), supporting the finding by Morales-Rivera et al.
(2017, GRL); and a post-eruptive subsidence on the same area, with a maximum of -3 +/- 0.9 cm (fig. 1c). Time-series displacement map (fig. 2) shows a highly non-linear deformation behavior, indicating the complicated magma propagation process during this eruption cycle.
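The least-squares inversion of a fully connected interferogram network can be sketched with a tiny example. The dates, pairs and displacements below are made-up numbers, not PySAR's implementation, and unit weights stand in for the weighted inversion:

```python
import numpy as np

# Dates (arbitrary time units) and a fully connected network of
# (reference, secondary) interferogram pairs.
dates = np.array([0.0, 1.0, 2.0, 3.0])
pairs = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3), (0, 3)]

# Assumed true cumulative displacement (first date is the reference);
# each interferogram observes the difference between its two dates.
disp_true = np.array([0.0, 0.5, 1.4, 2.0])
obs = np.array([disp_true[j] - disp_true[i] for i, j in pairs])

# Design matrix mapping displacements at dates 1..3 to pair differences.
A = np.zeros((len(pairs), len(dates) - 1))
for k, (i, j) in enumerate(pairs):
    if i > 0:
        A[k, i - 1] = -1.0
    A[k, j - 1] = 1.0

# Least-squares solution of the over-determined system (unit weights here)
ts, *_ = np.linalg.lstsq(A, obs, rcond=None)
print(np.round(ts, 6))   # recovers [0.5, 1.4, 2.0]
```

With six interferograms constraining three unknowns, the redundancy is what allows unwrapping-error checks via phase closure and a variance-covariance estimate for the result.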
NASA Astrophysics Data System (ADS)
Dutrieux, Loïc P.; Jakovac, Catarina C.; Latifah, Siti H.; Kooistra, Lammert
2016-05-01
We developed a method to reconstruct land use history from Landsat image time series. The method uses a breakpoint detection framework derived from the econometrics field and applicable to time-series regression models. The Breaks For Additive Season and Trend (BFAST) framework is used for defining the time-series regression models, which may contain trend and phenology components, hence appropriately modelling vegetation intra- and inter-annual dynamics. All available Landsat data are used for a selected study area, and the time series are partitioned into segments delimited by breakpoints. Segments can be associated with land use regimes, while the breakpoints then correspond to shifts in land use regimes. In order to further characterize these shifts, we classified the unlabelled breakpoints returned by the algorithm into their corresponding processes. We used a Random Forest classifier, trained on a set of visually interpreted time-series profiles, to infer the processes and assign labels to the breakpoints. The whole approach was applied to quantifying the number of cultivation cycles in a swidden agriculture system in Brazil (state of Amazonas). The number and frequency of cultivation cycles is of particular ecological relevance in these systems since they largely affect the capacity of the forest to regenerate after land abandonment. We applied the method to a Landsat time series of the Normalized Difference Moisture Index (NDMI) spanning the 1984-2015 period and derived from it the number of cultivation cycles during that period at the individual field scale. Agricultural field boundaries used to apply the method were derived using a multi-temporal segmentation approach. We validated the number of cultivation cycles predicted by the method against in-situ information collected from farmer interviews, resulting in a Normalized Root Mean Squared Error (NRMSE) of 0.25. Overall the method performed well, producing maps with coherent spatial patterns.
We identified various sources of error in the approach, including low data availability in the 90s and sub-object mixture of land uses. We conclude that the method holds great promise for land use history mapping in the tropics and beyond.
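BFAST itself is an R framework; the underlying idea, fitting a trend-plus-harmonic regression and placing a breakpoint where the total segment error is minimized, can be sketched in a simplified single-break form (synthetic NDMI-like data, not the study's series):

```python
import numpy as np

def fit_sse(t, y):
    """SSE of a trend + annual-harmonic regression (BFAST-style model)."""
    X = np.column_stack([np.ones_like(t), t,
                         np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ beta
    return float(r @ r)

def best_breakpoint(t, y, min_seg=12):
    """Single-break search: the split minimizing total segment SSE."""
    return min(range(min_seg, len(t) - min_seg),
               key=lambda k: fit_sse(t[:k], y[:k]) + fit_sse(t[k:], y[k:]))

# Synthetic seasonal series with an abrupt drop (e.g. clearing for
# cultivation) at observation 60 of 120 monthly samples.
t = np.arange(120) / 12.0                     # time in years
y = 0.5 * np.sin(2 * np.pi * t)
y[60:] -= 1.0
k = best_breakpoint(t, y)
print(k)   # close to the true break at index 60
```

BFAST generalizes this to multiple breaks with formal structural-change tests; each detected break is then a candidate land-use shift for the classifier to label.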
NASA Astrophysics Data System (ADS)
Dutrieux, L.; Jakovac, C. C.; Siti, L. H.; Kooistra, L.
2015-12-01
We developed a method to reconstruct land use history from Landsat image time series. The method uses a breakpoint detection framework derived from the econometrics field and applicable to time-series regression models. The BFAST framework is used for defining the time-series regression models, which may contain trend and phenology components, hence appropriately modelling vegetation intra- and inter-annual dynamics. All available Landsat data are used, and the time series are partitioned into segments delimited by breakpoints. Segments can be associated with land use regimes, while the breakpoints then correspond to shifts in regimes. To further characterize these shifts, we classified the unlabelled breakpoints returned by the algorithm into their corresponding processes. We used a Random Forest classifier, trained on a set of visually interpreted time-series profiles, to infer the processes and assign labels to the breakpoints. The whole approach was applied to quantifying the number of cultivation cycles in a swidden agriculture system in Brazil. The number and frequency of cultivation cycles is of particular ecological relevance in these systems since they largely affect the capacity of the forest to regenerate after abandonment. We applied the method to a Landsat time series of the Normalized Difference Moisture Index (NDMI) spanning the 1984-2015 period and derived from it the number of cultivation cycles during that period at the individual field scale. Agricultural field boundaries used to apply the method were derived using a multi-temporal segmentation. We validated the number of cultivation cycles predicted against in-situ information collected from farmer interviews, resulting in a Normalized RMSE of 0.25. Overall the method performed well, producing maps with coherent patterns. We identified various sources of error in the approach, including low data availability in the 90s and sub-object mixture of land uses.
We conclude that the method holds great promise for land use history mapping in the tropics and beyond. Spatial and temporal patterns were further analysed from an ecological perspective in a follow-up study. Results show that changes in land use patterns, such as land use intensification and reduced agricultural expansion, reflect the socio-economic transformations that occurred in the region.
Scaling laws from geomagnetic time series
Voros, Z.; Kovacs, P.; Juhasz, A.; Kormendi, A.; Green, A.W.
1998-01-01
The notion of extended self-similarity (ESS) is applied here to the X-component time series of geomagnetic field fluctuations. Plotting nth-order structure functions against the fourth-order structure function, we show that low-frequency geomagnetic fluctuations up to order n = 10 follow the same scaling laws as MHD fluctuations in the solar wind; however, for higher frequencies (f > 1/5 h⁻¹) a clear departure from the expected universality is observed for n > 6. ESS does not allow us to make an unambiguous statement about the non-triviality of scaling laws in "geomagnetic" turbulence. However, we suggest using higher-order moments as promising diagnostic tools for mapping the contributions of various remote magnetospheric sources to local observatory data. Copyright 1998 by the American Geophysical Union.
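The ESS procedure, regressing one structure function's logarithm against another's, can be sketched on a synthetic monofractal signal (Brownian-like noise stands in for the geomagnetic series; for it, the relative exponent ζ₂/ζ₄ is 1/2 exactly):

```python
import numpy as np

def structure_function(x, n, taus):
    """S_n(tau) = <|x(t + tau) - x(t)|^n> for each lag tau."""
    return np.array([np.mean(np.abs(x[tau:] - x[:-tau]) ** n) for tau in taus])

rng = np.random.default_rng(2)
x = np.cumsum(rng.normal(size=100000))        # Brownian-like test signal
taus = np.array([1, 2, 4, 8, 16, 32])

# ESS: regress log S_n on log S_4; the slope estimates zeta_n / zeta_4.
s2 = structure_function(x, 2, taus)
s4 = structure_function(x, 4, taus)
slope = np.polyfit(np.log(s4), np.log(s2), 1)[0]
print(round(slope, 2))   # near 0.5 for this monofractal Gaussian signal
```

Departures of these relative slopes from the monofractal prediction are exactly what the higher-order moments diagnose in the observatory data.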
DOE Office of Scientific and Technical Information (OSTI.GOV)
Preston, Leiph
Although using standard Taylor series coefficients for finite-difference operators is optimal in the sense that in the limit of infinitesimal space and time discretization, the solution approaches the correct analytic solution to the acousto-dynamic system of differential equations, other finite-difference operators may provide optimal computational run time given certain error bounds or source bandwidth constraints. This report describes the results of investigation of alternative optimal finite-difference coefficients based on several optimization/accuracy scenarios and provides recommendations for minimizing run time while retaining error within given error bounds.
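The standard Taylor-series coefficients that serve as the report's baseline follow from solving a small Vandermonde system; a generic sketch (not the report's optimized operators):

```python
import numpy as np
from math import factorial

def taylor_fd_coeffs(offsets, order):
    """Taylor-series finite-difference weights for the given derivative
    order on a stencil of integer offsets (unit grid spacing)."""
    offsets = np.asarray(offsets, dtype=float)
    A = np.vander(offsets, increasing=True).T   # row k holds offsets**k
    b = np.zeros(len(offsets))
    b[order] = factorial(order)                 # match only the k-th term
    return np.linalg.solve(A, b)

# Fourth-order central stencil for the second derivative
w = taylor_fd_coeffs([-2, -1, 0, 1, 2], 2)
print(np.round(w * 12))   # the classic [-1, 16, -30, 16, -1] / 12 weights
```

The optimized operators the report investigates replace the moment conditions above with fits that trade formal order of accuracy for lower error over the source bandwidth.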
NASA Astrophysics Data System (ADS)
Kovalets, Ivan V.; Efthimiou, George C.; Andronopoulos, Spyros; Venetsanos, Alexander G.; Argyropoulos, Christos D.; Kakosimos, Konstantinos E.
2018-05-01
In this work, we present an inverse computational method for the identification of the location, start time, duration and quantity of emitted substance of an unknown air pollution source of finite time duration in an urban environment. We considered a problem of transient pollutant dispersion under stationary meteorological fields, which is a reasonable assumption for the assimilation of available concentration measurements within 1 h from the start of an incident. We optimized the calculation of the source-receptor function by developing a method which requires integrating as many backward adjoint equations as the available measurement stations. This resulted in high numerical efficiency of the method. The source parameters are computed by maximizing the correlation function of the simulated and observed concentrations. The method has been integrated into the CFD code ADREA-HF and it has been tested successfully by performing a series of source inversion runs using the data of 200 individual realizations of puff releases, previously generated in a wind tunnel experiment.
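The correlation-maximizing identification step can be sketched with random stand-ins for the adjoint-derived source-receptor functions (all arrays below are synthetic, not ADREA-HF output): pick the candidate source whose simulated concentrations correlate best with the observations, then recover the emitted quantity by least squares.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical unit-release source-receptor functions: concentrations at
# 4 sensors over 30 time steps for each of 5 candidate source locations.
G = rng.random((5, 4, 30))

# Synthetic observations: source 3 emitting at rate 2.5, plus sensor noise.
true_src, true_q = 3, 2.5
obs = true_q * G[true_src] + 0.01 * rng.normal(size=(4, 30))

# Maximize the correlation between simulated and observed concentrations,
# then estimate the emission quantity by a one-parameter least squares.
corrs = [np.corrcoef(G[s].ravel(), obs.ravel())[0, 1] for s in range(5)]
src = int(np.argmax(corrs))
q = float(obs.ravel() @ G[src].ravel() / (G[src].ravel() @ G[src].ravel()))
print(src, round(q, 1))
```

Because the correlation is computed per candidate, only one forward (or adjoint) field per measurement station is needed, which is the source of the method's numerical efficiency.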
NASA Astrophysics Data System (ADS)
Koshimura, S.; Hino, R.; Ohta, Y.; Kobayashi, H.; Musa, A.; Murashima, Y.
2014-12-01
With the use of modern computing power and advanced sensor networks, a project is underway to establish a new system of real-time tsunami inundation forecasting, damage estimation and mapping to enhance society's resilience in the aftermath of a major tsunami disaster. The system consists of a fusion of real-time crustal deformation monitoring/fault model estimation by Ohta et al. (2012), high-performance real-time tsunami propagation/inundation modeling with NEC's vector supercomputer SX-ACE, damage/loss estimation models (Koshimura et al., 2013), and geo-informatics. After a major (near-field) earthquake is triggered, the first response of the system is to identify the tsunami source model by applying the RAPiD algorithm (Ohta et al., 2012) to observed RTK-GPS time series at GEONET sites in Japan. As performed on the data obtained during the 2011 Tohoku event, we assume less than 10 minutes as the acquisition time of the source model. Given the tsunami source, the system moves on to running the tsunami propagation and inundation model, which was optimized on the vector supercomputer SX-ACE, to estimate time series of tsunami heights at offshore/coastal tide gauges and determine tsunami travel and arrival times, the extent of the inundation zone, and the maximum flow depth distribution. The implemented tsunami numerical model is based on the non-linear shallow-water equations discretized by the finite difference method. The merged bathymetry and topography grids are prepared with 10 m resolution to better estimate tsunami inland penetration. Given the maximum flow depth distribution, the system performs GIS analysis to determine the numbers of exposed population and structures using census data, then estimates the numbers of potential deaths and damaged structures by applying tsunami fragility curves (Koshimura et al., 2013). Once the tsunami source model is determined, the model is expected to complete the estimation within 10 minutes.
The results are disseminated as mapping products to responders and stakeholders, e.g. national and regional municipalities, to be utilized for their emergency/response activities. In 2014, the system is verified through the case studies of 2011 Tohoku event and potential earthquake scenarios along Nankai Trough with regard to its capability and robustness.
Modeling pollen time series using seasonal-trend decomposition procedure based on LOESS smoothing
NASA Astrophysics Data System (ADS)
Rojo, Jesús; Rivero, Rosario; Romero-Morte, Jorge; Fernández-González, Federico; Pérez-Badia, Rosa
2017-02-01
Analysis of airborne pollen concentrations provides valuable information on plant phenology and is thus a useful tool in agriculture—for predicting harvests in crops such as the olive and for deciding when to apply phytosanitary treatments—as well as in medicine and the environmental sciences. Variations in airborne pollen concentrations, moreover, are indicators of changing plant life cycles. By modeling pollen time series, we can not only identify the variables influencing pollen levels but also predict future pollen concentrations. In this study, airborne pollen time series were modeled using STL, a seasonal-trend decomposition procedure based on LOcally wEighted Scatterplot Smoothing (LOESS). The data series—daily Poaceae pollen concentrations over the period 2006-2014—was broken up into seasonal and residual (stochastic) components. The seasonal component was compared with data on Poaceae flowering phenology obtained by field sampling. Residuals were fitted to a model generated from daily temperature and rainfall values, and daily pollen concentrations, using partial least squares regression (PLSR). This method was then applied to predict daily pollen concentrations for 2014 (independent validation data) using results for the seasonal component of the time series and estimates of the residual component for the period 2006-2013. Correlation between predicted and observed values was r = 0.79 for the pre-peak period (i.e., the period prior to the peak pollen concentration) and r = 0.63 for the post-peak period. Separate analysis of each of the components of the pollen data series enables the sources of variability to be identified more accurately than by analysis of the original non-decomposed data series, and for this reason, this procedure has proved to be a suitable technique for analyzing the main environmental factors influencing airborne pollen concentrations.
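The decompose-then-regress idea can be sketched on synthetic data. This is a simplified stand-in: the seasonal term is taken as a day-of-year mean rather than LOESS smoothing, the residual model uses ordinary least squares with a single weather predictor rather than PLSR, and all series are invented:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic daily pollen series over 8 "years" of 100 days each:
# a fixed seasonal curve plus weather-driven residual variation.
years, ndays = 8, 100
doy = np.tile(np.arange(ndays), years)
season = 100 * np.exp(-0.5 * ((doy - 50) / 10.0) ** 2)
temp = rng.normal(size=years * ndays)
y = season + 5.0 * temp + rng.normal(size=years * ndays)

# Seasonal component as the day-of-year mean; residual fitted on weather.
seasonal = np.array([y[doy == d].mean() for d in range(ndays)])[doy]
resid = y - seasonal
beta = np.polyfit(temp, resid, 1)[0]

# Prediction recombines the two components, as in the paper's scheme.
pred = seasonal + beta * temp
r = np.corrcoef(pred, y)[0, 1]
print(r > 0.9)
```

Separating the components first lets the regression see only the weather-driven variability, which is why the decomposed model attributes sources of variation more cleanly than a fit to the raw series.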
ERIC Educational Resources Information Center
Baum, Sandy; Payea, Kathleen
This report presents annual data on the amount of financial assistance--grants, loans, and work-study--available to students to help pay for postsecondary education. The College Board began this data series in 1983 to track the value of financial aid over time from federal, state, and institutional sources. The report also contains information on…
Using the Microcomputer to Generate Materials for Bibliographic Instruction.
ERIC Educational Resources Information Center
Hendley, Gaby G.
Guide-worksheets were developed on a word processor in a high school library for bibliographic instruction of English and social studies students to cover the following reference sources: Facts on File; Social Issues Resource Series (S.I.R.S.); Editorial Research Reports; Great Contemporary Issues (New York Times), which also includes Facts on…
The Open Source Movement, Publishing, and the Dissemination of Knowledge.
ERIC Educational Resources Information Center
Young, Bob
A series of changes in intellectual property law over the last 30 years has made it more difficult for researchers, scientists, authors, and artists to cooperate and collaborate on critically important projects. At the same time, the advance of communications technologies, specifically the Internet, promises to foster an explosion in creativity,…
ERIC Educational Resources Information Center
College Board, New York, NY.
This report presents annual data on the amount of financial assistance--grants, loans, and work-study--available to help students pay for postsecondary education. The College Board began this data series in 1983 to track the value of such aid over time from federal, state, and institutional sources. This year, information on federal education tax…
Atmospheric dust events in Central Asia: Relationship to wind, soil type, and land use
USDA-ARS?s Scientific Manuscript database
Xinjiang Province is one of the most important source regions of atmospheric dust in China. Spatial-temporal characteristics of dust events in the region were investigated by time series analysis of annual dust event frequency and meteorological data collected at 101 stations in Xinjiang Province fr...
A Comparative Analysis of Juvenile Book Review Media.
ERIC Educational Resources Information Center
Witucke, A. Virginia
This study of book reviews takes an objective look at the major sources that review children's books. Periodicals examined are Booklist, Bulletin of the Center for Children's Books, Horn Book, New York Times Book Review, and School Library Journal. Presented in a series of eight tables, the report examines reviews of 30 titles published between…
NASA Astrophysics Data System (ADS)
Sovardi, Carlo; Jaensch, Stefan; Polifke, Wolfgang
2016-09-01
A numerical method to concurrently characterize both aeroacoustic scattering and noise sources at a duct singularity is presented. This approach combines Large Eddy Simulation (LES) with techniques of System Identification (SI): In a first step, a highly resolved LES with external broadband acoustic excitation is carried out. Subsequently, time series data extracted from the LES are post-processed by means of SI to model both acoustic propagation and noise generation. The present work studies the aeroacoustic characteristics of an orifice placed in a duct at low flow Mach numbers with the "LES-SI" method. Parametric SI based on the Box-Jenkins mathematical structure is employed, with a prediction error approach that utilizes correlation analysis of the output residuals to avoid overfitting. Uncertainties of model parameters due to the finite length of time series are quantified in terms of confidence intervals. Numerical results for acoustic scattering matrices and power spectral densities of broad-band noise are validated against experimental measurements over a wide range of frequencies below the cut-off frequency of the duct.

Earth's Surface Displacements from the GPS Time Series
NASA Astrophysics Data System (ADS)
Haritonova, D.; Balodis, J.; Janpaule, I.; Morozova, K.
2015-11-01
The GPS observations of both Latvian permanent GNSS networks, EUPOS®-Riga and LatPos, have been collected for a period of 8 years, from 2007 to 2014. Local surface displacements have been derived from the obtained coordinate time series, eliminating different impact sources. The Bernese software is used for data processing. The EUREF Permanent Network (EPN) stations in the surroundings of Latvia are selected as fiducial stations. The results have shown a positive tendency of vertical displacements in the western part of Latvia, where station heights are increasing, while negative velocities are observed in the central and eastern parts. Station vertical velocities range within 4 mm/year. In the case of horizontal displacements, site velocities are up to 1 mm/year and mostly oriented to the south. The obtained results have been compared with data from the deformation model NKG_RF03vel. Additionally, the purpose of this study is to analyse GPS time series obtained using two different data processing strategies: Precise Point Positioning (PPP) and estimation of station coordinates relative to the positions of fiducial stations, also known as Differential GNSS.
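Station velocities such as those quoted above are typically obtained from a linear-plus-annual-harmonic fit to the coordinate time series; a sketch on synthetic data (the 4 mm/yr uplift and noise level below are invented, loosely mimicking a station height series):

```python
import numpy as np

rng = np.random.default_rng(5)

# Eight years of synthetic daily station heights (mm): a 4 mm/yr uplift
# plus an annual signal and white noise.
t = np.arange(0, 8, 1 / 365.25)               # time in years
h = 4.0 * t + 2.0 * np.sin(2 * np.pi * t) + rng.normal(scale=3.0, size=t.size)

# Velocity from a linear + annual-harmonic least-squares fit.
X = np.column_stack([np.ones_like(t), t,
                     np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
beta, *_ = np.linalg.lstsq(X, h, rcond=None)
print(round(beta[1], 1))   # estimated vertical rate, mm/yr
```

Estimating the annual terms jointly with the trend prevents the seasonal signal from leaking into the velocity, which matters when velocities of a few mm/yr are the quantity of interest.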
Quentin, Wilm; Neubauer, Simone; Leidl, Reiner; König, Hans-Helmut
2007-01-01
This paper reviews the international literature that employed time-series analysis to evaluate the effects of advertising bans on aggregate consumption of cigarettes or tobacco. A systematic search of the literature was conducted. Three groups of studies representing analyses of advertising bans in the U.S.A., in other countries and in 22 OECD countries were defined. The estimated effects of advertising bans and their significance were analysed. 24 studies were identified. They used a wide array of explanatory variables, models, estimating methods and data sources. 18 studies found a negative effect of an advertising ban on aggregate consumption, but only ten of these studies found a significant effect. Two studies using data from 22 OECD countries suggested that partial bans would have little or no influence on aggregate consumption, whereas complete bans would significantly reduce consumption. The results imply that advertising bans have a negative but sometimes only modest impact on consumption, and that complete bans can be expected to be more effective. Because of the methodological limitations of analysing the effects of advertising bans with time-series approaches, different approaches should also be used in the future.
Multifractality and Network Analysis of Phase Transition
Li, Wei; Yang, Chunbin; Han, Jihui; Su, Zhu; Zou, Yijiang
2017-01-01
Many models and real complex systems possess critical thresholds at which the systems shift dramatically from one state to another. The discovery of early warnings in the vicinity of critical points is of great importance for estimating how far systems are from their critical states. Multifractal Detrended Fluctuation Analysis (MF-DFA) and the visibility graph method have been employed to investigate the multifractal and geometrical properties of the magnetization time series of the two-dimensional Ising model. Multifractality of the time series near the critical point has been uncovered from the generalized Hurst exponents and the singularity spectrum. Both long-term correlation and a broad probability density function are identified as the sources of multifractality. The heterogeneous nature of the networks constructed from the magnetization time series validates their fractal properties. The evolution of the topological quantities of the visibility graph, along with the variation of multifractality, serves as a new early warning of phase transition. These methods and results may provide new insights into the analysis of phase transition problems and can be used as early warnings for a variety of complex systems. PMID:28107414
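MF-DFA itself is straightforward to sketch: integrate the series into a profile, detrend it segment by segment, and read the generalized Hurst exponent h(q) from how the q-th order fluctuation grows with segment size. The example below uses uncorrelated noise rather than Ising magnetization data, so a single h(2) near 0.5 is expected:

```python
import numpy as np

def mfdfa_hurst(x, q, scales):
    """Generalized Hurst exponent h(q) via MF-DFA with linear detrending."""
    profile = np.cumsum(x - np.mean(x))
    fq = []
    for s in scales:
        nseg = len(profile) // s
        f2 = np.empty(nseg)
        for v in range(nseg):
            seg = profile[v * s:(v + 1) * s]
            tt = np.arange(s)
            trend = np.polyval(np.polyfit(tt, seg, 1), tt)
            f2[v] = np.mean((seg - trend) ** 2)     # segment variance
        fq.append(np.mean(f2 ** (q / 2.0)) ** (1.0 / q))
    return np.polyfit(np.log(scales), np.log(fq), 1)[0]

rng = np.random.default_rng(6)
x = rng.normal(size=50000)                     # uncorrelated test noise
scales = [16, 32, 64, 128, 256]
h2 = mfdfa_hurst(x, 2, scales)
print(round(h2, 1))   # near 0.5: no long-term correlation
```

Multifractality shows up as a spread of h(q) across different q; for the monofractal noise above, h(q) is essentially flat.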
Review of current GPS methodologies for producing accurate time series and their error sources
NASA Astrophysics Data System (ADS)
He, Xiaoxing; Montillet, Jean-Philippe; Fernandes, Rui; Bos, Machiel; Yu, Kegen; Hua, Xianghong; Jiang, Weiping
2017-05-01
The Global Positioning System (GPS) is an important tool to observe and model geodynamic processes such as plate tectonics and post-glacial rebound. In the last three decades, GPS has seen tremendous advances in the precision of the measurements, which allow researchers to study geophysical signals through a careful analysis of daily time series of GPS receiver coordinates. However, the GPS observations contain errors and the time series can be described as the sum of a real signal and noise. The signal itself can again be divided into station displacements due to geophysical causes and displacements due to disturbing factors. Examples of the latter are errors in the realization and stability of the reference frame and corrections due to ionospheric and tropospheric delays and GPS satellite orbit errors. There is an increasing demand for detecting millimeter to sub-millimeter level ground displacement signals in order to further understand regional-scale geodetic phenomena, which requires further improvements in the sensitivity of the GPS solutions. This paper provides a review spanning over 25 years of advances in processing strategies, error mitigation methods and noise modeling for the processing and analysis of GPS daily position time series. The processing of the observations is described step by step, mainly for three different strategies, in order to explain the weaknesses and strengths of the existing methodologies. In particular, we focus on the choice of the stochastic model in the GPS time series, which directly affects the estimation of the functional model including, for example, tectonic rates, seasonal signals and co-seismic offsets. Moreover, the geodetic community continues to develop computational methods to fully automate all phases of GPS time series analysis.
This idea is greatly motivated by the large number of GPS receivers installed around the world for diverse applications, ranging from surveying small deformations of civil engineering structures (e.g., subsidence of a highway bridge) to the detection of particular geophysical signals.
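To illustrate why the review stresses the choice of stochastic model, the sketch below (our own toy experiment, not taken from the paper) estimates a linear rate from series contaminated with white noise versus a random walk, an extreme case of temporally correlated noise; the rate estimates scatter far more under correlated noise, which is why a purely white-noise assumption understates rate uncertainties.

```python
import random

def ols_slope(y):
    """Least-squares slope of y against t = 0, 1, ..., n-1."""
    n = len(y)
    t_mean = (n - 1) / 2.0
    y_mean = sum(y) / n
    num = sum((t - t_mean) * (yi - y_mean) for t, yi in enumerate(y))
    den = sum((t - t_mean) ** 2 for t in range(n))
    return num / den

def simulate(rate, n, noise, rng):
    """Linear trend plus either white noise or a random walk (an extreme
    case of temporally correlated 'colored' noise)."""
    series, walk = [], 0.0
    for t in range(n):
        if noise == "white":
            eps = rng.gauss(0.0, 1.0)
        else:                      # random walk: errors accumulate
            walk += rng.gauss(0.0, 1.0)
            eps = walk
        series.append(rate * t + eps)
    return series

rng = random.Random(42)
white = [ols_slope(simulate(0.1, 200, "white", rng)) for _ in range(50)]
walk = [ols_slope(simulate(0.1, 200, "walk", rng)) for _ in range(50)]
spread = lambda xs: max(xs) - min(xs)   # spread(walk) >> spread(white)
```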
A graph-based approach to detect spatiotemporal dynamics in satellite image time series
NASA Astrophysics Data System (ADS)
Guttler, Fabio; Ienco, Dino; Nin, Jordi; Teisseire, Maguelonne; Poncelet, Pascal
2017-08-01
Enhancing the frequency of satellite acquisitions represents a key issue for the Earth Observation community nowadays. Repeated observations are crucial for monitoring purposes, particularly when intra-annual processes should be taken into account. Time series of images constitute a valuable source of information in these cases. The goal of this paper is to propose a new methodological framework to automatically detect and extract spatiotemporal information from satellite image time series (SITS). Existing methods dealing with such kinds of data are usually classification-oriented and cannot provide information about evolutions and temporal behaviors. In this paper we propose a graph-based strategy that combines object-based image analysis (OBIA) with data mining techniques. Image objects computed at each individual timestamp are connected across the time series, generating a set of evolution graphs. Each evolution graph is associated with a particular area within the study site and stores information about its temporal evolution. Such information can be explored in depth at the evolution graph scale or used to compare the graphs and supply a general picture at the study site scale. We validated our framework on two study sites located in the South of France and involving different types of natural, semi-natural and agricultural areas. The results obtained from a Landsat SITS support the quality of the methodological approach and illustrate how the framework can be employed to extract and characterize spatiotemporal dynamics.
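A minimal sketch of the linking step behind such evolution graphs, under our own simplifying assumptions (objects are sets of pixel coordinates; objects at consecutive timestamps are connected when their footprints overlap sufficiently relative to the smaller object):

```python
def link_objects(objs_t0, objs_t1, min_overlap=0.3):
    """Connect segmented image objects at consecutive timestamps when
    their pixel footprints overlap enough (relative to the smaller)."""
    edges = []
    for i, a in enumerate(objs_t0):
        for j, b in enumerate(objs_t1):
            inter = len(a & b)
            if inter and inter / min(len(a), len(b)) >= min_overlap:
                edges.append((i, j))
    return edges

def evolution_links(series_of_objects):
    """One edge list per consecutive pair of timestamps; chains of these
    links are the raw material of the evolution graphs."""
    return [link_objects(series_of_objects[k], series_of_objects[k + 1])
            for k in range(len(series_of_objects) - 1)]

t0 = [{(0, 0), (0, 1)}, {(5, 5), (5, 6), (6, 5)}]   # two objects at t0
t1 = [{(0, 1), (1, 1)}, {(9, 9)}]                   # two objects at t1
links = evolution_links([t0, t1])
```

Following connected chains of such links through time yields one graph per area, from which temporal behavior can then be characterized.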
NASA Astrophysics Data System (ADS)
Wu, S.; Yan, Y.; Du, Z.; Zhang, F.; Liu, R.
2017-10-01
The ocean carbon cycle has a significant influence on global climate, and is commonly evaluated using time-series satellite-derived CO2 flux data. Location-aware and globe-based visualization is an important technique for analyzing and presenting the evolution of climate change. To achieve realistic simulation of the spatiotemporal dynamics of ocean carbon, a cloud-driven digital earth platform is developed to support the interactive analysis and display of multi-geospatial data, and an original visualization method based on our digital earth is proposed to demonstrate the spatiotemporal variations of carbon sinks and sources using time-series satellite data. Specifically, a volume rendering technique using half-angle slicing and a particle system is implemented to dynamically display the released or absorbed CO2 gas. To enable location-aware visualization within the virtual globe, we present a 3D particle-mapping algorithm to render particle-slicing textures onto geospace. In addition, a GPU-based interpolation framework using CUDA during real-time rendering is designed to obtain smooth effects in both spatial and temporal dimensions. To demonstrate the capabilities of the proposed method, a series of satellite data is applied to simulate the air-sea carbon cycle in the China Sea. The results show that the suggested strategies provide realistic simulation effects and acceptable interactive performance on the digital earth.
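The temporal half of the interpolation idea can be sketched on the CPU (the paper's implementation is GPU/CUDA-based; this toy version, with invented 2x2 grids, only shows the per-cell linear blend between two acquisition dates that produces smooth animation between frames):

```python
def lerp_frames(frame_a, frame_b, alpha):
    """Per-cell linear blend of two gridded flux frames; alpha in [0, 1]
    is the normalized render time between the two acquisition dates."""
    return [[(1.0 - alpha) * a + alpha * b for a, b in zip(ra, rb)]
            for ra, rb in zip(frame_a, frame_b)]

jan = [[1.0, 2.0], [3.0, 4.0]]     # invented 2x2 CO2 flux grids
feb = [[3.0, 2.0], [1.0, 0.0]]
mid = lerp_frames(jan, feb, 0.5)   # frame rendered halfway between dates
```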
Ye, Yu; Kerr, William C
2011-01-01
To explore various model specifications in estimating relationships between liver cirrhosis mortality rates and per capita alcohol consumption in aggregate-level cross-section time-series data. Using a series of liver cirrhosis mortality rates from 1950 to 2002 for 47 U.S. states, the effects of alcohol consumption were estimated from pooled autoregressive integrated moving average (ARIMA) models and 4 types of panel data models: generalized estimating equation, generalized least square, fixed effect, and multilevel models. Various specifications of error term structure under each type of model were also examined. Different approaches controlling for time trends and for using concurrent or accumulated consumption as predictors were also evaluated. When cirrhosis mortality was predicted by total alcohol, highly consistent estimates were found between ARIMA and panel data analyses, with an average overall effect of 0.07 to 0.09. Less consistent estimates were derived using spirits, beer, and wine consumption as predictors. When multiple geographic time series are combined as panel data, none of the existing models can accommodate all sources of heterogeneity, so any type of panel model must employ some form of generalization. Different types of panel data models should thus be estimated to examine the robustness of findings. We also suggest cautious interpretation when beverage-specific volumes are used as predictors. Copyright © 2010 by the Research Society on Alcoholism.
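One of the panel specifications mentioned, the fixed-effect model, can be illustrated with a within (demeaning) estimator; the two "states" and the 0.08 effect below are synthetic numbers chosen to echo the reported 0.07-0.09 range, not the study's data:

```python
def within_estimator(groups):
    """Fixed-effects ('within') slope: demean y and x inside each group
    (state), which removes the state-specific intercept, then pool the
    demeaned observations in a single regression through the origin."""
    num = den = 0.0
    for xs, ys in groups:
        xm = sum(xs) / len(xs)
        ym = sum(ys) / len(ys)
        for x, y in zip(xs, ys):
            num += (x - xm) * (y - ym)
            den += (x - xm) ** 2
    return num / den

# two hypothetical states with different baseline mortality but the same
# 0.08 response of cirrhosis mortality to per-capita consumption
xa = [8.0, 9.0, 10.0]
xb = [2.0, 3.0, 4.0]
state_a = (xa, [5.0 + 0.08 * x for x in xa])
state_b = (xb, [12.0 + 0.08 * x for x in xb])
beta = within_estimator([state_a, state_b])
```

Demeaning removes the cross-state level differences, which is exactly the heterogeneity a pooled regression would otherwise confound with the consumption effect.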
Data visualization in interactive maps and time series
NASA Astrophysics Data System (ADS)
Maigne, Vanessa; Evano, Pascal; Brockmann, Patrick; Peylin, Philippe; Ciais, Philippe
2014-05-01
State-of-the-art data visualization has little in common with the plots and maps we used a few years ago. Many open-source tools are now available to provide access to scientific data and implement accessible, interactive, and flexible web applications. Here we will present a web site opened in November 2013 to create custom global and regional maps and time series from research models and datasets. For maps, we explore and get access to data sources from a THREDDS Data Server (TDS) with the OGC WMS protocol (using the ncWMS implementation), then create interactive maps with the OpenLayers javascript library and extra information layers from a GeoServer. Maps become dynamic, zoomable, synchronously connected to each other, and exportable to Google Earth. For time series, we extract data from a TDS with the NetCDF Subset Service (NCSS), then display interactive graphs with a custom library based on the Data Driven Documents javascript library (D3.js). This time series application provides dynamic functionalities such as interpolation, interactive zoom on different axes, display of point values, and export to different formats. These tools were implemented for the Global Carbon Atlas (http://www.globalcarbonatlas.org): a web portal to explore, visualize, and interpret global and regional carbon fluxes from various model simulations, arising from both human activities and natural processes, a work led by the Global Carbon Project.
A new algorithm for automatic Outlier Detection in GPS Time Series
NASA Astrophysics Data System (ADS)
Cannavo', Flavio; Mattia, Mario; Rossi, Massimo; Palano, Mimmo; Bruno, Valentina
2010-05-01
Nowadays continuous GPS time series are considered a crucial product of GPS permanent networks, useful in many geoscience fields, such as active tectonics, seismology, crustal deformation and volcano monitoring (Altamimi et al. 2002, Elósegui et al. 2006, Aloisi et al. 2009). Although GPS data elaboration software has increased in reliability, the time series are still affected by different kinds of noise, from intrinsic noise (e.g. tropospheric delay) to un-modeled noise (e.g. cycle slips, satellite faults, parameter changes). Typically, GPS time series present characteristic noise that is a linear combination of white noise and correlated colored noise, and this characteristic is fractal in the sense that it is evident at every considered time scale or sampling rate. The un-modeled noise sources result in spikes, outliers and steps. These kinds of errors can appreciably influence the estimation of the velocities of the monitored sites. Outlier detection in generic time series is a widely treated problem in the literature (Wei, 2005), but is not fully developed for the specific kind of GPS series. We propose a robust automatic procedure for cleaning GPS time series of outliers and, especially for long daily series, of steps due to strong seismic or volcanic events or mere instrumentation changes such as antenna and receiver upgrades. The procedure is basically divided into two steps: a first step for colored noise reduction and a second step for outlier detection through adaptive series segmentation. Both algorithms present novel ideas and are nearly unsupervised. In particular, we propose an algorithm to estimate an autoregressive model for the colored noise in GPS time series in order to subtract the effect of non-Gaussian noise from the series. This step is useful for the subsequent step (i.e. adaptive segmentation), which requires the hypothesis of Gaussian noise.
The proposed algorithms are tested in a benchmark case study and the results confirm that the algorithms are effective and reasonable.
Bibliography:
- Aloisi M., A. Bonaccorso, F. Cannavò, S. Gambino, M. Mattia, G. Puglisi, E. Boschi, A new dyke intrusion style for the Mount Etna May 2008 eruption modelled through continuous tilt and GPS data, Terra Nova, Volume 21, Issue 4, Pages 316-321, doi:10.1111/j.1365-3121.2009.00889.x (August 2009)
- Altamimi Z., Sillard P., Boucher C., ITRF2000: A new release of the International Terrestrial Reference Frame for earth science applications, J. Geophys. Res.-Solid Earth, 107 (B10), art. no. 2214 (Oct 2002)
- Elósegui P., J. L. Davis, D. Oberlander, R. Baena, and G. Ekström, Accuracy of high-rate GPS for seismology, Geophys. Res. Lett., 33, L11308, doi:10.1029/2006GL026065 (2006)
- Wei W. S., Time Series Analysis: Univariate and Multivariate Methods, Addison Wesley (2nd edition), ISBN-10: 0321322169 (July 2005)
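As an illustrative stand-in for the two-step procedure (not the authors' adaptive-segmentation algorithm), the sketch below whitens a series with an assumed AR(1) coefficient and then flags residuals that exceed a robust median/MAD threshold; note that the sample right after a spike is also flagged, since it too whitens poorly:

```python
def flag_outliers(series, phi=0.9, k=5.0):
    """Two-step cleaning in the spirit of the abstract: (1) whiten the
    temporally correlated (colored) noise with an AR(1) difference, then
    (2) flag residuals beyond k robust sigmas (median/MAD)."""
    resid = [series[t] - phi * series[t - 1] for t in range(1, len(series))]
    med = sorted(resid)[len(resid) // 2]
    mad = sorted(abs(r - med) for r in resid)[len(resid) // 2]
    sigma = 1.4826 * mad or 1e-12        # MAD -> sigma for Gaussian noise
    return [t + 1 for t, r in enumerate(resid) if abs(r - med) > k * sigma]

series = [0.1 * t for t in range(50)]    # smooth daily position drift
series[25] += 10.0                       # one large spike
flags = flag_outliers(series)            # the spike and its AR(1) echo
```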
Chan, Emily H; Sahai, Vikram; Conrad, Corrie; Brownstein, John S
2011-05-01
A variety of obstacles including bureaucracy and lack of resources have interfered with timely detection and reporting of dengue cases in many endemic countries. Surveillance efforts have turned to modern data sources, such as Internet search queries, which have been shown to be effective for monitoring influenza-like illnesses. However, few have evaluated the utility of web search query data for other diseases, especially those of high morbidity and mortality or for which no vaccine exists. In this study, we aimed to assess whether web search queries are a viable data source for the early detection and monitoring of dengue epidemics. Bolivia, Brazil, India, Indonesia and Singapore were chosen for analysis based on available data and adequate search volume. For each country, a univariate linear model was then built by fitting a time series of the fraction of Google search query volume for specific dengue-related queries from that country against a time series of official dengue case counts for a time-frame within 2003-2010. The specific combination of queries used was chosen to maximize model fit. Spurious spikes in the data were also removed prior to model fitting. The final models, fit using a training subset of the data, were cross-validated against both the overall dataset and a holdout subset of the data. All models were found to fit the data quite well, with validation correlations ranging from 0.82 to 0.99. Web search query data were found to be capable of tracking dengue activity in Bolivia, Brazil, India, Indonesia and Singapore. Whereas traditional dengue data from official sources are often not available until after some substantial delay, web search query data are available in near real-time. These data represent a valuable complement to traditional dengue surveillance.
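The core of such a univariate model, fitting case counts against query volume on a training window and validating the correlation on a held-out tail, can be sketched as follows; the perfectly linear synthetic data are ours (the real pipeline also selects queries and removes spurious spikes), so the holdout correlation is exactly 1.0 rather than the reported 0.82-0.99:

```python
def fit_and_validate(queries, cases, split):
    """Fit cases ~ a*queries + b by least squares on a training window,
    then return (a, b, Pearson r) with r computed on the held-out tail."""
    def lstsq(x, y):
        n = len(x)
        xm, ym = sum(x) / n, sum(y) / n
        a = (sum((xi - xm) * (yi - ym) for xi, yi in zip(x, y))
             / sum((xi - xm) ** 2 for xi in x))
        return a, ym - a * xm

    a, b = lstsq(queries[:split], cases[:split])
    pred = [a * q + b for q in queries[split:]]
    obs = cases[split:]
    pm, om = sum(pred) / len(pred), sum(obs) / len(obs)
    cov = sum((p - pm) * (o - om) for p, o in zip(pred, obs))
    norm = (sum((p - pm) ** 2 for p in pred)
            * sum((o - om) ** 2 for o in obs)) ** 0.5
    return a, b, cov / norm

queries = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]   # weekly query fraction
cases = [10.0 * q + 5.0 for q in queries]            # official case counts
a, b, r = fit_and_validate(queries, cases, split=5)
```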
Three-dimensional time reversal communications in elastic media
Anderson, Brian E.; Ulrich, Timothy J.; Le Bas, Pierre-Yves; ...
2016-02-23
Our letter presents a series of vibrational communication experiments, using time reversal, conducted on a set of cast iron pipes. Time reversal has been used to provide robust, private, and clean communications in many underwater acoustic applications. Also, the use of time reversal to communicate along sections of pipes and through a wall is demonstrated here in order to overcome the complications of dispersion and multiple scattering. These demonstrations utilize a single source transducer and a single sensor, a triaxial accelerometer, enabling multiple channels of simultaneous communication streams to a single location.
A Bayesian Approach to Systematic Error Correction in Kepler Photometric Time Series
NASA Astrophysics Data System (ADS)
Jenkins, Jon Michael; VanCleve, J.; Twicken, J. D.; Smith, J. C.; Kepler Science Team
2011-01-01
In order for the Kepler mission to achieve its required 20 ppm photometric precision for 6.5 hr observations of 12th magnitude stars, the Presearch Data Conditioning (PDC) software component of the Kepler Science Processing Pipeline must reduce systematic errors in flux time series to the limit of stochastic noise for errors with time-scales less than three days, without smoothing or over-fitting away the transits that Kepler seeks. The current version of PDC co-trends against ancillary engineering data and Pipeline-generated data using essentially a least squares (LS) approach. This approach is successful for quiet stars when all sources of systematic error have been identified. If the stars are intrinsically variable or some sources of systematic error are unknown, LS will nonetheless attempt to explain all of a given time series, not just the part the model can explain well. Negative consequences can include loss of astrophysically interesting signal, and injection of high-frequency noise into the result. As a remedy, we present a Bayesian Maximum A Posteriori (MAP) approach, in which a subset of intrinsically quiet and highly correlated stars is used to establish the probability density function (PDF) of robust fit parameters in a diagonalized basis. The PDFs then determine a "reasonable" range for the fit parameters for all stars, and brake the runaway fitting that can distort signals and inject noise. We present a closed-form solution for Gaussian PDFs, and show examples using publicly available Quarter 1 Kepler data. A companion poster (Van Cleve et al.) shows applications and discusses current work in more detail. Kepler was selected as the 10th mission of the Discovery Program. Funding for this mission is provided by NASA, Science Mission Directorate.
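For a single coefficient, the closed-form Gaussian MAP estimate reduces to a precision-weighted blend of the least-squares fit and the prior; the sketch below (our illustration, not the PDC code) shows how a loose prior recovers LS while a tight prior "brakes" the fit toward the ensemble-derived mean:

```python
def map_coefficient(x, y, noise_var, prior_mean, prior_var):
    """Closed-form MAP estimate of c in y ~ c*x with Gaussian noise and a
    Gaussian prior on c.  A very wide prior recovers ordinary least
    squares; a tight prior pulls the estimate toward prior_mean."""
    precision = sum(xi * xi for xi in x) / noise_var + 1.0 / prior_var
    return (sum(xi * yi for xi, yi in zip(x, y)) / noise_var
            + prior_mean / prior_var) / precision

x = [1.0, 2.0, 3.0, 4.0]                 # a co-trending basis vector
y = [2.0 * xi for xi in x]               # a quiet star tracking it at 2x
loose = map_coefficient(x, y, 1.0, 0.0, 1e6)    # ~ least squares
tight = map_coefficient(x, y, 1.0, 0.0, 1e-6)   # pinned near the prior
```

In the pipeline's multi-parameter case the same blend happens per component of the diagonalized basis, with the PDFs learned from the quiet-star ensemble playing the role of the prior here.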
Monitoring Volcano Deformation in the Northernmost Andes with ALOS InSAR Time-Series
NASA Astrophysics Data System (ADS)
Morales Rivera, A. M.; Amelung, F.
2014-12-01
Satellite-based Interferometric Synthetic Aperture Radar (InSAR) is well established as a volcano monitoring tool, providing the opportunity to conduct local and regional surveys to detect and measure volcanic deformation. The signals detected by InSAR on volcanoes can be related to various phenomena, such as volume changes in magmatic reservoirs, compaction of recent deposits, changes in hydrothermal activity, and flank instability. The InSAR time-series method has well-documented examples of these phenomena, including precursory inflation of magma reservoirs months prior to volcanic eruptions, proving its potential for early warning systems. We use data from the ALOS-1 satellite of the Japan Aerospace Exploration Agency (JAXA), which acquired a global L-band data set of nearly 20 acquisitions during 2007-2011, to make an InSAR time-series analysis using the Small Baseline (SBAS) method. Our analysis covers all of the volcanoes in Colombia, Ecuador, and Peru that are cataloged by the Global Volcanism Program. We present results showing time-dependent ground deformation on and near the volcanoes, and present kinematic models to constrain the characteristics of the magmatic sources for the cases in which the deformation is likely related to changes in magma reservoir pressurization.
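One simple kinematic model of the kind used to interpret such signals is the Mogi point source; the sketch below evaluates its standard vertical-displacement formula for an assumed 8 km deep, 10^6 m^3 volume change (illustrative numbers, not results from this study):

```python
from math import pi

def mogi_uz(r, depth, dV, nu=0.25):
    """Vertical surface displacement of a Mogi point source: volume
    change dV at the given depth in an elastic half-space with Poisson
    ratio nu, at horizontal distance r from the source axis."""
    return (1.0 - nu) * dV / pi * depth / (depth ** 2 + r ** 2) ** 1.5

depth, dV = 8000.0, 1.0e6        # assumed: 8 km depth, 1e6 m^3 inflation
uz_center = mogi_uz(0.0, depth, dV)       # a few mm of uplift at center
uz_10km = mogi_uz(10000.0, depth, dV)     # decaying away from the source
```

Fitting the spatial decay of the observed LOS signal with such a model is what constrains source depth and volume change.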
NASA Astrophysics Data System (ADS)
Henderson, S. T.; Pritchard, M. E.
2011-12-01
The Central Andes Volcanic Zone (CVZ) contains many intriguing areas of ongoing crustal deformation detectable with InSAR. Foremost among these are the 1-2 cm/yr radar line-of-sight (LOS) inflations near Uturuncu Volcano in Bolivia and the Lazufre volcanic area spanning the border of Chile and Argentina (Pritchard and Simons 2002). These two deformation sources are intriguing in that they are long-lived (>10 yr), have large diameters (>50 km), and have modeled sources at mid-crustal depths (10-20 km). For Uturuncu, the best-fitting source depths coincide with the seismically imaged Altiplano-Puna Magma Body (e.g. Chmielowski et al. 1999, Zandt et al. 2003). Regional InSAR time series analysis enables the spatial and temporal comparison of the Uturuncu and Lazufre signals with other deformations in a sub-region of the CVZ from 1992 to the present. Our study focuses on volcanic deformation, but we also resolve non-magmatic deformation signals including landslides and salars. The study region benefits from a large InSAR dataset of 631 ERS and ENVISAT interferograms, distributed between two descending tracks and two ascending tracks, covering up to 870 kilometers along the volcanic arc. We employ an inversion method based on the SBAS algorithm (Berardino et al. 2002), but modified to avoid interpolation across dates with incoherent values. This modification effectively deals with the heterogeneous spatial extents and data gaps present in individual interferograms for long tracks. With our time series results we investigate the timing of possible magma migrations and we explore the parameters of forward models that match observations. Results indicate continuing monotonic inflation styles at Uturuncu and Lazufre with maximum LOS uplift at 1.0 cm/yr and 2.5 cm/yr respectively (Pritchard and Simons 2004, Froger et al. 2007, Ruch et al. 2009). We discuss evidence for 2 mm/yr broad LOS deflation collocated with the Uturuncu inflation signal and comment on possible models for its origin.
We also detect nonlinear deformation styles, including an abrupt transition from 5 mm/yr LOS deflation to 5 mm/yr LOS inflation over several years near Cerro Overo in Chile. The cause of this 15 km-diameter deformation is unknown, but it is not obviously related to a salar or other hydrologic signal.
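The essence of an SBAS-style inversion, solving an interferogram network for incremental displacements between acquisitions, can be sketched with plain normal equations (the actual SBAS algorithm uses SVD and handles disconnected subsets; the 4-date network below is a made-up, noise-free example):

```python
def solve(A, b):
    """Tiny Gauss-Jordan elimination for the small normal-equation
    systems arising here (adequate for this sketch)."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(n):
            if r != i and M[r][i]:
                f = M[r][i] / M[i][i]
                M[r] = [a - f * c for a, c in zip(M[r], M[i])]
    return [M[i][n] / M[i][i] for i in range(n)]

def sbas_increments(pairs, phases, n_dates):
    """Least-squares inversion of an interferogram network: unknowns are
    incremental displacements between consecutive acquisitions; an
    interferogram spanning dates (i, j) observes their sum."""
    m = n_dates - 1
    G = [[1.0 if i <= k < j else 0.0 for k in range(m)] for i, j in pairs]
    GtG = [[sum(row[a] * row[c] for row in G) for c in range(m)]
           for a in range(m)]
    Gtd = [sum(row[a] * d for row, d in zip(G, phases)) for a in range(m)]
    return solve(GtG, Gtd)

# dates 0..3; true LOS increments between consecutive dates: 1, 2, -1
pairs = [(0, 1), (1, 2), (2, 3), (0, 2), (1, 3)]
truth = [1.0, 2.0, -1.0]
phases = [sum(truth[i:j]) for i, j in pairs]   # noise-free observations
inc = sbas_increments(pairs, phases, 4)
```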
The RATIO method for time-resolved Laue crystallography
Coppens, Philip; Pitak, Mateusz; Gembicky, Milan; Messerschmidt, Marc; Scheins, Stephan; Benedict, Jason; Adachi, Shin-ichi; Sato, Tokushi; Nozawa, Shunsuke; Ichiyanagi, Kohei; Chollet, Matthieu; Koshihara, Shin-ya
2009-01-01
A RATIO method for analysis of intensity changes in time-resolved pump–probe Laue diffraction experiments is described. The method eliminates the need for scaling the data with a wavelength curve representing the spectral distribution of the source and removes the effect of possible anisotropic absorption. It does not require relative scaling of series of frames and removes errors due to all but very short term fluctuations in the synchrotron beam. PMID:19240334
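The principle can be shown numerically: taking the per-reflection ratio of laser-ON to laser-OFF intensities cancels any multiplicative factor common to both frames. The spectral "curves" and intensity values below are invented for illustration:

```python
def response_ratios(I_on, I_off):
    """Per-reflection ratio of pump-ON to pump-OFF intensities; any
    multiplicative factor common to both frames (spectral curve of the
    source, frame scale, anisotropic absorption) divides out."""
    return [a / b for a, b in zip(I_on, I_off)]

# the same 10% pump-induced change on reflection 1, observed through two
# very different (invented) scaling environments
I_off = [4.0, 9.0, 1.0]
I_on = [4.0, 9.9, 1.0]
curve1 = [0.5, 1.7, 3.0]
curve2 = [2.0, 0.3, 1.1]
r1 = response_ratios([c * f for c, f in zip(curve1, I_on)],
                     [c * f for c, f in zip(curve1, I_off)])
r2 = response_ratios([c * f for c, f in zip(curve2, I_on)],
                     [c * f for c, f in zip(curve2, I_off)])
```

Both scaling environments yield identical ratios, which is why no wavelength curve or relative frame scaling is needed.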
From open source communications to knowledge
NASA Astrophysics Data System (ADS)
Preece, Alun; Roberts, Colin; Rogers, David; Webberley, Will; Innes, Martin; Braines, Dave
2016-05-01
Rapid processing and exploitation of open source information, including social media sources, in order to shorten decision-making cycles, has emerged as an important issue in intelligence analysis in recent years. Through a series of case studies and natural experiments, focussed primarily upon policing and counter-terrorism scenarios, we have developed an approach to information foraging and framing to inform decision making, drawing upon open source intelligence, in particular Twitter, due to its real-time focus and frequent use as a carrier for links to other media. Our work uses a combination of natural language (NL) and controlled natural language (CNL) processing to support information collection from human sensors, linking and schematising of collected information, and the framing of situational pictures. We illustrate the approach through a series of vignettes, highlighting (1) how relatively lightweight and reusable knowledge models (schemas) can rapidly be developed to add context to collected social media data, (2) how information from open sources can be combined with reports from trusted observers, for corroboration or to identify conflicting information; and (3) how the approach supports users operating at or near the tactical edge, to rapidly task information collection and inform decision-making. The approach is supported by bespoke software tools for social media analytics and knowledge management.
Implications on 1 + 1 D Tsunami Runup Modeling due to Time Features of the Earthquake Source
NASA Astrophysics Data System (ADS)
Fuentes, M.; Riquelme, S.; Ruiz, J.; Campos, J.
2018-02-01
The time characteristics of the seismic source are usually neglected in tsunami modeling, due to the difference in the time scales of the two processes. Nonetheless, only a few analytical studies have attempted to explain separately the roles of the rise time and the rupture velocity. In this work, we extend an analytical 1 + 1 D solution for the shoreline motion time series from the static case to the kinematic case, by including both rise time and rupture velocity. Our results show that the static case corresponds to a limit case of null rise time and infinite rupture velocity. Both parameters contribute to shifting the arrival time, but the maximum runup may be affected by very slow ruptures and long rise times. Parametric analysis reveals that the runup is strictly decreasing with the rise time, while it is highly amplified in a certain range of slow rupture velocities. For even lower rupture velocities the tsunami excitation vanishes, while for larger ones the solution quickly approaches the instantaneous case.
Reconstructing surface ocean circulation with 129I time series records from corals
Chang, Ching-Chih; Burr, George S.; Jull, A. J. Timothy; Russell, Joellen L.; Biddulph, Dana; White, Lara; Prouty, Nancy G.; Chen, Yue-Gau; Shen, Chuan-Chou; Zhou, Weijian; Lam, Doan Dinh
2016-01-01
The long-lived radionuclide 129I (half-life: 15.7 × 10^6 yr) is well-known as a useful environmental tracer. At present, the global 129I in surface water is about 1–2 orders of magnitude higher than pre-1960 levels. Since the 1990s, anthropogenic 129I produced from industrial nuclear fuels reprocessing plants has been the primary source of 129I in marine surface waters of the Atlantic and around the globe. Here we present four coral 129I time series records from: 1) Con Dao and 2) Xisha Islands, the South China Sea, 3) Rabaul, Papua New Guinea and 4) Guam. The Con Dao coral 129I record features a sudden increase in 129I in 1959. The Xisha coral shows similar peak values for 129I as the Con Dao coral, punctuated by distinct low values, likely due to the upwelling in the central South China Sea. The Rabaul coral features much more gradual 129I increases in the 1970s, similar to a published record from the Solomon Islands. The Guam coral 129I record contains the largest measured values for any site, with two large peaks, in 1955 and 1959. Nuclear weapons testing was the primary 129I source in the Western Pacific in the latter part of the 20th Century, notably from testing in the Marshall Islands. The Guam 1955 peak and Con Dao 1959 increases are likely from the 1954 Castle Bravo test, and the Operation Hardtack I test is the most likely source of the 1959 peak observed at Guam. Radiogenic iodine found in coral was carried primarily through surface ocean currents. The coral 129I time series data provide a broad picture of the surface distribution and depth penetration of 129I in the Pacific Ocean over the past 60 years.
Improving Photometry and Stellar Signal Preservation with Pixel-Level Systematic Error Correction
NASA Technical Reports Server (NTRS)
Kolodzijczak, Jeffrey J.; Smith, Jeffrey C.; Jenkins, Jon M.
2013-01-01
The Kepler Mission has demonstrated that excellent stellar photometric performance can be achieved using apertures constructed from optimally selected CCD pixels. The clever methods used to correct for systematic errors, while very successful, still have some limitations in their ability to extract long-term trends in stellar flux. They also leave poorly correlated bias sources, such as drifting moiré pattern, uncorrected. We will illustrate several approaches where applying systematic error correction algorithms to the pixel time series, rather than to the co-added raw flux time series, provides significant advantages. Examples include spatially localized determination of time-varying moiré pattern biases, greater sensitivity to radiation-induced pixel sensitivity drops (SPSDs), improved precision of co-trending basis vectors (CBV), and a means of distinguishing stellar variability from co-trending terms even when they are correlated. For the last item, the approach enables physical interpretation of appropriately scaled coefficients derived in the fit of pixel time series to the CBV as linear combinations of various spatial derivatives of the pixel response function (PRF). We demonstrate that the residuals of a fit of so-derived pixel coefficients to various PRF-related components can be deterministically interpreted in terms of physically meaningful quantities, such as the component of the stellar flux time series which is correlated with the CBV, as well as relative pixel gain, proper motion and parallax. The approach also enables us to parameterize and assess the limiting factors in the uncertainties in these quantities.
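The basic pixel-level operation, projecting each pixel's time series onto a co-trending basis vector so that the coefficients form a spatial map, can be sketched as follows (a single basis vector and invented pixel data; the actual pipeline fits many CBVs jointly with robust methods):

```python
def fit_to_basis(pixel_series, cbv):
    """Least-squares coefficient of a single co-trending basis vector
    (CBV) for each pixel's flux time series (fit through the origin)."""
    cc = sum(v * v for v in cbv)
    return [sum(p * v for p, v in zip(series, cbv)) / cc
            for series in pixel_series]

cbv = [1.0, -1.0, 2.0, 0.0]            # shared systematic trend
pixels = [[2.0, -2.0, 4.0, 0.0],       # follows the trend at 2x
          [0.5, -0.5, 1.0, 0.0],       # follows it at 0.5x
          [0.0, 0.0, 0.0, 1.0]]        # orthogonal to the trend
coeffs = fit_to_basis(pixels, cbv)     # spatial map of coefficients
```

It is the spatial pattern of such coefficient maps, compared against PRF derivatives, that carries the physical interpretation discussed in the abstract.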
Where does streamwater come from in low-relief forested watersheds? A dual-isotope approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klaus, J.; McDonnell, J. J.; Jackson, C. R.
The time and geographic sources of streamwater in low-relief watersheds are poorly understood. This is partly due to the difficult combination of low runoff coefficients and often damped streamwater isotopic signals, which precludes traditional hydrograph separation and convolution integral approaches. Here we present a dual-isotope approach involving 18O and 2H of water in a low-angle forested watershed to determine streamwater source components and then build a conceptual model of streamflow generation. We focus on three headwater lowland sub-catchments draining the Savannah River Site in South Carolina, USA. Our results for a 3-year sampling period show that the slopes of the meteoric water lines/evaporation water lines (MWLs/EWLs) of the catchment water sources can be used to extract information on runoff sources in ways not considered before. Our dual-isotope approach was able to identify unique hillslope, riparian and deep groundwater, and streamflow compositions. The streams showed strong evaporative enrichment compared to the local meteoric water line (δ2H = 7.15 · δ18O + 9.28‰), with slopes of 2.52, 2.84, and 2.86. Based on the unique and unambiguous slopes of the EWLs of the different water cycle components and the isotopic time series of the individual components, we were able to show how the riparian zone controls baseflow in this system and how the riparian zone "resets" the stable isotope composition of the observed streams in our low-angle, forested watersheds. Although this approach is limited in terms of quantifying mixing percentages between different end-members, our dual-isotope approach enabled the extraction of hydrologically useful information in a region with little change in individual isotope time series.
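The EWL slopes at the center of the argument are simply least-squares slopes in δ2H-δ18O space; the sketch below recovers a slope from synthetic stream samples placed on a 2.8 evaporation line (invented numbers, chosen to fall in the 2.52-2.86 range the abstract reports):

```python
def isotope_line(d18O, d2H):
    """Least-squares slope and intercept of the d2H-d18O line; an
    evaporation water line has a slope well below the ~8 typical of a
    meteoric water line, which flags evaporative enrichment."""
    n = len(d18O)
    xm, ym = sum(d18O) / n, sum(d2H) / n
    slope = (sum((x - xm) * (y - ym) for x, y in zip(d18O, d2H))
             / sum((x - xm) ** 2 for x in d18O))
    return slope, ym - slope * xm

stream_d18O = [-4.0, -3.0, -2.0, -1.0]              # synthetic samples
stream_d2H = [2.8 * x - 10.0 for x in stream_d18O]  # on a slope-2.8 EWL
slope, intercept = isotope_line(stream_d18O, stream_d2H)
```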
2015-01-08
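The core computation in the dual-isotope comparison above is a slope estimate: regress δ2H on δ18O for a set of samples and compare the fitted evaporation water line (EWL) slope with the local meteoric water line (LMWL) slope of 7.15 quoted in the abstract. A hedged sketch on synthetic samples (an EWL slope of 2.8 is assumed to generate the data; none of the numbers besides the LMWL slope come from the study):

```python
import numpy as np

# Illustrative sketch (synthetic samples, not the study's data): estimate an
# evaporation water line (EWL) slope by linear regression of deuterium on
# oxygen-18, then compare it with the LMWL slope of 7.15 from the abstract.
rng = np.random.default_rng(1)

d18o = np.linspace(-6.0, -2.0, 40)                       # per mil, synthetic
d2h = 2.8 * d18o - 10.0 + 0.3 * rng.standard_normal(40)  # per mil, EWL slope 2.8 assumed

slope, intercept = np.polyfit(d18o, d2h, 1)

lmwl_slope = 7.15
evaporative = slope < lmwl_slope  # slopes well below the LMWL indicate evaporation
```

Slopes near 2.5-2.9, as reported for the three streams, fall far below the LMWL and signal evaporative enrichment.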
Reconstruction of the Precipitation in the Canary Islands for the Period 1595-1836.
NASA Astrophysics Data System (ADS)
García, Ricardo; Macias, Antonio; Gallego, David; Hernández, Emiliano; Gimeno, Luis; Ribera, Pedro
2003-08-01
Historical documentary sources in the Canary Islands have been used to construct cereal production series for the period 1595-1836. The cereal growth period in this region covers essentially the rainy season, making these crops adequate to characterize the annual precipitation. A proxy for the Islands' rainfall based on the historical series of wheat and barley production has been constructed and assessed by using two independent series of dry and wet years. The spectral analysis of the crop production reveals strong nonstationary behavior. This fact, along with the direct comparison with several reconstructed and instrumental North Atlantic Oscillation series, suggests the potential use of the reconstructed precipitation as a proxy for this climatic oscillation during preinstrumental times. This is an abridged version of the full-length article that is available online (10.1175/BAMS-84-8-García).
Numerical analysis of transient fields near thin-wire antennas and scatterers
NASA Astrophysics Data System (ADS)
Landt, J. A.
1981-11-01
Under the premise that 'accelerated charge radiates,' one would expect radiation on wire structures to occur from driving points, ends of wires, bends in wires, or locations of lumped loading. Here, this premise is investigated in a series of numerical experiments. The numerical procedure is based on a moment-method solution of a thin-wire time-domain electric-field integral equation. The fields in the vicinity of wire structures are calculated for short impulsive-type excitations, and are viewed in a series of time sequences or snapshots. For these excitations, the fields are spatially limited in the radial dimension, and expand in spheres centered about points of radiation. These centers of radiation coincide with the above list of possible source regions. Time retardation permits these observations to be made clearly in the time domain, similar to time-range gating. In addition to providing insight into transient radiation processes, these studies show that the direction of energy flow is not always defined by Poynting's vector near wire structures.
Brigode, Pierre; Brissette, Francois; Nicault, Antoine; ...
2016-09-06
Over the last decades, different methods have been used by hydrologists to extend observed hydro-climatic time series based on other data sources, such as tree rings or sedimentological datasets. For example, tree-ring multi-proxies have been studied for the Caniapiscau Reservoir in northern Québec (Canada), leading to the reconstruction of flow time series for the last 150 years. In this paper, we applied a new hydro-climatic reconstruction method to the Caniapiscau Reservoir, compared the obtained streamflow time series against time series derived from dendrohydrology by other authors on the same catchment, and studied the natural streamflow variability over the 1881–2011 period in that region. This new reconstruction is based not on natural proxies but on a historical reanalysis of global geopotential height fields, and aims firstly to produce daily climatic time series, which are then used as inputs to a rainfall–runoff model in order to obtain daily streamflow time series. The performance of the hydro-climatic reconstruction was quantified over the observed period and was good in terms of both monthly regimes and interannual variability. The streamflow reconstructions were then compared to two different reconstructions performed on the same catchment using tree-ring data series, one focused on mean annual flows and the other on spring floods. In terms of mean annual flows, the interannual variability in the reconstructed flows was similar (except for the 1930–1940 decade), with noteworthy changes seen in wetter and drier years. For spring floods, the reconstructed interannual variabilities were quite similar for the 1955–2011 period, but strongly different between 1880 and 1940. The results emphasize the need to apply different reconstruction methods to the same catchments.
Indeed, comparisons such as those above highlight potential differences between available reconstructions and, finally, allow a retrospective analysis of the proposed reconstructions of past hydro-climatological variabilities.
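The reconstruction chain described above feeds daily climatic series into a rainfall-runoff model. As a stand-in for that step, here is a minimal single-bucket water-balance sketch; the store capacity, drainage constant, and forcing are invented for illustration and are not the model used in the study.

```python
# A minimal single-bucket rainfall-runoff sketch of the kind of model a
# reconstruction chain feeds with daily climate series. All names and
# parameter values are illustrative, not taken from the study.
def bucket_model(precip, pet, capacity=150.0, k=0.05, s0=50.0):
    """Return daily runoff (mm/day) from a one-store water balance."""
    s, runoff = s0, []
    for p, e in zip(precip, pet):
        s = max(s + p - e, 0.0)          # add rain, remove evaporation
        excess = max(s - capacity, 0.0)  # saturation excess spills over
        s -= excess
        q = excess + k * s               # spill plus slow linear drainage
        s -= k * s
        runoff.append(q)
    return runoff

# Under constant forcing, runoff settles at precipitation minus evaporation.
flows = bucket_model([10.0] * 100, [2.0] * 100)
```

With 10 mm/day of rain and 2 mm/day of evaporative demand, the store fills to capacity and daily runoff converges to the 8 mm/day water-balance residual.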
Apparatus and method for compensating for clock drift in downhole drilling components
Hall, David R [Provo, UT; Pixton, David S [Lehi, UT; Johnson, Monte L [Orem, UT; Bartholomew, David B [Springville, UT; Hall, Jr., H. Tracy
2007-08-07
A precise downhole clock that compensates for drift includes a prescaler configured to receive electrical pulses from an oscillator. The prescaler is configured to output a series of clock pulses. The prescaler outputs each clock pulse after counting a preloaded number of electrical pulses from the oscillator. The prescaler is operably connected to a compensator module for adjusting the number loaded into the prescaler. By adjusting the number that is loaded into the prescaler, the timing may be advanced or retarded to more accurately synchronize the clock pulses with a reference time source. The compensator module is controlled by a counter-based trigger module configured to trigger the compensator module to load a value into the prescaler. Finally, a time-base logic module is configured to calculate the drift of the downhole clock by comparing the time of the downhole clock with a reference time source.
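The prescaler mechanism above can be sketched numerically: the clock emits one tick per `preload` oscillator pulses, and occasionally loading a larger count retards the clock enough to cancel oscillator drift against a reference. All names and numbers below are illustrative, not taken from the patent.

```python
# Conceptual sketch of a prescaler-based drift compensation (illustrative
# only): one tick per `preload` pulses; every `extra_every`-th tick consumes
# one extra pulse, retarding the clock to track a fast oscillator.
def indicated_seconds(pulses, preload=32768, extra_every=0):
    """Seconds indicated by the clock (one tick = 1 s nominal)."""
    if extra_every:
        group = extra_every * preload + 1      # pulses consumed per group of ticks
        return extra_every * (pulses // group)
    return pulses // preload

# An oscillator running 10 ppm fast, observed over 10 days of true time:
true_seconds = 10 * 24 * 3600
pulses = int(32768 * (1 + 10e-6) * true_seconds)

fast_error = indicated_seconds(pulses) - true_seconds   # uncompensated: runs fast
compensated = indicated_seconds(pulses, extra_every=3)  # one extra pulse per 3 ticks
comp_error = compensated - true_seconds                 # retards by ~10.2 ppm
```

One extra pulse per tick retards the clock by about 30.5 ppm (1/32768), so applying it to every third tick roughly cancels a 10 ppm fast drift; finer tuning of the adjustment interval trades residual drift against granularity.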
NASA Astrophysics Data System (ADS)
Liebert, Adam; Sawosz, Piotr; Milej, Daniel; Kacprzak, Michał; Weigl, Wojciech; Botwicz, Marcin; Mączewska, Joanna; Fronczewska, Katarzyna; Mayzner-Zawadzka, Ewa; Królicki, Leszek; Maniewski, Roman
2011-04-01
Recently, it was shown in measurements carried out on humans that time-resolved near-infrared reflectometry and fluorescence spectroscopy may allow for discrimination of information originating directly from the brain, avoiding the influence of contaminating signals related to the perfusion of extracerebral tissues. We report on a continuation of these studies, showing that near-infrared light can be detected noninvasively on the surface of the tissue at large interoptode distances. A multichannel time-resolved optical monitoring system was constructed for measurements of diffuse reflectance in an optically turbid medium at very large source-detector separations of up to 9 cm. The instrument was applied during intravenous injection of indocyanine green, and the distributions of times of flight of photons were successfully acquired, showing inflow and washout of the dye in the tissue. Time courses of the statistical moments of the distributions of times of flight of photons are presented and compared to the results obtained simultaneously at shorter source-detector separations (3, 4, and 5 cm). We show in a series of experiments carried out on a physical phantom and healthy volunteers that time-resolved data acquisition in combination with very large source-detector separation may allow one to improve the depth selectivity of perfusion assessment in the brain.
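The statistical moments tracked in the abstract can be computed directly from a distribution of times of flight (DTOF). A sketch on a synthetic gamma-shaped DTOF (the curve shape and time scale are invented, not measured data):

```python
import numpy as np

# Sketch of the moment analysis on a synthetic distribution of times of
# flight (DTOF) of photons: total count, mean time of flight, and variance.
t = np.linspace(0.0, 8.0, 800)       # time of flight, ns
dtof = t**2 * np.exp(-t / 0.9)       # illustrative gamma-shaped DTOF

p = dtof / dtof.sum()                # normalize to a discrete distribution
mean_tof = (t * p).sum()             # 1st moment: mean time of flight
var_tof = ((t - mean_tof) ** 2 * p).sum()  # 2nd central moment
```

Higher moments shift differently with absorber depth, which is why their time courses during dye inflow carry depth information.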
Cockfield, Jeremy; Su, Kyungmin; Robbins, Kay A.
2013-01-01
Experiments to monitor human brain activity during active behavior record a variety of modalities (e.g., EEG, eye tracking, motion capture, respiration monitoring) and capture a complex environmental context leading to large, event-rich time series datasets. The considerable variability of responses within and among subjects in more realistic behavioral scenarios requires experiments to assess many more subjects over longer periods of time. This explosion of data requires better computational infrastructure to more systematically explore and process these collections. MOBBED is a lightweight, easy-to-use, extensible toolkit that allows users to incorporate a computational database into their normal MATLAB workflow. Although capable of storing quite general types of annotated data, MOBBED is particularly oriented to multichannel time series such as EEG that have event streams overlaid with sensor data. MOBBED directly supports access to individual events, data frames, and time-stamped feature vectors, allowing users to ask questions such as what types of events or features co-occur under various experimental conditions. A database provides several advantages not available to users who process one dataset at a time from the local file system. In addition to archiving primary data in a central place to save space and avoid inconsistencies, such a database allows users to manage, search, and retrieve events across multiple datasets without reading the entire dataset. The database also provides infrastructure for handling more complex event patterns that include environmental and contextual conditions. The database can also be used as a cache for expensive intermediate results that are reused in such activities as cross-validation of machine learning algorithms. MOBBED is implemented over PostgreSQL, a widely used open source database, and is freely available under the GNU general public license at http://visual.cs.utsa.edu/mobbed. 
Source and issue reports for MOBBED are maintained at http://vislab.github.com/MobbedMatlab/ PMID:24124417
Nichols, J.D.; Morris, R.W.; Brownie, C.; Pollock, K.H.
1986-01-01
The authors present a new method that can be used to estimate taxonomic turnover in conjunction with stratigraphic range data for families in five phyla of Paleozoic marine invertebrates. Encounter probabilities varied among taxa and showed evidence of a decrease over time for the geologic series examined. The number of families varied substantially among the five phyla and showed some evidence of an increase over the series examined. There was no evidence of variation in extinction probabilities among the phyla. Although there was evidence of temporal variation in extinction probabilities within phyla, there was no evidence of a linear decrease in extinction probabilities over time, as has been reported by others. The authors did find evidence of high extinction probabilities for the two intervals that had been identified by others as periods of mass extinction. They found no evidence of variation in turnover among the five phyla. There was evidence of temporal variation in turnover, with greater turnover occurring in the older series.
NEW SUNS IN THE COSMOS. III. MULTIFRACTAL SIGNATURE ANALYSIS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Freitas, D. B. de; Nepomuceno, M. M. F.; Junior, P. R. V. de Moraes
2016-11-01
In the present paper, we investigate the multifractality signatures in hourly time series extracted from the CoRoT spacecraft database. Our analysis is intended to highlight the possibility that astrophysical time series can be members of a particular class of complex and dynamic processes, which require several photometric variability diagnostics to characterize their structural and topological properties. To achieve this goal, we search for contributions due to a nonlinear temporal correlation and effects caused by heavier tails than the Gaussian distribution, using a detrending moving average algorithm for one-dimensional multifractal signals (MFDMA). We observe that the correlation structure is the main source of multifractality, while heavy-tailed distribution plays a minor role in generating the multifractal effects. Our work also reveals that the rotation period of stars is inherently scaled by the degree of multifractality. As a result, analyzing the multifractal degree of the referred series, we uncover an evolution of multifractality from shorter to larger periods.
The Chaotic Long-term X-ray Variability of 4U 1705-44
NASA Astrophysics Data System (ADS)
Phillipson, R. A.; Boyd, P. T.; Smale, A. P.
2018-04-01
The low-mass X-ray binary 4U 1705-44 exhibits dramatic long-term X-ray time variability with a timescale of several hundred days. The All-Sky Monitor (ASM) aboard the Rossi X-ray Timing Explorer (RXTE) and the Japanese Monitor of All-sky X-ray Image (MAXI) aboard the International Space Station together have continuously observed the source from December 1995 through May 2014. The combined ASM-MAXI data provide a continuous time series over fifty times the length of the timescale of interest. Topological analysis can help us identify 'fingerprints' in the phase-space of a system unique to its equations of motion. The Birman-Williams theorem postulates that if such fingerprints are the same between two systems, then their equations of motion must be closely related. The phase-space embedding of the source light curve shows a strong resemblance to the double-welled nonlinear Duffing oscillator. We explore a range of parameters for which the Duffing oscillator closely mirrors the time evolution of 4U 1705-44. We extract low-period unstable periodic orbits from the 4U 1705-44 and Duffing time series and compare their topological information. The Duffing and 4U 1705-44 topological properties are identical, providing strong evidence that they share the same underlying template. This suggests that we can look to the Duffing equation to help guide the development of a physical model to describe the long-term X-ray variability of this and other similarly behaved X-ray binary systems.
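The two ingredients of the analysis above, integrating the double-welled Duffing oscillator and time-delay embedding of a scalar series, can be sketched as follows. The parameter values and delay are illustrative choices for the chaotic regime, not the ones fit to 4U 1705-44.

```python
import numpy as np

# Sketch: integrate a double-welled Duffing oscillator with RK4, then
# time-delay embed the scalar x(t) as one would embed a light curve.
# Parameter values are illustrative only, not those fit in the paper.
def duffing_step(state, t, dt, delta=0.25, alpha=-1.0, beta=1.0,
                 gamma=0.3, omega=1.0):
    """One RK4 step of x'' + delta*x' + alpha*x + beta*x**3 = gamma*cos(omega*t)."""
    def f(s, tt):
        x, v = s
        return np.array([v, -delta * v - alpha * x - beta * x**3
                         + gamma * np.cos(omega * tt)])
    k1 = f(state, t)
    k2 = f(state + 0.5 * dt * k1, t + 0.5 * dt)
    k3 = f(state + 0.5 * dt * k2, t + 0.5 * dt)
    k4 = f(state + dt * k3, t + dt)
    return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

dt, n = 0.01, 20000
traj = np.empty((n, 2))
traj[0] = (0.1, 0.0)
for i in range(1, n):
    traj[i] = duffing_step(traj[i - 1], (i - 1) * dt, dt)

x = traj[:, 0]
tau = 50  # delay in samples, an illustrative choice
embedded = np.column_stack([x[:-2 * tau], x[tau:-tau], x[2 * tau:]])
```

With alpha < 0 and beta > 0 the potential has two wells, and the embedded trajectory traces the double-lobed structure the abstract compares with the X-ray light curve.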
Standardized principal components for vegetation variability monitoring across space and time
NASA Astrophysics Data System (ADS)
Mathew, T. R.; Vohora, V. K.
2016-08-01
Vegetation at any given location changes through time and in space. Knowing how much it changes, and where and when, can help identify sources of ecosystem stress, which is very useful for understanding changes in biodiversity and their effect on climate change. Knowing such changes for a region is important for prioritizing management. The present study considers the dynamics of savanna vegetation in Kruger National Park (KNP) through the use of temporal satellite remote sensing images. Spatial variability of vegetation is a key characteristic of savanna landscapes, and its importance to biodiversity has been demonstrated by field-based studies. The data used for the study were sourced from the U.S. Agency for International Development, which provides AVHRR-derived Normalized Difference Vegetation Index (NDVI) images at 8 km spatial resolution and dekadal time scales. The study area was extracted from these images for the period 1984-2002. Maximum value composites were derived for individual months, resulting in a dataset of 216 NDVI images. Vegetation dynamics across spatio-temporal domains were analyzed using standardized principal components analysis (SPCA) on the NDVI time series, in which the variability of each individual image in the series is considered. The outcome of this study demonstrated promising results: the variability of vegetation change in the area across space and time, with six individual principal components (PCs) showing differences not only in magnitude but also in pattern among the selected eco-zones of a constantly changing and evolving ecosystem.
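Standardized PCA on an image time series treats each date's image as a variable, standardizes it to zero mean and unit variance, and eigen-decomposes the resulting correlation matrix. A hedged sketch on a random stand-in for the NDVI cube (the sizes and data are invented, not the 216-image AVHRR stack):

```python
import numpy as np

# Sketch of standardized principal components analysis (SPCA) over an image
# time series: each date's image is a column, standardized before the
# eigen-decomposition so PCA runs on the correlation matrix. Random data
# stand in for the NDVI stack; shapes are illustrative.
rng = np.random.default_rng(2)
n_pixels, n_dates = 1000, 24
ndvi = rng.random((n_pixels, n_dates))

# Standardize each date (column) to zero mean, unit variance.
z = (ndvi - ndvi.mean(axis=0)) / ndvi.std(axis=0)

corr = z.T @ z / n_pixels                  # correlation matrix across dates
eigvals, eigvecs = np.linalg.eigh(corr)    # ascending eigenvalues
order = np.argsort(eigvals)[::-1]
pcs = z @ eigvecs[:, order]                # component images, PC1 first
explained = eigvals[order] / eigvals.sum() # variance fraction per PC
```

Standardization is what distinguishes SPCA from ordinary PCA here: it prevents high-variance dates (e.g., wet seasons) from dominating the components.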
NASA Astrophysics Data System (ADS)
Saturnino, Diana; Langlais, Benoit; Amit, Hagay; Mandea, Mioara; Civet, François; Beucler, Éric
2017-04-01
A complete description of the main geomagnetic field temporal variation is crucial to understand dynamics in the core. This variation, termed secular variation (SV), is known with high accuracy at ground magnetic observatory locations. However, the description of its spatial variability is hampered by the globally uneven distribution of the observatories. For the past two decades, satellites have provided global coverage of the field changes. Their surveys of the geomagnetic field have been used to derive and improve global spherical harmonic (SH) models through strict data selection schemes that minimise external field contributions. But discrepancies remain between ground measurements and field predictions by these models. Indeed, the global models do not reproduce small spatial scales of the field temporal variations. To overcome this problem we propose a modified Virtual Observatory (VO) approach by defining a globally homogeneous mesh of VOs at satellite altitude. With this approach we directly extract time series of the field and its temporal variation from satellite measurements, as is done at observatory locations. As satellite measurements are acquired at different altitudes, a correction for altitude is needed. Therefore, we apply an Equivalent Source Dipole (ESD) technique for each VO and each given time interval to reduce all measurements to a unique location, leading to time series similar to those available at ground magnetic observatories. Synthetic data are first used to validate the new VO-ESD approach. Then, we apply our scheme to measurements from the Swarm mission. For the first time, a global mesh of VO time series with 2.5-degree resolution is built. The VO-ESD derived time series are locally compared to ground observations as well as to satellite-based model predictions. The approach is able to describe detailed temporal variations of the field at local scales. The VO-ESD time series are also used to derive global SH models.
Without regularization, these models describe the secular trend of the magnetic field well. The derivation of longer VO-ESD time series, as more data become available, will allow the study of features of the field's temporal variation such as geomagnetic jerks.
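The virtual-observatory step alone (setting aside the ESD altitude reduction) amounts to binning satellite field samples in a fixed cell into time windows and averaging. A sketch with synthetic numbers; the trend, window length, and sample counts are illustrative, not Swarm data.

```python
import numpy as np

# Sketch of the virtual-observatory binning step only (the ESD altitude
# reduction is omitted): synthetic field samples over one VO cell are
# averaged per time window, giving an observatory-like series whose
# differences estimate secular variation. All numbers are illustrative.
rng = np.random.default_rng(4)
t = np.sort(rng.uniform(0.0, 24.0, 600))            # months of passes over the cell
b = 30000.0 + 10.0 * t + rng.standard_normal(600)   # nT, linear secular trend + noise

edges = np.arange(0.0, 28.0, 4.0)                   # 4-month VO windows
series = np.array([b[(t >= lo) & (t < hi)].mean()
                   for lo, hi in zip(edges[:-1], edges[1:])])
sv = np.diff(series) / 4.0                          # nT/month, secular variation
```

In the paper's scheme the per-window reduction is the ESD inversion rather than a plain mean, precisely because samples arrive at different altitudes and positions within the cell.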
Duliu, Octavian G; Varlam, Carmen; Shnawaw, Muataz Dheyaa
2018-05-16
To get more information on the origin of tritium and to detect any possible anthropogenic sources, the precipitation level and tritium concentration were recorded monthly between January 1999 and December 2016 and investigated by the Cryogenic Institute of Ramnicu Valcea, Romania. Compared with similar data covering a radius of about 1200 km westward, the measurements gave similar results concerning the time evolution of tritium content and precipitation level for the entire time interval, except for the period between 2009 and 2011, when the tritium concentrations showed a slight increase, most probably due to the activity of a neighboring experimental pilot plant for tritium and deuterium separation. Regardless of this fact, all data pointed towards a steady tendency of tritium concentrations to decrease, at an annual rate of about 1.4 ± 0.05%. The experimental data on precipitation levels and tritium concentrations form two complete time series whose analysis showed, at p < 0.01, the presence of a single one-year periodicity; its coincident maxima, corresponding to the late spring and early summer months, suggest the existence of the Spring Leak mechanism, with a possible contribution from the remobilization of soil moisture during the warm period. Copyright © 2018 Elsevier Ltd. All rights reserved.
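A one-year periodicity like the one reported above can be quantified by least-squares fitting an annual sinusoid to a monthly series; the amplitude and phase locate the seasonal maximum. A sketch on synthetic data (the amplitude, phase, and noise are invented, not the tritium measurements):

```python
import numpy as np

# Sketch: detect an annual cycle in a monthly series by least-squares fit
# of cos/sin terms at a one-year period. The series is synthetic, built
# with a known amplitude of 3 and phase of 1 radian.
rng = np.random.default_rng(3)
months = np.arange(216)                         # 18 years of monthly values
series = 10 + 3 * np.cos(2 * np.pi * months / 12 - 1.0) \
         + rng.standard_normal(216)

design = np.column_stack([np.ones_like(months, dtype=float),
                          np.cos(2 * np.pi * months / 12),
                          np.sin(2 * np.pi * months / 12)])
coef, *_ = np.linalg.lstsq(design, series, rcond=None)

amplitude = np.hypot(coef[1], coef[2])          # annual-cycle amplitude
phase = np.arctan2(coef[2], coef[1])            # radians; locates the maximum
```

Fitting both series (precipitation and tritium) this way and comparing phases is one way to check that their annual maxima coincide.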
Weighted statistical parameters for irregularly sampled time series
NASA Astrophysics Data System (ADS)
Rimoldini, Lorenzo
2014-01-01
Unevenly spaced time series are common in astronomy because of the day-night cycle, weather conditions, dependence on the source position in the sky, allocated telescope time and corrupt measurements, for example, or inherent to the scanning law of satellites like Hipparcos and the forthcoming Gaia. Irregular sampling often causes clumps of measurements and gaps with no data which can severely disrupt the values of estimators. This paper aims at improving the accuracy of common statistical parameters when linear interpolation (in time or phase) can be considered an acceptable approximation of a deterministic signal. A pragmatic solution is formulated in terms of a simple weighting scheme, adapting to the sampling density and noise level, applicable to large data volumes at minimal computational cost. Tests on time series from the Hipparcos periodic catalogue led to significant improvements in the overall accuracy and precision of the estimators with respect to the unweighted counterparts and those weighted by inverse-squared uncertainties. Automated classification procedures employing statistical parameters weighted by the suggested scheme confirmed the benefits of the improved input attributes. The classification of eclipsing binaries, Mira, RR Lyrae, Delta Cephei and Alpha2 Canum Venaticorum stars employing exclusively weighted descriptive statistics achieved an overall accuracy of 92 per cent, about 6 per cent higher than with unweighted estimators.
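One simple interpolation-based weighting of the kind the abstract describes assigns each measurement half the time span between its neighbours (trapezoidal weights), so clumps of points do not dominate the estimators. This is a hedged sketch of the idea only; the paper's actual scheme also adapts to noise level.

```python
import numpy as np

# Sketch of one simple weighting scheme for unevenly spaced data:
# trapezoidal weights give each point half the span between its
# neighbours, down-weighting clumped measurements.
def trapezoidal_weights(t):
    t = np.asarray(t, dtype=float)
    gaps = np.diff(t)
    w = np.empty_like(t)
    w[0], w[-1] = gaps[0] / 2, gaps[-1] / 2
    w[1:-1] = (gaps[:-1] + gaps[1:]) / 2
    return w / w.sum()

# A clump of five points near t=0 and one isolated point at t=10:
t = np.array([0.0, 0.1, 0.2, 0.3, 0.4, 10.0])
y = np.array([1.0, 1.0, 1.0, 1.0, 1.0, 3.0])

w = trapezoidal_weights(t)
weighted_mean = (w * y).sum()  # closer to the time-averaged signal
plain_mean = y.mean()          # dominated by the clump
```

Here the unweighted mean is pulled toward the five clumped values, while the weighted mean gives the isolated point the influence its time coverage warrants.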
The GeoDataPortal: A Standards-based Environmental Modeling Data Access and Manipulation Toolkit
NASA Astrophysics Data System (ADS)
Blodgett, D. L.; Kunicki, T.; Booth, N.; Suftin, I.; Zoerb, R.; Walker, J.
2010-12-01
Environmental modelers from fields of study such as climatology, hydrology, geology, and ecology rely on many data sources and processing methods that are common across these disciplines. Interest in inter-disciplinary, loosely coupled modeling and data sharing is increasing among scientists from the USGS, other agencies, and academia. For example, hydrologic modelers need downscaled climate change scenarios and land cover data summarized for the watersheds they are modeling. Subsequently, ecological modelers are interested in soil moisture information for a particular habitat type as predicted by the hydrologic modeler. The USGS Center for Integrated Data Analytics Geo Data Portal (GDP) project seeks to facilitate this loosely coupled modeling and data sharing through broadly applicable open-source web processing services. These services simplify and streamline the time-consuming and resource-intensive tasks that are barriers to inter-disciplinary collaboration. The GDP framework includes a catalog describing projects, models, data, processes, and how they relate. Using newly introduced data, or sources already known to the catalog, the GDP facilitates access to subsets and common derivatives of data in numerous formats on disparate web servers. The GDP performs many of the critical functions needed to summarize data sources into modeling units regardless of scale or volume. A user can specify their analysis zones or modeling units as an Open Geospatial Consortium (OGC) standard Web Feature Service (WFS). Utilities to cache Shapefiles and other common GIS input formats have been developed to aid in making the geometry available for processing via WFS. Dataset access in the GDP relies primarily on the Unidata NetCDF-Java library’s common data model. Data transfer relies on methods provided by Unidata’s THREDDS Data Server (TDS).
TDS services of interest include the Open-source Project for a Network Data Access Protocol (OPeNDAP) standard for gridded time series, the OGC’s Web Coverage Service for high-density static gridded data, and Unidata’s CDM-remote for point time series. OGC WFS and Sensor Observation Service (SOS) are being explored as mechanisms to serve and access static or time series data attributed to vector geometry. A set of standardized XML-based output formats allows easy transformation into a wide variety of “model-ready” formats. Interested users will have the option of submitting custom transformations to the GDP or transforming the XML output as a post-process. The GDP project aims to support simple, rapid development of thin user interfaces (like web portals) to commonly needed environmental modeling-related data access and manipulation tools. Standalone, service-oriented components of the GDP framework provide the metadata cataloging, data subset access, and spatial-statistics calculations needed to support interdisciplinary environmental modeling.
Category-Specific Comparison of Univariate Alerting Methods for Biosurveillance Decision Support
Elbert, Yevgeniy; Hung, Vivian; Burkom, Howard
2013-01-01
Objective For a multi-source decision support application, we sought to match univariate alerting algorithms to surveillance data types to optimize detection performance. Introduction Temporal alerting algorithms commonly used in syndromic surveillance systems are often adjusted for data features such as cyclic behavior but are subject to overfitting or misspecification errors when applied indiscriminately. In a project for the Armed Forces Health Surveillance Center to enable multivariate decision support, we obtained 4.5 years of outpatient, prescription, and laboratory test records from all US military treatment facilities. A proof-of-concept project phase produced 16 events with multiple-evidence corroboration for comparison of alerting algorithms for detection performance. We used the representative streams from each data source to compare the sensitivity of 6 algorithms to injected spikes, and we used all data streams from 16 known events to compare them for detection timeliness. Methods The six methods compared were: (1) the Holt-Winters generalized exponential smoothing method; (2) automated choice between daily methods, regression and an exponentially weighted moving average (EWMA); (3) an adaptive daily Shewhart-type chart; (4) an adaptive one-sided daily CUSUM; (5) EWMA applied to 7-day means with a trend correction; and (6) a 7-day temporal scan statistic. Sensitivity testing: We conducted comparative sensitivity testing for categories of time series with similar scales and seasonal behavior. We added multiples of the standard deviation of each time series as single-day injects in separate algorithm runs. For each candidate method, we then used as a sensitivity measure the proportion of these runs for which the output of each algorithm was below alerting thresholds estimated empirically for each algorithm using simulated data streams. We identified the algorithm(s) whose sensitivity was most consistently high for each data category.
For each syndromic query applied to each data source (outpatient, lab test orders, and prescriptions), 502 authentic time series were derived, one for each reporting treatment facility. Data categories were selected in order to group time series with similar expected algorithm performance: Median > 10; 0 < Median ≤ 10; Median = 0; lag-7 autocorrelation coefficient ≥ 0.2; and lag-7 autocorrelation coefficient < 0.2. Timeliness testing: For the timeliness testing, we avoided the artificiality of simulated signals by measuring alerting detection delays in the 16 corroborated outbreaks. The multiple time series from these events gave a total of 141 time series with outbreak intervals for timeliness testing. The following measures were computed to quantify timeliness of detection: Median Detection Delay, the median number of days to detect the outbreak; and Penalized Mean Detection Delay, the mean number of days to detect the outbreak with outbreak misses penalized as 1 day plus the maximum detection time. Results Based on the injection results, the Holt-Winters algorithm was most sensitive among time series with positive medians. The adaptive CUSUM and the Shewhart methods were most sensitive for data streams with median zero. Table 1 provides timeliness results using the 141 outbreak-associated streams on sparse (Median = 0) and non-sparse data categories.

Table 1. Detection delay (days) by data category and method.

Data category  Measure         Holt-Winters  Regression/EWMA  Adaptive Shewhart  Adaptive CUSUM  7-day Trend-adj. EWMA  7-day Temporal Scan
Median = 0     Median          3             2                4                  2               4.5                    2
               Penalized Mean  7.2           7                6.6                6.2             7.3                    7.6
Median > 0     Median          2             2                2.5                2               6                      4
               Penalized Mean  6.1           7                7.2                7.1             7.7                    6.6

The shortest detection delays in Table 1 are achieved by different methods for sparse and non-sparse data streams. The Holt-Winters method was again superior for non-sparse data. For data with median = 0, the adaptive CUSUM was superior for a daily false alarm probability of 0.01, but the Shewhart method was timelier for more liberal thresholds.
Conclusions Both kinds of detection performance analysis showed the method based on Holt-Winters exponential smoothing superior on non-sparse time series with day-of-week effects. The adaptive CUSUM and Shewhart methods proved optimal on sparse data and data without weekly patterns.
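As an illustration of the alerting idea evaluated above, the sketch below runs a one-sided CUSUM over a sparse synthetic count series with a single injected spike. It is a minimal textbook version, not the operational algorithm: the baseline window length, reference value k, and threshold h are illustrative choices, and the data are simulated Poisson counts rather than military treatment facility records.

```python
import numpy as np

def one_sided_cusum(series, baseline=56, k=0.5, h=4.0):
    """One-sided CUSUM on standardized residuals from a sliding baseline
    mean/std; returns indices where the statistic exceeds h.
    A minimal sketch, not the operational surveillance algorithm."""
    series = np.asarray(series, dtype=float)
    alerts, s = [], 0.0
    for t in range(baseline, len(series)):
        window = series[t - baseline:t]
        mu = window.mean()
        sd = max(window.std(ddof=1), 1.0)   # floor the std for sparse data
        z = (series[t] - mu) / sd           # standardized residual
        s = max(0.0, s + z - k)             # one-sided accumulation
        if s > h:
            alerts.append(t)
            s = 0.0                         # reset after an alert
    return alerts

# Sparse daily counts (median zero) with a spike injected at day 80
rng = np.random.default_rng(0)
counts = rng.poisson(0.3, 120)
counts[80] += 10
print(one_sided_cusum(counts))
```

The standard-deviation floor mirrors a common practical adjustment for median-zero streams, where the raw baseline variance can be near zero.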
Pareeth, Sajid; Bresciani, Mariano; Buzzi, Fabio; Leoni, Barbara; Lepori, Fabio; Ludovisi, Alessandro; Morabito, Giuseppe; Adrian, Rita; Neteler, Markus; Salmaso, Nico
2017-02-01
The availability of more than thirty years of historical satellite data is a valuable source which could be used as an alternative to the sparse in-situ data. We developed a new homogenised time series of daily daytime Lake Surface Water Temperature (LSWT) over the last thirty years (1986-2015) at a spatial resolution of 1 km from thirteen polar orbiting satellites. The new homogenisation procedure implemented in this study corrects for the different acquisition times of the satellites, standardizing the derived LSWT to 12:00 UTC. In this study, we developed new time series of LSWT for five large lakes in Italy and evaluated the product with in-situ data from the respective lakes. Furthermore, we estimated the long-term annual and summer trends, the temporal coherence of mean LSWT between the lakes, and studied the intra-annual variations and long-term trends from the newly developed LSWT time series. We found a regional warming trend at a rate of 0.017 °C yr-1 annually and 0.032 °C yr-1 during summer. Mean annual and summer LSWT temporal patterns in these lakes were found to be highly coherent. Amidst the reported rapid warming of lakes globally, it is important to understand the long-term variations of surface temperature at a regional scale. This study contributes a new method to derive long-term accurate LSWT for lakes with sparse in-situ data, thereby facilitating understanding of regional-level changes in lakes' surface temperature. Copyright © 2016 Elsevier B.V. All rights reserved.
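A long-term trend like the 0.017 °C yr-1 figure above is typically a least-squares slope over annual means. The sketch below fits such a slope to a hypothetical annual LSWT series; the data are synthetic and the 1986-2015 span and warming rate are taken from the abstract only for illustration.

```python
import numpy as np

# Hypothetical annual-mean LSWT series (deg C), 1986-2015, generated with
# roughly the 0.017 deg C/yr warming rate reported for the Italian lakes
years = np.arange(1986, 2016)
rng = np.random.default_rng(1)
lswt = 14.0 + 0.017 * (years - 1986) + rng.normal(0, 0.05, years.size)

# Least-squares slope = long-term trend in deg C per year
slope, intercept = np.polyfit(years, lswt, 1)
print(f"trend: {slope:.3f} deg C/yr")
```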
FPT- FORTRAN PROGRAMMING TOOLS FOR THE DEC VAX
NASA Technical Reports Server (NTRS)
Ragosta, A. E.
1994-01-01
The FORTRAN Programming Tools (FPT) are a series of tools used to support the development and maintenance of FORTRAN 77 source codes. Included are a debugging aid, a CPU time monitoring program, source code maintenance aids, print utilities, and a library of useful, well-documented programs. These tools assist in reducing development time and encouraging high quality programming. Although intended primarily for FORTRAN programmers, some of the tools can be used on data files and other programming languages. BUGOUT is a series of FPT programs that have proven very useful in debugging a particular kind of error and in optimizing CPU-intensive codes. The particular type of error is the illegal addressing of data or code as a result of subtle FORTRAN errors that are not caught by the compiler or at run time. A TRACE option also allows the programmer to verify the execution path of a program. The TIME option assists the programmer in identifying the CPU-intensive routines in a program to aid in optimization studies. Program coding, maintenance, and print aids available in FPT include: routines for building standard format subprogram stubs; cleaning up common blocks and NAMELISTs; removing all characters after column 72; displaying two files side by side on a VT-100 terminal; creating a neat listing of a FORTRAN source code including a Table of Contents, an Index, and Page Headings; converting files between VMS internal format and standard carriage control format; changing text strings in a file without using EDT; and replacing tab characters with spaces. 
The library of useful, documented programs includes the following: time and date routines; a string categorization routine; routines for converting between decimal, hex, and octal; routines to delay process execution for a specified time; a Gaussian elimination routine for solving a set of simultaneous linear equations; a curve fitting routine for least squares fit to polynomial, exponential, and sinusoidal forms (with a screen-oriented editor); a cubic spline fit routine; a screen-oriented array editor; routines to support parsing; and various terminal support routines. These FORTRAN programming tools are written in FORTRAN 77 and ASSEMBLER for interactive and batch execution. FPT is intended for implementation on DEC VAX series computers operating under VMS. This collection of tools was developed in 1985.
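One of the library routines named above solves simultaneous linear equations by Gaussian elimination. The sketch below shows the classic algorithm with partial pivoting in Python; it is an illustrative re-implementation of the textbook method, not the FPT FORTRAN 77 code itself.

```python
import numpy as np

def gaussian_solve(A, b):
    """Solve Ax = b by Gaussian elimination with partial pivoting,
    the classic routine of the kind the FPT library provides
    (an illustrative Python sketch, not FPT code)."""
    A = np.asarray(A, dtype=float).copy()
    b = np.asarray(b, dtype=float).copy()
    n = len(b)
    for k in range(n - 1):
        p = k + np.argmax(np.abs(A[k:, k]))     # partial-pivot row
        A[[k, p]], b[[k, p]] = A[[p, k]], b[[p, k]]
        for i in range(k + 1, n):
            m = A[i, k] / A[k, k]
            A[i, k:] -= m * A[k, k:]            # eliminate column k
            b[i] -= m * b[k]
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):              # back substitution
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x

# 2x + y = 3, x + 3y = 5  ->  x = 0.8, y = 1.4
print(gaussian_solve([[2.0, 1.0], [1.0, 3.0]], [3.0, 5.0]))
```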
The chaotic long-term X-ray variability of 4U 1705-44
NASA Astrophysics Data System (ADS)
Phillipson, R. A.; Boyd, P. T.; Smale, A. P.
2018-07-01
The low-mass X-ray binary 4U1705-44 exhibits dramatic long-term X-ray time variability with a time-scale of several hundred days. The All-Sky Monitor (ASM) aboard the Rossi X-ray Timing Explorer (RXTE) and the Japanese Monitor of All-sky X-ray Image (MAXI) aboard the International Space Station together have continuously observed the source from 1995 December through 2014 May. The combined ASM-MAXI data provide a continuous time series over 50 times the length of the time-scale of interest. Topological analysis can help us identify `fingerprints' in the phase space of a system unique to its equations of motion. The Birman-Williams theorem postulates that if such fingerprints are the same between two systems, then their equations of motion must be closely related. The phase-space embedding of the source light curve shows a strong resemblance to the double-welled non-linear Duffing oscillator. We explore a range of parameters for which the Duffing oscillator closely mirrors the time evolution of 4U1705-44. We extract low period, unstable periodic orbits from the 4U1705-44 and Duffing time series and compare their topological information. The Duffing and 4U1705-44 topological properties are identical, providing strong evidence that they share the same underlying template. This suggests that we can look to the Duffing equation to help guide the development of a physical model to describe the long-term X-ray variability of this and other similarly behaved X-ray binary systems.
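The double-welled Duffing oscillator invoked above can be integrated numerically to see the kind of irregular well-hopping behavior the authors compare to the X-ray light curve. The sketch below uses fixed-step RK4 with the standard textbook chaotic parameter set (damping 0.25, drive amplitude 0.30, drive frequency 1.0); these are illustrative values, not the parameters fitted to 4U 1705-44.

```python
import numpy as np

# Driven double-well Duffing oscillator: x'' + d*x' - x + x^3 = f*cos(w*t),
# integrated with fixed-step RK4. Parameter values are the common textbook
# chaotic choices, not those matched to the X-ray binary.
d, f, w = 0.25, 0.30, 1.0

def deriv(state, t):
    x, v = state
    return np.array([v, -d * v + x - x**3 + f * np.cos(w * t)])

dt, n = 0.01, 60000
state = np.array([0.1, 0.0])
xs = np.empty(n)
for i in range(n):
    t = i * dt
    k1 = deriv(state, t)
    k2 = deriv(state + 0.5 * dt * k1, t + 0.5 * dt)
    k3 = deriv(state + 0.5 * dt * k2, t + 0.5 * dt)
    k4 = deriv(state + dt * k3, t + dt)
    state = state + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6
    xs[i] = state[0]

# The trajectory stays bounded while wandering irregularly in x
print(f"x range: [{xs.min():.2f}, {xs.max():.2f}]")
```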
Accelerating fissile material detection with a neutron source
Rowland, Mark S.; Snyderman, Neal J.
2018-01-30
A neutron detector system for discriminating fissile material from non-fissile material, wherein a digital data acquisition unit collects data at a high rate and in real time processes large volumes of data directly to count neutrons from the unknown source and detect excess grouped neutrons to identify fission in the unknown source. The system includes a Poisson neutron generator for in-beam interrogation of a possible fissile neutron source and a DC power supply that exhibits electrical ripple on the order of less than one part per million. Certain voltage multiplier circuits, such as Cockcroft-Walton voltage multipliers, are used to enhance the effectiveness of series resistor-inductor circuit components to reduce the ripple associated with traditional AC-rectified, high-voltage DC power supplies.
[The effect of tobacco prices on consumption: a time series data analysis for Mexico].
Olivera-Chávez, Rosa Itandehui; Cermeño-Bazán, Rodolfo; de Miera-Juárez, Belén Sáenz; Jiménez-Ruiz, Jorge Alberto; Reynales-Shigematsu, Luz Myriam
2010-01-01
To estimate the price elasticity of the demand for cigarettes in Mexico based on data sources and a methodology different from the ones used in previous studies on the topic. Quarterly time series of consumption, income and price for the period 1994 to 2005 were used. A long-run demand model was estimated using Ordinary Least Squares (OLS) and the existence of a cointegration relationship was investigated. Also, a model using Dynamic Ordinary Least Squares (DOLS) was estimated to correct for potential endogeneity of independent variables and autocorrelation of the residuals. DOLS estimates showed that a 10% increase in cigarette prices could reduce consumption by 2.5% (p<0.05) and increase government revenue by 16.11%. The results confirmed the effectiveness of taxes as an instrument for tobacco control in Mexico. An increase in taxes can be used to increase cigarette prices and therefore to reduce consumption and increase government revenue.
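In a log-log demand model, the price coefficient is the elasticity, which is how a -0.25 elasticity translates into "a 10% price rise cuts consumption ~2.5%". The sketch below recovers such a coefficient by plain OLS on synthetic quarterly data; it is a simplified stand-in for the paper's DOLS estimator (no leads/lags of differenced regressors), and all numbers are invented for illustration.

```python
import numpy as np

# Synthetic quarterly data: log consumption responds to log price with
# elasticity -0.25 (i.e., a 10% price rise cuts consumption ~2.5%)
rng = np.random.default_rng(2)
n = 48
log_price = np.log(20) + np.cumsum(rng.normal(0.01, 0.02, n))
log_income = np.log(100) + np.cumsum(rng.normal(0.005, 0.01, n))
log_cons = 5.0 - 0.25 * log_price + 0.4 * log_income + rng.normal(0, 0.01, n)

# OLS of log consumption on log price and log income; the price
# coefficient is the elasticity (a long-run OLS sketch, not DOLS)
X = np.column_stack([np.ones(n), log_price, log_income])
beta, *_ = np.linalg.lstsq(X, log_cons, rcond=None)
print(f"price elasticity: {beta[1]:.2f}")
```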
NASA Astrophysics Data System (ADS)
Shan, Zhendong; Ling, Daosheng
2018-02-01
This article develops an analytical solution for the transient wave propagation of a cylindrical P-wave line source in a semi-infinite elastic solid with a fluid layer. The analytical solution is presented in a simple closed form in which each term represents a transient physical wave. The Scholte equation is derived, through which the Scholte wave velocity can be determined. The Scholte wave is the wave that propagates along the interface between the fluid and solid. To develop the analytical solution, the wave fields in the fluid and solid are defined, their analytical solutions in the Laplace domain are derived using the boundary and interface conditions, and the solutions are then decomposed into series form according to the power series expansion method. Each item of the series solution has a clear physical meaning and represents a transient wave path. Finally, by applying Cagniard's method and the convolution theorem, the analytical solutions are transformed into the time domain. Numerical examples are provided to illustrate some interesting features in the fluid layer, the interface and the semi-infinite solid. When the P-wave velocity in the fluid is higher than that in the solid, two head waves in the solid, one head wave in the fluid and a Scholte wave at the interface are observed for the cylindrical P-wave line source.
PRECISION INTEGRATOR FOR MINUTE ELECTRIC CURRENTS
Hemmendinger, A.; Helmer, R.J.
1961-10-24
An integrator is described for measuring the value of integrated minute electrical currents. The device consists of a source capacitor connected in series with the source of such electrical currents, a second capacitor of accurately known capacitance and a source of accurately known and constant potential, means responsive to the potentials developed across the source capacitor for reversibly connecting the second capacitor in series with the source of known potential and with the source capacitor and at a rate proportional to the potential across the source capacitor to maintain the magnitude of the potential across the source capacitor at approximately zero. (AEC)
Looking Forward: Comment on Morgante, Zolfaghari, and Johnson
ERIC Educational Resources Information Center
Creel, Sarah C.
2012-01-01
Morgante et al. (in press) find inconsistencies in the time reporting of a Tobii T60XL eye tracker. Their study raises important questions about the use of the Tobii T-series in particular, and various software and hardware in general, in different infant eye tracking paradigms. It leaves open the question of the source of the inconsistencies.…
Nonlinear forecasting as a way of distinguishing chaos from measurement error in time series
NASA Astrophysics Data System (ADS)
Sugihara, George; May, Robert M.
1990-04-01
An approach is presented for making short-term predictions about the trajectories of chaotic dynamical systems. The method is applied to data on measles, chickenpox, and marine phytoplankton populations, to show how apparent noise associated with deterministic chaos can be distinguished from sampling error and other sources of externally induced environmental noise.
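The core of the approach is that for deterministic chaos, short-term forecasts made from nearby past states are accurate and skill decays with forecast horizon, whereas for pure noise they are not. The sketch below is a bare-bones nearest-neighbor version of that idea (the published method uses simplex projection over multiple neighbors); the embedding dimension and logistic-map test series are illustrative choices.

```python
import numpy as np

def nn_forecast(series, E=3, split=0.5):
    """Predict each point in the second half of a series from its nearest
    neighbor (in an E-dimensional delay embedding) among the first half;
    returns the correlation of predictions with observations.
    A bare-bones sketch of the simplex-projection idea."""
    x = np.asarray(series, dtype=float)
    # Rows t of the embedding are (x[t], x[t+1], ..., x[t+E-1])
    emb = np.column_stack([x[i:len(x) - E + i] for i in range(E)])
    target = x[E:]                        # value one step ahead of each row
    cut = int(len(emb) * split)
    lib, lib_y = emb[:cut], target[:cut]  # "library" of past states
    preds, obs = [], []
    for v, y in zip(emb[cut:], target[cut:]):
        j = np.argmin(np.linalg.norm(lib - v, axis=1))
        preds.append(lib_y[j])
        obs.append(y)
    return np.corrcoef(preds, obs)[0, 1]

# Chaotic logistic map: one-step forecasts should correlate strongly
x = [0.4]
for _ in range(499):
    x.append(3.9 * x[-1] * (1 - x[-1]))
print(f"forecast skill (rho): {nn_forecast(x):.2f}")
```

Running the same function on white noise in place of the logistic map yields skill near zero, which is the diagnostic contrast the abstract describes.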
Research on the Flow of International Students to UK Universities: Determinants and Implications
ERIC Educational Resources Information Center
Naidoo, Vikash
2007-01-01
Using time series data over the 1985-2003 period, this article examines some of the determinants of international student mobility to universities in the UK. The research found that some of the main factors influencing international student mobility to the UK include access to domestic education opportunities in the source country, the level of…
Cunningham, James K; Liu, Lon-Mu; Callaghan, Russell C
2016-11-01
In December 2006 the United States regulated sodium permanganate, a cocaine essential chemical. In March 2007 Mexico, the United States' primary source for methamphetamine, closed a chemical company accused of illicitly importing 60+ tons of pseudoephedrine, a methamphetamine precursor chemical. US cocaine availability and methamphetamine availability, respectively, decreased in association. This study tested whether the controls had impacts upon the numbers of US cocaine users and methamphetamine users. Design: auto-regressive integrated moving average (ARIMA) intervention time-series analysis. Comparison series (heroin and marijuana users) were used. United States, 2002-14. The National Survey on Drug Use and Health (n = 723 283), a complex sample survey of the US civilian, non-institutionalized population. Estimates of the numbers of (1) past-year users and (2) past-month users were constructed for each calendar quarter from 2002 to 2014, providing each series with 52 time-periods. Downward shifts in cocaine users started at the time of the cocaine regulation. Past-year and past-month cocaine user series levels decreased by approximately 1 946 271 (-32%) (P < 0.05) and 694 770 (-29%) (P < 0.01), respectively; no apparent recovery occurred through 2014. Downward shifts in methamphetamine users started at the time of the chemical company closure. Past-year and past-month methamphetamine series levels decreased by 494 440 (-35%) [P < 0.01; 95% confidence interval (CI) = -771 897, -216 982] and 277 380 (-45%) (P < 0.05; CI = -554 073, -686), respectively; partial recovery possibly occurred in 2013. The comparison series changed little at the intervention times. Essential/precursor chemical controls in the United States (2006) and Mexico (2007) were associated with large, extended (7+ years) reductions in cocaine users and methamphetamine users in the United States. © 2016 Society for the Study of Addiction.
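The level shifts reported above are the kind of effect an intervention model estimates with a step regressor that switches from 0 to 1 at the intervention date. The sketch below shows that step-dummy idea on synthetic quarterly data; it is a simplified stand-in for full ARIMA intervention analysis (no ARMA error structure), and the shift size, quarter count, and noise level are invented for illustration.

```python
import numpy as np

# Synthetic quarterly series of users (millions) with a level shift of
# -1.9 at quarter 20, mimicking an intervention (step) effect
rng = np.random.default_rng(3)
n = 52
step = (np.arange(n) >= 20).astype(float)   # 0 before, 1 after intervention
y = 6.0 - 1.9 * step + rng.normal(0, 0.15, n)

# Regressing on the step dummy estimates the post-intervention level
# shift (a simplified stand-in for ARIMA intervention analysis)
X = np.column_stack([np.ones(n), step])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"estimated level shift: {beta[1]:.2f}")
```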
Long-Term Stability Assessment of Sonoran Desert for Vicarious Calibration of GOES-R
NASA Astrophysics Data System (ADS)
Kim, W.; Liang, S.; Cao, C.
2012-12-01
Vicarious calibration refers to calibration techniques that do not depend on onboard calibration devices. Although sensors and onboard calibration devices undergo rigorous validation processes before launch, performance of sensors often degrades after the launch due to exposure to the harsh space environment and the aging of devices. Such in-flight changes of devices can be identified and adjusted through vicarious calibration activities where the sensor degradation is measured in reference to exterior calibration sources such as the Sun, the Moon, and the Earth surface. The Sonoran desert is one of the best calibration sites located in North America that are available for vicarious calibration of the GOES-R satellite. To accurately calibrate sensors onboard the GOES-R satellite (e.g., the advanced baseline imager (ABI)), the temporal stability of the Sonoran desert needs to be assessed precisely. However, short-/mid-term variations in top-of-atmosphere (TOA) reflectance caused by meteorological variables such as water vapor amount and aerosol loading are often difficult to retrieve, complicating the use of TOA reflectance time series for the stability assessment of the site. In this paper, we address this issue of normalization of TOA reflectance time series using a time series analysis algorithm: the seasonal trend decomposition procedure based on LOESS (STL) (Cleveland et al., 1990). The algorithm is basically a collection of smoothing filters which leads to decomposition of a time series into three additive components: seasonal, trend, and remainder. Since this non-linear technique is capable of extracting seasonal patterns in the presence of trend changes, the seasonal variation can be effectively identified in the time series of remote sensing data subject to various environmental changes.
Experiments performed with Landsat 5 TM data show that the decomposition results acquired for the Sonoran Desert area produce normalized series that have much less uncertainty than those of traditional BRDF models, which leads to more accurate stability assessment.
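The additive seasonal/trend/remainder split described above can be illustrated with simple moving-average filters. The sketch below is a deliberately simplified stand-in for STL (centered moving-average trend and phase-mean seasonal, instead of iterated LOESS smoothers), applied to a synthetic monthly reflectance series with an invented trend, annual cycle, and noise level.

```python
import numpy as np

def decompose(x, period=12):
    """Additive decomposition into trend, seasonal, and remainder via
    centered moving averages; a simplified stand-in for LOESS-based STL."""
    x = np.asarray(x, dtype=float)
    k = np.ones(period + 1)                 # 2 x 12 centered moving average
    k[0] = k[-1] = 0.5
    k /= period
    trend = np.convolve(x, k, mode="same")
    half = period // 2
    detr = (x - trend)[half:-half]          # drop edge-biased samples
    phase = np.arange(len(x))[half:-half] % period
    seasonal = np.array([detr[phase == p].mean() for p in range(period)])
    seasonal -= seasonal.mean()             # seasonal component sums to ~0
    seasonal_full = seasonal[np.arange(len(x)) % period]
    remainder = x - trend - seasonal_full
    return trend, seasonal_full, remainder

# Monthly toy reflectance: slow trend + annual cycle (amplitude 0.05) + noise
t = np.arange(120)
rng = np.random.default_rng(4)
x = 0.3 + 0.001 * t + 0.05 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.005, 120)
trend, seasonal, remainder = decompose(x)
print(round(float(seasonal.max()), 3))
```

The recovered seasonal amplitude is close to the injected 0.05, which is the property that makes the seasonal component usable for normalizing the series.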
Fractal Analysis of Air Pollutant Concentrations
NASA Astrophysics Data System (ADS)
Cortina-Januchs, M. G.; Barrón-Adame, J. M.; Vega-Corona, A.; Andina, D.
2010-05-01
Air pollution poses significant threats to human health and the environment throughout the developed and developing countries. This work focuses on fractal analysis of pollutant concentrations in Salamanca, Mexico. The city of Salamanca has been catalogued as one of the most polluted cities in Mexico. The main causes of pollution in this city are fixed emission sources, such as the chemical industry and electricity generation. Sulphur Dioxide (SO2) and Particulate Matter less than 10 micrometers in diameter (PM10) are the most important pollutants in this region. Air pollutant concentrations were investigated by applying the box counting method to time series obtained from the Automatic Environmental Monitoring Network (AEMN). One year of time series of hourly average concentrations was analyzed in order to characterize the temporal structures of SO2 and PM10.
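Box counting estimates a fractal dimension by covering the graph of the series with grids of shrinking box size and fitting the slope of log(box count) against log(grid size). The sketch below is a textbook version on a synthetic random-walk series (whose graph has theoretical dimension 1.5), not the AEMN processing chain; the grid sizes are illustrative.

```python
import numpy as np

def box_count_dimension(series, sizes=(4, 8, 16, 32, 64)):
    """Estimate the fractal dimension of a time-series graph by counting
    occupied boxes at several grid sizes and fitting log N vs log size.
    A textbook box-counting sketch."""
    x = np.asarray(series, dtype=float)
    x = (x - x.min()) / (x.max() - x.min())   # normalize values to [0, 1]
    t = np.linspace(0.0, 1.0, len(x))
    counts = []
    for s in sizes:
        boxes = set(zip((t * s).astype(int).clip(max=s - 1),
                        (x * s).astype(int).clip(max=s - 1)))
        counts.append(len(boxes))
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    return slope

# One year of hourly values: a toy random walk standing in for a
# pollutant concentration series (graph dimension ~1.5 in theory)
rng = np.random.default_rng(5)
hourly = rng.normal(0, 1, 24 * 365).cumsum()
print(f"box-counting dimension: {box_count_dimension(hourly):.2f}")
```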
Evaluating disease management program effectiveness: an introduction to time-series analysis.
Linden, Ariel; Adams, John L; Roberts, Nancy
2003-01-01
Currently, the most widely used method in the disease management (DM) industry for evaluating program effectiveness is referred to as the "total population approach." This model is a pretest-posttest design, with the most basic limitation being that without a control group, there may be sources of bias and/or competing extraneous confounding factors that offer a plausible rationale explaining the change from baseline. Furthermore, with the current inclination of DM programs to use financial indicators rather than program-specific utilization indicators as the principal measure of program success, additional biases are introduced that may cloud evaluation results. This paper presents a non-technical introduction to time-series analysis (using disease-specific utilization measures) as an alternative, and more appropriate, approach to evaluating DM program effectiveness than the current total population approach.
NASA Astrophysics Data System (ADS)
Larmat, C. S.; Rougier, E.; Delorey, A.; Steedman, D. W.; Bradley, C. R.
2016-12-01
The goal of the Source Physics Experiment (SPE) is to bring empirical and theoretical advances to the problem of detection and identification of underground nuclear explosions. For this, the SPE program includes a strong modeling effort based on first-principles calculations, with the challenge to capture both the source and near-source processes and those taking place later in time as seismic waves propagate within complex 3D geologic environments. In this paper, we report on results of modeling that uses hydrodynamic simulation codes (Abaqus and CASH) coupled with a 3D full waveform propagation code, SPECFEM3D. For modeling the near-source region, we employ a fully-coupled Euler-Lagrange (CEL) modeling capability with a new continuum-based visco-plastic fracture model for simulation of damage processes, called AZ_Frac. These capabilities produce high-fidelity models of various factors believed to be key in the generation of seismic waves: the explosion dynamics, a weak grout-filled borehole, the surrounding jointed rock, and damage creation and deformations happening around the source and the free surface. SPECFEM3D, based on the Spectral Element Method (SEM), is a direct numerical method for full wave modeling with mathematical accuracy. The coupling interface consists of a series of grid points of the SEM mesh situated inside of the hydrodynamic code's domain. Displacement time series at these points are computed using output data from CASH or Abaqus (by interpolation if needed) and fed into the time marching scheme of SPECFEM3D. We will present validation tests with Sharpe's model and comparisons of modeled waveforms with Rg waves (2-8 Hz) that were recorded up to 2 km for SPE. We especially show effects of the local topography, velocity structure and spallation. Our models predict smaller amplitudes of Rg waves for the first five SPE shots compared to pure elastic models such as Denny & Johnson (1991).
NASA Astrophysics Data System (ADS)
Mehrdad Mirsanjari, Mir; Mohammadyari, Fatemeh
2018-03-01
Underground water is regarded as a considerable water source, mainly available in arid and semi-arid regions with deficient surface water sources. Forecasting of hydrological variables is a suitable tool in water resources management. On the other hand, time series concepts are considered an efficient means in the forecasting process of water management. In this study, data including qualitative parameters (electrical conductivity and sodium adsorption ratio) of 17 underground water wells in the Mehran Plain have been used to model the trend of parameter change over time. Using the determined model, the qualitative parameters of groundwater are predicted for the next seven years. Data from 2003 to 2016 were collected and fitted by AR, MA, ARMA, ARIMA and SARIMA models. Afterward, the best model is determined using the Akaike information criterion (AIC) and the correlation coefficient. After modeling the parameters, maps of agricultural land use in 2016 and 2023 were generated and the changes between these years were studied. Based on the results, the average predicted SAR (Sodium Adsorption Ratio) in all wells in the year 2023 will increase compared to 2016. The EC (Electrical Conductivity) average will increase in the ninth and fifteenth wells and decrease in the other wells. The results indicate that the quality of groundwater for agriculture in the Mehran Plain will decline over the next seven years.
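The order-selection step above (fitting candidate models and keeping the one with the lowest AIC) can be sketched with least-squares AR fits. This is a simplified illustration on a synthetic AR(1) series, not the SARIMA workflow of the study; the AIC formula used is the common Gaussian least-squares form.

```python
import numpy as np

def fit_ar(x, p):
    """Least-squares AR(p) fit; returns AIC = n*log(RSS/n) + 2*(p+1)."""
    # Column i holds lag i+1 of the series, aligned with targets y = x[p:]
    X = np.column_stack([x[p - i - 1:len(x) - i - 1] for i in range(p)])
    X = np.column_stack([np.ones(len(X)), X])
    y = x[p:]
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(((y - X @ beta) ** 2).sum())
    n = len(y)
    return n * np.log(rss / n) + 2 * (p + 1)

# Synthetic EC-like series following an AR(1) process; comparing AIC
# across orders mimics the model-selection step described above
rng = np.random.default_rng(6)
x = np.zeros(200)
for t in range(1, 200):
    x[t] = 0.7 * x[t - 1] + rng.normal(0, 1)
aics = {p: fit_ar(x, p) for p in (1, 2, 3)}
best = min(aics, key=aics.get)
print(f"selected AR order: {best}")
```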
Rainfall variability in southern Spain on decadal to centennial time scales
NASA Astrophysics Data System (ADS)
Rodrigo, F. S.; Esteban-Parra, M. J.; Pozo-Vázquez, D.; Castro-Díez, Y.
2000-06-01
In this work a long rainfall series in Andalusia (southern Spain) is analysed. Methods of historical climatology were used to reconstruct a 500-year series from historical sources. Different statistical tools were used to detect and characterize significant changes in this series. Results indicate rainfall fluctuations, without abrupt changes, in the following alternating dry and wet phases: 1501-1589 dry, 1590-1649 wet, 1650-1775 dry, 1776-1937 wet and 1938-1997 dry. Possible causal mechanisms are discussed, emphasizing the important contribution of the North Atlantic Oscillation (NAO) to rainfall variability in the region. Solar activity is discussed in relation to the Maunder Minimum period, and finally the past and present are compared. Results indicate that the magnitude of fluctuations is similar in the past and present.
High-voltage subnanosecond dielectric breakdown
NASA Astrophysics Data System (ADS)
Mankowski, John Jerome
Current interests in ultrawideband radar sources are in the microwave regime, which corresponds to voltage pulse risetimes of less than a nanosecond. Some new sources, including the Phillips Laboratory Hindenberg series of hydrogen gas switched pulsers, use hydrogen at hundreds of atmospheres of pressure in the switch. Unfortunately, published data on the electrical breakdown of gas and liquid media at these time scales are relatively scarce. A study was conducted on the electrical breakdown properties of liquid and gas dielectrics on subnanosecond and nanosecond time scales. Two separate voltage sources with pulse risetimes less than 400 ps were developed. Diagnostic probes were designed and tested for their capability of detecting high voltage pulses at these fast risetimes. A thorough investigation into E-field strengths of liquid and gas dielectrics at breakdown times ranging from 0.4 to 5 ns was performed. A dependence of breakdown strength on voltage polarity was observed. Streak camera images of streamer formation were taken. The effect of ultraviolet radiation, incident upon the gap, on statistical lag time was determined.
GPS Imaging of Time-Dependent Seasonal Strain in Central California
NASA Astrophysics Data System (ADS)
Kraner, M.; Hammond, W. C.; Kreemer, C.; Borsa, A. A.; Blewitt, G.
2016-12-01
Recent studies suggest that crustal deformation can be time-dependent and nontectonic. Continuous global positioning system (cGPS) measurements are now showing how steady long-term deformation can be influenced by factors such as fluctuations in loading and temperature variations. Here we model the seasonal time-dependent dilatational and shear strain in Central California, specifically surrounding the Parkfield region, and try to uncover the sources of these deformation patterns. We use 8 years of cGPS data (2008 - 2016) processed by the Nevada Geodetic Laboratory and carefully select the cGPS stations for our analysis based on the vertical position of cGPS time series during the drought period. In building our strain model, we first detrend the selected station time series using a set of velocities from the robust MIDAS trend estimator, an approach that is insensitive to common problems such as step discontinuities, outliers, and seasonality. We use these detrended time series to estimate the median cGPS positions for each month of the 8-year period and filter displacement differences between these monthly median positions using a filtering technique called "GPS Imaging." This technique improves the overall robustness and spatial resolution of the input displacements for the strain model. We then model our dilatational and shear strain field for each month of time series. We also test a variety of a priori constraints, which control the style of faulting within the strain model. Upon examining our strain maps, we find that a seasonal strain signal exists in Central California. We investigate how this signal compares to thermoelastic, hydrologic, and atmospheric loading models during the 8-year period. We additionally determine whether the drought played a role in influencing the seasonal signal.
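The MIDAS estimator mentioned above is built on medians of slopes between position pairs spaced one year apart, so the seasonal cycle cancels pair-by-pair and the median resists outliers. The sketch below shows that core idea on a synthetic daily series; it is a simplified analogue, not the published MIDAS algorithm, and the 3 mm/yr rate, cycle amplitude, and noise level are invented for illustration.

```python
import numpy as np

# Daily vertical positions (mm) over 8 years: 3 mm/yr trend plus an
# annual cycle and noise (synthetic, for illustration only)
t = np.arange(8 * 365) / 365.0
rng = np.random.default_rng(7)
y = 3.0 * t + 2.0 * np.sin(2 * np.pi * t) + rng.normal(0, 1.0, t.size)

# MIDAS-style robust trend: median of slopes over pairs exactly one
# year apart, so the seasonal term cancels pair-by-pair
slopes = (y[365:] - y[:-365]) / (t[365:] - t[:-365])
trend = float(np.median(slopes))
print(f"trend: {trend:.1f} mm/yr")
```

Because the median, not the mean, summarizes the slopes, a step discontinuity or a burst of outliers shifts the estimate far less than it would shift a least-squares fit.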
Infrared photometry of the black hole candidate Sagittarius A*
NASA Technical Reports Server (NTRS)
Close, Laird M.; Mccarthy, Donald W. JR.; Melia, Fulvio
1995-01-01
An infrared source has been imaged within 0.2 +/- 0.3 arcseconds of the unique Galactic center radio source Sgr A*. High angular resolution (averaged Full Width at Half Maximum (FWHM) approximately 0.55 arcseconds) was achieved by rapid (approximately 50 Hz) real-time image motion compensation. The source's near-infrared magnitudes (K = 12.1 +/- 0.3, H = 13.7 +/- 0.3, and J = 16.6 +/- 0.4) are consistent with a hot object reddened by the local extinction (A(sub v) approximately 27). At the 3 sigma level of confidence, a time series of 80 images limits the source variability to less than 50% on timescales from 3 to 30 minutes. The photometry is consistent with the emission from a simple accretion disk model for a approximately 1 x 10(exp 6) solar mass black hole. However, the fluxes are also consistent with a hot luminous (L approximately 10(exp 3.5) to 10(exp 4-6) solar luminosity) central cluster star positionally coincident with Sgr A*.
Automatic Classification of Time-variable X-Ray Sources
NASA Astrophysics Data System (ADS)
Lo, Kitty K.; Farrell, Sean; Murphy, Tara; Gaensler, B. M.
2014-05-01
To maximize the discovery potential of future synoptic surveys, especially in the field of transient science, it will be necessary to use automatic classification to identify some of the astronomical sources. The data mining technique of supervised classification is suitable for this problem. Here, we present a supervised learning method to automatically classify variable X-ray sources in the Second XMM-Newton Serendipitous Source Catalog (2XMMi-DR2). Random Forest is our classifier of choice since it is one of the most accurate learning algorithms available. Our training set consists of 873 variable sources and their features are derived from time series, spectra, and other multi-wavelength contextual information. The 10-fold cross validation accuracy of the training data is ~97% on a 7-class data set. We applied the trained classification model to 411 unknown variable 2XMM sources to produce a probabilistically classified catalog. Using the classification margin and the Random Forest derived outlier measure, we identified 12 anomalous sources, of which 2XMM J180658.7-500250 appears to be the most unusual source in the sample. Its X-ray spectrum is suggestive of an ultraluminous X-ray source but its variability makes it highly unusual. Machine-learned classification and anomaly detection will facilitate scientific discoveries in the era of all-sky surveys.
Climate-driven seasonal geocenter motion during the GRACE period
NASA Astrophysics Data System (ADS)
Zhang, Hongyue; Sun, Yu
2018-03-01
Annual cycles in the geocenter motion time series are primarily driven by mass changes in the Earth's hydrologic system, which includes land hydrology, atmosphere, and oceans. Seasonal variations of the geocenter motion have been reliably determined by Sun et al. (J Geophys Res Solid Earth 121(11):8352-8370, 2016) by combining Gravity Recovery And Climate Experiment (GRACE) data with an ocean model output. In this study, we reconstructed the observed seasonal geocenter motion with geophysical model predictions of mass variations in the polar ice sheets, continental glaciers, terrestrial water storage (TWS), and atmosphere and dynamic ocean (AO). The reconstructed geocenter motion time series is shown to be in close agreement with the solution based on GRACE data supported by an ocean bottom pressure model. Over 85% of the observed geocenter motion time series variance can be explained by the reconstructed solution, which allows a further investigation of the driving mechanisms. We then demonstrated that the AO component accounts for 54, 62, and 25% of the observed geocenter motion variances in the X, Y, and Z directions, respectively. The TWS component alone explains 42, 32, and 39% of the observed variances. The net mass changes over oceans together with self-attraction and loading effects also contribute significantly (about 30%) to the seasonal geocenter motion in the X and Z directions. Other contributing sources, on the other hand, have marginal (less than 10%) impact on the seasonal variations but introduce a linear trend in the time series.
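The "variance explained" figures above follow the usual definition: one minus the ratio of residual variance to observed variance. The sketch below computes that quantity for a toy monthly series; the annual amplitude, noise level, and perfect-model assumption are all invented for illustration and are not the paper's data.

```python
import numpy as np

# Toy monthly geocenter series (mm): observed = annual signal + noise;
# the "reconstruction" here is the noise-free annual signal itself
months = np.arange(120)
rng = np.random.default_rng(9)
signal = 2.0 * np.sin(2 * np.pi * months / 12)
observed = signal + rng.normal(0, 0.5, months.size)
reconstructed = signal

# Fraction of observed variance explained by the reconstruction
explained = 1.0 - np.var(observed - reconstructed) / np.var(observed)
print(f"variance explained: {explained:.0%}")
```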
Pan, Yuanjin; Shen, Wen-Bin; Ding, Hao; Hwang, Cheinway; Li, Jin; Zhang, Tengxu
2015-10-14
Modeling nonlinear vertical components of a GPS time series is critical to separating sources contributing to mass displacements. Improved vertical precision in GPS positioning at stations for velocity fields is key to resolving the mechanism of certain geophysical phenomena. In this paper, we use ensemble empirical mode decomposition (EEMD) to analyze the daily GPS time series at 89 continuous GPS stations, spanning from 2002 to 2013. EEMD decomposes a GPS time series into different intrinsic mode functions (IMFs), which are used to identify different kinds of signals and secular terms. Our study suggests that the GPS records contain not only the well-known signals (such as semi-annual and annual signals) but also the seldom-noted quasi-biennial oscillations (QBS). The quasi-biennial signals are explained by modeled atmospheric, non-tidal oceanic, and hydrological loadings that deform the surface around the GPS stations. In addition, the loadings derived from GRACE gravity changes are also consistent with the quasi-biennial deformations derived from the GPS observations. By removing the modeled components, the weighted root-mean-square (WRMS) variation of the GPS time series is reduced by 7.1% to 42.3%; in particular, after removing the seasonal and QBS signals, the average improvement percentages for the seasonal and QBS signals are 25.6% and 7.5%, respectively, suggesting that it is important to consider the QBS signals in the GPS records to improve the observed vertical deformations. PMID:26473882
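The WRMS reduction quoted above can be computed as follows; a minimal sketch using the usual definition of WRMS about the weighted mean (function names are illustrative, not from the paper):

```python
import math

def wrms(values, weights):
    """Weighted root-mean-square scatter about the weighted mean."""
    wsum = sum(weights)
    wmean = sum(w * v for w, v in zip(weights, values)) / wsum
    return math.sqrt(sum(w * (v - wmean) ** 2
                         for w, v in zip(weights, values)) / wsum)

def wrms_reduction(series, model, weights):
    """Percent WRMS reduction after removing a modeled component."""
    residual = [s - m for s, m in zip(series, model)]
    return 100.0 * (1.0 - wrms(residual, weights) / wrms(series, weights))
```

Removing a model that captures the signal perfectly gives a 100% reduction; a model uncorrelated with the series can give a negative "reduction".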
Stochastic sediment property inversion in Shallow Water 06.
Michalopoulou, Zoi-Heleni
2017-11-01
Time series received at a short distance from the source allow the identification of distinct paths; four of these are the direct arrival, the surface and bottom reflections, and the sediment reflection. In this work, a Gibbs sampling method is used for the estimation of the arrival times of these paths and the corresponding probability density functions. The arrival times for the first three paths are then employed along with linearization for the estimation of source range and depth, water column depth, and sound speed in the water. Propagating densities of arrival times through the linearized inverse problem, densities are also obtained for the above parameters, providing maximum a posteriori estimates. These estimates are employed to calculate densities and point estimates of sediment sound speed and thickness using a non-linear, grid-based model. Density computation is an important aspect of this work, because those densities express the uncertainty in the inversion for sediment properties.
Simultaneous identification of transfer functions and combustion noise of a turbulent flame
NASA Astrophysics Data System (ADS)
Merk, M.; Jaensch, S.; Silva, C.; Polifke, W.
2018-05-01
The Large Eddy Simulation/System Identification (LES/SI) approach makes it possible to deduce a flame transfer function (FTF) from LES of turbulent reacting flow: Time series of fluctuations of reference velocity and global heat release rate resulting from broad-band excitation of a simulated turbulent flame are post-processed via SI techniques to derive a low order model of the flame dynamics, from which the FTF is readily deduced. The current work investigates an extension of the established LES/SI approach: In addition to estimation of the FTF, a low order model for the combustion noise source is deduced from the same time series data. By incorporating such a noise model into a linear thermoacoustic model, it is possible to predict the overall level as well as the spectral distribution of sound pressure in confined combustion systems that do not exhibit self-excited thermoacoustic instability. A variety of model structures for estimation of a noise model are tested in the present study. The suitability and quality of these model structures are compared against each other, and their sensitivity to certain time series properties is studied. The influence of time series length, signal-to-noise ratio, as well as the acoustic reflection coefficient of the boundary conditions on the identification is examined. It is shown that the Box-Jenkins model structure is superior to simpler approaches for the simultaneous identification of models that describe the FTF as well as the combustion noise source. Subsequent to the question of the most adequate model structure, the choice of optimal model order is addressed, as in particular the optimal parametrization of the noise model is not obvious. Akaike's Information Criterion and a model residual analysis are applied to draw qualitative and quantitative conclusions on the most suitable model order.
All investigations are based on a surrogate data model, which allows a Monte Carlo study across a large parameter space with modest computational effort. The conducted study constitutes a solid basis for the application of advanced SI techniques to actual LES data.
Data cleaning in the energy domain
NASA Astrophysics Data System (ADS)
Akouemo Kengmo Kenfack, Hermine N.
This dissertation addresses the problem of data cleaning in the energy domain, especially for natural gas and electric time series. The detection and imputation of anomalies improves the performance of forecasting models necessary to lower purchasing and storage costs for utilities and plan for peak energy loads or distribution shortages. There are various types of anomalies, each induced by diverse causes and sources depending on the field of study. The definition of false positives also depends on the context. The analysis is focused on energy data because of the availability of data and information to make a theoretical and practical contribution to the field. A probabilistic approach based on hypothesis testing is developed to decide if a data point is anomalous based on the level of significance. Furthermore, the probabilistic approach is combined with statistical regression models to handle time series data. Domain knowledge of energy data and the survey of causes and sources of anomalies in energy are incorporated into the data cleaning algorithm to improve the accuracy of the results. The data cleaning method is evaluated on simulated data sets in which anomalies were artificially inserted and on natural gas and electric data sets. In the simulation study, the performance of the method is evaluated for both detection and imputation on all identified causes of anomalies in energy data. The testing on utilities' data evaluates the percentage of improvement brought to forecasting accuracy by data cleaning. A cross-validation study of the results is also performed to demonstrate the performance of the data cleaning algorithm on smaller data sets and to calculate a confidence interval for the results. The data cleaning algorithm is able to successfully identify energy time series anomalies. Replacing those anomalies improves forecasting model accuracy.
The process is automatic, which is important because many data cleaning processes require human input and become impractical for very large data sets. The techniques are also applicable to other fields such as econometrics and finance, but the exogenous factors of the time series data need to be well defined.
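The detect-then-impute workflow described above can be sketched in a few lines. This is an illustrative simplification, not the dissertation's method: a plain k-sigma significance test on deviations from the mean stands in for the full probabilistic, regression-based approach, and neighbor averaging stands in for the model-based imputation.

```python
import statistics

def detect_anomalies(series, k=3.0):
    """Flag points whose deviation from the series mean exceeds
    k standard deviations (a simple significance-style test)."""
    mu = statistics.mean(series)
    sigma = statistics.pstdev(series)
    return [i for i, v in enumerate(series) if abs(v - mu) > k * sigma]

def impute(series, anomalies):
    """Replace each flagged point with the mean of its neighbors."""
    out = list(series)
    for i in anomalies:
        left = out[i - 1] if i > 0 else out[i + 1]
        right = out[i + 1] if i < len(out) - 1 else out[i - 1]
        out[i] = (left + right) / 2.0
    return out
```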
CrowdWater - Can people observe what models need?
NASA Astrophysics Data System (ADS)
van Meerveld, I. H. J.; Seibert, J.; Vis, M.; Etter, S.; Strobl, B.
2017-12-01
CrowdWater (www.crowdwater.ch) is a citizen science project that explores the usefulness of crowd-sourced data for hydrological model calibration and prediction. Hydrological models are usually calibrated based on observed streamflow data, but it is likely easier for people to estimate relative stream water levels, such as the water level above or below a rock, than streamflow. Relative stream water levels may, therefore, be a more suitable variable for citizen science projects than streamflow. In order to test this assumption, we held surveys near seven rivers of different sizes in Switzerland and asked more than 450 volunteers to estimate the water level class based on a picture with a virtual staff gauge. The results show that people can generally estimate the relative water level well, although there were also a few outliers. We also asked the volunteers to estimate streamflow based on the stick method. The median estimated streamflow was close to the observed streamflow, but the spread in the streamflow estimates was large and there were very large outliers, suggesting that crowd-based streamflow data are highly uncertain. In order to determine the potential value of water level class data for model calibration, we converted streamflow time series for 100 catchments in the US to stream level class time series and used these to calibrate the HBV model. The model was then validated using the streamflow data. The results of this modeling exercise show that stream level class data are useful for constraining a simple runoff model. Time series of only two stream level classes, e.g. above or below a rock in the stream, were already informative, especially when the class boundary was chosen towards the highest stream levels. There was hardly any improvement in model performance when more than five water level classes were used.
This suggests that if crowd-sourced stream level observations are available for otherwise ungauged catchments, these data can be used to constrain a simple runoff model and to generate simulated streamflow time series from the level observations.
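The conversion of a streamflow series into discrete level classes, as used in the calibration experiment above, can be sketched as follows; a minimal illustration with hypothetical boundary values, not the authors' processing code:

```python
def to_level_classes(streamflow, boundaries):
    """Convert a streamflow series into discrete level classes.

    boundaries: sorted class thresholds; a value falling above k of
    the boundaries is assigned class k. One boundary gives the
    two-class case ('above or below a rock').
    """
    classes = []
    for q in streamflow:
        k = 0
        for b in boundaries:
            if q > b:
                k += 1
        classes.append(k)
    return classes
```

Placing the single boundary toward the highest flows mimics the finding that high class boundaries were the most informative.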
Picosecond x-ray diagnostics for third and fourth generation synchrotron sources
DOE Office of Scientific and Technical Information (OSTI.GOV)
DeCamp, Matthew
2016-03-30
In the DOE-EPSCoR State/National Laboratory partnership grant ``Picosecond x-ray diagnostics for third and fourth generation synchrotron sources'' Dr. DeCamp set forth a partnership between the University of Delaware and Argonne National Laboratory. This proposal aimed to design and implement a series of experiments utilizing, or improving upon, existing time-domain hard x-ray spectroscopies at a third generation synchrotron source. Specifically, the PI put forth three experimental projects to be explored in the grant cycle: 1) implementing a picosecond ``x-ray Bragg switch'' using a laser excited nano-structured metallic film, 2) designing a robust x-ray optical delay stage for x-ray pump-probe studies at a hard x-ray synchrotron source, and 3) building/installing a laser based x-ray source at the Advanced Photon Source for two-color x-ray pump-probe studies.
Nonlinear time-series analysis of current signal in cathodic contact glow discharge electrolysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allagui, Anis, E-mail: aallagui@sharjah.ac.ae; Abdelkareem, Mohammad Ali; Rojas, Andrea Espinel
In the standard two-electrode configuration employed in electrolytic process, when the control dc voltage is brought to a critical value, the system undergoes a transition from conventional electrolysis to contact glow discharge electrolysis (CGDE), which has also been referred to as liquid-submerged micro-plasma, glow discharge plasma electrolysis, electrode effect, electrolytic plasma, etc. The light-emitting process is associated with the development of an irregular and erratic current time-series which has been arbitrarily labelled as “random,” and thus dissuaded further research in this direction. Here, we examine the current time-series signals measured in cathodic CGDE configuration in a concentrated KOH solution at different dc bias voltages greater than the critical voltage. We show that the signals are, in fact, not random according to the NIST SP. 800-22 test suite definition. We also demonstrate that post-processing low-pass filtered sequences requires less time than the native as-measured sequences, suggesting a superposition of low frequency chaotic fluctuations and high frequency behaviors (which may be produced by more than one possible source of entropy). Using an array of nonlinear time-series analyses for dynamical systems, i.e., the computation of largest Lyapunov exponents and correlation dimensions, and re-construction of phase portraits, we found that low-pass filtered datasets undergo a transition from quasi-periodic to chaotic to quasi-hyper-chaotic behavior, and back again to chaos when the voltage controlling-parameter is increased. The high frequency part of the signals is discussed in terms of highly nonlinear turbulent motion developed around the working electrode.
Feng, Zhujing; Schilling, Keith E; Chan, Kung-Sik
2013-06-01
Nitrate-nitrogen concentrations in rivers represent challenges for water supplies that use surface water sources. Nitrate concentrations are often modeled using time-series approaches, but previous efforts have typically relied on monthly time steps. In this study, we developed a dynamic regression model of daily nitrate concentrations in the Raccoon River, Iowa, that incorporated contemporaneous and lagged values of precipitation and discharge occurring at several locations around the basin. Results suggested that 95% of the variation in daily nitrate concentrations measured at the outlet of a large agricultural watershed can be explained by time-series patterns of precipitation and discharge occurring in the basin. Discharge was found to be a more important regression variable than precipitation in our model, but both regression parameters were strongly correlated with nitrate concentrations. The time-series model was consistent with known patterns of nitrate behavior in the watershed, successfully identifying contemporaneous dilution mechanisms from higher relief and urban areas of the basin while incorporating the delayed contribution of nitrate from tile-drained regions in a lagged response. The first difference of the model errors was modeled as an AR(16) process, suggesting that daily nitrate concentration changes remain temporally correlated for more than 2 weeks, although temporal correlation was stronger in the first few days before tapering off. Consequently, daily nitrate concentrations are non-stationary, i.e., they exhibit strong memory. Using time-series models to reliably forecast daily nitrate concentrations in a river based on patterns of precipitation and discharge occurring in its basin may be of great interest to water suppliers.
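Building the design matrix of contemporaneous and lagged predictors for such a dynamic regression can be sketched as follows; an illustrative helper (names and the single-site simplification are assumptions, not from the paper):

```python
def lagged_design(precip, discharge, lags):
    """Build regression rows of contemporaneous and lagged predictors.

    Each row is [intercept, P(t), Q(t), P(t-1), Q(t-1), ..., P(t-lags), Q(t-lags)];
    the first `lags` days are dropped because their lags are unavailable.
    """
    rows = []
    for t in range(lags, len(precip)):
        row = [1.0]  # intercept term
        for k in range(lags + 1):
            row += [precip[t - k], discharge[t - k]]
        rows.append(row)
    return rows
```

In the study, multiple gauge and rain sites around the basin would each contribute such columns, and the regression errors would additionally be modeled as an AR(16) process.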
NASA Astrophysics Data System (ADS)
Godsey, S. E.; Kirchner, J. W.
2008-12-01
The mean residence time - the average time that it takes rainfall to reach the stream - is a basic parameter used to characterize catchment processes. Heterogeneities in these processes lead to a distribution of travel times around the mean residence time. By examining this travel time distribution, we can better predict catchment response to contamination events. A catchment system with shorter residence times or narrower distributions will respond quickly to contamination events, whereas systems with longer residence times or longer-tailed distributions will respond more slowly to those same contamination events. The travel time distribution of a catchment is typically inferred from time series of passive tracers (e.g., water isotopes or chloride) in precipitation and streamflow. Variations in the tracer concentration in streamflow are usually damped compared to those in precipitation, because precipitation inputs from different storms (with different tracer signatures) are mixed within the catchment. Mathematically, this mixing process is represented by the convolution of the travel time distribution and the precipitation tracer inputs to generate the stream tracer outputs. Because convolution in the time domain is equivalent to multiplication in the frequency domain, it is relatively straightforward to estimate the parameters of the travel time distribution in either domain. In the time domain, the parameters describing the travel time distribution are typically estimated by maximizing the goodness of fit between the modeled and measured tracer outputs. In the frequency domain, the travel time distribution parameters can be estimated by fitting a power-law curve to the ratio of precipitation spectral power to stream spectral power. Differences between the methods of parameter estimation in the time and frequency domain mean that these two methods may respond differently to variations in data quality, record length and sampling frequency. 
Here we evaluate how well these two methods of travel time parameter estimation respond to different sources of uncertainty and compare the methods to one another. We do this by generating synthetic tracer input time series of different lengths, and convolve these with specified travel-time distributions to generate synthetic output time series. We then sample both the input and output time series at various sampling intervals and corrupt the time series with realistic error structures. Using these 'corrupted' time series, we infer the apparent travel time distribution, and compare it to the known distribution that was used to generate the synthetic data in the first place. This analysis allows us to quantify how different record lengths, sampling intervals, and error structures in the tracer measurements affect the apparent mean residence time and the apparent shape of the travel time distribution.
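The time-domain forward model described above, convolving a tracer input series with a travel-time distribution to produce the stream output, can be sketched as follows. The gamma-shaped distribution is an illustrative choice (gamma distributions are commonly used for travel times), not necessarily the one used in this study:

```python
import math

def gamma_ttd(n, alpha, beta, dt=1.0):
    """Discretized, normalized gamma-shaped travel-time distribution."""
    midpoints = [dt * (i + 0.5) for i in range(n)]
    pdf = [t ** (alpha - 1) * math.exp(-t / beta) for t in midpoints]
    total = sum(pdf)
    return [p / total for p in pdf]

def convolve_tracer(inputs, ttd):
    """Stream tracer output = convolution of precipitation tracer
    inputs with the travel-time distribution."""
    out = []
    for t in range(len(inputs)):
        out.append(sum(ttd[k] * inputs[t - k]
                       for k in range(min(t + 1, len(ttd)))))
    return out
```

Because the distribution is normalized, a constant tracer input passes through unchanged once the full distribution is sampled, while variable inputs are damped, which is the behavior the parameter estimation exploits.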
Benchmarking of Touschek Beam Lifetime Calculations for the Advanced Photon Source
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xiao, A.; Yang, B.
2017-06-25
Particle loss from Touschek scattering is one of the most significant issues faced by present and future synchrotron light source storage rings. For example, the predicted, Touschek-dominated beam lifetime for the Advanced Photon Source (APS) Upgrade lattice in 48-bunch, 200-mA timing mode is only ~ 2 h. In order to understand the reliability of the predicted lifetime, a series of measurements with various beam parameters was performed on the present APS storage ring. This paper first describes the entire process of beam lifetime measurement, then compares measured lifetime with the calculated one by applying the measured beam parameters. The results show very good agreement.
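Extracting a beam lifetime from a measured current decay can be sketched as below; a minimal illustration assuming a single-exponential decay I(t) = I0 exp(-t/tau), which is a simplification (a Touschek-dominated lifetime actually varies with the instantaneous current):

```python
import math

def beam_lifetime(times, currents):
    """Lifetime tau from a log-linear least-squares fit to I(t):
    ln I = ln I0 - t / tau, so tau = -1 / slope."""
    n = len(times)
    y = [math.log(c) for c in currents]
    tm = sum(times) / n
    ym = sum(y) / n
    slope = (sum((t - tm) * (v - ym) for t, v in zip(times, y))
             / sum((t - tm) ** 2 for t in times))
    return -1.0 / slope
```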
Research on precise modeling of buildings based on multi-source data fusion of air to ground
NASA Astrophysics Data System (ADS)
Li, Yongqiang; Niu, Lubiao; Yang, Shasha; Li, Lixue; Zhang, Xitong
2016-03-01
Aiming at the accuracy problem of precise building modeling, a test study was conducted based on multi-source data for buildings in the same test area, including roof data from airborne LiDAR, aerial orthophotos, and façade data from vehicle-borne LiDAR. After accurately extracting the top and bottom outlines of the building clusters, a series of qualitative and quantitative analyses was carried out on the 2D interval between the outlines. The research results provide reliable accuracy support for precise building modeling based on air-to-ground multi-source data fusion; at the same time, solutions to key technical problems are discussed.
A first application of independent component analysis to extracting structure from stock returns.
Back, A D; Weigend, A S
1997-08-01
This paper explores the application of a signal processing technique known as independent component analysis (ICA) or blind source separation to multivariate financial time series such as a portfolio of stocks. The key idea of ICA is to linearly map the observed multivariate time series into a new space of statistically independent components (ICs). We apply ICA to three years of daily returns of the 28 largest Japanese stocks and compare the results with those obtained using principal component analysis. The results indicate that the estimated ICs fall into two categories, (i) infrequent large shocks (responsible for the major changes in the stock prices), and (ii) frequent smaller fluctuations (contributing little to the overall level of the stocks). We show that the overall stock price can be reconstructed surprisingly well by using a small number of thresholded weighted ICs. In contrast, when using shocks derived from principal components instead of independent components, the reconstructed price is less similar to the original one. ICA is shown to be a potentially powerful method of analyzing and understanding driving mechanisms in financial time series. The application to portfolio optimization is described in Chin and Weigend (1998).
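The reconstruction step described above, rebuilding an observed series from a small number of thresholded, weighted independent components, can be sketched as follows. This is an illustrative simplification (the ICA estimation itself is omitted; the mixing weights and components here are hypothetical inputs):

```python
def threshold_ics(ics, tau):
    """Keep only the large shocks in each independent component,
    zeroing values below the threshold tau in absolute value."""
    return [[v if abs(v) >= tau else 0.0 for v in ic] for ic in ics]

def reconstruct(mixing_row, ics):
    """Reconstruct one observed series as a weighted sum of ICs
    (one row of x = A s in the ICA model)."""
    length = len(ics[0])
    return [sum(a * ic[t] for a, ic in zip(mixing_row, ics))
            for t in range(length)]
```

Reconstructing from the thresholded components retains the infrequent large shocks, which the paper finds are responsible for the major price changes, while discarding the frequent small fluctuations.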
NASA Astrophysics Data System (ADS)
Chen, R. S.; Levy, M.; Baptista, S.; Adamo, S.
2010-12-01
Vulnerability to climate variability and change will depend on dynamic interactions between different aspects of climate, land-use change, and socioeconomic trends. Measurements and projections of these changes are difficult at the local scale but necessary for effective planning. New data sources and methods make it possible to assess land-use and socioeconomic changes that may affect future patterns of climate vulnerability. In this paper we report on new time series data sets that reveal trends in the spatial patterns of climate vulnerability in the Caribbean/Gulf of Mexico Region. Specifically, we examine spatial time series data for human population over the period 1990-2000, time series data on land use and land cover over 2000-2009, and infant mortality rates as a proxy for poverty for 2000-2008. We compare the spatial trends for these measures to the distribution of climate-related natural disaster risk hotspots (cyclones, floods, landslides, and droughts) in terms of frequency, mortality, and economic losses. We use these data to identify areas where climate vulnerability appears to be increasing and where it may be decreasing. Regions where trends and patterns are especially worrisome include coastal areas of Guatemala and Honduras.
Hu, Weiming; Tian, Guodong; Kang, Yongxin; Yuan, Chunfeng; Maybank, Stephen
2017-09-25
In this paper, a new nonparametric Bayesian model called the dual sticky hierarchical Dirichlet process hidden Markov model (HDP-HMM) is proposed for mining activities from a collection of time series data such as trajectories. All the time series data are clustered. Each cluster of time series data, corresponding to a motion pattern, is modeled by an HMM. Our model postulates a set of HMMs that share a common set of states (topics in an analogy with topic models for document processing), but have unique transition distributions. For the application to motion trajectory modeling, topics correspond to motion activities. The learnt topics are clustered into atomic activities which are assigned predicates. We propose a Bayesian inference method to decompose a given trajectory into a sequence of atomic activities. On combining the learnt sources and sinks, semantic motion regions, and the learnt sequence of atomic activities, the action represented by the trajectory can be described in natural language in as automatic a way as possible. The effectiveness of our dual sticky HDP-HMM is validated on several trajectory datasets. The effectiveness of the natural language descriptions for motions is demonstrated on the vehicle trajectories extracted from a traffic scene.
NASA Technical Reports Server (NTRS)
Ray, R. D.; Beckley, B. D.; Lemoine, F. G.
2010-01-01
A somewhat unorthodox method for determining vertical crustal motion at a tide-gauge location is to difference the sea level time series with an equivalent time series determined from satellite altimetry. To the extent that both instruments measure an identical ocean signal, the difference will be dominated by vertical land motion at the gauge. We revisit this technique by analyzing sea level signals at 28 tide gauges that are colocated with DORIS geodetic stations. Comparisons of altimeter-gauge vertical rates with DORIS rates yield a median difference of 1.8 mm/yr and a weighted root-mean-square difference of 2.7 mm/yr. The latter suggests that our uncertainty estimates, which are primarily based on an assumed AR(1) noise process in all time series, underestimate the true errors. Several sources of additional error are discussed, including possible scale errors in the terrestrial reference frame, to which altimeter-gauge rates are mostly insensitive. One of our stations, Male, Maldives, which has been the subject of some uninformed arguments about sea-level rise, is found to have almost no vertical motion, and thus is vulnerable to rising sea levels. Published by Elsevier Ltd. on behalf of COSPAR.
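The altimeter-minus-gauge differencing technique reduces to fitting a linear rate to the difference series; a minimal sketch (a plain least-squares trend, ignoring the AR(1) error modeling used in the paper):

```python
def linear_rate(times, values):
    """Least-squares linear trend of a time series (e.g., mm/yr)."""
    n = len(times)
    tm = sum(times) / n
    vm = sum(values) / n
    return (sum((t - tm) * (v - vm) for t, v in zip(times, values))
            / sum((t - tm) ** 2 for t in times))

def vertical_land_motion(times, gauge, altimeter):
    """Rate of (altimeter - gauge): the common ocean signal cancels,
    leaving vertical land motion at the gauge (positive = uplift)."""
    diff = [a - g for g, a in zip(gauge, altimeter)]
    return linear_rate(times, diff)
```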
Selected properties of GPS and Galileo-IOV receiver intersystem biases in multi-GNSS data processing
NASA Astrophysics Data System (ADS)
Paziewski, Jacek; Sieradzki, Rafał; Wielgosz, Paweł
2015-09-01
Two overlapping frequencies—L1/E1 and L5/E5a—in GPS and Galileo systems support the creation of mixed double-differences in a tightly combined relative positioning model. On the other hand, a tightly combined model makes it necessary to take into account the receiver intersystem bias (ISB), which is the difference in receiver hardware delays. This bias is present in both carrier-phase and pseudorange observations. Earlier research showed that using a priori knowledge of previously calibrated ISBs to correct GNSS observations has a significant impact on ambiguity resolution and, therefore, on precise positioning results. In previous research concerning ISB estimation conducted by the authors, small oscillations in the phase ISB time series were detected. This paper investigates this effect present in the GPS-Galileo-IOV ISB time series. In particular, ISB short-term temporal stability and its dependence on the number of Galileo satellites used in the ISB estimation were examined. In this contribution we investigate the amplitude and frequency of the detected ISB time series oscillations as well as their potential source. The presented results are based on real observational data collected on a zero baseline with the use of different sets of GNSS receivers.
Stochastic modeling for time series InSAR: with emphasis on atmospheric effects
NASA Astrophysics Data System (ADS)
Cao, Yunmeng; Li, Zhiwei; Wei, Jianchao; Hu, Jun; Duan, Meng; Feng, Guangcai
2018-02-01
Despite the many applications of time series interferometric synthetic aperture radar (TS-InSAR) techniques in geophysical problems, error analysis and assessment have been largely overlooked. Tropospheric propagation error is still the dominant error source of InSAR observations. However, the spatiotemporal variation of atmospheric effects is seldom considered in the present standard TS-InSAR techniques, such as persistent scatterer interferometry and small baseline subset interferometry. The failure to consider the stochastic properties of atmospheric effects not only affects the accuracy of the estimators, but also makes it difficult to assess the uncertainty of the final geophysical results. To address this issue, this paper proposes a network-based variance-covariance estimation method to model the spatiotemporal variation of tropospheric signals, and to estimate the temporal variance-covariance matrix of TS-InSAR observations. The constructed stochastic model is then incorporated into the TS-InSAR estimators both for parameter estimation (e.g., deformation velocity, topographic residuals) and for uncertainty assessment. It is an incremental and positive improvement to the traditional weighted least squares methods used to solve the multitemporal InSAR time series. The performance of the proposed method is validated by using both simulated and real datasets.
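The weighted least squares estimator at the core of such TS-InSAR processing, with the estimated variance-covariance matrix supplying the weights, can be sketched as follows; a generic WLS solver, not the paper's network-based implementation:

```python
import numpy as np

def wls(A, y, C):
    """Weighted least squares with observation covariance C:
    x = (A^T C^-1 A)^-1 A^T C^-1 y.
    Returns the estimate and its parameter covariance matrix,
    which is what enables the uncertainty assessment."""
    W = np.linalg.inv(C)          # weight matrix from the stochastic model
    N = A.T @ W @ A               # normal matrix
    x = np.linalg.solve(N, A.T @ W @ y)
    cov_x = np.linalg.inv(N)      # formal parameter covariance
    return x, cov_x
```

With C = I this reduces to ordinary least squares; the paper's contribution is, in effect, replacing that identity matrix with a temporally correlated covariance estimated from the interferogram network.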
Optimizing Use of Water Management Systems during Changes of Hydrological Conditions
NASA Astrophysics Data System (ADS)
Výleta, Roman; Škrinár, Andrej; Danáčová, Michaela; Valent, Peter
2017-10-01
When designing water management systems and their components, more detailed knowledge of the hydrological conditions of the river basin is needed, whose runoff forms the main source of water in the reservoir. Over the lifetime of a water management system, the hydrological time series that served as the input for the design of the system components are never repeated in the same form. The design assumes the observed time series to be representative over the period of the system's use. However, this is a rather unrealistic assumption, because the hydrological past will not be exactly repeated over the design lifetime. When designing water management systems, specialists may therefore occasionally face insufficient or oversized capacity designs, or possibly wrong specification of the management rules, which may lead to non-optimal use. It is therefore necessary to establish a comprehensive approach to simulating the fluctuations in interannual runoff (taking into account the current dry and wet periods) using stochastic modelling techniques in water management practice. The paper deals with the methodological procedure of modelling the mean monthly flows using the stochastic Thomas-Fiering model, modified by applying the Wilson-Hilferty transformation to the independent random numbers. This transformation is usually applied in the case of significant asymmetry in the observed time series. The methodological procedure was applied to the data acquired at the gauging station of Horné Orešany on the Parná Stream. Observed mean monthly flows for the period 1.11.1980 - 31.10.2012 served as the model input. After estimating the model parameters and the Wilson-Hilferty transformation parameters, synthetic time series of mean monthly flows were simulated. These were compared with the observed hydrological time series using basic statistical characteristics (e.g.
mean, standard deviation and skewness) for testing the quality of the model simulation. The synthetic hydrological series of monthly flows were created having the same statistical properties as the time series observed in the past. The compiled model was able to take into account the diversity of extreme hydrological situations in the form of synthetic series of mean monthly flows, confirming the occurrence of flow sequences that could occur in the future. The results of stochastic modelling in the form of synthetic time series of mean monthly flows, which take into account the seasonal fluctuations of runoff within the year, could be applicable in engineering hydrology (e.g. for optimum use of an existing water management system in connection with reassessment of the economic risks of the system).
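The core Thomas-Fiering recursion, each month's flow regressing on the previous month's flow plus a scaled random innovation, can be sketched as follows. This is a minimal version using normally distributed innovations; it omits the Wilson-Hilferty transformation applied in the paper to handle skewed flows:

```python
import math
import random

def thomas_fiering(monthly_mean, monthly_std, r, n_years, seed=0):
    """Generate synthetic monthly flows with the Thomas-Fiering model.

    monthly_mean, monthly_std: 12 per-month means and std deviations.
    r: 12 lag-one correlations between consecutive months.
    Each flow = month mean + regression on previous month's anomaly
    + innovation scaled by std * sqrt(1 - r^2); clipped at zero.
    """
    rng = random.Random(seed)
    flows = [monthly_mean[0]]
    for t in range(1, 12 * n_years):
        m, prev = t % 12, (t - 1) % 12
        b = r[m] * monthly_std[m] / monthly_std[prev]
        eps = rng.gauss(0.0, 1.0)
        q = (monthly_mean[m]
             + b * (flows[-1] - monthly_mean[prev])
             + monthly_std[m] * math.sqrt(1.0 - r[m] ** 2) * eps)
        flows.append(max(q, 0.0))
    return flows
```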
Disentangling Time-series Spectra with Gaussian Processes: Applications to Radial Velocity Analysis
NASA Astrophysics Data System (ADS)
Czekala, Ian; Mandel, Kaisey S.; Andrews, Sean M.; Dittmann, Jason A.; Ghosh, Sujit K.; Montet, Benjamin T.; Newton, Elisabeth R.
2017-05-01
Measurements of radial velocity variations from the spectroscopic monitoring of stars and their companions are essential for a broad swath of astrophysics; these measurements provide access to the fundamental physical properties that dictate all phases of stellar evolution and facilitate the quantitative study of planetary systems. The conversion of those measurements into both constraints on the orbital architecture and individual component spectra can be a serious challenge, however, especially for extreme flux ratio systems and observations with relatively low sensitivity. Gaussian processes define sampling distributions of flexible, continuous functions that are well-motivated for modeling stellar spectra, enabling proficient searches for companion lines in time-series spectra. We introduce a new technique for spectral disentangling, where the posterior distributions of the orbital parameters and intrinsic, rest-frame stellar spectra are explored simultaneously without needing to invoke cross-correlation templates. To demonstrate its potential, this technique is deployed on red-optical time-series spectra of the mid-M-dwarf binary LP661-13. We report orbital parameters with improved precision compared to traditional radial velocity analysis and successfully reconstruct the primary and secondary spectra. We discuss potential applications for other stellar and exoplanet radial velocity techniques and extensions to time-variable spectra. The code used in this analysis is freely available as an open-source Python package.
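The core idea above, modeling a smooth spectrum as a draw from a Gaussian process, can be illustrated with a minimal GP regression. This is a generic squared-exponential GP in numpy, not the authors' full disentangling model (their open-source package performs the joint orbital inference); the toy "spectrum" is a single synthetic absorption line.

```python
import numpy as np

def rbf_kernel(x1, x2, amp=1.0, ell=1.0):
    """Squared-exponential covariance between coordinate grids x1 and x2."""
    d = x1[:, None] - x2[None, :]
    return amp**2 * np.exp(-0.5 * (d / ell) ** 2)

def gp_posterior_mean(x_obs, y_obs, x_new, noise=0.05, amp=1.0, ell=1.0):
    """Posterior mean of a GP conditioned on noisy samples (x_obs, y_obs)."""
    K = rbf_kernel(x_obs, x_obs, amp, ell) + noise**2 * np.eye(x_obs.size)
    Ks = rbf_kernel(x_new, x_obs, amp, ell)
    return Ks @ np.linalg.solve(K, y_obs)

# Toy "spectrum": a noisy absorption line, reconstructed on a finer grid
rng = np.random.default_rng(1)
x_obs = np.linspace(-5, 5, 40)
y_obs = -np.exp(-0.5 * x_obs**2) + 0.05 * rng.standard_normal(40)
x_new = np.linspace(-5, 5, 200)
mean = gp_posterior_mean(x_obs, y_obs, x_new, ell=0.8)
```

In the actual disentangling problem this prior is placed on each component's rest-frame spectrum, and the orbital parameters enter through Doppler shifts of the input coordinates.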
REVISITING EVIDENCE OF CHAOS IN X-RAY LIGHT CURVES: THE CASE OF GRS 1915+105
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mannattil, Manu; Gupta, Himanshu; Chakraborty, Sagar, E-mail: mmanu@iitk.ac.in, E-mail: hiugupta@iitk.ac.in, E-mail: sagarc@iitk.ac.in
2016-12-20
Nonlinear time series analysis has been widely used to search for signatures of low-dimensional chaos in light curves emanating from astrophysical bodies. A particularly popular example is the microquasar GRS 1915+105, whose irregular but systematic X-ray variability has been well studied using data acquired by the Rossi X-ray Timing Explorer. With a view to building simpler models of X-ray variability, attempts have been made to classify the light curves of GRS 1915+105 as chaotic or stochastic. Contrary to some of the earlier suggestions, after careful analysis, we find no evidence for chaos or determinism in any of the GRS 1915+105 classes. The dearth of long and stationary data sets representing all the different variability classes of GRS 1915+105 makes it a poor candidate for analysis using nonlinear time series techniques. We conclude that either very exhaustive data analysis with sufficiently long and stationary light curves should be performed, keeping all the pitfalls of nonlinear time series analysis in mind, or alternative schemes of classifying the light curves should be adopted. The generic limitations of the techniques that we point out in the context of GRS 1915+105 affect all similar investigations of light curves from other astrophysical sources.
Data Sources for Biosurveillance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walters, Ronald A.; Harlan, MS, Pete A.; Nelson, Noele P.
2010-03-01
Biosurveillance analyzes timely data, often fusing time series of many different types of data, to infer the status of public health rather than solely exploiting data having diagnostic specificity. Integrated biosurveillance requires a synthesis of analytic approaches derived from the natural disaster, public health, medical, meteorological, and social science communities, among others, and it is the cornerstone of early disease detection. This paper summarizes major systems dedicated to such an endeavor and emphasizes system capabilities that if creatively exploited could contribute to creation of an effective global biosurveillance enterprise.
VizieR Online Data Catalog: Sample of faint X-ray pulsators (Israel+, 2016)
NASA Astrophysics Data System (ADS)
Israel, G. L.; Esposito, P.; Rodriguez Castillo, G. A.; Sidoli, L.
2018-04-01
As of 2015 December 31, we extracted about 430000 time series from sources with more than 10 counts (after background subtraction); ~190000 of them have more than 50 counts and their PSDs were searched for significant peaks. At the time of writing, the total number of searched Fourier frequencies was about 4.3x10^9. After a detailed screening, we obtained a final sample of 41 (42) new X-ray pulsators (signals), which are listed in Table 1. (1 data file).
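A toy version of such a PSD peak search (not the survey's actual pipeline) might look like the sketch below: compute a periodogram of a binned light curve and flag frequencies whose power stands well above the white-noise level. The threshold rule here is a simple sigma cut, standing in for a proper false-alarm-probability calculation.

```python
import numpy as np

def psd_peak_search(counts, dt=1.0, n_sigma=4.0):
    """Search a binned light curve for candidate periodic signals.

    Returns frequencies whose periodogram power exceeds `n_sigma`
    standard deviations above the mean power (a crude significance cut).
    """
    x = counts - counts.mean()
    power = np.abs(np.fft.rfft(x)) ** 2
    power = power[1:]                        # drop the DC term
    freqs = np.fft.rfftfreq(x.size, dt)[1:]
    thresh = power.mean() + n_sigma * power.std()
    return freqs[power > thresh]

# Toy example: a pulsed source with period 16 s plus Poisson noise
rng = np.random.default_rng(0)
t = np.arange(4096)
rate = 5.0 + 2.0 * np.sin(2 * np.pi * t / 16.0)
counts = rng.poisson(rate).astype(float)
cands = psd_peak_search(counts)              # should recover f = 1/16 Hz
```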
ERIC Educational Resources Information Center
Johnson, Nate
2009-01-01
What does it cost to provide a bachelor's-level education? This question arises with increasing frequency and urgency as pressure mounts on policymakers and education leaders to increase the education attainment level in the United States, to "Double the Numbers" in some cases. At the same time, the two traditional sources of…
ERIC Educational Resources Information Center
Haddad, Caroline, Ed.; Rennie, Luisa, Ed.
2005-01-01
Although many excellent materials now exist that detail the full range of potential uses of Information Communication Technologies (ICTs) in education, already overworked policy makers and others often lack the time it takes to surf the Internet, or access libraries and other sources of information on their own in search of ideas and material…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reiter, R.; Kanter, H.J.; Jaeger, H.
A statistical evaluation of tropospheric ozone concentrations in the air obtained at 3 different levels is presented from data covering 1977 to 1984. Annual and interannual variations are used to project a trend. To clarify the climatology of the stratospheric exchange, the measuring series of cosmogenic radionuclides Be7, P32, P33 covering the period 1970 through 1981 are statistically analyzed with regard to the ozone concentration recorded on the Zugspitze. The statistics of stratospheric intrusions is shown and the stratospheric residence time is estimated. Effects of the eruption of volcano El Chichon in April 1982 on the concentration of the stratospheric aerosol are documented. The time variation of the concentration of the stratospheric aerosol is studied with consideration of the stratospheric circulation. The noted effects are weighed by a comparison with earlier volcanic eruptions. First results of CO2 recordings in the lower stratosphere are presented. Based on CO2 recording series from two different levels (740 m and 1780 m a.s.l.) from the years 1978 to 1980, systematic differences are shown as a function of height. The question of sources and sinks is discussed to assess the contribution from anthropogenic sources.
van der Krieke, Lian; Emerencia, Ando C; Bos, Elisabeth H; Rosmalen, Judith Gm; Riese, Harriëtte; Aiello, Marco; Sytema, Sjoerd; de Jonge, Peter
2015-08-07
Health promotion can be tailored by combining ecological momentary assessments (EMA) with time series analysis. This combined method allows for studying the temporal order of dynamic relationships among variables, which may provide concrete indications for intervention. However, application of this method in health care practice is hampered because analyses are conducted manually and advanced statistical expertise is required. This study aims to show how this limitation can be overcome by introducing automated vector autoregressive modeling (VAR) of EMA data and to evaluate its feasibility through comparisons with results of previously published manual analyses. We developed a Web-based open source application, called AutoVAR, which automates time series analyses of EMA data and provides output that is intended to be interpretable by nonexperts. The statistical technique we used was VAR. AutoVAR tests and evaluates all possible VAR models within a given combinatorial search space and summarizes their results, thereby replacing the researcher's tasks of conducting the analysis, making an informed selection of models, and choosing the best model. We compared the output of AutoVAR to the output of a previously published manual analysis (n=4). An illustrative example consisting of 4 analyses was provided. Compared to the manual output, the AutoVAR output presents similar model characteristics and statistical results in terms of the Akaike information criterion, the Bayesian information criterion, and the test statistic of the Granger causality test. Results suggest that automated analysis and interpretation of time series is feasible. Compared to a manual procedure, the automated procedure is more robust and can save days of time. These findings may pave the way for using time series analysis for health promotion on a larger scale. AutoVAR was evaluated using the results of a previously conducted manual analysis.
Analysis of additional datasets is needed in order to validate and refine the application for general use.
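The kind of model search AutoVAR automates can be sketched, under simplifying assumptions, as a least-squares VAR fit over candidate lag orders scored by an AIC-style criterion. This is an illustration of the idea only, not AutoVAR's actual code or search space (which also varies variable subsets and evaluates residual diagnostics).

```python
import numpy as np

def fit_var(y, p):
    """Least-squares fit of a VAR(p) model to a (T x k) data matrix;
    returns the coefficient matrix and the residual covariance."""
    T, k = y.shape
    rows = T - p
    X = np.ones((rows, 1 + k * p))           # intercept + p lags per series
    for lag in range(1, p + 1):
        X[:, 1 + (lag - 1) * k: 1 + lag * k] = y[p - lag: T - lag]
    Y = y[p:]
    B, *_ = np.linalg.lstsq(X, Y, rcond=None)
    resid = Y - X @ B
    return B, resid.T @ resid / rows

def select_var_order(y, pmax=5):
    """Score each candidate lag order with an AIC-style criterion and
    return the minimizer, mimicking an automated model search."""
    T, k = y.shape
    scores = {}
    for p in range(1, pmax + 1):
        _, sigma = fit_var(y, p)
        n_par = k * (1 + k * p)
        scores[p] = np.log(np.linalg.det(sigma)) + 2 * n_par / (T - p)
    return min(scores, key=scores.get), scores

# Toy EMA-like data: two mutually coupled series with lag-1 dynamics
rng = np.random.default_rng(2)
y = np.zeros((300, 2))
for t in range(1, 300):
    y[t, 0] = 0.6 * y[t-1, 0] + 0.2 * y[t-1, 1] + rng.standard_normal()
    y[t, 1] = -0.3 * y[t-1, 0] + 0.5 * y[t-1, 1] + rng.standard_normal()
best_p, scores = select_var_order(y)
```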
Wang, Xuelei; Wang, Qiao; Yang, Shengtian; Zheng, Donghai; Wu, Chuanqing; Mannaerts, C M
2011-06-01
Nitrogen (N) removal by vegetation uptake is one of the most important functions of riparian buffer zones in preventing non-point source pollution (NSP), and many studies about N uptake at the river reach scale have proven the effectiveness of plants in controlling nutrient pollution. However, at the watershed level, the riparian zones form dendritic networks and, as such, may be the predominant spatially structured feature in catchments and landscapes. Thus, assessing the functions of riparian system at the basin scale is important. In this study, a new method coupling remote sensing and ecological models was used to assess the N removal by riparian vegetation on a large spatial scale. The study site is located around the Guanting reservoir in Beijing, China, which was abandoned as the source water system for Beijing due to serious NSP in 1997. SPOT 5 data was used to map the land cover, and Landsat-5 TM time series images were used to retrieve land surface parameters. A modified forest nutrient cycling and biomass model (ForNBM) was used to simulate N removal, and the modified net primary productivity (NPP) module was driven by remote sensing image time series. Besides the remote sensing data, the necessary database included meteorological data, soil chemical and physical data and plant nutrient data. Pot and plot experiments were used to calibrate and validate the simulations. Our study has proven that, by coupling remote sensing data and parameters retrieval techniques to plant growth process models, catchment scale estimations of nitrogen uptake rates can be improved by spatial pixel-based modelling. Copyright © 2011 Elsevier B.V. All rights reserved.
The influence of biomass energy consumption on CO2 emissions: a wavelet coherence approach.
Bilgili, Faik; Öztürk, İlhan; Koçak, Emrah; Bulut, Ümit; Pamuk, Yalçın; Muğaloğlu, Erhan; Bağlıtaş, Hayriye H
2016-10-01
Observations from the energy literature suggest that (i) carbon dioxide emissions are one of the main contributors to global warming, (ii) fossil fuel energy usage contributes greatly to carbon dioxide emissions, and (iii) simulations from energy models draw the attention of policy makers to renewable energy as an alternative energy source to mitigate carbon dioxide emissions. Although there is intensive work on renewables in the related literature regarding their efficiency and impact on environmental quality, further studies are still needed to establish the significance of renewables for the environment, since (i) the existing seminal papers employ time series models, panel data models, or other statistical tools to detect the role of renewables in the environment and (ii) existing papers mostly consider aggregated renewable energy sources rather than examining the major components of aggregated renewables. This paper examines the impact of biomass on carbon dioxide emissions in detail through time series and frequency analyses; hence, the paper follows wavelet coherence analyses. The data cover US monthly observations ranging from 1984:1 to 2015 for the variables of total energy carbon dioxide emissions, biomass energy consumption, coal consumption, petroleum consumption, and natural gas consumption. Through wavelet coherence and wavelet partial coherence analyses, the paper observes the frequency properties as well as the time series properties of the relevant variables to reveal the possible significant influence of biomass usage on emissions in the USA in both short-term and long-term cycles. The paper finds that biomass consumption mitigates CO2 emissions in long-run cycles after the year 2005 in the USA.
Decadal variability on the Northwest European continental shelf
NASA Astrophysics Data System (ADS)
Jones, Sam; Cottier, Finlo; Inall, Mark; Griffiths, Colin
2018-02-01
Decadal-scale time series of the shelf seas are important for both climate and process studies. Despite numerous investigations of long-term temperature variability in the shelf seas, studies of salinity variability are few. Salt is a more conservative tracer than temperature in shallow seas, and it can reveal changes in local hydrographic conditions as well as transmitted basin-scale changes. Here, new inter-annual salinity time series on the northwest European shelf are developed and a 13-year high-resolution salinity record from a coastal mooring in western Scotland is presented and analysed. We find strong temporal variability in coastal salinity on timescales ranging from tidal to inter-annual, with the magnitude of variability greatest during winter months. There is little seasonality and no significant decadal trend in the coastal time series of salinity. We propose 4 hydrographic states to explain salinity variance in the shelf area west of Scotland based on the interaction between a baroclinic coastal current and wind-forced barotropic flow: while wind forcing is important, we find that changes in the buoyancy-driven flow are more likely to influence long-term salinity observations. We calculate that during prevailing westerly wind conditions, surface waters in the Sea of the Hebrides receive a mix of 62% Atlantic origin water to 38% coastal sources. This contrasts with easterly wind conditions, during which the mix is 6% Atlantic to 94% coastal sources on average. This 'switching' between hydrographic states is expected to impact nutrient transport and therefore modify the level of primary productivity on the shelf. This strong local variability in salinity is roughly an order of magnitude greater than changes in the adjacent ocean basin, and we infer from this that Scottish coastal waters are likely to be resilient to decadal changes in ocean climate.
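Atlantic/coastal percentages of the kind quoted above follow from a standard two-end-member mixing calculation on salinity. The end-member salinities below are illustrative placeholders, not the paper's values.

```python
def atlantic_fraction(s_obs, s_atlantic=35.4, s_coastal=33.0):
    """Two-end-member mixing: fraction of Atlantic-origin water in a
    sample of salinity s_obs.  End-member salinities are hypothetical
    stand-ins for the Atlantic and coastal source waters."""
    return (s_obs - s_coastal) / (s_atlantic - s_coastal)

# A sample of intermediate salinity resolves into its two contributions
f_atl = atlantic_fraction(34.49)
f_coast = 1.0 - f_atl
```

With these placeholder end-members, a salinity of 34.49 implies roughly a 62/38 Atlantic-to-coastal split, the style of result reported in the abstract.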
NASA Astrophysics Data System (ADS)
Kasai, K.; Shiomi, K.; Konno, A.; Tadono, T.; Hori, M.
2016-12-01
Global observation of greenhouse gases such as carbon dioxide (CO2) and methane (CH4) with high spatio-temporal resolution and accurate estimation of sources and sinks are important for understanding greenhouse gas dynamics. The Greenhouse Gases Observing Satellite (GOSAT) has observed column-averaged dry-air mole fractions of CO2 (XCO2) and CH4 (XCH4) over 7 years since January 2009 with a wide swath but sparse pointing. The Orbiting Carbon Observatory-2 (OCO-2) has observed XCO2 jointly on orbit since July 2014 with a narrow swath but high resolution. We use two retrieved datasets as GOSAT observation data. One is the ACOS GOSAT/TANSO-FTS Level 2 Full Product by NASA/JPL, and the other is the NIES TANSO-FTS L2 column amount (SWIR). By using these GOSAT datasets and the OCO-2 L2 Full Product, the biases among datasets, local sources and sinks, and the temporal variability of greenhouse gases are clarified. In addition, CarbonTracker, a global model of atmospheric CO2 and CH4 developed by NOAA/ESRL, is also analyzed for comparison between satellite observation data and atmospheric model data. Before analyzing these datasets, outliers are screened using quality flags, outcome flags, and warn levels over land and sea. Time series data of XCO2 and XCH4 are obtained globally from satellite observation and atmospheric model datasets, and functions which express typical inter-annual and seasonal variation are fitted to each spatial grid. Consequently, anomalous events of XCO2 and XCH4 are extracted from the difference between each time series dataset and the fitted function. Regional emission and absorption events are analyzed by the time series variation of satellite observation data and by comparison with atmospheric model data.
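The step of fitting "functions which express typical inter-annual and seasonal variation" and extracting anomalies from the residuals can be sketched as a least-squares fit of a linear trend plus an annual harmonic, followed by a sigma cut on the residuals. The numbers below are synthetic stand-ins, not GOSAT or OCO-2 data.

```python
import numpy as np

def fit_trend_seasonal(t, y):
    """Least-squares fit of y ~ a + b*t + c*cos(2*pi*t) + d*sin(2*pi*t),
    with t in years, capturing trend plus annual cycle."""
    A = np.column_stack([np.ones_like(t), t,
                         np.cos(2 * np.pi * t), np.sin(2 * np.pi * t)])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef, A @ coef

def flag_anomalies(y, fitted, n_sigma=3.0):
    """Flag points deviating from the fitted climatology by > n_sigma."""
    resid = y - fitted
    return np.abs(resid) > n_sigma * resid.std()

# Toy XCO2-like series: ~2 ppm/yr growth, seasonal cycle, one injected spike
rng = np.random.default_rng(3)
t = np.arange(0, 7, 1 / 12)                  # 7 years of monthly samples
y = 395 + 2.0 * t + 3.0 * np.sin(2 * np.pi * t) + 0.3 * rng.standard_normal(t.size)
y[40] += 5.0                                 # anomalous event
coef, fitted = fit_trend_seasonal(t, y)
mask = flag_anomalies(y, fitted)
```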
Aerosol composition and its sources at the King Sejong Station, Antarctic peninsula
NASA Astrophysics Data System (ADS)
Mishra, Vinit K.; Kim, Ki-Hyun; Hong, Sungmin; Lee, Khanghyun
The annual cycles of major metals and ions in suspended particulate matter (SPM) have been investigated at a coastal site of the Antarctic Peninsula in order to elucidate temporal variations as well as the major source processes responsible for their formation. The measurements were performed from January 2000 to December 2001 at the Korean Antarctic research station, 'King Sejong' (62°13' S, 58°47' W). The observed time series of important aerosol components showed clear seasonal variation patterns, while the mean elemental concentrations (e.g., 1875 (Al), 10.3 (Ba), 0.3 (Bi), 1.3 (Cd), 1.7 pg m-3 (Co)) were generally compatible with those reported previously. The presence of high EF values with respect to both mean crustal and seawater composition (such as Bi, Cd, Cr, Cu, Ni, V, and Zn), however, suggests a possibly important role of anthropogenic processes at this remote site. In contrast, the concentrations of ionic species were not clearly distinguishable from those of other Antarctic sites; but the consideration of ionic mass balance between cations and anions pointed out the uniqueness of their source/sink processes in the study area. The major source processes of those aerosol components were also investigated using a series of statistical analyses. The overall results of our study indicated the dominance of several processes (or sources) such as sea-salt emission, secondary aerosol formation, and anthropogenic pollution from both local and distant sources.
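The enrichment factor (EF) analysis mentioned above is conventionally computed by normalizing each element to a crustal reference element such as Al. The sketch below uses the abstract's Cd and Al aerosol concentrations together with rounded textbook crustal abundances, which are illustrative assumptions, not values from the paper.

```python
def enrichment_factor(x_aerosol, al_aerosol, x_crust, al_crust):
    """Crustal enrichment factor EF = (X/Al)_aerosol / (X/Al)_crust.
    EF >> 1 suggests a non-crustal (e.g. anthropogenic) source for X."""
    return (x_aerosol / al_aerosol) / (x_crust / al_crust)

# Cd and Al in aerosol (pg m-3, from the abstract) against hypothetical
# rounded crustal abundances (mg kg-1)
ef_cd = enrichment_factor(x_aerosol=1.3, al_aerosol=1875.0,
                          x_crust=0.1, al_crust=80000.0)
```

An EF in the hundreds, as this toy calculation yields for Cd, is the kind of signal the abstract interprets as evidence of anthropogenic input.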
Madjidi, Faramarz; Behroozy, Ali
2014-01-01
Exposure to visible light and near infrared (NIR) radiation in the wavelength region of 380 to 1400 nm may cause thermal retinal injury. In this analysis, the effective spectral radiance of a hot source is replaced by its temperature in the exposure limit values for the region of 380-1400 nm. This article describes the development and implementation of a computer code to predict those temperatures corresponding to the exposure limits proposed by the American Conference of Governmental Industrial Hygienists (ACGIH). Viewing duration and the apparent diameter of the source were inputs for the computer code. At the first stage, an infinite series was derived for the calculation of spectral radiance by integrating Planck's law. At the second stage, to calculate the effective spectral radiance, the initial terms of this infinite series were selected and the integration was performed by multiplying these terms by the weighting factor R(λ) in the wavelength region 380-1400 nm. At the third stage, using a computer code, the source temperature that can emit the same effective spectral radiance was found. As a result, based only on measuring the source temperature and accounting for the exposure time and the apparent diameter of the source, it is possible to decide whether the exposure to visible and NIR radiation in any 8-hr workday is permissible. The substitution of source temperature for effective spectral radiance provides a convenient way to evaluate exposure to visible light and NIR.
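The three stages described can be sketched numerically: evaluate Planck's law, weight it by R(λ), integrate over 380-1400 nm, and invert for temperature (radiance is monotonic in temperature, so bisection is well posed). The R(λ) used here is a flat placeholder; the real ACGIH retinal thermal hazard weighting is a tabulated, wavelength-dependent function, and this sketch uses direct numerical quadrature rather than the article's series expansion.

```python
import numpy as np

H = 6.626e-34    # Planck constant, J s
C = 2.998e8      # speed of light, m/s
KB = 1.381e-23   # Boltzmann constant, J/K

def planck_radiance(lam, T):
    """Blackbody spectral radiance (Planck's law), W m-2 sr-1 m-1."""
    return 2 * H * C**2 / lam**5 / np.expm1(H * C / (lam * KB * T))

def retinal_weight(lam):
    """Placeholder hazard weighting R(lambda): flat over 380-1400 nm."""
    return np.where((lam >= 380e-9) & (lam <= 1400e-9), 1.0, 0.0)

def effective_radiance(T, n=2000):
    """Trapezoidal integration of R(lambda)*B(lambda,T) over 380-1400 nm."""
    lam = np.linspace(380e-9, 1400e-9, n)
    f = retinal_weight(lam) * planck_radiance(lam, T)
    return np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(lam))

def source_temperature_for(L_limit, t_lo=500.0, t_hi=6000.0, tol=1.0):
    """Bisection: find the temperature whose effective radiance equals a
    given exposure-limit radiance (the article's third stage)."""
    while t_hi - t_lo > tol:
        t_mid = 0.5 * (t_lo + t_hi)
        if effective_radiance(t_mid) < L_limit:
            t_lo = t_mid
        else:
            t_hi = t_mid
    return 0.5 * (t_lo + t_hi)
```

A round trip (compute the effective radiance at some temperature, then invert it) recovers the temperature to within the bisection tolerance.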
Reduced rank models for travel time estimation of low order mode pulses.
Chandrayadula, Tarun K; Wage, Kathleen E; Worcester, Peter F; Dzieciuch, Matthew A; Mercer, James A; Andrew, Rex K; Howe, Bruce M
2013-10-01
Mode travel time estimation in the presence of internal waves (IWs) is a challenging problem. IWs perturb the sound speed, which results in travel time wander and mode scattering. A standard approach to travel time estimation is to pulse compress the broadband signal, pick the peak of the compressed time series, and average the peak time over multiple receptions to reduce variance. The peak-picking approach implicitly assumes there is a single strong arrival and does not perform well when there are multiple arrivals due to scattering. This article presents a statistical model for the scattered mode arrivals and uses the model to design improved travel time estimators. The model is based on an Empirical Orthogonal Function (EOF) analysis of the mode time series. Range-dependent simulations and data from the Long-range Ocean Acoustic Propagation Experiment (LOAPEX) indicate that the modes are represented by a small number of EOFs. The reduced-rank EOF model is used to construct a travel time estimator based on the Matched Subspace Detector (MSD). Analysis of simulation and experimental data shows that the MSDs are more robust to IW scattering than peak picking. The simulation analysis also highlights how IWs affect the mode excitation by the source.
NASA Astrophysics Data System (ADS)
Kappler, Karl N.; Schneider, Daniel D.; MacLean, Laura S.; Bleier, Thomas E.
2017-08-01
A method for identification of pulsations in time series of magnetic field data which are simultaneously present in multiple channels of data at one or more sensor locations is described. Candidate pulsations of interest are first identified in geomagnetic time series by inspection. Time series of these "training events" are represented in matrix form and transpose-multiplied to generate time-domain covariance matrices. The ranked eigenvectors of this matrix are stored as a feature of the pulsation. In the second stage of the algorithm, a sliding window (approximately the width of the training event) is moved across the vector-valued time series comprising the channels on which the training event was observed. At each window position, the data covariance matrix and associated eigenvectors are calculated. We compare the orientation of the dominant eigenvectors of the training data to those from the windowed data and flag windows where the dominant eigenvector directions are similar. This was successful in automatically identifying pulses which share polarization and appear to be from the same source process. We apply the method to a case study of continuously sampled (50 Hz) data from six observatories, each equipped with three-component induction coil magnetometers. We examine a 90-day interval of data associated with a cluster of four observatories located within 50 km of Napa, California, together with two remote reference stations, one 100 km to the north of the cluster and the other 350 km south. When the training data contain signals present in the remote reference observatories, we are reliably able to identify and extract global geomagnetic signals such as solar-generated noise. When the training data contain pulsations only observed in the cluster of local observatories, we identify several types of non-plane-wave signals having similar polarization.
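The two-stage algorithm above can be sketched as follows, with synthetic three-component data standing in for induction-coil records: compute the dominant covariance eigenvector of a training window, then slide a window of the same length over the data and flag positions where the eigenvector direction is similar (a simple cosine threshold stands in for whatever similarity criterion the authors use).

```python
import numpy as np

def dominant_eigvec(window):
    """Dominant eigenvector of the time-domain covariance matrix of a
    (samples x channels) data window."""
    cov = window.T @ window
    _, v = np.linalg.eigh(cov)
    return v[:, -1]                      # eigenvector of largest eigenvalue

def find_similar_pulses(data, template, threshold=0.95):
    """Slide a template-length window over multichannel data and flag
    start indices where the dominant covariance eigenvector direction is
    close to the template's (|cosine of angle| > threshold)."""
    ref = dominant_eigvec(template)
    n = template.shape[0]
    hits = []
    for start in range(data.shape[0] - n + 1):
        v = dominant_eigvec(data[start:start + n])
        if abs(ref @ v) > threshold:     # eigenvector sign is arbitrary
            hits.append(start)
    return hits

# Toy 3-channel series: background noise plus one polarized pulse at t=500
rng = np.random.default_rng(4)
data = 0.1 * rng.standard_normal((1000, 3))
pol = np.array([0.8, 0.5, 0.33])
pol /= np.linalg.norm(pol)
pulse = np.sin(np.linspace(0, 3 * np.pi, 60))
data[500:560] += np.outer(pulse, pol)
template = data[500:560]                 # "training event" by inspection
hits = find_similar_pulses(data, template)
```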
NASA Astrophysics Data System (ADS)
Rahim, K. J.; Cumming, B. F.; Hallett, D. J.; Thomson, D. J.
2007-12-01
An accurate assessment of historical local Holocene data is important in making future climate predictions. Holocene climate is often obtained through proxy measures such as diatoms or pollen using radiocarbon dating. Wiggle Match Dating (WMD) uses an iterative least squares approach to tune a core with a large number of 14C dates to the 14C calibration curve. This poster will present a new method of tuning a time series when only a modest number of 14C dates are available. The method presented uses multitaper spectral estimation, specifically a multitaper spectral coherence tuning technique. Holocene climate reconstructions are often based on a simple depth-time fit such as a linear interpolation, splines, or low-order polynomials. Many of these models make use of only a small number of 14C dates, each of which is a point estimate with a significant variance. This technique attempts to tune the 14C dates to a reference series, such as tree rings, varves, or the radiocarbon calibration curve. The amount of 14C in the atmosphere is not constant, and a significant source of variance is solar activity. A decrease in solar activity coincides with an increase in cosmogenic isotope production, and an increase in cosmogenic isotope production coincides with a decrease in temperature. The method presented uses multitaper coherence estimates and adjusts the phase of the time series to line up significant line components with those of the reference series in an attempt to obtain a better depth-time fit than the original model. Given recent concerns and demonstrations of the variation in estimated dates from radiocarbon labs, methods to confirm and tune the depth-time fit can aid climate reconstructions by improving and serving to confirm the accuracy of the underlying depth-time fit. Climate reconstructions can then be made on the improved depth-time fit.
This poster presents a run though of this process using Chauvin Lake in the Canadian prairies and Mt. Barr Cirque Lake located in British Columbia as examples.
Geodetic imaging of tectonic deformation with InSAR
NASA Astrophysics Data System (ADS)
Fattahi, Heresh
Precise measurements of ground deformation across the plate boundaries are crucial observations to evaluate the location of strain localization and to understand the pattern of strain accumulation at depth. Such information can be used to evaluate the possible location and magnitude of future earthquakes. Interferometric Synthetic Aperture Radar (InSAR) potentially can deliver small-scale (few mm/yr) ground displacement over long distances (hundreds of kilometers) across the plate boundaries and over continents. However, given the ground displacement as our signal of interest, the InSAR observations of ground deformation are usually affected by several sources of systematic and random noise. In this dissertation I identify several sources of systematic and random noise, develop new methods to model and mitigate the systematic noise, and evaluate the uncertainty of the ground displacement measured with InSAR. I use the developed approach to characterize the tectonic deformation and evaluate the rate of strain accumulation along the Chaman fault system, the western boundary of the India and Eurasia tectonic plates. I evaluate the bias due to the topographic residuals in the InSAR range-change time-series and develop a new method to estimate the topographic residuals and mitigate their effect on the InSAR range-change time-series (Chapter 2). I develop a new method to evaluate the uncertainty of the InSAR velocity field due to the uncertainty of the satellite orbits (Chapter 3) and a new algorithm to automatically detect and correct the phase unwrapping errors in a dense network of interferograms (Chapter 4). I develop a new approach to evaluate the impact of systematic and stochastic components of the tropospheric delay on the InSAR displacement time-series and its uncertainty (Chapter 5).
Using the new InSAR time-series approach developed in the previous chapters, I study the tectonic deformation across the western boundary of the India plate with Eurasia and evaluate the rate of strain accumulation along the Chaman fault system (Chapter 5). I also evaluate the co-seismic and post-seismic displacement of a moderate M5.5 earthquake on the Ghazaband fault (Chapter 6). The developed methods to mitigate the systematic noise from InSAR time-series significantly improve the accuracy of the InSAR displacement time-series and velocity. The approaches to evaluate the effect of the stochastic components of noise in InSAR displacement time-series enable us to obtain the variance-covariance matrix of the InSAR displacement time-series and to express their uncertainties. The effect of the topographic residuals in the InSAR range-change time-series is proportional to the perpendicular-baseline history of the set of SAR acquisitions. The proposed method for topographic residual correction efficiently corrects the displacement time-series. Evaluation of the velocity uncertainty due to orbital errors shows that an uncertainty of 0.2 mm/yr per 100 km is achievable for modern SAR satellites with precise orbits, such as TerraSAR-X and Sentinel-1, while for older satellites with less accurate orbits, such as ERS and Envisat, uncertainties of 1.5 and 0.5 mm/yr per 100 km, respectively, are achievable. However, the uncertainty due to orbital errors depends on the orbital uncertainties and on the number and time span of SAR acquisitions. The contribution of the tropospheric delay to the InSAR range-change time-series can be subdivided into systematic (seasonal delay) and stochastic components. The systematic component biases the displacement time-series and velocity field as a function of the acquisition time, and the non-seasonal component contributes significantly to the InSAR uncertainty.
Both components are spatially correlated, and therefore the covariance of noise between pixels should be considered when evaluating the uncertainty due to the random tropospheric delay. The relative velocity uncertainty due to the random tropospheric delay depends on the scatter of the random delay and is inversely proportional to the number of acquisitions and the total time span covered by the SAR acquisitions. InSAR observations across the Chaman fault system show that the relative motion between India and Eurasia along this western boundary is distributed among different faults. The InSAR velocity field indicates strain localization on the Chaman fault and the Ghazaband fault, with slip rates of ~8 and ~16 mm/yr, respectively. The high rate of strain accumulation on the Ghazaband fault, and the lack of evidence that the fault ruptured during the 1935 Quetta earthquake, indicate that enough strain has accumulated for a large (M>7) earthquake, which threatens Balochistan and the city of Quetta. The Chaman fault is creeping from latitude ~29.5°N to ~32.5°N with a maximum surface creep rate of 8 mm/yr, which indicates that it is only partially locked; therefore moderate earthquakes (M<7), similar to those recorded in the last 100 years, are expected.
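The baseline-proportional topographic residual described above lends itself to a joint least-squares estimation of velocity and DEM error. The sketch below illustrates the idea on synthetic numbers (baselines, geometry and rates are all invented); it is a minimal illustration, not the dissertation's actual algorithm:

```python
import numpy as np

# Synthetic range-change time series contaminated by a topographic residual
# that is proportional to the perpendicular baseline of each acquisition.
rng = np.random.default_rng(0)
t = np.linspace(0, 5, 25)                      # acquisition times [years]
bperp = rng.uniform(-300, 300, t.size)         # perpendicular baselines [m]
r_sin_theta = 850e3 * np.sin(np.deg2rad(34))   # slant range * sin(incidence)

v_true, dz_true = 0.004, 12.0                  # 4 mm/yr velocity, 12 m DEM error
d = v_true * t + (bperp / r_sin_theta) * dz_true

# Jointly estimate velocity and DEM error by least squares:
#   d_i = v * t_i + (Bperp_i / (r sin(theta))) * dz
G = np.column_stack([t, bperp / r_sin_theta])
v_est, dz_est = np.linalg.lstsq(G, d, rcond=None)[0]
d_corrected = d - (bperp / r_sin_theta) * dz_est
```

Because the baseline history is uncorrelated with time here, the two parameters separate cleanly; with a real baseline history the trade-off between them is what drives the bias analysed in Chapter 2.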
An agreement coefficient for image comparison
Ji, Lei; Gallo, Kevin
2006-01-01
Combining datasets acquired from different sensor systems is necessary to construct a long time-series dataset for remotely sensed land-surface variables. Assessing the agreement of data derived from various sources is therefore an important issue in understanding data continuity through the time-series. Some traditional measures, including the correlation coefficient, the coefficient of determination, the mean absolute error, and the root mean square error, are not always optimal for evaluating data agreement. For this reason, we developed a new agreement coefficient for comparing two different images. The agreement coefficient has the following properties: it is non-dimensional, bounded, and symmetric, and it distinguishes between systematic and unsystematic differences. The paper provides examples of agreement analyses for hypothetical data and actual remotely sensed data. The results demonstrate that the agreement coefficient does have these properties and is therefore a useful tool for image comparison.
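A minimal sketch of an agreement coefficient of this form, following our reading of the Ji and Gallo (2006) formula (AC = 1 − SSD/SPOD, where SPOD is a "sum of potential difference" term built from the two image means), could look like:

```python
import numpy as np

def agreement_coefficient(x, y):
    """Agreement coefficient in the Ji-Gallo style: 1 minus the ratio of the
    sum of squared differences (SSD) to the sum of potential difference (SPOD)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    ssd = np.sum((x - y) ** 2)
    mx, my = x.mean(), y.mean()
    spod = np.sum((abs(mx - my) + np.abs(x - mx)) *
                  (abs(mx - my) + np.abs(y - my)))
    return 1.0 - ssd / spod

x = np.array([1.0, 2.0, 3.0, 4.0])
ac_identical = agreement_coefficient(x, x)        # perfect agreement -> 1.0
ac_offset = agreement_coefficient(x, x + 1.0)     # systematic bias lowers AC
```

Unlike a correlation coefficient, the value drops for a pure systematic offset even though the two images are perfectly correlated, which is precisely the property the abstract highlights.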
Advanced functional network analysis in the geosciences: The pyunicorn package
NASA Astrophysics Data System (ADS)
Donges, Jonathan F.; Heitzig, Jobst; Runge, Jakob; Schultz, Hanna C. H.; Wiedermann, Marc; Zech, Alraune; Feldhoff, Jan; Rheinwalt, Aljoscha; Kutza, Hannes; Radebach, Alexander; Marwan, Norbert; Kurths, Jürgen
2013-04-01
Functional networks are a powerful tool for analyzing large geoscientific datasets, such as global fields of climate time series originating from observations or model simulations. pyunicorn (pythonic unified complex network and recurrence analysis toolbox) is an open-source, fully object-oriented and easily parallelizable package written in the Python language. It allows for constructing functional networks (aka climate networks) representing the structure of statistical interrelationships in large datasets and, subsequently, investigating this structure using advanced methods of complex network theory, such as measures for networks of interacting networks, node-weighted statistics, or network surrogates. Additionally, pyunicorn allows studying the complex dynamics of geoscientific systems, as recorded by time series, by means of recurrence networks and visibility graphs. The range of possible applications of the package is outlined, drawing on several examples from climatology.
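The core construction behind a functional (climate) network is simple: nodes are time series and edges link pairs whose statistical interrelationship exceeds a threshold. The standalone sketch below uses plain NumPy rather than the pyunicorn API itself, with invented toy series:

```python
import numpy as np

# Toy functional network: three "grid point" time series, two of which share
# a common signal; edges link pairs with |correlation| above a threshold.
rng = np.random.default_rng(1)
base = rng.standard_normal(200)
series = np.vstack([
    base + 0.1 * rng.standard_normal(200),   # node 0: driven by common signal
    base + 0.1 * rng.standard_normal(200),   # node 1: driven by common signal
    rng.standard_normal(200),                # node 2: independent
])

corr = np.corrcoef(series)                         # similarity matrix
adjacency = (np.abs(corr) > 0.5) & ~np.eye(3, dtype=bool)
degree = adjacency.sum(axis=1)                     # simple network measure
```

On this adjacency matrix one would then apply the network measures the abstract mentions (degree, interacting-network statistics, surrogates); pyunicorn packages that workflow for large climate fields.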
Revision of Primary Series Maps
2000-01-01
In 1992, the U.S. Geological Survey (USGS) completed a 50-year effort to provide primary series map coverage of the United States. Many of these maps now need to be updated to reflect the construction of new roads and highways and other changes that have taken place over time. The USGS has formulated a graphic revision plan to help keep the primary series maps current. Primary series maps include 1:20,000-scale quadrangles of Puerto Rico, 1:24,000- or 1:25,000-scale quadrangles of the conterminous United States, Hawaii, and U.S. Territories, and 1:63,360-scale quadrangles of Alaska. The revision of primary series maps from new collection sources is accomplished using a variety of processes. The raster revision process combines the scanned content of paper maps with raster updating technologies. The vector revision process involves the automated plotting of updated vector files. Traditional processes use analog stereoplotters and manual scribing instruments on specially coated map separates. The ability to select from or combine these processes increases the efficiency of the National Mapping Division map revision program.
Discovery and identification of a series of alkyl decalin isomers in petroleum geological samples.
Wang, Huitong; Zhang, Shuichang; Weng, Na; Zhang, Bin; Zhu, Guangyou; Liu, Lingyan
2015-07-07
Comprehensive two-dimensional gas chromatography/time-of-flight mass spectrometry (GC × GC/TOFMS) was used to characterize a crude oil and a source-rock extract sample. During this process, a series of pairwise components lying between the monocyclic alkanes and mono-aromatics was discovered. After tentative assignment as decahydronaphthalene isomers, a series of alkyl decalin isomers was synthesized and used to identify and validate these petroleum compounds. From both the MS and chromatographic information, these pairwise compounds were identified as 2-alkyl-decahydronaphthalenes and 1-alkyl-decahydronaphthalenes, the latter showing stronger polarity. Their long-chain alkyl substituent groups may be due to bacterial transformation or different oil-cracking events. This systematic profiling of alkyl-decahydronaphthalene isomers provides further understanding and recognition of these potential petroleum biomarkers.
NASA Astrophysics Data System (ADS)
Chromá, Kateřina; Brázdil, Rudolf; Dolák, Lukáš; Řezníčková, Ladislava; Valášek, Hubert; Zahradníček, Pavel
2016-04-01
Hailstorms are natural phenomena that cause great material damage at present, just as they did in the past. In Moravia (the eastern part of the Czech Republic), systematic meteorological observations generally started in the latter half of the 19th century. Therefore, in order to create long-term series of hailstorms, it is necessary to search for other sources of information. Different types of documentary evidence are used in historical climatology, such as annals, chronicles, diaries, private letters, and newspapers. Besides these, institutional documentary evidence of an economic and administrative character (e.g. taxation records) is of particular importance. This study aims to create a long-term series of hailstorms in South Moravia using various types of documentary evidence (taxation records, family archives, chronicles and newspapers being the most important) and systematic meteorological observations in the station network. Although the available hailstorm data cover the period 1541-2014, incomplete documentary evidence allows a reasonable analysis of fluctuations in hailstorm frequency only since the 1770s. The series compiled from documentary data and systematic meteorological observations is used to identify periods of lower and higher hailstorm frequency. The existing data may also be used to study spatial hailstorm variability. Basic uncertainties of the compiled hailstorm series are discussed. Despite some bias in the hailstorm data, the South-Moravian hailstorm series significantly extends our knowledge about this phenomenon in the south-eastern part of the Czech Republic. The study is part of the research project "Hydrometeorological extremes in Southern Moravia derived from documentary evidence" supported by the Grant Agency of the Czech Republic, reg. no. 13-19831S.
Klebsiella Pneumoniae Liver Abscess: A Case Series of Six Asian Patients
Oikonomou, Katerina G.; Aye, Myint
2017-01-01
Case series. Patients: Female, 60 • Male, 45 • Male, 56 • Male, 65 • Female, 57 • Male, 35. Final Diagnosis: Klebsiella pneumoniae liver abscess. Symptoms: Fever. Medication: —. Clinical Procedure: —. Specialty: Infectious Diseases. Objective: Rare co-existence of disease or pathology. Background: Liver abscesses represent a serious infection of hepatic parenchyma and are associated with significant morbidity and mortality. The emergence of a new hypervirulent variant of Klebsiella pneumoniae, which can cause serious infections in the Asian population, is under investigation. We report a case series of six Asian patients hospitalized at our institution from January 2013 to November 2015 for liver abscess due to Klebsiella pneumoniae. Case Report: Charts of six Asian patients were retrospectively reviewed. Four patients were male and two were female. The mean age was 53 years (range: 35–64 years). No patient had a known past medical history of immunodeficiency. Three patients had multiple liver abscesses at the time of initial presentation. In five patients, the source of entry of the pathogenic microorganism was unknown, and in one patient the suspected source of entry was the gastrointestinal tract. In three patients there was also concomitant Klebsiella pneumoniae bacteremia. The mean duration of antibiotic treatment was seven weeks and the mean duration of hospital stay was 13.5 days. Conclusions: Liver abscess should always be included in the differential diagnosis in cases of sepsis without an obvious source and/or in clinical scenarios of fever, abdominal pain, and liver lesions. PMID:28947732
NASA Astrophysics Data System (ADS)
Milej, Daniel; Janusek, Dariusz; Gerega, Anna; Wojtkiewicz, Stanislaw; Sawosz, Piotr; Treszczanowicz, Joanna; Weigl, Wojciech; Liebert, Adam
2015-10-01
The aim of the study was to determine optimal measurement conditions for the assessment of brain perfusion with the use of an optical contrast agent and time-resolved diffuse reflectometry in the near-infrared wavelength range. The source-detector separation at which the distribution of times of flight (DTOF) of photons provides useful information on the inflow of the contrast agent to the intracerebral brain tissue compartments was determined. A series of Monte Carlo simulations was performed in which the inflow and washout of the dye in extra- and intracerebral tissue compartments were modeled and DTOFs were obtained at different source-detector separations. Furthermore, tests on diffuse phantoms were carried out using a time-resolved setup allowing the measurement of DTOFs at 16 source-detector separations. Finally, the setup was applied in experiments carried out on the heads of adult volunteers during intravenous injection of indocyanine green. Analysis of the statistical moments of the measured DTOFs showed that a source-detector separation of 6 cm is recommended for monitoring the inflow of optical contrast to the intracerebral brain tissue compartments with continuous-wave reflectometry, whereas a separation of 4 cm is sufficient when the higher-order moments of the DTOFs are available.
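The statistical moments of a DTOF that such analyses rely on (total photon count, mean time of flight, variance) are straightforward weighted sums over the histogram. A toy sketch with an invented Gaussian-shaped DTOF:

```python
import numpy as np

# Statistical moments of a toy distribution of times of flight (DTOF):
# total count (0th moment), mean time of flight (1st), variance (2nd central).
t = np.linspace(0, 5e-9, 500)                        # time bins [s]
counts = np.exp(-0.5 * ((t - 2e-9) / 0.4e-9) ** 2)   # synthetic DTOF shape

n_total = counts.sum()
mean_tof = np.sum(t * counts) / n_total                       # 1st moment
variance = np.sum((t - mean_tof) ** 2 * counts) / n_total     # 2nd central moment
```

In the measurement context, changes of these moments during dye inflow carry depth information: higher-order moments weight late (deeply penetrating) photons more strongly, which is why they relax the required source-detector separation.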
Neural basis of postural instability identified by VTC and EEG
Cao, Cheng; Jaiswal, Niharika; Newell, Karl M.
2010-01-01
In this study, we investigated the neural basis of virtual time to contact (VTC) and the hypothesis that VTC provides predictive information about future postural instability. We developed a novel approach to differentiate the stable pre-falling and transition-to-instability stages within a single postural trial while a subject performed a challenging single-leg stance with eyes closed. Specifically, we utilized wavelet transforms and stage segmentation algorithms with the VTC time series as input. The VTC time series was time-locked with multichannel (n = 64) EEG signals to examine its underlying neural substrates. To identify the focal sources of the neural substrates of VTC, a two-step approach was designed combining independent component analysis (ICA) and low-resolution tomography (LORETA) of the multichannel EEG. There were two major findings: (1) a significant increase of VTC minimal values (along with enhanced variability of VTC) was observed during the transition-to-instability stage, with progression to ultimate loss of balance and falling; and (2) these VTC dynamics were associated with pronounced modulation of the EEG, predominantly within the theta, alpha and gamma frequency bands. The sources of this EEG modulation were identified at the anterior cingulate cortex (ACC) and the junction of the precuneus and parietal lobe, as well as at the occipital cortex. The findings support the hypothesis that the systematic increase of minimal VTC values, concomitant with modulation of EEG signals at the frontal-central and parietal-occipital areas, serves collectively to predict future instability in posture. PMID:19655130
LORETA EEG phase reset of the default mode network.
Thatcher, Robert W; North, Duane M; Biver, Carl J
2014-01-01
The purpose of this study was to explore phase reset of 3-dimensional current sources in Brodmann areas located in the human default mode network (DMN) using Low Resolution Electromagnetic Tomography (LORETA) of the human electroencephalogram (EEG). The EEG was recorded from 19 scalp locations in 70 healthy normal subjects ranging in age from 13 to 20 years. LORETA current sources were computed time point by time point for 14 Brodmann areas comprising the DMN in the delta frequency band. The Hilbert transform of the LORETA time series was used to compute the instantaneous phase differences between all pairs of Brodmann areas. Phase shift and lock durations were calculated based on the 1st and 2nd derivatives of the time series of phase differences. Phase shift durations exhibited three discrete modes at approximately (1) 25 ms, (2) 50 ms, and (3) 65 ms. Phase lock durations were present primarily at (1) 300-350 ms and (2) 350-450 ms. Phase shift and lock durations were inversely related and exhibited an exponential change with distance between Brodmann areas. The results are explained by the local neural packing density of network hubs and an exponential decrease in connections with distance from a hub. The results are consistent with a discrete temporal model of brain function in which anatomical hubs behave like a "shutter" that opens and closes at specific durations as nodes of a network, giving rise to temporarily phase-locked clusters of neurons for specific durations.
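The first step of such an analysis, instantaneous phase differences via the Hilbert transform, can be sketched on two synthetic narrow-band signals with a known 45° lag (toy data, not LORETA sources; phase "shifts" would then appear as jumps in the first derivative of the difference series):

```python
import numpy as np
from scipy.signal import hilbert

# Instantaneous phase difference between two narrow-band signals via the
# analytic signal (Hilbert transform).
fs, f = 250.0, 2.0                         # sampling rate [Hz], frequency [Hz]
t = np.arange(0, 10, 1 / fs)
x = np.sin(2 * np.pi * f * t)
y = np.sin(2 * np.pi * f * t - np.pi / 4)  # y lags x by 45 degrees

phase_x = np.angle(hilbert(x))
phase_y = np.angle(hilbert(y))
dphi = np.unwrap(phase_x - phase_y)        # unwrapped phase difference series

# Away from the record edges the phase difference is stable near pi/4.
core = dphi[len(dphi) // 4 : -len(dphi) // 4]
```

Taking first and second derivatives of `dphi` and thresholding them is one way to segment the series into shift and lock intervals, along the lines the abstract describes.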
NASA Astrophysics Data System (ADS)
Xu, B.
2017-12-01
Interferometric Synthetic Aperture Radar (InSAR) has the advantage of high spatial resolution, which enables measuring line-of-sight (LOS) surface displacements with nearly complete spatial continuity, and a satellite perspective that permits viewing large areas of Earth's surface quickly and efficiently. However, using InSAR to observe long-wavelength and small-magnitude deformation signals is still significantly limited by various unmodeled error sources, i.e. atmospheric delays, orbit-induced errors, and Digital Elevation Model (DEM) errors. Independent component analysis (ICA) is a probabilistic method for separating linearly mixed signals generated by different underlying physical processes. The signal sources which form the interferograms are statistically independent both in space and in time; thus, they can be separated by the ICA approach. Seismic behavior in the Los Angeles Basin is active, and the basin has experienced numerous moderate to large earthquakes since the early Pliocene. Hence, understanding the seismotectonic deformation in the Los Angeles Basin is important for analyzing seismic behavior. Compared with tectonic deformation, nontectonic deformation due to groundwater and oil extraction may be mainly responsible for the surface deformation in the Los Angeles Basin. Using the small baseline subset (SBAS) InSAR method, we extracted the surface deformation time series in the Los Angeles Basin over a time span of 7 years (September 27, 2003 - September 25, 2010). We then successfully separated the atmospheric noise from the InSAR time series and detected different processes caused by different mechanisms.
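The ICA separation principle invoked above can be demonstrated on a toy 1-D mixing problem (synthetic waveforms standing in for the statistically independent processes; this uses scikit-learn's FastICA, not the authors' specific pipeline):

```python
import numpy as np
from sklearn.decomposition import FastICA

# Two statistically independent "sources" observed only as linear mixtures,
# analogous to deformation and atmospheric signals mixed in interferograms.
t = np.linspace(0, 8, 2000)
s1 = np.sign(np.sin(3 * t))                # square-wave source
s2 = np.sin(5 * t)                         # sinusoidal source
S = np.c_[s1, s2]
A = np.array([[1.0, 0.5], [0.4, 1.0]])     # unknown mixing matrix
X = S @ A.T                                # observed mixtures

ica = FastICA(n_components=2, random_state=0)
S_est = ica.fit_transform(X)               # recovered sources (up to sign/scale)
```

ICA recovers the sources only up to ordering, sign and scale, which is why, in the InSAR application, the recovered components still have to be attributed physically (atmosphere vs. deformation) by their spatial and temporal signatures.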
Pre-2014 mudslides at Oso revealed by InSAR and multi-source DEM analysis
NASA Astrophysics Data System (ADS)
Kim, J. W.; Lu, Z.; QU, F.
2014-12-01
A landslide is a process that results in the downward and outward movement of slope-reshaping materials, including rocks and soils, and annually causes the loss of approximately $3.5 billion and tens of casualties in the United States. The 2014 Oso mudslide was an extreme event, causing nearly 40 deaths and damaging civilian properties. Landslides are often unpredictable, but in many cases catastrophic events are repetitive. The historical record at the Oso mudslide site indicates that there have been serial events over the decades, though the extent of the sliding events varied from time to time. In our study, the combination of multi-source DEMs, InSAR, and time-series InSAR analysis has enabled us to characterize the Oso mudslide. InSAR results from ALOS PALSAR show that there was no significant deformation between mid-2006 and 2011. The combination of time-series InSAR analysis and an old-dated DEM revealed topographic changes associated with the 2006 sliding event, which is confirmed by the difference of multiple LiDAR DEMs. Precipitation and discharge measurements before the 2006 and 2014 landslide events did not exhibit extremely anomalous records, suggesting that precipitation is not the controlling factor in determining the sliding events at Oso. The lack of surface deformation during 2006-2011 and the weak correlation between precipitation and the sliding events suggest that other factors (such as porosity) might play a critical role in the run-away events at Oso and other similar landslides.
Courtney, Jane; Woods, Elena; Scholz, Dimitri; Hall, William W.; Gautier, Virginie W.
2015-01-01
We introduce here MATtrack, an open source MATLAB-based computational platform developed to process multi-Tiff files produced by a photo-conversion time lapse protocol for live cell fluorescent microscopy. MATtrack automatically performs a series of steps required for image processing, including extraction and import of numerical values from Multi-Tiff files, red/green image classification using gating parameters, noise filtering, background extraction, contrast stretching and temporal smoothing. MATtrack also integrates a series of algorithms for quantitative image analysis enabling the construction of mean and standard deviation images, clustering and classification of subcellular regions and injection point approximation. In addition, MATtrack features a simple user interface, which enables monitoring of Fluorescent Signal Intensity in multiple Regions of Interest, over time. The latter encapsulates a region growing method to automatically delineate the contours of Regions of Interest selected by the user, and performs background and regional Average Fluorescence Tracking, and automatic plotting. Finally, MATtrack computes convenient visualization and exploration tools including a migration map, which provides an overview of the protein intracellular trajectories and accumulation areas. In conclusion, MATtrack is an open source MATLAB-based software package tailored to facilitate the analysis and visualization of large data files derived from real-time live cell fluorescent microscopy using photoconvertible proteins. It is flexible, user friendly, compatible with Windows, Mac, and Linux, and a wide range of data acquisition software. MATtrack is freely available for download at eleceng.dit.ie/courtney/MATtrack.zip. PMID:26485569
Dynamical density delay maps: simple, new method for visualising the behaviour of complex systems
2014-01-01
Background Physiologic signals, such as cardiac interbeat intervals, exhibit complex fluctuations. However, capturing important dynamical properties, including nonstationarities, may not be feasible from conventional time series graphical representations. Methods We introduce a simple-to-implement visualisation method, termed dynamical density delay mapping (the “D3-Map” technique), that provides an animated representation of a system’s dynamics. The method is based on a generalization of conventional two-dimensional (2D) Poincaré plots, which are scatter plots where each data point, x(n), in a time series is plotted against the adjacent one, x(n + 1). First, we divide the original time series, x(n) (n = 1,…, N), into a sequence of segments (windows). Next, for each segment, a three-dimensional (3D) Poincaré surface plot of x(n), x(n + 1), h[x(n),x(n + 1)] is generated, in which the third dimension, h, represents the relative frequency of occurrence of each (x(n),x(n + 1)) point. This 3D Poincaré surface is then chromatised by mapping the relative frequency h values onto a colour scheme. We also generate a colourised 2D contour plot from each time series segment using the same colourmap scheme as for the 3D Poincaré surface. Finally, the original time series graph, the colourised 3D Poincaré surface plot, and its projection as a colourised 2D contour map for each segment are animated to create the full “D3-Map.” Results We first exemplify the D3-Map method using the cardiac interbeat interval time series from a healthy subject during sleeping hours. The animations uncover complex dynamical changes, such as transitions between states, and the relative amount of time the system spends in each state. We also illustrate the utility of the method in detecting hidden temporal patterns in the heart rate dynamics of a patient with atrial fibrillation. The videos, as well as the source code, are made publicly available.
Conclusions Animations based on density delay maps provide a new way of visualising dynamical properties of complex systems not apparent in time series graphs or standard Poincaré plot representations. Trainees in a variety of fields may find the animations useful as illustrations of fundamental but challenging concepts, such as nonstationarity and multistability. For investigators, the method may facilitate data exploration. PMID:24438439
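The construction of the density dimension h described in the Methods section amounts to a 2-D histogram over (x(n), x(n+1)) pairs. A minimal sketch for a single window, using a toy random-walk series (the animation and colour-mapping steps are omitted):

```python
import numpy as np

# Density layer of a Poincare map: bin (x(n), x(n+1)) pairs in 2-D and take
# the relative frequency in each bin as the third dimension h of the D3-Map.
rng = np.random.default_rng(2)
x = np.cumsum(rng.standard_normal(5000))   # toy nonstationary series (one window)

h, xedges, yedges = np.histogram2d(x[:-1], x[1:], bins=40)
h = h / h.sum()                            # relative frequency per bin
```

Repeating this per window and animating the resulting h surfaces (e.g. as colourised contour maps) yields the full D3-Map; states the system dwells in show up as persistent high-density regions.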
The impact of seasonal signals on spatio-temporal filtering
NASA Astrophysics Data System (ADS)
Gruszczynski, Maciej; Klos, Anna; Bogusz, Janusz
2016-04-01
The existence of Common Mode Errors (CMEs) in permanent GNSS networks contributes to spatial and temporal correlation in residual time series. Time series from permanently observing GNSS stations less than 2,000 km apart are similarly influenced by such CME sources as mismodelling (of Earth Orientation Parameters (EOP), satellite orbits, or antenna phase center variations) during the reference frame realization, large-scale atmospheric and hydrospheric effects, and small-scale crust deformations. Residuals obtained by detrending and deseasonalising topocentric GNSS time series, arranged epoch by epoch, form an observation matrix independently for each component (North, East, Up). The CME is treated as the internal structure of the data. Assuming a uniform temporal function across the network, it is possible to filter the CME out using the PCA (Principal Component Analysis) approach. Some of the above-described CME sources may be reflected in a wide range of frequencies in the GPS residual time series. In order to determine the impact of seasonal signal modeling on the spatial correlation in the network, and consequently on the results of CME filtration, we chose two approaches to modeling. The first approach, commonly presented by previous authors, models only annual and semi-annual oscillations with Least-Squares Estimation (LSE). In the second, the set of residuals results from modeling a deterministic part that includes fortnightly periods plus up to the 9th harmonics of the Chandlerian, tropical, and draconitic oscillations. Correlation coefficients for the residuals, together with the KMO (Kaiser-Meyer-Olkin) statistic and Bartlett's test of sphericity, were determined. For this research we used time series expressed in ITRF2008 provided by JPL (Jet Propulsion Laboratory). GPS processing was performed with the GIPSY-OASIS software in PPP (Precise Point Positioning) mode.
In order to form a GPS station network that meets the demand of a uniform spatial response to the CME, we chose 18 stations located in Central Europe. The created network extends up to 1,500 kilometers. The KMO statistic indicates whether a component analysis may be useful for a chosen data set. We obtained KMO statistic values of 0.87 and 0.62 for the residuals of the Up component after the first and second approaches were applied, respectively, which means that both sets of residuals share common errors. Bartlett's test of sphericity confirmed that in both cases there are correlations in the residuals. Other important results are the eigenvalues, expressed as the percentage of the total variance explained by the first few components in the PCA: for the North, East and Up components we obtained 68%, 75%, 65% and 47%, 54%, 52%, respectively, after the first and second approaches were applied. The results of CME filtration using the PCA approach performed on both residual time series directly influence the uncertainty of the velocity of the permanent stations. In our case, spatial filtering reduces the uncertainty of velocity by 0.5 to 0.8 mm for the horizontal components and by 0.6 to 0.9 mm on average for the Up component when annual and semi-annual signals are assumed. Nevertheless, when the second approach to modelling the deterministic part was used, a deterioration of the velocity uncertainty was noticed only for the Up component, probably due to the much higher autocorrelation in that time series compared to the horizontal components.
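The PCA filtering step itself, stacking residuals epoch by epoch, extracting the first principal component, and subtracting its rank-1 reconstruction from every station, can be sketched with an SVD on synthetic residuals (station count and noise levels invented; a minimal illustration of the approach, not the authors' processing chain):

```python
import numpy as np

# Spatial filtering of a common mode error (CME) from a residual matrix
# arranged epochs x stations, using the first principal component.
rng = np.random.default_rng(3)
n_epochs, n_stations = 365, 18
cme = rng.standard_normal(n_epochs)                  # network-wide signal
local = 0.2 * rng.standard_normal((n_epochs, n_stations))
residuals = cme[:, None] + local                     # observation matrix

X = residuals - residuals.mean(axis=0)               # center each station
U, s, Vt = np.linalg.svd(X, full_matrices=False)
cme_model = np.outer(U[:, 0] * s[0], Vt[0])          # rank-1 CME reconstruction
filtered = X - cme_model                             # spatially filtered residuals
```

Because the simulated CME is uniform across stations, the first component absorbs it almost entirely and the filtered scatter drops to the level of the local noise, which is the mechanism behind the velocity-uncertainty reduction reported above.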
Getting Astrophysical Information from LISA Data
NASA Technical Reports Server (NTRS)
Stebbins, R. T.; Bender, P. L.; Folkner, W. M.
1997-01-01
Gravitational wave signals from a large number of astrophysical sources will be present in the LISA data. Information about as many sources as possible must be estimated from time series of strain measurements. Several types of signals are expected to be present: simple periodic signals from relatively stable binary systems, chirped signals from coalescing binary systems, complex waveforms from highly relativistic binary systems, stochastic backgrounds from galactic and extragalactic binary systems, and possibly stochastic backgrounds from the early Universe. The orbital motion of the LISA antenna will modulate the phase and amplitude of all of these signals except the isotropic backgrounds, and thereby give information on the directions of the sources. Here we describe a candidate process for disentangling the gravitational wave signals and estimating the relevant astrophysical parameters from one year of LISA data. Nearly all of the sources will be identified by searching with templates based on source parameters and directions.
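The template-search idea can be illustrated with a toy matched-filter statistic: correlate a noisy strain series against a bank of candidate waveforms and keep the best match (the waveforms, frequencies and noise level below are invented stand-ins, far simpler than real LISA templates):

```python
import numpy as np

# Toy template search: find which candidate waveform is hidden in noisy data.
rng = np.random.default_rng(4)
t = np.linspace(0, 1, 1000)
templates = {f: np.sin(2 * np.pi * f * t) for f in (5.0, 7.0, 11.0)}

data = templates[7.0] + 0.5 * rng.standard_normal(t.size)  # hidden 7 Hz source

def match(data, tpl):
    # Normalized inner product as a simple matched-filter statistic.
    return np.dot(data, tpl) / (np.linalg.norm(data) * np.linalg.norm(tpl))

best = max(templates, key=lambda f: match(data, templates[f]))
```

A real LISA search additionally folds the orbital phase/amplitude modulation into each template, which is what makes the source direction a recoverable parameter.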
New developments in flash radiography
NASA Astrophysics Data System (ADS)
Mattsson, Arne
2007-01-01
This paper reviews some of the latest developments in flash radiography. A series of multi-anode tubes has been developed: tubes with several x-ray sources within the same vacuum enclosure. The x-ray sources are closely spaced, to come as close as possible to a single source, and are sequentially pulsed at times that can be chosen independently. Tubes for voltages in the range 150-500 kV, with up to eight x-ray sources, are described. Combining a multi-anode tube with an intensified CCD camera makes it possible to generate short "x-ray movies". A new flash x-ray control system has also been developed. The system is operated from a PC or laptop, and all parameters of a multi-channel flash x-ray system can be remotely set and monitored. The system automatically stores important operation parameters.
Chan, Emily H.; Sahai, Vikram; Conrad, Corrie; Brownstein, John S.
2011-01-01
Background A variety of obstacles including bureaucracy and lack of resources have interfered with timely detection and reporting of dengue cases in many endemic countries. Surveillance efforts have turned to modern data sources, such as Internet search queries, which have been shown to be effective for monitoring influenza-like illnesses. However, few have evaluated the utility of web search query data for other diseases, especially those of high morbidity and mortality or where a vaccine may not exist. In this study, we aimed to assess whether web search queries are a viable data source for the early detection and monitoring of dengue epidemics. Methodology/Principal Findings Bolivia, Brazil, India, Indonesia and Singapore were chosen for analysis based on available data and adequate search volume. For each country, a univariate linear model was then built by fitting a time series of the fraction of Google search query volume for specific dengue-related queries from that country against a time series of official dengue case counts for a time-frame within 2003–2010. The specific combination of queries used was chosen to maximize model fit. Spurious spikes in the data were also removed prior to model fitting. The final models, fit using a training subset of the data, were cross-validated against both the overall dataset and a holdout subset of the data. All models were found to fit the data quite well, with validation correlations ranging from 0.82 to 0.99. Conclusions/Significance Web search query data were found to be capable of tracking dengue activity in Bolivia, Brazil, India, Indonesia and Singapore. Whereas traditional dengue data from official sources are often not available until after some substantial delay, web search query data are available in near real-time. These data represent valuable complement to assist with traditional dengue surveillance. PMID:21647308
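The fit-and-validate recipe described above (train a univariate linear model on part of the series, then correlate its predictions with held-out case counts) can be sketched on invented data:

```python
import numpy as np

# Toy version of the study's workflow: fit case counts against search-query
# fractions on a training window, validate on a holdout window.
rng = np.random.default_rng(5)
weeks = 120
queries = np.abs(np.sin(np.linspace(0, 6 * np.pi, weeks))) + 0.05 * rng.random(weeks)
cases = 800 * queries + 50 + 10 * rng.standard_normal(weeks)  # synthetic counts

train, hold = slice(0, 90), slice(90, None)
slope, intercept = np.polyfit(queries[train], cases[train], 1)

pred = slope * queries[hold] + intercept
r = np.corrcoef(pred, cases[hold])[0, 1]   # holdout validation correlation
```

The real models differ in the query-selection and spike-removal steps, but the validation correlation computed here is the same statistic reported in the 0.82 to 0.99 range above.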
Hauk, O; Keil, A; Elbert, T; Müller, M M
2002-01-30
We describe a methodology to apply current source density (CSD) and minimum norm (MN) estimation as pre-processing tools for time-series analysis of single trial EEG data. The performance of these methods is compared for the case of wavelet time-frequency analysis of simulated gamma-band activity. A reasonable comparison of CSD and MN on the single trial level requires regularization such that the corresponding transformed data sets have similar signal-to-noise ratios (SNRs). For region-of-interest approaches, it should be possible to optimize the SNR for single estimates rather than for the whole distributed solution. An effective implementation of the MN method is described. Simulated data sets were created by modulating the strengths of a radial and a tangential test dipole with wavelets in the frequency range of the gamma band, superimposed with simulated spatially uncorrelated noise. The MN and CSD transformed data sets as well as the average reference (AR) representation were subjected to wavelet frequency-domain analysis, and power spectra were mapped for relevant frequency bands. For both CSD and MN, the influence of noise can be sufficiently suppressed by regularization to yield meaningful information, but only MN represents both radial and tangential dipole sources appropriately as single peaks. Therefore, when relating wavelet power spectrum topographies to their neuronal generators, MN should be preferred.
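A regularized minimum norm estimate of the kind discussed here has the standard closed form s = Lᵀ(LLᵀ + λI)⁻¹d for lead field L, data d and regularization parameter λ. The sketch below uses a random toy lead field rather than a real head model, so it illustrates only the linear algebra and the role of λ as the SNR-balancing knob:

```python
import numpy as np

# Regularized minimum norm (MN) inverse on toy dimensions:
# 19 "sensors", 40 "sources", a random stand-in for the lead field.
rng = np.random.default_rng(6)
n_sensors, n_sources = 19, 40
L = rng.standard_normal((n_sensors, n_sources))

s_true = np.zeros(n_sources)
s_true[7] = 1.0                                    # single simulated source
d = L @ s_true + 0.01 * rng.standard_normal(n_sensors)

lam = 0.1                                          # regularization parameter
s_mn = L.T @ np.linalg.solve(L @ L.T + lam * np.eye(n_sensors), d)
# With a well-chosen lam, s_mn concentrates energy near the simulated source
# while suppressing the noise, trading spatial resolution for stability.
```

Increasing λ suppresses noise at the cost of smearing the estimate; matching the effective SNR of the MN and CSD transforms via this parameter is what makes the single-trial comparison in the study fair.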
NASA Astrophysics Data System (ADS)
van den Besselaar, E. J. M.; Sanchez-Lorenzo, A.; Wild, M.; Klein Tank, A. M. G.
2012-04-01
The surface solar radiation (SSR) is the fundamental source of energy in the climate system, and consequently the source of life on our planet, due to its central role in the surface energy balance. Therefore, a significant impact on temperatures is expected due to the widespread dimming/brightening phenomenon observed since the second half of the 20th century (Wild, 2009). Previous studies pointed out the effects of SSR trends on temperature series over Europe (Makowski et al., 2009; Philipona et al., 2009), although the lack of long-term SSR series limits these results. This work describes an updated sunshine duration (SS) dataset compiled by the European Climate Assessment and Dataset (ECA&D) project based on around 300 daily time series over Europe covering the 1961-2010 period. The relationship between the SS and temperature series is analysed based on four temperature variables: maximum (TX), minimum (TN) and mean temperature (TG), as well as the diurnal temperature range (DTR). Regional and pan-European mean series of SS and temperatures are constructed. The analyses are performed on annual and seasonal scales, focusing on the interannual and decadal agreement between the variables. The results show strong positive correlations on interannual scales between SS and temperatures over Europe, especially for the DTR and TX during the summer period and regions in Central Europe. Interestingly, the SS and temperature series show a tendency towards higher correlations in the smoothed series, both for different regions and temperature variables. These results support the relationship between temperature and SS trends over Europe since the second half of the 20th century, whereby SSR changes are thought to have partially decreased (increased) temperatures during the dimming (brightening) period (Makowski et al., 2009; Wild, 2009). Further research is needed to confirm this cause-effect relationship, which so far rests only on correlation analysis.
Ames, Jennifer T; Federle, Michael P
2011-07-01
Our purpose was to review the clinical and imaging findings in a series of patients with septic thrombophlebitis of the portal venous system in order to define criteria that might allow more confident and timely diagnosis. This is a retrospective case series. The clinical and imaging features were analyzed in 33 subjects with septic thrombophlebitis of the portal venous system. All 33 patients with septic thrombophlebitis of the portal venous system had pre-disposing infectious or inflammatory processes. Contrast-enhanced CT studies of patients with septic thrombophlebitis typically demonstrate an infectious gastrointestinal source (82%), thrombosis (70%), and/or gas (21%) of the portal system or its branches, and intrahepatic abnormalities such as a transient hepatic attenuation difference (THAD) (42%) or abscess (61%). Septic thrombophlebitis of the portal system is often associated with an infectious source in the gastrointestinal tract and sepsis. Contrast-enhanced CT demonstrates an infectious gastrointestinal source, thrombosis or gas within the portal system or its branches, and intrahepatic abnormalities such as abscess in most cases. We report a THAD in several of our patients, an observation that was not made in prior reports of septic thrombophlebitis.
Klebsiella Pneumoniae Liver Abscess: A Case Series of Six Asian Patients.
Oikonomou, Katerina G; Aye, Myint
2017-09-26
BACKGROUND Liver abscesses represent a serious infection of hepatic parenchyma and are associated with significant morbidity and mortality. The emergence of a new hypervirulent variant of Klebsiella pneumoniae, which can cause serious infections in the Asian population, is under investigation. We report a case series of six Asian patients hospitalized at our institution from January 2013 to November 2015 for liver abscess due to Klebsiella pneumoniae. CASE REPORT Charts of six Asian patients were retrospectively reviewed. Four patients were male and two were female. The mean age was 53 years (range: 35-64 years). All patients had no known past medical history of immunodeficiency. Three patients had multiple liver abscesses at the time of initial presentation. In five patients, the source of entry of the pathogenic microorganism was unknown and in one patient the suspected source of entry was the gastrointestinal tract. In three patients there was also concomitant Klebsiella pneumoniae bacteremia. The mean duration of antibiotic treatment was seven weeks and the mean duration of hospital stay was 13.5 days. CONCLUSIONS Liver abscess should always be included in the differential diagnosis in cases of sepsis without obvious source and/or in the clinical scenarios of fever, abdominal pain, and liver lesions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Griffin, J.M.
1977-11-01
The pseudo data approach to the joint production of petroleum refining and chemicals is described as an alternative that avoids the multicollinearity of time series data and allows a complex technology to be characterized in a statistical price possibility frontier. Intended primarily for long-range analysis, the pseudo data method can be used as a source of elasticity estimates for policy analysis. 19 references.
A Neutral Odor May Become a Sexual Incentive through Classical Conditioning in Male Rats
ERIC Educational Resources Information Center
Kvitvik, Inger-Line; Berg, Kristine Marit; Agmo, Anders
2010-01-01
A neutral olfactory stimulus was employed as CS in a series of experiments with a sexually receptive female as UCS and the execution of an intromission as the UCR. Each experimental session lasted until the male ejaculated. The time the experimental subject spent in a zone adjacent to the source of the olfactory stimulus during the 10 s of CS…
Radar observation of known and unknown clear echoes
NASA Technical Reports Server (NTRS)
Glover, K. M.; Konrad, T. G.
1979-01-01
Target cross-sections as a function of wavelength for known insects, known birds, and dot targets are presented. Tracking data for known birds, analyzed using time series methods, are tabulated. Examples were selected from these early works to give entomologists some indication of the types of information that are available from radar as well as examples of the different sources of clear-air radar backscatter.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fast, J; Zhang, Q; Tilp, A
Significantly improved returns on aerosol chemistry data can be achieved through the development of a value-added product (VAP) for deriving OA components, called Organic Aerosol Components (OACOMP). OACOMP is primarily based on multivariate analysis of the measured organic mass spectral matrix. The key outputs of OACOMP are the concentration time series and the mass spectra of OA factors that are associated with distinct sources, formation and evolution processes, and physicochemical properties.
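The factorization step behind such a product can be illustrated with a minimal non-negative matrix factorization; the multiplicative-update solver and the synthetic two-factor data below are generic stand-ins, not OACOMP's actual multivariate method or real mass spectra:

```python
import numpy as np

def nmf(X, k, n_iter=500, seed=0):
    """Minimal non-negative matrix factorization (Lee-Seung multiplicative
    updates): X (time x m/z) ~ W (time x k factors) @ H (k factors x m/z)."""
    rng = np.random.default_rng(seed)
    W = rng.random((X.shape[0], k)) + 0.1
    H = rng.random((k, X.shape[1])) + 0.1
    for _ in range(n_iter):
        H *= (W.T @ X) / (W.T @ W @ H + 1e-12)
        W *= (X @ H.T) / (W @ H @ H.T + 1e-12)
    return W, H

# Hypothetical organic-aerosol data: rows = time steps, columns = m/z channels
rng = np.random.default_rng(2)
n_times, n_mz, n_factors = 200, 50, 2
true_profiles = rng.random((n_factors, n_mz))     # factor mass spectra
true_tseries = rng.random((n_times, n_factors))   # factor concentrations
X = true_tseries @ true_profiles + 0.01 * rng.random((n_times, n_mz))

W, H = nmf(X, n_factors)   # W: concentration time series, H: factor spectra
residual = float(np.linalg.norm(X - W @ H) / np.linalg.norm(X))
print(W.shape, H.shape, round(residual, 3))
```

Production tools typically use positive matrix factorization (PMF) with measurement-error weighting rather than this unweighted sketch.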
The use of hierarchical clustering for the design of optimized monitoring networks
NASA Astrophysics Data System (ADS)
Soares, Joana; Makar, Paul Andrew; Aklilu, Yayne; Akingunola, Ayodeji
2018-05-01
Associativity analysis is a powerful tool to deal with large-scale datasets by clustering the data on the basis of (dis)similarity and can be used to assess the efficacy and design of air quality monitoring networks. We describe here our use of Kolmogorov-Zurbenko filtering and hierarchical clustering of NO2 and SO2 passive and continuous monitoring data to analyse and optimize air quality networks for these species in the province of Alberta, Canada. The methodology applied in this study assesses dissimilarity between monitoring station time series based on two metrics: 1 - R, R being the Pearson correlation coefficient, and the Euclidean distance; we find that both should be used in evaluating monitoring site similarity. We have combined the analytic power of hierarchical clustering with the spatial information provided by deterministic air quality model results, using the gridded time series of model output as potential station locations, as a proxy for assessing monitoring network design and for network optimization. We demonstrate that clustering results depend on the air contaminant analysed, reflecting the difference in the respective emission sources of SO2 and NO2 in the region under study. Our work shows that much of the signal identifying the sources of NO2 and SO2 emissions resides in shorter timescales (hourly to daily) due to short-term variation of concentrations and that longer-term averages in data collection may lose the information needed to identify local sources. However, the methodology identifies stations mainly influenced by seasonality, if larger timescales (weekly to monthly) are considered. We have performed the first dissimilarity analysis based on gridded air quality model output and have shown that the methodology is capable of generating maps of subregions within which a single station will represent the entire subregion, to a given level of dissimilarity. 
We have also shown that our approach is capable of identifying different sampling methodologies as well as outliers (stations' time series which are markedly different from all others in a given dataset).
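The two dissimilarity metrics and the clustering step can be sketched as follows; the six synthetic "stations" driven by two distinct signals are invented for illustration and are not the Alberta monitoring data:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist, squareform

# Hypothetical station time series: rows = stations, columns = hourly values.
rng = np.random.default_rng(3)
base_a = np.sin(np.linspace(0, 20 * np.pi, 500))   # one emission-source signal
base_b = np.cos(np.linspace(0, 20 * np.pi, 500))   # a second, distinct signal
stations = np.vstack([base_a + 0.1 * rng.standard_normal(500) for _ in range(3)]
                     + [base_b + 0.1 * rng.standard_normal(500) for _ in range(3)])

# Metric 1: 1 - R, with R the Pearson correlation; metric 2: Euclidean distance
d_corr = squareform(1 - np.corrcoef(stations), checks=False)
d_eucl = pdist(stations, metric="euclidean")

# Average-linkage hierarchical clustering, cut into two clusters
labels_corr = fcluster(linkage(d_corr, method="average"), t=2, criterion="maxclust")
labels_eucl = fcluster(linkage(d_eucl, method="average"), t=2, criterion="maxclust")
print(labels_corr, labels_eucl)  # stations 1-3 and 4-6 fall into separate clusters
```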
van de Flierdt, T.; Frank, M.; Lee, D.-C.; Halliday, A.N.; Reynolds, B.C.; Hein, J.R.
2004-01-01
The behavior of dissolved Hf in the marine environment is not well understood due to the lack of direct seawater measurements of Hf isotopes and the limited number of Hf isotope time-series obtained from ferromanganese crusts. In order to place better constraints on input sources and develop further applications, a combined Nd-Hf isotope time-series study of five Pacific ferromanganese crusts was carried out. The samples cover the past 38 Myr and their locations range from sites at the margin of the ocean to remote areas, sites from previously unstudied North and South Pacific areas, and water depths corresponding to deep and bottom waters. For most of the samples a broad coupling of Nd and Hf isotopes is observed. In the Equatorial Pacific εNd and εHf both decrease with water depth. Similarly, εNd and εHf both increase from the South to the North Pacific. These data indicate that the Hf isotopic composition is, in general terms, a suitable tracer for ocean circulation, since inflow and progressive admixture of bottom water is clearly identifiable. The time-series data indicate that inputs and outputs have been balanced throughout much of the late Cenozoic. A simple box model can constrain the relative importance of potential input sources to the North Pacific. Assuming steady state, the model implies significant contributions of radiogenic Nd and Hf from young circum-Pacific arcs and a subordinate role of dust inputs from the Asian continent for the dissolved Nd and Hf budget of the North Pacific. Some changes in ocean circulation that are clearly recognizable in Nd isotopes do not appear to be reflected by Hf isotopic compositions. At two locations within the Pacific Ocean a decoupling of Nd and Hf isotopes is found, indicating limited potential for Hf isotopes as a stand-alone oceanographic tracer and providing evidence of additional local processes that govern the Hf isotopic composition of deep water masses.
In the case of the Southwest Pacific there is evidence that decoupling may have been the result of changes in weathering style related to the buildup of Antarctic glaciation. Copyright © 2004 Elsevier Ltd.
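The steady-state box-model logic can be sketched numerically; the fluxes and epsilon-Nd signatures below are placeholder values chosen for illustration, not the paper's fitted inputs:

```python
# Hypothetical steady-state inputs to North Pacific deep water (assumed values):
# each source contributes a Nd flux (arbitrary units) with an epsilon-Nd signature.
sources = {
    "circum-Pacific arcs": {"flux": 3.0, "eps_nd": +7.0},
    "Asian dust":          {"flux": 1.0, "eps_nd": -10.0},
    "advected seawater":   {"flux": 6.0, "eps_nd": -3.0},
}

# At steady state, the dissolved signature is the flux-weighted mean of inputs,
# so the measured epsilon-Nd constrains the relative source contributions.
total_flux = sum(s["flux"] for s in sources.values())
eps_mix = sum(s["flux"] * s["eps_nd"] for s in sources.values()) / total_flux
print(round(eps_mix, 2))  # (3*7 + 1*(-10) + 6*(-3)) / 10 = -0.7
```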
Deriving phenological metrics from NDVI through an open source tool developed in QGIS
NASA Astrophysics Data System (ADS)
Duarte, Lia; Teodoro, A. C.; Gonçalves, Hernãni
2014-10-01
Vegetation indices have been commonly used over the past 30 years for studying vegetation characteristics using images collected by remote sensing satellites. One of the most commonly used is the Normalized Difference Vegetation Index (NDVI). The various stages that green vegetation undergoes during a complete growing season can be summarized through time-series analysis of NDVI data. The analysis of such time-series allows for extracting key phenological variables or metrics of a particular season. These characteristics may not necessarily correspond directly to conventional, ground-based phenological events, but do provide indications of ecosystem dynamics. A complete list of the phenological metrics that can be extracted from smoothed, time-series NDVI data is available in the USGS online resources (http://phenology.cr.usgs.gov/methods_deriving.php). This work aims to develop an open source application to automatically extract these phenological metrics from a set of satellite input data. The main advantage of QGIS for this specific application lies in the ease and speed of developing new plug-ins, using the Python language, based on the experience of the research group in other related works. QGIS has its own application programming interface (API) with functionalities and programs to develop new features. The toolbar developed for this application was implemented using the plug-in NDVIToolbar.py. The user introduces the raster files as input and obtains a plot and a report with the metrics.
The report includes the following eight metrics: SOST (Start Of Season - Time) corresponding to the day of the year identified as having a consistent upward trend in the NDVI time series; SOSN (Start Of Season - NDVI) corresponding to the NDVI value associated with SOST; EOST (End of Season - Time) which corresponds to the day of year identified at the end of a consistent downward trend in the NDVI time series; EOSN (End of Season - NDVI) corresponding to the NDVI value associated with EOST; MAXN (Maximum NDVI) which corresponds to the maximum NDVI value; MAXT (Time of Maximum) which is the day associated with MAXN; DUR (Duration) defined as the number of days between SOST and EOST; and AMP (Amplitude) which is the difference between MAXN and SOSN. This application provides all these metrics in a single step. Initially, the data points are smoothed using moving averages with five- and three-point windows. The eight metrics previously described are then obtained from the spline using numpy functions. In the present work, the developed toolbar was applied to MODerate resolution Imaging Spectroradiometer (MODIS) data covering a particular region of Portugal, but it can be generally applied to other satellite data and study areas. The code is open and can be modified according to the user requirements. Another advantage of publishing the plug-in and the application code is that other users can improve this application.
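The eight metrics listed above can be sketched for a smoothed single-season NDVI curve; note that SOST/EOST are found here as half-amplitude threshold crossings, a common simplification of the trend-based USGS definitions, and the synthetic Gaussian season is invented for the example:

```python
import numpy as np

def phenology_metrics(doy, ndvi, threshold=0.5):
    """Extract the eight phenological metrics from a smoothed NDVI series.
    SOST/EOST are approximated as crossings of a seasonal-amplitude
    threshold rather than the USGS consistent-trend tests."""
    maxn = float(ndvi.max())
    maxt = int(doy[np.argmax(ndvi)])
    level = ndvi.min() + threshold * (maxn - ndvi.min())
    above = np.where(ndvi >= level)[0]          # indices of the growing season
    sost, eost = int(doy[above[0]]), int(doy[above[-1]])
    sosn, eosn = float(ndvi[above[0]]), float(ndvi[above[-1]])
    return {"SOST": sost, "SOSN": sosn, "EOST": eost, "EOSN": eosn,
            "MAXN": maxn, "MAXT": maxt, "DUR": eost - sost,
            "AMP": maxn - sosn}

# Synthetic single-season NDVI curve sampled as 16-day composites
doy = np.arange(1, 366, 16)
ndvi = 0.2 + 0.5 * np.exp(-((doy - 190) ** 2) / (2 * 45.0 ** 2))
m = phenology_metrics(doy, ndvi)
print(m["MAXT"], m["DUR"])
```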
Disk Disruptions and X-ray Intensity Excursions in Cyg X-2, LMC X-3 and Cyg X-3
NASA Astrophysics Data System (ADS)
Boyd, P. T.; Smale, A. P.
2001-05-01
The RXTE All Sky Monitor soft X-ray light curves of many X-ray binaries show long-term intensity variations (a.k.a. "superorbital periodicities") that have been ascribed to precession of a warped, tilted accretion disk around the X-ray source. We have found that the excursion times between X-ray minima in Cyg X-2 can be characterized as a series of integer multiples of the 9.8 day binary orbital period (as opposed to the previously reported stable 77.7 day single periodicity, or a single modulation whose period changes slowly with time). While the data set is too short for a proper statistical analysis, it is clear that the length of any given intensity excursion cannot be used to predict the next (integer) excursion length in the series. In the black hole candidate system LMC X-3, the excursion times are shown to be related to each other by rational fractions. We find that the long term light curve of the unusual galactic X-ray jet source Cyg X-3 can also be described as a series of intensity excursions related to each other by integer multiples of a fundamental underlying clock. In the latter cases, the clock is apparently not related to the known binary periods. A unified physical model, involving both an inclined accretion disk and a fixed-probability disk disruption mechanism, is presented and compared with three-body scattering results. Each time the disk passes through the orbital plane it experiences a fixed probability P that it will disrupt. This model has testable predictions: the distribution of integers should resemble that of an atomic process with a characteristic half life. Further analysis can support or refute the model, and shed light on what system parameters effectively set the value of P.
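The fixed-probability disruption model implies geometrically distributed excursion integers, which is straightforward to simulate; the probability value 0.2 below is an arbitrary illustration, not a fitted parameter from the paper:

```python
import random

def excursion_integers(p_disrupt, n_excursions, seed=42):
    """Simulate the proposed model: each passage of the disk through the
    orbital plane disrupts it with fixed probability P, so the number of
    orbital periods per excursion is geometrically distributed, the
    discrete analogue of an atomic decay with a characteristic half life."""
    rng = random.Random(seed)
    out = []
    for _ in range(n_excursions):
        n = 1
        while rng.random() > p_disrupt:   # disk survives this passage
            n += 1
        out.append(n)
    return out

ns = excursion_integers(0.2, 10000)
mean_n = sum(ns) / len(ns)
print(round(mean_n, 1))   # mean excursion length approaches 1/P = 5 orbits
```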
NASA Technical Reports Server (NTRS)
Sabaka, T. J.; Rowlands, D. D.; Luthcke, S. B.; Boy, J.-P.
2010-01-01
We describe Earth's mass flux from April 2003 through November 2008 by deriving a time series of mascons on a global 2° x 2° equal-area grid at 10 day intervals. We estimate the mass flux directly from K band range rate (KBRR) data provided by the Gravity Recovery and Climate Experiment (GRACE) mission. Using regularized least squares, we take into account the underlying process dynamics through continuous space and time-correlated constraints. In addition, we place the mascon approach in the context of other filtering techniques, showing its equivalence to anisotropic, nonsymmetric filtering, least squares collocation, and Kalman smoothing. We produce mascon time series from KBRR data that have and have not been corrected (forward modeled) for hydrological processes and find that the former produce superior results in oceanic areas by minimizing signal leakage from strong sources on land. By exploiting the structure of the spatiotemporal constraints, we are able to use a much more efficient (in storage and computation) inversion algorithm based upon the conjugate gradient method. This allows us to apply continuous rather than piecewise continuous time-correlated constraints, which we show, via global maps and comparisons with ocean-bottom pressure gauges, to produce time series with reduced random variance and full systematic signal. Finally, we present a preferred global model, a hybrid whose oceanic portions are derived using forward modeling of hydrology but whose land portions are not, and thus represent a pure GRACE-derived signal.
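The conjugate-gradient solution of a regularized least-squares system can be sketched at toy scale; the matrices and regularization below are hypothetical stand-ins, not the GRACE mascon operators or constraints:

```python
import numpy as np

def cg_solve(A, b, n_iter=200, tol=1e-10):
    """Conjugate gradient for a symmetric positive-definite system A x = b;
    avoids forming or storing an explicit inverse, which is why it scales
    to large regularized inversions."""
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    rs = r @ r
    for _ in range(n_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# Toy regularized least squares: minimize |Gx - d|^2 + lam |Lx|^2,
# solved via the normal equations (G^T G + lam L^T L) x = G^T d
rng = np.random.default_rng(4)
G = rng.standard_normal((60, 20))
x_true = rng.standard_normal(20)
d = G @ x_true + 0.01 * rng.standard_normal(60)
lam, L = 1e-3, np.eye(20)
A = G.T @ G + lam * L.T @ L
x_hat = cg_solve(A, G.T @ d)
err = float(np.linalg.norm(x_hat - x_true))
print(round(err, 3))  # small recovery error for this well-posed toy system
```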
Scharnweber, Tobias; Hevia, Andrea; Buras, Allan; van der Maaten, Ernst; Wilmking, Martin
2016-10-01
Element composition of annually resolved tree-rings constitutes a promising biological proxy for reconstructions of environmental conditions and pollution history. However, several methodological and physiological issues have to be addressed before sound conclusions can be drawn from dendrochemical time series. For example, radial and vertical translocation processes of elements in the wood might blur or obscure any dendrochemical signal. In this study, we tested the degree of synchronism of elemental time series within and between trees of one coniferous (Pinus sylvestris L.) and one broadleaf (Castanea sativa Mill.) species growing in conventionally managed forests without direct pollution sources in their surroundings. Micro X-ray fluorescence (μXRF) analysis was used to establish time series of relative concentrations of multiple elements (Mg, Al, P, Cl, K, Ca, Cr, Mn, Fe and Ni) for different stem heights and stem exposures. We found a common long-term (decadal) trend for most elements in both species, but little coherence in the high frequency domain (inter-annual variations). Aligning the element curves by cambial age instead of year of ring formation reduced the standard deviations between the single measurements. This points to an influence of age on longer term trends and would require detrending in order to extract any environmental signal from dendrochemical time series. The common signal was stronger for pine than for chestnut. In pine, many elements show a concentration gradient with higher values towards the tree crown. Mobility of elements in the stem leading to high within- and between-tree variability, as well as a potential age trend, apparently complicates the establishment of reliable dendrochemical chronologies. For future wood-chemical studies, we recommend working with element ratios instead of single-element time series, considering potential age trends, and analyzing more than one sample per tree to account for internal variability.
Copyright © 2016 Elsevier B.V. All rights reserved.
Measuring efficiency of international crude oil markets: A multifractality approach
NASA Astrophysics Data System (ADS)
Niere, H. M.
2015-01-01
The three major international crude oil markets are treated as complex systems and their multifractal properties are explored. The study covers daily prices of Brent crude, OPEC reference basket and West Texas Intermediate (WTI) crude from January 2, 2003 to January 2, 2014. A multifractal detrended fluctuation analysis (MFDFA) is employed to extract the generalized Hurst exponents in each of the time series. The generalized Hurst exponent is used to measure the degree of multifractality, which in turn is used to quantify the efficiency of the three international crude oil markets. To identify whether the source of multifractality is long-range correlations or broad fat-tail distributions, shuffled data and surrogated data corresponding to each of the time series are generated. Shuffled data are obtained by randomizing the order of the price returns data. This destroys any long-range correlation of the time series. Surrogated data are produced using the Fourier-Detrended Fluctuation Analysis (F-DFA). This is done by randomizing the phases of the price returns data in Fourier space. This normalizes the distribution of the time series. The study found that for the three crude oil markets, there is a strong dependence of the generalized Hurst exponents with respect to the order of fluctuations. This shows that the daily price time series of the markets under study have signs of multifractality. Using the degree of multifractality as a measure of efficiency, the results show that WTI is the most efficient while OPEC is the least efficient market. This implies that OPEC has the highest likelihood to be manipulated among the three markets. This reflects the fact that Brent and WTI are very competitive markets and hence have a higher level of complexity compared with OPEC, which has large monopoly power.
Comparing with shuffled data and surrogated data, the findings suggest that for all the three crude oil markets, the multifractality is mainly due to long-range correlations.
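The two surrogate constructions used to isolate the source of multifractality can be sketched directly; the fat-tailed "returns" below are synthetic Student-t draws, not actual oil-price data:

```python
import numpy as np

def shuffled_surrogate(returns, seed=0):
    """Destroy long-range correlations by permuting the returns in time."""
    rng = np.random.default_rng(seed)
    return rng.permutation(returns)

def phase_surrogate(returns, seed=0):
    """Randomize Fourier phases: preserves the power spectrum (linear
    correlations) while Gaussianizing the distribution, removing fat tails."""
    rng = np.random.default_rng(seed)
    spectrum = np.fft.rfft(returns)
    phases = rng.uniform(0, 2 * np.pi, spectrum.size)
    phases[0] = 0.0    # keep the DC component real
    phases[-1] = 0.0   # keep the Nyquist component real (even length)
    return np.fft.irfft(np.abs(spectrum) * np.exp(1j * phases),
                        n=returns.size)

# Fat-tailed toy "price returns"
rng = np.random.default_rng(5)
r = rng.standard_t(df=3, size=1024)
s1, s2 = shuffled_surrogate(r), phase_surrogate(r)
print(np.allclose(np.sort(s1), np.sort(r)))       # same values, new order
print(round(float(abs(r.std() - s2.std())), 6))   # spectrum, hence variance, kept
```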
Fischer, Michael A; Leidner, Bertil; Kartalis, Nikolaos; Svensson, Anders; Aspelin, Peter; Albiin, Nils; Brismar, Torkel B
2014-01-01
To assess feasibility and image quality (IQ) of a new post-processing algorithm for retrospective extraction of an optimised multi-phase CT (time-resolved CT) of the liver from volumetric perfusion imaging. Sixteen patients underwent clinically indicated perfusion CT using the 4D spiral mode of dual-source 128-slice CT. Three image sets were reconstructed: motion-corrected and noise-reduced (MCNR) images derived from 4D raw data; maximum and average intensity projections (time MIP/AVG) of the arterial/portal/portal-venous phases and all phases (total MIP/AVG) derived from retrospective fusion of dedicated MCNR split series. Two readers assessed the IQ, detection rate and evaluation time; one reader assessed image noise and lesion-to-liver contrast. Time-resolved CT was feasible in all patients. Each post-processing step yielded a significant reduction of image noise and evaluation time, maintaining lesion-to-liver contrast. Time MIPs/AVGs showed the highest overall IQ without relevant motion artefacts and best depiction of the arterial and portal/portal-venous phases respectively. Time MIPs demonstrated a significantly higher detection rate for arterialised liver lesions than total MIPs/AVGs and the raw data series. Time-resolved CT allows data from volumetric perfusion imaging to be condensed into an optimised multi-phase liver CT, yielding a superior IQ and higher detection rate for arterialised liver lesions than the raw data series. • Four-dimensional computed tomography is limited by motion artefacts and poor image quality. • Time-resolved CT facilitates 4D-CT data visualisation, segmentation and analysis by condensing raw data. • Time-resolved CT demonstrates better image quality than raw data images. • Time-resolved CT improves detection of arterialised liver lesions in cirrhotic patients.
Automatic classification of time-variable X-ray sources
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lo, Kitty K.; Farrell, Sean; Murphy, Tara
2014-05-01
To maximize the discovery potential of future synoptic surveys, especially in the field of transient science, it will be necessary to use automatic classification to identify some of the astronomical sources. The data mining technique of supervised classification is suitable for this problem. Here, we present a supervised learning method to automatically classify variable X-ray sources in the Second XMM-Newton Serendipitous Source Catalog (2XMMi-DR2). Random Forest is our classifier of choice since it is one of the most accurate learning algorithms available. Our training set consists of 873 variable sources and their features are derived from time series, spectra, and other multi-wavelength contextual information. The 10-fold cross-validation accuracy on the training data is ∼97% on a 7-class data set. We applied the trained classification model to 411 unknown variable 2XMM sources to produce a probabilistically classified catalog. Using the classification margin and the Random Forest derived outlier measure, we identified 12 anomalous sources, of which 2XMM J180658.7–500250 appears to be the most unusual source in the sample. Its X-ray spectrum is suggestive of an ultraluminous X-ray source but its variability makes it highly unusual. Machine-learned classification and anomaly detection will facilitate scientific discoveries in the era of all-sky surveys.
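The Random Forest plus 10-fold cross-validation workflow can be sketched with scikit-learn; the two-class synthetic feature table below stands in for the paper's 7-class set of time-series, spectral, and multi-wavelength features:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical feature table: each row is a variable X-ray source, with
# stand-in features (the real ones come from time series and spectra).
rng = np.random.default_rng(6)
n_per_class = 120
X = np.vstack([rng.normal(0.0, 1.0, (n_per_class, 5)),
               rng.normal(2.5, 1.0, (n_per_class, 5))])
y = np.repeat([0, 1], n_per_class)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=10)    # 10-fold cross-validation
acc = float(scores.mean())

# Class probabilities for an "unknown" source yield a probabilistic catalogue
clf.fit(X, y)
proba = clf.predict_proba([[2.4] * 5])[0]
print(round(acc, 2), int(proba.argmax()))
```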
On the design of henon and logistic map-based random number generator
NASA Astrophysics Data System (ADS)
Magfirawaty; Suryadi, M. T.; Ramli, Kalamullah
2017-10-01
The key sequence is one of the main elements in a cryptosystem. The True Random Number Generator (TRNG) is one approach to generating the key sequence. The randomness sources of TRNGs are divided into three main groups: electrical-noise based, jitter based, and chaos based. Chaos-based TRNGs utilize a non-linear dynamic system (continuous time or discrete time) as an entropy source. In this study, a new design of TRNG based on a discrete time chaotic system is proposed and simulated in LabVIEW. The principle of the design consists of combining 2D and 1D chaotic systems. A mathematical model is implemented for numerical simulations. We used a comparator process as a harvester method to obtain the series of random bits. Without any post processing, the proposed design generated random bit sequences with high entropy values and passed all NIST SP 800-22 statistical tests.
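The combined-map-plus-comparator idea can be sketched as follows; the Henon and logistic parameters, seeds, and the crude comparator are illustrative choices, not the paper's design, and the raw bits here are somewhat biased (real designs tune the harvester for balance):

```python
def trng_bits(n_bits, x0=0.1, y0=0.3, z0=0.7):
    """Sketch of a combined-map bit generator: iterate a 2D Henon map and a
    1D logistic map, then harvest one bit per step with a comparator."""
    a, b, r = 1.4, 0.3, 4.0                # classic chaotic parameter choices
    x, y, z = x0, y0, z0
    bits = []
    for _ in range(n_bits):
        x, y = 1 - a * x * x + y, b * x    # Henon map step
        z = r * z * (1 - z)                # logistic map step
        # Comparator: rescale the Henon output to [0, 1] and compare it
        # against the logistic output to emit one bit
        bits.append(1 if (x + 1.5) / 3.0 > z else 0)
    return bits

bits = trng_bits(10000)
frac_ones = sum(bits) / len(bits)
print(round(frac_ones, 2))   # fraction of ones; unbalanced without tuning
```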
Loos, Martin; Krauss, Martin; Fenner, Kathrin
2012-09-18
Formation of soil nonextractable residues (NER) is central to the fate and persistence of pesticides. To investigate pools and extent of NER formation, an established inverse modeling approach for pesticide soil degradation time series was evaluated with a Monte Carlo Markov Chain (MCMC) sampling procedure. It was found that only half of 73 pesticide degradation time series from a homogeneous soil source allowed for well-behaved identification of kinetic parameters with a four-pool model containing a parent compound, a metabolite, a volatile, and a NER pool. A subsequent simulation indeed confirmed distinct parameter combinations of low identifiability. Taking the resulting uncertainties into account, several conclusions regarding NER formation and its impact on persistence assessment could nonetheless be drawn. First, rate constants for transformation of parent compounds to metabolites were correlated to those for transformation of parent compounds to NER, leading to degradation half-lives (DegT50) typically not being larger than disappearance half-lives (DT50) by more than a factor of 2. Second, estimated rate constants were used to evaluate NER formation over time. This showed that NER formation, particularly through the metabolite pool, may be grossly underestimated when using standard incubation periods. It further showed that amounts and uncertainties in (i) total NER, (ii) NER formed from the parent pool, and (iii) NER formed from the metabolite pool vary considerably among data sets at t→∞, with no clear dominance between (ii) and (iii). However, compounds containing aromatic amine moieties were found to form significantly more total NER when extrapolating to t→∞ than the other compounds studied. Overall, our study stresses the general need for assessing uncertainties, identifiability issues, and resulting biases when using inverse modeling of degradation time series for evaluating persistence and NER formation.
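The four-pool structure can be illustrated with a first-order kinetic simulation; the rate constants and pool names below are hypothetical, not the study's fitted MCMC parameters:

```python
import numpy as np

def four_pool(t, k_pm, k_pn, k_pv, k_mn):
    """Hypothetical first-order four-pool model: parent (P) transforms to
    metabolite (M), NER (N), and volatiles (V); the metabolite also forms
    NER. Forward-Euler integration; rate names are illustrative.
      dP/dt = -(k_pm + k_pn + k_pv) P      dM/dt = k_pm P - k_mn M
      dN/dt = k_pn P + k_mn M              dV/dt = k_pv P
    """
    dt = t[1] - t[0]
    P, M, N, V = [1.0], [0.0], [0.0], [0.0]
    for _ in t[1:]:
        p, m = P[-1], M[-1]
        P.append(p - dt * (k_pm + k_pn + k_pv) * p)
        M.append(m + dt * (k_pm * p - k_mn * m))
        N.append(N[-1] + dt * (k_pn * p + k_mn * m))
        V.append(V[-1] + dt * k_pv * p)
    return tuple(np.array(v) for v in (P, M, N, V))

t = np.linspace(0, 120, 12001)            # days
P, M, N, V = four_pool(t, k_pm=0.05, k_pn=0.02, k_pv=0.01, k_mn=0.03)
total = float(P[-1] + M[-1] + N[-1] + V[-1])
# Mass is conserved, and long runs route most mass into NER via the
# metabolite pool, which is why short incubations can underestimate NER.
print(round(total, 6), round(float(N[-1]), 2))
```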
A Methodological Framework for Model Selection in Interrupted Time Series Studies.
Lopez Bernal, J; Soumerai, S; Gasparrini, A
2018-06-06
Interrupted time series is a powerful and increasingly popular design for evaluating public health and health service interventions. The design involves analysing trends in the outcome of interest and estimating the change in trend following an intervention relative to the counterfactual (the expected ongoing trend if the intervention had not occurred). There are two key components to modelling this effect: first, defining the counterfactual; second, defining the type of effect that the intervention is expected to have on the outcome, known as the impact model. The counterfactual is defined by extrapolating the underlying trends observed before the intervention to the post-intervention period. In doing this, authors must consider the pre-intervention period that will be included, any time varying confounders, whether trends may vary within different subgroups of the population and whether trends are linear or non-linear. Defining the impact model involves specifying the parameters that model the intervention, including for instance whether to allow for an abrupt level change or a gradual slope change, whether to allow for a lag before any effect on the outcome, whether to allow a transition period during which the intervention is being implemented and whether a ceiling or floor effect might be expected. Inappropriate model specification can bias the results of an interrupted time series analysis and using a model that is not closely tailored to the intervention or testing multiple models increases the risk of false positives being detected. It is important that authors use substantive knowledge to customise their interrupted time series model a priori to the intervention and outcome under study. Where there is uncertainty in model specification, authors should consider using separate data sources to define the intervention, running limited sensitivity analyses or undertaking initial exploratory studies. Copyright © 2018. Published by Elsevier Inc.
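One common impact model, an abrupt level change plus a slope change, can be fitted with segmented regression; the monthly outcome series and effect sizes below are synthetic illustrations, not data from any study:

```python
import numpy as np

def its_fit(y, intervention_idx):
    """Segmented regression for an interrupted time series allowing an
    abrupt level change and a slope change (one possible impact model)."""
    n = len(y)
    t = np.arange(n)
    post = (t >= intervention_idx).astype(float)
    t_post = post * (t - intervention_idx)
    # Columns: intercept, pre-existing trend, level change, slope change
    X = np.column_stack([np.ones(n), t, post, t_post])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta  # [intercept, trend, level_change, slope_change]

# Synthetic monthly outcome: rising trend, then a drop of 8 units and a
# flattened slope after an intervention at month 36
rng = np.random.default_rng(7)
t = np.arange(72)
y = 50 + 0.5 * t - 8.0 * (t >= 36) - 0.4 * (t >= 36) * (t - 36)
y = y + rng.normal(0, 1.0, 72)
beta = its_fit(y, 36)
print(round(beta[2], 1), round(beta[3], 2))   # recovers roughly -8 and -0.4
```

The counterfactual is the extrapolation of the first two terms alone; lags, transition periods, or non-linear trends would require extra columns, which is exactly the a priori specification choice the paper emphasises.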
NASA Astrophysics Data System (ADS)
Jackson, Brian; Lorenz, Ralph; Davis, Karan
2018-01-01
Dust devils are likely the dominant source of dust for the martian atmosphere, but the amount and frequency of dust-lifting depend on the statistical distribution of dust devil parameters. Dust devils exhibit pressure perturbations and, if they pass near a barometric sensor, they may register as a discernible dip in a pressure time-series. Leveraging this fact, several surveys using barometric sensors on landed spacecraft have revealed dust devil structures and occurrence rates. However powerful they are, such surveys suffer from non-trivial biases that skew the inferred dust devil properties. For example, such surveys are most sensitive to dust devils with the widest and deepest pressure profiles, but the recovered profiles will be distorted, broader and shallower than the actual profiles. In addition, such surveys often do not provide wind speed measurements alongside the pressure time series, and so the durations of the dust devil signals in the time series cannot be directly converted to profile widths. Fortunately, simple statistical and geometric considerations can de-bias these surveys, allowing conversion of the duration of dust devil signals into physical widths, given only a distribution of likely translation velocities, and the recovery of the underlying distributions of physical parameters. In this study, we develop a scheme for de-biasing such surveys. Applying our model to an in-situ survey using data from the Phoenix lander suggests a larger dust flux and a dust devil occurrence rate about ten times larger than previously inferred. Comparing our results to dust devil track surveys suggests only about one in five low-pressure cells lifts sufficient dust to leave a visible track.
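The duration-to-width conversion and the chord-geometry correction can be sketched as follows; the duration and velocity distributions are placeholders, not Phoenix data, and the single mean-chord factor is a simplification of the paper's full statistical scheme:

```python
import numpy as np

# De-biasing sketch: a barometric survey records only the *duration* of each
# pressure dip; physical widths follow from an assumed distribution of
# translation velocities (all values here are placeholders).
rng = np.random.default_rng(8)
n = 5000
durations = rng.lognormal(mean=1.0, sigma=0.5, size=n)   # seconds, assumed
velocities = rng.weibull(2.0, size=n) * 5.0              # m/s, assumed

# Width of the chord the sensor cut through each devil
chord_widths = durations * velocities

# Geometry: a miss distance b uniform in [0, R] gives chord 2*sqrt(R^2 - b^2),
# so the mean observed chord is (pi/4) of the diameter; correct by 4/pi.
geometric_correction = 4.0 / np.pi
diameters = chord_widths * geometric_correction
ratio = float(np.median(diameters) / np.median(chord_widths))
print(round(ratio, 3))  # 4/pi, about 1.273
```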
Monitoring forest dynamics with multi-scale and time series imagery.
Huang, Chunbo; Zhou, Zhixiang; Wang, Di; Dian, Yuanyong
2016-05-01
To learn the forest dynamics and evaluate the ecosystem services of forests effectively, timely acquisition of spatial and quantitative information on forestland is essential. Here, a new method was proposed for mapping forest cover changes by combining multi-scale satellite remote-sensing imagery with time series data. Using time series Normalized Difference Vegetation Index products derived from the Moderate Resolution Imaging Spectroradiometer images (MODIS-NDVI) and Landsat Thematic Mapper/Enhanced Thematic Mapper Plus (TM/ETM+) images as data sources, a hierarchical stepwise analysis from coarse scale to fine scale was developed for detecting the forest change area. At the coarse scale, MODIS-NDVI data with 1-km resolution were used to detect the changes in land cover types, and a land cover change map was constructed using NDVI values at vegetation growing seasons. At the fine scale, based on the results at the coarse scale, Landsat TM/ETM+ data with 30-m resolution were used to precisely detect the forest change location and forest change trend by analyzing forest vegetation index (IFZ) time series. The method was tested using the data for Hubei Province, China. The MODIS-NDVI data from 2001 to 2012 were used to detect the land cover changes, and the overall accuracy was 94.02 % at the coarse scale. At the fine scale, the available TM/ETM+ images at vegetation growing seasons between 2001 and 2012 were used to locate and verify forest changes in the Three Gorges Reservoir Area, and the overall accuracy was 94.53 %. The accuracy of the two-layer hierarchical monitoring results indicated that the multi-scale monitoring method is feasible and reliable.
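The coarse-scale step amounts to thresholding a drop in growing-season NDVI between two periods and then inspecting only the flagged cells at fine scale. A minimal sketch, with an invented threshold and synthetic data:

```python
import numpy as np

# Illustrative sketch only (threshold and data are invented): flag
# coarse-scale cells where the growing-season NDVI mean drops markedly
# between two periods; only these cells would then be examined at the
# fine (Landsat) scale.
rng = np.random.default_rng(2)
ndvi_2001 = rng.uniform(0.5, 0.8, (10, 10))   # growing-season mean, period 1
ndvi_2012 = ndvi_2001.copy()
ndvi_2012[2:4, 2:4] -= 0.3                    # simulated forest loss patch

changed = (ndvi_2001 - ndvi_2012) > 0.15      # coarse-scale change mask
rows, cols = np.nonzero(changed)
print(sorted(zip(rows.tolist(), cols.tolist())))  # → [(2, 2), (2, 3), (3, 2), (3, 3)]
```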
Decadal-Scale Crustal Deformation Transients in Japan Prior to the March 11, 2011 Tohoku Earthquake
NASA Astrophysics Data System (ADS)
Mavrommatis, A. P.; Segall, P.; Miyazaki, S.; Owen, S. E.; Moore, A. W.
2012-12-01
Excluding postseismic transients and slow-slip events, interseismic deformation is generally believed to accumulate linearly in time. We test this assumption using data from Japan's GPS Earth Observation Network System (GEONET), which provides high-precision time series spanning over 10 years. Here we report regional signals of decadal transients that in some cases appear to be unrelated to any known source of deformation. We analyze GPS position time series processed independently, using the BERNESE and GIPSY-PPP software, provided by the Geospatial Information Authority of Japan (GSI) and a collaborative effort of Jet Propulsion Laboratory (JPL) and Dr. Mark Simons (Caltech), respectively. We use time series from 891 GEONET stations, spanning an average of ~14 years prior to the Mw 9.0 March 11, 2011 Tohoku earthquake. We assume a time series model that includes a linear term representing constant velocity, as well as a quadratic term representing constant acceleration. Postseismic transients, where observed, are modeled by A log(1 + t/tc). We also model seasonal terms and antenna offsets, and solve for the best-fitting parameters using standard nonlinear least squares. Uncertainties in model parameters are determined by linear propagation of errors. Noise parameters are inferred from time series that lack obvious transients using maximum-likelihood estimation and assuming a combination of power-law and white noise. Resulting velocity uncertainties are on the order of 1.0 to 1.5 mm/yr. Excluding stations with high misfit to the time series model, our results reveal several spatially coherent patterns of statistically significant (at as much as 5σ) apparent crustal acceleration in various regions of Japan. The signal exhibits similar patterns in both the GSI and JPL solutions and is not coherent across the entire network, which indicates that the pattern is not a reference frame artifact. 
We interpret most of the accelerations to represent transient deformation due to known sources, including slow-slip events (e.g., the post-2000 Tokai event) or postseismic transients due to large earthquakes prior to 1996 (e.g., the M 7.7 1993 Hokkaido-Nansei-Oki and M 7.7 1994 Sanriku-Oki earthquakes). Viscoelastic modeling will be required to confirm the influence of past earthquakes on the acceleration field. In addition to these signals, we find spatially coherent accelerations in the Tohoku and Kyushu regions. Specifically, we observe generally southward acceleration extending for ~400 km near the west coast of Tohoku, east-southeastward acceleration covering ~200 km along the southeast coast of Tohoku, and west-northwestward acceleration spanning ~100 km across the south coast of Kyushu. Interestingly, the eastward acceleration field in Tohoku is spatially correlated with the extent of the March 11, 2011 Mw 9.0 rupture area. We note that the inferred acceleration is present prior to the sequence of M 7+ earthquakes beginning in 2003, and that short-term transients following these events have been accounted for in the analysis. A possible, although non-unique, cause of the acceleration is increased slip rate on the Japan Trench. However, such widespread changes would not be predicted by standard earthquake nucleation models.
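The trajectory model described in this abstract (linear velocity plus constant acceleration plus seasonal terms) can be sketched as a least-squares fit to synthetic data; the postseismic log term, antenna offsets, and power-law noise model of the actual analysis are omitted here, and all values are invented.

```python
import numpy as np
from scipy.optimize import curve_fit

# Sketch of the GPS trajectory model: position = offset + velocity * t
# + 0.5 * acceleration * t^2 + annual seasonal term. Synthetic data only.
def position(t, a, v, c, s1, c1):
    return (a + v * t + 0.5 * c * t ** 2
            + s1 * np.sin(2 * np.pi * t) + c1 * np.cos(2 * np.pi * t))

rng = np.random.default_rng(3)
t = np.linspace(0, 14, 730)                     # ~14 years of weekly positions
truth = position(t, 0.0, 10.0, 0.4, 2.0, -1.0)  # mm, mm/yr, mm/yr^2, mm, mm
obs = truth + rng.normal(0, 1.5, t.size)        # white noise, mm

p, cov = curve_fit(position, t, obs)
sigma = np.sqrt(np.diag(cov))                   # formal parameter uncertainties
print(p[1], sigma[1])                           # velocity and its uncertainty
print(p[2], sigma[2])                           # acceleration and its uncertainty
```

Note that with pure white noise the formal velocity uncertainty is far smaller than the 1.0-1.5 mm/yr quoted in the abstract; the difference is the power-law noise component, which inflates real-world uncertainties.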
Operational use of open satellite data for marine water quality monitoring
NASA Astrophysics Data System (ADS)
Symeonidis, Panagiotis; Vakkas, Theodoros
2017-09-01
The purpose of this study was to develop an operational platform for marine water quality monitoring using near real time satellite data. The developed platform utilizes free and open satellite data available from different data sources like COPERNICUS, the European Earth Observation Initiative, or NASA, from different satellites and instruments. The quality of the marine environment is operationally evaluated using parameters like chlorophyll-a concentration, water color and Sea Surface Temperature (SST). For each parameter, there are more than one dataset available, from different data sources or satellites, to allow users to select the most appropriate dataset for their area or time of interest. The above datasets are automatically downloaded from the data provider's services and ingested to the central, spatial engine. The spatial data platform uses the Postgresql database with the PostGIS extension for spatial data storage and Geoserver for the provision of the spatial data services. The system provides daily, 10 days and monthly maps and time series of the above parameters. The information is provided using a web client which is based on the GET SDI PORTAL, an easy to use and feature rich geospatial visualization and analysis platform. The users can examine the temporal variation of the parameters using a simple time animation tool. In addition, with just one click on the map, the system provides an interactive time series chart for any of the parameters of the available datasets. The platform can be offered as Software as a Service (SaaS) to any area in the Mediterranean region.
NASA Technical Reports Server (NTRS)
Deming, Drake; Boyle, Robert J.; Jennings, Donald E.; Wiedemann, Gunter
1988-01-01
The use of the extremely Zeeman-sensitive IR emission line Mg I, at 12.32 microns, to study solar magnetic fields. Time series observations of the line in the quiet sun were obtained in order to determine the response time of the line to the five-minute oscillations. Based upon the velocity amplitude and average period measured in the line, it is concluded that it is formed in the temperature minimum region. The magnetic structure of sunspots is investigated by stepping a small field of view in linear 'slices' through the spots. The region of penumbral line formation does not show the Evershed outflow common in photospheric lines. The line intensity is a factor of two greater in sunspot penumbrae than in the photosphere, and at the limb the penumbral emission begins to depart from optical thinness, the line source function increasing with height. For a spot near disk center, the radial decrease in absolute magnetic field strength is steeper than the generally accepted dependence.
Seismic and Aseismic Slip on the Cascadia Megathrust
NASA Astrophysics Data System (ADS)
Michel, S. G. R. M.; Gualandi, A.; Avouac, J. P.
2017-12-01
Our understanding of the dynamics governing aseismic and seismic slip hinges on our ability to image the time evolution of fault slip during and in between earthquakes and transients. Such kinematic descriptions are also pivotal to assess seismic hazard as, over the long term, elastic strain accumulating around a fault should be balanced by elastic strain released by seismic slip and aseismic transients. In this presentation, we will discuss how such kinematic descriptions can be obtained from the analysis and modelling of geodetic time series. We will use inversion methods based on Independent Component Analysis (ICA) decomposition of the time series to extract and model the aseismic slip (afterslip and slow slip events). We will show that this approach is very effective to identify, and filter out, non-tectonic sources of geodetic strain such as the strain due to surface loads, which can be estimated using gravimetric measurements from GRACE, and thermal strain. We will discuss in particular the application to the Cascadia subduction zone.
NASA Astrophysics Data System (ADS)
Gerard-Marchant, P. G.
2008-12-01
Numpy is a free, open source C/Python interface designed for the fast and convenient manipulation of multidimensional numerical arrays. The base object, ndarray, can also easily be extended to define new objects meeting specific needs. Thanks to its simplicity, efficiency and modularity, numpy and its companion library Scipy have become increasingly popular in the scientific community over the last few years, with applications ranging from astronomy and engineering to finance and statistics. Its capacity to handle missing values is particularly appealing when analyzing environmental time series, where irregular data sampling might be an issue. After reviewing the main characteristics of numpy objects and the mechanism of subclassing, we will present the scikits.timeseries package, developed to manipulate single- and multi-variable arrays indexed in time. We will illustrate some typical applications of this package by introducing climpy, a set of extensions designed to help analyze the impacts of climate variability on environmental data such as precipitation or streamflow.
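The missing-value handling mentioned above rests on numpy's masked arrays; a small example with a gappy precipitation series (the values and the -999 sentinel are illustrative):

```python
import numpy.ma as ma

# Masked-array mechanism: statistics on an environmental series with
# missing observations (here flagged with a -999 sentinel and a mask).
precip = ma.array([12.0, 0.0, -999.0, 7.5, -999.0, 3.2],
                  mask=[False, False, True, False, True, False])
print(precip.mean())    # 5.675 — masked values are ignored
print(precip.count())   # 4 valid observations
```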
Shorvon, Simon
2007-01-01
This paper records the history of Epilepsia, the journal of the International League Against Epilepsy, from its inception in 1908/1909 until the beginning of its fourth series in 1961. During this time, publication was interrupted on three occasions, and so the journal appeared in four series with a complex numbering system. Over the years, the content and format of the journal have varied. Its role has also changed, at times primarily as a scientific organ and at other times as a source of ILAE news and reports. Concerns throughout its history have included its role as an historical record, its international representation, financial vicissitudes, quality of papers, the balance between basic and clinical science, the value of clinical papers, and issues of overspecialization. Epilepsia is today the leading clinical epilepsy journal, but these remain significant concerns, and a knowledge of the history of Epilepsia is important for understanding the current position of the journal.
Results of meteorological monitoring in Gorny Altai before and after the Chuya earthquake in 2003
NASA Astrophysics Data System (ADS)
Aptikaeva, O. I.; Shitov, A. V.
2014-12-01
We consider the dynamics of some meteorological parameters in Gorny Altai from 2000 to 2011. We analyzed the variations in the meteorological parameters related to the strong Chuya earthquake (September 27, 2003). A number of anomalies were revealed in the time series. Before this strong earthquake, the winter temperatures at the meteorological station nearest to the earthquake source increased by 8-10°C (by 2009 they returned to the mean values), while the air humidity in winter decreased. In the winter of 2002, we observed a long negative anomaly in the time series of atmospheric pressure. At the same time, the decrease in the released seismic energy was replaced by a tendency to increase. Using wavelet analysis, we revealed synchronism in the dynamics of the atmospheric parameters, variations in the solar and geomagnetic activities, and geodynamic processes. We also discuss the relationship between atmospheric and geodynamic processes and their implications for the climatic comfort of the local population.
NASA Astrophysics Data System (ADS)
Williams, B.; Thibodeau, B.; Chikaraishi, Y.; Ohkouchi, N.; Grottoli, A. G.
2014-12-01
Instrumental and proxy data and global climate model experiments indicate a multi-decadal shoaling of the western tropical Pacific (WTP) thermocline potentially related to a shift in ENSO frequency. In the WTP, the nutricline coincides with the thermocline, and a shoaling of the nutricline brings more nitrate-rich seawater higher in the water column and within the sunlit euphotic zone. In the nutrient-poor WTP, this incursion of nitrate-rich water at the bottom of the euphotic zone may stimulate productivity in the water column. However, there is a general paucity of measurements below the surface with which to investigate recent changes in seawater chemistry. Nitrogen isotope (δ15N) measurements of particulate organic matter (POM) can elucidate the source of nitrogen to the WTP and related trophic dynamics. This POM is the food source of long-lived proteinaceous corals and drives the nitrogen isotopic composition of their skeletons. Here, we report time series δ15N values from the banded skeletons of proteinaceous corals from offshore Palau in the WTP that provide proxy information about past changes in euphotic zone nitrogen dynamics. Bulk skeletal δ15N values declined between 1977 and 2010, suggesting a progressively increasing contribution of deep water with isotopically-light nitrate to the euphotic zone and/or a shortening of the planktonic food web. Since only some amino acids are enriched in δ15N with each trophic transfer in a food web, we measured the δ15N composition of seven individual amino acids in the same coral skeleton. The δ15N time series of the individual amino acids also declined over time, mirroring the bulk values. These new data indicate that changes in the source nitrogen at the base of the euphotic zone drive a decline in coral skeletal δ15N values, consistent with the shoaling nutricline, with no coinciding alteration of the trophic structure in the WTP.
Derivation of GNSS derived station velocities for a surface deformation model in the Austrian region
NASA Astrophysics Data System (ADS)
Umnig, Elke; Weber, Robert; Maras, Jadre; Brückl, Ewald
2016-04-01
This contribution deals with the first comprehensive analysis of GNSS derived surface velocities computed within an observation network of about 100 stations covering the whole Austrian territory and parts of the neighbouring countries. Coordinate time series are now available, spanning a period of 5 years (2010.0-2015.0) for one focus area in East Austria and one and a half years (2013.5-2015.0) for the remaining part of the tracking network. The data series stem from two different GNSS campaigns. The former was set up to investigate intra-plate tectonic movements within the framework of the project ALPAACT (seismological and geodetic monitoring of ALpine-PAnnonian ACtive Tectonics), while the latter was designed to support a number of different applications, e.g. the derivation of GNSS-based water vapour fields, but also to expand the aforementioned tectonic studies. In addition, the activities within the ALPAACT project supplement the educational initiative SCHOOLS & QUAKES, where scholars contribute to seismological research. Daily solutions were computed for the whole period of the processed coordinate time series by means of the Bernese software. The processed coordinate time series are tied to the global reference frame ITRF2000 as well as to the frame ITRF2008. Due to the transition of the reference from ITRF2000 to ITRF2008 within the processing period, but also due to updates of the Bernese software from version 5.0 to 5.2, the time series were initially not fully consistent and had to be re-aligned to a common frame. The goal of this investigation is therefore to derive a nationwide consistent horizontal motion field on the basis of GNSS reference station data within the ITRF2008 frame, but also with respect to the Eurasian plate. In this presentation we focus on the set-up of the coordinate time series and on the problem of frame alignment. 
Special attention is also paid to the separation into linear and periodic motion signals, originating from tectonic or non-tectonic sources.
The Chaotic Light Curves of Accreting Black Holes
NASA Technical Reports Server (NTRS)
Kazanas, Demosthenes
2007-01-01
We present model light curves for accreting Black Hole Candidates (BHC) based on a recently developed model of these sources. According to this model, the observed light curves and aperiodic variability of BHC are due to a series of soft photon injections at random (Poisson) intervals and the stochastic nature of the Comptonization process in converting these soft photons to the observed high energy radiation. The additional assumption of our model is that the Comptonization process takes place in an extended but non-uniform hot plasma corona surrounding the compact object. We compute the corresponding Power Spectral Densities (PSD), autocorrelation functions, time skewness of the light curves and time lags between the light curves of the sources at different photon energies and compare our results to observations. Our model reproduces the observed light curves well, in that it provides good fits to their overall morphology (as manifested by the autocorrelation and time skewness) and also to their PSDs and time lags, by producing most of the variability power at time scales of a few seconds or longer, while at the same time allowing for shots of a few msec in duration, in accordance with observation. We suggest that refinement of this type of model along with spectral and phase lag information can be used to probe the structure of this class of high energy sources.
NASA Astrophysics Data System (ADS)
Allstadt, K.; Moretti, L.; Mangeney, A.; Stutzmann, E.; Capdeville, Y.
2014-12-01
The time series of forces exerted on the earth by a large and rapid landslide derived remotely from the inversion of seismic records can be used to tie post-slide evidence to what actually occurred during the event and can be used to tune numerical models and test theoretical methods. This strategy is applied to the 48.5 Mm3 August 2010 Mount Meager rockslide-debris flow in British Columbia, Canada. By inverting data from just five broadband seismic stations less than 300 km from the source, we reconstruct the time series of forces that the landslide exerted on the Earth as it occurred. The result illuminates a complex retrogressive initiation sequence and features attributable to flow over a complicated path including several curves and runup against a valley wall. The seismically derived force history also allows for the estimation of the horizontal acceleration (0.39 m/s^2) and average apparent coefficient of basal friction (0.38) of the rockslide, and the speed of the center of mass of the debris flow (peak of 92 m/s). To extend beyond these simple calculations and to test the interpretation, we also use the seismically derived force history to guide numerical modeling of the event - seeking to simulate the landslide in a way that best fits both the seismic and field constraints. This allows for a finer reconstruction of the volume, timing, and sequence of events, estimates of friction, and spatiotemporal variations in speed and flow thickness. The modeling allowed us to analyze the sensitivity of the force to the different parameters involved in the landslide modeling to better understand what can and cannot be constrained from seismic source inversions of landslide signals.
Magnetic Oscillations Mark Sites of Magnetic Transients in an Acoustically Active Flare
NASA Astrophysics Data System (ADS)
Lindsey, Charles A.; Donea, A.; Hudson, H. S.; Martinez Oliveros, J.; Hanson, C.
2011-05-01
The flare of 2011 February 15, in NOAA AR11158, was the first acoustically active flare of solar cycle 24, and the first observed by the Solar Dynamics Observatory (SDO). It was exceptional in a number of respects (Kosovichev 2011a,b). Sharp ribbon-like transient Doppler and magnetic signatures swept over parts of the active region during the impulsive phase of the flare. We apply seismic holography to a 2-hr time series of HMI observations encompassing the flare. The acoustic source distribution appears to have been strongly concentrated in a single highly compact penumbral region in which the continuum-intensity signature was unusually weak. The line-of-sight magnetic transient was strong in parts of the active region, but relatively weak in the seismic-source region. On the other hand, the neighbourhoods of the regions visited by the strongest magnetic transients maintained conspicuous 5-minute-period variations in the line-of-sight magnetic signature for the full 2-hr duration of the time series, before the flare as well as after. We apply standard helioseismic control diagnostics for clues as to the physics underlying 5-minute magnetic oscillations in regions conducive to magnetic transients during a flare, and consider the prospective development of this property as an indicator of flare potentiality on some time scale. We make use of high-resolution data from AIA, using diffracted images where necessary to obtain good photometry where the image is otherwise saturated. This is relevant to seismic emission driven by thick-target heating in the absence of back-warming. We also use RHESSI imaging spectroscopy to compare the source distributions of HXR and seismic emission.
NASA Astrophysics Data System (ADS)
Mau, S.; Reed, J.; Clark, J.; Valentine, D.
2006-12-01
Large quantities of natural gas are emitted from the seafloor into the coastal ocean near Coal Oil Point, Santa Barbara Channel (SBC), California. Methane, ethane, and propane were quantified in the surface water at 79 stations in a 270 km2 area in order to map the surficial hydrocarbon plume and to quantify air-sea exchange of these gases. A time series was initiated for 14 stations to identify the variability of the mapped plume, and biologically-mediated oxidation rates of methane were measured to quantify the loss of methane in surface water. The hydrocarbon plume was found to span ~70 km2 and extended beyond the study area. The plume width narrowed from 3 km near the source to 0.7 km further from the source, and then expanded to 6.7 km at the edge of the study area. This pattern matches the cyclonic gyre that is the normal current flow in this part of the Santa Barbara Channel, pushing water to the shore near the seep field and then broadening the plume as the water turns offshore further from the source. Concentrations of gaseous hydrocarbons decrease as the plume migrates. Time series sampling shows similar plume width and hydrocarbon concentrations when normal current conditions prevail. In contrast, smaller plume width and low hydrocarbon concentrations were observed when an additional anticyclonic eddy reversed the normal current flow, and a much broader plume with higher hydrocarbon concentrations was observed during a time of diminished speed within the current gyre. These results demonstrate that surface currents control hydrocarbon plume dynamics in the SBC, though hydrocarbon flux to the atmosphere is likely less dependent on currents. Estimates of air-sea hydrocarbon flux and biological oxidation rates will also be presented.
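The air-sea flux estimate mentioned at the end is, in its simplest bulk form, F = k(C_water − C_eq); all numbers below are invented placeholders, not the survey's values, and real transfer velocities depend on wind speed and gas solubility.

```python
# Hedged sketch of a bulk diffusive air-sea methane flux calculation.
# All values are illustrative assumptions, not measurements from the study.
k = 3.0e-5                    # gas transfer velocity, m/s (assumed)
c_water = 4.0e-4              # dissolved CH4 in surface water, mol/m^3 (assumed)
c_eq = 2.5e-6                 # concentration in equilibrium with air, mol/m^3

flux = k * (c_water - c_eq)   # mol CH4 per m^2 per second
flux_per_day = flux * 86400   # mol CH4 per m^2 per day
print(flux_per_day)
```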
Time-Series Similarity Analysis of Satellite Derived Data to Understand Changes in Forest Biomass.
NASA Astrophysics Data System (ADS)
Singh, N.; Fritz, B.
2017-12-01
One of the goals of promoting bioenergy is reducing greenhouse gas emissions by replacing fossil fuels. However, there are concerns that carbon emissions due to changes in land use resulting from crop production for ethanol will negate the impact of biofuels on the environment. The current focus is therefore on lignocellulosic feedstocks, also referred to as second-generation biofuels, as the new source of bioenergy. Wood-based pellets derived from the forests of the southeastern United States are one such source, and are being exported to Europe as a carbon-neutral fuel. These wood pellets meet the EU standard for carbon emissions and are being used to replace coal for energy generation and heating. As a result, US exports of wood-based pellets have increased from nearly zero to over 6 million metric tons over the past 8 years. Wood-based pellets are traditionally produced from softwood trees, which have a relatively shorter life-cycle and propagate easily, and thus are expected to provide a sustainable source of the wood chips used for pellet production. However, there are concerns that as the demand and price of wood pellets increase, lumber mills will seek wood chips from other sources as well, particularly from hardwood trees, resulting in higher carbon emissions as well as loss of biodiversity. In this study we use annual stacks of normalized difference vegetation index (NDVI) data at a 16-day temporal resolution to monitor biomass around pellet mills in the southeastern United States. We use a combination of a time series similarity technique and supervised learning to understand if there have been significant changes in biomass around pellet mills in the southeastern US. We also demonstrate how our method can be used to monitor biomass over large geographic regions using phenological properties of growing vegetation.
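A minimal version of a time-series similarity measure of the kind used here is the Euclidean distance between annual NDVI profiles; the study's actual similarity technique and supervised-learning step are richer than this sketch, and the profiles below are invented.

```python
import numpy as np

# Illustrative similarity measure between two annual NDVI profiles
# (16-day composites would give ~23 points per year; 7 are used here
# purely for brevity, and both profiles are synthetic).
def similarity(a, b):
    return float(np.sqrt(np.sum((np.asarray(a) - np.asarray(b)) ** 2)))

stable  = [0.3, 0.5, 0.7, 0.8, 0.7, 0.5, 0.3]   # typical forest phenology
cleared = [0.3, 0.5, 0.7, 0.3, 0.2, 0.2, 0.2]   # mid-season biomass loss
print(similarity(stable, stable))    # 0.0 — identical profiles
print(similarity(stable, cleared))   # larger distance flags a change
```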
An energy management for series hybrid electric vehicle using improved dynamic programming
NASA Astrophysics Data System (ADS)
Peng, Hao; Yang, Yaoquan; Liu, Chunyu
2018-02-01
With the increasing number of hybrid electric vehicles (HEVs), management of the two energy sources, engine and battery, is increasingly important for achieving minimum fuel consumption. This paper first introduces several working modes of the series hybrid electric vehicle (SHEV) and then describes the mathematical model of the main components of the SHEV. Based on this model, dynamic programming is applied in Matlab to distribute energy between engine and battery, achieving lower fuel consumption than a traditional control strategy. In addition, a control rule for recovering energy during braking is incorporated into the dynamic programming, and the improved algorithm also realizes shorter computing time.
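A toy backward dynamic program illustrates the energy-split idea; the state-of-charge grid, fuel-rate curve, power limits, and demand profile are all invented, and the paper's vehicle model and cost function are far more detailed.

```python
import numpy as np

# Toy backward DP for splitting power demand between engine and battery
# in a series hybrid. All quantities are illustrative assumptions.
demand = [20.0, 30.0, 10.0, -15.0]        # kW per step; negative = braking
soc_grid = np.linspace(0.3, 0.8, 26)      # battery state-of-charge grid
dt, capacity = 0.1, 40.0                  # step length (h), battery size (kWh)
choices = np.arange(-24.0, 25.0, 8.0)     # battery power options, kW

def fuel(p_engine):
    # assumed convex engine fuel-rate curve (fuel units per step)
    return 0.3 * p_engine + 0.005 * p_engine ** 2

cost = np.zeros(soc_grid.size)            # terminal cost-to-go
for p_dem in reversed(demand):
    new_cost = np.full(soc_grid.size, np.inf)
    for i, soc in enumerate(soc_grid):
        for p_batt in choices:
            p_eng = p_dem - p_batt
            if p_eng < 0:                 # engine cannot absorb power;
                continue                  # braking surplus must charge battery
            soc_next = soc - p_batt * dt / capacity
            j = int(round((soc_next - 0.3) / 0.02))
            if 0 <= j < soc_grid.size:    # stay within SOC limits
                new_cost[i] = min(new_cost[i], fuel(p_eng) + cost[j])
    cost = new_cost
print(cost.min())   # minimum fuel over the horizon from the best initial SOC
```

The braking step (the negative demand) is where the energy-recovery rule enters: the only feasible choices there are ones that charge the battery.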
NASA Astrophysics Data System (ADS)
Christensen, J. N.; Cliff, S. S.; Vancuren, R. A.; Perry, K. D.; Depaolo, D. J.
2006-12-01
Research over the past decade has highlighted the importance of intercontinental transport and exchange of atmospheric aerosols, including soil-derived dust and industrial pollutants. Far-traveled aerosols can affect air quality, atmospheric radiative forcing and cloud formation and can be an important component in soils. Principal component analysis of elemental data for aerosols collected over California has identified a persistent Asian soil dust component that peaks with Asian dust storm events [1]. Isotopic fingerprinting can provide an additional and potentially more discriminating tool for tracing sources of dust. For example, the naturally variable isotopic compositions of Sr and Nd reflect both the geochemistry of the dust source and its pre-weathering geologic history. Sr and Nd isotopic data and chemical data have been collected for a time series of PM2.5 filter samples from Hefei, China, taken from early April into early May 2002. This period encompassed a series of dust storms. The sampling time frame overlapped with the 2002 Intercontinental Transport and Chemical Transformation (ITCT-2K2) experiment along the Pacific coast of North America and inland California. Highs in 87Sr/86Sr in the Hefei time series coincide with peaks in Ca and Si representing peaks in mineral particulate loading resulting from passing dust storms. Mixing diagrams combining isotopic data with chemical data identify several components: a high 87Sr/86Sr component that we identify with mineral dust (loess), and two different low 87Sr/86Sr components (local sources and marine aerosol). Using our measured isotopic composition of the "loess" standard CJ-1 [2] as representative of the pure high 87Sr/86Sr component, we calculate 24-hour average loess particulate concentrations in air which range up to 35 micrograms per cubic meter. Marine aerosol was a major component on at least one of the sampled days. 
The results for the Hefei samples provide a basis for our isotopic study of California mineral aerosols, including the identification and apportionment of local and far-traveled Asian dust components and their variation in time. [1]VanCuren R.A., Cliff, S.S., Perry, K.D. and Jimenez-Cruz, M. (2005) J. Geophys. Res., 110, D09S90, doi: 10.1029/2004JD004973 [2]Nishikawa, M., Hao, Q. and Morita, M. (2000) Global Environ. Res. 4, 1:103-113.
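A two-endmember Sr-isotope mixing calculation of the kind used for such apportionment can be sketched as follows; the loess and sample ratios are illustrative (only the seawater value is a standard number), and a real calculation must also weight by the Sr concentration of each endmember.

```python
# Illustrative two-endmember isotope mixing (assumes equal Sr concentrations
# in both endmembers, which real apportionment must correct for).
r_loess = 0.7185     # 87Sr/86Sr of the loess endmember (assumed value)
r_marine = 0.7092    # 87Sr/86Sr of marine aerosol (modern seawater value)
r_sample = 0.7160    # a hypothetical measured filter sample

# Linear mixing: sample ratio = f * loess + (1 - f) * marine
f_loess = (r_sample - r_marine) / (r_loess - r_marine)
print(f_loess)       # ≈ 0.73: fraction of Sr attributable to loess
```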
Waveform inversion of volcano-seismic signals for an extended source
Nakano, M.; Kumagai, H.; Chouet, B.; Dawson, P.
2007-01-01
We propose a method to investigate the dimensions and oscillation characteristics of the source of volcano-seismic signals based on waveform inversion for an extended source. An extended source is realized by a set of point sources distributed on a grid surrounding the centroid of the source in accordance with the source geometry and orientation. The source-time functions for all point sources are estimated simultaneously by waveform inversion carried out in the frequency domain. We apply a smoothing constraint to suppress short-scale noisy fluctuations of source-time functions between adjacent sources. The strength of the smoothing constraint we select is that which minimizes the Akaike Bayesian Information Criterion (ABIC). We perform a series of numerical tests to investigate the capability of our method to recover the dimensions of the source and reconstruct its oscillation characteristics. First, we use synthesized waveforms radiated by a kinematic source model that mimics the radiation from an oscillating crack. Our results demonstrate almost complete recovery of the input source dimensions and source-time function of each point source, but also point to a weaker resolution of the higher modes of crack oscillation. Second, we use synthetic waveforms generated by the acoustic resonance of a fluid-filled crack, and consider two sets of waveforms dominated by the modes with wavelengths 2L/3 and 2W/3, or L and 2L/5, where W and L are the crack width and length, respectively. Results from these tests indicate that the oscillating signature of the 2L/3 and 2W/3 modes are successfully reconstructed. The oscillating signature of the L mode is also well recovered, in contrast to results obtained for a point source for which the moment tensor description is inadequate. However, the oscillating signature of the 2L/5 mode is poorly recovered owing to weaker resolution of short-scale crack wall motions. 
The triggering excitations of the oscillating cracks are successfully reconstructed. Copyright 2007 by the American Geophysical Union.
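The smoothing-constrained inversion can be sketched as damped least squares with a second-difference roughness operator; the ABIC-based selection of the constraint strength is not reproduced here, and G and d below are random stand-ins for Green's functions and data.

```python
import numpy as np

# Minimal sketch of regularized waveform inversion: solve d = G m subject
# to a second-difference smoothing constraint of strength lam (the paper
# selects lam by minimizing ABIC; here lam is simply fixed).
rng = np.random.default_rng(4)
n_data, n_src = 50, 20
G = rng.normal(size=(n_data, n_src))              # stand-in Green's functions
m_true = np.sin(np.linspace(0, np.pi, n_src))     # smooth source-time function
d = G @ m_true + rng.normal(0, 0.1, n_data)       # noisy synthetic data

# Second-difference operator penalizing rough fluctuations between
# adjacent point sources, as in the smoothing constraint described above
D = np.diff(np.eye(n_src), n=2, axis=0)
lam = 1.0
m_est = np.linalg.solve(G.T @ G + lam * D.T @ D, G.T @ d)
print(np.max(np.abs(m_est - m_true)))             # recovery error
```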
Surface Forcing from CH4 at the North Slope of Alaska and Southern Great Plains Sites
NASA Astrophysics Data System (ADS)
Collins, W.; Feldman, D.; Turner, D. D.
2014-12-01
Recent increases in atmospheric CH4 have been spatially heterogeneous as indicated by in situ flask measurements and space-borne remote-sensing retrievals from the AIRS instrument, potentially leading to increased radiative forcing. We present detailed, specialized measurements at the DOE ARM North Slope of Alaska (NSA) and Southern Great Plains (SGP) sites to derive the time-series of both CH4 atmospheric concentrations and associated radiative implications at highly-contrasting natural and anthropogenic sources. Using a combination of spectroscopic measurements, in situ observations, and ancillary data for the atmospheric thermodynamic state from radiosondes and cloud-clearing from active sounders, we can separate out the contribution of CH4 to clear-sky downwelling radiance spectra and its infrared surface forcing. The time-series indicates year-to-year variation in shoulder season increases of CH4 concentration and forcing at NSA and large signals from anthropogenic activity at SGP.
Flood mapping in ungauged basins using fully continuous hydrologic-hydraulic modeling
NASA Astrophysics Data System (ADS)
Grimaldi, Salvatore; Petroselli, Andrea; Arcangeletti, Ettore; Nardi, Fernando
2013-04-01
In this work, a fully-continuous hydrologic-hydraulic modeling framework for flood mapping is introduced and tested. It is characterized by the simulation of a long rainfall time series at sub-daily resolution that feeds a continuous rainfall-runoff model producing a discharge time series that is directly given as input to a two-dimensional hydraulic model. The main advantage of the proposed approach is to avoid the use of the design hyetograph and the design hydrograph, which constitute the main source of subjective analysis and uncertainty in standard methods. The proposed procedure is optimized for small and ungauged watersheds where empirical models are commonly applied. Results of a simple real case study confirm that this experimental fully-continuous framework may pave the way for the implementation of a less subjective and potentially automated procedure for flood hazard mapping.
The plant phenological online database (PPODB): an online database for long-term phenological data.
Dierenbach, Jonas; Badeck, Franz-W; Schaber, Jörg
2013-09-01
We present an online database that provides unrestricted and free access to over 16 million plant phenological observations from over 8,000 stations in Central Europe between the years 1880 and 2009. Unique features are (1) a flexible and unrestricted access to a full-fledged database, allowing for a wide range of individual queries and data retrieval, (2) historical data for Germany before 1951 ranging back to 1880, and (3) more than 480 curated long-term time series covering more than 100 years for individual phenological phases and plants combined over Natural Regions in Germany. Time series for single stations or Natural Regions can be accessed through a user-friendly graphical geo-referenced interface. The joint databases made available with the plant phenological database PPODB render accessible an important data source for further analyses of long-term changes in phenology. The database can be accessed via www.ppodb.de.
Fan, Yifang; Fan, Yubo; Li, Zhiyu; Newman, Tony; Lv, Changsheng; Zhou, Yi
2013-01-01
No consensus has been reached on how musculoskeletal system injuries or aging can be explained by the walking plantar impulse. We standardize the plantar impulse by defining a principal axis of plantar impulse. Based upon this standardized plantar impulse, two indexes are presented: the plantar pressure record time series and the plantar-impulse distribution along the principal axis of plantar impulse. These indexes are applied to analyze plantar impulses collected by plantar pressure plates from three groups: Achilles tendon rupture patients; elderly people (ages 62-71); and young people (ages 19-23). Our findings reveal that the plantar impulse distribution curves of the Achilles tendon rupture patients change irregularly as the subjects' walking speed changes. Compared with the distribution curves of the young subjects, the elderly subjects' phalanges plantar pressure record time series show a significant difference. This verifies our hypothesis that the plantar impulse can function as a means to assess and evaluate musculoskeletal system injuries and aging.
Forecasting electricity usage using univariate time series models
NASA Astrophysics Data System (ADS)
Hock-Eam, Lim; Chee-Yin, Yip
2014-12-01
Electricity is one of the most important energy sources. A sufficient supply of electricity is vital to support a country's development and growth. Due to changing socio-economic characteristics, increasing competition, and the deregulation of the electricity supply industry, electricity demand forecasting is even more important than before. It is imperative to evaluate and compare the predictive performance of various forecasting methods, as this provides further insight into the weaknesses and strengths of each method. In the literature, there is mixed evidence on the best forecasting methods for electricity demand. This paper aims to compare the predictive performance of univariate time series models for forecasting electricity demand using monthly data on maximum electricity load in Malaysia from January 2003 to December 2013. Results reveal that the Box-Jenkins method produces the best out-of-sample predictive performance. On the other hand, the Holt-Winters exponential smoothing method is a good forecasting method for in-sample predictive performance.
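One of the univariate methods compared above, additive Holt-Winters exponential smoothing, is compact enough to sketch directly. The example below is a minimal hedged implementation run on synthetic monthly load data (the study itself used Malaysian maximum-load data and also Box-Jenkins ARIMA models, which are not reproduced here); the smoothing parameters and the synthetic series are illustrative assumptions.

```python
import numpy as np

def holt_winters_additive(y, m=12, alpha=0.3, beta=0.05, gamma=0.2, h=12):
    """Minimal additive Holt-Winters smoother; returns h-step-ahead forecasts."""
    level = y[:m].mean()
    trend = (y[m:2 * m].mean() - y[:m].mean()) / m   # per-period trend estimate
    season = list(y[:m] - level)                     # initial seasonal indices
    for i in range(len(y)):
        prev_level = level
        s = season[i % m]
        level = alpha * (y[i] - s) + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
        season[i % m] = gamma * (y[i] - level) + (1 - gamma) * s
    return np.array([level + (k + 1) * trend + season[(len(y) + k) % m]
                     for k in range(h)])

# Synthetic monthly "maximum load": trend + annual cycle + noise.
rng = np.random.default_rng(1)
months = np.arange(132)                 # 11 years of monthly observations
y = 100 + 0.5 * months + 10 * np.sin(2 * np.pi * months / 12) \
    + rng.normal(0, 1, months.size)
train, test = y[:120], y[120:]          # hold out the final year
forecast = holt_winters_additive(train, h=12)
mape = 100 * np.mean(np.abs((test - forecast) / test))
```

Out-of-sample accuracy measures such as the MAPE computed here are what allow the in-sample versus out-of-sample comparison the paper reports.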
High gradient lens for charged particle beam
Chen, Yu-Jiuan
2014-04-29
Methods and devices enable shaping of a charged particle beam. A dynamically adjustable electric lens includes a series of alternating layers of insulators and conductors with a hollow center. The series of alternating layers, when stacked together, forms a high gradient insulator (HGI) tube to allow propagation of the charged particle beam through the hollow center of the HGI tube. A plurality of transmission lines are connected to a plurality of sections of the HGI tube, and one or more voltage sources are provided to supply an adjustable voltage value to each transmission line of the plurality of transmission lines. By changing the voltage values supplied to each section of the HGI tube, any desired electric field can be established across the HGI tube. In this way various functionalities including focusing, defocusing, acceleration, deceleration, intensity modulation and others can be effectuated on a time-varying basis.
Anticipated detection of favorable periods for wind energy production by means of information theory
NASA Astrophysics Data System (ADS)
Vogel, Eugenio; Saravia, Gonzalo; Kobe, Sigismund; Schumann, Rolf; Schuster, Rolf
Managing the electric power produced by different sources requires mixing the different response times they present. Thus, for instance, coal burning presents large time lags until operational conditions are reached, while hydroelectric generation can react within seconds or a few minutes to reach the desired productivity. Wind energy production (WEP) can be instantaneously fed to the network to save fuels with low thermal inertia (gas burning, for instance), but this source presents sudden variations within a few hours. We report here for the first time a method based on information theory to handle WEP. This method has been successful in detecting dynamical changes in magnetic transitions and variations of stock markets. An algorithm called wlzip, based on information recognition, is used to recognize the information content of a time series. We make use of publicly available energy data in Germany to simulate real applications. After a calibration process, the system can recognize directly on the WEP data the onset of favorable periods of a desired strength. Optimization can lead to a few hours of anticipation, which is enough to control the mixture of WEP with other energy sources, thus saving fuels.
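The wlzip algorithm itself is the authors' own and is not reproduced here; a crude stand-in for a compression-based information measure can, however, be built with the standard-library zlib: discretize a window of the series into symbols and take the compressed size as a proxy for its information content. Everything below (discretization levels, window contents) is an assumption for illustration only.

```python
import zlib
import numpy as np

def info_content(x, n_levels=8):
    """Crude information measure: size of the zlib-compressed symbol
    sequence of a coarsely discretized window (a stand-in for wlzip)."""
    lo, hi = float(x.min()), float(x.max())
    if hi == lo:
        symbols = bytes(len(x))          # constant window: all-zero symbols
    else:
        idx = np.minimum((n_levels * (x - lo) / (hi - lo)).astype(int),
                         n_levels - 1)   # map values onto n_levels symbols
        symbols = bytes(idx.tolist())
    return len(zlib.compress(symbols, 9))

rng = np.random.default_rng(2)
calm = np.sin(np.linspace(0, 4 * np.pi, 512))   # steady, predictable output
gusty = rng.normal(0, 1, 512)                   # sudden, irregular variations
c_calm = info_content(calm)
c_gusty = info_content(gusty)
```

A sliding-window version of `info_content` over real WEP data would then flag windows whose information content crosses a calibrated threshold, mimicking the anticipated detection of favorable periods.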
Disentangling Time-series Spectra with Gaussian Processes: Applications to Radial Velocity Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Czekala, Ian; Mandel, Kaisey S.; Andrews, Sean M.
Measurements of radial velocity variations from the spectroscopic monitoring of stars and their companions are essential for a broad swath of astrophysics; these measurements provide access to the fundamental physical properties that dictate all phases of stellar evolution and facilitate the quantitative study of planetary systems. The conversion of those measurements into both constraints on the orbital architecture and individual component spectra can be a serious challenge, however, especially for extreme flux ratio systems and observations with relatively low sensitivity. Gaussian processes define sampling distributions of flexible, continuous functions that are well-motivated for modeling stellar spectra, enabling proficient searches for companion lines in time-series spectra. We introduce a new technique for spectral disentangling, where the posterior distributions of the orbital parameters and intrinsic, rest-frame stellar spectra are explored simultaneously without needing to invoke cross-correlation templates. To demonstrate its potential, this technique is deployed on red-optical time-series spectra of the mid-M-dwarf binary LP661-13. We report orbital parameters with improved precision compared to traditional radial velocity analysis and successfully reconstruct the primary and secondary spectra. We discuss potential applications for other stellar and exoplanet radial velocity techniques and extensions to time-variable spectra. The code used in this analysis is freely available as an open-source Python package.
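The building block of a GP model for a smooth spectrum is a covariance kernel plus a posterior mean conditioned on noisy fluxes. The sketch below is a generic squared-exponential GP regression on a synthetic absorption line, not the disentangling code described above (which additionally models orbital parameters and multiple epochs); the kernel hyperparameters and the toy spectrum are assumptions.

```python
import numpy as np

def sq_exp(x1, x2, amp, ell):
    """Squared-exponential covariance: a smooth prior over spectra."""
    return amp ** 2 * np.exp(-0.5 * (x1[:, None] - x2[None, :]) ** 2 / ell ** 2)

rng = np.random.default_rng(3)
wave = np.linspace(0.0, 10.0, 60)                  # arbitrary wavelength grid
truth = 1.0 - 0.5 * np.exp(-0.5 * (wave - 5.0) ** 2 / 0.3 ** 2)  # one line
flux = truth + 0.05 * rng.standard_normal(wave.size)             # noisy fluxes

# GP posterior mean around a unit continuum, conditioned on the noisy fluxes.
sigma = 0.05                                       # assumed noise level
K = sq_exp(wave, wave, amp=0.5, ell=0.3)
post_mean = 1.0 + K @ np.linalg.solve(K + sigma ** 2 * np.eye(wave.size),
                                      flux - 1.0)

rms_raw = np.sqrt(np.mean((flux - truth) ** 2))    # error of the raw data
rms_gp = np.sqrt(np.mean((post_mean - truth) ** 2))  # error after GP smoothing
```

In the full method, Doppler-shifted copies of such GP-modeled component spectra are summed and compared to each observed epoch, so the orbit and the rest-frame spectra are inferred jointly.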
Scaling analysis of bilateral hand tremor movements in essential tremor patients.
Blesic, S; Maric, J; Dragasevic, N; Milanovic, S; Kostic, V; Ljubisavljevic, Milos
2011-08-01
Recent evidence suggests that the dynamic-scaling behavior of the time-series of signals extracted from separate peaks of tremor spectra may reveal the existence of multiple independent sources of tremor. Here, we have studied the dynamic characteristics of the time-series of hand tremor movements in essential tremor (ET) patients using the detrended fluctuation analysis method. Hand accelerometry was recorded with (500 g) and without weight loading under postural conditions in 25 ET patients and 20 normal subjects. The time-series comprising peak-to-peak (PtP) intervals were extracted from regions around the first three main frequency components of the power spectra (PwS) of the recorded tremors. The data were compared between the load and no-load conditions on the dominant (related to tremor severity) and non-dominant tremor side, and with the normal (physiological) oscillations in healthy subjects. Our analysis shows that, in ET, the dynamic characteristics of the main frequency component of recorded tremors exhibit scaling behavior. Furthermore, they show that the two main components of ET tremor frequency spectra, otherwise indistinguishable without load, become significantly different after inertial loading, and that they differ between the tremor sides (related to tremor severity). These results show that scaling, a time-domain analysis, helps reveal tremor features not previously revealed by frequency-domain analysis, and suggest that distinct oscillatory central circuits may generate the tremor in ET patients.
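Detrended fluctuation analysis itself is standard and compact: integrate the mean-removed series, linearly detrend it in windows of size n, and fit the log-log slope of the fluctuation F(n) versus n. A minimal DFA-1 sketch follows; the window sizes and the white-noise test series are illustrative, not the accelerometry data of the study.

```python
import numpy as np

def dfa_alpha(x, scales):
    """DFA-1: returns the scaling exponent alpha from a log-log fit of the
    fluctuation function F(n) against window size n."""
    y = np.cumsum(x - np.mean(x))              # integrated profile
    F = []
    for n in scales:
        f2 = []
        for i in range(len(y) // n):
            seg = y[i * n:(i + 1) * n]
            t = np.arange(n)
            coef = np.polyfit(t, seg, 1)       # local linear detrend
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(f2)))
    return np.polyfit(np.log(scales), np.log(F), 1)[0]

rng = np.random.default_rng(4)
white = rng.standard_normal(4096)              # uncorrelated series: alpha ~ 0.5
alpha_white = dfa_alpha(white, [8, 16, 32, 64, 128])
```

For uncorrelated noise alpha is close to 0.5, while persistent long-range correlations push it toward 1; it is this exponent, computed on the PtP interval series, that distinguishes the tremor components.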
Borehole Volumetric Strainmeters Detect Very Long-period Ocean Level Changes in Tokai Area
NASA Astrophysics Data System (ADS)
Takanami, T.; Linde, A. T.; Sacks, S. I.; Kitagawa, G.; Hirata, N.; Rydelek, P. A.
2015-12-01
We detected a clear very long-period strain signal with a predominant period of about 2 months in the data from Sacks-Evertson borehole volumetric strainmeters. These have been operated by the Japan Meteorological Agency (JMA) since 1976 in the Tokai area, Japan, the area of an expected Tokai earthquake. The Earth's surface is always influenced by natural forces such as the earth tide, air pressure, and precipitation, as well as by human-induced sources. In order to decompose the observed time series into these components by maximum likelihood estimation, state-space modeling (Takanami et al., 2013) is applied to the data for 15 months before and after the M6.5 earthquake that occurred on 11th August 2009 in Suruga Bay. In the analysis, the strain data are decomposed into trend, air pressure, earth tide, and precipitation effects plus observation noise. Clear long-period strain signals are seen in the normalized trend component time series. Time series data from JMA tide gauges around Suruga Bay are similarly decomposed. Then spectral analyses are applied to the trend components for the same time interval. Comparison of the amplitude peaks in the spectra for both data sets shows that all have a peak at a period of about 1464 hours. Thus the strain changes may be influenced by very long-period ocean level changes; it is necessary to consider this possibility before attributing tectonic significance to such variations.
Initial Validation of NDVI time series from AVHRR, VEGETATION, and MODIS
NASA Technical Reports Server (NTRS)
Morisette, Jeffrey T.; Pinzon, Jorge E.; Brown, Molly E.; Tucker, Jim; Justice, Christopher O.
2004-01-01
The paper will address Theme 7: Multi-sensor opportunities for VEGETATION. We present analysis of a long-term vegetation record derived from three moderate resolution sensors: AVHRR, VEGETATION, and MODIS. While empirically based manipulation can ensure agreement between the three data sets, there is a need to validate the series. This paper uses atmospherically corrected ETM+ data available over the EOS Land Validation Core Sites as an independent data set with which to compare the time series. We use ETM+ data from 15 globally distributed sites, 7 of which contain repeat coverage in time. These high-resolution data are compared to the values of each sensor by spatially aggregating the ETM+ to each specific sensor's spatial coverage. The aggregated ETM+ value provides a point estimate for a specific site on a specific date. The standard deviation of that point estimate is used to construct a confidence interval for that point estimate. The values from each moderate resolution sensor are then evaluated with respect to that confidence interval. Results show that AVHRR, VEGETATION, and MODIS data can be combined to assess temporal uncertainties and address data continuity issues, and that the atmospherically corrected ETM+ data provide an independent source with which to compare that record. The final product is a consistent time series climate record that links historical observations to current and future measurements.
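The aggregation-and-confidence-interval check described above reduces to a few lines: average the fine-resolution pixels in a footprint, build an interval from their spread, and test whether the moderate-resolution value falls inside. The sketch below uses a standard error of the mean on synthetic NDVI values; whether the original study scaled by the pixel count in this way is not fully specified, so treat that choice, and all numbers, as assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)
# Hypothetical 30 m ETM+ NDVI pixels falling inside one coarse sensor footprint.
etm_pixels = rng.normal(0.62, 0.08, 256)

point = etm_pixels.mean()                         # aggregated point estimate
se = etm_pixels.std(ddof=1) / np.sqrt(etm_pixels.size)
ci = (point - 1.96 * se, point + 1.96 * se)       # ~95% confidence interval

def within_ci(sensor_value, ci):
    """Is a moderate-resolution sensor NDVI consistent with the aggregate?"""
    return ci[0] <= sensor_value <= ci[1]
```

Each AVHRR, VEGETATION, or MODIS value for that site and date would then be passed through `within_ci` to flag disagreement with the independent ETM+ estimate.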
Zhang, Wanfeng; Zhu, Shukui; He, Sheng; Wang, Yanxin
2015-02-06
Using comprehensive two-dimensional gas chromatography coupled to time-of-flight mass spectrometry (GC×GC/TOFMS), volatile and semi-volatile organic compounds in crude oil samples from different reservoirs or regions were analyzed for the development of a molecular fingerprint database. Based on the GC×GC/TOFMS fingerprints of crude oils, principal component analysis (PCA) and cluster analysis were used to distinguish the oil sources and find biomarkers. In a supervised step, the geological characteristics of the crude oils, including thermal maturity, sedimentary environment, etc., are assigned to the principal components. The results show that the tri-aromatic steroid (TAS) series are suitable marker compounds in crude oils for oil screening, and that the relative abundances of individual TAS compounds correlate well with oil sources. In order to correct for the effects of external factors other than oil source, the variables were defined as content ratios of target compounds, and 13 parameters were proposed for the screening of oil sources. With the developed model, the crude oils were easily discriminated, and the result is in good agreement with the practical geological setting. Copyright © 2014 Elsevier B.V. All rights reserved.
Ji, Kang Hyeun; Herring, Thomas A.; Llenos, Andrea L.
2013-01-01
Long Valley Caldera in eastern California is an active volcanic area and has shown continued unrest in the last three decades. We have monitored surface deformation from Global Positioning System (GPS) data by using a projection method that we call the Targeted Projection Operator (TPO). TPO projects residual time series, with secular rates and periodic terms removed, onto a predefined spatial pattern. We used the 2009–2010 slow deflation as a target spatial pattern. The resulting TPO time series shows a detailed deformation history including the 2007–2009 inflation, the 2009–2010 deflation, and a recent inflation that started in late 2011 and is continuing at the present time (November 2012). The recent inflation event is about four times faster than the previous 2007–2009 event. A Mogi source of the recent event is located beneath the resurgent dome at about 6.6 km depth, with a volume change rate of 0.009 km3/yr. TPO is simple and fast and can provide a near real-time continuous monitoring tool without directly examining all the data from the many GPS sites in this potentially eruptive volcanic system.
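A TPO-style projection is essentially a per-epoch least-squares fit of one fixed spatial pattern to the residual field. A toy version follows; the random pattern, ramp amplitude, and noise level are assumptions standing in for the Long Valley deflation signature and real GPS residuals.

```python
import numpy as np

rng = np.random.default_rng(6)
n_sites, n_epochs = 12, 200

# Predefined spatial pattern (in reality, the 2009-2010 deflation signature);
# here a random unit vector stands in for it.
p = rng.standard_normal(n_sites)
p /= np.linalg.norm(p)

# Residual position time series: the pattern's amplitude ramps up, plus noise.
amp_true = np.linspace(0.0, 5.0, n_epochs)
X = p[:, None] * amp_true[None, :] + 0.3 * rng.standard_normal((n_sites, n_epochs))

# TPO-style step: least-squares amplitude of the pattern at each epoch.
amp_est = (p @ X) / (p @ p)
rms = np.sqrt(np.mean((amp_est - amp_true) ** 2))
```

Because each epoch is a single dot product, the projection is cheap enough to run continuously, which is what makes the method attractive for near real-time monitoring.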
Kalman Filter Time Series Analysis of Gamma-Ray Data from NaI(Tl) Detectors for the ND6620 Computer.
1985-05-08
[Severely garbled OCR; recoverable fragments only:] MIDAS FORTRAN IV 21 DEC... Washington, D.C. Approved for public release; distribution unlimited. ... assumed to be Gaussian with a mean of zero and covariance Rk which is known or can be estimated. The behavior of the source between times k and k+1 is assumed
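The scalar Kalman filter the recovered fragment describes can be reconstructed generically: a random-walk state model for a slowly varying count rate, with Gaussian measurement noise of known covariance. The process/measurement variances and the synthetic count series below are illustrative assumptions, not values from the report.

```python
import numpy as np

def kalman_1d(z, q, r, x0, p0):
    """Scalar Kalman filter: random-walk state x_k = x_{k-1} + w_k (var q),
    measurement z_k = x_k + v_k (var r). Returns filtered state estimates."""
    x, p = x0, p0
    out = []
    for zk in z:
        p = p + q                  # predict: state variance grows by q
        k = p / (p + r)            # Kalman gain
        x = x + k * (zk - x)       # update with the measurement residual
        p = (1.0 - k) * p
        out.append(x)
    return np.array(out)

rng = np.random.default_rng(7)
true_rate = 50.0                              # counts per interval
counts = true_rate + rng.normal(0, 5, 500)    # noisy detector readings
est = kalman_1d(counts, q=1e-4, r=25.0, x0=counts[0], p0=25.0)
```

With a small process variance q the filter behaves like a long exponential average, pulling the estimated rate close to the underlying value while still allowing it to drift.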
Deep neural networks to enable real-time multimessenger astrophysics
NASA Astrophysics Data System (ADS)
George, Daniel; Huerta, E. A.
2018-02-01
Gravitational wave astronomy has set in motion a scientific revolution. To further enhance the science reach of this emergent field of research, there is a pressing need to increase the depth and speed of the algorithms used to enable these ground-breaking discoveries. We introduce Deep Filtering, a new scalable machine learning method for end-to-end time-series signal processing. Deep Filtering is based on deep learning with two deep convolutional neural networks, designed for classification and regression, to detect gravitational wave signals in highly noisy time-series data streams and also estimate the parameters of their sources in real time. Acknowledging that some of the most sensitive algorithms for the detection of gravitational waves are based on implementations of matched filtering, and that a matched filter is the optimal linear filter in Gaussian noise, the application of Deep Filtering using whitened signals in Gaussian noise is investigated in this foundational article. The results indicate that Deep Filtering outperforms conventional machine learning techniques and achieves performance similar to matched filtering while being several orders of magnitude faster, allowing real-time signal processing with minimal resources. Furthermore, we demonstrate that Deep Filtering can detect and characterize waveform signals emitted from new classes of eccentric or spin-precessing binary black holes, even when trained with data sets of only quasicircular binary black hole waveforms. The results presented in this article, and the recent use of deep neural networks for the identification of optical transients in telescope data, suggest that deep learning can facilitate real-time searches of gravitational wave sources and their electromagnetic and astroparticle counterparts. In the subsequent article, the framework introduced herein is directly applied to identify and characterize gravitational wave events in real LIGO data.
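As a point of reference for what Deep Filtering is benchmarked against, a matched filter in whitened Gaussian noise is just a sliding inner product with a unit-norm template. The chirp template, injection amplitude, and offsets below are toy assumptions, not LIGO data or the paper's waveform models.

```python
import numpy as np

rng = np.random.default_rng(8)

# Toy chirp template standing in for a waveform model, unit-normalized.
t = np.linspace(0.0, 1.0, 1024)
template = np.sin(2 * np.pi * (8 * t + 12 * t ** 2)) * np.hanning(t.size)
template /= np.linalg.norm(template)

# Whitened data stream: Gaussian noise with the template injected at index 3000.
data = rng.standard_normal(8192)
inj = 3000
data[inj:inj + template.size] += 10.0 * template

# Matched filter: sliding inner product with the unit-norm template.
snr = np.correlate(data, template, mode="valid")
peak = int(np.argmax(np.abs(snr)))
```

The cost of this sliding correlation, repeated over large template banks, is what the convolutional networks replace: a single forward pass yields both a detection statistic and parameter estimates.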
NASA Astrophysics Data System (ADS)
Wang, Zhuosen; Schaaf, Crystal B.; Sun, Qingsong; Kim, JiHyun; Erb, Angela M.; Gao, Feng; Román, Miguel O.; Yang, Yun; Petroy, Shelley; Taylor, Jeffrey R.; Masek, Jeffrey G.; Morisette, Jeffrey T.; Zhang, Xiaoyang; Papuga, Shirley A.
2017-07-01
Seasonal vegetation phenology can significantly alter surface albedo which in turn affects the global energy balance and the albedo warming/cooling feedbacks that impact climate change. To monitor and quantify the surface dynamics of heterogeneous landscapes, high temporal and spatial resolution synthetic time series of albedo and the enhanced vegetation index (EVI) were generated from the 500 m Moderate Resolution Imaging Spectroradiometer (MODIS) operational Collection V006 daily BRDF/NBAR/albedo products and 30 m Landsat 5 albedo and near-nadir reflectance data through the use of the Spatial and Temporal Adaptive Reflectance Fusion Model (STARFM). The traditional Landsat Albedo (Shuai et al., 2011) makes use of the MODIS BRDF/Albedo products (MCD43) by assigning appropriate BRDFs from coincident MODIS products to each Landsat image to generate a 30 m Landsat albedo product for that acquisition date. The available cloud free Landsat 5 albedos (due to clouds, generated every 16 days at best) were used in conjunction with the daily MODIS albedos to determine the appropriate 30 m albedos for the intervening daily time steps in this study. These enhanced daily 30 m spatial resolution synthetic time series were then used to track albedo and vegetation phenology dynamics over three Ameriflux tower sites (Harvard Forest in 2007, Santa Rita in 2011 and Walker Branch in 2005). These Ameriflux sites were chosen as they are all located quite near new towers coming online for the National Ecological Observatory Network (NEON), and thus represent locations which will be served by spatially paired albedo measures in the near future. The availability of data from the NEON towers will greatly expand the sources of tower albedometer data available for evaluation of satellite products.
At these three Ameriflux tower sites the synthetic time series of broadband shortwave albedos were evaluated using the tower albedo measurements with a Root Mean Square Error (RMSE) less than 0.013 and a bias within the range of ±0.006. These synthetic time series provide much greater spatial detail than the 500 m gridded MODIS data, especially over more heterogeneous surfaces, which improves the efforts to characterize and monitor the spatial variation across species and communities. The mean of the difference between the maximum and minimum synthetic time series of albedo within the MODIS pixels over a subset of satellite data of Harvard Forest (16 km by 14 km) was as high as 0.2 during the snow-covered period and reduced to around 0.1 during the snow-free period. Similarly, we have used STARFM to also couple MODIS Nadir BRDF Adjusted Reflectance (NBAR) values with Landsat 5 reflectances to generate daily synthetic time series of NBAR and thus the Enhanced Vegetation Index (NBAR-EVI) at a 30 m resolution. While normally STARFM is used with directional reflectances, the use of the view angle corrected daily MODIS NBAR values provides more consistent time series. These synthetic time series of EVI are shown to capture seasonal vegetation dynamics with finer spatial and temporal details, especially over heterogeneous land surfaces.
NASA Astrophysics Data System (ADS)
Klos, Anna; Olivares, German; Teferle, Felix Norman; Bogusz, Janusz
2016-04-01
Station velocity uncertainties determined from a series of Global Navigation Satellite System (GNSS) position estimates depend on both the deterministic and stochastic models applied to the time series. While the deterministic model generally includes parameters for a linear trend and several periodic terms, the stochastic model is a representation of the noise character of the time series in the form of a power-law process. For both of these models the optimal choice may vary from one time series to another, while the models also depend, to some degree, on each other. In the past various power-law processes have been shown to fit the time series, and the sources of the apparent temporally-correlated noise were attributed to, for example, mismodelling of satellite orbits, antenna phase centre variations, the troposphere, Earth Orientation Parameters, mass loading effects and monument instabilities. Blewitt and Lavallée (2002) demonstrated how improperly modelled seasonal signals affected the estimates of station velocity uncertainties. However, in their study they assumed that the time series followed a white noise process with no consideration of additional temporally-correlated noise. Bos et al. (2010) empirically showed for a small number of stations that the noise character was much more important for the reliable estimation of station velocity uncertainties than the seasonal signals. In this presentation we pick up from Blewitt and Lavallée (2002) and Bos et al. (2010), and have derived formulas for the computation of the General Dilution of Precision (GDP) under the presence of periodic signals and temporally-correlated noise in the time series.
We show, based on simulated and real time series from globally distributed IGS (International GNSS Service) stations processed by the Jet Propulsion Laboratory (JPL), that periodic signals dominate the effect on the velocity uncertainties at short time scales, while beyond four years the type of noise becomes much more important. In other words, for time series long enough, the assumed periodic signals do not affect the velocity uncertainties as much as the assumed noise model. We calculated the GDP as the ratio between two velocity errors: without and with the inclusion of seasonal terms of periods equal to one year and its overtones up to the third. To all these cases power-law processes of white, flicker and random-walk noise were added separately. A few oscillations in GDP can be noticed at integer years, which arise from the added periodic terms. Their amplitudes in GDP increase along with the increasing spectral index. Strong peaks of oscillations in GDP are found at short time scales, especially for random-walk processes. This means that badly monumented stations are affected the most. Local minima and maxima in GDP are also enlarged as the noise approaches random walk. We noticed that the semi-annual signal increased the local GDP minimum for white noise. This suggests that adding power-law noise to a deterministic model with an annual term, or adding a semi-annual term to white noise, causes an increased velocity uncertainty even at the points where the determined velocity is not biased.
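The white-noise limb of such a GDP computation can be reproduced directly from the least-squares design matrix: the formal velocity variance is the (rate, rate) element of (AᵀA)⁻¹, computed with and without seasonal columns. The sketch below (weekly sampling, unit noise variance, annual plus semi-annual terms only) illustrates the claim that seasonal terms matter at short spans and fade for long ones; power-law noise, which the presentation also treats, is not modeled here.

```python
import numpy as np

def velocity_sigma(t, periods=()):
    """Formal rate uncertainty (white noise, unit variance) for a design with
    intercept, rate, and optional sine/cosine terms of the given periods."""
    cols = [np.ones_like(t), t]
    for P in periods:
        cols += [np.sin(2 * np.pi * t / P), np.cos(2 * np.pi * t / P)]
    A = np.column_stack(cols)
    cov = np.linalg.inv(A.T @ A)       # parameter covariance (unit noise)
    return np.sqrt(cov[1, 1])          # sigma of the rate term

t_short = np.arange(0.0, 2.5, 1 / 52)   # 2.5 years of weekly positions
t_long = np.arange(0.0, 10.0, 1 / 52)   # 10 years of weekly positions

# GDP-like ratio: rate sigma with annual + semi-annual terms vs without.
gdp_short = velocity_sigma(t_short, (1.0, 0.5)) / velocity_sigma(t_short)
gdp_long = velocity_sigma(t_long, (1.0, 0.5)) / velocity_sigma(t_long)
```

For the long series the seasonal columns are nearly orthogonal to the trend, so the ratio approaches one, matching the statement that the noise model, not the periodic signals, dominates long-span velocity uncertainties.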
Detection of deformation time-series in Miyake-jima using PALSAR/InSAR
NASA Astrophysics Data System (ADS)
Ozawa, T.; Ueda, H.
2010-12-01
Volcano deformation is often complicated temporally and spatially, so deformation mapping by InSAR is useful to understand it in detail. However, InSAR is affected by atmospheric, ionospheric, and other noise, and important temporal changes of deformation of a few cm can therefore be missed. We thus want to develop an InSAR time-series analysis which detects volcano deformation precisely. Generally, a 10×10 km area, which covers a typical volcano, is included in several SAR scenes obtained from different orbits or observation modes. First, interferograms are generated for each orbit path. In the InSAR processing, atmospheric noise is reduced using a simulation from a numerical weather model. Long-wavelength noise due to orbit error and ionospheric disturbance is corrected by adjusting to GPS deformation time-series, assuming it to be a plane. Next, we estimate the deformation time-series from the obtained interferograms. Radar incidence directions differ for each orbit path, but those for the observation modes with 34.3° and 41.5° off-nadir angles lie almost in one plane. The slant-range change for all orbit paths can then be described by the horizontal and vertical components within that plane. We invert for these components at all epochs with the constraint that the temporal change of deformation is smooth. Simultaneously, we estimate the DEM error. As a case study, we present an application to Miyake-jima. Miyake-jima is a volcanic island located 200 km south of Tokyo, and a large amount of volcanic gas has been ejected since the 2000 eruption. Crustal deformation associated with such volcanic activity has been observed by continuous GPS observations. However, its distribution is complicated, and therefore we applied this method to detect a precise deformation time-series. At most of the GPS sites, the obtained time-series were in good agreement with the GPS time-series, and the root-mean-square of the residuals was less than 1 cm.
However, a temporal step in the deformation was estimated in 2008 that is not consistent with the GPS time series; we attribute it to the effect of an orbit maneuver in 2008, and correcting for such noise is one of our next subjects. The obtained deformation map shows contraction around the caldera and uplift along the northwest to south coast. This deformation pattern clearly cannot be explained by a single simple inflation or deflation source, and its interpretation is also a subject for future work. In the caldera bottom, subsidence of 14 cm/yr was found. Although its rate was roughly constant until 2008, it decelerated to 20 cm/yr from 2009, and the subsidence rate in 2010 was 3 cm/yr. Around the same time, low-frequency earthquakes increased just beneath the caldera, so we speculate that the deceleration of subsidence may be directly related to the volcanic activity. Although the result shows the volcano deformation in detail, some mis-estimations remain. We believe this InSAR time-series analysis is useful, but further improvements are necessary.
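The two-angle inversion described in this abstract can be sketched numerically. The following is an illustrative reconstruction, not the authors' code, and the deformation values are made up: with off-nadir angles of 34.3° and 41.5°, the slant-range change is a linear combination of the vertical and in-plane horizontal components, which least squares recovers.

```python
import numpy as np

# Off-nadir angles of the two observation modes mentioned in the abstract.
theta = np.radians([34.3, 41.5])
# Line-of-sight change = cos(theta)*d_vertical + sin(theta)*d_horizontal
G = np.column_stack([np.cos(theta), np.sin(theta)])

d_true = np.array([0.05, -0.02])      # hypothetical: 5 cm uplift, 2 cm horizontal
los = G @ d_true                      # simulated slant-range changes (one per mode)

# Least-squares inversion recovers the two deformation components.
d_est, *_ = np.linalg.lstsq(G, los, rcond=None)
```

In the actual analysis this inversion is performed jointly for all epochs, with the temporal-smoothness constraint and a DEM-error term added to the system.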
NASA Technical Reports Server (NTRS)
Wang, Zhuosen; Schaaf, Crystal B.; Sun, Quingsong; Kim, Jihyun; Erb, Angela M.; Gao, Feng; Roman, Miguel O.; Yang, Yun; Petroy, Shelley; Taylor, Jeffrey R.;
2017-01-01
Seasonal vegetation phenology can significantly alter surface albedo, which in turn affects the global energy balance and the albedo warming/cooling feedbacks that impact climate change. To monitor and quantify the surface dynamics of heterogeneous landscapes, high temporal and spatial resolution synthetic time series of albedo and the enhanced vegetation index (EVI) were generated from the 500-meter Moderate Resolution Imaging Spectroradiometer (MODIS) operational Collection V006 daily BRDF (Bidirectional Reflectance Distribution Function) / NBAR (Nadir BRDF-Adjusted Reflectance) / albedo products and 30-meter Landsat 5 albedo and near-nadir reflectance data through the use of the Spatial and Temporal Adaptive Reflectance Fusion Model (STARFM). The traditional Landsat albedo approach (Shuai et al., 2011) makes use of the MODIS BRDF/Albedo products (MCD43) by assigning appropriate BRDFs from coincident MODIS products to each Landsat image to generate a 30-meter Landsat albedo product for that acquisition date. In this study, the available cloud-free Landsat 5 albedos (generated every 16 days at best, owing to clouds) were used in conjunction with the daily MODIS albedos to determine the appropriate 30-meter albedos for the intervening daily time steps. These enhanced daily 30-meter spatial resolution synthetic time series were then used to track albedo and vegetation phenology dynamics over three AmeriFlux tower sites (Harvard Forest in 2007, Santa Rita in 2011, and Walker Branch in 2005). These AmeriFlux sites were chosen because they are all close to new towers coming online for the National Ecological Observatory Network (NEON), and thus represent locations that will be served by spatially paired albedo measures in the near future. The availability of data from the NEON towers will greatly expand the sources of tower albedometer data available for evaluation of satellite products.
At these three AmeriFlux tower sites, the synthetic time series of broadband shortwave albedo were evaluated against the tower albedo measurements, with a root mean square error (RMSE) of less than 0.013 and a bias within ±0.006. These synthetic time series provide much greater spatial detail than the 500-meter gridded MODIS data, especially over more heterogeneous surfaces, which improves efforts to characterize and monitor spatial variation across species and communities. The mean difference between the maximum and minimum synthetic albedo within the MODIS pixels over a subset of Harvard Forest (16 kilometers by 14 kilometers) was as high as 0.2 during the snow-covered period and decreased to around 0.1 during the snow-free period. Similarly, we used STARFM to couple MODIS Nadir BRDF-Adjusted Reflectance (NBAR) values with Landsat 5 reflectances to generate daily synthetic time series of NBAR, and thus of the Enhanced Vegetation Index (NBAR-EVI), at 30-meter resolution. While STARFM is normally used with directional reflectances, the use of view-angle-corrected daily MODIS NBAR values provides more consistent time series. These synthetic time series of EVI are shown to capture seasonal vegetation dynamics with finer spatial and temporal detail, especially over heterogeneous land surfaces.
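The temporal prediction underlying such data fusion can be illustrated in its simplest degenerate form. The sketch below is not the full STARFM algorithm, which additionally weights spectrally and spatially similar neighboring pixels; it only shows the core idea of transferring coarse-scale temporal change to the fine-scale image, with made-up albedo values.

```python
import numpy as np

def predict_fine(fine_t0, coarse_t0, coarse_t):
    """Predict the fine-resolution image at time t from one fine/coarse pair
    at t0 plus the coarse image at t, assuming the coarse-scale temporal
    change applies unchanged at the fine scale."""
    return fine_t0 + (coarse_t - coarse_t0)

fine_t0   = np.array([[0.10, 0.12], [0.14, 0.16]])  # 30 m albedo at t0 (made up)
coarse_t0 = np.full((2, 2), 0.13)                   # 500 m albedo resampled to the grid
coarse_t  = np.full((2, 2), 0.15)                   # 500 m albedo at t: +0.02 change

fine_t = predict_fine(fine_t0, coarse_t0, coarse_t)  # every pixel shifted by +0.02
```

The fine-scale spatial detail of the t0 image is preserved while the coarse sensor supplies the day-to-day temporal signal, which is why the fused series can track phenology between cloud-free Landsat acquisitions.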
A 280-Year Long Series of Phenological Observations of Cherry Tree Blossoming Dates for Switzerland
NASA Astrophysics Data System (ADS)
Rutishauser, T.; Luterbacher, J.; Wanner, H.
2003-04-01
Phenology is generally described as the timing of life cycle phases or activities of plants and animals in their temporal occurrence throughout the year (Lieth 1974). Recent studies have shown that meteorological and climatological impacts leave their 'fingerprints' across natural systems and strongly influence the seasonal activities of individual animal and plant species. During the 20th century, phenological observation networks were established around the world to document and analyze the influence of the globally changing climate on plants and wildlife. This work presents a first attempt at a unique 280-year series of phenological observations of cherry tree blossoming dates for the Swiss Plateau region. In Switzerland, a nationwide phenological observation network was established in 1951 and currently documents 69 phenophases of 26 plant species; a guidebook seeks to increase the objectivity of the network observations. Observations of the blooming of the cherry tree (Prunus avium) were chosen to calculate a mean series for the Swiss Plateau region, with observations from altitudes between 370 and 860 m a.s.l. A total of 737 observations from 21 stations were used. A linear regression was established between the mean blooming date and altitude in order to correct the data to a reference altitude level; other ecological parameters were not accounted for. The selected network data series from 1951 to 2000 was combined and extended back to 1721 with observations from various sources. These include several historical observation series by farmers, clergymen, and teachers; data from various stations of the newly established Swiss meteorological network from 1864 to 1873; and the single long series of observations from Liestal starting in 1894.
The homogenized time series of observations will be compared with reconstructions of late winter temperatures as well as with statistical estimations of blooming time based on long instrumental records from Europe. In addition, the series is one of the few historical phenological records available for assessing past climate and ecological changes. Lieth, H. (1974). Phenology and Seasonality Modeling. Berlin, Heidelberg, New York: Springer.
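The altitude correction described above is a standard regression reduction. A minimal sketch with synthetic numbers (the real study uses 737 observations, and the reference altitude here is a hypothetical choice) might look like:

```python
import numpy as np

# Synthetic blooming observations: day of year vs. station altitude (m a.s.l.).
altitude = np.array([370, 450, 520, 600, 700, 860])
doy      = np.array([110, 113, 115, 118, 121, 127])

# Fit blooming date against altitude (days per metre), then reduce every
# observation to a hypothetical reference altitude of 550 m.
slope, intercept = np.polyfit(altitude, doy, 1)
ref_alt = 550
corrected = doy - slope * (altitude - ref_alt)
```

After the correction, the altitude-driven spread collapses and the remaining scatter reflects station-level differences only, which is what allows stations between 370 and 860 m to be merged into one regional mean series.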
NASA Astrophysics Data System (ADS)
Daux, V.; Garcia de Cortazar-Atauri, I.; Yiou, P.; Chuine, I.; Garnier, E.; Ladurie, E. Le Roy; Mestre, O.; Tardaguila, J.
2011-11-01
We present a dataset of grape harvest date (GHD) series compiled from international literature, from untranslated French and Spanish literature, and from unpublished documentary sources held by public organizations and wine-growers. As of June 2011, the GHD dataset comprises 378 series, mainly from France (93% of the data), as well as series from Switzerland, Italy, Spain, and Luxembourg. The series have variable lengths and contain gaps of variable size. The longest and most complete ones are from Burgundy, Switzerland, the southern Rhône valley, Jura, and Ile-de-France. The GHD series were grouped into 27 regions according to their location, geomorphological and geological criteria, and past and present grape varieties. Regional composite series (GHD-RCS) were calculated and compared pairwise to assess the quality of the series. Significant (p-value < 0.001) and strong correlations exist between most of them. As expected, the correlations tend to be higher when the vineyards are closer, the highest (R = 0.91) being obtained between the High Loire Valley and Ile-de-France GHD-RCS. The strong dependence of the vine cycle on temperature, and therefore the strong link between GHD and growing-season temperature, was also used to test the quality of the GHD series. The strongest correlations are obtained between the GHD-RCS and the temperature series of the nearest weather stations. Moreover, the GHD-RCS/temperature correlation maps show spatial patterns similar to temperature correlation maps. The stability of the correlations over time was explored; the most striking feature is their general deterioration at the turn of the 19th to the 20th century. The possible effects on the GHD of the phylloxera crisis, which took place at this time, are discussed. The median of the standardized GHD-RCS was calculated. The distribution of extreme years in this general synthetic series is not homogeneous.
Extremely late years all occur during a two-century long time-window from the early 17th to the early 19th century, while extremely early years are frequent during the 16th and since the mid-19th century. The dataset is made accessible for climate research through the Internet. It should allow a variety of climate studies, including reconstructions of atmospheric circulation over Western Europe.
Experimental study of a SINIS detector response time at 350 GHz signal frequency
NASA Astrophysics Data System (ADS)
Lemzyakov, S.; Tarasov, M.; Mahashabde, S.; Yusupov, R.; Kuzmin, L.; Edelman, V.
2018-03-01
The response time constant of a SINIS bolometer integrated in an annular ring antenna was measured at a bath temperature of 100 mK. Samples comprising superconducting aluminium electrodes and a normal-metal Al/Fe strip connected to the electrodes via tunnel junctions were fabricated on an oxidized Si substrate using shadow evaporation. The bolometer was illuminated by a fast black-body radiation source through a band-pass filter centered at 350 GHz with a passband of 7 GHz. The radiation source is a thin NiCr film on a sapphire substrate. For rectangular 10-100 μs current pulses, the leading edge of the radiation was rather sharp owing to the low thermal capacitance of the NiCr film and the low thermal conductivity of the substrate at temperatures in the range 1-4 K. The rise time of the response was ~1-10 μs. This time is presumably limited by technical factors: the high dynamic resistance of the series array of bolometers and the capacitance of the long twisted-pair wiring from the SINIS bolometer to a room-temperature amplifier.
Variability and trends in surface seawater pCO2 and CO2 flux in the Pacific Ocean
NASA Astrophysics Data System (ADS)
Sutton, A. J.; Wanninkhof, R.; Sabine, C. L.; Feely, R. A.; Cronin, M. F.; Weller, R. A.
2017-06-01
Variability and change in the ocean sink of anthropogenic carbon dioxide (CO2) have implications for future climate and ocean acidification. Measurements of surface seawater CO2 partial pressure (pCO2) and wind speed from moored platforms are used to calculate high-resolution CO2 flux time series. Here we use the moored CO2 fluxes to examine variability and its drivers over a range of time scales at four locations in the Pacific Ocean. There are significant surface seawater pCO2, salinity, and wind speed trends in the North Pacific subtropical gyre, especially during winter and spring, which reduce CO2 uptake over the 10-year record of this study. Starting in late 2013, elevated seawater pCO2 values driven by warm anomalies caused this region to become a net annual CO2 source for the first time in the observational record, demonstrating how climate forcing can influence the timing of an ocean region's shift from CO2 sink to source.
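Computing a CO2 flux time series from moored pCO2 and wind speed typically uses a bulk formula of the form F = k·K0·ΔpCO2. The sketch below is illustrative, not the authors' processing code; the quadratic wind-speed parameterization follows Wanninkhof (2014), and the Schmidt number and solubility values are placeholders.

```python
def co2_flux(u10, dpco2, sc=660.0, k0=0.03):
    """Bulk air-sea CO2 flux in mmol m^-2 d^-1 (positive = outgassing).

    u10:   wind speed at 10 m (m/s)
    dpco2: seawater minus air pCO2 (uatm)
    sc:    Schmidt number of CO2 in seawater (placeholder value)
    k0:    CO2 solubility (mol L^-1 atm^-1, placeholder value)
    """
    k = 0.251 * u10 ** 2 * (sc / 660.0) ** -0.5  # gas transfer velocity, cm/h
    return k * k0 * dpco2 * 0.24                 # 0.24 converts cm/h to m/d

# No wind means no gas exchange; supersaturated water (dpco2 > 0) outgasses,
# undersaturated water (dpco2 < 0) takes up CO2.
```

The sign convention makes the sink-to-source transition described in the abstract visible directly in the flux series: a region flips from sink to source when ΔpCO2 changes sign on an annually averaged basis.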
NASA Astrophysics Data System (ADS)
Zhang, Zhikuan; Zhang, Shengdong; Feng, Chuguang; Chan, Mansun
2003-10-01
In this paper, a source/drain structure separated from the silicon substrate by oxide isolation is fabricated and studied. The source/drain diffusion regions are connected to the shallow source/drain extension through a smaller opening defined by a double-spacer process. Experimental results indicate that the source/drain-on-insulator structure significantly reduces the parasitic capacitance. Further optimization by simulation indicates that reductions of series resistance and of band-to-band drain leakage at the off-state can be achieved in extremely scaled devices. Compared with the conventional planar source/drain structure, the reductions in parasitic capacitance and series resistance can be as much as 80% and 30%, respectively.
Series Connected Buck-Boost Regulator
NASA Technical Reports Server (NTRS)
Birchenough, Arthur G. (Inventor)
2006-01-01
A Series Connected Buck-Boost Regulator (SCBBR) switches only a fraction of the input power, resulting in relatively high efficiency. The SCBBR has multiple operating modes, including buck, boost, and current-limiting modes, so that its output voltage ranges from below the source voltage to above the source voltage.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Figure 1). This ensures that 95 percent of the time, when the DQO is met, the actual CE value will be ±5... while still being assured of correctly demonstrating compliance. It is designed to reduce “false... approach follows: 4.3 A source conducts an initial series of at least three runs. The owner or operator may...
ERIC Educational Resources Information Center
National Center for Homeless Education at SERVE, 2016
2016-01-01
Children and youth who experience homelessness face many barriers to education, yet school can be a source of stability, affirmation, and hope during a time of chaos and trauma when a young person loses his or her housing. Community service providers play a key role in linking homeless children and youth to schools and providing wraparound…
ERIC Educational Resources Information Center
Abramowitz, Jack
This skills-text is the first of four books in the series "Readings in American History." The materials allow opportunities to improve reading and comprehension skills in a subject matter context by using certain primary sources related to the topic. Book I covers the time from the European discovery of the Americas in 1492 to the end of…
Antunes, Luciana Principal; Martins, Layla Farage; Pereira, Roberta Verciano; Thomas, Andrew Maltez; Barbosa, Deibs; Lemos, Leandro Nascimento; Silva, Gianluca Major Machado; Moura, Livia Maria Silva; Epamino, George Willian Condomitti; Digiampietri, Luciano Antonio; Lombardi, Karen Cristina; Ramos, Patricia Locosque; Quaggio, Ronaldo Bento; de Oliveira, Julio Cezar Franco; Pascon, Renata Castiglioni; Cruz, João Batista da; da Silva, Aline Maria; Setubal, João Carlos
2016-01-01
Composting is a promising source of new organisms and thermostable enzymes that may be helpful in environmental management and industrial processes. Here we present results of metagenomic- and metatranscriptomic-based analyses of a large composting operation in the São Paulo Zoo Park. This composting exhibits a sustained thermophilic profile (50 °C to 75 °C), which seems to preclude fungal activity. The main novelty of our study is the combination of time-series sampling with shotgun DNA, 16S rRNA gene amplicon, and metatranscriptome high-throughput sequencing, enabling an unprecedented detailed view of microbial community structure, dynamics, and function in this ecosystem. The time-series data showed that the turning procedure has a strong impact on the compost microbiota, restoring to a certain extent the population profile seen at the beginning of the process; and that lignocellulosic biomass deconstruction occurs synergistically and sequentially, with hemicellulose being degraded preferentially to cellulose and lignin. Moreover, our sequencing data allowed near-complete genome reconstruction of five bacterial species previously found in biomass-degrading environments and of a novel biodegrading bacterial species, likely a new genus in the order Bacillales. The data and analyses provided are a rich source for additional investigations of thermophilic composting microbiology. PMID:27941956
Persistent homology of time-dependent functional networks constructed from coupled time series
NASA Astrophysics Data System (ADS)
Stolz, Bernadette J.; Harrington, Heather A.; Porter, Mason A.
2017-04-01
We use topological data analysis to study "functional networks" that we construct from time-series data from both experimental and synthetic sources. We use persistent homology with a weight rank clique filtration to gain insights into these functional networks, and we use persistence landscapes to interpret our results. Our first example uses time-series output from networks of coupled Kuramoto oscillators. Our second example consists of biological data in the form of functional magnetic resonance imaging data that were acquired from human subjects during a simple motor-learning task in which subjects were monitored for three days during a five-day period. With these examples, we demonstrate that (1) using persistent homology to study functional networks provides fascinating insights into their properties and (2) the position of the features in a filtration can sometimes play a more vital role than persistence in the interpretation of topological features, even though conventionally the latter is used to distinguish between signal and noise. We find that persistent homology can detect differences in synchronization patterns in our data sets over time, giving insight both on changes in community structure in the networks and on increased synchronization between brain regions that form loops in a functional network during motor learning. For the motor-learning data, persistence landscapes also reveal that on average the majority of changes in the network loops take place on the second of the three days of the learning process.
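The construction of a functional network from coupled time series, and the edge ordering used by a weight rank clique filtration, can be sketched as follows. This is an illustrative toy, not the authors' pipeline: pairwise correlations define edge weights, and the filtration adds edges from strongest to weakest.

```python
import numpy as np

rng = np.random.default_rng(1)

# Three synthetic time series: nodes 0 and 1 share a common driver, node 2
# is independent, so the 0-1 edge should carry the largest weight.
common = rng.normal(size=200)
series = np.stack([
    common + 0.1 * rng.normal(size=200),
    common + 0.1 * rng.normal(size=200),
    rng.normal(size=200),
])

# Functional network: edge weights are pairwise Pearson correlations.
corr = np.corrcoef(series)
edges = [(i, j, corr[i, j]) for i in range(3) for j in range(i + 1, 3)]

# A weight rank clique filtration adds edges from strongest to weakest.
filtration_order = sorted(edges, key=lambda e: -e[2])
```

Persistent homology then tracks at which step of this edge ordering loops appear and disappear; that computation requires a TDA library and is omitted here.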
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arzoumanian, Zaven; Brazier, Adam; Chatterjee, Shami
2015-11-01
We present high-precision timing observations spanning up to nine years for 37 millisecond pulsars monitored with the Green Bank and Arecibo radio telescopes as part of the North American Nanohertz Observatory for Gravitational Waves (NANOGrav) project. We describe the observational and instrumental setups used to collect the data and the methodology applied for calculating pulse times of arrival; these include novel methods for measuring instrumental offsets and characterizing low signal-to-noise-ratio timing results. The time-of-arrival data are fit to a physical timing model for each source, including terms that characterize time-variable dispersion measure and frequency-dependent pulse shape evolution. In conjunction with the timing model fit, we have performed a Bayesian analysis of a parameterized timing noise model for each source, and we detect evidence for excess low-frequency, or “red,” timing noise in 10 of the pulsars. For 5 of these cases this is likely due to interstellar medium propagation effects rather than intrinsic spin variations. Subsequent papers in this series will present further analysis of this data set aimed at detecting or limiting the presence of nanohertz-frequency gravitational wave signals.
NASA Astrophysics Data System (ADS)
Snelson, C. M.; Chipman, V.; White, R. L.; Emmitt, R.; Townsend, M.
2013-12-01
Discriminating low-yield nuclear explosions is one of the current challenges in the field of monitoring and verification. Work is currently underway in Nevada to address this challenge by conducting a series of experiments using a physics-based approach. This is accomplished through a multifaceted, multidisciplinary effort that includes a range of activities, from characterizing the shallow subsurface to acquiring new explosion data in both the near field (< 100 m from the source) and the far field (> 100 m to tens of km from the source). The Source Physics Experiment (SPE) is a collaborative project between National Security Technologies, LLC, Lawrence Livermore National Laboratory, Los Alamos National Laboratory, Sandia National Laboratories, the Defense Threat Reduction Agency, and the Air Force Technical Applications Center. The goals of the SPE are to understand the transition of seismic energy from the near field to the far field, the development of S-waves from explosive sources, and how anisotropy controls seismic energy transmission and partitioning. To fully explore these problems, the SPE test series includes tests in both simple and complex geology. The current series is being conducted in a highly fractured granite body. This location was chosen in part because it was the site of previous nuclear tests in the same rock body and because the geology is generally well characterized. In addition to historic data, high-resolution seismic reflection, cross-hole tomography, core sample, LIDAR, hyperspectral, and fracture mapping data have been acquired to further characterize the test bed and to detect changes after each shot. The complex geology series includes 7 planned shots using conventional explosives in the same shot hole, instrumented with Continuous Reflectometry for Radius vs. Time Experiment (CORRTEX), time-of-arrival and velocity-of-detonation sensors, down-hole accelerometers, surface accelerometers, infrasound, and a suite of seismic sensors of various frequency bands from the near field to the far field. This allows a single test bed in the granite to be used instead of multiple test beds to obtain the same results. The shots are planned at various depths to obtain a Green's function, scaled depth-of-burial data, nominal depth-of-burial data, and damage-zone data. Three shots have been executed to date, and the fourth is planned for August 2013 as a 220 lb (100 kg) TNT-equivalent shot at a depth of 315 ft (96 m). Over 400 data channels have been recorded with high fidelity on the first series of shots. Once the complex geology site data have been exploited, a new test bed will be developed in a simpler geology to test these physics-based models. Ultimately, the results from this project will provide the next advances in the science of monitoring to enable a physics-based predictive capability. This work was done by National Security Technologies, LLC, under Contract No. DE-AC52-06NA25946 with the U.S. Department of Energy. DOE/NV/25946--1835.
Modeling turbidity and flow at daily steps in karst using ARIMA/ARFIMA-GARCH error models
NASA Astrophysics Data System (ADS)
Massei, N.
2013-12-01
Hydrological and physico-chemical variations recorded at karst springs usually reflect highly non-linear processes, and the corresponding time series are therefore often highly non-linear as well. Turbidity, an important parameter for water quality and management, is a particularly complex response of karst systems to rain events, involving direct transfer of particles from point-source recharge as well as resuspension of particles previously deposited and stored within the system. For these reasons, turbidity has not been well handled in karst hydrological models so far. Most modeling approaches involve stochastic linear models such as ARIMA-type models and their derivatives (ARMA, ARMAX, ARIMAX, ARFIMA...). Yet linear models usually fail to represent the whole (stochastic) process variability, and their residuals still contain useful information that can be used either to understand the full variability or to enhance short-term predictability and forecasting. Model residuals are in fact not i.i.d., which can be identified by the clear and significant serial correlation remaining in the squared residuals. High (low) amplitudes are followed in time by high (low) amplitudes, visible in the residual time series as periods during which amplitudes are higher (lower) than the mean amplitude. This is known as the ARCH effect (AutoRegressive Conditional Heteroskedasticity), and the corresponding non-linear process affecting the residuals of a linear model can be modeled using ARCH or generalized ARCH (GARCH) models, approaches that are well known in econometrics. Here we investigated the capability of ARIMA-GARCH error models to represent a ~20-yr daily turbidity time series recorded at a karst spring used for the water supply of the city of Le Havre (Upper Normandy, France).
ARIMA and ARFIMA models were used to represent the mean behavior of the time series, and their residuals clearly presented a pronounced ARCH effect, as confirmed by Ljung-Box and McLeod-Li tests. We then identified and fitted GARCH models to the residuals of the ARIMA and ARFIMA models in order to model the conditional variance and volatility of the turbidity time series. The results showed that serial correlation was successfully removed from the standardized residuals of the GARCH model, and hence that the ARIMA-GARCH error model is suitable for modeling such time series. The approach also improved short-term (e.g., a few steps ahead) turbidity forecasting.
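The ARCH effect that motivates the GARCH error model can be demonstrated in a few lines. This is an illustrative sketch, not the study's code: it simulates an ARCH(1) process and compares the lag-1 autocorrelation of the series with that of its squares, which is the quantity the McLeod-Li test examines.

```python
import numpy as np

rng = np.random.default_rng(0)

def arch1(n, omega=0.2, alpha=0.5):
    """Simulate an ARCH(1) process: sigma_t^2 = omega + alpha * x_{t-1}^2."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = rng.normal() * np.sqrt(omega + alpha * x[t - 1] ** 2)
    return x

def acf1(x):
    """Lag-1 sample autocorrelation."""
    x = x - x.mean()
    return float((x[:-1] * x[1:]).sum() / (x * x).sum())

x = arch1(5000)
# The series itself looks uncorrelated, but its squares do not: volatility
# clustering, i.e. the ARCH effect that a GARCH error model captures.
r_raw = acf1(x)
r_squared = acf1(x ** 2)
```

In the study's setting, x plays the role of the ARIMA/ARFIMA residuals: they pass as white noise at first order, while their squared values reveal the conditional heteroskedasticity that the fitted GARCH model then removes.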
Time-correlated neutron analysis of a multiplying HEU source
NASA Astrophysics Data System (ADS)
Miller, E. C.; Kalter, J. M.; Lavelle, C. M.; Watson, S. M.; Kinlaw, M. T.; Chichester, D. L.; Noonan, W. A.
2015-06-01
The ability to quickly identify and characterize special nuclear material remains a national security challenge. In counter-proliferation applications, the neutron multiplication of a sample can be a good indication of the level of threat. Currently, neutron multiplicity measurements are performed with moderated 3He proportional counters. These systems rely on the detection of thermalized neutrons, a process that obscures both energy and time information from the source. Fast neutron detectors, such as liquid scintillators, can detect events on nanosecond time scales, providing more information on the temporal structure of the arriving signal and an alternative method for extracting information from the source. To explore this possibility, a series of measurements was performed on Idaho National Laboratory's MARVEL assembly, a configurable HEU source. The assembly was measured in a variety of HEU configurations and with different reflectors, covering a range of neutron multiplications from 2 to 8. The data were collected with liquid scintillator detectors and digitized for offline analysis. A gap-based approach was used to identify the bursts of detected neutrons associated with the same fission chain. Using this approach, we are able to study various statistical properties of individual fission chains, one of which is the distribution of neutron arrival times within a given burst. We have observed two interesting empirical trends. First, this distribution exhibits a weak but definite dependence on source multiplication. Second, there are distinctive differences in the distribution depending on the presence and type of reflector. Both of these phenomena may prove useful when assessing an unknown source, and their physical origins can be illuminated with the help of MCNPX-PoliMi simulations.
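A gap-based burst identification of the kind described here can be sketched simply (hypothetical code and numbers; the real gap threshold depends on detector timing): detections closer together than a gap threshold are assigned to the same fission-chain burst.

```python
def group_bursts(times, max_gap):
    """Split sorted arrival times into bursts wherever the gap between
    consecutive detections exceeds max_gap."""
    bursts, current = [], [times[0]]
    for t in times[1:]:
        if t - current[-1] <= max_gap:
            current.append(t)
        else:
            bursts.append(current)
            current = [t]
    bursts.append(current)
    return bursts

# Made-up arrival times (ns): three tight clusters separated by long gaps.
times = [0, 15, 40, 500, 512, 530, 2000]
bursts = group_bursts(times, max_gap=100)
# bursts == [[0, 15, 40], [500, 512, 530], [2000]]
```

Statistics such as the within-burst arrival-time distribution mentioned in the abstract are then computed per burst, e.g. from the differences between each burst's first and subsequent detection times.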
Why didn't Box-Jenkins win (again)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pack, D.J.; Downing, D.J.
This paper focuses on the forecasting performance of the Box-Jenkins methodology applied to the 111 time series of the Makridakis competition. It considers the influence of the following factors: (1) time series length, (2) time-series information (autocorrelation) content, (3) time-series outliers or structural changes, (4) averaging results over time series, and (5) choice of forecast time origin. It is found that the 111 time series contain substantial numbers of very short series, series with obvious structural change, and series whose histories are relatively uninformative. If these series are typical of those that one must face in practice, the real message of the competition is that univariate time series extrapolations will frequently fail regardless of the methodology employed to produce them.
How to Model Super-Soft X-ray Sources?
NASA Astrophysics Data System (ADS)
Rauch, Thomas
2012-07-01
During outbursts, the surface temperatures of white dwarfs in cataclysmic variables far exceed half a million Kelvin. In this phase, they may become the brightest super-soft sources (SSS) in the sky. Time series of high-resolution, high-S/N X-ray spectra taken during the rise, maximum, and decline of their X-ray luminosity provide insights into the processes following such outbursts as well as into the surface composition of the white dwarf. Their analysis requires adequate NLTE model atmospheres, and the Tuebingen Non-LTE Model-Atmosphere Package (TMAP) is a powerful tool for calculating them. We present the application of TMAP models to SSS spectra and discuss their validity.
Multifractal analysis of visibility graph-based Ito-related connectivity time series.
Czechowski, Zbigniew; Lovallo, Michele; Telesca, Luciano
2016-02-01
In this study, we investigate multifractal properties of connectivity time series resulting from the visibility graph applied to normally distributed time series generated by Ito equations with multiplicative power-law noise. We show that the multifractality of the connectivity time series (i.e., the series of the number of links outgoing from each node) increases with the exponent of the power-law noise. This multifractality could be due to the width of the connectivity degree distribution, which can be related to the exit time of the associated Ito time series. Furthermore, the connectivity time series are characterized by persistence, although the original Ito time series are random; this is due to the visibility graph procedure, which, in connecting the values of the time series, generates persistence but destroys most of the nonlinear correlations. Moreover, the visibility graph is sensitive to wide "depressions" in the input time series.
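For reference, the natural visibility graph maps a time series to a network in which two samples are linked when the straight line between them clears every intermediate sample; the degree sequence of that graph is the connectivity time series studied above. A minimal O(n²) sketch (illustrative, not the authors' implementation):

```python
def visibility_degrees(y):
    """Degree (connectivity) series of the natural visibility graph of y:
    samples a and b are linked when every intermediate sample lies strictly
    below the straight line joining (a, y[a]) and (b, y[b])."""
    n = len(y)
    deg = [0] * n
    for a in range(n):
        for b in range(a + 1, n):
            visible = all(
                y[c] < y[b] + (y[a] - y[b]) * (b - c) / (b - a)
                for c in range(a + 1, b)
            )
            if visible:
                deg[a] += 1
                deg[b] += 1
    return deg

# A peak sees past a valley: in [3, 1, 2] all pairs are mutually visible,
# so visibility_degrees([3, 1, 2]) == [2, 2, 2].
```

A wide "depression" in the input suppresses visibility across it, lowering the degrees of the enclosed nodes, which is the sensitivity the abstract refers to.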
NASA Astrophysics Data System (ADS)
Perera, Asanga Hiran
The magnitude of the extrinsic parasitic MOSFET series resistance was experimentally evaluated in the deep-submicron domain and its consequence on device performance was determined. The series resistance of depletion-mode MOSFET test structures was measured for source-drain sizes as small as 0.2 μm × 0.3 μm at room temperature and 100 K. To build the test structures, a multilevel, full electron-beam lithography fabrication process was developed with a pattern overlay accuracy of 75 nm. A new positive-tone novolac resist, SYSTEM-9, was developed for electron-beam application. The resist had moderate sensitivity, 19-30 μC/cm², and a contrast of up to 14. Interrupted development and reduced developer temperature resulted in contrast enhancements of up to 125%. SYSTEM-9 had two to three times better dry-etch resistance than PMMA. A shallow-trench isolation technology capable of defining 0.2 μm wide active areas was developed. A rapid-thermal-annealing-based silicidation scheme using TiSi2 was established. MOSFET sidewall spacer formation using PECVD SiO2 was calibrated. Antimony and gallium were investigated as possible alternatives to arsenic and boron, respectively, and well-behaved substrate diodes were successfully fabricated. Two new patterning techniques for the metal bi-layer metallization of TiW and Al, based on liftoff and reactive ion etching, were developed. The source-drain resistance of the test structures was measured at room temperature and at 100 K. An LN2-flushed cold chuck for low-temperature device probing was designed and constructed. The temperature dependence of the current-voltage characteristics and the extracted series resistance proved that current flow in the contacts was tunneling dominated. The extrinsic source-drain resistance increased rapidly as the contact size decreased below 0.5 μm, and showed an almost two-order-of-magnitude change when the source-drain area was reduced from 2 × 1.7 μm² to 0.2 × 0.3 μm².
The effect of this resistance increase on a CMOS inverter switching speed was estimated. A first order empirical model to predict the series resistance was also formulated. Good correspondence was observed between results from the device simulator PISCES-2B and measured data for larger source-drain sizes.
The role of social media in recruiting for clinical trials in pregnancy.
Shere, Mahvash; Zhao, Xiu Yan; Koren, Gideon
2014-01-01
Recruitment of women in the periconceptional period to clinical studies using traditional advertising through medical establishments is difficult and slow. Given the widespread use of the internet as a source for medical information and research, we analyze the impact of social media in the second phase of an ongoing randomized, open-label clinical trial among pregnant women. This study aims to assess the effectiveness of social media as a recruitment tool through the comparison of diverse recruitment techniques in two different phases of the trial. Recruitment in Phase 1 of the study consisted solely of traditional healthcare-based sources. This was compared to Phase 2 of the study, where traditional recruitment was continued and expanded, while social media was used as a supplementary source. Yearly recruitment and recruitment rates in the two phases were compared using the Mann-Whitney U test. The contributions of each recruitment source to overall recruitment were analyzed, and the impact of potential confounders on recruitment rate was evaluated using a multiple regression and Interrupted Time Series Analysis. In the first phase of the study, with over 56 months of recruitment using traditional sources, 35 women were enrolled in the study, resulting in a mean rate of approximately 0.62 recruits/month. In the 6 months implementing recruitment through social media, 45 women were recruited, for a 12-fold higher rate of approximately 7.5 recruits/month. Attrition rates remained constant, suggesting that social media had a positive impact on recruitment. The Interrupted Time Series Analysis detected a significant difference in recruitment after the intervention of social media (p<0.0001), with an evident increase in the number of recruits observed after the use of social media. Clinicians and scientists recruiting for clinical studies should learn how to use online social media platforms to improve recruitment rates, thus increasing recruitment efficiency and cost-effectiveness.
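The interrupted time series analysis mentioned above can be sketched as a segmented regression with a level-change term at the intervention point. The sketch below uses simulated monthly counts loosely inspired by the reported rates; it is not the study's data or code.

```python
import numpy as np

# Hypothetical monthly recruitment counts: 56 pre-intervention months
# (traditional recruitment) followed by 6 post-intervention months
# (social media added).  Values are simulated, not the study's data.
rng = np.random.default_rng(1)
pre = rng.poisson(0.62, size=56)
post = rng.poisson(7.5, size=6)
y = np.concatenate([pre, post]).astype(float)

t = np.arange(len(y))                    # time in months
d = (t >= 56).astype(float)              # indicator: after intervention
# Segmented regression: baseline level, baseline trend,
# level change at the switch, and slope change after the switch
X = np.column_stack([np.ones_like(t), t, d, d * (t - 56)])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
level_change = beta[2]                   # jump in recruits/month at the switch
```

The coefficient on the indicator (`beta[2]`) estimates the immediate level change in recruitment at the intervention, which is what an interrupted time series analysis tests for significance.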
Strynar, Mark; Dagnino, Sonia; McMahen, Rebecca; Liang, Shuang; Lindstrom, Andrew; Andersen, Erik; McMillan, Larry; Thurman, Michael; Ferrer, Imma; Ball, Carol
2015-10-06
Recent scientific scrutiny and concerns over exposure, toxicity, and risk have led to international regulatory efforts resulting in the reduction or elimination of certain perfluorinated compounds from various products and waste streams. Some manufacturers have started producing shorter chain per- and polyfluorinated compounds to try to reduce the potential for bioaccumulation in humans and wildlife. Some of these new compounds contain central ether oxygens or other minor modifications of traditional perfluorinated structures. At present, there has been very limited information published on these "replacement chemistries" in the peer-reviewed literature. In this study we used a time-of-flight mass spectrometry detector (LC-ESI-TOFMS) to identify fluorinated compounds in natural waters collected from locations with historical perfluorinated compound contamination. Our workflow for discovery of chemicals included sequential sampling of surface water for identification of potential sources, nontargeted TOFMS analysis, molecular feature extraction (MFE) of samples, and evaluation of features unique to the sample with source inputs. Specifically, compounds were tentatively identified by (1) accurate mass determination of parent and/or related adducts and fragments from in-source collision-induced dissociation (CID), (2) in-depth evaluation of in-source adducts formed during analysis, and (3) confirmation with authentic standards when available. We observed groups of compounds in homologous series that differed by multiples of CF2 (m/z 49.9968) or CF2O (m/z 65.9917). Compounds in each series were chromatographically separated and had comparable fragments and adducts produced during analysis. We detected 12 novel perfluoroalkyl ether carboxylic and sulfonic acids in surface water in North Carolina, USA using this approach. 
A key piece of evidence was the discovery of accurate mass in-source n-mer formation (H(+) and Na(+)) differing by m/z 21.9819, corresponding to the mass difference between the protonated and sodiated dimers.
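The homologous-series screen described above, finding features that differ by multiples of CF2 (m/z 49.9968), can be sketched as a simple accurate-mass difference search. The peak masses and tolerance below are illustrative, not values from the study.

```python
import numpy as np

CF2 = 49.9968   # exact mass of a CF2 unit, as used in the study
TOL = 0.001     # Da; illustrative mass tolerance

def homologous_pairs(masses, delta=CF2, tol=TOL, max_n=5):
    """Return index pairs (i, j) whose mass difference is an integer
    multiple (1..max_n) of `delta` within `tol` -- a simple screen for
    members of one homologous series."""
    masses = np.asarray(masses)
    pairs = []
    for i in range(len(masses)):
        for j in range(i + 1, len(masses)):
            diff = abs(masses[j] - masses[i])
            n = round(diff / delta)
            if 1 <= n <= max_n and abs(diff - n * delta) <= tol:
                pairs.append((i, j))
    return pairs

# Illustrative masses: three members spaced by CF2, plus one unrelated peak
peaks = [328.9686, 378.9654, 428.9622, 300.1234]
print(homologous_pairs(peaks))   # → [(0, 1), (0, 2), (1, 2)]
```

A real workflow would run this over the molecular features extracted by the MFE step and also test the CF2O spacing (m/z 65.9917).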
Hackstadt, Amber J; Peng, Roger D
2014-11-01
Time series studies have suggested that air pollution can negatively impact health. These studies have typically focused on the total mass of fine particulate matter air pollution or the individual chemical constituents that contribute to it, and not source-specific contributions to air pollution. Source-specific contribution estimates are useful from a regulatory standpoint by allowing regulators to focus limited resources on reducing emissions from sources that are major contributors to air pollution and are also desired when estimating source-specific health effects. However, researchers often lack direct observations of the emissions at the source level. We propose a Bayesian multivariate receptor model to infer information about source contributions from ambient air pollution measurements. The proposed model incorporates information from national databases containing data on both the composition of source emissions and the amount of emissions from known sources of air pollution. The proposed model is used to perform source apportionment analyses for two distinct locations in the United States (Boston, Massachusetts and Phoenix, Arizona). Our results mirror previous source apportionment analyses that did not utilize the information from national databases and provide additional information about uncertainty that is relevant to the estimation of health effects.
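The Bayesian multivariate receptor model itself is beyond a short sketch, but the underlying receptor-model idea, ambient concentrations as a linear mix of source emission profiles, can be illustrated with ordinary least squares on synthetic data. All source names, profiles, and values below are hypothetical.

```python
import numpy as np

# Chemical mass balance sketch: ambient species concentrations modeled
# as a mix of known source profiles (all values illustrative).
# Columns: hypothetical "traffic" and "soil" profiles (fraction of mass
# each species contributes per unit of source mass).
F = np.array([[0.20, 0.02],   # elemental carbon
              [0.30, 0.01],   # organic carbon
              [0.01, 0.40],   # silicon
              [0.02, 0.25]])  # iron

g_true = np.array([5.0, 2.0])   # true source contributions (ug/m3)
x = F @ g_true                  # simulated ambient measurement

# Least-squares estimate of source contributions from the measurement
g_hat, *_ = np.linalg.lstsq(F, x, rcond=None)   # ≈ [5. 2.]
```

The Bayesian model in the abstract goes further: it treats both the profiles `F` and contributions `g` as uncertain, constrained by priors built from the national emissions databases, which is what yields the uncertainty estimates the authors report.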
NASA Astrophysics Data System (ADS)
Plank, G.; Slater, D.; Torrisi, J.; Presser, R.; Williams, M.; Smith, K. D.
2012-12-01
The Nevada Seismological Laboratory (NSL) manages time-series data and high-throughput IP telemetry for the National Center for Nuclear Security (NCNS) Source Physics Experiment (SPE), underway on the Nevada National Security Site (NNSS). During active-source experiments, SPE's heterogeneous systems record over 350 channels of a variety of data types including seismic, infrasound, acoustic, and electromagnetic. During the interim periods, broadband and short-period instruments record approximately 200 channels of continuous, high-sample-rate seismic data. Frequent changes in sensor and station configurations create a challenging meta-data environment. Meta-data account for complete operational histories, including sensor types, serial numbers, gains, sample rates, orientations, instrument responses, data-logger types, etc. To date, these catalogue 217 stations, over 40 different sensor types, and over 1000 unique recording configurations (epochs). Facilities for processing, backup, and distribution of time-series data currently span four Linux servers, 60 TB of disk capacity, and two data centers. Bandwidth, physical security, and redundant power and cooling systems for acquisition, processing, and backup servers are provided by NSL's Reno data center. The Nevada System of Higher Education (NSHE) System Computer Services (SCS) in Las Vegas provides similar facilities for the distribution server. NSL staff handle setup, maintenance, and security of all data management systems. SPE PIs have remote access to meta-data, raw data, and CSS3.0 compilations via SSL-based transfers such as rsync or secure-copy, as well as shell access for data browsing and limited processing. Meta-data are continuously updated and posted on the Las Vegas distribution server as station histories are better understood and errors are corrected. Raw time series and refined CSS3.0 data compilations with standardized formats are transferred to the Las Vegas data server as available.
For better data availability and station monitoring, SPE is beginning to leverage NSL's wide-area digital IP network with nine SPE stations and six Rock Valley area stations that stream continuous recordings in real time to the NSL Reno data center. These stations, in addition to eight regional legacy stations supported by National Security Technologies (NSTec), are integrated with NSL's regional monitoring network and constrain a high-quality local earthquake catalog for NNSS. The telemetered stations provide critical capabilities for SPE, and infrastructure for earthquake response on NNSS as well as southern Nevada and the Las Vegas area.
NASA Astrophysics Data System (ADS)
Broich, M.; Tulbure, M. G.; Wijaya, A.; Weisse, M.; Stolle, F.
2017-12-01
Deforestation and forest degradation are the second-largest source of anthropogenic CO2 emissions. While deforestation is being globally mapped with satellite image time series, degradation remains insufficiently quantified. Previous studies quantified degradation for small-scale, local sites. A method suitable for accurate mapping across large areas has not yet been developed, due to the variability of the low-magnitude, short-lived degradation signal and the absence of data with suitable resolution properties. Here we use a combination of newly available streams of free optical and radar image time series acquired by NASA and ESA, and HPC-based data science algorithms, to innovatively quantify degradation consistently across Southeast Asia (SEA). We used Sentinel-1 C-band radar data and NASA's new Harmonized Landsat-8 (L8) and Sentinel-2 (S2) product (HLS) for cloud-free optical images. Our results show that dense time series of cloud-penetrating Sentinel-1 C-band radar can provide degradation alarm flags, while the HLS product of cloud-free optical images can unambiguously confirm degradation alarms. The detectability of degradation differed across SEA. In the seasonal forests of continental SEA, the reliability of our radar-based alarm flags increased as the variability in landscape moisture decreased in the dry season. We reliably confirmed alarms with optical image time series during the late dry season, when degradation in open-canopy forests becomes detectable once the undergrowth vegetation has died down. Conversely, in insular SEA, where variability in landscape moisture is low, the radar time series generated degradation alarm flags with moderate to high reliability throughout the year, further confirmed with the HLS product. Based on the HLS product we can now confirm degradation within less than 6 months on average, as opposed to 1 year when using either L8 or S2 alone.
In contrast to continental SEA, across insular SEA our degradation maps are not suitable for producing annual maps of total degradation area, but they can pinpoint degradation areas on a rolling basis throughout the year. In both continental and insular SEA, the combination of optical and radar time series provides better results than either one on its own. Our results provide significant information with applications for carbon trading policy and land management.
Fossil-Fuel CO2 Emissions Database and Exploration System
NASA Astrophysics Data System (ADS)
Krassovski, M.; Boden, T.; Andres, R. J.; Blasing, T. J.
2012-12-01
The Carbon Dioxide Information Analysis Center (CDIAC) at Oak Ridge National Laboratory (ORNL) quantifies the release of carbon from fossil-fuel use and cement production at global, regional, and national spatial scales. The CDIAC emission time series estimates are based largely on annual energy statistics published at the national level by the United Nations (UN). CDIAC has developed a relational database to house the collected data and information, and a web-based interface to help users worldwide identify, explore, and download desired emission data. The available information is divided into two major groups: time series and gridded data. The time series data are offered at global, regional, and national scales. Publications containing historical energy statistics make it possible to estimate fossil-fuel CO2 emissions back to 1751. Etemad et al. (1991) published a summary compilation that tabulates coal, brown coal, peat, and crude oil production by nation and year. Footnotes in the Etemad et al. (1991) publication extend the energy statistics time series back to 1751. Summary compilations of fossil-fuel trade were published by Mitchell (1983, 1992, 1993, 1995). Mitchell's work tabulates solid and liquid fuel imports and exports by nation and year. These pre-1950 production and trade data were digitized, and CO2 emission calculations were made following the procedures discussed in Marland and Rotty (1984) and Boden et al. (1995). The gridded data comprise annual and monthly estimates. The annual data form a time series recording CO2 emissions on a 1° latitude by 1° longitude grid, in units of million metric tons of carbon per year, from anthropogenic sources for 1751-2008. The monthly fossil-fuel CO2 emission estimates from 1950-2008 provided in this database are derived from time series of global, regional, and national fossil-fuel CO2 emissions (Boden et al. 2011), the references therein, and the methodology described in Andres et al. (2011).
The data accessible here take these tabular, national, mass-emissions data and distribute them spatially on a one-degree-latitude by one-degree-longitude grid. The within-country spatial distribution is achieved through a fixed population distribution as reported in Andres et al. (1996). This presentation introduces the newly built database and web interface, reflecting the present state and functionality of the Fossil-Fuel CO2 Emissions Database and Exploration System as well as future plans for expansion.
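The within-country gridding step described above can be sketched as a proportional allocation of a national total by each grid cell's population share. All numbers below are illustrative, not CDIAC data.

```python
import numpy as np

# Distribute a national emission total onto grid cells in proportion to
# each cell's share of national population (illustrative numbers).
national_total = 120.0                        # Mt C for one country-year
population = np.array([[0.0,   2.0e6, 1.0e6],
                       [4.0e6, 3.0e6, 0.0]])  # people per 1-degree cell

weights = population / population.sum()       # population shares, sum to 1
gridded = national_total * weights            # Mt C allocated per cell
```

Because the weights sum to one, the gridded field conserves the national mass total exactly, which is the key bookkeeping property of this allocation scheme.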
Asymmetric multiscale multifractal analysis of wind speed signals
NASA Astrophysics Data System (ADS)
Zhang, Xiaonei; Zeng, Ming; Meng, Qinghao
We develop a new method called asymmetric multiscale multifractal analysis (A-MMA) to explore the multifractality and asymmetric autocorrelations of the signals with a variable scale range. Three numerical experiments are provided to demonstrate the effectiveness of our approach. Then, the proposed method is applied to investigate multifractality and asymmetric autocorrelations of difference sequences between wind speed fluctuations with uptrends or downtrends. The results show that these sequences appear to be far more complex and contain abundant fractal dynamics information. Through analyzing the Hurst surfaces of nine difference sequences, we found that all series exhibit multifractal properties and multiscale structures. Meanwhile, the asymmetric autocorrelations are observed in all variable scale ranges and the asymmetry results are of good consistency within a certain spatial range. The sources of multifractality and asymmetry in nine difference series are further discussed using the corresponding shuffled series and surrogate series. We conclude that the multifractality of these series is due to both long-range autocorrelation and broad probability density function, but the major source of multifractality is long-range autocorrelation, and the source of asymmetry is affected by the spatial distance.
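The shuffling check used above to separate the autocorrelation source of multifractality from the broad-distribution source can be sketched for a simple correlated series. The AR(1) process below stands in for a wind speed record; all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)

# AR(1) series with strong lag-one correlation, standing in for a
# correlated wind speed record (parameters are illustrative).
n = 5000
x = np.zeros(n)
eps = rng.normal(size=n)
for i in range(1, n):
    x[i] = 0.9 * x[i - 1] + eps[i]

def lag1_autocorr(y):
    y = y - y.mean()
    return (y[:-1] @ y[1:]) / (y @ y)

# Shuffling keeps the value distribution but destroys temporal ordering,
# so any multifractality surviving the shuffle stems from the broad
# probability density function rather than from autocorrelation.
shuffled = rng.permutation(x)
r_orig = lag1_autocorr(x)         # close to 0.9: correlated
r_shuf = lag1_autocorr(shuffled)  # close to 0: correlation destroyed
```

Surrogate series (phase-randomized) play the complementary role: they keep the linear correlations but destroy nonlinearity and distribution shape.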
NASA Astrophysics Data System (ADS)
Korotaev, S. M.; Serdyuk, V. O.; Kiktenko, E. O.; Budnev, N. M.; Gorohov, J. V.
Although the general theory of macroscopic quantum entanglement is still in its infancy, consideration of the matter in the framework of action-at-a-distance electrodynamics predicts, for random dissipative processes, the observability of advanced nonlocal correlations (time-reversal causality). These correlations were indeed revealed in our previous experiments with some large-scale heliogeophysical processes as the sources and lab detectors as the probes. Recently a new experiment has been under way at the Baikal Deep Water Neutrino Observatory. The thick water layer is an excellent shield against any local impacts on the detectors. The first annual series, 2012/2013, demonstrated that the detector signals respond to heliogeophysical (external) processes and that the causal connection of the signals is directed downwards, from the Earth's surface to the Baikal floor; this nonlocal connection proved to be in reverse time. In addition, an advanced nonlocal correlation of the detector signal with a regional source process, the random component of hydrological activity in the upper layer, was revealed, and the possibility of its forecast from nonlocal correlations was demonstrated. The strongest macroscopic nonlocal correlations, however, are observed at extremely low frequencies, that is, at periods of several months. Therefore the above results should be verified in a longer experiment. We verify them with data from the second annual series, 2013/2014, of the Baikal experiment. All the results have been confirmed, although some quantitative parameters of the correlations and time-reversal causal links turned out to be different due to nonstationarity of the source processes. A new result is the observation of an advanced response of the nonlocal-correlation detector to an earthquake. This opens up the prospect of earthquake forecasting on a new physical principle, although further confirmation in subsequent events is certainly needed.
The continuation of the Baikal experiment with an expanded program is therefore a pressing task.
Discovery and Evolution of the New Black Hole Candidate Swift J1539.2-6227 During Its 2008 Outburst
NASA Technical Reports Server (NTRS)
Krimm, H. A.; Tomsick, J. A.; Markwardt, C. B.; Brocksopp, C.; Grise, F.; Kaaret, P.; Romano, P.
2010-01-01
We report on the discovery by the Swift Gamma-Ray Burst Explorer of the black hole candidate Swift J1539.2-6227 and the subsequent course of an outburst beginning in November 2008 and lasting at least seven months. The source was discovered during normal observations with the Swift Burst Alert Telescope (BAT) on 2008 November 25. An extended observing campaign with the Rossi X-Ray Timing Explorer (RXTE) and Swift provided near-daily coverage over 176 days, giving us a rare opportunity to track the evolution of spectral and timing parameters with fine temporal resolution through a series of spectral states. The source was first detected in a hard state during which strong low-frequency quasiperiodic oscillations (QPOs) were detected. The QPOs persisted for about 35 days and a signature of the transition from the hard to soft intermediate states was seen in the timing data. The source entered a short-lived thermal state about 40 days after the start of the outburst. There were variations in spectral hardness as the source flux declined and returned to a hard state at the end of the outburst. The progression of spectral states and the nature of the timing features provide strong evidence that Swift J1539.2-6227 is a candidate black hole in a low-mass X-ray binary system.
Regenerating time series from ordinal networks.
McCullough, Michael; Sakellariou, Konstantinos; Stemler, Thomas; Small, Michael
2017-03-01
Recently proposed ordinal networks not only afford novel methods of nonlinear time series analysis but also constitute stochastic approximations of the deterministic flow time series from which the network models are constructed. In this paper, we construct ordinal networks from discrete sampled continuous chaotic time series and then regenerate new time series by taking random walks on the ordinal network. We then investigate the extent to which the dynamics of the original time series are encoded in the ordinal networks and retained through the process of regenerating new time series by using several distinct quantitative approaches. First, we use recurrence quantification analysis on traditional recurrence plots and order recurrence plots to compare the temporal structure of the original time series with random walk surrogate time series. Second, we estimate the largest Lyapunov exponent from the original time series and investigate the extent to which this invariant measure can be estimated from the surrogate time series. Finally, estimates of correlation dimension are computed to compare the topological properties of the original and surrogate time series dynamics. Our findings show that ordinal networks constructed from univariate time series data constitute stochastic models which approximate important dynamical properties of the original systems.
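A minimal sketch of the construct-then-regenerate procedure, ordinal patterns from an embedding, a weighted transition network, and regeneration by a random walk, might look like the following. This is not the authors' code; the embedding dimension and source series (a chaotic logistic map) are illustrative.

```python
import numpy as np
from collections import defaultdict

def ordinal_pattern(w):
    # Rank ordering of a window, e.g. (0.2, 0.9, 0.5) -> (0, 2, 1)
    return tuple(int(i) for i in np.argsort(w))

def build_and_walk(x, m=3, steps=500, seed=0):
    """Build a weighted ordinal transition network from series x with
    embedding dimension m, then regenerate a pattern sequence by a
    random walk on that network."""
    pats = [ordinal_pattern(x[i:i + m]) for i in range(len(x) - m + 1)]
    trans = defaultdict(lambda: defaultdict(int))
    for a, b in zip(pats[:-1], pats[1:]):
        trans[a][b] += 1                       # edge weight = transition count
    rng = np.random.default_rng(seed)
    state, walk = pats[0], [pats[0]]
    for _ in range(steps - 1):
        succ = list(trans[state])
        if not succ:                           # sink node: restart the walk
            state = pats[rng.integers(len(pats))]
        else:
            w = np.array([trans[state][s] for s in succ], dtype=float)
            state = succ[rng.choice(len(succ), p=w / w.sum())]
        walk.append(state)
    return walk

# Logistic map in its chaotic regime as the source time series
x = np.empty(2000)
x[0] = 0.4
for i in range(1, 2000):
    x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])

walk = build_and_walk(x, m=3)   # regenerated ordinal-pattern sequence
```

The regenerated pattern sequence is the stochastic surrogate whose recurrence structure, Lyapunov exponent, and correlation dimension the paper compares against the original dynamics.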
NASA Astrophysics Data System (ADS)
Swanson, R. E.
2017-12-01
Climate data records typically exhibit considerable variation over short time scales, both from natural variability and from instrumentation issues. Linear least-squares regression can provide overall trend information from noisy data; however, assessing intermediate time periods can also provide useful information unavailable from basic trend calculations. Extracting the short-term information in these data, for assessing changes to climate or for comparing data series from different sources, requires the application of filters to separate short-period variations from longer-period trends. A common method used to smooth data is the moving average, a simple digital filter that can distort the resulting series due to aliasing of the sampling period into the output series. We utilized Hamming filters to compare MSU/AMSU satellite time series developed by three research groups (UAH, RSS, and NOAA STAR), with the results published in January 2017 [http://journals.ametsoc.org/doi/abs/10.1175/JTECH-D-16-0121.1]. Since the last release date (July 2016) for the data analyzed in that paper, some of these groups have updated their analytical procedures, and additional months of data are available to extend the series. An updated analysis of these data, using the latest data releases available from each group, is to be presented. Improved graphics will be employed to provide a clearer visualization of the differences between each group's results. As in the previous paper, the greatest difference between the UAH TMT series and those from the RSS and NOAA data appears during the early period of data from the MSU instruments, before about 2003, as shown in the attached figure, and preliminary results indicate this pattern continues. Also to be presented are other findings regarding seasonal changes which were not included in the previous study.
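The aliasing issue with the moving average is what motivates the tapered window. A minimal smoothing sketch comparing a boxcar (moving average) with a unit-gain Hamming window on a synthetic monthly series follows; the series, window length, and noise level are illustrative.

```python
import numpy as np

def smooth(x, window):
    """Convolve x with a window normalized to unit DC gain."""
    w = window / window.sum()
    return np.convolve(x, w, mode="same")

n = 480                                       # e.g. 40 years of monthly data
t = np.arange(n)
rng = np.random.default_rng(2)
signal = 0.002 * t + 0.3 * np.sin(2 * np.pi * t / 12)   # trend + annual cycle
noisy = signal + 0.2 * rng.normal(size=n)

boxcar = smooth(noisy, np.ones(25))           # 25-point moving average
hamming = smooth(noisy, np.hamming(25))       # tapered Hamming window
```

The boxcar's frequency response has large sidelobes that fold short-period variation into the smoothed output; the Hamming taper suppresses those sidelobes at the cost of a slightly wider main lobe.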
GPS Position Time Series @ JPL
NASA Technical Reports Server (NTRS)
Owen, Susan; Moore, Angelyn; Kedar, Sharon; Liu, Zhen; Webb, Frank; Heflin, Mike; Desai, Shailen
2013-01-01
Different flavors of GPS time series analysis at JPL all use the same GPS Precise Point Positioning Analysis raw time series; the variations in time series analysis and post-processing are driven by different users. JPL Global Time Series/Velocities: for researchers studying the reference frame, combining with VLBI/SLR/DORIS. JPL/SOPAC Combined Time Series/Velocities: crustal deformation for tectonic, volcanic, and ground-water studies. ARIA Time Series/Coseismic Data Products: focused on hazard monitoring and response. The ARIA data system is designed to integrate GPS and InSAR: GPS tropospheric delay is used for correcting InSAR, and Caltech's GIANT time series analysis uses GPS to correct orbital errors in InSAR. Zhen Liu speaks on InSAR time series analysis in a separate talk.
Liu, Yangfan; Bolton, J Stuart
2016-08-01
The (Cartesian) multipole series, i.e., the series comprising monopole, dipoles, quadrupoles, etc., can be used, as an alternative to the spherical or cylindrical wave series, in representing sound fields in a wide range of problems, such as source radiation, sound scattering, etc. The proofs of the completeness of the spherical and cylindrical wave series in these problems are classical results, and it is also generally agreed that the Cartesian multipole series spans the same space as the spherical waves: a rigorous mathematical proof of that statement has, however, not been presented. In the present work, such a proof of the completeness of the Cartesian multipole series, both in two and three dimensions, is given, and the linear dependence relations among different orders of multipoles are discussed, which then allows one to easily extract a basis from the multipole series. In particular, it is concluded that the multipoles comprising the two highest orders in the series form a basis of the whole series, since the multipoles of all the lower source orders can be expressed as a linear combination of that basis.
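A sketch of the dependence relation behind that last conclusion, assuming (as the abstract suggests) that the multipoles are Cartesian derivatives of the free-space Helmholtz Green's function; the notation here is ours, not the paper's:

```latex
% Multipole of multi-index \alpha = (\alpha_1,\alpha_2,\alpha_3):
\varphi_{\alpha}
  = \partial_x^{\alpha_1}\partial_y^{\alpha_2}\partial_z^{\alpha_3}\,G,
\qquad G(\mathbf{r}) = \frac{e^{ikr}}{4\pi r}.
% Away from the source G satisfies (\nabla^2 + k^2)G = 0; applying
% \partial^{\alpha} to this equation gives
\varphi_{\alpha + 2e_x} + \varphi_{\alpha + 2e_y} + \varphi_{\alpha + 2e_z}
  = -k^{2}\,\varphi_{\alpha},
% so every multipole of order |\alpha| is a linear combination of
% multipoles of order |\alpha| + 2.  Iterating downward, the multipoles
% of the two highest orders in the series suffice to span all lower
% orders, consistent with the basis statement above.
```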
DOE Office of Scientific and Technical Information (OSTI.GOV)
Silva, E.R.C. da; Filho, B.J.C.
This paper presents a PWM current clamping circuit for improving a series resonant DC link converter. The circuit is capable of reducing current peaks to about 1.2-1.4 times the DC bias current. When desired, a resonant transition creates notches in the DC link current, allowing the converter's switches to synchronize with an external PWM strategy. A regulated DC current source may be obtained, by using a conventional rectifier source, to feed a DC load or a current source inverter. A phase-plane approach eases understanding of the operation, control, and design procedure of the circuit. Another topology is derived and its features are compared to those of the first circuit. Simulation results for the simplified circuit and for a three-phase induction motor driven by such an inverter will be presented. Moreover, the principle is corroborated by experimental results.
Schachter, L; Dobrescu, S; Stiebing, K E; Thuillier, T; Lamy, T
2008-02-01
Charge diffusion in an electron cyclotron resonance ion source (ECRIS) discharge is usually characterized by nonambipolar behavior. While the ions are transported to the radial walls, electrons are lost axially from the magnetic trap. Global neutrality is maintained via compensating currents in the conducting walls of the vacuum chamber. It is assumed that this behavior reduces the ion breeding times compared to a truly ambipolar plasma. We have carried out a series of dedicated experiments in which the ambipolarity of the ECRIS plasma was influenced by inserting special metal-dielectric structures (MD layers) into the plasma chamber of the Frankfurt 14 GHz ECRIS. The measurements demonstrate the positive influence on the source performance when the ECR plasma is changed toward more ambipolar behavior.
NOAA Propagation Database Value in Tsunami Forecast Guidance
NASA Astrophysics Data System (ADS)
Eble, M. C.; Wright, L. M.
2016-02-01
The National Oceanic and Atmospheric Administration (NOAA) Center for Tsunami Research (NCTR) has developed a tsunami forecasting capability that combines a graphical user interface with data ingestion and numerical models to produce estimates of tsunami wave arrival times, amplitudes, current or water flow rates, and flooding at specific coastal communities. The capability integrates several key components: deep-ocean observations of tsunamis in real-time, a basin-wide pre-computed propagation database of water level and flow velocities based on potential pre-defined seismic unit sources, an inversion or fitting algorithm to refine the tsunami source based on the observations during an event, and tsunami forecast models. As tsunami waves propagate across the ocean, observations from the deep ocean are automatically ingested into the application in real-time to better define the source of the tsunami itself. Since passage of tsunami waves over a deep ocean reporting site is not immediate, we explore the value of the NOAA propagation database in providing placeholder forecasts in advance of deep ocean observations. The propagation database consists of water elevations and flow velocities pre-computed for 50 × 100 km unit sources in a continuous series along all known ocean subduction zones. The 2011 Japan Tohoku tsunami is presented as the case study.
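The inversion step described above amounts to fitting the deep-ocean observations with a linear combination of pre-computed unit-source responses. A minimal sketch of that idea, with synthetic numbers and names of my own invention (not NOAA's code or data):

```python
# Sketch (not NOAA's implementation): a tsunami forecast is refined by
# fitting observed deep-ocean amplitudes with a linear combination of
# pre-computed unit-source time series. Two sources, one observation site.

def lstsq_2sources(g1, g2, obs):
    """Solve min ||a*g1 + b*g2 - obs||^2 via the 2x2 normal equations."""
    s11 = sum(x * x for x in g1)
    s22 = sum(x * x for x in g2)
    s12 = sum(x * y for x, y in zip(g1, g2))
    b1 = sum(x * y for x, y in zip(g1, obs))
    b2 = sum(x * y for x, y in zip(g2, obs))
    det = s11 * s22 - s12 * s12
    a = (s22 * b1 - s12 * b2) / det
    b = (s11 * b2 - s12 * b1) / det
    return a, b

# Synthetic unit-source responses at one deep-ocean site, and a synthetic
# event that is 2x source 1 plus 0.5x source 2:
g1 = [0.0, 0.1, 0.3, 0.2, 0.05]
g2 = [0.0, 0.05, 0.1, 0.25, 0.1]
obs = [2 * x + 0.5 * y for x, y in zip(g1, g2)]
a, b = lstsq_2sources(g1, g2, obs)
print(round(a, 6), round(b, 6))  # recovers 2.0 and 0.5
```

With real DART records the same fit generalizes to many unit sources and noisy observations via a standard least-squares solver.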
Plis, Sergey M; George, J S; Jun, S C; Paré-Blagoev, J; Ranken, D M; Wood, C C; Schmidt, D M
2007-01-01
We propose a new model to approximate spatiotemporal noise covariance for use in neural electromagnetic source analysis, which better captures temporal variability in background activity. As with other existing formalisms, our model employs a Kronecker product of matrices representing temporal and spatial covariance. In our model, spatial components are allowed to have differing temporal covariances. Variability is represented as a series of Kronecker products of spatial component covariances and corresponding temporal covariances. Unlike previous attempts to model covariance through a sum of Kronecker products, our model is designed to have a computationally manageable inverse. Despite increased descriptive power, inversion of the model is fast, making it useful in source analysis. We have explored two versions of the model. One is estimated based on the assumption that spatial components of background noise have uncorrelated time courses. Another version, which gives closer approximation, is based on the assumption that time courses are statistically independent. The accuracy of the structural approximation is compared to an existing model, based on a single Kronecker product, using both Frobenius norm of the difference between spatiotemporal sample covariance and a model, and scatter plots. Performance of ours and previous models is compared in source analysis of a large number of single dipole problems with simulated time courses and with background from authentic magnetoencephalography data.
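The computational appeal of Kronecker-structured covariance rests on a standard identity: the inverse of a Kronecker product is the Kronecker product of the inverses. A small numerical check of that identity (an illustration of the underlying algebra, not the paper's estimator):

```python
import numpy as np

# Numerical check of the algebra behind Kronecker-structured noise models
# (an illustration, not the paper's estimator): for C = S (x) T,
# C^{-1} = S^{-1} (x) T^{-1}, so a (p*q)x(p*q) covariance is inverted via
# one pxp and one qxq inverse.
rng = np.random.default_rng(0)

def random_spd(n):
    """Random symmetric positive-definite matrix (toy covariance)."""
    a = rng.standard_normal((n, n))
    return a @ a.T + n * np.eye(n)

S = random_spd(3)  # spatial covariance (3 sensors, toy size)
T = random_spd(4)  # temporal covariance (4 time points)
C = np.kron(S, T)  # full 12x12 spatiotemporal covariance

fast_inverse = np.kron(np.linalg.inv(S), np.linalg.inv(T))
print(np.allclose(fast_inverse, np.linalg.inv(C)))  # True
```

The paper's model is a *sum* of such products, which does not factor this way in general; its contribution is precisely a sum-of-Kroneckers structure that still admits a manageable inverse.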
LORETA EEG phase reset of the default mode network
Thatcher, Robert W.; North, Duane M.; Biver, Carl J.
2014-01-01
Objectives: The purpose of this study was to explore phase reset of 3-dimensional current sources in Brodmann areas located in the human default mode network (DMN) using Low Resolution Electromagnetic Tomography (LORETA) of the human electroencephalogram (EEG). Methods: The EEG was recorded from 19 scalp locations from 70 healthy normal subjects ranging in age from 13 to 20 years. A time point by time point computation of LORETA current sources were computed for 14 Brodmann areas comprising the DMN in the delta frequency band. The Hilbert transform of the LORETA time series was used to compute the instantaneous phase differences between all pairs of Brodmann areas. Phase shift and lock durations were calculated based on the 1st and 2nd derivatives of the time series of phase differences. Results: Phase shift duration exhibited three discrete modes at approximately: (1) 25 ms, (2) 50 ms, and (3) 65 ms. Phase lock duration present primarily at: (1) 300–350 ms and (2) 350–450 ms. Phase shift and lock durations were inversely related and exhibited an exponential change with distance between Brodmann areas. Conclusions: The results are explained by local neural packing density of network hubs and an exponential decrease in connections with distance from a hub. The results are consistent with a discrete temporal model of brain function where anatomical hubs behave like a “shutter” that opens and closes at specific durations as nodes of a network giving rise to temporarily phase locked clusters of neurons for specific durations. PMID:25100976
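The Hilbert-transform step can be made concrete: the analytic signal yields an instantaneous phase per channel, and the channel-pair phase difference is the series from which shift and lock durations are derived. This toy example (my own construction, not LORETA) builds the analytic signal with an FFT and recovers a known 45° lag between two 3 Hz signals:

```python
import numpy as np

# Sketch of the phase-difference step (illustrative, not LORETA itself):
# the analytic signal gives instantaneous phase; the difference between
# two channels' phases is the series from which phase shift and lock
# durations are computed.

def analytic_signal(x):
    """Analytic signal via the one-sided FFT spectrum."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0
    return np.fft.ifft(X * h)

t = np.linspace(0, 1, 1000, endpoint=False)
a = np.sin(2 * np.pi * 3 * t)              # delta-band-like 3 Hz signal
b = np.sin(2 * np.pi * 3 * t - np.pi / 4)  # same frequency, 45 degree lag
phase_diff = np.angle(analytic_signal(a)) - np.angle(analytic_signal(b))
# The median difference recovers the imposed lag of pi/4:
print(round(float(np.median(phase_diff)), 3))  # ≈ 0.785
```

In the study the derivatives of such a phase-difference series are then thresholded to segment it into shift and lock intervals.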
NASA Astrophysics Data System (ADS)
Czuba, Jonathan A.; Foufoula-Georgiou, Efi; Gran, Karen B.; Belmont, Patrick; Wilcock, Peter R.
2017-05-01
Understanding how sediment moves along source to sink pathways through watersheds—from hillslopes to channels and in and out of floodplains—is a fundamental problem in geomorphology. We contribute to advancing this understanding by modeling the transport and in-channel storage dynamics of bed material sediment on a river network over a 600 year time period. Specifically, we present spatiotemporal changes in bed sediment thickness along an entire river network to elucidate how river networks organize and process sediment supply. We apply our model to sand transport in the agricultural Greater Blue Earth River Basin in Minnesota. By casting the arrival of sediment to links of the network as a Poisson process, we derive analytically (under supply-limited conditions) the time-averaged probability distribution function of bed sediment thickness for each link of the river network for any spatial distribution of inputs. Under transport-limited conditions, the analytical assumptions of the Poisson arrival process are violated (due to in-channel storage dynamics) where we find large fluctuations and periodicity in the time series of bed sediment thickness. The time series of bed sediment thickness is the result of dynamics on a network in propagating, altering, and amalgamating sediment inputs in sometimes unexpected ways. One key insight gleaned from the model is that there can be a small fraction of reaches with relatively low-transport capacity within a nonequilibrium river network acting as "bottlenecks" that control sediment delivery to downstream reaches, whereby fluctuations in bed elevation can dissociate from signals in sediment supply.
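The Poisson-arrival assumption has a classical consequence worth making concrete: if parcels arrive at a reach as a Poisson process and reside for independent travel times, the number present at a random instant is Poisson-distributed with mean equal to the arrival rate times the mean residence time (the M/G/∞ result). A toy simulation of my own (not the paper's network model) checks this:

```python
import numpy as np

# Toy check of the Poisson-arrival reasoning (my illustration, not the
# paper's network model): parcels arrive at rate lam and each stays an
# independent exponential travel time; the time-averaged count in the
# reach should approach lam * mean_stay, so mean bed thickness scales
# with supply rate times residence time.
rng = np.random.default_rng(1)
lam, mean_stay, horizon = 5.0, 2.0, 2000.0

arrivals = np.cumsum(rng.exponential(1 / lam, size=int(lam * horizon * 1.2)))
arrivals = arrivals[arrivals < horizon]
departures = arrivals + rng.exponential(mean_stay, size=arrivals.size)

# Count parcels present at many probe times (skipping a warm-up period):
probes = np.linspace(100, horizon, 4000)
in_reach = [np.sum((arrivals <= t) & (departures > t)) for t in probes]
print(round(float(np.mean(in_reach)), 1))  # close to lam * mean_stay = 10
```

Under transport-limited conditions this independence breaks down, which is exactly why the paper finds fluctuations and periodicity not predicted by the analytical distribution.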
NASA Astrophysics Data System (ADS)
Allstadt, Kate
2013-09-01
methods can substantially improve the characterization of the dynamics of large and rapid landslides. Such landslides often generate strong long-period seismic waves due to the large-scale acceleration of the entire landslide mass, which, according to theory, can be approximated as a single-force mechanism at long wavelengths. I apply this theory and invert the long-period seismic waves generated by the 48.5 Mm³ August 2010 Mount Meager rockslide-debris flow in British Columbia. Using data from five broadband seismic stations 70 to 276 km from the source, I obtain a time series of forces the landslide exerted on the Earth, with peak forces of 1.0 × 10¹¹ N. The direction and amplitude of the forces can be used to determine the timing and occurrence of events and subevents. Using this result, in combination with other field and geospatial evidence, I calculate an average horizontal acceleration of the rockslide of 0.39 m/s² and an average apparent coefficient of basal friction of 0.38 ± 0.02, which suggests elevated basal fluid pressures. The direction and timing of the strongest forces are consistent with the centripetal acceleration of the debris flow around corners in its path. I use this correlation to estimate speeds, which peak at 92 m/s. This study demonstrates that the time series of forces exerted by a large and rapid landslide, derived remotely from seismic records, can be used to tie post-slide evidence to what actually occurred during the event and can serve to validate numerical models and theoretical methods.
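The centripetal speed estimate reduces to elementary mechanics: if the inverted force implies a sideways (centripetal) acceleration a_c while the mass rounds a bend of radius r, then a_c = v²/r. A back-of-envelope sketch with illustrative numbers of my own, not values reported in the study:

```python
import math

# Sketch of the centripetal speed estimate (hypothetical inputs, not the
# study's data): a sideways acceleration a_c around a bend of radius r
# implies a flow speed v = sqrt(a_c * r).

def speed_from_centripetal(a_c, radius_m):
    return math.sqrt(a_c * radius_m)

# e.g. a hypothetical 1.7 m/s^2 centripetal acceleration around a 5 km bend:
v = speed_from_centripetal(1.7, 5000.0)
print(round(v, 1))  # ≈ 92.2 m/s
```

In practice a_c comes from the force time series divided by an estimate of the moving mass, and r from the mapped geometry of the flow path.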
Fumeaux, Christophe; Lin, Hungyen; Serita, Kazunori; Withayachumnankul, Withawat; Kaufmann, Thomas; Tonouchi, Masayoshi; Abbott, Derek
2012-07-30
The process of terahertz generation through optical rectification in a nonlinear crystal is modeled using discretized equivalent current sources. The equivalent terahertz sources are distributed in the active volume and computed based on a separately modeled near-infrared pump beam. This approach can be used to define an appropriate excitation for full-wave electromagnetic numerical simulations of the generated terahertz radiation. This enables predictive modeling of the near-field interactions of the terahertz beam with micro-structured samples, e.g. in a near-field time-resolved microscopy system. The distributed source model is described in detail, and an implementation in a particular full-wave simulation tool is presented. The numerical results are then validated through a series of measurements on square apertures. The general principle can be applied to other nonlinear processes with possible implementation in any full-wave numerical electromagnetic solver.
Integration of Reference Frames Using VLBI
NASA Technical Reports Server (NTRS)
Ma, Chopo; Smith, David E. (Technical Monitor)
2001-01-01
Very Long Baseline Interferometry (VLBI) has the unique potential to integrate the terrestrial and celestial reference frames through simultaneous estimation of positions and velocities of approx. 40 active VLBI stations and a similar number of stations/sites with sufficient historical data, the position and position stability of approx. 150 well-observed extragalactic radio sources and another approx. 500 sources distributed fairly uniformly on the sky, and the time series of the five parameters that specify the relative orientation of the two frames. The full realization of this potential is limited by a number of factors including the temporal and spatial distribution of the stations, uneven distribution of observations over the sources and the sky, variations in source structure, modeling of the solid/fluid Earth and troposphere, logistical restrictions on the daily observing network size, and differing strategies for optimizing analysis for TRF, for CRF and for EOP. The current status of separately optimized and integrated VLBI analysis will be discussed.
Kaptsov, V A; Sosunov, N N; Shishchenko, I I; Viktorov, V S; Tulushev, V N; Deynego, V N; Bukhareva, E A; Murashova, M A; Shishchenko, A A
2014-01-01
An experimental study was performed on the possible application of LED lighting (LED light sources) in rail transport for traffic safety in related professions. Four series of studies involving 10 volunteers compared the functional state of the visual analyzer, the general functional state, and mental capacity during simulated operator activity under traditional light sources (incandescent and fluorescent lamps) and new LED light sources (LED lamp, LED panel). The results revealed changes in the negative direction: a decrease in the functional stability of color discrimination between green and red cone signals, an increase in response time in the complex visual-motor response, and a significant reduction in the examinees' readiness for emergency action.
Shao, Liyang; Zhang, Lianjun; Zhen, Zhen
2017-01-01
Children’s blood lead concentrations have been closely monitored over the last two decades in the United States. The bio-monitoring surveillance data collected in local agencies reflected the local temporal trends of children’s blood lead levels (BLLs). However, the analysis and modeling of the long-term time series of BLLs have rarely been reported. We attempted to quantify the long-term trends of children’s BLLs in the city of Syracuse, New York and evaluate the impacts of local lead poisoning prevention programs and the Lead Hazard Control Program on reducing children’s BLLs. We applied interrupted time series analysis to the monthly time series of BLLs surveillance data and used ARMA (autoregressive and moving average) models to measure the average children’s blood lead level shift and detect the seasonal pattern change. Our results showed that there were three intervention stages over the past 20 years to reduce children’s BLLs in the city of Syracuse, NY. The average of children’s BLLs was significantly decreased after the interventions, declining from 8.77 μg/dL to 3.94 μg/dL during 1992 to 2011. The seasonal variation diminished over the past decade, but more short-term influences appeared in the variation. The lead hazard control treatment intervention proved effective in reducing children’s blood lead levels in Syracuse, NY. Also, the reduction of the seasonal variation of children’s BLLs reflected the impacts of the local lead-based paint mitigation program. The replacement of windows and doors was the major cost of lead abatement in houses. However, soil lead was not considered a major source of lead hazard in our analysis. PMID:28182688
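The core of an interrupted time series estimate can be made concrete with the simplest possible estimator: a before/after comparison of segment means at a known intervention date. The study's ARMA modeling additionally handles autocorrelation and seasonality; the data below are synthetic, chosen only to mirror the reported magnitudes:

```python
# Minimal interrupted-time-series sketch (illustrative, not the paper's
# ARIMA fit): estimate the level shift in a monthly BLL series at a known
# intervention month by comparing segment means.

def level_shift(series, intervention_idx):
    before = series[:intervention_idx]
    after = series[intervention_idx:]
    return sum(after) / len(after) - sum(before) / len(before)

# Synthetic monthly series: mean 8.8 ug/dL before the program, 3.9 after.
series = [8.8, 9.1, 8.5, 8.9, 8.7, 8.8] + [3.9, 4.1, 3.7, 4.0, 3.8, 3.9]
shift = level_shift(series, 6)
print(round(shift, 2))  # -4.9
```

A full ITS analysis would also fit pre-intervention trend and seasonal terms so the shift is not confounded with an ongoing decline.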
Annual land cover change mapping using MODIS time series to improve emissions inventories.
NASA Astrophysics Data System (ADS)
López Saldaña, G.; Quaife, T. L.; Clifford, D.
2014-12-01
Understanding and quantifying land surface changes is necessary for estimating greenhouse gas and ammonia emissions, and for meeting air quality limits and targets. More sophisticated inventory methodologies for at least the key emission sources are needed to satisfy policy-driven air quality directives. Quantifying land cover changes on an annual basis requires greater spatial and temporal disaggregation of input data. The main aim of this study is to develop a methodology for using Earth Observations (EO) to identify annual land surface changes that will improve emissions inventories from agriculture and land use/land use change and forestry (LULUCF) in the UK. The first goal is to find the sets of input features that most accurately describe the surface dynamics. In order to identify annual and inter-annual land surface changes, a time series of surface reflectance was used to capture seasonal variability. Daily surface reflectance images from the Moderate Resolution Imaging Spectroradiometer (MODIS) at 500 m resolution were used to invert a Bidirectional Reflectance Distribution Function (BRDF) model to create the seamless time series. Given the limited number of cloud-free observations, a BRDF climatology was used to constrain the model inversion and, where no high-quality observations were available at all, to serve as a gap filler. The Land Cover Map 2007 (LC2007) produced by the Centre for Ecology & Hydrology (CEH) was used for training and testing purposes. A prototype land cover product was created for 2006 to 2008. Several machine learning classifiers were tested, as well as different sets of input features ranging from the BRDF parameters to spectral albedo. We present the results of the time series development and initial results from creating the prototype land cover product.
NASA Astrophysics Data System (ADS)
McGinty, N.; Johnson, M. P.; Power, A. M.
2012-07-01
Population dynamics in open systems are complicated by the interactions of local demography and local environmental forcing with processes occurring at larger scales. A local system such as an estuary or bay may contain a zooplankton population that effectively becomes independent of regional dynamics or the local dynamics may be closely coupled to a broader scale pattern. As an alternative, the details of migration and advection may mean that dynamics in a local system are coupled to other specific areas rather than tracking the overall dynamics at a larger scale. We used a reconstructed time series (1973-1987) for copepod taxa to examine the extent to which zooplankton dynamics in Galway Bay reflect processes in broader areas of the NE Atlantic. Continuous Plankton Recorder (CPR) counts were used to establish time series for nine offshore ecoregions, with the regions themselves defined using underlying patterns of chlorophyll variability. The open nature of Galway Bay was reflected in strong associations between bay zooplankton counts and offshore CPR data in a majority of cases (7/10). For each zooplankton taxon, there were large differences among regions in the degree of association with Galway Bay time series. Akaike weights indicated that one ecoregion tended to be the dominant link for each taxon. This indicates that the zooplankton of the Bay reflect more than the local modification of a regional signal and that different zooplankton in the bay may have separate source regions. The data from Galway Bay also fall within a 'sampling shadow' of the CPR. Later years of the time series showed evidence for changes in phenology, with spring zooplankton peaks generally occurring earlier in the year for smaller species.
Experimental measurements of hydrodynamic instabilities on NOVA of relevance to astrophysics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Budil, K S; Cherfils, C; Drake, R P
1998-09-11
Large lasers such as Nova allow the possibility of achieving regimes of high energy densities in plasmas of millimeter spatial scales and nanosecond time scales. In those plasmas where thermal conductivity and viscosity do not play a significant role, the hydrodynamic evolution is suitable for benchmarking hydrodynamics modeling in astrophysical codes. Several experiments on Nova examine hydrodynamically unstable interfaces. A typical Nova experiment uses a gold millimeter-scale hohlraum to convert the laser energy to a 200 eV blackbody source lasting about a nanosecond. The x-rays ablate a planar target, generating a series of shocks and accelerating the target. The evolving areal density is diagnosed by time-resolved radiography, using a second x-ray source. Data from several experiments are presented and diagnostic techniques are discussed.
NASA Astrophysics Data System (ADS)
Heine, Frank; Schwander, Thomas; Lange, Robert; Smutny, Berry
2006-04-01
Tesat-Spacecom has developed a series of fiber-coupled single-frequency lasers for space applications ranging from onboard metrology for space-borne FTIR spectrometers to step-tunable seed lasers for LIDAR applications. The cw seed laser developed for the ESA AEOLUS Mission shows a 3×10⁻¹¹ Allan variance from 1 s time intervals up to 1000 s. Q-switched lasers with stable beam pointing under space environments are another field of development. One important aspect of a space-borne laser system is a reliable fiber-coupled laser diode pump source around 808 nm. A dedicated development concerning chip design and packaging yielded a 5×10⁶ h MTTF (mean time to failure) for the broad-area emitters. Qualification and performance test results for the different laser assemblies will be presented, along with their application in the different space programs.
Empirical wind model for the middle and lower atmosphere. Part 2: Local time variations
NASA Technical Reports Server (NTRS)
Hedin, A. E.; Fleming, E. L.; Manson, A. H.; Schmidlin, F. J.; Avery, S. K.; Clark, R. R.; Franke, S. J.; Fraser, G. J.; Tsuda, T.; Vial, F.
1993-01-01
The HWM90 thermospheric wind model was revised in the lower thermosphere and extended into the mesosphere and lower atmosphere to provide a single analytic model for calculating zonal and meridional wind profiles representative of the climatological average for various geophysical conditions. Local time variations in the mesosphere are derived from rocket soundings, incoherent scatter radar, MF radar, and meteor radar. Low-order spherical harmonics and Fourier series are used to describe these variations as a function of latitude and day of year with cubic spline interpolation in altitude. The model represents a smoothed compromise between the original data sources. Although agreement between various data sources is generally good, some systematic differences are noted. Overall root mean square differences between measured and model tidal components are on the order of 5 to 10 m/s.
Lara, Juan A; Lizcano, David; Pérez, Aurora; Valente, Juan P
2014-10-01
There are now domains where information is recorded over a period of time, leading to sequences of data known as time series. In many domains, like medicine, time series analysis requires focusing on certain regions of interest, known as events, rather than analyzing the whole time series. In this paper, we propose a framework for knowledge discovery in both one-dimensional and multidimensional time series containing events. We show how our approach can be used to classify medical time series by means of a process that identifies events in time series, generates time series reference models of representative events, and compares two time series by analyzing the events they have in common. We have applied our framework to time series generated in the areas of electroencephalography (EEG) and stabilometry. Framework performance was evaluated in terms of classification accuracy, and the results confirmed that the proposed schema has potential for classifying EEG and stabilometric signals. The proposed framework is useful for discovering knowledge from medical time series containing events, such as stabilometric and electroencephalographic time series. These results would be equally applicable to other medical domains generating iconographic time series, such as electrocardiography (ECG). Copyright © 2014 Elsevier Inc. All rights reserved.
Tong, Yindong; Bu, Xiaoge; Chen, Junyue; Zhou, Feng; Chen, Long; Liu, Maodian; Tan, Xin; Yu, Tao; Zhang, Wei; Mi, Zhaorong; Ma, Lekuan; Wang, Xuejun; Ni, Jing
2017-01-05
Based on a time-series dataset and the mass balance method, the contributions of various sources to the nutrient discharges from the Yangtze River to the East China Sea are identified. The results indicate that the nutrient concentrations vary considerably among different sections of the Yangtze River. Non-point sources are an important source of nutrients to the Yangtze River, contributing about 36% and 63% of the nitrogen and phosphorus discharged into the East China Sea, respectively. Nutrient inputs from non-point sources vary among the sections of the Yangtze River, and the contributions of non-point sources increase from upstream to downstream. Considering the rice growing patterns in the Yangtze River Basin, the synchrony of rice tillering and the wet seasons might be an important cause of the high nutrient discharge from the non-point sources. Based on our calculations, a reduction of 0.99Tg per year in total nitrogen discharges from the Yangtze River would be needed to limit the occurrences of harmful algal blooms in the East China Sea to 15 times per year. The extensive construction of sewage treatment plants in urban areas may have only a limited effect on reducing the occurrences of harmful algal blooms in the future. Copyright © 2016 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Dobrovolný, Petr; Brázdil, Rudolf; Kotyza, Oldřich; Valášek, Hubert
2010-05-01
Series of temperature and precipitation indices (in ordinal scale) based on interpretation of various sources of documentary evidence (e.g. narrative written reports, visual daily weather records, personal correspondence, special prints, official economic records, etc.) are used as predictors in the reconstruction of mean seasonal temperatures and seasonal precipitation totals for the Czech Lands from A.D. 1500. Long instrumental measurements from 1771 (temperatures) and 1805 (precipitation) are used as target values to calibrate and verify the documentary-based index series. Reconstruction is based on linear regression with variance and mean adjustments. Reconstructed series were compared with similar European documentary-based reconstructions as well as with reconstructions based on different natural proxies. Reconstructed series were analyzed with respect to trends on different time-scales and the occurrence of extreme values. We discuss uncertainties typical for documentary evidence from historical archives. Although reports on weather and climate in documentary archives cover all seasons, our reconstructions provide the best results for winter temperatures and summer precipitation. However, the explained variance for these seasons is comparable to other existing reconstructions for Central Europe.
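The variance-and-mean adjustment mentioned above can be sketched directly: over the calibration overlap, the index series is shifted and rescaled so that its mean and standard deviation match the instrumental target, and the same transform then extends the reconstruction before the instrumental era. A minimal illustration with made-up numbers:

```python
import statistics as st

# Variance-and-mean adjustment sketch (illustrative data, not the paper's
# series): rescale a documentary index so its calibration-period mean and
# standard deviation match the instrumental target.

def scale_to_target(proxy, target):
    mp, sp = st.mean(proxy), st.pstdev(proxy)
    mt, sd_t = st.mean(target), st.pstdev(target)
    return [mt + (x - mp) * sd_t / sp for x in proxy]

proxy = [-3, -1, 0, 1, 3]             # e.g. winter temperature indices
target = [-4.0, -1.0, 0.5, 2.0, 5.0]  # instrumental seasonal means, deg C
recon = scale_to_target(proxy, target)
print(round(st.mean(recon), 6), round(st.pstdev(recon), 6))  # 0.5 3.0
```

Unlike ordinary regression, this scaling preserves the full variance of the proxy rather than shrinking it toward the calibration mean, which is why it is favored for reconstructing extremes.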
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meyer, L.; Witzel, G.; Ghez, A. M.
2014-08-10
Continuously time variable sources are often characterized by their power spectral density and flux distribution. These quantities can undergo dramatic changes over time if the underlying physical processes change. However, some changes can be subtle and not distinguishable using standard statistical approaches. Here, we report a methodology that aims to identify distinct but similar states of time variability. We apply this method to the Galactic supermassive black hole, where 2.2 μm flux is observed from a source associated with Sgr A* and where two distinct states have recently been suggested. Our approach is taken from mathematical finance and works with conditional flux density distributions that depend on the previous flux value. The discrete, unobserved (hidden) state variable is modeled as a stochastic process and the transition probabilities are inferred from the flux density time series. Using the most comprehensive data set to date, in which all Keck and a majority of the publicly available Very Large Telescope data have been merged, we show that Sgr A* is sufficiently described by a single intrinsic state. However, the observed flux densities exhibit two states: noise dominated and source dominated. Our methodology reported here will prove extremely useful to assess the effects of the putative gas cloud G2 that is on its way toward the black hole and might create a new state of variability.
Falcone, Roger [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Advanced Light Source (ALS); Univ. of California, Berkeley, CA (United States). Dept. of Physics
2018-05-04
Summer Lecture Series 2008: Molecular movies of chemical reactions and material phase transformations need a strobe of x-rays, the penetrating light that reveals how atoms and molecules assemble in chemical and biological systems and complex materials. Roger Falcone, Director of the Advanced Light Source, will discuss a new generation of x-ray sources that will enable a new science of atomic dynamics on ultrafast timescales.
Perioperative outcome in dogs with hemoperitoneum: 83 cases (2005-2010).
Lux, Cassie N; Culp, William T N; Mayhew, Philipp D; Tong, Kim; Rebhun, Robert B; Kass, Philip H
2013-05-15
To characterize the clinical course of dogs with hemoperitoneum in the perioperative setting and to determine risk factors that may affect short-term outcome. Retrospective case series. 83 client-owned dogs. The medical records of dogs with hemoperitoneum that underwent surgery between 2005 and 2010 were reviewed. Data were analyzed to determine risk factors associated with perioperative outcome. The perioperative period was defined as the time from admission to the hospital for treatment of hemoperitoneum until the time of discharge or euthanasia (within the same visit). 13 of 83 (16%) dogs died or were euthanized in the perioperative period. The median hospitalization time for surviving dogs was 2 days (range, 1 to 5 days). The requirement for a massive transfusion with blood products was a negative prognostic indicator for hospital discharge. The source of bleeding was isolated to the spleen in 75 of 83 (90%) dogs; a splenic source of hemorrhage was determined to be a positive predictor of survival to discharge from the hospital. In the present study, factors associated with death and failure to be discharged from the hospital included tachycardia, a requirement for massive transfusion with blood products, and the development of respiratory disease secondary to suspected pulmonary thromboembolism or acute respiratory distress syndrome. The presence of disease within the spleen was positively associated with survival to discharge. Surgical intervention for treatment of hemoperitoneum, regardless of etiology, resulted in discharge from the hospital for 70 of the 83 (84%) dogs in this series.
Time series analysis of Carbon Monoxide from MOPITT over the Asian Continent from 2000-2004
NASA Astrophysics Data System (ADS)
Bhattacharjee, P. S.; Roy, P.
2005-12-01
The human population continues to grow and large parts of the world industrialize rapidly, causing changes in the global atmospheric chemistry. Carbon monoxide (CO) is a poisonous gas in the troposphere when highly concentrated, and is produced by fossil fuel combustion, biomass burning and through natural emissions from plants. It is also an important trace gas in the atmosphere and plays a major role in the atmospheric chemistry. We present a study of CO from the measurement of MOPITT (Measurement of Pollution in the Troposphere-Level 3 gridded data) instrument on NASA Terra satellite over India and Eastern Asia for the period of 2000-2004. Day- and night-time total column CO measurements are considered over the selected regions in India, China, Thailand and Japan. The selected regions comprise industrial cities in the Asian continent which form the source of high CO in the atmosphere. The time series data do not show an overall increasing or decreasing trend, but CO is affected by seasonal variations, wind, and precipitation patterns. East Asian regions have higher and wider seasonal fluctuations than the Indian region. CO total column values over the Bay of Bengal are also high and can be explained through wind patterns from the land towards the ocean. Although the sources of CO are mostly confined to the land, it is transported globally through the atmosphere, and has high concentrations over the ocean.
NASA Astrophysics Data System (ADS)
Tovar-Sánchez, Antonio; Serón, Juan; Marbà, Núria; Arrieta, Jesús M.; Duarte, Carlos M.
2010-06-01
We discuss Al, Ag, Cd, Co, Cr, Cu, Fe, Mn, Ni, Pb, and Zn contents in rhizomes of the seagrass Posidonia oceanica from the Balearic Archipelago over the last three decades. Time series of metal concentrations in P. oceanica were obtained by dating rhizomes using retrospective procedures. The highest concentrations of Al (174.73 μg g-1), Cd (3.56 μg g-1), Cr (1.34 μg g-1), Cu (32.15 μg g-1), Pb (8.51 μg g-1), and Zn (107.14 μg g-1) were measured in meadows located around the largest and most densely populated island (Mallorca Island). There was a general tendency for Ag concentrations to decrease with time (by up to 80% from 1990 to 2005 in samples from Mallorca Island), which could be attributed to a reduction of anthropogenic sources. Ni and Zn were the only elements that showed a consistent temporal trend in all samples, with concentrations increasing since 1996 at all studied stations; this trend matched the time series of UV-absorbing aerosol particles in the air (i.e., the aerosol index) over the Mediterranean region (r2 = 0.78, p < 0.001 for Cabrera Island), suggesting that P. oceanica could be an efficient recorder of dust events. A comparison of enrichment factors in rhizomes relative to average crustal material indicates that suspended aerosol is also the most likely source of Cr and Fe to P. oceanica.
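The crustal enrichment factor mentioned in the last sentence is conventionally computed by normalizing a metal to a crustal reference element such as Al. A minimal sketch, with placeholder abundances rather than the authors' reference values:

```python
# Hedged sketch of a crustal enrichment-factor calculation. The inputs
# are illustrative; real analyses use published average crustal
# abundances for the reference ratios.
def enrichment_factor(metal_sample, al_sample, metal_crust, al_crust):
    """EF = (M/Al)_sample / (M/Al)_crust. An EF near 1 suggests a
    crustal (e.g., dust) source; EF >> 1 suggests anthropogenic input."""
    return (metal_sample / al_sample) / (metal_crust / al_crust)
```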
Bao, Changjun; Hu, Jianli; Liu, Wendong; Liang, Qi; Wu, Ying; Norris, Jessie; Peng, Zhihang; Yu, Rongbin; Shen, Hongbing; Chen, Feng
2014-01-01
Objective This study aimed to describe the spatial and temporal trends of Shigella incidence rates in Jiangsu Province, People's Republic of China, and to explore complex risk modes facilitating Shigella transmission. Methods County-level incidence rates were analyzed using geographic information system (GIS) tools. Trend surface and incidence maps were produced to describe geographic distributions. Spatio-temporal cluster analysis and autocorrelation analysis were used to detect clusters. An autoregressive integrated moving average (ARIMA) time series model was successfully fitted to the number of monthly Shigella cases. A spatial correlation analysis and a case-control study were conducted to identify risk factors contributing to Shigella transmission. Results The far southwestern and northwestern areas of Jiangsu were the most affected. A cluster was detected in southwestern Jiangsu (LLR = 11674.74, P<0.001). The time series model was established as ARIMA (1, 12, 0), which predicted cases well from August to December 2011. Highways and water sources potentially contributed to spatial variation in Shigella incidence in Jiangsu. The case-control study confirmed not washing hands before dinner (OR = 3.64) and not having access to a safe water source (OR = 2.04) as the main risk factors for Shigella infection in Jiangsu Province. Conclusion Sanitation and hygiene should be improved in economically developed counties, while access to a safe water supply should be increased in impoverished areas. PMID:24416167
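The core idea behind an ARIMA(p, d, 0) model such as the one fitted above is differencing followed by autoregression. The toy sketch below shows that idea in pure Python; it is not the authors' model, and a real analysis would use a statistics package (e.g., statsmodels) rather than this hand-rolled fit.

```python
# Minimal differencing-plus-AR(1) sketch of the ARIMA idea.
def difference(series, d=1):
    """Apply d rounds of first differencing to remove trend/seasonality."""
    for _ in range(d):
        series = [b - a for a, b in zip(series, series[1:])]
    return series

def fit_ar1(series):
    """Least-squares estimate of phi in x[t] = phi * x[t-1] + noise."""
    x, y = series[:-1], series[1:]
    return sum(a * b for a, b in zip(x, y)) / sum(a * a for a in x)

def forecast_next(series, phi):
    """One-step-ahead forecast from the fitted AR(1)."""
    return phi * series[-1]
```

For example, `fit_ar1` recovers phi exactly on a noise-free geometric series, and the fitted model then supplies one-step forecasts of the differenced case counts.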
NASA Astrophysics Data System (ADS)
Demetrescu, C.; Dobrica, V.; Stefan, C.
2017-12-01
A rich scientific literature links length-of-day (LOD) fluctuations to geomagnetic field and flow oscillations in the fluid outer core. We demonstrate that the temporal evolution of the geomagnetic field shows several oscillations at decadal, inter-decadal, and sub-centennial time scales that are superimposed on a so-called inter-centennial constituent. We show that while the sub-centennial oscillations of the geomagnetic field, produced by torsional oscillations in the core, can be linked to oscillations of LOD at a similar time scale, the oscillations at decadal and sub-decadal time scales, of external origin, are found in LOD as well. We discuss these issues from the perspective of long time-span main field models (gufm1 - Jackson et al., 2000; COV-OBS - Gillet et al., 2013), which are used to retrieve time series of geomagnetic elements on a 2.5°x2.5° grid. The decadal and sub-decadal constituents of the time series of annual values of LOD and the geomagnetic field were separated as the cyclic component of a Hodrick-Prescott filtering applied to the data, and are shown to correlate strongly with variations of external sources such as the magnetospheric ring current.
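The Hodrick-Prescott filtering used above splits a series y into a smooth trend tau and a cyclic component y - tau, where tau minimizes the fit error plus a penalty lam on its second differences. The sketch below shows the standard closed-form solve; the smoothing parameter is an illustrative value, not the one the authors used for annual geomagnetic/LOD series.

```python
import numpy as np

# Hodrick-Prescott decomposition sketch: trend solves
# (I + lam * K'K) tau = y, with K the second-difference operator.
def hp_filter(y, lam=100.0):
    y = np.asarray(y, dtype=float)
    n = len(y)
    K = np.zeros((n - 2, n))            # second-difference operator
    for i in range(n - 2):
        K[i, i:i + 3] = [1.0, -2.0, 1.0]
    trend = np.linalg.solve(np.eye(n) + lam * (K.T @ K), y)
    return trend, y - trend             # (trend, cyclic component)
```

A useful sanity check: a perfectly linear series has zero second differences, so the filter returns it unchanged as trend, with a zero cyclic component.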
Hamid, Laith; Al Farawn, Ali; Merlet, Isabelle; Japaridze, Natia; Heute, Ulrich; Stephani, Ulrich; Galka, Andreas; Wendling, Fabrice; Siniatchkin, Michael
2017-07-01
The clinical routine of non-invasive electroencephalography (EEG) is usually performed with 8-40 electrodes, especially in long-term monitoring, infants, or emergency care. There is a need in clinical and scientific brain imaging for inverse solution methods that can reconstruct brain sources from such low-density EEG recordings. In this proof-of-principle paper we investigate the performance of the spatiotemporal Kalman filter (STKF) in EEG source reconstruction with 9, 19, and 32 electrodes. We used simulated EEG data of epileptic spikes generated from lateral frontal and lateral temporal brain sources using state-of-the-art neuronal population models. To validate the source reconstruction, we compared STKF results to the location of the simulated source and to the results of the standard inverse solution, low-resolution brain electromagnetic tomography (LORETA). The STKF consistently showed less localization bias than LORETA, especially as the number of electrodes decreased. The results encourage further research into the application of the STKF to source reconstruction of brain activity from low-density EEG recordings.
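The STKF generalizes the classical Kalman recursion to spatially coupled sources; the scalar toy below illustrates only the predict/update structure that such filters share. It is an assumption for illustration, not the STKF itself.

```python
# One-dimensional Kalman filter sketch: track a slowly varying level
# from noisy measurements. q = process noise variance, r = measurement
# noise variance; both are illustrative choices.
def kalman_1d(measurements, q=1e-3, r=0.1, x0=0.0, p0=1.0):
    x, p, estimates = x0, p0, []
    for z in measurements:
        p = p + q                # predict: uncertainty grows
        k = p / (p + r)          # Kalman gain
        x = x + k * (z - x)      # update: blend in the innovation
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates
```

With a constant true level, the estimate converges to that level while the gain settles to a steady-state value set by the q/r ratio.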
NASA Astrophysics Data System (ADS)
Daux, V.; Garcia de Cortazar-Atauri, I.; Yiou, P.; Chuine, I.; Garnier, E.; Ladurie, E. Le Roy; Mestre, O.; Tardaguila, J.
2012-09-01
We present an open-access dataset of grape harvest date (GHD) series compiled from international, French, and Spanish literature and from unpublished documentary sources held by public organizations and wine-growers. As of June 2011, the GHD dataset comprises 380 series, mainly from France (93% of the data), as well as series from Switzerland, Italy, Spain, and Luxembourg. The series are of variable length (from 1 to 479 data points; mean length, 45) and contain gaps of variable sizes (mean ratio of observations to series length of 0.74). The longest and most complete series are from Burgundy, Switzerland, the Southern Rhône valley, Jura, and Ile-de-France. The earliest harvest date in the dataset is from 1354 in Burgundy. The GHD series were grouped into 27 regions according to their location, geomorphological and geological criteria, and past and present grape varieties. The GHD regional composite series (GHD-RCS) were calculated and compared pairwise to assess their reliability, under the assumption that series close to one another are highly correlated. Most of the pairwise correlations are significant (p-value < 0.001) and strong (mean pairwise correlation coefficient of 0.58). As expected, the correlations tend to be higher when the vineyards are closer. The highest correlation (R = 0.91) is obtained between the High Loire Valley and the Ile-de-France GHD-RCS. The strong dependence of the vine cycle on temperature, and therefore the strong link between harvest dates and growing-season temperature, was also used to test the quality of the GHD series. The strongest correlations are obtained between the GHD-RCS and the temperature series of the nearest weather stations. Moreover, the GHD-RCS/temperature correlation maps show spatial patterns similar to temperature correlation maps. The stability of the correlations over time is explored. The most striking feature is their generalised deterioration in the late 19th and early 20th centuries. The possible effects on GHD of the phylloxera crisis, which took place at this time, are discussed. The median of all the standardized GHD-RCS was calculated. The distribution of extreme years in this general series is not homogeneous: extremely late years all occur during a two-century window from the early 17th to the early 19th century, while extremely early years are frequent during the 16th century and since the mid-19th century.
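The pairwise-correlation check described above has to cope with series containing gaps of variable sizes. One straightforward approach, sketched here as an illustration rather than the authors' implementation, is a Pearson correlation computed only over the years where both series have data (gaps encoded as None):

```python
# Pearson correlation over the overlap of two gappy annual series.
def pearson_overlap(a, b):
    pairs = [(x, y) for x, y in zip(a, b)
             if x is not None and y is not None]
    n = len(pairs)
    xs, ys = [p[0] for p in pairs], [p[1] for p in pairs]
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5
```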
NASA Astrophysics Data System (ADS)
Sánchez de la Campa, A. M.; de la Rosa, J. D.
2014-12-01
A ten-year (2003-2012) study of atmospheric aerosol was performed at an urban background monitoring station influenced by the ceramic industry in Bailén, southern Spain. Temporal trends of major and minor chemical components of PM10 over this long-term data series were investigated, showing that PM10 concentrations decreased steadily, and with statistical significance, over almost a decade. Measurements indicate a reduction of elements and components related to the industrial activity of brick-ceramic production (V, Cd, Rb, La, Cr, Ni, As, Pb and SO42-). Conversely, Cu levels show an increasing trend from the beginning of the study period, with the steepest step in 2011-2012, following the onset of the financial and economic crisis in 2008. A similar time evolution of Cu and of OC, EC, and K levels points to a local domestic combustion source, and a new biomass burning source has been identified. The chemical composition of olive tree logs suggests that the combustion of wood with high Cu concentrations can increase Cu concentrations in atmospheric particles compared with other sources such as traffic.
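The abstract does not state which trend test was used; one common choice for annual concentration data is an ordinary least-squares slope with its t-statistic, sketched below as an illustration under that assumption.

```python
# OLS trend sketch: slope of y against time index 0..n-1, with the
# t-statistic of the slope (|t| large => statistically significant trend).
def trend_t_stat(y):
    n = len(y)
    t_idx = list(range(n))
    mt, my = sum(t_idx) / n, sum(y) / n
    sxx = sum((t - mt) ** 2 for t in t_idx)
    sxy = sum((t - mt) * (v - my) for t, v in zip(t_idx, y))
    slope = sxy / sxx
    resid = [v - (my + slope * (t - mt)) for t, v in zip(t_idx, y)]
    se = (sum(r * r for r in resid) / (n - 2) / sxx) ** 0.5
    return slope, slope / se if se > 0 else float("inf")
```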
Detection of a sudden change of the field time series based on the Lorenz system.
Da, ChaoJiu; Li, Fang; Shen, BingLu; Yan, PengCheng; Song, Jian; Ma, DeShan
2017-01-01
We conducted an exploratory study of the detection of a sudden change in a field time series, based on the numerical solution of the Lorenz system. First, the times when the Lorenz path jumped between the regions to the left and right of the equilibrium point of the Lorenz system were quantitatively marked, giving the sudden change times of the system. Second, the numerical solution of the Lorenz system was regarded as a vector, so that the solution could be treated as a vector time series. We transformed the vector time series into a scalar time series using the vector inner product, taking into account the geometric and topological features of the Lorenz path. Third, sudden changes in the resulting time series were detected using the sliding t-test method. Comparing the test results with the quantitatively marked times showed that the method detected every sudden change of the Lorenz path, demonstrating its effectiveness. Finally, we used the method to detect sudden changes in pressure field and temperature field time series, obtaining good results for both, which indicates that the method can be applied to high-dimensional vector time series. Mathematically, there is no essential difference between a field time series and a vector time series; thus, we provide a new method for detecting sudden changes in field time series.
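The sliding t-test named above compares, at each index, the means of the preceding and following windows with a two-sample t statistic; a sudden mean shift produces a sharp peak in |t|. The sketch below uses illustrative window settings, not the paper's.

```python
# Sliding t-test sketch for detecting a sudden mean shift.
def sliding_t(series, window=10):
    stats = []
    for i in range(window, len(series) - window):
        left = series[i - window:i]
        right = series[i:i + window]
        ml, mr = sum(left) / window, sum(right) / window
        vl = sum((x - ml) ** 2 for x in left) / (window - 1)
        vr = sum((x - mr) ** 2 for x in right) / (window - 1)
        pooled = ((vl + vr) / window) ** 0.5
        t = (mr - ml) / pooled if pooled > 0 else 0.0
        stats.append((i, t))
    return stats
```

On a series with a single level shift, the index with the largest |t| lands at (or immediately beside) the true change point.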
Automated Analysis of Renewable Energy Datasets ('EE/RE Data Mining')
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bush, Brian; Elmore, Ryan; Getman, Dan
This poster illustrates methods to substantially improve the understanding of renewable energy datasets, and the depth and efficiency of their analysis, through the application of statistical learning methods ('data mining') to the intelligent processing of these often large and messy information sources. The six examples apply methods for anomaly detection, data cleansing, and pattern mining to time-series data (measurements from metering points in buildings) and spatiotemporal data (renewable energy resource datasets).
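One of the named techniques, anomaly detection on metered time-series data, can be sketched as a simple z-score screen; this is a generic illustration, not the poster's implementation.

```python
# Flag points whose z-score exceeds a threshold; a crude but common
# first pass over messy meter data (threshold is an illustrative choice).
def zscore_anomalies(values, threshold=3.0):
    n = len(values)
    mean = sum(values) / n
    std = (sum((v - mean) ** 2 for v in values) / n) ** 0.5
    return [i for i, v in enumerate(values)
            if std > 0 and abs(v - mean) / std > threshold]
```

In a data-cleansing pipeline the flagged indices would then be inspected, imputed, or masked before pattern mining is applied.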
Enhancing programming logic thinking using analogy mapping
NASA Astrophysics Data System (ADS)
Sukamto, R. A.; Megasari, R.
2018-05-01
Programming logic thinking is among the most important competences for computer science students, yet programming is one of the most difficult subjects in a computer science program. This paper reports our work on enhancing students' programming logic thinking using Analogy Mapping in a basic programming course. Analogy Mapping is a computer application that converts source code into analogy images. This research used a time series evaluation, and the results showed that Analogy Mapping can enhance students' programming logic thinking.
2009-06-01
isolation. In addition to being inherently multi-modal, human perception takes advantage of multiple sources of information within a single modality...restriction was reasonable for the applications we looked at. However, consider using a TIM to model a teacher-student relationship among moving objects...That is, imagine one teacher object demonstrating a behavior for a student object. The student can observe the teacher and then recreate the behavior
1987-02-01
flowcharting. 3. Program Coding in HLL. This stage consists of transcribing the previously designed program into a form that can be translated into the machine...specified conditions 7. Documentation. Program documentation is necessary for user information, for maintenance, and for future applications. Flowcharts ...particular CPU. Asynchronous. Operating without reference to an overall timing source. BASIC. Beginners' All-purpose Symbolic Instruction Code; a widely
Continuous slope-area discharge records in Maricopa County, Arizona, 2004–2012
Wiele, Stephen M.; Heaton, John W.; Bunch, Claire E.; Gardner, David E.; Smith, Christopher F.
2015-12-29
Analyses of error sources, and of the impact that stage-data errors have on calculated discharge time series, are presented, along with issues in data reduction. Steeper, longer stream reaches are generally less sensitive to measurement error. Other issues considered are pressure-transducer drawdown, capture of flood peaks with discrete stage data, selection of the stage record used to develop rating curves, and minimum stages for the calculation of discharge.
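Continuous slope-area records rest on the slope-area (Manning) relation Q = (1/n) * A * R^(2/3) * S^(1/2) in SI units. The sketch below evaluates it for a simplified rectangular channel; the geometry and roughness value are illustrative assumptions, not the reaches or calibrations in the report.

```python
# Manning discharge for a rectangular channel (SI units).
# width, depth in m; slope dimensionless; n = Manning roughness.
def manning_discharge(width, depth, slope, n=0.035):
    area = width * depth                      # flow area, m^2
    wetted = width + 2.0 * depth              # wetted perimeter, m
    radius = area / wetted                    # hydraulic radius, m
    return (1.0 / n) * area * radius ** (2.0 / 3.0) * slope ** 0.5
```

Because discharge grows with both stage and energy slope, errors in the stage record propagate directly into the computed discharge series, which is why the error analyses above focus on stage data.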