Modelling of Vortex-Induced Loading on a Single-Blade Installation Setup
NASA Astrophysics Data System (ADS)
Skrzypiński, Witold; Gaunaa, Mac; Heinz, Joachim
2016-09-01
Vortex-induced integral loading fluctuations on a single suspended blade at various inflow angles were modelled in the present work by means of stochastic modelling methods. The reference time series were obtained from 3D DES CFD computations carried out on the DTU 10MW reference wind turbine blade. In the reference time series, the flapwise force component, Fx, showed both higher absolute values and higher variation than the chordwise force component, Fz, for every inflow angle considered. For this reason, the present paper focused on modelling Fx rather than Fz, although Fz could be modelled using exactly the same procedure. The reference time series differed significantly depending on the inflow angle, which made modelling all of them with a single, relatively simple engineering model challenging. To find model parameters, optimizations were carried out based on the root-mean-square error between the Single-Sided Amplitude Spectra of the reference and modelled time series. To model the well-defined frequency peaks present at certain inflow angles, optimized sine functions were superposed on the stochastically modelled time series. The results showed that the modelling accuracy varied depending on the inflow angle. Nonetheless, the modelled and reference time series showed satisfactory general agreement in terms of their visual and frequency characteristics, indicating that the proposed method is suitable for modelling loading fluctuations on suspended blades.
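The spectral-matching objective described above can be sketched as follows — a minimal, hypothetical illustration (the frequency, amplitude, and noise level are invented, not taken from the paper) of superposing a sine on a stochastic series and scoring the fit by the RMS error between single-sided amplitude spectra:

```python
import cmath, math, random

def ss_amplitude_spectrum(x):
    """Single-sided amplitude spectrum of a real series (naive DFT)."""
    n = len(x)
    spec = []
    for k in range(n // 2 + 1):
        s = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
        amp = abs(s) / n
        if 0 < k < n / 2:          # double all bins except DC and Nyquist
            amp *= 2.0
        spec.append(amp)
    return spec

def spectrum_rmse(a, b):
    """Root-mean-square error between two amplitude spectra."""
    return math.sqrt(sum((u - v) ** 2 for u, v in zip(a, b)) / len(a))

# Illustrative model: stochastic series plus a sine at an assumed peak frequency
rng = random.Random(0)
n, f_peak, amp = 128, 10, 1.5          # assumed values, not from the paper
noise = [rng.gauss(0.0, 0.3) for _ in range(n)]
model = [noise[t] + amp * math.sin(2 * math.pi * f_peak * t / n) for t in range(n)]
spec = ss_amplitude_spectrum(model)
# the superposed sine contributes a well-defined peak at bin f_peak
```

In an optimization loop, `spectrum_rmse` between the reference and modelled spectra would serve as the objective to be minimized over the sine and noise parameters.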
NASA Astrophysics Data System (ADS)
Wu, Xiaoping; Abbondanza, Claudio; Altamimi, Zuheir; Chin, T. Mike; Collilieux, Xavier; Gross, Richard S.; Heflin, Michael B.; Jiang, Yan; Parker, Jay W.
2015-05-01
The current International Terrestrial Reference Frame is based on a piecewise linear site motion model and realized by reference epoch coordinates and velocities for a global set of stations. Although linear motions due to tectonic plates and glacial isostatic adjustment dominate geodetic signals, at today's millimeter precisions, nonlinear motions due to earthquakes, volcanic activities, ice mass losses, sea level rise, hydrological changes, and other processes become significant. Monitoring these (sometimes rapid) changes requires consistent and precise realization of the terrestrial reference frame (TRF) quasi-instantaneously. Here, we use a Kalman filter and smoother approach to combine time series from four space geodetic techniques to realize an experimental TRF through weekly time series of geocentric coordinates. In addition to secular, periodic, and stochastic components for station coordinates, the Kalman filter state variables also include daily Earth orientation parameters and transformation parameters from input data frames to the combined TRF. Local tie measurements among colocated stations are used at their known or nominal epochs of observation, with comotion constraints applied to almost all colocated stations. The filter/smoother approach unifies different geodetic time series in a single geocentric frame. Fragmented and multitechnique tracking records at colocation sites are bridged together to form longer and coherent motion time series. While the time series approach to TRF reflects the reality of a changing Earth more closely than the linear approximation model, the filter/smoother is computationally efficient and flexible, facilitating the incorporation of other data types and more advanced characterization of the stochastic behavior of geodetic time series.
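A minimal sketch of the filtering idea, reduced to a single coordinate with only the secular part of the state vector (the paper's full state also carries periodic and stochastic terms, Earth orientation parameters, and frame transformation parameters; the noise values here are illustrative, not the operational settings):

```python
def kalman_track_position(obs, dt=7 / 365.25, q_rate=1e-4, r_obs=1e-8):
    """Forward Kalman filter for a [position, rate] state observed weekly.
    A simplified stand-in for the secular component of a station's motion."""
    x = [obs[0], 0.0]                    # state: position, rate
    P = [[1.0, 0.0], [0.0, 1.0]]         # state covariance
    out = []
    for z in obs:
        # predict: position advances by rate*dt; rate follows a random walk
        x = [x[0] + x[1] * dt, x[1]]
        P = [[P[0][0] + dt * (P[0][1] + P[1][0]) + dt * dt * P[1][1],
              P[0][1] + dt * P[1][1]],
             [P[1][0] + dt * P[1][1],
              P[1][1] + q_rate]]
        # update with the scalar position observation z
        s = P[0][0] + r_obs
        k0, k1 = P[0][0] / s, P[1][0] / s
        innov = z - x[0]
        x = [x[0] + k0 * innov, x[1] + k1 * innov]
        P = [[(1 - k0) * P[0][0], (1 - k0) * P[0][1]],
             [P[1][0] - k1 * P[0][0], P[1][1] - k1 * P[0][1]]]
        out.append(x[0])
    return out, x[1]

# sanity check on noise-free linear motion (0.02 units/yr, weekly sampling)
dt = 7 / 365.25
obs = [0.02 * i * dt for i in range(400)]
track, rate = kalman_track_position(obs, dt=dt)
```

A smoother pass (running backwards over the stored states) would refine these forward estimates, as done in the combination described above.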
75 FR 4570 - Government-Owned Inventions; Availability for Licensing
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-28
... applications. Signal-to-Noise Enhancement in Imaging Applications Using a Time-Series of Images Description of... applications that use a time-series of images. In one embodiment of the invention, a time-series of images is... Imaging Applications Using a Time-Series of Images'' (HHS Reference No. E-292- 2009/0-US-01). Related...
NASA Astrophysics Data System (ADS)
Wang, H.; Cheng, J.
2017-12-01
A method to synthesize natural electric and magnetic time series is proposed, whereby the time series at a local site are derived using an impulse response and a reference (STIR). The method is based on the assumptions that the external source of the magnetic fields is uniform and that the electric and magnetic fields acquired at the surface satisfy a time-independent linear relation in the frequency domain. According to the convolution theorem, we can synthesize natural electric and magnetic time series using the impulse responses of inter-station transfer functions with a reference. Applying this method, two impulse responses need to be estimated: the quasi-MT impulse response tensor and the horizontal magnetic impulse response tensor. These impulse response tensors relate the local horizontal electric and magnetic components, respectively, to the horizontal magnetic components at a reference site. Clean segments of the time series are selected to estimate the impulse responses by the least-squares (LS) method. STIR is similar to STIN (Wang, 2017), but STIR does not need to estimate the inter-station transfer functions, and the synthesized data are more accurate at high frequencies, where STIN fails when the inter-station transfer functions are severely contaminated. A test with good-quality MT data shows that the synthetic time series are similar to the natural electric and magnetic time series. For a contaminated AMT example, when this method is used to remove noise present at the local site, the scatter of the MT sounding curves is clearly reduced and the data quality is improved. *This work is funded by the National Key R&D Program of China (2017YFC0804105), the National Natural Science Foundation of China (41604064, 51574250), and the State Key Laboratory of Coal Resources and Safe Mining, China University of Mining & Technology (SKLCRSM16DC09).
NASA Astrophysics Data System (ADS)
Caro Cuenca, Miguel; Esfahany, Sami Samiei; Hanssen, Ramon F.
2010-12-01
Persistent scatterer radar interferometry (PSI) can provide a wealth of information on surface motion. These methods overcome the major limitations of the preceding technique, interferometric SAR (InSAR), such as atmospheric disturbances, by detecting scatterers that are only slightly affected by noise. The time span over which surface deformation processes can be observed is limited by the satellite lifetime, usually less than 10 years, yet most deformation phenomena last longer. In order to fully monitor and comprehend the observed signal, acquisitions from different sensors can be merged. This is a complex task for one main reason: PSI methods provide estimates that are relative in time to one of the acquisitions, referred to as the master or reference image. Time series acquired by different sensors will therefore have different reference images and cannot be directly compared or joined unless they are set to the same time reference system. In global terms, the operation of translating from one reference system to another consists of calculating a vertical offset, which is the total deformation that occurs between the two master times. To estimate this offset, different strategies can be applied, for example using additional data such as levelling or GPS measurements. In this contribution we propose a least-squares approach to merge PSI time series without any ancillary information. The method treats the time series individually, i.e. per PS, and requires some knowledge of the deformation signal, for example whether a polynomial would fairly describe the expected behavior. To test the proposed approach, we applied it to the southern Netherlands, where the surface is affected by groundwater processes in abandoned mines. The time series were obtained by processing images provided by ERS-1/2 and Envisat. The results were validated against in-situ water measurements, which show very high correlation with the deformation time series.
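The offset-estimation step can be illustrated with a toy least-squares sketch, assuming a linear deformation model (the contribution allows general polynomials; `fit_line` and `merge_offset` are hypothetical names for illustration only):

```python
def fit_line(t, y):
    """Ordinary least squares for y = a + b*t (closed form)."""
    n = len(t)
    tm = sum(t) / n
    ym = sum(y) / n
    b = sum((ti - tm) * (yi - ym) for ti, yi in zip(t, y)) / \
        sum((ti - tm) ** 2 for ti in t)
    return ym - b * tm, b

def merge_offset(t1, y1, t2, y2):
    """Estimate the vertical offset that aligns a second PSI series with
    the first, assuming one linear deformation model holds for both."""
    a, b = fit_line(t1, y1)
    # mean misfit of the second series against the extrapolated model
    residual = [yi - (a + b * ti) for ti, yi in zip(t2, y2)]
    return sum(residual) / len(residual)

# toy example: second series measures the same 0.5/yr motion, shifted by -3
t1 = list(range(10))
y1 = [2 + 0.5 * t for t in t1]
t2 = list(range(10, 15))
y2 = [2 + 0.5 * t - 3 for t in t2]
offset = merge_offset(t1, y1, t2, y2)
```

Subtracting `offset` from the second series places both time series in the same vertical reference, which is exactly the translation between master epochs described above.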
Nonlinear time-series-based adaptive control applications
NASA Technical Reports Server (NTRS)
Mohler, R. R.; Rajkumar, V.; Zakrzewski, R. R.
1991-01-01
A control design methodology based on a nonlinear time-series reference model is presented. It is indicated by highly nonlinear simulations that such designs successfully stabilize troublesome aircraft maneuvers undergoing large changes in angle of attack as well as large electric power transients due to line faults. In both applications, the nonlinear controller was significantly better than the corresponding linear adaptive controller. For the electric power network, a flexible AC transmission system with series capacitor power feedback control is studied. A bilinear autoregressive moving average reference model is identified from system data, and the feedback control is manipulated according to a desired reference state. The control is optimized according to a predictive one-step quadratic performance index. A similar algorithm is derived for control of rapid changes in aircraft angle of attack over a normally unstable flight regime. In the latter case, however, a generalization of a bilinear time-series model reference includes quadratic and cubic terms in angle of attack.
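For a first-order bilinear model, the one-step quadratic performance index admits a closed-form minimizer, since the predicted output is affine in the control. A simplified sketch with illustrative coefficients (not the identified aircraft or power-network models from the paper):

```python
def one_step_bilinear_control(y, r, a, b, c, lam):
    """One-step-ahead quadratic-cost control for the bilinear model
        y[k+1] = a*y[k] + (b + c*y[k]) * u[k]
    minimizing (y[k+1] - r)**2 + lam * u**2 (closed form)."""
    g = b + c * y                        # effective input gain at this state
    return g * (r - a * y) / (g * g + lam)

# drive the toy bilinear plant toward the reference r = 1.0
a, b, c, lam = 0.9, 0.5, 0.2, 1e-3       # illustrative coefficients
y, r = 0.0, 1.0
for _ in range(30):
    u = one_step_bilinear_control(y, r, a, b, c, lam)
    y = a * y + (b + c * y) * u          # plant update
```

The control-weight `lam` trades tracking error against control effort; with `lam` small, the state settles very close to the reference in a few steps.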
GPS Position Time Series @ JPL
NASA Technical Reports Server (NTRS)
Owen, Susan; Moore, Angelyn; Kedar, Sharon; Liu, Zhen; Webb, Frank; Heflin, Mike; Desai, Shailen
2013-01-01
Different flavors of GPS time series analysis at JPL all use the same GPS Precise Point Positioning Analysis raw time series; variations in time series analysis and post-processing are driven by different users:
- JPL Global Time Series/Velocities - researchers studying the reference frame, combining with VLBI/SLR/DORIS
- JPL/SOPAC Combined Time Series/Velocities - crustal deformation for tectonic, volcanic, and ground water studies
- ARIA Time Series/Coseismic Data Products - hazard monitoring and response focused
The ARIA data system is designed to integrate GPS and InSAR: GPS tropospheric delay is used for correcting InSAR, and Caltech's GIANT time series analysis uses GPS to correct orbital errors in InSAR. Zhen Liu is talking tomorrow on InSAR time series analysis.
Weighted combination of LOD values split into frequency windows
NASA Astrophysics Data System (ADS)
Fernandez, L. I.; Gambis, D.; Arias, E. F.
In this analysis a one-day combined time series of LOD (length-of-day) estimates is presented. We use individual data series derived by 7 GPS and 3 SLR analysis centers that routinely contribute to the IERS database, over a recent 27-month period (Jul 1996 - Oct 1998). The result is compared to the multi-technique combined series C04 produced by the Central Bureau of the IERS, which is commonly used as a reference for the study of Earth rotation variations. The Frequency Windows Combined Series procedure yields a time series that is close to C04 but shows an amplitude difference that might explain the evident periodic behavior present in the differences between these two combined series. This method could be useful for generating a new time series to be used as a reference in studies of high-frequency variations of the Earth's rotation.
76 FR 2665 - Caribbean Fishery Management Council; Scoping Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-14
... time series of catch data that is considered to be consistently reliable across all islands as defined... based on what the Council considers to be the longest time series of catch data that is consistently... preferred management reference point time series. Action 3b. Recreational Bag Limits Option 1: No action. Do...
NASA Astrophysics Data System (ADS)
Deng, Liansheng; Jiang, Weiping; Li, Zhao; Chen, Hua; Wang, Kaihua; Ma, Yifang
2017-02-01
Higher-order ionospheric (HOI) delays are one of the principal technique-specific error sources in precise global positioning system analysis and have been proposed as a standard part of precise GPS data processing. In this research, we apply HOI delay corrections to the data processing of the Crustal Movement Observation Network of China (CMONOC) from January 2000 to December 2013 and provide quantitative results for the effects of HOI on CMONOC coordinate time series. The results for both a regional reference frame and a global reference frame are analyzed and compared to clarify the HOI effects on the CMONOC network. We find that HOI corrections can effectively reduce the semi-annual signals in the northern and vertical components. For sites with lower semi-annual amplitudes, the average decrease in magnitude can reach 30 and 10% for the northern and vertical components, respectively. The noise amplitudes with and without HOI corrections are not significantly different. Generally, the HOI effects on the CMONOC network in a global reference frame are less obvious than those in the regional reference frame, probably because the HOI-induced errors are small in comparison to the higher noise levels seen when using a global reference frame. Furthermore, we investigate the combined contributions of environmental loading and HOI effects on the CMONOC stations. The largest loading effects on the vertical displacement are found in the mid- to high-latitude areas. The weighted root mean square differences between the corrected and original weekly GPS height time series after applying the loading model indicate that the mass loading adequately reduced the scatter of the CMONOC height time series, whereas the results in the global reference frame showed better agreement between the GPS coordinate time series and the environmental loading.
When the effects of environmental loading and HOI corrections are combined, the solutions with HOI corrections reduce the scatter of the observed GPS height coordinates more than those estimated without HOI corrections, and the combined solutions in the regional reference frame show the larger improvement. Therefore, regional reference frames are recommended for investigating HOI effects on regional networks.
Radiation: Time, Space and Spirit--Keys to Scientific Literacy Series.
ERIC Educational Resources Information Center
Stonebarger, Bill
This discussion of radiation considers the spectrum of electromagnetic energy, including light, x-rays, radioactivity, and other waves. Radiation is considered from three aspects: time, space, and spirit. Time refers to a sense of history; space refers to geography; and spirit refers to life and thought. Several chapters on the history and concepts…
Tormene, Paolo; Giorgino, Toni; Quaglini, Silvana; Stefanelli, Mario
2009-01-01
The purpose of this study was to assess the performance of a real-time ("open-end") version of the dynamic time warping (DTW) algorithm for the recognition of motor exercises. Given a possibly incomplete input stream of data and a reference time series, the open-end DTW algorithm computes both the size of the prefix of the reference which is best matched by the input, and the dissimilarity between the matched portions. The algorithm was used to provide real-time feedback to neurological patients undergoing motor rehabilitation. We acquired a dataset of multivariate time series from a sensorized long-sleeve shirt containing 29 strain sensors distributed on the upper limb. Seven typical rehabilitation exercises were recorded in several variations, both correctly and incorrectly executed, and at various speeds, totaling a data set of 840 time series. Nearest-neighbour classifiers were built according to the outputs of open-end DTW alignments and their global counterparts on exercise pairs. The classifiers were also tested on well-known public datasets from heterogeneous domains. Nonparametric tests show that (1) on full time series the two algorithms achieve the same classification accuracy (p-value = 0.32); (2) on partial time series, classifiers based on open-end DTW have a far higher accuracy (kappa = 0.898 versus kappa = 0.447; p < 10^-5); and (3) the prediction of the matched fraction follows the ground truth closely (root mean square < 10%). The results hold for the motor rehabilitation dataset and for the other datasets tested as well. The open-end variant of the DTW algorithm is suitable for the classification of truncated quantitative time series, even in the presence of noise. Early recognition and accurate class prediction can be achieved, provided that enough variance is available over the time span of the reference. The proposed technique therefore expands the use of DTW to a wider range of applications, such as real-time biofeedback systems.
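The open-end alignment can be sketched in a few lines of dynamic programming. This toy version uses univariate series and absolute difference as the local cost (the study used multivariate sensor data; only the prefix-relaxed end condition distinguishes it from global DTW):

```python
def open_end_dtw(query, reference):
    """Open-end DTW: align the WHOLE (possibly truncated) query against the
    best-matching PREFIX of the reference.
    Returns (matched_prefix_length, dissimilarity)."""
    n, m = len(query), len(reference)
    INF = float("inf")
    # D[i][j] = cost of aligning query[:i] with reference[:j]
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(query[i - 1] - reference[j - 1])
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    # open end: instead of reading D[n][m], take the best reference prefix
    j_best = min(range(1, m + 1), key=lambda j: D[n][j])
    return j_best, D[n][j_best]
```

Fed with a growing input stream, the returned prefix length provides the "fraction of exercise completed" signal used for real-time feedback, and the dissimilarity feeds the nearest-neighbour classifier.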
Nuclear Power: Time, Space and Spirit--Keys to Scientific Literacy Series.
ERIC Educational Resources Information Center
Stonebarger, Bill
One of the most important discoveries of the twentieth century was the fission of radioactive materials. This booklet considers nuclear energy from three aspects: time, space, and spirit. Time refers to a sense of history; space refers to geography; and spirit refers to life and thought. Several chapters on the history and concepts of nuclear…
Fluctuations in Cerebral Hemodynamics
2003-12-01
Determination of scaling properties: Detrended Fluctuation Analysis (see (28) and references therein) is commonly used to determine scaling. [Figure captions:] (a) Blood pressure (averaged over a cardiac beat) of a healthy subject; the first 1000 values of the time series are shown. (b) Detrended fluctuation analysis of the time series shown in (a). Fig. 3: Side-by-side boxplot.
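A minimal sketch of the DFA procedure referenced in this record (first-order detrending over non-overlapping windows; the series length and scale choices are illustrative, not taken from the study):

```python
import math, random

def dfa(series, scales):
    """Detrended fluctuation analysis: RMS of the linearly detrended
    cumulative profile, computed in non-overlapping windows per scale."""
    mean = sum(series) / len(series)
    profile, acc = [], 0.0
    for v in series:                      # cumulative sum of deviations
        acc += v - mean
        profile.append(acc)
    flucts = []
    for s in scales:
        sq, nwin = 0.0, len(profile) // s
        for w in range(nwin):
            seg = profile[w * s:(w + 1) * s]
            # least-squares line fit within the window
            tm, ym = (s - 1) / 2.0, sum(seg) / s
            b = sum((ti - tm) * (yi - ym) for ti, yi in enumerate(seg)) / \
                sum((ti - tm) ** 2 for ti in range(s))
            a = ym - b * tm
            sq += sum((yi - (a + b * ti)) ** 2 for ti, yi in enumerate(seg))
        flucts.append(math.sqrt(sq / (nwin * s)))
    return flucts

# scaling exponent = slope of log F(s) versus log s
rng = random.Random(1)
noise = [rng.gauss(0.0, 1.0) for _ in range(4096)]
f = dfa(noise, [8, 16, 32, 64])
alpha = (math.log(f[-1]) - math.log(f[0])) / (math.log(64) - math.log(8))
# for uncorrelated (white) noise, alpha should lie near 0.5
```

A physiological signal such as beat-averaged blood pressure would be analyzed the same way; the exponent distinguishes uncorrelated, persistent, and anti-persistent fluctuations.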
The Gene: Time, Space and Spirit--Keys to Scientific Literacy Series.
ERIC Educational Resources Information Center
Stonebarger, Bill
It has only been since the late nineteenth century that people have understood the mechanics of heredity, and the discoveries of genes and DNA are even more recent. This booklet considers three aspects of genetics: time, space, and spirit. Time refers to a sense of history; space refers to geography; and spirit refers to life and thought. Several…
The Expanding Universe: Time, Space and Spirit--Keys to Scientific Literacy Series.
ERIC Educational Resources Information Center
Stonebarger, Bill
Nearly every culture has made important discoveries about the universe. Most cultures have searched for a better understanding of the cosmos and how the earth and human life relate. The discussion in this booklet considers time, space, and spirit. Time refers to a sense of history; space refers to geography; and spirit refers to life and thought.…
Natural Resources: Time, Space and Spirit--Keys to Scientific Literacy Series.
ERIC Educational Resources Information Center
Stonebarger, Bill
Many experts have predicted a global crisis for the end of the twentieth century because of dwindling supplies of natural resources such as minerals, oil, gas, and soil. This booklet considers three aspects of natural resources: time, space, and spirit. Time refers to a sense of history; space refers to geography; and spirit refers to life and…
A DDC Bibliography on Computers in Information Sciences. Volume I. Information Sciences Series.
ERIC Educational Resources Information Center
Defense Documentation Center, Alexandria, VA.
The unclassified and unlimited bibliography compiles references dealing specifically with the role of computers in the information sciences. The volume contains 249 annotated references grouped under two major headings: Time Shared, On-Line, and Real Time Systems, and Computer Components. The references are arranged in accession number (AD-number)…
Living in Space: Time, Space and Spirit--Keys to Scientific Literacy Series.
ERIC Educational Resources Information Center
Stonebarger, Bill
The ideas of flight and space travel are not new, but the technologies which make them possible are very recent. This booklet considers time, space, and spirit related to living in space. Time refers to a sense of history; space refers to geography; and spirit refers to life and thought. Several chapters on the history and concepts of flight and…
Using Time-Series Research Designs to Investigate the Effects of Instruction on SLA.
ERIC Educational Resources Information Center
Mellow, J. Dean; And Others
1996-01-01
Argues that the study of second-language acquisition theory can be enhanced through time-series research designs. Within the context of investigating the effects of second-language instruction, four main reasons for using T-S design are identified. (95 references) (Author/CK)
Extending the soil moisture record of the climate reference network with machine learning
USDA-ARS?s Scientific Manuscript database
Soil moisture estimation is crucial for agricultural decision-support and a key component of hydrological and climatic research. Unfortunately, quality-controlled soil moisture time series data are uncommon before the most recent decade. However, time series data for precipitation are accessible at ...
NASA Astrophysics Data System (ADS)
Czechowski, Piotr Oskar; Owczarek, Tomasz; Badyda, Artur; Majewski, Grzegorz; Rogulski, Mariusz; Ogrodnik, Paweł
2018-01-01
The paper presents key issues from the preliminary stage of a proposed extended equivalence assessment for new portable devices: the comparability of hourly PM10 concentration series with reference station measurements, evaluated with statistical methods. Technical aspects of the new portable meters are presented. Emphasis is placed on a methodological concept for comparing the results using stochastic and exploratory methods. The concept is based on the observation that a simple comparison of the result series in the time domain is insufficient; the comparison of regularity should instead be done in three complementary fields of statistical modeling: time, frequency, and space. The proposal is based on modeling results from five annual series of measurements from the new mobile devices and from the WIOS (Provincial Environmental Protection Inspectorate) reference station located in the city of Nowy Sacz. The obtained results indicate both the completeness of the comparison methodology and the high correspondence of the new devices' measurements with the reference.
Lara, Juan A; Lizcano, David; Pérez, Aurora; Valente, Juan P
2014-10-01
There are now domains where information is recorded over a period of time, leading to sequences of data known as time series. In many domains, such as medicine, time series analysis requires focusing on certain regions of interest, known as events, rather than analyzing the whole time series. In this paper, we propose a framework for knowledge discovery in both one-dimensional and multidimensional time series containing events. We show how our approach can be used to classify medical time series by means of a process that identifies events in time series, generates time series reference models of representative events, and compares two time series by analyzing the events they have in common. We have applied our framework to time series generated in the areas of electroencephalography (EEG) and stabilometry. Framework performance was evaluated in terms of classification accuracy, and the results confirmed that the proposed schema has potential for classifying EEG and stabilometric signals. The proposed framework is useful for discovering knowledge from medical time series containing events, such as stabilometric and electroencephalographic time series. These results would be equally applicable to other medical domains generating iconographic time series, such as electrocardiography (ECG).
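The event-identification and comparison steps can be caricatured with a threshold-based event extractor. Real event definitions in EEG or stabilometry are domain-specific, so `extract_events` and the overlap score below are purely illustrative stand-ins:

```python
def extract_events(series, threshold):
    """Identify events as maximal runs of samples above a threshold,
    returned as (start, end) index pairs, end exclusive."""
    events, start = [], None
    for i, v in enumerate(series):
        if v > threshold and start is None:
            start = i
        elif v <= threshold and start is not None:
            events.append((start, i))
            start = None
    if start is not None:
        events.append((start, len(series)))
    return events

def shared_event_score(s1, s2, threshold):
    """Fraction of events in s1 that overlap some event in s2 - a crude
    proxy for comparing two series by the events they have in common."""
    e1, e2 = extract_events(s1, threshold), extract_events(s2, threshold)
    if not e1:
        return 0.0
    hits = sum(1 for a, b in e1 if any(a < d and c < b for c, d in e2))
    return hits / len(e1)

s1 = [0, 0, 5, 5, 0, 0, 5, 0]
s2 = [0, 5, 5, 0, 0, 0, 0, 0]
score = shared_event_score(s1, s2, 1)
```

A classifier in this spirit would assign a new series to the class whose reference event model it shares the most events with.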
Reference manual for generation and analysis of Habitat Time Series: version II
Milhous, Robert T.; Bartholow, John M.; Updike, Marlys A.; Moos, Alan R.
1990-01-01
The selection of an instream flow requirement for water resource management often requires the review of how the physical habitat changes through time. This review is referred to as "time series analysis." The Time Series Library (TSLIB) is a group of programs to enter, transform, analyze, and display time series data for use in stream habitat assessment. A time series may be defined as a sequence of data recorded or calculated over time. Examples might be historical monthly flow, predicted monthly weighted usable area, daily electrical power generation, annual irrigation diversion, and so forth. The time series can be analyzed, both descriptively and analytically, to understand the importance of the variation in the events over time. This is especially useful in the development of instream flow needs based on habitat availability. The TSLIB group of programs assumes that you have an adequate study plan to guide you in your analysis. You need to already have knowledge about such things as time period and time step, species and life stages to consider, and appropriate comparisons or statistics to be produced and displayed or tabulated. Knowing your destination, you must first evaluate whether TSLIB can get you there. Remember, data are not answers. This publication is a reference manual to TSLIB and is intended to be a guide to the process of using the various programs in TSLIB. This manual is essentially limited to the hands-on use of the various programs. A TSLIB user interface program (called RTSM) has been developed to provide an integrated working environment in which the user has a brief on-line description of each TSLIB program, with the capability to run the TSLIB programs from within the interface. For information on the RTSM program, refer to Appendix F. Before applying the computer models described herein, it is recommended that the user enroll in the short course "Problem Solving with the Instream Flow Incremental Methodology (IFIM)."
This course is offered by the Aquatic Systems Branch of the National Ecology Research Center. For more information about the TSLIB software, refer to the Memorandum of Understanding. Chapter 1 provides a brief introduction to the Instream Flow Incremental Methodology and TSLIB. Other chapters in this manual provide information on the different aspects of using the models. The information contained in the other chapters includes (2) acquisition, entry, manipulation, and listing of streamflow data; (3) entry, manipulation, and listing of the habitat-versus-streamflow function; (4) transferring streamflow data; (5) water resources systems analysis; (6) generation and analysis of daily streamflow and habitat values; (7) generation of the time series of monthly habitats; (8) manipulation, analysis, and display of monthly time series data; and (9) generation, analysis, and display of annual time series data. Each section includes documentation for the programs therein, with at least one page of information for each program, including a program description, instructions for running the program, and sample output. The Appendixes contain the following: (A) sample file formats; (B) descriptions of default filenames; (C) alphabetical summary of batch-procedure files; (D) installing and running TSLIB on a microcomputer; (E) running TSLIB on a CDC Cyber computer; (F) using the TSLIB user interface program (RTSM); and (G) running WATSTORE on the USGS Amdahl mainframe computer. The number for this version of TSLIB--Version II--is somewhat arbitrary, as the TSLIB programs were collected into a library some time ago, but operators tended to use and manage them as individual programs. Therefore, we will consider the group of programs from the past that were only on the CDC Cyber computer as Version 0; the programs from the past that were on both the Cyber and the IBM-compatible microcomputer as Version I; and the programs contained in this reference manual as Version II.
NASA Astrophysics Data System (ADS)
Menne, Matthew J.; Williams, Claude N., Jr.
2005-10-01
An evaluation of three hypothesis test statistics that are commonly used in the detection of undocumented changepoints is described. The goal of the evaluation was to determine whether the use of multiple tests could improve undocumented, artificial changepoint detection skill in climate series. The use of successive hypothesis testing is compared to optimal approaches, both of which are designed for situations in which multiple undocumented changepoints may be present. In addition, the importance of the form of the composite climate reference series is evaluated, particularly with regard to the impact of undocumented changepoints in the various component series that are used to calculate the composite. In a comparison of single-test changepoint detection skill, the composite reference series formulation is shown to be less important than the choice of the hypothesis test statistic, provided that the composite is calculated from serially complete and homogeneous component series. However, the evaluated composite series are not equally susceptible to the presence of changepoints in their components, which may be erroneously attributed to the target series. Moreover, a reference formulation that is based on averaging the first-difference component series is susceptible to random walks when the composition of the component series changes through time (e.g., values are missing), and its use is, therefore, not recommended. When more than one test is required to reject the null hypothesis of no changepoint, the number of detected changepoints is reduced proportionately less than the number of false alarms in a wide variety of Monte Carlo simulations. Consequently, a consensus of hypothesis tests appears to improve undocumented changepoint detection skill, especially when reference series homogeneity is violated.
A consensus of successive hypothesis tests using a semihierarchic splitting algorithm also compares favorably to optimal solutions, even when changepoints are not hierarchic.
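The kind of test statistic being compared can be illustrated with a maximal two-sample t statistic scanned over all split points — a common ingredient of undocumented-changepoint tests (a generic sketch, not one of the specific statistics evaluated in the paper):

```python
import math, random

def mean(v):
    return sum(v) / len(v)

def pooled_sd(a, b):
    """Pooled standard deviation of two segments."""
    ma, mb = mean(a), mean(b)
    ss = sum((x - ma) ** 2 for x in a) + sum((x - mb) ** 2 for x in b)
    return math.sqrt(ss / (len(a) + len(b) - 2))

def max_t_statistic(x):
    """Scan all split points; return (best_split, max |t|) for a mean shift."""
    best_k, best_t = None, 0.0
    for k in range(2, len(x) - 1):       # require >= 2 points per segment
        a, b = x[:k], x[k:]
        sd = pooled_sd(a, b)
        if sd == 0.0:
            continue
        t = abs(mean(a) - mean(b)) / (sd * math.sqrt(1 / len(a) + 1 / len(b)))
        if t > best_t:
            best_k, best_t = k, t
    return best_k, best_t

# series with an artificial mean shift at index 20 (illustrative values)
rng = random.Random(2)
x = [rng.gauss(0.0, 0.1) for _ in range(20)] + \
    [rng.gauss(1.0, 0.1) for _ in range(20)]
k, t = max_t_statistic(x)
```

In practice the candidate series is first differenced against a composite reference, and the maximal statistic is compared to critical values that account for the scan over all split points.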
NASA Astrophysics Data System (ADS)
Roche, Marc; Degrendele, Koen; Vrignaud, Christophe; Loyer, Sophie; Le Bas, Tim; Augustin, Jean-Marie; Lurton, Xavier
2018-06-01
The increased use of backscatter measurements in time series for environmental monitoring necessitates the comparability of individual results. Given the current lack of pre-calibrated multibeam echosounder systems for absolute backscatter measurement, a pragmatic solution is the use of natural reference areas to ensure regular assessment of backscatter measurement repeatability. This method relies mainly on the assumption of a sufficiently stable reference area with respect to its backscatter signature. The aptitude of a natural area to provide a stable and uniform backscatter response must be carefully considered and demonstrated by a sufficiently long time series of measurements. Furthermore, this approach requires strict control of the acquisition and processing parameters. If all these conditions are met, a stability check and relative calibration of a system are possible by comparison with the averaged backscatter values for the area. Based on a common multibeam echosounder and sampling campaign, complemented by available bathymetric and backscatter time series, the suitability of three different candidates as backscatter reference areas was evaluated. Two of them, Carré Renard and Kwinte, prove to be excellent choices, while the third, Western Solent, lacks sufficient data over time but remains a valuable candidate. The case studies and the available backscatter data for these areas prove the applicability of this method. Expanding the number of commonly used reference areas and the number of multibeam echosounders checked against them could greatly contribute to the further development of quantitative applications based on multibeam echosounder backscatter measurements.
Use of the Box and Jenkins time series technique in traffic forecasting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nihan, N.L.; Holmesland, K.O.
The use of recently developed time series techniques for short-term traffic volume forecasting is examined. A data set containing monthly volumes on a freeway segment for 1968-76 is used to fit a time series model. The resultant model is used to forecast volumes for 1977, and the forecast volumes are then compared with actual volumes in 1977. Time series techniques can be used to develop highly accurate and inexpensive short-term forecasts. The feasibility of using these models to evaluate the effects of policy changes or other outside impacts is considered. (1 diagram, 1 map, 14 references, 2 tables)
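A minimal sketch of the Box-Jenkins workflow the abstract describes: fit a low-order autoregression to an observed series, then iterate the fitted recursion forward to produce short-term forecasts. This is illustrative only; the study's actual model identification, differencing, and moving-average terms are not reproduced, and the helper names are hypothetical.

```python
import numpy as np

def fit_ar(y, p):
    """Fit an AR(p) model by ordinary least squares (a minimal stand-in
    for the Box-Jenkins estimation step)."""
    n = len(y)
    # column k holds y[t-(k+1)] for target rows t = p .. n-1
    X = np.column_stack([y[p - k - 1:n - k - 1] for k in range(p)])
    X = np.column_stack([np.ones(n - p), X])
    coef, *_ = np.linalg.lstsq(X, y[p:], rcond=None)
    return coef  # [intercept, phi_1, ..., phi_p]

def forecast(y, coef, steps):
    """Iterate the fitted recursion forward for multi-step point forecasts."""
    p = len(coef) - 1
    hist = list(y[-p:])
    out = []
    for _ in range(steps):
        yhat = coef[0] + sum(coef[k + 1] * hist[-k - 1] for k in range(p))
        hist.append(yhat)
        out.append(yhat)
    return np.array(out)
```

In the study's setting, the model would be fitted on the 1968-76 monthly volumes and `forecast(..., steps=12)` would give the 1977 comparison series.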
A high-fidelity weather time series generator using the Markov Chain process on a piecewise level
NASA Astrophysics Data System (ADS)
Hersvik, K.; Endrerud, O.-E. V.
2017-12-01
A method is developed for generating a set of unique weather time series based on an existing weather series. The method allows statistically valid weather variations to take place within repeated simulations of offshore operations. The numerous generated time series need to share the same statistical qualities as the original time series. Statistical qualities here refer mainly to the distribution of weather windows available for work, including durations and frequencies of such weather windows, and seasonal characteristics. The method is based on the Markov chain process. The core new development lies in how the Markov process is used, specifically by joining small pieces of random-length time series together rather than joining individual weather states, each from a single time step, which is the common solution found in the literature. This new Markov model shows favorable characteristics with respect to the requirements set forth and all aspects of the validation performed.
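The piecewise idea can be sketched as follows: discretize the observed series into a few states, then chain random-length pieces of the observed record, requiring each new piece to start in the same state in which the previous piece ended. This is a hypothetical simplification of the paper's generator (the function, state definition, and segment lengths are assumptions, not the authors' implementation).

```python
import numpy as np

def generate_piecewise(obs, seg_min=3, seg_max=24, length=500, seed=0):
    """Chain random-length pieces of `obs` into a synthetic series.
    States are quartile bins of the observed values; each piece must
    start in the state where the previous piece ended."""
    rng = np.random.default_rng(seed)
    states = np.digitize(obs, np.quantile(obs, [0.25, 0.5, 0.75]))
    out = []
    i = int(rng.integers(0, len(obs) - seg_max))  # random starting point
    while len(out) < length:
        n = int(rng.integers(seg_min, seg_max + 1))
        out.extend(obs[i:i + n])
        end_state = states[min(i + n - 1, len(obs) - 1)]
        # candidate start points in the same discretized state
        cand = np.flatnonzero(states[:len(obs) - seg_max] == end_state)
        i = int(rng.choice(cand))
    return np.array(out[:length])
```

Because pieces are literal copies of observed data, short-range structure such as weather-window durations is inherited from the original record, which is the property the abstract emphasizes.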
76 FR 41454 - Caribbean Fishery Management Council; Scoping Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-14
... based on alternative selected in Action 3(a) and time series of landings data as defined in Action 1(a...., Puerto Rico, St. Thomas/St. John, St. Croix) based on the preferred management reference point time series selected by the Council in Actions 1(a) and 2(a). Alternative 2A. Use a mid-point or equidistant...
Koskinen, Hanna; Mikkola, Hennamari; Saastamoinen, Leena K; Ahola, Elina; Martikainen, Jaana E
2015-12-01
To analyze the medium- to long-term impact of generic substitution and the reference price system on the daily cost of antipsychotics in Finland. The additional impact of reference pricing over and above previously implemented generic substitution was also assessed. An interrupted time series design with a control group and segmented regression analysis was used to estimate the effect of the implementation of generic substitution and the reference price system on the daily cost of antipsychotics. The data comprise 69 monthly values of the average daily cost for each of the studied antipsychotics: 39 months before and 30 months after the introduction of reference pricing. For one of the studied antipsychotics, the time before the introduction of reference pricing could be further divided into time before and after the introduction of generic substitution. According to the model, 2.5 years after the implementation of reference pricing, the daily cost of the studied antipsychotics was 24.6% to 50.6% lower than it would have been if reference pricing had not been implemented. Two and a half years after the implementation of the reference price system, however, the additional impact of reference pricing over and above previously implemented generic substitution was modest, less than 1 percentage point. Although the price competition induced by reference pricing decreased the prices of antipsychotics in Finland in the short term, the prices had a tendency to stagnate or even to turn in an upward direction in the medium to long term. Furthermore, the additional impact of reference pricing over and above previously implemented generic substitution remained quite modest. Copyright © 2015 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
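Segmented regression for an interrupted time series, as used in this study, fits a baseline level and trend plus a level change and a trend change at the intervention time. A minimal sketch (not the authors' exact model, which also includes a control group):

```python
import numpy as np

def segmented_fit(y, t0):
    """Fit level, trend, level change at t0, and trend change at t0
    by ordinary least squares."""
    t = np.arange(len(y), dtype=float)
    post = (t >= t0).astype(float)  # indicator: after the intervention
    X = np.column_stack([np.ones_like(t), t, post, post * (t - t0)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta  # [level, trend, level change, trend change]
```

The counterfactual "cost had reference pricing not been implemented" is then the baseline level-and-trend line extrapolated past t0, and the percentage reductions quoted in the abstract compare the fitted post-intervention series against that extrapolation.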
Assessment of time-series MODIS data for cropland mapping in the U.S. central Great Plains
NASA Astrophysics Data System (ADS)
Masialeti, Iwake
This study had three general objectives. First, to explore ways of creating and refining a reference data set when a reference data set is unobtainable. Second, to extend work previously done in Kansas by Wardlow et al. (2007) to Nebraska; several exploratory approaches were used to further investigate the potential of MODIS NDVI 250-m data for agriculture-related land cover research in other parts of the Great Plains. The objective of this part of the research was to evaluate the applicability of time-series MODIS 250-m NDVI data for crop-type discrimination by spectrally characterizing and discriminating major crop types in Nebraska using the reference data set collected and refined under research performed for the first objective. Third, to conduct an initial investigation into whether time-series NDVI response curves for crops over a growing season in one year could be used to classify crops in a different year. In this case, time-series NDVI response curves for 2001 and 2005 were investigated to ascertain whether or not the 2001 data set could be used to classify crops for 2005. GIS operations and reference data refinement, using clustering and visual assessment of each crop's NDVI cluster profiles in Nebraska, demonstrated that it is possible to devise an alternative reference data set and refinement plan that redresses the unexpected loss of training and validation data. The analysis enabled the identification and removal of crop pattern outliers and sites atypical of the crop phenology under consideration; after editing, a total of 1,288 field sites remained, which were used as a reference data set for classification of Nebraska crop types. A pixel-level analysis of the time-series MODIS 250-m NDVI for the 1,288 field sites representing each of the eight cover types under investigation across Nebraska found that each crop type had a distinctive MODIS 250-m NDVI profile corresponding to the crop calendar. 
A visual and statistical comparison of the average NDVI profiles showed that the crop types were separable at different times of the growing season based on their phenology-driven spectral-temporal differences. Winter wheat and alfalfa, winter wheat and summer crops, and alfalfa and summer crops were clearly separable. Specific summer crop types were not easily distinguishable from each other due to their similar crop calendars; their greatest separability occurred during the initial spring green-up and/or senescence plant growth phases. In Kansas, an initial investigation revealed near-complete agreement between the winter wheat crop profiles, but some minor differences in the crop profiles for alfalfa and summer crops between 2001 and 2005. The profiles of summer crops (corn, grain sorghum, and soybeans) displayed a shift to the right by at least one composite date, indicative of possible late crop planting and emergence. The results for alfalfa and summer crops seem to suggest that time-series NDVI response curves for crops over a growing period in one year of valid ground reference data may not be used to map crops in a different year without taking into account the climatic and/or environmental conditions of each year.
NASA Astrophysics Data System (ADS)
Krämer, Stefan; Rohde, Sophia; Schröder, Kai; Belli, Aslan; Maßmann, Stefanie; Schönfeld, Martin; Henkel, Erik; Fuchs, Lothar
2015-04-01
The design of urban drainage systems with numerical simulation models requires long, continuous rainfall time series with high temporal resolution. However, suitable observed time series are rare. As a result, usual design concepts often use uncertain or unsuitable rainfall data, which renders them uneconomic or unsustainable. An expedient alternative to observed data is the use of long, synthetic rainfall time series as input for the simulation models. Within the project SYNOPSE, several different methods to generate synthetic rainfall data as input for urban drainage modelling are advanced, tested, and compared. Synthetic rainfall time series from three different precipitation model approaches (one parametric stochastic model using an alternating renewal approach, one non-parametric stochastic model using a resampling approach, and one downscaling approach based on a regional climate model) are provided for three catchments with different sewer system characteristics in different climate regions in Germany: - Hamburg (northern Germany): maritime climate, mean annual rainfall: 770 mm; combined sewer system length: 1,729 km (city center of Hamburg), storm water sewer system length (Hamburg Harburg): 168 km - Brunswick (Lower Saxony, northern Germany): transitional climate from maritime to continental, mean annual rainfall: 618 mm; sewer system length: 278 km, connected impervious area: 379 ha, height difference: 27 m - Freiburg im Breisgau (southern Germany): Central European transitional climate, mean annual rainfall: 908 mm; sewer system length: 794 km, connected impervious area: 1,546 ha, height difference: 284 m. Hydrodynamic models are set up for each catchment to simulate rainfall runoff processes in the sewer systems. 
Long-term event time series are extracted, according to national hydraulic design standards, from the three different synthetic rainfall time series (comprising up to 600 years of continuous rainfall) provided for each catchment and from observed gauge rainfall (reference rainfall). The synthetic and reference long-term event time series are used as rainfall input for the hydrodynamic sewer models. For comparison of the synthetic rainfall time series against the reference rainfall and against each other, the number of surcharged manholes, the number of surcharges per manhole, and the average surcharge volume per manhole are applied as hydraulic performance criteria. The results are discussed and assessed to answer the following questions: Are the synthetic rainfall approaches suitable to generate high-resolution rainfall series, and do they produce, in combination with numerical rainfall runoff models, valid results for the design of urban drainage systems? What are the bounds of uncertainty in the runoff results depending on the synthetic rainfall model and on the climate region? The work is carried out within the SYNOPSE project, funded by the German Federal Ministry of Education and Research (BMBF).
NASA Astrophysics Data System (ADS)
Helmer, E.; Ruzycki, T. S.; Wunderle, J. M.; Kwit, C.; Ewert, D. N.; Voggesser, S. M.; Brandeis, T. J.
2011-12-01
We mapped tropical dry forest height (RMSE = 0.9 m, R2 = 0.84, range 0.6-7 m) and foliage height profiles with a time series of gap-filled Landsat and Advanced Land Imager (ALI) imagery for the island of Eleuthera, The Bahamas. We also mapped disturbance type and age with decision tree classification of the image time series. Having mapped these variables in the context of studies of wintering habitat of an endangered Nearctic-Neotropical migrant bird, the Kirtland's Warbler (Dendroica kirtlandii), we then illustrated relationships between forest vertical structure, disturbance type and counts of forage species important to the Kirtland's Warbler. The ALI imagery and the Landsat time series were both critical to the result for forest height, which the strong relationship of forest height with disturbance type and age facilitated. Also unique to this study was that seven of the eight image time steps were cloud-gap-filled images: mosaics of the clear parts of several cloudy scenes, in which cloud gaps in a reference scene for each time step are filled with image data from alternate scenes. We created each cloud-cleared image, including a virtually seamless ALI image mosaic, with regression tree normalization of the image data that filled cloud gaps. We also illustrated how viewing time series imagery as red-green-blue composites of tasseled cap wetness (RGB wetness composites) aids reference data collection for classifying tropical forest disturbance type and age.
Quantifying the value of redundant measurements at GCOS Reference Upper-Air Network sites
Madonna, F.; Rosoldi, M.; Güldner, J.; ...
2014-11-19
The potential for measurement redundancy to reduce uncertainty in atmospheric variables has not been investigated comprehensively for climate observations. We evaluated the usefulness of entropy and mutual correlation concepts, as defined in information theory, for quantifying random uncertainty and redundancy in time series of the integrated water vapour (IWV) and water vapour mixing ratio profiles provided by five highly instrumented GRUAN (GCOS, Global Climate Observing System, Reference Upper-Air Network) stations in 2010–2012. Results show that the random uncertainties on the IWV measured with radiosondes, global positioning system, microwave and infrared radiometers, and Raman lidar measurements differed by less than 8%. Comparisons of time series of IWV content from ground-based remote sensing instruments with in situ soundings showed that microwave radiometers have the highest redundancy with the IWV time series measured by radiosondes and therefore the highest potential to reduce the random uncertainty of the radiosondes time series. Moreover, the random uncertainty of a time series from one instrument can be reduced by ~ 60% by constraining the measurements with those from another instrument. The best reduction of random uncertainty is achieved by conditioning Raman lidar measurements with microwave radiometer measurements. In conclusion, specific instruments are recommended for atmospheric water vapour measurements at GRUAN sites. This approach can be applied to the study of redundant measurements for other climate variables.
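The information-theoretic redundancy measure can be illustrated with a simple histogram-based mutual information estimate between two time series. This is a generic sketch of the concept, not the GRUAN-specific processing (binning choices and uncertainty decomposition in the paper differ).

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Histogram-based mutual information between two series, in bits.
    High MI means one series is largely redundant given the other."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()                # joint probabilities
    px = pxy.sum(axis=1)            # marginal of x
    py = pxy.sum(axis=0)            # marginal of y
    nz = pxy > 0                    # avoid log(0)
    ratio = pxy[nz] / (px[:, None] * py[None, :])[nz]
    return float((pxy[nz] * np.log2(ratio)).sum())
```

Note that the histogram estimator has a positive bias for independent series that shrinks with sample size, so short records should be interpreted with care.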
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-21
... designed to protect investors and the public interest. Granting Market Makers more time to request a review... addresses errors in series with zero or no bid. Specifically, the Exchange proposes replacing reference to ``series quoted no bid on the Exchange'' with ``series where the NBBO bid is zero.'' This is being done to...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-21
... addresses errors in series with zero or no bid. Specifically, the Exchange proposes replacing reference to ``series quoted no bid on the Exchange'' with ``series where the NBBO bid is zero.'' This is being done to... Exchange proposes to amend the times in which certain ATP Holders are required to notify the Exchange in...
MOnthly TEmperature DAtabase of Spain 1951-2010: MOTEDAS. (1) Quality control
NASA Astrophysics Data System (ADS)
Peña-Angulo, Dhais; Cortesi, Nicola; Simolo, Claudia; Stepanek, Peter; Brunetti, Michele; González-Hidalgo, José Carlos
2014-05-01
The HIDROCAES project (Impactos Hidrológicos del Calentamiento Global en España, Spanish Ministry of Research CGL2011-27574-C02-01) is focused on high-resolution analysis of warming processes over continental Spain during 1951-2010. To this end the Department of Geography (University of Zaragoza, Spain), the Hydrometeorological Service (Brno Division, Czech Republic) and the ISAC-CNR (Bologna, Italy) are developing the new dataset MOTEDAS (MOnthly TEmperature DAtabase of Spain), from which we present a collection of posters showing (1) the general structure of the dataset and its quality control; (2) the analyses of spatial correlation of monthly mean values of maximum (Tmax) and minimum (Tmin) temperature; (3) the series reconstruction processes and the development of a high-resolution grid; (4) initial results of trend analyses of annual, seasonal and monthly range mean values. MOTEDAS has been created after exhaustive analyses and quality control of the original digitalized data of the Spanish National Meteorological Agency (Agencia Estatal de Meteorología, AEMET). Quality control was applied without any prior reconstruction, i.e. on the original series. From the total number of series stored in the AEMET archives (more than 4680), we selected only those series with at least 10 years of data (i.e. 120 months; 3066 series) for the quality control and reconstruction processes (see Poster MOTEDAS 3). Quality control consisted of internal checks of the series (Tmax > Tmin, upper and lower thresholds of absolute data, etc.) and of comparison with reference series (see Poster MOTEDAS 3, about reconstruction). Data were considered anomalous when the difference between the candidate and reference series was higher than three times the interquartile distance. The total number of suspicious monthly values recognized and discarded at the end of these analyses was 7832 for Tmin and 8063 for Tmax; they represent less than 0.8% of the original monthly data, for both Tmax and Tmin. 
No spatial pattern was detected in the suspicious data; month by month, Tmin shows maximum detection in the summer months, while Tmax does not show any monthly pattern. Secondly, homogeneity analysis was performed on the series free of anomalous data using an array of tests (SNHT, bivariate, Student's t and Pettitt), with new reference series calculated from the data free of anomalous values. The tests were applied at monthly, seasonal and annual scale (i.e. 17 times per method). Statistical inhomogeneity detections were accepted as follows: three annual detections (monthly, seasonal, annual) had to be found by the SNHT or bivariate test, or the total number of detections by the four tests had to exceed 5% of the total possible detections per year. Before any correction we examined the candidate and reference series charts. ProClimDB and AnClim software were used during all the processes. The total number of series affected by inhomogeneities was 1013 (Tmax) and 1011 (Tmin), i.e. one third of the original series was considered inhomogeneous. We notice that the inhomogeneous series identified in Tmax and Tmin usually do not coincide. This apparently small number of series compared with previous work could originate from the mean length of the series being around 15-20 years. References: Stepánek, P. 2008a. AnClim - software for time series analysis (for Windows 95/NT). Department of Geography, Faculty of Natural Sciences, MU, Brno, 1.47 B. Stepánek, P. 2008b. ProClimDB - software for processing climatological datasets. CHMI, Regional Office, Brno.
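The anomalous-value rule described above (flag a month when the candidate-minus-reference difference exceeds three times the interquartile distance) can be sketched as follows. The centering on the median is an assumption on my part; the poster does not state how the differences are centered.

```python
import numpy as np

def flag_anomalous(candidate, reference, k=3.0):
    """Flag monthly values whose difference from the reference series
    is more than k interquartile distances from the median difference
    (sketch of the MOTEDAS quality-control rule; centering assumed)."""
    d = candidate - reference
    q1, q3 = np.nanpercentile(d, [25, 75])
    iqr = q3 - q1
    med = np.nanmedian(d)
    return np.abs(d - med) > k * iqr
```

Using the interquartile distance rather than the standard deviation keeps the threshold robust, so a single gross error does not inflate the criterion that should detect it.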
NASA Astrophysics Data System (ADS)
Rahim, K. J.; Cumming, B. F.; Hallett, D. J.; Thomson, D. J.
2007-12-01
An accurate assessment of historical local Holocene data is important in making future climate predictions. Holocene climate is often obtained through proxy measures such as diatoms or pollen using radiocarbon dating. Wiggle Match Dating (WMD) uses an iterative least squares approach to tune a core with a large number of 14C dates to the 14C calibration curve. This poster presents a new method of tuning a time series when only a modest number of 14C dates are available. The method uses multitaper spectral estimation, specifically a multitaper spectral coherence tuning technique. Holocene climate reconstructions are often based on a simple depth-time fit such as linear interpolation, splines, or low-order polynomials. Many of these models make use of only a small number of 14C dates, each of which is a point estimate with a significant variance. The technique attempts to tune the 14C dates to a reference series, such as tree rings, varves, or the radiocarbon calibration curve. The amount of 14C in the atmosphere is not constant, and a significant source of variance is solar activity. A decrease in solar activity coincides with an increase in cosmogenic isotope production, and an increase in cosmogenic isotope production coincides with a decrease in temperature. The method uses multitaper coherence estimates and adjusts the phase of the time series to line up significant line components with those of the reference series, in an attempt to obtain a better depth-time fit than the original model. Given recent concerns and demonstrations of the variation in estimated dates from radiocarbon labs, methods to confirm and tune the depth-time fit can aid climate reconstructions by improving and serving to confirm the accuracy of the underlying depth-time fit. Climate reconstructions can then be made on the improved depth-time fit. 
This poster presents a run-through of this process using Chauvin Lake in the Canadian prairies and Mt. Barr Cirque Lake in British Columbia as examples.
Econometrics and Psychometrics: A Survey of Communalities
ERIC Educational Resources Information Center
Goldberger, Arthur S.
1971-01-01
Several themes which are common to both econometrics and psychometrics are surveyed. The themes are illustrated by reference to permanent income hypotheses, simultaneous equation models, adaptive expectations and partial adjustment schemes, and by reference to test score theory, factor analysis, and time-series models. (Author)
Conditional heteroscedasticity as a leading indicator of ecological regime shifts.
Seekell, David A; Carpenter, Stephen R; Pace, Michael L
2011-10-01
Regime shifts are massive, often irreversible, rearrangements of nonlinear ecological processes that occur when systems pass critical transition points. Ecological regime shifts sometimes have severe consequences for human well-being, including eutrophication in lakes, desertification, and species extinctions. Theoretical and laboratory evidence suggests that statistical anomalies may be detectable leading indicators of regime shifts in ecological time series, making it possible to foresee and potentially avert incipient regime shifts. Conditional heteroscedasticity is persistent variance characteristic of time series with clustered volatility. Here, we analyze conditional heteroscedasticity as a potential leading indicator of regime shifts in ecological time series. We evaluate conditional heteroscedasticity by using ecological models with and without four types of critical transition. On approaching transition points, all time series contain significant conditional heteroscedasticity. This signal is detected hundreds of time steps in advance of the regime shift. Time series without regime shifts do not have significant conditional heteroscedasticity. Because probability values are easily associated with tests for conditional heteroscedasticity, detection of false positives in time series without regime shifts is minimized. This property reduces the need for a reference system to compare with the perturbed system.
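A standard way to attach a p-value to conditional heteroscedasticity, consistent with the abstract's point about probability values, is Engle's Lagrange multiplier test: regress the squared (demeaned) series on its own lags and compare n·R² against a chi-square distribution. The sketch below is a generic implementation of that test, not the authors' exact indicator pipeline.

```python
import numpy as np

def arch_lm_test(x, lags=4):
    """Engle's LM test statistic for conditional heteroscedasticity.
    Regress squared demeaned values on their own lags; n * R^2 is
    asymptotically chi-square with `lags` degrees of freedom."""
    e2 = (x - x.mean()) ** 2
    N = len(e2)
    n = N - lags
    # design matrix: intercept plus lagged squared values
    X = np.column_stack([np.ones(n)] +
                        [e2[lags - k - 1:N - k - 1] for k in range(lags)])
    y = e2[lags:]
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    r2 = 1.0 - (resid ** 2).sum() / ((y - y.mean()) ** 2).sum()
    return n * r2  # compare against a chi-square(lags) quantile for a p-value
```

A rising statistic in a moving window over an ecological time series would correspond to the early-warning signal described above; a stationary series without clustered volatility keeps the statistic near its chi-square null distribution, which is what limits false positives.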
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-12
... that, except for directors elected by the holders of any series of preferred stock, any director, or the entire Board, may be removed from office at any time, but only by the affirmative vote of at least... references to the following: The 3.75% Series A Convertible Notes due 2012 and the 3.75% Series B Convertible...
Non-linear motions in reprocessed GPS station position time series
NASA Astrophysics Data System (ADS)
Rudenko, Sergei; Gendt, Gerd
2010-05-01
Global Positioning System (GPS) data from about 400 globally distributed stations, spanning 1998 to 2007, were reprocessed using the GFZ Potsdam EPOS (Earth Parameter and Orbit System) software within the International GNSS Service (IGS) Tide Gauge Benchmark Monitoring (TIGA) Pilot Project and the IGS Data Reprocessing Campaign, with the purpose of determining weekly precise coordinates of GPS stations located at or near tide gauges. Vertical motions of these stations are used to correct the vertical motions of tide gauges for local motions and to tie tide gauge measurements to the geocentric reference frame. Other estimated parameters include daily values of the Earth rotation parameters and their rates, as well as satellite antenna offsets. The derived solution, GT1, is based on an absolute phase-center variation model, ITRF2005 as the a priori reference frame, and other new models; the solution also contributed to ITRF2008. The time series of station positions are analyzed to identify non-linear motions caused by different effects. The paper presents the time series of GPS station coordinates and investigates apparent non-linear motions and their influence on GPS station height rates.
NASA Astrophysics Data System (ADS)
Manikumari, N.; Murugappan, A.; Vinodhini, G.
2017-07-01
Time series forecasting has gained remarkable interest from researchers in the last few decades, and neural-network-based time series forecasting has been employed in various application areas. Reference evapotranspiration (ETO) is one of the most important components of the hydrologic cycle, and its precise assessment is vital in water balance and crop yield estimation and in water resources system design and management. This work aimed at achieving accurate time series forecasts of ETO using a combination of neural network approaches, and was carried out using data collected in the command area of the VEERANAM Tank in India during 2004-2014. The neural network (NN) models were combined by ensemble learning in order to improve the accuracy of forecasting daily ETO (for the year 2015). Bagged neural network (Bagged-NN) and boosted neural network (Boosted-NN) ensemble learning were employed. The Bagged-NN and Boosted-NN ensemble models proved better than individual NN models in terms of accuracy, and among the ensemble models, Boosted-NN reduced the forecasting errors compared to Bagged-NN and the individual NNs. The regression coefficient, mean absolute deviation, mean absolute percentage error, and root mean square error also confirm that Boosted-NN leads to improved ETO forecasting performance.
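The bagging half of the ensemble scheme can be sketched with any base learner: train each member on a bootstrap resample of the lagged training data and average the members' predictions. Linear least-squares base learners stand in for the paper's neural networks here to keep the sketch dependency-free; the function names and lag structure are illustrative assumptions.

```python
import numpy as np

def bagged_forecasters(X, y, n_estimators=25, seed=0):
    """Bootstrap-aggregated regression: fit each base learner on a
    bootstrap resample of (X, y). Linear models replace the paper's
    neural networks in this sketch."""
    rng = np.random.default_rng(seed)
    Xb = np.column_stack([np.ones(len(X)), X])
    coefs = []
    for _ in range(n_estimators):
        idx = rng.integers(0, len(X), size=len(X))  # resample with replacement
        beta, *_ = np.linalg.lstsq(Xb[idx], y[idx], rcond=None)
        coefs.append(beta)
    return np.array(coefs)

def bagged_predict(coefs, X):
    """Average the predictions of all ensemble members."""
    Xb = np.column_stack([np.ones(len(X)), X])
    return (Xb @ coefs.T).mean(axis=1)
```

Averaging over bootstrap replicates mainly reduces the variance of unstable base learners, which is why the gain over a single model is largest for high-variance learners such as neural networks.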
Derivation of GNSS derived station velocities for a surface deformation model in the Austrian region
NASA Astrophysics Data System (ADS)
Umnig, Elke; Weber, Robert; Maras, Jadre; Brückl, Ewald
2016-04-01
This contribution deals with the first comprehensive analysis of GNSS-derived surface velocities computed within an observation network of about 100 stations covering the whole Austrian territory and parts of the neighbouring countries. Coordinate time series are now available spanning a period of 5 years (2010.0-2015.0) for one focus area in East Austria and one and a half years (2013.5-2015.0) for the remaining part of the tracking network. In principle the data series stem from two different GNSS campaigns. The former was set up to investigate intra-plate tectonic movements within the framework of the project ALPAACT (seismological and geodetic monitoring of ALpine-PAnnonian ACtive Tectonics); the latter was designed to support a number of other applications, e.g. the derivation of GNSS-derived water vapour fields, but also to expand the aforesaid tectonic studies. In addition, the activities within the ALPAACT project supplement the educational initiative SCHOOLS & QUAKES, where scholars contribute to seismological research. For the whole period of the processed coordinate time series, daily solutions have been computed by means of the Bernese software. The processed coordinate time series are tied to the global reference frame ITRF2000 as well as to the frame ITRF2008. Due to the transition of the reference frame from ITRF2000 to ITRF2008 within the processing period, but also due to updates of the Bernese software from version 5.0 to 5.2, the time series were initially not fully consistent and had to be re-aligned to a common frame. The goal of this investigation is therefore to derive a nationwide consistent horizontal motion field on the basis of GNSS reference station data within the ITRF2008 frame, but also with respect to the Eurasian plate. In this presentation we focus on the set-up of the coordinate time series and on the problem of frame alignment. 
Special attention is also paid to the separation into linear and periodic motion signals, originating from tectonic or non-tectonic sources.
Analysis of Site Position Time Series Derived From Space Geodetic Solutions
NASA Astrophysics Data System (ADS)
Angermann, D.; Meisel, B.; Kruegel, M.; Tesmer, V.; Miller, R.; Drewes, H.
2003-12-01
This presentation deals with the analysis of station coordinate time series obtained from VLBI, SLR, GPS and DORIS solutions. We also present time series for the origin and scale derived from these solutions and discuss their contribution to the realization of the terrestrial reference frame. For these investigations we used SLR and VLBI solutions computed at DGFI with the software systems DOGS (SLR) and OCCAM (VLBI). The GPS and DORIS time series were obtained from weekly station coordinates solutions provided by the IGS, and from the joint DORIS analysis center (IGN-JPL). We analysed the time series with respect to various aspects, such as non-linear motions, periodic signals and systematic differences (biases). A major focus is on a comparison of the results at co-location sites in order to identify technique- and/or solution related problems. This may also help to separate and quantify possible effects, and to understand the origin of still existing discrepancies. Technique-related systematic effects (biases) should be reduced to the highest possible extent, before using the space geodetic solutions for a geophysical interpretation of seasonal signals in site position time series.
NASA Astrophysics Data System (ADS)
Gattano, C.; Lambert, S.; Bizouard, C.
2017-12-01
In the context of selecting sources defining the celestial reference frame, we compute astrometric time series of all VLBI radio sources from observations in the International VLBI Service database. The time series are then analyzed with the Allan variance in order to estimate the astrometric stability. From these results, we establish a new classification that takes into account information across multiple time scales. The algorithm is flexible in its definition of a ``stable source'' through an adjustable threshold.
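The Allan variance used for the stability estimate can be sketched in its simplest, non-overlapping form: average the series in windows of tau samples and take half the mean squared difference of consecutive window averages. This is a generic illustration; the paper's multi-time-scale classification and any overlapping estimator it may use are not reproduced.

```python
import numpy as np

def allan_variance(x, tau):
    """Non-overlapping Allan variance at averaging window `tau` samples:
    half the mean squared difference of consecutive window averages."""
    m = len(x) // tau                       # number of complete windows
    means = x[:m * tau].reshape(m, tau).mean(axis=1)
    return 0.5 * np.mean(np.diff(means) ** 2)
```

Evaluating this at a range of tau values gives the stability-versus-time-scale curve: for white noise the Allan variance falls as 1/tau, while a source with genuine astrometric drift flattens or rises at long tau, which is the behavior a multi-scale classification can exploit.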
The Recalibrated Sunspot Number: Impact on Solar Cycle Predictions
NASA Astrophysics Data System (ADS)
Clette, F.; Lefevre, L.
2017-12-01
Recently and for the first time since their creation, the sunspot number and group number series were entirely revisited, and a first fully recalibrated version was officially released in July 2015 by the World Data Center SILSO (Brussels). Those reference long-term series are widely used as input data or as a calibration reference by various solar cycle prediction methods. Therefore, past predictions may now need to be redone using the new sunspot series, and methods already used for predicting cycle 24 will require adaptations before attempting predictions of the next cycles. In order to clarify the nature of the applied changes, we describe the different corrections applied to the sunspot and group number series, which affect extended time periods and can reach up to 40%. While some changes simply involve constant scale factors, other corrections vary with time or follow the solar cycle modulation. Depending on the prediction method and on the selected time interval, this can lead to different responses and biases. Moreover, together with the new series, standard error estimates are progressively being added to the new sunspot numbers, which may help derive more accurate uncertainties for predicted activity indices. We conclude on the new round of recalibration now undertaken in the framework of a broad multi-team collaboration articulated around upcoming ISSI workshops, and outline the corrections that can still be expected as part of a permanent upgrading and quality-control process. From now on, sunspot-based predictive models should be made more adaptable, and regular updates of predictions should become common practice in order to track periodic upgrades of the sunspot number series, as is done with other modern solar observational series.
On the equivalence of case-crossover and time series methods in environmental epidemiology.
Lu, Yun; Zeger, Scott L
2007-04-01
The case-crossover design was introduced in epidemiology 15 years ago as a method for studying the effects of a risk factor on a health event using only cases. The idea is to compare a case's exposure immediately prior to or during the case-defining event with that same person's exposure at otherwise similar "reference" times. An alternative approach to the analysis of daily exposure and case-only data is time series analysis. Here, log-linear regression models express the expected total number of events on each day as a function of the exposure level and potential confounding variables. In time series analyses of air pollution, smooth functions of time and weather are the main confounders. Time series and case-crossover methods are often viewed as competing methods. In this paper, we show that case-crossover using conditional logistic regression is a special case of time series analysis when there is a common exposure such as in air pollution studies. This equivalence provides computational convenience for case-crossover analyses and a better understanding of time series models. Time series log-linear regression accounts for overdispersion of the Poisson variance, while case-crossover analyses typically do not. This equivalence also permits model checking for case-crossover data using standard log-linear model diagnostics.
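The time-series side of this equivalence, a log-linear Poisson regression of daily event counts on a shared exposure with smooth functions of time as confounders, can be illustrated on simulated data. The sketch below (simulated exposure, hand-rolled IRLS fit; not the authors' analysis) recovers the true log relative rate of the exposure:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1461                                        # four years of daily counts
t = np.arange(n)
season = np.sin(2 * np.pi * t / 365.25)
pollution = 0.5 * season + 0.3 * rng.standard_normal(n)   # exposure tracks season
beta_true = 0.15                                # true log relative rate per unit exposure
y = rng.poisson(np.exp(3.0 + 0.4 * season + beta_true * pollution))

# Design: intercept, smooth seasonal terms (the confounder), then the exposure.
X = np.column_stack([np.ones(n),
                     np.sin(2 * np.pi * t / 365.25),
                     np.cos(2 * np.pi * t / 365.25),
                     pollution])

# Poisson log-linear fit via iteratively reweighted least squares.
beta = np.zeros(X.shape[1])
beta[0] = np.log(y.mean())                      # sensible starting intercept
for _ in range(25):
    mu = np.exp(X @ beta)
    z = X @ beta + (y - mu) / mu                # working response
    beta = np.linalg.solve(X.T @ (mu[:, None] * X), X.T @ (mu * z))

beta_exposure = beta[-1]
```

A time-stratified case-crossover analysis of the same data via conditional logistic regression would, per the paper's equivalence result, target the same coefficient.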
40 CFR 75.41 - Precision criteria.
Code of Federal Regulations, 2011 CFR
2011-07-01
... for the entire 30- to 90-day period. (9) The owner or operator shall provide two separate time series... time where the vertical axis represents the percentage difference between each paired hourly reading... monitoring system (or reference method) readings versus time where the vertical axis represents hourly...
Use of Naturally Available Reference Targets to Calibrate Airborne Laser Scanning Intensity Data
Vain, Ants; Kaasalainen, Sanna; Pyysalo, Ulla; Krooks, Anssi; Litkey, Paula
2009-01-01
We have studied the possibility of calibrating airborne laser scanning (ALS) intensity data using land targets typically available in urban areas. For this purpose, a test area around Espoonlahti Harbor, Espoo, Finland, for which a long time series of ALS campaigns is available, was selected. Different target samples (beach sand, concrete, asphalt, different types of gravel) were collected and measured in the laboratory. Using tarps with known backscattering properties, the natural samples were calibrated and studied, taking into account the atmospheric effect, incidence angle, and flying height. Using data from different flights and altitudes, a time series for the natural samples was generated. By studying the stability of the samples, we could obtain information on the most suitable types of natural targets for ALS radiometric calibration. Using the selected natural samples as reference, the ALS points of typical land targets were recalibrated and examined. Results showed the need for more accurate ground reference data before using natural samples in ALS intensity data calibration. A NIR camera-based field system was also used for collecting ground reference data. This system proved to be a good means of collecting in situ reference data, especially for targets with inhomogeneous surface reflection properties.
Interferometer with Continuously Varying Path Length Measured in Wavelengths to the Reference Mirror
NASA Technical Reports Server (NTRS)
Ohara, Tetsuo (Inventor)
2016-01-01
An interferometer in which the path length of the reference beam, measured in wavelengths, changes continuously in sinusoidal fashion. The interference signal created by combining the measurement beam and the reference beam is processed in real time to obtain the physical distance along the measurement beam between the measured surface and a spatial reference frame, such as the beam splitter. The processing involves analyzing the Fourier series of the intensity signal at one or more optical detectors in real time, using the time-domain multi-frequency harmonic signals to extract the phase information independently at each pixel position of the detectors, and converting the phase information to distance information.
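Extracting phase from the harmonics of a sinusoidally modulated interference signal can be illustrated with textbook phase-generated-carrier demodulation. The sketch below (simulated single-pixel signal; the modulation depth C and lock-in scheme are assumptions, not the patented processing chain) recovers the interferometric phase from the first two harmonics of the modulation:

```python
import numpy as np
from scipy.special import jv   # Bessel functions of the first kind

fs, f_mod = 100_000.0, 1_000.0            # sample rate and modulation frequency (Hz)
t = np.arange(0, 0.1, 1 / fs)             # exactly 100 modulation periods
C = 2.37                                  # phase-modulation depth (rad)
phi_true = 0.7                            # interferometric phase to recover (rad)
# Detector intensity: I = A + B*cos(C*cos(w*t) + phi)
I = 5.0 + 2.0 * np.cos(C * np.cos(2 * np.pi * f_mod * t) + phi_true)

# Lock-in detection at the first and second harmonics of the modulation:
# the Bessel expansion gives h1 = -2*B*J1(C)*sin(phi), h2 = -2*B*J2(C)*cos(phi).
h1 = 2 * np.mean(I * np.cos(2 * np.pi * f_mod * t))
h2 = 2 * np.mean(I * np.cos(2 * np.pi * 2 * f_mod * t))

phi_est = np.arctan2(-h1 / jv(1, C), -h2 / jv(2, C))
```

Because the arctangent uses the ratio of the two harmonic amplitudes, the result is insensitive to the overall fringe visibility B.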
Reach for Reference. What Did We Wear Yesterday? A New Costume Reference
ERIC Educational Resources Information Center
Safford, Barbara Ripp
2004-01-01
This article describes a new five-volume series, "Fashion, Costume, and Culture: Clothing, Headwear, Body Decorations, and Footwear through the Ages" (Sara Pendergast and Tom Pendergast, Editors), that provides a broad overview of costume traditions of diverse cultures from prehistoric times to the present day. These volumes illustrate how…
NASA Astrophysics Data System (ADS)
Tesmer, Volker; Boehm, Johannes; Heinkelmann, Robert; Schuh, Harald
2007-06-01
This paper compares estimated terrestrial reference frames (TRF) and celestial reference frames (CRF), as well as position time-series, in terms of systematic differences, scale, annual signals and station position repeatabilities using four different tropospheric mapping functions (MF): the NMF (Niell Mapping Function) and the recently developed GMF (Global Mapping Function) consist of easy-to-handle stand-alone formulae, whereas the IMF (Isobaric Mapping Function) and the VMF1 (Vienna Mapping Function 1) are determined from numerical weather models. All computations were performed at the Deutsches Geodätisches Forschungsinstitut (DGFI) using the OCCAM 6.1 and DOGS-CS software packages for Very Long Baseline Interferometry (VLBI) data from 1984 until 2005. While it turned out that CRF estimates only slightly depend on the MF used, showing small systematic effects up to 0.025 mas, some station heights of the computed TRF change by up to 13 mm. The best agreement was achieved between the VMF1 and GMF results concerning the TRFs, and between the VMF1 and IMF results concerning scale variations and position time-series. The amplitudes of the annual periodic signals in the time-series of estimated heights differ by up to 5 mm. The best precision in terms of station height repeatability is found for the VMF1, which is 5-7% better than for the other MFs.
Neutron monitors and muon detectors for solar modulation studies: 2. ϕ time series
NASA Astrophysics Data System (ADS)
Ghelfi, A.; Maurin, D.; Cheminet, A.; Derome, L.; Hubert, G.; Melot, F.
2017-08-01
The level of solar modulation at different times (related to solar activity) is a central question of solar and galactic cosmic-ray physics. In the first paper of this series, we established a correspondence between the uncertainties on ground-based detector count rates and the parameter ϕ (modulation level in the force-field approximation) reconstructed from these count rates. In this second paper, we detail a procedure to obtain a reference ϕ time series from neutron monitor data. We show that we can achieve an unbiased and accurate ϕ reconstruction (Δϕ / ϕ ≃ 10%). We also discuss the potential of Bonner sphere spectrometers and muon detectors to provide ϕ time series. Two by-products of this calculation are updated ϕ values for the cosmic-ray database and a web interface to retrieve and plot ϕ from the 1950s to today (http://lpsc.in2p3.fr/crdb).
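The force-field approximation behind ϕ can be written down compactly: the modulated spectrum is the local interstellar spectrum (LIS) shifted in energy by ϕ and scaled by a Liouville factor. The sketch below uses a purely illustrative toy LIS (the actual reconstruction works from measured count rates and a fitted LIS):

```python
E0 = 0.938  # proton rest-mass energy (GeV)

def lis(E):
    """Toy local interstellar proton spectrum (illustrative power law, arbitrary units)."""
    return 1.0e4 * (E + E0) ** -2.7

def modulated(E, phi):
    """Force-field approximation: energy shift by phi (GV ~ GeV for protons)
    plus the Liouville phase-space factor."""
    return lis(E + phi) * E * (E + 2 * E0) / ((E + phi) * (E + phi + 2 * E0))

E = 1.0  # kinetic energy (GeV)
j_quiet, j_active = modulated(E, 0.4), modulated(E, 1.0)   # low vs. high modulation
```

At ϕ = 0 the modulated flux reduces exactly to the LIS, and larger ϕ (higher solar activity) suppresses the low-energy flux, which is the behavior the neutron monitor count rates trace.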
Fractal analysis on human dynamics of library loans
NASA Astrophysics Data System (ADS)
Fan, Chao; Guo, Jin-Li; Zha, Yi-Long
2012-12-01
In this paper, the fractal characteristics of human behaviors are investigated from the perspective of time series constructed from the amount of library loans. The values of the Hurst exponent and the length of the non-periodic cycle, calculated through rescaled range analysis, indicate that the time series of human behaviors and their sub-series are fractal, with self-similarity and long-range dependence. The time series are then converted into complex networks by the visibility algorithm. The topological properties of the networks, such as the scale-free property and small-world effect, imply a close relationship among the numbers of repetitious behaviors performed by people during certain periods of time. Our work implies that there is intrinsic regularity in human collective repetitious behaviors. The conclusions may help in developing new approaches to investigate the fractal features and mechanisms of human dynamics, and provide some reference for the management and forecasting of human collective behaviors.
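The rescaled-range (R/S) analysis used above to estimate the Hurst exponent can be sketched as follows. This is a minimal implementation on synthetic data (window sizes and estimator details are our choices, not the paper's); for uncorrelated noise it should return H near 0.5, while persistent, long-range dependent series give H > 0.5:

```python
import numpy as np

def hurst_rs(x, window_sizes):
    """Estimate the Hurst exponent by rescaled-range (R/S) analysis."""
    log_n, log_rs = [], []
    for n in window_sizes:
        rs_vals = []
        for start in range(0, len(x) - n + 1, n):     # non-overlapping windows
            w = x[start:start + n]
            z = np.cumsum(w - w.mean())               # cumulative deviation from mean
            r = z.max() - z.min()                     # range of cumulative deviation
            s = w.std()                               # window standard deviation
            if s > 0:
                rs_vals.append(r / s)
        log_n.append(np.log(n))
        log_rs.append(np.log(np.mean(rs_vals)))
    return np.polyfit(log_n, log_rs, 1)[0]            # slope of log(R/S) vs log(n)

rng = np.random.default_rng(1)
white = rng.standard_normal(10_000)
h = hurst_rs(white, [16, 32, 64, 128, 256, 512])
```

Known small-sample bias pushes the raw R/S estimate slightly above 0.5 at short window sizes, which is why careful studies apply corrections (e.g., Anis-Lloyd) before interpreting H.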
... support for only a very short period of time. Alternative Names Needle cricothyrotomy Images Emergency airway puncture Cricoid cartilage Emergency airway puncture - series References Hebert RB, Bose S, Mace SE. Cricothyrotomy and ...
Acoustic emission linear pulse holography
Collins, H. Dale; Busse, Lawrence J.; Lemon, Douglas K.
1985-01-01
Defects in a structure are imaged as they propagate, using their emitted acoustic energy as a monitored source. Short bursts of acoustic energy propagate through the structure to a discrete-element receiver array. A reference timing transducer located between the array and the inspection zone initiates a series of time-of-flight measurements. The resulting time-of-flight measurements are then treated as aperture data and transferred to a computer for reconstruction of a synthetic linear holographic image. The images can be displayed and stored as a record of defect growth.
IDS plot tools for time series of DORIS station positions and orbit residuals
NASA Astrophysics Data System (ADS)
Soudarin, L.; Ferrage, P.; Moreaux, G.; Mezerette, A.
2012-12-01
DORIS (Doppler Orbitography and Radiopositioning Integrated by Satellite) is a Doppler satellite tracking system developed for precise orbit determination and precise ground location. It is onboard the Cryosat-2, Jason-1, Jason-2 and HY-2A altimetric satellites and the remote sensing satellites SPOT-4 and SPOT-5. It also flew on SPOT-2, SPOT-3, TOPEX/POSEIDON and ENVISAT. Since 1994, thanks to its worldwide network of more than fifty permanent stations, DORIS has contributed to the realization and maintenance of the ITRS (International Terrestrial Reference System). 3D positions and velocities of the reference sites, at cm and mm/yr accuracy, support scientific studies in geodesy and geophysics. The primary objective of the International DORIS Service (IDS) is to provide support, through DORIS data and products, to research and operational activities. In order to promote the use of the DORIS products, the IDS has made available on its web site (ids-doris.org) a new set of tools, called Plot tools, to interactively build and display graphs of DORIS station coordinate time series and orbit residuals. These web tools are STCDtool, providing station coordinate time series (North, East, Up position evolution) from the IDS Analysis Centers, and POEtool, providing statistics time series (orbit residuals and number of measurements for the DORIS stations) from CNES (the French Space Agency) Precise Orbit Determination processing. Complementary data about station and satellite events can also be displayed (e.g. antenna changes, system failures, degraded data...). Information about earthquakes obtained from the USGS service can also be superimposed on the position time series. All these events can help in interpreting discontinuities in the time series. The purpose of this presentation is to show the functionalities of these tools and their interest for monitoring crustal deformation at DORIS sites.
NASA Astrophysics Data System (ADS)
Phillips, D. A.; Herring, T.; Melbourne, T. I.; Murray, M. H.; Szeliga, W. M.; Floyd, M.; Puskas, C. M.; King, R. W.; Boler, F. M.; Meertens, C. M.; Mattioli, G. S.
2017-12-01
The Geodesy Advancing Geosciences and EarthScope (GAGE) Facility, operated by UNAVCO, provides a diverse suite of geodetic data, derived products and cyberinfrastructure services to support community Earth science research and education. GPS data and products including decadal station position time series and velocities are provided for 2000+ continuous GPS stations from the Plate Boundary Observatory (PBO) and other networks distributed throughout the high Arctic, North America, and Caribbean regions. The position time series contain a multitude of signals in addition to the secular motions, including coseismic and postseismic displacements, interseismic strain accumulation, and transient signals associated with hydrologic and other processes. We present our latest velocity field solutions, new time series offset estimate products, and new time series examples associated with various phenomena. Position time series, and the signals they contain, are inherently dependent upon analysis parameters such as network scaling and reference frame realization. The estimation of scale changes for example, a common practice, has large impacts on vertical motion estimates. GAGE/PBO velocities and time series are currently provided in IGS (IGb08) and North America (NAM08, IGb08 rotated to a fixed North America Plate) reference frames. We are reprocessing all data (1996 to present) as part of the transition from IGb08 to IGS14 that began in 2017. New NAM14 and IGS14 data products are discussed. GAGE/PBO GPS data products are currently generated using onsite computing clusters. As part of an NSF funded EarthCube Building Blocks project called "Deploying MultiFacility Cyberinfrastructure in Commercial and Private Cloud-based Systems (GeoSciCloud)", we are investigating performance, cost, and efficiency differences between local computing resources and cloud based resources. 
Test environments include a commercial cloud provider (Amazon/AWS), NSF cloud-like infrastructures within XSEDE (TACC, the Texas Advanced Computing Center), and in-house cyberinfrastructures. Preliminary findings from this effort are presented. Web services developed by UNAVCO to facilitate the discovery, customization and dissemination of GPS data and products are also presented.
Khavrutskii, Ilja V; Wallqvist, Anders
2010-11-09
This paper introduces an efficient single-topology variant of Thermodynamic Integration (TI) for computing relative transformation free energies in a series of molecules with respect to a single reference state. The presented TI variant, which we refer to as Single-Reference TI (SR-TI), combines well-established molecular simulation methodologies into a practical computational tool. Augmented with Hamiltonian Replica Exchange (HREX), the SR-TI variant can deliver enhanced sampling in select degrees of freedom. The utility of the SR-TI variant is demonstrated in calculations of relative solvation free energies for a series of benzene derivatives of increasing complexity. Notably, the SR-TI variant with the HREX option provides converged results in the challenging case of an amide molecule with a high (13-15 kcal/mol) barrier to internal cis/trans interconversion, using simulation times of only 1 to 4 ns.
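At its core, thermodynamic integration averages ∂U/∂λ in each λ window and integrates the averages over λ. The sketch below does this for a toy harmonic oscillator with an analytic answer (U = λkx²/2, so ΔF = ln 2 / 2β for λ from 1 to 2), not for the SR-TI molecular systems; the "simulation" is direct Boltzmann sampling:

```python
import numpy as np

rng = np.random.default_rng(7)
beta, k = 1.0, 1.0
lambdas = np.linspace(1.0, 2.0, 11)        # lambda windows

means = []
for lam in lambdas:
    # For U = lam*k*x^2/2, the Boltzmann distribution is Gaussian
    # with variance 1/(beta*lam*k); sample it directly.
    x = rng.normal(0.0, 1.0 / np.sqrt(beta * lam * k), size=200_000)
    means.append(np.mean(0.5 * k * x ** 2))  # <dU/dlambda> in this window

# Trapezoid rule over lambda gives the free-energy difference.
means = np.array(means)
dF = float(np.sum(0.5 * (means[:-1] + means[1:]) * np.diff(lambdas)))
dF_exact = np.log(2.0) / (2.0 * beta)
```

In real SR-TI calculations the per-window averages come from MD sampling (with HREX improving convergence across barriers), but the integration step is the same.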
A Bayesian CUSUM plot: Diagnosing quality of treatment.
Rosthøj, Steen; Jacobsen, Rikke-Line
2017-12-01
To present a CUSUM plot based on Bayesian diagnostic reasoning that displays evidence in favour of "healthy" rather than "sick" quality of treatment (QOT), and to demonstrate a technique using Kaplan-Meier survival curves permitting application to case series with ongoing follow-up. For a case series with known final outcomes: consider each case a diagnostic test of good versus poor QOT (expected vs. increased failure rates), determine the likelihood ratio (LR) of the observed outcome, convert the LR to a weight by taking its logarithm to base 2, and add up the weights sequentially in a plot showing how many times the odds in favour of good QOT have been doubled. For a series with observed survival times and an expected survival curve: divide the curve into time intervals, determine "healthy" and specify "sick" risks of failure in each interval, construct a "sick" survival curve, determine the LR of survival or failure at the given observation times, convert to weights, and add up. The Bayesian plot was applied retrospectively to 39 children with acute lymphoblastic leukaemia with completed follow-up, using Nordic collaborative results as reference, showing equal odds between good and poor QOT. In the ongoing treatment trial, with 22 of 37 children still at risk for an event, QOT has been monitored with average survival curves as reference, with odds so far favouring good QOT 2:1. QOT in small patient series can be assessed with a Bayesian CUSUM plot, retrospectively when all treatment outcomes are known, but also in ongoing series with unfinished follow-up.
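The weighting scheme described above, log-base-2 likelihood ratios summed sequentially, can be sketched directly. The failure risks and outcome sequence below are invented for illustration (they are not the leukaemia data):

```python
import numpy as np

p_good, p_poor = 0.10, 0.20        # expected vs. doubled per-case failure risk
outcomes = [0, 0, 1, 0, 0, 0, 0, 1, 0, 0]   # 1 = treatment failure, in case order

def weight(failed):
    # Likelihood ratio of the observed outcome under good vs. poor QOT,
    # expressed in doublings of the odds (log base 2).
    lr = (p_good / p_poor) if failed else ((1 - p_good) / (1 - p_poor))
    return np.log2(lr)

cusum = np.cumsum([weight(o) for o in outcomes])   # the Bayesian CUSUM trajectory
```

Each failure costs a full doubling of the odds against good QOT (log2 of 0.10/0.20 = -1), while each success contributes a small positive weight (log2 of 0.90/0.80), so the plot drifts up under good QOT and down under poor QOT.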
Impacts of GNSS position offsets on global frame stability
NASA Astrophysics Data System (ADS)
Griffiths, Jake; Ray, Jim
2015-04-01
Positional offsets appear in Global Navigation Satellite System (GNSS) time series for a variety of reasons. Antenna or radome changes are the most common cause for these discontinuities. Many others are from earthquakes, receiver changes, and different anthropogenic modifications at or near the stations. Some jumps appear for unknown or undocumented reasons. Accurate determination of station velocities, and therefore geophysical parameters and terrestrial reference frames, requires that positional offsets be correctly found and compensated. Williams (2003) found that undetected offsets introduce a random walk error component in individual station time series. The topic of detecting positional offsets has received considerable attention in recent years (e.g., Detection of Offsets in GPS Experiment; DOGEx), and most research groups using GNSS have adopted a mix of manual and automated methods for finding them. The removal of a positional offset from a time series is usually handled by estimating the average station position on both sides of the discontinuity. Except for large earthquake events, the velocity is usually assumed constant and continuous across the positional jump. This approach is sufficient in the absence of time-correlated errors. However, GNSS time series contain periodic and power-law (flicker) errors. In this paper, we evaluate the impact to individual station results and the overall stability of the global reference frame from adding increasing numbers of positional discontinuities. We use the International GNSS Service (IGS) weekly SINEX files, and iteratively insert positional offset parameters. Each iteration includes a restacking of the modified SINEX files using the CATREF software from Institut National de l'Information Géographique et Forestière (IGN). 
Comparisons of successive stacked solutions are used to assess the impacts on the time series of x-pole and y-pole offsets, along with changes in regularized position and secular velocity for stations with more than 2.5 years of data. Our preliminary results indicate that the change in polar motion scatter is logarithmic with increasing numbers of discontinuities. The best-fit natural logarithm to the changes in scatter for x-pole has R2 = 0.58; the fit for the y-pole series has R2 = 0.99. From these empirical functions, we find that polar motion scatter increases from zero when the total rate of discontinuities exceeds 0.2 (x-pole) and 1.3 (y-pole) per station, on average (the IGS has 0.65 per station). Thus, the presence of position offsets in GNSS station time series is likely already a contributor to IGS polar motion inaccuracy and global frame instability. Impacts to station position and velocity estimates depend on noise features found in that station's positional time series. For instance, larger changes in velocity occur for stations with shorter and noisier data spans. This is because an added discontinuity parameter for an individual station time series can induce changes in average position on both sides of the break. We will expand on these results, and consider remaining questions about the role of velocity discontinuities and the effects caused by non-core reference frame stations.
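The velocity bias from an undetected positional offset can be illustrated with a synthetic series. The offset size, epoch, and noise level below are arbitrary choices, and a real GNSS series would add the time-correlated (flicker) noise discussed above; still, the sketch shows why an added discontinuity parameter changes the velocity estimate:

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(0, 10, 7 / 365.25)         # ten years of weekly solutions (years)
v_true = 2.0                             # true secular velocity (mm/yr)
pos = v_true * t + 2.0 * rng.standard_normal(t.size)   # white noise, mm
pos[t > 6.0] += 5.0                      # an undetected 5 mm offset at year 6

# Naive fit: intercept + velocity only, offset unmodeled.
A = np.column_stack([np.ones(t.size), t])
v_biased = np.linalg.lstsq(A, pos, rcond=None)[0][1]

# Proper fit: include a Heaviside offset parameter at the known epoch.
A2 = np.column_stack([np.ones(t.size), t, (t > 6.0).astype(float)])
v_fixed = np.linalg.lstsq(A2, pos, rcond=None)[0][1]
```

With the offset unmodeled the velocity absorbs part of the jump (here a bias of several tenths of a mm/yr), which is exactly the kind of error that propagates into frame stability when many stations are affected.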
NASA Technical Reports Server (NTRS)
Beckley, B. D.; Lemoine, F. G.; Zelensky, N. P.; Yang, X.; Holmes, S.; Ray, R. D.; Mitchum, G. T.; Desai, S.; Brown, S.; Haines, B.
2011-01-01
Recent developments in Precise Orbit Determination (POD), due in particular to revisions to the terrestrial reference frame realization and to time-variable gravity (TVG), continue to improve the accuracy and stability of the POD, directly affecting mean sea level (MSL) estimates. Long-term credible MSL estimates require the development and continued maintenance of a stable reference frame, along with vigilant monitoring of the performance of the independent tracking systems used to calculate the orbits for altimeter spacecraft. The stringent MSL accuracy requirements of a few tenths of a mm/yr are particularly essential for mass budget closure analysis over the relatively short time period of Jason-1 and -2, GRACE, and Argo coincident measurements. In an effort to adhere to cross-mission consistency, we have generated a full time series of experimental orbits (GSFC std1110) for TOPEX/Poseidon (TP), Jason-1, and OSTM based on an improved terrestrial reference frame (TRF) realization (ITRF2008), a revised static gravity field (GGM03s), and a time-variable gravity field (Eigen6s). In this presentation we assess the impact of the revised precision orbits on inter-mission bias estimates and the resultant global and regional MSL trends. Tide gauge verification results are shown to assess the current stability of the Jason-2 sea surface height time series, which suggests a possible discontinuity initiated in early 2010. Although the Jason-2 time series is relatively short (approximately 3 years), a thorough review of the entire suite of geophysical and environmental range corrections is warranted and is underway to maintain the fidelity of the record.
Nonlinear techniques for forecasting solar activity directly from its time series
NASA Technical Reports Server (NTRS)
Ashrafi, S.; Roszman, L.; Cooley, J.
1992-01-01
Numerical techniques for constructing nonlinear predictive models to forecast solar flux directly from its time series are presented. This approach makes it possible to extract dynamical invariants of our system without reference to any underlying solar physics. We consider the dynamical evolution of solar activity in a reconstructed phase space that captures the (strange) attractor, give a procedure for constructing a predictor of future solar activity, and discuss extraction of dynamical invariants such as Lyapunov exponents and attractor dimension.
Nonlinear techniques for forecasting solar activity directly from its time series
NASA Technical Reports Server (NTRS)
Ashrafi, S.; Roszman, L.; Cooley, J.
1993-01-01
This paper presents numerical techniques for constructing nonlinear predictive models to forecast solar flux directly from its time series. This approach makes it possible to extract dynamical invariants of our system without reference to any underlying solar physics. We consider the dynamical evolution of solar activity in a reconstructed phase space that captures the (strange) attractor, give a procedure for constructing a predictor of future solar activity, and discuss extraction of dynamical invariants such as Lyapunov exponents and attractor dimension.
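The phase-space reconstruction step can be sketched with delay embedding plus a simple nearest-neighbour predictor. In the sketch below a chaotic logistic-map series stands in for a solar activity index, and the embedding dimension, delay, and neighbour count are illustrative choices, not those of the paper:

```python
import numpy as np

# Chaotic surrogate series from the logistic map (stand-in for solar flux).
x = np.empty(2000)
x[0] = 0.31
for i in range(1999):
    x[i + 1] = 4.0 * x[i] * (1.0 - x[i])

dim, tau = 2, 1

def embed(series, dim, tau):
    """Takens delay embedding: rows are points in the reconstructed phase space."""
    n = len(series) - (dim - 1) * tau
    return np.column_stack([series[i * tau: i * tau + n] for i in range(dim)])

train, test = x[:1500], x[1500:]
E = embed(train, dim, tau)                 # reconstructed phase-space points
targets = train[(dim - 1) * tau + 1:]      # value following each embedded point
E = E[:-1]                                 # drop the last point (no known successor)

def predict(point):
    """Forecast one step ahead by averaging successors of nearby trajectory points."""
    d = np.linalg.norm(E - point, axis=1)
    k = np.argsort(d)[:5]                  # five nearest phase-space neighbours
    return targets[k].mean()

preds = [predict(np.array([test[i - 1], test[i]])) for i in range(1, 400)]
rmse = np.sqrt(np.mean((np.array(preds) - test[2:401]) ** 2))
```

Local predictors of this kind exploit the determinism of the reconstructed attractor; their error growth with forecast horizon is what the Lyapunov exponents quantify.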
NASA Astrophysics Data System (ADS)
Lhermitte, S.; Tips, M.; Verbesselt, J.; Jonckheere, I.; Van Aardt, J.; Coppin, Pol
2005-10-01
Large-scale wild fires have direct impacts on natural ecosystems and play a major role in vegetation ecology and the carbon budget. Accurate methods for describing post-fire development of vegetation are therefore essential for the understanding and monitoring of terrestrial ecosystems. Time series analysis of satellite imagery offers the potential to quantify these parameters with spatial and temporal accuracy. The current research focuses on the potential of time series analysis of SPOT Vegetation S10 data (1999-2001) to quantify the vegetation recovery of large-scale burns detected in the framework of GBA2000. The objective of this study was to provide quantitative estimates of the spatio-temporal variation of vegetation recovery based on remote sensing indicators. Southern Africa was used as a pilot study area, given the availability of ground and satellite data. An automated technique was developed to extract consistent indicators of vegetation recovery from the SPOT-VGT time series. Reference areas were used to quantify the vegetation regrowth by means of Regeneration Indices (RI). Two kinds of recovery indicators (time- and value-based) were tested for RIs of NDVI, SR, SAVI, NDWI, and pure band information. The effects of vegetation structure and temporal fire regime features on the recovery indicators were subsequently analyzed. Statistical analyses were conducted to assess whether the recovery indicators differed between vegetation types and depended on the timing of the burning season. Results highlighted the importance of appropriate reference areas and of correct normalization of the SPOT-VGT data.
Future mission studies: Forecasting solar flux directly from its chaotic time series
NASA Technical Reports Server (NTRS)
Ashrafi, S.
1991-01-01
The mathematical structure of the programs written to construct a nonlinear predictive model to forecast solar flux directly from its time series, without reference to any underlying solar physics, is presented. The method and programs are written so that the same technique can be applied to forecast other chaotic time series, such as geomagnetic data, attitude and orbit data, and even financial indexes and stock market data. Perhaps the most important application of this technique to flight dynamics is to model Goddard Trajectory Determination System (GTDS) output of residuals between the observed position of a spacecraft and the position calculated with no drag (drag flag = off). This would result in a new model of drag working directly from observed data.
Rigorous Combination of GNSS and VLBI: How it Improves Earth Orientation and Reference Frames
NASA Astrophysics Data System (ADS)
Lambert, S. B.; Richard, J. Y.; Bizouard, C.; Becker, O.
2017-12-01
Current reference series (C04) of the International Earth Rotation and Reference Systems Service (IERS) are produced by a weighted combination of Earth orientation parameter (EOP) time series built up by the combination centers of each technique (VLBI, GNSS, laser ranging, DORIS). In the future, we plan to derive EOP from a rigorous combination of the normal equation systems of the four techniques. We present here the results of a rigorous combination of VLBI and GNSS pre-reduced, constraint-free normal equations with the DYNAMO geodetic analysis software package developed and maintained by the French GRGS (Groupe de Recherche de Géodésie Spatiale). The normal equations used are those produced separately by the IVS and IGS combination centers, to which we apply our own minimal constraints. We address the usefulness of such a method with respect to the classical, a posteriori, combination method, and we show whether EOP determinations are improved. In particular, we implement external validations of the EOP series based on comparison with geophysical excitation and examination of the covariance matrices. Finally, we address the potential of the technique for the next-generation celestial reference frames, which are currently determined by VLBI only.
An approach to checking case-crossover analyses based on equivalence with time-series methods.
Lu, Yun; Symons, James Morel; Geyh, Alison S; Zeger, Scott L
2008-03-01
The case-crossover design has been increasingly applied to epidemiologic investigations of acute adverse health effects associated with ambient air pollution. The correspondence of the design to that of matched case-control studies makes it inferentially appealing for epidemiologic studies. Case-crossover analyses generally use conditional logistic regression modeling. This technique is equivalent to time-series log-linear regression models when there is a common exposure across individuals, as in air pollution studies. Previous methods for obtaining unbiased estimates for case-crossover analyses have assumed that time-varying risk factors are constant within reference windows. In this paper, we rely on the connection between case-crossover and time-series methods to illustrate model-checking procedures from log-linear model diagnostics for time-stratified case-crossover analyses. Additionally, we compare the relative performance of the time-stratified case-crossover approach to time-series methods under 3 simulated scenarios representing different temporal patterns of daily mortality associated with air pollution in Chicago, Illinois, during 1995 and 1996. Whenever a model, be it time-series or case-crossover, fails to account appropriately for fluctuations in time that confound the exposure, the effect estimate will be biased. It is therefore important to perform model-checking in time-stratified case-crossover analyses rather than assume the estimator is unbiased.
Long-Term Stability Assessment of Sonoran Desert for Vicarious Calibration of GOES-R
NASA Astrophysics Data System (ADS)
Kim, W.; Liang, S.; Cao, C.
2012-12-01
Vicarious calibration refers to calibration techniques that do not depend on onboard calibration devices. Although sensors and onboard calibration devices undergo rigorous validation before launch, sensor performance often degrades after launch due to exposure to the harsh space environment and the aging of devices. Such in-flight changes can be identified and adjusted through vicarious calibration activities, where sensor degradation is measured in reference to exterior calibration sources such as the Sun, the Moon, and the Earth's surface. The Sonoran Desert is one of the best calibration sites in North America available for vicarious calibration of the GOES-R satellite. To accurately calibrate sensors onboard the GOES-R satellite (e.g., the Advanced Baseline Imager (ABI)), the temporal stability of the Sonoran Desert needs to be assessed precisely. However, short- and mid-term variations in top-of-atmosphere (TOA) reflectance caused by meteorological variables such as water vapor amount and aerosol loading are often difficult to retrieve, complicating the use of the TOA reflectance time series for stability assessment of the site. In this paper, we address this normalization issue using a time series analysis algorithm: the seasonal-trend decomposition procedure based on LOESS (STL) (Cleveland et al., 1990). The algorithm is basically a collection of smoothing filters that decomposes a time series into three additive components: seasonal, trend, and remainder. Since this nonlinear technique is capable of extracting seasonal patterns in the presence of trend changes, the seasonal variation can be effectively identified in time series of remote sensing data subject to various environmental changes.
Experiments performed with Landsat 5 TM data show that the decomposition applied to the Sonoran Desert area produces normalized series with much less uncertainty than those from traditional BRDF models, which leads to a more accurate stability assessment.
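STL itself uses LOESS smoothers; as a rough illustration of the additive seasonal + trend + remainder split the abstract describes, here is a simplified classical decomposition that substitutes a centered moving average for the LOESS trend (all names and the synthetic setup are ours, not the paper's code):

```python
import numpy as np

def decompose_additive(y, period=12):
    """Classical additive seasonal-trend decomposition (a simplified
    stand-in for STL, which uses LOESS smoothers instead of moving
    averages). Returns (trend, seasonal, remainder) with
    y = trend + seasonal + remainder wherever the trend is defined."""
    n = len(y)
    k = period // 2
    trend = np.full(n, np.nan)
    for i in range(k, n - k):
        if period % 2 == 0:
            # 2 x period moving average: half weight on the end points
            trend[i] = (0.5 * y[i - k] + y[i - k + 1:i + k].sum()
                        + 0.5 * y[i + k]) / period
        else:
            trend[i] = y[i - k:i + k + 1].mean()
    detrended = y - trend
    # mean of the detrended values at each phase of the cycle
    seasonal = np.array([np.nanmean(detrended[j::period])
                         for j in range(period)])
    seasonal -= seasonal.mean()          # seasonal component sums to ~0
    seasonal_full = np.resize(seasonal, n)
    remainder = y - trend - seasonal_full
    return trend, seasonal_full, remainder
```

Normalizing a reflectance series then amounts to subtracting the estimated seasonal component and inspecting the trend for sensor drift.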
A hybrid-domain approach for modeling climate data time series
NASA Astrophysics Data System (ADS)
Wen, Qiuzi H.; Wang, Xiaolan L.; Wong, Augustine
2011-09-01
In order to model climate data time series that often contain periodic variations, trends, and sudden changes in mean (mean shifts, mostly artificial), this study proposes a hybrid-domain (HD) algorithm, which incorporates a time domain test and a newly developed frequency domain test through an iterative procedure analogous to the well-known backfitting algorithm. A two-phase competition procedure is developed to address the confounding between modeling periodic variations and mean shifts. A variety of distinctive features of climate data time series, including trends, periodic variations, mean shifts, and a dependent noise structure, can be modeled in tandem using the HD algorithm. This is particularly important for homogenization of climate data from a low-density observing network in which reference series are not available to help preserve climatic trends and long-term periodic variations, preventing them from being mistaken for artificial shifts. The HD algorithm is also powerful in estimating trend and periodicity in a homogeneous data time series (i.e., in the absence of any mean shift). The performance of the HD algorithm (in terms of false alarm rate and hit rate in detecting shifts/cycles, and estimation accuracy) is assessed via a simulation study. Its power is further illustrated through application to several climate data time series.
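The backfitting idea the HD algorithm builds on can be sketched as an alternation between fitting the smooth terms and re-estimating the mean shift. A toy version (assuming, unlike the paper, that the shift location is already known; all names are ours):

```python
import numpy as np

def backfit(y, t, period, shift_at, n_iter=50):
    """Toy backfitting iteration in the spirit of the HD algorithm:
    alternately (1) fit trend + sinusoid on the shift-corrected series
    and (2) re-estimate the mean-shift magnitude from the residuals."""
    step = (t >= shift_at).astype(float)
    shift = 0.0
    X = np.column_stack([np.ones_like(t, dtype=float), t,
                         np.sin(2 * np.pi * t / period),
                         np.cos(2 * np.pi * t / period)])
    for _ in range(n_iter):
        beta, *_ = np.linalg.lstsq(X, y - shift * step, rcond=None)
        resid = y - X @ beta
        shift = resid[step == 1].mean() - resid[step == 0].mean()
    return beta, shift
```

Each pass removes the current shift estimate before refitting the trend and periodic terms, so a mean shift is not absorbed into the trend, which is exactly the confounding the two-phase competition procedure is designed to resolve.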
Jandoc, Racquel; Burden, Andrea M; Mamdani, Muhammad; Lévesque, Linda E; Cadarette, Suzanne M
2015-08-01
To describe the use and reporting of interrupted time series methods in drug utilization research. We completed a systematic search of MEDLINE, Web of Science, and reference lists to identify English language articles through to December 2013 that used interrupted time series methods in drug utilization research. We tabulated the number of studies by publication year and summarized methodological detail. We identified 220 eligible empirical applications since 1984. Only 17 (8%) were published before 2000, and 90 (41%) were published since 2010. Segmented regression was the most commonly applied interrupted time series method (67%). Most studies assessed drug policy changes (51%, n = 112); 22% (n = 48) examined the impact of new evidence, 18% (n = 39) examined safety advisories, and 16% (n = 35) examined quality improvement interventions. Autocorrelation was considered in 66% of studies, 31% reported adjusting for seasonality, and 15% accounted for nonstationarity. Use of interrupted time series methods in drug utilization research has increased, particularly in recent years. Despite methodological recommendations, there is large variation in reporting of analytic methods. Developing methodological and reporting standards for interrupted time series analysis is important to improve its application in drug utilization research, and we provide recommendations for consideration.
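Segmented regression, the most common method in the surveyed studies, is ordinarily parameterized with a level-change and a slope-change term at the interruption. A minimal OLS sketch (real analyses would also handle autocorrelation and seasonality, as the review notes):

```python
import numpy as np

def segmented_regression(y, t, t0):
    """Segmented regression for an interrupted time series:
    y = b0 + b1*t + b2*I(t >= t0) + b3*(t - t0)*I(t >= t0),
    so b2 is the immediate level change and b3 the slope change
    at the interruption time t0. Plain OLS sketch."""
    after = (t >= t0).astype(float)
    X = np.column_stack([np.ones_like(t, dtype=float), t,
                         after, (t - t0) * after])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta  # [baseline level, baseline slope, level change, slope change]
```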
NChina16: A stable geodetic reference frame for geological hazard studies in north China
NASA Astrophysics Data System (ADS)
Wang, G.; Yan, B.; Gan, W.; Geng, J.
2017-12-01
This study established a stable North China Reference Frame 2016 (NChina16) using five years of continuous GPS observations (2011.8 to 2016.8) from 12 continuously operating reference stations (CORS) fixed to the stable interior of the North China Craton. Applications of NChina16 in landslide, subsidence, and post-seismic displacement studies are illustrated. The primary result of this study is the seven parameters for transforming Cartesian ECEF (Earth-Centered, Earth-Fixed) coordinates X, Y, and Z from the International GNSS Service Reference Frame 2008 (IGS08) to NChina16. The seven parameters include the epoch that is used to tie the regional reference frame to IGS08 and the time derivatives of three translations and three rotations. A method for developing a regional geodetic reference frame is introduced in detail. The GIPSY-OASIS (V6.4) software package was used to obtain the precise point positioning (PPP) time series with respect to IGS08. The stability (accuracy) of NChina16 is about 0.5 mm/year in both vertical and horizontal directions. This study also developed a regional seasonal model for correcting vertical displacement time series data derived from the PPP solutions. Long-term GPS observations (1999-2016) from five CORS in north China were used to develop the seasonal model. According to this study, the PPP daily solutions with respect to NChina16 could achieve 2-3 mm horizontal accuracy and 4-5 mm vertical accuracy after being modified by the regional model. NChina16 will be critical to the long-term landslide, subsidence, fault, and structural monitoring in north China and for ongoing post-seismic crustal deformation studies in Japan. NChina16 will be incrementally improved and synchronized with the IGS reference frame update.
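The seven-parameter tie from IGS08 to a regional frame such as NChina16 is a linearized Helmert (similarity) transformation; a generic sketch (parameter values and names here are illustrative, not the published NChina16 parameters):

```python
import numpy as np

def helmert_transform(xyz, T, D, R):
    """Linearized 7-parameter Helmert transformation as used in
    geodesy: x' = x + T + (D*I + R) x, where R is the skew-symmetric
    small-angle rotation matrix. For a time-dependent frame tie, the
    translations T (m), scale D (unitless), and rotations R (rad)
    would be evaluated at the coordinate epoch from the reference
    epoch values and their time derivatives (rates)."""
    rx, ry, rz = R
    M = np.array([[  D, -rz,  ry],
                  [ rz,   D, -rx],
                  [-ry,  rx,   D]])
    return np.asarray(xyz) + np.asarray(T) + M @ np.asarray(xyz)
```

In a stable-frame realization, the rates are chosen so that the transformed velocities of the selected stable-interior stations are as close to zero as possible.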
NASA Astrophysics Data System (ADS)
Gang, Zhang; Fansong, Meng; Jianzhong, Wang; Mingtao, Ding
2018-02-01
Determining the magnetotelluric impedance precisely and accurately is fundamental to valid inversion and geological interpretation. This study aims to determine the minimum signal-to-noise ratio (SNR) at which the remote reference technique remains effective. Standard time series were simulated, different levels of Gaussian noise were added to obtain time series of different SNR, and intermediate quantities such as the polarization direction, correlation coefficient, and impedance tensor were analysed. The results show that when the SNR is larger than 23.5743, the polarization directions remain well ordered and a smooth, accurate sounding curve can be obtained. Under this condition, the correlation coefficient between the base and remote station is larger than 0.9 for nearly all segments, and the impedance tensor component Zxy forms a single cluster, consistent with the characteristics of the natural magnetotelluric signal.
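The controlled-degradation step, adding Gaussian noise to a simulated series so that it reaches a prescribed SNR, can be sketched as follows (our function; the paper's exact SNR definition may differ in detail):

```python
import numpy as np

def add_noise_at_snr(signal, snr_db, rng=None):
    """Add white Gaussian noise so the result has the requested
    signal-to-noise ratio in dB, i.e. the noise power is set to
    P_signal / 10**(SNR/10)."""
    rng = np.random.default_rng(rng)
    p_signal = np.mean(signal ** 2)
    p_noise = p_signal / 10 ** (snr_db / 10)
    noise = rng.normal(0.0, np.sqrt(p_noise), size=signal.shape)
    return signal + noise
```

Sweeping `snr_db` downward and reprocessing each degraded series is how one would locate the threshold below which the remote-reference estimates break down.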
NASA Technical Reports Server (NTRS)
Beckley, Brian D.; Ray, Richard D.; Lemoine, Frank G.; Zelensky, N. P.; Holmes, S. A.; Desai, Shailen D.; Brown, Shannon; Mitchum, G. T.; Jacob, Samuel; Luthcke, Scott B.
2010-01-01
The science value of satellite altimeter observations has grown dramatically over time as enabling models and technologies have increased the value of data acquired on both past and present missions. With the prospect of an observational time series extending into several decades, from TOPEX/Poseidon through Jason-1 and the Ocean Surface Topography Mission (OSTM), and further in time with a future set of operational altimeters, researchers are pushing the bounds of current technology and modeling capability in order to monitor the global sea level rate at an accuracy of a few tenths of a mm/yr. The measurement of mean sea-level change from satellite altimetry requires extreme stability of the altimeter measurement system, since the signal being measured is at the level of a few mm/yr. This means that the orbit and reference frame within which the altimeter measurements are situated, and the associated altimeter corrections, must be stable and accurate enough to permit a robust MSL estimate. Foremost, orbit quality and consistency are critical to satellite altimeter measurement accuracy. The orbit defines the altimeter reference frame, and orbit error directly affects the altimeter measurement. Orbit error remains a major component in the error budget of all past and present altimeter missions. For example, inconsistencies in the International Terrestrial Reference Frame (ITRF) used to produce the precision orbits at different times cause systematic inconsistencies to appear in the multimission time frame between TOPEX and Jason-1, and can affect the inter-mission calibration of these data. In an effort to ensure cross-mission consistency, we have generated the full time series of orbits for TOPEX/Poseidon (TP), Jason-1, and OSTM based on recent improvements in the satellite force models, reference systems, and modeling strategies.
The recent release of the entire revised Jason-1 Geophysical Data Records and the recalibration of the microwave radiometer correction also require further re-examination of inter-mission consistency issues. Here we present an assessment of these recent improvements to the accuracy of the 17-year sea surface height time series, and evaluate the subsequent impact on global and regional mean sea level estimates.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-12
.... Article Fifth, Paragraph D provides that, except for directors elected by the holders of any series of preferred stock, any director, or the entire Board, may be removed from office at any time, but only by the... deletes obsolete references to the following: The 3.75% Series A Convertible Notes due 2012 and the 3.75...
NASA Astrophysics Data System (ADS)
Gruszczynska, Marta; Rosat, Severine; Klos, Anna; Bogusz, Janusz
2017-04-01
Seasonal oscillations in GPS position time series can arise from real geophysical effects and from numerical artefacts. According to Dong et al. (2002), environmental loading effects can account for approximately 40% of the total variance of the annual signals in GPS time series; however, with commonly used methods (e.g. Least Squares Estimation, Wavelet Decomposition, Singular Spectrum Analysis) for modelling seasonal signals we are not able to separate real from spurious signals (mismodelling effects aliased into the annual period, as well as the draconitic period). Therefore, we propose to use Multichannel Singular Spectrum Analysis (MSSA) to determine seasonal oscillations (with annual and semi-annual periods) from GPS position time series and environmental loading displacement models. The MSSA approach is an extension of the classical Karhunen-Loève method and a special case of SSA for multivariate time series. The main advantage of MSSA is the possibility of extracting seasonal signals common to the stations of a selected area, as well as of investigating the causality between a set of time series. In this research, we explored the ability of MSSA to separate real geophysical effects from spurious effects in GPS time series. For this purpose, we used GPS position changes and environmental loading models. We analysed the topocentric time series from 250 selected stations located worldwide, derived from the network solution provided by the International GNSS Service (IGS) as a contribution to the latest realization of the International Terrestrial Reference System (namely ITRF2014, Rebischung et al., 2016). We also analysed atmospheric, hydrological and non-tidal oceanic loading models provided by the EOST/IPGS Loading Service in the Centre-of-Figure (CF) reference frame. The analysed displacements were estimated from ERA-Interim (surface pressure), MERRA-land (soil moisture and snow) as well as ECCO2 ocean bottom pressure.
We used Multichannel Singular Spectrum Analysis to determine common seasonal signals in two case studies, adopting a 3-year lag window as the optimal window size. We also inferred the statistical significance of the oscillations through the Monte Carlo MSSA method (Allen and Robertson, 1996). In the first case study, we investigated the common spatio-temporal seasonal signals for all stations, dividing the selected stations by continent. For instance, for stations located in Europe, seasonal oscillations account for approximately 45% of the variance of the GPS-derived data. A much higher share of the variance, about 92%, is explained by the seasonal signals in the hydrological loading, while the non-tidal oceanic loading accounted for 31% of the total variance. In the second case study, we analysed the capability of the MSSA method to establish causality between several time series. Each estimated Principal Component represents a pattern of the common signal for all analysed data. For the ZIMM station (Zimmerwald, Switzerland), the 1st, 2nd, 9th and 10th Principal Components, which account for 35% of the variance, correspond to the annual and semi-annual signals. In this part, we applied the non-parametric MSSA approach to extract the common seasonal signals from the GPS time series and environmental loadings for each of the 250 stations, showing clearly that part of the seasonal signal reflects real geophysical effects. REFERENCES: 1. Allen, M. and Robertson, A.: 1996, Distinguishing modulated oscillations from coloured noise in multivariate datasets. Climate Dynamics, 12, No. 11, 775-784. DOI: 10.1007/s003820050142. 2. Dong, D., Fang, P., Bock, Y., Cheng, M.K. and Miyazaki, S.: 2002, Anatomy of apparent seasonal variations from GPS-derived site position time series. Journal of Geophysical Research, 107, No. B4, 2075. DOI: 10.1029/2001JB000573. 3. Rebischung, P., Altamimi, Z., Ray, J. and Garayt, B.: 2016, The IGS contribution to ITRF2014. Journal of Geodesy, 90, No. 7, 611-630. DOI: 10.1007/s00190-016-0897-6.
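The core of MSSA is an eigen-decomposition of the lag-covariance of the stacked, lagged channels. A minimal numpy sketch (reconstruction by diagonal averaging and the Monte Carlo significance test are omitted; all names are ours):

```python
import numpy as np

def mssa(X, M):
    """Minimal Multichannel SSA sketch. X is (N, L): N samples of L
    channels; M is the lag window. Builds the joint trajectory matrix
    (each row holds M lags of all L channels), eigen-decomposes its
    covariance, and returns eigenvalues (descending) and the
    corresponding principal components."""
    N, L = X.shape
    K = N - M + 1
    traj = np.hstack([np.column_stack([X[j:j + K, ch] for j in range(M)])
                      for ch in range(L)])           # shape (K, L*M)
    C = traj.T @ traj / K
    eigvals, eigvecs = np.linalg.eigh(C)
    order = np.argsort(eigvals)[::-1]
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    pcs = traj @ eigvecs
    return eigvals, pcs
```

A seasonal signal shared by the channels concentrates in a pair of leading eigenvalue/eigenvector modes (sine and cosine phases), which is why the abstract can attribute specific Principal Component pairs to the annual and semi-annual terms.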
NASA Astrophysics Data System (ADS)
Abe, R.; Hamada, K.; Hirata, N.; Tamura, R.; Nishi, N.
2015-05-01
As with BIM-based quality management in the construction industry, demand for quality management of the fabrication process of structural members is growing in the shipbuilding field, where accurate tracking of the three-dimensional deformation at each process stage is strongly required. This study therefore examines three-dimensional measurement methods for shipbuilding. In a shipyard, large equipment and components are intricately arranged in a limited space, so the placement of measuring instruments and targets is restricted. Moreover, because the measured members move between processes, establishing reference points for time-series comparison requires special care. This paper discusses a method for measuring welding deformation in time series using a total station and, using multiple measurement datasets obtained with this approach, evaluates the amount of deformation at each process stage.
Documentation of a spreadsheet for time-series analysis and drawdown estimation
Halford, Keith J.
2006-01-01
Drawdowns during aquifer tests can be obscured by barometric pressure changes, earth tides, regional pumping, and recharge events in the water-level record. These stresses can create water-level fluctuations that should be removed from observed water levels prior to estimating drawdowns. Simple models have been developed for estimating unpumped water levels during aquifer tests that are referred to as synthetic water levels. These models sum multiple time series such as barometric pressure, tidal potential, and background water levels to simulate non-pumping water levels. The amplitude and phase of each time series are adjusted so that synthetic water levels match measured water levels during periods unaffected by an aquifer test. Differences between synthetic and measured water levels are minimized with a sum-of-squares objective function. Root-mean-square errors during fitting and prediction periods were compared multiple times at four geographically diverse sites. Prediction error equaled fitting error when fitting periods were greater than or equal to four times prediction periods. The proposed drawdown estimation approach has been implemented in a spreadsheet application. Measured time series are independent so that collection frequencies can differ and sampling times can be asynchronous. Time series can be viewed selectively and magnified easily. Fitting and prediction periods can be defined graphically or entered directly. Synthetic water levels for each observation well are created with earth tides, measured time series, moving averages of time series, and differences between measured and moving averages of time series. Selected series and fitting parameters for synthetic water levels are stored and drawdowns are estimated for prediction periods. Drawdowns can be viewed independently and adjusted visually if an anomaly skews initial drawdowns away from 0. 
The number of observations in a drawdown time series can be reduced by averaging across user-defined periods. Raw or reduced drawdown estimates can be copied from the spreadsheet application or written to tab-delimited ASCII files.
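The fitting step, adjusting the amplitude and phase of each input series so the synthetic water level matches the pumping-free record, reduces to linear least squares when each tidal constituent is represented by a sine/cosine pair. An illustrative sketch of the idea (not the spreadsheet's actual code; names and the example are ours):

```python
import numpy as np

def fit_synthetic_level(water, series_list, t, tide_periods):
    """Fit a synthetic water level as a weighted sum of measured time
    series (e.g. barometric pressure) plus sine/cosine pairs for each
    tidal period; the sine/cosine pair is the linear-algebra way of
    fitting an amplitude and a phase. Returns the fitted synthetic
    series and the coefficients."""
    cols = [np.ones_like(t)]
    cols += list(series_list)
    for P in tide_periods:
        cols += [np.sin(2 * np.pi * t / P), np.cos(2 * np.pi * t / P)]
    X = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(X, water, rcond=None)
    return X @ coef, coef
```

Fitting would be restricted to periods unaffected by the aquifer test; drawdown during the test is then the measured level minus the extrapolated synthetic level.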
NASA Astrophysics Data System (ADS)
Li, Lingqi; Gottschalk, Lars; Krasovskaia, Irina; Xiong, Lihua
2018-01-01
Reconstruction of missing runoff data is important for resolving the conflict between the common occurrence of gaps in records and the fundamental need for complete time series in reliable hydrological research. The conventional empirical orthogonal functions (EOF) approach has been documented to be useful for interpolating hydrological series based upon spatiotemporal decomposition of runoff variation patterns, without additional measurements (e.g., precipitation, land cover). This study develops a new EOF-based approach (abbreviated as CEOF) that conditions the EOF expansion on the oscillations at the outlet (or any other reference station) of a target basin and creates a set of residual series by removing the dependence on this reference series, in order to redefine the amplitude functions (components). This development allows a transparent hydrological interpretation of the dimensionless components and thereby strengthens their capacity to explain various runoff regimes in a basin. The two approaches are demonstrated in an application to discharge observations from the Ganjiang basin, China. Two alternatives for determining amplitude functions, based on centred and standardised series, respectively, are tested. The convergence of the reconstruction at different sites as a function of the number of components, and its relation to the characteristics of each site, are analysed. Results indicate that the CEOF approach offers an efficient way to restore runoff records with only one to four components; it performs better in large nested basins than at headwater sites and often outperforms the EOF approach when using standardised series, especially in improving infilling accuracy for low flows. Comparisons against other interpolation methods (i.e., nearest neighbour, linear regression, inverse distance weighting) further confirm the advantage of the EOF-based approaches in avoiding spatial and temporal inconsistencies in estimated series.
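A generic EOF-based infilling scheme (not the paper's CEOF formulation, which additionally conditions on a reference station) iterates between a truncated SVD and replacement of the missing entries:

```python
import numpy as np

def eof_infill(X, n_modes=2, n_iter=100):
    """Fill missing values (NaNs) in a (time x station) runoff matrix:
    initialise the gaps with column means, then repeatedly truncate an
    SVD to the leading modes and overwrite the gaps with the truncated
    reconstruction, keeping the observed entries fixed."""
    mask = np.isnan(X)
    filled = np.where(mask, np.nanmean(X, axis=0, keepdims=True), X)
    for _ in range(n_iter):
        U, s, Vt = np.linalg.svd(filled, full_matrices=False)
        recon = (U[:, :n_modes] * s[:n_modes]) @ Vt[:n_modes]
        filled = np.where(mask, recon, X)
    return filled
```

The number of retained modes plays the same role as the "one to four components" in the abstract: each mode is a spatial pattern (EOF) times an amplitude function in time.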
NASA Astrophysics Data System (ADS)
Kappler, Karl N.; Schneider, Daniel D.; MacLean, Laura S.; Bleier, Thomas E.
2017-08-01
A method is described for identifying pulsations in magnetic field time series that are simultaneously present in multiple channels of data at one or more sensor locations. Candidate pulsations of interest are first identified in geomagnetic time series by inspection. Time series of these "training events" are represented in matrix form and transpose-multiplied to generate time-domain covariance matrices. The ranked eigenvectors of this matrix are stored as a feature of the pulsation. In the second stage of the algorithm, a sliding window (approximately the width of the training event) is moved across the vector-valued time series comprising the channels on which the training event was observed. At each window position, the data covariance matrix and associated eigenvectors are calculated. We compare the orientation of the dominant eigenvectors of the training data to those from the windowed data and flag windows where the dominant eigenvector directions are similar. This was successful in automatically identifying pulses which share polarization and appear to be from the same source process. We apply the method to a case study of continuously sampled (50 Hz) data from six observatories, each equipped with three-component induction coil magnetometers. We examine a 90-day interval of data associated with a cluster of four observatories located within 50 km of Napa, California, together with two remote reference stations, one 100 km north of the cluster and the other 350 km south. When the training data contain signals present at the remote reference observatories, we are reliably able to identify and extract global geomagnetic signals such as solar-generated noise. When the training data contain pulsations observed only in the cluster of local observatories, we identify several types of non-plane-wave signals having similar polarization.
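The two-stage detection, storing the training event's dominant covariance eigenvector and then flagging sliding windows whose dominant eigenvector points the same way, can be sketched as follows (function names and the similarity threshold are our choices, not the paper's):

```python
import numpy as np

def dominant_eigvec(window):
    """Dominant eigenvector of the time-domain covariance of a
    (samples x channels) window."""
    C = window.T @ window
    w, V = np.linalg.eigh(C)
    return V[:, -1]                       # eigh sorts eigenvalues ascending

def scan_for_pulses(data, template_vec, width, threshold=0.95):
    """Slide a window across multichannel data and flag start indices
    where the dominant covariance eigenvector aligns with the training
    event's (|cosine of angle| > threshold; abs() handles the sign
    ambiguity of eigenvectors)."""
    flags = []
    for start in range(len(data) - width + 1):
        v = dominant_eigvec(data[start:start + width])
        if abs(v @ template_vec) > threshold:
            flags.append(start)
    return flags
```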
Time evolution of an SLR reference frame
NASA Astrophysics Data System (ADS)
Angermann, D.; Gerstl, M.; Kelm, R.; Müller, H.; Seemüller, W.; Vei, M.
2002-07-01
On the basis of LAGEOS-1 and LAGEOS-2 data we computed a 10-year (1990-2000) solution for SLR station positions and velocities. The paper describes the data processing with the DGFI software package DOGS. We present results for station coordinates and their time variation for 41 stations of the global SLR network, and discuss the stability and time evolution of the SLR reference frame established in this way. We applied different methods to assess the quality and consistency of the SLR results. The results presented in this paper include: (1) a time series of weekly estimated station coordinates; (2) a comparison of the 10-year LAGEOS-1 and LAGEOS-2 solutions; (3) a comparison of 2.5-year solutions with the combined 10-year solution to assess the internal stability and the time evolution of the SLR reference frame; (4) a comparison of the SLR reference frame with ITRF97; and (5) a comparison of SLR station velocities with those of ITRF97 and NNR NUVEL-1A.
Cost Analysis Sources and Documents Data Base Reference Manual (Update)
1989-06-01
M: Reference Manual PRICE H: Training Course Workbook 11. Use in Cost Analysis. Important source of cost estimates for electronic and mechanical...Nature of Data. Contains many microeconomic time series by month or quarter. 5. Level of Detail. Very detailed. 6. Normalization Processes Required...Reference Manual. Moorestown, N.J.: GE Corporation, September 1986. 64. PRICE Training Course Workbook. Moorestown, N.J.: GE Corporation, February 1986
Analysis of Vlbi, Slr and GPS Site Position Time Series
NASA Astrophysics Data System (ADS)
Angermann, D.; Krügel, M.; Meisel, B.; Müller, H.; Tesmer, V.
Conventionally, the IERS terrestrial reference frame (ITRF) is realized by the adoption of a set of epoch coordinates and linear velocities for a set of global tracking stations. Due to the remarkable progress of the space geodetic observation techniques (e.g. VLBI, SLR, GPS), the accuracy and consistency of the ITRF have increased continuously. The accuracy achieved today is mainly limited by technique-related systematic errors, which are often poorly characterized or quantified. It is therefore essential to analyze the individual techniques' solutions with respect to systematic differences, models, parameters, datum definition, etc. The main subject of this presentation is the analysis of GPS, SLR and VLBI time series of site positions. The investigations are based on SLR and VLBI solutions computed at DGFI with the software systems DOGS (SLR) and OCCAM (VLBI). The GPS time series are based on weekly IGS station coordinate solutions. We analyze the time series with respect to the issues mentioned above. In particular, we characterize the noise in the time series, identify periodic signals, and investigate non-linear effects that complicate the assignment of linear velocities to global tracking sites. One important aspect is the comparison of results obtained by different techniques at colocation sites.
NASA Astrophysics Data System (ADS)
Coulot, David; Richard, Jean-Yves
2017-04-01
Many major indicators of climate change are monitored with space observations (sea level rise from satellite altimetry, ice melting from dedicated satellites, etc.). This monitoring is highly dependent on references (positions and velocities of ground observing instruments, orbits of satellites, etc.) that only geodesy can provide. The current accuracy of these references is not sufficient to fully meet the challenges posed by the constantly evolving Earth system, and can consequently limit the accuracy of these indicators. For this reason, in the framework of the Global Geodetic Observing System (GGOS), stringent requirements have been set for the International Terrestrial Reference Frame (ITRF) for the next decade: an accuracy at the level of 1 mm and a stability at the level of 0.1 mm/yr. This means an improvement of the current quality of the ITRF by a factor of 5-10. Improving the quality of the geodetic references requires a thorough reassessment of the methodologies involved. The most relevant and promising method to improve this quality is the direct combination (Combination at Observation Level, COL) of the space-geodetic measurements used to compute the official references of the International Earth Rotation and Reference Systems Service (IERS). The GEODESIE project aims at (i) determining highly accurate, global, and consistent references (time series of Terrestrial Reference Frames and Celestial Reference Frames, of Earth's Orientation Parameters, and orbits of Earth observation satellites) and (ii) providing the geophysical and climate research communities with these references, for a better estimation of geocentric sea level rise, ice mass balance, and ongoing climate change. Time series of sea levels computed from altimetric data and tide gauge records with these references (orbits of satellite altimeters, Terrestrial Reference Frames and related vertical velocities of stations) will also be provided.
The geodetic references will be an essential basis for Earth observation and monitoring in support of the challenges of the century. The geocentric time series of sea levels will make it possible to better understand (i) the drivers of the global mean sea level rise and of regional variations of sea level and (ii) the contribution of the global climate change induced by anthropogenic greenhouse gas emissions to these drivers. All the results, together with computation and quality assessment reports, will be available on a website designed and opened in the summer of 2017. This project, supported by the French Agence Nationale de la Recherche (ANR) for the period 2017-2020, will be an unprecedented opportunity to provide the French Groupe de Recherche de Géodésie Spatiale (GRGS) with complete simulation and data processing capabilities to prepare for the future arrival of space missions such as the European Geodetic Reference Antenna in SPace (E-GRASP) and to contribute significantly to the GGOS with accurate references.
The Potential of Tropospheric Gradients for Regional Precipitation Prediction
NASA Astrophysics Data System (ADS)
Boisits, Janina; Möller, Gregor; Wittmann, Christoph; Weber, Robert
2017-04-01
Changes of temperature and humidity in the neutral atmosphere cause variations in tropospheric path delays and tropospheric gradients. By estimating zenith wet delays (ZWD) and gradients using a GNSS reference station network, the obtained time series provide information about spatial and temporal variations of water vapour in the atmosphere. Thus, GNSS-based tropospheric parameters can contribute to the forecasting of regional precipitation events. In a recently finalized master thesis at TU Wien, the potential of tropospheric gradients for weather prediction was investigated. To this end, ZWD and gradient time series at selected GNSS reference stations were compared to precipitation data over a period of six months (April to September 2014). The selected GNSS stations form two test areas within Austria. All required meteorological data were provided by the Central Institution for Meteorology and Geodynamics (ZAMG). Two characteristics in the ZWD and gradient time series can be anticipated in the case of an approaching weather front. First, an induced asymmetry in tropospheric delays results both in an increased magnitude of the gradient and in gradients pointing towards the weather front. Second, an increase in ZWD reflects the increased water vapour concentration right before a precipitation event. To investigate these characteristics, exemplary test events were processed. On the one hand, the sequence of the anticipated increase in ZWD at each GNSS station, obtained by cross correlation of the time series, indicates the direction of the approaching weather front. On the other hand, the corresponding peak in the gradient time series also allows the direction of movement to be deduced. To verify the results, precipitation data from ZAMG were used. It can be deduced that tropospheric gradients show high potential for predicting precipitation events.
While ZWD time series tend to indicate the orientation of the air mass boundary, gradients tend to indicate the direction of movement of an approaching weather front. Additionally, our investigations have shown that gradients are able to capture the characteristics of an approaching weather front twenty to thirty hours before the precipitation event, which allows a first indication well in advance. In conclusion, the utilization of GNSS tropospheric parameters, in particular tropospheric gradients, has the potential to contribute substantially to weather forecasting models.
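The cross-correlation step used to order the ZWD increases across stations amounts to finding the lag of maximum correlation between pairs of series; a minimal sketch (our function, illustrating the principle only):

```python
import numpy as np

def lag_of_max_correlation(a, b):
    """Lag (in samples) at which series b best matches series a, from
    the full cross-correlation of the mean-removed series. A positive
    lag means the signal arrives at station b later than at station a;
    the pattern of lags across a network indicates the front's
    direction of travel."""
    a = a - a.mean()
    b = b - b.mean()
    xc = np.correlate(b, a, mode="full")
    return np.argmax(xc) - (len(a) - 1)
```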
Kepler Fine Guidance Sensor Data
NASA Technical Reports Server (NTRS)
Van Cleve, Jeffrey; Campbell, Jennifer Roseanna
2017-01-01
The Kepler and K2 missions collected Fine Guidance Sensor (FGS) data in addition to the science data, as discussed in the Kepler Instrument Handbook (KIH; Van Cleve and Caldwell 2016). The FGS CCDs are frame transfer devices (KIH Table 7) located in the corners of the Kepler focal plane (KIH Figure 24), which are read out 10 times every second. The FGS data are being made available to the user community for scientific analysis as flux and centroid time series, along with a limited number of FGS full frame images which may be useful for constructing a World Coordinate System (WCS) or otherwise putting the time series data in context. This document describes the data content and file format, and gives example MATLAB scripts to read the time series. Three file types are delivered as the FGS data:
1. Flux and Centroid (FLC) data: time series of star signal and centroid data.
2. Ancillary FGS Reference (AFR) data: a catalog of information about the observed stars in the FLC data.
3. FGS Full-Frame Image (FGI) data: full-frame image snapshots of the FGS CCDs.
Brewer spectrometer total ozone column measurements in Sodankylä
NASA Astrophysics Data System (ADS)
Karppinen, Tomi; Lakkala, Kaisa; Karhu, Juha M.; Heikkinen, Pauli; Kivi, Rigel; Kyrö, Esko
2016-06-01
Brewer total ozone column measurements started in Sodankylä in May 1988, 9 months after the signing of the Montreal Protocol. The Brewer instrument has been well maintained and frequently calibrated since then to produce a high-quality ozone time series now spanning more than 25 years. The data have been uniformly reprocessed for 1988-2014. The quality of the data has been assured by automatic data rejection rules as well as by manual checking. Daily mean values calculated from the highest-quality direct sun measurements are available 77 % of the time, with up to 75 measurements per day on clear days. Zenith sky measurements fill another 14 % of the time series, and winter months are sparsely covered by moon measurements. The time series provides information for surveying the evolution of the Arctic ozone layer and can be used as a reference point for assessing other total ozone column measurement practices.
Estimating the effective spatial resolution of an AVHRR time series
Meyer, D.J.
1996-01-01
A method is proposed to estimate the spatial degradation of geometrically rectified AVHRR data resulting from misregistration and off-nadir viewing, and to infer the cumulative effect of these degradations over time. Misregistrations are measured using high-resolution imagery as a geometric reference, and pixel sizes are computed directly from satellite zenith angles. The influence of neighbouring features on a nominal 1 km by 1 km pixel over a given site is estimated from the above information and expressed as a spatial distribution whose spatial frequency response is used to define an effective field-of-view (EFOV) for a time series. In a demonstration of the technique applied to images from the Conterminous U.S. AVHRR data set, an EFOV of 3.1 km in the east-west dimension and 19 km in the north-south dimension was estimated for a time series accumulated over a grasslands test site.
Student Understanding of Time in Special Relativity: Simultaneity and Reference Frames.
ERIC Educational Resources Information Center
Scherr, Rachel E.; Shaffer, Peter S.; Vokos, Stamatis
2001-01-01
Reports on an investigation of students' understanding of the concept of time in special relativity. Discusses a series of research tasks to illustrate how student reasoning of fundamental concepts of relativity was probed. Indicates that after standard instruction, students have serious difficulties with the relativity of simultaneity and the…
Sea change: Charting the course for biogeochemical ocean time-series research in a new millennium
NASA Astrophysics Data System (ADS)
Church, Matthew J.; Lomas, Michael W.; Muller-Karger, Frank
2013-09-01
Ocean time-series provide vital information needed for assessing ecosystem change. This paper summarizes the historical context, major program objectives, and future research priorities for three contemporary ocean time-series programs: the Hawaii Ocean Time-series (HOT), the Bermuda Atlantic Time-series Study (BATS), and the CARIACO Ocean Time-Series. These three programs operate in physically and biogeochemically distinct regions of the world's oceans, with HOT and BATS located in the open-ocean waters of the subtropical North Pacific and North Atlantic, respectively, and CARIACO situated in the anoxic Cariaco Basin of the tropical Atlantic. All three programs sustain near-monthly shipboard occupations of their field sampling sites, with HOT and BATS beginning in 1988, and CARIACO initiated in 1996. The resulting data provide some of the only multi-disciplinary, decadal-scale determinations of time-varying ecosystem change in the global ocean. Facilitated by a scoping workshop (September 2010) sponsored by the Ocean Carbon Biogeochemistry (OCB) program, leaders of these time-series programs sought community input on existing program strengths and on future research directions. Themes that emerged from these discussions included: 1. Shipboard time-series programs are key to informing our understanding of the connectivity between changes in ocean climate and biogeochemistry. 2. The scientific and logistical support provided by shipboard time-series programs forms the backbone for numerous research and education programs; future studies should be encouraged that seek mechanistic understanding of the ecological interactions underlying the biogeochemical dynamics at these sites. 3. Detecting time-varying trends in ocean properties and processes requires consistent, high-quality measurements; time-series must carefully document analytical procedures and, where possible, trace the accuracy of analyses to certified standards and internal reference materials. 4. Leveraged implementation, testing, and validation of autonomous and remote observing technologies at time-series sites provide new insights into the spatiotemporal variability underlying ecosystem changes. 5. The value of existing time-series data for formulating and validating ecosystem models should be promoted. In summary, the scientific underpinnings of ocean time-series programs remain as strong and important today as when these programs were initiated. The emerging data inform our knowledge of the ocean's biogeochemistry and ecology, and improve our predictive capacity about planetary change.
Monitoring of Earth Rotation by VLBI
NASA Technical Reports Server (NTRS)
Ma, Chopo; MacMillan, D. S.
2000-01-01
Monitoring Earth rotation with Very Long Baseline Interferometry (VLBI) has unique potential because of direct access to the Celestial Reference Frame (CRF) and Terrestrial Reference Frame (TRF) and the feasibility of re-analyzing the entire data set. While formal precision better than 0.045 mas for pole and 0.002 ms for UT1 has been seen in the best 24-hr data, the accuracy of the Earth Orientation Parameter (EOP) time series as a whole is subject to logistical, operational, analytical and conceptual constraints. The current issues related to the VLBI data set and the CORE program for greater time resolution, such as analysis consistency, network jitter and reference frame stability, will be discussed.
Using Time Series Analysis to Predict Cardiac Arrest in a PICU.
Kennedy, Curtis E; Aoki, Noriaki; Mariscalco, Michele; Turley, James P
2015-11-01
To build and test cardiac arrest prediction models in a PICU, using time series analysis as input, and to measure changes in prediction accuracy attributable to different classes of time series data. Retrospective cohort study. A thirty-one-bed academic PICU that provides care for medical and general surgical (not congenital heart surgery) patients. Patients experiencing a cardiac arrest in the PICU and requiring external cardiac massage for at least 2 minutes. None. One hundred three cases of cardiac arrest and 109 control cases were used to prepare a baseline dataset that consisted of 1,025 variables in four data classes: multivariate, raw time series, clinical calculations, and time series trend analysis. We trained 20 arrest prediction models using a matrix of five feature sets (combinations of data classes) with four modeling algorithms: linear regression, decision tree, neural network, and support vector machine. The reference model (multivariate data with regression algorithm) had an accuracy of 78% and an area under the receiver operating characteristic curve of 87%. The best model (multivariate + trend analysis data with support vector machine algorithm) had an accuracy of 94% and an area under the receiver operating characteristic curve of 98%. Cardiac arrest predictions based on a traditional model built with multivariate data and a regression algorithm misclassified cases 3.7 times more frequently than predictions that included time series trend analysis and were built with a support vector machine algorithm. Although the final model lacks the specificity necessary for clinical application, we have demonstrated how information from time series data can be used to increase the accuracy of clinical prediction models.
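The gain from the trend-analysis data class can be illustrated with a small sketch of how such features might be derived before training. This is an illustrative reconstruction, not the authors' code: the function name, window handling, and sample heart-rate values are all assumptions.

```python
import numpy as np

def trend_features(window, dt=1.0):
    """Least-squares trend summary of one vital-sign window.

    Sketch of the 'time series trend analysis' data class: each raw
    window is reduced to low-order polynomial trends (slope, curvature)
    that can be appended to the multivariate features before training a
    classifier such as a support vector machine.
    """
    t = np.arange(len(window)) * dt
    slope = np.polyfit(t, window, 1)[0]      # linear trend per time step
    curvature = np.polyfit(t, window, 2)[0]  # quadratic (acceleration) term
    return slope, curvature

# e.g. an accelerating fall in heart rate yields a negative slope and curvature
hr = np.array([120.0, 118.0, 115.0, 111.0, 106.0])
slope, curv = trend_features(hr)
```

Features of this kind carry direction-of-change information that a single multivariate snapshot cannot, which is consistent with the accuracy gain reported above.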
A Time Series of Mean Global Sea Surface Temperature from the Along-Track Scanning Radiometers
NASA Astrophysics Data System (ADS)
Veal, Karen L.; Corlett, Gary; Remedios, John; Llewellyn-Jones, David
2010-12-01
A climate data set requires a long time series of consistently processed data, with suitably long periods of overlap between different instruments that allow characterization of any inter-instrument biases. The data obtained from ESA's three Along-Track Scanning Radiometers (ATSRs) together comprise an 18-year record of SST with overlap periods of at least 6 months. The data from all three ATSRs have been consistently processed. These factors, together with the stability of the instruments and the precision of the derived SST, make this data set eminently suitable for the construction of a time series of SST that complies with many of the GCOS requirements for a climate data set. A time series of global and regional average SST anomalies has been constructed from the ATSR version 2 data set. An analysis of the overlap periods of successive instruments was used to remove inter-instrument biases and align the series to a common reference. An ATSR climatology has been developed and used to calculate the SST anomalies. The ATSR-1 time series and the AATSR time series have been aligned to ATSR-2. The largest adjustment is ~0.2 K between ATSR-2 and AATSR, which is suspected to be due to a shift of the 12 μm filter function of AATSR. An uncertainty of 0.06 K is assigned to the relative anomaly record that is derived from the dual three-channel night-time data. A relative uncertainty of 0.07 K is assigned to the dual night-time two-channel record, except in the ATSR-1 period (1994-1996), where it is larger.
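The overlap-based alignment step can be sketched in a few lines. This is a minimal illustration of the idea, assuming a constant offset between instruments; the actual analysis of the ATSR overlap periods is more involved, and all names here are illustrative.

```python
import numpy as np

def align_to_reference(series, reference, overlap_idx):
    """Align one SST anomaly series to a reference series.

    Sketch of inter-instrument bias removal: the mean difference over
    the common (overlap) samples is treated as a constant offset and
    subtracted, so the series shares the reference's baseline (here
    ATSR-2 would play the reference role).
    """
    bias = np.mean(series[overlap_idx] - reference[overlap_idx])
    return series - bias, bias

# synthetic example: a series offset by +0.2 K from the reference
reference = np.linspace(0.0, 1.0, 12)
series = reference + 0.2
aligned, bias = align_to_reference(series, reference, slice(6, 12))
```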
Two approaches to timescale modeling for proxy series with chronological errors.
NASA Astrophysics Data System (ADS)
Divine, Dmitry; Godtliebsen, Fred
2010-05-01
A substantial part of the proxy series used in paleoclimate research has chronological uncertainties. Any constructed timescale is therefore only an estimate of the true, but unknown, timescale. An accurate assessment of the timing of events in paleoproxy series and networks, as well as the use of proxy-based paleoclimate reconstructions in GCM model scoring experiments, requires the effect of these errors to be properly taken into account. We consider two types of timescale error models, corresponding to the two basic approaches to construction of the (depth-) age scale in a proxy series. Typically, the chronological control of a proxy series stemming from marine and terrestrial sedimentary archives is based on the use of 14C dates, reference horizons, or their combination. Depending on the prevalent origin of the available fix points (age markers), the following approaches to timescale modeling are proposed. 1) 14C dates. The algorithm uses a Markov-chain Monte Carlo sampling technique to generate an ordered set of perturbed age markers. Proceeding sequentially from the youngest to the oldest fixpoint, the sampler draws random numbers from the age distribution of each individual 14C date. Every subsequent perturbed age marker is generated such that the condition of no age reversal is fulfilled. A regression model is then applied to construct a simulated timescale. 2) Reference horizons (e.g. volcanic or dust layers, T bomb peak) generally provide absolutely dated fixpoints. Due to natural variability in the sedimentation (accumulation) rate, however, the dating uncertainty in the interpolated timescale tends to grow with the distance to the nearest fixpoint. The (accumulation, sedimentation) process associated with the formation of a proxy series is modelled using a stochastic Lévy process.
The respective increments of the process are drawn from a log-normal distribution with the mean/variance ratio prescribed as a site- (proxy-) dependent external parameter. The number of generated annual increments corresponds to the time interval between the considered reference horizons. The simulated series is then rescaled to match the length of the actual core section being modelled. Within each method, a multitude of timescales is generated, creating a number of possible realisations of a proxy series or a proxy-based reconstruction in the time domain. This allows consideration of a proxy record in a probabilistic framework. The effect of accounting for uncertainties in chronology on a reconstructed environmental variable is illustrated with two case studies of marine sediment records.
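The first approach (perturbing 14C age markers without age reversals) can be sketched as follows. This is a simplified illustration, assuming Gaussian age distributions and rejection sampling for the ordering constraint; the paper's sampler and the calibrated 14C distributions are more elaborate.

```python
import random

def perturb_age_markers(dates, sigmas, seed=0):
    """Draw one perturbed, strictly ordered set of age markers.

    Sketch of timescale-error model 1: proceeding from the youngest
    fixpoint to the oldest, each age is sampled from its (here
    Gaussian, for simplicity) date distribution, redrawing until the
    no-age-reversal condition is satisfied.  A regression model would
    then be fitted through the accepted markers to give one simulated
    timescale; repeating yields an ensemble of timescales.
    """
    rng = random.Random(seed)
    perturbed = []
    previous = float("-inf")
    for mu, sigma in zip(dates, sigmas):
        draw = rng.gauss(mu, sigma)
        while draw <= previous:  # reject draws that reverse the age order
            draw = rng.gauss(mu, sigma)
        perturbed.append(draw)
        previous = draw
    return perturbed

ages = perturb_age_markers([100.0, 200.0, 300.0], [10.0, 10.0, 10.0], seed=1)
```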
NASA Astrophysics Data System (ADS)
Di Piazza, A.; Cordano, E.; Eccel, E.
2012-04-01
The issue of climate change detection is considered a major challenge. In particular, high-temporal-resolution climate change scenarios are required in the evaluation of the effects of climate change on agricultural management (crop suitability, yields, risk assessment, etc.), energy production, and water management. In this work, a "weather generator" technique was used for downscaling climate change scenarios for temperature. An R package (RMAWGEN, Cordano and Eccel, 2011 - available on http://cran.r-project.org) was developed to generate synthetic daily weather conditions by using the theory of vector auto-regressive (VAR) models. The VAR model was chosen for its ability to maintain the temporal and spatial correlations among variables. In particular, observed time series of daily maximum and minimum temperature are transformed into "new" normally distributed variable time series, which are used to calibrate the parameters of a VAR model by ordinary least squares methods. The implemented algorithm, applied to monthly mean climatic values downscaled from Global Climate Model predictions, can therefore generate several stochastic daily scenarios in which the statistical consistency among series is preserved. Further details are available in the RMAWGEN documentation. An application is presented here using a dataset of daily temperature time series recorded at 41 different sites in the Trentino region for the period 1958-2010. The temperature time series were pre-processed to fill missing values (by a site-specifically calibrated Inverse Distance Weighting algorithm, corrected with elevation) and to remove inhomogeneities. Several climatic indices, useful for a range of impact assessment applications, were taken into account, and their time trends within the time series were analyzed.
The indices range from the more classical ones, such as annual mean temperatures, seasonal mean temperatures, and their anomalies (from the reference period 1961-1990), to the climate change indices selected from the list recommended by the World Meteorological Organization Commission for Climatology (WMO-CCL) and the Research Programme on Climate Variability and Predictability (CLIVAR) project's Expert Team on Climate Change Detection, Monitoring and Indices (ETCCDMI). Each index was applied both to observed (and processed) data and to synthetic time series produced by the weather generator, over the thirty-year reference period 1981-2010, in order to validate the procedure. Climate projections were statistically downscaled for a selection of sites for the two 30-year periods 2021-2050 and 2071-2099 of the European project "Ensembles" multi-model output (scenario A1B). The use of several climatic indices strengthens the trend analysis of both the generated synthetic series and the future climate projections.
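The generation core of such a weather generator can be sketched with a first-order VAR recursion. RMAWGEN is an R package; this is a Python sketch of the same idea, and the coefficient matrix and noise covariance below are placeholders standing in for parameters that would be fitted to the normalized temperature series by ordinary least squares.

```python
import numpy as np

def simulate_var1(A, noise_cov, n_steps, seed=0):
    """Generate synthetic series from a (hypothetical) fitted VAR(1).

    Sketch of the weather-generator core: after the observed Tmax/Tmin
    series are transformed to normality and the VAR coefficients are
    estimated, new realizations are produced by iterating
    x_t = A x_{t-1} + e_t with spatially correlated noise e_t, which
    preserves the temporal and cross-site correlation structure.
    """
    rng = np.random.default_rng(seed)
    k = A.shape[0]
    x = np.zeros((n_steps, k))
    for t in range(1, n_steps):
        x[t] = A @ x[t - 1] + rng.multivariate_normal(np.zeros(k), noise_cov)
    return x

# placeholder parameters for two correlated variables (e.g. Tmax, Tmin)
A = np.array([[0.5, 0.1], [0.1, 0.5]])
noise_cov = 0.1 * np.eye(2)
scenario = simulate_var1(A, noise_cov, n_steps=200)
```

The simulated normal-scale series would then be back-transformed to the original temperature distributions, as in the published procedure.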
Impact of seasonal and postglacial surface displacement on global reference frames
NASA Astrophysics Data System (ADS)
Krásná, Hana; Böhm, Johannes; King, Matt; Memin, Anthony; Shabala, Stanislav; Watson, Christopher
2014-05-01
The calculation of actual station positions requires several corrections, some of which are recommended by the International Earth Rotation and Reference Systems Service (IERS) Conventions (e.g., solid Earth tides and ocean tidal loading), as well as other corrections, e.g. accounting for hydrology and atmospheric loading. To investigate the pattern of omitted non-linear seasonal motion, we estimated empirical harmonic models for selected stations within a global solution of suitable Very Long Baseline Interferometry (VLBI) sessions, as well as mean annual models obtained by stacking yearly time series of station positions. To validate these models, we compare them to displacement series obtained from Gravity Recovery and Climate Experiment (GRACE) data and to hydrology corrections determined from global models. Furthermore, we assess the impact of the seasonal station motions on the celestial reference frame as well as on Earth orientation parameters derived from real and also artificial VLBI observations. In the second part of the presentation we apply vertical rates of the ICE-5G_VM2_2012 vertical land movement grid to the vertical station velocities. We assess the impact of postglacial uplift on the variability in the scale, given different sampling of the postglacial signal in time, and hence on the uncertainty in the scale rate of the estimated terrestrial reference frame.
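The idea of an empirical harmonic model for seasonal station motion can be sketched as a small least-squares fit. This is an illustrative single-coordinate version under assumed names and synthetic data; real VLBI solutions estimate these terms inside a global adjustment, not one coordinate at a time.

```python
import numpy as np

def fit_annual_harmonic(t_years, displacement):
    """Fit offset, trend, and an annual harmonic to one coordinate series.

    Sketch of an 'empirical harmonic model' of seasonal motion: regress
    the displacement on [1, t, cos(2*pi*t), sin(2*pi*t)] and recover the
    annual amplitude from the two harmonic coefficients.
    """
    w = 2.0 * np.pi  # one cycle per year, with t in years
    G = np.column_stack([np.ones_like(t_years), t_years,
                         np.cos(w * t_years), np.sin(w * t_years)])
    coeffs, *_ = np.linalg.lstsq(G, displacement, rcond=None)
    amplitude = np.hypot(coeffs[2], coeffs[3])
    return coeffs, amplitude

# synthetic series: 0.5 mm/yr trend plus a 3 mm annual signal
t = np.linspace(0.0, 6.0, 300)
y = 1.0 + 0.5 * t + 3.0 * np.cos(2.0 * np.pi * t - 0.7)
coeffs, amp = fit_annual_harmonic(t, y)
```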
A laboratory assessment of the measurement accuracy of weighing type rainfall intensity gauges
NASA Astrophysics Data System (ADS)
Colli, M.; Chan, P. W.; Lanza, L. G.; La Barbera, P.
2012-04-01
In recent years the WMO Commission for Instruments and Methods of Observation (CIMO) has fostered noticeable advancements in the accuracy of precipitation measurement by providing recommendations on the standardization of equipment and exposure, instrument calibration, and data correction, following various comparative campaigns involving manufacturers and national meteorological services from the participating countries (Lanza et al., 2005; Vuerich et al., 2009). Extreme events analysis has proven to be highly affected by the accuracy of on-site rainfall intensity (RI) measurements (see e.g. Molini et al., 2004), and the time resolution of the available RI series certainly constitutes another key factor in constructing hyetographs that are representative of real rain events. The OTT Pluvio2 weighing gauge (WG) and the GEONOR T-200 vibrating-wire precipitation gauge demonstrated very good performance under previous constant-flow-rate calibration efforts (Lanza et al., 2005). Although WGs provide better performance than more traditional tipping-bucket rain gauges (TBRs) under continuous and constant reference intensity, dynamic effects seem to affect the accuracy of WG measurements under real-world, time-varying rainfall conditions (Vuerich et al., 2009). The most relevant is due to the response time of the acquisition system and the resulting systematic delay of the instrument in assessing the exact weight of the bin containing cumulated precipitation. This delay assumes a relevant role when high-resolution rain intensity time series are sought from the instrument, as is the case in many hydrologic and meteo-climatic applications. This work reports the laboratory evaluation of Pluvio2 and T-200 rainfall intensity measurement accuracy. Tests are carried out by simulating different artificial precipitation events, namely non-stationary rainfall intensities, using a highly accurate dynamic rainfall generator.
Time series measured by an Ogawa drop counter (DC) at a field test site located within the Hong Kong International Airport (HKIA) were aggregated at a 1-minute scale and used as the reference for the artificial rain generation (Colli et al., 2012). The preliminary development and validation of the rainfall simulator for the generation of variable-time-step reference intensities is also shown. The generator is characterized by a sufficiently short time response with respect to the expected behavior of the weighing gauges, in order to ensure an effective comparison of the measured and reference intensities at very high temporal resolution.
Correlates of depression in bipolar disorder
Moore, Paul J.; Little, Max A.; McSharry, Patrick E.; Goodwin, Guy M.; Geddes, John R.
2014-01-01
We analyse time series from 100 patients with bipolar disorder for correlates of depression symptoms. As the sampling interval is non-uniform, we quantify the extent of missing and irregular data using new measures of compliance and continuity. We find that uniformity of response is negatively correlated with the standard deviation of sleep ratings (ρ = –0.26, p = 0.01). To investigate the correlation structure of the time series themselves, we apply the Edelson–Krolik method for correlation estimation. We examine the correlation between depression symptoms for a subset of patients and find that self-reported measures of sleep and appetite/weight show a lower average correlation than other symptoms. Using surrogate time series as a reference dataset, we find no evidence that depression is correlated between patients, though we note a possible loss of information from sparse sampling. PMID:24352942
Time Series of Greenland Ice-Sheet Elevations and Mass Changes from ICESat 2003-2009
NASA Astrophysics Data System (ADS)
Zwally, H. J.; Li, J.; Medley, B.; Robbins, J. W.; Yi, D.
2015-12-01
We follow the repeat-track analysis (RTA) of ICESat surface-elevation data with a second stage that adjusts the measured elevations on repeat passes to the reference track, taking into account the cross-track slope (αc), in order to construct elevation time series. The αc are obtained from simultaneous RTA solutions for αc, dh/dt, and h0. The height measurements on repeat tracks are initially interpolated to uniform along-track reference points (every 172 m) and times (ti), giving the h(xi,ti) used in the RTA solutions. The xi are the cross-track spacings from the reference track and i is the laser campaign index. The adjusted elevation measurements at the along-track reference points are hr(ti) = h(xi,ti) - xi tan(αc) - h0. The hr(ti) time series are averaged over 50 km cells, creating H(ti) series, and further averaged (weighted by cell area) to H(t) time series over drainage systems (DS), elevation bands, regions, and the entire ice sheet. Temperature-driven changes in the rate of firn compaction, CT(t), are calculated for 50 km cells with our firn-compaction model, giving I(t) = H(t) - CT(t) - B(t), where B(t) is the vertical motion of the bedrock. During 2003 to 2009, the average dCT(t)/dt in the accumulation zone is -5 cm/yr, which amounts to a -75 km3/yr correction to ice volume change estimates. The I(t) are especially useful for studying the seasonal cycle of mass gains and losses and interannual variations. The H(t) for the ablation zone are fitted with a multi-variate function with a linear component describing the upward component of ice flow plus winter accumulation (fall through spring) and a portion of a sine function describing the superimposed summer melting. During fall to spring the H(t) indicate that the upward motion of the ice flow is at a rate of 1 m/yr, giving an annual mass gain of 180 Gt/yr in the ablation zone.
The summer loss from surface melting in the high-melt summer of 2005 is 350 Gt/yr, giving a net surface loss of 170 Gt/yr from the ablation zone for 2005. During 2003-2008, the H(t) for the ablation zone show accelerations of the mass losses in the northwest DS8 and in the west-central DS7 (including Jacobshavn glacier) and offsetting decelerations of the mass losses in the east-central DS3 and southeast DS4, much of which occurred in 2008 possibly due to an eastward shift in the surface mass balance.
Theory and Realization of Global Terrestrial Reference Systems
NASA Technical Reports Server (NTRS)
Ma, C.; Bolotin, S.; Gipson, J.; Gordon, D.; Le Bail, K.; MacMillan, D.
2010-01-01
Comparison of realizations of the terrestrial reference frame. IGN and DGFI both generated realizations of the terrestrial reference frame under the auspices of the IERS from combination of the same space geodetic data. We examined both results for VLBI sites using the full geodetic VLBI data set with respect to site positions and velocities and time series of station positions, baselines and Earth orientation parameters. One of the difficulties encountered was matching episodic breaks and periods of non-linear motion of the two realizations with the VLBI models. Our analysis and conclusions will be discussed.
Correlation and Stacking of Relative Paleointensity and Oxygen Isotope Data
NASA Astrophysics Data System (ADS)
Lurcock, P. C.; Channell, J. E.; Lee, D.
2012-12-01
The transformation of a depth-series into a time-series is routinely implemented in the geological sciences. This transformation often involves correlation of a depth-series to an astronomically calibrated time-series. Eyeball tie-points with linear interpolation are still regularly used, although these have the disadvantages of being non-repeatable and not based on firm correlation criteria. Two automated correlation methods are compared: the simulated annealing algorithm (Huybers and Wunsch, 2004) and the Match protocol (Lisiecki and Lisiecki, 2002). Simulated annealing seeks to minimize energy (cross-correlation) as "temperature" is slowly decreased. The Match protocol divides records into intervals, applies penalty functions that constrain accumulation rates, and minimizes the sum of the squares of the differences between two series while maintaining the data sequence in each series. Paired relative paleointensity (RPI) and oxygen isotope records, such as those from IODP Site U1308 and/or reference stacks such as LR04 and PISO, are warped using known warping functions, and then the un-warped and warped time-series are correlated to evaluate the efficiency of the correlation methods. Correlations are performed in tandem to simultaneously optimize RPI and oxygen isotope data. Noise spectra are introduced at differing levels to determine correlation efficiency as noise levels change. A third potential method, known as dynamic time warping, involves minimizing the sum of distances between correlated point pairs across the whole series. A "cost matrix" between the two series is analyzed to find a least-cost path through the matrix. This least-cost path is used to nonlinearly map the time/depth of one record onto the depth/time of another. Dynamic time warping can be expanded to more than two dimensions and used to stack multiple time-series. This procedure can improve on arithmetic stacks, which often lose coherent high-frequency content during the stacking process.
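The third method described above, dynamic time warping, can be sketched compactly. This is a generic textbook implementation under illustrative names, not the authors' code; real depth-to-time correlation would add accumulation-rate constraints like those in the Match protocol.

```python
import numpy as np

def dtw_path(a, b):
    """Least-cost alignment path between two series (dynamic time warping).

    Build the cost matrix of pairwise distances, fill the cumulative
    cost matrix by dynamic programming, and trace the least-cost path
    back from the far corner.  The path nonlinearly maps the depth/time
    of one record onto the time/depth of the other.
    """
    n, m = len(a), len(b)
    cost = np.abs(np.subtract.outer(a, b))
    acc = np.full((n, m), np.inf)
    acc[0, 0] = cost[0, 0]
    for i in range(n):
        for j in range(m):
            if i == 0 and j == 0:
                continue
            best = min(acc[i - 1, j] if i else np.inf,
                       acc[i, j - 1] if j else np.inf,
                       acc[i - 1, j - 1] if i and j else np.inf)
            acc[i, j] = cost[i, j] + best
    # trace the least-cost path back to the origin
    path, i, j = [(n - 1, m - 1)], n - 1, m - 1
    while (i, j) != (0, 0):
        candidates = [(i - 1, j - 1), (i - 1, j), (i, j - 1)]
        i, j = min((p for p in candidates if p[0] >= 0 and p[1] >= 0),
                   key=lambda p: acc[p])
        path.append((i, j))
    return path[::-1], acc[n - 1, m - 1]

path, total = dtw_path(np.array([1.0, 2.0, 3.0]), np.array([1.0, 2.0, 3.0]))
```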
NASA Astrophysics Data System (ADS)
van der Heijden, Sven; Callau Poduje, Ana; Müller, Hannes; Shehu, Bora; Haberlandt, Uwe; Lorenz, Manuel; Wagner, Sven; Kunstmann, Harald; Müller, Thomas; Mosthaf, Tobias; Bárdossy, András
2015-04-01
For the design and operation of urban drainage systems with numerical simulation models, long, continuous precipitation time series with high temporal resolution are necessary. Suitable observed time series are rare. As a result, intelligent design concepts often use uncertain or unsuitable precipitation data, which renders them uneconomic or unsustainable. An expedient alternative to observed data is the use of long, synthetic rainfall time series as input for the simulation models. Within the project SYNOPSE, several different methods to generate synthetic precipitation data for urban drainage modelling are advanced, tested, and compared. The presented study compares four different approaches of precipitation models regarding their ability to reproduce rainfall and runoff characteristics. These include one parametric stochastic model (alternating renewal approach), one non-parametric stochastic model (resampling approach), one downscaling approach from a regional climate model, and one disaggregation approach based on daily precipitation measurements. All four models produce long precipitation time series with a temporal resolution of five minutes. The synthetic time series are first compared to observed rainfall reference time series. Comparison criteria include event based statistics like mean dry spell and wet spell duration, wet spell amount and intensity, long term means of precipitation sum and number of events, and extreme value distributions for different durations. Then they are compared regarding simulated discharge characteristics using an urban hydrological model on a fictitious sewage network. First results show a principal suitability of all rainfall models but with different strengths and weaknesses regarding the different rainfall and runoff characteristics considered.
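The event-based comparison criteria named above (mean dry and wet spell duration, wet spell amount, and so on) reduce to splitting a series into alternating spells. A minimal sketch, with an illustrative threshold; real criteria typically also impose a minimum dry gap between events.

```python
import numpy as np

def wet_dry_spells(intensity, wet_threshold=0.0):
    """Durations of alternating wet and dry spells in a rainfall series.

    Sketch of event-based statistics for comparing synthetic and
    observed 5-minute series: threshold the intensities into a wet/dry
    indicator, then collect run lengths (in time steps) of each state.
    Mean spell durations and wet-spell amounts follow directly.
    """
    wet = np.asarray(intensity) > wet_threshold
    spells = {"wet": [], "dry": []}
    run_len, run_wet = 1, wet[0]
    for w in wet[1:]:
        if w == run_wet:
            run_len += 1
        else:
            spells["wet" if run_wet else "dry"].append(run_len)
            run_len, run_wet = 1, w
    spells["wet" if run_wet else "dry"].append(run_len)
    return spells

spells = wet_dry_spells([0.0, 1.2, 0.8, 0.0, 0.0, 2.5])
```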
ERIC Educational Resources Information Center
Kan, Katherine L.
1994-01-01
Reviews graphic novels for young adults, including five titles from "The Adventures of Tintin," a French series that often uses ethnic and racial stereotypes which reflect the time in which they were published, and "Wolverine," a Marvel comic character adventure. (Contains six references.) (LRW)
Time-series analysis of the transcriptome and proteome of Escherichia coli upon glucose repression.
Borirak, Orawan; Rolfe, Matthew D; de Koning, Leo J; Hoefsloot, Huub C J; Bekker, Martijn; Dekker, Henk L; Roseboom, Winfried; Green, Jeffrey; de Koster, Chris G; Hellingwerf, Klaas J
2015-10-01
Time-series transcript and protein profiles were measured upon initiation of carbon catabolite repression (CCR) in Escherichia coli, in order to investigate the extent of post-transcriptional control in this prototypical response. A glucose-limited chemostat culture was used as the CCR-free reference condition. Stopping the pump and simultaneously adding a pulse of glucose, which saturated the cells for at least 1 h, was used to initiate the glucose response. Samples were collected and subjected to quantitative time-series analysis of both the transcriptome (using microarray analysis) and the proteome (through a combination of 15N metabolic labeling and mass spectrometry). Changes in the transcriptome and the corresponding proteome were analyzed using statistical procedures designed specifically for time-series data. By comparison of the two sets of data, a total of 96 genes were identified that are post-transcriptionally regulated. This gene list provides candidates for future in-depth investigation of the molecular mechanisms involved in post-transcriptional regulation during carbon catabolite repression in E. coli, such as the involvement of small RNAs. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.
37 CFR 1.78 - Claiming benefit of earlier filing date and cross-references to other applications.
Code of Federal Regulations, 2012 CFR
2012-07-01
... such prior-filed application, identifying it by application number (consisting of the series code and.... These time periods are not extendable. Except as provided in paragraph (a)(3) of this section, the... application. The time periods in this paragraph do not apply if the later-filed application is: (A) An...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-05
... limitation section (ALS) of their approved maintenance program (Time Limits Manual (TLM), chapters 05-00-01... airplanes used for pilot training. Revise their ALS of their approved maintenance program (TLM chapters 05... limitations section (ALS) of the operators approved maintenance program (reference the Time Limits Manual (TLM...
Time series analysis of infrared satellite data for detecting thermal anomalies: a hybrid approach
NASA Astrophysics Data System (ADS)
Koeppen, W. C.; Pilger, E.; Wright, R.
2011-07-01
We developed and tested an automated algorithm that analyzes thermal infrared satellite time series data to detect and quantify the excess energy radiated from thermal anomalies such as active volcanoes. Our algorithm enhances the previously developed MODVOLC approach, a simple point operation, by adding a more complex time series component based on the methods of the Robust Satellite Techniques (RST) algorithm. Using test sites at Anatahan and Kīlauea volcanoes, the hybrid time series approach detected ~15% more thermal anomalies than MODVOLC with very few, if any, known false detections. We also tested gas flares in the Cantarell oil field in the Gulf of Mexico as an end-member scenario representing very persistent thermal anomalies. At Cantarell, the hybrid algorithm showed only a slight improvement, but it did identify flares that were undetected by MODVOLC. We estimate that at least 80 MODIS images for each calendar month are required to create good reference images necessary for the time series analysis of the hybrid algorithm. The improved performance of the new algorithm over MODVOLC will result in the detection of low temperature thermal anomalies that will be useful in improving our ability to document Earth's volcanic eruptions, as well as detecting low temperature thermal precursors to larger eruptions.
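The time-series component added to the point operation can be sketched as a per-pixel comparison against monthly reference statistics. This is a simplified illustration of the RST-style idea, not the published algorithm: the function name, the z-score form, and the threshold k are assumptions.

```python
import numpy as np

def thermal_anomaly_mask(radiance, ref_mean, ref_std, k=3.0):
    """Flag pixels radiating anomalously versus a monthly reference.

    Sketch of the hybrid algorithm's time-series component: for each
    pixel, the current radiance is compared with the mean and standard
    deviation of that calendar month's reference images (built, per the
    abstract, from at least ~80 MODIS scenes per month); pixels more
    than k standard deviations above the reference are flagged.
    """
    z = (radiance - ref_mean) / ref_std
    return z > k

# one hot pixel against a flat 300-unit reference with sigma = 10
scene = np.array([[300.0, 400.0]])
mask = thermal_anomaly_mask(scene, ref_mean=300.0, ref_std=10.0)
```

A persistent source such as a gas flare raises the reference mean itself, which is why persistent anomalies are the hard end-member case noted above.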
Fuchs, Erich; Gruber, Christian; Reitmaier, Tobias; Sick, Bernhard
2009-09-01
Neural networks are often used to process temporal information, i.e., any kind of information related to time series. In many cases, time series contain short-term and long-term trends or behavior. This paper presents a new approach to capture temporal information with various reference periods simultaneously. A least-squares approximation of the time series with orthogonal polynomials is used to describe short-term trends contained in a signal (average, increase, curvature, etc.). Long-term behavior is modeled with the tapped delay lines of a time-delay neural network (TDNN). This network takes the coefficients of the orthogonal expansion of the approximating polynomial as inputs, thus considering short-term and long-term information efficiently. The advantages of the method are demonstrated by means of artificial data and two real-world application examples: the prediction of the number of users in a computer network, and online tool-wear classification in turning.
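The front end of this scheme, reducing each window to polynomial trend coefficients that feed the delay line, can be sketched briefly. A sketch under stated assumptions: numpy's polyfit uses a monomial rather than an orthogonal basis (equivalent fitted values, different conditioning), and the centered time axis is an illustrative choice.

```python
import numpy as np

def short_term_coefficients(window, degree=2):
    """Least-squares polynomial summary of one sliding window.

    Sketch of the paper's input stage: each window of the signal is
    approximated by a low-order polynomial, and the coefficients
    (average-like offset, increase, curvature) become the inputs to the
    TDNN's tapped delay line instead of the raw samples.
    """
    t = np.linspace(-1.0, 1.0, len(window))  # centered time axis
    # polyfit returns highest degree first; reverse to (c0, c1, c2, ...)
    return np.polyfit(t, window, degree)[::-1]

coeffs = short_term_coefficients(np.full(9, 5.0))
```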
ERIC Educational Resources Information Center
Fraser, Renee White; Shani, Hadasa
Intended as a companion piece to volume 2 in the Method Series, Environmental Health Planning (CE 024 230), this second of six volumes in the International Health Planning Reference Series is a combined literature review and annotated bibliography dealing with environmental factors in health planning for developing countries. The review identifies…
A new method of real-time detection of changes in periodic data stream
NASA Astrophysics Data System (ADS)
Lyu, Chen; Lu, Guoliang; Cheng, Bin; Zheng, Xiangwei
2017-07-01
Change point detection in periodic time series is highly desirable in many practical applications. We present a novel algorithm for this task comprising two phases: (1) anomaly measurement: on the basis of a typical regression model, we propose a new computational method to measure anomalies in a time series that does not require any reference data from other measurements; (2) change detection: we introduce a new martingale test that operates in an unsupervised and nonparametric way. We have conducted extensive experiments to test our algorithm systematically. The results suggest that our algorithm is directly applicable in many real-world change-point-detection applications.
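A minimal, hypothetical sketch of the martingale test phase: randomized conformal p-values combined into a power martingale. The strangeness here is a simple distance to a reference level, standing in for the paper's regression-based anomaly measure:

```python
import numpy as np

def power_martingale(series, train=20, eps=0.92, seed=0):
    # Strangeness = distance to a reference level fitted on the first `train`
    # points (a stand-in for the regression-based anomaly measure). Conformal
    # p-values are combined into a randomized power martingale; sustained
    # growth of the martingale is evidence of a change.
    rng = np.random.default_rng(seed)
    ref = np.median(series[:train])
    strangeness, mart, log_m = [], [], 0.0
    for x in series:
        s = abs(x - ref)
        strangeness.append(s)
        n = len(strangeness)
        greater = sum(v > s for v in strangeness)
        equal = sum(v == s for v in strangeness)
        p = (greater + rng.uniform() * equal) / n   # randomized conformal p-value
        log_m += np.log(eps) + (eps - 1.0) * np.log(p)
        mart.append(np.exp(log_m))
    return np.array(mart)

x = np.concatenate([np.zeros(30), np.full(30, 5.0)])  # step change at t = 30
m = power_martingale(x)                                # grows after the change
```

Under the no-change hypothesis the martingale stays near 1, so a threshold on its value gives an unsupervised, nonparametric detector.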
Integrating Analysis Goals for EOP, CRF and TRF
NASA Technical Reports Server (NTRS)
Ma, Chopo; MacMillan, Daniel; Petrov, Leonid
2002-01-01
In a simplified, idealized way the TRF (Terrestrial Reference Frame) can be considered a set of positions at epoch and corresponding linear rates of change while the CRF (Celestial Reference Frame) is a set of fixed directions in space. VLBI analysis can be optimized for CRF and TRF separately while handling some of the complexity of geodetic and astrometric reality. For EOP (Earth Orientation Parameter) time series both CRF and TRF should be accurate at the epoch of interest and well defined over time. The optimal integration of EOP, TRF and CRF in a single VLBI solution configuration requires a detailed consideration of the data set and the possibly conflicting nature of the reference frames. A possible approach for an integrated analysis is described.
NASA Astrophysics Data System (ADS)
Donges, Jonathan; Heitzig, Jobst; Beronov, Boyan; Wiedermann, Marc; Runge, Jakob; Feng, Qing Yi; Tupikina, Liubov; Stolbova, Veronika; Donner, Reik; Marwan, Norbert; Dijkstra, Henk; Kurths, Jürgen
2016-04-01
We introduce the pyunicorn (Pythonic unified complex network and recurrence analysis toolbox) open source software package for applying and combining modern methods of data analysis and modeling from complex network theory and nonlinear time series analysis. pyunicorn is a fully object-oriented and easily parallelizable package written in Python. It allows for the construction of functional networks, such as climate networks in climatology or functional brain networks in neuroscience, representing the structure of statistical interrelationships in large data sets of time series, and, subsequently, the investigation of this structure using advanced methods of complex network theory such as measures and models for spatial networks, networks of interacting networks, node-weighted statistics, or network surrogates. Additionally, pyunicorn provides insights into the nonlinear dynamics of complex systems as recorded in uni- and multivariate time series from a non-traditional perspective by means of recurrence quantification analysis, recurrence networks, visibility graphs, and construction of surrogate time series. The range of possible applications of the library is outlined, drawing on several examples mainly from the field of climatology. pyunicorn is available online at https://github.com/pik-copan/pyunicorn. Reference: J.F. Donges, J. Heitzig, B. Beronov, M. Wiedermann, J. Runge, Q.-Y. Feng, L. Tupikina, V. Stolbova, R.V. Donner, N. Marwan, H.A. Dijkstra, and J. Kurths, Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package, Chaos 25, 113101 (2015), DOI: 10.1063/1.4934554, Preprint: arxiv.org:1507.01571 [physics.data-an].
NASA Astrophysics Data System (ADS)
Su, Xiaoli; Luo, Zhicai; Zhou, Zebing
2018-06-01
Knowledge of backscatter change is important for accurately retrieving elevation change time series from satellite radar altimetry over continental ice sheets. Previously, backscatter coefficients have been generated in two cases, namely with and without accounting for the backscatter gradient (BG). However, the difference between the backscatter time series obtained in these two cases, and its impact on retrieving elevation change, are not well known. Here we first compare the mean profiles of the Ku- and Ka-band backscatter over the Greenland ice sheet (GrIS), with results illustrating that the Ku-band backscatter is 3∼5 dB larger than that of the Ka band. We then conduct a statistical analysis of the backscatter time series formed in the above two cases for both Ku and Ka bands over two regions of the GrIS. We find that the standard deviation of the backscatter time series becomes slightly smaller after removing the BG effect, which suggests that the BG correction method is effective. Furthermore, the impact on elevation change of backscatter change due to the BG effect is assessed separately for both Ku and Ka bands over the GrIS. We conclude that Ka-band altimetry would benefit from a BG-induced backscatter analysis (∼10% over region 2). This study may serve as a reference for forming backscatter time series when refining elevation change time series from satellite radar altimetry over ice sheets using repeat-track analysis.
The virtual enhancements - solar proton event radiation (VESPER) model
NASA Astrophysics Data System (ADS)
Aminalragia-Giamini, Sigiava; Sandberg, Ingmar; Papadimitriou, Constantinos; Daglis, Ioannis A.; Jiggens, Piers
2018-02-01
A new probabilistic model introducing a novel paradigm for modelling the solar proton environment at 1 AU is presented. The Virtual Enhancements - Solar Proton Event Radiation model (VESPER) uses the European Space Agency's Solar Energetic Particle Environment Modelling (SEPEM) Reference Dataset and produces virtual time series of proton differential fluxes. In this regard it fundamentally diverges from the approach of existing SPE models, which are based on probabilistic descriptions of macroscopic SPE characteristics such as peak flux and cumulative fluence. It is shown that VESPER reproduces well the characteristics of the dataset it uses, and its results are further compared with those of existing models. The production of time series as the main output of the model opens a straightforward way to calculate solar proton radiation effects in terms of time series, and to pair them with effects caused by trapped radiation and galactic cosmic rays.
NASA standard: Trend analysis techniques
NASA Technical Reports Server (NTRS)
1988-01-01
This Standard presents descriptive and analytical techniques for NASA trend analysis applications. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. Use of this Standard is not mandatory; however, it should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend Analysis is neither a precise term nor a circumscribed methodology, but rather connotes, generally, quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this Standard. The document presents the basic ideas needed for qualitative and quantitative assessment of trends, together with relevant examples. A list of references provides additional sources of information.
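The curve-fitting techniques the Standard names (linear, quadratic, exponential) all reduce to least squares. A sketch with NumPy, where the exponential model is fit as a straight line in log space (synthetic data, for illustration only):

```python
import numpy as np

t = np.arange(12, dtype=float)
y = 5.0 * np.exp(0.2 * t)              # synthetic series with an exponential trend

lin = np.polyfit(t, y, 1)              # linear model:    y ~ a*t + b
quad = np.polyfit(t, y, 2)             # quadratic model: y ~ a*t**2 + b*t + c
k, logA = np.polyfit(t, np.log(y), 1)  # exponential model, fit as a line in log space
A = np.exp(logA)                       # recovers A = 5.0 and k = 0.2 here
```

Comparing residuals across the three candidate models is one simple way to decide which trend description a given data set supports.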
Stochastic Residual-Error Analysis For Estimating Hydrologic Model Predictive Uncertainty
A hybrid time series-nonparametric sampling approach, referred to herein as semiparametric, is presented for the estimation of model predictive uncertainty. The methodology is a two-step procedure whereby a distributed hydrologic model is first calibrated, then followed by brute ...
AnClim and ProClimDB software for data quality control and homogenization of time series
NASA Astrophysics Data System (ADS)
Stepanek, Petr
2015-04-01
During the last decade, a software package consisting of AnClim, ProClimDB and LoadData for processing (mainly climatological) data has been created. This software offers a complex solution for processing of climatological time series, starting from loading the data from a central database (e.g. Oracle, software LoadData), through data duality control and homogenization to time series analysis, extreme value evaluations and RCM outputs verification and correction (ProClimDB and AnClim software). The detection of inhomogeneities is carried out on a monthly scale through the application of AnClim, or newly by R functions called from ProClimDB, while quality control, the preparation of reference series and the correction of found breaks is carried out by the ProClimDB software. The software combines many statistical tests, types of reference series and time scales (monthly, seasonal and annual, daily and sub-daily ones). These can be used to create an "ensemble" of solutions, which may be more reliable than any single method. AnClim software is suitable for educational purposes: e.g. for students getting acquainted with methods used in climatology. Built-in graphical tools and comparison of various statistical tests help in better understanding of a given method. ProClimDB is, on the contrary, tool aimed for processing of large climatological datasets. Recently, functions from R may be used within the software making it more efficient in data processing and capable of easy inclusion of new methods (when available under R). An example of usage is easy comparison of methods for correction of inhomogeneities in daily data (HOM of Paul Della-Marta, SPLIDHOM method of Olivier Mestre, DAP - own method, QM of Xiaolan Wang and others). The software is available together with further information on www.climahom.eu . Acknowledgement: this work was partially funded by the project "Building up a multidisciplinary scientific team focused on drought" No. CZ.1.07/2.3.00/20.0248.
Direct determination of geocenter motion by combining SLR, VLBI, GNSS, and DORIS time series
NASA Astrophysics Data System (ADS)
Wu, X.; Abbondanza, C.; Altamimi, Z.; Chin, T. M.; Collilieux, X.; Gross, R. S.; Heflin, M. B.; Jiang, Y.; Parker, J. W.
2013-12-01
The longest-wavelength surface mass transport includes three degree-one spherical harmonic components involving hemispherical mass exchanges. The mass load causes geocenter motion between the center-of-mass of the total Earth system (CM) and the center-of-figure of the solid Earth surface (CF), and deforms the solid Earth. Estimation of the degree-1 surface mass changes through CM-CF and degree-1 deformation signatures from space geodetic techniques can thus complement GRACE's time-variable gravity data to form a complete change spectrum up to a high resolution. Currently, SLR is considered the most accurate technique for direct geocenter motion determination. By tracking satellite motion from ground stations, SLR determines the motion between CM and the geometric center of its ground network (CN). This motion is then used to approximate CM-CF and subsequently for deriving degree-1 mass changes. However, the SLR network is very sparse and uneven in global distribution. The average number of operational tracking stations is about 20 in recent years. The poor network geometry can have a large CN-CF motion and is not ideal for the determination of CM-CF motion and degree-1 mass changes. We recently realized an experimental Terrestrial Reference Frame (TRF) through station time series using the Kalman filter and the RTS smoother. The TRF has its origin defined at nearly instantaneous CM using weekly SLR measurement time series. VLBI, GNSS and DORIS time series are combined weekly with those of SLR and tied to the geocentric (CM) reference frame through local tie measurements and co-motion constraints on co-located geodetic stations. The unified geocentric time series of the four geodetic techniques provide a much better network geometry for direct geodetic determination of geocenter motion. Results from this direct approach using a 90-station network compare favorably with those obtained from joint inversions of GPS/GRACE data and ocean bottom pressure models.
We will also show that a previously identified discrepancy in X-component between direct SLR orbit-tracking and inverse determined geocenter motions is largely reconciled with the new unified network.
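The filter described above can be illustrated with a deliberately tiny 1-D analogue: one station coordinate with a secular-rate state, assuming constant-velocity dynamics and synthetic noise-free weekly observations (the operational filter carries many more states: periodic and stochastic terms, EOPs, and frame-transformation parameters):

```python
import numpy as np

def kalman_coordinate(obs, q_rate=1e-2, r=1.0):
    # State = [position, rate]; constant-velocity dynamics, process noise on
    # the rate only, one scalar coordinate observation per (weekly) epoch.
    F = np.array([[1.0, 1.0], [0.0, 1.0]])   # one-step transition
    Q = np.diag([0.0, q_rate])               # process noise
    H = np.array([[1.0, 0.0]])               # we observe position only
    x = np.array([obs[0], 0.0])
    P = np.diag([r, 1.0])
    states = []
    for z in obs:
        x = F @ x                            # predict
        P = F @ P @ F.T + Q
        K = P @ H.T / (H @ P @ H.T + r)      # Kalman gain (scalar innovation)
        x = x + (K * (z - H @ x)).ravel()    # update
        P = (np.eye(2) - K @ H) @ P
        states.append(x.copy())
    return np.array(states)

t = np.arange(100.0)
series = 3.0 + 0.05 * t                      # secular station motion
est = kalman_coordinate(series)              # est[:, 1] converges to the rate
```

A companion RTS smoother run backwards over the stored states would then produce the non-causal estimates used for the TRF realization.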
Production and Uses of Multi-Decade Geodetic Earth Science Data Records
NASA Astrophysics Data System (ADS)
Bock, Y.; Kedar, S.; Moore, A. W.; Fang, P.; Liu, Z.; Sullivan, A.; Argus, D. F.; Jiang, S.; Marshall, S. T.
2017-12-01
The Solid Earth Science ESDR System (SESES) project funded under the NASA MEaSUREs program produces and disseminates mature, long-term, calibrated and validated, GNSS based Earth Science Data Records (ESDRs) that encompass multiple diverse areas of interest in Earth Science, such as tectonic motion, transient slip and earthquake dynamics, as well as meteorology, climate, and hydrology. The ESDRs now span twenty-five years for the earliest stations and today are available for thousands of global and regional stations. Using a unified metadata database and a combination of GNSS solutions generated by two independent analysis centers, the project currently produces four long-term ESDRs: Geodetic Displacement Time Series: Daily, combined, cleaned and filtered, GIPSY and GAMIT long-term time series of continuous GPS station positions (global and regional) in the latest version of ITRF, automatically updated weekly. Geodetic Velocities: Weekly updated velocity field + velocity field histories in various reference frames; compendium of all model parameters including earthquake catalog, coseismic offsets, and postseismic model parameters (exponential or logarithmic). Troposphere Delay Time Series: Long-term time series of troposphere delay (30-min resolution) at geodetic stations, necessarily estimated during position time series production and automatically updated weekly. Seismogeodetic records for historic earthquakes: High-rate broadband displacement and seismic velocity time series combining 1 Hz GPS displacements and 100 Hz accelerometer data for select large earthquakes and collocated cGPS and seismic instruments from regional networks. We present several recent notable examples of the ESDRs' usage: A transient slip study that uses the combined position time series to unravel "tremor-less" slow tectonic transient events. Fault geometry determination from geodetic slip rates.
Changes in water resources across California's physiographic provinces at a spatial resolution of 75 km. Retrospective study of a southern California summer monsoon event.
A scalable database model for multiparametric time series: a volcano observatory case study
NASA Astrophysics Data System (ADS)
Montalto, Placido; Aliotta, Marco; Cassisi, Carmelo; Prestifilippo, Michele; Cannata, Andrea
2014-05-01
The variables collected by a sensor network constitute a heterogeneous data source that needs to be properly organized in order to be used in research and geophysical monitoring. The term time series refers to a set of observations of a given phenomenon acquired sequentially in time; when the time intervals are equally spaced, one speaks of a sampling period or sampling frequency. Our work describes in detail a possible methodology for the storage and management of time series using a specific data structure. We designed a framework, hereinafter called TSDSystem (Time Series Database System), to acquire time series from different data sources and standardize them within a relational database. The standardization operation provides the ability to perform operations, such as querying and visualization, on many measures by synchronizing them to a common time scale. The proposed architecture follows a multiple layer paradigm (Loaders layer, Database layer and Business Logic layer). Each layer is specialized in performing particular operations for the reorganization and archiving of data from different sources such as ASCII, Excel, ODBC (Open DataBase Connectivity), and files accessible from the Internet (web pages, XML). In particular, the loader layer performs a security check of the working status of each running software component through a heartbeat system, in order to automate the discovery of acquisition issues and other warning conditions. Although our system has to manage huge amounts of data, performance is guaranteed by a smart table partitioning strategy that keeps the percentage of data stored in each database table balanced. TSDSystem also contains modules for the visualization of acquired data, which make it possible to query different time series over a specified time range, or to follow the real-time signal acquisition, according to a user data access policy.
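The standardization step, synchronizing heterogeneous series onto a common time scale, can be sketched with simple linear interpolation (illustrative only; TSDSystem performs this inside a relational database):

```python
import numpy as np

def to_common_scale(t_src, x_src, t_common):
    # Resample one sensor's series onto the shared time scale by linear
    # interpolation, so heterogeneous series can be queried together.
    return np.interp(t_common, t_src, x_src)

t1 = np.array([0.0, 2.0, 4.0])                  # sensor sampled every 2 s
x1 = np.array([0.0, 4.0, 8.0])
t_common = np.array([0.0, 1.0, 2.0, 3.0, 4.0])  # common 1 s time scale
x1c = to_common_scale(t1, x1, t_common)
```

Once every series is expressed on the common scale, cross-sensor queries and overlay plots become straightforward joins on the time column.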
A multidisciplinary database for geophysical time series management
NASA Astrophysics Data System (ADS)
Montalto, P.; Aliotta, M.; Cassisi, C.; Prestifilippo, M.; Cannata, A.
2013-12-01
The variables collected by a sensor network constitute a heterogeneous data source that needs to be properly organized in order to be used in research and geophysical monitoring. The term time series refers to a set of observations of a given phenomenon acquired sequentially in time; when the time intervals are equally spaced, one speaks of a sampling period or sampling frequency. Our work describes in detail a possible methodology for the storage and management of time series using a specific data structure. We designed a framework, hereinafter called TSDSystem (Time Series Database System), to acquire time series from different data sources and standardize them within a relational database. The standardization operation provides the ability to perform operations, such as querying and visualization, on many measures by synchronizing them to a common time scale. The proposed architecture follows a multiple layer paradigm (Loaders layer, Database layer and Business Logic layer). Each layer is specialized in performing particular operations for the reorganization and archiving of data from different sources such as ASCII, Excel, ODBC (Open DataBase Connectivity), and files accessible from the Internet (web pages, XML). In particular, the loader layer performs a security check of the working status of each running software component through a heartbeat system, in order to automate the discovery of acquisition issues and other warning conditions. Although our system has to manage huge amounts of data, performance is guaranteed by a smart table partitioning strategy that keeps the percentage of data stored in each database table balanced. TSDSystem also contains modules for the visualization of acquired data, which make it possible to query different time series over a specified time range, or to follow the real-time signal acquisition, according to a user data access policy.
Nonparametric Analyses of Log-Periodic Precursors to Financial Crashes
NASA Astrophysics Data System (ADS)
Zhou, Wei-Xing; Sornette, Didier
We apply two nonparametric methods to further test the hypothesis that log-periodicity characterizes the detrended price trajectory of large financial indices prior to financial crashes or strong corrections. The term "parametric" refers here to the use of the log-periodic power law formula to fit the data; in contrast, "nonparametric" refers to the use of general tools such as the Fourier transform, and in the present case the Hilbert transform and the so-called (H, q)-analysis. The analysis using the (H, q)-derivative is applied to seven time series ending with the October 1987 crash, the October 1997 correction and the April 2000 crash of the Dow Jones Industrial Average (DJIA), the Standard & Poor's 500 and Nasdaq indices. The Hilbert transform is applied to two detrended price time series in terms of the ln(tc-t) variable, where tc is the time of the crash. Taking all results together, we find strong evidence for a universal fundamental log-frequency f=1.02±0.05 corresponding to the scaling ratio λ=2.67±0.12. These values are in very good agreement with those obtained in earlier works with different parametric techniques. This note is extracted from a long unpublished report with 58 figures available at , which extensively describes the evidence we have accumulated on these seven time series, in particular by presenting all relevant details so that the reader can judge the validity and robustness of the results for themselves.
NASA Astrophysics Data System (ADS)
Koeppen, W. C.; Wright, R.; Pilger, E.
2009-12-01
We developed and tested a new, automated algorithm, MODVOLC2, which analyzes thermal infrared satellite time series data to detect and quantify the excess energy radiated from thermal anomalies such as active volcanoes, fires, and gas flares. MODVOLC2 combines two previously developed algorithms, a simple point operation algorithm (MODVOLC) and a more complex time series analysis (Robust AVHRR Techniques, or RAT) to overcome the limitations of using each approach alone. MODVOLC2 has four main steps: (1) it uses the original MODVOLC algorithm to process the satellite data on a pixel-by-pixel basis and remove thermal outliers, (2) it uses the remaining data to calculate reference and variability images for each calendar month, (3) it compares the original satellite data and any newly acquired data to the reference images normalized by their variability, and it detects pixels that fall outside the envelope of normal thermal behavior, (4) it adds any pixels detected by MODVOLC to those detected in the time series analysis. Using test sites at Anatahan and Kilauea volcanoes, we show that MODVOLC2 was able to detect ~15% more thermal anomalies than using MODVOLC alone, with very few, if any, known false detections. Using gas flares from the Cantarell oil field in the Gulf of Mexico, we show that MODVOLC2 provided results that were unattainable using a time series-only approach. Some thermal anomalies (e.g., Cantarell oil field flares) are so persistent that an additional, semi-automated 12-µm correction must be applied in order to correctly estimate both the number of anomalies and the total excess radiance being emitted by them. Although all available data should be included to make the best possible reference and variability images necessary for the MODVOLC2, we estimate that at least 80 images per calendar month are required to generate relatively good statistics from which to run MODVOLC2, a condition now globally met by a decade of MODIS observations. 
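Steps (2) and (3), per-pixel reference and variability images plus an envelope test, can be sketched in a few lines of NumPy (hypothetical threshold k = 3 and synthetic data; the real algorithm works on calendar-month stacks of MODVOLC-screened radiances):

```python
import numpy as np

def detect_anomalies(stack, new_image, k=3.0):
    # Steps (2)-(3): per-pixel reference (mean) and variability (std) images
    # from a stack of same-month scenes, then flag pixels whose value exceeds
    # the normal envelope by more than k standard deviations.
    ref = stack.mean(axis=0)
    var = stack.std(axis=0) + 1e-9       # guard against zero variability
    return (new_image - ref) / var > k

rng = np.random.default_rng(1)
stack = rng.normal(290.0, 1.0, size=(80, 4, 4))   # 80 background scenes (in K)
new = stack.mean(axis=0).copy()
new[2, 2] += 25.0                                  # one anomalously hot pixel
mask = detect_anomalies(stack, new)
```

Step (4) would then take the union of this mask with the pixels flagged by the original MODVOLC point operation.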
We also found that MODVOLC2 achieved good results on multiple sensors (MODIS and GOES), which provides confidence that MODVOLC2 can be run on future instruments regardless of their spatial and temporal resolutions. The improved performance of MODVOLC2 over MODVOLC makes possible the detection of lower temperature thermal anomalies that will be useful in improving our ability to document Earth’s volcanic eruptions as well as detect possible low temperature thermal precursors to larger eruptions.
Segmentation of time series with long-range fractal correlations.
Bernaola-Galván, P; Oliver, J L; Hackenberg, M; Coronado, A V; Ivanov, P Ch; Carpena, P
2012-06-01
Segmentation is a standard method of data analysis to identify change-points dividing a nonstationary time series into homogeneous segments. However, for long-range fractal correlated series, most of the segmentation techniques detect spurious change-points which are simply due to the heterogeneities induced by the correlations and not to real nonstationarities. To avoid this oversegmentation, we present a segmentation algorithm which takes as a reference for homogeneity, instead of a random i.i.d. series, a correlated series modeled by a fractional noise with the same degree of correlations as the series to be segmented. We apply our algorithm to artificial series with long-range correlations and show that it systematically detects only the change-points produced by real nonstationarities and not those created by the correlations of the signal. Further, we apply the method to the sequence of the long arm of human chromosome 21, which is known to have long-range fractal correlations. We obtain only three segments that clearly correspond to the three regions of different G + C composition revealed by means of a multi-scale wavelet plot. Similar results have been obtained when segmenting all human chromosome sequences, showing the existence of previously unknown huge compositional superstructures in the human genome.
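The core split step that such segmentation algorithms iterate, finding the point maximizing the t-statistic between left and right means, can be sketched as follows (the paper's actual contribution, judging this statistic against fractional-noise rather than i.i.d. surrogates, is not reproduced here):

```python
import numpy as np

def best_split(x):
    # Scan candidate split points and return the index maximizing the
    # t-statistic between the means of the left and right parts
    # (the core step of divisive segmentation).
    n = len(x)
    best_t, best_i = -1.0, None
    for i in range(2, n - 2):
        l, r = x[:i], x[i:]
        pooled = (l.var(ddof=1) * (len(l) - 1) + r.var(ddof=1) * (len(r) - 1)) / (n - 2)
        sp = np.sqrt(pooled * (1.0 / len(l) + 1.0 / len(r)))
        t = abs(l.mean() - r.mean()) / sp
        if t > best_t:
            best_t, best_i = t, i
    return best_i, best_t

# A series with a real change point at index 50 plus a small deterministic wiggle
x = np.concatenate([np.zeros(50), np.ones(50)]) + 0.01 * np.sin(np.arange(100))
i, t = best_split(x)
```

In the paper's method, the split is accepted only if this maximum t exceeds what fractional Gaussian noise with matching correlations would produce, which is what suppresses the spurious change points.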
NASA Astrophysics Data System (ADS)
Xu, Xijin; Tang, Qian; Xia, Haiyue; Zhang, Yuling; Li, Weiqiu; Huo, Xia
2016-04-01
Chaotic time series prediction based on nonlinear systems has shown superior performance in the prediction field. We studied prenatal exposure to polychlorinated biphenyls (PCBs), measured in umbilical cord blood in an electronic waste (e-waste) contaminated area, by chaotic time series prediction using the least squares self-exciting threshold autoregressive (SEATR) model. The specific prediction steps based on the proposed methods for prenatal PCB exposure were set out, and the validity of the proposed scheme was further verified by numerical simulation experiments. The results show that: 1) seven PCB congeners correlate negatively with five indices of birth status: newborn weight, height, gestational age, Apgar score and anogenital distance; 2) the prenatally PCB-exposed group is at greater risk than the reference group; 3) PCBs increasingly accumulate with time in newborns; and 4) the possibility of newborns suffering from related diseases in the future is greater. The numerical simulation results demonstrate the feasibility of applying mathematical models in the field of environmental toxicology.
Xu, Xijin; Tang, Qian; Xia, Haiyue; Zhang, Yuling; Li, Weiqiu; Huo, Xia
2016-01-01
Chaotic time series prediction based on nonlinear systems has shown superior performance in the prediction field. We studied prenatal exposure to polychlorinated biphenyls (PCBs), measured in umbilical cord blood in an electronic waste (e-waste) contaminated area, by chaotic time series prediction using the least squares self-exciting threshold autoregressive (SEATR) model. The specific prediction steps based on the proposed methods for prenatal PCB exposure were set out, and the validity of the proposed scheme was further verified by numerical simulation experiments. The results show that: 1) seven PCB congeners correlate negatively with five indices of birth status: newborn weight, height, gestational age, Apgar score and anogenital distance; 2) the prenatally PCB-exposed group is at greater risk than the reference group; 3) PCBs increasingly accumulate with time in newborns; and 4) the possibility of newborns suffering from related diseases in the future is greater. The numerical simulation results demonstrate the feasibility of applying mathematical models in the field of environmental toxicology. PMID:27118260
1992-01-01
VM and the correlation entropy K2(M) versus the embedding dimension M for both the linear and non-linear signals. Crosses refer to the linear signal… …dimensions, leading to a correlation dimension ν = 2.7. A similar structure was observed by Voges et al. [46] in the analysis of the X-ray variability of… …and its recurrence plots often indicate whether a meaningful correlation integral analysis…
System and method for charging electrochemical cells in series
DeLuca, William H.; Hornstra, Jr, Fred; Gelb, George H.; Berman, Baruch; Moede, Larry W.
1980-01-01
A battery charging system capable of equalizing the charge of each individual cell at a selected full charge voltage includes means for regulating charger current to first increase current at a constant rate until a bulk charging level is achieved or until any cell reaches a safe reference voltage. A system controller then begins to decrease the charging rate as long as any cell exceeds the reference voltage until an equalization current level is reached. At this point, the system controller activates a plurality of shunt modules to permit shunting of current around any cell having a voltage exceeding the reference voltage. Leads extending between the battery of cells and shunt modules are time shared to permit alternate shunting of current and voltage monitoring without the voltage drop caused by the shunt current. After each cell has at one time exceeded the reference voltage, the charging current is terminated.
Intervention analysis of power plant impact on fish populations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Madenjian, C.P.
1984-10-01
Intervention analysis was applied to 10 yr (1973-1982) of field fish abundance data at the D. C. Cook Nuclear Power Plant, southeastern Lake Michigan. Three log-transformed catch series, comprising monthly observations, were examined for each combination of two species (alewife, Alosa pseudoharengus, or yellow perch, Perca flavescens) and gear (trawl or gill net): catch at the plant discharge transect, catch at the reference transect, and the ratio of plant catch to reference catch. Time series separated by age group were also examined. Based on intervention analysis, no change in the abundance of fish populations could be attributed to plant operation. Additionally, a modification of the intervention analysis technique was applied to investigate trends in abundance at both the plant discharge and reference transects. Significant declines were detected in the abundance of alewife adults at both transects. Results of the trend analysis support the contention that alewives underwent a lakewide decrease in abundance during the 1970s.
A Filtering of Incomplete GNSS Position Time Series with Probabilistic Principal Component Analysis
NASA Astrophysics Data System (ADS)
Gruszczynski, Maciej; Klos, Anna; Bogusz, Janusz
2018-04-01
For the first time, we introduce probabilistic principal component analysis (pPCA) for the spatio-temporal filtering of Global Navigation Satellite System (GNSS) position time series, to estimate and remove the Common Mode Error (CME) without interpolation of missing values. We used data from International GNSS Service (IGS) stations which contributed to the latest International Terrestrial Reference Frame (ITRF2014). The efficiency of the proposed algorithm was tested on simulated incomplete time series; CME was then estimated for a set of 25 stations located in Central Europe. The newly applied pPCA was compared with previously used algorithms, which showed that this method is capable of properly spatio-temporally filtering GNSS time series with differing observation time spans. We showed that filtering can be carried out with the pPCA method even when two time series in the dataset share fewer than 100 common epochs of observations. The 1st Principal Component (PC) explained more than 36% of the total variance represented by the time series residuals (series with the deterministic model removed), which, compared to the variances of the other PCs (less than 8% each), means that common signals are significant in GNSS residuals. A clear improvement in the spectral indices of the power-law noise was noticed for the Up component, reflected by an average shift towards white noise from -0.98 to -0.67 (30%). We observed a significant average reduction in the uncertainty of station velocities estimated from filtered residuals: by 35, 28 and 69% for the North, East, and Up components, respectively. The CME series were also analyzed in the context of environmental mass loading influences on the filtering results. Subtracting the environmental loading models from the GNSS residuals leads to a reduction of the estimated CME variance by 20 and 65% for the horizontal and vertical components, respectively.
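The idea of estimating CME as the leading principal component of stacked residuals can be sketched with plain PCA on complete, synthetic series (the paper's pPCA additionally handles missing epochs through a probabilistic, EM-style formulation, which this sketch omits):

```python
import numpy as np

def remove_cme(res):
    # Rows = epochs, columns = stations. The leading principal component of
    # the centered residuals is taken as the Common Mode Error and subtracted.
    c = res - res.mean(axis=0)
    u, s, vt = np.linalg.svd(c, full_matrices=False)
    cme = np.outer(u[:, 0] * s[0], vt[0])        # rank-1 reconstruction
    return res - cme

rng = np.random.default_rng(2)
common = np.sin(np.linspace(0.0, 6.0, 200))      # regional signal shared by all
res = np.outer(common, np.ones(5)) + 0.01 * rng.normal(size=(200, 5))
filtered = remove_cme(res)                        # station-specific noise remains
```

The variance ratio between `res` and `filtered` plays the same role as the CME variance reduction figures quoted in the abstract.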
Cost-benefit analysis of the 55-mph speed limit
DOE Office of Scientific and Technical Information (OSTI.GOV)
Forester, T.H.; McNown, R.F.; Singell, L.D.
1984-01-01
This article presents the results of an empirical study which estimates the reduction in fatalities resulting from the imposed 55-mph speed limit. Time series data for the US from 1952 to 1979 are employed in a regression model capturing the relation between fatalities, average speed, variability of speed, and the speed limit. Also discussed are the alternative approaches to valuing human life and the value of time. Provided is a series of benefit-cost ratios based on alternative measures of the benefits and costs of life saving. The paper concludes that the 55-mph speed limit is not cost efficient unless additional time on the highway is valued significantly below levels estimated in the best research on the value of time. 12 references, 1 table.
Detection of long term persistence in time series of the Neuquen River (Argentina)
NASA Astrophysics Data System (ADS)
Seoane, Rafael; Paz González, Antonio
2014-05-01
In the Patagonian region (Argentina), previous hydrometeorological studies developed using general circulation models show variations in annual mean flows. Future climate scenarios obtained from high-resolution models indicate decreases in total annual precipitation, and these decreases are most pronounced in the Neuquén river basin (23,000 km2). The aim of this study was the estimation of long-term persistence in the Neuquén River basin (Argentina). The detection of variations in long-range dependence and long memory of the time series was evaluated with the Hurst exponent. We applied rescaled adjusted range (R/S) analysis to time series of river discharges measured from 1903 to 2011; this series was divided into two subperiods, the first from 1903 to 1970 and the second from 1970 to 2011. Results show a small increase in persistence for the second period. Our results are consistent with those obtained by Koch and Markovic (2007), who observed and estimated an increase of the H exponent for the period 1960-2000 in the Elbe River (Germany). References: Hurst, H. (1951). "Long-term storage capacity of reservoirs". Trans. Am. Soc. Civil Engrs., 116:776-808. Koch and Markovic (2007). "Evidences for Climate Change in Germany over the 20th Century from the Stochastic Analysis of hydro-meteorological Time Series", MODSIM07, International Congress on Modelling and Simulation, Christchurch, New Zealand.
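Rescaled adjusted range (R/S) analysis as used above can be sketched as follows. This is a textbook-style estimator, not the authors' exact implementation; the chunk sizes and toy data are arbitrary choices made for illustration.

```python
import numpy as np

def hurst_rs(x, min_chunk=8):
    """Estimate the Hurst exponent of a series by rescaled range (R/S) analysis:
    the slope of log(R/S) versus log(segment length)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    sizes, rs = [], []
    size = n
    while size >= min_chunk:
        vals = []
        for start in range(0, n - size + 1, size):
            seg = x[start:start + size]
            dev = np.cumsum(seg - seg.mean())   # cumulative departures from the mean
            r = dev.max() - dev.min()           # range of the departures
            s = seg.std()
            if s > 0:
                vals.append(r / s)              # rescaled range for this segment
        sizes.append(size)
        rs.append(np.mean(vals))
        size //= 2
    h, _ = np.polyfit(np.log(sizes), np.log(rs), 1)
    return h

rng = np.random.default_rng(42)
h_noise = hurst_rs(rng.standard_normal(4096))            # white noise: H near 0.5
h_walk = hurst_rs(np.cumsum(rng.standard_normal(4096)))  # random walk: H near 1
```

H above 0.5 indicates long-term persistence of the kind the study looks for in the discharge series.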
A Seasonal Time-Series Model Based on Gene Expression Programming for Predicting Financial Distress.
Cheng, Ching-Hsue; Chan, Chia-Pang; Yang, Jun-He
2018-01-01
Financial distress prediction is an important and challenging research topic in the financial field. Currently, there are many methods for predicting firm bankruptcy and financial crisis, including artificial intelligence and traditional statistical methods, and past studies have shown that the prediction results of artificial intelligence methods are better than those of traditional statistical methods. Financial statements are quarterly reports; hence, company financial crises form seasonal time-series data, and the attribute data affecting the financial distress of companies are nonlinear and nonstationary time-series data with fluctuations. Therefore, this study employed a nonlinear attribute selection method to build a nonlinear financial distress prediction model: that is, this paper proposed a novel seasonal time-series gene expression programming model for predicting the financial distress of companies. The proposed model has several advantages, including the following: (i) unlike previous models, it incorporates the concept of time series; (ii) the proposed integrated attribute selection method can find the core attributes and reduce high-dimensional data; and (iii) the proposed model can generate rules and mathematical formulas of financial distress for providing references to investors and decision makers. The results show that the proposed method outperforms the listed classifiers under three criteria; hence, the proposed model has competitive advantages in predicting the financial distress of companies.
Cornick, Matthew; Hunt, Brian; Ott, Edward; Kurtuldu, Huseyin; Schatz, Michael F
2009-03-01
Data assimilation refers to the process of estimating a system's state from a time series of measurements (which may be noisy or incomplete) in conjunction with a model for the system's time evolution. Here we demonstrate the applicability of a recently developed data assimilation method, the local ensemble transform Kalman filter, to nonlinear, high-dimensional, spatiotemporally chaotic flows in Rayleigh-Bénard convection experiments. Using this technique we are able to extract the full temperature and velocity fields from a time series of shadowgraph measurements. In addition, we describe extensions of the algorithm for estimating model parameters. Our results suggest the potential usefulness of our data assimilation technique to a broad class of experimental situations exhibiting spatiotemporal chaos.
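The local ensemble transform Kalman filter itself involves localization and a transform-space analysis; as a hedged stand-in, the toy below shows the basic ensemble Kalman analysis step (a stochastic EnKF with perturbed observations) estimating a static two-dimensional state from noisy measurements. It illustrates the general data-assimilation idea of the paper, not the LETKF algorithm itself, and all names and values are invented.

```python
import numpy as np

def enkf_step(ensemble, y, H, R, rng):
    """One analysis step of a stochastic ensemble Kalman filter.
    ensemble: (n_members, n_state); y: observation vector; H: obs operator."""
    X = ensemble
    A = X - X.mean(axis=0)                       # ensemble anomalies
    n = X.shape[0]
    P = A.T @ A / (n - 1)                        # sample forecast covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R) # Kalman gain
    # perturbed observations keep the analysis spread statistically consistent
    Y = y + rng.multivariate_normal(np.zeros(len(y)), R, size=n)
    return X + (Y - X @ H.T) @ K.T

rng = np.random.default_rng(1)
truth = np.array([2.0, -1.0])
H = np.eye(2)
R = 0.25 * np.eye(2)
ens = rng.standard_normal((200, 2)) * 3.0        # broad prior ensemble
for _ in range(20):                              # assimilate repeated noisy obs
    y = truth + rng.multivariate_normal(np.zeros(2), R)
    ens = enkf_step(ens, y, H, R, rng)
est = ens.mean(axis=0)
```

Repeated analyses contract the ensemble around the true state, the same mechanism that lets the experimental filter recover full fields from shadowgraph measurements.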
NASA Astrophysics Data System (ADS)
Patra, S. R.
2017-12-01
Evapotranspiration (ET0) influences water resources and is considered a vital process in arid hydrologic frameworks. It is one of the most important measures for identifying drought conditions. Therefore, time series forecasting of evapotranspiration is very important to help decision makers and water system managers build proper systems to sustain and manage water resources. Time series analysis assumes that history repeats itself; hence, by analysing past values, better choices, or forecasts, can be made for the future. Ten years of ET0 data were used in this study to ensure a satisfactory forecast of monthly values. In this study, three models are presented: an autoregressive integrated moving average (ARIMA) model, an artificial neural network (ANN) model, and a support vector machine (SVM) model. These three models are used for forecasting monthly reference crop evapotranspiration based on ten years (1991-2001) of past measured evaporation records at Ganjam region, Odisha, India, without considering climate data. The developed models allow water resource managers to predict up to 12 months ahead, making these predictions very useful for optimizing the resources needed for effective water resources management. In this study, multistep-ahead prediction is performed, which is more complex and troublesome than one-step-ahead prediction. Our investigation suggests that nonlinear relationships may exist among the monthly indices, so the ARIMA model might not be able to fully extract the relationships hidden in the historical data. Support vector machines are potentially helpful time series forecasting strategies on account of their strong nonlinear mapping capability and robustness to complexity in the forecasting data. SVMs have greater learning capability in time series modelling than ANNs.
For instance, SVMs implement the structural risk minimization principle, which allows better generalization than neural networks, which use the empirical risk minimization principle. The reliability of these computational models was analysed in light of the simulation results, and the SVM model was found to produce the best results among the three. Future research should extend the validation data set and check the validity of these results in different areas with hybrid intelligence techniques.
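A minimal lag-feature SVM forecaster in the spirit described above might look as follows, using scikit-learn's SVR on a synthetic seasonal series. The data, hyperparameters, and 12-lag window are illustrative assumptions, not the study's configuration; multistep prediction is done by iterating one-step forecasts, which is one simple way to realize the multistep-ahead setting the abstract mentions.

```python
import numpy as np
from sklearn.svm import SVR

# synthetic monthly ET0-like series: seasonal cycle plus noise (stand-in data)
rng = np.random.default_rng(0)
t = np.arange(120)                          # ten years of monthly values
series = 5 + 2 * np.sin(2 * np.pi * t / 12) + 0.1 * rng.standard_normal(120)

def make_lagged(x, n_lags=12):
    """Build a (samples, lags) matrix and targets for one-step-ahead learning."""
    X = np.array([x[i:i + n_lags] for i in range(len(x) - n_lags)])
    return X, x[n_lags:]

X, y = make_lagged(series)
model = SVR(kernel="rbf", C=10.0, epsilon=0.05).fit(X, y)

# iterated multistep forecast: feed each prediction back as the newest lag
window = list(series[-12:])
forecast = []
for _ in range(12):                         # 12 months ahead
    nxt = model.predict(np.array(window[-12:])[None, :])[0]
    forecast.append(nxt)
    window.append(nxt)
forecast = np.array(forecast)
```

Iterating one-step predictions accumulates error, which is why the abstract flags multistep-ahead forecasting as the harder problem.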
The high order dispersion analysis based on first-passage-time probability in financial markets
NASA Astrophysics Data System (ADS)
Liu, Chenggong; Shang, Pengjian; Feng, Guochen
2017-04-01
The study of first-passage-time (FPT) events in financial time series has attracted broad research interest recently, as it can provide a reference for risk management and investment. In this paper, a new measure, high-order dispersion (HOD), is developed based on FPT probability to explore financial time series. The tick-by-tick data of three Chinese stock markets and three American stock markets are investigated. We classify the financial markets successfully by analyzing the scaling properties of the FPT probabilities of the six stock markets and employing the HOD method to compare the differences between FPT decay curves. By applying the HOD method, it can be concluded that long-range correlation, a fat-tailed broad probability density function and its coupling with nonlinearity mainly lead to the multifractality of financial time series. Furthermore, we use the fluctuation function of multifractal detrended fluctuation analysis (MF-DFA) to distinguish markets and obtain results consistent with the HOD method, whereas the HOD method is capable of fractionizing stock markets in the same region effectively. We believe that such explorations are relevant for a better understanding of financial market mechanisms.
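An empirical first-passage-time probability, the building block that the HOD measure is constructed on, can be sketched like this on toy i.i.d. "returns". The threshold and window are arbitrary illustrative choices; the paper's tick-by-tick analysis and the HOD construction itself are not reproduced here.

```python
import numpy as np

def fpt_probability(returns, threshold, max_lag):
    """Empirical first-passage-time distribution: for each start point, the
    number of steps until the cumulative return first exceeds `threshold`."""
    n = len(returns)
    times = []
    for start in range(n - max_lag):
        c = np.cumsum(returns[start:start + max_lag])
        hit = np.flatnonzero(np.abs(c) >= threshold)
        if hit.size:
            times.append(hit[0] + 1)
    times = np.array(times)
    # probability that the first passage occurs at lag k = 1..max_lag
    return np.array([(times == k).mean() for k in range(1, max_lag + 1)])

rng = np.random.default_rng(7)
returns = 0.01 * rng.standard_normal(20000)      # toy i.i.d. 'returns'
pdf = fpt_probability(returns, threshold=0.03, max_lag=200)
```

For real market data the decay of this curve, rather than its i.i.d. shape here, carries the correlation and fat-tail information the HOD measure compares across markets.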
ERIC Educational Resources Information Center
Zeman, Anne; Kelly, Kate
A volume in the Scholastic Homework Reference Series, this document provides fourth to sixth grade students and their parents with the information they need to complete U.S. history assignments. With the help of Dial-A-Teacher, which has operated a telephone helpline since 1979, this American history reference guide presents easy-to-understand…
Time Series ARIMA Models of Undergraduate Grade Point Average.
ERIC Educational Resources Information Center
Rogers, Bruce G.
The Auto-Regressive Integrated Moving Average (ARIMA) Models, often referred to as Box-Jenkins models, are regression methods for analyzing sequential dependent observations with large amounts of data. The Box-Jenkins approach, a three-stage procedure consisting of identification, estimation and diagnosis, was used to select the most appropriate…
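The estimation stage of a Box-Jenkins analysis can be illustrated with a Yule-Walker fit of a pure autoregressive model. This hand-rolled NumPy version is a sketch (in practice a library such as statsmodels would handle identification, estimation and diagnosis), and the AR(1) simulation is stand-in data, not GPA observations.

```python
import numpy as np

def yule_walker(x, order):
    """Estimate AR(order) coefficients via the Yule-Walker equations."""
    x = x - np.mean(x)
    n = len(x)
    # sample autocovariances at lags 0..order
    acov = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(order + 1)])
    # Toeplitz system R phi = r
    R = np.array([[acov[abs(i - j)] for j in range(order)] for i in range(order)])
    return np.linalg.solve(R, acov[1:order + 1])

# simulate an AR(1) process x_t = 0.7 x_{t-1} + e_t and recover phi
rng = np.random.default_rng(3)
e = rng.standard_normal(5000)
x = np.zeros(5000)
for t in range(1, 5000):
    x[t] = 0.7 * x[t - 1] + e[t]
phi = yule_walker(x, order=1)
```

Identification (choosing the order from autocorrelation plots) and diagnosis (checking residual whiteness) would wrap around this estimation step in a full Box-Jenkins procedure.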
Topological data analysis of financial time series: Landscapes of crashes
NASA Astrophysics Data System (ADS)
Gidea, Marian; Katz, Yuri
2018-02-01
We explore the evolution of daily returns of four major US stock market indices during the technology crash of 2000 and the financial crisis of 2007-2009. Our methodology is based on topological data analysis (TDA). We use persistent homology to detect and quantify topological patterns that appear in multidimensional time series. Using a sliding window, we extract time-dependent point cloud data sets, to which we associate a topological space. We detect transient loops that appear in this space, and we measure their persistence. This is encoded in real-valued functions referred to as 'persistence landscapes'. We quantify the temporal changes in persistence landscapes via their Lp-norms. We test this procedure on multidimensional time series generated by various non-linear and non-equilibrium models. We find that, in the vicinity of financial meltdowns, the Lp-norms exhibit strong growth prior to the primary peak, which ascends during a crash. Remarkably, the average spectral density at low frequencies of the time series of Lp-norms of the persistence landscapes demonstrates a strong rising trend for 250 trading days prior to either the dotcom crash on 03/10/2000 or the Lehman bankruptcy on 09/15/2008. Our study suggests that TDA provides a new type of econometric analysis, which complements the standard statistical measures. The method can be used to detect early warning signals of imminent market crashes. We believe that this approach can be used beyond the analysis of the financial time series presented here.
Anwar, A R; Muthalib, M; Perrey, S; Galka, A; Granert, O; Wolff, S; Deuschl, G; Raethjen, J; Heute, U; Muthuraman, M
2012-01-01
Directionality analysis of signals originating from different parts of the brain during motor tasks has gained a lot of interest. Since brain activity can be recorded over time, methods of time series analysis can be applied to medical time series as well. Granger causality is a method to find a causal relationship between time series. Such causality can be referred to as a directional connection and is not necessarily bidirectional. The aim of this study is to differentiate between different motor tasks on the basis of activation maps and also to understand the nature of the connections present between different parts of the brain. In this paper, three different motor tasks (finger tapping, simple finger sequencing, and complex finger sequencing) are analyzed. Time series for each task were extracted from functional magnetic resonance imaging (fMRI) data, which have a very good spatial resolution and can look into the sub-cortical regions of the brain. Activation maps based on fMRI images show that, in the case of complex finger sequencing, most parts of the brain are active, unlike finger tapping, during which only limited regions show activity. Directionality analysis on time series extracted from the contralateral motor cortex (CMC), supplementary motor area (SMA), and cerebellum (CER) shows bidirectional connections between these parts of the brain. In the case of simple finger sequencing and complex finger sequencing, the strongest connections originate from the SMA and CMC, while connections originating from the CER in either direction are the weakest in magnitude during all paradigms.
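Granger causality in its simplest form compares the residual variance of an autoregressive model of one series with and without lagged values of the other; a ratio well below one indicates a directional connection. The sketch below is a generic bivariate illustration on invented data, not the fMRI pipeline of the study.

```python
import numpy as np

def granger_ratio(x, y, p=2):
    """Compare residual variance of an AR(p) model of y against a model that
    also includes p lags of x; a ratio well below 1 suggests x Granger-causes y."""
    n = len(y)
    rows_r, rows_f, target = [], [], []
    for t in range(p, n):
        rows_r.append(y[t - p:t][::-1])                                  # own lags only
        rows_f.append(np.concatenate([y[t - p:t][::-1], x[t - p:t][::-1]]))  # plus x lags
        target.append(y[t])
    target = np.array(target)
    def rss(rows):
        A = np.column_stack([np.ones(len(rows)), np.array(rows)])
        beta, *_ = np.linalg.lstsq(A, target, rcond=None)
        r = target - A @ beta
        return r @ r
    return rss(rows_f) / rss(rows_r)

# toy pair: y is driven by lagged x, but x evolves on its own
rng = np.random.default_rng(5)
x = rng.standard_normal(2000)
y = np.zeros(2000)
for t in range(1, 2000):
    y[t] = 0.8 * x[t - 1] + 0.3 * rng.standard_normal()
r_xy = granger_ratio(x, y)   # far below 1: x helps predict y
r_yx = granger_ratio(y, x)   # near 1: y does not help predict x
```

The asymmetry between the two ratios is what makes the connection directional rather than merely correlational.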
The impact of generic reference pricing interventions in the statin market.
Puig-Junoy, Jaume
2007-11-01
The objective of this study was to evaluate the intended and unintended impact on pharmaceutical use and sales of three public reimbursement reforms applied to the prescription of statins: a Spanish generic reference pricing system, and two competing policies introduced by the Andalusian Public Health Service. The study is designed as an interrupted time series analysis with comparison series of 46 monthly drug use and sales figures from January 2001 to October 2004 for each active ingredient. The mean monthly saving for the year after the introduction of reference pricing was 16.7% of total lovastatin sales, representing only 1.1% of total statin sales. Mean monthly savings for the 10 months after reference pricing was applied to simvastatin were 51.8% of simvastatin sales, and 13.9% of statin sales. Over the 46 months of the study, all analysed public interventions resulted in a 2.2% average monthly decrease in statin sales in the rest of Spain and savings non-significantly different from zero in Andalusia. Reference pricing (RP) has been effective at reducing the volume of sales growth of the off-patent statins, yet its overall impact on sales of all statins has been relatively modest.
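An interrupted time series analysis of the kind described is commonly implemented as a segmented regression with level-change and trend-change terms at the intervention date. The following is a generic sketch with toy data, not the study's model or its comparison-series design.

```python
import numpy as np

def its_fit(y, t0):
    """Segmented regression for an interrupted time series:
    y_t = b0 + b1*t + b2*step(t>=t0) + b3*(t-t0)*step(t>=t0) + e_t,
    where b2 is the level change and b3 the trend change at the intervention."""
    t = np.arange(len(y), dtype=float)
    step = (t >= t0).astype(float)
    X = np.column_stack([np.ones_like(t), t, step, (t - t0) * step])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta  # [intercept, pre-trend, level change, trend change]

# toy monthly sales series with a drop of 10 units at month 24
rng = np.random.default_rng(2)
t = np.arange(46)
y = 100 + 0.5 * t - 10.0 * (t >= 24) + rng.standard_normal(46)
beta = its_fit(y, t0=24)
```

The estimated level change (beta[2]) is the quantity a reference-pricing evaluation would interpret as the immediate effect of the reform.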
GNSS station displacement analysis
NASA Astrophysics Data System (ADS)
Haritonova, Diana; Balodis, Janis; Janpaule, Inese; Normand, Madara
2013-04-01
Time series of GNSS station results of both the EUPOS®-Riga and LatPos networks have been developed at the Institute of Geodesy and Geoinformation (University of Latvia). Reference stations from the EUREF Permanent Network (EPN) in the surroundings of Latvia have been used, and Bernese GPS Software, Version 5.0, was applied in both static and kinematic modes. The standard data sets were taken from the IGS data base. The time series have been analysed, and distinctive daily and subdaily movements of the EUPOS®-Riga and LatPos stations were identified. Possible external factors explaining the distribution of GNSS station coordinates, such as seismic activity in some areas of Latvia and periodic processes, are discussed.
Series: Practical guidance to qualitative research. Part 1: Introduction.
Moser, Albine; Korstjens, Irene
2017-12-01
In the course of our supervisory work over the years, we have noticed that qualitative research tends to evoke a lot of questions and worries, so-called Frequently Asked Questions. This journal series of four articles intends to provide novice researchers with practical guidance for conducting high-quality qualitative research in primary care. By 'novice' we mean Master's students and junior researchers, as well as experienced quantitative researchers who are engaging in qualitative research for the first time. This series addresses their questions and provides researchers, readers, reviewers and editors with references to criteria and tools for judging the quality of papers reporting on qualitative research. This first article describes the key features of qualitative research, provides publications for further learning and reading, and gives an outline of the series.
NASA Astrophysics Data System (ADS)
Bock, Y.; Fang, P.; Moore, A. W.; Kedar, S.; Liu, Z.; Owen, S. E.; Glasscoe, M. T.
2016-12-01
Detection of time-dependent crustal deformation relies on the availability of accurate surface displacements, proper time series analysis to correct for secular motion, coseismic and non-tectonic instrument offsets, periodic signatures at different frequencies, and a realistic estimate of uncertainties for the parameters of interest. As part of the NASA Solid Earth Science ESDR System (SESES) project, daily displacement time series are estimated for about 2500 stations, focused on tectonic plate boundaries and having a global distribution for accessing the terrestrial reference frame. The "combined" time series are optimally estimated from independent JPL GIPSY and SIO GAMIT solutions, using a consistent set of input epoch-date coordinates and metadata. The longest time series began in 1992; more than 30% of the stations have experienced one or more of 35 major earthquakes with significant postseismic deformation. Here we present three examples of time-dependent deformation that have been detected in the SESES displacement time series. (1) Postseismic deformation is a fundamental time-dependent signal that indicates a viscoelastic response of the crust/mantle lithosphere, afterslip, or poroelastic effects at different spatial and temporal scales. It is critical to identify and estimate the extent of postseismic deformation in both space and time not only for insight into the crustal deformation and earthquake cycles and their underlying physical processes, but also to reveal other time-dependent signals. We report on our database of characterized postseismic motions using a principal component analysis to isolate different postseismic processes. (2) Starting with the SESES combined time series and applying a time-dependent Kalman filter, we examine episodic tremor and slow slip (ETS) in the Cascadia subduction zone. 
We report on subtle slip details, allowing investigation of the spatiotemporal relationship between slow slip transients and tremor and their underlying physical mechanisms. (3) We present evolving strain dilatation and shear rates based on the SESES velocities for regional subnetworks as a metric for assigning earthquake probabilities and detection of possible time-dependent deformation related to underlying physical processes.
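Correcting displacement time series for secular motion and periodic signatures, as described above, is often done with a least-squares trajectory model containing an offset, a rate, and annual plus semiannual sinusoids. The sketch below uses toy data and omits the coseismic offsets and postseismic terms a production solution like SESES would include.

```python
import numpy as np

def fit_trajectory(t, y):
    """Least-squares trajectory model: offset, secular rate, and
    annual + semiannual sinusoids (t in years)."""
    w = 2 * np.pi
    X = np.column_stack([
        np.ones_like(t), t,
        np.cos(w * t), np.sin(w * t),          # annual terms
        np.cos(2 * w * t), np.sin(2 * w * t),  # semiannual terms
    ])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta, y - X @ beta                  # parameters and residuals

# daily displacement with a 3 mm/yr secular rate and 2 mm annual amplitude
rng = np.random.default_rng(4)
t = np.arange(0, 8, 1 / 365.25)
y = 3.0 * t + 2.0 * np.sin(2 * np.pi * t) + 0.5 * rng.standard_normal(len(t))
beta, resid = fit_trajectory(t, y)
```

The residuals after removing this deterministic model are where time-dependent signals such as postseismic transients or slow slip would be searched for.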
NASA Astrophysics Data System (ADS)
Krzan, Grzegorz; Stępniak, Katarzyna
2017-09-01
In high-accuracy positioning using GNSS, the most common solution is still relative positioning using double-difference observations of dual-frequency measurements. An increasingly popular alternative to relative positioning is the use of undifferenced approaches, which are designed to make full use of modern satellite systems and signals. Positions referenced to the global International Terrestrial Reference Frame (ITRF2008) obtained from Precise Point Positioning (PPP) or Undifferenced (UD) network solutions have to be transformed to a national (regional) reference frame, which introduces additional biases related to the transformation process. In this paper, satellite observations from two test networks using different observation time series were processed. The first test concerns the positioning accuracy obtained from processing one year of dual-frequency GPS observations from 14 EUREF Permanent Network (EPN) stations using NAPEOS 3.3.1 software. The results were transformed into a national reference frame (PL-ETRF2000) and compared to positions from an EPN cumulative solution, which was adopted as the true coordinates. Daily observations were processed using PPP and UD multi-station solutions to determine the final accuracy resulting from satellite positioning, the transformation to national coordinate systems and Eurasian intraplate velocities. The second numerical test involved similar post-processing strategies carried out using different observation time series (30 min, 1 hour, 2 hours, daily) and different classes of GNSS receivers. The centimetre-level accuracy of the results presented in the national coordinate system satisfies the requirements of many surveying and engineering applications.
NASA Astrophysics Data System (ADS)
Betsuin, Toshiki; Tanaka, Yasunori; Arai, T.; Uesugi, Y.; Ishijima, T.
2018-03-01
This paper describes the application of an Ar/CH4/H2 inductively coupled thermal plasma with and without coil current modulation to synthesise diamond films. Induction thermal plasma with coil current modulation is referred to as modulated induction thermal plasma (M-ITP), while that without modulation is referred to as non-modulated ITP (NM-ITP). First, spectroscopic observations of NM-ITP and M-ITP with different modulation waveforms were made to estimate the composition in flux from the thermal plasma by measuring the time evolution in the spectral intensity from the species. Secondly, we studied polycrystalline diamond film deposition tests on a Si substrate, and we studied monocrystalline diamond film growth tests using the irradiation of NM-ITP and M-ITP. From these tests, diamond nucleation effects by M-ITP were found. Finally, following the irradiation results, we attempted to use a time-series irradiation of M-ITP and NM-ITP for polycrystalline diamond film deposition on a Si substrate. The results indicated that numerous larger diamond particles were deposited with a high population density on the Si substrate by time-series irradiation.
Segmentation of time series with long-range fractal correlations
Bernaola-Galván, P.; Oliver, J.L.; Hackenberg, M.; Coronado, A.V.; Ivanov, P.Ch.; Carpena, P.
2012-01-01
Segmentation is a standard method of data analysis to identify change-points dividing a nonstationary time series into homogeneous segments. However, for long-range fractal correlated series, most of the segmentation techniques detect spurious change-points which are simply due to the heterogeneities induced by the correlations and not to real nonstationarities. To avoid this oversegmentation, we present a segmentation algorithm which takes as a reference for homogeneity, instead of a random i.i.d. series, a correlated series modeled by a fractional noise with the same degree of correlations as the series to be segmented. We apply our algorithm to artificial series with long-range correlations and show that it systematically detects only the change-points produced by real nonstationarities and not those created by the correlations of the signal. Further, we apply the method to the sequence of the long arm of human chromosome 21, which is known to have long-range fractal correlations. We obtain only three segments that clearly correspond to the three regions of different G + C composition revealed by means of a multi-scale wavelet plot. Similar results have been obtained when segmenting all human chromosome sequences, showing the existence of previously unknown huge compositional superstructures in the human genome. PMID:23645997
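The core of such segmentation approaches is the choice of split point: the candidate that maximizes the t-statistic between the left and right segment means. Below is a minimal sketch of that inner step on toy data; the paper's actual contribution, using a fractional-noise reference series to decide whether the split is significant, is not reproduced here.

```python
import numpy as np

def best_split(x, min_seg=10):
    """Find the candidate change-point maximizing the t-statistic between
    left and right segment means (the core step of mean-based segmentation)."""
    n = len(x)
    best_t, best_i = 0.0, None
    for i in range(min_seg, n - min_seg):
        a, b = x[:i], x[i:]
        # pooled standard deviation of the two segments
        sp = np.sqrt(((len(a) - 1) * a.var(ddof=1) + (len(b) - 1) * b.var(ddof=1))
                     / (len(a) + len(b) - 2))
        t = abs(a.mean() - b.mean()) / (sp * np.sqrt(1 / len(a) + 1 / len(b)))
        if t > best_t:
            best_t, best_i = t, i
    return best_i, best_t

# toy nonstationary series: mean shift at index 300
rng = np.random.default_rng(6)
x = np.concatenate([rng.standard_normal(300), 1.5 + rng.standard_normal(200)])
cut, tmax = best_split(x)
```

For long-range correlated data, large tmax values arise even without real change-points, which is precisely the oversegmentation problem the fractional-noise reference is designed to fix.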
Forecasting of cyanobacterial density in Torrão reservoir using artificial neural networks.
Torres, Rita; Pereira, Elisa; Vasconcelos, Vítor; Teles, Luís Oliva
2011-06-01
The ability of general regression neural networks (GRNN) to forecast the density of cyanobacteria in the Torrão reservoir (Tâmega river, Portugal), in a period of 15 days, based on three years of collected physical and chemical data, was assessed. Several models were developed and 176 were selected based on their correlation values for the verification series. A time lag of 11 was used, equivalent to one sample (periods of 15 days in the summer and 30 days in the winter). Several combinations of the series were used. Input and output data collected from three depths of the reservoir were applied (surface, euphotic zone limit and bottom). The model that presented a higher average correlation value presented the correlations 0.991; 0.843; 0.978 for training, verification and test series. This model had the three series independent in time: first test series, then verification series and, finally, training series. Only six input variables were considered significant to the performance of this model: ammonia, phosphates, dissolved oxygen, water temperature, pH and water evaporation, physical and chemical parameters referring to the three depths of the reservoir. These variables are common to the next four best models produced and, although these included other input variables, their performance was not better than the selected best model.
NASA Astrophysics Data System (ADS)
Ye, H.; Liu, F.; Turner, I.; Anh, V.; Burrage, K.
2013-09-01
Fractional partial differential equations with more than one fractional derivative in time describe some important physical phenomena, such as the telegraph equation, the power law wave equation, or the Szabo wave equation. In this paper, we consider two- and three-dimensional multi-term time and space fractional partial differential equations. The multi-term time-fractional derivative is defined in the Caputo sense, whose order belongs to the interval (1,2],(2,3],(3,4] or (0, m], and the space-fractional derivative is referred to as the fractional Laplacian form. We derive series expansion solutions based on a spectral representation of the Laplacian operator on a bounded region. Some applications are given for the two- and three-dimensional telegraph equation, power law wave equation and Szabo wave equation.
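The spectral representation of the fractional Laplacian that underlies such series expansions can be sketched as follows; this is the standard spectral definition on a bounded region, stated here as an assumption about the setting rather than a reproduction of the paper's derivation.

```latex
% Spectral definition on a bounded region \Omega with Laplacian eigenpairs
% (\lambda_n, \varphi_n), i.e. -\Delta \varphi_n = \lambda_n \varphi_n in \Omega:
(-\Delta)^{\alpha/2} u(\mathbf{x})
  = \sum_{n=1}^{\infty} \lambda_n^{\alpha/2}\, \hat{u}_n\, \varphi_n(\mathbf{x}),
\qquad
\hat{u}_n = \int_{\Omega} u(\mathbf{x})\, \varphi_n(\mathbf{x})\, d\mathbf{x}.
% Expanding u(\mathbf{x},t) = \sum_n u_n(t)\, \varphi_n(\mathbf{x}) then reduces the
% multi-term time-fractional PDE to uncoupled fractional ODEs for the u_n(t).
```

The reduction to one fractional ODE per eigenmode is what makes closed-form series solutions of the telegraph, power law wave and Szabo wave equations tractable.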
NASA Astrophysics Data System (ADS)
Cohen, W. B.; Yang, Z.; Stehman, S.; Huang, C.; Healey, S. P.
2013-12-01
Forest ecosystem process models require spatially and temporally detailed disturbance data to accurately predict fluxes of carbon or changes in biodiversity over time. A variety of new mapping algorithms using dense Landsat time series show great promise for providing disturbance characterizations at an annual time step. These algorithms provide unprecedented detail with respect to timing, magnitude, and duration of individual disturbance events, and causal agent. But all maps have error and disturbance maps in particular can have significant omission error because many disturbances are relatively subtle. Because disturbance, although ubiquitous, can be a relatively rare event spatially in any given year, omission errors can have a great impact on mapped rates. Using a high quality reference disturbance dataset, it is possible to not only characterize map errors but also to adjust mapped disturbance rates to provide unbiased rate estimates with confidence intervals. We present results from a national-level disturbance mapping project (the North American Forest Dynamics project) based on the Vegetation Change Tracker (VCT) with annual Landsat time series and uncertainty analyses that consist of three basic components: response design, statistical design, and analyses. The response design describes the reference data collection, in terms of the tool used (TimeSync), a formal description of interpretations, and the approach for data collection. The statistical design defines the selection of plot samples to be interpreted, whether stratification is used, and the sample size. Analyses involve derivation of standard agreement matrices between the map and the reference data, and use of inclusion probabilities and post-stratification to adjust mapped disturbance rates. Because for NAFD we use annual time series, both mapped and adjusted rates are provided at an annual time step from ~1985-present. 
Preliminary evaluations indicate that VCT captures most of the higher intensity disturbances, but that many of the lower intensity disturbances (thinnings, stress related to insects and disease, etc.) are missed. Because lower intensity disturbances are a large proportion of the total set of disturbances, adjusting mapped disturbance rates to include these can be important for inclusion in ecosystem process models. The described statistical disturbance rate adjustments are aspatial in nature, such that the basic underlying map is unchanged. For spatially explicit ecosystem modeling, such adjustments, although important, can be difficult to directly incorporate. One approach for improving the basic underlying map is an ensemble modeling approach that uses several different complementary maps, each derived from a different algorithm and having their own strengths and weaknesses relative to disturbance magnitude and causal agent of disturbance. We will present results from a pilot study associated with the Landscape Change Monitoring System (LCMS), an emerging national-level program that builds upon NAFD and the well-established Monitoring Trends in Burn Severity (MTBS) program.
Probabilistic mapping of flood-induced backscatter changes in SAR time series
NASA Astrophysics Data System (ADS)
Schlaffer, Stefan; Chini, Marco; Giustarini, Laura; Matgen, Patrick
2017-04-01
The information content of flood extent maps can be increased considerably by including information on the uncertainty of the flood area delineation. This additional information can be of benefit in flood forecasting and monitoring. Furthermore, flood probability maps can be converted to binary maps showing flooded and non-flooded areas by applying a threshold probability value pF = 0.5. In this study, a probabilistic change detection approach for flood mapping based on synthetic aperture radar (SAR) time series is proposed. For this purpose, conditional probability density functions (PDFs) for land and open water surfaces were estimated from ENVISAT ASAR Wide Swath (WS) time series containing >600 images using a reference mask of permanent water bodies. A pixel-wise harmonic model was used to account for seasonality in backscatter from land areas caused by soil moisture and vegetation dynamics. The approach was evaluated for a large-scale flood event along the River Severn, United Kingdom. The retrieved flood probability maps were compared to a reference flood mask derived from high-resolution aerial imagery by means of reliability diagrams. The obtained performance measures indicate both high reliability and confidence although there was a slight under-estimation of the flood extent, which may in part be attributed to topographically induced radar shadows along the edges of the floodplain. Furthermore, the results highlight the importance of local incidence angle for the separability between flooded and non-flooded areas as specular reflection properties of open water surfaces increase with a more oblique viewing geometry.
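The conversion from class-conditional backscatter PDFs to a flood probability map is a direct application of Bayes' rule. The sketch below uses Gaussian PDFs and invented, order-of-magnitude parameter values; the paper estimates its PDFs from >600 ASAR images and adds a pixel-wise harmonic seasonal model for land, which this omits.

```python
import numpy as np

def flood_probability(sigma0, mu_land, sd_land, mu_water, sd_water, prior=0.5):
    """Posterior probability of open water given backscatter sigma0 (dB),
    assuming Gaussian class-conditional PDFs for land and water."""
    def gauss(x, mu, sd):
        return np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
    pw = gauss(sigma0, mu_water, sd_water) * prior          # water likelihood x prior
    pl = gauss(sigma0, mu_land, sd_land) * (1 - prior)      # land likelihood x prior
    return pw / (pw + pl)

# illustrative values only: open water is much darker than land in SAR
sigma0 = np.array([-22.0, -8.0, -15.0])                     # backscatter in dB
pf = flood_probability(sigma0, mu_land=-8.0, sd_land=2.0,
                       mu_water=-20.0, sd_water=2.0)
flooded = pf > 0.5                                          # binary map via pF = 0.5
```

Thresholding the posterior at pF = 0.5 reproduces the binary flooded/non-flooded map mentioned in the abstract, while the probabilities themselves carry the delineation uncertainty.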
Estimating Root Mean Square Errors in Remotely Sensed Soil Moisture over Continental Scale Domains
NASA Technical Reports Server (NTRS)
Draper, Clara S.; Reichle, Rolf; de Jeu, Richard; Naeimi, Vahid; Parinussa, Robert; Wagner, Wolfgang
2013-01-01
Root Mean Square Errors (RMSE) in the soil moisture anomaly time series obtained from the Advanced Scatterometer (ASCAT) and the Advanced Microwave Scanning Radiometer (AMSR-E; using the Land Parameter Retrieval Model) are estimated over a continental scale domain centered on North America, using two methods: triple colocation (RMSE_TC) and error propagation through the soil moisture retrieval models (RMSE_EP). In the absence of an established consensus for the climatology of soil moisture over large domains, presenting an RMSE in soil moisture units requires that it be specified relative to a selected reference data set. To avoid the complications that arise from the use of a reference, the RMSE is presented as a fraction of the time series standard deviation (fRMSE). For both sensors, the fRMSE_TC and fRMSE_EP show similar spatial patterns of relatively high and low errors, and the mean fRMSE for each land cover class is consistent with expectations. Triple colocation is also shown to be surprisingly robust to representativity differences between the soil moisture data sets used, and it is believed to accurately estimate the fRMSE in the remotely sensed soil moisture anomaly time series. Comparing the ASCAT and AMSR-E fRMSE_TC shows that both data sets have very similar accuracy across a range of land cover classes, although the AMSR-E accuracy is more directly related to vegetation cover. In general, both data sets have good skill up to moderate vegetation conditions.
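The two quantities can be illustrated on synthetic data. The sketch below implements classical triple collocation under its usual assumptions (three collocated datasets sharing a common signal, with independent zero-mean errors and identical scaling) and expresses the resulting RMSE as a fraction of the series standard deviation; all numbers are illustrative, not values from the study:

```python
import random
import statistics

def triple_collocation_rmse(x, y, z):
    """Classical triple collocation: estimate the random-error standard
    deviation of each of three collocated datasets from their pairwise
    covariances, assuming independent errors and a common signal."""
    def cov(a, b):
        ma, mb = statistics.fmean(a), statistics.fmean(b)
        return sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b)) / (len(a) - 1)
    cxx, cyy, czz = cov(x, x), cov(y, y), cov(z, z)
    cxy, cxz, cyz = cov(x, y), cov(x, z), cov(y, z)
    ex = (cxx - cxy * cxz / cyz) ** 0.5
    ey = (cyy - cxy * cyz / cxz) ** 0.5
    ez = (czz - cxz * cyz / cxy) ** 0.5
    return ex, ey, ez

def fractional_rmse(rmse, series):
    """Express an RMSE as a fraction of the time series standard deviation."""
    return rmse / statistics.stdev(series)

# Synthetic demo: a common truth plus independent noise of known size
random.seed(0)
truth = [random.gauss(0, 1) for _ in range(20000)]
x = [t + random.gauss(0, 0.3) for t in truth]
y = [t + random.gauss(0, 0.5) for t in truth]
z = [t + random.gauss(0, 0.4) for t in truth]
ex, ey, ez = triple_collocation_rmse(x, y, z)
```

With enough samples the recovered error levels approach the noise standard deviations used to build the series, which is what makes the technique attractive when no error-free reference exists.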
ERIC Educational Resources Information Center
Schaumann, Leif
Intended as a companion piece to volume 7 in the Method Series, Pharmaceutical Supply System Planning (CE 024 234), this fifth of six volumes in the International Health Planning Reference Series is a combined literature review and annotated bibliography dealing with alternative methodologies for planning and analyzing pharmaceutical supply…
Design Methods in Solid Rocket Motors. Revised Version 1988
1988-04-01
Mariani, Luigi; Zavatti, Franco
2017-09-01
The spectral periods of the North Atlantic Oscillation (NAO), the Atlantic Multidecadal Oscillation (AMO) and the El Niño Southern Oscillation (ENSO) were analyzed to verify how they imprint a time series of European temperature anomalies (ETA), two European temperature time series and some phenological series (dates of cherry flowering and grapevine harvest). The reference scenario for this work was the linear causal chain MCTP (Macroscale Circulation→Temperature→Phenology of crops), which links oceanic and atmospheric circulation to surface air temperature, which in turn determines the earliness of appearance of the phenological phases of plants. Results show that the three segments of the MCTP causal chain contain cycles with the following central periods in years (the percentage of the 12 analyzed time series showing each cycle is in brackets): 65 (58%), 24 (58%), 20.5 (58%), 13.5 (50%), 11.5 (58%), 7.7 (75%), 5.5 (58%), 4.1 (58%), 3 (50%), 2.4 (67%). A comparison with short-term spectral peaks of the four El Niño regions (nino1+2, nino3, nino3.4 and nino4) shows that 10 of the 12 series are imprinted by periods around 2.3-2.4 years, while 50-58% of the series are imprinted by El Niño periods of 4-4.2, 3.8-3.9 and 3-3.1 years. The analysis highlights the links among physical and biological variables of the climate system at scales ranging from macroscale to microscale, knowledge of which is crucial for a suitable understanding of ecosystem behavior. The spectral analysis was also applied to a time series of spring-summer precipitation in order to evaluate the presence of peaks common to the other 12 selected series, with substantially negative results, which leads us to rule out the existence of a linear causal chain MCPP (Macroscale Circulation→Precipitation→Phenology). Copyright © 2017 Elsevier B.V. All rights reserved.
Application of time-variable process noise in terrestrial reference frames determined from VLBI data
NASA Astrophysics Data System (ADS)
Soja, Benedikt; Gross, Richard S.; Abbondanza, Claudio; Chin, Toshio M.; Heflin, Michael B.; Parker, Jay W.; Wu, Xiaoping; Balidakis, Kyriakos; Nilsson, Tobias; Glaser, Susanne; Karbon, Maria; Heinkelmann, Robert; Schuh, Harald
2018-05-01
In recent years, Kalman filtering has emerged as a suitable technique to determine terrestrial reference frames (TRFs), a prime example being JTRF2014. The time series approach allows variations of station coordinates that are neither reduced by observational corrections nor considered in the functional model to be taken into account. These variations are primarily due to non-tidal geophysical loading effects that are not reduced according to the current IERS Conventions (2010). It is standard practice that the process noise models applied in Kalman filter TRF solutions are derived from time series of loading displacements and account for station dependent differences. So far, it has been assumed that the parameters of these process noise models are constant over time. However, due to the presence of seasonal and irregular variations, this assumption does not truly reflect reality. In this study, we derive a station coordinate process noise model allowing for such temporal variations. This process noise model and one that is a parameterized version of the former are applied in the computation of TRF solutions based on very long baseline interferometry data. In comparison with a solution based on a constant process noise model, we find that the station coordinates are affected at the millimeter level.
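A toy version of the filtering idea, reduced to a single station coordinate modeled as a random walk with a time-variable (here seasonally modulated) process noise variance. All variances are made-up illustrative numbers; the operational TRF solutions estimate many parameters jointly, not one coordinate:

```python
import math

def kalman_filter_coordinate(obs, obs_var, q):
    """Forward Kalman filter for one station coordinate modeled as a random
    walk: q[k] is the process noise variance added between epochs k-1 and k,
    so a larger q lets the coordinate move more between epochs."""
    x, p = obs[0], obs_var          # initialize state from the first epoch
    states = [x]
    for k in range(1, len(obs)):
        p = p + q[k]                # predict: inflate variance by process noise
        g = p / (p + obs_var)       # Kalman gain
        x = x + g * (obs[k] - x)    # update with the new observation
        p = (1.0 - g) * p
        states.append(x)
    return states

# Seasonal process noise: loading displacements are larger in some parts of
# the year, so allow more coordinate motion then (illustrative values, mm^2)
q = [0.5 + 0.4 * math.sin(2 * math.pi * k / 52.0) for k in range(104)]
obs = [10.0] * 104                  # two years of weekly coordinates, mm
track = kalman_filter_coordinate(obs, obs_var=4.0, q=q)
```

The constant-process-noise baseline in the abstract corresponds to a constant `q`; the study's contribution is, in effect, making `q` a function of time derived from loading displacement series.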
SDCLIREF - A sub-daily gridded reference dataset
NASA Astrophysics Data System (ADS)
Wood, Raul R.; Willkofer, Florian; Schmid, Franz-Josef; Trentini, Fabian; Komischke, Holger; Ludwig, Ralf
2017-04-01
Climate change is expected to impact the intensity and frequency of hydrometeorological extreme events. In order to adequately capture and analyze extreme rainfall events, in particular when assessing flood and flash flood situations, data is required at high spatial and sub-daily resolution, which is often not available in sufficient density and over extended time periods. The ClimEx project (Climate Change and Hydrological Extreme Events) addresses the alteration of hydrological extreme events under climate change conditions. In order to differentiate between a clear climate change signal and the limits of natural variability, unique single-model regional climate model ensembles (CRCM5 driven by CanESM2, RCP8.5) were created for a European and a North American domain, each comprising 50 members of 150 years (1951-2100). In combination with the CORDEX database, this newly created ClimEx ensemble is a one-of-a-kind model dataset for analyzing changes of sub-daily extreme events. For the purpose of bias-correcting the regional climate model ensembles, as well as for the baseline calibration and validation of hydrological catchment models, a new sub-daily (3h), high-resolution (500m) gridded reference dataset (SDCLIREF) was created for a domain covering the Upper Danube and Main watersheds (~100,000 km2). As the sub-daily observations lack a continuous time series for the reference period 1980-2010, the need for a suitable method to bridge the gaps in the discontinuous time series arose. The Method of Fragments (Sharma and Srikanthan (2006); Westra et al. (2012)) was applied to transform daily observations into sub-daily rainfall events, in order to extend the time series and densify the station network.
Prior to applying the Method of Fragments and creating the gridded dataset using rigorous interpolation routines, data collection of observations, operated by several institutions in three countries (Germany, Austria, Switzerland), and the subsequent quality control of the observations was carried out. Among others, the quality control checked for steps, extensive dry seasons, temporal consistency and maximum hourly values. The resulting SDCLIREF dataset provides a robust precipitation reference for hydrometeorological applications in unprecedented high spatio-temporal resolution. References: Sharma, A.; Srikanthan, S. (2006): Continuous Rainfall Simulation: A Nonparametric Alternative. In: 30th Hydrology and Water Resources Symposium 4-7 December 2006, Launceston, Tasmania. Westra, S.; Mehrotra, R.; Sharma, A.; Srikanthan, R. (2012): Continuous rainfall simulation. 1. A regionalized subdaily disaggregation approach. In: Water Resour. Res. 48 (1). DOI: 10.1029/2011WR010489.
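The core of the Method of Fragments can be sketched as follows: a daily total is split into sub-daily values using a vector of fractions ("fragments") observed on a day with sub-daily records. The fragment vectors and the purely random donor choice below are illustrative; the cited method selects fragments from seasonally and climatically similar days at nearby stations, which is omitted here:

```python
import random

def method_of_fragments(daily_total, fragment_library, rng=random):
    """Disaggregate one daily rainfall total into sub-daily (here 3-hourly)
    values by sampling a fragment vector, i.e. the fractions (summing to 1)
    of the daily total that fell in each sub-daily interval on a donor day."""
    fragments = rng.choice(fragment_library)
    return [daily_total * f for f in fragments]

# Hypothetical fragment library from days with sub-daily records: each entry
# holds the 8 three-hourly fractions of that day's total
library = [
    [0.0, 0.1, 0.4, 0.3, 0.1, 0.1, 0.0, 0.0],
    [0.2, 0.2, 0.2, 0.1, 0.1, 0.1, 0.05, 0.05],
]
random.seed(1)
subdaily = method_of_fragments(12.0, library)   # split a 12 mm day into 3 h steps
```

Because each fragment vector sums to one, the disaggregated values always conserve the observed daily total, which is the property that makes the method attractive for building a sub-daily reference dataset.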
Providing web-based tools for time series access and analysis
NASA Astrophysics Data System (ADS)
Eberle, Jonas; Hüttich, Christian; Schmullius, Christiane
2014-05-01
Time series information is widely used in environmental change analyses and is also essential information for stakeholders and governmental agencies. However, a challenging issue is the processing of raw data and the execution of time series analysis. In most cases, data has to be found, downloaded, processed and even converted to the correct data format prior to executing time series analysis tools. Data has to be prepared for use in different existing software packages. Several packages, like TIMESAT (Jönsson & Eklundh, 2004) for phenological studies, BFAST (Verbesselt et al., 2010) for breakpoint detection, and GreenBrown (Forkel et al., 2013) for trend calculations, are provided as open-source software and can be executed from the command line. This is needed if data pre-processing and time series analysis are to be automated. To bring both parts, automated data access and data analysis, together, a web-based system was developed to provide access to satellite-based time series data and to the above-mentioned analysis tools. Users of the web portal are able to specify a point or a polygon and an available dataset (e.g., Vegetation Indices and Land Surface Temperature datasets from NASA MODIS). The data is then processed and provided as a time series CSV file. Afterwards, the user can select an analysis tool that is executed on the server. The final data (CSV, plot images, GeoTIFFs) is visualized in the web portal and can be downloaded for further usage. As a first use case, we built a complementary web-based system with NASA MODIS products for Germany and parts of Siberia based on the Earth Observation Monitor (www.earth-observation-monitor.net). The aim of this work is to make time series analysis with existing tools as easy as possible, so that users can focus on the interpretation of the results. References: Jönsson, P. and L. Eklundh (2004). TIMESAT - a program for analysing time-series of satellite sensor data. 
Computers and Geosciences 30, 833-845. Verbesselt, J., R. Hyndman, G. Newnham and D. Culvenor (2010). Detecting trend and seasonal changes in satellite image time series. Remote Sensing of Environment, 114, 106-115. DOI: 10.1016/j.rse.2009.08.014 Forkel, M., N. Carvalhais, J. Verbesselt, M. Mahecha, C. Neigh and M. Reichstein (2013). Trend Change Detection in NDVI Time Series: Effects of Inter-Annual Variability and Methodology. Remote Sensing 5, 2113-2144.
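As an example of the kind of per-pixel computation such a portal chains onto the extracted CSV series, the simplest of the analyses mentioned (a linear trend, as computed by tools like GreenBrown) can be sketched as follows; the NDVI values are synthetic and the function is a generic least-squares slope, not the package's actual implementation:

```python
def linear_trend(values):
    """Ordinary least-squares slope per time step for an evenly sampled
    series, e.g. a per-pixel NDVI trend."""
    n = len(values)
    tm = (n - 1) / 2.0                       # mean of the time index 0..n-1
    ym = sum(values) / n
    denom = sum((i - tm) ** 2 for i in range(n))
    return sum((i - tm) * (y - ym) for i, y in enumerate(values)) / denom

ndvi = [0.30 + 0.002 * i for i in range(120)]   # ten years of monthly NDVI
slope = linear_trend(ndvi)                       # NDVI units per month
```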
40 CFR 63.653 - Monitoring, recordkeeping, and implementation plan for emissions averaging.
Code of Federal Regulations, 2011 CFR
2011-07-01
... § 63.120 of subpart G; and (ii) For closed vent systems with control devices, conduct an initial design... different times, and/or in different submittals, later submittals may refer to earlier submittals instead of... controlled using a treatment process or series of treatment processes that achieves an emission reduction...
The Physics of Equestrian Show Jumping
ERIC Educational Resources Information Center
Stinner, Art
2014-01-01
This article discusses the kinematics and dynamics of equestrian show jumping. For some time I have attended a series of show jumping events at Spruce Meadows, an international equestrian center near Calgary, Alberta, often referred to as the "Wimbledon of equestrian jumping." I have always had a desire to write an article such as this…
New Roles for New Times: Digital Curation for Preservation
ERIC Educational Resources Information Center
Walters, Tyler; Skinner, Katherine
2011-01-01
Digital curation refers to the actions people take to maintain and add value to digital information over its lifecycle, including the processes used when creating digital content. Digital preservation focuses on the "series of managed activities necessary to ensure continued access to digital materials for as long as necessary." In this…
Using the Microcomputer to Generate Materials for Bibliographic Instruction.
ERIC Educational Resources Information Center
Hendley, Gaby G.
Guide-worksheets were developed on a word processor in a high school library for bibliographic instruction of English and social studies students to cover the following reference sources: Facts on File; Social Issues Resource Series (S.I.R.S.); Editorial Research Reports; Great Contemporary Issues (New York Times), which also includes Facts on…
Forecasting Techniques and Library Circulation Operations: Implications for Management.
ERIC Educational Resources Information Center
Ahiakwo, Okechukwu N.
1988-01-01
Causal regression and time series models were developed using six years of data for home borrowing, average readership, and books consulted at a university library. The models were tested for efficacy in producing short-term planning and control data. Combined models were tested in establishing evaluation measures. (10 references) (Author/MES)
Smoothing Forecasting Methods for Academic Library Circulations: An Evaluation and Recommendation.
ERIC Educational Resources Information Center
Brooks, Terrence A.; Forys, John W., Jr.
1986-01-01
Circulation time-series data from 50 midwest academic libraries were used to test 110 variants of 8 smoothing forecasting methods. Data and methodologies and illustrations of two recommended methods--the single exponential smoothing method and Brown's one-parameter linear exponential smoothing method--are given. Eight references are cited. (EJS)
Evaluation as Story: The Narrative Quality of Educational Evaluation.
ERIC Educational Resources Information Center
Wachtman, Edward L.
The author presents his opinion that educational evaluation has much similarity to the nonfiction narrative (defined as a series of events ordered in time), particularly as it relates a current situation to future possibilities. He refers to Stake's statement that evaluation is concerned not only with outcomes but also with antecedents and with…
14 CFR Appendix A to Part 150 - Noise Exposure Maps
Code of Federal Regulations, 2012 CFR
2012-01-01
... series of n events in time period T, in seconds. Note: When T is one hour, LT is referred to as one-hour... sound attenuation into the design and construction of a structure may be necessary to achieve..., noise exposure maps prepared in connection with studies which were either Federally funded or Federally...
14 CFR Appendix A to Part 150 - Noise Exposure Maps
Code of Federal Regulations, 2010 CFR
2010-01-01
... series of n events in time period T, in seconds. Note: When T is one hour, LT is referred to as one-hour... sound attenuation into the design and construction of a structure may be necessary to achieve..., noise exposure maps prepared in connection with studies which were either Federally funded or Federally...
14 CFR Appendix A to Part 150 - Noise Exposure Maps
Code of Federal Regulations, 2014 CFR
2014-01-01
... series of n events in time period T, in seconds. Note: When T is one hour, LT is referred to as one-hour... sound attenuation into the design and construction of a structure may be necessary to achieve..., noise exposure maps prepared in connection with studies which were either Federally funded or Federally...
14 CFR Appendix A to Part 150 - Noise Exposure Maps
Code of Federal Regulations, 2013 CFR
2013-01-01
... series of n events in time period T, in seconds. Note: When T is one hour, LT is referred to as one-hour... sound attenuation into the design and construction of a structure may be necessary to achieve..., noise exposure maps prepared in connection with studies which were either Federally funded or Federally...
14 CFR Appendix A to Part 150 - Noise Exposure Maps
Code of Federal Regulations, 2011 CFR
2011-01-01
... series of n events in time period T, in seconds. Note: When T is one hour, LT is referred to as one-hour... sound attenuation into the design and construction of a structure may be necessary to achieve..., noise exposure maps prepared in connection with studies which were either Federally funded or Federally...
Whomersley, P; Schratzberger, M; Huxham, M; Bates, H; Rees, H
2007-01-01
Sewage sludge was disposed of in Liverpool Bay for over 100 years. Annual amounts increased from 0.5 million tonnes per annum in 1900 to approximately 2 million tonnes per annum by 1995. Macrofauna and a suite of environmental variables were collected at a station adjacent to, and a reference station distant from, the disposal site over 13 years, spanning a pre- (1990-1998) and post- (1999-2003) cessation period. Univariate and multivariate analyses of the time-series data showed significant community differences between reference and disposal site stations and multivariate analyses revealed station-specific community development post-disposal. Temporal variability of communities collected at the disposal station post-cessation was higher than during years of disposal, when temporally stable dominance patterns of disturbance-tolerant species had established. Alterations of community structure post-disturbance reflected successional changes possibly driven by facilitation. Subtle faunistic changes at the Liverpool Bay disposal site indicate that the near-field effects of the disposal of sewage sludge were small and therefore could be considered environmentally acceptable.
Preparations for the IGS realization of ITRF2014
NASA Astrophysics Data System (ADS)
Rebischung, Paul; Schmid, Ralf
2016-04-01
The International GNSS Service (IGS) currently prepares its own realization, called IGS14, of the latest release of the International Terrestrial Reference Frame (ITRF2014). This preparation involves: - a selection of the most suitable reference frame (RF) stations from the complete set of GNSS stations in ITRF2014; - the design of a well-distributed core network of RF stations for the purpose of aligning global GNSS solutions; - a re-evaluation of the GPS and GLONASS satellite antenna phase center offsets (PCOs), based on the SINEX files provided by the IGS Analysis Centers (ACs) in the frame of the second IGS reprocessing campaign repro2. This presentation will first cover the criteria used for the selection of the IGS14 and IGS14 core RF stations as well as preliminary station selection results. We will then use the preliminary IGS14 RF to re-align the daily IGS combined repro2 SINEX solutions and study the impact of the RF change on GNSS-derived geodetic parameter time series. In a second part, we will focus on the re-evaluation of the GNSS satellite antenna PCOs. A re-evaluation of at least their radial (z) components is indeed required, despite the negligible scale difference between ITRF2008 and ITRF2014, because of modeling changes recently introduced within the IGS which affect the scale of GNSS terrestrial frames (Earth radiation pressure, antenna thrust). Moreover, the 13 GPS and GLONASS satellites launched since September 2012 are currently assigned preliminary block-specific mean PCO values which need to be updated. From the daily AC repro2 SINEX files, we will therefore derive time series of satellite z-PCO estimates and analyze the resulting time series. Since several ACs provided all three components of the satellite PCOs in their SINEX files, we will additionally derive similar x- and y-PCO time series and discuss the relevance of their potential re-evaluation.
Laboratory techniques and rhythmometry
NASA Technical Reports Server (NTRS)
Halberg, F.
1973-01-01
Some of the procedures used for the analysis of rhythms are illustrated, notably as these apply to current medical and biological practice. For a quantitative approach to medical and broader socio-ecologic goals, the chronobiologist gathers numerical objective reference standards for rhythmic biophysical, biochemical, and behavioral variables. These biological reference standards can be derived by specialized computer analyses of largely self-measured (until eventually automatically recorded) time series (autorhythmometry). Objective numerical values for individual and population parameters of reproductive cycles can be obtained concomitantly with characteristics of about-yearly (circannual), about-daily (circadian) and other rhythms.
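The specialized computer analyses mentioned above include rhythm-parameter fits such as the single-component cosinor, which summarizes a circadian series by its rhythm-adjusted mean (MESOR), amplitude and acrophase. A sketch, assuming samples that evenly cover whole cycles (so the least-squares fit reduces to discrete Fourier coefficients); the temperature-like values are synthetic:

```python
import math

def cosinor_fit(times, values, period=24.0):
    """Single-component cosinor: estimate MESOR M, amplitude A and
    acrophase phi of y(t) = M + A*cos(w*t + phi), w = 2*pi/period.
    Assumes samples evenly spanning whole cycles, so cos and sin terms
    are orthogonal to the constant and to each other."""
    n = len(values)
    w = 2 * math.pi / period
    mesor = sum(values) / n
    beta = 2.0 / n * sum(y * math.cos(w * t) for t, y in zip(times, values))
    gamma = 2.0 / n * sum(y * math.sin(w * t) for t, y in zip(times, values))
    amplitude = math.hypot(beta, gamma)
    acrophase = math.atan2(-gamma, beta)   # phase (rad) of the cosine peak
    return mesor, amplitude, acrophase

# Two days of hourly self-measurements with a known 24 h rhythm
times = [float(i) for i in range(48)]
values = [36.8 + 0.4 * math.cos(2 * math.pi * t / 24.0 - 1.0) for t in times]
m, a, phi = cosinor_fit(times, values)
```

With irregular or gappy autorhythmometry data the full least-squares normal equations would be needed instead of the Fourier shortcut; the parameters recovered are the same.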
NASA Astrophysics Data System (ADS)
Mantegna, Rosario N.; Stanley, H. Eugene
2007-08-01
Preface; 1. Introduction; 2. Efficient market hypothesis; 3. Random walk; 4. Lévy stochastic processes and limit theorems; 5. Scales in financial data; 6. Stationarity and time correlation; 7. Time correlation in financial time series; 8. Stochastic models of price dynamics; 9. Scaling and its breakdown; 10. ARCH and GARCH processes; 11. Financial markets and turbulence; 12. Correlation and anti-correlation between stocks; 13. Taxonomy of a stock portfolio; 14. Options in idealized markets; 15. Options in real markets; Appendix A: notation guide; Appendix B: martingales; References; Index.
Apparatus and method for compensating for clock drift in downhole drilling components
Hall, David R [Provo, UT; Pixton, David S [Lehi, UT; Johnson, Monte L [Orem, UT; Bartholomew, David B [Springville, UT; Hall, Jr., H. Tracy
2007-08-07
A precise downhole clock that compensates for drift includes a prescaler configured to receive electrical pulses from an oscillator. The prescaler is configured to output a series of clock pulses. The prescaler outputs each clock pulse after counting a preloaded number of electrical pulses from the oscillator. The prescaler is operably connected to a compensator module for adjusting the number loaded into the prescaler. By adjusting the number that is loaded into the prescaler, the timing may be advanced or retarded to more accurately synchronize the clock pulses with a reference time source. The compensator module is controlled by a counter-based trigger module configured to trigger the compensator module to load a value into the prescaler. Finally, a time-base logic module is configured to calculate the drift of the downhole clock by comparing the time of the downhole clock with a reference time source.
The Integration of Research in Judgment and Decision Theory
1980-07-01
off at any one of a series of choice points in a basically linear, unidimensional, all-or-none series of relays is at least in part the result of the...Subjective and objective referents. An objective referent requires a series of observations in which inter-observer reliabilities approximate unity; as... series of studies by Brehmer (1980). More generally, research as far back as that of Krechevsky's in the 1930s was conducted precisely to show that
Comparison of ITRF2014 station coordinate input time series of DORIS, VLBI and GNSS
NASA Astrophysics Data System (ADS)
Tornatore, Vincenza; Tanır Kayıkçı, Emine; Roggero, Marco
2016-12-01
In this paper, station coordinate time series from three space geodesy techniques that have contributed to the realization of the International Terrestrial Reference Frame 2014 (ITRF2014) are compared. In particular, the height component time series extracted from the official combined intra-technique solutions submitted for ITRF2014 by the DORIS, VLBI and GNSS Combination Centers have been investigated. The main goal of this study is to assess the level of agreement among these three space geodetic techniques. A novel analytic method, modeling time series as discrete-time Markov processes, is presented and applied to the compared time series. The analysis method has proven to be particularly suited to obtaining quasi-cyclostationary residuals, an important property for carrying out a reliable harmonic analysis. We looked for common signatures among the three techniques. Frequencies and amplitudes of the detected signals have been reported along with their percentage of incidence. Our comparison shows that two of the estimated signals, with one-year and 14-day periods, are common to all the techniques. Different hypotheses on the nature of the signal with a period of 14 days are presented. As a final check we have compared the estimated velocities and their standard deviations (STD) for sites with co-located VLBI, GNSS and DORIS stations, obtaining a good agreement among the three techniques both in the horizontal (1.0 mm/yr mean STD) and in the vertical (0.7 mm/yr mean STD) component, although some sites show larger STDs, mainly due to lack of data, different data spans or noisy observations.
NASA Astrophysics Data System (ADS)
Nahmani, S.; Coulot, D.; Biancale, R.; Bizouard, C.; Bonnefond, P.; Bouquillon, S.; Collilieux, X.; Deleflie, F.; Garayt, B.; Lambert, S. B.; Laurent-Varin, S.; Marty, J. C.; Mercier, F.; Metivier, L.; Meyssignac, B.; Pollet, A.; Rebischung, P.; Reinquin, F.; Richard, J. Y.; Tertre, F.; Woppelmann, G.
2017-12-01
Many major indicators of climate change are monitored with space observations. This monitoring is highly dependent on references that only geodesy can provide. The current accuracy of these references does not fully support the challenges that the constantly evolving Earth system gives rise to, and can consequently limit the accuracy of these indicators. Thus, in the framework of the GGOS, stringent requirements have been set for the International Terrestrial Reference Frame (ITRF) for the next decade: an accuracy at the level of 1 mm and a stability at the level of 0.1 mm/yr. This means an improvement of the current quality of the ITRF by a factor of 5-10. Improving the quality of the geodetic references is an issue which requires a thorough reassessment of the methodologies involved. The most relevant and promising method to improve this quality is the direct combination of the space-geodetic measurements used to compute the official references of the IERS. The GEODESIE project aims at (i) determining highly accurate, global and consistent references and (ii) providing the geophysical and climate research communities with these references, for a better estimation of geocentric sea level rise, ice mass balance and ongoing climate changes. Time series of sea levels computed from altimetric data and tide gauge records with these references will also be provided. The geodetic references will be essential bases for Earth observation and monitoring to support the challenges of the century. The geocentric time series of sea levels will make it possible to better apprehend (i) the drivers of the global mean sea level rise and of regional variations of sea level and (ii) the contribution of the global climate change induced by anthropogenic greenhouse gas emissions to these drivers. 
All the results and computation and quality assessment reports will be available at geodesie_anr.ign.fr. This project, supported by the French Agence Nationale de la Recherche (ANR) for the period 2017-2020, will be an unprecedented opportunity to provide the French Groupe de Recherche de Géodésie Spatiale (GRGS) with complete simulation and data processing capabilities to prepare for the future arrival of space missions such as the European Geodetic Reference Antenna in SPace (E-GRASP) and to significantly contribute to the GGOS with accurate references.
NASA Astrophysics Data System (ADS)
Wziontek, H.; Palinkas, V.; Falk, R.; Vaľko, M.
2016-12-01
For decades, absolute gravimeters have been compared on a regular basis at the international level, starting at the International Bureau of Weights and Measures (BIPM) in 1981. Usually, these comparisons are based on constant reference values deduced from all accepted measurements acquired during the comparison period; temporal changes between comparison epochs are usually not considered. Resolution No. 2, adopted by the IAG during the IUGG General Assembly in Prague in 2015, initiates the establishment of a Global Absolute Gravity Reference System based on key comparisons of absolute gravimeters (AG) under the International Committee for Weights and Measures (CIPM), in order to establish a common level in the microGal range. A stable and unique reference frame can only be achieved if different AGs take part in different kinds of comparisons. Systematic deviations between the respective comparison reference values can be detected if the AGs can be considered stable over time. The continuous operation of superconducting gravimeters (SG) at selected stations further supports the temporal link of comparison reference values by establishing a reference function over time. Through a homogeneous reprocessing of different comparison epochs, including AG and SG time series at selected stations, links between several comparisons will be established and temporal comparison reference functions will be derived. In this way, comparisons on a regional level can be traced back to the level of key comparisons, providing a reference for other absolute gravimeters. It will be demonstrated and discussed how such a concept can be used to support the future absolute gravity reference system.
Conditional adaptive Bayesian spectral analysis of nonstationary biomedical time series.
Bruce, Scott A; Hall, Martica H; Buysse, Daniel J; Krafty, Robert T
2018-03-01
Many studies of biomedical time series signals aim to measure the association between frequency-domain properties of time series and clinical and behavioral covariates. However, the time-varying dynamics of these associations are largely ignored due to a lack of methods that can assess the changing nature of the relationship through time. This article introduces a method for the simultaneous and automatic analysis of the association between the time-varying power spectrum and covariates, which we refer to as conditional adaptive Bayesian spectrum analysis (CABS). The procedure adaptively partitions the grid of time and covariate values into an unknown number of approximately stationary blocks and nonparametrically estimates local spectra within blocks through penalized splines. CABS is formulated in a fully Bayesian framework, in which the number and locations of partition points are random, and fit using reversible jump Markov chain Monte Carlo techniques. Estimation and inference averaged over the distribution of partitions allows for the accurate analysis of spectra with both smooth and abrupt changes. The proposed methodology is used to analyze the association between the time-varying spectrum of heart rate variability and self-reported sleep quality in a study of older adults serving as the primary caregiver for their ill spouse. © 2017, The International Biometric Society.
The long-range correlation and evolution law of centennial-scale temperatures in Northeast China.
Zheng, Xiaohui; Lian, Yi; Wang, Qiguang
2018-01-01
This paper applies the detrended fluctuation analysis (DFA) method to investigate the long-range correlation of monthly mean temperatures from three typical measurement stations at Harbin, Changchun, and Shenyang in Northeast China from 1909 to 2014. The results reveal the memory characteristics of the climate system in this region. By comparing the temperatures from different time periods and investigating the variations of its scaling exponents at the three stations during these different time periods, we found that the monthly mean temperature has long-range correlation, which indicates that the temperature in Northeast China has long-term memory and good predictability. The monthly time series of temperatures over the past 106 years also shows good long-range correlation characteristics. These characteristics are also obviously observed in the annual mean temperature time series. Finally, we separated the centennial-length temperature time series into two time periods. These results reveal that the long-range correlations at the Harbin station over these two time periods have large variations, whereas no obvious variations are observed at the other two stations. This indicates that warming affects the regional climate system's predictability differently at different time periods. The research results can provide a quantitative reference point for regional climate predictability assessment and future climate model evaluation.
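The DFA procedure used in the study can be sketched as follows (first-order DFA with non-overlapping windows): the mean-removed series is integrated into a profile, the profile is linearly detrended inside windows of each scale, and the scaling exponent alpha is the slope of log F(s) versus log s. The white-noise check at the end simply verifies that an uncorrelated series yields alpha near 0.5, the boundary below which no long-range correlation is present:

```python
import math
import random

def dfa_exponent(series, scales):
    """First-order detrended fluctuation analysis: return the scaling
    exponent alpha of the fluctuation function F(s) ~ s**alpha."""
    mean = sum(series) / len(series)
    profile, acc = [], 0.0
    for x in series:                       # integrate the mean-removed series
        acc += x - mean
        profile.append(acc)
    log_s, log_f = [], []
    for s in scales:
        n_win = len(profile) // s
        tm = (s - 1) / 2.0
        denom = sum((t - tm) ** 2 for t in range(s))
        sq = 0.0
        for w in range(n_win):             # linear detrend in each window
            seg = profile[w * s:(w + 1) * s]
            sm = sum(seg) / s
            slope = sum((t - tm) * (y - sm) for t, y in enumerate(seg)) / denom
            sq += sum((y - sm - slope * (t - tm)) ** 2 for t, y in enumerate(seg))
        log_s.append(math.log(s))
        log_f.append(math.log(math.sqrt(sq / (n_win * s))))
    xm, ym = sum(log_s) / len(log_s), sum(log_f) / len(log_f)
    return (sum((a - xm) * (b - ym) for a, b in zip(log_s, log_f))
            / sum((a - xm) ** 2 for a in log_s))

# Uncorrelated noise gives alpha near 0.5; long-memory series (such as the
# monthly temperatures in the study) give alpha above 0.5
random.seed(42)
alpha = dfa_exponent([random.gauss(0, 1) for _ in range(4096)],
                     [8, 16, 32, 64, 128])
```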
Evaluation from a 3-Year Time Series of Daily Actual Evapotranspiration over the Tibetan Plateau
NASA Astrophysics Data System (ADS)
Faivre, R.; Menenti, M.
2016-08-01
The estimation of turbulent fluxes is of primary interest for hydrological and climatological studies. The use of optical remote sensing data in the VNIR and TIR domains has already proved suitable for parameterizing the surface energy balance, leading to many algorithms. Their use over arid, high-elevation areas requires a detailed characterisation of key surface physical properties and of the atmospheric state at a reference level. Satellite products acquired over the Tibetan Plateau and simulation results delivered in the frame of the CEOP-AEGIS project provide incentives for a regular analysis at medium scale. This work aims at evaluating the use of Feng-Yun 2 series and MODIS data (VNIR and TIR) for daily mapping of land surface evapotranspiration (ET) based on the SEBI algorithm over the whole Tibetan Plateau (Faivre, 2014). An evaluation is performed over reference sites set up across the Tibetan Plateau.
Stochastic modeling of hourly rainfall times series in Campania (Italy)
NASA Astrophysics Data System (ADS)
Giorgio, M.; Greco, R.
2009-04-01
Occurrence of flowslides and floods in small catchments is difficult to predict, since it is affected by a number of variables, such as mechanical and hydraulic soil properties, slope morphology, vegetation coverage, and rainfall spatial and temporal variability. Consequently, landslide risk assessment procedures and early warning systems still rely on simple empirical models based on correlations between recorded rainfall data and observed landslides and/or river discharges. The effectiveness of such systems could be improved by reliable quantitative rainfall prediction, which allows gaining larger lead times. Analysis of on-site recorded rainfall height time series represents the most effective approach for a reliable prediction of the local temporal evolution of rainfall. Hydrological time series analysis is a widely studied field in hydrology, often carried out by means of autoregressive models, such as AR, ARMA, ARX, and ARMAX (e.g. Salas [1992]). Such models give the best results when applied to the analysis of autocorrelated hydrological time series, like river flow or level time series. Conversely, they are not able to model the behaviour of intermittent time series, like point rainfall height series usually are, especially when recorded with short sampling time intervals. More useful for this purpose are the so-called DRIP (Disaggregated Rectangular Intensity Pulse) and NSRP (Neyman-Scott Rectangular Pulse) models [Heneker et al., 2001; Cowpertwait et al., 2002], usually adopted to generate synthetic point rainfall series. In this paper, the DRIP model approach is adopted, in which the sequence of rain storms and dry intervals constituting the structure of the rainfall time series is modeled as an alternating renewal process. The final aim of the study is to provide a useful tool to implement an early warning system for hydrogeological risk management.
Model calibration has been carried out with hourly rainfall height data provided by the rain gauges of the Campania Region civil protection agency meteorological warning network. ACKNOWLEDGEMENTS The research was co-financed by the Italian Ministry of University, by means of the PRIN 2006 program, within the research project entitled 'Definition of critical rainfall thresholds for destructive landslides for civil protection purposes'. REFERENCES Cowpertwait, P.S.P., Kilsby, C.G. and O'Connell, P.E., 2002. A space-time Neyman-Scott model of rainfall: Empirical analysis of extremes, Water Resources Research, 38(8):1-14. Salas, J.D., 1992. Analysis and modeling of hydrological time series, in D.R. Maidment, ed., Handbook of Hydrology, McGraw-Hill, New York. Heneker, T.M., Lambert, M.F. and Kuczera, G., 2001. A point rainfall model for risk-based design, Journal of Hydrology, 247(1-2):54-71.
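The alternating renewal structure at the heart of the DRIP approach can be sketched in a few lines; the exponential duration and intensity distributions and all parameter values below are hypothetical placeholders, whereas the actual model fits richer laws to observed records:

```python
import numpy as np

def simulate_rainfall(n_hours, mean_dry=30.0, mean_wet=6.0,
                      mean_intensity=2.0, seed=0):
    """Generate an hourly rainfall series as an alternating renewal process.

    Dry spells and storms alternate; durations are drawn from exponential
    distributions and each storm is a rectangular pulse with a random
    mean intensity (mm/h). Illustrative distributions only.
    """
    rng = np.random.default_rng(seed)
    rain = np.zeros(n_hours)
    t = 0.0
    while t < n_hours:
        t += rng.exponential(mean_dry)             # dry interval
        dur = rng.exponential(mean_wet)            # storm duration
        intensity = rng.exponential(mean_intensity)
        lo, hi = int(t), min(int(t + dur) + 1, n_hours)
        rain[lo:hi] = intensity                    # rectangular pulse
        t += dur
    return rain

series = simulate_rainfall(24 * 365)
print(f"wet fraction: {np.mean(series > 0):.2f}")
```

Calibration would then amount to choosing the duration and intensity distributions so that simulated series reproduce the statistics of the gauge records.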
IGS14/igs14.atx: a new Framework for the IGS Products
NASA Astrophysics Data System (ADS)
Rebischung, P.; Schmid, R.
2016-12-01
The International GNSS Service (IGS) is about to switch to a new reference frame (IGS14), based on the latest release of the International Terrestrial Reference Frame (ITRF2014), as the basis for its products. An updated set of satellite and ground antenna calibrations (igs14.atx) will become effective at the same time. IGS14 and igs14.atx will then replace the previous IGS08/igs08.atx framework in use since GPS week 1632 (17 April 2011) and in the second IGS reprocessing campaign (repro2). Despite the negligible scale difference between ITRF2008 and ITRF2014 (0.02 ppb), the radial components of all GPS and GLONASS satellite antenna phase center offsets (z-PCOs) had to be updated in igs14.atx, because of modeling changes recently introduced within the IGS that affect the scale of the IGS products. This was achieved by deriving and averaging time series of satellite z-PCO estimates, consistent with the ITRF2014 scale, from the daily repro2 and latest operational SINEX solutions of seven IGS Analysis Centers (ACs). Compared to igs08.atx, igs14.atx includes robot calibrations for 16 additional ground antenna types, so that the percentage of stations with absolute calibrations in the IGS network will reach 90% after the switch. 19 type-mean robot calibrations were also updated thanks to the availability of calibration results for additional antenna samples. IGS14 is basically an extract of well-suited reference frame stations (i.e., with long and stable position time series) from ITRF2014. However, to make the IGS14 station coordinates consistent with the new igs14.atx ground antenna calibrations, position offsets due to the switch from igs08.atx to igs14.atx were derived for all IGS14 stations affected by ground antenna calibration updates and applied to their ITRF2014 coordinates. This presentation will first detail the different steps of the elaboration of IGS14 and igs14.atx. 
The impact of the switch on GNSS-derived geodetic parameter time series will then be assessed by re-aligning the daily repro2 and latest operational IGS combined SINEX solutions to IGS14/igs14.atx. A particular focus will finally be given to the biases and trends present in the satellite z-PCO time series derived from the daily AC SINEX solutions, and to their interpretation in terms of scale and scale rate of the terrestrial frame.
Deconvolution of time series in the laboratory
NASA Astrophysics Data System (ADS)
John, Thomas; Pietschmann, Dirk; Becker, Volker; Wagner, Christian
2016-10-01
In this study, we present two practical applications of the deconvolution of time series in Fourier space. First, we reconstruct a filtered input signal of sound cards that has been heavily distorted by a built-in high-pass filter using a software approach. Using deconvolution, we can partially bypass the filter and extend the dynamic frequency range by two orders of magnitude. Second, we construct required input signals for a mechanical shaker in order to obtain arbitrary acceleration waveforms, referred to as feedforward control. For both situations, experimental and theoretical approaches are discussed to determine the system-dependent frequency response. Moreover, for the shaker, we propose a simple feedback loop as an extension to the feedforward control in order to handle nonlinearities of the system.
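The Fourier-space deconvolution the authors apply can be sketched as follows; the exponential impulse response and the regularisation constant are illustrative assumptions, not the paper's measured frequency response:

```python
import numpy as np

def deconvolve(output, response, eps=1e-3):
    """Recover a linear system's input by spectral division.

    Divide the FFT of the measured output by the system's frequency
    response; the eps floor (a crude Wiener-style regularisation)
    keeps frequencies where the response is near zero from blowing up.
    """
    Y = np.fft.fft(output)
    H = np.fft.fft(response, n=len(output))
    X = Y * np.conj(H) / (np.abs(H) ** 2 + eps)
    return np.real(np.fft.ifft(X))

# Demo: distort a step with a known low-pass impulse response, then undo it.
x = np.zeros(256)
x[64:192] = 1.0
h = np.exp(-np.arange(256) / 5.0)
h /= h.sum()                                               # normalised response
y = np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(h)))    # circular convolution
x_rec = deconvolve(y, h)
rel_err = np.linalg.norm(x_rec - x) / np.linalg.norm(x)
print(round(float(rel_err), 3))
```

In practice the frequency response H would come from the experimental or theoretical system characterisation discussed in the abstract, not from an assumed kernel.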
Long-Term Variations of the EOP and ICRF2
NASA Technical Reports Server (NTRS)
Zharov, Vladimir; Sazhin, Mikhail; Sementsov, Valerian; Sazhina, Olga
2010-01-01
We analyzed the time series of the coordinates of the ICRF radio sources. We show that some of the radio sources, including defining sources, exhibit significant apparent motion. The stability of the celestial reference frame is provided by a no-net-rotation condition applied to the defining sources. In our case this condition leads to a rotation of the frame axes with time. We calculated the effect of this rotation on the Earth orientation parameters (EOP). In order to improve the stability of the celestial reference frame we suggest a new method for the selection of the defining sources. The method consists of two criteria: the first we call cosmological and the second kinematical. It is shown that a subset of the ICRF sources selected according to the cosmological criteria provides the most stable reference frame for the next decade.
NASA Astrophysics Data System (ADS)
Gowda, P. H.
2016-12-01
Evapotranspiration (ET) is an important process in an ecosystem's water budget and is closely linked to its productivity. Therefore, regional-scale daily time series ET maps developed at high and medium resolutions have large utility in studying the carbon-energy-water nexus and managing water resources. There are efforts to develop such datasets on regional to global scales, but they often face the limitations of spatial-temporal resolution tradeoffs in satellite remote sensing technology. In this study, we developed frameworks for generating high and medium resolution daily ET maps from Landsat and MODIS (Moderate Resolution Imaging Spectroradiometer) data, respectively. For developing high resolution (30-m) daily time series ET maps with Landsat TM data, the series version of the Two Source Energy Balance (TSEB) model was used to compute sensible and latent heat fluxes of soil and canopy separately. Landsat 5 (2000-2011) and Landsat 8 (2013-2014) imagery for paths/rows 28/35 and 27/36 covering central Oklahoma was used. MODIS data (2001-2014) covering Oklahoma and the Texas Panhandle was used to develop medium resolution (250-m) time series daily ET maps with the SEBS (Surface Energy Balance System) model. An extensive network of weather stations managed by the Texas High Plains ET Network and the Oklahoma Mesonet was used to generate spatially interpolated inputs of air temperature, relative humidity, wind speed, solar radiation, pressure, and reference ET. A linear interpolation sub-model was used to estimate the daily ET between the image acquisition days. Accuracy assessment of the daily ET maps was done against eddy covariance data from two grassland sites at El Reno, OK. Statistical results indicated good performance by the modeling frameworks developed for deriving time series ET maps. Results indicated that the proposed ET mapping framework is suitable for deriving daily time series ET maps at regional scale with Landsat and MODIS data.
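The linear interpolation sub-model can be illustrated with a toy example. One common variant, assumed here, interpolates the fraction of reference ET between overpasses and rescales by the daily reference ET; the overpass days, fractions, and reference-ET curve below are synthetic:

```python
import numpy as np

# Hypothetical overpass days (day of year) and the per-pixel ET fractions
# (ET / ETref) retrieved by the energy balance model on those days.
overpass_doy = np.array([100, 116, 132, 148])
et_fraction  = np.array([0.55, 0.70, 0.62, 0.48])

days = np.arange(100, 149)
# Synthetic daily reference ET (mm/day) from the weather station network.
etref_daily = 5.0 + 1.5 * np.sin(2 * np.pi * (days - 80) / 365)

# Linearly interpolate the fraction between acquisitions, then rescale.
frac_daily = np.interp(days, overpass_doy, et_fraction)
et_daily = frac_daily * etref_daily

print(round(float(et_daily[0]), 2))
```

Because the interpolated quantity is a ratio rather than ET itself, day-to-day weather variability is carried by the daily reference ET while the satellite retrievals anchor the surface condition.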
Hu, Kaifeng; Ellinger, James J; Chylla, Roger A; Markley, John L
2011-12-15
Time-zero 2D (13)C HSQC (HSQC(0)) spectroscopy offers advantages over traditional 2D NMR for quantitative analysis of solutions containing a mixture of compounds because the signal intensities are directly proportional to the concentrations of the constituents. The HSQC(0) spectrum is derived from a series of spectra collected with increasing repetition times within the basic HSQC block by extrapolating the repetition time to zero. Here we present an alternative approach to data collection, gradient-selective time-zero (1)H-(13)C HSQC(0) in combination with fast maximum likelihood reconstruction (FMLR) data analysis and the use of two concentration references for absolute concentration determination. Gradient-selective data acquisition results in cleaner spectra, and NMR data can be acquired in both constant-time and non-constant-time mode. Semiautomatic data analysis is supported by the FMLR approach, which is used to deconvolute the spectra and extract peak volumes. The peak volumes obtained from this analysis are converted to absolute concentrations by reference to the peak volumes of two internal reference compounds of known concentration: DSS (4,4-dimethyl-4-silapentane-1-sulfonic acid) at the low concentration limit (which also serves as chemical shift reference) and MES (2-(N-morpholino)ethanesulfonic acid) at the high concentration limit. The linear relationship between peak volumes and concentration is better defined with two references than with one, and the measured absolute concentrations of individual compounds in the mixture are more accurate. We compare results from semiautomated gsHSQC(0) with those obtained by the original manual phase-cycled HSQC(0) approach. The new approach is suitable for automatic metabolite profiling by simultaneous quantification of multiple metabolites in a complex mixture.
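The time-zero extrapolation underlying HSQC(0), together with the two-point concentration calibration, can be sketched as follows; the peak volumes and reference concentrations are synthetic, and real analyses extrapolate FMLR-derived peak volumes:

```python
import numpy as np

def hsqc0_volume(rep_times, volumes):
    """Extrapolate peak volumes to zero repetition time.

    Fit log(volume) against repetition time and return exp(intercept),
    i.e. the attenuation-free HSQC(0) volume, which is directly
    proportional to concentration.
    """
    slope, intercept = np.polyfit(rep_times, np.log(volumes), 1)
    return np.exp(intercept)

def concentration(v, v_lo, c_lo, v_hi, c_hi):
    """Two-point calibration: map an HSQC(0) volume to concentration
    using a low reference (e.g. DSS) and a high reference (e.g. MES)."""
    slope = (c_hi - c_lo) / (v_hi - v_lo)
    return c_lo + slope * (v - v_lo)

# Synthetic check: a peak whose volume decays by 10% per repetition unit.
reps = np.array([1.0, 2.0, 3.0])
vols = 80.0 * 0.9 ** reps
v0 = hsqc0_volume(reps, vols)
print(round(v0, 1))        # prints 80.0
```

The two-reference calibration pins the volume-to-concentration line at both ends of the dynamic range, which is why it outperforms a single-reference scaling.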
Cankorur-Cetinkaya, Ayca; Dereli, Elif; Eraslan, Serpil; Karabekmez, Erkan; Dikicioglu, Duygu; Kirdar, Betul
2012-01-01
Background Understanding the dynamic mechanism behind the transcriptional organization of genes in response to varying environmental conditions requires time-dependent data. The dynamic transcriptional response obtained by real-time RT-qPCR experiments could only be correctly interpreted if suitable reference genes are used in the analysis. The lack of available studies on the identification of candidate reference genes in dynamic gene expression studies necessitates the identification and the verification of a suitable gene set for the analysis of transient gene expression response. Principal Findings In this study, a candidate reference gene set for RT-qPCR analysis of dynamic transcriptional changes in Saccharomyces cerevisiae was determined using 31 different publicly available time series transcriptome datasets. Ten of the twelve candidates (TPI1, FBA1, CCW12, CDC19, ADH1, PGK1, GCN4, PDC1, RPS26A and ARF1) we identified were not previously reported as potential reference genes. Our method also identified the commonly used reference genes ACT1 and TDH3. The most stable reference genes from this pool were determined as TPI1, FBA1, CDC19 and ACT1 in response to a perturbation in the amount of available glucose and as FBA1, TDH3, CCW12 and ACT1 in response to a perturbation in the amount of available ammonium. The use of these newly proposed gene sets outperformed the use of common reference genes in the determination of dynamic transcriptional response of the target genes, HAP4 and MEP2, in response to relaxation from glucose and ammonium limitations, respectively. Conclusions A candidate reference gene set to be used in dynamic real-time RT-qPCR expression profiling in yeast was proposed for the first time in the present study. Suitable pools of stable reference genes to be used under different experimental conditions could be selected from this candidate set in order to successfully determine the expression profiles for the genes of interest. PMID:22675547
Snedden, Gregg A.; Swenson, Erick M.
2012-01-01
Hourly time-series salinity and water-level data are collected at all stations within the Coastwide Reference Monitoring System (CRMS) network across coastal Louisiana. These data, in addition to vegetation and soils data collected as part of CRMS, are used to develop a suite of metrics and indices to assess wetland condition in coastal Louisiana. This document addresses the primary objectives of the CRMS hydrologic analytical team, which were to (1) adopt standard time-series analytical techniques that could effectively assess spatial and temporal variability in hydrologic characteristics across the Louisiana coastal zone on site, project, basin, and coastwide scales and (2) develop and apply an index based on wetland hydrology that can describe the suitability of local hydrology in the context of maximizing the productivity of wetland plant communities. Approaches to quantifying tidal variability (least squares harmonic analysis) and partitioning variability of time-series data to various time scales (spectral analysis) are presented. The relation between marsh elevation and the tidal frame of a given hydrograph is described. A hydrologic index that integrates water-level and salinity data, which are collected hourly, with vegetation data that are collected annually is developed. To demonstrate its utility, the hydrologic index is applied to 173 CRMS sites across the coast, and variability in index scores across marsh vegetation types (fresh, intermediate, brackish, and saline) is assessed. The index is also applied to 11 sites located in three Coastal Wetlands Planning, Protection and Restoration Act projects, and the ability of the index to convey temporal hydrologic variability in response to climatic stressors and restoration measures, as well as the effect that this community may have on wetland plant productivity, is illustrated.
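Least squares harmonic analysis, one of the techniques adopted by the analytical team, can be sketched for a synthetic hourly water-level record; the constituent periods, amplitude, and record length below are illustrative:

```python
import numpy as np

def harmonic_fit(t, h, periods):
    """Least-squares harmonic analysis: fit a mean plus sin/cos pairs at
    fixed tidal periods (hours) and return the amplitude per constituent."""
    cols = [np.ones_like(t)]
    for p in periods:
        w = 2 * np.pi / p
        cols += [np.cos(w * t), np.sin(w * t)]
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, h, rcond=None)
    amps = [float(np.hypot(coef[1 + 2 * i], coef[2 + 2 * i]))
            for i in range(len(periods))]
    return float(coef[0]), amps

# Synthetic hourly water level: mean 0.3 m plus an M2-like constituent.
t = np.arange(24 * 30, dtype=float)                  # 30 days, hourly
h = 0.3 + 0.25 * np.cos(2 * np.pi * t / 12.42 - 1.0)
mean, amps = harmonic_fit(t, h, periods=[12.42, 24.0])
print(mean, amps)
```

Comparing the recovered constituent amplitudes against marsh elevation is one way to place a hydrograph within the tidal frame, as the document describes.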
Floris, Ilaria; Billard, Hélène; Boquien, Clair-Yves; Joram-Gauvard, Evelyne; Simon, Laure; Legrand, Arnaud; Boscher, Cécile; Rozé, Jean-Christophe; Bolaños-Jiménez, Francisco; Kaeffer, Bertrand
2015-01-01
Human breast milk is an extremely dynamic fluid containing many biologically-active components which change throughout the feeding period and throughout the day. We designed a miRNA assay on minimized amounts of raw milk obtained from mothers of preterm infants. We investigated changes in miRNA expression within month 2 of lactation and then over the course of 24 hours. Analyses were performed on pooled breast milk, made by combining samples collected at different clock times from the same mother donor, along with time series collected over 24 hours from four unsynchronized mothers. Whole milk, lipid or skim milk fractions were processed and analyzed by qPCR. We measured hsa-miR-16-5p, hsa-miR-21-5p, hsa-miR-146-5p, and hsa-let-7a, d and g (all -5p). Stability of miRNA endogenous controls was evaluated using RefFinder, a web tool integrating geNorm, Normfinder, BestKeeper and the comparative ΔΔCt method. MiR-21 and miR-16 were stably expressed in whole milk collected within month 2 of lactation from four mothers. Analysis of lipid and skim milk fractions revealed that miR-146b and let-7d were better references in both fractions. Time series (5H-23H) allowed the identification of a set of three endogenous reference genes (hsa-let-7d, hsa-let-7g and miR-146b) to normalize raw quantification cycle (Cq) data. We identified a daily oscillation of miR-16-5p. Our assay makes it possible to explore miRNA levels in breast milk from mothers of preterm infants collected in time series over 48-72 hours.
GPS-Only Terrestrial Reference Frame Based on a Global Reprocessing
NASA Astrophysics Data System (ADS)
Dietrich, R.; Rothacher, M.; Ruelke, A.; Fritsche, M.; Steigenberger, P.
2007-12-01
The realization of the International Terrestrial Reference System (ITRS) with highest accuracy and stability is fundamental and crucial for applications in geodesy, geodynamics, geophysics and global change. In a joint effort TU Dresden and TU Munich/GFZ Potsdam reprocessed a global GPS network of more than 200 stations. As a contribution to an ITRS realization daily normal equations from 1994 to 2005 were rigorously combined in order to determine a global GPS-only reference frame (PDR05/Potsdam-Dresden-Reprocessing Reference Frame). We present a realization of the global terrestrial reference system which follows the center of mass approach in consideration of the load-induced deformation of the Earth's crust due to the redistribution of surface masses. The stability of our reference frame will be evaluated based on the obtained long-term trends of station coordinates, the load-induced deformation estimates and the homogeneous time series of station positions. We will compare our solution with other recent terrestrial reference system realizations and give some conclusions for future realizations of the ITRS.
Simple Deterministically Constructed Recurrent Neural Networks
NASA Astrophysics Data System (ADS)
Rodan, Ali; Tiňo, Peter
A large number of models for time series processing, forecasting or modeling follow a state-space formulation. Models in the specific class of state-space approaches referred to as Reservoir Computing fix their state-transition function. The state space with the associated state transition structure forms a reservoir, which is supposed to be sufficiently complex so as to capture a large number of features of the input stream that can be potentially exploited by the reservoir-to-output readout mapping. The largely "black box" character of reservoirs prevents us from performing a deeper theoretical investigation of the dynamical properties of successful reservoirs. Reservoir construction is largely driven by a series of (more-or-less) ad-hoc randomized model building stages, with both researchers and practitioners having to rely on trial and error. We show that a very simple deterministically constructed reservoir with simple cycle topology gives performance comparable to that of the Echo State Network (ESN) on a number of time series benchmarks. Moreover, we argue that the memory capacity of such a model can be made arbitrarily close to the proven theoretical limit.
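A minimal sketch of such a simple cycle reservoir with a trained linear (ridge) readout follows; the sign pattern, hyperparameters, and benchmark signal are illustrative assumptions (the original work derives input-weight signs from a fixed deterministic sequence such as the expansion of pi):

```python
import numpy as np

def scr_forecast(series, n_res=50, r=0.9, v=0.5, washout=100, ridge=1e-6):
    """One-step-ahead forecasting with a Simple Cycle Reservoir.

    The reservoir weight matrix is a single cycle with one fixed weight r;
    input weights all share the magnitude v with a deterministic sign
    pattern. Only the linear readout is trained, as in standard ESNs.
    """
    W = np.zeros((n_res, n_res))
    for i in range(n_res):
        W[(i + 1) % n_res, i] = r                    # cycle topology
    signs = np.array([1 if (i * i) % 7 < 4 else -1 for i in range(n_res)])
    w_in = v * signs                                  # deterministic input weights

    x = np.zeros(n_res)
    states = []
    for u in series[:-1]:
        x = np.tanh(W @ x + w_in * u)
        states.append(x.copy())
    X = np.array(states[washout:])                    # discard transient
    y = series[1 + washout:]                          # one-step-ahead targets
    w_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)
    return X @ w_out, y

# Demo on a noiseless sine: the trained readout should track the next value.
t = np.arange(1000)
pred, target = scr_forecast(np.sin(0.1 * t))
print(round(float(np.sqrt(np.mean((pred - target) ** 2))), 4))
```

The point of the construction is that nothing here is random: every weight is fixed in advance, yet the delay structure of the cycle still provides the phase diversity the readout needs.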
Rapid iterative reanalysis for automated design
NASA Technical Reports Server (NTRS)
Bhatia, K. G.
1973-01-01
A method for iterative reanalysis in automated structural design is presented for a finite-element analysis using the direct stiffness approach. A basic feature of the method is that the generalized stiffness and inertia matrices are expressed as functions of structural design parameters, and these generalized matrices are expanded in Taylor series about the initial design. Only the linear terms are retained in the expansions. The method is approximate because it uses static condensation, modal reduction, and the linear Taylor series expansions. The exact linear representation of the expansions of the generalized matrices is also described and a basis for the present method is established. Results of applications of the present method to the recalculation of the natural frequencies of two simple platelike structural models are presented and compared with results obtained using a commonly applied analysis procedure, which served as a reference. In general, the results are in good agreement. A comparison of the computer times required for the use of the present method and the reference method indicated that the present method required substantially less time for reanalysis. Although the results presented are for relatively small-order problems, the present method will become more efficient relative to the reference method as the problem size increases. An extension of the present method to static reanalysis is described, and a basis for unifying the static and dynamic reanalysis procedures is presented.
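The first-order Taylor expansion of the stiffness matrix about the initial design can be sketched on a hypothetical two-degree-of-freedom spring-mass model. Because the example stiffness is linear in the design parameter, the expansion is exact here; for real structures it is an approximation whose error grows with the design change:

```python
import numpy as np

def frequencies(K, M):
    """Natural frequencies (rad/s) of the generalized eigenproblem K v = w^2 M v."""
    w2 = np.linalg.eigvals(np.linalg.solve(M, K))
    return np.sort(np.sqrt(np.real(w2)))

# Hypothetical 2-DOF model whose stiffness depends on a design parameter p.
def K_of(p):
    return np.array([[2.0 + p, -1.0],
                     [-1.0, 2.0 + 2.0 * p]])

M = np.diag([1.0, 2.0])
p0, dp = 1.0, 0.3

# First-order Taylor reanalysis: K(p0 + dp) ~ K(p0) + (dK/dp) * dp,
# avoiding a full matrix reassembly for every design change.
dK = np.array([[1.0, 0.0],
               [0.0, 2.0]])          # exact dK/dp for this model
K_approx = K_of(p0) + dK * dp

f_exact = frequencies(K_of(p0 + dp), M)
f_approx = frequencies(K_approx, M)
print(np.allclose(f_exact, f_approx))   # prints True
```

In the paper's setting the expansion is applied to the condensed, modally reduced generalized matrices, so the savings come from never re-forming or re-reducing the full system during design iterations.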
NASA Astrophysics Data System (ADS)
Nyeki, Stephan; Wacker, Stefan; Gröbner, Julian; Finsterle, Wolfgang; Wild, Martin
2017-08-01
A large number of radiometers are traceable to the World Standard Group (WSG) for shortwave radiation and the interim World Infrared Standard Group (WISG) for longwave radiation, hosted by the Physikalisch-Meteorologisches Observatorium Davos/World Radiation Centre (PMOD/WRC, Davos, Switzerland). The WSG and WISG have recently been found to over- and underestimate radiation values, respectively (Fehlmann et al., 2012; Gröbner et al., 2014), although research is still ongoing. In view of a possible revision of the reference scales of both standard groups, this study discusses the methods involved and the implications on existing archives of radiation time series, such as the Baseline Surface Radiation Network (BSRN). Based on PMOD/WRC calibration archives and BSRN data archives, the downward longwave radiation (DLR) time series over the 2006-2015 period were analysed at four stations (polar and mid-latitude locations). DLR was found to increase by up to 3.5 and 5.4 W m-2 for all-sky and clear-sky conditions, respectively, after applying a WISG reference scale correction and a minor correction for the dependence of pyrgeometer sensitivity on atmospheric integrated water vapour content. Similar increases in DLR may be expected at other BSRN stations. Based on our analysis, a number of recommendations are made for future studies.
Quebec. Reference Series No. 30.
ERIC Educational Resources Information Center
Department of External Affairs, Ottawa (Ontario).
This booklet, one of a series featuring the Canadian provinces, presents a brief overview of Quebec and is suitable for teacher reference or student reading. Separate sections discuss geography, climate, population, history, political history, recent politics, agriculture, forestry, mining, manufacturing and industry, hydroelectric power,…
Alberta. Reference Series No. 26.
ERIC Educational Resources Information Center
Department of External Affairs, Ottawa (Ontario).
This booklet, one of a series featuring the Canadian provinces, presents a brief overview of Alberta and is suitable for teacher reference or student reading. Separate sections discuss the history and population, the provincial government, the economy, transportation, communications, mineral resources, agriculture, manufacturing, forest products,…
New Brunswick. Reference Series No. 31.
ERIC Educational Resources Information Center
Department of External Affairs, Ottawa (Ontario).
This booklet, one of a series featuring the Canadian provinces, presents a brief overview of New Brunswick and is suitable for teacher reference or student reading. Separate sections discuss cities and population, geography, history, politics, economy, manufacturing, forestry, agriculture, fisheries, mining, electricity, transportation, government…
Effects of Uncertainties in Electric Field Boundary Conditions for Ring Current Simulations
NASA Astrophysics Data System (ADS)
Chen, Margaret W.; O'Brien, T. Paul; Lemon, Colby L.; Guild, Timothy B.
2018-01-01
Physics-based simulation results can vary widely depending on the applied boundary conditions. As a first step toward assessing the effect of boundary conditions on ring current simulations, we analyze the uncertainty in the cross-polar cap potential (CPCP) electric field boundary conditions applied to the Rice Convection Model-Equilibrium (RCM-E). The empirical Weimer model of CPCP is chosen as the reference model and Defense Meteorological Satellite Program CPCP measurements as the reference data. Using temporal correlations from a statistical analysis of the "errors" between the reference model and data, we construct a Monte Carlo CPCP discrete time series model that can be generalized to other model boundary conditions. RCM-E simulations using electric field boundary conditions from the reference model and from 20 randomly generated Monte Carlo discrete time series of CPCP are performed for two large storms. During the 10 August 2000 storm main phase, the proton density at 10
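The Monte Carlo boundary-condition generator can be sketched with an AR(1) error model, one simple way to honour a prescribed temporal correlation; the error standard deviation, correlation time, and reference curve below are illustrative values, not those fitted in the study:

```python
import numpy as np

def perturbed_cpcp(cpcp_ref, sigma=8.0, tau_steps=12, n_series=20, seed=2):
    """Generate Monte Carlo CPCP time series around a reference model.

    Errors follow an AR(1) process whose lag-1 coefficient
    phi = exp(-1/tau_steps) reproduces a chosen correlation time;
    sigma is the stationary error standard deviation (kV).
    """
    rng = np.random.default_rng(seed)
    phi = np.exp(-1.0 / tau_steps)
    n = len(cpcp_ref)
    innov_sd = sigma * np.sqrt(1.0 - phi ** 2)
    out = np.empty((n_series, n))
    for j in range(n_series):
        e = np.empty(n)
        e[0] = rng.normal(0.0, sigma)
        for i in range(1, n):
            e[i] = phi * e[i - 1] + rng.normal(0.0, innov_sd)
        out[j] = np.clip(cpcp_ref + e, 0.0, None)   # CPCP stays non-negative
    return out

# Synthetic storm-like reference CPCP (kV) at 5-minute cadence for one day.
ref = 60.0 + 30.0 * np.exp(-((np.arange(288) - 100) / 40.0) ** 2)
runs = perturbed_cpcp(ref)
print(runs.shape)      # prints (20, 288)
```

Each of the perturbed series could then drive one simulation run, and the spread of the resulting ring current quantities measures the sensitivity to boundary-condition uncertainty.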
Career Development in the College Years.
ERIC Educational Resources Information Center
Myers, Roger A.
This paper concerns itself with two relatively central issues in career development, life stages and choice behavior and focuses on the tasks of the college counselor in regards to them. Life stages refer to sequential series of tasks to be accomplished within time periods that can be specified. Choice behavior is the on-going process of making a…
Teaching Strategies & Techniques for Adjunct Faculty. Third Edition. Higher Education Series.
ERIC Educational Resources Information Center
Greive, Donald
This booklet presents teaching strategies and techniques in a quick reference format. It was designed specifically to assist adjunct and part-time faculty, who have careers outside of education, to efficiently grasp many of the concepts necessary for effective teaching. Included are a checklist of points to review prior to beginning a teaching…
The Evolution of Children's Mental Addition.
ERIC Educational Resources Information Center
Ashcraft, Mark H.; Hamann, Mary Sue
Students in grades 1, 4, 7, and 10 were tested in a two-part investigation of simple and complex mental addition (with college students as a reference point). One session involved a normal reaction time task in which children made true/false judgments about a series of addition examples. The other session involved a verbal protocol interview, the…
Creating a New Architecture for the Learning College
ERIC Educational Resources Information Center
O'Banion, Terry
2007-01-01
The publication of "A Nation at Risk" in 1983 triggered a series of major reform efforts in education that are still evolving. As part of the reform efforts, leaders began to refer to a Learning Revolution that would "place learning first by overhauling the traditional architecture of education." The old architecture--time-bound, place-bound,…
Time Series Analysis of Water Level and Temperature in the St Louis River Estuary
Pressure and temperature loggers were deployed at 9 sites in the St Louis River estuary between 6/23 and 10/31/2011. A reference sensor was placed on the shore to correct pressure data. Sensors were placed at <1 m depth in Allouez Bay, Superior Bay, near Hearding Island, WLSSD Bay, th...
Astrometric Observations of Comets and Asteroids and Subsequent Orbital Investigations
NASA Technical Reports Server (NTRS)
Marsden, Brian G.; McCrosky, Richard E.
1997-01-01
An earlier series of photographic observations was made with the 1.5-m reflector from 1972 to 1989. The start of the series to which this report refers occurred shortly before the conversion from photographic to CCD operation in August 1989, at which point there was a dramatic increase in the productivity of the program. This is evident in the month-by-month summary of the observations; the earlier data refer to the measurement or remeasurement of photographic plates previously taken with the same telescope. The total number of observations made was 24,423, of which 1338 were of comets. Of the 23,085 observations of asteroids, 21,529 referred to asteroids that were unnumbered when the observations were made. Since an important emphasis of the program was to improve knowledge of the orbits to the point where asteroids can be numbered, the fact that only 4262 of the observations refer to asteroids that are still unnumbered is a measure of the program's success: 30-35 percent of all new numberings were routinely made solely because of the recent data from the Oak Ridge program, which even at the time of McCrosky's retirement was still the fourth largest comet-asteroid astrometric program in the world.
Detecting Land Cover Change by Trend and Seasonality of Remote Sensing Time Series
NASA Astrophysics Data System (ADS)
Oliveira, J. C.; Epiphanio, J. N.; Mello, M. P.
2013-05-01
Natural resource managers demand knowledge of the spatiotemporal dynamics of land use and land cover change, and detecting and characterizing change over time is an initial step toward understanding the mechanisms of change. The purpose of this research is to use the BFAST (Breaks For Additive Seasonal and Trend) approach to detect trend and seasonal changes within Normalized Difference Vegetation Index (NDVI) time series. BFAST integrates the decomposition of time series into trend, seasonal, and noise components with methods for detecting change within time series without the need to select a reference period, set a threshold, or define a change trajectory. BFAST iteratively estimates the time and number of changes, and characterizes change by its magnitude and direction. The general model is of the form Yt = Tt + St + et (t = 1, 2, 3, …, n), where Yt is the observed data at time t, Tt is the trend component, St is the seasonal component, and et is the remainder component. This study used MODIS NDVI time series datasets (MOD13Q1) over 11 years (2000-2010) for an intensive agricultural area in Mato Grosso, Brazil. First, a noise-reduction filter (4253H, twice) was applied to the spectral curve of each MODIS pixel, and subsequently each time series was decomposed into seasonal, trend, and remainder components by BFAST. One abrupt change was detected for a single forest pixel, and two abrupt changes in the trend component for a pixel in the agricultural area. Figure 1 shows the number of phenological changes based on the seasonal component for the study area. This paper demonstrated the ability of BFAST to detect long-term phenological change by analyzing time series while accounting for abrupt and gradual changes. The algorithm iteratively estimates the dates and number of changes occurring within seasonal and trend components, and characterizes changes by extracting the magnitude and direction of change.
Changes occurring in the seasonal component indicate phenological changes, while changes occurring in the trend component indicate gradual and abrupt change. BFAST can be used to analyze different types of remotely sensed time series and can also be applied to time series from other fields such as econometrics, climatology, and hydrology. The algorithm used in this study is available in the bfast package for R from CRAN (http://cran.r-project.org/package=bfast).; Figure 1 - Number of phenological changes based on the seasonal component.
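The additive decomposition Yt = Tt + St + et described above can be sketched as follows. This is a deliberately minimal toy version (moving-average trend, period-wise seasonal means); BFAST proper fits piecewise linear trend and harmonic seasonal segments and estimates breakpoints iteratively, and the function name here is illustrative, not part of the bfast package.

```python
import numpy as np

def decompose_additive(y, period):
    """Minimal additive decomposition Y_t = T_t + S_t + e_t.
    Illustrative only: real BFAST estimates breakpoints iteratively."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    # Trend: centered moving average over one full season.
    kernel = np.ones(period) / period
    trend = np.convolve(y, kernel, mode="same")
    # Seasonal: mean of detrended values for each phase, zero-centered.
    detrended = y - trend
    seasonal = np.array([detrended[p::period].mean() for p in range(period)])
    seasonal -= seasonal.mean()
    seasonal_full = np.tile(seasonal, n // period + 1)[:n]
    # Remainder: whatever the trend and seasonal terms do not explain.
    remainder = y - trend - seasonal_full
    return trend, seasonal_full, remainder
```

By construction the three returned components sum exactly back to the input series, which is the defining property of the additive model.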
NASA Technical Reports Server (NTRS)
Sanders, Abram F. J.; Verstraeten, Willem W.; Kooreman, Maurits L.; van Leth, Thomas C.; Beringer, Jason; Joiner, Joanna
2016-01-01
A global, monthly averaged time series of Sun-induced Fluorescence (SiF), spanning January 2007 to June 2015, was derived from Metop-A Global Ozone Monitoring Experiment 2 (GOME-2) spectral measurements. Far-red SiF was retrieved using the filling-in of deep solar Fraunhofer lines and atmospheric absorption bands based on the general methodology described by Joiner et al. (AMT, 2013). A Principal Component (PC) analysis of spectra over non-vegetated areas was performed to describe the effects of atmospheric absorption. Our implementation (SiF KNMI) is an independent algorithm and differs from the latest implementation of Joiner et al. (SiF NASA, v26), because we used desert reference areas for determining PCs (as opposed to cloudy ocean and some desert) and a wider fit window that covers water vapour and oxygen absorption bands (as opposed to only Fraunhofer lines). As a consequence, more PCs were needed (35 as opposed to 12). The two time series (SiF KNMI and SiF NASA, v26) correlate well (overall R of 0.78) except for tropical rain forests. Sensitivity experiments suggest a strong impact of the water vapour absorption band on retrieved SiF values. Furthermore, we evaluated the SiF time series against Gross Primary Productivity (GPP) derived from twelve flux towers in Australia. Correlations for individual towers range from 0.37 to 0.84. They are particularly high for managed biome types. In the de-seasonalized Australian SiF time series, the break of the Millennium Drought during the local summer of 2010/2011 is clearly observed.
NASA Astrophysics Data System (ADS)
Nicolas, J.; Nocquet, J.; van Camp, M.; Coulot, D.
2003-12-01
Time-dependent displacements of stations usually have magnitudes close to the accuracy of each individual technique, and it remains difficult to separate true geophysical motion from possible artifacts inherent to each space geodetic technique. The Observatoire de la Côte d'Azur (OCA), located at Grasse, France, benefits from the collocation of several geodetic instruments and techniques (three laser ranging stations and a permanent GPS), which allows a direct comparison of the time series. Moreover, absolute gravimetry measurement campaigns have been performed regularly since 1997, first by the Ecole et Observatoire des Sciences de la Terre (EOST) of Strasbourg, France, and more recently by the Royal Observatory of Belgium. This study presents a comparison between the positioning time series of the vertical component derived from the SLR and GPS analyses and the gravimetric results from 1997 to 2003. The laser station coordinates are based on a combined LAGEOS-1 and -2 solution using reference 10-day arc orbits, the ITRF2000 reference frame, and the IERS96 conventions. Different GPS weekly global solutions provided by several IGS analysis centers are combined and compared to the SLR results. The absolute gravimetry measurements are converted into vertical displacements with a classical gradient. The laser time series indicate a strong annual signal of about 3-4 cm peak-to-peak amplitude on the vertical component. Absolute gravimetry data agree with the SLR results. GPS positioning solutions also indicate a significant annual term, but with a magnitude of only 50% of that shown by the SLR solution and by the gravimetry measurements. Similar annual terms are also observed at other SLR sites we processed, but usually with lower and varying amplitudes. These annual signals are also compared to vertical positioning variations corresponding to an atmospheric loading model.
We present the level of agreement between the different techniques and discuss possible explanations for the discrepancy noted between the signals. Finally, we propose explanations for the large annual term at Grasse: these annual variations could be partly due to a hydrological loading effect on the karstic massif on which the observatory is located.
Nova Scotia. Reference Series No. 32.
ERIC Educational Resources Information Center
Department of External Affairs, Ottawa (Ontario).
This booklet, one of a series featuring the Canadian provinces, presents a brief overview of Nova Scotia and is suitable for teacher reference or student reading. Separate sections discuss the geography and climate, history, economic development, fishing, agriculture, forestry, mining, manufacturing, energy, education, arts and culture, and…
Manitoba. Reference Series No. 28.
ERIC Educational Resources Information Center
Department of External Affairs, Ottawa (Ontario).
This booklet, one of a series featuring the Canadian provinces, presents a brief overview of Manitoba and is suitable for teacher reference or student reading. Separate sections discuss agriculture, mining, energy, transportation and communication, fishing, forestry, fur trapping, health and social services, education, and political life. Specific…
Newfoundland. Reference Series No. 34.
ERIC Educational Resources Information Center
Department of External Affairs, Ottawa (Ontario).
This booklet, one of a series featuring the Canadian provinces, presents a brief overview of Newfoundland and is suitable for teacher reference or student reading. Separate sections discuss geography and climate, history, economy, population and settlement, arts and culture, leisure and recreation, and heritage. Specific topics include the…
Ontario. Reference Series No. 29.
ERIC Educational Resources Information Center
Department of External Affairs, Ottawa (Ontario).
This booklet, one of a series featuring the Canadian provinces, presents a brief overview of Ontario and is suitable for teacher reference or student reading. Separate sections discuss geography, climate, history, agriculture, forestry, fishing, mining, manufacturing, transportation, energy, arts and culture, sports and recreation, and people and…
Anomalous scaling of stochastic processes and the Moses effect
NASA Astrophysics Data System (ADS)
Chen, Lijian; Bassler, Kevin E.; McCauley, Joseph L.; Gunaratne, Gemunu H.
2017-04-01
The state of a stochastic process evolving over a time t is typically assumed to lie on a normal distribution whose width scales like t^{1/2}. However, processes in which the probability distribution is not normal and the scaling exponent differs from 1/2 are known. The search for possible origins of such "anomalous" scaling and approaches to quantify them are the motivations for the work reported here. In processes with stationary increments, where the stochastic process is time-independent, autocorrelations between increments and infinite variance of increments can cause anomalous scaling. These sources have been referred to as the Joseph effect and the Noah effect, respectively. If the increments are nonstationary, then scaling of increments with t can also lead to anomalous scaling, a mechanism we refer to as the Moses effect. Scaling exponents quantifying the three effects are defined and related to the Hurst exponent that characterizes the overall scaling of the stochastic process. Methods of time series analysis that enable accurate independent measurement of each exponent are presented. Simple stochastic processes are used to illustrate each effect. Intraday financial time series data are analyzed, revealing that their anomalous scaling is due only to the Moses effect. In the context of financial market data, we reiterate that the Joseph exponent, not the Hurst exponent, is the appropriate measure to test the efficient market hypothesis.
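The overall scaling std(X_t) ~ t^H, and the way nonstationary increments shift H (the Moses effect), can be illustrated with a toy ensemble experiment. This is not the paper's set of independent estimators for the Joseph, Noah, and Moses exponents; it only checks the overall Hurst-like exponent on two simulated processes.

```python
import numpy as np

rng = np.random.default_rng(0)

def overall_scaling_exponent(paths):
    """Estimate the overall scaling exponent H from the growth of the
    ensemble width: std(X_t) ~ t^H.  `paths` is (n_realizations, n_times)."""
    t = np.arange(1, paths.shape[1] + 1)
    width = paths.std(axis=0)
    slope, _ = np.polyfit(np.log(t), np.log(width), 1)
    return slope

# Ordinary Brownian motion: stationary, uncorrelated increments -> H near 1/2.
steps = rng.standard_normal((2000, 500))
H_bm = overall_scaling_exponent(np.cumsum(steps, axis=1))

# Moses effect: increment scale grows like t^(1/4), pushing H toward 3/4
# even though the increments remain uncorrelated and Gaussian.
moses_steps = steps * np.arange(1, 501) ** 0.25
H_moses = overall_scaling_exponent(np.cumsum(moses_steps, axis=1))
```

The second process shows anomalous scaling driven purely by nonstationary increment amplitudes, which is the mechanism the abstract attributes to the intraday financial data.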
Using endmembers in AVIRIS images to estimate changes in vegetative biomass
NASA Technical Reports Server (NTRS)
Smith, Milton O.; Adams, John B.; Ustin, Susan L.; Roberts, Dar A.
1992-01-01
Field techniques for estimating vegetative biomass are labor intensive, and are rarely used to monitor changes in biomass over time. Remote sensing offers an attractive alternative to field measurements; however, because there is no simple correspondence between encoded radiance in multispectral images and biomass, it is not possible to measure vegetative biomass directly from AVIRIS images. Ways to estimate vegetative biomass by identifying community types and then applying biomass scalars derived from field measurements are investigated. Field measurements of community-scale vegetative biomass can be made, at least for local areas, but it is not always possible to identify vegetation communities unambiguously using remote measurements and conventional image-processing techniques. Furthermore, even when communities are well characterized in a single image, it typically is difficult to assess the extent and nature of changes in a time series of images, owing to uncertainties introduced by variations in illumination geometry, atmospheric attenuation, and instrumental responses. Our objective is to develop an improved method based on spectral mixture analysis to characterize and identify vegetative communities that can be applied to multi-temporal AVIRIS and other types of images. In previous studies, multi-temporal data sets (AVIRIS and TM) of Owens Valley, CA, were analyzed and vegetation communities were defined in terms of fractions of reference (laboratory and field) endmember spectra. An advantage of converting an image to fractions of reference endmembers is that, although fractions in a given pixel may vary from image to image in a time series, the endmembers themselves typically are constant, thus providing a consistent frame of reference.
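The core of spectral mixture analysis is solving each pixel spectrum for fractions of reference endmember spectra. A minimal sketch, assuming a simple linear mixing model with a soft sum-to-one constraint (the function name and the penalty weight are illustrative choices, not the authors' implementation):

```python
import numpy as np

def unmix(pixel, endmembers, weight=100.0):
    """Least-squares endmember fractions for one pixel spectrum.
    endmembers: (n_bands, n_endmembers).  A heavily weighted extra row
    softly enforces that the fractions sum to one."""
    A = np.vstack([endmembers, weight * np.ones(endmembers.shape[1])])
    b = np.append(np.asarray(pixel, float), weight)
    fractions, *_ = np.linalg.lstsq(A, b, rcond=None)
    return fractions
```

Because the endmember spectra stay fixed across a multi-temporal image series, the recovered fractions are directly comparable from date to date, which is the consistency argument made above.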
An operational definition of a statistically meaningful trend.
Bryhn, Andreas C; Dimberg, Peter H
2011-04-28
Linear trend analysis of time series is standard procedure in many scientific disciplines. If the number of data is large, a trend may be statistically significant even if data are scattered far from the trend line. This study introduces and tests a quality criterion for time trends referred to as statistical meaningfulness, which is a stricter quality criterion for trends than high statistical significance. The time series is divided into intervals and interval mean values are calculated. Thereafter, r² and p values are calculated from regressions concerning time and interval mean values. If r² ≥ 0.65 at p ≤ 0.05 in any of these regressions, then the trend is regarded as statistically meaningful. Out of ten investigated time series from different scientific disciplines, five displayed statistically meaningful trends. A Microsoft Excel application (add-in) was developed which can perform statistical meaningfulness tests and which may make the test easier to apply in practice. The presented method for distinguishing statistically meaningful trends should be reasonably uncomplicated for researchers with basic statistics skills and may thus be useful for determining which trends are worth analysing further, for instance with respect to causal factors. The method can also be used for determining which segments of a time trend may be particularly worthwhile to focus on.
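The criterion described above is simple enough to sketch directly: split the series into intervals, regress interval means on interval mean times, and require both r² ≥ 0.65 and p ≤ 0.05. To keep the sketch self-contained, the p ≤ 0.05 check is done against a small hardcoded Student's t table rather than a statistics library; the function name is illustrative.

```python
import numpy as np

# Two-sided 5% critical values of Student's t for df = 1..10
# (the standard normal value 1.96 is used beyond that).
T_CRIT_05 = {1: 12.706, 2: 4.303, 3: 3.182, 4: 2.776, 5: 2.571,
             6: 2.447, 7: 2.365, 8: 2.306, 9: 2.262, 10: 2.228}

def is_statistically_meaningful(t, y, n_intervals):
    """Regress interval means on interval mean times; require
    r^2 >= 0.65 with p <= 0.05 (checked via the t table above)."""
    t, y = np.asarray(t, float), np.asarray(y, float)
    edges = np.linspace(0, len(y), n_intervals + 1).astype(int)
    tm = np.array([t[a:b].mean() for a, b in zip(edges[:-1], edges[1:])])
    ym = np.array([y[a:b].mean() for a, b in zip(edges[:-1], edges[1:])])
    slope, intercept = np.polyfit(tm, ym, 1)
    ss_res = ((ym - (slope * tm + intercept)) ** 2).sum()
    ss_tot = ((ym - ym.mean()) ** 2).sum()
    r2 = 1.0 - ss_res / ss_tot
    df = n_intervals - 2
    t_stat = np.inf if r2 >= 1.0 else np.sqrt(r2 * df / (1.0 - r2))
    significant = t_stat >= T_CRIT_05.get(df, 1.96)
    return (r2 >= 0.65) and significant, r2
```

Averaging within intervals is what makes the criterion stricter than raw significance: a noisy series can reach p ≤ 0.05 on thousands of points while its handful of interval means still scatter too much to pass r² ≥ 0.65.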
NASA Astrophysics Data System (ADS)
Dalezios, Nicolas; Spyropoulos, Nicos V.; Tarquis, Ana M.
2015-04-01
The research work stems from the hypothesis that it is possible to estimate the seasonal water needs of olive tree farms under drought periods by cross-correlating high spatial, spectral and temporal resolution (~monthly) satellite data, acquired at well-defined time intervals of the phenological cycle of the crops, with ground-truth information collected simultaneously with the image acquisitions. The present research demonstrates, for the first time, the coordinated efforts of space engineers, satellite mission control planners, remote sensing scientists and ground teams to record, at specific time intervals of the phenological cycle of the trees, from ground "zero" and from 770 km above the Earth's surface, the status of the plants for subsequent cross-correlation and analysis regarding the estimation of seasonal evapotranspiration in a vulnerable agricultural environment. The ETo and ETc derived by the Penman-Monteith equation and reference Kc tables were compared with a new ETd using the Kc extracted from the time series of satellite data. Several vegetation indices were also used, especially the RedEdge and chlorophyll indices based on the WorldView-2 RedEdge and second NIR bands, to relate tree status to water and nutrition needs. Keywords: Evapotranspiration, Very High Spatial Resolution (VHSR), time series, remote sensing, vulnerability, agriculture, vegetation indices.
Prediction of the reference evapotranspiration using a chaotic approach.
Wang, Wei-guang; Zou, Shan; Luo, Zhao-hui; Zhang, Wei; Chen, Dan; Kong, Jun
2014-01-01
Evapotranspiration is one of the most important hydrological variables in the context of water resources management. An attempt was made to understand and predict the dynamics of reference evapotranspiration from a nonlinear dynamical perspective in this study. The reference evapotranspiration data were calculated using the FAO Penman-Monteith equation with the observed daily meteorological data for the period 1966-2005 at four meteorological stations (i.e., Baotou, Zhangbei, Kaifeng, and Shaoguan) representing a wide range of climatic conditions of China. The correlation dimension method was employed to investigate the chaotic behavior of the reference evapotranspiration series. The existence of chaos in the reference evapotranspiration series at the four different locations was indicated by the finite and low correlation dimension. A local approximation approach was employed to forecast the daily reference evapotranspiration series. Low root mean square error (RMSE) and mean absolute error (MAE) (for all locations lower than 0.31 and 0.24, resp.), high correlation coefficient (CC), and modified coefficient of efficiency (for all locations larger than 0.97 and 0.8, resp.) indicate that the predicted reference evapotranspiration agrees well with the observed one. The encouraging results indicate the suitability of the chaotic approach for understanding and predicting the dynamics of the reference evapotranspiration.
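A local approximation forecast of the kind used above can be sketched as a zeroth-order (nearest-neighbour) predictor in reconstructed phase space: embed the series with a delay, find the nearest past states to the current state, and average their successors. The embedding parameters below are placeholders for illustration, not the values used in the study.

```python
import numpy as np

def local_approx_forecast(series, m=3, tau=1, k=5, horizon=1):
    """Zeroth-order local approximation: embed the series in m dimensions
    with delay tau, find the k nearest past neighbours of the current
    state, and average their `horizon`-step-ahead successors."""
    x = np.asarray(series, float)
    # Delay vectors x_i = (x[i], x[i+tau], ..., x[i+(m-1)tau]) that
    # still have a successor `horizon` steps ahead.
    last = len(x) - (m - 1) * tau - horizon
    states = np.array([x[i:i + (m - 1) * tau + 1:tau] for i in range(last)])
    targets = x[(m - 1) * tau + horizon:]   # successor of each state
    query = x[-(m - 1) * tau - 1::tau]      # current state vector
    d = np.linalg.norm(states - query, axis=1)
    nn = np.argsort(d)[:k]
    return targets[nn].mean()
```

On a deterministic signal the nearest neighbours of the current state have successors close to the true next value, which is why a finite, low correlation dimension (few effective degrees of freedom) makes this simple predictor work.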
Prince Edward Island. Reference Series No. 33.
ERIC Educational Resources Information Center
Department of External Affairs, Ottawa (Ontario).
This booklet, one of a series featuring the Canadian provinces, presents a brief overview of Prince Edward Island and is suitable for teacher reference or student reading. Separate sections discuss geography, climate, history, early trade, development, agriculture, the potato industry, forests, fisheries, aquaculture, industry, tourism, energy,…
Saskatchewan. Reference Series No. 27.
ERIC Educational Resources Information Center
Department of External Affairs, Ottawa (Ontario).
This booklet, one of a series featuring the Canadian provinces, presents a brief overview of Saskatchewan and is suitable for teacher reference or student reading. Separate sections discuss history, economy, oil, uranium, potash, coal, minerals and metals, agriculture, forestry, tourism and recreation, arts and culture, and people. Specific topics…
NASA Astrophysics Data System (ADS)
Yu, Hongjuan; Guo, Jinyun; Kong, Qiaoli; Chen, Xiaodong
2018-04-01
The static observation data from a relative gravimeter contain noise and signals such as gravity tides. This paper focuses on extracting the gravity tides from static relative gravimeter data by applying, for the first time, the combined method of empirical mode decomposition (EMD) and independent component analysis (ICA), called the EMD-ICA method. The experimental results from the CG-5 gravimeter (SCINTREX Limited, Ontario, Canada) data show that the gravity tides time series derived by EMD-ICA are consistent with the theoretical reference (Longman formula), and the RMS of their differences reaches only 4.4 μGal. The time series of the gravity tides derived by EMD-ICA have a strong correlation with the theoretical time series, with a correlation coefficient greater than 0.997. The accuracy of the gravity tides estimated by EMD-ICA is comparable to the theoretical model and is slightly higher than that of independent component analysis (ICA) alone. EMD-ICA overcomes the limitation that ICA must process multiple observations, and slightly improves the extraction accuracy and reliability of gravity tides from relative gravimeter data compared to that estimated with ICA.
NASA Astrophysics Data System (ADS)
Osada, Y.; Ohta, Y.; Demachi, T.; Kido, M.; Fujimoto, H.; Azuma, R.; Hino, R.
2013-12-01
Large interplate earthquakes have repeatedly occurred along the Japan Trench. Recently, detailed crustal deformation has been revealed by the nationwide inland GPS network (GEONET) operated by GSI. However, the region of maximum displacement for interplate earthquakes is mainly located offshore. GPS/Acoustic seafloor geodetic observation (hereafter GPS/A) is therefore important and useful for understanding the shallower part of the interplate coupling between the subducting and overriding plates. GPS/A is typically conducted in a specific ocean area in repeated campaign style using a research vessel or buoy, so we cannot monitor the temporal variation of seafloor crustal deformation in real time. One technical issue for real-time observation is kinematic GPS analysis, because kinematic GPS analysis requires both reference and rover data. If precise kinematic GPS analysis becomes possible in the offshore region, it should be a promising method for real-time GPS/A with a USV (Unmanned Surface Vehicle) or a moored buoy. We assessed the stability, precision and accuracy of the StarFire™ global satellite-based augmentation system. We first tested StarFire under static conditions. To assess coordinate precision and accuracy, we compared the 1 Hz StarFire time series with post-processed precise point positioning (PPP) 1 Hz time series computed with the GIPSY-OASIS II processing software (Ver. 6.1.2) using three different product types (ultra-rapid, rapid, and final orbits). We also used different clock-interval information (30 and 300 seconds) for the post-processed PPP processing. The standard deviation of the real-time StarFire time series is less than 30 mm (horizontal components) and 60 mm (vertical component) based on one month of continuous processing. We also assessed the noise spectra of the time series estimated by StarFire and by the post-processed GIPSY PPP results.
We found that the noise spectrum of the StarFire time series is similar to that of the GIPSY-OASIS II result based on JPL rapid orbit products with 300-second clock-interval information. We also report the stability, precision and accuracy of StarFire under moving conditions.
Estimating clinical chemistry reference values based on an existing data set of unselected animals.
Dimauro, Corrado; Bonelli, Piero; Nicolussi, Paola; Rassu, Salvatore P G; Cappio-Borlino, Aldo; Pulina, Giuseppe
2008-11-01
In an attempt to standardise the determination of biological reference values, the International Federation of Clinical Chemistry (IFCC) has published a series of recommendations on developing reference intervals. The IFCC recommends the use of an a priori sampling of at least 120 healthy individuals. However, collecting and analysing such a large number of samples is expensive, time-consuming and not always feasible, especially in veterinary medicine. In this paper, an alternative (a posteriori) method is described and is used to determine reference intervals for biochemical parameters of farm animals using an existing laboratory data set. The method used was based on the detection and removal of outliers to obtain a large sample of animals likely to be healthy from the existing data set. This allowed the estimation of reliable reference intervals for biochemical parameters in Sarda dairy sheep. This method may also be useful for the determination of reference intervals for different species, ages and genders.
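The a posteriori idea can be sketched as: iteratively remove outliers from the unselected data set to approximate a "healthy" subsample, then take the central 95% as the reference interval. The paper's actual outlier-detection procedure is not specified here; Tukey fences are used below purely as a stand-in, and the simulated "animal" values are invented for illustration.

```python
import numpy as np

def reference_interval(values, max_iter=10):
    """A-posteriori sketch: iteratively drop Tukey outliers (outside
    Q1 - 1.5*IQR, Q3 + 1.5*IQR), then take the central 95% of the
    remaining values as the reference interval."""
    v = np.sort(np.asarray(values, float))
    for _ in range(max_iter):
        q1, q3 = np.percentile(v, [25, 75])
        iqr = q3 - q1
        keep = (v >= q1 - 1.5 * iqr) & (v <= q3 + 1.5 * iqr)
        if keep.all():
            break
        v = v[keep]
    return np.percentile(v, [2.5, 97.5])

# Hypothetical data: healthy animals ~ N(50, 5) mixed with a few
# diseased outliers near 120 that must not widen the interval.
rng = np.random.default_rng(1)
sample = np.concatenate([rng.normal(50, 5, 500), rng.normal(120, 10, 20)])
lo, hi = reference_interval(sample)
```

The diseased subgroup is far enough from the bulk that the fences exclude it, so the resulting interval reflects only the presumed-healthy animals.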
Normative Databases for Imaging Instrumentation.
Realini, Tony; Zangwill, Linda M; Flanagan, John G; Garway-Heath, David; Patella, Vincent M; Johnson, Chris A; Artes, Paul H; Gaddie, Ian B; Fingeret, Murray
2015-08-01
To describe the process by which imaging devices undergo reference database development and regulatory clearance. The limitations and potential improvements of reference (normative) data sets for ophthalmic imaging devices will be discussed. A symposium was held in July 2013 in which a series of speakers discussed issues related to the development of reference databases for imaging devices. Automated imaging has become widely accepted and used in glaucoma management. The ability of such instruments to discriminate healthy from glaucomatous optic nerves, and to detect glaucomatous progression over time is limited by the quality of reference databases associated with the available commercial devices. In the absence of standardized rules governing the development of reference databases, each manufacturer's database differs in size, eligibility criteria, and ethnic make-up, among other key features. The process for development of imaging reference databases may be improved by standardizing eligibility requirements and data collection protocols. Such standardization may also improve the degree to which results may be compared between commercial instruments.
1981-04-01
[OCR-damaged report documentation page. Recoverable figure captions: Figure 5, example velocity time series similar to those obtained during the laboratory simulation; Figure 8, time series for the 75mm HE(TP) projectile, Rd. No. 4.]
NEW SUNS IN THE COSMOS. III. MULTIFRACTAL SIGNATURE ANALYSIS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Freitas, D. B. de; Nepomuceno, M. M. F.; Junior, P. R. V. de Moraes
2016-11-01
In the present paper, we investigate the multifractality signatures in hourly time series extracted from the CoRoT spacecraft database. Our analysis is intended to highlight the possibility that astrophysical time series can be members of a particular class of complex and dynamic processes, which require several photometric variability diagnostics to characterize their structural and topological properties. To achieve this goal, we search for contributions due to a nonlinear temporal correlation and effects caused by heavier tails than the Gaussian distribution, using a detrending moving average algorithm for one-dimensional multifractal signals (MFDMA). We observe that the correlation structure is the main source of multifractality, while the heavy-tailed distribution plays a minor role in generating the multifractal effects. Our work also reveals that the rotation period of stars is inherently scaled by the degree of multifractality. As a result, analyzing the multifractal degree of the referred series, we uncover an evolution of multifractality from shorter to larger periods.
Space shuttle simulation model
NASA Technical Reports Server (NTRS)
Tatom, F. B.; Smith, S. R.
1980-01-01
The effects of atmospheric turbulence in both horizontal and near horizontal flight, during the return of the space shuttle, are important for determining design, control, and 'pilot-in-the-loop' effects. A nonrecursive model (based on von Karman spectra) for atmospheric turbulence along the flight path of the shuttle orbiter was developed which provides for simulation of instantaneous vertical and horizontal gusts at the vehicle center-of-gravity, and also for simulation of instantaneous gust gradients. Based on this model, the time series for both gusts and gust gradients were generated and stored on a series of magnetic tapes which are entitled shuttle simulation turbulence tapes (SSTT). The time series are designed to represent atmospheric turbulence from ground level to an altitude of 10,000 meters. The turbulence generation procedure is described as well as the results of validating the simulated turbulence. Conclusions and recommendations are presented and references cited. The tabulated one dimensional von Karman spectra and the results of spectral and statistical analyses of the SSTT are contained in the appendix.
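A nonrecursive turbulence simulation of this kind can be sketched by shaping white Gaussian noise in the frequency domain with a prescribed spectrum. The spectral form and parameter values below are illustrative of the general von Karman approach, not the calibrated shuttle simulation model; the function name and defaults are assumptions.

```python
import numpy as np

def spectral_gust_series(n, dt, L=200.0, sigma=1.0, seed=0):
    """Nonrecursive sketch: shape white Gaussian noise with a
    von-Karman-like one-dimensional spectrum
    S(f) ~ 2L / (1 + (1.339 * 2*pi*f*L)^2)^(5/6),
    with the mean wind speed normalized to 1 for simplicity."""
    rng = np.random.default_rng(seed)
    f = np.fft.rfftfreq(n, dt)
    S = (2 * L) / (1 + (1.339 * 2 * np.pi * f * L) ** 2) ** (5 / 6)
    # Multiply the noise spectrum by the amplitude shape, transform back.
    shaped = np.fft.irfft(np.fft.rfft(rng.standard_normal(n)) * np.sqrt(S), n)
    # Rescale to the requested gust standard deviation.
    return sigma * shaped / shaped.std()
```

Because the whole series is produced in one pass from a stored noise record, the same approach lends itself to pre-generating and archiving gust time series, as was done for the shuttle simulation turbulence tapes.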
Archfield, Stacey A.; Vogel, Richard M.; Steeves, Peter A.; Brandt, Sara L.; Weiskel, Peter K.; Garabedian, Stephen P.
2010-01-01
Federal, State and local water-resource managers require a variety of data and modeling tools to better understand water resources. The U.S. Geological Survey, in cooperation with the Massachusetts Department of Environmental Protection, has developed a statewide, interactive decision-support tool to meet this need. The decision-support tool, referred to as the Massachusetts Sustainable-Yield Estimator (MA SYE) provides screening-level estimates of the sustainable yield of a basin, defined as the difference between the unregulated streamflow and some user-specified quantity of water that must remain in the stream to support such functions as recreational activities or aquatic habitat. The MA SYE tool was designed, in part, because the quantity of surface water available in a basin is a time-varying quantity subject to competing demands for water. To compute sustainable yield, the MA SYE tool estimates a daily time series of unregulated, daily mean streamflow for a 44-year period of record spanning October 1, 1960, through September 30, 2004. Selected streamflow quantiles from an unregulated, daily flow-duration curve are estimated by solving six regression equations that are a function of physical and climate basin characteristics at an ungaged site on a stream of interest. Streamflow is then interpolated between the estimated quantiles to obtain a continuous daily flow-duration curve. A time series of unregulated daily streamflow subsequently is created by transferring the timing of the daily streamflow at a reference streamgage to the ungaged site by equating exceedence probabilities of contemporaneous flow at the two locations. One of 66 reference streamgages is selected by kriging, a geostatistical method, which is used to map the spatial relation among correlations between the time series of the logarithm of daily streamflows at each reference streamgage and the ungaged site. 
Estimated unregulated, daily mean streamflows show good agreement with observed unregulated, daily mean streamflow at 18 streamgages located across southern New England. Nash-Sutcliffe efficiency goodness-of-fit values are between 0.69 and 0.98, and percent root-mean-square-error values are between 19 and 283 percent. The MA SYE tool provides an estimate of streamflow adjusted for current (2000-04) water withdrawals and discharges using a spatially referenced database of permitted groundwater and surface-water withdrawal and discharge volumes. For a user-selected basin, the database is queried to obtain the locations of water withdrawal or discharge volumes within the basin. Groundwater and surface-water withdrawals and discharges are subtracted and added, respectively, from the unregulated, daily streamflow at an ungaged site to obtain a streamflow time series that includes the effects of these withdrawals and discharges. Users also have the option of applying an analytical solution to the time-varying, groundwater withdrawal and discharge volumes that take into account the effects of the aquifer properties on the timing and magnitude of streamflow alteration. For the MA SYE tool, it is assumed that groundwater and surface-water divides are coincident. For areas of southeastern Massachusetts and Cape Cod where this assumption is known to be violated, groundwater-flow models are used to estimate average monthly streamflows at fixed locations. There are several limitations to the quality and quantity of the spatially referenced database of groundwater and surface-water withdrawals and discharges. The adjusted streamflow values do not account for the effects on streamflow of climate change, septic-system discharge, impervious area, non-public water-supply withdrawals less than 100,000 gallons per day, and impounded surface-water bodies.
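The timing-transfer step described above (often called QPPQ) can be sketched as: convert each daily flow at the reference streamgage to its exceedance probability, then read the ungaged site's flow-duration curve at that same probability. The function name and the use of Weibull plotting positions are illustrative assumptions, not necessarily the MA SYE implementation.

```python
import numpy as np

def qppq_transfer(ref_flows, ungaged_fdc_quantiles, ungaged_fdc_probs):
    """QPPQ sketch: equate exceedance probabilities of contemporaneous
    flows to transfer daily timing from a reference gage to an
    ungaged site's flow-duration curve (FDC)."""
    ref = np.asarray(ref_flows, float)
    # Exceedance probability of each day via Weibull plotting positions
    # (rank 1 = largest flow).
    ranks = ref.argsort()[::-1].argsort() + 1
    p_exceed = ranks / (len(ref) + 1)
    # Read the ungaged FDC (probabilities ascending) at those probabilities.
    return np.interp(p_exceed, ungaged_fdc_probs, ungaged_fdc_quantiles)
```

The ungaged FDC itself would come from the regression-estimated quantiles described in the abstract; here it is simply passed in as arrays of probabilities and flows.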
British Columbia. Reference Series No. 25.
ERIC Educational Resources Information Center
Department of External Affairs, Ottawa (Ontario).
This booklet, one of a series featuring the Canadian provinces, presents a brief overview of British Columbia and is suitable for teacher reference or student reading. A discussion of the province's history includes the early European explorers, Indian natives, and later fur traders and settlers. The building of the transcontinental railway, entry…
The Northwest Territories. Reference Series No. 35.
ERIC Educational Resources Information Center
Department of External Affairs, Ottawa (Ontario).
This booklet, one of a series featuring the Canadian provinces, presents a brief overview of Northwest Territories and is suitable for teacher reference or student reading. Separate sections discuss geography, history and people, land claims, the economy, the government, and recreation and the arts. Specific topics include the expansive size and…
Evaluating disease management program effectiveness: an introduction to time-series analysis.
Linden, Ariel; Adams, John L; Roberts, Nancy
2003-01-01
Currently, the most widely used method in the disease management (DM) industry for evaluating program effectiveness is referred to as the "total population approach." This model is a pretest-posttest design, with the most basic limitation being that without a control group, there may be sources of bias and/or competing extraneous confounding factors that offer a plausible rationale explaining the change from baseline. Furthermore, with the current inclination of DM programs to use financial indicators rather than program-specific utilization indicators as the principal measure of program success, additional biases are introduced that may cloud evaluation results. This paper presents a non-technical introduction to time-series analysis (using disease-specific utilization measures) as an alternative, and more appropriate, approach to evaluating DM program effectiveness than the current total population approach.
Multiple Time Series Node Synchronization Utilizing Ambient Reference
2014-12-31
assessment, is the need for fine scale synchronization among communicating nodes and across multiple domains. The severe requirements that Special... The...research community and it is well documented and characterized. The datasets considered from this project (listed below) were used to derive the
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-19
... limitations section (ALS) of the operators' approved maintenance program (reference the Time Limits Manual... airplane used for Pilot Training. (2) Within 30 days after the effective date of this AD, revise the ALS of... within 30 days after the effective date of this AD, revising the ALS of the operators' approved...
NASA Technical Reports Server (NTRS)
Scholtz, P.; Smyth, P.
1992-01-01
This article describes an investigation of a statistical hypothesis testing method for detecting changes in the characteristics of an observed time series. The work is motivated by the need for practical automated methods for on-line monitoring of Deep Space Network (DSN) equipment to detect failures and changes in behavior. In particular, on-line monitoring of the motor current in a DSN 34-m beam waveguide (BWG) antenna is used as an example. The algorithm is based on a measure of the information theoretic distance between two autoregressive models: one estimated with data from a dynamic reference window and one estimated with data from a sliding reference window. The Hinkley cumulative sum stopping rule is utilized to detect a change in the mean of this distance measure, corresponding to the detection of a change in the underlying process. The basic theory behind this two-model test is presented, and the problem of practical implementation is addressed, examining windowing methods, model estimation, and detection parameter assignment. Results from the five fault-transition simulations are presented to show the possible limitations of the detection method, and suggestions for future implementation are given.
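The Hinkley cumulative-sum stopping rule mentioned above can be sketched as follows (a generic one-sided Page/Hinkley CUSUM applied to an arbitrary input series; the drift and threshold values are placeholders, not the DSN settings):

```python
def cusum_detect(x, mu0, drift, threshold):
    """One-sided CUSUM: accumulate evidence of an upward shift away
    from the nominal mean mu0 and flag the first sample where the
    cumulative sum exceeds `threshold`. `drift` desensitizes the
    statistic to small fluctuations."""
    g = 0.0
    for t, xt in enumerate(x):
        g = max(0.0, g + xt - mu0 - drift)
        if g > threshold:
            return t  # change detected at sample index t
    return None       # no change detected

# Toy series: near-zero mean, then a jump in the mean at sample 4.
series = [0.1, -0.2, 0.0, 0.1, 2.0, 2.1, 1.9, 2.2]
print(cusum_detect(series, mu0=0.0, drift=0.5, threshold=3.0))  # → 5
```

In the application above, the input would be the information-theoretic distance measure between the two autoregressive models rather than the raw motor-current signal.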
Using exogenous variables in testing for monotonic trends in hydrologic time series
Alley, William M.
1988-01-01
One approach that has been used in performing a nonparametric test for monotonic trend in a hydrologic time series consists of a two-stage analysis. First, a regression equation is estimated for the variable being tested as a function of an exogenous variable. A nonparametric trend test such as the Kendall test is then performed on the residuals from the equation. By analogy to stagewise regression and through Monte Carlo experiments, it is demonstrated that this approach will tend to underestimate the magnitude of the trend and to result in some loss in power as a result of ignoring the interaction between the exogenous variable and time. An alternative approach, referred to as the adjusted variable Kendall test, is demonstrated to generally have increased statistical power and to provide more reliable estimates of the trend slope. In addition, the utility of including an exogenous variable in a trend test is examined under selected conditions.
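The two-stage procedure criticized above can be sketched in a few lines (an illustrative toy example on synthetic data; the simple tau-a statistic below stands in for the full Kendall test with its significance assessment):

```python
import random

def kendall_tau(x, y):
    """Kendall's tau-a: (concordant - discordant) / number of pairs."""
    n = len(x)
    conc = disc = 0
    for i in range(n):
        for j in range(i + 1, n):
            s = (x[j] - x[i]) * (y[j] - y[i])
            if s > 0:
                conc += 1
            elif s < 0:
                disc += 1
    return (conc - disc) / (n * (n - 1) / 2)

random.seed(1)
t = list(range(80))                                    # time index
exog = [random.gauss(0, 1) for _ in t]                 # exogenous variable
# Tested variable: trend in time plus dependence on the exogenous variable.
y = [0.05 * ti + 2.0 * e + random.gauss(0, 0.3) for ti, e in zip(t, exog)]

# Stage 1: regress the tested variable on the exogenous variable only.
mx = sum(exog) / len(exog)
my = sum(y) / len(y)
beta = (sum((e - mx) * (yi - my) for e, yi in zip(exog, y))
        / sum((e - mx) ** 2 for e in exog))
resid = [yi - beta * e for yi, e in zip(y, exog)]

# Stage 2: nonparametric trend test of the residuals against time.
tau = kendall_tau(t, resid)
```

As the abstract notes, because stage 1 ignores any interaction between the exogenous variable and time, this estimator tends to understate the trend relative to the adjusted variable Kendall test.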
Signal processing of anthropometric data
NASA Astrophysics Data System (ADS)
Zimmermann, W. J.
1983-09-01
The Anthropometric Measurements Laboratory has accumulated a large body of data from a number of previous experiments. The data are very noisy and therefore require the application of signal processing schemes. Moreover, the data were not regarded as time-series measurements but as positional information; hence, they are stored as coordinate points defined by the motion of the human body. The accumulated data define two groups, or classes. Some of the data were collected from an experiment designed to measure the flexibility of the limbs, referred to as radial movement. The remaining data were collected from experiments designed to determine the surface of the reach envelope. An interactive signal processing package was designed and implemented. Since the data do not include time, this package does not include a time-series element. Presently, the results are restricted to processing data obtained from the experiments designed to measure flexibility.
Signal processing of anthropometric data
NASA Technical Reports Server (NTRS)
Zimmermann, W. J.
1983-01-01
The Anthropometric Measurements Laboratory has accumulated a large body of data from a number of previous experiments. The data are very noisy and therefore require the application of signal processing schemes. Moreover, the data were not regarded as time-series measurements but as positional information; hence, they are stored as coordinate points defined by the motion of the human body. The accumulated data define two groups, or classes. Some of the data were collected from an experiment designed to measure the flexibility of the limbs, referred to as radial movement. The remaining data were collected from experiments designed to determine the surface of the reach envelope. An interactive signal processing package was designed and implemented. Since the data do not include time, this package does not include a time-series element. Presently, the results are restricted to processing data obtained from the experiments designed to measure flexibility.
An Analytical Time–Domain Expression for the Net Ripple Produced by Parallel Interleaved Converters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Brian B.; Krein, Philip T.
We apply modular arithmetic and Fourier series to analyze the superposition of N interleaved triangular waveforms with identical amplitudes and duty ratios. Here, interleaving refers to the condition in which a collection of periodic waveforms with identical periods are each uniformly phase-shifted across one period. The main result is a time-domain expression which provides an exact representation of the summed and interleaved triangular waveforms, where the peak amplitude and the parameters of the time-periodic component are all specified in closed form. The analysis is general and can be used to study various applications in multi-converter systems. This model is unique not only in that it reveals a simple and intuitive expression for the net ripple, but also in that its derivation via modular arithmetic and Fourier series is distinct from prior approaches. The analytical framework is experimentally validated with a system of three parallel converters under time-varying operating conditions.
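The interleaving effect analyzed above can be checked numerically (a brute-force sketch, not the paper's closed-form expression; amplitude, duty ratio and sample count are arbitrary):

```python
def triangle(t, duty=0.5):
    """Unit-amplitude triangular carrier with the given duty ratio,
    periodic with period 1, evaluated at time t."""
    t = t % 1.0
    return t / duty if t < duty else (1.0 - t) / (1.0 - duty)

def net_ripple(n, samples=10000, duty=0.5):
    """Peak-to-peak value of the sum of n identical triangular
    waveforms uniformly interleaved (phase-shifted by 1/n of a period),
    estimated by dense sampling over one period."""
    vals = [sum(triangle(t / samples + k / n, duty) for k in range(n))
            for t in range(samples)]
    return max(vals) - min(vals)
```

For a 0.5 duty ratio, two interleaved carriers cancel exactly (zero net ripple), while three carriers leave a reduced ripple of 1/3 at three times the carrier frequency, consistent with the qualitative behavior of interleaved converters.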
Phase correction and error estimation in InSAR time series analysis
NASA Astrophysics Data System (ADS)
Zhang, Y.; Fattahi, H.; Amelung, F.
2017-12-01
During the last decade, several InSAR time series approaches have been developed in response to the non-ideal acquisition strategies of SAR satellites, such as large spatial and temporal baselines and irregular acquisitions. The small baseline tubes and regular acquisitions of new SAR satellites such as Sentinel-1 allow us to form fully connected networks of interferograms and simplify the time series analysis into a weighted least-squares inversion of an over-determined system. Such a robust inversion allows us to focus more on understanding the different components of InSAR time series and their uncertainties. We present an open-source python-based package for InSAR time series analysis, called PySAR (https://yunjunz.github.io/PySAR/), with unique functionalities for obtaining unbiased ground displacement time series, geometrical and atmospheric correction of InSAR data, and quantification of InSAR uncertainty. Our implemented strategy contains several features, including: 1) improved spatial coverage using a coherence-based network of interferograms, 2) unwrapping error correction using phase closure or bridging, 3) tropospheric delay correction using weather models and empirical approaches, 4) DEM error correction, 5) optimal selection of the reference date and automatic outlier detection, 6) InSAR uncertainty due to the residual tropospheric delay, decorrelation and residual DEM error, and 7) the variance-covariance matrix of final products for geodetic inversion. We demonstrate the performance using SAR datasets acquired by Cosmo-SkyMed, TerraSAR-X, Sentinel-1 and ALOS/ALOS-2, with application to the highly non-linear volcanic deformation in Japan and Ecuador (figure 1). Our results show precursory deformation before the 2015 eruptions of Cotopaxi volcano, with a maximum uplift of 3.4 cm on the western flank (fig. 1b) and a standard deviation of 0.9 cm (fig. 1a), supporting the finding by Morales-Rivera et al. 
(2017, GRL); and a post-eruptive subsidence on the same area, with a maximum of -3 +/- 0.9 cm (fig. 1c). Time-series displacement map (fig. 2) shows a highly non-linear deformation behavior, indicating the complicated magma propagation process during this eruption cycle.
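The least-squares inversion of a fully connected interferogram network mentioned above can be sketched as follows (a toy unweighted example with invented dates and phases, not PySAR code):

```python
import numpy as np

# Toy network: 4 acquisition dates, interferograms between date pairs.
pairs = [(0, 1), (1, 2), (2, 3), (0, 2), (1, 3), (0, 3)]
true_phase = np.array([0.0, 1.0, 3.0, 2.5])   # displacement phase per date
obs = np.array([true_phase[j] - true_phase[i] for i, j in pairs])

# Design matrix: each interferogram observes the difference of two dates.
# The first date is held fixed (reference) to remove the rank defect.
A = np.zeros((len(pairs), 3))
for r, (i, j) in enumerate(pairs):
    if i > 0:
        A[r, i - 1] = -1.0
    if j > 0:
        A[r, j - 1] = 1.0

# Over-determined system: 6 observations, 3 unknowns.
est, *_ = np.linalg.lstsq(A, obs, rcond=None)
```

In practice each row would be weighted, e.g. by interferogram coherence, turning this into the weighted least-squares inversion described in the abstract.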
Development of web tools to disseminate space geodesy data-related products
NASA Astrophysics Data System (ADS)
Soudarin, Laurent; Ferrage, Pascale; Mezerette, Adrien
2015-04-01
In order to promote the products of the DORIS system, the French Space Agency CNES has developed and implemented on the web site of the International DORIS Service (IDS) a set of plot tools to interactively build and display time series of site positions, orbit residuals and terrestrial parameters (scale, geocenter). An interactive global map is also available to select sites and to get access to their information. Besides the products provided by the CNES Orbitography Team and the IDS components, these tools allow comparing time evolutions of coordinates for collocated DORIS and GNSS stations, thanks to the collaboration with the Terrestrial Frame Combination Center of the International GNSS Service (IGS). A database was created to improve the robustness and efficiency of the tools, with the objective of proposing a complete web service to foster data exchange with the other geodetic services of the International Association of Geodesy (IAG). Work to visualize and compare position time series of the four main space geodetic techniques (DORIS, GNSS, SLR and VLBI) is already under way at the French level. A dedicated version of these web tools has been developed for the French Space Geodesy Research Group (GRGS). It will give access to position time series provided by the GRGS Analysis Centers involved in DORIS, GNSS, SLR and VLBI data processing for the realization of the International Terrestrial Reference Frame. In this presentation, we will describe the functionalities of these tools, and we will address some aspects of the time series (content, format).
Robust, automatic GPS station velocities and velocity time series
NASA Astrophysics Data System (ADS)
Blewitt, G.; Kreemer, C.; Hammond, W. C.
2014-12-01
Automation in GPS coordinate time series analysis makes results more objective and reproducible, but not necessarily as robust as the human eye to detect problems. Moreover, it is not a realistic option to manually scan our current load of >20,000 time series per day. This motivates us to find an automatic way to estimate station velocities that is robust to outliers, discontinuities, seasonality, and noise characteristics (e.g., heteroscedasticity). Here we present a non-parametric method based on the Theil-Sen estimator, defined as the median of velocities vij=(xj-xi)/(tj-ti) computed between all pairs (i, j). Theil-Sen estimators produce statistically identical solutions to ordinary least squares for normally distributed data, but they can tolerate up to 29% of data being problematic. To mitigate seasonality, our proposed estimator only uses pairs approximately separated by an integer number of years (N-δt)<(tj-ti )<(N+δt), where δt is chosen to be small enough to capture seasonality, yet large enough to reduce random error. We fix N=1 to maximally protect against discontinuities. In addition to estimating an overall velocity, we also use these pairs to estimate velocity time series. To test our methods, we process real data sets that have already been used with velocities published in the NA12 reference frame. Accuracy can be tested by the scatter of horizontal velocities in the North American plate interior, which is known to be stable to ~0.3 mm/yr. This presents new opportunities for time series interpretation. For example, the pattern of velocity variations at the interannual scale can help separate tectonic from hydrological processes. Without any step detection, velocity estimates prove to be robust for stations affected by the Mw7.2 2010 El Mayor-Cucapah earthquake, and velocity time series show a clear change after the earthquake, without any of the usual parametric constraints, such as relaxation of postseismic velocities to their preseismic values.
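The pair-selection idea above can be sketched as follows (a simplified illustration on synthetic data; the real processing also handles data gaps, steps and noise models):

```python
import math
import statistics

def annual_pair_velocity(t, x, n_years=1.0, dt=0.1):
    """Median of pairwise velocities (xj - xi)/(tj - ti), using only
    pairs separated by roughly an integer number of years (here one),
    so that seasonal signals largely cancel. Times in years."""
    v = [(x[j] - x[i]) / (t[j] - t[i])
         for i in range(len(t)) for j in range(i + 1, len(t))
         if abs((t[j] - t[i]) - n_years) < dt]
    return statistics.median(v)

# Synthetic weekly series: 2 mm/yr trend + annual cycle + one gross outlier.
t = [k / 52.0 for k in range(156)]                       # 3 years
x = [2.0 * ti + 5.0 * math.sin(2 * math.pi * ti) for ti in t]
x[70] += 50.0                                            # outlier
v = annual_pair_velocity(t, x)
```

Pairs separated by exactly one year cancel the annual term, and the median discards the outlier-contaminated pairs, so the estimate stays close to the 2 mm/yr trend without any step detection or seasonal fitting.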
Review of current GPS methodologies for producing accurate time series and their error sources
NASA Astrophysics Data System (ADS)
He, Xiaoxing; Montillet, Jean-Philippe; Fernandes, Rui; Bos, Machiel; Yu, Kegen; Hua, Xianghong; Jiang, Weiping
2017-05-01
The Global Positioning System (GPS) is an important tool to observe and model geodynamic processes such as plate tectonics and post-glacial rebound. In the last three decades, GPS has seen tremendous advances in the precision of the measurements, which allow researchers to study geophysical signals through a careful analysis of daily time series of GPS receiver coordinates. However, the GPS observations contain errors, and the time series can be described as the sum of a real signal and noise. The signal itself can again be divided into station displacements due to geophysical causes and to disturbing factors. Examples of the latter are errors in the realization and stability of the reference frame and corrections due to ionospheric and tropospheric delays and GPS satellite orbit errors. There is an increasing demand for detecting millimeter to sub-millimeter level ground displacement signals in order to further understand regional-scale geodetic phenomena, hence requiring further improvements in the sensitivity of the GPS solutions. This paper provides a review spanning over 25 years of advances in processing strategies, error mitigation methods and noise modeling for the processing and analysis of GPS daily position time series. The processing of the observations is described step by step, mainly with three different strategies, in order to explain the weaknesses and strengths of the existing methodologies. In particular, we focus on the choice of the stochastic model in the GPS time series, which directly affects the estimation of the functional model including, for example, tectonic rates, seasonal signals and co-seismic offsets. Moreover, the geodetic community continues to develop computational methods to fully automate all phases of GPS time series analysis. 
This idea is greatly motivated by the large number of GPS receivers installed around the world for diverse applications ranging from surveying small deformations of civil engineering structures (e.g., subsidence of the highway bridge) to the detection of particular geophysical signals.
NASA Astrophysics Data System (ADS)
Rawles, Christopher; Thurber, Clifford
2015-08-01
We present a simple, fast, and robust method for automatic detection of P- and S-wave arrivals using a nearest neighbours-based approach. The nearest neighbour algorithm is one of the most popular time-series classification methods in the data mining community and has been applied to time-series problems in many different domains. Specifically, our method is based on the non-parametric time-series classification method developed by Nikolov. Instead of building a model by estimating parameters from the data, the method uses the data itself to define the model. Potential phase arrivals are identified based on their similarity to a set of reference data consisting of positive and negative sets, where the positive set contains examples of analyst identified P- or S-wave onsets and the negative set contains examples that do not contain P waves or S waves. Similarity is defined as the square of the Euclidean distance between vectors representing the scaled absolute values of the amplitudes of the observed signal and a given reference example in time windows of the same length. For both P waves and S waves, a single pass is done through the bandpassed data, producing a score function defined as the ratio of the sum of similarity to positive examples over the sum of similarity to negative examples for each window. A phase arrival is chosen as the centre position of the window that maximizes the score function. The method is tested on two local earthquake data sets, consisting of 98 known events from the Parkfield region in central California and 32 known events from the Alpine Fault region on the South Island of New Zealand. For P-wave picks, using a reference set containing two picks from the Parkfield data set, 98 per cent of Parkfield and 94 per cent of Alpine Fault picks are determined within 0.1 s of the analyst pick. 
For S-wave picks, 94 per cent and 91 per cent of picks are determined within 0.2 s of the analyst picks for the Parkfield and Alpine Fault data set, respectively. For the Parkfield data set, our method picks 3520 P-wave picks and 3577 S-wave picks out of 4232 station-event pairs. For the Alpine Fault data set, the method picks 282 P-wave picks and 311 S-wave picks out of a total of 344 station-event pairs. For our testing, we note that the vast majority of station-event pairs have analyst picks, although some analyst picks are excluded based on an accuracy assessment. Finally, our tests suggest that the method is portable, allowing the use of a reference set from one region on data from a different region using relatively few reference picks.
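The score function described in the abstract can be sketched on a toy trace (an illustration only: the window length, the reference sets, and the use of inverse squared distance as "similarity" are our assumptions, not the authors' exact definitions):

```python
def sq_dist(a, b):
    return sum((u - v) ** 2 for u, v in zip(a, b))

def score(window, positives, negatives, eps=1e-9):
    """Ratio of summed similarity to positive (pick) examples over
    summed similarity to negative (no-pick) examples; similarity is
    taken here as the reciprocal of the squared Euclidean distance,
    so larger means more alike."""
    sp = sum(1.0 / (sq_dist(window, p) + eps) for p in positives)
    sn = sum(1.0 / (sq_dist(window, n) + eps) for n in negatives)
    return sp / sn

def pick(trace, wlen, positives, negatives):
    """Return the centre index of the window maximizing the score."""
    best, best_i = -1.0, None
    for i in range(len(trace) - wlen + 1):
        w = [abs(v) for v in trace[i:i + wlen]]   # absolute amplitudes
        m = max(w) or 1.0
        w = [v / m for v in w]                    # scale window to unit max
        s = score(w, positives, negatives)
        if s > best:
            best, best_i = s, i + wlen // 2
    return best_i

# Toy trace: quiet noise, then an onset at sample 10.
trace = [0.01, -0.02, 0.01, 0.0, -0.01, 0.02, 0.01, -0.01, 0.0, 0.01,
         0.5, -0.8, 1.0, -0.9, 0.7, -0.6, 0.5, -0.4, 0.3, -0.2]
positives = [[0.0, 0.0, 0.5, 0.8, 1.0]]   # onset-like reference window
negatives = [[0.6, 0.7, 0.5, 0.8, 0.6]]   # sustained-signal reference
print(pick(trace, 5, positives, negatives))  # → 10
```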
Dean, Roger T; Dunsmuir, William T M
2016-06-01
Many articles on perception, performance, psychophysiology, and neuroscience seek to relate pairs of time series through assessments of their cross-correlations. Most such series are individually autocorrelated: they do not comprise independent values. Given this situation, an unfounded reliance is often placed on cross-correlation as an indicator of relationships (e.g., referent vs. response, leading vs. following). Such cross-correlations can indicate spurious relationships, because of autocorrelation. Given these dangers, we here simulated how and why such spurious conclusions can arise, to provide an approach to resolving them. We show that when multiple pairs of series are aggregated in several different ways for a cross-correlation analysis, problems remain. Finally, even a genuine cross-correlation function does not answer key motivating questions, such as whether there are likely causal relationships between the series. Thus, we illustrate how to obtain a transfer function describing such relationships, informed by any genuine cross-correlations. We illustrate the confounds and the meaningful transfer functions by two concrete examples, one each in perception and performance, together with key elements of the R software code needed. The approach involves autocorrelation functions, the establishment of stationarity, prewhitening, the determination of cross-correlation functions, the assessment of Granger causality, and autoregressive model development. Autocorrelation also limits the interpretability of other measures of possible relationships between pairs of time series, such as mutual information. We emphasize that further complexity may be required as the appropriate analysis is pursued fully, and that causal intervention experiments will likely also be needed.
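The prewhitening step discussed above can be sketched in Python rather than R (a minimal AR(1) prewhitening of two independent but strongly autocorrelated series; not the article's full workflow):

```python
import random

def ar1_coef(x):
    """Lag-1 autocorrelation, used as a simple AR(1) estimate."""
    mx = sum(x) / len(x)
    num = sum((x[t] - mx) * (x[t - 1] - mx) for t in range(1, len(x)))
    den = sum((v - mx) ** 2 for v in x)
    return num / den

def prewhiten(x, phi):
    """Residuals after removing an AR(1) dependence."""
    return [x[t] - phi * x[t - 1] for t in range(1, len(x))]

def xcorr0(a, b):
    """Zero-lag cross-correlation of two equal-length series."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    den = (sum((u - ma) ** 2 for u in a)
           * sum((v - mb) ** 2 for v in b)) ** 0.5
    return num / den

random.seed(0)
# Two INDEPENDENT AR(1) series: any apparent correlation is spurious.
x, y = [0.0], [0.0]
for _ in range(500):
    x.append(0.95 * x[-1] + random.gauss(0, 1))
    y.append(0.95 * y[-1] + random.gauss(0, 1))

raw = xcorr0(x, y)                      # inflated by autocorrelation
phi_x = ar1_coef(x)
white = xcorr0(prewhiten(x, phi_x), prewhiten(y, ar1_coef(y)))
```

After prewhitening, the cross-correlation of the two independent series collapses toward the small values expected under the null, illustrating why raw cross-correlations of autocorrelated series are unreliable.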
Relative Photometry of HAT-P-1b Occultations
NASA Astrophysics Data System (ADS)
Béky, Bence; Holman, Matthew J.; Gilliland, Ronald L.; Bakos, Gáspár Á.; Winn, Joshua N.; Noyes, Robert W.; Sasselov, Dimitar D.
2013-06-01
We present Hubble Space Telescope (HST) Space Telescope Imaging Spectrograph observations of two occultations of the transiting exoplanet HAT-P-1b. By measuring the planet to star flux ratio near opposition, we constrain the geometric albedo of the planet, which is strongly linked to its atmospheric temperature gradient. An advantage of HAT-P-1 as a target is its binary companion ADS 16402 A, which provides an excellent photometric reference, simplifying the usual steps in removing instrumental artifacts from HST time-series photometry. We find that without this reference star, we would need to detrend the lightcurve with the time of the exposures as well as the first three powers of HST orbital phase, and this would introduce a strong bias in the results for the albedo. However, with this reference star, we only need to detrend the data with the time of the exposures to achieve the same per-point scatter, therefore we can avoid most of the bias associated with detrending. Our final result is a 2σ upper limit of 0.64 for the geometric albedo of HAT-P-1b between 577 and 947 nm.
Toward Automatic Georeferencing of Archival Aerial Photogrammetric Surveys
NASA Astrophysics Data System (ADS)
Giordano, S.; Le Bris, A.; Mallet, C.
2018-05-01
Images from archival aerial photogrammetric surveys are a unique and relatively unexplored means to chronicle 3D land-cover changes over the past 100 years. They provide a relatively dense temporal sampling of the territories with very high spatial resolution. Such time series image analysis is a mandatory baseline for a large variety of long-term environmental monitoring studies. The current bottleneck for accurate comparison between epochs is the fine georeferencing step. No fully automatic method has been proposed yet, and existing studies are rather limited in terms of area and number of dates. The state of the art shows that the major challenge is the identification of ground references: cartographic coordinates and their position in the archival images. This task is performed manually and is extremely time-consuming. This paper proposes to use a photogrammetric approach, and states that the 3D information that can be computed is the key to full automation. Its original idea lies in a 2-step approach: (i) the computation of a coarse absolute image orientation; (ii) the use of the coarse Digital Surface Model (DSM) information for automatic absolute image orientation. It relies only on a recent orthoimage+DSM, used as the master reference for all epochs. The coarse orthoimage, compared with such a reference, allows the identification of dense ground references, and the coarse DSM provides their position in the archival images. Results on two areas and 5 dates show that this method is compatible with long and dense archival aerial image series. Satisfactory planimetric and altimetric accuracies are reported, with variations depending on the ground sampling distance of the images and the location of the Ground Control Points.
Code of Federal Regulations, 2010 CFR
2010-07-01
..., DEPARTMENT OF THE TREASURY BUREAU OF THE PUBLIC DEBT REGULATIONS GOVERNING U.S. SAVINGS BONDS, SERIES A, B, C, D, E, F, G, H, J, AND K, AND U.S. SAVINGS NOTES General Information § 315.2 Definitions. As used in... the context indicates otherwise. General references to bonds and direct references to Series E bonds...
NASA Astrophysics Data System (ADS)
Moreaux, Guilhem; Lemoine, Frank G.; Capdeville, Hugues; Kuzin, Sergey; Otten, Michiel; Štěpánek, Petr; Willis, Pascal; Ferrage, Pascale
2016-12-01
In preparation for the 2014 realization of the International Terrestrial Reference Frame (ITRF2014), the International DORIS Service delivered to the International Earth Rotation and Reference Systems Service a set of 1140 weekly solution files including station coordinates and Earth orientation parameters, covering the time period from 1993.0 to 2015.0. The data come from eleven DORIS satellites: TOPEX/Poseidon, SPOT2, SPOT3, SPOT4, SPOT5, Envisat, Jason-1, Jason-2, Cryosat-2, Saral and HY-2A. In their processing, the six analysis centers which contributed to the DORIS combined solution used the latest time-variable gravity models and estimated DORIS ground beacon frequency variations. Furthermore, all but one of the analysis centers included phase center variations for ground antennas in their processing. The main objective of this study is to present the combination process and to analyze the impact of the new modeling on the performance of the new combined solution. Comparisons with the IDS contribution to ITRF2008 show that (i) the application of the DORIS ground phase center variations in the data processing shifts the combined scale upward by nearly 7-11 mm and (ii) thanks to the estimation of DORIS ground beacon frequency variations, the new combined solution no longer shows any scale discontinuity in early 2002 and does not present unexplained vertical discontinuities in any station position time series. However, analysis of the new series with respect to ITRF2008 exhibits a scale increase in late 2011 which is not yet explained. A new DORIS Terrestrial Reference Frame was computed to evaluate the intrinsic quality of the new combined solution. That evaluation shows that the addition of data from the new missions equipped with the latest generation of DORIS receiver (Jason-2, Cryosat-2, HY-2A, Saral) results in an internal position consistency of 10 mm or better after mid-2008.
de Cremoux, P; Bieche, I; Tran-Perennou, C; Vignaud, S; Boudou, E; Asselain, B; Lidereau, R; Magdelénat, H; Becette, V; Sigal-Zafrani, B; Spyratos, F
2004-09-01
Quantitative reverse transcription-polymerase chain reaction (RT-PCR) used to detect minor changes in specific mRNA concentrations may be associated with poor reproducibility. Stringent quality control is therefore essential at each step of the protocol, including the PCR procedure. We performed inter-laboratory quality control of quantitative PCR between two independent laboratories, using in-house RT-PCR assays on a series of hormone-related target genes in a retrospective consecutive series of 79 breast tumors. Total RNA was reverse transcribed in a single center. Calibration curves were performed for five target genes (estrogen receptor (ER)alpha, ERbeta, progesterone receptor (PR), CYP19 (aromatase) and Ki-67) and for two reference genes (human acidic ribosomal phosphoprotein PO (RPLPO) and TATA box-binding protein (TBP)). Amplification efficiencies of the calibrator were determined for each run and used to calculate mRNA expression. Correlation coefficients were evaluated for each target and each reference gene. A good correlation was observed for all target and reference genes in both centers using their own protocols and kits (P < 0.0001). The correlation coefficients ranged from 0.90 to 0.98 for the various target genes in the two centers. A good correlation was observed between the levels of expression of the ERalpha and PR transcripts (P < 0.001). A weak inverse correlation was observed in both centers between ERalpha and ERbeta levels, but only when TBP was the reference gene. No other correlation was observed with other parameters. Real-time PCR assays allow convenient quantification of target mRNA transcripts and of target-derived nucleic acids in clinical specimens. This study underscores the importance of inter-laboratory quality control for panels of real-time PCR assays applied to clinical samples, to ensure their accuracy. 
This can also facilitate exchanges and multicenter comparison of data.
NASA Astrophysics Data System (ADS)
He, Jiayi; Shang, Pengjian; Xiong, Hui
2018-06-01
Stocks, as a concrete manifestation of financial time series with plenty of potential information, are often used in the study of financial time series. In this paper, we utilize stock data to recognize patterns through a dissimilarity matrix based on modified cross-sample entropy, and then provide three-dimensional perceptual maps of the results through the multidimensional scaling method. Two modified multidimensional scaling methods are proposed in this paper, namely multidimensional scaling based on Kronecker-delta cross-sample entropy (MDS-KCSE) and multidimensional scaling based on permutation cross-sample entropy (MDS-PCSE). These two methods use Kronecker-delta based cross-sample entropy and permutation based cross-sample entropy to replace the distance or dissimilarity measurement in classical multidimensional scaling (MDS). Multidimensional scaling based on Chebyshev distance (MDSC) is employed to provide a reference for comparisons. Our analysis reveals a clear clustering both in synthetic data and in 18 indices from diverse stock markets. It implies that time series generated by the same model are more likely to share similar irregularity than others, and that differences among stock indices, caused by country or region and by different financial policies, are reflected in the irregularity of the data. In the synthetic data experiments, not only can time series generated by different models be distinguished, but those generated under different parameters of the same model can also be detected. In the financial data experiment, the stock indices are clearly divided into five groups. Through analysis, we find that they correspond to five regions, respectively: Europe, North America, South America, the Asia-Pacific region (with the exception of mainland China), and mainland China and Russia. The results also demonstrate that MDS-KCSE and MDS-PCSE provide more effective divisions in the experiments than MDSC.
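Classical MDS, the baseline into which the entropy-based dissimilarities are substituted, can be sketched with numpy (a generic textbook implementation; the cross-sample entropy measures themselves are not reproduced here):

```python
import numpy as np

def classical_mds(D, k=2):
    """Embed n points in k dimensions from an n-by-n dissimilarity
    matrix D via double centering and eigendecomposition."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D ** 2) @ J            # Gram matrix from dissimilarities
    w, V = np.linalg.eigh(B)
    idx = np.argsort(w)[::-1][:k]          # keep the largest eigenvalues
    L = np.sqrt(np.clip(w[idx], 0.0, None))
    return V[:, idx] * L                   # coordinates, one row per point

# Three points on a line with pairwise distances 1, 1, 2.
D = np.array([[0.0, 1.0, 2.0],
              [1.0, 0.0, 1.0],
              [2.0, 1.0, 0.0]])
X = classical_mds(D, k=1)
```

The recovered 1-D coordinates reproduce the input distances up to sign; substituting an entropy-based dissimilarity matrix for `D` yields the perceptual maps described above.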
Prediction of the Reference Evapotranspiration Using a Chaotic Approach
Wang, Wei-guang; Zou, Shan; Luo, Zhao-hui; Zhang, Wei; Kong, Jun
2014-01-01
Evapotranspiration is one of the most important hydrological variables in the context of water resources management. An attempt was made in this study to understand and predict the dynamics of reference evapotranspiration from a nonlinear dynamical perspective. The reference evapotranspiration data were calculated using the FAO Penman-Monteith equation with the observed daily meteorological data for the period 1966–2005 at four meteorological stations (i.e., Baotou, Zhangbei, Kaifeng, and Shaoguan) representing a wide range of climatic conditions of China. The correlation dimension method was employed to investigate the chaotic behavior of the reference evapotranspiration series. The existence of chaos in the reference evapotranspiration series at the four different locations was indicated by the finite and low correlation dimension. A local approximation approach was employed to forecast the daily reference evapotranspiration series. Low root mean square error (RMSE) and mean absolute error (MAE) (for all locations lower than 0.31 and 0.24, resp.), high correlation coefficient (CC), and modified coefficient of efficiency (for all locations larger than 0.97 and 0.8, resp.) indicate that the predicted reference evapotranspiration agrees well with the observed one. The encouraging results indicate the suitability of the chaotic approach for understanding and predicting the dynamics of the reference evapotranspiration. PMID:25133221
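The local approximation forecast used above can be sketched as a zeroth-order nearest-neighbour prediction in a delay-embedded phase space (a generic illustration; the embedding dimension, delay, and toy series are invented, not the study's settings):

```python
def local_predict(series, m=3, tau=1):
    """Zeroth-order local approximation: build delay vectors
    [x(t-(m-1)*tau), ..., x(t)], find the past state closest to the
    current one, and predict that state's successor."""
    last = len(series) - 1
    current = [series[last - (m - 1 - k) * tau] for k in range(m)]
    best_d, best_next = float("inf"), None
    for t in range((m - 1) * tau, last):      # successor must exist
        v = [series[t - (m - 1 - k) * tau] for k in range(m)]
        d = sum((a - b) ** 2 for a, b in zip(v, current))
        if d < best_d:
            best_d, best_next = d, series[t + 1]
    return best_next

# Deterministic periodic toy series: the forecast continues the cycle.
x = [0, 1, 2, 3, 0, 1, 2, 3, 0, 1, 2]
print(local_predict(x, m=3, tau=1))  # → 3
```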
Effect of Time Varying Gravity on DORIS processing for ITRF2013
NASA Astrophysics Data System (ADS)
Zelensky, N. P.; Lemoine, F. G.; Chinn, D. S.; Beall, J. W.; Melachroinos, S. A.; Beckley, B. D.; Pavlis, D.; Wimert, J.
2013-12-01
Computations are under way to develop a new time series of DORIS SINEX solutions to contribute to the development of the new realization of the terrestrial reference frame (cf. ITRF2013). One of the improvements envisaged is the application of improved models of time-variable gravity (TVG) in the background orbit modeling. At GSFC we have developed a time series of spherical harmonics to degree and order 5 (using the GOCO02S model as a base), based on the processing of SLR and DORIS data to 14 satellites from 1993 to 2013. This is compared with the standard approach used in ITRF2008, based on the static model EIGEN-GL04S1, which included secular variations in only a few select coefficients. Previous work on altimeter satellite POD (cf. TOPEX/Poseidon, Jason-1, Jason-2) has shown that the standard model is not adequate and that orbit improvements are observed with the application of more detailed models of time-variable gravity. In this study, we quantify the impact of TVG modeling on DORIS satellite POD and ascertain the impact on DORIS station positions estimated weekly from 1993 to 2013. The numerous recent improvements to SLR and DORIS processing at GSFC include more complete compliance with IERS2010 standards, improvements to SLR/DORIS measurement modeling, and improved non-conservative force modeling for the DORIS satellites. These improvements will affect gravity coefficient estimates, POD, and the station solutions. Tests evaluate the impact of time-varying gravity on tracking data residuals, station consistency, and the geocenter and scale reference frame parameters.
Kronberg, James W.
1992-01-01
A sequential power-up circuit for starting several electrical load elements in series to avoid excessive current surge, comprising a voltage ramp generator and a set of voltage comparators, each comparator having a different reference voltage and interfacing with a switch that is capable of turning on one of the load elements. As the voltage rises, it passes the reference voltages one at a time and causes the switch corresponding to that voltage to turn on its load element. The ramp is turned on and off by a single switch or by a logic-level electrical signal. The ramp rate for turning on the load element is relatively slow and the rate for turning the elements off is relatively fast. Optionally, the duration of each interval of time between the turning on of the load elements is programmable.
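The staggered turn-on behaviour can be illustrated with a toy simulation: as the ramp voltage rises, each comparator fires when its reference voltage is crossed. The reference voltages, ramp rate, and time step below are hypothetical, not values from the patent.

```python
def power_up_times(ref_voltages, ramp_rate, dt=1e-4, v_max=12.0):
    """Simulate the ramp: each comparator's load switches on as the
    rising voltage passes that comparator's reference voltage."""
    times = {}
    pending = sorted(ref_voltages)
    v, t = 0.0, 0.0
    while v < v_max and pending:
        if v >= pending[0]:
            times[pending.pop(0)] = t  # record switch-on time for this load
        v += ramp_rate * dt
        t += dt
    return times

# Hypothetical comparator references (volts) and a 100 V/s ramp
on_times = power_up_times([2.0, 5.0, 8.0], ramp_rate=100.0)
```

A slower `ramp_rate` spreads the turn-on events further apart, which is how the circuit limits inrush current; a fast reverse ramp would turn the loads off nearly together.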
Time-series analysis of the barriers for admission into a spinal rehabilitation unit.
New, P W; Akram, M
2016-02-01
This is a prospective open-cohort case series. The objective of this study was to assess changes over time in the duration of key acute-hospital process barriers for patients with spinal cord damage (SCD), from admission until transfer into a spinal rehabilitation unit (SRU) or other destinations. The study was conducted in acute hospitals in Victoria, Australia (2006-2013). The duration of the following discrete sequential processes was measured: acute hospital admission until referral to SRU, referral until SRU assessment, SRU assessment until ready for SRU transfer, and ready for transfer until SRU admission. Time-series analysis was performed using a generalised additive model (GAM). Seasonality of non-traumatic spinal cord dysfunction (SCDys) was examined. GAM analysis showed that the waiting time for admission into the SRU was significantly (P<0.001) longer for patients who were female, had tetraplegia, were motor complete, had a pelvic pressure ulcer, and were referred from another health network. Age had a non-linear effect on the duration of waiting for transfer from acute hospital to SRU and on both the acute hospital and SRU length of stay (LOS). The duration patients spent waiting for SRU admission increased over the study period. There was an increase in the number of referrals over the study period and an increase in the number of patients accepted but not admitted into the SRU. There was no notable seasonal influence on the referral of patients with SCDys. Time-series analysis provides additional insights into changes in the waiting times for SRU admission and the LOS in hospital for patients with SCD.
International Geomagnetic Reference Field: the third generation.
Peddie, N.W.
1982-01-01
In August 1981 the International Association of Geomagnetism and Aeronomy revised the International Geomagnetic Reference Field (IGRF). It is the second revision since the inception of the IGRF in 1968. The revision extends the earlier series of IGRF models from 1980 to 1985, introduces a new series of definitive models for 1965-1976, and defines a provisional reference field for 1975-1980. The revision consists of: 1) a model of the main geomagnetic field at 1980.0, not continuous with the earlier series of IGRF models, together with a forecast model of the secular variation of the main field during 1980-1985; 2) definitive models of the main field at 1965.0, 1970.0, and 1975.0, with linear interpolation of the model coefficients specified for intervening dates; and 3) a provisional reference field for 1975-1980, defined as the linear interpolation of the 1975 and 1980 main-field models.
Interannual Variability of OLR as Observed by AIRS and CERES
NASA Technical Reports Server (NTRS)
Susskind, Joel; Molnar, Gyula; Iredell, Lena; Loeb, Norman G.
2012-01-01
This paper compares spatial anomaly time series of OLR (Outgoing Longwave Radiation) and OLR(sub CLR) (clear-sky OLR) as determined from observations by CERES Terra and AIRS over the period September 2002 through June 2011. Both AIRS and CERES show a significant decrease in global mean and tropical mean OLR over this period. We find excellent agreement between the anomaly time series of the two OLR data sets in almost every detail, down to the 1 deg x 1 deg spatial grid point level. The extremely close agreement of OLR anomaly time series derived from observations by two different instruments implies that both sets of results must be highly stable. This agreement also validates, to some extent, the anomaly time series of the AIRS derived products used in the computation of the AIRS OLR product. The paper also examines the correlations of anomaly time series of AIRS and CERES OLR, on different spatial scales, as well as those of other AIRS derived products, with that of the NOAA Sea Surface Temperature (SST) product averaged over the NOAA Nino-4 spatial region. We refer to these SST anomalies as the El Nino Index. Large spatially coherent positive and negative correlations of OLR anomaly time series with the El Nino Index are found in different spatial regions. Anomalies of global mean, and especially tropical mean, OLR are highly positively correlated with the El Nino Index. These correlations indicate that the recent global and tropical mean decreases in OLR over the period September 2002 through June 2011, as observed by both AIRS and CERES, are primarily the result of a transition from an El Nino condition at the beginning of the data record to La Nina conditions toward the end of the data period. We show that the close correlation of global mean, and especially tropical mean, OLR anomalies with the El Nino Index can be well accounted for by temporal changes of OLR within two spatial regions lying outside the NOAA Nino-4 region, in which anomalies of cloud cover and mid-tropospheric water vapor are both highly negatively correlated with the El Nino Index. The AIRS and CERES OLR(sub CLR) anomaly time series agree less closely, which may be a result of the large sampling differences between the ensembles of cases included in each OLR(sub CLR) data set.
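The anomaly-correlation analysis described above reduces to two steps: remove the mean annual cycle from a monthly series, then correlate the residual with an index series. The sketch below uses entirely synthetic data; no AIRS, CERES, or NOAA values are involved, and the "El Nino-like" index is just a slow sinusoid.

```python
import numpy as np

def monthly_anomalies(x):
    """Remove the mean annual cycle from a monthly series
    (length must be a multiple of 12)."""
    clim = x.reshape(-1, 12).mean(axis=0)        # monthly climatology
    return x - np.tile(clim, len(x) // 12)

def anomaly_correlation(a, b):
    """Pearson correlation of two anomaly time series."""
    return np.corrcoef(a, b)[0, 1]

# Toy demo: an OLR-like series driven by a synthetic ENSO-like index
rng = np.random.default_rng(1)
months = 9 * 12
index = np.sin(2 * np.pi * np.arange(months) / 40)       # slow cycle
seasonal = np.tile(np.cos(2 * np.pi * np.arange(12) / 12), 9)
olr = 240 + 5 * seasonal + 2 * index + 0.1 * rng.standard_normal(months)
r = anomaly_correlation(monthly_anomalies(olr), index)
```

Subtracting the climatology first matters: correlating the raw series with the index would be diluted by the annual cycle, which both instruments see identically.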
A new algorithm for automatic Outlier Detection in GPS Time Series
NASA Astrophysics Data System (ADS)
Cannavo', Flavio; Mattia, Mario; Rossi, Massimo; Palano, Mimmo; Bruno, Valentina
2010-05-01
Nowadays continuous GPS time series are considered a crucial product of GPS permanent networks, useful in many geoscience fields, such as active tectonics, seismology, crustal deformation and volcano monitoring (Altamimi et al. 2002, Elósegui et al. 2006, Aloisi et al. 2009). Although GPS data processing software has increased in reliability, the time series are still affected by different kinds of noise, from intrinsic noise (e.g. tropospheric delay) to un-modeled noise (e.g. cycle slips, satellite faults, parameter changes). Typically, GPS time series exhibit a characteristic noise that is a linear combination of white noise and correlated colored noise, and this behavior is fractal in the sense that it is evident at every time scale or sampling rate considered. The un-modeled noise sources result in spikes, outliers and steps. These kinds of errors can appreciably influence the estimation of velocities of the monitored sites. Outlier detection in generic time series is a widely treated problem in the literature (Wei, 2005), but it is not fully developed for the specific kind of GPS series. We propose a robust automatic procedure for cleaning GPS time series of outliers and, especially for long daily series, of steps due to strong seismic or volcanic events or simply to instrumentation changes such as antenna and receiver upgrades. The procedure is basically divided into two steps: a first step for colored-noise reduction and a second step for outlier detection through adaptive series segmentation. Both algorithms present novel ideas and are nearly unsupervised. In particular, we propose an algorithm to estimate an autoregressive model for the colored noise in GPS time series in order to subtract the effect of non-Gaussian noise from the series. This step is useful for the subsequent step (i.e. adaptive segmentation), which requires the hypothesis of Gaussian noise.
The proposed algorithms are tested in a benchmark case study and the results confirm that the algorithms are effective and reasonable.
Bibliography:
- Aloisi, M., A. Bonaccorso, F. Cannavò, S. Gambino, M. Mattia, G. Puglisi, E. Boschi, A new dyke intrusion style for the Mount Etna May 2008 eruption modelled through continuous tilt and GPS data, Terra Nova, 21(4), 316-321, doi:10.1111/j.1365-3121.2009.00889.x, August 2009.
- Altamimi, Z., P. Sillard, C. Boucher, ITRF2000: A new release of the International Terrestrial Reference Frame for earth science applications, J. Geophys. Res. Solid Earth, 107(B10), 2214, October 2002.
- Elósegui, P., J. L. Davis, D. Oberlander, R. Baena, G. Ekström, Accuracy of high-rate GPS for seismology, Geophys. Res. Lett., 33, L11308, doi:10.1029/2006GL026065, 2006.
- Wei, W. S., Time Series Analysis: Univariate and Multivariate Methods, 2nd ed., Addison Wesley, ISBN-10: 0321322169, July 2005.
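The colored-noise reduction step can be illustrated with a least-squares autoregressive fit whose residuals are approximately white. The model order and the synthetic AR(1) series below are illustrative only; the authors' actual estimation algorithm is not described in enough detail here to reproduce.

```python
import numpy as np

def ar_whiten(x, p=2):
    """Fit an AR(p) model by least squares and return the residual
    (approximately white) series, as in a colored-noise reduction step."""
    X = np.column_stack([x[p - i - 1 : len(x) - i - 1] for i in range(p)])
    y = x[p:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ coeffs

# Demo: AR(1) colored noise is flattened to near-white residuals
rng = np.random.default_rng(2)
e = rng.standard_normal(3000)
x = np.zeros(3000)
for t in range(1, 3000):
    x[t] = 0.9 * x[t - 1] + e[t]    # strongly correlated series
resid = ar_whiten(x, p=1)
```

The residual series then satisfies (approximately) the Gaussian white-noise hypothesis that the subsequent adaptive segmentation step requires.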
DOE Office of Scientific and Technical Information (OSTI.GOV)
Griffin, J.M.
1977-11-01
The pseudo-data approach to the joint production of petroleum refining and chemicals is described as an alternative that avoids the multicollinearity of time-series data and allows a complex technology to be characterized in a statistical price possibility frontier. Intended primarily for long-range analysis, the pseudo-data method can be used as a source of elasticity estimates for policy analysis.
ERIC Educational Resources Information Center
Usher, Alex
2005-01-01
For the first time in the better part of a decade, the idea of making student loans "Income-Contingent" (often referred to as "ICR", which stands for Income-Contingent Repayment) is making a re-appearance on the Canadian policy scene. For those who have long favoured ICRs, this debate will be an opportunity to dust off their…
Woodstove smoke and CO emissions: comparison of reference methods with the VIP sampler.
Jaasma, D R; Champion, M C; Shelton, J W
1990-06-01
A new field sampler has been developed for measuring the particulate matter (PM) and carbon monoxide emissions of woodburning stoves. Particulate matter is determined by carbon balance and the workup of a sample train which is similar to a room-temperature EPA Method 5G train. A steel tank, initially evacuated, serves as the motive force for sampling and also accumulates a gas sample for post-test analysis of time-averaged stack CO and CO2 concentrations. Workup procedures can be completed within 72 hours of sampler retrieval. The system has been compared to reference methods in two laboratory test series involving six different woodburning appliances and two independent laboratories. The correlation of field sampler emission rates and reference method rates is strong.
NChina16: A stable geodetic reference frame for geological hazard studies in North China
NASA Astrophysics Data System (ADS)
Wang, Guoquan; Bao, Yan; Gan, Weijun; Geng, Jianghui; Xiao, Gengru; Shen, Jack S.
2018-04-01
We have developed a stable North China Reference Frame 2016 (NChina16) using five years of continuous GPS observations (2011.8-2016.8) from 12 continuously operating reference stations (CORS) fixed to the North China Craton. Applications of NChina16 in landslide and subsidence studies are illustrated in this article. A method for realizing a regional geodetic reference frame is introduced. The primary result of this study is the seven parameters for transforming Cartesian ECEF (Earth-Centered, Earth-Fixed) coordinates X, Y, and Z from the International GNSS Service Reference Frame 2008 (IGS08) to NChina16. The seven parameters comprise the epoch used to align the regional reference frame to IGS08 and the time derivatives of three translations and three rotations. The GIPSY-OASIS (V6.4) software package was used to obtain the precise point positioning (PPP) daily solutions with respect to IGS08. The frame stability of NChina16 is approximately 0.5 mm/year in both the horizontal and vertical directions. This study also developed a regional model for correcting seasonal motions superimposed on the vertical component of the GPS-derived displacement time series. Long-term GPS observations (1999-2016) from five CORS in North China were used to develop the seasonal model. According to this study, PPP daily solutions with respect to NChina16 can achieve 2-3 mm horizontal accuracy and 4-5 mm vertical accuracy after being corrected with the regional model. NChina16 will be critical for studying geodynamic problems in North China, such as earthquakes, faulting, subsidence, and landslides. The regional reference frame will be updated every few years to mitigate degradation of the frame with time and to stay synchronized with updates of the IGS reference frame.
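The frame transformation described above (an alignment epoch plus translation and rotation rates) can be sketched as a small-angle, time-dependent Helmert-style transform. Every parameter value below is an illustrative placeholder, not the published NChina16 set, and the scale term is omitted because the abstract lists only translation and rotation rates.

```python
import numpy as np

def igs_to_regional(x, t, epoch, t_dot, r_dot):
    """Transform an ECEF position from a global frame to a regional
    frame aligned at `epoch`, using translation rates t_dot (m/yr) and
    small rotation rates r_dot (rad/yr): the abstract's 7 parameters
    (one epoch plus six rates)."""
    dt = t - epoch
    wx, wy, wz = r_dot
    R = np.array([[0.0, -wz,  wy],
                  [ wz, 0.0, -wx],
                  [-wy,  wx, 0.0]])      # small-angle rotation generator
    return x + dt * t_dot + dt * (R @ x)

# Illustrative ECEF position (metres) and made-up rate parameters
x_igs = np.array([-2148744.1, 4426641.2, 4044655.8])
x_reg = igs_to_regional(x_igs, t=2018.3, epoch=2016.0,
                        t_dot=np.array([0.001, -0.002, 0.003]),
                        r_dot=np.array([1e-9, -2e-9, 1.5e-9]))
```

At the alignment epoch the transform is the identity; the regional frame then drifts away from the global one at the prescribed rates, which is what keeps stations on the stable craton nominally motionless in NChina16.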
Functional Cues for Position Learning Effects in Animals
ERIC Educational Resources Information Center
Burns, Richard A.; Johnson, Kendra S.; Harris, Brian A.; Kinney, Beth A.; Wright, Sarah E.
2004-01-01
Using transfer methodology, several possible factors that could have affected the expression of serial position learning were examined with runway-trained rats. A 3-trial series (SNP) --for which S and P refer to series trials when sucrose (S) and plain (P) Noyes pellets were used as a reward, and N refers to a trial without reward -- was the…
NASA Astrophysics Data System (ADS)
Lakshmi, K.; Rama Mohan Rao, A.
2014-10-01
In this paper, a novel output-only damage-detection technique based on time-series models for structural health monitoring in the presence of environmental variability and measurement noise is presented. The large amount of data obtained in the form of time-history responses is transformed using principal component analysis in order to reduce the data size and thereby improve the computational efficiency of the proposed algorithm. The time instant of damage is obtained by fitting the acceleration time-history data from the structure using autoregressive (AR) and AR-with-exogenous-inputs time-series prediction models. The probability density functions (PDFs) of damage features obtained from the variances of prediction errors corresponding to reference (healthy) and current data are found to shift from each other due to the presence of various uncertainties such as environmental variability and measurement noise. Control limits based on a novelty index are obtained using the distances between the peaks of the PDF curves in the healthy condition and are used later for determining the current condition of the structure. Numerical simulation studies have been carried out using a simply supported beam and validated using experimental benchmark data for a three-storey framed bookshelf structure provided by Los Alamos National Laboratory. The studies carried out in this paper clearly indicate the efficiency of the proposed algorithm for damage detection in the presence of measurement noise and environmental variability.
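The data-reduction step via principal component analysis can be sketched with an SVD: multichannel response histories are projected onto a few dominant components before the AR models are fitted. The channel count, latent modes, and noise level below are synthetic stand-ins, not the paper's benchmark data.

```python
import numpy as np

def pca_reduce(Y, k):
    """Project multichannel response histories (channels x samples)
    onto their first k principal components, shrinking the data volume
    while retaining most of the variance."""
    Yc = Y - Y.mean(axis=1, keepdims=True)
    U, s, Vt = np.linalg.svd(Yc, full_matrices=False)
    return U[:, :k].T @ Yc               # k reduced time histories

# Toy demo: 12 sensor channels driven by 2 latent modes plus noise
rng = np.random.default_rng(3)
modes = rng.standard_normal((2, 5000))
mixing = rng.standard_normal((12, 2))
Y = mixing @ modes + 0.05 * rng.standard_normal((12, 5000))
Z = pca_reduce(Y, k=2)
```

The reduced series `Z` carry nearly all of the response energy in far fewer channels, which is what makes the subsequent AR prediction-error analysis computationally cheap.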
Hu, Kaifeng; Ellinger, James J.; Chylla, Roger A.; Markley, John L.
2011-01-01
Time-zero 2D 13C HSQC (HSQC0) spectroscopy offers advantages over traditional 2D NMR for quantitative analysis of solutions containing a mixture of compounds because the signal intensities are directly proportional to the concentrations of the constituents. The HSQC0 spectrum is derived from a series of spectra collected with increasing repetition times within the basic HSQC block by extrapolating the repetition time to zero. Here we present an alternative approach to data collection, gradient-selective time-zero 1H-13C HSQC0 in combination with fast maximum likelihood reconstruction (FMLR) data analysis and the use of two concentration references for absolute concentration determination. Gradient-selective data acquisition results in cleaner spectra, and NMR data can be acquired in both constant-time and non-constant time mode. Semi-automatic data analysis is supported by the FMLR approach, which is used to deconvolute the spectra and extract peak volumes. The peak volumes obtained from this analysis are converted to absolute concentrations by reference to the peak volumes of two internal reference compounds of known concentration: DSS (4,4-dimethyl-4-silapentane-1-sulfonic acid) at the low concentration limit (which also serves as chemical shift reference) and MES (2-(N-morpholino)ethanesulfonic acid) at the high concentration limit. The linear relationship between peak volumes and concentration is better defined with two references than with one, and the measured absolute concentrations of individual compounds in the mixture are more accurate. We compare results from semi-automated gsHSQC0 with those obtained by the original manual phase-cycled HSQC0 approach. The new approach is suitable for automatic metabolite profiling by simultaneous quantification of multiple metabolites in a complex mixture. PMID:22029275
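The time-zero extrapolation and two-reference calibration described above amount to two linear fits: attenuation is multiplicative per HSQC repetition, so a fit of log volume versus repetition number extrapolates to the time-zero volume, and two internal references of known concentration fix the volume-to-concentration line. The attenuation factor, volumes, and reference concentrations below are invented for illustration; only the names DSS and MES come from the abstract.

```python
import numpy as np

def hsqc0_volume(volumes):
    """Extrapolate peak volumes from HSQC1, HSQC2, ... back to 'time
    zero': fit ln(V) versus repetition number n and take exp of the
    intercept at n = 0."""
    n = np.arange(1, len(volumes) + 1)
    slope, intercept = np.polyfit(n, np.log(volumes), 1)
    return np.exp(intercept)

# Toy demo with a per-repetition attenuation factor of 0.85
v0_true = 100.0
volumes = v0_true * 0.85 ** np.arange(1, 4)        # HSQC1..HSQC3
v0 = hsqc0_volume(volumes)

# Two-point concentration calibration against internal references
# (low and high references, as with DSS and MES; numbers illustrative)
ref_v0 = np.array([hsqc0_volume(0.5 * 0.85 ** np.arange(1, 4)),
                   hsqc0_volume(50.0 * 0.85 ** np.arange(1, 4))])
ref_conc = np.array([0.1, 10.0])                   # mM, assumed known
a, b = np.polyfit(ref_v0, ref_conc, 1)
concentration = a * v0 + b
```

With one reference the calibration line is forced through a single point; the two-reference fit pins both slope and intercept, which is why the abstract reports more accurate absolute concentrations.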
Results from the JPL IGS Analysis Center IGS14 Reprocessing Campaign
NASA Astrophysics Data System (ADS)
Ries, P. A.; Amiri, N.; Heflin, M. B.; Sakumura, C.; Sibois, A. E.; Sibthorpe, A.; David, M. W.
2017-12-01
The JPL IGS analysis center has begun a campaign to reprocess GPS orbits and clocks in the IGS14 reference frame. Though the new frame is only a few millimeters offset from the previous IGb08 frame, a reprocessing is required for consistent use of the new frame due to a change in the satellite phase center offsets between the frames. We will present results on the reprocessing campaign from 2002 to present in order to evaluate any effects caused by the new frame. We also create long-term time-series and periodograms of translation, rotation, and scale parameters to see if there is any divergence between the frames. We will also process long-term PPP time series and derived velocities for a well-distributed set of stations in each frame to compare with the published frame offsets.
NASA Astrophysics Data System (ADS)
Zhang, Caiyun; Smith, Molly; Lv, Jie; Fang, Chaoyang
2017-05-01
Mapping plant communities and documenting their changes is critical to the on-going Florida Everglades restoration project. In this study, a framework was designed to map dominant vegetation communities and inventory their changes in the Florida Everglades Water Conservation Area 2A (WCA-2A) using time series Landsat images spanning 1996-2016. The object-based change analysis technique was combined in the framework. A hybrid pixel/object-based change detection approach was developed to effectively collect training samples for historical images with sparse reference data. An object-based quantification approach was also developed to assess the expansion/reduction of a specific class such as cattail (an invasive species in the Everglades) from the object-based classifications of two dates of imagery. The study confirmed the results in the literature that cattail was largely expanded during 1996-2007. It also revealed that cattail expansion was constrained after 2007. Application of time series Landsat data is valuable to document vegetation changes for the WCA-2A impoundment. The digital techniques developed will benefit global wetland mapping and change analysis in general, and the Florida Everglades WCA-2A in particular.
NASA Technical Reports Server (NTRS)
Ray, R. D.; Beckley, B. D.; Lemoine, F. G.
2010-01-01
A somewhat unorthodox method for determining vertical crustal motion at a tide-gauge location is to difference the sea level time series with an equivalent time series determined from satellite altimetry. To the extent that both instruments measure an identical ocean signal, the difference will be dominated by vertical land motion at the gauge. We revisit this technique by analyzing sea level signals at 28 tide gauges that are colocated with DORIS geodetic stations. Comparisons of altimeter-gauge vertical rates with DORIS rates yield a median difference of 1.8 mm/yr and a weighted root-mean-square difference of 2.7 mm/yr. The latter suggests that our uncertainty estimates, which are primarily based on an assumed AR(1) noise process in all time series, underestimate the true errors. Several sources of additional error are discussed, including possible scale errors in the terrestrial reference frame, to which altimeter-gauge rates are mostly insensitive. One of our stations, Malé, Maldives, which has been the subject of some uninformed arguments about sea-level rise, is found to have almost no vertical motion, and thus is vulnerable to rising sea levels. Published by Elsevier Ltd. on behalf of COSPAR.
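At its core, the altimeter-minus-gauge technique is a trend estimate on a difference series: the common ocean signal cancels and the residual slope is the land motion. The sketch below uses synthetic data with an imposed subsidence rate; no attempt is made to model the AR(1) noise used for the paper's uncertainty estimates.

```python
import numpy as np

def vertical_land_motion(t, altimeter, gauge):
    """Linear rate of (altimeter minus tide gauge); the common ocean
    signal cancels and the residual slope estimates vertical land
    motion at the gauge (negative = subsidence)."""
    rate, _ = np.polyfit(t, altimeter - gauge, 1)
    return rate

# Synthetic demo: shared ocean signal, land subsiding at 1.8 mm/yr
rng = np.random.default_rng(4)
t = np.linspace(2002.0, 2012.0, 520)                  # ~weekly, 10 years
ocean = 30 * np.sin(2 * np.pi * t)                    # seasonal sea level, mm
altimeter = ocean + 2 * rng.standard_normal(t.size)   # geocentric sea level
gauge = ocean + 1.8 * (t - t[0]) + 2 * rng.standard_normal(t.size)
rate = vertical_land_motion(t, altimeter, gauge)
```

Because the gauge measures sea level relative to the land, subsiding land shows up as extra sea-level rise in the gauge record, and the altimeter-minus-gauge slope recovers it with the expected negative sign.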
A 19-year radar altimeter elevation change time-series of the East and West Antarctic ice sheets
NASA Astrophysics Data System (ADS)
Sundal, A. V.; Shepherd, A.; Wingham, D.; Muir, A.; Mcmillan, M.; Galin, N.
2012-12-01
We present 19 years of continuous radar altimeter observations of the East and West Antarctic ice sheets acquired by the ERS-1, ERS-2, and ENVISAT satellites between May 1992 and September 2010. Time-series of surface elevation change were developed at 39,375 crossing points of the satellite orbit ground tracks using the method of dual cycle crossovers (Zwally et al., 1989; Wingham et al., 1998). In total, 46.5 million individual measurements were included in the analysis, encompassing 74 and 76 % of the East and West Antarctic ice sheet, respectively. The satellites were cross-calibrated by calculating differences between elevation changes occurring during periods of mission overlap. We use the merged time-series to explore spatial and temporal patterns of elevation change and to characterise and quantify the signals of Antarctic ice sheet imbalance. References: Wingham, D., Ridout, A., Scharroo, R., Arthern, R. & Shum, C.K. (1998): Antarctic elevation change from 1992 to 1996. Science, 282, 456-458. Zwally, H. J., Brenner, A. C., Major, J. A., Bindschadler, R. A. & Marsh, J. G. (1989): Growth of Greenland ice-sheet - measurements. Science, 246, 1587-1589.
Publication dates of the North American Fauna series
Banks, Richard C.
1971-01-01
The correct date of publication of numbers in the well-known North American Fauna series, begun in 1889, was printed on the cover of each issue through No. 48. After that time dates of publication that appear on the covers are either incomplete or incorrect. For taxonomic purposes, for developing a chronological survey of a subject, or for other reasons, the exact date of publication of numbers in this important series is useful. We think it important to call attention to the correct dates of publication for numbers beyond 48. Those dealing with birds are listed below by number, followed by the author's name for ease of reference, the date printed on the cover of the issue, and the correct date of publication in parentheses. Numbers not listed do not relate to birds (see J. Mammal., 51: 845, 1970).
He, Meilin; Shen, Wenbin; Chen, Ruizhi; Ding, Hao; Guo, Guangyi
2017-01-01
The solid Earth deforms elastically in response to variations in surface atmosphere, hydrology, and ice/glacier mass loads. Continuous geodetic observations by Global Positioning System (CGPS) stations and the Gravity Recovery and Climate Experiment (GRACE) record such deformations, allowing seasonal and secular mass changes to be estimated. In this paper, we present the seasonal variation of surface mass changes and crustal vertical deformation in the South China Block (SCB), identified from GPS and GRACE observations with records spanning 1999 to 2016. We used 33 CGPS stations to construct time series of coordinate changes, which are decomposed by empirical orthogonal functions (EOFs) in the SCB. The average weighted root-mean-square (WRMS) reduction is 38% when we subtract GRACE-modeled vertical displacements from the GPS time series. The first common mode shows clear seasonal changes, indicating seasonal surface mass re-distribution in and around the South China Block. The correlation between the GRACE and GPS time series is analyzed, which provides a reference for further improvement of the seasonal variation of CGPS time series. Inversion of the GRACE observations yields the surface deformation caused by surface mass load changes, at a rate of about −0.4 to −0.8 mm/year, which is used to correct the long-term non-tectonic loading trend in the GPS vertical velocity field and thereby better explain crustal tectonic movement in the SCB and surroundings. PMID:29301236
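The WRMS reduction quoted above can be computed as follows. The series here are synthetic stand-ins for a GPS height series and GRACE-modelled vertical displacements; the 38% figure in the abstract is a real-data result, not reproduced by this toy.

```python
import numpy as np

def wrms(x, w):
    """Weighted root-mean-square of a residual series."""
    return np.sqrt(np.sum(w * x ** 2) / np.sum(w))

def wrms_reduction(gps, grace_model, w=None):
    """Fractional WRMS reduction when GRACE-modelled vertical
    displacements are subtracted from a GPS height series."""
    if w is None:
        w = np.ones_like(gps)
    return 1.0 - wrms(gps - grace_model, w) / wrms(gps, w)

# Toy demo: a seasonal loading signal partially captured by GRACE
rng = np.random.default_rng(5)
t = np.linspace(0, 8, 400)                    # years
loading = 5 * np.sin(2 * np.pi * t)           # mm, annual cycle
gps = loading + 2 * rng.standard_normal(t.size)
grace = 0.8 * loading                         # model captures 80% of signal
reduction = wrms_reduction(gps, grace)
```

A positive reduction means the GRACE model explains part of the GPS scatter; weights (e.g. inverse variances of the daily solutions) can be passed via `w`.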
Sriyudthsak, Kansuporn; Iwata, Michio; Hirai, Masami Yokota; Shiraishi, Fumihide
2014-06-01
The availability of large-scale datasets has led to more effort being made to understand the characteristics of metabolic reaction networks. However, because large-scale data are semi-quantitative and may contain biological variations and/or analytical errors, it remains a challenge to construct a mathematical model with precise parameters using only these data. The present work proposes a simple method, referred to as PENDISC (Parameter Estimation in a Non-DImensionalized S-system with Constraints), to assist the complex process of parameter estimation in the construction of a mathematical model for a given metabolic reaction system. The PENDISC method was evaluated using two simple mathematical models: a linear metabolic pathway model with inhibition and a branched metabolic pathway model with inhibition and activation. The results indicate that a smaller number of data points and rate-constant parameters enhances the agreement between calculated values and time-series data of metabolite concentrations, and leads to faster convergence when the same initial estimates are used for the fitting. The method is also shown to be applicable to noisy time-series data and to unmeasurable metabolite concentrations in a network, and to have the potential to handle metabolome data from a relatively large-scale metabolic reaction system. Furthermore, it was applied to aspartate-derived amino acid biosynthesis in the plant Arabidopsis thaliana. The result confirms that the constructed mathematical model satisfactorily agrees with the time-series datasets of seven metabolite concentrations.
van Leeuwen, Willem J. D.
2008-01-01
This study examines how satellite-based time-series vegetation greenness data and phenological measurements can be used to monitor and quantify vegetation recovery after wildfire disturbances, and how pre-fire fuel-reduction restoration treatments affect fire severity and vegetation recovery trajectories. Pairs of wildfire-affected sites and nearby unburned reference sites were chosen to measure post-disturbance recovery in relation to climate variation. All site pairs were chosen in forested uplands in Arizona and were restricted to the area of the Rodeo-Chediski fire that occurred in 2002. Fuel reduction treatments were performed in 1999 and 2001. The inter-annual and seasonal vegetation dynamics before, during, and after wildfire events can be monitored using a time series of biweekly composited MODIS NDVI (Moderate Resolution Imaging Spectroradiometer Normalized Difference Vegetation Index) data. Time-series analysis methods included difference metrics, smoothing filters, and fitting functions that were applied to extract seasonal and inter-annual change and phenological metrics from the NDVI time series data from 2000 to 2007. Pre- and post-fire Landsat data were used to compute the Normalized Burn Ratio (NBR) and examine burn severity at the selected sites. The phenological metrics (pheno-metrics) included the timing and greenness (i.e., NDVI) for the start, peak and end of the growing season, as well as proxy measures for the rate of green-up and senescence and the annual vegetation productivity. Pre-fire fuel reduction treatments resulted in lower fire severity, which reduced annual productivity much less than in untreated areas within the Rodeo-Chediski fire perimeter. The seasonal metrics were shown to be useful for estimating the rate of post-fire disturbance recovery and the timing of phenological greenness phases.
The use of satellite time series NDVI data and derived pheno-metrics show potential for tracking vegetation cover dynamics and successional changes in response to drought, wildfire disturbances, and forest restoration treatments in fire-suppressed forests. PMID:27879809
Forward Period Analysis Method of the Periodic Hamiltonian System.
Wang, Pengfei
2016-01-01
Using forward period analysis (FPA), we obtain the period of a Morse oscillator and of a mathematical pendulum system to an accuracy of 100 significant digits. From these results, long-term solutions over [0, 10^60] time units, ranging from the Planck time to the age of the universe, are computed reliably and quickly with a parallel multiple-precision Taylor series (PMT) scheme. The application of FPA to periodic systems can greatly reduce the computation time of long-term reliable simulations. This scheme provides an efficient way to generate reference solutions against which long-term simulations using other schemes can be tested.
NASA Astrophysics Data System (ADS)
Usowicz, Jerzy, B.; Marczewski, Wojciech; Usowicz, Boguslaw; Lipiec, Jerzy; Lukowski, Mateusz I.
2010-05-01
This paper presents the results of a time series analysis of soil moisture observed at two test sites, Podlasie and Polesie, in the Cal/Val AO 3275 campaigns in Poland during 2006-2009. The test sites were selected on the basis of their contrasting hydrological conditions: the Podlasie region (Trzebieszow) is essentially drier than the wetland region of Polesie (Urszulin). It is worth noting that soil moisture variations can be represented as a non-stationary random process, and therefore appropriate analysis methods are required. The Empirical Mode Decomposition (EMD) method was chosen, since it is one of the best methods for the analysis of non-stationary and nonlinear time series. To confirm the results obtained by EMD, we have also used wavelet methods. Firstly, we used EMD (analysis step) to decompose the original time series into so-called Intrinsic Mode Functions (IMFs) and then, by grouping and adding similar IMFs (synthesis step), obtained a few signal components with corresponding temporal scales. Such an adaptive procedure decomposes the original time series into diurnal, seasonal, and trend components. Revealing all temporal scales that operate in the original time series is our main objective, and this approach may prove useful in other studies. Secondly, we analyzed the soil moisture time series from both sites using cross-wavelet and wavelet coherency methods. These methods allow us to study the degree of spatial coherence, which may vary over different intervals of time. We hope the obtained results provide some hints and guidelines for the validation of ESA SMOS data. References: Usowicz, B., Usowicz, J.B., Spatial and temporal variation of selected physical and chemical properties of soil, Institute of Agrophysics, Polish Academy of Sciences, Lublin, 2004, ISBN 83-87385-96-4. Rao, A.R., Hsu, E.-C., Hilbert-Huang Transform Analysis of Hydrological and Environmental Time Series, Springer, 2008, ISBN 978-1-4020-6453-1. Acknowledgements: This work was funded in part by the PECS - Programme for European Cooperating States, No. 98084 "SWEX/R - Soil Water and Energy Exchange/Research".
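A minimal sketch of the EMD analysis step (sifting out one IMF at a time via cubic-spline envelopes of the local extrema) could look like the following; the fixed sift count, natural boundary conditions, and absence of end-effect handling are simplifying assumptions relative to production EMD codes:

```python
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.signal import argrelextrema

def sift_once(h, t):
    """One sifting pass: subtract the mean of the extrema envelopes."""
    maxi = argrelextrema(h, np.greater)[0]
    mini = argrelextrema(h, np.less)[0]
    if len(maxi) < 4 or len(mini) < 4:  # too few extrema for envelopes
        return None
    upper = CubicSpline(t[maxi], h[maxi], bc_type="natural")(t)
    lower = CubicSpline(t[mini], h[mini], bc_type="natural")(t)
    return h - 0.5 * (upper + lower)

def emd(x, t, n_imfs=2, n_sifts=10):
    """Decompose x into IMFs (fast to slow) plus a residual trend."""
    imfs, resid = [], x.astype(float).copy()
    for _ in range(n_imfs):
        h = resid.copy()
        for _ in range(n_sifts):
            h_new = sift_once(h, t)
            if h_new is None:
                break
            h = h_new
        imfs.append(h)
        resid = resid - h
    return imfs, resid
```

Grouping and summing IMFs with similar mean periods then yields the diurnal, seasonal, and trend components described in the abstract.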
NASA Astrophysics Data System (ADS)
Lasaponara, Rosa; Lanorte, Antonio; Lovallo, Michele; Telesca, Luciano
2015-04-01
Time series can fruitfully support fire monitoring and management, from statistical analysis of fire occurrence (Tuia et al. 2008) to danger estimation (Lasaponara 2005), damage evaluation (Lanorte et al. 2014), and post-fire recovery (Lanorte et al. 2014). In this paper, the time dynamics of SPOT-VEGETATION Normalized Difference Vegetation Index (NDVI) time series are analyzed using the statistical approach of the Fisher-Shannon (FS) information plane to assess and monitor vegetation recovery after fire disturbance. Fisher-Shannon information plane analysis allows us to gain insight into the complex structure of a time series and to quantify its degree of organization and order. The analysis was carried out using 10-day Maximum Value Composites of NDVI (MVC-NDVI) with a 1 km × 1 km spatial resolution. The investigation was performed on two test sites, located in Galizia (northern Spain) and the Peloponnese (southern Greece), selected for the vast fires which occurred during the summers of 2006 and 2007 and for their different vegetation covers, made up mainly of low shrubland at the Galizia test site and evergreen forest in the Peloponnese. Time series of MVC-NDVI were analyzed before and after the occurrence of the fire events. Results obtained for both investigated areas clearly point out that the dynamics of the pixel time series before the fire are characterized by a larger degree of disorder and uncertainty, while the pixel time series after the fire feature a higher degree of organization and order. This discrimination is more evident for the Peloponnese fire than for the Galizia fire, which suggests a clear possibility of discriminating the different post-fire behaviors and dynamics exhibited by the different vegetation covers.
References: Lanorte, A., Lasaponara, R., Lovallo, M., Telesca, L., 2014. Fisher-Shannon information plane analysis of SPOT/VEGETATION Normalized Difference Vegetation Index (NDVI) time series to characterize vegetation recovery after fire disturbance. International Journal of Applied Earth Observation and Geoinformation 26, 441-446. Lanorte, A., Danese, M., Lasaponara, R., Murgante, B., 2014. Multiscale mapping of burn area and severity using multisensor satellite data and spatial autocorrelation analysis. International Journal of Applied Earth Observation and Geoinformation 20, 42-51. Tuia, D., Ratle, F., Lasaponara, R., Telesca, L., Kanevski, M., 2008. Scan statistics analysis of forest fire clusters. Communications in Nonlinear Science and Numerical Simulation 13 (8), 1689-1694. Telesca, L., Lasaponara, R., 2006. Pre- and post-fire behavioral trends revealed in satellite NDVI time series. Geophysical Research Letters 33 (14). Lasaponara, R., 2005. Intercomparison of AVHRR-based fire susceptibility indicators for the Mediterranean ecosystems of southern Italy. International Journal of Remote Sensing 26 (5), 853-870.
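The two FS-plane coordinates for a single pixel time series can be sketched with a Gaussian kernel density estimate; the grid width, the clipping floor, and the default KDE bandwidth are illustrative choices, not those of the cited papers:

```python
import numpy as np
from scipy.stats import gaussian_kde

def fisher_shannon(x, grid_n=1024):
    """Fisher Information Measure (FIM) and Shannon entropy power N_x.
    A more 'ordered' series yields a higher FIM and a lower N_x."""
    x = np.asarray(x, dtype=float)
    kde = gaussian_kde(x)
    g = np.linspace(x.min() - 3 * x.std(), x.max() + 3 * x.std(), grid_n)
    dx = g[1] - g[0]
    f = np.clip(kde(g), 1e-300, None)              # avoid log(0)
    fim = float(np.sum(np.gradient(f, dx) ** 2 / f) * dx)
    entropy = float(-np.sum(f * np.log(f)) * dx)   # differential entropy
    n_x = float(np.exp(2 * entropy) / (2 * np.pi * np.e))
    return fim, n_x
```

Plotting (N_x, FIM) for pre-fire versus post-fire NDVI windows would then separate the disordered and ordered regimes described above.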
Dynamic fMRI of a decision-making task
NASA Astrophysics Data System (ADS)
Singh, Manbir; Sungkarat, Witaya
2008-03-01
A novel fMRI technique has been developed to capture the dynamics of the evolution of brain activity during complex tasks, such as those designed to evaluate the neural basis of decision-making under different situations. The Iowa Gambling Task was used as an example. Six normal human volunteers were studied. The task was presented inside a 3T MRI scanner, and a dynamic fMRI study of the approximately 2 s period between the beginning and end of the decision-making period was conducted by employing a series of reference functions, separated by 200 ms, designed to capture activation at different time points within this period. As decision-making culminates in a button press, the timing of the button press was chosen as the reference (t=0), and the corresponding reference functions were shifted backward in steps of 200 ms from this point up to the time when motor activity from the previous button press became predominant. SPM was used for realignment, high-pass filtering (cutoff 200 s), normalization to the Montreal Neurological Institute (MNI) template using a 12-parameter affine/nonlinear transformation, 8 mm Gaussian smoothing, and event-related General Linear Model analysis for each of the shifted reference functions. The t-score of each activated voxel was then examined to find its peaking time. A random-effects analysis (p<0.05) showed prefrontal, parietal, and bilateral hippocampal activation peaking at different times during the decision-making period in the n=6 group study.
Nonlinear time-series analysis of current signal in cathodic contact glow discharge electrolysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allagui, Anis, E-mail: aallagui@sharjah.ac.ae; Abdelkareem, Mohammad Ali; Rojas, Andrea Espinel
In the standard two-electrode configuration employed in electrolytic processes, when the control dc voltage is brought to a critical value, the system undergoes a transition from conventional electrolysis to contact glow discharge electrolysis (CGDE), which has also been referred to as liquid-submerged micro-plasma, glow discharge plasma electrolysis, electrode effect, electrolytic plasma, etc. The light-emitting process is associated with the development of an irregular and erratic current time series, which has been arbitrarily labelled as "random," and has thus dissuaded further research in this direction. Here, we examine the current time-series signals measured in a cathodic CGDE configuration in a concentrated KOH solution at different dc bias voltages greater than the critical voltage. We show that the signals are, in fact, not random according to the NIST SP 800-22 test suite definition. We also demonstrate that post-processing low-pass filtered sequences requires less time than the native as-measured sequences, suggesting a superposition of low-frequency chaotic fluctuations and high-frequency behaviors (which may be produced by more than one possible source of entropy). Using an array of nonlinear time-series analyses for dynamical systems, i.e., the computation of largest Lyapunov exponents and correlation dimensions, and the reconstruction of phase portraits, we found that low-pass filtered datasets undergo a transition from quasi-periodic to chaotic to quasi-hyper-chaotic behavior, and back again to chaos, when the voltage controlling parameter is increased. The high-frequency part of the signals is discussed in terms of highly nonlinear turbulent motion developed around the working electrode.
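The phase-portrait reconstruction and correlation-dimension steps mentioned above can be sketched as a time-delay embedding followed by a Grassberger-Procaccia correlation sum; the embedding parameters, radii, and the sinusoidal test signal are illustrative stand-ins for the filtered CGDE current records:

```python
import numpy as np
from scipy.spatial.distance import pdist

def delay_embed(x, dim, tau):
    """Time-delay embedding of a scalar series into R^dim."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

def correlation_dimension(pts, r_vals):
    """Slope of log C(r) vs log r, where C(r) is the fraction of
    point pairs closer than r (Grassberger-Procaccia estimate)."""
    d = pdist(pts)
    c = np.array([np.mean(d < r) for r in r_vals])
    slope, _ = np.polyfit(np.log(r_vals), np.log(c), 1)
    return slope
```

For a periodic signal the reconstructed attractor is a closed curve (dimension near 1), while chaotic regimes yield non-integer dimensions.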
Development of web tools to disseminate space geodesy data-related products
NASA Astrophysics Data System (ADS)
Soudarin, L.; Ferrage, P.; Mezerette, A.
2014-12-01
In order to promote the products of the DORIS system, the French Space Agency CNES has developed and implemented, on the web site of the International DORIS Service (IDS), a set of plot tools to interactively build and display time series of site positions, orbit residuals, and terrestrial parameters (scale, geocenter). An interactive global map is also available to select sites and access their information. Besides the products provided by the CNES Orbitography Team and the IDS components, these tools allow comparison of the time evolution of coordinates for collocated DORIS and GNSS stations, thanks to a collaboration with the Terrestrial Frame Combination Center of the International GNSS Service (IGS). The next step, currently in progress, is the creation of a database to improve the robustness and efficiency of the tools, with the objective of offering a complete web service to foster data exchange with the other geodetic services of the International Association of Geodesy (IAG). The possibility of visualizing and comparing position time series of the four main space geodetic techniques (DORIS, GNSS, SLR, and VLBI) is already under way at the French level: a dedicated version of these web tools has been developed for the French Space Geodesy Research Group (GRGS). It will give access to position time series provided by the GRGS Analysis Centers involved in DORIS, GNSS, SLR, and VLBI data processing for the realization of the International Terrestrial Reference Frame. In this presentation, we describe the functionalities of these tools and address some aspects of the time series (content, format).
A synergic simulation-optimization approach for analyzing biomolecular dynamics in living organisms.
Sadegh Zadeh, Kouroush
2011-01-01
A synergic simulation-optimization approach was developed and implemented to study protein-substrate dynamics and binding kinetics in living organisms. The forward problem is a system of several coupled nonlinear partial differential equations which, with a given set of kinetics and diffusion parameters, can provide not only the commonly used bleached-area-averaged time series of fluorescence microscopy experiments but also more informative full biomolecular/drug space-time series, and can be successfully used to study the dynamics of both Dirac and Gaussian fluorescence-labeled biomacromolecules in vivo. An incomplete Cholesky preconditioner was coupled with a finite difference discretization scheme and an adaptive time-stepping strategy to solve the forward problem. The proposed approach was validated against analytical as well as reference solutions and used to simulate the dynamics of GFP-tagged glucocorticoid receptor (GFP-GR) in mouse cancer cells during a fluorescence recovery after photobleaching experiment. Model analysis indicates that the commonly practiced bleach-spot-averaged time series is not an efficient approach to extract physiological information from fluorescence microscopy protocols. It is recommended that experimental biophysicists use the full space-time series resulting from experimental protocols to study the dynamics of biomacromolecules and drugs in living organisms. It is also concluded that, in the parameterization of biological mass transfer processes, setting the norm of the gradient of the penalty function at the solution to zero is not an efficient stopping rule for ending the inverse algorithm; theoreticians should use multi-criteria stopping rules to quantify model parameters by optimization. Copyright © 2010 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Schreiber, Henry D.
1987-01-01
An electrochemical series for redox couples in a glass-forming oxide melt is developed. This series is a quantitative numerical scale of reference reduction potentials of the redox couples in a silicate melt that is a model for basaltic magmas. The redox couples are ordered in terms of their reference reduction potentials; the order appears to be relatively independent of the exact melt composition and temperature. Thus, upon calibration to a desired composition, oxygen fugacity, and temperature, this electrochemical series can provide estimates of redox state proportions in basaltic magmas on different planetary bodies. The geochemical electrochemical series can also be used to understand the interrelationship of the redox state of the magma and the presence of volatile species such as oxygen, water, sulfur gases, and carbon gases.
NASA Astrophysics Data System (ADS)
Werner, C. L.; Wegmuller, U.; Strozzi, T.; Wiesmann, A.
2006-12-01
Principal contributors to the noise in differential SAR interferograms are the temporal phase stability of the surface, geometry relating to baseline and surface slope, and propagation path delay variations due to tropospheric water vapor and the ionosphere. Time series analysis of multiple interferograms generated from a stack of SAR SLC images seeks to determine the deformation history of the surface while reducing errors. Only those scatterers within a resolution element that are stable and coherent for each interferometric pair contribute to the desired deformation signal. Interferograms with baselines exceeding 1/3 of the critical baseline have substantial geometrical decorrelation for distributed targets. Short-baseline pairs with multiple reference scenes can be combined using least-squares estimation to obtain a global deformation solution. Alternatively, point-like persistent scatterers that do not exhibit the geometrical decorrelation associated with large baselines can be identified. In this approach interferograms are formed from a stack of SAR complex images using a single reference scene; stable distributed-scatterer pixels are excluded, however, due to the presence of large baselines. We apply both point-based and short-baseline methodologies and compare results for a stack of fine-beam Radarsat data acquired in 2002-2004 over a rapidly subsiding oil field near Lost Hills, CA. We also investigate the density of point-like scatterers with respect to image resolution. The primary difficulty encountered when applying time series methods is phase unwrapping errors due to spatial and temporal gaps. Phase unwrapping requires sufficient spatial and temporal sampling. Increasing the SAR range bandwidth increases the range resolution as well as the critical interferometric baseline that defines the required satellite orbital tube diameter. Sufficient spatial sampling also permits unwrapping because of the reduced phase-per-pixel gradient.
Short time intervals further reduce the differential phase due to deformation when the deformation is continuous. Lower frequency systems (L- vs. C-Band) substantially improve the ability to unwrap the phase correctly by directly reducing both interferometric phase amplitude and temporal decorrelation.
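The short-baseline least-squares combination can be sketched as a small linear system per pixel: each interferogram observes a phase difference between two acquisition dates, and the date-wise phases are recovered relative to a reference date. This is a bare-bones sketch; real processing adds weighting, atmospheric filtering, and unwrapping checks:

```python
import numpy as np

def sbas_solve(ifg_phase, pairs, n_dates):
    """Least-squares combination of short-baseline interferograms.
    pairs[i] = (master_idx, slave_idx); phase_i = phi[slave] - phi[master].
    Date 0 is held fixed as the reference (phi[0] = 0)."""
    A = np.zeros((len(pairs), n_dates - 1))
    for row, (m, s) in enumerate(pairs):
        if s > 0:
            A[row, s - 1] += 1.0
        if m > 0:
            A[row, m - 1] -= 1.0
    phi, *_ = np.linalg.lstsq(A, np.asarray(ifg_phase, float), rcond=None)
    return np.concatenate(([0.0], phi))
```

As long as the interferogram network is connected, the system has a unique least-squares solution once the reference date is fixed.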
Floris, Ilaria; Billard, Hélène; Boquien, Clair-Yves; Joram-Gauvard, Evelyne; Simon, Laure; Legrand, Arnaud; Boscher, Cécile; Rozé, Jean-Christophe; Bolaños-Jiménez, Francisco; Kaeffer, Bertrand
2015-01-01
Background and Aims: Human breast milk is an extremely dynamic fluid containing many biologically active components which change throughout the feeding period and throughout the day. We designed a miRNA assay for minimal amounts of raw milk obtained from mothers of preterm infants. We investigated changes in miRNA expression within month 2 of lactation and then over the course of 24 hours. Materials and Methods: Analyses were performed on pooled breast milk, made by combining samples collected at different clock times from the same mother donor, along with time series collected over 24 hours from four unsynchronized mothers. Whole milk, lipid, or skim milk fractions were processed and analyzed by qPCR. We measured hsa-miR-16-5p, hsa-miR-21-5p, hsa-miR-146-5p, and hsa-let-7a, d and g (all -5p). Stability of miRNA endogenous controls was evaluated using RefFinder, a web tool integrating geNorm, NormFinder, BestKeeper, and the comparative ΔΔCt method. Results: MiR-21 and miR-16 were stably expressed in whole milk collected within month 2 of lactation from four mothers. Analysis of the lipid and skim milk fractions revealed that miR-146b and let-7d were better references in both fractions. Time series (5H-23H) allowed the identification of a set of three endogenous reference genes (hsa-let-7d, hsa-let-7g and miR-146b) to normalize raw quantification cycle (Cq) data. We identified a daily oscillation of miR-16-5p. Perspectives: Our assay allows exploration of the miRNA levels of breast milk from mothers of preterm babies collected in time series over 48–72 hours. PMID:26474056
On the Long-Term "Hesitation Waltz" Between the Earth's Figure and Rotation Axes
NASA Astrophysics Data System (ADS)
Couhert, A.; Mercier, F.; Bizouard, C.
2017-12-01
The principal figure axis of the Earth is its axis of maximum inertia. In the absence of external torques, the latter should closely coincide with the rotation pole when averaged over many years. However, because of tidal and non-tidal mass redistributions within the Earth system, the rotational axis executes a circular motion around the figure axis, essentially at seasonal time scales. In between, it is not clear what happens at decadal time spans and how well the two axes are aligned. The long record of accurate Satellite Laser Ranging (SLR) observations to Lageos makes it possible to directly measure the long-term displacement of the figure axis with respect to the crust, through the determination of the degree-2 order-1 geopotential coefficients for the 34-year period 1983-2017. On the other hand, the pole coordinate time series (mainly from GNSS and VLBI data) yield the motion of the rotation pole with even greater accuracy. This study focuses on the analysis of the long-term behavior of the two time series, as well as the derivation of possible explanations for their discrepancies.
Kronberg, J.W.
1992-06-02
A sequential power-up circuit for starting several electrical load elements in series to avoid excessive current surge, comprising a voltage ramp generator and a set of voltage comparators, each comparator having a different reference voltage and interfacing with a switch that is capable of turning on one of the load elements. As the voltage rises, it passes the reference voltages one at a time and causes the switch corresponding to that voltage to turn on its load element. The ramp is turned on and off by a single switch or by a logic-level electrical signal. The ramp rate for turning on the load element is relatively slow and the rate for turning the elements off is relatively fast. Optionally, the duration of each interval of time between the turning on of the load elements is programmable. 2 figs.
Leo Kanner's Mention of 1938 in His Report on Autism Refers to His First Patient.
Olmsted, Dan; Blaxill, Mark
2016-01-01
Leo Kanner begins his landmark 1943 case series on autistic children by stating the condition was first brought to his attention in 1938. Recent letters to JADD have described this reference as "mysterious" and speculated it refers to papers published that year by Despert or Asperger. In fact, as Kanner goes on to state, 1938 is when he examined the first child in his case series. An exchange of letters with Despert and later writing by Kanner also point to the originality of his observations.
Simulation Study Using a New Type of Sample Variance
NASA Technical Reports Server (NTRS)
Howe, D. A.; Lainson, K. J.
1996-01-01
Using simulated data, we evaluate a new type of sample variance for the characterization of frequency stability. The new statistic (referred to as TOTALVAR, with square root TOTALDEV) is a better predictor of long-term frequency variations than the present sample Allan deviation. The statistical model uses the assumption that a time series of phase or frequency differences is wrapped (periodic), with the overall frequency difference removed. We find that the variability at long averaging times is reduced considerably for the five models of power-law noise commonly encountered with frequency standards and oscillators.
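For context, the conventional overlapping Allan deviation that TOTALDEV is compared against can be computed from phase data as follows; the white-FM synthetic check below is an illustrative test, not data from the paper:

```python
import numpy as np

def overlapping_adev(phase, tau0, m):
    """Overlapping Allan deviation from phase data x (seconds),
    with sample interval tau0 and averaging factor m (tau = m*tau0)."""
    x = np.asarray(phase, dtype=float)
    tau = m * tau0
    # overlapping second differences of the phase at stride m
    d2 = x[2 * m:] - 2.0 * x[m:-m] + x[:-2 * m]
    avar = np.sum(d2 ** 2) / (2.0 * tau ** 2 * len(d2))
    return np.sqrt(avar)
```

For white frequency noise, the Allan deviation scales as tau^(-1/2), so quadrupling the averaging time should roughly halve the deviation.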
Propagation of stage measurement uncertainties to streamflow time series
NASA Astrophysics Data System (ADS)
Horner, Ivan; Le Coz, Jérôme; Renard, Benjamin; Branger, Flora; McMillan, Hilary
2016-04-01
Streamflow uncertainties due to stage measurement errors are generally overlooked in the promising probabilistic approaches that have emerged in the last decade. We introduce an original error model for propagating stage uncertainties through a stage-discharge rating curve within a Bayesian probabilistic framework. The method takes into account both rating curve uncertainty (parametric and structural errors) and stage uncertainty (systematic and non-systematic errors). Practical ways to estimate the different types of stage errors are also presented: (1) non-systematic errors due to instrument resolution and precision and to non-stationary waves, and (2) systematic errors due to gauge calibration against the staff gauge. The method is illustrated at a site where the rating-curve-derived streamflow can be compared with an accurate streamflow reference. The agreement between the two time series is satisfactory overall. Moreover, the quantification of uncertainty is also satisfactory, since the streamflow reference is compatible with the streamflow uncertainty intervals derived from the rating curve and the stage uncertainties. Illustrations from other sites are also presented. Results contrast markedly depending on site features: in some cases, streamflow uncertainty is mainly due to stage measurement errors. The results also show the importance of discriminating between systematic and non-systematic stage errors, especially for long-term flow averages. Perspectives for improving and validating the streamflow uncertainty estimates are finally discussed.
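The distinction between systematic and non-systematic stage errors can be illustrated with a plain Monte Carlo propagation through a power-law rating curve Q = a(h - b)^c; all parameter values, error magnitudes, and the sinusoidal stage series are hypothetical, and the paper's actual method is Bayesian rather than this simple resampling:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical rating curve Q = a * (h - b)^c and a year of daily stages (m)
a, b, c = 40.0, 0.2, 1.8
h = 1.0 + 0.5 * np.sin(np.linspace(0.0, 2.0 * np.pi, 365))

n_mc = 2000
sigma_ns = 0.01   # non-systematic stage error per reading: resolution, waves
sigma_sys = 0.02  # systematic gauge-calibration offset, one draw per series

q_sims = np.empty((n_mc, h.size))
for i in range(n_mc):
    dh = rng.normal(0.0, sigma_ns, h.size) + rng.normal(0.0, sigma_sys)
    q_sims[i] = a * np.maximum(h + dh - b, 0.0) ** c

q_lo, q_hi = np.percentile(q_sims, [2.5, 97.5], axis=0)  # daily 95% bands
annual_means = q_sims.mean(axis=1)                       # annual mean flow
```

Because the systematic offset is drawn once per realization, it does not average out over the year, which is why it dominates the uncertainty of long-term mean flows.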
van Leeuwen, Martin; Kremens, Robert L.; van Aardt, Jan
2015-01-01
Photosynthetic light-use efficiency (LUE) has gained wide interest as an input to modeling forest gross primary productivity (GPP). The photochemical reflectance index (PRI) has been identified as a principal means to inform LUE-based models, using airborne and satellite-based observations of canopy reflectance. More recently, low-cost electronics have become available with the potential to provide dense in situ time-series measurements of PRI. A recent design makes use of interference filters to record light transmission within narrow wavebands. Uncertainty remains as to the dynamic range of these sensors, their performance under low light conditions, the placement of the reference band, and the methodology for reflectance calibration. This paper presents a low-cost sensor design, which is tested in a laboratory set-up as well as in the field. The results demonstrate excellent performance against a calibration standard (R2 = 0.9999) and at low light conditions. Radiance measurements over vegetation demonstrate a reversible reduction in green reflectance that was, however, seen in both the reference and signal wavebands. Time-series field measurements of PRI in a Douglas-fir canopy showed a weak correlation with eddy-covariance-derived LUE and a significant decline in PRI over the season. Effects of light quality, bidirectional scattering effects, and possible sensor artifacts on PRI are discussed. PMID:25951342
Modeling PSInSAR time series without phase unwrapping
Zhang, L.; Ding, X.; Lu, Z.
2011-01-01
In this paper, we propose a least-squares-based method for multitemporal synthetic aperture radar interferometry that allows one to estimate deformations without the need of phase unwrapping. The method utilizes a series of multimaster wrapped differential interferograms with short baselines and focuses on arcs at which there are no phase ambiguities. An outlier detector is used to identify and remove the arcs with phase ambiguities, and a pseudoinverse of the variance-covariance matrix is used as the weight matrix of the correlated observations. The deformation rates at coherent points are estimated with a least squares model constrained by reference points. The proposed approach is verified with a set of simulated data.
Globally-Gridded Interpolated Night-Time Marine Air Temperatures 1900-2014
NASA Astrophysics Data System (ADS)
Junod, R.; Christy, J. R.
2016-12-01
Over the past century, climate records have pointed to an increase in the global near-surface average temperature. Near-surface air temperature over the oceans is a relatively unused parameter in understanding the current state of the climate, but it is useful as an independent temperature metric over the oceans and serves as a geographical and physical complement to near-surface air temperature over land. Though versions of this dataset exist (i.e., HadMAT1 and HadNMAT2), it has been strongly recommended that various groups generate climate records independently. This University of Alabama in Huntsville (UAH) study began with the construction of monthly night-time marine air temperature (UAHNMAT) values from the early twentieth century through to the present era. Data from the International Comprehensive Ocean and Atmosphere Data Set (ICOADS) were used to compile a time series of gridded UAHNMAT (20°S-70°N). This time series was homogenized to correct for the many biases such as increasing ship height, solar deck heating, etc. The time series of UAHNMAT, once adjusted to a standard reference height, is gridded to 1.25° pentad grid boxes and interpolated using the kriging technique. This study presents results which quantify the variability and trends and compares them to current trends of other related datasets, including HadNMAT2 and sea-surface temperatures (HadISST and ERSSTv4).
Wohlin, Åsa
2015-03-21
The distribution of codons in the nearly universal genetic code is a long-discussed issue. At the atomic level, the numeral series 2x^2 (x = 5 to 0) lies behind electron shells and orbitals. Numeral series appear in formulas for the spectral lines of hydrogen. The question here was whether some similar scheme could be found in the genetic code. A table of 24 codons was constructed (synonyms counted as one) for 20 amino acids, four of which have two different codons. An atomic mass analysis was performed, built on common isotopes. It was found that a numeral series 5 to 0 with exponent 2/3, times 10^2, revealed detailed congruency with codon-grouped amino acid side-chains, simultaneously with the division into atom kinds, further with main 3rd-base groups, backbone chains, and with codon-grouped amino acids in relation to their origin from glycolysis or the citrate cycle. Hence, it is proposed that this series may, in a dynamic way, have guided the selection of amino acids into codon domains. Series with simpler exponents also showed noteworthy correlations with the atomic mass distribution on main codon domains; especially the 2x^2 series times a factor of 16 appeared as a conceivable underlying level, both for the atomic mass and the charge distribution. Furthermore, it was found that atomic mass transformations between numeral systems, possibly interpretable as dimension-degree steps, connected the atomic mass of codon bases with codon-grouped amino acids and with the exponent 2/3 series in several astonishing ways. Thus, it is suggested that they may be part of a deeper reference system. Copyright © 2015 The Author. Published by Elsevier Ltd. All rights reserved.
Biogeochemical Response to Mesoscale Physical Forcing in the California Current System
NASA Technical Reports Server (NTRS)
Niiler, Pearn P.; Letelier, Ricardo; Moisan, John R.; Marra, John A. (Technical Monitor)
2001-01-01
In the first part of the project, we investigated the local response of coastal ocean ecosystems (changes in chlorophyll concentration and chlorophyll fluorescence quantum yield) to physical forcing by developing and deploying Autonomous Drifting Ocean Stations (ADOS) within several mesoscale features along the U.S. west coast. We also compared the temporal and spatial variability registered by sensors mounted on the drifters to that registered by sensors mounted on satellites, in order to assess the scales of variability that are not resolved by the ocean color satellite. The second part of the project used the existing WOCE SVP Surface Lagrangian drifters to track individual water parcels through time. The individual drifter tracks were used to generate multivariate time series by interpolating/extracting the biological and physical data fields retrieved by remote sensors (ocean color, SST, wind speed and direction, wind stress curl, and sea level topography). The individual time series of the physical data (AVHRR, TOPEX, NCEP) were analyzed against the ocean color (SeaWiFS) time series to determine the time scale of biological response to physical forcing. The results from this part of the research are being used to compare the decorrelation scales of chlorophyll in Lagrangian and Eulerian frameworks. The results from both parts of this research augmented the time series data needed to investigate the interactions between ocean mesoscale features, wind, and biogeochemical processes. Using the historical Lagrangian data sets, we have completed a comparison of the decorrelation scales in both the Eulerian and Lagrangian reference frames for the SeaWiFS data set. We are continuing to investigate how these results might be used in objective mapping efforts.
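A decorrelation scale in either reference frame can be estimated as the first e-folding lag of the sample autocorrelation of a chlorophyll series (drifter-following or fixed-point); the AR(1) test series below is synthetic, and the e-folding criterion is one of several common choices:

```python
import numpy as np

def decorrelation_scale(x, dt=1.0, max_lag=None):
    """First lag at which the sample autocorrelation drops below 1/e."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    max_lag = max_lag or min(n // 4, 500)
    acf = np.array([np.sum(x[:n - k] * x[k:]) / np.sum(x * x)
                    for k in range(max_lag)])
    below = np.where(acf < np.exp(-1.0))[0]
    return (below[0] if below.size else max_lag) * dt
```

Comparing this scale along drifter tracks versus at fixed grid points quantifies the Lagrangian-Eulerian difference discussed above.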
ERIC Educational Resources Information Center
Lankes, R. David, Ed.; Collins, John W., III, Ed.; Kasowitz, Abby S., Ed.
This book on digital reference services begins with an introduction entitled "The Foundations of Digital Reference" (R. David Lankes). Part 1, "The New Reference Culture: Traits and Trends," includes the following papers: "Why Reference Is about To Change Forever (but Not Completely)" (Joseph Janes);…
RELATIVE PHOTOMETRY OF HAT-P-1b OCCULTATIONS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beky, Bence; Holman, Matthew J.; Noyes, Robert W.
2013-06-01
We present Hubble Space Telescope (HST) Space Telescope Imaging Spectrograph observations of two occultations of the transiting exoplanet HAT-P-1b. By measuring the planet-to-star flux ratio near opposition, we constrain the geometric albedo of the planet, which is strongly linked to its atmospheric temperature gradient. An advantage of HAT-P-1 as a target is its binary companion ADS 16402 A, which provides an excellent photometric reference, simplifying the usual steps in removing instrumental artifacts from HST time-series photometry. We find that without this reference star, we would need to detrend the lightcurve with the time of the exposures as well as the first three powers of HST orbital phase, and this would introduce a strong bias in the results for the albedo. However, with this reference star, we only need to detrend the data with the time of the exposures to achieve the same per-point scatter; therefore we can avoid most of the bias associated with detrending. Our final result is a 2σ upper limit of 0.64 for the geometric albedo of HAT-P-1b between 577 and 947 nm.
NASA Astrophysics Data System (ADS)
Ohta, Y.; Ohzono, M.; Takahashi, H.; Kawamoto, S.; Hino, R.
2017-12-01
A large and destructive earthquake (Mjma 7.3) occurred on April 15, 2016 in the Kumamoto region of southwestern Japan. It was followed approximately 32 s later by an M 6 earthquake in the central Oita region, whose hypocenter was located 80 km northeast of the hypocenter of the Kumamoto mainshock. This triggered earthquake also produced many aftershocks in and around the Oita region. It is important to understand how such chain-reacting earthquake sequences occur. We used 1 Hz dual-frequency phase and range data from GEONET stations on Kyushu island. The data were processed using GIPSY-OASIS (version 6.4), adopting a kinematic PPP strategy for the coordinate estimation. The reference GPS satellite orbits and 5 s clock information were obtained from the CODE product, and a simple sidereal filter was applied to the estimated time series. Based on the obtained 1 Hz GNSS time series, we estimated the areal strain and principal strain fields using the method of Shen et al. (1996). To assess the dynamic strain, we first calculated the average absolute value of the areal strain field between 60 and 85 s after the origin time of the Kumamoto mainshock, which was used as the "reference" static strain field. Secondly, we estimated the absolute value of the areal strain at each time step. Finally, we calculated the strain ratio at each time step relative to the "reference". This procedure extracts the spatial and temporal characteristics of the dynamic strain at each time step, and the extracted strain ratio clearly shows them. Focusing on the region of the triggered Oita earthquake, the timing of the maximum dynamic strain ratio at the epicenter corresponds exactly to the origin time of the triggered event, strongly suggesting that the large dynamic strain may have triggered the Oita event. The epicenter of the triggered earthquake is located within a geothermal region.
In geothermal regions, the crustal materials are more sensitive to stress perturbations, and earthquakes are more easily triggered than in other typical regions. Our results also suggest that real-time strain field monitoring may provide useful information for assessing the possibility of remotely triggered earthquakes in the future.
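The three-step strain-ratio procedure (reference-window average, per-step absolute strain, ratio) can be sketched for a single site as follows; this is an illustrative Python/NumPy reading of the text, not the authors' code:

```python
import numpy as np

def strain_ratio(areal_strain, t, ref_window=(60.0, 85.0)):
    """Dynamic strain ratio at one site, relative to a reference level.

    The reference is the mean absolute areal strain in the 60-85 s
    window after the mainshock origin time, following the procedure
    described above; variable names are ours.
    """
    mask = (t >= ref_window[0]) & (t <= ref_window[1])
    reference = np.mean(np.abs(areal_strain[mask]))
    return np.abs(areal_strain) / reference

# Synthetic example: a dynamic strain pulse over a quiet background.
t = np.arange(0.0, 120.0, 1.0)                  # seconds after origin time
strain = np.where((t > 30) & (t < 40), 5e-7, 1e-7)
ratio = strain_ratio(strain, t)
```

Applying this per station and per time step gives the spatio-temporal strain-ratio field discussed above.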
Performance of vegetation indices from Landsat time series in deforestation monitoring
NASA Astrophysics Data System (ADS)
Schultz, Michael; Clevers, Jan G. P. W.; Carter, Sarah; Verbesselt, Jan; Avitabile, Valerio; Quang, Hien Vu; Herold, Martin
2016-10-01
The performance of Landsat time series (LTS) of eight vegetation indices (VIs) was assessed for monitoring deforestation across the tropics. Three sites were selected based on differing remote sensing observation frequencies, deforestation drivers and environmental factors. The LTS of each VI was analysed using the Breaks For Additive Season and Trend (BFAST) Monitor method to identify deforestation. A robust reference database was used to evaluate the performance regarding spatial accuracy, sensitivity to observation frequency and combined use of multiple VIs. The canopy-cover-sensitive Normalized Difference Fraction Index (NDFI) was the most accurate. Among those tested, wetness-related VIs (the Normalized Difference Moisture Index (NDMI) and Tasselled Cap wetness (TCw)) were spatially more accurate than greenness-related VIs (the Normalized Difference Vegetation Index (NDVI) and Tasselled Cap greenness (TCg)). When VIs were fused at the feature level, spatial accuracy was improved and overestimation of change was reduced. NDVI and NDFI produced the most robust results when observation frequency varied.
Ip, Ryan H L; Li, W K; Leung, Kenneth M Y
2013-09-15
Large-scale environmental remediation projects applied to sea water always involve large amounts of capital investment. Rigorous effectiveness evaluations of such projects are therefore necessary and essential for policy review and future planning. This study investigates the effectiveness of environmental remediation using three different Seemingly Unrelated Regression (SUR) time series models with intervention effects: Model (1) assumes no correlation within or across variables; Model (2) assumes no correlation across variables but allows correlations within a variable across different sites; and Model (3) allows all possible correlations among variables (i.e., an unrestricted model). The results suggested that the unrestricted SUR model is the most reliable, consistently having the smallest variations of the estimated model parameters. We discuss our results with reference to marine water quality management in Hong Kong, bringing managerial issues into consideration.
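Model (1), which assumes no correlations at all, reduces to equation-by-equation regression with an intervention dummy. A minimal Python/NumPy sketch on synthetic data (a full SUR fit would also model the error covariances across equations, which is omitted here; all names are illustrative):

```python
import numpy as np

def intervention_ols(y, t, t0):
    """Equation-by-equation fit in the spirit of Model (1).

    Regresses one water-quality series y on a constant, a linear trend,
    and a step dummy switching on at the intervention time t0. A full
    SUR fit would additionally model error correlations across
    equations.
    """
    step = (t >= t0).astype(float)
    A = np.column_stack([np.ones_like(t), t, step])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef  # [intercept, trend, intervention effect]

# Synthetic series whose level drops by 0.5 after the intervention.
t = np.arange(100.0)
y = 2.0 + 0.01 * t - 0.5 * (t >= 60)
coef = intervention_ols(y, t, t0=60.0)
```

The estimated step coefficient is the intervention effect; Models (2) and (3) generalize this by jointly estimating the error covariance structure.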
Moody, George B; Mark, Roger G; Goldberger, Ary L
2011-01-01
PhysioNet provides free web access to over 50 collections of recorded physiologic signals and time series, and related open-source software, in support of basic, clinical, and applied research in medicine, physiology, public health, biomedical engineering and computing, and medical instrument design and evaluation. Its three components (PhysioBank, the archive of signals; PhysioToolkit, the software library; and PhysioNetWorks, the virtual laboratory for collaborative development of future PhysioBank data collections and PhysioToolkit software components) connect researchers and students who need physiologic signals and relevant software with researchers who have data and software to share. PhysioNet's annual open engineering challenges stimulate rapid progress on unsolved or poorly solved questions of basic or clinical interest, by focusing attention on achievable solutions that can be evaluated and compared objectively using freely available reference data.
NASA Astrophysics Data System (ADS)
Sigro, J.; Brunet, M.; Aguilar, E.; Stoll, H.; Jimenez, M.
2009-04-01
The Spanish-funded research project Rapid Climate Changes in the Iberian Peninsula (IP) Based on Proxy Calibration, Long Term Instrumental Series and High Resolution Analyses of Terrestrial and Marine Records (CALIBRE: ref. CGL2006-13327-C04/CLI) has as its main objective to analyse climate dynamics during periods of rapid climate change by developing high-resolution paleoclimate proxy records from marine and terrestrial (lake and cave) deposits over the IP and calibrating them with long-term, high-quality instrumental climate time series. Under CALIBRE, the coordinated project Developing and Enhancing a Climate Instrumental Dataset for Calibrating Climate Proxy Data and Analysing Low-Frequency Climate Variability over the Iberian Peninsula (CLICAL: CGL2006-13327-C04-03/CLI) is devoted to the development of homogenised climate records and sub-regional time series which can be confidently used in the calibration of the lacustrine, marine and speleothem time series generated under CALIBRE. Here we present the procedures followed to homogenise a dataset of monthly maximum and minimum temperature and precipitation data over the Spanish northern coast. The dataset is composed of thirty long monthly precipitation records and twenty temperature records. The data are quality controlled following the procedures recommended by Aguilar et al. (2003), and tested for homogeneity and adjusted following the approach adopted by Brunet et al. (2008). Sub-regional time series of precipitation and maximum and minimum temperature for the period 1853-2007 have been generated by averaging monthly anomalies and then adding back the base-period mean, according to the method of Jones and Hulme (1996). A method to adjust the variance bias present in regional time series associated with sample size varying over time has also been applied (Osborn et al., 1997).
The results of this homogenisation exercise and the development of the associated sub-regional time series will be discussed in detail. Initial comparisons with rapidly growing speleothems in two different caves indicate that speleothem trace element ratios such as Ba/Ca are recording the decrease in littoral precipitation over the last several decades. References: Aguilar, E., Auer, I., Brunet, M., Peterson, T. C. and Wieringa, J., 2003. Guidelines on Climate Metadata and Homogenization, WMO-TD No. 1186 / WCDMP No. 53, Geneva, 51 pp. Brunet, M., Saladié, O., Jones, P., Sigró, J., Aguilar, E., Moberg, A., Lister, D., Walther, A. and Almarza, C., 2008. A case-study/guidance on the development of long-term daily adjusted temperature datasets, WMO-TD No. 1425 / WCDMP No. 66, Geneva, 43 pp. Jones, P. D. and Hulme, M., 1996. Calculating regional climatic time series for temperature and precipitation: methods and illustrations, Int. J. Climatol., 16, 361-377. Osborn, T. J., Briffa, K. R. and Jones, P. D., 1997. Adjusting variance for sample-size in tree-ring chronologies and other regional mean time series, Dendrochronologia, 15, 89-99.
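The Jones and Hulme (1996) anomaly-averaging step described above can be sketched as follows; this is a simplified Python/NumPy illustration for one calendar month, with hypothetical array shapes and a 1961-1990 base period chosen only for the example:

```python
import numpy as np

def regional_series(data, years, base=(1961, 1990)):
    """Anomaly-average station records into one sub-regional series.

    data: (n_stations, n_years) values for one calendar month, NaNs
    allowed. Each station is reduced to anomalies from its own
    base-period mean, anomalies are averaged across stations, and the
    regional base-period mean is added back.
    """
    in_base = (years >= base[0]) & (years <= base[1])
    station_base = np.nanmean(data[:, in_base], axis=1, keepdims=True)
    regional = np.nanmean(data - station_base, axis=0)
    return regional + np.nanmean(station_base)

# Two synthetic stations sharing a trend, one 3 K warmer than the other.
years = np.arange(1950, 2001)
signal = 0.01 * (years - 1950)
data = np.vstack([signal, signal + 3.0])
regional = regional_series(data, years)
```

Averaging anomalies rather than raw values keeps stations with different climatological levels (or gappy records) from biasing the regional mean; the variance adjustment of Osborn et al. (1997) would be applied afterwards.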
Bai, Ru-feng; Ma, Shu-hua; Zhang, Hai-dong; Chang, Lin; Zhang, Zhong; Liu, Li; Zhang, Feng-qin; Guo, Zhao-ming; Shi, Mei-sen
2014-03-01
A fragment of an injury instrument is sometimes left in a wound, and the suspected instrument can be identified by elemental analysis comparing it with the fragment left behind. In this study, three brands (Shibazi, Zhangxiaoquan, Qiaoxifu) of kitchen knives with forged, chop, and slice application series were analyzed by inductively coupled plasma atomic emission spectroscopy (ICP-AES) and infrared absorption to investigate the type and number of elements and the reference range used for comparison. The results show that when one or more elements are taken as the discriminative threshold, together with a 5% relative standard deviation (RSD) as the reference range, all the samples could be distinguished among different series. Furthermore, within the same series, the discriminative capability could reach up to 88.57% for all samples. In addition, elements with high content, such as Cr, Mn, and C, were useful for discriminating among different series, and trace elements, such as Ni, Si, and Cu, were useful within the same series. However, in practice, it is necessary to evaluate the accuracy of the method with a Standard Reference Material (SRM) before an examination is performed.
NASA Astrophysics Data System (ADS)
Vershkov, A. N.; Petrovskaya, M. S.
2016-11-01
Due to their very complex mathematical structure, the series in ellipsoidal harmonics for derivatives of the Earth's gravity potential have been used only on the reference ellipsoid enveloping the Earth. In the current study, series in ellipsoidal harmonics are constructed for first- and second-order derivatives of the potential at satellite altitudes; their structure is similar to that of the series on the reference ellipsoid. A point P is chosen at an arbitrary satellite altitude; then an ellipsoid of revolution is constructed that passes through this point and is confocal to the reference ellipsoid. An object-centered coordinate system with its origin at the point P is considered. Using a sequence of transformations, nonsingular series in ellipsoidal harmonics are constructed for the first and second derivatives of the potential in the object-centered coordinate system. These series can be applied to develop a model of the Earth's potential based on the combined use of surface gravitational force measurements, data on the satellite orbital position, its acceleration, or measurements of first- and second-order gravitational force gradients. The technique is applicable to any other planet of the Solar System.
Selected Reference Books of 1970-71
ERIC Educational Resources Information Center
Sheehy, Eugene P.
1971-01-01
A continuation of the semiannual series, this list presents a selection of recent scholarly and foreign works of interest to reference workers in university libraries. It is not intended to be well balanced or comprehensive. (34 references) (Author/NH)
Construction Theory and Noise Analysis Method of Global CGCS2000 Coordinate Frame
NASA Astrophysics Data System (ADS)
Jiang, Z.; Wang, F.; Bai, J.; Li, Z.
2018-04-01
The definition, renewal, and maintenance of geodetic datums has been a topic of international interest. In recent years, many countries have been studying and implementing the modernization and renewal of local geodetic reference coordinate frames. Based on the precise results of continuous observations over the most recent 15 years from the state CORS (continuously operating reference system) network and the mainland GNSS (Global Navigation Satellite System) network between 1999 and 2007, this paper studies the construction of a mathematical model of the Global CGCS2000 frame, mainly analyzing the theory and algorithm of the two-step method for Global CGCS2000 coordinate frame formulation. Finally, the noise characteristics of the coordinate time series are estimated quantitatively using the criterion of maximum likelihood estimation.
VizieR Online Data Catalog: Variable stars in globular clusters (Figuera Jaimes+, 2016)
NASA Astrophysics Data System (ADS)
Figuera Jaimes, R.; Bramich, D. M.; Skottfelt, J.; Kains, N.; Jorgensen, U. G.; Horne, K.; Dominik, M.; Alsubai, K. A.; Bozza, V.; Calchi Novati, S.; Ciceri, S.; D'Ago, G.; Galianni, P.; Gu, S.-H.; W Harpsoe, K. B.; Haugbolle, T.; Hinse, T. C.; Hundertmark, M.; Juncher, D.; Korhonen, H.; Mancini, L.; Popovas, A.; Rabus, M.; Rahvar, S.; Scarpetta, G.; Schmidt, R. W.; Snodgrass, C.; Southworth, J.; Starkey, D.; Street, R. A.; Surdej, J.; Wang, X.-B.; Wertz, O.
2016-02-01
Observations were taken during 2013 and 2014 as part of an ongoing program at the 1.54 m Danish telescope at the ESO observatory at La Silla in Chile that was implemented from April to September each year. The file table1.dat contains the time-series I-band photometry for all the variables in the globular clusters studied in this work. We list standard and instrumental magnitudes and their uncertainties, corresponding to the variable star identification, filter, and epoch of mid-exposure. For completeness, we also list the reference flux, difference flux, and photometric scale factor, along with the uncertainties on the reference and difference fluxes. (2 data files).
VizieR Online Data Catalog: Variable stars in NGC 6715 (Figuera Jaimes+, 2016)
NASA Astrophysics Data System (ADS)
Figuera Jaimes, R.; Bramich, D. M.; Kains, N.; Skottfelt, J.; Jorgensen, U. G.; Horne, K.; Dominik, M.; Alsubai, K. A.; Bozza, V.; Burgdorf, M. J.; Calchi Novati, S.; Ciceri, S.; D'Ago, G.; Evans, D. F.; Galianni, P.; Gu, S. H.; Harpsoe, K. B. W.; Haugbolle, T.; Hinse, T. C.; Hundertmark, M.; Juncher, D.; Kerins, E.; Korhonen, H.; Kuffmeier, M.; Mancini, L.; Peixinho, N.; Popovas, A.; Rabus, M.; Rahvar, S.; Scarpetta, G.; Schmidt, R. W.; Snodgrass, C.; Southworth, J.; Starkey, D.; Street, R. A.; Surdej, J.; Tronsgaard, R.; Unda-Sanzana, E.; von Essen, C.; Wang, X. B.; Wertz, O.
2016-06-01
Observations were taken during 2013, 2014, and 2015 as part of an ongoing program at the 1.54 m Danish telescope at the ESO observatory at La Silla in Chile that was implemented from April to September each year. The file table1.dat contains the time-series I-band photometry for all the variables in NGC 6715 studied in this work. We list standard and instrumental magnitudes and their uncertainties, corresponding to the variable star identification, filter, and epoch of mid-exposure. For completeness, we also list the reference flux, difference flux, and photometric scale factor, along with the uncertainties on the reference and difference fluxes. (3 data files).
Reference and Information Services: An Introduction. Second Edition. Library Science Text Series.
ERIC Educational Resources Information Center
Bopp, Richard E., Ed.; Smith, Linda C., Ed.
This document provides an overview of the concepts and processes behind reference services and the most important sources consulted in answering common reference questions. The book is divided into two parts. Part 1 deals with concepts and theory. It covers ethical aspects of reference service, the reference interview, the principles and goals of…
NASA Astrophysics Data System (ADS)
Zoran, Maria A.; Dida, Adrian I.
2017-10-01
Urban green areas are experiencing rapid land cover change caused by human-induced land degradation and extreme climatic events. Vegetation index time series provide a useful way to monitor urban vegetation phenological variations. This study quantitatively describes Normalized Difference Vegetation Index (NDVI), Enhanced Vegetation Index (EVI) and Leaf Area Index (LAI) temporal changes for land cover in the Bucharest metropolitan region of Romania from the perspective of vegetation phenology and its relation to climate change and extreme climate events. Time series from 2000 to 2016 of NOAA AVHRR and MODIS Terra/Aqua satellite data were analyzed to extract anomalies. Time series of climatic variables were also analyzed through anomaly detection techniques and the Fourier transform, and correlations between NDVI/EVI time series and climatic variables were computed. Temperature, rainfall and radiation were significantly correlated with almost all land-cover classes for the harmonic-analysis amplitude term. However, vegetation phenology was not correlated with climatic variables for the harmonic-analysis phase term, suggesting a delay between climatic variations and vegetation response. Training and validation were based on a reference dataset collected from IKONOS high-resolution remote sensing data. The mean detection accuracy for the period 2000-2016 was assessed to be 87%, with a reasonable balance between change commission errors (19.3%) and change omission errors (24.7%), and a Kappa coefficient of 0.73. This paper demonstrates the potential of moderate- and high-resolution multispectral imagery to map and monitor the evolution of physical urban green land cover under climate and anthropogenic pressure.
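The harmonic-analysis amplitude and phase terms mentioned above come from fitting sinusoids to the index series. A minimal Python/NumPy sketch of a first-harmonic (annual) fit, on synthetic data with illustrative names:

```python
import numpy as np

def annual_harmonic(series, t_years):
    """Amplitude and phase of the first (annual) harmonic.

    Fits series ~ mean + a*cos(2*pi*t) + b*sin(2*pi*t) by least
    squares; t_years is time in years.
    """
    w = 2.0 * np.pi * t_years
    A = np.column_stack([np.ones_like(t_years), np.cos(w), np.sin(w)])
    (_, a, b), *_ = np.linalg.lstsq(A, series, rcond=None)
    return np.hypot(a, b), np.arctan2(b, a)

# Synthetic NDVI-like series with amplitude 0.2 and phase 1.0 rad.
t = np.linspace(0.0, 16.0, 368)
ndvi = 0.5 + 0.2 * np.cos(2.0 * np.pi * t - 1.0)
amplitude, phase = annual_harmonic(ndvi, t)
```

Correlating the recovered amplitudes with climate variables across land-cover classes, and separately examining the phases, mirrors the amplitude-term versus phase-term distinction drawn in the abstract.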
Separation of components from a scale mixture of Gaussian white noises
NASA Astrophysics Data System (ADS)
Vamoş, Călin; Crăciun, Maria
2010-05-01
The time evolution of a physical quantity associated with a thermodynamic system whose equilibrium fluctuations are modulated in amplitude by a slowly varying phenomenon can be modeled as the product of a Gaussian white noise {Zt} and a stochastic process with strictly positive values {Vt} referred to as volatility. The probability density function (pdf) of the process Xt=VtZt is a scale mixture of Gaussian white noises, expressed as a time average of Gaussian distributions weighted by the pdf of the volatility. The separation of the two components of {Xt} can be achieved by imposing the condition that the absolute values of the estimated white noise be uncorrelated. We apply this method to the time series of the returns of the daily S&P500 index, which has also been analyzed by means of the superstatistics method, which instead imposes the condition that the estimated white noise be Gaussian. The advantage of our method is that this financial time series is processed without partitioning or removal of the extreme events, and the estimated white noise becomes almost Gaussian only as a result of the uncorrelation condition.
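The uncorrelation condition can be illustrated with a toy estimator: compute a moving-window volatility, divide it out, and pick the window for which |Z| is closest to uncorrelated at lag one. This Python/NumPy sketch is our simplified reading of the criterion, not the authors' estimator:

```python
import numpy as np

def acf1(x):
    """Lag-one sample autocorrelation."""
    x = x - x.mean()
    return np.dot(x[:-1], x[1:]) / np.dot(x, x)

def separate(x, windows=range(5, 101, 5)):
    """Toy volatility/noise separation via the uncorrelation criterion.

    Volatility is estimated as a moving root-mean-square of x; the
    window is chosen so that the absolute values of the estimated
    white noise z = x / v are closest to uncorrelated at lag one.
    """
    best = None
    for w in windows:
        kernel = np.ones(w) / w
        v = np.sqrt(np.convolve(x**2, kernel, mode="same"))
        z = x / np.maximum(v, 1e-12)
        score = abs(acf1(np.abs(z)))
        if best is None or score < best[0]:
            best = (score, w, z)
    return best[1], best[2]

# Synthetic data: Gaussian noise modulated by a slow volatility.
rng = np.random.default_rng(3)
n = 4000
v_true = 1.0 + 0.5 * np.sin(2.0 * np.pi * np.arange(n) / 1000.0)
x = v_true * rng.standard_normal(n)
w, z_hat = separate(x)
```

Note the contrast with superstatistics: here nothing forces z_hat to be Gaussian; it is only required to be (absolutely) uncorrelated.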
ERIC Educational Resources Information Center
Harrison, Don K.; Brown, Dorothy R.
Although calculated by various statistical methods, retention (in this monograph) refers to the time that a former hard core member stays on the job. These rates may be tallied from the first day of pre-vocational training at a center, from the first day of a plant's vestibule training, or the first day of work at the job site. The hard core need…
1987-02-01
flowcharting. 3. Program Coding in HLL. This stage consists of transcribing the previously designed program into a form that can be translated into the machine... specified conditions 7. Documentation. Program documentation is necessary for user information, for maintenance, and for future applications. Flowcharts... particular CPU. Asynchronous. Operating without reference to an overall timing source. BASIC. Beginners' All-purpose Symbolic Instruction Code; a widely
NASA Technical Reports Server (NTRS)
Bamber, M J
1935-01-01
General methods of theoretical analysis of airplane spinning characteristics have been available for some time. Some of these methods of analysis might be used by designers to predict the spinning characteristics of proposed airplane designs if the necessary aerodynamic data were known. The present investigation, to determine the spinning characteristics of wings, is planned to include variations in airfoil sections, plan forms, and tip shapes of monoplane wings and variations in stagger, gap, and decalage for biplane cellules. The first series of tests, made on a rectangular Clark Y monoplane wing, is reported in reference 1. That report also gives an analysis of the data for predicting the probable effects of various important parameters on the spin for normal airplanes using such a wing. The present report is the second of the series. It gives the aerodynamic characteristics of a rectangular Clark Y biplane cellule in spinning attitudes and includes a discussion of the data, using the method of analysis given in reference 1.
Hybrid pregnant reference phantom series based on adult female ICRP reference phantom
NASA Astrophysics Data System (ADS)
Rafat-Motavalli, Laleh; Miri-Hakimabad, Hashem; Hoseinian-Azghadi, Elie
2018-03-01
This paper presents boundary representation (BREP) models of a pregnant female and her fetus at the end of each trimester. The International Commission on Radiological Protection (ICRP) female reference voxel phantom was used as a base template in the development of the pregnant hybrid phantom series. The differences in shape and location of the maternal organs displaced by the enlarging uterus were also taken into account. CT and MR images of fetus specimens and pregnant patients of various gestational ages were used to replace the maternal abdominal and pelvic organs of the template phantom and to insert the fetus inside the gravid uterus. Each fetal model contains 21 different organs and tissues. The skeletal model of the fetus also includes age-dependent cartilaginous and ossified skeletal components. The replaced maternal organ models were converted to NURBS surfaces and then modified to conform to the reference values of ICRP Publication 89. The particular feature of the current series, compared with previously developed pregnant phantoms, is that it is constructed on the basis of the ICRP reference phantom. Because the replaced maternal organ models are NURBS surfaces, they have the potential to be converted to high-quality polygon mesh phantoms.
Caviola, Sara; Carey, Emma; Mammarella, Irene C; Szucs, Denes
2017-01-01
We review how stress induction, time pressure manipulations and math anxiety can interfere with or modulate selection of problem-solving strategies (henceforth "strategy selection") in arithmetical tasks. Nineteen relevant articles were identified, which contain references to strategy selection and time limit (or time manipulations), with some also discussing emotional aspects in mathematical outcomes. Few of these take cognitive processes such as working memory or executive functions into consideration. We conclude that due to the sparsity of available literature our questions can only be partially answered and currently there is not much evidence of clear associations. We identify major gaps in knowledge and raise a series of open questions to guide further research.
Reconstruction method for fringe projection profilometry based on light beams.
Li, Xuexing; Zhang, Zhijiang; Yang, Chen
2016-12-01
A novel reconstruction method for fringe projection profilometry, based on light beams, is proposed and verified by experiments. Commonly used calibration techniques require projector calibration parameters or reference planes placed at many known positions. Introducing projector calibration can reduce the accuracy of the reconstruction result, and setting reference planes at many known positions is a time-consuming process. Therefore, in this paper, a reconstruction method without projector parameters is proposed, in which only two reference planes are introduced. A series of light beams determined by the subpixel point-to-point map on the two reference planes, combined with their reflected light beams determined by the camera model, are used to calculate the 3D coordinates of reconstruction points. Furthermore, a bundle adjustment strategy and the complementary gray-code phase-shifting method are utilized to ensure accuracy and stability. Qualitative and quantitative comparisons as well as experimental tests demonstrate the performance of the proposed approach; the measurement accuracy can reach about 0.0454 mm.
Biography Today: Profiles of People of Interest to Young Readers. Author Series, Volume 1.
ERIC Educational Resources Information Center
Harris, Laurie Lanzen, Ed.
The serialized reference work "Biography Today" is initiating a "Subject Series" that in five separate volumes will encompass: authors; artists; scientists and inventors; sports figures; and world leaders. This is the first volume in the "Author Series." There will be no duplication between the regular series and the special subject volumes. This…
NASA Astrophysics Data System (ADS)
Jiang, Weiping; Deng, Liansheng; Zhou, Xiaohui; Ma, Yifang
2014-05-01
Higher-order ionospheric (HOI) corrections are proposed to become a standard part of precise GPS data analysis. In this study, we investigate in depth the impact of HOI corrections on coordinate time series by re-processing GPS data from the Crustal Movement Observation Network of China (CMONOC). Nearly 13 years of data are used in our three processing runs: (a) run NO, without HOI corrections; (b) run IG, with both second- and third-order corrections modeled using the International Geomagnetic Reference Field 11 (IGRF11) for the magnetic field; and (c) run ID, the same as IG but using a dipole magnetic model. Both spectral analysis and noise analysis are adopted to investigate these effects. Results show that for CMONOC stations, HOI corrections bring an overall improvement. After the corrections are applied, the noise amplitudes decrease, with the white noise amplitudes showing the more remarkable variation. Low-latitude sites are more affected, and the impacts vary for different coordinate components. An analysis of stacked periodograms shows a good match between the seasonal amplitudes and the HOI corrections, and the observed variations in the coordinate time series are related to HOI effects. HOI delays partially explain the seasonal amplitudes in the coordinate time series, especially for the U component. The annual amplitudes of all components decrease for over one-half of the selected CMONOC sites, and the semi-annual amplitudes are much more strongly affected by the corrections. However, when the dipole model is used, the results are not as good as with the IGRF model: analysis of the dipole run indicates that HOI corrections modeled this way increase the noise amplitudes and can generate spurious periodic signals, introducing larger residuals and noise rather than effective improvements.
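Comparing seasonal amplitudes between processing runs starts from a harmonic fit to each coordinate component. A minimal Python/NumPy sketch (offset + trend + annual + semi-annual terms, synthetic data, illustrative names):

```python
import numpy as np

def seasonal_amplitudes(t_years, u):
    """Annual and semi-annual amplitudes of one coordinate component.

    Least-squares fit of offset + trend + annual + semi-annual
    sinusoids; comparing these amplitudes between runs with and
    without corrections is the kind of diagnostic described above.
    """
    cols = [np.ones_like(t_years), t_years]
    for k in (1, 2):  # annual, then semi-annual
        w = 2.0 * np.pi * k * t_years
        cols += [np.cos(w), np.sin(w)]
    c, *_ = np.linalg.lstsq(np.column_stack(cols), u, rcond=None)
    return np.hypot(c[2], c[3]), np.hypot(c[4], c[5])

# Synthetic U-component series spanning 13 years (units arbitrary).
t = np.linspace(0.0, 13.0, 678)
u = 3.0 + 0.2 * t + 4.0 * np.sin(2.0 * np.pi * t) + 1.5 * np.cos(4.0 * np.pi * t)
annual, semiannual = seasonal_amplitudes(t, u)
```

Running this on each station and run, then differencing the amplitudes, would quantify how much of the seasonal signal the corrections absorb.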
Measuring efficiency of international crude oil markets: A multifractality approach
NASA Astrophysics Data System (ADS)
Niere, H. M.
2015-01-01
The three major international crude oil markets are treated as complex systems and their multifractal properties are explored. The study covers daily prices of Brent crude, the OPEC reference basket and West Texas Intermediate (WTI) crude from January 2, 2003 to January 2, 2014. A multifractal detrended fluctuation analysis (MFDFA) is employed to extract the generalized Hurst exponents of each time series. The generalized Hurst exponent is used to measure the degree of multifractality, which in turn is used to quantify the efficiency of the three international crude oil markets. To identify whether the source of multifractality is long-range correlations or broad fat-tailed distributions, shuffled data and surrogate data corresponding to each time series are generated. Shuffled data are obtained by randomizing the order of the price returns, which destroys any long-range correlation in the time series. Surrogate data are produced using Fourier-Detrended Fluctuation Analysis (F-DFA) by randomizing the phases of the price returns in Fourier space, which normalizes the distribution of the time series. The study found that for the three crude oil markets there is a strong dependence of the generalized Hurst exponents on the order of fluctuations, showing that the daily price time series of the markets under study have signs of multifractality. Using the degree of multifractality as a measure of efficiency, the results show that WTI is the most efficient market while OPEC is the least efficient. This implies that OPEC has the highest likelihood of being manipulated among the three markets, and reflects the fact that Brent and WTI are very competitive markets with a higher level of complexity than OPEC, which has large monopoly power.
Comparison with the shuffled and surrogate data suggests that for all three crude oil markets the multifractality is mainly due to long-range correlations.
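The two surrogate constructions described above (shuffling, and Fourier phase randomization) are standard and can be sketched in a few lines of Python/NumPy; the F-DFA detrending step itself is not reproduced here, and names are ours:

```python
import numpy as np

def shuffled(returns, rng):
    """Destroys temporal correlations; keeps the distribution."""
    return rng.permutation(returns)

def phase_surrogate(returns, rng):
    """Randomize Fourier phases; keep the amplitude spectrum.

    This preserves linear correlations while Gaussianizing the
    distribution, the role of the phase-randomized surrogate above.
    """
    spec = np.abs(np.fft.rfft(returns))
    phases = rng.uniform(0.0, 2.0 * np.pi, spec.size)
    phases[0] = 0.0    # keep the DC term real
    phases[-1] = 0.0   # keep the Nyquist term real for even lengths
    return np.fft.irfft(spec * np.exp(1j * phases), n=returns.size)

# Heavy-tailed toy "returns" series.
rng = np.random.default_rng(7)
returns = rng.standard_normal(512) ** 3
s1 = shuffled(returns, rng)
s2 = phase_surrogate(returns, rng)
```

If multifractality survives phase randomization but not shuffling, it is attributed to the fat-tailed distribution; the opposite pattern points to long-range correlations, as found here.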
A technique to detect microclimatic inhomogeneities in historical temperature records
NASA Astrophysics Data System (ADS)
Runnalls, K. E.; Oke, T. R.
2003-04-01
A technique is presented to identify inhomogeneities in historical temperature records caused by microclimatic changes to the surroundings of a climate station (e.g. minor instrument relocations, vegetation growth or removal, construction of houses, roads, or runways). The technique uses daily maximum and minimum temperatures to estimate the magnitude of nocturnal cooling. The test station is compared to a nearby reference station by constructing time series of monthly "cooling ratios". It is argued that the cooling ratio is a particularly sensitive measure of microclimatic differences between neighbouring climate stations: firstly, because microclimatic character is best expressed at night in stable conditions; and secondly, because larger-scale climatic influences common to both stations are removed by the use of a ratio, which can be shown to be invariant in the mean with weather variables such as wind and cloud. Inflections (change points) in time series of cooling ratios therefore signal microclimatic change in one of the station records. Hurst rescaling is applied to the time series to aid in the identification of change points, which can then be compared to documented station history events, if sufficient metadata are available. Results for a variety of air temperature records, ranging from rural to urban stations, are presented to illustrate the applicability of the technique.
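The monthly cooling-ratio series can be sketched as below. This Python/NumPy illustration approximates nocturnal cooling by the daily range (Tmax - Tmin); the paper's exact cooling estimator from the daily extremes is not reproduced, and all names are ours:

```python
import numpy as np

def monthly_cooling_ratio(tmax_test, tmin_test, tmax_ref, tmin_ref, month):
    """Monthly cooling ratios for a test/reference station pair.

    Nocturnal cooling is approximated here by the daily range
    Tmax - Tmin; the ratio of monthly mean cooling at the test station
    to that at the reference station is returned for each month index.
    """
    months = np.unique(month)
    ratios = []
    for m in months:
        sel = month == m
        cool_test = np.mean(tmax_test[sel] - tmin_test[sel])
        cool_ref = np.mean(tmax_ref[sel] - tmin_ref[sel])
        ratios.append(cool_test / cool_ref)
    return months, np.asarray(ratios)

# Synthetic pair: the test station's cooling jumps 20% after day 365,
# mimicking a microclimatic change point.
days = np.arange(730)
month = days // 30  # coarse month index, illustrative only
tmin_ref = np.full(days.size, 10.0)
tmax_ref = tmin_ref + 10.0
tmin_test = np.full(days.size, 10.0)
tmax_test = tmin_test + np.where(days < 365, 10.0, 12.0)
months, ratios = monthly_cooling_ratio(tmax_test, tmin_test,
                                       tmax_ref, tmin_ref, month)
```

A step in this ratio series, located for example by Hurst rescaling, flags the change point, which is then checked against the station history metadata.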
Estimates of bottom roughness length and bottom shear stress in South San Francisco Bay, California
Cheng, R.T.; Ling, C.-H.; Gartner, J.W.; Wang, P.-F.
1999-01-01
A field investigation of the hydrodynamics and the resuspension and transport of particulate matter in a bottom boundary layer was carried out in South San Francisco Bay (South Bay), California, during March-April 1995. Using broadband acoustic Doppler current profilers, detailed measurements of the turbulent mean velocity distribution within 1.5 m above the bed were obtained. A global method of data analysis was used for estimating the bottom roughness length zo and bottom shear stress (or friction velocity u*). The field data were examined by dividing the time series of velocity profiles into 24-hour periods and independently analyzing the flooding and ebbing periods. The global method of solution gives consistent values of bottom roughness length zo and bottom shear stress (or friction velocity u*) in South Bay. The estimated mean values of zo and u* for flooding and ebbing cycles differ; the differences are shown to be caused by tidal current flood-ebb inequality, rather than by the flooding or ebbing of the tidal currents as such. The bed shear stress correlates well with a reference velocity; the slope of the correlation defines a drag coefficient. Forty-three days of field data in South Bay show two regimes of zo (and drag coefficient) as a function of the reference velocity. When the mean velocity is >25-30 cm s-1, ln zo (and thus the drag coefficient) is inversely proportional to the reference velocity. The reduction in roughness length is hypothesized to be caused by sediment erosion under intensifying tidal currents, which smooths the bed. When the mean velocity is <25-30 cm s-1, the correlation between zo and the reference velocity is less clear; a plausible explanation for the scattered values of zo under this condition is sediment deposition. The measured sediment data were inadequate to support this hypothesis, but the proposed hypothesis warrants further field investigation.
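For a single velocity profile, zo and u* follow from a least-squares fit of the law of the wall, u(z) = (u*/κ) ln(z/zo). A Python/NumPy sketch on a synthetic profile (the paper's global method fits many profiles jointly, which is not reproduced here; names are ours):

```python
import numpy as np

KAPPA = 0.41  # von Karman constant

def log_law_fit(z, u):
    """Fit u(z) = (u*/kappa) * ln(z / z0) to one velocity profile.

    Returns the friction velocity u* and roughness length z0 from a
    straight-line fit of u against ln(z).
    """
    slope, intercept = np.polyfit(np.log(z), u, 1)
    u_star = KAPPA * slope
    z0 = np.exp(-intercept / slope)
    return u_star, z0

# Synthetic profile with u* = 0.02 m/s and z0 = 1 mm.
z = np.array([0.2, 0.4, 0.6, 0.9, 1.2, 1.5])  # heights above bed (m)
u = (0.02 / KAPPA) * np.log(z / 0.001)
u_star, z0 = log_law_fit(z, u)
```

The slope of the u versus ln(z) line gives u*/κ, and the intercept fixes zo; the drag coefficient then follows from relating the fitted stress to the reference velocity.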
Description and User Instructions for the Quaternion_to_Orbit_v3 Software
NASA Technical Reports Server (NTRS)
Strekalov, Dmitry V.; Kruizinga, Gerhard L.; Paik, Meegyeong; Yuan, Dah-Ning; Asmar, Sami W.
2012-01-01
For a given inertial frame of reference, the software combines the spacecraft orbits with the spacecraft attitude quaternions, and rotates the body-fixed reference frame of a particular spacecraft to the inertial reference frame. The conversion assumes that the two spacecraft are aligned with respect to the mutual line of sight, with a parameterized time tag. The software is implemented in Python and is completely open source. It is very versatile and may be applied under various circumstances and for other related purposes. Based on a solid linear-algebra analysis, it has an extra option for compensating the linear pitch. This software was designed for simulation of the calibration maneuvers performed by the two spacecraft comprising the GRAIL mission to the Moon, but has potential use in other applications. In simulations of formation flights, one needs to coordinate the spacecraft orbits, represented in an appropriate inertial reference frame, with the spacecraft attitudes. The latter are usually given as time series of quaternions rotating the body-fixed reference frame of a particular spacecraft to the inertial reference frame. It is often desirable to simulate the same maneuver for different segments of the orbit. It is also useful to study various maneuvers that could be performed at the same orbit segment. These two lines of study are more time- and labor-efficient if the attitude and orbit data are generated independently, so that the part of the data that has not been changed can be recycled in the course of multiple simulations.
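The core body-to-inertial rotation described above can be sketched generically; this is a standard scalar-first unit-quaternion rotation, not the actual Quaternion_to_Orbit code, and the convention (scalar-first, right-handed) is an assumption:

```python
import numpy as np

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z), scalar-first.

    Implements v' = q * v * q^-1 via the expanded form
    v' = v + 2w (u x v) + 2 u x (u x v), with u = (x, y, z).
    """
    w, u = q[0], np.asarray(q[1:])
    v = np.asarray(v, dtype=float)
    t = 2.0 * np.cross(u, v)
    return v + w * t + np.cross(u, t)

# A 90-degree rotation about z maps the body x-axis onto the inertial y-axis
angle = np.pi / 2
q = np.array([np.cos(angle / 2), 0.0, 0.0, np.sin(angle / 2)])
x_inertial = quat_rotate(q, [1.0, 0.0, 0.0])
```

Applying such a rotation to each epoch of a quaternion time series transposes body-fixed vectors into the inertial frame, which is the operation the software performs for every attitude sample.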
Analysis of Differences Between VLBI, GNSS and SLR Earth Orientation Series
NASA Astrophysics Data System (ADS)
MacMillan, D. S.; Pavlis, E. C.; Griffiths, J.
2016-12-01
We have compared polar motion series from VLBI, GNSS, and SLR where the reference frames were aligned to ITRF2008. Three objectives of the comparisons are 1) to determine biases between the techniques, 2) to determine the precision of each technique via a 3-corner hat analysis after removing the relative biases, and 3) to evaluate the long-term stability of EOP series. Between VLBI and GPS or SLR, there are clear annual variations ranging from 25 to 100 µas in peak-to-peak amplitude. We investigate the possible causes of these variations. In addition, there are other apparent systematic biases and rate differences. From the point of view of VLBI, it is evident that there are VLBI network dependent effects, specifically between the operational R1 and R4 weekly 24-hour sessions. We investigate the origins of these differences, including station changes in these networks over the period from 2002 to the present. The EOP biases and precisions of the five IVS VLBI CONT campaigns (since 2002) are also analyzed, since these sessions were each designed to provide the highest quality results that could be produced at the time. A possible source of biases between the geodetic techniques is the underlying reference frame used by each technique. We also consider the technique differences when ITRF2014 is applied instead of ITRF2008.
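The 3-corner hat analysis mentioned above recovers each technique's error variance from the three pairwise difference series, assuming mutually uncorrelated errors. A minimal sketch with synthetic series standing in for the technique residuals (noise levels illustrative only):

```python
import numpy as np

def three_corner_hat(x1, x2, x3):
    """Estimate individual error variances from pairwise difference variances.

    Assuming uncorrelated technique errors, var(xi - xj) = var_i + var_j,
    so e.g. var_1 = (v12 + v13 - v23) / 2.
    """
    v12 = np.var(x1 - x2)
    v13 = np.var(x1 - x3)
    v23 = np.var(x2 - x3)
    var1 = (v12 + v13 - v23) / 2.0
    var2 = (v12 + v23 - v13) / 2.0
    var3 = (v13 + v23 - v12) / 2.0
    return var1, var2, var3

rng = np.random.default_rng(0)
truth = rng.standard_normal(100000)          # common polar motion signal
x1 = truth + 0.01 * rng.standard_normal(100000)  # "technique 1" noise
x2 = truth + 0.02 * rng.standard_normal(100000)  # "technique 2" noise
x3 = truth + 0.03 * rng.standard_normal(100000)  # "technique 3" noise
v1, v2, v3 = three_corner_hat(x1, x2, x3)
```

Note that the common signal cancels in every difference, so only the individual noise variances are estimated; relative biases must be removed first, as stated in the abstract.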
Analysis of Polar Motion Series Differences Between VLBI, GNSS, and SLR
NASA Astrophysics Data System (ADS)
MacMillan, Daniel; Pavlis, Erricos
2017-04-01
We have compared polar motion series from VLBI, GNSS, and SLR generated with a reference frame aligned to ITRF2008. Three objectives of the comparisons are 1) to determine biases between the techniques, 2) to determine the precision of each technique via a 3-corner hat analysis after removing the relative biases, and 3) to evaluate the long-term stability of polar motion series. Between VLBI, GNSS, and SLR, there are clear variations ranging from 20 to 60 µas in peak-to-peak amplitude. We investigate the possible causes of these variations. In addition, there are other apparent systematic biases and rate differences. There are VLBI network dependent effects that appear in the VLBI-GNSS and VLBI-SLR differences, specifically between the operational R1 and R4 weekly 24-hour sessions. We investigate the origins of these differences, including station changes in these networks over the period from 2002 to the present. The polar motion biases and precisions of the five IVS VLBI continuous observing CONT campaigns (since 2002) are also analyzed, since these 2-week campaigns were each designed to provide the highest quality results that could be produced at the time. A possible source of bias between the three techniques is the underlying chosen sub-network used by each technique to realize the adopted reference frame. We also consider the technique differences when ITRF2014 is used instead of ITRF2008.
Flood return level analysis of Peaks over Threshold series under changing climate
NASA Astrophysics Data System (ADS)
Li, L.; Xiong, L.; Hu, T.; Xu, C. Y.; Guo, S.
2016-12-01
Obtaining insights into future flood estimation is of great significance for water planning and management. Traditional flood return level analysis under the stationarity assumption has been challenged by changing environments. A method that takes the nonstationary context into consideration has been extended to derive flood return levels for Peaks over Threshold (POT) series. For POT series, a Poisson distribution is normally assumed to describe the arrival rate of exceedance events, but this distributional assumption has at times been reported as invalid. The Negative Binomial (NB) distribution is therefore proposed as an alternative to the Poisson assumption. Flood return levels were extrapolated in a nonstationary context for the POT series of the Weihe basin, China, under future climate scenarios. The results show that the flood return levels estimated under nonstationarity can differ depending on whether a Poisson or an NB distribution is assumed. The difference is found to be related to the threshold value of the POT series. The study indicates the importance of distribution selection in flood return level analysis under nonstationarity and provides a reference on the impact of climate change on flood estimation in the Weihe basin.
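Under the conventional Poisson-arrival assumption, the POT return level combines a Generalized Pareto tail with the annual exceedance rate in a closed form. A sketch with purely illustrative parameter values (not the Weihe basin fit; the paper's NB alternative changes the arrival-count model, not the structure of this formula):

```python
import math

def pot_return_level(u, sigma, xi, rate, T):
    """T-year return level for a POT/GPD model.

    u:     threshold; sigma, xi: GPD scale and shape;
    rate:  mean number of threshold exceedances per year (Poisson assumption).
    Standard result: z_T = u + (sigma/xi) * ((rate*T)**xi - 1) for xi != 0.
    """
    if abs(xi) < 1e-9:                      # Gumbel-type limit as xi -> 0
        return u + sigma * math.log(rate * T)
    return u + (sigma / xi) * ((rate * T) ** xi - 1.0)

# Hypothetical fit: threshold 500 m^3/s, 2 exceedances/year on average
z100 = pot_return_level(u=500.0, sigma=200.0, xi=0.1, rate=2.0, T=100.0)
```

Making u, sigma, xi or rate functions of time (or of a climate covariate) is one common way to express the nonstationary extension discussed in the abstract.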
Distance-dependent processing of pictures and words.
Amit, Elinor; Algom, Daniel; Trope, Yaacov
2009-08-01
A series of 8 experiments investigated the association between pictorial and verbal representations and the psychological distance of the referent objects from the observer. The results showed that people better process pictures that represent proximal objects and words that represent distal objects than pictures that represent distal objects and words that represent proximal objects. These results were obtained with various psychological distance dimensions (spatial, temporal, and social), different tasks (classification and categorization), and different measures (speed of processing and selective attention). The authors argue that differences in the processing of pictures and words emanate from the physical similarity of pictures, but not words, to the referents. Consequently, perceptual analysis is commonly applied to pictures but not to words. Pictures thus impart a sense of closeness to the referent objects and are preferably used to represent such objects, whereas words do not convey proximity and are preferably used to represent distal objects in space, time, and social perspective.
Seasonal station variations in the Vienna VLBI terrestrial reference frame VieTRF16a
NASA Astrophysics Data System (ADS)
Krásná, Hana; Böhm, Johannes; Madzak, Matthias
2017-04-01
The special analysis center of the International Very Long Baseline Interferometry (VLBI) Service for Geodesy and Astrometry (IVS) at TU Wien (VIE) routinely analyses the VLBI measurements and estimates its own Terrestrial Reference Frame (TRF) solutions. We present our latest solution VieTRF16a (1979.0 - 2016.5) computed with the software VieVS version 3.0. Several recent updates of the software have been applied, e.g., the estimation of annual and semi-annual station variations as global parameters. The VieTRF16a is determined in the form of the conventional model (station position and its linear velocity) simultaneously with the celestial reference frame and Earth orientation parameters. In this work, we concentrate on the seasonal station variations in the residual time series and compare our TRF with the three combined TRF solutions ITRF2014, DTRF2014 and JTRF2014.
NASA Astrophysics Data System (ADS)
Soja, B.; Krasna, H.; Boehm, J.; Gross, R. S.; Abbondanza, C.; Chin, T. M.; Heflin, M. B.; Parker, J. W.; Wu, X.
2017-12-01
The most recent realizations of the ITRS include several innovations, two of which are especially relevant to this study. On the one hand, the IERS ITRS combination center at DGFI-TUM introduced a two-level approach with DTRF2014, consisting of a classical deterministic frame based on normal equations and an optional coordinate time series of non-tidal displacements calculated from geophysical loading models. On the other hand, the JTRF2014 by the combination center at JPL is a time series representation of the ITRF determined by Kalman filtering. Both the JTRF2014 and the second level of the DTRF2014 are thus able to take into account short-term variations in the station coordinates. In this study, based on VLBI data, we combine these two approaches, applying them to the determination of both terrestrial and celestial reference frames. Our product has two levels like DTRF2014, with the second level being a Kalman filter solution like JTRF2014. First, we compute a classical TRF and CRF in a global least-squares adjustment by stacking normal equations from 5446 VLBI sessions between 1979 and 2016 using the Vienna VLBI and Satellite Software VieVS (solution level 1). Next, we obtain coordinate residuals from the global adjustment by applying the level-1 TRF and CRF in the single-session analysis and estimating coordinate offsets. These residuals are fed into a Kalman filter and smoother, taking into account the stochastic properties of the individual stations and radio sources. The resulting coordinate time series (solution level 2) serve as an additional layer representing irregular variations not considered in the first level of our approach. Both levels of our solution are implemented in VieVS in order to test their individual and combined performance regarding the repeatabilities of estimated baseline lengths, EOP, and radio source coordinates.
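The level-2 Kalman filter and smoother applied to the coordinate residuals can be sketched, in highly simplified one-dimensional form, as a random-walk-plus-noise filter with a Rauch-Tung-Striebel backward pass. This is a generic sketch, not the VieVS implementation; station-specific process noise would replace the single q below:

```python
import numpy as np

def rw_kalman_smoother(y, q, r, x0=0.0, p0=1e6):
    """Kalman filter + RTS smoother for a random-walk state observed in noise.

    State:       x_k = x_{k-1} + w_k,  w ~ N(0, q)
    Observation: y_k = x_k + v_k,      v ~ N(0, r)
    """
    n = len(y)
    xf = np.empty(n); pf = np.empty(n)      # filtered state and variance
    xp = np.empty(n); pp = np.empty(n)      # one-step predictions
    x, p = x0, p0
    for k in range(n):
        xp[k], pp[k] = x, p + q             # predict
        K = pp[k] / (pp[k] + r)             # Kalman gain
        x = xp[k] + K * (y[k] - xp[k])      # measurement update
        p = (1.0 - K) * pp[k]
        xf[k], pf[k] = x, p
    xs = xf.copy()                          # RTS backward (smoothing) pass
    for k in range(n - 2, -1, -1):
        C = pf[k] / pp[k + 1]
        xs[k] = xf[k] + C * (xs[k + 1] - xp[k + 1])
    return xs

rng = np.random.default_rng(2)
noisy = rng.standard_normal(300)            # residuals around a zero signal
smoothed = rw_kalman_smoother(noisy, q=0.01, r=1.0)
```

The smoothed series plays the role of the level-2 coordinate corrections: it tracks slow irregular variations while suppressing session-to-session scatter.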
Multivariate analysis applied to monthly rainfall over Rio de Janeiro state, Brazil
NASA Astrophysics Data System (ADS)
Brito, Thábata T.; Oliveira-Júnior, José F.; Lyra, Gustavo B.; Gois, Givanildo; Zeri, Marcelo
2017-10-01
Spatial and temporal patterns of rainfall were identified over the state of Rio de Janeiro, southeast Brazil. The proximity to the coast and the complex topography create great diversity of rainfall over space and time. The dataset consisted of time series (1967-2013) of monthly rainfall from 100 meteorological stations. Clustering analysis made it possible to divide the stations into six groups (G1, G2, G3, G4, G5 and G6) with similar rainfall spatio-temporal patterns. A linear regression model was applied between each station's time series and a reference series. The reference series was calculated from the average rainfall within a group, using nearby stations with higher correlation (Pearson). Based on a t-test (p < 0.05), all stations had a linear spatiotemporal trend. According to the clustering analysis, the first group (G1) contains stations located over the coastal lowlands and also over the ocean-facing area of Serra do Mar (Sea Ridge), a 1500 km long mountain range along coastal Southeastern Brazil. The second group (G2) contains stations over all the state, from Serra da Mantiqueira (Mantiqueira Mountains) and Costa Verde (Green Coast) in the south, up to stations in the northern parts of the state. Group 3 (G3) contains stations in the highlands of the state (Serrana region), while group 4 (G4) has stations over the northern areas and the continent-facing side of Serra do Mar. The last two groups were formed with stations around the Paraíba River (G5) and the metropolitan area of the city of Rio de Janeiro (G6). The driest months in all regions were June, July and August, while November, December and January were the rainiest months. Sharp transitions occurred in monthly accumulated rainfall from January to February, and from February to March, likely associated with episodes of "veranicos", i.e., dry spells of 4-15 days with no rainfall.
Spatio-temporal filtering for determination of common mode error in regional GNSS networks
NASA Astrophysics Data System (ADS)
Bogusz, Janusz; Gruszczynski, Maciej; Figurski, Mariusz; Klos, Anna
2015-04-01
The spatial correlation between different stations for individual components in regional GNSS networks seems to be significant. Mismodelling in satellite orbits, the Earth orientation parameters (EOP), large-scale atmospheric effects or satellite antenna phase centre corrections can all cause regionally correlated errors. Such GPS time series errors are referred to as common mode errors (CMEs). They are usually estimated with regional spatial filtering, such as "stacking". In this paper, we show the stacking approach for the set of ASG-EUPOS permanent stations, assuming that the spatial distribution of the CME is uniform over the whole region of Poland (more than 600 km in extent). The ASG-EUPOS is a multifunctional precise positioning system based on a reference network designed for Poland. We used a 5-year span (2008-2012) of daily solution time series in the ITRF2008 from Bernese 5.0, processed by the Military University of Technology EPN Local Analysis Centre (MUT LAC). At the beginning of our analyses of spatial dependencies, the correlation coefficients between each pair of stations in the GNSS network were calculated. This analysis shows that the spatio-temporal behaviour of the GPS-derived time series is not purely random, but that there is an evident uniform spatial response. In order to quantify the influence of the CME filtering, the L1 and L2 norms were determined. The values of these norms were calculated for the North, East and Up components twice: before the filtration and after stacking. The observed reduction of the L1 and L2 norms was up to 30%, depending on the dimension of the network. However, the question of how to define an optimal size for the CME-analysed subnetwork remains unanswered in this research, because our network is not extensive enough.
Clinical pharmacy academic career transitions: Viewpoints from the field.
Blackmer, Allison B; Thompson, Angela M; Jeffres, Meghan N; Glode, Ashley E; Thompson, Megan; Mahyari, Nila
2018-02-01
The six authors of this commentary series, who have recently transitioned into or within an academic career, discuss challenging aspects of an academic career change. This is a three-part commentary series that explores select challenges: 1) feedback, evaluation, and advancement; 2) understanding and balancing of distribution of effort; 3) learning how and when to say yes. Faculty, or those interested in pursuing a career in pharmacy academia, can refer to this commentary series as a reference. Schools of pharmacy may utilize this as a tool for new faculty members during orientation in order to ensure smooth integration into the academic environment. Copyright © 2017 Elsevier Inc. All rights reserved.
Zanderigo, Francesca; Sparacino, Giovanni; Kovatchev, Boris; Cobelli, Claudio
2007-09-01
The aim of this article was to use continuous glucose error-grid analysis (CG-EGA) to assess the accuracy of two time-series modeling methodologies recently developed to predict glucose levels ahead of time using continuous glucose monitoring (CGM) data. We considered subcutaneous time series of glucose concentration monitored every 3 minutes for 48 hours by the minimally invasive CGM sensor Glucoday® (Menarini Diagnostics, Florence, Italy) in 28 type 1 diabetic volunteers. Two prediction algorithms, based on first-order polynomial and autoregressive (AR) models, respectively, were considered with prediction horizons of 30 and 45 minutes and forgetting factors (ff) of 0.2, 0.5, and 0.8. CG-EGA was used on the predicted profiles to assess their point and dynamic accuracies using the original CGM profiles as reference. Continuous glucose error-grid analysis showed that the accuracy of both prediction algorithms is overall very good and that their performance is similar from a clinical point of view. However, the AR model seems preferable for hypoglycemia prevention. CG-EGA also suggests that, irrespective of the time-series model, the use of ff = 0.8 yields the most accurate readings in all glucose ranges. For the first time, CG-EGA is proposed as a tool to assess the clinically relevant performance of a prediction method separately at hypoglycemia, euglycemia, and hyperglycemia. In particular, we have shown that CG-EGA can be helpful in comparing different prediction algorithms, as well as in optimizing their parameters.
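The first-order polynomial predictor with a forgetting factor can be sketched as an exponentially weighted least-squares line fit extrapolated ahead by the prediction horizon. This is a schematic batch version, not the paper's recursive implementation, and the sample trace is hypothetical:

```python
import numpy as np

def predict_poly1(y, ts=3.0, ph=30.0, ff=0.8):
    """Predict a glucose value `ph` minutes ahead from past CGM samples `y`
    (taken every `ts` minutes, most recent last) by fitting a first-order
    polynomial with exponentially decaying weights (forgetting factor `ff`).

    The paper's AR variant fits an autoregressive model on the same
    weighted data instead of a polynomial.
    """
    y = np.asarray(y, dtype=float)
    n = len(y)
    t = np.arange(n) * ts
    w = ff ** np.arange(n - 1, -1, -1)      # newest sample gets weight 1
    A = np.column_stack([np.ones(n), t])    # design matrix for b0 + b1*t
    W = np.diag(w)
    b0, b1 = np.linalg.solve(A.T @ W @ A, A.T @ W @ y)
    return b0 + b1 * (t[-1] + ph)           # extrapolate to t_now + ph

# A trace rising 1 mg/dL per minute extrapolates linearly: last sample is
# at t = 57 min with value 157, so the 30-min-ahead prediction is 187.
trace = 100.0 + np.arange(20) * 3.0
pred = predict_poly1(trace, ts=3.0, ph=30.0, ff=0.8)
```

A smaller ff discounts old samples faster, making the predictor more reactive but noisier, which is the trade-off the CG-EGA comparison above evaluates.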
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-30
... Change Relating to $0.50 and $1 Strike Price Intervals for Classes in the Short Term Option Series... amend Rule 1012 (Series of Options Open for Trading) and Rule 1101A (Terms of Option Contracts) to give... Term Option Series Program (referred to as a ``Related non-Short Term Option series''), for the Related...
Describing temporal variability of the mean Estonian precipitation series in climate time scale
NASA Astrophysics Data System (ADS)
Post, P.; Kärner, O.
2009-04-01
Applicability of random walk type models to represent the temporal variability of various atmospheric temperature series has been successfully demonstrated recently (e.g. Kärner, 2002). The main problem in temperature modeling is connected to the scale break in the generally self-similar air temperature anomaly series (Kärner, 2005). The break separates short-range strong non-stationarity from nearly stationary longer-range variability. This is an indication of the fact that several geophysical time series show a short-range non-stationary behaviour and a stationary behaviour in the longer range (Davis et al., 1996). In order to model such series, the choice of time step appears to be crucial. To characterize the long-range variability we can neglect the short-range non-stationary fluctuations, provided that we are able to model properly the long-range tendencies. The structure function (Monin and Yaglom, 1975) was used to determine an approximate segregation line between the short and the long scale in terms of modeling. The longer scale can be called the climate scale, because such models are applicable on scales over some decades. In order to get rid of the short-range fluctuations in daily series, the variability can be examined using a sufficiently long time step. In the present paper, we show that the same philosophy is useful for finding a model to represent the climate-scale temporal variability of the Estonian daily mean precipitation amount series over 45 years (1961-2005). Temporal variability of the obtained time series is examined by means of an autoregressive integrated moving average (ARIMA) family model of the type (0,1,1). This model is applicable for simulating daily precipitation if an appropriate time step is selected that enables us to neglect the short-range non-stationary fluctuations. A considerably longer time step than one day (30 days) is used in the current paper to model the precipitation time series variability.
Each ARIMA (0,1,1) model can be interpreted as consisting of a random walk in a noisy environment (Box and Jenkins, 1976). The fitted model appears to be weakly non-stationary, which gives us the possibility to use a stationary approximation if only the noise component of the white-noise-plus-random-walk sum is exploited. We get a convenient routine to generate a stationary precipitation climatology with reasonable accuracy, since the noise component variance is much larger than the dispersion of the random walk generator. This interpretation emphasizes the dominating role of a random component in the precipitation series. The result is understandable given the small territory of Estonia, which is situated in the mid-latitude cyclone track. References: Box, G.E.P. and G.M. Jenkins 1976: Time Series Analysis, Forecasting and Control (revised edn.), Holden-Day, San Francisco, CA, 575 pp. Davis, A., Marshak, A., Wiscombe, W. and R. Cahalan 1996: Multifractal characterizations of intermittency in nonstationary geophysical signals and fields. In G. Trevino et al. (eds) Current Topics in Nonstationarity Analysis, World Scientific, Singapore, 97-158. Kärner, O. 2002: On nonstationarity and antipersistency in global temperature series. J. Geophys. Res. D107; doi:10.1029/2001JD002024. Kärner, O. 2005: Some examples on negative feedback in the Earth climate system. Centr. European J. Phys. 3; 190-208. Monin, A.S. and A.M. Yaglom 1975: Statistical Fluid Mechanics, Vol. 2: Mechanics of Turbulence, MIT Press, Boston, MA, 886 pp.
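The ARIMA(0,1,1) interpretation above can be made concrete: the differenced series is an MA(1), which is equivalent to a random walk observed in white noise (the local-level model). An illustrative simulation with hypothetical parameter values, not the fitted Estonian model:

```python
import numpy as np

def simulate_arima011(n, theta, sigma, y0=0.0, seed=0):
    """Simulate an ARIMA(0,1,1) series: diff(y)_t = e_t - theta * e_{t-1}.

    Equivalently a random walk observed in white noise; for theta near 1
    the white-noise component dominates, as in the abstract's argument.
    """
    rng = np.random.default_rng(seed)
    e = sigma * rng.standard_normal(n + 1)
    d = e[1:] - theta * e[:-1]              # MA(1) increments
    return y0 + np.cumsum(d)

# 45 years of 30-day time steps, as in the paper's choice of step length
series = simulate_arima011(n=540, theta=0.8, sigma=1.0)
```

The lag-1 autocorrelation of the increments is -theta/(1 + theta**2), whose estimate from data is one standard way to fit theta.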
Unification of height systems in the frame of GGOS
NASA Astrophysics Data System (ADS)
Sánchez, Laura
2015-04-01
Most of the existing vertical reference systems do not fulfil the accuracy requirements of modern Geodesy. They refer to local sea surface levels, are stationary (do not consider variations in time), realize different physical height types (orthometric, normal, normal-orthometric, etc.), and their combination in a global frame presents uncertainties at the metre level. To provide a precise geodetic infrastructure for monitoring the Earth system, the Global Geodetic Observing System (GGOS) of the International Association of Geodesy (IAG), promotes the standardization of the height systems worldwide. The main purpose is to establish a global gravity field-related vertical reference system that (1) supports a highly-precise (at cm-level) combination of physical and geometric heights worldwide, (2) allows the unification of all existing local height datums, and (3) guarantees vertical coordinates with global consistency (the same accuracy everywhere) and long-term stability (the same order of accuracy at any time). Under this umbrella, the present contribution concentrates on the definition and realization of a conventional global vertical reference system; the standardization of the geodetic data referring to the existing height systems; and the formulation of appropriate strategies for the precise transformation of the local height datums into the global vertical reference system. The proposed vertical reference system is based on two components: a geometric component consisting of ellipsoidal heights as coordinates and a level ellipsoid as the reference surface, and a physical component comprising geopotential numbers as coordinates and an equipotential surface defined by a conventional W0 value as the reference surface. The definition of the physical component is based on potential parameters in order to provide reference to any type of physical heights (normal, orthometric, etc.). 
The conversion of geopotential numbers into metric heights and the modelling of the reference surface (geoid or quasigeoid determination) are considered steps of the realization. The vertical datum unification strategy is based on (1) the physical connection of height datums to determine their discrepancies, (2) joint analysis of satellite altimetry and tide gauge records to determine time variations of sea level at reference tide gauges, (3) combination of geometric and physical heights in a well-distributed and highly precise reference frame to estimate the relationship between the individual vertical levels and the global one, and (4) analysis of GNSS time series at reference tide gauges to separate crustal movements from sea level changes. The final vertical transformation parameters are provided by the common adjustment of the observation equations derived from these methods.
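The first realization step named above, converting geopotential numbers into metric heights, is in essence a division by a suitable mean gravity value; schematically (numbers illustrative):

```python
def metric_height(C, gamma_mean):
    """Convert a geopotential number C (m^2/s^2) to a metric height (m).

    Dividing by mean normal gravity along the plumb line yields the normal
    height; using mean actual gravity instead yields the orthometric height.
    Here gamma_mean is assumed to be given.
    """
    return C / gamma_mean

# A geopotential number of 981 m^2/s^2 with mean gravity ~9.81 m/s^2
H = metric_height(C=981.0, gamma_mean=9.81)   # roughly 100 m
```

The choice of mean gravity is exactly what distinguishes the physical height types (orthometric, normal, normal-orthometric) that the unification must reconcile.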
1981-08-01
RATIO TEST STATISTIC FOR SPHERICITY OF COMPLEX MULTIVARIATE NORMAL DISTRIBUTION* C. Fang P. R. Krishnaiah B. N. Nagarsenker** August 1981 Technical...and their applications in time series, the reader is referred to Krishnaiah (1976). Motivated by the applications in the area of inference on multiple...for practical purposes. Here, we note that Krishnaiah, Lee and Chang (1976) approximated the null distribution of certain power of the likeli
Explosives Instrumentation Group Trial 6/77-Propellant Fire Trials (Series Two).
1981-10-01
frames/s. A 19 mm Sony U-Matic video cassette recorder (VCR) and camera were used to view the hearth from a tower 100 m from ground-zero (GZ). Normal...camera started. This procedure permitted increased recording time of the event. A 19 mm Sony U-Matic VCR and camera were used to view the container...Lumpur, Malaysia Exchange Section, British Library, U.K. Periodicals Recording Section, Science Reference Library, British Library, U.K. Library, Chemical
1986-12-01
poorly written problem statements. We decline to artificially create difficulties for experimentation. Others have encountered these issues and treated...you lose some of the meaning. The method also does not extend well to nonlinear or time-varying systems (sometimes it can be done, but it creates ...thereby introduced creates problems and solves nothing. For variable-geometry aircraft, some projects establish reference geometry values that change as
Individual and Joint Expert Judgments as Reference Standards in Artifact Detection
Verduijn, Marion; Peek, Niels; de Keizer, Nicolette F.; van Lieshout, Erik-Jan; de Pont, Anne-Cornelie J.M.; Schultz, Marcus J.; de Jonge, Evert; de Mol, Bas A.J.M.
2008-01-01
Objective To investigate the agreement among clinical experts in their judgments of monitoring data with respect to artifacts, and to examine the effect of reference standards that consist of individual and joint expert judgments on the performance of artifact filters. Design Individual judgments of four physicians, a majority vote judgment, and a consensus judgment were obtained for 30 time series of three monitoring variables: mean arterial blood pressure (ABPm), central venous pressure (CVP), and heart rate (HR). The individual and joint judgments were used to tune three existing automated filtering methods and to evaluate the performance of the resulting filters. Measurements The interrater agreement was calculated in terms of positive specific agreement (PSA). The performance of the artifact filters was quantified in terms of sensitivity and positive predictive value (PPV). Results PSA values between 0.33 and 0.85 were observed among clinical experts in their selection of artifacts, with relatively high values for CVP data. Artifact filters developed using judgments of individual experts were found to moderately generalize to new time series and other experts; sensitivity values ranged from 0.40 to 0.60 for ABPm and HR filters (PPV: 0.57–0.84), and from 0.63 to 0.80 for CVP filters (PPV: 0.71–0.86). A higher performance value for the filters was found for the three variable types when joint judgments were used for tuning the filtering methods. Conclusion Given the disagreement among experts in their individual judgment of monitoring data with respect to artifacts, the use of joint reference standards obtained from multiple experts is recommended for development of automatic artifact filters. PMID:18096912
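Positive specific agreement (PSA), the interrater measure used above, counts only agreement on positive (artifact) labels: PSA = 2a / (2a + b + c), where a is the both-positive count and b, c are the two discordant counts. A small sketch with hypothetical ratings:

```python
def positive_specific_agreement(r1, r2):
    """Positive specific agreement between two binary rating sequences.

    PSA = 2a / (2a + b + c), with a = both-positive count and b, c the
    counts where only one rater marks a positive.
    """
    a = sum(1 for x, y in zip(r1, r2) if x and y)
    b = sum(1 for x, y in zip(r1, r2) if x and not y)
    c = sum(1 for x, y in zip(r1, r2) if not x and y)
    denom = 2 * a + b + c
    return 2 * a / denom if denom else float("nan")

# Two raters flag artifacts (1) in ten samples: a = 3, b = 1, c = 1
rater1 = [1, 1, 0, 0, 1, 0, 0, 1, 0, 0]
rater2 = [1, 0, 0, 0, 1, 0, 1, 1, 0, 0]
psa = positive_specific_agreement(rater1, rater2)   # 6 / 8 = 0.75
```

Unlike raw percent agreement, PSA is not inflated by the many samples both raters leave unflagged, which matters when artifacts are rare.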
Densification of the ITRF through the weekly combination of regional and global GNSS solutions
NASA Astrophysics Data System (ADS)
Legrand, J.; Bruyninx, C.; Saria, E.; Griffiths, J.; Craymer, M. R.; Dawson, J. H.; Kenyeres, A.; Santamaría-Gómez, A.; Sanchez, L.; Altamimi, Z.
2012-12-01
The IAG Working Group (WG) "Integration of Dense Velocity Fields in the ITRF" was created in 2011 as a follow-up of the WG "Regional Dense Velocity Fields" (2007-2011). The goal of the WG is to densify the International Terrestrial Reference Frame (ITRF) using regional GNSS solutions as well as global solutions. This was originally done by combining several cumulative position/velocity solutions submitted to the WG by the global analysis center (ULR) and the IAG regional reference frame sub-commissions (APREF, EUREF, SIRGAS, NAREF) analysis centers. However, several test combinations, together with the comparison of the residual position time series, demonstrated the limitations of this approach. In June 2012, the WG decided to adopt a new approach based on a weekly combination of the GNSS solutions. This new approach will enable us to mitigate network effects, have full control over the discontinuities and the velocity constraints, manage the different data spans, and derive residual position time series in addition to a velocity field. All initial contributors have agreed to submit weekly solutions, and initial contacts have been made with other sub-commissions, particularly Africa, in order to extend the densified velocity field to all continents. Preliminary results of the analysis of weekly solutions will be presented. More details on the WG are available from http://epncb.oma.be/IAG/.
NASA Astrophysics Data System (ADS)
Vernon, F.; Arrott, M.; Orcutt, J. A.; Mueller, C.; Case, J.; De Wardener, G.; Kerfoot, J.; Schofield, O.
2013-12-01
An approach sophisticated enough to handle a variety of data sources and scales, yet easy enough to promote wide use and mainstream adoption, is required to address the following mappings: - From the authored domain of observation to the requested domain of interest; - From the authored spatiotemporal resolution to the requested resolution; and - From the representation of data placed on a wide variety of discrete mesh types to the use of that data as a continuous field with selectable continuity. The Open Geospatial Consortium's (OGC) Reference Model[1], with its direct association with the ISO 19000 series standards, provides a comprehensive foundation to represent all data on any type of mesh structure, aka "Discrete Coverages". The Reference Model also provides the specification for the core operations required to utilize any Discrete Coverage. The FEniCS Project[2] provides a comprehensive model for how to represent the basis functions on mesh structures as "Degrees of Freedom" to present discrete data as continuous fields with variable continuity. In this talk, we will present the research and development the OOI Cyberinfrastructure Project is pursuing to integrate these approaches into a comprehensive Application Programming Interface (API) to author, acquire and operate on a broad range of data formulations, from time series, trajectories and tables through to time-variant finite difference grids and finite element meshes.
Change classification in SAR time series: a functional approach
NASA Astrophysics Data System (ADS)
Boldt, Markus; Thiele, Antje; Schulz, Karsten; Hinz, Stefan
2017-10-01
Change detection represents a broad field of research in SAR remote sensing, comprising many different approaches. Besides the simple recognition of change areas, the analysis of the type, category or class of the change areas is at least as important for creating a comprehensive result. Conventional strategies for change classification are based on supervised or unsupervised land-use / land-cover classifications. The main drawback of such approaches is that the quality of the classification result directly depends on the selection of training and reference data. Additionally, supervised processing methods require an experienced operator who capably selects the training samples. This training step is not necessary when using unsupervised strategies, but nevertheless meaningful reference data must be available for identifying the resulting classes. Consequently, an experienced operator is indispensable. In this study, an innovative concept for the classification of changes in SAR time series data is proposed. Regarding the drawbacks of traditional strategies given above, it requires no training data. Moreover, the method can be applied by an operator who does not yet have detailed knowledge of the scenery; this knowledge is provided by the algorithm. The final step of the procedure, whose main aspect is the iterative optimization of an initial class scheme with respect to the categorized change objects, is the classification of these objects into the finally resulting classes. This assignment step is the subject of this paper.
Packaging Science for a Reference Service.
ERIC Educational Resources Information Center
Dixon, Bernard; Clarke, Lawrence
1991-01-01
Proposes that book series, geared to the general reader, can provide a readily available source of scientific information to the public. Portrays the possibilities for the effective marketing of such book series through the experience of the authors in editing and merchandising a science book series, which has been sold through direct marketing…
Sabatini, Angelo Maria; Ligorio, Gabriele; Mannini, Andrea
2015-11-23
In biomechanical studies, Optical Motion Capture Systems (OMCS) are considered the gold standard for determining the orientation and the position (pose) of an object in a global reference frame. However, the use of OMCS can be difficult, which has prompted research on alternative sensing technologies, such as body-worn inertial sensors. We developed a drift-free method to estimate the three-dimensional (3D) displacement of a body part during cyclical motions using body-worn inertial sensors. We performed a Fourier analysis of the stride-by-stride estimates of the linear acceleration, which were obtained by transposing the specific forces measured by the tri-axial accelerometer into the global frame using a quaternion-based orientation estimation algorithm, and by detecting when each stride began using a gait-segmentation algorithm. The time integration was performed analytically using the Fourier series coefficients; the inverse Fourier series was then taken to reconstruct the displacement over each single stride. The displacement traces were concatenated and spline-interpolated to obtain the entire trace. The method was applied to estimate the motion of the lower trunk of healthy subjects who walked on a treadmill, and it was validated using OMCS reference 3D displacement data; different approaches were tested for transposing the measured specific force into the global frame, segmenting the gait and performing the time integration (numerically and analytically). The widths of the limits of agreement between each tested method and the OMCS reference method were computed for each anatomical direction: Medio-Lateral (ML), VerTical (VT) and Antero-Posterior (AP). Using the proposed method, the vertical component of displacement (VT) was within ±4 mm (±1.96 standard deviations) of the OMCS data, and each component of horizontal displacement (ML and AP) was within ±9 mm of the OMCS data.
Fourier harmonic analysis was applied to model stride-by-stride linear accelerations during walking and to perform their analytical integration. Our results showed that analytical integration based on Fourier series coefficients is a useful approach for accurately estimating 3D displacement from noisy acceleration data.
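The analytical double integration described above can be sketched as follows for a near-periodic acceleration trace over one stride: each harmonic of angular frequency ω is divided by -ω² to integrate twice in closed form. The function name, harmonic cutoff and sampling conventions here are our own illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def displacement_from_acceleration(acc, fs, n_harmonics=10):
    """Analytically double-integrate a (near-)periodic acceleration trace
    over one stride using its Fourier series coefficients.
    Illustrative sketch only."""
    n = len(acc)
    T = n / fs                       # stride duration [s]
    c = np.fft.rfft(acc) / n         # one-sided complex Fourier coefficients
    k = np.arange(len(c))
    omega = 2 * np.pi * k / T        # angular frequency of each harmonic
    d = np.zeros_like(c)
    nz = (k > 0) & (k <= n_harmonics)
    # integrating exp(i*omega*t) twice divides by (i*omega)^2 = -omega^2;
    # the DC term is dropped (zero-mean acceleration assumed over a stride)
    d[nz] = -c[nz] / omega[nz] ** 2
    t = np.arange(n) / fs
    disp = 2 * np.real(np.sum(d[nz, None] * np.exp(1j * np.outer(omega[nz], t)), axis=0))
    return disp - disp.mean()
```

For a pure sinusoidal acceleration A·sin(ωt), this returns -A·sin(ωt)/ω², the exact double integral.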
LOD estimation from DORIS observations
NASA Astrophysics Data System (ADS)
Stepanek, Petr; Filler, Vratislav; Buday, Michal; Hugentobler, Urs
2016-04-01
The difference between the astronomically determined duration of the day and 86400 seconds is called length of day (LOD). The LOD can also be understood as the daily rate of the difference between Universal Time UT1, based on the Earth's rotation, and International Atomic Time TAI. The LOD is estimated using various satellite geodesy techniques such as GNSS and SLR, while the absolute UT1-TAI difference is precisely determined by VLBI. Contrary to other IERS techniques, LOD estimation using DORIS (Doppler Orbitography and Radiopositioning Integrated by Satellite) measurements did not achieve geodetic accuracy in the past, reaching a precision at the level of several ms per day. However, recent experiments performed by the IDS (International DORIS Service) analysis centre at Geodetic Observatory Pecny show that an accuracy of around 0.1 ms per day can be reached when the cross-track harmonics in the satellite orbit model are not adjusted. The paper presents the long-term LOD series determined from the DORIS solutions. The series are compared with C04 as the reference. Results are discussed in the context of the accuracy achieved with GNSS and SLR. Besides the multi-satellite DORIS solutions, the LOD series from the individual DORIS satellite solutions are also analysed.
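The relation stated above, that LOD is the daily rate of UT1-TAI, can be written down directly: excess LOD is the negative day-to-day rate of UT1-TAI. A minimal sketch; the function name and the sign convention (a slowing Earth gives positive excess LOD) are our assumptions in line with common IERS usage.

```python
import numpy as np

def excess_lod(ut1_tai, dt_days=1.0):
    """Excess length of day (seconds) from a daily UT1-TAI series (seconds):
    the negative day-to-day rate of UT1-TAI. A slowing Earth (UT1 falling
    behind TAI) gives positive excess LOD."""
    return -np.diff(ut1_tai) / dt_days
```

For example, UT1-TAI decreasing by 1 ms per day corresponds to an excess LOD of 1 ms.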
Non-linearity of geocentre motion and its impact on the origin of the terrestrial reference frame
NASA Astrophysics Data System (ADS)
Dong, Danan; Qu, Weijing; Fang, Peng; Peng, Dongju
2014-08-01
The terrestrial reference frame is a cornerstone for modern geodesy and its applications for a wide range of Earth sciences. The underlying assumption for establishing a terrestrial reference frame is that the motion of the solid Earth's figure centre relative to the mass centre of the Earth system on a multidecadal timescale is linear. However, past international terrestrial reference frames (ITRFs) showed unexpected accelerated motion in their translation parameters. Based on this underlying assumption, the inconsistency of relative origin motions of the ITRFs has been attributed to data reduction imperfection. We investigated the impact of surface mass loading from atmosphere, ocean, snow, soil moisture, ice sheet, glacier and sea level from 1983 to 2008 on the geocentre variations. The resultant geocentre time-series display notable trend acceleration from 1998 onward, in particular in the z-component. This effect is primarily driven by the hydrological mass redistribution in the continents (soil moisture, snow, ice sheet and glacier). The acceleration is statistically significant at the 99 per cent confidence level as determined using the Mann-Kendall test, and it is highly correlated with the satellite laser ranging determined translation series. Our study, based on independent geophysical and hydrological models, demonstrates that, in addition to systematic errors from analysis procedures, the observed non-linearity of the Earth-system behaviour at interannual timescales is physically driven and is able to explain 42 per cent of the disparity between the origins of ITRF2000 and ITRF2005, as well as the high level of consistency between the ITRF2005 and ITRF2008 origins.
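The Mann-Kendall test used above to establish the significance of the geocentre trend acceleration can be sketched as follows, using the standard normal approximation without tie correction; this is an illustrative textbook version, not the authors' exact implementation.

```python
import math
import numpy as np

def mann_kendall(x):
    """Mann-Kendall trend test: S statistic and two-sided p-value from the
    normal approximation (no tie correction; illustrative sketch)."""
    n = len(x)
    s = 0.0
    for i in range(n - 1):
        # sum of signs of all pairwise forward differences
        s += np.sign(x[i + 1:] - x[i]).sum()
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s == 0:
        return s, 1.0
    z = (s - np.sign(s)) / math.sqrt(var_s)   # continuity correction
    p = math.erfc(abs(z) / math.sqrt(2.0))    # two-sided p-value
    return s, p
```

A strictly monotone series of length n yields the maximum S = n(n-1)/2 and a vanishingly small p-value.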
A 280-Year Long Series of Phenological Observations of Cherry Tree Blossoming Dates for Switzerland
NASA Astrophysics Data System (ADS)
Rutishauser, T.; Luterbacher, J.; Wanner, H.
2003-04-01
Phenology is generally described as the timing of life cycle phases or activities of plants and animals in their temporal occurrence throughout the year (Lieth 1974). Recent studies have shown that meteorological and climatological impacts leave their 'fingerprints' across natural systems in general and strongly influence the seasonal activities of single animal and plant species. During the 20th century, phenological observation networks were established around the world to document and analyze the influence of the globally changing climate on plants and wildlife. This work presents a first attempt at a unique 280-year long series of phenological observations of cherry tree blossoming dates for the Swiss plateau region. In Switzerland, a nation-wide phenological observation network was established in 1951, currently documenting 69 phenophases of 26 different plant species. A guidebook seeks to increase the objectiveness of the network observations. The observations of the blooming of the cherry tree (Prunus avium) were chosen to calculate a mean series for the Swiss plateau region, with observations from altitudes ranging between 370 and 860 m a.s.l. A total of 737 observations from 21 stations were used. A linear regression between the mean blooming date and altitude was established in order to correct the data to a reference altitude level. Other ecological parameters were not accounted for. The selected network data series from 1951 to 2000 was combined and prolonged back to 1721 with observations from various sources. These include several historical observation series by farmers, clergymen and teachers, data from various stations collected at the newly established Swiss meteorological network from 1864 to 1873, and the single long series of observations from Liestal starting in 1894.
The homogenized time series of observations will be compared with reconstructions of late winter temperatures as well as with statistical estimates of blooming time based on long instrumental data series from Europe. In addition, the series is one of the few historical phenological records available for assessing past climate and ecological changes. Lieth, H. (1974). Phenology and Seasonality Modeling. Berlin, Heidelberg, New York, Springer.
ERIC Educational Resources Information Center
Mervine, K. E.
This bibliography is part of a series of Environmental Resource Packets prepared under a grant from EXXON Education Foundation. The most authoritative and accessible references in the urban transportation field are reviewed. The authors, publisher, point of view, level, and summary are given for each reference. The references are categorized…
15 CFR 200.105 - Standard reference data.
Code of Federal Regulations, 2010 CFR
2010-01-01
... for application in energy, environment and health, industrial process design, materials durability... Institute of Physics, in the National Standard Reference Data System reports as the NSRDS-NIST series, and...
A Maple package for improved global mapping forecast
NASA Astrophysics Data System (ADS)
Carli, H.; Duarte, L. G. S.; da Mota, L. A. C. P.
2014-03-01
We present a Maple implementation of the well known global approach to time series analysis and some further developments designed to improve the computational efficiency of the forecasting capabilities of the approach. This global approach can be summarized as a reconstruction of the phase space based on a time-ordered series of data obtained from the system. After that, using the reconstructed vectors, a portion of this space is used to produce a mapping, a polynomial fitting obtained through a minimization procedure, that represents the system and can be employed to forecast further entries for the series. In the present implementation, we introduce a set of commands, tools, in order to perform all these tasks. For example, the command VecTS deals mainly with the reconstruction of the vectors in the phase space. The command GfiTS deals with producing the minimization and the fitting. ForecasTS uses all these and produces the prediction of the next entries. For the non-standard algorithms, we present two commands, IforecasTS and NiforecasTS, which deal with one-step and N-step forecasting, respectively. Finally, we introduce two further tools to aid the forecasting. The commands GfiTS and AnalysTS perform an analysis of the behaviour of each portion of a series with respect to the settings used in the commands just mentioned. Catalogue identifier: AERW_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AERW_v1_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 3001 No. of bytes in distributed program, including test data, etc.: 95018 Distribution format: tar.gz Programming language: Maple 14. Computer: Any capable of running Maple. Operating system: Any capable of running Maple. Tested on Windows ME, Windows XP, Windows 7.
RAM: 128 MB Classification: 4.3, 4.9, 5 Nature of problem: Time series analysis and improving forecast capability. Solution method: The method of solution is partially based on a result published in [1]. Restrictions: If the time series being analyzed presents a great amount of noise, or if the dynamical system behind the time series is of high dimensionality (Dim≫3), then the method may not work well. Unusual features: In cases where the dynamics behind the time series is given by a system of low dimensionality, our implementation can greatly improve the forecast. Running time: This depends strongly on the command being used. References: [1] Barbosa, L.M.C.R., Duarte, L.G.S., Linhares, C.A. and da Mota, L.A.C.P., Improving the global fitting method on nonlinear time series analysis, Phys. Rev. E 74, 026702 (2006).
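The global-mapping idea summarized above (delay-embedding the series, fitting a polynomial map by least squares, then iterating the map to forecast) can be sketched compactly. This is our own Python illustration of the generic technique; it does not reproduce the Maple package's commands, and all names and defaults here are assumptions.

```python
import numpy as np
from itertools import combinations_with_replacement

def embed(x, dim, tau):
    """Delay-coordinate reconstruction of the phase space."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

def poly_features(V, degree):
    """Monomials of the embedded vectors up to the given degree."""
    cols = [np.ones(len(V))]
    for d in range(1, degree + 1):
        for idx in combinations_with_replacement(range(V.shape[1]), d):
            cols.append(np.prod(V[:, idx], axis=1))
    return np.column_stack(cols)

def fit_global_map(x, dim=3, tau=1, degree=2):
    """Least-squares polynomial map predicting the next series entry."""
    V = embed(x, dim, tau)
    X, y = poly_features(V[:-1], degree), V[1:, -1]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

def forecast_one_step(x, coef, dim=3, tau=1, degree=2):
    """Apply the fitted map to the last reconstructed vector."""
    v = embed(x, dim, tau)[-1:]
    return float(poly_features(v, degree) @ coef)
```

On a series generated by a low-dimensional polynomial map (e.g. the logistic map), the fitted mapping reproduces the next entry essentially exactly, which mirrors the "low dimensionality" restriction stated in the program summary.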
Occurrence analysis of daily rainfalls by using non-homogeneous Poissonian processes
NASA Astrophysics Data System (ADS)
Sirangelo, B.; Ferrari, E.; de Luca, D. L.
2009-09-01
In recent years several temporally homogeneous stochastic models have been applied to describe the rainfall process. In particular, stochastic analysis of daily rainfall time series may contribute to explaining the statistical features of the temporal variability of the phenomenon. Due to the evident periodicity of the physical process, these models can be applied only over short temporal intervals in which the occurrences and intensities of rainfall can reliably be considered homogeneous. To this aim, occurrences of daily rainfalls can be considered a stationary stochastic process within monthly periods. In this context point process models are widely used for at-site analysis of daily rainfall occurrence; they are continuous-time models, able to explain the intermittent character of rainfall and to simulate interstorm periods. With a different approach, the periodic features of daily rainfalls can be interpreted using a temporally non-homogeneous stochastic model whose parameters are expressed as continuous functions of time. In this case, great attention has to be paid to the parsimony of the models, as regards the number of parameters and the bias introduced into the generation of synthetic series, and to the influence of threshold values in extracting the peak storm database from recorded daily rainfall heights. In this work, a stochastic model based on a non-homogeneous Poisson process, characterized by a time-dependent intensity of rainfall occurrence, is employed to explain the seasonal effects of daily rainfalls exceeding prefixed threshold values. In particular, the variation of the rainfall occurrence intensity λ(t) is modelled using Fourier series analysis, in which the non-homogeneous process is transformed into a homogeneous unit-rate one through a proper transformation of the time domain, and the choice of the minimum number of harmonics is evaluated by applying available statistical tests.
The procedure is applied to a dataset of rain gauges located in different geographical zones of the Mediterranean area. Time series were selected on the basis of the availability of at least 50 years of data in the period 1921-1985, chosen as the calibration period, and of all years of observation in the subsequent validation period 1986-2005, in which the variability of the daily rainfall occurrence process is under hypothesis. Firstly, for each time series and for each fixed threshold value, parameter estimation of the non-homogeneous Poisson model is carried out for the calibration period. As a second step, in order to test the hypothesis that the daily rainfall occurrence process preserves the same behaviour in more recent time periods, the intensity distribution evaluated for the calibration period is also adopted for the validation period. Starting from this, and using a Monte Carlo approach, 1000 synthetic generations of daily rainfall occurrences, of length equal to the validation period, are carried out, and the sample λ(t) is evaluated for each simulation. This procedure is adopted because of the complexity of determining analytical statistical confidence limits for the sample intensity λ(t). Finally, the sample intensity, the theoretical function of the calibration period and the 95% statistical band evaluated by the Monte Carlo approach are compared; in addition, for each threshold value, the mean square error (MSE) between the theoretical λ(t) and the sample one of the recorded data is considered, together with its corresponding 95% one-tailed statistical band, estimated from the MSE values between the sample λ(t) of each synthetic series and the theoretical one. The results obtained may be very useful in the context of the identification and calibration of stochastic rainfall models based on historical precipitation data.
Further applications of the non-homogeneous Poisson model will concern the joint analysis of the storm occurrence process and the rainfall height marks, the latter interpreted using a temporally homogeneous model within proper sub-year intervals.
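A non-homogeneous Poisson process with a Fourier-type seasonal intensity, as used above, can be simulated by the standard thinning method: generate a homogeneous process at the peak rate and accept each event with probability λ(t)/λ_max. The intensity parameters below are purely illustrative, not fitted values from the study.

```python
import numpy as np

def simulate_nhpp(lam, lam_max, t_end, rng):
    """Simulate a non-homogeneous Poisson process on [0, t_end] by
    thinning a homogeneous process of rate lam_max (Lewis-Shedler)."""
    t, events = 0.0, []
    while True:
        t += rng.exponential(1.0 / lam_max)
        if t > t_end:
            break
        if rng.uniform() < lam(t) / lam_max:   # accept with prob lam(t)/lam_max
            events.append(t)
    return np.array(events)

# seasonal daily-rainfall occurrence intensity: mean rate plus one
# annual harmonic (illustrative coefficients, t in days)
a0, a1, phi = 0.3, 0.2, 40.0
lam = lambda t: a0 + a1 * np.cos(2 * np.pi * (t - phi) / 365.25)
rng = np.random.default_rng(0)
events = simulate_nhpp(lam, a0 + a1, 365.25 * 50, rng)
```

Over whole years the harmonic integrates to zero, so the expected event count is approximately a0 times the simulated span.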
Benchmarking the Algorithms to Detect Seasonal Signals Under Different Noise Conditions
NASA Astrophysics Data System (ADS)
Klos, A.; Bogusz, J.; Bos, M. S.
2017-12-01
Global Positioning System (GPS) position time series contain seasonal signals, among which the annual and semi-annual oscillations are the most powerful. These oscillations are widely modelled as curves with constant amplitudes using the Weighted Least-Squares (WLS) algorithm. In reality, however, the seasonal signatures vary over time, as their geophysical causes are not constant. Different algorithms have already been used to capture this time-variability, such as Wavelet Decomposition (WD), Singular Spectrum Analysis (SSA), Chebyshev Polynomials (CP) or the Kalman Filter (KF). In this research, we employed 376 globally distributed GPS stations whose time series contributed to the newest International Terrestrial Reference Frame (ITRF2014). We show that for ca. 20% of the stations the amplitudes of the seasonal signal vary over time by more than 1.0 mm. We then compare the WD, SSA, CP and KF algorithms on a set of synthetic time series to quantify them under different noise conditions. We show that when variations of seasonal signals are ignored, the power-law character of the noise is biased towards flicker noise. The most reliable estimates of the variations were found to be given by SSA and KF. These methods also perform the best for other noise levels, while WD, and to a lesser extent also CP, have trouble separating the seasonal signal from the noise, which leads to an underestimation of the spectral index of the power-law noise of around 0.1. For the real ITRF2014 GPS data we found that SSA and KF are capable of modelling 49-84% and 77-90%, respectively, of the variance of the true varying seasonal signals.
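The conventional constant-amplitude WLS model mentioned above fits an offset, a trend, and annual plus semi-annual sinusoids to a position series. A minimal sketch, assuming time in years and our own function and parameter names:

```python
import numpy as np

def fit_seasonal(t, y, w=None):
    """Weighted least-squares fit of offset, trend, and constant-amplitude
    annual and semi-annual sinusoids to a position time series (t in years).
    Returns coefficients and the two seasonal amplitudes."""
    A = np.column_stack([
        np.ones_like(t), t,
        np.cos(2 * np.pi * t), np.sin(2 * np.pi * t),   # annual
        np.cos(4 * np.pi * t), np.sin(4 * np.pi * t),   # semi-annual
    ])
    if w is None:
        w = np.ones_like(t)
    sw = np.sqrt(w)
    coef, *_ = np.linalg.lstsq(A * sw[:, None], y * sw, rcond=None)
    amp_annual = np.hypot(coef[2], coef[3])
    amp_semi = np.hypot(coef[4], coef[5])
    return coef, amp_annual, amp_semi
```

On a noise-free synthetic series the fitted amplitudes recover the generating values exactly, which is the baseline the time-variable methods (SSA, KF, etc.) are compared against.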
NASA Astrophysics Data System (ADS)
Baldysz, Zofia; Nykiel, Grzegorz; Araszkiewicz, Andrzej; Figurski, Mariusz; Szafranek, Karolina
2016-09-01
The main purpose of this research was to acquire information about the consistency of ZTD (zenith total delay) linear trends and seasonal components between two consecutive GPS reprocessing campaigns. The analysis concerned two sets of ZTD time series which were estimated during EUREF (Reference Frame Sub-Commission for Europe) EPN (Permanent Network) reprocessing campaigns according to the 2008 and 2015 MUT AC (Military University of Technology Analysis Centre) scenarios. Firstly, Lomb-Scargle periodograms were generated for 57 EPN stations to characterise the oscillations occurring in the ZTD time series. Then, the values of the seasonal components and linear trends were estimated using the LSE (least squares estimation) approach. The Mann-Kendall trend test was also carried out to verify the presence of linear long-term ZTD changes. Finally, differences in seasonal signals and linear trends between these two data sets were investigated. All these analyses were conducted for ZTD time series of two lengths: a shortened 16-year series and a full 18-year one. In the case of the spectral analysis, the amplitudes of the annual and semi-annual periods were almost exactly the same for both reprocessing campaigns. Exceptions were found for only a few stations, and they did not exceed 1 mm. The estimated trends were also similar. However, for the reprocessing performed in 2008, the trend values were usually higher. In general, shortening the analysed time period by 2 years resulted in a decrease of the linear trend values of about 0.07 mm yr-1. This was confirmed by analyses based on both data sets.
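The Lomb-Scargle periodogram used above is the standard tool for detecting periodicities in unevenly sampled series such as ZTD records with gaps. A sketch using SciPy's implementation; the series, amplitudes and sampling below are synthetic illustrations, not EPN data.

```python
import numpy as np
from scipy.signal import lombscargle

# unevenly sampled ZTD-like series with an annual oscillation (illustrative)
rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0.0, 10.0, 500))      # epochs in years, with gaps
y = 12.0 * np.cos(2 * np.pi * t) + rng.normal(0.0, 2.0, t.size)  # mm
y -= y.mean()                                  # precenter before the periodogram

periods = np.linspace(0.3, 3.0, 2000)          # trial periods in years
omega = 2 * np.pi / periods                    # lombscargle expects angular freqs
power = lombscargle(t, y, omega)
best_period = periods[np.argmax(power)]        # expect a peak near 1 year
```

The dominant peak lands at the annual period despite the irregular sampling, which is why this estimator is preferred over a plain FFT for such series.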
A data variance technique for automated despiking of magnetotelluric data with a remote reference
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kappler, K.
2011-02-15
The magnetotelluric method employs co-located surface measurements of electric and magnetic fields to infer the local electrical structure of the earth. The frequency-dependent 'apparent resistivity' curves can be inaccurate at long periods if input data are contaminated - even when robust remote reference techniques are employed. Despiking the data prior to processing can result in significantly more reliable estimates of long-period apparent resistivities. This paper outlines a two-step method for the automatic identification and replacement of spike-like contamination of magnetotelluric data, based on the simultaneity of natural electric and magnetic field variations at distant sites. This simultaneity is exploited both to identify windows in time when the array data are compromised, and to generate synthetic data that replace observed transient noise spikes. In the first step, windows in the data time series containing spikes are identified via intersite comparison of channel 'activity' - such as the variance of the differenced data within each window. In the second step, plausible replacement data for flagged windows are calculated by Wiener filtering coincident data in clean channels. The Wiener filters - which express the time-domain relationship between the various array channels - are computed using an uncontaminated segment of array training data. Examples are shown where the algorithm is applied to artificially contaminated data and to real field data. In both cases all spikes are successfully identified. In the case of implanted artificial noise, the synthetic replacement time series are very similar to the original recording. In all cases, apparent resistivity and phase curves obtained by processing the despiked data are much improved over curves obtained from raw data.
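The first step described above, flagging windows by the variance of the differenced data relative to a remote channel, can be sketched as follows. The windowing scheme and median-ratio threshold are our own illustrative choices, not the paper's exact criterion.

```python
import numpy as np

def window_activity(x, win):
    """Per-window 'activity': variance of the first-differenced data."""
    n = len(x) // win
    return np.array([np.var(np.diff(x[i * win:(i + 1) * win])) for i in range(n)])

def flag_spike_windows(local, remote, win=256, k=5.0):
    """Flag windows whose local-to-remote activity ratio is anomalously
    high relative to the median ratio across windows; a spike inflates the
    local channel's differenced variance but not the remote one's."""
    r = window_activity(local, win) / window_activity(remote, win)
    return np.where(r > k * np.median(r))[0]
```

A large transient implanted into otherwise similar noise is isolated to the single window containing it, since natural field variations raise both channels' activity together while a spike raises only one.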
NASA Astrophysics Data System (ADS)
Abell, J. T.; Jacobsen, J.; Bjorkstedt, E.
2016-02-01
Determining the aragonite saturation state (Ω) in seawater requires measurement of two parameters of the carbonate system: most commonly dissolved inorganic carbon (DIC) and total alkalinity (TA). The routine measurement of DIC and TA is not always possible on frequently repeated hydrographic lines or at moored time-series stations that collect hydrographic data at short time intervals. In such cases a proxy can be developed that relates the saturation state derived from one-time or infrequent DIC and TA measurements (Ωmeas) to more frequently measured parameters such as dissolved oxygen (DO) and temperature (Temp). These proxies are generally based on best-fit parameterizations that utilize reference values of DO and Temp and adjust linear coefficients until the error between the proxy-derived saturation state (Ωproxy) and Ωmeas is minimized. Proxies have been used to infer Ω from moored hydrographic sensors and gliders, which routinely collect DO and Temp data but do not include carbonate parameter measurements. Proxies can also be used to calculate Ω in regional oceanographic models which do not explicitly include carbonate parameters. Here we examine the variability and accuracy of Ωproxy along a near-shore hydrographic line and at a moored time-series station at Trinidad Head, CA. The saturation state is determined using proxies from different coastal regions of the California Current Large Marine Ecosystem and from different years of sampling along the hydrographic line. We then calculate the variability and error associated with the use of different proxy coefficients, the sensitivity to reference values, and the inclusion of additional variables. We demonstrate how this variability affects estimates of the intensity and duration of exposure to aragonite-corrosive conditions on the near-shore shelf and in the water column.
Caviola, Sara; Carey, Emma; Mammarella, Irene C.; Szucs, Denes
2017-01-01
We review how stress induction, time pressure manipulations and math anxiety can interfere with or modulate selection of problem-solving strategies (henceforth “strategy selection”) in arithmetical tasks. Nineteen relevant articles were identified, which contain references to strategy selection and time limit (or time manipulations), with some also discussing emotional aspects in mathematical outcomes. Few of these take cognitive processes such as working memory or executive functions into consideration. We conclude that due to the sparsity of available literature our questions can only be partially answered and currently there is not much evidence of clear associations. We identify major gaps in knowledge and raise a series of open questions to guide further research. PMID:28919870
Three decades of harnessing the GPS data explosion for geophysics (Vening Meinesz Medal Lecture)
NASA Astrophysics Data System (ADS)
Blewitt, Geoffrey
2015-04-01
In this presentation, I attempt to convey the immensity of the task that faced the geodesy community three decades ago, and continues to challenge us: to harness all potentially valuable GPS data available in the world for geophysical science. It would be fair to say that three decades ago, we were struggling with controlled tests just to get GPS geodesy working, and had little time to imagine the flood of data today. Yet the geodesy community has succeeded in meeting this challenge. Today, for example, the Nevada Geodetic Laboratory produces and makes publicly available coordinate time series for over 12,000 geodetic GPS stations around the globe with various data intervals, latencies, and reference frames. About 8,000 stations have their daily time series updated every week, with 4,000 being updated the next day with coordinates at daily and 5 minute intervals. About 2,000 stations have their time series updated every hour with coordinates at 5 minute intervals. I will show examples of how these time series are being used by NGL and many other scientists to study a wide variety of geophysical topics, including plate tectonics, earthquake modeling, seismic and tsunami hazard, volcanic deformation, water resources, mountain growth, terrestrial reference frame realization, glacial isostatic adjustment, ice sheet melting, sea level rise and coastal subsidence, and even fundamental physics, using GPS atomic clocks to probe the nature of dark matter in the universe. The explosion in GPS data has challenged us to invent new data processing algorithms and develop robust automation in order to keep up with the flood. This explosion has been exponential; it is therefore not a recent phenomenon, but rather one that began in the earliest years of GPS geodesy and has always posed a challenge to us.
Over the course of my post-doctoral career starting in late 1985, I have had the good fortune to witness the key developments that have taken place since the early years of geodetic GPS and over the course of three decades. These developments continue today as strongly as ever. Essential innovations have included, for example, automation of GPS cycle slip detection and mitigation, carrier phase ambiguity resolution, the birth and operation of the IGS for reliable orbit and clock estimation, the invention of algorithms that scale linearly with the number of stations, and the deep integration of GPS solutions into the ITRF, providing measures of accuracy, precision, and stability. As a recent example of automation, I show a new non-parametric algorithm to estimate station velocities quickly and robustly, without the need to detect and correct for outliers, seasonal signals, and discontinuities in the time series, such as the steps that commonly occur due to equipment changes. The complete automation from data collection to production of station velocities (and, now, velocity time series) allows us to process all potentially valuable data, and to focus more on discovery and analysis of the results for geophysical applications, often with great redundancy in the data leading to high statistical significance and more robust scientific conclusions. I show by example that another benefit of this capability to process all data in a robust turn-key fashion is to enhance the opportunity for making discoveries, without necessarily planning all of the steps that can lead us to discovery's door.
Financial time series: A physics perspective
NASA Astrophysics Data System (ADS)
Gopikrishnan, Parameswaran; Plerou, Vasiliki; Amaral, Luis A. N.; Rosenow, Bernd; Stanley, H. Eugene
2000-06-01
Physicists in the last few years have started applying concepts and methods of statistical physics to understand economic phenomena. The word ``econophysics'' is sometimes used to refer to this work. One reason for this interest is that economic systems such as financial markets are examples of complex interacting systems for which a huge amount of data exists, and it is possible that economic problems viewed from a different perspective might yield new results. This article reviews the results of a few recent phenomenological studies focused on understanding the distinctive statistical properties of financial time series. We discuss three recent results: (i) The probability distribution of stock price fluctuations: stock price fluctuations occur in all magnitudes, in analogy to earthquakes - from tiny fluctuations to very drastic events, such as market crashes, e.g., the crash of 19 October 1987, sometimes referred to as ``Black Monday''. The distribution of price fluctuations decays with a power-law tail well outside the Lévy stable regime and describes fluctuations that differ by as much as 8 orders of magnitude. In addition, this distribution preserves its functional form for fluctuations on time scales that differ by 3 orders of magnitude, from 1 min up to approximately 10 days. (ii) Correlations in financial time series: while price fluctuations themselves have rapidly decaying correlations, the magnitude of fluctuations measured by either the absolute value or the square of the price fluctuations has correlations that decay as a power law and persist for several months. (iii) Correlations among different companies: the third result bears on the application of random matrix theory to understand the correlations among price fluctuations of any two different stocks.
From a study of the eigenvalue statistics of the cross-correlation matrix constructed from the price fluctuations of the leading 1000 stocks, we find that the largest 5-10% of the eigenvalues and the corresponding eigenvectors show systematic deviations from the predictions for a random matrix, whereas the rest of the eigenvalues conform to random matrix behavior, suggesting that these 5-10% of the eigenvalues contain system-specific information about correlated behavior.
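The random-matrix benchmark invoked above is the Marchenko-Pastur law: for N uncorrelated series of length T, the eigenvalues of the estimated correlation matrix should fall between (1-√(N/T))² and (1+√(N/T))², so eigenvalues outside those bounds are candidates for genuine correlation structure. A small sketch with purely random data (the dimensions are illustrative, not the paper's 1000-stock dataset):

```python
import numpy as np

def mp_bounds(T, N):
    """Marchenko-Pastur eigenvalue bounds for the correlation matrix of
    N independent series of length T (T > N)."""
    q = N / T
    return (1 - np.sqrt(q)) ** 2, (1 + np.sqrt(q)) ** 2

rng = np.random.default_rng(3)
T, N = 1000, 100
returns = rng.normal(size=(T, N))           # purely random 'price changes'
C = np.corrcoef(returns, rowvar=False)      # N x N empirical correlation matrix
eig = np.linalg.eigvalsh(C)
lo, hi = mp_bounds(T, N)
frac_outside = np.mean((eig < lo) | (eig > hi))
```

For uncorrelated data almost all eigenvalues land inside the bounds (up to finite-size effects); real market data instead show a fat tail of large eigenvalues carrying sector and market-wide correlations.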
Deriving Daily Time Series Evapotranspiration, Evaporation and Transpiration Maps With Landsat Data
NASA Astrophysics Data System (ADS)
Paul, G.; Gowda, P. H.; Marek, T.; Xiao, X.; Basara, J. B.
2014-12-01
Mapping high-resolution evapotranspiration (ET) over a large region at a daily time step is complex and computationally intensive. The utility of high-resolution daily ET maps is broad, ranging from crop water management to watershed management. The aim of this work is to generate daily time series (10 years) of ET and its components, vegetation transpiration (T) and soil water evaporation (E), using Landsat 5 satellite data for the Southern Great Plains forage-rangeland-winter wheat production system in Oklahoma (OK). The framework for generating these products included the two-source energy balance (TSEB) algorithm; other important features were: (a) an atmospheric correction algorithm; (b) spatially interpolated weather inputs; (c) functions for varying the Priestley-Taylor coefficient; and (d) an ET, E and T extrapolating algorithm utilizing reference ET. An extensive network of 140 weather stations managed by the Oklahoma Mesonet was utilized to generate spatially interpolated inputs of air temperature, relative humidity, wind speed, solar radiation, pressure, and reference ET. Validation of the ET maps against eddy covariance data from two grassland sites at El Reno, OK suggested good performance (Table 1). Figure 1 illustrates a very small subset of the 18 July 2006 daily ET map, where the difference in ET among land uses such as irrigated cropland, vegetation along drainage, and grassland is very distinct. Results indicated that the proposed ET mapping framework is suitable for deriving high-resolution time series of daily ET maps at regional scale with Landsat Thematic Mapper data.
Table 1: Daily actual ET performance statistics for two grassland locations at El Reno, OK, for year 2005.
Management Type | Mean obs (mm d-1) | Mean est (mm d-1) | MBE (mm d-1) | MBE (%) | RMSE (mm d-1) | RMSE (%) | MAE (mm d-1) | MAPD (%) | NSE | R2
Control | 2.2 | 1.8 | -0.43 | -19.4 | 0.87 | 38.9 | 0.65 | 29.5 | 0.71 | 0.79
Burnt | 2.0 | 1.8 | -0.15 | -7.7 | 0.80 | 39.8 | 0.62 | 30.7 | 0.73 | 0.77
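The reference-ET extrapolation mentioned in item (d) is commonly done by assuming the fraction of reference ET observed at the satellite overpass holds through the day. A minimal sketch of that idea; the function name and the constant-fraction assumption are ours, not necessarily the authors' exact scheme.

```python
def daily_et_from_overpass(et_inst, etref_inst, etref_daily):
    """Extrapolate instantaneous ET at the satellite overpass (mm h-1) to a
    daily total (mm d-1) by assuming the fraction of reference ET measured
    at the overpass is constant over the day."""
    return (et_inst / etref_inst) * etref_daily
```

For example, an instantaneous ET of 0.6 mm h-1 against a reference ET of 0.8 mm h-1, with a daily reference ET of 8.0 mm, extrapolates to a 6.0 mm daily total.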
NASA Astrophysics Data System (ADS)
Ozdagli, A. I.; Liu, B.; Moreu, F.
2018-07-01
According to railroad managers, the displacement of railroad bridges under service loads is an important parameter in condition assessment and performance evaluation. However, measuring bridge responses in the field is often costly and labor-intensive. This paper proposes a low-cost, efficient wireless intelligent sensor (LEWIS) platform that computes the dynamic transverse displacements of railroad bridges under service loads in real time. The sensing platform is built on the open-source Arduino ecosystem and combines low-cost microcontrollers with affordable accelerometers and wireless transmission modules. The proposed LEWIS system is designed to reconstruct dynamic displacements from acceleration measurements onboard, eliminating the need for offline post-processing, and to transmit the data in real time to a base station, where an inspector at the bridge can see the displacements while the train is crossing, or over the internet to a remote office if desired. The researchers validated the effectiveness of the new LEWIS by conducting a series of laboratory experiments. A shake-table setup reproduced transverse bridge displacements measured in the field and excited the proposed platform, an expensive commercially available wired accelerometer, and a reference LVDT displacement sensor. The responses obtained from the wireless system were compared to the displacements reconstructed from the commercial accelerometer readings and to the reference LVDT. The results of the laboratory experiments demonstrate that the proposed system is capable of accurately reconstructing transverse displacements of railroad bridges under revenue service traffic and of transmitting the data wirelessly in real time. In conclusion, the platform presented in this paper can be used for cost-effective and accurate performance assessment of a railroad bridge network.
Future work includes collecting real-time reference-free displacements of one railroad bridge in Colorado under train crossings to further prove LEWIS' suitability for engineering applications.
Functional MRI and Multivariate Autoregressive Models
Rogers, Baxter P.; Katwal, Santosh B.; Morgan, Victoria L.; Asplund, Christopher L.; Gore, John C.
2010-01-01
Connectivity refers to the relationships that exist between different regions of the brain. In the context of functional magnetic resonance imaging (fMRI), it implies a quantifiable relationship between hemodynamic signals from different regions. One aspect of this relationship is the existence of small timing differences in the signals in different regions. Delays of 100 ms or less may be measured with fMRI, and these may reflect important aspects of the manner in which brain circuits respond as well as the overall functional organization of the brain. The multivariate autoregressive time series model has features to recommend it for measuring these delays, and is straightforward to apply to hemodynamic data. In this review, we describe the current usage of the multivariate autoregressive model for fMRI, discuss the issues that arise when it is applied to hemodynamic time series, and consider several extensions. Connectivity measures like Granger causality that are based on the autoregressive model do not always reflect true neuronal connectivity; however, we conclude that careful experimental design could make this methodology quite useful in extending the information obtainable using fMRI. PMID:20444566
Efficacy of memory aids after traumatic brain injury: A single case series.
Bos, Hannah R; Babbage, Duncan R; Leathem, Janet M
2017-01-01
Individuals living with traumatic brain injury commonly have difficulties with prospective memory-the ability to remember a planned action at the intended time. Traditionally a memory notebook has been recommended as a compensatory memory aid. Electronic devices have the advantage of providing a cue at the appropriate time to remind participants to refer to the memory aid and complete tasks. Research suggests these have potential benefit in neurorehabilitation. This study aimed to investigate the efficacy of a memory notebook and specifically a smartphone as a compensatory memory aid. A single case series design was used to assess seven participants. A no-intervention baseline was followed by training and intervention with either the smartphone alone, or a memory notebook and later the smartphone. Memory was assessed with weekly assigned memory tasks. Participants using a smartphone showed improvements in their ability to complete assigned memory tasks accurately and within the assigned time periods. Use of a smartphone provided additional benefits over and above those already seen for those who received a memory notebook first. Smartphones have the potential to be a useful and cost effective tool in neurorehabilitation practice.
Blackmer, Allison B; Thompson, Angela M; Jeffres, Meghan N; Glode, Ashley E; Mahyari, Nila; Thompson, Megan
2018-02-01
The six authors of this commentary series, who have recently transitioned into or within an academic career, discuss challenging aspects of an academic career change. This is a three-part commentary series that explores select challenges: 1) feedback, evaluation and advancement; 2) understanding and balancing of distribution of effort; 3) learning how and when to say yes. Faculty, or those interested in pursuing a career in pharmacy academia, can refer to this commentary series as a reference. Schools of pharmacy may utilize this as a tool for new faculty members during orientation in order to ensure smooth integration into the academic environment. Copyright © 2017 Elsevier Inc. All rights reserved.
The purpose of the Mississippi River map series is to provide reference for ecological vulnerability throughout the entire Mississippi River Basin, which is a forthcoming product. This map series product consists of seven 32 inch x 40 inch posters, with a nominal scale of 1 inch ...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-18
... (Regional Jet Series 900) Airplanes AGENCY: Federal Aviation Administration (FAA), DOT. ACTION: Notice of... Part 39 Air transportation, Aircraft, Aviation safety, Incorporation by reference, Safety. The Proposed... Manual, and the Bombardier CRJ Series Regional Jet Aircraft Maintenance Manual (AMM). Table 2--Guidance...
Atomic Fuel, Understanding the Atom Series. Revised.
ERIC Educational Resources Information Center
Hogerton, John F.
This publication is part of the "Understanding the Atom" series. Complete sets of the series are available free to teachers, schools, and public librarians who can make them available for reference or use by groups. Among the topics discussed are: What Atomic Fuel Is; The Odyssey of Uranium; Production of Uranium; Fabrication of Reactor…
Serial Monogamy: Extended Fictions and the Television Revolution
ERIC Educational Resources Information Center
Mackey, Margaret
2006-01-01
Changes in television technology have fostered changes in how we view fiction on television. This article explores some of these changes in the context of the teenage series, "Felicity" (WBTV, 1998-2002). It draws comparisons with the experience of reading series fiction in print, referring to the children's print series, "The Beverly Gray College…
Fossil-Fuel C02 Emissions Database and Exploration System
NASA Astrophysics Data System (ADS)
Krassovski, M.; Boden, T.; Andres, R. J.; Blasing, T. J.
2012-12-01
The Carbon Dioxide Information Analysis Center (CDIAC) at Oak Ridge National Laboratory (ORNL) quantifies the release of carbon from fossil-fuel use and cement production at global, regional, and national spatial scales. The CDIAC emission time series estimates are based largely on annual energy statistics published at the national level by the United Nations (UN). CDIAC has developed a relational database to house the collected data and information, and a web-based interface to help users worldwide identify, explore, and download the desired emission data. The available information is divided into two major groups: time series and gridded data. The time series data are offered at global, regional, and national scales. Publications containing historical energy statistics make it possible to estimate fossil-fuel CO2 emissions back to 1751. Etemad et al. (1991) published a summary compilation that tabulates coal, brown coal, peat, and crude oil production by nation and year. Footnotes in the Etemad et al. (1991) publication extend the energy statistics time series back to 1751. Summary compilations of fossil-fuel trade were published by Mitchell (1983, 1992, 1993, 1995). Mitchell's work tabulates solid and liquid fuel imports and exports by nation and year. These pre-1950 production and trade data were digitized, and CO2 emission calculations were made following the procedures discussed in Marland and Rotty (1984) and Boden et al. (1995). The gridded data comprise annual and monthly estimates. The annual data form a time series recording 1° latitude by 1° longitude CO2 emissions in units of million metric tons of carbon per year from anthropogenic sources for 1751-2008. The monthly fossil-fuel CO2 emission estimates for 1950-2008 provided in this database are derived from the time series of global, regional, and national fossil-fuel CO2 emissions (Boden et al. 2011), the references therein, and the methodology described in Andres et al. (2011).
The data accessible here take these tabular, national, mass-emissions data and distribute them spatially on a one degree latitude by one degree longitude grid. The within-country spatial distribution is achieved through a fixed population distribution, as reported in Andres et al. (1996). This presentation introduces the newly built database and web interface, reflecting the present state and functionality of the Fossil-Fuel CO2 Emissions Database and Exploration System as well as future plans for expansion.
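The within-country gridding step described above, distributing a national emissions total over grid cells in proportion to a fixed population field, reduces to a simple proportional allocation. The sketch below uses hypothetical cell data and a made-up function name; the real procedure of Andres et al. (1996) involves full 1° x 1° population maps.

```python
def grid_emissions(national_total, cell_population):
    """Distribute a national mass-emissions total over grid cells in
    proportion to each cell's share of the national population.
    cell_population: dict mapping (lat_idx, lon_idx) -> population."""
    pop_sum = sum(cell_population.values())
    return {cell: national_total * pop / pop_sum
            for cell, pop in cell_population.items()}
```

By construction, the gridded values sum exactly back to the national total, which is the mass-conservation property such gridded emission products must preserve.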
ERIC Educational Resources Information Center
Zedeck, Sheldon, Ed.
2011-01-01
APA Books® announces the "APA Handbook of Industrial and Organizational Psychology"--the first offering in a new reference series covering core and emerging subdisciplines, the "APA Handbooks in Psychology." I/O Psychology is both a science/practice and an applied/basic research discipline. Appropriately, the "APA…
Stochastic Calculus and Differential Equations for Physics and Finance
NASA Astrophysics Data System (ADS)
McCauley, Joseph L.
2013-02-01
1. Random variables and probability distributions; 2. Martingales, Markov, and nonstationarity; 3. Stochastic calculus; 4. Ito processes and Fokker-Planck equations; 5. Self-similar Ito processes; 6. Fractional Brownian motion; 7. Kolmogorov's PDEs and Chapman-Kolmogorov; 8. Non-Markov Ito processes; 9. Black-Scholes, martingales, and Feynman-Kac; 10. Stochastic calculus with martingales; 11. Statistical physics and finance, a brief history of both; 12. Introduction to new financial economics; 13. Statistical ensembles and time series analysis; 14. Econometrics; 15. Semimartingales; References; Index.
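The Ito-process material in chapters 3-5 and the Black-Scholes chapter rest on simulating stochastic differential equations. A standard illustration (not taken from the book) is the Euler-Maruyama scheme for geometric Brownian motion, dS = mu*S dt + sigma*S dW:

```python
import math, random

def euler_maruyama_gbm(s0, mu, sigma, T, n, rng):
    """Simulate one path endpoint of geometric Brownian motion
    dS = mu*S dt + sigma*S dW using n Euler-Maruyama steps."""
    dt = T / n
    s = s0
    for _ in range(n):
        dw = rng.gauss(0.0, math.sqrt(dt))  # Brownian increment ~ N(0, dt)
        s += mu * s * dt + sigma * s * dw
    return s
```

With sigma = 0 the scheme reduces to compounding at rate mu and converges to s0*exp(mu*T) as the step count grows, a quick sanity check on the discretization.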
Wideband Detection and Classification of Practice Limpet Mines against Various Backgrounds
2008-07-01
…height variations. High-frequency imaging sonars make it possible to map the high-frequency reflectivity of the surface and… Figure 32: The cross-correlations (described in the text) of the echo time series with a reference plate echo. The first target is… (Fig. 20d) for the [17 57] kHz compensated pulse… (DRDC Atlantic TM 2008-079) Figure 34: The cross…
Price of gasoline: forecasting comparisons. [Box-Jenkins, econometric, and regression methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bopp, A.E.; Neri, J.A.
Gasoline prices are simulated using three popular forecasting methodologies: a Box-Jenkins type method, an econometric method, and a regression method. One-period-ahead and 18-period-ahead comparisons are made. For the one-period-ahead comparison, a Box-Jenkins type time-series model performed best, although all three did well. However, for the 18-period simulation, the econometric and regression methods perform substantially better than the Box-Jenkins formulation. A rationale for and implications of these results are discussed. 11 references.
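One reason a pure time-series model can win at one step ahead yet lose at 18 steps, as reported above, is that an AR(1)-type forecast decays geometrically toward the series mean and carries no structural (economic) information at long horizons. A minimal sketch, not the paper's actual models:

```python
def fit_ar1(x):
    """Estimate the mean and AR(1) coefficient by conditional least squares."""
    m = sum(x) / len(x)
    d = [v - m for v in x]
    num = sum(d[t] * d[t - 1] for t in range(1, len(d)))
    den = sum(v * v for v in d[:-1])
    return m, num / den

def ar1_forecast(last, mean, phi, h):
    """h-step-ahead AR(1) forecast: the deviation from the mean shrinks
    by phi each step, so long-horizon forecasts revert to the mean."""
    return mean + (phi ** h) * (last - mean)
```

With phi = 0.8 and a last observation 10 units above the mean, the 1-step forecast retains 8 of those units while the 18-step forecast retains under 0.2, which is why structural econometric models tend to dominate at long horizons.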
NASA Astrophysics Data System (ADS)
Rowlands, G.; Kiyani, K. H.; Chapman, S. C.; Watkins, N. W.
2009-12-01
Quantitative analysis of solar wind fluctuations is often performed in the context of intermittent turbulence and centers on methods to quantify statistical scaling, such as power spectra and structure functions, which assume a stationary process. The solar wind exhibits large-scale secular changes, so the question arises as to whether the time series of the fluctuations is non-stationary. One approach is to seek local stationarity by parsing the time interval over which statistical analysis is performed. Hence, natural systems such as the solar wind unavoidably provide observations over restricted intervals. Consequently, due to the reduction of sample size leading to poorer estimates, a stationary stochastic process (time series) can yield anomalous time variation in the scaling exponents, suggestive of nonstationarity. The variance in the estimates of scaling exponents computed from an interval of N observations is known, for finite-variance processes and certain statistical estimators, to vary as ~1/N as N becomes large; however, the convergence to this behavior depends on the details of the process and may be slow. We study the variation in the scaling of second-order moments of the time-series increments with N for a variety of synthetic and “real world” time series, and we find that, in particular for heavy-tailed processes and realizable N, one is far from this ~1/N limiting behavior. We propose a semiempirical estimate for the minimum N needed to make a meaningful estimate of the scaling exponents for model stochastic processes and compare these with some “real world” time series from the solar wind. With fewer data points a stationary time series becomes indistinguishable from a nonstationary process, and we illustrate this with nonstationary synthetic datasets. Reference article: K. H. Kiyani, S. C. Chapman and N. W. Watkins, Phys. Rev. E 79, 036109 (2009).
Global and Hemispheric Temperature Anomalies: Land and Marine Instrumental Records (1850 - 2015)
Jones, P. D. [Climatic Research Unit (CRU), University of East Anglia, Norwich, United Kingdom; Parker, D. E. [Hadley Centre for Climate Prediction and Research, Berkshire, United Kingdom; Osborn, T. J. [Climatic Research Unit (CRU), University of East Anglia, Norwich, United Kingdom; Briffa, K. R. [Climatic Research Unit (CRU), University of East Anglia, Norwich, United Kingdom
2016-05-01
These global and hemispheric temperature anomaly time series, which incorporate land and marine data, are continually updated and expanded by P. Jones of the Climatic Research Unit (CRU) with help from colleagues at the CRU and other institutions. Some of the earliest work in producing these temperature series dates back to Jones et al. (1986a,b,c), Jones (1988, 1994), and Jones and Briffa (1992). Most of the discussion of methods given here has been gleaned from the Frequently Asked Questions section of the CRU temperature data web pages. Users are encouraged to visit the CRU Web site for the most comprehensive overview of these data (the "HadCRUT4" dataset), other associated datasets, and the most recent literature references to the work of Jones et al.
Fitzmaurice, Gerard J.; Redmond, Karen C.; Fitzpatrick, David A.; Bartosik, Waldemar
2014-01-01
In keeping with international trends, lung cancer incidence and mortality are increasing among the Irish population, with many patients presenting with advanced disease that excludes the potential for curative management. Consequently, palliative treatment options for this patient group are being increasingly explored with varying degrees of success. Endobronchial stenosis represents a particularly challenging area of management among these patients, and a number of techniques have been described without the identification of a single gold standard. We report our experience of the first-time use of endobronchial cryotherapy in Ireland with reference to a case series, including an example of its use in the management of benign disease, in order to support patients with borderline lung function and enable definitive palliative treatment. PMID:24791176
Contemporaneous disequilibrium of bio-optical properties in the Southern Ocean
NASA Astrophysics Data System (ADS)
Kahru, Mati; Lee, Zhongping; Mitchell, B. Greg
2017-03-01
Significant changes in satellite-detected net primary production (NPP, mg C m-2 d-1) were observed in the Southern Ocean during 2011-2016: an increase in the Pacific sector and a decrease in the Atlantic sector. While no clear physical forcing was identified, we hypothesize that the changes in NPP were associated with changes in the phytoplankton community and reflected in the concomitant bio-optical properties. Satellite algorithms for chlorophyll a concentration (Chl a, mg m-3) use a combination of estimates of the remote sensing reflectance Rrs(λ) that are statistically fitted to a global reference data set. In any particular region or point in space/time the estimate produced by the global "mean" algorithm can deviate from the true value. Reflectance anomaly (RA) is supposed to remove the first-order variability in Rrs(λ) associated with Chl a and reveal bio-optical properties that are due to the composition of phytoplankton and associated materials. Time series of RA showed variability at multiple scales, including the life span of the sensor, multiyear and annual. Models of plankton functional types using estimated Chl a as input cannot be expected to correctly resolve regional and seasonal anomalies due to biases in the Chl a estimate that they are based on. While a statistical model using RA(λ) time series can predict the times series of NPP with high accuracy (R2 = 0.82) in both Pacific and Atlantic regions, the underlying mechanisms in terms of phytoplankton groups and the associated materials remain elusive.
A comparative simulation study of AR(1) estimators in short time series.
Krone, Tanja; Albers, Casper J; Timmerman, Marieke E
2017-01-01
Various estimators of the autoregressive model exist. We compare their performance in estimating the autocorrelation in short time series. In Study 1, under correct model specification, we compare the frequentist r1 estimator, C-statistic, ordinary least squares estimator (OLS) and maximum likelihood estimator (MLE), and a Bayesian method, considering flat (Bf) and symmetrized reference (Bsr) priors. In a completely crossed experimental design we vary the lengths of the time series (i.e., T = 10, 25, 40, 50 and 100) and the autocorrelation (from -0.90 to 0.90 in steps of 0.10). The results show the lowest bias for Bsr, and the lowest variability for r1. The power in different conditions is highest for Bsr and OLS. For T = 10, the absolute performance of all methods is poor, as expected. In Study 2, we study the robustness of the methods to misspecification by generating the data according to an ARMA(1,1) model, but still analysing the data with an AR(1) model. We use the two methods with the lowest bias for this study, i.e., Bsr and MLE. The bias gets larger when the non-modelled moving average parameter becomes larger. Both the variability and the power show dependency on the non-modelled parameter. The differences between the two estimation methods are negligible for all measurements.
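Two of the frequentist estimators compared above are easy to state explicitly: the classic lag-1 sample autocorrelation r1 and the OLS slope of x_t regressed on x_{t-1}. The sketch below gives textbook definitions for illustration (the Bayesian variants require a prior and are omitted); on short series these estimators can differ noticeably.

```python
def r1_estimator(x):
    """Lag-1 sample autocorrelation (the r1 estimator): cross-products
    of mean-centered neighbors over the full sum of squares."""
    m = sum(x) / len(x)
    d = [v - m for v in x]
    return sum(d[t] * d[t - 1] for t in range(1, len(d))) / sum(v * v for v in d)

def ols_estimator(x):
    """OLS slope of x_t regressed on x_{t-1}, with an intercept."""
    y, z = x[1:], x[:-1]
    my, mz = sum(y) / len(y), sum(z) / len(z)
    num = sum((a - my) * (b - mz) for a, b in zip(y, z))
    den = sum((b - mz) ** 2 for b in z)
    return num / den
```

On a length-10 alternating series the two disagree (r1 = -0.9 versus OLS = -1.0), a small illustration of the shrinkage that gives r1 its lower variability on short series.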
Analysis of the DORIS, GNSS, SLR, VLBI and gravimetric time series at the GGOS core sites
NASA Astrophysics Data System (ADS)
Moreaux, G.; Lemoine, F. G.; Luceri, V.; Pavlis, E. C.; MacMillan, D. S.; Bonvalot, S.; Saunier, J.
2017-12-01
Since June 2016 and the installation of a new DORIS station in Wettzell (Germany), four geodetic sites (Badary, Greenbelt, Wettzell and Yarragadee) are equipped with the four space geodetic techniques (DORIS, GNSS, SLR and VLBI). In line with the GGOS (Global Geodetic Observing System) objective of achieving a terrestrial reference frame at the millimetric level of accuracy, the combination centers of the four space techniques initiated a joint study to assess the level of agreement among these space geodetic techniques. In addition to the four sites, we will consider all the GGOS core sites including the seven sites with at least two space geodetic techniques in addition to DORIS. Starting from the coordinate time series, we will estimate and compare the mean positions and velocities of the co-located instruments. The temporal evolution of the coordinate differences will also be evaluated with respect to the local tie vectors and discrepancies will be investigated. Then, the analysis of the signal content of the time series will be carried out. Amplitudes and phases of the common signals among the techniques, and eventually from gravity data, will be compared. The first objective of this talk is to describe our joint study: the sites, the data, and the objectives. The second purpose is to present the first results obtained from the GGAO (Goddard Geophysical and Astronomic Observatory) site of Greenbelt.
NASA Astrophysics Data System (ADS)
Ma, M.
2015-12-01
The Qinghai-Tibet Plateau (QTP) is the world's highest and largest plateau and is sometimes referred to as "the roof of the world". An important "water tower", the QTP contains 1,091 lakes larger than 1.0 km2, which together account for 49.4% of the total lake area of China. Previous studies of lake area change in the QTP have mainly used middle-resolution remote sensing data (e.g., Landsat TM). In this study, coarse-resolution time series remote sensing data, MODIS data at a spatial resolution of 250 m, were used to monitor lake area changes in the QTP over the last 15 years. The dataset is MOD13Q1, and the Normalized Difference Vegetation Index (NDVI) is used to identify lake area: a pixel is classified as water when its NDVI is less than 0. The results show obvious intra-annual changes for most of the lakes. Therefore, the annual average and maximum lake areas were calculated from the time series data, which quantify the change characteristics better than a single scene of middle-resolution imagery. The results indicate large spatial variance in lake area changes across the QTP. The natural driving factors are analyzed to reveal the causes of these changes.
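The water-masking rule described above (classify a pixel as lake where NDVI < 0) is a one-line threshold once NDVI is computed from the red and near-infrared bands. A minimal sketch with hypothetical reflectance values; the study's actual processing of MOD13Q1 composites is more involved.

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and red reflectance."""
    return (nir - red) / (nir + red)

def lake_mask(nir_band, red_band):
    """Flag pixels as water where NDVI < 0: water absorbs strongly in the
    near-infrared, so its NDVI is negative, unlike soil or vegetation."""
    return [[ndvi(n, r) < 0.0 for n, r in zip(nrow, rrow)]
            for nrow, rrow in zip(nir_band, red_band)]
```

Applying the mask to each 16-day composite in the time series and summing water pixels gives the per-date lake area from which annual averages and maxima can be derived.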
NASA Technical Reports Server (NTRS)
Firestone, Elaine R. (Editor); Hooker, Stanford B.
1998-01-01
The Sea-viewing Wide Field-of-view Sensor (SeaWiFS) is the follow-on ocean color instrument to the Coastal Zone Color Scanner (CZCS), which ceased operations in 1986, after an eight-year mission. SeaWiFS was launched on 1 August 1997, on the SeaStar satellite, built by Orbital Sciences Corporation (OSC). The SeaWiFS Project at the National Aeronautics and Space Administration (NASA) Goddard Space Flight Center (GSFC) undertook the responsibility of documenting all aspects of this mission, which is critical to the ocean color and marine science communities. This documentation, entitled the SeaWiFS Technical Report Series, is in the form of NASA Technical Memorandum Number 104566 and 1998-104566. All reports published are volumes within the series. This particular volume, which is the last of the so-called Prelaunch Series, serves as a reference, or guidebook, to the previous 42 volumes and consists of 6 sections including: an addenda, an errata, an index to key words and phrases, lists of acronyms and symbols used, and a list of all references cited. The editors have published a cumulative index of this type after every five volumes. Each index covers the reference topics published in all previous editions, that is, each new index includes all of the information contained in the preceding indexes with the exception of any addenda.
Rollinson, Njal; Holt, Sarah M; Massey, Melanie D; Holt, Richard C; Nancekivell, E Graham; Brooks, Ronald J
2018-05-01
Temperature has a strong effect on ectotherm development rate. It is therefore possible to construct predictive models of development that rely solely on temperature, which have applications in a range of biological fields. Here, we leverage a reference series of development stages for embryos of the turtle Chelydra serpentina, which was described at a constant temperature of 20 °C. The reference series acts to map each distinct developmental stage onto embryonic age (in days) at 20 °C. By extension, an embryo taken from any given incubation environment, once staged, can be assigned an equivalent age at 20 °C. We call this concept "Equivalent Development", as it maps the development stage of an embryo incubated at a given temperature to its equivalent age at a reference temperature. In the laboratory, we used the concept of Equivalent Development to estimate development rate of embryos of C. serpentina across a series of constant temperatures. Using these estimates of development rate, we created a thermal performance curve measured in units of Equivalent Development (TPC-ED). We then used the TPC-ED to predict developmental stage of embryos in several natural turtle nests across six years. We found that 85% of the variation of development stage in natural nests could be explained. Further, we compared the predictive accuracy of the model based on the TPC-ED to the predictive accuracy of a degree-day model, where development is assumed to be linearly related to temperature and the amount of accumulated heat is summed over time. Information theory suggested that the model based on the TPC-ED better describes variation in developmental stage in wild nests than the degree-day model. We suggest the concept of Equivalent Development has several strengths and can be broadly applied.
In particular, studies on temperature-dependent sex determination may be facilitated by the concept of Equivalent Development, as development age maps directly onto the developmental series of the organism, allowing critical periods of sex determination to be delineated without invasive sampling, even under fluctuating temperature. Copyright © 2018 Elsevier Ltd. All rights reserved.
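The Equivalent Development bookkeeping described above reduces to integrating a thermal performance curve over the temperature record: each real day at temperature T contributes tpc(T) reference-days, with the curve normalized so that a day at the 20 °C reference contributes exactly one. The curve below is hypothetical, purely to illustrate the accounting; the paper fits its TPC from laboratory data.

```python
def equivalent_age(temps_by_day, tpc):
    """Accumulate equivalent days at the 20 deg C reference temperature.
    temps_by_day: one mean temperature per day of incubation.
    tpc: callable giving development rate relative to 20 deg C
         (normalized so tpc(20.0) == 1.0)."""
    return sum(tpc(t) for t in temps_by_day)

def example_tpc(t):
    """Hypothetical linear TPC, zero below 5 deg C, rate 1.0 at 20 deg C."""
    return max(0.0, (t - 5.0) / 15.0)
```

Because the sum is taken day by day, the same accounting works under fluctuating nest temperatures, which is what lets the reference series assign a developmental stage without invasive sampling.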
Liu, Jing; Wang, Qun; Sun, Minying; Zhu, Linlin; Yang, Michael; Zhao, Yu
2014-01-01
Quantitative real-time reverse transcription PCR (qRT-PCR) has become a widely used method for gene expression analysis; however, its data interpretation largely depends on the stability of reference genes. The transcriptomics of Panax ginseng, one of the most popular and traditional ingredients used in Chinese medicines, is increasingly being studied. Furthermore, it is vital to establish a series of reliable reference genes when qRT-PCR is used to assess the gene expression profile of ginseng. In this study, we screened candidate reference genes for ginseng using gene expression data generated by a high-throughput sequencing platform. Based on the statistical tests, 20 reference genes (10 traditional housekeeping genes and 10 novel genes) were selected. These genes were tested for the normalization of expression levels in five growth stages and three distinct plant organs of ginseng by qPCR. These genes were subsequently ranked and compared according to the stability of their expression using the geNorm, NormFinder, and BestKeeper computational programs. Although the best reference genes were found to vary across different samples, CYP and EF-1α were the most stable genes amongst all samples. GAPDH/30S RPS20, CYP/60S RPL13 and CYP/QCR were the optimum pairs of reference genes in the roots, stems, and leaves. CYP/60S RPL13, CYP/eIF-5A, aTUB/V-ATP, eIF-5A/SAR1, and aTUB/pol IIa were the most stably expressed combinations in each of the five developmental stages. Our study serves as a foundation for developing an accurate method of qRT-PCR and will benefit future studies on gene expression profiles of Panax ginseng.
Gaussian Process Kalman Filter for Focal Plane Wavefront Correction and Exoplanet Signal Extraction
NASA Astrophysics Data System (ADS)
Sun, He; Kasdin, N. Jeremy
2018-01-01
Currently, the ultimate limitation of space-based coronagraphy is the ability to subtract the residual PSF after wavefront correction to reveal the planet. Called reference difference imaging (RDI), the technique consists of conducting wavefront control to collect the reference point spread function (PSF) by observing a bright star, and then extracting target planet signals by subtracting a weighted sum of reference PSFs. Unfortunately, this technique is inherently inefficient because it spends a significant fraction of the observing time on the reference star rather than the target star with the planet. Recent progress in model based wavefront estimation suggests an alternative approach. A Kalman filter can be used to estimate the stellar PSF for correction by the wavefront control system while simultaneously estimating the planet signal. Without observing the reference star, the (extended) Kalman filter directly utilizes the wavefront correction data and combines the time series observations and model predictions to estimate the stellar PSF and planet signals. Because wavefront correction is used during the entire observation with no slewing, the system has inherently better stability. In this poster we show our results aimed at further improving our Kalman filter estimation accuracy by including not only temporal correlations but also spatial correlations among neighboring pixels in the images. This technique is known as a Gaussian process Kalman filter (GPKF). We also demonstrate the advantages of using a Kalman filter rather than RDI by simulating a real space exoplanet detection mission.
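The estimation idea above can be reduced to its simplest case: a scalar Kalman filter tracking a constant signal (such as a static planet intensity in one pixel) through a sequence of noisy frames; the posterior variance shrinks with every frame. This is a deliberately stripped-down sketch, without the extended-filter stellar-PSF dynamics or the spatial (Gaussian-process) correlations of the GPKF.

```python
def kalman_constant(measurements, meas_var, prior_mean, prior_var):
    """Scalar Kalman filter for a constant state observed in Gaussian
    noise of variance meas_var. Returns (posterior_mean, posterior_var)."""
    mean, var = prior_mean, prior_var
    for z in measurements:
        k = var / (var + meas_var)    # Kalman gain
        mean = mean + k * (z - mean)  # measurement update
        var = (1.0 - k) * var         # posterior variance shrinks
    return mean, var
```

Because every observing frame updates the estimate, no separate reference-star visits are required, which is the efficiency argument made in the abstract.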
Nonis, Alberto; Vezzaro, Alice; Ruperti, Benedetto
2012-07-11
Genome wide transcriptomic surveys together with targeted molecular studies are uncovering an ever increasing number of differentially expressed genes in relation to agriculturally relevant processes in olive (Olea europaea L). These data need to be supported by quantitative approaches enabling the precise estimation of transcript abundance. qPCR being the most widely adopted technique for mRNA quantification, preliminary work needs to be done to set up robust methods for extraction of fully functional RNA and for the identification of the best reference genes to obtain reliable quantification of transcripts. In this work, we have assessed different methods for their suitability for RNA extraction from olive fruits and leaves and we have evaluated thirteen potential candidate reference genes on 21 RNA samples belonging to fruit developmental/ripening series and to leaves subjected to wounding. By using two different algorithms, GAPDH2 and PP2A1 were identified as the best reference genes for olive fruit development and ripening, and their effectiveness for normalization of expression of two ripening marker genes was demonstrated.
NASA Astrophysics Data System (ADS)
Heinkelmann, R.; Belda-Palazon, S.; Ferrándiz, J.; Schuh, H.
2015-08-01
For applications in Earth sciences, navigation, and astronomy, the celestial (ICRF) and terrestrial (ITRF) reference frames, as well as the orientation between them given by the Earth orientation parameters (EOP), have to be consistent at the level of 1 mm and 0.1 mm/yr (GGOS recommendations). We assess the effect of unmodelled geophysical signals in the regularized coordinates and the sensitivity with respect to different a priori EOP and celestial reference frames. The EOP are determined using the same VLBI data but with station coordinates fixed to different TRFs. The conclusion is that, within the time span of data incorporated into ITRF2008 (Altamimi et al., 2011), the ITRF2008 and the IERS 08 C04 are consistent. This consistency implies that non-linear station motions, such as unmodelled geophysical signals, partly affect the IERS 08 C04 EOP. There are small but not negligible inconsistencies between the conventional celestial reference frame, ICRF2 (Fey et al., 2009), the ITRF2008, and the conventional EOP, which are quantified by comparing VTRF2008 (Böckmann et al., 2010) and ITRF2008.
Whitfield, Paul H.; Burn, Donald H.; Hannaford, Jamie; Higgins, Hélène; Hodgkins, Glenn A.; Marsh, Terry; Looser, Ulrich
2012-01-01
Identifying climate-driven trends in river flows on a global basis is hampered by a lack of long, high-quality time series for rivers with relatively undisturbed regimes. This is a global problem compounded by the lack of support for essential long-term monitoring. Experience demonstrates that, with clear strategic objectives and the support of sponsoring organizations, reference hydrologic networks can constitute an exceptionally valuable data source for effectively identifying, quantifying and interpreting hydrological change—the speed and magnitude of which is expected to be a primary driver of water management and flood alleviation strategies in the future—and for additional applications. Reference hydrologic networks have been developed in many countries over the past few decades. These collections of streamflow gauging stations, which are maintained and operated with the intention of observing how the hydrology of watersheds responds to variations in climate, are described. The status of networks under development is summarized. We suggest a plan of actions to make more effective use of this collection of networks.
36 CFR § 1237.16 - How do agencies store audiovisual records?
Code of Federal Regulations, 2013 CFR
2013-07-01
... textual series (e.g., store poster series separately from other kinds of agency publications, or... reference, see § 1237.3); (e) Store posters and similar graphic works in oversize formats, in map cases...
Integrating Analysis Goals for EOP, CRF and TRF
NASA Technical Reports Server (NTRS)
Ma, Chopo; MacMillan, D.; Petrov, L.; Smith, David E. (Technical Monitor)
2001-01-01
In a simplified, idealized way, the TRF can be considered a set of positions at epoch with corresponding linear rates of change, while the CRF is a set of fixed directions in space. VLBI analysis can be optimized for the CRF and TRF separately while handling some of the complexity of geodetic and astrometric reality. For EOP time series, both the CRF and TRF should be accurate at the epoch of interest and well defined over time. An optimal integrated EOP, TRF and CRF solution from a single VLBI configuration requires detailed consideration of the data set and of the possibly conflicting nature of the reference frames.
Global Vertical Rates from VLBI
NASA Technical Reports Server (NTRS)
Ma, Chopo; MacMillan, D.; Petrov, L.
2003-01-01
The analysis of global VLBI observations provides vertical rates for 50 sites with formal errors less than 2 mm/yr and median formal error of 0.4 mm/yr. These sites are largely in Europe and North America with a few others in east Asia, Australia, South America and South Africa. The time interval of observations is up to 20 years. The error of the velocity reference frame is less than 0.5 mm/yr, but results from several sites with observations from more than one antenna suggest that the estimated vertical rates may have temporal variations or non-geophysical components. Comparisons with GPS rates and corresponding site position time series will be discussed.
NASA Astrophysics Data System (ADS)
Shi, Wenhui; Feng, Changyou; Qu, Jixian; Zha, Hao; Ke, Dan
2018-02-01
Most existing studies of wind power output focus on the fluctuation of individual wind farms, while the spatial self-complementarity of wind power output time series has been ignored. The existing probability models therefore cannot reflect the features of power systems incorporating wind farms. This paper analyzes the spatial self-complementarity of wind power and proposes a probability model that reflects the temporal characteristics of wind power on seasonal and diurnal timescales, based on sufficient measured data and an improved clustering method. The model can provide an important reference for simulations of power systems incorporating wind farms.
Determination of impurities in uranium matrices by time-of-flight ICP-MS using matrix-matched method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buerger, Stefan; Riciputi, Lee R; Bostick, Debra A
2007-01-01
The analysis of impurities in uranium matrices is performed in a variety of fields, e.g. for quality control in the production stream converting uranium ores to fuels, as element signatures in nuclear forensics and safeguards, and for non-proliferation control. We have investigated the capabilities of time-of-flight ICP-MS for the analysis of impurities in uranium matrices using a matrix-matched method. The method was applied to the New Brunswick Laboratory CRM 124(1-7) series. For the seven certified reference materials, an overall precision and accuracy of approximately 5% and 14%, respectively, were obtained for 18 analyzed elements.
NASA Astrophysics Data System (ADS)
Martini, Luiz Cesar
2014-04-01
This article follows from the Dimensional Continuous Space-Time Theory published in reference 1. That theory presents a series of claims about matter, energy and space, and concludes that empty space is inelastic, absolutely stationary, motionless and perpetual, with no possibility of deformation, and that it can be neither destroyed nor created. An elementary cell of empty space, or any amount of empty space, can be occupied by any quantity of energy or matter without alteration or deformation. As a consequence of these properties, and as an integral part of the theory, the principles of Relativity Theory must be changed to become simple and intuitive.
Lai, Jih-Sheng; Liu, Changrong; Ridenour, Amy
2009-04-14
A DC/DC converter has a transformer with primary coils connected to the input side and secondary coils connected to the output side. Each primary coil is connected to a full-bridge circuit comprising two switches on each of two legs, the primary coil being connected between the switches of each leg; the full-bridge circuits are connected in parallel, with the legs disposed parallel to one another, and the secondary coils are connected to a rectifying circuit. An outer-loop control circuit that reduces ripple in a voltage reference has a first resistor connected in series with a second resistor and a first capacitor, this series branch being connected in parallel with a second capacitor. An inner-loop control circuit that reduces ripple in a current reference has a third resistor connected in series with a fourth resistor and a third capacitor, this series branch being connected in parallel with a fourth capacitor.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-01
... to this AD because it is part of the Bombardier CRJ Series Regional Jet Aircraft Maintenance Manual... transportation, Aircraft, Aviation safety, Incorporation by reference, Safety. Adoption of the Amendment... Task 30-11-41-820-801 of the Canadair CRJ Series Regional Jet Aircraft Maintenance Manual. (5) Within...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-10
... Series 705) Airplanes, and Model CL-600-2D24 (Regional Jet Series 900) Airplanes AGENCY: Federal Aviation... Part 39 Air transportation, Aircraft, Aviation safety, Incorporation by reference, Safety. The Proposed... DEPARTMENT OF TRANSPORTATION Federal Aviation Administration 14 CFR Part 39 [Docket No. FAA-2010...
NASA Astrophysics Data System (ADS)
Shirzaei, Manoochehr; Walter, Thomas
2010-05-01
Volcanic unrest and eruptions are among the major natural hazards, next to earthquakes, floods, and storms. It has been shown that many volcanic and tectonic unrests are triggered by changes in the stress field induced by nearby seismic and magmatic activity. In this study, as part of a mobile volcano fast-response system called "Exupery" (www.exupery-vfrs.de), we present a system for assessing, in near real time, the stress field excited by volcanic activity. This system includes: (1) an approach called "WabInSAR", dedicated to advanced processing of satellite data and providing accurate time series of surface deformation [1, 2]; (2) a time-dependent inverse source modeling method to investigate the source of volcanic unrest using observed surface deformation data [3, 4]; (3) the assessment of changes in the stress field induced by magmatic activity at nearby volcanic and tectonic systems. The system is implemented in a recursive manner that allows handling large 3D data sets in an efficient and robust way, which is a requirement for an early warning system. We have applied and validated this arrangement at Mauna Loa volcano, Hawaii Island, to assess the influence of the time-dependent activity of Mauna Loa on earthquake occurrence in the Kaoiki seismic zone. References: [1] M. Shirzaei and T. R. Walter, "Wavelet based InSAR (WabInSAR): a new advanced time series approach for accurate spatiotemporal surface deformation monitoring," IEEE, submitted, 2010. [2] M. Shirzaei and T. R. Walter, "Deformation interplay at Hawaii Island through InSAR time series and modeling," J. Geophys. Res., submitted, 2009. [3] M. Shirzaei and T. R. Walter, "Randomly Iterated Search and Statistical Competency (RISC) as powerful inversion tools for deformation source modeling: application to volcano InSAR data," J. Geophys. Res., vol. 114, B10401, doi:10.1029/2008JB006071, 2009. [4] M. Shirzaei and T. R. Walter, "Genetic algorithm combined with Kalman filter as a powerful tool for nonlinear time-dependent inverse modelling: application to volcanic deformation time series," J. Geophys. Res., submitted, 2010.
A comparison of ITRF2014, DTRF2014 and JTRF2014 polar motion series with geophysical excitation data
NASA Astrophysics Data System (ADS)
Rebischung, Paul; Chen, Wei; Ray, Jim
2017-04-01
Three solutions were generated in response to the 2014 update by the IERS of the International Terrestrial Reference Frame: ITRF2014, the official solution from IGN; DTRF2014, from DGFI; and JTRF2014, from JPL. Each incorporates essentially the same time series information of geocentric station positions, Earth rotation parameters, and their associated variance-covariances from the four contributing space geodetic techniques (SLR, VLBI, DORIS, GNSS), plus local 3D vector ties (measured by conventional surveying methods) that relate a subset of co-located stations. Given that measurements by all the techniques, as well as the local ties, suffer significant unmodeled systematic errors that are poorly understood, the covariance matrices are not reliable except in their geometrical aspect. So the three combination strategies differ not just in their mathematical procedures, but more importantly in how the systematic errors are handled (or not). Factors include the relative weighting of inputs, modeling of non-linear station motions, detection of time series discontinuities, etc. The final combination results therefore also differ, mostly in rather subtle ways. There are very few ways to make external evaluations of the quality of the various combinations, as independent observations are generally not accurate enough. However, one approach has been shown to give useful insight: comparing the daily polar motions with geophysical excitations computed from global circulation models for the atmosphere, ocean, and hydrology. J. Kouba (2010) did this for ITRF2008 and DTRF2008 and found an excess of high-frequency rotational scatter in the DGFI solution. Since the development of the IGS in the 1990s, the ITRF daily polar motion accuracy has been about 30 µas, or 1 mm of surface rotation. The corresponding geophysical models are not nearly so accurate, but their independence provides a valuable reference against which the geodetic results can be compared.
Direct inter-comparisons of the three combined polar motion series and the IGS-only series (which predominates since 2000) already reveal interesting features: seasonal amplitudes vary markedly, up to 20 µas for the annual term in one case; differences for periods longer than monthly are greater than found in 2008; and 7-day harmonics are found in one series but not the others. We also apply the refined polar motion excitation theory of W. Chen et al. (2013), which incorporates frequency-dependent effects and updated Earth parameters, to further study the 2014 frame solutions. Results will be presented in the poster.
Investigating the creeping section of the San Andreas Fault using ALOS PALSAR interferometry
NASA Astrophysics Data System (ADS)
Agram, P. S.; Wortham, C.; Zebker, H. A.
2010-12-01
In recent years, time-series InSAR techniques have been used to study the temporal characteristics of various geophysical phenomena that produce surface deformation, including earthquakes and magma migration in volcanoes. Conventional InSAR and time-series InSAR techniques have also been used successfully to study aseismic creep across faults in urban areas such as the Northern Hayward Fault in California [1-3]. However, application of these methods to time-dependent creep across the Central San Andreas Fault using the C-band ERS and Envisat radar satellites has met with limited success. While these techniques estimate the average long-term far-field deformation rates reliably, creep measurement close to the fault (< 3-4 km) is virtually impossible due to heavy decorrelation at C-band (6 cm wavelength). Shanker and Zebker (2009) [4] used the Persistent Scatterer (PS) time-series InSAR technique to estimate a time-dependent, non-uniform creep signal across a section of the creeping segment of the San Andreas Fault. However, the identified PS network was spatially too sparse (about 1 per sq. km) to study the temporal characteristics of deformation close to the fault. In this work, we use L-band (24 cm wavelength) SAR data from the PALSAR instrument on board the ALOS satellite, launched by the Japan Aerospace Exploration Agency (JAXA) in 2006, to study the temporal characteristics of creep across the Central San Andreas Fault. The longer wavelength at L-band improves observed correlation over the entire scene, significantly increasing the ground area coverage of estimated deformation in each interferogram, but at the cost of decreased sensitivity of the interferometric phase to surface deformation. However, noise levels in our deformation estimates can be decreased by combining information from multiple SAR acquisitions using time-series InSAR techniques.
We analyze 13 SAR acquisitions spanning the period from March 2007 to December 2009 using the Small Baseline Subset (SBAS) time-series InSAR technique [3]. We present detailed comparisons of the estimated time series of fault creep as a function of position along the fault, including the locked section around Parkfield, CA. We also present comparisons between the InSAR time series and GPS network observations in the Parkfield region. Over these three years of observation, the average fault creep is estimated to be 35 mm/yr. References: [1] Bürgmann, R., E. Fielding and J. Sukhatme, Slip along the Hayward fault, California, estimated from space-based synthetic aperture radar interferometry, Geology, 26, 559-562, 1998. [2] Ferretti, A., C. Prati and F. Rocca, Permanent Scatterers in SAR Interferometry, IEEE Trans. Geosci. Remote Sens., 39, 8-20, 2001. [3] Lanari, R., F. Casu, M. Manzo, and P. Lundgren, Application of SBAS D-InSAR technique to fault creep: A case study of the Hayward Fault, California, Remote Sensing of Environment, 109(1), 20-28, 2007. [4] Shanker, A. P., and H. Zebker, Edgelist phase unwrapping algorithm for time-series InSAR, J. Opt. Soc. Am. A, 37(4), 2010.
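The SBAS approach named above combines many short-baseline interferograms into a single displacement history per pixel by least squares. The following is a minimal single-pixel sketch of that inversion step on synthetic numbers; it deliberately ignores orbital ramps, atmospheric corrections, and the network/rank-deficiency handling of the full algorithm.

```python
import numpy as np

def sbas_invert(pairs, ifg_disp, n_dates):
    """Minimal SBAS-style inversion for one pixel (illustrative sketch only).

    pairs:    list of (master_idx, slave_idx) acquisition-date indices.
    ifg_disp: unwrapped displacement of each interferogram (slave - master).
    Returns displacement at each date relative to the first acquisition.
    """
    A = np.zeros((len(pairs), n_dates - 1))
    for row, (m, s) in enumerate(pairs):
        if s > 0:
            A[row, s - 1] = 1.0   # + displacement at slave date
        if m > 0:
            A[row, m - 1] = -1.0  # - displacement at master date
    disp, *_ = np.linalg.lstsq(A, np.asarray(ifg_disp, float), rcond=None)
    return np.concatenate([[0.0], disp])  # date 0 is the reference

# Synthetic steady creep of 1 unit per epoch across 4 acquisition dates.
pairs = [(0, 1), (1, 2), (2, 3), (0, 2), (1, 3)]
truth = np.array([0.0, 1.0, 2.0, 3.0])
ifg = [truth[s] - truth[m] for m, s in pairs]
est = sbas_invert(pairs, ifg, 4)
```

With a connected interferogram network the system is overdetermined and the least-squares solution recovers the displacement history exactly in this noise-free example.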
A Geodetic Strain Rate Model for the Pacific-North American Plate Boundary, western United States
NASA Astrophysics Data System (ADS)
Kreemer, C.; Hammond, W. C.; Blewitt, G.; Holland, A. A.; Bennett, R. A.
2012-04-01
We present a model of crustal strain rates derived from GPS measurements of horizontal station velocities in the Pacific-North American plate boundary zone in the western United States. The model reflects a best estimate of present-day deformation from the San Andreas fault system in the west to the Basin and Range province in the east. Of the 2,846 GPS velocities used in the model, 1,197 were derived by us and 1,649 are taken from (mostly) published results. The velocities we derived (the "UNR solution") are estimated from GPS position time series of continuous and semi-continuous stations for which data are publicly available. We estimated ITRF2005 positions from 2002-2011.5 using JPL's GIPSY-OASIS II software, with ambiguity resolution applied using our custom Ambizap software. Only stations with time series spanning at least 2.25 years are considered. We removed continental-scale common-mode errors from the time series using a spatially varying filtering technique. Velocity uncertainties (typically 0.1-0.3 mm/yr) assume that the time series contain flicker plus white noise. We used a subset of stations on the stable parts of the Pacific and North American plates to estimate the Pacific-North American pole of rotation. This pole is applied as a boundary condition to the model, and the North America-ITRF2005 pole is used to rotate our velocities into a North America fixed reference frame. We do not include parts of the time series that show curvature due to post-seismic deformation after major earthquakes, and we also exclude stations whose time series display a significant unexplained non-linearity or that are near volcanic centers. Transient effects longer than the observation period (i.e., slow viscoelastic relaxation) are left in the data. We added to the UNR solution velocities from 12 other studies.
The velocities are transformed onto the UNR solution's reference frame by estimating and applying a translation and rotation that minimize the velocity differences at collocated stations. We removed obvious outliers and velocities in areas that we identified as undergoing subsidence, likely due to excessive water pumping. For the strain rate calculations we excluded GPS stations with anomalous vertical motion or annual horizontal periodicity, which are indicators of local site instability. First, we used the stations from the UNR solution to create a Delaunay triangulation and estimated the horizontal strain rate components (and rigid-body rotation) for each triangle in a linear least-squares inversion using the horizontal velocities as input. Some level of spatial damping was applied to minimize unnecessary spatial variation in the model parameters. The strain rate estimates were then used as a priori strain rate variances in a method that fits continuous bi-cubic Bessel spline functions through the velocity gradient field while minimizing the weighted misfit to all velocities. A minimal level of spatial smoothing of the variances was applied. The strain rate tensor model is shown by contours of the second invariant of the tensor, which is a measure of amplitude that is independent of the coordinate frame. We also show a map of the tensor style and of the signal-to-noise ratio of the model.
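The per-triangle step described above (horizontal strain-rate components plus rigid-body rotation from three station velocities via linear least squares) can be sketched as follows. The inputs are synthetic and the function illustrates the general method, not the authors' code.

```python
import numpy as np

def triangle_strain_rate(xy, vel):
    """Horizontal strain-rate tensor from three GPS stations (a sketch).

    xy:  (3, 2) station coordinates; vel: (3, 2) horizontal velocities.
    Solves v_i = v0 + L (x_i - x0) for the uniform velocity gradient L,
    then splits L into strain rate (symmetric part) and rotation rate.
    """
    x0 = xy.mean(axis=0)
    A = np.zeros((6, 6))
    b = vel.ravel()
    for i in range(3):
        dx, dy = xy[i] - x0
        A[2 * i]     = [1, 0, dx, dy, 0, 0]   # vx = vx0 + Lxx dx + Lxy dy
        A[2 * i + 1] = [0, 1, 0, 0, dx, dy]   # vy = vy0 + Lyx dx + Lyy dy
    p = np.linalg.solve(A, b)                 # exactly determined: 6 eqs, 6 unknowns
    L = np.array([[p[2], p[3]], [p[4], p[5]]])
    strain = 0.5 * (L + L.T)                  # strain-rate tensor
    rotation = 0.5 * (L[1, 0] - L[0, 1])      # rigid-body rotation rate
    return strain, rotation

# Pure uniaxial extension of 1e-8 /yr in x: no shear, no rotation.
xy = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0]])
vel = np.array([1e-8 * xy[:, 0], np.zeros(3)]).T
strain, rot = triangle_strain_rate(xy, vel)
second_invariant = np.sqrt(np.sum(strain**2))  # frame-independent amplitude
```

The second invariant computed at the end is the frame-independent amplitude measure that the abstract says is contoured in the final model.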
Identification of flood-rich and flood-poor periods in flood series
NASA Astrophysics Data System (ADS)
Mediero, Luis; Santillán, David; Garrote, Luis
2015-04-01
Recently, a general concern about the non-stationarity of flood series has arisen, as changes in catchment response can be driven by several factors, such as climate and land-use change. Several studies to detect trends in flood series at national or trans-national scales have been conducted. Trends are usually detected by the Mann-Kendall test. However, the results of this test depend on the starting and ending years of the series, so different periods can lead to different results. In particular, the results can be conditioned by flood-poor and flood-rich periods located at the beginning or end of the series. A methodology to identify statistically significant flood-rich and flood-poor periods is developed, based on the comparison between the expected sampling variability of floods under the assumption of stationarity and the observed variability of floods in a given series. The methodology is applied to a set of long series of annual maximum floods, peaks over threshold, and counts of annual occurrences in peaks-over-threshold series observed in Spain in the period 1942-2009. Mediero et al. (2014) found a general decreasing trend in flood series in some parts of Spain that could be caused by a flood-rich period observed in 1950-1970, located at the beginning of the flood series. The results of this study support the findings of Mediero et al. (2014), as a flood-rich period in 1950-1970 was identified at most of the selected sites. References: Mediero, L., Santillán, D., Garrote, L., Granados, A. Detection and attribution of trends in magnitude, frequency and timing of floods in Spain, Journal of Hydrology, 517, 1072-1088, 2014.
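The Mann-Kendall test mentioned above can be sketched in a few lines. This basic version omits the tie and serial-correlation corrections used in practice, and the synthetic series below are illustrative only.

```python
import math
import itertools

def mann_kendall(x):
    """Mann-Kendall trend test (basic sketch, no tie correction).

    Returns the S statistic and the normal-approximation Z score; the sign
    of Z gives the trend direction and |Z| > 1.96 is significant at 5%.
    """
    n = len(x)
    # S counts concordant minus discordant pairs (later value vs earlier value)
    s = sum((b > a) - (b < a) for a, b in itertools.combinations(x, 2))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

# A monotonically increasing series attains the maximum S = n(n-1)/2.
s_up, z_up = mann_kendall(list(range(20)))
s_flat, z_flat = mann_kendall([3.0] * 20)
```

Because S is a sum over all pairs of years, adding or removing a flood-rich block at either end of the record shifts S directly, which is exactly the start/end-year sensitivity the abstract describes.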
USDA-ARS?s Scientific Manuscript database
Establishment of a metrology-based measurement system requires the solid foundation of traceability of measurements to available, appropriate certified reference materials (CRM). In the early 1970s the first “biological” Reference Materials (RM), Bowen's Kale, Orchard Leaves, and Bovine Liver from ...
Oakland County Science Safety Series: Reference Guide for Elementary Science.
ERIC Educational Resources Information Center
Crowder, Betty Pogue; And Others
This reference guide is designed to organize and suggest acceptable practices and procedures for dealing with safety in elementary science instruction. It is intended as a reference for teachers, administrators, and other school staff in planning for science activities and in making daily safety decisions. Topics covered in the guide include: (1)…
Oakland County Science Safety Series: Reference Guide for Biology.
ERIC Educational Resources Information Center
Bury, Dan; And Others
This reference guide is designed to organize and suggest acceptable practices and procedures for dealing with safety in the area of biology instruction. It is intended as a reference for teachers, administrators, and other school staff in planning for science activities and in making daily safety decisions. Discussions deal with responsibility for…
Swahili Learners' Reference Grammar. African Language Learners' Reference Grammar Series.
ERIC Educational Resources Information Center
Thompson, Katrina Daly; Schleicher, Antonia Folarin
This reference grammar is written for speakers of English who are learning Swahili. Because many language learners are not familiar with the grammatical terminology, this book explains the basic terminology and concepts of English grammar that are necessary for understanding the grammar of Swahili. It assumes no formal knowledge of English grammar…
The Reference Process and the Philosophy of Karl Popper.
ERIC Educational Resources Information Center
Neill, S. D.
1985-01-01
Two aspects of Karl Popper's philosophy are applied to reference process: process is viewed as series of problem-solving situations amenable to analysis using Popper's problem-solving schema. Reference interview is analyzed in context of Popper's postulate that books contain autonomous world of ideas existing apart from mind of knower. (30…
Sumi, Tomonari; Maruyama, Yutaka; Mitsutake, Ayori; Koga, Kenichiro
2016-06-14
In the conventional classical density functional theory (DFT) for simple fluids, an ideal gas is usually chosen as the reference system because there is a one-to-one correspondence between the external field and the density distribution function, and the exact intrinsic free-energy functional is available for the ideal gas. In this case, the second-order density functional Taylor series expansion of the excess intrinsic free-energy functional provides the hypernetted-chain (HNC) approximation. Recently, it has been shown that the HNC approximation significantly overestimates the solvation free energy (SFE) for an infinitely dilute Lennard-Jones (LJ) solution, especially when the solute particles are several times larger than the solvent particles [T. Miyata and J. Thapa, Chem. Phys. Lett. 604, 122 (2014)]. In the present study, we propose a reference-modified density functional theory as a systematic approach to improve the SFE functional as well as the pair distribution functions. The second-order density functional Taylor series expansion of the excess part of the intrinsic free-energy functional, in which a hard-sphere fluid is introduced as the reference system instead of an ideal gas, is applied to pure LJ liquids and infinitely dilute LJ solutions and is shown to remedy the drawbacks of the HNC approximation remarkably well. Furthermore, a third-order density functional expansion approximation, in which a factorization approximation is applied to the triplet direct correlation function, is examined for the LJ systems. We also show that the third-order contribution can yield further refinements of both the pair distribution function and the excess chemical potential for pure LJ liquids.
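The second-order functional Taylor expansion referred to in this record can be written, in standard classical-DFT notation (a sketch; the symbols follow common textbook usage rather than the paper itself):

```latex
F_{\mathrm{ex}}[\rho] \approx F_{\mathrm{ex}}[\rho_0]
  + \int \mathrm{d}\mathbf{r}\, \mu_{\mathrm{ex}}(\mathbf{r})\, \Delta\rho(\mathbf{r})
  - \frac{k_{\mathrm B} T}{2} \iint \mathrm{d}\mathbf{r}\, \mathrm{d}\mathbf{r}'\,
    c^{(2)}(\mathbf{r}, \mathbf{r}')\, \Delta\rho(\mathbf{r})\, \Delta\rho(\mathbf{r}'),
\qquad \Delta\rho = \rho - \rho_0 ,
```

where $c^{(2)}$ is the two-particle direct correlation function of the reference system. Truncating at this order with an ideal-gas reference recovers the HNC approximation; the paper instead evaluates the expansion about a hard-sphere reference fluid, and its third-order variant adds a triplet direct-correlation term treated with a factorization approximation.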
Immediate versus sustained effects: interrupted time series analysis of a tailored intervention.
Hanbury, Andria; Farley, Katherine; Thompson, Carl; Wilson, Paul M; Chambers, Duncan; Holmes, Heather
2013-11-05
Detailed intervention descriptions and robust evaluations that test intervention impact--and explore reasons for impact--are an essential part of progressing implementation science. Time series designs enable the impact and sustainability of intervention effects to be tested. When combined with time series designs, qualitative methods can provide insight into intervention effectiveness and help identify areas for improvement for future interventions. This paper describes the development, delivery, and evaluation of a tailored intervention designed to increase primary health care professionals' adoption of a national recommendation that women with mild to moderate postnatal depression (PND) are referred for psychological therapy as a first stage treatment. Three factors influencing referral for psychological treatment were targeted using three related intervention components: a tailored educational meeting, a tailored educational leaflet, and changes to an electronic system data template used by health professionals during consultations for PND. Evaluation comprised time series analysis of monthly audit data on percentage referral rates and monthly first prescription rates for anti-depressants. Interviews were conducted with a sample of health professionals to explore their perceptions of the intervention components and to identify possible factors influencing intervention effectiveness. The intervention was associated with a significant, immediate, positive effect upon percentage referral rates for psychological treatments. This effect was not sustained over the ten month follow-on period. Monthly rates of anti-depressant prescriptions remained consistently high after the intervention. Qualitative interview findings suggest key messages received from the intervention concerned what appropriate antidepressant prescribing is, suggesting this to underlie the lack of impact upon prescribing rates. 
However, an understanding that psychological treatment can have long-term benefits was also cited. Barriers to referral identified before intervention were cited again after the intervention, suggesting the intervention had not successfully tackled the barriers targeted. A time series design allowed the initial and sustained impact of our intervention to be tested. Combined with qualitative interviews, this provided insight into intervention effectiveness. Future research should test factors influencing intervention sustainability, and promote adoption of the targeted behavior and dis-adoption of competing behaviors where appropriate.
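The interrupted (segmented) time-series model typically used in evaluations like the one above regresses the outcome on time, a post-intervention level dummy, and a post-intervention slope term. A minimal sketch with synthetic monthly referral rates (not the study's data) follows.

```python
import numpy as np

def its_segmented_fit(y, t_intervention):
    """Segmented regression for an interrupted time series (sketch).

    Model: y_t = b0 + b1*t + b2*post_t + b3*(t - t_intervention)*post_t + e.
    b2 is the immediate level shift at the intervention; b3 is the change
    in trend afterwards (a sustained effect would keep b3 >= 0).
    """
    n = len(y)
    t = np.arange(n, dtype=float)
    post = (t >= t_intervention).astype(float)   # level-change dummy
    t_post = post * (t - t_intervention)         # post-intervention slope term
    X = np.column_stack([np.ones(n), t, post, t_post])
    beta, *_ = np.linalg.lstsq(X, np.asarray(y, float), rcond=None)
    return beta

# Flat 10% baseline, an immediate +5-point jump, then decay of 0.5/month:
# the immediate-but-unsustained pattern the abstract describes.
y = [10.0] * 12 + [15.0 - 0.5 * k for k in range(12)]
b0, b1, b2, b3 = its_segmented_fit(y, 12)
```

A significantly positive b2 with a negative b3, as in this synthetic series, corresponds to an immediate effect that is not sustained over follow-up.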
Geocenter Motion Derived from the JTRF2014 Combination
NASA Astrophysics Data System (ADS)
Abbondanza, C.; Chin, T. M.; Gross, R. S.; Heflin, M. B.; Parker, J. W.; van Dam, T. M.; Wu, X.
2016-12-01
JTRF2014 is the JPL Terrestrial Reference Frame (TRF) recently obtained by combining the space-geodetic reprocessed inputs to ITRF2014. Based upon a Kalman filter and smoother approach, JTRF2014 assimilates station positions and Earth orientation parameters (EOPs) from GNSS, VLBI, SLR and DORIS and combines them through local tie measurements. JTRF2014 is, in essence, a time-series-based TRF. In JTRF2014 the dynamical evolution of the station positions is formulated by introducing linear and seasonal terms (annual and semi-annual periodic modes). Non-secular and non-seasonal motions of the geodetic sites are included in the smoothed time series by properly defining the station position process noise, whose variance is characterized by analyzing station displacements induced by temporal changes of planetary fluid masses (atmosphere, oceans and continental surface water). With its station position time series output at weekly resolution, JTRF2014 materializes a sub-secular frame whose origin is at the quasi-instantaneous Center of Mass (CM) as sensed by SLR. Both SLR and VLBI contribute to the scale of the combined frame. The sub-secular nature of the frame allows users to directly access the quasi-instantaneous geocenter and scale information. Unlike standard combined TRF products, which only give access to the secular component of the CM-CN motion, JTRF2014 preserves, in addition to the long-term component, the seasonal, non-seasonal and non-secular components of the geocenter motion. In the JTRF2014 assimilation scheme, local tie measurements are used to transfer the geocenter information from SLR to the space-geodetic techniques that are either insensitive to the CM (VLBI) or whose geocenter motion is poorly determined (GNSS and DORIS). Properly tied to the CM frame through local ties and co-motion constraints, GNSS, VLBI and DORIS contribute to improving the SLR network geometry.
In this paper, the determination of the weekly CM-CN time series inferred from the JTRF2014 combination is presented. Comparisons with geocenter time series derived from global inversions of GPS, GRACE and ocean bottom pressure models show that the JTRF2014-derived geocenter compares favourably with the inversion results.
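The filter/smoother idea behind a time-series TRF can be illustrated on a single coordinate: a random-walk Kalman filter followed by a Rauch-Tung-Striebel (RTS) backward pass. This toy sketch has one state, no seasonal terms, and no inter-technique ties; all numbers are synthetic.

```python
import numpy as np

def kalman_rts_smooth(obs, q, r):
    """Scalar random-walk Kalman filter + RTS smoother (toy sketch).

    obs: noisy weekly positions; q: process-noise variance per step;
    r: observation-noise variance. Returns the smoothed position series.
    """
    n = len(obs)
    xf = np.zeros(n); pf = np.zeros(n)   # filtered mean / variance
    xp = np.zeros(n); pp = np.zeros(n)   # predicted mean / variance
    x, p = obs[0], r
    for k in range(n):
        xp[k], pp[k] = x, p + q              # predict (random-walk dynamics)
        g = pp[k] / (pp[k] + r)              # Kalman gain
        x = xp[k] + g * (obs[k] - xp[k])     # measurement update
        p = (1.0 - g) * pp[k]
        xf[k], pf[k] = x, p
    xs = xf.copy()
    for k in range(n - 2, -1, -1):           # RTS backward pass
        c = pf[k] / pp[k + 1]
        xs[k] = xf[k] + c * (xs[k + 1] - xp[k + 1])
    return xs

rng = np.random.default_rng(1)
truth = np.linspace(0.0, 5.0, 200)           # slow secular motion (mm)
obs = truth + rng.normal(0.0, 1.0, 200)      # 1 mm observation noise
smoothed = kalman_rts_smooth(obs, q=0.01, r=1.0)
```

The smoother uses observations on both sides of each epoch, which is what lets a time-series frame track non-linear station motion while suppressing weekly noise; the operational combination adds seasonal states, EOPs, and transformation parameters to the state vector.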
Immediate versus sustained effects: interrupted time series analysis of a tailored intervention
2013-01-01
Background Detailed intervention descriptions and robust evaluations that test intervention impact—and explore reasons for impact—are an essential part of progressing implementation science. Time series designs enable the impact and sustainability of intervention effects to be tested. When combined with time series designs, qualitative methods can provide insight into intervention effectiveness and help identify areas for improvement for future interventions. This paper describes the development, delivery, and evaluation of a tailored intervention designed to increase primary health care professionals’ adoption of a national recommendation that women with mild to moderate postnatal depression (PND) are referred for psychological therapy as a first stage treatment. Methods Three factors influencing referral for psychological treatment were targeted using three related intervention components: a tailored educational meeting, a tailored educational leaflet, and changes to an electronic system data template used by health professionals during consultations for PND. Evaluation comprised time series analysis of monthly audit data on percentage referral rates and monthly first prescription rates for anti-depressants. Interviews were conducted with a sample of health professionals to explore their perceptions of the intervention components and to identify possible factors influencing intervention effectiveness. Results The intervention was associated with a significant, immediate, positive effect upon percentage referral rates for psychological treatments. This effect was not sustained over the ten month follow-on period. Monthly rates of anti-depressant prescriptions remained consistently high after the intervention. Qualitative interview findings suggest key messages received from the intervention concerned what appropriate antidepressant prescribing is, suggesting this to underlie the lack of impact upon prescribing rates. 
However, an understanding that psychological treatment can have long-term benefits was also cited. Barriers to referral identified before intervention were cited again after the intervention, suggesting the intervention had not successfully tackled the barriers targeted. Conclusion A time series design allowed the initial and sustained impact of our intervention to be tested. Combined with qualitative interviews, this provided insight into intervention effectiveness. Future research should test factors influencing intervention sustainability, and promote adoption of the targeted behavior and dis-adoption of competing behaviors where appropriate. PMID:24188718
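The interrupted time series analysis described in this abstract is commonly implemented as segmented regression, with an immediate level-change term and a slope-change term at the intervention point. The sketch below is an illustrative minimal model, not the study's actual specification (which would also need to handle autocorrelation and seasonality):

```python
import numpy as np

def interrupted_ts_fit(y, intervention_idx):
    """Fit y ~ b0 + b1*t + b2*step + b3*(t - t0)*step, where step is 0
    before the intervention and 1 afterwards. Returns (b0, b1, b2, b3):
    baseline level, baseline trend, immediate level change, trend change."""
    t = np.arange(len(y), dtype=float)
    step = (t >= intervention_idx).astype(float)
    X = np.column_stack([np.ones_like(t), t, step, (t - intervention_idx) * step])
    coef, *_ = np.linalg.lstsq(X, np.asarray(y, dtype=float), rcond=None)
    return coef

# An immediate jump of 5 with no trend change yields b2 = 5:
b0, b1, b2, b3 = interrupted_ts_fit([10.0] * 12 + [15.0] * 12, 12)
```

A non-zero b2 with b3 near zero corresponds to the "immediate but not sustained" pattern reported above.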
Analysis of satellite precipitation over East Africa during last decades
NASA Astrophysics Data System (ADS)
Cattani, Elsa; Wenhaji Ndomeni, Claudine; Merino, Andrés; Levizzani, Vincenzo
2016-04-01
Daily accumulated precipitation time series from satellite retrieval algorithms (e.g., ARC2 and TAMSAT) are exploited to extract the spatial and temporal variability of East Africa (EA - 5°S-20°N, 28°E-52°E) precipitation during the last decades (1983-2013). The Empirical Orthogonal Function (EOF) analysis is applied to the precipitation time series to investigate the spatial and temporal variability, in particular for October-November-December, referred to as the short rain season. Moreover, the connection among EA's precipitation, sea surface temperature, and soil moisture is analyzed through the correlation with the dominant EOF modes of variability. Preliminary results concern the first two EOF modes for the ARC2 data set. EOF1 is characterized by an inter-annual variability and a positive correlation between precipitation and El Niño, positive Indian Ocean Dipole mode, and soil moisture, while EOF2 shows a dipole structure of spatial variability associated with a longer-scale temporal variability. This second dominant mode is mostly linked to sea surface temperature variations in the North Atlantic Ocean. Further analyses are carried out by computing the time series of the indices defined by the joint CCI/CLIVAR/JCOMM Expert Team on Climate Change Detection and Indices (ETCCDI, http://etccdi.pacificclimate.org/index.shtml), i.e. RX1day, RX5day, CDD, CWD, SDII, PRCPTOT, R10, R20. The purpose is to identify the occurrences of extreme events (droughts and floods) and extract precipitation temporal variation by trend analysis (Mann-Kendall technique). Results for the ARC2 data set demonstrate the existence of a dipole spatial pattern in the linear trend of the time series of PRCPTOT (annual precipitation considering days with a rain rate > 1 mm) and SDII (average precipitation on wet days over a year). A negative trend is mainly present over West Ethiopia and Sudan, whereas a positive trend is exhibited over East Ethiopia and Somalia.
CDD (maximum number of consecutive dry days) and CWD (maximum number of consecutive wet days) time series do not exhibit a similar behavior and trends are generally weaker with a lower significance level with respect to PRCPTOT and SDII.
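The Mann-Kendall technique used above for trend detection is non-parametric, relying only on the signs of pairwise differences. A minimal sketch, without the tie and autocorrelation corrections a production analysis would include:

```python
import math

def mann_kendall(x):
    """Mann-Kendall trend test. Returns (S, Z): the MK statistic and its
    normal-approximation score; |Z| > 1.96 suggests a trend at the 5% level."""
    n = len(x)
    # S counts concordant minus discordant pairs over all i < j
    s = sum((x[j] > x[i]) - (x[j] < x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0  # variance under no-trend null
    if s == 0:
        return s, 0.0
    z = (s - 1) / math.sqrt(var_s) if s > 0 else (s + 1) / math.sqrt(var_s)
    return s, z

# A strictly increasing series gives the maximum S = n*(n-1)/2:
s, z = mann_kendall(list(range(20)))
```

Applied per pixel to annual PRCPTOT or SDII values, the sign of Z reproduces the positive/negative trend dipole described above.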
The Development of a Sea Surface Height Climate Data Record from Multi-mission Altimeter Data
NASA Astrophysics Data System (ADS)
Beckley, B. D.; Ray, R. D.; Lemoine, F. G.; Zelensky, N. P.; Desai, S. D.; Brown, S.; Mitchum, G. T.; Nerem, R.; Yang, X.; Holmes, S. A.
2011-12-01
The determination of the rate of change of mean sea level (MSL) has undeniable societal significance. The science value of satellite altimeter observations has grown dramatically over time as improved models and technologies have increased the value of data acquired on both past and present missions, enabling credible MSL estimates. With the prospect of an observational time series extending into several decades from TOPEX/Poseidon through Jason-1 and the Ocean Surface Topography Mission (OSTM), and further in time with a future set of operational altimeters, researchers are pushing the bounds of current technology and modeling capability in order to monitor global and regional sea level rates at an accuracy of a few tenths of a mm/yr. GRACE data analysis suggests that the ice melt from Alaska alone contributes 0.3 mm/yr to global sea level rise. The measurement of MSL change from satellite altimetry requires an extreme stability of the altimeter measurement system since the signal being measured is at the level of a few mm/yr. This means that the orbit and reference frame within which the altimeter measurements are situated, and the associated altimeter corrections, must be stable and accurate enough to permit a robust MSL estimate. Foremost, orbit quality and consistency are critical not only to satellite altimeter measurement accuracy across one mission, but also for the seamless transition between missions (Beckley et al., 2005). The analysis of altimeter data for TOPEX/Poseidon, Jason-1, and OSTM requires that the orbits for all three missions be in a consistent reference frame, and calculated with the best possible standards to minimize error and maximize the data return from the time series, particularly with respect to the demanding application of measuring sea level trends.
In this presentation we describe the development and utility of the MEaSUREs TPJAOS V1.0 sea surface height Climate Data Record (http://podaac.jpl.nasa.gov/dataset/MERGED_TP_J1_OSTM_OST_ALL). We provide an assessment of recent improvements to the accuracy of the 19-year sea surface height time series, describe continuing calibration/validation activities, and evaluate the subsequent impact on global and regional mean sea level estimates.
NASA Astrophysics Data System (ADS)
Kuze, A.; Suto, H.; Kataoka, F.; Shiomi, K.; Kondo, Y.; Crisp, D.; Butz, A.
2017-12-01
Atmospheric methane (CH4) plays an important role in the global radiative forcing of climate, but its emission estimates have larger uncertainties than those of carbon dioxide (CO2). The area of anthropogenic emission sources is usually much smaller than 100 km2. The Thermal And Near infrared Sensor for carbon Observation Fourier-Transform Spectrometer (TANSO-FTS) onboard the Greenhouse gases Observing SATellite (GOSAT) has measured CO2 and CH4 column density using sunlight reflected from the Earth's surface. It has an agile pointing system and its footprint can cover 87 km2 with a single detector. By specifying pointing angles and observation times for every orbit, TANSO-FTS can target various CH4 point sources together with reference points every 3 days over years. We selected a reference point that represents the CH4 background density before or after targeting a point source. By combining the satellite-measured enhancement of the CH4 column density with surface-measured wind data or estimates from the Weather Research and Forecasting (WRF) model, we estimated CH4 emission amounts. Here, we selected two sites on the US West Coast, where clear-sky frequency is high and a series of data are available. The natural gas leak at Aliso Canyon showed a large enhancement and its decrease with time since the initial blowout. We present a time series of flux estimates assuming the source is a single point without influx. The observation of the cattle feedlot in Chino, California has a weather station within the TANSO-FTS footprint. The wind speed is monitored continuously and the wind direction is stable at the time of GOSAT overpass. The large TANSO-FTS footprint and strong wind decrease the enhancement below the noise level. Weak wind shows enhancements in CH4, but the velocity data have large uncertainties. We show the detection limit of single samples and how to reduce uncertainty using time series of satellite data.
We will propose that the next generation instruments for accurate anthropogenic CO2 and CH4 flux estimation have improved spatial resolution (~1 km2) to further enhance column density changes. We also propose adding imaging capability to monitor plume orientation. We will present laboratory model results and a sampling pattern optimization study that combines local emission source and global survey observations.
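The flux estimation described above combines a column enhancement with wind information. A zeroth-order mass-balance sketch; the geometry (steady single point source, no influx, well-mixed plume) matches the stated assumption, but the numeric values below are purely illustrative:

```python
def point_source_flux(delta_column_kg_m2, wind_speed_m_s, plume_width_m):
    """Emission rate (kg/s) ~ column enhancement above background (kg/m^2)
    times wind speed (m/s) times cross-wind plume width (m), assuming a
    steady single point source with no influx and a well-mixed plume."""
    return delta_column_kg_m2 * wind_speed_m_s * plume_width_m

# e.g. a 1e-4 kg/m^2 enhancement, 5 m/s wind, 3 km plume width -> 1.5 kg/s
rate = point_source_flux(1e-4, 5.0, 3000.0)
```

This also makes the stated noise trade-off visible: for a fixed emission rate, stronger wind proportionally reduces the column enhancement that the instrument must detect.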
Rainfall height stochastic modelling as a support tool for landslides early warning
NASA Astrophysics Data System (ADS)
Capparelli, G.; Giorgio, M.; Greco, R.; Versace, P.
2009-04-01
Occurrence of landslides is difficult to predict, since it is affected by a number of variables, such as mechanical and hydraulic soil properties, slope morphology, vegetation coverage, and rainfall spatial and temporal variability. Although heavy landslides frequently occurred in Campania, southern Italy, during the last decade, no complete data sets are available for natural slopes where landslides occurred. As a consequence, landslide risk assessment procedures and early warning systems in Campania still rely on simple empirical models based on correlation between daily rainfall records and observed landslides, like the FLAIR model [Versace et al., 2003]. Effectiveness of such systems could be improved by reliable quantitative rainfall prediction. In mountainous areas, rainfall spatial and temporal variability are very pronounced due to orographic effects, making predictions even more complicated. Existing rain gauge networks are not dense enough to resolve the small-scale spatial variability, and the same limitation of spatial resolution affects rainfall height maps provided by radar sensors as well as by physically based meteorological models. Therefore, analysis of on-site recorded rainfall height time series still represents the most effective approach for a reliable prediction of the local temporal evolution of rainfall. Hydrological time series analysis is a widely studied field in hydrology, often carried out by means of autoregressive models, such as AR and ARMA [Box and Jenkins, 1976]. Sometimes exogenous information coming from additional series of observations is also taken into account, and the models are called ARX and ARMAX (e.g. Salas [1992]). Such models gave the best results when applied to the analysis of autocorrelated hydrological time series, like river flow or level time series.
Conversely, they are not able to model the behaviour of intermittent time series, like point rainfall height series usually are, especially when recorded with short sampling time intervals. More useful for this issue are the so-called DRIP (Disaggregated Rectangular Intensity Pulse) and NSRP (Neyman-Scott Rectangular Pulse) models [Heneker et al., 2001; Cowpertwait et al., 2002], usually adopted to generate synthetic point rainfall series. In this paper, the DRIP model approach is adopted in conjunction with the FLAIR model to calculate the probability of flowslide occurrence. The final aim of the study is in fact to provide a useful tool to implement an early warning system for hydrogeological risk management. Model calibration has been carried out with hourly rainfall height data provided by the rain gauges of the Campania Region civil protection agency meteorological warning network. So far, the model has been applied only to data series recorded at a single rain gauge. Future extension will deal with spatial correlation between time series recorded at different gauges. ACKNOWLEDGEMENTS The research was co-financed by the Italian Ministry of University, by means of the PRIN 2006 program, within the research project entitled ‘Definition of critical rainfall thresholds for destructive landslides for civil protection purposes'. REFERENCES Box, G.E.P. and Jenkins, G.M., 1976. Time Series Analysis: Forecasting and Control, Holden-Day, San Francisco. Cowpertwait, P.S.P., Kilsby, C.G. and O'Connell, P.E., 2002. A space-time Neyman-Scott model of rainfall: Empirical analysis of extremes, Water Resources Research, 38(8):1-14. Salas, J.D., 1992. Analysis and modeling of hydrological time series, in D.R. Maidment, ed., Handbook of Hydrology, McGraw-Hill, New York. Heneker, T.M., Lambert, M.F. and Kuczera, G., 2001. A point rainfall model for risk-based design, Journal of Hydrology, 247(1-2):54-71. Versace, P., Sirangelo, B. and Capparelli, G., 2003.
Forewarning model of landslides triggered by rainfall. Proc. 3rd International Conference on Debris-Flow Hazards Mitigation: Mechanics, Prediction and Assessment, Davos.
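As a much simplified relative of the rectangular-pulse models cited above (NSRP/DRIP), an intermittent synthetic rainfall series can be generated by superposing Poisson-arriving pulses with exponential durations and intensities. All parameter values here are illustrative assumptions, not calibrated ones:

```python
import random

def rectangular_pulse_rainfall(n_hours, p_arrival=0.02,
                               mean_duration=6.0, mean_intensity=2.0, seed=0):
    """Hourly rainfall depths (mm) from rectangular pulses: each hour a new
    pulse starts with probability p_arrival; its duration (h) and intensity
    (mm/h) are exponentially distributed. Dry spells arise naturally, giving
    the intermittency that AR/ARMA models cannot reproduce."""
    rng = random.Random(seed)
    series = [0.0] * n_hours
    for t in range(n_hours):
        if rng.random() < p_arrival:
            duration = max(1, round(rng.expovariate(1.0 / mean_duration)))
            intensity = rng.expovariate(1.0 / mean_intensity)
            for h in range(t, min(t + duration, n_hours)):
                series[h] += intensity  # overlapping pulses add up
    return series

series = rectangular_pulse_rainfall(1000)
```

The design point is the one made in the abstract: the generated series alternates between exactly-zero dry hours and wet clusters, matching the intermittent character of short-interval point rainfall records.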
The complexity of gene expression dynamics revealed by permutation entropy
2010-01-01
Background High complexity is considered a hallmark of living systems. Here we investigate the complexity of temporal gene expression patterns using the concept of Permutation Entropy (PE) first introduced in dynamical systems theory. The analysis of gene expression data has so far focused primarily on the identification of differentially expressed genes, or on the elucidation of pathway and regulatory relationships. We aim to study gene expression time series data from the viewpoint of complexity. Results Applying the PE complexity metric to abiotic stress response time series data in Arabidopsis thaliana, genes involved in stress response and signaling were found to be associated with the highest complexity not only under stress, but surprisingly, also under reference, non-stress conditions. Genes with house-keeping functions exhibited lower PE complexity. Compared to reference conditions, the PE of temporal gene expression patterns generally increased upon stress exposure. High-complexity genes were found to have longer upstream intergenic regions and more cis-regulatory motifs in their promoter regions indicative of a more complex regulatory apparatus needed to orchestrate their expression, and to be associated with higher correlation network connectivity degree. Arabidopsis genes also present in other plant species were observed to exhibit decreased PE complexity compared to Arabidopsis specific genes. Conclusions We show that Permutation Entropy is a simple yet robust and powerful approach to identify temporal gene expression profiles of varying complexity that is equally applicable to other types of molecular profile data. PMID:21176199
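The Permutation Entropy measure used above reduces to counting Bandt-Pompe ordinal patterns. A compact implementation, normalized to [0, 1] and ignoring the tie-handling refinements a full analysis might need:

```python
from math import factorial, log2

def permutation_entropy(x, order=3):
    """Normalized Permutation Entropy: Shannon entropy of the distribution
    of ordinal patterns of length `order`, divided by log2(order!).
    0 for fully regular series, near 1 for complex/random ones."""
    counts = {}
    for i in range(len(x) - order + 1):
        # ordinal pattern: the argsort of the window's values
        pattern = tuple(sorted(range(order), key=lambda k: x[i + k]))
        counts[pattern] = counts.get(pattern, 0) + 1
    total = sum(counts.values())
    h = -sum((c / total) * log2(c / total) for c in counts.values())
    return h / log2(factorial(order))

pe_trend = permutation_entropy(list(range(100)))  # monotone series -> 0.0
```

A housekeeping-like, slowly varying expression profile would score low on this scale, while a stress-response profile with rich temporal structure scores high, mirroring the contrast reported above.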
Digital gate pulse generator for cycloconverter control
Klein, Frederick F.; Mutone, Gioacchino A.
1989-01-01
The present invention provides a digital gate pulse generator which controls the output of a cycloconverter used for electrical power conversion applications by determining the timing and delivery of the firing pulses to the switching devices in the cycloconverter. Previous gate pulse generators have been built with largely analog or discrete digital circuitry which require many precision components and periodic adjustment. The gate pulse generator of the present invention utilizes digital techniques and a predetermined series of values to develop the necessary timing signals for firing the switching device. Each timing signal is compared with a reference signal to determine the exact firing time. The present invention is significantly more compact than previous gate pulse generators, responds quickly to changes in the output demand and requires only one precision component and no adjustments.
A New Approach to Monitoring Coastal Marshes for Persistent Flooding
NASA Astrophysics Data System (ADS)
Kalcic, M. T.; Underwood, L. W.; Fletcher, R. M.
2012-12-01
Many areas in coastal Louisiana are below sea level and protected from flooding by a system of natural and man-made levees. Flooding is common when the levees are overtopped by storm surge or rising rivers. Many levees in this region are further stressed by erosion and subsidence. The floodwaters can become constricted by levees and trapped, causing prolonged inundation. Vegetative communities in coastal regions, from fresh swamp forest to saline marsh, can be negatively affected by inundation and changes in salinity. As saltwater persists, it can have a toxic effect upon marsh vegetation causing die off and conversion to open water types, destroying valuable species habitats. The length of time the water persists and the average annual salinity are important variables in modeling habitat switching (cover type change). Marsh type habitat switching affects fish, shellfish, and wildlife inhabitants, and can affect the regional ecosystem and economy. There are numerous restoration and revitalization projects underway in the coastal region, and their effects on the entire ecosystem need to be understood. For these reasons, monitoring persistent saltwater intrusion and inundation is important. For this study, persistent flooding in Louisiana coastal marshes was mapped using MODIS (Moderate Resolution Imaging Spectroradiometer) time series of a Normalized Difference Water Index (NDWI). The time series data were derived for 2000 through 2009, including flooding due to Hurricane Rita in 2005 and Hurricane Ike in 2008. Using the NDWI, duration and extent of flooding can be inferred. The Time Series Product Tool (TSPT), developed at NASA SSC, is a suite of software developed in MATLAB® that enables improved-quality time series images to be computed using advanced temporal processing techniques. This software has been used to compute time series for monitoring temporal changes in environmental phenomena, (e.g. 
NDVI time series from MODIS), and was modified and used to compute the NDWI indices and also the Normalized Difference Soil Index (NDSI). Coastwide Reference Monitoring System (CRMS) water levels from various hydrologic monitoring stations and aerial photography were used to optimize thresholds for MODIS-derived time series of NDWI and to validate the resulting flood maps. In most of the profiles produced for post-hurricane assessment, the increase in the NDWI index (from storm surge) is accompanied by a decrease in the vegetation index (NDVI) and then a period of declining water. The NDSI index represents non-green or dead vegetation and increases after the hurricane's destruction of the marsh vegetation. Behavior of these indices over time is indicative of which areas remain flooded, which areas recover to their former levels of vegetative vigor, and which areas are stressed or in transition. Tracking these indices over time shows the recovery rate of vegetation and its behavior relative to inundation persistence. The results from this study demonstrated that identification of persistent marsh flooding, utilizing the tools developed in this study, provided an approximate 70-80 percent accuracy rate when compared to the actual days flooded at the CRMS stations.
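Per pixel, the NDWI-based persistence mapping reduces to computing the index from green and NIR reflectance and counting time steps above a flood threshold. A minimal sketch; the default threshold is a placeholder for the CRMS-optimized value used in the study:

```python
import numpy as np

def ndwi(green, nir):
    """Normalized Difference Water Index (McFeeters): (G - NIR)/(G + NIR).
    Positive values typically indicate open water."""
    green = np.asarray(green, dtype=float)
    nir = np.asarray(nir, dtype=float)
    return (green - nir) / (green + nir)

def flooded_steps(ndwi_series, threshold=0.0):
    """Count time steps a pixel's NDWI exceeds the flood threshold."""
    return int(np.sum(np.asarray(ndwi_series) > threshold))

# water-like pixel (green > NIR) vs vegetation-like pixel (NIR > green):
w = ndwi([0.30], [0.10])[0]
v = ndwi([0.10], [0.40])[0]
```

Applying `flooded_steps` along the temporal axis of an NDWI image stack yields the days-flooded maps that were compared against CRMS station records.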
National Clearinghouse for Drug Abuse Information Selected Reference Series, Series 4, No. 1.
ERIC Educational Resources Information Center
National Inst. on Drug Abuse (DHEW/PHS), Rockville, MD. National Clearinghouse for Drug Abuse Information.
This bibliography, which attempts to gather the significant research on the reproductive effects of the drugs of abuse, is one in a series prepared by the National Clearinghouse for Drug Abuse Information on subjects of topical interest. Selection of literature is based on its currency, its significance in the field, and its availability in local…
Electron Cloud Effects in Accelerators
DOE Office of Scientific and Technical Information (OSTI.GOV)
Furman, M.A.
We present a brief summary of various aspects of the electron-cloud effect (ECE) in accelerators. For further details, the reader is encouraged to refer to the proceedings of many prior workshops, either dedicated to EC or with significant EC contents, including the entire "ECLOUD" series [1-22]. In addition, the proceedings of the various flavors of Particle Accelerator Conferences [23] contain a large number of EC-related publications. The ICFA Beam Dynamics Newsletter series [24] contains one dedicated issue, and several occasional articles, on EC. An extensive reference database is the LHC website on EC [25].
Expected Improvements in VLBI Measurements of the Earth's Orientation
NASA Technical Reports Server (NTRS)
Ma, Chopo
2003-01-01
Measurements of the Earth's orientation since the 1970s using space geodetic techniques have provided a continually expanding and improving data set for studies of the Earth's structure and the distribution of mass and angular momentum. The accuracy of current one-day measurements is better than 100 microarcsec for the motion of the pole with respect to the celestial and terrestrial reference frames and better than 3 microsec for the rotation around the pole. VLBI uniquely provides the three Earth orientation parameters (nutation and UT1) that relate the Earth to the extragalactic celestial reference frame. The accuracy and resolution of the VLBI Earth orientation time series can be expected to improve substantially in the near future because of refinements in the realization of the celestial reference frame, improved modeling of the troposphere and non-linear station motions, larger observing networks, optimized scheduling, deployment of disk-based Mark V recorders, full use of Mark IV capabilities, and e-VLBI. More radical future technical developments will be discussed.
Practical analysis of tide gauges records from Antarctica
NASA Astrophysics Data System (ADS)
Galassi, Gaia; Spada, Giorgio
2015-04-01
We have collected and analyzed in a basic way the currently available time series from tide gauges deployed along the coasts of Antarctica. The database of the Permanent Service for Mean Sea Level (PSMSL) holds relative sea level information for 17 stations, which are mostly concentrated in the Antarctic Peninsula (8 out of 17). For 7 of the PSMSL stations, Revised Local Reference (RLR) monthly and yearly observations are available, spanning from year 1957.79 (Almirante Brown) to 2013.95 (Argentine Islands). For the remaining stations, only metric monthly data can be obtained during the time window 1957-2013. The record length of the available time series generally does not exceed 20 years. Remarkable exceptions are the RLR station of Argentine Islands, located in the Antarctic Peninsula (AP) (time span: 1958-2013, record length: 54 years, completeness: 98%), and the metric station of Syowa in East Antarctica (1975-2012, 37 years, 92%). The general quality (geographical coverage and length of record) of the time series hinders a coherent geophysical interpretation of the relative sea-level data along the coasts of Antarctica. However, in an attempt to characterize the available relative sea level signals, we have stacked (i.e., averaged) the RLR time series for the AP and for the whole of Antarctica. The time series so obtained have been analyzed using simple regression in order to estimate a trend and a possible sea-level acceleration. For the AP, the trend is 1.8 ± 0.2 mm/yr and for the whole of Antarctica it is 2.1 ± 0.1 mm/yr (both during 1957-2013). The modeled values of Glacial Isostatic Adjustment (GIA) obtained with ICE-5G(VM2) using the program SELEN range between -0.7 and -1.6 mm/yr, showing that the sea-level trend recorded by tide gauges is strongly influenced by GIA.
Subtracting the average GIA contribution (-1.1 mm/yr) from the observed sea-level trends of the two stacks, we obtain 3.2 and 2.9 mm/yr for Antarctica and the AP respectively, which are interpreted as the effect of current ice melting and steric ocean contributions. Using the Ensemble Empirical Mode Decomposition method, we have detected different oscillations embedded in the sea-level signals for Antarctica and the AP. This confirms previously recognized connections between sea-level variations in Antarctica and ocean modes such as ENSO.
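The stack analysis above amounts to a least-squares trend fit followed by removal of the GIA contribution. A minimal sketch reproducing the quoted arithmetic on synthetic data (not the actual PSMSL stack):

```python
import numpy as np

def linear_trend(years, heights):
    """Least-squares linear trend (mm/yr) of a sea-level time series."""
    slope, _intercept = np.polyfit(np.asarray(years, float),
                                   np.asarray(heights, float), 1)
    return slope

def gia_corrected(observed_trend_mm_yr, gia_mm_yr=-1.1):
    """Remove the (negative) GIA contribution from an observed trend."""
    return observed_trend_mm_yr - gia_mm_yr

years = np.arange(1957, 2014)
heights = 2.1 * (years - 1957)        # synthetic noise-free 2.1 mm/yr stack
trend = linear_trend(years, heights)
corrected = gia_corrected(trend)      # 2.1 - (-1.1) = 3.2 mm/yr
```

Because the regional GIA signal is negative here, correcting for it increases the inferred ice-melt plus steric contribution, as in the 2.1 to 3.2 mm/yr step above.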
NASA Astrophysics Data System (ADS)
Sawant, S. A.; Chakraborty, M.; Suradhaniwar, S.; Adinarayana, J.; Durbha, S. S.
2016-06-01
Satellite-based earth observation (EO) platforms have proved capable of spatio-temporally monitoring changes on the earth's surface. Long-term satellite missions have provided a huge repository of optical remote sensing datasets, and the United States Geological Survey (USGS) Landsat program is one of the oldest sources of optical EO datasets. This historical and near-real-time EO archive is a rich source of information to understand the seasonal changes in horticultural crops. Citrus (Mandarin / Nagpur Orange) is one of the major horticultural crops cultivated in central India. Erratic behaviour of rainfall and dependency on groundwater for irrigation have a wide impact on the citrus crop yield. Also, wide variations are reported in temperature and relative humidity, causing early fruit onset and an increase in crop water requirement. Therefore, there is a need to study the crop growth stages and crop evapotranspiration at spatio-temporal scale for managing the scarce resources. In this study, an attempt has been made to understand the citrus crop growth stages using Normalized Difference Vegetation Index (NDVI) time series data obtained from the Landsat archives (http://earthexplorer.usgs.gov/). A total of 388 Landsat 4, 5, 7 and 8 scenes (from year 1990 to Aug. 2015) for Worldwide Reference System (WRS) 2, path 145 and row 45 were selected to understand seasonal variations in citrus crop growth. Considering the Landsat 30 meter spatial resolution, only orchards larger than 2 hectares were selected to obtain homogeneous crop-cover pixels. To account for the change in wavelength bandwidth (radiometric resolution) across Landsat sensors (i.e. 4, 5, 7 and 8), NDVI has been selected to obtain a continuous, sensor-independent time series. The obtained crop growth stage information has been used to estimate citrus basal crop coefficient (Kcb) information. Satellite-based Kcb estimates were used with relevant weather parameters observed by a proximal agrometeorological sensing system for crop ET estimation.
The results show that time series EO based crop growth stage estimates provide better information about geographically separated citrus orchards. Attempts are being made to estimate regional variations in citrus crop water requirement for effective irrigation planning. In the future, high-resolution Sentinel-2 observations from the European Space Agency (ESA) will be used to fill the time gaps and to gain a better understanding of citrus crop canopy parameters.
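The NDVI computation is standard; the NDVI-to-Kcb step is sketched below with a simple linear mapping. All four endpoint values are illustrative assumptions, not the study's calibration:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - red)/(NIR + red)."""
    return (nir - red) / (nir + red)

def kcb_from_ndvi(v, ndvi_min=0.15, ndvi_max=0.85,
                  kcb_min=0.15, kcb_max=1.10):
    """Linear NDVI-to-basal-crop-coefficient (Kcb) mapping, clipped to the
    endpoints; the parameter values are assumptions for illustration."""
    frac = (v - ndvi_min) / (ndvi_max - ndvi_min)
    frac = min(1.0, max(0.0, frac))
    return kcb_min + frac * (kcb_max - kcb_min)

v = ndvi(0.45, 0.09)    # a vigorous canopy
kcb = kcb_from_ndvi(v)
```

Multiplying such a Kcb series by reference evapotranspiration from the agrometeorological station would give the basal component of crop ET over the season.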
Mbida, André Dieudonné; Sosso, Samuel; Flori, Pierre; Saoudin, Henia; Lawrence, Philip; Monny-Lobé, Marcel; Oyono, Yves; Ndzi, Edward; Cappelli, Giulia; Lucht, Frédéric; Pozzetto, Bruno; Oukem-Boyer, Odile Ouwe Missi; Bourlet, Thomas
2009-09-01
This study aimed to evaluate the use of dried blood spots (DBSs) and dried plasma spots (DPSs) locally collected in 2 rural dispensaries in Cameroon for the quantification of HIV-1 RNA. Forty-one subjects were sampled and spots of whole blood and plasma were deposited onto Whatman 903 cards and dried at ambient temperature under local conditions. Two sets of DBS and DPS cards were prepared per patient. The rest of the liquid plasma (LP) was frozen until use. LPs were tested at the "Chantal Biya" International Reference Centre (Yaoundé, Cameroon) by the Abbott Real-Time HIV-1 assay (Abbott Molecular Diagnostics, Wiesbaden, Germany). One series of DBSs and DPSs was transported and tested between 2 and 6 weeks later at the Virology Laboratory of Saint-Etienne (France). The second series was routed by mail and tested after up to 3 months of storage at ambient temperature. From the first series, the correlation between viral loads obtained from LP and DBS, and from LP and DPS, was 0.98 and 0.99, respectively; the specificity of DBS and DPS results was 100%. The results obtained from the second series indicate great stability of DBSs after long-term storage. This study demonstrates that DBSs collected under local conditions in resource-limited settings are suitable for the deferred quantification of HIV-1 RNA.
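The agreement figures reported above are correlations of paired viral-load measurements, typically computed on a log10 scale. A minimal sketch with made-up example values (not the study's data):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between paired measurements,
    e.g. log10 viral loads from liquid plasma vs dried blood spots."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

# hypothetical log10 copies/mL pairs (plasma, DBS):
plasma = [3.1, 4.2, 5.0, 2.8, 4.6]
dbs = [3.0, 4.1, 4.8, 2.9, 4.5]
r = pearson_r(plasma, dbs)
```

Values of r near 1, as in the 0.98-0.99 reported above, indicate that the dried-spot measurements track the liquid-plasma reference closely.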
The application of a shift theorem analysis technique to multipoint measurements
NASA Astrophysics Data System (ADS)
Dieckmann, M. E.; Chapman, S. C.
1999-03-01
A Fourier domain technique has been proposed previously which, in principle, quantifies the extent to which multipoint in-situ measurements can identify whether or not an observed structure is time stationary in its rest frame. Once a structure, sampled for example by four spacecraft, is shown to be quasi-stationary in its rest frame, the structure's velocity vector can be determined with respect to the sampling spacecraft. We investigate the properties of this technique, which we will refer to as a stationarity test, by applying it to two-point measurements of a simulated boundary layer. The boundary layer was evolved using a PIC (particle-in-cell) electromagnetic code. Initial and boundary conditions were chosen such that two cases could be considered, i.e. a spacecraft pair moving through (1) a time-stationary boundary structure and (2) a boundary structure which is evolving (expanding) in time. The code also introduces noise in the simulated data time series which is uncorrelated between the two spacecraft. We demonstrate that, provided the time series is Hanning windowed, the test is effective in determining the relative velocity between the boundary layer and the spacecraft and in determining the range of frequencies over which the data can be treated as time stationary or time evolving. This work presents a first step towards understanding the effectiveness of this technique, as required in order for it to be applied to multispacecraft data.
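A minimal sketch of how two-point Hanning-windowed series constrain a structure's relative motion: estimate the inter-spacecraft time delay from the peak of an FFT-based circular cross-correlation, after which velocity follows from the known spacecraft separation. The actual technique works with cross-spectral phases per frequency, so this is a simplified stand-in:

```python
import numpy as np

def relative_delay(sig_a, sig_b):
    """Sample delay of sig_b relative to sig_a, from the peak of the
    FFT-based circular cross-correlation of Hanning-windowed series.
    For a quasi-stationary structure, separation / (delay * dt) gives
    its speed along the spacecraft separation vector."""
    a = np.asarray(sig_a, float) * np.hanning(len(sig_a))
    b = np.asarray(sig_b, float) * np.hanning(len(sig_b))
    # circular cross-correlation via the FFT convolution theorem
    xcorr = np.fft.ifft(np.fft.fft(b) * np.conj(np.fft.fft(a))).real
    lag = int(np.argmax(xcorr))
    return lag if lag <= len(a) // 2 else lag - len(a)  # signed lag

rng = np.random.default_rng(0)
x = rng.standard_normal(256)
delay = relative_delay(x, np.roll(x, 5))  # second probe sees x 5 samples later
```

For a time-evolving structure, no single delay aligns the two records across all frequencies, which is essentially what the stationarity test quantifies band by band.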
Long-term coastal measurements for large-scale climate trends characterization
NASA Astrophysics Data System (ADS)
Pomaro, Angela; Cavaleri, Luigi; Lionello, Piero
2017-04-01
Multi-decadal time series of observational wave data beginning in the late 1970s are relatively rare. The present study analyses the 37-year directional wave time series recorded between 1979 and 2015 at the CNR-ISMAR (Institute of Marine Sciences of the Italian National Research Council) "Acqua Alta" oceanographic research tower, located in the Northern Adriatic Sea, 15 km offshore of the Venice lagoon, in 16 m of water. The length of the time series allows its content to be exploited not only for modelling purposes or short-term statistical analyses, but also at the climatological scale, thanks to the peculiar meteorological and oceanographic character of the coastal area where this infrastructure is installed. We explore the dataset both to characterize the local average climate and its variability, and to detect possible long-term trends that might be suggestive of, or emphasize, large-scale circulation patterns and trends. Measured data are essential for the assessment, and often the calibration, of model data and, if long enough, also serve as the reference for climate studies. By applying this analysis to an area well characterized from the meteorological point of view, we first assess the changes in time based on measured data, and then compare them to those derived from the ERA-Interim regional simulation over the same area, thus showing the substantial improvement still needed to obtain reliable climate model projections for coastal areas and the Mediterranean Region as a whole. Moreover, long-term hindcasts aimed at climate studies are well known for (1) underestimating the actual wave heights if their resolution is not high enough, and (2) being strongly affected by changing conditions over time that are likely to introduce spurious trends of variable magnitude.
In particular, the amount of data assimilated by the hindcast models varies over time, which directly and indirectly affects the results, making it difficult, if not impossible, to distinguish these imposed effects from the climate signal itself, as demonstrated by Aarnes et al. (2015). From this point of view, the problem is that long-term measured datasets are rare, owing to the cost and technical difficulty of maintaining fixed instrumental equipment over time, as well as of assuring the homogeneity and availability of the entire dataset. For this reason, we are also working on the publication of the quality-controlled dataset to make it widely available for open-access research purposes. The analysis and homogenization of the original dataset required a substantial part of the time spent on the study, because of the strong impact that data quality may have on the final result. We consider this particularly relevant for coastal areas, where the lack of reliable satellite data makes it difficult to improve the capability of models to resolve the peculiar local oceanographic processes. We describe in detail every step and procedure used in producing the data, including full descriptions of the experimental design, data acquisition, and any computational processing needed to support the technical quality of the dataset.
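Once a record is homogenized, a long-term trend of this kind is typically quantified by an ordinary least-squares fit to annual means, with the standard error of the slope used to judge whether the trend exceeds the noise. A sketch on synthetic annual wave-height means (the slope and noise magnitudes are invented, not the Acqua Alta results):

```python
import numpy as np

rng = np.random.default_rng(42)
years = np.arange(1979, 2016)              # a 37-year record, as at Acqua Alta
true_slope = -0.002                        # illustrative trend (m/yr)
hs_annual = (1.0 + true_slope * (years - years[0])
             + 0.03 * rng.standard_normal(years.size))

# Ordinary least-squares trend of the annual-mean significant wave height
slope, intercept = np.polyfit(years, hs_annual, 1)

# Standard error of the slope: is the trend distinguishable from the noise?
resid = hs_annual - (slope * years + intercept)
se = np.sqrt(np.sum(resid**2) / (years.size - 2)
             / np.sum((years - years.mean())**2))
t_stat = slope / se
```

A spurious step in data quality (e.g. a sensor change) would bias `slope` in exactly the way the abstract warns about for hindcasts with time-varying assimilation, which is why the homogenization step matters.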
Airborne Sea-Surface Topography in an Absolute Reference Frame
NASA Astrophysics Data System (ADS)
Brozena, J. M.; Childers, V. A.; Jacobs, G.; Blaha, J.
2003-12-01
Highly dynamic coastal ocean processes occur at temporal and spatial scales that cannot be captured by the present generation of satellite altimeters. Space-borne gravity missions such as GRACE also provide time-varying gravity and a geoidal mean-sea-level (msl) reference surface, but at a resolution too coarse for many coastal applications. The Naval Research Laboratory and the Naval Oceanographic Office have been testing the application of airborne measurement techniques, gravity and altimetry, to determine sea-surface height and height anomaly at the short scales required for littoral regions. We have developed a precise local gravimetric geoid over a test region in the northern Gulf of Mexico from historical gravity data and recent airborne gravity surveys. The local geoid provides an msl reference surface with a resolution of about 10-15 km and provides a means to connect airborne, satellite and tide-gage observations in an absolute (WGS-84) framework. A series of altimetry reflights over the region, with time scales of 1 day to 1 year, reveals a highly dynamic environment with coherent and rapidly varying sea-surface height anomalies. AXBT data collected at the same time show apparent correlation with wave-like temperature anomalies propagating up the continental slope of the Desoto Canyon. We present animations of the temporal evolution of the surface topography and water column temperature structure down to the 800 m depth of the AXBT sensors.
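The role of the gravimetric geoid in this scheme is simple arithmetic: airborne altimetry gives the sea surface as an ellipsoidal (WGS-84) height, and subtracting the local geoid undulation converts it to a height anomaly relative to mean sea level, directly comparable with satellite and tide-gage data. A sketch with invented numbers:

```python
# All heights in meters, referenced to the WGS-84 ellipsoid; values illustrative
aircraft_height = 500.000      # GPS ellipsoidal height of the altimeter antenna
altimeter_range = 528.430      # measured range from antenna to the sea surface

geoid_undulation = -27.850     # local gravimetric geoid height w.r.t. the ellipsoid

ssh = aircraft_height - altimeter_range   # sea-surface height in WGS-84
anomaly = ssh - geoid_undulation          # height relative to the geoid (~msl)
```

Because every term is in the same absolute (WGS-84) frame, repeated flights can be differenced directly to track the rapidly varying anomalies the abstract describes.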
Two-body potential model based on cosine series expansion for ionic materials
Oda, Takuji; Weber, William J.; Tanigawa, Hisashi
2015-09-23
We examine a method for constructing a two-body potential model for ionic materials from a basis of cosine (Fourier) series. In this method, the coefficients of the cosine basis functions are uniquely determined by solving simultaneous linear equations that minimize the sum of weighted mean square errors in energy, force and stress, with first-principles calculation results used as the reference data. As a validation test of the method, potential models for magnesium oxide are constructed. The mean square errors converge appropriately with respect to the truncation of the cosine series. This result mathematically indicates that the constructed potential model is sufficiently close to the one that would be achieved with the non-truncated Fourier series, and demonstrates that this potential provides virtually the minimum error from the reference data within the two-body representation. The constructed potential models work appropriately in both molecular statics and dynamics simulations, especially if a two-step correction to revise errors expected in the reference data is performed, and the models clearly outperform the two existing Buckingham potential models that were tested. Moreover, the good agreement with first-principles calculations over a broad range of energies and forces should enable the prediction of materials behavior away from equilibrium conditions, such as a system under irradiation.
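The coefficient determination described here is a linear least-squares problem: each cosine basis function contributes one column of a design matrix, so the minimizer of the squared error is unique. A sketch using a smooth analytic pair potential as stand-in reference data (the cutoff, basis size, and reference form are illustrative assumptions, not the paper's MgO setup, which fits energies, forces and stresses from first principles):

```python
import numpy as np

# Stand-in reference pair energies (a Buckingham-like analytic form playing
# the role of the first-principles data used in the actual method)
r = np.linspace(1.5, 6.0, 200)                 # pair distances (Angstrom)
v_ref = 50.0 * np.exp(-r / 0.8) - 30.0 / r**6

r_cut = 6.0
n_terms = 16
# Design matrix of cosine basis functions cos(k*pi*r/r_cut), k = 0..n_terms-1
A = np.cos(np.outer(r, np.arange(n_terms)) * np.pi / r_cut)

# Coefficients from linear least squares: unique minimizer of the squared error
coef, *_ = np.linalg.lstsq(A, v_ref, rcond=None)
v_fit = A @ coef
rmse = np.sqrt(np.mean((v_fit - v_ref) ** 2))
```

Increasing `n_terms` drives `rmse` down systematically, which is the convergence-with-truncation behavior the abstract reports.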
Food Production, Management, and Services. Reference Book [and] Student Activity Book.
ERIC Educational Resources Information Center
Texas Tech Univ., Lubbock. Curriculum Center for Family and Consumer Sciences.
This student activity book and reference book, which are part of a family and consumer sciences education series focusing on a broad range of employment opportunities, are intended for use in one- and two-year food production, management, and services programs for Texas high school students. The reference book provides information needed by…
Riparian reference areas in Idaho: A catalog of plant associations and conservation sites
Mabel Jankovsky-Jones; Steven K. Rust; Robert K. Moseley
1999-01-01
Idaho land managers and regulators need knowledge on riparian reference sites. Reference sites are ecological controls that can be used to set meaningful management and regulatory goals. Since 1984, the Idaho Conservation Data Center, Boise, ID, has compiled information in a series of interrelated databases on the distribution and condition of riparian, wetland, and...
ERIC Educational Resources Information Center
Texas Tech Univ., Lubbock. Curriculum Center for Family and Consumer Sciences.
This student activity book and reference book, which are part of a family and consumer sciences education series focusing on a broad range of employment opportunities, are intended for use in 1- and 2-year textile and apparel production, management, and services programs for Texas high school students. The reference book provides information…
Services for Older Adults. Reference Book [and] Student Activity Book.
ERIC Educational Resources Information Center
Texas Tech Univ., Lubbock. Curriculum Center for Family and Consumer Sciences.
This student activity book and reference book, which are part of a family and consumer sciences education series focusing on a broad range of employment opportunities, are intended for use in 1- and 2-year programs preparing Texas high school students for employment in occupations related to providing services for older adults. The reference book…
Guidelines for Marine Biological Reference Collections. Unesco Reports in Marine Sciences, No. 22.
ERIC Educational Resources Information Center
Hureau, J. C.; Rice, A. L.
This manual provides practical advice on the appropriation, conservation, and documentation of a marine biological reference collection, in response to needs expressed by Mediterranean Arab countries. A reference collection is defined as a working museum containing a series of specimens with which biologists are able to compare their own material.…
NASA Astrophysics Data System (ADS)
Erhard, Jannis; Bleiziffer, Patrick; Görling, Andreas
2016-09-01
A power series approximation for the correlation kernel of time-dependent density-functional theory is presented. Using this approximation in the adiabatic-connection fluctuation-dissipation (ACFD) theorem leads to a new family of Kohn-Sham methods. The new methods yield reaction energies and barriers of unprecedented accuracy and enable a treatment of static (strong) correlation with an accuracy of high-level multireference configuration interaction methods but are single-reference methods allowing for a black-box-like handling of static correlation. The new methods exhibit a better scaling of the computational effort with the system size than rivaling wave-function-based electronic structure methods. Moreover, the new methods do not suffer from the problem of singularities in response functions plaguing previous ACFD methods and therefore are applicable to any type of electronic system.
Glacier Volume Change Estimation Using Time Series of Improved Aster Dems
NASA Astrophysics Data System (ADS)
Girod, Luc; Nuth, Christopher; Kääb, Andreas
2016-06-01
Volume change data are critical to understanding glacier response to climate change. The Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) system aboard the Terra (EOS AM-1) satellite has been a unique source of systematic stereoscopic images covering the whole globe at 15 m resolution and at a consistent quality for over 15 years. While satellite stereo sensors with significantly improved radiometric and spatial resolution are now available, the potential of ASTER data lies in its long, consistent time series, which is unrivaled though not fully exploited for change analysis due to limited data accuracy and precision. Here, we develop an improved method for ASTER DEM generation and implement it in the open-source photogrammetric library and software suite MicMac. The method relies on the computation of a rational polynomial coefficient (RPC) model and on the detection and correction of cross-track sensor jitter in order to compute DEMs. ASTER data are strongly affected by attitude jitter, mainly of approximately 4 km and 30 km wavelength, and improving the generation of ASTER DEMs requires removal of this effect. Our sensor modeling does not require ground control points and thus potentially allows automatic processing of large data volumes. As a proof of concept, we chose a set of glaciers with reference DEMs available to assess the quality of our measurements. We use time series of ASTER scenes from which we extract DEMs with a ground sampling distance of 15 m. Our method directly measures and accounts for the cross-track component of jitter, so the resulting DEMs are not contaminated by this process. Since the along-track component of jitter has the same direction as the stereo parallaxes, the two cannot be separated, and the extracted elevations are thus contaminated by along-track jitter.
Initial tests reveal no clear relation between the cross-track and along-track components, so the latter does not seem to be easily modeled analytically from the former. We therefore remove the remaining along-track jitter effects in the DEMs statistically through temporal DEM stacks, and finally compute glacier volume changes over time. Our method yields cleaner and spatially more complete elevation data, which also prove to be in better agreement with reference DEMs than NASA's AST14DMO standard DEM products. The quality of the demonstrated measurements promises to further unlock the underused potential of ASTER DEMs for glacier volume change time series on a global scale. The data produced by our method will help to better understand the response of glaciers to climate change and their influence on runoff and sea level.
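Because the jitter is dominated by a few known wavelengths (roughly 4 km and 30 km), its correction can be sketched as fitting one sine/cosine pair per wavelength to an elevation-difference profile and subtracting the fit. This is a simplified stand-in for the paper's sensor-model correction; amplitudes, phases, and noise below are invented:

```python
import numpy as np

rng = np.random.default_rng(1)
y = np.linspace(0, 60_000, 2000)         # along-track coordinate (m)

# Synthetic elevation-difference profile: jitter at the two dominant
# wavelengths reported for ASTER (~4 km and ~30 km) plus measurement noise
jitter = (3.0 * np.sin(2 * np.pi * y / 4000 + 0.7)
          + 5.0 * np.sin(2 * np.pi * y / 30000 - 1.2))
dh = jitter + 0.5 * rng.standard_normal(y.size)

# Linear model: a sin/cos pair per wavelength absorbs amplitude and phase
wavelengths = [4000.0, 30000.0]
cols = [np.ones_like(y)]
for lam in wavelengths:
    cols += [np.sin(2 * np.pi * y / lam), np.cos(2 * np.pi * y / lam)]
A = np.column_stack(cols)
coef, *_ = np.linalg.lstsq(A, dh, rcond=None)

corrected = dh - A @ coef                # residual after jitter removal
rms_before = np.sqrt(np.mean(dh**2))
rms_after = np.sqrt(np.mean(corrected**2))
```

The residual RMS after the fit approaches the noise floor, illustrating why periodic jitter is recoverable from the data while uncorrelated noise is not.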
A Kalman filter approach for the determination of celestial reference frames
NASA Astrophysics Data System (ADS)
Soja, Benedikt; Gross, Richard; Jacobs, Christopher; Chin, Toshio; Karbon, Maria; Nilsson, Tobias; Heinkelmann, Robert; Schuh, Harald
2017-04-01
The coordinate model of radio sources in International Celestial Reference Frames (ICRF), such as the ICRF2, has traditionally been a constant offset. While sufficient for most radio sources at current accuracy requirements, several sources exhibit significant temporal coordinate variations. In particular, the group of so-called special handling sources is characterized by large fluctuations in source position. For these sources, and for several from the "others" category of radio sources, a coordinate model that goes beyond a constant offset would be beneficial. However, due to the sheer number of radio sources in catalogs like the ICRF2, and even more so in the upcoming ICRF3, it is difficult to find the most appropriate coordinate model for every single radio source. For this reason, we have developed a time series approach to the determination of celestial reference frames (CRF). We feed the radio source coordinates derived from single very long baseline interferometry (VLBI) sessions sequentially into a Kalman filter and smoother, retaining their full covariances. The estimation of the source coordinates is carried out with a temporal resolution identical to the input data, i.e. usually 1-4 days. The coordinates are assumed to behave like random walk processes, an assumption that has already been applied successfully in the determination of terrestrial reference frames such as the JTRF2014. To apply the most suitable process noise value for every single radio source, their statistical properties are analyzed by computing their Allan standard deviations (ADEV). In addition to the determination of process noise values, the ADEV allows us to assess whether the variations in certain radio source positions deviate significantly from random walk processes. Our investigations also consider other means of source characterization, such as the structure index, in order to derive a suitable process noise model.
The Kalman filter CRFs resulting from the different approaches are compared with each other, with the original radio source position time series, and with a traditional CRF solution in which constant source positions are estimated in a global least-squares adjustment.
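Per coordinate component, the random-walk assumption reduces to a scalar Kalman filter whose process noise variance would in practice come from the ADEV analysis. A minimal sketch on synthetic session-wise coordinates (all noise levels are illustrative):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200
# True source coordinate as a random walk (units: mas, illustrative step size)
truth = np.cumsum(rng.standard_normal(n)) * 0.02
obs_sigma = 0.1                      # per-session measurement noise (mas)
obs = truth + obs_sigma * rng.standard_normal(n)

# Scalar Kalman filter: state = source coordinate, random-walk process model
q = 0.02**2      # process noise variance per step (from ADEV in the real method)
r = obs_sigma**2
x, p = obs[0], r                     # initialize from the first session
est = np.empty(n)
for k in range(n):
    p = p + q                        # predict: random walk leaves the state mean unchanged
    gain = p / (p + r)               # update with the session estimate
    x = x + gain * (obs[k] - x)
    p = (1 - gain) * p
    est[k] = x

rms_raw = np.sqrt(np.mean((obs - truth) ** 2))
rms_filt = np.sqrt(np.mean((est - truth) ** 2))
```

Tuning `q` per source is exactly the role the ADEV analysis plays in the abstract: too small a `q` oversmooths genuinely variable sources, too large a `q` passes the session noise through.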
Extraction of temporal information in functional MRI
NASA Astrophysics Data System (ADS)
Singh, M.; Sungkarat, W.; Jeong, Jeong-Won; Zhou, Yongxia
2002-10-01
The temporal resolution of functional MRI (fMRI) is limited by the shape of the haemodynamic response function (hrf) and the vascular architecture underlying the activated regions. Typically, the temporal resolution of fMRI is on the order of 1 s. We have developed a new data processing approach to extract temporal information on a pixel-by-pixel basis at the level of 100 ms from fMRI data. Instead of correlating or fitting the time-course of each pixel to a single reference function, which is the common practice in fMRI, we correlate each pixel's time-course to a series of reference functions that are shifted with respect to each other by 100 ms. The reference function yielding the highest correlation coefficient for a pixel is then used as a time marker for that pixel. A Monte Carlo simulation and experimental study of this approach were performed to estimate the temporal resolution as a function of signal-to-noise ratio (SNR) in the time-course of a pixel. Assuming a known and stationary hrf, the simulation and experimental studies suggest a lower limit in the temporal resolution of approximately 100 ms at an SNR of 3. The multireference function approach was also applied to extract timing information from an event-related motor movement study where the subjects flexed a finger on cue. The event was repeated 19 times with the event's presentation staggered to yield an approximately 100-ms temporal sampling of the haemodynamic response over the entire presentation cycle. The timing differences among different regions of the brain activated by the motor task were clearly visualized and quantified by this method. The results suggest that it is possible to achieve a temporal resolution of ~200 ms in practice with this approach.
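The multireference idea can be sketched as correlating one pixel's time course against a bank of response functions shifted in 100-ms steps and taking the best-correlating shift as the pixel's time marker. The gamma-variate response shape, sampling, and SNR below are illustrative assumptions, not the paper's hrf model:

```python
import numpy as np

def hrf(t, onset):
    """Simple gamma-variate haemodynamic response starting at `onset` (s)."""
    tt = np.clip(t - onset, 0, None)
    return tt**3 * np.exp(-tt / 1.2)

t = np.arange(0, 30, 0.5)                # densely sampled time course (s)
rng = np.random.default_rng(3)

true_onset = 2.3                         # pixel's actual haemodynamic delay (s)
signal = hrf(t, true_onset)
signal = signal / signal.std()
pixel = signal + (1 / 3) * rng.standard_normal(t.size)   # SNR ~ 3

# Bank of reference functions shifted by 100 ms; the best-correlating one
# serves as the time marker for this pixel
shifts = np.arange(0.0, 5.0, 0.1)
refs = [hrf(t, s) for s in shifts]
corrs = [np.corrcoef(pixel, ref)[0, 1] for ref in refs]
estimated_onset = shifts[int(np.argmax(corrs))]
```

Repeating and averaging over staggered event presentations, as the study does, tightens the estimate toward the ~100-ms limit quoted for SNR 3.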
Spatiotemporal alignment of in utero BOLD-MRI series.
Turk, Esra Abaci; Luo, Jie; Gagoski, Borjan; Pascau, Javier; Bibbo, Carolina; Robinson, Julian N; Grant, P Ellen; Adalsteinsson, Elfar; Golland, Polina; Malpica, Norberto
2017-08-01
To present a method for spatiotemporal alignment of in-utero magnetic resonance imaging (MRI) time series acquired during maternal hyperoxia for enabling improved quantitative tracking of blood oxygen level-dependent (BOLD) signal changes that characterize oxygen transport through the placenta to fetal organs. The proposed pipeline for spatiotemporal alignment of images acquired with a single-shot gradient echo echo-planar imaging includes 1) signal nonuniformity correction, 2) intravolume motion correction based on nonrigid registration, 3) correction of motion and nonrigid deformations across volumes, and 4) detection of the outlier volumes to be discarded from subsequent analysis. BOLD MRI time series collected from 10 pregnant women during 3T scans were analyzed using this pipeline. To assess pipeline performance, signal fluctuations between consecutive timepoints were examined. In addition, volume overlap and distance between manual region of interest (ROI) delineations in a subset of frames and the delineations obtained through propagation of the ROIs from the reference frame were used to quantify alignment accuracy. A previously demonstrated rigid registration approach was used for comparison. The proposed pipeline improved anatomical alignment of placenta and fetal organs over the state-of-the-art rigid motion correction methods. In particular, unexpected temporal signal fluctuations during the first normoxia period were significantly decreased (P < 0.01) and volume overlap and distance between region boundaries measures were significantly improved (P < 0.01). The proposed approach to align MRI time series enables more accurate quantitative studies of placental function by improving spatiotemporal alignment across placenta and fetal organs. Level of Evidence: 1. Technical Efficacy: Stage 1. J. Magn. Reson. Imaging 2017;46:403-412. © 2017 International Society for Magnetic Resonance in Medicine.
NASA Astrophysics Data System (ADS)
Ishtiaq, K. S.; Abdul-Aziz, O. I.
2014-12-01
We developed a scaling-based, simple empirical model for spatio-temporally robust prediction of the diurnal cycles of wetland net ecosystem exchange (NEE) by using an extended stochastic harmonic algorithm (ESHA). A reference-time observation from each diurnal cycle was utilized as the scaling parameter to normalize and collapse hourly observed NEE of different days into a single, dimensionless diurnal curve. The modeling concept was tested by parameterizing the unique diurnal curve and predicting hourly NEE of May to October (summer growing and fall seasons) between 2002 and 2012 for diverse wetland ecosystems, as available in the U.S. AmeriFLUX network. As an example, the Taylor Slough short-hydroperiod marsh site in the Florida Everglades had data for four consecutive growing seasons from 2009 to 2012; results showed impressive modeling efficiency (coefficient of determination, R2 = 0.66) and accuracy (ratio of root-mean-square error to the standard deviation of observations, RSR = 0.58). Model validation was performed with an independent year of NEE data, indicating equally impressive performance (R2 = 0.68, RSR = 0.57). The model included a parsimonious set of estimated parameters, which exhibited spatio-temporal robustness by collapsing onto narrow ranges. Model robustness was further investigated by analytically deriving and quantifying parameter sensitivity coefficients and a first-order uncertainty measure. The relatively robust, empirical NEE model can be applied for simulating continuous (e.g., hourly) NEE time series from a single reference observation (or a set of limited observations) at different wetland sites of comparable hydro-climatology, biogeochemistry, and ecology. The method can also be used for robust gap-filling of missing data in observed time series of periodic ecohydrological variables for wetland or other ecosystems.
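The scaling step at the heart of the method can be sketched as dividing each day's hourly NEE by its value at the reference hour, averaging the resulting dimensionless curves, and then reconstructing a new day's full cycle from its single reference-time observation. The synthetic diurnal shape and amplitudes below are illustrative, not the AmeriFLUX data:

```python
import numpy as np

rng = np.random.default_rng(5)
hours = np.arange(24)

# Synthetic hourly NEE for several days: one diurnal shape, day-specific amplitude
# (negative values = net CO2 uptake around midday)
shape = -np.maximum(np.sin(np.pi * (hours - 6) / 12), 0) + 0.15
days = [amp * shape + 0.05 * rng.standard_normal(24)
        for amp in (4.0, 6.0, 9.0, 5.0)]

ref_hour = 12
# Scale each day by its reference-time observation: dimensionless curves collapse
normalized = [d / d[ref_hour] for d in days]
master = np.mean(normalized, axis=0)     # single parameterized diurnal curve

# Predict a new day's full cycle from one observation at the reference hour
new_day = 7.0 * shape
predicted = new_day[ref_hour] * master
rmse = np.sqrt(np.mean((predicted - new_day) ** 2))
```

The collapse works because the days differ (in this idealization) only by a multiplicative amplitude, which is the kind of spatio-temporal robustness the abstract reports for the fitted parameters.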
Validation of endogenous reference genes for qRT-PCR analysis of human visceral adipose samples
Mehta, Rohini; Birerdinc, Aybike; Hossain, Noreen; Afendy, Arian; Chandhoke, Vikas; Younossi, Zobair; Baranova, Ancha
2010-05-21
Background Given the epidemic proportions of obesity worldwide and the concurrent prevalence of metabolic syndrome, there is an urgent need for better understanding of the underlying mechanisms of metabolic syndrome, in particular the gene expression differences which may participate in obesity, insulin resistance and the associated series of chronic liver conditions. Real-time PCR (qRT-PCR) is the standard method for studying changes in relative gene expression in different tissues and experimental conditions. However, variations in the amount of starting material, enzymatic efficiency and the presence of inhibitors can lead to quantification errors; hence the need for accurate data normalization is vital. Among several known strategies for data normalization, the use of reference genes as an internal control is the most common approach. Recent studies have shown that both obesity and the presence of insulin resistance influence the expression of commonly used reference genes in omental fat. In this study we validated candidate reference genes suitable for qRT-PCR profiling experiments using visceral adipose samples from obese and lean individuals. Results Cross-validation of the expression stability of eight selected reference genes using three popular algorithms, GeNorm, NormFinder and BestKeeper, found ACTB and RPII to be the most stable reference genes. Conclusions We recommend ACTB and RPII as the stable reference genes most suitable for gene expression studies of human visceral adipose tissue. The use of these genes as a reference pair may further enhance the robustness of qRT-PCR in this model system. PMID:20492695
Global Gridded Crop Model Evaluation: Benchmarking, Skills, Deficiencies and Implications.
NASA Technical Reports Server (NTRS)
Muller, Christoph; Elliott, Joshua; Chryssanthacopoulos, James; Arneth, Almut; Balkovic, Juraj; Ciais, Philippe; Deryng, Delphine; Folberth, Christian; Glotter, Michael; Hoek, Steven;
2017-01-01
Crop models are increasingly used to simulate crop yields at the global scale, but so far there is no general framework on how to assess model performance. Here we evaluate the simulation results of 14 global gridded crop modeling groups that have contributed historic crop yield simulations for maize, wheat, rice and soybean to the Global Gridded Crop Model Intercomparison (GGCMI) of the Agricultural Model Intercomparison and Improvement Project (AgMIP). Simulation results are compared to reference data at global, national and grid cell scales and we evaluate model performance with respect to time series correlation, spatial correlation and mean bias. We find that global gridded crop models (GGCMs) show mixed skill in reproducing time series correlations or spatial patterns at the different spatial scales. Generally, maize, wheat and soybean simulations of many GGCMs are capable of reproducing larger parts of observed temporal variability (time series correlation coefficients (r) of up to 0.888 for maize, 0.673 for wheat and 0.643 for soybean at the global scale) but rice yield variability cannot be well reproduced by most models. Yield variability can be well reproduced for most major producing countries by many GGCMs and for all countries by at least some. A comparison with gridded yield data and a statistical analysis of the effects of weather variability on yield variability shows that the ensemble of GGCMs can explain more of the yield variability than an ensemble of regression models for maize and soybean, but not for wheat and rice. We identify future research needs in global gridded crop modeling and for all individual crop modeling groups. In the absence of a purely observation-based benchmark for model evaluation, we propose that the best performing crop model per crop and region establishes the benchmark for all others, and modelers are encouraged to investigate how crop model performance can be increased. 
We make our evaluation system accessible to all crop modelers so that other modeling groups can also test their model performance against the reference data and the GGCMI benchmark.
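Per crop, model and spatial scale, the skill measures used in such a benchmarking exercise reduce to a time-series correlation coefficient and a mean bias between simulated and reference yields. A minimal sketch with invented national yields, not GGCMI data:

```python
import numpy as np

def evaluate(sim, obs):
    """Time-series correlation and mean bias of simulated vs. reference yields."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    r = np.corrcoef(sim, obs)[0, 1]   # temporal co-variability
    bias = np.mean(sim - obs)         # systematic offset (t/ha)
    return r, bias

# Illustrative national maize yields (t/ha) over seven seasons
obs = [8.1, 7.4, 9.0, 6.8, 8.6, 9.3, 7.9]
sim = [7.6, 7.0, 8.7, 6.9, 8.1, 8.8, 7.7]
r, bias = evaluate(sim, obs)
```

Separating correlation from bias matters here: a model can rank years correctly (high `r`) while being systematically offset (non-zero `bias`), and the two deficiencies call for different fixes.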
NASA Astrophysics Data System (ADS)
Zolgharni, Massoud; Dhutia, Niti M.; Cole, Graham D.; Willson, Keith; Francis, Darrel P.
2014-03-01
Echocardiographers are often reluctant to make the considerable time investment required to take multiple additional measurements of Doppler velocity. The main hurdle to obtaining multiple measurements is the time required to manually trace a series of Doppler envelopes. To make it easier to analyse more beats, we present an automated system for Doppler envelope quantification. It analyses long Doppler strips, spanning many heartbeats, and does not require the electrocardiogram to isolate individual beats. We tested its measurement of velocity-time integral and peak velocity against the reference standard, defined as the average of three experts who each made three separate measurements. The automated measurements of velocity-time integral showed strong correspondence (R2 = 0.94) and good Bland-Altman agreement (SD = 6.92%) with the reference consensus expert values, and indeed performed as well as the individual experts (R2 = 0.90 to 0.96, SD = 5.66% to 7.64%). The same performance was observed for peak velocities (R2 = 0.98, SD = 2.95%) and (R2 = 0.93 to 0.98, SD = 2.94% to 5.12%). This automated technology allows more than 10 times as many beats to be acquired and analysed compared to the conventional manual approach, while maintaining accuracy for each beat.
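The agreement statistics quoted above follow the usual Bland-Altman recipe applied to percentage differences between methods. A sketch with invented velocity-time-integral values, not the study's measurements:

```python
import numpy as np

def bland_altman_pct(auto, manual):
    """Mean and SD of pairwise percentage differences between two methods."""
    auto, manual = np.asarray(auto, float), np.asarray(manual, float)
    mean_pair = (auto + manual) / 2
    pct_diff = 100.0 * (auto - manual) / mean_pair
    return np.mean(pct_diff), np.std(pct_diff, ddof=1)

# Illustrative VTI measurements (cm): automated system vs. expert consensus
manual = [18.2, 22.5, 15.8, 20.1, 24.3, 17.0]
auto   = [18.9, 21.8, 16.2, 20.5, 23.1, 17.5]
mean_bias_pct, sd_pct = bland_altman_pct(auto, manual)
```

The SD of the percentage differences is the single-number agreement figure reported in the abstract (e.g. SD = 6.92% for velocity-time integral); the mean captures any systematic over- or under-reading.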
Space-Time Localization of Plasma Turbulence Using Multiple Spacecraft Radio Links
NASA Technical Reports Server (NTRS)
Armstrong, John W.; Estabrook, Frank B.
2011-01-01
Space weather refers to the variability of the solar wind plasma that can disturb satellites and systems and affect human space exploration. Accurate prediction requires information on the heliosphere inside the orbit of the Earth. However, for predictions using remote sensing, one needs not only the plane-of-sky position but also range information (the third spatial dimension), giving the distance to the plasma disturbances and thus when they might propagate or co-rotate to create disturbances at the orbit of the Earth. Appropriately processed radio signals from spacecraft whose communication lines-of-sight pass through the inner heliosphere can be used for this space-time localization of plasma disturbances. The solar plasma has an electron-density- and radio-wavelength-dependent index of refraction. An approximately monochromatic wave propagating through a thin layer of plasma turbulence acquires a geometrical-optics phase shift proportional to the electron density at the point of passage, the radio wavelength, and the thickness of the layer. This phase shift is the same for a wave propagating either up or down through the layer at the point of passage. This attribute can be used for space-time localization of plasma irregularities. The transfer function of plasma irregularities to the observed time series depends on the Doppler tracking mode. When spacecraft observations are in the two-way mode (downlink radio signal phase-locked to an uplink radio transmission), plasma fluctuations have a two-pulse response in the Doppler. In the two-way mode, the Doppler time series y2(t) is the difference between the frequency of the received downlink signal and the frequency of a ground reference oscillator.
A plasma blob localized at a distance x along the line of sight perturbs the phase on both the up- and downlink, giving rise to two events in the two-way tracking time series separated by a time lag that depends on the blob's distance from the Earth: T2 - 2x/c, where T2 is the two-way time-of-flight of radio waves to/from the spacecraft and c is the speed of light. In some tracking situations, more information is available. For example, with the 5-link Cassini radio system, the plasma contributions to the up and down links, y(sub up)(t) and y(sub dn)(t), can be computed separately. The time series y(sub up)(t) and y(sub dn)(t) respond to a localized plasma blob with one event in each time series. These events are also separated in time by T2 - 2x/c. By cross-correlating the up- and downlink Doppler time series, the time separation of the plasma events can be measured and hence the plasma blob's distance from the Earth determined. Since the plane-of-sky position is known, this technique allows localization of plasma events in time and three space dimensions.
NASA Astrophysics Data System (ADS)
Duveiller, G.; Donatelli, M.; Fumagalli, D.; Zucchini, A.; Nelson, R.; Baruth, B.
2017-02-01
Coupled atmosphere-ocean general circulation models (GCMs) simulate different realizations of possible future climates at global scale under contrasting scenarios of land-use and greenhouse gas emissions. Such data require several additional processing steps before they can be used to drive impact models. Spatial downscaling, typically by regional climate models (RCMs), and bias-correction are two such steps that have already been addressed for Europe. Yet, the errors in the resulting daily meteorological variables may be too large for specific model applications. Crop simulation models are particularly sensitive to these inconsistencies and thus require further processing of GCM-RCM outputs. Moreover, crop models are often run in a stochastic manner by using various plausible weather time series (often generated using stochastic weather generators) to represent the climate time scale for a period of interest (e.g. 2000 ± 15 years), while GCM simulations typically provide a single time series for a given emission scenario. To inform agricultural policy-making, data on near- and medium-term decadal time horizons, e.g. 2020 or 2030, are most often requested. Taking a sample of multiple years from these single time series to represent time horizons in the near future is particularly problematic because selecting overlapping years may lead to spurious trends, creating artefacts in the results of the impact model simulations. This paper presents a database of consolidated and coherent future daily weather data for Europe that addresses these problems. Input data consist of daily temperature and precipitation from three dynamically downscaled and bias-corrected regional climate simulations of the IPCC A1B emission scenario created within the ENSEMBLES project. Solar radiation is estimated from temperature based on an auto-calibration procedure. Wind speed and relative air humidity are collected from historical series.
From these variables, reference evapotranspiration and vapour pressure deficit are estimated ensuring consistency within daily records. The weather generator ClimGen is then used to create 30 synthetic years of all variables to characterize the time horizons of 2000, 2020 and 2030, which can readily be used for crop modelling studies.
A novel approach for detecting heat waves: the Standardized Heat-Wave Index.
NASA Astrophysics Data System (ADS)
Cucchi, Marco; Petitta, Marcello; Calmanti, Sandro
2016-04-01
Extreme temperatures have an impact on the energy balance of any living organism and on the operational capabilities of critical infrastructures. The ability to capture the occurrence of extreme temperature events is therefore an essential property of a multi-hazard extreme climate indicator. In this paper we introduce a new index for the detection of such extreme temperature events, called SHI (Standardized Heat-Wave Index), developed in the context of the XCF project for the construction of a multi-hazard extreme climate indicator (ECI). SHI is a probabilistic index based on the analysis of maximum daily temperature time series; it is standardized, enabling comparisons over space/time and with other indices, and it is capable of describing both extreme cold and hot events. Given a particular location, SHI is constructed from the time series of local maximum daily temperatures with the following procedure: three-day cumulated maximum daily temperatures are assigned to each day of the time series; for each of the previously calculated values, the probability of occurrence within the month the reference day belongs to is computed; these probability values are then projected onto a standard normal distribution, yielding the standardized indices. In this work we present results obtained using the NCEP Reanalysis dataset for air temperature at the sigma 0.995 level, whose timespan ranges from 1948 to 2014. Given the specific framework of this work, the geographical focus of this study is limited to the African continent. We present a validation of the index by showing its use for monitoring heat waves under different climate regimes.
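A minimal sketch of the three-step SHI construction as we read it (empirical plotting-position probabilities stand in for whatever probability model the authors actually use; function and variable names are ours):

```python
import numpy as np
from scipy.stats import norm

def shi(tmax, months, window=3):
    """Standardized Heat-Wave Index sketch.

    tmax   : daily maximum temperature series
    months : calendar month label (1-12) for each day
    """
    tmax = np.asarray(tmax, float)
    months = np.asarray(months)
    # 1) three-day cumulated maximum daily temperature assigned to each day
    cum = np.convolve(tmax, np.ones(window), mode="same")
    out = np.full(tmax.size, np.nan)
    for m in np.unique(months):
        sel = months == m
        vals = cum[sel]
        # 2) empirical probability of occurrence within the same calendar month
        ranks = vals.argsort().argsort() + 1
        p = ranks / (vals.size + 1)          # plotting-position probabilities
        # 3) project onto a standard normal distribution
        out[sel] = norm.ppf(p)
    return out

rng = np.random.default_rng(1)
tmax = rng.normal(30, 5, 300)
months = np.repeat(np.arange(1, 11), 30)     # ten pseudo-months of 30 days
index = shi(tmax, months)
```

Large positive index values flag heat-wave-like runs of hot days; large negative values flag extreme cold spells, which is what makes the index two-sided.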
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-06
... Model CL-600-2D24 (Regional Jet Series 900) Airplanes AGENCY: Federal Aviation Administration (FAA), DOT... referred to Task 05-51-27-210-801 of Part 2, Volume 1, of the Bombardier CRJ Series Regional Jet Aircraft... Regulatory Flexibility Act. List of Subjects in 14 CFR Part 39 Air transportation, Aircraft, Aviation safety...
Monitoring the fracture behavior of SiCp/Al alloy composites using infrared lock-in thermography
NASA Astrophysics Data System (ADS)
Kordatos, E. Z.; Myriounis, D. P.; Hasan, S. T.; Matikas, T. E.
2009-03-01
This work deals with the study of fracture behavior of silicon carbide particle-reinforced (SiCp) A359 aluminum alloy matrix composites using an innovative nondestructive method based on lock-in thermography. The heat wave, generated by the thermo-mechanical coupling and the intrinsic energy dissipated during mechanical cyclic loading of the sample, was detected by an infrared camera. The coefficient of thermo-elasticity allows for the transformation of the temperature profiles into stresses. A new procedure was developed to determine the crack growth rate using thermographic mapping of the material undergoing fatigue: (a) The distribution of temperature and stresses at the surface of the specimen was monitored during the test. To this end, thermal images were obtained as a function of time and saved in the form of a movie. (b) The stresses were evaluated in a post-processing mode, along a series of equally spaced reference lines of the same length, set in front of the crack-starting notch. The idea was that the stress monitored at the location of a line versus time (or fatigue cycles) would increase while the crack approached the line, then attain a maximum when the crack tip was on the line. Because the crack growth path could not be predicted and was not expected to follow a straight line in front of the notch, the stresses were monitored along a series of lines of a certain length, instead of a series of equally spaced points in front of the notch. The exact path of the crack could be easily determined by looking at the stress maxima along each of these reference lines. The thermographic results on the crack growth rate of the metal matrix composite (MMC) samples with three different heat treatments were correlated with measurements obtained by the conventional compliance method, and found to be in agreement.
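Step (b) reduces to a per-line peak search: the fatigue cycle at which each reference line's stress peaks marks the crack tip's arrival at that line, and line spacing over cycle increments gives da/dN. A sketch on synthetic stress histories (all geometry and loading numbers below are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(5)
cycles = np.arange(0, 20000, 100)            # fatigue cycles at which frames exist
line_pos = np.arange(1.0, 6.0)               # assumed line distances from notch [mm]
arrival_true = 2000 + 3000 * (line_pos - 1)  # cycle at which the crack reaches each line

# stress rises as the crack approaches a line and peaks when the tip is on it
stress = np.exp(-((cycles[None, :] - arrival_true[:, None]) / 2500.0) ** 2)
stress += rng.normal(0.0, 0.01, stress.shape)

# arrival cycle per line = location of the stress maximum along that line
arrival_est = cycles[np.argmax(stress, axis=1)]

# crack growth rate da/dN from line spacing over cycle increments
rate = np.diff(line_pos) / np.diff(arrival_est)
```

The same argmax logic, applied along the length of each line rather than over time, recovers where on the line the crack crossed, i.e. the crack path.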
The Design of a Propeller for a U.S. Coast Guard Icebreaker Tugboat
1975-10-01
series as investigated by Van Lammeren, Van Manen, and Oosterveld. This propeller series, commonly referred to as the Wageningen B-screw series...REFERENCES Van Lammeren, W.P.A., Van Manen, J.D., and Oosterveld, M.W.C., "The Wageningen B-Screw Series," Transactions of the Society of Naval Architects...forces has been reported by Van Gunsteren and Pronk. They investigated both single and twin screw ships of various types and the results are
Prediction of stock markets by the evolutionary mix-game model
NASA Astrophysics Data System (ADS)
Chen, Fang; Gou, Chengling; Guo, Xiaoqian; Gao, Jieping
2008-06-01
This paper presents the efforts of using the evolutionary mix-game model, which is a modified form of the agent-based mix-game model, to predict financial time series. Here, we carried out three modifications of the original mix-game model by adding strategy evolution abilities to the agents, and then applied the new model, referred to as the evolutionary mix-game model, to forecast the Shanghai Stock Exchange Composite Index. The results show that these modifications can greatly improve the accuracy of prediction when proper parameters are chosen.
Nonequilibrium Dynamics of Arbitrary-Range Ising Models with Decoherence: An Exact Analytic Solution
2013-04-03
spontaneous deexcitation, spontaneous excitation, and elastic dephasing, respectively (see Fig. 1). We refer to the spin-changing processes (σ̂±) as Raman ...Series of Raman flips of the spin on site j can be formally accounted for as a magnetic field of strength 2J_jk/N acting for a time τ_j^up − τ_j^down. In...2σ̂±_j, all Rayleigh jumps can be evaluated at t = 0 (their commutation with Raman jumps only affects the overall sign of the wave function). To the
Accelerating Large Data Analysis By Exploiting Regularities
NASA Technical Reports Server (NTRS)
Moran, Patrick J.; Ellsworth, David
2003-01-01
We present techniques for discovering and exploiting regularity in large curvilinear data sets. The data can be based on a single mesh or a mesh composed of multiple submeshes (also known as zones). Multi-zone data are typical to Computational Fluid Dynamics (CFD) simulations. Regularities include axis-aligned rectilinear and cylindrical meshes as well as cases where one zone is equivalent to a rigid-body transformation of another. Our algorithms can also discover rigid-body motion of meshes in time-series data. Next, we describe a data model where we can utilize the results from the discovery process in order to accelerate large data visualizations. Where possible, we replace general curvilinear zones with rectilinear or cylindrical zones. In rigid-body motion cases we replace a time-series of meshes with a transformed mesh object where a reference mesh is dynamically transformed based on a given time value in order to satisfy geometry requests, on demand. The data model enables us to make these substitutions and dynamic transformations transparently with respect to the visualization algorithms. We present results with large data sets where we combine our mesh replacement and transformation techniques with out-of-core paging in order to achieve significant speed-ups in analysis.
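The rigid-body-equivalence test between zones can be sketched with the standard Kabsch algorithm (our choice of method for illustration; the paper's actual discovery algorithm may differ): fit the best rotation and translation between corresponding vertices, then check whether the residual is numerically zero.

```python
import numpy as np

def rigid_transform(a, b):
    """Best-fit rotation R and translation t mapping point set a onto b (Kabsch)."""
    ca, cb = a.mean(0), b.mean(0)
    H = (a - ca).T @ (b - cb)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cb - R @ ca
    return R, t

def is_rigid_copy(a, b, tol=1e-6):
    """True if mesh b is (numerically) a rigid-body transform of mesh a."""
    R, t = rigid_transform(a, b)
    rmsd = np.sqrt(((a @ R.T + t - b) ** 2).sum(1).mean())
    return rmsd < tol

# demo: a rotated + translated copy is detected as rigid, a scaled one is not
rng = np.random.default_rng(0)
a = rng.normal(size=(50, 3))
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
if np.linalg.det(Q) < 0:
    Q[:, 0] = -Q[:, 0]                       # force a proper rotation
b = a @ Q.T + np.array([1.0, -2.0, 0.5])
```

In a time-series setting the same fit, applied frame to frame, yields the dynamic transformation that replaces storing a full mesh per time step.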
Detecting nonlinear dynamics of functional connectivity
NASA Astrophysics Data System (ADS)
LaConte, Stephen M.; Peltier, Scott J.; Kadah, Yasser; Ngan, Shing-Chung; Deshpande, Gopikrishna; Hu, Xiaoping
2004-04-01
Functional magnetic resonance imaging (fMRI) is a technique that is sensitive to correlates of neuronal activity. The application of fMRI to measure functional connectivity of related brain regions across hemispheres (e.g. left and right motor cortices) has great potential for revealing fundamental physiological brain processes. Primarily, functional connectivity has been characterized by linear correlations in resting-state data, which may not provide a complete description of its temporal properties. In this work, we broaden the measure of functional connectivity to study not only linear correlations, but also those arising from deterministic, non-linear dynamics. Here the delta-epsilon approach is extended and applied to fMRI time series. The method of delays is used to reconstruct the joint system defined by a reference pixel and a candidate pixel. The crux of this technique relies on determining whether the candidate pixel provides additional information concerning the time evolution of the reference. As in many correlation-based connectivity studies, we fix the reference pixel. Every brain location is then used as a candidate pixel to estimate the spatial pattern of deterministic coupling with the reference. Our results indicate that measured connectivity is often emphasized in the motor cortex contra-lateral to the reference pixel, demonstrating the suitability of this approach for functional connectivity studies. In addition, discrepancies with traditional correlation analysis provide initial evidence for non-linear dynamical properties of resting-state fMRI data. Consequently, the non-linear characterization provided from our approach may provide a more complete description of the underlying physiology and brain function measured by this type of data.
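The method of delays mentioned above is simple to state in code: a scalar series is turned into state vectors of lagged copies, and the joint system of reference and candidate pixels is the concatenation of their embeddings. A minimal sketch with synthetic signals (embedding dimension and lag are arbitrary here):

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Method-of-delays reconstruction: row t is [x(t), x(t+tau), ..., x(t+(dim-1)*tau)]."""
    x = np.asarray(x, float)
    n = x.size - (dim - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

# Joint reconstruction of a reference pixel's series and a candidate pixel's
# series; the delta-epsilon test then asks whether the candidate's coordinates
# add information about the reference's time evolution. Signals are synthetic.
t = np.linspace(0.0, 40 * np.pi, 4000)
ref = np.sin(t)                      # stand-in for the reference pixel series
cand = np.cos(t)                     # stand-in for a coupled candidate pixel
E = delay_embed(ref, dim=3, tau=5)
joint = np.hstack([E, delay_embed(cand, 3, 5)])
```

Repeating this for every candidate pixel against a fixed reference produces the spatial coupling map described in the abstract.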
Contribution of TIGA reprocessing to the ITRF densification
NASA Astrophysics Data System (ADS)
Rudenko, S.; Dähnn, M.; Gendt, G.; Brandt, A.; Nischan, T.
2009-04-01
Analysis of tide gauge measurements for the purpose of sea level change investigations requires a well defined reference frame. Such a reference frame can be realized through precise positions of GPS stations located at or near tide gauges (TIGA stations) and analyzed within the IGS GPS Tide Gauge Benchmark Monitoring Pilot Project (TIGA). To tie this reference frame to the International Terrestrial Reference Frame (ITRF), one should simultaneously process GPS data from TIGA stations and from IGS stations included in the ITRF. A time series of GPS station positions has recently been derived by reprocessing GPS data from about 400 globally distributed GPS stations, covering in total the time span from 1998 to 2008, using the EPOS-Potsdam software developed at GFZ and improved in recent years. The analysis is based on the use of the IERS Conventions 2003, ITRF2005 as the a priori reference frame, the FES2004 ocean tide loading model, absolute phase centre variations for GPS satellite transmit and ground receive antennae, and other models. About 220 stations of the solution are IGS stations and about 180 are TIGA GPS stations that are not IGS ones. The solution includes weekly coordinates of GPS stations, daily values of the Earth rotation parameters and their rates, as well as satellite antenna offsets. Our new solution can thus contribute to the ITRF densification by providing positions of about 200 stations not present in ITRF2005. The solution can also be used for the integration of regional frames. The paper presents the results of the analysis and the comparison of our solution with ITRF2005 and with the solutions of other TIGA and IGS Analysis Centres.
ERP-Variations on Time Scales Between Hours and Months Derived From GNSS Observations
NASA Astrophysics Data System (ADS)
Weber, R.; Englich, S.; Mendes Cerveira, P.
2007-05-01
Current observations gained by the space geodetic techniques, especially VLBI, GPS and SLR, allow for the determination of Earth Rotation Parameters (ERPs - polar motion, UT1/LOD) with unprecedented accuracy and temporal resolution. This presentation focuses on contributions to the ERP recovery provided by satellite navigation systems (primarily GPS). The IGS (International GNSS Service), for example, currently provides daily polar motion with an accuracy of better than 0.1 mas and LOD estimates with an accuracy of a few microseconds. To study more rapid variations in polar motion and LOD we established in a first step a high-resolution (hourly) ERP time series from GPS observation data of the IGS network covering the year 2005. The calculations were carried out by means of the Bernese GPS Software V5.0, considering observations from a subset of 113 fairly stable stations out of the IGS05 reference frame sites. From these ERP time series the amplitudes of the major diurnal and semidiurnal variations caused by ocean tides are estimated. After correcting the series for ocean tides, the remaining geodetically observed excitation is compared with variations of atmospheric excitation (AAM). To study the sensitivity of the estimates with respect to the applied mapping function we applied both the widely used NMF (Niell Mapping Function) and the VMF1 (Vienna Mapping Function 1). In addition, based on computations covering two months in 2005, the potential improvement due to the use of additional GLONASS data will be discussed.
Lugina, K. M. [Department of Geography, St. Petersburg State University, St. Petersburg, Russia]; Groisman, P. Ya. [National Climatic Data Center, Asheville, North Carolina, USA]; Vinnikov, K. Ya. [Department of Atmospheric Sciences, University of Maryland, College Park, Maryland, USA]; Koknaeva, V. V. [State Hydrological Institute, St. Petersburg, Russia]; Speranskaya, N. A. [State Hydrological Institute, St. Petersburg, Russia]
2006-01-01
The mean monthly and annual values of surface air temperature compiled by Lugina et al. have been taken mainly from the World Weather Records, Monthly Climatic Data for the World, and Meteorological Data for Individual Years over the Northern Hemisphere Excluding the USSR. These published records were supplemented with information from different national publications. In the original archive, after removal of station records believed to be nonhomogeneous or biased, 301 and 265 stations were used to determine the mean temperature for the Northern and Southern hemispheres, respectively. The new version of the station temperature archive (used for evaluation of the zonally-averaged temperatures) was created in 1995. The change to the archive was required because data from some stations became unavailable for analyses in the 1990s. During this process, special care was taken to secure homogeneity of zonally averaged time series. When a station (or a group of stations) stopped reporting, a "new" station (or group of stations) was selected in the same region, and its data for the past 50 years were collected and added to the archive. The processing (area-averaging) was organized in such a way that each time series from a new station spans the reference period (1951-1975) and the years thereafter. It was determined that the addition of the new stations had essentially no effect on the zonally-averaged values for the pre-1990 period.
Effect of spatial averaging on multifractal properties of meteorological time series
NASA Astrophysics Data System (ADS)
Hoffmann, Holger; Baranowski, Piotr; Krzyszczak, Jaromir; Zubik, Monika
2016-04-01
Introduction The process-based models for large-scale simulations require input of agro-meteorological quantities that are often in the form of time series of coarse spatial resolution. Therefore, knowledge about their scaling properties is fundamental for transferring locally measured fluctuations to larger scales and vice-versa. However, the scaling analysis of these quantities is complicated due to the presence of localized trends and non-stationarities. Here we assess how spatially aggregating meteorological data to coarser resolutions affects the data's temporal scaling properties. While it is known that spatial aggregation may affect spatial data properties (Hoffmann et al., 2015), it is unknown how it affects temporal data properties. Therefore, the objective of this study was to characterize the aggregation effect (AE) with regard to both temporal and spatial input data properties, considering scaling properties (i.e. statistical self-similarity) of the chosen agro-meteorological time series through multifractal detrended fluctuation analysis (MFDFA). Materials and Methods Time series from the years 1982-2011 were spatially averaged from 1 km to 10, 25, 50 and 100 km resolution to assess the impact of spatial aggregation. Daily minimum, mean and maximum air temperature (2 m), precipitation, global radiation, wind speed and relative humidity (Zhao et al., 2015) were used. To reveal the multifractal structure of the time series, we used the procedure described in Baranowski et al. (2015). The diversity of the studied multifractals was evaluated by the parameters of the time series spectra. In order to analyse differences in multifractal properties relative to the 1 km resolution grids, data at coarser resolutions were disaggregated to 1 km.
Results and Conclusions Analysing the effect of spatial averaging on multifractal properties, we observed that the spatial patterns of the multifractal spectrum (MS) of all meteorological variables differed from those of the 1 km grids, and MS parameters were biased by -29.1 % (precipitation; width of MS) up to >4 % (min. temperature, radiation; asymmetry of MS). Also, the spatial variability of MS parameters was strongly affected at the highest aggregation (100 km). The obtained results confirm that spatial data aggregation may strongly affect temporal scaling properties. This should be taken into account when upscaling for large-scale studies. Acknowledgements The study was conducted within FACCE MACSUR. Please see Baranowski et al. (2015) for details on funding. References Baranowski, P., Krzyszczak, J., Sławiński, C. et al. (2015). Climate Research 65, 39-52. Hoffmann, H., G. Zhao, L.G.J. Van Bussel et al. (2015). Climate Research 65, 53-69. Zhao, G., Siebert, S., Rezaei, E. et al. (2015). Agricultural and Forest Meteorology 200, 156-171.
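As a minimal illustration of the detrended-fluctuation machinery underlying MFDFA (here only the monofractal, q = 2 case with first-order detrending; the multifractal version used in the study generalizes the fluctuation function over q-moments, and scale choices below are ours):

```python
import numpy as np

def dfa(x, scales):
    """First-order detrended fluctuation analysis; returns F(s) per scale."""
    x = np.asarray(x, float)
    profile = np.cumsum(x - x.mean())        # integrate the mean-removed series
    F = []
    for s in scales:
        nseg = profile.size // s
        segs = profile[:nseg * s].reshape(nseg, s)
        tt = np.arange(s)
        mse = []
        for seg in segs:
            coef = np.polyfit(tt, seg, 1)    # remove the local linear trend
            mse.append(np.mean((seg - np.polyval(coef, tt)) ** 2))
        F.append(np.sqrt(np.mean(mse)))
    return np.asarray(F)

rng = np.random.default_rng(4)
scales = np.array([16, 32, 64, 128, 256])
F = dfa(rng.normal(size=8192), scales)
h, _ = np.polyfit(np.log(scales), np.log(F), 1)   # scaling exponent (~0.5 for white noise)
```

The aggregation effect discussed above would show up as a systematic change in such exponents (and, in the multifractal case, in the spectrum width and asymmetry) when the input series is first averaged to a coarser grid.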
NASA Astrophysics Data System (ADS)
Zhang, Qian; Harman, Ciaran J.; Kirchner, James W.
2018-02-01
River water-quality time series often exhibit fractal scaling, which here refers to autocorrelation that decays as a power law over some range of scales. Fractal scaling presents challenges to the identification of deterministic trends because (1) fractal scaling has the potential to lead to false inference about the statistical significance of trends and (2) the abundance of irregularly spaced data in water-quality monitoring networks complicates efforts to quantify fractal scaling. Traditional methods for estimating fractal scaling - in the form of spectral slope (β) or other equivalent scaling parameters (e.g., Hurst exponent) - are generally inapplicable to irregularly sampled data. Here we consider two types of estimation approaches for irregularly sampled data and evaluate their performance using synthetic time series. These time series were generated such that (1) they exhibit a wide range of prescribed fractal scaling behaviors, ranging from white noise (β = 0) to Brown noise (β = 2) and (2) their sampling gap intervals mimic the sampling irregularity (as quantified by both the skewness and mean of gap-interval lengths) in real water-quality data. The results suggest that none of the existing methods fully account for the effects of sampling irregularity on β estimation. First, the results illustrate the danger of using interpolation for gap filling when examining autocorrelation, as the interpolation methods consistently underestimate or overestimate β under a wide range of prescribed β values and gap distributions. Second, the widely used Lomb-Scargle spectral method also consistently underestimates β. A previously published modified form, using only the lowest 5 % of the frequencies for spectral slope estimation, has very poor precision, although the overall bias is small. 
Third, a recent wavelet-based method, coupled with an aliasing filter, generally has the smallest bias and root-mean-squared error among all methods for a wide range of prescribed β values and gap distributions. The aliasing method, however, does not itself account for sampling irregularity, and this introduces some bias in the result. Nonetheless, the wavelet method is recommended for estimating β in irregular time series until improved methods are developed. Finally, all methods' performances depend strongly on the sampling irregularity, highlighting that the accuracy and precision of each method are data specific. Accurately quantifying the strength of fractal scaling in irregular water-quality time series remains an unresolved challenge for the hydrologic community and for other disciplines that must grapple with irregular sampling.
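A hedged sketch of the Lomb-Scargle route to β for an irregular series (the paper's point is precisely that this estimator is biased low, so the value returned for the β = 2 test signal below should not be taken at face value; the frequency-grid choices here are ours):

```python
import numpy as np
from scipy.signal import lombscargle

def spectral_slope(t, y, nfreq=200):
    """Estimate beta in P(f) ~ f^(-beta) for an irregularly sampled series."""
    t = np.asarray(t, float)
    y = np.asarray(y, float) - np.mean(y)
    span = t.max() - t.min()
    dt_med = np.median(np.diff(np.sort(t)))
    # log-spaced frequencies between the record length and the median sampling rate
    f = np.logspace(np.log10(2.0 / span), np.log10(0.5 / dt_med), nfreq)
    p = lombscargle(t, y, 2 * np.pi * f)     # lombscargle expects angular frequencies
    slope, _ = np.polyfit(np.log10(f), np.log10(p), 1)
    return -slope

# Irregularly sampled Brown noise (true beta = 2): keep a random 40% of samples.
rng = np.random.default_rng(2)
tt = np.arange(20000.0)
yy = np.cumsum(rng.normal(size=tt.size))
keep = np.sort(rng.choice(tt.size, size=8000, replace=False))
beta = spectral_slope(tt[keep], yy[keep])
```

Any shortfall of the estimate below 2 here illustrates the underestimation behavior the study quantifies; the wavelet-plus-aliasing-filter approach it recommends is not sketched.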
An algebraic method for constructing stable and consistent autoregressive filters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harlim, John, E-mail: jharlim@psu.edu; Department of Meteorology, the Pennsylvania State University, University Park, PA 16802; Hong, Hoon, E-mail: hong@ncsu.edu
2015-02-15
In this paper, we introduce an algebraic method to construct stable and consistent univariate autoregressive (AR) models of low order for filtering and predicting nonlinear turbulent signals with memory depth. By stable, we refer to the classical stability condition for the AR model. By consistent, we refer to the classical consistency constraints of Adams–Bashforth methods of order two. One attractive feature of this algebraic method is that the model parameters can be obtained without directly knowing any training data set, as opposed to many standard, regression-based parameterization methods. It takes only long-time average statistics as inputs. The proposed method provides a discretization time step interval which guarantees the existence of a stable and consistent AR model and simultaneously produces the parameters for the AR models. In our numerical examples with two chaotic time series with different characteristics of decaying time scales, we find that the proposed AR models produce significantly more accurate short-term predictive skill and comparable filtering skill relative to the linear regression-based AR models. These encouraging results are robust across wide ranges of discretization times, observation times, and observation noise variances. Finally, we also find that the proposed model produces an improved short-time prediction relative to the linear regression-based AR models in forecasting a data set that characterizes the variability of the Madden–Julian Oscillation, a dominant tropical atmospheric wave pattern.
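The classical stability condition referred to above can be checked directly from the AR characteristic polynomial. The sketch below covers only that condition (the paper's order-two Adams–Bashforth consistency constraints are specific to its algebraic construction and are not reproduced here):

```python
import numpy as np

def ar_stable(phi):
    """Classical stability check for an AR(p) model
        x_t = phi[0]*x_{t-1} + ... + phi[p-1]*x_{t-p} + e_t.
    The model is stable iff all roots of
        z^p - phi_1 z^{p-1} - ... - phi_p
    lie strictly inside the unit circle."""
    phi = np.asarray(phi, float)
    roots = np.roots(np.r_[1.0, -phi])
    return bool(np.all(np.abs(roots) < 1.0))
```

For example, coefficients (0.5, 0.3) give a stable AR(2) model, while (1.5, -0.3) place a root outside the unit circle and are rejected.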
Design factors and considerations for a time-based flight management system
NASA Technical Reports Server (NTRS)
Vicroy, D. D.; Williams, D. H.; Sorensen, J. A.
1986-01-01
Recent NASA Langley Research Center research to develop a technology data base from which an advanced Flight Management System (FMS) design might evolve is reviewed. In particular, the generation of fixed range cruise/descent reference trajectories which meet predefined end conditions of altitude, speed, and time is addressed. Results on the design and theoretical basis of the trajectory generation algorithm are presented, followed by a brief discussion of a series of studies that are being conducted to determine the accuracy requirements of the aircraft and weather models resident in the trajectory generation algorithm. Finally, studies to investigate the interface requirements between the pilot and an advanced FMS are considered.
Dharmalingam, Rajasekaran; Dash, Subhransu Sekhar; Senthilnathan, Karthikrajan; Mayilvaganan, Arun Bhaskar; Chinnamuthu, Subramani
2014-01-01
This paper deals with the performance of a unified power quality conditioner (UPQC) based on current source converter (CSC) topology. The UPQC is used to mitigate power quality problems like harmonics and sag. The shunt and series active filters perform the simultaneous elimination of current and voltage problems. The power fed is linked through a common DC link, which maintains constant real power exchange. The DC link is connected through the reactor. The real power supply is given by the photovoltaic system for the compensation of power quality problems. The reference current and voltage generation for the shunt and series converters is based on a phase-locked loop and synchronous reference frame theory. The proposed UPQC-CSC design has superior performance for mitigating the power quality problems. PMID:25013854
Manufacture and quality control of interconnecting wire harnesses
NASA Technical Reports Server (NTRS)
1973-01-01
A four-volume series of documents has been prepared as a standard reference. Each volume may be used separately and covers wire and cable preparation as well as harness fabrication and installation. The series should be a useful addition to the libraries of manufacturers of electrical and electronic equipment.
Leo Kanner's Mention of 1938 in His Report on Autism Refers to His First Patient
ERIC Educational Resources Information Center
Olmsted, Dan; Blaxill, Mark
2016-01-01
Leo Kanner begins his landmark 1943 case series on autistic children by stating the condition was first brought to his attention in 1938. Recent letters to "JADD" have described this reference as "mysterious" and speculated it refers to papers published that year by Despert or Asperger. In fact, as Kanner goes on to state, 1938…
ERIC Educational Resources Information Center
Bopp, Richard E., Ed.; Smith, Linda C., Ed.
Like the first two editions, this third edition is designed primarily to provide the beginning student of library and information science with an overview both of the concepts and processes behind today's reference services and of the most important sources consulted in answering common types of reference questions. The first 12 chapters deal with…
ERIC Educational Resources Information Center
O'Brien, Nancy Patricia
The purpose of this guide is to provide information about the key reference and information resources in the field of education. Sources include items published from 1990 through 1998, with selective inclusion of significant or unique works published prior to 1990. The guide is divided into 14 categories that reflect different aspects of…
IDCDACS: IDC's Distributed Application Control System
NASA Astrophysics Data System (ADS)
Ertl, Martin; Boresch, Alexander; Kianička, Ján; Sudakov, Alexander; Tomuta, Elena
2015-04-01
The Preparatory Commission for the CTBTO is an international organization based in Vienna, Austria. Its mission is to establish a global verification regime to monitor compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT), which bans all nuclear explosions. For this purpose, time series data from a global network of seismic, hydro-acoustic and infrasound (SHI) sensors are transmitted to the International Data Centre (IDC) in Vienna in near-real-time, where they are processed to locate events that may be nuclear explosions. We have newly designed the distributed application control system that glues together the various components of the automatic waveform data processing system at the IDC (IDCDACS). Our highly scalable solution preserves the existing architecture of the IDC processing system, which has proved successful over many years of operational use, but replaces proprietary components with open-source solutions and custom-developed software. Existing code was refactored and extended to obtain a reusable software framework that is flexibly adaptable to different types of processing workflows. Automatic data processing is organized in series of self-contained processing steps, each series being referred to as a processing pipeline. Pipelines process data by time intervals, i.e. the time series data received from monitoring stations are organized in segments based on the time when the data were recorded. So-called data monitor applications queue the data for processing in each pipeline based on specific conditions, e.g. data availability, elapsed time or completion states of preceding processing pipelines. IDCDACS consists of a configurable number of distributed monitoring and controlling processes, a message broker and a relational database. All processes communicate through message queues hosted on the message broker. Persistent state information is stored in the database.
A configurable processing controller instantiates and monitors all data processing applications. Due to decoupling by message queues the system is highly versatile and failure tolerant. The implementation utilizes the RabbitMQ open-source messaging platform, which is based upon the Advanced Message Queuing Protocol (AMQP), an on-the-wire protocol (like HTTP) and an open industry standard. IDCDACS uses the high-availability capabilities provided by RabbitMQ and is equipped with failure recovery features to survive network and server outages. It is implemented in C and Python and is operated in a Linux environment at the IDC. Although IDCDACS was specifically designed for the existing IDC processing system, its architecture is generic and reusable for different automatic processing workflows, e.g. similar to those described in (Olivieri et al. 2012, Kværna et al. 2012). Major advantages are its independence of the specific data processing applications used and the possibility to reconfigure IDCDACS for different types of processing, data and trigger logic. A possible future development would be to use the IDCDACS framework for different scientific domains, e.g. for processing of Earth observation satellite data, extending the one-dimensional time series intervals to spatio-temporal data cubes. REFERENCES Olivieri, M., J. Clinton (2012) An almost fair comparison between Earthworm and SeisComp3, Seismological Research Letters, 83(4), 720-727. Kværna, T., S. J. Gibbons, D. B. Harris, D. A. Dodge (2012) Adapting pipeline architectures to track developing aftershock sequences and recurrent explosions, Proceedings of the 2012 Monitoring Research Review: Ground-Based Nuclear Explosion Monitoring Technologies, 776-785.
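The monitor/pipeline decoupling described above can be sketched with the Python standard library alone (the production system uses RabbitMQ message queues over AMQP; the names, interval values and trigger condition below are illustrative only):

```python
import queue
import threading

interval_queue = queue.Queue()   # stands in for a broker-hosted message queue
results = []

def data_monitor(intervals, available):
    """Data monitor: queue a time interval for a pipeline only once its
    trigger condition (here: data availability) is met."""
    for iv in intervals:
        if available(iv):
            interval_queue.put(iv)
    interval_queue.put(None)                 # sentinel: no more work

def pipeline_worker():
    """Self-contained processing step: consume intervals from the queue.
    The worker knows nothing about the monitor, only about the queue."""
    while True:
        iv = interval_queue.get()
        if iv is None:
            break
        results.append(("processed", iv))

w = threading.Thread(target=pipeline_worker)
w.start()
data_monitor(intervals=[(0, 600), (600, 1200), (1200, 1800)],
             available=lambda iv: iv[1] <= 1200)   # pretend the last segment is incomplete
w.join()
```

Because monitor and worker interact only through the queue, either side can be restarted or scaled out independently, which is the failure-tolerance property the message-broker design buys.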
Wang, Wenguang; Ma, Xiaoli; Guo, Xiaoyu; Zhao, Mingbo; Tu, Pengfei; Jiang, Yong
2015-09-18
In order to overcome the bottleneck created by the shortage of reference standards for comprehensive quality control of traditional Chinese medicines (TCMs), a series of strategies, including a single reference standard to determine multiple compounds (SSDMC), quantitative analysis by standardized reference extract (QASRE), and quantitative nuclear magnetic resonance spectroscopy (qNMR), were proposed, and Mahoniae Caulis was selected as an example to develop and validate these methods for the simultaneous determination of four alkaloids: columbamine, jatrorrhizine, palmatine, and berberine. Comprehensive comparisons among these methods and with the conventional external standard method (ESM) were carried out. The relative expanded uncertainty of measurement was used for the first time to compare their credibility. The results showed that all three newly developed methods can accurately accomplish the quantification using only one purified reference standard, but each has its own advantages and disadvantages as well as a specific application scope, which are also discussed in detail in this paper. Copyright © 2015 Elsevier B.V. All rights reserved.
Bel Hadj Hmida, Y; Tahri, N; Sellami, A; Yangui, N; Jlidi, R; Beyrouti, M I; Krichen, M S; Masmoudi, H
2001-01-01
In order to determine the sensitivity of CEA in the diagnosis of colorectal carcinoma, we studied a series of 48 patients with colorectal carcinoma (1992-1996). The sensitivity was 52% with a reference value of 5 ng/ml and 68.7% with a reference value of 2.5 ng/ml. With a reference value of 5 ng/ml, the sensitivity of CEA was only 37% for patients with colorectal carcinoma at Dukes stage B, 66.6% for patients at stage C and 75% for patients at stage D. The CEA assay was carried out with a tube-based sandwich immunoenzymatic technique. There was no statistically significant correlation between the pre-operative CEA level and the localisation of the tumour or its histologic type; in contrast, it was significantly correlated with lymph node metastasis. A significant relationship between the pre-operative CEA level and the Dukes stage was found for a reference value of 10 ng/ml but not for a reference value of 5 ng/ml. We calculated the specificity of CEA for cancers of the colon and rectum, which was 76.98% with a reference value of 5 ng/ml and 86% with a reference value of 10 ng/ml.
Series: Practical guidance to qualitative research. Part 3: Sampling, data collection and analysis.
Moser, Albine; Korstjens, Irene
2018-12-01
In the course of our supervisory work over the years, we have noticed that qualitative research tends to evoke a lot of questions and worries, so-called frequently asked questions (FAQs). This series of four articles intends to provide novice researchers with practical guidance for conducting high-quality qualitative research in primary care. By 'novice' we mean Master's students and junior researchers, as well as experienced quantitative researchers who are engaging in qualitative research for the first time. This series addresses their questions and provides researchers, readers, reviewers and editors with references to criteria and tools for judging the quality of qualitative research papers. The second article focused on context, research questions and designs, and referred to publications for further reading. This third article addresses FAQs about sampling, data collection and analysis. The data collection plan needs to be broadly defined and open at first, and become flexible during data collection. Sampling strategies should be chosen in such a way that they yield rich information and are consistent with the methodological approach used. Data saturation determines sample size and will be different for each study. The most commonly used data collection methods are participant observation, face-to-face in-depth interviews and focus group discussions. Analyses in ethnographic, phenomenological, grounded theory, and content analysis studies yield different narrative findings: a detailed description of a culture, the essence of the lived experience, a theory, and a descriptive summary, respectively. The fourth and final article will focus on trustworthiness and publishing qualitative research.
Series: Practical guidance to qualitative research. Part 3: Sampling, data collection and analysis
Moser, Albine; Korstjens, Irene
2018-01-01
Abstract In the course of our supervisory work over the years, we have noticed that qualitative research tends to evoke a lot of questions and worries, so-called frequently asked questions (FAQs). This series of four articles intends to provide novice researchers with practical guidance for conducting high-quality qualitative research in primary care. By ‘novice’ we mean Master’s students and junior researchers, as well as experienced quantitative researchers who are engaging in qualitative research for the first time. This series addresses their questions and provides researchers, readers, reviewers and editors with references to criteria and tools for judging the quality of qualitative research papers. The second article focused on context, research questions and designs, and referred to publications for further reading. This third article addresses FAQs about sampling, data collection and analysis. The data collection plan needs to be broadly defined and open at first, and become flexible during data collection. Sampling strategies should be chosen in such a way that they yield rich information and are consistent with the methodological approach used. Data saturation determines sample size and will be different for each study. The most commonly used data collection methods are participant observation, face-to-face in-depth interviews and focus group discussions. Analyses in ethnographic, phenomenological, grounded theory, and content analysis studies yield different narrative findings: a detailed description of a culture, the essence of the lived experience, a theory, and a descriptive summary, respectively. The fourth and final article will focus on trustworthiness and publishing qualitative research. PMID:29199486
Multi-geodetic characterization of the seasonal signal at the CERGA geodetic reference, France
NASA Astrophysics Data System (ADS)
Memin, A.; Viswanathan, V.; Fienga, A.; Santamaría-Gómez, A.; Boy, J. P.
2016-12-01
Crustal deformations due to surface-mass loading account for a significant part of the variability in geodetic time series. A thorough understanding of the loading signal observed by geodetic techniques should help in improving terrestrial reference frame (TRF) realizations. Yet, discrepancies between crustal motion estimates from models of surface-mass loading and observations remain so large that no model is currently recommended by the IERS for reducing the data. We investigate the discrepancy observed in the seasonal variations of the CERGA station in the south of France. We characterize the seasonal motions of the reference geodetic station CERGA from GNSS, SLR and LLR. We compare the station motion observed with GNSS and SLR, and we estimate changes in the station-to-Moon distance using an improved processing strategy. We investigate the consistency between these geodetic techniques and compare the observed station motion with that estimated using models of surface-mass change. In that regard, we compute atmospheric loading effects using surface pressure fields from ECMWF, assuming an ocean response according to the classical inverted barometer (IB) assumption, considered to be valid for periods typically exceeding a week. We also use general circulation ocean models (ECCO and GLORYS) forced by wind, heat and fresh-water fluxes. The continental water storage is described using the GLDAS/Noah and MERRA-Land models. Using the surface-mass models, we estimate the amplitude of the seasonal vertical motion of the CERGA station to range between 5 and 10 mm, with a maximum reached in August, mostly due to hydrology. The horizontal seasonal motion of the station may reach up to 3 mm. Such a station motion should induce a change in the distance to the Moon reaching up to 10 mm, large enough to be detected in LLR time series and compared to GNSS- and SLR-derived motion.
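Estimating the amplitude and phase of a seasonal (annual) term like the 5-10 mm vertical signal above usually comes down to a least-squares fit of a once-per-year sinusoid. A minimal sketch (not the authors' processing chain), assuming uniform sampling over an integer number of years so the cos/sin projections coincide with the least-squares solution:

```python
import math

def annual_fit(t_years, y):
    """Project a uniformly sampled series onto cos/sin at one cycle per
    year; returns (amplitude, phase). With full years of uniform sampling
    this equals the least-squares annual fit."""
    n = len(y)
    w = 2.0 * math.pi
    a = 2.0 / n * sum(yi * math.cos(w * ti) for ti, yi in zip(t_years, y))
    b = 2.0 / n * sum(yi * math.sin(w * ti) for ti, yi in zip(t_years, y))
    amplitude = math.hypot(a, b)
    phase = math.atan2(b, a)  # series peaks at t = phase/(2*pi) years (mod 1)
    return amplitude, phase

# Synthetic weekly series over 4 years: a 7 mm annual cycle peaking at
# t = 0.6 yr (roughly August) plus a constant offset.
t = [i / 52.0 for i in range(208)]
y = [3.0 + 7.0 * math.cos(2 * math.pi * (ti - 0.6)) for ti in t]
amp, ph = annual_fit(t, y)
```

On real geodetic series one would fit trend, annual and semi-annual terms jointly, but the projection above already recovers the amplitude and the month of maximum exactly for this clean synthetic case.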
Flood mapping with multitemporal MODIS data
NASA Astrophysics Data System (ADS)
Son, Nguyen-Thanh; Chen, Chi-Farn; Chen, Cheng-Ru
2014-05-01
Floods are among the most devastating and frequent disasters, resulting in loss of human life and severe damage to infrastructure and agricultural production. Flooding occurs annually in the Mekong River Delta (MRD), Vietnam, lasting from July to November. Information on spatiotemporal flood dynamics is thus important for planners to devise successful strategies for flood monitoring and mitigation of its negative effects. The main objective of this study is to develop an approach for weekly mapping of flood dynamics in the MRD with Moderate Resolution Imaging Spectroradiometer (MODIS) data using the water fraction model (WFM). The data processing for 2009 comprised three main steps: (1) data pre-processing to construct smooth time series of the difference in the values (DVLE) between the land surface water index (LSWI) and the enhanced vegetation index (EVI) using empirical mode decomposition (EMD), (2) flood derivation using the WFM, and (3) accuracy assessment. The mapping results were compared with ground reference data constructed from Envisat Advanced Synthetic Aperture Radar (ASAR) data. Although several error sources, including mixed-pixel problems and resolution bias between the mapping results and the ground reference data, could lower the classification accuracy, the comparisons indicated satisfactory results, with an overall accuracy of 80.5% and a Kappa coefficient of 0.61. These results were reaffirmed by a close correlation between the MODIS-derived flood area and that of the ground reference map at the provincial level, with a coefficient of determination (R2) of 0.93. Considering the importance of remote sensing for monitoring floods and mitigating the damage they cause to crops and infrastructure, this study demonstrates the value of using time-series MODIS DVLE data for weekly flood monitoring in the MRD with the aid of EMD and the WFM. 
This approach, which can provide quantitative information on spatiotemporal flood dynamics for monitoring purposes, is readily transferable to other regions of the world.
GARS O'Higgins as a core station for geodesy in Antarctica
NASA Astrophysics Data System (ADS)
Klügel, Thomas; Diedrich, Erhard; Falk, Reinhard; Hessels, Uwe; Höppner, Kathrin; Kühmstedt, Elke; Metzig, Robert; Plötz, Christian; Reinhold, Andreas; Schüler, Torben; Wojdziak, Reiner
2014-05-01
The German Antarctic Receiving Station GARS O'Higgins at the northern tip of the Antarctic Peninsula has been a dual-purpose facility for Earth observation for more than 20 years. It serves as a satellite ground station for payload data downlink and telecommanding of remote sensing satellites as well as a geodetic observatory for global reference frames and global change. Both applications use the same 9 m diameter radio telescope. For space geodesy and astrometry, the radio telescope significantly improves the coverage of the southern hemisphere and plays an essential role within the global Very Long Baseline Interferometry (VLBI) network. In particular, the determination of the Earth Orientation Parameters (EOP) and the sky coverage of the International Celestial Reference Frame (ICRF) benefit from the location at high southern latitude. Further geodetic instrumentation includes several permanent GNSS receivers (since 1995), two SAR corner reflectors (since 2013) and, in the past, a PRARE system (1996-2004). In addition, absolute gravity measurements were performed in 1997 and 2011. All geodetic reference points are tied together by a local survey network. The varied geodetic instrumentation and the long time series at O'Higgins allow a reliable determination of crustal motions. VLBI station velocities, continuous GNSS time series and absolute gravity measurements consistently document an uplift rate of about 5 mm/a. A pressure gauge and a radar tide gauge, referenced to space by a GNSS antenna on top, allow the measurement of sea level changes independently of crustal motions, as well as the determination of the ellipsoidal height of the sea surface, that is, the geoid height plus the mean dynamic topography. Its outstanding location on the Antarctic continent will keep GARS O'Higgins attractive for future polar orbiting satellite missions and an essential station for the global VLBI network. 
Future plans envisage a development towards an observatory for environmentally relevant research.
Fullerton, David S.; Bush, Charles A.; Pennell, Jean N.
2003-01-01
This data set contains surficial geologic units in the Eastern and Central United States, as well as a glacial limit line showing the position of maximum glacial advance during various geologic time periods. The geologic units represent surficial deposits and other surface materials that accumulated or formed during the past 2+ million years, such as soils, alluvium, and glacial deposits. These surface materials are referred to collectively by many geologists as regolith, the mantle of fragmented and generally unconsolidated material that overlies the bedrock foundation of a continent. This data set and the printed map produced from it, U.S. Geological Survey (USGS) Geologic Investigation Series I-2789, were based on 31 published maps in the USGS's Quaternary Geologic Atlas of the United States map series (USGS Miscellaneous Investigations Series I-1420). The data were compiled at 1:1,000,000 scale, to be viewed as a digital map at 1:2,000,000 nominal scale and to be printed as a conventional paper map at 1:2,500,000 scale.
Xie, Ping; Zhao, Jiang Yan; Wu, Zi Yi; Sang, Yan Fang; Chen, Jie; Li, Bin Bin; Gu, Hai Ting
2018-04-01
The analysis of inconsistent hydrological series is one of the major problems to be solved for engineering hydrological calculation in a changing environment. In this study, the differences between non-consistency and non-stationarity were analyzed from the perspective of the composition of hydrological series. Inconsistent hydrological phenomena were generalized into hydrological processes with inheritance, variability and evolution characteristics or regulations. Furthermore, hydrological genes were identified following the theory of biological genes, while their inheritance bases and variability bases were determined based on the composition of hydrological series at different time scales. To identify and test the components of hydrological genes, we constructed a diagnosis system of hydrological genes. Taking the P-3 distribution as an example, we described the process of construction and expression of the moment genes to illustrate the inheritance, variability and evolution principles of hydrological genes. Taking the annual minimum 1-month runoff series of Yunjinghong station in the Lancangjiang River basin as an example, we verified the feasibility and practicability of hydrological gene theory for the calculation of inconsistent hydrological frequency. The results showed that the method can reveal the evolution of inconsistent hydrological series. It therefore provides a new research pathway for engineering hydrological calculation in a changing environment and an essential reference for the assessment of water security.
Wang, Ying; Chen, Yajuan; Ding, Liping; Zhang, Jiewei; Wei, Jianhua; Wang, Hongzhi
2016-01-01
The vertical segments of Populus stems are an ideal experimental system for analyzing the gene expression patterns involved in primary and secondary growth during wood formation. Suitable internal control genes are indispensable to quantitative real time PCR (qRT-PCR) assays of gene expression. In this study, the expression stability of eight candidate reference genes was evaluated in a series of vertical stem segments of Populus tomentosa. Analysis through software packages geNorm, NormFinder and BestKeeper showed that genes ribosomal protein (RP) and tubulin beta (TUBB) were the most unstable across the developmental stages of P. tomentosa stems, and the combination of the three reference genes, eukaryotic translation initiation factor 5A (eIF5A), Actin (ACT6) and elongation factor 1-beta (EF1-beta) can provide accurate and reliable normalization of qRT-PCR analysis for target gene expression in stem segments undergoing primary and secondary growth in P. tomentosa. These results provide crucial information for transcriptional analysis in the P. tomentosa stem, which may help to improve the quality of gene expression data in these vertical stem segments, which constitute an excellent plant system for the study of wood formation.
Fluctuation Analysis of Redox Potential to Distinguish Microbial Fe(II) Oxidation.
Enright, A M L; Ferris, F G
2016-11-01
We developed a novel method for distinguishing abiotic and biological iron oxidation in liquid media using oxidation-reduction (redox) potential time series data. The instrument and processing algorithm were tested by immersing the tip of a Pt electrode with an Ag-AgCl reference electrode into an active iron-oxidizing biofilm in a groundwater discharge zone, as well as in two abiotic systems: a killed sample and a chemical control from the same site. We used detrended fluctuation analysis to characterize average root mean square fluctuation behavior, which was distinct in the live system. The calculated α scaling exponents determined by detrended fluctuation analysis were significantly different at p < 0.001. This indicates that time series of electrode response data may be used to distinguish live and abiotic chemical reaction pathways. Due to its simplicity, portability, and small size, the instrument may be suitable for characterization of extraterrestrial environments where water has been observed, such as Mars and Europa. Key Words: Oxidation-reduction potential-Detrended fluctuation analysis-Iron-oxidizing bacteria. Astrobiology 16, 846-852.
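Detrended fluctuation analysis, the core of the method above, is a standard algorithm: integrate the mean-removed series, linearly detrend it in windows of size n, and read the scaling exponent α from the slope of log F(n) versus log n. A compact first-order (DFA-1) sketch — not the authors' exact processing chain — applied to white noise, for which α ≈ 0.5:

```python
import math
import random

def dfa_alpha(x, window_sizes):
    """DFA-1: return the scaling exponent alpha of series x."""
    mean = sum(x) / len(x)
    profile, s = [], 0.0
    for xi in x:                     # cumulative sum of the mean-removed series
        s += xi - mean
        profile.append(s)
    log_n, log_f = [], []
    for n in window_sizes:
        sq, count = 0.0, 0
        for start in range(0, len(profile) - n + 1, n):
            seg = profile[start:start + n]
            # Least-squares linear detrend within the window.
            t = list(range(n))
            tm, ym = sum(t) / n, sum(seg) / n
            denom = sum((ti - tm) ** 2 for ti in t)
            slope = sum((ti - tm) * (yi - ym)
                        for ti, yi in zip(t, seg)) / denom
            sq += sum((yi - (ym + slope * (ti - tm))) ** 2
                      for ti, yi in zip(t, seg))
            count += n
        log_n.append(math.log(n))
        log_f.append(0.5 * math.log(sq / count))   # log F(n)
    # Slope of log F(n) vs log n is alpha.
    lm, fm = sum(log_n) / len(log_n), sum(log_f) / len(log_f)
    return (sum((a - lm) * (b - fm) for a, b in zip(log_n, log_f)) /
            sum((a - lm) ** 2 for a in log_n))

random.seed(42)
white = [random.gauss(0.0, 1.0) for _ in range(4096)]
alpha = dfa_alpha(white, [8, 16, 32, 64, 128])   # expect alpha near 0.5
```

Uncorrelated noise yields α near 0.5, while persistent (correlated) fluctuations push α toward 1 — the kind of contrast the authors exploit to separate live from abiotic electrode signals.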
Fearless versus fearful speculative financial bubbles
NASA Astrophysics Data System (ADS)
Andersen, J. V.; Sornette, D.
2004-06-01
Using a recently introduced rational expectation model of bubbles, based on the interplay between stochasticity and positive feedbacks of prices on returns and volatility, we develop a new methodology to test how this model classifies nine time series that have been previously considered as bubbles ending in crashes. The model predicts the existence of two anomalous behaviors occurring simultaneously: (i) super-exponential price growth and (ii) volatility growth, that we refer to as the “fearful singular bubble” regime. Out of the nine time series, we find that five pass our tests and can be characterized as “fearful singular bubbles”. The four other cases are the information technology Nasdaq bubble and three bubbles of the Hang Seng index ending in crashes in 1987, 1994 and 1997. According to our analysis, these four bubbles have developed with essentially no significant increase of their volatility. This paper thus proposes that speculative bubbles ending in crashes form two groups hitherto unrecognized, namely those accompanied by increasing volatility (reflecting increasing risk perception) and those without change of volatility (reflecting an absence of risk perception).
Training Methodology. Part 3: Instructional Methods and Techniques; an Annotated Bibliography.
ERIC Educational Resources Information Center
National Inst. of Mental Health (DHEW), Bethesda, MD.
One of a series of bibliographies within a larger series on mental health inservice training and training methodology, this publication contains 346 abstracts, annotations, and other recent selected references (largely 1960-68) on apprenticeship, coaching, programmed instruction, correspondence study, lectures, group discussion, meetings,…
Superior Cross-Species Reference Genes: A Blueberry Case Study
Die, Jose V.; Rowland, Lisa J.
2013-01-01
The advent of affordable Next Generation Sequencing technologies has had a major impact on studies of many crop species, where access to genomic technologies and genome-scale data sets has been extremely limited until now. The recent development of genomic resources in blueberry will enable the application of high-throughput gene expression approaches that should relatively quickly increase our understanding of blueberry physiology. These studies, however, require a highly accurate and robust workflow and necessitate the identification of reference genes with high expression stability for correct target gene normalization. To create a set of superior reference genes for blueberry expression analyses, we mined a publicly available transcriptome data set from blueberry for orthologs to a set of Arabidopsis genes that showed the most stable expression in a developmental series. In total, the expression stability of 13 putative reference genes was evaluated by qPCR, and a set of new reference genes with high stability values across a developmental series in fruits and floral buds of blueberry was identified. We also demonstrated the need to use at least two, preferably three, reference genes to avoid inconsistencies in results, even when superior reference genes are used. The new reference genes identified here provide a valuable resource for accurate normalization of gene expression in Vaccinium spp. and may be useful for other members of the Ericaceae family as well. PMID:24058469
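The "stability values" these reference-gene studies rank genes by are typically computed with tools like geNorm (mentioned in the Populus study above). The geNorm stability measure M for gene j is the mean, over all other candidate genes k, of the standard deviation of log2(expression_j / expression_k) across samples; a lower M means more stable. A sketch with entirely hypothetical expression values (the gene names are borrowed from the abstracts only as labels):

```python
import math

def stdev(v):
    m = sum(v) / len(v)
    return math.sqrt(sum((x - m) ** 2 for x in v) / (len(v) - 1))

def genorm_m(expression):
    """expression: dict gene -> list of expression values per sample.
    Returns the geNorm stability value M for each gene (lower = more stable)."""
    genes = list(expression)
    m_values = {}
    for j in genes:
        sds = []
        for k in genes:
            if k == j:
                continue
            ratios = [math.log2(a / b)
                      for a, b in zip(expression[j], expression[k])]
            sds.append(stdev(ratios))
        m_values[j] = sum(sds) / len(sds)
    return m_values

# Hypothetical expression across five samples: the first two genes vary
# in near-constant proportion (stable), the third fluctuates wildly.
expr = {
    "eIF5A": [10.0, 10.5, 9.8, 10.2, 10.1],
    "ACT6":  [20.0, 21.2, 19.5, 20.4, 20.3],
    "TUBB":  [5.0, 12.0, 3.0, 9.0, 15.0],
}
m = genorm_m(expr)
```

Because M is built from pairwise ratios, a gene is only "stable" relative to the rest of the candidate panel — which is why both abstracts recommend validating several candidates and normalizing against two or three of the best.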
NASA Astrophysics Data System (ADS)
Das, L.; Dutta, M.; Akhter, J.; Meher, J. K.
2016-12-01
It is a challenging task to create station-level (local scale) climate change information over the mountainous locations of the Western Himalayan Region (WHR) in India because of limited data availability and poor data quality. In the present study, missing values in the station data were handled through the Multiple Imputation by Chained Equations (MICE) technique. Finally, data from 22 rain gauge stations and 16 temperature stations with continuous records during 1901-2005 and 1969-2009, respectively, were considered as reference stations for developing downscaled rainfall and temperature time series from five commonly available GCMs in the IPCC's different generation assessment reports, namely the 2nd, 3rd, 4th and 5th, hereafter known as SAR, TAR, AR4 and AR5, respectively. Downscaling models were developed using the combined data from the ERA-Interim reanalysis and the GCMs' historical runs (although the forcings were not identical across generations) as predictors and station-level rainfall and temperature as predictands. Station-level downscaled rainfall and temperature time series were constructed for the five GCMs available in each generation. A regionally averaged downscaled time series comprising all stations was prepared for each model and generation, and the downscaled results were compared with the observed time series. Finally, an Overall Model Improvement Index (OMII) was developed using the downscaling results, which was used to investigate model improvement across generations as well as the improvement of downscaling results obtained from the Empirical Statistical Downscaling (ESD) methods. In the case of temperature, models have improved from SAR to AR5 over the study area. In almost all the GCMs, TAR shows the worst performance over the WHR according to the different statistical indices used in this study. In the case of precipitation, no model has shown gradual improvement from SAR to AR5, for either interpolated or downscaled values.
NASA Astrophysics Data System (ADS)
Wang, C.; Lu, L.
2015-12-01
The Southeast U.S. is listed by the Census Bureau as one of the fastest growing regions, covering two of the eleven megaregions of the United States (Florida and Piedmont Atlantic). The Defense Meteorological Satellite Program (DMSP) Operational Line-scan System (OLS) nighttime light (NTL) imagery offers a good opportunity for characterizing the extent and dynamics of urban development at global and regional scales. However, the commonly used thresholding technique for NTL-based urban land mapping often underestimates suburban and rural areas and overestimates urban extents. In this study we developed a novel approach to estimating impervious surface area (ISA) by integrating NTL and optical reflectance data. A geographically weighted regression model was built to extract ISA from the Vegetation-Adjusted NTL Urban Index (VANUI). The ISA was estimated for each year from 1992 to 2013 to generate an ISA time series for the U.S. Southeast region. Using the National Land Cover Database (NLCD) percent imperviousness products (2001, 2006, and 2010) as reference data, accuracy assessment indicated that our approach considerably improved the ISA estimation, especially in suburban areas. With the ISA time series, a nonparametric Mann-Kendall trend analysis was performed to detect hotspots of human settlement expansion, followed by exploration of decennial U.S. census data to link these patterns to migration flows in these hotspots. Our results provide significant insights into human settlement of the U.S. Southeast over the past decades. The proposed approach has great potential for mapping ISA at broad scales with nighttime light data such as DMSP/OLS and the new-generation VIIRS products. The ISA time series generated in this study can be used to assess anthropogenic impacts on regional climate, environment and ecosystem services in the U.S. Southeast.
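The index feeding the regression above combines vegetation and nightlight information as VANUI = (1 − NDVI) × NTL, which suppresses saturated bright cores where vegetation is sparse. A heavily simplified sketch, with synthetic values: here a single ordinary least-squares fit of ISA against VANUI stands in for the geographically weighted regression, which would fit such a model per location with distance-decay weights.

```python
def vanui(ndvi, ntl):
    """Vegetation-Adjusted NTL Urban Index: (1 - NDVI) * NTL per pixel."""
    return [(1.0 - v) * n for v, n in zip(ndvi, ntl)]

def ols(x, y):
    """Ordinary least squares y = slope * x + intercept."""
    n = len(x)
    xm, ym = sum(x) / n, sum(y) / n
    slope = (sum((a - xm) * (b - ym) for a, b in zip(x, y)) /
             sum((a - xm) ** 2 for a in x))
    return slope, ym - slope * xm

ndvi = [0.8, 0.6, 0.4, 0.2, 0.1]      # dense vegetation -> low VANUI
ntl = [5.0, 20.0, 35.0, 50.0, 60.0]   # brighter lights toward the urban core
x = vanui(ndvi, ntl)
# Synthetic ISA generated as 1.5 * VANUI + 2, so the fit recovers it exactly.
isa = [1.5 * xi + 2.0 for xi in x]
slope, intercept = ols(x, isa)
```

Swapping the global fit for locally weighted fits is what lets the method adapt the VANUI-to-ISA relationship across urban, suburban and rural settings.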
Aerosol Climate Time Series Evaluation In ESA Aerosol_cci
NASA Astrophysics Data System (ADS)
Popp, T.; de Leeuw, G.; Pinnock, S.
2015-12-01
Within the ESA Climate Change Initiative (CCI), Aerosol_cci (2010 - 2017) conducts intensive work to improve algorithms for the retrieval of aerosol information from European sensors. By the end of 2015, full mission time series of 2 GCOS-required aerosol parameters were completely validated and released: Aerosol Optical Depth (AOD) from the dual-view ATSR-2 / AATSR radiometers (3 algorithms, 1995 - 2012), and stratospheric extinction profiles from the star-occultation GOMOS spectrometer (2002 - 2012). Additionally, a 35-year multi-sensor time series of the qualitative Absorbing Aerosol Index (AAI) together with sensitivity information and an AAI model simulator is available. Complementary aerosol properties requested by GCOS are in a "round robin" phase, where various algorithms are inter-compared: fine-mode AOD, mineral dust AOD (from the thermal IASI spectrometer), absorption information and aerosol layer height. As a quasi-reference for validation in a few selected regions with sparse ground-based observations, the multi-pixel GRASP algorithm for the POLDER instrument is used. Validation of the first dataset versions (vs. AERONET, MAN) and inter-comparison to other satellite datasets (MODIS, MISR, SeaWiFS) proved the high quality of the available datasets, comparable to other satellite retrievals, and revealed needs for algorithm improvement (for example for higher AOD values) which were taken into account for a reprocessing. The datasets contain pixel-level uncertainty estimates which are also validated. The paper will summarize and discuss the results of the major reprocessing and validation conducted in 2015. The focus will be on the ATSR, GOMOS and IASI datasets. Validation of the pixel-level uncertainties will be summarized and discussed, including unknown components and their potential usefulness and limitations. 
Opportunities for time series extension with successor instruments of the Sentinel family will be described and the complementarity of the different satellite aerosol products (e.g. dust vs. total AOD, ensembles from different algorithms for the same sensor) will be discussed.
Aerosol Climate Time Series in ESA Aerosol_cci
NASA Astrophysics Data System (ADS)
Popp, Thomas; de Leeuw, Gerrit; Pinnock, Simon
2016-04-01
Within the ESA Climate Change Initiative (CCI), Aerosol_cci (2010 - 2017) conducts intensive work to improve algorithms for the retrieval of aerosol information from European sensors. Meanwhile, full mission time series of 2 GCOS-required aerosol parameters have been completely validated and released: Aerosol Optical Depth (AOD) from the dual-view ATSR-2 / AATSR radiometers (3 algorithms, 1995 - 2012), and stratospheric extinction profiles from the star-occultation GOMOS spectrometer (2002 - 2012). Additionally, a 35-year multi-sensor time series of the qualitative Absorbing Aerosol Index (AAI) together with sensitivity information and an AAI model simulator is available. Complementary aerosol properties requested by GCOS are in a "round robin" phase, where various algorithms are inter-compared: fine-mode AOD, mineral dust AOD (from the thermal IASI spectrometer, but also from the ATSR instruments and the POLDER sensor), absorption information and aerosol layer height. As a quasi-reference for validation in a few selected regions with sparse ground-based observations, the multi-pixel GRASP algorithm for the POLDER instrument is used. Validation of the first dataset versions (vs. AERONET, MAN) and inter-comparison to other satellite datasets (MODIS, MISR, SeaWiFS) proved the high quality of the available datasets, comparable to other satellite retrievals, and revealed needs for algorithm improvement (for example for higher AOD values) which were taken into account for a reprocessing. The datasets contain pixel-level uncertainty estimates which were also validated and improved in the reprocessing. For the three ATSR algorithms the use of an ensemble method was tested. The paper will summarize and discuss the status of dataset reprocessing and validation. The focus will be on the ATSR, GOMOS and IASI datasets. Validation of the pixel-level uncertainties will be summarized and discussed, including unknown components and their potential usefulness and limitations. 
Opportunities for time series extension with successor instruments of the Sentinel family will be described and the complementarity of the different satellite aerosol products (e.g. dust vs. total AOD, ensembles from different algorithms for the same sensor) will be discussed.
Variational models for discontinuity detection
NASA Astrophysics Data System (ADS)
Vitti, Alfonso; Battista Benciolini, G.
2010-05-01
The Mumford-Shah variational model produces a smooth approximation of the data and detects data discontinuities by solving a minimum problem involving an energy functional. The Blake-Zisserman model also permits the detection of discontinuities in the first derivative of the approximation. This model can result in a quasi piecewise-linear approximation, whereas the Mumford-Shah model can result in a quasi piecewise-constant approximation. The two models are well known in the mathematical literature and are widely adopted in computer vision for image segmentation. In geodesy, the Blake-Zisserman model has been applied successfully to the detection of cycle slips in linear combinations of GPS measurements. Few attempts to apply the model to time series of coordinates have been made so far. The problem of detecting discontinuities in time series of GNSS coordinates is well known, and its relevance increases as the quality of geodetic measurements, analysis techniques, models and products improves. The application of the Blake-Zisserman model appears reasonable and promising due to the model's ability to detect both position and velocity discontinuities in the same time series. The detection of position and velocity changes is of great interest in geophysics, where the discontinuity itself can be the relevant object. In the work towards the realization of reference frames, detecting position and velocity discontinuities may help to define models that can handle non-linear motions. In this work the Mumford-Shah and Blake-Zisserman models are briefly presented; the treatment is carried out from a practical viewpoint rather than a theoretical one. A set of time series of GNSS coordinates has been processed, and the results are presented in order to highlight the capabilities and the weaknesses of the variational approach. A first attempt to derive some indication for the automatic setup of the model parameters has been made. 
The underlying relation that could links the parameter values to the statistical properties of the data has been investigated.
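The piecewise-constant limit of the Mumford-Shah functional described in this abstract (the 1-D Potts model) can be minimized exactly by dynamic programming. The following sketch is a minimal illustration of that idea, not code from the paper; the function name and the choice of a single jump-penalty parameter `gamma` are assumptions for the example.

```python
import numpy as np

def potts_segmentation(y, gamma):
    """Exact 1-D piecewise-constant (Potts) approximation.

    Minimizes  sum_k sum_{i in segment k} (y_i - mean_k)^2 + gamma * (#jumps),
    i.e. the piecewise-constant limit of the Mumford-Shah functional,
    by dynamic programming over the right endpoint of the last segment.
    Returns the list of (start, end) index pairs of the segments.
    """
    y = np.asarray(y, dtype=float)
    n = len(y)
    s1 = np.concatenate(([0.0], np.cumsum(y)))      # prefix sums
    s2 = np.concatenate(([0.0], np.cumsum(y * y)))  # prefix sums of squares

    def seg_cost(l, r):
        # squared error of fitting the mean on y[l:r+1], via prefix sums
        m = r - l + 1
        total = s1[r + 1] - s1[l]
        return (s2[r + 1] - s2[l]) - total * total / m

    best = np.full(n + 1, np.inf)
    best[0] = -gamma          # the first segment incurs no jump penalty
    prev = np.zeros(n + 1, dtype=int)
    for r in range(1, n + 1):
        for l in range(1, r + 1):
            c = best[l - 1] + gamma + seg_cost(l - 1, r - 1)
            if c < best[r]:
                best[r], prev[r] = c, l - 1
    # backtrack the segment boundaries
    bounds, r = [], n
    while r > 0:
        l = prev[r]
        bounds.append((l, r - 1))
        r = l
    return bounds[::-1]
```

A detected boundary between two segments marks a candidate position discontinuity; the Blake-Zisserman model extends this idea with a second penalty so that slope (velocity) discontinuities are detected as well.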
Why didn't Box-Jenkins win (again)?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pack, D.J.; Downing, D.J.
This paper focuses on the forecasting performance of the Box-Jenkins methodology applied to the 111 time series of the Makridakis competition. It considers the influence of the following factors: (1) time series length, (2) time series information (autocorrelation) content, (3) time series outliers or structural changes, (4) averaging results over time series, and (5) forecast time origin choice. It is found that the 111 time series contain substantial numbers of very short series, series with obvious structural change, and series whose histories are relatively uninformative. If these series are typical of those that one must face in practice, the real message of the competition is that univariate time series extrapolations will frequently fail regardless of the methodology employed to produce them.
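The "information (autocorrelation) content" factor above refers to the Box-Jenkins identification step, in which the sample autocorrelation function guides model choice. A minimal numpy sketch of that idea (not the competition's code; the AR(1)/Yule-Walker choice and function names are assumptions for illustration):

```python
import numpy as np

def acf(x, nlags):
    """Sample autocorrelation function -- the Box-Jenkins identification
    diagnostic: a series whose ACF dies out immediately is 'uninformative'
    and offers little for any univariate extrapolation to exploit."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    denom = np.dot(x, x)
    return np.array([np.dot(x[:len(x) - k], x[k:]) / denom
                     for k in range(nlags + 1)])

def ar1_forecast(x, steps):
    """Fit AR(1) by Yule-Walker (phi = rho_1) and iterate the forecast,
    which decays geometrically toward the sample mean."""
    x = np.asarray(x, dtype=float)
    mu = np.mean(x)
    phi = acf(x, 1)[1]
    forecasts, last = [], x[-1] - mu
    for _ in range(steps):
        last = phi * last
        forecasts.append(mu + last)
    return np.array(forecasts)
```

For a very short or structurally broken series, `phi` estimated this way is unreliable, which is exactly the failure mode the paper attributes to much of the 111-series collection.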
Multifractal analysis of visibility graph-based Ito-related connectivity time series.
Czechowski, Zbigniew; Lovallo, Michele; Telesca, Luciano
2016-02-01
In this study, we investigate multifractal properties of connectivity time series resulting from the visibility graph applied to normally distributed time series generated by the Ito equations with multiplicative power-law noise. We show that the multifractality of the connectivity time series (i.e., the series of the number of links outgoing from each node) increases with the exponent of the power-law noise. The multifractality of the connectivity time series could be due to the width of the connectivity degree distribution, which can be related to the exit time of the associated Ito time series. Furthermore, the connectivity time series are characterized by persistence, although the original Ito time series are random; this is due to the visibility graph procedure that, in connecting the values of the time series, generates persistence but destroys most of the nonlinear correlations. Moreover, the visibility graph is sensitive in detecting wide "depressions" in the input time series.
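The connectivity time series studied above is the degree sequence of the natural visibility graph. A short sketch of how such a series is obtained from raw data (a standard O(n^2) construction, assumed here as an illustration rather than the authors' implementation):

```python
import numpy as np

def visibility_degree_series(y):
    """Connectivity (degree) series of the natural visibility graph.

    Samples i and j are linked if the straight line between (i, y_i)
    and (j, y_j) passes strictly above every intermediate sample --
    the criterion that makes the graph sensitive to wide 'depressions',
    since points inside a depression see few others.
    """
    n = len(y)
    deg = np.zeros(n, dtype=int)
    for i in range(n):
        for j in range(i + 1, n):
            visible = all(
                y[k] < y[j] + (y[i] - y[j]) * (j - k) / (j - i)
                for k in range(i + 1, j)
            )
            if visible:
                deg[i] += 1
                deg[j] += 1
    return deg
```

Multifractal analysis (e.g., MFDFA) is then applied to `deg` rather than to the original series, which is how the degree-distribution width enters the picture.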
Anelli, Filomena; Ciaramelli, Elisa; Arzy, Shahar; Frassinetti, Francesca
2016-11-01
Accumulating evidence suggests that humans process time and space in similar ways. Humans represent time along a spatial continuum, and the perception of temporal durations can be altered through manipulations of spatial attention by prismatic adaptation (PA). Here, we investigated whether PA-induced manipulations of spatial attention can also influence more conceptual aspects of time, such as humans' ability to travel mentally back and forward in time (mental time travel, MTT). Before and after leftward- and rightward-PA, participants projected themselves into the past, present or future time (i.e., self-projection), and, for each condition, determined whether a series of events were located in the past or the future with respect to that specific self-location in time (i.e., self-reference). The results demonstrated that leftward and rightward shifts of spatial attention facilitated recognition of past and future events, respectively. These findings suggest that spatial attention affects the temporal processing of the human self.
Muggles, Meteoritic Armor, and Menelmacar: Using Fantasy Series in Astronomy Education and Outreach
NASA Astrophysics Data System (ADS)
Larsen, K.; Bednarski, M.
2008-11-01
Due in part to recent (and ongoing) film adaptations, the fantasy series of C.S. Lewis (The Chronicles of Narnia), J.K. Rowling (Harry Potter), Philip Pullman (His Dark Materials), and J.R.R. Tolkien (The Silmarillion, The Hobbit, and The Lord of the Rings) are being introduced to a new audience of young (and not so young) readers. Many astronomers and astronomy educators are unaware of the wide variety of astronomical references contained in each series. The first portion of this workshop will introduce participants to these references, and highlight activities which educators, planetariums, and science centers have already developed to utilize these works in their education and outreach programs. In the second segment of the workshop, participants will develop ideas for activities and materials relevant to their individual circumstances, including standards-based education materials.