Sample records for fluorometer time series

  1. Fast repetition rate (FRR) fluorometer and method for measuring fluorescence and photosynthetic parameters

    DOEpatents

    Kolber, Zbigniew; Falkowski, Paul

    1995-06-20

A fast repetition rate fluorometer device and method for measuring the in vivo chlorophyll fluorescence and photosynthetic parameters of phytoplankton or higher plants by illuminating them with a series of fast repetition rate excitation flashes effective to bring about and measure resultant changes in the fluorescence yield of their Photosystem II. The series of fast repetition rate excitation flashes has a predetermined energy per flash and a rate greater than 10,000 Hz. Also disclosed is a flasher circuit for producing the series of fast repetition rate flashes.
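The timing constraint in the claim (a flash rate greater than 10,000 Hz) can be illustrated with a minimal sketch; the rate and flashlet count below are assumed values for illustration, not figures from the patent:

```python
# Minimal sketch of an FRR excitation flash train's timing.
# flash_rate_hz and n_flashes are illustrative assumptions; the patent
# only requires the rate to exceed 10,000 Hz.
flash_rate_hz = 25_000
n_flashes = 100
period_s = 1.0 / flash_rate_hz

# Start time of each flashlet in the train
flash_times_s = [i * period_s for i in range(n_flashes)]

# At these assumed values the whole train fits in a few milliseconds.
train_duration_ms = flash_times_s[-1] * 1e3
```

At 25 kHz the 100-flashlet train spans about 4 ms, which is why such trains can drive changes in Photosystem II fluorescence yield on very short time scales.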

  2. Fast repetition rate (FRR) fluorometer and method for measuring fluorescence and photosynthetic parameters

    DOEpatents

    Kolber, Z.; Falkowski, P.

    1995-06-20

A fast repetition rate fluorometer device and method for measuring the in vivo chlorophyll fluorescence and photosynthetic parameters of phytoplankton or higher plants is revealed. The phytoplankton or higher plants are illuminated with a series of fast repetition rate excitation flashes effective to bring about and measure resultant changes in the fluorescence yield of their Photosystem II. The series of fast repetition rate excitation flashes has a predetermined energy per flash and a rate greater than 10,000 Hz. Also disclosed is a flasher circuit for producing the series of fast repetition rate flashes. 14 figs.

  3. FORTRAN PROCESSING OF FLUOROMETRIC DATA LOGGED BY A TURNER DESIGNS FIELD FLUOROMETER

    EPA Science Inventory

Continuous recording of dye fluorescence using field fluorometers at selected sampling sites facilitates acquisition of real-time dye-tracing data. The Turner Designs Model 10-AU-005 Field Fluorometer allows for frequent fluorescence readings, data logging, and easy downloading t...

  4. Three color laser fluorometer for studies of phytoplankton fluorescence

    NASA Technical Reports Server (NTRS)

    Phinney, David A.; Yentsch, C. S.; Rohrer, J.

    1988-01-01

A three-color laser fluorometer has been developed for field operations. Using two tunable dye lasers (excitation wavelengths at 440 nm and 530 nm), broadband optical filters were selected to obtain maximum fluorescence sensitivity at wavelengths greater than 675 nm (chlorophyll) and 575 ± 15 nm (phycoerythrin). The laser fluorometer permits the measurement of phytoplankton pigments under static or flowing conditions and closely matches the time scales (ns) and energy levels (mW) of other laser-induced fluorescence instruments.

  5. Rapid Processing of Turner Designs Model 10-Au-005 Internally Logged Fluorescence Data

    EPA Science Inventory

    Continuous recording of dye fluorescence using field fluorometers at selected sampling sites facilitates acquisition of real-time dye tracing data. The Turner Designs Model 10-AU-005 field fluorometer allows for frequent fluorescence readings, data logging, and easy downloading t...

  6. FPGA-based photon-counting phase-modulation fluorometer and a brief comparison with that operated in a pulsed-excitation mode

    NASA Astrophysics Data System (ADS)

    Iwata, Tetsuo; Taga, Takanori; Mizuno, Takahiko

    2018-02-01

    We have constructed a high-efficiency, photon-counting phase-modulation fluorometer (PC-PMF) using a field-programmable gate array, which is a modified version of the photon-counting fluorometer (PCF) that works in a pulsed-excitation mode (Iwata and Mizuno in Meas Sci Technol 28:075501, 2017). The common working principle for both is the simultaneous detection of the photoelectron pulse train, which covers 64 ns with a 1.0-ns resolution time (1.0 ns/channel). The signal-gathering efficiency was improved more than 100 times over that of conventional time-correlated single-photon-counting at the expense of resolution time depending on the number of channels. The system dead time for building a histogram was eliminated, markedly shortening the measurement time for fluorescent samples with moderately high quantum yields. We describe the PC-PMF and make a brief comparison with the pulsed-excitation PCF in precision, demonstrating the potential advantage of PC-PMF.
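The stated 64 ns window at 1.0 ns/channel amounts to binning photoelectron arrival times into a 64-channel histogram; a minimal sketch of that binning (the arrival times are invented for illustration):

```python
# Sketch of the 64-channel, 1.0 ns/channel histogram described in the
# abstract. Arrival times outside the 64 ns window are discarded.
N_CHANNELS = 64
BIN_NS = 1.0

def histogram(arrival_times_ns):
    counts = [0] * N_CHANNELS
    for t in arrival_times_ns:
        ch = int(t // BIN_NS)          # channel index at 1.0 ns resolution
        if 0 <= ch < N_CHANNELS:
            counts[ch] += 1
    return counts

# Illustrative arrival times; 64.2 ns falls outside the window
counts = histogram([0.2, 0.7, 1.5, 63.9, 64.2])
```

Accumulating many such pulse trains without per-event dead time is what shortens the measurement time relative to conventional time-correlated single-photon counting.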

  7. Author Correction: Floats with bio-optical sensors reveal what processes trigger the North Atlantic bloom.

    PubMed

    Mignot, A; Ferrari, R; Claustre, H

    2018-05-04

In the original version of this Article, the data accession https://doi.org/10.17882/42182 was omitted from the Data Availability statement. In the first paragraph of the Methods subsection entitled 'Float data processing', the WET Labs ECO-triplet fluorometer was incorrectly referred to as 'WETLabs ECO PUK'. In the final paragraph of this subsection, the WET Labs ECO-series fluorometer was incorrectly referred to as 'WETLabs 413 ECO-series'. In the Methods subsection 'Float estimates of phytoplankton carbon biomass', the average particulate organic carbon-bbp ratio of 37,537 mgC m-2 was incorrectly given as 37,357 mgC m-2. In the second paragraph of the Methods subsection 'Float estimates of population division rates', the symbol for Celsius (C) was omitted from the phrase 'a 10°C increase in temperature'. These errors have now been corrected in the PDF and HTML versions of the Article.

  8. Computer controlled fluorometer device and method of operating same

    DOEpatents

    Kolber, Z.; Falkowski, P.

    1990-07-17

A computer controlled fluorometer device and method of operating same. The device includes a pump flash source, a probe flash source, and one or more sample chambers in combination with a light condenser lens system, associated filters, reflectors, and collimators, as well as signal conditioning and monitoring means, a programmable computer, and a software-programmable source of background irradiance. Operated according to the method of the invention, the device rapidly, efficiently, and accurately measures photosynthetic activity by precisely monitoring and recording changes in fluorescence yield produced by a controlled series of predetermined cycles of probe and pump flashes from the respective sources, under control of the computer. 13 figs.

  9. Computer controlled fluorometer device and method of operating same

    DOEpatents

    Kolber, Zbigniew; Falkowski, Paul

    1990-01-01

A computer controlled fluorometer device and method of operating same. The device includes a pump flash source, a probe flash source, and one or more sample chambers in combination with a light condenser lens system, associated filters, reflectors, and collimators, as well as signal conditioning and monitoring means, a programmable computer, and a software-programmable source of background irradiance. Operated according to the method of the invention, the device rapidly, efficiently, and accurately measures photosynthetic activity by precisely monitoring and recording changes in fluorescence yield produced by a controlled series of predetermined cycles of probe and pump flashes from the respective sources, under control of the computer.

  10. 21 CFR 862.2560 - Fluorometer for clinical use.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Fluorometer for clinical use. 862.2560 Section 862...) MEDICAL DEVICES CLINICAL CHEMISTRY AND CLINICAL TOXICOLOGY DEVICES Clinical Laboratory Instruments § 862.2560 Fluorometer for clinical use. (a) Identification. A fluorometer for clinical use is a device...

  11. Pulse amplitude modulated chlorophyll fluorometer

    DOEpatents

    Greenbaum, Elias; Wu, Jie

    2015-12-29

Chlorophyll fluorometry may be used for detecting toxins in a sample through the changes they induce in microalgae. A portable lab-on-a-chip ("LOAC") based chlorophyll fluorometer may be used for toxin detection and environmental monitoring. In particular, the system may include a microfluidic pulse amplitude modulated ("PAM") chlorophyll fluorometer. The LOAC PAM chlorophyll fluorometer may analyze microalgae and cyanobacteria that grow naturally in source drinking water.
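PAM instruments generally derive an effective photosynthetic yield from the steady-state fluorescence and the maximum fluorescence under a saturating pulse; a sketch of that standard calculation (not taken from the patent text, and with invented fluorescence values):

```python
# Standard PAM effective quantum yield of Photosystem II:
#   yield = (Fm' - F) / Fm'
# where F is steady-state fluorescence and Fm' the maximum fluorescence
# during a saturating pulse. Values below are illustrative.
def psii_yield(f_steady, f_max):
    return (f_max - f_steady) / f_max

yield_healthy = psii_yield(300.0, 1000.0)   # higher yield
yield_stressed = psii_yield(600.0, 1000.0)  # depressed yield, e.g. after toxin exposure
```

A drop in this yield relative to a baseline is the kind of toxin-induced change in microalgae that such a monitor would flag.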

  12. Tolerancing fluorometers for in-vitro diagnostics

    NASA Astrophysics Data System (ADS)

    Heinz, Eric

    2009-02-01

    Anyone can make a fluorometer. All that is needed is a light source, a detector, some filters and maybe some lenses. For commercial clinical instruments, however, our customers demand stability, reliability, and reproducibility. If research instruments are like race cars, commercial instruments should be like trucks. This demands that our fluorometer designs be carefully toleranced. This paper will discuss causes of variation and drift, and ways to make fluorometers that are stable and reproducible.

  13. Hazardous Chemical Fluorometer Development.

    DTIC Science & Technology

    1981-02-01

Hazardous Chemical Fluorometer Development, by Gary S. Keys; The Johns Hopkins University Applied Physics Laboratory, Laurel, MD; report CG-D-79-81, February 1981.

  14. Electrets and plant fluorometers used in field studies to measure hydrogen chloride produced during Space Shuttle launches

    NASA Technical Reports Server (NTRS)

    Milligan, J. E.; Swoboda, G. D.; Susko, M.

    1985-01-01

The results of field tests of two monitoring techniques, electrets and plant fluorometers, are analyzed in order to determine the environmental effects of launch by-products and the extent of those effects. The STS launches are used because the Shuttle emits 2.5 times more HCl than any previous system, produces a voluminous ground cloud and, most important, produces near-field HCl deposition and revolatilization, far-field acid washout/rainout, and gaseous HCl diffusion. Field evaluations of electrets at STS-5, STS-6, and STS-8 have shown that qualitative assessments can be made for areas lightly or moderately impacted by gaseous and aerosol HCl. Field evaluation of the plant productivity fluorometer at STS-8 has shown that this system is also useful for qualitative assessment in areas lightly, moderately, or heavily affected by gaseous and aerosol HCl. Quantitative prediction of HCl may be possible in lightly and moderately affected areas, given correlation of deposition rates.

  15. A compact field fluorometer and its application to dye tracing in karst environments

    NASA Astrophysics Data System (ADS)

    Poulain, Amaël; Rochez, Gaëtan; Van Roy, Jean-Pierre; Dewaide, Lorraine; Hallet, Vincent; De Sadelaer, Geert

    2017-08-01

Dye tracing is a classic technique in hydrogeology for investigating surface-water or groundwater flow characteristics, and it is useful for many applications, both natural and industrial. The Fluo-Green field fluorometer has been successfully tested in a karst environment and is specifically suited to in-cave karst water monitoring. Karst research often uses dyes to obtain information about groundwater flow in unexplored cave passages. The compact device, also named Fluo-G, meets the requirements of the cave environment: small (10 × 16 × 21 cm), lightweight (0.75 kg without ballast), and simple in design. It is easy for cavers to set up and handle compared with other sampling methods. The fluorometer records uranine, turbidity, and temperature with a user-defined time step (1 min to 1 day). Very low energy consumption allows 9,000 measurements with six AA batteries. The device was calibrated and tested in the laboratory and under field conditions in Belgian karst systems. Results agree well with other sampling methods: in-situ fluorometers and automatic water sampling plus laboratory analysis. Recording high-quality breakthrough curves in karst with in-cave monitoring is valuable for improving knowledge of karst systems. Many hydrological and hydrogeological applications can benefit from such a low-cost and compact device, for which finding the best compromise between resources and data quality is essential. Several improvements are possible, but preliminary field tests are very promising.
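The stated 9,000-measurement budget translates directly into deployment length for a chosen time step; a back-of-envelope sketch (the intervals below are examples, not recommendations from the paper):

```python
# How long the stated 9,000-measurement budget lasts for a given
# logging interval. Intervals are illustrative examples.
MAX_MEASUREMENTS = 9000

def logging_days(interval_minutes):
    return MAX_MEASUREMENTS * interval_minutes / (60 * 24)

days_at_1min = logging_days(1)    # 6.25 days at the fastest time step
days_at_10min = logging_days(10)  # 62.5 days at a 10-minute step
```

This is why a user-defined time step matters for in-cave deployments: a coarser step stretches the same battery and memory budget over a whole breakthrough curve.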

  16. Swallowable fluorometric capsule for wireless triage of gastrointestinal bleeding.

    PubMed

    Nemiroski, A; Ryou, M; Thompson, C C; Westervelt, R M

    2015-12-07

    Real-time detection of gastrointestinal bleeding remains a major challenge because there does not yet exist a minimally invasive technology that can both i) monitor for blood from an active hemorrhage and ii) uniquely distinguish it from blood left over from an inactive hemorrhage. Such a device would be an important tool for clinical triage. One promising solution, which we have proposed previously, is to inject a fluorescent dye into the blood stream and to use it as a distinctive marker of active bleeding by monitoring leakage into the gastrointestinal tract with a wireless fluorometer. This paper reports, for the first time to our knowledge, the development of a swallowable, wireless capsule with a built-in fluorometer capable of detecting fluorescein in blood, and intended for monitoring gastrointestinal bleeding in the stomach. The embedded, compact fluorometer uses pinholes to define a microliter sensing volume and to eliminate bulky optical components. The proof-of-concept capsule integrates optics, low-noise analog sensing electronics, a microcontroller, battery, and low power Zigbee radio, all into a cylindrical package measuring 11 mm × 27 mm and weighing 10 g. Bench-top experiments demonstrate wireless fluorometry with a limit-of-detection of 20 nM aqueous fluorescein. This device represents a major step towards a technology that would enable simple, rapid detection of active gastrointestinal bleeding, a capability that would save precious time and resources and, ultimately, reduce complications in patients.
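A limit of detection such as the quoted 20 nM is conventionally estimated with the 3σ criterion (three times the blank's standard deviation divided by the calibration slope); a sketch with invented numbers, not the paper's calibration data:

```python
# Conventional 3-sigma limit-of-detection estimate:
#   LOD = 3 * sd_blank / slope
# sd_blank and slope below are made-up illustrative values.
def lod(sd_blank, slope):
    return 3.0 * sd_blank / slope

# e.g. blank noise of 0.02 signal units and a sensitivity of
# 0.003 signal units per nM of fluorescein
lod_nM = lod(0.02, 0.003)
```

With these assumed numbers the estimate comes out to 20 nM, on the order of the value reported for the capsule.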

  17. Methods and Best Practice to Intercompare Dissolved Oxygen Sensors and Fluorometers/Turbidimeters for Oceanographic Applications.

    PubMed

    Pensieri, Sara; Bozzano, Roberto; Schiano, M Elisabetta; Ntoumas, Manolis; Potiris, Emmanouil; Frangoulis, Constantin; Podaras, Dimitrios; Petihakis, George

    2016-05-17

In European seas, ocean monitoring strategies in terms of key parameters, space and time scale vary widely for a range of technical and economic reasons. Nonetheless, the growing interest in the ocean interior promotes the investigation of processes such as oxygen consumption, primary productivity and ocean acidity, requiring that close attention is paid to the instruments in terms of measurement setup, configuration, calibration, maintenance procedures and quality assessment. To this aim, two separate hardware and software tools were developed in order to test and simultaneously intercompare several oxygen probes and fluorometers/turbidimeters, respectively, under the same environmental conditions, with a configuration as close as possible to real in-situ deployment. The chamber designed to perform chlorophyll-a and turbidity tests allowed for the simultaneous acquisition of analogue and digital signals from several sensors, and was sufficiently compact to be used both in the laboratory and onboard vessels. Methodologies and best practice dedicated to the intercomparison of dissolved oxygen sensors and fluorometers/turbidimeters have been used, which aid in promoting interoperability to access key infrastructures, such as ocean observatories and calibration facilities. Results from laboratory tests as well as field tests in the Mediterranean Sea are presented.

  18. Methods and Best Practice to Intercompare Dissolved Oxygen Sensors and Fluorometers/Turbidimeters for Oceanographic Applications

    PubMed Central

    Pensieri, Sara; Bozzano, Roberto; Schiano, M. Elisabetta; Ntoumas, Manolis; Potiris, Emmanouil; Frangoulis, Constantin; Podaras, Dimitrios; Petihakis, George

    2016-01-01

In European seas, ocean monitoring strategies in terms of key parameters, space and time scale vary widely for a range of technical and economic reasons. Nonetheless, the growing interest in the ocean interior promotes the investigation of processes such as oxygen consumption, primary productivity and ocean acidity, requiring that close attention is paid to the instruments in terms of measurement setup, configuration, calibration, maintenance procedures and quality assessment. To this aim, two separate hardware and software tools were developed in order to test and simultaneously intercompare several oxygen probes and fluorometers/turbidimeters, respectively, under the same environmental conditions, with a configuration as close as possible to real in-situ deployment. The chamber designed to perform chlorophyll-a and turbidity tests allowed for the simultaneous acquisition of analogue and digital signals from several sensors, and was sufficiently compact to be used both in the laboratory and onboard vessels. Methodologies and best practice dedicated to the intercomparison of dissolved oxygen sensors and fluorometers/turbidimeters have been used, which aid in promoting interoperability to access key infrastructures, such as ocean observatories and calibration facilities. Results from laboratory tests as well as field tests in the Mediterranean Sea are presented. PMID:27196908

  19. Use of a Fluorometric Imaging Plate Reader in high-throughput screening

    NASA Astrophysics Data System (ADS)

    Groebe, Duncan R.; Gopalakrishnan, Sujatha; Hahn, Holly; Warrior, Usha; Traphagen, Linda; Burns, David J.

    1999-04-01

High-throughput screening (HTS) efforts at Abbott Laboratories have been greatly facilitated by the use of a Fluorometric Imaging Plate Reader (FLIPR). The FLIPR consists of an incubated cabinet with an integrated 96-channel pipettor and fluorometer. An argon laser is used to excite fluorophores in a 96-well microtiter plate, and the emitted fluorescence is imaged by a cooled CCD camera. The image data are downloaded from the camera and processed to average the signal from each well of the microtiter plate for each time point. The data are presented in real time on the computer screen, facilitating interpretation and troubleshooting. In addition to fluorescence, the camera can also detect luminescence from firefly luciferase.

  20. Fluorometric procedures for dye tracing

    USGS Publications Warehouse

    Wilson, James F.; Cobb, Ernest D.; Kilpatrick, F.A.

    1986-01-01

    This manual describes the current fluorometric procedures used by the U.S. Geological Survey in dye tracer studies such as time of travel, dispersion, reaeration, and dilution-type discharge measurements. The advantages of dye tracing are (1) low detection and measurement limits and (2) simplicity and accuracy in measuring dye tracer concentrations using fluorometric techniques. The manual contains necessary background information about fluorescence, dyes, and fluorometers and a description of fluorometric operation and calibration procedures as a guide for laboratory and field use. The background information should be useful to anyone wishing to experiment with dyes, fluorometer components, or procedures different from those described. In addition, a brief section on aerial photography is included because of its possible use to supplement ground-level fluorometry.
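The calibration procedure the manual describes amounts to fitting instrument readings against standards of known dye concentration; a minimal least-squares sketch with invented standards (the readings, concentrations, and units are illustrative, not values from the manual):

```python
# Fit a linear calibration (reading -> concentration) from dye standards,
# then convert a field reading. All numbers are illustrative.
def calibrate(readings, concentrations):
    n = len(readings)
    mx = sum(readings) / n
    my = sum(concentrations) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(readings, concentrations)) \
            / sum((x - mx) ** 2 for x in readings)
    return slope, my - slope * mx

# Two standards: readings 5.0 and 50.0 at 1.0 and 10.0 ug/L of dye
slope, intercept = calibrate([5.0, 50.0], [1.0, 10.0])

# Convert a field reading of 25.0 into a concentration
conc_ug_per_L = slope * 25.0 + intercept
```

In practice more standards are run and temperature corrections applied, but the reading-to-concentration step is this linear mapping.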

  1. Fluorometric procedures for dye tracing

    USGS Publications Warehouse

    Wilson, James F.

    1968-01-01

    This manual describes the current fluorometric procedures used by the U.S. Geological Survey in dye tracer studies such as time of travel, dispersion, reaeration, and dilution-type discharge measurements. The advantages of dye tracing are (1) low detection and measurement limits and (2) simplicity and accuracy in measuring dye tracer concentrations using fluorometric techniques. The manual contains necessary background information about fluorescence, dyes, and fluorometers and a description of fluorometric operation and calibration procedures as a guide for laboratory and field use. The background information should be useful to anyone wishing to experiment with dyes, fluorometer components, or procedures different from those described. In addition, a brief section on aerial photography is included because of its possible use to supplement ground-level fluorometry.

  2. Fluorometric procedures for dye tracing

    USGS Publications Warehouse

    Wilson, James E.; Cobb, Ernest D.; Kilpatrick, Frederick A.

    1984-01-01

    This manual describes the current fluorometric procedures used by the U.S. Geological Survey in dye tracer studies such as time of travel, dispersion, reaeration, and dilution-type discharge measurements. The outstanding characteristics of dye tracing are: (1) the low detection and measurement limits, and (2) the simplicity and accuracy of measuring dye tracer concentrations using fluorometric techniques. The manual contains necessary background information about fluorescence, dyes, and fluorometers and a description of fluorometric operation and calibration procedures as a general guide for laboratory and field use. The background information should be useful to anyone wishing to experiment with dyes, fluorometer components, or procedures different from those described. In addition, a brief section is included on aerial photography because of its possible use to supplement ground-level fluorometry.

  3. Smart oxygen cuvette for optical monitoring of dissolved oxygen in biological blood samples

    NASA Astrophysics Data System (ADS)

    Dabhi, Harish; Alla, Suresh Kumar; Shahriari, Mahmoud R.

    2010-02-01

A smart Oxygen Cuvette is developed by coating the inner surface of a cuvette with an oxygen-sensitive thin-film material. The coating is a glass-like sol-gel-based sensor with a ruthenium compound embedded in the glass film. The fluorescence of the ruthenium is quenched depending on the oxygen level. An Ocean Optics NeoFox phase fluorometer measures this rate of fluorescence quenching and computes from it the amount of oxygen present. Multimode optical fibers transport light from an LED source to the cuvette and from the cuvette to the phase fluorometer. This new oxygen sensing system yields an inexpensive solution for monitoring dissolved oxygen in samples for biological and medical applications. In addition to desktop fluorometers, smart oxygen cuvettes can be used with the Ocean Optics handheld fluorometer, the NeoFox Sport. The Smart Oxygen Cuvettes provide a resolution of 4 ppb, an accuracy of less than 5% of the reading, and 90% response in less than 10 seconds.
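Quench-based oxygen sensing of this kind is conventionally described by the Stern-Volmer relation, I0/I = 1 + Ksv·[O2]; a sketch of inverting it (the Ksv and intensity values are illustrative assumptions, not NeoFox calibration constants):

```python
# Stern-Volmer relation for collisional quenching of fluorescence:
#   I0 / I = 1 + Ksv * [O2]
# where I0 is the unquenched intensity and I the measured intensity.
# Solving for the oxygen concentration:
def oxygen_from_quenching(i0, i, ksv):
    return (i0 / i - 1.0) / ksv

# Illustrative values: intensity halved by quenching, Ksv = 0.05 per unit
o2 = oxygen_from_quenching(100.0, 50.0, 0.05)
```

Phase fluorometers apply the same relation to fluorescence lifetimes (τ0/τ) rather than raw intensities, which makes the measurement insensitive to lamp drift and film thickness.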

  4. Great Lakes Demonstration 2

    DTIC Science & Technology

    2012-06-01

Figure A-12 shows the laser fluorometer. Acronyms defined in the report include the District Response Advisory Team, DRMM (Dynamic Risk Management Model), EPA (Environmental Protection Agency), FL (laser fluorometer), and FOSC (Federal On-Scene Coordinator). During this evolution the Hollyhock experimented with applying its ice-breaking capabilities to cut channels and pockets into the ice for oil.

  5. 21 CFR 862.2560 - Fluorometer for clinical use.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Fluorometer for clinical use. 862.2560 Section 862.2560 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES CLINICAL CHEMISTRY AND CLINICAL TOXICOLOGY DEVICES Clinical Laboratory Instruments § 862...

  6. 21 CFR 862.2560 - Fluorometer for clinical use.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Fluorometer for clinical use. 862.2560 Section 862.2560 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES CLINICAL CHEMISTRY AND CLINICAL TOXICOLOGY DEVICES Clinical Laboratory Instruments § 862...

  7. 21 CFR 862.2560 - Fluorometer for clinical use.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Fluorometer for clinical use. 862.2560 Section 862.2560 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES CLINICAL CHEMISTRY AND CLINICAL TOXICOLOGY DEVICES Clinical Laboratory Instruments § 862...

  8. 21 CFR 862.2560 - Fluorometer for clinical use.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Fluorometer for clinical use. 862.2560 Section 862.2560 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES CLINICAL CHEMISTRY AND CLINICAL TOXICOLOGY DEVICES Clinical Laboratory Instruments § 862...

  9. Application of the fiber-optic perfusion fluorometer to absorption and exsorption studies in hairless mouse skin.

    PubMed

    Shackleford, J M; Yielding, K L

    1987-09-01

    This study was undertaken to test the fiber-optic perfusion fluorometer as a direct means of evaluating skin absorption and exsorption in hairless mice. Skin-barrier compromise was accomplished in the absorption experiments by application of dimethyl sulfoxide to the skin surface or by partial removal of the stratum corneum with sticky tape. Absorbed fluorescein was measured easily in unanesthetized control (skin-barrier intact) and experimental mice. Unabsorbed chemical did not fluoresce 15 minutes after application, although it was present on the surface of the skin as a dry powder. The time course of fluorescein elimination from the skin was related to a rapid phase (vascular removal) and a slow phase (reservoir entrapment). In the exsorption experiments the fluorescein was injected intraperitoneally. Back skin on the right side was swabbed with either dimethyl sulfoxide or 1% capsaicin in alcohol prior to the injections, and differences in skin fluorescence on the left (control) and right sides were recorded. One application of dimethyl sulfoxide or capsaicin increased the level of skin exsorption. Three applications of dimethyl sulfoxide almost doubled the amount of exsorbed dye, whereas three applications of the capsaicin inhibited the exsorption process. It was concluded that the fiber-optic perfusion fluorometer provides an excellent technique in support of other methods of investigating the skin.

  10. In situ monitoring of tracer tests: how to distinguish tracer recovery from natural background

    NASA Astrophysics Data System (ADS)

    Bailly-Comte, V.; Durepaire, X.; Batiot-Guilhe, C.; Schnegg, P.-A.

    2018-03-01

    Hydrogeological tracer tests are primarily conducted with fluorescent tracers. Field fluorometers make it possible to monitor tracers at very low concentrations (<1 ppb) and at high frequency. However, changes in natural fluorescence at a site resulting from variations of dissolved and suspended inorganic and organic material may compromise the measurement of useful signals, thereby limiting the chances of identifying or quantifying the real tracer recovery. An elevated natural signal can mask small concentrations of the tracer while its variability can give the impression of a false recovery. This article shows how the use of a combination of several continuous measurements at different wavelengths allows a better extraction of the natural signal. Field multispectral fluorometers were installed at two Mediterranean karst outlets; both drain carbonate systems but have different environmental conditions. The fluorometers functioned over several hydrologic cycles, in periods affected or not by artificial tracers, making it possible to observe natural signal variations at these sites. The optical properties of this type of field fluorometer were used to calculate the spectral response of the different optics of the measuring probe. These responses, superimposed on three-dimensional excitation/emission matrices produced from laboratory fluorescence measurements, allowed an understanding of what the fluorometer sees under natural flow conditions. The result is an innovative method for correcting artificial tracer results. This type of correction makes it possible to fine-tune the effect of natural background variation on tracer recovery curves for a clear identification of the tracer presence and a more precise quantification of its recovery.
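The core idea of the correction, estimating the natural fluorescence from channels where the tracer does not emit and removing it from the tracer channel, can be sketched simply (the scale factor would be derived from pre-injection monitoring; all values here are invented):

```python
# Illustrative background correction for a tracer channel using a
# reference wavelength where the tracer does not fluoresce.
# 'scale' relates background fluorescence between the two channels and
# would be fitted from pre-injection data; values are made up.
def corrected_tracer(tracer_channel, reference_channel, scale):
    return [t - scale * r for t, r in zip(tracer_channel, reference_channel)]

signal = corrected_tracer(
    tracer_channel=[1.0, 1.2, 3.0, 1.1],
    reference_channel=[1.0, 1.2, 1.0, 1.1],
    scale=1.0,
)
# only the third sample retains signal after background removal
```

The method in the article is richer (it uses the probe's spectral response and excitation/emission matrices), but the end product is the same: a recovery curve with natural background variation removed.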

  11. Combined processing and mutual interpretation of radiometry and fluorometry from autonomous profiling Bio-Argo floats: 2. Colored dissolved organic matter absorption retrieval

    NASA Astrophysics Data System (ADS)

    Xing, Xiaogang; Morel, André; Claustre, Hervé; D'Ortenzio, Fabrizio; Poteau, Antoine

    2012-04-01

    Eight autonomous profiling "Bio-Argo" floats were deployed offshore during about 2 years (2008-2010) in Pacific, Atlantic, and Mediterranean zones. They were equipped with miniaturized bio-optical sensors, namely a radiometer measuring within the upper layer the downward irradiance at 412, 490, and 555 nm, and two fluorometers for detection of chlorophyll-a (Chla) and colored dissolved organic matter (CDOM; profiles from 400 m to surface). A first study dealt with the interpretation of the Chla fluorescence signal in terms of concentration, using for this purpose the diffuse attenuation coefficient for irradiance at 490 nm, Kd(490), taken as a proxy for the Chla absorption. The present study examines the possibility of similarly using the Kd(412) values combined with retrieved Chla profiles to convert the CDOM fluorometric qualitative information into a CDOM absorption coefficient (ay). The rationale is to take advantage of the fact that Kd is more sensitive to CDOM presence at 412 nm than at 490 nm. A validation of this method is tested through its application to field data, collected from a ship over a wide range of trophic conditions (Biogeochemistry and Optics South Pacific Experiment (BIOSOPE) cruise); these data include both in situ fluorescence profiles and CDOM absorption as measured on discrete samples. In addition, near-surface ay values retrieved from the floats agree with those derivable from ocean color imagery (Moderate Resolution Imaging Spectroradiometer (MODIS-A)). The low sensitivity of commercially available CDOM fluorometers presently raises difficulties when applying this technique to open ocean waters. It was nevertheless possible to derive from the floats records meaningful time series of CDOM vertical distribution.

  12. Simple and Inexpensive 3D Printed Filter Fluorometer Designs: User-Friendly Instrument Models for Laboratory Learning and Outreach Activities

    ERIC Educational Resources Information Center

    Porter, Lon A., Jr.; Chapman, Cole A.; Alaniz, Jacob A.

    2017-01-01

    In this work, a versatile and user-friendly selection of stereolithography (STL) files and computer-aided design (CAD) models are shared to assist educators and students in the production of simple and inexpensive 3D printed filter fluorometer instruments. These devices are effective resources for supporting active learners in the exploration of…

  13. Fluorometer with a quartz-rod waveguide-integrating sphere configuration to measure evanescent-field luminescence

    USDA-ARS?s Scientific Manuscript database

    A fluorometer was designed to measure evanescent-field luminescence. A quartz-rod waveguide (d = 2 mm) was installed coaxially inside a cylindrical flow-through cell (id = 2.3 mm, od = 6.3 mm, l = 116 mm). An excitation beam from a UV LED or a miniature xenon flashlamp was focused by a ball lens and ...

  14. Use of a recording fluorometer for continuous measurement of phytoplankton concentration

    NASA Astrophysics Data System (ADS)

    Mills, David K.; Tett, Paul B.

    1990-08-01

    By linking a battery-operated logger unit and power supply with a submersible fluorometer we have been able to make continuous untended measurements of phytoplankton chlorophyll fluorescence, and hence estimate algal biomass for periods of up to 35 days at moorings. Calibration and data processing procedures are described, and some examples of the results of deployments in waters around the United Kingdom are presented.

  15. Development of miniaturized submersible fluorometers for the detection of aromatic hydrocarbons in marine waters

    NASA Astrophysics Data System (ADS)

    Tedetti, Marc; Bachet, Caroline; Joffre, Pascal; Ferretto, Nicolas; Guigue, Catherine; Goutx, Madeleine

    2014-05-01

    Polycyclic aromatic hydrocarbons (PAHs) are among the most widespread organic contaminants in aquatic environments. Due to their physico-chemical properties, PAHs are persistent and mobile, can strongly bioaccumulate in food chains and are harmful to living organisms. They are thus recognized by various international organizations as priority contaminants and are included in the list of 45 priority regulated substances by the European Union. Because of their aromatic structure, PAHs are "optically active" and have inherent fluorescence properties in the ultraviolet (UV) spectral domain (200-400 nm). Therefore, UV fluorescence spectroscopy has been successfully used to develop PAH sensors (i.e. UV fluorometers). Currently, five UV submersible fluorometers are commercially available for in situ measurements of PAHs: EnviroFlu-HC (TriOS Optical Sensors, Germany), Hydrocarbon Fluorometer (Sea & Sun Technology, Germany), HydroC™/PAH (CONTROS, Germany), UviLux AquaTracka (Chelsea Technology Group, UK) and Cyclops-7 (Turner Designs, US). These UV fluorometers are all dedicated to the measurement of phenanthrene (λEx/λEm: 255/360 nm), one of the most abundant and fluorescent PAHs found in the aquatic environment. In this study, we developed original, miniaturized submersible fluorometers based on deep-UV light-emitting diodes (LEDs) for simultaneous measurements of two PAHs of interest: the MiniFluo-UV 1 for the detection of phenanthrene (PHE, at λEx/λEm: 255/360 nm) and naphthalene (NAP, at λEx/λEm: 270/340 nm), and the MiniFluo-UV 2 for the detection of fluorene (FLU, at λEx/λEm: 255/315 nm) and pyrene (PYR, at λEx/λEm: 270/380 nm). The MiniFluo-UV sensors have several features: measurement of two PAHs at the same time, small size (puck format, 80 x 60 mm), very low energy consumption (500 mW at 12 V), LED monitoring, and analog and digital communication modes.
The two MiniFluo-UV sensors were first tested in the laboratory: 1) on standard solutions of PHE, NAP, FLU and PYR in the range 0.1-100 µg l-1 and 2) on a water soluble fraction (WSF) of crude oil diluted in 0.2 µm filtered seawater (0 to 50% of WSF in seawater). Then, the MiniFluo-UV sensors were mounted onto a conductivity-temperature-depth (CTD) vertical profiler and tested at sea. Several profiles were performed in the Bay of Marseilles, in different harbours and hydrocarbon-impacted sites. The MiniFluo-UV measurements performed in the laboratory and in the field were associated with spectrofluorometric (EEM/PARAFAC) and/or chromatographic (GC-MS) analyses. The results obtained show that the MiniFluo-UV sensors are pertinent and efficient tools for monitoring hydrocarbon pollution in the marine environment. This work is a contribution of three projects labelled by the Competitivity Cluster Mer PACA: FUI SEA EXPLORER, DGCIS - Eco industries VASQUE (PI: ACSA-ALCEN, Meyreuil, France) and ANR - ECOTECH IBISCUS (PI: M. Goutx, MIO, Marseille, France).

  16. Fiber-optic filter fluorometer for emission detection of Protoporphyrin IX and its direct precursors - A preliminary study for improved Photodynamic Therapy applications

    NASA Astrophysics Data System (ADS)

    Landes, Rainer; Illanes, Alfredo; van Oepen, Alexander; Goeppner, Daniela; Gollnick, Harald; Friebe, Michael

    2018-03-01

    In this work we present first results of a laboratory-manufactured filter fluorometer to study differences in intensity and position of the main peaks of three porphyrins that appear during the Heme-Synthesis. Porphyrins play a major role in Photodynamic Therapy (PDT) for cancer treatment. Within the Heme-Synthesis, porphyrins such as Protoporphyrin IX (PPIX) and its two precursors Coproporphyrin III (CPIII) and Uroporphyrin III (UPIII) represent photochemical agents that can interact with light to show fluorescence or generate Reactive Oxygen Species (ROS) to destroy cells. A major problem that arises is determining the ideal time slot to begin treatment after drug application. Our work is meant to show a way to solve this problem by looking at concentration changes of precursors appearing in Heme-Synthesis and using these changes to predict the occurrence of PPIX inside the mitochondria.

  17. Processes Affecting the Variability of Fluorescence Signals from Benthic Targets in Shallow Waters

    DTIC Science & Technology

    1998-09-30

    analysis of zooxanthellae isolated from coral samples. We examined fluorescence lifetimes from benthic targets using femtosecond laser based Single...a temporal resolution 5 ps. The laser set-up was employed for laboratory studies of fluorescence lifetimes from model targets such as zooxanthellae ...seagrasses. Over 200 measurements on isolated zooxanthellae were conducted with a bench-top FRR fluorometer and Phase Shift Fluorometer. During the CoBOP-98

  18. Application of the in situ three channel WET Star fluorometer to characterize FDOM sources and determine water masses in the Nordic Seas

    NASA Astrophysics Data System (ADS)

    Raczkowska, Anna; Kowalczuk, Piotr; Sagan, Slawomir; Zablocka, Monika; Stedmon, Colin; Granskog, Mats

    2017-04-01

    Water mass exchange between the Atlantic Ocean and the Arctic Ocean occurs in the Nordic Seas, and this process represents a crucial component of the northern hemisphere climate system. The Nordic Seas are dominated by Atlantic Waters (AW) and Polar Waters (PW), together with water formed by mixing or by local modifications such as precipitation and sea-ice melt. Classification of water masses only on the basis of temperature, salinity or density does not take into account the different sources of fresh water in the Nordic Seas. In this study we propose that the measured signal from an in situ three-channel WET Star fluorometer could be a useful tool for characterization of dissolved organic matter (DOM) and refinement of water mass classification. Spectral properties of Chromophoric Dissolved Organic Matter and Fluorescent Dissolved Organic Matter (CDOM and FDOM) were characterized in different water masses along a section across the Fram Strait at 79°N as well as in the Nordic Seas in 2014 and 2015. Observations of CDOM and FDOM were carried out with the in situ three-channel WET Labs WET Star fluorometer and with Excitation Emission Matrix spectra (EEMs) measured in water samples. The WET Labs WET Star three-channel in situ fluorometer was designed to measure emission of humic- and protein-like FDOM fractions. The instrument's output was calibrated against the respective fluorescence intensities of EEMs measured with an Aqualog fluorometer (Horiba Scientific) at excitation and emission ranges corresponding to the in situ fluorometer channels. The correctness of the calibration was confirmed by an empirical linear relationship between WET Star in situ fluorescence intensities and aCDOM(350) derived from water samples. The measured WET Star fluorometer signal enabled assessment of the distribution of different FDOM fractions in the Nordic Seas.
The distribution of humic-like fluorescence intensity as a function of salinity revealed three distinct mixing curves: the first indicates mixing between surface PW diluted by sea-ice melt and the core of PW from the East Greenland Current, the second implies a transition from PW to AW, and the third is an indicator of modification of AW by sea-ice melting in the area of the Western and Northern Spitsbergen Shelf. Furthermore, the fluorescence intensity of the humic-like DOM fraction is very low and remains practically constant in the core of AW. In the AW there is a strong subsurface maximum of chlorophyll a fluorescence, which was aligned with the protein-like fraction of DOM. The linear relationship between phytoplankton fluorescence and the fluorescence intensity of the protein-like DOM fraction showed that phytoplankton was the primary source of the protein-like fraction of DOM in the AW.

  19. Use of a portable time-resolved fluorometer to determine oxytetracycline residue in four fruit crops

    USDA-ARS?s Scientific Manuscript database

    Worldwide, oxytetracycline (OTC) is used in fruit and vegetable crops to prevent and treat bacterial diseases. In the U.S., the Environmental Protection Agency approved its use in apple, pear, peach, and nectarine, and set the tolerance at 350 ng/g. OTC residues in 12 varieties of these fruits are determ...

  20. A set of optical methods for studying marine phytoplankton

    NASA Astrophysics Data System (ADS)

    Konyukhov, I. V.; Glukhovets, D. I.

    2017-05-01

    The results of integrated optical measurements of Black Sea water samples using a spectrophotometer, laser spectrometer, and fluorometer with pulse-modulated excitation light are discussed. A linear correlation between the intensities of chlorophyll absorption at 673 nm and chlorophyll fluorescence (680-750 nm) is observed. Phycoerythrin-containing organisms are recorded in phytoplankton in layers below 20 m. The data of 1-week monitoring of phytoplankton abundance and functional activity in Golubaya Bay with a Mega-25 flow fluorometer are described.

  1. Lightweight Fiber Optic Gas Sensor for Monitoring Regenerative Food Production

    NASA Technical Reports Server (NTRS)

    Schmidlin, Edward; Goswami, Kisholoy

    1995-01-01

    In this final report, Physical Optics Corporation (POC) describes its development of sensors for oxygen, carbon dioxide, and relative humidity. POC has constructed a phase fluorometer that can detect oxygen over the full concentration range from 0 percent to 100 percent. Phase-based measurements offer distinct advantages, such as immunity to source fluctuation, photobleaching, and leaching. All optics, optoelectronics, power supply, and the printed circuit board are included in a single box; the only external connections to the fluorometer are the optical fiber sensor and a power cord. The indicator-based carbon dioxide sensor is also suitable for short-term and discrete measurements over the concentration range from 0 percent to 100 percent. The optical fiber-based humidity sensor contains a porous core for direct interaction of the light beam with water vapor within fiber pores; the detection range for the humidity sensor is 10 percent to 100 percent, and response time is under five minutes. POC is currently pursuing the commercialization of these oxygen and carbon dioxide sensors for environmental applications.

  2. Multiple protocol fluorometer and method

    DOEpatents

    Kolber, Zbigniew S.; Falkowski, Paul G.

    2000-09-19

    A multiple protocol fluorometer measures photosynthetic parameters of phytoplankton and higher plants using actively stimulated fluorescence protocols. The measured parameters include spectrally-resolved functional and optical absorption cross sections of PSII, extent of energy transfer between reaction centers of PSII, F0 (minimal), Fm (maximal) and Fv (variable) components of PSII fluorescence, photochemical and non-photochemical quenching, size of the plastoquinone (PQ) pool, and the kinetics of electron transport between Qa and the PQ pool and between the PQ pool and PSI. The multiple protocol fluorometer, in one embodiment, is equipped with an excitation source having a controlled spectral output range between 420 nm and 555 nm and capable of generating flashlets having a duration of 0.125-32 µs, an interval between 0.5 µs and 2 seconds, and peak optical power of up to 2 W/cm². The excitation source is also capable of generating, simultaneous with the flashlets, a controlled continuous background illumination.

  3. Fluorometry as a bacterial source tracking tool in coastal watersheds, Trinidad, CA

    Treesearch

    Trever Parker; Andrew Stubblefield

    2012-01-01

    Bacterial counts have long been used as indicators of water pollution that may affect public health. By themselves, bacteria are indicators only and can not be used to identify the source of the pollutant for remediation efforts. Methods of microbial source tracking are generally time consuming, labor intensive and expensive. As an alternative, a fluorometer can be...

  4. Anthropogenic inputs of dissolved organic matter in New York Harbor

    NASA Astrophysics Data System (ADS)

    Gardner, G. B.; Chen, R. F.; Olavasen, J.; Peri, F.

    2016-02-01

    The Hudson River flows into the Atlantic Ocean through a highly urbanized region which includes New York City to the east and Newark, New Jersey to the west. As a result, the export of Dissolved Organic Carbon (DOC) from the Hudson to the Atlantic Ocean includes a significant anthropogenic component. A series of high resolution studies of the DOC dynamics of this system were conducted between 2003 and 2010. These covered both the Hudson and adjacent large waterways (East River, Newark Bay, Kill Van Kull and Arthur Kill), using coastal research vessels, and smaller tributaries (Hackensack, Passaic and Raritan rivers), using a 25' boat. In both cases measurements were made using towed instrument packages which could be cycled from near-surface to near-bottom depths with horizontal resolution of approximately 20 to 200 meters depending on depth and deployment strategy. Sensors on the instrument packages included a CTD to provide depth and salinity information and a chromophoric dissolved organic matter (CDOM) fluorometer to measure the fluorescent fraction of the DOC. Discrete samples allowed calibration of the fluorometer and the CDOM data to be related to DOC. The combined data set from these cruises identified multiple scales of source and transport processes for DOC within the Hudson River/New York Harbor region. The Hudson carries a substantial amount of natural DOC from its 230 km inland stretch. Additional sources exist in fringing salt marshes adjacent to the Hackensack and Raritan rivers. However, the lower Hudson/New York Harbor region receives a large input of DOC from multiple publicly owned treatment works (POTW) discharges. The high resolution surveys allowed us to elucidate the distribution of these sources and the manner in which they are rapidly mixed to create the total export. We estimate that anthropogenic sources account for up to 2.5 times the DOC flux contributed by natural processes.

  5. Center for Coastline Security Technology, Year-2

    DTIC Science & Technology

    2007-05-01

    set to a constant value of n = 7.25 Hz (435 RPM) giving an advance ratio of J = U/nD = 0.31 (assuming a vehicle wake deficit of 0.9 U∞), the yaw... noise characteristics (Case #4). Figures for Section 2.9 Figure 2.9.1: WETStar Fluorometer. Figure 2.9.2: Proposed design schematic no. 1...consecutive states and w(k) is the state noise [10]. F defines the relation between the state vector X(k) at time k and the state vector at time k

  6. Making Sense of Plant Health

    NASA Technical Reports Server (NTRS)

    2001-01-01

    Ciencia, Inc. created a new device, known as a Portable Photosynthesis Analyzer, or Phase Fluorometer, that provides real-time data about the photochemical efficiency of phytoplankton and other plant forms. The commercial version of this technology is used for photosynthesis research and offers major benefits to the field of life science. This new instrument is the first portable instrument of its kind. Through a license agreement with Ciencia, Oriel Instruments, of Stratford, Connecticut, manufactures and markets the commercial version of the instrument under the name LifeSense™. LifeSense is a 70 MHz single-frequency fluorometer that offers unrivaled capabilities for fluorescence lifetime sensing and analysis. LifeSense provides information about all varieties of photosynthetic systems. Photosynthesis research contributes important health assessments about the plant, be it phytoplankton or a higher form of plant life. With its unique sensing capabilities, LifeSense furnishes data regarding the yield of a plant's photochemistry, as well as its levels of photosynthetic activity. The user can then gain an extremely accurate estimate of the plant's chlorophyll biomass, primary production rates, and a general overview of the plant's physiological condition.

  7. Enhanced optical fiber fluorometer using a periodic perturbation in the fiber core

    NASA Astrophysics Data System (ADS)

    Chiniforooshan, Yasser; Bock, Wojtek J.; Ma, Jianjun

    2013-10-01

    Tracing specific chemicals and biological agents in a solution is of vital interest to the health, security and safety industries. Although a number of standard laboratory-based testing systems exist for detecting such targets, fast, real-time and on-site methods can be more efficient and cost-effective. One of the most common ways to detect a target in solution is to use fluorophore molecules, which attach selectively to the targets and emit or quench fluorescence in the presence of the target. Fiber-optic fluorometers have been developed for inexpensive and portable detection. In this paper, we describe a novel multi-segment fiber structure which uses a periodic perturbation on the side-wall of a highly multi-mode fiber to enhance collection of the fluorescent light. This periodic perturbation is fabricated and optimized on the core of the fiber using a CO2 laser. A theoretical explanation of the physical principle of the structure is followed by experimental evidence of its functioning.

  8. Tissue-based water quality biosensors for detecting chemical warfare agents

    DOEpatents

    Greenbaum, Elias [Oak Ridge, TN; Sanders, Charlene A [Knoxville, TN

    2003-05-27

    A water quality sensor for detecting the presence of at least one chemical or biological warfare agent includes: a cell; apparatus for introducing water into the cell and discharging water from the cell adapted for analyzing photosynthetic activity of naturally occurring, free-living, indigenous photosynthetic organisms in water; a fluorometer for measuring photosynthetic activity of naturally occurring, free-living, indigenous photosynthetic organisms drawn into the cell; and an electronics package that analyzes raw data from the fluorometer and emits a signal indicating the presence of at least one chemical or biological warfare agent in the water.

  9. Macintosh/LabVIEW based control and data acquisition system for a single photon counting fluorometer

    NASA Astrophysics Data System (ADS)

    Stryjewski, Wieslaw J.

    1991-08-01

    A flexible software system has been developed for controlling fluorescence decay measurements using the virtual instrument approach offered by LabVIEW. The time-correlated single photon counting instrument operates under computer control in both manual and automatic mode. Implementation time was short and the equipment is now easier to use, reducing the training time required for new investigators. It is not difficult to customize the front panel or adapt the program to a different instrument. We found LabVIEW much more convenient to use for this application than traditional, textual computer languages.

  10. A rapid excitation-emission matrix fluorometer utilizing supercontinuum white light and acousto-optic tunable filters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Wenbo; Department of Dermatology and Skin Science, University of British Columbia, 835 West 10th Avenue, Vancouver, British Columbia V5Z 4E8; Department of Biomedical Engineering, University of British Columbia, KAIS 5500, 2332 Main Mall, Vancouver, British Columbia V6T 1Z4

    Scanning speed and coupling efficiency of excitation light to optic fibres are two major technical challenges that limit the potential of fluorescence excitation-emission matrix (EEM) spectrometers for on-line applications and in vivo studies. In this paper, a novel EEM system, utilizing a supercontinuum white light source and acousto-optic tunable filters (AOTFs), was introduced and evaluated. The supercontinuum white light, generated by pumping a nonlinear photonic crystal fiber with an 800 nm femtosecond laser, was efficiently coupled into a bifurcated optic fiber bundle. High speed EEM spectral scanning was achieved using AOTFs both for selecting excitation wavelength and scanning emission spectra. Using calibration lamps (neon and mercury argon), wavelength deviations were determined to vary from 0.18 nm to −0.70 nm within the spectral range of 500-850 nm. Spectral bandwidth for filtered excitation light broadened by twofold compared to that measured with monochromatic light between 650 nm and 750 nm. The EEM spectra for methanol solutions of laser dyes were successfully acquired with this rapid fluorometer using an integration time of 5 s.
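
    Whatever the wavelength-selection hardware, an EEM acquisition reduces to a nested scan: for each excitation setting, record a full emission spectrum, building a 2-D matrix. The sketch below shows only that loop structure over the reported 500-850 nm range; read_emission_spectrum() is a hypothetical stand-in for the AOTF-based readout.

```python
# Structure of an EEM acquisition loop. The readout function is a
# placeholder, not the paper's instrument interface.

EXCITATIONS = list(range(500, 851, 10))  # excitation wavelengths (nm), assumed step
EMISSIONS = list(range(500, 851, 5))     # emission wavelengths (nm), assumed step

def read_emission_spectrum(ex_nm):
    # Hypothetical hardware readout: one intensity per emission wavelength.
    return [0.0] * len(EMISSIONS)

def acquire_eem():
    """One emission spectrum per excitation wavelength -> a 2-D matrix."""
    return [read_emission_spectrum(ex) for ex in EXCITATIONS]

eem = acquire_eem()  # rows: excitation settings; columns: emission wavelengths
```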

  11. Tissue-based standoff biosensors for detecting chemical warfare agents

    DOEpatents

    Greenbaum, Elias; Sanders, Charlene A.

    2003-11-18

    A tissue-based, deployable, standoff air quality sensor for detecting the presence of at least one chemical or biological warfare agent, includes: a cell containing entrapped photosynthetic tissue, the cell adapted for analyzing photosynthetic activity of the entrapped photosynthetic tissue; means for introducing an air sample into the cell and contacting the air sample with the entrapped photosynthetic tissue; a fluorometer in operable relationship with the cell for measuring photosynthetic activity of the entrapped photosynthetic tissue; and transmitting means for transmitting analytical data generated by the fluorometer relating to the presence of at least one chemical or biological warfare agent in the air sample, the sensor adapted for deployment into a selected area.

  12. The Fraunhofer line discriminator: An airborne fluorometer

    NASA Technical Reports Server (NTRS)

    Stoertz, G. E.

    1969-01-01

    An experimental Fraunhofer Line Discriminator (FLD) can differentiate and measure solar-stimulated luminescence when viewed against a background of reflected light. Key elements are two extremely sensitive photomultipliers, two glass-spaced Fabry-Perot filters having a bandwidth less than 1 Å, and an analog computer. As in conventional fluorometers, concentration of a fluorescent substance is measured by comparison with standards. Quantitative use is probably accurate only at low altitudes but detection of luminescent substances should be possible from any altitude. Applications of the present FLD include remote sensing of fluorescent dyes used in studies of current dynamics. The basic technique is applicable to detection of oil spills, monitoring of pollutants, and sensing over land areas.

  13. The compositional change of Fluorescent Dissolved Organic Matter across Fram Strait assessed with use of a multi channel in situ fluorometer.

    NASA Astrophysics Data System (ADS)

    Raczkowska, A.; Kowalczuk, P.; Sagan, S.; Zabłocka, M.; Pavlov, A. K.; Granskog, M. A.; Stedmon, C. A.

    2016-02-01

    Observations of Colored Dissolved Organic Matter absorption (CDOM) and fluorescence (FDOM) from water samples and an in situ fluorometer, and of Inherent Optical Properties (IOP; light absorption and scattering), were carried out along a section across Fram Strait at 79°N. A 3-channel Wetlabs Wetstar fluorometer was deployed, with channels for humic- and protein-like DOM, and used to assess the distribution of different FDOM fractions. A relationship between the fluorescence intensity of the protein-like fraction of FDOM and chlorophyll a fluorescence was found and indicated the importance of phytoplankton biomass in West Spitsbergen Current waters as a significant source of protein-like FDOM. East Greenland Current waters have low concentrations of chlorophyll a, and were characterized by high humic-like FDOM fluorescence. An empirical relationship between humic-like FDOM fluorescence intensity and CDOM absorption was derived and confirms the dominance of terrigenous-like CDOM in the composition of DOM in the East Greenland Current. These high resolution profile data offer a simple approach to fractionate the contribution of these two DOM sources across the Fram Strait and may help refine estimates of DOC fluxes in and out of the Arctic through this region.

  14. Permeabilized Rat Cardiomyocyte Response Demonstrates Intracellular Origin of Diffusion Obstacles

    PubMed Central

    Jepihhina, Natalja; Beraud, Nathalie; Sepp, Mervi; Birkedal, Rikke; Vendelin, Marko

    2011-01-01

    Intracellular diffusion restrictions for ADP and other molecules have been predicted earlier based on experiments on permeabilized fibers or cardiomyocytes. However, it is possible that the effective diffusion distance is larger than the cell dimensions due to clumping of cells and incomplete separation of cells in fiber preparations. The aim of this work was to check whether diffusion restrictions exist inside rat cardiomyocytes or are caused by a large effective diffusion distance. For that, we determined the response of oxidative phosphorylation (OxPhos) to exogenous ADP and ATP stimulation in permeabilized rat cardiomyocytes using fluorescence microscopy. The state of OxPhos was monitored via NADH and flavoprotein autofluorescence. By varying the ADP or ATP concentration in the flow chamber, we determined that OxPhos has a low affinity in cardiomyocytes. The experiments were repeated in a fluorometer on cardiomyocyte suspensions, leading to similar autofluorescence changes induced by ADP as recorded under the microscope. ATP stimulated OxPhos more in the fluorometer than under the microscope, which was attributed to accumulation of ADP in the fluorometer chamber. By calculating the flow profile around the cell in the microscope chamber and comparing model solutions to measured data, we demonstrate that intracellular structures impose significant diffusion obstacles in rat cardiomyocytes. PMID:22067148

  15. A literature review of portable fluorescence-based oil-in-water monitors.

    PubMed

    Lambert, P

    2003-08-15

    The results of a literature search on fluorescence-based portable detectors to measure the real-time concentration of oil are reported. For more than two decades, fluorometers have been commonly employed to monitor dispersed oil levels at oil spills on water. The focus of this paper has been to extract specific information from references about how the instruments were used, including set up and calibration procedures, the oil and dispersant measured, the approximate concentration range of the oil in the water column, and how the real-time data compared to traditional laboratory techniques.

  16. Measuring indigenous photosynthetic organisms to detect chemical warfare agents in water

    DOEpatents

    Greenbaum, Elias; Sanders, Charlene A.

    2005-11-15

    A method of testing water to detect the presence of a chemical or biological warfare agent is disclosed. The method is carried out by establishing control data from control water containing indigenous organisms but substantially free of chemical and biological warfare agents. Photosynthetic activity of the control water is measured with a fluorometer to obtain control data, which are compared with test data to detect the presence of the chemical or agent. The test data are gathered from test water comprising the same indigenous organisms as the control water; the test water is suspected of containing the chemical or agent to be tested for. Photosynthetic activity is likewise measured by fluorescence induction in the test water using a fluorometer.

  17. A fiber optic, ultraviolet light-emitting diode-based, two wavelength fluorometer for monitoring reactive adsorption

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Granz, Christopher D.; Whitten, James E., E-mail: James-Whitten@uml.edu; Schindler, Bryan J.

    Construction and use of an ultraviolet light-emitting diode-based fluorometer for measuring photoluminescence (PL) from powder samples with a fiber optic probe is described. Fluorescence at two wavelengths is detected by miniature photomultiplier tubes, each equipped with a different band pass filter, whose outputs are analyzed by a microprocessor. Photoluminescent metal oxides and hydroxides, and other semiconducting nanoparticles, often undergo changes in their emission spectra upon exposure to reactive gases, and the ratio of the PL intensities at two wavelengths is diagnostic of adsorption. Use of this instrument for reactive gas sensing and gas filtration applications is illustrated by measuring changes in the PL ratio for zirconium hydroxide and zinc oxide particles upon exposure to air containing low concentrations of sulfur dioxide.
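
    The two-wavelength diagnostic above boils down to tracking a ratio of channel intensities and flagging when it shifts from a baseline. The sketch below illustrates that logic only; the function names and the 5% threshold are invented, and the instrument's actual processing runs on its microprocessor.

```python
# Illustrative two-wavelength ratio diagnostic. Threshold is hypothetical.

def pl_ratio(intensity_short, intensity_long):
    """Ratio of PL intensities at the two detection wavelengths."""
    if intensity_long <= 0:
        raise ValueError("long-wavelength channel must be positive")
    return intensity_short / intensity_long

def adsorption_detected(baseline_ratio, current_ratio, threshold=0.05):
    """Flag a relative change in the PL ratio larger than an assumed threshold."""
    return abs(current_ratio - baseline_ratio) / baseline_ratio > threshold
```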

  18. Multiplex fluorescent immunoassay device based on magnetic nanoparticles

    NASA Astrophysics Data System (ADS)

    Godjevargova, T. I.; Ivanov, Y. L.; Dinev, D. D.

    2017-02-01

    An immunofluorescence analyzer, based on a compact disc, for simultaneous detection of 3 antibiotics in the same milk sample consists of two parts: a CD-based immunofluorescence kit and an optoelectronic fluorometer. The kit consists of 2 parts: lyophilized antibodies immobilized on superparamagnetic nanoparticles in Eppendorf tubes, and a CD-based microfluidic disk in which five chamber systems are formed for simultaneous detection of 5 separate samples. Each system consists of 2 chambers connected by a special microchannel acting as a hydrophobic valve. In the first chamber, lyophilised conjugates of 3 antibiotics with 3 corresponding fluorescent dyes are placed. The second chamber is for detection of the fluorescent signal. The optoelectronic fluorometer comprises an integrated thermostatic block, a mechanical detecting unit (fluorometer), and a block with controlling and visualizing electronics. The disc goes into a second block of the analyzer, where centrifugation is performed and the fluorescent signals are read out. This unit comprises a rotor on which the disc is fixed, a permanent electromagnet in the form of a ring inserted under the disc, and a module of 3 LEDs with emission filters for the wavelengths corresponding to the fluorescent dyes used, together with 1 integrated photodiode in front of which a filter with 3 spectral peaks is mounted. The signal from the photodiode is detected by the electronic unit, which comprises a sensitive lock-in amplifier, rotor motor management, control of the thermostatic device, and management of the analyzer periphery, consisting of a display and communications with a computer.

  19. An annual cycle of phytoplankton biomass in the Arabian Sea, 1994 1995, as determined by moored optical sensors

    NASA Astrophysics Data System (ADS)

    Kinkade, C. S.; Marra, J.; Dickey, T. D.; Weller, R.

    A surface-to-bottom mooring in the central Arabian Sea (15.5°N, 61.5°E), deployed from October 1994 to October 1995, included fluorometers, PAR irradiance sensors, Lu 683 sensors, and a spectral radiometer. An annual cycle of phytoplankton biomass was determined by transforming signals from the optical sensors into chlorophyll a (chl a). Half-yearly phytoplankton blooms with water-column stratification were observed near the end of each monsoon, as well as biomass increases in response to mesoscale flow features. During the Northeast Monsoon, the integrated water-column chl a rose from 15 to 25 mg m-2, while during the Southwest Monsoon, chl a increased from 15 to a maximum >40 mg m-2. We present an empirical relationship between the ratio of downwelling Ed443/Ed550 (blue to green wavelength ratio) and integral euphotic zone chl a determined by moored fluorometers (r2 = 0.73). There is a more significant relationship between Ed443/Ed550 measured at one depth in the water column (65 m) and the average vertical attenuation coefficient for PAR (KPAR) between 0 and 65 m (r2 = 0.845). Because biofouling was a significant problem at times, data return from any one sensor was incomplete. However, optical sensor/data intercomparison helped fill gaps while permitting investigation of the temporal variability in observed phytoplankton biomass.

  20. Estimation of chromophoric dissolved organic matter (CDOM) and photosynthetic activity of estuarine phytoplankton using a multiple-fixed-wavelength spectral fluorometer.

    PubMed

    Goldman, Emily A; Smith, Erik M; Richardson, Tammi L

    2013-03-15

    The utility of a multiple-fixed-wavelength spectral fluorometer, the Algae Online Analyser (AOA), as a means of quantifying chromophoric dissolved organic matter (CDOM) and phytoplankton photosynthetic activity was tested using algal cultures and natural communities from North Inlet estuary, South Carolina. Comparisons of AOA measurements of CDOM to those by spectrophotometry showed a significant linear relationship, but increasing amounts of background CDOM resulted in progressively higher overestimates of chromophyte contributions to a simulated mixed algal community. Estimates of photosynthetic activity by the AOA at low irradiance (≈80 μmol quanta m⁻² s⁻¹) agreed well with analogous values from the literature for the chlorophyte Dunaliella tertiolecta, but were substantially lower than previous measurements of the maximum quantum efficiency of photosystem II (Fv/Fm) in Thalassiosira weissflogii (a diatom) and Rhodomonas salina (a cryptophyte). When cells were exposed to high irradiance (1500 μmol quanta m⁻² s⁻¹), declines in photosynthetic activity with time measured by the AOA mirrored estimates of cellular fluorescence capacity using the herbicide 3-(3,4-dichlorophenyl)-1,1-dimethylurea (DCMU). The AOA shows promise as a tool for the continuous monitoring of phytoplankton community composition, CDOM, and the group-specific photosynthetic activity of aquatic ecosystems.

  1. Fluorescence Spectroscopy in a Shoebox

    ERIC Educational Resources Information Center

    Wahab, M. Farooq

    2007-01-01

    The construction of a fluorometer and a spectrofluorometer in a shoebox, using a flashlight or sunlight as the excitation source and the eye as the detector, is described. The assembly makes several fundamental ideas of the subject easy to understand.

  2. In situ tryptophan-like fluorometers: assessing turbidity and temperature effects for freshwater applications.

    PubMed

    Khamis, K; Sorensen, J P R; Bradley, C; Hannah, D M; Lapworth, D J; Stevens, R

    2015-04-01

    Tryptophan-like fluorescence (TLF) is an indicator of human influence on water quality, as TLF peaks are associated with the input of labile organic carbon (e.g. sewage or farm waste) and its microbial breakdown. Hence, real-time measurement of TLF could be particularly useful for monitoring water quality at a higher temporal resolution than available hitherto. However, current understanding of TLF quenching/interference is limited for field-deployable sensors. We present results from a rigorous test of two commercially available submersible tryptophan fluorometers (λex ≈ 285 nm, λem ≈ 350 nm). Temperature quenching and turbidity interference were quantified in the laboratory, and compensation algorithms were developed. Field trials were then undertaken involving: (i) an extended deployment (28 days) in a small urban stream; and (ii) depth profiling of an urban multi-level borehole. TLF was inversely related to water temperature (regression slope range: -1.57 to -2.50). Sediment particle size was identified as an important control on the turbidity-specific TLF response, with signal amplification apparent <150 NTU for clay particles and <650 NTU for silt particles. Signal attenuation was only observed >200 NTU for clay particles. Compensation algorithms significantly improved agreement between in situ and laboratory readings for baseflow and storm conditions in the stream. For the groundwater trial, there was excellent agreement between laboratory and raw in situ TLF; temperature compensation provided only a marginal improvement, and turbidity corrections were unnecessary. These findings highlight the potential utility of real-time TLF monitoring for a range of environmental applications (e.g. tracing polluting sources and monitoring groundwater contamination). However, where high or variable suspended sediment loads or rapid changes in temperature are anticipated, concurrent monitoring of turbidity and temperature is required, and site-specific calibration is recommended for long-term surface-water monitoring.
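    A temperature-compensation step of the kind described can be sketched as a linear quenching correction. The functional form and the default coefficient below are assumptions for illustration; real values must come from a laboratory calibration like the one the authors performed.

```python
def compensate_tlf(f_raw, temp_c, rho=-0.02, t_ref=20.0):
    """
    Temperature-compensate a raw tryptophan-like fluorescence reading.

    Assumes a linear quenching model of the form
        F_ref = F_raw / (1 + rho * (T - T_ref))
    where rho (fractional change per deg C, negative for quenching)
    and the reference temperature T_ref come from a laboratory
    calibration. The default rho here is purely illustrative.
    """
    return f_raw / (1.0 + rho * (temp_c - t_ref))
```

    At the reference temperature the reading passes through unchanged; water warmer than the reference (a quenched signal) is corrected upward, and cooler water is corrected downward.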

  3. Stream periphyton responses to mesocosm treatments of ...

    EPA Pesticide Factsheets

    A stream mesocosm experiment was designed to compare biotic responses among streams exposed to an equal excess specific-conductivity target of 850 µS/cm, relative to a control set at 200 µS/cm, across three treatments with different major-ion contents. Each treatment and the control was replicated 4 times at the mesocosm scale (16 mesocosms total). The treatments were based on dosing the background mesocosm water, a continuous flow-through mixture of natural river water and reverse-osmosis-treated water, with stock salt solutions prepared from 1) a mixture of sodium chloride and calcium chloride (the chloride treatment), 2) sodium bicarbonate, and 3) magnesium sulfate. The realized average specific conductance over the first 28 d of continuous dosing was 827, 829, and 847 µS/cm for the chloride, bicarbonate, and sulfate treatments, respectively, and did not differ significantly. The controls averaged 183 µS/cm. Here we focus on comparing stream periphyton communities across treatments based on measurements obtained from a Pulse-Amplitude Modulated (PAM) fluorometer. The fluorometer is used in situ and, with built-in algorithms, distributes the total areal algal biomass (µg/cm2) of the periphyton among cyanobacteria, diatoms, and green algae. A measurement is recorded in a matter of seconds and, therefore, many different locations can be measured within each mesocosm at a high return frequency. Eight locations within each of the 1 m2 (0.3 m W x 3
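    The group partitioning performed by the instrument's built-in algorithms can be sketched as linear spectral unmixing: each algal group has a characteristic fluorescence-response "fingerprint" across the instrument's excitation bands, and a measured response vector is decomposed over those fingerprints. The fingerprint matrix below is invented for illustration, not the instrument's calibration.

```python
import numpy as np

# Columns: cyanobacteria, diatoms, green algae.
# Rows: fluorescence response per unit biomass in each excitation band.
# These fingerprint values are hypothetical.
fingerprints = np.array([
    [0.9, 0.2, 0.5],  # excitation band 1
    [0.3, 0.8, 0.4],  # excitation band 2
    [0.2, 0.3, 0.9],  # excitation band 3
    [0.5, 0.6, 0.3],  # excitation band 4
])

# Synthetic reading generated from known group biomasses (ug/cm^2).
true_biomass = np.array([1.0, 2.0, 0.5])
measured = fingerprints @ true_biomass

# Least-squares unmixing; clip negatives as a crude non-negativity step.
groups, *_ = np.linalg.lstsq(fingerprints, measured, rcond=None)
groups = np.clip(groups, 0.0, None)
```

    With real field data the system is noisy and the fit is only approximate, which is why group-resolving fluorometers ship with carefully measured reference spectra rather than generic ones.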

  4. Attempting to link hydro-morphology, transient storage and metabolism in streams: Insights from reactive tracer experiments

    NASA Astrophysics Data System (ADS)

    Kurz, Marie J.; Schmidt, Christian; Blaen, Phillip; Knapp, Julia L. A.; Drummond, Jennifer D.; Martí, Eugenia; Zarnetske, Jay P.; Ward, Adam S.; Krause, Stefan

    2016-04-01

    In-stream transient storage zones, including the hyporheic zone and vegetation beds, can be hotspots of biogeochemical processing in streams, enhancing ecosystem functions such as metabolism and nutrient uptake. The spatio-temporal dynamics and reactivity of these storage zones are influenced by multiple factors, including channel geomorphology, substrate composition and hydrology, and by anthropogenic modifications to flow regimes and nutrient loads. Tracer injections are a commonly employed method to evaluate solute transport and transient storage in streams; however, reactive tracers are needed to differentiate between metabolically active and inactive transient storage zones. The reactive stream tracer resazurin (Raz), a weakly fluorescent dye which irreversibly transforms to resorufin (Rru) under mildly reducing conditions, provides a proxy for aerobic respiration and an estimate of the metabolic activity associated with transient storage zones. Across a range of lotic ecosystems, we assess the influence of stream channel hydro-morphology, morphologic heterogeneity, and substrate type on reach-scale (10³ m) and sub-reach-scale (10² m) transient storage, respiration, and nutrient uptake. To do so, we coupled injections of Raz and conservative tracers (uranine and/or salt) at each study site. The study sites included: vegetated mesocosms controlled for water depth; vegetated and un-vegetated sediment-filled mesocosms fed by waste-water effluent; a contrasting sand- vs. gravel-bedded lowland stream (Q = 0.08 m³/s); and a series of upland streams of varying size (Q = 0.1-1.5 m³/s) and prevalence of morphologic features. Continuous time series of tracer concentrations were recorded using in-situ fluorometers and EC loggers. At the stream sites, time series were recorded at multiple downstream locations in order to resolve sub-reach dynamics.
Analyses yielded highly variable transport metrics and Raz-Rru transformation between study sites and between sub-reaches within stream sites. Higher Raz-Rru transformation rates were typically observed in smaller streams, in sub-reaches with a higher prevalence of morphologic features known to promote hyporheic exchange, and in mesocosms with greater water depth, vegetation density, and retention time. However, relationships between transformation rates and common metrics of transient storage were not consistent among study cases, indicating as-yet-unresolved complexities in the relationships between water and solute transport and metabolism. Further insights were also gained regarding the utility of Raz and improved tracer-test practices.

  5. pCO2 Observations from a Vertical Profiler on the upper continental slope off Vancouver Island: Physical controls on biogeochemical processes.

    NASA Astrophysics Data System (ADS)

    Mihaly, S. F.

    2016-02-01

    We analyse two six-month sets of data collected from a vertical profiler on Ocean Networks Canada's NEPTUNE observatory over the summer and early fall of 2012 and 2014. The profiler is in 400 m of water on the upper slope of the continental shelf. The site is away from the direct influence of canyons, but is in a region of strong internal tide generation. Both seasonally varying semidiurnal internal tidal currents and diurnal shelf waves are observed. The near-surface mean flow is weak and alternates seasonally between the California and Alaska Currents. Mid-depth waters are influenced by the poleward-flowing California Undercurrent, and the deep waters by seasonally varying wind-driven Ekman transport. The profiling package consists of a CTD, an oxygen optode, a pCO2 sensor, a chlorophyll fluorometer/turbidity sensor, and a CDOM fluorometer, and is co-located with an upward-looking, bottom-mounted 75 kHz ADCP that measures currents to 30 m below the sea surface. With these first profiled deep-sea time-series measurements of pCO2, we endeavor to model how the local physical dynamics control the variability of water properties over the slope and shelf, and what the variability of the non-conservative tracers pCO2 and O2 can tell us about the biogeochemistry of the region.

  6. Modulated Chlorophyll "a" Fluorescence: A Tool for Teaching Photosynthesis

    ERIC Educational Resources Information Center

    Marques da Silva, Jorge; Bernardes da Silva, Anabela; Padua, Mario

    2007-01-01

    "In vivo" chlorophyll "a" fluorescence is a key technique in photosynthesis research. The recent release of a low cost, commercial, modulated fluorometer enables this powerful technology to be used in education. Modulated chlorophyll a fluorescence measurement "in vivo" is here proposed as a tool to demonstrate basic…

  7. Water Quality and Plankton in the United States Nearshore Waters of Lake Huron

    EPA Science Inventory

    We conducted an intensive survey for the US nearshore of Lake Huron along a continuous segment (523 km) from Port Huron Michigan to Detour Passage. A depth contour of 20 m was towed with a CTD, fluorometer, transmissometer, and laser optical plankton counter (LOPC). The continu...

  8. Marine Analysis Using a Rapid Scanning Multichannel Fluorometer.

    DTIC Science & Technology

    1985-04-30

    Snippets from the report's tables: Table I, "Laboratory algae collection," lists class, species, source, and media for the cultures investigated (e.g., Chlorophyceae: Chlorella vulgaris). A spectral-matching results table reports hits for species including Chlorella vulgaris, Dunaliella salina, Tetraselmis sp., Spirulina major, and Skeletonema.

  9. Fluorescence dynamics of biological systems using synchrotron radiation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gratton, E.; Mantulin, W.W.; Weber, G.

    1996-09-01

    A beamline for time-resolved fluorescence spectroscopy of biological systems is under construction at the Synchrotron Radiation Center. The fluorometer, operating in the frequency domain, will take advantage of the time structure of the synchrotron radiation light pulses to determine fluorescence lifetimes. Using frequency-domain techniques, the instrument can achieve an ultimate time resolution on the order of picoseconds. Preliminary experiments have shown that reducing the intensity of one of the fifteen electron bunches in the storage ring allows measurement of harmonic frequencies equivalent to the single-bunch mode. This mode of operation of the synchrotron significantly extends the range of lifetimes that can be measured. The wavelength range (encompassing the visible and ultraviolet), the range of measurable lifetimes, and the stability and reproducibility of the storage ring pulses should make this beamline a versatile tool for the investigation of the complex fluorescence decay of biological systems.
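    For a frequency-domain fluorometer of the kind described, the standard single-exponential relations recover a lifetime from the measured phase shift and demodulation at one modulation frequency. The sketch below assumes a mono-exponential decay; complex decays like those mentioned in the abstract require multi-frequency fitting instead.

```python
import math

def lifetimes_from_frequency_domain(freq_hz, phase_rad, modulation):
    """
    Single-exponential lifetime estimates from one modulation frequency.

    For a mono-exponential decay measured in the frequency domain:
        tau_phase = tan(phi) / omega
        tau_mod   = sqrt(1/M**2 - 1) / omega
    where omega = 2*pi*f, phi is the phase shift of the emission
    relative to the excitation, and M is the demodulation ratio.
    Agreement of the two estimates is a standard check that the
    decay really is single-exponential.
    """
    omega = 2.0 * math.pi * freq_hz
    tau_phase = math.tan(phase_rad) / omega
    tau_mod = math.sqrt(1.0 / modulation**2 - 1.0) / omega
    return tau_phase, tau_mod
```

    As a consistency check, a 4 ns lifetime probed at 50 MHz gives phi = atan(omega*tau) and M = 1/sqrt(1 + (omega*tau)**2), and the inversion above recovers 4 ns from both quantities.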

  10. Exploring Photosynthesis and Plant Stress Using Inexpensive Chlorophyll Fluorometers

    ERIC Educational Resources Information Center

    Cessna, Stephen; Demmig-Adams, Barbara; Adams, William W., III

    2010-01-01

    Mastering the concept of photosynthesis is of critical importance to learning plant physiology and its applications, but seems to be one of the more challenging concepts in biology. This teaching challenge is no doubt compounded by the complexity by which plants alter photosynthesis in different environments. Here we suggest the use of chlorophyll…

  11. "Open-Box" Approach to Measuring Fluorescence Quenching Using an iPad Screen and Digital SLR Camera

    ERIC Educational Resources Information Center

    Koenig, Michael H.; Yi, Eun P.; Sandridge, Matthew J.; Mathew, Alexander S.; Demas, James N.

    2015-01-01

    Fluorescence quenching is an analytical technique and a common undergraduate laboratory exercise. Unfortunately, a typical quenching experiment requires the use of an expensive fluorometer that measures the relative fluorescence intensity of a single sample in a closed compartment unseen by the experimenter. To overcome these shortcomings, we…

  12. Bloom Chasing With a Wave Glider: The MAGI (Mesoscale Features Aggregates Interaction) Project in the North Pacific

    NASA Astrophysics Data System (ADS)

    Wilson, C.; Villareal, T. A.; Anderson, E.

    2015-12-01

    Satellite ocean color data over the past decade have revealed the existence of large phytoplankton blooms in the North Pacific Ocean, specifically in the region NE of Hawai'i near 30°N. These blooms cover thousands of km², persist for weeks or longer, and are often dominated by nitrogen-fixing diatom symbioses. These events have proven difficult to study outside of the time-series Station ALOHA at Hawai'i. The limited data indicate that the 30°N blooms are longer-lived, larger, and occur over a greater temperature range than the blooms that develop closer to Hawai'i. In the NE Pacific, at least some of these blooms occur at or near the subtropical front, a salinity-defined, temperature-compensated frontal zone with a number of fronts embedded in it. Here we report on results from the MAGI (Mesoscale features Aggregates Interaction) project, in which we deployed a Liquid Robotics SV2 Wave Glider® in June 2015 for a mission of up to 6 months to sample these features and help characterize the bloom dynamics of this region. The Wave Gliders are the first unmanned autonomous marine robots to use only the ocean's wave energy for propulsion. The gliders are navigated remotely along a dynamic route of waypoints, which can be changed to sample features as they develop in near-real-time satellite imagery. The wave glider, named Honey Badger, is equipped with a CTD, two C3 fluorometers (one with an anti-biofouling coating applied), a Turner Designs PhytoFlash, meteorology and wave sensors, a downward-facing camera, a Vengmar passive acoustic monitor, and a towed LISST-Holo.

  13. Continuing and New Measurements at the Abyssal ALOHA Cabled Observatory

    NASA Astrophysics Data System (ADS)

    Howe, B. M.; Potemra, J. T.; Butler, R.; Santiago-Mandujano, F.; Lukas, R.; Duennebier, F. K.; Karl, D. M.; Aucan, J.

    2016-02-01

    The ALOHA Cabled Observatory (ACO) is a general purpose "node" providing power, communications and timing connectivity for science use at Station ALOHA 100 km north of Oahu. Included are a suite of basic sensors making core measurements, some local and some sensing the water column. At 4728 m deep, it is the deepest scientific outpost on the planet with power and Internet. Importantly, Station ALOHA is the field site of the NSF-funded Hawaii Ocean Time-series (HOT) program that has investigated temporal dynamics in biology, physics, and chemistry since 1988, at a site that is representative of roughly 70% of the world ocean, sampling the ocean from top to bottom to monitor and study changes on scales of months to decades. The co-located Woods Hole mooring (WHOTS) provides meteorological and upper ocean physical data. The CMORE (Center for Microbial Oceanography Research and Education) and SCOPE (Simons Collaboration on Ocean Processes and Ecology) programs address their respective science topics at ALOHA. Together these programs provide a truly unique means for observing the ocean across all disciplines and regimes (deep sea, near surface, etc.). ACO has been operating in the abyss since June 2011, collecting temperature, salinity, velocity, acoustic, and video data (see for instance the abstract by Lukas et al., Spatial Analysis of Abyssal Temperature Variations Observed from the ALOHA Cabled Observatory and WHOTS Moorings). Using the University of Hawaii remotely operated vehicle ROV Lu`ukai, a basic sensor package was recently installed equipped with a Paroscientific nano-resolution pressure sensor, a WetLabs fluorometer/turbidity sensor, and a Seabird CTDO2 instrument. These data will be presented and described.

  14. DNA detection using water-soluble conjugated polymers and peptide nucleic acid probes

    PubMed Central

    Gaylord, Brent S.; Heeger, Alan J.; Bazan, Guillermo C.

    2002-01-01

    The light-harvesting properties of cationic conjugated polymers are used to sensitize the emission of a dye on a specific peptide nucleic acid (PNA) sequence for the purpose of homogeneous, “real-time” DNA detection. Signal transduction is controlled by hybridization of the neutral PNA probe and the negative DNA target. Electrostatic interactions bring the hybrid complex and cationic polymer within distances required for Förster energy transfer. Conjugated polymer excitation provides fluorescein emission >25 times higher than that obtained by exciting the dye, allowing detection of target DNA at concentrations of 10 pM with a standard fluorometer. A simple and highly sensitive assay with optical amplification that uses the improved hybridization behavior of PNA/DNA complexes is thus demonstrated. PMID:12167673

  15. Velocity, bathymetry, and transverse mixing characteristics of the Ohio River upstream from Cincinnati, Ohio, October 2004-March 2006

    USGS Publications Warehouse

    Koltun, G.F.; Ostheimer, Chad J.; Griffin, Michael S.

    2006-01-01

    Velocity, bathymetry, and transverse (cross-channel) mixing characteristics were studied in a 34-mile reach of the Ohio River extending from the lower pool of the Captain Anthony Meldahl Lock and Dam, near Willow Grove, Ky., to just downstream from the confluence of the Licking and Ohio Rivers, near Newport, Ky. Information gathered in this study ultimately will be used to parameterize hydrodynamic and water-quality models being developed for the study reach. Velocity data were measured at an average cross-section spacing of about 2,200 feet by means of boat-mounted acoustic Doppler current profilers (ADCPs). ADCP data were postprocessed to create text files describing the three-dimensional velocity characteristics in each transect. Bathymetry data were measured at an average transect spacing of about 800 feet by means of a boat-mounted single-beam echosounder. Depth information obtained from the echosounder was postprocessed with water-surface slope and elevation information collected during the surveys to compute streambed elevations. The bathymetry data were written to text files formatted as a series of space-delimited x-, y-, and z-coordinates. Two separate dye-tracer studies were done on different days in overlapping stream segments in an 18.3-mile section of the study reach to assess transverse mixing characteristics in the Ohio River. Rhodamine WT dye was injected into the river at a constant rate, and concentrations were measured in downstream cross sections, generally spaced 1 to 2 miles apart. The dye was injected near the Kentucky shoreline during the first study and near the Ohio shoreline during the second study. Dye concentrations were measured along transects in the river by means of calibrated fluorometers equipped with flow-through chambers, automatic temperature compensation, and internal data loggers.
The use of flow-through chambers permitted water to be pumped continuously out of the river from selected depths and through the fluorometer for measurement as the boat traversed the river. Time-tagged concentration readings were joined with horizontal coordinate data simultaneously captured from a differentially corrected Global Positioning System (GPS) device to create a plain-text, comma-separated variable file containing spatially tagged dye-concentration data. Plots showing the transverse variation in relative dye concentration indicate that, within the stream segments sampled, complete transverse mixing of the dye did not occur. In addition, the highest concentrations of dye tended to be nearest the side of the river from which the dye was injected. Velocity, bathymetry, and dye-concentration data collected during this study are available for Internet download by means of hyperlinks in this report. Data contained in this report were collected between October 2004 and March 2006.
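    The merge described above, joining time-tagged concentration readings with simultaneously captured GPS fixes, can be sketched as a nearest-timestamp join. The record formats here are assumptions for illustration, not the report's actual file layout.

```python
import bisect

def tag_concentrations(conc_records, gps_records):
    """
    Join time-tagged dye-concentration readings with GPS fixes by
    nearest timestamp, producing (time, x, y, concentration) rows
    analogous to the report's comma-separated output.

    conc_records: list of (t_seconds, concentration), sorted by time
    gps_records:  list of (t_seconds, x, y), sorted by time
    """
    gps_times = [t for t, _, _ in gps_records]
    rows = []
    for t, c in conc_records:
        i = bisect.bisect_left(gps_times, t)
        # Pick the closer of the two bracketing GPS fixes.
        if i == 0:
            j = 0
        elif i == len(gps_times):
            j = len(gps_times) - 1
        else:
            j = i if gps_times[i] - t < t - gps_times[i - 1] else i - 1
        _, x, y = gps_records[j]
        rows.append((t, x, y, c))
    return rows
```

    In practice one would also interpolate between fixes and reject matches whose time gap exceeds a tolerance, since a slow GPS update rate on a moving boat smears positions.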

  16. Optical Constituents at the Mouth of the Columbia River: Variability and Signature in Remotely Sensed Reflectance

    DTIC Science & Technology

    2013-09-30

    constructed at BIO, carried the new Machine Vision Floc Camera (MVFC), a Sequoia Scientific LISST 100x Type B, an RBR CTD, and two pressure-actuated...WetStar CDOM fluorometer, a Sequoia Scientific flow control switch, and a SeaBird 37 CTD. The flow-control switch allows the ac-9 to collect 0.2-um

  17. BIOSPACE/DYABOLIC October 2010 Field Program, Monterey Bay, California Data Report

    DTIC Science & Technology

    2011-07-20

    AU Fluorometer. After first reading samples were acidified with three drops of 5% HCl (to destroy the Chlorophyll) allowing the measurement of...

  18. Relationship between Chlorophyll a Concentration, Light Attenuation and Diving Depth of the Southern Elephant Seal Mirounga leonina

    PubMed Central

    Jaud, Thomas; Dragon, Anne-Cécile; Garcia, Jade Vacquie; Guinet, Christophe

    2012-01-01

    Recently, a number of Antarctic marine environmental studies have used oceanographic parameters collected from instrumented top predators for ecological and physical information. Phytoplankton concentration is generally quantified through active measurement of chlorophyll fluorescence. In this study, the light absorption coefficient (K0.75) was used as an indicator of phytoplankton concentration. This measurement, easy to obtain and requiring little electric power, allows assessment of the fine-scale horizontal structure of phytoplankton. As part of this study, Southern elephant seals (SES) were simultaneously equipped with a fluorometer and a light logger. Along the SES tracks, variations in K0.75 were strongly correlated with chlorophyll a concentration measured by the fluorometer within the euphotic layer. With regard to SES foraging behaviour, the bottom depth of the seal's dive was highly dependent on light intensity at 150 m, indicating that the vertical distribution of SES prey such as myctophids is tightly related to light level. Therefore, changes in phytoplankton concentration may not only have a direct effect on prey abundance but may also determine their vertical accessibility, with likely consequences for SES foraging efficiency. PMID:23082166

  19. The new Seafloor Observatory (OBSEA) for remote and long-term coastal ecosystem monitoring.

    PubMed

    Aguzzi, Jacopo; Mànuel, Antoni; Condal, Fernando; Guillén, Jorge; Nogueras, Marc; del Rio, Joaquin; Costa, Corrado; Menesatti, Paolo; Puig, Pere; Sardà, Francesc; Toma, Daniel; Palanques, Albert

    2011-01-01

    A suitable sampling technology to identify species and to estimate population dynamics, based on individual counts at different temporal levels in relation to habitat variations, is increasingly important for fishery management and biodiversity studies. In the past two decades, as interest in exploring the oceans for valuable resources and in protecting these resources from overexploitation has grown, the number of cabled (permanent) submarine multiparametric platforms with video stations has increased. Prior to the development of seafloor observatories, the majority of autonomous stations were battery powered and stored data locally. The recently installed low-cost, multiparametric, expandable, cabled coastal Seafloor Observatory (OBSEA), located 4 km off Vilanova i la Geltrú, Barcelona, at a depth of 20 m, is directly connected to a ground station by a telecommunication cable; thus, it is not affected by the limitations associated with previous observation technologies. OBSEA is part of the European Multidisciplinary Seafloor Observatory (EMSO) infrastructure, and its activities are included among the Network of Excellence of the European Seas Observatory NETwork (ESONET). OBSEA enables remote, long-term, and continuous surveys of the local ecosystem by acquiring synchronous multiparametric habitat data and bio-data with the following sensors: Conductivity-Temperature-Depth (CTD) sensors for salinity, temperature, and pressure; Acoustic Doppler Current Profilers (ADCP) for current speed and direction; a turbidity meter and a fluorometer (for the determination of chlorophyll concentration); a hydrophone; a seismometer; and finally, a video camera for automated image analysis in relation to species classification and tracking. Images can be monitored in real time, and all data can be stored for future studies.
In this article, the various components of OBSEA are described, including its hardware (the sensors and the network of marine and land nodes), software (data acquisition, transmission, processing, and storage), and multiparametric measurement (habitat and bio-data time series) capabilities. A one-month multiparametric survey of habitat parameters was conducted during 2009 and 2010 to demonstrate these functions. An automated video image analysis protocol was also developed for fish counting in the water column, a method that can be used with cabled coastal observatories working with still images. Finally, bio-data time series were coupled with data from other oceanographic sensors to demonstrate the utility of OBSEA in studies of ecosystem dynamics.

  20. The New Seafloor Observatory (OBSEA) for Remote and Long-Term Coastal Ecosystem Monitoring

    PubMed Central

    Aguzzi, Jacopo; Mànuel, Antoni; Condal, Fernando; Guillén, Jorge; Nogueras, Marc; del Rio, Joaquin; Costa, Corrado; Menesatti, Paolo; Puig, Pere; Sardà, Francesc; Toma, Daniel; Palanques, Albert

    2011-01-01

    A suitable sampling technology to identify species and to estimate population dynamics, based on individual counts at different temporal levels in relation to habitat variations, is increasingly important for fishery management and biodiversity studies. In the past two decades, as interest in exploring the oceans for valuable resources and in protecting these resources from overexploitation has grown, the number of cabled (permanent) submarine multiparametric platforms with video stations has increased. Prior to the development of seafloor observatories, the majority of autonomous stations were battery powered and stored data locally. The recently installed low-cost, multiparametric, expandable, cabled coastal Seafloor Observatory (OBSEA), located 4 km off Vilanova i la Geltrú, Barcelona, at a depth of 20 m, is directly connected to a ground station by a telecommunication cable; thus, it is not affected by the limitations associated with previous observation technologies. OBSEA is part of the European Multidisciplinary Seafloor Observatory (EMSO) infrastructure, and its activities are included among the Network of Excellence of the European Seas Observatory NETwork (ESONET). OBSEA enables remote, long-term, and continuous surveys of the local ecosystem by acquiring synchronous multiparametric habitat data and bio-data with the following sensors: Conductivity-Temperature-Depth (CTD) sensors for salinity, temperature, and pressure; Acoustic Doppler Current Profilers (ADCP) for current speed and direction; a turbidity meter and a fluorometer (for the determination of chlorophyll concentration); a hydrophone; a seismometer; and finally, a video camera for automated image analysis in relation to species classification and tracking. Images can be monitored in real time, and all data can be stored for future studies.
In this article, the various components of OBSEA are described, including its hardware (the sensors and the network of marine and land nodes), software (data acquisition, transmission, processing, and storage), and multiparametric measurement (habitat and bio-data time series) capabilities. A one-month multiparametric survey of habitat parameters was conducted during 2009 and 2010 to demonstrate these functions. An automated video image analysis protocol was also developed for fish counting in the water column, a method that can be used with cabled coastal observatories working with still images. Finally, bio-data time series were coupled with data from other oceanographic sensors to demonstrate the utility of OBSEA in studies of ecosystem dynamics. PMID:22163931

  1. Real time monitoring of urban surface water quality using a submersible, tryptophan-like fluorescence sensor

    NASA Astrophysics Data System (ADS)

    Khamis, Kieran; Bradley, Chris; Hannah, David; Stevens, Rob

    2014-05-01

    Due to the recent development of field-deployable optical sensor technology, continuous quantification and characterization of surface water dissolved organic matter (DOM) is possible now. Tryptophan-like (T1) fluorescence has the potential to be a particularly useful indicator of human influence on water quality as T1 peaks are associated with the input of labial organic carbon (e.g. sewage or farm waste) and its microbial breakdown. Hence, real-time recording of T1 fluorescence could be particular useful for monitoring waste water infrastructure, treatment efficiency and the identification of contamination events at higher temporal resolution than available hitherto. However, an understanding of sensor measurement repeatability/transferability and interaction with environmental parameters (e.g. turbidity) is required. Here, to address this practical knowledge gap, we present results from a rigorous test of a commercially available submersible tryptophan fluorometer (λex 285, λem 350). Sensor performance was first examined in the laboratory by incrementally increasing turbidity under controlled conditions. Further to this the sensor was integrated into a multi-parameter sonde and field tests were undertaken involving: (i) a spatial sampling campaign across a range of surface water sites in the West Midlands, UK; and (ii) collection of high resolution (sub-hourly) samples from an urban stream (Bournbrook, Birmingham, U.K). To determine the ability of the sensor to capture spatiotemporal dynamics of urban waters DOM was characterized for each site or discrete time step using Excitation Emission Matrix spectroscopy and PARAFAC. In both field and laboratory settings fluorescence intensity was attenuated at high turbidity due to suspended particles increasing absorption and light scattering. 
For the spatial survey, instrument readings were compared with those obtained by a laboratory-grade fluorometer (Varian Cary Eclipse), and a strong linear relationship was apparent (R2 > 0.7). Parallel water sampling and laboratory analysis identified the potential for correcting T1 fluorescence intensity based on turbidity readings. These findings highlight the potential utility of real-time monitoring of T1 fluorescence for a range of environmental applications (e.g. monitoring sewage treatment processes and tracing polluting DOM sources). However, if high or variable suspended sediment loads are anticipated, concurrent monitoring of turbidity is required for accurate readings.
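A turbidity-based correction of the kind the study identifies can be sketched as a simple linear attenuation model. The coefficient and cap below are illustrative assumptions, not values from the paper:

```python
# Hypothetical turbidity correction for tryptophan-like (T1) fluorescence,
# assuming the fractional signal loss grows linearly with turbidity.
def correct_t1_fluorescence(raw_t1, turbidity_ntu, attenuation_coeff=0.004):
    """Return T1 intensity corrected for turbidity-driven attenuation.

    raw_t1            : raw sensor reading (arbitrary fluorescence units)
    turbidity_ntu     : concurrent turbidity reading (NTU)
    attenuation_coeff : fractional signal loss per NTU (illustrative value)
    """
    loss_fraction = min(attenuation_coeff * turbidity_ntu, 0.95)  # cap the correction
    return raw_t1 / (1.0 - loss_fraction)

# A reading of 80 units at 50 NTU (~20% assumed loss) corrects to 100 units.
print(correct_t1_fluorescence(80.0, 50.0))
```

In practice, the coefficient would be fitted from the paired laboratory turbidity experiments the abstract describes.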

  2. Time of travel and dispersion of solutes in a 36.4-mile reach of the North Platte River downstream from Casper, Wyoming

    USGS Publications Warehouse

    Armentrout, G.W.; Larson, L.R.

    1984-01-01

Time-of-travel and dispersion measurements made during a dye study November 7-8, 1978, are presented for a reach of the North Platte River from Casper, Wyo., to a bridge 2 miles downstream from the Dave Johnston Power Plant. Rhodamine WT dye was injected into the river at Casper, and the resultant dye cloud was traced by sampling as it moved downstream. Samples were taken in three equal-flow sections of the river's lateral transect at three sites, then analyzed with a fluorometer. The flow in the river was 940 cubic feet per second. The data consist of measured stream mileages and time, distance, and concentration graphs of the dye cloud. The peak concentration traveled through the reach in 24 hours, averaging 1.5 miles per hour; the leading edge took about 22 hours, averaging 1.7 miles per hour; and the trailing edge took 35 hours, averaging 1.0 mile per hour. Data from this study were compared with methods for estimating time of travel for a range of stream discharges.
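As a quick consistency check, the reported average velocities follow directly from reach length over elapsed time, using the figures in this abstract:

```python
# Average velocity of each part of the dye cloud over the 36.4-mile reach:
# velocity = distance / elapsed time.
reach_miles = 36.4
times_hours = {"leading edge": 22.0, "peak": 24.0, "trailing edge": 35.0}

for part, t in times_hours.items():
    print(f"{part}: {reach_miles / t:.1f} mi/h")
# leading edge: 1.7 mi/h, peak: 1.5 mi/h, trailing edge: 1.0 mi/h
```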

  3. Rapid Field-Usable Cyanide Sensor Development for Blood and Saliva

    DTIC Science & Technology

    2014-12-01

    Fluorometric analysis was performed using one of two configurations. Fluorometric Configuration I (FC I) utilized a light-emitting diode (LED).

  4. Development of Loop-Mediated Isothermal Amplification (LAMP) Assay for Rapid and Sensitive Identification of Ostrich Meat

    PubMed Central

    Abdulmawjood, Amir; Grabowski, Nils; Fohler, Svenja; Kittler, Sophie; Nagengast, Helga; Klein, Guenter

    2014-01-01

    Animal species identification is one of the primary duties of official food control. Since ostrich meat is difficult to differentiate macroscopically from beef, new analytical methods are needed. To enforce labeling regulations for the authentication of ostrich meat, it might be of importance to develop and evaluate a rapid and reliable assay. In the present study, a loop-mediated isothermal amplification (LAMP) assay based on the cytochrome b gene of the mitochondrial DNA of the species Struthio camelus was developed. The LAMP assay was used in combination with a real-time fluorometer. The developed system allowed the detection of 0.01% ostrich meat products. In parallel, a direct swab method without nucleic acid extraction using the HYPLEX LPTV buffer was also evaluated. This rapid processing method allowed detection of ostrich meat without major incubation steps. In summary, the LAMP assay had excellent sensitivity and specificity for detecting ostrich meat and could provide a sampling-to-result identification time of 15 to 20 minutes. PMID:24963709

  5. [MODIS Investigation

    NASA Technical Reports Server (NTRS)

    Abbott, Mark R.

    1998-01-01

    The objectives for the last six months were: (1) continue analysis of Hawaii Ocean Time-series (HOT) bio-optical mooring data and Southern Ocean bio-optical drifter data; (2) complete documentation of MOCEAN algorithms and software for use by the MOCEAN and GLI teams; (3) deploy instrumentation during JGOFS cruises in the Southern Ocean; (4) participate in a test cruise for the Fast Repetition Rate (FRR) fluorometer; (5) continue chemostat experiments on the relationship of fluorescence quantum yield to environmental factors; and (6) continue to develop and expand a browser-based information system for in situ bio-optical data. We are continuing to analyze bio-optical data collected at the HOT mooring as well as data from bio-optical drifters that were deployed in the Southern Ocean. A draft manuscript has now been prepared and is being revised. A second manuscript is also in preparation that explores the vector wind fields derived from NSCAT measurements. The HOT bio-optical mooring was recovered in December 1997. After retrieving the data, the sensor package was serviced and redeployed. We have begun preliminary analysis of these data, but we have only had the data for 3 weeks. However, all of the data were recovered, and there were no obvious anomalies. We will add a second sensor package to the mooring when it is serviced next spring. In addition, Ricardo Letelier is funded as part of the SeaWiFS calibration/validation effort (through a subcontract from the University of Hawaii, Dr. John Porter), and he will be collecting bio-optical and fluorescence data as part of the HOT activity. This will provide additional in situ measurements for MODIS validation. As noted in the previous quarterly report, we have been analyzing data from three bio-optical drifters that were deployed in the Southern Ocean in September 1996. We presented results on chlorophyll and drifter speed. 
For the 1998 Ocean Sciences meeting, a paper will be presented on this data set, focusing on the diel variations in fluorescence quantum yield. Briefly, there are systematic patterns in the apparent quantum yield of fluorescence (defined as the slope of the line relating fluorescence/chlorophyll and incoming solar radiation). These systematic variations appear to be related to changes in the circulation of the Antarctic Polar Front which force nutrients into the upper ocean. A more complete analysis will be provided in the next Quarterly report.

  6. Cryogenic Collection of Complete Subsurface Samples for Molecular Biological Analysis

    DTIC Science & Technology

    2012-05-01

    Nitrate was analyzed by ion chromatography ( Dionex IC25) and had a detection limit of 0.01 mg/L. Fluorescein was measured using a flow-through...dissolved oxygen (DO) with a flow through electrode, Nitrate by ion chromatography , and fluorescein with a flow through fluorometer. 1.9 LARGE...measured by headspace gas chromatography (HP 7694 Headspace Sampler attached to an HP 5890 GC with an FID detector). The GC method had a detection

  7. Optical Constituents Along a River Mouth and Inlet: Variability and Signature in Remotely Sensed Reflectance, and: Optical Constituents at the Mouth of the Columbia River: Variability and Signature in Remotely Sensed Reflectance

    DTIC Science & Technology

    2013-09-30

    Vision Floc Camera (MVFC), a Sequoia Scientific LISST 100x Type B, an RBR CTD, and two pressure-actuated Niskin bottles. The Niskin bottles were...Eco bb2fl, that measures 3 backscattering at 532 and 650 nm and CDOM fluorescence, a WetLabs WetStar CDOM fluorometer, a Sequoia Scientific flow

  8. Preparing to Predict: The Second Autonomous Ocean Sampling Network (AOSN-II) Experiment in the Monterey Bay

    DTIC Science & Technology

    2008-06-06

    MBARI buoy M2, profiling to 250 m depth (Figure 1). Instruments on board included a CTD, fluorometer, oxygen and nitrate sensors, bioluminescence, and...dimensional multiscale ocean variability: Massachusetts Bay. Journal of Marine Systems, Special issue on “Three-dimensional ocean circulation: Lagrangian...Oceanography”, T. Paluszkiewicz and S. Harper, Eds., Vol. 19, 1, 172-183. Liang, X.S. and Anderson, D.G.M. (2007) Multiscale Window Transform, SIAM J

  9. Spatial and Diel Variability in Photosynthetic and Photoprotective Pigments in Shallow Benthic Communities

    DTIC Science & Technology

    2001-09-30

    acidification with a Turner 10-000R fluorometer. For phycoerythrin and phycocyanin analysis, the sediments were extracted repeatedly with a phosphate...concentrations varied around 20-fold, phycocyanin varied approximately 70-fold. The highest levels of chlorophylls a and c, and phycocyanin were found in...reflected in the wide range of pigment ratios: 46 for chl c/chl a; 94 for phycoerythrin/chl a; and 27 for phycocyanin /chl a. First derivatives of

  10. Spatial extent and dissipation of the deep chlorophyll layer in Lake Ontario during the Lake Ontario lower foodweb assessment, 2003 and 2008

    USGS Publications Warehouse

    Watkins, J. M.; Weidel, Brian M.; Rudstam, L. G.; Holek, K. T.

    2014-01-01

    Increasing water clarity in Lake Ontario has led to a vertical redistribution of phytoplankton and an increased importance of the deep chlorophyll layer in overall primary productivity. We used in situ fluorometer profiles collected in lakewide surveys of Lake Ontario in 2008 to assess the spatial extent and intensity of the deep chlorophyll layer. In situ fluorometer data were corrected with extracted chlorophyll data using paired samples collected from Lake Ontario in August 2008. The deep chlorophyll layer was present offshore during the stratified conditions of late July 2008, with maximum corrected chlorophyll a values of 4-13 μg l-1 at 10 to 17 m depth within the metalimnion. The deep chlorophyll layer was closely associated with the base of the thermocline and a subsurface maximum of dissolved oxygen, indicating the feature's importance as a growth and productivity maximum. Crucially for deep chlorophyll layer formation, the photic zone extended deeper than the surface mixed layer in mid-summer. The layer extended through most of the offshore waters in July 2008, but was not present in the easternmost transect, which had a deeper surface mixed layer. By early September 2008, the lakewide deep chlorophyll layer had dissipated. A similar formation and dissipation was observed in the lakewide survey of Lake Ontario in 2003.
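Correcting in situ fluorometer profiles against paired extracted-chlorophyll samples is typically an ordinary least-squares calibration. A minimal sketch, with made-up paired values for illustration:

```python
# Least-squares calibration of raw fluorometer voltage against extracted
# chlorophyll a from paired samples; applied to correct a profile.
import statistics

def fit_linear(x, y):
    """Return (slope, intercept) of the least-squares line y = a*x + b."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    a = sxy / sxx
    return a, my - a * mx

raw_fluor = [1.0, 2.0, 4.0, 8.0]   # in situ sensor output (illustrative)
extracted = [1.4, 3.1, 6.2, 12.5]  # lab chlorophyll a, ug/L (illustrative)
slope, intercept = fit_linear(raw_fluor, extracted)
corrected = [slope * v + intercept for v in raw_fluor]
```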

  11. Satellite Remote Sensing Studies of Biological and Biogeochemical Processing in the Ocean

    NASA Technical Reports Server (NTRS)

    Vernet, Maria

    2001-01-01

    The remote sensing of phycoerythrin-containing phytoplankton by ocean color was evaluated. Phycoerythrin (PE) can be remotely sensed by three methods: surface reflectance (Sathyendranath et al. 1994), laser-activated fluorescence (Hoge and Swift 1986), and passive fluorescence (Letelier et al. 1996). In collaboration with Dr. Frank Hoge and Robert Swift during Dr. Maria Vernet's tenure as Senior Visiting Scientist at Wallops Island, the active and passive methods were studied, in particular the detection of PE fluorescence and spectral reflectance from airborne LIDAR (AOL). Airborne instrumentation allows more detailed and flexible sampling of the ocean surface than satellites, thus providing an ideal platform to test models and develop algorithms that can later be applied to ocean color satellites such as TERRA and AQUA. Dr. Vernet's contribution to the Wallops team included determination of PE in the water column, in conjunction with AOL flights over the North Atlantic Bight. In addition, a new flow-through fluorometer for PE determination by fluorescence was tested and calibrated. Results: several goals were achieved during this period. Cruises to the California Current, North Atlantic Bight, Gulf of Maine, and Chesapeake Bay provided sampling under different oceanographic and optical conditions. The ships carried the flow-through fluorometer, and samples for the determination of PE were obtained from the flow-through stream. The AOL was flown over the ship's track, usually several flights per cruise, weather permitting.

  12. DNA aptamer beacon assay for C-telopeptide and handheld fluorometer to monitor bone resorption.

    PubMed

    Bruno, John Gordon; Carrillo, Maria P; Phillips, Taylor; Hanson, Douglas; Bohmann, Jonathan A

    2011-09-01

    A novel DNA aptamer beacon is described for quantification of a 26-amino acid C-telopeptide (CTx) of human type I bone collagen. One aptamer sequence and its reverse complement dominated the aptamer pool (31.6% of sequenced clones). Secondary structures of these aptamers were examined for potential binding pockets. Three-dimensional computer models that analyzed docking topologies and binding energies were in agreement with empirical fluorescence experiments used to select one candidate loop for beacon assay development. All loop structures from the aptamer finalists were end-labeled with TYE 665 and Iowa Black quencher for comparison of beacon fluorescence levels as a function of CTx concentration. The optimal beacon, designated CTx 2R-2h, yielded a low ng/ml limit of detection using a commercially available handheld fluorometer. The CTx aptamer beacon bound full-length 26-amino acid CTx peptide, but not a shorter 8-amino acid segment of CTx peptide, which is a common target for commercial CTx ELISA kits. The prototype assay was shown to detect CTx peptide from human urine after creatinine and urea were removed by size-exclusion chromatography to prevent nonspecific denaturing of the aptamer beacon. This work demonstrates the potential of aptamer beacons to be utilized for rapid and sensitive bone health monitoring in a handheld or point-of-care format.
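A limit of detection like the one reported here is commonly estimated from blank noise and the calibration slope using the 3.3σ/slope convention. The blank readings and slope below are hypothetical, not the paper's data:

```python
# Limit-of-detection estimate via the common 3.3*sigma/slope convention,
# from replicate blank readings and a linear calibration slope.
import statistics

def limit_of_detection(blank_readings, slope):
    """LOD in concentration units (here ng/mL), from blank noise and slope."""
    sigma = statistics.stdev(blank_readings)  # sample standard deviation
    return 3.3 * sigma / slope

blanks = [100.2, 99.8, 100.5, 99.5, 100.0]  # beacon fluorescence, no CTx (hypothetical)
slope_per_ng_ml = 2.0                       # fluorescence units per ng/mL (hypothetical)
print(f"LOD ~ {limit_of_detection(blanks, slope_per_ng_ml):.2f} ng/mL")
```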

  13. Excitation-emission matrix fluorescence spectroscopy in conjunction with multiway analysis for PAH detection in complex matrices.

    PubMed

    Nahorniak, Michelle L; Booksh, Karl S

    2006-12-01

    A field portable, single exposure excitation-emission matrix (EEM) fluorometer has been constructed and used in conjunction with parallel factor analysis (PARAFAC) to determine the sub part per billion (ppb) concentrations of several aqueous polycyclic aromatic hydrocarbons (PAHs), such as benzo(k)fluoranthene and benzo(a)pyrene, in various matrices including aqueous motor oil extract and asphalt leachate. Multiway methods like PARAFAC are essential to resolve the analyte signature from the ubiquitous background in environmental samples. With multiway data and PARAFAC analysis it is shown that reliable concentration determinations can be achieved with minimal standards in spite of the large convoluting fluorescence background signal. Thus, rapid fieldable EEM analyses may prove to be a good screening method for tracking pollutants and prioritizing sampling and analysis by more complete but time consuming and labor intensive EPA methods.

  14. Loop-mediated isothermal amplification (LAMP) assay-A rapid detection tool for identifying red fox (Vulpes vulpes) DNA in the carcasses of harbour porpoises (Phocoena phocoena).

    PubMed

    Heers, Teresa; van Neer, Abbo; Becker, André; Grilo, Miguel Luca; Siebert, Ursula; Abdulmawjood, Amir

    2017-01-01

    Carcasses of wild animals are often visited by different scavengers. However, determining which scavenger caused particular bite marks is difficult, and knowledge thereof is lacking. Therefore, a loop-mediated isothermal amplification (LAMP) assay (target sequence: cytochrome b) was developed to detect red fox DNA in carcasses of harbour porpoises. The MSwab™ method for direct testing without prior DNA isolation was validated. As a detection device, the portable real-time fluorometer Genie® II was used, which yields rapid results and can be used in field studies without extensive laboratory equipment. In addition to in vitro evaluation and validation, a stranded and scavenged harbour porpoise carcass was successfully examined for red fox DNA residues. The developed LAMP method is a valuable diagnostic tool for confirming presumed red fox bite wounds in harbour porpoises without further DNA isolation steps.

  15. Feasibility of surveying pesticide coverage with airborne fluorometer

    NASA Technical Reports Server (NTRS)

    Stoertz, G. E.; Hemphill, W. R.

    1970-01-01

    Response of a Fraunhofer line discriminator (FLD) to varying distributions of granulated corncobs stained with varying concentrations of Rhodamine WT dye was tested on the ground and from an H-19 helicopter. The granules are used as a vehicle for airborne emplacement of poison to control fire ants in the eastern and southeastern United States. Test results showed that the granules are detectable by FLD but that the concentration must be too great to be practical with the present apparatus. Possible methods for enhancement of response may include: (1) increasing dye concentration; (2) incorporating with the poisoned granules a second material to carry the dye alone; (3) use of a more strongly fluorescent substance (at 5890 A); (4) modifying the time interval after dyeing, or modifying the method of dyeing; (5) modifying the FLD for greater efficiency, increased field of view or larger optics; or (6) experimenting with laser-stimulated fluorescence.

  16. Secretory production of cell wall components by Saccharomyces cerevisiae protoplasts in static liquid culture.

    PubMed

    Aoyagi, Hideki; Ishizaka, Mikiko; Tanaka, Hideo

    2012-04-01

    When protoplasts of Saccharomyces cerevisiae T7 and IFO 0309 are cultured in a static liquid culture at 2.5 × 10(6) protoplasts/ml, cell wall regeneration does not occur and cell wall components (CWC) are released into the culture broth. By using a specialized fluorometer, the concentrations of CWC could be measured on the basis of the fluorescence intensity of the CWC after staining with Fluostain I. The inoculum concentration, pH, and osmotic pressure of the medium were important factors for the production of CWC in culture. Under optimal culture conditions, S. cerevisiae T7 protoplasts produced 0.91 mg/ml CWC after 24 h. The CWC induced the tumor necrosis factor-α production about 1.3 times higher than that of the commercially available β-1,3/1,6-glucan from baker's yeast cells.

  17. Time-of-travel and dispersion studies, Lehigh River, Francis E. Walter Lake to Easton, Pennsylvania

    USGS Publications Warehouse

    Kauffman, C.D.

    1983-01-01

    Results of time-of-travel and dispersion studies are presented for the 77.0-mile reach of the Lehigh River from Francis E. Walter Lake to Easton, Pennsylvania. Rhodamine WT dye was injected at several points under a variety of common flow conditions, and its downstream travel was monitored at a number of downstream points by means of a fluorometer. Time-of-travel data have been related to stream discharge, distance along the river channel, and dispersion. If 2.205 pounds of a conservative water-soluble contaminant were accidentally spilled into the Lehigh River at Penn Haven Junction at Black Creek, 6.09 miles downstream from Rockport, Pennsylvania, when the discharge at Walnutport, Pennsylvania, was 600 cubic feet per second, the leading edge, peak, and trailing edge of the contaminant would arrive 31.6 miles downstream at the Northampton, Pennsylvania, water intakes 45, 54, and 66 hours later, respectively. The maximum concentration expected at the intakes would be about 1.450 micrograms per liter. From the data and relations presented, time-of-travel and maximum-concentration estimates can be made for any two points within the reach. (USGS)

  18. Airborne fluorometer applicable to marine and estuarine studies

    USGS Publications Warehouse

    Stoertz, George E.; Hemphill, William R.; Markle, David A.

    1969-01-01

    An experimental Fraunhofer line discriminator detected solar-stimulated yellow fluorescence (5890 A) emitted by Rhodamine WT dye in aqueous solutions. Concentration of 1 part per billion was detected in tap water 1/2-meter deep. In extremely turbid San Francisco Bay, dye was monitored in concentrations of less than 5 parts per billion from helicopter and ship. Applications include studies of current dynamics and dispersion. Potential applications of the technique could include sensing oil spills, fish oils, lignin sulfonates, other fluorescent pollutants, and chlorophyll fluorescence.

  19. Bio-optical profile data report coastal transition zone program, R/V Point Sur, June 15-28, 1987

    NASA Technical Reports Server (NTRS)

    Davis, Curtiss O.; Rhea, W. Joseph

    1990-01-01

    Twenty vertical profiles of the bio-optical properties of the ocean were made during a research cruise on the R/V Point Sur, June 15 to 28, 1987, as part of the Coastal Transition Zone Program off Point Arena, California. Extracted chlorophyll values were also measured at some stations to provide calibration data for the in situ fluorometer. This summary provides investigators with an overview of the data collected. The entire data set is available in digital form.

  20. Autonomous Sampling of Remote Phytoplankton Blooms in the North Pacific Subtropical Gyre (July-Aug. 2015).

    NASA Astrophysics Data System (ADS)

    Anderson, E. E.; Wilson, C.; Villareal, T. A.

    2016-12-01

    Satellite ocean color data regularly reveal the existence of large (10³ km²) phytoplankton blooms in the North Pacific Ocean that can persist for weeks to months and are often associated with N2-fixing diatom symbioses. The basin's size and the inability to accurately forecast these blooms make sampling these events difficult outside of the time series at Station ALOHA. We used an autonomous Wave Glider surface vehicle (Honey Badger) to conduct a large regional survey well north of HI to examine bloom composition and key species distribution. Honey Badger was equipped with a gpCTD, downward looking camera, 2 C3 fluorometers, wind and wave sensors, a Turner Designs' Phytoflash, and a Sequoia Scientific LISST-Holo for imaging cells. Most of the data collected was available in near-real time through NOAA's ERDDAP data server. The 159-day mission began 1 June 2015 and covered 6800 km. From 1 July 2015 to 31 August 2015, Honey Badger transited from low levels of chlorophyll-a (chl) (0.06±0.01 mg m-3), through a mesoscale bloom, and then into a broad regional chl increase (0.08±0.01 mg m-3) as noted by the AQUA MODIS satellite. Phytoplankton cell counts (> 14,000 Hemiaulus cells L-1) and increased nocturnal Fv:Fm yields (maximum > 0.61) were concurrent with the 0.1 µg Chl L-1 bloom. A separate bloom of the Rhizosolenia-Richelia symbiosis was noted (> 3,000 Rhizosolenia-Richelia cells L-1) within a smaller, short-lived bloom with a biovolume 2.1 times higher than the rest of the southern transect. The broad regional chl increase in the southern leg of the transit was concurrent with a sustained Hemiaulus increase to 10² cells L-1. Diel patterns in Fv:Fm did not suggest Fe limitation anywhere in the transect. Elevated yields were found only in the diatom increases. Honey Badger and the instruments it carried were useful tools for the investigation of remote bloom dynamics in the Eastern North Pacific Subtropical Gyre.

  1. Autofluorescence lifetime metrology for label-free detection of cartilage matrix degradation

    NASA Astrophysics Data System (ADS)

    Nickdel, Mohammad B.; Lagarto, João. L.; Kelly, Douglas J.; Manning, Hugh B.; Yamamoto, Kazuhiro; Talbot, Clifford B.; Dunsby, Christopher; French, Paul; Itoh, Yoshifumi

    2014-03-01

    Degradation of articular cartilage extracellular matrix (ECM) by proteolytic enzymes is the hallmark of arthritis that leads to joint destruction. Detection of early biochemical changes in cartilage, before irreversible structural damage becomes apparent, is highly desirable. Here we report that the autofluorescence decay profile of cartilage is significantly affected by proteolytic degradation of cartilage ECM and can be characterised by measurements of the autofluorescence lifetime (AFL). A multidimensional fluorometer utilizing ultraviolet excitation at 355 nm or 375 nm coupled to a fibre-optic probe was developed for single-point time-resolved AFL measurements of porcine articular cartilage explants treated with different proteinases. Degradation of cartilage matrix components by treatment with bacterial collagenase, matrix metalloproteinase 1, or trypsin resulted in a significant reduction of the AFL of the cartilage in both a dose- and time-dependent manner. Differences in cartilage AFL were also confirmed by fluorescence lifetime imaging microscopy (FLIM). Our data suggest that the AFL of cartilage tissue is a potential non-invasive readout for monitoring cartilage matrix integrity that may be utilized for diagnosis of arthritis as well as monitoring the efficacy of anti-arthritic therapeutic agents.

  2. A hand-held electronic tongue based on fluorometry for taste assessment of tea.

    PubMed

    Chang, Kuang-Hua; Chen, Richie L C; Hsieh, Bo-Chuan; Chen, Po-Chung; Hsiao, Hsien-Yi; Nieh, Chi-Hua; Cheng, Tzong-Jih

    2010-12-15

    A hand-held electronic tongue was developed for determining taste levels of astringency and umami in tea infusions. The sensing principles are based on quenching the fluorescence of 3-aminophthalate by tannin, and on the fluorogenic reaction of o-phthalaldehyde (OPA) with amino acids, to determine astringency and umami levels, respectively. Both reactions were measured by a single fluorescence sensing system with the same excitation and emission wavelengths (340/425 nm). This work describes in detail the design, fabrication, and performance evaluation of a hand-held fluorometer with an ultraviolet light-emitting diode (UV LED) and a photodetector with a built-in filter. The dimensions and weight of the proposed electronic tongue prototype are only 120×60×65 mm and 150 g, respectively. The detection limits of this prototype for theanine and tannic acid were 0.2 μg/ml and 1 μg/ml, respectively. Correlation coefficients of this prototype compared with a commercial fluorescence instrument are both higher than 0.995 in determinations of tannic acid and theanine. Linear detection ranges of the hand-held fluorometer for tannic acid and theanine are 1-20 μg/ml and 0.2-10 μg/ml (CV<5%, n=3), respectively. A specified taste indicator for tea, defined as the ratio of umami to astringency, was adopted here to effectively distinguish the flavour quality of partially fermented Oolong teas. Copyright © 2010 Elsevier B.V. All rights reserved.
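The umami-to-astringency taste indicator defined above can be sketched as two inverted linear calibrations combined as a ratio. The calibration slopes and intercepts here are purely illustrative (the quenching channel is modeled with a negative slope, since tannin reduces fluorescence):

```python
# Illustrative taste-indicator calculation: invert two linear fluorescence
# calibrations (umami via OPA reaction, astringency via tannin quenching)
# and take the ratio of the resulting concentrations.
def concentration(intensity, slope, intercept):
    """Invert a linear calibration: intensity = slope * conc + intercept."""
    return (intensity - intercept) / slope

def taste_indicator(umami_intensity, astringency_intensity):
    theanine = concentration(umami_intensity, slope=50.0, intercept=5.0)        # ug/mL (hypothetical)
    tannic = concentration(astringency_intensity, slope=-30.0, intercept=900.0)  # ug/mL (hypothetical)
    return theanine / tannic  # umami : astringency

# theanine = (255-5)/50 = 5.0 ug/mL; tannic = (600-900)/(-30) = 10.0 ug/mL
print(taste_indicator(255.0, 600.0))  # ratio 0.5
```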

  3. In-situ Fluorometers Reveal High Frequency Dynamics In Dissolved Organic Matter For Urban Rivers

    NASA Astrophysics Data System (ADS)

    Croghan, D.; Bradley, C.; Khamis, K.; Hannah, D. M.; Sadler, J. P.; Van Loon, A.

    2017-12-01

    To date, dissolved organic matter (DOM) dynamics have been poorly quantified in urban rivers, despite the substantial water quality issues linked to urbanisation. Research has been hindered by the low temporal resolution of observations and an over-reliance on manual sampling, which often fails to capture precipitation events and diurnal dynamics. High frequency data are essential to estimate DOM fluxes/loads more accurately and to understand DOM supply and transport processes. Recent advances in optical sensor technology, including field-deployable in-situ fluorometers, are yielding new high resolution DOM information. However, no consensus exists regarding the monitoring resolution required for urban systems, with no studies monitoring at <15 min time steps. High-frequency monitoring (5 min resolution; 4 week duration) was conducted on a headwater urban stream in Birmingham, UK (N 52.447430 W -1.936715) to determine the optimum temporal resolution for characterization of DOM event dynamics. A through-flow GGNU-30 monitored wavelengths corresponding to tryptophan-like fluorescence (TLF; Peak T1) (Ex 285 nm/Em 345 nm) and humic-like fluorescence (HLF; Peak C) (Ex 365 nm/Em 490 nm). The results suggest that at base flow TLF and HLF are relatively stable, though episodic DOM inputs can pulse through the system and may be missed by lower temporal resolution monitoring. High temporal variation occurs in TLF and HLF intensity during storm events: TLF intensity is highest during the rising limb of the hydrograph and can decline rapidly thereafter, indicating the importance of fast flow paths and nearby sources to TLF dynamics. HLF intensity tracks discharge more closely, but can also decline quickly during high flow events due to dilution effects. 
Furthermore, the ratio of TLF:HLF when derived at high-frequency provides a useful indication of the presence and type of organic effluents in stream, which aids in the identification of Combined Sewage Overflow releases. Our work highlights the need for future studies to utilise shorter temporal scales than previously used to monitor urban DOM dynamics. The application of higher frequency monitoring enables the identification of finer-scale patterns and subsequently aids in deciphering the sources and pathways controlling urban DOM dynamics.
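The TLF:HLF screening described above can be sketched as a simple per-sample ratio with a threshold flag. The intensities and threshold are illustrative assumptions, not values from the study:

```python
# Derive the TLF:HLF ratio from paired high-frequency fluorescence channels
# and flag samples whose ratio exceeds a (hypothetical) threshold that may
# indicate organic effluent, e.g. a Combined Sewage Overflow release.
def flag_effluent(tlf_series, hlf_series, threshold=1.5):
    """Return (ratios, flagged) lists for paired readings."""
    ratios = [t / h for t, h in zip(tlf_series, hlf_series)]
    flagged = [r > threshold for r in ratios]
    return ratios, flagged

tlf = [30.0, 32.0, 95.0, 40.0]  # tryptophan-like intensity (illustrative)
hlf = [60.0, 58.0, 50.0, 55.0]  # humic-like intensity (illustrative)
ratios, flags = flag_effluent(tlf, hlf)
# Only the third sample (ratio 1.9) exceeds the threshold.
```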

  4. SeaWiFS technical report series. Volume 32: Level-3 SeaWiFS data products. Spatial and temporal binning algorithms

    NASA Technical Reports Server (NTRS)

    Hooker, Stanford B. (Editor); Firestone, Elaine R. (Editor); Acker, James G. (Editor); Campbell, Janet W.; Blaisdell, John M.; Darzi, Michael

    1995-01-01

    The level-3 data products from the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) are statistical data sets derived from level-2 data. Each data set will be based on a fixed global grid of equal-area bins that are approximately 9 x 9 sq km. Statistics available for each bin include the sum and sum of squares of the natural logarithm of derived level-2 geophysical variables where sums are accumulated over a binning period. Operationally, products with binning periods of 1 day, 8 days, 1 month, and 1 year will be produced and archived. From these accumulated values and for each bin, estimates of the mean, standard deviation, median, and mode may be derived for each geophysical variable. This report contains two major parts: the first (Section 2) is intended as a users' guide for level-3 SeaWiFS data products. It contains an overview of level-0 to level-3 data processing, a discussion of important statistical considerations when using level-3 data, and details of how to use the level-3 data. The second part (Section 3) presents a comparative statistical study of several binning algorithms based on CZCS and moored fluorometer data. The operational binning algorithms were selected based on the results of this study.
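The accumulation scheme described above can be sketched directly: per bin, keep the count, the sum of ln(x), and the sum of ln(x)² over the binning period, from which the log-space mean and standard deviation (and hence the geometric mean) follow. The bin values below are illustrative:

```python
# Per-bin statistics from accumulated sums of the natural logarithm of a
# level-2 geophysical variable, as in the level-3 binning scheme.
import math

def bin_stats(values):
    """Return (geometric mean, log-space std dev) for one bin's values."""
    n = len(values)
    s = sum(math.log(v) for v in values)        # accumulated sum of ln(x)
    ss = sum(math.log(v) ** 2 for v in values)  # accumulated sum of ln(x)^2
    mean_log = s / n
    var_log = max(ss / n - mean_log ** 2, 0.0)  # guard against rounding
    return math.exp(mean_log), math.sqrt(var_log)

chlor = [0.08, 0.10, 0.12, 0.09]  # example level-2 values falling in a bin
geo_mean, log_sd = bin_stats(chlor)
```

Accumulating sums rather than raw values is what lets 1-day bins be combined into 8-day, monthly, and yearly products without revisiting the level-2 data.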

  5. Detection Limits for Spectro-fluorometry: A Case Study in the Region of Finstersee, Canton Zug, Northern Switzerland

    NASA Astrophysics Data System (ADS)

    Otz, M. H.; Otz, H. K.; Keller, P.

    2002-05-01

    Synthetic fluorescent dyes, applied below the visual detection limit (< 0.1 mg/L), have been used as tracers of ground water flow paths since the beginning of the 1950s. Since 1965, we have used spectro-fluorometers with photomultipliers to measure low concentrations of fluorescent dyes in ground water in Switzerland. In collaboration with the Engineering Geology Department of the ETH, we have separated uranine at 0.1 ng/L and Na-naphtionate at 1 ng/L from background fluorescence of spring water in the Finstersee region. These values are 10-100 times lower than postulated detection limits in the literature. The use of low dye concentrations prevents a study region from being contaminated by increased background levels due to remnant dye within the aquifer, thereby leaving the region available for future dye tracing studies. Lower detection limits also can solve particular hydraulic problems where conventional methods fail and enhance the possibility for using artificial dyes in environmentally sensitive aquifer settings.

  6. Fluorometric determination of zirconium in minerals

    USGS Publications Warehouse

    Alford, W.C.; Shapiro, L.; White, C.E.

    1951-01-01

    The increasing use of zirconium in alloys and in the ceramics industry has created renewed interest in methods for its determination. It is a common constituent of many minerals, but is usually present in very small amounts. Published methods tend to be tedious, time-consuming, and uncertain as to accuracy. A new fluorometric procedure, which overcomes these objections to a large extent, is based on the blue fluorescence given by zirconium and flavonol in sulfuric acid solution. Hafnium is the only element that interferes. The sample is fused with borax glass and sodium carbonate and extracted with water. The residue is dissolved in sulfuric acid, made alkaline with sodium hydroxide to separate aluminum, and filtered. The precipitate is dissolved in sulfuric acid and electrolysed in a Melaven cell to remove iron. Flavonol is then added and the fluorescence intensity is measured with a photo-fluorometer. Analysis of seven standard mineral samples shows excellent results. The method is especially useful for minerals containing less than 0.25% zirconium oxide.

  7. Determination of lithium in rocks: Fluorometric method

    USGS Publications Warehouse

    White, C.E.; Fletcher, M.H.; Parks, J.

    1951-01-01

    The gravimetric method in general use for the determination of lithium is tedious, and the final weighed product often contains other alkali metals. A fluorometric method was developed to shorten the time required for the analysis and to assure that the final determination is for lithium alone. This procedure is based on the complex formed between lithium and 8-hydroxyquinoline. The fluorescence is developed in a slightly alkaline solution of 95% alcohol and measurement is made on a photoelectric fluorometer. Separation from the ore is carried out by the wet method or by the distillation procedure. Sodium and potassium are removed by alcohol and ether, but complete separation is not necessary. Comparison of analyzed samples shows excellent agreement with spectrographic and gravimetric methods. The fluorometric method is more rapid than the gravimetric and produces more conclusive results. Another useful application is in the preparation of standard lithium solutions from reagent quality salts when a known standard is available. In this case no separations are necessary.

  8. Chemical, biochemical, and environmental fiber sensors IV; Proceedings of the Meeting, Boston, MA, Sept. 8, 9, 1992

    NASA Astrophysics Data System (ADS)

    Lieberman, Robert A.

    Various papers on chemical, biochemical, and environmental fiber sensors are presented. Some of the individual topics addressed include: evanescent-wave fiber optic (FO) biosensors, refractive-index sensors based on coupling to high-index multimode overlays, advanced techniques in FO sensors, design of luminescence-based temperature sensors, NIR fluorescence in FO applications, FO sensors based on microencapsulated reagents, emitters and detectors for optical gas and chemical sensing, a tunable fiber laser source for methane detection at 1.68 micron, a FO fluorometer based on a dual-wavelength laser excitation source, thin polymer films as active components of FO chemical sensors, submicron optical sources for single-macromolecule detection, and a nanometer optical fiber pH sensor. Also discussed are: microfabrication of optical sensor arrays, a luminescent FO sensor for the measurement of pH, time-domain fluorescence methods as applied to pH sensing, characterization of a sol-gel-entrapped artificial receptor, FO technology for nuclear waste cleanup, spectroscopic gas sensing with IR hollow waveguides, and dissolved-oxygen quenching of in situ fluorescence measurements.

  9. CSIDE Experiment: Nearshore-Estuarine Connectivity & Dispersion

    NASA Astrophysics Data System (ADS)

    Giddings, S. N.; Feddersen, F.; Harvey, M.; Gilroy, A. R.; Crooks, J.; McCullough, J.; Lorda, J.; Grimes, D. J.; Pawlak, G. R.

    2016-02-01

    As part of the CSIDE (Cross Surfzone/Inner-shelf Dye Exchange) experiment, nearby shallow estuary measurements were made in addition to the surfzone and inner-shelf measurements, providing an integrated view into the estuary, surfzone, and shelf system. The CSIDE experiment was designed to look at the dispersion of dye as a proxy for the dispersion of waterborne constituents such as pollutants, larvae, and sediment along the coast and across the surfzone to a stratified inner-shelf. The Tijuana River Estuary, a shallow estuary with extensive intertidal regions and marsh, is sometimes the source of harmful contaminants that lead to beach closures in the CSIDE experiment region. However, at other times, the estuary may also act as a sink depending upon the freshwater conditions upstream. During this experiment, we installed temperature and salinity sensors, velocimeters, and fluorometers (measuring both turbidity and the dye concentration) in the two main arms of the Tijuana River Estuary to assess the connectivity between the surfzone and the estuary as well as the in-estuary dispersion.

  10. Calibration procedures and first data set of Southern Ocean chlorophyll a profiles collected by elephant seals equipped with newly developed CTD-fluorescence tags

    NASA Astrophysics Data System (ADS)

    Guinet, C.; Xing, X.; Walker, E.; Monestiez, P.; Marchand, S.; Picard, B.; Jaud, T.; Authier, M.; Cotté, C.; Dragon, A. C.; Diamond, E.; Antoine, D.; Lovell, P.; Blain, S.; D'Ortenzio, F.; Claustre, H.

    2012-08-01

    In-situ observation of the marine environment has traditionally relied on ship-based platforms. The obvious consequence is that physical and biogeochemical properties have been dramatically undersampled, especially in the remote Southern Ocean (SO). The difficulty in obtaining in situ data represents a major limitation to our understanding and interpretation of the coupling between physical forcing and the biogeochemical response. Southern elephant seals (Mirounga leonina) equipped with a new generation of oceanographic sensors can measure ocean structure in regions and seasons rarely observed with traditional oceanographic platforms. Over the last few years, seals have allowed for a considerable increase in temperature and salinity profiles from the SO. However, we were still lacking information on the spatio-temporal variation of phytoplankton concentration. This information is critical to assess how the biological productivity of the SO, with direct consequences on the amount of CO2 "fixed" by the biological pump, will respond to global warming. In this research program, we use an innovative sampling fluorescence approach to quantify phytoplankton concentration at sea. For the first time, a low-energy-consumption fluorometer was added to Argos CTD-SRDL tags, and these novel instruments were deployed on 27 southern elephant seals between 25 December 2007 and 4 February 2011. As many as 3388 fluorescence profiles associated with temperature and salinity measurements were thereby collected from a vast sector of the Southern Indian Ocean. This paper addresses the calibration of the fluorometer before deployment on elephant seals and presents the first results obtained for the Indian Sector of the Southern Ocean. This in situ system is implemented in synergy with satellite ocean colour radiometry. Satellite-derived data are limited to the surface layer and are restricted over the SO by extensive cloud cover.
However, with the addition of these new tags, we are able to assess the 3-dimensional distribution of phytoplankton concentration sampled by foraging southern elephant seals. This approach reveals that for the Indian sector of the SO, the surface chlorophyll a (chl a) concentrations provided by MODIS were underestimated by a factor of the order of 2-3 compared to in situ measurements. The scientific outcomes of this program include an improved understanding of both the present state and variability of ocean biology and the accompanying biogeochemistry, as well as the delivery of real-time and open-access data to scientists (doi:10.7491/MEMO.1x).

  11. Calibration procedures and first dataset of Southern Ocean chlorophyll a profiles collected by elephant seals equipped with newly developed CTD-fluorescence tags

    NASA Astrophysics Data System (ADS)

    Guinet, C.; Xing, X.; Walker, E.; Monestiez, P.; Marchand, S.; Picard, B.; Jaud, T.; Authier, M.; Cotté, C.; Dragon, A. C.; Diamond, E.; Antoine, D.; Lovell, P.; Blain, S.; D'Ortenzio, F.; Claustre, H.

    2013-02-01

    In situ observation of the marine environment has traditionally relied on ship-based platforms. The obvious consequence is that physical and biogeochemical properties have been dramatically undersampled, especially in the remote Southern Ocean (SO). The difficulty in obtaining in situ data represents a major limitation to our understanding and interpretation of the coupling between physical forcing and the biogeochemical response. Southern elephant seals (Mirounga leonina) equipped with a new generation of oceanographic sensors can measure ocean structure in regions and seasons rarely observed with traditional oceanographic platforms. Over the last few years, seals have allowed for a considerable increase in temperature and salinity profiles from the SO, but we were still lacking information on the spatiotemporal variation of phytoplankton concentration. This information is critical to assess how the biological productivity of the SO, with direct consequences on the amount of CO2 "fixed'' by the biological pump, will respond to global warming. In this research programme, we use an innovative sampling fluorescence approach to quantify phytoplankton concentration at sea. For the first time, a low-energy-consumption fluorometer was added to Argos CTD-SRDL tags, and these novel instruments were deployed on 27 southern elephant seals between 25 December 2007 and 4 February 2011. As many as 3388 fluorescence profiles associated with temperature and salinity measurements were thereby collected from a vast sector of the Southern Indian Ocean. This paper addresses the calibration of the fluorometer before deployment on elephant seals and presents the first results obtained for the Indian sector of the Southern Ocean. This in situ system is implemented in synergy with satellite ocean colour radiometry. Satellite-derived data are limited to the surface layer and are restricted over the SO by extensive cloud cover.
However, with the addition of these new tags, we are able to assess the 3-dimensional distribution of phytoplankton concentration sampled by foraging southern elephant seals. This approach reveals that for the Indian sector of the SO, the surface chlorophyll a (chl a) concentrations provided by MODIS were underestimated by a factor of 2 compared to chl a concentrations estimated from HPLC-corrected in situ fluorescence measurements. The scientific outcomes of this programme include an improved understanding of both the present state and variability of ocean biology and the accompanying biogeochemistry, as well as the delivery of real-time and open-access data to scientists (doi:10.7491/MEMO.1).

  12. The microbial mats of Pavilion Lake microbialites: examining the relationship between photosynthesis and carbonate precipitation

    NASA Astrophysics Data System (ADS)

    Lim, D. S. S.; Hawes, I.; Mackey, T. J.; Brady, A. L.; Biddle, J.; Andersen, D. T.; Belan, M.; Slater, G.; Abercromby, A.; Squyres, S. W.; Delaney, M.; Haberle, C. W.; Cardman, Z.

    2014-12-01

    Pavilion Lake in British Columbia, Canada is an ultra-oligotrophic lake that has abundant microbialite growth. Recent research has shown that photoautotrophic microbial communities are important to modern microbialite development in Pavilion Lake. However, questions remain as to the relationship between changing light levels within the lake, variation in microbialite macro-structure, microbial consortia, and the preservation of associated biosignatures within the microbialite fabrics. The 2014 Pavilion Lake Research Project (PLRP) field program was focused on data gathering to understand these complex relationships by determining a) whether light is the immediate limit to photosynthetic activity and, if so, whether light is distributed around microbialites in ways that are consistent with emergent microbialite structure; and b) whether, at more local scales, the filamentous pink and green cyanobacterial nodular colonies identified in previous PLRP studies are centers of photosynthetic activity that create pH conditions suitable for carbonate precipitation. A diver-deployed pulse-amplitude modulated (PAM) fluorometer was used to collect synoptic in situ measurements of fluorescence yield and irradiance across microbialites, focusing on comparing flat and vertical structural elements at a range of sites and depths. As well, we collected time series measurements of photosynthetic activity and irradiance at a set depth of 18 m across three different regions in Pavilion Lake. Our initial findings suggest that all microbialite surfaces are primarily light-limited regardless of depth or location within the lake. Shore-based PAM fluorometry and microelectrode profiling of diver-collected samples suggest that pink and green nodules have different photosynthetic properties and pH profiles, and that nodular growth is likely to be the primary route of calcification due to the gelatinous covering that the nodule creates.
Ongoing tests for molecular signatures and isotopic shifts will allow for further examination of surface microvariation and the associated influence on microbialite development.

  13. Fiber Optic Immunochemical Sensors For Continuous Monitoring Of Hapten Concentrations

    NASA Astrophysics Data System (ADS)

    Miller, W. Greg; Anderson, F. Philip

    1989-06-01

    We describe a fiber optic sensor based on a homogeneous fluorescence energy transfer immunoassay which operates in a continuous, reversible manner to quantitate the anticonvulsant drug phenytoin. B-phycoerythrin-phenytoin and Texas Red labeled anti-phenytoin antibody were sealed inside a short length of cellulose dialysis tubing which was cemented to the distal end of an optical fiber. When the sensor was placed into a solution of phenytoin, the drug crossed the dialysis membrane, displaced a fraction of the B-phycoerythrin-phenytoin from the antibody, and produced a change in fluorescence signal which was measured with a fiber optic fluorometer. The sensor had a concentration response of 5 to 500 μmol/L phenytoin with a response time of 5 to 15 min and precision of <2.5% CV. The chemical kinetics of the antibody-hapten indicator reaction were modeled mathematically and simulation showed that response time in the minutes range can be achieved when the dissociation rate constant is greater than approximately 10^-3 sec^-1. The dissociation rate constant influences the time to reach equilibrium and the unbound P* concentration range available for instrumental measurement. The ratio of the labeled and unlabeled hapten dissociation rate constants influences the analyte concentration range to which the sensor will respond.
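    The dissociation-rate threshold quoted above can be sanity-checked with a hedged back-of-envelope calculation, treating the displacement step as a simple first-order relaxation governed by the dissociation rate constant alone (the authors' full kinetic model also involves association terms, so this is only an order-of-magnitude sketch):

    ```python
    import math

    # Back-of-envelope sketch: if the sensor response relaxed as exp(-k_d * t),
    # governed only by the dissociation rate constant k_d, the half-life of the
    # approach to equilibrium would be ln(2) / k_d.
    k_d = 1e-3                                  # dissociation rate constant, sec^-1
    half_life_min = math.log(2) / k_d / 60.0    # ~11.6 min, a minutes-range response
    ```

    A k_d of 10^-3 sec^-1 thus corresponds to a half-life of roughly 12 minutes, consistent with the reported 5 to 15 min response time; faster dissociation shortens the response proportionally.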

  14. Preliminary evaluation of an in vivo fluorometer to quantify algal periphyton biomass and community composition

    USGS Publications Warehouse

    Harris, Theodore D.; Graham, Jennifer L.

    2015-01-01

    The bbe-Moldaenke BenthoTorch (BT) is an in vivo fluorometer designed to quantify algal biomass and community composition in benthic environments. The BT quantifies total algal biomass via chlorophyll a (Chl-a) concentration and may differentiate among cyanobacteria, green algae, and diatoms based on pigment fluorescence. To evaluate how BT measurements of periphytic algal biomass (as Chl-a) compared with an ethanol extraction laboratory analysis, we collected BT- and laboratory-measured Chl-a data from 6 stream sites in the Indian Creek basin, Johnson County, Kansas, during August and September 2012. BT-measured Chl-a concentrations were positively related to laboratory-measured concentrations (R2 = 0.47); sites with abundant filamentous algae had weaker relations (R2 = 0.27). Additionally, on a single sample date, we used the BT to determine periphyton biomass and community composition upstream and downstream from 2 wastewater treatment facilities (WWTF) that discharge into Indian Creek. We found that algal biomass increased immediately downstream from the WWTF discharge then slowly decreased as distance from the WWTF increased. Changes in periphyton community structure also occurred; however, there were discrepancies between BT- and laboratory-measured community composition data. Most notably, cyanobacteria were present at all sites based on BT measurements but were present at only one site based on laboratory-analyzed samples. Overall, we found that the BT compared reasonably well with laboratory methods for relative patterns in Chl-a but not as well with absolute Chl-a concentrations. Future studies need to test the BT over a wider range of Chl-a concentrations, in colored waters, and across various periphyton assemblages.

  15. Mesoscale Structure of Bio-Optical Properties Within the Northern California Current System, 2000-2002

    NASA Astrophysics Data System (ADS)

    Cowles, T. J.; Barth, J. A.; Wingard, C. E.; Desiderio, R. A.; Letelier, R. M.; Pierce, S. D.

    2002-12-01

    Mesoscale mapping of the hydrographic and bio-optical properties of the Northern California Current System was conducted during spring and summer 2000, 2001, and 2002 off the Oregon coast. A towed, undulating vehicle carried a CTD, two fluorometers, a multi-wavelength absorption and attenuation meter (ac-9), and a PAR sensor. In addition, an ac-9 and a Fast Repetition Rate fluorometer (FRRf) collected bio-optical data on surface waters throughout the mesoscale surveys. Multiple onshore-offshore transect lines provided repeated crossings of velocity jet and frontal boundaries, and allowed resolution of physical and bio-optical parameters on horizontal scales of 1 km or less and on vertical scales of 1-2 m. Our multi-year results permit assessment of the linkages and the degree of coupling between physical and bio-optical patterns during strong upwelling and strong downwelling events, as well as during low-wind relaxation intervals. The location of the coastal jet and the upwelling front fluctuated considerably under the variable forcing regime, with more extensive mesoscale structure in all parameters in late summer relative to spring, as current meanders developed around subsurface topography (Heceta Bank) and moved offshore near Cape Blanco. Sharp horizontal gradients in autotrophic biomass were observed across the boundaries of the coastal jet and the upwelling front, with chlorophyll levels often in excess of 5-10 mg m^-3 on the inshore side of the fronts. Horizontal gradients also were observed in the spectral slope of attenuation and dissolved absorption as well as in the physiological properties of the autotrophic assemblages (as determined with FRRf). Details of the spatial correlations of physical and bio-optical parameters will be presented.

  16. Image stacking approach to increase sensitivity of fluorescence detection using a low cost complementary metal-oxide-semiconductor (CMOS) webcam.

    PubMed

    Balsam, Joshua; Bruck, Hugh Alan; Kostov, Yordan; Rasooly, Avraham

    2012-01-01

    Optical technologies are important for biological analysis. Current biomedical optical analyses rely on high-cost, high-sensitivity optical detectors such as photomultipliers, avalanche photodiodes, or cooled CCD cameras. In contrast, webcams, mobile phones, and other popular consumer electronics use lower-sensitivity, lower-cost optical components such as photodiodes or CMOS sensors. In order for consumer electronics devices, such as webcams, to be useful for biomedical analysis, they must have increased sensitivity. We combined two strategies to increase the sensitivity of a CMOS-based fluorescence detector. We captured hundreds of low-sensitivity images using a Webcam in video mode, instead of the single image typically used in cooled CCD devices. We then used a computational approach, an image-stacking algorithm, to remove the noise by combining all of the images into a single image. While video mode is widely used for dynamic scene imaging (e.g., movies or time-lapse photography), it is not typically used to produce a single static image; doing so here removes noise and increases sensitivity by more than thirtyfold. The portable, battery-operated Webcam-based fluorometer system developed here consists of five modules: (1) a low-cost CMOS Webcam to monitor light emission, (2) a plate to perform assays, (3) filters and a multi-wavelength LED illuminator for fluorophore excitation, (4) a portable computer to acquire and analyze images, and (5) image-stacking software for image enhancement. The samples consisted of various concentrations of fluorescein, ranging from 30 μM to 1000 μM, in a 36-well miniature plate. In the single-frame mode, the fluorometer's limit of detection (LOD) for fluorescein is ∼1000 μM, which is relatively insensitive. However, when used in video mode combined with image-stacking enhancement, the LOD is dramatically reduced to 30 μM, a sensitivity similar to that of state-of-the-art ELISA plate photomultiplier-based readers.
Numerous medical diagnostic assays rely on optical and fluorescence readers. Our novel combination of detection technologies, which is new to biodetection, may enable the development of new low-cost optical detectors based on an inexpensive Webcam (<$10). It has the potential to form the basis for high-sensitivity, low-cost medical diagnostics in resource-poor settings.
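    The stacking principle described above can be illustrated with a minimal stdlib-only sketch; the signal level, noise level, and frame count below are hypothetical, not the paper's actual webcam parameters:

    ```python
    import random

    random.seed(0)

    # Hypothetical illustration: one faint "fluorescent" pixel buried in sensor noise.
    TRUE_SIGNAL = 1.0      # true emission intensity at the pixel
    NOISE_SIGMA = 5.0      # per-frame read noise (standard deviation)
    N_FRAMES = 400         # "video mode": capture many frames instead of one

    # Simulate per-frame readings of the same pixel, then stack (average) them.
    frames = [TRUE_SIGNAL + random.gauss(0.0, NOISE_SIGMA) for _ in range(N_FRAMES)]
    stacked = sum(frames) / N_FRAMES

    # Averaging N frames reduces the noise by roughly sqrt(N), here about 20x,
    # the same order of magnitude as the paper's reported >30-fold gain.
    single_frame_error = abs(frames[0] - TRUE_SIGNAL)
    stacked_error = abs(stacked - TRUE_SIGNAL)
    ```

    The same averaging applies per pixel across full frames; the sqrt(N) scaling is why hundreds of frames, rather than a handful, are needed to move the LOD from ∼1000 μM toward 30 μM.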

  17. Variation of subsurface chlorophyll maximum layer from the vertical profiler and in-situ observation in the eastern coastal region of Korea (the East/Japan Sea)

    NASA Astrophysics Data System (ADS)

    Son, Y. T.; Chang, K. I.; Nam, S.; Kang, D. J.

    2016-02-01

    A coastal monitoring buoy (ESROB) has been operated continuously since April 2011 to monitor meteorological (wind, air temperature, air pressure, PAR) and oceanic properties (temperature, salinity, current, chlorophyll fluorescence, DO, turbidity) using equipment such as a CTD, a fluorometer, and a WQM (Water Quality Monitor) in the eastern coastal region of Korea (the East/Japan Sea). The ESROB produced high-resolution (10-min interval) time series of physical and biogeochemical parameters of the water column. To understand the horizontal influence of physical and biogeochemical parameters on variation of the subsurface chlorophyll maximum layer (SCM), interdisciplinary in-situ surveys with a small R/V were conducted in the study area for about a week in June/October 2014 and in May 2015. A wirewalker, a wave-driven vertically profiling platform (Rainville and Pinkel 2001), was also deployed at two points (about 30 m and 80 m water depth) along the cross-shore direction with the ESROB for about one or two weeks spanning the in-situ survey durations. The wirewalker was equipped with a CTD, turbidity, and chlorophyll a fluorometer profiler, which completed a profile approximately every 3-10 minutes depending on sea surface state. The SCM was observed in almost every deployment nearest the coast, except in June 2014, varying at semidiurnal and diurnal periods. Wirewalker time series showed the disappearance and reoccurrence of the SCM within the water column in October 2014, associated with vertical mixing induced by strong wind stress. A low-salinity plume in the surface layer and shoaling of bottom cold water were observed concurrently after the water column homogenized, further conditioning the vertical distribution of chlorophyll a in this coastal region.
Moreover, in-situ observations at densely spaced points over a 1-day interval revealed that high chlorophyll a concentrations on isopycnals were associated with the local horizontal circulation that influences the stability (vertical stratification and shear) of the water column. Optical and biogeochemical parameters analyzed from the water samples, which affect the variation of chlorophyll a concentration within the water column, will also be discussed in the presentation at the Ocean Science Meeting.

  18. Image stacking approach to increase sensitivity of fluorescence detection using a low cost complementary metal-oxide-semiconductor (CMOS) webcam

    PubMed Central

    Balsam, Joshua; Bruck, Hugh Alan; Kostov, Yordan; Rasooly, Avraham

    2013-01-01

    Optical technologies are important for biological analysis. Current biomedical optical analyses rely on high-cost, high-sensitivity optical detectors such as photomultipliers, avalanche photodiodes, or cooled CCD cameras. In contrast, webcams, mobile phones, and other popular consumer electronics use lower-sensitivity, lower-cost optical components such as photodiodes or CMOS sensors. In order for consumer electronics devices, such as webcams, to be useful for biomedical analysis, they must have increased sensitivity. We combined two strategies to increase the sensitivity of a CMOS-based fluorescence detector. We captured hundreds of low-sensitivity images using a Webcam in video mode, instead of the single image typically used in cooled CCD devices. We then used a computational approach, an image-stacking algorithm, to remove the noise by combining all of the images into a single image. While video mode is widely used for dynamic scene imaging (e.g., movies or time-lapse photography), it is not typically used to produce a single static image; doing so here removes noise and increases sensitivity by more than thirtyfold. The portable, battery-operated Webcam-based fluorometer system developed here consists of five modules: (1) a low-cost CMOS Webcam to monitor light emission, (2) a plate to perform assays, (3) filters and a multi-wavelength LED illuminator for fluorophore excitation, (4) a portable computer to acquire and analyze images, and (5) image-stacking software for image enhancement. The samples consisted of various concentrations of fluorescein, ranging from 30 μM to 1000 μM, in a 36-well miniature plate. In the single-frame mode, the fluorometer's limit of detection (LOD) for fluorescein is ∼1000 μM, which is relatively insensitive. However, when used in video mode combined with image-stacking enhancement, the LOD is dramatically reduced to 30 μM, a sensitivity similar to that of state-of-the-art ELISA plate photomultiplier-based readers.
Numerous medical diagnostic assays rely on optical and fluorescence readers. Our novel combination of detection technologies, which is new to biodetection, may enable the development of new low-cost optical detectors based on an inexpensive Webcam (<$10). It has the potential to form the basis for high-sensitivity, low-cost medical diagnostics in resource-poor settings. PMID:23990697

  19. Determination of Microbial Extracellular Enzyme Activity in Waters, Soils, and Sediments using High Throughput Microplate Assays

    PubMed Central

    Jackson, Colin R.; Tyler, Heather L.; Millar, Justin J.

    2013-01-01

    Much of the nutrient cycling and carbon processing in natural environments occurs through the activity of extracellular enzymes released by microorganisms. Thus, measurement of the activity of these extracellular enzymes can give insights into the rates of ecosystem level processes, such as organic matter decomposition or nitrogen and phosphorus mineralization. Assays of extracellular enzyme activity in environmental samples typically involve exposing the samples to artificial colorimetric or fluorometric substrates and tracking the rate of substrate hydrolysis. Here we describe microplate based methods for these procedures that allow the analysis of large numbers of samples within a short time frame. Samples are allowed to react with artificial substrates within 96-well microplates or deep well microplate blocks, and enzyme activity is subsequently determined by absorption or fluorescence of the resulting end product using a typical microplate reader or fluorometer. Such high throughput procedures not only facilitate comparisons between spatially separate sites or ecosystems, but also substantially reduce the cost of such assays by reducing overall reagent volumes needed per sample. PMID:24121617
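    The substrate-hydrolysis rate tracking described above can be sketched in Python; the readings, time points, and standard-curve factor below are invented for illustration and do not come from the protocol:

    ```python
    # Hypothetical sketch of the rate-tracking step: fit a least-squares slope
    # to fluorescence readings from one microplate well over time, then convert
    # it to an activity via a standard-curve factor.
    def slope(xs, ys):
        """Ordinary least-squares slope of ys against xs."""
        n = len(xs)
        mean_x = sum(xs) / n
        mean_y = sum(ys) / n
        num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
        den = sum((x - mean_x) ** 2 for x in xs)
        return num / den

    times_h = [0.0, 0.5, 1.0, 1.5, 2.0]        # incubation times (hours)
    rfu = [102.0, 215.0, 330.0, 441.0, 550.0]  # relative fluorescence readings

    rfu_per_h = slope(times_h, rfu)            # rate of fluorescence increase
    RFU_PER_NMOL = 11.2                        # assumed standard-curve factor
    activity_nmol_per_h = rfu_per_h / RFU_PER_NMOL
    ```

    In a 96-well or deep-well format the same fit is repeated per well, which is what makes the microplate approach cheap and high-throughput compared with assaying samples one cuvette at a time.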

  20. Determination of microbial extracellular enzyme activity in waters, soils, and sediments using high throughput microplate assays.

    PubMed

    Jackson, Colin R; Tyler, Heather L; Millar, Justin J

    2013-10-01

    Much of the nutrient cycling and carbon processing in natural environments occurs through the activity of extracellular enzymes released by microorganisms. Thus, measurement of the activity of these extracellular enzymes can give insights into the rates of ecosystem level processes, such as organic matter decomposition or nitrogen and phosphorus mineralization. Assays of extracellular enzyme activity in environmental samples typically involve exposing the samples to artificial colorimetric or fluorometric substrates and tracking the rate of substrate hydrolysis. Here we describe microplate based methods for these procedures that allow the analysis of large numbers of samples within a short time frame. Samples are allowed to react with artificial substrates within 96-well microplates or deep well microplate blocks, and enzyme activity is subsequently determined by absorption or fluorescence of the resulting end product using a typical microplate reader or fluorometer. Such high throughput procedures not only facilitate comparisons between spatially separate sites or ecosystems, but also substantially reduce the cost of such assays by reducing overall reagent volumes needed per sample.

  1. Repeated erosion of cohesive sediments with biofilms

    NASA Astrophysics Data System (ADS)

    Valentine, K.; Mariotti, G.; Fagherazzi, S.

    2014-04-01

    This study aims to explore the interplay between biofilms and the erodability of cohesive sediments. Erosion experiments were run in four laboratory annular flumes with natural sediments. After each erosion the sediment was allowed to settle, mimicking intermittent physical processes like tidal currents and waves. The time between consecutive erosion events ranged from 1 to 12 days. Turbidity of the water column caused by sediment resuspension was used to determine the erodability of the sediments with respect to small and moderate shear stresses. Erodability was also compared on the basis of the presence of benthic biofilms, which were quantified using a Pulse-Amplitude Modulation (PAM) Underwater Fluorometer. We found that frequent erosion led to the establishment of a weak biofilm, which reduced sediment erosion at small shear stresses (around 0.1 Pa). If prolonged periods without erosion were present, the biofilm fully established, resulting in lower erosion at moderate shear stresses (around 0.4 Pa). We conclude that an unstructured extracellular polymeric substances (EPS) matrix always affects sediment erodability at low shear stresses, while only a fully developed biofilm mat can reduce sediment erodability at moderate shear stresses.

  2. Fraunhofer line-depth sensing applied to water

    NASA Technical Reports Server (NTRS)

    Stoertz, G. E.

    1969-01-01

    An experimental Fraunhofer line discriminator is basically an airborne fluorometer, capable of quantitatively measuring the concentration of fluorescent substances dissolved in water. It must be calibrated against standards and supplemented by ground-truth data on turbidity and on approximate vertical distribution of the fluorescent substance. Quantitative use requires that it be known in advance what substance is the source of the luminescence emission; qualitative sensing, or detection of luminescence is also possible. The two approaches are fundamentally different, having different purposes, different applications, and different instruments. When used for sensing of Rhodamine WT dye in coastal waters and estuaries, the FLD is sensing in the spectral region permitting nearly maximum depth of light penetration.

  3. Fertility of frozen-thawed stallion semen cannot be predicted by the currently used laboratory methods

    PubMed Central

    Kuisma, P; Andersson, M; Koskinen, E; Katila, T

    2006-01-01

    The aim of the project was to use current simple and practical laboratory tests and compare the results with the foaling rates of mares inseminated with commercially produced frozen semen. In Exp. 1, semen was tested from 27 stallions and in Exp. 2 from 23 stallions; 19 stallions participated in both experiments. The mean number of mares per stallion in both experiments was 37 (min. 7, max. 121). Sperm morphology was assessed and bacterial culture performed once per stallion. In Exp. 1, progressive motility after 0, 1, 2, 3, and 4 h of incubation using light microscopy, motility characteristics measured with an automatic sperm analyzer, plasma membrane integrity using carboxyfluorescein diacetate/propidium iodide (CFDA/PI) staining and light microscopy, plasma membrane integrity using PI staining and a fluorometer, plasma membrane integrity using a resazurin reduction test, and sperm concentration were evaluated. In Exp. 2, the same tests as in Exp. 1 and a hypo-osmotic swelling test (HOST) using both light microscopy and a fluorometer were performed immediately after thawing and after a 3-h incubation. Statistical analysis was done separately for all stallions and for those having ≥ 20 mares; in addition, stallions with foaling rates < 60% or ≥ 60% were compared. In Exp. 1, progressive motility for all stallions after a 2-4-h incubation correlated with the foaling rate (correlation coefficients 0.39-0.51, p < 0.05). In stallions with > 20 mares, the artificial insemination dose showed a correlation coefficient of -0.58 (p < 0.05). In Exp. 2, the HOST immediately after thawing showed a negative correlation with foaling rate (p < 0.05). No single test was consistently reliable for predicting the fertilizing capacity of semen, since the 2 experiments yielded conflicting results, although the same stallions sometimes participated in both.
This shows the difficulty of frozen semen quality control in commercially produced stallion semen, and on the other hand, the difficulty of conducting fertility trials in horses. PMID:16987393

  4. Integrated instrument for dynamic light scattering and natural fluorescence measurements

    NASA Astrophysics Data System (ADS)

    Rovati, Luigi; Pollonini, Luca; Ansari, Rafat R.

    2001-06-01

    Over the past two decades, great efforts have been made in ophthalmology to use optical techniques based on dynamic light scattering and tissue natural fluorescence for early (at molecular level) diagnosis of ocular pathologies. In our previous studies, the relationship between corneal autofluorescence (AF) and dynamic light scattering (DLS) decay widths of ocular tissues was established by performing measurements on diabetes mellitus patients. In those studies, corneal AF mean intensities were significantly correlated with DLS decay width measurements for each diabetic retinopathy grade in the vitreous and in the cornea. This suggested that the quality of the diagnosis could be significantly improved by properly combining these two powerful techniques into a single instrument. Our approach is based on modifying a commercial scanning ocular fluorometer (Fluorotron Master, Ocumetrics Inc., CA, USA) to include both techniques in the same scanning unit. This configuration provides both DLS and AF real-time measurements from the same ocular volume: they can be located in each section of the optical axis of the eye from the cornea to the retina. In this paper, the optical setup of the new system is described and preliminary in-vitro and in-vivo measurements are presented.

  5. Panax ginseng induces the expression of CatSper genes and sperm hyperactivation

    PubMed Central

    Park, Eun Hwa; Kim, Do Rim; Kim, Ha Young; Park, Seong Kyu; Chang, Mun Seog

    2014-01-01

    The cation channel of sperm (CatSper) protein family plays important roles in male reproduction and infertility. The four members of this family are expressed exclusively in the testis and are localized differently in sperm. To investigate the effects of Panax ginseng treatment on the expression of CatSper genes and sperm hyperactivation in male mice, sperm motility and CatSper gene expression were assessed using a computer-assisted semen analysis system, a Fluoroskan Ascent microplate fluorometer to assess Ca2+ influx, real-time polymerase chain reaction, Western blotting and immunofluorescence. The results suggested that the Ca2+ levels of sperm cells treated with P. ginseng were increased significantly compared with the normal group. The P. ginseng-treated groups showed increased sperm motility parameters, such as the curvilinear velocity and amplitude of lateral head displacement. Taken together, the data suggest that CatSper messenger ribonucleic acid levels were increased significantly in mouse testes in the P. ginseng-treated group, as was the protein level, with the exception of CatSper2. In conclusion, P. ginseng plays an important role in improving sperm hyperactivation via CatSper gene expression. PMID:24969054

  6. Measurement of discharge using tracers

    USGS Publications Warehouse

    Kilpatrick, Frederick A.; Cobb, Ernest D.

    1984-01-01

    The development of fluorescent dyes and fluorometers that can measure these dyes at very low concentrations has made dye-dilution methods practical for measuring discharge. These methods are particularly useful for determining discharge under certain flow conditions that are unfavorable for current-meter measurements. These include small streams, canals, and pipes where 1. Turbulence is excessive for current-meter measurement but conducive to good mixing. 2. Moving rocks and debris are damaging to any instruments placed in the flow. 3. Cross-sectional areas or velocities are indeterminate or changing. 4. There are unsteady flows such as exist with storm-runoff events on small streams. 5. The flow is physically inaccessible or unsafe. From a practical standpoint, such measurements are limited primarily to small streams because of the excessively long channel mixing lengths required for larger streams. Very good accuracy can be obtained provided that 1. Adequate mixing length and time are allowed. 2. Careful field and laboratory techniques are employed. 3. Dye losses are not significant. This manual describes the slug-injection and constant-rate injection methods of performing tracer-dilution measurements. Emphasis is on the use of fluorescent dyes as tracers and the equipment, field methods, and laboratory procedures for performing such measurements. The tracer-velocity method is also briefly discussed.
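    The dilution equations themselves are not reproduced in the abstract above; as a minimal sketch of the constant-rate-injection method it describes, discharge follows from the injection rate and the measured dilution of the tracer (the function name and the numbers in the note below are illustrative, not taken from the manual):

```python
def discharge_constant_rate(q_inj, c_inj, c_plateau, c_background=0.0):
    """Stream discharge from a constant-rate tracer injection.

    q_inj        injection rate of the tracer solution (L/s)
    c_inj        tracer concentration of the injected solution (ug/L)
    c_plateau    measured downstream plateau concentration (ug/L)
    c_background natural background fluorescence of the stream (ug/L)

    Mass balance at the plateau: q_inj*c_inj + Q*c_background
    = (Q + q_inj)*c_plateau, rearranged for Q.
    """
    return q_inj * (c_inj - c_plateau) / (c_plateau - c_background)
```

For example, injecting 0.01 L/s of dye at 10^6 ug/L and measuring a 10 ug/L plateau over negligible background gives a discharge of roughly 1000 L/s. The tracer is diluted by about five orders of magnitude here, which is why fluorometers able to measure dyes at very low concentrations made these methods practical.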

  7. Determination of ochratoxin A in wine by means of immunoaffinity and aminopropyl solid-phase column cleanup and fluorometric detection.

    PubMed

    Longobardi, Francesco; Iacovelli, Vito; Catucci, Lucia; Panzarini, Giuseppe; Pascale, Michelangelo; Visconti, Angelo; Agostiano, Angela

    2013-02-27

    A new analytical method for the determination of ochratoxin A (OTA) in red wine has been developed by using a double-extract cleanup and a fluorometric measurement after spectral deconvolution. Wine samples were diluted with a solution containing 1% polyethylene glycol and 5% sodium hydrogencarbonate, filtered, and purified by immunoaffinity and aminopropyl solid-phase column. OTA contents in the purified extract were determined by a spectrofluorometer (excitation wavelength, 330 nm; emission wavelength, 470 nm) after deconvolution of fluorescence spectra. Average recoveries from wine samples spiked with OTA at levels ranging from 0.5 to 3.0 ng/mL were 94.5-105.4% with relative standard deviations (RSD) of <15% (n = 4). The limit of detection (LOD) was 0.2 ng/mL, and the total time of analysis was 30 min. The developed method was tested on 18 red wine samples (naturally contaminated and spiked with OTA at levels ranging from 0.4 to 3.0 ng/mL) and compared with AOAC Official Method 2001.01, based on immunoaffinity column cleanup and HPLC with fluorescence detector. A good correlation (r(2) = 0.9765) was observed between OTA levels obtained with the two methods, highlighting the reliability of the proposed method, the main advantage of which is the simple OTA determination by a benchtop fluorometer with evident reductions of cost and time of analysis.

  8. Production of volatile organic compounds by cyanobacteria Synechococcus sp.

    NASA Astrophysics Data System (ADS)

    Hiraiwa, M.; Abe, M.; Hashimoto, S.

    2014-12-01

    Phytoplankton are known to produce volatile organic compounds (VOCs), which contribute to environmental problems such as global warming and decomposition of stratospheric ozone. Picophytoplankton, such as Prochlorococcus and Synechococcus, are distributed in freshwater and oceans worldwide, accounting for a large proportion of biomass and primary production in the open ocean. However, to date, little is known about the production of VOCs by picophytoplankton. In this study, VOC production by the cyanobacterium Synechococcus sp. (NIES-981) was investigated. Synechococcus sp. was obtained from the National Institute for Environmental Studies (NIES), Japan, and cultured at 24°C in autoclaved f/2-Si medium under 54 ± 3 µE m-2 s-1 (1 E = 1 mol of photons) with a 12-h light and 12-h dark cycle. VOC concentrations were determined using a purge-and-trap gas chromatograph-mass spectrometer (Agilent 5973). The concentrations of chlorophyll a (Chl a) were also determined using a fluorometer (Turner TD-700). Bromomethane (CH3Br) and isoprene were produced by Synechococcus sp. Isoprene production was similar to that of other phytoplankton species reported earlier. Isoprene was produced while Chl a was increasing in the early stage of the incubation period (5-15 days of incubation time, exponential phase), whereas CH3Br was produced as Chl a declined in the late stage of the incubation period (30-40 days of incubation time, death phase).

  9. What limits photosynthetic energy conversion efficiency in nature? Lessons from the oceans.

    PubMed

    Falkowski, Paul G; Lin, Hanzhi; Gorbunov, Maxim Y

    2017-09-26

    Constraining photosynthetic energy conversion efficiency in nature is challenging. In principle, the quantum yields of two of the three competing de-excitation pathways (photochemistry, fluorescence, and thermal dissipation) must be measured simultaneously. We constructed two different, extremely sensitive and precise active fluorometers: one measures the quantum yield of photochemistry from changes in variable fluorescence, the other measures fluorescence lifetimes in the picosecond time domain. By deploying the pair of instruments on eight transoceanic cruises over six years, we obtained over 200 000 measurements of fluorescence yields and lifetimes from surface waters in five ocean basins. Our results revealed that the average quantum yield of photochemistry was approximately 0.35 while the average quantum yield of fluorescence was approximately 0.07. Thus, closure on the energy budget suggests that, on average, approximately 58% of the photons absorbed by phytoplankton in the world oceans are dissipated as heat. This extraordinary inefficiency is associated with the paucity of nutrients in the upper ocean, especially dissolved inorganic nitrogen and iron. Our results strongly suggest that, in nature, most of the time, most of the phytoplankton community operates at approximately half of its maximal photosynthetic energy conversion efficiency because nutrients limit the synthesis or function of essential components in the photosynthetic apparatus. This article is part of the themed issue 'Enhancing photosynthesis in crop plants: targets for improvement'. © 2017 The Author(s).
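    The energy-budget closure in this abstract is simple arithmetic: the three fates of an absorbed photon (photochemistry, fluorescence, heat) have quantum yields that sum to one, so the heat fraction is the remainder. A minimal sketch (the function name is mine):

```python
def heat_dissipation_fraction(phi_photochemistry, phi_fluorescence):
    """Closure on the PS II energy budget: absorbed photons go to
    photochemistry, fluorescence, or heat, so the quantum yields of
    the three pathways sum to 1 and the heat fraction is the remainder."""
    return 1.0 - phi_photochemistry - phi_fluorescence

# Average yields reported in the abstract:
print(heat_dissipation_fraction(0.35, 0.07))  # about 0.58
```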

  10. A low cost, customizable turbidostat for use in synthetic circuit characterization.

    PubMed

    Takahashi, Chris N; Miller, Aaron W; Ekness, Felix; Dunham, Maitreya J; Klavins, Eric

    2015-01-16

    Engineered biological circuits are often disturbed by a variety of environmental factors. In batch culture, where the majority of synthetic circuit characterization occurs, environmental conditions vary as the culture matures. Turbidostats are powerful characterization tools that provide static culture environments; however, they are often expensive, especially when purchased in custom configurations, and are difficult to design and construct in a lab. Here, we present a low cost, open source multiplexed turbidostat that can be manufactured and used with minimal experience in electrical or software engineering. We demonstrate the utility of this system to profile synthetic circuit behavior in S. cerevisiae. We also demonstrate the flexibility of the design by showing that a fluorometer can be easily integrated.

  11. Comparison of calibration and standardization approaches for fluorescence guided imaging systems to benchtop fluorescence measurements in cellular systems

    NASA Astrophysics Data System (ADS)

    Litorja, Maritoni; DeRose, Paul

    2018-02-01

    Fluorescence measurements are a staple in biomedicine, from research and discovery to, more recently, fluorescence-guided imaging systems for diagnostics and surgery. Measurement validation for clinical imagers is a challenge because it must cover many different optical systems and probes, measuring through matrices with different optical properties in a demanding field environment. In this paper we will present approaches to fluorescence calibration for a field system, in comparison to those used in laboratory instruments for cell measurements or benchtop fluorometers. We will present the common challenges and differences, and lessons from the standardization effort for laboratory fluorescence measurements. We will discuss the conceptually different pathways to measurement traceability: counting moles of substance versus measuring light.

  12. Zooglider - an Autonomous Vehicle for Optical and Acoustic Sensing of Marine Zooplankton

    NASA Astrophysics Data System (ADS)

    Ohman, M. D.; Davis, R. E.; Sherman, J. T.; Grindley, K.; Whitmore, B. M.

    2016-02-01

    We will present results from early sea trials of the Zooglider, an autonomous zooplankton glider designed and built by the Instrument Development Group at Scripps. The Zooglider is built upon a modified Spray glider and includes a low power camera with telecentric lens and a custom dual frequency sonar (200/1000 kHz). The imaging system quantifies zooplankton as they flow through a sampling tunnel within a well-defined sampling volume. The maximum operating depth is 500 m. Other sensors include a pumped CTD and Chl-a fluorometer. The Zooglider permits in situ measurements of mesozooplankton distributions and three dimensional orientation in relation to other biotic and physical properties of the ocean water column. Zooglider development is supported by the Gordon and Betty Moore Foundation.

  13. Submersible optical sensors exposed to chemically dispersed crude oil: wave tank simulations for improved oil spill monitoring.

    PubMed

    Conmy, Robyn N; Coble, Paula G; Farr, James; Wood, A Michelle; Lee, Kenneth; Pegau, W Scott; Walsh, Ian D; Koch, Corey R; Abercrombie, Mary I; Miles, M Scott; Lewis, Marlon R; Ryan, Scott A; Robinson, Brian J; King, Thomas L; Kelble, Christopher R; Lacoste, Jordanna

    2014-01-01

    In situ fluorometers were deployed during the Deepwater Horizon (DWH) Gulf of Mexico oil spill to track the subsea oil plume. Uncertainties regarding instrument specifications and capabilities necessitated performance testing of sensors exposed to simulated, dispersed oil plumes. Dynamic ranges of the Chelsea Technologies Group AQUAtracka, Turner Designs Cyclops, Satlantic SUNA and WET Labs, Inc. ECO, exposed to fresh and artificially weathered crude oil, were determined. Sensors were standardized against known oil volumes and against total petroleum hydrocarbon and benzene-toluene-ethylbenzene-xylene measurements, both collected during spills, providing oil estimates during wave tank dilution experiments. All sensors estimated oil concentrations down to 300 ppb, refuting previous reports. Sensor performance results assist interpretation of DWH oil spill data and the formulation of future protocols.

  14. Brain physiological state evaluated by real-time multiparametric tissue spectroscopy in vivo

    NASA Astrophysics Data System (ADS)

    Mayevsky, Avraham; Barbiro-Michaely, Efrat; Kutai-Asis, Hofit; Deutsch, Assaf; Jaronkin, Alex

    2004-07-01

    The significance of normal mitochondrial function in cellular energy homeostasis, as well as its involvement in acute and chronic neurodegenerative disease, was reviewed recently (Nicholls & Budd. Physiol Rev. 80: 315-360, 2000). Nevertheless, monitoring of mitochondrial function in vivo and in real time has not been used by many investigators and is very rare in clinical practice. The principal tool available for the evaluation of mitochondrial function is the monitoring of NADH fluorescence. To interpret correctly the changes in NADH redox state in vivo, it is necessary to correlate this signal with other parameters reflecting O2 supply to the brain. Therefore, we have developed and applied a multiparametric optical monitoring system, by which microcirculatory blood flow and hemoglobin oxygenation are measured together with mitochondrial NADH fluorescence. Since the calibration of these signals is not in absolute units, the simultaneous monitoring provides a practical tool for the interpretation of brain functional state under various pathophysiological conditions. The monitoring system combines a time-sharing fluorometer-reflectometer for the measurement of NADH fluorescence and hemoglobin oxygenation, as well as a laser Doppler flowmeter for the recording of microcirculatory blood flow. A combined fiber optic probe was located on the surface of the brain using a skull-cemented cannula. Rats and gerbils were exposed to anoxia, ischemia, and spreading depression, and the functional state of the brain was evaluated. The results showed a clear correlation between O2 supply/demand and energy balance under the various pathophysiological conditions. This monitoring approach could be adapted to clinical monitoring of tissue vitality.

  15. Real-time imaging of hydrogen peroxide dynamics in vegetative and pathogenic hyphae of Fusarium graminearum.

    PubMed

    Mentges, Michael; Bormann, Jörg

    2015-10-08

    Balanced dynamics of reactive oxygen species in the phytopathogenic fungus Fusarium graminearum play key roles for development and infection. To monitor those dynamics, ratiometric analysis using the novel hydrogen peroxide (H2O2) sensitive fluorescent indicator protein HyPer-2 was established for the first time in phytopathogenic fungi. H2O2 changes the excitation spectrum of HyPer-2 with an excitation maximum at 405 nm for the reduced and 488 nm for the oxidized state, facilitating ratiometric readouts with maximum emission at 516 nm. HyPer-2 analyses were performed using a microtiter fluorometer and confocal laser scanning microscopy (CLSM). Addition of external H2O2 to mycelia caused a steep and transient increase in fluorescence excited at 488 nm. This can be reversed by the addition of the reducing agent dithiothreitol. HyPer-2 in F. graminearum is highly sensitive and specific to H2O2 even in tiny amounts. Hyperosmotic treatment elicited a transient internal H2O2 burst. Hence, HyPer-2 is suitable to monitor the intracellular redox balance. Using CLSM, developmental processes like nuclear division, tip growth, septation, and infection structure development were analyzed. The latter two processes imply marked accumulations of intracellular H2O2. Taken together, HyPer-2 is a valuable and reliable tool for the analysis of environmental conditions, cellular development, and pathogenicity.
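    The ratiometric HyPer-2 readout described above reduces to an intensity ratio: 516 nm emission under 488 nm excitation (oxidized sensor) divided by 516 nm emission under 405 nm excitation (reduced sensor). A minimal sketch, with hypothetical background-subtraction handling added by me:

```python
def hyper2_ratio(f488, f405, bg488=0.0, bg405=0.0):
    """Ratiometric HyPer-2 readout: 516 nm emission excited at 488 nm
    (oxidized state) over 516 nm emission excited at 405 nm (reduced
    state), after subtracting the respective channel backgrounds.
    A rising ratio indicates increasing intracellular H2O2."""
    return (f488 - bg488) / (f405 - bg405)
```

Because the readout is a ratio of two channels acquired from the same cells, it is largely insensitive to expression level and photobleaching, which is the point of ratiometric sensors.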

  16. Remote monitoring of chlorophyll fluorescence in two reef corals during the 2005 bleaching event at Lee Stocking Island, Bahamas

    NASA Astrophysics Data System (ADS)

    Manzello, D.; Warner, M.; Stabenau, E.; Hendee, J.; Lesser, M.; Jankulak, M.

    2009-03-01

    Zooxanthellae fluorescence was measured in situ, remotely, and in near real-time with a pulse amplitude modulated (PAM) fluorometer for a colony of Siderastrea siderea and Agaricia tenuifolia at Lee Stocking Island, Bahamas during the Caribbean-wide 2005 bleaching event. These colonies displayed evidence of photosystem II (PS II) inactivation coincident with thermal stress and seasonally high doses of solar radiation. Hurricane-associated declines in temperature and light appear to have facilitated the recovery of maximum quantum yield of PS II within these two colonies, although both corals responded differently to individual storms. PAM fluorometry, coupled with long-term measurement of in situ light and temperature, provides much more detail of coral photobiology on a seasonal time scale and during possible bleaching conditions than sporadic, subjective, and qualitative observations. S. siderea displayed evidence of PS II inactivation over a month prior to the issuing of a satellite-based, sea surface temperature (SST) bleaching alert by the National Oceanic and Atmospheric Administration (NOAA). In fact, recovery had already begun in S. siderea when the bleaching alert was issued. Fluorescence data for A. tenuifolia were difficult to interpret because the shaded parts of a colony were monitored and thus did not perfectly coincide with thermal stress and seasonally high doses of solar radiation as in S. siderea. These results further emphasize the limitations of solely monitoring SST (satellite or in situ) as a bleaching indicator without considering the physiological status of coral-zooxanthellae symbioses.

  17. Challenges and Early Results: Interactive benthic experiments in hydrate environments of Barkley Canyon, NEPTUNE Canada.

    NASA Astrophysics Data System (ADS)

    Best, M.; Thomsen, L.; de Beer, D.

    2012-04-01

    NEPTUNE Canada, operating and online since 2009, is an 800km, 5-node, regional cabled ocean network across the northern Juan de Fuca Plate, northeastern Pacific, part of the Ocean Networks Canada Observatory. One of 15 study areas is an environment of exposed hydrate mounds along the wall of Barkley Canyon, at ~865m water depth. This is the home of a benthic crawler developed by Jacobs University of Germany, which is affectionately known as Wally. Wally is equipped with a range of sensors including a camera, methane sensor, current meter, fluorometer, turbidity meter, CTD, and a sediment microprofiler developed at the Max Planck Institute with probes for oxygen, methane, sulphide, pH, temperature, and conductivity. In conjunction with this sensor suite, a series of experiments have been designed to assess the cycling of biogenic carbon and carbonate in this complex environment. The biota range from microbes, to molluscs, to large fish, and therefore the carbon inputs include both a range of organic carbon compounds as well as the complex materials that are "biogenic carbonate". Controlled experimental specimens of biogenic carbonate (Mytilus edulis fresh shells) and cellulose (pieces of untreated pine lumber), previously well characterized (photographed, weighed, and numbered, with matching valves and lumber kept as controls), were deployed. Deployment at the sediment/water interface was done in such a way as to maximize natural burial-exhumation cycles but to minimize specimen interaction. 10 replicate specimens of each material were deployed in two treatments: 1) adjacent to a natural life and death assemblage of chemosynthetic bivalves and exposed hydrate on a hydrate mound and 2) on the muddy seafloor at a distance from the mound. 
In order to quantify and track the rates and processes of modification of the natural materials, and their possible environmental/ecological correlates, observations of the experimental specimens are being made on a regular basis using the crawler camera and sensors. On retrieval, the specimens will be further studied for net material loss, surface alteration, microbial recruitment, endo- and epibionts, and microstructural and chemical modification. Coordinating hardware, software, and people is complex and challenging, and meeting that challenge is what makes frequent and timely observation of these poorly known processes possible. Understanding the production and cycling of carbon across the sediment/water interface in this environment will help elucidate the formation and evolution of these hydrate deposits, their distribution through time, and the ecological and taphonomic feedbacks they generate.

  18. Measurement of discharge using tracers

    USGS Publications Warehouse

    Kilpatrick, F.A.; Cobb, Ernest D.

    1985-01-01

    The development of fluorescent dyes and fluorometers that can measure these dyes at very low concentrations has made dye-dilution methods practical for measuring discharge. These methods are particularly useful for determining discharge under certain flow conditions that are unfavorable for current meter measurements. These include small streams, canals, and pipes where 1. Turbulence is excessive for current-meter measurement but conducive to good mixing. 2. Moving rocks and debris may damage instruments placed in the flow. 3. Cross-sectional areas or velocities are indeterminate or changing. 4. The flow is unsteady, such as the flow that exists with storm-runoff events on small streams and urban storm-sewer systems. 5. The flow is physically inaccessible or unsafe. From a practical standpoint, such methods are limited primarily to small streams, because of the excessively long channel-mixing lengths required for larger streams. Very good accuracy can be obtained provided that 1. Adequate mixing length and time are allowed. 2. Careful field and laboratory techniques are used. 3. Dye losses are not significant. This manual describes the slug-injection and constant-rate injection methods of performing tracer-dilution measurements. Emphasis is on the use of fluorescent dyes as tracers and the equipment, field methods, and laboratory procedures for performing such measurements. The tracer-velocity method is also briefly discussed.

  19. Assessment of wavelength-dependent parameters of photosynthetic electron transport with a new type of multi-color PAM chlorophyll fluorometer.

    PubMed

    Schreiber, Ulrich; Klughammer, Christof; Kolbowski, Jörg

    2012-09-01

    Technical features of a novel multi-color pulse amplitude modulation (PAM) chlorophyll fluorometer as well as the applied methodology and some typical examples of its practical application with suspensions of Chlorella vulgaris and Synechocystis PCC 6803 are presented. The multi-color PAM provides six colors of pulse-modulated measuring light (peak-wavelengths at 400, 440, 480, 540, 590, and 625 nm) and six colors of actinic light (AL), peaking at 440, 480, 540, 590, 625 and 420-640 nm (white). The AL can be used for continuous illumination, maximal intensity single-turnover pulses, high intensity multiple-turnover pulses, and saturation pulses. In addition, far-red light (peaking at 725 nm) is provided for preferential excitation of PS I. Analysis of the fast fluorescence rise kinetics in saturating light allows determination of the wavelength- and sample-specific functional absorption cross section of PS II, Sigma(II)(λ), with which the PS II turnover rate at a given incident photosynthetically active radiation (PAR) can be calculated. Sigma(II)(λ) is defined for a quasi-dark reference state, thus differing from σ(PSII) used in limnology and oceanography. Vastly different light response curves for Chlorella are obtained with light of different colors, when the usual PAR-scale is used. Based on Sigma(II)(λ) the PAR, in units of μmol quanta/(m(2) s), can be converted into PAR(II) (in units of PS II effective quanta/s) and a fluorescence-based electron transport rate ETR(II) = PAR(II) · Y(II)/Y(II)(max) can be defined. ETR(II) in contrast to rel.ETR qualifies for quantifying the absolute rate of electron transport in optically thin suspensions of unicellular algae and cyanobacteria. Plots of ETR(II) versus PAR(II) for Chlorella are almost identical using either 440 or 625 nm light. 
Photoinhibition data are presented suggesting that a lower value of ETR(II)(max) with 440 nm possibly reflects photodamage via absorption by the Mn-cluster of the oxygen-evolving complex.
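    The rate definitions in this abstract compose directly: PAR(II) = PAR · Sigma(II)(λ) converts incident quanta into PS II effective quanta, and ETR(II) scales that by the ratio of actual to maximal PS II yield. A minimal sketch (the example values in the tests are illustrative, not from the paper):

```python
def etr_ii(par, sigma_ii, y_ii, y_ii_max):
    """Absolute PS II electron transport rate as defined in the abstract:
    PAR(II) = PAR * Sigma(II)(lambda), in PS II effective quanta/s, and
    ETR(II) = PAR(II) * Y(II) / Y(II)max.

    par       incident PAR (umol quanta m-2 s-1)
    sigma_ii  wavelength- and sample-specific functional absorption
              cross section of PS II (quasi-dark reference state)
    y_ii      effective PS II quantum yield at that PAR
    y_ii_max  maximal PS II quantum yield
    """
    par_ii = par * sigma_ii  # PS II effective quanta/s
    return par_ii * (y_ii / y_ii_max)
```

Because Sigma(II)(λ) absorbs the wavelength dependence, ETR(II) computed this way is comparable across colors of actinic light, which is how the abstract's near-identical 440 nm and 625 nm curves for Chlorella arise.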

  20. Vibrio Zinc-Metalloprotease Causes Photoinactivation of Coral Endosymbionts and Coral Tissue Lesions

    PubMed Central

    Sussman, Meir; Mieog, Jos C.; Doyle, Jason; Victor, Steven; Willis, Bette L.; Bourne, David G.

    2009-01-01

    Background Coral diseases are emerging as a serious threat to coral reefs worldwide. Of nine coral infectious diseases, whose pathogens have been characterized, six are caused by agents from the family Vibrionacae, raising questions as to their origin and role in coral disease aetiology. Methodology/Principal Findings Here we report on a Vibrio zinc-metalloprotease causing rapid photoinactivation of susceptible Symbiodinium endosymbionts followed by lesions in coral tissue. Symbiodinium photosystem II inactivation was diagnosed by an imaging pulse amplitude modulation fluorometer in two bioassays, performed by exposing Symbiodinium cells and coral juveniles to non-inhibited and EDTA-inhibited supernatants derived from coral white syndrome pathogens. Conclusion/Significance These findings demonstrate a common virulence factor from four phylogenetically related coral pathogens, suggesting that zinc-metalloproteases may play an important role in Vibrio pathogenicity in scleractinian corals. PMID:19225559

  1. Deep Sea Shell Taphonomy: Interactive benthic experiments in hydrate environments of Barkley Canyon, Ocean Networks Canada.

    NASA Astrophysics Data System (ADS)

    Best, Mairi; Purser, Autun

    2015-04-01

    In order to quantify and track the rates and processes of modification of biogenic carbonate in gas hydrate environments, and their possible environmental/ecological correlates, ongoing observations of experimentally deployed specimens are being made using a remotely controlled crawler with camera and sensors. The crawler is connected to NEPTUNE Canada, an 800km, 5-node, regional cabled ocean network across the northern Juan de Fuca Plate, northeastern Pacific, part of Ocean Networks Canada. One of 15 study areas is an environment of exposed hydrate mounds along the wall of Barkley Canyon, at ~865m water depth. This is the home of a benthic crawler developed by Jacobs University of Germany, which is affectionately known as Wally. Wally is equipped with a range of sensors including cameras, methane sensor, current meter, fluorometer, turbidity meter, CTD, and a sediment microprofiler with probes for oxygen, methane, sulphide, pH, temperature, and conductivity. In conjunction with this sensor suite, a series of experiments have been designed to assess the cycling of biogenic carbon and carbonate in this complex environment. The biota range from microbes, to molluscs, to large fish, and therefore the carbon inputs include both a range of organic carbon compounds as well as the complex materials that are "biogenic carbonate". Controlled experimental specimens of biogenic carbonate (Mytilus edulis fresh shells) and cellulose (pieces of untreated pine lumber), previously well characterized (photographed, weighed, and numbered, with matching valves and lumber kept as controls), were deployed. Deployment at the sediment/water interface was done in such a way as to maximize natural burial-exhumation cycles but to minimize specimen interaction. 
10 replicate specimens of each material were deployed in two treatments: 1) adjacent to a natural life and death assemblage of chemosynthetic bivalves and exposed hydrate on a hydrate mound and 2) on the muddy seafloor at a distance from the mound. On retrieval, the specimens are being further studied for net material loss, surface alteration, microbial recruitment, endo- and epibionts, and microstructural and chemical modification. Understanding the production and cycling of carbon across the sediment/water interface in this environment will help elucidate the formation and evolution of these hydrate deposits, their distribution through time, and the ecological and taphonomic feedbacks they generate.

  2. Why didn't Box-Jenkins win (again)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pack, D.J.; Downing, D.J.

    This paper focuses on the forecasting performance of the Box-Jenkins methodology applied to the 111 time series of the Makridakis competition. It considers the influence of the following factors: (1) time series length, (2) time series information (autocorrelation) content, (3) time series outliers or structural changes, (4) averaging results over time series, and (5) choice of forecast time origin. It is found that the 111 time series contain substantial numbers of very short series, series with obvious structural change, and series whose histories are relatively uninformative. If these series are typical of those that one must face in practice, the real message of the competition is that univariate time series extrapolations will frequently fail regardless of the methodology employed to produce them.

  3. Multifractal analysis of visibility graph-based Ito-related connectivity time series.

    PubMed

    Czechowski, Zbigniew; Lovallo, Michele; Telesca, Luciano

    2016-02-01

    In this study, we investigate multifractal properties of connectivity time series resulting from the visibility graph applied to normally distributed time series generated by the Ito equations with multiplicative power-law noise. We show that multifractality of the connectivity time series (i.e., the series of numbers of links outgoing from each node) increases with the exponent of the power-law noise. The multifractality of the connectivity time series could be due to the width of the connectivity degree distribution, which can be related to the exit time of the associated Ito time series. Furthermore, the connectivity time series are characterized by persistence, although the original Ito time series are random; this is due to the visibility graph procedure which, in connecting the values of the time series, generates persistence but destroys most of the nonlinear correlations. Moreover, the visibility graph is sensitive in detecting wide "depressions" in the input time series.
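
    The connectivity (degree) series discussed above can be extracted directly from a time series. Below is a minimal, dependency-free sketch assuming the standard natural-visibility criterion (two samples are linked when every intermediate sample lies strictly below the straight line joining them); the function name and toy series are illustrative, not taken from the paper.

```python
def visibility_degrees(y):
    """Degree of each node in the natural visibility graph of series y."""
    n = len(y)
    deg = [0] * n
    for a in range(n):
        for b in range(a + 1, n):
            # (a, b) are linked if every intermediate sample lies strictly
            # below the straight line joining (a, y[a]) and (b, y[b]).
            visible = all(
                y[c] < y[b] + (y[a] - y[b]) * (b - c) / (b - a)
                for c in range(a + 1, b)
            )
            if visible:
                deg[a] += 1
                deg[b] += 1
    return deg

print(visibility_degrees([3.0, 1.0, 2.0, 0.5, 4.0]))  # -> [3, 2, 4, 2, 3]
```

    The naive double loop is cubic in the worst case; faster algorithms exist, but this form makes the visibility criterion explicit.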

  4. Regenerating time series from ordinal networks.

    PubMed

    McCullough, Michael; Sakellariou, Konstantinos; Stemler, Thomas; Small, Michael

    2017-03-01

    Recently proposed ordinal networks not only afford novel methods of nonlinear time series analysis but also constitute stochastic approximations of the deterministic flow time series from which the network models are constructed. In this paper, we construct ordinal networks from discrete sampled continuous chaotic time series and then regenerate new time series by taking random walks on the ordinal network. We then investigate the extent to which the dynamics of the original time series are encoded in the ordinal networks and retained through the process of regenerating new time series by using several distinct quantitative approaches. First, we use recurrence quantification analysis on traditional recurrence plots and order recurrence plots to compare the temporal structure of the original time series with random walk surrogate time series. Second, we estimate the largest Lyapunov exponent from the original time series and investigate the extent to which this invariant measure can be estimated from the surrogate time series. Finally, estimates of correlation dimension are computed to compare the topological properties of the original and surrogate time series dynamics. Our findings show that ordinal networks constructed from univariate time series data constitute stochastic models which approximate important dynamical properties of the original systems.
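
    The two steps the abstract describes can be sketched as follows: map sliding windows to ordinal (rank-order) patterns with a directed edge for each observed transition, then regenerate a surrogate symbol sequence by a random walk on the resulting network. The embedding dimension, helper names, and the logistic-map example are illustrative assumptions; recovering an actual time series additionally requires mapping symbols back to values, which is omitted here.

```python
import random
from collections import defaultdict

def ordinal_network(x, m=3):
    """Nodes are ordinal patterns of length m; each observed transition
    between consecutive patterns is stored as a directed edge."""
    patterns = [tuple(sorted(range(m), key=lambda k: x[i + k]))
                for i in range(len(x) - m + 1)]
    edges = defaultdict(list)
    for a, b in zip(patterns, patterns[1:]):
        edges[a].append(b)  # multiset: repeats encode transition weights
    return patterns, edges

def regenerate(patterns, edges, steps, seed=0):
    """Surrogate symbol sequence: a random walk on the ordinal network,
    picking successors in proportion to observed transition counts."""
    rng = random.Random(seed)
    node, walk = patterns[0], [patterns[0]]
    for _ in range(steps):
        if not edges[node]:
            break  # absorbing node: pattern with no observed successor
        node = rng.choice(edges[node])
        walk.append(node)
    return walk

# Toy input: a short chaotic series from the logistic map x -> 4x(1-x).
x = [0.4]
for _ in range(19):
    x.append(4 * x[-1] * (1 - x[-1]))
patterns, edges = ordinal_network(x, m=3)
walk = regenerate(patterns, edges, steps=10)
```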

  5. Regenerating time series from ordinal networks

    NASA Astrophysics Data System (ADS)

    McCullough, Michael; Sakellariou, Konstantinos; Stemler, Thomas; Small, Michael

    2017-03-01

    Recently proposed ordinal networks not only afford novel methods of nonlinear time series analysis but also constitute stochastic approximations of the deterministic flow time series from which the network models are constructed. In this paper, we construct ordinal networks from discrete sampled continuous chaotic time series and then regenerate new time series by taking random walks on the ordinal network. We then investigate the extent to which the dynamics of the original time series are encoded in the ordinal networks and retained through the process of regenerating new time series by using several distinct quantitative approaches. First, we use recurrence quantification analysis on traditional recurrence plots and order recurrence plots to compare the temporal structure of the original time series with random walk surrogate time series. Second, we estimate the largest Lyapunov exponent from the original time series and investigate the extent to which this invariant measure can be estimated from the surrogate time series. Finally, estimates of correlation dimension are computed to compare the topological properties of the original and surrogate time series dynamics. Our findings show that ordinal networks constructed from univariate time series data constitute stochastic models which approximate important dynamical properties of the original systems.

  6. GPS Position Time Series @ JPL

    NASA Technical Reports Server (NTRS)

    Owen, Susan; Moore, Angelyn; Kedar, Sharon; Liu, Zhen; Webb, Frank; Heflin, Mike; Desai, Shailen

    2013-01-01

    Different flavors of GPS time series analysis at JPL all use the same GPS Precise Point Positioning Analysis raw time series; variations in time series analysis/post-processing are driven by different users.
    - JPL Global Time Series/Velocities: researchers studying the reference frame, combining with VLBI/SLR/DORIS.
    - JPL/SOPAC Combined Time Series/Velocities: crustal deformation for tectonic, volcanic, and ground water studies.
    - ARIA Time Series/Coseismic Data Products: hazard monitoring and response focused.
    - The ARIA data system is designed to integrate GPS and InSAR: GPS tropospheric delay is used for correcting InSAR, and Caltech's GIANT time series analysis uses GPS to correct orbital errors in InSAR. Zhen Liu is talking tomorrow on InSAR time series analysis.

  7. A general framework for time series data mining based on event analysis: application to the medical domains of electroencephalography and stabilometry.

    PubMed

    Lara, Juan A; Lizcano, David; Pérez, Aurora; Valente, Juan P

    2014-10-01

    There are now domains where information is recorded over a period of time, leading to sequences of data known as time series. In many domains, like medicine, time series analysis requires focusing on certain regions of interest, known as events, rather than analyzing the whole time series. In this paper, we propose a framework for knowledge discovery in both one-dimensional and multidimensional time series containing events. We show how our approach can be used to classify medical time series by means of a process that identifies events in time series, generates time series reference models of representative events, and compares two time series by analyzing the events they have in common. We have applied our framework to time series generated in the areas of electroencephalography (EEG) and stabilometry. Framework performance was evaluated in terms of classification accuracy, and the results confirmed that the proposed schema has potential for classifying EEG and stabilometric signals. The proposed framework is useful for discovering knowledge from medical time series containing events, such as stabilometric and electroencephalographic time series. These results would be equally applicable to other medical domains generating iconographic time series, such as, for example, electrocardiography (ECG). Copyright © 2014 Elsevier Inc. All rights reserved.

  8. Detection of a sudden change of the field time series based on the Lorenz system.

    PubMed

    Da, ChaoJiu; Li, Fang; Shen, BingLu; Yan, PengCheng; Song, Jian; Ma, DeShan

    2017-01-01

    We conducted an exploratory study of the detection of a sudden change of the field time series based on the numerical solution of the Lorenz system. First, the time when the Lorenz path jumped between the regions on the left and right of the equilibrium point of the Lorenz system was quantitatively marked, and the sudden change time of the Lorenz system was obtained. Second, the numerical solution of the Lorenz system was regarded as a vector; thus, this solution could be considered a vector time series. We transformed the vector time series into a scalar time series using the vector inner product, considering the geometric and topological features of the Lorenz system path. Third, the sudden change of the resulting time series was detected using the sliding t-test method. Comparing the test results with the quantitatively marked time indicated that the method could detect every sudden change of the Lorenz path; thus, the method is effective. Finally, we used the method to detect sudden changes in the pressure field time series and temperature field time series, and obtained good results for both, which indicates that the method can be applied to high-dimensional vector time series. Mathematically, there is no essential difference between field time series and vector time series; thus, we provide a new method for the detection of sudden changes in field time series.
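
    The sliding t-test used in the third step of the abstract can be sketched as below: at each candidate point, a two-sample t statistic compares the sub-series immediately before and after it, and a sudden change shows up as a large |t|. This is a minimal illustration only; the window length, the toy step series, and the equal-variance formula are assumptions, not the authors' exact configuration.

```python
from statistics import mean, stdev

def sliding_t(series, window):
    """Two-sample t statistic (equal sample sizes) between the `window`
    points before and after each candidate change point."""
    t_vals = {}
    for i in range(window, len(series) - window):
        a = series[i - window:i]
        b = series[i:i + window]
        pooled = ((stdev(a) ** 2 + stdev(b) ** 2) / window) ** 0.5
        t_vals[i] = (mean(a) - mean(b)) / pooled if pooled else float("inf")
    return t_vals

# Oscillating series with an abrupt level shift at index 10.
series = [0, 1] * 5 + [5, 6] * 5
t = sliding_t(series, window=4)
change_point = max(t, key=lambda i: abs(t[i]))  # index 10
```

    In practice the statistic would be compared against a significance threshold rather than simply maximized.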

  9. Volatility of linear and nonlinear time series

    NASA Astrophysics Data System (ADS)

    Kalisky, Tomer; Ashkenazy, Yosef; Havlin, Shlomo

    2005-07-01

    Previous studies indicated that nonlinear properties of Gaussian distributed time series with long-range correlations, ui, can be detected and quantified by studying the correlations in the magnitude series ∣ui∣, the “volatility.” However, the origin of this empirical observation still remains unclear and the exact relation between the correlations in ui and the correlations in ∣ui∣ is still unknown. Here we develop analytical relations between the scaling exponent of a linear series ui and that of its magnitude series ∣ui∣. Moreover, we find that nonlinear time series exhibit stronger (or the same) correlations in the magnitude time series compared with linear time series with the same two-point correlations. Based on these results we propose a simple model that generates multifractal time series by explicitly inserting long-range correlations in the magnitude series; the nonlinear multifractal time series is generated by multiplying a long-range correlated time series (that represents the magnitude series) with an uncorrelated time series [that represents the sign series sgn(ui)]. We apply our techniques to daily deep ocean temperature records from the equatorial Pacific, the region of the El-Niño phenomenon, and find: (i) long-range correlations from several days to several years with 1/f power spectrum, (ii) significant nonlinear behavior as expressed by long-range correlations of the volatility series, and (iii) a broad multifractal spectrum.
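
    The generating model in the abstract, multiplying a long-range correlated magnitude series by an uncorrelated sign series, can be sketched as below. To keep the sketch dependency-free, the long-range correlated magnitudes are stood in for by a strongly persistent AR(1) process; a faithful implementation would instead use, e.g., Fourier-filtered noise with a prescribed scaling exponent. All names and parameters are illustrative.

```python
import random

def correlated_magnitudes(n, phi=0.9, seed=1):
    """Positively correlated magnitude series: |AR(1)| with persistence phi.
    This is only a crude stand-in for a true long-range correlated series."""
    rng = random.Random(seed)
    x, out = 0.0, []
    for _ in range(n):
        x = phi * x + rng.gauss(0, 1)
        out.append(abs(x) + 1e-9)  # keep magnitudes strictly positive
    return out

def sign_magnitude_series(n, seed=1):
    """Multiply correlated magnitudes by uncorrelated random signs."""
    rng = random.Random(seed + 1)  # independent stream for the signs
    mags = correlated_magnitudes(n, seed=seed)
    return [m * rng.choice([-1.0, 1.0]) for m in mags]

series = sign_magnitude_series(1000)
```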

  10. Duality between Time Series and Networks

    PubMed Central

    Campanharo, Andriana S. L. O.; Sirer, M. Irmak; Malmgren, R. Dean; Ramos, Fernando M.; Amaral, Luís A. Nunes

    2011-01-01

    Studying the interaction between a system's components and the temporal evolution of the system are two common ways to uncover and characterize its internal workings. Recently, several maps from a time series to a network have been proposed with the intent of using network metrics to characterize time series. Although these maps demonstrate that different time series result in networks with distinct topological properties, it remains unclear how these topological properties relate to the original time series. Here, we propose a map from a time series to a network with an approximate inverse operation, making it possible to use network statistics to characterize time series and time series statistics to characterize networks. As a proof of concept, we generate an ensemble of time series ranging from periodic to random and confirm that much of the information encoded in the original time series (or networks) is retained after application of the map (or its inverse). Our results suggest that network analysis can be used to distinguish different dynamic regimes in time series and, perhaps more importantly, that time series analysis can provide a powerful set of tools that augment the traditional network analysis toolkit to quantify networks in new and useful ways. PMID:21858093
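
    One concrete way to realize a time-series-to-network map with an approximate inverse is a quantile-based map: each sample becomes the node of its quantile, consecutive samples define weighted directed edges, and a random walk on the transition matrix (sampling values within each visited quantile) approximately inverts the map. The sketch below builds only the forward map; the quantile count and toy series are assumptions, and this is in the spirit of, not identical to, the paper's construction.

```python
def quantile_symbols(x, q=4):
    """Assign each sample the index of its quantile (its network node)."""
    order = sorted(range(len(x)), key=lambda i: x[i])
    sym = [0] * len(x)
    for rank, i in enumerate(order):
        sym[i] = rank * q // len(x)  # equal-count bins by rank
    return sym

def transition_counts(sym, q=4):
    """Weighted adjacency matrix: counts of consecutive-sample transitions."""
    counts = [[0] * q for _ in range(q)]
    for a, b in zip(sym, sym[1:]):
        counts[a][b] += 1
    return counts

sym = quantile_symbols([1, 2, 3, 4, 1, 2, 3, 4], q=4)
adj = transition_counts(sym, q=4)
```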

  11. Rapid chlorophyll a fluorescence transient of Lemna gibba leaf as an indication of light and hydroxylamine effect on photosystem II activity.

    PubMed

    Dewez, David; Ali, Nadia Ait; Perreault, François; Popovic, Radovan

    2007-05-01

    Rapid chlorophyll fluorescence transients induced by a saturating flash (3000 micromol of photons m-2 s-1) were investigated when Lemna gibba had been exposed either to light (100 micromol of photons m-2 s-1) causing the Kautsky effect or to a low light intensity unable to trigger PSII photochemistry. Measurements were made simultaneously with a pulse amplitude modulated fluorometer and a plant efficiency analyzer system, either on non-treated L. gibba leaves or on leaves treated with different concentrations of hydroxylamine (1-50 mM) causing gradual inhibition of the water-splitting system. When a leaf was exposed to continuous light during the Kautsky effect, the rapid fluorescence transient may reflect the current activity of photosystem II within the photosystem II complex. Under those conditions, the variation of transition steps appearing over time was related to a drastic change in the functional properties of photosystem II, indicating that energy dissipation through non-photochemical pathways was undergoing extreme change. The change of the rapid fluorescence transient induced under continuous light, when compared to that obtained under very low light intensity, confirmed that photosystem II is capable of rapid adaptation lasting about two minutes. When the water-splitting system was inhibited and electron donation partially substituted by hydroxylamine, the ability of photosystem II to adapt to different light conditions was lost. In this study, the change of rapid fluorescence kinetics and transients appearing over time was shown to be a good indication of changes in the functional properties of photosystem II induced either by light or by hydroxylamine.

  12. Synthesis of praseodymium-ion-doped perovskite nanophosphor in supercritical water

    NASA Astrophysics Data System (ADS)

    Hakuta, Yukiya; Sue, Kiwamu; Takashima, Hiroshi

    2018-05-01

    We report the synthesis of praseodymium-doped calcium strontium titanate nanoparticles, (Ca0.6Sr0.4)0.997Pr0.002TiO3 (PCSTO), using hydrothermal synthesis under supercritical water conditions and the production of red luminescence. Starting solutions were prepared by dissolving calcium nitrate, strontium nitrate, titanium hydroxide sols, and praseodymium nitrate in distilled water. We investigated the effect of the reaction temperature, concentration, and pH of the starting solution on the luminescence properties. Synthesis was conducted at temperatures of 200 °C–400 °C, a reaction pressure of 30 MPa, and for reaction times of 4–20 s. The Pr concentration was set to 0.2 mol% relative to the (Ca0.6Sr0.4) ions. We also investigated the effect of high temperature annealing on the luminescence properties of the PCSTO nanoparticles. Particle characteristics were evaluated using x-ray diffraction, a scanning transmission electron microscope (STEM) equipped with an energy-dispersive x-ray spectrometer, and a fluorometer. Single-phase perovskite particles were obtained at hydrothermal reaction temperatures of over 300 °C even for a reaction time of several seconds. STEM images showed that the particles had cubic-like shapes with diameters of 8–13 nm and that they were chemically homogeneous. The PCSTO nanoparticles exhibited sharp red luminescence at 612 nm corresponding to the f–f transition of Pr3+ ions. Moreover, annealing at 1000 °C led to particle growth, achieving diameters of 40 nm and an increase in the quantum efficiency to around 12.0%.

  13. Multifunctional System-on-Glass for Lab-on-Chip applications.

    PubMed

    Petrucci, G; Caputo, D; Lovecchio, N; Costantini, F; Legnini, I; Bozzoni, I; Nascetti, A; de Cesare, G

    2017-07-15

    Lab-on-Chip devices are miniaturized systems able to perform biomolecular analyses in a shorter time and with lower reagent consumption than a standard laboratory. Their miniaturization, however, constrains the multiple functions that biochemical procedures require. In order to address this issue, our paper presents, for the first time, the integration on a single glass substrate of different thin film technologies to develop a multifunctional platform suitable for on-chip thermal treatments and on-chip detection of biomolecules. The proposed System-on-Glass hosts thin metal films acting as heating sources; hydrogenated amorphous silicon diodes acting both as temperature sensors to monitor the temperature distribution and as photosensors for on-chip detection; and a ground plane ensuring that the heater operation does not affect the photodiode currents. The sequence of the technological steps, the deposition temperatures of the thin films, and the parameters of the photolithographic processes have been optimized in order to overcome all the issues of the technological integration. The device has been designed, fabricated, and tested for the implementation of DNA amplification through the Polymerase Chain Reaction (PCR) with thermal cycling among three different temperatures on a single site. The glass has been connected to an electronic system that drives the heaters and controls the temperature and light sensors. It has been optically and thermally coupled with another glass hosting a microfluidic network made in polydimethylsiloxane that includes thermally actuated microvalves and a PCR process chamber. The successful DNA amplification has been verified off-chip by using a standard fluorometer. Copyright © 2016 Elsevier B.V. All rights reserved.

  14. Detection of a sudden change of the field time series based on the Lorenz system

    PubMed Central

    Li, Fang; Shen, BingLu; Yan, PengCheng; Song, Jian; Ma, DeShan

    2017-01-01

    We conducted an exploratory study of the detection of a sudden change of the field time series based on the numerical solution of the Lorenz system. First, the time when the Lorenz path jumped between the regions on the left and right of the equilibrium point of the Lorenz system was quantitatively marked, and the sudden change time of the Lorenz system was obtained. Second, the numerical solution of the Lorenz system was regarded as a vector; thus, this solution could be considered a vector time series. We transformed the vector time series into a scalar time series using the vector inner product, considering the geometric and topological features of the Lorenz system path. Third, the sudden change of the resulting time series was detected using the sliding t-test method. Comparing the test results with the quantitatively marked time indicated that the method could detect every sudden change of the Lorenz path; thus, the method is effective. Finally, we used the method to detect sudden changes in the pressure field time series and temperature field time series, and obtained good results for both, which indicates that the method can be applied to high-dimensional vector time series. Mathematically, there is no essential difference between field time series and vector time series; thus, we provide a new method for the detection of sudden changes in field time series. PMID:28141832

  15. Laboratory studies of in vivo fluorescence of phytoplankton

    NASA Technical Reports Server (NTRS)

    Brown, C. A., Jr.; Farmer, F. H.; Jarrett, O., Jr.; Staton, W. L.

    1978-01-01

    A lidar system is developed that uses four selected excitation wavelengths to induce chlorophyll 'a' fluorescence which is indicative of both the concentration and diversity of phytoplankton. The operating principles of the system and the results of measurements of phytoplankton fluorescence in a controlled laboratory environment are presented. A comparative study of results from lidar fluorosensor laboratory tank tests using representative species of phytoplankton in single and multispecies cultures from each of four color groups reveals that (1) there is good correlation between the fluorescence of chlorophyll 'a' remotely stimulated and detected by the lidar system and in-situ measurements using four similar excitation wavelengths in a flow-through fluorometer; (2) good correlation exists between the total chlorophyll 'a' calculated from lidar-fluorosensor data and measurements obtained by the Strickland-Parsons method; and (3) the lidar fluorosensor can provide an index of population diversity.

  16. Test of airborne fluorometer over land surfaces and geologic materials

    NASA Technical Reports Server (NTRS)

    Stoertz, G. E.; Hemphill, W. R.

    1970-01-01

    Response of an experimental Fraunhofer line discriminator to a wide range of surficial deposits common in deserts and semideserts was tested in the laboratory and from an H-19 helicopter. No signals attributable to fluorescence were recorded during 540 miles of aerial traverses over southeastern California and west-central Arizona. It was concluded that exposed surfaces of target materials throughout the traverses were either nonluminescent at 5890 A or not sufficiently so to be detectable. It cannot be ruled out that the lack of fluorescence is partly attributable to surficial coatings of nonluminescent weathered material. The principal route surveyed from the air was from Needles, California to Furnace Creek Ranch, Death Valley and return, via the Amargosa River valley, Silurian Lake (dry), Silver Lake (dry), and Soda Lake (dry). Principal targets traversed were unconsolidated clastic sediments ranging from silty clay to cobbles, and a wide range of evaporite deposits.

  17. Laser induced fluorescence technique for detecting organic matter in East China Sea

    NASA Astrophysics Data System (ADS)

    Chen, Peng; Wang, Tianyu; Pan, Delu; Huang, Haiqing

    2017-10-01

    A laser induced fluorescence (LIF) technique for rapid diagnosis of chromophoric dissolved organic matter (CDOM) in water is discussed. We have developed a new field-portable laser fluorometer for rapid fluorescence measurements. In addition, the fluorescence spectral characteristics of fluorescent constituents (e.g., CDOM, chlorophyll-a) were analyzed with a spectral deconvolution method using a bi-Gaussian peak function. In situ measurements by the LIF technique compared well with values measured by the conventional spectrophotometric method in the laboratory. A significant correlation (R2 = 0.93) was observed between fluorescence measured by the technique and absorption measured by laboratory spectrophotometer. The influence of temperature variation on LIF measurement was investigated in the lab and a temperature coefficient was deduced for fluorescence correction. Distributions of CDOM fluorescence measured using this technique along the East China Sea coast are presented. The in situ results demonstrate the utility of the LIF technique for rapid detection of dissolved organic matter.

  18. Product differentiation by analysis of DNA melting curves during the polymerase chain reaction.

    PubMed

    Ririe, K M; Rasmussen, R P; Wittwer, C T

    1997-02-15

    A microvolume fluorometer integrated with a thermal cycler was used to acquire DNA melting curves during the polymerase chain reaction by fluorescence monitoring of the double-stranded DNA specific dye SYBR Green I. Plotting fluorescence as a function of temperature as the thermal cycler heats through the dissociation temperature of the product gives a DNA melting curve. The shape and position of this DNA melting curve are functions of the GC/AT ratio, length, and sequence, and can be used to differentiate amplification products separated by less than 2 degrees C in melting temperature. Desired products can be distinguished from undesirable products, in many cases eliminating the need for gel electrophoresis. Analysis of melting curves can extend the dynamic range of initial template quantification when amplification is monitored with double-stranded DNA specific dyes. Complete amplification and analysis of products can be performed in less than 15 min.
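
    Melting-curve analysis of the kind described reads the melting temperature Tm off the peak of the negative derivative -dF/dT of the fluorescence-versus-temperature curve. A minimal finite-difference sketch follows; the function name and toy data are hypothetical, and real instruments smooth the curve before differentiating.

```python
def melting_peak(temps, fluor):
    """Tm estimate: temperature at the largest -dF/dT (finite differences)."""
    best_t, best_d = None, float("-inf")
    for i in range(1, len(temps)):
        d = -(fluor[i] - fluor[i - 1]) / (temps[i] - temps[i - 1])
        if d > best_d:
            best_d, best_t = d, (temps[i] + temps[i - 1]) / 2
    return best_t

# Toy SYBR Green-style trace: fluorescence drops sharply near the product Tm.
tm = melting_peak([80, 81, 82, 83, 84, 85], [100, 98, 90, 50, 20, 15])  # -> 82.5
```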

  19. [Photosynthetic fluorescence characteristics of floating-leaved and submersed macrophytes commonly found in Taihu Lake].

    PubMed

    Song, Yu-zhi; Cai, Wei; Qin, Bo-qiang

    2009-03-01

    Some aquatic macrophytes commonly found in Taihu Lake, including Trapa bispinosa, Nymphyoides peltatum, Vallisneria natans, and Hydrilla verticillata were collected, and their maximal quantum yield of photosystem II (Fv/Fm) as well as the rapid light curves (RLCs) under conditions of light adaptation and dark adaptation were measured in situ by using a submersible and pulse-amplitude modulated fluorometer (Diving-PAM). The results showed that floating-leaved plants T. bispinosa and N. peltatum had higher potential maximum photosynthetic capacity than submerged macrophytes V. natans and H. verticillata. The measured maximal quantum yield of T. bispinosa, N. peltatum, V. natans, and H. verticillata was 0.837, 0.831, 0.684, and 0.764, respectively. Both the maximal relative electron transport rate and the half saturation point of light intensity of T. bispinosa and N. peltatum were higher than those of V. natans and H. verticillata, especially under the condition of light adaptation.

  20. Spectral and Temporal Laser Fluorescence Analysis Such as for Natural Aquatic Environments

    NASA Technical Reports Server (NTRS)

    Chekalyuk, Alexander (Inventor)

    2015-01-01

    An Advanced Laser Fluorometer (ALF) can combine spectrally and temporally resolved measurements of laser-stimulated emission (LSE) for characterization of dissolved and particulate matter, including fluorescence constituents, in liquids. Spectral deconvolution (SDC) analysis of LSE spectral measurements can accurately retrieve information about individual fluorescent bands, such as can be attributed to chlorophyll-a (Chl-a), phycobiliprotein (PBP) pigments, or chromophoric dissolved organic matter (CDOM), among others. Improved physiological assessments of photosynthesizing organisms can use SDC analysis and temporal LSE measurements to assess variable fluorescence corrected for SDC-retrieved background fluorescence. Fluorescence assessments of Chl-a concentration based on LSE spectral measurements can be improved using photo-physiological information from temporal measurements. Quantitative assessments of PBP pigments, CDOM, and other fluorescent constituents, as well as basic structural characterizations of photosynthesizing populations, can be performed using SDC analysis of LSE spectral measurements.

  1. Photoinhibition in common atlantic macroalgae measured on site in Gran Canaria

    NASA Astrophysics Data System (ADS)

    Häder, D.-P.; Porst, M.; Lebert, M.

    2001-03-01

    The photosynthetic quantum yield was analysed in four common Atlantic macroalgae, the Rhodophytes Gelidium arbuscula and Halopithys incurvus and the Phaeophytes Halopteris scoparia and Lobophora variegata, in Gran Canaria, Canary Islands, at their growth site. The fluorescence parameters were measured using a portable pulse amplitude modulated (PAM) fluorometer (PAM 2000) and a diving PAM under water without removing the thalli from their growth sites. Solar radiation was monitored continuously above and under water during the whole experimental period using two three-channel dosimeters (European light dosimeter network; ELDONET) (Real Time Computer, Möhrendorf, Germany). These instruments measure solar radiation in three wavelength ranges: ultraviolet (UV)-A, UV-B, and photosynthetically active radiation (PAR). In all four algae the effective photosynthetic quantum yield decreased significantly from the optimal values measured after dark adaptation following exposure to 15 min of solar radiation, but at least partially recovered subsequently in the shade within several hours. Increasing the exposure period to 30 min intensified the photoinhibition; after this treatment, some algae showed no recovery and others no significant recovery. Exposure to unfiltered solar radiation caused significantly higher photoinhibition than PAR-only radiation or PAR plus UV-A. A substantial inhibition was found in all algae at their growth sites in the water column when the sun was at high angles, as measured with the diving PAM.

  2. In-Field Implementation of a Recombinant Factor C Assay for the Detection of Lipopolysaccharide as a Biomarker of Extant Life within Glacial Environments

    PubMed Central

    Barnett, Megan J.; Wadham, Jemma L.; Jackson, Miriam; Cullen, David C.

    2012-01-01

    The discovery over the past two decades of viable microbial communities within glaciers has promoted interest in the role of glaciers and ice sheets (the cryosphere) as contributors to subglacial erosion, global biodiversity, and in regulating global biogeochemical cycles. In situ or in-field detection and characterisation of microbial communities is becoming recognised as an important approach to improve our understanding of such communities. Within this context we demonstrate, for the first time, the ability to detect Gram-negative bacteria in glacial field environments (including subglacial environments) via the detection of lipopolysaccharide (LPS); an important component of Gram-negative bacterial cell walls. In-field measurements were performed using the recently commercialised PyroGene® recombinant Factor C (rFC) endotoxin detection system and used in conjunction with a handheld fluorometer to measure the fluorescent endpoint of the assay. Twenty-seven glacial samples were collected from the surface, bed and terminus of a low-biomass Arctic valley glacier (Engabreen, Northern Norway), and were analysed in a field laboratory using the rFC assay. Sixteen of these samples returned positive LPS detection. This work demonstrates that LPS detection via rFC assay is a viable in-field method and is expected to be a useful proxy for microbial cell concentrations in low biomass environments. PMID:25585634

  3. The Use of Chlorophyll Fluorescence Lifetime to Assess Phytoplankton Physiology within a River-Dominated Environment

    NASA Technical Reports Server (NTRS)

    Hall, Callie M.; Miller, Richard L.; Redalje, Donald G.; Fernandez, Salvador M.

    2002-01-01

    Chlorophyll a fluorescence lifetime was measured for phytoplankton populations inhabiting the three physical zones surrounding the Mississippi River's terminus in the Gulf of Mexico. Observations of river discharge volume, nitrate + nitrite, silicate, phosphate, PAR (Photosynthetically Active Radiation) diffuse attenuation within the water column, salinity, temperature, SPM, and chl a concentration were used to characterize the distribution of chl fluorescence lifetime within a given region within restricted periods of time. Thirty-three stations extending from the Mississippi River plume to the shelf break of the Louisiana coast were surveyed for analysis of chlorophyll fluorescence lifetime during two cruises conducted March 31 - April 6, 2000, and October 24 - November 1, 2000. At each station, two to three depths were chosen for fluorescence lifetime measurement to represent the vertical characteristics of the water column. Where possible, samples were taken from just below the surface and from just above and below the pycnocline. All samples collected were within the 1% light level of the water column (the euphotic zone). Upon collection, samples were transferred to amber Nalgene bottles and left in the dark for at least 15 minutes to reduce the effects of non-photochemical quenching and to ensure that photosynthetic reaction centers were open. Before measurements within the phase fluorometer were begun, the instrument was allowed to warm up for no less than one hour.

  4. Tissue characterization by time-resolved fluorescence spectroscopy of endogenous and exogenous fluorochromes: apparatus design and preliminary results

    NASA Astrophysics Data System (ADS)

    Glanzmann, Thomas M.; Ballini, Jean-Pierre; Jichlinski, Patrice; van den Bergh, Hubert; Wagnieres, Georges A.

    1996-12-01

    The biomedical use of an optical fiber-based spectro-temporal fluorometer that can endoscopically record the fluorescence decay of an entire spectrum without scanning is presented. The detector consists of a streak camera coupled to a spectrograph. A mode-locked argon-ion-pumped dye laser or a nitrogen-laser-pumped dye laser is used as the pulsed excitation light source. We measured the fluorescence decays of endogenous fluorophores and of ALA-induced protoporphyrin IX (PPIX) in an excised human bladder with a carcinoma in situ (CIS). Each autofluorescence decay can be decomposed into at least three exponential components for all tissue samples investigated if the excitation is at 425 nm. The autofluorescence decays of all normal sites of the human bladder are similar, and they differ significantly from the decays measured on the CIS and the necrotic tissue. The fluorescence of the ALA-induced PPIX in the bladder is monoexponential with a lifetime of 15 ± 1 ns, and this fluorescence lifetime does not change significantly between the normal urothelium and the CIS. A photoproduct of ALA-PPIX with a fluorescence maximum at 670 nm and a lifetime of 8 ± 1 ns was observed. The measurement of the autofluorescence decay made it possible to correctly identify a normal tissue site that had been classified as abnormal by the measurement of the ALA-PPIX fluorescence intensity.

  5. Modified spectrophotometer for multi-dimensional circular dichroism/fluorescence data acquisition in titration experiments: application to the pH and guanidine-HCl induced unfolding of apomyoglobin.

    PubMed

    Ramsay, G; Ionescu, R; Eftink, M R

    1995-08-01

    In a previous paper (Ramsay and Eftink, Biophys. J. 66:516-523) we reported the development of a modified spectrophotometer that can make nearly simultaneous circular dichroism (CD) and fluorescence measurements. This arrangement allows multiple data sets to be collected during a single experiment, resulting in a saving of time and material, and improved correlation between the different types of measurements. The usefulness of the instrument was shown by thermal melting experiments on several different protein systems. This CD/fluorometer spectrophotometer has been further modified by interfacing with a syringe pump and a pH meter. This arrangement allows ligand, pH, and chemical denaturation titration experiments to be performed while monitoring changes in the sample's CD, absorbance, fluorescence, and light scattering properties. Our data acquisition program also has the ability to check whether the signals have approached equilibrium before the data are recorded. For performing pH titrations we have developed a procedure which uses the signal from a pH meter in a feedback circuit in order to collect data at evenly spaced pH intervals. We demonstrate the use of this instrument with studies of the unfolding of sperm whale apomyoglobin, as induced by acid pH and by the addition of guanidine-HCl.

  6. Modified spectrophotometer for multi-dimensional circular dichroism/fluorescence data acquisition in titration experiments: application to the pH and guanidine-HCl induced unfolding of apomyoglobin.

    PubMed Central

    Ramsay, G; Ionescu, R; Eftink, M R

    1995-01-01

    In a previous paper (Ramsay and Eftink, Biophys. J. 66:516-523) we reported the development of a modified spectrophotometer that can make nearly simultaneous circular dichroism (CD) and fluorescence measurements. This arrangement allows multiple data sets to be collected during a single experiment, resulting in a saving of time and material, and improved correlation between the different types of measurements. The usefulness of the instrument was shown by thermal melting experiments on several different protein systems. This CD/fluorometer spectrophotometer has been further modified by interfacing with a syringe pump and a pH meter. This arrangement allows ligand, pH, and chemical denaturation titration experiments to be performed while monitoring changes in the sample's CD, absorbance, fluorescence, and light scattering properties. Our data acquisition program also has the ability to check whether the signals have approached equilibrium before the data are recorded. For performing pH titrations we have developed a procedure which uses the signal from a pH meter in a feedback circuit in order to collect data at evenly spaced pH intervals. We demonstrate the use of this instrument with studies of the unfolding of sperm whale apomyoglobin, as induced by acid pH and by the addition of guanidine-HCl. PMID:8527683

  7. Observations of Cross-Surf-zone / Inner-shelf Dye Exchange from Aerial Hyperspectral and in Situ Data.

    NASA Astrophysics Data System (ADS)

    Grimes, D. J.; Giddings, S. N.; Kumar, N.; Pawlak, G. R.; Feddersen, F.

    2016-12-01

    Understanding the cross-shelf exchange of nearshore-sourced tracers across the surfzone and onto the stratified inner-shelf is critical for predicting the evolution of pollution events, harmful algal blooms (HABs), and larval transport, which will enable policy and mitigation efforts. The CSIDE (Cross Surfzone / Inner-shelf Dye Exchange) experiment (Sept & Oct 2015) provides observations to quantify dye tracer exchange across the surfzone/inner-shelf region with three dye release experiments. Shoreline-released dye and temperature are tracked for 48 hrs and 20 km using aerial hyperspectral and IR imagery, in situ near-shoreline fluorometers, moored wire-walkers, AUV, and boat-based towed observations. Here, we focus on the third release, where dye was pumped into the mouth of the Tijuana River / Estuary during an ebb tide with low river discharge. The dye field was transported alongshore in a large coherent patch extending 1 km from shore. The plume persisted overnight with weak dilution, and its center of mass was observed to move 3+ km north over 18 hrs. Aerial hyperspectral and in situ observations are analyzed to examine the horizontal and vertical dye distribution. In particular, we will explore the extent to which the stratified inner-shelf is a "material barrier," whether an observed surfzone dye and temperature correlation is maintained on the stratified inner-shelf, at what time- and length-scales, and the processes influencing this relationship.

  8. Multiple Indicator Stationary Time Series Models.

    ERIC Educational Resources Information Center

    Sivo, Stephen A.

    2001-01-01

    Discusses the propriety and practical advantages of specifying multivariate time series models in the context of structural equation modeling for time series and longitudinal panel data. For time series data, the multiple indicator model specification improves on classical time series analysis. For panel data, the multiple indicator model…

  9. The Discovery of Deep Oil Plumes at the Deepwater Horizon Oil Spill Site (Invited)

    NASA Astrophysics Data System (ADS)

    Diercks, A. R.; Asper, V. L.; Highsmith, R. C.; Woolsey, M.; Lohrenz, S. E.; McLetchie, K.; Gossett, A.; Lowe, M., III; Joung, D.; McKay, L.

    2010-12-01

    In May 2010, the National Institute for Undersea Science and Technology (NIUST), a partnership of the University of Mississippi, the University of Southern Mississippi and NOAA, had a 17-day research cruise aboard the UNOLS vessel R/V Pelican scheduled. Two weeks before departure, the Deepwater Horizon oil platform burned and sank, resulting in an uncontrolled oil spill at a depth of ~1500 m at Mississippi Canyon Block 252. The initial mission plan to do AUV surveys of wrecks and hydrate outcrops in the northern Gulf of Mexico, some of them very close to the site of the accident, was abandoned in favor of responding to the still uncontrolled oil spill. The primary goals of the redefined cruise were to acquire baseline and early impact data for seafloor sediments and subsurface distribution of oil and gas hydrates as close as possible in time and space to the origin of the oil spill. Investigating an oil spill nearly a mile deep in the ocean presents special benthic sampling and subsurface oil detection challenges. NIUST's AUVs were unloaded from the ship and a large main winch was installed to allow operation of a full ocean depth box corer for collecting sediment samples in water depths up to 2000 m. During the first five-day leg of the cruise, a total of 28 box cores were collected. The Pelican returned to port (Cocodrie, LA) to drop off sediment and water samples for immediate analyses, and to take on more sampling gear and supplies for the second leg of the cruise, including an Acrobat, a CDOM fluorometer, a Video Ray ROV, and a CO2 sensor in addition to the already installed CTD Rosette with O2 sensor and beam transmissometer. During Leg 2, CTD stations were plotted to cover the area surrounding the wreck site and at various water depths to map the subsurface water column structure and chemistry as baseline values for future investigations and especially to look for submerged oil and/or gas hydrates.
Early in the water column sampling, a subsurface feature was discovered at 1200 to 1400 m depth. This layer was detected by three independent sensors: a CDOM (colored dissolved organic matter) fluorometer, a beam transmissometer, and a dissolved oxygen sensor. All three instruments responded in unison with greater fluorescence and beam attenuation and decreased O2 concentration. These signals were first observed at a station 5 miles from the accident site. Second and third station measurements at 2.5 miles, and at 1.25 miles from the spill site, showed the same signal but with significantly greater magnitude. Following this discovery, the sampling plan for the remaining days of the cruise was changed to map the newly discovered feature. This paper discusses the data acquired during this cruise aboard the R/V Pelican and the original discovery of the deep oil plumes from the Deepwater Horizon well.

  10. Improved Underwater Excitation-Emission Matrix Fluorometer

    NASA Technical Reports Server (NTRS)

    Moore, Casey; daCunha, John; Rhoades, Bruce; Twardowski, Michael

    2007-01-01

    A compact, high-resolution, two-dimensional excitation-emission matrix fluorometer (EEMF) has been designed and built specifically for use in identifying and measuring the concentrations of organic compounds, including polluting hydrocarbons, in natural underwater settings. Heretofore, most EEMFs have been designed and built for installation in laboratories, where they are used to analyze the contents of samples collected in the field and brought to the laboratories. Because the present EEMF can be operated in the field, it is better suited to measurement of spatially and temporally varying concentrations of substances of interest. In excitation-emission matrix (EEM) fluorometry, fluorescence is excited by irradiating a sample at one or more wavelengths, and the fluorescent emission from the sample is measured at multiple wavelengths. When excitation is provided at only one wavelength, the technique is termed one-dimensional (1D) EEM fluorometry because the resulting matrix of fluorescence emission data (the EEM) contains only one row or column. When excitation is provided at multiple wavelengths, the technique is termed two-dimensional (2D) EEM fluorometry because the resulting EEM contains multiple rows and columns. EEM fluorometry - especially the 2D variety - is well established as a means of simultaneously detecting numerous dissolved and particulate compounds in water. Each compound or pool of compounds has a unique spectral fluorescence signature, and each EEM is rich in information content, in that it can contain multiple fluorescence signatures. By use of deconvolution and/or other mixture-analyses techniques, it is often possible to isolate the spectral signature of compounds of interest, even when their fluorescence spectra overlap. 
What distinguishes the present 2D EEMF over prior laboratory-type 2D EEMFs are several improvements in packaging (including a sealed housing) and other aspects of design that render it suitable for use in natural underwater settings. In addition, the design of the present 2D EEMF incorporates improvements over the one prior commercial underwater 2D EEMF, developed in 1994 by the same company that developed the present one. Notable advanced features of the present EEMF include the following: 1) High sensitivity and spectral resolution are achieved by use of an off-the-shelf grating spectrometer equipped with a sensor in the form of a commercial astronomical-grade 256 × 532-pixel charge-coupled-device (CCD) array. 2) All of the power supply, timing, control, and readout circuits for the illumination source and the CCD, ancillary environmental monitoring sensors, and circuitry for controlling a shutter or filter motor are custom-designed and mounted compactly on three circuit boards below a fourth circuit board that holds the CCD (see figure). 3) The compactness of the grating spectrometer, CCD, and circuit assembly makes it possible to fit the entire instrument into a compact package that is intended to be maneuverable underwater by one person. 4) In mass production, the cost of the complete instrument would be relatively low - estimated at approximately $30,000 at 2005 prices.
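The mixture-analysis idea described above can be pictured with a toy least-squares unmixing, a minimal sketch in which the (flattened) excitation-emission signatures of the candidate compounds are assumed known; the matrix `S`, the component count, and the noise level are all invented for illustration, not taken from the instrument:

```python
import numpy as np

# Hypothetical toy: three invented component signatures (columns of S),
# flattened from excitation-emission matrices, and a noisy measured EEM y.
rng = np.random.default_rng(1)
S = np.abs(rng.standard_normal((50, 3)))             # assumed known signatures
true_conc = np.array([2.0, 0.5, 1.0])                # invented concentrations
y = S @ true_conc + 0.01 * rng.standard_normal(50)   # measured, noisy spectrum

# Even though the (random) spectra overlap, least squares recovers the mix.
est, *_ = np.linalg.lstsq(S, y, rcond=None)
```

Real EEM deconvolution typically adds nonnegativity and multi-way (e.g. PARAFAC-style) constraints, which this sketch omits.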

  11. Efficient Algorithms for Segmentation of Item-Set Time Series

    NASA Astrophysics Data System (ADS)

    Chundi, Parvathi; Rosenkrantz, Daniel J.

    We propose a special type of time series, which we call an item-set time series, to facilitate the temporal analysis of software version histories, email logs, stock market data, etc. In an item-set time series, each observed data value is a set of discrete items. We formalize the concept of an item-set time series and present efficient algorithms for segmenting a given item-set time series. Segmentation of a time series partitions the time series into a sequence of segments where each segment is constructed by combining consecutive time points of the time series. Each segment is associated with an item set that is computed from the item sets of the time points in that segment, using a function which we call a measure function. We then define a concept called the segment difference, which measures the difference between the item set of a segment and the item sets of the time points in that segment. The segment difference values are required to construct an optimal segmentation of the time series. We describe novel and efficient algorithms to compute segment difference values for each of the measure functions described in the paper. We outline a dynamic programming based scheme to construct an optimal segmentation of the given item-set time series. We use the item-set time series segmentation techniques to analyze the temporal content of three different data sets—Enron email, stock market data, and a synthetic data set. The experimental results show that an optimal segmentation of item-set time series data captures much more temporal content than a segmentation constructed based on the number of time points in each segment, without examining the item set data at the time points, and can be used to analyze different types of temporal data.
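The segmentation scheme outlined above can be sketched with dynamic programming; the measure function (segment union) and segment difference (summed symmetric-difference size) used here are illustrative assumptions, not the paper's exact definitions:

```python
# Hypothetical sketch of item-set time series segmentation via dynamic
# programming; measure function and segment difference are invented choices.

def segment_difference(points, i, j):
    """Difference between the segment item set (union) and its time points."""
    seg = set().union(*points[i:j])
    return sum(len(seg ^ s) for s in points[i:j])

def optimal_segmentation(points, k):
    """Split `points` into k segments minimizing the total segment difference."""
    n = len(points)
    INF = float("inf")
    cost = [[INF] * (n + 1) for _ in range(k + 1)]  # cost[m][t]: m segments over points[:t]
    back = [[0] * (n + 1) for _ in range(k + 1)]
    cost[0][0] = 0
    for m in range(1, k + 1):
        for t in range(m, n + 1):
            for s in range(m - 1, t):
                c = cost[m - 1][s] + segment_difference(points, s, t)
                if c < cost[m][t]:
                    cost[m][t], back[m][t] = c, s
    segments, t = [], n            # recover boundaries from the back pointers
    for m in range(k, 0, -1):
        segments.append((back[m][t], t))
        t = back[m][t]
    return cost[k][n], segments[::-1]

series = [{"a"}, {"a"}, {"a", "b"}, {"c"}, {"c", "d"}]
total, segments = optimal_segmentation(series, 2)   # expect a cut before {"c"}
```

The cubic-time DP above is the naive form; the paper's contribution includes computing the segment difference values efficiently for several measure functions.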

  12. A novel weight determination method for time series data aggregation

    NASA Astrophysics Data System (ADS)

    Xu, Paiheng; Zhang, Rong; Deng, Yong

    2017-09-01

    Aggregation in time series is of great importance in time series smoothing, prediction, and other time series analysis processes, which makes it crucial to address the weights in time series correctly and reasonably. In this paper, a novel method to obtain the weights in time series is proposed, in which we adopt the induced ordered weighted aggregation (IOWA) operator and the visibility graph averaging (VGA) operator and linearly combine the weights separately generated by the two operators. The IOWA operator is introduced to the weight determination of time series, through which the time decay factor is taken into consideration. The VGA operator generates weights with respect to the degree distribution in the visibility graph constructed from the corresponding time series, which reflects the relative importance of vertices in the time series. The proposed method is applied to two practical datasets to illustrate its merits. The aggregation of the Construction Cost Index (CCI) demonstrates the ability of the proposed method to smooth time series, while the aggregation of the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX) illustrates how the proposed method maintains the variation tendency of the original data.
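A minimal sketch of the linear-combination idea: the IOWA side is simplified here to plain exponential time-decay weights (an assumption; the paper's induced ordering is richer), while the visibility-graph side uses the standard natural visibility criterion:

```python
import numpy as np

def decay_weights(n, alpha=0.9):
    """Stand-in for the IOWA side: exponential time decay, newest point heaviest."""
    w = alpha ** np.arange(n - 1, -1, -1)
    return w / w.sum()

def visibility_degree_weights(x):
    """Weights proportional to vertex degree in the natural visibility graph."""
    n = len(x)
    deg = np.zeros(n)
    for i in range(n):
        for j in range(i + 1, n):
            # i and j see each other if every point between them lies strictly
            # below the straight line connecting (i, x[i]) and (j, x[j])
            if all(x[k] < x[i] + (x[j] - x[i]) * (k - i) / (j - i)
                   for k in range(i + 1, j)):
                deg[i] += 1
                deg[j] += 1
    return deg / deg.sum()

def combined_weights(x, lam=0.5, alpha=0.9):
    """Convex combination of the two weight vectors (lam is an assumed knob)."""
    return lam * decay_weights(len(x), alpha) + (1 - lam) * visibility_degree_weights(x)

x = np.array([3.0, 1.0, 4.0, 1.0, 5.0])
w = combined_weights(x)          # still sums to one
aggregated = float(w @ x)
```

Because each component weight vector is normalized, any convex combination of the two remains a valid weight vector.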

  13. Association mining of dependency between time series

    NASA Astrophysics Data System (ADS)

    Hafez, Alaaeldin

    2001-03-01

    Time series analysis is considered as a crucial component of strategic control over a broad variety of disciplines in business, science and engineering. Time series data is a sequence of observations collected over intervals of time. Each time series describes a phenomenon as a function of time. Analysis on time series data includes discovering trends (or patterns) in a time series sequence. In the last few years, data mining has emerged and been recognized as a new technology for data analysis. Data Mining is the process of discovering potentially valuable patterns, associations, trends, sequences and dependencies in data. Data mining techniques can discover information that many traditional business analysis and statistical techniques fail to deliver. In this paper, we adapt and innovate data mining techniques to analyze time series data. By using data mining techniques, maximal frequent patterns are discovered and used in predicting future sequences or trends, where trends describe the behavior of a sequence. In order to include different types of time series (e.g., irregular and non-systematic), we consider past frequent patterns of the same time sequences (local patterns) and of other dependent time sequences (global patterns). We use the word 'dependent' instead of the word 'similar' for emphasis on real life time series where two time series sequences could be completely different (in values, shapes, etc.), but they still react to the same conditions in a dependent way. In this paper, we propose the Dependence Mining Technique that could be used in predicting time series sequences. The proposed technique consists of three phases: (a) for all time series sequences, generate their trend sequences; (b) discover maximal frequent trend patterns and generate pattern vectors (to keep information of frequent trend patterns); (c) use the trend pattern vectors to predict future time series sequences.
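The first two phases, generating trend sequences and counting frequent trend patterns across sequences, can be sketched as follows; the three-symbol alphabet and the per-sequence support counting are illustrative assumptions, not the paper's exact formulation:

```python
from collections import Counter

def trend_sequence(series, eps=1e-9):
    """Map a numeric series to its trend symbols: 'U'p, 'D'own, 'F'lat."""
    out = []
    for a, b in zip(series, series[1:]):
        out.append("U" if b - a > eps else "D" if a - b > eps else "F")
    return "".join(out)

def frequent_trend_patterns(sequences, length, min_support):
    """Trend patterns of a given length appearing in at least min_support sequences."""
    counts = Counter()
    for seq in sequences:
        trends = trend_sequence(seq)
        seen = {trends[i:i + length] for i in range(len(trends) - length + 1)}
        counts.update(seen)   # count each pattern at most once per sequence
    return {p for p, c in counts.items() if c >= min_support}

# Two 'dependent' sequences share trend patterns despite different values.
data = [[1, 2, 3, 2, 3], [5, 6, 7, 6, 8], [9, 8, 7, 8, 9]]
patterns = frequent_trend_patterns(data, length=2, min_support=2)
```

Counting patterns over trend symbols rather than raw values is what lets two sequences with completely different values still support the same pattern.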

  14. Updating Landsat time series of surface-reflectance composites and forest change products with new observations

    NASA Astrophysics Data System (ADS)

    Hermosilla, Txomin; Wulder, Michael A.; White, Joanne C.; Coops, Nicholas C.; Hobart, Geordie W.

    2017-12-01

    The use of time series satellite data allows for the temporally dense, systematic, transparent, and synoptic capture of land dynamics over time. Subsequent to the opening of the Landsat archive, several time series approaches for characterizing landscape change have been developed, often representing a particular analytical time window. The information richness and widespread utility of these time series data have created a need to maintain the currency of time series information via the addition of new data, as it becomes available. When an existing time series is temporally extended, it is critical that previously generated change information remains consistent, thereby not altering reported change statistics or science outcomes based on that change information. In this research, we investigate the impacts and implications of adding additional years to an existing 29-year annual Landsat time series for forest change. To do so, we undertook a spatially explicit comparison of the 29 overlapping years of a time series representing 1984-2012, with a time series representing 1984-2016. Surface reflectance values, and presence, year, and type of change were compared. We found that the addition of years to extend the time series had minimal effect on the annual surface reflectance composites, with slight band-specific differences (r ≥ 0.1) in the final years of the original time series being updated. The area of stand replacing disturbances and determination of change year are virtually unchanged for the overlapping period between the two time-series products. Over the overlapping temporal period (1984-2012), the total area of change differs by 0.53%, equating to an annual difference in change area of 0.019%. Overall, the spatial and temporal agreement of the changes detected by both time series was 96%. Further, our findings suggest that the entire pre-existing historic time series does not need to be re-processed during the update process. 
Critically, given the time series change detection and update approach followed here, science outcomes or reports representing one temporal epoch can be considered stable and will not be altered when a time series is updated with newly available data.

  15. Highly comparative time-series analysis: the empirical structure of time series and their methods.

    PubMed

    Fulcher, Ben D; Little, Max A; Jones, Nick S

    2013-06-06

    The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines.
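The "reduced representations ... in terms of their properties" can be pictured as mapping each series to a small feature vector; the four features below are a hypothetical miniature of the paper's much larger library of methods:

```python
import numpy as np

def feature_vector(x):
    """Hypothetical miniature 'reduced representation': four summary statistics."""
    x = np.asarray(x, dtype=float)
    d = np.diff(x)
    xc = x - x.mean()
    return {
        "mean": float(x.mean()),
        "std": float(x.std()),
        "acf1": float((xc[:-1] @ xc[1:]) / (xc @ xc)),  # lag-1 autocorrelation
        "mean_abs_change": float(np.abs(d).mean()),
    }

sig = np.sin(np.linspace(0, 8 * np.pi, 200))
fv = feature_vector(sig)   # smooth signal: strong lag-1 autocorrelation
```

Once every series is reduced to such a vector, standard clustering and classification tools can organize datasets and methods alike, which is the paper's central move.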

  16. Highly comparative time-series analysis: the empirical structure of time series and their methods

    PubMed Central

    Fulcher, Ben D.; Little, Max A.; Jones, Nick S.

    2013-01-01

    The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines. PMID:23554344

  17. A Framework and Algorithms for Multivariate Time Series Analytics (MTSA): Learning, Monitoring, and Recommendation

    ERIC Educational Resources Information Center

    Ngan, Chun-Kit

    2013-01-01

    Making decisions over multivariate time series is an important topic which has gained significant interest in the past decade. A time series is a sequence of data points which are measured and ordered over uniform time intervals. A multivariate time series is a set of multiple, related time series in a particular domain in which domain experts…

  18. A Review of Subsequence Time Series Clustering

    PubMed Central

    Teh, Ying Wah

    2014-01-01

    Clustering of subsequence time series remains an open issue in time series clustering. Subsequence time series clustering is used in different fields, such as e-commerce, outlier detection, speech recognition, biological systems, DNA recognition, and text mining. One of the useful fields in the domain of subsequence time series clustering is pattern recognition. To improve this field, a sequence of time series data is used. This paper reviews some definitions and backgrounds related to subsequence time series clustering. The categorization of the literature reviews is divided into three groups: the preproof, interproof, and postproof periods. Moreover, various state-of-the-art approaches to subsequence time series clustering are discussed within each of these categories. The strengths and weaknesses of the employed methods are evaluated as potential issues for future studies. PMID:25140332

  19. A review of subsequence time series clustering.

    PubMed

    Zolhavarieh, Seyedjamal; Aghabozorgi, Saeed; Teh, Ying Wah

    2014-01-01

    Clustering of subsequence time series remains an open issue in time series clustering. Subsequence time series clustering is used in different fields, such as e-commerce, outlier detection, speech recognition, biological systems, DNA recognition, and text mining. One of the useful fields in the domain of subsequence time series clustering is pattern recognition. To improve this field, a sequence of time series data is used. This paper reviews some definitions and backgrounds related to subsequence time series clustering. The categorization of the literature reviews is divided into three groups: the preproof, interproof, and postproof periods. Moreover, various state-of-the-art approaches to subsequence time series clustering are discussed within each of these categories. The strengths and weaknesses of the employed methods are evaluated as potential issues for future studies.

  20. Multiscale structure of time series revealed by the monotony spectrum.

    PubMed

    Vamoş, Călin

    2017-03-01

    Observation of complex systems produces time series with specific dynamics at different time scales. The majority of the existing numerical methods for multiscale analysis first decompose the time series into several simpler components and the multiscale structure is given by the properties of their components. We present a numerical method which describes the multiscale structure of arbitrary time series without decomposing them. It is based on the monotony spectrum defined as the variation of the mean amplitude of the monotonic segments with respect to the mean local time scale during successive averagings of the time series, the local time scales being the durations of the monotonic segments. The maxima of the monotony spectrum indicate the time scales which dominate the variations of the time series. We show that the monotony spectrum can correctly analyze a diversity of artificial time series and can discriminate the existence of deterministic variations at large time scales from the random fluctuations. As an application we analyze the multifractal structure of some hydrological time series.
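One point of the monotony spectrum, the mean amplitude of monotonic segments against their mean duration, can be sketched as below; the full method iterates this under successive averagings of the series, which this toy sketch omits:

```python
import numpy as np

def monotonic_segments(x):
    """Split x into maximal monotonic runs; return (amplitude, duration) pairs."""
    segs, start, sign = [], 0, 0
    for i in range(1, len(x)):
        s = np.sign(x[i] - x[i - 1])
        if sign == 0:
            sign = s
        elif s != 0 and s != sign:          # direction change: close the segment
            segs.append((abs(x[i - 1] - x[start]), i - 1 - start))
            start, sign = i - 1, s
    segs.append((abs(x[-1] - x[start]), len(x) - 1 - start))
    return segs

def monotony_point(x):
    """One spectrum point: mean local time scale vs mean segment amplitude."""
    segs = monotonic_segments(x)
    return (float(np.mean([d for _, d in segs])),
            float(np.mean([a for a, _ in segs])))

x = np.array([0.0, 1.0, 2.0, 1.0, 0.0, 2.0])
scale, amplitude = monotony_point(x)
```

Repeating this after each averaging pass, and plotting amplitude against scale, would trace out the spectrum whose maxima indicate the dominant time scales.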

  1. A rapid diagnostic test and mobile "lab in a suitcase" platform for detecting Ceratocystis spp. responsible for Rapid ‘Ōhi‘a Death

    USGS Publications Warehouse

    Atkinson, Carter T.; Watcher-Weatherwax, William; Roy, Kylle; Heller, Wade P; Keith, Lisa

    2017-01-01

    We describe a field compatible molecular diagnostic test for two new species of Ceratocystis that infect `ōhi`a (Metrosideros polymorpha) and cause the disease commonly known as Rapid `Ōhi`a Death. The diagnostic is based on amplification of a DNA locus within the internal transcribed spacer region that separates fungal 5.8S ribosomal genes. The assay uses forward and reverse primers, recombinase polymerase, and a fluorescent probe that allows isothermal (40 °C) amplification and simultaneous quantification of a 115 base pair product with a battery-operated fluorometer. DNA extractions are field compatible and can be done by heating wood drill shavings to 100 °C in Instagene® solution containing Chelex® resin to bind potential amplification inhibitors. The initial heat treatment is followed by a short bead beating step with steel ball bearings and zirconium beads to release DNA. DNA is subsequently purified with a magnetic-bead-based extraction method that does not require silica columns or centrifugation. The assay is designed around a portable “lab-in-a-suitcase” platform that includes a portable fluorometer, miniature centrifuge, and heat block that operate off either 120 V AC power sources or a 12 V battery with a portable inverter, a magnetic rack designed for 1.5 ml tubes and magnetic bead DNA purification, pipettes, and consumable reagents and tubes. The entire assay from DNA extraction to results can be performed in less than 90 minutes on up to six independent samples plus a positive and negative control. Sensitivity based on suspensions of Ceratocystis endoconidia (spores) that were added to wood shavings and processed under field conditions by Instagene® magnetic bead DNA extraction was up to 163 spores/mg wood for Species A and 55 spores/mg wood for Species B in 95% of replicates as determined by probit analysis.
Sensitivity increased 5- to 10-fold, to 19 spores/mg wood for Species A and 9 spores/mg wood for Species B, when extractions were performed with a commercial, silica-column-based DNA purification kit. The test did not cross-react with other common fungi that have been isolated from `ōhi`a.

  2. 76 FR 6646 - Self-Regulatory Organizations; The NASDAQ Stock Market LLC; Notice of Filing of Proposed Rule...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-07

    ... Series, adjusted option series and any options series until the time to expiration for such series is... time to expiration for such series is less than nine months be treated differently. Specifically, under... series until the time to expiration for such series is less than nine months. Accordingly, the...

  3. Complexity quantification of cardiac variability time series using improved sample entropy (I-SampEn).

    PubMed

    Marwaha, Puneeta; Sunkaria, Ramesh Kumar

    2016-09-01

    The sample entropy (SampEn) has been widely used to quantify the complexity of RR-interval time series; higher complexity, and hence higher entropy, is associated with the RR-interval time series of healthy subjects. However, SampEn suffers from the disadvantage that it assigns higher entropy to randomized surrogate time series, as well as to certain pathological time series, which is a misleading observation. This wrong estimation of the complexity of a time series may be due to the fact that the existing SampEn technique updates the threshold value as a function of the long-term standard deviation (SD) of the time series. Time series of certain pathologies, though, exhibit substantial variability in beat-to-beat fluctuations, so the SD of the first-order difference (short-term SD) of the time series should be considered when updating the threshold value, to account for period-to-period variations inherent in the time series. In the present work, improved sample entropy (I-SampEn), a new methodology, has been proposed in which the threshold value is updated by considering the period-to-period variations of a time series. The I-SampEn technique assigns higher entropy values to age-matched healthy subjects than to patients suffering from atrial fibrillation (AF) and diabetes mellitus (DM). Our results are in agreement with the theory of reduced complexity of RR-interval time series in patients suffering from chronic cardiovascular and non-cardiovascular diseases.
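
    The threshold modification described above is small enough to sketch. The following is a minimal illustration, not the authors' implementation; the 0.2 tolerance factor and m = 2 are conventional defaults assumed here.

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Standard SampEn; by default r = 0.2 * long-term SD of the series."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()
    n = len(x)

    def count_matches(mm):
        # All overlapping templates of length mm (same count for mm = m, m + 1)
        templ = np.array([x[i:i + mm] for i in range(n - m)])
        c = 0
        for i in range(len(templ)):
            d = np.max(np.abs(templ - templ[i]), axis=1)  # Chebyshev distance
            c += np.sum(d <= r) - 1                       # exclude self-match
        return c

    b, a = count_matches(m), count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def i_sampen(x, m=2):
    """I-SampEn sketch: the threshold is taken from the SD of the
    first-order difference (short-term SD), as the abstract describes."""
    x = np.asarray(x, dtype=float)
    return sample_entropy(x, m, r=0.2 * np.std(np.diff(x)))
```

    For a series with large beat-to-beat swings, `np.std(np.diff(x))` grows while the long-term SD need not, so the flexible threshold widens exactly when the abstract argues it should.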

  4. Development and application of a modified dynamic time warping algorithm (DTW-S) to analyses of primate brain expression time series

    PubMed Central

    2011-01-01

    Background Comparing biological time series data across different conditions, or different specimens, is a common but still challenging task. Algorithms aligning two time series represent a valuable tool for such comparisons. While many powerful computation tools for time series alignment have been developed, they do not provide significance estimates for time shift measurements. Results Here, we present an extended version of the original DTW algorithm that allows us to determine the significance of time shift estimates in time series alignments, the DTW-Significance (DTW-S) algorithm. The DTW-S combines important properties of the original algorithm and other published time series alignment tools: DTW-S calculates the optimal alignment for each time point of each gene, it uses interpolated time points for time shift estimation, and it does not require alignment of the time-series end points. As a new feature, we implement a simulation procedure based on parameters estimated from real time series data, on a series-by-series basis, allowing us to determine the false positive rate (FPR) and the significance of the estimated time shift values. We assess the performance of our method using simulation data and real expression time series from two published primate brain expression datasets. Our results show that this method can provide accurate and robust time shift estimates for each time point on a gene-by-gene basis. Using these estimates, we are able to uncover novel features of the biological processes underlying human brain development and maturation. Conclusions The DTW-S provides a convenient tool for calculating accurate and robust time shift estimates at each time point for each gene, based on time series data. The estimates can be used to uncover novel biological features of the system being studied. The DTW-S is freely available as an R package TimeShift at http://www.picb.ac.cn/Comparative/data.html. PMID:21851598

  5. Development and application of a modified dynamic time warping algorithm (DTW-S) to analyses of primate brain expression time series.

    PubMed

    Yuan, Yuan; Chen, Yi-Ping Phoebe; Ni, Shengyu; Xu, Augix Guohua; Tang, Lin; Vingron, Martin; Somel, Mehmet; Khaitovich, Philipp

    2011-08-18

    Comparing biological time series data across different conditions, or different specimens, is a common but still challenging task. Algorithms aligning two time series represent a valuable tool for such comparisons. While many powerful computation tools for time series alignment have been developed, they do not provide significance estimates for time shift measurements. Here, we present an extended version of the original DTW algorithm that allows us to determine the significance of time shift estimates in time series alignments, the DTW-Significance (DTW-S) algorithm. The DTW-S combines important properties of the original algorithm and other published time series alignment tools: DTW-S calculates the optimal alignment for each time point of each gene, it uses interpolated time points for time shift estimation, and it does not require alignment of the time-series end points. As a new feature, we implement a simulation procedure based on parameters estimated from real time series data, on a series-by-series basis, allowing us to determine the false positive rate (FPR) and the significance of the estimated time shift values. We assess the performance of our method using simulation data and real expression time series from two published primate brain expression datasets. Our results show that this method can provide accurate and robust time shift estimates for each time point on a gene-by-gene basis. Using these estimates, we are able to uncover novel features of the biological processes underlying human brain development and maturation. The DTW-S provides a convenient tool for calculating accurate and robust time shift estimates at each time point for each gene, based on time series data. The estimates can be used to uncover novel biological features of the system being studied. The DTW-S is freely available as an R package TimeShift at http://www.picb.ac.cn/Comparative/data.html.
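
    For reference, the dynamic program at the core of DTW, which DTW-S extends with its significance machinery, can be sketched in a few lines; this is the textbook recurrence, not the DTW-S algorithm itself.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic-time-warping distance between two 1-D series."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the cheapest of the three admissible predecessor paths
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]
```

    A series matched against a time-stretched copy of itself has zero DTW distance, which is exactly the property that makes DTW attractive for time-shift estimation.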

  6. Measuring Complexity and Predictability of Time Series with Flexible Multiscale Entropy for Sensor Networks

    PubMed Central

    Zhou, Renjie; Yang, Chen; Wan, Jian; Zhang, Wei; Guan, Bo; Xiong, Naixue

    2017-01-01

    Measurement of time series complexity and predictability is sometimes the cornerstone for proposing solutions to topology and congestion control problems in sensor networks. As a method of measuring time series complexity and predictability, multiscale entropy (MSE) has been widely applied in many fields. However, sample entropy, the fundamental component of MSE, scores the similarity of two subsequences of a time series as either zero or one, with no in-between values, which causes sudden changes in entropy values even when the time series undergoes only small changes. This problem becomes especially severe when the time series is short. To solve this problem, we propose flexible multiscale entropy (FMSE), which introduces a novel similarity function that scores the similarity of two subsequences on a full range of values from zero to one, and thus increases the reliability and stability of measuring time series complexity. The proposed method is evaluated on both synthetic and real time series, including white noise, 1/f noise and real vibration signals. The evaluation results demonstrate that FMSE significantly improves the reliability and stability of measuring the complexity of time series, especially short ones, compared to MSE and composite multiscale entropy (CMSE). FMSE is thus capable of improving the performance of topology and traffic congestion control techniques based on time series analysis. PMID:28383496

  7. Measuring Complexity and Predictability of Time Series with Flexible Multiscale Entropy for Sensor Networks.

    PubMed

    Zhou, Renjie; Yang, Chen; Wan, Jian; Zhang, Wei; Guan, Bo; Xiong, Naixue

    2017-04-06

    Measurement of time series complexity and predictability is sometimes the cornerstone for proposing solutions to topology and congestion control problems in sensor networks. As a method of measuring time series complexity and predictability, multiscale entropy (MSE) has been widely applied in many fields. However, sample entropy, the fundamental component of MSE, scores the similarity of two subsequences of a time series as either zero or one, with no in-between values, which causes sudden changes in entropy values even when the time series undergoes only small changes. This problem becomes especially severe when the time series is short. To solve this problem, we propose flexible multiscale entropy (FMSE), which introduces a novel similarity function that scores the similarity of two subsequences on a full range of values from zero to one, and thus increases the reliability and stability of measuring time series complexity. The proposed method is evaluated on both synthetic and real time series, including white noise, 1/f noise and real vibration signals. The evaluation results demonstrate that FMSE significantly improves the reliability and stability of measuring the complexity of time series, especially short ones, compared to MSE and composite multiscale entropy (CMSE). FMSE is thus capable of improving the performance of topology and traffic congestion control techniques based on time series analysis.
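
    The abstract does not give the paper's similarity function, so the smooth (sigmoid) form below is only an assumed stand-in that illustrates the idea of replacing sample entropy's hard 0/1 match with full-range values; the coarse-graining step is the standard MSE one.

```python
import numpy as np

def hard_similarity(d, r):
    """Classic SampEn match: 1 if the distance is within tolerance r, else 0."""
    return (d <= r).astype(float)

def flexible_similarity(d, r, k=5.0):
    """Hypothetical smooth alternative: a sigmoid decaying around r,
    yielding values in (0, 1) as FMSE's similarity function does."""
    return 1.0 / (1.0 + np.exp(k * (d - r) / r))

def coarse_grain(x, scale):
    """Non-overlapping window averages used by MSE-style methods."""
    x = np.asarray(x, dtype=float)
    n = (len(x) // scale) * scale
    return x[:n].reshape(-1, scale).mean(axis=1)
```

    A distance just past the tolerance flips `hard_similarity` from 1 to 0, but only nudges `flexible_similarity` below 0.5, which is the stability gain the abstract describes.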

  8. Empirical method to measure stochasticity and multifractality in nonlinear time series

    NASA Astrophysics Data System (ADS)

    Lin, Chih-Hao; Chang, Chia-Seng; Li, Sai-Ping

    2013-12-01

    An empirical algorithm is used here to study the stochastic and multifractal nature of nonlinear time series. A parameter can be defined to quantitatively measure the deviation of the time series from a Wiener process so that the stochasticity of different time series can be compared. The local volatility of the time series under study can be constructed using this algorithm, and the multifractal structure of the time series can be analyzed by using this local volatility. As an example, we employ this method to analyze financial time series from different stock markets. The result shows that while developed markets evolve very much like an Ito process, the emergent markets are far from efficient. Differences about the multifractal structures and leverage effects between developed and emergent markets are discussed. The algorithm used here can be applied in a similar fashion to study time series of other complex systems.

  9. Multiscale Poincaré plots for visualizing the structure of heartbeat time series.

    PubMed

    Henriques, Teresa S; Mariani, Sara; Burykin, Anton; Rodrigues, Filipa; Silva, Tiago F; Goldberger, Ary L

    2016-02-09

    Poincaré delay maps are widely used in the analysis of cardiac interbeat interval (RR) dynamics. To facilitate visualization of the structure of these time series, we introduce multiscale Poincaré (MSP) plots. Starting with the original RR time series, the method employs a coarse-graining procedure to create a family of time series, each of which represents the system's dynamics in a different time scale. Next, the Poincaré plots are constructed for the original and the coarse-grained time series. Finally, as an optional adjunct, color can be added to each point to represent its normalized frequency. We illustrate the MSP method on simulated Gaussian white and 1/f noise time series. The MSP plots of 1/f noise time series reveal relative conservation of the phase space area over multiple time scales, while those of white noise show a marked reduction in area. We also show how MSP plots can be used to illustrate the loss of complexity when heartbeat time series from healthy subjects are compared with those from patients with chronic (congestive) heart failure syndrome or with atrial fibrillation. This generalized multiscale approach to Poincaré plots may be useful in visualizing other types of time series.
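
    The construction is straightforward to sketch: coarse-grain the series at each scale, then form consecutive-pair points for that scale's Poincaré plot. A minimal sketch (plotting and the optional frequency coloring omitted):

```python
import numpy as np

def coarse_grain(x, scale):
    """Average non-overlapping windows of length `scale`."""
    x = np.asarray(x, dtype=float)
    n = (len(x) // scale) * scale
    return x[:n].reshape(-1, scale).mean(axis=1)

def multiscale_poincare_points(x, scales=(1, 2, 4)):
    """For each scale, return the (x_n, x_{n+1}) point pairs that would
    be scattered in that scale's Poincaré plot."""
    out = {}
    for s in scales:
        y = coarse_grain(x, s)
        out[s] = np.column_stack([y[:-1], y[1:]])
    return out
```

    The area occupied by each point cloud across scales is what distinguishes 1/f noise (roughly conserved) from white noise (shrinking) in the abstract's examples.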

  10. Degree-Pruning Dynamic Programming Approaches to Central Time Series Minimizing Dynamic Time Warping Distance.

    PubMed

    Sun, Tao; Liu, Hongbo; Yu, Hong; Chen, C L Philip

    2016-06-28

    The central time series crystallizes the common patterns of the set it represents. In this paper, we propose a global constrained degree-pruning dynamic programming (g(dp)²) approach to obtain the central time series by minimizing the dynamic time warping (DTW) distance between two time series. The DTW matching-path theory with global constraints is proved theoretically for our degree-pruning strategy, which helps reduce the time complexity and computational cost. Our approach achieves the optimal solution between two time series. An approximate method for the central time series of multiple time series [called m_g(dp)²] is presented based on DTW barycenter averaging and our g(dp)² approach, using a hierarchical merging strategy. As illustrated by the experimental results, our approaches provide better within-group sums of squares and robustness than other relevant algorithms.

  11. From Networks to Time Series

    NASA Astrophysics Data System (ADS)

    Shimada, Yutaka; Ikeguchi, Tohru; Shigehara, Takaomi

    2012-10-01

    In this Letter, we propose a framework to transform a complex network to a time series. The transformation from complex networks to time series is realized by the classical multidimensional scaling. Applying the transformation method to a model proposed by Watts and Strogatz [Nature (London) 393, 440 (1998)], we show that ring lattices are transformed to periodic time series, small-world networks to noisy periodic time series, and random networks to random time series. We also show that these relationships are analytically held by using the circulant-matrix theory and the perturbation theory of linear operators. The results are generalized to several high-dimensional lattices.
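
    The transformation can be reproduced in a few lines for the ring-lattice case: compute shortest-path distances between nodes, apply classical (Torgerson) MDS, and read one embedding coordinate along the node index as the series. This sketch assumes the simplest case, a plain ring with geodesic distances; the coordinate traced out is periodic, as the Letter reports.

```python
import numpy as np

def ring_lattice_distances(n):
    """Shortest-path distances between nodes of an n-node ring."""
    idx = np.arange(n)
    d = np.abs(idx[:, None] - idx[None, :])
    return np.minimum(d, n - d).astype(float)

def classical_mds(D, k=2):
    """Classical (Torgerson) multidimensional scaling of a distance matrix."""
    n = len(D)
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D ** 2) @ J            # double-centred Gram matrix
    w, V = np.linalg.eigh(B)
    order = np.argsort(w)[::-1]            # largest eigenvalues first
    w, V = w[order][:k], V[:, order[:k]]
    return V * np.sqrt(np.maximum(w, 0.0))

coords = classical_mds(ring_lattice_distances(32), k=2)
series = coords[:, 0]   # coordinate vs. node index: a periodic "time series"
```

    Because the ring's distance matrix is circulant, the leading eigenvectors are Fourier modes, so the nodes embed on a circle and the first coordinate oscillates once around it.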

  12. 76 FR 14111 - Self-Regulatory Organizations; The NASDAQ Stock Market LLC; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-15

    ... Options Series, adjusted option series and any options series until the time to expiration for such series... time to expiration for such series is less than nine months be treated differently. Specifically, under... until the time to expiration for such series is less than nine months. Accordingly, the requirement to...

  13. Incremental fuzzy C medoids clustering of time series data using dynamic time warping distance

    PubMed Central

    Chen, Jingli; Wu, Shuai; Liu, Zhizhong; Chao, Hao

    2018-01-01

    Clustering time series data is of great significance since it could extract meaningful statistics and other characteristics. Especially in biomedical engineering, outstanding clustering algorithms for time series may help improve the health level of people. Considering data scale and time shifts of time series, in this paper, we introduce two incremental fuzzy clustering algorithms based on a Dynamic Time Warping (DTW) distance. For recruiting Single-Pass and Online patterns, our algorithms could handle large-scale time series data by splitting it into a set of chunks which are processed sequentially. Besides, our algorithms select DTW to measure distance of pair-wise time series and encourage higher clustering accuracy because DTW could determine an optimal match between any two time series by stretching or compressing segments of temporal data. Our new algorithms are compared to some existing prominent incremental fuzzy clustering algorithms on 12 benchmark time series datasets. The experimental results show that the proposed approaches could yield high quality clusters and were better than all the competitors in terms of clustering accuracy. PMID:29795600

  14. Incremental fuzzy C medoids clustering of time series data using dynamic time warping distance.

    PubMed

    Liu, Yongli; Chen, Jingli; Wu, Shuai; Liu, Zhizhong; Chao, Hao

    2018-01-01

    Clustering time series data is of great significance since it could extract meaningful statistics and other characteristics. Especially in biomedical engineering, outstanding clustering algorithms for time series may help improve the health level of people. Considering data scale and time shifts of time series, in this paper, we introduce two incremental fuzzy clustering algorithms based on a Dynamic Time Warping (DTW) distance. For recruiting Single-Pass and Online patterns, our algorithms could handle large-scale time series data by splitting it into a set of chunks which are processed sequentially. Besides, our algorithms select DTW to measure distance of pair-wise time series and encourage higher clustering accuracy because DTW could determine an optimal match between any two time series by stretching or compressing segments of temporal data. Our new algorithms are compared to some existing prominent incremental fuzzy clustering algorithms on 12 benchmark time series datasets. The experimental results show that the proposed approaches could yield high quality clusters and were better than all the competitors in terms of clustering accuracy.

  15. Visibility Graph Based Time Series Analysis.

    PubMed

    Stephen, Mutua; Gu, Changgui; Yang, Huijie

    2015-01-01

    Network-based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both their microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, and consequently provide limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states, and the successively occurring states are linked. This procedure converts a time series into a temporal network and, at the same time, a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks.
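
    The mapping from a segment to a natural visibility graph is compact enough to sketch: two samples are connected when the straight line between the tops of their bars clears every bar in between. A minimal brute-force illustration (the paper's segmenting and state-linking machinery is not shown):

```python
def visibility_edges(x):
    """Natural visibility graph of a series: nodes i and j are linked
    if no intermediate bar rises above the line joining (i, x[i]) and
    (j, x[j]).  Brute-force version, written for clarity rather than speed."""
    n = len(x)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            if all(x[k] < x[i] + (x[j] - x[i]) * (k - i) / (j - i)
                   for k in range(i + 1, j)):
                edges.add((i, j))
    return edges
```

    Adjacent samples are always mutually visible, so every series yields at least a path graph; peaks act as hubs because they "see" over their neighbors.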

  16. Multichannel biomedical time series clustering via hierarchical probabilistic latent semantic analysis.

    PubMed

    Wang, Jin; Sun, Xiangping; Nahavandi, Saeid; Kouzani, Abbas; Wu, Yuchuan; She, Mary

    2014-11-01

    Biomedical time series clustering that automatically groups a collection of time series according to their internal similarity is of importance for medical record management and inspection, such as bio-signal archiving and retrieval. In this paper, a novel framework that automatically groups a set of unlabelled multichannel biomedical time series according to their internal structural similarity is proposed. Specifically, we treat a multichannel biomedical time series as a document and extract local segments from the time series as words. We extend a topic model, the Hierarchical probabilistic Latent Semantic Analysis (H-pLSA), originally developed for visual motion analysis, to cluster a set of unlabelled multichannel time series. The H-pLSA models each channel of the multichannel time series using a local pLSA in the first layer. The topics learned in the local pLSA are then fed to a global pLSA in the second layer to discover the categories of multichannel time series. Experiments on a dataset extracted from multichannel Electrocardiography (ECG) signals demonstrate that the proposed method performs better than previous state-of-the-art approaches and is relatively robust to variations of parameters, including the length of local segments and the dictionary size. Although the experimental evaluation used multichannel ECG signals in a biometric scenario, the proposed algorithm is a universal framework for clustering multichannel biomedical time series according to their structural similarity, which has many applications in biomedical time series management. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  17. Analysis of Nonstationary Time Series for Biological Rhythms Research.

    PubMed

    Leise, Tanya L

    2017-06-01

    This article is part of a Journal of Biological Rhythms series exploring analysis and statistics topics relevant to researchers in biological rhythms and sleep research. The goal is to provide an overview of the most common issues that arise in the analysis and interpretation of data in these fields. In this article on time series analysis for biological rhythms, we describe some methods for assessing the rhythmic properties of time series, including tests of whether a time series is indeed rhythmic. Because biological rhythms can exhibit significant fluctuations in their period, phase, and amplitude, their analysis may require methods appropriate for nonstationary time series, such as wavelet transforms, which can measure how these rhythmic parameters change over time. We illustrate these methods using simulated and real time series.

  18. EMC: Air Quality Forecast Home page

    Science.gov Websites

    archive NAM Verification Meteorology Error Time Series EMC NAM Spatial Maps Real Time Mesoscale Analysis Precipitation verification NAQFC VERIFICATION CMAQ Ozone & PM Error Time Series AOD Error Time Series HYSPLIT Smoke forecasts vs GASP satellite Dust and Smoke Error Time Series HYSPLIT WCOSS Upgrade (July

  19. Time Series Remote Sensing in Monitoring the Spatio-Temporal Dynamics of Plant Invasions: A Study of Invasive Saltcedar (Tamarix Spp.)

    NASA Astrophysics Data System (ADS)

    Diao, Chunyuan

    In today's big data era, the increasing availability of satellite and airborne platforms at various spatial and temporal scales creates unprecedented opportunities to understand complex and dynamic systems such as plant invasion. Time series remote sensing is becoming more and more important for monitoring earth system dynamics and interactions. To date, most time series remote sensing studies have been conducted with images acquired at a coarse spatial scale, due to their relatively high temporal resolution. The construction of time series at a fine spatial scale, however, has been limited to a few discrete images acquired within or across years. The objective of this research is to advance time series remote sensing at a fine spatial scale, particularly to shift from discrete to continuous time series remote sensing. The objective is achieved through the following aims: 1) advance intra-annual time series remote sensing under the pure-pixel assumption; 2) advance intra-annual time series remote sensing under the mixed-pixel assumption; 3) advance inter-annual time series remote sensing in monitoring land surface dynamics; and 4) advance the species distribution model with time series remote sensing. Taking invasive saltcedar as an example, four methods (i.e., a phenological time series remote sensing model, a temporal partial unmixing method, a multiyear spectral angle clustering model, and a time series remote sensing-based spatially explicit species distribution model) were developed to achieve these objectives. Results indicated that the phenological time series remote sensing model could effectively map saltcedar distributions by characterizing the seasonal phenological dynamics of plant species throughout the year. The proposed temporal partial unmixing method, compared to conventional unmixing methods, could more accurately estimate saltcedar abundance within a pixel by exploiting the adequate temporal signatures of saltcedar. The multiyear spectral angle clustering model could guide the selection of the most representative remotely sensed image for repetitive saltcedar mapping over space and time. By incorporating spatial autocorrelation, the species distribution model developed in the study could identify suitable habitats of saltcedar at a fine spatial scale and locate areas at high risk of saltcedar infestation. Among 10 environmental variables, the distance to the river and the phenological attributes summarized by time series remote sensing were regarded as the most important. The methods developed in this study provide new perspectives on how continuous time series can be leveraged under various conditions to investigate plant invasion dynamics.

  20. A hybrid approach EMD-HW for short-term forecasting of daily stock market time series data

    NASA Astrophysics Data System (ADS)

    Awajan, Ahmad Mohd; Ismail, Mohd Tahir

    2017-08-01

    Recently, forecasting time series has attracted considerable attention in the field of analyzing financial time series data, specifically within the stock market index. Moreover, stock market forecasting is a challenging area of financial time-series forecasting. In this study, a hybrid methodology combining Empirical Mode Decomposition with the Holt-Winters method (EMD-HW) is used to improve forecasting performance on financial time series. The strength of EMD-HW lies in its ability to forecast non-stationary and non-linear time series without the need for any transformation method. Moreover, EMD-HW has relatively high accuracy and offers a new forecasting method for time series. Daily stock market time series data from 11 countries are used to show the forecasting performance of the proposed EMD-HW. Based on three forecast accuracy measures, the results indicate that EMD-HW outperforms the traditional Holt-Winters forecasting method.
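
    The Holt-Winters half of the hybrid is simple enough to sketch; the EMD step (decomposing the series into intrinsic mode functions, each forecast separately) is assumed to be handled elsewhere. A minimal additive Holt-Winters, with conventional initialization choices that are assumptions here, not the authors':

```python
def holt_winters_additive(x, season, alpha=0.3, beta=0.1, gamma=0.1, horizon=1):
    """Minimal additive Holt-Winters smoothing; returns point forecasts."""
    # crude initial level, trend and seasonal indices from the first seasons
    level = sum(x[:season]) / season
    trend = (sum(x[season:2 * season]) - sum(x[:season])) / season ** 2
    seas = [x[i] - level for i in range(season)]
    for t in range(season, len(x)):
        prev = level
        level = alpha * (x[t] - seas[t % season]) + (1 - alpha) * (level + trend)
        trend = beta * (level - prev) + (1 - beta) * trend
        seas[t % season] = gamma * (x[t] - level) + (1 - gamma) * seas[t % season]
    return [level + (h + 1) * trend + seas[(len(x) + h) % season]
            for h in range(horizon)]
```

    On a perfectly periodic input the smoothed level and trend settle immediately, so the forecasts reproduce the pattern exactly; real series, of course, only approximate this.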

  1. hctsa: A Computational Framework for Automated Time-Series Phenotyping Using Massive Feature Extraction.

    PubMed

    Fulcher, Ben D; Jones, Nick S

    2017-11-22

    Phenotype measurements frequently take the form of time series, but we currently lack a systematic method for relating these complex data streams to scientifically meaningful outcomes, such as relating the movement dynamics of organisms to their genotype or measurements of brain dynamics of a patient to their disease diagnosis. Previous work addressed this problem by comparing implementations of thousands of diverse scientific time-series analysis methods in an approach termed highly comparative time-series analysis. Here, we introduce hctsa, a software tool for applying this methodological approach to data. hctsa includes an architecture for computing over 7,700 time-series features and a suite of analysis and visualization algorithms to automatically select useful and interpretable time-series features for a given application. Using exemplar applications to high-throughput phenotyping experiments, we show how hctsa allows researchers to leverage decades of time-series research to quantify and understand informative structure in time-series data. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  2. Time series momentum and contrarian effects in the Chinese stock market

    NASA Astrophysics Data System (ADS)

    Shi, Huai-Long; Zhou, Wei-Xing

    2017-10-01

    This paper concentrates on the time series momentum or contrarian effects in the Chinese stock market. We evaluate the performance of the time series momentum strategy applied to major stock indices in mainland China and explore the relation between the performance of time series momentum strategies and some firm-specific characteristics. Our findings indicate that there is a time series momentum effect in the short run and a contrarian effect in the long run in the Chinese stock market. The performances of the time series momentum and contrarian strategies are highly dependent on the look-back and holding periods and firm-specific characteristics.

  3. A Multitaper, Causal Decomposition for Stochastic, Multivariate Time Series: Application to High-Frequency Calcium Imaging Data.

    PubMed

    Sornborger, Andrew T; Lauderdale, James D

    2016-11-01

    Neural data analysis has increasingly incorporated causal information to study circuit connectivity. Dimensional reduction forms the basis of most analyses of large multivariate time series. Here, we present a new, multitaper-based decomposition for stochastic, multivariate time series that acts on the covariance of the time series at all lags, C(τ), as opposed to standard methods that decompose the time series, X(t), using only information at zero lag. In both simulated and neural imaging examples, we demonstrate that methods neglecting the full causal structure may be discarding important dynamical information in a time series.

  4. Graphical Data Analysis on the Circle: Wrap-Around Time Series Plots for (Interrupted) Time Series Designs.

    PubMed

    Rodgers, Joseph Lee; Beasley, William Howard; Schuelke, Matthew

    2014-01-01

    Many data structures, particularly time series data, are naturally seasonal, cyclical, or otherwise circular. Past graphical methods for time series have focused on linear plots. In this article, we move graphical analysis onto the circle. We focus on 2 particular methods, one old and one new. Rose diagrams are circular histograms and can be produced in several different forms using the RRose software system. In addition, we propose, develop, illustrate, and provide software support for a new circular graphical method, called Wrap-Around Time Series Plots (WATS Plots), which is a graphical method useful to support time series analyses in general but in particular in relation to interrupted time series designs. We illustrate the use of WATS Plots with an interrupted time series design evaluating the effect of the Oklahoma City bombing on birthrates in Oklahoma County during the 10 years surrounding the bombing of the Murrah Building in Oklahoma City. We compare WATS Plots with linear time series representations and overlay them with smoothing and error bands. Each method is shown to have advantages in relation to the other; in our example, the WATS Plots more clearly show the existence and effect size of the fertility differential.

  5. Nonlinear parametric model for Granger causality of time series

    NASA Astrophysics Data System (ADS)

    Marinazzo, Daniele; Pellicoro, Mario; Stramaglia, Sebastiano

    2006-06-01

    The notion of Granger causality between two time series examines if the prediction of one series could be improved by incorporating information of the other. In particular, if the prediction error of the first time series is reduced by including measurements from the second time series, then the second time series is said to have a causal influence on the first one. We propose a radial basis function approach to nonlinear Granger causality. The proposed model is not constrained to be additive in variables from the two time series and can approximate any function of these variables, still being suitable to evaluate causality. Usefulness of this measure of causality is shown in two applications. In the first application, a physiological one, we consider time series of heart rate and blood pressure in congestive heart failure patients and patients affected by sepsis: we find that sepsis patients, unlike congestive heart failure patients, show symmetric causal relationships between the two time series. In the second application, we consider the feedback loop in a model of excitatory and inhibitory neurons: we find that in this system causality measures the combined influence of couplings and membrane time constants.
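
    For contrast with the radial-basis-function model proposed here, the standard linear Granger check can be sketched in a few lines: fit the target series on its own lags, then on its own lags plus the other series' lags, and compare residual errors. This is the linear baseline only, not the paper's nonlinear method.

```python
import numpy as np

def granger_ratio(x, y, p=2):
    """Ratio of restricted to unrestricted residual sums of squares:
    values well above 1 indicate that lags of y help predict x."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    n = len(x)
    target = x[p:]

    def lags(z):
        # column k holds z lagged by k + 1, aligned with target
        return np.column_stack([z[p - k - 1:n - k - 1] for k in range(p)])

    def rss(M):
        A = np.column_stack([np.ones(len(M)), M])   # intercept + regressors
        beta, *_ = np.linalg.lstsq(A, target, rcond=None)
        r = target - A @ beta
        return r @ r

    return rss(lags(x)) / rss(np.column_stack([lags(x), lags(y)]))
```

    The ratio is always at least (approximately) 1, since the unrestricted model nests the restricted one; only a clearly larger value suggests a causal influence.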

  6. Multivariate time series clustering on geophysical data recorded at Mt. Etna from 1996 to 2003

    NASA Astrophysics Data System (ADS)

    Di Salvo, Roberto; Montalto, Placido; Nunnari, Giuseppe; Neri, Marco; Puglisi, Giuseppe

    2013-02-01

    Time series clustering is an important data-analysis task for extracting implicit, previously unknown, and potentially useful information from large collections of data. Finding useful similar trends in multivariate time series represents a challenge in several areas, including geophysical and environmental research. While traditional time series analysis methods deal only with univariate time series, multivariate time series analysis is a more suitable approach in fields where different kinds of data are available. Moreover, conventional time series clustering techniques do not provide the desired results for geophysical datasets due to the huge amount of data, whose sampling rates differ according to the nature of each signal. In this paper, a novel approach to clustering geophysical multivariate time series is proposed, using dynamic time series segmentation and Self-Organizing Map techniques. The method reveals couplings among trends of different geophysical data recorded by monitoring networks at Mt. Etna from 1996 to 2003, when the transition from summit eruptions to flank eruptions occurred. This information can be used to carry out a more careful evaluation of the state of the volcano and to define potential hazard assessment at Mt. Etna.

  7. An Energy-Based Similarity Measure for Time Series

    NASA Astrophysics Data System (ADS)

    Boudraa, Abdel-Ouahab; Cexus, Jean-Christophe; Groussat, Mathieu; Brunagel, Pierre

    2007-12-01

    A new similarity measure for time series analysis, called SimilB, based on the cross-Ψ_B-energy operator, is introduced. Ψ_B is a nonlinear measure which quantifies the interaction between two time series. Compared to Euclidean distance (ED) or the Pearson correlation coefficient (CC), SimilB includes the temporal information and relative changes of the time series using their first and second derivatives. SimilB is well suited for both nonstationary and stationary time series, and particularly those presenting discontinuities. Some new properties of Ψ_B are presented. In particular, we show that Ψ_B as a similarity measure is robust to both scale and time shift. SimilB is illustrated with synthetic time series and an artificial dataset and compared to the CC and the ED measures.
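
    The interaction-energy idea can be illustrated with the discrete Teager-Kaiser operator and a symmetric cross extension. This is a simplified stand-in for the paper's Ψ_B operator, not the authors' exact definition, and the normalization below is likewise only an illustrative choice.

```python
import numpy as np

def cross_energy(x, y):
    """Symmetric discrete cross Teager-Kaiser operator:
    Psi(x, y)[n] = x[n]*y[n] - (x[n-1]*y[n+1] + x[n+1]*y[n-1]) / 2.
    For y = x this reduces to the classic TKEO x[n]**2 - x[n-1]*x[n+1]."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return x[1:-1] * y[1:-1] - 0.5 * (x[:-2] * y[2:] + x[2:] * y[:-2])

def simil(x, y):
    """Normalized interaction energy, in the spirit of SimilB: equals 1 for
    y = c * x (amplitude-scale invariance) and shrinks as the two signals'
    dynamics diverge. Assumes oscillatory signals with positive self-energy."""
    num = cross_energy(x, y).sum()
    den = np.sqrt(cross_energy(x, x).sum() * cross_energy(y, y).sum())
    return num / den

t = np.arange(200)
x = np.sin(0.3 * t)
```

    The normalization makes simil(x, 2 * x) equal to 1, a toy version of the scale robustness claimed for Ψ_B.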

  8. A hybrid algorithm for clustering of time series data based on affinity search technique.

    PubMed

    Aghabozorgi, Saeed; Ying Wah, Teh; Herawan, Tutut; Jalab, Hamid A; Shaygan, Mohammad Amin; Jalali, Alireza

    2014-01-01

    Time series clustering is an important solution to various problems in numerous fields of research, including business, medical science, and finance. However, conventional clustering algorithms are not practical for time series data because they are essentially designed for static data. This impracticality results in poor clustering accuracy in several systems. In this paper, a new hybrid clustering algorithm is proposed based on the similarity in shape of time series data. Time series data are first grouped as subclusters based on similarity in time. The subclusters are then merged using the k-Medoids algorithm based on similarity in shape. This model has two contributions: (1) it is more accurate than other conventional and hybrid approaches and (2) it determines the similarity in shape among time series data with low complexity. To evaluate the accuracy of the proposed model, the model is tested extensively using synthetic and real-world time series datasets.
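
    The merge step can be illustrated with a naive k-Medoids (PAM-style) pass over a precomputed distance matrix. The subcluster stage and the paper's shape-based distance are omitted, so this is only a generic sketch with a toy pointwise distance.

```python
import numpy as np

def k_medoids(D, k, iters=100, seed=0):
    """Naive k-Medoids on a symmetric distance matrix D (n x n):
    alternate between assigning points to the nearest medoid and moving
    each medoid to the member that minimizes within-cluster distance."""
    rng = np.random.default_rng(seed)
    n = len(D)
    medoids = rng.choice(n, size=k, replace=False)
    labels = np.argmin(D[:, medoids], axis=1)
    for _ in range(iters):
        new = medoids.copy()
        for j in range(k):
            members = np.where(labels == j)[0]
            if members.size:  # keep the old medoid if a cluster empties
                within = D[np.ix_(members, members)].sum(axis=0)
                new[j] = members[np.argmin(within)]
        labels = np.argmin(D[:, new], axis=1)
        if set(new) == set(medoids):
            break
        medoids = new
    return labels

# Toy data: two well-separated groups of short "series", compared by a
# simple pointwise distance (the paper would use a shape-based measure).
series = np.array([[0.0, 0.1], [0.1, 0.2], [0.2, 0.1],
                   [9.0, 9.1], [9.1, 9.2], [9.2, 9.1]])
D = np.abs(series[:, None, :] - series[None, :, :]).sum(axis=2)
labels = k_medoids(D, 2)
```

    Because k-Medoids only ever needs the distance matrix, the shape-based similarity of the paper can be swapped in without changing the clustering loop.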

  9. A Hybrid Algorithm for Clustering of Time Series Data Based on Affinity Search Technique

    PubMed Central

    Aghabozorgi, Saeed; Ying Wah, Teh; Herawan, Tutut; Jalab, Hamid A.; Shaygan, Mohammad Amin; Jalali, Alireza

    2014-01-01

    Time series clustering is an important solution to various problems in numerous fields of research, including business, medical science, and finance. However, conventional clustering algorithms are not practical for time series data because they are essentially designed for static data. This impracticality results in poor clustering accuracy in several systems. In this paper, a new hybrid clustering algorithm is proposed based on the similarity in shape of time series data. Time series data are first grouped as subclusters based on similarity in time. The subclusters are then merged using the k-Medoids algorithm based on similarity in shape. This model has two contributions: (1) it is more accurate than other conventional and hybrid approaches and (2) it determines the similarity in shape among time series data with low complexity. To evaluate the accuracy of the proposed model, the model is tested extensively using synthetic and real-world time series datasets. PMID:24982966

  10. FerryMon: An Unattended Ferry-Based Observatory to Assess Human and Climatically- Induced Ecological Change in the Neuse River-Pamlico Sound System, North Carolina, USA

    NASA Astrophysics Data System (ADS)

    Guajardo, R.; Paerl, H. W.; Hall, N.; Whipple, A.; Luettich, R.

    2007-12-01

    In North Carolina's Neuse River Estuary (NRE)-Pamlico Sound (PS) System, nitrogen (N)-driven eutrophication, water quality and habitat decline have prompted the State and US EPA to mandate watershed-based N load reductions, including a total maximum daily allowable N load (TMDL). Chlorophyll a (chl-a), the indicator of algal biomass, is the measure of the efficacy of N reductions, with "acceptable" values being <40 μg chl-a L-1. However, algal blooms are patchy in time and space, making exceedances of 40 μg L-1 difficult to track. The North Carolina ferry-based water quality monitoring program, FerryMon (www.ferrymon.org), addresses this and other environmental monitoring needs in the NRE-PS. FerryMon uses NC DOT ferries to provide continuous, space-time intensive, accurate measurements of chl-a and other key water quality criteria, using sensors placed in a flow-through system and discrete sampling of nutrients, organics, and diagnostic photopigment and molecular indicators of major algal groups in a near real-time manner. Complementing FerryMon are automated vertical profilers (AVPs), which produce chl-a and other water quality indicator depth profiles with very high time and vertical resolution. In-line spectral fluorometers (Algae Online Analyzers, AOAs) will be installed starting in late 2007, providing rapid early-warning detection and quantification of algal blooms. FerryMon permits spatial characterization of trends in water quality conditions over a range of relevant physical, chemical and biological time scales. This enhanced capability is timely, given a protracted period of increased tropical storm and hurricane activity that, in combination with anthropogenic nutrient enrichment, affects water quality in unpredictable, yet significant ways. FerryMon also serves as a data source for calibrating and verifying remotely sensed indicators of water quality (photopigments, turbidity), nutrient-productivity and hydrologic modeling. Data management and communication links allow FerryMon to integrate with complementary watershed, estuarine and coastal observational programs. FerryMon's technology is readily transferable to other estuarine, large lake and coastal ecosystems served by ferries and other "ships of opportunity".

  11. Clustering Financial Time Series by Network Community Analysis

    NASA Astrophysics Data System (ADS)

    Piccardi, Carlo; Calatroni, Lisa; Bertoni, Fabio

    In this paper, we describe a method for clustering financial time series which is based on community analysis, a recently developed approach for partitioning the nodes of a network (graph). A network with N nodes is associated to the set of N time series. The weight of the link (i, j), which quantifies the similarity between the two corresponding time series, is defined according to a metric based on symbolic time series analysis, which has recently proved effective in the context of financial time series. Then, searching for network communities allows one to identify groups of nodes (and then time series) with strong similarity. A quantitative assessment of the significance of the obtained partition is also provided. The method is applied to two distinct case-studies concerning the US and Italy Stock Exchange, respectively. In the US case, the stability of the partitions over time is also thoroughly investigated. The results favorably compare with those obtained with the standard tools typically used for clustering financial time series, such as the minimal spanning tree and the hierarchical tree.
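
    The series-to-network-to-groups pipeline can be caricatured with plain correlations and a threshold graph. The paper instead uses a symbolic-analysis metric and proper community detection with a significance assessment, so the helper below is only a minimal illustrative sketch.

```python
import numpy as np

def network_clusters(returns, threshold=0.5):
    """Link series i and j when their correlation exceeds `threshold`, then
    read clusters off as connected components via a small union-find."""
    C = np.corrcoef(returns)
    n = len(C)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    for i in range(n):
        for j in range(i + 1, n):
            if C[i, j] > threshold:
                parent[find(i)] = find(j)
    return [find(i) for i in range(n)]

# Two "sectors" driven by independent common factors plus idiosyncratic noise.
rng = np.random.default_rng(1)
f1, f2 = rng.standard_normal((2, 500))
returns = np.array([f1 + 0.3 * rng.standard_normal(500) for _ in range(3)] +
                   [f2 + 0.3 * rng.standard_normal(500) for _ in range(3)])
labels = network_clusters(returns)
```

    Community-detection methods refine this idea by weighting every link and maximizing modularity instead of applying a hard threshold.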

  12. A perturbative approach for enhancing the performance of time series forecasting.

    PubMed

    de Mattos Neto, Paulo S G; Ferreira, Tiago A E; Lima, Aranildo R; Vasconcelos, Germano C; Cavalcanti, George D C

    2017-04-01

    This paper proposes a method to perform time series prediction based on perturbation theory. The approach is based on continuously adjusting an initial forecasting model to asymptotically approximate a desired time series model. First, a predictive model generates an initial forecasting for a time series. Second, a residual time series is calculated as the difference between the original time series and the initial forecasting. If that residual series is not white noise, then it can be used to improve the accuracy of the initial model and a new predictive model is adjusted using residual series. The whole process is repeated until convergence or the residual series becomes white noise. The output of the method is then given by summing up the outputs of all trained predictive models in a perturbative sense. To test the method, an experimental investigation was conducted on six real world time series. A comparison was made with six other methods experimented and ten other results found in the literature. Results show that not only the performance of the initial model is significantly improved but also the proposed method outperforms the other results previously published. Copyright © 2017 Elsevier Ltd. All rights reserved.
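
    The iterate-on-residuals scheme can be sketched with a simple autoregressive base learner. The paper pairs the scheme with more capable predictors and a white-noise stopping test; the names and the two-stage choice below are illustrative.

```python
import numpy as np

def ar_predict(z, order):
    """In-sample least-squares AR(order) predictions for z[order:]."""
    X = np.column_stack([z[order - k - 1:len(z) - k - 1] for k in range(order)])
    y = z[order:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return X @ coef, y

def perturbative_fit(series, order=2, stages=2):
    """Stage 1 fits the series; each later stage fits the previous residual
    (in the full method this repeats until the residual is white noise).
    The combined prediction sums all stage outputs, aligned on the time
    region that every stage covers."""
    r = np.asarray(series, float)
    preds = []
    for _ in range(stages):
        p, y = ar_predict(r, order)
        preds.append(p)
        r = y - p  # the residual series becomes the next stage's input
    m = len(preds[-1])
    combined = sum(p[-m:] for p in preds)
    target = np.asarray(series, float)[-m:]
    return combined, preds[0][-m:], target

rng = np.random.default_rng(2)
n = 500
s = np.zeros(n)
for t in range(2, n):
    s[t] = 0.6 * s[t - 1] - 0.3 * s[t - 2] + rng.standard_normal()
combined, stage1, target = perturbative_fit(s)
```

    By construction, adding a residual stage cannot worsen the in-sample error on the region it is fitted on, which is the "perturbative" correction at work.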

  13. Multifractal analysis of the Korean agricultural market

    NASA Astrophysics Data System (ADS)

    Kim, Hongseok; Oh, Gabjin; Kim, Seunghwan

    2011-11-01

    We have studied the long-term memory effects of the Korean agricultural market using the detrended fluctuation analysis (DFA) method. In general, the return time series of various financial data, including stock indices, foreign exchange rates, and commodity prices, are uncorrelated in time, while the volatility time series are strongly correlated. However, we found that the return time series of Korean agricultural commodity prices are anti-correlated in time, while the volatility time series are correlated. The n-point correlations of time series were also examined, and it was found that a multifractal structure exists in Korean agricultural market prices.
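
    First-order DFA itself is compact enough to sketch directly; the multifractal analysis in the paper generalizes the fluctuation average to q-th moments, which is not shown here.

```python
import numpy as np

def dfa_alpha(series, scales):
    """First-order detrended fluctuation analysis. Returns the exponent
    alpha of F(s) ~ s**alpha: alpha near 0.5 for uncorrelated noise,
    below 0.5 for anti-correlated and above 0.5 for correlated series."""
    x = np.asarray(series, float)
    profile = np.cumsum(x - x.mean())
    F = []
    for s in scales:
        n_seg = len(profile) // s
        segs = profile[:n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        ms = []
        for seg in segs:
            c = np.polyfit(t, seg, 1)  # linear detrend per segment
            ms.append(np.mean((seg - np.polyval(c, t)) ** 2))
        F.append(np.sqrt(np.mean(ms)))
    return np.polyfit(np.log(scales), np.log(F), 1)[0]

rng = np.random.default_rng(3)
alpha_white = dfa_alpha(rng.standard_normal(4096), [8, 16, 32, 64, 128])
```

    Anti-correlated returns, as reported for the Korean agricultural commodities, would push this exponent below the 0.5 obtained for white noise.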

  14. Visibility Graph Based Time Series Analysis

    PubMed Central

    Stephen, Mutua; Gu, Changgui; Yang, Huijie

    2015-01-01

    Network-based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both their microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, which consequently provide limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states, and the successively occurring states are linked. This procedure converts a time series into a temporal network and at the same time a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks. PMID:26571115
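
    The natural visibility criterion that underlies such mappings is easy to state in code: two samples see each other when every intermediate sample lies strictly below the straight line joining them. The sketch below builds the edge set for a short series; it covers only the basic series-to-graph step, not the paper's segment-wise network of networks.

```python
def visibility_edges(x):
    """Natural visibility graph of series x: nodes are time indices, and
    (a, b) is an edge when every intermediate sample lies strictly below
    the line segment from (a, x[a]) to (b, x[b])."""
    n = len(x)
    edges = set()
    for a in range(n):
        for b in range(a + 1, n):
            if all(x[c] < x[a] + (x[b] - x[a]) * (c - a) / (b - a)
                   for c in range(a + 1, b)):
                edges.add((a, b))
    return edges
```

    On [1, 2, 3] the collinear middle point blocks the (0, 2) link, while on [3, 1, 2] the dip leaves all three pairs mutually visible.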

  15. Quantifying memory in complex physiological time-series.

    PubMed

    Shirazi, Amir H; Raoufy, Mohammad R; Ebadi, Haleh; De Rui, Michele; Schiff, Sami; Mazloom, Roham; Hajizadeh, Sohrab; Gharibzadeh, Shahriar; Dehpour, Ahmad R; Amodio, Piero; Jafari, G Reza; Montagnese, Sara; Mani, Ali R

    2013-01-01

    In a time-series, memory is a statistical feature that lasts for a period of time and distinguishes the time-series from a random, or memory-less, process. In the present study, the concept of "memory length" was used to define the time period, or scale over which rare events within a physiological time-series do not appear randomly. The method is based on inverse statistical analysis and provides empiric evidence that rare fluctuations in cardio-respiratory time-series are 'forgotten' quickly in healthy subjects while the memory for such events is significantly prolonged in pathological conditions such as asthma (respiratory time-series) and liver cirrhosis (heart-beat time-series). The memory length was significantly higher in patients with uncontrolled asthma compared to healthy volunteers. Likewise, it was significantly higher in patients with decompensated cirrhosis compared to those with compensated cirrhosis and healthy volunteers. We also observed that the cardio-respiratory system has simple low order dynamics and short memory around its average, and high order dynamics around rare fluctuations.

  16. Quantifying Memory in Complex Physiological Time-Series

    PubMed Central

    Shirazi, Amir H.; Raoufy, Mohammad R.; Ebadi, Haleh; De Rui, Michele; Schiff, Sami; Mazloom, Roham; Hajizadeh, Sohrab; Gharibzadeh, Shahriar; Dehpour, Ahmad R.; Amodio, Piero; Jafari, G. Reza; Montagnese, Sara; Mani, Ali R.

    2013-01-01

    In a time-series, memory is a statistical feature that lasts for a period of time and distinguishes the time-series from a random, or memory-less, process. In the present study, the concept of “memory length” was used to define the time period, or scale over which rare events within a physiological time-series do not appear randomly. The method is based on inverse statistical analysis and provides empiric evidence that rare fluctuations in cardio-respiratory time-series are ‘forgotten’ quickly in healthy subjects while the memory for such events is significantly prolonged in pathological conditions such as asthma (respiratory time-series) and liver cirrhosis (heart-beat time-series). The memory length was significantly higher in patients with uncontrolled asthma compared to healthy volunteers. Likewise, it was significantly higher in patients with decompensated cirrhosis compared to those with compensated cirrhosis and healthy volunteers. We also observed that the cardio-respiratory system has simple low order dynamics and short memory around its average, and high order dynamics around rare fluctuations. PMID:24039811

  17. Scale-dependent intrinsic entropies of complex time series.

    PubMed

    Yeh, Jia-Rong; Peng, Chung-Kang; Huang, Norden E

    2016-04-13

    Multi-scale entropy (MSE) was developed as a measure of complexity for complex time series, and it has been applied widely in recent years. The MSE algorithm is based on the assumption that biological systems possess the ability to adapt and function in an ever-changing environment, and these systems need to operate across multiple temporal and spatial scales, such that their complexity is also multi-scale and hierarchical. Here, we present a systematic approach to apply the empirical mode decomposition algorithm, which can detrend time series on various time scales, prior to analysing a signal's complexity by measuring the irregularity of its dynamics on multiple time scales. Simulated time series of fractal Gaussian noise and human heartbeat time series were used to study the performance of this new approach. We show that our method can successfully quantify the fractal properties of the simulated time series and can accurately distinguish modulations in human heartbeat time series in health and disease. © 2016 The Author(s).
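
    The multiscale machinery can be sketched as coarse-graining followed by a simplified sample entropy. The paper's actual contribution, empirical mode decomposition detrending before the entropy step, is omitted here, so this shows only the generic MSE scaffolding.

```python
import numpy as np

def coarse_grain(x, scale):
    """Non-overlapping window averages: the coarse-graining step of MSE."""
    n = len(x) // scale
    return np.asarray(x[:n * scale], float).reshape(n, scale).mean(axis=1)

def sample_entropy(x, m=2, r_frac=0.2):
    """Simplified sample entropy: -log of the ratio of (m+1)-point template
    matches to m-point matches within tolerance r (Chebyshev distance)."""
    x = np.asarray(x, float)
    r = r_frac * x.std()

    def matches(mm):
        T = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        c = 0
        for i in range(len(T)):
            d = np.max(np.abs(T - T[i]), axis=1)
            c += int(np.sum(d <= r)) - 1  # exclude the self-match
        return c

    return -np.log(matches(m + 1) / matches(m))

rng = np.random.default_rng(4)
noise = rng.standard_normal(1000)
regular = np.sin(np.linspace(0, 16 * np.pi, 1000))
```

    Running sample_entropy on coarse_grain(x, s) for s = 1, 2, 3, ... traces out the entropy-versus-scale curve that MSE interprets.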

  18. A non linear analysis of human gait time series based on multifractal analysis and cross correlations

    NASA Astrophysics Data System (ADS)

    Muñoz-Diosdado, A.

    2005-01-01

    We analyzed databases of gait time series from healthy adults and persons with Parkinson's disease, Huntington's disease, and amyotrophic lateral sclerosis (ALS). We obtained staircase graphs of accumulated events that can be bounded by a straight line whose slope can be used to distinguish between gait time series from healthy and ill persons. The global Hurst exponents of these series show no clear tendencies; we suggest that this is because some gait time series have monofractal behavior and others have multifractal behavior, so they cannot be characterized with a single Hurst exponent. We calculated the multifractal spectra and their widths, and found that the spectra of healthy young persons are almost monofractal, while the spectra of ill persons are wider than those of healthy persons. In contrast to interbeat time series, where pathology implies a loss of multifractality, in gait time series multifractal behavior emerges with pathology. Data were collected from healthy and ill subjects as they walked in a roughly circular path with sensors on both feet, so there is one time series for the left foot and another for the right. First, we analyzed these time series separately; then we compared the results, both directly and with a cross-correlation analysis. We looked for differences between the two time series that could be used as indicators of equilibrium problems.

  19. DNA as Sensors and Imaging Agents for Metal Ions

    PubMed Central

    Xiang, Yu

    2014-01-01

    Increasing interests in detecting metal ions in many chemical and biomedical fields have created demands for developing sensors and imaging agents for metal ions with high sensitivity and selectivity. This review covers recent progress in DNA-based sensors and imaging agents for metal ions. Through both combinatorial selection and rational design, a number of metal ion-dependent DNAzymes and metal ion-binding DNA structures that can selectively recognize specific metal ions have been obtained. By attaching these DNA molecules with signal reporters such as fluorophores, chromophores, electrochemical tags, and Raman tags, a number of DNA-based sensors for both diamagnetic and paramagnetic metal ions have been developed for fluorescent, colorimetric, electrochemical, and surface Raman detections. These sensors are highly sensitive (with detection limit down to 11 ppt) and selective (with selectivity up to millions-fold) toward specific metal ions. In addition, through further development to simplify the operation, such as the use of “dipstick tests”, portable fluorometers, computer-readable discs, and widely available glucose meters, these sensors have been applied for on-site and real-time environmental monitoring and point-of-care medical diagnostics. The use of these sensors for in situ cellular imaging has also been reported. The generality of the combinatorial selection to obtain DNAzymes for almost any metal ion in any oxidation state, and the ease of modification of the DNA with different signal reporters make DNA an emerging and promising class of molecules for metal ion sensing and imaging in many fields of applications. PMID:24359450

  20. The examination of headache activity using time-series research designs.

    PubMed

    Houle, Timothy T; Remble, Thomas A; Houle, Thomas A

    2005-05-01

    The majority of research conducted on headache has utilized cross-sectional designs, which preclude the examination of dynamic factors and principally rely on group-level effects. The present article describes the application of an individual-oriented process model using time-series analytical techniques. The blending of a time-series approach with an interactive process model allows consideration of the relationships among intra-individual dynamic processes while not precluding the researcher from examining inter-individual differences. The authors explore the nature of time-series data and present two necessary assumptions underlying the time-series approach. The concept of shock and its contribution to headache activity is also presented. The time-series approach is not without its problems, and two such problems are specifically reported: autocorrelation and the distribution of daily observations. The article concludes with the presentation of several analytical techniques suited to examining the time-series interactive process model.

  1. Long-range correlations in time series generated by time-fractional diffusion: A numerical study

    NASA Astrophysics Data System (ADS)

    Barbieri, Davide; Vivoli, Alessandro

    2005-09-01

    Time series models showing power law tails in autocorrelation functions are common in econometrics. A special non-Markovian model for such kind of time series is provided by the random walk introduced by Gorenflo et al. as a discretization of time fractional diffusion. The time series so obtained are analyzed here from a numerical point of view in terms of autocorrelations and covariance matrices.

  2. Improving estimates of ecosystem metabolism by reducing effects of tidal advection on dissolved oxygen time series

    EPA Science Inventory

    In aquatic systems, time series of dissolved oxygen (DO) have been used to compute estimates of ecosystem metabolism. Central to this open-water method is the assumption that the DO time series is a Lagrangian specification of the flow field. However, most DO time series are coll...

  3. 76 FR 28897 - Magnuson-Stevens Act Provisions; Fisheries Off West Coast States; Pacific Coast Groundfish...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-19

    ... time series became closer (while depletion at the end of the time series became more divergent); (4) the agreement in the recruitment time series was much improved; (5) recruitment deviations in log space showed much closer agreement; and (6) the fishing intensity time series showed much closer...

  4. 75 FR 4570 - Government-Owned Inventions; Availability for Licensing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-28

    ... applications. Signal-to-Noise Enhancement in Imaging Applications Using a Time-Series of Images Description of... applications that use a time-series of images. In one embodiment of the invention, a time-series of images is... Imaging Applications Using a Time-Series of Images'' (HHS Reference No. E-292- 2009/0-US-01). Related...

  5. 78 FR 15385 - Self-Regulatory Organizations; NASDAQ OMX BX, Inc.; Notice of Filing of Proposed Rule Change To...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-11

    ... Series, any adjusted option series, and any option series until the time to expiration for such series is... existing requirement may at times discourage liquidity in particular options series because a market maker... the option is subject to the Price/Time execution algorithm, the Directed Market Maker shall receive...

  6. A novel water quality data analysis framework based on time-series data mining.

    PubMed

    Deng, Weihui; Wang, Guoyin

    2017-07-01

    The rapid development of time-series data mining provides an emerging method for water resource management research. In this paper, based on the time-series data mining methodology, we propose a novel and general analysis framework for water quality time-series data. It consists of two parts: implementation components and common tasks of time-series data mining in water quality data. In the first part, we propose to granulate the time series into several two-dimensional normal clouds and calculate the similarities at the granulated level. On the basis of the similarity matrix, the similarity search, anomaly detection, and pattern discovery tasks on the water quality time-series dataset can be easily implemented in the second part. We present a case study of this analysis framework on weekly dissolved oxygen (DO) time-series data collected from five monitoring stations on the upper reaches of the Yangtze River, China. The case study revealed the relationship between water quality in the mainstream and its tributaries, as well as the main changing patterns of DO. The experimental results show that the proposed analysis framework is a feasible and efficient method for mining hidden and valuable knowledge from historical water quality time-series data. Copyright © 2017 Elsevier Ltd. All rights reserved.
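
    The granulate-then-compare idea can be caricatured by summarizing each window with its mean and standard deviation, crude stand-ins for the cloud parameters Ex and En. The paper also estimates hyper-entropy He with a backward cloud generator and uses a proper cloud similarity, so everything below, including the window width and the similarity formula, is an illustrative simplification.

```python
import numpy as np

def granulate(series, width):
    """Summarize each non-overlapping window by (mean, std): a crude
    two-parameter granule in the spirit of a normal cloud (Ex, En)."""
    n = len(series) // width
    w = np.asarray(series[:n * width], float).reshape(n, width)
    return np.column_stack([w.mean(axis=1), w.std(axis=1)])

def granule_similarity(a, b, width=4):
    """Similarity of two series at the granulated level: 1 for identical
    granule sequences, decreasing toward 0 as they diverge."""
    ga, gb = granulate(a, width), granulate(b, width)
    m = min(len(ga), len(gb))
    d = np.linalg.norm(ga[:m] - gb[:m], axis=1).mean()
    return 1.0 / (1.0 + d)
```

    Filling a station-by-station matrix with such similarities is what enables the similarity search, anomaly detection, and pattern discovery tasks downstream.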

  7. Climate Prediction Center - Stratosphere: Polar Stratosphere and Ozone

    Science.gov Websites

    Pages describe conditions in the polar stratosphere under which ozone depletion processes can occur, including latitudinal-time cross sections of the thermal evolution. Linked products include: UV Daily Dosage Estimate; South Polar Vertical Ozone Profile; Time Series of Size of S.H. Polar Vortex; Time Series of Size of S.H. PSC Temperature; Time Series of Size of N.H. Polar Vortex; Time Series of ...

  8. Using forbidden ordinal patterns to detect determinism in irregularly sampled time series.

    PubMed

    Kulp, C W; Chobot, J M; Niskala, B J; Needhammer, C J

    2016-02-01

    It is known that when symbolizing a time series into ordinal patterns using the Bandt-Pompe (BP) methodology, there will be ordinal patterns called forbidden patterns that do not occur in a deterministic series. The existence of forbidden patterns can be used to identify deterministic dynamics. In this paper, the ability to use forbidden patterns to detect determinism in irregularly sampled time series is tested on data generated from a continuous model system. The study is done in three parts. First, the effects of sampling time on the number of forbidden patterns are studied on regularly sampled time series. The next two parts focus on two types of irregular-sampling, missing data and timing jitter. It is shown that forbidden patterns can be used to detect determinism in irregularly sampled time series for low degrees of sampling irregularity (as defined in the paper). In addition, comments are made about the appropriateness of using the BP methodology to symbolize irregularly sampled time series.
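
    Counting forbidden patterns is short in code. A known example: the fully chaotic logistic map never produces two consecutive decreases, so one length-3 ordinal pattern is forbidden, whereas white noise eventually visits all six. The helper names below are illustrative.

```python
import numpy as np
from math import factorial

def ordinal_patterns(x, d=3):
    """Bandt-Pompe symbolization: the pattern of a window is the permutation
    that sorts it; returns the set of patterns that actually occur."""
    return {tuple(np.argsort(x[i:i + d])) for i in range(len(x) - d + 1)}

def n_forbidden(x, d=3):
    """Number of length-d ordinal patterns that never occur in x."""
    return factorial(d) - len(ordinal_patterns(x, d))

# Deterministic series: the fully chaotic logistic map x -> 4x(1-x).
x = [0.3]
for _ in range(1000):
    x.append(4 * x[-1] * (1 - x[-1]))
x = np.array(x)

rng = np.random.default_rng(5)
noise = rng.standard_normal(1001)
```

    The logistic series keeps its doubly decreasing pattern (2, 1, 0) forbidden while the noise series shows none; with missing data or timing jitter this contrast degrades, which is the regime the paper quantifies.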

  9. Sensor-Generated Time Series Events: A Definition Language

    PubMed Central

    Anguera, Aurea; Lara, Juan A.; Lizcano, David; Martínez, Maria Aurora; Pazos, Juan

    2012-01-01

    There are now a great many domains where information is recorded by sensors over a limited time period or on a permanent basis. This data flow leads to sequences of data known as time series. In many domains, like seismography or medicine, time series analysis focuses on particular regions of interest, known as events, whereas the remainder of the time series contains hardly any useful information. In these domains, there is a need for mechanisms to identify and locate such events. In this paper, we propose an events definition language that is general enough to be used to easily and naturally define events in time series recorded by sensors in any domain. The proposed language has been applied to the definition of time series events generated within the branch of medicine dealing with balance-related functions in human beings. A device called a posturograph is used to study balance-related functions. The platform has four sensors that record the pressure intensity being exerted on the platform, generating four interrelated time series. As opposed to the existing ad hoc proposals, the results confirm that the proposed language is valid, that is, generally applicable and accurate, for identifying the events contained in the time series.

  10. Homogenising time series: Beliefs, dogmas and facts

    NASA Astrophysics Data System (ADS)

    Domonkos, P.

    2010-09-01

    For obtaining reliable information about climate change and climate variability, the use of high-quality data series is essential, and one basic tool of quality improvement is the statistical homogenisation of observed time series. In recent decades a large number of homogenisation methods have been developed, but the real effects of their application on time series are still not entirely known. The ongoing COST HOME project (COST ES0601) is devoted to revealing the real impacts of homogenisation methods in more detail and with higher confidence than before. As part of the COST activity, a benchmark dataset was built whose characteristics approach well those of real networks of observed time series. This dataset offers a much better opportunity than ever before to test the wide variety of homogenisation methods and to analyse the real effects of selected theoretical recommendations. The author believes that several old theoretical rules have to be re-evaluated. Some examples of the open questions: a) Can statistically detected change-points be accepted only with the confirmation of metadata information? b) Do semi-hierarchic algorithms for detecting multiple change-points in time series function effectively in practice? c) Is it good to limit the spatial comparison of candidate series to at most five other series in the neighbourhood? Empirical results, from the COST benchmark and other experiments, show that real observed time series usually include several inhomogeneities of different sizes. Small inhomogeneities resemble part of the climatic variability, so the classic assumption that change-points of observed time series can be found and corrected one by one cannot be applied in its pure form. However, after homogenisation the linear trends, seasonal changes and long-term fluctuations of time series are usually much closer to reality than in the raw time series. The developers and users of homogenisation methods have to bear in mind that the eventual purpose of homogenisation is not to find change-points, but to obtain observed time series whose statistical properties characterise well the climate change and climate variability.

  11. Liquid-chromatographic separation and on-line bioluminescence detection of creatine kinase isoenzymes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bostick, W.D.; Denton, M.S.; Dinsmore, S.R.

    1980-01-01

    Isoenzymes of creatine kinase were separated by anion-exchange chromatography, with use of an elution gradient containing lithium acetate (0.1 to 0.6 mol/L). A stream splitter was used to divert a 5% side stream of column effluent, which was subsequently mixed with the reagents necessary for bioluminescence assay of the separated isoenzymes. The use of the stream splitter greatly decreased the rate of consumption of reagent and, when combined with a peristaltic pumping system, permitted independent control of the side-stream flow rate, and thus of the residence interval in the delay coil in which the ATP reaction product is formed. The bioluminescence emission was monitored in a flow-through fluorometer without use of an external light source or filters. Separation and detection of the isoenzymes of creatine kinase were rapid, sensitive, and highly selective. The incremental decrease of bioluminescence response owing to inhibition by the ions in the eluent was less than 31% across the entire gradient.

  12. Improved salvage of complicated microvascular transplants monitored with quantitative fluorometry.

    PubMed

    Whitney, T M; Lineaweaver, W C; Billys, J B; Siko, P P; Buncke, G M; Alpert, B S; Oliva, A; Buncke, H J

    1992-07-01

    Quantitative fluorometry has been used to monitor circulation in transplanted toes and cutaneous flaps in our unit since 1982. Analysis of 177 uncomplicated transplants monitored by quantitative fluorometry shows that this technique has low false indication rates for arterial occlusion (0.6 percent of patients) and venous occlusion (6.2 percent of patients). None of these patients was reexplored because of a false monitor reading, and except for single abnormal sequences, monitoring appropriately indicated intact circulation throughout the postoperative period. Quantitative fluorometry has correctly indicated vascular complications in 21 (91.3 percent) of 23 transplants over an 8-year period. The salvage rate (85.7 percent) of the fluorescein-monitored reexplored transplants was significantly higher than the salvage rates of similar reexplored transplants not monitored with fluorescein and of reexplored muscle flaps (which cannot be monitored with the fluorometer used at this unit). These clinical data indicate that quantitative fluorometry is a valid and useful postoperative monitor for transplanted toes and cutaneous flaps.

  13. Results of two years of a mooring over a Posidonia Oceanica seagrass meadow (Corsica, France)

    NASA Astrophysics Data System (ADS)

    Champenois, W.; Delille, B.; Beckers, J.-M.; Grégoire, M.; Borges, A. V.

    2009-04-01

    We report the first two years of results from a 10 m deep mooring over a Posidonia oceanica seagrass meadow (Corsica, France), where we deployed, from August 2006 to August 2008, an array of 3 optodes, a fluorometer and a sensor for measuring the partial pressure of CO2 (pCO2). The oxygen data are used to compute, by mass balance, ecosystem metabolic rates (gross primary production, community respiration, net community production). The comparison with rates derived from discrete benthic incubations (every 2 months) is very satisfactory. The pCO2 data are used to assess whether the Posidonia oceanica seagrass meadow is a sink or a source of atmospheric CO2. An application of such a mooring is to detect changes in the productivity of the Posidonia meadow that can serve as indicators of overall ecosystem "health" or of degradation by human activities. Such a mooring can be used as an affordable and simple tool for the management and sustainable development of coastal areas in the Mediterranean.

  14. Fiber optic-based fluorescence detection system for in vivo studies of exogenous chromophore pharmacokinetics

    NASA Astrophysics Data System (ADS)

    Doiron, Daniel R.; Dunn, J. B.; Mitchell, W. L.; Dalton, Brian K.; Garbo, Greta M.; Warner, Jon A.

    1995-05-01

    The detection and quantification of the concentration of exogenous chromophores in vivo by their fluorescence is complicated by many physical and geometrical parameters. Measurement of such signals is advantageous in determining the pharmacokinetics of photosensitizers, such as those used in photodynamic therapy (PDT), or to assist in the diagnosis of tissue histological state. To overcome these difficulties, a ratio-based fiber optic contact fluorometer has been developed. This fluorescence detection system (FDS) uses the ratio of the fluorescence emission peak of the exogenous chromophore to that of endogenous chromophores, i.e. autofluorescence, to correct for a variety of parameters affecting the magnitude of the measured signals. By doing so it also minimizes the range of baseline measurements, prior to exogenous drug injection, for various tissue types. The design of the FDS and the results of its testing in animals and patients using the second-generation photosensitizer tin ethyl etiopurpurin (SnET2) are presented. These results support the feasibility and usefulness of the ratio FDS system.

  15. Laboratory testing protocol for the impact of dispersed petrochemicals on seagrass.

    PubMed

    Wilson, K G; Ralph, P J

    2012-11-01

    To improve the effectiveness of oil spill mitigation, we developed a rapid, logistically simple protocol to detect petrochemical stress on seagrass. Sections of leaf blades from Zostera muelleri subsp. capricorni were exposed to the water accommodated fraction (WAF) of non-dispersed and dispersed Tapis crude oil and fuel oil (IFO-380) for 5 h. Photosynthetic health was monitored by assessing changes in the effective quantum yield of photosystem II (ΔF/Fm′) and chlorophyll a pigment concentrations. Loss of total petroleum hydrocarbons (TPH) was measured using an oil-in-water fluorometer, whilst GC-MS analyses quantified the hydrocarbon components within each treatment. Few significant differences were detected in the chlorophyll a pigment analyses; however, ΔF/Fm′ appeared sensitive to petrochemical exposure. Dispersing both types of oil resulted in a substantial increase in the TPH of the WAF and was generally correlated with a greater physiological impact on seagrass health, compared with the oil alone. Copyright © 2012 Elsevier Ltd. All rights reserved.

  16. Potential use of ground-based sensor technologies for weed detection.

    PubMed

    Peteinatos, Gerassimos G; Weis, Martin; Andújar, Dionisio; Rueda Ayala, Victor; Gerhards, Roland

    2014-02-01

    Site-specific weed management is the part of precision agriculture (PA) that tries to effectively control weed infestations with the least economic and environmental burdens. This can be achieved with the aid of ground-based or near-range sensors in combination with decision rules and precise application technologies. Near-range sensor technologies, developed for mounting on a vehicle, have been emerging for PA applications during the last three decades. These technologies focus on identifying plants and measuring their physiological status with the aid of their spectral and morphological characteristics. Cameras, spectrometers, fluorometers and distance sensors are the most prominent sensors for PA applications. The objective of this article is to describe ground-based sensors that have the potential to be used for weed detection and measurement of weed infestation level. An overview of current sensor systems is presented, describing their concepts, results that have been achieved, commercial systems already in use, and problems that persist. A perspective for the development of these sensors is given. © 2013 Society of Chemical Industry.

  17. Pulsed laser fluorometry for environmental monitoring

    NASA Astrophysics Data System (ADS)

    Saunders, G. C.; Martin, J. C.; Jett, J. H.; Wilder, M. E.; Martinez, A.; Bentley, B. F.; Lopez, J.; Hutson, L.

    A compact pulsed laser fluorometer has been incorporated into a continuous flow system developed to detect acetylcholinesterase (AChE) inhibitors and/or primary amine compounds in air and water. A pulsed nitrogen laser pumped dye laser excites fluorescent reactants which flow continuously through a quartz flow cell. Data are collected, analyzed, and displayed using a Macintosh II personal computer. For detection of cholinesterase inhibitors, the fluorogenic substrate N-methylindoxyl acetate is used to monitor the activity of the immobilized enzyme. The presence of inhibitors results in a decrease of steady-state fluorescence. Detection of compounds containing primary amines is based on their reaction with fluorescamine to rapidly produce intensely fluorescent products. Compounds of interest to our research were amino acids, peptides, and proteins. An increase in steady-state fluorescence could be cause to evaluate the reasons for the change. The detection limit of the protein bovine serum albumin (BSA) in water is 10 ppT. Nebulized BSA concentrated by the LANL air sampler can be detected at sub-ppT original air concentration.

  18. A remote sensing laser fluorometer [for detecting oil, ligninsulfonates, and chlorophyll in water]

    NASA Technical Reports Server (NTRS)

    Oneill, R. A.; Davis, A. R.; Gross, H. G.; Kruus, J.

    1975-01-01

    A sensor is reported which is able to identify certain specific substances in water by means of their fluorescence spectra. In particular, the sensor detects oil, ligninsulfonates and chlorophyll. The device is able to measure the fluorescence spectra of water at ranges up to 75 m and to detect oil spills on water at altitudes up to 300 m. Blue light from a laser is used to excite the fluorescence of the target. Any light from the ambient background illumination, from the reflected laser light or from the induced fluorescence is gathered by a small telescope focused on the target. Optical filters are used to block the reflected laser light and to select the wavelengths of interest in the fluorescence spectrum of the target. The remaining light is detected with a photomultiplier tube. The amplitude of the laser induced fluorescence in the wavelength interval selected by the optical filters is displayed on a meter or strip chart recorder.

  19. Causal Inference and the Comparative Interrupted Time Series Design: Findings from Within-Study Comparisons

    ERIC Educational Resources Information Center

    St. Clair, Travis; Hallberg, Kelly; Cook, Thomas D.

    2014-01-01

    Researchers are increasingly using comparative interrupted time series (CITS) designs to estimate the effects of programs and policies when randomized controlled trials are not feasible. In a simple interrupted time series design, researchers compare the pre-treatment values of a treatment group time series to post-treatment values in order to…

  20. Time Series Model Identification and Prediction Variance Horizon.

    DTIC Science & Technology

    1980-06-01

    stationary time series Y(t). In terms of ρ(v), the definition of the three time series memory types is: No Memory, Σ_{v=1}^∞ |ρ(v)| = 0; Short Memory, Σ_{v=1}^∞ |ρ(v)| < ∞; Long Memory, Σ_{v=1}^∞ |ρ(v)| = ∞. Within short memory time series there are three types whose classification in terms of correlation functions is... Parzen, E. (1974) "Some Recent Advances in Time Series Modeling", IEEE Transactions on Automatic Control, Vol. AC-19, No. 6, December, 723-730. Parzen, E. (1976) "An...

  1. Extending nonlinear analysis to short ecological time series.

    PubMed

    Hsieh, Chih-hao; Anderson, Christian; Sugihara, George

    2008-01-01

    Nonlinearity is important and ubiquitous in ecology. Though detectable in principle, nonlinear behavior is often difficult to characterize, analyze, and incorporate mechanistically into models of ecosystem function. One obvious reason is that quantitative nonlinear analysis tools are data intensive (require long time series), and time series in ecology are generally short. Here we demonstrate a useful method that circumvents data limitation and reduces sampling error by combining ecologically similar multispecies time series into one long time series. With this technique, individual ecological time series containing as few as 20 data points can be mined for such important information as (1) significantly improved forecast ability, (2) the presence and location of nonlinearity, and (3) the effective dimensionality (the number of relevant variables) of an ecological system.
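    The forecasting improvement described here typically rests on delay-embedding, nearest-neighbour ("simplex projection") prediction. The sketch below is a minimal Python illustration of that idea, with the embedding dimension E and neighbour count k as user-chosen assumptions; it is not the authors' implementation.

```python
def delay_embed(x, E):
    """(delay vector ending at index i, successor x[i+1]) pairs."""
    return [
        (tuple(x[i - j] for j in range(E)), x[i + 1])
        for i in range(E - 1, len(x) - 1)
    ]

def simplex_forecast(x, E=3, k=4):
    """Forecast x[-1] from the rest of the series via the k nearest
    delay-vector neighbours (inverse-distance weighted average of the
    neighbours' successors)."""
    pairs = delay_embed(x, E)
    target = pairs[-1][0]      # vector ending at x[-2]
    library = pairs[:-1]       # exclude the point being predicted
    nearest = sorted(
        (sum((a - b) ** 2 for a, b in zip(v, target)) ** 0.5, nxt)
        for v, nxt in library
    )[:k]
    weights = [1.0 / (d + 1e-12) for d, _ in nearest]
    return sum(w * nxt for w, (_, nxt) in zip(weights, nearest)) / sum(weights)
```

Combining several short, ecologically similar series, as the abstract proposes, amounts to pooling their delay-vector libraries before the neighbour search.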

  2. Conventional and advanced time series estimation: application to the Australian and New Zealand Intensive Care Society (ANZICS) adult patient database, 1993-2006.

    PubMed

    Moran, John L; Solomon, Patricia J

    2011-02-01

    Time series analysis has seen limited application in the biomedical literature. The utility of conventional and advanced time series estimators was explored for intensive care unit (ICU) outcome series. Monthly mean time series, 1993-2006, for hospital mortality, severity-of-illness score (APACHE III), ventilation fraction and patient type (medical and surgical), were generated from the Australian and New Zealand Intensive Care Society adult patient database. Analyses encompassed geographical seasonal mortality patterns, structural time changes in the series, mortality series volatility using autoregressive moving average and Generalized Autoregressive Conditional Heteroscedasticity models in which predicted variances are updated adaptively, and bivariate and multivariate (vector error correction models) cointegrating relationships between series. The mortality series exhibited marked seasonality, a declining mortality trend and substantial autocorrelation beyond 24 lags. Mortality increased in winter months (July-August); the medical series featured annual cycling, whereas the surgical series demonstrated long and short (3-4 months) cycling. Structural breaks in the series were apparent in January 1995 and December 2002. The covariance-stationary first-differenced mortality series was consistent with a seasonal autoregressive moving average process; the observed conditional-variance volatility (1993-1995) and residual Autoregressive Conditional Heteroscedasticity effects entailed a Generalized Autoregressive Conditional Heteroscedasticity model, preferred by information criterion and mean model forecast performance. Bivariate cointegration, indicating long-term equilibrium relationships, was established between mortality and severity-of-illness scores at the database level and for categories of ICUs. Multivariate cointegration was demonstrated for {log APACHE III score, log ICU length of stay, ICU mortality and ventilation fraction}.
A system approach to understanding series time-dependence may be established using conventional and advanced econometric time series estimators. © 2010 Blackwell Publishing Ltd.

  3. Sea change: Charting the course for biogeochemical ocean time-series research in a new millennium

    NASA Astrophysics Data System (ADS)

    Church, Matthew J.; Lomas, Michael W.; Muller-Karger, Frank

    2013-09-01

    Ocean time-series provide vital information needed for assessing ecosystem change. This paper summarizes the historical context, major program objectives, and future research priorities for three contemporary ocean time-series programs: the Hawaii Ocean Time-series (HOT), the Bermuda Atlantic Time-series Study (BATS), and the CARIACO Ocean Time-Series. These three programs operate in physically and biogeochemically distinct regions of the world's oceans, with HOT and BATS located in the open-ocean waters of the subtropical North Pacific and North Atlantic, respectively, and CARIACO situated in the anoxic Cariaco Basin of the tropical Atlantic. All three programs sustain near-monthly shipboard occupations of their field sampling sites, with HOT and BATS beginning in 1988, and CARIACO initiated in 1996. The resulting data provide some of the only multi-disciplinary, decadal-scale determinations of time-varying ecosystem change in the global ocean. Facilitated by a scoping workshop (September 2010) sponsored by the Ocean Carbon Biogeochemistry (OCB) program, leaders of these time-series programs sought community input on existing program strengths and future research directions. Themes that emerged from these discussions included: 1. Shipboard time-series programs are key to informing our understanding of the connectivity between changes in ocean-climate and biogeochemistry. 2. The scientific and logistical support provided by shipboard time-series programs forms the backbone for numerous research and education programs; future studies should be encouraged that seek mechanistic understanding of the ecological interactions underlying the biogeochemical dynamics at these sites. 3. Detecting time-varying trends in ocean properties and processes requires consistent, high-quality measurements; time-series must carefully document analytical procedures and, where possible, trace the accuracy of analyses to certified standards and internal reference materials. 4. Leveraged implementation, testing, and validation of autonomous and remote observing technologies at time-series sites provide new insights into the spatiotemporal variability underlying ecosystem changes. 5. The value of existing time-series data for formulating and validating ecosystem models should be promoted. In summary, the scientific underpinnings of ocean time-series programs remain as strong and important today as when these programs were initiated. The emerging data inform our knowledge of the ocean's biogeochemistry and ecology, and improve our predictive capacity about planetary change.

  4. Characterization of time series via Rényi complexity-entropy curves

    NASA Astrophysics Data System (ADS)

    Jauregui, M.; Zunino, L.; Lenzi, E. K.; Mendes, R. S.; Ribeiro, H. V.

    2018-05-01

    One of the most useful tools for distinguishing between chaotic and stochastic time series is the so-called complexity-entropy causality plane. This diagram involves two complexity measures: the Shannon entropy and the statistical complexity. Recently, this idea has been generalized by considering the Tsallis monoparametric generalization of the Shannon entropy, yielding complexity-entropy curves. These curves have proven to enhance the discrimination among different time series related to stochastic and chaotic processes of numerical and experimental nature. Here we further explore these complexity-entropy curves in the context of the Rényi entropy, which is another monoparametric generalization of the Shannon entropy. By combining the Rényi entropy with the proper generalization of the statistical complexity, we associate a parametric curve (the Rényi complexity-entropy curve) with a given time series. We explore this approach in a series of numerical and experimental applications, demonstrating the usefulness of this new technique for time series analysis. We show that the Rényi complexity-entropy curves enable the differentiation among time series of chaotic, stochastic, and periodic nature. In particular, time series of stochastic nature are associated with curves displaying positive curvature in a neighborhood of their initial points, whereas curves related to chaotic phenomena have a negative curvature; finally, periodic time series are represented by vertical straight lines.
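    As an illustration of the ingredients involved, the sketch below computes the Bandt-Pompe ordinal-pattern distribution of a series and its normalized Rényi entropy H_q, i.e. the entropy half of the causality plane; the companion statistical-complexity measure is omitted for brevity, and the embedding dimension d = 3 is an illustrative choice rather than one taken from the paper.

```python
import math
from collections import Counter
from itertools import permutations

def ordinal_distribution(x, d=3):
    """Bandt-Pompe ordinal-pattern probabilities for embedding dimension d."""
    counts = Counter(
        tuple(sorted(range(d), key=lambda k: x[i + k]))  # rank pattern of window
        for i in range(len(x) - d + 1)
    )
    total = sum(counts.values())
    return [counts.get(p, 0) / total for p in permutations(range(d))]

def renyi_entropy(probs, q):
    """Renyi entropy of order q (q != 1), normalized by log(d!)."""
    nz = [p for p in probs if p > 0]
    h = math.log(sum(p ** q for p in nz)) / (1 - q)
    return h / math.log(len(probs))
```

Sweeping q then traces out the entropy coordinate of the Rényi complexity-entropy curve: stochastic series stay near maximal entropy for all q, while periodic series sit far below it.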

  5. Long-term memory and volatility clustering in high-frequency price changes

    NASA Astrophysics Data System (ADS)

    Oh, Gabjin; Kim, Seunghwan; Eom, Cheoljun

    2008-02-01

    We studied the long-term memory in diverse stock market indices and foreign exchange rates using Detrended Fluctuation Analysis (DFA). For all high-frequency market data studied, no significant long-term memory property was detected in the return series, while a strong long-term memory property was found in the volatility time series. The possible causes of the long-term memory property were investigated using the return data filtered by the AR(1) model, reflecting the short-term memory property, the GARCH(1,1) model, reflecting the volatility clustering property, and the FIGARCH model, reflecting the long-term memory property of the volatility time series. The memory effect in the AR(1) filtered return and volatility time series remained unchanged, while the long-term memory property diminished significantly in the volatility series of the GARCH(1,1) filtered data. Notably, there is no long-term memory property, when we eliminate the long-term memory property of volatility by the FIGARCH model. For all data used, although the Hurst exponents of the volatility time series changed considerably over time, those of the time series with the volatility clustering effect removed diminish significantly. Our results imply that the long-term memory property of the volatility time series can be attributed to the volatility clustering observed in the financial time series.
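    Detrended Fluctuation Analysis as used here proceeds by integrating the demeaned series, detrending it in windows of increasing size, and reading the scaling exponent off a log-log fit of fluctuation versus window size. A textbook-style Python sketch (the window scales are an illustrative choice, not those of the paper):

```python
import math

def _linfit(x, y):
    """Least-squares (slope, intercept) for paired samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    return b, my - b * mx

def dfa_exponent(series, scales=(8, 16, 32, 64, 128)):
    """DFA scaling exponent alpha of a scalar series."""
    mean = sum(series) / len(series)
    profile, s = [], 0.0
    for v in series:                       # integrated, demeaned profile
        s += v - mean
        profile.append(s)
    logn, logf = [], []
    for n in scales:
        rss, count = 0.0, 0
        for start in range(0, len(profile) - n + 1, n):
            seg = profile[start:start + n]
            b, a = _linfit(list(range(n)), seg)   # linear detrend per window
            rss += sum((seg[i] - (a + b * i)) ** 2 for i in range(n))
            count += n
        logn.append(math.log(n))
        logf.append(math.log(math.sqrt(rss / count)))
    alpha, _ = _linfit(logn, logf)          # slope of log F(n) vs log n
    return alpha
```

Here α ≈ 0.5 indicates no long-term memory and α > 0.5 persistence; the abstract's finding corresponds to α ≈ 0.5 for return series but α well above 0.5 for volatility series.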

  6. Modeling seasonal variation of hip fracture in Montreal, Canada.

    PubMed

    Modarres, Reza; Ouarda, Taha B M J; Vanasse, Alain; Orzanco, Maria Gabriela; Gosselin, Pierre

    2012-04-01

    The investigation of the association of climate variables with hip fracture incidence is important for public health. This study examined and modeled the seasonal variation of a monthly population-based hip fracture rate (HFr) time series. The seasonal ARIMA time series modeling approach is used to model monthly HFr incidence time series of female and male patients aged 40-74 and 75+ in Montreal, Québec province, Canada, for the period 1993-2004. The correlation coefficients between meteorological variables, such as temperature, snow depth, rainfall depth and day length, and HFr are significant. The nonparametric Mann-Kendall test for trend assessment and the nonparametric Levene's test and Wilcoxon's test for checking the difference of HFr before and after a change point are also used. The seasonality in HFr indicated a sharp difference between winter and summer. The trend assessment showed decreasing trends in HFr for both female and male groups. The nonparametric tests also indicated a significant change in the mean HFr. A seasonal ARIMA model was applied to HFr time series without trend, and a time trend ARIMA model (TT-ARIMA) was developed and fitted to HFr time series with a significant trend. The multi-criteria evaluation showed the adequacy of the SARIMA and TT-ARIMA models for modeling seasonal hip fracture time series with and without significant trend. In the time series analysis of HFr in the Montreal region, the effects of the seasonal variation of climate variables on hip fracture are clear. The seasonal ARIMA model is useful for modeling HFr time series without trend; for time series with a significant trend, the TT-ARIMA model should be applied. Copyright © 2011 Elsevier Inc. All rights reserved.
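    The Mann-Kendall trend test mentioned above reduces to counting concordant and discordant pairs. A minimal Python version using the normal approximation without tie correction follows (a simplification; the study's exact variant is not stated):

```python
import math

def mann_kendall_z(x):
    """Mann-Kendall trend statistic with normal approximation and no tie
    correction: positive z indicates an increasing trend, |z| > 1.96
    significance at the 5% level."""
    n = len(x)
    s = sum(
        (x[j] > x[i]) - (x[j] < x[i])   # +1 concordant, -1 discordant
        for i in range(n - 1) for j in range(i + 1, n)
    )
    var = n * (n - 1) * (2 * n + 5) / 18
    if s > 0:
        return (s - 1) / math.sqrt(var)
    if s < 0:
        return (s + 1) / math.sqrt(var)
    return 0.0
```

For monthly rate series with ties and seasonality, production use would add the tie correction and a seasonal (per-month) variant of the test.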

  7. Smoothing of climate time series revisited

    NASA Astrophysics Data System (ADS)

    Mann, Michael E.

    2008-08-01

    We present an easily implemented method for smoothing climate time series, generalizing upon an approach previously described by Mann (2004). The method adaptively weights the three lowest order time series boundary constraints to optimize the fit with the raw time series. We apply the method to the instrumental global mean temperature series from 1850-2007 and to various surrogate global mean temperature series from 1850-2100 derived from the CMIP3 multimodel intercomparison project. These applications demonstrate that the adaptive method systematically out-performs certain widely used default smoothing methods, and is more likely to yield accurate assessments of long-term warming trends.

  8. Forecasting and analyzing high O3 time series in educational area through an improved chaotic approach

    NASA Astrophysics Data System (ADS)

    Hamid, Nor Zila Abd; Adenan, Nur Hamiza; Noorani, Mohd Salmi Md

    2017-08-01

    Forecasting and analyzing the ozone (O3) concentration time series is important because the pollutant is harmful to health. This study is a pilot study for forecasting and analyzing the O3 time series in a Malaysian educational area, namely Shah Alam, using a chaotic approach. Through this approach, the observed hourly scalar time series is reconstructed into a multi-dimensional phase space, which is then used to forecast the future time series through the local linear approximation method. The main purpose is to forecast high O3 concentrations. The original method performed poorly, but the improved method addressed this weakness, enabling the high concentrations to be successfully forecast. The correlation coefficient between the observed and forecasted time series through the improved method is 0.9159, and both the mean absolute error and root mean squared error are low. Thus, the improved method is advantageous. The time series analysis by means of the phase space plot and the Cao method identified the presence of low-dimensional chaotic dynamics in the observed O3 time series. Results showed that at least seven factors affect the studied O3 time series, which is consistent with the factors listed from the diurnal variations investigation and the sensitivity analysis of past studies. In conclusion, the chaotic approach has successfully forecast and analyzed the O3 time series in the educational area of Shah Alam. These findings are expected to help stakeholders such as the Ministry of Education and the Department of Environment achieve better air pollution management.

  9. Transformation-cost time-series method for analyzing irregularly sampled data

    NASA Astrophysics Data System (ADS)

    Ozken, Ibrahim; Eroglu, Deniz; Stemler, Thomas; Marwan, Norbert; Bagci, G. Baris; Kurths, Jürgen

    2015-06-01

    Irregular sampling of data sets is one of the challenges often encountered in time-series analysis, since traditional methods cannot be applied and the frequently used interpolation approach can corrupt the data and bias the subsequence analysis. Here we present the TrAnsformation-Cost Time-Series (TACTS) method, which allows us to analyze irregularly sampled data sets without degenerating the quality of the data set. Instead of using interpolation we consider time-series segments and determine how close they are to each other by determining the cost needed to transform one segment into the following one. Using a limited set of operations—with associated costs—to transform the time series segments, we determine a new time series, that is our transformation-cost time series. This cost time series is regularly sampled and can be analyzed using standard methods. While our main interest is the analysis of paleoclimate data, we develop our method using numerical examples like the logistic map and the Rössler oscillator. The numerical data allows us to test the stability of our method against noise and for different irregular samplings. In addition we provide guidance on how to choose the associated costs based on the time series at hand. The usefulness of the TACTS method is demonstrated using speleothem data from the Secret Cave in Borneo that is a good proxy for paleoclimatic variability in the monsoon activity around the maritime continent.

  10. Transformation-cost time-series method for analyzing irregularly sampled data.

    PubMed

    Ozken, Ibrahim; Eroglu, Deniz; Stemler, Thomas; Marwan, Norbert; Bagci, G Baris; Kurths, Jürgen

    2015-06-01

    Irregular sampling of data sets is one of the challenges often encountered in time-series analysis, since traditional methods cannot be applied and the frequently used interpolation approach can corrupt the data and bias the subsequence analysis. Here we present the TrAnsformation-Cost Time-Series (TACTS) method, which allows us to analyze irregularly sampled data sets without degenerating the quality of the data set. Instead of using interpolation we consider time-series segments and determine how close they are to each other by determining the cost needed to transform one segment into the following one. Using a limited set of operations-with associated costs-to transform the time series segments, we determine a new time series, that is our transformation-cost time series. This cost time series is regularly sampled and can be analyzed using standard methods. While our main interest is the analysis of paleoclimate data, we develop our method using numerical examples like the logistic map and the Rössler oscillator. The numerical data allows us to test the stability of our method against noise and for different irregular samplings. In addition we provide guidance on how to choose the associated costs based on the time series at hand. The usefulness of the TACTS method is demonstrated using speleothem data from the Secret Cave in Borneo that is a good proxy for paleoclimatic variability in the monsoon activity around the maritime continent.
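    A heavily simplified sketch of the transformation-cost idea: bin the irregular events into fixed windows, then charge a shift cost for matched points and a fixed cost for points that must be added or deleted. The window width and unit deletion cost below are illustrative assumptions; the actual TACTS cost operations and their calibration are more refined.

```python
def tacts_cost_series(events, window):
    """events: time-sorted list of (t, value) pairs, possibly irregular in t.
    Returns a regularly indexed cost series: the cost of transforming each
    window of events into the next one (simplified greedy matching)."""
    t0 = events[0][0]
    bins = {}
    for t, v in events:                     # bin events into windows
        bins.setdefault(int((t - t0) // window), []).append((t - t0, v))
    keys = sorted(bins)
    costs = []
    for a, b in zip(keys, keys[1:]):
        seg1 = [(t - a * window, v) for t, v in bins[a]]   # align to window start
        seg2 = [(t - b * window, v) for t, v in bins[b]]
        m = min(len(seg1), len(seg2))
        # shift cost (time and amplitude) for points matched in order...
        c = sum(abs(t1 - t2) + abs(v1 - v2)
                for (t1, v1), (t2, v2) in zip(seg1[:m], seg2[:m]))
        # ...plus a fixed cost per unmatched (added/deleted) point
        c += 1.0 * (max(len(seg1), len(seg2)) - m)
        costs.append(c)
    return costs
```

The resulting cost sequence is regularly sampled and can be handed to standard time-series methods, which is the point of the construction.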

  11. a Method of Time-Series Change Detection Using Full Polsar Images from Different Sensors

    NASA Astrophysics Data System (ADS)

    Liu, W.; Yang, J.; Zhao, J.; Shi, H.; Yang, L.

    2018-04-01

    Most existing change detection methods using full polarimetric synthetic aperture radar (PolSAR) are limited to detecting change between two points in time. In this paper, a novel method is proposed to detect change based on time-series data from different sensors. First, the overall difference image of a PolSAR time series was calculated by an omnibus statistic test. Second, difference images between any two images at different times were acquired by the Rj statistic test. In the last step, a generalized Gaussian mixture model (GGMM) was used to obtain the time-series change detection maps. To verify the effectiveness of the proposed method, we carried out a change detection experiment using time-series PolSAR images acquired by Radarsat-2 and Gaofen-3 over the city of Wuhan, China. Results show that the proposed method can detect time-series change from different sensors.

  12. The method of trend analysis of parameters time series of gas-turbine engine state

    NASA Astrophysics Data System (ADS)

    Hvozdeva, I.; Myrhorod, V.; Derenh, Y.

    2017-10-01

    This research substantiates an approach to the interval estimation of the trend component of a time series. Well-known methods of spectral and trend analysis are used for multidimensional data arrays. The interval estimation of the trend component is proposed for time series whose autocorrelation matrix possesses a prevailing eigenvalue. The properties of the time series autocorrelation matrix are identified.

  13. Network structure of multivariate time series.

    PubMed

    Lacasa, Lucas; Nicosia, Vincenzo; Latora, Vito

    2015-10-21

    Our understanding of a variety of phenomena in physics, biology and economics crucially depends on the analysis of multivariate time series. While a wide range of tools and techniques for time series analysis already exists, the increasing availability of massive data structures calls for new approaches for multidimensional signal processing. We present here a non-parametric method to analyse multivariate time series, based on the mapping of a multidimensional time series into a multilayer network, which allows information on a high-dimensional dynamical system to be extracted through the analysis of the structure of the associated multiplex network. The method is simple to implement, general, scalable, does not require ad hoc phase space partitioning, and is thus suitable for the analysis of large, heterogeneous and non-stationary time series. We show that simple structural descriptors of the associated multiplex networks allow us to extract and quantify nontrivial properties of coupled chaotic maps, including the transition between different dynamical phases and the onset of various types of synchronization. As a concrete example we then study financial time series, showing that a multiplex network analysis can efficiently discriminate crises from periods of financial stability, where standard methods based on time-series symbolization often fail.
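    One widely used series-to-graph mapping in this family is the horizontal visibility graph; the sketch below builds one such layer per component of a multivariate series. This is an illustrative stand-in: the paper's multiplex construction may differ in detail.

```python
def horizontal_visibility_edges(x):
    """Edges (i, j), i < j, of the horizontal visibility graph of x:
    two time points are linked if every value strictly between them
    lies below min(x[i], x[j])."""
    return [
        (i, j)
        for i in range(len(x) - 1)
        for j in range(i + 1, len(x))
        if j == i + 1 or all(x[k] < min(x[i], x[j]) for k in range(i + 1, j))
    ]

def multiplex_layers(series_by_variable):
    """One visibility-graph layer per component of a multivariate series;
    node i in every layer corresponds to the same time index."""
    return {name: horizontal_visibility_edges(x)
            for name, x in series_by_variable.items()}
```

Structural descriptors (degree sequences, inter-layer overlap) computed on these layers then play the role of the multiplex features used to characterize the dynamics.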

  14. Homogenising time series: beliefs, dogmas and facts

    NASA Astrophysics Data System (ADS)

    Domonkos, P.

    2011-06-01

    In recent decades various homogenisation methods have been developed, but the real effects of their application on time series are still not sufficiently known. The ongoing COST Action HOME (COST ES0601) is devoted to revealing the real impacts of homogenisation methods in more detail and with higher confidence than before. As part of the COST activity, a benchmark dataset was built whose characteristics closely approach those of real networks of observed time series. This dataset offers a much better opportunity than ever before to test the wide variety of homogenisation methods and to analyse the real effects of selected theoretical recommendations. Empirical results show that real observed time series usually include several inhomogeneities of different sizes. Small inhomogeneities often have statistical characteristics similar to those of natural changes caused by climatic variability, thus the pure application of the classic theory that change-points of observed time series can be found and corrected one by one is impossible. However, after homogenisation the linear trends, seasonal changes and long-term fluctuations of time series are usually much closer to reality than in raw time series. Some problems around detecting multiple structures of inhomogeneities, as well as problems of time series comparison within homogenisation procedures, are discussed briefly in the study.

  15. The Timeseries Toolbox - A Web Application to Enable Accessible, Reproducible Time Series Analysis

    NASA Astrophysics Data System (ADS)

    Veatch, W.; Friedman, D.; Baker, B.; Mueller, C.

    2017-12-01

    The vast majority of data analyzed by climate researchers are repeated observations of physical processes, i.e., time series data. Such data lend themselves to a common set of statistical techniques and models designed to determine trends and variability (e.g., seasonality) of these repeated observations. Often, these same techniques and models can be applied to a wide variety of different time series data. The Timeseries Toolbox is a web application designed to standardize and streamline these common approaches to time series analysis and modeling, with particular attention to hydrologic time series used in climate preparedness and resilience planning and design by the U.S. Army Corps of Engineers. The application performs much of the pre-processing of time series data necessary for more complex techniques (e.g. interpolation, aggregation). With this tool, users can upload any dataset that conforms to a standard template and immediately begin applying these techniques to analyze their time series data.
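
    The two pre-processing steps named in the abstract, interpolation and aggregation, can be sketched in a few lines. This is an illustrative stand-in, not the Toolbox's actual code, and the helper names are hypothetical.

```python
def interpolate_gaps(values):
    """Linearly fill internal None gaps in a regularly sampled series."""
    out = list(values)
    known = [i for i, v in enumerate(out) if v is not None]
    for a, b in zip(known, known[1:]):
        for i in range(a + 1, b):
            frac = (i - a) / (b - a)
            out[i] = out[a] + frac * (out[b] - out[a])
    return out

def aggregate(values, width):
    """Non-overlapping block means (e.g. daily -> weekly with width=7)."""
    return [sum(values[i:i + width]) / width
            for i in range(0, len(values) - width + 1, width)]
```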

  16. Fuzzy time-series based on Fibonacci sequence for stock price forecasting

    NASA Astrophysics Data System (ADS)

    Chen, Tai-Liang; Cheng, Ching-Hsue; Jong Teoh, Hia

    2007-07-01

    Time-series models have been utilized to make reasonably accurate predictions in the areas of stock price movements, academic enrollments, weather, etc. For promoting the forecasting performance of fuzzy time-series models, this paper proposes a new model, which incorporates the concept of the Fibonacci sequence, the framework of Song and Chissom's model and the weighted method of Yu's model. This paper employs a 5-year period TSMC (Taiwan Semiconductor Manufacturing Company) stock price data and a 13-year period of TAIEX (Taiwan Stock Exchange Capitalization Weighted Stock Index) stock index data as experimental datasets. By comparing our forecasting performances with Chen's (Forecasting enrollments based on fuzzy time-series. Fuzzy Sets Syst. 81 (1996) 311-319), Yu's (Weighted fuzzy time-series models for TAIEX forecasting. Physica A 349 (2004) 609-624) and Huarng's (The application of neural networks to forecast fuzzy time series. Physica A 336 (2006) 481-491) models, we conclude that the proposed model surpasses in accuracy these conventional fuzzy time-series models.

  17. Multivariate Time Series Decomposition into Oscillation Components.

    PubMed

    Matsuda, Takeru; Komaki, Fumiyasu

    2017-08-01

    Many time series are considered to be a superposition of several oscillation components. We have proposed a method for decomposing univariate time series into oscillation components and estimating their phases (Matsuda & Komaki, 2017). In this study, we extend that method to multivariate time series. We assume that several oscillators underlie the given multivariate time series and that each variable corresponds to a superposition of the projections of the oscillators. Thus, the oscillators superpose on each variable with amplitude and phase modulation. Based on this idea, we develop Gaussian linear state-space models and use them to decompose the given multivariate time series. The model parameters are estimated from data using the empirical Bayes method, and the number of oscillators is determined using the Akaike information criterion. Therefore, the proposed method extracts underlying oscillators in a data-driven manner and enables investigation of phase dynamics in a given multivariate time series. Numerical results show the effectiveness of the proposed method. From monthly mean north-south sunspot number data, the proposed method reveals an interesting phase relationship.

  18. Time-series modeling of long-term weight self-monitoring data.

    PubMed

    Helander, Elina; Pavel, Misha; Jimison, Holly; Korhonen, Ilkka

    2015-08-01

    Long-term self-monitoring of weight is beneficial for weight maintenance, especially after weight loss. Connected weight scales accumulate time series information over the long term and hence enable time series analysis of the data. The analysis can reveal individual patterns, provide more sensitive detection of significant weight trends, and enable more accurate and timely prediction of weight outcomes. However, long-term self-weighing data pose several challenges that complicate the analysis; in particular, irregular sampling, missing data, and periodic (e.g. diurnal and weekly) patterns are common. In this study, we apply a time series modeling approach to daily weight time series from two individuals and describe the information that can be extracted from this kind of data. We study the properties of weight time series data, missing data and its link to individuals' behavior, periodic patterns, and weight series segmentation. Understanding behavior through weight data and giving relevant feedback is expected to lead to positive intervention on health behaviors.

  19. FALSE DETERMINATIONS OF CHAOS IN SHORT NOISY TIME SERIES. (R828745)

    EPA Science Inventory

    A method (NEMG) proposed in 1992 for diagnosing chaos in noisy time series with 50 or fewer observations entails fitting the time series with an empirical function which predicts an observation in the series from previous observations, and then estimating the rate of divergenc...

  20. New Results in Magnitude and Sign Correlations in Heartbeat Fluctuations for Healthy Persons and Congestive Heart Failure (CHF) Patients

    NASA Astrophysics Data System (ADS)

    Diosdado, A. Muñoz; Cruz, H. Reyes; Hernández, D. Bueno; Coyt, G. Gálvez; González, J. Arellanes

    2008-08-01

    Heartbeat fluctuations exhibit temporal structure with fractal and nonlinear features that reflect changes in neuroautonomic control. In this work we have used detrended fluctuation analysis (DFA) to analyze 24-hour heartbeat (RR) interval series of 54 healthy subjects and 40 patients with congestive heart failure, separating the time series into sleep and wake phases. We observe long-range correlations in the time series of both healthy persons and CHF patients. However, the correlations for CHF patients are weaker than those for healthy persons; this fact was reported by Ashkenazy et al. [1], but for a smaller group of subjects. In the time series of CHF patients there is a crossover, meaning that the correlations at high and low frequencies differ, whereas in the time series of healthy persons there are no crossovers, even during sleep. These crossovers are more pronounced for CHF patients in the sleep phase. We decompose the heartbeat interval time series into magnitude and sign series; such signals can exhibit different temporal organization for the magnitude and the sign, with the magnitude series relating to nonlinear properties of the original time series and the sign series relating to its linear properties. Magnitude series are long-range correlated, while sign series are anticorrelated. Again, the correlations for healthy persons differ from those for CHF patients, for both the magnitude and the sign time series. Ashkenazy et al. proposed the empirical relation αsign ≈ 1/2(αoriginal + αmagnitude) for the short-range regime (high frequencies); however, we have found a different relation that in our calculations is valid for both the short- and long-range regimes: αsign ≈ 1/4(αoriginal + αmagnitude).
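
    The two ingredients used in this record, DFA and the magnitude/sign decomposition, can be sketched compactly. This is a minimal first-order DFA, assuming an evenly indexed RR series; the function names are illustrative.

```python
import numpy as np

def dfa_fluctuation(x, n):
    """RMS fluctuation F(n) of the linearly detrended integrated
    profile, computed in non-overlapping windows of size n."""
    y = np.cumsum(x - np.mean(x))              # integrated profile
    t = np.arange(n)
    f2 = []
    for k in range(len(y) // n):
        seg = y[k * n:(k + 1) * n]
        trend = np.polyval(np.polyfit(t, seg, 1), t)
        f2.append(np.mean((seg - trend) ** 2))
    return np.sqrt(np.mean(f2))

def dfa_alpha(x, sizes=(4, 8, 16, 32)):
    """Scaling exponent alpha: slope of log F(n) versus log n."""
    logf = [np.log(dfa_fluctuation(x, s)) for s in sizes]
    return np.polyfit(np.log(sizes), logf, 1)[0]

def magnitude_sign(x):
    """Split the increment series into its magnitude and sign parts."""
    d = np.diff(x)
    return np.abs(d), np.sign(d)
```

    For uncorrelated noise, alpha is close to 0.5; long-range correlated series yield alpha > 0.5, and the exponents of the magnitude and sign series are obtained by applying `dfa_alpha` to the two outputs of `magnitude_sign`.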

  1. An evaluation of the accuracy of modeled and computed streamflow time-series data for the Ohio River at Hannibal Lock and Dam and at a location upstream from Sardis, Ohio

    USGS Publications Warehouse

    Koltun, G.F.

    2015-01-01

    Streamflow hydrographs were plotted for modeled/computed time series for the Ohio River near the USGS Sardis gage and the Ohio River at the Hannibal Lock and Dam. In general, the time series at these two locations compared well. Notable differences include the exclusive presence of short periods of negative streamflows in the USGS 15-minute time-series data for the gage on the Ohio River above Sardis, Ohio, and the occurrence of several peak streamflows in the USACE gate/hydropower time series for the Hannibal Lock and Dam that were appreciably larger than corresponding peaks in the other time series, including those modeled/computed for the downstream Sardis gage.

  2. Using wavelet-feedforward neural networks to improve air pollution forecasting in urban environments.

    PubMed

    Dunea, Daniel; Pohoata, Alin; Iordache, Stefania

    2015-07-01

    The paper presents the screening of various feedforward neural networks (FANN) and wavelet-feedforward neural networks (WFANN) applied to time series of ground-level ozone (O3), nitrogen dioxide (NO2), and particulate matter (PM10 and PM2.5 fractions) recorded at four monitoring stations located in various urban areas of Romania, to identify common configurations with optimal generalization performance. Two distinct model runs were performed as follows: data processing using hourly-recorded time series of airborne pollutants during cold months (O3, NO2, and PM10), when residential heating increases the local emissions, and data processing using 24-h daily averaged concentrations (PM2.5) recorded between 2009 and 2012. Dataset variability was assessed using statistical analysis. Time series were passed through various FANNs. Each time series was also decomposed into four time-scale components using three-level wavelets, each component was passed through a FANN, and the outputs were recomposed into a single time series. The agreement between observed and modelled output was evaluated based on statistical significance (r coefficient and correlation between errors and data). Daubechies db3 wavelet-Rprop FANN (6-4-1) utilization gave positive results for O3 time series, improving on the exclusive use of the FANN for hourly-recorded time series. NO2 was difficult to model due to time series specificity, but wavelet integration improved FANN performance. The Daubechies db3 wavelet did not improve the FANN outputs for PM10 time series. Both models (FANN/WFANN) overestimated PM2.5 forecasted values in the last quarter of the time series. A potential improvement of the forecasted values could be the integration of a smoothing algorithm to adjust the PM2.5 model outputs.
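
    The multilevel wavelet split used as pre-processing here can be illustrated with the simplest orthonormal wavelet. The paper uses Daubechies db3 (typically via a wavelet toolbox); the Haar transform below is a hedged stand-in showing the same approximation-plus-details decomposition, with hypothetical function names.

```python
def haar_step(x):
    """One level of the orthonormal Haar transform:
    pairwise averages (approximation) and differences (detail)."""
    a = [(x[2 * i] + x[2 * i + 1]) / 2 ** 0.5 for i in range(len(x) // 2)]
    d = [(x[2 * i] - x[2 * i + 1]) / 2 ** 0.5 for i in range(len(x) // 2)]
    return a, d

def haar_decompose(x, levels=3):
    """Three-level split into one approximation and three detail series,
    i.e. four time-scale components as in the abstract."""
    details = []
    a = list(x)
    for _ in range(levels):
        a, d = haar_step(a)
        details.append(d)
    return a, details
```

    Each component can then be fed to its own feedforward network and the forecasts recombined. Being orthonormal, the transform preserves the signal's energy.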

  3. A novel encoding Lempel-Ziv complexity algorithm for quantifying the irregularity of physiological time series.

    PubMed

    Zhang, Yatao; Wei, Shoushui; Liu, Hai; Zhao, Lina; Liu, Chengyu

    2016-09-01

    The Lempel-Ziv (LZ) complexity and its variants have been extensively used to analyze the irregularity of physiological time series. To date, these measures cannot explicitly discern between the irregularity and the chaotic characteristics of physiological time series. Our study compared the performance of an encoding LZ (ELZ) complexity algorithm, a novel variant of the LZ complexity algorithm, with those of the classic LZ (CLZ) and multistate LZ (MLZ) complexity algorithms. Simulation experiments on Gaussian noise, logistic chaotic, and periodic time series showed that only the ELZ algorithm monotonically declined with the reduction in irregularity in time series, whereas the CLZ and MLZ approaches yielded overlapped values for chaotic time series and time series mixed with Gaussian noise, demonstrating the accuracy of the proposed ELZ algorithm in capturing the irregularity, rather than the complexity, of physiological time series. In addition, the effect of sequence length on the ELZ algorithm was more stable compared with those on CLZ and MLZ, especially when the sequence length was longer than 300. A sensitivity analysis for all three LZ algorithms revealed that both the MLZ and the ELZ algorithms could respond to the change in time sequences, whereas the CLZ approach could not. Cardiac interbeat (RR) interval time series from the MIT-BIH database were also evaluated, and the results showed that the ELZ algorithm could accurately measure the inherent irregularity of the RR interval time series, as indicated by lower LZ values yielded from a congestive heart failure group versus those yielded from a normal sinus rhythm group (p < 0.01). Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
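
    The classic LZ (CLZ) baseline compared against here typically binarizes the series around its median and counts Lempel-Ziv (1976) phrases. The sketch below shows that baseline, not the authors' ELZ encoding; the names are illustrative.

```python
def lz_complexity(s):
    """Lempel-Ziv (1976) complexity: number of distinct phrases found
    while scanning the symbol string left to right. A phrase grows
    until it is no longer a substring of the preceding text."""
    i, c = 0, 0
    n = len(s)
    while i < n:
        l = 1
        while i + l <= n and s[i:i + l] in s[:i + l - 1]:
            l += 1
        c += 1
        i += l
    return c

def binarize(x):
    """Coarse-grain a numeric series around its median, CLZ-style."""
    med = sorted(x)[len(x) // 2]
    return ''.join('1' if v > med else '0' for v in x)
```

    Constant and periodic strings yield small complexity values, which is why a complexity measure alone cannot separate irregularity from deterministic chaos, as the abstract notes.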

  4. Time averaging, ageing and delay analysis of financial time series

    NASA Astrophysics Data System (ADS)

    Cherstvy, Andrey G.; Vinod, Deepak; Aghion, Erez; Chechkin, Aleksei V.; Metzler, Ralf

    2017-06-01

    We introduce three strategies for the analysis of financial time series based on time averaged observables. These comprise the time averaged mean squared displacement (MSD) as well as the ageing and delay time methods for varying fractions of the financial time series. We explore these concepts via statistical analysis of historic time series for several Dow Jones Industrial indices for the period from the 1960s to 2015. Remarkably, we discover a simple universal law for the delay time averaged MSD. The observed features of the financial time series dynamics agree well with our analytical results for the time averaged measurables for geometric Brownian motion, underlying the famed Black-Scholes-Merton model. The concepts we promote here are shown to be useful for financial data analysis and enable one to unveil new universal features of stock market dynamics.
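
    The time averaged MSD has a compact definition: for a lag Δ, average the squared increments over the whole record. The sketch below, with hypothetical names, also includes a naive "ageing" variant that simply discards the first points of the record; the paper's actual ageing and delay observables are more elaborate.

```python
def time_averaged_msd(x, lag):
    """Time averaged mean squared displacement at a given lag:
    the mean of (x[t+lag] - x[t])**2 over the whole record."""
    n = len(x) - lag
    return sum((x[t + lag] - x[t]) ** 2 for t in range(n)) / n

def aged_tamsd(x, lag, age):
    """Naive ageing: evaluate the TAMSD after discarding the first
    `age` observations of the record."""
    return time_averaged_msd(x[age:], lag)
```

    For financial series one would typically apply this to log-prices, in the spirit of the geometric Brownian motion benchmark underlying the Black-Scholes-Merton model mentioned in the abstract.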

  5. Modified DTW for a quantitative estimation of the similarity between rainfall time series

    NASA Astrophysics Data System (ADS)

    Djallel Dilmi, Mohamed; Barthès, Laurent; Mallet, Cécile; Chazottes, Aymeric

    2017-04-01

    Precipitation is due to complex meteorological phenomena and can be described as an intermittent process. The spatial and temporal variability of this phenomenon is significant and covers large scales. To analyze and model this variability and/or structure, several studies use a network of rain gauges providing several time series of precipitation measurements. To compare these different time series, the authors compute for each time series some parameters (PDF, rain peak intensity, occurrence, amount, duration, intensity …). However, despite the calculation of these parameters, the comparison between two series of measurements remains qualitative. Due to advection processes, when different sensors of an observation network measure precipitation time series that are identical in terms of intermittency or intensity, there is a time lag between the measured series. Analyzing and extracting relevant information on physical phenomena from these precipitation time series implies the development of automatic analytical methods capable of comparing two time series of precipitation measured by different sensors or at two different locations, and thus quantifying their difference/similarity. The limits of the Euclidean distance for measuring the similarity between precipitation time series have been well demonstrated and explained (e.g. the Euclidean distance is very sensitive to phase-shift effects: between two identical but slightly shifted time series, this distance is not negligible). To quantify and analyze these time lags, correlation functions are well established, normalized and commonly used to measure the spatial dependences required by many applications. However, authors generally observe considerable scatter in the inter-rain-gauge correlation coefficients obtained from individual pairs of rain gauges. 
    Because of this substantial dispersion of estimated time lags, the interpretation of the inter-correlation is not straightforward. We propose here to use an improvement of the Euclidean distance which integrates the global complexity of the rainfall series. Dynamic Time Warping (DTW), used in speech recognition, allows matching two time series that differ in timing and provides the most probable time lag. However, the original formulation of DTW suffers from some limitations; in particular, it is not suited to rain intermittency. In this study we present an adaptation of DTW for the analysis of rainfall time series: we used time series from the "Météo France" rain gauge network observed between January 1st, 2007 and December 31st, 2015 at 25 stations located in the Île-de-France area. We then analyze the results (e.g. the distance, and the relationship between the time lag detected by our method and other measured parameters such as wind speed and direction) to show the ability of the proposed similarity measure to provide useful information on the rain structure. The possibility of using this similarity measure to define a quality indicator for a sensor integrated into an observation network is also envisaged.
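
    The classic dynamic-programming formulation of DTW, which the authors start from before adapting it to rain intermittency, can be sketched as follows (the rain-specific modifications are not shown).

```python
import math

def dtw_distance(a, b):
    """Classic DTW: minimal cumulative |a_i - b_j| cost over all
    monotone alignments of the two sequences."""
    n, m = len(a), len(b)
    cost = [[math.inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],       # stretch a
                                 cost[i][j - 1],       # stretch b
                                 cost[i - 1][j - 1])   # match
    return cost[n][m]
```

    Unlike the Euclidean distance, DTW assigns zero distance to two identical but time-shifted pulses, which is exactly the phase-shift sensitivity issue raised in the abstract.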

  6. Daily rainfall forecasting for one year in a single run using Singular Spectrum Analysis

    NASA Astrophysics Data System (ADS)

    Unnikrishnan, Poornima; Jothiprakash, V.

    2018-06-01

    Effective modelling and prediction of smaller time step rainfall is reported to be very difficult owing to its highly erratic nature. Accurate forecast of daily rainfall for longer duration (multi time step) may be exceptionally helpful in the efficient planning and management of water resources systems. Identification of inherent patterns in a rainfall time series is also important for an effective water resources planning and management system. In the present study, Singular Spectrum Analysis (SSA) is utilized to forecast the daily rainfall time series pertaining to Koyna watershed in Maharashtra, India, for 365 days after extracting various components of the rainfall time series such as trend, periodic component, noise and cyclic component. In order to forecast the time series for longer time step (365 days-one window length), the signal and noise components of the time series are forecasted separately and then added together. The results of the study show that the method of SSA could extract the various components of the time series effectively and could also forecast the daily rainfall time series for longer duration such as one year in a single run with reasonable accuracy.
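
    Basic SSA consists of embedding the series in a trajectory matrix, taking an SVD, and mapping each rank-one piece back to a series by diagonal (Hankel) averaging. The sketch below shows that core, under the assumption of a simple non-overlapping decomposition; grouping the components into trend, periodic, and noise parts, as done in the study, is a separate step not shown.

```python
import numpy as np

def ssa_components(x, window, k):
    """Decompose series x into its first k SSA components."""
    x = np.asarray(x, float)
    n = len(x)
    cols = n - window + 1
    # trajectory (Hankel) matrix: lagged copies of the series as columns
    X = np.column_stack([x[i:i + window] for i in range(cols)])
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    comps = []
    for j in range(k):
        Xj = s[j] * np.outer(U[:, j], Vt[j])   # rank-one piece of X
        comp, counts = np.zeros(n), np.zeros(n)
        for i in range(window):                 # diagonal (Hankel) averaging
            for c in range(cols):
                comp[i + c] += Xj[i, c]
                counts[i + c] += 1
        comps.append(comp / counts)
    return comps
```

    With k equal to the full rank, the components sum back to the original series; forecasting then proceeds by extending the signal components and adding them, as described in the abstract.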

  7. Cross-correlation of point series using a new method

    NASA Technical Reports Server (NTRS)

    Strothers, Richard B.

    1994-01-01

    Traditional methods of cross-correlation of two time series do not apply to point time series. Here, a new method, devised specifically for point series, utilizes a correlation measure based on the rms difference (or, alternatively, the median absolute difference) between nearest neighbors in overlapped segments of the two series. Error estimates for the observed locations of the points, as well as a systematic shift of one series with respect to the other to accommodate a constant but unknown lead or lag, are easily incorporated into the analysis using Monte Carlo techniques. A methodological restriction adopted here is that one series be treated as a template series against which the other, called the target series, is cross-correlated. To estimate a significance level for the correlation measure, the adopted alternative (null) hypothesis is that the target series arises from a homogeneous Poisson process. The new method is applied to cross-correlating the times of the greatest geomagnetic storms with the times of maximum in the undecennial solar activity cycle.
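
    The core of the measure, an rms nearest-neighbor distance between template and shifted target events, can be sketched as below. The Monte Carlo error propagation and the Poisson-process significance test described in the abstract are omitted, and the function names are hypothetical.

```python
def nearest_gap_rms(template, target, shift=0.0):
    """RMS distance from each target event to its nearest template
    event, after shifting the target by a trial lag."""
    total = 0.0
    for t in target:
        total += min((t + shift - s) ** 2 for s in template)
    return (total / len(target)) ** 0.5

def best_lag(template, target, lags):
    """Trial lag minimising the rms nearest-neighbour distance."""
    return min(lags, key=lambda L: nearest_gap_rms(template, target, L))
```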

  8. Self-organising mixture autoregressive model for non-stationary time series modelling.

    PubMed

    Ni, He; Yin, Hujun

    2008-12-01

    Modelling non-stationary time series has been a difficult task for both parametric and nonparametric methods. One promising solution is to combine the flexibility of nonparametric models with the simplicity of parametric models. In this paper, the self-organising mixture autoregressive (SOMAR) network is adopted as such a mixture model. It breaks time series into underlying segments and at the same time fits local linear regressive models to the clusters of segments. In this way, a global non-stationary time series is represented by a dynamic set of local linear regressive models. Neural gas is used for a more flexible structure of the mixture model. Furthermore, a new similarity measure has been introduced in the self-organising network to better quantify the similarity of time series segments. The network can be used naturally in modelling and forecasting non-stationary time series. Experiments on artificial, benchmark time series (e.g. Mackey-Glass) and real-world data (e.g. numbers of sunspots and Forex rates) are presented, and the results show that the proposed SOMAR network is effective and superior to other similar approaches.

  9. Time Series Model Identification by Estimating Information.

    DTIC Science & Technology

    1982-11-01

    principle, Applications of Statistics, P. R. Krishnaiah, ed., North-Holland: Amsterdam, 27-41. Anderson, T. W. (1971). The Statistical Analysis of Time Series...E. (1969). Multiple Time Series Modeling, Multivariate Analysis II, edited by P. Krishnaiah, Academic Press: New York, 389-409. Parzen, E. (1981...Newton, H. J. (1980). Multiple Time Series Modeling, II Multivariate Analysis - V, edited by P. Krishnaiah, North Holland: Amsterdam, 181-197. Shibata, R

  10. Analysis of Zenith Tropospheric Delay above Europe based on long time series derived from the EPN data

    NASA Astrophysics Data System (ADS)

    Baldysz, Zofia; Nykiel, Grzegorz; Figurski, Mariusz; Szafranek, Karolina; Kroszczynski, Krzysztof; Araszkiewicz, Andrzej

    2015-04-01

    In recent years, GNSS has begun to play an increasingly important role in research related to climate monitoring. Based on the GPS system, which has the longest operational record in comparison with other systems, and a common computational strategy applied to all observations, long and homogeneous ZTD (Zenith Tropospheric Delay) time series were derived. This paper presents results of an analysis of 16-year ZTD time series obtained from the EPN (EUREF Permanent Network) reprocessing performed by the Military University of Technology. To maintain the uniformity of the data, the analyzed period of time (1998-2013) is exactly the same for all stations: observations carried out before 1998 were removed from the time series, and observations processed using a different strategy were recalculated according to the MUT LAC approach. For all 16-year time series (59 stations), Lomb-Scargle periodograms were created to obtain information about the oscillations in the ZTD time series. Because strong annual oscillations disturb the character of oscillations with smaller amplitudes and thus hinder their investigation, Lomb-Scargle periodograms were also created for time series with the annual oscillation removed, in order to verify the presence of semi-annual, ter-annual and quarto-annual oscillations. The linear trend and seasonal components were estimated using LSE (Least Squares Estimation), and the Mann-Kendall trend test was used to confirm the presence of the linear trend identified by the LSE method. In order to verify the effect of the length of the time series on the estimated size of the linear trend, a comparison between two different lengths of ZTD time series was performed. For this comparative analysis, 30 stations which have been operating since 1996 were selected. For these stations two periods were analyzed: a shortened 16-year period (1998-2013) and the full 18-year period (1996-2013). 
    For some stations the additional two years of observations have a significant impact on the estimated size of the linear trend: only for 4 stations was the size of the linear trend exactly the same for the two periods. In one case, the sign of the trend changed from negative (16-year time series) to positive (18-year time series). The average value of the linear trends for the 16-year time series is 1.5 mm/decade, but their spatial distribution is not uniform. The average value of the linear trends for the 18-year time series is 2.0 mm/decade, with a better spatial distribution and smaller discrepancies.
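
    The Lomb-Scargle periodogram used here handles the unevenly sampled records that arise after removing suspect observations. A minimal sketch of the classic formulation follows, assuming mean-removed data and angular frequencies; the function name is illustrative.

```python
import numpy as np

def lomb_scargle(t, y, omegas):
    """Classic Lomb-Scargle periodogram for unevenly sampled data."""
    t = np.asarray(t, float)
    y = np.asarray(y, float)
    y = y - np.mean(y)
    power = []
    for w in omegas:
        # phase offset tau makes the sine and cosine terms orthogonal
        tau = np.arctan2(np.sum(np.sin(2 * w * t)),
                         np.sum(np.cos(2 * w * t))) / (2 * w)
        c = np.cos(w * (t - tau))
        s = np.sin(w * (t - tau))
        power.append(0.5 * ((y @ c) ** 2 / (c @ c) + (y @ s) ** 2 / (s @ s)))
    return np.array(power)
```

    The periodogram peaks at the angular frequency of any dominant oscillation, which is how annual, semi-annual and shorter cycles are identified in the ZTD series.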

  11. Multivariate stochastic analysis for Monthly hydrological time series at Cuyahoga River Basin

    NASA Astrophysics Data System (ADS)

    zhang, L.

    2011-12-01

    Copulas have become a very powerful statistical and stochastic methodology for multivariate analysis in environmental and water resources engineering. In recent years, the popular one-parameter Archimedean copulas, e.g. the Gumbel-Hougaard copula, Cook-Johnson copula and Frank copula, and the meta-elliptical copulas, e.g. the Gaussian copula and Student-t copula, have been applied in multivariate hydrological analyses, e.g. multivariate rainfall (rainfall intensity, duration and depth), flood (peak discharge, duration and volume), and drought analyses (drought length, mean and minimum SPI values, and drought mean areal extent). Copulas have also been applied in flood frequency analysis at the confluences of river systems by taking into account the dependence among upstream gauge stations rather than by using the hydrological routing technique. In most of the studies above, the annual time series have been treated as stationary signals, i.e., the time series have been assumed to consist of independent identically distributed (i.i.d.) random variables. But in reality, hydrological time series, especially daily and monthly hydrological time series, cannot be considered i.i.d. random variables because of the periodicity in the data structure. The stationarity assumption is also in question because of climate change and land use and land cover (LULC) change in recent years. To this end, it is necessary to re-evaluate the classic approach to the study of hydrological time series by relaxing the stationarity assumption through a nonstationary approach. Likewise, in the study of the dependence structure of hydrological time series, the assumption of the same type of univariate distribution also needs to be relaxed by adopting copula theory. In this paper, the univariate monthly hydrological time series will be studied through a nonstationary time series analysis approach. 
    The dependence structure of the multivariate monthly hydrological time series will be studied through copula theory. For parameter estimation, maximum likelihood estimation (MLE) will be applied. To illustrate the method, the univariate time series model and the dependence structure will be determined and tested using the monthly discharge time series of the Cuyahoga River Basin.
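
    For the Gumbel-Hougaard copula named above, there is a standard method-of-moments link between Kendall's tau and the copula parameter, θ = 1/(1 − τ). The study uses MLE; the simpler inversion-of-tau estimator is sketched below with illustrative names.

```python
def kendall_tau(x, y):
    """Kendall's rank correlation from concordant/discordant pairs
    (O(n^2) version, no tie correction)."""
    n = len(x)
    conc = disc = 0
    for i in range(n):
        for j in range(i + 1, n):
            s = (x[i] - x[j]) * (y[i] - y[j])
            if s > 0:
                conc += 1
            elif s < 0:
                disc += 1
    return (conc - disc) / (n * (n - 1) / 2)

def gumbel_theta(tau):
    """Method-of-moments estimate of the Gumbel-Hougaard copula
    parameter: theta = 1 / (1 - tau), valid for tau in [0, 1)."""
    return 1.0 / (1.0 - tau)
```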

  12. Using First Differences to Reduce Inhomogeneity in Radiosonde Temperature Datasets.

    NASA Astrophysics Data System (ADS)

    Free, Melissa; Angell, James K.; Durre, Imke; Lanzante, John; Peterson, Thomas C.; Seidel, Dian J.

    2004-11-01

    The utility of a “first difference” method for producing temporally homogeneous large-scale mean time series is assessed. Starting with monthly averages, the method involves dropping data around the time of suspected discontinuities and then calculating differences in temperature from one year to the next, resulting in a time series of year-to-year differences for each month at each station. These first difference time series are then combined to form large-scale means, and mean temperature time series are constructed from the first difference series. When applied to radiosonde temperature data, the method introduces random errors that decrease with the number of station time series used to create the large-scale time series and increase with the number of temporal gaps in the station time series. Root-mean-square errors for annual means of datasets produced with this method using over 500 stations are estimated at no more than 0.03 K, with errors in trends less than 0.02 K decade⁻¹ for 1960-97 at 500 mb. For a 50-station dataset, errors in trends in annual global means introduced by the first differencing procedure may be as large as 0.06 K decade⁻¹ (for six breaks per series), which is greater than the standard error of the trend. Although the first difference method offers significant resource and labor advantages over methods that attempt to adjust the data, it introduces an error in large-scale mean time series that may be unacceptable in some cases.
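
    The combination step, averaging year-to-year differences across stations and then cumulating them into a mean anomaly series, can be sketched as follows. This is a simplified reading of the method (one value per year, `None` marking dropped or missing data), with a hypothetical function name.

```python
def first_difference_means(stations):
    """Combine station series (lists with None for missing/dropped
    values) into a large-scale mean anomaly series: average the
    year-to-year differences across stations, then cumulate them.
    Assumes every year-pair is covered by at least one station."""
    n_years = len(stations[0])
    mean_diffs = []
    for y in range(1, n_years):
        diffs = [s[y] - s[y - 1] for s in stations
                 if s[y] is not None and s[y - 1] is not None]
        mean_diffs.append(sum(diffs) / len(diffs))
    series = [0.0]           # anomaly relative to the first year
    for d in mean_diffs:
        series.append(series[-1] + d)
    return series
```

    Note how a gap in one station simply removes it from that year's average; each gap reduces the effective sample and is the source of the random error the abstract quantifies.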


  13. FRET-Aptamer Assays for Bone Marker Assessment, C-Telopeptide, Creatinine, and Vitamin D

    NASA Technical Reports Server (NTRS)

    Bruno, John G.

    2013-01-01

    Astronauts lose 1.0 to 1.5% of their bone mass per month on long-duration spaceflights. NASA wishes to monitor bone loss onboard spacecraft to develop nutritional and exercise countermeasures and make adjustments during long space missions. On Earth, the same technology could be used to monitor osteoporosis and its therapy. Aptamers bind to targets against which they are developed, much like antibodies. However, aptamers do not require animal hosts or cell culture and are therefore easier, faster, and less expensive to produce. In addition, aptamers sometimes exhibit greater affinity and specificity than comparable antibodies. In this work, fluorescent dyes and quenchers were added to the aptamers to enable pushbutton, one-step, bind-and-detect fluorescence resonance energy transfer (FRET) assays or tests that can be freeze-dried, rehydrated with body fluids, and used to quantitate bone loss or vitamin D levels with a handheld fluorometer in the spacecraft environment. This work generated specific, rapid, one-step FRET assays for the bone loss marker C-telopeptide (CTx) extracted from urine, creatinine from urine, and vitamin D congeners in diluted serum. The assays were quantified in nanograms/mL using a handheld fluorometer connected to a laptop computer to convert the raw fluorescence values into concentrations of each analyte according to linear standard curves. DNA aptamers were selected and amplified for several rounds against a 26-amino-acid form of CTx, creatinine, and vitamin D. The commonalities between loop structures were studied, and several common loop structures were converted into aptamer beacons with a fluorophore and quencher on each end. In theory, when the aptamer beacon binds its cognate target (CTx bone peptide, creatinine, or vitamin D), it is forced open and no longer quenched, so it gives off fluorescent light (when excited) in proportion to the amount of target present in a sample. 
This proportional increase in fluorescence is called a "lights on" FRET response. The vitamin D aptamer beacon gives a "lights off" or inversely proportional fluorescence response to the amount of vitamin D present in diluted serum. These FRET-aptamer assays are rapid (<30 minutes), sensitive (low ng/mL detection limits), and quite easy to carry out (add sample, mix, and detect in the handheld reader). Benefits include the speed of the assays as well as the small amount of space taken up by the handheld reader and cuvette assays. The aptamer DNA sequences represent novel additional features of the existing (patent-pending) FRET-aptamer assay platform.

  14. Radiocarbon dating uncertainty and the reliability of the PEWMA method of time-series analysis for research on long-term human-environment interaction

    PubMed Central

    Carleton, W. Christopher; Campbell, David

    2018-01-01

    Statistical time-series analysis has the potential to improve our understanding of human-environment interaction in deep time. However, radiocarbon dating—the most common chronometric technique in archaeological and palaeoenvironmental research—creates challenges for established statistical methods. The methods assume that observations in a time-series are precisely dated, but this assumption is often violated when calibrated radiocarbon dates are used because they usually have highly irregular uncertainties. As a result, it is unclear whether the methods can be reliably used on radiocarbon-dated time-series. With this in mind, we conducted a large simulation study to investigate the impact of chronological uncertainty on a potentially useful time-series method. The method is a type of regression involving a prediction algorithm called the Poisson Exponentially Weighted Moving Average (PEWMA). It is designed for use with count time-series data, which makes it applicable to a wide range of questions about human-environment interaction in deep time. Our simulations suggest that the PEWMA method can often correctly identify relationships between time-series despite chronological uncertainty. When two time-series are correlated with a coefficient of 0.25, the method is able to identify that relationship correctly 20–30% of the time, providing the time-series contain low noise levels. With correlations of around 0.5, it is capable of correctly identifying correlations despite chronological uncertainty more than 90% of the time. While further testing is desirable, these findings indicate that the method can be used to test hypotheses about long-term human-environment interaction with a reasonable degree of confidence. PMID:29351329

  15. Radiocarbon dating uncertainty and the reliability of the PEWMA method of time-series analysis for research on long-term human-environment interaction.

    PubMed

    Carleton, W Christopher; Campbell, David; Collard, Mark

    2018-01-01

    Statistical time-series analysis has the potential to improve our understanding of human-environment interaction in deep time. However, radiocarbon dating, the most common chronometric technique in archaeological and palaeoenvironmental research, creates challenges for established statistical methods. The methods assume that observations in a time-series are precisely dated, but this assumption is often violated when calibrated radiocarbon dates are used because they usually have highly irregular uncertainties. As a result, it is unclear whether the methods can be reliably used on radiocarbon-dated time-series. With this in mind, we conducted a large simulation study to investigate the impact of chronological uncertainty on a potentially useful time-series method. The method is a type of regression involving a prediction algorithm called the Poisson Exponentially Weighted Moving Average (PEWMA). It is designed for use with count time-series data, which makes it applicable to a wide range of questions about human-environment interaction in deep time. Our simulations suggest that the PEWMA method can often correctly identify relationships between time-series despite chronological uncertainty. When two time-series are correlated with a coefficient of 0.25, the method is able to identify that relationship correctly 20-30% of the time, provided the time-series contain low noise levels. With correlations of around 0.5, it is capable of correctly identifying correlations despite chronological uncertainty more than 90% of the time. While further testing is desirable, these findings indicate that the method can be used to test hypotheses about long-term human-environment interaction with a reasonable degree of confidence.
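    The PEWMA algorithm itself is more elaborate than either abstract can convey, but its core idea can be illustrated with a minimal sketch: treat an exponentially weighted moving average of past counts as the Poisson rate for the next one-step-ahead forecast. The helper names `pewma_forecast` and the fixed weight `w` below are illustrative assumptions, not the authors' implementation:

```python
import math

def pewma_forecast(counts, w=0.8):
    """One-step-ahead Poisson rate forecasts: the rate is an
    exponentially weighted moving average of the counts seen so far
    (a simplified sketch of the PEWMA idea)."""
    rate = float(counts[0])            # initialise at the first observation
    forecasts = []
    for y in counts:
        forecasts.append(rate)         # forecast made before seeing y
        rate = w * rate + (1 - w) * y  # EWMA update with the new count
    return forecasts

def poisson_loglik(y, lam):
    """Log-likelihood of observing count y under a Poisson(lam) forecast."""
    return y * math.log(lam) - lam - math.lgamma(y + 1)

forecasts = pewma_forecast([3, 4, 5, 4, 6])
score = poisson_loglik(4, forecasts[2])
```

    A larger weight `w` makes the rate adapt more slowly; the full PEWMA embeds this update in a state-space model rather than using a fixed weight.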

  16. Short Term Rain Prediction For Sustainability of Tanks in the Tropic Influenced by Shadow Rains

    NASA Astrophysics Data System (ADS)

    Suresh, S.

    2007-07-01

    Rainfall and flow prediction, adapting the Venkataraman single time series approach and the Wiener multiple time series approach, were conducted for the Aralikottai and Kothamangalam tank systems in Tamil Nadu, India. The results indicated that raw prediction of daily values is closer to actual values than trend-identified predictions. The sister seasonal time series were more amenable to prediction than the whole parent time series. The Venkataraman single time series approach was better suited to rainfall prediction, while the Wiener approach proved better for daily prediction of flow based on rainfall. The major conclusion is that the sister seasonal time series of rain and flow have their own identities even though they form part of the whole parent time series. Further studies with other tropical small watersheds are necessary to establish this unique characteristic of independent, but not exclusive, behaviour of seasonal stationary stochastic processes as compared to parent non-stationary stochastic processes.

  17. The Value of Interrupted Time-Series Experiments for Community Intervention Research

    PubMed Central

    Biglan, Anthony; Ary, Dennis; Wagenaar, Alexander C.

    2015-01-01

    Greater use of interrupted time-series experiments is advocated for community intervention research. Time-series designs enable the development of knowledge about the effects of community interventions and policies in circumstances in which randomized controlled trials are too expensive, premature, or simply impractical. The multiple baseline time-series design typically involves two or more communities that are repeatedly assessed, with the intervention introduced into one community at a time. It is particularly well suited to initial evaluations of community interventions and the refinement of those interventions. This paper describes the main features of multiple baseline designs and related repeated-measures time-series experiments, discusses the threats to internal validity in multiple baseline designs, and outlines techniques for statistical analyses of time-series data. Examples are given of the use of multiple baseline designs in evaluating community interventions and policy changes. PMID:11507793

  18. Rainfall disaggregation for urban hydrology: Effects of spatial consistence

    NASA Astrophysics Data System (ADS)

    Müller, Hannes; Haberlandt, Uwe

    2015-04-01

    For urban hydrology, rainfall time series with a high temporal resolution are crucial. Observed time series of this kind are very short in most cases, so they cannot be used. In contrast, time series with a lower temporal resolution (daily measurements) exist for much longer periods. The objective is to derive time series with a long duration and a high resolution by disaggregating the time series of non-recording stations with information from the time series of recording stations. The multiplicative random cascade model is a well-known disaggregation model for daily time series. For urban hydrology it is often assumed that a day consists of only 1280 minutes in total as the starting point for the disaggregation process. We introduce a new variant of the cascade model which works without this assumption and also outperforms the existing approach regarding time series characteristics such as wet and dry spell duration, average intensity, fraction of dry intervals and extreme value representation. However, in both approaches the rainfall time series of different stations are disaggregated without consideration of the surrounding stations. This yields unrealistic spatial patterns of rainfall. We apply a simulated annealing algorithm that has previously been used successfully for hourly values. Relative diurnal cycles of the disaggregated time series are resampled to reproduce the spatial dependence of rainfall. To describe spatial dependence we use bivariate characteristics such as probability of occurrence, continuity ratio and coefficient of correlation. The investigation area is a sewage system in Northern Germany. We show that the algorithm has the capability to improve spatial dependence. The influence of the chosen disaggregation routine and of the spatial dependence on overflow occurrences and volumes of the sewage system will be analyzed.
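    The branching step of a multiplicative random cascade can be sketched in a few lines. This is a toy version: the splitting-weight distribution and the dry-branch probability `x` are placeholder assumptions, not the calibrated parameters of the study:

```python
import random

def cascade_disaggregate(daily_total, levels=3, x=0.3, seed=1):
    """Split a daily rainfall total into 2**levels sub-intervals by
    repeatedly dividing each amount into shares W and (1 - W); with
    probability x per side the weight collapses to 0 or 1, which
    produces the dry sub-intervals typical of observed rainfall."""
    rng = random.Random(seed)
    amounts = [daily_total]
    for _ in range(levels):
        nxt = []
        for a in amounts:
            u = rng.random()
            if u < x:
                w = 0.0                # first half dry
            elif u > 1 - x:
                w = 1.0                # second half dry
            else:
                w = rng.random()       # uniform split otherwise
            nxt.extend([a * w, a * (1 - w)])
        amounts = nxt
    return amounts

halves = cascade_disaggregate(24.0, levels=3)
```

    Whatever weights are drawn, mass is conserved exactly at every level, the defining property of a micro-canonical cascade.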

  19. Clinical time series prediction: Toward a hierarchical dynamical system framework.

    PubMed

    Liu, Zitao; Hauskrecht, Milos

    2015-09-01

    Developing machine learning and data mining algorithms for building temporal models of clinical time series is important for understanding the patient's condition, the dynamics of a disease, the effect of various patient management interventions and clinical decision making. In this work, we propose and develop a novel hierarchical framework for modeling clinical time series data of varied length and with irregularly sampled observations. Our hierarchical dynamical system framework for modeling clinical time series combines the advantages of two temporal modeling approaches: the linear dynamical system and the Gaussian process. We model the irregularly sampled clinical time series by using multiple Gaussian process sequences in the lower level of our hierarchical framework and capture the transitions between Gaussian processes by utilizing the linear dynamical system. The experiments are conducted on the complete blood count (CBC) panel data of 1000 post-surgical cardiac patients during their hospitalization. Our framework is evaluated and compared to multiple baseline approaches in terms of the mean absolute prediction error and the absolute percentage error. We tested our framework by first learning the time series model from data for the patients in the training set, and then using it to predict future time series values for the patients in the test set. We show that our model outperforms multiple existing models in terms of its predictive accuracy. Our method achieved a 3.13% average prediction accuracy improvement on ten CBC lab time series when compared against the best performing baseline, and a 5.25% average improvement when only short-term predictions were considered. A new hierarchical dynamical system framework that lets us model irregularly sampled time series data is a promising new direction for modeling clinical time series and for improving their predictive performance. Copyright © 2014 Elsevier B.V. All rights reserved.

  20. Proposal of Classification Method of Time Series Data in International Emissions Trading Market Using Agent-based Simulation

    NASA Astrophysics Data System (ADS)

    Nakada, Tomohiro; Takadama, Keiki; Watanabe, Shigeyoshi

    This paper proposes a classification method based on Bayesian analysis for time series data from an agent-based simulation of the international emissions trading market, and compares it with a discrete Fourier transform (DFT) analytical method. The purpose is to demonstrate analytical methods that map time series data such as market prices. These analytical methods revealed the following results: (1) the classification methods express time series data as distances in a mapped space, which makes understanding and inference easier than working with the raw time series; (2) the methods can analyze uncertain time series data produced by the agent-based simulation, including both stationary and non-stationary processes, using these distances; and (3) the Bayesian analytical method can distinguish agents whose emission reduction targets differ by as little as 1%.
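    The discrete Fourier transform side of such a mapping can be sketched as follows: each simulated price series is reduced to the magnitudes of its first few DFT coefficients, and classification then operates on distances between these feature vectors. The helper names and the choice of the first `k` coefficients are illustrative assumptions:

```python
import cmath
import math

def dft_features(series, k=4):
    """Magnitudes of DFT coefficients 1..k, a compact frequency-domain
    signature of a (market price) time series."""
    n = len(series)
    return [abs(sum(series[t] * cmath.exp(-2j * cmath.pi * f * t / n)
                    for t in range(n)))
            for f in range(1, k + 1)]

def feature_distance(u, v):
    """Euclidean distance between two feature vectors."""
    return sum((a - b) ** 2 for a, b in zip(u, v)) ** 0.5

# a pure sine concentrates energy in the first coefficient;
# a flat series has no energy at any non-zero frequency
wave = [math.sin(2 * math.pi * t / 8) for t in range(8)]
wave_feats = dft_features(wave)
flat_feats = dft_features([1.0] * 8)
```
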

  1. How long will the traffic flow time series keep efficacious to forecast the future?

    NASA Astrophysics Data System (ADS)

    Yuan, PengCheng; Lin, XuXun

    2017-02-01

    This paper investigates how long a historical traffic flow time series remains efficacious for forecasting the future. In this frame, we first collect traffic flow time series data at different granularities. Then, using the modified rescaled range analysis method, we analyze the long-memory property of the traffic flow time series by computing the Hurst exponent. We calculate the long-term memory cycle and test its significance, and we also compare it with the result of the maximum Lyapunov exponent method. Our results show that both the freeway and the ground way traffic flow time series demonstrate a positively correlated trend (i.e., they have the long-term memory property), and both of their memory cycles are about 30 h. We think this study is useful for short-term and long-term traffic flow prediction and management.
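    The abstract's modified rescaled range procedure is not spelled out, but the classical R/S estimate of the Hurst exponent that it builds on can be sketched as follows (`hurst_rs` is an illustrative helper, not the authors' code):

```python
import math
import random

def hurst_rs(series, min_n=8):
    """Classical rescaled-range estimate of the Hurst exponent: average
    R/S over non-overlapping windows of doubling sizes, then take the
    least-squares slope of log(R/S) against log(n)."""
    N = len(series)
    log_n, log_rs = [], []
    n = min_n
    while n <= N // 2:
        ratios = []
        for start in range(0, N - n + 1, n):
            w = series[start:start + n]
            mean = sum(w) / n
            cum = cmin = cmax = dev = 0.0
            for v in w:
                cum += v - mean                   # cumulative deviation
                cmin, cmax = min(cmin, cum), max(cmax, cum)
                dev += (v - mean) ** 2
            s = (dev / n) ** 0.5                  # window std deviation
            if s > 0:
                ratios.append((cmax - cmin) / s)  # rescaled range R/S
        if ratios:
            log_n.append(math.log(n))
            log_rs.append(math.log(sum(ratios) / len(ratios)))
        n *= 2
    k = len(log_n)
    mx, my = sum(log_n) / k, sum(log_rs) / k
    return (sum((x - mx) * (y - my) for x, y in zip(log_n, log_rs)) /
            sum((x - mx) ** 2 for x in log_n))

rng = random.Random(42)
h_noise = hurst_rs([rng.random() for _ in range(512)])
```

    For white noise the estimate should fall near 0.5 (the small-sample R/S statistic is biased slightly upward); values persistently above 0.5, as reported for the traffic series, indicate long-term memory.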

  2. Measurements of spatial population synchrony: influence of time series transformations.

    PubMed

    Chevalier, Mathieu; Laffaille, Pascal; Ferdy, Jean-Baptiste; Grenouillet, Gaël

    2015-09-01

    Two mechanisms have been proposed to explain spatial population synchrony: dispersal among populations, and the spatial correlation of density-independent factors (the "Moran effect"). To identify which of these two mechanisms is driving spatial population synchrony, time series transformations (TSTs) of abundance data have been used to remove the signature of one mechanism, and highlight the effect of the other. However, several issues with TSTs remain, and to date no consensus has emerged about how population time series should be handled in synchrony studies. Here, by using 3131 time series involving 34 fish species found in French rivers, we computed several metrics commonly used in synchrony studies to determine whether a large-scale climatic factor (temperature) influenced fish population dynamics at the regional scale, and to test the effect of three commonly used TSTs (detrending, prewhitening and a combination of both) on these metrics. We also tested whether the influence of TSTs on time series and population synchrony levels was related to the features of the time series using both empirical and simulated time series. For several species, and regardless of the TST used, we found evidence of a Moran effect on freshwater fish populations. However, these results were globally biased downward by TSTs, which reduced our ability to detect significant signals. Depending on the species and the features of the time series, we found that TSTs could lead to contradictory results, regardless of the metric considered. Finally, we suggest guidelines on how population time series should be processed in synchrony studies.
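    The two TSTs discussed, detrending and prewhitening, can be sketched in a simplified form (ordinary least-squares trend removal and AR(1) residuals; the study's actual processing pipeline is not reproduced here):

```python
def detrend(series):
    """Remove a least-squares linear trend from the series."""
    n = len(series)
    t = range(n)
    mt, my = (n - 1) / 2, sum(series) / n
    slope = (sum((ti - mt) * (yi - my) for ti, yi in zip(t, series)) /
             sum((ti - mt) ** 2 for ti in t))
    return [yi - (my + slope * (ti - mt)) for ti, yi in zip(t, series)]

def prewhiten(series):
    """Remove lag-1 autocorrelation by returning AR(1) residuals;
    the output is one observation shorter than the input."""
    n = len(series)
    m = sum(series) / n
    phi = (sum((series[i] - m) * (series[i - 1] - m) for i in range(1, n)) /
           sum((v - m) ** 2 for v in series))
    return [series[i] - phi * series[i - 1] for i in range(1, n)]

# a perfectly linear series detrends to (numerically) zero residuals
residuals = detrend([2 * i + 5.0 for i in range(30)])
```
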

  3. Transition Icons for Time-Series Visualization and Exploratory Analysis.

    PubMed

    Nickerson, Paul V; Baharloo, Raheleh; Wanigatunga, Amal A; Manini, Todd M; Tighe, Patrick J; Rashidi, Parisa

    2018-03-01

    The modern healthcare landscape has seen the rapid emergence of techniques and devices that temporally monitor and record physiological signals. The prevalence of time-series data within the healthcare field necessitates the development of methods that can analyze the data in order to draw meaningful conclusions. Time-series behavior is notoriously difficult to intuitively understand due to its intrinsic high-dimensionality, which is compounded in the case of analyzing groups of time series collected from different patients. Our framework, which we call transition icons, renders common patterns in a visual format useful for understanding the shared behavior within groups of time series. Transition icons are adept at detecting and displaying subtle differences and similarities, e.g., between measurements taken from patients receiving different treatment strategies or stratified by demographics. We introduce various methods that collectively allow for exploratory analysis of groups of time series, while being free of distribution assumptions and including simple heuristics for parameter determination. Our technique extracts discrete transition patterns from symbolic aggregate approXimation representations, and compiles transition frequencies into a bag of patterns constructed for each group. These transition frequencies are normalized and aligned in icon form to intuitively display the underlying patterns. We demonstrate the transition icon technique for two time-series datasets: postoperative pain scores and hip-worn accelerometer activity counts. We believe transition icons can be an important tool for researchers approaching time-series data, as they give rich and intuitive information about collective time-series behaviors.
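    A minimal sketch of the pipeline behind a transition icon: symbolize each series (here a simplified SAX that uses equal-width bins rather than SAX's Gaussian breakpoints), then collect the relative frequencies of symbol-to-symbol transitions into a bag of patterns. All helper names are illustrative:

```python
def sax_symbols(series, word=8, alphabet="abcd"):
    """Z-normalise, piecewise-aggregate into `word` segments, and map
    each segment mean to a symbol (equal-width bins for simplicity)."""
    n = len(series)
    mean = sum(series) / n
    sd = (sum((v - mean) ** 2 for v in series) / n) ** 0.5 or 1.0
    z = [(v - mean) / sd for v in series]
    seg = []
    for i in range(word):
        lo_i, hi_i = i * n // word, (i + 1) * n // word
        seg.append(sum(z[lo_i:hi_i]) / (hi_i - lo_i))
    lo, hi = min(seg), max(seg)
    width = (hi - lo) / len(alphabet) or 1.0
    return "".join(alphabet[min(len(alphabet) - 1, int((s - lo) / width))]
                   for s in seg)

def transition_bag(symbols):
    """Relative frequency of each adjacent symbol pair."""
    pairs = [symbols[i:i + 2] for i in range(len(symbols) - 1)]
    return {p: pairs.count(p) / len(pairs) for p in set(pairs)}

sax_word = sax_symbols([float(i) for i in range(32)])
bag = transition_bag("aabba")
```

    The normalized transition frequencies in `bag` are what a transition icon renders visually for each patient group.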

  4. Relating the large-scale structure of time series and visibility networks.

    PubMed

    Rodríguez, Miguel A

    2017-06-01

    The structure of time series is usually characterized by means of correlations. A new proposal based on visibility networks has been considered recently. Visibility networks are complex networks mapped from surfaces or time series using visibility properties. The structures of time series and visibility networks are closely related, as shown by means of fractional time series in recent works. In these works, a simple relationship between the Hurst exponent H of fractional time series and the exponent of the distribution of edges γ of the corresponding visibility network, which exhibits a power law, is shown. To check and generalize these results, in this paper we delve into this idea of connected structures by defining both structures more properly. In addition to the exponents used before, H and γ, which take into account local properties, we consider two more exponents that, as we will show, characterize global properties. These are the exponent α for time series, which gives the scaling of the variance with the size as var∼T^{2α}, and the exponent κ of their corresponding network, which gives the scaling of the averaged maximum of the number of edges, 〈k_{M}〉∼N^{κ}. With this representation, a more precise connection between the structures of general time series and their associated visibility network is achieved. Similarities and differences are more clearly established, and new scaling forms of complex networks appear in agreement with their respective classes of time series.
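    The mapping from a time series to its natural visibility graph can be sketched directly from the visibility criterion: two observations are connected when every observation between them lies strictly below the straight line joining them (an O(n^2) illustration, not an optimised implementation):

```python
def visibility_edges(series):
    """Edges of the natural visibility graph of a time series; nodes
    are the time indices 0..n-1."""
    n = len(series)
    edges = []
    for a in range(n):
        for b in range(a + 1, n):
            # (a, b) is an edge if every intermediate point c lies
            # strictly below the line from (a, y_a) to (b, y_b)
            if all(series[c] < series[b] +
                   (series[a] - series[b]) * (b - c) / (b - a)
                   for c in range(a + 1, b)):
                edges.append((a, b))
    return edges
```

    From the edge list one can read off the degree distribution, whose tail exponent γ is the quantity related to the Hurst exponent H in the abstract.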

  5. Time Series Modelling of Syphilis Incidence in China from 2005 to 2012

    PubMed Central

    Zhang, Xingyu; Zhang, Tao; Pei, Jiao; Liu, Yuanyuan; Li, Xiaosong; Medrano-Gracia, Pau

    2016-01-01

    Background The infection rate of syphilis in China has increased dramatically in recent decades, becoming a serious public health concern. Early prediction of syphilis is therefore of great importance for health planning and management. Methods In this paper, we analyzed surveillance time series data for primary, secondary, tertiary, congenital and latent syphilis in mainland China from 2005 to 2012. Seasonality and long-term trend were explored with decomposition methods. Autoregressive integrated moving average (ARIMA) was used to fit a univariate time series model of syphilis incidence. A separate multi-variable time series for each syphilis type was also tested using an autoregressive integrated moving average model with exogenous variables (ARIMAX). Results The syphilis incidence rates have increased three-fold from 2005 to 2012. All syphilis time series showed strong seasonality and increasing long-term trend. Both ARIMA and ARIMAX models fitted and estimated syphilis incidence well. All univariate time series showed highest goodness-of-fit results with the ARIMA(0,0,1)×(0,1,1) model. Conclusion Time series analysis was an effective tool for modelling the historical and future incidence of syphilis in China. The ARIMAX model showed superior performance to the ARIMA model for the modelling of syphilis incidence. Time series correlations existed between the models for primary, secondary, tertiary, congenital and latent syphilis. PMID:26901682

  6. Time Series Modelling of Syphilis Incidence in China from 2005 to 2012.

    PubMed

    Zhang, Xingyu; Zhang, Tao; Pei, Jiao; Liu, Yuanyuan; Li, Xiaosong; Medrano-Gracia, Pau

    2016-01-01

    The infection rate of syphilis in China has increased dramatically in recent decades, becoming a serious public health concern. Early prediction of syphilis is therefore of great importance for health planning and management. In this paper, we analyzed surveillance time series data for primary, secondary, tertiary, congenital and latent syphilis in mainland China from 2005 to 2012. Seasonality and long-term trend were explored with decomposition methods. Autoregressive integrated moving average (ARIMA) was used to fit a univariate time series model of syphilis incidence. A separate multi-variable time series for each syphilis type was also tested using an autoregressive integrated moving average model with exogenous variables (ARIMAX). The syphilis incidence rates have increased three-fold from 2005 to 2012. All syphilis time series showed strong seasonality and increasing long-term trend. Both ARIMA and ARIMAX models fitted and estimated syphilis incidence well. All univariate time series showed highest goodness-of-fit results with the ARIMA(0,0,1)×(0,1,1) model. Time series analysis was an effective tool for modelling the historical and future incidence of syphilis in China. The ARIMAX model showed superior performance to the ARIMA model for the modelling of syphilis incidence. Time series correlations existed between the models for primary, secondary, tertiary, congenital and latent syphilis.

  7. Trend time-series modeling and forecasting with neural networks.

    PubMed

    Qi, Min; Zhang, G Peter

    2008-05-01

    Despite its great importance, there has been no general consensus on how to model the trends in time-series data. Compared to traditional approaches, neural networks (NNs) have shown some promise in time-series forecasting. This paper investigates how to best model trend time series using NNs. Four different strategies (raw data, raw data with time index, detrending, and differencing) are used to model various trend patterns (linear, nonlinear, deterministic, stochastic, and breaking trend). We find that with NNs differencing often gives meritorious results regardless of the underlying data generating processes (DGPs). This finding is also confirmed by the real gross national product (GNP) series.
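    Of the four strategies, differencing is the simplest to state: model the first differences, then integrate the forecasts back to the original level. A minimal sketch (the neural network itself is omitted):

```python
def difference(series):
    """First differences, which remove a stochastic trend before modelling."""
    return [series[i] - series[i - 1] for i in range(1, len(series))]

def undifference(first_value, diffs):
    """Cumulatively sum (forecast) differences back to the original scale."""
    out = [first_value]
    for d in diffs:
        out.append(out[-1] + d)
    return out
```
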

  8. Adaptive time-variant models for fuzzy-time-series forecasting.

    PubMed

    Wong, Wai-Keung; Bai, Enjian; Chu, Alice Wai-Ching

    2010-12-01

    A fuzzy time series has been applied to the prediction of enrollment, temperature, stock indices, and other domains. Related studies mainly focus on three factors, namely, the partition of discourse, the content of forecasting rules, and the methods of defuzzification, all of which greatly influence the prediction accuracy of forecasting models. These studies use fixed analysis window sizes for forecasting. In this paper, an adaptive time-variant fuzzy-time-series forecasting model (ATVF) is proposed to improve forecasting accuracy. The proposed model automatically adapts the analysis window size of fuzzy time series based on the prediction accuracy in the training phase and uses heuristic rules to generate forecasting values in the testing phase. The performance of the ATVF model is tested using both simulated and actual time series including the enrollments at the University of Alabama, Tuscaloosa, and the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX). The experiment results show that the proposed ATVF model achieves a significant improvement in forecasting accuracy as compared to other fuzzy-time-series forecasting models.

  9. Filter-based multiscale entropy analysis of complex physiological time series.

    PubMed

    Xu, Yuesheng; Zhao, Liang

    2013-08-01

    Multiscale entropy (MSE) has been widely and successfully used in analyzing the complexity of physiological time series. We reinterpret the averaging process in MSE as filtering a time series by a filter of a piecewise constant type. From this viewpoint, we introduce filter-based multiscale entropy (FME), which filters a time series to generate multiple frequency components, and then we compute the blockwise entropy of the resulting components. By choosing filters adapted to the feature of a given time series, FME is able to better capture its multiscale information and to provide more flexibility for studying its complexity. Motivated by the heart rate turbulence theory, which suggests that the human heartbeat interval time series can be described in piecewise linear patterns, we propose piecewise linear filter multiscale entropy (PLFME) for the complexity analysis of the time series. Numerical results from PLFME are more robust to data of various lengths than those from MSE. The numerical performance of the adaptive piecewise constant filter multiscale entropy without prior information is comparable to that of PLFME, whose design takes prior information into account.
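    The paper's reinterpretation starts from standard MSE, whose two ingredients can be sketched as follows: coarse-grain the series with a piecewise-constant (averaging) filter, then compute sample entropy on the result. This is a plain-MSE illustration, not the proposed FME/PLFME filters:

```python
import math
import random

def coarse_grain(series, scale):
    """Average consecutive non-overlapping windows of length `scale`
    (the piecewise-constant filter the paper reinterprets)."""
    return [sum(series[i:i + scale]) / scale
            for i in range(0, len(series) - scale + 1, scale)]

def sample_entropy(series, m=2, r=0.2):
    """Sample entropy: -log of the ratio of (m+1)-point template
    matches to m-point template matches within tolerance r."""
    def matches(mm):
        hits = 0
        for i in range(len(series) - mm):
            for j in range(i + 1, len(series) - mm):
                if all(abs(series[i + k] - series[j + k]) <= r
                       for k in range(mm)):
                    hits += 1
        return hits
    return -math.log(matches(m + 1) / matches(m))

rng = random.Random(1)
se_noise = sample_entropy([rng.random() for _ in range(150)], m=2, r=0.3)
```

    Multiscale entropy is then `sample_entropy(coarse_grain(x, s))` evaluated over a range of scales `s`.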

  10. Characteristics of the transmission of autoregressive sub-patterns in financial time series

    NASA Astrophysics Data System (ADS)

    Gao, Xiangyun; An, Haizhong; Fang, Wei; Huang, Xuan; Li, Huajiao; Zhong, Weiqiong

    2014-09-01

    There are many types of autoregressive patterns in financial time series, and they form a transmission process. Here, we define autoregressive patterns quantitatively through an econometrical regression model. We present a computational algorithm that sets the autoregressive patterns as nodes and transmissions between patterns as edges, and then converts the transmission process of autoregressive patterns in a time series into a network. We utilised daily Shanghai (securities) composite index time series to study the transmission characteristics of autoregressive patterns. We found statistically significant evidence that the financial market is not random and that there are similar characteristics between parts and whole time series. A few types of autoregressive sub-patterns and transmission patterns drive the oscillations of the financial market. A clustering effect on fluctuations appears in the transmission process, and certain non-major autoregressive sub-patterns have high media capabilities in the financial time series. Different stock indexes exhibit similar characteristics in the transmission of fluctuation information. This work not only proposes a distinctive perspective for analysing financial time series but also provides important information for investors.

  11. Characteristics of the transmission of autoregressive sub-patterns in financial time series

    PubMed Central

    Gao, Xiangyun; An, Haizhong; Fang, Wei; Huang, Xuan; Li, Huajiao; Zhong, Weiqiong

    2014-01-01

    There are many types of autoregressive patterns in financial time series, and they form a transmission process. Here, we define autoregressive patterns quantitatively through an econometrical regression model. We present a computational algorithm that sets the autoregressive patterns as nodes and transmissions between patterns as edges, and then converts the transmission process of autoregressive patterns in a time series into a network. We utilised daily Shanghai (securities) composite index time series to study the transmission characteristics of autoregressive patterns. We found statistically significant evidence that the financial market is not random and that there are similar characteristics between parts and whole time series. A few types of autoregressive sub-patterns and transmission patterns drive the oscillations of the financial market. A clustering effect on fluctuations appears in the transmission process, and certain non-major autoregressive sub-patterns have high media capabilities in the financial time series. Different stock indexes exhibit similar characteristics in the transmission of fluctuation information. This work not only proposes a distinctive perspective for analysing financial time series but also provides important information for investors. PMID:25189200
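    Once each window of the series has been labelled with an autoregressive pattern, the conversion to a network reduces to counting adjacent-label transitions. A minimal sketch with hypothetical pattern labels:

```python
def pattern_network(patterns):
    """Directed weighted edge list: nodes are pattern labels and the
    weight of (p, q) counts how often p is immediately followed by q."""
    edges = {}
    for p, q in zip(patterns, patterns[1:]):
        edges[(p, q)] = edges.get((p, q), 0) + 1
    return edges

net = pattern_network(["AR1", "AR1", "AR2", "AR1"])
```
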

  12. Predicting long-term catchment nutrient export: the use of nonlinear time series models

    NASA Astrophysics Data System (ADS)

    Valent, Peter; Howden, Nicholas J. K.; Szolgay, Jan; Komornikova, Magda

    2010-05-01

    After the Second World War, nitrate concentrations in European water bodies changed significantly as a result of increased nitrogen fertilizer use and changes in land use. In recent decades, however, as a consequence of the implementation of nitrate-reducing measures in Europe, nitrate concentrations in water bodies have slowly decreased. As a result, the mean and variance of the observed time series also change with time (nonstationarity and heteroscedasticity). In order to detect changes and properly describe the behaviour of such time series by time series analysis, linear models, such as autoregressive (AR), moving average (MA) and autoregressive moving average (ARMA) models, are no longer suitable. Time series with sudden changes in statistical characteristics can cause various problems in the calibration of traditional water quality models and thus give biased predictions. Proper statistical analysis of these non-stationary and heteroscedastic time series, with the aim of detecting and subsequently explaining the variations in their statistical characteristics, requires the use of nonlinear time series models. This information can then be used to improve the model building and calibration of conceptual water quality models, or to select the right calibration periods in order to produce reliable predictions. The objective of this contribution is to analyze two long time series of nitrate concentrations of the rivers Ouse and Stour with advanced nonlinear statistical modelling techniques and compare their performance with traditional linear models of the ARMA class in order to identify changes in the time series characteristics. The time series were analysed with nonlinear models with multiple regimes represented by self-exciting threshold autoregressive (SETAR) and Markov-switching models (MSW).
The analysis showed that, based on the value of the residual sum of squares (RSS), in both datasets the SETAR and MSW models described the time series better than models of the ARMA class. In most cases the relative improvement of SETAR models over first-order AR models was low, ranging between 1% and 4%, with the exception of the three-regime model for the River Stour time series, where the improvement was 48.9%. In comparison, the relative improvement of MSW models was between 44.6% and 52.5% for two-regime models and from 60.4% to 75% for three-regime models. However, the visual assessment of models plotted against the original datasets showed that, despite a higher value of RSS, some ARMA models could describe the analyzed time series better than AR, MA and SETAR models with lower values of RSS. In both datasets MSW models provided a very good visual fit, describing most of the extreme values.
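    A two-regime SETAR fit of the kind compared here can be sketched as a grid search: each candidate threshold on the lagged value splits the observations into two regimes, a separate AR(1) is fitted to each, and the threshold with the lowest total RSS wins. A simplified illustration, not the authors' estimation code:

```python
import random

def _ols_rss(pairs):
    """Residual sum of squares of an OLS fit y = a + b*x over (x, y) pairs."""
    n = len(pairs)
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    den = sum((x - mx) ** 2 for x, _ in pairs)
    if den == 0:
        return sum((y - my) ** 2 for _, y in pairs)
    b = sum((x - mx) * (y - my) for x, y in pairs) / den
    a = my - b * mx
    return sum((y - a - b * x) ** 2 for x, y in pairs)

def setar_rss(series, min_obs=5):
    """Two-regime SETAR(1) sketch: grid-search a threshold on the lagged
    value and fit a separate AR(1) in each regime; never worse than a
    single global AR(1)."""
    pairs = list(zip(series[:-1], series[1:]))
    best = _ols_rss(pairs)                      # one-regime fallback
    for thr in sorted(set(series))[min_obs:-min_obs]:
        lo = [p for p in pairs if p[0] <= thr]
        hi = [p for p in pairs if p[0] > thr]
        if len(lo) >= min_obs and len(hi) >= min_obs:
            best = min(best, _ols_rss(lo) + _ols_rss(hi))
    return best

# demo: a noisy two-regime series; the SETAR fit should not be worse
rng = random.Random(3)
y = [0.0]
for _ in range(199):
    prev = y[-1]
    base = 1.0 + 0.5 * prev if prev <= 0 else -1.0 - 0.5 * prev
    y.append(base + (rng.random() - 0.5) * 0.2)
one_regime = _ols_rss(list(zip(y[:-1], y[1:])))
two_regime = setar_rss(y)
```
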

  13. Documentation of a spreadsheet for time-series analysis and drawdown estimation

    USGS Publications Warehouse

    Halford, Keith J.

    2006-01-01

    Drawdowns during aquifer tests can be obscured by barometric pressure changes, earth tides, regional pumping, and recharge events in the water-level record. These stresses can create water-level fluctuations that should be removed from observed water levels prior to estimating drawdowns. Simple models have been developed for estimating unpumped water levels during aquifer tests that are referred to as synthetic water levels. These models sum multiple time series such as barometric pressure, tidal potential, and background water levels to simulate non-pumping water levels. The amplitude and phase of each time series are adjusted so that synthetic water levels match measured water levels during periods unaffected by an aquifer test. Differences between synthetic and measured water levels are minimized with a sum-of-squares objective function. Root-mean-square errors during fitting and prediction periods were compared multiple times at four geographically diverse sites. Prediction error equaled fitting error when fitting periods were greater than or equal to four times prediction periods. The proposed drawdown estimation approach has been implemented in a spreadsheet application. Measured time series are independent so that collection frequencies can differ and sampling times can be asynchronous. Time series can be viewed selectively and magnified easily. Fitting and prediction periods can be defined graphically or entered directly. Synthetic water levels for each observation well are created with earth tides, measured time series, moving averages of time series, and differences between measured and moving averages of time series. Selected series and fitting parameters for synthetic water levels are stored and drawdowns are estimated for prediction periods. Drawdowns can be viewed independently and adjusted visually if an anomaly skews initial drawdowns away from 0. 
The number of observations in a drawdown time series can be reduced by averaging across user-defined periods. Raw or reduced drawdown estimates can be copied from the spreadsheet application or written to tab-delimited ASCII files.
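    The core fitting step, adjusting component weights so a synthetic water level matches the measured record, is ordinary least squares. A minimal sketch with mean-centred components (illustrative only; the spreadsheet's amplitude-and-phase adjustment per series is not reproduced):

```python
import math

def fit_synthetic(water, components):
    """Least-squares weights for a synthetic water level modelled as a
    weighted sum of mean-centred component series (e.g. barometric
    pressure and tidal potential).  Solves the normal equations by
    Gaussian elimination with partial pivoting."""
    n = len(water)
    def centred(s):
        m = sum(s) / n
        return [v - m for v in s]
    X = [centred(c) for c in components]
    y = centred(water)
    k = len(X)
    A = [[sum(X[i][t] * X[j][t] for t in range(n)) for j in range(k)]
         for i in range(k)]
    b = [sum(X[i][t] * y[t] for t in range(n)) for i in range(k)]
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    w = [0.0] * k
    for r in range(k - 1, -1, -1):
        w[r] = (b[r] - sum(A[r][c] * w[c] for c in range(r + 1, k))) / A[r][r]
    return w

# demo: recover known weights from an exact linear combination
baro = [math.sin(0.3 * t) for t in range(60)]
tide = [math.cos(0.5 * t) for t in range(60)]
water = [7.0 + 2.0 * b + 0.5 * c for b, c in zip(baro, tide)]
weights = fit_synthetic(water, [baro, tide])
```

    Drawdown is then estimated as measured minus synthetic water level over the prediction period.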

  14. 77 FR 42040 - Self-Regulatory Organizations; Chicago Board Options Exchange, Incorporated; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-17

    [Table excerpt garbled in extraction: the filing tabulates current and proposed CBOE continuous quoting obligations for Market-Makers, PMMs and LMMs, expressed as a percentage of time and a percentage of series, on a class-by-class basis and across all classes collectively.]

  15. Multiscale multifractal time irreversibility analysis of stock markets

    NASA Astrophysics Data System (ADS)

    Jiang, Chenguang; Shang, Pengjian; Shi, Wenbin

    2016-11-01

    Time irreversibility is one of the most important properties of nonstationary time series. Complex time series often exhibit multiscale time irreversibility, such that not only the original but also the coarse-grained time series are asymmetric over a wide range of scales. We study the multiscale time irreversibility of time series. In this paper, we develop a method called multiscale multifractal time irreversibility analysis (MMRA), which allows us to extend the description of time irreversibility to include its dependence on segment size and statistical moments. We test the effectiveness of MMRA in detecting the multifractality and time irreversibility of time series generated from the delayed Henon map and the binomial multifractal model. We then apply our method to the time irreversibility analysis of stock markets in different regions. We find that the emerging market has a higher degree of multifractality and time irreversibility compared with developed markets. In this sense, the MMRA method may provide a new angle for assessing the evolution stage of stock markets.

  16. Detecting PM2.5's Correlations between Neighboring Cities Using a Time-Lagged Cross-Correlation Coefficient.

    PubMed

    Wang, Fang; Wang, Lin; Chen, Yuming

    2017-08-31

    In order to investigate the time-dependent cross-correlations of fine particulate (PM2.5) series among neighboring cities in Northern China, we propose in this paper a new cross-correlation coefficient, the time-lagged q-L dependent height cross-correlation coefficient (denoted by ρq(τ, L)), which incorporates the time-lag factor and fluctuation-amplitude information into the analogous height cross-correlation analysis coefficient. Numerical tests are performed to illustrate that the newly proposed coefficient ρq(τ, L) can be used to detect cross-correlations between two series with time lags and to identify the ranges of fluctuations at which two series possess cross-correlations. Applying the new coefficient to analyze the time-dependent cross-correlations of PM2.5 series between Beijing and the three neighboring cities of Tianjin, Zhangjiakou, and Baoding, we find that time lags between the PM2.5 series with larger fluctuations are longer than those between PM2.5 series with smaller fluctuations. Our analysis also shows that cross-correlations between the PM2.5 series of two neighboring cities are significant and that the time lags between two PM2.5 series of neighboring cities are significantly non-zero. These findings provide new scientific support for the view that air pollution in neighboring cities affects one another not simultaneously but with a time lag.
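
    The time-lag detection idea can be illustrated with plain Pearson correlation scanned over candidate lags; this is a simplified stand-in for the paper's ρq(τ, L), which additionally weights by fluctuation amplitude and segment size. A sketch assuming NumPy, with hypothetical toy series:

```python
import numpy as np

def lagged_corr(x, y, lag):
    """Pearson correlation of x[t] with y[t - lag]: a plain-correlation
    stand-in for rho_q(tau, L) without amplitude weighting."""
    if lag > 0:
        a, b = x[lag:], y[:-lag]
    elif lag < 0:
        a, b = x[:lag], y[-lag:]
    else:
        a, b = x, y
    return np.corrcoef(a, b)[0, 1]

def best_lag(x, y, max_lag):
    """Lag with the strongest correlation: an estimate of the delay
    with which y's fluctuations are followed by x's."""
    lags = np.arange(-max_lag, max_lag + 1)
    corrs = [lagged_corr(x, y, int(k)) for k in lags]
    return int(lags[int(np.argmax(corrs))])

# toy example: city B's series trails city A's by 3 time steps
rng = np.random.default_rng(0)
city_a = rng.normal(size=500)
city_b = np.roll(city_a, 3)          # b[t] = a[t - 3] (circularly shifted)
```

Scanning `best_lag(city_b, city_a, 5)` recovers the 3-step delay of city B behind city A.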

  17. Nonlinear Dynamics, Poor Data, and What to Make of Them?

    NASA Astrophysics Data System (ADS)

    Ghil, M.; Zaliapin, I. V.

    2005-12-01

    The analysis of univariate or multivariate time series provides crucial information to describe, understand, and predict variability in the geosciences. The discovery and implementation of a number of novel methods for extracting useful information from time series has recently revitalized this classical field of study. Considerable progress has also been made in interpreting the information so obtained in terms of dynamical systems theory. In this talk we will describe the connections between time series analysis and nonlinear dynamics, discuss signal-to-noise enhancement, and present some of the novel methods for spectral analysis. These fall into two broad categories: (i) methods that try to ferret out regularities of the time series; and (ii) methods aimed at describing the characteristics of irregular processes. The former include singular-spectrum analysis (SSA), the multi-taper method (MTM), and the maximum-entropy method (MEM). The various steps, as well as the advantages and disadvantages of these methods, will be illustrated by their application to several important climatic time series, such as the Southern Oscillation Index (SOI), paleoclimatic time series, and instrumental temperature time series. The SOI index captures major features of interannual climate variability and is used extensively in its prediction. The other time series cover interdecadal and millennial time scales. The second category includes the calculation of fractional dimension, leading Lyapunov exponents, and Hurst exponents. More recently, multi-trend analysis (MTA), binary-decomposition analysis (BDA), and related methods have attempted to describe the structure of time series that include both regular and irregular components. Within the time available, I will try to give a feeling for how these methods work, and how well.

  18. Nonlinear dynamics of the atmospheric pollutants in Mexico City

    NASA Astrophysics Data System (ADS)

    Muñoz-Diosdado, Alejandro; Barrera-Ferrer, Amilcar; Angulo-Brown, Fernando

    2014-05-01

    The atmospheric pollution in the Metropolitan Zone of Mexico City (MZMC) is a serious problem with social, economic and political consequences, since the region concentrates both the largest share of the country's population and a great part of its commercial and industrial activity. According to the World Health Organization, maximum permissible concentrations of atmospheric pollutants are frequently exceeded. In the MZMC, environmental monitoring has been limited to criteria pollutants, so named because their measured atmospheric levels indicate air quality in a precise way. The Automatic Atmospheric Monitoring Network monitors and registers pollutant concentrations in the air of the MZMC; it currently comprises approximately 35 automatically equipped remote stations, each reporting hourly. Local and global invariant quantities have been widely used to describe the fractal properties of diverse time series. In the study of certain time series it is often assumed that they are monofractal, meaning that they can be described with a single fractal dimension. This hypothesis is unrealistic, however, because many time series are heterogeneous and nonstationary: their scaling properties are not the same throughout time, and such series may therefore require more than one fractal dimension for their description. The complexity of atmospheric pollutant dynamics suggests analyzing the time series of hourly concentration registers with the multifractal formalism. So, in this work, air concentration time series of MZMC criteria pollutants were studied with the proposed method. The pollutants chosen for this analysis are ozone, sulfur dioxide, carbon monoxide, nitrogen dioxide and PM10 (particles smaller than 10 micrometers). We found that the pollutant air concentration time series are multifractal. 
The degree of multifractality of a time series indicates its complexity: the more multifractal the series, the more complex both the series and the system from which the measurements were obtained. We studied the variation of the degree of multifractality over time by calculating the multifractal spectra of each year's time series, following the variation at each monitoring station from 1990 until 2013. Multifractal analysis can also tell us what kinds of correlations are present in the time series, and it is interesting to consider how these correlations vary over time. Our results show that, for all pollutants and all monitoring stations, the time series have long-range correlations and are highly persistent.

  19. A time-series approach to dynamical systems from classical and quantum worlds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fossion, Ruben

    2014-01-08

    This contribution discusses some recent applications of time-series analysis in Random Matrix Theory (RMT), and applications of RMT in the statistical analysis of eigenspectra of correlation matrices of multivariate time series.

  20. Evaluation of nonlinearity and validity of nonlinear modeling for complex time series.

    PubMed

    Suzuki, Tomoya; Ikeguchi, Tohru; Suzuki, Masuo

    2007-10-01

    Even if an original time series exhibits nonlinearity, it is not always effective to approximate the time series by a nonlinear model because such nonlinear models have high complexity from the viewpoint of information criteria. Therefore, we propose two measures to evaluate both the nonlinearity of a time series and validity of nonlinear modeling applied to it by nonlinear predictability and information criteria. Through numerical simulations, we confirm that the proposed measures effectively detect the nonlinearity of an observed time series and evaluate the validity of the nonlinear model. The measures are also robust against observational noises. We also analyze some real time series: the difference of the number of chickenpox and measles patients, the number of sunspots, five Japanese vowels, and the chaotic laser. We can confirm that the nonlinear model is effective for the Japanese vowel /a/, the difference of the number of measles patients, and the chaotic laser.

  1. Time series analysis of the developed financial markets' integration using visibility graphs

    NASA Astrophysics Data System (ADS)

    Zhuang, Enyu; Small, Michael; Feng, Gang

    2014-09-01

    A time series representing the developed financial markets' segmentation from 1973 to 2012 is studied. The time series reveals an obvious market integration trend. To further uncover the features of this time series, we divide it into seven windows and generate seven visibility graphs. The measuring capabilities of the visibility graphs provide means to quantitatively analyze the original time series. It is found that the important historical incidents that influenced market integration coincide with variations in the measured graphical node degree. Through the measure of neighborhood span, the frequencies of the historical incidents are disclosed. Moreover, it is also found that large "cycles" and significant noise in the time series are linked to large and small communities in the generated visibility graphs. For large cycles, how historical incidents significantly affected market integration is distinguished by density and compactness of the corresponding communities.
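
    A natural visibility graph, the standard construction underlying such analyses, connects two samples when the straight line between them passes above every intermediate sample. A minimal O(n²) sketch of the mapping itself (the windowing and community measures of the paper are not reproduced):

```python
def visibility_graph(x):
    """Natural visibility graph: samples a and b are connected when every
    intermediate sample lies strictly below the straight line joining
    (a, x[a]) and (b, x[b])."""
    n = len(x)
    adj = {i: set() for i in range(n)}
    for a in range(n):
        for b in range(a + 1, n):
            if all(x[c] < x[b] + (x[a] - x[b]) * (b - c) / (b - a)
                   for c in range(a + 1, b)):
                adj[a].add(b)
                adj[b].add(a)
    return adj

# tiny worked example: peaks "see" farther than valleys
adj = visibility_graph([1, 3, 2, 4])
degrees = [len(adj[i]) for i in sorted(adj)]   # node degrees
```

Node 1 (the first peak, value 3) sees nodes 0, 2, and 3, while node 0 (a low starting value) sees only its neighbor; the degree sequence carries the information the paper measures.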

  2. Evaluation of nonlinearity and validity of nonlinear modeling for complex time series

    NASA Astrophysics Data System (ADS)

    Suzuki, Tomoya; Ikeguchi, Tohru; Suzuki, Masuo

    2007-10-01

    Even if an original time series exhibits nonlinearity, it is not always effective to approximate the time series by a nonlinear model because such nonlinear models have high complexity from the viewpoint of information criteria. Therefore, we propose two measures to evaluate both the nonlinearity of a time series and validity of nonlinear modeling applied to it by nonlinear predictability and information criteria. Through numerical simulations, we confirm that the proposed measures effectively detect the nonlinearity of an observed time series and evaluate the validity of the nonlinear model. The measures are also robust against observational noises. We also analyze some real time series: the difference of the number of chickenpox and measles patients, the number of sunspots, five Japanese vowels, and the chaotic laser. We can confirm that the nonlinear model is effective for the Japanese vowel /a/, the difference of the number of measles patients, and the chaotic laser.

  3. Change point detection of the Persian Gulf sea surface temperature

    NASA Astrophysics Data System (ADS)

    Shirvani, A.

    2017-01-01

    In this study, the Student's t parametric and Mann-Whitney nonparametric change point models (CPMs) were applied to detect a change point in the annual Persian Gulf sea surface temperature anomalies (PGSSTA) time series for the period 1951-2013. The PGSSTA time series, which were serially correlated, were transformed to produce an uncorrelated pre-whitened time series. The pre-whitened PGSSTA time series were used as the input to the change point models. Both the applied parametric and nonparametric CPMs estimated the change point in the PGSSTA in 1992. The PGSSTA follow the normal distribution both before and after 1992, but with a different mean value after the change point. The estimated slope of the linear trend in the PGSSTA time series for the period 1951-1992 was negative, whereas it was positive after the detected change point. Unlike for the PGSSTA, the applied CPMs suggested no change point in the Niño3.4 SSTA time series.
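
    The pre-whitening and change point steps can be sketched as follows, assuming NumPy. The batch maximum-t scan below is a simplified analogue of the Student-t CPM (without the sequential control limits used in practice), and the data are synthetic:

```python
import numpy as np

def prewhiten(x):
    """Remove lag-1 serial correlation (w[t] = x[t] - r1*x[t-1]) so that a
    change point test is not fooled by autocorrelation."""
    x = np.asarray(x, dtype=float)
    xc = x - x.mean()
    r1 = (xc[:-1] * xc[1:]).sum() / (xc * xc).sum()
    return x[1:] - r1 * x[:-1]

def lag1(x):
    """Sample lag-1 autocorrelation."""
    xc = np.asarray(x, dtype=float) - np.mean(x)
    return (xc[:-1] * xc[1:]).sum() / (xc * xc).sum()

def t_change_point(x):
    """Most likely change point by the maximum two-sample t statistic."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    best_k, best_t = None, -np.inf
    for k in range(2, n - 1):
        a, b = x[:k], x[k:]
        sp2 = ((k - 1) * a.var(ddof=1) + (n - k - 1) * b.var(ddof=1)) / (n - 2)
        t = abs(a.mean() - b.mean()) / np.sqrt(sp2 * (1.0 / k + 1.0 / (n - k)))
        if t > best_t:
            best_k, best_t = k, t
    return best_k, best_t

rng = np.random.default_rng(1)
# AR(1) series: pre-whitening removes most of the serial correlation
ar = np.zeros(500)
eps = rng.normal(size=500)
for i in range(1, 500):
    ar[i] = 0.8 * ar[i - 1] + eps[i]
w = prewhiten(ar)

# synthetic anomalies with a mean shift of 5 noise standard deviations at index 40
anoms = np.concatenate([rng.normal(0.0, 0.1, 40), rng.normal(0.5, 0.1, 40)])
k_hat, t_stat = t_change_point(anoms)
```

The scan locates the shift at (or immediately beside) index 40, and the pre-whitened AR series has near-zero lag-1 autocorrelation.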

  4. A high-fidelity weather time series generator using the Markov Chain process on a piecewise level

    NASA Astrophysics Data System (ADS)

    Hersvik, K.; Endrerud, O.-E. V.

    2017-12-01

    A method is developed for generating a set of unique weather time-series based on an existing weather series. The method allows statistically valid weather variations to take place within repeated simulations of offshore operations. The numerous generated time series need to share the same statistical qualities as the original time series. Statistical qualities here refer mainly to the distribution of weather windows available for work, including durations and frequencies of such weather windows, and seasonal characteristics. The method is based on the Markov chain process. The core new development lies in how the Markov Process is used, specifically by joining small pieces of random length time series together rather than joining individual weather states, each from a single time step, which is a common solution found in the literature. This new Markov model shows favorable characteristics with respect to the requirements set forth and all aspects of the validation performed.
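
    The piece-joining idea can be sketched as follows: from the current weather state, jump to a random historical position in the same (binned) state and copy a piece of random length. This is a schematic reading of the method, not the authors' implementation; the equal-width binning and the piece-length range are assumptions:

```python
import math
import random

def generate(series, length, n_bins=4, max_piece=24, seed=0):
    """Generate a synthetic series by joining random-length pieces of the
    historical series, choosing each piece to start from a position whose
    binned state matches the current state (Markov property on pieces)."""
    rng = random.Random(seed)
    lo, hi = min(series), max(series)
    def state(v):                        # coarse weather state of a value
        return min(n_bins - 1, int((v - lo) / (hi - lo + 1e-12) * n_bins))
    by_state = {}                        # historical positions indexed by state
    for i, v in enumerate(series[:-1]):
        by_state.setdefault(state(v), []).append(i)
    out = [series[0]]
    while len(out) < length:
        positions = by_state.get(state(out[-1]))
        if not positions:                # fallback for an unseen state
            positions = list(range(len(series) - 1))
        i = rng.choice(positions)
        out.extend(series[i + 1 : i + 1 + rng.randint(1, max_piece)])
    return out[:length]

# toy "significant wave height" record with seasonal structure
hs = [2.0 + 1.5 * math.sin(t / 10.0) for t in range(500)]
synthetic = generate(hs, 300)
```

Because whole pieces are copied, short weather windows and their durations are inherited from the original record rather than re-sampled step by step.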

  5. Recurrence plots revisited

    NASA Astrophysics Data System (ADS)

    Casdagli, M. C.

    1997-09-01

    We show that recurrence plots (RPs) give detailed characterizations of time series generated by dynamical systems driven by slowly varying external forces. For deterministic systems we show that RPs of the time series can be used to reconstruct the RP of the driving force if it varies sufficiently slowly. If the driving force is one-dimensional, its functional form can then be inferred up to an invertible coordinate transformation. The same results hold for stochastic systems if the RP of the time series is suitably averaged and transformed. These results are used to investigate the nonlinear prediction of time series generated by dynamical systems driven by slowly varying external forces. We also consider the problem of detecting a small change in the driving force, and propose a surrogate data technique for assessing statistical significance. Numerically simulated time series and a time series of respiration rates recorded from a subject with sleep apnea are used as illustrative examples.
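
    A recurrence plot is built by thresholding pairwise distances between delay-embedded states. A minimal sketch assuming NumPy (the embedding dimension, delay, and threshold below are illustrative choices):

```python
import numpy as np

def recurrence_plot(x, dim=2, delay=1, eps=0.1):
    """Binary recurrence matrix: R[i, j] = 1 when the delay-embedded
    states i and j lie within distance eps of each other."""
    n = len(x) - (dim - 1) * delay
    emb = np.column_stack([x[i * delay : i * delay + n] for i in range(dim)])
    d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    return (d < eps).astype(int)

x = np.sin(np.linspace(0, 8 * np.pi, 200))     # periodic toy signal
R = recurrence_plot(x, dim=2, delay=1, eps=0.2)
```

For the periodic signal the matrix shows diagonal lines spaced one period (about 50 samples) apart; a slowly varying drive would instead modulate the line pattern, which is what the paper exploits.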

  6. A simple and fast representation space for classifying complex time series

    NASA Astrophysics Data System (ADS)

    Zunino, Luciano; Olivares, Felipe; Bariviera, Aurelio F.; Rosso, Osvaldo A.

    2017-03-01

    In the context of time series analysis, considerable effort has been directed towards the implementation of efficient discriminating statistical quantifiers. Very recently, a simple and fast representation space has been introduced, namely the number of turning points versus the Abbe value. It is able to separate time series from stationary and non-stationary processes with long-range dependences. In this work we show that this bidimensional approach is useful for distinguishing complex time series: different sets of financial and physiological data are efficiently discriminated. Additionally, a multiscale generalization that takes into account the multiple time scales often involved in complex systems has also been proposed. This multiscale analysis is essential to reach a higher discriminative power between physiological time series in health and disease.
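
    The two coordinates of the representation space are straightforward to compute: the Abbe value (half the mean squared successive difference over the variance) and the fraction of turning points. A sketch assuming NumPy; the demo series are hypothetical:

```python
import numpy as np

def abbe(x):
    """Abbe value: half the mean squared successive difference over the
    variance; near 1 for white noise, near 0 for smooth series."""
    x = np.asarray(x, dtype=float)
    return 0.5 * np.mean(np.diff(x) ** 2) / np.var(x)

def turning_points(x):
    """Fraction of interior samples that are strict local maxima or minima;
    about 2/3 for white noise, small for smooth series."""
    d = np.sign(np.diff(np.asarray(x, dtype=float)))
    return float(np.mean(d[:-1] * d[1:] < 0))

rng = np.random.default_rng(2)
noise = rng.normal(size=2000)                       # irregular process
smooth = np.sin(np.linspace(0, 4 * np.pi, 2000))    # smooth process
```

Plotting each series as the point (turning_points, abbe) separates the two processes by a wide margin, which is the discrimination principle of the paper.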

  7. Using SAR satellite data time series for regional glacier mapping

    NASA Astrophysics Data System (ADS)

    Winsvold, Solveig H.; Kääb, Andreas; Nuth, Christopher; Andreassen, Liss M.; van Pelt, Ward J. J.; Schellenberger, Thomas

    2018-03-01

    With dense SAR satellite data time series it is possible to map surface and subsurface glacier properties that vary in time. On Sentinel-1A and RADARSAT-2 backscatter time series images over mainland Norway and Svalbard, we outline how to map glaciers using descriptive methods. We present five application scenarios. The first shows potential for tracking transient snow lines with SAR backscatter time series and correlates with both optical satellite images (Sentinel-2A and Landsat 8) and equilibrium line altitudes derived from in situ surface mass balance data. In the second application scenario, time series representation of glacier facies corresponding to SAR glacier zones shows potential for a more accurate delineation of the zones and how they change in time. The third application scenario investigates the firn evolution using dense SAR backscatter time series together with a coupled energy balance and multilayer firn model. We find strong correlation between backscatter signals and both the modeled firn air content and the modeled wetness in the firn. In the fourth application scenario, we highlight how winter rain events can be detected in SAR time series, revealing important information about the area extent of internal accumulation. In the last application scenario, averaged summer SAR images were found to have potential in assisting the process of mapping glacier outlines, especially in the presence of seasonal snow. Altogether we present examples of how to map glaciers and to further understand glaciological processes using the existing and future massive amount of multi-sensor time series data.

  8. Magnitude and sign of long-range correlated time series: Decomposition and surrogate signal generation.

    PubMed

    Gómez-Extremera, Manuel; Carpena, Pedro; Ivanov, Plamen Ch; Bernaola-Galván, Pedro A

    2016-04-01

    We systematically study the scaling properties of the magnitude and sign of the fluctuations in correlated time series, which is a simple and useful approach to distinguish between systems with different dynamical properties but the same linear correlations. First, we decompose artificial long-range power-law linearly correlated time series into magnitude and sign series derived from the consecutive increments in the original series, and we study their correlation properties. We find analytical expressions for the correlation exponent of the sign series as a function of the exponent of the original series. Such expressions are necessary for modeling surrogate time series with desired scaling properties. Next, we study linear and nonlinear correlation properties of series composed as products of independent magnitude and sign series. These surrogate series can be considered as a zero-order approximation to the analysis of the coupling of magnitude and sign in real data, a problem still open in many fields. We find analytical results for the scaling behavior of the composed series as a function of the correlation exponents of the magnitude and sign series used in the composition, and we determine the ranges of magnitude and sign correlation exponents leading to either single scaling or to crossover behaviors. Finally, we obtain how the linear and nonlinear properties of the composed series depend on the correlation exponents of their magnitude and sign series. Based on this information we propose a method to generate surrogate series with controlled correlation exponent and multifractal spectrum.
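
    The decomposition itself is elementary: take consecutive increments, then split them into absolute value and sign. A sketch assuming NumPy (the correlation-exponent analysis of the paper is not reproduced):

```python
import numpy as np

def magnitude_sign(x):
    """Split a series' consecutive increments into magnitude and sign."""
    dx = np.diff(np.asarray(x, dtype=float))
    return np.abs(dx), np.sign(dx)

def compose(mag, sgn):
    """Surrogate path whose increments are the product of (possibly
    independent) magnitude and sign series."""
    return np.cumsum(mag * sgn)

mag, sgn = magnitude_sign([0, 1, -1, 2])   # increments: 1, -2, 3
path = compose(mag, sgn)                   # recovers the original increments
```

Shuffling one factor (say, the sign series) before composing breaks the magnitude-sign coupling while preserving each factor's own distribution, which is the surrogate idea studied in the paper.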

  9. Compression based entropy estimation of heart rate variability on multiple time scales.

    PubMed

    Baumert, Mathias; Voss, Andreas; Javorka, Michal

    2013-01-01

    Heart rate fluctuates beat by beat in a complex manner. The aim of this study was to develop a framework for entropy assessment of heart rate fluctuations on multiple time scales. We employed the Lempel-Ziv algorithm for lossless data compression to investigate the compressibility of RR interval time series on different time scales, using a coarse-graining procedure. We estimated the entropy of RR interval time series of 20 young and 20 old subjects and also investigated the compressibility of randomly shuffled surrogate RR time series. The original RR time series displayed significantly smaller compression entropy values than randomized RR interval data. The RR interval time series of older subjects showed significantly different entropy characteristics over multiple time scales than those of younger subjects. In conclusion, data compression may be a useful approach for multiscale entropy assessment of heart rate variability.
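
    The multiscale compression idea can be sketched with coarse-graining plus a general-purpose lossless compressor. Here zlib stands in for the Lempel-Ziv implementation of the paper, and the quantization into 16 symbols is an assumption:

```python
import zlib
import numpy as np

def coarse_grain(rr, scale):
    """Average the RR series over non-overlapping windows of a given scale."""
    rr = np.asarray(rr, dtype=float)
    n = len(rr) // scale
    return rr[: n * scale].reshape(n, scale).mean(axis=1)

def compression_entropy(rr, scale=1, n_symbols=16):
    """Compressed size over original size of the quantized, coarse-grained
    series; smaller values mean a more structured (compressible) series."""
    cg = coarse_grain(rr, scale)
    lo, hi = cg.min(), cg.max()
    sym = ((cg - lo) / (hi - lo + 1e-12) * (n_symbols - 1)).astype(np.uint8)
    raw = sym.tobytes()
    return len(zlib.compress(raw, 9)) / len(raw)

rng = np.random.default_rng(3)
shuffled = rng.normal(800, 50, 4000)                   # like randomized surrogates
structured = 800 + 50 * np.sin(np.arange(4000) / 20)   # strongly correlated
```

As in the study, structured (correlated) intervals compress far better than shuffled ones, and repeating the measurement at several scales gives the multiscale profile.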

  10. Short-term versus long-term rainfall time series in the assessment of potable water savings by using rainwater in houses.

    PubMed

    Ghisi, Enedir; Cardoso, Karla Albino; Rupp, Ricardo Forgiarini

    2012-06-15

    The main objective of this article is to assess the possibility of using short-term instead of long-term rainfall time series to evaluate the potential for potable water savings by using rainwater in houses. The analysis was performed considering rainfall data from 1960 to 1995 for the city of Santa Bárbara do Oeste, located in the state of São Paulo, southeastern Brazil. The influence of the rainfall time series, roof area, potable water demand and percentage rainwater demand on the potential for potable water savings was evaluated. The potential for potable water savings was estimated using computer simulations considering a set of long-term rainfall time series and different sets of short-term rainfall time series. The ideal rainwater tank capacity was also assessed for some cases. It was observed that the higher the percentage rainwater demand and the shorter the rainfall time series, the larger the difference in the estimated potential for potable water savings and the greater the variation in the ideal rainwater tank size. The sets of short-term rainfall time series considered adequate for different scenarios ranged from 1 to 13 years depending on the roof area, percentage rainwater demand and potable water demand. The main finding of the research is that sets of short-term rainfall time series can be used to assess the potential for potable water savings by using rainwater, as the results obtained are similar to those obtained from the long-term rainfall time series. Copyright © 2012 Elsevier Ltd. All rights reserved.
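
    The computer simulations behind such estimates are typically a daily tank water balance. A minimal yield-after-spillage sketch (the authors' simulation details are not reproduced; the parameter names and runoff coefficient are assumptions):

```python
def simulate_savings(rain_mm, roof_m2, demand_l_per_day, tank_l,
                     runoff_coef=0.8):
    """Daily tank water balance (yield after spillage): harvested rainwater
    meets as much of the daily demand as storage allows. Returns the
    fraction of total demand met by rainwater, i.e. the potential for
    potable water savings."""
    stored = 0.0
    supplied = 0.0
    for rain in rain_mm:
        inflow = rain * roof_m2 * runoff_coef     # 1 mm on 1 m2 = 1 litre
        stored = min(tank_l, stored + inflow)     # spill above capacity
        use = min(stored, demand_l_per_day)
        supplied += use
        stored -= use
    return supplied / (demand_l_per_day * len(rain_mm))

# toy series: rain every other day easily covers demand on wet days
savings = simulate_savings([10.0, 0.0] * 100, roof_m2=100.0,
                           demand_l_per_day=500.0, tank_l=5000.0)
```

Running the same balance over a short versus a long rainfall record, and comparing the resulting savings fractions, is exactly the comparison the article makes.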

  11. TEMPORAL SIGNATURES OF AIR QUALITY OBSERVATIONS AND MODEL OUTPUTS: DO TIME SERIES DECOMPOSITION METHODS CAPTURE RELEVANT TIME SCALES?

    EPA Science Inventory

    Time series decomposition methods were applied to meteorological and air quality data and their numerical model estimates. Decomposition techniques express a time series as the sum of a small number of independent modes which hypothetically represent identifiable forcings, thereb...

  12. A new method for reconstruction of solar irradiance

    NASA Astrophysics Data System (ADS)

    Privalsky, Victor

    2018-07-01

    The purpose of this research is to show how time series should be reconstructed using an example with data on the total solar irradiance (TSI) of the Earth and on sunspot numbers (SSN) since 1749. The traditional approach through regression equation(s) is designed for time-invariant vectors of random variables and is not applicable to time series, which present random functions of time. The autoregressive reconstruction (ARR) method suggested here requires fitting a multivariate stochastic difference equation to the target/proxy time series. The reconstruction is done through the scalar equation for the target time series with the white noise term excluded. The time series approach is shown to provide a better reconstruction of TSI than the correlation/regression method. A reconstruction criterion is introduced which allows one to define in advance the achievable level of success in the reconstruction. The conclusion is that time series, including the total solar irradiance, cannot be reconstructed properly if the data are not treated as sample records of random processes and analyzed in both time and frequency domains.

  13. Pearson correlation estimation for irregularly sampled time series

    NASA Astrophysics Data System (ADS)

    Rehfeld, K.; Marwan, N.; Heitzig, J.; Kurths, J.

    2012-04-01

    Many applications in the geosciences call for the joint and objective analysis of irregular time series. For automated processing, robust measures of linear and nonlinear association are needed. Up to now, the standard approach would have been to reconstruct the time series on a regular grid, using linear or spline interpolation. Interpolation, however, comes with systematic side-effects, as it increases the auto-correlation in the time series. We have searched for the best method to estimate Pearson correlation for irregular time series, i.e. the one with the lowest estimation bias and variance. We adapted a kernel-based approach, using Gaussian weights. Pearson correlation is calculated, in principle, as a mean over products of previously centralized observations. In the regularly sampled case, observations in both time series were observed at the same time and thus the allocation of measurement values into pairs of products is straightforward. In the irregularly sampled case, however, measurements were not necessarily observed at the same time. Now, the key idea of the kernel-based method is to calculate weighted means of products, with the weight depending on the time separation between the observations. If the lagged correlation function is desired, the weights depend on the absolute difference between observation time separation and the estimation lag. To assess the applicability of the approach we used extensive simulations to determine the extent of interpolation side-effects with increasing irregularity of time series. We compared different approaches, based on (linear) interpolation, the Lomb-Scargle Fourier Transform, the sinc kernel and the Gaussian kernel. We investigated the role of kernel bandwidth and signal-to-noise ratio in the simulations. We found that the Gaussian kernel approach offers significant advantages and low Root-Mean Square Errors for regular, slightly irregular and very irregular time series. 
We therefore conclude that it is a good (linear) similarity measure that is appropriate for irregular time series with skewed inter-sampling time distributions.
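
    The kernel-based estimator can be sketched directly from the description: a weighted mean over all products of standardized observations, with Gaussian weights in the time separation. A sketch assuming NumPy, with hypothetical irregularly sampled toy series:

```python
import numpy as np

def gaussian_kernel_corr(tx, x, ty, y, h):
    """Kernel-based Pearson correlation for irregularly sampled series:
    a weighted mean over all products of standardized observations, with
    Gaussian weights in the time separation between samples (bandwidth h)."""
    xs = (x - x.mean()) / x.std()
    ys = (y - y.mean()) / y.std()
    dt = tx[:, None] - ty[None, :]           # all pairwise time separations
    w = np.exp(-0.5 * (dt / h) ** 2)         # Gaussian kernel weights
    return float(np.sum(w * xs[:, None] * ys[None, :]) / np.sum(w))

# two irregular samplings of a common slowly varying signal
rng = np.random.default_rng(4)
tx = np.sort(rng.uniform(0, 100, 80))
ty = np.sort(rng.uniform(0, 100, 80))
x = np.sin(tx / 5.0)
y = np.sin(ty / 5.0)
r_pos = gaussian_kernel_corr(tx, x, ty, y, h=1.0)
r_neg = gaussian_kernel_corr(tx, x, ty, -y, h=1.0)
```

No interpolation is performed, so the estimator avoids the artificial increase in autocorrelation the authors describe; for a lagged correlation function, the weights would instead depend on the difference between the time separation and the estimation lag.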

  14. Drunk driving detection based on classification of multivariate time series.

    PubMed

    Li, Zhenlong; Jin, Xue; Zhao, Xiaohua

    2015-09-01

    This paper addresses the problem of detecting drunk driving based on classification of multivariate time series. First, driving performance measures were collected from a test in a driving simulator located in the Traffic Research Center, Beijing University of Technology. Lateral position and steering angle were used to detect drunk driving. Second, multivariate time series analysis was performed to extract the features. A piecewise linear representation was used to represent multivariate time series. A bottom-up algorithm was then employed to separate multivariate time series. The slope and time interval of each segment were extracted as the features for classification. Third, a support vector machine classifier was used to classify driver's state into two classes (normal or drunk) according to the extracted features. The proposed approach achieved an accuracy of 80.0%. Drunk driving detection based on the analysis of multivariate time series is feasible and effective. The approach has implications for drunk driving detection. Copyright © 2015 Elsevier Ltd and National Safety Council. All rights reserved.

  15. Nonstationary time series prediction combined with slow feature analysis

    NASA Astrophysics Data System (ADS)

    Wang, G.; Chen, X.

    2015-07-01

    Almost all climate time series have some degree of nonstationarity due to external driving forces perturbing the observed system. Therefore, these external driving forces should be taken into account when constructing the climate dynamics. This paper presents a new technique of obtaining the driving forces of a time series from the slow feature analysis (SFA) approach, and then introduces them into a predictive model to predict nonstationary time series. The basic theory of the technique is to consider the driving forces as state variables and to incorporate them into the predictive model. Experiments using a modified logistic time series and winter ozone data in Arosa, Switzerland, were conducted to test the model. The results showed improved prediction skills.

  16. miniSEED: The Backbone Data Format for Seismological Time Series

    NASA Astrophysics Data System (ADS)

    Ahern, T. K.; Benson, R. B.; Trabant, C. M.

    2017-12-01

    In 1987, the International Federation of Digital Seismograph Networks (FDSN) adopted the Standard for the Exchange of Earthquake Data (SEED) format to be used for data archiving and exchange of seismological time series data. Since that time, the format has evolved to accommodate new capabilities and features. For example, a notable change in 1992 allowed the format, which includes both the comprehensive metadata and the time series samples, to be used in two additional forms: 1) a container for metadata only called "dataless SEED", and 2) a stand-alone structure for time series called "miniSEED". While specifically designed for seismological data and related metadata, this format has proven to be a useful format for a wide variety of geophysical time series data. Many FDSN data centers now store temperature, pressure, infrasound, tilt and other time series measurements in this internationally used format. Since April 2016, members of the FDSN have been in discussions to design a next-generation miniSEED format to accommodate current and future needs, to further generalize the format, and to address a number of historical problems or limitations. We believe the correct approach is to simplify the header, allow for arbitrary header additions, expand the current identifiers, and allow for anticipated future identifiers which are currently unknown. We also believe the primary goal of the format is the efficient archiving, selection and exchange of time series data. By focusing on these goals we avoid trying to generalize the format too broadly into specialized areas such as efficient low-latency delivery, or including unbounded non-time series data. Our presentation will provide an overview of this format and highlight its most valuable characteristics for time series data from any geophysical domain or beyond.

  17. Alpine Grassland Phenology as Seen in AVHRR, VEGETATION, and MODIS NDVI Time Series - a Comparison with In Situ Measurements

    PubMed Central

    Fontana, Fabio; Rixen, Christian; Jonas, Tobias; Aberegg, Gabriel; Wunderle, Stefan

    2008-01-01

    This study evaluates the ability to track grassland growth phenology in the Swiss Alps with NOAA-16 Advanced Very High Resolution Radiometer (AVHRR) Normalized Difference Vegetation Index (NDVI) time series. Three growth parameters from 15 alpine and subalpine grassland sites were investigated between 2001 and 2005: Melt-Out (MO), Start Of Growth (SOG), and End Of Growth (EOG). We tried to estimate these phenological dates from yearly NDVI time series by identifying dates, where certain fractions (thresholds) of the maximum annual NDVI amplitude were crossed for the first time. For this purpose, the NDVI time series were smoothed using two commonly used approaches (Fourier adjustment or alternatively Savitzky-Golay filtering). Moreover, AVHRR NDVI time series were compared against data from the newer generation sensors SPOT VEGETATION and TERRA MODIS. All remote sensing NDVI time series were highly correlated with single point ground measurements and therefore accurately represented growth dynamics of alpine grassland. The newer generation sensors VGT and MODIS performed better than AVHRR, however, differences were minor. Thresholds for the determination of MO, SOG, and EOG were similar across sensors and smoothing methods, which demonstrated the robustness of the results. For our purpose, the Fourier adjustment algorithm created better NDVI time series than the Savitzky-Golay filter, since the latter appeared to be more sensitive to noisy NDVI time series. Findings show that the application of various thresholds to NDVI time series allows the observation of the temporal progression of vegetation growth at the selected sites with high consistency. Hence, we believe that our study helps to better understand large-scale vegetation growth dynamics above the tree line in the European Alps. PMID:27879852
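
    The threshold idea, i.e. the first date at which NDVI crosses a given fraction of its annual amplitude, can be sketched as follows, assuming NumPy; the logistic green-up curve is synthetic, and the smoothing step (Fourier or Savitzky-Golay) is omitted:

```python
import numpy as np

def threshold_date(doy, ndvi, frac):
    """First day of year at which the NDVI curve crosses frac of its
    annual amplitude, linearly interpolated between composite dates."""
    level = ndvi.min() + frac * (ndvi.max() - ndvi.min())
    i = int(np.argmax(ndvi >= level))      # first composite at/above level
    if i == 0:
        return float(doy[0])
    t0, t1, v0, v1 = doy[i - 1], doy[i], ndvi[i - 1], ndvi[i]
    return float(t0 + (level - v0) / (v1 - v0) * (t1 - t0))

# synthetic 8-day composites with a logistic green-up centered on day 150
doy = np.arange(1, 366, 8).astype(float)
ndvi = 0.2 + 0.6 / (1.0 + np.exp(-(doy - 150.0) / 15.0))
sog = threshold_date(doy, ndvi, 0.5)       # e.g. a Start Of Growth threshold
```

Different fractions yield different phenological dates (a low fraction for melt-out, higher ones for start and end of growth), which is how the study derives MO, SOG, and EOG from one smoothed curve.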

  18. Alpine Grassland Phenology as Seen in AVHRR, VEGETATION, and MODIS NDVI Time Series - a Comparison with In Situ Measurements.

    PubMed

    Fontana, Fabio; Rixen, Christian; Jonas, Tobias; Aberegg, Gabriel; Wunderle, Stefan

    2008-04-23

    This study evaluates the ability to track grassland growth phenology in the Swiss Alps with NOAA-16 Advanced Very High Resolution Radiometer (AVHRR) Normalized Difference Vegetation Index (NDVI) time series. Three growth parameters from 15 alpine and subalpine grassland sites were investigated between 2001 and 2005: Melt-Out (MO), Start Of Growth (SOG), and End Of Growth (EOG). We tried to estimate these phenological dates from yearly NDVI time series by identifying the dates at which certain fractions (thresholds) of the maximum annual NDVI amplitude were first crossed. For this purpose, the NDVI time series were smoothed using two commonly used approaches (Fourier adjustment or, alternatively, Savitzky-Golay filtering). Moreover, AVHRR NDVI time series were compared against data from the newer-generation sensors SPOT VEGETATION and TERRA MODIS. All remote sensing NDVI time series were highly correlated with single-point ground measurements and therefore accurately represented growth dynamics of alpine grassland. The newer-generation sensors VGT and MODIS performed better than AVHRR; however, the differences were minor. Thresholds for the determination of MO, SOG, and EOG were similar across sensors and smoothing methods, which demonstrates the robustness of the results. For our purpose, the Fourier adjustment algorithm created better NDVI time series than the Savitzky-Golay filter, since the latter appeared to be more sensitive to noisy NDVI time series. The findings show that applying various thresholds to NDVI time series allows the temporal progression of vegetation growth at the selected sites to be observed with high consistency. Hence, we believe that our study helps to better understand large-scale vegetation growth dynamics above the tree line in the European Alps.

  19. On the equivalence of case-crossover and time series methods in environmental epidemiology.

    PubMed

    Lu, Yun; Zeger, Scott L

    2007-04-01

    The case-crossover design was introduced in epidemiology 15 years ago as a method for studying the effects of a risk factor on a health event using only cases. The idea is to compare a case's exposure immediately prior to or during the case-defining event with that same person's exposure at otherwise similar "reference" times. An alternative approach to the analysis of daily exposure and case-only data is time series analysis. Here, log-linear regression models express the expected total number of events on each day as a function of the exposure level and potential confounding variables. In time series analyses of air pollution, smooth functions of time and weather are the main confounders. Time series and case-crossover methods are often viewed as competing methods. In this paper, we show that case-crossover using conditional logistic regression is a special case of time series analysis when there is a common exposure such as in air pollution studies. This equivalence provides computational convenience for case-crossover analyses and a better understanding of time series models. Time series log-linear regression accounts for overdispersion of the Poisson variance, while case-crossover analyses typically do not. This equivalence also permits model checking for case-crossover data using standard log-linear model diagnostics.

  20. Spectral analysis for GNSS coordinate time series using chirp Fourier transform

    NASA Astrophysics Data System (ADS)

    Feng, Shengtao; Bo, Wanju; Ma, Qingzun; Wang, Zifan

    2017-12-01

    Spectral analysis of global navigation satellite system (GNSS) coordinate time series provides a principal tool for understanding the intrinsic mechanisms that affect tectonic movements. Spectral analysis methods such as the fast Fourier transform, the Lomb-Scargle spectrum, the evolutionary power spectrum, the wavelet power spectrum, etc. are used to find periodic characteristics in time series. Among these methods, the chirp Fourier transform (CFT), which has less stringent requirements, is tested with synthetic and actual GNSS coordinate time series, which proves the accuracy and efficiency of the method. With the series length limited only to even numbers, CFT provides a convenient tool for windowed spectral analysis. The results for ideal synthetic data prove CFT accurate and efficient, while the results for actual data show that CFT can be used to derive periodic information from GNSS coordinate time series.

  1. Gridded Surface Subsurface Hydrologic Analysis Modeling for Analysis of Flood Design Features at the Picayune Strand Restoration Project

    DTIC Science & Technology

    2016-08-01

    ...the POI. Figure 9. Discharge time series for the Miller pump system... In C2, the Miller Canal pump system was implicitly simulated by a time series of outflows assigned to model cells. This flow time series was... representative of how the pump system would operate during the storm events simulated in this work (USACE 2004). The outflow time series for the Miller...

  2. Local normalization: Uncovering correlations in non-stationary financial time series

    NASA Astrophysics Data System (ADS)

    Schäfer, Rudi; Guhr, Thomas

    2010-09-01

    The measurement of correlations between financial time series is of vital importance for risk management. In this paper we address an estimation error that stems from the non-stationarity of the time series. We put forward a method to rid the time series of local trends and variable volatility, while preserving cross-correlations. We test this method in a Monte Carlo simulation, and apply it to empirical data for the S&P 500 stocks.
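
    The local normalization idea described here, ridding the series of local trends and variable volatility, can be sketched as a rolling standardization. The trailing window length below is an illustrative assumption, not the authors' published choice:

```python
import numpy as np

def local_normalize(x, n=13):
    """Subtract a moving average and divide by a moving standard deviation over a
    trailing window of n points (the window length is an illustrative guess)."""
    x = np.asarray(x, dtype=float)
    out = np.zeros_like(x)
    for i in range(len(x)):
        w = x[max(0, i - n + 1): i + 1]
        mu, sigma = w.mean(), w.std()
        out[i] = (x[i] - mu) / sigma if sigma > 0 else 0.0
    return out

rng = np.random.default_rng(0)
returns = rng.normal(0.0, 1.0, 500)
returns[250:300] *= 5.0                 # a non-stationary volatility burst
normalized = local_normalize(returns)   # burst is rescaled toward unit variance
```

    After normalization, cross-correlations between pairs of such series can be estimated without the volatility regime dominating the result.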

  3. Coil-to-coil physiological noise correlations and their impact on fMRI time-series SNR

    PubMed Central

    Triantafyllou, C.; Polimeni, J. R.; Keil, B.; Wald, L. L.

    2017-01-01

    Purpose Physiological nuisance fluctuations (“physiological noise”) are a major contribution to the time-series Signal to Noise Ratio (tSNR) of functional imaging. While thermal noise correlations between array coil elements have a well-characterized effect on the image Signal to Noise Ratio (SNR0), the element-to-element covariance matrix of the time-series fluctuations has not yet been analyzed. We examine this effect with a goal of ultimately improving the combination of multichannel array data. Theory and Methods We extend the theoretical relationship between tSNR and SNR0 to include a time-series noise covariance matrix Ψt, distinct from the thermal noise covariance matrix Ψ0, and compare its structure to Ψ0 and the signal coupling matrix SSH formed from the signal intensity vectors S. Results Inclusion of the measured time-series noise covariance matrix into the model relating tSNR and SNR0 improves the fit of experimental multichannel data and is shown to be distinct from Ψ0 or SSH. Conclusion Time-series noise covariances in array coils are found to differ from Ψ0 and more surprisingly, from the signal coupling matrix SSH. Correct characterization of the time-series noise has implications for the analysis of time-series data and for improving the coil element combination process. PMID:26756964

  4. Constructing networks from a dynamical system perspective for multivariate nonlinear time series.

    PubMed

    Nakamura, Tomomichi; Tanizawa, Toshihiro; Small, Michael

    2016-03-01

    We describe a method for constructing networks for multivariate nonlinear time series. We approach the interaction between the various scalar time series from a deterministic dynamical system perspective and provide a generic and algorithmic test for whether the interaction between two measured time series is statistically significant. The method can be applied even when the data exhibit no obvious qualitative similarity: a situation in which the naive method utilizing the cross correlation function directly cannot correctly identify connectivity. To establish the connectivity between nodes we apply the previously proposed small-shuffle surrogate (SSS) method, which can investigate whether there are correlation structures in short-term variabilities (irregular fluctuations) between two data sets from the viewpoint of deterministic dynamical systems. The procedure to construct networks based on this idea is composed of three steps: (i) each time series is considered as a basic node of a network, (ii) the SSS method is applied to verify the connectivity between each pair of time series taken from the whole multivariate time series, and (iii) the pair of nodes is connected with an undirected edge when the null hypothesis cannot be rejected. The network constructed by the proposed method indicates the intrinsic (essential) connectivity of the elements included in the system or the underlying (assumed) system. The method is demonstrated for numerical data sets generated by known systems and applied to several experimental time series.
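
    A small-shuffle surrogate can be sketched by perturbing each time index with small Gaussian noise and reordering the data accordingly: short-term structure is destroyed while the global shape survives. This is our reading of the SSS idea, with illustrative parameters:

```python
import numpy as np

def small_shuffle_surrogate(x, amplitude=1.0, rng=None):
    """Perturb each index by a small Gaussian amount and reorder the data by the
    perturbed indices; short-term variability is scrambled, global shape kept."""
    rng = np.random.default_rng() if rng is None else rng
    idx = np.arange(len(x)) + amplitude * rng.normal(size=len(x))
    return np.asarray(x)[np.argsort(idx)]

rng = np.random.default_rng(7)
x = np.sin(np.linspace(0.0, 20.0 * np.pi, 1000)) + 0.1 * rng.normal(size=1000)
s = small_shuffle_surrogate(x, amplitude=1.0, rng=rng)
```

    In the network construction, a statistic computed between two series would be compared against its distribution over many such surrogates before drawing an edge.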

  5. Non-parametric characterization of long-term rainfall time series

    NASA Astrophysics Data System (ADS)

    Tiwari, Harinarayan; Pandey, Brij Kishor

    2018-03-01

    The statistical study of rainfall time series is one of the approaches for efficient hydrological system design. Identifying and characterizing long-term rainfall time series could aid in improving hydrological system forecasting. In the present study, eventual statistics was applied to the long-term (1851-2006) rainfall time series of seven meteorological regions of India. Linear trend analysis was carried out using the Mann-Kendall test for the observed rainfall series. The trend observed using the above-mentioned approach was ascertained using the innovative trend analysis method. Innovative trend analysis has been found to be a strong tool to detect the general trend of rainfall time series. The sequential Mann-Kendall test has also been carried out to examine nonlinear trends of the series. The partial sum of cumulative deviation test is also found to be suitable to detect the nonlinear trend. Innovative trend analysis, the sequential Mann-Kendall test, and the partial cumulative deviation test have the potential to detect the general as well as the nonlinear trend of a rainfall time series. Annual rainfall analysis suggests that the maximum increase in mean rainfall is 11.53% for West Peninsular India, whereas the maximum decrease in mean rainfall is 7.8% for the North Mountainous India region. The innovative trend analysis method is also capable of finding the number of change points present in the time series. Additionally, we performed the von Neumann ratio test and the cumulative deviation test to estimate the departure from homogeneity. Singular spectrum analysis has been applied in this study to evaluate the order of departure from homogeneity in the rainfall time series. The monsoon season (JS) of the North Mountainous India and West Peninsular India zones has a higher departure from homogeneity, and the singular spectrum analysis results are coherent with this.
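
    The Mann-Kendall test used for linear trend detection has a compact closed form. The sketch below implements the standard statistic without tie correction, on hypothetical rainfall-like series (not the study's data):

```python
import numpy as np

def mann_kendall_z(x):
    """Mann-Kendall trend statistic as a z-score (no tie correction)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = 0.0
    for i in range(n - 1):
        s += np.sign(x[i + 1:] - x[i]).sum()   # pairwise sign comparisons
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        return (s - 1) / np.sqrt(var_s)
    if s < 0:
        return (s + 1) / np.sqrt(var_s)
    return 0.0

rng = np.random.default_rng(1)
rain_up = 800.0 + 2.0 * np.arange(50) + rng.normal(0.0, 20.0, 50)   # trending
rain_flat = 800.0 + rng.normal(0.0, 20.0, 50)                        # trendless
z_up, z_flat = mann_kendall_z(rain_up), mann_kendall_z(rain_flat)
```

    A |z| above 1.96 rejects the no-trend null at the 5% level; the sequential variant applies the same statistic to progressive and retrograde subseries.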

  6. 31 CFR 353.35 - Payment (redemption).

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ..., 2003, or earlier, will be paid at any time after 6 months from its issue date. A Series EE bond issued... Service Series No. 1-80 (31 CFR part 351). (c) Series HH. A Series HH bond will be paid at any time after... STATES SAVINGS BONDS, SERIES EE AND HH General Provisions for Payment § 353.35 Payment (redemption). (a...

  7. 31 CFR 353.35 - Payment (redemption).

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., 2003, or earlier, will be paid at any time after 6 months from its issue date. A Series EE bond issued... Debt Series No. 1-80 (31 CFR part 351). (c) Series HH. A Series HH bond will be paid at any time after... STATES SAVINGS BONDS, SERIES EE AND HH General Provisions for Payment § 353.35 Payment (redemption). (a...

  8. 31 CFR 353.35 - Payment (redemption).

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., 2003, or earlier, will be paid at any time after 6 months from its issue date. A Series EE bond issued... Debt Series No. 1-80 (31 CFR part 351). (c) Series HH. A Series HH bond will be paid at any time after... STATES SAVINGS BONDS, SERIES EE AND HH General Provisions for Payment § 353.35 Payment (redemption). (a...

  9. 31 CFR 353.35 - Payment (redemption).

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., 2003, or earlier, will be paid at any time after 6 months from its issue date. A Series EE bond issued... Debt Series No. 1-80 (31 CFR part 351). (c) Series HH. A Series HH bond will be paid at any time after... STATES SAVINGS BONDS, SERIES EE AND HH General Provisions for Payment § 353.35 Payment (redemption). (a...

  10. 31 CFR 353.35 - Payment (redemption).

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ..., 2003, or earlier, will be paid at any time after 6 months from its issue date. A Series EE bond issued... Debt Series No. 1-80 (31 CFR part 351). (c) Series HH. A Series HH bond will be paid at any time after... STATES SAVINGS BONDS, SERIES EE AND HH General Provisions for Payment § 353.35 Payment (redemption). (a...

  11. Simulation of time series by distorted Gaussian processes

    NASA Technical Reports Server (NTRS)

    Greenhall, C. A.

    1977-01-01

    A distorted stationary Gaussian process can be used to provide computer-generated imitations of experimental time series. A method of analyzing a source time series and synthesizing an imitation is shown, and an example using X-band radiometer data is given.

  12. Reconstruction of ensembles of coupled time-delay systems from time series.

    PubMed

    Sysoev, I V; Prokhorov, M D; Ponomarenko, V I; Bezruchko, B P

    2014-06-01

    We propose a method to recover from time series the parameters of coupled time-delay systems and the architecture of couplings between them. The method is based on a reconstruction of model delay-differential equations and estimation of statistical significance of couplings. It can be applied to networks composed of nonidentical nodes with an arbitrary number of unidirectional and bidirectional couplings. We test our method on chaotic and periodic time series produced by model equations of ensembles of diffusively coupled time-delay systems in the presence of noise, and apply it to experimental time series obtained from electronic oscillators with delayed feedback coupled by resistors.

  13. Detecting chaos in irregularly sampled time series.

    PubMed

    Kulp, C W

    2013-09-01

    Recently, Wiebe and Virgin [Chaos 22, 013136 (2012)] developed an algorithm which detects chaos by analyzing a time series' power spectrum which is computed using the Discrete Fourier Transform (DFT). Their algorithm, like other time series characterization algorithms, requires that the time series be regularly sampled. Real-world data, however, are often irregularly sampled, thus, making the detection of chaotic behavior difficult or impossible with those methods. In this paper, a characterization algorithm is presented, which effectively detects chaos in irregularly sampled time series. The work presented here is a modification of Wiebe and Virgin's algorithm and uses the Lomb-Scargle Periodogram (LSP) to compute a series' power spectrum instead of the DFT. The DFT is not appropriate for irregularly sampled time series. However, the LSP is capable of computing the frequency content of irregularly sampled data. Furthermore, a new method of analyzing the power spectrum is developed, which can be useful for differentiating between chaotic and non-chaotic behavior. The new characterization algorithm is successfully applied to irregularly sampled data generated by a model as well as data consisting of observations of variable stars.
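
    The key substitution in this paper, the Lomb-Scargle periodogram in place of the DFT, is available in SciPy. Below is a minimal sketch of that spectral step on hypothetical irregularly sampled data (not the paper's full chaos-detection algorithm):

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(2)
t = np.sort(rng.uniform(0.0, 100.0, 400))        # irregular sampling times (s)
y = np.sin(2.0 * np.pi * 0.5 * t)                # a 0.5 Hz oscillation

freqs = np.linspace(0.05, 2.0, 500)              # trial frequencies in Hz
pgram = lombscargle(t, y, 2.0 * np.pi * freqs)   # expects angular frequencies
peak_freq = freqs[np.argmax(pgram)]              # recovered despite irregular sampling
```

    A DFT would require resampling these observations onto a regular grid first; the LSP evaluates the spectrum directly at arbitrary trial frequencies.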

  14. Entropic Analysis of Electromyography Time Series

    NASA Astrophysics Data System (ADS)

    Kaufman, Miron; Sung, Paul

    2005-03-01

    We are in the process of assessing the effectiveness of fractal and entropic measures for the diagnosis of low back pain from surface electromyography (EMG) time series. In a typical EMG measurement, the voltage is measured every millisecond. We observed back muscle fatiguing during one minute, which results in a time series with 60,000 entries. We characterize the complexity of the time series by computing the time dependence of the Shannon entropy. Analysis of the time series from relevant muscles in healthy and low back pain (LBP) individuals provides evidence that the level of variability of back muscle activity is much larger for healthy individuals than for individuals with LBP. In general, the time dependence of the entropy shows a crossover from a diffusive regime to a regime characterized by long-time correlations (self-organization) at about 0.01 s.
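
    The entropy-versus-time computation described here can be sketched with a windowed histogram estimate of the Shannon entropy. The bin count and the Gaussian stand-in for EMG data below are illustrative assumptions:

```python
import numpy as np

def shannon_entropy(x, bins=16):
    """Shannon entropy (bits) of a histogram of the samples; bin count is a guess."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(3)
emg_like = rng.normal(0.0, 1.0, 60_000)   # stand-in for 1 kHz EMG over one minute

# Entropy as a function of elapsed time, over cumulative windows.
window_ends = np.arange(1000, 60_001, 1000)
entropy_t = [shannon_entropy(emg_like[:n]) for n in window_ends]
```

    Plotting `entropy_t` against elapsed time would reveal the kind of regime crossover the abstract describes, were it present in real EMG data.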

  15. Phase synchronization based minimum spanning trees for analysis of financial time series with nonlinear correlations

    NASA Astrophysics Data System (ADS)

    Radhakrishnan, Srinivasan; Duvvuru, Arjun; Sultornsanee, Sivarit; Kamarthi, Sagar

    2016-02-01

    The cross correlation coefficient has been widely applied in financial time series analysis, specifically for understanding chaotic behaviour in terms of stock price and index movements during crisis periods. To better understand time series correlation dynamics, the cross correlation matrices are represented as networks, in which a node stands for an individual time series and a link indicates cross correlation between a pair of nodes. These networks are converted into simpler trees using different schemes. In this context, Minimum Spanning Trees (MST) are the most favoured tree structures because of their ability to preserve all the nodes and thereby retain essential information imbued in the network. Although cross correlations underlying MSTs capture essential information, they do not faithfully capture dynamic behaviour embedded in the time series data of financial systems, because cross correlation is a reliable measure only if the relationship between the time series is linear. To address this issue, this work investigates a new measure called phase synchronization (PS) for establishing correlations among different time series which relate to one another, linearly or nonlinearly. In this approach, the strength of a link between a pair of time series (nodes) is determined by the level of phase synchronization between them. We compare the performance of the phase synchronization based MST with the cross correlation based MST along selected network measures across a temporal frame that includes economically good and crisis periods. We observe agreement in the directionality of the results across these two methods. They show similar trends, upward or downward, when comparing selected network measures.
Though both the methods give similar trends, the phase synchronization based MST is a more reliable representation of the dynamic behaviour of financial systems than the cross correlation based MST because of the former's ability to quantify nonlinear relationships among time series or relations among phase shifted time series.
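
    The phase synchronization measure can be sketched with the Hilbert-transform phase and a phase locking value, which then defines edge weights for a minimum spanning tree. The three toy series and the "1 - PLV" distance below are our illustrative choices, not the paper's exact construction:

```python
import numpy as np
from scipy.signal import hilbert
from scipy.sparse.csgraph import minimum_spanning_tree

def phase_sync(x, y):
    """Phase locking value between two series via the Hilbert analytic phase."""
    px = np.angle(hilbert(x - x.mean()))
    py = np.angle(hilbert(y - y.mean()))
    return float(np.abs(np.exp(1j * (px - py)).mean()))

rng = np.random.default_rng(4)
t = np.linspace(0.0, 20.0, 2000)
series = np.vstack([
    np.sin(2 * np.pi * t) + 0.1 * rng.normal(size=t.size),        # node 0
    np.sin(2 * np.pi * t + 0.5) + 0.1 * rng.normal(size=t.size),  # node 1: shifted copy
    rng.normal(size=t.size),                                      # node 2: unrelated noise
])

n = len(series)
dist = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        dist[i, j] = 1.0 - phase_sync(series[i], series[j])  # strong sync -> short edge

mst = minimum_spanning_tree(dist).toarray()   # keeps the n-1 shortest consistent edges
```

    Note that the phase-shifted pair (nodes 0 and 1) is joined by a short edge even though a lagged relationship would weaken a plain cross correlation at zero lag.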

  16. Clinical time series prediction: towards a hierarchical dynamical system framework

    PubMed Central

    Liu, Zitao; Hauskrecht, Milos

    2014-01-01

    Objective Developing machine learning and data mining algorithms for building temporal models of clinical time series is important for understanding the patient's condition, the dynamics of a disease, the effects of various patient management interventions, and clinical decision making. In this work, we propose and develop a novel hierarchical framework for modeling clinical time series data of varied length and with irregularly sampled observations. Materials and methods Our hierarchical dynamical system framework for modeling clinical time series combines the advantages of two temporal modeling approaches: the linear dynamical system and the Gaussian process. We model the irregularly sampled clinical time series by using multiple Gaussian process sequences in the lower level of our hierarchical framework and capture the transitions between Gaussian processes by utilizing the linear dynamical system. The experiments are conducted on the complete blood count (CBC) panel data of 1000 post-surgical cardiac patients during their hospitalization. Our framework is evaluated and compared to multiple baseline approaches in terms of the mean absolute prediction error and the absolute percentage error. Results We tested our framework by first learning the time series model from data for the patients in the training set, and then applying the model in order to predict future time series values for the patients in the test set. We show that our model outperforms multiple existing models in terms of its predictive accuracy. Our method achieved a 3.13% average prediction accuracy improvement on ten CBC lab time series when it was compared against the best performing baseline. A 5.25% average accuracy improvement was observed when only short-term predictions were considered.
Conclusion A new hierarchical dynamical system framework that lets us model irregularly sampled time series data is a promising new direction for modeling clinical time series and for improving their predictive performance. PMID:25534671

  17. Modeling of Engine Parameters for Condition-Based Maintenance of the MTU Series 2000 Diesel Engine

    DTIC Science & Technology

    2016-09-01

    ...are suitable. To model the behavior of the engine, an autoregressive distributed lag (ARDL) time series model of engine speed and exhaust gas temperature is derived. The lag length for ARDL is determined by whitening of residuals using the...

  18. A nonlinear generalization of the Savitzky-Golay filter and the quantitative analysis of saccades

    PubMed Central

    Dai, Weiwei; Selesnick, Ivan; Rizzo, John-Ross; Rucker, Janet; Hudson, Todd

    2017-01-01

    The Savitzky-Golay (SG) filter is widely used to smooth and differentiate time series, especially biomedical data. However, time series that exhibit abrupt departures from their typical trends, such as sharp waves or steps, which are of physiological interest, tend to be oversmoothed by the SG filter. Hence, the SG filter tends to systematically underestimate physiological parameters in certain situations. This article proposes a generalization of the SG filter to more accurately track abrupt deviations in time series, leading to more accurate parameter estimates (e.g., peak velocity of saccadic eye movements). The proposed filtering methodology models a time series as the sum of two component time series: a low-frequency time series for which the conventional SG filter is well suited, and a second time series that exhibits instantaneous deviations (e.g., sharp waves, steps, or more generally, discontinuities in a higher order derivative). The generalized SG filter is then applied to the quantitative analysis of saccadic eye movements. It is demonstrated that (a) the conventional SG filter underestimates the peak velocity of saccades, especially those of small amplitude, and (b) the generalized SG filter estimates peak saccadic velocity more accurately than the conventional filter. PMID:28813566

  19. Modeling Geodetic Processes with Levy α-Stable Distribution and FARIMA

    NASA Astrophysics Data System (ADS)

    Montillet, Jean-Philippe; Yu, Kegen

    2015-04-01

    Over recent years the scientific community has been using the autoregressive moving average (ARMA) model in the modeling of the noise in global positioning system (GPS) time series (daily solutions). This work starts with an investigation of the limits of the ARMA model, which is widely used in signal processing when the measurement noise is white. Since a typical GPS time series consists of geophysical signals (e.g., a seasonal signal) and stochastic processes (e.g., coloured and white noise), the ARMA model may be inappropriate. Therefore, the application of the fractional auto-regressive integrated moving average (FARIMA) model is investigated. The simulation results using simulated time series as well as real GPS time series from a few selected stations around Australia show that the FARIMA model fits the time series better than other models when the coloured noise is larger than the white noise. The second part of this work focuses on fitting the GPS time series with the family of Levy α-stable distributions. Using this distribution, a hypothesis test is developed to effectively eliminate coarse outliers from GPS time series, achieving better performance than using the rule of thumb of n standard deviations (with n chosen empirically).
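
    The "FI" ingredient that distinguishes FARIMA from ARMA is fractional differencing. Below is a sketch of the standard binomial-expansion weights of (1 - B)^d, illustrating the slow long-memory decay; this is textbook material, not the authors' code:

```python
import numpy as np

def frac_diff_weights(d, n):
    """First n weights of the fractional differencing operator (1 - B)^d,
    the 'FI' part of FARIMA (standard binomial expansion)."""
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return w

w = frac_diff_weights(0.4, 100)   # d = 0.4: long-memory regime
# The weights decay hyperbolically (roughly k**(-1 - d)) rather than
# geometrically as in ARMA, which is why FARIMA can capture coloured
# (power-law) noise in GPS coordinate series.
```
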

  20. A nonlinear generalization of the Savitzky-Golay filter and the quantitative analysis of saccades.

    PubMed

    Dai, Weiwei; Selesnick, Ivan; Rizzo, John-Ross; Rucker, Janet; Hudson, Todd

    2017-08-01

    The Savitzky-Golay (SG) filter is widely used to smooth and differentiate time series, especially biomedical data. However, time series that exhibit abrupt departures from their typical trends, such as sharp waves or steps, which are of physiological interest, tend to be oversmoothed by the SG filter. Hence, the SG filter tends to systematically underestimate physiological parameters in certain situations. This article proposes a generalization of the SG filter to more accurately track abrupt deviations in time series, leading to more accurate parameter estimates (e.g., peak velocity of saccadic eye movements). The proposed filtering methodology models a time series as the sum of two component time series: a low-frequency time series for which the conventional SG filter is well suited, and a second time series that exhibits instantaneous deviations (e.g., sharp waves, steps, or more generally, discontinuities in a higher order derivative). The generalized SG filter is then applied to the quantitative analysis of saccadic eye movements. It is demonstrated that (a) the conventional SG filter underestimates the peak velocity of saccades, especially those of small amplitude, and (b) the generalized SG filter estimates peak saccadic velocity more accurately than the conventional filter.
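
    The underestimation the authors describe is easy to reproduce with the conventional SG differentiator. The saccade-like sigmoid, the sampling rate, and the window length below are hypothetical illustration choices, not the paper's data:

```python
import numpy as np
from scipy.signal import savgol_filter

fs = 1000.0                                        # hypothetical 1 kHz sampling
t = np.arange(0, 0.2, 1.0 / fs)
pos = 10.0 / (1.0 + np.exp(-(t - 0.1) * 200.0))    # saccade-like 10-degree step
true_peak_vel = np.max(np.gradient(pos, 1.0 / fs))

# Conventional SG differentiation: local cubic fits, first derivative.
vel_sg = savgol_filter(pos, window_length=51, polyorder=3, deriv=1, delta=1.0 / fs)
sg_peak_vel = vel_sg.max()
# A window long relative to the saccade oversmooths the velocity pulse,
# biasing the peak-velocity estimate low -- the effect the generalized
# filter in this paper is designed to remove.
```
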

  1. A Synthesis of VIIRS Solar and Lunar Calibrations

    NASA Technical Reports Server (NTRS)

    Eplee, Robert E.; Turpie, Kevin R.; Meister, Gerhard; Patt, Frederick S.; Fireman, Gwyn F.; Franz, Bryan A.; McClain, Charles R.

    2013-01-01

    The NASA VIIRS Ocean Science Team (VOST) has developed two independent calibrations of the SNPP VIIRS moderate resolution reflective solar bands using solar diffuser and lunar observations through June 2013. Fits to the solar calibration time series show mean residuals per band of 0.078-0.10%. There are apparent residual lunar libration correlations in the lunar calibration time series that are not accounted for by the ROLO photometric model of the Moon. Fits to the lunar time series that account for residual librations show mean residuals per band of 0.071-0.17%. Comparison of the solar and lunar time series shows that the relative differences in the two calibrations are 0.12-0.31%. Relative uncertainties in the VIIRS solar and lunar calibration time series are comparable to those achieved for SeaWiFS, Aqua MODIS, and Terra MODIS. Intercomparison of the VIIRS lunar time series with those from SeaWiFS, Aqua MODIS, and Terra MODIS shows that the scatter in the VIIRS lunar observations is consistent with that observed for the heritage instruments. Based on these analyses, the VOST has derived a calibration lookup table for VIIRS ocean color data based on fits to the solar calibration time series.

  2. Process fault detection and nonlinear time series analysis for anomaly detection in safeguards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burr, T.L.; Mullen, M.F.; Wangen, L.E.

    In this paper we discuss two advanced techniques, process fault detection and nonlinear time series analysis, and apply them to the analysis of vector-valued and single-valued time-series data. We investigate model-based process fault detection methods for analyzing simulated, multivariate, time-series data from a three-tank system. The model predictions are compared with simulated measurements of the same variables to form residual vectors that are tested for the presence of faults (possible diversions in safeguards terminology). We evaluate two methods, testing all individual residuals with a univariate z-score and testing all variables simultaneously with the Mahalanobis distance, for their ability to detect loss of material from two different leak scenarios from the three-tank system: a leak without and with replacement of the lost volume. Nonlinear time-series analysis tools were compared with the linear methods popularized by Box and Jenkins. We compare prediction results using three nonlinear and two linear modeling methods on each of six simulated time series: two nonlinear and four linear. The nonlinear methods performed better at predicting the nonlinear time series and did as well as the linear methods at predicting the linear values.
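
    The Mahalanobis-distance test on residual vectors can be sketched directly. The residual data, the simulated leak shift, and the chi-square threshold below are hypothetical stand-ins for the three-tank simulation:

```python
import numpy as np

rng = np.random.default_rng(5)
# Hypothetical residuals (measured minus model-predicted tank variables).
resid = rng.normal(0.0, 1.0, size=(200, 3))
resid[150:] += np.array([0.0, 2.5, -2.5])   # a simulated leak shifts two variables

# Covariance estimated from an in-control reference period.
ref = resid[:100]
mu, cov = ref.mean(axis=0), np.cov(ref, rowvar=False)
cov_inv = np.linalg.inv(cov)

d = resid - mu
maha_sq = np.einsum('ij,jk,ik->i', d, cov_inv, d)   # squared Mahalanobis distance

threshold = 11.34               # roughly the 99th percentile of chi-square, 3 dof
alarm_rate_fault = float((maha_sq[150:] > threshold).mean())
alarm_rate_ok = float((maha_sq[:100] > threshold).mean())
```

    Testing all variables jointly this way can catch a coordinated shift that individual univariate z-scores would each miss.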

  3. Scalable Prediction of Energy Consumption using Incremental Time Series Clustering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simmhan, Yogesh; Noor, Muhammad Usman

    2013-10-09

    Time series datasets are a canonical form of high velocity Big Data, and often generated by pervasive sensors, such as found in smart infrastructure. Performing predictive analytics on time series data can be computationally complex, and requires approximation techniques. In this paper, we motivate this problem using a real application from the smart grid domain. We propose an incremental clustering technique, along with a novel affinity score for determining cluster similarity, which help reduce the prediction error for cumulative time series within a cluster. We evaluate this technique, along with optimizations, using real datasets from smart meters, totaling ~700,000 data points, and show the efficacy of our techniques in improving the prediction error of time series data within polynomial time.

  4. Forecasting Enrollments with Fuzzy Time Series.

    ERIC Educational Resources Information Center

    Song, Qiang; Chissom, Brad S.

    The concept of fuzzy time series is introduced and used to forecast the enrollment of a university. Fuzzy time series, an aspect of fuzzy set theory, forecasts enrollment using a first-order time-invariant model. To evaluate the model, the conventional linear regression technique is applied and the predicted values obtained are compared to the…

  5. Solar-Terrestrial Signal Record in Tree Ring Width Time Series from Brazil

    NASA Astrophysics Data System (ADS)

    Rigozo, Nivaor Rodolfo; Lisi, Cláudio Sergio; Filho, Mário Tomazello; Prestes, Alan; Nordemann, Daniel Jean Roger; de Souza Echer, Mariza Pereira; Echer, Ezequiel; da Silva, Heitor Evangelista; Rigozo, Valderez F.

    2012-12-01

    This work investigates the behavior of the sunspot number and Southern Oscillation Index (SOI) signals recorded in tree ring time series at three different locations in Brazil: Humaitá in Amazônia State, Porto Ferreira in São Paulo State, and Passo Fundo in Rio Grande do Sul State, using wavelet and cross-wavelet analysis techniques. The wavelet spectra of the tree ring time series showed periodicities of 11 and 22 years, possibly related to solar activity, and periods of 2-8 years, possibly related to El Niño events. The cross-wavelet spectra for all tree ring time series from Brazil present a significant response to the 11-year solar cycle in the time interval from 1921 until after 1981. These tree ring time series also respond to the second harmonic of the solar cycle (5.5 years), but in different time intervals. The cross-wavelet maps also showed that the relationship between the SOI and the tree ring time series is strongest for oscillations in the range of 4-8 years.

  6. Introduction and application of the multiscale coefficient of variation analysis.

    PubMed

    Abney, Drew H; Kello, Christopher T; Balasubramaniam, Ramesh

    2017-10-01

    Quantifying how patterns of behavior relate across multiple levels of measurement typically requires long time series for reliable parameter estimation. We describe a novel analysis that estimates patterns of variability across multiple scales of analysis suitable for time series of short duration. The multiscale coefficient of variation (MSCV) measures the distance between local coefficient of variation estimates within particular time windows and the overall coefficient of variation across all time samples. We first describe the MSCV analysis and provide an example analytical protocol with corresponding MATLAB implementation and code. Next, we present a simulation study testing the new analysis using time series generated by ARFIMA models that span white noise, short-term and long-term correlations. The MSCV analysis was observed to be sensitive to specific parameters of ARFIMA models varying in the type of temporal structure and time series length. We then apply the MSCV analysis to short time series of speech phrases and musical themes to show commonalities in multiscale structure. The simulation and application studies provide evidence that the MSCV analysis can discriminate between time series varying in multiscale structure and length.
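
    The MSCV definition above, the distance between windowed CV estimates and the overall CV, can be sketched directly. This follows the verbal description only; the windowing and distance conventions of the published analysis may differ, and the function name is ours.

```python
import numpy as np

def mscv(x, window_sizes):
    """Multiscale coefficient of variation: for each window size, the mean
    absolute distance between local CV estimates (non-overlapping windows)
    and the overall CV of the series."""
    x = np.asarray(x, dtype=float)
    overall_cv = x.std() / x.mean()
    out = {}
    for w in window_sizes:
        n = len(x) // w
        windows = x[: n * w].reshape(n, w)
        local_cv = windows.std(axis=1) / windows.mean(axis=1)
        out[w] = float(np.mean(np.abs(local_cv - overall_cv)))
    return out

rng = np.random.default_rng(0)
series = rng.normal(10.0, 1.0, 300)      # positive-mean series, as CV requires
result = mscv(series, window_sizes=[5, 10, 25])
```

A constant series yields an MSCV of zero at every scale, which is a quick sanity check on the implementation.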

  7. A Recurrent Probabilistic Neural Network with Dimensionality Reduction Based on Time-series Discriminant Component Analysis.

    PubMed

    Hayashi, Hideaki; Shibanoki, Taro; Shima, Keisuke; Kurita, Yuichi; Tsuji, Toshio

    2015-12-01

    This paper proposes a probabilistic neural network (NN) developed on the basis of time-series discriminant component analysis (TSDCA) that can be used to classify high-dimensional time-series patterns. TSDCA involves the compression of high-dimensional time series into a lower dimensional space using a set of orthogonal transformations and the calculation of posterior probabilities based on a continuous-density hidden Markov model with a Gaussian mixture model expressed in the reduced-dimensional space. The analysis can be incorporated into an NN, which is named a time-series discriminant component network (TSDCN), so that parameters of dimensionality reduction and classification can be obtained simultaneously as network coefficients according to a backpropagation through time-based learning algorithm with the Lagrange multiplier method. The TSDCN is considered to enable high-accuracy classification of high-dimensional time-series patterns and to reduce the computation time taken for network training. The validity of the TSDCN is demonstrated for high-dimensional artificial data and electroencephalogram signals in the experiments conducted during the study.

  8. PRESEE: An MDL/MML Algorithm to Time-Series Stream Segmenting

    PubMed Central

    Jiang, Yexi; Tang, Mingjie; Yuan, Changan; Tang, Changjie

    2013-01-01

    Time-series stream is one of the most common data types in the data mining field. It is prevalent in fields such as the stock market, ecology, and medical care. Segmentation is a key step to accelerate the processing speed of time-series stream mining. Previous segmentation algorithms mainly focused on improving precision rather than efficiency. Moreover, the performance of these algorithms depends heavily on parameters, which are hard for users to set. In this paper, we propose PRESEE (parameter-free, real-time, and scalable time-series stream segmenting algorithm), which greatly improves the efficiency of time-series stream segmenting. PRESEE is based on both MDL (minimum description length) and MML (minimum message length) methods, which can segment the data automatically. To evaluate the performance of PRESEE, we conduct several experiments on time-series streams of different types and compare it with the state-of-the-art algorithm. The empirical results show that PRESEE is very efficient for real-time stream datasets, improving segmenting speed by nearly ten times. The novelty of this algorithm is further demonstrated by applying PRESEE to segment real-time stream datasets from the ChinaFLUX sensor network data stream. PMID:23956693
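
    The MDL principle behind PRESEE (each segment's model parameters cost bits, while better-fitting segments save bits on residual coding) can be illustrated with a toy piecewise-linear cost function. This is a generic MDL segmentation sketch, not the PRESEE algorithm; the bit-accounting constants are illustrative.

```python
import numpy as np

def mdl_cost(x, breakpoints):
    """MDL-style cost of a piecewise-linear model: model bits (about
    0.5*log2(n) per parameter, two parameters per segment) plus data bits
    approximated from the Gaussian residual variance."""
    x = np.asarray(x, dtype=float)
    bounds = [0, *breakpoints, len(x)]
    n, k = len(x), len(bounds) - 1
    rss = 0.0
    for a, b in zip(bounds[:-1], bounds[1:]):
        t = np.arange(a, b, dtype=float)
        coef = np.polyfit(t, x[a:b], 1)            # least-squares line per segment
        rss += float(np.sum((x[a:b] - np.polyval(coef, t)) ** 2))
    model_bits = 2 * k * 0.5 * np.log2(n)
    data_bits = 0.5 * n * np.log2(max(rss / n, 1e-12))
    return model_bits + data_bits

# a series whose slope flips at t = 50: one breakpoint should pay for itself
x = np.concatenate([np.arange(50.0), 50.0 - np.arange(50.0)])
cost_none = mdl_cost(x, [])
cost_split = mdl_cost(x, [50])
```

Under this cost, the description length with the true breakpoint is lower than with no breakpoint, which is exactly the trade-off an MDL segmenter exploits.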

  9. PRESEE: an MDL/MML algorithm to time-series stream segmenting.

    PubMed

    Xu, Kaikuo; Jiang, Yexi; Tang, Mingjie; Yuan, Changan; Tang, Changjie

    2013-01-01

    Time-series stream is one of the most common data types in the data mining field. It is prevalent in fields such as the stock market, ecology, and medical care. Segmentation is a key step to accelerate the processing speed of time-series stream mining. Previous segmentation algorithms mainly focused on improving precision rather than efficiency. Moreover, the performance of these algorithms depends heavily on parameters, which are hard for users to set. In this paper, we propose PRESEE (parameter-free, real-time, and scalable time-series stream segmenting algorithm), which greatly improves the efficiency of time-series stream segmenting. PRESEE is based on both MDL (minimum description length) and MML (minimum message length) methods, which can segment the data automatically. To evaluate the performance of PRESEE, we conduct several experiments on time-series streams of different types and compare it with the state-of-the-art algorithm. The empirical results show that PRESEE is very efficient for real-time stream datasets, improving segmenting speed by nearly ten times. The novelty of this algorithm is further demonstrated by applying PRESEE to segment real-time stream datasets from the ChinaFLUX sensor network data stream.

  10. Time Series Discord Detection in Medical Data using a Parallel Relational Database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woodbridge, Diane; Rintoul, Mark Daniel; Wilson, Andrew T.

    Recent advances in sensor technology have made continuous real-time health monitoring available in both hospital and non-hospital settings. Because high-frequency medical sensors generate huge amounts of data, storing and processing continuous medical data is an emerging big data area. Detecting anomalies in real time is especially important for recognizing and preventing patient emergencies. A time series discord is a subsequence that has the maximum difference from the rest of the time series subsequences, meaning that it exhibits abnormal or unusual data trends. In this study, we implemented two versions of a time series discord detection algorithm on a high-performance parallel database management system (DBMS) and applied them to 240 Hz waveform data collected from 9,723 patients. The initial brute-force version of the discord detection algorithm takes each possible subsequence and calculates the distance to its nearest non-self match to find the biggest discords in the time series. For the heuristic version of the algorithm, a combination of an array and a trie structure was applied to order the time series data for better time efficiency. The study results showed efficient data loading, decoding, and discord searches in a large amount of data, benefiting from the time series discord detection algorithm and the architectural characteristics of the parallel DBMS, including data compression, data pipelining, and task scheduling.
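
    The brute-force discord search described above is easy to state in code: for every subsequence, compute the distance to its nearest non-self match, and report the subsequence for which that distance is largest. A minimal sketch, not the study's parallel DBMS implementation; the window length and test signal are arbitrary:

```python
import numpy as np

def brute_force_discord(x, w):
    """Top-1 discord: the length-w subsequence whose Euclidean distance to
    its nearest non-self match (no overlap within w samples) is largest."""
    x = np.asarray(x, dtype=float)
    subs = np.lib.stride_tricks.sliding_window_view(x, w)
    n = len(subs)
    positions = np.arange(n)
    best_idx, best_dist = -1, -np.inf
    for i in range(n):
        d = np.linalg.norm(subs - subs[i], axis=1)
        d[np.abs(positions - i) < w] = np.inf      # mask trivial self matches
        nearest = d.min()
        if nearest > best_dist:
            best_idx, best_dist = i, nearest
    return best_idx, best_dist

# a clean sine wave with a bump injected around t = 200
t = np.linspace(0, 20 * np.pi, 400)
signal = np.sin(t)
signal[200:210] += 3.0
idx, dist = brute_force_discord(signal, w=16)
```

On this signal the returned index falls inside the injected bump. The double scan over subsequences is quadratic, which is why the study's heuristic version reorders candidates to prune most comparisons.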

  11. Time Series Discord Detection in Medical Data using a Parallel Relational Database [PowerPoint]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woodbridge, Diane; Wilson, Andrew T.; Rintoul, Mark Daniel

    Recent advances in sensor technology have made continuous real-time health monitoring available in both hospital and non-hospital settings. Because high-frequency medical sensors generate huge amounts of data, storing and processing continuous medical data is an emerging big data area. Detecting anomalies in real time is especially important for recognizing and preventing patient emergencies. A time series discord is a subsequence that has the maximum difference from the rest of the time series subsequences, meaning that it exhibits abnormal or unusual data trends. In this study, we implemented two versions of a time series discord detection algorithm on a high-performance parallel database management system (DBMS) and applied them to 240 Hz waveform data collected from 9,723 patients. The initial brute-force version of the discord detection algorithm takes each possible subsequence and calculates the distance to its nearest non-self match to find the biggest discords in the time series. For the heuristic version of the algorithm, a combination of an array and a trie structure was applied to order the time series data for better time efficiency. The study results showed efficient data loading, decoding, and discord searches in a large amount of data, benefiting from the time series discord detection algorithm and the architectural characteristics of the parallel DBMS, including data compression, data pipelining, and task scheduling.

  12. Space Object Classification Using Fused Features of Time Series Data

    NASA Astrophysics Data System (ADS)

    Jia, B.; Pham, K. D.; Blasch, E.; Shen, D.; Wang, Z.; Chen, G.

    In this paper, a fused feature vector consisting of raw time series and texture feature information is proposed for space object classification. The time series data includes historical orbit trajectories and asteroid light curves. The texture feature is derived from recurrence plots using Gabor filters for both unsupervised learning and supervised learning algorithms. The simulation results show that the classification algorithms using the fused feature vector achieve better performance than those using raw time series or texture features only.
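
    The recurrence plots from which the texture features above are derived are simply thresholded pairwise-distance matrices. A minimal sketch for a scalar series (the threshold eps and the test signal are illustrative; the paper applies Gabor filters to such matrices):

```python
import numpy as np

def recurrence_plot(x, eps):
    """Binary recurrence matrix: R[i, j] = 1 where |x[i] - x[j]| <= eps.
    Texture descriptors (e.g. Gabor responses) can then be computed on R."""
    x = np.asarray(x, dtype=float)
    return (np.abs(x[:, None] - x[None, :]) <= eps).astype(np.uint8)

t = np.linspace(0, 6 * np.pi, 120)
R = recurrence_plot(np.sin(t), eps=0.1)   # periodic signal -> diagonal bands
```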

  13. Study of Track Irregularity Time Series Calibration and Variation Pattern at Unit Section

    PubMed Central

    Jia, Chaolong; Wei, Lili; Wang, Hanning; Yang, Jiulin

    2014-01-01

    Focusing on quality problems in track irregularity time series data, this paper first presents algorithms for abnormal data identification, data offset correction, local outlier identification, and noise cancellation. It then proposes track irregularity time series decomposition and reconstruction through a wavelet decomposition and reconstruction approach. Finally, the patterns and features of the track irregularity standard deviation data sequence in unit sections are studied, and the changing trend of the track irregularity time series is discovered and described. PMID:25435869

  14. A Review of Some Aspects of Robust Inference for Time Series.

    DTIC Science & Technology

    1984-09-01

    A Review of Some Aspects of Robust Inference for Time Series, by R. D. Martin. Technical Report No. 53 (TR-53), Department of Statistics, University of Washington, Seattle, September 1984. One cannot hope to have a good method for dealing with outliers in time series by using only an instantaneous nonlinear transformation of the data.

  15. Exploring total cardiac variability in healthy and pathophysiological subjects using improved refined multiscale entropy.

    PubMed

    Marwaha, Puneeta; Sunkaria, Ramesh Kumar

    2017-02-01

    Multiscale entropy (MSE) and refined multiscale entropy (RMSE) techniques are widely used to evaluate the complexity of a time series across multiple time scales 't'. At certain time scales (sometimes across all time scales, in the case of RMSE), both techniques assign higher entropy to the HRV time series of certain pathologies than to those of healthy subjects, and to their corresponding randomized surrogate time series. This incorrect assessment of signal complexity may be because these techniques suffer from the following limitations: (1) the threshold value 'r' is updated as a function of long-term standard deviation and hence cannot capture the short-term variability, nor the substantial variability inherent in beat-to-beat fluctuations of long-term HRV time series; (2) in RMSE, the entropy values assigned to different filtered scaled time series result from changes in variance but do not completely reflect the real structural organization inherent in the original time series. In the present work, we propose an improved RMSE (I-RMSE) technique that introduces a new procedure to set the threshold value by taking into account the period-to-period variability inherent in a signal, and evaluate it on simulated and real HRV databases. The proposed I-RMSE assigns higher entropy to age-matched healthy subjects than to patients suffering from atrial fibrillation, congestive heart failure, sudden cardiac death, and diabetes mellitus, across all time scales. The results strongly support a reduction in complexity of the HRV time series in the female group, the old-aged, patients suffering from severe cardiovascular and non-cardiovascular diseases, and in their corresponding surrogate time series.
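
    The key change proposed above, tying the tolerance r to period-to-period variability rather than long-term standard deviation, can be sketched on top of ordinary coarse-graining and sample entropy. This is our reading of the idea, not the authors' exact I-RMSE procedure; using the mean absolute successive difference for r is an assumption.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Standard sample entropy: -ln(A/B), where B and A count template
    pairs within tolerance r (Chebyshev distance) at lengths m and m+1."""
    x = np.asarray(x, dtype=float)
    N = len(x)

    def count_matches(length):
        templates = np.lib.stride_tricks.sliding_window_view(x, length)[: N - m]
        c = 0
        for i in range(len(templates) - 1):
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            c += int(np.count_nonzero(d <= r))
        return c

    b, a = count_matches(m), count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def coarse_grain(x, scale):
    n = len(x) // scale
    return np.asarray(x[: n * scale], dtype=float).reshape(n, scale).mean(axis=1)

def irmse_like(x, scales):
    """Per scale, set r from period-to-period variability (mean absolute
    successive difference), not the long-term standard deviation."""
    out = {}
    for s in scales:
        y = coarse_grain(x, s)
        r = float(np.mean(np.abs(np.diff(y))))   # assumption: our proxy for it
        out[s] = sample_entropy(y, m=2, r=r)
    return out

rng = np.random.default_rng(1)
white = rng.normal(size=1000)
ent = irmse_like(white, scales=[1, 2, 4])
```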

  16. Time-dependent limited penetrable visibility graph analysis of nonstationary time series

    NASA Astrophysics Data System (ADS)

    Gao, Zhong-Ke; Cai, Qing; Yang, Yu-Xuan; Dang, Wei-Dong

    2017-06-01

    Recent years have witnessed the development of visibility graph theory, which allows us to analyze a time series from the perspective of complex networks. In this paper we develop a novel time-dependent limited penetrable visibility graph (TDLPVG). Two examples using nonstationary time series from RR intervals and gas-liquid flows are provided to demonstrate the effectiveness of our approach. The results of the first example suggest that our TDLPVG method allows characterizing time-varying behaviors and classifying the heart states of healthy, congestive heart failure, and atrial fibrillation subjects from RR interval time series. For the second example, we infer TDLPVGs from gas-liquid flow signals and, interestingly, find that the deviation of node degree of the TDLPVGs effectively uncovers the time-varying dynamical flow behaviors of gas-liquid slug and bubble flow patterns. All these results render our TDLPVG method particularly powerful for characterizing, from time series, the time-varying features underlying realistic complex systems.
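
    The limited penetrable visibility criterion underlying the TDLPVG, which links two samples if at most L intermediate samples block their line of sight, can be sketched directly. This is the static LPVG construction only, without the paper's time-dependent extension; the function name and test series are ours:

```python
import numpy as np

def lpvg_edges(x, L=1):
    """Limited penetrable visibility graph: nodes i < j are linked if at
    most L intermediate samples rise above their line of sight."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    edges = set()
    for i in range(n - 1):
        for j in range(i + 1, n):
            k = np.arange(i + 1, j)
            # height of the i-j sight line at each intermediate position k
            sight = x[j] + (x[i] - x[j]) * (j - k) / (j - i)
            if np.count_nonzero(x[k] >= sight) <= L:
                edges.add((i, j))
    return edges

x = np.array([1.0, 3.0, 2.0, 4.0, 1.0, 2.0])
e0 = lpvg_edges(x, L=0)   # L = 0 recovers the ordinary visibility graph
e1 = lpvg_edges(x, L=1)   # one penetrable (blocking) point allowed
```

Setting L = 0 recovers the natural visibility graph, so every L = 0 edge also appears at L = 1; here the pair (0, 2), blocked by the peak at index 1, appears only once one penetration is allowed.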

  17. Second-degree Stokes coefficients from multi-satellite SLR

    NASA Astrophysics Data System (ADS)

    Bloßfeld, Mathis; Müller, Horst; Gerstl, Michael; Štefka, Vojtěch; Bouman, Johannes; Göttl, Franziska; Horwath, Martin

    2015-09-01

    The long-wavelength part of the Earth's gravity field can be determined, with varying accuracy, from satellite laser ranging (SLR). In this study, we investigate the combination of up to ten geodetic SLR satellites using iterative variance component estimation. SLR observations to different satellites are combined in order to identify the impact of each satellite on the estimated Stokes coefficients. The combination of satellite-specific weekly or monthly arcs allows us to reduce parameter correlations of the single-satellite solutions and leads to alternative estimates of the second-degree Stokes coefficients. This alternative time series might be helpful for assessing the uncertainty in the impact of the low-degree Stokes coefficients on geophysical investigations. To validate the obtained time series of second-degree Stokes coefficients, a comparison with the SLR RL05 time series of the Center for Space Research (CSR) is made; it shows that all time series are comparable to the CSR time series. The precision of the weekly/monthly and coefficients is analyzed by comparing mass-related equatorial excitation functions with geophysical model results and reduced geodetic excitation functions. In case of , the annual amplitude and phase of the DGFI solution agree better with three of four geophysical model combinations than other time series. In case of , all time series agree very well with each other. The impact of on the ice mass trend estimates for Antarctica is compared based on CSR GRACE RL05 solutions, in which different monthly time series are used for the replacement. We found differences in the long-term Antarctic ice loss of Gt/year between the GRACE solutions induced by the different SLR time series of CSR and DGFI, which is about 13 % of the total ice loss of Antarctica. This result shows that Antarctic ice mass loss quantifications must be interpreted carefully.

  18. Tracer gauge: An automated dye dilution gauging system for ice‐affected streams

    USGS Publications Warehouse

    Clow, David W.; Fleming, Andrea C.

    2008-01-01

    In‐stream flow protection programs require accurate, real‐time streamflow data to aid in the protection of aquatic ecosystems during winter base flow periods. In cold regions, however, winter streamflow often can only be estimated because in‐channel ice causes variable backwater conditions and alters the stage‐discharge relation. In this study, an automated dye dilution gauging system, a tracer gauge, was developed for measuring discharge in ice‐affected streams. Rhodamine WT is injected into the stream at a constant rate, and downstream concentrations are measured with a submersible fluorometer. Data loggers control system operations, monitor key variables, and perform discharge calculations. Comparison of discharge from the tracer gauge and from a Cipoletti weir during periods of extensive ice cover indicated that the root‐mean‐square error of the tracer gauge was 0.029 m3 s−1, or 6.3% of average discharge for the study period. The tracer gauge system can provide much more accurate data than is currently available for streams that are strongly ice affected and, thus, could substantially improve management of in‐stream flow protection programs during winter in cold regions. Care must be taken, however, to test for the validity of key assumptions, including complete mixing and conservative behavior of dye, no changes in storage, and no gains or losses of water to or from the stream along the study reach. These assumptions may be tested by measuring flow‐weighted dye concentrations across the stream, performing dye mass balance analyses, and evaluating breakthrough curve behavior.
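
    The discharge calculation performed by the data loggers follows from a steady-state dye mass balance for constant-rate injection: the dye flux entering the reach, q*C_inj plus Q*C_bg, equals the flux leaving it, (Q + q)*C_plateau. A minimal sketch with hypothetical numbers (any consistent units work):

```python
def dilution_discharge(q_inj, c_inj, c_plateau, c_background=0.0):
    """Constant-rate injection gauging. Dye mass balance at plateau:
    q*C_inj + Q*C_bg = (Q + q)*C_plateau, solved for stream discharge Q."""
    return q_inj * (c_inj - c_plateau) / (c_plateau - c_background)

# hypothetical numbers: 1 mL/s of 20 g/L rhodamine WT solution diluted
# to a 40 ug/L plateau implies roughly 0.5 m3/s of streamflow
Q = dilution_discharge(q_inj=1e-6, c_inj=20.0, c_plateau=4e-5)
```

The mass-balance form makes the article's caveats concrete: the equation only holds if the dye is fully mixed, behaves conservatively, and no water enters or leaves the reach.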

  19. Assessment of benthic flux of dissolved organic carbon in wetland and estuarine sediments using the eddy-correlation technique

    NASA Astrophysics Data System (ADS)

    Swett, M. P.; Amirbahman, A.; Boss, E.

    2009-12-01

    Wetland and estuarine sediments release significant amounts of dissolved organic carbon (DOC) due to high levels of microbial activity, particularly sulfate reduction. Changes in climate and hydrologic conditions also have the potential to alter DOC release from these systems. This is a concern, as high levels of DOC can lead to mobilization of toxic metals and organics in natural waters. In addition, source waters high in DOC produce undesirable disinfection byproducts in water treatment. Various in situ methods, such as peepers and sediment core centrifugation, exist to quantify vertical benthic fluxes of DOC and other dissolved species across the sediment-water interface (SWI). These techniques, however, are intrusive and disturb the sediment environment. Eddy correlation allows for real-time, non-intrusive, in situ flux measurement of important analytes, such as O2 and DOC. An Acoustic Doppler Velocimeter (ADV) is used to obtain three-dimensional fluid velocity measurements. The eddy-correlation technique employs the mathematical separation of fluid velocity into a mean velocity and a fluctuating velocity component, with the latter representing turbulent eddy velocity. DOC concentrations are measured using a colored dissolved organic matter (CDOM) fluorometer, and the instantaneous vertical flux is determined from the correlated data. This study assesses DOC flux at three project sites: a beaver pond in the Lower Penobscot Watershed, Maine; a mudflat in the Penobscot River, Maine; and a mudflat in Great Bay, New Hampshire. Eddy flux values are compared with results obtained using peepers and centrifugation, as well as vertical profiling.
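
    For a discrete record, the Reynolds decomposition described above reduces to averaging the product of the fluctuating components. A minimal sketch with synthetic data; the correlation between w and the DOC proxy is fabricated for illustration:

```python
import numpy as np

def eddy_flux(w, c):
    """Eddy-correlation flux: mean product of the fluctuating parts of
    vertical velocity w and concentration c (Reynolds decomposition)."""
    w = np.asarray(w, dtype=float)
    c = np.asarray(c, dtype=float)
    return float(np.mean((w - w.mean()) * (c - c.mean())))

# synthetic record: eddies that carry high-DOC water upward
rng = np.random.default_rng(2)
w = rng.normal(0.0, 0.01, 10_000)                  # vertical velocity, m/s
c = 5.0 + 50.0 * w + rng.normal(0.0, 0.1, 10_000)  # DOC proxy, correlated with w
flux = eddy_flux(w, c)   # expected near 50 * var(w) = 5e-3
```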

  20. Confocal fluorometer for diffusion tracking in 3D engineered tissue constructs

    NASA Astrophysics Data System (ADS)

    Daly, D.; Zilioli, A.; Tan, N.; Buttenschoen, K.; Chikkanna, B.; Reynolds, J.; Marsden, B.; Hughes, C.

    2016-03-01

    We present results of the development of a non-contacting instrument, called fScan, based on scanning confocal fluorometry for assessing the diffusion of materials through a tissue matrix. In many areas of healthcare diagnostics and screening it is now widely accepted that the need for new quantitative monitoring technologies is a major pinch point in patient diagnostics and in vitro testing. With the increasing need to interpret 3D responses, this commonly involves tracking the diffusion of compounds, pharma-active species, and cells through a 3D matrix of tissue. Methods are available, but to support the advances that are currently only promised, this monitoring needs to be real-time, non-invasive, and economical. At the moment, commercial meters tend to be invasive and usually require a sample of the medium to be removed and processed prior to testing. This methodology clearly has a number of significant disadvantages. fScan combines a fiber-based optical arrangement with a compact, free-space optical front end, integrated so that the sample's diffusion can be measured without interference. This architecture is particularly important because of the "wet" nature of the samples. fScan is designed to measure constructs located within standard well plates, and a 2-D motion stage locates the required sample with respect to the measurement system. Results are presented that show how the meter has been used to evaluate the movement of species through collagen constructs in situ without disturbing their kinetic characteristics. These kinetics were little understood prior to these measurements.

  1. Magnetic focusing immunosensor for the detection of Salmonella typhimurium in foods

    NASA Astrophysics Data System (ADS)

    Pivarnik, Philip E.; Cao, He; Letcher, Stephen V.; Pierson, Arthur H.; Rand, Arthur G.

    1999-01-01

    From 1988 through 1992, salmonellosis accounted for 27% of the total reported foodborne disease outbreaks and 57% of the outbreaks in which the pathogen was identified. The prevalence of salmonellosis and the new requirements to monitor the organism as a marker in pathogen reduction programs will drive the need for rapid, on-site testing. A compact fiber-optic fluorometer using a red diode laser as an excitation source and fiber probes for analyte detection has been constructed and used to measure Salmonella. The organisms were isolated with anti-Salmonella magnetic beads and were labeled with a secondary antibody conjugated to a red fluorescent dye. The response of the system was proportional to the concentration of Salmonella typhimurium from 3.2 × 10⁵ colony forming units (CFU)/ml to 1.6 × 10⁷ CFU/ml. The system was developed to utilize a fiber-optic magnetic focusing probe that attracted the magnetic microspheres to the surface of a sample chamber directly in front of the excitation and emission fibers. The signal obtained from a homogeneous suspension of fluorescent magnetic microspheres was 9 to 10 picowatts. After focusing, the signal from the fluorescently labeled magnetic microspheres increased to 200 picowatts, approximately 20 times greater than that of the homogeneous suspension. The magnetic focusing assay detected 1.59 × 10⁵ colony forming units/ml of Salmonella typhimurium cultured in growth media. Magnetic focusing in front of the fibers has the potential to reduce the background fluorescence from unbound secondary antibodies, eliminating several rinsing steps and resulting in a simple, rapid assay.

  2. Repetitive Immunosensor with a Fiber-Optic Device and Antibody-Coated Magnetic Beads for Semi-Continuous Monitoring of Escherichia coli O157:H7

    PubMed Central

    Taniguchi, Midori; Saito, Hirokazu; Mitsubayashi, Kohji

    2017-01-01

    A rapid and reproducible fiber-optic immunosensor for Escherichia coli O157:H7 (E. coli O157:H7) is described. The biosensor consisted of a flow cell, an optical fiber with a thin Ni layer, and a PC-linked fluorometer. First, samples with E. coli O157:H7 were incubated with magnetic beads coated with anti-E. coli O157:H7 antibodies and with anti-E. coli O157:H7 antibodies labeled with cyanine 5 (Cy5) to form sandwich complexes. The Cy5-(E. coli O157:H7)-bead complexes were then injected into the flow cell and pulled to the magnetized Ni layer on the optical fiber set in the flow cell. An excitation light (λ = 635 nm) was used to illuminate the optical fiber, and the Cy5 fluorescent molecules facing the optical fiber were exposed to an evanescent wave from the optical fiber. The 670 nm fluorescent light was measured using a photodiode. Finally, the magnetization of the Ni layer was removed and the Cy5-(E. coli O157:H7)-bead complexes were washed out for the next immunoassay. E. coli O157:H7, diluted with phosphate buffer (PB), was measured from 1 × 10⁵ to 1 × 10⁷ cells/mL. The total time required for an assay was less than 15 min (excluding the pretreatment process), and repeated immunoassays on one optical fiber were made possible. PMID:28925937

  3. Repetitive Immunosensor with a Fiber-Optic Device and Antibody-Coated Magnetic Beads for Semi-Continuous Monitoring of Escherichia coli O157:H7.

    PubMed

    Taniguchi, Midori; Saito, Hirokazu; Mitsubayashi, Kohji

    2017-09-19

    A rapid and reproducible fiber-optic immunosensor for Escherichia coli O157:H7 (E. coli O157:H7) is described. The biosensor consisted of a flow cell, an optical fiber with a thin Ni layer, and a PC-linked fluorometer. First, samples with E. coli O157:H7 were incubated with magnetic beads coated with anti-E. coli O157:H7 antibodies and with anti-E. coli O157:H7 antibodies labeled with cyanine 5 (Cy5) to form sandwich complexes. The Cy5-(E. coli O157:H7)-bead complexes were then injected into the flow cell and pulled to the magnetized Ni layer on the optical fiber set in the flow cell. An excitation light (λ = 635 nm) was used to illuminate the optical fiber, and the Cy5 fluorescent molecules facing the optical fiber were exposed to an evanescent wave from the optical fiber. The 670 nm fluorescent light was measured using a photodiode. Finally, the magnetization of the Ni layer was removed and the Cy5-(E. coli O157:H7)-bead complexes were washed out for the next immunoassay. E. coli O157:H7, diluted with phosphate buffer (PB), was measured from 1 × 10⁵ to 1 × 10⁷ cells/mL. The total time required for an assay was less than 15 min (excluding the pretreatment process), and repeated immunoassays on one optical fiber were made possible.

  4. Differential temperature effects on dissipation of excess light energy and energy partitioning in lut2 mutant of Arabidopsis thaliana under photoinhibitory conditions.

    PubMed

    Popova, Antoaneta V; Dobrev, Konstantin; Velitchkova, Maya; Ivanov, Alexander G

    2018-05-03

    The high-light-induced alterations in the photosynthetic performance of photosystem II (PSII) and photosystem I (PSI), as well as the effectiveness of dissipation of excess absorbed light, were followed during illumination for different periods of time at room (22 °C) and low (8-10 °C) temperature in leaves of Arabidopsis thaliana, wild type (wt) and lut2, with the aim of unraveling the role of lutein in photoinhibition. Photosynthetic parameters of PSII and PSI were determined on whole leaves with a PAM fluorometer, and oxygen-evolving activity with a Clark-type electrode. In thylakoid membranes isolated from non-illuminated leaves and from leaves of wt and lut2 illuminated for 4.5 h, the photochemical activities of PSII and PSI and the energy interaction between the main pigment-protein complexes were determined. The results indicate that in non-illuminated leaves of lut2, the maximum rate of oxygen evolution and energy utilization in PSII is lower, the excitation pressure of PSII is higher, and cyclic electron transport around PSI is faster than in wt leaves. Under high-light illumination, lut2 leaves are more sensitive with respect to PSII performance, and the increases in PSII excitation pressure, ΦNO, and cyclic electron transport around PSI are greater than in wt leaves, especially when illumination is performed at low temperature. A significant part of the excess light energy is dissipated via a mechanism that does not depend on ΔpH or on the functioning of the xanthophyll cycle in LHCII, operating more intensively in lut2 leaves.

  5. GrammarViz 3.0: Interactive Discovery of Variable-Length Time Series Patterns

    DOE PAGES

    Senin, Pavel; Lin, Jessica; Wang, Xing; ...

    2018-02-23

    The problems of recurrent and anomalous pattern discovery in time series, e.g., motifs and discords, respectively, have received a lot of attention from researchers in the past decade. However, since the pattern search space is usually intractable, most existing detection algorithms require that the patterns have discriminative characteristics and that their length be known in advance and provided as input, which is an unreasonable requirement for many real-world problems. In addition, patterns of similar structure but of different lengths may co-exist in a time series. In order to address these issues, we have developed algorithms for variable-length time series pattern discovery that are based on symbolic discretization and grammar inference—two techniques whose combination enables the structured reduction of the search space and discovery of the candidate patterns in linear time. In this work, we present GrammarViz 3.0—a software package that provides implementations of the proposed algorithms and a graphical user interface for interactive variable-length time series pattern discovery. The current version of the software provides an alternative grammar inference algorithm that improves the time series motif discovery workflow, and introduces an experimental procedure for automated discretization parameter selection that builds upon the minimum cardinality maximum cover principle and aids time series recurrent and anomalous pattern discovery.

  6. GrammarViz 3.0: Interactive Discovery of Variable-Length Time Series Patterns

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Senin, Pavel; Lin, Jessica; Wang, Xing

    The problems of recurrent and anomalous pattern discovery in time series, e.g., motifs and discords, respectively, have received a lot of attention from researchers in the past decade. However, since the pattern search space is usually intractable, most existing detection algorithms require that the patterns have discriminative characteristics and that their length be known in advance and provided as input, which is an unreasonable requirement for many real-world problems. In addition, patterns of similar structure but of different lengths may co-exist in a time series. In order to address these issues, we have developed algorithms for variable-length time series pattern discovery that are based on symbolic discretization and grammar inference—two techniques whose combination enables the structured reduction of the search space and discovery of the candidate patterns in linear time. In this work, we present GrammarViz 3.0—a software package that provides implementations of the proposed algorithms and a graphical user interface for interactive variable-length time series pattern discovery. The current version of the software provides an alternative grammar inference algorithm that improves the time series motif discovery workflow, and introduces an experimental procedure for automated discretization parameter selection that builds upon the minimum cardinality maximum cover principle and aids time series recurrent and anomalous pattern discovery.

  7. Using NASA's Giovanni System to Simulate Time-Series Stations in the Outflow Region of California's Eel River

    NASA Technical Reports Server (NTRS)

    Acker, James G.; Shen, Suhung; Leptoukh, Gregory G.; Lee, Zhongping

    2012-01-01

    Oceanographic time-series stations provide vital data for the monitoring of oceanic processes, particularly those associated with trends over time and interannual variability. There are likely numerous locations where the establishment of a time-series station would be desirable, but for reasons of funding or logistics, such establishment may not be feasible. An alternative to an operational time-series station is monitoring of sites via remote sensing. In this study, the NASA Giovanni data system is employed to simulate the establishment of two time-series stations near the outflow region of California's Eel River, which carries a high sediment load. Previous time-series analysis of this location (Acker et al. 2009) indicated that remotely-sensed chl a exhibits a statistically significant increasing trend during summer (low flow) months, but no apparent trend during winter (high flow) months. Examination of several newly-available ocean data parameters in Giovanni, including 8-day resolution data, demonstrates the differences in ocean parameter trends at the two locations compared to regionally-averaged time-series. The hypothesis that the increased summer chl a values are related to increasing SST is evaluated, and the signature of the Eel River plume is defined with ocean optical parameters.

  8. Improving GNSS time series for volcano monitoring: application to Canary Islands (Spain)

    NASA Astrophysics Data System (ADS)

    García-Cañada, Laura; Sevilla, Miguel J.; Pereda de Pablo, Jorge; Domínguez Cerdeña, Itahiza

    2017-04-01

    The number of permanent GNSS stations has increased significantly in recent years for different geodetic applications such as volcano monitoring, which requires high precision. Coordinate time series have recently become long enough that we can apply different analyses and filters to improve the GNSS coordinate results. Following this idea, we have processed data from the GNSS permanent stations used by the Spanish Instituto Geográfico Nacional (IGN) for volcano monitoring in the Canary Islands, obtaining time series by the double-difference processing method with Bernese v5.0 for the period 2007-2014. We have identified the characteristics of these time series and obtained models to estimate velocities with greater accuracy and more realistic uncertainties. In order to improve the results, we have used two kinds of filters. The first, a spatial filter, was computed using the series of residuals of all stations in the Canary Islands without anomalous behaviour, after removing a linear trend. Applying this filter to all sets of coordinates of the permanent stations reduces their dispersion. The second filter accounts for the temporal correlation in the coordinate time series of each station individually. A study of the evolution of the velocity as a function of series length was carried out and demonstrated the need for time series of at least four years. Therefore, for stations with more than four years of data, we calculated the velocity and the characteristic parameters in order to obtain time series of residuals. This methodology has been applied to the GNSS network in El Hierro (Canary Islands) during the 2011-2012 eruption and the subsequent magmatic intrusions (2012-2014). The results show that anomalous behaviour in the coordinates is easier to detect in the new series, making them more useful for detecting crustal deformation in volcano monitoring.
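
    A spatial (common-mode) filter of the kind described above is often implemented by stacking: detrend each station's series, average the residuals of well-behaved stations, and subtract that shared signal from every station. A minimal numpy sketch under those assumptions, with synthetic data standing in for the IGN's Bernese coordinate series:

```python
import numpy as np

def detrend(series, t):
    """Remove a linear trend fitted by least squares; return the residuals."""
    coeffs = np.polyfit(t, series, 1)
    return series - np.polyval(coeffs, t)

def common_mode_filter(coords, t):
    """coords: (n_stations, n_epochs). Subtract the stacked mean residual
    (the regional common mode) from each station's residual series."""
    residuals = np.vstack([detrend(row, t) for row in coords])
    common_mode = residuals.mean(axis=0)
    return residuals - common_mode

rng = np.random.default_rng(1)
t = np.arange(200.0)
regional = 2.0 * np.sin(2 * np.pi * t / 100)            # shared nuisance signal
coords = np.vstack([0.01 * i * t + regional + 0.3 * rng.standard_normal(200)
                    for i in range(5)])                  # 5 stations, own trends
filtered = common_mode_filter(coords, t)
# The dispersion drops sharply once the shared regional signal is removed.
```

In practice the common mode would be built only from stations screened for anomalous behaviour, exactly as the abstract notes.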

  9. A user-defined data type for the storage of time series data allowing efficient similarity screening.

    PubMed

    Sorokin, Anatoly; Selkov, Gene; Goryanin, Igor

    2012-07-16

    The volume of the experimentally measured time series data is rapidly growing, while storage solutions offering better data types than simple arrays of numbers or opaque blobs for keeping series data are sorely lacking. A number of indexing methods have been proposed to provide efficient access to time series data, but none has so far been integrated into a tried-and-proven database system. To explore the possibility of such integration, we have developed a data type for time series storage in PostgreSQL, an object-relational database system, and equipped it with an access method based on SAX (Symbolic Aggregate approXimation). This new data type has been successfully tested in a database supporting a large-scale plant gene expression experiment, and it was additionally tested on a very large set of simulated time series data.
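
    The access-method idea, screening candidates with a cheap symbolic summary before computing exact distances, can be sketched in a few lines. This illustrates the screening principle only; the paper's actual PostgreSQL data type and SAX index are not reproduced here, and the signature function is a deliberately crude stand-in:

```python
import numpy as np

def signature(series, n_segments=4):
    """Coarse signature: sign pattern of segment means of the z-normalized series."""
    x = np.asarray(series, dtype=float)
    x = (x - x.mean()) / (x.std() + 1e-12)
    means = np.array([seg.mean() for seg in np.array_split(x, n_segments)])
    return tuple(np.sign(means).astype(int))

def screen(query, candidates, max_dist):
    """Two-stage similarity screening: cheap signature match, then exact distance."""
    qsig = signature(query)
    hits = []
    for c in candidates:
        if signature(c) != qsig:          # cheap, index-style filter
            continue
        if np.linalg.norm(np.asarray(query) - np.asarray(c)) <= max_dist:
            hits.append(c)                # exact check only on survivors
    return hits

t = np.linspace(0, 1, 64)
query = t                 # rising ramp
similar = t + 0.01        # same shape, slightly shifted
opposite = 1 - t          # mirrored shape, filtered out by the signature
hits = screen(query, [similar, opposite], max_dist=1.0)
```

A real index would use SAX words with a distance lower bound instead of an exact-match signature, but the two-stage structure is the same.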

  10. Power law cross-correlations between price change and volume change of Indian stocks

    NASA Astrophysics Data System (ADS)

    Hasan, Rashid; Mohammed Salim, M.

    2017-05-01

    We study multifractal long-range correlations and cross-correlations of the daily price change and volume change of 50 stocks that comprise the Nifty index of the National Stock Exchange, Mumbai, using the MF-DFA and MF-DCCA methods. We find that the time series of price change are uncorrelated, whereas anti-persistent long-range multifractal correlations are found in the volume change series. We also find anti-persistent long-range multifractal cross-correlations between the time series of price change and volume change. As multifractality is a signature of complexity, we estimate complexity parameters of the time series of price change, volume change, and cross-correlated price-volume change by fitting fourth-degree polynomials to their multifractal spectra. Our results indicate that the time series of price change display high complexity, whereas the time series of volume change and cross-correlated price-volume change display low complexity.
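
    MF-DFA generalizes detrended fluctuation analysis (DFA), whose q = 2 backbone is compact: integrate the centered series, detrend it in windows of varying scale, and read the scaling exponent off the log-log slope (about 0.5 for uncorrelated data, below 0.5 for anti-persistent series, near 1.5 for a random walk). A minimal monofractal sketch with illustrative scales, not the paper's full multifractal machinery:

```python
import numpy as np

def dfa(series, scales):
    """First-order DFA: returns the scaling exponent from log F(s) vs log s."""
    profile = np.cumsum(series - np.mean(series))
    flucts = []
    for s in scales:
        n = len(profile) // s
        f2 = []
        for i in range(n):
            seg = profile[i * s:(i + 1) * s]
            tt = np.arange(s)
            trend = np.polyval(np.polyfit(tt, seg, 1), tt)  # local detrending
            f2.append(np.mean((seg - trend) ** 2))
        flucts.append(np.sqrt(np.mean(f2)))
    slope, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return slope

rng = np.random.default_rng(2)
h_white = dfa(rng.standard_normal(4096), [16, 32, 64, 128, 256])
h_rw = dfa(np.cumsum(rng.standard_normal(4096)), [16, 32, 64, 128, 256])
# h_white should sit near 0.5; the integrated (random-walk) series near 1.5.
```

MF-DFA repeats this with a q-th-moment average of the segment fluctuations, producing the spectrum the abstract fits polynomials to.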

  11. Nonstationary time series prediction combined with slow feature analysis

    NASA Astrophysics Data System (ADS)

    Wang, G.; Chen, X.

    2015-01-01

    Almost all climate time series have some degree of nonstationarity due to external driving forces perturbing the observed system. Therefore, these external driving forces should be taken into account when reconstructing the climate dynamics. This paper presents a new technique in which the driving force of a time series is first extracted using the Slow Feature Analysis (SFA) approach and then introduced into a predictive model to predict nonstationary time series. In essence, the main idea of the technique is to consider the driving forces as state variables and incorporate them into the prediction model. To test the method, experiments using a modified logistic time series and winter ozone data from Arosa, Switzerland, were conducted. The results showed improved and effective prediction skill.

  12. InSAR Deformation Time Series Processed On-Demand in the Cloud

    NASA Astrophysics Data System (ADS)

    Horn, W. B.; Weeden, R.; Dimarchi, H.; Arko, S. A.; Hogenson, K.

    2017-12-01

    During this past year, ASF has developed a cloud-based on-demand processing system known as HyP3 (http://hyp3.asf.alaska.edu/), the Hybrid Pluggable Processing Pipeline, for Synthetic Aperture Radar (SAR) data. The system makes it easy for a user who doesn't have the time or inclination to install and use complex SAR processing software to leverage SAR data in their research or operations. One such processing algorithm is the generation of a deformation time series product: a series of images representing ground displacements over time, computed from a time series of interferometric SAR (InSAR) products. The set of software tools necessary to generate this useful product is difficult to install, configure, and use. Moreover, for a long time series with many images, the processing of just the interferograms can take days. Principally built by three undergraduate students at the ASF DAAC, the deformation time series processing relies on the new Amazon Batch service, which enables processing of jobs with complex interconnected dependencies in a straightforward and efficient manner. In the case of generating a deformation time series product from a stack of single-look complex SAR images, the system uses Batch to serialize the up-front processing, interferogram generation, optional tropospheric correction, and deformation time series generation. The most time-consuming portion is the interferogram generation, because even for a fairly small stack of images many interferograms need to be processed. By using AWS Batch, the interferograms are all generated in parallel; the entire process completes in hours rather than days. Additionally, the individual interferograms are saved in Amazon's cloud storage, so that when new data are acquired in the stack, an updated time series product can be generated with minimal additional processing.
This presentation will focus on the development techniques and enabling technologies that were used in developing the time series processing in the ASF HyP3 system. Data and process flow from job submission through to order completion will be shown, highlighting the benefits of the cloud for each step.

  13. Forecasting Nonlinear Chaotic Time Series with Function Expression Method Based on an Improved Genetic-Simulated Annealing Algorithm

    PubMed Central

    Wang, Jun; Zhou, Bi-hua; Zhou, Shu-dao; Sheng, Zheng

    2015-01-01

    The paper proposes a novel function expression method to forecast chaotic time series, using an improved genetic-simulated annealing (IGSA) algorithm to establish the optimum function expression that describes the behavior of the time series. In order to deal with the weaknesses associated with the genetic algorithm, the proposed algorithm incorporates the simulated annealing operation, which has strong local search ability, into the genetic algorithm to enhance the performance of optimization; in addition, the fitness function and genetic operators are also improved. Finally, the method is applied to the chaotic time series of the Quadratic and Rössler maps for validation. The effect of noise in the chaotic time series is also studied numerically. The numerical results verify that the method can forecast chaotic time series with high precision and effectiveness, and the forecasting precision under certain noise is also satisfactory. It can be concluded that the IGSA algorithm is efficient and superior. PMID:26000011

  14. The parametric modified limited penetrable visibility graph for constructing complex networks from time series

    NASA Astrophysics Data System (ADS)

    Li, Xiuming; Sun, Mei; Gao, Cuixia; Han, Dun; Wang, Minggang

    2018-02-01

    This paper presents the parametric modified limited penetrable visibility graph (PMLPVG) algorithm for constructing complex networks from time series. We modify the penetrable visibility criterion of the limited penetrable visibility graph (LPVG) in order to improve the rationality of the original penetrable visibility and preserve the dynamic characteristics of the time series. The added view-angle parameter provides a new approach to characterizing dynamic structure of the time series that is invisible to the previous algorithm. The reliability of the PMLPVG algorithm is verified by applying it to three types of artificial data as well as actual data on natural gas prices in different regions. The empirical results indicate that the PMLPVG algorithm can distinguish different time series from each other. Meanwhile, the analysis results for the natural gas price data using PMLPVG are consistent with those of detrended fluctuation analysis (DFA). The results imply that the PMLPVG algorithm may be a reasonable and significant tool for identifying various time series in different fields.
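
    PMLPVG extends the visibility-graph family of algorithms, whose unmodified root is easy to state: two time points are connected when the straight line between them passes above every intermediate point. A brute-force Python sketch of that natural visibility graph (the paper's penetrable-visibility and view-angle modifications are not implemented here):

```python
import numpy as np

def visibility_edges(series):
    """Natural visibility graph: nodes are time points; (a, b) is an edge if
    every intermediate point lies strictly below the line joining them."""
    y = np.asarray(series, dtype=float)
    n = len(y)
    edges = set()
    for a in range(n):
        for b in range(a + 1, n):
            visible = all(
                y[c] < y[a] + (y[b] - y[a]) * (c - a) / (b - a)
                for c in range(a + 1, b)
            )
            if visible:
                edges.add((a, b))
    return edges

edges = visibility_edges([1.0, 3.0, 2.0, 4.0])
# Consecutive points always see each other; 1.0 and 2.0 are blocked by 3.0.
```

The "limited penetrable" variants relax the criterion so an edge may penetrate a bounded number of blocking points, which is what PMLPVG then parameterizes.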

  15. Characterizing Time Series Data Diversity for Wind Forecasting: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hodge, Brian S; Chartan, Erol Kevin; Feng, Cong

    Wind forecasting plays an important role in integrating variable and uncertain wind power into the power grid. Various forecasting models have been developed to improve forecasting accuracy. However, it is challenging to accurately compare the true forecasting performances of different methods and forecasters due to the lack of diversity in forecasting test datasets. This paper proposes a time series characteristic analysis approach to visualize and quantify wind time series diversity. The developed method first calculates six time series characteristic indices from various perspectives. Then principal component analysis is performed to reduce the data dimension while preserving the important information. The diversity of the time series dataset is visualized by the geometric distribution of the points in the newly constructed principal component space. The volume of the 3-dimensional (3D) convex polytope (or the length of the 1D number axis, or the area of the 2D convex polygon) is used to quantify the time series data diversity. The method is tested with five datasets with various degrees of diversity.
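
    The characterize-then-project workflow can be sketched with a handful of stand-in indices (standard deviation, roughness, lag-1 autocorrelation; the paper's six indices differ) and an SVD-based PCA:

```python
import numpy as np

def lag1_autocorr(x):
    x = x - x.mean()
    return float(np.dot(x[:-1], x[1:]) / (np.dot(x, x) + 1e-12))

def indices(series):
    """Three illustrative characteristics per series (not the paper's six)."""
    x = np.asarray(series, dtype=float)
    return np.array([x.std(), np.abs(np.diff(x)).mean(), lag1_autocorr(x)])

def pca_project(feature_matrix, n_components=2):
    """Center the feature matrix and project onto the top principal components."""
    X = feature_matrix - feature_matrix.mean(axis=0)
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    return X @ vt[:n_components].T

rng = np.random.default_rng(3)
t = np.arange(256)
dataset = [np.sin(2 * np.pi * t / p) + a * rng.standard_normal(256)
           for p in (16, 64, 128) for a in (0.1, 1.0)]
coords2d = pca_project(np.vstack([indices(s) for s in dataset]))
# The spread (e.g., convex-hull area) of coords2d quantifies dataset diversity.
```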

  16. [Predicting Incidence of Hepatitis E in China Using Fuzzy Time Series Based on Fuzzy C-Means Clustering Analysis].

    PubMed

    Luo, Yi; Zhang, Tao; Li, Xiao-song

    2016-05-01

    To explore the application of a fuzzy time series model based on fuzzy c-means clustering in forecasting the monthly incidence of Hepatitis E in mainland China. A predictive model (fuzzy time series method based on fuzzy c-means clustering) was developed using Hepatitis E incidence data in mainland China between January 2004 and July 2014. The incidence data from August 2014 to November 2014 were used to test the fitness of the predictive model. The forecasting results were compared with those from traditional fuzzy time series models. The fuzzy time series model based on fuzzy c-means clustering had a mean squared error (MSE) of fitting of 0.0011 and an MSE of forecasting of 6.9775 × 10⁻⁴, compared with 0.0017 and 0.0014 from the traditional forecasting model. The results indicate that the fuzzy time series model based on fuzzy c-means clustering has a better performance in forecasting the incidence of Hepatitis E.

  17. Forecasting nonlinear chaotic time series with function expression method based on an improved genetic-simulated annealing algorithm.

    PubMed

    Wang, Jun; Zhou, Bi-hua; Zhou, Shu-dao; Sheng, Zheng

    2015-01-01

    The paper proposes a novel function expression method to forecast chaotic time series, using an improved genetic-simulated annealing (IGSA) algorithm to establish the optimum function expression that describes the behavior of the time series. In order to deal with the weaknesses associated with the genetic algorithm, the proposed algorithm incorporates the simulated annealing operation, which has strong local search ability, into the genetic algorithm to enhance the performance of optimization; in addition, the fitness function and genetic operators are also improved. Finally, the method is applied to the chaotic time series of the Quadratic and Rössler maps for validation. The effect of noise in the chaotic time series is also studied numerically. The numerical results verify that the method can forecast chaotic time series with high precision and effectiveness, and the forecasting precision under certain noise is also satisfactory. It can be concluded that the IGSA algorithm is efficient and superior.

  18. Large and Small-Scale Cropland Classification on the Foothills of Mount Kenya Based on SPOT-5 Take-5 Data Time Series

    NASA Astrophysics Data System (ADS)

    Eckert, Sandra

    2016-08-01

    The SPOT-5 Take 5 campaign provided SPOT time series data of an unprecedented spatial and temporal resolution. We analysed 29 scenes acquired between May and September 2015 of a semi-arid region in the foothills of Mount Kenya, with two aims: first, to distinguish rainfed from irrigated cropland and cropland from natural vegetation covers, which show similar reflectance patterns; and second, to identify individual crop types. We tested several input data sets in different combinations: the spectral bands and the normalized difference vegetation index (NDVI) time series, principal components of NDVI time series, and selected NDVI time series statistics. For the classification we used random forests (RF). In the test differentiating rainfed cropland, irrigated cropland, and natural vegetation covers, the best classification accuracies were achieved using spectral bands. For the differentiation of crop types, we analysed the phenology of selected crop types based on NDVI time series. First results are promising.
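
    The NDVI at the center of this analysis is a simple band ratio, NDVI = (NIR − Red) / (NIR + Red), computed per pixel and per acquisition date; a small numpy sketch with made-up reflectance values (the study's actual SPOT-5 bands and random-forest classifier are not reproduced):

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index, clipped to its valid range."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return np.clip((nir - red) / (nir + red + 1e-12), -1.0, 1.0)

# Toy reflectance time series for one pixel across 5 acquisition dates.
nir_series = np.array([0.40, 0.45, 0.60, 0.55, 0.42])
red_series = np.array([0.20, 0.15, 0.08, 0.10, 0.18])
ndvi_series = ndvi(nir_series, red_series)
# A crop's green-up and senescence appear as a rise and fall in NDVI over time.
```

Stacking such per-date NDVI values across all 29 scenes yields the phenological profiles the study uses to separate crop types.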

  19. Visual analytics techniques for large multi-attribute time series data

    NASA Astrophysics Data System (ADS)

    Hao, Ming C.; Dayal, Umeshwar; Keim, Daniel A.

    2008-01-01

    Time series data commonly occur when variables are monitored over time. Many real-world applications involve the comparison of long time series across multiple variables (multi-attributes). Often, business people want to compare this year's monthly sales with last year's sales to make decisions. Data warehouse administrators (DBAs) want to know their daily data-loading job performance, and they need to detect outliers early enough to act upon them. In this paper, two new visual analytic techniques are introduced: the color cell-based Visual Time Series Line Charts and Maps highlight significant changes over time in long time series data, and the new Visual Content Query facilitates finding the contents and histories of interesting patterns and anomalies, which leads to root cause identification. We have applied both methods to two real-world applications, mining enterprise data warehouse and customer credit card fraud data, to illustrate the wide applicability and usefulness of these techniques.

  20. Analysis of air pollution mortality in terms of life expectancy changes: relation between time series, intervention, and cohort studies.

    PubMed

    Rabl, Ari

    2006-02-01

    Information on life expectancy change is of great concern for policy makers, as evidenced by the discussions of the so-called "harvesting" issue (i.e., the question of how large a loss each death in the mortality results of time series studies corresponds to). Whereas most epidemiological studies of air pollution mortality have been formulated in terms of mortality risk, this paper shows that a formulation in terms of life expectancy change is mathematically equivalent, but offers several advantages: it automatically takes into account the constraint that everybody dies exactly once, regardless of pollution; it provides a unified framework for time series, intervention studies and cohort studies; and in time series and intervention studies, it yields the life expectancy change directly as a time integral of the observed mortality rate. Results are presented for life expectancy change in time series studies. Determination of the corresponding total number of attributable deaths (as opposed to the number of observed deaths) is shown to be problematic. The time variation of mortality after a change in exposure is shown to depend on the processes by which the body can repair air pollution damage, in particular on their time constants. Hypothetical results are presented for repair models that are plausible in view of the available intervention studies of air pollution and of smoking cessation. If these repair models can also be assumed for acute effects, the results of cohort studies are compatible with those of time series. The proposed life expectancy framework provides information on the life expectancy change in time series studies, and it clarifies the relation between the results of time series, intervention, and cohort studies.

  1. Time Series Decomposition into Oscillation Components and Phase Estimation.

    PubMed

    Matsuda, Takeru; Komaki, Fumiyasu

    2017-02-01

    Many time series are naturally considered as a superposition of several oscillation components. For example, electroencephalogram (EEG) time series include oscillation components such as alpha, beta, and gamma. We propose a method for decomposing time series into such oscillation components using state-space models. Based on the concept of random frequency modulation, Gaussian linear state-space models for oscillation components are developed. In this model, the frequency of an oscillator fluctuates by noise. Time series decomposition is accomplished by this model in a manner similar to the Bayesian seasonal adjustment method. Since the model parameters are estimated from data by the empirical Bayes method, the amplitudes and frequencies of the oscillation components are determined in a data-driven manner. Also, the appropriate number of oscillation components is determined with the Akaike information criterion (AIC). In this way, the proposed method provides a natural decomposition of the given time series into oscillation components. In neuroscience, the phase of neural time series plays an important role in neural information processing. The proposed method can be used to estimate the phase of each oscillation component and has several advantages over a conventional method based on the Hilbert transform. Thus, the proposed method enables an investigation of the phase dynamics of time series. Numerical results show that the proposed method succeeds in extracting intermittent oscillations like ripples and in detecting phase reset phenomena. We apply the proposed method to real data from various fields such as astronomy, ecology, tidology, and neuroscience.
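
    The random-frequency-modulation idea, an oscillator whose angular frequency fluctuates by noise, can be simulated directly with a 2-D rotation state; the instantaneous phase is then read off the state vector. A sketch with illustrative parameter values (this is the generative model only, not the paper's empirical-Bayes estimation or AIC model selection):

```python
import numpy as np

rng = np.random.default_rng(4)
n, base_freq = 1000, 2 * np.pi / 50       # mean angular frequency (period 50)
phase_noise, obs_noise = 0.02, 0.1

state = np.array([1.0, 0.0])
observations, phases = [], []
for _ in range(n):
    w = base_freq + phase_noise * rng.standard_normal()  # frequency fluctuates
    c, s = np.cos(w), np.sin(w)
    state = np.array([[c, -s], [s, c]]) @ state          # rotate the oscillator
    observations.append(state[0] + obs_noise * rng.standard_normal())
    phases.append(np.arctan2(state[1], state[0]))        # instantaneous phase

observations = np.array(observations)
phases = np.array(phases)
# The spectrum peaks near the base frequency; `phases` tracks the noisy phase.
```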

  2. Coil-to-coil physiological noise correlations and their impact on functional MRI time-series signal-to-noise ratio.

    PubMed

    Triantafyllou, Christina; Polimeni, Jonathan R; Keil, Boris; Wald, Lawrence L

    2016-12-01

    Physiological nuisance fluctuations ("physiological noise") are a major contribution to the time-series signal-to-noise ratio (tSNR) of functional imaging. While thermal noise correlations between array coil elements have a well-characterized effect on the image signal-to-noise ratio (SNR_0), the element-to-element covariance matrix of the time-series fluctuations has not yet been analyzed. We examine this effect with a goal of ultimately improving the combination of multichannel array data. We extend the theoretical relationship between tSNR and SNR_0 to include a time-series noise covariance matrix Ψ_t, distinct from the thermal noise covariance matrix Ψ_0, and compare its structure to Ψ_0 and the signal coupling matrix SS^H formed from the signal intensity vectors S. Inclusion of the measured time-series noise covariance matrix into the model relating tSNR and SNR_0 improves the fit of experimental multichannel data and is shown to be distinct from Ψ_0 or SS^H. Time-series noise covariances in array coils are found to differ from Ψ_0 and, more surprisingly, from the signal coupling matrix SS^H. Correct characterization of the time-series noise has implications for the analysis of time-series data and for improving the coil element combination process. Magn Reson Med 76:1708-1719, 2016. © 2016 International Society for Magnetic Resonance in Medicine.
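
    The role of the noise covariance in array combination can be illustrated with the standard matched-filter result SNR = sqrt(S^H Ψ⁻¹ S): substituting a time-series covariance with a shared (physiological) component for a purely thermal, uncorrelated one lowers the achievable SNR. A numpy sketch with made-up values, not the paper's measured matrices:

```python
import numpy as np

def combined_snr(signal, noise_cov):
    """Optimal (matched-filter) array combination: SNR = sqrt(S^H Psi^-1 S)."""
    s = np.asarray(signal, dtype=complex)
    inv = np.linalg.inv(noise_cov)
    return float(np.sqrt((s.conj() @ inv @ s).real))

S = np.array([1.0, 0.8, 0.5])              # coil sensitivities at one voxel
psi_0 = np.eye(3) * 0.04                   # thermal noise: uncorrelated
psi_t = psi_0 + 0.05 * np.ones((3, 3))     # physiological: shared component
snr0 = combined_snr(S, psi_0)
tsnr = combined_snr(S, psi_t)
# Correlated physiological fluctuations lower the achievable time-series SNR.
```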

  3. Time series models on analysing mortality rates and acute childhood lymphoid leukaemia.

    PubMed

    Kis, Maria

    2005-01-01

    In this paper we demonstrate the application of time series models in medical research. Hungarian mortality rates were analysed by autoregressive integrated moving average (ARIMA) models, and seasonal time series models were used to examine data on acute childhood lymphoid leukaemia. The mortality data may be analysed by time series methods such as ARIMA modelling. This method is demonstrated by two examples: analysis of the mortality rates of ischemic heart diseases and analysis of the mortality rates of cancer of the digestive system. Mathematical expressions are given for the results of the analysis. The relationships between time series of mortality rates were studied with ARIMA models. Confidence intervals for the autoregressive parameters were calculated by three methods: estimation based on the standard normal distribution, estimation based on White's theory, and the continuous-time estimation. Analysing the confidence intervals of the first-order autoregressive parameters, we may conclude that the intervals obtained by the continuous-time estimation model were much smaller than the other estimates. We present a new approach to analysing the occurrence of acute childhood lymphoid leukaemia. We decompose the time series into components. The periodicity of acute childhood lymphoid leukaemia in Hungary was examined using the seasonal decomposition time series method. The cyclic trend of the dates of diagnosis revealed that a higher percentage of the peaks fell within the winter months than in the other seasons. This supports the seasonal occurrence of childhood leukaemia in Hungary.
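
    The comparison of confidence-interval methods for autoregressive parameters can be illustrated with the simplest case: a least-squares AR(1) fit and the standard normal-approximation interval. This is a sketch of the baseline method only, not the White-theory or continuous-time estimators the abstract compares it against:

```python
import numpy as np

def fit_ar1(series):
    """Least-squares AR(1) fit with a 95% normal-approximation CI for phi."""
    x = np.asarray(series, dtype=float)
    x = x - x.mean()
    phi = float(np.dot(x[:-1], x[1:]) / np.dot(x[:-1], x[:-1]))
    n = len(x) - 1
    se = np.sqrt((1.0 - phi ** 2) / n)      # asymptotic standard error for AR(1)
    return phi, (phi - 1.96 * se, phi + 1.96 * se)

rng = np.random.default_rng(5)
true_phi, n = 0.7, 2000
x = np.zeros(n)
for t in range(1, n):
    x[t] = true_phi * x[t - 1] + rng.standard_normal()
phi_hat, (lo, hi) = fit_ar1(x)
# With 2000 points the estimate lands close to 0.7 and the CI is narrow.
```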

  4. Analysis and generation of groundwater concentration time series

    NASA Astrophysics Data System (ADS)

    Crăciun, Maria; Vamoş, Călin; Suciu, Nicolae

    2018-01-01

    Concentration time series are provided by simulated concentrations of a nonreactive solute transported in groundwater, integrated over the transverse direction of a two-dimensional computational domain and recorded at the plume center of mass. The analysis of a statistical ensemble of time series reveals subtle features that are not captured by the first two moments which characterize the approximate Gaussian distribution of the two-dimensional concentration fields. The concentration time series exhibit a complex preasymptotic behavior driven by a nonstationary trend and correlated fluctuations with time-variable amplitude. Time series with almost the same statistics are generated by successively adding to a time-dependent trend a sum of linear regression terms, accounting for correlations between fluctuations around the trend and their increments in time, and terms of an amplitude modulated autoregressive noise of order one with time-varying parameter. The algorithm generalizes mixing models used in probability density function approaches. The well-known interaction by exchange with the mean mixing model is a special case consisting of a linear regression with constant coefficients.

  5. The short time Fourier transform and local signals

    NASA Astrophysics Data System (ADS)

    Okumura, Shuhei

    In this thesis, I examine the theoretical properties of the short time discrete Fourier transform (STFT). The STFT is obtained by applying the Fourier transform to the input series through a fixed-size moving window. The window is moved by one time point at a time, so the windows overlap. I present several theoretical properties of the STFT, applied to various types of complex-valued, univariate time series inputs, and give their outputs in closed form. In particular, just like the discrete Fourier transform, the STFT's modulus time series takes large positive values when the input is a periodic signal. One main result is that a white noise input yields an STFT output that is a complex-valued stationary time series, for which the time and time-frequency dependency structure, such as the cross-covariance functions, can be derived. Our primary focus is the detection of local periodic signals. I present a method to detect local signals by computing the probability that the squared-modulus STFT time series has a run of consecutive values exceeding a threshold, starting with an exceedance that follows an observation below the threshold. We discuss a method to reduce the computation of such probabilities using the Box-Cox transformation and the delta method, and show that it compares well with the Monte Carlo simulation method.
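
    The one-point-shift STFT and the threshold-run detection idea can be sketched with numpy's sliding windows. The rectangular window and threshold choice here are illustrative; the thesis' probability computations are not reproduced:

```python
import numpy as np

def stft_modulus(series, window_len):
    """Squared-modulus STFT with a rectangular window moved one point at a time."""
    x = np.asarray(series, dtype=float)
    frames = np.lib.stride_tricks.sliding_window_view(x, window_len)
    return np.abs(np.fft.rfft(frames, axis=1)) ** 2

rng = np.random.default_rng(6)
n, wlen = 600, 64
x = rng.standard_normal(n)
x[300:400] += 3 * np.sin(2 * np.pi * np.arange(100) / 8)  # local periodic burst

power = stft_modulus(x, wlen)
bin8 = wlen // 8                      # frequency bin matching the period-8 burst
detected = power[:, bin8] > 10 * np.median(power[:, bin8])
# A run of consecutive exceedances marks the time span of the local signal.
```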

  6. Monitoring of tissue ablation using time series of ultrasound RF data.

    PubMed

    Imani, Farhad; Wu, Mark Z; Lasso, Andras; Burdette, Everett C; Daoud, Mohammad; Fitchinger, Gabor; Abolmaesumi, Purang; Mousavi, Parvin

    2011-01-01

    This paper is the first report on the monitoring of tissue ablation using ultrasound RF echo time series. We calculate frequency- and time-domain features of time series of RF echoes acquired from stationary tissue with a stationary transducer, and correlate them with ablated and non-ablated tissue properties. We combine these features in a nonlinear classification framework and demonstrate up to 99% classification accuracy in distinguishing ablated and non-ablated regions of tissue, in areas as small as 12 mm² in size. We also demonstrate significant improvement in ablated tissue classification using RF time series compared to the conventional approach of using single RF scan lines. The results of this study suggest RF echo time series as a promising approach for monitoring ablation and capturing the changes in tissue microstructure that result from heat-induced necrosis.

  7. A harmonic linear dynamical system for prominent ECG feature extraction.

    PubMed

    Thi, Ngoc Anh Nguyen; Yang, Hyung-Jeong; Kim, SunHee; Do, Luu Ngoc

    2014-01-01

    Unsupervised mining of electrocardiography (ECG) time series is a crucial task in biomedical applications. To obtain efficient clustering results, the prominent features extracted by preprocessing analysis of multiple ECG time series need to be investigated. In this paper, a Harmonic Linear Dynamical System is applied to discover vital prominent features by mining the evolving hidden dynamics and correlations in ECG time series. The comprehensible and interpretable features discovered by the proposed feature extraction methodology effectively support the accuracy and reliability of the clustering results. In particular, the empirical evaluation results of the proposed method demonstrate improved clustering performance compared to previous mainstream feature extraction approaches for ECG time series clustering tasks. Furthermore, the experimental results on real-world datasets show scalability, with computation time linear in the duration of the time series.

  8. Modelling fourier regression for time series data- a case study: modelling inflation in foods sector in Indonesia

    NASA Astrophysics Data System (ADS)

    Prahutama, Alan; Suparti; Wahyu Utami, Tiani

    2018-03-01

    Regression analysis models the relationship between response variables and predictor variables. The parametric approach to regression modelling is very strict with its assumptions, whereas nonparametric regression models require no assumptions about the form of the model. Time series data are observations of a variable recorded over time, so if time series data are to be modelled by regression, the response and predictor variables must be determined first. The response variable in a time series is the variable at time t (y_t), while the predictor variables are its significant lags. In nonparametric regression modelling, one developing approach is the Fourier series approach. One of the advantages of the nonparametric regression approach using Fourier series is its ability to handle data with a trigonometric (periodic) pattern. Modelling with Fourier series requires a parameter K, whose value can be determined by the Generalized Cross Validation method. In modelling inflation for the transportation, communication, and financial services sector, the Fourier series yields an optimal K of 120 parameters with an R-square of 99%, whereas multiple linear regression yields an R-square of 90%.
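
    A Fourier series regression with K chosen by generalized cross-validation amounts to ordinary least squares on a trigonometric design matrix. In the sketch below the data are synthetic, the period is assumed known, and GCV takes the textbook form n*RSS/(n - trace(H))^2:

```python
import numpy as np

def fourier_design(t, K, period):
    """Design matrix: intercept plus K sine/cosine harmonic pairs."""
    cols = [np.ones_like(t)]
    for k in range(1, K + 1):
        cols.append(np.sin(2 * np.pi * k * t / period))
        cols.append(np.cos(2 * np.pi * k * t / period))
    return np.column_stack(cols)

def gcv_score(y, X):
    """Generalized cross-validation score: n * RSS / (n - trace(H))^2."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    n, p = X.shape           # for OLS, trace of the hat matrix equals p
    return n * np.sum(resid ** 2) / (n - p) ** 2

rng = np.random.default_rng(7)
t = np.arange(120.0)
y = (1.0 + np.sin(2 * np.pi * t / 12) + 0.5 * np.cos(4 * np.pi * t / 12)
     + 0.2 * rng.standard_normal(120))
scores = {K: gcv_score(y, fourier_design(t, K, 12)) for K in range(1, 6)}
best_K = min(scores, key=scores.get)
# GCV penalizes the too-small K = 1, which misses the second harmonic.
```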

  9. Improving forecasting accuracy of medium and long-term runoff using artificial neural network based on EEMD decomposition.

    PubMed

    Wang, Wen-chuan; Chau, Kwok-wing; Qiu, Lin; Chen, Yang-bo

    2015-05-01

    Hydrological time series forecasting is one of the most important applications in modern hydrology, especially for effective reservoir management. In this research, an artificial neural network (ANN) model coupled with ensemble empirical mode decomposition (EEMD) is presented for forecasting medium and long-term runoff time series. First, the original runoff time series is decomposed into a finite and often small number of intrinsic mode functions (IMFs) and a residual series using the EEMD technique, for attaining deeper insight into the data characteristics. Then all IMF components and the residue are predicted, respectively, through appropriate ANN models. Finally, the forecasted results of the modeled IMFs and residual series are summed to formulate an ensemble forecast for the original annual runoff series. Two annual reservoir runoff time series, from Biuliuhe and Mopanshan in China, are investigated using the developed model based on four performance evaluation measures (RMSE, MAPE, R and NSEC). The results obtained in this work indicate that EEMD can effectively enhance forecasting accuracy and that the proposed EEMD-ANN model attains significant improvement over the ANN approach in medium and long-term runoff time series forecasting. Copyright © 2015 Elsevier Inc. All rights reserved.
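    The decompose-forecast-recombine pattern can be sketched as follows. This is only a schematic: a moving-average split stands in for EEMD (in practice one would use an EEMD implementation such as the PyEMD package, which yields several IMFs), and an AR(1) least-squares fit stands in for the per-component ANN models.

```python
import numpy as np

def moving_average_split(x, window=5):
    """Stand-in decomposition: a smooth 'residue' plus a fast component.
    (The paper uses EEMD to obtain several IMFs instead of this split.)"""
    kernel = np.ones(window) / window
    pad = np.pad(x, (window // 2, window - 1 - window // 2), mode="edge")
    trend = np.convolve(pad, kernel, mode="valid")
    return [x - trend, trend]  # [fast component, residue]

def ar1_forecast(component, horizon):
    """Fit x_t = a*x_{t-1} + b by least squares, then iterate forward."""
    x0, x1 = component[:-1], component[1:]
    A = np.column_stack([x0, np.ones_like(x0)])
    (a, b), *_ = np.linalg.lstsq(A, x1, rcond=None)
    preds, last = [], component[-1]
    for _ in range(horizon):
        last = a * last + b
        preds.append(last)
    return np.array(preds)

def ensemble_forecast(x, horizon=3):
    """Forecast each component separately and sum, as in the EEMD-ANN scheme."""
    return sum(ar1_forecast(c, horizon) for c in moving_average_split(x))
```

    The key property preserved here is additivity: the components sum back to the original series, so the component forecasts can simply be summed to form the ensemble forecast.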

  10. Glossary-HDSC/OWP

    Science.gov Websites

    Glossary Precipitation Frequency Data Server GIS Grids Maps Time Series Temporals Documents Probable provides a measure of the average time between years (and not events) in which a particular value is RECURRENCE INTERVAL). ANNUAL MAXIMUM SERIES (AMS) - Time series of the largest precipitation amounts in a

  11. Using Time-Series Regression to Predict Academic Library Circulations.

    ERIC Educational Resources Information Center

    Brooks, Terrence A.

    1984-01-01

    Four methods were used to forecast monthly circulation totals in 15 midwestern academic libraries: dummy time-series regression, lagged time-series regression, simple average (straight-line forecasting), monthly average (naive forecasting). In tests of forecasting accuracy, dummy regression method and monthly mean method exhibited smallest average…

  12. Use of Time-Series, ARIMA Designs to Assess Program Efficacy.

    ERIC Educational Resources Information Center

    Braden, Jeffery P.; And Others

    1990-01-01

    Illustrates use of time-series designs for determining efficacy of interventions with fictitious data describing drug-abuse prevention program. Discusses problems and procedures associated with time-series data analysis using Auto Regressive Integrated Moving Averages (ARIMA) models. Example illustrates application of ARIMA analysis for…

  13. Conditional heteroscedasticity as a leading indicator of ecological regime shifts.

    PubMed

    Seekell, David A; Carpenter, Stephen R; Pace, Michael L

    2011-10-01

    Regime shifts are massive, often irreversible, rearrangements of nonlinear ecological processes that occur when systems pass critical transition points. Ecological regime shifts sometimes have severe consequences for human well-being, including eutrophication in lakes, desertification, and species extinctions. Theoretical and laboratory evidence suggests that statistical anomalies may be detectable leading indicators of regime shifts in ecological time series, making it possible to foresee and potentially avert incipient regime shifts. Conditional heteroscedasticity is persistent variance characteristic of time series with clustered volatility. Here, we analyze conditional heteroscedasticity as a potential leading indicator of regime shifts in ecological time series. We evaluate conditional heteroscedasticity by using ecological models with and without four types of critical transition. On approaching transition points, all time series contain significant conditional heteroscedasticity. This signal is detected hundreds of time steps in advance of the regime shift. Time series without regime shifts do not have significant conditional heteroscedasticity. Because probability values are easily associated with tests for conditional heteroscedasticity, detection of false positives in time series without regime shifts is minimized. This property reduces the need for a reference system to compare with the perturbed system.
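    Conditional heteroscedasticity of the kind described above is commonly detected with Engle's Lagrange-multiplier (ARCH-LM) statistic: regress squared values on their own lags and use n·R², which is asymptotically chi-squared with one degree of freedom per lag. A minimal version (not the authors' exact rolling-window procedure; `statsmodels` provides a full implementation as `het_arch`):

```python
import numpy as np

def arch_lm_stat(residuals, lags=1):
    """Engle's ARCH-LM statistic: regress squared residuals on their own
    lags; LM = n * R^2, asymptotically chi2(lags) under homoscedasticity."""
    e2 = np.asarray(residuals, dtype=float) ** 2
    y = e2[lags:]
    X = np.column_stack([np.ones(len(y))] +
                        [e2[lags - k - 1: len(e2) - k - 1] for k in range(lags)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    ss_res = np.sum((y - X @ beta) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return len(y) * (1.0 - ss_res / ss_tot)
```

    A series with clustered volatility produces a large statistic, while an i.i.d. series stays near the chi-squared null, which is why a p-value is easy to attach, the property the abstract highlights for controlling false positives.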

  14. Estimating rainfall time series and model parameter distributions using model data reduction and inversion techniques

    NASA Astrophysics Data System (ADS)

    Wright, Ashley J.; Walker, Jeffrey P.; Pauwels, Valentijn R. N.

    2017-08-01

    Floods are devastating natural hazards. To provide accurate, precise, and timely flood forecasts, there is a need to understand the uncertainties associated within an entire rainfall time series, even when rainfall was not observed. The estimation of an entire rainfall time series and model parameter distributions from streamflow observations in complex dynamic catchments adds skill to current areal rainfall estimation methods, allows for the uncertainty of entire rainfall input time series to be considered when estimating model parameters, and provides the ability to improve rainfall estimates from poorly gauged catchments. Current methods to estimate entire rainfall time series from streamflow records are unable to adequately invert complex nonlinear hydrologic systems. This study aims to explore the use of wavelets in the estimation of rainfall time series from streamflow records. Using the Discrete Wavelet Transform (DWT) to reduce rainfall dimensionality for the catchment of Warwick, Queensland, Australia, it is shown that model parameter distributions and an entire rainfall time series can be estimated. Including rainfall in the estimation process improves streamflow simulations by a factor of up to 1.78. This is achieved while estimating an entire rainfall time series, inclusive of days when none was observed. It is shown that the choice of wavelet can have a considerable impact on the robustness of the inversion. Combining the use of a likelihood function that considers rainfall and streamflow errors with the use of the DWT as a model data reduction technique allows the joint inference of hydrologic model parameters along with rainfall.

  15. CauseMap: fast inference of causality from complex time series.

    PubMed

    Maher, M Cyrus; Hernandez, Ryan D

    2015-01-01

    Background. Establishing health-related causal relationships is a central pursuit in biomedical research. Yet, the interdependent non-linearity of biological systems renders causal dynamics laborious and at times impractical to disentangle. This pursuit is further impeded by the dearth of time series that are sufficiently long to observe and understand recurrent patterns of flux. However, as data generation costs plummet and technologies like wearable devices democratize data collection, we anticipate a coming surge in the availability of biomedically-relevant time series data. Given the life-saving potential of these burgeoning resources, it is critical to invest in the development of open source software tools that are capable of drawing meaningful insight from vast amounts of time series data. Results. Here we present CauseMap, the first open source implementation of convergent cross mapping (CCM), a method for establishing causality from long time series data (≳25 observations). Compared to existing time series methods, CCM has the advantage of being model-free and robust to unmeasured confounding that could otherwise induce spurious associations. CCM builds on Takens' Theorem, a well-established result from dynamical systems theory that requires only mild assumptions. This theorem allows us to reconstruct high-dimensional system dynamics using a time series of only a single variable. These reconstructions can be thought of as shadows of the true causal system. If reconstructed shadows can predict points from opposing time series, we can infer that the corresponding variables are providing views of the same causal system, and so are causally related. Unlike traditional metrics, this test can establish the directionality of causation, even in the presence of feedback loops. Furthermore, since CCM can extract causal relationships from time series of, e.g., a single individual, it may be a valuable tool for personalized medicine. We implement CCM in Julia, a high-performance programming language designed for facile technical computing. Our software package, CauseMap, is platform-independent and freely available as an official Julia package. Conclusions. CauseMap is an efficient implementation of a state-of-the-art algorithm for detecting causality from time series data. We believe this tool will be a valuable resource for biomedical research and personalized medicine.
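    The cross-mapping step can be sketched in a few lines. This is a simplified illustration, not the CauseMap Julia code: it uses a fixed time-delay embedding, exponential nearest-neighbour weights, and Pearson correlation as the prediction skill; the embedding dimension and library handling are deliberately minimal.

```python
import numpy as np

def embed(x, E, tau=1):
    """Takens time-delay embedding: row t is (x_t, x_{t-tau}, ..., x_{t-(E-1)tau})."""
    start = (E - 1) * tau
    return np.column_stack([x[start - j * tau: len(x) - j * tau] for j in range(E)])

def cross_map_skill(source, target, E=2, tau=1):
    """Predict `target` from the shadow manifold of `source` (simplified CCM):
    skill is the correlation between neighbour-weighted predictions and truth."""
    M = embed(np.asarray(source, float), E, tau)
    t_aligned = np.asarray(target, float)[(E - 1) * tau:]
    preds = np.empty(len(M))
    for i, p in enumerate(M):
        d = np.linalg.norm(M - p, axis=1)
        d[i] = np.inf                       # exclude the point itself
        nn = np.argsort(d)[: E + 1]         # E+1 nearest neighbours
        w = np.exp(-d[nn] / max(d[nn][0], 1e-12))
        preds[i] = np.sum(w * t_aligned[nn]) / np.sum(w)
    return np.corrcoef(preds, t_aligned)[0, 1]
```

    For deterministically coupled series the skill rises toward one as library length grows (the "convergence" in CCM), whereas for independent noise it hovers near zero.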

  16. Classification of biosensor time series using dynamic time warping: applications in screening cancer cells with characteristic biomarkers.

    PubMed

    Rai, Shesh N; Trainor, Patrick J; Khosravi, Farhad; Kloecker, Goetz; Panchapakesan, Balaji

    2016-01-01

    The development of biosensors that produce time series data will facilitate improvements in biomedical diagnostics and in personalized medicine. The time series produced by these devices often contains characteristic features arising from biochemical interactions between the sample and the sensor. To use such characteristic features for determining sample class, similarity-based classifiers can be utilized. However, the construction of such classifiers is complicated by the variability in the time domains of such series that renders the traditional distance metrics such as Euclidean distance ineffective in distinguishing between biological variance and time domain variance. The dynamic time warping (DTW) algorithm is a sequence alignment algorithm that can be used to align two or more series to facilitate quantifying similarity. In this article, we evaluated the performance of DTW distance-based similarity classifiers for classifying time series that mimics electrical signals produced by nanotube biosensors. Simulation studies demonstrated the positive performance of such classifiers in discriminating between time series containing characteristic features that are obscured by noise in the intensity and time domains. We then applied a DTW distance-based k-nearest neighbors classifier to distinguish the presence/absence of a mesenchymal biomarker in cancer cells in buffy coats in a blinded test. Using a train-test approach, we find that the classifier had high sensitivity (90.9%) and specificity (81.8%) in differentiating between EpCAM-positive MCF7 cells spiked in buffy coats and those in plain buffy coats.
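    The core of such a classifier is the DTW recurrence D(i,j) = cost(i,j) + min(D(i-1,j), D(i,j-1), D(i-1,j-1)). A minimal version with a nearest-neighbour vote (illustrative only; the study's data, window constraints and tuning are not reproduced):

```python
import numpy as np

def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) dynamic time warping distance."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def knn_classify(query, train_series, train_labels, k=1):
    """k-nearest-neighbour classification under DTW distance."""
    order = sorted(range(len(train_series)),
                   key=lambda i: dtw_distance(query, train_series[i]))
    votes = [train_labels[i] for i in order[:k]]
    return max(set(votes), key=votes.count)
```

    Because the warping path can stretch and compress the time axis, a time-shifted copy of a template stays close to it under DTW even when its Euclidean distance is large, which is exactly the robustness to time-domain variance the abstract describes.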

  17. Forecasting Hourly Water Demands With Seasonal Autoregressive Models for Real-Time Application

    NASA Astrophysics Data System (ADS)

    Chen, Jinduan; Boccelli, Dominic L.

    2018-02-01

    Consumer water demands are not typically measured at temporal or spatial scales adequate to support real-time decision making, and recent approaches for estimating unobserved demands using observed hydraulic measurements are generally not capable of forecasting demands and uncertainty information. While time series modeling has shown promise for representing total system demands, these models have generally not been evaluated at spatial scales appropriate for representative real-time modeling. This study investigates the use of a double-seasonal time series model to capture daily and weekly autocorrelations to both total system demands and regional aggregated demands at a scale that would capture demand variability across a distribution system. Emphasis was placed on the ability to forecast demands and quantify uncertainties with results compared to traditional time series pattern-based demand models as well as nonseasonal and single-seasonal time series models. Additional research included the implementation of an adaptive-parameter estimation scheme to update the time series model when unobserved changes occurred in the system. For two case studies, results showed that (1) for the smaller-scale aggregated water demands, the log-transformed time series model resulted in improved forecasts, (2) the double-seasonal model outperformed other models in terms of forecasting errors, and (3) the adaptive adjustment of parameters during forecasting improved the accuracy of the generated prediction intervals. These results illustrate the capabilities of time series modeling to forecast both water demands and uncertainty estimates at spatial scales commensurate for real-time modeling applications and provide a foundation for developing a real-time integrated demand-hydraulic model.
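    The idea of capturing daily and weekly autocorrelations can be sketched with a least-squares autoregression on selected seasonal lags. This is a simplified stand-in, not the study's double-seasonal model with adaptive parameter estimation; the hourly lags (1, 24, 168) are the natural choices for hourly data with daily and weekly cycles.

```python
import numpy as np

def seasonal_ar_fit(y, lags=(1, 24, 168)):
    """Least-squares AR fit on a chosen set of (seasonal) lags."""
    p = max(lags)
    X = np.column_stack([np.ones(len(y) - p)] +
                        [y[p - L: len(y) - L] for L in lags])
    beta, *_ = np.linalg.lstsq(X, y[p:], rcond=None)
    return beta

def seasonal_ar_forecast(y, beta, steps, lags=(1, 24, 168)):
    """Iterate the fitted relation forward, feeding forecasts back in."""
    hist = list(y)
    for _ in range(steps):
        x = [1.0] + [hist[-L] for L in lags]
        hist.append(float(np.dot(beta, x)))
    return np.array(hist[len(y):])
```

    For a purely periodic demand pattern the seasonal lag carries all the signal and the forecast simply continues the cycle; on real demand data the lag-1 term picks up the short-range autocorrelation the seasonal terms miss.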

  18. More on Time Series Designs: A Reanalysis of Mayer and Kozlow's Data.

    ERIC Educational Resources Information Center

    Willson, Victor L.

    1982-01-01

    Differentiating between time-series design and time-series analysis, examines design considerations and reanalyzes data previously reported by Mayer and Kozlow in this journal. The current analysis supports the analysis performed by Mayer and Kozlow but puts the results on a somewhat firmer statistical footing. (Author/JN)

  19. 76 FR 27637 - Supplemental Priorities for Discretionary Grant Programs

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-12

    .... Interrupted time series design means a type of quasi-experimental study (as defined in this notice) in which... design is an adaptation of an interrupted time series design that relies on the comparison of treatment... notice), interrupted time series designs (as defined in this notice), or regression discontinuity designs...

  20. 75 FR 47284 - Secretary's Priorities for Discretionary Grant Programs

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-05

    ... the most currently available data. Interrupted time series design \\4\\ means a type of quasi... findings. \\4\\ A single subject or single case design is an adaptation of an interrupted time series design...), interrupted time series designs (as defined in this notice), or regression discontinuity designs (as defined...

  1. 75 FR 37390 - Caribbean Fishery Management Council; Public Hearings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-29

    ...; rather, all are calculated based on landings data averaged over alternative time series. The overfished... the USVI, and recreational landings data recorded during 2000-2001. These time series were considered... Calculated Based on the Alternative Time Series Described in Section 4.2.1. Also Included Are the Average...

  2. An approach to constructing a homogeneous time series of soil moisture using SMOS

    USDA-ARS?s Scientific Manuscript database

    Overlapping soil moisture time series derived from two satellite microwave radiometers (SMOS, Soil Moisture and Ocean Salinity; AMSR-E, Advanced Microwave Scanning Radiometer - Earth Observing System) are used to generate a soil moisture time series from 2003 to 2010. Two statistical methodologies f...

  3. Time Series Econometrics for the 21st Century

    ERIC Educational Resources Information Center

    Hansen, Bruce E.

    2017-01-01

    The field of econometrics largely started with time series analysis because many early datasets were time-series macroeconomic data. As the field developed, more cross-sectional and longitudinal datasets were collected, which today dominate the majority of academic empirical research. In nonacademic (private sector, central bank, and governmental)…

  4. Forecasting Jakarta composite index (IHSG) based on chen fuzzy time series and firefly clustering algorithm

    NASA Astrophysics Data System (ADS)

    Ningrum, R. W.; Surarso, B.; Farikhin; Safarudin, Y. M.

    2018-03-01

    This paper proposes combining the Firefly Algorithm (FA) with Chen fuzzy time series forecasting. Most existing fuzzy forecasting methods based on fuzzy time series use static interval lengths. We therefore apply an artificial intelligence technique, the Firefly Algorithm (FA), to set non-stationary interval lengths for each cluster in the Chen method. The method is evaluated by applying it to the Jakarta Composite Index (IHSG) and comparing it with classical Chen fuzzy time series forecasting. Its performance is verified through simulation using Matlab.

  5. Financial Time-series Analysis: a Brief Overview

    NASA Astrophysics Data System (ADS)

    Chakraborti, A.; Patriarca, M.; Santhanam, M. S.

    Prices of commodities or assets produce what is called a time series. Different kinds of financial time series have been recorded and studied for decades. Nowadays, all transactions on a financial market are recorded, leading to a huge amount of data available, either freely on the Internet or commercially. Financial time-series analysis is of great interest to practitioners as well as theoreticians for making inferences and predictions. Furthermore, the stochastic uncertainties inherent in financial time series and the theory needed to deal with them make the subject especially interesting not only to economists, but also to statisticians and physicists [1]. While it would be a formidable task to make an exhaustive review of the topic, with this review we try to give a flavor of some of its aspects.

  6. Parametric, nonparametric and parametric modelling of a chaotic circuit time series

    NASA Astrophysics Data System (ADS)

    Timmer, J.; Rust, H.; Horbelt, W.; Voss, H. U.

    2000-09-01

    The determination of a differential equation underlying a measured time series is a frequently arising task in nonlinear time series analysis. In the validation of a proposed model one often faces the dilemma that it is hard to decide whether possible discrepancies between the time series and model output are caused by an inappropriate model or by bad estimates of parameters in a correct type of model, or both. We propose a combination of parametric modelling based on Bock's multiple shooting algorithm and nonparametric modelling based on optimal transformations as a strategy to test proposed models and if rejected suggest and test new ones. We exemplify this strategy on an experimental time series from a chaotic circuit where we obtain an extremely accurate reconstruction of the observed attractor.

  7. A systematic review on the use of time series data in the study of antimicrobial consumption and Pseudomonas aeruginosa resistance.

    PubMed

    Athanasiou, Christos I; Kopsini, Angeliki

    2018-06-12

    In the field of antimicrobial resistance, the number of studies that use time series data has increased recently. The purpose of this study is a systematic review of all studies of antibacterial consumption and of Pseudomonas aeruginosa resistance in healthcare settings that have used time series data. A systematic review of the literature up to June 2017 was conducted. All studies that used time series data and examined in-hospital antibiotic consumption and Ps. aeruginosa resistance rates or incidence were eligible. No other exclusion criteria were applied. Data on the structure, terminology, methods and results of each article were recorded and analyzed where possible. A total of thirty-six studies were retrieved, twenty-three of which met our criteria. Thirteen of them were quasi-experimental studies and ten were ecological observational studies. Eighteen studies collected time series data for both parameters, and the statistical methodology of "time series analysis" was applied in nine studies. Most of the studies were published in the last eight years. The interrupted time series design was the most widespread. As expected, there was high heterogeneity with regard to study design, terminology and statistical methods. Copyright © 2018. Published by Elsevier Ltd.

  8. Robust extrema features for time-series data analysis.

    PubMed

    Vemulapalli, Pramod K; Monga, Vishal; Brennan, Sean N

    2013-06-01

    The extraction of robust features for comparing and analyzing time series is a fundamentally important problem. Research efforts in this area encompass dimensionality reduction using popular signal analysis tools such as the discrete Fourier and wavelet transforms, various distance metrics, and the extraction of interest points from time series. Recently, extrema features for analysis of time-series data have assumed increasing significance because of their natural robustness under a variety of practical distortions, their economy of representation, and their computational benefits. Invariably, the process of encoding extrema features is preceded by filtering of the time series with an intuitively motivated filter (e.g., for smoothing), and subsequent thresholding to identify robust extrema. We define the properties of robustness, uniqueness, and cardinality as a means to identify the design choices available in each step of the feature generation process. Unlike existing methods, which utilize filters "inspired" from either domain knowledge or intuition, we explicitly optimize the filter based on training time series to optimize robustness of the extracted extrema features. We demonstrate further that the underlying filter optimization problem reduces to an eigenvalue problem and has a tractable solution. An encoding technique that enhances control over cardinality and uniqueness is also presented. Experimental results obtained for the problem of time series subsequence matching establish the merits of the proposed algorithm.

  9. Time series analysis of InSAR data: Methods and trends

    NASA Astrophysics Data System (ADS)

    Osmanoğlu, Batuhan; Sunar, Filiz; Wdowinski, Shimon; Cabral-Cano, Enrique

    2016-05-01

    Time series analysis of InSAR data has emerged as an important tool for monitoring and measuring the displacement of the Earth's surface. Changes in the Earth's surface can result from a wide range of phenomena such as earthquakes, volcanoes, landslides, variations in ground water levels, and changes in wetland water levels. Time series analysis is applied to interferometric phase measurements, which wrap around when the observed motion is larger than one-half of the radar wavelength. Thus, the spatio-temporal "unwrapping" of phase observations is necessary to obtain physically meaningful results. Several different algorithms have been developed for time series analysis of InSAR data to solve for this ambiguity. These algorithms may employ different models for time series analysis, but they all generate a first-order deformation rate, which can be compared to each other. However, there is no single algorithm that can provide optimal results in all cases. Since time series analyses of InSAR data are used in a variety of applications with different characteristics, each algorithm possesses inherently unique strengths and weaknesses. In this review article, following a brief overview of InSAR technology, we discuss several algorithms developed for time series analysis of InSAR data using an example set of results for measuring subsidence rates in Mexico City.

  10. Inference of scale-free networks from gene expression time series.

    PubMed

    Daisuke, Tominaga; Horton, Paul

    2006-04-01

    Quantitative time-series observation of gene expression is becoming possible, for example by cell array technology. However, there are no practical methods with which to infer network structures using only observed time-series data. As most computational models of biological networks for continuous time-series data have a high degree of freedom, it is almost impossible to infer the correct structures. On the other hand, it has been reported that some kinds of biological networks, such as gene networks and metabolic pathways, may have scale-free properties. We hypothesize that the architecture of inferred biological network models can be restricted to scale-free networks. We developed an inference algorithm for biological networks using only time-series data by introducing such a restriction. We adopt the S-system as the network model, and a distributed genetic algorithm to optimize models to fit its simulated results to observed time series data. We have tested our algorithm on a case study (simulated data). We compared optimization under no restriction, which allows for a fully connected network, and under the restriction that the total number of links must equal that expected from a scale free network. The restriction reduced both false positive and false negative estimation of the links and also the differences between model simulation and the given time-series data.

  11. Time Series Analysis of Insar Data: Methods and Trends

    NASA Technical Reports Server (NTRS)

    Osmanoglu, Batuhan; Sunar, Filiz; Wdowinski, Shimon; Cano-Cabral, Enrique

    2015-01-01

    Time series analysis of InSAR data has emerged as an important tool for monitoring and measuring the displacement of the Earth's surface. Changes in the Earth's surface can result from a wide range of phenomena such as earthquakes, volcanoes, landslides, variations in ground water levels, and changes in wetland water levels. Time series analysis is applied to interferometric phase measurements, which wrap around when the observed motion is larger than one-half of the radar wavelength. Thus, the spatio-temporal "unwrapping" of phase observations is necessary to obtain physically meaningful results. Several different algorithms have been developed for time series analysis of InSAR data to solve for this ambiguity. These algorithms may employ different models for time series analysis, but they all generate a first-order deformation rate, which can be compared to each other. However, there is no single algorithm that can provide optimal results in all cases. Since time series analyses of InSAR data are used in a variety of applications with different characteristics, each algorithm possesses inherently unique strengths and weaknesses. In this review article, following a brief overview of InSAR technology, we discuss several algorithms developed for time series analysis of InSAR data using an example set of results for measuring subsidence rates in Mexico City.

  12. Characterizing artifacts in RR stress test time series.

    PubMed

    Astudillo-Salinas, Fabian; Palacio-Baus, Kenneth; Solano-Quinde, Lizandro; Medina, Ruben; Wong, Sara

    2016-08-01

    Electrocardiographic stress test records contain many artifacts. In this paper we explore a simple method to characterize the amount of artifact present in unprocessed RR stress test time series. Four time series classes were defined: Very good lead, Good lead, Low quality lead and Useless lead. 65 eight-lead ECG stress test records were analyzed. First, the RR time series were annotated by two experts. The automatic methodology is based on dividing the RR time series into non-overlapping windows. Each window is marked as noisy whenever it exceeds an established standard deviation threshold (SDT). Series are classified according to the percentage of windows that exceeds a given value, based upon the first manual annotation. Different SDTs were explored. Results show that an SDT close to 20% (as a percentage of the mean) provides the best results. The agreement between the annotators' classifications is 70.77%, whereas the agreement between the second annotator and the best-matching automatic method is larger than 63%. Leads classified as Very good leads and Good leads could be combined to improve automatic heartbeat labeling.
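    The windowing scheme can be sketched as follows. The window size, the 20% SDT, and the class cut-offs used here are illustrative placeholders, not the paper's tuned values (which are calibrated against the expert annotations).

```python
import numpy as np

def classify_rr_quality(rr, window=30, sd_threshold=0.20, grades=(0.05, 0.15, 0.40)):
    """Grade an RR series by the fraction of non-overlapping windows whose
    standard deviation exceeds sd_threshold * overall mean RR interval."""
    rr = np.asarray(rr, dtype=float)
    n_win = len(rr) // window
    limit = sd_threshold * rr.mean()
    noisy = sum(rr[i * window:(i + 1) * window].std() > limit for i in range(n_win))
    frac = noisy / n_win
    labels = ("Very good lead", "Good lead", "Low quality lead", "Useless lead")
    for cutoff, label in zip(grades, labels):
        if frac <= cutoff:
            return label
    return labels[-1]
```

    A series with window-level variability well below 20% of its mean lands in the top class; one dominated by artifact windows falls through every cut-off to "Useless lead".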

  13. On statistical inference in time series analysis of the evolution of road safety.

    PubMed

    Commandeur, Jacques J F; Bijleveld, Frits D; Bergel-Hayat, Ruth; Antoniou, Constantinos; Yannis, George; Papadimitriou, Eleonora

    2013-11-01

    Data collected for building a road safety observatory usually include observations made sequentially through time. Examples of such data, called time series data, include annual (or monthly) numbers of road traffic accidents, traffic fatalities or vehicle kilometers driven in a country, as well as the corresponding values of safety performance indicators (e.g., data on speeding, seat belt use, alcohol use, etc.). Some commonly used statistical techniques imply assumptions that are often violated by the special properties of time series data, namely serial dependency among disturbances associated with the observations. The first objective of this paper is to demonstrate the impact of such violations on the applicability of standard methods of statistical inference, which leads to an under- or overestimation of the standard error and consequently may produce erroneous inferences. Moreover, having established the adverse consequences of ignoring serial dependency issues, the paper aims to describe rigorous statistical techniques used to overcome them. In particular, appropriate time series analysis techniques of varying complexity are employed to describe the development over time, relating the accident-occurrences to explanatory factors such as exposure measures or safety performance indicators, and forecasting the development into the near future. Traditional regression models (whether linear, generalized linear or nonlinear) are shown not to naturally capture the inherent dependencies in time series data. Dedicated time series analysis techniques, such as the ARMA-type and DRAG approaches, are discussed next, followed by structural time series models, which are a subclass of state space methods. The paper concludes with general recommendations and practice guidelines for the use of time series models in road safety research. Copyright © 2012 Elsevier Ltd. All rights reserved.

  14. Estimation of coupling between time-delay systems from time series

    NASA Astrophysics Data System (ADS)

    Prokhorov, M. D.; Ponomarenko, V. I.

    2005-07-01

    We propose a method for estimation of coupling between the systems governed by scalar time-delay differential equations of the Mackey-Glass type from the observed time series data. The method allows one to detect the presence of certain types of linear coupling between two time-delay systems, to define the type, strength, and direction of coupling, and to recover the model equations of coupled time-delay systems from chaotic time series corrupted by noise. We verify our method using both numerical and experimental data.

  15. A study of stationarity in time series by using wavelet transform

    NASA Astrophysics Data System (ADS)

    Dghais, Amel Abdoullah Ahmed; Ismail, Mohd Tahir

    2014-07-01

    In this work the core objective is to apply discrete wavelet transform (DWT) functions, namely the Haar, Daubechies, Symmlet, Coiflet and discrete approximation of the Meyer wavelets, to non-stationary financial time series data from the US stock market (DJIA30). The data consist of 2048 daily closing-index values starting from December 17, 2004 until October 23, 2012. The unit root test shows that the data are non-stationary in the level. To study the stationarity of a time series, the autocorrelation function (ACF) is used. Results indicate that the Haar function yields the least noisy series compared with the Daubechies, Symmlet, Coiflet and discrete approximation of the Meyer wavelets. In addition, the original data after decomposition by DWT are less noisy than the return time series after decomposition by DWT.
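    One level of the Haar DWT, the simplest of the wavelet families compared above, can be written directly; in practice packages such as PyWavelets expose all of these families (e.g. `pywt.dwt(x, 'haar')`). This minimal sketch shows the smooth/detail split and its exact inverse.

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar DWT: approximation (smooth) and
    detail (fluctuation) coefficients from pairwise sums/differences."""
    x = np.asarray(x, dtype=float)
    if len(x) % 2:
        x = x[:-1]  # Haar pairs samples, so drop a trailing odd sample
    s = np.sqrt(2.0)
    approx = (x[0::2] + x[1::2]) / s
    detail = (x[0::2] - x[1::2]) / s
    return approx, detail

def haar_idwt(approx, detail):
    """Exact inverse of one Haar DWT level."""
    s = np.sqrt(2.0)
    x = np.empty(2 * len(approx))
    x[0::2] = (approx + detail) / s
    x[1::2] = (approx - detail) / s
    return x
```

    The detail coefficients carry the "noisy" fluctuation content discussed in the abstract, while the approximation carries the level; deeper decompositions repeat the split on the approximation.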

  16. A KST framework for correlation network construction from time series signals

    NASA Astrophysics Data System (ADS)

    Qi, Jin-Peng; Gu, Quan; Zhu, Ying; Zhang, Ping

    2018-04-01

    A KST (Kolmogorov-Smirnov test and T statistic) method is used for construction of a correlation network based on the fluctuation of each time series within the multivariate time signals. In this method, each time series is divided equally into multiple segments, and the maximal data fluctuation in each segment is calculated by a KST change detection procedure. Connections between the time series are derived from the data fluctuation matrix and are used for construction of the fluctuation correlation network (FCN). The method was tested with synthetic simulations and the results were compared with those from using KS or T alone for detection of data fluctuation. The novelty of this study is that the correlation analysis is based on the data fluctuation in each segment of each time series rather than on the original time signals, which is more meaningful for many real-world applications and for analysis of large-scale time signals where prior knowledge is uncertain.
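    The FCN construction can be sketched as follows. A simple max-minus-min range per segment stands in for the paper's KS/T change statistic, and the correlation threshold is illustrative; the point is that edges come from correlating per-segment fluctuation profiles, not the raw signals.

```python
import numpy as np

def fluctuation_matrix(signals, n_segments=10):
    """Per-segment fluctuation for each series (rows = series, cols = segments).
    Range (max - min) stands in for the paper's KST change statistic."""
    out = []
    for x in signals:
        segs = np.array_split(np.asarray(x, dtype=float), n_segments)
        out.append([s.max() - s.min() for s in segs])
    return np.array(out)

def fluctuation_network(signals, n_segments=10, threshold=0.8):
    """Adjacency matrix: connect two series when their fluctuation
    profiles correlate above `threshold`."""
    F = fluctuation_matrix(signals, n_segments)
    C = np.corrcoef(F)
    A = (C >= threshold).astype(int)
    np.fill_diagonal(A, 0)
    return A
```

    Two series that share volatility bursts end up connected even when their raw values are uncorrelated, which is the behaviour the abstract motivates.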

  17. Complexity multiscale asynchrony measure and behavior for interacting financial dynamics

    NASA Astrophysics Data System (ADS)

    Yang, Ge; Wang, Jun; Niu, Hongli

    2016-08-01

    A stochastic financial price process is proposed and investigated via a finite-range multitype contact dynamical system, in an attempt to study the nonlinear behaviors of real asset markets. The virus-spreading process in a finite-range multitype system is used to imitate the interacting behaviors of diverse investment attitudes in a financial market, and empirical research on the descriptive statistics and autocorrelation behaviors of the return time series is performed for different values of the propagation rates. The multiscale entropy analysis is then adopted to study several differently shuffled return series: the original return series, the corresponding reversal series, the random shuffled series, the volatility shuffled series, and the Zipf-type shuffled series. Furthermore, we propose and compare the multiscale cross-sample entropy and its modified algorithm, composite multiscale cross-sample entropy, and apply them to study the asynchrony of pairs of time series at different time scales.

  18. Solar-Terrestrial Coupling Evidenced by Periodic Behavior in Geomagnetic Indexes and the Infrared Energy Budget of the Thermosphere

    NASA Technical Reports Server (NTRS)

    Mlynczak, Martin G.; Martin-Torres, F. Javier; Mertens, Christopher J.; Marshall, B. Thomas; Thompson, R. Earl; Kozyra, Janet U.; Remsberg, Ellis E.; Gordley, Larry L.; Russell, James M.; Woods, Thomas

    2008-01-01

    We examine time series of the daily global power (W) radiated by carbon dioxide (at 15 microns) and by nitric oxide (at 5.3 microns) from the Earth's thermosphere between 100 km and 200 km altitude. Also examined is a time series of the daily absorbed solar ultraviolet power in the same altitude region in the wavelength span 0 to 175 nm. The infrared data are derived from the SABER instrument and the solar data are derived from the SEE instrument, both on the NASA TIMED satellite. The time series cover nearly 5 years from 2002 through 2006. The infrared and solar time series exhibit a decrease in radiated and absorbed power consistent with the declining phase of the current 11-year solar cycle. The infrared time series also exhibits high frequency variations that are not evident in the solar power time series. Spectral analysis shows a statistically significant 9-day periodicity in the infrared data but not in the solar data. A very strong 9-day periodicity is also found to exist in the time series of daily A(sub p) and K(sub p) geomagnetic indexes. These 9-day periodicities are linked to the recurrence of coronal holes on the Sun. These results demonstrate a direct coupling between the upper atmosphere of the Sun and the infrared energy budget of the thermosphere.

  19. Phase walk analysis of leptokurtic time series.

    PubMed

    Schreiber, Korbinian; Modest, Heike I; Räth, Christoph

    2018-06-01

    Fourier phase information plays a key role in the quantified description of nonlinear data. We present a novel tool for time series analysis that identifies nonlinearities by sensitively detecting correlations among the Fourier phases. The method, called phase walk analysis, is based on well-established measures from random walk analysis, which are here applied to the unwrapped Fourier phases of time series. We provide an analytical description of its functionality and demonstrate its capabilities on systematically controlled leptokurtic noise. Hereby, we investigate the properties of leptokurtic time series and their influence on the Fourier phases. The phase walk analysis is applied to measured and simulated intermittent time series whose probability density distributions are approximated by power laws: the day-to-day returns of the Dow Jones Industrial Average, a synthetic time series with tailored nonlinearities mimicking the power-law behavior of the Dow Jones, and the acceleration of the wind at an Atlantic offshore site. Testing for nonlinearities by means of surrogates shows that the new method yields strong significances for nonlinear behavior. Due to the drastically decreased computing time compared with embedding-space methods, the number of surrogate realizations can be increased by orders of magnitude. Thereby, the probability distribution of the test statistics can be derived and parameterized very accurately, which allows for much more precise tests of nonlinearities.
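
    The core idea, a random walk over the unwrapped Fourier phases, can be sketched as below. This is a minimal reading of the method; the detrending and the summary printed from the walk are our simplifications, not the authors' exact statistic:

```python
import numpy as np

def phase_walk(x):
    """Cumulative walk over the unwrapped Fourier phases of a series
    (a minimal sketch of the phase-walk idea, not the full analysis)."""
    phases = np.angle(np.fft.rfft(x))[1:]    # drop the DC phase
    unwrapped = np.unwrap(phases)            # remove 2*pi jumps
    # Treat successive phase increments as steps of a random walk,
    # detrended so a phase-uncorrelated series gives a diffusive walk.
    steps = np.diff(unwrapped)
    return np.cumsum(steps - steps.mean())

rng = np.random.default_rng(2)
walk = phase_walk(rng.normal(size=4096))
# For Gaussian noise the phases are i.i.d. uniform, so the walk stays
# diffusive; phase correlations (nonlinearity) make it wander systematically.
print(len(walk), float(np.abs(walk).max()))
```

    Comparing the walk's excursions against those of surrogate series (with randomized phases) is what turns this into a test for nonlinearity.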

  20. Simultaneous determination of radionuclides separable into natural decay series by use of time-interval analysis.

    PubMed

    Hashimoto, Tetsuo; Sanada, Yukihisa; Uezu, Yasuhiro

    2004-05-01

    A delayed coincidence method, time-interval analysis (TIA), has been applied to successive alpha-alpha decay events on the millisecond time-scale. Such decay events occur in the (220)Rn-->(216)Po (T(1/2) = 145 ms) decay (Th-series) and the (219)Rn-->(215)Po (T(1/2) = 1.78 ms) decay (Ac-series). By using TIA in addition to measurement of (226)Ra (U-series) from alpha-spectrometry by liquid scintillation counting (LSC), two natural decay series could be identified and separated. The TIA detection efficiency was improved by using the pulse-shape discrimination technique (PSD) to reject beta pulses, by solvent extraction of Ra combined with simple chemical separation, and by purging the scintillation solution with dry N(2) gas. The U- and Th-series, together with the Ac-series, were determined, respectively, from alpha spectra and TIA carried out immediately after Ra extraction. Using the (221)Fr-->(217)At (T(1/2) = 32.3 ms) decay process as a tracer, overall yields were estimated by applying TIA to (225)Ra (Np decay series) at the time of maximum growth. The present method has proven useful for simultaneous determination of three radioactive decay series in environmental samples.

  1. Rotation in the Dynamic Factor Modeling of Multivariate Stationary Time Series.

    ERIC Educational Resources Information Center

    Molenaar, Peter C. M.; Nesselroade, John R.

    2001-01-01

    Proposes a special rotation procedure for the exploratory dynamic factor model for stationary multivariate time series. The rotation procedure applies separately to each univariate component series of a q-variate latent factor series and transforms such a component, initially represented as white noise, into a univariate moving-average.…

  2. Design considerations for case series models with exposure onset measurement error.

    PubMed

    Mohammed, Sandra M; Dalrymple, Lorien S; Sentürk, Damla; Nguyen, Danh V

    2013-02-28

    The case series model allows for estimation of the relative incidence of events, such as cardiovascular events, within a pre-specified time window after an exposure, such as an infection. The method requires only cases (individuals with events) and controls for all fixed/time-invariant confounders. The measurement error case series model extends the original case series model to handle imperfect data, where the timing of an infection (exposure) is not known precisely. In this work, we propose a method for power/sample size determination for the measurement error case series model. Extensive simulation studies are used to assess the accuracy of the proposed sample size formulas. We also examine the magnitude of the relative loss of power due to exposure onset measurement error, compared with the ideal situation where the time of exposure is measured precisely. To facilitate the design of case series studies, we provide publicly available web-based tools for determining power/sample size for both the measurement error case series model as well as the standard case series model. Copyright © 2012 John Wiley & Sons, Ltd.

  3. Early warning by near-real time disturbance monitoring (Invited)

    NASA Astrophysics Data System (ADS)

    Verbesselt, J.; Zeileis, A.; Herold, M.

    2013-12-01

    Near real-time monitoring of ecosystem disturbances is critical for rapidly assessing and addressing impacts on carbon dynamics, biodiversity, and socio-ecological processes. Satellite remote sensing enables cost-effective and accurate monitoring at frequent time steps over large areas. Yet generic methods to detect disturbances within newly captured satellite images are lacking. We propose a multi-purpose time-series-based disturbance detection approach that identifies and models stable historical variation to enable change detection within newly acquired data. Satellite image time series of vegetation greenness provide a global record of terrestrial vegetation productivity over the past decades. Here, we assess and demonstrate the method by applying it to (1) real-world satellite greenness image time series between February 2000 and July 2011 covering Somalia, to detect drought-related vegetation disturbances, and (2) Landsat image time series, to detect forest disturbances. First, results illustrate that disturbances are successfully detected in near real-time while being robust to seasonality and noise. Second, major drought-related disturbances corresponding with the most drought-stressed regions in Somalia are detected from mid-2010 onwards. Third, the method can be applied to Landsat image time series having a lower temporal data density. Furthermore, the method can analyze in-situ or satellite time series of biophysical indicators from local to global scale, since it is fast, does not depend on thresholds, and does not require time series gap filling. While the data and methods used are appropriate for proof-of-concept development of global-scale disturbance monitoring, specific applications (e.g., drought or deforestation monitoring) mandate integration within an operational monitoring framework. The real-time monitoring method is implemented in an open-source environment and is freely available in the BFAST package for R. Information illustrating how to apply the method to satellite image time series is available at http://bfast.R-Forge.R-project.org/ and in the example section of the bfastmonitor() function within the BFAST package.

  4. Vertical variation of mixing within porous sediment beds below turbulent flows

    PubMed Central

    Chandler, I. D.; Pearson, J. M.; van Egmond, R.

    2016-01-01

    River ecosystems are influenced by contaminants in the water column, in the pore water and adsorbed to sediment particles. When exchange across the sediment-water interface (hyporheic exchange) is included in modeling, the mixing coefficient is often assumed to be constant with depth below the interface. Novel fiber-optic fluorometers have been developed and combined with a modified EROSIMESS system to quantify the vertical variation in mixing coefficient with depth below the sediment-water interface. The study considered a range of particle diameters and bed shear velocities, with the permeability Péclet number, PeK, between 1000 and 77,000 and the shear Reynolds number, Re*, between 5 and 600. Different parameterizations of both an interface exchange coefficient and a spatially variable in-sediment mixing coefficient are explored. The variation of in-sediment mixing is described by an exponential function applicable over the full range of parameter combinations tested. The empirical relationship enables estimates of the depth to which concentrations of pollutants will penetrate into the bed sediment, allowing the region where exchange will occur faster than molecular diffusion to be determined. PMID:27635104

  5. Satellite detection of phytoplankton export from the mid-Atlantic Bight during the 1979 spring bloom

    NASA Technical Reports Server (NTRS)

    Walsh, J. J.; Dieterle, D. A.; Esaias, W. E.

    1986-01-01

    Analysis of Coastal Zone Color Scanner (CZCS) imagery confirms shipboard and in situ moored fluorometer observations of resuspension of near-bottom chlorophyll within surface waters (1 to 10 m) by northwesterly wind events in the mid-Atlantic Bight. As much as 8 to 16 micrograms chl/l are found during these wind events from March to May, with a seasonal increase of algal biomass until onset of stratification of the water column. Rapid sinking or downwelling apparently occurs after subsequent wind events, however, such that the predominant surface chlorophyll pattern is approx. 0.5 to 1.5 micrograms/l over the continental shelf during most of the spring bloom. Perhaps half of the chlorophyll increase observed by satellite during a wind resuspension event represents in-situ production during the 4 to 5 day interval, with the remainder attributed to accumulation of algal biomass previously produced and temporarily stored within near-bottom water. Present calculations suggest that about 10% of the primary production of the spring bloom may be exported as ungrazed phytoplankton carbon from mid-Atlantic shelf waters to those of the continental slope.

  6. Eat-by-light fiber-optic and micro-optic devices for food quality and safety assessment

    NASA Astrophysics Data System (ADS)

    Mignani, A. G.; Ciaccheri, L.; Cucci, C.; Mencaglia, A. A.; Cimato, A.; Attilio, C.; Thienpont, H.; Ottevaere, H.; Paolesse, R.; Mastroianni, M.; Monti, D.; Buonocore, G.; Del Nobile, A.; Mentana, A.; Grimaldi, M. F.; Dall'Asta, C.; Faccini, A.; Galaverna, G.; Dossena, A.

    2007-06-01

    A selection is presented of fiber-optic and micro-optic devices that have been designed and tested for guaranteeing the quality and safety of typical foods, such as extra virgin olive oil, beer, and milk. Scattered colorimetry is used to authenticate various types of extra virgin olive oil and beer, while a fiber-optic-based device for UV-VIS-NIR absorption spectroscopy is exploited in order to obtain the hyperspectral optical signature of olive oil. This is done not only for authentication purposes, but also so as to correlate the spectral data with the content of fatty acids, which are important nutritional factors. A micro-optic sensor for the detection of olive oil aroma that is capable of distinguishing different ageing levels of extra virgin olive oil is also presented. It shows effective potential for acting as a smart cap of bottled olive oil in order to achieve a non-destructive olfactory perception of oil ageing. Lastly, a compact portable fluorometer for the rapid monitoring of the carcinogenic M1 aflatoxin in milk is demonstrated.

  7. Eat-by-light: fiber-optic and micro-optic devices for food safety and quality assessment

    NASA Astrophysics Data System (ADS)

    Mignani, A. G.; Ciaccheri, L.; Cucci, C.; Mencaglia, A. A.; Cimato, A.; Attilio, C.; Thienpont, H.; Ottevaere, H.; Paolesse, R.; Mastroianni, M.; Monti, D.; Buonocore, G.; Del Nobile, A.; Mentana, A.; Dall'Asta, C.; Faccini, A.; Galaverna, G.; Dossena, A.

    2007-07-01

    A selection of fiber-optic and micro-optic devices, designed and tested for monitoring the quality and safety of typical foods, namely extra virgin olive oil, beer, and milk, is presented. Scattered colorimetry is used for the authentication of various types of extra virgin olive oil and beer, while a fiber-optic-based device for UV-VIS-NIR absorption spectroscopy is exploited in order to obtain the hyperspectral optical signature of olive oil. This is done not only for authentication purposes, but also so as to correlate the spectral data with the content of fatty acids, which are important nutritional factors. A micro-optic sensor for the detection of olive oil aroma is presented; it is capable of distinguishing different ageing levels of extra virgin olive oil. It shows effective potential for acting as a smart cap of bottled olive oil in order to achieve a non-destructive olfactory perception of oil ageing. Lastly, a compact portable fluorometer is demonstrated for the rapid monitoring of the carcinogenic M1 aflatoxin in milk.

  8. Non-linear forecasting in high-frequency financial time series

    NASA Astrophysics Data System (ADS)

    Strozzi, F.; Zaldívar, J. M.

    2005-08-01

    A new methodology based on state space reconstruction techniques has been developed for trading in financial markets. The methodology has been tested using 18 high-frequency foreign exchange time series. The results are in apparent contradiction with the efficient market hypothesis, which states that no profitable information about future movements can be obtained by studying the past price series. In our (off-line) analysis, positive gain may be obtained in all those series. The trading methodology is quite general and may be adapted to other financial time series. Finally, the steps for its on-line application are discussed.

  9. Pseudo-random bit generator based on lag time series

    NASA Astrophysics Data System (ADS)

    García-Martínez, M.; Campos-Cantón, E.

    2014-12-01

    In this paper, we present a pseudo-random bit generator (PRBG) based on two lag time series of the logistic map, using positive and negative values of the bifurcation parameter. In order to hide the map used to build the pseudo-random series, we introduce a delay in the generation of the time series. When these new series are plotted as xn against xn+1, they present a cloud of points unrelated to the logistic map. Finally, the pseudo-random sequences have been tested with the NIST suite, giving satisfactory results for use in stream ciphers.
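
    A minimal sketch of a lagged logistic-map PRBG is given below; the constants and the simple 0.5 threshold are illustrative assumptions of ours, and the paper's exact two-series construction with positive and negative bifurcation parameters is not reproduced:

```python
import numpy as np

def lag_logistic_bits(n_bits, mu=3.99, lag=3, x0=0.456):
    """Pseudo-random bits from a delayed logistic-map time series.
    Schematic only: mu, lag, x0 and the threshold are illustrative."""
    xs = [x0]
    for _ in range(n_bits + lag):
        xs.append(mu * xs[-1] * (1 - xs[-1]))   # logistic map iteration
    x = np.array(xs)
    # Emit the delayed series thresholded at 0.5, hiding the map's shape
    # when bits are compared against the undelayed orbit.
    return (x[lag:lag + n_bits] > 0.5).astype(int)

bits = lag_logistic_bits(10000)
print("mean bit value:", bits.mean())   # should be near 0.5 for balance
```

    A real stream-cipher candidate would of course be judged by the full NIST statistical test suite, as in the paper, not by the bit balance alone.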

  10. Correlation and Stacking of Relative Paleointensity and Oxygen Isotope Data

    NASA Astrophysics Data System (ADS)

    Lurcock, P. C.; Channell, J. E.; Lee, D.

    2012-12-01

    The transformation of a depth-series into a time-series is routinely implemented in the geological sciences. This transformation often involves correlation of a depth-series to an astronomically calibrated time-series. Eyeball tie-points with linear interpolation are still regularly used, although these have the disadvantages of being non-repeatable and not based on firm correlation criteria. Two automated correlation methods are compared: the simulated annealing algorithm (Huybers and Wunsch, 2004) and the Match protocol (Lisiecki and Lisiecki, 2002). Simulated annealing seeks to minimize energy (cross-correlation) as "temperature" is slowly decreased. The Match protocol divides records into intervals, applies penalty functions that constrain accumulation rates, and minimizes the sum of the squares of the differences between two series while maintaining the data sequence in each series. Paired relative paleointensity (RPI) and oxygen isotope records, such as those from IODP Site U1308 and/or reference stacks such as LR04 and PISO, are warped using known warping functions, and then the un-warped and warped time-series are correlated to evaluate the efficiency of the correlation methods. Correlations are performed in tandem to simultaneously optimize RPI and oxygen isotope data. Noise spectra are introduced at differing levels to determine correlation efficiency as noise levels change. A third potential method, known as dynamic time warping, involves minimizing the sum of distances between correlated point pairs across the whole series. A "cost matrix" between the two series is analyzed to find a least-cost path through the matrix. This least-cost path is used to nonlinearly map the time/depth of one record onto the depth/time of another. Dynamic time warping can be expanded to more than two dimensions and used to stack multiple time-series. 
This procedure can improve on arithmetic stacks, which often lose coherent high-frequency content during the stacking process.
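
    The dynamic-time-warping step described above, building a cost matrix between two series and backtracking a least-cost path through it, can be sketched as:

```python
import numpy as np

def dtw_path(a, b):
    """Dynamic time warping: least-cost alignment between two series.
    Returns the total cost and the warping path as (i, j) index pairs."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)     # accumulated cost matrix
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])    # local cost
            D[i, j] = d + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    # Backtrack the least-cost path through the matrix.
    path, i, j = [], n, m
    while (i, j) != (0, 0):
        path.append((i - 1, j - 1))
        moves = [(i - 1, j - 1), (i - 1, j), (i, j - 1)]
        i, j = min((p for p in moves if p[0] >= 0 and p[1] >= 0),
                   key=lambda p: D[p])
    return D[n, m], path[::-1]

t = np.linspace(0, 2 * np.pi, 60)
cost, path = dtw_path(np.sin(t), np.sin(t + 0.5))   # phase-shifted copies
print("DTW cost:", round(float(cost), 3))
```

    The returned path is the nonlinear time/depth mapping mentioned in the abstract; production correlation tools add accumulation-rate constraints on top of this basic recursion, which is O(nm) in time and memory.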

  11. Comparative Fatigue Lives of Rubber and PVC Wiper Cylindrical Coatings

    NASA Technical Reports Server (NTRS)

    Vlcek, Brian L.; Hendricks, Robert C.; Zaretsky, Erwin V.; Savage, Michael

    2002-01-01

    Three coating materials for rotating cylindrical-coated wiping rollers were fatigue tested in two Intaglio printing presses. The coatings were a hard, cross-linked, plasticized PVC thermoset (P-series); a plasticized PVC (A-series); and a hard nitrile rubber (R-series). Both two- and three-parameter Weibull analyses as well as a cost-benefit analysis were performed. The mean life of the R-series coating is 24 and 9 times longer than those of the P- and A-series coatings, respectively. Both the cost and the replacement rate for the R-series coating were significantly less than those for the P- and A-series coatings. At a very high probability of survival, the R-series coating achieves approximately 2 and 6 times the lives of the P- and A-series, respectively, before the first failure occurs. Where all coatings are run to failure, using the mean (life) time between removal (MTBR) for each coating to calculate the number of replacements and costs provides qualitatively similar results to those of a Weibull analysis.

  12. Higher-Order Hurst Signatures: Dynamical Information in Time Series

    NASA Astrophysics Data System (ADS)

    Ferenbaugh, Willis

    2005-10-01

    Understanding and comparing time series from different systems requires characteristic measures of the dynamics embedded in the series. The Hurst exponent is a second-order dynamical measure of a time series which grew up within the blossoming fractal world of Mandelbrot. This characteristic measure is directly related to the behavior of the autocorrelation, the power spectrum, and other second-order statistics. As with these other measures, the Hurst exponent captures and quantifies some, but not all, of the intrinsic nature of a series. The more elusive characteristics live in the phase spectrum and the higher-order spectra. This research is a continuing quest to (more) fully characterize the dynamical information in time series produced by plasma experiments or models. The goal is to supplement the information that can be represented by a Hurst exponent, and we would like to develop supplemental techniques in analogy with Hurst's original R/S analysis. These techniques should be another way to plumb the higher-order dynamics.
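
    Hurst's original R/S analysis, which the author proposes to generalize, can be sketched as follows (a textbook implementation on synthetic data; the doubling window sizes and the log-log fit are conventional choices, not taken from this abstract):

```python
import numpy as np

def rs_hurst(x, min_win=8):
    """Estimate the Hurst exponent by classical rescaled-range (R/S) analysis."""
    x = np.asarray(x, dtype=float)
    ns, rs = [], []
    n = min_win
    while n <= len(x) // 2:
        vals = []
        for start in range(0, len(x) - n + 1, n):   # non-overlapping windows
            seg = x[start:start + n]
            z = np.cumsum(seg - seg.mean())          # cumulative deviate series
            r = z.max() - z.min()                    # range of the deviates
            s = seg.std()
            if s > 0:
                vals.append(r / s)                   # rescaled range
        ns.append(n)
        rs.append(np.mean(vals))
        n *= 2
    # The slope of log(R/S) versus log(n) estimates the Hurst exponent.
    h, _ = np.polyfit(np.log(ns), np.log(rs), 1)
    return h

rng = np.random.default_rng(3)
print("white noise H ~", round(rs_hurst(rng.normal(size=8192)), 2))
```

    For uncorrelated noise the estimate sits near 0.5 (the small-sample R/S estimator is known to be biased slightly high); persistent series give H > 0.5 and anti-persistent series H < 0.5.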

  13. Characterizing time series: when Granger causality triggers complex networks

    NASA Astrophysics Data System (ADS)

    Ge, Tian; Cui, Yindong; Lin, Wei; Kurths, Jürgen; Liu, Chong

    2012-08-01

    In this paper, we propose a new approach to characterize time series with noise perturbations in both the time and frequency domains by combining Granger causality and complex networks. We construct directed and weighted complex networks from time series and use representative network measures to describe their physical and topological properties. Through analyzing the typical dynamical behaviors of some physical models and the MIT-BIH (Massachusetts Institute of Technology-Beth Israel Hospital) human electrocardiogram data sets, we show that the proposed approach is able to capture and characterize various dynamics and has much potential for analyzing real-world time series of rather short length.

  14. Cycles, scaling and crossover phenomenon in length of the day (LOD) time series

    NASA Astrophysics Data System (ADS)

    Telesca, Luciano

    2007-06-01

    The dynamics of the temporal fluctuations of the length of the day (LOD) time series from January 1, 1962 to November 2, 2006 were investigated. The power spectrum of the whole time series reveals annual, semi-annual, decadal and daily oscillatory behaviors, correlated with oceanic-atmospheric processes and interactions. The scaling behavior was analyzed using detrended fluctuation analysis (DFA), which reveals two different scaling regimes separated by a crossover timescale at approximately 23 days. A flicker-noise process describes the dynamics of the LOD series at intermediate and long timescales, while Brownian dynamics characterizes it at small timescales.
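
    The DFA procedure used here, integrating the series, detrending it in windows of increasing size, and reading a scaling exponent off a log-log fit, can be sketched as below (a standard first-order DFA on synthetic noise, not the author's exact configuration):

```python
import numpy as np

def dfa(x, scales):
    """Detrended fluctuation analysis: fluctuation F(n) per window size n."""
    y = np.cumsum(np.asarray(x, dtype=float) - np.mean(x))  # integrated profile
    F = []
    for n in scales:
        n_seg = len(y) // n
        resid = 0.0
        for k in range(n_seg):
            seg = y[k * n:(k + 1) * n]
            t = np.arange(n)
            coef = np.polyfit(t, seg, 1)                # local linear trend
            resid += np.sum((seg - np.polyval(coef, t)) ** 2)
        F.append(np.sqrt(resid / (n_seg * n)))
    return np.array(F)

rng = np.random.default_rng(4)
scales = np.array([8, 16, 32, 64, 128])
F = dfa(rng.normal(size=8192), scales)
# Scaling exponent alpha: slope of log F(n) vs log n.
alpha, _ = np.polyfit(np.log(scales), np.log(F), 1)
print("white-noise DFA exponent ~", round(float(alpha), 2))
```

    A crossover such as the 23-day one in the abstract shows up as a kink in the log F(n) versus log n plot, with different slopes fitted on either side.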

  15. Heart rate time series characteristics for early detection of infections in critically ill patients.

    PubMed

    Tambuyzer, T; Guiza, F; Boonen, E; Meersseman, P; Vervenne, H; Hansen, T K; Bjerre, M; Van den Berghe, G; Berckmans, D; Aerts, J M; Meyfroidt, G

    2017-04-01

    It is difficult to distinguish inflammation from infection, so new strategies are required for accurate detection of infection. Here, we hypothesize that infected ICU patients can be distinguished from non-infected ones based on dynamic features of serum cytokine concentrations and heart rate time series. Serum cytokine profiles and heart rate time series of 39 patients were available for this study. The serum concentrations of ten cytokines were measured using blood sampled every 10 min between 2100 and 0600 hours. Heart rate was recorded every minute. Ten metrics were used to extract features from these time series to obtain an accurate classification of infected patients. The predictive power of the metrics derived from the heart rate time series was investigated using decision tree analysis. Finally, logistic regression methods were used to examine whether classification performance improved with inclusion of features derived from the cytokine time series. The AUC of a decision tree based on two heart rate features was 0.88. The model had good calibration, with a Hosmer-Lemeshow p value of 0.09. There was no significant additional value in adding static cytokine levels or cytokine time series information to the generated decision tree model. The results suggest that heart rate is a better marker for infection than information captured by cytokine time series when the exact stage of infection is not known. The predictive value of (expensive) biomarkers should always be weighed against routinely monitored data, and such biomarkers have to demonstrate added value.

  16. Lab Streaming Layer Enabled Myo Data Collection Software User Manual

    DTIC Science & Technology

    2017-06-07

    time-series data over a local network. LSL handles the networking, time-synchronization, (near-) real-time access as well as, optionally, the... series data collection (e.g., brain activity, heart activity, muscle activity) using the LSL application programming interface (API). Time-synchronized... saved to a single extensible data format (XDF) file. Once the time-series data are collected in a Lab Recorder XDF file, users will be able to query

  17. Application of the Allan Variance to Time Series Analysis in Astrometry and Geodesy: A Review.

    PubMed

    Malkin, Zinovy

    2016-04-01

    The Allan variance (AVAR) was introduced 50 years ago as a statistical tool for assessing the stability of frequency standards. Over the past decades, AVAR has increasingly been used in geodesy and astrometry to assess the noise characteristics of geodetic and astrometric time series. A specific feature of astrometric and geodetic measurements, as compared with clock measurements, is that they are generally associated with uncertainties; thus, appropriate weighting should be applied during data analysis. In addition, some physically connected scalar time series naturally form series of multidimensional vectors. For example, the three station coordinate time series X, Y, and Z can be combined to analyze 3-D station position variations. The classical AVAR is not intended for processing unevenly weighted and/or multidimensional data. Therefore, AVAR modifications, namely weighted AVAR (WAVAR), multidimensional AVAR (MAVAR), and weighted multidimensional AVAR (WMAVAR), were introduced to overcome these deficiencies. In this paper, a brief review is given of the experience of using AVAR and its modifications in processing astrogeodetic time series.
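
    The classical (unweighted, scalar) AVAR that these modifications build on can be sketched as follows; the non-overlapping estimator and doubling averaging factors are conventional simplifications, not the review's specific formulation:

```python
import numpy as np

def avar(y, tau0=1.0):
    """Classical (non-overlapping) Allan variance of an evenly sampled series
    for averaging factors m = 1, 2, 4, ... (a minimal sketch)."""
    y = np.asarray(y, dtype=float)
    out = {}
    m = 1
    while 2 * m <= len(y) // 2:
        # Average over blocks of length m, then difference successive blocks.
        n_blk = len(y) // m
        blocks = y[:n_blk * m].reshape(n_blk, m).mean(axis=1)
        d = np.diff(blocks)
        out[m * tau0] = 0.5 * np.mean(d ** 2)   # AVAR at tau = m * tau0
        m *= 2
    return out

rng = np.random.default_rng(5)
white = rng.normal(size=4096)
for tau, av in avar(white).items():
    print(f"tau={tau:6.0f}  AVAR={av:.4f}")   # ~ 1/tau for white noise
```

    The slope of AVAR versus tau on a log-log plot identifies the noise type (white, flicker, random walk), which is exactly the diagnostic use the review describes for geodetic series.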

  18. 77 FR 18229 - Applications for New Awards; Investing in Innovation Fund, Validation Grants

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-27

    ... the appearance of a conflict of interest. Interrupted time series design \\8\\ means a type of quasi... single case design is an adaptation of an interrupted time series design that relies on the comparison of... notice), interrupted time series designs (as defined in this notice), or regression discontinuity designs...

  19. 76 FR 32148 - Applications for New Awards; Investing in Innovation Fund

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-03

    .... Interrupted time series design \\5\\ means a type of quasi- experimental study in which the outcome of interest... interrupted time series design that relies on the comparison of treatment effects on a single subject or group... matched comparison group designs (as defined in this notice), interrupted time series designs (as defined...

  20. 76 FR 53348 - Airworthiness Directives; BAE SYSTEMS (Operations) Limited Model BAe 146 Airplanes and Model Avro...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-26

    ... Maintenance Manual (AMM) includes chapters 05-10 ``Time Limits'', 05-15 ``Critical Design Configuration... 05, ``Time Limits/Maintenance Checks,'' of BAe 146 Series/AVRO 146-RJ Series Aircraft Maintenance... Chapter 05, ``Time Limits/ Maintenance Checks,'' of the BAE SYSTEMS (Operations) Limited BAe 146 Series...

  1. 78 FR 19468 - Applications for New Awards; Minority Science and Engineering Improvement Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-01

    ... new immigrants, who are migrant, or who have disabilities. Interrupted time series design means a type... time series design that relies on the comparison of treatment effects on a single subject or group of... defined in this notice), interrupted time series designs (as defined in this notice), or regression...

  2. 77 FR 18216 - Applications for New Awards; Investing in Innovation Fund, Scale-Up Grants

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-27

    ... evaluation and prevents even the appearance of a conflict of interest. Interrupted time series design \\8... findings. \\8\\ A single subject or single case design is an adaptation of an interrupted time series design... matched comparison group designs (as defined in this notice), interrupted time series designs (as defined...

  3. 76 FR 32171 - Applications for New Awards; Investing in Innovation Fund

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-03

    ... conflict of interest. Interrupted time series design \\5\\ means a type of quasi- experimental study in... single case design is an adaptation of an interrupted time series design that relies on the comparison of...), interrupted time series designs (as defined in this notice), or regression discontinuity designs (as defined...

  4. 76 FR 32159 - Applications for New Awards; Investing in Innovation Fund

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-03

    ... conflict of interest. Interrupted time series design \\5\\ means a type of quasi- experimental study in which... design is an adaptation of an interrupted time series design that relies on the comparison of treatment...), interrupted time series designs (as defined in this notice), or regression discontinuity designs (as defined...

  5. What Time-Series Designs May Have to Offer Educational Researchers.

    ERIC Educational Resources Information Center

    Kratochwill, Thomas R.; Levin, Joel R.

    1978-01-01

    The promise of time-series designs for educational research and evaluation is reviewed. Ten time-series designs are presented and discussed in the context of threats to internal and external validity. The advantages and disadvantages of various visual and statistical data-analysis techniques are presented. A bibliography is appended. (Author/RD)

  6. New Models for Forecasting Enrollments: Fuzzy Time Series and Neural Network Approaches.

    ERIC Educational Resources Information Center

    Song, Qiang; Chissom, Brad S.

    Since university enrollment forecasting is very important, many different methods and models have been proposed by researchers. Two new methods for enrollment forecasting are introduced: (1) the fuzzy time series model; and (2) the artificial neural networks model. Fuzzy time series has been proposed to deal with forecasting problems within a…

  7. 77 FR 6310 - Electronic Fund Transfers (Regulation E)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-07

    ... a one-time transfer or the first in a series of preauthorized transfers to occur more than 10 days... disclosures and receipts where a consumer schedules a one-time transfer or the first in a series of... where a consumer schedules a one-time transfer or the first in a series of preauthorized transfers to...

  8. The Consequences of Model Misidentification in the Interrupted Time-Series Experiment.

    ERIC Educational Resources Information Center

    Padia, William L.

    Campbell (1969) argued for the interrupted time-series experiment as a useful methodology for testing intervention effects in the social sciences. The validity of the statistical hypothesis testing of time-series is, however, dependent upon the proper identification of the underlying stochastic nature of the data. Several types of model…

  9. Interactive Digital Signal Processor

    NASA Technical Reports Server (NTRS)

    Mish, W. H.

    1985-01-01

    The Interactive Digital Signal Processor (IDSP) consists of a set of time series analysis "operators" based on various algorithms commonly used for digital signal analysis. Processing a digital signal time series to extract information is usually achieved by applying a number of fairly standard operations. IDSP is also an excellent teaching tool for demonstrating the application of time series operators to artificially generated signals.

  10. FATS: Feature Analysis for Time Series

    NASA Astrophysics Data System (ADS)

    Nun, Isadora; Protopapas, Pavlos; Sim, Brandon; Zhu, Ming; Dave, Rahul; Castro, Nicolas; Pichara, Karim

    2017-11-01

    FATS facilitates and standardizes feature extraction for time series data; it quickly and efficiently calculates a compilation of many existing light curve features. Users can characterize or analyze an astronomical photometric database, though this library is not necessarily restricted to the astronomical domain and can also be applied to any kind of time series data.
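
    As a rough illustration of the kind of light-curve features such a library standardizes, the sketch below computes a few common ones in plain Python. The function name and feature set here are hypothetical, not the FATS API.

```python
import statistics

def light_curve_features(time, mag):
    """Compute a few simple light-curve features (illustrative only;
    not the FATS API)."""
    mean = sum(mag) / len(mag)
    std = statistics.pstdev(mag)
    # Half the peak-to-peak range, a crude amplitude estimate.
    amplitude = (max(mag) - min(mag)) / 2.0
    # Fraction of points more than one standard deviation from the mean.
    beyond1std = sum(1 for m in mag if abs(m - mean) > std) / len(mag)
    return {"Mean": mean, "Std": std,
            "Amplitude": amplitude, "Beyond1Std": beyond1std}
```

    Real feature libraries compute dozens of such statistics per light curve; the point of a standardized package is that the same feature definitions are applied uniformly across a photometric database.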

  11. The Prediction of Teacher Turnover Employing Time Series Analysis.

    ERIC Educational Resources Information Center

    Costa, Crist H.

    The purpose of this study was to combine knowledge of teacher demographic data with time-series forecasting methods to predict teacher turnover. Moving averages and exponential smoothing were used to forecast discrete time series. The study used data collected from the 22 largest school districts in Iowa, designated as FACT schools. Predictions…
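
    The two forecasting tools named in the abstract, moving averages and exponential smoothing, can be sketched minimally as follows (generic sketch, not the study's code):

```python
def moving_average_forecast(series, window=3):
    """Forecast the next value as the mean of the last `window` observations."""
    tail = series[-window:]
    return sum(tail) / len(tail)

def ses_forecast(series, alpha=0.3):
    """Simple exponential smoothing: the forecast is a running weighted
    average that gives weight `alpha` to each new observation."""
    level = series[0]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
    return level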

  12. 76 FR 2665 - Caribbean Fishery Management Council; Scoping Meetings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-14

    ... time series of catch data that is considered to be consistently reliable across all islands as defined... based on what the Council considers to be the longest time series of catch data that is consistently... preferred management reference point time series. Action 3b. Recreational Bag Limits Option 1: No action. Do...

  13. G14A-06- Analysis of the DORIS, GNSS, SLR, VLBI and Gravimetric Time Series at the GGOS Core Sites

    NASA Technical Reports Server (NTRS)

    Moreaux, G.; Lemoine, F.; Luceri, V.; Pavlis, E.; MacMillan, D.; Bonvalot, S.; Saunier, J.

    2017-01-01

    We analyze the time series at the three to four multi-technique GGOS core sites to compare the spectral content of the space geodetic and gravimetric time series, and to evaluate the level of agreement between the space geodesy measurements and the physical tie vectors.

  14. Information and Complexity Measures Applied to Observed and Simulated Soil Moisture Time Series

    USDA-ARS?s Scientific Manuscript database

    Time series of soil moisture-related parameters provide important insights into the functioning of soil water systems. Analysis of patterns within these time series has been used in several studies. The objective of this work was to compare patterns in observed and simulated soil moisture contents to u...

  15. 75 FR 63841 - Government-Owned Inventions; Availability for Licensing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-18

    ...; [email protected] . Signal-to-Noise Enhancement in Imaging Applications Using a Time-Series of Images... reduction in imaging applications that use a time-series of images. In one embodiment of the invention, a time-series of images is acquired using a same imaging protocol of the same subject area, but the...

  16. Small Sample Properties of Bayesian Multivariate Autoregressive Time Series Models

    ERIC Educational Resources Information Center

    Price, Larry R.

    2012-01-01

    The aim of this study was to compare the small sample (N = 1, 3, 5, 10, 15) performance of a Bayesian multivariate vector autoregressive (BVAR-SEM) time series model relative to frequentist power and parameter estimation bias. A multivariate autoregressive model was developed based on correlated autoregressive time series vectors of varying…

  17. A likelihood-based time series modeling approach for application in dendrochronology to examine the growth-climate relations and forest disturbance history

    EPA Science Inventory

    A time series intervention analysis (TSIA) of dendrochronological data to infer the tree growth-climate-disturbance relations and forest disturbance history is described. Maximum likelihood is used to estimate the parameters of a structural time series model with components for ...

  18. Testing for nonlinearity in non-stationary physiological time series.

    PubMed

    Guarín, Diego; Delgado, Edilson; Orozco, Álvaro

    2011-01-01

    Testing for nonlinearity is one of the most important preprocessing steps in nonlinear time series analysis. Typically, this is done by means of linear surrogate data methods, but it is a known fact that the validity of the results heavily depends on the stationarity of the time series. Since most physiological signals are non-stationary, it is easy to falsely detect nonlinearity using the linear surrogate data methods. In this document, we propose a methodology to extend the procedure for generating constrained surrogate time series in order to assess nonlinearity in non-stationary data. The method is based on band-phase-randomized surrogates, which (contrary to the linear surrogate data methods) randomize only a portion of the Fourier phases, in the high-frequency domain. Analysis of simulated time series showed that, in comparison to the linear surrogate data method, our method is able to discriminate between linear stationary, linear non-stationary, and nonlinear time series. Applying our methodology to heart rate variability (HRV) records of five healthy patients, we found that nonlinear correlations are present in these non-stationary physiological signals.

  19. Cyberpark 2000: Protected Areas Management Pilot Project. Satellite time series vegetation monitoring

    NASA Astrophysics Data System (ADS)

    Monteleone, M.; Lanorte, A.; Lasaponara, R.

    2009-04-01

    Cyberpark 2000 is a project funded by the EU Regional Operating Program of the Apulia Region (2000-2006). The main objective of the Cyberpark 2000 project is to develop a new assessment model for the management and monitoring of protected areas in Foggia Province (Apulia Region) based on Information and Communication Technologies. The results described herein are part of the research activities aimed at developing a knowledge-based environmental monitoring system built on satellite time series. This study includes: (A) satellite time series of high spatial resolution data for supporting the analysis of static fire risk factors through land use mapping and spectral/quantitative characterization of vegetation fuels; (B) satellite time series of MODIS data for supporting dynamic fire risk evaluation of the study area, integrated with fire detection by thermal imaging cameras placed at panoramic viewpoints; (C) integrated high spatial and high temporal resolution satellite time series for supporting change detection studies of factors or anomalies in vegetation cover; and (D) satellite time series for monitoring (i) post-fire vegetation recovery and (ii) spatio-temporal vegetation dynamics in unburned and burned vegetation cover.

  20. Analysis of Site Position Time Series Derived From Space Geodetic Solutions

    NASA Astrophysics Data System (ADS)

    Angermann, D.; Meisel, B.; Kruegel, M.; Tesmer, V.; Miller, R.; Drewes, H.

    2003-12-01

    This presentation deals with the analysis of station coordinate time series obtained from VLBI, SLR, GPS and DORIS solutions. We also present time series for the origin and scale derived from these solutions and discuss their contribution to the realization of the terrestrial reference frame. For these investigations we used SLR and VLBI solutions computed at DGFI with the software systems DOGS (SLR) and OCCAM (VLBI). The GPS and DORIS time series were obtained from weekly station coordinates solutions provided by the IGS, and from the joint DORIS analysis center (IGN-JPL). We analysed the time series with respect to various aspects, such as non-linear motions, periodic signals and systematic differences (biases). A major focus is on a comparison of the results at co-location sites in order to identify technique- and/or solution related problems. This may also help to separate and quantify possible effects, and to understand the origin of still existing discrepancies. Technique-related systematic effects (biases) should be reduced to the highest possible extent, before using the space geodetic solutions for a geophysical interpretation of seasonal signals in site position time series.
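
    A core step in such an analysis is fitting periodic (annual) signals plus a linear trend to each coordinate time series by least squares. The sketch below shows that step under simple assumptions (one annual sine/cosine pair, evenly weighted observations); it is an illustration, not the DOGS/OCCAM processing.

```python
import math

def fit_annual_signal(t_years, y):
    """Least-squares fit of offset + trend + annual sine/cosine to a
    coordinate time series (minimal sketch of seasonal-signal analysis)."""
    w = 2 * math.pi  # one cycle per year
    cols = [
        [1.0 for _ in t_years],                   # offset
        [ti for ti in t_years],                   # linear trend
        [math.cos(w * ti) for ti in t_years],     # annual cosine
        [math.sin(w * ti) for ti in t_years],     # annual sine
    ]
    n = len(cols)
    # Normal equations A p = b, solved by Gaussian elimination with pivoting.
    A = [[sum(ci * cj for ci, cj in zip(cols[i], cols[j])) for j in range(n)]
         for i in range(n)]
    b = [sum(ci * yi for ci, yi in zip(cols[i], y)) for i in range(n)]
    for k in range(n):
        piv = max(range(k, n), key=lambda r: abs(A[r][k]))
        A[k], A[piv] = A[piv], A[k]
        b[k], b[piv] = b[piv], b[k]
        for r in range(k + 1, n):
            f = A[r][k] / A[k][k]
            for c in range(k, n):
                A[r][c] -= f * A[k][c]
            b[r] -= f * b[k]
    p = [0.0] * n
    for k in range(n - 1, -1, -1):
        p[k] = (b[k] - sum(A[k][c] * p[c] for c in range(k + 1, n))) / A[k][k]
    offset, trend, c_amp, s_amp = p
    return offset, trend, math.hypot(c_amp, s_amp)  # amplitude of annual term
```

    The residuals of such a fit are what one then inspects for non-linear motions, remaining periodic signals, and inter-technique biases at co-location sites.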

  1. Recurrent Neural Network Applications for Astronomical Time Series

    NASA Astrophysics Data System (ADS)

    Protopapas, Pavlos

    2017-06-01

    The benefits of good predictive models in astronomy lie in early event prediction systems and effective resource allocation. Current time series methods applicable to regular time series have not evolved to generalize to irregular time series. In this talk, I will describe two Recurrent Neural Network methods, Long Short-Term Memory (LSTM) and Echo State Networks (ESNs), for predicting irregular time series. Feature engineering along with non-linear modeling proved to be an effective predictor. For noisy time series, the prediction is improved by training the network on error realizations using the error estimates from astronomical light curves. In addition, we propose a new neural network architecture to remove correlation from the residuals in order to improve prediction and compensate for the noisy data. Finally, I show how to set hyperparameters correctly for a stable and performant solution: since hand-tuning ESNs is a major obstacle, we circumvent it by optimizing ESN hyperparameters using Bayesian optimization with Gaussian Process priors. This automates the tuning procedure, enabling users to employ the power of RNNs without needing an in-depth understanding of the tuning procedure.

  2. A comparative study of shallow groundwater level simulation with three time series models in a coastal aquifer of South China

    NASA Astrophysics Data System (ADS)

    Yang, Q.; Wang, Y.; Zhang, J.; Delgado, J.

    2017-05-01

    Accurate and reliable groundwater level forecasting models can help ensure the sustainable use of a watershed's aquifers for urban and rural water supply. In this paper, three time series analysis methods, Holt-Winters (HW), integrated time series (ITS), and seasonal autoregressive integrated moving average (SARIMA), are explored to simulate the groundwater level in a coastal aquifer in China. The monthly groundwater table depth data, collected over a long time series from 2000 to 2011, are simulated and compared across the three time series models. The error criteria are estimated using the coefficient of determination (R^2), the Nash-Sutcliffe model efficiency coefficient (E), and the root-mean-squared error. The results indicate that all three models are accurate in reproducing the historical time series of groundwater levels. The comparison of the three models shows that the HW model is more accurate in predicting the groundwater levels than the SARIMA and ITS models. It is recommended that additional studies explore this proposed method, which can be used in turn to facilitate the development and implementation of more effective and sustainable groundwater management strategies.
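
    Of the three methods compared, Holt-Winters is the simplest to state. A minimal additive Holt-Winters one-step forecaster, with a crude initialization (first season sets the seasonal indices), might look like the following sketch; parameter values are illustrative, not those of the study.

```python
def holt_winters_additive(series, season_len, alpha=0.3, beta=0.05, gamma=0.1):
    """One-step-ahead additive Holt-Winters forecast (minimal sketch)."""
    level = sum(series[:season_len]) / season_len
    trend = 0.0
    seasonal = [x - level for x in series[:season_len]]
    for i in range(season_len, len(series)):
        x = series[i]
        s = seasonal[i % season_len]
        last_level = level
        # Update level, trend, and the seasonal index for this position.
        level = alpha * (x - s) + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
        seasonal[i % season_len] = gamma * (x - level) + (1 - gamma) * s
    # Forecast the next observation.
    return level + trend + seasonal[len(series) % season_len]
```

    On a strongly seasonal signal such as monthly groundwater depth, the seasonal indices absorb the annual cycle while the level and trend track slower changes.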

  3. Time series modeling in traffic safety research.

    PubMed

    Lavrenz, Steven M; Vlahogianni, Eleni I; Gkritza, Konstantina; Ke, Yue

    2018-08-01

    The use of statistical models for analyzing traffic safety (crash) data has been well-established. However, time series techniques have traditionally been underrepresented in the corresponding literature, due to challenges in data collection, along with a limited knowledge of proper methodology. In recent years, new types of high-resolution traffic safety data, especially in measuring driver behavior, have made time series modeling techniques an increasingly salient topic of study. Yet there remains a dearth of information to guide analysts in their use. This paper provides an overview of the state of the art in using time series models in traffic safety research, and discusses some of the fundamental techniques and considerations in classic time series modeling. It also presents ongoing and future opportunities for expanding the use of time series models, and explores newer modeling techniques, including computational intelligence models, which hold promise in effectively handling ever-larger data sets. The information contained herein is meant to guide safety researchers in understanding this broad area of transportation data analysis, and provide a framework for understanding safety trends that can influence policy-making. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Effect of noise and filtering on largest Lyapunov exponent of time series associated with human walking.

    PubMed

    Mehdizadeh, Sina; Sanjari, Mohammad Ali

    2017-11-07

    This study aimed to determine the effect of added noise, filtering, and time series length on the largest Lyapunov exponent (LyE) value calculated for time series obtained from a passive dynamic walker. The simplest passive dynamic walker model, comprising two massless legs connected by a frictionless hinge joint at the hip, was adopted to generate walking time series. The generated time series was used to construct a state space with an embedding dimension of 3 and a time delay of 100 samples. The LyE was calculated as the exponential rate of divergence of neighboring trajectories of the state space using Rosenstein's algorithm. To determine the effect of noise on LyE values, seven levels of Gaussian white noise (SNR = 55-25 dB in 5 dB steps) were added to the time series. In addition, filtering was performed using a range of cutoff frequencies from 3 Hz to 19 Hz in 2 Hz steps. The LyE was calculated for both noise-free and noisy time series with different lengths of 6, 50, 100, and 150 strides. Results demonstrated a high percent error in LyE in the presence of noise. These observations suggest that Rosenstein's algorithm might not perform well in the presence of added experimental noise. Furthermore, findings indicated that at least 50 walking strides are required to calculate LyE to account for the effect of noise. Finally, observations support that conservative filtering of the time series with a high cutoff frequency might be more appropriate prior to calculating LyE. Copyright © 2017 Elsevier Ltd. All rights reserved.
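
    Rosenstein's algorithm, as used here, can be sketched in a few steps: delay-embed the series, pair each state with its nearest neighbour outside a temporal exclusion window, and fit the slope of the mean log divergence of those pairs over time. The following is a simplified, brute-force sketch (the study's own embedding parameters were dim = 3, tau = 100), not the authors' implementation.

```python
import math

def rosenstein_lyapunov(series, dim=1, tau=1, fit_steps=5, theiler=1):
    """Largest Lyapunov exponent via a simplified Rosenstein-style estimate."""
    # Delay embedding of the scalar series.
    n = len(series) - (dim - 1) * tau
    emb = [[series[i + k * tau] for k in range(dim)] for i in range(n)]
    def dist(a, b):
        return math.sqrt(sum((u - v) ** 2 for u, v in zip(a, b)))
    # Nearest neighbour for each point, excluding temporally close points.
    nn = []
    for i in range(n):
        cands = [j for j in range(n) if abs(i - j) > theiler]
        nn.append(min(cands, key=lambda j: dist(emb[i], emb[j])))
    # Mean log separation of neighbour pairs, k steps ahead.
    logdiv = []
    for k in range(1, fit_steps + 1):
        vals = []
        for i, j in enumerate(nn):
            if i + k < n and j + k < n:
                d = dist(emb[i + k], emb[j + k])
                if d > 0:
                    vals.append(math.log(d))
        logdiv.append(sum(vals) / len(vals))
    # LyE estimate = least-squares slope of log divergence vs. steps.
    ks = list(range(1, fit_steps + 1))
    kbar = sum(ks) / len(ks)
    dbar = sum(logdiv) / len(logdiv)
    return (sum((k - kbar) * (d - dbar) for k, d in zip(ks, logdiv))
            / sum((k - kbar) ** 2 for k in ks))
```

    The sensitivity reported in the abstract comes from the nearest-neighbour step: added noise inflates the initial separations, biasing the fitted slope.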

  5. 78 FR 15073 - Self-Regulatory Organizations; NASDAQ OMX PHLX LLC; Notice of Filing and Immediate Effectiveness...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-08

    ... will not apply to such long-term options series until the time to expiration is less than nine months... trading on the Exchange, the Exchange shall from time to time open for trading series of options therein... not apply to options series until the time to expiration is less than nine months (that is, until such...

  6. 77 FR 42038 - Self-Regulatory Organizations; C2 Options Exchange, Incorporated; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-17

    ... market in 60% of the non-adjusted option series \\4\\ of each registered class that have a time to... time series Classes C2 (current rule) 99 of the time........ 60 Class-by-class. NOM 90 of a trading day... required to provide continuous quotes for the same amount of time in the same percentage of series as...

  7. Automated Bayesian model development for frequency detection in biological time series.

    PubMed

    Granqvist, Emma; Oldroyd, Giles E D; Morris, Richard J

    2011-06-24

    A first step in building a mathematical model of a biological system is often the analysis of the temporal behaviour of key quantities. Mathematical relationships between the time and frequency domain, such as Fourier Transforms and wavelets, are commonly used to extract information about the underlying signal from a given time series. This one-to-one mapping from time points to frequencies inherently assumes that both domains contain the complete knowledge of the system. However, for truncated, noisy time series with background trends this unique mapping breaks down and the question reduces to an inference problem of identifying the most probable frequencies. In this paper we build on the method of Bayesian Spectrum Analysis and demonstrate its advantages over conventional methods by applying it to a number of test cases, including two types of biological time series. Firstly, oscillations of calcium in plant root cells in response to microbial symbionts are non-stationary and noisy, posing challenges to data analysis. Secondly, circadian rhythms in gene expression measured over only two cycles highlights the problem of time series with limited length. The results show that the Bayesian frequency detection approach can provide useful results in specific areas where Fourier analysis can be uninformative or misleading. We demonstrate further benefits of the Bayesian approach for time series analysis, such as direct comparison of different hypotheses, inherent estimation of noise levels and parameter precision, and a flexible framework for modelling the data without pre-processing. Modelling in systems biology often builds on the study of time-dependent phenomena. Fourier Transforms are a convenient tool for analysing the frequency domain of time series. However, there are well-known limitations of this method, such as the introduction of spurious frequencies when handling short and noisy time series, and the requirement for uniformly sampled data. 
Biological time series often deviate significantly from the requirements of optimality for Fourier transformation. In this paper we present an alternative approach based on Bayesian inference. We show the value of placing spectral analysis in the framework of Bayesian inference and demonstrate how model comparison can automate this procedure.
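
    A minimal ingredient shared by classical and Bayesian spectrum analysis is the periodogram evaluated by direct projection onto sines and cosines; in Bretthorst-style Bayesian spectrum analysis this statistic drives the posterior over a single frequency. The sketch below only finds the dominant frequency and removes the mean as a crude background model; it is an illustration, not the authors' method.

```python
import math

def periodogram_peak(t, y, freqs):
    """Return the candidate frequency with the largest Schuster periodogram
    power, computed by direct projection (works for non-uniform t)."""
    ybar = sum(y) / len(y)
    yc = [v - ybar for v in y]  # subtract the mean (crude detrending)
    best_f, best_p = None, -1.0
    for f in freqs:
        R = sum(v * math.cos(2 * math.pi * f * ti) for v, ti in zip(yc, t))
        I = sum(v * math.sin(2 * math.pi * f * ti) for v, ti in zip(yc, t))
        p = (R * R + I * I) / len(y)
        if p > best_p:
            best_f, best_p = f, p
    return best_f, best_p
```

    The full Bayesian treatment goes further: it turns this statistic into a posterior over frequency, marginalizes the noise level, and allows competing models (e.g. one versus two oscillations, with or without a trend) to be compared directly.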

  8. Automated Bayesian model development for frequency detection in biological time series

    PubMed Central

    2011-01-01

    Background A first step in building a mathematical model of a biological system is often the analysis of the temporal behaviour of key quantities. Mathematical relationships between the time and frequency domain, such as Fourier Transforms and wavelets, are commonly used to extract information about the underlying signal from a given time series. This one-to-one mapping from time points to frequencies inherently assumes that both domains contain the complete knowledge of the system. However, for truncated, noisy time series with background trends this unique mapping breaks down and the question reduces to an inference problem of identifying the most probable frequencies. Results In this paper we build on the method of Bayesian Spectrum Analysis and demonstrate its advantages over conventional methods by applying it to a number of test cases, including two types of biological time series. Firstly, oscillations of calcium in plant root cells in response to microbial symbionts are non-stationary and noisy, posing challenges to data analysis. Secondly, circadian rhythms in gene expression measured over only two cycles highlights the problem of time series with limited length. The results show that the Bayesian frequency detection approach can provide useful results in specific areas where Fourier analysis can be uninformative or misleading. We demonstrate further benefits of the Bayesian approach for time series analysis, such as direct comparison of different hypotheses, inherent estimation of noise levels and parameter precision, and a flexible framework for modelling the data without pre-processing. Conclusions Modelling in systems biology often builds on the study of time-dependent phenomena. Fourier Transforms are a convenient tool for analysing the frequency domain of time series. 
However, there are well-known limitations of this method, such as the introduction of spurious frequencies when handling short and noisy time series, and the requirement for uniformly sampled data. Biological time series often deviate significantly from the requirements of optimality for Fourier transformation. In this paper we present an alternative approach based on Bayesian inference. We show the value of placing spectral analysis in the framework of Bayesian inference and demonstrate how model comparison can automate this procedure. PMID:21702910

  9. Symplectic geometry spectrum regression for prediction of noisy time series

    NASA Astrophysics Data System (ADS)

    Xie, Hong-Bo; Dokos, Socrates; Sivakumar, Bellie; Mengersen, Kerrie

    2016-05-01

    We present the symplectic geometry spectrum regression (SGSR) technique as well as a regularized method based on SGSR for prediction of nonlinear time series. The main tool of analysis is the symplectic geometry spectrum analysis, which decomposes a time series into the sum of a small number of independent and interpretable components. The key to successful regularization is to damp higher order symplectic geometry spectrum components. The effectiveness of SGSR and its superiority over local approximation using ordinary least squares are demonstrated through prediction of two noisy synthetic chaotic time series (Lorenz and Rössler series), and then tested for prediction of three real-world data sets (Mississippi River flow data and electromyographic and mechanomyographic signal recorded from human body).

  10. Real-time Series Resistance Monitoring in PV Systems; NREL (National Renewable Energy Laboratory)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deceglie, M. G.; Silverman, T. J.; Marion, B.

    We apply the physical principles of a familiar method, suns-Voc, to a new application: the real-time detection of series resistance changes in modules and systems operating outside. The real-time series resistance (RTSR) method that we describe avoids the need for collecting IV curves or constructing full series-resistance-free IV curves. RTSR is most readily deployable at the module level on micro-inverters or module-integrated electronics, but it can also be extended to full strings. Automated detection of series resistance increases can provide early warnings of some of the most common reliability issues, which also pose fire risks, including broken ribbons, broken solder bonds, and contact problems in the junction or combiner box. We describe the method in detail and describe a sample application to data collected from modules operating in the field.

  11. TimesVector: a vectorized clustering approach to the analysis of time series transcriptome data from multiple phenotypes.

    PubMed

    Jung, Inuk; Jo, Kyuri; Kang, Hyejin; Ahn, Hongryul; Yu, Youngjae; Kim, Sun

    2017-12-01

    Identifying biologically meaningful gene expression patterns from time series gene expression data is important to understand the underlying biological mechanisms. To identify significantly perturbed gene sets between different phenotypes, analysis of time series transcriptome data requires consideration of time and sample dimensions. Thus, the analysis of such time series data seeks to search gene sets that exhibit similar or different expression patterns between two or more sample conditions, constituting the three-dimensional data, i.e. gene-time-condition. Computational complexity for analyzing such data is very high, compared to the already difficult NP-hard two dimensional biclustering algorithms. Because of this challenge, traditional time series clustering algorithms are designed to capture co-expressed genes with similar expression pattern in two sample conditions. We present a triclustering algorithm, TimesVector, specifically designed for clustering three-dimensional time series data to capture distinctively similar or different gene expression patterns between two or more sample conditions. TimesVector identifies clusters with distinctive expression patterns in three steps: (i) dimension reduction and clustering of time-condition concatenated vectors, (ii) post-processing clusters for detecting similar and distinct expression patterns and (iii) rescuing genes from unclassified clusters. Using four sets of time series gene expression data, generated by both microarray and high throughput sequencing platforms, we demonstrated that TimesVector successfully detected biologically meaningful clusters of high quality. TimesVector improved the clustering quality compared to existing triclustering tools and only TimesVector detected clusters with differential expression patterns across conditions successfully. The TimesVector software is available at http://biohealth.snu.ac.kr/software/TimesVector/. sunkim.bioinfo@snu.ac.kr. 
Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  12. OceanSITES: Sustained Ocean Time Series Observations in the Global Ocean.

    NASA Astrophysics Data System (ADS)

    Weller, R. A.; Gallage, C.; Send, U.; Lampitt, R. S.; Lukas, R.

    2016-02-01

    Time series observations at critical or representative locations are an essential element of a global ocean observing system; they are unique and complement other approaches to sustained observing. OceanSITES is an international group of oceanographers associated with such time series sites. OceanSITES exists to promote the continuation and extension of ocean time series sites around the globe, and to plan and oversee the global array of sites in order to address the needs of research, climate change detection, operational applications, and policy makers. OceanSITES is a voluntary group that sits as an Action Group of the JCOMM-OPS Data Buoy Cooperation Panel, where JCOMM-OPS is the operational ocean observing oversight group of the Joint Commission on Oceanography and Marine Meteorology of the International Oceanographic Commission and the World Meteorological Organization. The way forward includes working to complete the global array, moving toward multidisciplinary instrumentation on a subset of the sites, and increasing utilization of the time series data, which are freely available from two Global Data Assembly Centers, one at the National Data Buoy Center and one at Coriolis at IFREMER. One recent OceanSITES initiative and several results from OceanSITES time series sites are presented. The recent initiative was the assembly of a pool of temperature/conductivity recorders for provision to OceanSITES sites in order to provide deep ocean temperature and salinity time series. Examples from specific sites include: a 15-year record of surface meteorology and air-sea fluxes from off northern Chile that shows evidence of long-term trends in surface forcing; changes in upper ocean salinity and stratification, associated with regional change in the hydrological cycle, seen at the Hawaii time series site; results from monitoring Atlantic meridional transport; and results from a European multidisciplinary time series site.

  13. Quantifying complexity of financial short-term time series by composite multiscale entropy measure

    NASA Astrophysics Data System (ADS)

    Niu, Hongli; Wang, Jun

    2015-05-01

    It is significant to study the complexity of financial time series since the financial market is a complex, evolving dynamic system. Multiscale entropy is a prevailing method used to quantify the complexity of a time series, but its entropy estimates are less reliable for short-term time series at large time scales. A modified method, composite multiscale entropy (CMSE), is therefore applied to the financial market. To qualify its effectiveness, its applications to synthetic white noise and 1/f noise with different data lengths are first reproduced in the present paper. The method is then introduced, for the first time, in a reliability test with two Chinese stock indices. When applied to short-term return series, the CMSE method shows advantages in reducing the deviation of entropy estimates and demonstrates more stable and reliable results than the conventional MSE algorithm. Finally, the composite multiscale entropy of six important stock indices from the world financial markets is investigated, and some useful and interesting empirical results are obtained.
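
    CMSE differs from conventional MSE in one step: instead of a single coarse-graining per scale, it averages the sample entropy over all shifted coarse-grainings. A minimal sketch (with an absolute tolerance r; in practice r is commonly set to 0.2 times the series standard deviation):

```python
import math

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r): negative log of the conditional probability that
    sequences matching for m points also match for m + 1 points."""
    def count_matches(mm):
        templates = [x[i:i + mm] for i in range(len(x) - mm + 1)]
        c = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    c += 1
        return c
    B, A = count_matches(m), count_matches(m + 1)
    return -math.log(A / B) if A > 0 and B > 0 else float("inf")

def composite_multiscale_entropy(x, scale, m=2, r=0.2):
    """CMSE: average SampEn over all `scale` shifted coarse-grainings,
    which stabilizes the estimate for short series."""
    vals = []
    for shift in range(scale):
        cg = [sum(x[i:i + scale]) / scale
              for i in range(shift, len(x) - scale + 1, scale)]
        vals.append(sample_entropy(cg, m, r))
    return sum(vals) / len(vals)
```

    Averaging over the `scale` shifted coarse-grainings is exactly what reduces the estimation deviation at large scales, where each individual coarse-grained series becomes short.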

  14. A Space Affine Matching Approach to fMRI Time Series Analysis.

    PubMed

    Chen, Liang; Zhang, Weishi; Liu, Hongbo; Feng, Shigang; Chen, C L Philip; Wang, Huili

    2016-07-01

    For fMRI time series analysis, an important challenge is to overcome the potential delay between the hemodynamic response signal and the cognitive stimuli signal, namely the same frequency but different phase (SFDP) problem. In this paper, a novel space affine matching feature is presented by introducing time domain and frequency domain features. The time domain feature is used to discern different stimuli, while the frequency domain feature eliminates the delay. We then propose a space affine matching (SAM) algorithm to match fMRI time series by our affine feature, in which a normal vector is estimated using gradient descent to explore the optimal time series matching. The experimental results illustrate that the SAM algorithm is insensitive to the delay between the hemodynamic response signal and the cognitive stimuli signal. Our approach significantly outperforms the GLM method when the delay exists. The approach can help solve the SFDP problem in fMRI time series matching and is thus of great promise for revealing brain dynamics.

  15. An improvement of the measurement of time series irreversibility with visibility graph approach

    NASA Astrophysics Data System (ADS)

    Wu, Zhenyu; Shang, Pengjian; Xiong, Hui

    2018-07-01

    We propose a method to improve the measurement of real-valued time series irreversibility which combines two tools: the directed horizontal visibility graph and the Kullback-Leibler divergence. The degree of time irreversibility is estimated by the Kullback-Leibler divergence between the in- and out-degree distributions of the associated visibility graph. In our work, we reframe the in- and out-degree distributions by encoding them with the different embedding dimensions used in calculating permutation entropy (PE). With this improved method, we can not only estimate time series irreversibility efficiently, but also detect it across multiple dimensions. We verify the validity of our method and then estimate the amount of time irreversibility of series generated by chaotic maps as well as of global stock markets over the period 2005-2015. The results show that the amount of time irreversibility peaks at embedding dimension d = 3 both in the simulation experiments and in the financial markets.
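
    The two base tools, the directed horizontal visibility graph and the KL divergence between its degree distributions, can be sketched as follows. This is the standard DHVG construction only, without the permutation-entropy encoding the paper adds; the handling of degrees present in only one distribution is a simplifying convention here.

```python
import math

def dhvg_degrees(x):
    """Directed horizontal visibility graph: node t links forward to s > t
    when every value strictly between them is lower than min(x[t], x[s])."""
    n = len(x)
    out_deg = [0] * n  # links to the future
    in_deg = [0] * n   # links from the past
    for t in range(n):
        for s in range(t + 1, n):
            if all(x[k] < min(x[t], x[s]) for k in range(t + 1, s)):
                out_deg[t] += 1
                in_deg[s] += 1
    return in_deg, out_deg

def kld_irreversibility(x):
    """KL divergence between out- and in-degree distributions: zero for a
    statistically time-reversible series, positive otherwise."""
    in_deg, out_deg = dhvg_degrees(x)
    def dist(deg):
        p = {}
        for d in deg:
            p[d] = p.get(d, 0) + 1 / len(deg)
        return p
    p_out, p_in = dist(out_deg), dist(in_deg)
    # Sum only over degrees seen in both distributions (simplification).
    return sum(p * math.log(p / p_in[k]) for k, p in p_out.items() if k in p_in)
```

    Reversing the series swaps the roles of in- and out-degrees, which is why the asymmetry between the two degree distributions measures irreversibility.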

  16. Data imputation analysis for Cosmic Rays time series

    NASA Astrophysics Data System (ADS)

    Fernandes, R. C.; Lucio, P. S.; Fernandez, J. H.

    2017-05-01

    The occurrence of missing data in Galactic Cosmic Ray (GCR) time series is inevitable, since data loss results from mechanical and human failure, technical problems and different periods of operation of GCR stations. The aim of this study was to perform multiple imputation in order to reconstruct the observational dataset. The study used the monthly time series of GCR from Climax (CLMX) and Roma (ROME) from 1960 to 2004 to simulate scenarios of 10% to 90% missing data (in steps of 10%) relative to the observed ROME series, with 50 replicates; the CLMX station was used as a proxy for allocating these scenarios. Three different methods for monthly dataset imputation were selected: Amelia II, which runs a bootstrap Expectation-Maximization algorithm; MICE, which runs an algorithm based on Multivariate Imputation by Chained Equations; and MTSDI, an Expectation-Maximization-based method for imputation of missing values in multivariate normal time series. The synthetic time series were evaluated against the observed ROME series using several skill measures, such as RMSE, NRMSE, the Agreement Index, R, R2, the F-test and the t-test. The results showed that for CLMX and ROME, the R2 and R statistics were equal to 0.98 and 0.96, respectively. Increasing the number of gaps degrades the quality of the imputed time series. Imputation was most efficient with the MTSDI method, with negligible errors and the best skill coefficients. The results suggest a limit of about 60% missing data for imputation of monthly averages. It is noteworthy that the CLMX, ROME and KIEL stations present no missing data in the target period. This methodology allowed 43 time series to be reconstructed.
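
    Several of the skill measures named above have simple closed forms. The sketch below implements three of them; we assume the "Agreement Index" refers to Willmott's index of agreement, which is the common usage, and normalizing NRMSE by the observed range is likewise an assumption (normalizing by the mean is also common).

```python
import numpy as np

def rmse(obs, sim):
    """Root-mean-square error between simulated and observed values."""
    return float(np.sqrt(np.mean((sim - obs) ** 2)))

def nrmse(obs, sim):
    """RMSE normalized by the observed range."""
    return rmse(obs, sim) / float(obs.max() - obs.min())

def agreement_index(obs, sim):
    """Willmott's index of agreement, d in [0, 1] (1 = perfect)."""
    o_bar = obs.mean()
    num = np.sum((sim - obs) ** 2)
    den = np.sum((np.abs(sim - o_bar) + np.abs(obs - o_bar)) ** 2)
    return float(1.0 - num / den)
```

    In an imputation study these would be evaluated between each synthetic (imputed) series and the observed series it is meant to reproduce.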

  17. Analyses of Inhomogeneities in Radiosonde Temperature and Humidity Time Series.

    NASA Astrophysics Data System (ADS)

    Zhai, Panmao; Eskridge, Robert E.

    1996-04-01

    Twice-daily radiosonde data from selected stations in the United States (period 1948 to 1990) and China (period 1958 to 1990) were sorted into time series. These stations have one sounding taken in darkness and the other in sunlight. The analysis shows that the 0000 and 1200 UTC time series are highly correlated. Therefore, the Easterling and Peterson technique was tested on the 0000 and 1200 UTC time series to detect inhomogeneities and to estimate the size of the biases. Discontinuities were detected using the difference series created from the 0000 and 1200 UTC time series. To establish that a detected bias was significant, a t test was performed to confirm that the change occurs in the daytime series but not in the nighttime series. Both U.S. and Chinese radiosonde temperature and humidity data include inhomogeneities caused by changes in radiosonde sensors and observation times. The U.S. humidity data have inhomogeneities caused by instrument changes and the censoring of data: the practice of reporting relative humidity as 19% when it is lower than 20% or the temperature is below -40°C is called censoring. This combination of procedural and instrument changes makes the detection of biases and adjustment of the data very difficult. In the Chinese temperatures, there are inhomogeneities related to a change in the radiation correction procedure. Test results demonstrate that a modified Easterling and Peterson method is suitable for detecting and adjusting radiosonde time series data. Accurate station histories are very desirable, since they can confirm that detected inhomogeneities are related to instrument or procedural changes; adjustments can then be made to the data with some confidence.

  18. Model-based Clustering of Categorical Time Series with Multinomial Logit Classification

    NASA Astrophysics Data System (ADS)

    Frühwirth-Schnatter, Sylvia; Pamminger, Christoph; Winter-Ebmer, Rudolf; Weber, Andrea

    2010-09-01

    A common problem in many areas of applied statistics is to identify groups of similar time series in a panel of time series. However, distance-based clustering methods cannot easily be extended to time series data, where an appropriate distance measure is rather difficult to define, particularly for discrete-valued time series. Markov chain clustering, proposed by Pamminger and Frühwirth-Schnatter [6], is an approach for clustering discrete-valued time series obtained by observing a categorical variable with several states. This model-based clustering method is based on finite mixtures of first-order time-homogeneous Markov chain models. In order to further explain group membership, we present an extension to the approach of Pamminger and Frühwirth-Schnatter [6] by formulating a probabilistic model for the latent group indicators within the Bayesian classification rule, using a multinomial logit model. The parameters are estimated for a fixed number of clusters within a Bayesian framework using a Markov chain Monte Carlo (MCMC) sampling scheme, a (full) Gibbs-type sampler which involves only draws from standard distributions. Finally, an application to a panel of Austrian wage mobility data is presented, which leads to an interesting segmentation of the Austrian labour market.

  19. Fault Diagnosis from Raw Sensor Data Using Deep Neural Networks Considering Temporal Coherence.

    PubMed

    Zhang, Ran; Peng, Zhen; Wu, Lifeng; Yao, Beibei; Guan, Yong

    2017-03-09

    Intelligent condition monitoring and fault diagnosis by analyzing sensor data can assure the safety of machinery. Conventional fault diagnosis and classification methods usually apply pretreatments to reduce noise and extract time domain or frequency domain features from raw time series sensor data, and then use classifiers to make the diagnosis. However, these conventional approaches depend on expertise for feature selection and do not consider the temporal coherence of time series data. This paper proposes a fault diagnosis model based on Deep Neural Networks (DNNs) that can recognize raw time series sensor data directly, without feature selection or signal processing, and that takes advantage of the temporal coherence of the data. First, raw time series training data collected by sensors are used to train the DNN until its cost function reaches a minimum; second, test data are used to measure the classification accuracy of the DNN on local time series data; finally, fault diagnosis considering temporal coherence with preceding time series data is performed. Experimental results show that the classification accuracy for bearing faults can reach 100%. The proposed fault diagnosis approach is effective in recognizing the type of bearing faults.

  20. Recurrent Neural Networks for Multivariate Time Series with Missing Values.

    PubMed

    Che, Zhengping; Purushotham, Sanjay; Cho, Kyunghyun; Sontag, David; Liu, Yan

    2018-04-17

    Multivariate time series data in practical applications, such as health care, geoscience, and biology, are characterized by a variety of missing values. In time series prediction and other related tasks, it has been noted that missing values and their missing patterns are often correlated with the target labels, a.k.a., informative missingness. There is very limited work on exploiting the missing patterns for effective imputation and improving prediction performance. In this paper, we develop novel deep learning models, namely GRU-D, as one of the early attempts. GRU-D is based on Gated Recurrent Unit (GRU), a state-of-the-art recurrent neural network. It takes two representations of missing patterns, i.e., masking and time interval, and effectively incorporates them into a deep model architecture so that it not only captures the long-term temporal dependencies in time series, but also utilizes the missing patterns to achieve better prediction results. Experiments of time series classification tasks on real-world clinical datasets (MIMIC-III, PhysioNet) and synthetic datasets demonstrate that our models achieve state-of-the-art performance and provide useful insights for better understanding and utilization of missing values in time series analysis.
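
    The two representations of missing patterns that GRU-D consumes, masking and time interval, can be computed directly from a series with gaps. The sketch below derives both and applies the paper's input-decay idea of relaxing the last observation toward the empirical mean; the fixed decay rate w is a stand-in for a parameter that GRU-D learns jointly with the network, so this illustrates only the input representations, not the model itself.

```python
import numpy as np

def grud_representations(x, timestamps):
    """Masking and time-interval inputs in the GRU-D style, from a series with NaNs.

    x: (T, D) array with np.nan at missing entries; timestamps: (T,) sample times.
    Returns masking m (1 = observed), time interval delta (time since the last
    observation of each variable), and a decayed input that relaxes the last
    observed value toward the empirical mean.
    """
    T, D = x.shape
    m = (~np.isnan(x)).astype(float)
    delta = np.zeros((T, D))
    for t in range(1, T):
        dt = timestamps[t] - timestamps[t - 1]
        # the interval resets after an observation, otherwise keeps accumulating
        delta[t] = np.where(m[t - 1] == 1, dt, dt + delta[t - 1])
    x_mean = np.nanmean(x, axis=0)
    x_last = np.where(m == 1, x, np.nan)
    for t in range(1, T):  # forward-fill the last observed value
        x_last[t] = np.where(np.isnan(x_last[t]), x_last[t - 1], x_last[t])
    x_last = np.where(np.isnan(x_last), x_mean, x_last)
    w = 0.5  # hypothetical fixed decay rate; learned jointly in the paper
    gamma = np.exp(-np.maximum(0.0, w * delta))
    x_hat = m * np.nan_to_num(x) + (1 - m) * (gamma * x_last + (1 - gamma) * x_mean)
    return m, delta, x_hat
```

    Feeding m, delta and the decayed input to a recurrent cell is what lets the model exploit informative missingness rather than discarding it.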

  1. Fault Diagnosis from Raw Sensor Data Using Deep Neural Networks Considering Temporal Coherence

    PubMed Central

    Zhang, Ran; Peng, Zhen; Wu, Lifeng; Yao, Beibei; Guan, Yong

    2017-01-01

    Intelligent condition monitoring and fault diagnosis by analyzing sensor data can assure the safety of machinery. Conventional fault diagnosis and classification methods usually apply pretreatments to reduce noise and extract time domain or frequency domain features from raw time series sensor data, and then use classifiers to make the diagnosis. However, these conventional approaches depend on expertise for feature selection and do not consider the temporal coherence of time series data. This paper proposes a fault diagnosis model based on Deep Neural Networks (DNNs) that can recognize raw time series sensor data directly, without feature selection or signal processing, and that takes advantage of the temporal coherence of the data. First, raw time series training data collected by sensors are used to train the DNN until its cost function reaches a minimum; second, test data are used to measure the classification accuracy of the DNN on local time series data; finally, fault diagnosis considering temporal coherence with preceding time series data is performed. Experimental results show that the classification accuracy for bearing faults can reach 100%. The proposed fault diagnosis approach is effective in recognizing the type of bearing faults. PMID:28282936

  2. Measurement of cardiac output from dynamic pulmonary circulation time CT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yee, Seonghwan, E-mail: Seonghwan.Yee@Beaumont.edu; Scalzetti, Ernest M.

    Purpose: To introduce a method of estimating cardiac output from the dynamic pulmonary circulation time CT that is primarily used to determine the optimal time window for CT pulmonary angiography (CTPA). Methods: Dynamic pulmonary circulation time CT series, acquired for eight patients, were retrospectively analyzed. The dynamic CT series was acquired, prior to the main CTPA, in cine mode (1 frame/s) for a single slice at the level of the main pulmonary artery, covering the cross sections of the ascending aorta (AA) and descending aorta (DA), during the infusion of iodinated contrast. The time series of contrast changes obtained for DA, which is downstream of AA, was assumed to be related to the time series for AA by convolution with a delay function. The delay time constant in the delay function, representing the average transit time between the cross sections of AA and DA, was determined by least-squares fitting of the convolved AA time series to the DA time series. The cardiac output was then calculated by dividing the volume of the aortic arch between the cross sections of AA and DA (estimated from the single-slice CT image) by the average transit time, and multiplying the result by a correction factor. Results: The mean cardiac output for six patients was 5.11 l/min (standard deviation 1.57 l/min), in good agreement with literature values; the data for the other two patients were too noisy for processing. Conclusions: The dynamic single-slice pulmonary circulation time CT series can also be used to estimate cardiac output.
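
    The arithmetic of the method is easy to reproduce. The sketch below estimates the AA-to-DA transit time by least-squares fitting of a pure delay (the simplest member of the delay-function family the authors fit) and converts it to cardiac output; the 340 ml arch volume, the 4 s delay, the Gaussian bolus shape and the unit correction factor are hypothetical numbers chosen only to make the example concrete.

```python
import numpy as np

def transit_time(aa, da, dt=1.0, max_shift=30):
    """Estimate the AA-to-DA transit time by least-squares fit of a pure delay.

    A shifted delta (grid search over integer shifts) stands in here for the
    parameterized delay function convolved with the AA curve in the paper.
    """
    best_tau, best_err = 0.0, np.inf
    for s in range(1, max_shift):
        err = np.mean((aa[:-s] - da[s:]) ** 2)
        if err < best_err:
            best_tau, best_err = s * dt, err
    return best_tau

def cardiac_output(arch_volume_ml, tau_s, correction=1.0):
    """Cardiac output in l/min: arch volume between the AA and DA cross
    sections divided by the transit time, times a correction factor."""
    return correction * (arch_volume_ml / 1000.0) / (tau_s / 60.0)

# Synthetic enhancement curves: DA is AA delayed by 4 s (1 frame/s, as in cine mode).
t = np.arange(60.0)
aa = np.exp(-0.5 * ((t - 20.0) / 5.0) ** 2)  # idealized bolus curve
da = np.roll(aa, 4)
da[:4] = 0.0
tau = transit_time(aa, da)
co = cardiac_output(arch_volume_ml=340.0, tau_s=tau)
```

    With these illustrative numbers the estimate lands at 5.1 l/min, the same order as the study's reported mean of 5.11 l/min.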

  3. Ultrasensitive and high-throughput analysis of chlorophyll a in marine phytoplankton extracts using a fluorescence microplate reader.

    PubMed

    Mandalakis, Manolis; Stravinskaitė, Austėja; Lagaria, Anna; Psarra, Stella; Polymenakou, Paraskevi

    2017-07-01

    Chlorophyll a (Chl a) is the predominant pigment in every photosynthesizing organism, including phytoplankton, and one of the most commonly measured water quality parameters. Various methods are available for Chl a analysis, but the majority of them are of limited throughput and require considerable effort and time from the operator. The present study describes a high-throughput, microplate-based fluorometric assay for rapid quantification of Chl a in phytoplankton extracts. Microplate sealing combined with ice cooling proved an effective means of diminishing solvent evaporation during sample loading and minimized the analytical errors involved in Chl a measurements with a fluorescence microplate reader. A set of operating parameters (settling time, detector gain, sample volume) was also optimized to further improve the intensity and reproducibility of the Chl a fluorescence signal. A quadratic regression model provided the best fit (r² = 0.9998) across the entire calibration range (0.05-240 pg μL⁻¹). The method offered excellent intra- and interday precision (% RSD 2.2 to 11.2%) and accuracy (% relative error -3.8 to 13.8%), while it presented particularly low limits of detection (0.044 pg μL⁻¹) and quantification (0.132 pg μL⁻¹). The present assay was successfully applied to marine phytoplankton extracts, and the overall results were consistent (average % relative error -14.8%) with Chl a concentrations (including divinyl Chl a) measured by high-performance liquid chromatography (HPLC). More importantly, the microplate-based method allowed the analysis of 96 samples/standards within a few minutes, instead of hours or days, when using a traditional cuvette-based fluorometer or an HPLC system. Graphical abstract: TChl a concentrations (i.e., the sum of Chl a and divinyl Chl a, in ng L⁻¹) measured in seawater samples by HPLC and fluorescence microplate reader.

  4. NASA Tech Briefs, December 2007

    NASA Technical Reports Server (NTRS)

    2007-01-01

    Topics include: Ka-Band TWT High-Efficiency Power Combiner for High-Rate Data Transmission; Reusable, Extensible High-Level Data-Distribution Concept; Processing Satellite Imagery To Detect Waste Tire Piles; Monitoring by Use of Clusters of Sensor-Data Vectors; Circuit and Method for Communication Over DC Power Line; Switched Band-Pass Filters for Adaptive Transceivers; Noncoherent DTTLs for Symbol Synchronization; High-Voltage Power Supply With Fast Rise and Fall Times; Waveguide Calibrator for Multi-Element Probe Calibration; Four-Way Ka-Band Power Combiner; Loss-of-Control-Inhibitor Systems for Aircraft; Improved Underwater Excitation-Emission Matrix Fluorometer; Metrology Camera System Using Two-Color Interferometry; Design and Fabrication of High-Efficiency CMOS/CCD Imagers; Foam Core Shielding for Spacecraft; CHEM-Based Self-Deploying Planetary Storage Tanks; Sequestration of Single-Walled Carbon Nanotubes in a Polymer; PPC750 Performance Monitor; Application-Program-Installer Builder; Using Visual Odometry to Estimate Position and Attitude; Design and Data Management System; Simple, Script-Based Science Processing Archive; Automated Rocket Propulsion Test Management; Online Remote Sensing Interface; Fusing Image Data for Calculating Position of an Object; Implementation of a Point Algorithm for Real-Time Convex Optimization; Handling Input and Output for COAMPS; Modeling and Grid Generation of Iced Airfoils; Automated Identification of Nucleotide Sequences; Balloon Design Software; Rocket Science 101 Interactive Educational Program; Creep Forming of Carbon-Reinforced Ceramic-Matrix Composites; Dog-Bone Horns for Piezoelectric Ultrasonic/Sonic Actuators; Benchtop Detection of Proteins; Recombinant Collagenlike Proteins; Remote Sensing of Parasitic Nematodes in Plants; Direct Coupling From WGM Resonator Disks to Photodetectors; Using Digital Radiography To Image Liquid Nitrogen in Voids; Multiple-Parameter, Low-False-Alarm Fire-Detection Systems; Mosaic-Detector-Based Fluorescence Spectral Imager; Plasmoid Thruster for High Specific-Impulse Propulsion; Analysis Method for Quantifying Vehicle Design Goals; Improved Tracking of Targets by Cameras on a Mars Rover; Sample Caching Subsystem; Multistage Passive Cooler for Spaceborne Instruments; GVIPS Models and Software; Stowable Energy-Absorbing Rocker-Bogie Suspensions.

  5. Characterisation of the anthropogenic contribution to coastal fluorescent organic matter

    NASA Astrophysics Data System (ADS)

    El Nahhal, Ibrahim; Nouhi, Ayoub; Mounier, Stéphane

    2015-04-01

    It is known that most coastal fluorescent organic matter is of terrestrial origin (Parlanti, 2000; Tedetti, Guigue, & Goutx, 2010). However, the contribution of anthropogenic organic matter to this pool is not well defined or evaluated. In this work, the little bay (Toulon Bay, France) was monitored in order to determine the fluorescent organic response during a winter period. The sampling campaign covered several days of December 2014 (the 12th, 15th, 17th and 19th) at 21 different sampling sites for the fluorescence measurements (without any filtering of the samples), and the whole month of December for the bacterial and turbidity measurements. Excitation-emission matrices (EEMs) of fluorescence (200 to 400 nm excitation and 220 to 420 nm emission range) were treated by parallel factor analysis (PARAFAC); the PARAFAC analysis of the EEM datasets was conducted using the PROGMEEF software in the Matlab language. At the same time, turbidity and bacterial measurements (particularly the E. coli concentration) were made. The results give, over a short time range, information on the contribution of anthropogenic inputs to the coastal fluorescent organic matter. In addition, the effect of salinity on the photochemical degradation of anthropogenic organic matter (especially that from wastewater treatment plants) will be studied by way of laboratory experiments to investigate its fate at the water end member. Parlanti, E. (2000). Dissolved organic matter fluorescence spectroscopy as a tool to estimate biological activity in a coastal zone submitted to anthropogenic inputs. Organic Geochemistry, 31(12), 1765-1781. doi:10.1016/S0146-6380(00)00124-8. Tedetti, M., Guigue, C., & Goutx, M. (2010). Utilization of a submersible UV fluorometer for monitoring anthropogenic inputs in the Mediterranean coastal waters. Marine Pollution Bulletin, 60(3), 350-362. doi:10.1016/j.marpolbul.2009.10.018.

  6. In Situ Stoichiometry in a Large River: Continuous Measurement of Doc, NO3 and PO4 in the Sacramento River

    NASA Astrophysics Data System (ADS)

    Downing, B. D.; Pellerin, B. A.; Bergamaschi, B. A.; Saraceno, J.

    2011-12-01

    Studying controls on geochemical processes in rivers and streams is difficult because concentration and composition often change rapidly in response to physical and biological forcings. Understanding biogeochemical dynamics in rivers will improve current understanding of the role of watershed sources in carbon cycling, river and stream ecology, and loads to estuaries and oceans. Continuous measurements of dissolved organic carbon (DOC), nitrate (NO3-) and soluble reactive phosphate (SRP) concentrations are now possible, along with some information about DOC composition. In situ sensors designed to measure these constituents provide high-frequency, real-time data that can elucidate hydrologic and biogeochemical controls which are difficult to detect using more traditional sampling approaches. Here we present a coupled approach, using in situ optical instrumentation together with discharge measurements, to provide quantitative estimates of constituent loads and to investigate C, NO3- and SRP sources and processing in the Sacramento River, CA, USA. Continuous measurement of DOC concentration was conducted over the course of an entire year using a miniature in situ fluorometer (Turner Designs Cyclops) designed to measure chromophoric dissolved organic matter fluorescence (FDOM). Nitrate was measured concurrently using a Satlantic SUNA, and phosphate was measured using a WET Labs Cycle-P instrument for a two-week period in July 2011. Continuous measurements from these instruments, paired with continuous measurement of physical water quality variables such as temperature, pH, specific conductance, dissolved oxygen and turbidity, were used to investigate the physical and chemical dynamics of DOC, NO3- and SRP over varying time scales. Deploying these instruments at pre-existing USGS discharge gages allowed calculation of instantaneous and integrated constituent fluxes, as well as filling gaps in our understanding of biogeochemical processes and transport. Results from the study show that diurnal, event-driven and seasonal changes are key to calculating accurate watershed fluxes and to detecting transient sources of DOC, NO3- and SRP.

  7. Distribution of Arctic and Pacific copepods and their habitat in the northern Bering Sea and Chukchi Sea

    NASA Astrophysics Data System (ADS)

    Sasaki, H.; Matsuno, K.; Fujiwara, A.; Onuka, M.; Yamaguchi, A.; Ueno, H.; Watanuki, Y.; Kikuchi, T.

    2015-11-01

    The advection of warm Pacific water and the reduction of sea-ice extent in the western Arctic Ocean may influence the abundance and distribution of copepods, a key component of food webs. To understand the factors affecting the abundance of copepods in the northern Bering Sea and Chukchi Sea, we constructed habitat models explaining the spatial patterns of large Arctic, small Arctic and Pacific copepods separately, using generalized additive models. Copepods were sampled with a NORPAC net. Vertical profiles of density, temperature and salinity in the seawater were measured using a CTD, and the concentration of chlorophyll a in seawater was measured with a fluorometer. The timing of sea-ice retreat was determined from satellite imagery. To quantify the structure of water masses, the magnitude of the pycnocline and the averaged density, temperature and salinity in the upper and bottom layers were scored along three axes using principal component analysis (PCA). The water-mass structures indexed by the PCA scores were selected as explanatory variables in the best models. Large Arctic copepods were abundant in water masses with high-salinity bottom water, or with cold/low-salinity upper water and cold/high-salinity bottom water; small Arctic copepods were abundant in water masses with warm/saline upper water and cold/high-salinity bottom water; and Pacific copepods were likewise abundant in water masses with warm/saline upper water and cold/high-salinity bottom water. All copepod groups were abundant in deeper areas. Although chlorophyll a in the upper and bottom layers was selected as an explanatory variable in the best models, no apparent trends were observed. All copepod groups were abundant where the sea ice retreated earlier. Our study might indicate potential positive effects of the reduction of sea-ice extent on the distribution of all copepod groups in the Arctic Ocean.

  8. A geodetic matched filter search for slow slip with application to the Mexico subduction zone

    NASA Astrophysics Data System (ADS)

    Rousset, B.; Campillo, M.; Lasserre, C.; Frank, W. B.; Cotte, N.; Walpersdorf, A.; Socquet, A.; Kostoglodov, V.

    2017-12-01

    Since the discovery of slow slip events, many methods have been successfully applied to model obvious transient events in geodetic time series, such as the widely used network strain filter. Independent seismological observations of tremors or low-frequency earthquakes and repeating earthquakes provide evidence of low-amplitude slow deformation but do not always coincide with clear occurrences of transient signals in geodetic time series. Here we aim to extract the signal corresponding to slow slips hidden in the noise of GPS time series, without using information from independent data sets. We first build a library of synthetic slow slip event templates by assembling a source function with Green's functions for a discretized fault. We then correlate the templates with postprocessed GPS time series. Once the events have been detected in time, we estimate their duration T and magnitude Mw by modeling a weighted stack of GPS time series. An analysis of synthetic time series shows that this method is able to resolve the correct timing, location, T, and Mw of events larger than Mw 6 in the context of the Mexico subduction zone. Applied on a real data set of 29 GPS time series in the Guerrero area from 2005 to 2014, this technique allows us to detect 28 transient events from Mw 6.3 to 7.2 with durations that range from 3 to 39 days. These events have a dominant recurrence time of 40 days and are mainly located at the downdip edges of the Mw>7.5 slow slip events.
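
    The detection step, correlating synthetic transient templates with GPS time series, can be illustrated in a few lines. The sketch below slides a normalized template over a single noisy displacement series and flags windows whose correlation coefficient exceeds a threshold; the arctangent-shaped template, the noise level and the 0.7 threshold are illustrative assumptions, and the real method stacks many stations and builds templates from fault Green's functions.

```python
import numpy as np

def matched_filter_detect(series, template, threshold=0.7):
    """Normalized sliding correlation of a transient template with a series.

    Returns the indices where the correlation coefficient exceeds the
    detection threshold, plus the full correlation trace.
    """
    n, m = len(series), len(template)
    tz = (template - template.mean()) / template.std()
    corr = np.empty(n - m + 1)
    for i in range(n - m + 1):
        w = series[i:i + m]
        sd = w.std()
        corr[i] = 0.0 if sd == 0 else np.mean((w - w.mean()) / sd * tz)
    return np.where(corr > threshold)[0], corr

# Hypothetical slow-slip template: a smooth step (arctangent) in displacement.
t = np.arange(30.0)
template = np.arctan((t - 15.0) / 3.0)

rng = np.random.default_rng(0)
series = 0.05 * rng.standard_normal(200)
series[80:110] += template  # bury one transient in the noise
hits, corr = matched_filter_detect(series, template)
```

    Here the single synthetic transient buried at sample 80 is recovered as the correlation maximum.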

  9. A geodetic matched-filter search for slow slip with application to the Mexico subduction zone

    NASA Astrophysics Data System (ADS)

    Rousset, B.; Campillo, M.; Lasserre, C.; Frank, W.; Cotte, N.; Walpersdorf, A.; Socquet, A.; Kostoglodov, V.

    2017-12-01

    Since the discovery of slow slip events, many methods have been successfully applied to model obvious transient events in geodetic time series, such as the widely used network strain filter. Independent seismological observations of tremors or low frequency earthquakes and repeating earthquakes provide evidence of low amplitude slow deformation but do not always coincide with clear occurrences of transient signals in geodetic time series. Here, we aim to extract the signal corresponding to slow slips hidden in the noise of GPS time series, without using information from independent datasets. We first build a library of synthetic slow slip event templates by assembling a source function with Green's functions for a discretized fault. We then correlate the templates with post-processed GPS time series. Once the events have been detected in time, we estimate their duration T and magnitude Mw by modelling a weighted stack of GPS time series. An analysis of synthetic time series shows that this method is able to resolve the correct timing, location, T and Mw of events larger than Mw 6.0 in the context of the Mexico subduction zone. Applied on a real data set of 29 GPS time series in the Guerrero area from 2005 to 2014, this technique allows us to detect 28 transient events from Mw 6.3 to 7.2 with durations that range from 3 to 39 days. These events have a dominant recurrence time of 40 days and are mainly located at the down dip edges of the Mw > 7.5 SSEs.

  10. A cluster merging method for time series microarray with production values.

    PubMed

    Chira, Camelia; Sedano, Javier; Camara, Monica; Prieto, Carlos; Villar, Jose R; Corchado, Emilio

    2014-09-01

    A challenging task in time-course microarray data analysis is to cluster genes meaningfully, combining the information provided by multiple replicates covering the same key time points. This paper proposes a novel cluster merging method to accomplish this goal, obtaining groups of highly correlated genes. The main idea behind the proposed method is to generate a clustering starting from groups created based on the individual temporal series (representing different biological replicates measured at the same time points) and to merge them by taking into account the frequency with which two genes are assembled together across clusterings. The gene groups at the level of individual time series are generated using several shape-based clustering methods. This study is focused on a real-world time series microarray task with the aim of finding co-expressed genes related to the production and growth of a certain bacterium. The shape-based clustering methods used at the level of individual time series rely on identifying similar gene expression patterns over time, which, in some models, are further matched to the pattern of production/growth. The proposed cluster merging method is able to produce meaningful gene groups which can be naturally ranked by the level of agreement on the clustering among individual time series. The list of clusters and genes is further sorted based on the information correlation coefficient and new problem-specific relevance measures. Computational experiments and results of the cluster merging method are analyzed from a biological perspective and further compared with the clustering generated based on the mean value of the time series and the same shape-based algorithm.
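
    The merging idea, counting how often two genes land in the same cluster across replicate clusterings, can be sketched as follows. The connected-components merging rule and the 0.5 frequency threshold are stand-ins for the paper's procedure, chosen only to make the example self-contained.

```python
import numpy as np

def coassignment_matrix(labelings):
    """Fraction of replicate clusterings in which each pair of genes co-clusters.

    labelings: list of 1-D label arrays, one per replicate time series.
    """
    L = np.asarray(labelings)  # (replicates, genes)
    n = L.shape[1]
    C = np.zeros((n, n))
    for labels in L:
        C += (labels[:, None] == labels[None, :])
    return C / len(L)

def merge_clusters(labelings, threshold=0.5):
    """Merge genes whose co-assignment frequency exceeds the threshold.

    Connected components of the thresholded co-assignment graph become the
    final merged clusters.
    """
    C = coassignment_matrix(labelings)
    n = C.shape[0]
    group = -np.ones(n, dtype=int)
    g = 0
    for i in range(n):
        if group[i] < 0:
            stack = [i]
            group[i] = g
            while stack:  # flood-fill one component
                u = stack.pop()
                for v in range(n):
                    if group[v] < 0 and C[u, v] > threshold:
                        group[v] = g
                        stack.append(v)
            g += 1
    return group
```

    Genes merged at a high threshold are exactly those on which the replicate clusterings agree, which is what makes the final groups naturally rankable by agreement level.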

  11. Time series with tailored nonlinearities

    NASA Astrophysics Data System (ADS)

    Räth, C.; Laut, I.

    2015-10-01

    It is demonstrated how to generate time series with tailored nonlinearities by inducing well-defined constraints on the Fourier phases. Correlations between the phase information of adjacent phases and (static and dynamic) measures of nonlinearities are established and their origin is explained. By applying a set of simple constraints on the phases of an originally linear and uncorrelated Gaussian time series, the observed scaling behavior of the intensity distribution of empirical time series can be reproduced. The power law character of the intensity distributions being typical for, e.g., turbulence and financial data can thus be explained in terms of phase correlations.
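
    The basic machinery, keeping the Fourier amplitudes of a series while manipulating its phases, is compact. The sketch below generates a phase-randomized surrogate and allows a simple deterministic shift to be added to all phases; the paper's constraints on adjacent phases are richer than this uniform shift, which serves only to show where such constraints would enter.

```python
import numpy as np

def phase_surrogate(x, phase_shift=0.0, rng=None):
    """Surrogate with the same power spectrum as x but modified Fourier phases.

    phase_shift = 0 gives a fully phase-randomized (linear) surrogate; a common
    nonzero shift added to all phases is a minimal example of a deterministic
    phase constraint.
    """
    rng = np.random.default_rng(rng)
    n = len(x)
    X = np.fft.rfft(x)
    amp = np.abs(X)
    phases = rng.uniform(0, 2 * np.pi, len(X)) + phase_shift
    phases[0] = 0.0       # keep the mean component real
    if n % 2 == 0:
        phases[-1] = 0.0  # the Nyquist bin must stay real as well
    return np.fft.irfft(amp * np.exp(1j * phases), n=n)
```

    Because only the phases change, any nonlinearity measure that differs between the original series and an ensemble of such surrogates must originate in the phase information.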

  12. Wavelet analysis and scaling properties of time series

    NASA Astrophysics Data System (ADS)

    Manimaran, P.; Panigrahi, Prasanta K.; Parikh, Jitendra C.

    2005-10-01

    We propose a wavelet-based method for characterizing the scaling behavior of nonstationary time series. It makes use of the built-in ability of wavelets to capture trends in a data set over variable window sizes. Discrete wavelets from the Daubechies family are used to illustrate the efficacy of this procedure. After studying binomial multifractal time series with the present approach and with earlier detrending approaches for comparison, we analyze the time series of averaged spin density in the 2D Ising model at the critical temperature, along with several experimental data sets possessing multifractal behavior.

  13. Fragmentation of Solid Materials Using Shock Tubes. Part 1: First Test Series in a Small Diameter Shock Tube

    DTIC Science & Technology

    2017-01-01

    time histories with peak pressures of approximately 250 psi and 500 psi. 1.2 TESTING OBJECTIVES The first goal of this test series was to explore how...finally the late-time at-rest fragments were physically collected and analyzed post-test. Because this test series physically collected over 50,000...for a single fragmenting object. Comparing the three measurement techniques used in this test series, the late-time physically-collected mass

  14. Sequential Monte Carlo for inference of latent ARMA time-series with innovations correlated in time

    NASA Astrophysics Data System (ADS)

    Urteaga, Iñigo; Bugallo, Mónica F.; Djurić, Petar M.

    2017-12-01

    We consider the problem of sequential inference of latent time-series with innovations correlated in time and observed via nonlinear functions. We accommodate time-varying phenomena with diverse properties by means of a flexible mathematical representation of the data. We characterize statistically such time-series by a Bayesian analysis of their densities. The density that describes the transition of the state from time t to the next time instant t+1 is used for implementation of novel sequential Monte Carlo (SMC) methods. We present a set of SMC methods for inference of latent ARMA time-series with innovations correlated in time for different assumptions in knowledge of parameters. The methods operate in a unified and consistent manner for data with diverse memory properties. We show the validity of the proposed approach by comprehensive simulations of the challenging stochastic volatility model.
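
    As a point of reference for the methods described above, the sketch below is a bootstrap particle filter for a stochastic-volatility-type model, the application mentioned in the closing sentence, in its simplest AR(1)/white-innovation form; the paper's SMC methods extend this to latent ARMA states with innovations correlated in time, which this sketch does not attempt. The parameter values are illustrative.

```python
import numpy as np

def bootstrap_pf(y, n_part=500, phi=0.95, sigma=0.2, beta=0.7, rng=0):
    """Bootstrap particle filter for a stochastic-volatility-type model:

        x_t = phi * x_{t-1} + sigma * v_t    (latent log-volatility)
        y_t = beta * exp(x_t / 2) * e_t      (observation)

    Returns the filtered posterior mean of x_t at each time step.
    """
    rng = np.random.default_rng(rng)
    x = rng.normal(0.0, sigma / np.sqrt(1 - phi ** 2), n_part)  # stationary init
    means = []
    for yt in y:
        x = phi * x + sigma * rng.standard_normal(n_part)  # propagate particles
        var = (beta ** 2) * np.exp(x)
        logw = -0.5 * (np.log(2 * np.pi * var) + yt ** 2 / var)
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means.append(float(np.dot(w, x)))       # weighted filtered mean
        x = x[rng.choice(n_part, n_part, p=w)]  # multinomial resampling
    return np.array(means)
```

    Replacing the white innovations v_t with a time-correlated process is precisely where the state density from one time instant to the next, central to the paper, would change.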

  15. Visualizing frequent patterns in large multivariate time series

    NASA Astrophysics Data System (ADS)

    Hao, M.; Marwah, M.; Janetzko, H.; Sharma, R.; Keim, D. A.; Dayal, U.; Patnaik, D.; Ramakrishnan, N.

    2011-01-01

    The detection of previously unknown, frequently occurring patterns in time series, often called motifs, has been recognized as an important task. However, it is difficult to discover and visualize these motifs as their numbers increase, especially in large multivariate time series. To find frequent motifs, we use several temporal data mining and event encoding techniques to cluster and convert a multivariate time series to a sequence of events. Then we quantify the efficiency of the discovered motifs by linking them with a performance metric. To visualize frequent patterns in a large time series with potentially hundreds of nested motifs on a single display, we introduce three novel visual analytics methods: (1) motif layout, using colored rectangles for visualizing the occurrences and hierarchical relationships of motifs in a multivariate time series, (2) motif distortion, for enlarging or shrinking motifs as appropriate for easy analysis and (3) motif merging, to combine a number of identical adjacent motif instances without cluttering the display. Analysts can interactively optimize the degree of distortion and merging to get the best possible view. A specific motif (e.g., the most efficient or least efficient motif) can be quickly detected from a large time series for further investigation. We have applied these methods to two real-world data sets: data center cooling and oil well production. The results provide important new insights into the recurring patterns.
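
    The "event encoding" step described above can be sketched in a few lines: discretize the series into symbols, then count recurring subsequences as candidate motifs. The thresholds and symbol names here are illustrative, not the record's actual encoding.

```python
from collections import Counter

def encode_events(series, thresholds=(-0.5, 0.5)):
    """Discretize a univariate time series into symbolic events:
    'L' below the lower threshold, 'H' above the upper, 'M' otherwise."""
    lo, hi = thresholds
    return ''.join('L' if v < lo else 'H' if v > hi else 'M' for v in series)

def frequent_motifs(events, length, min_count=2):
    """Count every subsequence of the given length and keep those that
    occur at least min_count times (candidate motifs)."""
    counts = Counter(events[i:i + length] for i in range(len(events) - length + 1))
    return {m: c for m, c in counts.items() if c >= min_count}
```

    For example, the alternating series [0, 1, 0, 1, 0, 1] encodes to 'MHMHMH', whose length-2 motifs are 'MH' (three occurrences) and 'HM' (two). The visual layout, distortion, and merging methods of the record then operate on such motif occurrences.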

  16. Defect-Repairable Latent Feature Extraction of Driving Behavior via a Deep Sparse Autoencoder

    PubMed Central

    Taniguchi, Tadahiro; Takenaka, Kazuhito; Bando, Takashi

    2018-01-01

    Data representing driving behavior, as measured by various sensors installed in a vehicle, are collected as multi-dimensional sensor time-series data. These data often include redundant information; e.g., both the wheel speed and the engine speed represent the velocity of the vehicle. Redundant information can be expected to complicate the analysis: more factors need to be analyzed, and even the level of redundancy itself can influence the results. We assume that the measured multi-dimensional sensor time-series data of driving behavior are generated from low-dimensional data shared by the many one-dimensional series of which the multi-dimensional data are composed. Meanwhile, sensor time-series data may be defective because of sensor failure. Therefore, another important function is to reduce the negative effect of defective data when extracting low-dimensional time-series data. This study proposes a defect-repairable feature extraction method based on a deep sparse autoencoder (DSAE) to extract low-dimensional time-series data. In the experiments, we show that the DSAE provides high-performance latent feature extraction for driving behavior, even for defective sensor time-series data. In addition, we show that the negative effect of defects on the driving behavior segmentation task could be reduced using the latent features extracted by the DSAE. PMID:29462931
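
    The redundancy argument above can be made concrete without a neural network. The sketch below is a deliberately minimal stand-in for the DSAE, not the record's method: it repairs defective readings of one sensor from a redundant channel via a fitted linear relation (the wheel-speed/engine-speed pairing is the record's own example; the function names are assumptions).

```python
def fit_slope(xs, ys):
    """Least-squares slope through the origin for y ≈ a*x,
    fitted on defect-free samples of two redundant sensors."""
    num = sum(x * y for x, y in zip(xs, ys))
    den = sum(x * x for x in xs)
    return num / den

def repair(primary, redundant, slope):
    """Fill defective (None) readings of the primary sensor using the
    redundant channel and the fitted linear relation."""
    return [slope * r if p is None else p for p, r in zip(primary, redundant)]
```

    The DSAE generalizes this idea: it learns a shared low-dimensional representation across all channels, so any sufficiently redundant defective channel can be reconstructed from the rest.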

  17. The application of complex network time series analysis in turbulent heated jets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Charakopoulos, A. K.; Karakasidis, T. E., E-mail: thkarak@uth.gr; Liakopoulos, A.

    In the present study, we applied complex network-based time series analysis to experimental temperature time series from a vertical turbulent heated jet. More specifically, we approach the hydrodynamic problem of discriminating time series corresponding to various regions relative to the jet axis, i.e., distinguishing time series from regions close to the jet axis from time series originating in regions with a different dynamical regime, based on the constructed network properties. Applying the phase-space transformation method (k nearest neighbors) and also the visibility algorithm, we transformed the time series into networks and evaluated topological properties of the networks such as degree distribution, average path length, diameter, modularity, and clustering coefficient. The results show that the complex network approach allows distinguishing, identifying, and exploring in detail the various dynamical regions of the jet flow, and associating them with the corresponding physical behavior. In addition, in order to reject the hypothesis that the studied networks originate from a stochastic process, we generated random networks and compared their statistical properties with those originating from the experimental data. As far as the efficiency of the two network-construction methods is concerned, we conclude that both methodologies lead to network properties with almost the same qualitative behavior and allow us to reveal the underlying system dynamics.

  18. The application of complex network time series analysis in turbulent heated jets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Charakopoulos, A. K.; Karakasidis, T. E., E-mail: thkarak@uth.gr; Liakopoulos, A.

    2014-06-15

    In the present study, we applied complex network-based time series analysis to experimental temperature time series from a vertical turbulent heated jet. More specifically, we approach the hydrodynamic problem of discriminating time series corresponding to various regions relative to the jet axis, i.e., distinguishing time series from regions close to the jet axis from time series originating in regions with a different dynamical regime, based on the constructed network properties. Applying the phase-space transformation method (k nearest neighbors) and also the visibility algorithm, we transformed the time series into networks and evaluated topological properties of the networks such as degree distribution, average path length, diameter, modularity, and clustering coefficient. The results show that the complex network approach allows distinguishing, identifying, and exploring in detail the various dynamical regions of the jet flow, and associating them with the corresponding physical behavior. In addition, in order to reject the hypothesis that the studied networks originate from a stochastic process, we generated random networks and compared their statistical properties with those originating from the experimental data. As far as the efficiency of the two network-construction methods is concerned, we conclude that both methodologies lead to network properties with almost the same qualitative behavior and allow us to reveal the underlying system dynamics.
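
    The natural visibility algorithm named in the abstract maps a time series to a network in a simple geometric way: samples i and j are linked if the straight line between them clears every intermediate sample. A minimal sketch (O(n²) brute force, adequate for illustration):

```python
def visibility_edges(series):
    """Natural visibility graph: nodes are samples (t, y_t); nodes i < j
    are connected if every intermediate sample lies strictly below the
    line joining them:
        y_k < y_i + (y_j - y_i) * (k - i) / (j - i)  for all i < k < j."""
    n = len(series)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            if all(series[k] < series[i] + (series[j] - series[i]) * (k - i) / (j - i)
                   for k in range(i + 1, j)):
                edges.add((i, j))
    return edges

def degrees(edges, n):
    """Degree of each node; the degree distribution is one of the
    topological properties the study compares across flow regions."""
    deg = [0] * n
    for i, j in edges:
        deg[i] += 1
        deg[j] += 1
    return deg
```

    Properties such as average path length, modularity, and clustering coefficient are then computed on the resulting graph and compared between jet regions and against randomized surrogates.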

  19. Improved visibility graph fractality with application for the diagnosis of Autism Spectrum Disorder

    NASA Astrophysics Data System (ADS)

    Ahmadlou, Mehran; Adeli, Hojjat; Adeli, Amir

    2012-10-01

    Recently, the visibility graph (VG) algorithm was proposed for mapping a time series to a graph, so that the complexity and fractality of the time series can be studied through the complexity of its graph. The visibility graph algorithm converts a fractal time series to a scale-free graph. VG has been used for the investigation of fractality in the dynamic behavior of both artificial and natural complex systems. However, the robustness and performance of the power of scale-freeness of VG (PSVG) as an effective method for measuring fractality have not been investigated. Since noise is unavoidable in real-life time series, the robustness of a fractality measure is of paramount importance. To improve the accuracy and robustness to noise of PSVG for measuring the fractality of biological time series, an improved PSVG is presented in this paper. The proposed method is evaluated using two examples: a synthetic benchmark time series and a complicated real-life electroencephalogram (EEG)-based diagnostic problem, that is, distinguishing autistic children from non-autistic children. It is shown that the proposed improved PSVG is less sensitive to noise and therefore more robust than PSVG. Further, it is shown that using the improved PSVG in the wavelet-chaos neural network model of Adeli and co-workers, in place of the Katz fractal dimension, results in a more accurate diagnosis of autism, a complicated neurological and psychiatric disorder.

  20. Ionospheric magnetic signals during conjunctions between ground based and Swarm satellite observations

    NASA Astrophysics Data System (ADS)

    Saturnino, Diana; Olsen, Nils; Finlay, Chris

    2017-04-01

    High-precision magnetic measurements collected by satellites such as Swarm or CHAMP, flying at altitudes between 300 and 800 km, allow for improved geomagnetic field modelling. An accurate description of the internal (core and crust) field must account for contributions from other sources, such as the ionosphere and magnetosphere. However, the description of the rapidly changing external field contributions, particularly during the quiet times from which the data are selected, constitutes a major challenge in the construction of such models. Our study attempts to obtain improved knowledge of ionospheric field contributions during quiet-time conditions, in particular during night local times. We use two different datasets: ground magnetic observatory time series (obtained below the ionospheric E-layer currents) and Swarm satellite measurements acquired above these currents. First, we remove from the data estimates of the core, lithospheric and large-scale magnetospheric magnetic contributions as given by the CHAOS-6 model, to obtain corrected time series. Then, we focus on the differences of the corrected time series: for a pair of ground magnetic observatories, we determine the time series of their difference, and similarly we determine difference time series at satellite altitude, given by the difference between the Swarm Alpha and Charlie satellites taken in the vicinity of the ground observatory locations. The resulting difference time series are analysed with respect to their temporal and spatial variations, with emphasis on measurements during night local times.
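
    The two-step pipeline described above (subtract model estimates, then difference pairs so that common signals cancel) is schematic enough to sketch; the function names are assumptions, and real CHAOS-6 model values would replace the placeholder inputs.

```python
def correct(measured, model):
    """Subtract model-predicted field values (e.g. core, lithospheric,
    and large-scale magnetospheric estimates) from the measured series,
    leaving a residual dominated by unmodelled, e.g. ionospheric, signal."""
    return [m - p for m, p in zip(measured, model)]

def pair_difference(a, b):
    """Difference of two corrected series (an observatory pair, or the
    Swarm Alpha/Charlie pair): contributions common to both locations
    cancel, isolating the differential signal."""
    return [x - y for x, y in zip(a, b)]
```
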
