The Use of Crow-AMSAA Plots to Assess Mishap Trends
NASA Technical Reports Server (NTRS)
Dawson, Jeffrey W.
2011-01-01
Crow-AMSAA (CA) plots are used to model reliability growth. Use of CA plots has expanded into other areas, such as tracking events of interest to management, maintenance problems, and safety mishaps. Safety mishaps can often be successfully modeled using a Poisson probability distribution. CA plots show a Poisson process in log-log space. If the safety mishaps follow a stable homogeneous Poisson process, a linear fit to the points in a CA plot will have a slope of one. Slopes greater than one indicate a nonhomogeneous Poisson process with an increasing rate of occurrence; slopes less than one indicate a nonhomogeneous Poisson process with a decreasing rate. Changes in slope, known as "cusps," indicate a change in the process, which could be an improvement or a degradation. After presenting the CA conceptual framework, examples are given of trending slips, trips and falls, and ergonomic incidents at NASA (from Agency-level data). Crow-AMSAA plotting is a robust tool for trending safety mishaps that can provide insight into safety performance over time.
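As a minimal sketch of the framework described above, the following builds a Crow-AMSAA fit for a simulated homogeneous Poisson process; the simulated rate and sample size are illustrative assumptions, not taken from the NASA data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a homogeneous Poisson process: event times are cumulative
# sums of exponential interarrival times (rate = 2 events per unit time).
event_times = np.cumsum(rng.exponential(scale=0.5, size=500))

# Crow-AMSAA: regress log(cumulative event count) on log(cumulative time);
# the fitted slope (beta) indicates the trend.
n = np.arange(1, len(event_times) + 1)
beta, log_lambda = np.polyfit(np.log(event_times), np.log(n), 1)
```

For a stable homogeneous process the slope should be near one; slopes persistently above or below one, or cusps where the slope changes, would be read as described in the abstract.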
Evaluation of Shiryaev-Roberts procedure for on-line environmental radiation monitoring.
Watson, Mara M; Seliman, Ayman F; Bliznyuk, Valery N; DeVol, Timothy A
2018-04-30
Water can become contaminated as a result of a leak from a nuclear facility, such as a waste facility, or from clandestine nuclear activity. Low-level on-line radiation monitoring is needed to detect these events in real time. A Bayesian control chart method, the Shiryaev-Roberts (SR) procedure, was compared with classical methods, 3-σ and cumulative sum (CUSUM), for quantifying an accumulating signal from an extractive scintillating resin flow-cell detection system. Solutions containing 0.10-5.0 Bq/L of 99Tc, as 99TcO4-, were pumped through a flow cell packed with extractive scintillating resin used in conjunction with a Beta-RAM Model 5 HPLC detector. While the 99TcO4- accumulated on the resin, time series data were collected. Control chart methods were applied to the data using statistical algorithms developed in MATLAB. SR charts were constructed using Poisson (Poisson SR) and Gaussian (Gaussian SR) probability distributions of the count data to estimate the likelihood ratio. In most cases, the Poisson and Gaussian SR charts required a smaller volume of radioactive solution at a fixed concentration to exceed the control limit than the 3-σ and CUSUM control charts did, particularly for solutions with lower activity. SR is thus the ideal control chart for low-level on-line radiation monitoring. Once the control limit was exceeded, activity concentrations were estimated from the SR control chart using its slope on a semi-logarithmic plot. A linear regression was fitted to averaged slope data for five activity concentration groupings for the Poisson and Gaussian SR control charts. A correlation coefficient (R²) of 0.77 for Poisson SR and 0.90 for Gaussian SR suggests this method will adequately estimate the activity concentration of an unknown solution. Copyright © 2018 Elsevier Ltd. All rights reserved.
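A minimal sketch of the SR recursion for Poisson counts, assuming a known pre-change rate lam0 and a postulated post-change rate lam1; the rates, gate counts, and control limit below are illustrative assumptions, not the paper's values.

```python
import numpy as np

def sr_poisson(counts, lam0, lam1, limit):
    """Shiryaev-Roberts statistic for a shift in Poisson rate lam0 -> lam1.
    R_n = (1 + R_{n-1}) * LR_n; returns the index of the first alarm
    (R_n > limit), or -1 if the limit is never exceeded."""
    r = 0.0
    for n, x in enumerate(counts):
        # Likelihood ratio of one Poisson observation under lam1 vs lam0
        lr = (lam1 / lam0) ** x * np.exp(-(lam1 - lam0))
        r = (1.0 + r) * lr
        if r > limit:
            return n
    return -1

rng = np.random.default_rng(1)
# Background counts at rate 2, then the rate steps up to 5 at sample 50.
counts = np.concatenate([rng.poisson(2, 50), rng.poisson(5, 50)])
alarm = sr_poisson(counts, lam0=2.0, lam1=5.0, limit=1000.0)
```

The control limit trades false-alarm rate against detection delay, which is the comparison the abstract makes against 3-σ and CUSUM charts.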
Alternative Derivations for the Poisson Integral Formula
ERIC Educational Resources Information Center
Chen, J. T.; Wu, C. S.
2006-01-01
The Poisson integral formula is revisited. The kernel in the Poisson integral formula can be derived in a series form through the direct BEM, free of the concept of an image point, by using the null-field integral equation in conjunction with degenerate kernels. The degenerate kernels for the closed-form Green's function and the series form of Poisson…
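For reference, the classical Poisson integral formula on the unit disk, with its kernel in both closed and series form (a standard result; the abstract's BEM derivation arrives at the series form via degenerate kernels):

```latex
u(r,\theta) = \frac{1}{2\pi}\int_{0}^{2\pi} P_r(\theta-\phi)\, f(\phi)\, d\phi,
\qquad
P_r(\psi) = \frac{1-r^2}{1 - 2r\cos\psi + r^2}
= 1 + 2\sum_{n=1}^{\infty} r^{n}\cos(n\psi), \quad 0 \le r < 1 .
```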
A note on an attempt at more efficient Poisson series evaluation. [for lunar libration
NASA Technical Reports Server (NTRS)
Shelus, P. J.; Jefferys, W. H., III
1975-01-01
A substantial reduction has been achieved in the time necessary to compute lunar libration series. The method involves eliminating many of the trigonometric function calls by a suitable transformation and applying a short SNOBOL processor to the FORTRAN coding of the transformed series, which obviates many of the multiplication operations during the course of series evaluation. It is possible to accomplish similar results quite easily with other Poisson series.
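The abstract does not give the transformation, but one standard way to eliminate repeated trigonometric function calls in series evaluation is a Chebyshev-type recurrence; a sketch:

```python
import math

def eval_cos_series(coeffs, theta):
    """Evaluate sum_k coeffs[k] * cos(k*theta) using the recurrence
    cos((k+1)t) = 2*cos(t)*cos(k*t) - cos((k-1)t), so a single trig
    call suffices regardless of the number of terms."""
    c1 = math.cos(theta)          # the only trigonometric call
    ck_minus, ck = 1.0, c1        # cos(0*t), cos(1*t)
    total = coeffs[0] * ck_minus
    for a in coeffs[1:]:
        total += a * ck
        ck_minus, ck = ck, 2.0 * c1 * ck - ck_minus
    return total

coeffs = [1.0, 0.5, 0.25, 0.125]
theta = 0.7
fast = eval_cos_series(coeffs, theta)
direct = sum(a * math.cos(k * theta) for k, a in enumerate(coeffs))
```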
GRAPE- TWO-DIMENSIONAL GRIDS ABOUT AIRFOILS AND OTHER SHAPES BY THE USE OF POISSON'S EQUATION
NASA Technical Reports Server (NTRS)
Sorenson, R. L.
1994-01-01
The ability to treat arbitrary boundary shapes is one of the most desirable characteristics of a method for generating grids, including those about airfoils. In a grid used for computing aerodynamic flow over an airfoil, or any other body shape, the surface of the body is usually treated as an inner boundary and often cannot be easily represented as an analytic function. The GRAPE computer program was developed to incorporate a method for generating two-dimensional finite-difference grids about airfoils and other shapes by the use of the Poisson differential equation. GRAPE can be used with any boundary shape, even one specified by tabulated points and including a limited number of sharp corners. The GRAPE program has been developed to be numerically stable and computationally fast. GRAPE can provide the aerodynamic analyst with an efficient and consistent means of grid generation. The GRAPE procedure generates a grid between an inner and an outer boundary by utilizing an iterative procedure to solve the Poisson differential equation subject to geometrical restraints. In this method, the inhomogeneous terms of the equation are automatically chosen such that two important effects are imposed on the grid. The first effect is control of the spacing between mesh points along mesh lines intersecting the boundaries. The second effect is control of the angles with which mesh lines intersect the boundaries. Along with the iterative solution to Poisson's equation, a technique of coarse-fine sequencing is employed to accelerate numerical convergence. GRAPE program control cards and input data are entered via the NAMELIST feature. Each variable has a default value such that user supplied data is kept to a minimum. Basic input data consists of the boundary specification, mesh point spacings on the boundaries, and mesh line angles at the boundaries. Output consists of a dataset containing the grid data and, if requested, a plot of the generated mesh. 
The GRAPE program is written in FORTRAN IV for batch execution and has been implemented on a CDC 6000 series computer with a central memory requirement of approximately 135K (octal) of 60 bit words. For plotted output the commercially available DISSPLA graphics software package is required. The GRAPE program was developed in 1980.
First Principles Investigation of Fluorine Based Strontium Series of Perovskites
NASA Astrophysics Data System (ADS)
Erum, Nazia; Azhar Iqbal, Muhammad
2016-11-01
Density functional theory is used to explore the structural, elastic, and mechanical properties of SrLiF3, SrNaF3, SrKF3 and SrRbF3 fluoroperovskite compounds by means of the ab-initio Full Potential-Linearized Augmented Plane Wave (FP-LAPW) method. Several lattice parameters are employed to obtain an accurate equilibrium volume (Vo). The resultant quantities include the ground state energy, elastic constants, shear modulus, bulk modulus, Young's modulus, Cauchy's pressure, Poisson's ratio, shear constant, elastic anisotropy factor, Kleinman's parameter, melting temperature, and Lamé's coefficient. The structural parameters calculated via DFT as well as analytical methods are found to be consistent with experimental findings. Chemical bonding is used to investigate the corresponding chemical trends, which confirm a combined covalent-ionic behavior. Furthermore, electron density plots as well as elastic and mechanical properties are reported for the first time; these reveal that the fluorine-based strontium series of perovskites is mechanically stable and possesses weaker resistance to shear deformation than to unidirectional compression, while brittle and ionic behavior dominates and decreases from SrLiF3 to SrRbF3. The calculated Cauchy's pressure, Poisson's ratio, and B/G ratio also confirm the ionic nature of these compounds. The present methodology represents an effective approach to calculating the whole set of elastic and mechanical parameters, which should help in understanding various physical phenomena and enable device engineers to implement these materials in numerous applications.
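For cubic perovskites such as these, the polycrystalline moduli listed above follow from the three independent elastic constants by standard (here Voigt-average) formulas; a sketch with illustrative inputs, not the paper's computed values:

```python
def cubic_elastic_properties(c11, c12, c44):
    """Voigt-average polycrystalline moduli from the three independent
    elastic constants of a cubic crystal (all in GPa)."""
    B = (c11 + 2 * c12) / 3                      # bulk modulus
    G = (c11 - c12 + 3 * c44) / 5                # shear modulus (Voigt)
    E = 9 * B * G / (3 * B + G)                  # Young's modulus
    nu = (3 * B - 2 * G) / (2 * (3 * B + G))     # Poisson's ratio
    A = 2 * c44 / (c11 - c12)                    # Zener anisotropy factor
    cauchy = c12 - c44                           # Cauchy pressure
    return {"B": B, "G": G, "E": E, "nu": nu, "A": A, "cauchy": cauchy}

# Illustrative values only (not the paper's constants).
props = cubic_elastic_properties(c11=120.0, c12=40.0, c44=45.0)
```

A positive Cauchy pressure and a B/G ratio above about 1.75 are the usual rules of thumb for ionic bonding and ductility mentioned in such studies.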
Computation of solar perturbations with Poisson series
NASA Technical Reports Server (NTRS)
Broucke, R.
1974-01-01
Description of a project for computing first-order perturbations of natural or artificial satellites by integrating the equations of motion on a computer with automatic Poisson series expansions. A basic feature of the method of solution is that the classical variation-of-parameters formulation is used rather than rectangular coordinates. However, the variation-of-parameters formulation uses the three rectangular components of the disturbing force rather than the classical disturbing function, so that there is no problem in expanding the disturbing function in series. Another characteristic of the variation-of-parameters formulation employed is that six rather unusual variables are used in order to avoid singularities at zero eccentricity and zero (or 90 deg) inclination. The integration process starts by assuming that all the orbit elements present on the right-hand sides of the equations of motion are constants. These right-hand sides are then simple Poisson series which can be obtained with the use of the Bessel expansions of the two-body problem in conjunction with certain iteration methods. These Poisson series can then be integrated term by term, and a first-order solution is obtained.
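The term-by-term integration step can be sketched with a toy Poisson series of purely trigonometric terms (a real processor also carries polynomial factors; this simplified representation is an assumption for illustration):

```python
import math

# A Poisson series here is a list of (amplitude, frequency, phase) terms,
# each representing A*cos(f*t + p); polynomial parts are omitted.
def evaluate(series, t):
    return sum(A * math.cos(f * t + p) for A, f, p in series)

def integrate(series):
    """Integrate term by term: the antiderivative of A*cos(f*t+p) is
    (A/f)*sin(f*t+p) = (A/f)*cos(f*t + p - pi/2), valid for f != 0."""
    return [(A / f, f, p - math.pi / 2) for A, f, p in series]

series = [(1.0, 2.0, 0.3), (0.5, 5.0, -1.0)]
anti = integrate(series)

# Numerical check: the derivative of the integrated series should
# reproduce the original series.
t, h = 0.9, 1e-6
deriv = (evaluate(anti, t + h) - evaluate(anti, t - h)) / (2 * h)
```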
Conditional Poisson models: a flexible alternative to conditional logistic case cross-over analysis.
Armstrong, Ben G; Gasparrini, Antonio; Tobias, Aurelio
2014-11-24
The time stratified case cross-over approach is a popular alternative to conventional time series regression for analysing associations between time series of environmental exposures (air pollution, weather) and counts of health outcomes. These are almost always analysed using conditional logistic regression on data expanded to case-control (case cross-over) format, but this has some limitations. In particular, adjusting for overdispersion and auto-correlation in the counts is not possible. It has been established that a Poisson model for counts with stratum indicators gives identical estimates to those from conditional logistic regression and does not have these limitations, but it is little used, probably because of the overheads in estimating many stratum parameters. The conditional Poisson model avoids estimating stratum parameters by conditioning on the total event count in each stratum, thus simplifying the computation and increasing the number of strata for which fitting is feasible compared with the standard unconditional Poisson model. Unlike the conditional logistic model, the conditional Poisson model does not require expanding the data, and it can adjust for overdispersion and auto-correlation. It is available in Stata, R, and other packages. By applying the models to real data and using simulations, we demonstrate that conditional Poisson models are simpler to code and faster to run than conditional logistic analyses and can be fitted to larger data sets than is possible with standard Poisson models. Allowing for overdispersion or autocorrelation was possible with the conditional Poisson model; when not required, this model gave estimates identical to those from conditional logistic regression. Conditional Poisson regression models provide an alternative, with some advantages, to case cross-over analysis of stratified time series data.
The conditional Poisson model can also be used in other contexts in which primary control for confounding is by fine stratification.
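A sketch of the key idea: conditioning on stratum totals makes the stratum intercepts drop out, leaving a multinomial likelihood in the single exposure coefficient. The data generation and one-parameter Newton fit below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic stratified count data: 200 strata, 4 observations each,
# counts ~ Poisson(exp(alpha_s + beta*x)) with a true beta of 0.5.
S, K, beta_true = 200, 4, 0.5
x = rng.normal(size=(S, K))
alpha = rng.normal(size=(S, 1))
y = rng.poisson(np.exp(alpha + beta_true * x))

def score_and_info(beta):
    """Score and information of the conditional Poisson log-likelihood:
    conditional on each stratum total, counts are multinomial with
    probabilities exp(beta*x) / sum(exp(beta*x)), so the stratum
    intercepts alpha_s drop out entirely."""
    w = np.exp(beta * x)
    p = w / w.sum(axis=1, keepdims=True)
    n = y.sum(axis=1)                       # stratum totals
    mean_x = (p * x).sum(axis=1)            # E[x] under the multinomial
    var_x = (p * x**2).sum(axis=1) - mean_x**2
    score = (y * x).sum() - (n * mean_x).sum()
    info = (n * var_x).sum()
    return score, info

beta = 0.0
for _ in range(25):                          # Newton-Raphson iterations
    score, info = score_and_info(beta)
    beta += score / info
```

Note that no alpha_s is ever estimated, which is exactly the computational saving the abstract describes.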
A more general system for Poisson series manipulation.
NASA Technical Reports Server (NTRS)
Cherniack, J. R.
1973-01-01
The design of a working Poisson series processor system is described that is more general than those currently in use. This system is the result of a series of compromises among efficiency, generality, ease of programming, and ease of use. The most general form of coefficients that can be multiplied efficiently is pointed out, and the place of general-purpose algebraic systems in celestial mechanics is discussed.
Detecting fission from special nuclear material sources
Rowland, Mark S [Alamo, CA; Snyderman, Neal J [Berkeley, CA
2012-06-05
A neutron detector system for discriminating fissile material from non-fissile material, wherein a digital data acquisition unit collects data at a high rate and, in real time, processes large volumes of data directly into information that a first responder can use to discriminate materials. The system counts neutrons from the unknown source and detects excess grouped neutrons to identify fission in the unknown source. The system includes a graphing component that displays a plot of the neutron distribution from the unknown source over a Poisson distribution, together with a plot of neutrons due to background or environmental sources. The system further includes a known neutron source placed in proximity to the unknown source to actively interrogate it, in order to accentuate differences between the unknown source's neutron emission and Poisson distributions and/or environmental sources.
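The idea of "excess grouped neutrons" can be illustrated with a variance-to-mean (Feynman-style) check; the toy burst model below is an assumption for illustration, not the patented processing chain.

```python
import numpy as np

rng = np.random.default_rng(3)

def excess_variance(counts):
    """Variance-to-mean ratio minus one. Near 0 for Poisson counts from
    a random source; positive when neutrons arrive in correlated
    groups, as from fission chains."""
    c = np.asarray(counts, dtype=float)
    return c.var() / c.mean() - 1.0

# Background-like source: independent arrivals, Poisson counts per gate.
background = rng.poisson(10.0, 5000)

# Crude fission-like source: scale Poisson event counts by a random
# group size, which inflates the variance well above the mean.
groups = rng.poisson(2.0, 5000)
size = rng.integers(1, 6, 5000)
fission_like = groups * size
```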
Hu, Wenbiao; Tong, Shilu; Mengersen, Kerrie; Connell, Des
2007-09-01
Few studies have examined the relationship between weather variables and cryptosporidiosis in Australia. This paper examines the potential impact of weather variability on the transmission of cryptosporidiosis and explores the possibility of developing an empirical forecast system. Data on weather variables, notified cryptosporidiosis cases, and population size in Brisbane were supplied by the Australian Bureau of Meteorology, Queensland Department of Health, and Australian Bureau of Statistics, respectively, for the period January 1, 1996-December 31, 2004. Time series Poisson regression and seasonal auto-regressive integrated moving average (SARIMA) models were used to examine the potential impact of weather variability on the transmission of cryptosporidiosis. Both the time series Poisson regression and SARIMA models show that seasonal and monthly maximum temperature, at prior moving averages of 1 and 3 months, were significantly associated with cryptosporidiosis. The models suggest that there may be about 50 more cases a year for each 1°C increase in average maximum temperature in Brisbane. Model assessments indicated that the SARIMA model had better predictive ability than the Poisson regression model (SARIMA: root mean square error (RMSE): 0.40, Akaike information criterion (AIC): -12.53; Poisson regression: RMSE: 0.54, AIC: -2.84). Furthermore, the analysis of residuals shows that the time series Poisson regression appeared to violate a modeling assumption, in that residual autocorrelation persisted. The results of this study suggest that weather variability (particularly maximum temperature) may have played a significant role in the transmission of cryptosporidiosis. A SARIMA model may be a better predictive model than a Poisson regression model in the assessment of the relationship between weather variability and the incidence of cryptosporidiosis.
NASA Astrophysics Data System (ADS)
Tóth, B.; Lillo, F.; Farmer, J. D.
2010-11-01
We introduce an algorithm for the segmentation of a class of regime-switching processes. The segmentation algorithm is a nonparametric statistical method able to identify the regimes (patches) of a time series. The process is composed of consecutive patches of variable length. In each patch the process is described by a stationary compound Poisson process, i.e. a Poisson process where each count is associated with a fluctuating signal. The parameters of the process are different in each patch, and therefore the time series is non-stationary. Our method is a generalization of the algorithm introduced by Bernaola-Galván et al. [Phys. Rev. Lett. 87, 168105 (2001)]. We show that the new algorithm outperforms the original one for regime-switching models of compound Poisson processes. As an application we use the algorithm to segment the time series of the inventory of market members of the London Stock Exchange, and we observe that our method finds almost three times more patches than the original one.
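The core of such segmentation schemes is a likelihood-based search for the best split of a series; the one-step sketch below (for plain Poisson counts rather than the paper's compound Poisson case) shows the idea, which recursive patch-finding algorithms apply repeatedly.

```python
import numpy as np

rng = np.random.default_rng(4)

def best_poisson_split(counts):
    """Scan all split points of a count series and return the one that
    maximizes the gain in Poisson log-likelihood from fitting separate
    rates to the left and right segments."""
    c = np.asarray(counts, dtype=float)
    n = len(c)
    cum = np.cumsum(c)
    total = cum[-1]
    k = np.arange(1, n)                    # candidate split points
    left, right = cum[:-1], total - cum[:-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        gain = (left * np.log(left / k)
                + right * np.log(right / (n - k))
                - total * np.log(total / n))
    gain = np.nan_to_num(gain, nan=-np.inf, neginf=-np.inf)
    return int(k[np.argmax(gain)])

# Synthetic series whose rate jumps from 3 to 8 at index 120.
counts = np.concatenate([rng.poisson(3, 120), rng.poisson(8, 80)])
split = best_poisson_split(counts)
```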
Rodgers, Joseph Lee; Beasley, William Howard; Schuelke, Matthew
2014-01-01
Many data structures, particularly time series data, are naturally seasonal, cyclical, or otherwise circular. Past graphical methods for time series have focused on linear plots. In this article, we move graphical analysis onto the circle. We focus on 2 particular methods, one old and one new. Rose diagrams are circular histograms and can be produced in several different forms using the RRose software system. In addition, we propose, develop, illustrate, and provide software support for a new circular graphical method, called Wrap-Around Time Series Plots (WATS Plots), which is a graphical method useful to support time series analyses in general but in particular in relation to interrupted time series designs. We illustrate the use of WATS Plots with an interrupted time series design evaluating the effect of the Oklahoma City bombing on birthrates in Oklahoma County during the 10 years surrounding the bombing of the Murrah Building in Oklahoma City. We compare WATS Plots with linear time series representations and overlay them with smoothing and error bands. Each method is shown to have advantages in relation to the other; in our example, the WATS Plots more clearly show the existence and effect size of the fertility differential.
NDE methods for determining the materials properties of silicon carbide plates
NASA Astrophysics Data System (ADS)
Kenderian, Shant; Kim, Yong; Johnson, Eric; Palusinski, Iwona A.
2009-08-01
Two types of SiC plates, differing in their manufacturing processes, were interrogated using a variety of NDE techniques. The task of evaluating the materials properties of these plates was a challenge due to their non-uniform thickness. Ultrasound was used to estimate the Young's Modulus and calculate the thickness profile and Poisson's Ratio of the plates. The Young's Modulus profile plots were consistent with the thickness profile plots, indicating that the technique was highly influenced by the non-uniform thickness of the plates. The Poisson's Ratio is calculated from the longitudinal and shear wave velocities. Because the thickness is cancelled out, the result is dependent only on the time of flight of the two wave modes, which can be measured accurately. X-Ray was used to determine if any density variations were present in the plates. None were detected suggesting that the varying time of flight of the acoustic wave is attributed only to variations in the elastic constants and thickness profiles of the plates. Eddy Current was used to plot the conductivity profile. Surprisingly, the conductivity profile of one type of plates varied over a wide range rarely seen in other materials. The other type revealed a uniform conductivity profile.
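The Poisson's-ratio calculation from the two wave speeds uses the standard isotropic elasticity relations; a sketch with roughly SiC-like illustrative values, not the paper's measurements:

```python
def poisson_ratio(v_l, v_s):
    """Poisson's ratio from longitudinal and shear wave velocities.
    Thickness cancels, so only the ratio of the two times of flight
    (hence velocities) matters."""
    r2 = (v_l / v_s) ** 2
    return (r2 - 2.0) / (2.0 * (r2 - 1.0))

def youngs_modulus(rho, v_l, v_s):
    """Young's modulus (Pa) from density (kg/m^3) and wave speeds (m/s)."""
    return rho * v_s**2 * (3 * v_l**2 - 4 * v_s**2) / (v_l**2 - v_s**2)

# Illustrative, roughly SiC-like inputs:
nu = poisson_ratio(v_l=12000.0, v_s=7500.0)
E = youngs_modulus(rho=3200.0, v_l=12000.0, v_s=7500.0)
```

As a sanity check on the formula, v_l² = 4·v_s² gives exactly ν = 1/3.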
NASA Astrophysics Data System (ADS)
Pohle, Ina; Niebisch, Michael; Müller, Hannes; Schümberg, Sabine; Zha, Tingting; Maurer, Thomas; Hinz, Christoph
2018-07-01
To simulate the impacts of within-storm rainfall variability on fast hydrological processes, long precipitation time series with high temporal resolution are required. Due to the limited availability of observed data, such time series are typically obtained from stochastic models. However, most existing rainfall models are limited in their ability to conserve the rainfall event statistics that are relevant for hydrological processes. Poisson rectangular pulse models are widely applied to generate long time series of alternating precipitation event durations and mean intensities as well as interstorm period durations. Multiplicative microcanonical random cascade (MRC) models are used to disaggregate precipitation time series from coarse to fine temporal resolution. To overcome the inconsistencies between the temporal structure of the Poisson rectangular pulse model and the MRC model, we developed a new coupling approach by introducing two modifications to the MRC model. These modifications comprise (a) a modified cascade model ("constrained cascade") which preserves the event durations generated by the Poisson rectangular pulse model by constraining the first and last intervals of a precipitation event to contain precipitation, and (b) continuous sigmoid functions of the multiplicative weights to account for the scale dependency in the disaggregation of precipitation events of different durations. The constrained cascade model was evaluated for its ability to disaggregate observed precipitation events in comparison to existing MRC models. For this, we used a 20-year record of hourly precipitation at six stations across Germany. The constrained cascade model showed pronounced better agreement with the observed data in terms of both the temporal pattern of the precipitation time series (e.g. the dry and wet spell durations and autocorrelations) and event characteristics (e.g. intra-event intermittency and intensity fluctuation within events).
The constrained cascade model also slightly outperformed the other MRC models with respect to the intensity-frequency relationship. To assess the performance of the coupled Poisson rectangular pulse and constrained cascade model, precipitation events were stochastically generated by the Poisson rectangular pulse model and then disaggregated by the constrained cascade model. We found that the coupled model performs satisfactorily in terms of the temporal pattern of the precipitation time series, event characteristics and the intensity-frequency relationship.
NASA Technical Reports Server (NTRS)
Mandell, M. J.; Harvey, J. M.; Katz, I.
1977-01-01
The NASCAP (NASA Charging Analyzer Program) code simulates the charging process for a complex object in either tenuous plasma or ground test environment. Detailed specifications needed to run the code are presented. The object definition section, OBJDEF, allows the test object to be easily defined in the cubic mesh. The test object is composed of conducting sections which may be wholly or partially covered with thin dielectric coatings. The potential section, POTENT, obtains the electrostatic potential in the space surrounding the object. It uses the conjugate gradient method to solve the finite element formulation of Poisson's equation. The CHARGE section of NASCAP treats charge redistribution among the surface cells of the object as well as charging through radiation bombardment. NASCAP has facilities for extensive graphical output, including several types of object display plots, potential contour plots, space charge density contour plots, current density plots, and particle trajectory plots.
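The conjugate gradient method used by the POTENT section can be sketched on a small symmetric positive-definite Poisson system; the 1D finite-difference test problem below is an illustrative stand-in for NASCAP's finite-element formulation.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Plain conjugate gradient for a symmetric positive-definite
    system, the family of solver used for the Poisson equations."""
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# 1D Poisson test problem: -u'' = 1 on (0,1), u(0) = u(1) = 0,
# discretized by central differences on n interior points.
n, h = 50, 1.0 / 51
A = (np.diag(np.full(n, 2.0)) - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1)) / h**2
b = np.ones(n)
u = conjugate_gradient(A, b)
```

The exact solution is u = x(1-x)/2, with maximum 0.125 at the midpoint, which the discrete solution reproduces closely.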
Enhanced Night Vision Via a Combination of Poisson Interpolation and Machine Learning
2006-02-01
…of 0-255, they are mostly similar. The right plot shows a family of m(x, ψ) curves from ψ=2 (the most linear) through ψ=1024 (the most curved)…complicating low-light imaging. Nayar and Branzoi [04] later suggested a second variant using a DLP micromirror array to modulate the exposure, via time…
Multiscale Poincaré plots for visualizing the structure of heartbeat time series.
Henriques, Teresa S; Mariani, Sara; Burykin, Anton; Rodrigues, Filipa; Silva, Tiago F; Goldberger, Ary L
2016-02-09
Poincaré delay maps are widely used in the analysis of cardiac interbeat interval (RR) dynamics. To facilitate visualization of the structure of these time series, we introduce multiscale Poincaré (MSP) plots. Starting with the original RR time series, the method employs a coarse-graining procedure to create a family of time series, each of which represents the system's dynamics in a different time scale. Next, the Poincaré plots are constructed for the original and the coarse-grained time series. Finally, as an optional adjunct, color can be added to each point to represent its normalized frequency. We illustrate the MSP method on simulated Gaussian white and 1/f noise time series. The MSP plots of 1/f noise time series reveal relative conservation of the phase space area over multiple time scales, while those of white noise show a marked reduction in area. We also show how MSP plots can be used to illustrate the loss of complexity when heartbeat time series from healthy subjects are compared with those from patients with chronic (congestive) heart failure syndrome or with atrial fibrillation. This generalized multiscale approach to Poincaré plots may be useful in visualizing other types of time series.
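The coarse-graining step behind MSP plots, and the expected shrinkage of the white-noise point cloud with scale, can be sketched as follows (simulated data only):

```python
import numpy as np

def coarse_grain(series, scale):
    """Average non-overlapping windows of length `scale`, the same
    coarse-graining step used in multiscale analyses."""
    x = np.asarray(series, dtype=float)
    m = len(x) // scale
    return x[: m * scale].reshape(m, scale).mean(axis=1)

def poincare_points(series):
    """Poincare (delay) map: each value plotted against its successor."""
    x = np.asarray(series, dtype=float)
    return np.column_stack([x[:-1], x[1:]])

rng = np.random.default_rng(5)
white = rng.normal(size=4096)

# For white noise, averaging shrinks the spread of the cloud by roughly
# sqrt(scale), so the phase-space area contracts with scale, in line
# with the abstract's observation.
spread1 = poincare_points(coarse_grain(white, 1))[:, 0].std()
spread8 = poincare_points(coarse_grain(white, 8))[:, 0].std()
```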
Automated integration of the Lagrange equations of orbital motion.
NASA Astrophysics Data System (ADS)
Abad, A.; San Juan, J. F.
The new techniques of algebraic manipulation, especially Poisson Series Processors, permit the analytical integration of increasingly complex problems in celestial mechanics. The authors are developing a new Poisson Series Processor, PSPC, and use it to solve the Lagrange equations of orbital motion. They integrate the Lagrange equations by means of the stroboscopic method and apply it to the main problem of artificial satellite theory.
Conceptual recurrence plots: revealing patterns in human discourse.
Angus, Daniel; Smith, Andrew; Wiles, Janet
2012-06-01
Human discourse contains a rich mixture of conceptual information. Visualization of the global and local patterns within this data stream is a complex and challenging problem. Recurrence plots are an information visualization technique that can reveal trends and features in complex time series data. The recurrence plot technique works by measuring the similarity of points in a time series to all other points in the same time series and plotting the results in two dimensions. Previous studies have applied recurrence plotting techniques to textual data; however, these approaches plot recurrence using term-based similarity rather than conceptual similarity of the text. We introduce conceptual recurrence plots, which use a model of language to measure similarity between pairs of text utterances, and the similarity of all utterances is measured and displayed. In this paper, we explore how the descriptive power of the recurrence plotting technique can be used to discover patterns of interaction across a series of conversation transcripts. The results suggest that the conceptual recurrence plotting technique is a useful tool for exploring the structure of human discourse.
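A minimal recurrence-plot construction for a numeric time series is sketched below; the conceptual variant described in the abstract replaces the distance threshold with a model-based similarity between pairs of utterances, which is not reproduced here.

```python
import numpy as np

def recurrence_plot(series, eps):
    """Binary recurrence matrix: R[i, j] = 1 when points i and j of
    the time series are within distance eps of each other."""
    x = np.asarray(series, dtype=float)
    d = np.abs(x[:, None] - x[None, :])      # all pairwise distances
    return (d < eps).astype(int)

# A periodic signal produces the characteristic diagonal-line texture.
t = np.linspace(0, 4 * np.pi, 200)
R = recurrence_plot(np.sin(t), eps=0.1)
```

By construction the matrix is symmetric with a recurrent main diagonal; structure away from the diagonal is what reveals repeated patterns.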
Flood return level analysis of Peaks over Threshold series under changing climate
NASA Astrophysics Data System (ADS)
Li, L.; Xiong, L.; Hu, T.; Xu, C. Y.; Guo, S.
2016-12-01
Obtaining insights into future flood estimation is of great significance for water planning and management. Traditional flood return level analysis under the stationarity assumption has been challenged by changing environments. A method that takes the nonstationary context into consideration has been extended to derive flood return levels for Peaks over Threshold (POT) series. For POT series, a Poisson distribution is normally assumed to describe the arrival rate of exceedance events, but this distributional assumption has at times been reported to be invalid. The Negative Binomial (NB) distribution is therefore proposed as an alternative to the Poisson assumption. Flood return levels were extrapolated in a nonstationary context for the POT series of the Weihe basin, China, under future climate scenarios. The results show that the flood return levels estimated under nonstationarity can differ depending on whether a Poisson or an NB distribution is assumed. The difference is found to be related to the threshold value of the POT series. The study indicates the importance of distribution selection in flood return level analysis under nonstationarity and provides a reference on the impact of climate change on flood estimation in the Weihe basin for the future.
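A quick moment-based check of the Poisson-versus-NB choice for exceedance counts (illustrative synthetic data, not the Weihe series):

```python
import numpy as np

def fit_count_distribution(counts):
    """Method-of-moments check of the Poisson assumption for exceedance
    counts. If the variance exceeds the mean, fit a Negative Binomial
    (r, p) instead; otherwise retain the Poisson (single rate)."""
    c = np.asarray(counts, dtype=float)
    m, v = c.mean(), c.var(ddof=1)
    if v <= m:
        return {"model": "Poisson", "lam": m}
    p = m / v                     # NB moment estimators
    r = m * m / (v - m)
    return {"model": "NegBin", "r": r, "p": p}

rng = np.random.default_rng(6)
overdispersed = rng.negative_binomial(3, 0.4, size=300)
result = fit_count_distribution(overdispersed)
```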
On the validity of the Poisson assumption in sampling nanometer-sized aerosols
DOE Office of Scientific and Technical Information (OSTI.GOV)
Damit, Brian E; Wu, Dr. Chang-Yu; Cheng, Mengdawn
2014-01-01
A Poisson process is traditionally believed to apply to the sampling of aerosols. For a constant aerosol concentration, it is assumed that a Poisson process describes the fluctuation in the measured concentration because aerosols are stochastically distributed in space. Recent studies, however, have shown that sampling of micrometer-sized aerosols has non-Poissonian behavior with positive correlations. The validity of the Poisson assumption for nanometer-sized aerosols has not been examined and thus was tested in this study. Its validity was tested for four particle sizes - 10 nm, 25 nm, 50 nm and 100 nm - by sampling from indoor air with a DMA-CPC setup to obtain a time series of particle counts. Five metrics were calculated from the data: pair-correlation function (PCF), time-averaged PCF, coefficient of variation, probability of measuring a concentration at least 25% greater than average, and posterior distributions from Bayesian inference. To identify departures from Poissonian behavior, these metrics were also calculated for 1,000 computer-generated Poisson time series with the same mean as the experimental data. For nearly all comparisons, the experimental data fell within the range of 80% of the Poisson-simulation values. Essentially, the metrics for the experimental data were indistinguishable from a simulated Poisson process. The greater influence of Brownian motion for nanometer-sized aerosols may explain the Poissonian behavior observed for smaller aerosols. Although the Poisson assumption was found to be valid in this study, it must be carefully applied, as the results here do not definitively prove applicability in all sampling situations.
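The comparison-to-simulated-Poisson approach can be sketched for one of the metrics (coefficient of variation); the count series below are synthetic stand-ins for the DMA-CPC data.

```python
import numpy as np

rng = np.random.default_rng(7)

def dispersion_percentile(data, n_sims=1000):
    """Compare the coefficient of variation of observed counts against
    n_sims simulated Poisson series with the same mean, returning the
    fraction of simulations with a smaller CV. Mid-range values are
    consistent with Poisson sampling; values near 1 flag
    overdispersion (positive correlations)."""
    data = np.asarray(data, dtype=float)
    cv_obs = data.std() / data.mean()
    sims = rng.poisson(data.mean(), size=(n_sims, len(data)))
    cv_sim = sims.std(axis=1) / sims.mean(axis=1)
    return (cv_sim < cv_obs).mean()

poisson_like = rng.poisson(50, 500)
overdispersed = rng.poisson(50, 500) + rng.integers(0, 20, 500)
pct_poisson = dispersion_percentile(poisson_like)
pct_over = dispersion_percentile(overdispersed)
```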
Faithfulness of Recurrence Plots: A Mathematical Proof
NASA Astrophysics Data System (ADS)
Hirata, Yoshito; Komuro, Motomasa; Horai, Shunsuke; Aihara, Kazuyuki
It is known in practice that a recurrence plot, a two-dimensional visualization of time series data, contains almost all information related to the underlying dynamics except for its spatial scale: a rough shape for the original time series can be recovered from the recurrence plot even if the original time series is multivariate. We here provide a mathematical proof that the metric defined by a recurrence plot [Hirata et al., 2008] is equivalent to the Euclidean metric under mild conditions.
Constructions and classifications of projective Poisson varieties.
Pym, Brent
2018-01-01
This paper is intended both as an introduction to the algebraic geometry of holomorphic Poisson brackets, and as a survey of results on the classification of projective Poisson manifolds that have been obtained in the past 20 years. It is based on the lecture series delivered by the author at the Poisson 2016 Summer School in Geneva. The paper begins with a detailed treatment of Poisson surfaces, including adjunction, ruled surfaces and blowups, and leading to a statement of the full birational classification. We then describe several constructions of Poisson threefolds, outlining the classification in the regular case, and the case of rank-one Fano threefolds (such as projective space). Following a brief introduction to the notion of Poisson subspaces, we discuss Bondal's conjecture on the dimensions of degeneracy loci on Poisson Fano manifolds. We close with a discussion of log symplectic manifolds with simple normal crossings degeneracy divisor, including a new proof of the classification in the case of rank-one Fano manifolds.
A two-dimensional graphing program for the Tektronix 4050-series graphics computers
Kipp, K.L.
1983-01-01
A refined, two-dimensional graph-plotting program was developed for use on Tektronix 4050-series graphics computers. Important features of this program include: any combination of logarithmic and linear axes, optional automatic scaling and numbering of the axes, multiple-curve plots, character or drawn symbol-point plotting, optional cartridge-tape data input and plot-format storage, optional spline fitting for smooth curves, and built-in data-editing options. The program is run while the Tektronix is not connected to any large auxiliary computer, although data from files on an auxiliary computer easily can be transferred to data-cartridge for later plotting. The user is led through the plot-construction process by a series of questions and requests for data input. Five example plots are presented to illustrate program capability and the sequence of program operation. (USGS)
Identifying hidden common causes from bivariate time series: a method using recurrence plots.
Hirata, Yoshito; Aihara, Kazuyuki
2010-01-01
We propose a method for inferring the existence of hidden common causes from observations of bivariate time series. We detect related time series by excessive simultaneous recurrences in the corresponding recurrence plots. We also use a noncoverage property of a recurrence plot by the other to deny the existence of a directional coupling. We apply the proposed method to real wind data.
Sediment Dynamics in a Vegetated Tidally Influenced Interdistributary Island: Wax Lake, Louisiana
2017-07-01
[Excerpt from Appendix A, "Time Series of Wax Lake Hydrological Measurements": wavelet plots of north-south wind stress and significant wave height (Hs); in each plot, the global wavelet spectrum is shown to the right of the wavelet plot and the original time series is shown below.]
A Study of Poisson's Ratio in the Yield Region
NASA Technical Reports Server (NTRS)
Gerard, George; Wildhorn, Sorrel
1952-01-01
In the yield region of the stress-strain curve the variation in Poisson's ratio from the elastic to the plastic value is most pronounced. This variation was studied experimentally by a systematic series of tests on several aluminum alloys. The tests were conducted under simple tensile and compressive loading along three orthogonal axes. A theoretical variation of Poisson's ratio for an orthotropic solid was obtained from dilatational considerations. The assumptions used in deriving the theory were examined by use of the test data and were found to be in reasonable agreement with experimental evidence.
From fuzzy recurrence plots to scalable recurrence networks of time series
NASA Astrophysics Data System (ADS)
Pham, Tuan D.
2017-04-01
Recurrence networks, which are derived from recurrence plots of nonlinear time series, enable the extraction of hidden features of complex dynamical systems. Because fuzzy recurrence plots are represented as grayscale images, this paper presents a variety of texture features that can be extracted from fuzzy recurrence plots. Based on the notion of fuzzy recurrence plots, defuzzified, undirected, and unweighted recurrence networks are introduced. Network measures can be computed for defuzzified recurrence networks that are scalable to meet the demand for the network-based analysis of big data.
Compositions, Random Sums and Continued Random Fractions of Poisson and Fractional Poisson Processes
NASA Astrophysics Data System (ADS)
Orsingher, Enzo; Polito, Federico
2012-08-01
In this paper we consider the relation between random sums and compositions of different processes. In particular, for independent Poisson processes $N_\alpha(t)$, $N_\beta(t)$, $t>0$, we have that $N_\alpha(N_\beta(t)) \stackrel{d}{=} \sum_{j=1}^{N_\beta(t)} X_j$, where the $X_j$ are Poisson random variables. We present a series of similar cases, where the outer process is Poisson with different inner processes. We highlight generalisations of these results where the external process is infinitely divisible. A section of the paper concerns compositions of the form $N_\alpha(\tau_k^\nu)$, $\nu\in(0,1]$, where $\tau_k^\nu$ is the inverse of the fractional Poisson process, and we show how these compositions can be represented as random sums. Furthermore we study compositions of the form $\Theta(N(t))$, $t>0$, which can be represented as random products. The last section is devoted to studying continued fractions of Cauchy random variables with a Poisson number of levels. We evaluate the exact distribution and derive the scale parameter in terms of ratios of Fibonacci numbers.
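The distributional identity between the composed process and the random sum can be checked numerically. The sketch below, with illustrative rates alpha and beta, simulates both sides and compares their empirical means, which should both equal alpha * beta * t:

```python
import math
import random

def poisson(lam, rng):
    """Poisson variate via Knuth's product method (adequate for modest lam)."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

rng = random.Random(7)
alpha, beta, t, trials = 2.0, 1.5, 3.0, 20000

# Left side: the outer Poisson process evaluated at the inner one.
lhs = []
for _ in range(trials):
    inner = poisson(beta * t, rng)           # N_beta(t)
    lhs.append(poisson(alpha * inner, rng))  # N_alpha(N_beta(t))

# Right side: a random sum of iid Poisson(alpha) variables.
rhs = []
for _ in range(trials):
    n = poisson(beta * t, rng)
    rhs.append(sum(poisson(alpha, rng) for _ in range(n)))

# Both sides share the mean alpha * beta * t = 9.0.
m_lhs = sum(lhs) / trials
m_rhs = sum(rhs) / trials
print(f"mean lhs = {m_lhs:.2f}, mean rhs = {m_rhs:.2f}")
```

Matching means (and, with more work, matching full histograms) illustrate the compound-Poisson representation the paper generalises.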
Recurrence plots and recurrence quantification analysis of human motion data
NASA Astrophysics Data System (ADS)
Josiński, Henryk; Michalczuk, Agnieszka; Świtoński, Adam; Szczesna, Agnieszka; Wojciechowski, Konrad
2016-06-01
The authors present exemplary application of recurrence plots, cross recurrence plots and recurrence quantification analysis for the purpose of exploration of experimental time series describing selected aspects of human motion. Time series were extracted from treadmill gait sequences which were recorded in the Human Motion Laboratory (HML) of the Polish-Japanese Academy of Information Technology in Bytom, Poland by means of the Vicon system. Analysis was focused on the time series representing movements of hip, knee, ankle and wrist joints in the sagittal plane.
Eruption patterns of the Chilean volcanoes Villarrica, Llaima, and Tupungatito
NASA Astrophysics Data System (ADS)
Muñoz, Miguel
1983-09-01
The historical eruption records of three Chilean volcanoes have been subjected to many statistical tests, and none have been found to differ significantly from random, or Poissonian, behaviour. The statistical analysis shows rough conformity with the descriptions determined from the eruption rate functions. It is possible that a constant eruption rate describes the activity of Villarrica; Llaima and Tupungatito present complex eruption rate patterns that appear, however, to have no statistical significance. Questions related to loading and extinction processes and to the existence of shallow secondary magma chambers to which magma is supplied from a deeper system are also addressed. The analysis and the computation of the serial correlation coefficients indicate that the three series may be regarded as stationary renewal processes. None of the test statistics indicates rejection of the Poisson hypothesis at a level less than 5%, but the coefficient of variation for the eruption series at Llaima is significantly different from the value expected for a Poisson process. Also, the estimates of the normalized spectrum of the counting process for the three series suggest a departure from the random model, but the deviations are not found to be significant at the 5% level. Kolmogorov-Smirnov and chi-squared test statistics, applied directly to ascertaining with which probability P the random Poisson model fits the data, indicate that there is significant agreement in the case of Villarrica (P = 0.59) and Tupungatito (P = 0.3). Even though the P-value for Llaima is a marginally significant 0.1 (which is equivalent to rejecting the Poisson model at the 90% confidence level), the series suggests that nonrandom features are possibly present in the eruptive activity of this volcano.
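One diagnostic used above, the coefficient of variation of inter-event intervals (equal to 1 for a Poisson renewal process), is easy to illustrate on synthetic catalogs. The event counts and jitter below are assumptions, not the Chilean eruption records:

```python
import random
import statistics

def cv_intervals(times):
    """Coefficient of variation of inter-event intervals; 1 for a Poisson process."""
    gaps = [b - a for a, b in zip(times, times[1:])]
    return statistics.pstdev(gaps) / statistics.mean(gaps)

rng = random.Random(3)

# Poisson process: exponential waiting times, so the CV of the gaps is near 1.
t, poisson_times = 0.0, []
for _ in range(2000):
    t += rng.expovariate(1.0)
    poisson_times.append(t)

# Quasi-periodic events: gaps near 1 with small jitter, so the CV is well below 1.
periodic_times = [i + rng.uniform(-0.1, 0.1) for i in range(2000)]

cv_poisson = cv_intervals(poisson_times)
cv_periodic = cv_intervals(periodic_times)
print(f"CV Poisson = {cv_poisson:.2f}, CV periodic = {cv_periodic:.2f}")
```

A CV significantly below 1 suggests quasi-periodic recharge, while a CV above 1 suggests clustering; this is the sense in which Llaima's CV departed from the Poisson expectation.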
A method for analyzing temporal patterns of variability of a time series from Poincare plots.
Fishman, Mikkel; Jacono, Frank J; Park, Soojin; Jamasebi, Reza; Thungtong, Anurak; Loparo, Kenneth A; Dick, Thomas E
2012-07-01
The Poincaré plot is a popular two-dimensional, time series analysis tool because of its intuitive display of dynamic system behavior. Poincaré plots have been used to visualize heart rate and respiratory pattern variabilities. However, conventional quantitative analysis relies primarily on statistical measurements of the cumulative distribution of points, making it difficult to interpret irregular or complex plots. Moreover, the plots are constructed to reflect highly correlated regions of the time series, reducing the amount of nonlinear information that is presented and thereby hiding potentially relevant features. We propose temporal Poincaré variability (TPV), a novel analysis methodology that uses standard techniques to quantify the temporal distribution of points and to detect nonlinear sources responsible for physiological variability. In addition, the analysis is applied across multiple time delays, yielding a richer insight into system dynamics than the traditional circle return plot. The method is applied to data sets of R-R intervals and to synthetic point process data extracted from the Lorenz time series. The results demonstrate that TPV complements the traditional analysis and can be applied more generally, including Poincaré plots with multiple clusters, and more consistently than the conventional measures and can address questions regarding potential structure underlying the variability of a data set.
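The conventional quantification that TPV is contrasted with, the SD1/SD2 ellipse descriptors of the point cloud at a chosen delay, can be sketched as follows. This is the traditional cumulative-distribution analysis, not TPV itself, and the synthetic R-R-like series is an assumption:

```python
import math
import random

def poincare_sd(x, tau=1):
    """SD1/SD2 descriptors of the Poincare plot of x[t] versus x[t + tau]."""
    pairs = list(zip(x, x[tau:]))
    d1 = [(b - a) / math.sqrt(2) for a, b in pairs]  # spread across the identity line
    d2 = [(b + a) / math.sqrt(2) for a, b in pairs]  # spread along the identity line
    def sd(v):
        m = sum(v) / len(v)
        return math.sqrt(sum((u - m) ** 2 for u in v) / len(v))
    return sd(d1), sd(d2)

rng = random.Random(1)
# Synthetic R-R-like intervals: a slow oscillation plus beat-to-beat jitter.
rr = [0.8 + 0.05 * math.sin(0.1 * i) + rng.gauss(0, 0.01) for i in range(1000)]

sd1, sd2 = poincare_sd(rr, tau=1)
print(f"SD1 = {sd1:.4f}, SD2 = {sd2:.4f}")
```

SD1 captures short-term (beat-to-beat) variability and SD2 long-term variability; sweeping tau over multiple delays, as TPV advocates, probes structure that a single circle-return plot hides.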
Finite-dimensional integrable systems: A collection of research problems
NASA Astrophysics Data System (ADS)
Bolsinov, A. V.; Izosimov, A. M.; Tsonev, D. M.
2017-05-01
This article suggests a series of problems related to various algebraic and geometric aspects of integrability. They reflect some recent developments in the theory of finite-dimensional integrable systems such as bi-Poisson linear algebra, Jordan-Kronecker invariants of finite dimensional Lie algebras, the interplay between singularities of Lagrangian fibrations and compatible Poisson brackets, and new techniques in projective geometry.
Time to burn: Modeling wildland arson as an autoregressive crime function
Jeffrey P. Prestemon; David T. Butry
2005-01-01
Six Poisson autoregressive models of order p [PAR(p)] of daily wildland arson ignition counts are estimated for five locations in Florida (1994-2001). In addition, a fixed effects time-series Poisson model of annual arson counts is estimated for all Florida counties (1995-2001). PAR(p) model estimates reveal highly significant arson ignition autocorrelation, lasting up...
Coarse-graining time series data: Recurrence plot of recurrence plots and its application for music
NASA Astrophysics Data System (ADS)
Fukino, Miwa; Hirata, Yoshito; Aihara, Kazuyuki
2016-02-01
We propose a nonlinear time series method for characterizing two layers of regularity simultaneously. The key of the method is using the recurrence plots hierarchically, which allows us to preserve the underlying regularities behind the original time series. We demonstrate the proposed method with musical data. The proposed method enables us to visualize both the local and the global musical regularities or two different features at the same time. Furthermore, the determinism scores imply that the proposed method may be useful for analyzing emotional response to the music.
Seasonally adjusted birth frequencies follow the Poisson distribution.
Barra, Mathias; Lindstrøm, Jonas C; Adams, Samantha S; Augestad, Liv A
2015-12-15
Variations in birth frequencies have an impact on activity planning in maternity wards. Previous studies of this phenomenon have commonly included elective births. A Danish study of spontaneous births found that birth frequencies were well modelled by a Poisson process. Somewhat unexpectedly, there were also weekly variations in the frequency of spontaneous births. Another study claimed that birth frequencies follow the Benford distribution. Our objective was to test these results. We analysed 50,017 spontaneous births at Akershus University Hospital in the period 1999-2014. To investigate the Poisson distribution of these births, we plotted their variance over a sliding average. We specified various Poisson regression models, with the number of births on a given day as the outcome variable. The explanatory variables included various combinations of years, months, days of the week and the digit sum of the date. The relationship between the variance and the average fits well with an underlying Poisson process. A Benford distribution was disproved by a goodness-of-fit test (p < 0.01). The fundamental model with year and month as explanatory variables is significantly improved (p < 0.001) by adding day of the week as an explanatory variable. Altogether 7.5% more children are born on Tuesdays than on Sundays. The digit sum of the date is non-significant as an explanatory variable (p = 0.23), nor does it increase the explained variance. INTERPRETATION: Spontaneous births are well modelled by a time-dependent Poisson process when monthly and day-of-the-week variation is included. The frequency is highest in summer towards June and July, Friday and Tuesday stand out as particularly busy days, and the activity level is at its lowest during weekends.
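The variance-versus-average diagnostic used above rests on the equidispersion property of the Poisson distribution: variance equals mean. A minimal simulated version, with an assumed mean of 9 births per day rather than the hospital's actual rate, looks like this:

```python
import math
import random

def poisson(lam, rng):
    """Poisson variate via Knuth's product method."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

rng = random.Random(11)

# Ten years of simulated daily birth counts with an assumed mean of 9 births/day.
births = [poisson(9.0, rng) for _ in range(3650)]

mean = sum(births) / len(births)
var = sum((b - mean) ** 2 for b in births) / len(births)

# For Poisson counts the variance-to-mean ratio (dispersion) should be near 1.
dispersion = var / mean
print(f"mean = {mean:.2f}, variance = {var:.2f}, dispersion = {dispersion:.2f}")
```

Dispersion well above 1 (overdispersion) would instead suggest unmodelled structure, such as the day-of-the-week effect the regression models capture.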
Millwood, Reginald; Nageswara-Rao, Madhugiri; Ye, Rongjian; Terry-Emert, Ellie; Johnson, Chelsea R; Hanson, Micaha; Burris, Jason N; Kwit, Charles; Stewart, C Neal
2017-05-02
Switchgrass is a C4 perennial grass species that is being developed as a cellulosic bioenergy feedstock. It is wind-pollinated and considered to be an obligate outcrosser. Genetic engineering has been used to alter cell walls for more facile bioprocessing and biofuel yield. Gene flow from transgenic cultivars would likely be of regulatory concern. In this study we investigated pollen-mediated gene flow from transgenic to nontransgenic switchgrass in a 3-year field experiment performed in Oliver Springs, Tennessee, U.S.A. using a modified Nelder wheel design. The planted area (0.6 ha) contained sexually compatible pollen source and pollen receptor switchgrass plants. One hundred clonal switchgrass 'Alamo' plants transgenic for an orange-fluorescent protein (OFP) and hygromycin resistance were used as the pollen source; whole plants, including pollen, were orange-fluorescent. To assess pollen movement, pollen traps were placed at 10 m intervals from the pollen-source plot in the four cardinal directions extending to 20 m, 30 m, 30 m, and 100 m to the north, south, west, and east, respectively. To assess pollination rates, nontransgenic 'Alamo 2' switchgrass clones were planted in pairs adjacent to pollen traps. In the eastward direction there was a 98% decrease in OFP pollen grains from 10 to 100 m from the pollen-source plot (Poisson regression, F(1,8) = 288.38, P < 0.0001). At the end of the second and third year, 1,820 F1 seeds were collected from pollen-recipient plots, of which 962 (52.9%) germinated and were analyzed for their transgenic status. Transgenic progeny production detected in each pollen-recipient plot decreased with increased distance from the edge of the transgenic plot (Poisson regression, F(1,15) = 12.98, P < 0.003). The frequency of transgenic progeny detected in the eastward plots (the direction of the prevailing wind) ranged from 79.2% at 10 m to 9.3% at 100 m.
In these experiments we found transgenic pollen movement and hybridization rates to be inversely associated with distance. However, these data suggest pollen-mediated gene flow is likely to occur up to, at least, 100 m. This study gives baseline data useful to determine isolation distances and other management practices should transgenic switchgrass be grown commercially in relevant environments.
Markov modulated Poisson process models incorporating covariates for rainfall intensity.
Thayakaran, R; Ramesh, N I
2013-01-01
Time series of rainfall bucket tip times at the Beaufort Park station, Bracknell, in the UK are modelled by a class of Markov modulated Poisson processes (MMPP) which may be thought of as a generalization of the Poisson process. Our main focus in this paper is to investigate the effects of including covariate information into the MMPP model framework on statistical properties. In particular, we look at three types of time-varying covariates namely temperature, sea level pressure, and relative humidity that are thought to be affecting the rainfall arrival process. Maximum likelihood estimation is used to obtain the parameter estimates, and likelihood ratio tests are employed in model comparison. Simulated data from the fitted model are used to make statistical inferences about the accumulated rainfall in the discrete time interval. Variability of the daily Poisson arrival rates is studied.
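A two-state MMPP, the simplest member of the model class above, can be simulated directly by alternating exponential sojourns between hidden states and generating Poisson arrivals at the state-dependent rate. The rates below are illustrative, not the fitted Bracknell parameters:

```python
import random

def simulate_mmpp(rates, switch, horizon, rng):
    """Two-state Markov-modulated Poisson process: exponential sojourns in each
    state (leaving rate switch[s]) with Poisson arrivals at rate rates[s]."""
    t, state, arrivals = 0.0, 0, []
    while t < horizon:
        sojourn_end = min(t + rng.expovariate(switch[state]), horizon)
        # Arrivals within the sojourn; memorylessness justifies restarting the
        # exponential arrival clock at each state switch.
        a = t + rng.expovariate(rates[state])
        while a < sojourn_end:
            arrivals.append(a)
            a += rng.expovariate(rates[state])
        t = sojourn_end
        state = 1 - state
    return arrivals

rng = random.Random(5)
horizon = 50000.0
# State 0: dry-ish regime (0.2 tips/unit time); state 1: wet regime (5.0).
arrivals = simulate_mmpp(rates=[0.2, 5.0], switch=[0.1, 0.1],
                         horizon=horizon, rng=rng)

# Equal switch rates give equal occupancy, so the long-run rate is (0.2 + 5.0) / 2.
rate = len(arrivals) / horizon
print(f"empirical arrival rate = {rate:.2f} (expected about 2.60)")
```

Covariates such as temperature or humidity enter the full model by modulating these rates; the sketch shows only the homogeneous backbone.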
Towards a New Generation of Time-Series Visualization Tools in the ESA Heliophysics Science Archives
NASA Astrophysics Data System (ADS)
Perez, H.; Martinez, B.; Cook, J. P.; Herment, D.; Fernandez, M.; De Teodoro, P.; Arnaud, M.; Middleton, H. R.; Osuna, P.; Arviset, C.
2017-12-01
During the last decades a varied set of Heliophysics missions have allowed the scientific community to gain a better knowledge on the solar atmosphere and activity. The remote sensing images of missions such as SOHO have paved the ground for Helio-based spatial data visualization software such as JHelioViewer/Helioviewer. On the other hand, the huge amount of in-situ measurements provided by other missions such as Cluster provides a wide base for plot visualization software whose reach is still far from being fully exploited. The Heliophysics Science Archives within the ESAC Science Data Center (ESDC) already provide a first generation of tools for time-series visualization focusing on each mission's needs: visualization of quicklook plots, cross-calibration time series, pre-generated/on-demand multi-plot stacks (Cluster), basic plot zoom in/out options (Ulysses) and easy navigation through the plots in time (Ulysses, Cluster, ISS-Solaces). However, the needs evolve: scientists involved in new missions require plotting of multi-variable data, interactive synchronization of heat-map stacks, and axis-variable selection, among other improvements. The new Heliophysics archives (such as Solar Orbiter) and the evolution of existing ones (Cluster) intend to address these new challenges. This paper provides an overview of the different approaches for visualizing time series followed within the ESA Heliophysics Archives and their foreseen evolution.
Bayesian dynamic modeling of time series of dengue disease case counts.
Martínez-Bello, Daniel Adyro; López-Quílez, Antonio; Torres-Prieto, Alexander
2017-07-01
The aim of this study is to model the association between weekly time series of dengue case counts and meteorological variables, in a high-incidence city of Colombia, applying Bayesian hierarchical dynamic generalized linear models over the period January 2008 to August 2015. Additionally, we evaluate the model's short-term performance for predicting dengue cases. The methodology shows dynamic Poisson log link models including constant or time-varying coefficients for the meteorological variables. Calendar effects were modeled using constant or first- or second-order random walk time-varying coefficients. The meteorological variables were modeled using constant coefficients and first-order random walk time-varying coefficients. We applied Markov Chain Monte Carlo simulations for parameter estimation, and deviance information criterion statistic (DIC) for model selection. We assessed the short-term predictive performance of the selected final model, at several time points within the study period using the mean absolute percentage error. The results showed the best model including first-order random walk time-varying coefficients for calendar trend and first-order random walk time-varying coefficients for the meteorological variables. Besides the computational challenges, interpreting the results implies a complete analysis of the time series of dengue with respect to the parameter estimates of the meteorological effects. We found small values of the mean absolute percentage errors at one or two weeks out-of-sample predictions for most prediction points, associated with low volatility periods in the dengue counts. We discuss the advantages and limitations of the dynamic Poisson models for studying the association between time series of dengue disease and meteorological variables. 
The key conclusion of the study is that dynamic Poisson models account for the dynamic nature of the variables involved in the modeling of time series of dengue disease, producing useful models for decision-making in public health.
Koyama, Kento; Hokunan, Hidekazu; Hasegawa, Mayumi; Kawamura, Shuso; Koseki, Shigenobu
2016-12-01
We investigated a bacterial sample preparation procedure for single-cell studies. In the present study, we examined whether single bacterial cells obtained via 10-fold dilution followed a theoretical Poisson distribution. Four serotypes of Salmonella enterica, three serotypes of enterohaemorrhagic Escherichia coli and one serotype of Listeria monocytogenes were used as sample bacteria. An inoculum of each serotype was prepared via a 10-fold dilution series to obtain bacterial cell counts with mean values of one or two. To determine whether the experimentally obtained bacterial cell counts follow a theoretical Poisson distribution, a likelihood ratio test was conducted between the experimentally obtained cell counts and a Poisson distribution whose parameter was estimated by maximum likelihood estimation (MLE). The bacterial cell counts of each serotype sufficiently followed a Poisson distribution. Furthermore, to examine the validity of the Poisson distribution parameters obtained experimentally from bacterial cell counts, we compared these with the parameters of a Poisson distribution that were estimated using random number generation via computer simulation. The Poisson distribution parameters experimentally obtained from bacterial cell counts were within the range of the parameters estimated using a computer simulation. These results demonstrate that the bacterial cell counts of each serotype obtained via 10-fold dilution followed a Poisson distribution. The fact that the frequency of bacterial cell counts follows a Poisson distribution at low numbers can be applied to some single-cell studies with a few bacterial cells. In particular, the procedure presented in this study enables us to develop an inactivation model at the single-cell level that can estimate the variability of surviving bacterial numbers during the bacterial death process. Copyright © 2016 Elsevier Ltd. All rights reserved.
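The MLE step above is simple for the Poisson model: the estimate is the sample mean. A minimal simulated version of the dilution check, with an assumed mean of about one cell per aliquot, compares the observed frequency of empty aliquots with the fitted Poisson prediction:

```python
import math
import random

def poisson(lam, rng):
    """Poisson variate via Knuth's product method."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

rng = random.Random(9)

# Simulated aliquot counts after dilution to an assumed mean of ~1 cell/aliquot.
counts = [poisson(1.0, rng) for _ in range(1000)]

# For a Poisson model the maximum likelihood estimate is the sample mean.
lam_hat = sum(counts) / len(counts)

# Compare the observed frequency of empty aliquots with the fitted e^(-lam).
p0_obs = counts.count(0) / len(counts)
p0_fit = math.exp(-lam_hat)
print(f"lam_hat = {lam_hat:.2f}, P(0) observed = {p0_obs:.3f}, fitted = {p0_fit:.3f}")
```

The full study uses a likelihood ratio test over all count categories; comparing the zero-count class alone is the one-line version of the same idea.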
Poplová, Michaela; Sovka, Pavel; Cifra, Michal
2017-01-01
Photonic signals are broadly exploited in communication and sensing and they typically exhibit Poisson-like statistics. In a common scenario where the intensity of the photonic signals is low and one needs to remove a nonstationary trend of the signals for any further analysis, one faces an obstacle: due to the dependence between the mean and variance typical for a Poisson-like process, information about the trend remains in the variance even after the trend has been subtracted, possibly yielding artifactual results in further analyses. Commonly available detrending or normalizing methods cannot cope with this issue. To alleviate this issue we developed a suitable pre-processing method for the signals that originate from a Poisson-like process. In this paper, a Poisson pre-processing method for nonstationary time series with Poisson distribution is developed and tested on computer-generated model data and experimental data of chemiluminescence from human neutrophils and mung seeds. The presented method transforms a nonstationary Poisson signal into a stationary signal with a Poisson distribution while preserving the type of photocount distribution and phase-space structure of the signal. The importance of the suggested pre-processing method is shown in Fano factor and Hurst exponent analysis of both computer-generated model signals and experimental photonic signals. It is demonstrated that our pre-processing method is superior to standard detrending-based methods whenever further signal analysis is sensitive to variance of the signal.
NASA Astrophysics Data System (ADS)
Pohle, Ina; Niebisch, Michael; Zha, Tingting; Schümberg, Sabine; Müller, Hannes; Maurer, Thomas; Hinz, Christoph
2017-04-01
Rainfall variability within a storm is of major importance for fast hydrological processes, e.g. surface runoff, erosion and solute dissipation from surface soils. To investigate and simulate the impacts of within-storm variabilities on these processes, long time series of rainfall with high resolution are required. Yet, observed precipitation records of hourly or higher resolution are in most cases available only for a small number of stations and only for a few years. To obtain long time series of alternating rainfall events and interstorm periods while conserving the statistics of observed rainfall events, the Poisson model can be used. Multiplicative microcanonical random cascades have been widely applied to disaggregate rainfall time series from coarse to fine temporal resolution. We present a new coupling approach of the Poisson rectangular pulse model and the multiplicative microcanonical random cascade model that preserves the characteristics of rainfall events as well as inter-storm periods. In the first step, a Poisson rectangular pulse model is applied to generate discrete rainfall events (duration and mean intensity) and inter-storm periods (duration). The rainfall events are subsequently disaggregated to high-resolution time series (user-specified, e.g. 10 min resolution) by a multiplicative microcanonical random cascade model. One of the challenges of coupling these models is to parameterize the cascade model for the event durations generated by the Poisson model. In fact, the cascade model is best suited to downscale rainfall data with constant time step such as daily precipitation data. Without starting from a fixed time step duration (e.g. daily), the disaggregation of events requires some modifications of the multiplicative microcanonical random cascade model proposed by Olsson (1998): Firstly, the parameterization of the cascade model for events of different durations requires continuous functions for the probabilities of the multiplicative weights, which we implemented through sigmoid functions. Secondly, the branching of the first and last box is constrained to preserve the rainfall event durations generated by the Poisson rectangular pulse model. The event-based continuous time step rainfall generator has been developed and tested using 10 min and hourly rainfall data of four stations in North-Eastern Germany. The model performs well in comparison to observed rainfall in terms of event durations and mean event intensities as well as wet spell and dry spell durations. It is currently being tested using data from other stations across Germany and in different climate zones. Furthermore, the rainfall event generator is being applied in modelling approaches aimed at understanding the impact of rainfall variability on hydrological processes. Reference: Olsson, J.: Evaluation of a scaling cascade model for temporal rainfall disaggregation, Hydrology and Earth System Sciences, 2, 19-30, 1998.
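The first stage of the coupled generator, the Poisson rectangular pulse model, amounts to alternating exponential inter-storm gaps with exponentially distributed event durations and mean intensities. The parameter values below are illustrative assumptions, not fitted values for the German stations:

```python
import random

def rectangular_pulses(n_events, rng, mean_dry=20.0, mean_dur=4.0, mean_int=2.0):
    """Generate (start, duration, intensity) triples for rectangular rain pulses
    separated by exponential inter-storm periods. All three means are assumed."""
    t, events = 0.0, []
    for _ in range(n_events):
        t += rng.expovariate(1.0 / mean_dry)         # inter-storm period [h]
        dur = rng.expovariate(1.0 / mean_dur)        # event duration [h]
        intensity = rng.expovariate(1.0 / mean_int)  # mean event intensity [mm/h]
        events.append((t, dur, intensity))
        t += dur
    return events

rng = random.Random(21)
events = rectangular_pulses(5000, rng)

mean_dur = sum(d for _, d, _ in events) / len(events)
mean_int = sum(i for _, _, i in events) / len(events)
print(f"mean duration = {mean_dur:.2f} h, mean intensity = {mean_int:.2f} mm/h")
```

The second stage, not shown, would split each rectangular pulse recursively with multiplicative weights to reach the target (e.g. 10 min) resolution while preserving the event's total depth and duration.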
High order solution of Poisson problems with piecewise constant coefficients and interface jumps
NASA Astrophysics Data System (ADS)
Marques, Alexandre Noll; Nave, Jean-Christophe; Rosales, Rodolfo Ruben
2017-04-01
We present a fast and accurate algorithm to solve Poisson problems in complex geometries, using regular Cartesian grids. We consider a variety of configurations, including Poisson problems with interfaces across which the solution is discontinuous (of the type arising in multi-fluid flows). The algorithm is based on a combination of the Correction Function Method (CFM) and Boundary Integral Methods (BIM). Interface and boundary conditions can be treated in a fast and accurate manner using boundary integral equations, and the associated BIM. Unfortunately, BIM can be costly when the solution is needed everywhere in a grid, e.g. fluid flow problems. We use the CFM to circumvent this issue. The solution from the BIM is used to rewrite the problem as a series of Poisson problems in rectangular domains, which requires the BIM solution at interfaces/boundaries only. These Poisson problems involve discontinuities at interfaces, of the type that the CFM can handle. Hence we use the CFM to solve them (to high order of accuracy) with finite differences and a Fast Fourier Transform based fast Poisson solver. We present 2-D examples of the algorithm applied to Poisson problems involving complex geometries, including cases in which the solution is discontinuous. We show that the algorithm produces solutions that converge with either 3rd or 4th order of accuracy, depending on the type of boundary condition and solution discontinuity.
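The "fast Poisson solver on a rectangular domain" ingredient can be illustrated in one dimension with a second-order finite-difference solve via the Thomas tridiagonal algorithm. This is a stand-in for the paper's FFT-based solver, and the manufactured solution used to check the error is an assumption for testing:

```python
import math

def solve_poisson_1d(f, n):
    """Solve u'' = f on (0, 1) with u(0) = u(1) = 0 using second-order
    finite differences and the Thomas (tridiagonal) algorithm."""
    h = 1.0 / (n + 1)
    x = [(i + 1) * h for i in range(n)]
    b = [f(xi) * h * h for xi in x]
    # Tridiagonal system: sub/super-diagonals 1, main diagonal -2.
    c, d = [0.0] * n, [0.0] * n  # forward-sweep coefficients
    c[0] = 1.0 / -2.0
    d[0] = b[0] / -2.0
    for i in range(1, n):
        m = -2.0 - c[i - 1]
        c[i] = 1.0 / m
        d[i] = (b[i] - d[i - 1]) / m
    u = [0.0] * n
    u[-1] = d[-1]
    for i in range(n - 2, -1, -1):  # back-substitution
        u[i] = d[i] - c[i] * u[i + 1]
    return x, u

# Manufactured solution u = sin(pi x), for which f = -pi^2 sin(pi x).
x, u = solve_poisson_1d(lambda s: -math.pi ** 2 * math.sin(math.pi * s), 200)
err = max(abs(ui - math.sin(math.pi * xi)) for xi, ui in zip(x, u))
print(f"max error = {err:.2e}")
```

Halving h should cut the maximum error by about a factor of four, the second-order convergence that the CFM then boosts to 3rd/4th order in the full scheme.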
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taddei, Laura; Martinelli, Matteo; Amendola, Luca, E-mail: taddei@thphys.uni-heidelberg.de, E-mail: martinelli@lorentz.leidenuniv.nl, E-mail: amendola@thphys.uni-heidelberg.de
2016-12-01
The aim of this paper is to constrain modified gravity with redshift space distortion observations and supernovae measurements. Compared with a standard ΛCDM analysis, we include three additional free parameters, namely the initial conditions of the matter perturbations, the overall perturbation normalization, and a scale-dependent modified gravity parameter modifying the Poisson equation, in an attempt to perform a more model-independent analysis. First, we constrain the Poisson parameter Y (also called G_eff) by using currently available fσ8 data and the recent SN catalog JLA. We find that the inclusion of the additional free parameters makes the constraints significantly weaker than when fixing them to the standard cosmological value. Second, we forecast future constraints on Y by using the predicted growth-rate data for the Euclid and SKA missions. Here again we point out the weakening of the constraints when the additional parameters are included. Finally, we adopt as modified gravity Poisson parameter the specific Horndeski form, and use scale-dependent forecasts to build an exclusion plot for the Yukawa potential akin to the ones realized in laboratory experiments, both for the Euclid and the SKA surveys.
Recurrence Density Enhanced Complex Networks for Nonlinear Time Series Analysis
NASA Astrophysics Data System (ADS)
Costa, Diego G. De B.; Reis, Barbara M. Da F.; Zou, Yong; Quiles, Marcos G.; Macau, Elbert E. N.
We introduce a new method, entitled Recurrence Density Enhanced Complex Network (RDE-CN), to properly analyze nonlinear time series. Our method first transforms a recurrence plot into a figure with a reduced number of points that preserves the main and fundamental recurrence properties of the original plot. This resulting figure is then reinterpreted as a complex network, which is further characterized by network statistical measures. We illustrate the computational power of the RDE-CN approach with time series from both the logistic map and experimental fluid flows, showing that our method distinguishes different dynamics as well as traditional recurrence analysis does. Therefore, the proposed methodology characterizes the recurrence matrix adequately, while using a reduced set of points from the original recurrence plots.
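The starting point of any such analysis, the binary recurrence matrix of a logistic-map time series, can be sketched as follows (the threshold and map parameters are illustrative choices, not those of the study):

```python
import numpy as np

def logistic_series(r=4.0, x0=0.3, n=500):
    """Iterate the logistic map x -> r * x * (1 - x)."""
    x = np.empty(n)
    x[0] = x0
    for i in range(1, n):
        x[i] = r * x[i - 1] * (1.0 - x[i - 1])
    return x

def recurrence_matrix(x, eps):
    """Binary recurrence plot: R[i, j] = 1 when |x_i - x_j| < eps."""
    d = np.abs(x[:, None] - x[None, :])
    return (d < eps).astype(np.uint8)

x = logistic_series()
R = recurrence_matrix(x, eps=0.1)
density = R.mean()   # recurrence rate of the plot
```

RDE-CN then reduces this N x N matrix to far fewer points before reinterpreting it as a complex network; the sketch above produces only the raw recurrence plot that such a reduction would start from.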
Forest Health Monitoring in Maryland 1996-1999
Northeastern Research Station
2003-01-01
The National Forest Health Monitoring (FHM) program monitors the long-term status, changes and trends in the health of forest ecosystems and is conducted in cooperation with individual states. In Maryland, 40 FHM plots were established in 1991. Beginning in 1998, 95 plots were added. Each plot is a series of four fixed-area circular plots. Most tree measurements are...
Daily mortality and air pollutants: findings from Köln, Germany.
Spix, C; Wichmann, H E
1996-04-01
For the APHEA study, the short term effects of air pollutants on human health were investigated in a comparable way in various European cities. Daily mortality was used as one of the health effects indicators. This report aims to demonstrate the steps in epidemiological model building in this type of time series analysis for detecting short term effects under a Poisson distribution assumption, and shows the tools for decision making. In addition, it assesses the impact of these steps on the pollution effect estimates. Köln, Germany, is a city of one million inhabitants. It is densely populated with a warm, humid, unfavourable climate and a high traffic density. In previous studies, smog episodes were found to increase mortality and higher sulphur dioxide (SO2) levels were connected with increases in the number of episodes of croup. Daily total mortality was obtained for 1975-85. SO2, total suspended particulates, and nitrogen dioxide (NO2) data were available from two to five stations for the city area, and size fractionated PM7 data from a neighbouring city. The main tools were time series plots of the raw data, predicted and residual data, the partial autocorrelation function and periodogram of the residuals, cross correlations of prefiltered series, plots of categorised influences, chi-squared statistics of influences, and sensitivity analyses taking overdispersion and autocorrelation into account. With regard to model building, it is concluded that seasonal and epidemic correction are the most important steps. The actual model chosen depends very much on the properties of the data set. For the pollution effect estimates, trend, season, and epidemic corrections are important to avoid overestimation of the effect, while an appropriate short term meteorology influence correction model may actually prevent underestimation.
When the model leaves very little over-dispersion and autocorrelation in the residuals, which indicates a good fit, correction for them has consequently little impact. Given the model, most of the range of SO2 values (5th centile to 95th centile) led to a 3-4% increase in mortality (significant), particulates led to a 2% increase (borderline significant, less data than for SO2), and NO2 had no relationship with mortality (measurements possibly not representative of actual exposure). Effects were usually delayed by a day.
Gustafsson, Leif; Sternad, Mikael
2007-10-01
Population models concern collections of discrete entities such as atoms, cells, humans, and animals, where the focus is on the number of entities in a population. Because of the complexity of such models, simulation is usually needed to reproduce their complete dynamic and stochastic behaviour. Two main types of simulation models are used for different purposes, namely micro-simulation models, where each individual is described with its particular attributes and behaviour, and macro-simulation models based on stochastic differential equations, where the population is described in aggregated terms by the number of individuals in different states. Consistency between micro- and macro-models is a crucial but often neglected aspect. This paper demonstrates how the Poisson Simulation technique can be used to produce a population macro-model consistent with the corresponding micro-model. This is accomplished by defining Poisson Simulation in strictly mathematical terms as a series of Poisson processes that generate sequences of Poisson distributions with dynamically varying parameters. The method can be applied to any population model. It provides the unique stochastic and dynamic macro-model consistent with a correct micro-model. The paper also presents a general macro form for stochastic and dynamic population models. In an appendix, Poisson Simulation is compared with Markov Simulation, showing a number of advantages. In particular, aggregation into state variables and aggregation of many events per time step make Poisson Simulation orders of magnitude faster than Markov Simulation. Furthermore, much larger and more complicated models can be built and executed with Poisson Simulation than is possible with the Markov approach.
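A minimal sketch of the Poisson Simulation idea for a macro-model: each flow over a time step dt is drawn as a Poisson-distributed number of events, so the aggregated state update remains consistent with an event-level micro-model. The birth-death population below is an illustrative example, not one from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def poisson_sim(x0=1000, birth=0.1, death=0.1, dt=0.1, steps=500):
    """Macro-model population update in the Poisson Simulation style:
    each flow over a step dt contributes Po(rate * dt) events, instead
    of the deterministic rate * dt of an ordinary difference equation."""
    x = x0
    path = [x]
    for _ in range(steps):
        births = rng.poisson(birth * x * dt)   # events in this step
        deaths = rng.poisson(death * x * dt)
        x = max(x + births - deaths, 0)        # population stays integer
        path.append(x)
    return path

path = poisson_sim()
```

Replacing the Poisson draws with their means `birth * x * dt` and `death * x * dt` recovers the deterministic macro-model, which is exactly the consistency relationship the paper formalizes.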
NASA Astrophysics Data System (ADS)
Grzybowski, H.; Mosdorf, R.
2016-09-01
The temperature fluctuations occurring in flow boiling in parallel minichannels with a diameter of 1 mm have been experimentally investigated and analysed. The wall temperature was recorded at each minichannel outlet by a thermocouple with a 0.08 mm diameter probe. The time series were recorded during dynamic two-phase flow instabilities, which are accompanied by chaotic temperature fluctuations. Time series were denoised using wavelet decomposition and were analysed using cross recurrence plots (CRPs), which enable the study of the synchronization of two time series.
Hejda, Martin
2012-01-01
The aim was to estimate the impacts of invasive Impatiens parviflora on forests’ herbal layer communities. A replicated Before-After-Control-Impact field experiment and comparisons with adjacent uninvaded plots were used. The alien’s impact on species richness was tested using hierarchical generalized mixed effect models with a Poisson error structure. Impact on species composition was tested using multivariate models (DCA, CCA, RDA) and Monte-Carlo permutation tests. Removal plots did not differ in native species richness from either the invaded or the adjacent uninvaded plots, whether the treatment’s main effect or its interaction with sampling time was tested (Chi2 = 0.4757, DF = 2, p = 0.7883; Chi2 = 7.229, DF = 8, p = 0.5121, respectively). On the contrary, ordination models revealed differences in the development of plots following the treatments (p = 0.034), with the invaded plots differing from the adjacent uninvaded ones (p = 0.002). Impatiens parviflora is highly unlikely to impact the native species richness of invaded communities, which may be associated with its limited ability to create a dense canopy, its modest root system, or the fact that I. parviflora does not represent a novel and distinctive dominant in the invaded communities. Concerning its potential impacts on species composition, the presence of native clonal species (Athyrium filix-femina, Dryopteris filix-mas, Fragaria moschata, Luzula luzuloides, Poa nemoralis) on the adjacent uninvaded plots likely makes them different from the invaded plots. However, these strong, competitive species are more likely to prevent the invasion of I. parviflora on the adjacent uninvaded plots than to be themselves eliminated from the invaded communities. PMID:22768091
A Bayesian CUSUM plot: Diagnosing quality of treatment.
Rosthøj, Steen; Jacobsen, Rikke-Line
2017-12-01
To present a CUSUM plot based on Bayesian diagnostic reasoning displaying evidence in favour of "healthy" rather than "sick" quality of treatment (QOT), and to demonstrate a technique using Kaplan-Meier survival curves permitting application to case series with ongoing follow-up. For a case series with known final outcomes: Consider each case a diagnostic test of good versus poor QOT (expected vs. increased failure rates), determine the likelihood ratio (LR) of the observed outcome, convert LR to weight taking log to base 2, and add up weights sequentially in a plot showing how many times odds in favour of good QOT have been doubled. For a series with observed survival times and an expected survival curve: Divide the curve into time intervals, determine "healthy" and specify "sick" risks of failure in each interval, construct a "sick" survival curve, determine the LR of survival or failure at the given observation times, convert to weights, and add up. The Bayesian plot was applied retrospectively to 39 children with acute lymphoblastic leukaemia with completed follow-up, using Nordic collaborative results as reference, showing equal odds between good and poor QOT. In the ongoing treatment trial, with 22 of 37 children still at risk for event, QOT has been monitored with average survival curves as reference, odds so far favoring good QOT 2:1. QOT in small patient series can be assessed with a Bayesian CUSUM plot, retrospectively when all treatment outcomes are known, but also in ongoing series with unfinished follow-up. © 2017 John Wiley & Sons, Ltd.
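The weight-accumulation scheme for the known-outcomes case can be sketched as follows; the failure probabilities under "healthy" and "sick" quality of treatment are illustrative placeholders, not values from the study.

```python
import math

def evidence_weights(outcomes, p_good, p_poor):
    """Cumulative log2 likelihood-ratio weights for a Bayesian CUSUM plot.
    outcomes: 0 = treatment success, 1 = failure, one per consecutive case.
    p_good / p_poor: failure risks under 'healthy' and 'sick' QOT.
    The running total counts how many times the odds in favour of good
    QOT have been doubled (negative values mean halved)."""
    total, cumulative = 0.0, []
    for y in outcomes:
        if y == 0:
            lr = (1.0 - p_good) / (1.0 - p_poor)   # LR of a success
        else:
            lr = p_good / p_poor                   # LR of a failure
        total += math.log2(lr)                     # log2 turns LR into doublings
        cumulative.append(total)
    return cumulative

# Ten cases with a single failure: evidence drifts towards good QOT.
weights = evidence_weights([0, 0, 0, 1, 0, 0, 0, 0, 0, 0],
                           p_good=0.1, p_poor=0.2)
```

Each success adds a small positive weight and each failure a larger negative one, so the plot visualizes how the diagnostic evidence accumulates case by case.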
Modeling post-fire woody carbon dynamics with data from remeasured inventory plots
Bianca N.I. Eskelson; Jeremy Fried; Vicente Monleon
2015-01-01
In California, the Forest Inventory and Analysis (FIA) plots within large fires were visited one year after the fire occurred resulting in a time series of measurements before and after fire. During this additional plot visit, the standard inventory measurements were augmented for these burned plots to assess fire effects. One example of the additional measurements is...
De Spiegelaere, Ward; Malatinkova, Eva; Lynch, Lindsay; Van Nieuwerburgh, Filip; Messiaen, Peter; O'Doherty, Una; Vandekerckhove, Linos
2014-06-01
Quantification of integrated proviral HIV DNA by repetitive-sampling Alu-HIV PCR is a candidate virological tool to monitor the HIV reservoir in patients. However, the experimental procedures and data analysis of the assay are complex and hinder its widespread use. Here, we provide an improved and simplified data analysis method by adopting binomial and Poisson statistics. A modified analysis method on the basis of Poisson statistics was used to analyze the binomial data of positive and negative reactions from a 42-replicate Alu-HIV PCR, using dilutions of an integration standard and samples of 57 HIV-infected patients. Results were compared with the quantitative output of the previously described Alu-HIV PCR method. Poisson-based quantification of the Alu-HIV PCR was linearly correlated with the standard dilution series, indicating that absolute quantification with the Poisson method is a valid alternative for data analysis of repetitive-sampling Alu-HIV PCR data. Quantitative outputs of patient samples assessed by the Poisson method correlated with the previously described Alu-HIV PCR analysis, indicating that this method is a valid alternative for quantifying integrated HIV DNA. Poisson-based analysis of the Alu-HIV PCR data enables absolute quantification without the need of a standard dilution curve. Implementation of confidence interval (CI) estimation permits improved qualitative analysis of the data and provides a statistical basis for the required minimal number of technical replicates. © 2014 The American Association for Clinical Chemistry.
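A generic sketch of Poisson-based quantification from binomial positive/negative replicate data, in the spirit of the analysis described above (a standard limiting-dilution estimator, not the authors' implementation):

```python
import math

def poisson_quantify(n_replicates, n_positive):
    """Estimate the mean number of amplifiable targets per reaction from
    the fraction of negative replicates. If targets distribute across
    replicates as Po(lambda), then P(negative) = exp(-lambda), so
    lambda = -ln(negative fraction)."""
    neg_frac = (n_replicates - n_positive) / n_replicates
    if neg_frac == 0.0:
        raise ValueError("all replicates positive: lambda not estimable")
    return -math.log(neg_frac)

# With half of 42 replicates positive, lambda equals ln(2) targets/reaction.
lam = poisson_quantify(42, 21)
```

Because the estimate uses only the count of positive reactions, it needs no standard dilution curve; the precision of `lam` is governed by the binomial spread of the positive count, which is what motivates the minimal-replicate analysis mentioned above.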
NASADIG - NASA DEVICE INDEPENDENT GRAPHICS LIBRARY (AMDAHL VERSION)
NASA Technical Reports Server (NTRS)
Rogers, J. E.
1994-01-01
The NASA Device Independent Graphics Library, NASADIG, can be used with many computer-based engineering and management applications. The library gives the user the opportunity to translate data into effective graphic displays for presentation. The software offers many features which allow the user flexibility in creating graphics. These include two-dimensional plots, subplot projections in 3D-space, surface contour line plots, and surface contour color-shaded plots. Routines for three-dimensional plotting, wireframe surface plots, surface plots with hidden line removal, and surface contour line plots are provided. Other features include polar and spherical coordinate plotting, world map plotting utilizing either cylindrical equidistant or Lambert equal area projection, plot translation, plot rotation, plot blowup, splines and polynomial interpolation, area blanking control, multiple log/linear axes, legends and text control, curve thickness control, and multiple text fonts (18 regular, 4 bold). NASADIG contains several groups of subroutines. Included are subroutines for plot area and axis definition; text set-up and display; area blanking; line style set-up, interpolation, and plotting; color shading and pattern control; legend, text block, and character control; device initialization; mixed alphabets setting; and other useful functions. The usefulness of many routines is dependent on the prior definition of basic parameters. The program's control structure uses a serial-level construct with each routine restricted for activation at some prescribed level(s) of problem definition. NASADIG provides the following output device drivers: Selanar 100XL, VECTOR Move/Draw ASCII and PostScript files, Tektronix 40xx, 41xx, and 4510 Rasterizer, DEC VT-240 (4014 mode), IBM AT/PC compatible with SmartTerm 240 emulator, HP Lasergrafix Film Recorder, QMS 800/1200, DEC LN03+ Laserprinters, and HP LaserJet (Series III). 
NASADIG is written in FORTRAN and is available for several platforms. NASADIG 5.7 is available for DEC VAX series computers running VMS 5.0 or later (MSC-21801), Cray X-MP and Y-MP series computers running UNICOS (COS-10049), and Amdahl 5990 mainframe computers running UTS (COS-10050). NASADIG 5.1 is available for UNIX-based operating systems (MSC-22001). The UNIX version has been successfully implemented on Sun4 series computers running SunOS, SGI IRIS computers running IRIX, Hewlett Packard 9000 computers running HP-UX, and Convex computers running Convex OS (MSC-22001). The standard distribution medium for MSC-21801 is a set of two 6250 BPI 9-track magnetic tapes in DEC VAX BACKUP format. It is also available on a set of two TK50 tape cartridges in DEC VAX BACKUP format. The standard distribution medium for COS-10049 and COS-10050 is a 6250 BPI 9-track magnetic tape in UNIX tar format. Other distribution media and formats may be available upon request. The standard distribution medium for MSC-22001 is a .25 inch streaming magnetic tape cartridge (Sun QIC-24) in UNIX tar format. Alternate distribution media and formats are available upon request. With minor modification, the UNIX source code can be ported to other platforms including IBM PC/AT series computers and compatibles. NASADIG is also available bundled with TRASYS, the Thermal Radiation Analysis System (COS-10026, DEC VAX version; COS-10040, CRAY version).
NASADIG - NASA DEVICE INDEPENDENT GRAPHICS LIBRARY (UNIX VERSION)
NASA Technical Reports Server (NTRS)
Rogers, J. E.
1994-01-01
The NASA Device Independent Graphics Library, NASADIG, can be used with many computer-based engineering and management applications. The library gives the user the opportunity to translate data into effective graphic displays for presentation. The software offers many features which allow the user flexibility in creating graphics. These include two-dimensional plots, subplot projections in 3D-space, surface contour line plots, and surface contour color-shaded plots. Routines for three-dimensional plotting, wireframe surface plots, surface plots with hidden line removal, and surface contour line plots are provided. Other features include polar and spherical coordinate plotting, world map plotting utilizing either cylindrical equidistant or Lambert equal area projection, plot translation, plot rotation, plot blowup, splines and polynomial interpolation, area blanking control, multiple log/linear axes, legends and text control, curve thickness control, and multiple text fonts (18 regular, 4 bold). NASADIG contains several groups of subroutines. Included are subroutines for plot area and axis definition; text set-up and display; area blanking; line style set-up, interpolation, and plotting; color shading and pattern control; legend, text block, and character control; device initialization; mixed alphabets setting; and other useful functions. The usefulness of many routines is dependent on the prior definition of basic parameters. The program's control structure uses a serial-level construct with each routine restricted for activation at some prescribed level(s) of problem definition. NASADIG provides the following output device drivers: Selanar 100XL, VECTOR Move/Draw ASCII and PostScript files, Tektronix 40xx, 41xx, and 4510 Rasterizer, DEC VT-240 (4014 mode), IBM AT/PC compatible with SmartTerm 240 emulator, HP Lasergrafix Film Recorder, QMS 800/1200, DEC LN03+ Laserprinters, and HP LaserJet (Series III). 
NASADIG is written in FORTRAN and is available for several platforms. NASADIG 5.7 is available for DEC VAX series computers running VMS 5.0 or later (MSC-21801), Cray X-MP and Y-MP series computers running UNICOS (COS-10049), and Amdahl 5990 mainframe computers running UTS (COS-10050). NASADIG 5.1 is available for UNIX-based operating systems (MSC-22001). The UNIX version has been successfully implemented on Sun4 series computers running SunOS, SGI IRIS computers running IRIX, Hewlett Packard 9000 computers running HP-UX, and Convex computers running Convex OS (MSC-22001). The standard distribution medium for MSC-21801 is a set of two 6250 BPI 9-track magnetic tapes in DEC VAX BACKUP format. It is also available on a set of two TK50 tape cartridges in DEC VAX BACKUP format. The standard distribution medium for COS-10049 and COS-10050 is a 6250 BPI 9-track magnetic tape in UNIX tar format. Other distribution media and formats may be available upon request. The standard distribution medium for MSC-22001 is a .25 inch streaming magnetic tape cartridge (Sun QIC-24) in UNIX tar format. Alternate distribution media and formats are available upon request. With minor modification, the UNIX source code can be ported to other platforms including IBM PC/AT series computers and compatibles. NASADIG is also available bundled with TRASYS, the Thermal Radiation Analysis System (COS-10026, DEC VAX version; COS-10040, CRAY version).
Wan, Wai-Yin; Chan, Jennifer S K
2009-08-01
For time series of count data, correlated measurements, clustering, and excessive zeros occur simultaneously in biomedical applications. Ignoring such effects might contribute to misleading treatment outcomes. A generalized mixture Poisson geometric process (GMPGP) model and a zero-altered mixture Poisson geometric process (ZMPGP) model are developed from the geometric process model, which was originally proposed for modelling positive continuous data and has been extended to handle count data. These models are motivated by evaluating the trend development of new tumour counts for bladder cancer patients as well as by identifying useful covariates which affect the count level. The models are implemented using Bayesian methods with Markov chain Monte Carlo (MCMC) algorithms and are assessed using the deviance information criterion (DIC).
Rapid computation of directional wellbore drawdown in a confined aquifer via Poisson resummation
NASA Astrophysics Data System (ADS)
Blumenthal, Benjamin J.; Zhan, Hongbin
2016-08-01
We have derived a rapidly computed analytical solution for drawdown caused by a partially or fully penetrating directional wellbore (vertical, horizontal, or slant) via the Green's function method. The mathematical model assumes an anisotropic, homogeneous, confined, box-shaped aquifer. Any dimension of the box can have one of six possible boundary conditions: 1) both sides no-flux; 2) one side no-flux, one side constant-head; 3) both sides constant-head; 4) one side no-flux, one side free; 5) one side constant-head, one side free; 6) both sides free. The solution has been optimized for rapid computation via Poisson resummation, derivation of convergence rates, and numerical optimization of integration techniques. Upon application of the Poisson resummation method, we derived two sets of solutions with inverse convergence rates, namely an early-time rapidly convergent series (solution-A) and a late-time rapidly convergent series (solution-B). From this work we were able to link the Green's function method (solution-B) back to image well theory (solution-A). We then derived an equation defining when the convergence rate between solution-A and solution-B is the same, which we termed the switch time. Utilizing the more rapidly convergent solution at the appropriate time, we obtained rapid convergence at all times. We have also shown that one may simplify each of the three infinite series for the three-dimensional solution to 11 terms and still maintain a maximum relative error of less than 10^-14.
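The interplay between early- and late-time series under Poisson resummation can be illustrated with the classical Jacobi theta function, whose resummed form converges fastest exactly where the direct form converges slowest. This is an analogy for the solution-A/solution-B switch, not the aquifer solution itself.

```python
import math

def theta_direct(t, terms=50):
    """Late-time rapidly convergent series: sum over n of exp(-pi n^2 t)."""
    return 1.0 + 2.0 * sum(math.exp(-math.pi * n * n * t)
                           for n in range(1, terms + 1))

def theta_resummed(t, terms=50):
    """Early-time form obtained by Poisson resummation of the same sum:
    theta(t) = t**(-1/2) * theta(1/t)."""
    return theta_direct(1.0 / t, terms) / math.sqrt(t)

# The two forms agree everywhere, but each converges fast on the opposite
# side of the "switch time" t = 1, where their convergence rates coincide.
vals = [(theta_direct(t), theta_resummed(t)) for t in (0.2, 1.0, 5.0)]
```

Evaluating whichever form converges faster at the current time is the same strategy as switching between the image-well series and the eigenfunction series at the derived switch time.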
Bramness, Jørgen G; Walby, Fredrik A; Morken, Gunnar; Røislien, Jo
2015-08-01
Seasonal variation in the number of suicides has long been acknowledged. It has been suggested that this seasonality has declined in recent years, but studies have generally used statistical methods incapable of confirming this. We examined all suicides occurring in Norway during 1969-2007 (more than 20,000 suicides in total) to establish whether seasonality decreased over time. Fitting of additive Fourier Poisson time-series regression models allowed for formal testing of a possible linear decrease in seasonality, or a reduction at a specific point in time, while adjusting for a possible smooth nonlinear long-term change without having to categorize time into discrete yearly units. The models were compared using Akaike's Information Criterion and analysis of variance. A model with a seasonal pattern was significantly superior to a model without one. There was a reduction in seasonality during the period. Both the model assuming a linear decrease in seasonality and the model assuming a change at a specific point in time were superior to a model assuming constant seasonality, thus confirming by formal statistical testing that the magnitude of the seasonality in suicides has diminished. The additive Fourier Poisson time-series regression model would also be useful for studying other temporal phenomena with seasonal components. © The Author 2015. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
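The core of such a model, Poisson regression with Fourier seasonality terms alongside a smooth trend, can be sketched on simulated monthly counts. This is a plain iteratively reweighted least squares (IRLS) fit; the covariates, coefficients, and data are illustrative, not those of the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated monthly counts over 20 years with an annual harmonic.
t = np.arange(240)
X = np.column_stack([np.ones(240),
                     t / 240.0,                   # smooth long-term trend
                     np.sin(2 * np.pi * t / 12),  # annual Fourier pair
                     np.cos(2 * np.pi * t / 12)])
beta_true = np.array([3.0, -0.3, 0.2, 0.1])
y = rng.poisson(np.exp(X @ beta_true))

def poisson_irls(X, y, iters=25):
    """Fit a log-linear Poisson regression by IRLS. The sin/cos columns
    are the Fourier terms that carry the seasonal component."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        mu = np.exp(X @ beta)
        w = mu                          # Poisson working weights
        z = X @ beta + (y - mu) / mu    # working response
        beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * z))
    return beta

beta = poisson_irls(X, y)
```

Testing whether seasonality declines then amounts to letting the Fourier amplitudes vary with time and comparing the fitted models, as done in the study with AIC and analysis of variance.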
Dynamical density delay maps: simple, new method for visualising the behaviour of complex systems
2014-01-01
Background Physiologic signals, such as cardiac interbeat intervals, exhibit complex fluctuations. However, capturing important dynamical properties, including nonstationarities may not be feasible from conventional time series graphical representations. Methods We introduce a simple-to-implement visualisation method, termed dynamical density delay mapping (“D3-Map” technique) that provides an animated representation of a system’s dynamics. The method is based on a generalization of conventional two-dimensional (2D) Poincaré plots, which are scatter plots where each data point, x(n), in a time series is plotted against the adjacent one, x(n + 1). First, we divide the original time series, x(n) (n = 1,…, N), into a sequence of segments (windows). Next, for each segment, a three-dimensional (3D) Poincaré surface plot of x(n), x(n + 1), h[x(n),x(n + 1)] is generated, in which the third dimension, h, represents the relative frequency of occurrence of each (x(n),x(n + 1)) point. This 3D Poincaré surface is then chromatised by mapping the relative frequency h values onto a colour scheme. We also generate a colourised 2D contour plot from each time series segment using the same colourmap scheme as for the 3D Poincaré surface. Finally, the original time series graph, the colourised 3D Poincaré surface plot, and its projection as a colourised 2D contour map for each segment, are animated to create the full “D3-Map.” Results We first exemplify the D3-Map method using the cardiac interbeat interval time series from a healthy subject during sleeping hours. The animations uncover complex dynamical changes, such as transitions between states, and the relative amount of time the system spends in each state. We also illustrate the utility of the method in detecting hidden temporal patterns in the heart rate dynamics of a patient with atrial fibrillation. The videos, as well as the source code, are made publicly available. 
Conclusions Animations based on density delay maps provide a new way of visualising dynamical properties of complex systems not apparent in time series graphs or standard Poincaré plot representations. Trainees in a variety of fields may find the animations useful as illustrations of fundamental but challenging concepts, such as nonstationarity and multistability. For investigators, the method may facilitate data exploration. PMID:24438439
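The windowed density construction underlying a D3-Map can be sketched as follows; the synthetic interbeat-like series, bin count, and window size are arbitrary illustrative choices.

```python
import numpy as np

def density_delay_map(x, bins=50):
    """Relative-frequency histogram over consecutive pairs (x[n], x[n+1]):
    the quantity h that becomes the third dimension (and colour scale)
    of the 3D Poincare surface described above."""
    h, xedges, yedges = np.histogram2d(x[:-1], x[1:], bins=bins)
    return h / h.sum(), xedges, yedges

# One map per non-overlapping window; animating these frames in sequence
# is the "D3-Map" idea.
rng = np.random.default_rng(3)
rr = 0.8 + 0.05 * rng.standard_normal(5000)   # synthetic RR-interval series
window = 1000
frames = [density_delay_map(rr[i:i + window])[0]
          for i in range(0, len(rr) - window + 1, window)]
```

Each frame is a normalized density, so transitions between dynamical states show up as mass moving between regions of the delay plane from one frame to the next.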
Poisson-event-based analysis of cell proliferation.
Summers, Huw D; Wills, John W; Brown, M Rowan; Rees, Paul
2015-05-01
A protocol for the assessment of cell proliferation dynamics is presented. This is based on the measurement of cell division events and their subsequent analysis using Poisson probability statistics. Detailed analysis of proliferation dynamics in heterogeneous populations requires single cell resolution within a time series analysis and so is technically demanding to implement. Here, we show that by focusing on the events during which cells undergo division, rather than directly on the cells themselves, a simplified image acquisition and analysis protocol can be followed, which maintains single cell resolution and reports on the key metrics of cell proliferation. The technique is demonstrated using a microscope with 1.3 μm spatial resolution to track mitotic events within A549 and BEAS-2B cell lines, over a period of up to 48 h. Automated image processing of the bright field images using standard algorithms within the ImageJ software toolkit yielded 87% accurate recording of the manually identified, temporal, and spatial positions of the mitotic event series. Analysis of the statistics of the interevent times (i.e., times between observed mitoses in a field of view) showed that cell division conformed to a nonhomogeneous Poisson process in which the rate of occurrence of mitotic events, λ, increased exponentially over time, and provided values of the mean intermitotic time of 21.1 ± 1.2 h for the A549 cells and 25.0 ± 1.1 h for the BEAS-2B cells. Comparison of the mitotic event series for the BEAS-2B cell line to that predicted by random Poisson statistics indicated that temporal synchronisation of the cell division process was occurring within 70% of the population and that this could be increased to 85% through serum starvation of the cell culture. © 2015 International Society for Advancement of Cytometry.
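A nonhomogeneous Poisson event series with an exponentially increasing rate, of the kind the mitotic statistics above conform to, can be simulated by thinning. The rate constants and observation window are illustrative, not fitted values from the paper.

```python
import math
import random

random.seed(7)

def nhpp_event_times(rate0, growth, t_end):
    """Simulate a nonhomogeneous Poisson process with rate
    lambda(t) = rate0 * exp(growth * t) by thinning: draw candidate
    events at the maximal rate lambda(t_end), then accept each with
    probability lambda(t) / lambda_max."""
    lam_max = rate0 * math.exp(growth * t_end)
    t, events = 0.0, []
    while True:
        t += random.expovariate(lam_max)        # next candidate event
        if t > t_end:
            return events
        if random.random() < rate0 * math.exp(growth * t) / lam_max:
            events.append(t)                     # accepted (real) event

# Events per hour over a 48 h observation window.
events = nhpp_event_times(rate0=1.0, growth=0.05, t_end=48.0)
```

Comparing the interevent-time statistics of such a simulated series against observed mitoses is one way to test the nonhomogeneous Poisson description reported above.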
1976-07-01
Purdue University, Department of Statistics, Division of Mathematical Sciences: On Subset Selection Procedures for Poisson Processes and Some... (Mimeograph Series #457, July 1976). This research was supported by the Office of Naval Research, Washington, DC, under Contract N00014-75-C-0455 at Purdue University.
NASA Technical Reports Server (NTRS)
Mullenmeister, Paul
1988-01-01
The quasi-geostrophic omega-equation in flux form is developed as an example of a Poisson problem over a spherical shell. Solutions of this equation are obtained by applying a two-parameter Chebyshev solver in vector layout for CDC 200 series computers. The performance of this vectorized algorithm greatly exceeds the performance of its scalar analog. The algorithm generates solutions of the omega-equation which are compared with the omega fields calculated with the aid of the mass continuity equation.
Harold S.J. Zald; Janet L. Ohmann; Heather M. Roberts; Matthew J. Gregory; Emilie B. Henderson; Robert J. McGaughey; Justin Braaten
2014-01-01
This study investigated how lidar-derived vegetation indices, disturbance history from Landsat time series (LTS) imagery, plot location accuracy, and plot size influenced accuracy of statistical spatial models (nearest-neighbor imputation maps) of forest vegetation composition and structure. Nearest-neighbor (NN) imputation maps were developed for 539,000 ha in the...
NASA Astrophysics Data System (ADS)
Atkin, Keith
2016-11-01
This paper shows how very simple circuitry attached to an Arduino microcontroller can be used for the measurement of both frequency and amplitude of a sinusoidal signal. It is also shown how the addition of a readily available software package, MakerPlot, can facilitate the display and investigation of resonance curves for a series LCR circuit.
Warren B. Cohen; Hans-Erik Andersen; Sean P. Healey; Gretchen G. Moisen; Todd A. Schroeder; Christopher W. Woodall; Grant M. Domke; Zhiqiang Yang; Robert E. Kennedy; Stephen V. Stehman; Curtis Woodcock; Jim Vogelmann; Zhe Zhu; Chengquan Huang
2015-01-01
We are developing a system that provides temporally consistent biomass estimates for national greenhouse gas inventory reporting to the United Nations Framework Convention on Climate Change. Our model-assisted estimation framework relies on remote sensing to scale from plot measurements to lidar strip samples, to Landsat time series-based maps. As a demonstration, new...
Bayesian dynamic modeling of time series of dengue disease case counts
López-Quílez, Antonio; Torres-Prieto, Alexander
2017-01-01
The aim of this study is to model the association between weekly time series of dengue case counts and meteorological variables in a high-incidence city of Colombia, applying Bayesian hierarchical dynamic generalized linear models over the period January 2008 to August 2015. Additionally, we evaluate the models' short-term performance for predicting dengue cases. The methodology uses dynamic Poisson log-link models including constant or time-varying coefficients for the meteorological variables. Calendar effects were modeled using constant or first- or second-order random walk time-varying coefficients. The meteorological variables were modeled using constant coefficients and first-order random walk time-varying coefficients. We applied Markov chain Monte Carlo simulations for parameter estimation, and the deviance information criterion (DIC) for model selection. We assessed the short-term predictive performance of the selected final model at several time points within the study period using the mean absolute percentage error. The best model included first-order random walk time-varying coefficients for both the calendar trend and the meteorological variables. Besides the computational challenges, interpreting the results requires a complete analysis of the dengue time series with respect to the parameter estimates of the meteorological effects. We found small mean absolute percentage errors for one- or two-week out-of-sample predictions at most prediction points, associated with low-volatility periods in the dengue counts. We discuss the advantages and limitations of dynamic Poisson models for studying the association between time series of dengue disease and meteorological variables. 
The key conclusion of the study is that dynamic Poisson models account for the dynamic nature of the variables involved in the modeling of time series of dengue disease, producing useful models for decision-making in public health. PMID:28671941
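The observation/state structure of such a dynamic Poisson model can be sketched as follows. This simulates counts from a log-link model whose covariate coefficient follows a first-order random walk; the covariate, parameters, and seed are illustrative stand-ins, and no MCMC inference is attempted.

```python
import math
import random

def simulate_dynamic_poisson(x, beta0=0.2, sigma=0.05, intercept=1.5, seed=1):
    """Simulate weekly counts from a dynamic Poisson log-link model:
    y_t ~ Poisson(mu_t),  log(mu_t) = intercept + beta_t * x_t,
    with beta_t following a first-order random walk."""
    rng = random.Random(seed)
    beta, betas, counts = beta0, [], []
    for xt in x:
        beta += rng.gauss(0.0, sigma)        # RW(1) evolution of the coefficient
        mu = math.exp(intercept + beta * xt)
        # Poisson draw via CDF inversion (fine for modest mu)
        u, k, p = rng.random(), 0, math.exp(-mu)
        cdf = p
        while u > cdf:
            k += 1
            p *= mu / k
            cdf += p
        betas.append(beta)
        counts.append(k)
    return betas, counts

# illustrative weekly covariate (a stand-in for a meteorological series)
x = [math.sin(2 * math.pi * t / 52.0) for t in range(104)]
betas, counts = simulate_dynamic_poisson(x)
```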
Ait Kaci Azzou, S; Larribe, F; Froda, S
2016-10-01
In Ait Kaci Azzou et al. (2015) we introduced an Importance Sampling (IS) approach for estimating the demographic history of a sample of DNA sequences, the skywis plot. More precisely, we proposed a new nonparametric estimate of a population size that changes over time. We showed on simulated data that the skywis plot can work well in typical situations where the effective population size does not undergo very steep changes. In this paper, we introduce an iterative procedure which extends the previous method and gives good estimates under such rapid variations. In the iterative calibrated skywis plot we approximate the effective population size by a piecewise constant function, whose values are re-estimated at each step. These piecewise constant functions are used to generate the waiting times of nonhomogeneous Poisson processes related to a coalescent process with mutation under a variable population size model. Moreover, the present IS procedure is based on a modified version of the Stephens and Donnelly (2000) proposal distribution. Finally, we apply the iterative calibrated skywis plot method to a simulated data set from a rapidly expanding exponential model, and we show that the method based on this new IS strategy correctly reconstructs the demographic history. Copyright © 2016. Published by Elsevier Inc.
NASA Astrophysics Data System (ADS)
Pham, T. D.
2016-12-01
Recurrence plots display binary texture of time series from dynamical systems with single dots and line structures. Using fuzzy recurrence plots, recurrences of the phase-space states can be visualized as grayscale texture, which is more informative for pattern analysis. The proposed method replaces the crucial similarity threshold required by symmetrical recurrence plots with the number of cluster centers, where the estimate of the latter parameter is less critical than the estimate of the former.
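A minimal sketch of the idea, assuming a standard fuzzy c-means clustering of the embedded states and a max-min composition of membership grades; the embedding and cluster settings are illustrative choices, not the paper's.

```python
import numpy as np

def fuzzy_recurrence_plot(x, dim=2, tau=1, n_clusters=3, m=2.0, iters=50, seed=0):
    """Grayscale 'fuzzy recurrence plot': embed the series, soft-cluster the
    phase-space states with fuzzy c-means, then take R[i, j] as the max-min
    composition of the membership grades of states i and j.
    The number of clusters replaces the usual recurrence threshold."""
    x = np.asarray(x, float)
    n = len(x) - (dim - 1) * tau
    X = np.column_stack([x[k * tau:k * tau + n] for k in range(dim)])  # embedding
    rng = np.random.default_rng(seed)
    U = rng.random((n, n_clusters))
    U /= U.sum(axis=1, keepdims=True)                  # random fuzzy partition
    for _ in range(iters):                             # standard FCM updates
        W = U ** m
        C = (W.T @ X) / W.sum(axis=0)[:, None]         # cluster centres
        D = np.linalg.norm(X[:, None, :] - C[None, :, :], axis=2) + 1e-12
        U = 1.0 / (D ** (2 / (m - 1)))
        U /= U.sum(axis=1, keepdims=True)
    # max-min fuzzy composition gives a grayscale, symmetric matrix in [0, 1]
    return np.minimum(U[:, None, :], U[None, :, :]).max(axis=2)

rp = fuzzy_recurrence_plot(np.sin(np.linspace(0, 8 * np.pi, 120)))
```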
CI2 for creating and comparing confidence-intervals for time-series bivariate plots.
Mullineaux, David R
2017-02-01
Currently no method exists for calculating and comparing the confidence-intervals (CI) for the time-series of a bivariate plot. The study's aim was to develop 'CI2' as a method to calculate the CI on time-series bivariate plots, and to identify if the CI between two bivariate time-series overlap. The test data were the knee and ankle angles from 10 healthy participants running on a motorised standard-treadmill and non-motorised curved-treadmill. For a recommended 10+ trials, CI2 involved calculating 95% confidence-ellipses at each time-point, then taking as the CI the points on the ellipses that were perpendicular to the direction vector between the means of two adjacent time-points. Consecutive pairs of CI created convex quadrilaterals, and any overlap of these quadrilaterals at the same time or ±1 frame as a time-lag calculated using cross-correlations, indicated where the two time-series differed. CI2 showed no group differences between left and right legs on both treadmills, but the same legs between treadmills for all participants showed differences of less knee extension on the curved-treadmill before heel-strike. To improve and standardise the use of CI2 it is recommended to remove outlier time-series, use 95% confidence-ellipses, and scale the ellipse by the fixed Chi-square value as opposed to the sample-size dependent F-value. For practical use, and to aid in standardisation or future development of CI2, Matlab code is provided. CI2 provides an effective method to quantify the CI of bivariate plots, and to explore the differences in CI between two bivariate time-series. Copyright © 2016 Elsevier B.V. All rights reserved.
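The per-time-point confidence-ellipse step can be sketched as below, using the fixed chi-square scaling the authors recommend; the helper name and sample data are illustrative, and the quadrilateral-overlap test of the full method is omitted.

```python
import numpy as np

CHI2_2DF_95 = 5.991  # fixed chi-square value for a 95% ellipse, 2 dof

def confidence_ellipse(samples, n_points=100):
    """95% confidence ellipse for the mean of bivariate samples (trials x 2),
    scaled by the fixed chi-square value rather than the sample-size
    dependent F-value. Returns the boundary as an (n_points, 2) array."""
    samples = np.asarray(samples, float)
    mean = samples.mean(axis=0)
    cov = np.cov(samples.T) / len(samples)     # covariance of the mean
    vals, vecs = np.linalg.eigh(cov)
    t = np.linspace(0, 2 * np.pi, n_points)
    circle = np.column_stack([np.cos(t), np.sin(t)])
    return mean + circle * np.sqrt(CHI2_2DF_95 * vals) @ vecs.T

# illustrative: 10 trials of one (knee angle, ankle angle) time point
rng = np.random.default_rng(0)
samples = rng.normal(size=(10, 2))
ell = confidence_ellipse(samples)
```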
A GPU accelerated and error-controlled solver for the unbounded Poisson equation in three dimensions
NASA Astrophysics Data System (ADS)
Exl, Lukas
2017-12-01
An efficient solver for the three dimensional free-space Poisson equation is presented. The underlying numerical method is based on finite Fourier series approximation. While the error of all involved approximations can be fully controlled, the overall computation error is driven by the convergence of the finite Fourier series of the density. For smooth and fast-decaying densities the proposed method will be spectrally accurate. The method scales with O(N log N) operations, where N is the total number of discretization points in the Cartesian grid. The majority of the computational costs come from fast Fourier transforms (FFT), which makes it ideal for GPU computation. Several numerical computations on CPU and GPU validate the method and show efficiency and convergence behavior. Tests are performed using the Vienna Scientific Cluster 3 (VSC3). A free MATLAB implementation for CPU and GPU is provided to the interested community.
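The Fourier-series idea behind such solvers can be illustrated with a periodic simplification; the paper's solver extends the same spectral approach to free-space boundary conditions with controlled error, which requires extra machinery omitted here.

```python
import numpy as np

def poisson_periodic_3d(f, L=2 * np.pi):
    """Spectral solve of  laplacian(u) = f  on a periodic cube of side L,
    returning the zero-mean solution. In Fourier space -k^2 u_hat = f_hat."""
    n = f.shape[0]
    k = 2 * np.pi * np.fft.fftfreq(n, d=L / n)
    kx, ky, kz = np.meshgrid(k, k, k, indexing="ij")
    k2 = kx**2 + ky**2 + kz**2
    fh = np.fft.fftn(f)
    fh[0, 0, 0] = 0.0          # fix the nullspace (zero-mean solution)
    k2[0, 0, 0] = 1.0          # avoid division by zero at the k = 0 mode
    return np.real(np.fft.ifftn(-fh / k2))

# verification: u = sin(x) sin(y) sin(z)  satisfies  laplacian(u) = -3 u
n = 32
x = np.linspace(0, 2 * np.pi, n, endpoint=False)
X, Y, Z = np.meshgrid(x, x, x, indexing="ij")
u_exact = np.sin(X) * np.sin(Y) * np.sin(Z)
u = poisson_periodic_3d(-3.0 * u_exact)
```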
Chen, Junning; Suenaga, Hanako; Hogg, Michael; Li, Wei; Swain, Michael; Li, Qing
2016-01-01
Despite their considerable importance to biomechanics, no existing methods are available to directly measure the apparent Poisson's ratio and friction coefficient of oral mucosa. This study aimed to develop an inverse procedure to determine these two biomechanical parameters by combining an in vivo experiment measuring the contact pressure between a partial denture and the underlying mucosa with nonlinear finite element (FE) analysis and surrogate response surface (RS) modelling. First, the in vivo denture-mucosa contact pressure was measured by a tactile electronic sensing sheet. Second, a 3D FE model was constructed based on the patient CT images. Third, a range of apparent Poisson's ratios and coefficients of friction from the literature was considered as the design variables in a series of FE runs for constructing a RS surrogate model. Finally, the discrepancy between the computed in silico and measured in vivo results was minimized to identify the best-matching Poisson's ratio and coefficient of friction. The established non-invasive methodology was demonstrated to be effective in identifying these biomechanical parameters of oral mucosa and can potentially be used for determining the biomaterial properties of other soft biological tissues.
Quantized Algebras of Functions on Homogeneous Spaces with Poisson Stabilizers
NASA Astrophysics Data System (ADS)
Neshveyev, Sergey; Tuset, Lars
2012-05-01
Let G be a simply connected semisimple compact Lie group with standard Poisson structure, K a closed Poisson-Lie subgroup, and 0 < q < 1. We study a quantization C(G_q/K_q) of the algebra of continuous functions on G/K. Using results of Soibelman and Dijkhuizen-Stokman we classify the irreducible representations of C(G_q/K_q) and obtain a composition series for C(G_q/K_q). We describe closures of the symplectic leaves of G/K, refining the well-known description in the case of flag manifolds in terms of the Bruhat order. We then show that the same rules describe the topology on the spectrum of C(G_q/K_q). Next we show that the family of C*-algebras C(G_q/K_q), 0 < q ≤ 1, has a canonical structure of a continuous field of C*-algebras and provides a strict deformation quantization of the Poisson algebra ℂ[G/K]. Finally, extending a result of Nagy, we show that C(G_q/K_q) is canonically KK-equivalent to C(G/K).
Segundo, J P; Sugihara, G; Dixon, P; Stiber, M; Bersier, L F
1998-12-01
This communication describes the new information that may be obtained by applying nonlinear analytical techniques to neurobiological time-series. Specifically, we consider the sequence of interspike intervals Ti (the "timing") of trains recorded from synaptically inhibited crayfish pacemaker neurons. As reported earlier, different postsynaptic spike train forms (sets of timings with shared properties) are generated by varying the average rate and/or pattern (implying interval dispersions and sequences) of presynaptic spike trains. When the presynaptic train is Poisson (independent exponentially distributed intervals), the form is "Poisson-driven" (unperturbed and lengthened intervals succeed each other irregularly). When presynaptic trains are pacemaker (intervals practically equal), forms are either "p:q locked" (intervals repeat periodically), "intermittent" (mostly almost locked but disrupted irregularly), "phase walk throughs" (intermittencies with briefer regular portions), or "messy" (difficult to predict or describe succinctly). Messy trains are either "erratic" (some intervals natural and others lengthened irregularly) or "stammerings" (intervals are integral multiples of presynaptic intervals). The individual spike train forms were analysed using attractor reconstruction methods based on the lagged coordinates provided by successive intervals from the time-series Ti. Numerous models were evaluated in terms of their predictive performance by a trial-and-error procedure: the most successful model was taken as best reflecting the true nature of the system's attractor. Each form was characterized in terms of its dimensionality, nonlinearity and predictability. (1) The dimensionality of the underlying dynamical attractor was estimated by the minimum number of variables (coordinates Ti) required to model acceptably the system's dynamics, i.e. by the system's degrees of freedom. 
Each model tested was based on a different number of Ti; the smallest number whose predictions were judged successful provided the best integer approximation of the attractor's true dimension (not necessarily an integer). Dimensionalities from three to five provided acceptable fits. (2) The degree of nonlinearity was estimated by: (i) comparing the correlations between experimental results and data from linear and nonlinear models, and (ii) tuning model nonlinearity via a distance-weighting function and identifying either a local or a global neighborhood size. Lockings were compatible with linear models and stammerings were marginal; nonlinear models were best for Poisson-driven, intermittent and erratic forms. (3) Finally, prediction accuracy was plotted against increasingly long sequences of intervals forecast: the accuracies for Poisson-driven, locked and stammering forms were invariant, revealing irregularities due to uncorrelated noise, but those of intermittent and messy erratic forms decayed rapidly, indicating an underlying deterministic process. The excellent reconstructions possible for messy erratic and for some intermittent forms are especially significant because of their relatively low dimensionality (around 4), high degree of nonlinearity and prediction decay with time. This is characteristic of chaotic systems, and provides evidence that nonlinear couplings between relatively few variables are the major source of the apparent complexity seen in these cases. This demonstration of different dimensions, degrees of nonlinearity and predictabilities provides rigorous support for the categorization of different synaptically driven discharge forms proposed earlier on the basis of more heuristic criteria. This has significant implications. (1) It demonstrates that heterogeneous postsynaptic forms can indeed be induced by manipulating a few presynaptic variables. 
(2) Each presynaptic timing induces a form with characteristic dimensionality, thus breaking up the preparation into subsystems such that the physical variables in each operate as one
Evaluation of lattice sums by the Poisson sum formula
NASA Technical Reports Server (NTRS)
Ray, R. D.
1975-01-01
The Poisson sum formula was applied to the problem of summing pairwise interactions between an observer molecule and a semi-infinite regular array of solid state molecules. The transformed sum is often much more rapidly convergent than the original sum, and forms a Fourier series in the solid surface coordinates. The method is applicable to a variety of solid state structures and functional forms of the pairwise potential. As an illustration of the method, the electric field above the (100) face of the CsCl structure is calculated and compared to earlier results obtained by direct summation.
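The speed-up the abstract describes can be seen in a one-dimensional Gaussian example, where the Poisson sum formula exchanges a slowly convergent direct sum for a rapidly convergent Fourier sum.

```python
import math

def gauss_sum(t, terms=50):
    """Truncated lattice sum  S(t) = sum over n in Z of exp(-pi * n^2 * t)."""
    return 1.0 + 2.0 * sum(math.exp(-math.pi * n * n * t) for n in range(1, terms))

# Poisson summation gives the identity  S(t) = t**-0.5 * S(1/t).
# For small t the direct sum needs on the order of 1/sqrt(t) terms, while the
# transformed (Fourier) sum converges after a couple of terms -- the same
# effect exploited for pairwise lattice interactions in the paper.
t = 0.01
direct = gauss_sum(t, terms=50)                            # slowly convergent side
transformed = gauss_sum(1.0 / t, terms=3) / math.sqrt(t)   # rapidly convergent side
```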
NASA Astrophysics Data System (ADS)
Casdagli, M. C.
1997-09-01
We show that recurrence plots (RPs) give detailed characterizations of time series generated by dynamical systems driven by slowly varying external forces. For deterministic systems we show that RPs of the time series can be used to reconstruct the RP of the driving force if it varies sufficiently slowly. If the driving force is one-dimensional, its functional form can then be inferred up to an invertible coordinate transformation. The same results hold for stochastic systems if the RP of the time series is suitably averaged and transformed. These results are used to investigate the nonlinear prediction of time series generated by dynamical systems driven by slowly varying external forces. We also consider the problem of detecting a small change in the driving force, and propose a surrogate data technique for assessing statistical significance. Numerically simulated time series and a time series of respiration rates recorded from a subject with sleep apnea are used as illustrative examples.
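A basic (binary, thresholded) recurrence plot of the kind analysed here can be constructed as follows; the embedding dimension, delay, and threshold are illustrative choices.

```python
import numpy as np

def recurrence_plot(x, dim=3, tau=1, eps=None):
    """Binary recurrence plot of a scalar series: delay-embed, then mark
    pairs of states closer than a threshold eps (default: 10% of the
    maximum phase-space distance)."""
    x = np.asarray(x, float)
    n = len(x) - (dim - 1) * tau
    # delay embedding: rows are reconstructed phase-space states
    X = np.column_stack([x[k * tau:k * tau + n] for k in range(dim)])
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    if eps is None:
        eps = 0.1 * D.max()
    return (D <= eps).astype(int)

R = recurrence_plot(np.sin(np.linspace(0, 6 * np.pi, 200)))
```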
DOE Office of Scientific and Technical Information (OSTI.GOV)
La Russa, D
Purpose: The purpose of this project is to develop a robust method of parameter estimation for a Poisson-based TCP model using Bayesian inference. Methods: Bayesian inference was performed using the PyMC3 probabilistic programming framework written in Python. A Poisson-based TCP regression model that accounts for clonogen proliferation was fit to observed rates of local relapse as a function of equivalent dose in 2 Gy fractions for a population of 623 stage-I non-small-cell lung cancer patients. The Slice Markov Chain Monte Carlo sampling algorithm was used to sample the posterior distributions, and was initiated using the maximum of the posterior distributions found by optimization. The calculation of TCP with each sample step required integration over the free parameter α, which was performed using an adaptive 24-point Gauss-Legendre quadrature. Convergence was verified via inspection of the trace plot and posterior distribution for each of the fit parameters, as well as with comparisons of the most probable parameter values with their respective maximum likelihood estimates. Results: Posterior distributions for α, the standard deviation of α (σ), the average tumour cell-doubling time (Td), and the repopulation delay time (Tk), were generated assuming α/β = 10 Gy and a fixed clonogen density of 10^7 cm^-3. Posterior predictive plots generated from samples from these posterior distributions are in excellent agreement with the observed rates of local relapse used in the Bayesian inference. The most probable values of the model parameters also agree well with maximum likelihood estimates. Conclusion: A robust method of performing Bayesian inference of TCP data using a complex TCP model has been established.
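The core Poisson TCP calculation with population averaging over α by Gauss-Legendre quadrature can be sketched as below. All parameter values are illustrative, and the proliferation terms (Td, Tk) of the full model are omitted.

```python
import numpy as np

def poisson_tcp(dose, alpha_mean=0.30, alpha_sd=0.07, n_clonogens=1e7):
    """Population-averaged Poisson TCP:  TCP = exp(-N * exp(-alpha * D)),
    averaged over a normal distribution of alpha via 24-point Gauss-Legendre
    quadrature. Parameter values are illustrative only; the full model in the
    abstract also accounts for clonogen proliferation."""
    nodes, weights = np.polynomial.legendre.leggauss(24)
    # map the [-1, 1] quadrature interval onto alpha_mean +/- 4 sd
    a = alpha_mean + 4 * alpha_sd * nodes
    w = 4 * alpha_sd * weights
    pdf = np.exp(-0.5 * ((a - alpha_mean) / alpha_sd) ** 2) \
        / (alpha_sd * np.sqrt(2 * np.pi))
    tcp = np.exp(-n_clonogens * np.exp(-a * dose))
    return float(np.sum(w * pdf * tcp))

t40 = poisson_tcp(40.0)   # low dose: little tumour control
t80 = poisson_tcp(80.0)   # high dose: control probability approaches 1
```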
Xing, Jian; Burkom, Howard; Tokars, Jerome
2011-12-01
Automated surveillance systems require statistical methods to recognize increases in visit counts that might indicate an outbreak. In prior work we presented methods to enhance the sensitivity of C2, a commonly used time series method. In this study, we compared the enhanced C2 method with five regression models. We used emergency department chief complaint data from the US CDC BioSense surveillance system, aggregated by city (a total of 206 hospitals in 16 cities) during 5/2008-4/2009. Data for six syndromes (asthma, gastrointestinal, nausea and vomiting, rash, respiratory, and influenza-like illness) were used and were stratified by mean count (1-19, 20-49, ≥50 per day) into 14 syndrome-count categories. We compared the sensitivity for detecting single-day, artificially added increases in syndrome counts. Four modifications of the C2 time series method and five regression models (two linear and three Poisson) were tested. A constant alert rate of 1% was used for all methods. Among the regression models tested, a Poisson model controlling for the logarithm of total visits (i.e., visits both meeting and not meeting a syndrome definition), day of week, and 14-day time period was best. In 6 of the 14 syndrome-count categories, time series and regression methods produced approximately the same sensitivity (<5% difference); in six categories, the regression method had higher sensitivity (range 6-14% improvement); and in two categories the time series method had higher sensitivity. When automated data are aggregated to the city level, a Poisson regression model that controls for total visits produces the best overall sensitivity for detecting artificially added visit counts. This improvement was achieved without increasing the alert rate, which was held constant at 1% for all methods. These findings will improve our ability to detect outbreaks in automated surveillance system data. Published by Elsevier Inc.
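A Poisson regression with an offset for the logarithm of total visits, of the kind the study found best, can be sketched with a hand-rolled IRLS fit; the covariates and data below are synthetic stand-ins, not BioSense data.

```python
import numpy as np

def poisson_irls(X, y, offset, iters=25):
    """Poisson regression with a log link and an offset (here, log total
    visits), fitted by iteratively reweighted least squares."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        eta = X @ beta + offset
        mu = np.exp(eta)
        W = mu                                   # Poisson IRLS weights
        z = eta - offset + (y - mu) / mu         # working response
        beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))
    return beta

# synthetic daily data: syndrome counts scale with total visits,
# with a weekend effect as an illustrative day-of-week covariate
rng = np.random.default_rng(0)
total = rng.integers(200, 400, size=300).astype(float)
dow = rng.integers(0, 7, size=300)
X = np.column_stack([np.ones(300), (dow >= 5).astype(float)])
true_beta = np.array([-3.0, 0.4])
y = rng.poisson(np.exp(X @ true_beta + np.log(total)))
beta_hat = poisson_irls(X, y, offset=np.log(total))
```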
Regenerating time series from ordinal networks.
McCullough, Michael; Sakellariou, Konstantinos; Stemler, Thomas; Small, Michael
2017-03-01
Recently proposed ordinal networks not only afford novel methods of nonlinear time series analysis but also constitute stochastic approximations of the deterministic flow time series from which the network models are constructed. In this paper, we construct ordinal networks from discrete sampled continuous chaotic time series and then regenerate new time series by taking random walks on the ordinal network. We then investigate the extent to which the dynamics of the original time series are encoded in the ordinal networks and retained through the process of regenerating new time series by using several distinct quantitative approaches. First, we use recurrence quantification analysis on traditional recurrence plots and order recurrence plots to compare the temporal structure of the original time series with random walk surrogate time series. Second, we estimate the largest Lyapunov exponent from the original time series and investigate the extent to which this invariant measure can be estimated from the surrogate time series. Finally, estimates of correlation dimension are computed to compare the topological properties of the original and surrogate time series dynamics. Our findings show that ordinal networks constructed from univariate time series data constitute stochastic models which approximate important dynamical properties of the original systems.
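The construction and regeneration steps can be sketched as follows: ordinal patterns become nodes, transition counts become weighted edges, and a random walk on the network regenerates a symbolic series. The window length and test series are illustrative.

```python
import numpy as np
from collections import defaultdict

def ordinal_network(x, w=3):
    """Map each window of length w to its ordinal pattern (rank order),
    then count transitions between consecutive patterns."""
    x = np.asarray(x, float)
    pats = [tuple(np.argsort(x[i:i + w])) for i in range(len(x) - w + 1)]
    edges = defaultdict(int)
    for a, b in zip(pats, pats[1:]):
        edges[(a, b)] += 1
    return pats, dict(edges)

def random_walk(edges, start, steps, seed=0):
    """Regenerate a symbolic series by a random walk on the ordinal network,
    choosing successors with probability proportional to edge counts."""
    rng = np.random.default_rng(seed)
    out, node = [start], start
    for _ in range(steps):
        succ = [(b, c) for (a, b), c in edges.items() if a == node]
        if not succ:
            break                                 # dead end: stop the walk
        nodes, counts = zip(*succ)
        node = nodes[rng.choice(len(nodes), p=np.array(counts) / sum(counts))]
        out.append(node)
    return out

x = np.sin(np.linspace(0, 20 * np.pi, 500)) \
    + 0.05 * np.random.default_rng(1).standard_normal(500)
pats, edges = ordinal_network(x, w=3)
walk = random_walk(edges, pats[0], steps=100)
```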
AnisoVis: a MATLAB™ toolbox for the visualisation of elastic anisotropy
NASA Astrophysics Data System (ADS)
Healy, D.; Timms, N.; Pearce, M. A.
2016-12-01
The elastic properties of rocks and minerals vary with direction, and this has significant consequences for their physical response to acoustic waves and natural or imposed stresses. This anisotropy of elasticity is well described mathematically by 4th rank tensors of stiffness or compliance. These tensors are not easy to visualise in a single diagram or graphic, and visualising Poisson's ratio and shear modulus presents a further challenge in that their anisotropy depends on two principal directions. Students and researchers can easily underestimate the importance of elastic anisotropy. This presentation describes an open source toolbox of MATLAB scripts that aims to visualise elastic anisotropy in rocks and minerals. The code produces linked 2-D and 3-D representations of the standard elastic constants, such as Young's modulus, Poisson's ratio and shear modulus, all from a simple GUI. The 3-D plots can be manipulated by the user (rotated, panned, zoomed), to encourage investigation and a deeper understanding of directional variations in the fundamental properties. Examples are presented of common rock forming minerals, including those with negative Poisson's ratio (auxetic behaviour). We hope that an open source code base will encourage further enhancements from the rock physics and wider geoscience communities. Eventually, we hope to generate 3-D prints of these complex and beautiful natural surfaces to provide a tactile link to the underlying physics of elastic anisotropy.
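The directional quantities such a toolbox plots come from contractions of the compliance tensor. As a sketch, the directional Young's modulus E(n) = 1/(S_ijkl n_i n_j n_k n_l) can be checked against an isotropic solid, where it must be direction-independent; the material constants are illustrative.

```python
import numpy as np

def isotropic_compliance(E, nu):
    """Full 4th-rank compliance tensor S_ijkl for an isotropic solid:
    S_ijkl = (1+nu)/(2E) (d_ik d_jl + d_il d_jk) - (nu/E) d_ij d_kl."""
    d = np.eye(3)
    S = np.zeros((3, 3, 3, 3))
    for i in range(3):
        for j in range(3):
            for k in range(3):
                for l in range(3):
                    S[i, j, k, l] = ((1 + nu) / (2 * E)) \
                        * (d[i, k] * d[j, l] + d[i, l] * d[j, k]) \
                        - (nu / E) * d[i, j] * d[k, l]
    return S

def youngs_modulus(S, n):
    """Directional Young's modulus  E(n) = 1 / (S_ijkl n_i n_j n_k n_l)."""
    n = np.asarray(n, float)
    n = n / np.linalg.norm(n)
    return 1.0 / np.einsum("ijkl,i,j,k,l->", S, n, n, n, n)

S = isotropic_compliance(E=70e9, nu=0.25)   # illustrative: aluminium-like
E_axis = youngs_modulus(S, [1, 0, 0])
E_diag = youngs_modulus(S, [1, 1, 1])       # same value: isotropy check
```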
Geologic map of the Cucamonga Peak 7.5' quadrangle, San Bernardino County, California
Morton, D.M.; Matti, J.C.; Digital preparation by Koukladas, Catherine; Cossette, P.M.
2001-01-01
a. This Readme, which includes in Appendix I the data contained in fif_met.txt. b. The same graphic as plotted in 2 above. (Test plots have not produced 1:24,000-scale map sheets; the Adobe Acrobat page-size setting influences map scale.) The Correlation of Map Units and Description of Map Units is in the editorial format of USGS Miscellaneous Investigations Series (I-series) maps but has not been edited to comply with I-map standards. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation name, age, and lithology. Even though this is an author-prepared report, every attempt has been made to closely adhere to the stratigraphic nomenclature of the U.S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3b above) or plotting the postscript file (2 above). If roads in some areas, especially forest roads that parallel topographic contours, do not show well on plots of the geologic map, we recommend use of the USGS Cucamonga Peak 7.5' topographic quadrangle in conjunction with the geologic map.
Geologic map of the Telegraph Peak 7.5' quadrangle, San Bernardino County, California
Morton, D.M.; Woodburne, M.O.; Foster, J.H.; Morton, Gregory; Cossette, P.M.
2001-01-01
a. This Readme, which includes in Appendix I the data contained in fif_met.txt. b. The same graphic as plotted in 2 above. Test plots have not produced 1:24,000-scale map sheets; the Adobe Acrobat page-size setting influences map scale. The Correlation of Map Units and Description of Map Units is in the editorial format of USGS Miscellaneous Investigations Series (I-series) maps but has not been edited to comply with I-map standards. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation name, age, and lithology. Even though this is an author-prepared report, every attempt has been made to closely adhere to the stratigraphic nomenclature of the U.S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3b above) or plotting the postscript file (2 above). If roads in some areas, especially forest roads that parallel topographic contours, do not show well on plots of the geologic map, we recommend use of the USGS Telegraph Peak 7.5' topographic quadrangle in conjunction with the geologic map.
Data Quality Monitoring and Noise Analysis at the EUREF Permanent Network
NASA Astrophysics Data System (ADS)
Kenyeres, A.; Bruyninx, C.
2004-12-01
The EUREF Permanent Network (EPN) now includes more than 150 GNSS stations of varying quality and observation history. Most of the sites are located on the tectonically stable parts of Eurasia, where only mm-level yearly displacements are expected. Extracting the relevant geophysical information requires sophisticated analysis tools and stable, long-term observations. As the EPN has been operational since 1996, it offers the potential to estimate high-quality velocities with reliable uncertainties. To support this work, a set of efficient and demonstrative tools has been developed to monitor data and station quality. The periodically updated results are displayed on the website of the EPN Central Bureau (CB) (www.epncb.oma.be) as sky plots and graphs of observation percentage, cycle slips, and multipath. The quality plots are used indirectly in the interpretation of the time series: sudden changes or unusual variation in a time series (beyond obvious equipment changes) often correlates with changes in the environment mirrored by the quality plots. These graphs are vital for proper interpretation and understanding of the real processes; knowing the nuisance factors, we can generate cleaner time series. We present relevant examples of this work. Two kinds of time series plots are displayed on the EPN CB website: raw and improved time series. Both are cumulative solutions of the weekly EPN SINEX files using the minimum constraint approach; in the improved time series the outliers and offsets are already taken into account. We will also present preliminary results of a detailed noise analysis of the EPN time series.
The aim of this work is twofold: on the one hand, to compute more realistic velocity estimates for the EPN stations; on the other, to use the information about station noise characteristics to support the removal and proper interpretation of site-specific phenomena.
NASA Technical Reports Server (NTRS)
Long, D.
1994-01-01
This library is a set of subroutines designed for vector plotting to CRT's, plotters, dot matrix, and laser printers. LONGLIB subroutines are invoked by program calls similar to standard CALCOMP routines. In addition to the basic plotting routines, LONGLIB contains an extensive set of routines to allow viewport clipping, extended character sets, graphic input, shading, polar plots, and 3-D plotting with or without hidden line removal. LONGLIB capabilities include surface plots, contours, histograms, logarithm axes, world maps, and seismic plots. LONGLIB includes master subroutines, which are self-contained series of commonly used individual subroutines. When invoked, the master routine will initialize the plotting package, and will plot multiple curves, scatter plots, log plots, 3-D plots, etc. and then close the plot package, all with a single call. Supported devices include VT100 equipped with Selanar GR100 or GR100+ boards, VT125s, VT240s, VT220 equipped with Selanar SG220, Tektronix 4010/4014 or 4107/4109 and compatibles, and Graphon GO-235 terminals. Dot matrix printer output is available by using the provided raster scan conversion routines for DEC LA50, Printronix printers, and high or low resolution Trilog printers. Other output devices include QMS laser printers, Postscript compatible laser printers, and HPGL compatible plotters. The LONGLIB package includes the graphics library source code, an on-line help library, scan converter and meta file conversion programs, and command files for installing, creating, and testing the library. The latest version, 5.0, is significantly enhanced and has been made more portable. Also, the new version's meta file format has been changed and is incompatible with previous versions. A conversion utility is included to port the old meta files to the new format. Color terminal plotting has been incorporated. 
LONGLIB is written in FORTRAN 77 for batch or interactive execution and has been implemented on a DEC VAX series computer operating under VMS. This program was developed in 1985 and last updated in 1988.
The structure, function and value of urban forests in California communities
E. Gregory McPherson; Qingfu Xiao; Natalie S. van Doorn; John de Goede; Jacquelyn Bjorkman; Allan Hollander; Ryan M. Boynton; James F. Quinn; James H. Thorne
2017-01-01
This study used tree data from field plots in urban areas to describe forest structure in urban areas throughout California. The plot data were used with numerical models to calculate several ecosystem services produced by trees. A series of transfer functions were calculated to scale-up results from the plots to the landscape using urban tree canopy (UTC) mapped at 1-...
NASA Astrophysics Data System (ADS)
Narasimha Murthy, K. V.; Saravana, R.; Vijaya Kumar, K.
2018-04-01
This paper investigates the stochastic modelling and forecasting of monthly average maximum and minimum temperature patterns in India for the period 1981-2015 through a suitable seasonal autoregressive integrated moving average (SARIMA) model. The variations and distributions of monthly maximum and minimum temperatures are analyzed through box plots and cumulative distribution functions. The time series plot indicates that the maximum temperature series contains sharp peaks in almost all years, which is not true of the minimum temperature series, so the two series are modelled separately. Candidate SARIMA models were chosen by inspecting the autocorrelation function (ACF), partial autocorrelation function (PACF), and inverse autocorrelation function (IACF) of the logarithmically transformed temperature series. The SARIMA (1, 0, 0) × (0, 1, 1)12 model is selected for both the monthly average maximum and minimum temperature series based on the minimum Bayesian information criterion. The model parameters are estimated by maximum likelihood, with standard errors of the residuals. The adequacy of the selected model is assessed using correlation diagnostics (ACF, PACF, IACF, and p values of the Ljung-Box test statistic of the residuals) and normality diagnostics (kernel and normal density curves over the histogram, and a Q-Q plot). Finally, monthly maximum and minimum temperature patterns of India are forecast for the next 3 years using the selected model.
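The ACF-based identification step described above can be illustrated with a short, self-contained sketch. This is not the authors' code; the synthetic monthly series, its seasonal period of 12, and all parameter values are invented for illustration only.

```python
import numpy as np

def sample_acf(x, max_lag):
    """Sample autocorrelation function of a 1-D series, for lags 0..max_lag."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    denom = np.dot(x, x)
    return np.array([np.dot(x[:len(x) - k], x[k:]) / denom
                     for k in range(max_lag + 1)])

# Synthetic monthly series with a 12-month seasonal cycle (illustrative only).
rng = np.random.default_rng(0)
t = np.arange(240)
x = 10 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 1, t.size)

acf = sample_acf(x, 24)
print(round(acf[12], 2))  # large positive spike at the seasonal lag
```

A pronounced spike at lag 12 (and a trough at lag 6) in the ACF is the kind of signature that motivates a seasonal differencing/MA term at period 12, as in the selected SARIMA (1, 0, 0) × (0, 1, 1)12 model.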
Poisson traces, D-modules, and symplectic resolutions
NASA Astrophysics Data System (ADS)
Etingof, Pavel; Schedler, Travis
2018-03-01
We survey the theory of Poisson traces (or zeroth Poisson homology) developed by the authors in a series of recent papers. The goal is to understand this subtle invariant of (singular) Poisson varieties, conditions for it to be finite-dimensional, its relationship to the geometry and topology of symplectic resolutions, and its applications to quantizations. The main technique is the study of a canonical D-module on the variety. In the case the variety has finitely many symplectic leaves (such as for symplectic singularities and Hamiltonian reductions of symplectic vector spaces by reductive groups), the D-module is holonomic, and hence, the space of Poisson traces is finite-dimensional. As an application, there are finitely many irreducible finite-dimensional representations of every quantization of the variety. Conjecturally, the D-module is the pushforward of the canonical D-module under every symplectic resolution of singularities, which implies that the space of Poisson traces is dual to the top cohomology of the resolution. We explain many examples where the conjecture is proved, such as symmetric powers of du Val singularities and symplectic surfaces and Slodowy slices in the nilpotent cone of a semisimple Lie algebra. We compute the D-module in the case of surfaces with isolated singularities and show it is not always semisimple. We also explain generalizations to arbitrary Lie algebras of vector fields, connections to the Bernstein-Sato polynomial, relations to two-variable special polynomials such as Kostka polynomials and Tutte polynomials, and a conjectural relationship with deformations of symplectic resolutions. In the appendix we give a brief recollection of the theory of D-modules on singular varieties that we require.
Elliptic Euler-Poisson-Darboux equation, critical points and integrable systems
NASA Astrophysics Data System (ADS)
Konopelchenko, B. G.; Ortenzi, G.
2013-12-01
The structure and properties of families of critical points for classes of functions W(z, z̄) obeying the elliptic Euler-Poisson-Darboux equation E(1/2, 1/2) are studied. General variational and differential equations governing the dependence of critical points on variational (deformation) parameters are found. Explicit examples of the corresponding integrable quasi-linear differential systems and hierarchies are presented, among them the extended dispersionless Toda/nonlinear Schrödinger hierarchies, the 'inverse' hierarchy, and equations associated with the real-analytic Eisenstein series E(β, β̄; 1/2). The specific bi-Hamiltonian structure of these equations is also discussed.
Hybrid poplar grows poorly on acid spoil banks at high elevations in West Virginia
George R., Jr. Trimble
1963-01-01
In the early 1950s, a region-wide series of hybrid poplar clonal tests was begun in the Northeast to evaluate the performance of 50 selected clones under a variety of site and climatic conditions. The basic test unit was a block of 50 randomized plots, one plot for each of the 50 clones. In each plot, 16 cuttings were planted at 4-foot spacing.
Geologic map of the San Bernardino North 7.5' quadrangle, San Bernardino County, California
Miller, F.K.; Matti, J.C.
2001-01-01
3. Portable Document Format (.pdf) files of: a. This Readme; includes an Appendix, containing data found in sbnorth_met.txt . b. The Description of Map Units identical to that found on the plot of the PostScript file. c. The same graphic as plotted in 2 above. (Test plots from this .pdf do not produce 1:24,000-scale maps. Use Adobe Acrobat pagesize setting to control map scale.) The Correlation of Map Units and Description of Map Units is in the editorial format of USGS Miscellaneous Investigations Series (I-series) maps. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation-name, age, and lithology. Even though this is an author-prepared report, every attempt has been made to closely adhere to the stratigraphic nomenclature of the U.S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3b above) or plotting the PostScript file (2 above). If roads in some areas, especially forest roads that parallel topographic contours, do not show well on plots of the geologic map, we recommend use of the USGS San Bernardino North 7.5' topographic quadrangle in conjunction with the geologic map.
Recurrence quantification analysis of electrically evoked surface EMG signal.
Liu, Chunling; Wang, Xu
2005-01-01
The Recurrence Plot is a useful tool in time-series analysis, in particular for detecting unstable periodic orbits embedded in a chaotic dynamical system. This paper introduces the structure of the Recurrence Plot and how it is constructed, and then defines its quantification. One possible application of the Recurrence Quantification Analysis (RQA) strategy, the analysis of electrically evoked surface EMG, is presented. The results show that percent determinism increases with stimulation intensity.
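A recurrence plot and the percent-determinism measure it supports can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: no phase-space embedding is used, and the threshold, signal, and minimum line length are arbitrary choices.

```python
import numpy as np

def recurrence_matrix(x, eps):
    """Thresholded pairwise-distance matrix of a scalar series (no embedding)."""
    d = np.abs(x[:, None] - x[None, :])
    return (d < eps).astype(int)

def percent_determinism(R, lmin=2):
    """Fraction of recurrent points lying on diagonal lines of length >= lmin."""
    n = R.shape[0]
    total = 0
    on_lines = 0
    for k in range(1, n):           # scan the off-main diagonals (upper triangle)
        diag = np.diagonal(R, k)
        total += diag.sum()
        run = 0
        for v in list(diag) + [0]:  # trailing 0 flushes the final run
            if v:
                run += 1
            else:
                if run >= lmin:
                    on_lines += run
                run = 0
    return on_lines / total if total else 0.0

# A periodic signal should score as highly deterministic in the RQA sense.
t = np.linspace(0, 8 * np.pi, 200)
R = recurrence_matrix(np.sin(t), eps=0.2)
det = percent_determinism(R)
print(round(det, 2))
```

For a noisy signal the same measure drops, which is why percent determinism is a useful discriminator between structured and stochastic dynamics.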
Filtrations on Springer fiber cohomology and Kostka polynomials
NASA Astrophysics Data System (ADS)
Bellamy, Gwyn; Schedler, Travis
2018-03-01
We prove a conjecture which expresses the bigraded Poisson-de Rham homology of the nilpotent cone of a semisimple Lie algebra in terms of the generalized (one-variable) Kostka polynomials, via a formula suggested by Lusztig. This allows us to construct a canonical family of filtrations on the flag variety cohomology, and hence on irreducible representations of the Weyl group, whose Hilbert series are given by the generalized Kostka polynomials. We deduce consequences for the cohomology of all Springer fibers. In particular, this computes the grading on the zeroth Poisson homology of all classical finite W-algebras, as well as the filtration on the zeroth Hochschild homology of all quantum finite W-algebras, and we generalize to all homology degrees. As a consequence, we deduce a conjecture of Proudfoot on symplectic duality, relating in type A the Poisson homology of Slodowy slices to the intersection cohomology of nilpotent orbit closures. In the last section, we give an analogue of our main theorem in the setting of mirabolic D-modules.
Pine invasions in treeless environments: dispersal overruns microsite heterogeneity.
Pauchard, Aníbal; Escudero, Adrián; García, Rafael A; de la Cruz, Marcelino; Langdon, Bárbara; Cavieres, Lohengrin A; Esquivel, Jocelyn
2016-01-01
Understanding the patterns and mechanisms of biological invasions is needed for forecasting and managing these processes and their negative impacts. At small scales, the ecological processes driving plant invasions are expected to produce a spatially explicit pattern driven by propagule pressure and local ground heterogeneity. Our aim was to determine the interplay between the intensity of seed rain, using distance to a mature plantation as a proxy, and microsite heterogeneity in the spreading of Pinus contorta in the treeless Patagonian steppe. Three one-hectare plots were located under different degrees of P. contorta invasion (Coyhaique Alto, 45° 30'S and 71° 42'W). We fitted three types of inhomogeneous Poisson models to each pine plot to describe the observed pattern as accurately as possible: "dispersal" models, "local ground heterogeneity" models, and "combined" models using both types of covariates. To include the temporal axis in the invasion process, we analyzed the patterns of young and old recruits separately and of all recruits together. As hypothesized, the spatial patterns of recruited pines showed coarse-scale heterogeneity. Early pine invasion spatial patterns in our Patagonian steppe site are not different from expectations under inhomogeneous Poisson processes with a linear and negative dependency of pine recruit intensity on the distance to afforestations. Models including ground-cover predictors were able to describe the point pattern process only in a couple of cases, and never better than dispersal models. This finding concurs with the idea that early invasions depend more on seed pressure than on the biotic and abiotic relationships seeds and seedlings establish at the microsite scale. Our results show that without timely and active management, P. contorta will invade the Patagonian steppe independently of the local ground-cover conditions.
Le Strat, Yann
2017-01-01
The objective of this paper is to evaluate a panel of statistical algorithms for temporal outbreak detection. Based on a large dataset of simulated weekly surveillance time series, we performed a systematic assessment of 21 statistical algorithms, 19 implemented in the R package surveillance and two other methods. We estimated false positive rate (FPR), probability of detection (POD), probability of detection during the first week, sensitivity, specificity, negative and positive predictive values and F1-measure for each detection method. Then, to identify the factors associated with these performance measures, we ran multivariate Poisson regression models adjusted for the characteristics of the simulated time series (trend, seasonality, dispersion, outbreak sizes, etc.). The FPR ranged from 0.7% to 59.9% and the POD from 43.3% to 88.7%. Some methods had a very high specificity, up to 99.4%, but a low sensitivity. Methods with a high sensitivity (up to 79.5%) had a low specificity. All methods had a high negative predictive value, over 94%, while positive predictive values ranged from 6.5% to 68.4%. Multivariate Poisson regression models showed that performance measures were strongly influenced by the characteristics of time series. Past or current outbreak size and duration strongly influenced detection performances. PMID:28715489
Sebastian, Tunny; Jeyaseelan, Visalakshi; Jeyaseelan, Lakshmanan; Anandan, Shalini; George, Sebastian; Bangdiwala, Shrikant I
2018-01-01
Hidden Markov models are stochastic models in which the observations are assumed to follow a mixture distribution, but the parameters of the components are governed by a Markov chain which is unobservable. The issues related to the estimation of Poisson-hidden Markov models, in which the observations come from a mixture of Poisson distributions whose parameters are governed by an m-state Markov chain with an unknown transition probability matrix, are explained here. These methods were applied to data on Vibrio cholerae counts reported every month over an 11-year span at Christian Medical College, Vellore, India. Using the Viterbi algorithm, the best estimate of the state sequence was obtained, and hence the transition probability matrix. The mean passage times between the states were estimated, with 95% confidence intervals obtained via Monte Carlo simulation. The three hidden states of the estimated Markov chain are labelled 'Low', 'Moderate' and 'High', with mean counts of 1.4, 6.6 and 20.2 and estimated average durations of stay of 3, 3 and 4 months, respectively. Environmental risk factors were studied using Markov ordinal logistic regression analysis. No significant association was found between disease severity levels and climate components.
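The Viterbi decoding step for a Poisson-HMM can be sketched directly in log space. This is an illustrative implementation, not the authors' code: the transition matrix, initial distribution, and observation sequence below are invented, and only the three reported state means (1.4, 6.6, 20.2) are taken from the abstract.

```python
import numpy as np
from math import lgamma

def viterbi_poisson_hmm(obs, pi, A, lams):
    """Most likely hidden-state sequence for a Poisson-HMM (log-space Viterbi)."""
    def logpmf(k, lam):  # log of the Poisson pmf, via lgamma for k!
        return k * np.log(lam) - lam - lgamma(k + 1)
    n, m = len(obs), len(lams)
    logA = np.log(A)
    V = np.zeros((n, m))            # best log-probability ending in each state
    back = np.zeros((n, m), dtype=int)
    V[0] = np.log(pi) + [logpmf(obs[0], l) for l in lams]
    for t in range(1, n):
        for j in range(m):
            scores = V[t - 1] + logA[:, j]
            back[t, j] = np.argmax(scores)
            V[t, j] = scores[back[t, j]] + logpmf(obs[t], lams[j])
    path = [int(np.argmax(V[-1]))]  # backtrack from the best final state
    for t in range(n - 1, 0, -1):
        path.append(back[t, path[-1]])
    return path[::-1]

# 'Low', 'Moderate', 'High' mean counts from the abstract; the rest is made up.
lams = [1.4, 6.6, 20.2]
pi = np.array([0.6, 0.3, 0.1])
A = np.array([[0.80, 0.15, 0.05],
              [0.15, 0.70, 0.15],
              [0.05, 0.15, 0.80]])
obs = [0, 2, 1, 7, 6, 8, 22, 19, 21, 2, 1]
states = viterbi_poisson_hmm(obs, pi, A, lams)
print(states)
```

With well-separated state means like these, the decoded sequence tracks the obvious level shifts in the counts, which is what makes the 'Low'/'Moderate'/'High' labelling of states meaningful.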
User Manual for the Data-Series Interface of the Gr Application Software
Donovan, John M.
2009-01-01
This manual describes the data-series interface for the Gr Application software. Basic tasks such as plotting, editing, manipulating, and printing data series are presented. The properties of the various types of data objects and graphical objects used within the application, and the relationships between them also are presented. Descriptions of compatible data-series file formats are provided.
NASA Astrophysics Data System (ADS)
Chen, Wei-Shing
2011-04-01
The aim of this article is to answer whether the Taiwan unemployment rate dynamics are generated by a non-linear deterministic process. The paper applies a recurrence plot and recurrence quantification approach to the non-stationary hidden transition patterns of the unemployment rate of Taiwan. The case study uses time series data on Taiwan's unemployment rate for the period 1978/01 to 2010/06. The results show that recurrence techniques are able to identify various phases in the evolution of unemployment transitions in Taiwan.
A new computer code for discrete fracture network modelling
NASA Astrophysics Data System (ADS)
Xu, Chaoshui; Dowd, Peter
2010-03-01
The authors describe a comprehensive software package for two- and three-dimensional stochastic rock fracture simulation using marked point processes. Fracture locations can be modelled by a Poisson, a non-homogeneous, a cluster or a Cox point process; fracture geometries and properties are modelled by their respective probability distributions. Virtual sampling tools such as plane, window and scanline sampling are included in the software together with a comprehensive set of statistical tools including histogram analysis, probability plots, rose diagrams and hemispherical projections. The paper describes in detail the theoretical basis of the implementation and provides a case study in rock fracture modelling to demonstrate the application of the software.
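The simplest building block of such a simulator, a marked Poisson point process in which fracture centres follow a homogeneous Poisson process and marks (length, orientation) follow their own distributions, can be sketched as follows. This is a generic illustration, not the described software: the lognormal length distribution, uniform orientations, and all parameter values are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_fractures(region=(10.0, 10.0), intensity=0.3):
    """Marked Poisson process: centres ~ Poisson; marks = length and angle.

    The lognormal lengths and uniform orientations are illustrative choices;
    in practice each would be fitted to field data.
    """
    w, h = region
    n = rng.poisson(intensity * w * h)                 # Poisson fracture count
    centres = rng.uniform([0, 0], [w, h], size=(n, 2)) # uniform centre locations
    lengths = rng.lognormal(mean=0.0, sigma=0.5, size=n)
    angles = rng.uniform(0, np.pi, size=n)
    # Endpoints of each fracture trace, centred on its midpoint.
    half = 0.5 * lengths[:, None] * np.column_stack([np.cos(angles),
                                                     np.sin(angles)])
    return centres - half, centres + half

a, b = simulate_fractures()
print(len(a))  # number of simulated fracture traces
```

Cluster or Cox processes, as mentioned in the abstract, replace the homogeneous centre-generation step while the marking of geometries stays the same.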
Teen Series' Reception: Television, Adolescence and Culture of Feelings.
ERIC Educational Resources Information Center
Pasquier, Dominique
1996-01-01
Noting the popularity of television teen series among young viewers in France, this study examined how the programs are used as a way of defining gender identity for children and adolescents. Results indicated construction of meanings of characters and plots varied by age, gender, and social background of viewers. Relationship to series relied on…
Cross-correlation of point series using a new method
NASA Technical Reports Server (NTRS)
Strothers, Richard B.
1994-01-01
Traditional methods of cross-correlation of two time series do not apply to point time series. Here, a new method devised specifically for point series utilizes a correlation measure based on the rms difference (or, alternatively, the median absolute difference) between nearest neighbors in overlapped segments of the two series. Error estimates for the observed locations of the points, as well as a systematic shift of one series with respect to the other to accommodate a constant but unknown lead or lag, are easily incorporated into the analysis using Monte Carlo techniques. A methodological restriction adopted here is that one series be treated as a template series against which the other, called the target series, is cross-correlated. To estimate a significance level for the correlation measure, the adopted null hypothesis is that the target series arises from a homogeneous Poisson process. The new method is applied to cross-correlating the times of the greatest geomagnetic storms with the times of maximum in the undecennial solar activity cycle.
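The core of the method, scanning a lag and scoring each alignment by the rms distance from target points to their nearest template points, can be sketched briefly. This is a simplified illustration in the spirit of the abstract, not the paper's algorithm: the overlap handling, lag grid, and example event times are all invented.

```python
import numpy as np

def rms_nearest_neighbor(template, target, lag):
    """RMS distance from each shifted target point to its nearest template point.

    Smaller values indicate better alignment at this lag.
    """
    shifted = target + lag
    # Keep only target points overlapping the template's span.
    shifted = shifted[(shifted >= template.min()) & (shifted <= template.max())]
    if shifted.size == 0:
        return np.inf
    d = np.abs(shifted[:, None] - template[None, :]).min(axis=1)
    return np.sqrt(np.mean(d ** 2))

template = np.array([1.0, 4.0, 9.0, 11.0, 15.0])
# Target = template lagged by 2 units plus small location errors.
target = template + 2.0 + np.random.default_rng(3).normal(0, 0.05, 5)
lags = np.arange(-5, 6)
scores = [rms_nearest_neighbor(template, target, l) for l in lags]
best = lags[int(np.argmin(scores))]
print(best)  # the recovered lead/lag
```

Significance would then be assessed by recomputing the minimum score for many target series drawn from a homogeneous Poisson process, per the null hypothesis described above.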
Three-dimensional reconstruction of single-cell chromosome structure using recurrence plots.
Hirata, Yoshito; Oda, Arisa; Ohta, Kunihiro; Aihara, Kazuyuki
2016-10-11
Single-cell analysis of the three-dimensional (3D) chromosome structure can reveal cell-to-cell variability in genome activities. Here, we propose to apply recurrence plots, a mathematical method of nonlinear time series analysis, to reconstruct the 3D chromosome structure of a single cell based on information of chromosomal contacts from genome-wide chromosome conformation capture (Hi-C) data. This recurrence plot-based reconstruction (RPR) method enables rapid reconstruction of a unique structure in single cells, even from incomplete Hi-C information.
The Effectiveness of an Electronic Security Management System in a Privately Owned Apartment Complex
ERIC Educational Resources Information Center
Greenberg, David F.; Roush, Jeffrey B.
2009-01-01
Poisson and negative binomial regression methods are used to analyze the monthly time series data to determine the effects of introducing an integrated security management system including closed-circuit television (CCTV), door alarm monitoring, proximity card access, and emergency call boxes to a large privately-owned complex of apartment…
Extended generalized geometry and a DBI-type effective action for branes ending on branes
NASA Astrophysics Data System (ADS)
Jurčo, Branislav; Schupp, Peter; Vysoký, Jan
2014-08-01
Starting from the Nambu-Goto bosonic membrane action, we develop a geometric description suitable for p-brane backgrounds. With tools of generalized geometry we derive the pertinent generalization of the string open-closed relations to the p-brane case. Nambu-Poisson structures are used in this context to generalize the concept of semi-classical noncommutativity of D-branes governed by a Poisson tensor. We find a natural description of the correspondence of recently proposed commutative and noncommutative versions of an effective action for p-branes ending on a p′-brane. We calculate the power series expansion of the action in background independent gauge. Leading terms in the double scaling limit are given by a generalization of a (semi-classical) matrix model.
Fast, adaptive summation of point forces in the two-dimensional Poisson equation
NASA Technical Reports Server (NTRS)
Van Dommelen, Leon; Rundensteiner, Elke A.
1989-01-01
A comparatively simple procedure is presented for the direct summation of the velocity field introduced by point vortices which significantly reduces the required number of operations by replacing selected partial sums by asymptotic series. Tables are presented which demonstrate the speed of this algorithm in terms of the mere doubling of computational time in dealing with a doubling of the number of vortices; current methods involve a computational time extension by a factor of 4. This procedure need not be restricted to the solution of the Poisson equation, and may be applied to other problems involving groups of points in which the interaction between elements of different groups can be simplified when the distance between groups is sufficiently great.
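The idea of replacing a partial sum over a distant group of vortices by a truncated series can be sketched with a Laurent expansion about the group centroid, using 1/(z − z_k) = Σ_m (z_k − c)^m / (z − c)^(m+1). This is a minimal illustration of the principle, not the paper's adaptive algorithm; the vortex cluster, circulations, and evaluation points are invented.

```python
import numpy as np

def direct_sum(z_eval, z_src, gamma):
    """Direct O(N*M) conjugate-velocity sum induced by point vortices."""
    return np.array([np.sum(gamma / (z - z_src)) / (2j * np.pi) for z in z_eval])

def far_field_sum(z_eval, z_src, gamma, terms=4):
    """Truncated Laurent (multipole-style) series about the group centroid c:
    1/(z - z_k) = sum_m (z_k - c)^m / (z - c)^(m+1)."""
    c = z_src.mean()
    moments = [np.sum(gamma * (z_src - c) ** m) for m in range(terms)]
    out = np.zeros(len(z_eval), dtype=complex)
    for m, mom in enumerate(moments):
        out += mom / (z_eval - c) ** (m + 1)
    return out / (2j * np.pi)

rng = np.random.default_rng(7)
z_src = rng.uniform(-0.5, 0.5, 50) + 1j * rng.uniform(-0.5, 0.5, 50)  # cluster
gamma = rng.uniform(0.5, 1.5, 50)                                     # strengths
z_eval = np.array([10 + 0j, 8 + 6j, -12 + 3j])                        # far field

exact = direct_sum(z_eval, z_src, gamma)
approx = far_field_sum(z_eval, z_src, gamma)
print(np.max(np.abs(exact - approx)))  # truncation error, tiny at this distance
```

Because the series cost is independent of the number of vortices in the group, grouping distant vortices this way is what turns the naive quadratic-cost summation into a near-linear one.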
Application of Poisson random effect models for highway network screening.
Jiang, Ximiao; Abdel-Aty, Mohamed; Alamili, Samer
2014-02-01
In recent years, Bayesian random effect models that account for the temporal and spatial correlations of crash data have become popular in traffic safety research. This study employs random effect Poisson Log-Normal models for crash risk hotspot identification. Both the temporal and spatial correlations of crash data were considered. Potential for Safety Improvement (PSI) was adopted as a measure of crash risk. Using the fatal and injury crashes that occurred on urban 4-lane divided arterials from 2006 to 2009 in the Central Florida area, the random effect approaches were compared to the traditional Empirical Bayesian (EB) method and the conventional Bayesian Poisson Log-Normal model. A series of method examination tests was conducted to evaluate the performance of the different approaches. These tests include the previously developed site consistency test, method consistency test, total rank difference test, and modified total score test, as well as the newly proposed total safety performance measure difference test. Results show that the Bayesian Poisson model accounting for both temporal and spatial random effects (PTSRE) outperforms the model with only a temporal random effect, and both are superior to the conventional Poisson Log-Normal model (PLN) and the EB model in fitting the crash data. Additionally, the method evaluation tests indicate that the PTSRE model is significantly superior to the PLN model and the EB model in consistently identifying hotspots during successive time periods. The results suggest that the PTSRE model is a superior alternative for road site crash risk hotspot identification. Copyright © 2013 Elsevier Ltd. All rights reserved.
Pinheiro, Samya de Lara Lins de Araujo; Saldiva, Paulo Hilário Nascimento; Schwartz, Joel; Zanobetti, Antonella
2014-12-01
OBJECTIVE To analyze the effect of air pollution and temperature on mortality due to cardiovascular and respiratory diseases. METHODS We evaluated the isolated and synergistic effects of temperature and particulate matter with aerodynamic diameter < 10 µm (PM10) on the mortality of individuals > 40 years old due to cardiovascular disease and that of individuals > 60 years old due to respiratory diseases in Sao Paulo, SP, Southeastern Brazil, between 1998 and 2008. Three methodologies were used to evaluate the isolated association: time-series analysis using Poisson regression model, bidirectional case-crossover analysis matched by period, and case-crossover analysis matched by the confounding factor, i.e., average temperature or pollutant concentration. The graphical representation of the response surface, generated by the interaction term between these factors added to the Poisson regression model, was interpreted to evaluate the synergistic effect of the risk factors. RESULTS No differences were observed between the results of the case-crossover and time-series analyses. The percentage change in the relative risk of cardiovascular and respiratory mortality was 0.85% (0.45;1.25) and 1.60% (0.74;2.46), respectively, due to an increase of 10 μg/m3 in the PM10 concentration. The pattern of correlation of the temperature with cardiovascular mortality was U-shaped and that with respiratory mortality was J-shaped, indicating an increased relative risk at high temperatures. The values for the interaction term indicated a higher relative risk for cardiovascular and respiratory mortalities at low temperatures and high temperatures, respectively, when the pollution levels reached approximately 60 μg/m3. CONCLUSIONS The positive association standardized in the Poisson regression model for pollutant concentration is not confounded by temperature, and the effect of temperature is not confounded by the pollutant levels in the time-series analysis. 
The simultaneous exposure to different levels of environmental factors can create synergistic effects that are as disturbing as those caused by extreme concentrations.
Evenly spaced Detrended Fluctuation Analysis: Selecting the number of points for the diffusion plot
NASA Astrophysics Data System (ADS)
Liddy, Joshua J.; Haddad, Jeffrey M.
2018-02-01
Detrended Fluctuation Analysis (DFA) has become a widely used tool to examine the correlation structure of a time series and has provided insights into neuromuscular health and disease states. As the popularity of DFA in the human behavioral sciences has grown, understanding its limitations and how to properly determine parameters is becoming increasingly important. DFA examines the correlation structure of variability in a time series by computing α, the slope of the log SD vs. log n diffusion plot. When using the traditional DFA algorithm, the timescales, n, are often selected as a set of integers between a minimum and maximum length based on the number of data points in the time series. This produces non-uniformly distributed values of n in logarithmic scale, which influences the estimation of α due to a disproportionate weighting of the long-timescale regions of the diffusion plot. Recently, the evenly spaced DFA and evenly spaced average DFA algorithms were introduced. Both algorithms compute α by selecting k points for the diffusion plot based on the minimum and maximum timescales of interest, and improve the consistency of α estimates for simulated fractional Gaussian noise and fractional Brownian motion time series. Two issues that remain unaddressed are (1) how to select k and (2) whether the evenly spaced DFA algorithms show similar benefits when assessing human behavioral data. We manipulated k and examined its effects on the accuracy, consistency, and confidence limits of α in simulated and experimental time series. We demonstrate that the accuracy and consistency of α are relatively unaffected by the selection of k. However, the confidence limits of α narrow as k increases, dramatically reducing measurement uncertainty for single trials. We provide guidelines for selecting k and discuss potential uses of the evenly spaced DFA algorithms when assessing human behavioral data.
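An evenly spaced DFA in the sense described above, choosing k log-evenly-spaced timescales rather than every integer window size, can be sketched as follows. This is an illustrative order-1 DFA, not the authors' algorithm; the choices of k, minimum window size, and maximum window of N/4 are common conventions, not values from the paper.

```python
import numpy as np

def dfa_alpha(x, k=10, n_min=4):
    """DFA exponent alpha from k log-evenly-spaced window sizes (linear detrend)."""
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())        # integrated profile
    n_max = len(x) // 4
    ns = np.unique(np.geomspace(n_min, n_max, k).astype(int))
    F = []
    for n in ns:
        m = len(y) // n
        segs = y[:m * n].reshape(m, n) # non-overlapping windows of length n
        t = np.arange(n)
        rms = []
        for s in segs:                 # least-squares linear detrend per window
            c = np.polyfit(t, s, 1)
            rms.append(np.mean((s - np.polyval(c, t)) ** 2))
        F.append(np.sqrt(np.mean(rms)))
    slope, _ = np.polyfit(np.log(ns), np.log(F), 1)  # slope of diffusion plot
    return slope

# White noise should give alpha near 0.5.
alpha = dfa_alpha(np.random.default_rng(5).normal(size=4096))
print(round(alpha, 2))
```

Because `np.geomspace` spaces the timescales uniformly in log scale, each region of the diffusion plot carries comparable weight in the regression for α, which is the motivation for the evenly spaced variants.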
Reck, Kasper; Thomsen, Erik V; Hansen, Ole
2011-01-31
The scalar wave equation, or Helmholtz equation, describes within a certain approximation the electromagnetic field distribution in a given system. In this paper we show how to solve the Helmholtz equation in complex geometries using conformal mapping and the homotopy perturbation method. The solution of the mapped Helmholtz equation is found by solving an infinite series of Poisson equations using two dimensional Fourier series. The solution is entirely based on analytical expressions and is not mesh dependent. The analytical results are compared to a numerical (finite element method) solution.
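The inner building block, solving a Poisson equation by a two-dimensional Fourier series, can be sketched with FFTs. This is a minimal illustration on a periodic square, not the paper's conformal-mapping/homotopy method; the grid size and manufactured test solution are arbitrary choices.

```python
import numpy as np

def solve_poisson_periodic(f, L=2 * np.pi):
    """Spectral solution of laplacian(u) = f on a periodic square (zero mean)."""
    n = f.shape[0]
    k = np.fft.fftfreq(n, d=L / n) * 2 * np.pi
    kx, ky = np.meshgrid(k, k, indexing='ij')
    k2 = kx ** 2 + ky ** 2
    fh = np.fft.fft2(f)
    uh = np.zeros_like(fh)
    nz = k2 != 0
    uh[nz] = -fh[nz] / k2[nz]   # in Fourier space, -(kx^2 + ky^2) u_hat = f_hat
    return np.real(np.fft.ifft2(uh))

# Manufactured solution u = sin(x) cos(2y), for which laplacian(u) = -5 u.
n = 64
x = np.linspace(0, 2 * np.pi, n, endpoint=False)
X, Y = np.meshgrid(x, x, indexing='ij')
u_true = np.sin(X) * np.cos(2 * Y)
f = -5 * u_true
u = solve_poisson_periodic(f)
print(np.max(np.abs(u - u_true)))  # spectral accuracy on a smooth solution
```

A series of such Poisson solves, one per term of the perturbation expansion, is the structure the abstract describes; the conformal map then carries the square-domain solution to the complex geometry.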
NASA Technical Reports Server (NTRS)
Golub, M. A.; Wydeven, T.
1997-01-01
It is well known that the rate of plasma polymerization, or deposition rate, of a given monomer depends on various plasma process parameters, e.g., monomer flow rate, pressure, power, frequency (DC, rf or microwave), location of the substrate in the reactor, reactor geometry or configuration, and temperature. In contrast, little work has been done to relate deposition rates to monomer structures for a homologous series of monomers where the rates are obtained under identical plasma process parameters. For the particular series of fluorinated ethylenes (C2HxF4-x; x = 0-4), deposition rates were reported for ethylene (ET), vinyl fluoride, vinylidene fluoride and tetrafluoroethylene (TFE), but for plasma polymerizations carried out under different discharge conditions, e.g., pressure, current density, and electrode temperature. Apparently, relative deposition rates were reported for only two members of that series (ET, x = 4, and TFE, x = 0) for which the plasma polymerizations were conducted under identical conditions. We now present relative deposition rates for both homopolymerizations and copolymerizations of the entire series of fluorinated ethylenes (x = 0-4). Our interest in such rates stems from prior work on the plasma copolymerization of ET and TFE in which it was found that the deposition rates for the plasma copolymers, when plotted versus mol % TFE in the ET/TFE feed stock, followed a concave-downward curve situated above the straight line joining the deposition rates for the plasma homopolymers. This type of plot (observed also for an argon-ET/TFE plasma copolymerization) indicated a positive interaction between ET and TFE such that each monomer apparently "sensitized" the plasma copolymerization of the other. 
Since the shape of that plot is not altered if mol % TFE is replaced by F/C, the fluorine-to-carbon ratio, this paper aims (1) to show how the relative deposition rates for plasma copolymers drawn from all pairs of monomers in the C2HxF4-x series, as well as the deposition rates for the individual plasma homopolymers, vary with F/C ratios of the monomers or monomer blends, and (2) to see if those rates give rise to a common plot.
USDA-ARS?s Scientific Manuscript database
A series of simulated rainfall-runoff experiments with applications of different manure types (cattle solid pats, poultry dry litter, swine slurry) were conducted across four seasons on a field containing 36 plots (0.75 × 2 m each), resulting in 144 rainfall-runoff events. Simulating time-varying re...
Forest habitat types of northern Idaho: A second approximation
Stephen V. Cooper; Kenneth E. Neiman; David W. Roberts
1991-01-01
The addition of more than 900 plots to the Daubenmire's original 181-plot database has resulted in a refinement of their potential natural vegetation-based land classification for northern Idaho. A diagnostic, indicator species-based key is provided for field identification of the eight climax series, 46 habitat types, and 60 phases. Recognized syntaxa are...
NASA Technical Reports Server (NTRS)
Acker, J. G.; Leptoukh, G.; Kempler, S.; Gregg, W.; Berrick, S.; Zhu, T.; Liu, Z.; Rui, H.; Shen, S.
2004-01-01
The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) has taken a major step addressing the challenge of using archived Earth Observing System (EOS) data for regional or global studies by developing an infrastructure with a World Wide Web interface which allows online, interactive, data analysis: the GES DISC Interactive Online Visualization and ANalysis Infrastructure, or "Giovanni." Giovanni provides a data analysis environment that is largely independent of underlying data file format. The Ocean Color Time-Series Project has created an initial implementation of Giovanni using monthly Standard Mapped Image (SMI) data products from the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) mission. Giovanni users select geophysical parameters, and the geographical region and time period of interest. The system rapidly generates a graphical or ASCII numerical data output. Currently available output options are: Area plot (averaged or accumulated over any available data period for any rectangular area); Time plot (time series averaged over any rectangular area); Hovmöller plots (image view of any longitude-time and latitude-time cross sections); ASCII output for all plot types; and area plot animations. Future plans include correlation plots, output formats compatible with Geographical Information Systems (GIS), and higher temporal resolution data. The Ocean Color Time-Series Project will produce sensor-independent ocean color data beginning with the Coastal Zone Color Scanner (CZCS) mission and extending through SeaWiFS and Moderate Resolution Imaging Spectroradiometer (MODIS) data sets, and will enable incorporation of Visible/Infrared Imaging Radiometer Suite (VIIRS) data, which will be added to Giovanni.
The first phase of Giovanni will also include tutorials demonstrating the use of Giovanni and collaborative assistance in the development of research projects using the SeaWiFS and Ocean Color Time-Series Project data in the online Laboratory for Ocean Color Users (LOCUS). The synergy of Giovanni with high-quality ocean color data provides users with the ability to investigate a variety of important oceanic phenomena, such as coastal primary productivity related to pelagic fisheries, seasonal patterns and interannual variability, interdependence of atmospheric dust aerosols and harmful algal blooms, and the potential effects of climate change on oceanic productivity.
Zhang, Ling Yu; Liu, Zhao Gang
2017-12-01
Based on data collected from 108 permanent plots of the forest resources survey in Maoershan Experimental Forest Farm during 2004-2016, this study investigated the spatial distribution of recruitment trees in natural secondary forest by global Poisson regression and geographically weighted Poisson regression (GWPR) with four bandwidths of 2.5, 5, 10 and 15 km. The simulation effects of the five regressions and the factors influencing recruitment trees in stands were analyzed, and the spatial autocorrelation of the regression residuals was described at global and local levels using Moran's I. The results showed that the spatial distribution of the number of recruitment trees in natural secondary forest was significantly influenced by stand and topographic factors, especially average DBH. The GWPR model at the small scale (2.5 km) had high model-fitting accuracy, generated a large range of model parameter estimates, and captured the localized spatial distribution effect of the model parameters. The GWPR models at small scales (2.5 and 5 km) produced a small range of model residuals, and the stability of the model was improved. The global spatial autocorrelation of the GWPR model residuals at the small scale (2.5 km) was the lowest, and the local spatial autocorrelation was significantly reduced, forming an ideal spatial distribution pattern of small clusters with different observations. The local model at the small scale (2.5 km) was much better than the global model in simulating the spatial distribution of recruitment tree numbers.
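The residual diagnostic used in this study, global Moran's I, is simple to compute. The following is a minimal plain-Python sketch (not the authors' code); the four-location chain and its rook-adjacency weight matrix are hypothetical example data.

```python
def morans_i(values, weights):
    """Global Moran's I for values observed at n locations, given an
    n x n spatial weight matrix (list of lists). Illustrative sketch:
    I = (n / W) * sum_ij w_ij (x_i - mean)(x_j - mean) / sum_i (x_i - mean)^2."""
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]          # deviations from the mean
    w_sum = sum(sum(row) for row in weights)  # total weight W
    num = sum(weights[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    return (n / w_sum) * (num / den)

# Four plots on a line (rook adjacency); similar neighbors -> positive I
w = [[0, 1, 0, 0],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [0, 0, 1, 0]]
i_stat = morans_i([1, 1, 5, 5], w)
```

Positive values indicate clustering of similar residuals (what the local GWPR models reduced); values near the statistic's slightly negative expectation indicate spatial randomness.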
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wackerbarth, David
Sandia National Laboratories has developed a computer program to review, reduce and manipulate waveform data. PlotData is designed for post-acquisition waveform data analysis and provides an advanced interactive data analysis environment. PlotData requires unidirectional waveform data with both uniform and discrete time-series measurements. PlotData operates on a National Instruments LabVIEW™ software platform. Using PlotData, the user can capture waveform data from Tektronix, LeCroy or Agilent digitizing oscilloscopes over GPIB, USB and Ethernet interfaces. PlotData can both import and export several types of binary waveform files including, but not limited to, Tektronix .wmf files, LeCroy .trc files and x-y pair ASCII files. Waveform manipulation includes numerous math functions, integration, differentiation, smoothing, truncation, and other specialized data reduction routines such as VISAR, POV, PVDF (Bauer) piezoelectric gauges, and piezoresistive gauges such as carbon manganin pressure gauges.
Li, Xiaoning; Huang, Lijun; Hu, Xiche; Huang, Xuefei
2009-01-01
Three series of thioglycosyl donors, differing only in their respective aglycon substituents within each series, were prepared as representatives of typical glycosyl donors. The relative anomeric reactivities of these donors were quantified under competitive glycosylation conditions with various reaction times, promoters, solvents and acceptors. Over three orders of magnitude of reactivity difference was generated by simple transformation of the para-substituent on the aglycon with methanol as the acceptor, while chemoselectivities became lower with carbohydrate acceptors. Excellent linear correlations were attained between the relative reactivity values of the donors and the σp values of the substituents in Hammett plots, indicating that the glycosylation mechanism remains the same over a wide range of reactivities and glycosylation conditions. The negative slopes of the Hammett plots suggest that electron-donating substituents expedite the reactions, and the magnitudes of the slopes can be rationalized by neighboring group participation as well as the electronic properties of the glycon protective groups. Within the same series of donors, less nucleophilic acceptors gave smaller slopes in their Hammett plots, consistent with the notion that acceptor nucleophilic attack on the reactive intermediate is part of the rate-limiting step of the glycosylation reaction. PMID:19081954
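The Hammett analysis described here reduces to fitting the logarithm of relative reactivity against σp and reading off the slope (ρ). A generic least-squares sketch follows; the σp values are the commonly tabulated constants for a few para-substituents, but the log(k_rel) values and the slope of -1.2 are fabricated for illustration, not data from this paper.

```python
def hammett_rho(sigma_p, log_k_rel):
    """Least-squares slope (rho) of a Hammett plot:
    log(k_rel) versus sigma_p. Illustrative sketch only."""
    n = len(sigma_p)
    mx = sum(sigma_p) / n
    my = sum(log_k_rel) / n
    num = sum((s - mx) * (y - my) for s, y in zip(sigma_p, log_k_rel))
    den = sum((s - mx) ** 2 for s in sigma_p)
    return num / den

# Hypothetical donors: exactly linear data with slope -1.2
sigma = [-0.27, 0.0, 0.23, 0.78]   # tabulated sigma_p: p-OMe, H, p-Cl, p-NO2
logk = [-1.2 * s for s in sigma]   # fabricated log relative reactivities
rho = hammett_rho(sigma, logk)
```

A negative ρ, as in the abstract, means electron-donating substituents (negative σp) give larger log(k_rel), i.e. faster glycosylation.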
The lunar libration: comparisons between various models - a model fitted to LLR observations
NASA Astrophysics Data System (ADS)
Chapront, J.; Francou, G.
2005-09-01
We consider 4 libration models: 3 numerical models built by JPL (ephemerides for the libration in DE245, DE403 and DE405) and an analytical model improved with numerical complements fitted to recent LLR observations. The analytical solution uses 3 angular variables (ρ1, ρ2, τ) which represent the deviations with respect to Cassini's laws. After having referred the models to a unique reference frame, we study the differences between the models, which depend on gravitational and tidal parameters of the Moon, as well as amplitudes and frequencies of the free librations. It appears that the differences vary widely depending on the above quantities. They correspond to a few meters displacement on the lunar surface, bearing in mind that LLR distances are precise to the centimeter level. Taking advantage of the lunar libration theory built by Moons (1984) and improved by Chapront et al. (1999), we are able to establish 4 solutions and to represent their differences by Fourier series after a numerical substitution of the gravitational constants and free libration parameters. The results are confirmed by frequency analyses performed separately. Using DE245 as a basic reference ephemeris, we approximate the differences between the analytical and numerical models with Poisson series. The analytical solution - improved with numerical complements under the form of Poisson series - is valid over several centuries with an internal precision better than 5 centimeters.
ERIC Educational Resources Information Center
Syed, M. Qasim; Lovatt, Ian
2014-01-01
This paper is an addition to the series of papers on the exponential function begun by Albert Bartlett. In particular, we ask how the graph of the exponential function y = e^(-t/τ) would appear if y were plotted versus ln t rather than the normal practice of plotting ln y versus t. In answering this question, we find a new way to…
An assessment of autumn olive in northern U.S. forests
Cassandra M. Kurtz; Mark H. Hansen
2016-01-01
This publication is part of a series of research notes that provide an overview of the invasive plant species monitored on an extensive systematic network of plots measured by the Forest Inventory and Analysis (FIA) program of the U.S. Forest Service, Northern Research Station (NRS). Each research note features one of the invasive plants monitored on forested plots by...
US forests are showing increased rates of decline in response to a changing climate
Warren B. Cohen; Zhiqiang Yang; David M. Bell; Stephen V. Stehman
2015-01-01
How vulnerable are US forests to a changing climate? We answer this question using Landsat time series data and a unique interpretation approach, TimeSync, a plot-based Landsat visualization and data collection tool. Original analyses were based on a stratified two-stage cluster sample design that included interpretation of 3858 forested plots. From these data, we...
An assessment of nonnative bush honeysuckle in northern U.S. forests
Cassandra Kurtz; M.H. Hansen
2015-01-01
This publication is part of a series that provides an overview of the presence of invasive plant species monitored on an extensive systematic network of plots measured by the Forest Inventory and Analysis (FIA) program of the U.S. Forest Service, Northern Research Station (NRS). Each research note features one of the invasive plants monitored on forested plots by NRS...
Monitoring hemlock crown health in Delaware Water Gap National Recreation Area
Michael E. Montgomery; Bradley Onken; Richard A. Evans; Richard A. Evans
2005-01-01
Decline of the health of hemlocks in Delaware Water Gap National Recreation Area was noticeable in the southern areas of the park by 1992. The following year, a series of plots were established to monitor hemlock health and the abundance of hemlock woolly adelgid. This poster examines only the health rating of the hemlocks in the monitoring plots.
An assessment of common buckthorn in northern U.S. forests
Cassandra M. Kurtz; Mark H. Hansen
2018-01-01
This publication is part of a series of research notes that provides an overview of the presence of invasive plant species monitored on an extensive systematic network of plots measured by the Forest Inventory and Analysis (FIA) program of the USDA Forest Service, Northern Research Station (NRS). Each research note features one of the invasive plants monitored on forested plots by...
Early thinning experiments established by the Fort Valley Experimental Forest (P-53)
Benjamin P. De Blois; Alex. J. Finkral; Andrew J. Sánchez Meador; Margaret M. Moore
2008-01-01
Between 1925 and 1936, the Fort Valley Experimental Forest (FVEF) scientists initiated a study to examine a series of forest thinning experiments in second growth ponderosa pine stands in Arizona and New Mexico. These early thinning plots furnished much of the early background for the development of methods used in forest management in the Southwest. The plots ranged...
An assessment of Japanese honeysuckle in northern U.S. forests
Cassandra M. Kurtz; Mark H. Hansen
2015-01-01
This publication is part of a series that provides an overview of the presence of invasive plant species monitored on an extensive systematic network of plots measured by the Forest Inventory and Analysis (FIA) program of the U.S. Forest Service, Northern Research Station (NRS). Each research note features one of the invasive plants monitored on forested plots by NRS...
Harvesting intensity affects forest structure and composition in an upland Amazonian forest
John A. Parrotta; John K. Francis; Oliver H. Knowles
2002-01-01
Forest structure and floristic composition were studied in a series of 0.5 ha natural forest plots at four sites near Porto Trombetas in Pará State, Brazil, 11–12 years after being subjected to differing levels of above-ground biomass harvest and removal. In addition to undisturbed control plots, experimental treatments included: removal of...
An assessment of garlic mustard in northern U.S. forests
Cassandra M. Kurtz; Mark H. Hansen
2014-01-01
This publication is part of a series that provides an overview of the presence of invasive plant species monitored on an extensive systematic network of plots measured by the Forest Inventory and Analysis (FIA) program of the U.S. Forest Service, Northern Research Station (NRS). Each research note features one of the invasive plants monitored on forested plots by FIA...
Assessment of trend and seasonality in road accident data: an Iranian case study.
Razzaghi, Alireza; Bahrampour, Abbas; Baneshi, Mohammad Reza; Zolala, Farzaneh
2013-06-01
Road traffic accidents and their related deaths have become a major concern, particularly in developing countries. Iran has adopted a series of policies and interventions to control the high number of accidents occurring over the past few years. In this study we used a time series model to understand the trend of accidents and to ascertain the viability of applying ARIMA models to data from Taybad city. This is a cross-sectional study using data on accidents occurring in Taybad between 2007 and 2011. We obtained the data from the Ministry of Health (MOH) and used the time series method with a time lag of one month. After plotting the trend, non-stationarity in variance and mean was removed using a Box-Cox transformation and differencing, respectively. ACF and PACF plots were used to check stationarity. The traffic accidents in our study had an increasing trend over the five years of study. Based on the ACF and PACF plots obtained after applying the Box-Cox transformation and differencing, the data did not fit a time series model; therefore, neither an ARIMA model nor seasonality was observed. Traffic accidents in Taybad have an upward trend. In addition, we expected either an AR, MA or ARIMA model with a seasonal trend, yet this was not observed in this analysis. Several reasons may have contributed to this situation, such as uncertainty about the quality of the data, weather changes, and behavioural factors that are not taken into account by time series analysis.
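The stationarity workflow described here (difference the series, then inspect the ACF) can be sketched in plain Python. This is a generic illustration with a synthetic trending series, not the study's code or data.

```python
def difference(x, lag=1):
    """First (or seasonal) differencing to remove non-stationarity in the mean."""
    return [x[t] - x[t - lag] for t in range(lag, len(x))]

def acf(x, max_lag):
    """Sample autocorrelation function r_0..r_max_lag."""
    n = len(x)
    mean = sum(x) / n
    c0 = sum((v - mean) ** 2 for v in x) / n   # lag-0 autocovariance

    def c(k):
        return sum((x[t] - mean) * (x[t + k] - mean) for t in range(n - k)) / n

    return [c(k) / c0 for k in range(max_lag + 1)]

# A linear monthly trend: strong lag-1 autocorrelation until differenced
trend = [0.5 * t for t in range(24)]
r = acf(trend, 2)
flat = difference(trend)   # constant increments -> the trend is removed
```

A slowly decaying ACF like `r` here is the classic signature of non-stationarity; after differencing, the series `flat` is constant, so no ARMA structure remains, loosely analogous to the study's finding that no ARIMA model fit.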
Modeling species-abundance relationships in multi-species collections
Peng, S.; Yin, Z.; Ren, H.; Guo, Q.
2003-01-01
The species-abundance relationship is one of the most fundamental aspects of community ecology. Since Motomura first developed the geometric series model to describe the structure of communities, ecologists have developed many other models to fit species-abundance data. These models can be classified into empirical and theoretical ones, including (1) statistical models, i.e., the negative binomial distribution (and its extension), log-series distribution (and its extension), geometric distribution, lognormal distribution, and Poisson-lognormal distribution; (2) niche models, i.e., the geometric series, broken stick, overlapping niche, particulate niche, random assortment, dominance pre-emption, dominance decay, random fraction, weighted random fraction, composite niche, and Zipf or Zipf-Mandelbrot models; and (3) dynamic models describing community dynamics and the restrictive function of the environment on the community. These models have different characteristics and fit species-abundance data in various communities or collections. Among them, the log-series distribution, lognormal distribution, geometric series, and broken stick model have been most widely used.
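Motomura's geometric series, the first model named above, assigns each successively ranked species a fixed fraction k of the remaining resources. A minimal sketch (the totals and k below are illustrative, not from any dataset):

```python
def geometric_series_abundances(total, k, n_species):
    """Expected abundances under Motomura's geometric series model:
    the i-th ranked species takes a fraction k of what remains, so its
    share is proportional to k * (1 - k)**(i - 1). Abundances are
    normalized to sum to `total`. Illustrative sketch only."""
    raw = [k * (1 - k) ** (i - 1) for i in range(1, n_species + 1)]
    norm = sum(raw)
    return [total * r / norm for r in raw]

# Hypothetical community: 1000 individuals, 5 species, k = 0.5
abund = geometric_series_abundances(1000, 0.5, 5)
```

On a rank-abundance plot with a log abundance axis, this model appears as a straight line, which is how it is usually recognized in field data.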
Justin S. Crotteau; Martin W. Ritchie; J. Morgan Varner
2014-01-01
Many western USA fire regimes are typified by mixed-severity fire, which compounds the variability inherent to natural regeneration densities in associated forests. Tree regeneration data are often discrete and nonnegative; accordingly, we fit a series of Poisson and negative binomial variation models to conifer seedling counts across four distinct burn severities and...
Modeling time-series count data: the unique challenges facing political communication studies.
Fogarty, Brian J; Monogan, James E
2014-05-01
This paper demonstrates the importance of proper model specification when analyzing time-series count data in political communication studies. It is common for scholars of media and politics to investigate counts of coverage of an issue as it evolves over time. Many scholars rightly consider the issues of time dependence and dynamic causality to be the most important when crafting a model. However, to ignore the count features of the outcome variable overlooks an important feature of the data. This is particularly the case when modeling data with a low number of counts. In this paper, we argue that the Poisson autoregressive model (Brandt and Williams, 2001) accurately meets the needs of many media studies. We replicate the analyses of Flemming et al. (1997), Peake and Eshbaugh-Soha (2008), and Ura (2009) and demonstrate that models missing some of the assumptions of the Poisson autoregressive model often yield invalid inferences. We also demonstrate that the effect of any of these models can be illustrated dynamically with estimates of uncertainty through a simulation procedure. The paper concludes with implications of these findings for the practical researcher. Copyright © 2013 Elsevier Inc. All rights reserved.
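A count process with time dependence, the kind of data the paper argues needs a Poisson autoregressive treatment, can be simulated compactly. The sketch below is not the Brandt and Williams PAR(p) estimator itself, just a simplified generative model in which the log of the conditional Poisson mean depends on the previous count; the Poisson draws use Knuth's method so only the standard library is needed.

```python
import math
import random

def poisson_sample(lam, rng):
    """One Poisson(lam) draw via Knuth's method: multiply uniforms
    until the product falls below exp(-lam). Fine for small lam."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def simulate_count_series(n, intercept, ar_coef, seed=0):
    """Simplified autoregressive count process (a sketch, not PAR(p)):
    lambda_t = exp(intercept + ar_coef * log(y_{t-1} + 1))."""
    rng = random.Random(seed)
    y = [poisson_sample(math.exp(intercept), rng)]
    for _ in range(n - 1):
        lam = math.exp(intercept + ar_coef * math.log(y[-1] + 1))
        y.append(poisson_sample(lam, rng))
    return y

series = simulate_count_series(5000, intercept=0.5, ar_coef=0.3)
```

Fitting a Gaussian ARIMA to a low-count series like this one is exactly the specification error the paper warns against: the outcome is nonnegative and integer-valued, so the count structure should enter the model.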
Paillet, Frederick L.; Cheng, C.H.; Meredith, J.A.
1987-01-01
Existing techniques for the quantitative interpretation of waveform data have been based on one of two fundamental approaches: (1) simultaneous identification of compressional and shear velocities; and (2) least-squares minimization of the difference between experimental waveforms and synthetic seismograms. Techniques based on the first approach do not always work, and those based on the second seem too numerically cumbersome for routine application during data processing. An alternative approach is tested here, in which synthetic waveforms are used to predict relative mode excitation in the composite waveform. Synthetic waveforms are generated for a series of lithologies ranging from hard, crystalline rocks (Vp = 6.0 km/sec and Poisson's ratio = 0.20) to soft, argillaceous sediments (Vp = 1.8 km/sec and Poisson's ratio = 0.40). The series of waveforms illustrates a continuous change within this range of rock properties. Mode energy within characteristic velocity windows is computed for each of the modes in the set of synthetic waveforms. The results indicate that there is a consistent variation in mode excitation in lithology space that can be used to construct a unique relationship between relative mode excitation and lithology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pabst, M., E-mail: M.Pabst@fz-juelich.de
2014-06-14
Single charge densities and the potential are used to describe models of electrochemical systems. These quantities can be calculated by solving a system of time-dependent nonlinear coupled partial differential equations, the Poisson-Nernst-Planck equations. Assuming small deviations from the electroneutral equilibrium, the linearized and decoupled equations are solved for a radially symmetric geometry, which represents the interface between a cell and a sensor device. The densities and the potential are expressed by Fourier-Bessel series. The system considered has a ratio between the Debye length and its geometric dimension on the order of 10^(-4), so the Fourier-Bessel series can be approximated by elementary functions. The time development of the system is characterized by two time constants, τ_c and τ_g. The constant τ_c describes the approach to the stationary state of the total charge and the potential; τ_c is several orders of magnitude smaller than the geometry-dependent constant τ_g, which is on the order of 10 ms and characterizes the transition to the stationary state of the single ion densities.
REJEKI, Dwi Sarwani Sri; NURHAYATI, Nunung; AJI, Budi; MURHANDARWATI, E. Elsa Herdiana; KUSNANTO, Hari
2018-01-01
Background: Climatic and weather factors are important determinants of the transmission of vector-borne diseases like malaria. This study aimed to establish relationships between weather factors and malaria cases in endemic areas of Purworejo during 2005–2014, taking human migration and previous case findings into account. Methods: This study employed ecological time series analysis using monthly data. The independent variables were maximum temperature, minimum temperature, maximum humidity, minimum humidity, precipitation, human migration, and previous malaria cases, while the dependent variable was positive malaria cases. Three count data regression models, i.e. the Poisson model, quasi-Poisson model, and negative binomial model, were applied to measure the relationship, and the least Akaike Information Criterion (AIC) value was used to select the best model. Negative binomial regression was selected as the best model. Results: The model showed that humidity (lag 2), precipitation (lag 3), precipitation (lag 12), migration (lag 1) and previous malaria cases (lag 12) had a significant relationship with malaria cases. Conclusion: Weather, migration and previous malaria case factors need to be considered as prominent indicators for projecting increases in malaria cases. PMID:29900134
Arcentales, Andrés; Giraldo, Beatriz F; Caminal, Pere; Benito, Salvador; Voss, Andreas
2011-01-01
The autonomic nervous system regulates the behavior of the cardiac and respiratory systems, and its assessment during ventilator weaning can provide information about physio-pathological imbalances. This work proposes a nonlinear analysis of the complexity of heart rate variability (HRV) and breathing duration (T(Tot)) applying recurrence plots (RP) and their interaction, the joint recurrence plot (JRP). A total of 131 patients on weaning trials from mechanical ventilation were analyzed: 92 patients with successful weaning (group S) and 39 patients who failed to maintain spontaneous breathing (group F). The results show that parameters such as determinism (DET), average diagonal line length (L), and entropy (ENTR) are statistically significant with RP for the T(Tot) series, but not with HRV. When comparing the groups with JRP, all parameters were relevant. In all cases, mean values of the recurrence quantification analysis are higher in group S than in group F. The main differences between groups were found in the diagonal and vertical structures of the joint recurrence plot.
Digital geologic map of the Butler Peak 7.5' quadrangle, San Bernardino County, California
Miller, Fred K.; Matti, Jonathan C.; Brown, Howard J.; digital preparation by Cossette, P. M.
2000-01-01
Open-File Report 00-145, is a digital geologic map database of the Butler Peak 7.5' quadrangle that includes (1) ARC/INFO (Environmental Systems Research Institute) version 7.2.1 Patch 1 coverages, and associated tables, (2) a Portable Document Format (.pdf) file of the Description of Map Units, Correlation of Map Units chart, and an explanation of symbols used on the map, btlrpk_dcmu.pdf, (3) a Portable Document Format file of this Readme, btlrpk_rme.pdf (the Readme is also included as an ascii file in the data package), and (4) a PostScript plot file of the map, Correlation of Map Units, and Description of Map Units on a single sheet, btlrpk.ps. No paper map is included in the Open-File report, but the PostScript plot file (number 4 above) can be used to produce one. The PostScript plot file generates a map, peripheral text, and diagrams in the editorial format of USGS Geologic Investigation Series (I-series) maps.
García-González, Miguel A; Fernández-Chimeno, Mireya; Ramos-Castro, Juan
2009-02-01
An analysis of the errors due to the finite resolution of RR time series in the estimation of the approximate entropy (ApEn) is described. The quantification errors in the discrete RR time series produce considerable errors in the ApEn estimation (bias and variance) when the signal variability or the sampling frequency is low. Similar errors can be found in indices related to the quantification of recurrence plots. An easy way to calculate a figure of merit [the signal to resolution of the neighborhood ratio (SRN)] is proposed in order to predict when the bias in the indices could be high. When SRN is close to an integer value n, the bias is higher than when near n - 1/2 or n + 1/2. Moreover, if SRN is close to an integer value, the lower this value, the greater the bias is.
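Approximate entropy itself is straightforward to compute, which makes the resolution sensitivity described above easy to reproduce. Below is a textbook-style sketch following Pincus's definition (pattern length m, tolerance r, self-matches included), not the authors' implementation; the periodic test series is a hypothetical example.

```python
import math

def approximate_entropy(x, m, r):
    """ApEn(m, r): regularity of series x with pattern length m and
    tolerance r (same units as x). Self-matches are counted, as in
    Pincus's original definition, so the log argument is never zero."""
    n = len(x)

    def phi(m_len):
        patterns = [x[i:i + m_len] for i in range(n - m_len + 1)]
        total = 0.0
        for p in patterns:
            matches = sum(
                1 for q in patterns
                if max(abs(a - b) for a, b in zip(p, q)) <= r
            )
            total += math.log(matches / len(patterns))
        return total / len(patterns)

    return phi(m) - phi(m + 1)

# A strictly periodic series is maximally regular -> ApEn near zero
periodic = [1.0, 2.0] * 50
val = approximate_entropy(periodic, m=2, r=0.5)
```

The quantization effect the paper analyzes enters through `r`: when the series resolution is coarse relative to `r`, match counts jump discontinuously as values cross the tolerance, biasing the estimate.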
The Reassuring Role of TV's Continuing Characters.
ERIC Educational Resources Information Center
Piltch, Charles N.
1979-01-01
The author analyzes the series form of television program, particularly the qualities and functions of the continuing characters and their relationship to the plot. He discusses the reassuring psychological effects of a TV series on the audience and the implications of a decline in this type of programing. (SJL)
W. Cohen; H. Andersen; S. Healey; G. Moisen; T. Schroeder; C. Woodall; G. Domke; Z. Yang; S. Stehman; R. Kennedy; C. Woodcock; Z. Zhu; J. Vogelmann; D. Steinwand; C. Huang
2014-01-01
The authors are developing a REDD+ MRV system that tests different biomass estimation frameworks and components. Design-based inference from a costly field plot network was compared to sampling with LiDAR strips and a smaller set of plots in combination with Landsat for disturbance monitoring. Biomass estimation uncertainties associated with these different data sets...
Red spruce ecosystem level changes following 14 years of chronic N fertilization
Steven G. McNulty; Johnny Boggs; John D. Aber; Lindsey Rustad; Allison Magill
2005-01-01
In the early 1980s, nitrogen (N) deposition was first postulated as a cause of N saturation and spruce mortality across the northeastern US. In 1988, a series of high elevation spruce-fir forest N addition plots were established on Mt. Ascutney in southeastern Vermont to test this hypothesis. The paired plots each received, in addition to ambient N deposition, 15.7 kg...
Detecting recurrence domains of dynamical systems by symbolic dynamics.
beim Graben, Peter; Hutt, Axel
2013-04-12
We propose an algorithm for the detection of recurrence domains of complex dynamical systems from time series. Our approach exploits the characteristic checkerboard texture of recurrence domains exhibited in recurrence plots. In phase space, recurrence plots yield intersecting balls around sampling points that could be merged into cells of a phase space partition. We construct this partition by a rewriting grammar applied to the symbolic dynamics of time indices. A maximum entropy principle defines the optimal size of intersecting balls. The final application to high-dimensional brain signals yields an optimal symbolic recurrence plot revealing functional components of the signal.
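A recurrence plot is just a thresholded distance matrix of the (possibly embedded) time series, and the checkerboard texture the algorithm exploits is visible directly in that binary matrix. A minimal one-dimensional sketch, with a hypothetical period-3 series:

```python
def recurrence_matrix(x, eps):
    """R[i][j] = 1 when samples i and j recur, i.e. lie within eps of
    each other; this binary matrix is what a recurrence plot displays.
    One-dimensional sketch; real applications usually embed x first."""
    n = len(x)
    return [[1 if abs(x[i] - x[j]) <= eps else 0 for j in range(n)]
            for i in range(n)]

# A period-3 series recurs along diagonals offset by multiples of 3
x = [0.0, 1.0, 2.0, 0.0, 1.0, 2.0]
R = recurrence_matrix(x, eps=0.5)
```

In the paper's phase-space picture, each row of `R` marks which sampling points fall inside the eps-ball around point i; merging overlapping balls is what yields the partition cells.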
Presenting simulation results in a nested loop plot.
Rücker, Gerta; Schwarzer, Guido
2014-12-12
Statisticians investigate new methods in simulations to evaluate their properties for future real data applications. Results are often presented in a number of figures, e.g., Trellis plots. We had conducted a simulation study on six statistical methods for estimating the treatment effect in binary outcome meta-analyses, where selection bias (e.g., publication bias) was suspected because of apparent funnel plot asymmetry. We varied five simulation parameters: true treatment effect, extent of selection, event proportion in the control group, heterogeneity parameter, and number of studies in the meta-analysis. In combination, this yielded a total of 768 scenarios. To present all results using Trellis plots, 12 figures were needed. Choosing bias as the criterion of interest, we present a 'nested loop plot', a diagram type that aims to show all simulation results in one plot. The idea is to bring all scenarios into a lexicographical order and arrange them consecutively on the horizontal axis of a plot, while the treatment effect estimate is presented on the vertical axis. The plot illustrates how the parameters simultaneously influenced the estimate. It can be combined with a Trellis plot in a so-called hybrid plot. Nested loop plots may also be applied to other criteria, such as the variance of estimation. The nested loop plot, similar to a time series graph, summarizes all information about the results of a simulation study with respect to a chosen criterion in one picture, and provides a suitable alternative or addition to Trellis plots.
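The core of the nested loop plot, a lexicographical ordering of all scenarios along the horizontal axis, can be sketched with itertools.product. This is an illustration of the ordering idea only, not the authors' implementation; the two-parameter study below is hypothetical (the paper's actual grid has five parameters and 768 scenarios).

```python
from itertools import product

def nested_loop_order(params):
    """Arrange all simulation scenarios in lexicographical order.
    `params` maps parameter name -> list of levels, outermost loop
    first; the index of each returned scenario is its x-position."""
    names = list(params)
    combos = product(*(params[n] for n in names))
    return [dict(zip(names, combo)) for combo in combos]

# Hypothetical study: 2 treatment effects x 3 heterogeneity levels = 6 scenarios
scenarios = nested_loop_order({
    "true_effect": [0.0, 0.5],
    "heterogeneity": ["low", "mid", "high"],
})
```

Plotting the chosen criterion (e.g., bias) against the scenario index, with step functions underneath tracing each parameter's current level, reproduces the "nested loop" reading order of the diagram.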
Impact of traffic congestion on road accidents: a spatial analysis of the M25 motorway in England.
Wang, Chao; Quddus, Mohammed A; Ison, Stephen G
2009-07-01
Traffic congestion and road accidents are two external costs of transport and the reduction of their impacts is often one of the primary objectives for transport policy makers. The relationship between traffic congestion and road accidents however is not apparent and less studied. It is speculated that there may be an inverse relationship between traffic congestion and road accidents, and as such this poses a potential dilemma for transport policy makers. This study aims to explore the impact of traffic congestion on the frequency of road accidents using a spatial analysis approach, while controlling for other relevant factors that may affect road accidents. The M25 London orbital motorway, divided into 70 segments, was chosen to conduct this study and relevant data on road accidents, traffic and road characteristics were collected. A robust technique has been developed to map M25 accidents onto its segments. Since existing studies have often used a proxy to measure the level of congestion, this study has employed a precise congestion measurement. A series of Poisson based non-spatial (such as Poisson-lognormal and Poisson-gamma) and spatial (Poisson-lognormal with conditional autoregressive priors) models have been used to account for the effects of both heterogeneity and spatial correlation. The results suggest that traffic congestion has little or no impact on the frequency of road accidents on the M25 motorway. All other relevant factors have provided results consistent with existing studies.
García Maset, Leonor; González, Lidia Blasco; Furquet, Gonzalo Llop; Suay, Francisco Montes; Marco, Roberto Hernández
2016-11-01
Time series analysis provides information on blood glucose dynamics that is unattainable with conventional glycemic variability (GV) indices. To date, no studies have been published on these parameters in pediatric patients with type 1 diabetes. Our aim is to evaluate the relationship between time series analysis, conventional GV indices, and glycosylated hemoglobin (HbA1c) levels. This is a cross-sectional study of 41 children and adolescents with type 1 diabetes. Glucose monitoring was carried out continuously for 72 h to study the following GV indices: standard deviation (SD) of glucose levels (mg/dL), coefficient of variation (%), interquartile range (IQR; mg/dL), mean amplitude of the largest glycemic excursions (MAGE), and continuous overlapping net glycemic action (CONGA). The time series analysis was conducted by means of detrended fluctuation analysis (DFA) and the Poincaré plot. Time series parameters (the DFA alpha coefficient and elements of the ellipse of the Poincaré plot) correlated well with the more conventional GV indices. Patients were grouped according to the tertiles of these indices, the tertiles of eccentricity (1: 12.56-16.98, 2: 16.99-21.91, 3: 21.92-41.03), and the value of the DFA alpha coefficient (>1.5 or ≤1.5). No differences were observed in the HbA1c of patients grouped by GV index criteria; however, significant differences were found in patients grouped by alpha coefficient and eccentricity, not only in HbA1c but also in SD of glucose, IQR, and the CONGA index. The loss of complexity in glycemic homeostasis is accompanied by an increase in variability.
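The Poincaré plot descriptors mentioned above can be computed directly from a series and its one-step-lagged copy. This is the standard SD1/SD2 construction; the paper's exact eccentricity formula is not reproduced here, so the SD2/SD1 ratio below is only a stand-in for the ellipse's elongation, and the glucose trace is synthetic:

```python
import numpy as np

def poincare_sd(x):
    """SD1/SD2 descriptors of the Poincaré plot (series vs. lagged series).
    SD1 measures short-term variability (spread perpendicular to the
    identity line), SD2 long-term variability (spread along it)."""
    x = np.asarray(x, float)
    a, b = x[:-1], x[1:]
    sd1 = np.sqrt(np.var(b - a) / 2.0)
    sd2 = np.sqrt(np.var(b + a) / 2.0)
    return sd1, sd2

# Synthetic 72-h glucose trace, 5-min sampling (864 points); real CGM data
# would be used in practice.
rng = np.random.default_rng(0)
glucose = 120.0 + np.cumsum(rng.normal(0.0, 2.0, 864))

sd1, sd2 = poincare_sd(glucose)
ratio = sd2 / sd1   # elongation of the fitted ellipse (illustrative measure)
```

A nearly circular point cloud (ratio near 1) indicates uncorrelated fluctuations, while a strongly elongated ellipse reflects slow trends dominating beat-to-beat (or reading-to-reading) changes.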
Lord, Dominique
2006-07-01
There has been considerable research conducted on the development of statistical models for predicting crashes on highway facilities. Despite numerous advancements in the estimation tools for statistical models, the most common probabilistic structures used for modeling motor vehicle crashes remain the traditional Poisson and Poisson-gamma (or negative binomial) distributions; when crash data exhibit over-dispersion, the Poisson-gamma model is usually favored by transportation safety modelers. Crash data collected for safety studies are often characterized by low sample mean values. Studies have shown that the goodness-of-fit of statistical models produced from such datasets can be significantly affected. This issue has been termed the "low mean problem" (LMP). Despite recent developments on methods to circumvent the LMP and test the goodness-of-fit of models developed using such datasets, no work has so far examined how the LMP affects the fixed dispersion parameter of Poisson-gamma models used for modeling motor vehicle crashes. The dispersion parameter plays an important role in many types of safety studies and should, therefore, be reliably estimated. The primary objective of this research project was to verify whether the LMP affects the estimation of the dispersion parameter and, if so, to determine the magnitude of the problem. The secondary objective was to determine the effects of an unreliably estimated dispersion parameter on common analyses performed in highway safety studies. To accomplish these objectives, a series of Poisson-gamma distributions were simulated using different values of the mean, the dispersion parameter, and the sample size.
Three estimators commonly used by transportation safety modelers for estimating the dispersion parameter of Poisson-gamma models were evaluated: the method of moments, weighted regression, and maximum likelihood. To complement the outcome of the simulation study, Poisson-gamma models were fitted to crash data collected in Toronto, Ont., characterized by a low sample mean and small sample size. The study shows that a low sample mean combined with a small sample size can seriously affect the estimation of the dispersion parameter, no matter which estimator is used in the estimation process. The probability that the dispersion parameter is unreliably estimated increases significantly as the sample mean and sample size decrease. The results further show that an unreliably estimated dispersion parameter can significantly undermine empirical Bayes (EB) estimates as well as the estimation of confidence intervals for the gamma mean and predicted response. The paper ends with recommendations for minimizing the likelihood of producing Poisson-gamma models with an unreliable dispersion parameter for modeling motor vehicle crashes.
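The low mean problem is easy to see in simulation with the simplest of the three estimators, the method of moments (for a Poisson-gamma model with Var = μ + αμ², the moment estimator is α̂ = (s² − ȳ)/ȳ²). Parameter values below are illustrative; the study also examined weighted-regression and maximum-likelihood estimators:

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_poisson_gamma(mu, alpha, n, rng):
    """Draw n counts from a Poisson-gamma (negative binomial) mixture with
    mean mu and dispersion alpha, so that Var = mu + alpha * mu**2."""
    lam = rng.gamma(shape=1.0 / alpha, scale=alpha * mu, size=n)
    return rng.poisson(lam)

def mom_dispersion(y):
    """Method-of-moments estimator alpha_hat = (s^2 - ybar) / ybar^2.
    It can come out negative (inadmissible) or wildly unstable when the
    sample mean is low -- the 'low mean problem'."""
    ybar, s2 = y.mean(), y.var(ddof=1)
    return (s2 - ybar) / ybar**2

# Large sample, moderate mean: the estimator behaves reasonably.
y_big = simulate_poisson_gamma(mu=5.0, alpha=0.5, n=20_000, rng=rng)
alpha_big = mom_dispersion(y_big)

# Small sample, low mean: repeated simulation exposes the instability.
ests = [mom_dispersion(simulate_poisson_gamma(0.3, 0.5, 50, rng))
        for _ in range(200)]
unstable = sum(e <= 0 for e in ests)   # count of inadmissible estimates
```

With a mean of 5 and 20,000 observations the estimate lands near the true α = 0.5, while at mean 0.3 with 50 observations a sizable fraction of replications produce negative (meaningless) dispersion estimates.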
Space Object Classification Using Fused Features of Time Series Data
NASA Astrophysics Data System (ADS)
Jia, B.; Pham, K. D.; Blasch, E.; Shen, D.; Wang, Z.; Chen, G.
In this paper, a fused feature vector consisting of raw time series and texture feature information is proposed for space object classification. The time series data includes historical orbit trajectories and asteroid light curves. The texture feature is derived from recurrence plots using Gabor filters for both unsupervised learning and supervised learning algorithms. The simulation results show that the classification algorithms using the fused feature vector achieve better performance than those using raw time series or texture features only.
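A recurrence plot, the image from which the texture features above are derived, can be built directly from a one-dimensional series by thresholding pairwise distances. The threshold `eps` and the test signal below are illustrative assumptions, and the Gabor-filter stage is omitted:

```python
import numpy as np

def recurrence_plot(x, eps):
    """Binary recurrence matrix R[i, j] = 1 wherever |x_i - x_j| < eps.
    Texture features (e.g. Gabor filter responses) can then be extracted
    from R as if it were a grayscale image."""
    x = np.asarray(x, float)
    d = np.abs(x[:, None] - x[None, :])   # pairwise distance matrix
    return (d < eps).astype(np.uint8)

# A periodic signal produces the characteristic diagonal-line texture.
t = np.linspace(0.0, 4.0 * np.pi, 200)
R = recurrence_plot(np.sin(t), eps=0.1)
```

By construction the matrix is symmetric with an all-ones main diagonal; periodic orbits and light curves leave diagonal stripes whose spacing encodes the period.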
NASA Astrophysics Data System (ADS)
Ahdika, Atina; Lusiyana, Novyan
2017-02-01
The World Health Organization (WHO) has noted Indonesia as the country with the highest number of dengue hemorrhagic fever (DHF) cases in Southeast Asia. There is no vaccine or specific treatment for DHF, so prevention efforts by both government and residents are essential. In statistics, several methods can be used to predict the number of DHF cases as a reference for prevention. In this paper, a discrete time series model, specifically the INAR(1)-Poisson model, and a Markov prediction model (MPM) are used to predict the number of DHF patients in West Java, Indonesia. The results show that the MPM is the best model, since it has the smallest mean absolute error (MAE) and mean absolute percentage error (MAPE).
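A minimal sketch of a Poisson INAR(1) process and its one-step conditional-mean forecast is given below. The parameter values are illustrative, not estimates for the West Java data:

```python
import numpy as np

rng = np.random.default_rng(7)

def simulate_inar1(alpha, lam, n, rng, x0=0):
    """Poisson INAR(1): X_t = alpha ∘ X_{t-1} + eps_t, where '∘' denotes
    binomial thinning (each of the X_{t-1} existing cases 'survives' with
    probability alpha) and eps_t ~ Poisson(lam) are new cases."""
    x = np.empty(n, dtype=int)
    x[0] = x0
    for t in range(1, n):
        x[t] = rng.binomial(x[t - 1], alpha) + rng.poisson(lam)
    return x

def forecast_inar1(x_last, alpha, lam):
    """One-step-ahead conditional mean: E[X_{t+1} | X_t] = alpha*X_t + lam."""
    return alpha * x_last + lam

x = simulate_inar1(alpha=0.4, lam=3.0, n=5000, rng=rng)
# Stationary mean of this process is lam / (1 - alpha) = 3 / 0.6 = 5.
```

The thinning operator keeps the process integer-valued, which is why INAR models suit case counts better than a Gaussian AR(1).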
Geologic map of the Fifteenmile Valley 7.5' quadrangle, San Bernardino County, California
Miller, F.K.; Matti, J.C.
2001-01-01
Open-File Report OF 01-132 contains a digital geologic map database of the Fifteenmile Valley 7.5’ quadrangle, San Bernardino County, California, that includes: 1. ARC/INFO (Environmental Systems Research Institute, http://www.esri.com) version 7.2.1 coverages of the various elements of the geologic map. 2. A PostScript file to plot the geologic map on a topographic base, containing a Correlation of Map Units diagram, a Description of Map Units, an index map, and a regional structure map. 3. Portable Document Format (.pdf) files of: a. This Readme; includes, in Appendix I, data contained in fif_met.txt. b. The same graphic as plotted in 2 above. (Test plots have not produced 1:24,000-scale map sheets. The Adobe Acrobat page size setting influences map scale.) The Correlation of Map Units (CMU) and Description of Map Units (DMU) are in the editorial format of USGS Miscellaneous Investigations Series (I-series) maps. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation name, age, and lithology. Even though this is an author-prepared report, every attempt has been made to closely adhere to the stratigraphic nomenclature of the U.S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3b above) or plotting the PostScript file (2 above). If roads in some areas, especially forest roads that parallel topographic contours, do not show well on plots of the geologic map, we recommend using the USGS Fifteenmile Valley 7.5’ topographic quadrangle in conjunction with the geologic map.
NASA Astrophysics Data System (ADS)
Huo, Chengyu; Huang, Xiaolin; Zhuang, Jianjun; Hou, Fengzhen; Ni, Huangjing; Ning, Xinbao
2013-09-01
The Poincaré plot is one of the most important approaches in human cardiac rhythm analysis. However, further investigations are still needed to concentrate on techniques that can characterize the dispersion of the points displayed by a Poincaré plot. Based on a modified Poincaré plot, we provide a novel measurement named distribution entropy (DE) and propose a quadrantal multi-scale distribution entropy analysis (QMDE) for the quantitative descriptions of the scatter distribution patterns in various regions and temporal scales. We apply this method to the heartbeat interval series derived from healthy subjects and congestive heart failure (CHF) sufferers, respectively, and find that the discriminations between them are most significant in the first quadrant, which implies significant impacts on vagal regulation brought about by CHF. We also investigate the day-night differences of young healthy people, and it is shown that the results present a clearly circadian rhythm, especially in the first quadrant. In addition, the multi-scale analysis indicates that the results of healthy subjects and CHF sufferers fluctuate in different trends with variation of the scale factor. The same phenomenon also appears in circadian rhythm investigations of young healthy subjects, which implies that the cardiac dynamic system is affected differently in various temporal scales by physiological or pathological factors.
ERIC Educational Resources Information Center
Maxwell, Jane Carlisle; Pullum, Thomas W.
2001-01-01
Applied the capture-recapture model, through Poisson regression, to a time series of data on admissions to treatment from 1987 to 1996 to estimate the number of heroin addicts in Texas who are "at risk" for treatment. The entire data set produced estimates that were lower and more plausible than those produced by drawing samples,…
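The study used a Poisson-regression formulation; the classical two-sample Lincoln-Petersen estimator below illustrates the capture-recapture idea in its simplest form (the admission counts are hypothetical):

```python
def lincoln_petersen(n1, n2, m):
    """Estimate total population size from two overlapping samples:
    n1 individuals observed in the first sample, n2 in the second,
    m observed in both. Assumes a closed population and equal
    catchability -- strong assumptions that regression-based
    capture-recapture methods relax."""
    if m == 0:
        raise ValueError("no recaptures: estimate is undefined")
    return n1 * n2 / m

# E.g. 100 admissions in one period, 80 in another, 20 persons in both:
estimate = lincoln_petersen(100, 80, 20)  # -> 400.0
```

The intuition: the recapture fraction m/n2 estimates the fraction of the whole population covered by the first sample, so N ≈ n1/(m/n2).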
Double Fourier Series Solution of Poisson’s Equation on a Sphere.
1980-10-29
algebraic systems, the solution of these systems, and the inverse transform of the solution in Fourier space back to physical space. [Multiply each count in steps (2) through (5) by K.] Inverse transform u_m, j = 1, …, J − 1, to obtain u_k; set u(P) = u_0(P). [K(J − 1) log₂ K operations]
Forecasting intentional wildfires using temporal and spatiotemporal autocorrelations
Jeffrey P. Prestemon; María L. Chas-Amil; Julia M. Touza; Scott L. Goodrick
2012-01-01
We report daily time series models containing both temporal and spatiotemporal lags, which are applied to forecasting intentional wildfires in Galicia, Spain. Models are estimated independently for each of the 19 forest districts in Galicia using a 1999–2003 training dataset and evaluated out-of-sample with a 2004–06 dataset. Poisson autoregressive models of order P…
A normalized model for the half-bridge series resonant converter
NASA Technical Reports Server (NTRS)
King, R.; Stuart, T. A.
1981-01-01
Closed-form steady-state equations are derived for the half-bridge series resonant converter with a rectified (dc) load. Normalized curves for various currents and voltages are then plotted as a function of the circuit parameters. Experimental results based on a 10-kHz converter are presented for comparison with the calculations.
NCEP Air Quality Forecast(AQF) Verification. NOAA/NWS/NCEP/EMC
Geologic map of the Valjean Hills 7.5' quadrangle, San Bernardino County, California
Calzia, J.P.; Troxel, Bennie W.; digital database by Raumann, Christian G.
2003-01-01
FGDC-compliant metadata for the ARC/INFO coverages. The Correlation of Map Units and Description of Map Units is in the editorial format of USGS Geologic Investigations Series (I-series) maps but has not been edited to comply with I-map standards. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation-name, age, and lithology. Even though this is an Open-File Report and includes the standard USGS Open-File disclaimer, the report closely adheres to the stratigraphic nomenclature of the U.S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3 above) or plotting the postscript file (2 above).
Chaos and Forecasting - Proceedings of the Royal Society Discussion Meeting
NASA Astrophysics Data System (ADS)
Tong, Howell
1995-04-01
The Table of Contents for the full book PDF is as follows: * Preface * Orthogonal Projection, Embedding Dimension and Sample Size in Chaotic Time Series from a Statistical Perspective * A Theory of Correlation Dimension for Stationary Time Series * On Prediction and Chaos in Stochastic Systems * Locally Optimized Prediction of Nonlinear Systems: Stochastic and Deterministic * A Poisson Distribution for the BDS Test Statistic for Independence in a Time Series * Chaos and Nonlinear Forecastability in Economics and Finance * Paradigm Change in Prediction * Predicting Nonuniform Chaotic Attractors in an Enzyme Reaction * Chaos in Geophysical Fluids * Chaotic Modulation of the Solar Cycle * Fractal Nature in Earthquake Phenomena and its Simple Models * Singular Vectors and the Predictability of Weather and Climate * Prediction as a Criterion for Classifying Natural Time Series * Measuring and Characterising Spatial Patterns, Dynamics and Chaos in Spatially-Extended Dynamical Systems and Ecologies * Non-Linear Forecasting and Chaos in Ecology and Epidemiology: Measles as a Case Study
Modeling the full-bridge series-resonant power converter
NASA Technical Reports Server (NTRS)
King, R. J.; Stuart, T. A.
1982-01-01
A steady state model is derived for the full-bridge series-resonant power converter. Normalized parametric curves for various currents and voltages are then plotted versus the triggering angle of the switching devices. The calculations are compared with experimental measurements made on a 50 kHz converter and a discussion of certain operating problems is presented.
Direct Behavior Rating: An Evaluation of Time-Series Interpretations as Consequential Validity
ERIC Educational Resources Information Center
Christ, Theodore J.; Nelson, Peter M.; Van Norman, Ethan R.; Chafouleas, Sandra M.; Riley-Tillman, T. Chris
2014-01-01
Direct Behavior Rating (DBR) is a repeatable and efficient method of behavior assessment that is used to document teacher perceptions of student behavior in the classroom. Time-series data can be graphically plotted and visually analyzed to evaluate patterns of behavior or intervention effects. This study evaluated the decision accuracy of novice…
Hydrodynamic measurements in Suisun Bay, California, 1992-93
Gartner, Jeffrey W.; Burau, Jon R.
1999-01-01
Sea level, velocity, temperature, and salinity (conductivity and temperature) data collected in Suisun Bay, California, from December 11, 1992, through May 31, 1993, by the U.S. Geological Survey are documented in this report. Sea-level data were collected at four locations and temperature and salinity data were collected at seven locations. Velocity data were collected at three locations using acoustic Doppler current profilers and at four other locations using point velocity meters. Sea-level and velocity data are presented in three forms (1) harmonic analysis results, (2) time-series plots (sea level, current speed, and current direction versus time), and (3) time-series plots of the low-pass filtered data. Temperature and salinity data are presented as plots of raw and low-pass filtered time series. The velocity and salinity data collected during this study document a period when the residual current patterns and salt field were significantly altered by large Delta outflow (three peaks in excess of 2,000 cubic meters per second). Residual current profiles were consistently seaward with magnitudes that fluctuated primarily in concert with Delta outflow and secondarily with the spring-neap tide cycle. The freshwater inputs advected salinity seaward of Suisun Bay for most of this study. Except for a 10-day period at the beginning of the study, dynamically significant salinities (>2) were seaward of Suisun Bay, which resulted in little or no gravitational circulation transport.
ERIC Educational Resources Information Center
Chakroff, Marilyn; Druben, Laurel, Ed.
This is the French translation of a "how-to" manual, designed as a working and teaching tool for extension agents as they establish and/or maintain local fish pond operations. The manual presents information to facilitate technology transfer and to provide a clear guide for warm water fish pond construction and management. Major topic…
NASA Astrophysics Data System (ADS)
Rud'ko, S. V.; Petrov, P. Yu.; Kuznetsov, A. B.; Shatsillo, A. V.; Petrov, O. L.
2017-12-01
New data were obtained on δ13Ccarb and δ18O variations in the sequence of deposits of the Dal'nyaya Taiga series at the western and eastern flanks of the Ura anticline. The summary δ13C curve was plotted in view of the correlation of sequence-stratigraphic data of the basin analysis. A series of positive anomalies was found within the succession. Alternatives for global chemostratigraphic correlation of the Dal'nyaya Taiga series of the Ura uplift were considered.
Remote Sensing Time Series Product Tool
NASA Technical Reports Server (NTRS)
Predos, Don; Ryan, Robert E.; Ross, Kenton W.
2006-01-01
The TSPT (Time Series Product Tool) software was custom-designed for NASA to rapidly create and display single-band and band-combination time series, such as NDVI (Normalized Difference Vegetation Index) images, for wide-area crop surveillance and for other time-critical applications. The TSPT, developed in MATLAB, allows users to create and display various MODIS (Moderate Resolution Imaging Spectroradiometer) or simulated VIIRS (Visible/Infrared Imager Radiometer Suite) products as single images, as time series plots at a selected location, or as temporally processed image videos. Manually creating these types of products is extremely labor intensive; the TSPT makes the process simple and efficient. MODIS is ideal for monitoring large crop areas because of its wide swath (2330 km), its relatively small ground sample distance (250 m), and its high temporal revisit rate (twice daily). Furthermore, because MODIS imagery is acquired daily, rapid changes in vegetative health can potentially be detected. The new TSPT technology provides users with the ability to temporally process high-revisit-rate satellite imagery, such as that acquired from MODIS and from its successor, the VIIRS. The TSPT features the important capability of fusing data from both MODIS instruments onboard the Terra and Aqua satellites, which drastically improves cloud statistics. With the TSPT, MODIS metadata are used to find and optionally remove bad and suspect data. Noise removal and temporal processing techniques allow users to create low-noise time series plots and image videos and to select settings and thresholds that tailor particular output products. The TSPT GUI (graphical user interface) provides an interactive environment for crafting what-if scenarios by enabling a user to repeat product generation using different settings and thresholds.
The TSPT Application Programming Interface provides more fine-tuned control of product generation, allowing experienced programmers to bypass the GUI and to create more user-specific output products, such as comparison time plots or images. This type of time series analysis tool for remotely sensed imagery could be the basis of a large-area vegetation surveillance system. The TSPT has been used to generate NDVI time series over growing seasons in California and Argentina and for hurricane events, such as Hurricane Katrina.
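NDVI itself, the index the tool computes from the red and near-infrared bands, is a one-line band combination. The reflectance values below are synthetic, standing in for a single pixel sampled across a growing season:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - red) / (NIR + red).
    Values approaching +1 indicate dense, healthy vegetation; bare soil
    and water fall near or below zero."""
    nir = np.asarray(nir, float)
    red = np.asarray(red, float)
    return (nir - red) / (nir + red)

# Synthetic single-pixel reflectance time series over a growing season.
nir_series = np.array([0.30, 0.45, 0.60, 0.55, 0.35])
red_series = np.array([0.20, 0.12, 0.08, 0.10, 0.18])
series = ndvi(nir_series, red_series)   # rises toward peak green-up, then falls
```

Healthy vegetation absorbs red light and strongly reflects near-infrared, so green-up raises NIR, lowers red, and pushes the index toward 1.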
A statistical data analysis and plotting program for cloud microphysics experiments
NASA Technical Reports Server (NTRS)
Jordan, A. J.
1981-01-01
The analysis software developed for atmospheric cloud microphysics experiments conducted in the laboratory as well as aboard a KC-135 aircraft is described. A group of four programs was developed and implemented on a Hewlett Packard 1000 series F minicomputer running under HP's RTE-IVB operating system. The programs control and read data from a MEMODYNE Model 3765-8BV cassette recorder, format the data on the Hewlett Packard disk subsystem, and generate statistical data (mean, variance, standard deviation) and voltage and engineering unit plots on a user selected plotting device. The programs are written in HP FORTRAN IV and HP ASSEMBLY Language with the graphics software using the HP 1000 Graphics. The supported plotting devices are the HP 2647A graphics terminal, the HP 9872B four color pen plotter, and the HP 2608A matrix line printer.
U-series dating of impure carbonates: An isochron technique using total-sample dissolution
Bischoff, J.L.; Fitzpatrick, J.A.
1991-01-01
U-series dating is a well-established technique for age determination of Late Quaternary carbonates. Materials of sufficient purity for nominal dating, however, are not as common as materials with mechanically inseparable aluminosilicate detritus. Detritus contaminates the sample with extraneous Th. We propose that correction for contamination is best accomplished with the isochron technique using total sample dissolution (TSD). Experiments were conducted on artificial mixtures of natural detritus and carbonate and on an impure carbonate of known age. Results show that significant and unpredictable transfer of radionuclides occurs from the detritus to the leachate in commonly used selective leaching procedures. The effects of correcting via leachate-residue pairs and isochron plots were assessed. Isochrons using TSD gave the best results, followed by isochron plots of leachates only.
Volcanic hazard assessment for the Canary Islands (Spain) using extreme value theory
NASA Astrophysics Data System (ADS)
Sobradelo, R.; Martí, J.; Mendoza-Rosas, A. T.; Gómez, G.
2011-10-01
The Canary Islands are an active volcanic region, densely populated and visited by several million tourists every year. Nearly twenty eruptions have been reported through written chronicles in the last 600 yr, suggesting that the probability of a new eruption in the near future is far from zero. This shows the importance of assessing and monitoring the volcanic hazard of the region in order to reduce and manage its potential volcanic risk, and ultimately contribute to the design of appropriate preparedness plans. Hence, the probabilistic analysis of the volcanic eruption time series for the Canary Islands is an essential step for the assessment of volcanic hazard and risk in the area. Such a series describes complex processes involving different types of eruptions over different time scales. Here we propose a statistical method for calculating the probabilities of future eruptions that is most appropriate given the nature of the documented historical eruptive data. We first characterize the eruptions by their magnitudes, and then carry out a preliminary analysis of the data to establish the requirements for the statistical method. Past studies of eruptive time series used conventional statistics and treated the series as a homogeneous process. In this paper, we use a method that accounts for the time-dependence of the series and includes rare or extreme events, in the form of few data on large eruptions, since these data require special methods of analysis. Hence, we use a statistical method from extreme value theory. In particular, we apply a non-homogeneous Poisson process to the historical eruptive data of the Canary Islands to estimate the probability of having at least one volcanic event of a magnitude greater than one in the upcoming years. This is done in three steps: First, we analyze the historical eruptive series to assess independence and homogeneity of the process.
Second, we perform a Weibull analysis of the distribution of repose time between successive eruptions. Third, we analyze the non-homogeneous Poisson process with a generalized Pareto distribution as the intensity function.
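A homogeneous Poisson baseline for "at least one eruption in the next T years," together with the Weibull repose-time survival function used in the second step, can be sketched as follows. The rate is a crude illustration from the roughly 20 eruptions in 600 yr mentioned above, not the paper's non-homogeneous estimate:

```python
import numpy as np

def prob_at_least_one(lam, T):
    """Homogeneous Poisson: P(N(T) >= 1) = 1 - exp(-lam * T),
    with lam in events per year and T in years."""
    return 1.0 - np.exp(-lam * T)

# Crude homogeneous rate: ~20 documented eruptions in ~600 yr.
lam_hat = 20 / 600.0
p20 = prob_at_least_one(lam_hat, 20.0)   # chance of >= 1 eruption in 20 yr

def weibull_survival(t, eta, beta):
    """Weibull survival of repose times: S(t) = exp(-(t/eta)**beta).
    beta = 1 recovers the memoryless exponential (Poisson) case;
    beta < 1 implies a hazard that decreases with repose length
    (eruptions cluster); beta > 1 implies quasi-periodic behavior."""
    return np.exp(-(t / eta) ** beta)
```

The non-homogeneous model in the abstract replaces the constant `lam` with a time-varying intensity and grafts a generalized Pareto tail onto the large-magnitude events.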
A.R. Mason; H.G. Paul
1994-01-01
Procedures for monitoring larval populations of the Douglas-fir tussock moth and the western spruce budworm are recommended based on many years' experience in sampling these species in eastern Oregon and Washington. It is shown that statistically reliable estimates of larval density can be made for a population by sampling host trees in a series of permanent plots in a...
1982-06-01
TABLE OF CONTENTS: APPENDIX A, Scope of Work; B, Merge and Cost Program Documentation; C, FATSCO Program to Compute Time Series Frequency Relationships; D, HEC-DSS Time Series Data File Management System; E, Plan 1 Time Series Data Plots and Annual… The University of Minnesota utilized an early version of the Hydrologic Engineering Center's (HEC) HEC-5C Computer Program. HEC is a Corps of Engineers…
Sin(x)**2 + cos(x)**2 = 1. [programming identities using comparative combinatorial substitutions
NASA Technical Reports Server (NTRS)
Stoutemyer, D. R.
1977-01-01
Attempts to achieve tasteful automatic employment of the identities sin²x + cos²x = 1 and cosh²x − sinh²x = 1 in a manner which truly minimizes the complexity of the resulting expression are described. The disappointments of trigonometric reduction, trigonometric expansion, pattern matching, Poisson series, and De Moivre's theorem are related. The advantages of using the method of comparative combinatorial substitutions are illustrated.
A colorimetric sensor for the selective detection of fluoride ions.
Wan, Chin-Feng; Chir, Jiun-Ly; Wu, An-Tai
2017-05-01
A colorimetric receptor L was prepared. Receptor L selectively senses F⁻ among a series of ions, producing distinct color changes through an intramolecular hydrogen-bond interaction. A Job plot indicated a 1:1 complexation stoichiometry between receptor L and F⁻. The association constant for L·F⁻ in CH₃CN was determined as 9.70 × 10⁴ M⁻¹ using a Stern-Volmer plot. Copyright © 2016 John Wiley & Sons, Ltd.
2015 Summer Series - Andy Weir - The Martian: How Science Drove the Plot
2015-06-30
NASA's Journey to Mars sets a goal beyond anything humanity has ever reached. With this monumental vision, we outline the parameters needed for survival. We put ourselves into the spacesuit of a Martian explorer and design what they need. In his novel The Martian, Andy Weir communicates a quest for survival: a thought experiment in innovation and ingenuity. Andy Weir will describe how science drove the plot of The Martian, illuminating the connectivity between science fiction and science fact.
Geologic map and digital database of the Romoland 7.5' quadrangle, Riverside County, California
Morton, Douglas M.; Digital preparation by Bovard, Kelly R.; Morton, Gregory
2003-01-01
Portable Document Format (.pdf) files of: This Readme; includes, in Appendix I, data contained in rom_met.txt. The same graphic as plotted in 2 above. Test plots have not produced precise 1:24,000-scale map sheets. The Adobe Acrobat page size setting influences map scale. The Correlation of Map Units and Description of Map Units is in the editorial format of USGS Geologic Investigations Series (I-series) maps but has not been edited to comply with I-map standards. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation name, age, and lithology. Where known, grain size is indicated on the map by a subscripted letter or letters following the unit symbols as follows: lg, large boulders; b, boulder; g, gravel; a, arenaceous; s, silt; c, clay; e.g., Qyfa is a predominantly young alluvial fan deposit that is arenaceous. Multiple letters are used for more specific identification or for mixed units, e.g., Qfysa is a silty sand. In some cases, mixed units are indicated by a compound symbol, e.g., Qyf2sc. Even though this is an Open-File Report and includes the standard USGS Open-File disclaimer, the report closely adheres to the stratigraphic nomenclature of the U.S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3b above) or plotting the PostScript file (2 above). This Readme file describes the digital data, such as the types and general contents of the files making up the database, and includes information on how to extract and plot the map and accompanying graphic file. Metadata information can be accessed at http://geo-nsdi.er.usgs.gov/metadata/open-file/03-102 and is included in Appendix I of this Readme.
The critical period of weed control in soybean (Glycine max (L.) Merr.) in north of Iran conditions.
Keramati, Sara; Pirdashti, Hemmatollah; Esmaili, Mohammad Ali; Abbasian, Arastoo; Habibi, Marjaneh
2008-02-01
A field study was conducted in 2006 at Sari Agricultural and Natural Resources University to determine the best time for weed control in the promising soybean line 033. The experiment was arranged in a randomized complete block design with 4 replications and two series of treatments. In the first series, weeds were left in place until the crop reached V2 (second trifoliolate), V4 (fourth trifoliolate), V6 (sixth trifoliolate), R1 (beginning bloom, first flower), R3 (beginning pod), or R5 (beginning seed); they were then removed and the crop kept weed-free for the rest of the season. In the second series, crops were kept weed-free until the above growth stages, after which weeds were allowed to grow in the plots for the rest of the season. Whole-season weedy and weed-free plots were included in the experiment for yield comparison. The results showed that, among the studied traits, grain yield, pod number per plant, and weed biomass were affected significantly by the control and interference treatments. The highest number of pods per plant was obtained from plots kept weed-free for the whole season. The results showed that weed control should be carried out between the V2 (26 days after planting) and R1 (63 days after planting) stages of soybean to obtain maximum grain yield. Thus, it is possible to optimize the timing of weed control, which can serve to reduce the costs and side effects of intensive chemical weed control.
Brain, music, and non-Poisson renewal processes
NASA Astrophysics Data System (ADS)
Bianco, Simone; Ignaccolo, Massimiliano; Rider, Mark S.; Ross, Mary J.; Winsor, Phil; Grigolini, Paolo
2007-06-01
In this paper we show that both music composition and brain function, as revealed by the electroencephalogram (EEG) analysis, are renewal non-Poisson processes living in the nonergodic dominion. To reach this important conclusion we process the data with the minimum spanning tree method, so as to detect significant events, thereby building a sequence of times, which is the time series to analyze. Then we show that in both cases, EEG and music composition, these significant events are the signature of a non-Poisson renewal process. This conclusion is reached using a technique of statistical analysis recently developed by our group, the aging experiment (AE). First, we find that in both cases the distances between two consecutive events are described by nonexponential histograms, thereby proving the non-Poisson nature of these processes. The corresponding survival probabilities Ψ(t) are well fitted by stretched exponentials [Ψ(t) ∝ exp(−(γt)^α), with 0.5 < α < 1]. The second step rests on the adoption of AE, which shows that these are renewal processes. We show that the stretched exponential, due to its renewal character, is the emerging tip of an iceberg, whose underwater part has slow tails with an inverse power-law structure with power index μ = 1 + α. Adopting the AE procedure we find that both EEG and music composition yield μ < 2. On the basis of the recently discovered complexity matching effect, according to which a complex system S with μ_S < 2 responds only to a complex driving signal P with μ_P ⩽ μ_S, we conclude that the results of our analysis may explain the influence of music on the human brain.
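The stretched-exponential survival function and the tail-index relation μ = 1 + α quoted above can be written down directly; the γ and α values below are illustrative choices inside the reported range, not fitted values:

```python
import numpy as np

def survival_stretched(t, gamma, alpha):
    """Stretched-exponential survival Psi(t) = exp(-(gamma * t)**alpha).
    alpha = 1 recovers the ordinary exponential of a Poisson process;
    0.5 < alpha < 1 is the range reported for EEG and music events."""
    return np.exp(-(gamma * np.asarray(t, float)) ** alpha)

alpha = 0.7                 # illustrative, within the reported 0.5 < alpha < 1
mu = 1.0 + alpha            # inverse-power-law index of the waiting-time tail
psi = survival_stretched(np.array([0.0, 1.0, 10.0]), gamma=0.5, alpha=alpha)
```

Since α < 1 implies μ < 2, the waiting-time distribution has an infinite mean in the asymptotic power-law regime, which is what places these processes in the nonergodic dominion.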
Normal forms for Poisson maps and symplectic groupoids around Poisson transversals
NASA Astrophysics Data System (ADS)
Frejlich, Pedro; Mărcuț, Ioan
2018-03-01
Poisson transversals are submanifolds in a Poisson manifold which intersect all symplectic leaves transversally and symplectically. In this communication, we prove a normal form theorem for Poisson maps around Poisson transversals. A Poisson map pulls a Poisson transversal back to a Poisson transversal, and our first main result states that simultaneous normal forms exist around such transversals, for which the Poisson map becomes transversally linear, and intertwines the normal form data of the transversals. Our second result concerns symplectic integrations. We prove that a neighborhood of a Poisson transversal is integrable exactly when the Poisson transversal itself is integrable, and in that case we prove a normal form theorem for the symplectic groupoid around its restriction to the Poisson transversal, which puts all structure maps in normal form. We conclude by illustrating our results with examples arising from Lie algebras.
Manktelow, Bradley N.; Seaton, Sarah E.
2012-01-01
Background: Emphasis is increasingly being placed on the monitoring and comparison of clinical outcomes between healthcare providers. Funnel plots have become a standard graphical methodology to identify outliers; they comprise plotting an outcome summary statistic from each provider against a specified 'target', together with upper and lower control limits. With discrete probability distributions it is not possible to specify the exact probability that an observation from an 'in-control' provider will fall outside the control limits. However, general probability characteristics can be set and specified using interpolation methods. Guidelines recommend that providers falling outside such control limits should be investigated, potentially with significant consequences, so it is important that the properties of the limits are understood. Methods: Control limits for funnel plots for the Standardised Mortality Ratio (SMR) based on the Poisson distribution were calculated using three proposed interpolation methods, and the probability was calculated of an 'in-control' provider falling outside the limits. Examples using published data demonstrate the potential differences in the identification of outliers. Results: The first interpolation method ensured that the probability of an observation from an 'in-control' provider falling outside either limit was always less than a specified nominal probability (p). The second method resulted in such an observation falling outside either limit with a probability that could be either greater or less than p, depending on the expected number of events. The third method led to a probability that was always greater than, or equal to, p. Conclusion: The use of different interpolation methods can lead to differences in the identification of outliers. This is particularly important when the expected number of events is small.
We recommend that users of these methods be aware of the differences, and specify which interpolation method is to be used prior to any analysis. PMID:23029202
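Such interpolated Poisson control limits can be illustrated with a short sketch. The convention used here (linear interpolation of the Poisson CDF between adjacent integers) is an assumption for illustration and is not necessarily any of the paper's three methods:

```python
from math import exp

def poisson_cdf(k, mu):
    """P(X <= k) for X ~ Poisson(mu), by direct summation."""
    if k < 0:
        return 0.0
    term = exp(-mu)
    total = term
    for i in range(1, k + 1):
        term *= mu / i
        total += term
    return total

def interp_quantile(q, mu):
    """Continuous quantile by linear interpolation of the Poisson CDF
    between adjacent integers (one convention; not necessarily the paper's)."""
    k = 0
    while poisson_cdf(k, mu) < q:
        k += 1
    c_lo, c_hi = poisson_cdf(k - 1, mu), poisson_cdf(k, mu)
    return (k - 1) + (q - c_lo) / (c_hi - c_lo)

def smr_funnel_limits(expected, p=0.025):
    """Lower and upper funnel-plot control limits on the SMR scale for a
    provider with `expected` events under the in-control Poisson model."""
    return (interp_quantile(p, expected) / expected,
            interp_quantile(1.0 - p, expected) / expected)

lower, upper = smr_funnel_limits(10.0)   # limits narrow as `expected` grows
```

Plotting `lower` and `upper` against a range of expected counts produces the funnel shape, with observed SMRs overlaid.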
Novel Flood Detection and Analysis Method Using Recurrence Property
NASA Astrophysics Data System (ADS)
Wendi, Dadiyorto; Merz, Bruno; Marwan, Norbert
2016-04-01
Temporal changes in flood hazard are known to be difficult to detect and attribute due to multiple drivers, including processes that are non-stationary and highly variable. These drivers, such as human-induced climate change, natural climate variability, implementation of flood defences, river training, or land use change, can act on a range of space-time scales and may influence or mask each other. Flood time series may show complex behavior that varies over a range of time scales and may cluster in time. This study focuses on the application of recurrence-based data analysis techniques (recurrence plots) for understanding and quantifying spatio-temporal changes in flood hazard in Germany. The recurrence plot is known as an effective tool to visualize the dynamics of phase space trajectories, i.e., trajectories constructed from a time series using an embedding dimension and a time delay, and it is known to be effective in analyzing non-stationary and non-linear time series. The emphasis is on the identification of characteristic recurrence properties that could associate typical dynamic behavior with certain flood situations.
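A recurrence plot of the kind described, built from a time-delay embedding and a distance threshold, can be sketched as follows; the embedding dimension, delay, and the 10%-of-maximum-distance threshold are illustrative choices, not the study's settings:

```python
import numpy as np

def recurrence_plot(x, dim=3, delay=2, eps=None):
    """Binary recurrence matrix of a scalar series via time-delay embedding.
    eps defaults to 10% of the maximum phase-space distance (a common
    heuristic; dim, delay, and eps here are illustrative)."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (dim - 1) * delay
    # Row j of the embedding is (x[j], x[j+delay], ..., x[j+(dim-1)*delay])
    emb = np.column_stack([x[i * delay : i * delay + n] for i in range(dim)])
    dist = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    if eps is None:
        eps = 0.1 * dist.max()
    return (dist <= eps).astype(int)

# A periodic signal recurs along lines parallel to the main diagonal
R = recurrence_plot(np.sin(np.linspace(0.0, 8.0 * np.pi, 200)))
```

Displaying `R` as an image (time index on both axes) gives the familiar recurrence-plot texture; non-stationarity appears as fading or drift away from the diagonal.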
Geologic map of the Sunnymead 7.5' quadrangle, Riverside County, California
Morton, Douglas M.; Matti, Jonathan C.
2001-01-01
a. This Readme; includes, in Appendix I, the data contained in sun_met.txt. b. The same graphic as plotted in 2 above. Test plots have not produced 1:24,000-scale map sheets; the Adobe Acrobat page size setting influences map scale. The Correlation of Map Units and Description of Map Units is in the editorial format of USGS Geologic Investigations Series (I-series) maps but has not been edited to comply with I-map standards. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation name, age, and lithology. Where known, grain size is indicated on the map by a subscripted letter or letters following the unit symbol, as follows: lg, large boulders; b, boulder; g, gravel; a, arenaceous; s, silt; c, clay; e.g., Qyfa is a predominantly young alluvial fan deposit that is arenaceous. Multiple letters are used for more specific identification or for mixed units; e.g., Qfysa is a silty sand. In some cases, mixed units are indicated by a compound symbol, e.g., Qyf2sc. Marine deposits are in part overlain by local, mostly alluvial fan, deposits and are labeled Qomf; grain size follows the f. Even though this is an Open-File Report and includes the standard USGS Open-File disclaimer, the report closely adheres to the stratigraphic nomenclature of the U.S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3b above) or plotting the postscript file (2 above).
NASA Technical Reports Server (NTRS)
Wilson, Robert M.
2001-01-01
Since 1750, the number of cataclysmic volcanic eruptions (volcanic explosivity index (VEI)>=4) per decade spans 2-11, with 96 percent located in the tropics and extra-tropical Northern Hemisphere. A two-point moving average of the volcanic time series has higher values since the 1860's than before, being 8.00 in the 1910's (the highest value) and 6.50 in the 1980's, the highest since the 1910's peak. Because of the usual behavior of the first difference of the two-point moving averages, one infers that its value for the 1990's will measure approximately 6.50 +/- 1, implying that approximately 7 +/- 4 cataclysmic volcanic eruptions should be expected during the present decade (2000-2009). Because cataclysmic volcanic eruptions (especially those having VEI>=5) nearly always have been associated with short-term episodes of global cooling, the occurrence of even one might confuse our ability to assess the effects of global warming. Poisson probability distributions reveal that the probability of one or more events with a VEI>=4 within the next ten years is >99 percent. It is approximately 49 percent for an event with a VEI>=5, and 18 percent for an event with a VEI>=6. Hence, the likelihood that a climatically significant volcanic eruption will occur within the next ten years appears reasonably high.
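The quoted probabilities follow directly from the Poisson model: the chance of at least one event in a decade is 1 − e^(−λ), where λ is the expected number of events per decade. A minimal sketch (the rates below are rough values implied by the abstract, e.g. λ ≈ 7 for VEI>=4, and are assumptions, not fitted values):

```python
from math import exp

def prob_at_least(n, rate):
    """P(N >= n) for N ~ Poisson(rate), via the complementary CDF."""
    term, cdf = exp(-rate), exp(-rate)
    for i in range(1, n):
        term *= rate / i
        cdf += term
    return 1.0 - cdf

# Rough decadal rates consistent with the abstract's figures (assumed)
p_vei4 = prob_at_least(1, 7.0)     # at least one VEI>=4 eruption: >99%
p_vei5 = prob_at_least(1, 0.67)    # rate chosen to match the ~49% figure
```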
Non-parametric and least squares Langley plot methods
NASA Astrophysics Data System (ADS)
Kiedron, P. W.; Michalsky, J. J.
2016-01-01
Langley plots are used to calibrate sun radiometers, primarily for measurement of the aerosol component of the atmosphere that attenuates (scatters and absorbs) incoming direct solar radiation. In principle, the calibration of a sun radiometer is a straightforward application of the Bouguer-Lambert-Beer law, V = V0 exp(−τ·m), where a plot of the logarithm of voltage, ln(V), versus air mass m yields a straight line with intercept ln(V0). This ln(V0) can subsequently be used to solve for τ for any measurement of V and calculation of m. This calibration works well at some high mountain sites, but the application of the Langley plot calibration technique is more complicated at other, more interesting, locales. This paper is concerned with ferreting out calibrations at difficult sites and examining and comparing a number of conventional and non-conventional methods for obtaining successful Langley plots. The 11 techniques discussed indicate that both least squares and various non-parametric techniques produce satisfactory calibrations, with no significant differences among them, when the time series of ln(V0)'s are smoothed and interpolated with median and mean moving window filters.
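The least-squares version of the calibration can be sketched directly from the Bouguer-Lambert-Beer law: regress ln(V) on air mass m, and the intercept is ln(V0). The sketch below uses synthetic data with known V0 and τ (illustrative values, not radiometer measurements):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic Langley data from V = V0*exp(-tau*m), with small multiplicative noise
V0_true, tau_true = 2.5, 0.12
m = np.linspace(1.5, 6.0, 40)            # air mass range of a Langley scan
V = V0_true * np.exp(-tau_true * m) * np.exp(rng.normal(0.0, 0.005, m.size))

# Least-squares Langley fit: ln(V) = ln(V0) - tau*m
slope, intercept = np.polyfit(m, np.log(V), 1)
V0_hat, tau_hat = float(np.exp(intercept)), float(-slope)

# With V0 calibrated, any later measurement gives tau = ln(V0_hat / V) / m
```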
Van Allen Probes Science Gateway and Space Weather Data Processing
NASA Astrophysics Data System (ADS)
Romeo, G.; Barnes, R. J.; Weiss, M.; Fox, N. J.; Mauk, B.; Potter, M.; Kessel, R.
2014-12-01
The Van Allen Probes Science Gateway acts as a centralized interface to the instrument Science Operation Centers (SOCs), provides mission planning tools, and hosts a number of science-related activities such as the mission bibliography. Most importantly, the Gateway acts as the primary site for processing and delivering the VAP Space Weather data to users. Over the past year, the web site has been completely redesigned with a focus on easier navigation and improvements to existing tools such as the orbit plotter, position calculator, and magnetic footprint tool. In addition, a new data plotting facility based on HTML5 has been added, which allows users to interactively plot Van Allen Probes summary and space weather data. Users can tailor the tool to display exactly the plot they wish to see and then share it with other users via either a URL or a QR code. Various types of plots can be created, including simple time series, data plotted as a function of orbital location, and time versus L-shell. We discuss the new Van Allen Probes Science Gateway and the Space Weather Data Pipeline.
NASA Technical Reports Server (NTRS)
Young, Richard D.; Rose, Cheryl A.; Starnes, James H., Jr.
2000-01-01
Results of a geometrically nonlinear finite element parametric study to determine curvature correction factors or bulging factors that account for increased stresses due to curvature for longitudinal and circumferential cracks in unstiffened pressurized cylindrical shells are presented. Geometric parameters varied in the study include the shell radius, the shell wall thickness, and the crack length. The major results are presented in the form of contour plots of the bulging factor as a function of two nondimensional parameters: the shell curvature parameter, lambda, which is a function of the shell geometry, Poisson's ratio, and the crack length; and a loading parameter, eta, which is a function of the shell geometry, material properties, and the applied internal pressure. These plots identify the ranges of the shell curvature and loading parameters for which the effects of geometric nonlinearity are significant. Simple empirical expressions for the bulging factor are then derived from the numerical results and shown to predict accurately the nonlinear response of shells with longitudinal and circumferential cracks. The numerical results are also compared with analytical solutions based on linear shallow shell theory for thin shells, and with some other semi-empirical solutions from the literature, and limitations on the use of these other expressions are suggested.
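For orientation, the shell curvature parameter can be computed from the shell geometry and Poisson's ratio. The formula below is one commonly used form from the shell-fracture literature (assumed here for illustration; the paper's exact nondimensionalization may differ):

```python
import math

def shell_curvature_parameter(half_crack, radius, thickness, nu=0.3):
    """One common form of the shell curvature parameter:
    lambda = [12*(1 - nu**2)]**0.25 * a / sqrt(R*t),
    with a = half crack length, R = shell radius, t = wall thickness.
    (Assumed form for illustration; the study's definition may differ.)"""
    return (12.0 * (1.0 - nu**2)) ** 0.25 * half_crack / math.sqrt(radius * thickness)

# Illustrative geometry (not from the study): R = 2000 mm, t = 1 mm, a = 50 mm
lam = shell_curvature_parameter(50.0, 2000.0, 1.0)
```

The parameter grows with crack length and shrinks with wall thickness, which is why the bulging factor contours are conveniently plotted against it.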
Zeroth Poisson Homology, Foliated Cohomology and Perfect Poisson Manifolds
NASA Astrophysics Data System (ADS)
Martínez-Torres, David; Miranda, Eva
2018-01-01
We prove that, for compact regular Poisson manifolds, the zeroth homology group is isomorphic to the top foliated cohomology group, and we give some applications. In particular, we show that, for regular unimodular Poisson manifolds, top Poisson and foliated cohomology groups are isomorphic. Inspired by the symplectic setting, we define what a perfect Poisson manifold is. We use these Poisson homology computations to provide families of perfect Poisson manifolds.
Inverse sequential procedures for the monitoring of time series
NASA Technical Reports Server (NTRS)
Radok, Uwe; Brown, Timothy J.
1995-01-01
When one or more new values are added to a developing time series, they change its descriptive parameters (mean, variance, trend, coherence). A 'change index' (CI) is developed as a quantitative indicator of whether the changed parameters remain compatible with the existing 'base' data. CI formulae are derived, in terms of normalized likelihood ratios, for small samples from Poisson, Gaussian, and Chi-Square distributions, and for regression coefficients measuring linear or exponential trends. A substantial parameter change creates a rapid or abrupt CI decrease which persists when the length of the bases is changed. Except for a special Gaussian case, the CI has no simple explicit regions for tests of hypotheses. However, its design ensures that the series sampled need not conform strictly to the distribution form assumed for the parameter estimates. The use of the CI is illustrated with both constructed and observed data samples, processed with a Fortran code, 'Sequitor'.
Nonlinear model updating applied to the IMAC XXXII Round Robin benchmark system
NASA Astrophysics Data System (ADS)
Kurt, Mehmet; Moore, Keegan J.; Eriten, Melih; McFarland, D. Michael; Bergman, Lawrence A.; Vakakis, Alexander F.
2017-05-01
We consider the application of a new nonlinear model updating strategy to a computational benchmark system. The approach relies on analyzing system response time series in the frequency-energy domain by constructing both Hamiltonian and forced and damped frequency-energy plots (FEPs). The system parameters are then characterized and updated by matching the backbone branches of the FEPs with the frequency-energy wavelet transforms of experimental and/or computational time series. The main advantage of this method is that no nonlinearity model is assumed a priori, and the system model is updated solely based on simulation and/or experimental measured time series. By matching the frequency-energy plots of the benchmark system and its reduced-order model, we show that we are able to retrieve the global strongly nonlinear dynamics in the frequency and energy ranges of interest, identify bifurcations, characterize local nonlinearities, and accurately reconstruct time series. We apply the proposed methodology to a benchmark problem, which was posed to the system identification community prior to the IMAC XXXII (2014) and XXXIII (2015) Conferences as a "Round Robin Exercise on Nonlinear System Identification". We show that we are able to identify the parameters of the non-linear element in the problem with a priori knowledge about its position.
NASA Technical Reports Server (NTRS)
Butler, C. M.; Hogge, J. E.
1978-01-01
Air quality sampling was conducted. Data for air quality parameters, recorded on written forms, punched cards or magnetic tape, are available for 1972 through 1975. Computer software was developed to (1) calculate several daily statistical measures of location, (2) plot time histories of data or the calculated daily statistics, (3) calculate simple correlation coefficients, and (4) plot scatter diagrams. Computer software was developed for processing air quality data to include time series analysis and goodness of fit tests. Computer software was developed to (1) calculate a larger number of daily statistical measures of location, and a number of daily monthly and yearly measures of location, dispersion, skewness and kurtosis, (2) decompose the extended time series model and (3) perform some goodness of fit tests. The computer program is described, documented and illustrated by examples. Recommendations are made for continuation of the development of research on processing air quality data.
NASA Astrophysics Data System (ADS)
Weaver, M.; Benner, S.; Fendorf, S.; Sampson, M.; Leng, M.
2007-12-01
Atmospheric concentrations of methane have been steadily increasing over the last 100 years, which has given rise to research on wetland rice fields, recently identified as a major anthropogenic source of methane. Experimental soil pots cultivating an aromatic early-variety rice strain were recently established in the Kean Svay District of Cambodia to evaluate methods of minimizing methane release by promoting redox buffering by iron oxides. In the first series of experiments, iron oxides were added to the soils and the rate of change in reducing conditions and the onset of methanogenesis were monitored. In the second series of experiments, plots are subject to periodic drying cycles to promote rejuvenation of buffering iron oxides. Initial results indicate a delay in the onset of methanogenesis, and in overall methane generation, in plots where initial iron oxide concentrations are elevated.
NASA Astrophysics Data System (ADS)
Sun, Deyu; Rettmann, Maryam E.; Holmes, David R.; Linte, Cristian A.; Packer, Douglas; Robb, Richard A.
2014-03-01
In this work, we propose a method for intraoperative reconstruction of a left atrial surface model for the application of cardiac ablation therapy. In this approach, the intraoperative point cloud is acquired by a tracked, 2D freehand intra-cardiac echocardiography device, which is registered and merged with a preoperative, high resolution left atrial surface model built from computed tomography data. For the surface reconstruction, we introduce a novel method to estimate the normal vector of the point cloud from the preoperative left atrial model, which is required for the Poisson Equation Reconstruction algorithm. In the current work, the algorithm is evaluated using a preoperative surface model from patient computed tomography data and simulated intraoperative ultrasound data. Factors such as intraoperative deformation of the left atrium, proportion of the left atrial surface sampled by the ultrasound, sampling resolution, sampling noise, and registration error were considered through a series of simulation experiments.
Impedance and modulus spectroscopic study of nano hydroxyapatite
NASA Astrophysics Data System (ADS)
Jogiya, B. V.; Jethava, H. O.; Tank, K. P.; Raviya, V. R.; Joshi, M. J.
2016-05-01
Hydroxyapatite (Ca10(PO4)6(OH)2, HAP) is the main inorganic component of the hard tissues in bones and is also an important material for orthopedic and dental implant applications. Nano HAP is of great interest due to its various biomedical applications. In the present work, nano HAP was synthesized using a surfactant-mediated approach. The structure and morphology of the synthesized nano HAP were examined by powder XRD and TEM. An impedance study was carried out on a pelletized sample in the frequency range of 100 Hz to 20 MHz at room temperature. The variation of dielectric constant, dielectric loss, and a.c. conductivity with the frequency of the applied field was studied. Nyquist and modulus plots were drawn. The Nyquist plot showed two semicircular arcs, indicating the presence of grain and grain-boundary effects in the sample. The typical behavior of the Nyquist plot was represented by an equivalent circuit having two parallel RC combinations in series.
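The equivalent circuit named above, two parallel RC elements in series, has a closed-form complex impedance whose Nyquist plot traces two semicircular arcs when the two time constants R1C1 and R2C2 are well separated. A sketch with illustrative component values (assumptions, not values fitted to the measured sample):

```python
import numpy as np

def impedance(freq, R1, C1, R2, C2):
    """Complex impedance of two parallel RC elements in series
    (grain and grain-boundary contributions)."""
    w = 2.0 * np.pi * np.asarray(freq, dtype=float)
    Z1 = R1 / (1.0 + 1j * w * R1 * C1)
    Z2 = R2 / (1.0 + 1j * w * R2 * C2)
    return Z1 + Z2

# Illustrative values with well-separated time constants (not from the study)
f = np.logspace(2, 7.3, 400)            # ~100 Hz to ~20 MHz, as in the study
Z = impedance(f, R1=1e4, C1=1e-9, R2=1e5, C2=1e-7)
# Nyquist plot: Re(Z) on the x axis, -Im(Z) on the y axis -> two arcs
```

At low frequency Z approaches R1 + R2 (the sum of both arc diameters), and at high frequency it collapses toward zero.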
Computer Series, 29: Bits and Pieces, 10.
ERIC Educational Resources Information Center
Moore, John W., Ed.
1982-01-01
Describes computer programs (available from authors) including molecular input to computer, programs for quantum chemistry, library orientation to technical literature, plotting potentiometric titration data, simulating oscilloscope curves, organic qualitative analysis with dynamic graphics, extended Huckel calculations, and calculator programs…
Diesel-Powered Heavy-Duty Refrigeration Unit Noise
DOT National Transportation Integrated Search
1976-01-01
A series of noise measurements were performed on a diesel-powered heavy-duty refrigeration unit. Noise survey information collected included: polar plots of the 'A Weighted' noise levels of the unit under maximum and minimum load conditions; a linear...
Koltun, G.F.
2015-01-01
Streamflow hydrographs were plotted for modeled/computed time series for the Ohio River near the USGS Sardis gage and the Ohio River at the Hannibal Lock and Dam. In general, the time series at these two locations compared well. Some notable differences include the exclusive presence of short periods of negative streamflows in the USGS 15-minute time-series data for the gage on the Ohio River above Sardis, Ohio, and the occurrence of several peak streamflows in the USACE gate/hydropower time series for the Hannibal Lock and Dam that were appreciably larger than corresponding peaks in the other time series, including those modeled/computed for the downstream Sardis gage.
DISCRETE COMPOUND POISSON PROCESSES AND TABLES OF THE GEOMETRIC POISSON DISTRIBUTION.
A concise summary of the salient properties of discrete Poisson processes, with emphasis on comparing the geometric and logarithmic Poisson processes. Tables of the geometric Poisson process are given for 176 sets of parameter values. New discrete compound Poisson processes are also introduced. These processes have properties that are particularly relevant when the summation of several different Poisson processes is to be analyzed. This study provides the
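The geometric Poisson distribution tabulated in the title is the compound of a Poisson number of events with geometric summands (also known as the Pólya-Aeppli distribution). A simulation sketch with illustrative parameters, checking the mean λ/θ:

```python
import numpy as np

rng = np.random.default_rng(2)

# Geometric Poisson (Polya-Aeppli): a Poisson(lam) number of i.i.d. geometric
# summands on {1, 2, ...} with success probability theta (illustrative values)
lam, theta = 3.0, 0.4
n_events = rng.poisson(lam, size=50_000)
samples = np.array([rng.geometric(theta, size=k).sum() for k in n_events])

mean_hat = float(samples.mean())
mean_theory = lam / theta            # E[S] = E[N] * E[G] = lam / theta
```

Exact probabilities (rather than simulated ones) are usually obtained by a Panjer-type recursion, which is presumably how tables like those in the report are generated.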
Mark Nelson; Sean Healey; W. Keith Moser; Mark Hansen; Warren Cohen; Mark Hatfield; Nancy Thomas; Jeff Masek
2009-01-01
The North American Forest Dynamics (NAFD) Program is assessing disturbance and regrowth in the forests of the continent. These forest dynamics are interpreted from per-pixel estimates of forest biomass, which are produced for a time series of Landsat 5 Thematic Mapper (TM) and Landsat 7 Enhanced TM Plus images. Image data are combined with sample plot data from the...
Zeus++ - A GUI-Based Flowfield Analysis Tool, Version 1.0, User’s Manual
1999-02-01
Grigolli, J F J; Souza, L A; Fernandes, M G; Busoli, A C
2017-08-01
The cotton boll weevil Anthonomus grandis Boheman (Coleoptera: Curculionidae) is the main pest of cotton crops around the world, directly affecting cotton production. In order to establish a sequential sampling plan, it is crucial to understand the spatial distribution of the pest population and the damage it causes to the crop through the different developmental stages of cotton plants. Therefore, this study aimed to investigate the spatial distribution of adults in the cultivation area and their oviposition and feeding behavior throughout the development of the cotton plants. The experiment was conducted in Maracaju, Mato Grosso do Sul, Brazil, in the 2012/2013 and 2013/2014 growing seasons, in an area of 10,000 m², planted with the cotton cultivar FM 993. The experimental area was divided into 100 plots of 100 m² (10 × 10 m) each, and five plants per plot were sampled weekly throughout the crop cycle. The number of flower buds with feeding and oviposition punctures and of adult A. grandis was recorded throughout the crop cycle in five plants per plot. After determining the aggregation indices (variance/mean ratio, Morisita's index, exponent k of the negative binomial distribution, and Green's coefficient) and adjusting the frequencies observed in the field to the distribution of frequencies (Poisson, negative binomial, and positive binomial) using the chi-squared test, it was observed that flower buds with punctures derived from feeding, oviposition, and feeding + oviposition showed an aggregated distribution in the cultivation area until 85 days after emergence and a random distribution after this stage. The adults of A. grandis presented a random distribution in the cultivation area.
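The four aggregation indices named above are all simple moment-based statistics of the per-plot counts. The sketch below uses standard textbook formulas (assumed; the study may use variants) on synthetic Poisson counts, for which the indices should indicate spatial randomness:

```python
import numpy as np

def aggregation_indices(counts):
    """Moment-based aggregation indices for per-plot counts:
    variance/mean ratio, Morisita's index, moment estimate of the
    negative binomial exponent k, and Green's coefficient.
    (Standard textbook forms, assumed for illustration.)"""
    x = np.asarray(counts, dtype=float)
    n, m, s2, total = x.size, x.mean(), x.var(ddof=1), x.sum()
    vmr = s2 / m                                           # ~1 for random counts
    morisita = n * (x * (x - 1.0)).sum() / (total * (total - 1.0))
    k = m**2 / (s2 - m) if s2 > m else np.inf              # small positive k => aggregated
    green = (vmr - 1.0) / (total - 1.0)                    # ~0 random, >0 aggregated
    return vmr, morisita, k, green

# Synthetic random (Poisson) counts for 100 plots: expect vmr ~ 1, Morisita ~ 1
rng = np.random.default_rng(3)
vmr, morisita, k, green = aggregation_indices(rng.poisson(5.0, size=100))
```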
Wang, Qinggang; Bao, Dachuan; Guo, Yili; Lu, Junmeng; Lu, Zhijun; Xu, Yaozhan; Zhang, Kuihan; Liu, Haibo; Meng, Hongjie; Jiang, Mingxi; Qiao, Xiujuan; Huang, Handong
2014-01-01
The stochastic dilution hypothesis has been proposed to explain species coexistence in species-rich communities. The relative importance of stochastic dilution effects with respect to other effects, such as competition and habitat filtering, remained to be tested. In this study, using data from a 25-ha species-rich subtropical forest plot with a strong topographic structure at Badagongshan in central China, we analyzed overall species associations and fine-scale species interactions between 2,550 species pairs. The results showed that: (1) the proportion of segregation in the overall species association analysis at a 2 m neighborhood in this plot followed the prediction of the stochastic dilution hypothesis that segregations should decrease with species richness, but the proportion at a 10 m neighborhood was higher than predicted. (2) The proportion of the no-association type was lower than expected under the stochastic dilution hypothesis. (3) Fine-scale species interaction analyses using heterogeneous Poisson processes as null models revealed a high proportion (47%) of significant species effects. However, the assumption of separation of scales of this method was not fully met in this plot, which has a strong fine-scale topographic structure. We also found that for species within the same families, fine-scale positive species interactions occurred more frequently, and negative ones less frequently, than expected by chance. These results suggested effects of environmental filtering other than species interaction in this forest. (4) We also found that arbor species showed a much higher proportion of significant fine-scale species interactions (66%) than shrub species (18%). We concluded that the stochastic dilution hypothesis is only partly supported and that environmental filtering left discernible spatial signals in the spatial associations between species in this species-rich subtropical forest with a strong topographic structure. PMID:24824996
The role of seasonal grass pollen on childhood asthma emergency department presentations.
Erbas, B; Akram, M; Dharmage, S C; Tham, R; Dennekamp, M; Newbigin, E; Taylor, P; Tang, M L K; Abramson, M J
2012-05-01
Few studies have focused on the role of grass pollen in asthma emergency department (ED) presentations among children. None have examined whether a dose-response effect exists between grass pollen levels and these asthma exacerbations. To examine the association between increasing ambient levels of grass pollen and asthma ED presentations in children. To determine whether these associations are seen only after a thunderstorm, or whether grass pollen levels have a consistent influence on childhood asthma ED visits during the season. A short time series ecological study was conducted for asthma presentations to ED among children in Melbourne, Victoria, with grass pollen, meteorological and air quality measurements recorded during the selected 2003 period. A semi-parametric Poisson regression model was used to examine dose-response associations between daily grass pollen levels and mean daily ED attendance for asthma. A smoothed plot suggested a dose-response association. As ambient grass pollen increased to about 19 grains/m³, the same-day risk of childhood ED presentations also increased linearly (P < 0.001). Grass pollen levels were also associated with an increased risk of asthma ED presentations on the following day (lag 1, P < 0.001). This is the first study to establish a clear relationship between increased risk of childhood asthma ED attendance and levels of ambient grass pollen below 20 grains/m³, independent of any impact of thunderstorm-associated asthma. These findings have important implications for patient care, such as asthma management programs that notify the general public regarding periods of high grass pollen exposure, as well as defining the timing of initiation of pollen immunotherapy. © 2012 Blackwell Publishing Ltd.
On the equivalence of case-crossover and time series methods in environmental epidemiology.
Lu, Yun; Zeger, Scott L
2007-04-01
The case-crossover design was introduced in epidemiology 15 years ago as a method for studying the effects of a risk factor on a health event using only cases. The idea is to compare a case's exposure immediately prior to or during the case-defining event with that same person's exposure at otherwise similar "reference" times. An alternative approach to the analysis of daily exposure and case-only data is time series analysis. Here, log-linear regression models express the expected total number of events on each day as a function of the exposure level and potential confounding variables. In time series analyses of air pollution, smooth functions of time and weather are the main confounders. Time series and case-crossover methods are often viewed as competing methods. In this paper, we show that case-crossover using conditional logistic regression is a special case of time series analysis when there is a common exposure such as in air pollution studies. This equivalence provides computational convenience for case-crossover analyses and a better understanding of time series models. Time series log-linear regression accounts for overdispersion of the Poisson variance, while case-crossover analyses typically do not. This equivalence also permits model checking for case-crossover data using standard log-linear model diagnostics.
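The time series side of this equivalence, a Poisson log-linear regression of daily counts on exposure, can be sketched with a bare-bones iteratively reweighted least squares (IRLS) fit; real analyses add smooth functions of time and weather and an overdispersion adjustment, both omitted in this sketch on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic daily series: log E[count_t] = b0 + b1 * exposure_t
n = 2000
exposure = rng.normal(size=n)
X = np.column_stack([np.ones(n), exposure])
beta_true = np.array([1.0, 0.3])
y = rng.poisson(np.exp(X @ beta_true))

# Poisson log-linear fit by IRLS: repeatedly solve the weighted normal
# equations (X'WX) beta = X'W z with W = diag(mu) and working response z
beta = np.zeros(2)
for _ in range(25):
    mu = np.exp(X @ beta)
    z = X @ beta + (y - mu) / mu          # working response
    WX = X * mu[:, None]                  # W = diag(mu) applied row-wise
    beta = np.linalg.solve(X.T @ WX, WX.T @ z)
```

The fitted `beta[1]` estimates the log relative risk per unit exposure, the same quantity the conditional logistic case-crossover analysis targets under a shared exposure.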
Tošić, Tamara; Sellers, Kristin K; Fröhlich, Flavio; Fedotenkova, Mariia; Beim Graben, Peter; Hutt, Axel
2015-01-01
For decades, research in neuroscience has supported the hypothesis that brain dynamics exhibits recurrent metastable states connected by transients, which together encode fundamental neural information processing. To understand the system's dynamics it is important to detect such recurrence domains, but it is challenging to extract them from experimental neuroscience datasets due to the large trial-to-trial variability. The proposed methodology extracts recurrent metastable states in univariate time series by transforming datasets into their time-frequency representations and computing recurrence plots based on instantaneous spectral power values in various frequency bands. Additionally, a new statistical inference analysis compares different trial recurrence plots with corresponding surrogates to obtain statistically significant recurrent structures. This combination of methods is validated by applying it to two artificial datasets. In a final study of visually-evoked Local Field Potentials in partially anesthetized ferrets, the methodology is able to reveal recurrence structures of neural responses with trial-to-trial variability. Focusing on different frequency bands, the δ-band activity is much less recurrent than α-band activity. Moreover, α-activity is susceptible to pre-stimuli, while δ-activity is much less sensitive to pre-stimuli. This difference in recurrence structures in different frequency bands indicates diverse underlying information processing steps in the brain.
Geologic map of the Riverside East 7.5' quadrangle, Riverside County, California
Morton, Douglas M.; Cox, Brett F.
2001-01-01
a. This Readme; includes in Appendix I, data contained in rse_met.txt b. The same graphic as plotted in 2 above. Test plots have not produced 1:24,000-scale map sheets. Adobe Acrobat page size setting influences map scale. The Correlation of Map Units and Description of Map Units is in the editorial format of USGS Geologic Investigations Series (I-series) maps but has not been edited to comply with I-map standards. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation-name, age, and lithology. Where known, grain size is indicated on the map by a subscripted letter or letters following the unit symbols as follows: lg, large boulders; b, boulder; g, gravel; a, arenaceous; s, silt; c, clay; e.g. Qyfa is a predominantly young alluvial fan deposit that is arenaceous. Multiple letters are used for more specific identification or for mixed units, e.g., Qfysa is a silty sand. In some cases, mixed units are indicated by a compound symbol; e.g., Qyf2sc. Marine deposits are in part overlain by local, mostly alluvial fan, deposits and are labeled Qomf. Grain size follows f. Even though this is an Open-File Report and includes the standard USGS Open-File disclaimer, the report closely adheres to the stratigraphic nomenclature of the U.S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3b above) or plotting the postscript file (2 above).
Geologic map of the Corona North 7.5' quadrangle, Riverside and San Bernardino counties, California
Morton, Douglas M.; Gray, C.H.; Bovard, Kelly R.; Dawson, Michael
2002-01-01
a. This Readme; includes in Appendix I, data contained in crn_met.txt b. The same graphic as plotted in 2 above. Test plots have not produced precise 1:24,000-scale map sheets. Adobe Acrobat page size setting influences map scale. The Correlation of Map Units and Description of Map Units is in the editorial format of USGS Geologic Investigations Series (I-series) maps but has not been edited to comply with I-map standards. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation name, age, and lithology. Where known, grain size is indicated on the map by a subscripted letter or letters following the unit symbols as follows: lg, large boulders; b, boulder; g, gravel; a, arenaceous; s, silt; c, clay; e.g. Qyfa is a predominantly young alluvial fan deposit that is arenaceous. Multiple letters are used for more specific identification or for mixed units, e.g., Qfysa is a silty sand. In some cases, mixed units are indicated by a compound symbol; e.g., Qyf2sc. Marine deposits are in part overlain by local, mostly alluvial fan, deposits and are labeled Qomf. Grain size follows f. Even though this is an Open-File Report and includes the standard USGS Open-File disclaimer, the report closely adheres to the stratigraphic nomenclature of the U.S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3b above) or plotting the postscript file (2 above).
Geologic map of the Corona South 7.5' quadrangle, Riverside and Orange counties, California
Gray, C.H.; Morton, Douglas M.; Weber, F. Harold; Digital preparation by Bovard, Kelly R.; O'Brien, Timothy
2002-01-01
a. A Readme file; includes in Appendix I, data contained in crs_met.txt b. The same graphic as plotted in 2 above. Test plots have not produced 1:24,000-scale map sheets. Adobe Acrobat page size setting influences map scale. The Correlation of Map Units and Description of Map Units is in the editorial format of USGS Geologic Investigations Series (I-series) maps but has not been edited to comply with I-map standards. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation-name, age, and lithology. Where known, grain size is indicated on the map by a subscripted letter or letters following the unit symbols as follows: lg, large boulders; b, boulder; g, gravel; a, arenaceous; s, silt; c, clay; e.g. Qyfa is a predominantly young alluvial fan deposit that is arenaceous. Multiple letters are used for more specific identification or for mixed units, e.g., Qfysa is a silty sand. In some cases, mixed units are indicated by a compound symbol; e.g., Qyf2sc. Marine deposits are in part overlain by local, mostly alluvial fan, deposits and are labeled Qomf. Grain size follows f. Even though this is an Open-File Report and includes the standard USGS Open-File disclaimer, the report closely adheres to the stratigraphic nomenclature of the U.S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3b above) or plotting the postscript file (2 above).
Geologic map of the Lake Mathews 7.5' quadrangle, Riverside County, California
Morton, Douglas M.; Weber, F. Harold
2001-01-01
a. This Readme; includes in Appendix I, data contained in lkm_met.txt b. The same graphic as plotted in 2 above. Test plots have not produced 1:24,000-scale map sheets. Adobe Acrobat page size setting influences map scale. The Correlation of Map Units and Description of Map Units is in the editorial format of USGS Miscellaneous Investigations Series (I-series) maps but has not been edited to comply with I-map standards. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation-name, age, and lithology. Where known, grain size is indicated on the map by a subscripted letter or letters following the unit symbols as follows: lg, large boulders; b, boulder; g, gravel; a, arenaceous; s, silt; c, clay; e.g. Qyfa is a predominantly young alluvial fan deposit that is arenaceous. Multiple letters are used for more specific identification or for mixed units, e.g., Qfysa is a silty sand. In some cases, mixed units are indicated by a compound symbol; e.g., Qyf2sc. Marine deposits are in part overlain by local, mostly alluvial fan, deposits and are labeled Qomf. Grain size follows f. Even though this is an Open-File report and includes the standard USGS Open-File disclaimer, the report closely adheres to the stratigraphic nomenclature of the U.S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3b above) or plotting the postscript file (2 above).
Geologic map of the Steele Peak 7.5' quadrangle, Riverside County, California
Morton, Douglas M.; digital preparation by Alvarez, Rachel M.; Diep, Van M.
2001-01-01
a. This Readme; includes in Appendix I, data contained in stp_met.txt b. The same graphic as plotted in 2 above. Test plots have not produced 1:24,000-scale map sheets. Adobe Acrobat page size setting influences map scale. The Correlation of Map Units and Description of Map Units is in the editorial format of USGS Geologic Investigations Series (I-series) maps but has not been edited to comply with I-map standards. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation-name, age, and lithology. Where known, grain size is indicated on the map by a subscripted letter or letters following the unit symbols as follows: lg, large boulders; b, boulder; g, gravel; a, arenaceous; s, silt; c, clay; e.g. Qyfa is a predominantly young alluvial fan deposit that is arenaceous. Multiple letters are used for more specific identification or for mixed units, e.g., Qfysa is a silty sand. In some cases, mixed units are indicated by a compound symbol; e.g., Qyf2sc. Marine deposits are in part overlain by local, mostly alluvial fan, deposits and are labeled Qomf. Grain size follows f. Even though this is an Open-File Report and includes the standard USGS Open-File disclaimer, the report closely adheres to the stratigraphic nomenclature of the U.S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3b above) or plotting the postscript file (2 above).
Geologic map of the Riverside West 7.5' quadrangle, Riverside County, California
Morton, Douglas M.; Cox, Brett F.
2001-01-01
a. This Readme; includes in Appendix I, data contained in rsw_met.txt b. The same graphic as plotted in 2 above. Test plots have not produced 1:24,000-scale map sheets. Adobe Acrobat page size setting influences map scale. The Correlation of Map Units and Description of Map Units is in the editorial format of USGS Geologic Investigations Series (I-series) maps but has not been edited to comply with I-map standards. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation-name, age, and lithology. Where known, grain size is indicated on the map by a subscripted letter or letters following the unit symbols as follows: lg, large boulders; b, boulder; g, gravel; a, arenaceous; s, silt; c, clay; e.g. Qyfa is a predominantly young alluvial fan deposit that is arenaceous. Multiple letters are used for more specific identification or for mixed units, e.g., Qfysa is a silty sand. In some cases, mixed units are indicated by a compound symbol; e.g., Qyf2sc. Marine deposits are in part overlain by local, mostly alluvial fan, deposits and are labeled Qomf. Grain size follows f. Even though this is an Open-File Report and includes the standard USGS Open-File disclaimer, the report closely adheres to the stratigraphic nomenclature of the U.S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3b above) or plotting the postscript file (2 above).
Method for resonant measurement
Rhodes, G.W.; Migliori, A.; Dixon, R.D.
1996-03-05
A method of measurement of objects to determine object flaws, Poisson's ratio (σ), and shear modulus (μ) is shown and described. First, the frequency for expected degenerate responses is determined for one or more input frequencies, and then splitting of degenerate resonant modes is observed to identify the presence of flaws in the object. Poisson's ratio and the shear modulus can be determined by identifying resonances dependent only on the shear modulus, and then using that shear modulus to find Poisson's ratio from other modes dependent on both the shear modulus and Poisson's ratio. 1 fig.
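As a sketch of the final algebraic step only: assuming an isotropic material, the standard elasticity identity E = 2μ(1 + ν) lets Poisson's ratio be recovered once the shear modulus is known from shear-only modes. The moduli below are assumed, steel-like values for illustration, not measurements from the patent.

```python
def poisson_ratio(young_modulus, shear_modulus):
    """Isotropic elasticity: E = 2*mu*(1 + nu), so nu = E/(2*mu) - 1."""
    return young_modulus / (2.0 * shear_modulus) - 1.0

# Illustrative (assumed) steel-like moduli in pascals
E, mu = 200e9, 78e9
nu = poisson_ratio(E, mu)
print(f"Poisson's ratio: {nu:.3f}")  # -> Poisson's ratio: 0.282
```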
Multiscale analysis of neural spike trains.
Ramezan, Reza; Marriott, Paul; Chenouri, Shojaeddin
2014-01-30
This paper studies the multiscale analysis of neural spike trains, through both graphical and Poisson process approaches. We introduce the interspike interval plot, which simultaneously visualizes characteristics of neural spiking activity at different time scales. Using an inhomogeneous Poisson process framework, we discuss multiscale estimates of the intensity functions of spike trains. We also introduce the windowing effect for two multiscale methods. Using quasi-likelihood, we develop bootstrap confidence intervals for the multiscale intensity function. We provide a cross-validation scheme, to choose the tuning parameters, and study its unbiasedness. Studying the relationship between the spike rate and the stimulus signal, we observe that adjusting for the first spike latency is important in cross-validation. We show, through examples, that the correlation between spike trains and spike count variability can be multiscale phenomena. Furthermore, we address the modeling of the periodicity of the spike trains caused by a stimulus signal or by brain rhythms. Within the multiscale framework, we introduce intensity functions for spike trains with multiplicative and additive periodic components. Analyzing a dataset from the retinogeniculate synapse, we compare the fit of these models with the Bayesian adaptive regression splines method and discuss the limitations of the methodology. Computational efficiency, which is usually a challenge in the analysis of spike trains, is one of the highlights of these new models. In an example, we show that the reconstruction quality of a complex intensity function demonstrates the ability of the multiscale methodology to crack the neural code. Copyright © 2013 John Wiley & Sons, Ltd.
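A standard way to simulate a spike train under the inhomogeneous Poisson framework the paper builds on is Lewis-Shedler thinning; the intensity function below (a baseline rate modulated by a rhythm) is an assumption chosen only for illustration.

```python
import math
import random

def inhomogeneous_poisson(rate, t_max, rate_max):
    """Simulate event times on [0, t_max) by Lewis-Shedler thinning.

    Candidates come from a homogeneous process at rate_max; each is kept
    with probability rate(t)/rate_max, which requires rate(t) <= rate_max
    everywhere on the interval.
    """
    times, t = [], 0.0
    while True:
        t += random.expovariate(rate_max)
        if t >= t_max:
            return times
        if random.random() < rate(t) / rate_max:
            times.append(t)

random.seed(7)
# Assumed intensity: 20 Hz baseline modulated by an 8 Hz rhythm
rate = lambda t: 20.0 * (1.0 + 0.8 * math.sin(2.0 * math.pi * 8.0 * t))
spikes = inhomogeneous_poisson(rate, t_max=10.0, rate_max=36.0)
print(f"{len(spikes)} spikes in 10 s")
```

The multiplicative periodic component here mirrors the kind of stimulus- or rhythm-driven periodicity the paper models in its intensity functions.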
Winterstein, Thomas A.; Arntson, Allan D.; Mitton, Gregory B.
2007-01-01
The 1-, 7-, and 30-day low-flow series were determined for 120 continuous-record streamflow stations in Minnesota having at least 20 years of continuous record. The 2-, 5-, 10-, 50-, and 100-year statistics were determined for each series by fitting a log Pearson type III distribution to the data. The methods used to determine the low-flow statistics and to construct the plots of the low-flow frequency curves are described. The low-flow series and the low-flow statistics are presented in tables and graphs.
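A method-of-moments sketch of the log-Pearson type III fit described above, using Kite's frequency-factor approximation and the lower tail for low flows, might look like the following. The flow series is simulated, not the report's data, and the recurrence intervals match those in the abstract only by example.

```python
import math
import random
import statistics

def lp3_quantile(flows, nonexceed_prob):
    """Method-of-moments log-Pearson III quantile of a flow series.

    Fits the mean, standard deviation, and skew of log10 flows, then
    applies Kite's frequency-factor approximation. For a T-year low
    flow, use nonexceed_prob = 1/T (the lower tail).
    """
    logs = [math.log10(q) for q in flows]
    n = len(logs)
    m = statistics.fmean(logs)
    s = statistics.stdev(logs)
    # Sample skew with the usual small-sample correction
    g = n / ((n - 1) * (n - 2)) * sum((x - m) ** 3 for x in logs) / s ** 3
    z = statistics.NormalDist().inv_cdf(nonexceed_prob)
    k = g / 6.0
    K = (z + (z * z - 1) * k + (z ** 3 - 6 * z) * k * k / 3
         - (z * z - 1) * k ** 3 + z * k ** 4 + k ** 5 / 3)
    return 10.0 ** (m + K * s)

random.seed(2)
flows = [10.0 ** random.gauss(1.0, 0.2) for _ in range(200)]  # simulated 7-day low flows
q10 = lp3_quantile(flows, 1.0 / 10.0)  # 10-year low flow
print(f"10-year low flow: {q10:.2f}")
```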
The Chern-Simons Current in Systems of DNA-RNA Transcriptions
NASA Astrophysics Data System (ADS)
Capozziello, Salvatore; Pincak, Richard; Kanjamapornkul, Kabin; Saridakis, Emmanuel N.
2018-04-01
A Chern-Simons current, coming from the ghost and anti-ghost fields of supersymmetry theory, can be used to define a spectrum of gene expression in new time series data, where a spinor field is adopted as an alternative representation of a gene instead of the standard alphabet sequence of bases $A, T, C, G, U$. After a general discussion of the use of supersymmetry in biological systems, we give examples of its use for living organisms, discuss the codon and anti-codon ghost fields, and develop an algebraic construction for the "trash" DNA, i.e., the regions of DNA that do not appear to be active in biological systems. As a general result, all hidden states of a codon can be computed with Chern-Simons 3-forms. Finally, we plot a time series of genetic variations of a viral glycoprotein gene and a host T-cell receptor gene by using a gene tensor correlation network related to the Chern-Simons current. An empirical analysis of genetic shift in host cell receptor genes, with a separated cluster of genes, and of genetic drift in the viral gene is obtained by using a tensor correlation plot over time series data derived as the empirical mode decomposition of the Chern-Simons current.
Analysis of vegetation in an Imperata grassland of Barak valley, Assam.
Astapati, Ashim Das; Das, Ashesh Kumar
2012-09-01
Imperata grassland at Dorgakona, Barak valley, North Eastern India, was analyzed for species composition and diversity pattern in relation to traditional management practices. Nineteen families were recorded in the burnt and unburnt plots of the study site, with Poaceae the most dominant. Twenty-nine species occurred in the burnt plot and 28 in the unburnt plot; most species were common to both plots. The pattern of frequency diagrams indicated that the vegetation was homogeneous. Imperata cylindrica, a rhizomatous grass, was the dominant species based on density (318.75 and 304.18 nos. m(-2)), basal cover (158.22 and 148.34 cm2 m(-2)), and importance value index (IVI) (132.64 and 138.74) for the burnt and unburnt plots, respectively. Borreria pusilla was the co-dominant species, constituting the Imperata-Borreria assemblage of the studied grassland. B. pusilla (162.25 and 50.37 nos. m(-2)), I. cylindrica (318.75 and 304.18 nos. m(-2)), and Setaria glauca (24.70 and 16.46 nos. m(-2)) benefited from burning, as shown by the values sequentially placed for burnt and unburnt plots. Certain grasses such as Chrysopogon aciculatus and Sacciolepis indica were restricted to the burnt plot, while Oxalis corniculata was present only in the unburnt plot. Grasses dominated the grassland, as revealed by their contribution to the mean percentage cover: 72% in the burnt plot and 76% in the unburnt plot. The dominance-diversity curves in the study site approach a log normal series distribution, suggesting that resources are shared by the constituent species. The seasonal pattern in the diversity index suggested a definite influence of climatic seasonality on species diversity; the rainy season was conducive to maximum diversity (1.40 and 1.38 in the burnt and unburnt plots, respectively). Dominance increased with the concentration of fewer species (0.0021 in the burnt plot and 0.0055 in the unburnt plot) in summer and behaved inversely to the index of diversity. This study showed that traditional management practices benefit the farmers, as they promote grassland regeneration with I. cylindrica as the dominant grass.
Zaylaa, Amira; Charara, Jamal; Girault, Jean-Marc
2015-08-01
The analysis of biomedical signals demonstrating complexity through recurrence plots is challenging. Quantification of recurrences is often biased by sojourn points that hide dynamic transitions. To overcome this problem, time series have previously been embedded at high dimensions. However, no one has quantified the elimination of sojourn points and the rate of detection, nor has the enhancement of transition detection been investigated. This paper reports our on-going efforts to improve the detection of dynamic transitions from logistic maps and fetal hearts by reducing sojourn points. Three signal-based recurrence plots were developed: embedded with specific settings, derivative-based, and m-time pattern. Determinism, cross-determinism, and the percentage of reduced sojourn points were computed to detect transitions. For logistic maps, an increase of 50% and 34.3% in sensitivity of detection over alternatives was achieved by m-time pattern and embedded recurrence plots with specific settings, respectively, with 100% specificity. For fetal heart rates, embedded recurrence plots with specific settings provided the best performance, followed by the derivative-based recurrence plot and then the unembedded recurrence plot using the determinism parameter; the relative errors between healthy and distressed fetuses were 153%, 95%, and 91%, respectively. More than 50% of sojourn points were eliminated, allowing better detection of heart transitions triggered by gaseous exchange factors. This could be significant in improving the diagnosis of fetal state. Copyright © 2014 Elsevier Ltd. All rights reserved.
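As a minimal sketch of the derivative-based idea (not the authors' exact settings): recurrences are computed on the first differences of the series, so samples where the signal lingers near one value, the sojourn points, no longer recur with each other unless their local dynamics also match. The logistic map is used as the test signal, as in the paper; the thresholds are assumptions.

```python
def recurrence_matrix(x, eps):
    """Unembedded recurrence matrix: R[i][j] = 1 iff |x[i] - x[j]| < eps."""
    n = len(x)
    return [[1 if abs(x[i] - x[j]) < eps else 0 for j in range(n)]
            for i in range(n)]

def derivative_recurrence_matrix(x, eps):
    """Recurrence matrix of the first differences of x."""
    dx = [b - a for a, b in zip(x, x[1:])]
    return recurrence_matrix(dx, eps)

def recurrence_rate(R):
    """Fraction of recurrent points in the matrix."""
    n = len(R)
    return sum(map(sum, R)) / (n * n)

# Logistic map in its chaotic regime (r = 4) as a test signal
x, series = 0.3, []
for _ in range(200):
    series.append(x)
    x = 4.0 * x * (1.0 - x)

R_plain = recurrence_matrix(series, eps=0.1)
R_deriv = derivative_recurrence_matrix(series, eps=0.1)
print(recurrence_rate(R_plain), recurrence_rate(R_deriv))
```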
Preliminary geologic map of the Elsinore 7.5' Quadrangle, Riverside County, California
Morton, Douglas M.; Weber, F. Harold; Digital preparation: Alvarez, Rachel M.; Burns, Diane
2003-01-01
Open-File Report 03-281 contains a digital geologic map database of the Elsinore 7.5’ quadrangle, Riverside County, California that includes: 1. ARC/INFO (Environmental Systems Research Institute, http://www.esri.com) version 7.2.1 coverages of the various elements of the geologic map. 2. A Postscript file to plot the geologic map on a topographic base, and containing a Correlation of Map Units diagram (CMU), a Description of Map Units (DMU), and an index map. 3. Portable Document Format (.pdf) files of: a. This Readme; includes in Appendix I, data contained in els_met.txt b. The same graphic as plotted in 2 above. Test plots have not produced precise 1:24,000-scale map sheets. Adobe Acrobat page size setting influences map scale. The Correlation of Map Units and Description of Map Units is in the editorial format of USGS Geologic Investigations Series (I-series) maps but has not been edited to comply with I-map standards. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation-name, age, and lithology. Where known, grain size is indicated on the map by a subscripted letter or letters following the unit symbols as follows: lg, large boulders; b, boulder; g, gravel; a, arenaceous; s, silt; c, clay; e.g. Qyfa is a predominantly young alluvial fan deposit that is arenaceous. Multiple letters are used for more specific identification or for mixed units, e.g., Qfysa is a silty sand. In some cases, mixed units are indicated by a compound symbol; e.g., Qyf2sc. Even though this is an Open-File Report and includes the standard USGS Open-File disclaimer, the report closely adheres to the stratigraphic nomenclature of the U.S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3b above) or plotting the postscript file (2 above).
Perrakis, Konstantinos; Gryparis, Alexandros; Schwartz, Joel; Le Tertre, Alain; Katsouyanni, Klea; Forastiere, Francesco; Stafoggia, Massimo; Samoli, Evangelia
2014-12-10
An important topic when estimating the effect of air pollutants on human health is choosing the best method to control for seasonal patterns and time varying confounders, such as temperature and humidity. Semi-parametric Poisson time-series models include smooth functions of calendar time and weather effects to control for potential confounders. Case-crossover (CC) approaches are considered efficient alternatives that control seasonal confounding by design and allow inclusion of smooth functions of weather confounders through their equivalent Poisson representations. We evaluate both methodological designs with respect to seasonal control and compare spline-based approaches, using natural splines and penalized splines, and two time-stratified CC approaches. For the spline-based methods, we consider fixed degrees of freedom, minimization of the partial autocorrelation function, and general cross-validation as smoothing criteria. Issues of model misspecification with respect to weather confounding are investigated under simulation scenarios, which allow quantifying omitted, misspecified, and irrelevant-variable bias. The simulations are based on fully parametric mechanisms designed to replicate two datasets with different mortality and atmospheric patterns. Overall, minimum partial autocorrelation function approaches provide more stable results for high mortality counts and strong seasonal trends, whereas natural splines with fixed degrees of freedom perform better for low mortality counts and weak seasonal trends followed by the time-season-stratified CC model, which performs equally well in terms of bias but yields higher standard errors. Copyright © 2014 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Sergeenko, N. P.
2017-11-01
An adequate statistical method should be developed in order to predict probabilistically the range of ionospheric parameters; this problem is addressed in this paper. Time series of the critical frequency of the F2 layer, foF2(t), were subjected to statistical processing. For the obtained samples {δfoF2}, statistical distributions and invariants up to the fourth order were calculated. The analysis shows that the distributions differ from the Gaussian law during disturbances: at levels of sufficiently small probability, there are arbitrarily large deviations from the model of a normal process. Therefore, an attempt is made to describe the statistical samples {δfoF2} with a Poisson model. For the studied samples, an exponential characteristic function is selected under the assumption that the time series are a superposition of deterministic and random processes. Using the Fourier transform, the characteristic function is transformed into a nonholomorphic, excessive-asymmetric probability density function. The statistical distributions of the samples {δfoF2} calculated for the disturbed periods are compared with the obtained model distribution function. According to the Kolmogorov criterion, the probabilities of coincidence of the a posteriori distributions with the theoretical ones are P ≈ 0.7-0.9. This analysis supports the applicability of a model based on a Poisson random process for the statistical description of the variations {δfoF2}, and for probabilistic estimates of them, during heliogeophysical disturbances.
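A discrete Kolmogorov-distance check of count data against a fitted Poisson model, in the spirit of the comparison described above, might be sketched as follows. The counts are simulated, not foF2 data, and the fitted-rate choice is an assumption.

```python
import math
import random
import statistics

def poisson_cdf_table(lam, k_max):
    """Poisson CDF values P(X <= k) for k = 0..k_max."""
    term = math.exp(-lam)
    total = term
    cdf = [total]
    for i in range(1, k_max + 1):
        term *= lam / i
        total += term
        cdf.append(total)
    return cdf

def ks_distance_to_poisson(sample):
    """Max |empirical CDF - Poisson CDF|, with the rate fitted by the mean."""
    lam = statistics.fmean(sample)
    n = len(sample)
    k_max = max(sample)
    cdf = poisson_cdf_table(lam, k_max)
    d = 0.0
    for k in range(k_max + 1):
        ecdf = sum(1 for x in sample if x <= k) / n
        d = max(d, abs(ecdf - cdf[k]))
    return d

def poisson_draw(lam):
    """Knuth's Poisson sampler."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

random.seed(5)
sample = [poisson_draw(4.0) for _ in range(500)]
d = ks_distance_to_poisson(sample)
print(f"Kolmogorov distance: {d:.3f}")
```

A small distance is consistent with the Poisson hypothesis; formal acceptance would compare it against the Kolmogorov critical value for the sample size.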
Properties of the Bivariate Delayed Poisson Process
1974-07-01
and Lewis (1972) in their Berkeley Symposium paper, and here their analysis of the bivariate Poisson processes (without Poisson noise) is carried... Poisson processes. They cannot, however, be independent Poisson processes because their events are associated in pairs by the displacement centres... process because its marginal processes for events of each type are themselves (univariate) Poisson processes. Cox and Lewis (1972) assumed a
Identifying Changes of Complex Flood Dynamics with Recurrence Analysis
NASA Astrophysics Data System (ADS)
Wendi, D.; Merz, B.; Marwan, N.
2016-12-01
Temporal changes in a flood hazard system are known to be difficult to detect and attribute because of multiple drivers that involve complex, non-stationary, and highly variable processes. These drivers, such as human-induced climate change, natural climate variability, implementation of flood defenses, river training, or land use change, can act on different space-time scales and can influence or mask each other. Flood time series may show complex behavior that varies over a range of time scales and may cluster in time. Moreover, hydrological time series (i.e., discharge) are often subject to measurement errors, such as rating-curve error, especially for extremes, where observations are actually derived through extrapolation. This study focuses on the application of recurrence-based data analysis techniques (recurrence plots) for understanding and quantifying spatio-temporal changes in flood hazard in Germany. The recurrence plot is known as an effective tool to visualize the dynamics of phase-space trajectories, i.e., trajectories reconstructed from a time series by using an embedding dimension and a time delay, and it is known to be effective in analyzing non-stationary and non-linear time series. The sensitivity of recurrence analysis to common measurement errors and noise will also be analyzed and evaluated against conventional methods. The emphasis will be on the identification of characteristic recurrence properties that could associate typical dynamics with certain flood events.
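A minimal sketch of the recurrence plot construction described above, assuming illustrative embedding parameters and a synthetic discharge-like series (a seasonal cycle plus noise, not observed data):

```python
import math
import random

def embed(x, dim, delay):
    """Time-delay embedding: vectors (x[t], x[t+delay], ..., x[t+(dim-1)*delay])."""
    m = len(x) - (dim - 1) * delay
    return [[x[t + i * delay] for i in range(dim)] for t in range(m)]

def recurrence_plot(x, dim=3, delay=2, eps=0.5):
    """Binary recurrence matrix of the embedded trajectory (Euclidean norm)."""
    vectors = embed(x, dim, delay)
    def dist(a, b):
        return math.sqrt(sum((u - v) ** 2 for u, v in zip(a, b)))
    return [[1 if dist(a, b) < eps else 0 for b in vectors] for a in vectors]

random.seed(3)
# Assumed discharge-like signal: a seasonal cycle (period 50) plus noise
x = [math.sin(2.0 * math.pi * t / 50.0) + 0.1 * random.gauss(0.0, 1.0)
     for t in range(300)]
R = recurrence_plot(x)
```

For a periodic signal like this one, the plot shows diagonal lines spaced about one period apart; departures from that pattern are what recurrence quantification uses to flag changes in the underlying dynamics.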
QUICK - AN INTERACTIVE SOFTWARE ENVIRONMENT FOR ENGINEERING DESIGN
NASA Technical Reports Server (NTRS)
Schlaifer, R. S.
1994-01-01
QUICK provides the computer user with the facilities of a sophisticated desk calculator that can perform scalar, vector, and matrix arithmetic, propagate conic orbits, determine planetary and satellite coordinates, and perform other related astrodynamic calculations within a Fortran-like environment. QUICK is an interpreter, thereby eliminating the need to use a compiler or a linker to run QUICK code. QUICK capabilities include options for automated printing of results, the ability to submit operating system commands on some systems, and access to a plotting package (MASL) and a text editor without leaving QUICK. Mathematical and programming features of QUICK include the ability to handle arbitrary algebraic expressions, the capability to define user functions in terms of other functions, built-in constants such as pi, direct access to useful COMMON areas, matrix capabilities, extensive use of double precision calculations, and the ability to automatically load user functions from a standard library. The MASL (Multi-mission Analysis Software Library) plotting package, included in the QUICK package, is a set of FORTRAN 77 compatible subroutines designed to facilitate the plotting of engineering data by allowing programmers to write plotting-device-independent applications. Its universality lies in the number of plotting devices it puts at the user's disposal. The MASL package of routines has proved very useful and easy to work with, yielding good plots for most new users on the first or second try. The functions provided include routines for creating histograms, "wire mesh" surface plots, and contour plots as well as normal graphs with a large variety of axis types. The library has routines for plotting on cartesian, polar, log, mercator, cyclic, calendar, and stereographic axes, and for performing automatic or explicit scaling. The lengths of the axes of a plot are completely under the control of the program using the library.
Programs written to use the MASL subroutines can be made to output to the Calcomp 1055 plotter, the Hewlett-Packard 2648 graphics terminal, the HP 7221, 7475 and 7550 pen plotters, the Tektronix 40xx and 41xx series graphics terminals, the DEC VT125/VT240 graphics terminals, the QMS 800 laser printer, the Sun Microsystems monochrome display, the Ridge Computers monochrome display, the IBM/PC color display, or a "dumb" terminal or printer. Programs using this library can be written so that they always use the same type of plotter or they can allow the choice of plotter type to be deferred until after program execution. QUICK is written in RATFOR for use on Sun4 series computers running SunOS. No source code is provided. The standard distribution medium for this program is a .25 inch streaming magnetic tape cartridge in UNIX tar format. An electronic copy of the documentation in ASCII format is included on the distribution medium. QUICK was developed in 1991 and is a copyrighted work with all copyright vested in NASA.
Gartner, J.W.; Yost, B.T.
1988-01-01
Current-meter data collected at 11 stations and water level data collected at one station in Suisun and San Pablo Bays, California, in 1986 are compiled in this report. Current-meter measurements include current speed and direction, and water temperature and salinity (computed from temperature and conductivity). For each of the 19 current-meter records, data are presented in two forms. These are: (1) results of harmonic analysis; and (2) plots of tidal current speed and direction versus time and plots of temperature and salinity versus time. The spatial distribution of the properties of tidal currents is given in graphic form. In addition, Eulerian residual currents have been compiled by using a vector-averaging technique. Water level data are presented in the form of a time-series plot and the results of harmonic analysis. (USGS)
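Harmonic analysis of this kind resolves a current record into tidal constituents by least squares. A minimal sketch in pure Python, fitting a single M2 constituent to a synthetic record (the constituent choice, record length and residual flow are illustrative assumptions, not the USGS processing chain):

```python
import math

def harmonic_fit(times_hr, series, period_hr):
    """Least-squares amplitude/phase of one tidal constituent.

    Fits x(t) ~ mean + a*cos(wt) + b*sin(wt) by direct projection,
    which is exact when the record spans whole cycles.
    """
    w = 2.0 * math.pi / period_hr
    n = len(series)
    mean = sum(series) / n
    a = 2.0 / n * sum((x - mean) * math.cos(w * t) for t, x in zip(times_hr, series))
    b = 2.0 / n * sum((x - mean) * math.sin(w * t) for t, x in zip(times_hr, series))
    amplitude = math.hypot(a, b)
    phase = math.atan2(b, a)  # radians, relative to the cosine term
    return amplitude, phase

# Synthetic current-speed record: an M2 tide (12.42 h period, 0.80 m/s
# amplitude, 0.5 rad phase lag) riding on a 0.15 m/s residual flow.
M2 = 12.42
t = [i * M2 / 100 for i in range(1000)]   # ten full M2 cycles
x = [0.15 + 0.80 * math.cos(2 * math.pi * ti / M2 - 0.5) for ti in t]
amp, ph = harmonic_fit(t, x, M2)
```

With a record spanning whole cycles the projection recovers the generating amplitude and phase exactly; real records need simultaneous fitting of several constituents.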
WASP (Write a Scientific Paper) using Excel - 3: Plotting data.
Grech, Victor
2018-02-01
The plotting of data into graphs should be a mandatory step in all data analysis as part of a descriptive statistics exercise, since it gives the researcher an overview of the shape and nature of the data. Moreover, outlier values may be identified, which may be incorrect data, or true outliers, from which important findings (and publications) may arise. This exercise should always precede inferential statistics, when possible, and this paper in the Early Human Development WASP series provides some pointers for doing so in Microsoft Excel™. Copyright © 2018 Elsevier B.V. All rights reserved.
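One quick way to flag candidate outliers before plotting is the Tukey fence used by box plots (a sketch only; the 1.5×IQR convention is a common default, not a prescription from this paper):

```python
def iqr_outliers(values):
    """Flag points outside the Tukey fences Q1 - 1.5*IQR and Q3 + 1.5*IQR."""
    xs = sorted(values)
    n = len(xs)

    def quartile(q):
        # Linear interpolation between closest ranks.
        pos = q * (n - 1)
        lo = int(pos)
        hi = min(lo + 1, n - 1)
        return xs[lo] + (pos - lo) * (xs[hi] - xs[lo])

    q1, q3 = quartile(0.25), quartile(0.75)
    iqr = q3 - q1
    low, high = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    return [v for v in values if v < low or v > high]
```

Flagged points still need inspection by the researcher: they may be data-entry errors or genuine extreme observations.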
Search for Correlated Fluctuations in the Beta+ Decay of Na-22
NASA Astrophysics Data System (ADS)
Silverman, M. P.; Strange, W.
2008-10-01
Claims for a "cosmogenic" force that correlates otherwise independent stochastic events have been made for at least 10 years, based largely on visual inspection of time series of histograms whose shapes were interpreted as suggestive of recurrent patterns with semi-diurnal, diurnal, and monthly periods. Building on our earlier work to test the randomness of different nuclear decay processes, we have searched for correlations in the time series of coincident positron-electron annihilations deriving from beta+ decay of Na-22. Disintegrations were counted within a narrow time window over a period of 7 days, leading to a time series of more than 1 million events. Statistical tests were performed on the raw time series, its correlation function, and its Fourier transform to search for cyclic correlations indicative of deviations from Poisson statistics that would violate quantum mechanics. The time series was then partitioned into a sequence of 167 "bags" of 8192 events each. A histogram was made of the events of each bag, where contiguous frequency classes differed by a single count. The chronological sequence of histograms was then tested for correlations within classes. In all cases the results of the tests were in accord with statistical control, giving no evidence of correlated fluctuations.
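A basic check of this kind is the index of dispersion: for counts under statistical control (a homogeneous Poisson process) the variance-to-mean ratio of counts in fixed windows should be near one. A sketch with synthetic data (the rate, window count and seed are illustrative assumptions, not the tests used in this work):

```python
import random

def dispersion_index(counts):
    """Variance-to-mean ratio of window counts; ~1 for a homogeneous Poisson process."""
    n = len(counts)
    mean = sum(counts) / n
    var = sum((c - mean) ** 2 for c in counts) / (n - 1)
    return var / mean

def window_counts(rate, n_windows, rng):
    """Counts per unit-width window for events with exponential inter-event times."""
    counts = [0] * n_windows
    t = 0.0
    while True:
        t += rng.expovariate(rate)
        if t >= n_windows:
            return counts
        counts[int(t)] += 1

rng = random.Random(2)
counts = window_counts(5.0, 2000, rng)   # mean 5 events per window
d = dispersion_index(counts)             # expected to lie close to 1
```

Values well above 1 indicate clustering (correlated events), values well below 1 indicate regularity; either would be evidence against a simple Poisson null.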
NASA Astrophysics Data System (ADS)
Zhu, Haiyan; Shi, Liwei; Li, Shuaiqi; Zhang, Shaobo; Xia, Wangsuo
2018-02-01
Structural and electronic properties and the elastic anisotropy of hexagonal C40 XSi2 (X = Cr, Mo, W) under equibiaxial in-plane strains are systematically studied using first-principles calculations. The energy gaps change significantly with biaxial strain, whereas the compounds remain indirect band-gap materials for -6% < ɛxx < 6%. All elastic constants, the bulk modulus, shear modulus and Young's modulus increase (decrease) almost linearly with increasing compressive (tensile) strain. The evolutions of the BH/GH ratio and Poisson's ratio indicate that these compounds have better (worse) ductile behaviour under compressive (tensile) strains. A set of 3D plots shows a large directional variability in the Young's modulus E and shear modulus G at different strains for the three compounds, which is consistent with the values of the anisotropy factors. Moreover, the evolution of the Debye temperature and the anisotropy of the sound velocities with biaxial strain are discussed.
RAD-ADAPT: Software for modelling clonogenic assay data in radiation biology.
Zhang, Yaping; Hu, Kaiqiang; Beumer, Jan H; Bakkenist, Christopher J; D'Argenio, David Z
2017-04-01
We present a comprehensive software program, RAD-ADAPT, for the quantitative analysis of clonogenic assays in radiation biology. Two commonly used models for clonogenic assay analysis, the linear-quadratic model and the single-hit multi-target model, are included in the software. RAD-ADAPT uses the maximum likelihood estimation method to obtain parameter estimates under the assumption that cell colony count data follow a Poisson distribution. The program has an intuitive interface, generates model prediction plots, tabulates model parameter estimates, and allows automatic statistical comparison of parameters between different groups. The RAD-ADAPT interface is written using the statistical software R, and the underlying computations are accomplished by the ADAPT software system for pharmacokinetic/pharmacodynamic systems analysis. The use of RAD-ADAPT is demonstrated using an example that examines the impact of pharmacologic ATM and ATR kinase inhibition on the human lung cancer cell line A549 after ionizing radiation. Copyright © 2017 Elsevier B.V. All rights reserved.
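The estimation step can be sketched as follows: under the linear-quadratic model, a colony count at dose D is modeled as Poisson with mean m(D) = m0·exp(-(αD + βD²)), and (α, β) maximize the Poisson log-likelihood. A minimal grid-search sketch (not the ADAPT estimator; m0, the dose points and the grid are illustrative assumptions, and noise-free expected counts are used so the grid maximum recovers the generating parameters):

```python
import math

def loglik(alpha, beta, m0, doses, counts):
    """Poisson log-likelihood for linear-quadratic survival (constant term dropped)."""
    ll = 0.0
    for d, k in zip(doses, counts):
        mu = m0 * math.exp(-(alpha * d + beta * d * d))
        ll += k * math.log(mu) - mu
    return ll

def fit_lq(m0, doses, counts):
    """Crude grid search for the maximum-likelihood (alpha, beta)."""
    best = None
    for ia in range(101):
        a = ia * 0.01            # alpha in [0, 1] per Gy
        for ib in range(101):
            b = ib * 0.001       # beta in [0, 0.1] per Gy^2
            ll = loglik(a, b, m0, doses, counts)
            if best is None or ll > best[0]:
                best = (ll, a, b)
    return best[1], best[2]

# Expected counts generated from alpha = 0.30 /Gy, beta = 0.040 /Gy^2,
# 200 cells plated per dish.
doses = [0, 2, 4, 6, 8]
counts = [200 * math.exp(-(0.30 * d + 0.040 * d * d)) for d in doses]
a_hat, b_hat = fit_lq(200, doses, counts)
```

Because the Poisson log-likelihood with a log link is concave in (α, β), a gradient-based optimizer would normally replace the grid; the grid keeps the sketch dependency-free.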
Computers and the design of ion beam optical systems
NASA Astrophysics Data System (ADS)
White, Nicholas R.
Advances in microcomputers have made it possible to maintain a library of advanced ion optical programs that run on inexpensive computer hardware and are suitable for the design of a variety of ion beam systems, including ion implanters, giving excellent results. This paper describes in outline the steps typically involved in designing a complete ion beam system for materials modification applications. Two computer programs are described which, although based largely on algorithms which have been in use for many years, make possible detailed beam optical calculations using microcomputers, specifically the IBM PC. OPTICIAN is an interactive first-order program for tracing beam envelopes through complex optical systems. SORCERY is a versatile program for solving Laplace's and Poisson's equations by finite difference methods using successive over-relaxation. Ion and electron trajectories can be traced through these potential fields, and plots of beam emittance obtained.
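The core of such a field solver can be sketched briefly: for Laplace's equation on a uniform grid, successive over-relaxation sweeps the interior nodes, moving each toward the average of its four neighbours and overshooting by a factor ω. A sketch only (grid size, boundary conditions and ω are illustrative assumptions, not the SORCERY implementation):

```python
def solve_laplace_sor(nx, ny, v_left, v_right, omega=1.8, tol=1e-9, max_iter=10000):
    """Successive over-relaxation for Laplace's equation on a rectangular grid.

    Dirichlet boundaries: left/right edges at fixed potentials, top/bottom
    edges a linear ramp between them (a parallel-plate-like test case whose
    exact solution is linear in x).
    """
    ramp = [v_left + (v_right - v_left) * i / (nx - 1) for i in range(nx)]
    v = [[0.0] * nx for _ in range(ny)]
    for j in range(ny):
        v[j][0], v[j][-1] = v_left, v_right
    for i in range(nx):
        v[0][i] = ramp[i]
        v[-1][i] = ramp[i]
    for _ in range(max_iter):
        max_change = 0.0
        for j in range(1, ny - 1):
            for i in range(1, nx - 1):
                # Average of the four neighbours, overshot by the factor omega.
                avg = 0.25 * (v[j][i - 1] + v[j][i + 1] + v[j - 1][i] + v[j + 1][i])
                delta = avg - v[j][i]
                v[j][i] += omega * delta
                max_change = max(max_change, abs(delta))
        if max_change < tol:
            break
    return v

grid = solve_laplace_sor(21, 21, 0.0, 100.0)
```

For Poisson's equation the neighbour average simply gains a source term; over-relaxation (1 < ω < 2) converges far faster than plain Gauss-Seidel on fine grids.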
Shi, Ting-Ting; Zhang, Xiao-Bo; Guo, Lan-Ping; Huang, Lu-Qi
2017-11-01
The herbs used as raw material for traditional Chinese medicine are usually planted in mountainous areas where the natural environment is suitable. As mountain terrain is complex and the distribution of planting plots is scattered, traditional survey methods have difficulty obtaining accurate planting areas. Studying methods for extracting the planting area of Chinese herbal medicines based on remote sensing, and realizing dynamic monitoring and reserve estimation, is therefore of great significance for providing decision support for the conservation and utilization of traditional Chinese medicine resources. In this paper, taking the Panax notoginseng plots in Wenshan prefecture of Yunnan province as an example, Chinese GF-1 multispectral remote sensing images with a 16 m × 16 m resolution were obtained. Time-series images that best reflect the spectral difference between P. notoginseng sheds and background objects were then selected, and a decision tree model for extracting P. notoginseng plots was constructed according to the spectral characteristics of the surface features. The results showed that the remote sensing classification method based on the decision tree model could effectively extract P. notoginseng plots in the study area. The method can provide technical support for the extraction of P. notoginseng plots at the county level. Copyright© by the Chinese Pharmaceutical Association.
Van Allen Probes Science Gateway: A Centralized Data Access Point
NASA Astrophysics Data System (ADS)
Romeo, G.; Barnes, R. J.; Ukhorskiy, A. Y.; Sotirelis, T.; Stephens, G. K.; Kessel, R.; Potter, M.
2015-12-01
The Van Allen Probes Science Gateway acts as a centralized interface to the instrument Science Operation Centers (SOCs), provides mission planning tools, and hosts a number of science related activities such as the mission bibliography. Most importantly, the Gateway acts as the primary site for processing and delivering the Van Allen Probes Space Weather data to users. Over the past years, the web-site has been completely redesigned with a focus on easier navigation and improvements to the existing tools such as the orbit plotter, position calculator and magnetic footprint tool. In addition, a new data plotting facility based on HTML5 has been added, which allows users to interactively plot Van Allen Probes science and space weather data. The user can tailor the tool to display exactly the plot they wish to see and then share it with other users via either a URL or a QR code. Various types of plots can be created, including simple time series, data plotted as a function of orbital location, and time versus L-shell, with the capability of visualizing data from both probes (A and B) on the same plot. In cooperation with all Van Allen Probes instrument SOCs, the Science Gateway will soon be able to serve higher level data products (Level 3) and to visualize them via the above mentioned HTML5 interface. Users will also be able to create customized CDF files on the fly.
Chen, Y J; Kao, C H; Lin, S J; Tai, C C; Kwan, K S
2000-01-24
A homogeneous series of heterobimetallic complexes of [R-Fc(4-py)Ru(NH3)5](PF6)2 (R = H, Et, Br, acetyl; Fc(4-py) = 4-ferrocenylpyridine) have been prepared and characterized. The mixed-valence species generated in situ using ferrocenium hexafluorophosphate as the oxidant show class II behavior, and the oxidized sites are ruthenium centered. deltaE(1/2) = E(1/2)(Fe(III)/Fe(II)) - E(1/2)(Ru(III)/Ru(II)), an upper limit for deltaGo, the energetic difference between the donor and acceptor sites, changes sharply and linearly with the Gutmann solvent donor number (DN) and Hammett substituent constants (sigma). The solvent-dependent and substituent-dependent intervalence transfer bands were found to vary almost exclusively with deltaE(1/2). The plot of activation energy for the optical electron transfer versus deltaE(1/2) yields a common nuclear reorganization energy (lambda) of 0.74 +/- 0.04 eV for this series. The equation that incorporates the effect of both solvent donicity and substituents on optical electron transfer is Eop = lambda + deltaGo, where deltaGo = (deltaGo)intrinsic + (deltaGo)solvent donicity + (deltaGo)substituent effect. (deltaGo)intrinsic, with a numerical value of 0.083 +/- 0.045 eV, was obtained from the intercept of the deltaE(1/2) of [H-Fc(4-py)Ru(NH3)5]2+,3+,4+ versus DN plot. (deltaGo)solvent donicity was obtained from the average slopes of the deltaE(1/2) of [R-Fc(4-py)Ru(NH3)5]2+,3+,4+ versus DN plot, and (deltaGo)substituent effect was obtained from the average slopes of the corresponding deltaE(1/2) versus sigma plot. The empirical equation allows one to finely tune Eop of this series: Eop = 0.82 + 0.019(DN) + 0.44(sigma) eV at 298 K, and the discrepancy between the calculated and experimental data is less than 6%.
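As a worked example of the final empirical equation (coefficients from the abstract; the donor number used as input is only an illustrative value, not a solvent studied in the paper):

```python
def eop_ev(dn, sigma):
    """Empirical optical electron-transfer energy (eV) at 298 K for the
    [R-Fc(4-py)Ru(NH3)5] series: Eop = 0.82 + 0.019*DN + 0.44*sigma."""
    return 0.82 + 0.019 * dn + 0.44 * sigma

# Example: a solvent with donor number 20 and the unsubstituted complex (sigma = 0).
e = eop_ev(20.0, 0.0)   # 0.82 + 0.38 = 1.20 eV
```

The two linear terms make the tuning explicit: each donor-number unit shifts Eop by 0.019 eV, and each unit of the Hammett constant by 0.44 eV.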
Seismic Borehole Monitoring of CO2 Injection in an Oil Reservoir
NASA Astrophysics Data System (ADS)
Gritto, R.; Daley, T. M.; Myer, L. R.
2002-12-01
A series of time-lapse seismic cross well and single well experiments were conducted in a diatomite reservoir to monitor the injection of CO2 into a hydrofracture zone, based on P- and S-wave data. A high-frequency piezo-electric P-wave source and an orbital-vibrator S-wave source were used to generate waves that were recorded by hydrophones as well as three-component geophones. The injection well was located about 12 m from the source well. During the pre-injection phase water was injected into the hydrofracture zone. The set of seismic experiments was repeated after a time interval of 7 months during which CO2 was injected into the hydrofractured zone. The questions to be answered ranged from the detectability of the geologic structure in the diatomite reservoir to the detectability of CO2 within the hydrofracture. Furthermore it was intended to determine which experiment (cross well or single well) is best suited to resolve these features. During the pre-injection experiment, the P-wave velocities exhibited relatively low values between 1700-1900 m/s, which decreased to 1600-1800 m/s during the post-injection phase (-5%). The analysis of the pre-injection S-wave data revealed slow S-wave velocities between 600-800 m/s, while the post-injection data revealed velocities between 500-700 m/s (-6%). These velocity estimates produced high Poisson ratios between 0.36 and 0.46 for this highly porous (~ 50%) material. Differencing post- and pre-injection data revealed an increase in Poisson ratio of up to 5%. Both the velocity and the Poisson ratio estimates indicate the dissolution of CO2 in the liquid phase of the reservoir accompanied by a pore-pressure increase. The single well data supported the findings of the cross well experiments. P- and S-wave velocities as well as Poisson ratios were comparable to the estimates of the cross well data.
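The Poisson ratios quoted above follow directly from the P- and S-wave velocities via the standard isotropic relation; a quick sketch (the input velocities are mid-range values from the abstract, used only for illustration):

```python
def poisson_ratio(vp, vs):
    """Isotropic Poisson ratio from P- and S-wave velocities (same units)."""
    r2 = (vp / vs) ** 2
    return (r2 - 2.0) / (2.0 * (r2 - 1.0))

# Mid-range pre-injection velocities from the cross well survey.
nu = poisson_ratio(1800.0, 700.0)   # ~0.41, inside the reported 0.36-0.46
```

Because the ratio depends only on Vp/Vs, the post-injection drop in S-wave velocity relative to P-wave velocity is what drives the reported increase in Poisson ratio.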
Effect of Season on the Persistence of Bacterial Pathogens in Runoff from Agricultural Plots
Runoff from agricultural fields undergoing manure applications may carry a variety of chemical and microbial contaminants that compromise water quality and increase the possibility of human exposure to pathogenic microorganisms when recreational waters are impacted. A series of r...
Long-term vegetation monitoring in Great Britain - the Countryside Survey 1978-2007 and beyond
NASA Astrophysics Data System (ADS)
Wood, Claire M.; Smart, Simon M.; Bunce, Robert G. H.; Norton, Lisa R.; Maskell, Lindsay C.; Howard, David C.; Scott, W. Andrew; Henrys, Peter A.
2017-07-01
The Countryside Survey (CS) of Great Britain provides a globally unique series of datasets, consisting of an extensive set of repeated ecological measurements at a national scale, covering a time span of 29 years. CS was first undertaken in 1978 to monitor ecological and land use change in Britain using standardised procedures for recording ecological data from representative 1 km squares throughout the country. The same sites, with some additional squares, were used for subsequent surveys of vegetation undertaken in 1990, 1998 and 2007, with the intention of future surveys. Other data records include soils, freshwater habitats and invertebrates, and land cover and landscape feature diversity and extents. These data have been recorded in the same locations on analogous dates. However, the present paper describes only the details of the vegetation surveys. The survey design is a series of gridded, stratified, randomly selected 1 km squares taken as representative of classes derived from a statistical environmental classification of Britain. In the 1978 survey, 256 one-kilometre sample squares were recorded, increasing to 506 in 1990, 569 in 1998 and 591 in 2007. Initially each square contained up to 11 dispersed vegetation plots but additional plots were later placed in different features so that eventually up to 36 additional sampling plots were recorded, all of which can be relocated where possible (unless the plot has been lost, for example as a consequence of building work), providing a total of 16 992 plots by 2007. Plots are estimated to have a precise relocation accuracy of 85 %. A range of plots located in different land cover types and landscape features (for example, field boundaries) are included. Although a range of analyses have already been carried out, with changes in the vegetation being related to a range of drivers at local and national scales, there is major potential for further analyses, for example in relation to climate change. 
Although the precise locations of the plots are restricted, largely for reasons of landowner confidentiality, sample sites are intended to be representative of larger areas, and many potential opportunities for further analyses remain. Data from each of the survey years (1978, 1990, 1998, 2007) are available via the following DOIs: Countryside Survey 1978 vegetation plot data (https://doi.org/10.5285/67bbfabb-d981-4ced-b7e7-225205de9c96), Countryside Survey 1990 vegetation plot data (https://doi.org/10.5285/26e79792-5ffc-4116-9ac7-72193dd7f191), Countryside Survey 1998 vegetation plot data (https://doi.org/10.5285/07896bb2-7078-468c-b56d-fb8b41d47065), Countryside Survey 2007 vegetation plot data (https://doi.org/10.5285/57f97915-8ff1-473b-8c77-2564cbd747bc).
Pairing preferences of the model mono-valence mono-atomic ions investigated by molecular simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Qiang; Department of Chemistry, Bohai University, Jinzhou 121000; Zhang, Ruiting
2014-05-14
We carried out a series of potential of mean force calculations to study the pairing preferences of a series of model mono-atomic 1:1 ions with evenly varied sizes. The probabilities of forming the contact ion pair (CIP) and the single water separated ion pair (SIP) were presented in two-dimensional plots with respect to the ion sizes. The pairing preferences reflected in these plots largely agree with the empirical rule of matching ion sizes in the small and big size regions. In the region where the ion sizes are close to the size of the water molecule, however, a significant deviation from this conventional rule is observed. Our further analysis indicated that this deviation originates from the competition between the CIP and the water-bridging SIP state. The competition is mainly an enthalpy-modulated phenomenon in which the existence of the water bridging plays a significant role.
Multi-Scale Mapping of Vegetation Biomass
NASA Astrophysics Data System (ADS)
Hudak, A. T.; Fekety, P.; Falkowski, M. J.; Kennedy, R. E.; Crookston, N.; Smith, A. M.; Mahoney, P.; Glenn, N. F.; Dong, J.; Kane, V. R.; Woodall, C. W.
2016-12-01
Vegetation biomass mapping at multiple scales is important for carbon inventory and monitoring, reporting, and verification (MRV). Project-level lidar collections allow biomass estimation with high confidence where associated with field plot measurements. Predictive models developed from such datasets are customarily used to generate landscape-scale biomass maps. We tested the feasibility of predicting biomass in landscapes surveyed with lidar but without field plots, by withholding plot datasets from a reduced model applied to the landscapes, and found support for a generalized model in the northern Idaho ecoregion. We are also upscaling a generalized model to all forested lands in Idaho. Our regional modeling approach is to sample the 30-m biomass predictions from the landscape-scale maps and use them to train a regional biomass model, using Landsat time series, topographic derivatives, and climate variables as predictors. Our regional map validation approach is to aggregate the regional, annual biomass predictions to the county level and compare them to annual county-level biomass summarized independently from systematic, field-based, annual inventories conducted by the US Forest Inventory and Analysis (FIA) Program nationally. A national-scale forest cover map generated independently from 2010 PALSAR data at 25-m resolution is being used to mask non-forest pixels from the aggregations. Effects of climate change on future regional biomass stores are also being explored, using biomass estimates projected from stand-level inventory data collected in the National Forests and comparing them to FIA plot data collected independently on public and private lands, projected under the same climate change scenarios, with disturbance trends extracted from the Landsat time series. Our ultimate goal is to demonstrate, focusing on the ecologically diverse Northwest region of the USA, a carbon monitoring system (CMS) that is accurate, objective, repeatable, and transparent.
NASA Technical Reports Server (NTRS)
Ladson, C. L.; Brooks, Cuyler W., Jr.
1975-01-01
A computer program developed to calculate the ordinates and surface slopes of any thickness, symmetrical or cambered NACA airfoil of the 4-digit, 4-digit modified, 5-digit, and 16-series airfoil families is presented. The program produces plots of the airfoil nondimensional ordinates and a punch card output of ordinates in the input format of a readily available program for determining the pressure distributions of arbitrary airfoils in subsonic potential viscous flow.
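The thickness distribution underlying the 4-digit family is a published closed-form polynomial; a minimal sketch of it (the standard NACA equation, not this program's FORTRAN source, which is not provided):

```python
import math

def naca4_thickness(x, t):
    """Half-thickness of a NACA 4-digit airfoil at chord fraction x,
    for maximum thickness t (both expressed as fractions of chord)."""
    return 5.0 * t * (0.2969 * math.sqrt(x) - 0.1260 * x
                      - 0.3516 * x ** 2 + 0.2843 * x ** 3 - 0.1015 * x ** 4)

# NACA 0012: 12% thick symmetrical section; maximum half-thickness near x = 0.30.
y = naca4_thickness(0.30, 0.12)   # ~0.060 of chord
```

For cambered sections the program adds a mean line and distributes this thickness perpendicular to it; the 4-digit modified, 5-digit and 16-series families use different thickness and camber polynomials.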
PROMIS series. Volume 8: Midlatitude ground magnetograms
NASA Technical Reports Server (NTRS)
Fairfield, D. H.; Russell, C. T.
1990-01-01
This is the eighth in a series of volumes pertaining to the Polar Region Outer Magnetosphere International Study (PROMIS). This volume contains 24 hour stack plots of 1-minute average, H and D component, ground magnetograms for the period March 10 through June 16, 1986. Nine midlatitude ground stations were selected from the UCLA magnetogram data base that was constructed from all available digitized magnetogram stations. The primary purpose of this publication is to allow users to define universal times and onset longitudes of magnetospheric substorms.
NASA Astrophysics Data System (ADS)
Donges, J. F.; Schleussner, C.-F.; Siegmund, J. F.; Donner, R. V.
2016-05-01
Studying event time series is a powerful approach for analyzing the dynamics of complex dynamical systems in many fields of science. In this paper, we describe the method of event coincidence analysis, which provides a framework for quantifying the strength, directionality and time lag of statistical interrelationships between event series. Event coincidence analysis allows one to formulate and test null hypotheses on the origin of the observed interrelationships, including tests based on Poisson processes or, more generally, stochastic point processes with a prescribed inter-event time distribution and other higher-order properties. Applying the framework to country-level observational data yields evidence that flood events have acted as triggers of epidemic outbreaks globally since the 1950s. Facing projected future changes in the statistics of climatic extreme events, statistical techniques such as event coincidence analysis will be relevant for investigating the impacts of anthropogenic climate change on human societies and ecosystems worldwide.
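The counting step of event coincidence analysis can be sketched simply: the precursor coincidence rate is the fraction of events in one series preceded, within a tolerance window ΔT, by at least one event in the other. A sketch of the counting only, with illustrative event times (significance testing against a Poisson null is a separate step):

```python
def precursor_rate(a_times, b_times, delta_t):
    """Fraction of events in b_times preceded by an event in a_times
    within the window (t_b - delta_t, t_b]."""
    hits = 0
    for tb in b_times:
        if any(tb - delta_t < ta <= tb for ta in a_times):
            hits += 1
    return hits / len(b_times)

# Hypothetical event times (arbitrary units): do floods precede outbreaks?
floods = [2.0, 5.5, 9.0, 14.0]
outbreaks = [2.5, 6.0, 12.0]
r = precursor_rate(floods, outbreaks, delta_t=1.0)   # 2 of 3 outbreaks preceded
```

Swapping the two series (and the window direction) gives the trigger rate, so the same counting yields the directionality of the interrelationship.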
Improved detection of radioactive material using a series of measurements
NASA Astrophysics Data System (ADS)
Mann, Jenelle
The goal of this project is to develop improved algorithms for the detection of radioactive sources that have low signal compared to background. The detection of low-signal sources is of interest in national security applications where the source may have weak ionizing radiation emissions, is heavily shielded, or the counting time is short (such as portal monitoring). Traditionally, to distinguish signal from background, the decision threshold (y*) is calculated by taking a long background count and limiting the false positive error (alpha error) to 5%. Some problems with this method include: the background is constantly changing due to natural environmental fluctuations, and large amounts of data taken as the detector continuously scans go unutilized. Rather than looking at a single measurement, this work investigates a series of N measurements and develops an appropriate decision rule based on exceeding the decision threshold n times in a series of N. This methodology is investigated for rectangular, triangular, sinusoidal, Poisson, and Gaussian distributions.
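The n-of-N rule can be sketched as follows: if a single measurement exceeds y* with false-alarm probability p under background, and the N measurements are independent, the number of exceedances is binomial, and n is chosen so the overall false-alarm rate stays below the target. A sketch of the combinatorics only (the distributions studied in the work each supply their own single-measurement p):

```python
from math import comb

def prob_n_or_more(p, n, big_n):
    """P(at least n of N independent measurements exceed the threshold)."""
    return sum(comb(big_n, k) * p ** k * (1 - p) ** (big_n - k)
               for k in range(n, big_n + 1))

def smallest_n(p, big_n, target_alpha):
    """Smallest n giving an overall false-alarm rate <= target_alpha."""
    for n in range(1, big_n + 1):
        if prob_n_or_more(p, n, big_n) <= target_alpha:
            return n
    return big_n

# Single-measurement false-alarm probability 5%, series of N = 10 measurements.
n_star = smallest_n(0.05, 10, 0.05)
```

Requiring a single exceedance in N measurements would inflate the false-alarm rate (here to about 40%), which is exactly why the multi-measurement rule needs n > 1.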
Expansion of the gravitational potential with computerized Poisson series
NASA Technical Reports Server (NTRS)
Broucke, R.
1976-01-01
The paper describes a recursive formulation for the expansion of the gravitational potential valid for both the tesseral and zonal harmonics. The expansion is primarily in rectangular coordinates, but the classical orbit elements or equinoctial orbit elements can be easily substituted. The equations of motion for the zonal harmonics in both classical and equinoctial orbital elements are described in a form which will result in closed-form expressions for the first-order perturbations. In order to achieve this result, the true longitude or true anomaly have to be used as independent variables.
Segmented Poincaré plot analysis for risk stratification in patients with dilated cardiomyopathy.
Voss, A; Fischer, C; Schroeder, R; Figulla, H R; Goernig, M
2010-01-01
The prognostic value of heart rate variability in patients with dilated cardiomyopathy (DCM) is limited and does not contribute to risk stratification, although the dynamics of ventricular repolarization differs considerably between DCM patients and healthy subjects. Neither linear nor nonlinear methods of heart rate variability analysis could discriminate between patients at high and low risk for sudden cardiac death. The aim of this study was to analyze the suitability of the newly developed segmented Poincaré plot analysis (SPPA) to enhance risk stratification in DCM. In contrast to the usual Poincaré plot analysis, SPPA retains nonlinear features of the investigated beat-to-beat interval time series. The main features of SPPA are the rotation of the cloud of points and its subsequent variability-dependent segmentation. Significant row and column probabilities were calculated from the segments and led to discrimination (up to p<0.005) between low and high risk in DCM patients. For the first time, an index from Poincaré plot analysis of heart rate variability was able to contribute to risk stratification in patients suffering from DCM.
Fractional Poisson Fields and Martingales
NASA Astrophysics Data System (ADS)
Aletti, Giacomo; Leonenko, Nikolai; Merzbach, Ely
2018-02-01
We present new properties for the Fractional Poisson process (FPP) and the Fractional Poisson field on the plane. A martingale characterization for FPPs is given. We extend this result to Fractional Poisson fields, obtaining some other characterizations. The fractional differential equations are studied. We consider a more general Mixed-Fractional Poisson process and show that this process is the stochastic solution of a system of fractional differential-difference equations. Finally, we give some simulations of the Fractional Poisson field on the plane.
On a Poisson homogeneous space of bilinear forms with a Poisson-Lie action
NASA Astrophysics Data System (ADS)
Chekhov, L. O.; Mazzocco, M.
2017-12-01
Let \\mathscr A be the space of bilinear forms on C^N with defining matrices A endowed with a quadratic Poisson structure of reflection equation type. The paper begins with a short description of previous studies of the structure, and then this structure is extended to systems of bilinear forms whose dynamics is governed by the natural action A\\mapsto B ABT} of the {GL}_N Poisson-Lie group on \\mathscr A. A classification is given of all possible quadratic brackets on (B, A)\\in {GL}_N× \\mathscr A preserving the Poisson property of the action, thus endowing \\mathscr A with the structure of a Poisson homogeneous space. Besides the product Poisson structure on {GL}_N× \\mathscr A, there are two other (mutually dual) structures, which (unlike the product Poisson structure) admit reductions by the Dirac procedure to a space of bilinear forms with block upper triangular defining matrices. Further generalisations of this construction are considered, to triples (B,C, A)\\in {GL}_N× {GL}_N× \\mathscr A with the Poisson action A\\mapsto B ACT}, and it is shown that \\mathscr A then acquires the structure of a Poisson symmetric space. Generalisations to chains of transformations and to the quantum and quantum affine algebras are investigated, as well as the relations between constructions of Poisson symmetric spaces and the Poisson groupoid. Bibliography: 30 titles.
NASA Astrophysics Data System (ADS)
Theodorsen, A.; Garcia, O. E.; Rypdal, M.
2017-05-01
Filtered Poisson processes are often used as reference models for intermittent fluctuations in physical systems. Such a process is here extended by adding a noise term, either as a purely additive term to the process or as a dynamical term in a stochastic differential equation. The lowest order moments, probability density function, auto-correlation function and power spectral density are derived and used to identify and compare the effects of the two different noise terms. Monte Carlo studies of synthetic time series are used to investigate the accuracy of model parameter estimation and to identify methods for distinguishing the noise types. It is shown that the probability density function and the three lowest order moments provide accurate estimates of the model parameters but are unable to separate the noise types. The auto-correlation function and the power spectral density also provide methods for estimating the model parameters, and are additionally capable of identifying the noise type. The number of times the signal crosses a prescribed threshold level in the positive direction also promises to be able to differentiate the noise type.
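Such a signal can be synthesized directly: one-sided exponential pulses arriving at Poisson times, plus an additive Gaussian noise term. A sketch under illustrative parameter choices (pulse shape, rate, amplitude distribution and noise level are assumptions for the demonstration, not the paper's estimation machinery):

```python
import math
import random

def filtered_poisson(duration, dt, rate, amp_mean, tau, noise_sd, seed=1):
    """Filtered Poisson process: exponentially distributed pulse amplitudes,
    one-sided exponential pulses of duration tau at Poisson arrival times,
    plus additive Gaussian noise of standard deviation noise_sd."""
    rng = random.Random(seed)
    n = int(duration / dt)
    signal = [0.0] * n
    t = 0.0
    while True:
        t += rng.expovariate(rate)       # Poisson arrivals
        if t >= duration:
            break
        a = rng.expovariate(1.0 / amp_mean)
        for k in range(int(t / dt), n):
            pulse = a * math.exp(-(k * dt - t) / tau)
            if pulse < 1e-12:
                break                    # pulse has decayed to nothing
            signal[k] += pulse
    return [s + rng.gauss(0.0, noise_sd) for s in signal]

sig = filtered_poisson(duration=1000.0, dt=0.1, rate=0.5, amp_mean=1.0,
                       tau=2.0, noise_sd=0.05)
m = sum(sig) / len(sig)   # stationary mean ~ rate * amp_mean * tau = 1.0
```

The product of pulse rate and duration sets the intermittency of the realization, which is what the moment- and correlation-based estimators in the paper aim to recover.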
Improving long time behavior of Poisson bracket mapping equation: A non-Hamiltonian approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Hyun Woo; Rhee, Young Min, E-mail: ymrhee@postech.ac.kr
2014-05-14
Understanding nonadiabatic dynamics in complex systems is a challenging subject. A series of semiclassical approaches have been proposed to tackle the problem in various settings. The Poisson bracket mapping equation (PBME) utilizes a partial Wigner transform and a mapping representation for its formulation, and has been developed to describe nonadiabatic processes in an efficient manner. Operationally, it is expressed as a set of Hamilton's equations of motion, similar to more conventional classical molecular dynamics. However, this original Hamiltonian PBME sometimes suffers from a large deviation in accuracy, especially in the long time limit. Here, we propose a non-Hamiltonian variant of PBME to improve its behavior especially in that limit. As a benchmark, we simulate spin-boson and photosynthetic model systems and find that it consistently outperforms the original PBME and its Ehrenfest style variant. We explain the source of this improvement by decomposing the components of the mapping Hamiltonian and by assessing the energy flow between the system and the bath. We discuss strengths and weaknesses of our scheme with a viewpoint of offering future prospects.
NASA Astrophysics Data System (ADS)
Mahato, Somnath; Puigdollers, Joaquim
2018-02-01
Temperature dependent current-voltage (I-V) characteristics of Au/n-type silicon (n-Si) Schottky barrier diodes have been investigated. Three transition metal oxides (TMOs) are used as an interface layer between gold and silicon. The basic Schottky diode parameters, such as the ideality factor (n), barrier height (ϕb0) and series resistance (Rs), are calculated and successfully explained by thermionic emission (TE) theory. It was found that the ideality factor decreased and the barrier height increased with increasing temperature. The conventional Richardson plot of ln(I0/T2) vs. 1000/T was used to determine the activation energy (Ea) and the Richardson constant (A*), whose value is much smaller than the known theoretical value for n-type Si. The temperature dependent I-V characteristics yielded the mean barrier height (ϕb0) and standard deviation (σs) from the linear plot of ϕap vs. 1000/T. The modified Richardson plot of ln(I0/T2) - (qσs)2/2(kT)2 vs. 1000/T gives the Richardson constant and the homogeneous barrier height of the Schottky diodes. The main observation of the present work is that the barrier height and ideality factor show a considerable change with temperature, whereas the series resistance exhibits a negligible change due to the TMO interface layer.
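The ideality-factor extraction can be sketched from the TE equation I = I0·exp(qV/(nkT)) (forward bias well above kT/q, series-resistance effects neglected): n follows from the slope of ln I vs. V. A sketch with synthetic data (the diode parameters are illustrative, not measurements from this work):

```python
import math

K_B_OVER_Q = 8.617e-5  # Boltzmann constant / elementary charge, volts per kelvin

def ideality_factor(v1, i1, v2, i2, temp_k):
    """Ideality factor from two forward-bias points on the ln(I)-V line,
    assuming pure thermionic emission with negligible series resistance."""
    slope = (math.log(i2) - math.log(i1)) / (v2 - v1)   # = q/(n k T)
    return 1.0 / (slope * K_B_OVER_Q * temp_k)

# Synthetic diode: n = 1.2 at 300 K, saturation current I0 = 1e-9 A.
n_kt = 1.2 * K_B_OVER_Q * 300.0
i_of = lambda v: 1e-9 * math.exp(v / n_kt)
n_est = ideality_factor(0.20, i_of(0.20), 0.30, i_of(0.30), 300.0)
```

In practice the slope is taken from a least-squares fit over the linear region of the semi-log plot, and the intercept gives I0 for the Richardson analysis described above.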
On the Singularity of the Vlasov-Poisson System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zheng, Jian; Qin, Hong
2013-04-26
The Vlasov-Poisson system can be viewed as the collisionless limit of the corresponding Fokker-Planck-Poisson system. It is reasonable to expect that the result of Landau damping can also be obtained from the Fokker-Planck-Poisson system when the collision frequency ν approaches zero. However, we show that the collisionless Vlasov-Poisson system is a singular limit of the collisional Fokker-Planck-Poisson system, and Landau's result can be recovered only as ν approaches zero from the positive side.
Using the Graphing Calculator--in Two-Dimensional Motion Plots.
ERIC Educational Resources Information Center
Brueningsen, Chris; Bower, William
1995-01-01
Presents a series of simple activities involving generalized two-dimensional motion topics to prepare students to study projectile motion. Uses a pair of motion detectors, each connected to a calculator-based-laboratory (CBL) unit interfaced with a standard graphics calculator, to explore two-dimensional motion. (JRH)
ERIC Educational Resources Information Center
Moore, John W., Ed.
1988-01-01
Describes five computer software packages, four for MS-DOS systems and one for the Apple II. Included are SPEC20, an interactive simulation of a Bausch and Lomb Spectronic-20; a database for laboratory chemicals; and programs for visualizing Boltzmann-like distributions, orbital plots for the hydrogen atom, and molecular orbital theory. (CW)
Operational Control Procedures for the Activated Sludge Process: Appendix.
ERIC Educational Resources Information Center
West, Alfred W.
This document is the appendix for a series of documents developed by the National Training and Operational Technology Center describing operational control procedures for the activated sludge process used in wastewater treatment. Categories discussed include: control test data, trend charts, moving averages, semi-logarithmic plots, probability…
2011-05-01
failure resistance, which results from their different microplasticity (microbrittleness) and relaxation ability. In order to evaluate the... microplasticity (microbrittleness) in the series of isomorphic hexaborides produced by zone melting we have plotted a number of statistical curves that show
NASA Astrophysics Data System (ADS)
Saadi, Sameh; Simonneaux, Vincent; Boulet, Gilles; Mougenot, Bernard; Zribi, Mehrez; Lili Chabaane, Zohra
2015-04-01
Water scarcity is one of the main factors limiting agricultural development in semi-arid areas. It is thus of major importance to design tools allowing better management of this resource. Remote sensing has long been used to compute evapotranspiration estimates, an input for crop water balance monitoring. Up to now, only medium- and low-resolution data (e.g. MODIS) have been available on a regular basis to monitor cultivated areas. However, the increasing availability of high-resolution, high-repetitivity VIS-NIR remote sensing, like the forthcoming Sentinel-2 mission to be launched in 2015, offers an unprecedented opportunity to improve this monitoring. In this study, regional crop water consumption was estimated with the SAMIR software (Satellite Monitoring of Irrigation) using the FAO-56 dual crop coefficient water balance model fed with high-resolution NDVI image time series providing estimates of both the actual basal crop coefficient (Kcb) and the vegetation fraction cover. The model includes a soil water module, requiring knowledge of the soil water holding capacity, maximum rooting depth, and water inputs. As irrigations are usually not known over large areas, they are simulated based on rules reproducing farmer practices. The main objective of this work is to assess the operationality and accuracy of SAMIR at plot and perimeter scales, when several land use types (winter cereals, summer vegetables…), irrigation and agricultural practices are intertwined in a given landscape, including complex canopies such as sparse orchards. Meteorological ground stations were used to compute the reference evapotranspiration and obtain rainfall depths. Two time series of ten and fourteen high-resolution SPOT5 images were acquired for the 2008-2009 and 2012-2013 hydrological years over an irrigated area in central Tunisia, spanning the successive crop seasons.
The images were radiometrically corrected, first using the SMAC6s algorithm and second using invariant objects located in the scene, identified by visual inspection of the images. From these time series, a Normalized Difference Vegetation Index (NDVI) profile was generated for each pixel. SAMIR was first calibrated against ground measurements of evapotranspiration obtained with eddy-correlation devices installed on irrigated wheat and barley plots. After calibration, the model was run to spatialize irrigation over the whole area, and a validation was performed using cumulative seasonal water volumes obtained from ground surveys at both plot and perimeter scales. The results show that although determination of model parameters was successful at plot scale, the irrigation rules required an additional calibration, which was achieved at perimeter scale.
On the fractal characterization of Paretian Poisson processes
NASA Astrophysics Data System (ADS)
Eliazar, Iddo I.; Sokolov, Igor M.
2012-06-01
Paretian Poisson processes are Poisson processes which are defined on the positive half-line, have maximal points, and are quantified by power-law intensities. Paretian Poisson processes are elemental in statistical physics, and are the bedrock of a host of power-law statistics ranging from Pareto's law to anomalous diffusion. In this paper we establish evenness-based fractal characterizations of Paretian Poisson processes. Considering an array of socioeconomic evenness-based measures of statistical heterogeneity, we show that: amongst the realm of Poisson processes which are defined on the positive half-line, and have maximal points, Paretian Poisson processes are the unique class of 'fractal processes' exhibiting scale-invariance. The results established in this paper are diametric to previous results asserting that the scale-invariance of Poisson processes-with respect to physical randomness-based measures of statistical heterogeneity-is characterized by exponential Poissonian intensities.
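A Paretian Poisson process can be simulated directly from its definition: since the expected number of points above level x is Λ(x) = (c/α)x^(-α), mapping the arrival times of a unit-rate Poisson process through Λ^(-1) yields the points in decreasing order from the maximal point. The parameter values below are illustrative, not from the paper.

```python
import random

# Paretian Poisson process sketch: points on (0, inf) with power-law
# intensity lam(x) = c * x**(-1-alpha). The expected number of points
# above level x is Lambda(x) = (c/alpha) * x**(-alpha), so mapping the
# arrival times G_1 < G_2 < ... of a unit-rate Poisson process through
# the inverse of Lambda gives the points in decreasing order:
#   X_n = ((c/alpha) / G_n) ** (1/alpha)
# c and alpha are illustrative choices.

def paretian_points(n, c=1.0, alpha=1.5, rng=None):
    rng = rng or random.Random(0)
    points, g = [], 0.0
    for _ in range(n):
        g += rng.expovariate(1.0)  # unit-rate Poisson arrival times
        points.append(((c / alpha) / g) ** (1.0 / alpha))
    return points  # decreasing: points[0] is the maximal point

pts = paretian_points(5)
print(pts[0] > pts[1] > pts[2])  # True: the process has a maximal point
```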
NASA Astrophysics Data System (ADS)
Cassidy, Rachel; Doody, Donnacha; Watson, Catherine
2016-04-01
Despite the implementation of EU regulations controlling the use of fertilisers in agriculture, reserves of phosphorus (P) in soils continue to pose a threat to water quality. Mobilisation and transport of legacy P from soil to surface waters has been highlighted as a probable cause of many water bodies continuing to fail to achieve targets under the Water Framework Directive. However, the rates and quantities lost from farmland, and the timescales for positive change to water quality, following cessation of P inputs, remain poorly understood. Monitoring data from an instrumented grassland research site in Northern Ireland provide some insights. The site is located in a hydrologically 'flashy' landscape characterised by steep gradients and poorly drained soils over impermeable bedrock. Between 2000 and 2005 soil Olsen P concentrations were altered in five 0.2 ha hydrologically isolated grazed grassland plots through chemical fertiliser applications of 0, 10, 20, 40, 80 kg P ha-1yr-1. By 2004 this had resulted in soil Olsen P concentrations of 19, 24, 28, 38 and 67 mg P L-1 across the plots, after which applications ceased. Subsequently, until 2012, changes in soil Olsen P across the plots and losses to overland flow and drainage were monitored, with near-continuous flow measurement and water samples abstracted for chemical analysis. Runoff events were sampled at 20 minute intervals while drainage flows were taken as a weekly composite of 4-hourly samples. Overland flow events were defined by at least 24 hours without flow being recorded at the respective plot outlets. Drainage flow was examined on a weekly basis as it was continuous except during prolonged dry periods. To examine the hydrological drivers of overland flow and drainage losses the dissolved reactive P (DRP) and total P (TP) time series were synchronised with rainfall data and modelled soil moisture deficits. 
Results demonstrated that from 2005-2012 there was no significant difference among plots in the recorded TP and DRP time series for either overland flow or drainage flow despite the large variation in soil Olsen P. Flow-weighted mean concentrations for overland flow losses declined slightly over the period but remained in excess of the chemical Environmental Quality Standard in all plots (EQS; 0.035 mg/L). In individual events the plot receiving zero P fertiliser inputs since 2000 often lost as much, or more, P than the plot which received 80 kg ha-1 yr-1 up to 2005. Annual loads also reflect this. Drainage losses showed no decline over the period. The hydrological drivers, particularly the antecedent dry period and soil moisture, were observed to have a greater influence on P loss from the plots than soil P status. Given that Olsen P often forms the basis of nutrient management advice this raises questions on the environmental sustainability of current nutrient advice for some soil types under similar geoclimatic conditions.
1985-05-08
Figure captions from the report: Displacement Time Series Forecasts for Channels 2 Through 8 and Channels 9 Through 16; plots for sensors at Levels 99 and 119 on the West Cell Rail, Level 69 on the East and West Cell Rail Footings, and Level 99 on the West Cell Rail; Sample PSD Plots for Levels 99 and 119 East Cell Rail Locations, Level 69 Sensors on the East and West Cell Rail Footings, and Level 99 on the West Cell Rail.
Alcohol Messages in Prime-Time Television Series
RUSSELL, CRISTEL ANTONIA; RUSSELL, DALE W.
2010-01-01
Alcohol messages contained in television programming serve as sources of information about drinking. To better understand the ways embedded messages about alcohol are communicated, it is crucial to objectively monitor and analyze television alcohol depictions. This article presents a content analysis of an eight-week sample of eighteen prime-time programs. Alcohol messages were coded based on modalities of presentation, level of plot connection, and valence. The analysis reveals that mixed messages about alcohol often coexist but the ways in which they are presented differ: whereas negative messages are tied to the plot and communicated verbally, positive messages are associated with subtle visual portrayals. PMID:21188281
Farag, Tamer H.; Faruque, Abu S.; Wu, Yukun; Das, Sumon K.; Hossain, Anowar; Ahmed, Shahnawaz; Ahmed, Dilruba; Nasrin, Dilruba; Kotloff, Karen L.; Panchilangam, Sandra; Nataro, James P.; Cohen, Dani; Blackwelder, William C.; Levine, Myron M.
2013-01-01
Background: Shigella infections are a public health problem in developing and transitional countries because of high transmissibility, severity of clinical disease, widespread antibiotic resistance and lack of a licensed vaccine. Whereas Shigellae are known to be transmitted primarily by direct fecal-oral contact and less commonly by contaminated food and water, the role of the housefly Musca domestica as a mechanical vector of transmission is less appreciated. We sought to assess the contribution of houseflies to Shigella-associated moderate-to-severe diarrhea (MSD) among children less than five years old in Mirzapur, Bangladesh, a site where shigellosis is hyperendemic, and to model the potential impact of a housefly control intervention. Methods: Stool samples from 843 children presenting to Kumudini Hospital during 2009–2010 with new episodes of MSD (diarrhea accompanied by dehydration, dysentery or hospitalization) were analyzed. Housefly density was measured twice weekly in six randomly selected sentinel households. Poisson time series regression was performed and autoregression-adjusted attributable fractions (AFs) were calculated using the Bruzzi method, with standard errors via jackknife procedure. Findings: Dramatic springtime peaks in housefly density in 2009 and 2010 were followed one to two months later by peaks of Shigella-associated MSD among toddlers and pre-school children. Poisson time series regression showed that housefly density was associated with Shigella cases at three lags (six weeks) (Incidence Rate Ratio = 1.39 [95% CI: 1.23 to 1.58] for each log increase in fly count), an association that was not confounded by ambient air temperature. Autocorrelation-adjusted AF calculations showed that a housefly control intervention could have prevented approximately 37% of the Shigella cases over the study period. Interpretation: Houseflies may play an important role in the seasonal transmission of Shigella in some developing country ecologies. 
Interventions to control houseflies should be evaluated as possible additions to the public health arsenal to diminish Shigella (and perhaps other causes of) diarrheal infection. PMID:23818998
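In a Poisson regression with the usual log link, the reported incidence rate ratio has a simple interpretation: each one-log-unit increase in the lagged fly count multiplies the expected case count by exp(β1). A minimal sketch, with an illustrative intercept and β1 chosen to reproduce the IRR of 1.39 reported in the abstract:

```python
import math

# Poisson regression with a log link models the expected case count as
#   mu_t = exp(beta0 + beta1 * x_{t-3})
# where x is the log housefly count lagged three two-week periods (six weeks).
# The incidence rate ratio (IRR) per one-log-unit increase in fly count is
# exp(beta1). beta0 and the fly counts are illustrative values only; beta1
# is chosen so that exp(beta1) equals the reported IRR of 1.39.

beta0 = 0.5
beta1 = math.log(1.39)  # IRR = exp(beta1) = 1.39

def expected_cases(log_flies_lagged):
    return math.exp(beta0 + beta1 * log_flies_lagged)

irr = expected_cases(3.0) / expected_cases(2.0)  # one log-unit increase
print(round(irr, 2))  # 1.39
```

The ratio is independent of the baseline level, which is why a single IRR summarizes the association across the whole range of fly counts.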
Time Frequency Analysis of The Land Subsidence Monitored Data with Exploration Geophysics
NASA Astrophysics Data System (ADS)
Wang, Shang-Wei
2014-05-01
Taiwan's geography and the water demands of various industries have caused excessive groundwater extraction in the Zhuoshui River fan, leading to land subsidence that affects the safety of high-speed rail traffic and public infrastructure. A deeper investigation of the causes and behavior of the subsidence is therefore necessary. All related factors are considered, including the groundwater extracted for each industry, the impact of climate-change-driven rainfall, and the characteristics of the ground formation. A series of in-situ measurements was conducted and the monitoring data analyzed with the Hilbert-Huang Transform (HHT). The subsidence mechanism is discussed and the possible future impact on high-speed rail traffic estimated, providing a reference for future remediation. We investigate the characteristics of land subsidence in the Yun Lin area. HHT and signal normalization are used to discuss the physical meanings of, and interactions among, the time series of settlement, groundwater, pumping, rainfall, and ground micro-tremor. Broadband seismic signals from the Broadband Array in Taiwan for Seismology (BATS) obtained near the Zhuoshui River (WLGB in Chia Yi, WGKB in Yun Lin and RLNB in Zhang Hua) were analyzed using HHT and empirical mode decomposition (EMD) to characterize the micro-tremor of the subsiding ground. Comparing ten years of micro-tremor, groundwater, and land-subsidence monitoring-well time series yields further information about the subsidence. Electrical resistivity tomography (ERT) was performed to correlate the resistivity profile with borehole logging data at the test area, and the relationships among resistivity, groundwater variation, and ground subsidence obtained from the test area are discussed. Active and passive multichannel analysis of surface waves (MASW) yields Poisson's ratio from the shear- and pressure-wave velocities; the groundwater level can be inferred where Poisson's ratio approaches 0.5. 
Repeated measurements thus reveal the fluctuation of groundwater stages and the variation of the ground.
NASA Astrophysics Data System (ADS)
Jurčo, Branislav; Schupp, Peter; Vysoký, Jan
2014-06-01
We generalize noncommutative gauge theory using Nambu-Poisson structures to obtain a new type of gauge theory with higher brackets and gauge fields. The approach is based on covariant coordinates and higher versions of the Seiberg-Witten map. We construct a covariant Nambu-Poisson gauge theory action, give its first order expansion in the Nambu-Poisson tensor and relate it to a Nambu-Poisson matrix model.
Experiments with a Magnetically Controlled Pendulum
ERIC Educational Resources Information Center
Kraftmakher, Yaakov
2007-01-01
A magnetically controlled pendulum is used for observing free and forced oscillations, including nonlinear oscillations and chaotic motion. A data-acquisition system stores the data and displays time series of the oscillations and related phase plane plots, Poincare maps, Fourier spectra and histograms. The decay constant of the pendulum can be…
Characterization of Transiting Exoplanets by Way of Differential Photometry
ERIC Educational Resources Information Center
Cowley, Michael; Hughes, Stephen
2014-01-01
This paper describes a simple activity for plotting and characterizing the light curve from an exoplanet transit event by way of differential photometry analysis. Using free digital imaging software, participants analyse a series of telescope images with the goal of calculating various exoplanet parameters, including size, orbital radius and…
The microcomputer scientific software series 2: general linear model--regression.
Harold M. Rauscher
1983-01-01
The general linear model regression (GLMR) program provides the microcomputer user with a sophisticated regression analysis capability. The output provides a regression ANOVA table, estimators of the regression model coefficients, their confidence intervals, confidence intervals around the predicted Y-values, residuals for plotting, a check for multicollinearity, a...
NASA Astrophysics Data System (ADS)
Czuba, Jonathan A.; Foufoula-Georgiou, Efi; Gran, Karen B.; Belmont, Patrick; Wilcock, Peter R.
2017-05-01
Understanding how sediment moves along source to sink pathways through watersheds—from hillslopes to channels and in and out of floodplains—is a fundamental problem in geomorphology. We contribute to advancing this understanding by modeling the transport and in-channel storage dynamics of bed material sediment on a river network over a 600 year time period. Specifically, we present spatiotemporal changes in bed sediment thickness along an entire river network to elucidate how river networks organize and process sediment supply. We apply our model to sand transport in the agricultural Greater Blue Earth River Basin in Minnesota. By casting the arrival of sediment to links of the network as a Poisson process, we derive analytically (under supply-limited conditions) the time-averaged probability distribution function of bed sediment thickness for each link of the river network for any spatial distribution of inputs. Under transport-limited conditions, the analytical assumptions of the Poisson arrival process are violated (due to in-channel storage dynamics) where we find large fluctuations and periodicity in the time series of bed sediment thickness. The time series of bed sediment thickness is the result of dynamics on a network in propagating, altering, and amalgamating sediment inputs in sometimes unexpected ways. One key insight gleaned from the model is that there can be a small fraction of reaches with relatively low-transport capacity within a nonequilibrium river network acting as "bottlenecks" that control sediment to downstream reaches, whereby fluctuations in bed elevation can dissociate from signals in sediment supply.
A Three-dimensional Polymer Scaffolding Material Exhibiting a Zero Poisson's Ratio.
Soman, Pranav; Fozdar, David Y; Lee, Jin Woo; Phadke, Ameya; Varghese, Shyni; Chen, Shaochen
2012-05-14
Poisson's ratio describes the degree to which a material contracts (expands) transversally when axially strained. A material with a zero Poisson's ratio does not transversally deform in response to an axial strain (stretching). In tissue engineering applications, scaffolding having a zero Poisson's ratio (ZPR) may be more suitable for emulating the behavior of native tissues and accommodating and transmitting forces to the host tissue site during wound healing (or tissue regrowth). For example, scaffolding with a zero Poisson's ratio may be beneficial in the engineering of cartilage, ligament, corneal, and brain tissues, which are known to possess Poisson's ratios of nearly zero. Here, we report a 3D biomaterial constructed from polyethylene glycol (PEG) exhibiting in-plane Poisson's ratios of zero for large values of axial strain. We use digital micro-mirror device projection printing (DMD-PP) to create single- and double-layer scaffolds composed of semi re-entrant pores whose arrangement and deformation mechanisms contribute to the zero Poisson's ratio. Strain experiments confirm the zero-Poisson's-ratio behavior of the scaffolds and show that the addition of layers does not change the Poisson's ratio. Human mesenchymal stem cells (hMSCs) cultured on biomaterials with zero Poisson's ratio demonstrate the feasibility of utilizing these novel materials for biological applications which require little to no transverse deformation resulting from axial strains. Techniques used in this work allow Poisson's ratio to be both scale-independent and independent of the choice of strut material for strains in the elastic regime, and therefore ZPR behavior can be imparted to a variety of photocurable biomaterials.
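As a minimal illustration of the definition used above, the sketch below computes Poisson's ratio from axial and transverse strains; the strain values are invented for the example, not measurements from the paper.

```python
# Poisson's ratio is the negative ratio of transverse to axial strain:
#   nu = -eps_transverse / eps_axial
# A zero-Poisson's-ratio (ZPR) scaffold keeps its width when stretched.
# The strain values below are illustrative.

def poissons_ratio(eps_axial, eps_transverse):
    return -eps_transverse / eps_axial

# Conventional material: stretching 10% axially contracts ~3% transversally
print(round(poissons_ratio(0.10, -0.03), 3))  # 0.3

# ZPR scaffold: stretching 10% axially leaves the transverse size unchanged
print(poissons_ratio(0.10, 0.0))  # 0.0
```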
From Loss of Memory to Poisson.
ERIC Educational Resources Information Center
Johnson, Bruce R.
1983-01-01
A way of presenting the Poisson process and deriving the Poisson distribution for upper-division courses in probability or mathematical statistics is presented. The main feature of the approach lies in the formulation of Poisson postulates with immediate intuitive appeal. (MNS)
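The postulates referred to above lead to the Poisson distribution as the limit of a binomial with n trials of success probability λ/n each. A quick numerical check of that limit (λ and k are illustrative choices):

```python
import math

# Classical limit behind the Poisson postulates: split an interval into n
# subintervals, each with success probability p = lam/n. As n grows, the
# binomial pmf approaches the Poisson pmf
#   P(k) = exp(-lam) * lam**k / k!

def binom_pmf(k, n, p):
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam**k / math.factorial(k)

lam = 2.0
for n in (10, 100, 10000):
    p = lam / n
    print(n, abs(binom_pmf(3, n, p) - poisson_pmf(3, lam)))
# the gap shrinks as n increases
```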
Imaging of CO{sub 2} injection during an enhanced-oil-recovery experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gritto, Roland; Daley, Thomas M.; Myer, Larry R.
2003-04-29
A series of time-lapse seismic cross-well and single-well experiments was conducted in a diatomite reservoir to monitor the injection of CO{sub 2} into a hydrofracture zone, using P- and S-wave data. During the first phase, the seismic experiments were conducted after the injection of water into the hydrofracture zone. The seismic experiments were repeated after a period of 7 months during which CO{sub 2} was injected into the hydrofractured zone. The issues to be addressed ranged from the detectability of the geologic structure in the diatomite reservoir to the detectability of CO{sub 2} within the hydrofracture. During the pre-injection experiment, the P-wave velocities exhibited relatively low values between 1700-1900 m/s, which decreased to 1600-1800 m/s during the post-injection phase (-5 percent). The analysis of the pre-injection S-wave data revealed slow S-wave velocities between 600-800 m/s, while the post-injection data revealed velocities between 500-700 m/s (-6 percent). These velocity estimates produced high Poisson ratios between 0.36 and 0.46 for this highly porous ({approx} 50 percent) material. Differencing post- and pre-injection data revealed an increase in Poisson ratio of up to 5 percent. Both velocity and Poisson ratio estimates indicate the dissolution of CO{sub 2} in the liquid phase of the reservoir accompanied by a pore-pressure increase. The results of the cross-well experiments were corroborated by single-well data and laboratory measurements on core data.
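The Poisson ratios quoted above follow from the standard elastic relation between P- and S-wave velocities. The sketch below evaluates that relation at mid-range velocities from the abstract; the specific velocity pairs are illustrative picks within the reported ranges.

```python
# Poisson's ratio from P- and S-wave velocities (standard elasticity relation):
#   nu = (Vp**2 - 2*Vs**2) / (2*(Vp**2 - Vs**2))
# The velocities below are mid-range values from the abstract (m/s).

def poisson_from_velocities(vp, vs):
    return (vp**2 - 2 * vs**2) / (2 * (vp**2 - vs**2))

pre = poisson_from_velocities(1800.0, 700.0)   # pre-injection mid-range
post = poisson_from_velocities(1700.0, 600.0)  # post-injection mid-range
print(round(pre, 3), round(post, 3))
# both fall in the 0.36-0.46 range reported for this highly porous diatomite,
# and the post-injection value is higher, consistent with the reported increase
```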
Occurrence analysis of daily rainfalls by using non-homogeneous Poissonian processes
NASA Astrophysics Data System (ADS)
Sirangelo, B.; Ferrari, E.; de Luca, D. L.
2009-09-01
In recent years several temporally homogeneous stochastic models have been applied to describe the rainfall process. In particular, stochastic analysis of daily rainfall time series may help explain the statistical features of the temporal variability of the phenomenon. Due to the evident periodicity of the physical process, these models should be applied only over short temporal intervals in which the occurrences and intensities of rainfall can be considered reliably homogeneous. To this aim, occurrences of daily rainfall can be treated as a stationary stochastic process within monthly periods. In this context point process models are widely used for at-site analysis of daily rainfall occurrence; they are continuous-time models, able to capture the intermittent nature of rainfall and simulate interstorm periods. With a different approach, the periodic features of daily rainfall can be interpreted by using a temporally non-homogeneous stochastic model whose parameters are expressed as continuous functions of time. In this case, great attention has to be paid to the parsimony of the models, as regards the number of parameters and the bias introduced into the generation of synthetic series, and to the influence of threshold values in extracting the peak storm database from recorded daily rainfall heights. In this work, a stochastic model based on a non-homogeneous Poisson process, characterized by a time-dependent intensity of rainfall occurrence, is employed to explain seasonal effects of daily rainfalls exceeding prefixed threshold values. In particular, the variation of the rainfall occurrence intensity λ(t) is modelled by using Fourier series analysis, in which the non-homogeneous process is transformed into a homogeneous unit-rate one through a proper transformation of the time domain, and the minimum number of harmonics is chosen by applying available statistical tests. 
The procedure is applied to a dataset of rain gauges located in different geographical zones of the Mediterranean area. Time series have been selected on the basis of the availability of at least 50 years of data in the period 1921-1985, chosen as the calibration period, and of all years of observation in the subsequent validation period 1986-2005, in which the variability of the daily rainfall occurrence process is under examination. Firstly, for each time series and for each fixed threshold value, parameter estimation of the non-homogeneous Poisson model is carried out for the calibration period. As a second step, in order to test the hypothesis that the daily rainfall occurrence process preserves the same behaviour in more recent time periods, the intensity distribution evaluated for the calibration period is also adopted for the validation period. Starting from this, and using a Monte Carlo approach, 1000 synthetic generations of daily rainfall occurrences, of length equal to the validation period, have been carried out, and for each simulation the sample λ(t) has been evaluated. This procedure is adopted because of the complexity of determining analytical statistical confidence limits for the sample intensity λ(t). Finally, the sample intensity, the theoretical function of the calibration period and the 95% statistical band evaluated by the Monte Carlo approach are compared, together with, for each threshold value, the mean square error (MSE) between the theoretical λ(t) and the sample one from the recorded data, and its corresponding 95% one-tailed statistical band, estimated from the MSE values between the sample λ(t) of each synthetic series and the theoretical one. The results obtained may be very useful in the context of the identification and calibration of stochastic rainfall models based on historical precipitation data. 
Further applications of the non-homogeneous Poisson model will concern the joint analyses of the storm occurrence process with the rainfall height marks, interpreted by using a temporally homogeneous model in proper sub-year intervals.
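A non-homogeneous Poisson process with a harmonic Fourier-series intensity, as described above, can be simulated by Lewis-Shedler thinning. The sketch below is a minimal illustration; the harmonic coefficients and time horizon are invented for the example, not estimated from the study's rain-gauge data.

```python
import math
import random

# Lewis-Shedler thinning sketch for a non-homogeneous Poisson process whose
# occurrence intensity is a one-harmonic Fourier series (illustrative values):
#   lam(t) = a0 + a1 * cos(2*pi*t/365), with a0 > a1 >= 0 so lam(t) > 0.

def intensity(t, a0=0.30, a1=0.15):
    return a0 + a1 * math.cos(2 * math.pi * t / 365.0)

def simulate_nhpp(t_max, lam_max, rng):
    """Thinning: draw homogeneous candidates at rate lam_max, keep each
    candidate at time t with probability lam(t)/lam_max."""
    events, t = [], 0.0
    while True:
        t += rng.expovariate(lam_max)  # next candidate arrival
        if t > t_max:
            return events
        if rng.random() < intensity(t) / lam_max:
            events.append(t)

rng = random.Random(42)
occurrences = simulate_nhpp(365.0, lam_max=0.45, rng=rng)
print(len(occurrences))  # about a0 * 365 events on average
```

lam_max must dominate the intensity everywhere (here its maximum, a0 + a1 = 0.45), otherwise the thinned process is biased.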
Nonlocal Poisson-Fermi model for ionic solvent.
Xie, Dexuan; Liu, Jinn-Liang; Eisenberg, Bob
2016-07-01
We propose a nonlocal Poisson-Fermi model for ionic solvent that includes ion size effects and polarization correlations among water molecules in the calculation of electrostatic potential. It includes the previous Poisson-Fermi models as special cases, and its solution is the convolution of a solution of the corresponding nonlocal Poisson dielectric model with a Yukawa-like kernel function. The Fermi distribution is shown to be a set of optimal ionic concentration functions in the sense of minimizing an electrostatic potential free energy. Numerical results are reported to show the difference between a Poisson-Fermi solution and a corresponding Poisson solution.
Nonlinear Poisson Equation for Heterogeneous Media
Hu, Langhua; Wei, Guo-Wei
2012-01-01
The Poisson equation is a widely accepted model for electrostatic analysis. However, the Poisson equation is derived based on electric polarizations in a linear, isotropic, and homogeneous dielectric medium. This article introduces a nonlinear Poisson equation to take into consideration hyperpolarization effects due to intensive charges and possible nonlinear, anisotropic, and heterogeneous media. The variational principle is utilized to derive the nonlinear Poisson model from an electrostatic energy functional. To apply the proposed nonlinear Poisson equation to solvation analysis, we also construct a nonpolar solvation energy functional based on the nonlinear Poisson equation by using geometric measure theory. At a fixed temperature, the proposed nonlinear Poisson theory is extensively validated by the electrostatic analysis of the Kirkwood model and a set of 20 proteins, and by the solvation analysis of a set of 17 small molecules for which experimental measurements are also available for comparison. Moreover, the nonlinear Poisson equation is further applied to the solvation analysis of 21 compounds at different temperatures. Numerical results are compared to theoretical predictions, experimental measurements, and those obtained from other theoretical methods in the literature. A good agreement between our results and experimental data as well as theoretical results suggests that the proposed nonlinear Poisson model is a potentially useful model for electrostatic analysis involving hyperpolarization effects. PMID:22947937
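For context, the baseline linear Poisson problem that the nonlinear model generalizes can be illustrated with a minimal one-dimensional finite-difference solve; the grid size and source term below are illustrative choices, not from the paper.

```python
# Minimal 1D illustration of the baseline linear Poisson problem:
#   -u''(x) = f(x) on (0,1), u(0) = u(1) = 0.
# With f = 2 the exact solution is u(x) = x*(1-x); the 3-point stencil is
# exact for quadratics, so the discrete solution matches to rounding error.

def solve_poisson_1d(f_vals, h):
    """Thomas algorithm for the tridiagonal system (-1, 2, -1)/h^2 u = f."""
    n = len(f_vals)
    b = 2.0 / h**2   # diagonal
    a = -1.0 / h**2  # off-diagonals
    # forward sweep
    c_prime = [0.0] * n
    d_prime = [0.0] * n
    c_prime[0] = a / b
    d_prime[0] = f_vals[0] / b
    for i in range(1, n):
        m = b - a * c_prime[i - 1]
        c_prime[i] = a / m
        d_prime[i] = (f_vals[i] - a * d_prime[i - 1]) / m
    # back substitution
    u = [0.0] * n
    u[-1] = d_prime[-1]
    for i in range(n - 2, -1, -1):
        u[i] = d_prime[i] - c_prime[i] * u[i + 1]
    return u

n = 99  # interior grid points
h = 1.0 / (n + 1)
u = solve_poisson_1d([2.0] * n, h)
x = [(i + 1) * h for i in range(n)]
err = max(abs(ui - xi * (1 - xi)) for ui, xi in zip(u, x))
print(err < 1e-10)  # True
```

The nonlinear model of the abstract replaces the constant dielectric coefficient with field-dependent terms, which would turn this single linear solve into an iteration of such solves.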
Ryberg, Karen R.; Vecchia, Aldo V.
2012-01-01
Hydrologic time series data and associated anomalies (multiple components of the original time series representing variability at longer-term and shorter-term time scales) are useful for modeling trends in hydrologic variables, such as streamflow, and for modeling water-quality constituents. An R package, called waterData, has been developed for importing daily hydrologic time series data from U.S. Geological Survey streamgages into the R programming environment. In addition to streamflow, data retrieval may include gage height and continuous physical property data, such as specific conductance, pH, water temperature, turbidity, and dissolved oxygen. The package allows for importing daily hydrologic data into R, plotting the data, fixing common data problems, summarizing the data, and calculating and graphically presenting anomalies.
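The anomaly decomposition the abstract describes, splitting a daily series into longer-term and shorter-term components, can be sketched as follows. waterData itself is an R package; this is a hedged Python illustration of the general idea, with window lengths chosen for illustration rather than taken from the package:

```python
import numpy as np

def decompose_anomalies(x, long_win=365, short_win=30):
    """Split a daily series into long-term, short-term, and daily
    components whose sum reconstructs the original series.
    Window lengths are illustrative, not the waterData defaults."""
    def centered_mean(v, w):
        kernel = np.ones(w) / w
        # pad with edge values so the rolling mean is defined everywhere
        padded = np.pad(v, (w // 2, w - 1 - w // 2), mode="edge")
        return np.convolve(padded, kernel, mode="valid")
    long_term = centered_mean(x, long_win)                  # multi-year trend
    short_term = centered_mean(x, short_win) - long_term    # seasonal-scale departure
    daily = x - long_term - short_term                      # day-to-day residual
    return long_term, short_term, daily
```

By construction the three components sum exactly back to the original series, which is the property that makes anomaly-based trend models attractive.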
A seasonal Bartlett-Lewis Rectangular Pulse model
NASA Astrophysics Data System (ADS)
Ritschel, Christoph; Agbéko Kpogo-Nuwoklo, Komlan; Rust, Henning; Ulbrich, Uwe; Névir, Peter
2016-04-01
Precipitation time series with a high temporal resolution are needed as input for several hydrological applications, e.g. river runoff or sewer system models. As adequate observational data sets are often not available, simulated precipitation series are used instead. Poisson-cluster models are commonly applied to generate these series, and this class of stochastic precipitation models has been shown to reproduce important characteristics of observed rainfall well. For the gauge-based case study presented here, the Bartlett-Lewis rectangular pulse model (BLRPM) has been chosen. Because certain model parameters vary with season in a midlatitude moderate climate, due to the different rainfall mechanisms dominating in winter and summer, model parameters are typically estimated separately for individual seasons or individual months. Here, we suggest a simultaneous parameter estimation for the whole year under the assumption that the seasonal variation of parameters can be described with harmonic functions. We use an observational precipitation series from Berlin with a high temporal resolution to exemplify the approach. We estimate BLRPM parameters with and without this seasonal extension and compare the results in terms of model performance and robustness of the estimation.
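The harmonic description of seasonally varying parameters can be illustrated with a short sketch. The function below is a generic first-order harmonic in Python; the parameter names are illustrative and not those estimated in the study:

```python
import numpy as np

def seasonal_param(day_of_year, mean, amp_cos, amp_sin):
    """First-order harmonic describing the annual cycle of one model
    parameter; higher-order harmonics would add further cos/sin pairs.
    Illustrative only -- not the BLRPM parameterization of the study."""
    phase = 2.0 * np.pi * day_of_year / 365.25
    return mean + amp_cos * np.cos(phase) + amp_sin * np.sin(phase)
```

Estimating the annual mean plus two amplitudes per parameter replaces twelve independent monthly estimates, which is the source of the robustness gain the abstract mentions.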
Emerald Ash Borer Threat Reveals Ecohydrologic Feedbacks in Northern U.S. Black Ash Wetlands
NASA Astrophysics Data System (ADS)
Diamond, J.; Mclaughlin, D. L.; Slesak, R.
2016-12-01
Hydrology is a primary driver of wetland structure and process that can be modified by abiotic and biotic feedbacks, leading to self-organization of wetland systems. Large-scale disturbance to these feedbacks, such as loss of vegetation, can thus be expected to impact wetland hydrology. The Emerald Ash Borer is an invasive beetle that is expected to cause widespread loss of ash trees throughout the northern U.S. and Canada. To predict ecosystem response to this threat of vegetation loss, we ask if and how Black Ash (Fraxinus nigra), a ubiquitous facultative-wetland ash species, actively controls wetland hydrology, i.e., whether Black Ash creates favorable hydrologic regimes for its own growth (evidence for ecohydrologic feedbacks). We do this by taking advantage of plot-level tree removal experiments in Black Ash-dominated (75-100% basal area) wetlands in the Chippewa National Forest, Minnesota. The monospecies dominance in these systems minimizes variation associated with species-specific effects, allowing for clearer interpretation of results regarding ecohydrologic feedbacks. Here, we present an analysis of six years of water table and soil moisture time series in experimental plots with the following treatments: 1) clear cut, 2) girdling, 3) group-selection thinning, and 4) control. We also present evapotranspiration (ET) time series estimates for each experimental plot using analysis of diel water level variation. Results show elevated water tables in treatment plots relative to control plots for all treatments for several years after treatments were applied, with differences as great as 50 cm. Some recovery of water table to pre-treatment levels was observed over time, but only the group-selection thinning treatment showed near-complete recovery, and clear-cut treatments indicate sustained elevated water tables over five years. Differences among treatments are directly attributed to variably reduced ET relative to controls.
Results also indicate changes to the ET vs. water table relationship among treatments, with implications for ET feedbacks to favorable hydrologic regimes for growth. Finally, we present a conceptual model for these ecosystems and discuss how the model will be used to explore ecohydrologic feedbacks in upcoming years.
Saint-Venant end effects for materials with negative Poisson's ratios
NASA Technical Reports Server (NTRS)
Lakes, R. S.
1992-01-01
Results are presented from an analysis of Saint-Venant end effects for materials with negative Poisson's ratio. Examples are presented showing that slow decay of end stress occurs in circular cylinders of negative Poisson's ratio, whereas a sandwich panel containing rigid face sheets and a compliant core exhibits no anomalous effects for negative Poisson's ratio (but exhibits slow stress decay for core Poisson's ratios approaching 0.5). In sandwich panels with stiff but not perfectly rigid face sheets, a negative Poisson's ratio results in end stress decay, which is faster than it would be otherwise. It is suggested that the slow decay previously predicted for sandwich strips in plane deformation as a result of the geometry can be mitigated by the use of a negative Poisson's ratio material for the core.
Poisson's ratio of fiber-reinforced composites
NASA Astrophysics Data System (ADS)
Christiansson, Henrik; Helsing, Johan
1996-05-01
Poisson's ratio flow diagrams, that is, the Poisson's ratio versus the fiber fraction, are obtained numerically for hexagonal arrays of elastic circular fibers in an elastic matrix. High numerical accuracy is achieved through the use of an interface integral equation method. Questions concerning fixed point theorems and the validity of existing asymptotic relations are investigated and partially resolved. Our findings for the transverse effective Poisson's ratio, together with earlier results for random systems by other authors, make it possible to formulate a general statement for Poisson's ratio flow diagrams: For composites with circular fibers and where the phase Poisson's ratios are equal to 1/3, the system with the lowest stiffness ratio has the highest Poisson's ratio. For other choices of the elastic moduli for the phases, no simple statement can be made.
Chemistry of Stream Sediments and Surface Waters in New England
Robinson, Gilpin R.; Kapo, Katherine E.; Grossman, Jeffrey N.
2004-01-01
Summary -- This online publication portrays regional data for pH, alkalinity, and specific conductance for stream waters and a multi-element geochemical dataset for stream sediments collected in the New England states of Connecticut, Maine, Massachusetts, New Hampshire, Rhode Island, and Vermont. A series of interpolation grid maps portray the chemistry of the stream waters and sediments in relation to bedrock geology, lithology, drainage basins, and urban areas. A series of box plots portray the statistical variation of the chemical data grouped by lithology and other features.
Determination of complex electromechanical coefficients for piezoelectric materials
NASA Astrophysics Data System (ADS)
Du, Xiao-Hong
Sugar maple decline, a result of many possible biotic and abiotic causes, has been a problem in northern Pennsylvania since the early 1980s. Several studies have focused on specific causes, yet few have tried to look at a wide array. The purpose of this research was to investigate stresses in sugar maple forest plots in northern Pennsylvania. Three studies were undertaken. The first study examined the spatial extent of sugar maple on 248 plots in Bailey's ecoregions 212F and 212G, which are glaciated and unglaciated regions, respectively. In addition, a health assessment of sugar maple in Pennsylvania was made, with a resulting separation in population between healthy and unhealthy stands occurring at 20 percent dead sugar maple basal area. The second study was conducted to evaluate a statistical sampling design of 28 forested plots, drawn from the first study's population of 248 plots, and to provide data on physical and chemical soil variability and sample size estimation for other researchers. The variability of several soil parameters was examined within plots and between health classes of sugar maple, and sample size estimates were derived for these populations. The effect of log-normal transformations on reducing variability and sample sizes was examined, and soil descriptions of the plots sampled in 1998 were compared to the USDA Soil Survey mapping unit series descriptions for the plot locations. Lastly, the effect of sampling intensity on the detection of significant differences between health class treatments was examined. The last study addressed sugar maple decline in northern Pennsylvania during the same period as the first study (approximately 1979-1989), using the same 28 plots as the second study on soil variability.
This paper investigates stresses in sugar maple plots related to moisture and how these interact with other stresses such as chemistry, insect defoliation, geology, aspect, slope, topography, and atmospheric deposition.
Evaluation of Shiryaev-Roberts Procedure for On-line Environmental Radiation Monitoring
NASA Astrophysics Data System (ADS)
Watson, Mara Mae
An on-line radiation monitoring system that simultaneously concentrates and detects radioactivity is needed to detect an accidental leakage from a nuclear waste disposal facility or clandestine nuclear activity. Previous studies have shown that classical control chart methods can be applied to on-line radiation monitoring data to quickly detect these events as they occur; however, Bayesian control chart methods were not included in these studies. This work will evaluate the performance of a Bayesian control chart method, the Shiryaev-Roberts (SR) procedure, compared to classical control chart methods, Shewhart 3-sigma and cumulative sum (CUSUM), for use in on-line radiation monitoring of 99Tc in water using extractive scintillating resin. Measurements were collected by pumping solutions containing 0.1-5 Bq/L of 99Tc, as 99TcO4-, through a flow cell packed with extractive scintillating resin coupled to a Beta-RAM Model 5 HPLC detector. While 99TcO4- accumulated on the resin, simultaneous measurements were acquired in 10-s intervals and then re-binned to 100-s intervals. The Bayesian statistical method, the Shiryaev-Roberts procedure, and the classical control chart methods, Shewhart 3-sigma and cumulative sum (CUSUM), were applied to the data using statistical algorithms developed in MATLAB®. Two SR control charts were constructed using Poisson distributions and Gaussian distributions to estimate the likelihood ratio, and are referred to as Poisson SR and Gaussian SR to indicate the distribution used to calculate the statistic. The Poisson and Gaussian SR methods required as little as 28.9 mL less solution at 5 Bq/L and as much as 170 mL less solution at 0.5 Bq/L to exceed the control limit than the Shewhart 3-sigma method. The Poisson SR method needed as little as 6.20 mL less solution at 5 Bq/L and up to 125 mL less solution at 0.5 Bq/L to exceed the control limit than the CUSUM method.
The Gaussian SR and CUSUM methods required comparable solution volumes for test solutions containing at least 1.5 Bq/L of 99Tc. For activity concentrations less than 1.5 Bq/L, the Gaussian SR method required as much as 40.8 mL less solution at 0.5 Bq/L to exceed the control limit than the CUSUM method. Both SR methods were able to consistently detect test solutions containing 0.1 Bq/L, unlike the Shewhart 3-sigma and CUSUM methods. Although the Poisson SR method required as much as 178 mL less solution to exceed the control limit than the Gaussian SR method, the Gaussian SR false positive rate of 0% was much lower than the Poisson SR false positive rate of 1.14%. A lower false positive rate made it easier to differentiate between a false positive and an increase in mean count rate caused by activity accumulating on the resin. The SR procedure is thus the ideal tool for low-level on-line radiation monitoring using extractive scintillating resin, because in most cases it needed less volume to detect an upward shift in the mean count rate than the Shewhart 3-sigma and CUSUM methods, and it consistently detected lower activity concentrations. The desired results for the monitoring scheme, however, need to be considered prior to choosing between the Poisson and Gaussian distributions to estimate the likelihood ratio, because each was advantageous under different circumstances. Once the control limit was exceeded, activity concentrations were estimated from the SR control chart using the slope of the control chart on a semi-logarithmic plot. Five of nine test solutions for the Poisson SR control chart produced concentration estimates within 30% of the actual value, but the worst case differed from the actual value by 263.2%. The estimates for the Gaussian SR control chart were much more precise, with six of eight solutions producing estimates within 30%.
Although the activity concentration estimates were only mediocre for the Poisson SR control chart and satisfactory for the Gaussian SR control chart, these results demonstrate that a relationship exists between activity concentration and the SR control chart magnitude that can be exploited to determine the activity concentration from the SR control chart. More complex methods should be investigated to improve activity concentration estimates from the SR control charts.
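The recursion behind these SR charts is compact. The Python sketch below applies the standard Shiryaev-Roberts statistic with a Poisson likelihood ratio; the rates, counts, and threshold are illustrative values, not those calibrated in the thesis:

```python
import numpy as np

def shiryaev_roberts_poisson(counts, lam0, lam1, threshold):
    """Poisson Shiryaev-Roberts chart: R_n = (1 + R_{n-1}) * LR_n, where
    LR_n is the likelihood ratio of the post-change rate lam1 to the
    in-control rate lam0 for observation n.  An alarm is raised at the
    first n with R_n >= threshold; the threshold fixes the false-alarm
    rate.  Illustrative rates/threshold, not the thesis calibration."""
    r = 0.0
    for n, x in enumerate(counts):
        lr = np.exp(-(lam1 - lam0)) * (lam1 / lam0) ** x
        r = (1.0 + r) * lr
        if r >= threshold:
            return n  # index of the first alarm
    return None  # no alarm raised
```

Because R_n grows roughly geometrically once activity accumulates, R_n is approximately linear on a semi-logarithmic plot, which is the slope relationship the abstract exploits to estimate activity concentration.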
Characterization of Nonhomogeneous Poisson Processes Via Moment Conditions.
1986-08-01
Poisson processes play an important role in many fields. The Poisson process is one of the simplest counting processes and is a building block for...place of independent increments. This provides a somewhat different viewpoint for examining Poisson processes. In addition, new characterizations for
Poisson Mixture Regression Models for Heart Disease Prediction.
Mufudza, Chipo; Erol, Hamza
2016-01-01
Early heart disease control can be achieved by high disease prediction and diagnosis efficiency. This paper focuses on the use of model-based clustering techniques to predict and diagnose heart disease via Poisson mixture regression models. Analysis and application of Poisson mixture regression models is addressed here under two different classes: standard and concomitant variable mixture regression models. Results show that a two-component concomitant variable Poisson mixture regression model predicts heart disease better than both the standard Poisson mixture regression model and the ordinary generalized linear Poisson regression model due to its low Bayesian Information Criterion value. Furthermore, a zero-inflated Poisson mixture regression model turned out to be the best model for heart disease prediction over all models, as it both clusters individuals into high- or low-risk categories and predicts the rate of heart disease component-wise for the available clusters. It is deduced that heart disease prediction can be done effectively by identifying the major risks component-wise using Poisson mixture regression models.
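The clustering step behind such models can be illustrated with a minimal EM fit of a two-component Poisson mixture. This Python sketch omits the regression part (the paper's models make each component rate a log-linear function of covariates) and is an illustration of the mixture idea, not the authors' procedure:

```python
import numpy as np

def poisson_mixture_em(x, n_iter=200):
    """Minimal EM for a two-component Poisson mixture without covariates.
    A sketch of the clustering idea only; the paper's regression models
    additionally tie each component rate to covariates."""
    x = np.asarray(x, dtype=float)
    lam = np.array([0.5 * x.mean(), 1.5 * x.mean()])  # crude initial rates
    pi = np.array([0.5, 0.5])                          # mixing weights
    for _ in range(n_iter):
        # E-step: responsibilities from Poisson log-likelihoods
        # (the log(x!) term is common to both components and cancels)
        logp = np.log(pi) + x[:, None] * np.log(lam) - lam
        logp -= logp.max(axis=1, keepdims=True)
        resp = np.exp(logp)
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights and component rates
        nk = resp.sum(axis=0)
        pi = nk / len(x)
        lam = resp.T @ x / nk
    return pi, lam
```

The responsibilities play the same role as the high/low-risk cluster assignments described in the abstract.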
Monitoring Poisson observations using combined applications of Shewhart and EWMA charts
NASA Astrophysics Data System (ADS)
Abujiya, Mu'azu Ramat
2017-11-01
The Shewhart and exponentially weighted moving average (EWMA) charts for nonconformities are the most widely used procedures of choice for monitoring Poisson observations in modern industries. Individually, the Shewhart and EWMA charts are sensitive only to large and small shifts, respectively. To enhance the detection abilities of the two schemes in monitoring all kinds of shifts in Poisson count data, this study examines the performance of combined applications of the Shewhart and EWMA Poisson control charts. Furthermore, the study proposes modifications based on a well-structured statistical data collection technique, ranked set sampling (RSS), to detect shifts in the mean of a Poisson process more quickly. The relative performance of the proposed Shewhart-EWMA Poisson location charts is evaluated in terms of the average run length (ARL), standard deviation of the run length (SDRL), median run length (MRL), average ratio ARL (ARARL), average extra quadratic loss (AEQL) and performance comparison index (PCI). The new Poisson control charts based on the RSS method are generally superior to most of the existing schemes for monitoring Poisson processes. The use of these combined Shewhart-EWMA Poisson charts is illustrated with an example to demonstrate the practical implementation of the design procedure.
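The combined scheme can be sketched directly from the standard chart definitions: signal when either the Shewhart c-chart or the Poisson EWMA chart exceeds its limit. The limit constants below are typical textbook values, not those tuned in the study, and the RSS modification is omitted:

```python
import numpy as np

def combined_shewhart_ewma(counts, mu0, weight=0.2, l_ewma=2.7, l_shewhart=3.0):
    """Combined Poisson monitoring: alarm when either the Shewhart c-chart
    (catches large shifts) or the EWMA chart (catches small shifts) exceeds
    its control limit.  Constants are illustrative textbook choices."""
    sigma0 = np.sqrt(mu0)                                # Poisson sd at the in-control mean
    ewma_sigma = sigma0 * np.sqrt(weight / (2.0 - weight))  # asymptotic EWMA sd
    z = mu0  # EWMA statistic starts at the in-control mean
    for n, x in enumerate(counts):
        z = weight * x + (1.0 - weight) * z
        shewhart_flag = abs(x - mu0) > l_shewhart * sigma0
        ewma_flag = abs(z - mu0) > l_ewma * ewma_sigma
        if shewhart_flag or ewma_flag:
            return n  # index of the first alarm
    return None
```

In use, a large jump trips the Shewhart test immediately, while a sustained small shift is accumulated by the EWMA statistic until it drifts past its tighter limit.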
Comment on: 'A Poisson resampling method for simulating reduced counts in nuclear medicine images'.
de Nijs, Robin
2015-07-21
In order to be able to calculate half-count images from already acquired data, White and Lawson published their method based on Poisson resampling. They verified their method experimentally by measurements with a Co-57 flood source. In this comment their results are reproduced and confirmed by a direct numerical simulation in Matlab. Not only Poisson resampling, but also two direct redrawing methods were investigated. Redrawing methods were based on a Poisson and a Gaussian distribution. Mean, standard deviation, skewness and excess kurtosis half-count/full-count ratios were determined for all methods, and compared to the theoretical values for a Poisson distribution. Statistical parameters showed the same behavior as in the original note and showed the superiority of the Poisson resampling method. Rounding off before saving of the half count image had a severe impact on counting statistics for counts below 100. Only Poisson resampling was not affected by this, while Gaussian redrawing was less affected by it than Poisson redrawing. Poisson resampling is the method of choice, when simulating half-count (or less) images from full-count images. It simulates correctly the statistical properties, also in the case of rounding off of the images.
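Poisson resampling of count images can be sketched as binomial thinning: each recorded count is kept independently with probability 1/2, which preserves Poisson statistics at the reduced mean. A hedged NumPy sketch of the idea, not the authors' exact implementation:

```python
import numpy as np

def half_count(image, fraction=0.5, rng=None):
    """Binomial thinning of an integer count image: each recorded count is
    kept independently with probability `fraction`.  Thinning a Poisson
    variable yields a Poisson variable with the reduced mean, so counting
    statistics are preserved -- a sketch of the resampling idea only."""
    rng = np.random.default_rng() if rng is None else rng
    return rng.binomial(np.asarray(image, dtype=np.int64), fraction)
```

Because the output is integer-valued by construction, no rounding step is needed, which is the property that made Poisson resampling robust to the rounding artifact discussed in the comment.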
VTGRAPH - GRAPHIC SOFTWARE TOOL FOR VT TERMINALS
NASA Technical Reports Server (NTRS)
Wang, C.
1994-01-01
VTGRAPH is a graphics software tool for DEC/VT or VT compatible terminals which are widely used by government and industry. It is a FORTRAN or C-language callable library designed to allow the user to deal with many computer environments which use VT terminals for window management and graphic systems. It also provides a PLOT10-like package plus color or shade capability for VT240, VT241, and VT300 terminals. The program is transportable to many different computers which use VT terminals. With this graphics package, the user can easily design more friendly user interface programs and design PLOT10 programs on VT terminals with different computer systems. VTGRAPH was developed using the ReGis Graphics set which provides a full range of graphics capabilities. The basic VTGRAPH capabilities are as follows: window management, PLOT10 compatible drawing, generic program routines for two and three dimensional plotting, and color graphics or shaded graphics capability. The program was developed in VAX FORTRAN in 1988. VTGRAPH requires a ReGis graphics set terminal and a FORTRAN compiler. The program has been run on a DEC MicroVAX 3600 series computer operating under VMS 5.0, and has a virtual memory requirement of 5KB.
Fujiyama, Toshifumi; Matsui, Chihiro; Takemura, Akimichi
2016-01-01
We propose a power-law growth and decay model for posting data to social networking services before and after social events. We model the time series structure of deviations from the power-law growth and decay with a conditional Poisson autoregressive (AR) model. Online postings related to social events are described by five parameters in the power-law growth and decay model, each of which characterizes different aspects of interest in the event. We assess the validity of parameter estimates in terms of confidence intervals, and compare various submodels based on likelihoods and information criteria.
NASA Astrophysics Data System (ADS)
Wang, Fengwen
2018-05-01
This paper presents a systematic approach for designing 3D auxetic lattice materials, which exhibit constant negative Poisson's ratios over large strain intervals. A unit cell model mimicking tensile tests is established and based on the proposed model, the secant Poisson's ratio is defined as the negative ratio between the lateral and the longitudinal engineering strains. The optimization problem for designing a material unit cell with a target Poisson's ratio is formulated to minimize the average lateral engineering stresses under the prescribed deformations. Numerical results demonstrate that 3D auxetic lattice materials with constant Poisson's ratios can be achieved by the proposed optimization formulation and that two sets of material architectures are obtained by imposing different symmetry on the unit cell. Moreover, inspired by the topology-optimized material architecture, a subsequent shape optimization is proposed by parametrizing material architectures using super-ellipsoids. By designing two geometrical parameters, simple optimized material microstructures with different target Poisson's ratios are obtained. By interpolating these two parameters as polynomial functions of Poisson's ratios, material architectures for any Poisson's ratio in the interval of ν ∈ [−0.78, 0.00] are explicitly presented. Numerical evaluations show that interpolated auxetic lattice materials exhibit constant Poisson's ratios in the target strain interval of [0.00, 0.20] and that 3D auxetic lattice material architectures with programmable Poisson's ratio are achievable.
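The secant Poisson's ratio defined above reduces to a one-line function of the engineering strains; the sample strain values below are illustrative, not taken from the paper:

```python
def secant_poissons_ratio(lateral_strain, longitudinal_strain):
    """Secant Poisson's ratio as defined in the paper: the negative ratio
    of lateral to longitudinal engineering strain at a finite deformation
    (rather than the infinitesimal, tangent definition)."""
    return -lateral_strain / longitudinal_strain
```

A material that expands laterally under tension (positive lateral strain) thus has a negative secant ratio, the auxetic case targeted by the optimization.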
Evaluating Curriculum-Based Measurement from a Behavioral Assessment Perspective
ERIC Educational Resources Information Center
Ardoin, Scott P.; Roof, Claire M.; Klubnick, Cynthia; Carfolite, Jessica
2008-01-01
Curriculum-based measurement Reading (CBM-R) is an assessment procedure used to evaluate students' relative performance compared to peers and to evaluate their growth in reading. Within the response to intervention (RtI) model, CBM-R data are plotted in time series fashion as a means of modeling individual students' response to varying levels of…
User's guide for Northeast Stand Exam Program (NEST Version 2.1).
Thomas M. Schuler; Brian T. Simpson
1991-01-01
Explains the Northeast Stand Exam (NEST Version 2.1) program. The NEST program was designed for use on the Polycorder 600 Series electronic portable data recorder to record data collected from the standard permanent plot as described by the Stand Culture and Stand Establishment Working Groups of the Northeastern Forest Experiment Station.
ERIC Educational Resources Information Center
Richie, Donald S., Ed.
This Film Focus series is a collection of reviews, essays, and commentaries on the Japanese film Rashomon. The plot consists of an attack, a rape, and a robbery, all of which probably occurred during the Middle Ages. Each character relates his own version of what happened, or might have happened, revealing the outward and inner driving forces,…
Too Soon for Sex? Title No. 501.
ERIC Educational Resources Information Center
DeVault, Christine
This story, one of three in The Sexuality Decision-Making Series for Teens, contains several different possible plot lines depending on the decisions the reader makes for the characters at various "choice points." The story focuses on the theme of abstinence: Serena and James are in love and need to face the realities of their sexual…
The Butterfly Effect for Physics Laboratories
ERIC Educational Resources Information Center
Claycomb, James R.; Valentine, John H.
2015-01-01
A low-cost chaos dynamics lab is developed for quantitative demonstration of the butterfly effect using a magnetic pendulum. Chaotic motion is explored by recording magnetic time series. Students analyze the data in Excel® to investigate the butterfly effect as well as the reconstruction of the strange attractor using time delay plots. The lab…
Diester Molecules for Organic-Based Electrical and Photoelectrical Devices
NASA Astrophysics Data System (ADS)
Topal, Giray; Tombak, Ahmet; Yigitalp, Esref; Batibay, Derya; Kilicoglu, Tahsin; Ocak, Yusuf Selim
2017-07-01
Diester derivatives of terephthalic acid molecules were synthesized according to the literature. Au/diester derivative/n-Si organic-inorganic (OI) heterojunction-type devices were fabricated, and the current-voltage (I-V) characteristics of the devices were investigated at room temperature. The I-V characteristics demonstrated that all diodes had excellent rectification properties. Primary diode parameters such as series resistance and barrier height were extracted using semi-log I-V plots and Norde's method, and were compared. There was substantial agreement between the results obtained from the two methods: the calculated barrier height values agreed to within 0.02 eV, a difference attributed to the series resistance. Ideality factors, which indicate how close a diode is to an ideal diode, were also extracted from the semi-log I-V plots. Thus, the modification of the Au/n-Si diode potential barrier was accomplished using diester derivatives as an interlayer. The I-V measurements were repeated to characterize the devices at 100 mW/cm2 illumination intensity with the help of a solar simulator with an AM1.5G filter.
17O NMR studies on 4- and 4'-substituted chalcones and p-substituted β-nitrostyrenes
NASA Astrophysics Data System (ADS)
Boykin, D. W.; Baumstark, A. L.; Balakrishnan, P.; Perjéssy, A.; Hrnčiar, P.
The 17O NMR chemical shift data for 17O-enriched 4- and 4'-chalcones in toluene at 90°C and for p-substituted β-nitrostyrenes (natural abundance) in acetonitrile at 70°C are reported. The SCS (substituent chemical shift) range for the 4-chalcones from p-CH3O to p-NO2 is 16.3 ppm; the range for the 4'-chalcones from p-CH3O to p-NO2 is 32.4 ppm. The SCS range for the p-substituted β-nitrostyrenes from p-CH3O to p-NO2 is 13.2 ppm. The data for the three series gave good correlations with σ+ constants, while the Dual Substituent Parameter treatment only slightly improved the correlations using σR+ constants. Plots of the 17O chemical shifts for both 4- and 4'-chalcones against 17O data for acetophenones, and correlation of the 17O chemical shift data for the β-nitrostyrenes with that of nitrobenzenes, gave good correlations. Plots of the 17O data for all three series against their respective functional group stretching frequencies gave fair correlations.
A CAD System for Hemorrhagic Stroke.
Nowinski, Wieslaw L; Qian, Guoyu; Hanley, Daniel F
2014-09-01
Computer-aided detection/diagnosis (CAD) is a key component of routine clinical practice, increasingly used for detection, interpretation, quantification and decision support. Despite a critical need, there is no clinically accepted CAD system for stroke yet. Here we introduce a CAD system for hemorrhagic stroke. This CAD system segments, quantifies, and displays hematoma in 2D/3D, and supports evacuation of hemorrhage by thrombolytic treatment, monitoring progression and quantifying clot removal. It supports a seven-step workflow: select patient, add a new study, process the patient's scans, show segmentation results, plot hematoma volumes, show 3D synchronized time-series hematomas, and generate a report. The system architecture contains four components: library, tools, application with user interface, and hematoma segmentation algorithm. The tools include a contour editor, 3D surface modeler, 3D volume measure, histogramming, hematoma volume plot, and 3D synchronized time-series hematoma display. The CAD system has been designed and implemented in C++. It has also been employed in the CLEAR and MISTIE phase-III multicenter clinical trials. This stroke CAD system is potentially useful in research and clinical applications, particularly for clinical trials.
GASPLOT - A computer graphics program that draws a variety of thermophysical property charts
NASA Technical Reports Server (NTRS)
Trivisonno, R. J.; Hendricks, R. C.
1977-01-01
A FORTRAN V computer program, written for the UNIVAC 1100 series, is used to draw a variety of precision thermophysical property charts on the Calcomp plotter. In addition to the program (GASPLOT), which requires (15160)₁₀ storage locations, a thermophysical properties routine is needed to produce plots. The program is designed so that any two of the state variables, the derived variables, or the transport variables may be plotted as the ordinate-abscissa pair, with as many as five parametric variables. The parameters may be temperature, pressure, density, enthalpy, and entropy. Each parameter may have as many as 49 values, and the range of the variables is limited only by the thermophysical properties routine.
NASA Astrophysics Data System (ADS)
Treuhaft, R. N.; Baccini, A.; Goncalves, F. G.; Lei, Y.; Keller, M.; Walker, W. S.
2017-12-01
Tropical forests account for about 50% of the world's forested biomass, and play a critical role in the control of atmospheric carbon dioxide. Large-scale (1000s of km) changes in forest structure and biomass bear on global carbon source-sink dynamics, while small-scale (< 100 m) changes bear on deforestation and degradation monitoring. After describing the interferometric SAR (InSAR) phase-height observation, we show forest phase-height time series from the TanDEM-X radar interferometer at X-band (3 cm), taken with monthly and sub-hectare temporal and spatial resolution, respectively. The measurements were taken with more than 30 TanDEM-X passes over Tapajós National Forest in the Brazilian Amazon between 2011 and 2014. The transformation of phase-height rates into aboveground biomass (AGB) rates is based on the idea that the change in AGB due to a change in phase-height depends on the plot's AGB. Plots with higher AGB will produce more AGB for a given increase in height or phase-height. Postulating a power-law dependence of plot-level mass density on physical height, we previously found that the best conversion factors for transforming phase-height rate to AGB rate were indeed dependent on AGB. For 78 plots, we demonstrated AGB rates from InSAR phase-height rates using AGB from field measurements. For regional modeling of the Amazon Basin, field measurements of AGB, to specify the conversion factors, are impractical. Conversion factors from InSAR phase-height rate to AGB rate in this talk will be based on AGB derived from the Moderate Resolution Imaging Spectroradiometer (MODIS). AGB measurement from MODIS is based on the spectral reflectance of 7 bands from the visible to short wave infrared, and auxiliary metrics describing the variance in reflectance. The mapping of MODIS reflectance to AGB is enabled by training a machine learning algorithm with lidar-derived AGB data, which are in turn calibrated against field measurements over small areas.
The performance of TanDEM-X AGB rate from MODIS-derived conversion factors will be compared to that derived from field-based conversion factors. We will also attempt to improve phase-height rate to AGB rate transformation by deriving improved models of mass density dependences on height, based on the aggregation of single-stem allometrics.
Yamaura, Yuichi; Connor, Edward F; Royle, J Andrew; Itoh, Katsuo; Sato, Kiyoshi; Taki, Hisatomo; Mishima, Yoshio
2016-07-01
Models and data used to describe species-area relationships confound sampling with ecological process as they fail to acknowledge that estimates of species richness arise due to sampling. This compromises our ability to make ecological inferences from and about species-area relationships. We develop and illustrate hierarchical community models of abundance and frequency to estimate species richness. The models we propose separate sampling from ecological processes by explicitly accounting for the fact that sampled patches are seldom completely covered by sampling plots and that individuals present in the sampling plots are imperfectly detected. We propose a multispecies abundance model in which community assembly is treated as the summation of an ensemble of species-level Poisson processes and estimate patch-level species richness as a derived parameter. We use sampling process models appropriate for specific survey methods. We propose a multispecies frequency model that treats the number of plots in which a species occurs as a binomial process. We illustrate these models using data collected in surveys of early-successional bird species and plants in young forest plantation patches. Results indicate that only mature forest plant species deviated from the constant density hypothesis, but the null model suggested that the deviations were too small to alter the form of species-area relationships. Nevertheless, results from simulations clearly show that the aggregate pattern of individual species density-area relationships and occurrence probability-area relationships can alter the form of species-area relationships. The plant community model estimated that only half of the species present in the regional species pool were encountered during the survey. The modeling framework we propose explicitly accounts for sampling processes so that ecological processes can be examined free of sampling artefacts. 
Our modeling approach is extensible and could be applied to a variety of study designs and allows the inclusion of additional environmental covariates.
NASA Astrophysics Data System (ADS)
Healey, S. P.; Zhao, F. R.; McCarter, J. B.; Frescino, T.; Goeking, S.
2017-12-01
International reporting of American forest carbon trends depends upon the Forest Service's nationally consistent network of inventory plots. Plots are measured on a rolling basis over a 5- to 10-year cycle, so estimates related to any variable, including carbon storage, reflect conditions over a 5- to 10-year window. This makes it difficult to identify the carbon impact of discrete events (e.g., a bad fire year; extraction rates related to home-building trends), particularly if the events are recent. We report an approach to make inventory estimates more sensitive to discrete and recent events. We use a growth model (the Forest Vegetation Simulator - FVS) that is maintained by the Forest Service to annually update the tree list for every plot, allowing all plots to contribute to a series of single-year estimates. Satellite imagery from the Landsat platform guides the FVS simulations by providing information about which plots have been disturbed, which are recovering from disturbance, and which are undergoing undisturbed growth. The FVS model is only used to "update" plot tree lists until the next field measurement is made (maximum of 9 years). As a result, predicted changes are usually small and error rates are low. We present a pilot study of this system in Idaho, which has experienced several major fire events in the last decade. Empirical estimates of uncertainty, accounting for both plot sampling error and FVS model error, suggest that this approach greatly increases temporal specificity and sensitivity to discrete events without sacrificing much estimate precision at the level of a US state. This approach has the potential to take better advantage of the Forest Service's rolling plot measurement schedule to report carbon storage in the US, and it offers the basis of a system that might allow near-term, forward-looking analysis of the effects of hypothetical forest disturbance patterns.
Chen, Vincent Chin-Hung; Stewart, Robert; Lee, Charles Tzu-Chi
2012-07-01
To investigate the association between weekly lottery sales and the number of suicide deaths in Taiwan. All suicides aged 15+ years during 2004-2006 in Taiwan were included. Poisson autoregression time series models were used to investigate associations of weekly suicide numbers with contemporaneous and recent sales from the two national lotteries in operation. Adjustments were made for seasonal fluctuation, temperature, monthly unemployment and autocorrelation. In fully adjusted models, suicide deaths were negatively correlated with sales of tickets for a low-prize, low-cost lottery system. However, they were correlated positively with recent sales for a higher-cost, larger-prize system. Both correlations were stronger for male than female suicide numbers but differed in terms of the age groups most strongly implicated. Associations between lottery sales and suicide numbers differed according to the nature of the lottery. A low-prize, low-publicity system appeared to be more benign than a high-prize, high-publicity one.
Gravitational instability of slowly rotating isothermal spheres
NASA Astrophysics Data System (ADS)
Chavanis, P. H.
2002-12-01
We discuss the statistical mechanics of rotating self-gravitating systems by allowing properly for the conservation of angular momentum. We study analytically the case of slowly rotating isothermal spheres by expanding the solutions of the Boltzmann-Poisson equation in a series of Legendre polynomials, adapting the procedure introduced by Chandrasekhar (1933) for distorted polytropes. We show how the classical spiral of Lynden-Bell & Wood (1967) in the temperature-energy plane is deformed by rotation. We find that gravitational instability occurs sooner in the microcanonical ensemble and later in the canonical ensemble. According to standard turning point arguments, the onset of the collapse coincides with the minimum energy or minimum temperature state in the series of equilibria. Interestingly, it happens to be close to the point of maximum flattening. We generalize the singular isothermal solution to the case of a slowly rotating configuration. We also consider slowly rotating configurations of the self-gravitating Fermi gas at non-zero temperature.
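For context, the slowly rotating isothermal configuration can be sketched in a standard form (our notation, following the Chandrasekhar-style distorted-polytrope expansion named above; not reproduced from the paper itself):

```latex
% Boltzmann-Poisson equation for a self-gravitating isothermal gas,
% with centrifugal distortion at slow rotation rate \Omega:
\nabla^{2}\phi = 4\pi G \rho, \qquad
\rho = \rho_{0}\,
  \exp\!\Big[-\beta\Big(\phi - \tfrac{1}{2}\Omega^{2} r^{2}\sin^{2}\theta\Big)\Big].
% To first order in \Omega^{2}, the potential is expanded in Legendre
% polynomials, with only the l = 0 and l = 2 terms contributing:
\phi(r,\theta) \simeq \phi_{0}(r)
  + \Omega^{2}\big[\phi_{00}(r) + \phi_{2}(r)\,P_{2}(\cos\theta)\big].
```

The l = 2 term carries the flattening whose maximum the abstract notes lies close to the onset of collapse.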
Inverse sequential procedures for the monitoring of time series
NASA Technical Reports Server (NTRS)
Radok, Uwe; Brown, Timothy
1993-01-01
Climate changes traditionally have been detected from long series of observations and long after they happened. The 'inverse sequential' monitoring procedure is designed to detect changes as soon as they occur. Frequency distribution parameters are estimated both from the most recent existing set of observations and from the same set augmented by 1,2,...j new observations. Individual-value probability products ('likelihoods') are then calculated which yield probabilities for erroneously accepting the existing parameter(s) as valid for the augmented data set and vice versa. A parameter change is signaled when these probabilities (or a more convenient and robust compound 'no change' probability) show a progressive decrease. New parameters are then estimated from the new observations alone to restart the procedure. The detailed algebra is developed and tested for Gaussian means and variances, Poisson and chi-square means, and linear or exponential trends; a comprehensive and interactive Fortran program is provided in the appendix.
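A minimal Python sketch of the likelihood-comparison idea for the Poisson-mean case (our illustration, not the paper's Fortran program; the function names are hypothetical): score the new observations both under the existing parameter estimate and under their own estimate, and let a growing log-likelihood ratio signal a parameter change.

```python
import math

def poisson_loglik(xs, lam):
    # log-likelihood of counts xs under a Poisson(lam) model
    return sum(x * math.log(lam) - lam - math.lgamma(x + 1) for x in xs)

def change_signal(existing, new):
    """Log-likelihood ratio of the new observations under their own mean
    vs. the mean of the existing set; large positive values indicate the
    old parameter no longer fits, i.e. a change."""
    lam_old = sum(existing) / len(existing)
    lam_new = sum(new) / len(new)
    return poisson_loglik(new, lam_new) - poisson_loglik(new, lam_old)
```

After a change is signaled, the procedure restarts by re-estimating the parameter from the new observations alone, as the abstract prescribes.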
IDS plot tools for time series of DORIS station positions and orbit residuals
NASA Astrophysics Data System (ADS)
Soudarin, L.; Ferrage, P.; Moreaux, G.; Mezerette, A.
2012-12-01
DORIS (Doppler Orbitography and Radiopositioning Integrated by Satellite) is a Doppler satellite tracking system developed for precise orbit determination and precise ground location. It is onboard the Cryosat-2, Jason-1, Jason-2 and HY-2A altimetric satellites and the remote sensing satellites SPOT-4 and SPOT-5. It also flew with SPOT-2, SPOT-3, TOPEX/POSEIDON and ENVISAT. Since 1994 and thanks to its worldwide distributed network of more than fifty permanent stations, DORIS contributes to the realization and maintenance of the ITRS (International Terrestrial Reference System). 3D positions and velocities of the reference sites at cm and mm/yr accuracy support scientific studies in geodesy and geophysics. The primary objective of the International DORIS Service (IDS) is to provide support, through DORIS data and products, for research and operational activities. In order to promote the use of the DORIS products, the IDS has made available on its web site (ids-doris.org) a new set of tools, called Plot tools, to interactively build and display graphs of DORIS station coordinates time series and orbit residuals. These web tools are STCDtool, providing station coordinates time series (North, East, Up position evolution) from the IDS Analysis Centers, and POEtool, providing statistics time series (orbit residuals and number of measurements for the DORIS stations) from CNES (the French Space Agency) Precise Orbit Determination processing. Complementary data about station and satellite events can also be displayed (e.g. antenna changes, system failures, degraded data...). Information about earthquakes obtained from the USGS survey service can also be superimposed on the position time series. All these events can help in interpreting the discontinuities in the time series. The purpose of this presentation is to show the functionalities of these tools and their interest for the monitoring of crustal deformation at DORIS sites.
Chambers, Jeffrey Q; Negron-Juarez, Robinson I; Marra, Daniel Magnabosco; Di Vittorio, Alan; Tews, Joerg; Roberts, Dar; Ribeiro, Gabriel H P M; Trumbore, Susan E; Higuchi, Niro
2013-03-05
Old-growth forest ecosystems comprise a mosaic of patches in different successional stages, with the fraction of the landscape in any particular state relatively constant over large temporal and spatial scales. The size distribution and return frequency of disturbance events, and subsequent recovery processes, determine to a large extent the spatial scale over which this old-growth steady state develops. Here, we characterize this mosaic for a Central Amazon forest by integrating field plot data, remote sensing disturbance probability distribution functions, and individual-based simulation modeling. Results demonstrate that a steady state of patches of varying successional age occurs over a relatively large spatial scale, with important implications for detecting temporal trends on plots that sample a small fraction of the landscape. Long highly significant stochastic runs averaging 1.0 Mg biomass·ha⁻¹·y⁻¹ were often punctuated by episodic disturbance events, resulting in a sawtooth time series of hectare-scale tree biomass. To maximize the detection of temporal trends for this Central Amazon site (e.g., driven by CO2 fertilization), plots larger than 10 ha would provide the greatest sensitivity. A model-based analysis of fractional mortality across all gap sizes demonstrated that 9.1-16.9% of tree mortality was missing from plot-based approaches, underscoring the need to combine plot and remote-sensing methods for estimating net landscape carbon balance. Old-growth tropical forests can exhibit complex large-scale structure driven by disturbance and recovery cycles, with ecosystem and community attributes of hectare-scale plots exhibiting continuous dynamic departures from a steady-state condition.
Plot-scale effects on runoff and erosion along a slope degradation gradient
NASA Astrophysics Data System (ADS)
Moreno-de Las Heras, Mariano; Nicolau, José M.; Merino-MartíN, Luis; Wilcox, Bradford P.
2010-04-01
In Earth and ecological sciences, an important, crosscutting issue is the relationship between scale and the processes of runoff and erosion. In drylands, understanding this relationship is critical for understanding ecosystem functionality and degradation processes. Recent work has suggested that the effects of scale may differ depending on the extent of degradation. To test this hypothesis, runoff and sediment yield were monitored during a hydrological year on 20 plots of various lengths (1-15 m). These plots were located on a series of five reclaimed mining slopes in a Mediterranean-dry environment. The five slopes exhibited various degrees of vegetative cover and surface erosion. A general decrease of unit area runoff was observed with increasing plot scale for all slopes. Nevertheless, the amount of reinfiltrated runoff along each slope varied with the extent of degradation, being highest at the least degraded slope and vice versa. In other words, unit area runoff decreased the least on the most disturbed site as plot length increased. Unit area sediment yield declined with increasing plot length for the undisturbed and moderately disturbed sites, but it actually increased for the highly disturbed sites. The different scaling behavior of the most degraded slopes was especially clear under high-intensity rainfall conditions, when flow concentration favored rill erosion. Our results confirm that in drylands, the effects of scale on runoff and erosion change with the extent of degradation, resulting in a substantial loss of soil and water from disturbed systems, which could reinforce the degradation process through feedback mechanisms with vegetation.
Topping, Christopher John; Kjaer, Lene Jung; Hommen, Udo; Høye, Toke Thomas; Preuss, Thomas G; Sibly, Richard M; van Vliet, Peter
2014-07-01
Current European Union regulatory risk assessment allows application of pesticides provided that recovery of nontarget arthropods in-crop occurs within a year. Despite the long-established theory of source-sink dynamics, risk assessment ignores depletion of surrounding populations and typical field trials are restricted to plot-scale experiments. In the present study, the authors used agent-based modeling of 2 contrasting invertebrates, a spider and a beetle, to assess how the area of pesticide application and environmental half-life affect the assessment of recovery at the plot scale and impact the population at the landscape scale. Small-scale plot experiments were simulated for pesticides with different application rates and environmental half-lives. The same pesticides were then evaluated at the landscape scale (10 km × 10 km) assuming continuous year-on-year usage. The authors' results show that recovery time estimated from plot experiments is a poor indicator of long-term population impact at the landscape level and that the spatial scale of pesticide application strongly determines population-level impact. This raises serious doubts as to the utility of plot-recovery experiments in pesticide regulatory risk assessment for population-level protection. Predictions from the model are supported by empirical evidence from a series of studies carried out in the decade starting in 1988. The issues raised then can now be addressed using simulation. Prediction of impacts at landscape scales should be more widely used in assessing the risks posed by environmental stressors. © 2014 SETAC.
Behle, Robert W; Hibbard, Bruce E; Cermak, Steven C; Isbell, Terry A
2008-06-01
In previous crop rotation research, adult emergence traps placed in plots planted to Cuphea PSR-23 (a selected cross of Cuphea viscosissma Jacq. and Cuphea lanceolata Ait.) caught high numbers of adult western corn rootworms, Diabrotica virgifera virgifera LeConte (Coleoptera: Chrysomelidae), suggesting that larvae may have completed development on this broadleaf plant. Because of this observation, a series of greenhouse and field experiments were conducted to test the hypothesis that Cuphea could serve as a host for larval development. Greenhouse-grown plants infested with neonates of a colonized nondiapausing strain of the beetle showed no survival of larvae on Cuphea, although larvae did survive on the positive control (corn, Zea mays L.) and negative control [sorghum, Sorghum bicolor (L.) Moench] plants. Soil samples collected 20 June, 7 July, and 29 July 2005 from field plots planted to Cuphea did not contain rootworm larvae compared with means of 1.28, 0.22, and 0.00 rootworms kg⁻¹ soil, respectively, for samples collected from plots planted to corn. Emergence traps captured a peak of eight beetles trap⁻¹ day⁻¹ from corn plots on 8 July compared with a peak of 0.5 beetle trap⁻¹ day⁻¹ on 4 August from Cuphea plots. Even though a few adult beetles were again captured in the emergence traps placed in the Cuphea plots, it is not thought to be the result of successful larval development on Cuphea roots. All the direct evidence reported here supports the conventional belief that rootworm larvae do not survive on broadleaf plants, including Cuphea.
Unimodularity criteria for Poisson structures on foliated manifolds
NASA Astrophysics Data System (ADS)
Pedroza, Andrés; Velasco-Barreras, Eduardo; Vorobiev, Yury
2018-03-01
We study the behavior of the modular class of an orientable Poisson manifold and formulate some unimodularity criteria in the semilocal context, around a (singular) symplectic leaf. Our results generalize some known unimodularity criteria for regular Poisson manifolds related to the notion of the Reeb class. In particular, we show that the unimodularity of the transverse Poisson structure of the leaf is a necessary condition for the semilocal unimodular property. Our main tool is an explicit formula for a bigraded decomposition of modular vector fields of a coupling Poisson structure on a foliated manifold. Moreover, we also exploit the notion of the modular class of a Poisson foliation and its relationship with the Reeb class.
Cappell, M S; Spray, D C; Bennett, M V
1988-06-28
Protractor muscles in the gastropod mollusc Navanax inermis exhibit typical spontaneous miniature end plate potentials with mean amplitude 1.71 +/- 1.19 (standard deviation) mV. The evoked end plate potential is quantized, with a quantum equal to the miniature end plate potential amplitude. When their rate is stationary, occurrence of miniature end plate potentials is a random, Poisson process. When non-stationary, spontaneous miniature end plate potential occurrence is a non-stationary Poisson process, a Poisson process with the mean frequency changing with time. This extends the random Poisson model for miniature end plate potentials to the frequently observed non-stationary occurrence. Reported deviations from a Poisson process can sometimes be accounted for by the non-stationary Poisson process and more complex models, such as clustered release, are not always needed.
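A non-stationary Poisson process with a time-varying mean frequency, like the non-stationary miniature end plate potential occurrence described above, can be simulated by Lewis' thinning method. The sketch below is our illustration, not from the paper: draw candidate events at a constant dominating rate and keep each with probability rate(t)/rate_max.

```python
import random

def inhomogeneous_poisson(rate, t_max, rate_max, rng=None):
    """Simulate event times of a non-stationary Poisson process on [0, t_max]
    by thinning: candidates arrive at rate_max; each is accepted with
    probability rate(t) / rate_max (requires rate(t) <= rate_max)."""
    rng = rng or random.Random(42)
    t, events = 0.0, []
    while True:
        t += rng.expovariate(rate_max)      # candidate inter-arrival time
        if t > t_max:
            return events
        if rng.random() < rate(t) / rate_max:
            events.append(t)

# e.g. MEPP frequency ramping up linearly over a 10 s record
times = inhomogeneous_poisson(lambda t: 1.0 + 0.5 * t, 10.0, 6.0)
```

With a constant rate function this reduces to the stationary (homogeneous) Poisson model of the abstract.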
A test of inflated zeros for Poisson regression models.
He, Hua; Zhang, Hui; Ye, Peng; Tang, Wan
2017-01-01
Excessive zeros are common in practice and may cause overdispersion and invalidate inference when fitting Poisson regression models. There is a large body of literature on zero-inflated Poisson models. However, methods for testing whether there are excessive zeros are less well developed. The Vuong test comparing a Poisson and a zero-inflated Poisson model is commonly applied in practice. However, the type I error of the test often deviates seriously from the nominal level, casting serious doubt on the validity of the test in such applications. In this paper, we develop a new approach for testing inflated zeros under the Poisson model. Unlike the Vuong test for inflated zeros, our method does not require a zero-inflated Poisson model to perform the test. Simulation studies show that, compared with the Vuong test, our approach is not only better at controlling the type I error rate but also yields more power.
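As a hedged illustration of the general idea (not the authors' actual test statistic), one can compare the observed number of zeros with the number a fitted Poisson model predicts, standardized by a rough binomial variance:

```python
import math

def excess_zero_stat(xs):
    """Crude zero-inflation diagnostic (illustration only): standardized
    difference between the observed zero count and the Poisson-expected
    count n * exp(-lam_hat)."""
    n = len(xs)
    lam = sum(xs) / n
    p0_hat = math.exp(-lam)                    # Poisson P(X = 0) at the MLE
    observed = sum(1 for x in xs if x == 0)
    expected = n * p0_hat
    # binomial variance as a rough normal approximation
    se = math.sqrt(n * p0_hat * (1 - p0_hat))
    return (observed - expected) / se
```

The published test is constructed more carefully (accounting for estimation of lam); this crude diagnostic merely shows why excess zeros are detectable without ever fitting a zero-inflated model.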
Díaz, J; Carmona, R; Mirón, I J; Luna, M Y; Linares, C
2018-07-01
Many of the studies that analyze the future impact of climate change on mortality assume that the temperature that constitutes a heat wave will not change over time. This is unlikely, however, given the process of adaptation to heat, prevention plans, and improvements in social and health infrastructure. The objective of this study is to analyze whether, during the 1983-2013 period, there was a temporal change in the maximum daily temperatures that constitute a heat wave (Tthreshold) in Spain, and to investigate whether there was variation in the attributable risk (AR) associated with mortality due to high temperatures in this period. This study uses daily mortality data for natural causes excluding accidents (ICD-10: A00-R99) in municipalities of over 10,000 inhabitants in 10 Spanish provinces, and maximum temperature data from observatories located in the province capitals. The time series is divided into three periods: 1983-1992, 1993-2003 and 2004-2013. For each period and each province, the value of Tthreshold was calculated using scatter-plot diagrams of the pre-whitened daily mortality series. For each period and each province capital, the number of heat waves was calculated and their impact on mortality was quantified using generalized linear models (GLM) with a Poisson link; these models yield relative risks (RR) and attributable risks (AR). Global RR and AR for all 10 provinces combined were then calculated via meta-analysis. The results show that in the first two periods the RR remained constant, RR: 1.14 (CI95%: 1.09-1.19) and RR: 1.14 (CI95%: 1.10-1.18), while the third period shows a sharp decrease with respect to the prior two periods, RR: 1.01 (CI95%: 1.00-1.01); the difference is statistically significant. In Spain there has been a sharp decrease in mortality attributable to heat over the past 10 years.
The observed variation in RR calls into question the results of the numerous studies that analyze the future impact of heat on mortality under different temporal scenarios while assuming it to be constant over time. Copyright © 2018 Elsevier Ltd. All rights reserved.
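As a minimal illustration of the Poisson-link GLM used in such analyses (a deliberate simplification with a single binary heat-wave indicator, not the authors' full model with seasonal, temperature and unemployment adjustments), the maximum-likelihood estimate of the relative risk reduces to a ratio of mean daily deaths:

```python
def heatwave_rr(deaths, is_wave):
    """With one binary covariate and a log link, the Poisson MLE of the
    relative risk is the ratio of mean daily deaths on heat-wave days
    vs. other days; the attributable risk follows as (RR - 1) / RR."""
    wave = [d for d, w in zip(deaths, is_wave) if w]
    base = [d for d, w in zip(deaths, is_wave) if not w]
    rr = (sum(wave) / len(wave)) / (sum(base) / len(base))
    ar = (rr - 1) / rr
    return rr, ar
```

For example, 114 mean daily deaths on wave days against 100 otherwise reproduces the RR of 1.14 reported for the first two periods.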
Calculation of the Poisson cumulative distribution function
NASA Technical Reports Server (NTRS)
Bowerman, Paul N.; Nolty, Robert G.; Scheuer, Ernest M.
1990-01-01
A method for calculating the Poisson cdf (cumulative distribution function) is presented. The method avoids computer underflow and overflow during the process. The computer program uses this technique to calculate the Poisson cdf for arbitrary inputs. An algorithm that determines the Poisson parameter required to yield a specified value of the cdf is presented.
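A short Python sketch of the underflow/overflow-avoidance technique (an illustration of the idea, not the NASA program itself): accumulate the pmf terms through a log-space recurrence so that factorials and large powers of the parameter are never formed explicitly.

```python
import math

def poisson_cdf(k, lam):
    """Poisson CDF P(X <= k), computed via the log-space recurrence
    log p_i = log p_{i-1} + log(lam) - log(i), so no factorial or
    power of lam can overflow; terms that underflow are negligible."""
    if k < 0:
        return 0.0
    log_p = -lam                      # log pmf at i = 0
    total = math.exp(log_p)
    for i in range(1, k + 1):
        log_p += math.log(lam) - math.log(i)
        total += math.exp(log_p)
    return min(total, 1.0)            # guard against rounding past 1
```

The inverse problem mentioned in the abstract, finding the parameter that yields a specified cdf value, can then be handled by bisection on lam, since for fixed k the cdf decreases monotonically in lam.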
Poisson's Ratio of a Hyperelastic Foam Under Quasi-static and Dynamic Loading
Sanborn, Brett; Song, Bo
2018-06-03
Poisson's ratio is a material constant representing compressibility of material volume. However, when soft, hyperelastic materials such as silicone foam are subjected to large deformation into densification, the Poisson's ratio may change significantly, which warrants careful consideration in modeling and simulation of impact/shock mitigation scenarios where foams are used as isolators. The evolution of Poisson's ratio of silicone foam materials has not yet been characterized, particularly under dynamic loading. In this study, radial and axial measurements of specimen strain are conducted simultaneously during quasi-static and dynamic compression tests to determine the Poisson's ratio of silicone foam. The Poisson's ratio of silicone foam exhibited a transition from compressible to nearly incompressible at a threshold strain that coincided with the onset of densification in the material. Poisson's ratio as a function of engineering strain was different at quasi-static and dynamic rates. Here, the Poisson's ratio behavior is presented and can be used to improve constitutive modeling of silicone foams subjected to a broad range of mechanical loading.
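The underlying measurement reduces to a simple strain ratio; a generic sketch of the textbook definitions (not the authors' data-reduction code) also shows why a ratio near 0.5 marks the nearly incompressible regime:

```python
def poisson_ratio(axial_strain, radial_strain):
    # nu = -eps_radial / eps_axial for uniaxial loading
    # (compressive axial strain negative, lateral expansion positive)
    return -radial_strain / axial_strain

def volumetric_strain(axial_strain, nu):
    # small-strain approximation: dV/V = eps_axial * (1 - 2 * nu);
    # nu -> 0.5 gives zero volume change, i.e. incompressibility
    return axial_strain * (1 - 2 * nu)
```

Computing this ratio point-by-point from the simultaneous radial and axial records is what yields Poisson's ratio as a function of engineering strain.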
NASA Technical Reports Server (NTRS)
Bollman, W. E.; Chadwick, C.
1982-01-01
A number of interplanetary missions now being planned involve placing deterministic maneuvers along the flight path to alter the trajectory. Lee and Boain (1973) examined the statistics of trajectory correction maneuver (TCM) magnitude with no deterministic ('bias') component. The Delta v vector magnitude statistics were generated for several values of random Delta v standard deviations using expansions in terms of infinite hypergeometric series. The present investigation uses a different technique (Monte Carlo simulation) to generate Delta v magnitude statistics for a wider selection of random Delta v standard deviations and also extends the analysis to the case of nonzero deterministic Delta v's. These Delta v magnitude statistics are plotted parametrically. The plots are useful in assisting the analyst in quickly answering questions about the statistics of Delta v magnitude for single TCM's consisting of both a deterministic and a random component. The plots provide quick insight into the nature of the Delta v magnitude distribution for the TCM.
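The Monte Carlo technique described is straightforward to sketch in Python (our illustration; the 3-axis error model, sample size and seeding are assumptions): draw the random execution error about the deterministic bias vector and tabulate statistics of the resulting magnitude.

```python
import math
import random

def dv_magnitude_stats(bias, sigma, n=100_000, rng=None):
    """Monte Carlo estimate of Delta-v magnitude statistics: a deterministic
    bias vector plus an isotropic Gaussian execution error (sigma per axis).
    Returns (mean, standard deviation) of |Delta v|."""
    rng = rng or random.Random(1)
    mags = []
    for _ in range(n):
        v = [b + rng.gauss(0.0, sigma) for b in bias]
        mags.append(math.sqrt(sum(c * c for c in v)))
    mean = sum(mags) / n
    var = sum((m - mean) ** 2 for m in mags) / (n - 1)
    return mean, math.sqrt(var)
```

With zero bias the magnitude is Maxwell-distributed (mean sigma·sqrt(8/pi)); with a large deterministic component the distribution concentrates near the bias magnitude, which is the parametric behavior the plots in the paper capture.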
Takakura, Isabela Thomaz; Hoshi, Rosangela Akemi; Santos, Márcio Antonio; Pivatelli, Flávio Correa; Nóbrega, João Honorato; Guedes, Débora Linhares; Nogueira, Victor Freire; Frota, Tuane Queiroz; Castelo, Gabriel Castro; Godoy, Moacir Fernandes de
2017-01-01
To evaluate the possible gradual return of autonomic function after heart transplantation using quantitative and qualitative information from recurrence plots. Using electrocardiography, 102 RR tachograms of 45 patients (64.4% male) who underwent heart transplantation and that were available in the database were analyzed at different follow-up periods. The RR tachograms were collected from patients in the supine position for about 20 minutes. A time series with 1000 RR intervals was analyzed, a recurrence plot was created, and the following quantitative variables were evaluated: percentage of determinism, percentage of recurrence, average diagonal length, Shannon entropy, and sample entropy, as well as the qualitative visual aspect. Quantitative and qualitative signs of heart rate variability recovery were observed after transplantation. There is evidence that autonomic reinnervation of the heart occurs gradually after transplantation. Quantitative and qualitative analyses of recurrence can be useful tools for monitoring cardiac transplant patients and detecting the gradual return of heart rate variability.
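The recurrence-plot quantities named above can be computed from a binary recurrence matrix; below is a compact sketch for a scalar series using the generic recurrence-quantification definitions with minimum line length 2 (an illustration, not the authors' software):

```python
def recurrence_matrix(x, eps):
    # R[i][j] = 1 when states i and j are closer than eps (scalar series here)
    n = len(x)
    return [[1 if abs(x[i] - x[j]) < eps else 0 for j in range(n)]
            for i in range(n)]

def recurrence_rate(R):
    # fraction of recurrent points in the whole matrix
    n = len(R)
    return sum(sum(row) for row in R) / (n * n)

def determinism(R, lmin=2):
    """Fraction of recurrent points lying on diagonal lines of length >= lmin
    (the %DET measure), excluding the main diagonal. The matrix is symmetric,
    so scanning the upper off-diagonals suffices."""
    n = len(R)
    total = diag = 0
    for k in range(1, n):                  # offsets above the main diagonal
        run = 0
        for i in range(n - k):
            if R[i][i + k]:
                total += 1
                run += 1
            else:
                if run >= lmin:
                    diag += run
                run = 0
        if run >= lmin:
            diag += run
    return diag / total if total else 0.0
```

A strictly periodic series gives determinism near 1, while isolated recurrences give values near 0; the recovery of structure in these measures over follow-up is what signals returning heart rate variability.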
Electrical Characterization of the RCA CDP1822SD Random Access Memory, Volume 1, Appendix a
NASA Technical Reports Server (NTRS)
Klute, A.
1979-01-01
Electrical characterization tests were performed on 35 RCA CDP1822SD, 256-by-4-bit, CMOS, random access memories. The tests included three functional tests, AC and DC parametric tests, a series of schmoo plots, rise/fall time screening, and a data retention test. All tests were performed on an automated IC test system with temperatures controlled by a thermal airstream unit. All the functional tests, the data retention test, and the AC and DC parametric tests were performed at ambient temperatures of 25 C, -20 C, -55 C, 85 C, and 125 C. The schmoo plots were performed at ambient temperatures of 25 C, -55 C, and 125 C. The data retention test was performed at 25 C. Five devices failed one or more functional tests and four of these devices failed to meet the expected limits of a number of AC parametric tests. Some of the schmoo plots indicated a small degree of interaction between parameters.
NASA Astrophysics Data System (ADS)
Durmuş, Perihan; Altindal, Şemsettin
2017-10-01
In this study, electrical parameters of the Al/Bi4Ti3O12/p-Si metal-ferroelectric-semiconductor (MFS) structure and their temperature dependence were investigated using current-voltage (I-V) data measured between 120 K and 300 K. Semi-logarithmic I-V plots revealed that the fabricated structure presents two-diode behavior, leading to two sets of ideality factor, reverse saturation current, and zero-bias barrier height (BH) values. The obtained parameters suggest that the current conduction mechanism (CCM) deviates strongly from thermionic emission theory, particularly at low temperatures. High interface-state densities and the nkT/q-kT/q plot supported the idea of deviation from thermionic emission. In addition, ln(I)-ln(V) plots suggested that the CCM varies from one bias region to another and depends on temperature as well. Series resistance values were calculated using Ohm's law and Cheung's functions, and they decreased drastically with increasing temperature.
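The ideality factor extracted from a semi-logarithmic I-V plot follows from the thermionic-emission relation I = I0 exp(qV/(nkT)): the slope of ln(I) versus V equals q/(nkT). A minimal sketch, with a synthetic diode as an assumed stand-in for measured data (real analysis restricts the fit to the linear bias region):

```python
import math

Q_OVER_K = 11604.5  # q/k in K/V (q = 1.602e-19 C, k = 1.381e-23 J/K)

def ideality_factor(V, I, T):
    """Ideality factor n from the semi-log I-V slope, ln(I) = ln(I0) +
    qV/(nkT), via an ordinary least-squares line fit."""
    y = [math.log(i) for i in I]
    m = len(V)
    mv, my = sum(V) / m, sum(y) / m
    slope = (sum((v - mv) * (yy - my) for v, yy in zip(V, y))
             / sum((v - mv) ** 2 for v in V))
    return Q_OVER_K / (T * slope)

# Synthetic diode obeying thermionic emission with n = 1.8 at 300 K:
T, n_true, I0 = 300.0, 1.8, 1e-9
V = [0.05 * k for k in range(1, 11)]
I = [I0 * math.exp(Q_OVER_K * v / (n_true * T)) for v in V]
n_fit = ideality_factor(V, I, T)
```

Two-diode behavior of the kind reported would show up as two distinct linear regions, each fitted separately by this routine.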
Application of the Hyper-Poisson Generalized Linear Model for Analyzing Motor Vehicle Crashes.
Khazraee, S Hadi; Sáez-Castillo, Antonio Jose; Geedipally, Srinivas Reddy; Lord, Dominique
2015-05-01
The hyper-Poisson distribution can handle both over- and underdispersion, and its generalized linear model formulation allows the dispersion of the distribution to be observation-specific and dependent on model covariates. This study's objective is to examine the potential applicability of a newly proposed generalized linear model framework for the hyper-Poisson distribution in analyzing motor vehicle crash count data. The hyper-Poisson generalized linear model was first fitted to intersection crash data from Toronto, characterized by overdispersion, and then to crash data from railway-highway crossings in Korea, characterized by underdispersion. The results of this study are promising. When fitted to the Toronto data set, the goodness-of-fit measures indicated that the hyper-Poisson model with a variable dispersion parameter provided a statistical fit as good as the traditional negative binomial model. The hyper-Poisson model was also successful in handling the underdispersed data from Korea; the model performed as well as the gamma probability model and the Conway-Maxwell-Poisson model previously developed for the same data set. The advantages of the hyper-Poisson model studied in this article are noteworthy. Unlike the negative binomial model, which has difficulties in handling underdispersed data, the hyper-Poisson model can handle both over- and underdispersed crash data. Although not a major issue for the Conway-Maxwell-Poisson model, the effect of each variable on the expected mean of crashes is easily interpretable in the case of this new model. © 2014 Society for Risk Analysis.
NASA Astrophysics Data System (ADS)
Geng, Weihua; Zhao, Shan
2017-12-01
We present a new Matched Interface and Boundary (MIB) regularization method for treating charge singularity in solvated biomolecules whose electrostatics are described by the Poisson-Boltzmann (PB) equation. In a regularization method, by decomposing the potential function into two or three components, the singular component can be analytically represented by the Green's function, while the other components possess a higher regularity. Our new regularization combines the efficiency of two-component schemes with the accuracy of the three-component schemes. Based on this regularization, a new MIB finite difference algorithm is developed for solving both linear and nonlinear PB equations, where the nonlinearity is handled by the inexact-Newton method. Compared with the existing MIB PB solver based on a three-component regularization, the present algorithm is simpler to implement because it circumvents the need to solve a boundary value Poisson equation inside the molecular interface and to compute the related interface jump conditions numerically. Moreover, the new MIB algorithm is computationally less expensive, while maintaining the same second-order accuracy. This is numerically verified by calculating the electrostatic potential and solvation energy on the Kirkwood sphere, for which analytical solutions are available, and on a series of proteins of various sizes.
Wang, Yu; Liu, Bao-lin; Zhu, Hai-yan; Yan, Chuan-liang; Li, Zhi-jun; Wang, Zhi-qiao
2014-01-01
When deep resources are exploited, the surrounding rock readily undergoes hole shrinkage, borehole collapse, and loss of circulation under high temperature and high pressure. A series of experiments was conducted to determine the compressional wave velocity, triaxial strength, and permeability of granite cored from a 3500-meter borehole under high temperature and three-dimensional stress. Accounting for the coupling of temperature, fluid, and stress, we derive the thermo-fluid-solid model and its governing equation. ANSYS-APDL was also used to simulate the influence of temperature on elastic modulus, Poisson ratio, uniaxial compressive strength, and permeability. From the results, we establish a temperature-fluid-stress model to describe the granite's stability. The compressional wave velocity and elastic modulus decrease as the temperature rises, while the Poisson ratio and permeability of granite increase. The threshold pressure and temperature are 15 MPa and 200°C, respectively. The temperature affects the fracture pressure more than the collapse pressure, but both parameters rise with increasing temperature. The thermo-fluid-solid coupling, which greatly impacts borehole stability, proves to be a good method for analyzing similar problems in other formations. PMID:24778592
Statistical inference for classification of RRIM clone series using near IR reflectance properties
NASA Astrophysics Data System (ADS)
Ismail, Faridatul Aima; Madzhi, Nina Korlina; Hashim, Hadzli; Abdullah, Noor Ezan; Khairuzzaman, Noor Aishah; Azmi, Azrie Faris Mohd; Sampian, Ahmad Faiz Mohd; Harun, Muhammad Hafiz
2015-08-01
RRIM clone is a rubber breeding series produced by RRIM (Rubber Research Institute of Malaysia) through a "rubber breeding program" to improve latex yield and produce clones attractive to farmers. The objective of this work is to analyse measurements from an optical sensing device on latex of selected clone series. The device transmits NIR light, and the reflectance is converted to a voltage. The obtained reflectance index values were analyzed statistically in order to find out whether the clones can be discriminated. From the statistical results using error plots and a one-way ANOVA test, there is overwhelming evidence of discrimination among the RRIM 2002, RRIM 2007 and RRIM 3001 clone series, with p value = 0.000. RRIM 2008 cannot be discriminated from RRIM 2014; however, both of these groups are distinct from the other clones.
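The one-way ANOVA test used to discriminate the clone series can be sketched with a pure-Python F statistic. The group values below are illustrative placeholders, not the reflectance data:

```python
def one_way_anova(groups):
    """F statistic and degrees of freedom for a one-way ANOVA on k groups."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g)
                    for g in groups)
    f = (ss_between / (k - 1)) / (ss_within / (n - k))
    return f, k - 1, n - k

# Well-separated groups (analogous to distinct clones) give a large F;
# heavily overlapping groups give F near 1.
separated = [[1.0, 1.1, 0.9], [5.0, 5.1, 4.9], [9.0, 9.1, 8.9]]
f_sep, df1, df2 = one_way_anova(separated)
```

A p value near zero, as reported for the discriminable clones, corresponds to an F statistic far out in the tail of the F(df1, df2) distribution.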
Moxnes, John F; Moen, Aina E Fossum; Leegaard, Truls Michael
2015-10-05
Study the time development of methicillin-resistant Staphylococcus aureus (MRSA) and forecast future behaviour. The major question: Is the number of MRSA isolates in Norway increasing and will it continue to increase? Time trend analysis using non-stationary γ-Poisson distributions. Two data sets were analysed. The first data set (data set I) consists of all MRSA isolates collected in Oslo County from 1997 to 2010; the study area includes the Norwegian capital of Oslo and nearby surrounding areas, covering approximately 11% of the Norwegian population. The second data set (data set II) consists of all MRSA isolates collected in Health Region East from 2002 to 2011. Health Region East consists of Oslo County and four neighbouring counties, and is the most populated area of Norway. Both data sets I and II consist of all persons in the area and time period described in the Settings, from whom MRSA have been isolated. MRSA infections have been mandatory notifiable in Norway since 1995, and MRSA colonisation since 2004. In the time period studied, all bacterial samples in Norway were sent to a medical microbiological laboratory at the regional hospital for testing. In collaboration with the regional hospitals in five counties, we have collected all MRSA findings in the South-Eastern part of Norway over long time periods. On average, a linear or exponential increase in MRSA numbers was observed in the data sets. A Poisson process with increasing intensity did not capture the dispersion of the time series, but a γ-Poisson process showed good agreement and captured the overdispersion. The numerical model showed internal consistency. In the present study, we find that the number of MRSA isolates is increasing in the most populated area of Norway during the time period studied. We also forecast a continuous increase until the year 2017. Published by the BMJ Publishing Group Limited.
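The overdispersion that favors a γ-Poisson (negative binomial) model over a plain Poisson can be demonstrated by simulation: mixing the Poisson rate over a gamma distribution pushes the variance-to-mean ratio above one. This is a sketch with illustrative rates, shapes, and sample sizes, not the MRSA data:

```python
import math
import random

def poisson_draw(rng, lam):
    """Knuth's product-of-uniforms Poisson sampler (fine for moderate lam)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def dispersion_index(xs):
    """Sample variance over sample mean; close to 1 for Poisson counts."""
    m = sum(xs) / len(xs)
    v = sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    return v / m

rng = random.Random(42)
n = 20_000
homogeneous = [poisson_draw(rng, 5.0) for _ in range(n)]
# gamma-Poisson: the rate itself is drawn from Gamma(shape=2, scale=2.5),
# mean 5, so the counts are overdispersed (marginally negative binomial).
mixed = [poisson_draw(rng, rng.gammavariate(2.0, 2.5)) for _ in range(n)]
```

A dispersion index well above 1 in observed counts, as the abstract describes, is the signature that the γ-Poisson process fits where the plain Poisson does not.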
High-latitude geomagnetic disturbances during ascending solar cycle 24
NASA Astrophysics Data System (ADS)
Peitso, Pyry; Tanskanen, Eija; Stolle, Claudia; Berthou Lauritsen, Nynne; Matzka, Jürgen
2015-04-01
High-latitude regions are very convenient for the study of several space weather phenomena such as substorms. Large geographic coverage as well as long time series of data are essential due to the global nature of space weather and the long duration of solar cycles. We will examine geomagnetic activity in Greenland from magnetic field measurements taken by DTU (Technical University of Denmark) magnetometers during the years 2010 to 2014. The study uses data from 13 magnetometer stations located on the east coast of Greenland and one located on the west coast. The original measurements are in one second resolution, thus the amount of data is quite large. The magnetic field H component (positive direction towards the magnetic north) was used throughout the study. Data processing will be described from calibration of original measurements to plotting of long time series. Calibration consists of determining the quiet hour of a given day and subtracting the average of that hour from all time steps of the day. This normalizes the measurements and allows for better comparison between different time steps. In addition to the full time line of measurements, daily, monthly and yearly averages will be provided for all stations. Differential calculations on the change of the H component will also be made available for the duration of the full data set. Envelope curve plots will be presented for the duration of the time line. Geomagnetic conditions during winter and summer will be compared to examine seasonal variation. Finally, the measured activity will be compared to NOAA (National Oceanic and Atmospheric Administration) issued geomagnetic space weather alerts from 2010 to 2014. Calculations and plotting of measurement data were done with MATLAB. The M_map toolbox was used for plotting of maps featured in the study (http://www2.ocgy.ubc.ca/~rich/map.html). The study was conducted as a part of the ReSoLVE (Research on Solar Long-term Variability and Effects) Center of Excellence.
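The quiet-hour calibration step described above can be sketched as follows. The toy H values and the small `samples_per_hour` are illustrative; the study's 1 s data would use 3600 samples per hour, and how the "quiet" hour is identified (here, smallest peak-to-peak variation) is an assumption of this sketch:

```python
def quietest_hour(h, samples_per_hour):
    """Index of the hour with the smallest peak-to-peak variation in H."""
    best, best_range = 0, float("inf")
    for hour in range(len(h) // samples_per_hour):
        seg = h[hour * samples_per_hour:(hour + 1) * samples_per_hour]
        r = max(seg) - min(seg)
        if r < best_range:
            best, best_range = hour, r
    return best

def calibrate_day(h, samples_per_hour):
    """Subtract the quiet-hour mean so different days share a baseline."""
    q = quietest_hour(h, samples_per_hour)
    seg = h[q * samples_per_hour:(q + 1) * samples_per_hour]
    baseline = sum(seg) / len(seg)
    return [x - baseline for x in h]

# Toy day with 4 samples per "hour": a quiet hour, then a disturbance.
h = [12000.0, 12000.0, 12000.0, 12000.0,
     12010.0, 11950.0, 12080.0, 11990.0]
calibrated = calibrate_day(h, samples_per_hour=4)
```

After calibration the quiet-hour samples sit at zero, so disturbances on different days are directly comparable.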
NASA Astrophysics Data System (ADS)
Foster, J. R.; D'Amato, A. W.; Itter, M.; Reinikainen, M.; Curzon, M.
2017-12-01
The terrestrial carbon cycle is perturbed when disturbances remove leaf biomass from the forest canopy during the growing season. Changes in foliar biomass arise from defoliation caused by insects, disease, drought, frost or human management. As ephemeral disturbances, these often go undetected and their significance to models that predict forest growth from climatic drivers remains unknown. Here, we seek to distinguish the roles of weather vs. canopy disturbance on forest growth by using dense Landsat time-series to quantify departures in mean phenology that in turn predict changes in leaf biomass. We estimated a foliar biomass index (FBMI) from 1984-2016, and predicted plot-level wood growth over 28 years on 156 tree-ring monitoring plots in Minnesota, USA. We accessed the entire Landsat archive (sensors 4, 5 & 7) to compute FBMI using Google Earth Engine's cloud computing platform (GEE). GEE allows this pixel-level approach to be applied at any location; a feature we demonstrate with published wood-growth data from flux tower sites. Our Bayesian models predicted biomass changes from tree-ring plots as a function of Landsat FBMI and annual climate data. We expected model parameters to vary by tree functional groups defined by differences in xylem anatomy and leaf longevity, two traits with linkages to phenology, as reported in a recent review. We found that Landsat FBMI was a surprisingly strong predictor of aggregate wood-growth, explaining up to 80% of annual growth variation for some deciduous plots. Growth responses to canopy disturbance varied among tree functional groups, and the importance of some seasonal climate metrics diminished or changed sign when FBMI was included (e.g. fall and spring climatic water deficit), while others remained unchanged (current and lagged summer deficit). Insights emerging from these models can clear up sources of persistent uncertainty and open a new frontier for models of forest productivity.
Order or chaos in Boolean gene networks depends on the mean fraction of canalizing functions
NASA Astrophysics Data System (ADS)
Karlsson, Fredrik; Hörnquist, Michael
2007-10-01
We explore the connection between order/chaos in Boolean networks and the naturally occurring fraction of canalizing functions in such systems. This fraction turns out to give a very clear indication of whether the system possesses ordered or chaotic dynamics, as measured by Derrida plots, and also the degree of order when we compare different networks with the same number of vertices and edges. By also studying a wide distribution of indegrees in a network, we show that the mean probability of canalizing functions is a more reliable indicator of the type of dynamics for a finite network than the classical result on stability relating the bias to the mean indegree. Finally, we compare by direct simulations two biologically derived networks with networks of similar sizes but with power-law and Poisson distributions of indegrees, respectively. The biologically motivated networks are not more ordered than the latter, and in one case the biological network is even chaotic while the others are not.
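A Boolean function is canalizing when fixing one input to one value forces the output regardless of the other inputs. Checking this, and computing the fraction of canalizing functions among all functions of a given indegree, can be sketched directly; the truth-table encoding used here is an assumption of the sketch:

```python
from itertools import product

def is_canalizing(table, k):
    """True if some input, fixed to some value, forces a constant output.
    `table[idx]` is the output for the k-bit input tuple at position idx
    in itertools.product((0, 1), repeat=k) order."""
    inputs = list(product((0, 1), repeat=k))
    for i in range(k):
        for v in (0, 1):
            outs = {table[idx] for idx, bits in enumerate(inputs)
                    if bits[i] == v}
            if len(outs) == 1:
                return True
    return False

AND = [0, 0, 0, 1]   # canalizing: either input fixed to 0 forces output 0
XOR = [0, 1, 1, 0]   # not canalizing: no single input ever decides the output

# Fraction of canalizing functions among all 16 two-input Boolean functions:
frac = sum(is_canalizing([(b >> j) & 1 for j in range(4)], 2)
           for b in range(16)) / 16
```

For indegree 2 only XOR and XNOR fail the test, so the fraction is 14/16; for larger indegrees the naturally occurring fraction drops, which is what the abstract ties to the order/chaos transition.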
The 3DGRAPE book: Theory, users' manual, examples
NASA Technical Reports Server (NTRS)
Sorenson, Reese L.
1989-01-01
A users' manual for a new three-dimensional grid generator called 3DGRAPE is presented. The program, written in FORTRAN, is capable of making zonal (blocked) computational grids in or about almost any shape. Grids are generated by the solution of Poisson's differential equations in three dimensions. The program automatically finds its own values for inhomogeneous terms which give near-orthogonality and controlled grid cell height at boundaries. Grids generated by 3DGRAPE have been applied to both viscous and inviscid aerodynamic problems, and to problems in other fluid-dynamic areas. The smoothness for which elliptic methods are known is seen here, including smoothness across zonal boundaries. An introduction giving the history, motivation, capabilities, and philosophy of 3DGRAPE is presented first. Then follows a chapter on the program itself. The input is then described in detail. A chapter on reading the output and debugging follows. Three examples are then described, including sample input data and plots of output. Last is a chapter on the theoretical development of the method.
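The elliptic grid-generation idea can be sketched in 2D. This zero-forcing (Laplace) relaxation omits 3DGRAPE's inhomogeneous control terms for near-orthogonality and boundary cell height, and the grid size and perturbation are illustrative:

```python
def smooth_grid(x, y, iters=200):
    """Gauss-Seidel relaxation of the zero-forcing (Laplace) grid equations:
    each interior node moves to the average of its four neighbours while
    boundary nodes stay fixed."""
    nj, ni = len(x), len(x[0])
    for _ in range(iters):
        for j in range(1, nj - 1):
            for i in range(1, ni - 1):
                x[j][i] = 0.25 * (x[j-1][i] + x[j+1][i] + x[j][i-1] + x[j][i+1])
                y[j][i] = 0.25 * (y[j-1][i] + y[j+1][i] + y[j][i-1] + y[j][i+1])
    return x, y

# A 5x5 grid with one badly placed interior node: relaxation restores the
# smooth node distribution implied by the (fixed) boundary.
n = 5
x = [[float(i) for i in range(n)] for _ in range(n)]
y = [[float(j) for _ in range(n)] for j in range(n)]
x[2][2], y[2][2] = 0.3, 3.9
x, y = smooth_grid(x, y)
```

Solving Poisson's equations instead of Laplace's, as 3DGRAPE does, amounts to adding source terms to the right-hand side of each update to pull grid lines toward desired spacing and angles.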
Witte, Anna Kristina; Mester, Patrick; Fister, Susanne; Witte, Matthias; Schoder, Dagmar; Rossmanith, Peter
2016-01-01
The droplet digital polymerase chain reaction (ddPCR) determines DNA amounts based upon the pattern of positive and negative droplets, according to Poisson distribution, without the use of external standards. However, division into positive and negative droplets is often not clear because a part of the droplets has intermediate fluorescence values, appearing as “rain” in the plot. Despite the droplet rain, absolute quantification with ddPCR is possible, as shown previously for the prfA assay in quantifying Listeria monocytogenes. Nevertheless, reducing the rain, and thus ambiguous results, promotes the accuracy and credibility of ddPCR. In this study, we extensively investigated chemical and physical parameters for optimizing the prfA assay for ddPCR. While differences in the concentration of all chemicals and the dye, quencher and supplier of the probe did not alter the droplet pattern, changes in the PCR cycling program, such as prolonged times and increased cycle numbers, improved the assay. PMID:27992475
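The Poisson arithmetic behind ddPCR quantification is compact: because target copies distribute randomly over droplets, the fraction of negative droplets estimates the Poisson zero class, giving the mean copies per droplet. The droplet counts and droplet volume below are illustrative assumptions:

```python
import math

def copies_per_droplet(n_positive, n_total):
    """Mean copies per droplet from the negative fraction:
    lambda = -ln(n_negative / n_total), the Poisson zero-class estimator."""
    n_negative = n_total - n_positive
    if n_negative == 0:
        raise ValueError("all droplets positive: beyond the Poisson range")
    return -math.log(n_negative / n_total)

def copies_per_microlitre(n_positive, n_total, droplet_volume_ul=0.00085):
    """Concentration estimate; 0.85 nL is a typical droplet volume (assumed)."""
    return copies_per_droplet(n_positive, n_total) / droplet_volume_ul

lam = copies_per_droplet(4000, 15000)   # 4000 positives of 15000 droplets
conc = copies_per_microlitre(4000, 15000)
```

"Rain" droplets matter precisely because they shift the positive/negative split that feeds this estimator, which is why reducing rain improves accuracy.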
A Martingale Characterization of Mixed Poisson Processes.
1985-10-01
Pfeifer, Dietmar (Technical University Aachen)
Mixed Poisson processes play an important role in many branches of applied probability, for instance in insurance mathematics and physics (see Albrecht
1978-12-01
Poisson processes. The method is valid for Poisson processes with any given intensity function. The basic thinning algorithm is modified to exploit several refinements which reduce computer execution time by approximately one-third. The basic and modified thinning programs are compared with the Poisson decomposition and gap-statistics algorithm, which is easily implemented for Poisson processes with intensity functions of the form exp(a0 + a1*t + a2*t^2). The thinning programs are competitive in both execution
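The basic thinning algorithm (Lewis-Shedler) can be sketched directly: candidate events arrive as a homogeneous Poisson process at a dominating rate, and each is kept with probability intensity(t)/lam_max. The coefficients, horizon, and seed below are illustrative, not those of the report:

```python
import math
import random

def thinning(intensity, lam_max, t_end, rng):
    """Lewis-Shedler thinning for a nonhomogeneous Poisson process:
    candidates at rate lam_max, each kept w.p. intensity(t) / lam_max.
    Requires intensity(t) <= lam_max on [0, t_end]."""
    t, events = 0.0, []
    while True:
        t += rng.expovariate(lam_max)
        if t > t_end:
            return events
        if rng.random() * lam_max <= intensity(t):
            events.append(t)

# Intensity of the quadratic-exponential form used in the comparison.
a0, a1, a2 = math.log(5.0), 0.8, -0.1
intensity = lambda t: math.exp(a0 + a1 * t + a2 * t * t)
t_end = 10.0
lam_max = intensity(-a1 / (2 * a2))     # intensity peaks at t = -a1/(2*a2) = 4
events = thinning(intensity, lam_max, t_end, random.Random(7))
```

The refinements the report describes amount to tightening `lam_max` piecewise so fewer candidates are rejected, which is where the execution-time savings come from.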
Deformation mechanisms in negative Poisson's ratio materials - Structural aspects
NASA Technical Reports Server (NTRS)
Lakes, R.
1991-01-01
Poisson's ratio in materials is governed by the following aspects of the microstructure: the presence of rotational degrees of freedom, non-affine deformation kinematics, or anisotropic structure. Several structural models are examined. The non-affine kinematics are seen to be essential for the production of negative Poisson's ratios for isotropic materials containing central force linkages of positive stiffness. Non-central forces combined with pre-load can also give rise to a negative Poisson's ratio in isotropic materials. A chiral microstructure with non-central force interaction or non-affine deformation can also exhibit a negative Poisson's ratio. Toughness and damage resistance in these materials may be affected by the Poisson's ratio itself, as well as by generalized continuum aspects associated with the microstructure.
Exact solution for the Poisson field in a semi-infinite strip.
Cohen, Yossi; Rothman, Daniel H
2017-04-01
The Poisson equation is associated with many physical processes. Yet exact analytic solutions for the two-dimensional Poisson field are scarce. Here we derive an analytic solution for the Poisson equation with constant forcing in a semi-infinite strip. We provide a method that can be used to solve the field in other intricate geometries. We show that the Poisson flux reveals an inverse square-root singularity at a tip of a slit, and identify a characteristic length scale in which a small perturbation, in a form of a new slit, is screened by the field. We suggest that this length scale expresses itself as a characteristic spacing between tips in real Poisson networks that grow in response to fluxes at tips.
Ni, Wei; Ding, Guoyong; Li, Yifei; Li, Hongkai; Jiang, Baofa
2014-01-01
Xinxiang, a city in Henan Province, suffered from frequent floods due to persistent and heavy precipitation from 2004 to 2010. In the same period, dysentery was a common public health problem in Xinxiang, with the proportion of reported cases being the third highest among all the notified infectious diseases. We focused on dysentery disease consequences of different degrees of floods and examined the association between floods and the morbidity of dysentery on the basis of longitudinal data during the study period. A time-series Poisson regression model was conducted to examine the relationship between 10 floods of different degrees and the monthly morbidity of dysentery from 2004 to 2010 in Xinxiang. Relative risks (RRs) of moderate and severe floods on the morbidity of dysentery were calculated in this paper. In addition, we estimated the attributable contributions of moderate and severe floods to the morbidity of dysentery. A total of 7591 cases of dysentery were notified in Xinxiang during the study period. The effect of floods on dysentery was shown with a 0-month lag. Regression analysis showed that the risk of moderate and severe floods on the morbidity of dysentery was 1.55 (95% CI: 1.42-1.670) and 1.74 (95% CI: 1.56-1.94), respectively. The attributable risk proportions (ARPs) of moderate and severe floods to the morbidity of dysentery were 35.53 and 42.48%, respectively. This study confirms that floods have significantly increased the risk of dysentery in the study area. In addition, severe floods have a higher proportional contribution to the morbidity of dysentery than moderate floods. Public health action should be taken to avoid and control a potential risk of dysentery epidemics after floods.
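The attributable risk proportion follows from the relative risk by the standard epidemiological formula ARP = (RR - 1)/RR. A minimal sketch; applied to the rounded published RRs it reproduces the reported 35.53% and 42.48% only approximately:

```python
def attributable_risk_proportion(rr):
    """ARP = (RR - 1) / RR: the share of cases among the exposed
    attributable to the exposure (here, flood periods)."""
    return (rr - 1.0) / rr

arp_moderate = attributable_risk_proportion(1.55)   # moderate floods
arp_severe = attributable_risk_proportion(1.74)     # severe floods
```

The small gap to the published values (35.48% vs 35.53%, 42.53% vs 42.48%) comes from rounding of the reported RRs, not from the formula.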
Critique of Sikkink and Keane's comparison of surface fuel sampling techniques
Clinton S. Wright; Roger D. Ottmar; Robert E. Vihnanek
2010-01-01
The 2008 paper of Sikkink and Keane compared several methods to estimate surface fuel loading in western Montana: two widely used inventory techniques (planar intersect and fixed-area plot) and three methods that employ photographs as visual guides (photo load, photoload macroplot and photo series). We feel, however, that their study design was inadequate to evaluate...
Evaluating growth models: A case study using PrognosisBC
Peter Marshall; Pablo Parysow; Shadrach Akindele
2008-01-01
The ability of the PrognosisBC (Version 3.0) growth model to predict tree and stand growth was assessed against a series of remeasured permanent sample plots, including some which had been precommercially thinned. In addition, the model was evaluated for logical consistency across a variety of stand structures using simulation. By the end of the...
Design and analysis for detection monitoring of forest health
F. A. Roesch
1995-01-01
An analysis procedure is proposed for the sample design of the Forest Health Monitoring Program (FHM) in the United States. The procedure is intended to provide increased sensitivity to localized but potentially important changes in forest health by explicitly accounting for the spatial relationships between plots in the FHM design. After a series of median sweeps...
Infrared speckle interferometry and spectroscopy of Io
NASA Technical Reports Server (NTRS)
Howell, Robert R.
1991-01-01
Observations of a series of mutual events of the Galilean satellites occurring in early 1991 are providing high resolution information concerning the volcanic hot spots on Jupiter's moon Io. The brightness of Io is plotted as a function of time as it is occulted by Europa. Voyager derived globes are given and interpreted, giving special attention to observed hot spots.
Eastern cottonwood clonal mixing study: intergenotypic competition effects
G. Sam Foster; R.J. Rousseau; W.L Nance
1998-01-01
Intergenotypic competition of seven clones of eastern cottonwood (Populus deltoides) was evaluated in a replacement series experiment. A partial diallel competition design was used to choose pairs (binary sets) of clones for plot type treatments. Two separate treatments were established for each pair of clones, namely (1) 75 percent clone A: 25 percent clone B and (2)...
Procedures for numerical analysis of circadian rhythms
Refinetti, Roberto; Cornélissen, Germaine; Halberg, Franz
2010-01-01
This article reviews various procedures used in the analysis of circadian rhythms at the populational, organismal, cellular and molecular levels. The procedures range from visual inspection of time plots and actograms to several mathematical methods of time series analysis. Computational steps are described in some detail, and additional bibliographic resources and computer programs are listed. PMID:23710111
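Among the procedures such a review covers, the single-component cosinor fit is compact enough to sketch: the rhythm y = M + A·cos(2πt/period + φ) is linear in M, β = A·cosφ, and γ = -A·sinφ, so mesor, amplitude, and acrophase come from an ordinary least-squares fit. The period, sampling, and parameter values below are illustrative:

```python
import math

def cosinor(t, y, period):
    """Single-component cosinor: least-squares fit of
    y = M + A*cos(2*pi*t/period + phi). Returns (M, A, phi)."""
    w = 2 * math.pi / period
    X = [[1.0, math.cos(w * ti), math.sin(w * ti)] for ti in t]
    # Normal equations (X^T X) b = X^T y, solved by Gaussian elimination.
    A = [[sum(r[i] * r[j] for r in X) for j in range(3)] for i in range(3)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(3)]
    for i in range(3):                       # forward elimination
        for k in range(i + 1, 3):
            f = A[k][i] / A[i][i]
            for j in range(3):
                A[k][j] -= f * A[i][j]
            b[k] -= f * b[i]
    coef = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):                      # back substitution
        coef[i] = (b[i] - sum(A[i][j] * coef[j]
                              for j in range(i + 1, 3))) / A[i][i]
    M, beta, gamma = coef
    return M, math.hypot(beta, gamma), math.atan2(-gamma, beta)

# Hourly data over 3 days with a 24 h rhythm:
# mesor 10, amplitude 3, acrophase -pi/4.
t = list(range(72))
y = [10 + 3 * math.cos(2 * math.pi * ti / 24 - math.pi / 4) for ti in t]
M, amp, phi = cosinor(t, y, 24)
```

Real circadian records add noise and missing samples, but the recovered mesor, amplitude, and acrophase are exactly the populational parameters cosinor analysis reports.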
Canine Conjectures: Using Data for Proportional Reasoning
ERIC Educational Resources Information Center
Westenskow, Arla; Moyer-Packenham, Patricia S.
2011-01-01
No person, place, or thing can capture the attention of a class of sixth graders like "man's best friend." To prompt students' interest in a series of lessons on proportional relationships, the authors brought in a unique teaching aid--a dog. A family dog was used to supply the measurements for scatter plots and variables so that students could…
Overland flow velocities on semiarid hillslopes 2471
USDA-ARS?s Scientific Manuscript database
A series of 188 rainfall plot simulations was conducted on grass, shrub, oak savanna, and juniper sites in Arizona and Nevada. A total of 897 flow velocity measurements was obtained on 3.6% to 39.6% slopes with values ranging from 0.007 m s-1 to 0.115 m s-1. The experimental data showed that flow ve...
Fuzzy classifier based support vector regression framework for Poisson ratio determination
NASA Astrophysics Data System (ADS)
Asoodeh, Mojtaba; Bagheripour, Parisa
2013-09-01
Poisson ratio is considered one of the most important rock mechanical properties of hydrocarbon reservoirs. Determination of this parameter through laboratory measurement is time-consuming, costly, and labor-intensive. Furthermore, laboratory measurements do not provide continuous data along the reservoir intervals. Hence, a fast, accurate, and inexpensive way of determining Poisson ratio which produces continuous data over the whole reservoir interval is desirable. For this purpose, the support vector regression (SVR) method based on statistical learning theory (SLT) was employed as a supervised learning algorithm to estimate Poisson ratio from conventional well log data. SVR is capable of accurately extracting the implicit knowledge contained in conventional well logs and converting the gained knowledge into Poisson ratio data. The structural risk minimization (SRM) principle embedded in the SVR structure, in addition to the empirical risk minimization (ERM) principle, provides a robust model for finding a quantitative formulation between conventional well log data and Poisson ratio. Although satisfying results were obtained from an individual SVR model, it tended to overestimate low Poisson ratios and underestimate high ones. These errors were eliminated through implementation of fuzzy classifier based SVR (FCBSVR). The FCBSVR significantly improved the accuracy of the final prediction. This strategy was successfully applied to data from carbonate reservoir rocks of an Iranian oil field. Results indicated that SVR-predicted Poisson ratio values are in good agreement with measured values.
Carleton, W. Christopher; Campbell, David; Collard, Mark
2018-01-01
Statistical time-series analysis has the potential to improve our understanding of human-environment interaction in deep time. However, radiocarbon dating—the most common chronometric technique in archaeological and palaeoenvironmental research—creates challenges for established statistical methods. The methods assume that observations in a time-series are precisely dated, but this assumption is often violated when calibrated radiocarbon dates are used because they usually have highly irregular uncertainties. As a result, it is unclear whether the methods can be reliably used on radiocarbon-dated time-series. With this in mind, we conducted a large simulation study to investigate the impact of chronological uncertainty on a potentially useful time-series method. The method is a type of regression involving a prediction algorithm called the Poisson Exponentially Weighted Moving Average (PEWMA). It is designed for use with count time-series data, which makes it applicable to a wide range of questions about human-environment interaction in deep time. Our simulations suggest that the PEWMA method can often correctly identify relationships between time-series despite chronological uncertainty. When two time-series are correlated with a coefficient of 0.25, the method is able to identify that relationship correctly 20–30% of the time, provided the time-series contain low noise levels. With correlations of around 0.5, it is capable of correctly identifying correlations despite chronological uncertainty more than 90% of the time. While further testing is desirable, these findings indicate that the method can be used to test hypotheses about long-term human-environment interaction with a reasonable degree of confidence. PMID:29351329
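The core idea behind PEWMA forecasting, tracking a Poisson rate as an exponentially weighted function of past counts, can be sketched as below. This is a deliberately simplified illustration, not the full PEWMA filter of the paper (which also incorporates observation weighting and covariates); the decay parameter and the level-shift series are illustrative:

```python
# Minimal sketch: one-step-ahead Poisson rate forecasts via exponential smoothing
# of a count series, the intuition underlying PEWMA-type models.
import numpy as np

def ewma_poisson_forecast(counts, omega=0.8):
    """Return one-step-ahead rate forecasts for a count series."""
    lam = max(counts[0], 1e-8)                  # initialize rate at the first count
    forecasts = []
    for y in counts:
        forecasts.append(lam)                   # forecast made before observing y
        lam = omega * lam + (1.0 - omega) * y   # exponentially weighted update
    return np.array(forecasts)

rng = np.random.default_rng(1)
true_rate = np.concatenate([np.full(100, 3.0), np.full(100, 8.0)])  # a level shift
counts = rng.poisson(true_rate)
fc = ewma_poisson_forecast(counts)   # forecasts adapt to the shift after ~1/(1-omega) steps
```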
Analysis of the Command and Control Segment (CCS) attitude estimation algorithm
NASA Technical Reports Server (NTRS)
Stockwell, Catherine
1993-01-01
This paper categorizes the qualitative behavior of the Command and Control Segment (CCS) differential correction algorithm as applied to attitude estimation using simultaneous spin axis sun angle and Earth chord length measurements. The categories of interest are the domains of convergence, divergence, and their boundaries. Three series of plots are discussed that show the dependence of the estimation algorithm on the vehicle radius, the sun/Earth angle, and the spacecraft attitude. Qualitative dynamics common to all three series are tabulated and discussed. Out-of-limits conditions for the estimation algorithm are identified and discussed.
Multivariable nonlinear analysis of foreign exchange rates
NASA Astrophysics Data System (ADS)
Suzuki, Tomoya; Ikeguchi, Tohru; Suzuki, Masuo
2003-05-01
We analyze multivariable time series of foreign exchange rates: price movements, which have often been analyzed, together with dealing time intervals and the spreads between bid and ask prices. Treating dealing time intervals as event timing, like neurons' firings, we use raster plots (RPs) and peri-stimulus time histograms (PSTHs), which are popular methods in the field of neurophysiology. Introducing special processing to obtain RPs and PSTHs for exchange rate time series, we discover that dynamical interaction exists among the three variables. We also find that adopting multiple variables improves prediction accuracy.
Peat fires as source of polycyclic aromatic hydrocarbons in soils
NASA Astrophysics Data System (ADS)
Tsibart, Anna
2013-04-01
Polycyclic aromatic hydrocarbons (PAHs) originate from pyrogenic sources including volcanism and the combustion of oil products and plant materials. The production of PAHs during the combustion of plant materials has been considered in a number of publications, but their results were mainly obtained in laboratory experiments; insufficient data are available on the high-temperature production of PAHs in environmental objects. For example, natural fires are frequently cited as PAH sources in landscapes, but very little factual data are available on this topic. In the Polistovskii reserve (Pskov region, Russia), soil series were separated according to the damage to the plants; these series included soils of plots subjected to fires of different intensities, as well as soils of background plots. The series of organic and organomineral soils differed significantly in their PAH distributions. In this series, the concentration of PAHs in the upper horizons of the peat soils varied little or decreased slightly, but accumulation occurred at a depth of 5-10 or 10-20 cm in the soils after the fires. For example, in the series of high-moor soils, the content of PAHs in the upper horizons remained almost constant, while significant differences were observed in the subsurface horizons: from 2 ng/g in the background soil to 70 ng/g after the fire. In the upper horizons of the oligotrophic peat soils under pine forests, the total PAH content also varied only slightly. At the same time, the content of PAHs in the soil series increased from 15 to 90 ng/g with increasing pyrogenic damage to the plot. No clear trends in PAH accumulation were recorded in the organomineral soils. The content of PAHs in the soddy-podzolic soil subjected to fire decreased slightly (from 20 to 10 ng/g) compared to the less damaged soil. In peat fires, the access of oxygen to the fire zone is lower than in forest fires.
The oxygen deficit favors the recombination of organic fragments and the production of PAHs; therefore, larger amounts of PAHs are formed in peat fires. In addition, peat fires occur directly in the soil layer, so larger amounts of the resulting polyarenes remain in the soils of the fire sites. PAHs can also be formed during the heating of organic matter in areas adjacent to the fire sites. After the combustion of peat in fires, phenanthrene, chrysene, benz[a]pyrene, and tetraphene accumulate in soils; these are mainly 4-ring compounds, together with 3-ring phenanthrene and 5-ring benz[a]pyrene. The formation of high-molecular-weight compounds like benz[a]pyrene and, in some places, benzo[ghi]perylene is possible during smoldering under a low oxygen supply.
Yelland, Lisa N; Salter, Amy B; Ryan, Philip
2011-10-15
Modified Poisson regression, which combines a log Poisson regression model with robust variance estimation, is a useful alternative to log binomial regression for estimating relative risks. Previous studies have shown both analytically and by simulation that modified Poisson regression is appropriate for independent prospective data. This method is often applied to clustered prospective data, despite a lack of evidence to support its use in this setting. The purpose of this article is to evaluate the performance of the modified Poisson regression approach for estimating relative risks from clustered prospective data, by using generalized estimating equations to account for clustering. A simulation study is conducted to compare log binomial regression and modified Poisson regression for analyzing clustered data from intervention and observational studies. Both methods generally perform well in terms of bias, type I error, and coverage. Unlike log binomial regression, modified Poisson regression is not prone to convergence problems. The methods are contrasted by using example data sets from 2 large studies. The results presented in this article support the use of modified Poisson regression as an alternative to log binomial regression for analyzing clustered prospective data when clustering is taken into account by using generalized estimating equations.
A generalized Poisson and Poisson-Boltzmann solver for electrostatic environments.
Fisicaro, G; Genovese, L; Andreussi, O; Marzari, N; Goedecker, S
2016-01-07
The computational study of chemical reactions in complex, wet environments is critical for applications in many fields. It is often essential to study chemical reactions in the presence of applied electrochemical potentials, taking into account the non-trivial electrostatic screening coming from the solvent and the electrolytes. As a consequence, the electrostatic potential has to be found by solving the generalized Poisson and the Poisson-Boltzmann equations for neutral and ionic solutions, respectively. In the present work, solvers for both problems have been developed. A preconditioned conjugate gradient method has been implemented for the solution of the generalized Poisson equation and the linear regime of the Poisson-Boltzmann equation, allowing the minimization problem to be solved iteratively within some ten iterations of the ordinary Poisson equation solver. In addition, a self-consistent procedure enables us to solve the non-linear Poisson-Boltzmann problem. Both solvers exhibit very high accuracy and parallel efficiency and allow for the treatment of periodic, free, and slab boundary conditions. The solver has been integrated into the BigDFT and Quantum-ESPRESSO electronic-structure packages and will be released as an independent program, suitable for integration in other codes.
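The basic numerical kernel, a conjugate-gradient solve of a discretized Poisson equation, can be sketched as below. This is the ordinary constant-coefficient Poisson problem with homogeneous Dirichlet boundaries and a manufactured source, not the paper's generalized, dielectric-varying, preconditioned solver; grid size and source are illustrative:

```python
# Sketch: solve -∇²u = f on the unit square with a 5-point stencil and CG.
import numpy as np
from scipy.sparse import kron, identity, diags
from scipy.sparse.linalg import cg

n, h = 50, 1.0 / 51
T = diags([-np.ones(n - 1), 2.0 * np.ones(n), -np.ones(n - 1)],
          [-1, 0, 1]) / h**2                 # 1D second-difference operator
I = identity(n)
A = kron(I, T) + kron(T, I)                  # 2D Laplacian (SPD matrix)

x = np.linspace(h, 1 - h, n)
X, Y = np.meshgrid(x, x)
# Manufactured source whose exact solution is sin(πx)sin(πy)
f = (2 * np.pi**2) * np.sin(np.pi * X) * np.sin(np.pi * Y)

u, info = cg(A, f.ravel())                   # conjugate-gradient iteration
err = float(np.max(np.abs(u - (np.sin(np.pi * X) * np.sin(np.pi * Y)).ravel())))
```

The generalized equation of the paper replaces the constant Laplacian with ∇·(ε(r)∇u), which is why preconditioning by the ordinary Poisson solver is effective.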
A Review of Multivariate Distributions for Count Data Derived from the Poisson Distribution.
Inouye, David; Yang, Eunho; Allen, Genevera; Ravikumar, Pradeep
2017-01-01
The Poisson distribution has been widely studied and used for modeling univariate count-valued data. Multivariate generalizations of the Poisson distribution that permit dependencies, however, have been far less popular. Yet, real-world high-dimensional count-valued data found in word counts, genomics, and crime statistics, for example, exhibit rich dependencies, and motivate the need for multivariate distributions that can appropriately model this data. We review multivariate distributions derived from the univariate Poisson, categorizing these models into three main classes: 1) where the marginal distributions are Poisson, 2) where the joint distribution is a mixture of independent multivariate Poisson distributions, and 3) where the node-conditional distributions are derived from the Poisson. We discuss the development of multiple instances of these classes and compare the models in terms of interpretability and theory. Then, we empirically compare multiple models from each class on three real-world datasets that have varying data characteristics from different domains, namely traffic accident data, biological next generation sequencing data, and text data. These empirical experiments develop intuition about the comparative advantages and disadvantages of each class of multivariate distribution that was derived from the Poisson. Finally, we suggest new research directions as explored in the subsequent discussion section.
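The simplest member of the first class reviewed above (Poisson marginals with dependence) is the common-shock construction, sketched below with illustrative rates: a shared Poisson variable induces positive correlation while each margin remains exactly Poisson.

```python
# Sketch: bivariate counts with Poisson marginals via a common Poisson shock.
import numpy as np

rng = np.random.default_rng(3)
n = 200_000
lam0, lam1, lam2 = 2.0, 3.0, 4.0      # shared and idiosyncratic rates (illustrative)
y0 = rng.poisson(lam0, n)             # common shock
x1 = y0 + rng.poisson(lam1, n)        # marginal: Poisson(lam0 + lam1)
x2 = y0 + rng.poisson(lam2, n)        # marginal: Poisson(lam0 + lam2)

# Theory: cov(x1, x2) = lam0, so corr = lam0 / sqrt((lam0+lam1)(lam0+lam2))
emp_cov = float(np.cov(x1, x2)[0, 1])
```

A known limitation, noted in reviews of this class, is that the construction can only produce nonnegative correlation.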
Application of zero-inflated poisson mixed models in prognostic factors of hepatitis C.
Akbarzadeh Baghban, Alireza; Pourhoseingholi, Asma; Zayeri, Farid; Jafari, Ali Akbar; Alavian, Seyed Moayed
2013-01-01
In recent years, hepatitis C virus (HCV) infection has represented a major public health problem. Evaluation of risk factors is one of the approaches that help protect people from infection. This study aims to employ zero-inflated Poisson mixed models to evaluate prognostic factors of hepatitis C. The data were collected from a longitudinal study during 2005-2010. First, a mixed Poisson regression (PR) model was fitted to the data. Then, a mixed zero-inflated Poisson model was fitted with compound Poisson random effects. To evaluate the performance of the proposed mixed model, the standard errors of the estimators were compared. The results obtained from the mixed PR model showed that genotype 3 and the treatment protocol were statistically significant. Results of the zero-inflated Poisson mixed model showed that age, sex, genotypes 2 and 3, the treatment protocol, and having risk factors had significant effects on the viral load of HCV patients. Of these two models, the estimators of the zero-inflated Poisson mixed model had the smaller standard errors, indicating that the mixed zero-inflated Poisson model provided the better fit. The proposed model can capture serial dependence, additional overdispersion, and excess zeros in longitudinal count data.
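The zero-inflated Poisson likelihood at the heart of such models can be sketched directly: P(0) = π + (1−π)e^{−λ} and P(k) = (1−π)λ^k e^{−λ}/k! for k > 0. The sketch below fits it by maximum likelihood on simulated data; the random effects of the mixed model are omitted and all parameter values are illustrative:

```python
# Sketch: maximum-likelihood fit of a zero-inflated Poisson (no random effects).
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

rng = np.random.default_rng(4)
n, pi_true, lam_true = 5000, 0.3, 2.5
y = rng.poisson(lam_true, n) * (rng.random(n) > pi_true)   # structural zeros w.p. π

def zip_negloglik(params):
    # Unconstrained parameterization: logit(π), log(λ)
    pi = 1.0 / (1.0 + np.exp(-params[0]))
    lam = np.exp(params[1])
    logp_pos = np.log1p(-pi) + y * np.log(lam) - lam - gammaln(y + 1)
    logp_zero = np.log(pi + (1.0 - pi) * np.exp(-lam))
    return -np.sum(np.where(y == 0, logp_zero, logp_pos))

res = minimize(zip_negloglik, x0=[0.0, 0.0], method="Nelder-Mead")
pi_hat = float(1.0 / (1.0 + np.exp(-res.x[0])))
lam_hat = float(np.exp(res.x[1]))
```

The mixed model of the paper additionally places random effects on λ to capture the serial dependence within patients.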
Fitting monthly Peninsula Malaysian rainfall using Tweedie distribution
NASA Astrophysics Data System (ADS)
Yunus, R. M.; Hasan, M. M.; Zubairi, Y. Z.
2017-09-01
In this study, the Tweedie distribution was used to fit monthly rainfall data from 24 monitoring stations of Peninsular Malaysia for the period from January 2008 to April 2015. The aim of the study is to determine whether distributions within the Tweedie family fit the monthly Malaysian rainfall data well. Within the Tweedie family, the gamma distribution is generally used for fitting rainfall totals; however, the Poisson-gamma distribution is more useful for describing two important features of the rainfall pattern: the occurrences (dry months) and the amounts (wet months). First, the appropriate distribution of the monthly rainfall was identified within the Tweedie family for each station. Then, a Tweedie generalised linear model (GLM) with no explanatory variable was used to model the monthly rainfall data. Graphical representation was used to assess model appropriateness. The QQ plots of quantile residuals show that the Tweedie models fit the monthly rainfall data better for the majority of the stations on the west coast and in the midlands than for those on the east coast of the Peninsula. This finding suggests that the best-fitting distribution depends on the geographical location of the monitoring station. In this paper, a simple model is developed for generating synthetic rainfall data for use in various areas, including agriculture and irrigation. We have shown that data simulated using the Tweedie distribution have a frequency histogram fairly similar to that of the actual data. Both the mean number of rainfall events and the mean amount of rain for a month were estimated simultaneously in the case where the Poisson-gamma distribution fits the data reasonably well. Thus, this work complements previous studies that fit the rainfall amount and the occurrence of rainfall events separately, each to a different distribution.
Li, Xian-Ying; Hu, Shi-Min
2013-02-01
Harmonic functions are the critical points of a Dirichlet energy functional, the linear projections of conformal maps. They play an important role in computer graphics, particularly for gradient-domain image processing and shape-preserving geometric computation. We propose Poisson coordinates, a novel transfinite interpolation scheme based on the Poisson integral formula, as a rapid way to estimate a harmonic function on a certain domain with desired boundary values. Poisson coordinates are an extension of the Mean Value coordinates (MVCs) which inherit their linear precision, smoothness, and kernel positivity. We give explicit formulas for Poisson coordinates in both continuous and 2D discrete forms. Superior to MVCs, Poisson coordinates are proved to be pseudoharmonic (i.e., they reproduce harmonic functions on n-dimensional balls). Our experimental results show that Poisson coordinates have lower Dirichlet energies than MVCs on a number of typical 2D domains (particularly convex domains). As well as presenting a formula, our approach provides useful insights for further studies on coordinates-based interpolation and fast estimation of harmonic functions.
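The Poisson integral formula that Poisson coordinates discretize can be sketched numerically on the unit disk: u(x) = (1/2π) ∫ K(x, θ) f(θ) dθ with kernel K = (1 − |x|²)/|x − e^{iθ}|². Reproducing the harmonic function u(x, y) = x from its boundary values cos θ illustrates the pseudoharmonic property; the quadrature size is an illustrative choice:

```python
# Sketch: discrete Poisson-kernel interpolation on the unit disk.
import numpy as np

def poisson_integral(x, boundary_f, n_quad=2000):
    """Evaluate the Poisson integral of boundary values at interior point x."""
    theta = np.linspace(0.0, 2.0 * np.pi, n_quad, endpoint=False)
    bpts = np.stack([np.cos(theta), np.sin(theta)], axis=1)    # boundary samples
    kernel = (1.0 - np.dot(x, x)) / np.sum((bpts - x) ** 2, axis=1)
    weights = kernel / kernel.sum()      # normalized weights: discrete "coordinates"
    return float(np.sum(weights * boundary_f(theta)))

x = np.array([0.3, 0.4])
u = poisson_integral(x, np.cos)          # boundary values of the harmonic function x
```

Because cos θ is the boundary trace of the harmonic function u(x, y) = x, the interpolated value should equal the first coordinate, 0.3.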
NASA Astrophysics Data System (ADS)
Watkins, Nicholas; Clarke, Richard; Freeman, Mervyn
2002-11-01
We discuss how the ideal formalism of computational mechanics can be adapted to apply to a non-infinite series of corrupted and correlated data, as is typical of most observed natural time series. Specifically, a simple filter that removes the corruption that creates rare unphysical causal states is demonstrated, and the new concept of effective soficity is introduced. The benefits of these new concepts are demonstrated on simulated time series by (a) the effective elimination of white noise corruption from a periodic signal using the expletive filter and (b) the appearance of an effectively sofic region in the statistical complexity of a biased Poisson switch time series that is insensitive to changes in the word length (memory) used in the analysis. The new algorithm is then applied to the analysis of a real geomagnetic time series measured at Halley, Antarctica. Two principal components in the structure are detected, interpreted as the diurnal variation due to the rotation of the Earth-based station under an electrical current pattern that is fixed with respect to the Sun-Earth axis, and the random occurrence of a signature likely to be that of the magnetic substorm. In conclusion, a hypothesis is advanced about model construction in general (see also Clarke et al., arXiv:cond-mat/0110228).
Hazard function theory for nonstationary natural hazards
NASA Astrophysics Data System (ADS)
Read, L.; Vogel, R. M.
2015-12-01
Studies from the natural hazards literature indicate that many natural processes, including wind speeds, landslides, wildfires, precipitation, streamflow and earthquakes, show evidence of nonstationary behavior such as trends in magnitudes through time. Traditional probabilistic analysis of natural hazards based on partial duration series (PDS) generally assumes stationarity in the magnitudes and arrivals of events, i.e. that the probability of exceedance is constant through time. Given evidence of trends and the consequent expected growth in devastating impacts from natural hazards across the world, new methods are needed to characterize their probabilistic behavior. The field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (x) with its failure time series (t), enabling computation of corresponding average return periods and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose PDS magnitudes are assumed to follow the widely applied Poisson-GP model. We derive a 2-parameter Generalized Pareto hazard model and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard event series x, with corresponding failure time series t, should have application to a wide class of natural hazards.
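The qualitative effect of nonstationarity on exceedance probability and return period can be sketched with a Generalized Pareto model whose scale trends upward over time. All parameter values below are illustrative stand-ins, not the paper's derivation:

```python
# Sketch: under a GP magnitude model with a trending scale, the probability of
# exceeding a fixed design magnitude grows and the effective return period falls.
import numpy as np
from scipy.stats import genpareto

design = 50.0                        # fixed design magnitude (threshold excess)
xi = 0.1                             # GP shape parameter
years = np.arange(50)
scale = 10.0 * (1.0 + 0.02 * years)  # assumed 2%/yr upward trend in GP scale

p_exceed = genpareto.sf(design, xi, scale=scale)   # exceedance probability per event
return_period = 1.0 / p_exceed                     # stationary-analogue return period
```

Hazard function analysis formalizes exactly this time dependence: the exceedance probability, and hence the reliability over a planning horizon, is no longer constant from year to year.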
NASA Technical Reports Server (NTRS)
Dalee, Robert C.; Bacskay, Allen S.; Knox, James C.
1990-01-01
An overview of the CASE/A-ECLSS series modeling package is presented. CASE/A is an analytical tool that has delivered engineering productivity gains during ECLSS design activities. A component verification program was performed to assure component modeling validity, based on test data from the Phase II comparative test program completed at the Marshall Space Flight Center. An integrated plotting feature has been added to the program which allows the operator to analyze on-screen data trends or obtain hard-copy plots from within the CASE/A operating environment. New command features in the areas of schematic, output, and model management, and component data editing have been incorporated to enhance the engineer's productivity during a modeling program.
Multiscale recurrence quantification analysis of order recurrence plots
NASA Astrophysics Data System (ADS)
Xu, Mengjia; Shang, Pengjian; Lin, Aijing
2017-03-01
In this paper, we propose a new method, multiscale recurrence quantification analysis (MSRQA), to analyze the structure of order recurrence plots. The MSRQA is based on order patterns over a range of time scales. Compared with conventional recurrence quantification analysis (RQA), the MSRQA reveals richer and more recognizable information on the local characteristics of diverse systems and successfully describes their recurrence properties. Both synthetic series and stock market indexes exhibit recurrence properties at large time scales that differ considerably from those at a single time scale, and some systems present more accurate recurrence patterns at large time scales. This demonstrates that the new approach is effective for distinguishing three similar stock market systems and for revealing some of their inherent differences.
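A single-scale order recurrence plot, the object MSRQA builds on, can be sketched as follows: two time points recur when their length-m ordinal patterns coincide, and RQA statistics such as the recurrence rate are read off the resulting binary matrix. The embedding length and the noisy periodic test series are illustrative choices:

```python
# Sketch: order recurrence plot from ordinal patterns and its recurrence rate.
import numpy as np

def ordinal_patterns(x, m=3):
    """Map each length-m window to the permutation that sorts its values."""
    return np.array([np.argsort(x[i:i + m]) for i in range(len(x) - m + 1)])

rng = np.random.default_rng(7)
t = np.arange(400)
x = np.sin(2.0 * np.pi * t / 25) + 0.1 * rng.normal(size=400)  # noisy periodic series

P = ordinal_patterns(x, m=3)
R = (P[:, None, :] == P[None, :, :]).all(axis=2)   # order recurrence matrix
recurrence_rate = float(R.mean())                  # simplest RQA measure
```

MSRQA would repeat this construction after coarse-graining the series over a range of time scales and compare the resulting RQA measures across scales.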
Fedosov’s formal symplectic groupoids and contravariant connections
NASA Astrophysics Data System (ADS)
Karabegov, Alexander V.
2006-10-01
Using Fedosov's approach we give a geometric construction of a formal symplectic groupoid over any Poisson manifold endowed with a torsion-free Poisson contravariant connection. In the case of Kähler-Poisson manifolds this construction provides, in particular, the formal symplectic groupoids with separation of variables. We show that the dual of a semisimple Lie algebra does not admit torsion-free Poisson contravariant connections.
Complete synchronization of the global coupled dynamical network induced by Poisson noises.
Guo, Qing; Wan, Fangyi
2017-01-01
Complete synchronization of a globally coupled dynamical network induced by different Poisson noises is investigated. Based on the stability theory of stochastic differential equations driven by a Poisson process, we prove that Poisson noises can induce synchronization, and sufficient conditions are established to achieve complete synchronization with probability 1. Furthermore, numerical examples are provided to show the agreement between the theoretical and numerical analyses.
NASA Astrophysics Data System (ADS)
Chakraborty, Souvik; Mondal, Debabrata; Motalab, Mohammad
2016-07-01
In the present study, the stress-strain behavior of the human anterior cruciate ligament (ACL) is studied under uniaxial loads applied at various strain rates. Tensile testing of human ACL samples requires state-of-the-art test facilities; moreover, the difficulty of obtaining human ligament for testing purposes means archival data are very limited. Nominal stress vs. deformation gradient plots for different strain rates, as found in the literature, are used to model the material behavior either as a hyperelastic or as a viscoelastic material. The well-known five-parameter Mooney-Rivlin constitutive model for hyperelastic material and the Prony series model for viscoelastic material are used, and the objective of the analyses is to determine the model constants and their variation trend with strain rate for the human ACL material using a non-linear curve fitting tool. The relationship between the model constants and strain rate, using the hyperelastic Mooney-Rivlin model, has been obtained: the variation of each model constant with strain rate is plotted and then fitted using the software package MATLAB, yielding a power law relationship between each model constant and strain rate. The obtained material model for the human ACL can be implemented in any commercial finite element software package for stress analysis.
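The final fitting step described above can be sketched with a generic nonlinear least-squares power-law fit, c(r) = a·r^b, between a model constant and strain rate. The paper used MATLAB's curve fitting tools; the "constant" values below are synthetic stand-ins, not the paper's Mooney-Rivlin coefficients:

```python
# Sketch: power-law fit of a (synthetic) Mooney-Rivlin constant vs. strain rate.
import numpy as np
from scipy.optimize import curve_fit

rates = np.array([0.01, 0.1, 1.0, 10.0, 100.0])   # strain rates, 1/s (illustrative)
# Synthetic stand-in for a fitted constant, with 1% multiplicative noise
c10 = 2.0 * rates**0.15 * (1.0 + np.random.default_rng(6).normal(0.0, 0.01, 5))

def power_law(r, a, b):
    return a * r**b

(a_hat, b_hat), _ = curve_fit(power_law, rates, c10, p0=[1.0, 0.1])
```

With the fitted (a, b) for every model constant, the constitutive model can be evaluated at any intermediate strain rate before being passed to a finite element package.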
Geologic map of the Devore 7.5' quadrangle, San Bernardino County, California
Morton, Douglas M.; Matti, Jonathan C.
2001-01-01
This Open-File Report contains a digital geologic map database of the Devore 7.5' quadrangle, San Bernardino County, California, that includes: 1. ARC/INFO (Environmental Systems Research Institute) version 7.2.1 coverages of the various components of the geologic map 2. A PostScript (.ps) file to plot the geologic map on a topographic base, containing a Correlation of Map Units diagram, a Description of Map Units, an index map, and a regional structure map 3. Portable Document Format (.pdf) files of: a. This Readme; includes an Appendix, containing metadata details found in devre_met.txt b. The same graphic as plotted in 2 above. (Test plots from this .pdf do not produce 1:24,000-scale maps. Adobe Acrobat page-size settings control map scale.) The Correlation of Map Units and Description of Map Units are in the editorial format of USGS Miscellaneous Investigations Series maps (I-maps) but have not been edited to comply with I-map standards. Within the geologic-map data package, map units are identified by such standard geologic-map criteria as formation name, age, and lithology. Even though this is an author-prepared report, every attempt has been made to closely adhere to the stratigraphic nomenclature of the U.S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3b above) or plotting the postscript file (2 above). If roads in some areas, especially forest roads that parallel topographic contours, do not show well on plots of the geologic map, we recommend use of the USGS Devore 7.5’ topographic quadrangle in conjunction with the geologic map.
Xu, Stanley; Hambidge, Simon J; McClure, David L; Daley, Matthew F; Glanz, Jason M
2013-08-30
In the examination of the association between vaccines and rare adverse events after vaccination in postlicensure observational studies, it is challenging to define appropriate risk windows because prelicensure RCTs provide little insight on the timing of specific adverse events. Past vaccine safety studies have often used prespecified risk windows based on prior publications, biological understanding of the vaccine, and expert opinion. Recently, a data-driven approach was developed to identify appropriate risk windows for vaccine safety studies that use the self-controlled case series design. This approach employs both the maximum incidence rate ratio and the linear relation between the estimated incidence rate ratio and the inverse of average person time at risk, given a specified risk window. In this paper, we present a scan statistic that can identify appropriate risk windows in vaccine safety studies using the self-controlled case series design while taking into account the dependence of time intervals within an individual and while adjusting for time-varying covariates such as age and seasonality. This approach uses the maximum likelihood ratio test based on fixed-effects models, which has been used for analyzing data from self-controlled case series design in addition to conditional Poisson models. Copyright © 2013 John Wiley & Sons, Ltd.
Lord, Dominique; Guikema, Seth D; Geedipally, Srinivas Reddy
2008-05-01
This paper documents the application of the Conway-Maxwell-Poisson (COM-Poisson) generalized linear model (GLM) for modeling motor vehicle crashes. The COM-Poisson distribution, originally developed in 1962, has recently been re-introduced by statisticians for analyzing count data subject to over- and under-dispersion. This innovative distribution is an extension of the Poisson distribution. The objectives of this study were to evaluate the application of the COM-Poisson GLM for analyzing motor vehicle crashes and to compare the results with the traditional negative binomial (NB) model. The comparison analysis was carried out using the most common functional forms employed by transportation safety analysts, which link crashes to the entering flows at intersections or on segments. To accomplish the objectives of the study, several NB and COM-Poisson GLMs were developed and compared using two datasets. The first dataset contained crash data collected at signalized four-legged intersections in Toronto, Ont. The second dataset included data collected for rural four-lane divided and undivided highways in Texas. Several methods were used to assess the statistical fit and predictive performance of the models. The results of this study show that COM-Poisson GLMs perform as well as NB models in terms of goodness-of-fit statistics and predictive performance. Given that the COM-Poisson distribution can also handle under-dispersed data, which have sometimes been observed in crash databases (while the NB distribution cannot, or has difficulty converging), the COM-Poisson GLM offers a better alternative to the NB model for modeling motor vehicle crashes, especially given the important limitations recently documented in the safety literature about the latter type of model.
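The COM-Poisson pmf underlying the GLM, P(Y = k) ∝ λ^k/(k!)^ν, can be sketched directly: ν = 1 recovers the Poisson, ν < 1 gives over-dispersion, and ν > 1 gives under-dispersion. The truncation point for the normalizing constant is an illustrative numerical choice:

```python
# Sketch: COM-Poisson pmf via a truncated series for the normalizing constant.
import numpy as np
from scipy.special import gammaln

def com_poisson_pmf(k, lam, nu, kmax=200):
    ks = np.arange(kmax + 1)
    logw = ks * np.log(lam) - nu * gammaln(ks + 1)   # unnormalized log-weights
    logZ = np.logaddexp.reduce(logw)                 # normalizing constant Z(λ, ν)
    return np.exp(k * np.log(lam) - nu * gammaln(k + 1) - logZ)

k = np.arange(30)
p_equi = com_poisson_pmf(k, 4.0, 1.0)    # ν = 1: exactly Poisson(4)
p_under = com_poisson_pmf(k, 4.0, 1.5)   # ν > 1: under-dispersed

mean_e = float((k * p_equi).sum())
var_e = float(((k - mean_e) ** 2 * p_equi).sum())   # should equal the mean for ν = 1
mean_u = float((k * p_under).sum())
var_u = float(((k - mean_u) ** 2 * p_under).sum())  # variance below the mean scale
```

It is this extra dispersion parameter ν that lets the COM-Poisson GLM fit under-dispersed crash counts where the NB model breaks down.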
NASA Technical Reports Server (NTRS)
Lieneweg, Udo (Inventor)
1988-01-01
A system is provided for use with wafers that include multiple integrated circuits that include two conductive layers in contact at multiple interfaces. Contact chains are formed beside the integrated circuits, each contact chain formed of the same two layers as the circuits, in the form of conductive segments alternating between the upper and lower layers and with the ends of the segments connected in series through interfaces. A current source passes a current through the series-connected segments, by way of a pair of current tabs connected to opposite ends of the series of segments. While the current flows, voltage measurements are taken between each of a plurality of pairs of voltage tabs, the two tabs of each pair connected to opposite ends of an interface that lies along the series-connected segments. A plot of interface conductances on a normal probability chart enables prediction of the yield of good integrated circuits from the wafer.
NASA Technical Reports Server (NTRS)
Lieneweg, U. (Inventor)
1986-01-01
A system is provided for use with wafers that include multiple integrated circuits that include two conductive layers in contact at multiple interfaces. Contact chains are formed beside the integrated circuits, each contact chain formed of the same two layers as the circuits, in the form of conductive segments alternating between the upper and lower layers and with the ends of the segments connected in series through interfaces. A current source passes a current through the series-connected segments, by way of a pair of current tabs connected to opposite ends of the series of segments. While the current flows, voltage measurements are taken between each of a plurality of pairs of voltage tabs, the two tabs of each pair connected to opposite ends of an interface that lies along the series-connected segments. A plot of interface conductances on a normal probability chart enables prediction of the yield of good integrated circuits from the wafer.
Neutron monitors and muon detectors for solar modulation studies: 2. ϕ time series
NASA Astrophysics Data System (ADS)
Ghelfi, A.; Maurin, D.; Cheminet, A.; Derome, L.; Hubert, G.; Melot, F.
2017-08-01
The level of solar modulation at different times (related to solar activity) is a central question of solar and galactic cosmic-ray physics. In the first paper of this series, we established a correspondence between the uncertainties on ground-based detector count rates and the parameter ϕ (modulation level in the force-field approximation) reconstructed from these count rates. In this second paper, we detail a procedure to obtain a reference ϕ time series from neutron monitor data. We show that we can achieve an unbiased and accurate ϕ reconstruction (Δϕ / ϕ ≃ 10%). We also discuss the potential of Bonner sphere spectrometers and muon detectors to provide ϕ time series. Two by-products of this calculation are updated ϕ values for the cosmic-ray database and a web interface to retrieve and plot ϕ from the 1950s to today (http://lpsc.in2p3.fr/crdb).
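The force-field approximation this series relies on can be sketched in a few lines. This is an illustrative toy for protons (kinetic energies and ϕ in GeV, proton rest mass assumed as 0.938 GeV, a made-up power-law local interstellar spectrum); the papers' actual reconstruction of ϕ from count rates is of course far more involved:

```python
def force_field(e_kin, phi, lis, m=0.938):
    """Force-field approximation: the modulated top-of-atmosphere flux
    at kinetic energy e_kin equals the local interstellar spectrum
    `lis` evaluated at e_kin + phi, scaled by the ratio of the
    relativistic momentum factors E(E + 2m)."""
    e_is = e_kin + phi
    return lis(e_is) * (e_kin * (e_kin + 2 * m)) / (e_is * (e_is + 2 * m))

# toy power-law LIS; phi = 0 leaves the spectrum unmodulated
lis = lambda e: e ** -2.7
```

A nonzero ϕ suppresses the low-energy flux relative to the interstellar spectrum, which is the effect the neutron monitor count rates trace.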
Whole-plant water flux in understory red maple exposed to altered precipitation regimes.
Wullschleger, Stan D.; Hanson, Paul J.; Tschaplinski, Tim J.
1998-02-01
Sap flow gauges were used to estimate whole-plant water flux for five stem-diameter classes of red maple (Acer rubrum L.) growing in the understory of an upland oak forest and exposed to one of three large-scale (0.64 ha) manipulations of soil water content. This Throughfall Displacement Experiment (TDE) used subcanopy troughs to intercept roughly 30% of the throughfall on a "dry" plot and a series of pipes to move this collected precipitation across an "ambient" plot and onto a "wet" plot. Saplings with a stem diameter larger than 10 cm lost water at rates 50-fold greater than saplings with a stem diameter of 1 to 2 cm (326 versus 6.4 mol H(2)O tree(-1) day(-1)). These size-class differences were driven largely by differences in leaf area and cross-sectional sapwood area, because rates of water flux expressed per unit leaf area (6.90 mol H(2)O m(-2) day(-1)) or sapwood area (288 mol H(2)O dm(-2) day(-1)) were similar among saplings of the five size classes. Daily and hourly rates of transpiration expressed per unit leaf area varied throughout much of the season, as did soil matrix potentials, and treatment differences due to the TDE were observed during two of the seven sampling periods. On July 6, midday rates of transpiration averaged 1.88 mol H(2)O m(-2) h(-1) for saplings in the "wet" plot, 1.22 mol H(2)O m(-2) h(-1) for saplings in the "ambient" plot, and 0.76 mol H(2)O m(-2) h(-1) for saplings in the "dry" plot. During the early afternoon of August 28, transpiration rates were sevenfold lower for saplings in the "dry" plot compared to saplings in the "wet" plot and 2.5-fold lower compared to saplings in the "ambient" plot. Treatment differences in crown conductance followed a pattern similar to that of transpiration, with values that averaged 60% lower for saplings in the "dry" plot compared to saplings in the "wet" plot and 35% lower compared to saplings in the "ambient" plot. Stomatal and boundary layer conductances were roughly equal in magnitude. 
Estimates of the decoupling coefficient (Omega) ranged between 0.64 and 0.72 for saplings in the three TDE treatment plots. We conclude that red maple saplings growing in the understory of an upland oak forest are responsive to their edaphic and climatic surroundings, and because of either their small stature or their shallow root distribution, or both, are likely to be impacted by precipitation changes similar to those predicted by global climate models.
Fujiyama, Toshifumi; Matsui, Chihiro; Takemura, Akimichi
2016-01-01
We propose a power-law growth and decay model for posting data to social networking services before and after social events. We model the time series structure of deviations from the power-law growth and decay with a conditional Poisson autoregressive (AR) model. Online postings related to social events are described by five parameters in the power-law growth and decay model, each of which characterizes different aspects of interest in the event. We assess the validity of parameter estimates in terms of confidence intervals, and compare various submodels based on likelihoods and information criteria. PMID:27505155
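A minimal simulation of a conditional Poisson autoregressive count series, in the spirit of the deviation model described, can be written with the standard library alone (hypothetical parameter names; this is a plain first-order Poisson AR, not the authors' five-parameter power-law growth and decay model):

```python
import random, math

def sample_poisson(lam, rng):
    """Knuth's multiplicative Poisson sampler (adequate for moderate lam > 0)."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while p > limit:
        k += 1
        p *= rng.random()
    return k - 1

def simulate_poisson_ar(n, omega, alpha, seed=0):
    """y_t ~ Poisson(omega + alpha * y_{t-1}): the conditional mean feeds
    back on the previous count; stationary mean is omega / (1 - alpha)
    for 0 <= alpha < 1."""
    rng = random.Random(seed)
    y = [sample_poisson(omega, rng)]
    for _ in range(n - 1):
        y.append(sample_poisson(omega + alpha * y[-1], rng))
    return y
```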
Characterizing the performance of the Conway-Maxwell Poisson generalized linear model.
Francis, Royce A; Geedipally, Srinivas Reddy; Guikema, Seth D; Dhavala, Soma Sekhar; Lord, Dominique; LaRocca, Sarah
2012-01-01
Count data are pervasive in many areas of risk analysis; deaths, adverse health outcomes, infrastructure system failures, and traffic accidents are all recorded as count events, for example. Risk analysts often wish to estimate the probability distribution for the number of discrete events as part of doing a risk assessment. Traditional count data regression models of the type often used in risk assessment for this problem suffer from limitations due to the assumed variance structure. A more flexible model based on the Conway-Maxwell Poisson (COM-Poisson) distribution was recently proposed, a model that has the potential to overcome the limitations of the traditional model. However, the statistical performance of this new model has not yet been fully characterized. This article assesses the performance of a maximum likelihood estimation method for fitting the COM-Poisson generalized linear model (GLM). The objectives of this article are to (1) characterize the parameter estimation accuracy of the MLE implementation of the COM-Poisson GLM, and (2) estimate the prediction accuracy of the COM-Poisson GLM using simulated data sets. The results of the study indicate that the COM-Poisson GLM is flexible enough to model under-, equi-, and overdispersed data sets with different sample mean values. The results also show that the COM-Poisson GLM yields accurate parameter estimates. The COM-Poisson GLM provides a promising and flexible approach for performing count data regression. © 2011 Society for Risk Analysis.
NASA Astrophysics Data System (ADS)
Eberle, Jonas; Urban, Marcel; Hüttich, Christian; Schmullius, Christiane
2014-05-01
Numerous datasets providing temperature information from meteorological stations or remote sensing satellites are available. However, the challenging issue is to search the archives and process the time series information for further analysis. These steps can be automated for each individual product, provided certain preconditions are met, e.g. data access through web services (HTTP, FTP) or the legal right to redistribute the datasets. Therefore a python-based package was developed to provide data access and data processing tools for MODIS Land Surface Temperature (LST) data, which is provided by the NASA Land Processes Distributed Active Archive Center (LPDAAC), as well as the Global Surface Summary of the Day (GSOD) and Global Historical Climatology Network (GHCN) daily datasets provided by the NOAA National Climatic Data Center (NCDC). The package to access and process the information is available as web services used by an interactive web portal for simple data access and analysis. Tools for time series analysis were linked to the system, e.g. time series plotting, decomposition, aggregation (monthly, seasonal, etc.), trend analysis, and breakpoint detection. Especially for temperature data, a plot was integrated for the comparison of two temperature datasets, based on the work by Urban et al. (2013). As a first result, a kernel density plot compares daily MODIS LST from the Aqua and Terra satellites with daily means from the GSOD and GHCN datasets. Without any data download or data processing, users can analyze different time series datasets in an easy-to-use web portal. As a first use case, we built up this complementary system with remotely sensed MODIS data and in situ measurements from meteorological stations for Siberia within the Siberian Earth System Science Cluster (www.sibessc.uni-jena.de). References: Urban, Marcel; Eberle, Jonas; Hüttich, Christian; Schmullius, Christiane; Herold, Martin. 2013.
"Comparison of Satellite-Derived Land Surface Temperature and Air Temperature from Meteorological Stations on the Pan-Arctic Scale." Remote Sens. 5, no. 5: 2348-2367. Further materials: Eberle, Jonas; Clausnitzer, Siegfried; Hüttich, Christian; Schmullius, Christiane. 2013. "Multi-Source Data Processing Middleware for Land Monitoring within a Web-Based Spatial Data Infrastructure for Siberia." ISPRS Int. J. Geo-Inf. 2, no. 3: 553-576.
Characterization of chaotic attractors under noise: A recurrence network perspective
NASA Astrophysics Data System (ADS)
Jacob, Rinku; Harikrishnan, K. P.; Misra, R.; Ambika, G.
2016-12-01
We undertake a detailed numerical investigation to understand how the addition of white and colored noise to a chaotic time series changes the topology and structure of the underlying attractor reconstructed from the time series. We use the methods and measures of the recurrence plot and the recurrence network generated from the time series for this analysis. We explicitly show that the addition of noise obscures the recurrence of trajectory points in phase space, which is the hallmark of every dynamical system. However, the structure of the attractor is found to be robust even up to high noise levels of 50%. An advantage of recurrence network measures over conventional nonlinear measures is that they can be applied to short and nonstationary time series data. Using the results obtained from the above analysis, we go on to analyse the light curves from a dominant black hole system and show that the recurrence network measures are capable of identifying the nature of noise contamination in a time series.
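The recurrence structure being analyzed can be illustrated with a bare-bones recurrence matrix. This sketch uses embedding dimension 1 for brevity; real recurrence plots work in a delay-embedded phase space with a tuned threshold `eps`:

```python
def recurrence_matrix(series, eps):
    """Binary recurrence matrix: R[i][j] = 1 when states i and j lie
    within eps of each other. Added noise tends to break up the
    diagonal line structures of a clean deterministic signal."""
    n = len(series)
    return [[1 if abs(series[i] - series[j]) < eps else 0
             for j in range(n)] for i in range(n)]

def recurrence_rate(R):
    """Fraction of recurrent pairs: the most basic recurrence-plot measure."""
    n = len(R)
    return sum(map(sum, R)) / float(n * n)
```

By construction the matrix is symmetric with a recurrent main diagonal; network measures are obtained by treating R (minus the diagonal) as an adjacency matrix.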
Classification of time-series images using deep convolutional neural networks
NASA Astrophysics Data System (ADS)
Hatami, Nima; Gavet, Yann; Debayle, Johan
2018-04-01
Convolutional Neural Networks (CNNs) have achieved great success in image recognition by automatically learning a hierarchical feature representation from raw data. While the majority of the Time-Series Classification (TSC) literature is focused on 1D signals, this paper uses Recurrence Plots (RP) to transform time-series into 2D texture images and then takes advantage of a deep CNN classifier. Image representation of time-series introduces feature types that are not available for 1D signals, so TSC can be treated as a texture image recognition task. The CNN model also allows learning different levels of representation together with a classifier, jointly and automatically. Using RP and CNN in a unified framework is therefore expected to boost the recognition rate of TSC. Experimental results on the UCR time-series classification archive demonstrate competitive accuracy of the proposed approach, compared not only to existing deep architectures but also to state-of-the-art TSC algorithms.
Jarukanont, Daungruthai; Bonifas Arredondo, Imelda; Femat, Ricardo; Garcia, Martin E
2015-01-01
Chromaffin cells release catecholamines by exocytosis, a process that includes vesicle docking, priming and fusion. Although all these steps have been intensively studied, some aspects of their mechanisms, particularly those regarding vesicle transport to the active sites situated at the membrane, are still unclear. In this work, we show that it is possible to extract information on vesicle motion in Chromaffin cells from the combination of Langevin simulations and amperometric measurements. We developed a numerical model based on Langevin simulations of vesicle motion towards the cell membrane and on the statistical analysis of vesicle arrival times. We also performed amperometric experiments in bovine-adrenal Chromaffin cells under Ba2+ stimulation to capture neurotransmitter releases during sustained exocytosis. In the sustained phase, each amperometric peak can be related to a single release from a new vesicle arriving at the active site. The amperometric signal can then be mapped into a spike-series of release events. We normalized the spike-series resulting from the current peaks using a time-rescaling transformation, thus making signals coming from different cells comparable. We discuss why the obtained spike-series may contain information about the motion of all vesicles leading to release of catecholamines. We show that the release statistics in our experiments considerably deviate from Poisson processes. Moreover, the interspike-time probability is reasonably well described by two-parameter gamma distributions. In order to interpret this result we computed the vesicles' arrival statistics from our Langevin simulations. As expected, assuming purely diffusive vesicle motion we obtain Poisson statistics. However, if we assume that all vesicles are guided toward the membrane by an attractive harmonic potential, simulations also lead to gamma distributions of the interspike-time probability, in remarkably good agreement with experiment. 
We also show that including the fusion-time statistics in our model does not produce any significant changes on the results. These findings indicate that the motion of the whole ensemble of vesicles towards the membrane is directed and reflected in the amperometric signals. Our results confirm the conclusions of previous imaging studies performed on single vesicles that vesicles' motion underneath plasma membranes is not purely random, but biased towards the membrane.
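The Poisson-versus-gamma discrimination used above can be reproduced with a method-of-moments fit to interspike intervals (a simple sketch, not necessarily the estimator the authors used): a fitted shape parameter near 1 indicates exponential intervals, i.e. a Poisson process, while shape > 1 indicates the more regular, directed arrivals reported.

```python
def gamma_mom(intervals):
    """Method-of-moments fit of a two-parameter gamma distribution to
    interspike intervals: shape = mean^2 / var, scale = var / mean.
    Exponential intervals (a Poisson process) give shape ~ 1."""
    n = len(intervals)
    mean = sum(intervals) / n
    var = sum((x - mean) ** 2 for x in intervals) / n
    return mean * mean / var, var / mean
```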
A Method of Poisson's Ratio Imaging Within a Material Part
NASA Technical Reports Server (NTRS)
Roth, Don J. (Inventor)
1994-01-01
The present invention is directed to a method of displaying the Poisson's ratio image of a material part. In the present invention, longitudinal data is produced using a longitudinal wave transducer and shear wave data is produced using a shear wave transducer. The respective data is then used to calculate the Poisson's ratio for the entire material part. The Poisson's ratio approximations are then used to display the data.
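The underlying relation for isotropic materials, used when converting longitudinal and shear ultrasonic wave speeds into a Poisson's ratio value, can be sketched as follows (the numeric speeds are illustrative steel-like values, not from the patent):

```python
def poissons_ratio(v_long, v_shear):
    """Poisson's ratio of an isotropic solid from longitudinal and shear
    wave speeds: nu = (vl^2 - 2*vs^2) / (2 * (vl^2 - vs^2))."""
    vl2, vs2 = v_long ** 2, v_shear ** 2
    return (vl2 - 2.0 * vs2) / (2.0 * (vl2 - vs2))

# illustrative steel-like speeds in m/s
nu = poissons_ratio(5900.0, 3200.0)  # roughly 0.29
```

Applying this pointwise to co-registered longitudinal and shear measurements yields the Poisson's ratio image described.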
Method of Poisson's ratio imaging within a material part
NASA Technical Reports Server (NTRS)
Roth, Don J. (Inventor)
1996-01-01
The present invention is directed to a method of displaying the Poisson's ratio image of a material part. In the present invention, longitudinal data is produced using a longitudinal wave transducer and shear wave data is produced using a shear wave transducer. The respective data is then used to calculate the Poisson's ratio for the entire material part. The Poisson's ratio approximations are then used to display the image.
NASA Astrophysics Data System (ADS)
Zhong, Jie; Zhao, Honggang; Yang, Haibin; Yin, Jianfei; Wen, Jihong
2018-06-01
Rubbery coatings embedded with air cavities are commonly used on underwater structures to reduce reflection of incoming sound waves. In this paper, the relationships between the Poisson's and modulus loss factors of rubbery materials are theoretically derived, and the distinct effects of a tiny Poisson's loss factor on the loss factors of the shear and longitudinal moduli are revealed. Given a complex Young's modulus and dynamic Poisson's ratio, it is found that the shear loss factor varies almost imperceptibly with the Poisson's loss factor and stays very close to the loss factor of Young's modulus, while the longitudinal loss factor decreases almost linearly with increasing Poisson's loss factor. Then, a finite element (FE) model is used to investigate the effect of the tiny Poisson's loss factor, which is generally neglected in some FE models, on the underwater sound absorption of rubbery coatings. Results show that the tiny Poisson's loss factor has a significant effect on the sound absorption of homogeneous coatings within the frequency range of interest, while it has both frequency- and structure-dependent influence on the sound absorption of inhomogeneous coatings with embedded air cavities. Given the material parameters and cavity dimensions, a more pronounced effect is observed for rubbery coatings with a larger lattice constant and/or a thicker cover layer.
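The moduli relationships described can be checked numerically with complex arithmetic. This is a sketch under an assumed sign convention (E* = E(1 + i·eta_E), nu* = nu(1 - i·eta_nu)) with illustrative rubber-like values, using standard isotropic elasticity identities:

```python
def loss_factors(E, eta_E, nu, eta_nu):
    """Shear and longitudinal loss factors (Im/Re of the complex moduli)
    from complex Young's modulus E*(1 + i*eta_E) and complex Poisson's
    ratio nu*(1 - i*eta_nu); isotropic elasticity assumed."""
    Ec = E * (1 + 1j * eta_E)
    nuc = nu * (1 - 1j * eta_nu)
    Gc = Ec / (2 * (1 + nuc))                          # shear modulus
    Mc = Ec * (1 - nuc) / ((1 + nuc) * (1 - 2 * nuc))  # longitudinal modulus
    return Gc.imag / Gc.real, Mc.imag / Mc.real
```

With eta_nu = 0 both loss factors equal eta_E exactly; a small positive eta_nu leaves the shear loss factor essentially unchanged while visibly reducing the longitudinal one, consistent with the trend reported above (the effect grows sharply as nu approaches 0.5).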
A Review of Multivariate Distributions for Count Data Derived from the Poisson Distribution
Inouye, David; Yang, Eunho; Allen, Genevera; Ravikumar, Pradeep
2017-01-01
The Poisson distribution has been widely studied and used for modeling univariate count-valued data. Multivariate generalizations of the Poisson distribution that permit dependencies, however, have been far less popular. Yet, real-world high-dimensional count-valued data found in word counts, genomics, and crime statistics, for example, exhibit rich dependencies, and motivate the need for multivariate distributions that can appropriately model this data. We review multivariate distributions derived from the univariate Poisson, categorizing these models into three main classes: 1) where the marginal distributions are Poisson, 2) where the joint distribution is a mixture of independent multivariate Poisson distributions, and 3) where the node-conditional distributions are derived from the Poisson. We discuss the development of multiple instances of these classes and compare the models in terms of interpretability and theory. Then, we empirically compare multiple models from each class on three real-world datasets that have varying data characteristics from different domains, namely traffic accident data, biological next generation sequencing data, and text data. These empirical experiments develop intuition about the comparative advantages and disadvantages of each class of multivariate distribution that was derived from the Poisson. Finally, we suggest new research directions as explored in the subsequent discussion section. PMID:28983398
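The first class above (Poisson marginals with dependence) is easy to demonstrate with the classic common-shock construction; a stdlib-only sketch follows (Knuth's sampler is assumed adequate for the small rates used here):

```python
import random, math

def _pois(lam, rng):
    """Knuth's multiplicative Poisson sampler for lam > 0."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while p > limit:
        k += 1
        p *= rng.random()
    return k - 1

def bivariate_poisson(n, lam1, lam2, lam0, seed=0):
    """Common-shock construction: Y1 = X1 + X0, Y2 = X2 + X0 with
    independent Poisson X's. The marginals are Poisson(lam_i + lam0)
    and Cov(Y1, Y2) = lam0 >= 0."""
    rng = random.Random(seed)
    pairs = []
    for _ in range(n):
        x0 = _pois(lam0, rng)
        pairs.append((_pois(lam1, rng) + x0, _pois(lam2, rng) + x0))
    return pairs
```

A known limitation, noted in the review literature, is that this construction only permits non-negative dependence.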
NASA Astrophysics Data System (ADS)
Cooper, Christopher D.; Barba, Lorena A.
2016-05-01
Interactions between surfaces and proteins occur in many vital processes and are crucial in biotechnology: the ability to control specific interactions is essential in fields like biomaterials, biomedical implants and biosensors. In the latter case, biosensor sensitivity hinges on ligand proteins adsorbing on bioactive surfaces with a favorable orientation, exposing reaction sites to target molecules. Protein adsorption, being a free-energy-driven process, is difficult to study experimentally. This paper develops and evaluates a computational model to study electrostatic interactions of proteins and charged nanosurfaces, via the Poisson-Boltzmann equation. We extended the implicit-solvent model used in the open-source code PyGBe to include surfaces of imposed charge or potential. This code solves the boundary integral formulation of the Poisson-Boltzmann equation, discretized with surface elements. PyGBe has at its core a treecode-accelerated Krylov iterative solver, resulting in O(N log N) scaling, with further acceleration on hardware via multi-threaded execution on GPUs. It computes solvation and surface free energies, providing a framework for studying the effect of electrostatics on adsorption. We derived an analytical solution for a spherical charged surface interacting with a spherical dielectric cavity, and used it in a grid-convergence study to build evidence on the correctness of our approach. The study showed the error decaying with the average area of the boundary elements, i.e., the method is O(1/N), which is consistent with our previous verification studies using PyGBe. We also studied grid-convergence using a real molecular geometry (protein G B1 D4′), in this case using Richardson extrapolation (in the absence of an analytical solution) and confirmed the O(1/N) scaling. With this work, we can now access a completely new family of problems, which no other major bioelectrostatics solver, e.g. APBS, is capable of dealing with.
PyGBe is open-source under an MIT license and is hosted under version control at https://github.com/barbagroup/pygbe. To supplement this paper, we prepared "reproducibility packages" consisting of running and post-processing scripts in Python for replicating the grid-convergence studies, all the way to generating the final plots, with a single command.
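The Richardson extrapolation used for the grid-convergence study (in the absence of an analytical solution) amounts to the following sketch, assuming the error behaves as C·h^p on grids refined by a constant ratio:

```python
import math

def observed_order(f_coarse, f_mid, f_fine, ratio):
    """Observed order of convergence p from three solutions on
    successively refined grids (refinement ratio > 1)."""
    return math.log(abs((f_coarse - f_mid) / (f_mid - f_fine))) / math.log(ratio)

def richardson_extrapolate(f_mid, f_fine, ratio, p):
    """Extrapolated grid-independent value assuming error ~ C * h^p."""
    return f_fine + (f_fine - f_mid) / (ratio ** p - 1)
```

For example, synthetic values f(h) = 5 + 2h at h = 4, 2, 1 give an observed order of 1 (matching the O(1/N) behavior reported) and an extrapolated value of 5.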
Non-linear properties of metallic cellular materials with a negative Poisson's ratio
NASA Technical Reports Server (NTRS)
Choi, J. B.; Lakes, R. S.
1992-01-01
Negative Poisson's ratio copper foam was prepared and characterized experimentally. The transformation into re-entrant foam was accomplished by applying sequential permanent compressions above the yield point to achieve a triaxial compression. The Poisson's ratio of the re-entrant foam depended on strain and attained a relative minimum at strains near zero. Poisson's ratio as small as -0.8 was achieved. The strain dependence of properties occurred over a narrower range of strain than in the polymer foams studied earlier. Annealing of the foam resulted in a slightly greater magnitude of negative Poisson's ratio and greater toughness at the expense of a decrease in the Young's modulus.
The use of multiple imputation in the Southern Annual Forest Inventory System
Gregory A. Reams; Joseph M. McCollum
2000-01-01
The Southern Research Station is currently implementing an annual forest survey in 7 of the 13 States that it is responsible for surveying. The Southern Annual Forest Inventory System (SAFIS) sampling design is a systematic sample of five interpenetrating grids, whereby an equal number of plots are measured each year. The area-representative and time-series...
Relationships between overstory composition and gypsy moth egg-mass density
Robert W. Campbell
1974-01-01
Most of the silvicultural recommendations for reducing the hazard of gypsy moth outbreaks have been based in part on the premise that gypsy moth density levels are related closely to the proportion of favored food trees in the overstory. This premise did not prove to be true for a series of plots observed in eastern New England between 1911 and 1931.
An assessment of Japanese stiltgrass in northern U.S. forests
Cassandra M. Kurtz; Mark H. Hansen
2017-01-01
This publication is part of a series of research notes that provides an overview of the presence of invasive plant species monitored on an extensive systematic network of plots measured by the Forest Inventory and Analysis (FIA) program of the U.S. Forest Service, Northern Research Station (NRS). Each research note features one of the invasive plants monitored on...
Charles B. Halpern; Joseph A. Antos; Janine M. Rice; Ryan D. Haugo; Nicole L. Lang
2010-01-01
We combined spatial point pattern analysis, population age structures, and a time-series of stem maps to quantify spatial and temporal patterns of conifer invasion over a 200-yr period in three plots totaling 4 ha. In combination, spatial and temporal patterns of establishment suggest an invasion process shaped by biotic interactions, with facilitation promoting...
Quantifying Forest Ground Flora Biomass Using Close-range Remote Sensing
Paul F. Doruska; Robert C. Weih; Matthew D. Lane; Don C. Bragg
2005-01-01
Close-range remote sensing was used to estimate biomass of forest ground flora in Arkansas. Digital images of a series of 1-m² plots were taken using Kodak DCS760 and Kodak DCS420CIR digital cameras. ESRI ArcGIS and ERDAS Imagine® software was used to calculate the Normalized Difference Vegetation Index (NDVI) and the Average Visible...
Photo Series for Estimating Post-Hurricane Residues and Fire Behavior in Southern Pine
Dale D. Wade; James K. Forbus; James M. Saveland
1993-01-01
Following Hurricane Hugo, fuels were sampled on nine 2-acre blocks which were then burned during the spring wildfire season. The study was superimposed on dormant-season fire-interval research plots established in 1958 on the Francis Marion National Forest near Charleston, SC. Photographs of preburn fuel loads, fire behavior, and postburn fuel loads were taken to...
G. Starr; C. L. Staudhammer; H. W. Loescher; R. Mitchell; A. Whelan; J. K. Hiers; J. J. O’Brien
2015-01-01
Frequency and intensity of fire determines the structure and regulates the function of savanna ecosystems worldwide, yet our understanding of prescribed fire impacts on carbon in these systems is rudimentary. We combined eddy covariance (EC) techniques and fuel consumption plots to examine the short-term response of longleaf pine forest carbon dynamics to one...
Non-Poisson Processes: Regression to Equilibrium Versus Equilibrium Correlation Functions
2004-07-07
Physica A 347 (2005) 268-288 (article in press), www.elsevier.com/locate/physa. Non-Poisson processes: regression... 05.40.a; 89.75.k; 02.50.Ey. Keywords: Stochastic processes; Non-Poisson processes; Liouville and Liouville-like equations; Correlation function. ...which is not legitimate with renewal non-Poisson processes, is a correct property if the deviation from the exponential relaxation is obtained by time...
Probabilistic Estimation of Rare Random Collisions in 3 Space
2009-03-01
...extended Poisson process as a feature of probability theory. With the bulk of research in extended Poisson processes going into parameter estimation, the... application of extended Poisson processes to spatial processes is largely untouched. Faddy performed a short study of spatial data, but overtly... the theory of extended Poisson processes. To date, the processes are limited in that the rates only depend on the number of arrivals at some time...
NASA Astrophysics Data System (ADS)
Mahmoud, Adel K.; Hammoudi, Zaid S.; Rasheed, Samah
2018-02-01
This paper aims to measure the residual stresses in wear-protection coatings using the sin²ψ method based on the X-ray diffraction technique. The wear-protection coating used in this study was a composite coating of 95 wt% Al2O3-5 wt% SiC, with an AlNi alloy bond coat, produced by flame spraying on a mild steel substrate. The diffraction angle 2θ is measured experimentally, and the lattice spacing is then calculated from the diffraction angle and the known X-ray wavelength using Bragg's law. Once the d-spacing values are known, they can be plotted versus sin²ψ (where ψ is the tilt angle). This paper includes stress measurement of samples that exhibit linear behavior, as in the case of a homogeneous isotropic sample in a biaxial stress state; the plot of d-spacing versus sin²ψ is then a straight line whose slope is proportional to stress. A second set of samples showed oscillatory d-spacing versus sin²ψ behaviour, indicating the presence of an inhomogeneous stress distribution. In this case the X-ray elastic constants must be used instead of the bulk Young's modulus (E) and Poisson's ratio (ν) values; these constants can be obtained from the literature for a given material and reflection combination. The residual stress calculated for the present coating was compressive (-325.6758 MPa).
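The slope-to-stress step for the linear (homogeneous, biaxial) case can be sketched as a least-squares fit of d-spacing against sin²ψ. This is illustrative only, using bulk E and ν as the abstract does for the linear case; the oscillatory case requires X-ray elastic constants instead:

```python
import math

def sin2psi_stress(psis_deg, d_spacings, d0, E, nu):
    """Residual stress from the sin^2(psi) method: least-squares slope
    of d-spacing vs sin^2(psi), then sigma = (E / ((1 + nu) * d0)) * slope.
    Assumes a biaxial stress state and isotropic elastic constants."""
    xs = [math.sin(math.radians(p)) ** 2 for p in psis_deg]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(d_spacings) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, d_spacings)) / \
            sum((x - mx) ** 2 for x in xs)
    return E / ((1 + nu) * d0) * slope
```

A negative slope (d-spacing shrinking with tilt) yields a negative, i.e. compressive, stress, as found for the coating above.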
Schoenly, Kenneth G; Cohen, Michael B; Barrion, Alberto T; Zhang, Wenjun; Gaolach, Bradley; Viajante, Vicente D
2003-01-01
Endotoxins from Bacillus thuringiensis (Bt) produced in transgenic pest-resistant Bt crops are generally not toxic to predatory and parasitic arthropods. However, elimination of Bt-susceptible prey and hosts in Bt crops could reduce predator and parasitoid abundance and thereby disrupt biological control of other herbivorous pests. Here we report results of a field study evaluating the effects of Bt sprays on non-target terrestrial herbivore and natural enemy assemblages from three rice (Oryza sativa L.) fields on Luzon Island, Philippines. Because of restrictions on field-testing of transgenic rice, Bt sprays were used to remove foliage-feeding lepidopteran larvae that would be targeted by Bt rice. Data from a 546-taxa Philippines-wide food web, matched abundance plots, species accumulation curves, time-series analysis, and ecostatistical tests for species richness and ranked abundance were used to compare different subsets of non-target herbivores, predators, and parasitoids in Bt sprayed and water-sprayed (control) plots. For whole communities of terrestrial predators and parasitoids, Bt sprays altered parasitoid richness in 3 of 3 sites and predator richness in 1 of 3 sites, as measured by rarefaction (in half of these cases, richness was greater in Bt plots), while Spearman tests on ranked abundances showed that correlations, although significantly positive between all treatment pairs, were stronger for predators than for parasitoids, suggesting that parasitoid complexes may have been more sensitive than predators to the effects of Bt sprays. 
Species accumulation curves and time-series analyses of population trends revealed no evidence that Bt sprays altered the overall buildup of predator or parasitoid communities or population trajectories of non-target herbivores (planthoppers and leafhoppers) nor was evidence found for bottom-up effects in total abundances of non-target species identified in the food web from the addition of spores in the Bt spray formulation. When the same methods were applied to natural enemies (predators and parasitoids) of foliage-feeding lepidopteran and non-lepidopteran (homopteran, hemipteran and dipteran) herbivores, significant differences between treatments were detected in 7 of 12 cases. However, no treatment differences were found in mean abundances of these natural enemies, either in time-series plots or in total (seasonal) abundance. Analysis of guild-level trajectories revealed population behavior and treatment differences that could not be predicted in whole-community studies of predators and parasitoids. A more conclusive test of the impact of Bt rice will require field experiments with transgenic plants, conducted in a range of Asian environments, and over multiple cropping seasons.
Poisson-type inequalities for growth properties of positive superharmonic functions.
Luan, Kuan; Vieira, John
2017-01-01
In this paper, we present new Poisson-type inequalities for Poisson integrals with continuous data on the boundary. The obtained inequalities are used to obtain growth properties at infinity of positive superharmonic functions in a smooth cone.
Information transmission using non-poisson regular firing.
Koyama, Shinsuke; Omi, Takahiro; Kass, Robert E; Shinomoto, Shigeru
2013-04-01
In many cortical areas, neural spike trains do not follow a Poisson process. In this study, we investigate a possible benefit of non-Poisson spiking for information transmission by studying the minimal rate fluctuation that can be detected by a Bayesian estimator. The idea is that an inhomogeneous Poisson process may make it difficult for downstream decoders to resolve subtle changes in rate fluctuation, but by using a more regular non-Poisson process, the nervous system can make rate fluctuations easier to detect. We evaluate the degree to which regular firing reduces the rate fluctuation detection threshold. We find that the threshold for detection is reduced in proportion to the coefficient of variation of interspike intervals.
A GENERAL ALGORITHM FOR THE CONSTRUCTION OF CONTOUR PLOTS
NASA Technical Reports Server (NTRS)
Johnson, W.
1994-01-01
The graphical presentation of experimentally or theoretically generated data sets frequently involves the construction of contour plots. A general computer algorithm has been developed for the construction of contour plots. The algorithm provides for efficient and accurate contouring with a modular approach which allows flexibility in modifying the algorithm for special applications. The algorithm accepts as input data values at a set of points irregularly distributed over a plane. The algorithm is based on an interpolation scheme in which the points in the plane are connected by straight line segments to form a set of triangles. In general, the data is smoothed using a least-squares-error fit of the data to a bivariate polynomial. To construct the contours, interpolation along the edges of the triangles is performed, using the bivariate polynomial if data smoothing was performed. Once the contour points have been located, the contour may be drawn. This program is written in FORTRAN IV for batch execution and has been implemented on an IBM 360 series computer with a central memory requirement of approximately 100K of 8-bit bytes. This computer algorithm was developed in 1981.
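The edge-interpolation step the abstract describes can be sketched as follows (a pure-Python illustration, not the FORTRAN IV program itself): for each triangle edge whose endpoint values bracket the contour level, linear interpolation locates the crossing point.

```python
import numpy as np

def contour_segment(tri_xy, tri_vals, level):
    """Locate where a contour level crosses the edges of one triangle.

    tri_xy: (3, 2) array of vertex coordinates
    tri_vals: (3,) data values at the vertices
    Returns the crossing points (0 or 2 for a non-degenerate level).
    """
    pts = []
    for i in range(3):
        j = (i + 1) % 3
        v0, v1 = tri_vals[i], tri_vals[j]
        # The level crosses this edge iff it lies strictly between the endpoint values.
        if (v0 - level) * (v1 - level) < 0:
            t = (level - v0) / (v1 - v0)          # linear interpolation parameter
            pts.append(tri_xy[i] + t * (tri_xy[j] - tri_xy[i]))
    return pts

# One triangle with values 0, 1, 2 at its corners; trace the 0.5 contour
tri = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
vals = np.array([0.0, 1.0, 2.0])
seg = contour_segment(tri, vals, 0.5)
print(seg)  # two points: (0.5, 0) on the bottom edge and (0, 0.25) on the left edge
```

Joining such crossing points across all triangles of the triangulation yields the full contour line.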
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mason, R.R.; Paul, H.G.
1996-09-01
Larval densities of the western spruce budworm (Choristoneura occidentalis Freeman) were monitored for 12 years (1984-95) on permanent sample plots in northeastern Oregon. The time series spanned a period of general budworm infestations when populations increased rapidly from low densities, plateaued for a time at high-outbreak densities, and then declined suddenly. Midway through the period (1988), an area with half of the sample plots was sprayed with the microbial insecticide Bacillus thuringiensis (B.t.) in an operational suppression project. The other sample plots were part of an untreated area. In the treated area, B.t. spray reduced numbers of larvae by more than 90 percent; however, populations returned to an outbreak density within 3 years. In the untreated area, populations remained at outbreak densities and continued to fluctuate due to natural feedback processes. Natural decline of the population (1992-95) in the monitored area was largely unexplained and coincided with an overall collapse of the budworm outbreak in the Blue Mountains.
Graphic Simulations of the Poisson Process.
1982-10-01
Contents include: II. Random Numbers and Transformations; The Random Number Generator; III. Poisson Processes User Guide. ... In the superimposed mode, two Poisson processes are active, each with a different rate parameter (call them Type I and Type II with respective ... occur. The value p is generated by the following equation, where L1 and L2 are the rates of the two Poisson processes: p = L1 / (L1 + L2). The value ...
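The superimposed mode can be reproduced in a few lines (a modern Python sketch, not the report's original simulation code): the merged stream of two independent Poisson processes is itself Poisson with rate L1 + L2, and each event is independently Type I with probability p = L1 / (L1 + L2).

```python
import numpy as np

rng = np.random.default_rng(0)
L1, L2 = 2.0, 3.0          # rates of the two Poisson processes (illustrative)
p = L1 / (L1 + L2)         # probability that any given event is Type I

def superimposed(T):
    """Simulate the merged process on [0, T]; return (event times, event types)."""
    times, types = [], []
    t = 0.0
    while True:
        t += rng.exponential(1.0 / (L1 + L2))   # merged inter-arrival time
        if t > T:
            break
        times.append(t)
        types.append(1 if rng.random() < p else 2)
    return np.array(times), np.array(types)

times, types = superimposed(T=1000.0)
print(round(np.mean(types == 1), 3))  # close to p = 0.4
```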
Soft network materials with isotropic negative Poisson's ratios over large strains.
Liu, Jianxing; Zhang, Yihui
2018-01-31
Auxetic materials with negative Poisson's ratios have important applications across a broad range of engineering areas, such as biomedical devices, aerospace engineering and automotive engineering. A variety of design strategies have been developed to achieve artificial auxetic materials with controllable responses in the Poisson's ratio. The development of designs that can offer isotropic negative Poisson's ratios over large strains can open up new opportunities in emerging biomedical applications, which, however, remains a challenge. Here, we introduce deterministic routes to soft architected materials that can be tailored precisely to yield the values of Poisson's ratio in the range from -1 to 1, in an isotropic manner, with a tunable strain range from 0% to ∼90%. The designs rely on a network construction in a periodic lattice topology, which incorporates zigzag microstructures as building blocks to connect lattice nodes. Combined experimental and theoretical studies on broad classes of network topologies illustrate the wide-ranging utility of these concepts. Quantitative mechanics modeling under both infinitesimal and finite deformations allows the development of a rigorous design algorithm that determines the necessary network geometries to yield target Poisson ratios over desired strain ranges. Demonstrative examples in artificial skin with both the negative Poisson's ratio and the nonlinear stress-strain curve precisely matching those of the cat's skin and in unusual cylindrical structures with engineered Poisson effect and shape memory effect suggest potential applications of these network materials.
Universal Poisson Statistics of mRNAs with Complex Decay Pathways.
Thattai, Mukund
2016-01-19
Messenger RNA (mRNA) dynamics in single cells are often modeled as a memoryless birth-death process with a constant probability per unit time that an mRNA molecule is synthesized or degraded. This predicts a Poisson steady-state distribution of mRNA number, in close agreement with experiments. This is surprising, since mRNA decay is known to be a complex process. The paradox is resolved by realizing that the Poisson steady state generalizes to arbitrary mRNA lifetime distributions. A mapping between mRNA dynamics and queueing theory highlights an identifiability problem: a measured Poisson steady state is consistent with a large variety of microscopic models. Here, I provide a rigorous and intuitive explanation for the universality of the Poisson steady state. I show that the mRNA birth-death process and its complex decay variants all take the form of the familiar Poisson law of rare events, under a nonlinear rescaling of time. As a corollary, not only steady-states but also transients are Poisson distributed. Deviations from the Poisson form occur only under two conditions, promoter fluctuations leading to transcriptional bursts or nonindependent degradation of mRNA molecules. These results place severe limits on the power of single-cell experiments to probe microscopic mechanisms, and they highlight the need for single-molecule measurements. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
1992-01-01
ν(M) and the correlation entropy K2(M) versus the embedding dimension M for both the linear and non-linear signals. Crosses refer to the linear signal ... dimensions, leading to a correlation dimension ν = 2.7. A similar structure was observed by Voges et al. [46] in the analysis of the X-ray variability of ... A0 + ηi, where A0 = 10 and ηi is uniformly random distributed, and its recurrence plots often indicate whether a meaningful correlation integral analysis ...
The solution of large multi-dimensional Poisson problems
NASA Technical Reports Server (NTRS)
Stone, H. S.
1974-01-01
The Buneman algorithm for solving Poisson problems can be adapted to solve large Poisson problems on computers with a rotating drum memory so that the computation is done with very little time lost due to rotational latency of the drum.
Modeling time-series data from microbial communities.
Ridenhour, Benjamin J; Brooker, Sarah L; Williams, Janet E; Van Leuven, James T; Miller, Aaron W; Dearing, M Denise; Remien, Christopher H
2017-11-01
As sequencing technologies have advanced, the amount of information regarding the composition of bacterial communities from various environments (for example, skin or soil) has grown exponentially. To date, most work has focused on cataloging taxa present in samples and determining whether the distribution of taxa shifts with exogenous covariates. However, important questions regarding how taxa interact with each other and their environment remain open thus preventing in-depth ecological understanding of microbiomes. Time-series data from 16S rDNA amplicon sequencing are becoming more common within microbial ecology, but methods to infer ecological interactions from these longitudinal data are limited. We address this gap by presenting a method of analysis using Poisson regression fit with an elastic-net penalty that (1) takes advantage of the fact that the data are time series; (2) constrains estimates to allow for the possibility of many more interactions than data; and (3) is scalable enough to handle data consisting of thousands of taxa. We test the method on gut microbiome data from white-throated woodrats (Neotoma albigula) that were fed varying amounts of the plant secondary compound oxalate over a period of 22 days to estimate interactions between OTUs and their environment.
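A minimal numpy sketch of Poisson regression with an elastic-net penalty, in the spirit of the method described: all data, penalty weights, and the proximal-gradient fitting scheme below are illustrative assumptions, not the authors' scalable implementation.

```python
import numpy as np

def poisson_elastic_net(X, y, alpha=0.01, l1_ratio=0.5, step=0.05, iters=3000):
    """Poisson regression (log link) with an elastic-net penalty, fit by
    proximal gradient descent. Column 0 of X (intercept) is left unpenalized."""
    n, p = X.shape
    b = np.zeros(p)
    for _ in range(iters):
        mu = np.exp(np.clip(X @ b, -20, 20))        # predicted Poisson rates
        grad = X.T @ (mu - y) / n                    # negative log-likelihood gradient
        grad[1:] += alpha * (1 - l1_ratio) * b[1:]   # ridge part of the penalty
        b -= step * grad
        thr = step * alpha * l1_ratio                # soft-threshold for the L1 part
        b[1:] = np.sign(b[1:]) * np.maximum(np.abs(b[1:]) - thr, 0.0)
    return b

# Synthetic count data: only features 1 and 2 truly affect the counts
rng = np.random.default_rng(1)
n = 500
X = np.column_stack([np.ones(n), rng.standard_normal((n, 5))])
b_true = np.array([0.5, 0.8, -0.5, 0.0, 0.0, 0.0])
y = rng.poisson(np.exp(X @ b_true))
b_hat = poisson_elastic_net(X, y)
print(np.round(b_hat, 2))  # near [0.5, 0.8, -0.5, 0, 0, 0]
```

The L1 part of the penalty drives irrelevant coefficients toward zero, which is what makes the approach workable when there are many more candidate interactions than data points.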
Modeling Polio Data Using the First Order Non-Negative Integer-Valued Autoregressive, INAR(1), Model
NASA Astrophysics Data System (ADS)
Vazifedan, Turaj; Shitan, Mahendran
Time series data may consist of counts, such as the number of road accidents, the number of patients in a certain hospital, or the number of customers waiting for service at a certain time. When the observed values are large, it is usual to use a Gaussian Autoregressive Moving Average (ARMA) process to model the time series. However, if the observed counts are small, it is not appropriate to use an ARMA process to model the observed phenomenon. In such cases we need to model the time series data by using a Non-Negative Integer-valued Autoregressive (INAR) process. The modeling of count data is based on the binomial thinning operator. In this paper we illustrate the modeling of count data using the monthly number of poliomyelitis cases in the United States between January 1970 and December 1983. We applied the AR(1) model, the Poisson regression model and the INAR(1) model, and the suitability of these models was assessed using the Index of Agreement (I.A.). We found that the INAR(1) model is more appropriate in the sense that it had a better I.A., and it is natural since the data are counts.
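The binomial thinning operator on which INAR models are built is easy to simulate (a sketch with illustrative parameters, not values fitted to the polio data): X_t = α ∘ X_{t-1} + ε_t, where α ∘ X counts the survivors of X independent Bernoulli(α) trials and ε_t is a Poisson innovation.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_inar1(alpha, lam, n, burn=200):
    """INAR(1): X_t = alpha o X_{t-1} + eps_t, with eps_t ~ Poisson(lam).
    'o' is binomial thinning: each of the X_{t-1} counts survives w.p. alpha."""
    x = np.empty(n + burn, dtype=int)
    x[0] = rng.poisson(lam / (1 - alpha))           # start near stationarity
    for t in range(1, n + burn):
        survivors = rng.binomial(x[t - 1], alpha)   # binomial thinning
        x[t] = survivors + rng.poisson(lam)         # add Poisson innovations
    return x[burn:]

x = simulate_inar1(alpha=0.6, lam=1.0, n=20000)
# The stationary mean of a Poisson INAR(1) is lam / (1 - alpha) = 2.5
print(round(x.mean(), 2))
```

Unlike a Gaussian AR(1), every simulated value is a non-negative integer, which is why the model suits small counts.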
Flint, M.R.; Bencala, K.E.; Zellweger, G.W.; Hammermeister, D.P.
1985-01-01
A twenty-four hour injection of chloride and sodium was made into Leviathan Creek, Alpine County, California to aid interpretation of the coupled interactions between physical transport processes and geochemical reactions. Leviathan Creek was chosen because it receives acid mine drainage from Leviathan Mine, an abandoned open-pit sulfur mine. Water samples were collected at 15 sites along a 4.39 kilometer reach and analyzed for chloride, sodium, sulfate and fluoride. Dissolved concentrations are presented in tabular format and time-series plots. Duplicate samples were analyzed by two laboratories: the Central Laboratory, Denver, Colorado and a research laboratory in Menlo Park, California. A tabular comparison of the analyses and plots of the differences between the two laboratories is presented. Hydrographs and instantaneous discharge measurements are included. (USGS)
Rodent CNS neuron development: Timing of cell birth and death
NASA Technical Reports Server (NTRS)
Keefe, J. R.
1984-01-01
Data obtained from a staged series of single paired injections of tritiated thymidine to pregnant Wistar rats or C57BL/6J mice on selected embryonic days and several postnatal times are reported. All injected specimens were allowed to come to term, each litter was culled to six pups, and specimens were sacrificed on PN28, with fixation and embedding in paraffin and plastic. The results are derived from serial paraffin sections of PN28 animals exposed to autoradiographic processing and plotted with respect to heavily labelled cell nuclei present in the selected brain stem nuclei and sensory ganglia. Counts from each time sample/structure are totalled, and the percentage of the total labelled population per structure represented by each injection time interval is plotted.
Influence of transition metal electronegativity on the oxygen storage capacity of perovskite oxides.
Liu, Lu; Taylor, Daniel D; Rodriguez, Efrain E; Zachariah, Michael R
2016-08-16
The selection of highly efficient oxygen carriers (OCs) is a key step necessary for the practical development of chemical looping combustion (CLC). In this study, a series of ABO3 perovskites, where A = La, Ba, Sr, Ca and B = Cr, Mn, Fe, Co, Ni, Cu, are synthesized and tested in a fixed bed reactor for reactivity and stability as OCs with CH4 as the fuel. We find that the electronegativity of the transition metal on the B-site (λB), is a convenient descriptor for oxygen storage capacity (OSC) of our perovskite samples. By plotting OSC for total methane oxidation against λB, we observe an inverted volcano plot relationship. These results could provide useful guidelines for perovskite OC design and their other energy related applications.
SpcAudace: Spectroscopic processing and analysis package of Audela software
NASA Astrophysics Data System (ADS)
Mauclaire, Benjamin
2017-11-01
SpcAudace processes long-slit spectra with automated pipelines and performs astrophysical analysis of the resulting data. These powerful pipelines perform all the required steps in one pass: standard preprocessing, masking of bad pixels, geometric corrections, registration, optimized spectrum extraction, wavelength calibration, and instrumental response computation and correction. Both high- and low-resolution long-slit spectra are managed for stellar and non-stellar targets. Many types of publication-quality figures can be easily produced: pdf and png plots or annotated time series plots. Astrophysical quantities can be derived from individual spectra or large sets of spectra with advanced functions: from line profile characteristics to equivalent width and periodograms. More than 300 documented functions are available and can be used in TCL scripts for automation. SpcAudace is based on the Audela open source software.
Baker, Nathan A.; McCammon, J. Andrew
2008-01-01
The solvent reaction field potential of an uncharged protein immersed in Simple Point Charge/Extended (SPC/E) explicit solvent was computed over a series of molecular dynamics trajectories, in total 1560 ns of simulation time. A finite, positive potential of 13 to 24 kbT ec−1 (where T = 300 K), dependent on the geometry of the solvent-accessible surface, was observed inside the biomolecule. The primary contribution to this potential arose from a layer of positive charge density 1.0 Å from the solute surface, on average 0.008 ec/Å3, which we found to be the product of a highly ordered first solvation shell. Significant second solvation shell effects, including additional layers of charge density and a slight decrease in the short-range solvent-solvent interaction strength, were also observed. The impact of these findings on implicit solvent models was assessed by running similar explicit-solvent simulations on the fully charged protein system. When the energy due to the solvent reaction field in the uncharged system is accounted for, correlation between per-atom electrostatic energies for the explicit solvent model and a simple implicit (Poisson) calculation is 0.97, and correlation between per-atom energies for the explicit solvent model and a previously published, optimized Poisson model is 0.99. PMID:17949217
NASA Astrophysics Data System (ADS)
Cerutti, David S.; Baker, Nathan A.; McCammon, J. Andrew
2007-10-01
The solvent reaction field potential of an uncharged protein immersed in simple point charge/extended explicit solvent was computed over a series of molecular dynamics trajectories, in total 1560 ns of simulation time. A finite, positive potential of 13-24 kbT ec-1 (where T = 300 K), dependent on the geometry of the solvent-accessible surface, was observed inside the biomolecule. The primary contribution to this potential arose from a layer of positive charge density 1.0 Å from the solute surface, on average 0.008 ec/Å3, which we found to be the product of a highly ordered first solvation shell. Significant second solvation shell effects, including additional layers of charge density and a slight decrease in the short-range solvent-solvent interaction strength, were also observed. The impact of these findings on implicit solvent models was assessed by running similar explicit solvent simulations on the fully charged protein system. When the energy due to the solvent reaction field in the uncharged system is accounted for, correlation between per-atom electrostatic energies for the explicit solvent model and a simple implicit (Poisson) calculation is 0.97, and correlation between per-atom energies for the explicit solvent model and a previously published, optimized Poisson model is 0.99.
Song, X X; Zhao, Q; Tao, T; Zhou, C M; Diwan, V K; Xu, B
2018-05-30
Records of absenteeism from primary schools are valuable data for infectious disease surveillance. However, analysis of the absenteeism is complicated by the data features of clustering at zero, non-independence and overdispersion. This study aimed to generate an appropriate model to handle the absenteeism data collected in a European Commission-funded project for infectious disease surveillance in rural China, and to evaluate the validity and timeliness of the resulting model for early warnings of infectious disease outbreaks. Four steps were taken: (1) building a 'well-fitting' model by the zero-inflated Poisson model with random effects (ZIP-RE) using the absenteeism data from the first implementation year; (2) applying the resulting model to predict the 'expected' number of absenteeism events in the second implementation year; (3) computing the differences between the observations and the expected values (O-E values) to generate an alternative series of data; (4) evaluating the early warning validity and timeliness of the observational data and model-based O-E values via the EARS-3C algorithms with regard to the detection of real cluster events. The results indicate that ZIP-RE and its corresponding O-E values could improve the detection of aberrations, reduce the false-positive signals and are applicable to the zero-inflated data.
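The zero-inflated Poisson distribution underlying ZIP-RE, and the O-E series built from model expectations, can be sketched as follows (the rates and inflation probability are hypothetical numbers, not the study's fitted values, and the random effects are omitted):

```python
import math

def zip_pmf(k, lam, pi):
    """Zero-inflated Poisson: extra probability mass pi at zero,
    otherwise an ordinary Poisson(lam) count."""
    poisson = math.exp(-lam) * lam ** k / math.factorial(k)
    return pi * (k == 0) + (1 - pi) * poisson

def o_minus_e(observed, lam, pi):
    """Observed absentee counts minus the ZIP model expectation."""
    expected = (1 - pi) * lam           # mean of a ZIP(lam, pi) variable
    return [o - expected for o in observed]

print(round(zip_pmf(0, lam=2.0, pi=0.3), 4))     # 0.3 + 0.7*exp(-2) ≈ 0.3947
print(o_minus_e([0, 0, 5, 1], lam=2.0, pi=0.3))  # the expected count is 1.4
```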
Universality in the distance between two teams in a football tournament
NASA Astrophysics Data System (ADS)
da Silva, Roberto; Dahmen, Silvio R.
2014-03-01
Is football (soccer) a universal sport? Beyond the question of geographical distribution, where the answer is most certainly yes, when looked at from a mathematical viewpoint the scoring process during a match can be thought of, in a first approximation, as being modeled by a Poisson distribution. Recently, it was shown that the scoring of real tournaments can be reproduced by means of an agent-based model (da Silva et al. (2013) [24]) based on two simple hypotheses: (i) the ability of a team to win a match is given by the rate of a Poisson distribution that governs its scoring during a match; and (ii) such ability evolves over time according to results of previous matches. In this article we are interested in the question of whether the time series represented by the scores of teams have universal properties. For this purpose we define a distance between two teams as the square root of the sum of squares of the score differences between teams over all rounds in a double-round-robin-system and study how this distance evolves over time. Our results suggest a universal distance distribution of tournaments of different major leagues which is better characterized by an exponentially modified Gaussian (EMG). This result is corroborated by our agent-based model.
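The distance defined in the abstract is straightforward to compute (a sketch with made-up cumulative scores, not data from a real league):

```python
import numpy as np

def team_distance(scores_a, scores_b):
    """Square root of the sum of squared score differences between
    two teams over all rounds of a double-round-robin tournament."""
    diff = np.asarray(scores_a) - np.asarray(scores_b)
    return float(np.sqrt(np.sum(diff ** 2)))

# Cumulative points of two hypothetical teams after each round
a = [3, 4, 7, 7, 10]
b = [0, 3, 3, 6, 7]
print(round(team_distance(a, b), 3))  # sqrt(9 + 1 + 16 + 1 + 9) = 6.0
```

Studying how this distance is distributed over all team pairs and rounds is what yields the universal (EMG-shaped) curve reported in the paper.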
NASA Astrophysics Data System (ADS)
Tzoupis, Haralambos; Leonis, Georgios; Durdagi, Serdar; Mouchlis, Varnavas; Mavromoustakos, Thomas; Papadopoulos, Manthos G.
2011-10-01
The objectives of this study include the design of a series of novel fullerene-based inhibitors for HIV-1 protease (HIV-1 PR), by employing two strategies that can also be applied to the design of inhibitors for any other target. Additionally, the interactions which contribute to the observed exceptionally high binding free energies were analyzed. In particular, we investigated: (1) hydrogen bonding (H-bond) interactions between specific fullerene derivatives and the protease, (2) the regions of HIV-1 PR that play a significant role in binding, (3) protease changes upon binding and (4) various contributions to the binding free energy, in order to identify the most significant of them. This study has been performed by employing a docking technique, two 3D-QSAR models, molecular dynamics (MD) simulations and the molecular mechanics Poisson-Boltzmann surface area (MM-PBSA) method. Our computed binding free energies are in satisfactory agreement with the experimental results. The suitability of specific fullerene derivatives as drug candidates was further enhanced, after ADMET (absorption, distribution, metabolism, excretion and toxicity) properties have been estimated to be promising. The outcomes of this study revealed important protein-ligand interaction patterns that may lead towards the development of novel, potent HIV-1 PR inhibitors.
On the Determination of Poisson Statistics for Haystack Radar Observations of Orbital Debris
NASA Technical Reports Server (NTRS)
Stokely, Christopher L.; Benbrook, James R.; Horstman, Matt
2007-01-01
A convenient and powerful method is used to determine if radar detections of orbital debris are observed according to Poisson statistics. This is done by analyzing the time interval between detection events. For Poisson statistics, the probability distribution of the time interval between events is shown to be an exponential distribution. This distribution is a special case of the Erlang distribution that is used in estimating traffic loads on telecommunication networks. Poisson statistics form the basis of many orbital debris models but the statistical basis of these models has not been clearly demonstrated empirically until now. Interestingly, during the fiscal year 2003 observations with the Haystack radar in a fixed staring mode, there are no statistically significant deviations observed from that expected with Poisson statistics, either independent or dependent of altitude or inclination. One would potentially expect some significant clustering of events in time as a result of satellite breakups, but the presence of Poisson statistics indicates that such debris disperse rapidly with respect to Haystack's very narrow radar beam. An exception to Poisson statistics is observed in the months following the intentional breakup of the Fengyun satellite in January 2007.
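The interval test the authors apply can be illustrated in a few lines (a sketch, not the Haystack analysis code; the rate is an arbitrary value): for a homogeneous Poisson process the gaps between events are exponentially distributed, so their coefficient of variation is 1.

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulate detection times of a homogeneous Poisson process at a constant rate
rate = 0.5                                        # events per unit time (illustrative)
gaps = rng.exponential(1.0 / rate, size=20000)    # exponential inter-arrival times
times = np.cumsum(gaps)

# For exponential intervals the mean equals the standard deviation, so CV == 1;
# significant deviation from 1 would signal clustering (e.g. after a breakup event)
intervals = np.diff(times)
cv = intervals.std() / intervals.mean()
print(round(cv, 2))  # close to 1.0 for a Poisson process
```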
Lefkimmiatis, Stamatios; Maragos, Petros; Papandreou, George
2009-08-01
We present an improved statistical model for analyzing Poisson processes, with applications to photon-limited imaging. We build on previous work, adopting a multiscale representation of the Poisson process in which the ratios of the underlying Poisson intensities (rates) in adjacent scales are modeled as mixtures of conjugate parametric distributions. Our main contributions include: 1) a rigorous and robust regularized expectation-maximization (EM) algorithm for maximum-likelihood estimation of the rate-ratio density parameters directly from the noisy observed Poisson data (counts); 2) extension of the method to work under a multiscale hidden Markov tree model (HMT) which couples the mixture label assignments in consecutive scales, thus modeling interscale coefficient dependencies in the vicinity of image edges; 3) exploration of a 2-D recursive quad-tree image representation, involving Dirichlet-mixture rate-ratio densities, instead of the conventional separable binary-tree image representation involving beta-mixture rate-ratio densities; and 4) a novel multiscale image representation, which we term Poisson-Haar decomposition, that better models the image edge structure, thus yielding improved performance. Experimental results on standard images with artificially simulated Poisson noise and on real photon-limited images demonstrate the effectiveness of the proposed techniques.
Segmentation of time series with long-range fractal correlations.
Bernaola-Galván, P; Oliver, J L; Hackenberg, M; Coronado, A V; Ivanov, P Ch; Carpena, P
2012-06-01
Segmentation is a standard method of data analysis to identify change-points dividing a nonstationary time series into homogeneous segments. However, for long-range fractal correlated series, most of the segmentation techniques detect spurious change-points which are simply due to the heterogeneities induced by the correlations and not to real nonstationarities. To avoid this oversegmentation, we present a segmentation algorithm which takes as a reference for homogeneity, instead of a random i.i.d. series, a correlated series modeled by a fractional noise with the same degree of correlations as the series to be segmented. We apply our algorithm to artificial series with long-range correlations and show that it systematically detects only the change-points produced by real nonstationarities and not those created by the correlations of the signal. Further, we apply the method to the sequence of the long arm of human chromosome 21, which is known to have long-range fractal correlations. We obtain only three segments that clearly correspond to the three regions of different G + C composition revealed by means of a multi-scale wavelet plot. Similar results have been obtained when segmenting all human chromosome sequences, showing the existence of previously unknown huge compositional superstructures in the human genome.
Simulation Methods for Poisson Processes in Nonstationary Systems.
1978-08-01
A method for simulation of nonhomogeneous Poisson processes with a log-linear rate function is stated. The method is based on an identity relating the ... and relatively efficient new method for simulation of one-dimensional and two-dimensional nonhomogeneous Poisson processes is described. The method is ...
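One standard way to simulate a nonhomogeneous Poisson process with a log-linear rate is Lewis-Shedler thinning (a sketch; the report's own identity-based method is not reproduced here, and the rate parameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)

def thin_nhpp(a, b, T):
    """Simulate an NHPP with log-linear rate lambda(t) = exp(a + b*t) on [0, T]
    by thinning a homogeneous process run at the peak rate."""
    lam_max = np.exp(a + b * T) if b > 0 else np.exp(a)   # bounding rate
    t, events = 0.0, []
    while True:
        t += rng.exponential(1.0 / lam_max)   # candidate from the bounding process
        if t > T:
            return np.array(events)
        # accept the candidate with probability lambda(t) / lam_max
        if rng.random() < np.exp(a + b * t) / lam_max:
            events.append(t)

events = thin_nhpp(a=0.5, b=0.1, T=10.0)
# The expected count is the integrated rate: (exp(a + b*T) - exp(a)) / b ≈ 28.3
print(len(events))
```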
Poisson geometry from a Dirac perspective
NASA Astrophysics Data System (ADS)
Meinrenken, Eckhard
2018-03-01
We present proofs of classical results in Poisson geometry using techniques from Dirac geometry. This article is based on mini-courses at the Poisson summer school in Geneva, June 2016, and at the workshop Quantum Groups and Gravity at the University of Waterloo, April 2016.
Identification of a Class of Filtered Poisson Processes.
1981-01-01
AD-A135 371: Identification of a Class of Filtered Poisson Processes. North Carolina Univ. at Chapel Hill, Dept. of Statistics; De Brucq, Denis; Gualtierotti, Antonio; 1981. ... A class of filtered Poisson processes is introduced: the amplitude has a law which is spherically invariant and the filter is real, linear and causal. It is shown ...
1981-11-01
C EH   11-22  HOUSING ELASTIC MODULUS (F/L**2).
C POH  23-34  HOUSING POISSON'S RATIO.
C DENH 35-46  HOUSING MATERIAL DENSITY (M/L**3).
...
C POC  23-34  CAGE POISSON'S RATIO.
C DENC 35-46  CAGE MATERIAL DENSITY (M/L**3).
C
C CARD 11
C ----
C JF   11-16
Minimum risk wavelet shrinkage operator for Poisson image denoising.
Cheng, Wu; Hirakawa, Keigo
2015-05-01
The pixel values of images taken by an image sensor are said to be corrupted by Poisson noise. To date, multiscale Poisson image denoising techniques have processed Haar frame and wavelet coefficients--the modeling of coefficients is enabled by the Skellam distribution analysis. We extend these results by solving for shrinkage operators for Skellam that minimizes the risk functional in the multiscale Poisson image denoising setting. The minimum risk shrinkage operator of this kind effectively produces denoised wavelet coefficients with minimum attainable L2 error.
Cumulative Poisson Distribution Program
NASA Technical Reports Server (NTRS)
Bowerman, Paul N.; Scheuer, Ernest M.; Nolty, Robert
1990-01-01
Overflow and underflow in sums are prevented. The Cumulative Poisson Distribution Program, CUMPOIS, is one of two computer programs that make calculations involving cumulative Poisson distributions. Both programs, CUMPOIS (NPO-17714) and NEWTPOIS (NPO-17715), can be used independently of one another. CUMPOIS determines the cumulative Poisson distribution, used to evaluate the cumulative distribution function (cdf) for gamma distributions with integer shape parameters and the cdf for chi-square (χ²) distributions with even degrees of freedom. Used by statisticians and others concerned with probabilities of independent events occurring over specific units of time, area, or volume. Written in C.
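The quantity CUMPOIS computes can be sketched with an overflow-safe log-space sum (a Python illustration, not the original C program): each Poisson term is evaluated via lgamma and the terms are combined with log-sum-exp, so large rates or counts never overflow an individual term.

```python
import math

def poisson_cdf(k, lam):
    """P(X <= k) for X ~ Poisson(lam), summed in log space so that
    large lam or k cannot overflow the individual terms."""
    if k < 0:
        return 0.0
    log_terms = [i * math.log(lam) - lam - math.lgamma(i + 1) for i in range(k + 1)]
    m = max(log_terms)                       # log-sum-exp for numerical stability
    return math.exp(m) * sum(math.exp(t - m) for t in log_terms)

print(round(poisson_cdf(2, 2.0), 6))  # 5*exp(-2) ≈ 0.676676
# This value also equals the regularized upper incomplete gamma Q(k+1, lam),
# the gamma/chi-square connection mentioned in the abstract.
```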
Poly-symplectic Groupoids and Poly-Poisson Structures
NASA Astrophysics Data System (ADS)
Martinez, Nicolas
2015-05-01
We introduce poly-symplectic groupoids, which are natural extensions of symplectic groupoids to the context of poly-symplectic geometry, and define poly-Poisson structures as their infinitesimal counterparts. We present equivalent descriptions of poly-Poisson structures, including one related with AV-Dirac structures. We also discuss symmetries and reduction in the setting of poly-symplectic groupoids and poly-Poisson structures, and use our viewpoint to revisit results and develop new aspects of the theory initiated in Iglesias et al. (Lett Math Phys 103:1103-1133, 2013).
An analytical method for the inverse Cauchy problem of Lamé equation in a rectangle
NASA Astrophysics Data System (ADS)
Grigor’ev, Yu
2018-04-01
In this paper, we present an analytical computational method for the inverse Cauchy problem of Lamé equation in the elasticity theory. A rectangular domain is frequently used in engineering structures and we only consider the analytical solution in a two-dimensional rectangle, wherein a missing boundary condition is recovered from the full measurement of stresses and displacements on an accessible boundary. The essence of the method consists in solving three independent Cauchy problems for the Laplace and Poisson equations. For each of them, the Fourier series is used to formulate a first-kind Fredholm integral equation for the unknown function of data. Then, we use a Lavrentiev regularization method, and the termwise separable property of the kernel function allows us to obtain a closed-form regularized solution. As a result, for the displacement components, we obtain solutions in the form of a sum of series with three regularization parameters. The uniform convergence and error estimation of the regularized solutions are proved.
Fractional Poisson--a simple dose-response model for human norovirus.
Messner, Michael J; Berger, Philip; Nappier, Sharon P
2014-10-01
This study utilizes old and new Norovirus (NoV) human challenge data to model the dose-response relationship for human NoV infection. The combined data set is used to update estimates from a previously published beta-Poisson dose-response model that includes parameters for virus aggregation and for a beta-distribution that describes variable susceptibility among hosts. The quality of the beta-Poisson model is examined and a simpler model is proposed. The new model (fractional Poisson) characterizes hosts as either perfectly susceptible or perfectly immune, requiring a single parameter (the fraction of perfectly susceptible hosts) in place of the two-parameter beta-distribution. A second parameter is included to account for virus aggregation in the same fashion as it is added to the beta-Poisson model. Infection probability is simply the product of the probability of nonzero exposure (at least one virus or aggregate is ingested) and the fraction of susceptible hosts. The model is computationally simple and appears to be well suited to the data from the NoV human challenge studies. The model's deviance is similar to that of the beta-Poisson, but with one parameter, rather than two. As a result, the Akaike information criterion favors the fractional Poisson over the beta-Poisson model. At low, environmentally relevant exposure levels (<100), estimation error is small for the fractional Poisson model; however, caution is advised because no subjects were challenged at such a low dose. New low-dose data would be of great value to further clarify the NoV dose-response relationship and to support improved risk assessment for environmentally relevant exposures. © 2014 Society for Risk Analysis Published 2014. This article is a U.S. Government work and is in the public domain for the U.S.A.
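The fractional Poisson relation described above is simple enough to state in a few lines: infection probability is the product of the probability of nonzero exposure and the susceptible fraction. A sketch under the assumption of Poisson-distributed ingestion, with the aggregation correction omitted and illustrative parameter names:

```python
import math

def fractional_poisson(dose, f_susceptible):
    """Infection probability = P(at least one virus ingested) times the
    fraction of perfectly susceptible hosts; the ingested count is
    assumed Poisson(dose), so P(nonzero) = 1 - exp(-dose)."""
    p_nonzero = 1.0 - math.exp(-dose)
    return f_susceptible * p_nonzero
```

At high doses the probability saturates at f_susceptible, which is what makes the single-parameter model identifiable from challenge data.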
Modeling animal-vehicle collisions using diagonal inflated bivariate Poisson regression.
Lao, Yunteng; Wu, Yao-Jan; Corey, Jonathan; Wang, Yinhai
2011-01-01
Two types of animal-vehicle collision (AVC) data are commonly adopted for AVC-related risk analysis research: reported AVC data and carcass removal data. One issue with these two data sets is that previous studies found significant discrepancies between them. In order to model these two types of data together and provide a better understanding of highway AVCs, this study adopts a diagonal inflated bivariate Poisson regression method, an inflated version of the bivariate Poisson regression model, to fit the reported AVC and carcass removal data sets collected in Washington State during 2002-2006. The diagonal inflated bivariate Poisson model can not only model paired data with correlation, but also handle under- or over-dispersed data sets. Compared with three other types of models, double Poisson, bivariate Poisson, and zero-inflated double Poisson, the diagonal inflated bivariate Poisson model demonstrates its capability of fitting two data sets with remarkable overlapping portions resulting from the same stochastic process. Therefore, the diagonal inflated bivariate Poisson model provides researchers a new approach to investigating AVCs from a different perspective involving the three distribution parameters (λ1, λ2 and λ3). The modeling results show the impacts of traffic elements, geometric design and geographic characteristics on the occurrences of both reported AVCs and carcass removals. It is found that increases in some associated factors, such as speed limit, annual average daily traffic, and shoulder width, will increase the numbers of reported AVCs and carcass removals. Conversely, the presence of some geometric factors, such as rolling and mountainous terrain, will decrease the number of reported AVCs. Published by Elsevier Ltd.
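The three parameters mentioned above come from the standard trivariate-reduction construction X1 = Y1 + Y3, X2 = Y2 + Y3 with independent Yi ~ Poisson(λi), so the shared component λ3 carries the correlation between the two counts. A sketch of the (non-inflated) bivariate Poisson pmf under that construction:

```python
import math

def pois(n, mu):
    """Univariate Poisson pmf, zero for negative n."""
    if n < 0:
        return 0.0
    return math.exp(n * math.log(mu) - mu - math.lgamma(n + 1))

def bivariate_poisson_pmf(x1, x2, lam1, lam2, lam3):
    """P(X1 = x1, X2 = x2) with X1 = Y1 + Y3 and X2 = Y2 + Y3,
    Yi ~ Poisson(lami) independent; the shared Y3 gives Cov(X1, X2) = lam3."""
    return sum(pois(x1 - k, lam1) * pois(x2 - k, lam2) * pois(k, lam3)
               for k in range(min(x1, x2) + 1))
```

The diagonal inflated version studied in the paper mixes this pmf with extra mass on the diagonal x1 = x2; that mixing step is not sketched here.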
Linear and quadratic models of point process systems: contributions of patterned input to output.
Lindsay, K A; Rosenberg, J R
2012-08-01
In the 1880's Volterra characterised a nonlinear system using a functional series connecting continuous input and continuous output. Norbert Wiener, in the 1940's, circumvented problems associated with the application of Volterra series to physical problems by deriving from it a new series of terms that are mutually uncorrelated with respect to Gaussian processes. Subsequently, Brillinger, in the 1970's, introduced a point-process analogue of Volterra's series connecting point-process inputs to the instantaneous rate of point-process output. We derive here a new series from this analogue in which its terms are mutually uncorrelated with respect to Poisson processes. This new series expresses how patterned input in a spike train, represented by third-order cross-cumulants, is converted into the instantaneous rate of an output point-process. Given experimental records of suitable duration, the contribution of arbitrary patterned input to an output process can, in principle, be determined. Solutions for linear and quadratic point-process models with one and two inputs and a single output are investigated. Our theoretical results are applied to isolated muscle spindle data in which the spike trains from the primary and secondary endings from the same muscle spindle are recorded in response to stimulation of one and then two static fusimotor axons in the absence and presence of a random length change imposed on the parent muscle. For a fixed mean rate of input spikes, the analysis of the experimental data makes explicit which patterns of two input spikes contribute to an output spike. Copyright © 2012 Elsevier Ltd. All rights reserved.
Modeling laser velocimeter signals as triply stochastic Poisson processes
NASA Technical Reports Server (NTRS)
Mayo, W. T., Jr.
1976-01-01
Previous models of laser Doppler velocimeter (LDV) systems have not adequately described dual-scatter signals in a manner useful for analysis and simulation of low-level photon-limited signals. At low photon rates, an LDV signal at the output of a photomultiplier tube is a compound nonhomogeneous filtered Poisson process, whose intensity function is another (slower) Poisson process with the nonstationary rate and frequency parameters controlled by a random flow (slowest) process. In the present paper, generalized Poisson shot noise models are developed for low-level LDV signals. Theoretical results useful in detection error analysis and simulation are presented, along with measurements of burst amplitude statistics. Computer generated simulations illustrate the difference between Gaussian and Poisson models of low-level signals.
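A filtered Poisson (shot-noise) process of the kind used above can be simulated directly: draw Poisson arrival times, then superpose a filter response at each arrival. A minimal homogeneous sketch (the paper's compound, nonhomogeneous, Doppler-modulated version is considerably more elaborate):

```python
import math
import random

def shot_noise(rate, t_end, tau, t_grid, rng):
    """Filtered Poisson process Y(t) = sum_i h(t - t_i), with filter
    h(u) = exp(-u/tau) for u >= 0 and arrival times t_i from a
    homogeneous Poisson process of the given rate on [0, t_end]."""
    arrivals = []
    t = rng.expovariate(rate)          # exponential inter-arrival times
    while t < t_end:
        arrivals.append(t)
        t += rng.expovariate(rate)
    return [sum(math.exp(-(t - ti) / tau) for ti in arrivals if ti <= t)
            for t in t_grid]
```

Campbell's theorem gives the stationary mean as rate times the filter area (rate * tau here), a quick sanity check on any such simulation.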
Dennis E. Ferguson; John C. Byrne
2016-01-01
The response of 28 shrub species to wildfire burn severity was assessed for 8 wildfires on 6 national forests in the northern Rocky Mountains, USA. Stratified random sampling was used to choose 224 stands based on burn severity, habitat type series, slope steepness, stand height, and stand density, which resulted in 896 plots measured at approximately 2-year intervals...
Todd A. Schroeder; Sean P. Healey; Gretchen G. Moisen; Tracey S. Frescino; Warren B. Cohen; Chengquan Huang; Robert E. Kennedy; Zhiqiang Yang
2014-01-01
With earth's surface temperature and human population both on the rise a new emphasis has been placed on monitoring changes to forested ecosystems the world over. In the United States the U.S. Forest Service Forest Inventory and Analysis (FIA) program monitors the forested land base with field data collected over a permanent network of sample plots. Although these...
Randi A. Hansen; David C. Coleman
1997-01-01
To investigate the relationship between litter complexity and composition and the diversity and composition of the oribatid mite fauna inhabiting it, an experiment was carried out at a single forested site in the mountains of North Carolina, USA. Natural litterfall was excluded from a series of 1 m² plots and replaced with treatment litters that...
ERIC Educational Resources Information Center
Yau, Hon Man; Haines, Ronald S.; Harper, Jason B.
2015-01-01
A "one-pot" method for acquiring kinetic data for the reactions of a series of substituted aromatic esters with potassium hydroxide using [superscript 13]C NMR spectroscopy is described, which provides an efficient way to obtain sufficient data to demonstrate the Hammett equation in undergraduate laboratories. The method is…
ERIC Educational Resources Information Center
Ortiz-Seda, Darnyd W
Since performance is the main difference between drama and fiction, it should be included in drama instruction in order to give students a complete view of what drama really is. Accordingly, a series of theatrical techniques to teach four elements of drama--plot, character, setting, and mood--were elaborated. Improvisations, pantomimes,…
R.R. Mason; H.G. Paul
1996-01-01
Larval densities of the western spruce budworm (Choristoneura occidentalis Freeman) were monitored for 12 years (1984-95) on permanent sample plots in northeastern Oregon. The time series spanned a period of general budworm infestations when populations increased rapidly from low densities, plateaued for a time at high-outbreak densities, and then declined suddenly....
Exploring Episodic Affordance and Response in Children's Narratives Based on a Picture Book
ERIC Educational Resources Information Center
Hoel, Trude
2016-01-01
This article presents part of a research project where the aim is to investigate six- to seven-year-old children's language use in storytelling. The children's oral texts are based on the picture book "Frog, Where Are You?" The book consists of a series of episodes that more or less directly point to the plot structure. However, it also…
The Importance of Practice in the Development of Statistics.
1983-01-01
MRC Technical Summary Report #2471, The Importance of Practice in the Development of Statistics ...component analysis, bioassay, limits for a ratio, quality control, sampling inspection, non-parametric tests, transformation theory, ARIMA time-series models, sequential tests, cumulative sum charts, data analysis plotting techniques, and a resolution of the Bayes-frequentist controversy. It appears
NASA Astrophysics Data System (ADS)
Li, Zhanling; Li, Zhanjie; Li, Chengcheng
2014-05-01
Probability modeling of hydrological extremes is one of the major research areas in hydrological science. Most basins in the humid and semi-humid south and east of China have been covered by probability modeling analyses of high flow extremes, while for the inland river basins, which occupy about 35% of the country's area, such studies remain limited, partly owing to limited data availability and relatively low mean annual flows. The objective of this study is to carry out probability modeling of high flow extremes in the upper reach of the Heihe River basin, the second largest inland river basin in China, using the peaks-over-threshold (POT) method and the Generalized Pareto Distribution (GPD); the selection of the threshold and the inherent assumptions of the POT series are elaborated in detail. For comparison, other widely used probability distributions, including the generalized extreme value (GEV), Lognormal, Log-logistic and Gamma distributions, are employed as well. Maximum likelihood estimation is used for parameter estimation. Daily flow data at Yingluoxia station from 1978 to 2008 are used. Results show that, synthesizing the approaches of the mean excess plot, the stability features of the model parameters, the return level plot and the inherent independence assumption of the POT series, an optimum threshold of 340 m³/s is finally determined for high flow extremes in the Yingluoxia watershed. The resulting POT series proves to be stationary and independent based on the Mann-Kendall test, the Pettitt test and an autocorrelation test. In terms of the Kolmogorov-Smirnov test, the Anderson-Darling test and several graphical diagnostics such as quantile and cumulative density function plots, the GPD provides the best fit to high flow extremes in the study area. The estimated high flows for long return periods demonstrate that, as the return period increases, the return level estimates become more uncertain.
The frequency of high flow extremes exhibits a slight but not statistically significant decreasing trend from 1978 to 2008, while the intensity of such extremes is increasing, especially at the higher return levels.
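The POT/GPD machinery above can be sketched compactly. The sketch below uses a method-of-moments fit (not the maximum likelihood estimation employed in the study) together with the standard return-level formula; names and data are illustrative:

```python
import math

def gpd_fit_moments(excesses):
    """Method-of-moments estimates of the GPD shape (xi) and scale (sigma)
    from the excesses x - u over a chosen threshold u, using
    mean = sigma/(1-xi) and var = sigma^2 / ((1-xi)^2 (1-2*xi))."""
    n = len(excesses)
    mean = sum(excesses) / n
    var = sum((x - mean) ** 2 for x in excesses) / (n - 1)
    xi = 0.5 * (1.0 - mean * mean / var)
    sigma = 0.5 * mean * (mean * mean / var + 1.0)
    return xi, sigma

def return_level(u, xi, sigma, lam, period):
    """Level exceeded on average once per `period` years, given lam
    exceedances of the threshold u per year."""
    if abs(xi) < 1e-9:                     # exponential (xi -> 0) limit
        return u + sigma * math.log(lam * period)
    return u + (sigma / xi) * ((lam * period) ** xi - 1.0)
```

With a threshold such as the study's 340 m³/s, the fitted (xi, sigma) and the exceedance rate together give the T-year design flow.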
Asquith, William H.; Heitmuller, Franklin T.
2008-01-01
Analysts and managers of surface-water resources have interest in annual mean and annual harmonic mean statistics of daily mean streamflow for U.S. Geological Survey (USGS) streamflow-gaging stations in Texas. The mean streamflow represents streamflow volume, whereas the harmonic mean streamflow represents an appropriate statistic for assessing constituent concentrations that might adversely affect human health. In 2008, the USGS, in cooperation with the Texas Commission on Environmental Quality, conducted a large-scale documentation of mean and harmonic mean streamflow for 620 active and inactive, continuous-record, streamflow-gaging stations using period-of-record data through water year 2007. About 99 stations within the Texas USGS streamflow-gaging network are part of the larger national Hydroclimatic Data Network and are identified. The graphical depictions of annual mean and annual harmonic mean statistics in this report provide a historical perspective of streamflow at each station. Each figure consists of three time-series plots, two flow-duration curves, and a statistical summary of the annual mean and annual harmonic mean streamflow statistics for available data for each station. The first time-series plot depicts daily mean streamflow for the period 1900-2007. Flow-duration curves follow and are a graphical depiction of streamflow variability. Next, the remaining two time-series plots depict annual mean and annual harmonic mean streamflow and are augmented with horizontal lines that depict the mean and harmonic mean for the period of record. Monotonic trends in the annual mean and annual harmonic mean streamflow also are identified using Kendall's tau, and the slope of the trend is depicted using the nonparametric (linear) Theil-Sen line, which is drawn only when the p-value of tau is less than 0.10.
The history of annual mean and annual harmonic mean streamflow of one or more streamflow-gaging stations could be used in a watershed, river basin, or other regional context by analysts and managers of surface-water resources to guide scientific, regulatory, or other inquiries of streamflow conditions in Texas.
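The two annual statistics discussed above differ in a way worth making concrete: the harmonic mean down-weights high flows and is dominated by low-flow days, which is why it suits concentration-based assessments. A minimal sketch:

```python
def annual_mean(flows):
    """Arithmetic mean of daily mean streamflow values for one year."""
    return sum(flows) / len(flows)

def annual_harmonic_mean(flows):
    """Harmonic mean n / sum(1/q). Defined only for all-positive flows;
    zero-flow days must be handled separately in practice."""
    return len(flows) / sum(1.0 / q for q in flows)
```

For flows [1, 1, 100] the arithmetic mean is 34 while the harmonic mean is about 1.5, showing the low-flow dominance.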
Return periods of losses associated with European windstorm series in a changing climate
NASA Astrophysics Data System (ADS)
Karremann, Melanie K.; Pinto, Joaquim G.; Reyers, Mark; Klawa, Matthias
2015-04-01
During the last decades, several windstorm series hit Europe leading to large aggregated losses. Such storm series are examples of serial clustering of extreme cyclones, presenting a considerable risk for the insurance industry. Clustering of events and return periods of storm series affecting Europe are quantified based on potential losses using empirical models. Moreover, possible future changes of clustering and return periods of European storm series with high potential losses are quantified. Historical storm series are identified using 40 winters of NCEP reanalysis data (1973/1974-2012/2013). Time series of top events (1, 2 or 5 year return levels) are used to assess return periods of storm series both empirically and theoretically. Return periods of historical storm series are estimated based on the Poisson and the negative binomial distributions. Additionally, 800 winters of ECHAM5/MPI-OM1 general circulation model simulations for present (SRES scenario 20C: years 1960-2000) and future (SRES scenario A1B: years 2060-2100) climate conditions are investigated. Clustering is identified for most countries in Europe, and estimated return periods are similar for reanalysis and present day simulations. Future changes of return periods are estimated for fixed return levels and fixed loss index thresholds. For the former, shorter return periods are found for Western Europe, but changes are small and spatially heterogeneous. For the latter, which combines the effects of clustering and event ranking shifts, shorter return periods are found everywhere except for Mediterranean countries. These changes are generally not statistically significant between recent and future climate. However, the return periods for the fixed loss index approach are mostly beyond the range of preindustrial natural climate variability. This is not true for fixed return levels.
The quantification of losses associated with storm series permits a more adequate windstorm risk assessment in a changing climate.
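Serial clustering of the kind quantified above is commonly diagnosed with the dispersion of per-winter event counts: a homogeneous Poisson process has variance equal to the mean, while clustering inflates the variance, motivating the negative binomial. A sketch of that diagnostic (illustrative only; not the paper's loss-index methodology):

```python
def dispersion(counts):
    """Variance-to-mean ratio of per-winter event counts: 1 under a
    homogeneous Poisson process, > 1 under serial clustering."""
    n = len(counts)
    mean = sum(counts) / n
    var = sum((c - mean) ** 2 for c in counts) / (n - 1)
    return var / mean

def neg_binomial_size(counts):
    """Moment estimate of the negative binomial size parameter r from
    var = mean + mean^2 / r; meaningful only when dispersion > 1."""
    n = len(counts)
    mean = sum(counts) / n
    var = sum((c - mean) ** 2 for c in counts) / (n - 1)
    return mean * mean / (var - mean)
```

Applied to winters of storm counts, a dispersion well above 1 is the signature of clustering that makes the negative binomial the better return-period model.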
1983-05-20
A class of filtered Poisson processes is introduced: the amplitude has a law which is spherically invariant and the filter is real, linear and causal. It is shown how such a model can be identified from experimental data. (Author)
NASA Astrophysics Data System (ADS)
Long, Kai; Yuan, Philip F.; Xu, Shanqing; Xie, Yi Min
2018-04-01
Most studies on composites assume that the constituent phases have different values of stiffness. Little attention has been paid to the effect of constituent phases having distinct Poisson's ratios. This research focuses on a concurrent optimization method for simultaneously designing composite structures and materials with distinct Poisson's ratios. The proposed method aims to minimize the mean compliance of the macrostructure with a given mass of base materials. In contrast to the traditional interpolation of the stiffness matrix through numerical results, an interpolation scheme of the Young's modulus and Poisson's ratio using different parameters is adopted. The numerical results demonstrate that the Poisson effect plays a key role in reducing the mean compliance of the final design. An important contribution of the present study is that the proposed concurrent optimization method can automatically distribute base materials with distinct Poisson's ratios between the macrostructural and microstructural levels under a single constraint of the total mass.
Application of computational mechanics to the analysis of natural data: an example in geomagnetism.
Clarke, Richard W; Freeman, Mervyn P; Watkins, Nicholas W
2003-01-01
We discuss how the ideal formalism of computational mechanics can be adapted to apply to a noninfinite series of corrupted and correlated data, which is typical of most observed natural time series. Specifically, a simple filter that removes the corruption that creates rare unphysical causal states is demonstrated, and the concept of effective soficity is introduced. We believe that computational mechanics cannot be applied to a noisy and finite data series without invoking an argument based upon effective soficity. A related distinction between noise and unresolved structure is also defined: noise can only be eliminated by increasing the length of the time series, whereas the resolution of previously unresolved structure only requires the finite memory of the analysis to be increased. The benefits of these concepts are demonstrated in a simulated time series by (a) the effective elimination of white noise corruption from a periodic signal using the expletive filter and (b) the appearance of an effectively sofic region in the statistical complexity of a biased Poisson switch time series that is insensitive to changes in the word length (memory) used in the analysis. The new algorithm is then applied to an analysis of a real geomagnetic time series measured at Halley, Antarctica. Two principal components in the structure are detected, interpreted as the diurnal variation due to the rotation of the Earth-based station under an electrical current pattern that is fixed with respect to the Sun-Earth axis, and the random occurrence of a signature likely to be that of the magnetic substorm. In conclusion, some useful terminology for the discussion of model construction in general is introduced.
Xiao, Hong; Tian, Huai-yu; Zhang, Xi-xing; Zhao, Jian; Zhu, Pei-juan; Liu, Ru-chun; Chen, Tian-mu; Dai, Xiang-yu; Lin, Xiao-ling
2011-10-01
To understand the influence of climatic changes on the transmission of hemorrhagic fever with renal syndrome (HFRS), and to explore the use of climatic factors in early warning of HFRS, a total of 2171 cases of HFRS and the synchronous climatic data in Changsha from 2000 to 2009 were collected to construct a climate-based forecasting model for HFRS transmission. The Cochran-Armitage trend test was employed to explore the variation trend of the annual incidence of HFRS. Cross-correlation analysis was then adopted to assess the time-lag period between the climatic factors, including monthly average temperature, relative humidity, rainfall and the Multivariate El Niño-Southern Oscillation Index (MEI), and the monthly HFRS cases. Finally a time-series Poisson regression model was constructed to analyze the influence of different climatic factors on HFRS transmission. The annual incidence of HFRS in Changsha between 2000 and 2009 was 13.09/100 000 (755 cases), 9.92/100 000 (578 cases), 5.02/100 000 (294 cases), 2.55/100 000 (150 cases), 1.13/100 000 (67 cases), 1.16/100 000 (70 cases), 0.95/100 000 (58 cases), 1.40/100 000 (87 cases), 0.75/100 000 (47 cases) and 1.02/100 000 (65 cases), respectively. The incidence showed a decline during these years (Z = -5.78, P < 0.01). The results of the Poisson regression model indicated that the monthly average temperature (18.00°C, r = 0.26, P < 0.01, 1-month lag period; IRR = 1.02, 95%CI: 1.00-1.03, P < 0.01), relative humidity (75.50%, r = 0.62, P < 0.01, 3-month lag period; IRR = 1.03, 95%CI: 1.02-1.04, P < 0.01), rainfall (112.40 mm, r = 0.25, P < 0.01, 6-month lag period; IRR = 1.01, 95%CI: 1.01-1.02, P = 0.02), and MEI (r = 0.31, P < 0.01, 3-month lag period; IRR = 0.77, 95%CI: 0.67-0.88, P < 0.01) were closely associated with monthly HFRS cases (18.10 cases). Climatic factors significantly influence the incidence of HFRS.
With the influences of variable autocorrelation, seasonality, and long-term trend controlled, the time-series Poisson regression model forecasts HFRS incidence in Changsha with comparatively high accuracy, allowing the incidence of HFRS to be forecast in advance.
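A time-series Poisson regression of the kind used above relates a count series to (lagged) covariates through log E[y] = b0 + b1*x, and exp(b1) is the incidence rate ratio (IRR) reported. A self-contained one-covariate fit by iteratively reweighted least squares (a generic sketch, not the authors' model with its lag structure and seasonality controls):

```python
import math

def poisson_irls(x, y, iters=50):
    """Fit log E[y_t] = b0 + b1 * x_t by Newton/Fisher-scoring (IRLS);
    returns (b0, b1), where exp(b1) is the IRR per unit of the covariate."""
    b0, b1 = math.log(sum(y) / len(y) + 1e-9), 0.0
    for _ in range(iters):
        mu = [math.exp(b0 + b1 * xi) for xi in x]
        # Score vector and Fisher information for the two coefficients.
        s0 = sum(yi - mi for yi, mi in zip(y, mu))
        s1 = sum((yi - mi) * xi for yi, mi, xi in zip(y, mu, x))
        i00 = sum(mu)
        i01 = sum(mi * xi for mi, xi in zip(mu, x))
        i11 = sum(mi * xi * xi for mi, xi in zip(mu, x))
        det = i00 * i11 - i01 * i01
        b0 += (i11 * s0 - i01 * s1) / det   # solve the 2x2 Newton step
        b1 += (-i01 * s0 + i00 * s1) / det
    return b0, b1
```

For the log link the log-likelihood is concave, so Newton steps converge quickly on well-behaved data; an IRR of 1.02 per degree Celsius, for instance, corresponds to b1 = ln(1.02).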
NASA Astrophysics Data System (ADS)
Machiwal, Deepesh; Kumar, Sanjay; Dayal, Devi
2016-05-01
This study aimed at characterization of rainfall dynamics in a hot arid region of Gujarat, India by employing time-series modeling techniques and a sustainability approach. Five characteristics, i.e., normality, stationarity, homogeneity, presence/absence of trend, and persistence of the 34-year (1980-2013) annual rainfall time series of ten stations were identified/detected by applying multiple parametric and non-parametric statistical tests. Furthermore, the study proposes a sustainability concept for evaluating rainfall time series and demonstrates the concept, for the first time, by identifying the most sustainable rainfall series following the reliability (Ry), resilience (Re), and vulnerability (Vy) approach. Box-whisker plots, normal probability plots, and histograms indicated that the annual rainfall of Mandvi and Dayapar stations is relatively more positively skewed and non-normal compared with that of other stations, which is due to the presence of a severe outlier and extreme. Results of the Shapiro-Wilk test and Lilliefors test revealed that the annual rainfall series of all stations deviated significantly from the normal distribution. Two parametric t tests and the non-parametric Mann-Whitney test indicated significant non-stationarity in the annual rainfall of Rapar station, where the rainfall was also found to be non-homogeneous based on the results of four parametric homogeneity tests. Four trend tests indicated significantly increasing rainfall trends at Rapar and Gandhidham stations. The autocorrelation analysis suggested the presence of statistically significant persistence in the rainfall series of Bhachau (3-year time lag), Mundra (1- and 9-year time lag), Nakhatrana (9-year time lag), and Rapar (3- and 4-year time lag).
Results of the sustainability approach indicated that the annual rainfall of Mundra and Naliya stations (Ry = 0.50 and 0.44; Re = 0.47 and 0.47; Vy = 0.49 and 0.46, respectively) is the most sustainable and dependable compared with that of other stations. The highest values of the sustainability index at Mundra (0.120) and Naliya (0.112) stations confirmed the earlier findings of the Ry-Re-Vy approach. In general, annual rainfall of the study area is less reliable, less resilient, and moderately vulnerable, which emphasizes the need to develop suitable strategies for managing water resources of the area on a sustainable basis. Finally, it is recommended that multiple statistical tests (at least two) should be used in time-series modeling for making reliable decisions. Moreover, the methodology and findings of the sustainability concept in rainfall time series can easily be adopted in other arid regions of the world.
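One common formulation of the reliability-resilience-vulnerability (Ry-Re-Vy) indicators applied above treats years at or above a satisfactory rainfall threshold as successes; the study's exact operational definitions may differ, so the following is a generic sketch:

```python
def rrv(series, threshold):
    """Reliability Ry: fraction of satisfactory years (value >= threshold).
    Resilience Re: P(next year satisfactory | this year unsatisfactory).
    Vulnerability Vy: mean relative deficit over unsatisfactory years."""
    n = len(series)
    sat = [v >= threshold for v in series]
    ry = sum(sat) / n
    failures = [i for i in range(n - 1) if not sat[i]]
    re = (sum(1 for i in failures if sat[i + 1]) / len(failures)
          if failures else 1.0)
    deficits = [(threshold - v) / threshold for v in series if v < threshold]
    vy = sum(deficits) / len(deficits) if deficits else 0.0
    return ry, re, vy
```

A composite sustainability index is then often formed from the three, e.g. as Ry * Re * (1 - Vy), though the index used in the study is not specified in the abstract.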
Harries, Megan; Bukovsky-Reyes, Santiago; Bruno, Thomas J
2016-01-15
This paper details the sampling methods used with the field portable porous layer open tubular cryoadsorption (PLOT-cryo) approach, described in Part I of this two-part series, applied to several analytes of interest. We conducted tests with coumarin and 2,4,6-trinitrotoluene (two solutes that were used in initial development of PLOT-cryo technology), naphthalene, aviation turbine kerosene, and diesel fuel, on a variety of matrices and test beds. We demonstrated that these analytes can be easily detected and reliably identified using the portable unit for analyte collection. By leveraging efficiency-boosting temperature control and the high flow rate multiple capillary wafer, very short collection times (as low as 3s) yielded accurate detection. For diesel fuel spiked on glass beads, we determined a method detection limit below 1 ppm. We observed greater variability among separate samples analyzed with the portable unit than previously documented in work using the laboratory-based PLOT-cryo technology. We identify three likely sources that may help explain the additional variation: the use of a compressed air source to generate suction, matrix geometry, and variability in the local vapor concentration around the sampling probe as solute depletion occurs both locally around the probe and in the test bed as a whole. This field-portable adaptation of the PLOT-cryo approach has numerous and diverse potential applications. Published by Elsevier B.V.
Lovelock, Catherine E; Ewel, John J
2005-07-01
We studied the relationships among plant and arbuscular mycorrhizal (AM) fungal diversity, and their effects on ecosystem function, in a series of replicate tropical forestry plots in the La Selva Biological Station, Costa Rica. Forestry plots were 12 yr old and were either monocultures of three tree species, or polycultures of the tree species with two additional understory species. Relationships among the AM fungal spore community, host species, plant community diversity and ecosystem phosphorus-use efficiency (PUE) and net primary productivity (NPP) were assessed. Analysis of the relative abundance of AM fungal spores found that host tree species had a significant effect on the AM fungal community, as did host plant community diversity (monocultures vs polycultures). The Shannon diversity index of the AM fungal spore community differed significantly among the three host tree species, but was not significantly different between monoculture and polyculture plots. Over all the plots, significant positive relationships were found between AM fungal diversity and ecosystem NPP, and between AM fungal community evenness and PUE. Relative abundance of two of the dominant AM fungal species also showed significant correlations with NPP and PUE. We conclude that the AM fungal community composition in tropical forests is sensitive to host species, and provide evidence supporting the hypothesis that the diversity of AM fungi in tropical forests and ecosystem NPP covaries.
NASA Astrophysics Data System (ADS)
Rieder, H. E.; Staehelin, J.; Maeder, J. A.; Ribatet, M.; Stübi, R.; Weihs, P.; Holawe, F.; Peter, T.; Davison, A. C.
2009-04-01
Over the last few decades negative trends in stratospheric ozone have been studied because of the direct link between decreasing stratospheric ozone and increasing surface UV-radiation. Recently a discussion on ozone recovery has begun. Long-term measurements of total ozone extending back earlier than 1958 are limited and only available from a few stations in the northern hemisphere. The world's longest total ozone record is available from Arosa, Switzerland (Staehelin et al., 1998a,b). At this site total ozone measurements have been made since late 1926 through the present day. Within this study (Rieder et al., 2009) new tools from extreme value theory (e.g. Coles, 2001; Ribatet, 2007) are applied to select mathematically well-defined thresholds for extreme low and extreme high total ozone. A heavy-tail focused approach is used by fitting the Generalized Pareto Distribution (GPD) to the Arosa time series. Asymptotic arguments (Pickands, 1975) justify the use of the GPD for modeling exceedances over a sufficiently high (or below a sufficiently low) threshold (Coles, 2001). More precisely, the GPD is the limiting distribution of normalized excesses over a threshold, as the threshold approaches the endpoint of the distribution. In practice, GPD parameters are fitted to exceedances by maximum likelihood or other methods, such as probability weighted moments. A preliminary step consists in defining an appropriate threshold for which the asymptotic GPD approximation holds. Suitable tools for threshold selection, such as the MRL-plot (mean residual life plot) and TC-plot (stability plot) from the POT package (Ribatet, 2007), are presented. The frequency distribution of extremes in low (termed ELOs) and high (termed EHOs) total ozone and their influence on the long-term changes in total ozone are analyzed. Further it is shown that from the GPD model the distribution of so-called ozone mini holes (e.g.
Bojkov and Balis, 2001) can be precisely estimated and that the "extremes concept" provides new information on the data distribution and variability within the Arosa record as well as on the influence of ELOs and EHOs on the long-term trends of the ozone time series. References: Bojkov, R. D., and Balis, D.S.: Characteristics of episodes with extremely low ozone values in the northern middle latitudes 1975-2000, Ann. Geophys., 19, 797-807, 2001. Coles, S.: An Introduction to Statistical Modeling of Extreme Values, Springer Series in Statistics, ISBN:1852334592, Springer, Berlin, 2001. Pickands, J.: Statistical inference using extreme order statistics, Ann. Stat., 3, 1, 119-131, 1975. Ribatet, M.: POT: Modelling peaks over a threshold, R News, 7, 34-36, 2007. Rieder, H.E., Staehelin, J., Maeder, J.A., Stübi, R., Weihs, P., Holawe, F., and M. Ribatet: From ozone mini holes and mini highs towards extreme value theory: New insights from extreme events and non stationarity, submitted to J. Geophys. Res., 2009. Staehelin, J., Kegel, R., and Harris, N. R.: Trend analysis of the homogenized total ozone series of Arosa (Switzerland), 1929-1996, J. Geophys. Res., 103(D7), 8389-8400, doi:10.1029/97JD03650, 1998a. Staehelin, J., Renaud, A., Bader, J., McPeters, R., Viatte, P., Hoegger, B., Bugnion, V., Giroud, M., and Schill, H.: Total ozone series at Arosa (Switzerland): Homogenization and data comparison, J. Geophys. Res., 103(D5), 5827-5842, doi:10.1029/97JD02402, 1998b.
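The peaks-over-threshold procedure the abstract describes can be sketched with standard tools. The snippet below is a minimal illustration on synthetic data; the gamma-distributed stand-in for daily ozone values, the 95% threshold (which in practice would be chosen via MRL/TC plots), and the return-period choice are all assumptions, not values from the Arosa study.

```python
# Sketch: fit a Generalized Pareto Distribution to excesses over a high
# threshold, then compute a return level. Synthetic data, illustrative threshold.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
data = rng.gamma(shape=9.0, scale=35.0, size=5000)  # stand-in for daily total ozone (DU)

u = np.quantile(data, 0.95)          # high threshold (MRL/TC plots would guide this)
exceedances = data[data > u] - u     # excesses over the threshold

# Fit the GPD to the excesses by maximum likelihood (location fixed at 0)
shape, loc, scale = stats.genpareto.fit(exceedances, floc=0.0)

# Return level: value exceeded on average once per m observations
m = 10_000
p_exceed = exceedances.size / data.size
x_m = u + stats.genpareto.ppf(1.0 - 1.0 / (m * p_exceed), shape, loc=0.0, scale=scale)
print(f"threshold={u:.1f}, shape={shape:.3f}, scale={scale:.1f}, "
      f"{m}-obs return level={x_m:.1f}")
```

The same threshold-excess logic underlies the ELO tail, with the series negated so that extreme lows become upper-tail exceedances.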
Algorithm Calculates Cumulative Poisson Distribution
NASA Technical Reports Server (NTRS)
Bowerman, Paul N.; Nolty, Robert C.; Scheuer, Ernest M.
1992-01-01
Algorithm calculates accurate values of cumulative Poisson distribution under conditions where other algorithms fail because numbers are so small (underflow) or so large (overflow) that computer cannot process them. Factors inserted temporarily to prevent underflow and overflow. Implemented in CUMPOIS computer program described in "Cumulative Poisson Distribution Program" (NPO-17714).
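The numerical principle behind such scaling can be illustrated with a log-space computation; this is a modern sketch of the overflow/underflow-avoidance idea, not the CUMPOIS program (NPO-17714) itself.

```python
# Sketch: cumulative Poisson probability P(X <= k) computed entirely in log
# space, so intermediate terms never underflow or overflow even for extreme
# lambda -- a stand-in for the temporary scaling factors the abstract describes.
import math

def log_poisson_cdf(k: int, lam: float) -> float:
    """Return log P(X <= k) for X ~ Poisson(lam), computed stably."""
    # log of each term: -lam + i*log(lam) - log(i!)
    log_terms = [-lam + i * math.log(lam) - math.lgamma(i + 1) for i in range(k + 1)]
    # log-sum-exp combines the terms without leaving log space
    m = max(log_terms)
    return m + math.log(sum(math.exp(t - m) for t in log_terms))

print(math.exp(log_poisson_cdf(5, 10.0)))   # ~0.0671, matches standard tables
print(log_poisson_cdf(900, 1000.0))         # naive exp(-1000) underflows; log form is fine
```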
State Estimation for Linear Systems Driven Simultaneously by Wiener and Poisson Processes.
1978-12-01
The state estimation problem of linear stochastic systems driven simultaneously by Wiener and Poisson processes is considered, especially the case...where the incident intensities of the Poisson processes are low and the system is observed in an additive white Gaussian noise. The minimum mean squared
The Validity of Poisson Assumptions in a Combined Loglinear/MDS Mapping Model.
ERIC Educational Resources Information Center
Everett, James E.
1993-01-01
Addresses objections to the validity of assuming a Poisson loglinear model as the generating process for citations from one journal into another. Fluctuations in citation rate, serial dependence on citations, impossibility of distinguishing between rate changes and serial dependence, evidence for changes in Poisson rate, and transitivity…
Method for resonant measurement
Rhodes, George W.; Migliori, Albert; Dixon, Raymond D.
1996-01-01
A method of measurement of objects to determine object flaws, Poisson's ratio (σ) and shear modulus (μ) is shown and described. First, the frequency for expected degenerate responses is determined for one or more input frequencies, and then splitting of degenerate resonant modes is observed to identify the presence of flaws in the object. Poisson's ratio and the shear modulus can be determined by identifying resonances dependent only on the shear modulus, and then using that shear modulus to find Poisson's ratio from other modes dependent on both the shear modulus and Poisson's ratio.
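The isotropic elasticity relations this two-step procedure relies on can be written down directly. A small sketch follows, using illustrative steel-like moduli rather than values from the patent: once μ (here G) is known from shear-only modes, ν follows from any second modulus.

```python
# Sketch of the standard isotropic relations linking Poisson's ratio to pairs
# of elastic moduli. Numerical values below are illustrative assumptions.

def poisson_from_E_G(E: float, G: float) -> float:
    """Poisson's ratio from Young's modulus E and shear modulus G."""
    return E / (2.0 * G) - 1.0

def poisson_from_K_G(K: float, G: float) -> float:
    """Poisson's ratio from bulk modulus K and shear modulus G."""
    return (3.0 * K - 2.0 * G) / (2.0 * (3.0 * K + G))

# Example: steel-like moduli in GPa
E, G, K = 200.0, 77.0, 160.0
print(poisson_from_E_G(E, G))   # ~0.299
print(poisson_from_K_G(K, G))   # ~0.293
```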
Elasticity of α-Cristobalite: A Silicon Dioxide with a Negative Poisson's Ratio
NASA Astrophysics Data System (ADS)
Yeganeh-Haeri, Amir; Weidner, Donald J.; Parise, John B.
1992-07-01
Laser Brillouin spectroscopy was used to determine the adiabatic single-crystal elastic stiffness coefficients of silicon dioxide (SiO_2) in the α-cristobalite structure. This SiO_2 polymorph, unlike other silicas and silicates, exhibits a negative Poisson's ratio; α-cristobalite contracts laterally when compressed and expands laterally when stretched. Tensorial analysis of the elastic coefficients shows that Poisson's ratio reaches a maximum value of -0.5 in some directions, whereas averaged values for the single-phased aggregate yield a Poisson's ratio of -0.16.
Zero-inflated Conway-Maxwell Poisson Distribution to Analyze Discrete Data.
Sim, Shin Zhu; Gupta, Ramesh C; Ong, Seng Huat
2018-01-09
In this paper, we study the zero-inflated Conway-Maxwell Poisson (ZICMP) distribution and develop a regression model. Score and likelihood ratio tests are also implemented for testing the inflation/deflation parameter. Simulation studies are carried out to examine the performance of these tests. A data example is presented to illustrate the concepts. In this example, the proposed model is compared to the well-known zero-inflated Poisson (ZIP) and the zero-inflated generalized Poisson (ZIGP) regression models. It is shown that the fit by ZICMP is comparable to or better than these models.
A System of Poisson Equations for a Nonconstant Varadhan Functional on a Finite State Space
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cavazos-Cadena, Rolando; Hernandez-Hernandez, Daniel
2006-01-15
Given a discrete-time Markov chain with finite state space and a stationary transition matrix, a system of 'local' Poisson equations characterizing the (exponential) Varadhan's functional J(.) is given. The main results, which are derived for an arbitrary transition structure so that J(.) may be nonconstant, are as follows: (i) any solution to the local Poisson equations immediately renders Varadhan's functional, and (ii) a solution of the system always exists. The proof of this latter result is constructive and suggests a method to solve the local Poisson equations.
Applying the compound Poisson process model to the reporting of injury-related mortality rates.
Kegler, Scott R
2007-02-16
Injury-related mortality rate estimates are often analyzed under the assumption that case counts follow a Poisson distribution. Certain types of injury incidents occasionally involve multiple fatalities, however, resulting in dependencies between cases that are not reflected in the simple Poisson model and which can affect even basic statistical analyses. This paper explores the compound Poisson process model as an alternative, emphasizing adjustments to some commonly used interval estimators for population-based rates and rate ratios. The adjusted estimators involve relatively simple closed-form computations, which in the absence of multiple-case incidents reduce to familiar estimators based on the simpler Poisson model. Summary data from the National Violent Death Reporting System are referenced in several examples demonstrating application of the proposed methodology.
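The variance inflation that motivates the compound Poisson adjustment is easy to demonstrate by simulation. The sketch below uses an assumed cluster-size distribution (mostly single-fatality incidents, occasionally two or three cases), not NVDRS parameters.

```python
# Sketch: case counts from a compound Poisson process. Incidents arrive as
# Poisson events; each contributes a random number of cases. The cluster-size
# probabilities are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_sim, lam = 20_000, 50.0                      # simulated periods, incidents per period

incidents = rng.poisson(lam, size=n_sim)
# cases per incident: mostly 1, occasionally 2 or 3 (assumed)
cases = np.array([rng.choice([1, 2, 3], p=[0.90, 0.08, 0.02], size=k).sum()
                  for k in incidents])

mean, var = cases.mean(), cases.var()
print(f"mean={mean:.1f}, var={var:.1f}, var/mean={var/mean:.2f}")
# For a simple Poisson count, var/mean would be 1; clustering inflates it above 1,
# which is why Poisson-based interval estimators need adjustment.
```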
Park, H M; Lee, J S; Kim, T W
2007-11-15
In the analysis of electroosmotic flows, the internal electric potential is usually modeled by the Poisson-Boltzmann equation. The Poisson-Boltzmann equation is derived from the assumption of thermodynamic equilibrium where the ionic distributions are not affected by fluid flows. Although this is a reasonable assumption for steady electroosmotic flows through straight microchannels, there are some important cases where convective transport of ions has nontrivial effects. In these cases, it is necessary to adopt the Nernst-Planck equation instead of the Poisson-Boltzmann equation to model the internal electric field. In the present work, the predictions of the Nernst-Planck equation are compared with those of the Poisson-Boltzmann equation for electroosmotic flows in various microchannels where the convective transport of ions is not negligible.
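The equilibrium baseline that the Nernst-Planck comparison relaxes can be sketched in its simplest (linearized, Debye-Hückel) form for a flat charged wall, φ(x) = ζ·exp(−x/λ_D). The parameter values below are illustrative assumptions, not values from the paper.

```python
# Sketch: linearized Poisson-Boltzmann (Debye-Hueckel) potential near a flat
# charged wall -- the thermodynamic-equilibrium field the abstract discusses.
import math

def debye_huckel_potential(x: float, zeta: float, debye_len: float) -> float:
    """Linearized PB potential a distance x from a charged wall."""
    return zeta * math.exp(-x / debye_len)

zeta = -0.025        # wall zeta potential in volts (assumed)
lam_D = 10e-9        # Debye length in metres (assumed, ~10 nm)

for x in (0.0, lam_D, 5 * lam_D):
    phi = debye_huckel_potential(x, zeta, lam_D)
    print(f"x = {x:.1e} m -> phi = {phi:.2e} V")
```

When convection distorts the ionic distributions, this closed form no longer holds and the coupled Nernst-Planck equations must be solved instead.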
Efficiency optimization of a fast Poisson solver in beam dynamics simulation
NASA Astrophysics Data System (ADS)
Zheng, Dawei; Pöplau, Gisela; van Rienen, Ursula
2016-01-01
Calculating the solution of Poisson's equation for the space charge force is still the major time consumer in beam dynamics simulations and calls for further improvement. In this paper, we summarize a classical fast Poisson solver in beam dynamics simulations: the integrated Green's function method. We introduce three optimization steps for the classical Poisson solver routine: using the reduced integrated Green's function instead of the integrated Green's function; using the discrete cosine transform instead of the discrete Fourier transform for the Green's function; and using a novel fast convolution routine instead of an explicitly zero-padded convolution. The new Poisson solver routine preserves the advantages of fast computation and high accuracy. This provides a fast routine for high-performance calculation of the space charge effect in accelerators.
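The reason FFT-based Poisson solvers dominate this workload can be shown with the simplest spectral variant. The sketch below solves a periodic 2D problem; the paper's integrated-Green's-function method instead handles open boundaries via zero-padded convolution, so this is an illustration of the O(N log N) principle, not the paper's algorithm.

```python
# Sketch: spectral Poisson solve lap(phi) = -rho on a periodic grid via FFT.
# The single-mode source has a known analytic solution for verification.
import numpy as np

n, L = 64, 1.0
x = np.linspace(0, L, n, endpoint=False)
X, Y = np.meshgrid(x, x, indexing="ij")

rho = np.sin(2 * np.pi * X) * np.sin(2 * np.pi * Y)   # source term

k = 2 * np.pi * np.fft.fftfreq(n, d=L / n)
KX, KY = np.meshgrid(k, k, indexing="ij")
k2 = KX**2 + KY**2
k2[0, 0] = 1.0                                        # dodge division by zero for the mean mode

phi_hat = np.fft.fft2(rho) / k2                       # in Fourier space, k^2 * phi_hat = rho_hat
phi_hat[0, 0] = 0.0                                   # fix the arbitrary constant (zero mean)
phi = np.real(np.fft.ifft2(phi_hat))

exact = rho / (8 * np.pi**2)                          # analytic solution for this rho
print("max abs error:", np.abs(phi - exact).max())    # machine precision for this mode
```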
Improved Denoising via Poisson Mixture Modeling of Image Sensor Noise.
Zhang, Jiachao; Hirakawa, Keigo
2017-04-01
This paper describes a study aimed at comparing the real image sensor noise distribution to the models of noise often assumed in image denoising designs. A quantile analysis in the pixel, wavelet transform, and variance stabilization domains reveals that the tails of the Poisson, signal-dependent Gaussian, and Poisson-Gaussian models are too short to capture real sensor noise behavior. A new Poisson mixture noise model is proposed to correct the mismatch in tail behavior. Based on the fact that noise model mismatch results in image denoising that undersmooths real sensor data, we propose a mixture-of-Poisson denoising method to remove the denoising artifacts without affecting image details, such as edges and textures. Experiments with real sensor data verify that denoising for real image sensor data is indeed improved by this new technique.
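The Poisson-Gaussian baseline model and the variance-stabilization domain mentioned above can be sketched directly. The signal level and read-noise standard deviation below are illustrative assumptions, not the paper's fitted sensor parameters.

```python
# Sketch: simulate Poisson-Gaussian sensor noise and apply the generalized
# Anscombe transform, which stabilizes its variance to approximately 1.
import numpy as np

rng = np.random.default_rng(1)
signal = 20.0                                   # true mean photon count per pixel (assumed)
read_sigma = 2.0                                # Gaussian read-noise std (assumed)

obs = rng.poisson(signal, size=200_000) + rng.normal(0.0, read_sigma, size=200_000)

# Generalized Anscombe transform: 2*sqrt(x + 3/8 + sigma^2)
stab = 2.0 * np.sqrt(np.maximum(obs + 3.0 / 8.0 + read_sigma**2, 0.0))
print("stabilized std:", stab.std())            # close to 1 when the model matches
```

Tail mismatches of the kind the paper reports would show up as departures from unit variance in exactly this stabilized domain.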
A generalized right truncated bivariate Poisson regression model with applications to health data.
Islam, M Ataharul; Chowdhury, Rafiqul I
2017-01-01
A generalized right truncated bivariate Poisson regression model is proposed in this paper. Estimation and tests for goodness of fit and for over- or underdispersion are illustrated for both untruncated and right truncated bivariate Poisson regression models using a marginal-conditional approach. Estimation and test procedures are illustrated for bivariate Poisson regression models with applications to Health and Retirement Study data on the number of health conditions and the number of health care services utilized. The proposed test statistics are easy to compute, and it is evident from the results that the models fit the data very well. A comparison between the right truncated and untruncated bivariate Poisson regression models using the test for nonnested models clearly shows that the truncated model performs significantly better than the untruncated model.
Complex wet-environments in electronic-structure calculations
NASA Astrophysics Data System (ADS)
Fisicaro, Giuseppe; Genovese, Luigi; Andreussi, Oliviero; Marzari, Nicola; Goedecker, Stefan
The computational study of chemical reactions in complex, wet environments is critical for applications in many fields. It is often essential to study chemical reactions in the presence of applied electrochemical potentials, including the complex electrostatic screening coming from the solvent. In the present work we present a solver to handle both the Generalized Poisson and the Poisson-Boltzmann equation. A preconditioned conjugate gradient (PCG) method has been implemented for the Generalized Poisson equation and the linear regime of the Poisson-Boltzmann equation, allowing the minimization problem to be solved iteratively in some ten iterations. A self-consistent procedure, on the other hand, enables us to solve the full Poisson-Boltzmann problem. The algorithms take advantage of a preconditioning procedure based on the BigDFT Poisson solver for the standard Poisson equation. They exhibit very high accuracy and parallel efficiency, and allow different boundary conditions, including surfaces. The solver has been integrated into the BigDFT and Quantum-ESPRESSO electronic-structure packages and will be released as an independent program, suitable for integration in other codes. We present test calculations for large proteins to demonstrate efficiency and performance. This work was done within the PASC and NCCR MARVEL projects. Computer resources were provided by the Swiss National Supercomputing Centre (CSCS) under Project ID s499. LG acknowledges also support from the EXTMOS EU project.
Naya, Hugo; Urioste, Jorge I; Chang, Yu-Mei; Rodrigues-Motta, Mariana; Kremer, Roberto; Gianola, Daniel
2008-01-01
Dark spots in the fleece area are often associated with dark fibres in wool, which limits its competitiveness with other textile fibres. Field data from a sheep experiment in Uruguay revealed an excess number of zeros for dark spots. We compared the performance of four Poisson and zero-inflated Poisson (ZIP) models under four simulation scenarios. All models performed reasonably well under the same scenario for which the data were simulated. The deviance information criterion favoured a Poisson model with residual, while the ZIP model with a residual gave estimates closer to their true values under all simulation scenarios. Both Poisson and ZIP models with an error term at the regression level performed better than their counterparts without such an error. Field data from Corriedale sheep were analysed with Poisson and ZIP models with residuals. Parameter estimates were similar for both models. Although the posterior distribution of the sire variance was skewed due to a small number of rams in the dataset, the median of this variance suggested a scope for genetic selection. The main environmental factor was the age of the sheep at shearing. In summary, age related processes seem to drive the number of dark spots in this breed of sheep. PMID:18558072
Ulissi, Zachary W; Govind Rajan, Ananth; Strano, Michael S
2016-08-23
Entropic surfaces represented by fluctuating two-dimensional (2D) membranes are predicted to have desirable mechanical properties when unstressed, including a negative Poisson's ratio ("auxetic" behavior). Herein, we present calculations of the strain-dependent Poisson ratio of self-avoiding 2D membranes demonstrating desirable auxetic properties over a range of mechanical strain. Finite-size membranes with unclamped boundary conditions have positive Poisson's ratio due to spontaneous non-zero mean curvature, which can be suppressed with an explicit bending rigidity in agreement with prior findings. Applying longitudinal strain along a singular axis to this system suppresses this mean curvature and the entropic out-of-plane fluctuations, resulting in a molecular-scale mechanism for realizing a negative Poisson's ratio above a critical strain, with values significantly more negative than the previously observed zero-strain limit for infinite sheets. We find that auxetic behavior persists over surprisingly high strains of more than 20% for the smallest surfaces, with desirable finite-size scaling producing surfaces with negative Poisson's ratio over a wide range of strains. These results promise the design of surfaces and composite materials with tunable Poisson's ratio by prestressing platelet inclusions or controlling the surface rigidity of a matrix of 2D materials.
An intrinsic algorithm for parallel Poisson disk sampling on arbitrary surfaces.
Ying, Xiang; Xin, Shi-Qing; Sun, Qian; He, Ying
2013-09-01
Poisson disk sampling has excellent spatial and spectral properties, and plays an important role in a variety of visual computing applications. Although many promising algorithms have been proposed for multidimensional sampling in Euclidean space, very few studies have been reported on the problem of generating Poisson disks on surfaces due to the complicated nature of the surface. This paper presents an intrinsic algorithm for parallel Poisson disk sampling on arbitrary surfaces. In sharp contrast to the conventional parallel approaches, our method neither partitions the given surface into small patches nor uses any spatial data structure to maintain the voids in the sampling domain. Instead, our approach assigns each sample candidate a random and unique priority that is unbiased with regard to the distribution. Hence, multiple threads can process the candidates simultaneously and resolve conflicts by checking the given priority values. Our algorithm guarantees that the generated Poisson disks are uniformly and randomly distributed without bias. It is worth noting that our method is intrinsic and independent of the embedding space. This intrinsic feature allows us to generate Poisson disk patterns on arbitrary surfaces in R^n. To our knowledge, this is the first intrinsic, parallel, and accurate algorithm for surface Poisson disk sampling. Furthermore, by manipulating the spatially varying density function, we can obtain adaptive sampling easily.
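The minimum-distance property that defines a Poisson disk pattern can be illustrated with the simplest serial baseline, naive dart throwing in the unit square. This planar rejection sketch is only a point of reference; the paper's contribution is a parallel, priority-based method that works intrinsically on curved surfaces.

```python
# Sketch: naive dart-throwing Poisson disk sampling in the unit square.
# Each candidate is accepted only if it is at least r away from all
# previously accepted samples.
import random

def poisson_disk_darts(r: float, n_attempts: int = 5000, seed: int = 0):
    random.seed(seed)
    pts = []
    for _ in range(n_attempts):
        p = (random.random(), random.random())
        # accept only if no existing sample lies within distance r
        if all((p[0] - q[0])**2 + (p[1] - q[1])**2 >= r * r for q in pts):
            pts.append(p)
    return pts

samples = poisson_disk_darts(r=0.1)
print(len(samples), "samples accepted; all pairs at least r apart")
```

Dart throwing is inherently sequential (each acceptance depends on all earlier ones), which is exactly the dependency the paper's random-priority scheme breaks to enable parallel conflict resolution.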
Revision of Primary Series Maps
2000-01-01
In 1992, the U.S. Geological Survey (USGS) completed a 50-year effort to provide primary series map coverage of the United States. Many of these maps now need to be updated to reflect the construction of new roads and highways and other changes that have taken place over time. The USGS has formulated a graphic revision plan to help keep the primary series maps current. Primary series maps include 1:20,000-scale quadrangles of Puerto Rico, 1:24,000- or 1:25,000-scale quadrangles of the conterminous United States, Hawaii, and U.S. Territories, and 1:63,360-scale quadrangles of Alaska. The revision of primary series maps from new collection sources is accomplished using a variety of processes. The raster revision process combines the scanned content of paper maps with raster updating technologies. The vector revision process involves the automated plotting of updated vector files. Traditional processes use analog stereoplotters and manual scribing instruments on specially coated map separates. The ability to select from or combine these processes increases the efficiency of the National Mapping Division map revision program.
NASA Astrophysics Data System (ADS)
Marwan, Norbert
2003-09-01
In this work, different aspects and applications of the recurrence plot analysis are presented. First, a comprehensive overview of recurrence plots and their quantification possibilities is given. New measures of complexity are defined by using geometrical structures of recurrence plots. These measures are capable of finding chaos-chaos transitions in processes. Furthermore, a bivariate extension to cross recurrence plots is studied. Cross recurrence plots exhibit characteristic structures which can be used for the study of differences between two processes or for the alignment and search for matching sequences of two data series. The selected applications of the introduced techniques to various kinds of data demonstrate their utility. Analysis of recurrence plots can be adapted to the specific problem and thus opens a wide field of potential applications. Regarding the quantification of recurrence plots, chaos-chaos transitions can be found in heart rate variability data before the onset of life-threatening cardiac arrhythmias, which may be of importance for the therapy of such arrhythmias. The quantification of recurrence plots allows the study of transitions in the brain during cognitive experiments on the basis of single trials; traditionally, finding these transitions requires averaging over a collection of single trials. Using cross recurrence plots, the existence of an El Niño/Southern Oscillation-like oscillation is traced in northwestern Argentina 34,000 years ago. In further applications to geological data, cross recurrence plots are used for time-scale alignment of different borehole data and for dating a geological profile with a reference data set. Additional examples from molecular biology and speech recognition emphasize the suitability of cross recurrence plots.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, Meng-Zheng; School of Physics and Electronic Information, Huaibei Normal University, Huaibei 235000; Ye, Liu, E-mail: yeliu@ahu.edu.cn
An efficient scheme is proposed to implement phase-covariant quantum cloning by using a superconducting transmon qubit coupled to a microwave cavity resonator in the strong dispersive limit of circuit quantum electrodynamics (QED). By solving the master equation numerically, we plot the Wigner function and Poisson distribution of the cavity mode after each operation in the cloning transformation sequence according to the two proposed logic circuits. The visualizations of the quasi-probability distribution in phase space for the cavity mode and the occupation probability distribution in the Fock basis enable us to follow the evolution of the cavity mode during the phase-covariant cloning (PCC) transformation. With the help of numerical simulation, we find that the present cloning machine is not an isotropic model because its output fidelity depends on the polar angle and the azimuthal angle of the initial input state on the Bloch sphere. The fidelity for the actual output clone of the present scheme is slightly smaller than that of the theoretical case. The simulation results are consistent with the theoretical ones. This further corroborates that our scheme based on circuit QED can efficiently implement the PCC transformation.
Study of microvascular non-Newtonian blood flow modulated by electroosmosis.
Tripathi, Dharmendra; Yadav, Ashu; Anwar Bég, O; Kumar, Rakesh
2018-05-01
An analytical study of microvascular non-Newtonian blood flow is conducted incorporating the electro-osmosis phenomenon. Blood is considered as a Bingham rheological aqueous ionic solution. An externally applied static axial electrical field is imposed on the system. The Poisson-Boltzmann equation for the electrical potential distribution is implemented to accommodate the electrical double layer in the microvascular regime. With long-wavelength, lubrication and Debye-Hückel approximations, the boundary value problem is rendered non-dimensional. Analytical solutions are derived for the axial velocity, volumetric flow rate, pressure gradient, averaged volumetric flow rate along one time period, pressure rise along one wavelength and stream function. A plug-flow width is featured in the solutions. Via symbolic software (Mathematica), graphical plots are generated for the influence of the Bingham plug flow width parameter, electrical Debye length and Helmholtz-Smoluchowski velocity (maximum electro-osmotic velocity) on the key hydrodynamic variables. This study reveals that the blood flow rate accelerates with decreasing plug width (i.e. the viscoplastic nature of the fluid) and also with increasing Debye length parameter. Copyright © 2018 Elsevier Inc. All rights reserved.
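The Newtonian, non-peristaltic limit of such electroosmotic flows has a well-known closed form under the Debye-Hückel approximation: u(y) = U_HS·(1 − cosh(y/λ_D)/cosh(h/λ_D)) in a slit channel of half-width h, where U_HS is the Helmholtz-Smoluchowski velocity. The sketch below uses illustrative parameter values; the paper layers Bingham plug-flow rheology and wall peristalsis on top of this baseline.

```python
# Sketch: classical electroosmotic velocity profile in a slit channel under
# the Debye-Hueckel approximation. Parameter values are illustrative assumptions.
import math

def eo_velocity(y: float, h: float, debye_len: float, U_HS: float) -> float:
    """Electroosmotic axial velocity at height y in a channel of half-width h."""
    return U_HS * (1.0 - math.cosh(y / debye_len) / math.cosh(h / debye_len))

h, ld, U = 50e-6, 1e-6, 1e-4      # half-width (m), Debye length (m), HS velocity (m/s), assumed
for y in (0.0, 0.5 * h, h):
    print(f"y/h = {y / h:.1f} -> u = {eo_velocity(y, h, ld, U):.3e} m/s")
```

Because the Debye length is much smaller than the channel half-width here, the velocity is essentially uniform at U_HS across the core and drops to zero only inside the thin double layer at the wall.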
Electronic growth charts: watching our patients grow.
Murphy, Cynthia A; Carstens, Kimberly; Villamayor, Precy
2005-01-01
Pediatric growth charts have been used in the pediatric community since 1977. The first growth charts were developed by the National Center for Health Statistics as a clinical tool for health care professionals. The growth charts, revised in 2000 by the Centers for Disease Control and Prevention, consist of a series of percentile curves for selected body measurements in children [1]. Capitalizing on the benefits of our Electronic Medical Record (EMR), and as a byproduct of nursing electronic documentation of routine heights, weights, and frontal-occipital circumferences, our system plots the routine measurements without additional intervention by the staff. Clinicians can view the graphs online or generate printed reports as needed during routine examination for outpatient or hospitalized care. This abstract outlines the background, design process, programming rules utilized to plot growth curves, and the evaluation of the electronic CDC growth charts in our organization.
Estimation of Curve Tracing Time in Supercapacitor based PV Characterization
NASA Astrophysics Data System (ADS)
Basu Pal, Sudipta; Das Bhattacharya, Konika; Mukherjee, Dipankar; Paul, Debkalyan
2017-08-01
Smooth and noise-free characterisation of photovoltaic (PV) generators has been revisited with renewed interest in view of large PV arrays making inroads into the urban sector of major developing countries. Such characterisation has recently been hampered by the lack of a suitable data acquisition system and of a supporting theoretical analysis to justify the accuracy of curve tracing. However, the use of a selected bank of supercapacitors can mitigate these problems to a large extent. Assuming a piecewise linear analysis of the V-I characteristics of a PV generator, an accurate analysis of curve plotting time has been possible. The analysis has been extended to consider the effect of the equivalent series resistance of the supercapacitor, leading to increased accuracy (90-95%) of curve plotting times.
Naziroglu, Hayriye Nevin; Durmaz, Mustafa; Bozkurt, Selahattin; Sirit, Abdulkadir
2011-07-01
Four proline-derived chiral receptors 5-8 were readily synthesized starting from L-proline. The enantiomeric recognition ability of the chiral receptors was examined with a series of carboxylic acids by ¹H NMR spectroscopy. The molar ratio and the association constants of the chiral compounds with each of the enantiomers of the guest molecules were determined by using Job plots and a nonlinear least-squares fitting method, respectively. The Job plots indicate that the hosts form 1:1 instantaneous complexes with all guests. The receptors exhibited different chiral recognition abilities toward the enantiomers of racemic guests. Among the chiral receptors used in this study, prolinamide 6 was found to be the best chiral shift reagent and is effective for the determination of the enantiomeric excess of chiral carboxylic acids. Copyright © 2011 Wiley-Liss, Inc.
Improvements to the fastex flutter analysis computer code
NASA Technical Reports Server (NTRS)
Taylor, Ronald F.
1987-01-01
Modifications to the FASTEX flutter analysis computer code (UDFASTEX) are described. The objectives were to increase the problem size capacity of FASTEX, reduce run times by modification of the modal interpolation procedure, and add new user features. All modifications to the program are operable on the VAX 11/700 series computers under the VAX operating system. Interfaces were provided to aid in the inclusion of alternate aerodynamic and flutter eigenvalue calculations. Plots can be made of the flutter velocity, damping, and frequency data. A preliminary capability was also developed to plot contours of unsteady pressure amplitude and phase. The relevant equations of motion, modal interpolation procedures, and control system considerations are described and software developments are summarized. Additional information documenting input instructions, procedures, and details of the plate spline algorithm is found in the appendices.
Poisson Spot with Magnetic Levitation
ERIC Educational Resources Information Center
Hoover, Matthew; Everhart, Michael; D'Arruda, Jose
2010-01-01
In this paper we describe a unique method for obtaining the famous Poisson spot without adding obstacles to the light path, which could interfere with the effect. A Poisson spot is the interference effect from parallel rays of light diffracting around a solid spherical object, creating a bright spot in the center of the shadow.
Modelling infant mortality rate in Central Java, Indonesia using the generalized Poisson regression method
NASA Astrophysics Data System (ADS)
Prahutama, Alan; Sudarno
2018-05-01
The infant mortality rate is the number of deaths under one year of age occurring among the live births in a given geographical area during a given year, per 1,000 live births occurring among the population of that area during the same year. This problem needs to be addressed because it is an important element of a country's economic development. A high infant mortality rate will disrupt the stability of a country as it relates to the sustainability of its population. One regression model that can be used to analyze the relationship between a discrete dependent variable Y and independent variables X is the Poisson regression model. Regression models recently used for discrete dependent variables include, among others, Poisson regression, negative binomial regression and generalized Poisson regression. In this research, generalized Poisson regression modelling gives a better AIC value than Poisson regression. The most significant variable is the number of health facilities (X1), while the variable with the most influence on the infant mortality rate is average breastfeeding (X9).
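The ordinary Poisson regression that the generalized model extends can be fitted in a few lines by Newton-Raphson. The sketch below uses synthetic data, not the Central Java covariates; the generalized Poisson model adds a dispersion parameter on top of this same likelihood machinery.

```python
# Sketch: Poisson regression y ~ Poisson(exp(X @ beta)) fitted by
# Newton-Raphson (equivalently, Fisher scoring). Synthetic data.
import numpy as np

rng = np.random.default_rng(7)
n = 2000
X = np.column_stack([np.ones(n), rng.normal(size=n)])   # intercept + one covariate
true_beta = np.array([0.5, 0.3])
y = rng.poisson(np.exp(X @ true_beta))

beta = np.zeros(2)
for _ in range(25):
    mu = np.exp(X @ beta)
    grad = X.T @ (y - mu)                      # score vector
    hess = X.T @ (X * mu[:, None])             # Fisher information
    beta = beta + np.linalg.solve(hess, grad)  # Newton step

print("estimated beta:", beta)                 # close to [0.5, 0.3]
```

Model comparison by AIC, as in the abstract, would then be 2k − 2·loglik evaluated at the fitted parameters for each candidate model.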
Modeling health survey data with excessive zero and K responses.
Lin, Ting Hsiang; Tsai, Min-Hsiao
2013-04-30
Zero-inflated Poisson regression is a popular tool for analyzing data with excessive zeros. Although much work has been done on fitting zero-inflated data, most models depend heavily on special features of the individual data set; specifically, a sizable group of respondents endorse the same answers, producing peaks in the data. In this paper, we propose a new model with the flexibility to model excessive counts other than zero. The model is a mixture of multinomial logistic and Poisson regression, in which the multinomial logistic component models the occurrence of excessive counts, including zeros, K (where K is a positive integer), and all other values, and the Poisson regression component models the counts that are assumed to follow a Poisson distribution. Two examples illustrate our models when the data contain many ones and sixes. The zero-inflated and K-inflated models exhibit a better fit than the zero-inflated Poisson and standard Poisson regressions. Copyright © 2012 John Wiley & Sons, Ltd.
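The mixture distribution the abstract describes can be written down directly: extra point masses at 0 and at K sit on top of a deflated Poisson. A minimal sketch (the parameter values and K = 6 are hypothetical, chosen only to match the abstract's "many sixes" example):

```python
import math

def poisson_pmf(y, lam):
    """Standard Poisson probability mass function."""
    return math.exp(y * math.log(lam) - lam - math.lgamma(y + 1))

def inflated_pmf(y, lam, p0, pK, K):
    """Zero-and-K-inflated Poisson: mixture weight p0 at 0, pK at K,
    and the remaining mass (1 - p0 - pK) spread as Poisson(lam)."""
    prob = (1.0 - p0 - pK) * poisson_pmf(y, lam)
    if y == 0:
        prob += p0
    if y == K:
        prob += pK
    return prob

# Hypothetical parameters: inflation at 0 and at K = 6.
lam, p0, pK, K = 2.0, 0.15, 0.10, 6

# The mixture is a proper distribution: its mass sums to 1.
total = sum(inflated_pmf(y, lam, p0, pK, K) for y in range(100))
print(round(total, 6))  # → 1.0
```

In the full model, p0, pK, and lam would each be linked to covariates (multinomial logistic for the inflation weights, log link for the Poisson rate); here they are fixed constants to show the distributional form only.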
Development of a Methodology for the Rapid Detection of Coliform Bacteria.
1981-02-27
Figure titles (recovered from list-of-figures residue): Micelle Concentration Determination of Sodium Lauryl Sulfate; Sheath Flow Measuring Chamber; Negative Substrate Control Comparisons. Text excerpts: "…the net result being a net increase in the level of detectability. Sodium lauryl sulfate was chosen as the candidate surfactant and used at its…"; "…determined experimentally by taking conductivity measurements for a concentration series of sodium lauryl sulfate. Plotting equivalent conductivity vs…"
Longleaf pine wood and straw yields from two old-field planted sites in Georgia
E. David Dickens; David J. Moorhead; Bryan C. McElvany; Ray Hicks
2012-01-01
Little is known or published concerning longleaf pine's growth rate or wood and pine straw yields on old-field sites. Two study areas were installed in unthinned longleaf plantations established on former old fields in Screven and Tift Counties, Georgia, to address pine growth and straw yields. Soil series were delineated, and replicated plots with three levels of...
enhancedGraphics: a Cytoscape app for enhanced node graphics
Morris, John H.; Kuchinsky, Allan; Ferrin, Thomas E.; Pico, Alexander R.
2014-01-01
enhancedGraphics ( http://apps.cytoscape.org/apps/enhancedGraphics) is a Cytoscape app that implements a series of enhanced charts and graphics that may be added to Cytoscape nodes. It enables users and other app developers to create pie, line, bar, and circle plots that are driven by columns in the Cytoscape Node Table. Charts are drawn using vector graphics to allow full-resolution scaling. PMID:25285206
B.E. Borders; Rodney E. Will; R. L. Hendrick; D. Markewitz; T. Harrington; R. O. Teskey; A. Clark
2002-01-01
Beginning in 1987, a series of long-term study plots were installed to determine the effects of annual nitrogen fertilization and complete control of competing vegetation on loblolly pine (Pinus taeda L.) stand growth and development. The study had two locations, one at the Dixon State Forest (DSF) near Waycross, GA on the lower coastal plain and...
Johnny L. Boggs; Steven G. McNulty
2010-01-01
The objective of this study is to describe winter and summer surface air and forest floor temperature patterns and diurnal fluctuations in high-elevation red spruce (Picea rubens Sarg.) forests with different levels of canopy cover. In 1988, a series of 10- x 10-meter plots (control, low nitrogen [N] addition, and high nitrogen addition) were...
Margaret R. Holdaway
1994-01-01
Describes Geo-CLM, a computer application (for Mac or DOS) whose primary aim is to perform multiple kriging runs to interpolate the historic climatic record at research plots in the Lake States. It is an exploration and analysis tool. Additional capabilities include climatic databases, a flexible test mode, cross validation, lat/long conversion, English/metric units,...
Ram Deo; Matthew Russell; Grant Domke; Hans-Erik Andersen; Warren Cohen; Christopher Woodall
2017-01-01
Large-area assessment of aboveground tree biomass (AGB) to inform regional or national forest monitoring programs can be efficiently carried out by combining remotely sensed data and field sample measurements through a generic statistical model, in contrast to site-specific models. We integrated forest inventory plot data with spatial predictors from Landsat time-...
ERIC Educational Resources Information Center
Crozier, Carl
This guide provides agricultural extensionists with basic information that will help them design plans for the conservation of soils and the management of water runoff in specific agricultural plots. It is based on experiences with small hillside farms in Honduras and takes into account the resources and constraints commonly encountered there.…
Characterization of vegetation by microwave and optical remote sensing
NASA Technical Reports Server (NTRS)
Daughtry, C. S. T. (Principal Investigator); Ranson, K. J.; Biehl, L. L.
1986-01-01
Two series of carefully controlled experiments were conducted. First, plots of important crops (corn, soybeans, and sorghum), prairie grasses (big bluestem, switchgrass, tall fescue, orchardgrass, bromegrass), and forage legumes (alfalfa, red clover, and crown vetch) were manipulated to produce wide ranges of phytomass, leaf area index, and canopy architecture. Second, coniferous forest canopies were simulated using small balsam fir trees grown in large pots of soil and arranged systematically on a large (5 m) platform. Rotating the platform produced many new canopies for frequency and spatial averaging of the backscatter signal. In both series of experiments, backscatter at 5.0 GHz (C-band) was measured as a function of view angle and polarization. Biophysical measurements included leaf area index, fresh and dry phytomass, water content of canopy elements, canopy height, and soil roughness and moisture content. For a subset of the above plots, additional measurements were acquired to exercise microwave backscatter models. These measurements included size and shape of leaves, stems, and fruit and the probability density function of leaf and stem angles. The relationships of the backscattering coefficients and the biophysical properties of the canopies were evaluated using statistical correlations, analysis of variance, and regression analysis. Results from the corn density and balsam fir experiments are discussed and analyses of data from the other experiments are summarized.
Anisotropic scattering of discrete particle arrays.
Paul, Joseph S; Fu, Wai Chong; Dokos, Socrates; Box, Michael
2010-05-01
Far-field intensities of light scattered from a linear centro-symmetric array illuminated by a plane wave of incident light are estimated at a series of detector angles. The intensities are computed from the superposition of E-fields scattered by the individual array elements. An average scattering phase function is used to model the scattered fields of individual array elements. The nature of scattering from the array is investigated using an image (theta-phi plot) of the far-field intensities computed at a series of locations obtained by rotating the detector angle from 0 degrees to 360 degrees, corresponding to each angle of incidence in the interval [0 degrees, 360 degrees]. The diffraction patterns observed from the theta-phi plot are compared with those for isotropic scattering. In the absence of prior information on the array geometry, the intensities corresponding to theta-phi pairs satisfying the Bragg condition are used to estimate the phase function. An algorithmic procedure is presented for this purpose and tested using synthetic data. The relative error between estimated and theoretical values of the phase function is shown to be determined by the mean spacing factor, the number of elements, and the far-field distance. An empirical relationship is presented to calculate the optimal far-field distance for a given specification of the percentage error.
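The coherent superposition step can be sketched as follows. This is a schematic stand-in for the paper's model, not its actual code: the array geometry, the unit-amplitude isotropic phase function, and the scalar-field treatment are all simplifying assumptions. When the path difference per element spacing is an integer number of wavelengths (the Bragg condition), all terms add in phase and the intensity peaks at N squared.

```python
import cmath
import math

def array_intensity(theta_det, theta_inc, positions, wavelength, phase_fn):
    """Far-field intensity from a linear array: coherent sum over elements,
    each weighted by an average scattering phase function and a geometric
    phase set by the element position along the array axis."""
    k = 2 * math.pi / wavelength
    field = sum(
        phase_fn(theta_det - theta_inc)
        * cmath.exp(1j * k * x * (math.sin(theta_inc) - math.sin(theta_det)))
        for x in positions
    )
    return abs(field) ** 2

# Hypothetical isotropic phase function and a 5-element centro-symmetric array.
phase_fn = lambda dtheta: 1.0
d = 1.5  # element spacing in units of the wavelength
positions = [i * d for i in range(-2, 3)]

# Forward direction (detector angle = incidence angle): all phases cancel,
# so the intensity is N**2 = 25.
I0 = array_intensity(0.0, 0.0, positions, 1.0, phase_fn)
print(I0)
```

Scanning `theta_det` over 0 to 360 degrees for each `theta_inc` and stacking the rows would produce the theta-phi image the abstract describes.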
Hydrodynamic and suspended-solids concentration measurements in Suisun Bay, California, 1995
Cuetara, Jay I.; Burau, Jon R.; Schoellhamer, David H.
2001-01-01
Sea level, current velocity, water temperature, salinity (computed from conductivity and temperature), and suspended-solids data collected in Suisun Bay, California, from May 30, 1995, through October 27, 1995, by the U.S. Geological Survey are documented in this report. Data were collected concurrently at 21 sites. Various parameters were measured at each site. Velocity-profile data were collected at 6 sites, single-point velocity measurements were made at 9 sites, salinity data were collected at 20 sites, and suspended-solids concentrations were measured at 10 sites. Sea-level and velocity data are presented in three forms: harmonic analysis results; time-series plots (sea level, current speed, and current direction versus time); and time-series plots of low-pass-filtered time series. Temperature, salinity, and suspended-solids data are presented as plots of raw and low-pass-filtered time series. The velocity and salinity data presented in this report document a period when the residual current patterns and salt field were transitioning from a freshwater-inflow-dominated condition towards a quasi steady-state summer condition when density-driven circulation and tidal nonlinearities became relatively more important as long-term transport mechanisms. Sacramento-San Joaquin River Delta outflow was high prior to and during this study, so the tidally averaged salinities were abnormally low for this time of year. For example, the tidally averaged salinities varied from 0-12 at Martinez, the western border of Suisun Bay, to a maximum of 2 at Mallard Island, the eastern border of Suisun Bay. Even though salinities increased overall in Suisun Bay during the study period, the near-bed residual currents primarily were directed seaward.
Therefore, salinity intrusion through Suisun Bay towards the Delta primarily was accomplished in the absence of the tidally averaged, two-layer flow known as gravitational circulation where, by definition, the net currents are landward at the bed. The Folsom Dam spillway gate failure on July 17, 1995, was analyzed to determine its effect on the hydrodynamics of Suisun Bay. As a result of the failure, the peak flow of the American River reached roughly 1,000 cubic meters per second, which is relatively small: roughly 15 percent of the approximately 7,000 cubic meters per second tidal flows that occur daily in Suisun Bay, and likely attenuated greatly before reaching the bay. Based on analysis of tidally averaged near-bed salinity and depth-averaged currents after the failure, the effect was essentially nonexistent and indistinguishable from the natural variability.
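The low-pass-filtered (tidally averaged) series in reports like this are typically produced with a tidal filter that suppresses diurnal and semidiurnal energy. A crude stand-in is a centered running mean spanning roughly one tidal day; the 25-hour window, the hourly synthetic record, and the 12.42-hour M2-like tide below are illustrative assumptions, not the report's actual filter or data.

```python
import math

def running_mean(series, window):
    """Crude low-pass filter: centered moving average. A window spanning
    about one tidal day (25 hourly samples) nearly cancels semidiurnal
    and diurnal oscillations, leaving the residual (subtidal) signal."""
    half = window // 2
    out = []
    for i in range(len(series)):
        lo, hi = max(0, i - half), min(len(series), i + half + 1)
        chunk = series[lo:hi]
        out.append(sum(chunk) / len(chunk))
    return out

# Synthetic hourly "current speed": a 0.2 m/s residual plus an M2-like
# tidal oscillation (12.42-hour period, 0.5 m/s amplitude).
raw = [0.2 + 0.5 * math.sin(2 * math.pi * h / 12.42) for h in range(240)]
filtered = running_mean(raw, 25)

# Away from the record ends, the filtered series hovers near the
# 0.2 m/s residual that the tide was masking.
print(round(filtered[120], 3))
```

Operational tidal filters (e.g., successive running means) roll off the tidal bands more sharply than a single moving average, but the idea of averaging over the tidal period to expose residual currents is the same.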