Balsillie, J.H.; Donoghue, J.F.; Butler, K.M.; Koch, J.L.
2002-01-01
Two-dimensional plotting tools can be of invaluable assistance in analytical scientific pursuits and have been widely used in the analysis and interpretation of sedimentologic data. In this work we consider the use of arithmetic probability paper (APP). Most statistical computer applications do not allow for the generation of APP plots because of the apparently intractable nonlinearity of the percentile (or probability) axis of the plot. We have solved this problem by identifying equations for determining plotting positions of Gaussian percentiles (or probabilities), so that APP plots can easily be computer generated. An EXCEL example is presented, and a programmed, simple-to-use EXCEL application template is hereby made publicly available, whereby a complete granulometric analysis, including data listing, moment measure calculations, and frequency and cumulative APP plots, is automatically produced.
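The probit transform is the kind of equation the abstract refers to: placing each cumulative percentage at the corresponding standard-normal quantile makes a Gaussian cumulative curve plot as a straight line. A minimal sketch (in Python rather than the authors' EXCEL template, assuming scipy is available):

```python
# Sketch: computer-generated arithmetic-probability-paper (APP) axis positions.
# A Gaussian cumulative curve becomes a straight line when each cumulative
# percentage P is plotted at the standard-normal quantile z(P).
import numpy as np
from scipy.stats import norm

def app_positions(cumulative_percent):
    """Map cumulative percentages (0-100, exclusive) to linear plot positions."""
    p = np.asarray(cumulative_percent) / 100.0
    return norm.ppf(p)  # probit transform: linearizes the probability axis

# Example: Gaussian cumulative percentages become evenly spaced in z
print(app_positions([2.28, 15.87, 50.0, 84.13, 97.72]))  # approx. [-2, -1, 0, 1, 2]
```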
Normal probability plots with confidence.
Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang
2015-01-01
Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means of judging whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal-probability-plot-based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
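To illustrate the idea, here is a minimal sketch of an augmented normal probability plot. Note it uses pointwise beta order-statistic intervals with plug-in mean and standard deviation, which is a weaker guarantee than the paper's simultaneous 1-α intervals:

```python
# Sketch of an augmented normal probability plot. The paper derives simultaneous
# 1-alpha intervals; for illustration only, pointwise intervals from the beta
# distribution of uniform order statistics are used here.
import numpy as np
from scipy.stats import norm, beta

def augmented_qq(x, alpha=0.05):
    x = np.sort(x)
    n = len(x)
    k = np.arange(1, n + 1)
    q = norm.ppf((k - 0.375) / (n + 0.25))  # Blom plotting positions
    # Pointwise interval for the k-th order statistic, mapped to the data scale
    lo = norm.ppf(beta.ppf(alpha / 2, k, n - k + 1))
    hi = norm.ppf(beta.ppf(1 - alpha / 2, k, n - k + 1))
    m, s = x.mean(), x.std(ddof=1)
    inside = (x >= m + s * lo) & (x <= m + s * hi)
    return q, x, inside.all()  # plot x against q; doubt normality if a point escapes
```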
NASA Astrophysics Data System (ADS)
Ahn, Hyunjun; Jung, Younghun; Om, Ju-Seong; Heo, Jun-Haeng
2014-05-01
Selecting an appropriate probability distribution is very important in statistical hydrology. A goodness-of-fit test is a statistical method for selecting an appropriate probability model for a given data set. The probability plot correlation coefficient (PPCC) test, one of the goodness-of-fit tests, was originally developed for the normal distribution. Since then, this test has been widely applied to other probability models. The PPCC test is known as one of the best goodness-of-fit tests because it shows higher rejection power than the others. In this study, we focus on PPCC tests for the GEV distribution, which is widely used worldwide. For the GEV model, several plotting position formulas have been suggested. However, the PPCC statistics are derived only for the plotting position formulas (Goel and De, In-na and Nguyen, and Kim et al.) in which the skewness coefficient (or shape parameter) is included. The regression equations are then derived as a function of the shape parameter and sample size for a given significance level. In addition, the rejection powers of these formulas are compared using Monte Carlo simulation. Keywords: Goodness-of-fit test, Probability plot correlation coefficient test, Plotting position, Monte Carlo simulation. ACKNOWLEDGEMENTS: This research was supported by a grant, 'Establishing Active Disaster Management System of Flood Control Structures by using 3D BIM Technique' [NEMA-12-NH-57], from the Natural Hazard Mitigation Research Group, National Emergency Management Agency of Korea.
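A hedged sketch of the PPCC statistic itself, using the generic Cunnane plotting position as a stand-in for the paper's skewness-dependent formulas (scipy's genextreme shape-sign convention assumed):

```python
# Illustrative PPCC statistic for the GEV model: the correlation between the
# ordered sample and GEV quantiles evaluated at chosen plotting positions.
import numpy as np
from scipy.stats import genextreme

def ppcc_gev(sample, shape, a=0.40):
    x = np.sort(sample)
    n = len(x)
    p = (np.arange(1, n + 1) - a) / (n + 1 - 2 * a)  # plotting positions
    q = genextreme.ppf(p, shape)                     # fitted-model quantiles
    return np.corrcoef(x, q)[0, 1]                   # near 1 => good fit
```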
Brains striving for coherence: Long-term cumulative plot formation in the default mode network.
Tylén, K; Christensen, P; Roepstorff, A; Lund, T; Østergaard, S; Donald, M
2015-11-01
Many everyday activities, such as engaging in conversation or listening to a story, require us to sustain attention over a prolonged period of time while integrating and synthesizing complex episodic content into a coherent mental model. Humans are remarkably capable of navigating and keeping track of all the parallel social activities of everyday life even when confronted with interruptions or changes in the environment. However, the underlying cognitive and neurocognitive mechanisms of such long-term integration and profiling of information remain a challenge to neuroscience. While brain activity is generally traceable within the short time frame of working memory (milliseconds to seconds), these integrative processes last for minutes, hours or even days. Here we report two experiments on story comprehension. Experiment I establishes a cognitive dissociation between our comprehension of plot and incidental facts in narratives: when episodic material allows for long-term integration in a coherent plot, we recall fewer factual details. However, when plot formation is challenged, we pay more attention to incidental facts. Experiment II investigates the neural underpinnings of plot formation. Results suggest a central role for the brain's default mode network related to comprehension of coherent narratives, while incoherent episodes rather activate the frontoparietal control network. Moreover, an analysis of cortical activity as a function of the cumulative integration of narrative material into a coherent story reveals linear modulations of right hemisphere posterior temporal and parietal regions. Together these findings point to key neural mechanisms involved in the fundamental human capacity for cumulative plot formation. Copyright © 2015 Elsevier Inc. All rights reserved.
Cumulative probability of neodymium: YAG laser posterior capsulotomy after phacoemulsification.
Ando, Hiroshi; Ando, Nobuyo; Oshika, Tetsuro
2003-11-01
To retrospectively analyze the cumulative probability of neodymium:YAG (Nd:YAG) laser posterior capsulotomy after phacoemulsification and to evaluate the risk factors. Ando Eye Clinic, Kanagawa, Japan. In 3997 eyes that had phacoemulsification with an intact continuous curvilinear capsulorhexis, the cumulative probability of posterior capsulotomy was computed by Kaplan-Meier survival analysis and risk factors were analyzed using the Cox proportional hazards regression model. The variables tested were sex; age; type of cataract; preoperative best corrected visual acuity (BCVA); presence of diabetes mellitus, diabetic retinopathy, or retinitis pigmentosa; type of intraocular lens (IOL); and the year the operation was performed. The IOLs were categorized as 3-piece poly(methyl methacrylate) (PMMA), 1-piece PMMA, 3-piece silicone, and acrylic foldable. The cumulative probability of capsulotomy after cataract surgery was 1.95%, 18.50%, and 32.70% at 1, 3, and 5 years, respectively. Positive risk factors included a better preoperative BCVA (P =.0005; risk ratio [RR], 1.7; 95% confidence interval [CI], 1.3-2.5) and the presence of retinitis pigmentosa (P<.0001; RR, 6.6; 95% CI, 3.7-11.6). Women had a significantly greater probability of Nd:YAG laser posterior capsulotomy (P =.016; RR, 1.4; 95% CI, 1.1-1.8). The type of IOL was significantly related to the probability of Nd:YAG laser capsulotomy, with the foldable acrylic IOL having a significantly lower probability of capsulotomy. The 1-piece PMMA IOL had a significantly higher risk than 3-piece PMMA and 3-piece silicone IOLs. The probability of Nd:YAG laser capsulotomy was higher in women, in eyes with a better preoperative BCVA, and in patients with retinitis pigmentosa. The foldable acrylic IOL had a significantly lower probability of capsulotomy.
NASA Astrophysics Data System (ADS)
Luo, Hanjun; Ouyang, Zhengbiao; Liu, Qiang; Chen, Zhiliang; Lu, Hualan
2017-10-01
Cumulative pulse detection with an appropriate cumulative pulse number and threshold can improve the detection performance of a pulsed laser ranging system with a GM-APD. In this paper, based on Poisson statistics and the multi-pulse cumulative process, the cumulative detection probabilities and the factors influencing them are investigated. With the normalized probability distribution of each time bin, a theoretical model of the range accuracy and precision is established, and the factors limiting the range accuracy and precision are discussed. The results show that cumulative pulse detection can produce a higher target detection probability and a lower false alarm probability. However, for a heavy noise level and extremely weak echo intensity, the false-alarm suppression performance of cumulative pulse detection deteriorates quickly. The range accuracy and precision is another important parameter for evaluating detection performance. The echo intensity and pulse width are its main influencing factors: higher range accuracy and precision is acquired with stronger echo intensity and narrower echo pulse width. For a 5-ns echo pulse width, when the echo intensity is larger than 10, a range accuracy and precision better than 7.5 cm can be achieved.
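A toy version of the cumulative detection calculation under the stated Poisson assumptions, ignoring dead time and blocking by earlier bins; the pulse counts and mean photon numbers below are illustrative, not values from the paper:

```python
# Toy model of cumulative-pulses detection with a GM-APD: a bin fires with
# probability 1 - exp(-mean count); the target is declared over N shots when
# the signal bin fires at least k times (binomial tail).
import math
from scipy.stats import binom

def fire_prob(mean_count):                   # single-shot avalanche probability
    return 1.0 - math.exp(-mean_count)

def cumulative_detection(n_pulses, k_thresh, n_signal, n_noise):
    p_sig = fire_prob(n_signal + n_noise)    # echo + noise in the signal bin
    p_noise = fire_prob(n_noise)             # noise-only bin -> false alarms
    Pd = binom.sf(k_thresh - 1, n_pulses, p_sig)
    Pfa = binom.sf(k_thresh - 1, n_pulses, p_noise)
    return Pd, Pfa

print(cumulative_detection(n_pulses=20, k_thresh=5, n_signal=1.0, n_noise=0.05))
```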
Resonances in the cumulative reaction probability for a model electronically nonadiabatic reaction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Qi, J.; Bowman, J.M.
1996-05-01
The cumulative reaction probability, flux-flux correlation function, and rate constant are calculated for a model, two-state, electronically nonadiabatic reaction, given by Shin and Light [S. Shin and J. C. Light, J. Chem. Phys. 101, 2836 (1994)]. We apply straightforward generalizations of the flux matrix/absorbing boundary condition approach of Miller and co-workers to obtain these quantities. The upper adiabatic electronic potential supports bound states, and these manifest themselves as "recrossing" resonances in the cumulative reaction probability, at total energies above the barrier to reaction on the lower adiabatic potential. At energies below the barrier, the cumulative reaction probability for the coupled system is shifted to higher energies relative to the one obtained for the ground state potential. This is due to the effect of an additional effective barrier caused by the nuclear kinetic operator acting on the ground state, adiabatic electronic wave function, as discussed earlier by Shin and Light. Calculations are reported for five sets of electronically nonadiabatic coupling parameters. © 1996 American Institute of Physics.
Manktelow, Bradley N.; Seaton, Sarah E.
2012-01-01
Background: Emphasis is increasingly being placed on the monitoring and comparison of clinical outcomes between healthcare providers. Funnel plots have become a standard graphical methodology to identify outliers and comprise plotting an outcome summary statistic from each provider against a specified ‘target’ together with upper and lower control limits. With discrete probability distributions it is not possible to specify the exact probability that an observation from an ‘in-control’ provider will fall outside the control limits. However, general probability characteristics can be set and specified using interpolation methods. Guidelines recommend that providers falling outside such control limits should be investigated, potentially with significant consequences, so it is important that the properties of the limits are understood.
Methods: Control limits for funnel plots for the Standardised Mortality Ratio (SMR) based on the Poisson distribution were calculated using three proposed interpolation methods and the probability calculated of an ‘in-control’ provider falling outside of the limits. Examples using published data were shown to demonstrate the potential differences in the identification of outliers.
Results: The first interpolation method ensured that the probability of an observation of an ‘in control’ provider falling outside either limit was always less than a specified nominal probability (p). The second method resulted in such an observation falling outside either limit with a probability that could be either greater or less than p, depending on the expected number of events. The third method led to a probability that was always greater than, or equal to, p.
Conclusion: The use of different interpolation methods can lead to differences in the identification of outliers. This is particularly important when the expected number of events is small. We recommend that users of these methods be aware of the differences, and specify which
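A sketch of one plausible interpolation variant for a Poisson funnel-plot limit (not necessarily one of the three methods compared in the paper): find the integer quantile, then interpolate linearly across the jump in the discrete CDF:

```python
# Sketch of an upper funnel-plot control limit for an SMR with Poisson counts.
from scipy.stats import poisson

def upper_smr_limit(E, p=0.025):
    k = poisson.ppf(1 - p, E)                 # smallest k with CDF >= 1 - p
    c_hi, c_lo = poisson.cdf(k, E), poisson.cdf(k - 1, E)
    frac = (1 - p - c_lo) / (c_hi - c_lo)     # interpolate within the CDF jump
    return (k - 1 + frac) / E                 # limit on the SMR scale

for E in (5, 20, 80):                         # limits tighten as E grows
    print(E, round(upper_smr_limit(E), 3))
```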
Ordinal probability effect measures for group comparisons in multinomial cumulative link models.
Agresti, Alan; Kateri, Maria
2017-03-01
We consider simple ordinal model-based probability effect measures for comparing distributions of two groups, adjusted for explanatory variables. An "ordinal superiority" measure summarizes the probability that an observation from one distribution falls above an independent observation from the other distribution, adjusted for explanatory variables in a model. The measure applies directly to normal linear models and to a normal latent variable model for ordinal response variables. It equals Φ(β/√2) for the corresponding ordinal model that applies a probit link function to cumulative multinomial probabilities, for standard normal cdf Φ and effect β that is the coefficient of the group indicator variable. For the more general latent variable model for ordinal responses that corresponds to a linear model with other possible error distributions and corresponding link functions for cumulative multinomial probabilities, the ordinal superiority measure equals exp(β)/[1+exp(β)] with the log-log link and equals approximately exp(β/√2)/[1+exp(β/√2)] with the logit link, where β is the group effect. Another ordinal superiority measure generalizes the difference of proportions from binary to ordinal responses. We also present related measures directly for ordinal models for the observed response that need not assume corresponding latent response models. We present confidence intervals for the measures and illustrate with an example. © 2016, The International Biometric Society.
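The closed forms quoted above are straightforward to evaluate; a minimal sketch (the function name and interface are mine, not the paper's):

```python
# Ordinal superiority: probability that an observation from group A exceeds an
# independent observation from group B, from the group coefficient beta.
import math
from scipy.stats import norm

def superiority(beta, link):
    if link == "probit":
        return norm.cdf(beta / math.sqrt(2))           # Phi(beta / sqrt(2))
    if link == "loglog":
        return math.exp(beta) / (1 + math.exp(beta))
    if link == "logit":                                # approximate form
        z = beta / math.sqrt(2)
        return math.exp(z) / (1 + math.exp(z))
    raise ValueError(link)

print(superiority(0.8, "probit"), superiority(0.8, "logit"))
```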
NEWTONP - CUMULATIVE BINOMIAL PROGRAMS
NASA Technical Reports Server (NTRS)
Bowerman, P. N.
1994-01-01
The cumulative binomial program, NEWTONP, is one of a set of three programs which calculate cumulative binomial probability distributions for arbitrary inputs. The three programs, NEWTONP, CUMBIN (NPO-17555), and CROSSER (NPO-17557), can be used independently of one another. NEWTONP can be used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. The program has been used for reliability/availability calculations. NEWTONP calculates the probability p required to yield a given system reliability V for a k-out-of-n system. It can also be used to determine the Clopper-Pearson confidence limits (either one-sided or two-sided) for the parameter p of a Bernoulli distribution. NEWTONP can determine Bayesian probability limits for a proportion (if the beta prior has positive integer parameters). It can determine the percentiles of incomplete beta distributions with positive integer parameters. It can also determine the percentiles of F distributions and the median plotting positions in probability plotting. NEWTONP is designed to work well with all integer values 0 < k <= n. To run the program, the user simply runs the executable version and inputs the information requested by the program. NEWTONP is not designed to weed out incorrect inputs, so the user must take care to make sure the inputs are correct. Once all input has been entered, the program calculates and lists the result. It also lists the number of iterations of Newton's method required to calculate the answer within the given error. The NEWTONP program is written in C. It was developed on an IBM AT with a numeric co-processor using Microsoft C 5.0. Because the source code is written using standard C structures and functions, it should compile correctly with most C compilers. The program format is interactive. It has been implemented under DOS 3.2 and has a memory requirement of 26K. NEWTONP was developed in 1988.
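A sketch of the core NEWTONP calculation as described: Newton's method for the component probability p that yields system reliability V in a k-out-of-n system, reporting the iteration count (a Python stand-in for the original C program):

```python
# Newton iteration for component reliability p in a k-out-of-n system.
from math import comb
from scipy.stats import binom

def solve_p(k, n, V, p=0.5, tol=1e-10, itmax=100):
    for i in range(1, itmax + 1):
        f = binom.sf(k - 1, n, p) - V                        # R(p) - V
        df = k * comb(n, k) * p**(k - 1) * (1 - p)**(n - k)  # dR/dp, closed form
        step = f / df
        p -= step
        if abs(step) < tol:
            return p, i                                      # p and iteration count
    raise RuntimeError("Newton iteration did not converge")

print(solve_p(k=3, n=5, V=0.95))
```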
Exact probability distribution function for the volatility of cumulative production
NASA Astrophysics Data System (ADS)
Zadourian, Rubina; Klümper, Andreas
2018-04-01
In this paper we study the volatility and its probability distribution function for cumulative production based on the experience curve hypothesis. This work presents a generalization of the study of volatility in Lafond et al. (2017), which addressed the effects of normally distributed noise in the production process. Due to its wide applicability in industrial and technological activities, we present here the mathematical foundation for an arbitrary distribution function of the process, which we expect will pave the way for future research on forecasting of the production process.
Decision making generalized by a cumulative probability weighting function
NASA Astrophysics Data System (ADS)
dos Santos, Lindomar Soares; Destefano, Natália; Martinez, Alexandre Souto
2018-01-01
Typical examples of intertemporal decision making involve situations in which individuals must choose between a smaller but more immediate reward and a larger one delivered later. Analogously, probabilistic decision making involves choices between options whose consequences differ in their probability of being received. In Economics, the expected utility theory (EUT) and the discounted utility theory (DUT) are traditionally accepted normative models for describing, respectively, probabilistic and intertemporal decision making. A large number of experiments have confirmed that the linearity assumed by the EUT does not explain some observed behaviors, such as nonlinear preference, risk-seeking and loss aversion. That observation led to the development of new theoretical models, called non-expected utility theories (NEUT), which include a nonlinear transformation of the probability scale. An essential feature of the so-called preference function of these theories is that the probabilities are transformed into decision weights by means of a (cumulative) probability weighting function, w(p). We obtain in this article a generalized function for the probabilistic discount process. This function has as particular cases mathematical forms already established in the literature, including discount models that consider effects of psychophysical perception. We also propose a new generalized function for the functional form of w. The limiting cases of this function encompass some parametric forms already proposed in the literature. Far beyond a mere generalization, our function allows the interpretation of probabilistic decision making theories based on the assumption that individuals behave similarly in the face of probabilities and delays, and is supported by phenomenological models.
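The article's generalized w is not reproduced in the abstract; as an illustration of a cumulative probability weighting function of the kind such generalizations recover as limiting cases, here is the one-parameter Tversky-Kahneman form:

```python
# Tversky-Kahneman weighting function: overweights small probabilities and
# underweights large ones (gamma = 0.61 is their often-quoted estimate).
def w_tk(p, gamma=0.61):
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

for p in (0.01, 0.1, 0.5, 0.9, 0.99):
    print(p, round(w_tk(p), 3))
```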
McCauley, Erin J
2017-12-01
To estimate the cumulative probability (c) of arrest by age 28 years in the United States by disability status, race/ethnicity, and gender. I estimated cumulative probabilities through birth cohort life tables with data from the National Longitudinal Survey of Youth, 1997. Estimates demonstrated that those with disabilities have a higher cumulative probability of arrest (c = 42.65) than those without (c = 29.68). The risk was disproportionately spread across races/ethnicities, with Blacks with disabilities experiencing the highest cumulative probability of arrest (c = 55.17) and Whites without disabilities experiencing the lowest (c = 27.55). Racial/ethnic differences existed by gender as well. There was a similar distribution of disability types across race/ethnicity, suggesting that the racial/ethnic differences in arrest may stem from racial/ethnic inequalities as opposed to differential distribution of disability types. The experience of arrest for those with disabilities was higher than expected. Police officers should understand how disabilities may affect compliance and other behaviors, and likewise how implicit bias and structural racism may affect reactions and actions of officers and the systems they work within in ways that create inequities.
Cumulative Probability and Time to Reintubation in U.S. ICUs.
Miltiades, Andrea N; Gershengorn, Hayley B; Hua, May; Kramer, Andrew A; Li, Guohua; Wunsch, Hannah
2017-05-01
Reintubation after liberation from mechanical ventilation is viewed as an adverse event in ICUs. We sought to describe the frequency of reintubations across U.S. ICUs and to propose a standard, appropriate time cutoff for reporting of reintubation events. We conducted a cohort study using data from the Project IMPACT database of 185 diverse ICUs in the United States. We included patients who received mechanical ventilation and excluded patients who received a tracheostomy, had a do-not-resuscitate order placed, or died prior to first extubation. We assessed the percentage of patients extubated who were reintubated; the cumulative probability of reintubation, with death and do-not-resuscitate orders after extubation modeled as competing risks, and time to reintubation. Among 98,367 patients who received mechanical ventilation without death or tracheostomy prior to extubation, 9,907 (10.1%) were reintubated, with a cumulative probability of 10.0%. Median time to reintubation was 15 hours (interquartile range, 2-45 hr). Of patients who required reintubation in the ICU, 90% did so within the first 96 hours after initial extubation; this was consistent across various patient subtypes (89.3% for elective surgical patients up to 94.8% for trauma patients) and ICU subtypes (88.6% for cardiothoracic ICUs to 93.5% for medical ICUs). The reintubation rate for ICU patients liberated from mechanical ventilation in U.S. ICUs is approximately 10%. We propose a time cutoff of 96 hours for reintubation definitions and benchmarking efforts, as it captures 90% of ICU reintubation events. Reintubation rates can be reported as simple percentages, without regard for deaths or changes in goals of care that might occur.
ERIC Educational Resources Information Center
Prodromou, Theodosia
2012-01-01
This article seeks to address a pedagogical theory of introducing the classicist and the frequentist approach to probability, by investigating important elements in 9th grade students' learning process while working with a "TinkerPlots2" combinatorial problem. Results from this research study indicate that, after the students had seen…
Cumulants, free cumulants and half-shuffles
Ebrahimi-Fard, Kurusch; Patras, Frédéric
2015-01-01
Free cumulants were introduced as the proper analogue of classical cumulants in the theory of free probability. There is a mix of similarities and differences when one considers the two families of cumulants. Whereas the combinatorics of classical cumulants is well expressed in terms of set partitions, that of free cumulants is described and often introduced in terms of non-crossing set partitions. The formal series approach to classical and free cumulants also largely differs. The purpose of this study is to put forward a different approach to these phenomena. Namely, we show that cumulants, whether classical or free, can be understood in terms of the algebra and combinatorics underlying commutative as well as non-commutative (half-)shuffles and (half-)unshuffles. As a corollary, cumulants and free cumulants can be characterized through linear fixed point equations. We study the exponential solutions of these linear fixed point equations, which display well the commutative, respectively non-commutative, character of classical and free cumulants. PMID:27547078
NASA Astrophysics Data System (ADS)
Kim, Hannah; Hong, Helen
2014-03-01
We propose an automatic method for nipple detection on 3D automated breast ultrasound (3D ABUS) images using coronal slab-average-projection and a cumulative probability map. First, to identify coronal images that show a marked distinction between the nipple-areola region and the skin, the skewness of each coronal image is measured and the negatively skewed images are selected. Then, a coronal slab-average-projection image is reformatted from the selected images. Second, to localize the nipple-areola region, an elliptical ROI covering the nipple-areola region is detected using the Hough ellipse transform in the coronal slab-average-projection image. Finally, to separate the nipple from the areola region, 3D Otsu's thresholding is applied to the elliptical ROI and a cumulative probability map in the elliptical ROI is generated by assigning high probability to low-intensity regions. Falsely detected small components are eliminated using morphological opening, and the center point of the detected nipple region is calculated. Experimental results show that our method provides a 94.4% nipple detection rate.
Wickham, Hadley; Hofmann, Heike
2011-12-01
We propose a new framework for visualising tables of counts, proportions and probabilities. We call our framework product plots, alluding to the computation of area as a product of height and width, and the statistical concept of generating a joint distribution from the product of conditional and marginal distributions. The framework, with extensions, is sufficient to encompass over 20 visualisations previously described in fields of statistical graphics and infovis, including bar charts, mosaic plots, treemaps, equal area plots and fluctuation diagrams. © 2011 IEEE
Seaton, Sarah E; Manktelow, Bradley N
2012-07-16
Emphasis is increasingly being placed on the monitoring of clinical outcomes for health care providers. Funnel plots have become an increasingly popular graphical methodology used to identify potential outliers. It is assumed that a provider only displaying expected random variation (i.e. 'in-control') will fall outside a control limit with a known probability. In reality, the discrete count nature of these data, and the differing methods, can lead to true probabilities quite different from the nominal value. This paper investigates the true probability of an 'in control' provider falling outside control limits for the Standardised Mortality Ratio (SMR). The true probabilities of an 'in control' provider falling outside control limits for the SMR were calculated and compared for three commonly used limits: Wald confidence interval; 'exact' confidence interval; probability-based prediction interval. The probability of falling above the upper limit, or below the lower limit, often varied greatly from the nominal value. This was particularly apparent when there were a small number of expected events: for expected events ≤ 50 the median probability of an 'in-control' provider falling above the upper 95% limit was 0.0301 (Wald), 0.0121 ('exact'), 0.0201 (prediction). It is important to understand the properties and probability of being identified as an outlier by each of these different methods to aid the correct identification of poorly performing health care providers. The limits obtained using probability-based prediction limits have the most intuitive interpretation and their properties can be defined a priori. Funnel plot control limits for the SMR should not be based on confidence intervals.
Decision analysis with cumulative prospect theory.
Bayoumi, A M; Redelmeier, D A
2000-01-01
Individuals sometimes express preferences that do not follow expected utility theory. Cumulative prospect theory adjusts for some phenomena by using decision weights rather than probabilities when analyzing a decision tree. The authors examined how probability transformations from cumulative prospect theory might alter a decision analysis of a prophylactic therapy in AIDS, eliciting utilities from patients with HIV infection (n = 75) and calculating expected outcomes using an established Markov model. They next focused on transformations of three sets of probabilities: 1) the probabilities used in calculating standard-gamble utility scores; 2) the probabilities of being in discrete Markov states; 3) the probabilities of transitioning between Markov states. The same prophylaxis strategy yielded the highest quality-adjusted survival under all transformations. For the average patient, prophylaxis appeared relatively less advantageous when standard-gamble utilities were transformed. Prophylaxis appeared relatively more advantageous when state probabilities were transformed and relatively less advantageous when transition probabilities were transformed. Transforming standard-gamble and transition probabilities simultaneously decreased the gain from prophylaxis by almost half. Sensitivity analysis indicated that even near-linear probability weighting transformations could substantially alter quality-adjusted survival estimates. The magnitude of benefit estimated in a decision-analytic model can change significantly after using cumulative prospect theory. Incorporating cumulative prospect theory into decision analysis can provide a form of sensitivity analysis and may help describe when people deviate from expected utility theory.
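A minimal sketch of the rank-dependent weighting step that cumulative prospect theory applies: decision weights are differences of a weighting function evaluated at cumulative probabilities (gains sorted best-first; the w form and its parameter are illustrative assumptions, not the paper's elicited values):

```python
# CPT-style valuation of a gamble over gains: replace probabilities by decision
# weights derived from a weighting function of *cumulative* probabilities.
def w(p, gamma=0.61):
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

def cpt_value(outcomes_probs, utility=lambda x: x):
    # outcomes_probs: [(outcome, prob), ...] for gains, sorted best-first
    value, cum = 0.0, 0.0
    for x, p in outcomes_probs:
        weight = w(cum + p) - w(cum)   # decision weight, not the raw probability
        value += weight * utility(x)
        cum += p
    return value

print(cpt_value([(100, 0.1), (10, 0.9)]))  # compare with the expected value, 19
```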
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Rourke, Patrick Francis
The purpose of this report is to provide the reader with an understanding of how a Monte Carlo neutron transport code was written, developed, and evolved to calculate the probability distribution functions (PDFs) and their moments for the neutron number at a final time as well as for the cumulative fission number, and to introduce several basic Monte Carlo concepts.
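A toy 0-D analogue of such a code, assuming a branching process in which each neutron is captured or induces fission after an exponential waiting time; tallying many histories builds the PDF of the neutron number at a final time and of the cumulative fission number (all rates and yields are illustrative, not from the report):

```python
import random
from collections import Counter

def history(t_final, rate=1.0, p_fission=0.4, nu=2):
    """One neutron history tree; returns (neutrons alive at t_final, fissions)."""
    stack, alive, fissions = [0.0], 0, 0
    while stack:
        t = stack.pop() + random.expovariate(rate)  # time of this neutron's event
        if t >= t_final:
            alive += 1                              # no event before t_final
        elif random.random() < p_fission:
            fissions += 1
            stack.extend([t] * nu)                  # nu offspring neutrons
        # else: capture -- the neutron simply disappears
    return alive, fissions

N = 100_000
tally_n, tally_f = Counter(), Counter()
for _ in range(N):
    n, f = history(t_final=2.0)
    tally_n[n] += 1
    tally_f[f] += 1
print({k: v / N for k, v in sorted(tally_n.items())})  # PDF of neutron number
print(sum(k * v for k, v in tally_f.items()) / N)      # mean cumulative fissions
```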
Cumulative Poisson Distribution Program
NASA Technical Reports Server (NTRS)
Bowerman, Paul N.; Scheuer, Ernest M.; Nolty, Robert
1990-01-01
Overflow and underflow in sums prevented. Cumulative Poisson Distribution Program, CUMPOIS, one of two computer programs that make calculations involving cumulative Poisson distributions. Both programs, CUMPOIS (NPO-17714) and NEWTPOIS (NPO-17715), used independently of one another. CUMPOIS determines cumulative Poisson distribution, used to evaluate cumulative distribution function (cdf) for gamma distributions with integer shape parameters and cdf for χ² distributions with even degrees of freedom. Used by statisticians and others concerned with probabilities of independent events occurring over specific units of time, area, or volume. Written in C.
CUMPOIS - CUMULATIVE POISSON DISTRIBUTION PROGRAM
NASA Technical Reports Server (NTRS)
Bowerman, P. N.
1994-01-01
The Cumulative Poisson distribution program, CUMPOIS, is one of two programs which make calculations involving cumulative Poisson distributions. Both programs, CUMPOIS (NPO-17714) and NEWTPOIS (NPO-17715), can be used independently of one another. CUMPOIS determines the approximate cumulative binomial distribution, evaluates the cumulative distribution function (cdf) for gamma distributions with integer shape parameters, and evaluates the cdf for chi-square distributions with even degrees of freedom. It can be used by statisticians and others concerned with probabilities of independent events occurring over specific units of time, area, or volume. CUMPOIS calculates the probability that n or fewer events (i.e., cumulative) will occur within any unit when the expected number of events is given as lambda. Normally, this probability is calculated by a direct summation, from i=0 to n, of terms involving the exponential function, lambda, and inverse factorials. This approach, however, eventually fails due to underflow for sufficiently large values of n. Additionally, when the exponential term is moved outside of the summation for simplification purposes, there is a risk that the terms remaining within the summation, and the summation itself, will overflow for certain values of i and lambda. CUMPOIS eliminates these possibilities by multiplying an additional exponential factor into the summation terms and the partial sum whenever overflow/underflow situations threaten. The reciprocal of this term is then multiplied into the completed sum giving the cumulative probability. The CUMPOIS program is written in C. It was developed on an IBM AT with a numeric co-processor using Microsoft C 5.0. Because the source code is written using standard C structures and functions, it should compile correctly on most C compilers. The program format is interactive, accepting lambda and n as inputs. It has been implemented under DOS 3.2 and has a memory requirement of 26K. CUMPOIS was
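A sketch of the scaling idea CUMPOIS describes, done here in log space: pulling a common exponential factor out of the partial sum keeps the terms representable even for large n and lambda:

```python
# Underflow/overflow-safe cumulative Poisson: sum the terms with a common
# factor exp(m) pulled out, analogous to the trick described above.
import math

def cumpois(n, lam):
    # log of each term: -lam + i*log(lam) - log(i!)
    log_terms = [-lam + i * math.log(lam) - math.lgamma(i + 1) for i in range(n + 1)]
    m = max(log_terms)                  # common factor, multiplied back at the end
    return math.exp(m) * sum(math.exp(t - m) for t in log_terms)

print(cumpois(1200, 1000.0))  # stable where naive term-by-term summation fails
```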
Analysis of LDPE-ZnO-clay nanocomposites using novel cumulative rheological parameters
NASA Astrophysics Data System (ADS)
Kracalik, Milan
2017-05-01
Polymer nanocomposites exhibit complex rheological behaviour due to physical and possibly also chemical interactions between individual phases. Up to now, the rheology of dispersive polymer systems has usually been described by evaluating the viscosity curve (shear thinning phenomenon) or the storage modulus curve (formation of a secondary plateau), or by plotting information about damping behaviour (e.g. the Van Gurp-Palmen plot or a comparison of the loss factor tan δ). In contrast to the evaluation of damping behaviour, values of cot δ were calculated and termed the "storage factor", by analogy with the loss factor. Values of the storage factor were then integrated over a specific frequency range and termed the "cumulative storage factor". In this contribution, LDPE-ZnO-clay nanocomposites with different dispersion grades (physical networks) have been prepared and characterized by both the conventional and the novel analysis approach. In addition to the cumulative storage factor, further cumulative rheological parameters, such as the cumulative complex viscosity, cumulative complex modulus, and cumulative storage modulus, have been introduced.
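A minimal sketch of the cumulative storage factor as described: cot δ = G'/G'' from dynamic-moduli data, integrated over the measured frequency window (trapezoidal rule; the toy moduli below are assumptions, not measurements):

```python
import numpy as np

def cumulative_storage_factor(omega, g_storage, g_loss):
    cot_delta = np.asarray(g_storage) / np.asarray(g_loss)  # "storage factor"
    return np.trapz(cot_delta, omega)                       # integral over frequency

omega = np.logspace(-1, 2, 30)   # rad/s, illustrative frequency sweep
g1 = 1e4 * omega**0.3            # toy storage modulus of a filled melt
g2 = 1e4 * omega**0.5            # toy loss modulus
print(cumulative_storage_factor(omega, g1, g2))
```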
High throughput nonparametric probability density estimation.
Farmer, Jenny; Jacobs, Donald
2018-01-01
In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample size invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists under and over fitting data as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.
Huang, Xiaowei; Zhang, Yanling; Meng, Long; Abbott, Derek; Qian, Ming; Wong, Kelvin K L; Zheng, Rongqing; Zheng, Hairong; Niu, Lili
2017-01-01
Carotid plaque echogenicity is associated with the risk of cardiovascular events. The gray-scale median (GSM) of the ultrasound image of carotid plaques has been widely used as an objective method for evaluation of plaque echogenicity in patients with atherosclerosis. We proposed a computer-aided method to evaluate plaque echogenicity and compared its efficiency with GSM. One hundred and twenty-five carotid plaques (43 echo-rich, 35 intermediate, 47 echolucent) were collected from 72 patients in this study. The cumulative probability distribution curves were obtained based on statistics of the pixels in the gray-level images of plaques. The area under the cumulative probability distribution curve (AUCPDC) was calculated as its integral value to evaluate plaque echogenicity. The classification accuracy for the three types of plaques is 78.4% (kappa value, κ = 0.673) when the AUCPDC is used for classifier training, whereas with GSM it is 64.8% (κ = 0.460). The receiver operating characteristic curves were produced to test the effectiveness of AUCPDC and GSM for the identification of echolucent plaques. The area under the curve (AUC) was 0.817 when AUCPDC was used for training the classifier, which is higher than that achieved using GSM (AUC = 0.746). Compared with GSM, the AUCPDC showed a borderline association with coronary heart disease (Spearman r = 0.234, p = 0.050). Our experimental results suggest that AUCPDC analysis is a promising method for evaluation of plaque echogenicity and predicting cardiovascular events in patients with plaques.
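A sketch of the AUCPDC feature as described: the empirical cumulative distribution of gray levels in the plaque region, integrated over the gray-level range; darker (echolucent-like) regions pile up probability at low gray levels and give larger areas:

```python
import numpy as np

def aucpdc(pixels):
    levels = np.arange(256)
    hist, _ = np.histogram(pixels, bins=256, range=(0, 256))
    cdf = np.cumsum(hist) / hist.sum()       # cumulative probability curve
    return np.trapz(cdf, levels) / 255.0     # normalized area under the curve

rng = np.random.default_rng(0)
dark = rng.integers(0, 80, 5000)             # echolucent-like ROI
bright = rng.integers(100, 230, 5000)        # echo-rich-like ROI
print(aucpdc(dark), aucpdc(bright))          # darker ROI -> larger AUCPDC
```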
Measurement of surface water runoff from plots of two different sizes
NASA Astrophysics Data System (ADS)
Joel, Abraham; Messing, Ingmar; Seguel, Oscar; Casanova, Manuel
2002-05-01
Intensities and amounts of water infiltration and runoff on sloping land are governed by the rainfall pattern and soil hydraulic conductivity, as well as by the microtopography and soil surface conditions. These components are closely interrelated and occur simultaneously, and their particular contribution may change during a rainfall event, or their effects may vary at different field scales. The scale effect on the process of infiltration/runoff was studied under natural field and rainfall conditions for two plot sizes: small plots of 0·25 m2 and large plots of 50 m2. The measurements were carried out in the central region of Chile in a piedmont most recently used as natural pastureland. Three blocks, each having one large plot and five small plots, were established. Cumulative rainfall and runoff quantities were sampled every 5 min. Significant variations in runoff responses to rainfall rates were found for the two plot sizes. On average, large plots yielded only 40% of runoff quantities produced on small plots per unit area. This difference between plot sizes was observed even during periods of continuous runoff.
CProb: a computational tool for conducting conditional probability analysis.
Hollister, Jeffrey W; Walker, Henry A; Paul, John F
2008-01-01
Conditional probability is the probability of observing one event given that another event has occurred. In an environmental context, conditional probability helps to assess the association between an environmental contaminant (i.e., the stressor) and the ecological condition of a resource (i.e., the response). These analyses, when combined with controlled experiments and other methodologies, show great promise in evaluating ecological conditions from observational data and in defining water quality and other environmental criteria. Current applications of conditional probability analysis (CPA) are largely done via scripts or cumbersome spreadsheet routines, which may prove daunting to end-users and do not provide access to the underlying scripts. Combining spreadsheets with scripts eases computation through a familiar interface (i.e., Microsoft Excel) and creates a transparent process through full accessibility to the scripts. With this in mind, we developed a software application, CProb, as an Add-in for Microsoft Excel with R, R(D)com Server, and Visual Basic for Applications. CProb calculates and plots scatterplots, empirical cumulative distribution functions, and conditional probability. In this short communication, we describe CPA, our motivation for developing a CPA tool, and our implementation of CPA as a Microsoft Excel Add-in. Further, we illustrate the use of our software with two examples: a water quality example and a landscape example. CProb is freely available for download at http://www.epa.gov/emap/nca/html/regions/cprob.
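A minimal sketch of the conditional probability calculation CProb automates, Pr(poor condition | stressor ≥ x), swept over the observed stressor values (variable names and the 'poor' cutoff are assumptions):

```python
import numpy as np

def cprob_curve(stressor, response, poor_cutoff):
    xs = np.sort(np.unique(stressor))
    curve = []
    for x in xs:
        sel = stressor >= x
        curve.append((x, np.mean(response[sel] <= poor_cutoff)))
    return curve   # plot alongside the scatterplot and empirical CDFs

rng = np.random.default_rng(1)
stress = rng.uniform(0, 10, 200)
resp = 8 - 0.5 * stress + rng.normal(0, 1, 200)  # condition declines with stress
print(cprob_curve(stress, resp, poor_cutoff=4.0)[-3:])
```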
Atmospheric Teleconnections From Cumulants
NASA Astrophysics Data System (ADS)
Sabou, F.; Kaspi, Y.; Marston, B.; Schneider, T.
2011-12-01
Multi-point cumulants of fields such as vorticity provide a way to visualize atmospheric teleconnections, complementing other approaches such as the method of empirical orthogonal functions (EOFs). We calculate equal-time two-point cumulants of the vorticity from NCEP reanalysis data during the period 1980 -- 2010 and from direct numerical simulation (DNS) using an idealized dry general circulation model (GCM) (Schneider and Walker, 2006). Extratropical correlations seen in the NCEP data are qualitatively reproduced by the model. Three- and four-point cumulants accumulated from DNS quantify departures of the probability distribution function from a normal distribution, shedding light on the efficacy of direct statistical simulation (DSS) of atmosphere dynamics by cumulant expansions (Marston, Conover, and Schneider, 2008; Marston 2011). Lagged-time two-point cumulants between temperature gradients and eddy kinetic energy (EKE), accumulated by DNS of an idealized moist aquaplanet GCM (O'Gorman and Schneider, 2008), reveal dynamics of storm tracks. Regions of enhanced baroclinicity (as found along the eastern boundary of continents) lead to a local enhancement of EKE and a suppression of EKE further downstream as the storm track self-destructs (Kaspi and Schneider, 2011).
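A sketch of an equal-time two-point cumulant against a fixed base point, the basic quantity used above to visualize teleconnections (synthetic data with an implanted correlation, not reanalysis fields):

```python
# c2(x, x0) = time average of z'(x, t) z'(x0, t): the covariance of vorticity
# anomalies between every point x and a chosen base point x0.
import numpy as np

def two_point_cumulant(field, base_index):
    # field: (time, space) array of e.g. vorticity values
    anom = field - field.mean(axis=0)                # remove the time mean
    return anom.T @ anom[:, base_index] / field.shape[0]

rng = np.random.default_rng(2)
z = rng.normal(size=(500, 64))
z[:, 40] += 0.8 * z[:, 10]                           # implant a 'teleconnection'
c2 = two_point_cumulant(z, 10)
print(np.argmax(np.abs(c2)[11:]) + 11)               # recovers the partner: 40
```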
About the cumulants of periodic signals
NASA Astrophysics Data System (ADS)
Barrau, Axel; El Badaoui, Mohammed
2018-01-01
This note studies cumulants of time series. Although these functions originate from probability theory, they are commonly used as features of deterministic signals, and their classical properties are examined here in this modified framework. We show that the additivity of cumulants, which is ensured in the case of independent random variables, requires a different hypothesis here. Practical applications are proposed, in particular an analysis of the failure of the JADE algorithm to separate some specific periodic signals.
Forest Plots in Excel: Moving beyond a Clump of Trees to a Forest of Visual Information
ERIC Educational Resources Information Center
Derzon, James H.; Alford, Aaron A.
2013-01-01
Forest plots provide an effective means of presenting a wealth of information in a single graphic. Whether used to illustrate multiple results in a single study or the cumulative knowledge of an entire field, forest plots have become an accepted and generally understood way of presenting many estimates simultaneously. This article explores…
Rapidity window dependences of higher order cumulants and diffusion master equation
NASA Astrophysics Data System (ADS)
Kitazawa, Masakiyo
2015-10-01
We study the rapidity window dependences of higher order cumulants of conserved charges observed in relativistic heavy ion collisions. The time evolution and the rapidity window dependence of the non-Gaussian fluctuations are described by the diffusion master equation. Analytic formulas for the time evolution of cumulants in a rapidity window are obtained for arbitrary initial conditions. We discuss that the rapidity window dependences of the non-Gaussian cumulants have characteristic structures reflecting the non-equilibrium property of fluctuations, which can be observed in relativistic heavy ion collisions with present detectors. It is argued that various information on the thermal and transport properties of the hot medium can be revealed experimentally by the study of the rapidity window dependences, especially by the combined use of the higher order cumulants. Formulas of higher order cumulants for a probability distribution composed of sub-probabilities, which are useful for various studies of non-Gaussian cumulants, are also presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yilmaz, Şeyda, E-mail: seydayilmaz@ktu.edu.tr; Bayrak, Erdem, E-mail: erdmbyrk@gmail.com; Bayrak, Yusuf, E-mail: bayrak@ktu.edu.tr
In this study we examined and compared three different probabilistic distribution methods for determining the most suitable model in probabilistic assessment of earthquake hazards. We analyzed a reliable homogeneous earthquake catalogue for the period 1900-2015 for magnitude M ≥ 6.0 and estimated the probabilistic seismic hazard in the North Anatolian Fault zone (39°-41° N, 30°-40° E) using three distribution methods, namely the Weibull distribution, the Frechet distribution, and the three-parameter Weibull distribution. The suitability of the distribution parameters was evaluated with the Kolmogorov-Smirnov (K-S) goodness-of-fit test. We also compared the estimated cumulative probability and the conditional probabilities of occurrence of earthquakes for different elapsed times using these three distribution methods. We used Easyfit and Matlab software to calculate the distribution parameters and plotted the conditional probability curves. We concluded that the Weibull distribution method was more suitable than the other distribution methods in this region.
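A sketch of the model-comparison step under the stated approach: maximum-likelihood fits of candidate distributions ranked by the K-S statistic, with scipy's weibull_min and invweibull standing in for the Weibull and Frechet forms (toy catalogue, not the authors' data):

```python
import numpy as np
from scipy import stats

# Toy M >= 6.0 catalogue; replace with the homogenized magnitudes
data = 6.0 + stats.expon.rvs(scale=0.5, size=200, random_state=3)

for name, dist, kw in [("Weibull-2p", stats.weibull_min, {"floc": 6.0}),
                       ("Weibull-3p", stats.weibull_min, {}),
                       ("Frechet",    stats.invweibull,  {"floc": 5.9})]:
    params = dist.fit(data, **kw)                 # maximum-likelihood fit
    ks = stats.kstest(data, dist.name, args=params)
    print(f"{name}: D = {ks.statistic:.3f}")      # smaller D => better fit
```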
CUMBIN - CUMULATIVE BINOMIAL PROGRAMS
NASA Technical Reports Server (NTRS)
Bowerman, P. N.
1994-01-01
The cumulative binomial program, CUMBIN, is one of a set of three programs which calculate cumulative binomial probability distributions for arbitrary inputs. The three programs, CUMBIN, NEWTONP (NPO-17556), and CROSSER (NPO-17557), can be used independently of one another. CUMBIN can be used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. The program has been used for reliability/availability calculations. CUMBIN calculates the probability that a system of n components has at least k operating if the probability that any one is operating is p and the components are independent. Equivalently, this is the reliability of a k-out-of-n system having independent components with common reliability p. CUMBIN can evaluate the incomplete beta distribution for two positive integer arguments. CUMBIN can also evaluate the cumulative F distribution and the negative binomial distribution, and can determine the sample size in a test design. CUMBIN is designed to work well with all integer values 0 < k <= n. To run the program, the user simply runs the executable version and inputs the information requested by the program. The program is not designed to weed out incorrect inputs, so the user must take care to make sure the inputs are correct. Once all input has been entered, the program calculates and lists the result. The CUMBIN program is written in C. It was developed on an IBM AT with a numeric co-processor using Microsoft C 5.0. Because the source code is written using standard C structures and functions, it should compile correctly with most C compilers. The program format is interactive. It has been implemented under DOS 3.2 and has a memory requirement of 26K. CUMBIN was developed in 1988.
Causal inference, probability theory, and graphical insights.
Baker, Stuart G
2013-11-10
Causal inference from observational studies is a fundamental topic in biostatistics. The causal graph literature typically views probability theory as insufficient to express causal concepts in observational studies. In contrast, the view here is that probability theory is a desirable and sufficient basis for many topics in causal inference for the following two reasons. First, probability theory is generally more flexible than causal graphs: Besides explaining such causal graph topics as M-bias (adjusting for a collider) and bias amplification and attenuation (when adjusting for instrumental variable), probability theory is also the foundation of the paired availability design for historical controls, which does not fit into a causal graph framework. Second, probability theory is the basis for insightful graphical displays including the BK-Plot for understanding Simpson's paradox with a binary confounder, the BK2-Plot for understanding bias amplification and attenuation in the presence of an unobserved binary confounder, and the PAD-Plot for understanding the principal stratification component of the paired availability design. Published 2013. This article is a US Government work and is in the public domain in the USA.
NASA Astrophysics Data System (ADS)
Li, Zhanling; Li, Zhanjie; Li, Chengcheng
2014-05-01
Probability modeling of hydrological extremes is one of the major research areas in hydrological science. Most basins in the humid and semi-humid south and east of China have been considered in probability modeling analyses of high-flow extremes, while for the inland river basins, which occupy about 35% of the country's area, such studies are limited, partly due to limited data availability and relatively low mean annual flows. The objective of this study is to carry out probability modeling of high-flow extremes in the upper reach of the Heihe River basin, the second largest inland river basin in China, by using the peaks-over-threshold (POT) method and the Generalized Pareto Distribution (GPD), in which the selection of the threshold and the inherent assumptions of the POT series are elaborated in detail. For comparison, other widely used probability distributions, including the generalized extreme value (GEV), Lognormal, Log-logistic and Gamma distributions, are employed as well. Maximum likelihood estimation is used for parameter estimation. Daily flow data at Yingluoxia station from 1978 to 2008 are used. Results show that, synthesizing the approaches of the mean excess plot, the stability of model parameters, the return level plot and the inherent independence assumption of the POT series, an optimum threshold of 340 m3/s is finally determined for high-flow extremes in the Yingluoxia watershed. The resulting POT series is shown to be stationary and independent based on the Mann-Kendall test, the Pettitt test and an autocorrelation test. In terms of the Kolmogorov-Smirnov test, the Anderson-Darling test and several graphical diagnostics such as quantile and cumulative density function plots, the GPD provides the best fit to high-flow extremes in the study area. The estimated high flows for long return periods demonstrate that, as the return period increases, the return level estimates are probably more uncertain. The frequency of high-flow extremes exhibits a very slight but not significant decreasing trend from 1978 to
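A sketch of the POT/GPD workflow described above: exceedances over the chosen threshold, a maximum-likelihood GPD fit, and return levels from the fitted tail (synthetic flows; 340 m3/s is the paper's threshold, everything else is illustrative):

```python
import numpy as np
from scipy.stats import genpareto

def fit_pot(daily_flow, threshold=340.0, years=31):
    exc = daily_flow[daily_flow > threshold] - threshold
    shape, loc, scale = genpareto.fit(exc, floc=0.0)   # ML fit of the GPD tail
    rate = len(exc) / years                            # mean exceedances per year
    def return_level(T):                               # T-year high-flow estimate
        return threshold + genpareto.ppf(1 - 1 / (rate * T), shape, scale=scale)
    return shape, scale, return_level

rng = np.random.default_rng(4)
flow = rng.gamma(2.0, 90.0, size=31 * 365)             # synthetic daily flows, m3/s
shape, scale, rl = fit_pot(flow)
print(round(shape, 3), round(rl(100), 1))              # e.g. the 100-year flow
```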
Establishment probability in newly founded populations.
Gusset, Markus; Müller, Michael S; Grimm, Volker
2012-06-20
Establishment success in newly founded populations relies on reaching the established phase, which is defined by characteristic fluctuations of the population's state variables. Stochastic population models can be used to quantify the establishment probability of newly founded populations; however, so far no simple but robust method for doing so existed. To determine a critical initial number of individuals that need to be released to reach the established phase, we used a novel application of the "Wissel plot", where -ln(1 - P0(t)) is plotted against time t. This plot is based on the equation P0(t) = 1 - c1 e^(-ω1 t), which relates the probability of extinction by time t, P0(t), to two constants: c1 describes the probability of a newly founded population reaching the established phase, whereas ω1 describes the population's probability of extinction per short time interval once established. For illustration, we applied the method to a previously developed stochastic population model of the endangered African wild dog (Lycaon pictus). A newly founded population reaches the established phase if the intercept of the (extrapolated) linear parts of the "Wissel plot" with the y-axis, which is -ln(c1), is negative. For wild dogs in our model, this is the case if a critical initial number of four packs, consisting of eight individuals each, is released. The method we present to quantify the establishment probability of newly founded populations is generic and inferences thus are transferable to other systems across the field of conservation biology. In contrast to other methods, our approach disaggregates the components of a population's viability by distinguishing establishment from persistence.
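A sketch of reading ω1 and the intercept -ln(c1) off a Wissel plot, fitting the late-time linear part of -ln(1 - P0(t)) from simulated extinction times (the simulation is illustrative, not the wild dog model):

```python
import numpy as np

def wissel_fit(extinction_times, t_grid):
    p0 = np.array([(extinction_times <= t).mean() for t in t_grid])
    keep = (p0 < 1) & (t_grid > np.median(t_grid))   # linear late-time part
    y = -np.log(1 - p0[keep])                        # -ln(1 - P0(t))
    omega1, intercept = np.polyfit(t_grid[keep], y, 1)
    return omega1, intercept                         # slope omega1; intercept -ln(c1)

rng = np.random.default_rng(5)
t_ext = rng.exponential(50.0, size=2000) + rng.uniform(0, 5, 2000)
omega1, intercept = wissel_fit(t_ext, np.linspace(1, 120, 60))
print(round(omega1, 4), round(intercept, 3))  # negative intercept => established
```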
Probability of stress-corrosion fracture under random loading
NASA Technical Reports Server (NTRS)
Yang, J. N.
1974-01-01
Mathematical formulation is based on the cumulative-damage hypothesis and experimentally determined stress-corrosion characteristics. Under both stationary and nonstationary random loadings, the mean value and variance of the cumulative damage are obtained. The probability of stress-corrosion fracture is then evaluated using the principle of maximum entropy.
Probability of stress-corrosion fracture under random loading.
NASA Technical Reports Server (NTRS)
Yang, J.-N.
1972-01-01
A method is developed for predicting the probability of stress-corrosion fracture of structures under random loadings. The formulation is based on the cumulative damage hypothesis and the experimentally determined stress-corrosion characteristics. Under both stationary and nonstationary random loadings, the mean value and the variance of the cumulative damage are obtained. The probability of stress-corrosion fracture is then evaluated using the principle of maximum entropy. It is shown that, under stationary random loadings, the standard deviation of the cumulative damage increases in proportion to the square root of time, while the coefficient of variation (dispersion) decreases in inversed proportion to the square root of time. Numerical examples are worked out to illustrate the general results.
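A quick numerical check of the stated scaling, under the common idealization that cumulative damage is a sum of i.i.d. increments: the standard deviation grows like the square root of time while the coefficient of variation decays like its inverse:

```python
import numpy as np

rng = np.random.default_rng(7)
increments = rng.exponential(1.0, size=(10_000, 400))  # damage per load cycle
damage = increments.cumsum(axis=1)                     # cumulative damage paths
for t in (100, 400):
    d = damage[:, t - 1]
    print(t, round(d.std(), 2), round(d.std() / d.mean(), 4))
# std at t=400 is ~2x the std at t=100; the CV is ~0.5x, as stated above
```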
Univariate Probability Distributions
ERIC Educational Resources Information Center
Leemis, Lawrence M.; Luckett, Daniel J.; Powell, Austin G.; Vermeer, Peter E.
2012-01-01
We describe a web-based interactive graphic that can be used as a resource in introductory classes in mathematical statistics. This interactive graphic presents 76 common univariate distributions and gives details on (a) various features of the distribution such as the functional form of the probability density function and cumulative distribution…
Use of probability analysis to establish routine bioassay screening levels
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carbaugh, E.H.; Sula, M.J.; McFadden, K.M.
1990-09-01
Probability analysis was used by the Hanford Internal Dosimetry Program to establish bioassay screening levels for tritium and uranium in urine. Background environmental levels of these two radionuclides are generally detectable by the highly sensitive urine analysis procedures routinely used at Hanford. Establishing screening levels requires balancing the impact of false detection against the consequence of potentially undetected occupational dose. To establish the screening levels, tritium and uranium analyses were performed on urine samples collected from workers exposed only to environmental sources. All samples were collected at home using a simulated 12-hour collection protocol for tritium and a simulated 24-hour collection protocol for uranium. Results of the analyses of these samples were ranked according to tritium concentration or total sample uranium. The cumulative percentile was calculated and plotted using log-probability coordinates. Geometric means and screening levels corresponding to various percentiles were estimated by graphical interpolation and standard calculations. The potential annual internal dose associated with a screening level was calculated. Screening levels were selected corresponding to the 99.9th percentile, implying that, on average, 1 out of 1000 samples collected from an unexposed worker population would be expected to exceed the screening level. 4 refs., 2 figs.
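A sketch of the screening-level calculation implied above: fit a lognormal to background results (a straight line on log-probability paper) and take the 99.9th percentile, so about 1 in 1000 background samples exceeds it (units and sample values are illustrative):

```python
import numpy as np
from scipy.stats import norm

def screening_level(background_results, percentile=0.999):
    logs = np.log(background_results)
    gm, gsd = np.exp(logs.mean()), np.exp(logs.std(ddof=1))  # geometric mean/SD
    return gm * gsd ** norm.ppf(percentile)                  # lognormal percentile

rng = np.random.default_rng(6)
tritium = rng.lognormal(mean=np.log(200.0), sigma=0.6, size=300)  # toy background
print(round(screening_level(tritium), 1))
```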
Electrofishing capture probability of smallmouth bass in streams
Dauwalter, D.C.; Fisher, W.L.
2007-01-01
Abundance estimation is an integral part of understanding the ecology and advancing the management of fish populations and communities. Mark-recapture and removal methods are commonly used to estimate the abundance of stream fishes. Alternatively, abundance can be estimated by dividing the number of individuals sampled by the probability of capture. We conducted a mark-recapture study and used multiple repeated-measures logistic regression to determine the influence of fish size, sampling procedures, and stream habitat variables on the cumulative capture probability for smallmouth bass Micropterus dolomieu in two eastern Oklahoma streams. The predicted capture probability was used to adjust the number of individuals sampled to obtain abundance estimates. The observed capture probabilities were higher for larger fish and decreased with successive electrofishing passes for larger fish only. Model selection suggested that the number of electrofishing passes, fish length, and mean thalweg depth affected capture probabilities the most; there was little evidence for any effect of electrofishing power density and woody debris density on capture probability. Leave-one-out cross validation showed that the cumulative capture probability model predicts smallmouth abundance accurately. © Copyright by the American Fisheries Society 2007.
Measuring Forest Area Loss Over Time Using FIA Plots and Satellite Imagery
Michael L. Hoppus; Andrew J. Lister
2005-01-01
How accurately can FIA plots, scattered at 1 per 6,000 acres, identify often rare forest land loss, estimated at less than 1 percent per year in the Northeast? Here we explore this question mathematically, empirically, and by comparing FIA plot estimates of forest change with satellite image based maps of forest loss. The mathematical probability of exactly estimating...
An exploratory drilling exhaustion sequence plot program
Schuenemeyer, J.H.; Drew, L.J.
1977-01-01
The exhaustion sequence plot program computes the conditional area of influence for wells in a specified rectangular region with respect to a fixed-size deposit. The deposit is represented by an ellipse whose size is chosen by the user. The area of influence may be displayed on computer printer plots consisting of a maximum of 10,000 grid points. At each point, a symbol is presented that indicates the probability of that point being exhausted by nearby wells with respect to a fixed-size ellipse. This output gives a pictorial view of the manner in which oil fields are exhausted. In addition, the exhaustion data may be used to estimate the number of deposits remaining in a basin. © 1977.
Singh, Deependra; Pitkäniemi, Janne; Malila, Nea; Anttila, Ahti
2016-09-01
Mammography has been found effective as the primary screening test for breast cancer. We estimated the cumulative probability of false positive screening test results with respect to symptom history reported at screen. A historical prospective cohort study was done using individual screening data from 413,611 women aged 50-69 years with 2,627,256 invitations for mammography screening between 1992 and 2012 in Finland. Symptoms (lump, retraction, and secretion) were reported at 56,805 visits, and 48,873 visits resulted in a false positive mammography result. Generalized linear models were used to estimate the probability of at least one false positive test and true positive at screening visits. The estimates were compared among women with and without symptoms history. The estimated cumulative probabilities were 18 and 6 % for false positive and true positive results, respectively. In women with a history of a lump, the cumulative probabilities of false positive test and true positive were 45 and 16 %, respectively, compared to 17 and 5 % with no reported lump. In women with a history of any given symptom, the cumulative probabilities of false positive test and true positive were 38 and 13 %, respectively. Likewise, women with a history of a 'lump and retraction' had the cumulative false positive probability of 56 %. The study showed higher cumulative risk of false positive tests and more cancers detected in women who reported symptoms compared to women who did not report symptoms at screen. The risk varies substantially, depending on symptom types and characteristics. Information on breast symptoms influences the balance of absolute benefits and harms of screening.
Cumulative hazard: The case of nuisance flooding
NASA Astrophysics Data System (ADS)
Moftakhari, Hamed R.; AghaKouchak, Amir; Sanders, Brett F.; Matthew, Richard A.
2017-02-01
The cumulative cost of frequent events (e.g., nuisance floods) over time may exceed the costs of the extreme but infrequent events for which societies typically prepare. Here we analyze the likelihood of exceedances above mean higher high water and the corresponding property value exposure for minor, major, and extreme coastal floods. Our results suggest that, in response to sea level rise, nuisance flooding (NF) could generate property value exposure comparable to, or larger than, extreme events. Determining whether (and when) low cost, nuisance incidents aggregate into high cost impacts and deciding when to invest in preventive measures are among the most difficult decisions for policymakers. It would be unfortunate if efforts to protect societies from extreme events (e.g., 0.01 annual probability) left them exposed to a cumulative hazard with enormous costs. We propose a Cumulative Hazard Index (CHI) as a tool for framing the future cumulative impact of low cost incidents relative to infrequent extreme events. CHI suggests that in New York, NY, Washington, DC, Miami, FL, San Francisco, CA, and Seattle, WA, a careful consideration of socioeconomic impacts of NF for prioritization is crucial for sustainable coastal flood risk management.
S2PLOT: Three-dimensional (3D) Plotting Library
NASA Astrophysics Data System (ADS)
Barnes, D. G.; Fluke, C. J.; Bourke, P. D.; Parry, O. T.
2011-03-01
We present a new, three-dimensional (3D) plotting library with advanced features, and support for standard and enhanced display devices. The library - S2PLOT - is written in C and can be used by C, C++ and FORTRAN programs on GNU/Linux and Apple/OSX systems. S2PLOT draws objects in a 3D (x,y,z) Cartesian space and the user interactively controls how this space is rendered at run time. With a PGPLOT inspired interface, S2PLOT provides astronomers with elegant techniques for displaying and exploring 3D data sets directly from their program code, and the potential to use stereoscopic and dome display devices. The S2PLOT architecture supports dynamic geometry and can be used to plot time-evolving data sets, such as might be produced by simulation codes. In this paper, we introduce S2PLOT to the astronomical community, describe its potential applications, and present some example uses of the library.
The Use of Crow-AMSAA Plots to Assess Mishap Trends
NASA Technical Reports Server (NTRS)
Dawson, Jeffrey W.
2011-01-01
Crow-AMSAA (CA) plots are used to model reliability growth. Use of CA plots has expanded into other areas, such as tracking events of interest to management, maintenance problems, and safety mishaps. Safety mishaps can often be successfully modeled using a Poisson probability distribution. CA plots show a Poisson process in log-log space. If the safety mishaps are a stable homogenous Poisson process, a linear fit to the points in a CA plot will have a slope of one. Slopes of greater than one indicate a nonhomogenous Poisson process, with increasing occurrence. Slopes of less than one indicate a nonhomogenous Poisson process, with decreasing occurrence. Changes in slope, known as "cusps," indicate a change in process, which could be an improvement or a degradation. After presenting the CA conceptual framework, examples are given of trending slips, trips and falls, and ergonomic incidents at NASA (from Agency-level data). Crow-AMSAA plotting is a robust tool for trending safety mishaps that can provide insight into safety performance over time.
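A minimal numerical version of the CA trend test described above: regress log cumulative event count on log cumulative time and inspect the slope. The event times are invented for illustration.

```python
import numpy as np

event_times = np.array([3.0, 7.5, 11.0, 19.0, 26.0, 41.0, 58.0, 80.0])  # invented days
counts = np.arange(1, event_times.size + 1)   # cumulative mishap count at each event

slope, _ = np.polyfit(np.log(event_times), np.log(counts), 1)
print(f"CA slope ~ {slope:.2f}")
# ~1: stable homogeneous Poisson process; >1: increasing occurrence; <1: decreasing
```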
A method for analyzing temporal patterns of variability of a time series from Poincare plots.
Fishman, Mikkel; Jacono, Frank J; Park, Soojin; Jamasebi, Reza; Thungtong, Anurak; Loparo, Kenneth A; Dick, Thomas E
2012-07-01
The Poincaré plot is a popular two-dimensional, time series analysis tool because of its intuitive display of dynamic system behavior. Poincaré plots have been used to visualize heart rate and respiratory pattern variabilities. However, conventional quantitative analysis relies primarily on statistical measurements of the cumulative distribution of points, making it difficult to interpret irregular or complex plots. Moreover, the plots are constructed to reflect highly correlated regions of the time series, reducing the amount of nonlinear information that is presented and thereby hiding potentially relevant features. We propose temporal Poincaré variability (TPV), a novel analysis methodology that uses standard techniques to quantify the temporal distribution of points and to detect nonlinear sources responsible for physiological variability. In addition, the analysis is applied across multiple time delays, yielding a richer insight into system dynamics than the traditional circle return plot. The method is applied to data sets of R-R intervals and to synthetic point process data extracted from the Lorenz time series. The results demonstrate that TPV complements the traditional analysis and can be applied more generally, including Poincaré plots with multiple clusters, and more consistently than the conventional measures and can address questions regarding potential structure underlying the variability of a data set.
Net present value probability distributions from decline curve reserves estimates
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simpson, D.E.; Huffman, C.H.; Thompson, R.S.
1995-12-31
This paper demonstrates how reserves probability distributions can be used to develop net present value (NPV) distributions. NPV probability distributions were developed from the rate and reserves distributions presented in SPE 28333. This real data study used practicing engineers' evaluations of production histories. Two approaches were examined to quantify portfolio risk. The first approach, the NPV Relative Risk Plot, compares the mean NPV with the NPV relative risk ratio for the portfolio. The relative risk ratio is the NPV standard deviation (σ) divided by the mean (μ) NPV. The second approach, a Risk-Return Plot, is a plot of the mean (μ) discounted cash flow rate of return (DCFROR) versus the standard deviation (σ) of the DCFROR distribution. This plot provides a risk-return relationship for comparing various portfolios. These methods may help evaluate property acquisition and divestiture alternatives and assess the relative risk of a suite of wells or fields for bank loans.
On-plot drinking water supplies and health: A systematic review.
Overbo, Alycia; Williams, Ashley R; Evans, Barbara; Hunter, Paul R; Bartram, Jamie
2016-07-01
Many studies have found that household access to water supplies near or within the household plot can reduce the probability of diarrhea, trachoma, and other water-related diseases, and it is generally accepted that on-plot water supplies produce health benefits for households. However, the body of research literature has not been analyzed to weigh the evidence supporting this. A systematic review was conducted to investigate the impacts of on-plot water supplies on diarrhea, trachoma, child growth, and water-related diseases, to further examine the relationship between household health and distance to water source and to assess whether on-plot water supplies generate health gains for households. Studies provide evidence that households with on-plot water supplies experience fewer diarrheal and helminth infections and greater child height. Findings suggest that water-washed (hygiene associated) diseases are more strongly impacted by on-plot water access than waterborne diseases. Few studies analyzed the effects of on-plot water access on quantity of domestic water used, hygiene behavior, and use of multiple water sources, and the lack of evidence for these relationships reveals an important gap in current literature. The review findings indicate that on-plot water access is a useful health indicator and benchmark for the progressive realization of the Sustainable Development Goal target of universal safe water access as well as the human right to safe water. Copyright © 2016 Elsevier GmbH. All rights reserved.
NASA Technical Reports Server (NTRS)
Fowell, Richard A.
1989-01-01
Most simulation plots are heavily oversampled. Ignoring unnecessary data points dramatically reduces plot time with imperceptible effect on quality. The technique is suited to most plot devices. The department's laser-printer speed was tripled for large simulation plots by data thinning. This reduced printer delays without the expense of a faster laser printer. Surprisingly, it saved computer time as well. All plot data are now thinned, including PostScript and terminal plots. The problem, solution, and conclusions are described. The thinning algorithm is described and performance studies are presented. To obtain FORTRAN 77 or C source listings, mail a SASE to the author.
NASA Technical Reports Server (NTRS)
Chan, Gordon C.
1991-01-01
The new 1991 COSMIC/NASTRAN version, compatible with the older versions, tries to remove some old constraints and make it easier to extract information from the plot file. It also includes some useful improvements and new enhancements. New features available in the 1991 version are described. They include a new PLT1 tape with simplified ASCII plot commands and short records, combined hidden and shrunk plot, an x-y-z coordinate system on all structural plots, element offset plot, improved character size control, improved FIND and NOFIND logic, a new NASPLOT post-processor to perform screen plotting or generate PostScript files, and a BASIC/NASTPLOT program for PC.
de Vries, W; Wieggers, H J J; Brus, D J
2010-08-05
Element fluxes through forest ecosystems are generally based on measurements of concentrations in soil solution at regular time intervals at plot locations sampled in a regular grid. Here we present spatially averaged annual element leaching fluxes in three Dutch forest monitoring plots using a new sampling strategy in which both sampling locations and sampling times are selected by probability sampling. Locations were selected by stratified random sampling with compact geographical blocks of equal surface area as strata. In each sampling round, six composite soil solution samples were collected, consisting of five aliquots, one per stratum. The plot-mean concentration was estimated by linear regression, so that the bias due to one or more strata being not represented in the composite samples is eliminated. The sampling times were selected in such a way that the cumulative precipitation surplus of the time interval between two consecutive sampling times was constant, using an estimated precipitation surplus averaged over the past 30 years. The spatially averaged annual leaching flux was estimated by using the modeled daily water flux as an ancillary variable. An important advantage of the new method is that the uncertainty in the estimated annual leaching fluxes due to spatial and temporal variation and resulting sampling errors can be quantified. Results of this new method were compared with the reference approach in which daily leaching fluxes were calculated by multiplying daily interpolated element concentrations with daily water fluxes and then aggregated to a year. Results show that the annual fluxes calculated with the reference method for the period 2003-2005, including all plots, elements and depths, lie within the range of the average ±2 standard errors of the new method in only 53% of the cases. Despite the differences in results, both methods indicate comparable N retention and strong Al mobilization in all plots, with Al leaching being
de la Fuente, Jaime; Garrett, C Gaelyn; Ossoff, Robert; Vinson, Kim; Francis, David O; Gelbard, Alexander
2017-11-01
To examine the distribution of clinic and operative pathology in a tertiary care laryngology practice. Probability density and cumulative distribution analyses (Pareto analysis) were used to rank-order laryngeal conditions seen in an outpatient tertiary care laryngology practice and those requiring surgical intervention during a 3-year period. Among 3783 new clinic consultations and 1380 operative procedures, voice disorders were the most common primary diagnostic category seen in clinic (n = 3223), followed by airway (n = 374) and swallowing (n = 186) disorders. Within the voice strata, the most common primary ICD-9 code used was dysphonia (41%), followed by unilateral vocal fold paralysis (UVFP) (9%) and cough (7%). Among new voice patients, 45% were found to have a structural abnormality. The most common surgical indications were laryngotracheal stenosis (37%), followed by recurrent respiratory papillomatosis (18%) and UVFP (17%). Nearly 55% of patients presenting to a tertiary referral laryngology practice did not have an identifiable structural abnormality in the larynx on direct or indirect examination. The distribution of ICD-9 codes requiring surgical intervention was disparate from that seen in clinic. Application of the Pareto principle may improve resource allocation in laryngology, but these initial results require confirmation across multiple institutions.
Plot shape effects on plant species diversity measurements
Keeley, Jon E.; Fotheringham, C.J.
2005-01-01
Question: Do rectangular sample plots record more plant species than square plots as suggested by both empirical and theoretical studies? Location: Grasslands, shrublands and forests in the Mediterranean-climate region of California, USA. Methods: We compared three 0.1-ha sampling designs that differed in the shape and dispersion of 1-m2 and 100-m2 nested subplots. We duplicated an earlier study that compared the Whittaker sample design, which had square clustered subplots, with the modified Whittaker design, which had dispersed rectangular subplots. To sort out effects of dispersion from shape we used a third design that overlaid square subplots on the modified Whittaker design. Also, using data from published studies we extracted species richness values for 400-m2 subplots that were either square or 1:4 rectangles partially overlaid on each other from desert scrub in high and low rainfall years, chaparral, sage scrub, oak savanna and coniferous forests with and without fire. Results: We found that earlier empirical reports of more than 30% greater richness with rectangles were due to the confusion of shape effects with spatial effects, coupled with the use of cumulative number of species as the metric for comparison. Average species richness was not significantly different between square and 1:4 rectangular sample plots at either 1- or 100-m2. Pairwise comparisons showed no significant difference between square and rectangular samples in all but one vegetation type, and that one exhibited significantly greater richness with squares. Our three intensive study sites appear to exhibit some level of self-similarity at the scale of 400 m2, but, contrary to theoretical expectations, we could not detect plot shape effects on species richness at this scale. Conclusions: At the 0.1-ha scale or lower there is no evidence that plot shape has predictable effects on number of species recorded from sample plots. We hypothesize that for the mediterranean
[Figure captions only] Figure 1. Ratio of cumulative released cells to cells initially present in the manure at Week 0 as they vary by time, manure type and age, microbe, and Event (i.e., season); the 95% confidence intervals of the observed median number of cells in microbial runoff are shown as a shaded area. Figure 2. Typical observed and simulated cumulative microbial runoff for Plots A403 and C209 with individual plot calibration. Figure 3. Observed versus simulated microbial runoff associated with Approach 1, adjusted for cumulative results by manure type and Event; results account for counts associated with field monitoring time intervals described in Section 2.1, Field method (NS = Nash-Sutcliffe modeling efficiency, EC = E. coli, En = enterococci, FC = fecal coliforms). Figure 4. Ratio of cumulative released cells/mass to cells/mass initially present in the aged manure by time and component (e.g., microbe) for solid manure (a, b) and for amended, dry litter, and slurry manure (c); solid lines (Equation 11) correspond to values in Table 3, with (a) using individual b values and (b, c) using combined b values.
Probability of detection of nests and implications for survey design
Smith, P.A.; Bart, J.; Lanctot, Richard B.; McCaffery, B.J.; Brown, S.
2009-01-01
Surveys based on double sampling include a correction for the probability of detection by assuming complete enumeration of birds in an intensively surveyed subsample of plots. To evaluate this assumption, we calculated the probability of detecting active shorebird nests by using information from observers who searched the same plots independently. Our results demonstrate that this probability varies substantially by species and stage of the nesting cycle but less by site or density of nests. Among the species we studied, the estimated single-visit probability of nest detection during the incubation period varied from 0.21 for the White-rumped Sandpiper (Calidris fuscicollis), the most difficult species to detect, to 0.64 for the Western Sandpiper (Calidris mauri), the most easily detected species, with a mean across species of 0.46. We used these detection probabilities to predict the fraction of persistent nests found over repeated nest searches. For a species with the mean value for detectability, the detection rate exceeded 0.85 after four visits. This level of nest detection was exceeded in only three visits for the Western Sandpiper, but six to nine visits were required for the White-rumped Sandpiper, depending on the type of survey employed. Our results suggest that the double-sampling method's requirement of nearly complete counts of birds in the intensively surveyed plots is likely to be met for birds with nests that survive over several visits of nest searching. Individuals with nests that fail quickly or individuals that do not breed can be detected with high probability only if territorial behavior is used to identify likely nesting pairs. © The Cooper Ornithological Society, 2009.
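The visit counts quoted above follow from simple arithmetic if visits are assumed independent (the paper's survey-specific designs relax this assumption): the cumulative probability of finding a persistent nest in k visits is 1 - (1 - p)^k.

```python
# Cumulative nest-detection probability under an independence assumption,
# using the single-visit probabilities quoted in the abstract.
for species, p in [("White-rumped Sandpiper", 0.21),
                   ("mean across species", 0.46),
                   ("Western Sandpiper", 0.64)]:
    cum = [1 - (1 - p) ** k for k in range(1, 7)]
    print(species, [round(c, 2) for c in cum])
# e.g. for the mean species, p = 0.46: four visits give 1 - 0.54**4 ~ 0.92 > 0.85
```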
Tygert, Mark
2010-09-21
We discuss several tests for determining whether a given set of independent and identically distributed (i.i.d.) draws does not come from a specified probability density function. The most commonly used are Kolmogorov-Smirnov tests, particularly Kuiper's variant, which focus on discrepancies between the cumulative distribution function for the specified probability density and the empirical cumulative distribution function for the given set of i.i.d. draws. Unfortunately, variations in the probability density function often get smoothed over in the cumulative distribution function, making it difficult to detect discrepancies in regions where the probability density is small in comparison with its values in surrounding regions. We discuss tests without this deficiency, complementing the classical methods. The tests of the present paper are based on the plain fact that it is unlikely to draw a random number whose probability is small, provided that the draw is taken from the same distribution used in calculating the probability (thus, if we draw a random number whose probability is small, then we can be confident that we did not draw the number from the same distribution used in calculating the probability).
Manktelow, Bradley N; Seaton, Sarah E; Evans, T Alun
2016-12-01
There is an increasing use of statistical methods, such as funnel plots, to identify poorly performing healthcare providers. Funnel plots comprise the construction of control limits around a benchmark and providers with outcomes falling outside the limits are investigated as potential outliers. The benchmark is usually estimated from observed data but uncertainty in this estimate is usually ignored when constructing control limits. In this paper, the use of funnel plots in the presence of uncertainty in the value of the benchmark is reviewed for outcomes from a Binomial distribution. Two methods to derive the control limits are shown: (i) prediction intervals; (ii) tolerance intervals. Tolerance intervals formally include the uncertainty in the value of the benchmark while prediction intervals do not. The probability properties of 95% control limits derived using each method were investigated through hypothesised scenarios. Neither prediction intervals nor tolerance intervals produce funnel plot control limits that satisfy the nominal probability characteristics when there is uncertainty in the value of the benchmark. This is not necessarily to say that funnel plots have no role to play in healthcare, but that without the development of intervals satisfying the nominal probability characteristics they must be interpreted with care. © The Author(s) 2014.
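A sketch of the conventional (prediction-interval style) funnel-plot limits that the paper reviews, treating the benchmark proportion as known; the paper's point is precisely that uncertainty in this benchmark is ignored here. All values are illustrative.

```python
import numpy as np
from scipy import stats

p0 = 0.08                              # benchmark event proportion (treated as known)
n = np.arange(20, 2001)                # range of provider volumes

# Exact binomial 95% limits on the observed proportion at each volume
lower = stats.binom.ppf(0.025, n, p0) / n
upper = stats.binom.ppf(0.975, n, p0) / n
# A provider whose observed proportion falls outside (lower, upper) at its
# volume would be flagged as a potential outlier.
```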
DOE Office of Scientific and Technical Information (OSTI.GOV)
Norris, J.; Daniels, B.
1986-02-01
The fancy plot package is a group of five programs which allow the user to make 2- and 3-dimensional document quality plots from the SIG data base. The fancy plot package was developed using a DEC VT100 terminal fitted with a Digital Engineering Retrographics board and the QMS Laserprinter. If a terminal emulates the VT100/Retrographic terminal the package should work. A Pericom terminal, for example, works perfectly. The fancy plot package is available to provide report-ready plots without resorting to cutting and pasting. This package is contained in programs FFP, TFP, TDFD, 3DFFP and 3DTFP in directory ERD131::USER2 DISK:(HUDSON.SIG). These programs may be summarized as follows: FFP - 2-Dimensional Frequency Fancy Plots with magnitude/phase option; TFP - 2-Dimensional Time Fancy Plots; TDFD - 2-Dimensional Time Domain Frequency Domain Plots; 3DFFP - equally spaced 3-Dimensional Frequency Fancy Plots; and 3DTFP - equally spaced 3-Dimensional Time Plots. 8 figs.
NASA Astrophysics Data System (ADS)
Jambrina, P. G.; Lara, Manuel; Menéndez, M.; Launay, J.-M.; Aoiz, F. J.
2012-10-01
Cumulative reaction probabilities (CRPs) at various total angular momenta have been calculated for the barrierless reaction S(1D) + H2 → SH + H at total energies up to 1.2 eV using three different theoretical approaches: time-independent quantum mechanics (QM), quasiclassical trajectories (QCT), and statistical quasiclassical trajectories (SQCT). The calculations have been carried out on the widely used potential energy surface (PES) by Ho et al. [J. Chem. Phys. 116, 4124 (2002), 10.1063/1.1431280] as well as on the recent PES developed by Song et al. [J. Phys. Chem. A 113, 9213 (2009), 10.1021/jp903790h]. The results show that the differences between these two PES are relatively minor and mostly related to the different topologies of the well. In addition, the agreement between the three theoretical methodologies is good, even for the highest total angular momenta and energies. In particular, the good accordance between the CRPs obtained with dynamical methods (QM and QCT) and the statistical model (SQCT) indicates that the reaction can be considered statistical in the whole range of energies, in contrast with the findings for other prototypical barrierless reactions. In addition, total CRPs and rate coefficients in the range of 20-1000 K have been calculated using the QCT and SQCT methods and have been found to be somewhat smaller than the experimental total removal rates of S(1D).
Model-checking techniques based on cumulative residuals.
Lin, D Y; Wei, L J; Ying, Z
2002-03-01
Residuals have long been used for graphical and numerical examinations of the adequacy of regression models. Conventional residual analysis based on the plots of raw residuals or their smoothed curves is highly subjective, whereas most numerical goodness-of-fit tests provide little information about the nature of model misspecification. In this paper, we develop objective and informative model-checking techniques by taking the cumulative sums of residuals over certain coordinates (e.g., covariates or fitted values) or by considering some related aggregates of residuals, such as moving sums and moving averages. For a variety of statistical models and data structures, including generalized linear models with independent or dependent observations, the distributions of these stochastic processes under the assumed model can be approximated by the distributions of certain zero-mean Gaussian processes whose realizations can be easily generated by computer simulation. Each observed process can then be compared, both graphically and numerically, with a number of realizations from the Gaussian process. Such comparisons enable one to assess objectively whether a trend seen in a residual plot reflects model misspecification or natural variation. The proposed techniques are particularly useful in checking the functional form of a covariate and the link function. Illustrations with several medical studies are provided.
NASA Astrophysics Data System (ADS)
Taylor, M. B.
2009-09-01
The new plotting functionality in version 2.0 of STILTS is described. STILTS is a mature and powerful package for all kinds of table manipulation, and this version adds facilities for generating plots from one or more tables to its existing wide range of non-graphical capabilities. 2- and 3-dimensional scatter plots and 1-dimensional histograms may be generated using highly configurable style parameters. Features include multiple dataset overplotting, variable transparency, 1-, 2- or 3-dimensional symmetric or asymmetric error bars, higher-dimensional visualization using color, and textual point labeling. Vector and bitmapped output formats are supported. The plotting options provide enough flexibility to perform meaningful visualization on datasets from a few points up to tens of millions. Arbitrarily large datasets can be plotted without heavy memory usage.
Babamoradi, Hamid; van den Berg, Frans; Rinnan, Åsmund
2016-02-18
In Multivariate Statistical Process Control, when a fault is expected or detected in the process, contribution plots are essential for operators and optimization engineers in identifying those process variables that were affected by or might be the cause of the fault. The traditional way of interpreting a contribution plot is to examine the largest contributing process variables as the most probable faulty ones. This might result in false readings purely due to the differences in natural variation, measurement uncertainties, etc. It is more reasonable to compare variable contributions for new process runs with historical results achieved under Normal Operating Conditions, where confidence limits (CLs) for contribution plots estimated from training data are used to judge new production runs. Asymptotic methods cannot provide confidence limits for contribution plots, leaving re-sampling methods as the only option. We suggest bootstrap re-sampling to build confidence limits for all contribution plots in online PCA-based MSPC. The new strategy to estimate CLs is compared to the previously reported CLs for contribution plots. An industrial batch process dataset was used to illustrate the concepts. Copyright © 2016 Elsevier B.V. All rights reserved.
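A bare-bones version of the bootstrap idea, assuming a matrix of per-variable contributions from Normal Operating Condition runs is already available (in practice it would come from the PCA-based MSPC model); the data and the specific percentile recipe are illustrative, not the paper's exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(7)
noc = rng.gamma(shape=2.0, scale=1.0, size=(60, 12))  # synthetic: 60 NOC runs x 12 variables

B = 2000
boot_q95 = np.empty((B, noc.shape[1]))
for b in range(B):
    idx = rng.integers(0, noc.shape[0], size=noc.shape[0])  # resample runs with replacement
    boot_q95[b] = np.percentile(noc[idx], 95, axis=0)       # per-variable 95th percentile

cl = boot_q95.mean(axis=0)   # bootstrap control limit for each variable's contribution
# A new run whose contribution exceeds cl[j] for variable j would be examined
# as potentially affected by, or causing, the fault.
```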
Rodrigues, Nils; Weiskopf, Daniel
2018-01-01
Conventional dot plots use a constant dot size and are typically applied to show the frequency distribution of small data sets. Unfortunately, they are not designed for a high dynamic range of frequencies. We address this problem by introducing nonlinear dot plots. Adopting the idea of nonlinear scaling from logarithmic bar charts, our plots allow for dots of varying size so that columns with a large number of samples are reduced in height. For the construction of these diagrams, we introduce an efficient two-way sweep algorithm that leads to a dense and symmetrical layout. We compensate aliasing artifacts at high dot densities by a specifically designed low-pass filtering method. Examples of nonlinear dot plots are compared to conventional dot plots as well as linear and logarithmic histograms. Finally, we include feedback from an expert review.
NEMAR plotting computer program
NASA Technical Reports Server (NTRS)
Myler, T. R.
1981-01-01
A FORTRAN coded computer program which generates CalComp plots of trajectory parameters is examined. The trajectory parameters are calculated and placed on a data file by the Near Earth Mission Analysis Routine computer program. The plot program accesses the data file and generates the plots as defined by inputs to the plot program. Program theory, user instructions, output definitions, subroutine descriptions and detailed FORTRAN coding information are included. Although this plot program utilizes a random access data file, a data file of the same type and formatted in 102 numbers per record could be generated by any computer program and used by this plot program.
Probability and Cumulative Density Function Methods for the Stochastic Advection-Reaction Equation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barajas-Solano, David A.; Tartakovsky, Alexandre M.
We present a cumulative density function (CDF) method for the probabilistic analysis of d-dimensional advection-dominated reactive transport in heterogeneous media. We employ a probabilistic approach in which epistemic uncertainty on the spatial heterogeneity of Darcy-scale transport coefficients is modeled in terms of random fields with given correlation structures. Our proposed CDF method employs a modified Large-Eddy-Diffusivity (LED) approach to close and localize the nonlocal equations governing the one-point PDF and CDF of the concentration field, resulting in a (d + 1)-dimensional PDE. Compared to the classical LED localization, the proposed modified LED localization explicitly accounts for the mean-field advective dynamics over the phase space of the PDF and CDF. To illustrate the accuracy of the proposed closure, we apply our CDF method to one-dimensional single-species reactive transport with uncertain, heterogeneous advection velocities and reaction rates modeled as random fields.
NASA Astrophysics Data System (ADS)
Pham, T. D.
2016-12-01
Recurrence plots display binary texture of time series from dynamical systems with single dots and line structures. Using fuzzy recurrence plots, recurrences of the phase-space states can be visualized as grayscale texture, which is more informative for pattern analysis. The proposed method replaces the crucial similarity threshold required by symmetrical recurrence plots with the number of cluster centers, where the estimate of the latter parameter is less critical than the estimate of the former.
1987-03-25
by Lloyd (1952) using generalized least squares instead of ordinary least squares, and by Wilk, Gnanadesikan, and Freeny (1963) using a maximum... plot. The half-normal distribution is a special case of the gamma distribution proposed by Wilk, Gnanadesikan, and Huyett (1962). VARIATIONS ON THE... Gnanadesikan, R. Probability plotting methods for the analysis of data. Biometrika, 1968, 55, 1-17. This paper describes and discusses graphical techniques
Space Shuttle Launch Probability Analysis: Understanding History so We Can Predict the Future
NASA Technical Reports Server (NTRS)
Cates, Grant R.
2014-01-01
The Space Shuttle was launched 135 times and nearly half of those launches required 2 or more launch attempts. The historical record of 250 Space Shuttle launch countdown attempts provides a wealth of data that is important to analyze for strictly historical purposes as well as for use in predicting future launch vehicle countdown performance. This paper provides a statistical analysis of all Space Shuttle launch attempts including the empirical probability of launch on any given attempt and the cumulative probability of launch relative to the planned launch date at the start of the initial launch countdown. This information can be used to facilitate launch probability predictions of future launch vehicles such as NASA's Space Shuttle-derived SLS. Understanding the cumulative probability of launch is particularly important for missions to Mars since the launch opportunities are relatively short in duration and one must wait for 2 years before a subsequent attempt can begin.
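The two empirical quantities the paper tabulates can be computed from an attempt history as below; the counts here are invented for illustration, not the actual Shuttle record.

```python
import numpy as np

# launched_on[k] = number of missions that launched on their k-th countdown attempt
launched_on = {1: 70, 2: 35, 3: 20, 4: 10}             # invented counts
missions = sum(launched_on.values())

attempts = sorted(launched_on)
cumulative = np.cumsum([launched_on[k] for k in attempts]) / missions
print(dict(zip(attempts, cumulative.round(2))))        # P(launched by attempt k)

total_attempts = sum(k * v for k, v in launched_on.items())
print(missions / total_attempts)                       # empirical per-attempt launch probability
```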
Simple gain probability functions for large reflector antennas of JPL/NASA
NASA Technical Reports Server (NTRS)
Jamnejad, V.
2003-01-01
Simple models for the patterns of the Deep Space Network antennas, as well as their cumulative gain probability and probability density functions, are developed. These are needed for the study and evaluation of interference from unwanted sources, such as the emerging terrestrial High Density Fixed Service, with the Ka-band receiving antenna systems at the Goldstone Station of the Deep Space Network.
Sampling Error in Relation to Cyst Nematode Population Density Estimation in Small Field Plots.
Župunski, Vesna; Jevtić, Radivoje; Jokić, Vesna Spasić; Župunski, Ljubica; Lalošević, Mirjana; Ćirić, Mihajlo; Ćurčić, Živko
2017-06-01
Cyst nematodes are serious plant-parasitic pests which could cause severe yield losses and extensive damage. Since there is still very little information about error of population density estimation in small field plots, this study contributes to the broad issue of population density assessment. It was shown that there was no significant difference between cyst counts of five or seven bulk samples taken per each 1-m2 plot, if the average cyst count per examined plot exceeds 75 cysts per 100 g of soil. Goodness of fit of data to probability distribution tested with the χ2 test confirmed a negative binomial distribution of cyst counts for 21 out of 23 plots. The recommended measure of sampling precision of 17% expressed through the coefficient of variation (cv) was achieved if the plots of 1 m2 contaminated with more than 90 cysts per 100 g of soil were sampled with 10-core bulk samples taken in five repetitions. If plots were contaminated with less than 75 cysts per 100 g of soil, 10-core bulk samples taken in seven repetitions gave cv higher than 23%. This study indicates that more attention should be paid to estimation of sampling error in experimental field plots to ensure more reliable estimation of population density of cyst nematodes.
Setting cumulative emissions targets to reduce the risk of dangerous climate change.
Zickfeld, Kirsten; Eby, Michael; Matthews, H Damon; Weaver, Andrew J
2009-09-22
Avoiding "dangerous anthropogenic interference with the climate system" requires stabilization of atmospheric greenhouse gas concentrations and substantial reductions in anthropogenic emissions. Here, we present an inverse approach to coupled climate-carbon cycle modeling, which allows us to estimate the probability that any given level of carbon dioxide (CO2) emissions will exceed specified long-term global mean temperature targets for "dangerous anthropogenic interference," taking into consideration uncertainties in climate sensitivity and the carbon cycle response to climate change. We show that to stabilize global mean temperature increase at 2 degrees C above preindustrial levels with a probability of at least 0.66, cumulative CO2 emissions from 2000 to 2500 must not exceed a median estimate of 590 petagrams of carbon (PgC) (range, 200 to 950 PgC). If the 2 degrees C temperature stabilization target is to be met with a probability of at least 0.9, median total allowable CO2 emissions are 170 PgC (range, -220 to 700 PgC). Furthermore, these estimates of cumulative CO2 emissions, compatible with a specified temperature stabilization target, are independent of the path taken to stabilization. Our analysis therefore supports an international policy framework aimed at avoiding dangerous anthropogenic interference formulated on the basis of total allowable greenhouse gas emissions.
CROSSER - CUMULATIVE BINOMIAL PROGRAMS
NASA Technical Reports Server (NTRS)
Bowerman, P. N.
1994-01-01
The cumulative binomial program, CROSSER, is one of a set of three programs which calculate cumulative binomial probability distributions for arbitrary inputs. The three programs, CROSSER, CUMBIN (NPO-17555), and NEWTONP (NPO-17556), can be used independently of one another. CROSSER can be used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. The program has been used for reliability/availability calculations. CROSSER calculates the point at which the reliability of a k-out-of-n system equals the common reliability of the n components. It is designed to work well with all integer values 0 < k <= n. To run the program, the user simply runs the executable version and inputs the information requested by the program. The program is not designed to weed out incorrect inputs, so the user must take care to make sure the inputs are correct. Once all input has been entered, the program calculates and lists the result. It also lists the number of iterations of Newton's method required to calculate the answer within the given error. The CROSSER program is written in C. It was developed on an IBM AT with a numeric co-processor using Microsoft C 5.0. Because the source code is written using standard C structures and functions, it should compile correctly with most C compilers. The program format is interactive. It has been implemented under DOS 3.2 and has a memory requirement of 26K. CROSSER was developed in 1988.
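A sketch of the quantity CROSSER computes, per this abstract: the component reliability p at which a k-out-of-n system's reliability equals p itself. Bisection is used here for simplicity instead of the program's Newton iteration, and the example assumes an interior crossing (1 < k < n).

```python
from math import comb

def k_out_of_n(n: int, k: int, p: float) -> float:
    """Reliability of a system needing k of n components, each with reliability p."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def crossing(n: int, k: int, tol: float = 1e-12) -> float:
    """Interior solution of k_out_of_n(n, k, p) == p by bisection."""
    lo, hi = 1e-9, 1 - 1e-9
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if k_out_of_n(n, k, mid) > mid:
            hi = mid   # system already beats a single component; crossing lies below
        else:
            lo = mid
    return (lo + hi) / 2

print(crossing(5, 3))  # a 3-of-5 majority system crosses at p = 0.5 by symmetry
```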
NASA Technical Reports Server (NTRS)
Mcentire, K.
1994-01-01
NPLOT is an interactive computer graphics program for plotting undeformed and deformed NASTRAN finite element models (FEMs). Although there are many commercial codes already available for plotting FEMs, these have limited use due to their cost, speed, and lack of features to view BAR elements. NPLOT was specifically developed to overcome these limitations. On a vector type graphics device the two best ways to show depth are by hidden line plotting or haloed line plotting. A hidden line algorithm generates views of models with all hidden lines removed, and a haloed line algorithm displays views with aft lines broken in order to show depth while keeping the entire model visible. A haloed line algorithm is especially useful for plotting models composed of many line elements and few surface elements. The most important feature of NPLOT is its ability to create both hidden line and haloed line views accurately and much more quickly than with any other existing hidden or haloed line algorithms. NPLOT is also capable of plotting a normal wire frame view to display all lines of a model. NPLOT is able to aid in viewing all elements, but it has special features not generally available for plotting BAR elements. These features include plotting of TRUE LENGTH and NORMALIZED offset vectors and orientation vectors. Standard display operations such as rotation and perspective are possible, but different view planes such as X-Y, Y-Z, and X-Z may also be selected. Another display option is the Z-axis cut which allows a portion of the fore part of the model to be cut away to reveal details of the inside of the model. A zoom function is available to terminals with a locator (graphics cursor, joystick, etc.). The user interface of NPLOT is designed to make the program quick and easy to use. A combination of menus and commands with help menus for detailed information about each command allows experienced users greater speed and efficiency. Once a plot is on the screen the interface
Detection probability of EBPSK-MODEM system
NASA Astrophysics Data System (ADS)
Yao, Yu; Wu, Lenan
2016-07-01
Since the impacting filter-based receiver is able to transform phase modulation into amplitude peaks, a simple threshold decision can detect the Extended Binary Phase Shift Keying (EBPSK) modulated ranging signal in a noisy environment. In this paper, an analysis of the EBPSK-MODEM system output gives the probability density function for EBPSK modulated signals plus noise. The equation of detection probability (pd) for fluctuating and non-fluctuating targets has been deduced. Also, a comparison of the pd for the EBPSK-MODEM system and a pulse radar receiver is made, and some results are plotted. Moreover, the probability curves of such a system with several modulation parameters are analysed. When the modulation parameter is not smaller than 6, the detection performance of the EBPSK-MODEM system is better than that of a traditional radar system. In addition to theoretical considerations, computer simulations are provided for illustrating the performance.
Prospect evaluation as a function of numeracy and probability denominator.
Millroth, Philip; Juslin, Peter
2015-05-01
This study examines how numeracy and probability denominator (a direct-ratio probability, a relative frequency with denominator 100, a relative frequency with denominator 10,000) affect the evaluation of prospects in an expected-value based pricing task. We expected that numeracy would affect the results due to differences in the linearity of number perception and the susceptibility to denominator neglect with different probability formats. An analysis with functional measurement verified that participants integrated value and probability into an expected value. However, a significant interaction between numeracy and probability format and subsequent analyses of the parameters of cumulative prospect theory showed that the manipulation of probability denominator changed participants' psychophysical response to probability and value. Standard methods in decision research may thus confound people's genuine risk attitude with their numerical capacities and the probability format used. Copyright © 2015 Elsevier B.V. All rights reserved.
Viana, Duarte S; Santamaría, Luis; Figuerola, Jordi
2016-02-01
Propagule retention time is a key factor in determining propagule dispersal distance and the shape of "seed shadows". Propagules dispersed by animal vectors are either ingested and retained in the gut until defecation or attached externally to the body until detachment. Retention time is a continuous variable, but it is commonly measured at discrete time points, according to pre-established sampling time-intervals. Although parametric continuous distributions have been widely fitted to these interval-censored data, the performance of different fitting methods has not been evaluated. To investigate the performance of five different fitting methods, we fitted parametric probability distributions to typical discretized retention-time data with known distribution using as data-points either the lower, mid or upper bounds of sampling intervals, as well as the cumulative distribution of observed values (using either maximum likelihood or non-linear least squares for parameter estimation); then compared the estimated and original distributions to assess the accuracy of each method. We also assessed the robustness of these methods to variations in the sampling procedure (sample size and length of sampling time-intervals). Fittings to the cumulative distribution performed better for all types of parametric distributions (lognormal, gamma and Weibull distributions) and were more robust to variations in sample size and sampling time-intervals. These estimated distributions had negligible deviations of up to 0.045 in cumulative probability of retention times (according to the Kolmogorov-Smirnov statistic) in relation to original distributions from which propagule retention time was simulated, supporting the overall accuracy of this fitting method. In contrast, fitting the sampling-interval bounds resulted in greater deviations that ranged from 0.058 to 0.273 in cumulative probability of retention times, which may introduce considerable biases in parameter estimates. We
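A sketch of the best-performing approach reported here: fit a parametric CDF to the cumulative proportion of propagules recovered by the end of each sampling interval, via nonlinear least squares. The data and the lognormal choice are illustrative assumptions.

```python
import numpy as np
from scipy import optimize, stats

interval_ends = np.array([2.0, 4.0, 8.0, 12.0, 24.0, 48.0])       # hours (invented)
cum_recovered = np.array([0.10, 0.30, 0.62, 0.78, 0.95, 1.00])    # cumulative fraction

def lognormal_cdf(t, mu, sigma):
    return stats.norm.cdf((np.log(t) - mu) / sigma)

# Least-squares fit of the parametric CDF to the observed cumulative distribution
(mu, sigma), _ = optimize.curve_fit(lognormal_cdf, interval_ends, cum_recovered,
                                    p0=(2.0, 1.0))
print(f"median retention ~ {np.exp(mu):.1f} h, sigma ~ {sigma:.2f}")
```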
Schaubel, Douglas E; Wei, Guanghui
2011-03-01
In medical studies of time-to-event data, nonproportional hazards and dependent censoring are very common issues when estimating the treatment effect. A traditional method for dealing with time-dependent treatment effects is to model the time-dependence parametrically. Limitations of this approach include the difficulty to verify the correctness of the specified functional form and the fact that, in the presence of a treatment effect that varies over time, investigators are usually interested in the cumulative as opposed to instantaneous treatment effect. In many applications, censoring time is not independent of event time. Therefore, we propose methods for estimating the cumulative treatment effect in the presence of nonproportional hazards and dependent censoring. Three measures are proposed, including the ratio of cumulative hazards, relative risk, and difference in restricted mean lifetime. For each measure, we propose a double inverse-weighted estimator, constructed by first using inverse probability of treatment weighting (IPTW) to balance the treatment-specific covariate distributions, then using inverse probability of censoring weighting (IPCW) to overcome the dependent censoring. The proposed estimators are shown to be consistent and asymptotically normal. We study their finite-sample properties through simulation. The proposed methods are used to compare kidney wait-list mortality by race. © 2010, The International Biometric Society.
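A schematic of the double weighting described above, assuming the two model-based quantities are already estimated per subject: a propensity score for treatment (for IPTW) and a censoring survival probability over follow-up (for IPCW). Column names and numbers are hypothetical, and real estimators would apply these weights within time-dependent risk sets.

```python
import numpy as np
import pandas as pd

# Hypothetical per-subject columns:
#   treated - 0/1 treatment indicator
#   pscore  - estimated P(treated | covariates)
#   G_t     - estimated P(remaining uncensored through follow-up | covariates)
df = pd.DataFrame({
    "treated": [1, 0, 1, 0],
    "pscore":  [0.60, 0.30, 0.70, 0.40],
    "G_t":     [0.90, 0.85, 0.80, 0.95],
})

iptw = np.where(df.treated == 1, 1 / df.pscore, 1 / (1 - df.pscore))
ipcw = 1 / df.G_t
df["weight"] = iptw * ipcw  # double inverse weight applied to each subject's contributions
```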
MID Plot: a new lithology technique. [Matrix identification plot
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clavier, C.; Rust, D.H.
1976-01-01
Lithology interpretation by the Litho-Porosity (M-N) method has been used for years, but is evidently too cumbersome and ambiguous for widespread acceptance as a field technique. To set aside these objections, another method has been devised. Instead of the log-derived parameters M and N, the MID Plot uses quasi-physical quantities, (rho/sub ma/)/sub a/ and (..delta..t/sub ma/)/sub a/, as its porosity-independent variables. These parameters, taken from suitably scaled Neutron-Density and Sonic-Neutron crossplots, define a unique matrix mineral or mixture for each point on the logs. The matrix points on the MID Plot thus remain constant in spite of changes in mudmore » filtrate, porosity, or neutron tool types (all of which significantly affect the M-N Plot). This new development is expected to bring welcome relief in areas where lithology identification is a routine part of log analysis.« less
CDF-XL: computing cumulative distribution functions of reaction time data in Excel.
Houghton, George; Grange, James A
2011-12-01
In experimental psychology, central tendencies of reaction time (RT) distributions are used to compare different experimental conditions. This emphasis on the central tendency ignores additional information that may be derived from the RT distribution itself. One method for analysing RT distributions is to construct cumulative distribution frequency plots (CDFs; Ratcliff, Psychological Bulletin 86:446-461, 1979). However, this method is difficult to implement in widely available software, severely restricting its use. In this report, we present an Excel-based program, CDF-XL, for constructing and analysing CDFs, with the aim of making such techniques more readily accessible to researchers, including students (CDF-XL can be downloaded free of charge from the Psychonomic Society's online archive). CDF-XL functions as an Excel workbook and starts from the raw experimental data, organised into three columns (Subject, Condition, and RT) on an Input Data worksheet (a point-and-click utility is provided for achieving this format from a broader data set). No further preprocessing or sorting of the data is required. With one click of a button, CDF-XL will generate two forms of cumulative analysis: (1) "standard" CDFs, based on percentiles of participant RT distributions (by condition), and (2) a related analysis employing the participant means of rank-ordered RT bins. Both analyses involve partitioning the data in similar ways, but the first uses a "median"-type measure at the participant level, while the latter uses the mean. The results are presented in three formats: (i) by participants, suitable for entry into further statistical analysis; (ii) grand means by condition; and (iii) completed CDF plots in Excel charts.
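A rough pandas analogue of the first of CDF-XL's two analyses (participant-level RT percentiles, then averaged by condition), using the same three-column layout as the Input Data worksheet; the data here are synthetic.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
df = pd.DataFrame({
    "Subject": np.repeat([1, 2, 3], 200),
    "Condition": np.tile(np.repeat(["A", "B"], 100), 3),
    "RT": rng.lognormal(mean=6.2, sigma=0.3, size=600),  # synthetic RTs (ms)
})

qs = np.round(np.arange(0.1, 1.0, 0.1), 1)
# Per-participant, per-condition percentiles (the "standard" CDFs) ...
per_participant = df.groupby(["Subject", "Condition"])["RT"].quantile(qs).unstack()
# ... then the grand-mean CDF for each condition, ready for plotting or further stats.
grand_cdf = per_participant.groupby(level="Condition").mean()
print(grand_cdf)
```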
Map_plot and bgg_plot: software for integration of geoscience datasets
NASA Astrophysics Data System (ADS)
Gaillot, Philippe; Punongbayan, Jane T.; Rea, Brice
2004-02-01
Since 1985, the Ocean Drilling Program (ODP) has been supporting multidisciplinary research in exploring the structure and history of Earth beneath the oceans. After more than 200 Legs, complementary datasets covering different geological environments, periods and space scales have been obtained and distributed world-wide using the ODP-Janus and Lamont Doherty Earth Observatory-Borehole Research Group (LDEO-BRG) database servers. In Earth Sciences, more than in any other science, the ensemble of these data is characterized by heterogeneous formats and graphical representation modes. In order to fully and quickly assess this information, a set of Unix/Linux and Generic Mapping Tool-based C programs has been designed to convert and integrate datasets acquired during the present ODP and the future Integrated ODP (IODP) Legs. Using ODP Leg 199 datasets, we show examples of the capabilities of the proposed programs. The program map_plot is used to easily display datasets onto 2-D maps. The program bgg_plot (borehole geology and geophysics plot) displays data with respect to depth and/or time. The latter program includes depth shifting, filtering and plotting of core summary information, continuous and discrete-sample core measurements (e.g. physical properties, geochemistry, etc.), in situ continuous logs, magneto- and bio-stratigraphies, specific sedimentological analyses (lithology, grain size, texture, porosity, etc.), as well as core and borehole wall images. Outputs from both programs are initially produced in PostScript format that can be easily converted to Portable Document Format (PDF) or standard image formats (GIF, JPEG, etc.) using widely distributed conversion programs. Based on command line operations and customization of parameter files, these programs can be included in other shell- or database-scripts, automating plotting procedures of data requests. As an open source software, these programs can be customized and interfaced to fulfill any specific
NASA Technical Reports Server (NTRS)
Walatka, Pamela P.; Buning, Pieter G.; Pierce, Larry; Elson, Patricia A.
1990-01-01
PLOT3D is a computer graphics program designed to visualize the grids and solutions of computational fluid dynamics. Seventy-four functions are available. Versions are available for many systems. PLOT3D can handle multiple grids with a million or more grid points, and can produce varieties of model renderings, such as wireframe or flat shaded. Output from PLOT3D can be used in animation programs. The first part of this manual is a tutorial that takes the reader, keystroke by keystroke, through a PLOT3D session. The second part of the manual contains reference chapters, including the helpfile, data file formats, advice on changing PLOT3D, and sample command files.
General purpose film plotting system
NASA Technical Reports Server (NTRS)
Mcquillan, C.
1977-01-01
The general purpose film plotting system, a plot program designed to handle the majority of the data tape formats presently available under OS/360, is discussed. The convenience of this program lies in the fact that the user merely describes the format of his data set and the type of data plots he desires. The program processes the input data according to the given specifications. The output is generated on a tape which yields data plots when processed by the selected plotter. A summary of each job is produced on the printer.
Quantitative methods for analysing cumulative effects on fish migration success: a review.
Johnson, J E; Patterson, D A; Martins, E G; Cooke, S J; Hinch, S G
2012-07-01
It is often recognized, but seldom addressed, that a quantitative assessment of the cumulative effects, both additive and non-additive, of multiple stressors on fish survival would provide a more realistic representation of the factors that influence fish migration. This review presents a compilation of analytical methods applied to a well-studied fish migration, a more general review of quantitative multivariable methods, and a synthesis on how to apply new analytical techniques in fish migration studies. A compilation of adult migration papers from Fraser River sockeye salmon Oncorhynchus nerka revealed a limited number of multivariable methods being applied and the sub-optimal reliance on univariable methods for multivariable problems. The literature review of fisheries science, general biology and medicine identified a large number of alternative methods for dealing with cumulative effects, with a limited number of techniques being used in fish migration studies. An evaluation of the different methods revealed that certain classes of multivariable analyses will probably prove useful in future assessments of cumulative effects on fish migration. This overview and evaluation of quantitative methods gathered from the disparate fields should serve as a primer for anyone seeking to quantify cumulative effects on fish migration survival. © 2012 The Authors. Journal of Fish Biology © 2012 The Fisheries Society of the British Isles.
Adaptive strategies for cumulative cultural learning.
Ehn, Micael; Laland, Kevin
2012-05-21
The demographic and ecological success of our species is frequently attributed to our capacity for cumulative culture. However, it is not yet known how humans combine social and asocial learning to generate effective strategies for learning in a cumulative cultural context. Here we explore how cumulative culture influences the relative merits of various pure and conditional learning strategies, including pure asocial and social learning, critical social learning, conditional social learning and individual refiner strategies. We replicate Rogers' paradox in the cumulative setting. However, our analysis suggests that strategies that resolved Rogers' paradox in a non-cumulative setting may not necessarily evolve in a cumulative setting; thus, different strategies will optimize cumulative and non-cumulative cultural learning. Copyright © 2012 Elsevier Ltd. All rights reserved.
A Seakeeping Performance and Affordability Tradeoff Study for the Coast Guard Offshore Patrol Cutter
2016-06-01
List-of-figures excerpt: index polar plot for Sea State 4 (all headings relative to the wave motion; velocity given in meters per second); Figure 15: probability and cumulative density functions of annual sea state occurrences in the open ocean, North Pacific. Text fragment: … criteria at a given sea state. Probability distribution functions are available that describe the likelihood that an operational area will experience
Probability and predictors of the cannabis gateway effect: A national study
Secades-Villa, Roberto; Garcia-Rodríguez, Olaya; Jin, Chelsea, J.; Wang, Shuai; Blanco, Carlos
2014-01-01
Background: While several studies have shown a high association between cannabis use and use of other illicit drugs, the predictors of progression from cannabis to other illicit drugs remain largely unknown. This study aims to estimate the cumulative probability of progression to illicit drug use among individuals with a lifetime history of cannabis use, and to identify predictors of progression from cannabis use to other illicit drug use. Methods: Analyses were conducted on the sub-sample of participants in Wave 1 of the National Epidemiologic Survey on Alcohol and Related Conditions (NESARC) who started cannabis use before using any other drug (n = 6,624). Estimated projections of the cumulative probability of progression from cannabis use to use of any other illegal drug in the general population were obtained by the standard actuarial method. Univariate and multivariable survival analyses with time-varying covariates were implemented to identify predictors of progression to any drug use. Results: Lifetime cumulative probability estimates indicated that 44.7% of individuals with lifetime cannabis use progressed to other illicit drug use at some time in their lives. Several sociodemographic characteristics, internalizing and externalizing psychiatric disorders and indicators of substance use severity predicted progression from cannabis use to other illicit drug use. Conclusion: A large proportion of individuals who use cannabis go on to use other illegal drugs. The increased risk of progression from cannabis use to other illicit drug use among individuals with mental disorders underscores the importance of considering the benefits and adverse effects of changes in cannabis regulations and of developing prevention and treatment strategies directed at curtailing cannabis use in these populations. PMID:25168081
Saugel, Bernd; Grothe, Oliver; Wagner, Julia Y
2015-08-01
When comparing 2 technologies for measuring hemodynamic parameters with regard to their ability to track changes, 2 graphical tools are omnipresent in the literature: the 4-quadrant plot and the polar plot recently proposed by Critchley et al. The polar plot is thought to be the more advanced statistical tool, but care should be taken when it comes to its interpretation. The polar plot excludes possibly important measurements from the data, and it transforms the data nonlinearly, which may prevent the data from being seen clearly. In this article, we compare the 4-quadrant and the polar plot in detail and thoroughly describe the advantages and limitations of each. We also discuss pitfalls of both methods to prepare the researcher for their sound use. Finally, we briefly revisit the Bland-Altman plot for use in this context.
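For readers unfamiliar with the construction, a minimal sketch of the polar transformation and the data exclusion discussed above follows; the mean-change radius, the 45-degree reference line, and the exclusion threshold value are assumptions in the spirit of the method, not its authoritative definition:

    import numpy as np

    def polar_transform(d_ref, d_test, exclusion=0.5):
        """Map paired changes (reference vs. test method) to polar-plot
        coordinates: radius is the mean change, angle is measured from the
        45-degree line of identity. Small changes below `exclusion` are
        dropped -- the exclusion of possibly important measurements noted
        above."""
        d_ref = np.asarray(d_ref, float)
        d_test = np.asarray(d_test, float)
        radius = (d_ref + d_test) / 2.0
        theta = np.degrees(np.arctan2(d_test, d_ref)) - 45.0  # the nonlinear step
        keep = np.abs(radius) >= exclusion
        return radius[keep], theta[keep]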
Dong, Huiru; Robison, Leslie L; Leisenring, Wendy M; Martin, Leah J; Armstrong, Gregory T; Yasui, Yutaka
2015-04-01
Cumulative incidence has been widely used to estimate the cumulative probability of developing an event of interest by a given time, in the presence of competing risks. When it is of interest to measure the total burden of recurrent events in a population, however, the cumulative incidence method is not appropriate because it considers only the first occurrence of the event of interest for each individual in the analysis: Subsequent occurrences are not included. Here, we discuss a straightforward and intuitive method termed "mean cumulative count," which reflects a summarization of all events that occur in the population by a given time, not just the first event for each subject. We explore the mathematical relationship between mean cumulative count and cumulative incidence. Detailed calculation of mean cumulative count is described by using a simple hypothetical example, and the computation code with an illustrative example is provided. Using follow-up data from January 1975 to August 2009 collected in the Childhood Cancer Survivor Study, we show applications of mean cumulative count and cumulative incidence for the outcome of subsequent neoplasms to demonstrate different but complementary information obtained from the 2 approaches and the specific utility of the former. © The Author 2015. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
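A simplified sketch of the mean cumulative count idea follows: every event is counted and divided by the number of subjects still under observation at that time. The survival weighting for competing death used in the full method is omitted here, and the data are hypothetical:

    import numpy as np

    def mean_cumulative_count(event_times, censor_times):
        """Simplified mean cumulative count: all events per subject are
        counted (not just the first), and each event is divided by the
        number of subjects still under observation at its time.

        event_times  : list of arrays, recurrent event times per subject
        censor_times : array, end of follow-up per subject
        """
        censor_times = np.asarray(censor_times, float)
        all_events = np.sort(np.concatenate(
            [np.asarray(t, float) for t in event_times]))
        at_risk = np.array([(censor_times >= t).sum() for t in all_events])
        return all_events, np.cumsum(1.0 / at_risk)

    # Three subjects: two recurrent events, one event, no events.
    times, mcc = mean_cumulative_count([[2.0, 5.0], [3.0], []], [6.0, 4.0, 5.0])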
Liu, I-Hsin; Chang, Shih-Hsin; Lin, Hsin-Yi
2015-05-13
A 3D plotting system was used to make chitosan-based tissue scaffolds with interconnected pores using pure chitosan (C) and chitosan cross-linked with pectin (CP) and genipin (CG). A freeze-dried chitosan scaffold (CF/D) was made to compare with C, to observe the effects of structural differences. The fiber size, pore size, porosity, compression strength, swelling ratio, drug release efficacy, and cumulative weight loss of the scaffolds were measured. Osteoblasts were cultured on the scaffolds and their proliferation, type I collagen production, alkaline phosphatase activity, calcium deposition, and morphology were observed. C had a lower swelling ratio, degradation, porosity and drug release efficacy and a higher compressional stiffness and cell proliferation compared to CF/D (p < 0.05). Of the 3D-plotted samples, cells on CP exhibited the highest degree of mineralization after 21 d (p < 0.05). CP also had the highest swelling ratio and fastest drug release, followed by C and CG (p < 0.05). Both CP and CG were stiffer and degraded more slowly in saline solution than C (p < 0.05). In summary, 3D-plotted scaffolds were stronger, less likely to degrade and better promoted osteoblast cell proliferation in vitro compared to the freeze-dried scaffolds. C, CP and CG were structurally similar, and the different crosslinking caused significant changes in their physical and biological performances.
Compaction of Chromite Cumulates applying a Centrifuging Piston-Cylinder
NASA Astrophysics Data System (ADS)
Manoochehri, S.; Schmidt, M. W.
2012-12-01
around 20 years whereas this value is around 0.4 years for olivine cumulates. When considering a natural outcrop of a layered intrusion with multiple layers of about 50 meters height, adcumulate formation time decreases to a few months. As the effective stress integrated over time and applied during centrifugation increases, crystal size distribution histograms move slightly toward larger grain sizes, but mean grain sizes change only within a narrow range. Classic crystal size distribution (CSD) profiles corrected for real 3D sizes (CSDCorrections program) of the chromite grains in different experiments illustrate a collection of parallel log-linear trends at larger grain sizes with a very slight overturn at small grain sizes. This is in close agreement with the idealized CSD plots of adcumulus growth.
Segmented Poincaré plot analysis for risk stratification in patients with dilated cardiomyopathy.
Voss, A; Fischer, C; Schroeder, R; Figulla, H R; Goernig, M
2010-01-01
The prognostic value of heart rate variability in patients with dilated cardiomyopathy (DCM) is limited and does not contribute to risk stratification, although the dynamics of ventricular repolarization differ considerably between DCM patients and healthy subjects. Neither linear nor nonlinear methods of heart rate variability analysis could discriminate between patients at high and low risk for sudden cardiac death. The aim of this study was to analyze the suitability of the newly developed segmented Poincaré plot analysis (SPPA) to enhance risk stratification in DCM. In contrast to the usually applied Poincaré plot analysis, SPPA retains nonlinear features of the investigated beat-to-beat interval time series. The main features of SPPA are the rotation of the cloud of points and its subsequent variability-dependent segmentation. Significant row and column probabilities were calculated from the segments and led to discrimination (up to p < 0.005) between low and high risk in DCM patients. For the first time, an index from Poincaré plot analysis of heart rate variability was able to contribute to risk stratification in patients suffering from DCM.
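A rough sketch of the two SPPA ingredients named above, rotation of the cloud of points and its segmentation, might look as follows; the uniform grid used here is a simplifying assumption, not the exact published segmentation rule:

    import numpy as np

    def sppa_probabilities(rr, n_seg=12):
        """Rotate the Poincare cloud (RR_n, RR_n+1) by 45 degrees about its
        center, partition the rotated cloud into an n_seg x n_seg grid, and
        return row and column occupation probabilities."""
        rr = np.asarray(rr, float)
        x, y = rr[:-1], rr[1:]
        xc, yc = x - x.mean(), y - y.mean()
        c, s = np.cos(np.pi / 4.0), np.sin(np.pi / 4.0)
        xr, yr = xc * c + yc * s, -xc * s + yc * c   # rotated coordinates
        counts, _, _ = np.histogram2d(xr, yr, bins=n_seg)
        total = counts.sum()
        return counts.sum(axis=1) / total, counts.sum(axis=0) / total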
Bonofiglio, Federico; Beyersmann, Jan; Schumacher, Martin; Koller, Michael; Schwarzer, Guido
2016-09-01
Meta-analysis of a survival endpoint is typically based on the pooling of hazard ratios (HRs). If competing risks occur, the HRs may lose translation into changes of survival probability. The cumulative incidence functions (CIFs), the expected proportion of cause-specific events over time, re-connect the cause-specific hazards (CSHs) to the probability of each event type. We use CIF ratios to measure treatment effect on each event type. To retrieve information on aggregated, typically poorly reported, competing risks data, we assume constant CSHs. Next, we develop methods to pool CIF ratios across studies. The procedure computes pooled HRs alongside and checks the influence of follow-up time on the analysis. We apply the method to a medical example, showing that follow-up duration is relevant both for pooled cause-specific HRs and CIF ratios. Moreover, if all-cause hazard and follow-up time are large enough, CIF ratios may reveal additional information about the effect of treatment on the cumulative probability of each event type. Finally, to improve the usefulness of such analysis, better reporting of competing risks data is needed. Copyright © 2015 John Wiley & Sons, Ltd.
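Under the constant cause-specific hazards assumption mentioned above, each CIF has a closed form, so a treatment effect can be expressed as a ratio of CIFs; a sketch with made-up hazard values:

    import numpy as np

    def cif_constant_csh(h_k, h_all, t):
        """Cumulative incidence for event type k under constant cause-specific
        hazards: CIF_k(t) = (h_k / h_all) * (1 - exp(-h_all * t)),
        where h_all is the all-cause hazard (sum of all CSHs)."""
        return (h_k / h_all) * (1.0 - np.exp(-h_all * t))

    # Hypothetical two-arm example: CIF ratio for event type 1 at t = 2 years.
    t = 2.0
    h1_trt, h2_trt = 0.10, 0.05   # treatment arm CSHs (assumed values)
    h1_ctl, h2_ctl = 0.15, 0.05   # control arm CSHs (assumed values)
    ratio = (cif_constant_csh(h1_trt, h1_trt + h2_trt, t)
             / cif_constant_csh(h1_ctl, h1_ctl + h2_ctl, t))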
Higher-order cumulants and spectral kurtosis for early detection of subterranean termites
NASA Astrophysics Data System (ADS)
de la Rosa, Juan José González; Moreno Muñoz, Antonio
2008-02-01
This paper deals with termite detection in non-favorable SNR scenarios via signal processing using higher-order statistics. The results could be extrapolated to all impulse-like insect emissions; the situation involves non-destructive termite detection. Fourth-order cumulants in the time and frequency domains enhance the detection and complete the characterization of termite emissions, which are non-Gaussian in essence. Sliding higher-order cumulants single out distinctive time instants even for low-amplitude impulses, complementing the sliding variance, which reveals only power excesses in the signal. The spectral kurtosis reveals non-Gaussian characteristics (the peakedness of the probability density function) associated with these non-stationary measurements, especially in the near-ultrasound frequency band. Well-tested estimators have been used to compute the higher-order statistics. These novel findings are shown via graphical examples.
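A sliding fourth-order cumulant, the time-domain statistic described above, can be sketched as follows; window and hop lengths are arbitrary choices, and the frequency-domain analogue would be the spectral kurtosis:

    import numpy as np

    def sliding_c4(x, win=256, hop=64):
        """Sliding fourth-order cumulant of a locally centered signal.
        For a zero-mean process, c4 = E[x^4] - 3 * (E[x^2])^2, which
        vanishes for Gaussian noise and highlights impulse-like emissions."""
        x = np.asarray(x, float)
        out = []
        for start in range(0, len(x) - win + 1, hop):
            seg = x[start:start + win]
            seg = seg - seg.mean()
            m2, m4 = np.mean(seg ** 2), np.mean(seg ** 4)
            out.append(m4 - 3.0 * m2 ** 2)
        return np.asarray(out)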
WCPP-THE WOLF PLOTTING AND CONTOURING PACKAGE
NASA Technical Reports Server (NTRS)
Masaki, G. T.
1994-01-01
The WOLF Contouring and Plotting Package provides the user with a complete general purpose plotting and contouring capability. This package is a complete system for producing line printer, SC4020, Gerber, Calcomp, and SD4060 plots. The package has been designed to be highly flexible and easy to use. Any plot, from a quick simple plot (which requires only one call to the package) to highly sophisticated plots (including motion picture plots), can be easily generated with only a basic knowledge of FORTRAN and the plot commands. Anyone designing a software system that requires plotted output will find that this package offers many advantages over the standard hardware support packages available. The WCPP package is divided into a plot segment and a contour segment. The plot segment can produce output for any combination of line printer, SC4020, Gerber, Calcomp, and SD4060 plots. The line printer plots allow the user to have plots available immediately after a job is run, at a low cost. Although the resolution of line printer plots is low, the quick results allow the user to judge whether a high resolution plot of a particular run is desirable. The SC4020 and SD4060 provide high-speed, high-resolution cathode ray plots with film and hard copy output available. The Gerber and Calcomp plotters provide very high quality plots (of publishable quality) with good resolution. Being bed or drum type plotters, the Gerber and Calcomp plotters are usually slow and not suited for large volume plotting. All output for any or all of the plotters can be produced simultaneously. The types of plots supported are: linear, semi-log, log-log, polar, tabular data using the FORTRAN WRITE statement, 3-D perspective linear, and affine transformations. The labeling facility provides for horizontal labels, vertical labels, diagonal labels, vector characters of a requested size (special character fonts are easily implemented), and rotated letters. The gridding routines label the grid lines according to
ERIC Educational Resources Information Center
Larsen, Russell D.
1985-01-01
Box-and-whisker plots (which give rapid visualization of batches of data) can be effectively used to present diverse collections of data used in traditional first-year chemistry courses. Construction of box-and-whisker plots and their use with bond energy data and data on heats of formation and solution are discussed. (JN)
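The five numbers a box-and-whisker plot displays are easily computed; the bond-energy values below are placeholders for illustration, not the article's data:

    import numpy as np

    def five_number_summary(x):
        """The values a box-and-whisker plot displays: median and quartiles
        (the box) plus the extremes (the whiskers)."""
        x = np.asarray(x, float)
        q1, med, q3 = np.percentile(x, [25, 50, 75])
        return x.min(), q1, med, q3, x.max()

    # Hypothetical single-bond energies in kJ/mol, for illustration only.
    bond_energies = [146, 159, 193, 201, 243, 266, 347, 358, 389, 411, 485]
    print(five_number_summary(bond_energies))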
Gariepy, Aileen M; Creinin, Mitchell D; Smith, Kenneth J; Xu, Xiao
2014-08-01
To compare the expected probability of pregnancy after hysteroscopic versus laparoscopic sterilization based on available data using decision analysis. We developed an evidence-based Markov model to estimate the probability of pregnancy over 10 years after three different female sterilization procedures: hysteroscopic, laparoscopic silicone rubber band application and laparoscopic bipolar coagulation. Parameter estimates for procedure success, probability of completing follow-up testing and risk of pregnancy after different sterilization procedures were obtained from published sources. In the base case analysis at all points in time after the sterilization procedure, the initial and cumulative risk of pregnancy after sterilization is higher in women opting for hysteroscopic than either laparoscopic band or bipolar sterilization. The expected pregnancy rates per 1000 women at 1 year are 57, 7 and 3 for hysteroscopic sterilization, laparoscopic silicone rubber band application and laparoscopic bipolar coagulation, respectively. At 10 years, the cumulative pregnancy rates per 1000 women are 96, 24 and 30, respectively. Sensitivity analyses suggest that the three procedures would have an equivalent pregnancy risk of approximately 80 per 1000 women at 10 years if the probability of successful laparoscopic (band or bipolar) sterilization drops below 90% and successful coil placement on first hysteroscopic attempt increases to 98% or if the probability of undergoing a hysterosalpingogram increases to 100%. Based on available data, the expected population risk of pregnancy is higher after hysteroscopic than laparoscopic sterilization. Consistent with existing contraceptive classification, future characterization of hysteroscopic sterilization should distinguish "perfect" and "typical" use failure rates. Pregnancy probability at 1 year and over 10 years is expected to be higher in women having hysteroscopic as compared to laparoscopic sterilization. Copyright © 2014
Evidence of scaling of void probability in nucleus-nucleus interactions at few GeV energy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ghosh, Dipak; Biswas, Biswanath; Deb, Argha
1997-11-01
The rapidity gap probability in the ²⁴Mg-AgBr interaction at 4.5 GeV/c/nucleon has been studied in detail. The data reveal scaling behavior of the void probability in the central rapidity domain, which confirms the validity of the linked-pair approximation for the N-particle cumulant correlation functions. This scaling behavior appears to be similar to the void probability in the Perseus-Pisces supercluster region of galaxies. © 1997 The American Physical Society.
Origins and implications of the relationship between warming and cumulative carbon emissions
NASA Astrophysics Data System (ADS)
Raupach, M. R.; Davis, S. J.; Peters, G. P.; Andrew, R. M.; Canadell, J.; Le Quere, C.
2014-12-01
A near-linear relationship between warming (T) and cumulative carbon emissions (Q) is a robust finding from numerous studies. This finding opens biophysical questions concerning (1) its theoretical basis, (2) the treatment of non-CO2 forcings, and (3) uncertainty specifications. Beyond these biophysical issues, a profound global policy question is raised: (4) how can a quota on cumulative emissions be shared? Here, an integrated survey of all four issues is attempted. (1) Proportionality between T and Q is an emergent property of a linear carbon-climate system forced by exponentially increasing CO2 emissions. This idealisation broadly explains past but not future near-proportionality between T and Q: in future, the roles of non-CO2 forcings and carbon-climate nonlinearities become important, and trajectory dependence becomes stronger. (2) The warming effects of short-lived non-CO2 forcers depend on instantaneous rather than cumulative fluxes. However, inertia in emissions trajectories reinstates some of the benefits of a cumulative emissions approach, with residual trajectory dependence comparable to that for CO2. (3) Uncertainties arise from several sources: climate projections, carbon-climate feedbacks, and residual trajectory dependencies in CO2 and other emissions. All of these can in principle be combined into a probability distribution P(T|Q) for the warming T from given cumulative CO2 emissions Q. Present knowledge of P(T|Q) allows quantification of the tradeoff between mitigation ambition and climate risk. (4) Cumulative emissions consistent with a given warming target and climate risk are a finite common resource that will inevitably be shared, creating a tragedy-of-the-commons dilemma. Sharing options range from "inertia" (present distribution of emissions is maintained) to "equity" (cumulative emissions are distributed equally per-capita). Both extreme options lead to emissions distributions that are unrealisable in practice, but a blend of the two
A first look at measurement error on FIA plots using blind plots in the Pacific Northwest
Susanna Melson; David Azuma; Jeremy S. Fried
2002-01-01
Measurement error in the Forest Inventory and Analysis work of the Pacific Northwest Station was estimated with a recently implemented blind plot measurement protocol. A small subset of plots was revisited by a crew having limited knowledge of the first crew's measurements. This preliminary analysis of the first 18 months' blind plot data indicates that...
Program Manipulates Plots For Effective Display
NASA Technical Reports Server (NTRS)
Bauer, F.; Downing, J.
1990-01-01
Windowed Observation of Relative Motion (WORM) computer program primarily intended for generation of simple X-Y plots from data created by other programs. Enables user to label, zoom, and change scales of various plots. Three-dimensional contour and line plots provided. Written in PASCAL.
Role of olivine cumulates in destabilizing the flanks of Hawaiian volcanoes
Clague, D.A.; Denlinger, R.P.
1994-01-01
The south flank of Kilauea Volcano is unstable and has the structure of a huge landslide; it is one of at least 17 enormous catastrophic landslides shed from the Hawaiian Islands. Mechanisms previously proposed for movement of the south flank invoke slip of the volcanic pile over seafloor sediments. Slip on a low friction décollement alone cannot explain why the thickest and widest sector of the flank moves more rapidly than the rest, or why this section contains a 300 km³ aseismic volume above the seismically defined décollement. It is proposed that this aseismic volume, adjacent to the caldera in the direction of flank slip, consists of olivine cumulates that creep outward, pushing the south flank seawards. Average primary Kilauea tholeiitic magma contains about 16.5 wt.% MgO compared with an average 10 wt.% MgO for erupted subaerial and submarine basalts. This difference requires fractionation of 17 wt.% (14 vol.%) olivine phenocrysts that accumulate near the base of the magma reservoir where they form cumulates. Submarine-erupted Kilauea lavas contain abundant deformed olivine xenocrysts derived from these cumulates. Deformed dunite formed during the tholeiitic shield stage is also erupted as xenoliths in subsequent alkalic lavas. The deformation structures in olivine xenocrysts suggest that the cumulus olivine was densely packed, probably with as little as 5-10 vol.% intercumulus liquid, before entrainment of the xenocrysts. The olivine cumulates were at magmatic temperatures (>1100 °C) when the xenocrysts were entrained. Olivine at 1100 °C has a rheology similar to ice, and the olivine cumulates should flow down and away from the summit of the volcano. Flow of the olivine cumulates places constant pressure on the unbuttressed seaward flank, leading to an extensional region that localizes deep intrusions behind the flank; these intrusions add to the seaward push. This mechanism ties the source of gravitational instability to the caldera complex and deep
Spatial probability of soil water repellency in an abandoned agricultural field in Lithuania
NASA Astrophysics Data System (ADS)
Pereira, Paulo; Misiūnė, Ieva
2015-04-01
Water repellency is a natural soil property with implications for infiltration, erosion and plant growth. It depends on soil texture, the type and amount of organic matter, fungi, microorganisms, and vegetation cover (Doerr et al., 2000). Human activities such as agriculture can have implications for soil water repellency (SWR) due to tillage and the addition of organic compounds and fertilizers (Blanco-Canqui and Lal, 2009; Gonzalez-Penaloza et al., 2012). It is also assumed that SWR has a high small-scale variability (Doerr et al., 2000). The aim of this work is to study the spatial probability of SWR in an abandoned field, testing several geostatistical methods: Ordinary Kriging (OK), Simple Kriging (SK), Indicator Kriging (IK), Probability Kriging (PK) and Disjunctive Kriging (DK). The study area is located near the Vilnius urban area (54°49′ N, 25°22′ E, 104 m a.s.l.) in Lithuania (Pereira and Oliva, 2013). An experimental plot of 21 m² (7 × 3 m) was designed. Inside this area, SWR was measured every 50 cm using the water drop penetration time (WDPT) test (Wessel, 1998). A total of 105 points were measured. The probability of SWR was classified from 0 (no probability) to 1 (high probability). The accuracy of the methods was assessed with the cross-validation method. The best interpolation method was the one with the lowest Root Mean Square Error (RMSE). The results showed that the most accurate probability method was SK (RMSE = 0.436), followed by DK (RMSE = 0.437), IK (RMSE = 0.448), PK (RMSE = 0.452) and OK (RMSE = 0.537). Significant differences were identified among the probability tests (Kruskal-Wallis test = 199.7597, p < 0.001). On average, the probability of SWR was highest with OK (0.58±0.08), followed by PK (0.49±0.18), SK (0.32±0.16), DK (0.32±0.15) and IK (0.31±0.16). The most accurate probability methods predicted a lower probability of SWR in the studied plot. The spatial distribution of SWR differed according to the tested technique. Simple Kriging, DK, IK and PK methods
Program Aids Creation Of X-Y Plots
NASA Technical Reports Server (NTRS)
Jeletic, James F.
1993-01-01
VEGAS computer program enables application programmers to create X-Y plots in various modes through high-level subroutine calls. Modes consist of passive, autoupdate, and interactive modes. In passive mode, VEGAS takes input data, produces plot, and returns control to application program. In autoupdate mode, forms plots and automatically updates them as more information received. In interactive mode, displays plot and provides popup menus for user to alter appearance of plot or to modify data. Written in FORTRAN 77.
Numerical computation of Pop plot
DOE Office of Scientific and Technical Information (OSTI.GOV)
Menikoff, Ralph
The Pop plot — distance-of-run to detonation versus initial shock pressure — is a key characterization of shock initiation in a heterogeneous explosive. Reactive burn models for high explosives (HE) must reproduce the experimental Pop plot to have any chance of accurately predicting shock initiation phenomena. This report describes a methodology for automating the computation of a Pop plot for a specific explosive with a given HE model. Illustrative examples of the computation are shown for PBX 9502 with three burn models (SURF, WSD and Forest Fire) utilizing the xRage code, which is the Eulerian ASC hydrocode at LANL. Comparison of the numerical and experimental Pop plot can be the basis for a validation test or an aid in calibrating the burn rate of an HE model. Issues with calibration are discussed.
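Since a Pop plot is conventionally a straight line in log-log coordinates, comparing simulation with experiment often reduces to comparing fitted coefficients; a sketch with made-up points rather than PBX 9502 data:

    import numpy as np

    def fit_pop_plot(pressures_gpa, run_distances_mm):
        """Least-squares fit of the Pop plot in log-log space:
        log10(run distance) = a + b * log10(shock pressure)."""
        logp = np.log10(pressures_gpa)
        logx = np.log10(run_distances_mm)
        b, a = np.polyfit(logp, logx, 1)   # slope b, intercept a
        return a, b

    # Illustrative (made-up) points, not experimental data.
    a, b = fit_pop_plot([3.0, 5.0, 8.0, 12.0], [25.0, 12.0, 5.5, 2.8])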
Development of Fourth-Grade Students' Understanding of Experimental and Theoretical Probability
ERIC Educational Resources Information Center
English, Lyn; Watson, Jane
2014-01-01
Students explored variation and expectation in a probability activity at the end of the first year of a 3-year longitudinal study across grades 4-6. The activity involved experiments in tossing coins both manually and with simulation using the graphing software, "TinkerPlots." Initial responses indicated that the students were aware of…
Costentin, Cyrille; Savéant, Jean-Michel
2017-06-14
We analyze here, in the framework of heterogeneous molecular catalysis, the reasons for the occurrence or nonoccurrence of volcanoes upon plotting the kinetics of the catalytic reaction versus the stabilization free energy of the primary intermediate of the catalytic process. As in the case of homogeneous molecular catalysis or catalysis by surface-active metallic sites, a strong motivation of such studies relates to modern energy challenges, particularly those involving small molecules, such as water, hydrogen, oxygen, proton, and carbon dioxide. This motivation is particularly pertinent for what concerns heterogeneous molecular catalysis, since it is commonly preferred to homogeneous molecular catalysis by the same molecules if only for chemical separation purposes and electrolytic cell architecture. As with the two other catalysis modes, the main drawback of the volcano plot approach is the basic assumption that the kinetic responses depend on a single descriptor, viz., the stabilization free energy of the primary intermediate. More comprehensive approaches, investigating the responses to the maximal number of experimental factors, and conveniently expressed as catalytic Tafel plots, should clearly be preferred. This is more so in the case of heterogeneous molecular catalysis in that additional transport factors in the supporting film may additionally affect the current-potential responses. This is attested by the noteworthy presence of maxima in catalytic Tafel plots as well as their dependence upon the cyclic voltammetric scan rate.
ERIC Educational Resources Information Center
Balasooriya, Uditha; Li, Jackie; Low, Chan Kee
2012-01-01
For any density function (or probability function), there always corresponds a "cumulative distribution function" (cdf). It is a well-known mathematical fact that the cdf is more general than the density function, in the sense that for a given distribution the former may exist without the existence of the latter. Nevertheless, while the…
Audio feature extraction using probability distribution function
NASA Astrophysics Data System (ADS)
Suhaib, A.; Wan, Khairunizam; Aziz, Azri A.; Hazry, D.; Razlan, Zuradzman M.; Shahriman A., B.
2015-05-01
Voice recognition has been one of the popular applications in the robotics field. It is also known to be used recently for biometric and multimedia information retrieval systems. This technology is the outcome of successive research on audio feature extraction analysis. The probability distribution function (PDF) is a statistical method which is usually used as one of the processes in complex feature extraction methods such as GMM and PCA. In this paper, a new method for audio feature extraction is proposed which uses only the PDF as a feature extraction method itself for speech analysis purposes. Certain pre-processing techniques are performed prior to the proposed feature extraction method. Subsequently, the PDF values for each frame of the sampled voice signals, obtained from a number of individuals, are plotted. From the experimental results obtained, it can be seen visually from the plotted data that each individual's voice has comparable PDF values and shapes.
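A minimal sketch of framewise PDF features, assuming a normalized mono signal; the histogram estimator, frame length, and bin count are assumptions, and the paper's specific pre-processing is not reproduced:

    import numpy as np

    def pdf_features(signal, frame_len=1024, hop=512, bins=32, rng=(-1.0, 1.0)):
        """Histogram-based estimate of the amplitude PDF for each frame.
        Returns one `bins`-dimensional feature vector per frame."""
        signal = np.asarray(signal, float)
        feats = []
        for start in range(0, len(signal) - frame_len + 1, hop):
            frame = signal[start:start + frame_len]
            pdf, _ = np.histogram(frame, bins=bins, range=rng, density=True)
            feats.append(pdf)
        return np.asarray(feats)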
Discriminating Among Probability Weighting Functions Using Adaptive Design Optimization
Cavagnaro, Daniel R.; Pitt, Mark A.; Gonzalez, Richard; Myung, Jay I.
2014-01-01
Probability weighting functions relate objective probabilities and their subjective weights, and play a central role in modeling choices under risk within cumulative prospect theory. While several different parametric forms have been proposed, their qualitative similarities make it challenging to discriminate among them empirically. In this paper, we use both simulation and choice experiments to investigate the extent to which different parametric forms of the probability weighting function can be discriminated using adaptive design optimization, a computer-based methodology that identifies and exploits model differences for the purpose of model discrimination. The simulation experiments show that the correct (data-generating) form can be conclusively discriminated from its competitors. The results of an empirical experiment reveal heterogeneity between participants in terms of the functional form, with two models (Prelec-2, Linear in Log Odds) emerging as the most common best-fitting models. The findings shed light on assumptions underlying these models. PMID:24453406
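The two best-fitting forms named above have standard parameterizations, sketched here with arbitrary parameter values for illustration:

    import numpy as np

    def prelec2(p, alpha, beta):
        """Two-parameter Prelec weighting function:
        w(p) = exp(-beta * (-ln p)^alpha)."""
        p = np.asarray(p, float)
        return np.exp(-beta * (-np.log(p)) ** alpha)

    def linear_in_log_odds(p, gamma, delta):
        """Linear-in-log-odds weighting function:
        w(p) = delta * p^gamma / (delta * p^gamma + (1 - p)^gamma)."""
        p = np.asarray(p, float)
        return delta * p ** gamma / (delta * p ** gamma + (1.0 - p) ** gamma)

    p = np.linspace(0.01, 0.99, 99)
    w1, w2 = prelec2(p, 0.7, 1.0), linear_in_log_odds(p, 0.6, 0.8)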
NASA Astrophysics Data System (ADS)
Purschke, Oliver; Dengler, Jürgen; Bruelheide, Helge; Chytrý, Milan; Jansen, Florian; Hennekens, Stephan; Jandt, Ute; Jiménez-Alfaro, Borja; Kattge, Jens; De Patta Pillar, Valério; Sandel, Brody; Winter, Marten
2015-04-01
The trait composition of plant communities is determined by abiotic, biotic and historical factors, but the importance of macro-climatic factors in explaining trait-environment relationships at the local scale remains unclear. Such knowledge is crucial for biogeographical and ecological theory but also relevant to devise management measures to mitigate the negative effects of climate change. To address these questions, an iDiv Working Group has established the first global vegetation-plot database (sPlot). sPlot currently contains ~700,000 plots from over 50 countries and all biomes, and is steadily growing. Approx. 70% of the most frequent species are represented by at least one trait in the global trait database TRY and gap-filled data will become available for the most common traits. We will give an overview about the structure and present content of sPlot in terms of spatial distribution, data properties and trait coverage. We will explain next steps and perspectives, present first cross-biome analyses of community-weighted mean traits and trait variability, and highlight some ecological questions that can be addressed with sPlot.
Automatic Classification of Station Quality by Image Based Pattern Recognition of Ppsd Plots
NASA Astrophysics Data System (ADS)
Weber, B.; Herrnkind, S.
2017-12-01
The number of seismic stations is growing, and it has become common practice to share station waveform data in real time with the main data centers such as IRIS, GEOFON, ORFEUS and RESIF. This has made analyzing station performance increasingly important for automatic real-time processing and station selection. The value of a station depends on different factors, such as the quality and quantity of the data, the location of the site, the general station density in the surrounding area and, finally, the type of application it can be used for. The approach described by McNamara and Boaz (2006) became standard in the last decade. It incorporates a probability density function (PDF) to display the distribution of seismic power spectral density (PSD). The low noise model (LNM) and high noise model (HNM) introduced by Peterson (1993) are also displayed in the PPSD plots introduced by McNamara and Boaz, allowing an estimation of station quality. Here we describe how we established an automatic station quality classification module using image-based pattern recognition on PPSD plots. The plots were split into 4 bands: short-period characteristics (0.1-0.8 s), body wave characteristics (0.8-5 s), microseismic characteristics (5-12 s) and long-period characteristics (12-100 s). The module sqeval connects to a SeedLink server, checks available stations, and requests PPSD plots through the Mustang service from IRIS, from PQLX/SQLX, or from GIS (gempa Image Server), a module that generates different kinds of images such as trace plots, map plots, helicorder plots or PPSD plots. It compares the image-based quality patterns for the different period bands with the retrieved PPSD plot. The quality of a station is divided into 5 classes for each of the 4 bands. Classes A, B, C and D define regular quality between the LNM and HNM, while the fifth class represents out-of-order stations with gain problems, missing data, etc. Over all period bands, about 100 different patterns are required to classify most of the stations available on the
NASA Astrophysics Data System (ADS)
Steinacher, M.; Joos, F.
2016-02-01
Information on the relationship between cumulative fossil CO2 emissions and multiple climate targets is essential to design emission mitigation and climate adaptation strategies. In this study, the transient response of a climate or environmental variable per trillion tonnes of CO2 emissions, termed TRE, is quantified for a set of impact-relevant climate variables and from a large set of multi-forcing scenarios extended to year 2300 towards stabilization. An ~1000-member ensemble of the Bern3D-LPJ carbon-climate model is applied and model outcomes are constrained by 26 physical and biogeochemical observational data sets in a Bayesian, Monte Carlo-type framework. Uncertainties in TRE estimates include both scenario uncertainty and model response uncertainty. Cumulative fossil emissions of 1000 Gt C result in a global mean surface air temperature change of 1.9 °C (68 % confidence interval (c.i.): 1.3 to 2.7 °C), a decrease in surface ocean pH of 0.19 (0.18 to 0.22), and a steric sea level rise of 20 cm (13 to 27 cm until 2300). Linearity between cumulative emissions and transient response is high for pH and reasonably high for surface air and sea surface temperatures, but less pronounced for changes in Atlantic meridional overturning, Southern Ocean and tropical surface water saturation with respect to biogenic structures of calcium carbonate, and carbon stocks in soils. The constrained model ensemble is also applied to determine the response to a pulse-like emission and in idealized CO2-only simulations. The transient climate response is constrained, primarily by long-term ocean heat observations, to 1.7 °C (68 % c.i.: 1.3 to 2.2 °C) and the equilibrium climate sensitivity to 2.9 °C (2.0 to 4.2 °C). This is consistent with results by CMIP5 models but inconsistent with recent studies that relied on short-term air temperature data affected by natural climate variability.
NASA Astrophysics Data System (ADS)
Shioiri, Tetsu; Asari, Naoki; Sato, Junichi; Sasage, Kosuke; Yokokura, Kunio; Homma, Mitsutaka; Suzuki, Katsumi
To investigate the reliability of vacuum insulation equipment, a study was carried out to clarify breakdown probability distributions in a vacuum gap. Further, a double-break vacuum circuit breaker was investigated for its breakdown probability distribution. The test results show that the breakdown probability distribution of the vacuum gap can be represented by a Weibull distribution using a location parameter, which gives the voltage that permits a zero breakdown probability. The location parameter obtained from the Weibull plot depends on electrode area. The shape parameter obtained from the Weibull plot of the vacuum gap was 10∼14, and is constant irrespective of the non-uniform field factor. The breakdown probability distribution after no-load switching can also be represented by a Weibull distribution using a location parameter. The shape parameter after no-load switching was 6∼8.5, and is constant irrespective of gap length. This indicates that the scatter of breakdown voltage was increased by no-load switching. If the vacuum circuit breaker uses a double break, the breakdown probability at low voltage becomes lower than the single-break probability. Although potential distribution is a concern in the double-break vacuum circuit breaker, its insulation reliability is better than that of the single-break vacuum interrupter even if the bias of the vacuum interrupter's sharing voltage is taken into account.
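A sketch of estimating shape and scale on a Weibull plot once a location parameter (the zero-breakdown-probability voltage) is fixed; the median-rank plotting positions and the sample voltages are assumptions:

    import numpy as np

    def weibull_plot_fit(v_breakdown, location):
        """Fit shape and scale on a Weibull plot for a three-parameter
        Weibull CDF, F(V) = 1 - exp(-((V - location)/scale)^shape).
        Median-rank plotting positions F_i = (i - 0.3)/(n + 0.4) are one
        common choice; others exist."""
        v = np.sort(np.asarray(v_breakdown, float))
        n = len(v)
        f = (np.arange(1, n + 1) - 0.3) / (n + 0.4)
        x = np.log(v - location)            # requires v > location
        y = np.log(-np.log(1.0 - f))
        shape, intercept = np.polyfit(x, y, 1)
        scale = np.exp(-intercept / shape)
        return shape, scale

    # Hypothetical breakdown voltages in kV, for illustration only.
    shape, scale = weibull_plot_fit([52.0, 55.5, 58.0, 60.2, 63.8, 66.1], location=40.0)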
Optimal design of a plot cluster for monitoring
Charles T. Scott
1993-01-01
Traveling costs incurred during extensive forest surveys make cluster sampling cost-effective. Clusters are specified by the type of plots, plot size, number of plots, and the distance between plots within the cluster. A method to determine the optimal cluster design when different plot types are used for different forest resource attributes is described. The method...
Saito, Takaya; Rehmsmeier, Marc
2015-01-01
Binary classifiers are routinely evaluated with performance measures such as sensitivity and specificity, and performance is frequently illustrated with Receiver Operating Characteristics (ROC) plots. Alternative measures such as positive predictive value (PPV) and the associated Precision/Recall (PRC) plots are used less frequently. Many bioinformatics studies develop and evaluate classifiers that are to be applied to strongly imbalanced datasets in which the number of negatives outweighs the number of positives significantly. While ROC plots are visually appealing and provide an overview of a classifier's performance across a wide range of specificities, one can ask whether ROC plots could be misleading when applied in imbalanced classification scenarios. We show here that the visual interpretability of ROC plots in the context of imbalanced datasets can be deceptive with respect to conclusions about the reliability of classification performance, owing to an intuitive but wrong interpretation of specificity. PRC plots, on the other hand, can provide the viewer with an accurate prediction of future classification performance due to the fact that they evaluate the fraction of true positives among positive predictions. Our findings have potential implications for the interpretation of a large number of studies that use ROC plots on imbalanced datasets.
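The contrast described above is easy to reproduce with scikit-learn on synthetic scores; the 1:100 imbalance and the score distributions are arbitrary choices:

    import numpy as np
    from sklearn.metrics import roc_curve, precision_recall_curve, auc

    rng = np.random.default_rng(0)
    n_pos, n_neg = 100, 10000            # strongly imbalanced, as discussed
    scores = np.concatenate([rng.normal(1.0, 1.0, n_pos),
                             rng.normal(0.0, 1.0, n_neg)])
    labels = np.concatenate([np.ones(n_pos), np.zeros(n_neg)])

    fpr, tpr, _ = roc_curve(labels, scores)
    precision, recall, _ = precision_recall_curve(labels, scores)
    print("ROC AUC:", auc(fpr, tpr))           # can look flattering
    print("PRC AUC:", auc(recall, precision))  # exposes the many false positives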
LaDou, Joseph
1978-01-01
A few states, notably California, are experiencing large increases in the number and cost of disability settlements under workers' compensation. Claims of cumulative injury for coronary heart disease, hypertension, stroke, cancer and neuropsychiatric problems have all been interpreted as compensable under workers' compensation, even when these conditions are clearly related to the aging process. Legal precedents for such claims are building rapidly throughout the country. The resultant costs may lead to the demise of the workers' compensation system. The situation in California is discussed in detail including the legal aspects, cumulative injury claims by type of disease and age of claimants, legal costs to the individual and the employer, and the economic outlook for the workers' compensation insurance system. PMID:151986
40 CFR 1508.7 - Cumulative impact.
Code of Federal Regulations, 2010 CFR
2010-07-01
Cumulative impact is the impact on the environment which results from the incremental impact of the action when added to other past, present, and reasonably foreseeable future actions...
32 CFR 651.16 - Cumulative impacts.
Code of Federal Regulations, 2011 CFR
2011-07-01
(a) NEPA analyses must assess cumulative effects, which are the impact on the environment resulting from the incremental impact of the action when added to other past, present...
32 CFR 651.16 - Cumulative impacts.
Code of Federal Regulations, 2010 CFR
2010-07-01
(a) NEPA analyses must assess cumulative effects, which are the impact on the environment resulting from the incremental impact of the action when added to other past, present...
Presenting simulation results in a nested loop plot.
Rücker, Gerta; Schwarzer, Guido
2014-12-12
Statisticians investigate new methods in simulations to evaluate their properties for future real data applications. Results are often presented in a number of figures, e.g., Trellis plots. We conducted a simulation study on six statistical methods for estimating the treatment effect in binary outcome meta-analyses, where selection bias (e.g., publication bias) was suspected because of apparent funnel plot asymmetry. We varied five simulation parameters: true treatment effect, extent of selection, event proportion in the control group, heterogeneity parameter, and number of studies in the meta-analysis. In combination, this yielded a total of 768 scenarios. To present all results using Trellis plots, 12 figures were needed. Choosing bias as the criterion of interest, we present a 'nested loop plot', a diagram type that aims to have all simulation results in one plot. The idea is to bring all scenarios into a lexicographical order and arrange them consecutively on the horizontal axis of a plot, while the treatment effect estimate is presented on the vertical axis. The plot illustrates how parameters simultaneously influenced the estimate. It can be combined with a Trellis plot in a so-called hybrid plot. Nested loop plots may also be applied to other criteria such as the variance of estimation. The nested loop plot, similar to a time series graph, summarizes all information about the results of a simulation study with respect to a chosen criterion in one picture and provides a suitable alternative or an addition to Trellis plots.
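The core of the construction, lexicographic ordering of scenarios along the horizontal axis, fits in a few lines; this is an illustrative sketch rather than the authors' code, and the tiny parameter grid stands in for the 768 scenarios:

    import numpy as np
    import pandas as pd
    import matplotlib.pyplot as plt

    def nested_loop_plot(df, params, value, ax=None):
        """Order scenarios lexicographically by the simulation parameters
        and plot the criterion (here: bias) as one consecutive series.
        `df` holds one row per scenario."""
        df = df.sort_values(params).reset_index(drop=True)
        ax = ax or plt.gca()
        ax.step(np.arange(len(df)), df[value], where="mid")
        ax.set_xlabel("scenario (lexicographic order of parameters)")
        ax.set_ylabel(value)
        return ax

    # Tiny illustrative grid (hypothetical bias values).
    df = pd.DataFrame([(e, t, b) for e in (0.0, 0.5) for t in (0.1, 0.5)
                       for b in (0.01, 0.02)],
                      columns=["effect", "tau2", "bias"])
    nested_loop_plot(df, ["effect", "tau2"], "bias")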
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stockem, A.; Lazar, M.; Department of Physics and Engineering Physics, University of Saskatchewan, Saskatoon
2008-01-15
Dispersion formalism reported in Lazar et al. [Phys. Plasmas 13, 102107 (2006)] is affected by errors due to the misfitting of the distribution function (1) used to interpret the counterstreaming plasmas, with the general dispersion relations (4) and (5), where distribution function (1) has been inserted to find the unstable solutions. The analytical approach is reviewed here, providing a correct analytical and numerical description for the cumulative effect of the filamentation and Weibel instabilities arising in initially counterstreaming plasmas with temperature anisotropies. The growth rates are plotted again, and for the cumulative mode they are orders of magnitude larger than those obtained in Lazar et al. [Phys. Plasmas 13, 102107 (2006)]. Physically, this can be understood as an increase in the efficiency of magnetic field generation, and it rather enhances the potential role of magnetic instabilities in the fast magnetization scenario in astrophysical applications.
Round versus rectangular: Does the plot shape matter?
NASA Astrophysics Data System (ADS)
Iserloh, Thomas; Bäthke, Lars; Ries, Johannes B.
2016-04-01
Field rainfall simulators are designed to study soil erosion processes and provide urgently needed data for various geomorphological, hydrological and pedological issues. Due to the different conditions and technologies applied, there are several methodological aspects under review by the scientific community, particularly concerning the design, procedures and conditions of measurement for infiltration, runoff and soil erosion. Extensive discussions at the Rainfall Simulator Workshop 2011 in Trier and the Splinter Meeting at EGU 2013 "Rainfall simulation: Big steps forward!" led to the opinion that the rectangular shape is the more suitable plot shape compared to the round plot. A horizontally edging Gerlach trough is installed for sample collection without forming the unnatural necks found at round or triangular plots. Since most research groups did and currently do work with round plots at the point scale (<1 m²), a precise analysis of the differences between the output of round and square plots is necessary. Our hypotheses are: - Round plot shapes disturb surface runoff; unnatural fluvial dynamics for the given plot size, such as pool development directly at the plot's outlet, occur. - A square plot shape prevents these problems. A first comparison between round and rectangular plots (Iserloh et al., 2015) indicates that the rectangular plot could indeed be the more suitable, but the rather ambiguous results make a more elaborate test setup necessary. The laboratory test setup includes the two plot shapes (round, square), a standardised silty substrate and three inclinations (2°, 6°, 12°). The analysis of the laboratory test provides results on the best performance concerning undisturbed surface runoff and soil/water sampling at the plot's outlet. The analysis of the plot shape and its influence on runoff and erosion shows that clear methodological standards are necessary in order to make rainfall simulation experiments comparable.
The rainfall plot: its motivation, characteristics and pitfalls.
Domanska, Diana; Vodák, Daniel; Lund-Andersen, Christin; Salvatore, Stefania; Hovig, Eivind; Sandve, Geir Kjetil
2017-05-18
A visualization referred to as rainfall plot has recently gained popularity in genome data analysis. The plot is mostly used for illustrating the distribution of somatic cancer mutations along a reference genome, typically aiming to identify mutation hotspots. In general terms, the rainfall plot can be seen as a scatter plot showing the location of events on the x-axis versus the distance between consecutive events on the y-axis. Despite its frequent use, the motivation for applying this particular visualization and the appropriateness of its usage have never been critically addressed in detail. We show that the rainfall plot allows visual detection even for events occurring at high frequency over very short distances. In addition, event clustering at multiple scales may be detected as distinct horizontal bands in rainfall plots. At the same time, due to the limited size of standard figures, rainfall plots might suffer from inability to distinguish overlapping events, especially when multiple datasets are plotted in the same figure. We demonstrate the consequences of plot congestion, which results in obscured visual data interpretations. This work provides the first comprehensive survey of the characteristics and proper usage of rainfall plots. We find that the rainfall plot is able to convey a large amount of information without any need for parameterization or tuning. However, we also demonstrate how plot congestion and the use of a logarithmic y-axis may result in obscured visual data interpretations. To aid the productive utilization of rainfall plots, we demonstrate their characteristics and potential pitfalls using both simulated and real data, and provide a set of practical guidelines for their proper interpretation and usage.
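In the scatter-plot terms used above, a basic rainfall plot takes only a few lines; the logarithmic y-axis is included deliberately, since it is one of the pitfalls discussed:

    import numpy as np
    import matplotlib.pyplot as plt

    def rainfall_plot(positions, ax=None):
        """Rainfall plot: position of each event (x) versus the distance
        to the preceding event (y), on a logarithmic y-axis."""
        pos = np.sort(np.asarray(positions))
        dist = np.diff(pos)                 # distance to previous event
        ax = ax or plt.gca()
        ax.scatter(pos[1:], dist, s=4)
        ax.set_yscale("log")                # the step discussed as a pitfall
        ax.set_xlabel("position")
        ax.set_ylabel("distance to previous event")
        return ax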
Probability and surprisal in auditory comprehension of morphologically complex words.
Balling, Laura Winther; Baayen, R Harald
2012-10-01
Two auditory lexical decision experiments document for morphologically complex words two points at which the probability of a target word given the evidence shifts dramatically. The first point is reached when morphologically unrelated competitors are no longer compatible with the evidence. Adapting terminology from Marslen-Wilson (1984), we refer to this as the word's initial uniqueness point (UP1). The second point is the complex uniqueness point (CUP) introduced by Balling and Baayen (2008), at which morphologically related competitors become incompatible with the input. Later initial as well as complex uniqueness points predict longer response latencies. We argue that the effects of these uniqueness points arise due to the large surprisal (Levy, 2008) carried by the phonemes at these uniqueness points, and provide independent evidence that how cumulative surprisal builds up in the course of the word co-determines response latencies. The presence of effects of surprisal, both at the initial uniqueness point of complex words, and cumulatively throughout the word, challenges the Shortlist B model of Norris and McQueen (2008), and suggests that a Bayesian approach to auditory comprehension requires complementation from information theory in order to do justice to the cognitive cost of updating probability distributions over lexical candidates. Copyright © 2012 Elsevier B.V. All rights reserved.
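Surprisal and its cumulative build-up over a word follow directly from conditional phoneme probabilities; the probabilities below are hypothetical, with a jump mimicking a uniqueness point where many competitors drop out:

    import numpy as np

    def cumulative_surprisal(phoneme_probs):
        """Surprisal of each phoneme given the preceding ones, and its
        running sum over the word: s_i = -log2 P(phoneme_i | context)."""
        s = -np.log2(np.asarray(phoneme_probs, float))
        return s, np.cumsum(s)

    # Hypothetical conditional probabilities for a 5-phoneme word; the low
    # value at index 3 mimics a uniqueness point.
    s, cum = cumulative_surprisal([0.9, 0.8, 0.7, 0.05, 0.95])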
Spectrum sensing based on cumulative power spectral density
NASA Astrophysics Data System (ADS)
Nasser, A.; Mansour, A.; Yao, K. C.; Abdallah, H.; Charara, H.
2017-12-01
This paper presents new spectrum sensing algorithms based on the cumulative power spectral density (CPSD). The proposed detectors examine the CPSD of the received signal to make a decision on the absence/presence of the primary user (PU) signal. These detectors require the noise in the band of interest to be white. The false alarm and detection probabilities are derived analytically and simulated under Gaussian and Rayleigh fading channels. Our proposed detectors present better performance than the energy detector (ED) or the cyclostationary detector (CSD). Moreover, in the presence of noise uncertainty (NU), they are shown to be more robust than the ED, with less performance loss. In order to eliminate the effect of NU, we modified our algorithms to be independent of the noise variance.
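One plausible reading of a CPSD-based detector, sketched under the whiteness assumption stated above; the Welch estimator, the Kolmogorov-Smirnov-style statistic, and all parameter choices are assumptions, not the paper's exact algorithm:

    import numpy as np
    from scipy import signal

    def cpsd_statistic(x, fs):
        """Normalized cumulative PSD of the received signal. Under H0
        (white noise only) the CPSD grows linearly with frequency, so the
        maximum deviation from the diagonal serves as a test statistic;
        the threshold for a target false-alarm probability is omitted."""
        f, psd = signal.welch(x, fs=fs, nperseg=1024)
        cpsd = np.cumsum(psd) / np.sum(psd)
        uniform = np.linspace(0.0, 1.0, len(cpsd))
        return np.max(np.abs(cpsd - uniform))

    rng = np.random.default_rng(1)
    t_h0 = cpsd_statistic(rng.normal(size=8192), fs=1.0)   # noise only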
Storytelling in Earth sciences: The eight basic plots
NASA Astrophysics Data System (ADS)
Phillips, Jonathan
2012-11-01
Reporting results and promoting ideas in science in general, and Earth science in particular, is treated here as storytelling. Just as in literature and drama, storytelling in Earth science is characterized by a small number of basic plots. Though the list is not exhaustive, and acknowledging that multiple or hybrid plots and subplots are possible in a single piece, eight standard plots are identified, and examples provided: cause-and-effect, genesis, emergence, destruction, metamorphosis, convergence, divergence, and oscillation. The plots of Earth science stories are not those of literary traditions, nor those of persuasion or moral philosophy, and deserve separate consideration. Earth science plots do not conform those of storytelling more generally, implying that Earth scientists may have fundamentally different motivations than other storytellers, and that the basic plots of Earth Science derive from the characteristics and behaviors of Earth systems. In some cases preference or affinity to different plots results in fundamentally different interpretations and conclusions of the same evidence. In other situations exploration of additional plots could help resolve scientific controversies. Thus explicit acknowledgement of plots can yield direct scientific benefits. Consideration of plots and storytelling devices may also assist in the interpretation of published work, and can help scientists improve their own storytelling.
NASA Astrophysics Data System (ADS)
Zorila, Alexandru; Stratan, Aurel; Nemes, George
2018-01-01
We compare the ISO-recommended (the standard) data-reduction algorithm used to determine the surface laser-induced damage threshold of optical materials by the S-on-1 test with two newly suggested algorithms, both named "cumulative" algorithms/methods, a regular one and a limit-case one, intended to perform in some respects better than the standard one. To avoid additional errors due to real experiments, a simulated test is performed, named the reverse approach. This approach simulates the real damage experiments, by generating artificial test-data of damaged and non-damaged sites, based on an assumed, known damage threshold fluence of the target and on a given probability distribution function to induce the damage. In this work, a database of 12 sets of test-data containing both damaged and non-damaged sites was generated by using four different reverse techniques and by assuming three specific damage probability distribution functions. The same value for the threshold fluence was assumed, and a Gaussian fluence distribution on each irradiated site was considered, as usual for the S-on-1 test. Each of the test-data was independently processed by the standard and by the two cumulative data-reduction algorithms, the resulting fitted probability distributions were compared with the initially assumed probability distribution functions, and the quantities used to compare these algorithms were determined. These quantities characterize the accuracy and the precision in determining the damage threshold and the goodness of fit of the damage probability curves. The results indicate that the accuracy in determining the absolute damage threshold is best for the ISO-recommended method, the precision is best for the limit-case of the cumulative method, and the goodness of fit estimator (adjusted R-squared) is almost the same for all three algorithms.
Urban, Jillian E.; Davenport, Elizabeth M.; Golman, Adam J.; Maldjian, Joseph A.; Whitlow, Christopher T.; Powers, Alexander K.; Stitzel, Joel D.
2015-01-01
Sports-related concussion is the most common athletic head injury, with football having the highest rate among high school athletes. Traditionally, research on the biomechanics of football-related head impact has been focused at the collegiate level. Less research has been performed at the high school level, despite the incidence of concussion among high school football players. The objective of this study is twofold: to quantify the head impact exposure in high school football, and to develop a cumulative impact analysis method. Head impact exposure was measured by instrumenting the helmets of 40 high school football players with helmet-mounted accelerometer arrays to measure linear and rotational acceleration. A total of 16,502 head impacts were collected over the course of the season. Biomechanical data were analyzed by team and by player. The median impact for each player ranged from 15.2 to 27.0 g with an average value of 21.7 (±2.4) g. The 95th percentile impact for each player ranged from 38.8 to 72.9 g with an average value of 56.4 (±10.5) g. Next, an impact exposure metric utilizing concussion injury risk curves was created to quantify cumulative exposure for each participating player over the course of the season. Impacts were weighted according to the associated risk due to linear acceleration and rotational acceleration alone, as well as the combined probability (CP) of injury associated with both. These risks were summed over the course of a season to generate risk-weighted cumulative exposure. The impact frequency was found to be greater during games compared to practices, with an average number of impacts per session of 15.5 and 9.4, respectively. However, the median cumulative risk-weighted exposure based on combined probability was found to be greater for practices vs. games. These data will provide a metric that may be used to better understand the cumulative effects of repetitive head impacts, injury mechanisms, and head impact exposure of
Corrections for Cluster-Plot Slop
Harry T. Valentine; Mark J. Ducey; Jeffery H. Gove; Adrian Lanz; David L.R. Affleck
2006-01-01
Cluster-plot designs, including the design used by the Forest Inventory and Analysis program of the USDA Forest Service (FIA), are attended by a complicated boundary slopover problem. Slopover occurs where inclusion zones of objects of interest cross the boundary of the area of interest. The dispersed nature of inclusion zones that arise from the use of cluster plots...
Mapped Plot Patch Size Estimates
Paul C. Van Deusen
2005-01-01
This paper demonstrates that the mapped plot design is relatively easy to analyze and describes existing formulas for mean and variance estimators. New methods are developed for using mapped plots to estimate average patch size of condition classes. The patch size estimators require assumptions about the shape of the condition class, limiting their utility. They may...
Vegetation resurvey is robust to plot location uncertainty
Kopecký, Martin; Macek, Martin
2017-01-01
Aim Resurveys of historical vegetation plots are increasingly used for the assessment of decadal changes in plant species diversity and composition. However, historical plots are usually relocated only approximately. This potentially inflates temporal changes and undermines results. Location Temperate deciduous forests in Central Europe. Methods To explore if robust conclusions can be drawn from resurvey studies despite location uncertainty, we compared temporal changes in species richness, frequency, composition and compositional heterogeneity between exactly and approximately relocated plots. We hypothesized that compositional changes should be lower and changes in species richness should be less variable on exactly relocated plots, because pseudo-turnover inflates temporal changes on approximately relocated plots. Results Temporal changes in species richness were not more variable and temporal changes in species composition and compositional heterogeneity were not higher on approximately relocated plots. Moreover, the frequency of individual species changed similarly on both plot types. Main conclusions The resurvey of historical vegetation plots is robust to uncertainty in original plot location and, when done properly, provides reliable evidence of decadal changes in plant communities. This provides important background for other resurvey studies and opens up the possibility for large-scale assessments of plant community change. PMID:28503083
Estimating soil moisture exceedance probability from antecedent rainfall
NASA Astrophysics Data System (ADS)
Cronkite-Ratcliff, C.; Kalansky, J.; Stock, J. D.; Collins, B. D.
2016-12-01
The first storms of the rainy season in coastal California, USA, add moisture to soils but rarely trigger landslides. Previous workers proposed that antecedent rainfall, the cumulative seasonal rain from October 1 onwards, had to exceed specific amounts in order to trigger landsliding. Recent monitoring of soil moisture upslope of historic landslides in the San Francisco Bay Area shows that storms can cause positive pressure heads once soil moisture values exceed a threshold of volumetric water content (VWC). We propose that antecedent rainfall could be used to estimate the probability that VWC exceeds this threshold. A major challenge to estimating the probability of exceedance is that rain gauge records are frequently incomplete. We developed a stochastic model to impute (infill) missing hourly precipitation data. This model uses nearest neighbor-based conditional resampling of the gauge record using data from nearby rain gauges. Using co-located VWC measurements, imputed data can be used to estimate the probability that VWC exceeds a specific threshold for a given antecedent rainfall. The stochastic imputation model can also provide an estimate of uncertainty in the exceedance probability curve. Here we demonstrate the method using soil moisture and precipitation data from several sites located throughout Northern California. Results show a significant variability between sites in the sensitivity of VWC exceedance probability to antecedent rainfall.
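As a rough illustration of nearest-neighbor conditional resampling, the sketch below fills a gap hour by drawing the target-gauge value from an hour whose nearby-gauge readings most resemble those at the gap. All names are hypothetical, and the real model's conditioning and weighting surely differ; it assumes the neighbor gauges have complete records.

```python
import numpy as np
import pandas as pd

def impute_hourly_precip(target, neighbors, k=5, rng=np.random.default_rng(0)):
    """Fill NaN hours in `target` (a Series) by resampling hours whose
    `neighbors` (DataFrame of nearby gauges, same index) look most similar."""
    filled = target.copy()
    known = target.notna()
    pool_x = neighbors[known].values      # neighbor readings at known hours
    pool_y = target[known].values         # target readings at those hours
    for t in target.index[~known]:
        dist = np.linalg.norm(pool_x - neighbors.loc[t].values, axis=1)
        analogs = np.argsort(dist)[:k]    # k most similar historical hours
        filled.loc[t] = pool_y[rng.choice(analogs)]
    return filled
```

Running the imputation many times with different seeds yields an ensemble of infilled records, from which the spread of the exceedance probability curve can be estimated.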
Human cumulative culture: a comparative perspective.
Dean, Lewis G; Vale, Gill L; Laland, Kevin N; Flynn, Emma; Kendal, Rachel L
2014-05-01
Many animals exhibit social learning and behavioural traditions, but human culture exhibits unparalleled complexity and diversity, and is unambiguously cumulative in character. These similarities and differences have spawned a debate over whether animal traditions and human culture are reliant on homologous or analogous psychological processes. Human cumulative culture combines high-fidelity transmission of cultural knowledge with beneficial modifications to generate a 'ratcheting' in technological complexity, leading to the development of traits far more complex than one individual could invent alone. Claims have been made for cumulative culture in several species of animals, including chimpanzees, orangutans and New Caledonian crows, but these remain contentious. Whilst initial work on the topic of cumulative culture was largely theoretical, employing mathematical methods developed by population biologists, in recent years researchers from a wide range of disciplines, including psychology, biology, economics, biological anthropology, linguistics and archaeology, have turned their attention to the experimental investigation of cumulative culture. We review this literature, highlighting advances made in understanding the underlying processes of cumulative culture and emphasising areas of agreement and disagreement amongst investigators in separate fields. © 2013 The Authors. Biological Reviews © 2013 Cambridge Philosophical Society.
Spatially Locating FIA Plots from Pixel Values
Greg C. Liknes; Geoffrey R. Holden; Mark D. Nelson; Ronald E. McRoberts
2005-01-01
The USDA Forest Service Forest Inventory and Analysis (FIA) program is required to ensure the confidentiality of the geographic locations of plots. To accommodate user requests for data without releasing actual plot coordinates, FIA creates overlays of plot locations on various geospatial data, including satellite imagery. Methods for reporting pixel values associated...
Conceptual recurrence plots: revealing patterns in human discourse.
Angus, Daniel; Smith, Andrew; Wiles, Janet
2012-06-01
Human discourse contains a rich mixture of conceptual information. Visualization of the global and local patterns within this data stream is a complex and challenging problem. Recurrence plots are an information visualization technique that can reveal trends and features in complex time series data. The recurrence plot technique works by measuring the similarity of points in a time series to all other points in the same time series and plotting the results in two dimensions. Previous studies have applied recurrence plotting techniques to textual data; however, these approaches plot recurrence using term-based similarity rather than conceptual similarity of the text. We introduce conceptual recurrence plots, which use a model of language to measure similarity between pairs of text utterances, and the similarity of all utterances is measured and displayed. In this paper, we explore how the descriptive power of the recurrence plotting technique can be used to discover patterns of interaction across a series of conversation transcripts. The results suggest that the conceptual recurrence plotting technique is a useful tool for exploring the structure of human discourse.
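A minimal sketch of the idea, using TF-IDF cosine similarity as a crude stand-in for the paper's richer model of language; the utterances and plotting choices are illustrative only.

```python
import matplotlib.pyplot as plt
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def conceptual_recurrence(utterances):
    """Similarity of every utterance to every other utterance in a transcript."""
    X = TfidfVectorizer().fit_transform(utterances)
    return cosine_similarity(X)

sim = conceptual_recurrence(["how was the game", "we won the game",
                             "what should we eat", "pizza, to celebrate the game"])
plt.imshow(sim, origin="lower", cmap="Greys")  # darker cells = more similar
plt.xlabel("utterance index"); plt.ylabel("utterance index")
plt.show()
```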
Evaluating Plot Designs for the Tropics
Paul C. van Deusen; Bruce Bayle
1991-01-01
Theory and procedures are reviewed for determining the best type of plot for a given forest inventory. A general methodology is given that clarifies the relationship between different plot designs and the associated methods to produce the inventory estimates.
Cumulative uncertainty in measured streamflow and water quality data for small watersheds
Harmel, R.D.; Cooper, R.J.; Slade, R.M.; Haney, R.L.; Arnold, J.G.
2006-01-01
The scientific community has not established an adequate understanding of the uncertainty inherent in measured water quality data, which is introduced by four procedural categories: streamflow measurement, sample collection, sample preservation/storage, and laboratory analysis. Although previous research has produced valuable information on relative differences in procedures within these categories, little information is available that compares the procedural categories or presents the cumulative uncertainty in resulting water quality data. As a result, quality control emphasis is often misdirected, and data uncertainty is typically either ignored or accounted for with an arbitrary margin of safety. Faced with the need for scientifically defensible estimates of data uncertainty to support water resource management, the objectives of this research were to: (1) compile selected published information on uncertainty related to measured streamflow and water quality data for small watersheds, (2) use a root mean square error propagation method to compare the uncertainty introduced by each procedural category, and (3) use the error propagation method to determine the cumulative probable uncertainty in measured streamflow, sediment, and nutrient data. Best case, typical, and worst case "data quality" scenarios were examined. Averaged across all constituents, the calculated cumulative probable uncertainty (±%) contributed under typical scenarios ranged from 6% to 19% for streamflow measurement, from 4% to 48% for sample collection, from 2% to 16% for sample preservation/storage, and from 5% to 21% for laboratory analysis. Under typical conditions, errors in storm loads ranged from 8% to 104% for dissolved nutrients, from 8% to 110% for total N and P, and from 7% to 53% for TSS. Results indicated that uncertainty can increase substantially under poor measurement conditions and limited quality control effort. This research provides introductory scientific estimates of
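The root mean square propagation used here combines independent procedural errors in quadrature. A minimal sketch, with illustrative percentages drawn from the typical-scenario ranges quoted above:

```python
from math import sqrt

def cumulative_uncertainty(*procedural_errors_pct):
    """Root mean square propagation of independent procedural uncertainties,
    each expressed as a +/-% of the measured value."""
    return sqrt(sum(e**2 for e in procedural_errors_pct))

# streamflow, sample collection, preservation/storage, laboratory analysis
print(cumulative_uncertainty(10, 20, 8, 12))  # ~27% combined uncertainty
```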
An Advanced, Three-Dimensional Plotting Library for Astronomy
NASA Astrophysics Data System (ADS)
Barnes, David G.; Fluke, Christopher J.; Bourke, Paul D.; Parry, Owen T.
2006-07-01
We present a new, three-dimensional (3D) plotting library with advanced features, and support for standard and enhanced display devices. The library, s2plot, is written in C and can be used by C, C++, and Fortran programs on GNU/Linux and Apple/OSX systems. s2plot draws objects in a 3D (x,y,z) Cartesian space and the user interactively controls how this space is rendered at run time. With a PGPLOT-inspired interface, s2plot provides astronomers with elegant techniques for displaying and exploring 3D data sets directly from their program code, and the potential to use stereoscopic and dome display devices. The s2plot architecture supports dynamic geometry and can be used to plot time-evolving data sets, such as might be produced by simulation codes. In this paper, we introduce s2plot to the astronomical community, describe its potential applications, and present some example uses of the library.
Cumulative impact assessment: A case study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Irving, J.S.; Bain, M.B.
The National Environmental Policy Act of 1969 (NEPA) indirectly addressed cumulative impacts. Attempts to include cumulative impacts in environmental impact assessments, however, did not begin until the early 1980s. One such effort began when the Federal Energy Regulatory Commission (FERC) received over 1200 applications for hydroelectric projects in the Pacific Northwest. Federal and state agencies, Indian tribes, and environmental groups realized the potential cumulative effect such development could have on fish and wildlife resources. In response, the FERC developed the Cluster Impact Assessment Procedure (CIAP). The CIAP consisted of public scoping meetings; interactive workshops designed to identify projects with potential for cumulative effects, important resources, and available data; and preparation of a NEPA document (EA or EIS). The procedure was modified to assess the cumulative impacts of fifteen hydroelectric projects in the Salmon River Basin, Idaho. The methodology achieved its basic objective of evaluating the impact of hydroelectric development on fish and wildlife resources. However, the use of evaluative techniques to determine project interactions and degrees of impact hindered acceptance of the conclusions. Notwithstanding these problems, the studies provided a basis for decision-makers to incorporate the potential effects of cumulative impacts into the decision-making process. 22 refs., 2 figs., 4 tabs.
Algorithm Calculates Cumulative Poisson Distribution
NASA Technical Reports Server (NTRS)
Bowerman, Paul N.; Nolty, Robert C.; Scheuer, Ernest M.
1992-01-01
Algorithm calculates accurate values of cumulative Poisson distribution under conditions where other algorithms fail because numbers are so small (underflow) or so large (overflow) that computer cannot process them. Factors inserted temporarily to prevent underflow and overflow. Implemented in CUMPOIS computer program described in "Cumulative Poisson Distribution Program" (NPO-17714).
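CUMPOIS itself inserts temporary scaling factors; a modern sketch with the same goal accumulates the Poisson terms in log space (log-sum-exp), so neither the tiny individual terms nor the large factorials overflow.

```python
from math import exp, log, lgamma

def cumulative_poisson(k, lam):
    """P(X <= k) for X ~ Poisson(lam), summed via log-sum-exp so that very
    small or very large intermediate terms neither underflow nor overflow."""
    log_terms = [-lam + i * log(lam) - lgamma(i + 1) for i in range(k + 1)]
    m = max(log_terms)
    return exp(m + log(sum(exp(t - m) for t in log_terms)))

print(cumulative_poisson(900, 1000.0))  # stable even for large lam
```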
Split-plot designs for robotic serial dilution assays.
Buzas, Jeffrey S; Wager, Carrie G; Lansky, David M
2011-12-01
This article explores effective implementation of split-plot designs in serial dilution bioassay using robots. We show that the shortest path for a robot to fill plate wells for a split-plot design is equivalent to the shortest common supersequence problem in combinatorics. We develop an algorithm for finding the shortest common supersequence, provide an R implementation, and explore the distribution of the number of steps required to implement split-plot designs for bioassay through simulation. We also show how to construct collections of split plots that can be filled in a minimal number of steps, thereby demonstrating that split-plot designs can be implemented with nearly the same effort as strip-plot designs. Finally, we provide guidelines for modeling data that result from these designs. © 2011, The International Biometric Society.
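The general many-plate case is hard, but for two well-filling sequences the shortest common supersequence has a classical dynamic program. The sketch below is that textbook construction, not the authors' R implementation.

```python
def scs(a, b):
    """Shortest common supersequence of two sequences (LCS-style DP)."""
    n, m = len(a), len(b)
    dp = [[0] * (m + 1) for _ in range(n + 1)]  # dp[i][j] = SCS length of a[i:], b[j:]
    for i in range(n, -1, -1):
        for j in range(m, -1, -1):
            if i == n or j == m:
                dp[i][j] = (n - i) + (m - j)
            elif a[i] == b[j]:
                dp[i][j] = 1 + dp[i + 1][j + 1]
            else:
                dp[i][j] = 1 + min(dp[i + 1][j], dp[i][j + 1])
    out, i, j = [], 0, 0                         # backtrack to build the SCS
    while i < n and j < m:
        if a[i] == b[j]:
            out.append(a[i]); i += 1; j += 1
        elif dp[i + 1][j] <= dp[i][j + 1]:
            out.append(a[i]); i += 1
        else:
            out.append(b[j]); j += 1
    return "".join(out) + a[i:] + b[j:]

print(scs("ABCBDAB", "BDCABA"))  # a supersequence of length 9
```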
ERIC Educational Resources Information Center
Vale, G. L.; Flynn, E. G.; Kendal, R. L.
2012-01-01
Cumulative culture denotes the, arguably, human capacity to build on the cultural behaviors of one's predecessors, allowing increases in cultural complexity to occur such that many of our cultural artifacts, products and technologies have progressed beyond what a single individual could invent alone. This process of cumulative cultural evolution…
An Excel macro for generating trilinear plots.
Shikaze, Steven G; Crowe, Allan S
2007-01-01
This computer note describes a method for creating trilinear plots in Microsoft Excel. Macros have been created in MS Excel's internal language: Visual Basic for Applications (VBA). A simple form has been set up to allow the user to input data from an Excel worksheet. The VBA macro is used to convert the triangular data (which consist of three columns of percentage data) into X-Y data. The macro then generates the axes, labels, and grid for the trilinear plot. The X-Y data are plotted as scatter data in Excel. By providing this macro in Excel, users can create trilinear plots in a quick, inexpensive manner.
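The core coordinate transform is a barycentric mapping into an equilateral triangle. A minimal sketch under one common axis convention (the macro's orientation may differ):

```python
from math import sqrt

def trilinear_to_xy(a, b, c):
    """Map three percentages (a + b + c = 100) to Cartesian coordinates in a
    unit-side equilateral triangle: a at the origin, b at (1, 0), c at the apex."""
    total = float(a + b + c)
    b, c = b / total, c / total   # normalize to fractions
    return b + c / 2.0, c * sqrt(3.0) / 2.0

print(trilinear_to_xy(20, 30, 50))  # (0.55, ~0.433)
```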
Application of mapped plots for single-owner forest surveys
Paul C. Van Deusen; Francis Roesch
2009-01-01
Mapped plots are used for the national forest inventory conducted by the U.S. Forest Service. Mapped plots are also useful for single-ownership inventories. Mapped plots can handle boundary overlap and can provide less variable estimates for specified forest conditions. Mapping is a good fit for fixed-plot inventories where the fixed-area plot is used for both mapping...
Permanent field plot methodology and equipment
Thomas G. Cole
1993-01-01
Long-term research into the composition, phenology, yield, and growth rates of agroforests can be accomplished with the use of permanent field plots. The periodic remeasurement of these plots provides researchers a quantitative measure of what changes occur over time in indigenous agroforestry systems.
The Probable Ages of Asteroid Families
NASA Technical Reports Server (NTRS)
Harris, A. W.
1993-01-01
There has been considerable debate recently over the ages of the Hirayama families, and in particular whether some of the families are very young. It is a straightforward task to estimate the characteristic time of a collision between a body of a given diameter, d_o, by another body of diameter greater than or equal to d_1. What is less straightforward is to estimate the critical diameter ratio, d_1/d_o, above which catastrophic disruption occurs, from which one could infer probable ages of the Hirayama families, by knowing the diameter of the parent body, d_o. One can gain some insight into the probable value of d_1/d_o, and of the likely ages of existing families, from the plot below. I have computed the characteristic time between collisions in the asteroid belt of a size ratio greater than or equal to d_1/d_o, for 4 sizes of target asteroids, d_o. The solid curves to the lower right are the characteristic times for a single object...
Jose E. Negron; Jill L. Wilson
2003-01-01
We examined attributes of pinon pine (Pinus edulis) associated with the probability of infestation by pinon ips (Ips confusus) in an outbreak in the Coconino National Forest, Arizona. We used data collected from 87 plots, 59 infested and 28 uninfested, and a logistic regression approach to estimate the probability of infestation based on plot- and tree-level attributes....
Jose F. Negron; Jill L. Wilson
2008-01-01
(Please note, this is an abstract only) We examined attributes associated with the probability of infestation by pinon ips (Ips confusus), in pinon pine (Pinus edulis), in an outbreak in the Coconino National Forest, Arizona. We used data collected from 87 plots, 59 infested and 28 uninfested, and a logistic regression approach to estimate the probability of...
Box Plots in the Australian Curriculum
ERIC Educational Resources Information Center
Watson, Jane M.
2012-01-01
This article compares the definition of "box plot" as used in the "Australian Curriculum: Mathematics" with other definitions used in the education community; describes the difficulties students experience when dealing with box plots; and discusses the elaboration that is necessary to enable teachers to develop the knowledge…
NASA Astrophysics Data System (ADS)
Panuzzo, P.; Li, J.; Caux, E.
2012-09-01
The Herschel Interactive Processing Environment (HIPE) was developed by the European Space Agency (ESA) in collaboration with NASA and the Herschel Instrument Control Centres, to provide the astronomical community a complete environment to process and analyze the data gathered by the Herschel Space Observatory. One of the most important components of HIPE is the plotting system (named PlotXY) that we present here. With PlotXY it is possible to easily produce high-quality, publication-ready 2D plots. It provides a long list of features, with fully configurable components, and interactive zooming. The entire code of HIPE is written in Java and is open source, released under the GNU Lesser General Public License version 3. A new version of PlotXY is being developed to be independent of the HIPE code base; it is available to the software development community for inclusion in other projects at the URL http://code.google.com/p/jplot2d/.
The estimated lifetime probability of acquiring human papillomavirus in the United States.
Chesson, Harrell W; Dunne, Eileen F; Hariri, Susan; Markowitz, Lauri E
2014-11-01
Estimates of the lifetime probability of acquiring human papillomavirus (HPV) can help to quantify HPV incidence, illustrate how common HPV infection is, and highlight the importance of HPV vaccination. We developed a simple model, based primarily on the distribution of lifetime numbers of sex partners across the population and the per-partnership probability of acquiring HPV, to estimate the lifetime probability of acquiring HPV in the United States in the time frame before HPV vaccine availability. We estimated the average lifetime probability of acquiring HPV among those with at least 1 opposite sex partner to be 84.6% (range, 53.6%-95.0%) for women and 91.3% (range, 69.5%-97.7%) for men. Under base case assumptions, more than 80% of women and men acquire HPV by age 45 years. Our results are consistent with estimates in the existing literature suggesting a high lifetime probability of HPV acquisition and are supported by cohort studies showing high cumulative HPV incidence over a relatively short period, such as 3 to 5 years.
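The abstract describes the model only in outline, so the sketch below is a guess at its simplest form: average, over a distribution of lifetime partner counts, the chance of at least one acquisition given a fixed per-partnership probability. All numbers are hypothetical illustrations, not the study's inputs.

```python
def lifetime_probability(partner_dist, p_per_partner):
    """P(acquire HPV) = sum over partner counts n of f(n) * (1 - (1 - p)^n)."""
    return sum(f * (1.0 - (1.0 - p_per_partner) ** n)
               for n, f in partner_dist.items())

# hypothetical distribution of lifetime partner counts (fractions sum to 1)
dist = {1: 0.25, 2: 0.15, 4: 0.25, 8: 0.20, 16: 0.15}
print(lifetime_probability(dist, p_per_partner=0.4))  # ~0.76
```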
Cumulative human impacts on marine predators.
Maxwell, Sara M; Hazen, Elliott L; Bograd, Steven J; Halpern, Benjamin S; Breed, Greg A; Nickel, Barry; Teutschel, Nicole M; Crowder, Larry B; Benson, Scott; Dutton, Peter H; Bailey, Helen; Kappes, Michelle A; Kuhn, Carey E; Weise, Michael J; Mate, Bruce; Shaffer, Scott A; Hassrick, Jason L; Henry, Robert W; Irvine, Ladd; McDonald, Birgitte I; Robinson, Patrick W; Block, Barbara A; Costa, Daniel P
2013-01-01
Stressors associated with human activities interact in complex ways to affect marine ecosystems, yet we lack spatially explicit assessments of cumulative impacts on ecologically and economically key components such as marine predators. Here we develop a metric of cumulative utilization and impact (CUI) on marine predators by combining electronic tracking data of eight protected predator species (n=685 individuals) in the California Current Ecosystem with data on 24 anthropogenic stressors. We show significant variation in CUI with some of the highest impacts within US National Marine Sanctuaries. High variation in underlying species and cumulative impact distributions means that neither alone is sufficient for effective spatial management. Instead, comprehensive management approaches accounting for both cumulative human impacts and trade-offs among multiple stressors must be applied in planning the use of marine resources.
The Doghouse Plot: History, Construction Techniques, and Application
NASA Astrophysics Data System (ADS)
Wilson, John Robert
The Doghouse Plot visually represents an aircraft's performance during combined turn-climb maneuvers. The Doghouse Plot completely describes the turn-climb capability of an aircraft; a single plot demonstrates the relationship between climb performance, turn rate, turn radius, stall margin, and bank angle. Using NASA legacy codes, Empirical Drag Estimation Technique (EDET) and Numerical Propulsion System Simulation (NPSS), it is possible to reverse engineer sufficient basis data for commercial and military aircraft to construct Doghouse Plots. Engineers and operators can then use these to assess their aircraft's full performance envelope. The insight gained from these plots can broaden the understanding of an aircraft's performance and, in turn, broaden the operational scope of some aircraft that would otherwise be limited by the simplifications found in their Airplane Flight Manuals (AFM). More importantly, these plots can build on the current standards of obstacle avoidance and expose risks in operation.
Recurrence plot statistics and the effect of embedding
NASA Astrophysics Data System (ADS)
March, T. K.; Chapman, S. C.; Dendy, R. O.
2005-01-01
Recurrence plots provide a graphical representation of the recurrent patterns in a timeseries, the quantification of which is a relatively new field. Here we derive analytical expressions which relate the values of key statistics, notably determinism and entropy of line length distribution, to the correlation sum as a function of embedding dimension. These expressions are obtained by deriving the transformation which generates an embedded recurrence plot from an unembedded plot. A single unembedded recurrence plot thus provides the statistics of all possible embedded recurrence plots. If the correlation sum scales exponentially with embedding dimension, we show that these statistics are determined entirely by the exponent of the exponential. This explains the results of Iwanski and Bradley [J.S. Iwanski, E. Bradley, Recurrence plots of experimental data: to embed or not to embed? Chaos 8 (1998) 861-871] who found that certain recurrence plot statistics are apparently invariant to embedding dimension for certain low-dimensional systems. We also examine the relationship between the mutual information content of two timeseries and the common recurrent structure seen in their recurrence plots. This allows time-localized contributions to mutual information to be visualized. This technique is demonstrated using geomagnetic index data; we show that the AU and AL geomagnetic indices share half their information, and find the timescale on which mutual features appear.
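For concreteness, a small sketch of the objects discussed here: a recurrence plot built from a delay embedding, and the determinism (DET) statistic computed from its diagonal-line lengths. Parameter choices are illustrative.

```python
import numpy as np

def recurrence_plot(x, m=1, tau=1, eps=0.1):
    """Binary recurrence matrix of a series embedded in dimension m, delay tau."""
    n = len(x) - (m - 1) * tau
    emb = np.column_stack([x[i * tau:i * tau + n] for i in range(m)])
    d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=2)
    return (d <= eps).astype(int)

def determinism(rp, lmin=2):
    """Fraction of recurrence points on diagonal lines of length >= lmin."""
    n, hist = rp.shape[0], {}
    for k in range(-(n - 1), n):
        if k == 0:
            continue                    # skip the line of identity
        run = 0
        for v in list(np.diagonal(rp, offset=k)) + [0]:  # sentinel flushes runs
            if v:
                run += 1
            elif run:
                hist[run] = hist.get(run, 0) + 1
                run = 0
    total = sum(l * c for l, c in hist.items())
    return sum(l * c for l, c in hist.items() if l >= lmin) / total if total else 0.0
```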
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hanson, P.J.; Phillips, J.R.; Brice, D.J.
This data set reports shrub layer growth assessments for the S1-Bog on the Marcell Experimental Forest in Minnesota from 2010 through 2017. Data were obtained by destructively harvesting two 0.25 m2 plots within defined plot areas of the S1-Bog or SPRUCE experimental plots. In 2015, SPRUCE plots 4, 6, 8, 10, 11, 13, 16, 17, 19 and 20 were enclosed in the SPRUCE enclosures. Prior to 2015 all data are for open ambient conditions. In early years a distinct hummock and a hollow sampling square were both collected, but in later years unsampled hollow areas became unavailable due to prior sampling or instrument installations. All vegetation material above the Sphagnum surface of the bog was clipped and transferred to plastic storage bags which were then frozen until the samples could be sorted. Sorting was done by species, tissue type (leaves vs. stems) and tissue age (current-year vs. older tissues).
Adjusted variable plots for Cox's proportional hazards regression model.
Hall, C B; Zeger, S L; Bandeen-Roche, K J
1996-01-01
Adjusted variable plots are useful in linear regression for outlier detection and for qualitative evaluation of the fit of a model. In this paper, we extend adjusted variable plots to Cox's proportional hazards model for possibly censored survival data. We propose three different plots: a risk level adjusted variable (RLAV) plot in which each observation in each risk set appears, a subject level adjusted variable (SLAV) plot in which each subject is represented by one point, and an event level adjusted variable (ELAV) plot in which the entire risk set at each failure event is represented by a single point. The latter two plots are derived from the RLAV by combining multiple points. In each point, the regression coefficient and standard error from a Cox proportional hazards regression is obtained by a simple linear regression through the origin fit to the coordinates of the pictured points. The plots are illustrated with a reanalysis of a dataset of 65 patients with multiple myeloma.
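For readers unfamiliar with the linear-regression original that these Cox variants extend, here is a minimal sketch of the classical added-variable construction (the Frisch-Waugh-Lovell device); the Cox-model versions in the paper are more involved.

```python
import numpy as np

def _resid(v, Q):
    """Residuals of v after least-squares projection onto the columns of Q."""
    beta, *_ = np.linalg.lstsq(Q, v, rcond=None)
    return v - Q @ beta

def added_variable_coords(y, X, j):
    """Classical added-variable plot for column j of X: residuals of y on the
    other covariates versus residuals of X[:, j] on the other covariates.
    The least-squares slope through the origin equals the full-model coefficient."""
    others = np.column_stack([np.ones(len(y)), np.delete(X, j, axis=1)])
    return _resid(X[:, j], others), _resid(y, others)
```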
Cumulative iron dose and resistance to erythropoietin.
Rosati, A; Tetta, C; Merello, J I; Palomares, I; Perez-Garcia, R; Maduell, F; Canaud, B; Aljama Garcia, P
2015-10-01
Optimizing anemia treatment in hemodialysis (HD) patients remains a priority worldwide as it has significant health and financial implications. Our aim was to evaluate, in a large cohort of chronic HD patients in Fresenius Medical Care centers in Spain, the value of cumulative iron (Fe) dose monitoring for the management of iron therapy in erythropoiesis-stimulating agent (ESA)-treated patients, and the relationship between cumulative iron dose and risk of hospitalization. Demographic, clinical and laboratory parameters from EuCliD® (European Clinical Dialysis Database) on 3,591 patients were recorded, including ESA dose (IU/kg/week), erythropoietin resistance index (ERI) [IU/week/kg/g hemoglobin (Hb)] and hospitalizations. Moreover, the cumulative Fe dose (mg/kg of body weight) administered over the last 2 years was calculated. Univariate and multivariate analyses were performed to identify the main predictors of ESA resistance and risk of hospitalization. Patients belonging to the 4th quartile of ERI were defined as hypo-responders. The 2-year cumulative iron dose was significantly higher in the 4th quartile of ERI. In hypo-responders, the 2-year cumulative iron dose was the only iron marker associated with ESA resistance. In case-mix-adjusted multivariate analysis, the 2-year cumulative iron dose was an independent predictor of hospitalization risk. In ESA-treated patients, cumulative Fe dose could be a useful tool to monitor the appropriateness of Fe therapy and to prevent iron overload. To establish whether the associations between cumulative iron dose, ERI and hospitalization risk are causal or attributable to selection bias by indication, clinical trials are necessary.
The Heuristic Interpretation of Box Plots
ERIC Educational Resources Information Center
Lem, Stephanie; Onghena, Patrick; Verschaffel, Lieven; Van Dooren, Wim
2013-01-01
Box plots are frequently used, but are often misinterpreted by students. Especially the area of the box in box plots is often misinterpreted as representing number or proportion of observations, while it actually represents their density. In a first study, reaction time evidence was used to test whether heuristic reasoning underlies this…
Francq, Bernard G; Govaerts, Bernadette
2016-06-30
Two main methodologies for assessing equivalence in method-comparison studies are presented separately in the literature. The first one is the well-known and widely applied Bland-Altman approach with its agreement intervals, where two methods are considered interchangeable if their differences are not clinically significant. The second approach is based on errors-in-variables regression in a classical (X,Y) plot and focuses on confidence intervals, whereby two methods are considered equivalent when providing similar measures notwithstanding the random measurement errors. This paper reconciles these two methodologies and shows their similarities and differences using both real data and simulations. A new consistent correlated-errors-in-variables regression is introduced as the errors are shown to be correlated in the Bland-Altman plot. Indeed, the coverage probabilities collapse and the biases soar when this correlation is ignored. Novel tolerance intervals are compared with agreement intervals with or without replicated data, and novel predictive intervals are introduced to predict a single measure in an (X,Y) plot or in a Bland-Altman plot with excellent coverage probabilities. We conclude that the (correlated)-errors-in-variables regressions should not be avoided in method comparison studies, although the Bland-Altman approach is usually applied to avert their complexity. We argue that tolerance or predictive intervals are better alternatives than agreement intervals, and we provide guidelines for practitioners regarding method comparison studies. Copyright © 2016 John Wiley & Sons, Ltd.
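As background for the comparison, the classical Bland-Altman agreement interval is simply the mean paired difference plus or minus 1.96 standard deviations of the differences. A minimal sketch (the tolerance and predictive intervals proposed in the paper are different, wider constructions):

```python
import numpy as np

def agreement_interval(x, y):
    """Classical Bland-Altman 95% limits of agreement for paired measurements."""
    d = np.asarray(y, float) - np.asarray(x, float)
    md, sd = d.mean(), d.std(ddof=1)
    return md - 1.96 * sd, md + 1.96 * sd
```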
High lifetime probability of screen-detected cervical abnormalities.
Pankakoski, Maiju; Heinävaara, Sirpa; Sarkeala, Tytti; Anttila, Ahti
2017-12-01
Objective Regular screening and follow-up are an important key to cervical cancer prevention; however, screening inevitably detects mild or borderline abnormalities that would never progress to a more severe stage. We analysed the cumulative probability and recurrence of cervical abnormalities in the Finnish organized screening programme during a 22-year follow-up. Methods Screening histories were collected for 364,487 women born between 1950 and 1965. Data consisted of 1,207,017 routine screens and 88,143 follow-up screens between 1991 and 2012. Probabilities of cervical abnormalities by age were estimated using logistic regression and generalized estimating equations methodology. Results The probability of experiencing any abnormality at least once at ages 30-64 was 34.0% (95% confidence interval [CI]: 33.3-34.6%). Probability was 5.4% (95% CI: 5.0-5.8%) for results warranting referral and 2.2% (95% CI: 2.0-2.4%) for results with histologically confirmed findings. Previous occurrences were associated with an increased risk of detecting new ones, specifically in older women. Conclusion A considerable proportion of women experience at least one abnormal screening result during their lifetime, and yet very few eventually develop an actual precancerous lesion. Re-evaluation of diagnostic criteria concerning mild abnormalities might improve the balance of harms and benefits of screening. Special monitoring of women with recurrent abnormalities, especially at older ages, may also be needed.
Cumulative cultural learning: Development and diversity
2017-01-01
The complexity and variability of human culture is unmatched by any other species. Humans live in culturally constructed niches filled with artifacts, skills, beliefs, and practices that have been inherited, accumulated, and modified over generations. A causal account of the complexity of human culture must explain its distinguishing characteristics: It is cumulative and highly variable within and across populations. I propose that the psychological adaptations supporting cumulative cultural transmission are universal but are sufficiently flexible to support the acquisition of highly variable behavioral repertoires. This paper describes variation in the transmission practices (teaching) and acquisition strategies (imitation) that support cumulative cultural learning in childhood. Examining flexibility and variation in caregiver socialization and children’s learning extends our understanding of evolution in living systems by providing insight into the psychological foundations of cumulative cultural transmission—the cornerstone of human cultural diversity. PMID:28739945
Cumulative cultural learning: Development and diversity.
Legare, Cristine H
2017-07-24
The complexity and variability of human culture is unmatched by any other species. Humans live in culturally constructed niches filled with artifacts, skills, beliefs, and practices that have been inherited, accumulated, and modified over generations. A causal account of the complexity of human culture must explain its distinguishing characteristics: It is cumulative and highly variable within and across populations. I propose that the psychological adaptations supporting cumulative cultural transmission are universal but are sufficiently flexible to support the acquisition of highly variable behavioral repertoires. This paper describes variation in the transmission practices (teaching) and acquisition strategies (imitation) that support cumulative cultural learning in childhood. Examining flexibility and variation in caregiver socialization and children's learning extends our understanding of evolution in living systems by providing insight into the psychological foundations of cumulative cultural transmission-the cornerstone of human cultural diversity.
Hunter-Gatherer Inter-Band Interaction Rates: Implications for Cumulative Culture
Hill, Kim R.; Wood, Brian M.; Baggio, Jacopo; Hurtado, A. Magdalena; Boyd, Robert T.
2014-01-01
Our species exhibits spectacular success due to cumulative culture. While cognitive evolution of social learning mechanisms may be partially responsible for adaptive human culture, features of early human social structure may also play a role by increasing the number of potential models from which to learn innovations. We present interview data on interactions between same-sex adult dyads of Ache and Hadza hunter-gatherers living in multiple distinct residential bands (20 Ache bands; 42 Hadza bands; 1201 dyads) throughout a tribal home range. Results show high probabilities (5%–29% per year) of cultural and cooperative interactions between randomly chosen adults. Multiple regression suggests that ritual relationships increase interaction rates more than kinship, and that affinal kin interact more often than dyads with no relationship. These may be important features of human sociality. Finally, yearly interaction rates along with survival data allow us to estimate expected lifetime partners for a variety of social activities, and compare those to chimpanzees. Hadza and Ache men are estimated to observe over 300 men making tools in a lifetime, whereas male chimpanzees interact with only about 20 other males in a lifetime. High intergroup interaction rates in ancestral humans may have promoted the evolution of cumulative culture. PMID:25047714
Hunter-gatherer inter-band interaction rates: implications for cumulative culture.
Hill, Kim R; Wood, Brian M; Baggio, Jacopo; Hurtado, A Magdalena; Boyd, Robert T
2014-01-01
Our species exhibits spectacular success due to cumulative culture. While cognitive evolution of social learning mechanisms may be partially responsible for adaptive human culture, features of early human social structure may also play a role by increasing the number of potential models from which to learn innovations. We present interview data on interactions between same-sex adult dyads of Ache and Hadza hunter-gatherers living in multiple distinct residential bands (20 Ache bands; 42 Hadza bands; 1201 dyads) throughout a tribal home range. Results show high probabilities (5%-29% per year) of cultural and cooperative interactions between randomly chosen adults. Multiple regression suggests that ritual relationships increase interaction rates more than kinship, and that affinal kin interact more often than dyads with no relationship. These may be important features of human sociality. Finally, yearly interaction rates along with survival data allow us to estimate expected lifetime partners for a variety of social activities, and compare those to chimpanzees. Hadza and Ache men are estimated to observe over 300 men making tools in a lifetime, whereas male chimpanzees interact with only about 20 other males in a lifetime. High intergroup interaction rates in ancestral humans may have promoted the evolution of cumulative culture.
Recurrence plots of discrete-time Gaussian stochastic processes
NASA Astrophysics Data System (ADS)
Ramdani, Sofiane; Bouchara, Frédéric; Lagarde, Julien; Lesne, Annick
2016-09-01
We investigate the statistical properties of recurrence plots (RPs) of data generated by discrete-time stationary Gaussian random processes. We analytically derive the theoretical values of the probabilities of occurrence of recurrence points and consecutive recurrence points forming diagonals in the RP, with an embedding dimension equal to 1. These results allow us to obtain theoretical values of three measures: (i) the recurrence rate (REC), (ii) the percent determinism (DET), and (iii) an RP-based estimation of the ε-entropy κ(ε) in the sense of correlation entropy. We apply these results to two Gaussian processes, namely first-order autoregressive processes and fractional Gaussian noise. For these processes, we simulate a number of realizations and compare the RP-based estimations of the three selected measures to their theoretical values. These comparisons provide useful information on the quality of the estimations, such as the minimum required data length and threshold radius used to construct the RP.
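With embedding dimension 1 and a stationary Gaussian series, the recurrence probability at lag k follows from the Gaussian difference X_t - X_{t+k} ~ N(0, 2*sigma^2*(1 - rho_k)), so REC can be written in closed form and checked against simulation. A sketch for an AR(1) process, with illustrative parameters:

```python
import numpy as np
from math import erf, sqrt

def empirical_rec(x, eps):
    """Recurrence rate (m = 1): fraction of off-diagonal pairs within eps."""
    d = np.abs(x[:, None] - x[None, :])
    n = len(x)
    return ((d <= eps).sum() - n) / (n * (n - 1))

def analytic_rec(phi, n, eps):
    """Average over lags k of P(|X_t - X_{t+k}| <= eps) for a stationary AR(1)."""
    sigma = 1.0 / sqrt(1.0 - phi**2)   # stationary standard deviation
    s = sum(2 * (n - k) * erf(eps / (2 * sigma * sqrt(1.0 - phi**k)))
            for k in range(1, n))
    return s / (n * (n - 1))

rng = np.random.default_rng(1)
phi, n = 0.7, 2000
x = np.empty(n)
x[0] = rng.normal(scale=1.0 / sqrt(1.0 - phi**2))  # stationary start
for t in range(1, n):                              # simulate the AR(1) path
    x[t] = phi * x[t - 1] + rng.normal()
print(empirical_rec(x, 0.5), analytic_rec(phi, n, 0.5))  # should agree closely
```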
iCanPlot: Visual Exploration of High-Throughput Omics Data Using Interactive Canvas Plotting
Sinha, Amit U.; Armstrong, Scott A.
2012-01-01
Increasing use of high throughput genomic scale assays requires effective visualization and analysis techniques to facilitate data interpretation. Moreover, existing tools often require programming skills, which discourages bench scientists from examining their own data. We have created iCanPlot, a compelling platform for visual data exploration based on the latest technologies. Using the recently adopted HTML5 Canvas element, we have developed a highly interactive tool to visualize tabular data and identify interesting patterns in an intuitive fashion without the need of any specialized computing skills. A module for geneset overlap analysis has been implemented on the Google App Engine platform: when the user selects a region of interest in the plot, the genes in the region are analyzed on the fly. The visualization and analysis are amalgamated for a seamless experience. Further, users can easily upload their data for analysis—which also makes it simple to share the analysis with collaborators. We illustrate the power of iCanPlot by showing an example of how it can be used to interpret histone modifications in the context of gene expression. PMID:22393367
Faithfulness of Recurrence Plots: A Mathematical Proof
NASA Astrophysics Data System (ADS)
Hirata, Yoshito; Komuro, Motomasa; Horai, Shunsuke; Aihara, Kazuyuki
It is known in practice that a recurrence plot, a two-dimensional visualization of time series data, can contain almost all information related to the underlying dynamics except for its spatial scale, because a rough shape of the original time series can be recovered from the recurrence plot even if the original time series is multivariate. Here we provide a mathematical proof that the metric defined by a recurrence plot [Hirata et al., 2008] is equivalent to the Euclidean metric under mild conditions.
National FIA plot intensification procedure report
Jock A. Blackard; Paul L. Patterson
2014-01-01
The Forest Inventory and Analysis (FIA) program of the U.S. Forest Service (USFS) measures a spatially distributed base grid of forest inventory plots across the United States. The sampling intensity of plots may be increased in some regions when warranted by specific inventory objectives. Several intensification methods have been developed within FIA and USFS National...
Computer routine adds plotting capabilities to existing programs
NASA Technical Reports Server (NTRS)
Harris, J. C.; Linnekin, J. S.
1966-01-01
PLOTAN, a generalized plot analysis routine written for the IBM 7094 computer, minimizes the difficulties in adding plot capabilities to large existing programs. PLOTAN is used in conjunction with a binary tape writing routine and has the ability to plot any variable on the intermediate binary tape as a function of any other.
The poor man's Geographic Information System: plot expansion factors
Paul C. Van Deusen
2007-01-01
Plot expansion factors can serve as a crude Geographic Information System for users of Forest Inventory and Analysis (FIA) data. Each FIA plot has an associated expansion factor that is often interpreted as the number of forested acres that the plot represents. The derivation of expansion factors is discussed and it is shown that the mapped plot design requires a...
Measurements of smoke from chipped and unchipped plots
Gary L. Achtemeier; Jeff Glitzenstein; Luke P. Naeher
2006-01-01
Smoke data were collected from two instrumented plots located on the Francis Marion National Forest in South Carolina during prescribed burns on Feb. 12, 2003. One of the plots had been subjected to mechanical chipping. Particulate matter (PM2.5) data analyzed by gravimetric methods were collected at nine locations on the downwind sides of each plot. In addition,...
Jang, Dae -Heung; Anderson-Cook, Christine Michaela
2016-11-22
With many predictors in regression, fitting the full model can induce multicollinearity problems. The Least Absolute Shrinkage and Selection Operator (LASSO) is useful when the effects of many explanatory variables are sparse in a high-dimensional dataset. Influential points can have a disproportionate impact on the estimated values of model parameters. This paper describes a new influence plot that can be used to increase understanding of the contributions of individual observations and the robustness of results. This can serve as a complement to other regression diagnostics techniques in the LASSO regression setting. Using this influence plot, we can find influential points and their impact on shrinkage of model parameters and model selection. Lastly, we provide two examples to illustrate the methods.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jang, Dae -Heung; Anderson-Cook, Christine Michaela
With many predictors in regression, fitting the full model can induce multicollinearity problems. The Least Absolute Shrinkage and Selection Operator (LASSO) is useful when the effects of many explanatory variables are sparse in a high-dimensional dataset. Influential points can have a disproportionate impact on the estimated values of model parameters. This paper describes a new influence plot that can be used to increase understanding of the contributions of individual observations and the robustness of results. This can serve as a complement to other regression diagnostics techniques in the LASSO regression setting. Using this influence plot, we can find influential points and their impact on shrinkage of model parameters and model selection. Lastly, we provide two examples to illustrate the methods.
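The paper's influence plot is not reproduced here; as a generic stand-in in the same spirit, a leave-one-out diagnostic can be sketched with scikit-learn, refitting the LASSO without each observation and recording how far the coefficient vector moves.

```python
import numpy as np
from sklearn.linear_model import Lasso

def loo_coefficient_shifts(X, y, alpha=0.1):
    """Distance each observation's removal moves the LASSO coefficient vector;
    large values flag potentially influential points."""
    full = Lasso(alpha=alpha).fit(X, y).coef_
    shifts = np.empty(len(y))
    for i in range(len(y)):
        keep = np.arange(len(y)) != i
        refit = Lasso(alpha=alpha).fit(X[keep], y[keep]).coef_
        shifts[i] = np.linalg.norm(refit - full)
    return shifts
```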
The Maximum Cumulative Ratio (MCR) quantifies the degree to which a single chemical drives the cumulative risk of an individual exposed to multiple chemicals. Phthalates are a class of chemicals with ubiquitous exposures in the general population that have the potential to cause ...
Genome U-Plot: a whole genome visualization.
Gaitatzes, Athanasios; Johnson, Sarah H; Smadbeck, James B; Vasmatzis, George
2018-05-15
The ability to produce and analyze whole genome sequencing (WGS) data from samples with structural variations (SV) generated the need to visualize such abnormalities in simplified plots. Conventional two-dimensional representations of WGS data frequently use either circular or linear layouts. Each of these representations has its advantages, but their major disadvantage is that they do not use the two-dimensional space very efficiently. We propose a layout, termed the Genome U-Plot, which spreads the chromosomes on a two-dimensional surface and essentially quadruples the spatial resolution. We present the Genome U-Plot for producing clear and intuitive graphs that allow researchers to generate novel insights and hypotheses by visualizing SVs such as deletions, amplifications, and chromoanagenesis events. The main features of the Genome U-Plot are its layered layout, its high spatial resolution and its improved aesthetic qualities. We compare conventional visualization schemas with the Genome U-Plot using visualization metrics such as the number of line crossings and crossing angle resolution measures. Based on our metrics, we improve the readability of the resulting graph by at least 2-fold, making important features apparent and important genomic changes easy to identify. A whole genome visualization tool with high spatial resolution and improved aesthetic qualities. An implementation and documentation of the Genome U-Plot is publicly available at https://github.com/gaitat/GenomeUPlot. vasmatzis.george@mayo.edu. Supplementary data are available at Bioinformatics online.
Properties of added variable plots in Cox's regression model.
Lindkvist, M
2000-03-01
The added variable plot is useful for examining the effect of a covariate in regression models. The plot provides information regarding the inclusion of a covariate, and is useful in identifying influential observations on the parameter estimates. Hall et al. (1996) proposed a plot for Cox's proportional hazards model derived by regarding the Cox model as a generalized linear model. This paper proves and discusses properties of this plot. These properties make the plot a valuable tool in model evaluation. Quantities considered include parameter estimates, residuals, leverage, case influence measures and correspondence to previously proposed residuals and diagnostics.
Robert E. Keane
2006-01-01
The Plot Description (PD) form is used to describe general characteristics of the FIREMON macroplot to provide ecological context for data analyses. The PD data characterize the topographical setting, geographic reference point, general plant composition and cover, ground cover, fuels, and soils information. This method provides the general ecological data that can be...
Chapter 19. Cumulative watershed effects and watershed analysis
Leslie M. Reid
1998-01-01
Cumulative watershed effects are environmental changes that are affected by more than one land-use activity and that are influenced by processes involving the generation or transport of water. Almost all environmental changes are cumulative effects, and almost all land-use activities contribute to cumulative effects
42 CFR 457.560 - Cumulative cost-sharing maximum.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 42 Public Health 4 2010-10-01 2010-10-01 false Cumulative cost-sharing maximum. 457.560 Section... State Plan Requirements: Enrollee Financial Responsibilities § 457.560 Cumulative cost-sharing maximum... writing and orally if appropriate of their individual cumulative cost-sharing maximum amount at the time...
Probability of failure prediction for step-stress fatigue under sine or random stress
NASA Technical Reports Server (NTRS)
Lambert, R. G.
1979-01-01
A previously proposed cumulative fatigue damage law is extended to predict the probability of failure or fatigue life for structural materials with S-N fatigue curves represented as a scatterband of failure points. The proposed law applies to structures subjected to sinusoidal or random stresses and includes the effect of initial crack (i.e., flaw) sizes. The corrected cycle ratio damage function is shown to have physical significance.
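The corrected cycle-ratio function itself is not given in the abstract; for orientation, the classical Palmgren-Miner law it builds on sums cycle ratios over stress blocks, as in this sketch with a hypothetical Basquin-type S-N curve (all constants illustrative).

```python
def miner_damage(blocks, cycles_to_failure):
    """Palmgren-Miner cumulative damage D = sum(n_i / N_i); D >= 1 predicts failure."""
    return sum(n / cycles_to_failure(s) for s, n in blocks)

# hypothetical Basquin curve S = A * N**b  =>  N = (S / A)**(1 / b)
N_of_S = lambda s: (s / 900.0) ** (1.0 / -0.1)

# (stress amplitude, applied cycles) for two loading blocks
print(miner_damage([(300.0, 1.0e4), (400.0, 5.0e3)], N_of_S))  # ~1.7: failure predicted
```

The probabilistic extension described above would replace the single S-N curve with a scatterband of curves, turning D into a distribution from which a probability of failure is read off.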
Aeronautical Engineering: 1983 cumulative index
NASA Technical Reports Server (NTRS)
1984-01-01
This bibliography is a cumulative index to the abstracts contained in NASA SP-7037 (158) through NASA SP-7037 (169) of Aeronautical Engineering: A Continuing Bibliography. NASA SP-7037 and its supplements have been compiled through the cooperative efforts of the American Institute of Aeronautics and Astronautics (AIAA) and the National Aeronautics and Space Administration (NASA). This cumulative index includes subject, personal author, corporate source, contract, report number, and accession number indexes.
Olarerin-George, Anthony O; Jaffrey, Samie R
2017-05-15
An increasing number of studies are mapping protein binding and nucleotide modifications sites throughout the transcriptome. Often, these sites cluster in certain regions of the transcript, giving clues to their function. Hence, it is informative to summarize where in the transcript these sites occur. A metagene is a simple and effective tool for visualizing the distribution of sites along a simplified transcript model. In this work, we introduce MetaPlotR, a Perl/R pipeline for creating metagene plots. The code and associated tutorial are available at https://github.com/olarerin/metaPlotR . srj2003@med.cornell.edu. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
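The usual metagene construction rescales each annotated site into a simplified transcript whose 5'UTR, CDS and 3'UTR occupy fixed unit intervals; the sketch below mirrors that idea in Python, though MetaPlotR itself is a Perl/R pipeline and its scaling details may differ.

```python
def metagene_coordinate(pos, utr5_len, cds_len, utr3_len):
    """Map a 0-based transcript position onto a metagene axis where the
    5'UTR, CDS and 3'UTR span [0,1), [1,2) and [2,3)."""
    if pos < utr5_len:
        return pos / utr5_len
    if pos < utr5_len + cds_len:
        return 1.0 + (pos - utr5_len) / cds_len
    return 2.0 + (pos - utr5_len - cds_len) / utr3_len

# sites pooled across transcripts can then be histogrammed on [0, 3)
print(metagene_coordinate(1500, utr5_len=200, cds_len=1200, utr3_len=800))  # 2.125
```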
Origin of the enhancement of tunneling probability in the nearly integrable system
NASA Astrophysics Data System (ADS)
Hanada, Yasutaka; Shudo, Akira; Ikeda, Kensuke S.
2015-04-01
The enhancement of tunneling probability in the nearly integrable system is closely examined, focusing on tunneling splittings plotted as a function of the inverse of Planck's constant. On the basis of the analysis using the absorber which efficiently suppresses the coupling, creating spikes in the plot, we found that the splitting curve should be viewed as a staircase-shaped skeleton accompanied by spikes. We further introduce renormalized integrable Hamiltonians and explore the origin of such a staircase structure by closely investigating the nature of the eigenfunctions. It is found that the origin of the staircase structure can be traced back to the anomalous structure of the tunneling tail which manifests itself in the representation using renormalized action bases. This also explains why the staircase does not appear in the completely integrable system.
HEATPLOT: a temperature distribution plotting program for heating
DOE Office of Scientific and Technical Information (OSTI.GOV)
Elrod, D.C.; Turner, W.D.
1977-07-01
HEATPLOT is a temperature distribution plotting program that may be used with HEATING5, a generalized heat conduction code. HEATPLOT is capable of drawing temperature contours (isotherms), temperature-time profiles, and temperature-distance profiles from the current HEATING5 temperature distribution or from temperature changes relative to the initial temperature distribution. Contour plots may be made for two- or three-dimensional models. Temperature-time profiles and temperature-distance profiles may be made for one-, two-, and three-dimensional models. HEATPLOT is an IBM 360/370 computer code which uses the DISSPLA plotting package. Plots may be created on the CALCOMP pen-and-ink, the CALCOMP cathode ray tube (CRT), or the EAI pen-and-ink plotters. Printer plots may be produced, or a compressed data set may be made that can be routed to any of the available plotters.
The Maximum Cumulative Ratio (MCR) quantifies the degree to which a single component of a chemical mixture drives the cumulative risk of a receptor.1 This study used the MCR, the Hazard Index (HI) and Hazard Quotient (HQ) to evaluate co-exposures to six phthalates using biomonito...
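Following the definition cited in these records (the MCR is the Hazard Index divided by the largest single Hazard Quotient), a minimal sketch with hypothetical phthalate HQs:

```python
def maximum_cumulative_ratio(hazard_quotients):
    """MCR = HI / max(HQ); values near 1 mean one chemical drives the risk,
    values near the number of chemicals mean the risk is spread evenly."""
    hi = sum(hazard_quotients)  # Hazard Index: sum of the individual HQs
    return hi / max(hazard_quotients)

print(maximum_cumulative_ratio([0.30, 0.05, 0.02, 0.01]))  # ~1.27: one driver
```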
Why Veterinary Medical Educators Should Embrace Cumulative Final Exams.
Royal, Kenneth D
The topic of cumulative final examinations often elicits polarizing opinions from veterinary medical educators. While some faculty prefer cumulative finals, many perceive these types of examinations as problematic. Specifically, faculty often claim that cumulative examinations are more likely to cause students greater stress, which may in turn result in negative student evaluations of teaching. Cumulative finals also restrict the number of items one may present to students on the most recent material. While these cited disadvantages may have some merit, the advantages of cumulative examinations far exceed the disadvantages. The purpose of this article is to discuss the advantages of cumulative examinations with respect to learning evidence, grade/score validity, fairness issues, and implications for academic policy.
7 CFR 42.132 - Determining cumulative sum values.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 7 Agriculture 2 2011-01-01 2011-01-01 false Determining cumulative sum values. 42.132 Section 42... REGULATIONS STANDARDS FOR CONDITION OF FOOD CONTAINERS On-Line Sampling and Inspection Procedures § 42.132 Determining cumulative sum values. (a) The parameters for the on-line cumulative sum sampling plans for AQL's...
Kleinman, Daniel; Runnqvist, Elin; Ferreira, Victor S.
2015-01-01
Comprehenders predict upcoming speech and text on the basis of linguistic input. How many predictions do comprehenders make for an upcoming word? If a listener strongly expects to hear the word “sock”, is the word “shirt” partially expected as well, is it actively inhibited, or is it ignored? The present research addressed these questions by measuring the “downstream” effects of prediction on the processing of subsequently presented stimuli using the cumulative semantic interference paradigm. In three experiments, subjects named pictures (sock) that were presented either in isolation or after strongly constraining sentence frames (“After doing his laundry, Mark always seemed to be missing one…”). Naming sock slowed the subsequent naming of the picture shirt – the standard cumulative semantic interference effect. However, although picture naming was much faster after sentence frames, the interference effect was not modulated by the context (bare vs. sentence) in which either picture was presented. According to the only model of cumulative semantic interference that can account for such a pattern of data, this indicates that comprehenders pre-activated and maintained the pre-activation of best sentence completions (sock) but did not maintain the pre-activation of less likely completions (shirt). Thus, comprehenders predicted only the most probable completion for each sentence. PMID:25917550
A fast hidden line algorithm for plotting finite element models
NASA Technical Reports Server (NTRS)
Jones, G. K.
1982-01-01
Effective plotting of finite element models requires the use of fast hidden line plot techniques that provide interactive response. A high speed hidden line technique was developed to facilitate the plotting of NASTRAN finite element models. Based on testing using 14 different models, the new hidden line algorithm (JONES-D) appears to be very fast: its speed equals that for normal (all lines visible) plotting and when compared to other existing methods it appears to be substantially faster. It also appears to be very reliable: no plot errors were observed using the new method to plot NASTRAN models. The new algorithm was made part of the NPLOT NASTRAN plot package and was used by structural analysts for normal production tasks.
Cumulative watershed effects: Then and now
Leslie M. Reid
2001-01-01
Abstract - Cumulative effects are the combined effects of multiple activities, and watershed effects are those which involve processes of water transport. Almost all impacts are influenced by multiple activities, so almost all impacts must be evaluated as cumulative impacts rather than as individual impacts. Existing definitions suggest that to be significant, an...
Moments from Cumulants and Vice Versa
ERIC Educational Resources Information Center
Withers, Christopher S.; Nadarajah, Saralees
2009-01-01
Moments and cumulants are expressed in terms of each other using Bell polynomials. Inbuilt routines for the latter make these expressions amenable to use by algebraic manipulation programs. One of the four formulas given is an explicit version of Kendall's use of Faa di Bruno's chain rule to express cumulants in terms of moments.
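The conversion the authors describe is compact enough to sketch directly. Below is a minimal example using sympy's partial Bell polynomials; the two identities m_n = sum_k B_{n,k}(kappa_1, ...) and kappa_n = sum_k (-1)^(k-1) (k-1)! B_{n,k}(m_1, ...) are standard, but the helper functions are illustrative, not the article's code.

```python
# Sketch of moment <-> cumulant conversion via partial Bell polynomials,
# using sympy's built-in bell(); helper names are illustrative.
import sympy as sp

def moments_from_cumulants(kappa):
    """kappa = [kappa_1, ..., kappa_n]; returns raw moments m_1..m_n."""
    out = []
    for i in range(1, len(kappa) + 1):
        m = sum(sp.bell(i, k, tuple(kappa)) for k in range(1, i + 1))
        out.append(sp.expand(m))
    return out

def cumulants_from_moments(m):
    """m = [m_1, ..., m_n]; returns cumulants kappa_1..kappa_n."""
    out = []
    for i in range(1, len(m) + 1):
        c = sum((-1) ** (k - 1) * sp.factorial(k - 1)
                * sp.bell(i, k, tuple(m)) for k in range(1, i + 1))
        out.append(sp.expand(c))
    return out

k1, k2, k3 = sp.symbols('kappa1:4')
print(moments_from_cumulants([k1, k2, k3]))
# e.g. m_2 = kappa_1**2 + kappa_2, m_3 = kappa_1**3 + 3*kappa_1*kappa_2 + kappa_3
```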
1981-02-01
monotonic increasing function of true ability or performance score. A cumulative probability function is then very convenient for describing one's... possible outcomes such as test scores, grade-point averages, or other common outcome variables. Utility is usually a monotonic increasing function of true... r(θ) is negative for θ < μ and positive for θ > μ; U(θ) is risk-prone for low θ values and risk-averse for high θ values. This property is true for
Cumulative effects of mothers' risk and promotive factors on daughters' disruptive behavior.
van der Molen, Elsa; Hipwell, Alison E; Vermeiren, Robert; Loeber, Rolf
2012-07-01
Little is known about the ways in which the accumulation of maternal factors increases or reduces risk for girls' disruptive behavior during preadolescence. In the current study, maternal risk and promotive factors and the severity of girls' disruptive behavior were assessed annually among girls ages 7-12 in an urban community sample (N = 2043). Maternal risk and promotive factors were operative at different time points in girls' development. Maternal warmth explained variance in girls' disruptive behavior, even after controlling for maternal risk factors and relevant child and neighborhood factors. In addition, findings supported the cumulative hypothesis that the number of risk factors increased the probability of girls' disruptive behavior disorder (DBD), while the number of promotive factors decreased this probability. Daughters of mothers with a history of Conduct Disorder (CD) were exposed to more risk factors and fewer promotive factors compared to daughters of mothers without prior CD. The identification of malleable maternal factors that can serve as targets for intervention has important implications for intergenerational intervention. Cumulative effects show that the focus of prevention efforts should not be on single factors, but on multiple factors associated with girls' disruptive behavior.
Cumulative Effects of Mothers’ Risk and Promotive Factors on Daughters’ Disruptive Behavior
Hipwell, Alison E.; Vermeiren, Robert; Loeber, Rolf
2012-01-01
Little is known about the ways in which the accumulation of maternal factors increases or reduces risk for girls' disruptive behavior during preadolescence. In the current study, maternal risk and promotive factors and the severity of girls' disruptive behavior were assessed annually among girls ages 7-12 in an urban community sample (N = 2043). Maternal risk and promotive factors were operative at different time points in girls' development. Maternal warmth explained variance in girls' disruptive behavior, even after controlling for maternal risk factors and relevant child and neighborhood factors. In addition, findings supported the cumulative hypothesis that the number of risk factors increased the probability of girls' disruptive behavior disorder (DBD), while the number of promotive factors decreased this probability. Daughters of mothers with a history of Conduct Disorder (CD) were exposed to more risk factors and fewer promotive factors compared to daughters of mothers without prior CD. The identification of malleable maternal factors that can serve as targets for intervention has important implications for intergenerational intervention. Cumulative effects show that the focus of prevention efforts should not be on single factors, but on multiple factors associated with girls' disruptive behavior. PMID:22127641
7 CFR 42.132 - Determining cumulative sum values.
Code of Federal Regulations, 2010 CFR
2010-01-01
Title 7 (Agriculture), Standards for Condition of Food Containers, § 42.132 Determining cumulative sum values. (a) The parameters for the on-line cumulative sum sampling plans for AQLs... (1) ... the previous subgroup. (2) Subtract the subgroup tolerance ("T"). (3) The CuSum value is reset in the...
Advancements in LiDAR-based registration of FIA field plots
Demetrios Gatziolis
2012-01-01
Meaningful integration of National Forest Inventory field plot information with spectral imagery acquired from satellite or airborne platforms requires precise plot registration. Global positioning system-based plot registration procedures, such as the one employed by the Forest Inventory and Analysis (FIA) Program, yield plot coordinates that, although adequate for...
CFD Extraction Tool for TecPlot From DPLR Solutions
NASA Technical Reports Server (NTRS)
Norman, David
2013-01-01
This invention is a macro for TecPlot, written in the TecPlot programming language, that processes data from DPLR solutions in TecPlot format. DPLR (Data-Parallel Line Relaxation) is a NASA computational fluid dynamics (CFD) code, and TecPlot is a commercial CFD post-processing tool. The TecPlot data is in SI units (same as DPLR output). The invention converts the SI units into British units. The macro modifies the TecPlot data with unit conversions, and adds some extra calculations. After unit conversions, the macro cuts a slice, and adds vectors on the current plot for the output format. The macro can also process surface solutions. Existing solutions use manual conversion and superposition. The conversion is complicated because it must be applied to a range of inter-related scalars and vectors to describe a 2D or 3D flow field. It processes the CFD solution to create superposition/comparison of scalars and vectors. The existing manual solution is cumbersome, open to errors, slow, and cannot be inserted into an automated process. This invention is quick and easy to use, and can be inserted into an automated data-processing algorithm.
Chen, Yan-Hui; Chen, Ming-Hua; Wang, Guo; Chen, Wen-Xiang; Yang, Shun-Cheng; Chai, Peng
2010-10-01
The effects of different slopes on nitrogen transport along with runoff from sloping plots amended with sewage sludge on a lateritic red soil were studied under simulated rainfall conditions. When the sludge was broadcast and mixed with surface soils (BM), the MTN (total nitrogen of mixed sample), STN (total nitrogen of settled sample), TPN (total particulate nitrogen), TSN (total suspended nitrogen), TDN (total dissolved nitrogen) and NH4(+)-N concentrations and nitrogen loss amounts in runoff of all treatments were highest at 1 day or 18 days after application. The highest concentrations and loss amounts of MTN and STN in the slope runoff for the BM treatment increased with slope degree, showing increasing pollution risks to the surface waters. The STN concentration and loss amount from the 25° plots were 126.1 mg·L⁻¹ and 1788.6 mg·m⁻², respectively, being 4.6 times and 5.8 times the corresponding values from the 10° plots. The concentrations and loss amounts of nitrogen (except NO3(-)-N) from the BM plots then diminished rapidly at first and tended to become stable, with dwindling differences between the slopes. The loss of MTN and STN in early runoff (1 day and 18 days) accounted for 68.6%-73.4% and 62.3%-66.7% of the cumulative loss amounts during the experimental period for all the broadcast treatments. Runoff loss coefficients of MTN followed the order 20° > 25° > 15° > 10°. Nitrogen was largely lost in dissolved species, while a large portion of NH4(+)-N was lost with particulates.
Avoiding cumulative trauma disorders in shops and offices.
Kroemer, K H
1992-09-01
Cumulative trauma disorders have been medically described for about 100 yr and have been related to physical activities for nearly 300 yr. Yet, avoiding these disorders in the shop and office is becoming of urgent concern only now, particularly because of the Occupational Safety and Health Administration's (OSHA's) investigation and enforcement program. Such disorders occur most often in soft tissues of the body, particularly at tendons and their sheaths. They may irritate or damage nerves and impede blood flow. They are frequent in the hand/wrist/forearm area; for example, in the carpal tunnel and in the shoulder and neck. Although controversy exists, occupational and leisure activities are generally believed to cause or aggravate cumulative trauma disorders. The major activity-related factors are rapid repetitive movements, forceful movements, static muscle loading, inappropriate body postures, vibrations, and cold. Yet, the quantitative thresholds above which cumulative trauma disorders are expected to occur are largely unknown and need to be researched. Furthermore, certain health conditions may make individuals predisposed to cumulative disorders. For most cumulative trauma disorders, physical activities and job procedures can be identified that are related to the occurrence of cumulative trauma disorders. This allows the establishment of generic and specific recommendations for the avoidance of conditions that may lead to cumulative trauma disorders in the workshop or the office.
ResidPlots-2: Computer Software for IRT Graphical Residual Analyses
ERIC Educational Resources Information Center
Liang, Tie; Han, Kyung T.; Hambleton, Ronald K.
2009-01-01
This article discusses the ResidPlots-2, a computer software that provides a powerful tool for IRT graphical residual analyses. ResidPlots-2 consists of two components: a component for computing residual statistics and another component for communicating with users and for plotting the residual graphs. The features of the ResidPlots-2 software are…
A Screening Method for Assessing Cumulative Impacts
Alexeeff, George V.; Faust, John B.; August, Laura Meehan; Milanes, Carmen; Randles, Karen; Zeise, Lauren; Denton, Joan
2012-01-01
The California Environmental Protection Agency (Cal/EPA) Environmental Justice Action Plan calls for guidelines for evaluating "cumulative impacts." As a first step toward such guidelines, a screening methodology for assessing cumulative impacts in communities was developed. The method, presented here, is based on the working definition of cumulative impacts adopted by Cal/EPA [1]: "Cumulative impacts means exposures, public health or environmental effects from the combined emissions and discharges in a geographic area, including environmental pollution from all sources, whether single or multi-media, routinely, accidentally, or otherwise released. Impacts will take into account sensitive populations and socio-economic factors, where applicable and to the extent data are available." The screening methodology is built on this definition as well as current scientific understanding of environmental pollution and its adverse impacts on health, including the influence of both intrinsic, biological factors and non-intrinsic socioeconomic factors in mediating the effects of pollutant exposures. It addresses disparities in the distribution of pollution and health outcomes. The methodology provides a science-based tool to screen places for relative cumulative impacts, incorporating both the pollution burden on a community (including exposures to pollutants and their public health and environmental effects) and community characteristics, specifically sensitivity and socioeconomic factors. The screening methodology provides relative rankings to distinguish more highly impacted communities from less impacted ones. It may also help identify which factors are the greatest contributors to a community's cumulative impact. It is not designed to provide quantitative estimates of community-level health impacts. A pilot screening analysis is presented here to illustrate the application of this methodology. Once guidelines are adopted, the methodology can serve as a screening
Reaction Order Ambiguity in Integrated Rate Plots
ERIC Educational Resources Information Center
Lee, Joe
2008-01-01
Integrated rate plots are frequently used in reaction kinetics to determine orders of reactions. It is often emphasised, when using this methodology in practice, that it is necessary to monitor the reaction to a substantial fraction of completion for these plots to yield unambiguous orders. The present article gives a theoretical and statistical…
NASA Technical Reports Server (NTRS)
Rich, Paul M.; Fournier, Robert; Hall, Forrest G. (Editor); Papagno, Andrea (Editor)
2000-01-01
The Boreal Ecosystem-Atmospheric Study (BOREAS) TE-23 (Terrestrial Ecology) team collected map plot data in support of its efforts to characterize and interpret information on canopy architecture and understory cover at the BOREAS tower flux sites and selected auxiliary sites from May to August 1994. Mapped plots (typical dimensions 50 m x 60 m) were set up and characterized at all BOREAS forested tower flux and selected auxiliary sites. Detailed measurement of the mapped plots included: (1) stand characteristics (location, density, basal area); (2) map locations and diameters at breast height (DBH) of all trees; (3) detailed geometric measures of a subset of trees (height, crown dimensions); and (4) understory cover maps. The data are stored in tabular ASCII files. The data files are available on a CD-ROM (see document number 20010000884), or from the Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC).
Effects of cumulative stress and impulsivity on smoking status.
Ansell, Emily B; Gu, Peihua; Tuit, Keri; Sinha, Rajita
2012-03-01
The stress-vulnerability model of addiction predicts that environmental factors, such as cumulative stress, will result in individual adaptations that decrease self-control, increase impulsivity, and increase risk for addiction. Impulsivity and cumulative stress are risk factors for tobacco smoking that are rarely examined simultaneously in research. We examined the indirect and direct effects of cumulative adversity in a community sample consisting of 291 men and women who participated in an assessment of cumulative stress, self-reported impulsivity, and smoking history. Data were analyzed using bootstrapping techniques to estimate indirect effects of stress on smoking via impulsivity. Cumulative adversity is associated with smoking status via direct effects and indirect effects through impulsivity scores. Additional models examining specific types of stress indicate contributions of traumatic stress and recent life events as well as chronic relationship stressors. Overall, cumulative stress is associated with increased risk of smoking via increased impulsivity and via pathways independent of impulsivity. These findings support the stress-vulnerability model and highlight the utility of mediation models in assessing how, and for whom, cumulative stress increases risk of current cigarette smoking. Increasing self-control is a target for interventions with individuals who have experienced cumulative adversity. Copyright © 2012 John Wiley & Sons, Ltd.
Effects of cumulative stress and impulsivity on smoking status
Ansell, Emily B.; Gu, Peihua; Tuit, Keri; Sinha, Rajita
2013-01-01
Objective The stress-vulnerability model of addiction predicts that environmental factors, such as cumulative stress, will result in individual adaptations that decrease self-control, increase impulsivity, and increase risk for addiction. Impulsivity and cumulative stress are risk factors for tobacco smoking that are rarely examined simultaneously in research. Methods We examined the indirect and direct effects of cumulative adversity in a community sample consisting of 291 men and women who participated in an assessment of cumulative stress, self-reported impulsivity, and smoking history. Data were analyzed using bootstrapping techniques to estimate indirect effects of stress on smoking via impulsivity. Results Cumulative adversity is associated with smoking status via direct effects and indirect effects through impulsivity scores. Additional models examining specific types of stress indicate contributions of traumatic stress and recent life events as well as chronic relationship stressors. Conclusions Overall, cumulative stress is associated with increased risk of smoking via increased impulsivity and via pathways independent of impulsivity. These findings support the stress-vulnerability model and highlight the utility of mediation models in assessing how, and for whom, cumulative stress increases risk of current cigarette smoking. Increasing self-control is a target for interventions with individuals who have experienced cumulative adversity. PMID:22389084
SplicePlot: a utility for visualizing splicing quantitative trait loci.
Wu, Eric; Nance, Tracy; Montgomery, Stephen B
2014-04-01
RNA sequencing has provided unprecedented resolution of alternative splicing and splicing quantitative trait loci (sQTL). However, there are few tools available for visualizing the genotype-dependent effects of splicing at a population level. SplicePlot is a simple command line utility that produces intuitive visualization of sQTLs and their effects. SplicePlot takes mapped RNA sequencing reads in BAM format and genotype data in VCF format as input and outputs publication-quality Sashimi plots, hive plots and structure plots, enabling better investigation and understanding of the role of genetics on alternative splicing and transcript structure. Source code and detailed documentation are available at http://montgomerylab.stanford.edu/spliceplot/index.html under Resources and at Github. SplicePlot is implemented in Python and is supported on Linux and Mac OS. A VirtualBox virtual machine running Ubuntu with SplicePlot already installed is also available.
Long-term tree inventory data from mountain forest plots in France.
Fuhr, Marc; Cordonnier, Thomas; Courbaud, Benoît; Kunstler, Georges; Mermin, Eric; Riond, Catherine; Tardif, Pascal
2017-04-01
We present repeated tree measurement data from 63 permanent plots in mountain forests in France. Plot elevations range from 800 (lower limit of the montane belt) to 1942 m above sea level (subalpine belt). Forests mainly consist of pure or mixed stands dominated by European beech (Fagus sylvatica), Silver fir (Abies alba), and Norway spruce (Picea abies), in association with various broadleaved species at low elevation and with Arolla pine (Pinus cembra) at high elevation. The plot network includes 23 plots in stands that have not been managed for at least the last 40 years and 40 plots in stands managed according to an uneven-aged system with single-tree or small-group selection cutting. Plot sizes range from 0.2 to 1.9 ha. Plots were installed from 1994 to 2004 and remeasured two to five times during the 1994-2015 period. During the first census (installation), living trees more than 7.5 cm in diameter at breast height (dbh) were identified, their dbh was measured, and their social status (stratum) was noted. Trees were spatially located, either with x, y, and z coordinates (40 plots) or within 0.25-ha square subplots (23 plots). In addition, in a subset of 58 plots, tree heights and tree crown dimensions were measured on a subset of trees, and dead standing trees and stumps were included in the census. Remeasurements after installation include live tree diameters (including recruited trees), tree status (living, damaged, dead, stump), and, for a subset of trees, height. At the time of establishment of the plots, plot densities ranged from 181 to 1328 stems/ha and plot basal areas ranged from 13.6 to 81.3 m²/ha. © 2017 by the Ecological Society of America.
Cumulate Mantle Dynamics Response to Magma Ocean Cooling Rate
NASA Astrophysics Data System (ADS)
Boukare, C.-E.; Parmentier, E. M.; Parman, S. W.
2018-05-01
We investigate the issue of cumulate compaction during magma ocean solidification. We show that the cooling rate of the magma ocean affects the amount and distribution of retained melt in the cumulate layers and the timing of cumulate overturn.
A joint probability approach for coincidental flood frequency analysis at ungauged basin confluences
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Cheng
2016-03-12
A reliable and accurate flood frequency analysis at the confluence of streams is of importance. Given that long-term peak flow observations are often unavailable at tributary confluences, at a practical level, this paper presents a joint probability approach (JPA) to address coincidental flood frequency analysis at the ungauged confluence of two streams based on flow rate data from the upstream tributaries. One case study is performed for comparison against several traditional approaches, including the plotting-position formula, univariate flood frequency analysis, and the National Flood Frequency Program developed by the US Geological Survey. It shows that the results generated by the JPA approach agree well with the floods estimated by the plotting-position and univariate flood frequency analyses based on the observation data.
Palmsten, Kristin; Rolland, Matthieu; Hebert, Mary F; Clowse, Megan E B; Schatz, Michael; Xu, Ronghui; Chambers, Christina D
2018-04-01
To characterize prednisone use in pregnant women with rheumatoid arthritis using individual-level heat maps and clustering of individual trajectories of prednisone dose, and to evaluate the association between prednisone dose trajectory groups and gestational length. This study included pregnant women with rheumatoid arthritis who enrolled in the MotherToBaby Autoimmune Diseases in Pregnancy Study (2003-2014) before gestational week 20 and reported prednisone use without another oral glucocorticoid during pregnancy (n = 254). Information on medication use and pregnancy outcomes was collected by telephone interview and by medical record review. Prednisone daily dose and cumulative dose were plotted by gestational day using a heat map for each individual. K-means clustering was used to cluster individual trajectories of prednisone dose into groups. The associations between trajectory group and demographics, disease severity measured by the Health Assessment Questionnaire at enrollment, and gestational length were evaluated. Women used prednisone 3 to 292 days during pregnancy, with daily doses ranging from <1 to 60 mg. Total cumulative dose ranged from 8 to 6225 mg. Disease severity, non-biologic disease-modifying anti-rheumatic drug use, and gestational length varied significantly by trajectory group. After adjusting for disease severity, non-biologic disease-modifying anti-rheumatic drug use, and other covariates, the highest versus lowest daily dose trajectory group was associated with reduced gestational age at delivery (β: -2.3 weeks; 95% CI: -3.4, -1.3), as was the highest versus lowest cumulative dose trajectory group (β: -2.6 weeks; 95% CI: -3.6, -1.5). In pregnant women with rheumatoid arthritis, patterns of higher prednisone dose were associated with shorter gestational length compared with lower dose. Copyright © 2018 John Wiley & Sons, Ltd.
Hamada, Tsuyoshi; Nakai, Yousuke; Isayama, Hiroyuki; Togawa, Osamu; Kogure, Hirofumi; Kawakubo, Kazumichi; Tsujino, Takeshi; Sasahira, Naoki; Hirano, Kenji; Yamamoto, Natsuyo; Ito, Yukiko; Sasaki, Takashi; Mizuno, Suguru; Toda, Nobuo; Tada, Minoru; Koike, Kazuhiko
2014-03-01
Self-expandable metallic stent (SEMS) placement is widely carried out for distal malignant biliary obstruction, and survival analysis is used to evaluate the cumulative incidences of SEMS dysfunction (e.g. the Kaplan-Meier [KM] method and the log-rank test). However, these statistical methods might be inappropriate in the presence of 'competing risks' (here, death without SEMS dysfunction), which affects the probability of experiencing the event of interest (SEMS dysfunction); that is, SEMS dysfunction can no longer be observed after death. A competing risk analysis has rarely been done in studies on SEMS. We introduced the concept of a competing risk analysis and illustrated its impact on the evaluation of SEMS outcomes using hypothetical and actual data. Our illustrative study included 476 consecutive patients who underwent SEMS placement for unresectable distal malignant biliary obstruction. A significant difference between cumulative incidences of SEMS dysfunction in male and female patients via the KM method (P = 0.044 by the log-rank test) disappeared after applying a competing risk analysis (P = 0.115 by Gray's test). In contrast, although cumulative incidences of SEMS dysfunction via the KM method were similar with and without chemotherapy (P = 0.647 by the log-rank test), cumulative incidence of SEMS dysfunction in the non-chemotherapy group was shown to be significantly lower (P = 0.031 by Gray's test) in a competing risk analysis. Death as a competing risk event needs to be appropriately considered in estimating a cumulative incidence of SEMS dysfunction, otherwise analytical results may be biased. © 2013 The Authors. Digestive Endoscopy © 2013 Japan Gastroenterological Endoscopy Society.
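The distinction the authors draw can be illustrated numerically. Below is a minimal sketch (synthetic data, generic function names, not the study's code) of the nonparametric cumulative incidence estimator that treats death as a competing event; this is the quantity Gray's test compares, whereas 1 minus the Kaplan-Meier estimate would implicitly treat deaths as censorings and overstate the incidence of dysfunction.

```python
# Minimal sketch of a nonparametric cumulative incidence function (CIF)
# under competing risks. Data and event codes are illustrative:
# event 0 = censored, 1 = event of interest (e.g. SEMS dysfunction),
# 2 = competing event (e.g. death without dysfunction).
import numpy as np

def cumulative_incidence(time, event, cause=1):
    """Returns event times and CIF values for the given cause."""
    n = len(time)
    at_risk = n
    surv = 1.0          # all-cause survival just before t
    cif = 0.0
    times, values = [], []
    for t in np.unique(time):
        mask = time == t
        d_cause = np.sum(event[mask] == cause)   # events of interest at t
        d_all = np.sum(event[mask] > 0)          # all events at t
        if at_risk > 0 and d_cause > 0:
            cif += surv * d_cause / at_risk      # hazard weighted by S(t-)
        if at_risk > 0:
            surv *= 1.0 - d_all / at_risk
        at_risk -= np.sum(mask)                  # events + censorings leave
        times.append(t)
        values.append(cif)
    return np.array(times), np.array(values)

rng = np.random.default_rng(0)
t = rng.exponential(12, 200)                       # months, synthetic
e = rng.choice([0, 1, 2], 200, p=[0.2, 0.5, 0.3])  # censor/dysfunction/death
print(cumulative_incidence(t, e)[1][-1])           # final CIF of dysfunction
```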
PET kinetic analysis --pitfalls and a solution for the Logan plot.
Kimura, Yuichi; Naganawa, Mika; Shidahara, Miho; Ikoma, Yoko; Watabe, Hiroshi
2007-01-01
The Logan plot is a widely used algorithm for the quantitative analysis of neuroreceptors using PET because it is easy to use and simple to implement. The Logan plot is also suitable for receptor imaging because its algorithm is fast. However, use of the Logan plot and interpretation of the resulting receptor images should be regarded with caution, because noise in PET data causes bias in the Logan plot estimates. In this paper, we describe the basic concept of the Logan plot in detail and introduce three algorithms for the Logan plot. By comparing these algorithms, we demonstrate the pitfalls of the Logan plot and discuss a solution.
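For orientation, here is a sketch of what a reference-tissue Logan analysis computes: after a cutoff time t*, plotting integrated tissue activity over instantaneous activity against the analogous reference quantity becomes linear, and the slope estimates the distribution volume ratio (DVR). The curves and cutoff below are synthetic assumptions, and the ordinary-least-squares fit is exactly where the noise-induced bias mentioned in the abstract enters.

```python
# Sketch of a reference-tissue Logan graphical analysis on synthetic
# time-activity curves; t_star and curve shapes are illustrative.
import numpy as np

def logan_dvr(t, c_tissue, c_ref, t_star=20.0):
    """t in minutes; returns the slope (DVR estimate) of the late segment."""
    # trapezoidal running integrals of both curves
    int_t = np.concatenate(([0.0], np.cumsum(
        np.diff(t) * 0.5 * (c_tissue[1:] + c_tissue[:-1]))))
    int_r = np.concatenate(([0.0], np.cumsum(
        np.diff(t) * 0.5 * (c_ref[1:] + c_ref[:-1]))))
    keep = t >= t_star                       # only the linear late segment
    x = int_r[keep] / c_tissue[keep]
    y = int_t[keep] / c_tissue[keep]
    slope, intercept = np.polyfit(x, y, 1)   # OLS: noise here biases the slope
    return slope

t = np.linspace(0.5, 60, 120)
c_ref = t * np.exp(-t / 30)                  # synthetic reference curve
c_tis = 1.4 * c_ref + 0.2 * np.exp(-t / 50)  # tissue with higher binding
print(round(logan_dvr(t, c_tis, c_ref), 2))
```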
Digital data collection in forest dynamics plots
Faith Inman-Narahari; Christian Giardina; Rebecca Ostertag; Susan Cordell; Lawren Sack
2010-01-01
Summary 1. Computers are widely used in all aspects of research but their application to in-field data collection for forest plots has rarely been evaluated. 2. We developed digital data collection methods using ESRI mapping software and ruggedized field computers to map and measure ~30 000 trees in two 4-ha forest dynamics plots in wet and dry...
NASA Astrophysics Data System (ADS)
Boslough, M.
2011-12-01
global temperature anomaly data published by NASA GISS. Typical climate contracts predict the probability of a specified future temperature, but not the probability density or best estimate. One way to generate a probability distribution would be to create a family of contracts over a range of specified temperatures and interpret the price of each contract as its exceedance probability. The resulting plot of probability vs. anomaly is the market-based cumulative distribution function. The best estimate can be determined by interpolation, and the market-based uncertainty estimate can be based on the spread. One requirement for an effective prediction market is liquidity. Climate contracts are currently considered somewhat of a novelty and often lack sufficient liquidity, but climate change has the potential to generate both tremendous losses for some (e.g., agricultural collapse and extreme weather events) and wealth for others (access to natural resources and trading routes). Use of climate markets by large stakeholders has the potential to generate the liquidity necessary to make them viable. Sandia is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. DoE's NNSA under contract DE-AC04-94AL85000.
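The contract-family construction described above is simple to make concrete. In the sketch below, each binary contract pays off if the anomaly exceeds its threshold, so its price is read as an exceedance probability; the prices are hypothetical and the interpolation is plain linear interpolation.

```python
# Hedged sketch of the market-based distribution described above:
# each binary contract pays off if the anomaly exceeds a threshold,
# so its price estimates an exceedance probability. Prices are hypothetical.
import numpy as np

thresholds = np.array([0.2, 0.4, 0.6, 0.8, 1.0])   # deg C anomaly
prices = np.array([0.95, 0.80, 0.45, 0.15, 0.03])  # contract prices in [0, 1]

cdf = 1.0 - prices                                  # P(anomaly <= threshold)

def market_quantile(q):
    # invert the empirical CDF by linear interpolation (cdf is increasing)
    return np.interp(q, cdf, thresholds)

best_estimate = market_quantile(0.5)                     # market-implied median
spread = market_quantile(0.84) - market_quantile(0.16)   # ~ +/- 1 sigma band
print(best_estimate, spread)
```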
Longleaf pine regeneration following Hurricane Ivan utilizing the RLGS plots
John C. Gilbert; John S. Kush
2013-01-01
On September 16, 2004, Hurricane Ivan hit the Alabama coast and severely impacted numerous plots in the U.S. Forest Service's Regional Longleaf Growth Study (RLGS). The Escambia Experimental Forest (EEF) has 201 of the 325 RLGS plots. Nearly one-third of the EEF was impacted. Nine plots with pole-sized trees were entirely lost. Another 54 plots had some type of damage...
SEGY to ASCII: Conversion and Plotting Program
Goldman, Mark R.
1999-01-01
This report documents a computer program to convert standard 4-byte, IBM floating point SEGY files to ASCII xyz format. The program then optionally plots the seismic data using the GMT plotting package. The material for this publication is contained in a standard tar file (of99-126.tar) that is uncompressed and 726 K in size. It can be downloaded to any Unix machine. Move the tar file to the directory you wish to use it in, then type 'tar xvf of99-126.tar'. The archive files (and diskette) contain a NOTE file, a README file, a version-history file, source code, a makefile for easy compilation, and an ASCII version of the documentation. The archive files (and diskette) also contain example test files, including a typical SEGY file along with the resulting ASCII xyz and postscript files. Compiling the source code into an executable requires a C++ compiler. The program has been successfully compiled using Gnu's g++ version 2.8.1; use of other compilers may require modifications to the existing source code. The g++ compiler is a free, high-quality C++ compiler and may be downloaded from the ftp site: ftp://ftp.gnu.org/gnu Plotting the seismic data requires the GMT plotting package, which may be downloaded from the web site: http://www.soest.hawaii.edu/gmt/
FLOWCHART; a computer program for plotting flowcharts
Bender, Bernice
1982-01-01
The computer program FLOWCHART can be used to very quickly and easily produce flowcharts of high quality for publication. FLOWCHART centers each element or block of text that it processes on one of a set of (imaginary) vertical lines. It can enclose a text block in a rectangle, circle or other selected figure. It can draw a line connecting the midpoint of any side of any figure with the midpoint of any side of any other figure and insert an arrow pointing in the direction of flow. It can write 'yes' or 'no' next to the line joining two figures. FLOWCHART creates flowcharts using some basic plotting subroutines which permit plots to be generated interactively and inspected on a Tektronix-compatible graphics screen or plotted in a deferred mode on a Houston Instruments 42-inch pen plotter. The size of the plot, character set and character height in inches are inputs to the program. Plots generated using the pen plotter can be up to 42 inches high, the larger-size plots being directly usable as visual aids in a talk. FLOWCHART centers each block of text on an imaginary column line. (The number of columns and column width are specified as input.) The midpoint of the longest line of text within the block is defined to be the center of the block and is placed on the column line. The spacing of individual words within the block is not altered when the block is positioned. The program writes the first block of text in a designated column and continues placing each subsequent block below the previous block in the same column. A block of text may be placed in a different column by specifying the number of the column and an earlier block of text with which the new block is to be aligned. If block zero is given as the earlier block, the new text is placed in the new column continuing down the page below the previous block. Optionally a column and number of inches from the top of the page may be given for positioning the next block of text. The program will normally draw one of five
True versus perturbed forest inventory plot locations for modeling: a simulation study
John W. Coulston; Kurt H. Riitters; Ronald E. McRoberts; William D. Smith
2006-01-01
USDA Forest Service Forest Inventory and Analysis plot information is widely used for timber inventories, forest health assessments, and environmental risk analyses. With few exceptions, true plot locations are not revealed; the plot coordinates are manipulated to obscure the location of field plots and thereby preserve plot integrity. The influence of perturbed plot...
NASA Technical Reports Server (NTRS)
Chadwick, C.
1984-01-01
This paper describes the development and use of an algorithm to compute approximate statistics of the magnitude of a single random trajectory correction maneuver (TCM) Delta v vector. The TCM Delta v vector is modeled as a three component Cartesian vector each of whose components is a random variable having a normal (Gaussian) distribution with zero mean and possibly unequal standard deviations. The algorithm uses these standard deviations as input to produce approximations to (1) the mean and standard deviation of the magnitude of Delta v, (2) points of the probability density function of the magnitude of Delta v, and (3) points of the cumulative and inverse cumulative distribution functions of Delta v. The approximates are based on Monte Carlo techniques developed in a previous paper by the author and extended here. The algorithm described is expected to be useful in both pre-flight planning and in-flight analysis of maneuver propellant requirements for space missions.
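A minimal Monte Carlo sketch of the quantities the algorithm approximates, with illustrative standard deviations (not values from the paper):

```python
# Monte Carlo sketch of TCM delta-v magnitude statistics: three independent
# zero-mean Gaussian components with unequal standard deviations.
# Sigma values below are illustrative only.
import numpy as np

rng = np.random.default_rng(42)
sigmas = np.array([1.0, 0.5, 0.25])                 # m/s, per-axis std devs
samples = rng.normal(0.0, sigmas, size=(100_000, 3))
dv = np.linalg.norm(samples, axis=1)                # |delta-v| for each draw

print("mean |dv|:", dv.mean())
print("std  |dv|:", dv.std())
# points of the cumulative distribution function and its inverse
for p in (0.5, 0.9, 0.99):
    print(f"P(|dv| <= {np.quantile(dv, p):.3f}) = {p}")
```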
Workplace statistical literacy for teachers: interpreting box plots
NASA Astrophysics Data System (ADS)
Pierce, Robyn; Chick, Helen
2013-06-01
As a consequence of the increased use of data in workplace environments, there is a need to understand the demands that are placed on users to make sense of such data. In education, teachers are increasingly expected to interpret and apply complex data about student and school performance, and yet it is not clear that they always have the appropriate knowledge and experience to interpret the graphs, tables and other data that they receive. This study examined the statistical literacy demands placed on teachers, with a particular focus on box plot representations. Although box plots summarise the data in a way that makes visual comparisons possible across sets of data, this study showed that teachers do not always have the necessary fluency with the representation to describe correctly how the data are distributed in the representation. In particular, a significant number perceived the size of the regions of the box plot to be depicting frequencies rather than density, and there were misconceptions associated with outlying data that were not displayed on the plot. As well, teachers' perceptions of box plots were found to relate to three themes: attitudes, perceived value and misconceptions.
Cumulative versus Rapid Introduction of New Information.
ERIC Educational Resources Information Center
Gleason, Mary; And Others
1991-01-01
Forty-seven elementary and middle school students, most with learning disabilities, used a computer-assisted instruction program which rapidly presented seven pieces of information or one which cumulatively presented smaller information "chunks." Both groups worked to mastery level successfully, but the cumulative group spent one-third…
Dalitz plot distributions in presence of triangle singularities
Szczepaniak, Adam P.
2016-03-25
We discuss properties of three-particle Dalitz distributions in coupled channel systems in the presence of triangle singularities. The single channel case was discussed long ago, where it was found that, as a consequence of unitarity, effects of a triangle singularity seen in the Dalitz plot are not seen in Dalitz plot projections. In the coupled channel case we find the same is true for the sum of intensities of all interacting channels. Unlike in the single channel case, however, triangle singularities do remain visible in the Dalitz plot projections of individual channels.
Dalitz plot distributions in presence of triangle singularities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Szczepaniak, Adam P.
We discuss properties of three-particle Dalitz distributions in coupled channel systems in the presence of triangle singularities. The single channel case was discussed long ago, where it was found that, as a consequence of unitarity, effects of a triangle singularity seen in the Dalitz plot are not seen in Dalitz plot projections. In the coupled channel case we find the same is true for the sum of intensities of all interacting channels. Unlike in the single channel case, however, triangle singularities do remain visible in the Dalitz plot projections of individual channels.
This tool plots daily AQI values for a specific location and time period. Each square or “tile” represents one day of the year and is color-coded based on the AQI level for that day. The legend tallies the number of days in each AQI category.
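A minimal matplotlib sketch of such a tile plot, using randomly generated daily AQI values and the standard EPA category breakpoints; the layout details (weeks as columns, Monday start) are assumptions for illustration.

```python
# Sketch of a calendar-style "tile" plot: one colored square per day,
# binned into the six standard AQI categories. AQI values are random.
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.colors import BoundaryNorm, ListedColormap

rng = np.random.default_rng(1)
aqi = rng.gamma(shape=2.0, scale=35.0, size=364)    # fake year of daily AQI

bounds = [0, 50, 100, 150, 200, 300, 500]           # EPA category breakpoints
colors = ["green", "yellow", "orange", "red", "purple", "maroon"]
cmap = ListedColormap(colors)
norm = BoundaryNorm(bounds, len(colors))

grid = aqi.reshape(52, 7).T                         # weeks as columns
fig, ax = plt.subplots(figsize=(10, 2))
ax.imshow(grid, cmap=cmap, norm=norm, aspect="auto")
ax.set_yticks(range(7))
ax.set_yticklabels(["Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun"])
ax.set_xlabel("Week of year")

# the "legend tally": number of days in each AQI category
counts, _ = np.histogram(aqi, bins=bounds)
print(dict(zip(["Good", "Moderate", "USG", "Unhealthy",
                "Very Unhealthy", "Hazardous"], counts)))
plt.show()
```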
Pötschger, Ulrike; Heinzl, Harald; Valsecchi, Maria Grazia; Mittlböck, Martina
2018-01-19
Investigating the impact of a time-dependent intervention on the probability of long-term survival is statistically challenging. A typical example is stem-cell transplantation performed after successful donor identification from registered donors. Here, a suggested simple analysis based on the exogenous donor availability status according to registered donors would allow the estimation and comparison of survival probabilities. As donor search is usually ceased after a patient's event, donor availability status is incompletely observed, so that this simple comparison is not possible and the waiting time to donor identification needs to be addressed in the analysis to avoid bias. It is methodologically unclear how to directly address cumulative long-term treatment effects without relying on proportional hazards while avoiding waiting time bias. The pseudo-value regression technique is able to handle the first two issues; a novel generalisation of this technique also avoids waiting time bias. Inverse-probability-of-censoring weighting is used to account for the partly unobserved exogenous covariate donor availability. Simulation studies demonstrate unbiasedness and satisfying coverage probabilities of the new method. A real data example demonstrates that study results based on generalised pseudo-values have a clear medical interpretation which supports the clinical decision-making process. The proposed generalisation of the pseudo-value regression technique makes it possible to compare survival probabilities between two independent groups where group membership becomes known over time and remains partly unknown. Hence, cumulative long-term treatment effects are directly addressed without relying on proportional hazards while avoiding waiting time bias.
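For readers unfamiliar with pseudo-values, the plain construction that the paper generalises can be sketched briefly: the pseudo-observation for subject i at a landmark time t0 is n*S(t0) - (n-1)*S_{-i}(t0), where S is the Kaplan-Meier estimate and S_{-i} its leave-one-out version. The sketch below uses synthetic data and omits the paper's inverse-probability-of-censoring weighting.

```python
# Sketch of plain pseudo-values for survival at a landmark time t0.
# (The paper's contribution adds IPCW on top of this construction to
# handle the incompletely observed donor availability; not shown here.)
import numpy as np

def km_survival(time, event, t0):
    """Kaplan-Meier S(t0); event: 1 = event, 0 = censored."""
    s = 1.0
    for t in np.sort(np.unique(time[(event == 1) & (time <= t0)])):
        d = np.sum((time == t) & (event == 1))
        n_risk = np.sum(time >= t)
        s *= 1.0 - d / n_risk
    return float(s)

def pseudo_values(time, event, t0):
    n = len(time)
    s_full = km_survival(time, event, t0)
    idx = np.arange(n)
    return np.array([
        n * s_full - (n - 1) * km_survival(time[idx != i], event[idx != i], t0)
        for i in range(n)])

rng = np.random.default_rng(3)
t = rng.exponential(5.0, 80)                  # synthetic survival times
e = (rng.random(80) < 0.7).astype(int)        # ~30% censoring, illustrative
pv = pseudo_values(t, e, t0=4.0)              # regress pv on covariates (GEE)
print(round(float(pv.mean()), 3), round(km_survival(t, e, 4.0), 3))
```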
On Connected Diagrams and Cumulants of Erdős-Rényi Matrix Models
NASA Astrophysics Data System (ADS)
Khorunzhiy, O.
2008-08-01
Regarding the adjacency matrices of n-vertex graphs and related graph Laplacian we introduce two families of discrete matrix models constructed both with the help of the Erdős-Rényi ensemble of random graphs. Corresponding matrix sums represent the characteristic functions of the average number of walks and closed walks over the random graph. These sums can be considered as discrete analogues of the matrix integrals of random matrix theory. We study the diagram structure of the cumulant expansions of logarithms of these matrix sums and analyze the limiting expressions as n → ∞ in the cases of constant and vanishing edge probabilities.
Music-evoked incidental happiness modulates probability weighting during risky lottery choices
Schulreich, Stefan; Heussen, Yana G.; Gerhardt, Holger; Mohr, Peter N. C.; Binkofski, Ferdinand C.; Koelsch, Stefan; Heekeren, Hauke R.
2014-01-01
We often make decisions with uncertain consequences. The outcomes of the choices we make are usually not perfectly predictable but probabilistic, and the probabilities can be known or unknown. Probability judgments, i.e., the assessment of unknown probabilities, can be influenced by evoked emotional states. This suggests that also the weighting of known probabilities in decision making under risk might be influenced by incidental emotions, i.e., emotions unrelated to the judgments and decisions at issue. Probability weighting describes the transformation of probabilities into subjective decision weights for outcomes and is one of the central components of cumulative prospect theory (CPT) that determine risk attitudes. We hypothesized that music-evoked emotions would modulate risk attitudes in the gain domain and in particular probability weighting. Our experiment featured a within-subject design consisting of four conditions in separate sessions. In each condition, the 41 participants listened to a different kind of music—happy, sad, or no music, or sequences of random tones—and performed a repeated pairwise lottery choice task. We found that participants chose the riskier lotteries significantly more often in the “happy” than in the “sad” and “random tones” conditions. Via structural regressions based on CPT, we found that the observed changes in participants' choices can be attributed to changes in the elevation parameter of the probability weighting function: in the “happy” condition, participants showed significantly higher decision weights associated with the larger payoffs than in the “sad” and “random tones” conditions. Moreover, elevation correlated positively with self-reported music-evoked happiness. Thus, our experimental results provide evidence in favor of a causal effect of incidental happiness on risk attitudes that can be explained by changes in probability weighting. PMID:24432007
Music-evoked incidental happiness modulates probability weighting during risky lottery choices.
Schulreich, Stefan; Heussen, Yana G; Gerhardt, Holger; Mohr, Peter N C; Binkofski, Ferdinand C; Koelsch, Stefan; Heekeren, Hauke R
2014-01-07
We often make decisions with uncertain consequences. The outcomes of the choices we make are usually not perfectly predictable but probabilistic, and the probabilities can be known or unknown. Probability judgments, i.e., the assessment of unknown probabilities, can be influenced by evoked emotional states. This suggests that also the weighting of known probabilities in decision making under risk might be influenced by incidental emotions, i.e., emotions unrelated to the judgments and decisions at issue. Probability weighting describes the transformation of probabilities into subjective decision weights for outcomes and is one of the central components of cumulative prospect theory (CPT) that determine risk attitudes. We hypothesized that music-evoked emotions would modulate risk attitudes in the gain domain and in particular probability weighting. Our experiment featured a within-subject design consisting of four conditions in separate sessions. In each condition, the 41 participants listened to a different kind of music-happy, sad, or no music, or sequences of random tones-and performed a repeated pairwise lottery choice task. We found that participants chose the riskier lotteries significantly more often in the "happy" than in the "sad" and "random tones" conditions. Via structural regressions based on CPT, we found that the observed changes in participants' choices can be attributed to changes in the elevation parameter of the probability weighting function: in the "happy" condition, participants showed significantly higher decision weights associated with the larger payoffs than in the "sad" and "random tones" conditions. Moreover, elevation correlated positively with self-reported music-evoked happiness. Thus, our experimental results provide evidence in favor of a causal effect of incidental happiness on risk attitudes that can be explained by changes in probability weighting.
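The elevation parameter referred to above appears in two-parameter probability weighting functions. The linear-in-log-odds (Goldstein-Einhorn) form below, w(p) = delta*p^gamma / (delta*p^gamma + (1-p)^gamma), is a common choice assumed here for illustration; it is not necessarily the exact specification the study fitted.

```python
# Illustrative two-parameter probability weighting function of the kind
# used in cumulative prospect theory. The linear-in-log-odds form
# (elevation delta, curvature gamma) is an assumption for illustration.
import numpy as np

def weight(p, delta=1.0, gamma=0.6):
    """Higher delta raises the whole curve, i.e. higher decision weights."""
    num = delta * p**gamma
    return num / (num + (1.0 - p)**gamma)

p = np.linspace(0.01, 0.99, 5)
print("p     :", p.round(2))
print("sad   :", weight(p, delta=0.8).round(3))   # lower elevation
print("happy :", weight(p, delta=1.2).round(3))   # higher elevation
```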
Cumulative incidence of cancer after solid organ transplantation.
Hall, Erin C; Pfeiffer, Ruth M; Segev, Dorry L; Engels, Eric A
2013-06-15
Solid organ transplantation recipients have elevated cancer incidence. Estimates of absolute cancer risk after transplantation can inform prevention and screening. The Transplant Cancer Match Study links the US transplantation registry with 14 state/regional cancer registries. The authors used nonparametric competing risk methods to estimate the cumulative incidence of cancer after transplantation for 2 periods (1987-1999 and 2000-2008). For recipients from 2000 to 2008, the 5-year cumulative incidence, stratified by organ, sex, and age at transplantation, was estimated for 6 preventable or screen-detectable cancers. For comparison, the 5-year cumulative incidence was calculated for the same cancers in the general population at representative ages using Surveillance, Epidemiology, and End Results data. Among 164,156 recipients, 8520 incident cancers were identified. The absolute cancer risk was slightly higher for recipients during the period from 2000 to 2008 than during the period from 1987 to 1999 (5-year cumulative incidence: 4.4% vs. 4.2%; P = .006); this difference arose from the decreasing risk of competing events (5-year cumulative incidence of death, graft failure, or retransplantation: 26.6% vs. 31.9%; P < .001). From 2000 to 2008, the 5-year cumulative incidence of non-Hodgkin lymphoma was highest at extremes of age, especially in thoracic organ recipients (ages 0-34 years: range, 1.74%-3.28%; aged >50 years; range, 0.36%-2.22%). For recipients aged >50 years, the 5-year cumulative incidence was higher for colorectal cancer (range, 0.33%-1.94%) than for the general population at the recommended screening age (aged 50 years: range, 0.25%-0.33%). For recipients aged >50 years, the 5-year cumulative incidence was high for lung cancer among thoracic organ recipients (range, 1.16%-3.87%) and for kidney cancer among kidney recipients (range, 0.53%-0.84%). The 5-year cumulative incidence for prostate cancer and breast cancer was similar or lower in
Van Cauwenberg, Jelle; Clarys, Peter; De Bourdeaudhuij, Ilse; Van Holle, Veerle; Verté, Dominique; De Witte, Nico; De Donder, Liesbeth; Buffel, Tine; Dury, Sarah; Deforche, Benedicte
2013-08-14
The physical environment may play a crucial role in promoting older adults' walking for transportation. However, previous studies on relationships between the physical environment and older adults' physical activity behaviors have reported inconsistent findings. A possible explanation for these inconsistencies is the focus upon studying environmental factors separately rather than simultaneously. The current study aimed to investigate the cumulative influence of perceived favorable environmental factors on older adults' walking for transportation. Additionally, the moderating effect of perceived distance to destinations on this relationship was studied. The sample comprised 50,685 non-institutionalized older adults residing in Flanders (Belgium). Cross-sectional data on demographics, environmental perceptions and frequency of walking for transportation were collected by self-administered questionnaires in the period 2004-2010. Perceived distance to destinations was categorized into short, medium, and large distance to destinations. An environmental index (a sum of favorable environmental factors, ranging from 0 to 7) was constructed to investigate the cumulative influence of favorable environmental factors. Multilevel logistic regression analyses were applied to predict probabilities of daily walking for transportation. For short distance to destinations, the probability of daily walking for transportation was significantly higher when seven rather than three, four or five favorable environmental factors were present. For medium distance to destinations, probabilities significantly increased with an increase from zero to four favorable environmental factors. For large distance to destinations, no relationship between the environmental index and walking for transportation was observed. Our findings suggest that the presence of multiple favorable environmental factors can motivate older adults to walk medium distances to facilities. Future research should focus
2013-01-01
Background The physical environment may play a crucial role in promoting older adults' walking for transportation. However, previous studies on relationships between the physical environment and older adults' physical activity behaviors have reported inconsistent findings. A possible explanation for these inconsistencies is the focus upon studying environmental factors separately rather than simultaneously. The current study aimed to investigate the cumulative influence of perceived favorable environmental factors on older adults' walking for transportation. Additionally, the moderating effect of perceived distance to destinations on this relationship was studied. Methods The sample comprised 50,685 non-institutionalized older adults residing in Flanders (Belgium). Cross-sectional data on demographics, environmental perceptions and frequency of walking for transportation were collected by self-administered questionnaires in the period 2004-2010. Perceived distance to destinations was categorized into short, medium, and large distance to destinations. An environmental index (a sum of favorable environmental factors, ranging from 0 to 7) was constructed to investigate the cumulative influence of favorable environmental factors. Multilevel logistic regression analyses were applied to predict probabilities of daily walking for transportation. Results For short distance to destinations, the probability of daily walking for transportation was significantly higher when seven rather than three, four or five favorable environmental factors were present. For medium distance to destinations, probabilities significantly increased with an increase from zero to four favorable environmental factors. For large distance to destinations, no relationship between the environmental index and walking for transportation was observed. Conclusions Our findings suggest that the presence of multiple favorable environmental factors can motivate older adults to walk medium distances
The quartile benefit plot: a middle ear surgery benefit assessment scheme.
Schmerber, Sébastien; Karkas, Alexandre; Righini, Christian A; Chahine, Karim A
2008-05-01
The purpose of this study is to present a new method for the assessment of hearing improvement following stapes surgery, taking into account additional, previously omitted evaluation criteria. Retrospective. A quartile plot, based on the currently used Glasgow benefit plot, is structured to include two additional criteria of hearing assessment, namely the absence of postoperative sensorineural hearing loss and the closure of the air-bone gap to <10 dB. Pre- and postoperative hearing results of 132 patients diagnosed with bilateral otosclerosis and treated with bilateral stapes surgery were plotted on both the classical Glasgow benefit plot and the new quartile benefit plot. The difference in success assessment due to stricter assessment criteria is demonstrated. The functional success rate following bilateral stapes surgery as plotted on the traditional Glasgow benefit plot was 51.5%. The success rate for bilateral stapes surgery assessed on the new quartile plot with the addition of the two new criteria was 38.64%. The difference in success rates was found to be statistically significant. Basing benefit assessment in stapes surgery solely on the mean air-conduction deficit results in an overestimation of the success rate. This study demonstrates that results that appear satisfactory when judged by the Glasgow benefit plot are of modest success when assessed by the new quartile plot. The quartile benefit plot presented in this paper provides a strict means of presenting and evaluating stapes surgery results.
Cumulative versus rapid introduction of new information.
Gleason, M; Carnine, D; Vala, N
1991-02-01
This study investigated the way new information is presented to students. Subjects were 60 elementary and middle school students, most with learning disabilities. Students used two versions of a specially designed computer-assisted instruction (CAI) program. One version rapidly presented students with seven pieces of information (rapid-introduction group); the other cumulatively presented smaller "chunks" of information (cumulative-introduction group). Both groups worked to mastery level successfully but students in the cumulative group spent one-third the time, required fewer responses, showed less frustration, and made fewer errors in the process. Results suggest that students with learning disabilities need much more practice than most commercial CAI programs supply.
The Plotting Library http://astroplotlib.stsci.edu
NASA Astrophysics Data System (ADS)
Úbeda, L.
2014-05-01
astroplotlib is a multi-language astronomical library of plots: a collection of software templates that are useful for creating paper-quality figures. All current templates are coded in IDL; some are also available in Python and Mathematica. This free resource, supported at the Space Telescope Science Institute, allows users to download any plot and customize it to their own needs. It is also intended as an educational tool.
Cumulative Student Loan Debt in Minnesota, 2015
ERIC Educational Resources Information Center
Williams-Wyche, Shaun
2016-01-01
To better understand student debt in Minnesota, the Minnesota Office of Higher Education (the Office) gathers information on cumulative student loan debt from Minnesota degree-granting institutions. These data detail the number of students with loans by institution, the cumulative student loan debt incurred at that institution, and the percentage…
Realtime multi-plot graphics system
NASA Technical Reports Server (NTRS)
Shipkowski, Michael S.
1990-01-01
The increased complexity of test operations and customer requirements at Langley Research Center's National Transonic Facility (NTF) surpassed the capabilities of the initial realtime graphics system. The analysis of existing hardware and software and the enhancements made to develop a new realtime graphics system are described. The result of this effort is a cost-effective system, based on hardware already in place, that supports high-speed, high-resolution generation and display of multiple realtime plots. The enhanced graphics system (EGS) meets the current and foreseeable future realtime graphics requirements of the NTF. While this system was developed to support wind tunnel operations, the overall design and capability of the system is applicable to other realtime data acquisition systems that have realtime plot requirements.
PLOT3D Export Tool for Tecplot
NASA Technical Reports Server (NTRS)
Alter, Stephen
2010-01-01
The PLOT3D export tool for Tecplot addresses the problem that data modified within Tecplot could not previously be written out for use by other computational science solvers. The PLOT3D Exporter add-on gives engineers, using the most commonly available visualization tools, a way to output data in a standard format. The exportation of PLOT3D data from Tecplot has far-reaching effects because it allows grid and solution manipulation within a graphical user interface (GUI) that is easily customized with macro-language-based and user-developed GUIs. The add-on also enables the use of Tecplot as an interpolation tool for solution conversion between grids of different types. This one add-on enhances the functionality of Tecplot so significantly that Tecplot can be incorporated into a general suite of tools for computational science applications as a 3D graphics engine for visualization of all data. Within the PLOT3D Export add-on are several functions that enhance its operation and effectiveness. Unlike Tecplot's built-in output functions, the PLOT3D Export add-on supports the zone selection dialog in Tecplot for choosing which zones are to be written, offering three distinct options: output of active, inactive, or all zones (grid blocks). As the user modifies the zones to output with the zone selection dialog, the zones to be written are updated accordingly. This enables the use of Tecplot to create multiple configurations of a geometry being analyzed. For example, if an aircraft is loaded with multiple deflections of flaps, then by activating and deactivating different zones for a specific flap setting, new configurations of that aircraft can be generated by writing out only specific zones. Thus, if ten flap settings are loaded into Tecplot, the PLOT3D Export software can output ten different configurations, one for each flap setting.
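As background on the file format the add-on targets, the sketch below writes a multiblock grid in the simple ASCII PLOT3D "whole" layout (block count, block dimensions, then all x, y, and z values per block with i varying fastest). It is an illustration of the format only, not the add-on's code, and the unformatted binary variants of the format are left out of scope.

```python
import numpy as np

def write_plot3d_grid(filename, blocks):
    """Write (X, Y, Z) coordinate arrays as an ASCII multiblock
    PLOT3D grid file in the 3D "whole" layout."""
    with open(filename, "w") as f:
        f.write(f"{len(blocks)}\n")
        for X, _, _ in blocks:
            ni, nj, nk = X.shape
            f.write(f"{ni} {nj} {nk}\n")
        for block in blocks:
            for coord in block:                      # X, then Y, then Z
                for v in coord.flatten(order="F"):   # i varies fastest
                    f.write(f"{v:.8e}\n")

# Hypothetical single-block cube grid.
x, y, z = np.meshgrid(np.linspace(0, 1, 5),
                      np.linspace(0, 1, 5),
                      np.linspace(0, 1, 5), indexing="ij")
write_plot3d_grid("cube.xyz", [(x, y, z)])
```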
Ma, Chihua; Luciani, Timothy; Terebus, Anna; Liang, Jie; Marai, G Elisabeta
2017-02-15
Visualizing the complex probability landscape of stochastic gene regulatory networks can further biologists' understanding of phenotypic behavior associated with specific genes. We present PRODIGEN (PRObability DIstribution of GEne Networks), a web-based visual analysis tool for the systematic exploration of probability distributions over simulation time and state space in such networks. PRODIGEN was designed in collaboration with bioinformaticians who research stochastic gene networks. The analysis tool combines in a novel way existing, expanded, and new visual encodings to capture the time-varying characteristics of probability distributions: spaghetti plots over one-dimensional projections, heatmaps of distributions over 2D projections enhanced with overlaid time curves to display temporal changes, and novel individual glyphs of state information corresponding to particular peaks. We demonstrate the effectiveness of the tool through two case studies on the computed probabilistic landscapes of a gene regulatory network and of a toggle-switch network. Domain expert feedback indicates that our visual approach can help biologists: 1) visualize probabilities of stable states, 2) explore the temporal probability distributions, and 3) discover small peaks in the probability landscape that have potential relation to specific diseases.
PETRO.CALC.PLOT, Microsoft Excel macros to aid petrologic interpretation
Sidder, G.B.
1994-01-01
PETRO.CALC.PLOT is a package of macros that normalizes whole-rock oxide data to 100%, calculates the cation percentages and molecular proportions used for normative mineral calculations, computes the apices for ternary diagrams, determines sums and ratios of specific elements of petrologic interest, and plots 33 X-Y graphs and five ternary diagrams. PETRO.CALC.PLOT may also be used to create other diagrams as desired by the user. The macros run in Microsoft Excel 3.0 and 4.0 for Macintosh computers and in Microsoft Excel 3.0 and 4.0 for Windows. The macros provided in PETRO.CALC.PLOT minimize the repetition and time required to recalculate and plot whole-rock oxide data for petrologic analysis.
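The normalization step the macros perform is simple enough to restate compactly; here is a minimal Python sketch of that first step only (the oxide names and values are illustrative, and the Excel macros themselves of course do much more):

```python
def normalize_oxides(oxides):
    """Rescale whole-rock oxide weight percents so they sum to 100."""
    total = sum(oxides.values())
    return {name: 100.0 * wt / total for name, wt in oxides.items()}

# Hypothetical analysis summing to ~91.4 wt% before normalization.
sample = {"SiO2": 49.2, "Al2O3": 13.5, "FeO": 10.1, "MgO": 7.6, "CaO": 11.0}
print(normalize_oxides(sample))   # values rescaled to a 100% total
```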
Instrumentation for full-year plot-scale runoff monitoring
USDA-ARS?s Scientific Manuscript database
Replicated 0.34 ha cropping systems plots have been in place since 1991 at the USDA-ARS Goodwater Creek Experimental Watershed in central Missouri. Recently, instrumentation has been installed at 18 of those plots for continuous runoff water quality and quantity monitoring. That installation require...
NASA Astrophysics Data System (ADS)
Sabarish, R. Mani; Narasimhan, R.; Chandhru, A. R.; Suribabu, C. R.; Sudharsan, J.; Nithiyanantham, S.
2017-05-01
In the design of irrigation and other hydraulic structures, evaluating the magnitude of extreme rainfall for a specific probability of occurrence is of much importance. The capacity of such structures is usually designed to cater to the probability of occurrence of extreme rainfall during their lifetime. In this study, an extreme value analysis of rainfall for Tiruchirapalli City in Tamil Nadu was carried out using 100 years of rainfall data. Statistical methods were used in the analysis. The best-fit probability distribution was evaluated for 1, 2, 3, 4 and 5 days of continuous maximum rainfall. The goodness of fit was evaluated using the Chi-square test. The results of the goodness-of-fit tests indicate that the log-Pearson type III distribution is the overall best fit for the 1-day maximum rainfall and the consecutive 2-, 3-, 4-, 5- and 6-day maximum rainfall series of Tiruchirapalli. For reliability, the forecasted maximum rainfall values for the selected return periods were evaluated against the results of the plotting-position method.
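As an illustration of the kind of fit reported here, the sketch below fits a log-Pearson type III distribution to an annual-maximum rainfall series and evaluates quantiles for selected return periods. The data are synthetic, and the SciPy parameterization is only one reasonable stand-in for the authors' procedure.

```python
import numpy as np
from scipy import stats

# Synthetic stand-in for ~100 years of annual 1-day maximum rainfall (mm).
rng = np.random.default_rng(1)
annual_max = rng.lognormal(mean=4.5, sigma=0.3, size=100)

log_q = np.log10(annual_max)
skew, loc, scale = stats.pearson3.fit(log_q)   # Pearson III in log space = LP3

for T in (10, 50, 100):                        # return periods in years
    p = 1 - 1 / T                              # non-exceedance probability
    q_T = 10 ** stats.pearson3.ppf(p, skew, loc=loc, scale=scale)
    print(f"{T}-year 1-day maximum rainfall: {q_T:.1f} mm")
```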
A review of contemporary methods for the presentation of scientific uncertainty.
Makinson, K A; Hamby, D M; Edwards, J A
2012-12-01
Graphic methods for displaying uncertainty are often the most concise and informative way to communicate abstract concepts. Presentation methods currently in use for the display and interpretation of scientific uncertainty are reviewed. Numerous subjective and objective uncertainty display methods are presented, including qualitative assessments, node and arrow diagrams, standard statistical methods, box-and-whisker plots, robustness and opportunity functions, contribution indexes, probability density functions, cumulative distribution functions, and graphical likelihood functions.
Import Manipulate Plot RELAP5/MOD3 Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, K. R.
1999-10-05
XMGR5 was derived from an XY plotting tool called ACE/gr, which was written by Paul J. Turner and placed in the public domain. The interactive version of ACE/gr is xmgr, which includes a graphical interface to the X Window System. Enhancements to xmgr have been developed that import, manipulate, and plot data from the RELAP5/MOD3, MELCOR, FRAPCON, and SINDA codes and from NRC databank files. Capabilities include two-phase property table lookup functions, an equation interpreter, arithmetic library functions, and units conversion. Plot titles, labels, legends, and narrative can be displayed using Latin or Cyrillic alphabets.
Plotting and Analyzing Data Trends in Ternary Diagrams Made Easy
NASA Astrophysics Data System (ADS)
John, Cédric M.
2004-04-01
Ternary plots are used in many fields of science to characterize a system based on three components. Triangular plotting is thus useful to a broad audience in the Earth sciences and beyond. Unfortunately, it is typically the most expensive commercial software packages that offer the option to plot data in ternary diagrams, and they lack features that are paramount to the geosciences, such as the ability to plot data directly into a standardized diagram and the possibility to analyze temporal and stratigraphic trends within this diagram. To address these issues, δPlot was developed with a strong emphasis on ease of use, community orientation, and availability free of charge. This "freeware" supports a fully graphical user interface where data can be imported as text files or by copying and pasting. A plot is automatically generated, and any standard diagram can be selected for plotting in the background using a simple pull-down menu. Standard diagrams are stored in an external database of PDF files that currently holds some 30 diagrams dealing with different fields of the Earth sciences. Using any drawing software supporting PDF, one can easily produce new standard diagrams for use with δPlot by simply adding them to the library folder. An independent column of values, commonly stratigraphic depths or ages, can be used to sort the data sets.
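The projection underlying any such triangular plot is a fixed linear map from a three-part composition to the plane. A minimal sketch (not δPlot's code) follows, placing the first component at the lower-left apex and the third at the top.

```python
import numpy as np

def ternary_xy(a, b, c):
    """Map a composition (a, b, c) to 2D coordinates in an equilateral
    triangle with unit edge: a -> lower left, b -> lower right, c -> top."""
    a, b, c = (np.asarray(v, dtype=float) for v in (a, b, c))
    total = a + b + c                  # compositions need not be pre-normalized
    x = 0.5 * (2 * b + c) / total
    y = (np.sqrt(3) / 2) * c / total
    return x, y

# Sanity check: the three pure end members land on the three apices.
print(ternary_xy(1, 0, 0), ternary_xy(0, 1, 0), ternary_xy(0, 0, 1))
```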
Asymptotic Normality Through Factorial Cumulants and Partition Identities
Bobecka, Konstancja; Hitczenko, Paweł; López-Blázquez, Fernando; Rempała, Grzegorz; Wesołowski, Jacek
2013-01-01
In the paper we develop an approach to asymptotic normality through factorial cumulants. Factorial cumulants arise in the same manner from factorial moments as do (ordinary) cumulants from (ordinary) moments. Another tool we exploit is a new identity for ‘moments’ of partitions of numbers. The general limiting result is then used to (re-)derive asymptotic normality for several models including classical discrete distributions, occupancy problems in some generalized allocation schemes and two models related to negative multinomial distribution. PMID:24591773
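For readers unfamiliar with the term, the parallel the abstract draws can be stated in one pair of generating functions (a standard definition, not the paper's own notation):

$$
\mathbb{E}\,(1+t)^{X}=\sum_{j\ge 0}\mu_{(j)}\,\frac{t^{j}}{j!},
\qquad
\log \mathbb{E}\,(1+t)^{X}=\sum_{j\ge 1}\kappa_{(j)}\,\frac{t^{j}}{j!},
$$

where \(\mu_{(j)}=\mathbb{E}[X(X-1)\cdots(X-j+1)]\) are the factorial moments and \(\kappa_{(j)}\) the factorial cumulants, exactly as ordinary moments and cumulants arise from \(\mathbb{E}\,e^{tX}\) and \(\log \mathbb{E}\,e^{tX}\).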
Cumulative stress and autonomic dysregulation in a community sample.
Lampert, Rachel; Tuit, Keri; Hong, Kwang-Ik; Donovan, Theresa; Lee, Forrester; Sinha, Rajita
2016-05-01
Whether cumulative stress, including both chronic stress and adverse life events, is associated with decreased heart rate variability (HRV), a non-invasive measure of autonomic status which predicts poor cardiovascular outcomes, is unknown. Healthy community dwelling volunteers (N = 157, mean age 29 years) participated in the Cumulative Stress/Adversity Interview (CAI), a 140-item event interview measuring cumulative adversity including major life events, life trauma, recent life events and chronic stressors, and underwent 24-h ambulatory ECG monitoring. HRV was analyzed in the frequency domain and the standard deviation of NN intervals (SDNN) calculated. Initial simple regression analyses revealed that total cumulative stress score, chronic stressors and cumulative adverse life events (CALE) were all inversely associated with ultra low-frequency (ULF), very low-frequency (VLF) and low-frequency (LF) power and SDNN (all p < 0.05). In hierarchical regression analyses, total cumulative stress and chronic stress each remained significantly associated with SDNN and ULF even after the highly significant contributions of age and sex, with no other covariates accounting for additional appreciable variance. For VLF and LF, both total cumulative stress and chronic stress significantly contributed to the variance alone but were no longer significant after adjusting for race and health behaviors. In summary, total cumulative stress and its components of adverse life events and chronic stress were associated with decreased cardiac autonomic function as measured by HRV. Findings suggest one potential mechanism by which stress may exert adverse effects on mortality in healthy individuals. Primary preventive strategies including stress management may prove beneficial.
Maximum-entropy probability distributions under Lp-norm constraints
NASA Technical Reports Server (NTRS)
Dolinar, S.
1991-01-01
Continuous probability density functions and discrete probability mass functions are tabulated which maximize the differential entropy or absolute entropy, respectively, among all probability distributions with a given L sub p norm (i.e., a given pth absolute moment when p is a finite integer) and unconstrained or constrained value set. Expressions for the maximum entropy are evaluated as functions of the L sub p norm. The most interesting results are obtained and plotted for unconstrained (real valued) continuous random variables and for integer valued discrete random variables. The maximum entropy expressions are obtained in closed form for unconstrained continuous random variables, and in this case there is a simple straight line relationship between the maximum differential entropy and the logarithm of the L sub p norm. Corresponding expressions for arbitrary discrete and constrained continuous random variables are given parametrically; closed form expressions are available only for special cases. However, simpler alternative bounds on the maximum entropy of integer valued discrete random variables are obtained by applying the differential entropy results to continuous random variables which approximate the integer valued random variables in a natural manner. All the results are presented in an integrated framework that includes continuous and discrete random variables, constraints on the permissible value set, and all possible values of p. Understanding such as this is useful in evaluating the performance of data compression schemes.
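The closed-form continuous case can be summarized compactly (a sketch of the standard Lagrange-multiplier result, in notation not taken from the report): the entropy maximizer with a fixed pth absolute moment is the generalized Gaussian

$$
f^{*}(x)=\frac{p}{2\alpha\,\Gamma(1/p)}\,
\exp\!\left[-\left(\frac{|x|}{\alpha}\right)^{p}\right],
\qquad \alpha=p^{1/p}\,\lVert X\rVert_{p},
$$

whose differential entropy

$$
h_{\max}=\ln \lVert X\rVert_{p}
+\frac{1}{p}+\frac{\ln p}{p}+\ln\!\big(2\,\Gamma(1+1/p)\big)
$$

is indeed linear in the logarithm of the \(L_p\) norm, as the abstract states; for \(p=2\) this reduces to the familiar Gaussian value \(\tfrac{1}{2}\ln(2\pi e\,\sigma^{2})\).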
Slope Stability of Geosynthetic Clay Liner Test Plots
Fourteen full-scale field test plots containing five types of geosynthetic clay liners (GCLs) were constructed on 2H:1V and 3H:1V slopes for the purpose of assessing slope stability. The test plots were designed to simulate typical final cover systems for landfills. Slides occurr...
FORTRAN plotting subroutines for the space plasma laboratory
NASA Technical Reports Server (NTRS)
Williams, R.
1983-01-01
The computer program known as PLOTRW was custom made to satisfy some of the graphics requirements for the data collected in the Space Plasma Laboratory at the Johnson Space Center (JSC). The general requirements for the program were as follows: (1) all subroutines shall be callable through a FORTRAN source program; (2) all graphs shall fill one page and be properly labeled; (3) there shall be options for linear axes and logarithmic axes; (4) each axis shall have tick marks equally spaced with numeric values printed at the beginning tick mark and at the last tick mark; and (5) there shall be three options for plotting: (1) point plot, (2) line plot and (3) point-line plot. The subroutines were written in FORTRAN IV for the LSI-11 Digital Equipment Corporation (DEC) computer. The program is now operational and can be run on any TEKTRONIX graphics terminal that uses a DEC Real-Time-11 (RT-11) operating system.
Cumulative watershed effects: Caspar Creek and beyond
Leslie M. Reid
1998-01-01
Cumulative effects are the combined effects of multiple activities, and watershed effects are those which involve processes of water transport. Almost all impacts are influenced by multiple activities, so almost all impacts must be evaluated as cumulative impacts rather than as individual impacts. Existing definitions suggest that to be significant, an impact must be...
Mineral Plot from Esperance Target
2014-01-23
This plot segregates various minerals examined by NASA Mars Exploration Rover Opportunity according to their different compositions; for example, those with more iron and magnesium oxides are located in the lower right corner.
High cumulants of conserved charges and their statistical uncertainties
NASA Astrophysics Data System (ADS)
Li-Zhu, Chen; Ye-Yin, Zhao; Xue, Pan; Zhi-Ming, Li; Yuan-Fang, Wu
2017-10-01
We study the influence of measured high cumulants of conserved charges on their associated statistical uncertainties in relativistic heavy-ion collisions. With a given number of events, the measured cumulants randomly fluctuate with an approximately normal distribution, while the estimated statistical uncertainties are found to be correlated with corresponding values of the obtained cumulants. Generally, with a given number of events, the larger the cumulants we measure, the larger the statistical uncertainties that are estimated. The error-weighted averaged cumulants are dependent on statistics. Despite this effect, however, it is found that the three sigma rule of thumb is still applicable when the statistics are above one million. Supported by NSFC (11405088, 11521064, 11647093), Major State Basic Research Development Program of China (2014CB845402) and Ministry of Science and Technology (MoST) (2016YFE0104800)
NASA Astrophysics Data System (ADS)
Liu, Y.; Weisberg, R. H.
2017-12-01
The Lagrangian separation distance between the endpoints of simulated and observed drifter trajectories is often used to assess the performance of numerical particle trajectory models. However, the separation distance fails to indicate relative model performance in weak and strong current regions, such as a continental shelf and its adjacent deep ocean. A skill score is proposed based on the cumulative Lagrangian separation distances normalized by the associated cumulative trajectory lengths. The new metric correctly indicates the relative performance of the Global HYCOM in simulating the strong currents of the Gulf of Mexico Loop Current and the weaker currents of the West Florida Shelf in the eastern Gulf of Mexico. In contrast, the Lagrangian separation distance alone gives a misleading result. Also, the observed drifter position series can be used to reinitialize the trajectory model and evaluate its performance along the observed trajectory, not just at the drifter end position. The proposed dimensionless skill score is particularly useful when the number of drifter trajectories is limited and neither a conventional Eulerian-based velocity nor a Lagrangian-based probability density function may be estimated.
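A minimal sketch of a skill score of this form follows. The tolerance parameter n and the 2D-position inputs are assumptions made here for illustration, not details taken from the abstract.

```python
import numpy as np

def trajectory_skill(model_pos, obs_pos, n=1.0):
    """Skill score from cumulative separation distances normalized by
    cumulative observed trajectory lengths (a sketch; n is an assumed
    tolerance: a model whose normalized cumulative separation reaches
    n scores 0, a perfect model scores 1).

    model_pos, obs_pos : arrays of shape (T, 2), positions at matched times
    """
    sep = np.linalg.norm(model_pos - obs_pos, axis=1)       # d_i at each time
    seg = np.linalg.norm(np.diff(obs_pos, axis=0), axis=1)  # observed segments
    traj_len = np.concatenate([[0.0], np.cumsum(seg)])      # cumulative lengths
    s = sep[1:].sum() / traj_len[1:].sum()  # normalized cumulative separation
    return max(0.0, 1.0 - s / n)

# Usage: score = trajectory_skill(simulated_track, drifter_track)
```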
Possibility of using NURBS for surface plotting by survey data
NASA Astrophysics Data System (ADS)
Pravdina, E. A.; Lepikhina, O. J.
2018-05-01
Different methods of surface plotting are discussed in this article. Constructing a surface with the Delaunay triangulation algorithm is described; this TIN (triangulated irregular network) method is used in virtually all CAD software, and surfaces of this type are plotted from the results of laser scanning and stadia surveying. The possibility of using spline surfaces (NURBS) for surface plotting is then studied. For a defined set of points, the curvilinear function describing a two-dimensional spline surface was calculated and plotted using Mathcad software.
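For readers who want to experiment with the spline-surface idea, a minimal SciPy sketch follows. It fits a smoothing bicubic B-spline (a non-rational relative of NURBS) to scattered survey-like points; the data and smoothing factor are invented, and the paper's own computations were done in Mathcad.

```python
import numpy as np
from scipy import interpolate

# Hypothetical scattered survey points (x, y, elevation z).
rng = np.random.default_rng(0)
x = rng.uniform(0, 100, 200)
y = rng.uniform(0, 100, 200)
z = 5 * np.sin(x / 20) + 0.05 * y + rng.normal(0, 0.1, x.size)

# Fit a smoothing bicubic B-spline surface to the scattered points.
tck = interpolate.bisplrep(x, y, z, kx=3, ky=3, s=200)

# Evaluate on a regular grid for contouring or 3D display.
xi = np.linspace(0, 100, 50)
yi = np.linspace(0, 100, 50)
zi = interpolate.bisplev(xi, yi, tck)     # (50, 50) grid of elevations
```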
User manual for two simple postscript output FORTRAN plotting routines
NASA Technical Reports Server (NTRS)
Nguyen, T. X.
1991-01-01
Graphics is one of the important tools in engineering analysis and design. However, plotting routines that generate output on high-quality laser printers normally come in graphics packages, which tend to be expensive and system dependent. These factors become important for small computer systems or desktop computers, especially when only some form of simple plotting routine is sufficient. With the PostScript language becoming popular, more and more PostScript laser printers are now available. Simple, versatile, low-cost plotting routines that can generate output on high-quality laser printers are needed, and standard FORTRAN plotting routines that produce output in the PostScript language seem a logical choice. The purpose here is to explain two simple FORTRAN plotting routines that generate output in the PostScript language.
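The core idea, emitting PostScript drawing commands directly from ordinary code, is small enough to show. The sketch below is in Python rather than FORTRAN and is not the report's routine; it writes a minimal EPS file containing one polyline.

```python
import math

def write_ps_polyline(filename, points, width=396, height=288):
    """Emit a minimal EPS file drawing one polyline; coordinates are in
    PostScript points (1/72 inch) and assumed to fit the bounding box."""
    with open(filename, "w") as f:
        f.write("%!PS-Adobe-3.0 EPSF-3.0\n")
        f.write(f"%%BoundingBox: 0 0 {width} {height}\n")
        f.write("1 setlinewidth newpath\n")
        x0, y0 = points[0]
        f.write(f"{x0:.2f} {y0:.2f} moveto\n")
        for x, y in points[1:]:
            f.write(f"{x:.2f} {y:.2f} lineto\n")
        f.write("stroke showpage\n")

# Example: a damped sine curve scaled onto the page.
pts = [(i, 144 + 100 * math.exp(-i / 200) * math.sin(i / 15))
       for i in range(0, 396, 4)]
write_ps_polyline("curve.eps", pts)
```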
Cumulative culture in the laboratory: methodological and theoretical challenges.
Miton, Helena; Charbonneau, Mathieu
2018-05-30
In the last decade, cultural transmission experiments (transmission chains, replacement, closed groups and seeded groups) have become important experimental tools in investigating cultural evolution. However, these methods face important challenges, especially regarding the operationalization of theoretical claims. In this review, we focus on the study of cumulative cultural evolution, the process by which traditions are gradually modified and, for technological traditions in particular, improved upon over time. We identify several mismatches between theoretical definitions of cumulative culture and their implementation in cultural transmission experiments. We argue that observed performance increases can be the result of participants learning faster in a group context rather than of a genuinely cumulative effect. We also show that in laboratory experiments, participants are asked to complete quite simple tasks, which can undermine the evidential value of the diagnostic criterion traditionally used for cumulative culture (i.e. that cumulative culture is a process that produces solutions that no single individual could have invented on their own). We show that the use of unidimensional metrics of cumulativeness drastically curtails the variation that may be observed, which raises specific issues in the interpretation of the experimental evidence. We suggest several solutions to these mismatches (learning times, task complexity and variation) and develop the use of design spaces in experimentally investigating old and new questions about cumulative culture. © 2018 The Author(s).
Simplified plotting package for the LSI-11 computer and Tektronix terminals
DOE Office of Scientific and Technical Information (OSTI.GOV)
Henline, P.
1980-12-01
These plotting subroutines were written to allow the user to do plotting easily and quickly, but do not contain many fancy features, in order to minimize memory space. Plots are produced of real values only. The first element of the plotting array contains the number of points to plot, and the values to plot are stored in the remaining array locations. The maximum number of points which can be plotted is 300. The user must provide titles and other alphanumeric information. This can be done easily by a call to LOCATE, then ALPHA, and then doing a FORTRAN write. LOCATE and ALPHA are part of the Oak Ridge TEK11 Graphics Package. All plots are framed and labeled. The X axis has ten tick marks and three labels (left, center, and right side) and the Y axis has three tick marks and three labels. The subroutines assume the user is smart. Curves (especially when more than one is drawn on one plot) are assumed to be completely within the defined area, as no clipping or dark lines are drawn. The user has the ability to do multiple curves on one graph or multiple graphs on a page.
Establishment of monitoring plots and evaluation of trees injured by ozone
Daniel Duriscoe; Kenneth Stolte; John Pronos
1996-01-01
By establishing long-term monitoring plots, it is possible to record environmental and biological conditions of the plot and individual trees, evaluate the condition of crowns of trees in the plot, and determine the extent of ozone injury to western conifers. This chapter recommends various methods for recording data and selecting plots, and provides information for...
Intelligence Constraints on Terrorist Network Plots
NASA Astrophysics Data System (ADS)
Woo, Gordon
Since 9/11, the western intelligence and law enforcement services have managed to interdict the great majority of planned attacks against their home countries. Network analysis shows that there are important intelligence constraints on the number and complexity of terrorist plots. If too many terrorists are involved in plots at a given time, a tipping point is reached whereby it becomes progressively easier for the dots to be joined, for the conspirators to be arrested, and for the aggregate evidence to secure convictions. Implications of this analysis are presented for the campaign to win hearts and minds.
Advanced Bode Plot Techniques for Ultrasonic Transducers
NASA Astrophysics Data System (ADS)
DeAngelis, D. A.; Schulze, G. W.
The Bode plot, displayed as either impedance or admittance versus frequency, is the most basic test used by ultrasonic transducer designers. With simplicity and ease of use, Bode plots are ideal for baseline comparisons such as spacing of parasitic modes or impedance, but quite often the subtleties that manifest as poor process control are hard to interpret or are nonexistent. In-process testing of transducers is time consuming for quantifying statistical aberrations, and assessments made indirectly via the workpiece are difficult. This research investigates the use of advanced Bode plot techniques to compare ultrasonic transducers with known "good" and known "bad" process performance, with the goal of a priori process assessment. These advanced techniques expand from the basic constant-voltage frequency sweep to include constant-current and constant-velocity sweeps interrogated locally on the transducer or tool; they also include up and down directional frequency sweeps to quantify hysteresis effects like jumping and dropping phenomena. The investigation focuses solely on the common PZT8 piezoelectric material used with welding transducers for semiconductor wire bonding. Several metrics are investigated, such as impedance, displacement/current gain, velocity/current gain, displacement/voltage gain and velocity/voltage gain. The experimental and theoretical research methods include Bode plots, admittance loops, laser vibrometry and coupled-field finite element analysis.
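To make the kind of curve under discussion concrete, the sketch below computes an impedance Bode plot for a Butterworth-Van Dyke equivalent circuit, the textbook lumped model of a piezoelectric transducer. The component values are illustrative, not measured PZT8 data, and this is not the authors' analysis code.

```python
import numpy as np

# Butterworth-Van Dyke model: series R-L-C (motional branch) in
# parallel with the clamped capacitance C0. Illustrative values only.
R, L, C, C0 = 25.0, 0.10, 2.5e-9, 5.0e-9

f = np.linspace(5e3, 50e3, 2000)            # frequency sweep, Hz
w = 2 * np.pi * f
Zm = R + 1j * w * L + 1 / (1j * w * C)      # motional branch impedance
Z0 = 1 / (1j * w * C0)                      # clamped-capacitance branch
Z = Zm * Z0 / (Zm + Z0)                     # parallel combination

mag_db = 20 * np.log10(np.abs(Z))           # magnitude trace of the Bode plot
phase_deg = np.degrees(np.angle(Z))         # phase trace
# Plotting mag_db and phase_deg against f shows the series (minimum |Z|)
# and parallel (maximum |Z|) resonances that designers inspect.
```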
NASA Technical Reports Server (NTRS)
Lanzi, R. James; Vincent, Brett T.
1993-01-01
The relationship between actual and predicted re-entry maximum dynamic pressure is characterized using a probability density function and a cumulative distribution function derived from sounding rocket flight data. This paper explores the properties of this distribution and demonstrates how these data, combined with observed sounding rocket re-entry body damage characteristics, can be used to assess the probabilities of sustaining various levels of heating damage. The results effectively bridge the gap in sounding rocket re-entry analysis between the known damage level/flight environment relationships and the predicted flight environment.
Pavlovian conditioning and cumulative reinforcement rate.
Harris, Justin A; Patterson, Angela E; Gharaei, Saba
2015-04-01
In 5 experiments using delay conditioning of magazine approach with rats, reinforcement rate was varied either by manipulating the mean interval between onset of the conditioned stimulus (CS) and unconditioned stimulus (US) or by manipulating the proportion of CS presentations that ended with the US (trial-based reinforcement rate). Both manipulations influenced the acquisition of responding. In each experiment, a specific comparison was made between 2 CSs that differed in their mean CS-US interval and in their trial-based reinforcement rate, such that the cumulative reinforcement rate, i.e., the cumulative duration of the CS between reinforcements, was the same for the 2 CSs. For example, a CS reinforced on 100% of trials with a mean CS-US interval of 60 s was compared with a CS reinforced on 33% of trials and a mean duration of 20 s. Across the 5 experiments, conditioning was virtually identical for the 2 CSs with matched cumulative reinforcement rate. This was true as long as the timing of the US was unpredictable and, thus, response rates were uniform across the length of the CS. We conclude that the effects of CS-US interval and of trial-based reinforcement rate are reducible entirely to their common effect on cumulative reinforcement rate. We discuss the implications of this for rate-based, trial-based, and real-time associative models of conditioning. (c) 2015 APA, all rights reserved.
Cumulative exposure to traumatic events in older adults.
Ogle, Christin M; Rubin, David C; Siegler, Ilene C
2014-01-01
The present study examined the impact of cumulative trauma exposure on current posttraumatic stress disorder (PTSD) symptom severity in a nonclinical sample of adults in their 60s. The predictive utility of cumulative trauma exposure was compared to other known predictors of PTSD, including trauma severity, personality traits, social support, and event centrality. Community-dwelling adults (n = 2515) from the crest of the Baby Boom generation completed the Traumatic Life Events Questionnaire, the PTSD Checklist, the NEO Personality Inventory, the Centrality of Event Scale, and rated their current social support. Cumulative trauma exposure predicted greater PTSD symptom severity in hierarchical regression analyses consistent with a dose-response model. Neuroticism and event centrality also emerged as robust predictors of PTSD symptom severity. In contrast, the severity of individuals' single most distressing life event, as measured by self-report ratings of the A1 PTSD diagnostic criterion, did not add explanatory variance to the model. Analyses concerning event categories revealed that cumulative exposure to childhood violence and adulthood physical assaults were most strongly associated with PTSD symptom severity in older adulthood. Moreover, cumulative self-oriented events accounted for a larger percentage of variance in symptom severity compared to events directed at others. Our findings suggest that the cumulative impact of exposure to traumatic events throughout the life course contributes significantly to posttraumatic stress in older adulthood above and beyond other known predictors of PTSD.
Evaluation of an Ensemble Dispersion Calculation.
NASA Astrophysics Data System (ADS)
Draxler, Roland R.
2003-02-01
A Lagrangian transport and dispersion model was modified to generate multiple simulations from a single meteorological dataset. Each member of the simulation was computed by assuming a ±1-gridpoint shift in the horizontal direction and a ±250-m shift in the vertical direction of the particle position, with respect to the meteorological data. The configuration resulted in 27 ensemble members. Each member was assumed to have an equal probability. The model was tested by creating an ensemble of daily average air concentrations for 3 months at 75 measurement locations over the eastern half of the United States during the Across North America Tracer Experiment (ANATEX). Two generic graphical displays were developed to summarize the ensemble prediction and the resulting concentration probabilities for a specific event: a probability-exceed plot and a concentration-probability plot. Although a cumulative distribution of the ensemble probabilities compared favorably with the measurement data, the resulting distribution was not uniform. This result was attributed to release height sensitivity. The trajectory ensemble approach accounts for about 41%-47% of the variance in the measurement data. This residual uncertainty is caused by other model and data errors that are not included in the ensemble design.
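The probability displays described here reduce to simple counting over equally weighted members; a minimal sketch (with invented data, not ANATEX values) follows.

```python
import numpy as np

def exceedance_probability(members, threshold):
    """Fraction of equally weighted ensemble members exceeding a threshold,
    the quantity behind a probability-of-exceedance display."""
    return (np.asarray(members) > threshold).mean(axis=0)

# Hypothetical: 27 members' daily-average concentrations at 75 stations.
rng = np.random.default_rng(0)
conc = rng.lognormal(mean=0.0, sigma=1.0, size=(27, 75))
p_exceed = exceedance_probability(conc, threshold=2.0)  # one value per station
```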
Lawrance, R A; Dorsch, M F; Sapsford, R J; Mackintosh, A F; Greenwood, D C; Jackson, B M; Morrell, C; Robinson, M B; Hall, A S
2001-08-11
Objective: To use cumulative mortality adjusted for case mix in patients with acute myocardial infarction for early detection of variation in clinical practice. Design: Observational study. Setting: 20 hospitals across the former Yorkshire region. Participants: All 2153 consecutive patients with confirmed acute myocardial infarction identified during three months. Main outcome measures: Variable life-adjusted displays showing cumulative differences between observed and expected mortality of patients; expected mortality calculated from a risk model based on the admission characteristics of age, heart rate, and systolic blood pressure. Results: The performance of two individual hospitals over three months was examined as an example. One, the smallest district hospital in the region, had a series of 30 consecutive patients but had five more deaths than predicted. The variable life-adjusted display showed minimal variation from that predicted for the first 15 patients, followed by a run of unexpectedly high mortality. The second example was the main tertiary referral centre for the region, which admitted 188 consecutive patients. The display showed a period of apparently poor performance followed by substantial improvement, where the plot rose steadily from a cumulative net lives saved of -4 to 7. Conclusions: These variations in patient outcome are unlikely to have been revealed during conventional audit practice. Variable life-adjusted display has been integrated into surgical care as a graphical display of risk-adjusted survival for individual surgeons or centres. In combination with a simple risk model, it may have a role in monitoring performance and outcome in patients with acute myocardial infarction.
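The display itself is a running sum that is easy to restate. The sketch below computes a variable life-adjusted display curve from per-patient risk-model probabilities; it is a generic formulation with invented inputs, not the Yorkshire data or risk model.

```python
import numpy as np

def vlad_curve(expected_risk, died):
    """Cumulative expected-minus-observed deaths ("net lives saved").

    expected_risk : risk-model death probability per patient, admission order
    died          : 1 if the patient died, 0 otherwise
    Each survivor raises the curve by the patient's risk; each death
    lowers it by (1 - risk)."""
    return np.cumsum(np.asarray(expected_risk, float) - np.asarray(died, float))

# Hypothetical series of 30 patients with 5-30% predicted mortality.
rng = np.random.default_rng(3)
risk = rng.uniform(0.05, 0.30, size=30)
outcome = (rng.random(30) < risk).astype(int)
print(vlad_curve(risk, outcome)[-1])   # cumulative net lives saved at the end
```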
Computer routines for probability distributions, random numbers, and related functions
Kirby, W.
1983-01-01
Use of previously coded and tested subroutines simplifies and speeds up program development and testing. This report presents routines that can be used to calculate various probability distributions and other functions of importance in statistical hydrology. The routines are designed as general-purpose Fortran subroutines and functions to be called from user-written main programs. The probability distributions provided include the beta, chi-square, gamma, Gaussian (normal), Pearson Type III (tables and approximation), and Weibull. Also provided are the distributions of the Grubbs-Beck outlier test, Kolmogorov's and Smirnov's D, Student's t, noncentral t (approximate), and Snedecor's F. Other mathematical functions include the Bessel function I0, gamma and log-gamma functions, error functions, and the exponential integral. Auxiliary services include sorting and printer-plotting. Random number generators for uniform and normal numbers are provided and may be used with some of the above routines to generate numbers from other distributions. (USGS)
Computer routines for probability distributions, random numbers, and related functions
Kirby, W.H.
1980-01-01
Use of previously coded and tested subroutines simplifies and speeds up program development and testing. This report presents routines that can be used to calculate various probability distributions and other functions of importance in statistical hydrology. The routines are designed as general-purpose Fortran subroutines and functions to be called from user-written main programs. The probability distributions provided include the beta, chi-square, gamma, Gaussian (normal), Pearson Type III (tables and approximation), and Weibull. Also provided are the distributions of the Grubbs-Beck outlier test, Kolmogorov's and Smirnov's D, Student's t, noncentral t (approximate), and Snedecor's F tests. Other mathematical functions include the Bessel function I0, gamma and log-gamma functions, error functions and the exponential integral. Auxiliary services include sorting and printer plotting. Random number generators for uniform and normal numbers are provided and may be used with some of the above routines to generate numbers from other distributions. (USGS)
Iron status determination in pregnancy using the Thomas plot.
Weyers, R; Coetzee, M J; Nel, M
2016-04-01
Physiological changes during pregnancy affect routine tests for iron deficiency. The reticulocyte haemoglobin equivalent (RET-He) and the serum soluble transferrin receptor (sTfR) assay are newer diagnostic parameters for the detection of iron deficiency, combined in the Thomas diagnostic plot. We used this plot to determine the iron status of pregnant women presenting for their first visit to an antenatal clinic in Bloemfontein, South Africa. Routine laboratory tests (serum ferritin, full blood count and C-reactive protein) as well as RET-He and sTfR were performed, and iron status was determined using the Thomas plot. For this study, 103 pregnant women were recruited. According to the Thomas plot, 72.8% of the participants had normal iron stores and erythropoiesis. Iron-deficient erythropoiesis was detected in 12.6%. A third of participants were anaemic. Serum ferritin showed excellent sensitivity but poor specificity for detecting depleted iron stores. HIV status had no influence on the iron status of the participants. Our findings reiterate that causes other than iron deficiency should be considered in anaemic individuals. When compared with the Thomas plot, a low serum ferritin is a sensitive but nonspecific indicator of iron deficiency. The Thomas plot may provide useful information to identify pregnant individuals in whom haematologic parameters indicate limited iron availability for erythropoiesis. © 2015 John Wiley & Sons Ltd.
FERMI/GLAST Integrated Trending and Plotting System Release 5.0
NASA Technical Reports Server (NTRS)
Ritter, Sheila; Brumer, Haim; Reitan, Denise
2012-01-01
An Integrated Trending and Plotting System (ITPS) is a trending, analysis, and plotting system used by space missions to determine the performance and status of a spacecraft and its instruments. ITPS supports several NASA mission operational control centers, providing engineers, ground controllers, and scientists with access to the entire spacecraft telemetry data archive for the life of the mission, and includes a secure Web component for remote access. FERMI/GLAST ITPS Release 5.0 features include the option to display dates (yyyy/ddd) instead of orbit numbers along the orbital Long-Term Trend (LTT) plot axis, the ability to save statistics from daily production plots as image files, and removal of redundant edit/create Input Definition File (IDF) screens. Other features are a fix to address invalid packet lengths, a change in the naming convention of image files in order to allow use in scripts, the ability to save all ITPS plot images (from Windows or the Web) in GIF or PNG format, the ability to specify ymin and ymax on plots where previously only the desired range could be specified, Web interface capability to plot IDFs that contain out-of-order page and plot numbers, and a fix to change all default file names to show yyyydddhhmmss time stamps instead of hhmmssdddyyyy. A Web interface capability sorts files based on modification date (with the newest at the top), and the statistics block can be displayed via the Web interface. Via the Web, users can graphically view the volume of telemetry data from each day contained in the ITPS archive in the Web digest. The ITPS could also be used in non-space fields that need to plot or trend data, including financial and banking systems, aviation and transportation systems, healthcare and educational systems, sales and marketing, and housing and construction.
Myth Structure and Media Fiction Plot: An Exploration.
ERIC Educational Resources Information Center
Harless, James D.
Based on the general research of Joseph Campbell on adventure plots from mythology, the author explores the simplified monomyth plots currently in frequent use in mass media programming. The close relationship of media fiction to mythic stories is established through the analysis of more than 25 stories resulting from media broadcasting. The media…
Automatic Extraction of Small Spatial Plots from Geo-Registered UAS Imagery
NASA Astrophysics Data System (ADS)
Cherkauer, Keith; Hearst, Anthony
2015-04-01
Accurate extraction of spatial plots from high-resolution imagery acquired by Unmanned Aircraft Systems (UAS) is a prerequisite for accurate assessment of experimental plots in many geoscience fields. If the imagery is correctly geo-registered, then it may be possible to accurately extract plots from the imagery based on their map coordinates. To test this approach, a UAS was used to acquire visual imagery of 5 ha of soybean fields containing 6.0 m2 plots in a complex planting scheme. Sixteen artificial targets were set up in the fields before the flights, and different spatial configurations of 0 to 6 targets were used as Ground Control Points (GCPs) for geo-registration, resulting in a total of 175 geo-registered image mosaics with a broad range of geo-registration accuracies. Geo-registration accuracy was quantified based on the horizontal Root Mean Squared Error (RMSE) of targets used as checkpoints. Twenty test plots were extracted from the geo-registered imagery. Plot extraction accuracy was quantified based on the percentage of the desired plot area that was extracted. It was found that using 4 GCPs along the perimeter of the field minimized the horizontal RMSE and enabled a plot extraction accuracy of at least 70%, with a mean plot extraction accuracy of 92%. The methods developed are suitable for work in many fields where replicates across time and space are necessary to quantify variability.
Comparing Mapped Plot Estimators
Paul C. Van Deusen
2006-01-01
Two alternative derivations of estimators for mean and variance from mapped plots are compared by considering the models that support the estimators and by simulation. It turns out that both models lead to the same estimator for the mean but lead to very different variance estimators. The variance estimators based on the least valid model assumptions are shown to...
Comparative analysis through probability distributions of a data set
NASA Astrophysics Data System (ADS)
Cristea, Gabriel; Constantinescu, Dan Mihai
2018-02-01
In practice, probability distributions are applied in such diverse fields as risk analysis, reliability engineering, chemical engineering, hydrology, image processing, physics, market research, business and economic research, customer support, medicine, sociology, and demography. This article highlights important aspects of fitting probability distributions to data and applying the analysis results to make informed decisions. A number of statistical methods are available to help select the best-fitting model. Some of the graphs display both the input data and the fitted distributions at the same time, as probability density and cumulative distribution functions. Goodness-of-fit tests can be used to determine whether a certain distribution is a good fit. The main idea is to measure the "distance" between the data and the tested distribution and to compare that distance to some threshold values. Calculating the goodness-of-fit statistics also enables us to rank the fitted distributions according to how well they fit the data. This feature is very helpful for comparing the fitted models. The paper presents a comparison of the most commonly used goodness-of-fit tests: Kolmogorov-Smirnov, Anderson-Darling, and Chi-squared. A large data set is analyzed and conclusions are drawn by visualizing the data, comparing multiple fitted distributions, and selecting the best model. These graphs should be viewed as an addition to the goodness-of-fit tests.
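The three tests compared in the paper are all available in SciPy; a compact sketch of applying them to one candidate model follows (synthetic data, and the binning choice for the chi-squared test is an assumption made here).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
data = rng.normal(loc=10.0, scale=2.0, size=500)     # synthetic data set

# Candidate model: normal with parameters estimated from the data.
mu, sigma = np.mean(data), np.std(data, ddof=1)

ks = stats.kstest(data, "norm", args=(mu, sigma))    # Kolmogorov-Smirnov
ad = stats.anderson(data, dist="norm")               # Anderson-Darling

# Chi-squared on binned counts versus expected normal counts.
counts, edges = np.histogram(data, bins=12)
expected = len(data) * np.diff(stats.norm.cdf(edges, mu, sigma))
expected *= counts.sum() / expected.sum()            # match the totals
chi2 = stats.chisquare(counts, expected, ddof=2)     # 2 fitted parameters

print(ks.statistic, ad.statistic, chi2.statistic)
```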
Covariate-adjusted Spearman's rank correlation with probability-scale residuals.
Liu, Qi; Li, Chun; Wanga, Valentine; Shepherd, Bryan E
2018-06-01
It is desirable to adjust Spearman's rank correlation for covariates, yet existing approaches have limitations. For example, the traditionally defined partial Spearman's correlation does not have a sensible population parameter, and the conditional Spearman's correlation defined with copulas cannot be easily generalized to discrete variables. We define population parameters for both partial and conditional Spearman's correlation through concordance-discordance probabilities. The definitions are natural extensions of Spearman's rank correlation in the presence of covariates and are general for any orderable random variables. We show that they can be neatly expressed using probability-scale residuals (PSRs). This connection allows us to derive simple estimators. Our partial estimator for Spearman's correlation between X and Y adjusted for Z is the correlation of PSRs from models of X on Z and of Y on Z, which is analogous to the partial Pearson's correlation derived as the correlation of observed-minus-expected residuals. Our conditional estimator is the conditional correlation of PSRs. We describe estimation and inference, and highlight the use of semiparametric cumulative probability models, which allow preservation of the rank-based nature of Spearman's correlation. We conduct simulations to evaluate the performance of our estimators and compare them with other popular measures of association, demonstrating their robustness and efficiency. We illustrate our method in two applications, a biomarker study and a large survey. © 2017, The International Biometric Society.
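To make the estimator concrete, here is a small sketch for the continuous special case, using ordinary normal linear models for X given Z and Y given Z. The paper's semiparametric cumulative probability models are more general, so treat this only as an illustration of "correlate the PSRs".

```python
import numpy as np
from scipy import stats

def psr_normal_linear(v, Z):
    """Probability-scale residual 2*F(v|Z) - 1 under a normal linear model
    (an illustrative special case of the general PSR definition)."""
    Z1 = np.column_stack([np.ones(len(v)), Z])
    beta, *_ = np.linalg.lstsq(Z1, v, rcond=None)
    resid = v - Z1 @ beta
    sigma = resid.std(ddof=Z1.shape[1])
    return 2 * stats.norm.cdf(resid / sigma) - 1

def partial_spearman(x, y, Z):
    """Covariate-adjusted Spearman correlation: Pearson correlation of the
    probability-scale residuals from the two covariate models."""
    return np.corrcoef(psr_normal_linear(x, Z), psr_normal_linear(y, Z))[0, 1]
```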
[Comparative quality measurements part 3: funnel plots].
Kottner, Jan; Lahmann, Nils
2014-02-01
Comparative quality measurements between organisations or institutions are common. Quality measures need to be standardised and risk adjusted, and random error must be taken adequately into account. Rankings that do not consider precision lead to flawed interpretations and encourage "gaming". Applying confidence intervals is one way to take chance variation into account. Funnel plots are modified control charts based on Statistical Process Control (SPC) theory. The quality measures are plotted against their sample size, and warning and control limits 2 or 3 standard deviations from the centre line are added. With increasing group size the precision increases, so the control limits form a funnel. Data points within the control limits are considered to show common cause variation; data points outside the limits show special cause variation, without the distraction of spurious rankings. Funnel plots offer data-based information on how to evaluate institutional performance within quality management contexts.
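The limits themselves are elementary for a proportion-type measure; the following sketch (generic, not taken from the paper) computes the funnel for a target rate p0 under a normal approximation.

```python
import numpy as np

def funnel_limits(p0, n, z):
    """Control limits for a proportion at group size n, z standard
    deviations from the overall rate p0 (normal approximation)."""
    se = np.sqrt(p0 * (1 - p0) / n)
    return p0 - z * se, p0 + z * se

n = np.arange(10, 1001)
warn_lo, warn_hi = funnel_limits(0.15, n, z=2)   # ~95% warning limits
ctrl_lo, ctrl_hi = funnel_limits(0.15, n, z=3)   # ~99.8% control limits
# Plot each institution's observed rate against its group size n;
# points outside the z=3 funnel suggest special cause variation.
```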
Polychromatic plots: graphical display of multidimensional data.
Roederer, Mario; Moody, M Anthony
2008-09-01
Limitations of graphical displays as well as human perception make the presentation and analysis of multidimensional data challenging. Graphical display of information on paper or by current projectors is perforce limited to two dimensions; the encoding of information from other dimensions must be overloaded into the two physical dimensions. A number of alternative means of encoding this information have been implemented, such as offsetting data points at an angle (e.g., three-dimensional projections onto a two-dimensional surface) or generating derived parameters that are combinations of other variables (e.g., principal components). Here, we explore the use of color to encode additional dimensions of data. PolyChromatic Plots are standard dot plots, where the color of each event is defined by the values of one, two, or three of the measurements for that event. The measurements for these parameters are mapped onto an intensity value for each primary color (red, green, or blue) based on different functions. In addition, differential weighting of the priority with which overlapping events are displayed can be defined by these same measurements. PolyChromatic Plots can encode up to five independent dimensions of data in a single display. By altering the color mapping function and the priority function, very different displays that highlight or de-emphasize populations of events can be generated. As for standard black-and-white dot plots, frequency information can be significantly biased by this display; care must be taken to ensure appropriate interpretation of the displays. PolyChromatic Plots are a powerful display type that enables rapid data exploration. By virtue of encoding as many as five dimensions of data independently, an enormous amount of information can be gleaned from the displays. In many ways, the display performs somewhat like an unsupervised cluster algorithm, by highlighting events of similar distributions in multivariate space.
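A minimal sketch of the color-overloading idea (not the authors' implementation): two measurements give position, three more are rescaled to the red, green, and blue channels, and one of them also sets display priority.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(7)
events = rng.random((5000, 5))          # hypothetical 5-parameter event list

def unit(v):
    """Rescale one measurement to [0, 1] for use as a color channel."""
    return (v - v.min()) / (v.max() - v.min())

# Parameters 0 and 1 give position; 2, 3, and 4 are encoded as R, G, B.
colors = np.column_stack([unit(events[:, k]) for k in (2, 3, 4)])
order = np.argsort(events[:, 2])        # priority: draw high-red events last
plt.scatter(events[order, 0], events[order, 1], c=colors[order], s=4)
plt.xlabel("parameter 1")
plt.ylabel("parameter 2")
plt.show()
```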
Permanent-plot procedures for silvicultural and yield research.
Robert O. Curtis; David D. Marshall
2005-01-01
This paper reviews purposes and procedures for establishing and maintaining permanent plots for silvicultural and yield research, sampling and plot design, common errors, and procedures for measuring and recording data. It is a revision and update of a 1983 publication. Although some details are specific to coastal Pacific Northwest conditions, most of the material is...
Development of a GIFTS (Graphics Oriented Interactive Finite-Element Time-sharing System) Plotting Package Compatible with either PLOT10 or IBM/DSM Graphics
Pickles, Thomas R.
1983-06-01
Master's thesis, Naval Postgraduate School, Monterey, California.
Compensating for missing plot observations in forest inventory estimation
Ronald E. McRoberts
2003-01-01
The Enhanced Forest Inventory and Analysis program of the U.S. Forest Service has established a nationwide array of permanent field plots, each representing approximately 2400 ha. Each plot has been assigned to one of five interpenetrating, nonoverlapping panels, with one panel selected for measurement on a rotating basis each year. As with most large surveys,...
The 2002 RPA Plot Summary database users manual
Patrick D. Miles; John S. Vissage; W. Brad Smith
2004-01-01
Describes the structure of the RPA 2002 Plot Summary database and provides information on generating estimates of forest statistics from these data. The RPA 2002 Plot Summary database provides a consistent framework for storing forest inventory data across all ownerships across the entire United States. The data represents the best available data as of October 2001....
Online plot services for paleomagnetism and rock magnetism
NASA Astrophysics Data System (ADS)
Hatakeyama, T.
2017-12-01
In paleomagnetism and rock magnetism, many types of specialized plots are used to present measurement data. Researchers in paleomagnetism often use not only general-purpose plotting programs such as Microsoft Excel but also single-purpose tools. A large benefit of using the latter is that they produce well-designed figures for the data at hand. However, those programs require a specific environment for their operation, such as the type of hardware and platform, the type and version of the operating system, the libraries needed for execution, and so on. It is therefore difficult to share results and graphics among collaborators who use different environments on their PCs. One of the best solutions is a program that runs in a widely available environment, and the most widely available environment is the web: almost all current operating systems ship with web browsers, and everyone uses them regularly. We now provide a web-based service for plotting paleomagnetic results easily. We developed original programs with a command-line interface (non-GUI) and prepared web pages for entering the measured data and options, together with a wrapper script that passes the entered values to the programs. The results, analyzed values, and plotted graphs are shown in an HTML page and are downloadable. Our plot services are provided at http://mage-p.org/mageplot/. In this talk, we introduce our program and service and discuss the philosophy and efficiency of these services.
Parametric-Studies and Data-Plotting Modules for the SOAP
NASA Technical Reports Server (NTRS)
2008-01-01
"Parametric Studies" and "Data Table Plot View" are the names of software modules in the Satellite Orbit Analysis Program (SOAP). Parametric Studies enables parameterization of as many as three satellite or ground-station attributes across a range of values and computes the average, minimum, and maximum of a specified metric, the revisit time, or 21 other functions at each point in the parameter space. This computation produces a one-, two-, or three-dimensional table of data representing statistical results across the parameter space. Inasmuch as the output of a parametric study in three dimensions can be a very large data set, visualization is a paramount means of discovering trends in the data (see figure). Data Table Plot View enables visualization of the data table created by Parametric Studies or by another data source: this module quickly generates a display of the data in the form of a rotatable three-dimensional-appearing plot, making it unnecessary to load the SOAP output data into a separate plotting program. The rotatable three-dimensionalappearing plot makes it easy to determine which points in the parameter space are most desirable. Both modules provide intuitive user interfaces for ease of use.
Utsumi, Takanobu; Oka, Ryo; Endo, Takumi; Yano, Masashi; Kamijima, Shuichi; Kamiya, Naoto; Fujimura, Masaaki; Sekita, Nobuyuki; Mikami, Kazuo; Hiruta, Nobuyuki; Suzuki, Hiroyoshi
2015-11-01
The aim of this study is to validate and compare the predictive accuracy of two nomograms predicting the probability of Gleason sum upgrading between biopsy and radical prostatectomy pathology among representative patients with prostate cancer. We previously developed a nomogram, as did Chun et al. In this validation study, patients originated from two centers: Toho University Sakura Medical Center (n = 214) and Chibaken Saiseikai Narashino Hospital (n = 216). We assessed predictive accuracy using area under the curve values and constructed calibration plots to assess the tendency at each institution. Both nomograms showed high predictive accuracy in each institution, although the calibration plots of the two nomograms underestimated the actual probability at Toho University Sakura Medical Center. Clinicians need to use calibration plots for each institution to correctly understand the tendency of each nomogram for their patients, even if each nomogram has good predictive accuracy. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
A Missing Link in the Evolution of the Cumulative Recorder
ERIC Educational Resources Information Center
Asano, Toshio; Lattal, Kennon A.
2012-01-01
A recently recovered cumulative recorder provides a missing link in the evolution of the cumulative recorder from a modified kymograph to a reliably operating, scientifically and commercially successful instrument. The recorder, the only physical evidence of such an early precommercial cumulative recorder yet found, was sent to Keio University in…
Selection of Plot Remeasurement in an Annual Inventory
Mark H. Hansen; Hans T. Schreuder; Dave Heinzen
2000-01-01
A plot selection approach is proposed based on experience from the Annual Forest Inventory System (AFIS) in the Aspen-Birch Unit of northeastern Minnesota. The emphasis is on a mixture of strategies. Although the Agricultural Act of 1998 requires that a fixed 20 percent of plots be measured each year in each state, sooner or later we will need to vary the scheme to...
The Effects of Framing, Reflection, Probability, and Payoff on Risk Preference in Choice Tasks.
Kühberger; Schulte-Mecklenbeck; Perner
1999-06-01
A meta-analysis of Asian-disease-like studies is presented to identify the factors which determine risk preference. First, the confounds between probability levels, payoffs, and framing conditions are clarified in a task analysis. Then the role of framing, reflection, probability, and the type and size of payoff is evaluated in a meta-analysis. It is shown that bidirectional framing effects exist for gains and for losses. Presenting outcomes as gains tends to induce risk aversion, while presenting outcomes as losses tends to induce risk seeking. Risk preference is also shown to depend on the size of the payoffs, on the probability levels, and on the type of good at stake (money/property vs. human lives). In general, higher payoffs lead to increasing risk aversion. Higher probabilities lead to increasing risk aversion for gains and to increasing risk seeking for losses. These findings are confirmed by a subsequent empirical test. Shortcomings of existing formal theories, such as prospect theory, cumulative prospect theory, venture theory, and Markowitz's utility theory, are identified. It is shown that it is not probabilities or payoffs, but the framing condition, that explains most of the variance. These findings are interpreted as showing that no linear combination of formally relevant predictors is sufficient to capture the essence of the framing phenomenon. Copyright 1999 Academic Press.
Dynamic probability control limits for risk-adjusted Bernoulli CUSUM charts.
Zhang, Xiang; Woodall, William H
2015-11-10
The risk-adjusted Bernoulli cumulative sum (CUSUM) chart developed by Steiner et al. (2000) is an increasingly popular tool for monitoring clinical and surgical performance. In practice, however, the use of a fixed control limit for the chart leads to a quite variable in-control average run length performance for patient populations with different risk score distributions. To overcome this problem, we determine simulation-based dynamic probability control limits (DPCLs) patient-by-patient for the risk-adjusted Bernoulli CUSUM charts. By maintaining the probability of a false alarm at a constant level conditional on no false alarm for previous observations, our risk-adjusted CUSUM charts with DPCLs have consistent in-control performance at the desired level with approximately geometrically distributed run lengths. Our simulation results demonstrate that our method does not rely on any information or assumptions about the patients' risk distributions. The use of DPCLs for risk-adjusted Bernoulli CUSUM charts allows each chart to be designed for the corresponding particular sequence of patients for a surgeon or hospital. Copyright © 2015 John Wiley & Sons, Ltd.
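The DPCL idea lends itself to a compact simulation sketch. Below is a minimal, illustrative implementation, not the authors' code: it uses the usual Steiner-style Bernoulli CUSUM score and sets each limit as a conditional quantile of simulated in-control paths. The detection odds ratio, false-alarm rate, and risk distribution are assumptions chosen for illustration.

```python
# Minimal sketch of a risk-adjusted Bernoulli CUSUM with simulation-based
# dynamic probability control limits (DPCLs). R, alpha, and the risk
# distribution below are illustrative assumptions, not values from the paper.
import numpy as np

rng = np.random.default_rng(0)

def cusum_weight(y, p, R=2.0):
    """Log-likelihood-ratio score for a patient with pre-op risk p,
    testing odds ratio 1 against odds ratio R."""
    denom = 1.0 - p + R * p
    return y * np.log(R / denom) + (1 - y) * np.log(1.0 / denom)

def dpcl_limits(risks, alpha=0.001, n_paths=10000, R=2.0):
    """For each patient in `risks`, return a control limit h_t such that the
    conditional false-alarm probability is ~alpha, given no earlier alarm."""
    stats = np.zeros(n_paths)            # simulated in-control CUSUM paths
    limits = []
    for p in risks:
        y = rng.random(n_paths) < p      # in-control outcomes for each path
        stats = np.maximum(0.0, stats + cusum_weight(y.astype(float), p, R))
        h = np.quantile(stats, 1.0 - alpha)
        limits.append(h)
        # Paths that would have alarmed are restarted from surviving paths,
        # so later quantiles stay conditional on "no alarm so far".
        alarmed = stats > h
        if alarmed.any():
            stats[alarmed] = rng.choice(stats[~alarmed], alarmed.sum())
    return np.array(limits)

risks = rng.beta(2, 18, size=200)        # heterogeneous pre-op risks, mean ~0.1
print(dpcl_limits(risks)[:5])            # limits adapt to the patient sequence
```

Because the limits are recomputed for each observed risk sequence, the same code serves any surgeon's or hospital's patient mix, which is the point the abstract makes.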
The re-incarnation, re-interpretation and re-demise of the transition probability model.
Koch, A L
1999-05-28
There are two classes of models for the cell cycle that have both a deterministic and a stochastic part: the transition probability (TP) models and the sloppy size control (SSC) models. The hallmark of the basic TP model is two graphs: the alpha and beta plots. The former is a semi-logarithmic plot of the percentage of cell divisions yet to occur; it yields a horizontal line segment at 100% corresponding to the deterministic phase and a straight sloping tail corresponding to the stochastic part. The beta plot concerns the differences in age-at-division of sister cells (the beta curve) and gives a straight line parallel to the tail of the alpha curve. For the SSC models, the deterministic part is the time needed for the cell to accumulate a critical amount of some substance(s). The variable part differs among the variants of the general model, but none of them gives alpha and beta curves with linear tails as postulated by the TP model. This paper argues against TP and for an elaboration of the SSC type of model. The main argument against TP is that it assumes the probability of the transition from the stochastic phase is time invariant, even though it is certain that the cells are growing and metabolizing throughout the cell cycle; a fact that should make the transition probability variable. The SSC models presume that cell division is triggered by the cell's success in growing and not simply by elapsed time. The extended model proposed here, which reconciles the predictions of the SSC model with the straight-tailed parts of the alpha and beta plots, depends on the existence of a few percent of cells in a growing culture that are not growing normally; these grow much more slowly or are temporarily quiescent. The bulk of the cells, however, grow nearly exponentially. Evidence for a slow-growing component comes from experimental analyses of population size distributions for a variety of cell types by the Collins-Richmond technique. These
Diversification and cumulative evolution in New Caledonian crow tool manufacture.
Hunt, Gavin R; Gray, Russell D
2003-01-01
Many animals use tools but only humans are generally considered to have the cognitive sophistication required for cumulative technological evolution. Three important characteristics of cumulative technological evolution are: (i) the diversification of tool design; (ii) cumulative change; and (iii) high-fidelity social transmission. We present evidence that crows have diversified and cumulatively changed the design of their pandanus tools. In 2000 we carried out an intensive survey in New Caledonia to establish the geographical variation in the manufacture of these tools. We documented the shapes of 5550 tools from 21 sites throughout the range of pandanus tool manufacture. We found three distinct pandanus tool designs: wide tools, narrow tools and stepped tools. The lack of ecological correlates of the three tool designs and their different, continuous and overlapping geographical distributions make it unlikely that they evolved independently. The similarities in the manufacture method of each design further suggest that pandanus tools have gone through a process of cumulative change from a common historical origin. We propose a plausible scenario for this rudimentary cumulative evolution. PMID:12737666
Non-parametric and least squares Langley plot methods
NASA Astrophysics Data System (ADS)
Kiedron, P. W.; Michalsky, J. J.
2016-01-01
Langley plots are used to calibrate sun radiometers primarily for the measurement of the aerosol component of the atmosphere that attenuates (scatters and absorbs) incoming direct solar radiation. In principle, the calibration of a sun radiometer is a straightforward application of the Bouguer-Lambert-Beer law V = V0·e^(−τ·m), where a plot of the log of voltage ln(V) vs. air mass m yields a straight line with intercept ln(V0). This ln(V0) subsequently can be used to solve for τ for any measurement of V and calculation of m. This calibration works well on some high mountain sites, but the application of the Langley plot calibration technique is more complicated at other, more interesting, locales. This paper is concerned with ferreting out calibrations at difficult sites and examining and comparing a number of conventional and non-conventional methods for obtaining successful Langley plots. The 11 techniques discussed indicate that both least squares and various non-parametric techniques produce satisfactory calibrations with no significant differences among them when the time series of ln(V0)'s are smoothed and interpolated with median and mean moving window filters.
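The basic least-squares variant is easy to try. The sketch below uses synthetic data (the true τ, intercept, noise level, and air-mass range are illustrative assumptions): fitting ln(V) against m recovers ln(V0) as the intercept and τ as the negative slope.

```python
# Minimal least-squares Langley calibration sketch on synthetic data:
# ln(V) = ln(V0) - tau*m, so a straight-line fit gives both parameters.
import numpy as np

tau_true, lnV0_true = 0.12, 2.3
m = np.linspace(1.0, 6.0, 50)                 # air mass over one morning
lnV = lnV0_true - tau_true * m \
      + np.random.default_rng(1).normal(0, 0.01, m.size)   # measurement noise

slope, intercept = np.polyfit(m, lnV, 1)      # ordinary least squares
V0_est, tau_est = np.exp(intercept), -slope
print(f"V0 ~ {V0_est:.3f}, tau ~ {tau_est:.4f}")

# With V0 calibrated, any later measurement V at air mass m gives
# tau = (ln(V0) - ln(V)) / m.
```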
Panesar, Sukhmeet S; Rao, Christopher; Vecht, Joshua A; Mirza, Saqeb B; Netuveli, Gopalakrishnan; Morris, Richard; Rosenthal, Joe; Darzi, Ara; Athanasiou, Thanos
2009-10-01
Meta-analyses may be prone to generating misleading results because of a paucity of experimental studies (especially in surgery); publication bias; and heterogeneity in study design, intervention and the patient population of included studies. When investigating a specific clinical or scientific question on which several relevant meta-analyses may have been published, value judgments must be applied to determine which analysis represents the most robust evidence. These value judgments should be specifically acknowledged. We designed the Veritas plot to explicitly explore important elements of quality and to facilitate decision-making by highlighting specific areas in which meta-analyses are found to be deficient. Furthermore, as a graphic tool, it may be more intuitive than similar data presented in a tabular or text format. The Veritas plot is an adaptation of the radar plot, a graphic tool for the description of multiattribute data. Key elements of meta-analytical quality such as heterogeneity, publication bias and study design are assessed. Existing qualitative methods such as the Assessment of Multiple Systematic Reviews (AMSTAR) tool have been incorporated, in addition to important considerations when interpreting surgical meta-analyses such as the year of publication and population characteristics. To demonstrate the potential of the Veritas plot to inform clinical practice, we apply it to the meta-analytical literature comparing the incidence of 30-day stroke in off-pump and conventional coronary artery bypass surgery. We demonstrate that a visually stimulating and practical evidence-synthesis tool can direct the clinician and scientist to a particular meta-analytical study to inform clinical practice. The Veritas plot is also cumulative and allowed us to assess the quality of evidence over time. We have presented a practical graphic application for scientists and clinicians to identify and interpret
Splatterplots: overcoming overdraw in scatter plots.
Mayorga, Adrian; Gleicher, Michael
2013-09-01
We introduce Splatterplots, a novel presentation of scattered data that enables visualizations that scale beyond standard scatter plots. Traditional scatter plots suffer from overdraw (overlapping glyphs) as the number of points per unit area increases. Overdraw obscures outliers, hides data distributions, and makes the relationship among subgroups of the data difficult to discern. To address these issues, Splatterplots abstract away information such that the density of data shown in any unit of screen space is bounded, while allowing continuous zoom to reveal abstracted details. Abstraction automatically groups dense data points into contours and samples remaining points. We combine techniques for abstraction with perceptually based color blending to reveal the relationship between data subgroups. The resulting visualizations represent the dense regions of each subgroup of the data set as smooth closed shapes and show representative outliers explicitly. We present techniques that leverage the GPU for Splatterplot computation and rendering, enabling interaction with massive data sets. We show how Splatterplots can be an effective alternative to traditional methods of displaying scatter data, communicating data trends, outliers, and data set relationships much like traditional scatter plots but scaling to data sets of higher density and up to millions of points on the screen.
PLOT3D- DRAWING THREE DIMENSIONAL SURFACES
NASA Technical Reports Server (NTRS)
Canright, R. B.
1994-01-01
PLOT3D is a package of programs to draw three-dimensional surfaces of the form z = f(x,y). The function f and the boundary values for x and y are the input to PLOT3D. The surface thus defined may be drawn after arbitrary rotations. However, it is designed to draw only functions in rectangular coordinates expressed explicitly in the above form. It cannot, for example, draw a sphere. Output is by off-line incremental plotter or online microfilm recorder. This package, unlike other packages, will plot any function of the form z = f(x,y) and portrays continuous and bounded functions of two independent variables. With curve fitting, however, it can draw experimental data and pictures which cannot be expressed in the above form. The method used is division into a uniform rectangular grid of the given x and y ranges. The values of the supplied function at the grid points (x, y) are calculated and stored; this defines the surface. The surface is portrayed by connecting successive (y,z) points with straight-line segments for each x value on the grid and, in turn, connecting successive (x,z) points for each fixed y value on the grid. These lines are then projected by parallel projection onto the fixed yz-plane for plotting. This program has been implemented on the IBM 360/67 with on-line CDC microfilm recorder.
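The drawing method described above is straightforward to reproduce with modern tools. The sketch below is a loose re-creation under assumptions of my own (an arbitrary test function and arbitrary view angles): it evaluates z = f(x,y) on a uniform grid, rotates the points, and parallel-projects them by simply discarding the depth coordinate, using matplotlib only as a line-segment plotter.

```python
# Wireframe sketch of the z = f(x, y) method: grid evaluation, rotation,
# then parallel projection (drop the depth coordinate) before plotting.
import numpy as np
import matplotlib.pyplot as plt

def f(x, y):
    return np.sin(np.hypot(x, y))          # any explicit surface z = f(x, y)

x = np.linspace(-3, 3, 30)
y = np.linspace(-3, 3, 30)
X, Y = np.meshgrid(x, y)
Z = f(X, Y)

# Rotate about the z-axis, then tilt about the x-axis (arbitrary view angles).
az, el = np.radians(35), np.radians(60)
Xr = X * np.cos(az) - Y * np.sin(az)
Yr = X * np.sin(az) + Y * np.cos(az)
Zp = Yr * np.sin(el) + Z * np.cos(el)      # height after tilt (kept)
# The depth coordinate Yr*cos(el) - Z*sin(el) is discarded: parallel projection.

for i in range(X.shape[0]):
    plt.plot(Xr[i, :], Zp[i, :], 'k', lw=0.5)   # lines of constant y
for j in range(X.shape[1]):
    plt.plot(Xr[:, j], Zp[:, j], 'k', lw=0.5)   # lines of constant x
plt.axis('equal'); plt.axis('off'); plt.show()
```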
SpectraPlot.com: Integrated spectroscopic modeling of atomic and molecular gases
NASA Astrophysics Data System (ADS)
Goldenstein, Christopher S.; Miller, Victor A.; Mitchell Spearrin, R.; Strand, Christopher L.
2017-10-01
SpectraPlot is a web-based application for simulating spectra of atomic and molecular gases. At the time this manuscript was written, SpectraPlot consisted of four primary tools for calculating: (1) atomic and molecular absorption spectra, (2) atomic and molecular emission spectra, (3) transition linestrengths, and (4) blackbody emission spectra. These tools currently employ the NIST ASD, HITRAN2012, and HITEMP2010 databases to perform line-by-line simulations of spectra. SpectraPlot employs a modular, integrated architecture, enabling multiple simulations across multiple databases and/or thermodynamic conditions to be visualized in an interactive plot window. The primary objective of this paper is to describe the architecture and spectroscopic models employed by SpectraPlot in order to provide its users with the knowledge required to understand the capabilities and limitations of simulations performed using SpectraPlot. Further, this manuscript discusses the accuracy of several underlying approximations used to decrease computational time, in particular, the use of far-wing cutoff criteria.
Standardized mean differences cause funnel plot distortion in publication bias assessments.
Zwetsloot, Peter-Paul; Van Der Naald, Mira; Sena, Emily S; Howells, David W; IntHout, Joanna; De Groot, Joris Ah; Chamuleau, Steven Aj; MacLeod, Malcolm R; Wever, Kimberley E
2017-09-08
Meta-analyses are increasingly used for synthesis of evidence from biomedical research, and often include an assessment of publication bias based on visual or analytical detection of asymmetry in funnel plots. We studied the influence of different normalisation approaches, sample size and intervention effects on funnel plot asymmetry, using empirical datasets and illustrative simulations. We found that funnel plots of the Standardized Mean Difference (SMD) plotted against the standard error (SE) are susceptible to distortion, leading to overestimation of the existence and extent of publication bias. Distortion was more severe when the primary studies had a small sample size and when an intervention effect was present. We show that using the Normalised Mean Difference measure as effect size (when possible), or plotting the SMD against a sample size-based precision estimate, are more reliable alternatives. We conclude that funnel plots using the SMD in combination with the SE are unsuitable for publication bias assessments and can lead to false-positive results.
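The distortion mechanism can be demonstrated in a few lines: the standard error of an SMD is itself a function of the estimated effect, so chance extreme effects in small studies inflate their own SEs and skew the funnel even without publication bias. The simulation below is a sketch with made-up study sizes and a made-up true effect, using the usual large-sample SE formula for Cohen's d; it is an illustration of the phenomenon, not the paper's code.

```python
# Simulate unbiased studies and compare SMD-vs-SE with SMD-vs-sqrt(n) funnels.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(2)
true_d, n_studies = 1.0, 300
n = rng.integers(4, 30, n_studies)            # small group sizes per study

smd, se, prec = [], [], []
for ni in n:
    a = rng.normal(true_d, 1.0, ni)           # treatment group
    b = rng.normal(0.0, 1.0, ni)              # control group
    sp = np.sqrt(((ni - 1) * a.var(ddof=1) + (ni - 1) * b.var(ddof=1))
                 / (2 * ni - 2))              # pooled SD
    d = (a.mean() - b.mean()) / sp            # Cohen's d
    smd.append(d)
    se.append(np.sqrt(2 / ni + d**2 / (4 * ni)))   # SE depends on d itself
    prec.append(np.sqrt(2 * ni))              # sample-size-based precision

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 4))
ax1.scatter(smd, se, s=8); ax1.invert_yaxis()
ax1.set(xlabel="SMD", ylabel="SE", title="distorted funnel")
ax2.scatter(smd, prec, s=8)
ax2.set(xlabel="SMD", ylabel="sqrt(total n)", title="less distorted")
plt.tight_layout(); plt.show()
```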
Pilot Inventory of FIA plots traditionally called `nonforest'
Rachel Riemann
2003-01-01
Forest-inventory data were collected on plots defined as 'nonforest' by the USDA Forest Service's Forest Inventory and Analysis (FIA) unit. Nonforest plots may have trees on them, but they do not fit FIA's definition of forest because the area covered by trees is too small, too sparsely populated by trees, too narrow (e.g., trees between fields or in the middle of a...
Dynamic probability control limits for risk-adjusted CUSUM charts based on multiresponses.
Zhang, Xiang; Loda, Justin B; Woodall, William H
2017-07-20
For a patient who has survived a surgery, there could be several levels of recovery. Thus, it is reasonable to consider more than two outcomes when monitoring surgical outcome quality. The risk-adjusted cumulative sum (CUSUM) chart based on multiresponses has been developed for monitoring a surgical process with three or more outcomes. However, there is a significant effect of varying risk distributions on the in-control performance of the chart when constant control limits are applied. To overcome this disadvantage, we apply dynamic probability control limits to the risk-adjusted CUSUM charts for multiresponses. The simulation results demonstrate that the in-control performance of the charts with dynamic probability control limits can be controlled for different patient populations because these limits are determined for each specific sequence of patients. Thus, the use of dynamic probability control limits for risk-adjusted CUSUM charts based on multiresponses allows each chart to be designed for the corresponding patient sequence of a surgeon or a hospital and therefore does not require estimating or monitoring the patients' risk distribution. Copyright © 2017 John Wiley & Sons, Ltd.
From fuzzy recurrence plots to scalable recurrence networks of time series
NASA Astrophysics Data System (ADS)
Pham, Tuan D.
2017-04-01
Recurrence networks, which are derived from recurrence plots of nonlinear time series, enable the extraction of hidden features of complex dynamical systems. Because fuzzy recurrence plots are represented as grayscale images, this paper presents a variety of texture features that can be extracted from fuzzy recurrence plots. Based on the notion of fuzzy recurrence plots, defuzzified, undirected, and unweighted recurrence networks are introduced. Network measures can be computed for defuzzified recurrence networks that are scalable to meet the demand for the network-based analysis of big data.
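As a concrete starting point, the sketch below builds a crisp (non-fuzzy) recurrence matrix from a delay embedding and reuses it as the adjacency matrix of an undirected, unweighted network. The embedding parameters and threshold are illustrative assumptions; the fuzzy variant the paper describes would replace the hard threshold with fuzzy-cluster membership grades before defuzzification.

```python
# From a time series to a recurrence matrix to a simple recurrence network.
import numpy as np

def recurrence_matrix(x, dim=3, delay=2, eps=0.5):
    """Crisp recurrence matrix: R[i, j] = 1 if the embedded states i and j
    lie within distance eps of each other."""
    n = len(x) - (dim - 1) * delay
    emb = np.column_stack([x[i * delay: i * delay + n] for i in range(dim)])
    dists = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    return (dists <= eps).astype(int)

rng = np.random.default_rng(3)
x = np.sin(np.linspace(0, 20 * np.pi, 500)) + 0.1 * rng.standard_normal(500)

R = recurrence_matrix(x)
A = R.copy()
np.fill_diagonal(A, 0)            # adjacency matrix: drop self-recurrences
degree = A.sum(axis=1)            # a first, simple network measure
print(degree.mean())
```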
"Delta Plots"--A New Way to Visualize Electronic Excitation.
ERIC Educational Resources Information Center
Morrison, Harry; And Others
1985-01-01
Presents procedures for obtaining and examples of delta plots (a way of illustrating electron density changes associated with electronic excitation). These plots are pedagogically useful for visualizing simple and complex transitions and provide a way of "seeing" the origin of highest occupied molecular orbital (HOMO)-dictated carbonyl…
Igloo-Plot: a tool for visualization of multidimensional datasets.
Kuntal, Bhusan K; Ghosh, Tarini Shankar; Mande, Sharmila S
2014-01-01
Advances in science and technology have resulted in an exponential growth of multivariate (or multi-dimensional) datasets which are being generated from various research areas especially in the domain of biological sciences. Visualization and analysis of such data (with the objective of uncovering the hidden patterns therein) is an important and challenging task. We present a tool, called Igloo-Plot, for efficient visualization of multidimensional datasets. The tool addresses some of the key limitations of contemporary multivariate visualization and analysis tools. The visualization layout, not only facilitates an easy identification of clusters of data-points having similar feature compositions, but also the 'marker features' specific to each of these clusters. The applicability of the various functionalities implemented herein is demonstrated using several well studied multi-dimensional datasets. Igloo-Plot is expected to be a valuable resource for researchers working in multivariate data mining studies. Igloo-Plot is available for download from: http://metagenomics.atc.tcs.com/IglooPlot/. Copyright © 2014 Elsevier Inc. All rights reserved.
SEGY to ASCII Conversion and Plotting Program 2.0
Goldman, Mark R.
2005-01-01
SEGY has long been a standard format for storing seismic data and header information. Almost every seismic processing package can read and write seismic data in SEGY format. In the data processing world, however, ASCII format is the 'universal' standard format. Very few general-purpose plotting or computation programs will accept data in SEGY format. The software presented in this report, referred to as SEGY to ASCII (SAC), converts seismic data written in SEGY format (Barry et al., 1975) to an ASCII data file, and then creates a postscript file of the seismic data using a general plotting package (GMT, Wessel and Smith, 1995). The resulting postscript file may be plotted by any standard postscript plotting program. There are two versions of SAC: one version for plotting a SEGY file that contains a single gather, such as a stacked CDP or migrated section, and a second version for plotting multiple gathers from a SEGY file containing more than one gather, such as a collection of shot gathers. Note that if a SEGY file has multiple gathers, then each gather must have the same number of traces per gather, and each trace must have the same sample interval and number of samples per trace. SAC will read several common standards of SEGY data, including SEGY files with sample values written in either IBM or IEEE floating-point format. In addition, utility programs are present to convert non-standard Seismic Unix (.sux) SEGY files and PASSCAL (.rsy) SEGY files to standard SEGY files. SAC allows complete user control over all plotting parameters including label size and font, tick mark intervals, trace scaling, and the inclusion of a title and descriptive text. SAC shell scripts create a postscript image of the seismic data in vector rather than bitmap format, using GMT's pswiggle command. Although this can produce a very large postscript file, the image quality is generally superior to that of a bitmap image, and commercial programs such as Adobe Illustrator
Cumulative Social Risk and Obesity in Early Childhood
Duarte, Cristiane S.; Chambers, Earle C.; Boynton-Jarrett, Renée
2012-01-01
OBJECTIVES: The goal of this study was to examine the relationship between cumulative social adversity and childhood obesity among preschool-aged children (N = 1605) in the Fragile Families and Child Wellbeing Study. METHODS: Maternal reports of intimate partner violence, food insecurity, housing insecurity, maternal depressive symptoms, maternal substance use, and father’s incarceration were obtained when the child was 1 and 3 years of age. Two cumulative social risk scores were created by summing the 6 factors assessed at ages 1 and 3 years. Child height and weight were measured at 5 years of age. Logistic regression models stratified according to gender were used to estimate the association between cumulative social risk and obesity, adjusting for sociodemographic factors. RESULTS: Seventeen percent of children were obese at age 5 years, and 57% had at least 1 social risk factor. Adjusting for sociodemographic factors, girls experiencing high cumulative social risk (≥2 factors) at age 1 year only (odds ratio [OR]: 2.1 [95% confidence interval [CI]: 1.1–4.1]) or at 3 years only (OR: 2.2 [95% CI: 1.2–4.2]) were at increased odds of being obese compared with girls with no risk factors at either time point. Those experiencing high cumulative risk at age 1 and 3 years were not at statistically significant odds of being obese (OR: 1.9 [95% CI: 0.9–4.0]). No significant associations were noted among boys. CONCLUSIONS: There seems to be gender differences in the effects of cumulative social risk factors on the prevalence of obesity at 5 years of age. Understanding the social context of families could make for more effective preventive efforts to combat childhood obesity. PMID:22508921
Physical intelligence does matter to cumulative technological culture.
Osiurak, François; De Oliveira, Emmanuel; Navarro, Jordan; Lesourd, Mathieu; Claidière, Nicolas; Reynaud, Emanuelle
2016-08-01
Tool-based culture is not unique to humans, but cumulative technological culture is. The social intelligence hypothesis suggests that this phenomenon is fundamentally based on uniquely human sociocognitive skills (e.g., shared intentionality). An alternative hypothesis is that cumulative technological culture also crucially depends on physical intelligence, which may reflect fluid and crystallized aspects of intelligence and enables people to understand and improve the tools made by predecessors. By using a tool-making-based microsociety paradigm, we demonstrate that physical intelligence is a stronger predictor of cumulative technological performance than social intelligence. Moreover, learners' physical intelligence is critical not only in observational learning but also when learners interact verbally with teachers. Finally, we show that cumulative performance is only slightly influenced by teachers' physical and social intelligence. In sum, human technological culture needs "great engineers" to evolve regardless of the proportion of "great pedagogues." Social intelligence might play a more limited role than commonly assumed, perhaps in tool-use/making situations in which teachers and learners have to share symbolic representations. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Plotting Lightning-Stroke Data
NASA Technical Reports Server (NTRS)
Tatom, F. B.; Garst, R. A.
1986-01-01
Data on lightning-stroke locations become easier to correlate with cloudcover maps with aid of new graphical treatment. Geographic region divided by grid into array of cells. Number of lightning strokes in each cell tabulated, and value representing density of lightning strokes assigned to each cell. With contour-plotting routine, computer draws contours of lightning-stroke density for region. Shapes of contours compared directly with shapes of storm cells.
46 CFR 15.816 - Automatic radar plotting aids (ARPAs).
Code of Federal Regulations, 2014 CFR
2014-10-01
46 CFR 15.816 - Automatic radar plotting aids (ARPAs). Shipping; Coast Guard, Department of Homeland Security; Merchant Marine Officers and Seamen; Manning Requirements; Computations. Every person in the required...
Atrial fibrillation detection by heart rate variability in Poincare plot.
Park, Jinho; Lee, Sangwook; Jeon, Moongu
2009-12-11
Atrial fibrillation (AFib) is one of the prominent causes of stroke, and its risk increases with age. We need to detect AFib correctly as early as possible to avoid medical disaster, because it is likely to progress into a more serious form in a short time. A portable AFib monitoring system would be helpful to many older people, because we cannot predict when a patient will have a spasm of AFib. We analyzed heart beat variability from inter-beat intervals obtained by a wavelet-based detector. We made a Poincare plot using the inter-beat intervals. By analyzing the plot, we extracted three feature measures characterizing AFib and non-AFib: the number of clusters, the mean stepping increment of inter-beat intervals, and the dispersion of the points around a diagonal line in the plot. We divided the distribution of the number of clusters into two and calculated the mean value of the lower part by the k-means clustering method. We classified data whose number of clusters is more than one and less than this mean value as non-AFib data. In the other cases, we tried to discriminate AFib from non-AFib using a support vector machine with the other feature measures: the mean stepping increment and the dispersion of the points in the Poincare plot. We found that the Poincare plot from non-AFib data showed some pattern, while the plot from AFib data showed an irregularly irregular shape. In the case of non-AFib data, the definite pattern in the plot manifested itself with some limited number of clusters or one closely packed cluster. In the case of AFib data, the number of clusters in the plot was one or too many. We evaluated the accuracy using leave-one-out cross-validation. Mean sensitivity and mean specificity were 91.4% and 92.9%, respectively. Because pulse beats of the ventricles are less likely to be influenced by baseline wandering and noise, we used the inter-beat intervals to diagnose AFib. We visually displayed the regularity of the inter-beat intervals by way of the Poincare plot. We tried to design an
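A minimal sketch of two of the three features (not the authors' code) is shown below: from an inter-beat-interval sequence it computes the mean stepping increment between consecutive Poincare points and the dispersion of points around the diagonal. The normalisation by the mean interval is my assumption; the clustering-based feature and the SVM step are omitted.

```python
# Poincare-plot features from inter-beat (RR) intervals: points are
# (RR_t, RR_{t+1}); "stepping" is the jump between consecutive points;
# dispersion is measured perpendicular to the diagonal y = x.
import numpy as np

def poincare_features(rr):
    x, y = rr[:-1], rr[1:]                      # Poincare plot coordinates
    steps = np.hypot(np.diff(x), np.diff(y))    # jumps between successive points
    mean_step = steps.mean() / rr.mean()        # normalised stepping increment
    dispersion = (np.abs(y - x) / np.sqrt(2)).std() / rr.mean()
    return mean_step, dispersion

rng = np.random.default_rng(4)
regular = 0.8 + 0.02 * rng.standard_normal(300)     # sinus-rhythm-like RRs (s)
irregular = 0.8 + 0.15 * rng.standard_normal(300)   # AFib-like RRs (s)
print(poincare_features(regular))    # small values: ordered plot
print(poincare_features(irregular))  # large values: irregularly irregular
```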
Volcano plots in analyzing differential expressions with mRNA microarrays.
Li, Wentian
2012-12-01
A volcano plot displays unstandardized signal (e.g. log-fold-change) against noise-adjusted/standardized signal (e.g. t-statistic or −log10(p-value) from the t-test). We review the basic and interactive use of the volcano plot and its crucial role in understanding the regularized t-statistic. The joint filtering gene selection criterion based on regularized statistics has a curved discriminant line in the volcano plot, as compared to the two perpendicular lines for the "double filtering" criterion. This review attempts to provide a unifying framework for discussions on alternative measures of differential expression, improved methods for estimating variance, and visual display of a microarray analysis result. We also discuss the possibility of applying volcano plots to other fields beyond microarray.
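A basic volcano plot takes only a few lines. The sketch below uses synthetic expression data (already on a log2 scale) and illustrative significance thresholds; it plots the log fold change for each gene against −log10 of the two-sample t-test p-value.

```python
# Volcano plot sketch on synthetic log2-scale expression data.
import numpy as np
from scipy import stats
import matplotlib.pyplot as plt

rng = np.random.default_rng(5)
genes, reps = 2000, 6
ctrl = rng.normal(8, 1, (genes, reps))
trt = rng.normal(8, 1, (genes, reps))
trt[:100] += rng.normal(2, 0.3, (100, reps))   # 100 truly upregulated genes

log2fc = trt.mean(axis=1) - ctrl.mean(axis=1)  # data already on log2 scale
t, p = stats.ttest_ind(trt, ctrl, axis=1)      # per-gene two-sample t-test

plt.scatter(log2fc, -np.log10(p), s=5, alpha=0.5)
plt.axhline(-np.log10(0.01), ls='--')          # illustrative p-value cutoff
plt.axvline(1, ls='--'); plt.axvline(-1, ls='--')   # illustrative FC cutoffs
plt.xlabel("log2 fold change"); plt.ylabel("-log10 p-value")
plt.show()
```

The two dashed vertical lines and the horizontal line reproduce the "double filtering" criterion the abstract mentions; a regularized statistic would instead carve a curved boundary through the same plane.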
Model-independent plot of dynamic PET data facilitates data interpretation and model selection.
Munk, Ole Lajord
2012-02-21
When testing new PET radiotracers or new applications of existing tracers, the blood-tissue exchange and the metabolism need to be examined. However, conventional plots of measured time-activity curves from dynamic PET do not reveal the inherent kinetic information. A novel model-independent volume-influx plot (vi-plot) was developed and validated. The new vi-plot shows the time course of the instantaneous distribution volume and the instantaneous influx rate. The vi-plot visualises physiological information that facilitates model selection, and it reveals when a quasi-steady state is reached, which is a prerequisite for the use of the graphical analyses of Logan and Gjedde-Patlak. Both axes of the vi-plot have a direct physiological interpretation, and the plot shows kinetic parameters in close agreement with estimates obtained by non-linear kinetic modelling. The vi-plot is equally useful for analyses of PET data based on a plasma input function or a reference region input function. The vi-plot is a model-independent and informative plot for data exploration that facilitates the selection of an appropriate method for data analysis. Copyright © 2011 Elsevier Ltd. All rights reserved.
PLOT3D/AMES, SGI IRIS VERSION (WITHOUT TURB3D)
NASA Technical Reports Server (NTRS)
Buning, P.
1994-01-01
PLOT3D is an interactive graphics program designed to help scientists visualize computational fluid dynamics (CFD) grids and solutions. Today, supercomputers and CFD algorithms can provide scientists with simulations of such highly complex phenomena that obtaining an understanding of the simulations has become a major problem. Tools which help the scientist visualize the simulations can be of tremendous aid. PLOT3D/AMES offers more functions and features, and has been adapted for more types of computers than any other CFD graphics program. Version 3.6b+ is supported for five computers and graphic libraries. Using PLOT3D, CFD physicists can view their computational models from any angle, observing the physics of problems and the quality of solutions. As an aid in designing aircraft, for example, PLOT3D's interactive computer graphics can show vortices, temperature, reverse flow, pressure, and dozens of other characteristics of air flow during flight. As critical areas become obvious, they can easily be studied more closely using a finer grid. PLOT3D is part of a computational fluid dynamics software cycle. First, a program such as 3DGRAPE (ARC-12620) helps the scientist generate computational grids to model an object and its surrounding space. Once the grids have been designed and parameters such as the angle of attack, Mach number, and Reynolds number have been specified, a "flow-solver" program such as INS3D (ARC-11794 or COS-10019) solves the system of equations governing fluid flow, usually on a supercomputer. Grids sometimes have as many as two million points, and the "flow-solver" produces a solution file which contains density, x- y- and z-momentum, and stagnation energy for each grid point. With such a solution file and a grid file containing up to 50 grids as input, PLOT3D can calculate and graphically display any one of 74 functions, including shock waves, surface pressure, velocity vectors, and particle traces. PLOT3D's 74 functions are organized into
PLOT3D/AMES, SGI IRIS VERSION (WITH TURB3D)
NASA Technical Reports Server (NTRS)
Buning, P.
1994-01-01
PLOT3D is an interactive graphics program designed to help scientists visualize computational fluid dynamics (CFD) grids and solutions. Today, supercomputers and CFD algorithms can provide scientists with simulations of such highly complex phenomena that obtaining an understanding of the simulations has become a major problem. Tools which help the scientist visualize the simulations can be of tremendous aid. PLOT3D/AMES offers more functions and features, and has been adapted for more types of computers than any other CFD graphics program. Version 3.6b+ is supported for five computers and graphic libraries. Using PLOT3D, CFD physicists can view their computational models from any angle, observing the physics of problems and the quality of solutions. As an aid in designing aircraft, for example, PLOT3D's interactive computer graphics can show vortices, temperature, reverse flow, pressure, and dozens of other characteristics of air flow during flight. As critical areas become obvious, they can easily be studied more closely using a finer grid. PLOT3D is part of a computational fluid dynamics software cycle. First, a program such as 3DGRAPE (ARC-12620) helps the scientist generate computational grids to model an object and its surrounding space. Once the grids have been designed and parameters such as the angle of attack, Mach number, and Reynolds number have been specified, a "flow-solver" program such as INS3D (ARC-11794 or COS-10019) solves the system of equations governing fluid flow, usually on a supercomputer. Grids sometimes have as many as two million points, and the "flow-solver" produces a solution file which contains density, x- y- and z-momentum, and stagnation energy for each grid point. With such a solution file and a grid file containing up to 50 grids as input, PLOT3D can calculate and graphically display any one of 74 functions, including shock waves, surface pressure, velocity vectors, and particle traces. PLOT3D's 74 functions are organized into
Cumulative Incidence of Cancer among HIV-infected Individuals in North America
Silverberg, Michael J.; Lau, Bryan; Achenbach, Chad J.; Jing, Yuezhou; Althoff, Keri N.; D’Souza, Gypsyamber; Engels, Eric A.; Hessol, Nancy; Brooks, John T.; Burchell, Ann N.; Gill, M. John; Goedert, James J.; Hogg, Robert; Horberg, Michael A.; Kirk, Gregory D.; Kitahata, Mari M.; Korthuis, Phillip T.; Mathews, William C.; Mayor, Angel; Modur, Sharada P.; Napravnik, Sonia; Novak, Richard M.; Patel, Pragna; Rachlis, Anita R.; Sterling, Timothy R.; Willig, James H.; Justice, Amy C.; Moore, Richard D.; Dubrow, Robert
2016-01-01
Background: Cancer is increasingly common among HIV patients given improved survival. Objective: To examine calendar trends in cumulative cancer incidence and hazard rate by HIV status. Design: Cohort study. Setting: North American AIDS Cohort Collaboration on Research and Design during 1996-2009. Patients: 86,620 HIV-infected and 196,987 uninfected adults. Measurements: We estimated cancer-type-specific cumulative incidence by age 75 years by HIV status and calendar era, and examined calendar trends in cumulative incidence and hazard rates. Results: Cumulative incidences (%) of cancer by age 75 (HIV+/HIV−) were: Kaposi sarcoma (KS), 4.4/0.01; non-Hodgkin's lymphoma (NHL), 4.5/0.7; lung, 3.4/2.8; anal, 1.5/0.1; colorectal, 1.0/1.5; liver, 1.1/0.4; Hodgkin lymphoma (HL), 0.9/0.1; melanoma, 0.5/0.6; and oral cavity/pharyngeal, 0.8/0.8. Among HIV-infected subjects, we observed decreasing calendar trends in cumulative incidence and hazard rate for KS and NHL. For anal, colorectal, and liver cancers, increasing cumulative incidence, but not hazard rate trends, were due to the decreasing mortality rate trend (−9% per year), allowing greater opportunity to be diagnosed with these cancer types. Despite decreasing hazard rate trends for lung cancer, HL, and melanoma, we did not observe cumulative incidence trends due to the compensating effect of the declining mortality rate on cumulative incidence. Limitations: Secular trends in screening, smoking, and viral co-infections were not evaluated. Conclusions: Our analytic approach helped disentangle the effects of improved survival and changing cancer-specific hazard rates on cumulative incidence trends among HIV patients. Cumulative cancer incidence by age 75, approximating lifetime risk in HIV patients, may have clinical utility in this population. The high cumulative incidences by age 75 for KS, NHL, and lung cancer support early and sustained ART and smoking cessation. Primary Funding Source: National Institutes of Health. PMID:26436616
SPRUCE Peat Physical and Chemical Characteristics from Experimental Plot Cores, 2012
Iversen, C. M. [Oak Ridge National Laboratory, U.S. Department of Energy, Oak Ridge, Tennessee, U.S.A.; Hanson, P. J. [Oak Ridge National Laboratory, U.S. Department of Energy, Oak Ridge, Tennessee, U.S.A.; Brice, D. J. [Oak Ridge National Laboratory, U.S. Department of Energy, Oak Ridge, Tennessee, U.S.A.; Phillips, J. R. [Oak Ridge National Laboratory, U.S. Department of Energy, Oak Ridge, Tennessee, U.S.A.; McFarlane, K. J. [Oak Ridge National Laboratory, U.S. Department of Energy, Oak Ridge, Tennessee, U.S.A.; Hobbie, E. A. [Oak Ridge National Laboratory, U.S. Department of Energy, Oak Ridge, Tennessee, U.S.A.; Kolka, R. K. [Oak Ridge National Laboratory, U.S. Department of Energy, Oak Ridge, Tennessee, U.S.A.
2012-01-01
This data set reports the results of physical and chemical analyses of peat core samples from the SPRUCE experimental study plots located in the S1-Bog. On August 13-15, 2012, a team of SPRUCE investigators and collaborators collected core samples of peat in the SPRUCE experimental plots. The goal was to characterize the biological, physical, and chemical characteristics of peat, and how those characteristics changed throughout the depth profile of the bog, prior to the initialization of the SPRUCE experimental warming and CO2 treatments. Cores were collected from 16 experimental plots; samples were collected from the hummock and hollow surfaces to depths of 200-300 cm in defined increments. Three replicate cores were collected from both hummock and hollow locations in each plot. The coring locations within each plot were mapped
An evaluation paradigm for cumulative impact analysis
NASA Astrophysics Data System (ADS)
Stakhiv, Eugene Z.
1988-09-01
Cumulative impact analysis is examined from a conceptual decision-making perspective, focusing on its implicit and explicit purposes as suggested within the policy and procedures for environmental impact analysis of the National Environmental Policy Act of 1969 (NEPA) and its implementing regulations. In this article it is also linked to different evaluation and decision-making conventions, contrasting a regulatory context with a comprehensive planning framework. The specific problems that make the application of cumulative impact analysis a virtually intractable evaluation requirement are discussed in connection with the federal regulation of wetlands uses. The relatively familiar US Army Corps of Engineers' (the Corps) permit program, in conjunction with the Environmental Protection Agency's (EPA) responsibilities in managing its share of the Section 404 regulatory program requirements, is used throughout as the realistic context for highlighting certain pragmatic evaluation aspects of cumulative impact assessment. To understand the purposes of cumulative impact analysis (CIA), a key distinction must be made between the implied comprehensive and multiobjective evaluation purposes of CIA, promoted through the principles and policies contained in NEPA, and the more commonly conducted and limited assessment of cumulative effects (ACE), which focuses largely on the ecological effects of human actions. Based on current evaluation practices within the Corps' and EPA's permit programs, it is shown that the commonly used screening approach to regulating wetlands uses is not compatible with the purposes of CIA, nor is the environmental impact statement (EIS) an appropriate vehicle for evaluating the variety of objectives and trade-offs needed as part of CIA. A heuristic model that incorporates the basic elements of CIA is developed, including the idea of trade-offs among social, economic, and environmental protection goals carried out within the context of environmental
Extended quantification of the generalized recurrence plot
NASA Astrophysics Data System (ADS)
Riedl, Maik; Marwan, Norbert; Kurths, Jürgen
2016-04-01
The generalized recurrence plot is a modern tool for the quantification of complex spatial patterns. Its applications span the analysis of trabecular bone structures, Turing structures, turbulent spatial plankton patterns, and fractals. It is also successfully applied to the description of spatio-temporal dynamics and the detection of regime shifts, such as in the complex Ginzburg-Landau equation. The recurrence-plot-based determinism is a central measure in this framework, quantifying the level of regularity in temporal and spatial structures. We extend this measure for the generalized recurrence plot by considering symmetry operations beyond simple translation. It is tested not only on two-dimensional regular patterns and noise but also on complex spatial patterns reconstructing the parameter space of the complex Ginzburg-Landau equation. The extended version of the determinism yields values consistent with the original recurrence plot approach. Furthermore, the proposed method allows the determinism to be split into parts based on laminar and non-laminar regions of the two-dimensional pattern of the complex Ginzburg-Landau equation. A comparison of these parts with a standard method of image classification, the co-occurrence matrix approach, shows differences especially in the description of patterns associated with turbulence. In that case, it seems that the extended version of the determinism allows a distinction between phase turbulence and defect turbulence by means of their spatial patterns. This ability of the proposed method promises new insights into other systems with turbulent dynamics coming from climatology, biology, ecology, and social sciences, for example.
Polar plot representation of time-resolved fluorescence.
Eichorst, John Paul; Wen Teng, Kai; Clegg, Robert M
2014-01-01
Measuring changes in a molecule's fluorescence emission is a common technique to study complex biological systems such as cells and tissues. Although the steady-state fluorescence intensity is frequently used, measuring the average amount of time that a molecule spends in the excited state (the fluorescence lifetime) reveals more detailed information about its local environment. The lifetime is measured in the time domain by directly detecting the decay of fluorescence following excitation by a short pulse of light. The lifetime can also be measured in the frequency domain by recording the phase and amplitude of oscillation in the emitted fluorescence of the sample in response to repetitively modulated excitation light. In either the time or frequency domain, the analysis of data to extract lifetimes can be computationally intensive. For example, a variety of iterative fitting algorithms already exist to determine lifetimes from samples that contain multiple fluorescing species. However, a recently introduced method of analysis, the polar plot (or phasor plot), is a graphical tool that projects the time-dependent features of the sample's fluorescence in either the time or frequency domain into the Cartesian plane to characterize the sample's lifetime. The coordinate transformations of the polar plot require only the raw data, and hence there are no uncertainties from extensive corrections or time-consuming fitting in this analysis. In this chapter, the history and mathematical background of the polar plot are presented along with examples that highlight how it can be used in both cuvette-based and imaging applications.
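For a single-exponential decay the phasor coordinates have a closed form, which makes a small numerical check possible. The sketch below, with illustrative decay parameters and modulation frequency, computes g and s by direct integration of a decay and compares them with the analytic point on the universal semicircle, g = 1/(1 + (ωτ)²) and s = ωτ/(1 + (ωτ)²).

```python
# Phasor (polar plot) coordinates of a fluorescence decay at frequency omega.
import numpy as np

def phasor(t, I, omega):
    """Project a decay I(t) onto (g, s) at modulation frequency omega."""
    g = np.trapz(I * np.cos(omega * t), t) / np.trapz(I, t)
    s = np.trapz(I * np.sin(omega * t), t) / np.trapz(I, t)
    return g, s

omega = 2 * np.pi * 80e6                 # 80 MHz modulation (rad/s), illustrative
t = np.linspace(0, 50e-9, 5000)          # 50 ns acquisition window
tau = 2.5e-9                             # 2.5 ns lifetime, illustrative
g, s = phasor(t, np.exp(-t / tau), omega)

wt = omega * tau                         # analytic single-exponential phasor
print((g, s), (1 / (1 + wt**2), wt / (1 + wt**2)))   # should nearly agree
```

Mixtures of lifetimes fall inside the semicircle on the line joining their component phasors, which is what makes the plot useful without any fitting.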
NASA Astrophysics Data System (ADS)
Alvarez, Diego A.; Uribe, Felipe; Hurtado, Jorge E.
2018-02-01
Random set theory is a general framework which comprises uncertainty in the form of probability boxes, possibility distributions, cumulative distribution functions, Dempster-Shafer structures or intervals; in addition, the dependence between the input variables can be expressed using copulas. In this paper, the lower and upper bounds on the probability of failure are calculated by means of random set theory. In order to accelerate the calculation, a well-known and efficient probability-based reliability method known as subset simulation is employed. This method is especially useful for finding small failure probabilities in both low- and high-dimensional spaces, disjoint failure domains and nonlinear limit state functions. The proposed methodology represents a drastic reduction of the computational labor implied by plain Monte Carlo simulation for problems defined with a mixture of representations for the input variables, while delivering similar results. Numerical examples illustrate the efficiency of the proposed approach.
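Stripped of the subset-simulation acceleration, the bounding idea can be sketched with plain Monte Carlo: each sampled focal element is an interval box, counted toward the lower probability bound only if it lies entirely in the failure domain and toward the upper bound if it merely touches it. The limit-state function, input distribution, and interval half-width below are illustrative assumptions, not the paper's example.

```python
# Plain-Monte-Carlo sketch of lower/upper failure-probability bounds for
# interval-valued (random set) inputs.
import numpy as np
from itertools import product

rng = np.random.default_rng(6)

def g(x1, x2):
    return 5.0 - x1 - x2              # failure domain: g(x) <= 0

n, half_width = 20000, 0.3            # samples and interval half-width
centers = rng.normal(0.0, 1.0, (n, 2))    # aleatory part of each input

lower_hits = upper_hits = 0
for c1, c2 in centers:
    # Evaluate g at the corners of the focal box (enough for a monotone g).
    vals = [g(c1 + s1 * half_width, c2 + s2 * half_width)
            for s1, s2 in product((-1, 1), repeat=2)]
    if max(vals) <= 0:                # whole box fails -> certain failure
        lower_hits += 1
    if min(vals) <= 0:                # box touches failure -> possible failure
        upper_hits += 1

print(lower_hits / n, upper_hits / n)     # [P_lower, P_upper]
```

Subset simulation replaces the brute-force sampling of `centers` with a sequence of conditional levels, which is what makes the small probabilities in the paper tractable.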
An ecoinformatics application for forest dynamics plot data management and sharing
Chau-Chin Lin; Abd Rahman Kassim; Kristin Vanderbilt; Donald Henshaw; Eda C. Melendez-Colom; John H. Porter; Kaoru Niiyama; Tsutomu Yagihashi; Sek Aun Tan; Sheng-Shan Lu; Chi-Wen Hsiao; Li-Wan Chang; Meei-Ru Jeng
2011-01-01
Several forest dynamics plot research projects in the East-Asia Pacific region of the International Long-Term Ecological Research network actively collect long-term data, and some of these large plots are members of the Center for Tropical Forest Science network. The wealth of forest plot data presents challenges in information management to researchers. In order to...
A Universal Graph Plotting Routine.
ERIC Educational Resources Information Center
Bogart, Theodore F., Jr.
1984-01-01
Presents a programming subroutine which will create a graphical plot that occupies any number of columns specified by the user and will run with versions of the BASIC programming language. Illustrations of the subroutine's ability to operate successfully for three possibilities (negative values, positive values, and both positive and negative values) are…
Triangular Plots and Spreadsheet Software.
ERIC Educational Resources Information Center
Holm, Paul Eric
1988-01-01
Describes how the limitations of the built-in graphics capabilities of spreadsheet software can be overcome by making full use of the flexibility of the graphics options. Uses triangular plots with labeled field boundaries produced using Lotus 1-2-3 to demonstrate these techniques and their use in teaching geology. (CW)
SPRUCE Porewater Chemistry Data for Experimental Plots Beginning in 2013
Griffiths, N. A. [Center, Oak Ridge National Laboratory, U.S. Department of Energy, Oak Ridge, Tennessee, U.S.A.; Sebestyen, S. D. [Center, Oak Ridge National Laboratory, U.S. Department of Energy, Oak Ridge, Tennessee, U.S.A.
2016-01-01
This data set reports the chemistry of porewater in the SPRUCE plots located in the S1 bog. Sample collection and analyses started in August of 2013 and will continue for the duration of the experiment. Results will be added to this data set and released to the public periodically as quality assurance and publication of results are accomplished. These data are the pre- and post-treatment data from the warming and elevated CO2 treatments associated with the SPRUCE experiment. There are 10 experimental plots in SPRUCE: 5 temperature treatments (+0, +2.25, +4.5, +6.75, +9°C) at ambient CO2, and the same 5 temperature treatments at elevated CO2 (+500 ppm). There are 7 additional ambient plots without experimental enclosures, and thus a total of 17 plots.
Unbalanced and Minimal Point Equivalent Estimation Second-Order Split-Plot Designs
NASA Technical Reports Server (NTRS)
Parker, Peter A.; Kowalski, Scott M.; Vining, G. Geoffrey
2007-01-01
Restricting the randomization of hard-to-change factors in industrial experiments is often performed by employing a split-plot design structure. From an economic perspective, these designs minimize the experimental cost by reducing the number of resets of the hard-to-change factors. In this paper, unbalanced designs are considered for cases where the subplots are relatively expensive and the experimental apparatus accommodates an unequal number of runs per whole-plot. We provide construction methods for unbalanced second-order split-plot designs that possess the equivalence estimation optimality property, providing best linear unbiased estimates of the parameters independent of the variance components. Unbalanced versions of the central composite and Box-Behnken designs are developed. For cases where the subplot cost approaches the whole-plot cost, minimal point designs are proposed and illustrated with a split-plot Notz design.
Harold S.J. Zald; Janet L. Ohmann; Heather M. Roberts; Matthew J. Gregory; Emilie B. Henderson; Robert J. McGaughey; Justin Braaten
2014-01-01
This study investigated how lidar-derived vegetation indices, disturbance history from Landsat time series (LTS) imagery, plot location accuracy, and plot size influenced accuracy of statistical spatial models (nearest-neighbor imputation maps) of forest vegetation composition and structure. Nearest-neighbor (NN) imputation maps were developed for 539,000 ha in the...
Charvat, Hadrien; Sasazuki, Shizuka; Inoue, Manami; Iwasaki, Motoki; Sawada, Norie; Shimazu, Taichi; Yamaji, Taiki; Tsugane, Shoichiro
2016-01-15
Gastric cancer is a particularly important issue in Japan, where incidence rates are among the highest observed. In this work, we provide a risk prediction model allowing the estimation of the 10-year cumulative probability of gastric cancer occurrence. The study population consisted of 19,028 individuals from the Japanese Public Health Center cohort II who were followed up from 1993 to 2009. A parametric survival model was used to assess the impact on the probability of gastric cancer of clinical and lifestyle-related risk factors in combination with serum anti-Helicobacter pylori antibody titres and pepsinogen I and pepsinogen II levels. Based on the resulting model, cumulative probability estimates were calculated and a simple risk scoring system was developed. A total of 412 cases of gastric cancer occurred during 270,854 person-years of follow-up. The final model included (besides the biological markers) age, gender, smoking status, family history of gastric cancer and consumption of highly salted food. The developed prediction model showed good predictive performance in terms of discrimination (optimism-corrected c-index: 0.768) and calibration (Nam and D'Agostino's χ² test: 14.78; p value = 0.06). Estimates of the 10-year probability of gastric cancer occurrence ranged from 0.04% (0.02, 0.1) to 14.87% (8.96, 24.14) for men and from 0.03% (0.02, 0.07) to 4.91% (2.71, 8.81) for women. In conclusion, we developed a risk prediction model for gastric cancer that combines clinical and biological markers. It might prompt individuals to modify their lifestyle habits, attend regular check-up visits or participate in screening programmes. © 2015 UICC.
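To illustrate how such a model turns a risk profile into a 10-year cumulative probability, the sketch below assumes a Weibull baseline survival with a proportional-hazards linear predictor. The coefficients, categories, and baseline parameters are hypothetical placeholders of my own, not the published model.

```python
# Sketch: 10-year cumulative event probability from a parametric
# (Weibull, proportional-hazards) survival model and a simple risk score.
import numpy as np

def ten_year_risk(lp, shape=1.2, scale=90.0, t=10.0):
    """Cumulative probability of the event by time t (years) for linear
    predictor lp: 1 - S0(t)**exp(lp), with a Weibull baseline S0.
    shape/scale are hypothetical baseline parameters."""
    S0 = np.exp(-(t / scale) ** shape)
    return 1.0 - S0 ** np.exp(lp)

# Hypothetical coefficients for the kinds of factors the abstract lists:
# age (per decade), male sex, smoking, family history, salted food,
# and a serology/pepsinogen category.
beta = dict(age10=0.5, male=0.7, smoker=0.4, family=0.3, salt=0.2, sero=1.1)
lp = (beta["age10"] * 1.5 + beta["male"] * 1 + beta["smoker"] * 1
      + beta["family"] * 0 + beta["salt"] * 1 + beta["sero"] * 1)
print(f"10-year cumulative probability ~ {ten_year_risk(lp):.2%}")
```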
Forest Practice Rules and cumulative watershed impacts in California
L. M. Reid
1999-01-01
Response to the following questions, "As currently implemented, are existing California forest practice rules effective in preventing cumulative watershed impacts, including flooding?" and "What kind of measures might improve the effectiveness of forest practices rules for avoiding forestry-related cumulative watershed impacts
Probability and Conditional Probability of Cumulative Cloud Cover for Selected Stations Worldwide.
1985-07-01
INTRODUCTION: The performance of precision-guided munition (PGM) systems may be severely compromised by the presence of clouds in the desired target area... [The remainder of this excerpt is a garbled station table; recoverable entries list Korean stations with coordinates and periods of record, e.g., Kusan, Korea, 37.90 N, 126.63 E, Aug 51-Dec 81 (no Jan 71-Dec 72), and Taegu and Tonchon, Korea, 35.90 N, 128.67 E.]
Comparison of Imputation Procedures for Replacing Denied-access Plots
Susan L. King
2005-01-01
In forest inventories, missing plots are caused by hazardous terrain, inaccessible locations, or denied access. Maryland had a large number of denied-access plots in the latest periodic inventory conducted by the Northeastern Forest Inventory and Analysis unit. The denial pattern, which can introduce error into the estimates, was investigated by dropping the 1999...
Stain Associated with Nails in Trees on Permanent Plots
Charles B. Briscoe; Benton H. Box
1959-01-01
In studies involving the measurement and subsequent remeasurement of trees, such as CFI plots or silviculture research plots, the trees are commonly identified by metal tags fastened to the trees by means of nails. In 1957 a study was begun to determine whether this practice would lead to degrade or scalable defect in the trees.
Refining FIA plot locations using LiDAR point clouds
Charlie Schrader-Patton; Greg C. Liknes; Demetrios Gatziolis; Brian M. Wing; Mark D. Nelson; Patrick D. Miles; Josh Bixby; Daniel G. Wendt; Dennis Kepler; Abbey Schaaf
2015-01-01
Forest Inventory and Analysis (FIA) plot location coordinate precision is often insufficient for use with high resolution remotely sensed data, thereby limiting the use of these plots for geospatial applications and reducing the validity of models that assume the locations are precise. A practical and efficient method is needed to improve coordinate precision. To...
Procedures for establishing and maintaining permanent plots for silvicultural and yield research.
Robert O. Curtis
1983-01-01
This paper reviews procedures for establishing and maintaining permanent plots for silvicultural and yield research; discusses purposes, sampling, and plot design; points out common errors; and makes recommendations for research plot designs and procedures for measuring and recording data.
Childhood Cumulative Risk Exposure and Adult Amygdala Volume and Function
Evans, Gary W.; Swain, James E.; King, Anthony P.; Wang, Xin; Javanbakht, Arash; Ho, S. Shaun; Angstadt, Michael; Phan, K. Luan; Xie, Hong; Liberzon, Israel
2015-01-01
Considerable work indicates that early cumulative risk exposure is aversive to human development, but very little research has examined the neurological underpinnings of these robust findings. We investigated amygdala volume and reactivity to facial stimuli among adults (M = 23.7 years, n = 54) as a function of cumulative risk exposure during childhood (ages 9 and 13). In addition, we tested whether expected cumulative-risk elevations in amygdala volume would mediate functional reactivity of the amygdala during socio-emotional processing. Risks included substandard housing quality, noise, crowding, family turmoil, child separation from family, and violence. Total and left hemisphere adult amygdala volumes, respectively, were positively related to cumulative risk exposure during childhood. The links between childhood cumulative risk exposure and elevated amygdala responses to emotionally neutral facial stimuli in adulthood were mediated by the respective amygdala volumes. Cumulative risk exposure in later adolescence (17 years), however, was unrelated to subsequent, adult amygdala volume or function. Physical and socioemotional risk exposures early in life appear to alter amygdala development, rendering adults more reactive to ambiguous stimuli such as neutral faces. These stress-related differences in childhood amygdala development might contribute to well-documented psychological distress as a function of early risk exposure. PMID:26469872
Childhood Cumulative Risk Exposure and Adult Amygdala Volume and Function.
Evans, Gary W; Swain, James E; King, Anthony P; Wang, Xin; Javanbakht, Arash; Ho, S Shaun; Angstadt, Michael; Phan, K Luan; Xie, Hong; Liberzon, Israel
2016-06-01
Considerable work indicates that early cumulative risk exposure is aversive to human development, but very little research has examined the neurological underpinnings of these robust findings. This study investigates amygdala volume and reactivity to facial stimuli among adults (mean 23.7 years of age, n = 54) as a function of cumulative risk exposure during childhood (9 and 13 years of age). In addition, we test to determine whether expected cumulative risk elevations in amygdala volume would mediate functional reactivity of the amygdala during socioemotional processing. Risks included substandard housing quality, noise, crowding, family turmoil, child separation from family, and violence. Total and left hemisphere adult amygdala volumes were positively related to cumulative risk exposure during childhood. The links between childhood cumulative risk exposure and elevated amygdala responses to emotionally neutral facial stimuli in adulthood were mediated by the corresponding amygdala volumes. Cumulative risk exposure in later adolescence (17 years of age), however, was unrelated to subsequent adult amygdala volume or function. Physical and socioemotional risk exposures early in life appear to alter amygdala development, rendering adults more reactive to ambiguous stimuli such as neutral faces. These stress-related differences in childhood amygdala development might contribute to the well-documented psychological distress as a function of early risk exposure. © 2015 Wiley Periodicals, Inc.
Goodness of fit of probability distributions for sightings as species approach extinction.
Vogel, Richard M; Hosking, Jonathan R M; Elphick, Chris S; Roberts, David L; Reed, J Michael
2009-04-01
Estimating the probability that a species is extinct and the timing of extinctions is useful in biological fields ranging from paleoecology to conservation biology. Various statistical methods have been introduced to infer the time of extinction and extinction probability from a series of individual sightings. There is little evidence, however, as to which of these models provide adequate fit to actual sighting records. We use L-moment diagrams and probability plot correlation coefficient (PPCC) hypothesis tests to evaluate the goodness of fit of various probabilistic models to sighting data collected for a set of North American and Hawaiian bird populations that have either gone extinct, or are suspected of having gone extinct, during the past 150 years. For our data, the uniform, truncated exponential, and generalized Pareto models performed moderately well, but the Weibull model performed poorly. Of the acceptable models, the uniform distribution performed best based on PPCC goodness of fit comparisons and sequential Bonferroni-type tests. Further analyses using field significance tests suggest that although the uniform distribution is the best of those considered, additional work remains to evaluate the truncated exponential model more fully. The methods we present here provide a framework for evaluating subsequent models.
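As a rough illustration of the PPCC machinery used here, the sketch below computes a generic PPCC statistic: the correlation between the ordered sample and candidate-distribution quantiles evaluated at plotting positions. The Blom-type plotting-position formula and the SciPy distributions are our own choices for the sketch, not necessarily those of the paper.

```python
import numpy as np
from scipy import stats

def ppcc(sample, dist=stats.norm):
    """Probability plot correlation coefficient: correlation between the
    ordered sample and candidate-distribution quantiles at plotting
    positions. Values near 1 indicate a good fit."""
    x = np.sort(np.asarray(sample, dtype=float))
    n = x.size
    # Blom-type plotting positions; the formula choice is a judgment call.
    p = (np.arange(1, n + 1) - 0.375) / (n + 0.25)
    return np.corrcoef(x, dist.ppf(p))[0, 1]

rng = np.random.default_rng(1)
data = rng.uniform(0, 150, size=30)   # e.g., sighting times over 150 years
print(ppcc(data, stats.uniform))      # high: uniform fits this toy sample
print(ppcc(data, stats.expon))        # lower: exponential fits it worse
```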
DISSPLA plotting routines for the G-189A EC/LS computer program
NASA Technical Reports Server (NTRS)
Simpson, C. D.
1982-01-01
Data from a G-189A execution are formatted and plotted. The plotting may be done at the time of execution of the program. DISSPLA plot packages are used. The user has the choice of FR80 or TEKTRONIX output.
Further Evidence for Geochemical Diversity, and Possible Bimodality, Among Cumulate Eucrites
NASA Astrophysics Data System (ADS)
Warren, P. H.; Kallemeyn, G. W.
1992-07-01
We have used INAA, RNAA, and fused-bead analysis to determine the bulk compositions of numerous Antarctic eucrites (and also the LEW88516 SNC meteorite). Only a few of the most unusual eucrites can be discussed in the limited space here. Takeda et al. (1988) noted that Y791195 is a slowly cooled eucrite, with an equant, medium-grained texture, and pyroxene exsolution lamellae up to 10 micrometers across. In Y791195,81-3, we find lamellae up to 14 micrometers across. In this respect, Y791195 resembles RKPA80224, in which exsolution lamellae reach up to 12 micrometers across. We have previously discussed the evidence that RKPA80224 is a mildly accumulative rock that formed from an unusually low-mg parent melt. Our second analysis of RKPA80224 only partly confirms the unusually low incompatible trace element (ITE) content, but the Ce anomaly is consistently small (Ce/La = 0.90-1.02 x CI), and based on a weighted mean composition the implied parent melt is still unlike any noncumulate eucrite (see Fig. 1, which shows results from mass balance calculations modeling the sample as a mixture of cumulus px and plag, plus trapped melt). A parent melt similar to an extreme low-mg variant of the "Nuevo Laredo Trend" would plausibly account for RKPA80224. The spectrum of possible parents for Y791195 is similar, even though its "true" Sm content is slightly obscured by weathering (Ce/La = 1.4 x CI). The [Sm] used in the figure is scaled to the highest CI-normalized REE concentration. Data of Mittlefehldt and Lindstrom (1991) indicate that except for exterior samples "showing extreme rustiness," Sm even in weathered eucrites is generally not altered beyond a few tens of percent relative (sample size seems to account for more of the variation in [Sm] among interior, non-rusty samples). Even assuming a Sm content twice that assumed in the figure, the parent melt still must be well to the low-MgO/FeO, low-Sm side of all known eucrites. The LEW87002 eucrite is brecciated, but probably
Cumulative Damage Model for Advanced Composite Materials.
1984-03-09
[This record's text is OCR debris from the report cover and reference list. The recoverable identification is: Cumulative Damage Model for Advanced Composite Materials, Phase II; P. C. Chou et al.; Dyna East Corp., Philadelphia, PA; 9 March 1984; report no. AFWAL-TR-84-4004. A legible reference fragment: Masters, J. L., "Investigation of Characteristic Damage States in Composite Laminates," ASME Paper No. 79-WA-AERO-4, 1978.]
Boundary point corrections for variable radius plots - simulation results
Margaret Penner; Sam Otukol
2000-01-01
The boundary plot problem is encountered when a forest inventory plot includes two or more forest conditions. Depending on the correction method used, the resulting estimates can be biased. The various correction alternatives are reviewed. No correction, area correction, half sweep, and toss-back methods are evaluated using simulation on an actual data set. Based on...
Effects of plot size on forest-type algorithm accuracy
James A. Westfall
2009-01-01
The Forest Inventory and Analysis (FIA) program utilizes an algorithm to consistently determine the forest type for forested conditions on sample plots. Forest type is determined from tree size and species information. Thus, the accuracy of results is often dependent on the number of trees present, which is highly correlated with plot area. This research examines the...
Estimating mapped-plot forest attributes with ratios of means
S.J. Zarnoch; W.A. Bechtold
2000-01-01
The mapped-plot design utilized by the U.S. Department of Agriculture (USDA) Forest Inventory and Analysis and the National Forest Health Monitoring Programs is described. Data from 2458 forested mapped plots systematically spread across 25 States reveal that 35 percent straddle multiple conditions. The ratio-of-means estimator is developed as a method to obtain...
Considerations in Forest Growth Estimation Between Two Measurements of Mapped Forest Inventory Plots
Michael T. Thompson
2006-01-01
Several aspects of the enhanced Forest Inventory and Analysis (FIA) program's national plot design complicate change estimation. The design incorporates up to three separate plot sizes (microplot, subplot, and macroplot) to sample trees of different sizes. Because multiple plot sizes are involved, change estimators designed for polyareal plot sampling, such as those...
A Framework for Treating Cumulative Trauma with Art Therapy
ERIC Educational Resources Information Center
Naff, Kristina
2014-01-01
Cumulative trauma is relatively undocumented in art therapy practice, although there is growing evidence that art therapy provides distinct benefits for resolving various traumas. This qualitative study proposes an art therapy treatment framework for cumulative trauma derived from semi-structured interviews with three art therapists and artistic…
Automatic Target Recognition Based on Cross-Plot
Wong, Kelvin Kian Loong; Abbott, Derek
2011-01-01
Automatic target recognition that relies on rapid feature extraction of a real-time target from photo-realistic imaging will enable efficient identification of target patterns. To achieve this objective, Cross-plots of binary patterns are explored as potential signatures for the observed target, capturing the crucial spatial features at high speed using minimal computational resources. Target recognition was implemented based on the proposed pattern recognition concept and tested rigorously for its precision and recall performance. We conclude that Cross-plotting is able to produce a digital fingerprint of a target that correlates efficiently and effectively to signatures of patterns having its identity in a target repository. PMID:21980508
Cumulative Environmental Impacts: Science and Policy to Protect Communities.
Solomon, Gina M; Morello-Frosch, Rachel; Zeise, Lauren; Faust, John B
2016-01-01
Many communities are located near multiple sources of pollution, including current and former industrial sites, major roadways, and agricultural operations. Populations in such locations are predominantly low-income, with a large percentage of minorities and non-English speakers. These communities face challenges that can affect the health of their residents, including limited access to health care, a shortage of grocery stores, poor housing quality, and a lack of parks and open spaces. Environmental exposures may interact with social stressors, thereby worsening health outcomes. Age, genetic characteristics, and preexisting health conditions increase the risk of adverse health effects from exposure to pollutants. There are existing approaches for characterizing cumulative exposures, cumulative risks, and cumulative health impacts. Although such approaches have merit, they also have significant constraints. New developments in exposure monitoring, mapping, toxicology, and epidemiology, especially when informed by community participation, have the potential to advance the science on cumulative impacts and to improve decision making.
Conceptual models for cumulative risk assessment.
Linder, Stephen H; Sexton, Ken
2011-12-01
In the absence of scientific consensus on an appropriate theoretical framework, cumulative risk assessment and related research have relied on speculative conceptual models. We argue for the importance of theoretical backing for such models and discuss 3 relevant theoretical frameworks, each supporting a distinctive "family" of models. Social determinant models postulate that unequal health outcomes are caused by structural inequalities; health disparity models envision social and contextual factors acting through individual behaviors and biological mechanisms; and multiple stressor models incorporate environmental agents, emphasizing the intermediary role of these and other stressors. The conclusion is that more careful reliance on established frameworks will lead directly to improvements in characterizing cumulative risk burdens and accounting for disproportionate adverse health effects.
NASA Astrophysics Data System (ADS)
Pernot, Pascal; Savin, Andreas
2018-06-01
Benchmarking studies in computational chemistry use reference datasets to assess the accuracy of a method through error statistics. The commonly used error statistics, such as the mean signed and mean unsigned errors, do not inform end-users on the expected amplitude of prediction errors attached to these methods. We show that, the distributions of model errors being neither normal nor zero-centered, these error statistics cannot be used to infer prediction error probabilities. To overcome this limitation, we advocate for the use of more informative statistics, based on the empirical cumulative distribution function of unsigned errors, namely, (1) the probability for a new calculation to have an absolute error below a chosen threshold and (2) the maximal amplitude of errors one can expect with a chosen high confidence level. Those statistics are also shown to be well suited for benchmarking and ranking studies. Moreover, the standard error on all benchmarking statistics depends on the size of the reference dataset. Systematic publication of these standard errors would be very helpful to assess the statistical reliability of benchmarking conclusions.
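A minimal sketch of the two advocated statistics, assuming only a vector of signed model errors; the threshold and confidence level below are arbitrary illustrative choices.

```python
import numpy as np

def benchmark_stats(errors, eta, confidence=0.95):
    """(1) Empirical probability that a new calculation has absolute error
    below `eta`; (2) the error amplitude not exceeded at the chosen
    confidence level (an empirical quantile of the unsigned errors)."""
    abs_err = np.abs(np.asarray(errors, dtype=float))
    return np.mean(abs_err < eta), np.quantile(abs_err, confidence)

# Toy non-normal, non-zero-centered "model errors" (units arbitrary).
rng = np.random.default_rng(0)
errors = 0.5 + rng.gamma(2.0, 1.0, 500) * rng.choice([-1, 1], 500, p=[0.3, 0.7])
p_below, q95 = benchmark_stats(errors, eta=1.0)
print(f"P(|err| < 1.0) = {p_below:.2f}; 95% amplitude = {q95:.2f}")
```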
Cumulative Estrogen Exposure and Prospective Memory in Older Women
ERIC Educational Resources Information Center
Hesson, Jacqueline
2012-01-01
This study looked at cumulative lifetime estrogen exposure, as estimated with a mathematical index (Index of Cumulative Estrogen Exposure (ICEE)) that included variables (length of time on estrogen therapy, age at menarche and menopause, postmenopausal body mass index, time since menopause, nulliparity and duration of breastfeeding) known to…
National land cover monitoring using large, permanent photo plots
Raymond L. Czaplewski; Glenn P. Catts; Paul W. Snook
1987-01-01
A study in the State of North Carolina, U.S.A. demonstrated that large, permanent photo plots (400 hectares) can be used to monitor large regions of land by using remote sensing techniques. Estimates of area in a variety of land cover categories were made by photointerpretation of medium-scale aerial photography from a single month using 111 photo plots. Many of these...
[Heart rate variability study based on a novel RdR RR Intervals Scatter Plot].
Lu, Hongwei; Lu, Xiuyun; Wang, Chunfang; Hua, Youyuan; Tian, Jiajia; Liu, Shihai
2014-08-01
On the basis of the Poincaré scatter plot and the first-order difference scatter plot, a novel heart rate variability (HRV) analysis method based on scatter plots of RR intervals and first-order differences of RR intervals (namely, RdR) was proposed. The abscissa (x-axis) of the RdR scatter plot is the RR interval and the ordinate (y-axis) is the difference between successive RR intervals. The RdR scatter plot thus combines information on RR intervals and on differences between successive RR intervals, capturing more HRV information. By RdR scatter plot analysis of records from the MIT-BIH arrhythmia database, we found that the scatter plots of uncoupled premature ventricular contraction (PVC), coupled ventricular bigeminy, and ventricular trigeminy PVC had specific graphic characteristics. The RdR scatter plot method has higher detection performance than the Poincaré scatter plot method, and it is simpler and more intuitive than the first-order difference method.
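Since the construction is fully specified above (x = RR(n), y = RR(n+1) − RR(n)), a short sketch can reproduce it; the RR series below is invented toy data.

```python
import numpy as np
import matplotlib.pyplot as plt

def rdr_scatter(rr_ms):
    """RdR plot: x = RR(n), y = RR(n+1) - RR(n)."""
    rr = np.asarray(rr_ms, dtype=float)
    plt.scatter(rr[:-1], np.diff(rr), s=10)
    plt.xlabel("RR interval (ms)")
    plt.ylabel("Successive RR difference (ms)")
    plt.title("RdR scatter plot")
    plt.show()

# Toy RR series (ms) with one ectopic-like beat to show the plot geometry.
rdr_scatter([810, 820, 815, 805, 600, 1020, 812, 818])
```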
Experimental Garden Plots for Botany Lessons
ERIC Educational Resources Information Center
Gorodnicheva, V. V.; Vasil'eva, E. I.
1976-01-01
Discussion of the botany lessons used at two schools points out the need for fifth and sixth grade students to be taught the principles of plant life through observations made at an experimental garden plot at the school. (ND)
NASA Astrophysics Data System (ADS)
Fredj, Erick; Kohut, Josh; Roarty, Hugh; Lai, Jian-Wu
2017-04-01
The Lagrangian separation distance between the endpoints of simulated and observed drifter trajectories is often used to assess the performance of numerical particle trajectory models. However, the separation distance fails to indicate relative model performance in weak and strong current regions, such as over continental shelves and the adjacent deep ocean. A skill score described in detail by Liu and Weisberg (2011) was applied to estimate the cumulative Lagrangian separation distances normalized by the associated cumulative trajectory lengths; in contrast, the Lagrangian separation distance alone gives a misleading result. The proposed dimensionless skill score is particularly useful when the number of drifter trajectories is limited and neither a conventional Eulerian-based velocity nor a Lagrangian-based probability density function can be estimated. The skill score was used to assess the performance of the Taiwan Ocean Radar Observing System (TOROS). TOROS consists of 17 SeaSonde-type radars around Taiwan Island. The currents off Taiwan are significantly influenced by the nearby Kuroshio current. The main stream of the Kuroshio flows along the east coast of Taiwan to the north throughout the year; at times a branch current also passes the south end of Taiwan and flows north along the west coast. The Kuroshio is also prone to seasonal change in its speed of flow, current capacity, distribution width, and depth. The evaluation of the Taiwanese national HF-radar network performance using Lagrangian drifter records demonstrated the high quality and robustness of TOROS HF-radar data using a purely trajectory-based non-dimensional index. Reference: Yonggang Liu and Robert H. Weisberg, "Evaluation of trajectory modeling in different dynamic regions using normalized cumulative Lagrangian separation", Journal of Geophysical Research, Vol. 116, C09013, doi:10.1029/2010JC006837, 2011.
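A sketch of the cited skill score as defined by Liu and Weisberg (2011), to the best of our reading: the cumulative separation is normalized by the cumulative observed trajectory length, and a tolerance threshold n converts it to a score in [0, 1]; the numbers below are toy values.

```python
import numpy as np

def skill_score(separation, obs_cum_length, tolerance=1.0):
    """Normalized cumulative Lagrangian separation skill: s = sum(d_i) /
    sum(l_i), where d_i is the model-drifter separation and l_i the
    cumulative observed trajectory length at step i; the score is
    ss = 1 - s/n for s <= n (tolerance n), else 0."""
    d = np.asarray(separation, dtype=float)
    l = np.asarray(obs_cum_length, dtype=float)
    s = d.sum() / l.sum()
    return max(0.0, 1.0 - s / tolerance)

# Toy separations (km) and cumulative observed path lengths (km).
print(skill_score([0.5, 1.2, 2.0, 2.5], [3.0, 6.5, 9.8, 13.0]))
# Closer to 1 = better trajectory skill.
```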
Flyby Error Analysis Based on Contour Plots for the Cassini Tour
NASA Technical Reports Server (NTRS)
Stumpf, P. W.; Gist, E. M.; Goodson, T. D.; Hahn, Y.; Wagner, S. V.; Williams, P. N.
2008-01-01
The maneuver cancellation analysis consists of cost contour plots employed by the Cassini maneuver team. The plots are two-dimensional linear representations of a larger six-dimensional solution to a multi-maneuver, multi-encounter mission at Saturn. By using contours plotted in the B·R and B·T components, it is possible to view the effects on delta-V for various encounter positions in the B-plane. The plot is used in operations to help determine if the Approach Maneuver (ensuing encounter minus three days) and/or the Cleanup Maneuver (ensuing encounter plus three days) can be cancelled, and it also serves as a linear check of an integrated solution.
Aeronautical Engineering: A continuing bibliography, 1982 cumulative index
NASA Technical Reports Server (NTRS)
1983-01-01
This bibliography is a cumulative index to the abstracts contained in NASA SP-7037 (145) through NASA SP-7037 (156) of Aeronautical Engineering: A Continuing Bibliography. NASA SP-7037 and its supplements have been compiled through the cooperative efforts of the American Institute of Aeronautics and Astronautics (AIAA) and the National Aeronautics and Space Administration (NASA). This cumulative index includes subject, personal author, corporate source, contract, and report number indexes.
Using canopy resistance for infrared heater control when warming open-field plots
USDA-ARS?s Scientific Manuscript database
Several research groups are using or planning to use arrays of infrared heaters to simulate global warming in open-field plots with a control strategy that involves maintaining a constant rise in canopy temperatures of the heated plots above those of un-heated reference plots. However, if the warm...
Cumulative query method for influenza surveillance using search engine data.
Seo, Dong-Woo; Jo, Min-Woo; Sohn, Chang Hwan; Shin, Soo-Yong; Lee, JaeHo; Yu, Maengsoo; Kim, Won Young; Lim, Kyoung Soo; Lee, Sang-Il
2014-12-16
Internet search queries have become an important data source in syndromic surveillance systems. However, there is currently no syndromic surveillance system using Internet search query data in South Korea. The objective of this study was to examine correlations between our cumulative query method and national influenza surveillance data. Our study was based on the local search engine, Daum (approximately 25% market share), and influenza-like illness (ILI) data from the Korea Centers for Disease Control and Prevention. A quota sampling survey was conducted with 200 participants to obtain popular queries. We divided the study period into two sets: Set 1 (the 2009/10 epidemiological year for development set 1 and 2010/11 for validation set 1) and Set 2 (2010/11 for development set 2 and 2011/12 for validation set 2). Pearson's correlation coefficients were calculated between the Daum data and the ILI data for the development set. We selected the combined queries for which the correlation coefficients were .7 or higher and listed them in descending order. Then we created cumulative query methods, with n representing the number of combined queries accumulated in descending order of the correlation coefficient. In validation set 1, 13 cumulative query methods were applied, and 8 had higher correlation coefficients (min=.916, max=.943) than that of the highest single combined query. Further, 11 of 13 cumulative query methods had an r value of ≥.7, but 4 of 13 combined queries had an r value of ≥.7. In validation set 2, 8 of 15 cumulative query methods showed higher correlation coefficients (min=.975, max=.987) than that of the highest single combined query. All 15 cumulative query methods had an r value of ≥.7, but 6 of 15 combined queries had an r value of ≥.7. The cumulative query method showed relatively higher correlation with national influenza surveillance data than single combined queries in both the development and validation sets.
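The method as described lends itself to a short sketch: rank combined queries by development-set correlation with ILI (keeping r ≥ .7), then accumulate the top-n query volumes; the toy data below stand in for Daum query counts and ILI rates.

```python
import numpy as np

def build_cumulative_queries(query_series, ili, r_min=0.7):
    """Rank combined queries by Pearson correlation with ILI on a development
    set (keeping r >= r_min), then evaluate cumulative method n = correlation
    of the summed top-n query volumes with ILI."""
    corrs = {q: np.corrcoef(series, ili)[0, 1]
             for q, series in query_series.items()}
    ranked = [q for q, r in sorted(corrs.items(), key=lambda kv: -kv[1])
              if r >= r_min]
    methods = {}
    for n in range(1, len(ranked) + 1):
        combined = np.sum([query_series[q] for q in ranked[:n]], axis=0)
        methods[n] = np.corrcoef(combined, ili)[0, 1]
    return ranked, methods

rng = np.random.default_rng(2)
ili = rng.gamma(2.0, 1.0, 52)                        # stand-in weekly ILI rate
queries = {f"q{i}": ili * rng.uniform(0.5, 1.5) + rng.normal(0.0, 0.5, 52)
           for i in range(5)}                        # stand-in query volumes
ranked, methods = build_cumulative_queries(queries, ili)
print(ranked, max(methods, key=methods.get))         # best n on this toy set
```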
76 FR 82296 - Pyrethrins/Pyrethroid Cumulative Risk Assessment; Extension of Comment Period
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-30
... Cumulative Risk Assessment; Extension of Comment Period AGENCY: Environmental Protection Agency (EPA). ACTION..., 2011, concerning the availability of EPA's cumulative risk assessment for the naturally occurring... cumulative risk assessment for the pyrethroids. Based on this assessment, the EPA concluded that the...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Higginson, Drew P.
Here, we describe and justify a full-angle scattering (FAS) method to faithfully reproduce the accumulated differential angular Rutherford scattering probability distribution function (pdf) of particles in a plasma. The FAS method splits the scattering events into two regions. At small angles it is described by cumulative scattering events resulting, via the central limit theorem, in a Gaussian-like pdf; at larger angles it is described by single-event scatters and retains a pdf that follows the form of the Rutherford differential cross-section. The FAS method is verified using discrete Monte-Carlo scattering simulations run at small timesteps to include each individual scattering event. We identify the FAS regime of interest as where the ratio of temporal/spatial scale-of-interest to slowing-down time/length is from 10^-3 to 0.3–0.7; the upper limit corresponds to a Coulomb logarithm of 20–2, respectively. Two test problems, high-velocity interpenetrating plasma flows and keV-temperature ion equilibration, are used to highlight systems where including FAS is important to capture relevant physics.
Higginson, Drew P.
2017-08-12
Here, we describe and justify a full-angle scattering (FAS) method to faithfully reproduce the accumulated differential angular Rutherford scattering probability distribution function (pdf) of particles in a plasma. The FAS method splits the scattering events into two regions. At small angles it is described by cumulative scattering events resulting, via the central limit theorem, in a Gaussian-like pdf; at larger angles it is described by single-event scatters and retains a pdf that follows the form of the Rutherford differential cross-section. The FAS method is verified using discrete Monte-Carlo scattering simulations run at small timesteps to include each individual scattering event. We identify the FAS regime of interest as where the ratio of temporal/spatial scale-of-interest to slowing-down time/length is from 10^-3 to 0.3–0.7; the upper limit corresponds to a Coulomb logarithm of 20–2, respectively. Two test problems, high-velocity interpenetrating plasma flows and keV-temperature ion equilibration, are used to highlight systems where including FAS is important to capture relevant physics.
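A toy illustration of the two-region idea, not the authors' implementation: each timestep combines a Gaussian deflection (the cumulative small-angle part) with a Poisson number of rare single scatters drawn from a truncated theta^-3 Rutherford-like tail; sigma_gauss, theta_min, and rate_large are placeholder parameters.

```python
import numpy as np

rng = np.random.default_rng(3)

def fas_step(sigma_gauss, theta_min, rate_large, size):
    """One toy timestep of two-region scattering: a Gaussian deflection for
    the cumulative small-angle part, plus a Poisson number of rare single
    scatters from a truncated small-angle Rutherford tail,
    p(theta) ~ theta**-3 on [theta_min, pi]. Parameters are placeholders."""
    theta = rng.normal(0.0, sigma_gauss, size)
    n_large = rng.poisson(rate_large, size)
    for i in np.nonzero(n_large)[0]:
        for _ in range(n_large[i]):
            u = rng.random()
            # Inverse CDF of the truncated theta**-3 tail.
            inv_sq = theta_min**-2 - u * (theta_min**-2 - np.pi**-2)
            theta[i] += rng.choice([-1.0, 1.0]) * inv_sq**-0.5
    return theta

angles = fas_step(sigma_gauss=0.01, theta_min=0.05, rate_large=0.02, size=100_000)
print(np.percentile(np.abs(angles), [50, 99, 99.99]))  # Gaussian bulk, heavy tail
```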
The challenges and opportunities in cumulative effects assessment
Foley, Melissa M.; Mease, Lindley A; Martone, Rebecca G; Prahler, Erin E; Morrison, Tiffany H; Clarke Murray, Cathryn; Wojcik, Deborah
2016-01-01
The cumulative effects of increasing human use of the ocean and coastal zone have contributed to a rapid decline in ocean and coastal resources. As a result, scientists are investigating how multiple, overlapping stressors accumulate in the environment and impact ecosystems. These investigations are the foundation for the development of new tools that account for and predict cumulative effects in order to more adequately prevent or mitigate negative effects. Despite scientific advances, legal requirements, and management guidance, those who conduct assessments—including resource managers, agency staff, and consultants—continue to struggle to thoroughly evaluate cumulative effects, particularly as part of the environmental assessment process. Even though 45 years have passed since the United States National Environmental Policy Act was enacted, which set a precedent for environmental assessment around the world, defining impacts, baseline, scale, and significance are still major challenges associated with assessing cumulative effects. In addition, we know little about how practitioners tackle these challenges or how assessment aligns with current scientific recommendations. To shed more light on these challenges and gaps, we undertook a comparative study on how cumulative effects assessment (CEA) is conducted by practitioners operating under some of the most well-developed environmental laws around the globe: California, USA; British Columbia, Canada; Queensland, Australia; and New Zealand. We found that practitioners used a broad and varied definition of impact for CEA, which led to differences in how baseline, scale, and significance were determined. We also found that practice and science are not closely aligned and, as such, we highlight opportunities for managers, policy makers, practitioners, and scientists to improve environmental assessment.
The challenges and opportunities in cumulative effects assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Foley, Melissa M., E-mail: mfoley@usgs.gov; Center for Ocean Solutions, Stanford University, 99 Pacific St., Monterey, CA 93940; Mease, Lindley A., E-mail: lamease@stanford.edu
The cumulative effects of increasing human use of the ocean and coastal zone have contributed to a rapid decline in ocean and coastal resources. As a result, scientists are investigating how multiple, overlapping stressors accumulate in the environment and impact ecosystems. These investigations are the foundation for the development of new tools that account for and predict cumulative effects in order to more adequately prevent or mitigate negative effects. Despite scientific advances, legal requirements, and management guidance, those who conduct assessments—including resource managers, agency staff, and consultants—continue to struggle to thoroughly evaluate cumulative effects, particularly as part of the environmental assessment process. Even though 45 years have passed since the United States National Environmental Policy Act was enacted, which set a precedent for environmental assessment around the world, defining impacts, baseline, scale, and significance are still major challenges associated with assessing cumulative effects. In addition, we know little about how practitioners tackle these challenges or how assessment aligns with current scientific recommendations. To shed more light on these challenges and gaps, we undertook a comparative study on how cumulative effects assessment (CEA) is conducted by practitioners operating under some of the most well-developed environmental laws around the globe: California, USA; British Columbia, Canada; Queensland, Australia; and New Zealand. We found that practitioners used a broad and varied definition of impact for CEA, which led to differences in how baseline, scale, and significance were determined. We also found that practice and science are not closely aligned and, as such, we highlight opportunities for managers, policy makers, practitioners, and scientists to improve environmental assessment.
N -tag probability law of the symmetric exclusion process
NASA Astrophysics Data System (ADS)
Poncet, Alexis; Bénichou, Olivier; Démery, Vincent; Oshanin, Gleb
2018-06-01
The symmetric exclusion process (SEP), in which particles hop symmetrically on a discrete line with hard-core constraints, is a paradigmatic model of subdiffusion in confined systems. This anomalous behavior is a direct consequence of strong spatial correlations induced by the requirement that the particles cannot overtake each other. Even if this fact has been recognized qualitatively for a long time, up to now there has been no full quantitative determination of these correlations. Here we study the joint probability distribution of an arbitrary number of tagged particles in the SEP. We determine analytically its large-time limit for an arbitrary density of particles, and its full dynamics in the high-density limit. In this limit, we obtain the time-dependent large deviation function of the problem and unveil a universal scaling form shared by the cumulants.
Variable Cultural Acquisition Costs Constrain Cumulative Cultural Evolution
Mesoudi, Alex
2011-01-01
One of the hallmarks of the human species is our capacity for cumulative culture, in which beneficial knowledge and technology is accumulated over successive generations. Yet previous analyses of cumulative cultural change have failed to consider the possibility that as cultural complexity accumulates, it becomes increasingly costly for each new generation to acquire from the previous generation. In principle this may result in an upper limit on the cultural complexity that can be accumulated, at which point accumulated knowledge is so costly and time-consuming to acquire that further innovation is not possible. In this paper I first review existing empirical analyses of the history of science and technology that support the possibility that cultural acquisition costs may constrain cumulative cultural evolution. I then present macroscopic and individual-based models of cumulative cultural evolution that explore the consequences of this assumption of variable cultural acquisition costs, showing that making acquisition costs vary with cultural complexity causes the latter to reach an upper limit above which no further innovation can occur. These models further explore the consequences of different cultural transmission rules (directly biased, indirectly biased and unbiased transmission), population size, and cultural innovations that themselves reduce innovation or acquisition costs. PMID:21479170
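A macroscopic sketch of the paper's central assumption (acquisition costs growing with complexity), with invented parameters: complexity approaches an upper limit of budget/cost, beyond which no time remains to innovate.

```python
def simulate_complexity(generations=500, budget=1.0, cost_per_unit=0.02,
                        innovation_rate=0.5):
    """Macroscopic sketch: each generation first pays an acquisition cost that
    grows linearly with cultural complexity z, then converts any remaining
    time budget into new complexity. All parameters are illustrative."""
    z = 0.0
    history = []
    for _ in range(generations):
        time_left = max(0.0, budget - cost_per_unit * z)
        z += innovation_rate * time_left
        history.append(z)
    return history

traj = simulate_complexity()
# z approaches the ceiling budget / cost_per_unit, here 50: near it, almost
# all time is spent acquiring existing culture and innovation stalls.
print(round(traj[-1], 2), round(1.0 / 0.02, 2))
```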
Towards a plot size for Canada's national forest inventory
Steen Magnussen; P. Boudewyn; M. Gillis
2000-01-01
A proposed national forest inventory for Canada is to report on the state and trends of resource attributes gathered mainly from aerial photos of sample plots located on a national grid. A pilot project in New Brunswick indicates it takes about 2,800 square 400-ha plots (10 percent inventoried) to achieve a relative standard error of 10 percent or less on 14 out of 17...
25. Cumulative effects assessment impact thresholds: myths and realities
Robert R. Ziemer
1994-01-01
A cumulative impact has been commonly defined as: "...the impact on the environment which results from the incremental impact of the action when added to other past, present, and reasonably foreseeable future actions regardless of what agency or person undertakes such other actions. Cumulative impacts can result from individually minor but collectively...
Zipper plot: visualizing transcriptional activity of genomic regions.
Avila Cobos, Francisco; Anckaert, Jasper; Volders, Pieter-Jan; Everaert, Celine; Rombaut, Dries; Vandesompele, Jo; De Preter, Katleen; Mestdagh, Pieter
2017-05-02
Reconstructing transcript models from RNA-sequencing (RNA-seq) data and establishing these as independent transcriptional units can be a challenging task. Current state-of-the-art tools for long non-coding RNA (lncRNA) annotation are mainly based on evolutionary constraints, which may result in false negatives due to the overall limited conservation of lncRNAs. To tackle this problem we have developed the Zipper plot, a novel visualization and analysis method that enables users to simultaneously interrogate thousands of human putative transcription start sites (TSSs) in relation to various features that are indicative for transcriptional activity. These include publicly available CAGE-sequencing, ChIP-sequencing and DNase-sequencing datasets. Our method only requires three tab-separated fields (chromosome, genomic coordinate of the TSS and strand) as input and generates a report that includes a detailed summary table, a Zipper plot and several statistics derived from this plot. Using the Zipper plot, we found evidence of transcription for a set of well-characterized lncRNAs and observed that fewer mono-exonic lncRNAs have CAGE peaks overlapping with their TSSs compared to multi-exonic lncRNAs. Using publicly available RNA-seq data, we found more than one hundred cases where junction reads connected protein-coding gene exons with a downstream mono-exonic lncRNA, revealing the need for a careful evaluation of lncRNA 5'-boundaries. Our method is implemented using the statistical programming language R and is freely available as a webtool.
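The input format above (chromosome, TSS coordinate, strand) suggests a toy version of one of the checks the tool performs: counting TSSs with a CAGE peak nearby. The window size and all data below are invented for illustration.

```python
import numpy as np

def cage_support(tss_list, cage_peaks, window=50):
    """For each TSS (chromosome, coordinate, strand), count whether any CAGE
    peak lies within +/- `window` bp; `cage_peaks` maps chromosome to a
    sorted array of peak positions."""
    supported = 0
    for chrom, pos, strand in tss_list:
        peaks = cage_peaks.get(chrom, np.array([]))
        lo = np.searchsorted(peaks, pos - window)
        hi = np.searchsorted(peaks, pos + window, side="right")
        supported += hi > lo
    return supported

tss = [("chr1", 11_869, "+"), ("chr1", 29_570, "-"), ("chr2", 1_500, "+")]
peaks = {"chr1": np.array([11_850, 30_100]), "chr2": np.array([9_000])}
print(cage_support(tss, peaks), "of", len(tss), "TSSs have CAGE support")
```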
Fitting Data to Model: Structural Equation Modeling Diagnosis Using Two Scatter Plots
ERIC Educational Resources Information Center
Yuan, Ke-Hai; Hayashi, Kentaro
2010-01-01
This article introduces two simple scatter plots for model diagnosis in structural equation modeling. One plot contrasts a residual-based M-distance of the structural model with the M-distance for the factor score. It contains information on outliers, good leverage observations, bad leverage observations, and normal cases. The other plot contrasts…
SpectraPLOT, Visualization Package with a User-Friendly Graphical Interface
NASA Astrophysics Data System (ADS)
Sebald, James; Macfarlane, Joseph; Golovkin, Igor
2017-10-01
SPECT3D is a collisional-radiative spectral analysis package designed to compute detailed emission, absorption, or x-ray scattering spectra, filtered images, XRD signals, and other synthetic diagnostics. The spectra and images are computed for virtual detectors by post-processing the results of hydrodynamics simulations in 1D, 2D, and 3D geometries. SPECT3D can account for a variety of instrumental response effects so that direct comparisons between simulations and experimental measurements can be made. SpectraPLOT is a user-friendly graphical interface for viewing a wide variety of results from SPECT3D simulations, and applying various instrumental effects to the simulated images and spectra. We will present SpectraPLOT's ability to display a variety of data, including spectra, images, light curves, streaked spectra, space-resolved spectra, and drilldown plasma property plots, for an argon-doped capsule implosion experiment example. Future SpectraPLOT features and enhancements will also be discussed.
Conceptual Models for Cumulative Risk Assessment
Sexton, Ken
2011-01-01
In the absence of scientific consensus on an appropriate theoretical framework, cumulative risk assessment and related research have relied on speculative conceptual models. We argue for the importance of theoretical backing for such models and discuss 3 relevant theoretical frameworks, each supporting a distinctive “family” of models. Social determinant models postulate that unequal health outcomes are caused by structural inequalities; health disparity models envision social and contextual factors acting through individual behaviors and biological mechanisms; and multiple stressor models incorporate environmental agents, emphasizing the intermediary role of these and other stressors. The conclusion is that more careful reliance on established frameworks will lead directly to improvements in characterizing cumulative risk burdens and accounting for disproportionate adverse health effects. PMID:22021317
Abravanel, Benjamin T; Sinha, Rajita
2015-02-01
Repeated exposure to stressful events across the lifespan, referred to as cumulative adversity, is a potent risk factor for depression. Research indicates that cumulative adversity detrimentally affects emotion regulation processes, which may represent a pathway linking cumulative adversity to vulnerability to depression. However, empirical evidence that emotion dysregulation mediates the relationship between cumulative adversity and depression is limited, particularly in adult populations. We examined the direct and indirect effects of cumulative adversity on depressive symptomatology in a large community sample of adults (n = 745) who were further characterized by risk status: never-depressed (n = 638) and "at-risk" remitted mood-disordered (n = 107). All participants completed the Cumulative Adversity Inventory (CAI), the Difficulties in Emotion Regulation Scale (DERS), and the Center for Epidemiologic Studies Depression Scale (CES-D). Bootstrapped confidence intervals were computed to estimate the indirect effect of emotion dysregulation on the relationship between cumulative adversity and depressive symptomatology and to test whether this indirect effect was moderated by risk status. Emotion dysregulation partially and significantly mediated the relationship between cumulative adversity and depressive symptomatology independent of risk status. Overall, cumulative adversity and emotion dysregulation accounted for 50% of the variance in depressive symptomatology. These findings support the hypothesis that disruption of adaptive emotion regulation processes associated with repeated exposure to stressful life events represents an intrapersonal mechanism linking the experience of adverse events to depression. Our results support the utility of interventions that simultaneously emphasize stress reduction and emotion regulation to treat and prevent depressive vulnerability and pathology. Copyright © 2014 Elsevier Ltd. All rights reserved.
Abravanel, Benjamin T.; Sinha, Rajita
2014-01-01
Repeated exposure to stressful events across the lifespan, referred to as cumulative adversity, is a potent risk factor for depression. Research indicates that cumulative adversity detrimentally affects emotion regulation processes, which may represent a pathway linking cumulative adversity to vulnerability to depression. However, empirical evidence that emotion dysregulation mediates the relationship between cumulative adversity and depression is limited, particularly in adult populations. We examined the direct and indirect effects of cumulative adversity on depressive symptomatology in a large community sample of adults (n = 745) who were further characterized by risk status: never-depressed (n = 638) and “at-risk” remitted mood-disordered (n = 107). All participants completed the Cumulative Adversity Inventory (CAI), the Difficulties in Emotion Regulation Scale (DERS), and the Center for Epidemiologic Studies Depression Scale (CES-D). Bootstrapped confidence intervals were computed to estimate the indirect effect of emotion dysregulation on the relationship between cumulative adversity and depressive symptomatology and to test whether this indirect effect was moderated by risk status. Emotion dysregulation partially and significantly mediated the relationship between cumulative adversity and depressive symptomatology independent of risk status. Overall, cumulative adversity and emotion dysregulation accounted for 50% of the variance in depressive symptomatology. These findings support the hypothesis that disruption of adaptive emotion regulation processes associated with repeated exposure to stressful life events represents an intrapersonal mechanism linking the experience of adverse events to depression. Our results support the utility of interventions that simultaneously emphasize stress reduction and emotion regulation to treat and prevent depressive vulnerability and pathology. PMID:25528603
Vasilakis, Dimitris P; Whitfield, D Philip; Kati, Vassiliki
2017-01-01
Wind farm development can combat climate change but may also threaten bird populations' persistence through collision with wind turbine blades if such development is improperly planned strategically and cumulatively. Such improper planning may often occur. Numerous wind farms are planned in a region hosting the only cinereous vulture population in south-eastern Europe. We combined range use modelling and a Collision Risk Model (CRM) to predict the cumulative collision mortality for cinereous vulture under all operating and proposed wind farms. Four different vulture avoidance rates were considered in the CRM. Cumulative collision mortality was expected to be eight to ten times greater in the future (proposed and operating wind farms) than currently (operating wind farms), equivalent to 44% of the current population (103 individuals) if all proposals are authorized (2744 MW). Even under the most optimistic scenario whereby authorized proposals will not collectively exceed the national target for wind harnessing in the study area (960 MW), cumulative collision mortality would still be high (17% of current population) and likely lead to population extinction. Under any wind farm proposal scenario, over 92% of expected deaths would occur in the core area of the population, further implying inadequate spatial planning and implementation of relevant European legislation with scant regard for governmental obligations to protect key species. On the basis of a sensitivity map we derive a spatially explicit solution that could meet the national target of wind harnessing with a minimum conservation cost of less than 1% population loss providing that the population mortality (5.2%) caused by the operating wind farms in the core area would be totally mitigated. Under other scenarios, the vulture population would probably be at serious risk of extinction. Our 'win-win' approach is appropriate to other potential conflicts where wind farms may cumulatively threaten wildlife
Whitfield, D. Philip; Kati, Vassiliki
2017-01-01
Wind farm development can combat climate change but may also threaten bird populations’ persistence through collision with wind turbine blades if such development is improperly planned strategically and cumulatively. Such improper planning may often occur. Numerous wind farms are planned in a region hosting the only cinereous vulture population in south-eastern Europe. We combined range use modelling and a Collision Risk Model (CRM) to predict the cumulative collision mortality for cinereous vulture under all operating and proposed wind farms. Four different vulture avoidance rates were considered in the CRM. Cumulative collision mortality was expected to be eight to ten times greater in the future (proposed and operating wind farms) than currently (operating wind farms), equivalent to 44% of the current population (103 individuals) if all proposals are authorized (2744 MW). Even under the most optimistic scenario whereby authorized proposals will not collectively exceed the national target for wind harnessing in the study area (960 MW), cumulative collision mortality would still be high (17% of current population) and likely lead to population extinction. Under any wind farm proposal scenario, over 92% of expected deaths would occur in the core area of the population, further implying inadequate spatial planning and implementation of relevant European legislation with scant regard for governmental obligations to protect key species. On the basis of a sensitivity map we derive a spatially explicit solution that could meet the national target of wind harnessing with a minimum conservation cost of less than 1% population loss providing that the population mortality (5.2%) caused by the operating wind farms in the core area would be totally mitigated. Under other scenarios, the vulture population would probably be at serious risk of extinction. Our ‘win-win’ approach is appropriate to other potential conflicts where wind farms may cumulatively threaten wildlife
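The CRM aggregation described above can be sketched in a heavily simplified, Band-style form: expected cumulative deaths summed over wind farms as transits × per-transit collision probability × (1 − avoidance rate). All figures below are invented, not the study's.

```python
def cumulative_mortality(farms, avoidance=0.98):
    """Sum a simple avoidance-adjusted collision model over wind farms:
    deaths/year = rotor transits x per-transit collision probability for a
    non-avoiding bird x (1 - avoidance rate). All figures are invented."""
    return sum(transits * p_collide * (1.0 - avoidance)
               for transits, p_collide in farms)

# (annual rotor transits, per-transit collision probability) for each farm
operating = [(1200, 0.08), (800, 0.10)]
proposed = [(5000, 0.09), (7400, 0.07), (2600, 0.11)]
print(cumulative_mortality(operating))             # currently operating only
print(cumulative_mortality(operating + proposed))  # operating plus proposed
```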
On the Nature of Earth-Mars Porkchop Plots
NASA Technical Reports Server (NTRS)
Woolley, Ryan C.; Whetsel, Charles W.
2013-01-01
Porkchop plots are a quick and convenient tool to help mission designers plan ballistic trajectories between two bodies. Parameter contours give rise to the familiar 'porkchop' shape. Each synodic period the pattern repeats, but not exactly, primarily due to differences in inclination and non-zero eccentricity. In this paper we examine the morphological features of Earth-to-Mars porkchop plots and the orbital characteristics that create them. These results are compared to idealistic and optimized transfers. Conclusions are drawn about 'good' opportunities versus 'bad' opportunities for different mission applications.
NASA Astrophysics Data System (ADS)
Lagomarsino, Daniela; Rosi, Ascanio; Rossi, Guglielmo; Segoni, Samuele; Catani, Filippo
2014-05-01
This work makes a quantitative comparison between the results of landslide forecasting obtained using two different rainfall threshold models, one using intensity-duration thresholds and the other based on cumulative rainfall thresholds, in an area of northern Tuscany of 116 km². The first methodology identifies rainfall intensity-duration thresholds by means of a software tool called MaCumBA (Massive CUMulative Brisk Analyzer) that analyzes rain-gauge records, extracts the intensities (I) and durations (D) of the rainstorms associated with the initiation of landslides, plots these values on a diagram, and identifies thresholds that define the lower bounds of the I-D values. A back analysis using data from past events can be used to identify the threshold conditions associated with the least amount of false alarms. The second method (SIGMA) is based on the hypothesis that anomalous or extreme values of rainfall are responsible for landslide triggering: the statistical distribution of the rainfall series is analyzed, and multiples of the standard deviation (σ) are used as thresholds to discriminate between ordinary and extraordinary rainfall events. The name of the model, SIGMA, reflects the central role of the standard deviations in the proposed methodology. The definition of intensity-duration rainfall thresholds requires the combined use of rainfall measurements and an inventory of dated landslides, whereas the SIGMA model can be implemented using only rainfall data. These two methodologies were applied in an area of 116 km² where a database of 1200 landslides was available for the period 2000-2012. The results obtained are compared and discussed. Although several examples of visual comparisons between different intensity-duration rainfall thresholds are reported in the international literature, a quantitative comparison between thresholds obtained in the same area using different techniques and approaches is a relatively undebated research topic.
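Both threshold families can be sketched compactly. The snippets below are simplifications under our own assumptions, not the MaCumBA or SIGMA implementations: a lower-envelope power-law intensity-duration threshold fitted in log-log space, and a standard-deviation-multiple threshold on a rainfall series.

```python
import numpy as np

def id_threshold(durations_h, intensities_mmh, quantile=0.05):
    """Lower-envelope intensity-duration threshold I = a * D**b, obtained from
    a least-squares fit in log-log space shifted down to a low quantile of the
    residuals (a simplification of what a tool like MaCumBA automates)."""
    logD = np.log10(durations_h)
    logI = np.log10(intensities_mmh)
    b, log_a = np.polyfit(logD, logI, 1)
    offset = np.quantile(logI - (b * logD + log_a), quantile)
    return 10 ** (log_a + offset), b            # returns (a, b)

def sigma_threshold(cumulative_rain_mm, n_sigma=2.0):
    """SIGMA-style threshold: rainfall is 'extraordinary' when it exceeds the
    series mean by a chosen multiple of its standard deviation."""
    series = np.asarray(cumulative_rain_mm, dtype=float)
    return series.mean() + n_sigma * series.std()

rng = np.random.default_rng(6)
D = np.array([1, 3, 6, 12, 24, 48], dtype=float)     # storm durations (h)
I = 30 * D ** -0.6 * 10 ** rng.normal(0, 0.1, 6)     # triggering intensities
print(id_threshold(D, I))
```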
A pebble count procedure for assessing watershed cumulative effects
Gregory S. Bevenger; Rudy M. King
1995-01-01
Land management activities can result in the delivery of fine sediment to streams. Over time, such delivery can lead to cumulative impacts to the aquatic ecosystem. Because numerous laws require Federal land managers to analyze watershed cumulative effects, field personnel need simple monitoring procedures that can be used directly and consistently. One approach to...
The effects of cumulative practice on mathematics problem solving.
Mayfield, Kristin H; Chase, Philip N
2002-01-01
This study compared three different methods of teaching five basic algebra rules to college students. All methods used the same procedures to teach the rules and included four 50-question review sessions interspersed among the training of the individual rules. The differences among methods involved the kinds of practice provided during the four review sessions. Participants who received cumulative practice answered 50 questions covering a mix of the rules learned prior to each review session. Participants who received a simple review answered 50 questions on one previously trained rule. Participants who received extra practice answered 50 extra questions on the rule they had just learned. Tests administered after each review included new questions for applying each rule (application items) and problems that required novel combinations of the rules (problem-solving items). On the final test, the cumulative group outscored the other groups on application and problem-solving items. In addition, the cumulative group solved the problem-solving items significantly faster than the other groups. These results suggest that cumulative practice of component skills is an effective method of training problem solving.
The effects of cumulative practice on mathematics problem solving.
Mayfield, Kristin H; Chase, Philip N
2002-01-01
This study compared three different methods of teaching five basic algebra rules to college students. All methods used the same procedures to teach the rules and included four 50-question review sessions interspersed among the training of the individual rules. The differences among methods involved the kinds of practice provided during the four review sessions. Participants who received cumulative practice answered 50 questions covering a mix of the rules learned prior to each review session. Participants who received a simple review answered 50 questions on one previously trained rule. Participants who received extra practice answered 50 extra questions on the rule they had just learned. Tests administered after each review included new questions for applying each rule (application items) and problems that required novel combinations of the rules (problem-solving items). On the final test, the cumulative group outscored the other groups on application and problem-solving items. In addition, the cumulative group solved the problem-solving items significantly faster than the other groups. These results suggest that cumulative practice of component skills is an effective method of training problem solving. PMID:12102132
9 CFR 108.2 - Plot plans, blueprints, and legends required.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 9 Animals and Animal Products 1 2010-01-01 2010-01-01 false Plot plans, blueprints, and legends required. 108.2 Section 108.2 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE... REQUIREMENTS FOR LICENSED ESTABLISHMENTS § 108.2 Plot plans, blueprints, and legends required. Each applicant...
Super-Resolution Algorithm in Cumulative Virtual Blanking
NASA Astrophysics Data System (ADS)
Montillet, J. P.; Meng, X.; Roberts, G. W.; Woolfson, M. S.
2008-11-01
The proliferation of mobile devices and the emergence of wireless location-based services have generated consumer demand for precise location. In this paper, the MUSIC super-resolution algorithm is applied to time delay estimation for positioning purposes in cellular networks. The goal is to position a Mobile Station with UMTS technology. The problem of Base Station hearability is solved using Cumulative Virtual Blanking. A simple simulator is presented using DS-SS signals. The results show that the MUSIC algorithm improves the time delay estimation in both cases, whether or not Cumulative Virtual Blanking was carried out.
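A generic frequency-domain MUSIC delay estimator illustrates the super-resolution idea (it is not the paper's UMTS/Cumulative Virtual Blanking pipeline): two multipath delays spaced more closely than the Fourier resolution limit are recovered from peaks of the noise-subspace pseudospectrum.

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulated multipath channel: two delays (in units of one sample period),
# spaced more closely than the Fourier resolution limit of one sample.
true_delays = [1.0, 1.8]
M, N, snr_db = 64, 200, 20                    # frequency bins, snapshots, SNR
f = np.arange(M) / M                          # normalized frequency grid

A = np.exp(-2j * np.pi * np.outer(f, true_delays))       # M x K steering matrix
amps = (rng.normal(size=(2, N)) + 1j * rng.normal(size=(2, N))) / np.sqrt(2)
noise = (rng.normal(size=(M, N)) + 1j * rng.normal(size=(M, N))) / np.sqrt(2)
X = A @ amps + 10 ** (-snr_db / 20) * noise

# MUSIC: project candidate steering vectors onto the noise subspace.
R = X @ X.conj().T / N
_, V = np.linalg.eigh(R)                      # eigenvalues in ascending order
En = V[:, : M - 2]                            # noise subspace (K = 2 assumed)
taus = np.linspace(0.0, 4.0, 801)
S = np.exp(-2j * np.pi * np.outer(f, taus))
P = 1.0 / np.linalg.norm(En.conj().T @ S, axis=0) ** 2   # pseudospectrum

mask = (P[1:-1] > P[:-2]) & (P[1:-1] > P[2:])            # local maxima
cand_t, cand_p = taus[1:-1][mask], P[1:-1][mask]
print(np.sort(cand_t[np.argsort(cand_p)[-2:]]))          # about [1.0, 1.8]
```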
Visual analytics of large multidimensional data using variable binned scatter plots
NASA Astrophysics Data System (ADS)
Hao, Ming C.; Dayal, Umeshwar; Sharma, Ratnesh K.; Keim, Daniel A.; Janetzko, Halldór
2010-01-01
The scatter plot is a well-known method of visualizing pairs of two-dimensional continuous variables, and multidimensional data can be depicted in a scatter plot matrix. Scatter plots are intuitive and easy to use, but often have a high degree of overlap that may occlude a significant portion of the data. In this paper, we propose variable binned scatter plots to allow the visualization of large amounts of data without overlapping. The basic idea is to use a non-uniform (variable) binning of the x and y dimensions and to plot all the data points that fall within each bin into the corresponding square. Further, we map a third attribute to color for visualizing clusters. Analysts are able to interact with individual data points for record-level information. We have applied these techniques to solve real-world problems in credit card fraud and data center energy consumption, visualizing the data distribution and cause-effect relationships among multiple attributes. A comparison of our methods with two recent well-known variants of scatter plots is included.
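As a minimal sketch of the variable-binning idea summarized above — not the authors' implementation — the following uses quantile-based (hence non-uniform) bins and maps the mean of a third attribute to color; all data and parameters are illustrative:

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
x = rng.lognormal(size=50_000)                  # skewed dimension
y = rng.normal(size=50_000)
z = x + y + rng.normal(scale=0.5, size=50_000)  # third attribute mapped to color

# Non-uniform (variable) bin edges: quantiles give each bin a similar point
# count, so dense regions get many small bins and sparse regions few large ones.
xb = np.quantile(x, np.linspace(0, 1, 41))
yb = np.quantile(y, np.linspace(0, 1, 41))

zsum, _, _ = np.histogram2d(x, y, bins=[xb, yb], weights=z)
cnt, _, _ = np.histogram2d(x, y, bins=[xb, yb])
zmean = np.where(cnt > 0, zsum / np.maximum(cnt, 1), np.nan)  # mean z per bin

plt.pcolormesh(xb, yb, zmean.T, cmap="viridis")  # one colored square per bin
plt.colorbar(label="mean z per bin")
plt.xlabel("x"); plt.ylabel("y")
plt.show()
```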
A missing link in the evolution of the cumulative recorder.
Asano, Toshio; Lattal, Kennon A
2012-09-01
A recently recovered cumulative recorder provides a missing link in the evolution of the cumulative recorder from a modified kymograph to a reliably operating, scientifically and commercially successful instrument. The recorder, the only physical evidence of such an early precommercial cumulative recorder yet found, was sent to Keio University in Tokyo, Japan, in 1952 at the behest of B. F. Skinner at Harvard University. Last used in research in the late 1960s, the cumulative recorder remained locked in a storage room until 2007, when it was found again. A historical context for the recorder is followed by a description of the recorder and a comparison between it and the commercially successful Gerbrands Model C-1 recorder. Labeled the Keio recorder, it is a testament to Skinner's persistence in developing a reliable means of quantifying the behavior of living organisms in real time.
Qi, Li; Ding, Xian-bin; Mao, De-qiang; Feng, Lian-gui; Wang, Yu-lin; Jiao, Yan; Zhang, Chun-hua; Lü, Xiao-yan; Li, Hong; Xia, Yi-yin
2013-03-01
To evaluate the effect of comprehensive control and prevention of chronic diseases in the demonstration plots of Chongqing. Residents were enrolled through a multi-stage stratified random sampling method from 17 districts or counties that had successfully established demonstration plots and 21 districts or counties that had not (non-demonstration plots for short) as of May 2012. A questionnaire was designed to survey awareness of health knowledge, health behaviors, and utilization of health supportive tools. The results were analyzed with SPSS 15.0 software. We investigated 15 108 residents, 6156 of whom were in demonstration plots and 8951 of whom were not. The findings revealed that the percentages of people aware of the national healthy lifestyle action in demonstration and non-demonstration plots were 44.4% (2734/6157) and 40.2% (3598/8951), respectively; awareness of the hypertension risk of excess sodium was 72.4% (4458/6156) and 67.5% (6042/8951), respectively; and awareness of the cardiovascular disease (CVD) risk of obesity and overweight was 77.2% (4753/6157) and 69.6% (6230/8951), respectively. Regarding residents' health behaviors in demonstration and non-demonstration plots, the utilization rates of salt-restriction scoops or pots were 23.5% (1447/6157) and 17.9% (1602/8951), and the utilization rates of oil-restriction pots were 16.7% (1028/6157) and 11.8% (1064/8951), respectively. In total, 33 of the 37 indexes were higher in demonstration plots than in non-demonstration plots (P < 0.05). The comprehensive control and prevention of chronic diseases was more effective in demonstration plots, where remarkable improvements in health knowledge and behaviors had been achieved.
Modeling post-fire woody carbon dynamics with data from remeasured inventory plots
Bianca N.I. Eskelson; Jeremy Fried; Vicente Monleon
2015-01-01
In California, the Forest Inventory and Analysis (FIA) plots within large fires were visited one year after the fire occurred, resulting in a time series of measurements before and after fire. During this additional plot visit, the standard inventory measurements were augmented for these burned plots to assess fire effects. One example of the additional measurements is...
Orientation-Enhanced Parallel Coordinate Plots.
Raidou, Renata Georgia; Eisemann, Martin; Breeuwer, Marcel; Eisemann, Elmar; Vilanova, Anna
2016-01-01
Parallel Coordinate Plots (PCPs) are one of the most powerful techniques for the visualization of multivariate data. However, for large datasets, the representation suffers from clutter due to overplotting. In this case, discerning the underlying data information and selecting specific interesting patterns can become difficult. We propose a new and simple technique to improve the display of PCPs by emphasizing the underlying data structure. Our Orientation-enhanced Parallel Coordinate Plots (OPCPs) improve pattern and outlier discernibility by visually enhancing parts of each PCP polyline with respect to its slope. This enhancement also allows us to introduce a novel and efficient selection method, Orientation-enhanced Brushing (O-Brushing). Our solution is particularly useful when multiple patterns are present or when the view on certain patterns is obstructed by noise. We present the results of our approach with several synthetic and real-world datasets. Finally, we conducted a user evaluation, which verifies the advantages of the OPCPs in terms of discernibility of information in complex data. It also confirms that O-Brushing eases the selection of data patterns in PCPs and reduces the amount of necessary user interactions compared to state-of-the-art brushing techniques.
Activities: Plotting and Predicting from Pairs.
ERIC Educational Resources Information Center
Shulte, Albert P.; Swift, Jim
1984-01-01
This teacher's guide provides objectives, procedures, and list of materials needed for activities which center around the use of a scatter plot to examine relationships shown by bivariate data. The activities are suitable for grades 7 to 12. Four student worksheets are included. (JN)
Smoothed Residual Plots for Generalized Linear Models. Technical Report #450.
ERIC Educational Resources Information Center
Brant, Rollin
Methods for examining the viability of assumptions underlying generalized linear models are considered. By appealing to the likelihood, a natural generalization of the raw residual plot for normal theory models is derived and is applied to investigating potential misspecification of the linear predictor. A smooth version of the plot is also…
[Effects of sampling plot number on tree species distribution prediction under climate change].
Liang, Yu; He, Hong-Shi; Wu, Zhi-Wei; Li, Xiao-Na; Luo, Xu
2013-05-01
Based on neutral landscapes with different degrees of landscape fragmentation, this paper studied the effects of sampling plot number on the prediction of tree species distribution at the landscape scale under climate change. The tree species distribution was predicted by a coupled modeling approach that linked an ecosystem process model with a forest landscape model, and three contingent scenarios and one reference scenario of sampling plot numbers were assumed. The differences between the three scenarios and the reference scenario under different degrees of landscape fragmentation were tested. The results indicated that the effects of sampling plot number on the prediction of tree species distribution depended on the tree species' life history attributes. For generalist species, predicting their distribution at the landscape scale needed more plots. Except for the extreme specialist, the degree of landscape fragmentation also affected the effects of sampling plot number on the prediction. As the simulation period lengthened, the effects of sampling plot number on the prediction of tree species distribution at the landscape scale could change. For generalist species, more plots are needed for long-term simulation.
Digital computer programs for generating oblique orthographic projections and contour plots
NASA Technical Reports Server (NTRS)
Giles, G. L.
1975-01-01
User and programmer documentation is presented for two programs for automatic plotting of digital data. One of the programs generates oblique orthographic projections of three-dimensional numerical models, and the other generates contour plots of data distributed over an arbitrary planar region. A general description of the computational algorithms, user instructions, and complete listings of the programs are given. Several plots are included to illustrate various program options, and a single example is described to facilitate learning the use of the programs.
Cumulative Trauma Among Mayas Living in Southeast Florida.
Millender, Eugenia I; Lowe, John
2017-06-01
Mayas, having experienced genocide, exile, and severe poverty, are at high risk for the consequences of cumulative trauma that continually resurfaces through current fear of an uncertain future. Little is known about the mental health and alcohol use status of this population. This correlational study explored the relationship of cumulative trauma as it relates to social determinants of health (years in the United States, education, health insurance status, marital status, and employment), psychological health (depression symptoms), and health behaviors (alcohol use) of 102 Guatemalan Mayas living in Southeast Florida. The results of this study indicated that, as specific social determinants of health and cumulative trauma increased, depression symptoms (particularly among women) and the risk for harmful alcohol use (particularly among men) increased. Identifying risk factors at an early stage, before serious disease or problems are manifest, provides room for early screening leading to early identification, early treatment, and better outcomes.
Cumulative psychosocial stress, coping resources, and preterm birth.
McDonald, Sheila W; Kingston, Dawn; Bayrampour, Hamideh; Dolan, Siobhan M; Tough, Suzanne C
2014-12-01
Preterm birth constitutes a significant international public health issue, with implications for child and family well-being. High levels of psychosocial stress and negative affect before and during pregnancy are contributing factors to shortened gestation and preterm birth. We developed a cumulative psychosocial stress variable and examined its association with early delivery, controlling for known preterm birth risk factors and confounding environmental variables. We further examined this association among subgroups of women with different levels of coping resources. Utilizing the All Our Babies (AOB) study, an ongoing prospective pregnancy cohort study in Alberta, Canada (n = 3,021), multinomial logistic regression was adopted to examine the independent effect of cumulative psychosocial stress on preterm birth subgroups compared to term births. Stratified analyses according to categories of perceived social support and optimism were undertaken to examine differential effects among subgroups of women. Cumulative psychosocial stress was a statistically significant risk factor for late preterm birth (OR = 1.73; 95 % CI = 1.07, 2.81), but not for early preterm birth (OR = 2.44; 95 % CI = 0.95, 6.32), controlling for income, history of preterm birth, pregnancy complications, reproductive history, and smoking in pregnancy. Stratified analyses showed that cumulative psychosocial stress was a significant risk factor for preterm birth at <37 weeks gestation for women with low levels of social support (OR = 2.09; 95 % CI = 1.07, 4.07) or optimism (OR = 1.87; 95 % CI = 1.04, 3.37). Our analyses suggest that early vulnerability combined with current anxiety symptoms in pregnancy confers risk for preterm birth. Coping resources may mitigate the effect of cumulative psychosocial stress on the risk for early delivery.
A computational model of selection by consequences: log survivor plots.
Kulubekova, Saule; McDowell, J J
2008-06-01
[McDowell, J.J, 2004. A computational model of selection by consequences. J. Exp. Anal. Behav. 81, 297-317] instantiated the principle of selection by consequences in a virtual organism with an evolving repertoire of possible behaviors undergoing selection, reproduction, and mutation over many generations. The process is based on the computational approach, which is non-deterministic and rules-based. The model proposes a causal account for operant behavior. McDowell found that the virtual organism consistently showed a hyperbolic relationship between response and reinforcement rates according to the quantitative law of effect. To continue validation of the computational model, the present study examined its behavior on the molecular level by comparing the virtual organism's IRT distributions in the form of log survivor plots to findings from live organisms. Log survivor plots did not show the "broken-stick" feature indicative of distinct bouts and pauses in responding, although the bend in slope of the plots became more defined at low reinforcement rates. The shape of the virtual organism's log survivor plots was more consistent with the data on reinforced responding in pigeons. These results suggest that log survivor plot patterns of the virtual organism were generally consistent with the findings from live organisms providing further support for the computational model of selection by consequences as a viable account of operant behavior.
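For readers unfamiliar with the construction, a log survivor plot graphs the proportion of inter-response times (IRTs) longer than t against t on a semilog axis; distinct bouts and pauses appear as the "broken-stick" shape mentioned above. A minimal sketch with an illustrative two-state IRT mixture (not data from the model or from live organisms):

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
# Illustrative bout-and-pause mixture: many short within-bout IRTs plus a few
# long between-bout pauses, which produces a broken-stick survivor plot.
irts = np.concatenate([rng.exponential(0.5, 900), rng.exponential(10.0, 100)])

t = np.sort(irts)
survivor = 1.0 - np.arange(1, t.size + 1) / t.size  # P(IRT > t) at each observed t

plt.semilogy(t[:-1], survivor[:-1], drawstyle="steps-post")  # drop terminal zero
plt.xlabel("inter-response time t (s)")
plt.ylabel("proportion of IRTs > t")
plt.show()
```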
Procedures to handle inventory cluster plots that straddle two or more conditions
Jerold T. Hahn; Colin D. MacLean; Stanford L. Arner; William A. Bechtold
1995-01-01
We review the relative merits and field procedures for four basic plot designs to handle forest inventory plots that straddle two or more conditions, given that subplots will not be moved. A cluster design is recommended that combines fixed-area subplots and variable-radius plot (VRP) sampling. Each subplot in a cluster consists of a large fixed-area subplot for...
Distribution of permanent plots to evaluate silvicultural treatments in the Inland Empire
John C. Byrne; Albert R. Stage; David L. Renner
1988-01-01
To assess the adequacy of a permanent-plot data base for estimating growth and yield, one first needs to know how the plots in the data base are distributed in relation to the population they are presumed to represent. The distribution of permanent plots to study forest growth in the Inland Empire (northeastern Washington, northern Idaho, and western Montana) is...
Precise FIA plot registration using field and dense LIDAR data
Demetrios Gatziolis
2009-01-01
Precise registration of forest inventory and analysis (FIA) plots is a prerequisite for an effective fusion of field data with ancillary spatial information, which is an approach commonly employed in the mapping of various forest parameters. Although the adoption of Global Positioning System technology has improved the precision of plot coordinates obtained during...
An investigation of condition mapping and plot proportion calculation issues
Demetrios Gatziolis
2007-01-01
A systematic examination of Forest Inventory and Analysis condition data collected under the annual inventory protocol in the Pacific Northwest region between 2000 and 2004 revealed the presence of errors both in condition topology and plot proportion computations. When plots were compiled to generate population estimates, proportion errors were found to cause...
A Guided Inquiry on Hubble Plots and the Big Bang
ERIC Educational Resources Information Center
Forringer, Ted
2014-01-01
In our science for non-science majors course "21st Century Physics," we investigate modern "Hubble plots" (plots of velocity versus distance for deep space objects) in order to discuss the Big Bang, dark matter, and dark energy. There are two potential challenges that our students face when encountering these topics for the…
A Bayesian CUSUM plot: Diagnosing quality of treatment.
Rosthøj, Steen; Jacobsen, Rikke-Line
2017-12-01
To present a CUSUM plot based on Bayesian diagnostic reasoning displaying evidence in favour of "healthy" rather than "sick" quality of treatment (QOT), and to demonstrate a technique using Kaplan-Meier survival curves permitting application to case series with ongoing follow-up. For a case series with known final outcomes: Consider each case a diagnostic test of good versus poor QOT (expected vs. increased failure rates), determine the likelihood ratio (LR) of the observed outcome, convert LR to weight taking log to base 2, and add up weights sequentially in a plot showing how many times odds in favour of good QOT have been doubled. For a series with observed survival times and an expected survival curve: Divide the curve into time intervals, determine "healthy" and specify "sick" risks of failure in each interval, construct a "sick" survival curve, determine the LR of survival or failure at the given observation times, convert to weights, and add up. The Bayesian plot was applied retrospectively to 39 children with acute lymphoblastic leukaemia with completed follow-up, using Nordic collaborative results as reference, showing equal odds between good and poor QOT. In the ongoing treatment trial, with 22 of 37 children still at risk for event, QOT has been monitored with average survival curves as reference, odds so far favoring good QOT 2:1. QOT in small patient series can be assessed with a Bayesian CUSUM plot, retrospectively when all treatment outcomes are known, but also in ongoing series with unfinished follow-up. © 2017 John Wiley & Sons, Ltd.
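A minimal sketch of the bookkeeping described above for the known-outcome case: each case contributes the base-2 log of the likelihood ratio of its outcome under good versus poor QOT, and the running sum counts how many times the odds in favour of good QOT have doubled. The failure probabilities and outcome series are illustrative assumptions, not the paper's data:

```python
import numpy as np

p_good, p_poor = 0.10, 0.25   # assumed failure risks under good vs. poor QOT

def case_weight(failed: bool) -> float:
    """log2 likelihood ratio of the observed outcome, good vs. poor QOT."""
    lr = p_good / p_poor if failed else (1 - p_good) / (1 - p_poor)
    return np.log2(lr)

outcomes = [False, False, True, False, False, False, True, False]  # toy series
cusum = np.cumsum([case_weight(f) for f in outcomes])
print(np.round(cusum, 2))  # plotted sequentially, this is the Bayesian CUSUM
```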
Convex Arrhenius plots and their interpretation
Truhlar, Donald G.; Kohen, Amnon
2001-01-01
This paper draws attention to selected experiments on enzyme-catalyzed reactions that show convex Arrhenius plots, which are very rare, and points out that Tolman's interpretation of the activation energy places a fundamental model-independent constraint on any detailed explanation of these reactions. The analysis presented here shows that in such systems, the rate coefficient as a function of energy is not just increasing more slowly than expected, it is actually decreasing. This interpretation of the data provides a constraint on proposed microscopic models, i.e., it requires that any successful model of a reaction with a convex Arrhenius plot should be consistent with the microcanonical rate coefficient being a decreasing function of energy. The implications and limitations of this analysis to interpreting enzyme mechanisms are discussed. This model-independent conclusion has broad applicability to all fields of kinetics, and we also draw attention to an analogy with diffusion in metastable fluids and glasses. PMID:11158559
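For reference, the Tolman interpretation invoked above can be written compactly (standard notation, not reproduced from the paper): the local activation energy is the difference between the average energy of reacting molecules and the average energy of all molecules,

```latex
E_a(T) \;\equiv\; -R\,\frac{d\ln k}{d(1/T)}
       \;=\; \langle E \rangle_{\mathrm{reacting}} - \langle E \rangle_{\mathrm{all}} ,
```

so the curvature of an Arrhenius plot constrains how the average energy of reacting molecules varies with temperature, which is the route by which the authors conclude that the microcanonical rate coefficient k(E) must decrease with energy for these systems.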
Probability workshop to be better in probability topic
NASA Astrophysics Data System (ADS)
Asmat, Aszila; Ujang, Suriyati; Wahid, Sharifah Norhuda Syed
2015-02-01
The purpose of the present study was to examine whether statistics anxiety and attitudes towards the probability topic among students in higher education have an effect on their performance. 62 fourth-semester science students were given statistics anxiety questionnaires about their perception of the probability topic. Results indicated that students' performance on the probability topic was not related to anxiety level; that is, a higher level of statistics anxiety did not cause lower scores on the probability topic. The study also revealed that motivated students benefited from the probability workshop, showing a positive improvement in their performance on the probability topic compared with before the workshop. In addition, there was a significant difference in students' performance between genders, with better achievement among female students than among male students. Thus, more initiatives in learning programs with different teaching approaches are needed to provide useful information for improving student learning outcomes in higher learning institutions.
Effect of rainfall simulator and plot scale on overland flow and phosphorus transport.
Sharpley, Andrew; Kleinman, Peter
2003-01-01
Rainfall simulation experiments are widely used to study erosion and contaminant transport in overland flow. We investigated the use of two rainfall simulators designed to rain on 2-m-long (2 m²) and 10.7-m-long (32.6 m²) plots to estimate overland flow and phosphorus (P) transport in comparison with watershed-scale data. Simulated rainfall (75 mm h⁻¹) generated more overland flow from 2-m-long (20 L m⁻²) than from 10.7-m-long (10 L m⁻²) plots established in grass, no-till corn (Zea mays L.), and recently tilled fields, because a relatively greater area of the smaller plots became saturated (>75% of area) during rainfall compared with large plots (<75% of area). Although average concentrations of dissolved reactive phosphorus (DRP) in overland flow were greater from 2-m-long (0.50 mg L⁻¹) than 10.7-m-long (0.35 mg L⁻¹) plots, the relationship between DRP and Mehlich-3 soil P (as defined by regression slope) was similar for both plot lengths and for published watershed data (0.0022 for grassed, 0.0036 for no-till, and 0.0112 for tilled sites). Conversely, sediment, particulate phosphorus (PP), and total phosphorus (TP) concentrations and selective transport of soil fines (<2 µm) were significantly lower from 2- than 10.7-m-long plots. However, slopes of the logarithmic regression between P enrichment ratio and sediment discharge were similar (0.281-0.301) for 2- and 10.7-m-long plots and published watershed data. While concentrations and loads of P change with plot scale, processes governing DRP and PP transport in overland flow are consistent, supporting the limited use of small plots and rainfall simulators to assess the relationship between soil P and overland flow P as a function of soil type and management.
Master plot analysis of microcracking in graphite/epoxy and graphite/PEEK laminates
NASA Technical Reports Server (NTRS)
Nairn, John A.; Hu, Shoufeng; Bark, Jong Song
1993-01-01
We used a variational stress analysis and an energy release rate failure criterion to construct a master plot analysis of matrix microcracking. In the master plot, the results for all laminates of a single material are predicted to fall on a single line whose slope gives the microcracking toughness of the material. Experimental results from 18 different layups of AS4/3501-6 laminates show that the master plot analysis can explain all observations. In particular, it can explain the differences between microcracking of central 90 deg plies and of free-surface 90 deg plies. Experimental results from two different AS4/PEEK laminates tested at different temperatures can be explained by a modified master plot that accounts for changes in the residual thermal stresses. Finally, we constructed similar master plot analyses for previous literature microcracking models. All microcracking theories that ignore the thickness dependence of the stresses gave poor results.
Plotting Rates of Photosynthesis as a Function of Light Quantity.
ERIC Educational Resources Information Center
Dean, Rob L.
1996-01-01
Discusses methods for plotting rates of photosynthesis as a function of light quantity. Presents evidence that suggests that empirically derived conversion factors, which are used to convert foot candles to photon fluence rates, should be used with extreme caution. Suggests how rate data are best plotted when any kind of light meter is not…
An Intuitive Graphical Approach to Understanding the Split-Plot Experiment
ERIC Educational Resources Information Center
Robinson, Timothy J.; Brenneman, William A.; Myers, William R.
2009-01-01
While split-plot designs have received considerable attention in the literature over the past decade, there seems to be a general lack of intuitive understanding of the error structure of these designs and the resulting statistical analysis. Typically, students learn the proper error terms for testing factors of a split-plot design via "expected…
magicaxis: Pretty scientific plotting with minor-tick and log minor-tick support
NASA Astrophysics Data System (ADS)
Robotham, Aaron S. G.
2016-04-01
The R suite magicaxis makes useful and pretty plots for scientific publication. It includes functions for base plotting, with particular emphasis on pretty axis labelling in a number of circumstances that often arise in scientific plotting. It also includes functions for generating images and contours that reflect the 2D quantile levels of the data, designed particularly for output of MCMC posteriors, where visualizing the location of the 68% and 95% 2D quantiles for covariant parameters is a necessary part of post-MCMC analysis. In addition, it can generate low and high error bars and allows clipping of values, rejection of bad values, and log stretching.
Busch, Michael; Wodrich, Matthew D; Corminboeuf, Clémence
2015-12-01
Linear free energy scaling relationships and volcano plots are common tools used to identify potential heterogeneous catalysts for myriad applications. Despite the striking simplicity and predictive power of volcano plots, they remain unknown in homogeneous catalysis. Here, we construct volcano plots to analyze a prototypical reaction from homogeneous catalysis, the Suzuki cross-coupling of olefins. Volcano plots succeed both in discriminating amongst different catalysts and reproducing experimentally known trends, which serves as validation of the model for this proof-of-principle example. These findings indicate that the combination of linear scaling relationships and volcano plots could serve as a valuable methodology for identifying homogeneous catalysts possessing a desired activity through a priori computational screening.
Cumulative effects of forest management activities: how might they occur?
R. M. Rice; R. B. Thomas
1985-01-01
Concerns are often voiced about possible environmental damage as the result of the cumulative sedimentation effects of logging and forest road construction. In response to these concerns, National Forests are developing procedures to reduce the possibility that their activities may lead to unacceptable cumulative effects.
Cockings, Jerome G L; Cook, David A; Iqbal, Rehana K
2006-02-01
A health care system is a complex adaptive system. The effect of a single intervention, incorporated into a complex clinical environment, may be different from that expected. A national database such as the Intensive Care National Audit & Research Centre (ICNARC) Case Mix Programme in the UK represents a centralised monitoring, surveillance and reporting system for retrospective quality and comparative audit. This can be supplemented with real-time process monitoring at a local level for continuous process improvement, allowing early detection of the impact of both unplanned and deliberately imposed changes in the clinical environment. Demographic and UK Acute Physiology and Chronic Health Evaluation II (APACHE II) data were prospectively collected on all patients admitted to a UK regional hospital between 1 January 2003 and 30 June 2004 in accordance with the ICNARC Case Mix Programme. We present a cumulative expected minus observed (E-O) plot and the risk-adjusted p chart as methods of continuous process monitoring. We describe the construction and interpretation of these charts and show how they can be used to detect planned or unplanned organisational process changes affecting mortality outcomes. Five hundred and eighty-nine adult patients were included. The overall death rate was 0.78 of predicted. Calibration showed excess survival in ranges above 30% risk of death. The E-O plot confirmed a survival above that predicted. Small transient variations were seen in the slope that could represent random effects, or real but transient changes in the quality of care. The risk-adjusted p chart showed several observations below the 2 SD control limits of the expected mortality rate. These plots provide rapid analysis of risk-adjusted performance suitable for local application and interpretation. The E-O chart provided rapid easily visible feedback of changes in risk-adjusted mortality, while the risk-adjusted p chart allowed statistical evaluation. Local analysis of
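A minimal sketch of the cumulative E-O construction described above: the running sum of (predicted risk minus observed outcome) rises while fewer patients die than predicted and falls when observed mortality exceeds prediction. The predicted risks and outcomes below are simulated stand-ins, not ICNARC or APACHE II data:

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(2)
n = 589                                   # series length matching the report
risk = rng.beta(2, 8, n)                  # simulated predicted risk of death
died = rng.random(n) < 0.78 * risk        # observed deaths ~0.78 of predicted

e_minus_o = np.cumsum(risk - died)        # cumulative expected minus observed

plt.plot(e_minus_o)
plt.axhline(0, linestyle="--")
plt.xlabel("admission number"); plt.ylabel("cumulative E - O (deaths)")
plt.show()
```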
Latino Mothers' Cumulative Food Insecurity Exposure and Child Body Composition.
Hernandez, Daphne C
2016-01-01
To document whether an intergenerational transmission of food insecurity is occurring by assessing low-income foreign-born Latino mothers' experiences with food insecurity as none, once (either childhood or adulthood), or twice (during both childhood and adulthood). The association between maternal cumulative food insecurity and children's body composition was also examined. Maternal self-reported surveys on retrospective measures of food insecurity during childhood, current measures of food insecurity, and demographics were collected from Houston-area community centers (N = 96). Children's body mass index (BMI) and waist circumference (WC) were directly assessed. Covariate-adjusted logistic regression models analyzed the association between cumulative food insecurity experiences and children's body composition. Fifty-eight percent of mothers experienced food insecurity both as a child and as an adult, and 31% experienced food insecurity either as a child or as an adult. Maternal cumulative exposure to food insecurity was unrelated to BMI but was negatively related to elevated WC. Although an intergenerational transmission of food insecurity does exist, maternal cumulative exposure to food insecurity does not impact children's body composition negatively in the short term. Studying the long-term effects of cumulative food insecurity exposure can provide information for the development and timing of obesity interventions.
In-situ polymerization PLOT columns I: divinylbenzene
NASA Technical Reports Server (NTRS)
Shen, T. C.
1992-01-01
A novel method for the preparation of porous-layer open-tubular (PLOT) columns is described. The method involves a simple, reproducible, and straightforward in-situ polymerization of monomer directly on the metal tube.
CASPER: A GENERALIZED PROGRAM FOR PLOTTING AND SCALING DATA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lietzke, M.P.; Smith, R.E.
A Fortran subroutine was written to scale floating-point data and generate a magnetic tape to plot it on the Calcomp 570 digital plotter. The routine permits a great deal of flexibility, and may be used with any type of FORTRAN or FAP calling program. A simple calling program was also written to permit the user to read in data from cards and plot it without any additional programming. Both the Fortran and binary decks are available. (auth)
Probability and Statistics in Sensor Performance Modeling
2010-12-01
... The software program is called Environmental Awareness for Sensor and Emitter Employment (EASEE). The report covers important numerical issues in the implementation and statistical analysis for measuring sensor performance, making use of the cumulative distribution function (cdf) and the complementary cumulative distribution function in a decision-support tool (DST)...
Aquifer test interpretation using derivative analysis and diagnostic plots
NASA Astrophysics Data System (ADS)
Hernández-Espriú, Antonio; Real-Rangel, Roberto; Cortés-Salazar, Iván; Castro-Herrera, Israel; Luna-Izazaga, Gabriela; Sánchez-León, Emilio
2017-04-01
Pumping tests remain a method of choice to deduce fundamental aquifer properties and to assess well condition. In the oil and gas (O&G) industry, well testing has been the core technique for examining reservoir behavior over the last 50 years. The pressure derivative introduced by Bourdet is perhaps the most significant single development in the history of well test analysis. Recently, the so-called diagnostic plots (e.g., drawdown and drawdown derivative in a log-log plot) have been successfully tested in aquifers. However, this procedure is still underutilized by groundwater professionals. This research illustrates the applicability range, advantages, and drawbacks (e.g., smoothing procedures) of diagnostic plots using field examples from a wide spectrum of tests (short/long tests, constant/variable flow rates, drawdown/buildup stages, pumping well/observation well) in dissimilar geological conditions. We analyze new and pre-existing aquifer tests in Mexico, USA, Canada, Germany, France, and Saudi Arabia. In constant-flow-rate tests, our results show that derivative analysis is an easy, robust, and powerful tool to assess near-borehole damage effects, formation heterogeneity, boundaries, flow regimes, infinite-acting radial stages (i.e., a valid Theisian framework), and fracture-driven flow. In step tests, the effectiveness relies on high-frequency drawdown measurements. Moreover, we adapt O&G analytical solutions to cater for the conditions in groundwater systems. In this context, further parameters can be computed analytically from the plots, such as skin factor, head losses, wellbore storage, distance to the boundary, and channel-aquifer and/or fracture zone width, among others. Therefore, diagnostic plots should be considered a mandatory tool for pumping test analysis among hydrogeologists. This project has been supported by DGAPA (UNAM) under the research project PAPIIT IN-112815.
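A minimal sketch of a log-log diagnostic plot with a drawdown derivative, using the Theis solution to synthesize data (all parameter values are illustrative; production-grade Bourdet differentiation smooths over a fixed log-cycle window rather than taking a plain gradient):

```python
import numpy as np
from scipy.special import exp1            # E1(u), the Theis well function W(u)
import matplotlib.pyplot as plt

Q, T, S, r = 0.01, 1e-3, 1e-4, 30.0       # m3/s, m2/s, storativity, m
t = np.logspace(1, 6, 200)                # time since pumping started, s
u = r**2 * S / (4 * T * t)
s = Q / (4 * np.pi * T) * exp1(u)         # Theis drawdown

# Derivative ds/d(ln t): infinite-acting radial flow shows as the derivative
# stabilizing at Q/(4*pi*T); deviations flag boundaries, skin, or fractures.
ds = np.gradient(s, np.log(t))

plt.loglog(t, s, label="drawdown")
plt.loglog(t, ds, label="derivative ds/dln t")
plt.xlabel("time (s)"); plt.ylabel("s, s' (m)"); plt.legend()
plt.show()
```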
Consistency of patterns in concentration‐discharge plots
Chanat, Jeffrey G.; Rice, Karen C.; Hornberger, George M.
2002-01-01
Concentration‐discharge (c‐Q) plots have been used to infer how flow components such as event water, soil water, and groundwater mix to produce the observed episodic hydrochemical response of small catchments. Because c‐Q plots are based only on observed streamflow and solute concentration, their interpretation requires assumptions about the relative volume, hydrograph timing, and solute concentration of the streamflow end‐members. Evans and Davies [1998] present a taxonomy of c‐Q loops resulting from three‐component conservative mixing. Their analysis, based on a fixed template of end‐member hydrograph volume, timing, and concentration, suggests a unique relationship between c‐Q loop form and the rank order of end‐member concentrations. Many catchments exhibit variability in component contributions to storm flow in response to antecedent conditions or rainfall characteristics, but the effects of such variation on c‐Q relationships have not been studied systematically. Starting with a “baseline” condition similar to that assumed by Evans and Davies [1998], we use a simple computer model to characterize the variability in c‐Q plot patterns resulting from variation in end‐member volume, timing, and solute concentration. Variability in these three factors can result in more than one c‐Q loop shape for a given rank order of end‐member solute concentrations. The number of resulting hysteresis patterns and their relative frequency depends on the rank order of solute concentrations and on their separation in absolute value. In ambiguous cases the c‐Q loop shape is determined by the relative “prominence” of the event water versus soil water components. This “prominence” is broadly defined as a capacity to influence the total streamflow concentration and may result from a combination of end‐member volume, timing, or concentration. The modeling results indicate that plausible hydrological variability in field situations can
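A minimal sketch of the kind of three-component conservative mixing model described above, tracing the c-Q loop through a storm event; the hydrograph shapes and end-member concentrations are illustrative assumptions, not the paper's baseline values:

```python
import numpy as np
import matplotlib.pyplot as plt

t = np.linspace(0, 48, 400)                                   # hours
bump = lambda peak, mu, sig: peak * np.exp(-0.5 * ((t - mu) / sig) ** 2)

Q_event = bump(4.0, 12, 4)             # event water: early and dilute
Q_soil  = bump(2.0, 18, 6)             # soil water: delayed
Q_gw    = 1.0 + bump(0.5, 30, 10)      # groundwater: slow baseflow response
c_event, c_soil, c_gw = 0.1, 1.0, 3.0  # fixed end-member concentrations

Q = Q_event + Q_soil + Q_gw
c = (c_event * Q_event + c_soil * Q_soil + c_gw * Q_gw) / Q   # conservative mixing

plt.plot(Q, c)                         # hysteresis loop traced in the c-Q plane
plt.xlabel("discharge Q"); plt.ylabel("concentration c")
plt.show()
```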
A cumulative index to Aeronautical Engineering: A special bibliography
NASA Technical Reports Server (NTRS)
1978-01-01
This publication is a cumulative index to the abstracts contained in NASA SP-7037 (80) through NASA SP-7037 (91) of Aeronautical Engineering: A Special Bibliography. NASA SP-7037 and its supplements have been compiled through the cooperative efforts of the American Institute of Aeronautics and Astronautics (AIAA) and the National Aeronautics and Space Administration (NASA). This cumulative index includes subject, personal author, corporate source, contract, and report number indexes.
Rapidity dependence of proton cumulants and correlation functions
Bzdak, Adam; Koch, Volker
2017-11-13
The dependence of multiproton correlation functions and cumulants on the acceptance in rapidity and transverse momentum is studied. We find that the preliminary data on various cumulant ratios are consistent, within errors, with rapidity- and transverse-momentum-independent correlation functions, although rapidity correlations that moderately increase with the rapidity separation between protons are slightly favored. We propose to further explore the rapidity dependence of multiparticle correlation functions by measuring the dependence of the integrated reduced correlation functions on the size of the rapidity window.
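A minimal sketch of how multiplicity cumulants and the commonly quoted ratios are extracted from an event-by-event (net-)proton number distribution; the negative-binomial event sample is a stand-in, not experimental data:

```python
import numpy as np

rng = np.random.default_rng(3)
N = rng.negative_binomial(20, 0.5, size=1_000_000)   # toy event-by-event counts

m = N.mean()
mu = lambda k: np.mean((N - m) ** k)   # central moments of the distribution

C1, C2, C3 = m, mu(2), mu(3)           # first three cumulants
C4 = mu(4) - 3 * mu(2) ** 2            # fourth cumulant

print(C2 / C1, C3 / C2, C4 / C2)       # typical cumulant ratios
```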
Plotting landscape perspectives of clearcut units.
Roger H. Twito
1978-01-01
The aesthetics of clearcut units potentially visible from a distance can be improved by judicious design. Screening clearcuts from view or shaping them to characterize natural openings are current methods used for this purpose. Perspective plots illustrating how proposed clearcut units will look from specific off-site viewing points provide an...
Research and cumulative watershed effects
L. M. Reid
1993-01-01
The mandate for land managers to address cumulative watershed effects (CWEs) requires that planners evaluate the potential impacts of their activities on multiple beneficial uses within the context of other coexisting activities in a watershed. Types of CWEs vary with the types of land-use activities and their modes of interaction, but published studies illustrate...
PLOT3D/AMES, GENERIC UNIX VERSION USING DISSPLA (WITH TURB3D)
NASA Technical Reports Server (NTRS)
Buning, P.
1994-01-01
PLOT3D is an interactive graphics program designed to help scientists visualize computational fluid dynamics (CFD) grids and solutions. Today, supercomputers and CFD algorithms can provide scientists with simulations of such highly complex phenomena that obtaining an understanding of the simulations has become a major problem. Tools which help the scientist visualize the simulations can be of tremendous aid. PLOT3D/AMES offers more functions and features, and has been adapted for more types of computers than any other CFD graphics program. Version 3.6b+ is supported for five computers and graphic libraries. Using PLOT3D, CFD physicists can view their computational models from any angle, observing the physics of problems and the quality of solutions. As an aid in designing aircraft, for example, PLOT3D's interactive computer graphics can show vortices, temperature, reverse flow, pressure, and dozens of other characteristics of air flow during flight. As critical areas become obvious, they can easily be studied more closely using a finer grid. PLOT3D is part of a computational fluid dynamics software cycle. First, a program such as 3DGRAPE (ARC-12620) helps the scientist generate computational grids to model an object and its surrounding space. Once the grids have been designed and parameters such as the angle of attack, Mach number, and Reynolds number have been specified, a "flow-solver" program such as INS3D (ARC-11794 or COS-10019) solves the system of equations governing fluid flow, usually on a supercomputer. Grids sometimes have as many as two million points, and the "flow-solver" produces a solution file which contains density, x- y- and z-momentum, and stagnation energy for each grid point. With such a solution file and a grid file containing up to 50 grids as input, PLOT3D can calculate and graphically display any one of 74 functions, including shock waves, surface pressure, velocity vectors, and particle traces. PLOT3D's 74 functions are organized into
PLOT3D/AMES, GENERIC UNIX VERSION USING DISSPLA (WITHOUT TURB3D)
NASA Technical Reports Server (NTRS)
Buning, P.
1994-01-01
PLOT3D is an interactive graphics program designed to help scientists visualize computational fluid dynamics (CFD) grids and solutions. Today, supercomputers and CFD algorithms can provide scientists with simulations of such highly complex phenomena that obtaining an understanding of the simulations has become a major problem. Tools which help the scientist visualize the simulations can be of tremendous aid. PLOT3D/AMES offers more functions and features, and has been adapted for more types of computers than any other CFD graphics program. Version 3.6b+ is supported for five computers and graphic libraries. Using PLOT3D, CFD physicists can view their computational models from any angle, observing the physics of problems and the quality of solutions. As an aid in designing aircraft, for example, PLOT3D's interactive computer graphics can show vortices, temperature, reverse flow, pressure, and dozens of other characteristics of air flow during flight. As critical areas become obvious, they can easily be studied more closely using a finer grid. PLOT3D is part of a computational fluid dynamics software cycle. First, a program such as 3DGRAPE (ARC-12620) helps the scientist generate computational grids to model an object and its surrounding space. Once the grids have been designed and parameters such as the angle of attack, Mach number, and Reynolds number have been specified, a "flow-solver" program such as INS3D (ARC-11794 or COS-10019) solves the system of equations governing fluid flow, usually on a supercomputer. Grids sometimes have as many as two million points, and the "flow-solver" produces a solution file which contains density, x- y- and z-momentum, and stagnation energy for each grid point. With such a solution file and a grid file containing up to 50 grids as input, PLOT3D can calculate and graphically display any one of 74 functions, including shock waves, surface pressure, velocity vectors, and particle traces. PLOT3D's 74 functions are organized into
Scott-Storey, Kelly
2011-07-01
For women, any one type of abuse rarely occurs in isolation of other types, and a single abusive experience is often the exception rather than the norm. The importance of this concept of the cumulative nature of abuse and its negative impact on health has been well recognized within the empirical literature, however there has been little consensus on what to call this phenomenon or how to study it. For the most part researchers have operated on the premise that it is the sheer number of different types of cumulating abuse experiences that is primarily responsible for worse health outcomes among women. And although this simplistic 'more is worse' approach to conceptualizing and operationalizing cumulative abuse has proven to be a powerful predictor of poorer health, it contradicts growing empirical evidence that suggests not all victimizations are created equal and that some victimizations may have a more deleterious effect on health than others. Embedded in abuse histories are individual and abuse characteristics as well as other life adversities that need to be considered in order to fully understand the spectrum and magnitude of cumulative abuse and its impact on women's health. Furthermore, given the long-term and persistent effects of abuse on health it becomes imperative to not only evaluate recent abusive experiences, but rather all abuse experiences occurring across the lifespan. This review highlights and evaluates the conceptual, operational, and methodological challenges posed by our current methods of studying and understanding the phenomenon of cumulative abuse and suggests that this phenomenon and its relationship to health is much more complex than research is currently portraying. This paper calls for the urgent need for interdisciplinary collaboration in order to more effectively and innovatively study the phenomenon of cumulative abuse.
Cumulative carbon as a policy framework for achieving climate stabilization
Matthews, H. Damon; Solomon, Susan; Pierrehumbert, Raymond
2012-01-01
The primary objective of the United Nations Framework Convention on Climate Change is to stabilize greenhouse gas concentrations at a level that will avoid dangerous climate impacts. However, greenhouse gas concentration stabilization is an awkward framework within which to assess dangerous climate change on account of the significant lag between a given concentration level and the eventual equilibrium temperature change. By contrast, recent research has shown that global temperature change can be well described by a given cumulative carbon emissions budget. Here, we propose that cumulative carbon emissions represent an alternative framework that is applicable both as a tool for climate mitigation as well as for the assessment of potential climate impacts. We show first that both atmospheric CO2 concentration at a given year and the associated temperature change are generally associated with a unique cumulative carbon emissions budget that is largely independent of the emissions scenario. The rate of global temperature change can therefore be related to first order to the rate of increase of cumulative carbon emissions. However, transient warming over the next century will also be strongly affected by emissions of shorter lived forcing agents such as aerosols and methane. Non-CO2 emissions therefore contribute to uncertainty in the cumulative carbon budget associated with near-term temperature targets, and may suggest the need for a mitigation approach that considers separately short- and long-lived gas emissions. By contrast, long-term temperature change remains primarily associated with total cumulative carbon emissions owing to the much longer atmospheric residence time of CO2 relative to other major climate forcing agents. PMID:22869803
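As a worked illustration of the near-proportionality described above, assume a transient climate response to cumulative emissions of 1.5 °C per 1000 PgC (a mid-range illustrative value, not taken from the paper):

```python
TCRE = 1.5 / 1000.0                    # assumed deg C of warming per PgC emitted

def warming(cumulative_pgc: float) -> float:
    """Warming implied by a cumulative carbon emissions budget."""
    return TCRE * cumulative_pgc

print(warming(600.0))                  # ~0.9 deg C after 600 PgC emitted
print(2.0 / TCRE)                      # ~1333 PgC budget consistent with 2 deg C
```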
Interactive Effects of Cumulative Stress and Impulsivity on Alcohol Consumption
Fox, Helen C.; Bergquist, Keri L.; Gu, Peihua; Sinha, Rajita
2013-01-01
Background Alcohol addiction may reflect adaptations to stress, reward, and regulatory brain systems. While extensive research has identified both stress and impulsivity as independent risk factors for drinking, few studies have assessed the interactive relationship between stress and impulsivity in terms of hazardous drinking within a community sample of regular drinkers. Methods One hundred and thirty regular drinkers (56M/74F) from the local community were assessed for hazardous and harmful patterns of alcohol consumption using the Alcohol Use Disorders Identification Test (AUDIT). All participants were also administered the Barratt Impulsiveness Scale (BIS-11) as a measure of trait impulsivity and the Cumulative Stress/Adversity Checklist (CSC) as a comprehensive measure of cumulative adverse life events. Standard multiple regression models were used to ascertain the independent and interactive nature of both overall stress and impulsivity as well as specific types of stress and impulsivity on hazardous and harmful drinking. Results Recent life stress, cumulative traumatic stress, overall impulsivity, and nonplanning-related impulsivity as well as cognitive and motor-related impulsivity were all independently predictive of AUDIT scores. However, the interaction between cumulative stress and total impulsivity scores accounted for a significant amount of the variance, indicating that a high to moderate number of adverse events and a high trait impulsivity rating interacted to affect greater AUDIT scores. The subscale of cumulative life trauma accounted for the most variance in AUDIT scores among the stress and impulsivity subscales. Conclusions Findings highlight the interactive relationship between stress and impulsivity with regard to hazardous drinking. The specific importance of cumulative traumatic stress as a marker for problem drinking is also discussed. PMID:20491738
Influence of tree spatial pattern and sample plot type and size on inventory
John-Pascall Berrill; Kevin L. O' Hara
2012-01-01
Sampling with different plot types and sizes was simulated using tree location maps and data collected in three even-aged coast redwood (Sequoia sempervirens) stands selected to represent uniform, random, and clumped spatial patterns of tree locations. Fixed-radius circular plots, belt transects, and variable-radius plots were installed by...
Cumulative hip contact stress predicts osteoarthritis in DDH.
Mavcic, Blaz; Iglic, Ales; Kralj-Iglic, Veronika; Brand, Richard A; Vengust, Rok
2008-04-01
Hip stresses are generally believed to influence whether a hip develops osteoarthritis (OA); similarly, various osteotomies have been proposed to reduce contact stresses and the risk of OA. We asked whether elevated hip contact stress predicted osteoarthritis in initially asymptomatic human hips. We identified 58 nonoperatively treated nonsubluxated hips with developmental dysplasia (DDH) without symptoms at skeletal maturity; the control group included 48 adult hips without hip disease. The minimum followup was 20 years (mean, 29 years; range, 20-41 years). Peak contact stress was computed with the HIPSTRESS method using anteroposterior pelvic radiographs at skeletal maturity. The cumulative contact stress was determined by multiplying the peak contact stress by age at followup. We compared WOMAC scores and radiographic indices of OA. Dysplastic hips had higher mean peak contact and higher mean cumulative contact stress than normal hips. Mean WOMAC scores and percentage of asymptomatic hips in the study group (mean age 51 years) were similar to those in the control group (mean age 68 years). After adjusting for gender and age, the cumulative contact stress, Wiberg center-edge angle, body mass index, but not the peak contact stress, independently predicted the final WOMAC score in dysplastic hips but not in normal hips. Cumulative contact stress predicted early hip OA better than the Wiberg center-edge angle. Level II, prognostic study. See the Guidelines for Authors for a complete description of levels of evidence.
Representing Uncertainty on Model Analysis Plots
ERIC Educational Resources Information Center
Smith, Trevor I.
2016-01-01
Model analysis provides a mechanism for representing student learning as measured by standard multiple-choice surveys. The model plot contains information regarding both how likely students in a particular class are to choose the correct answer and how likely they are to choose an answer consistent with a well-documented conceptual model.…
Using volcano plots and regularized-chi statistics in genetic association studies.
Li, Wentian; Freudenberg, Jan; Suh, Young Ju; Yang, Yaning
2014-02-01
Labor intensive experiments are typically required to identify the causal disease variants from a list of disease associated variants in the genome. For designing such experiments, candidate variants are ranked by their strength of genetic association with the disease. However, the two commonly used measures of genetic association, the odds-ratio (OR) and p-value may rank variants in different order. To integrate these two measures into a single analysis, here we transfer the volcano plot methodology from gene expression analysis to genetic association studies. In its original setting, volcano plots are scatter plots of fold-change and t-test statistic (or -log of the p-value), with the latter being more sensitive to sample size. In genetic association studies, the OR and Pearson's chi-square statistic (or equivalently its square root, chi; or the standardized log(OR)) can be analogously used in a volcano plot, allowing for their visual inspection. Moreover, the geometric interpretation of these plots leads to an intuitive method for filtering results by a combination of both OR and chi-square statistic, which we term "regularized-chi". This method selects associated markers by a smooth curve in the volcano plot instead of the right-angled lines which corresponds to independent cutoffs for OR and chi-square statistic. The regularized-chi incorporates relatively more signals from variants with lower minor-allele-frequencies than chi-square test statistic. As rare variants tend to have stronger functional effects, regularized-chi is better suited to the task of prioritization of candidate genes. Copyright © 2013 Elsevier Ltd. All rights reserved.
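A minimal sketch of such a volcano plot for 2x2 association tables, plotting the log odds ratio against the signed chi statistic (the square root of Pearson's chi-square); the tables are toy data, and no attempt is made to reproduce the paper's exact regularized-chi selection boundary:

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import chi2_contingency

rng = np.random.default_rng(4)

def or_and_chi(a, b, c, d):
    """Odds ratio and signed chi for the 2x2 table [[a, b], [c, d]]."""
    odds_ratio = (a * d) / (b * c)
    stat = chi2_contingency(np.array([[a, b], [c, d]]), correction=False)[0]
    return odds_ratio, np.sign(np.log(odds_ratio)) * np.sqrt(stat)

tables = rng.integers(20, 500, size=(300, 4))        # toy allele-count tables
ors, chis = zip(*(or_and_chi(*tbl) for tbl in tables))

plt.scatter(np.log(ors), chis, s=8)
plt.xlabel("log odds ratio"); plt.ylabel("signed chi")
plt.show()
```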
A cumulative index to Aeronautical Engineering: A continuing bibliography
NASA Technical Reports Server (NTRS)
1982-01-01
This bibliography is a cumulated index to the abstracts contained in NASA SP-7037(132) through NASA SP-7037(143) of Aeronautical Engineering: A continuing bibliography. NASA SP-7037 and its supplements have been compiled through the cooperative efforts of the American Institute of Aeronautics and Astronautics (AIAA) and the National Aeronautics and Space Administration (NASA). This cumulative index includes subject, personal author, corporate source, contract, and report number indexes.
NASA Astrophysics Data System (ADS)
Duchesne, J. C.; Charlier, B.
2005-08-01
Whole-rock major element compositions are investigated in 99 cumulates from the Proterozoic Bjerkreim-Sokndal layered intrusion (Rogaland Anorthosite Province, SW Norway), which results from the crystallization of a jotunite (Fe-Ti-P-rich hypersthene monzodiorite) parental magma. The scattering of cumulate compositions covers three types of cumulates: (1) ilmenite-leuconorite with plagioclase, ilmenite and Ca-poor pyroxene as cumulus minerals, (2) magnetite-leuconorite with the same minerals plus magnetite, and (3) gabbronorite made up of plagioclase, Ca-poor and Ca-rich pyroxenes, ilmenite, Ti-magnetite and apatite. Each type of cumulate displays a linear trend in variation diagrams. One pole of the linear trends is represented by plagioclase, and the other by a mixture of the mafic minerals in constant proportion. The mafic minerals were not sorted during cumulate formation though they display large density differences. This suggests that crystal settling did not operate during cumulate formation, and that in situ crystallization with variable nucleation rate for plagioclase was the dominant formation mechanism. The trapped liquid fraction of the cumulate plays a negligible role for the cumulate major element composition. Each linear trend is a locus for the cotectic composition of the cumulates. This property permits reconstruction by graphical mass balance calculation of the first two stages of the liquid line of descent, starting from a primitive jotunite, the Tjörn parental magma. Another type of cumulate, called jotunite cumulate and defined by the mineral association from the Transition Zone of the intrusion, has to be subtracted to simulate the most evolved part of the liquid line of descent. The proposed model demonstrates that average cumulate compositions represent cotectic compositions when the number of samples is large (> 40). The model, however, does not account for the K2O evolution, suggesting that the system was open to contamination by roof
Incorporating Nonchemical Stressors Into Cumulative Risk Assessments
Rider, Cynthia V.; Dourson, Michael L.; Hertzberg, Richard C.; Mumtaz, Moiz M.; Price, Paul S.; Simmons, Jane Ellen
2012-01-01
The role of nonchemical stressors in modulating the human health risk associated with chemical exposures is an area of increasing attention. On 9 March 2011, a workshop titled “Approaches for Incorporating Nonchemical Stressors into Cumulative Risk Assessment” took place during the 50th Anniversary Annual Society of Toxicology Meeting in Washington D.C. Objectives of the workshop included describing the current state of the science from various perspectives (i.e., regulatory, exposure, modeling, and risk assessment) and presenting expert opinions on currently available methods for incorporating nonchemical stressors into cumulative risk assessments. Herein, distinct frameworks for characterizing exposure to, joint effects of, and risk associated with chemical and nonchemical stressors are discussed. PMID:22345310
A Guided Inquiry on Hubble Plots and the Big Bang
NASA Astrophysics Data System (ADS)
Forringer, Ted
2014-04-01
In our science for non-science majors course "21st Century Physics," we investigate modern "Hubble plots" (plots of velocity versus distance for deep space objects) in order to discuss the Big Bang, dark matter, and dark energy. There are two potential challenges that our students face when encountering these topics for the first time. The first is understanding and interpreting Hubble plots. The second is that some of our students have religious or cultural objections to the concept of a "Big Bang" or a universe that is billions of years old. This paper presents a guided inquiry exercise created to introduce students to Hubble plots and give them the opportunity to discover for themselves why we believe our universe started with an explosion billions of years ago. The exercise is designed to be completed before the topics are discussed in the classroom. We ran it as a one-hour-45-minute lab in groups of three or four students, but it would also work as an individual take-home assignment.
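For readers who want a quick numerical companion to such an exercise, the sketch below fits a Hubble plot (velocity versus distance, line through the origin) and converts the slope into an expansion age; the galaxy data are made up for illustration:

    import numpy as np

    # Illustrative (invented) galaxies: distance in Mpc, recession velocity in km/s
    distance = np.array([  50,  120,   200,   310,   420,   560])
    velocity = np.array([3400, 8700, 13800, 22100, 29500, 39600])

    # Least-squares slope of a line through the origin is the Hubble constant H0
    H0 = np.sum(distance * velocity) / np.sum(distance**2)   # km/s/Mpc

    # 1/H0 approximates the expansion age of the universe
    km_per_Mpc = 3.086e19
    age_gyr = km_per_Mpc / H0 / (3.156e7 * 1e9)
    print(f"H0 ~ {H0:.0f} km/s/Mpc, expansion age ~ {age_gyr:.1f} Gyr")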
Evolution of costly explicit memory and cumulative culture.
Nakamaru, Mayuko
2016-06-21
Humans can acquire new information and modify it (cumulative culture) based on their learning and memory abilities, especially explicit memory, through the processes of encoding, consolidation, storage, and retrieval. Explicit memory is categorized into semantic and episodic memories. Animals have semantic memory, while episodic memory is unique to humans and essential for innovation and the evolution of culture. As both episodic and semantic memory are needed for innovation, the evolution of explicit memory influences the evolution of culture. However, previous theoretical studies have shown that environmental fluctuations influence the evolution of imitation (social learning) and innovation (individual learning) and assume that memory is not an evolutionary trait. If individuals can store and retrieve acquired information properly, they can modify it and innovate new information. Therefore, being able to store and retrieve information is essential from the perspective of cultural evolution. However, if both storage and retrieval were too costly, forgetting and relearning would have an advantage over storing and retrieving acquired information. In this study, using mathematical analysis and individual-based simulations, we investigate whether cumulative culture can promote the coevolution of costly memory and social and individual learning, assuming that cumulative culture improves the fitness of each individual. The conclusions are: (1) without cumulative culture, a social learning cost is essential for the evolution of storage-retrieval. Costly storage-retrieval can evolve with individual learning but costly social learning does not evolve. When low-cost social learning evolves, the repetition of forgetting and learning is favored more than the evolution of costly storage-retrieval, even though a cultural trait improves the fitness. (2) When cumulative culture exists and improves fitness, storage-retrieval can evolve with social and/or individual learning, which
Higher order cumulants in colorless partonic plasma
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cherif, S.; Laboratoire de Physique et de Mathématiques Appliquées; Ahmed, M. A. A.
2016-06-10
Any physical system considered to study the QCD deconfinement phase transition certainly has a finite volume, so finite size effects are inevitably present. This renders the location of the phase transition and the determination of its order an extremely difficult task, even in the simplest known cases. In order to identify and locate the colorless QCD deconfinement transition point in finite volume T0(V), a new approach based on the finite-size cumulant expansion of the order parameter and the ℒm,n-method is used. We have shown that both the higher-order cumulants and their ratios, associated with the thermodynamical fluctuations of the order parameter, behave in a distinctive way in the QCD deconfinement phase transition, revealing pronounced oscillations in the transition region. The sign structure and the oscillatory behavior of these quantities in the vicinity of the deconfinement phase transition point might be a sensitive probe and may allow one to elucidate their relation to the QCD phase transition point. In the context of our model, we have shown that the finite volume transition point is always associated with the appearance of a particular point in all higher-order cumulants under consideration.
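For readers unfamiliar with the quantities involved, the sketch below computes the first four cumulants of a sampled order parameter from its central moments (k4 = m4 - 3*m2^2) and the ratios k3/k2 and k4/k2 often used as fluctuation probes; the Gaussian toy sample is illustrative only:

    import numpy as np

    def cumulants(x):
        # First four cumulants from central moments:
        # k1 = mean, k2 = m2, k3 = m3, k4 = m4 - 3*m2**2
        m = x.mean()
        d = x - m
        m2, m3, m4 = (d**2).mean(), (d**3).mean(), (d**4).mean()
        return m, m2, m3, m4 - 3.0 * m2**2

    rng = np.random.default_rng(1)
    samples = rng.normal(size=100_000)      # toy order-parameter fluctuations
    k1, k2, k3, k4 = cumulants(samples)
    print(f"k3/k2 = {k3/k2:+.3f}, k4/k2 = {k4/k2:+.3f}  (both ~0 for a Gaussian)")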
Phthalates and Cumulative Risk Assessment (NAS Final ...
On December 18, 2008, the National Academy of Sciences' National Research Council released a final report, requested and sponsored by the EPA, entitled Phthalates and Cumulative Risk Assessment: The Task Ahead. Risk assessment has become a dominant public policy tool for making choices, based on limited resources, to protect public health and the environment. It has been instrumental to the mission of the U.S. Environmental Protection Agency (EPA) as well as other federal agencies in evaluating public health concerns, informing regulatory and technological decisions, prioritizing research needs and funding, and in developing approaches for cost-benefit analysis. People are exposed to a variety of chemicals throughout their daily lives. To protect public health, regulators use risk assessments to examine the effects of chemical exposures. This book provides guidance for assessing the risk of phthalates, chemicals found in many consumer products that have been shown to affect the development of the male reproductive system of laboratory animals. Because people are exposed to multiple phthalates and other chemicals that affect male reproductive development, a cumulative risk assessment should be conducted that evaluates the combined effects of exposure to all these chemicals. The book suggests an approach for cumulative risk assessment that can serve as a model for evaluating the health risks of other types of chemicals.
EPA Workshop on Epigenetics and Cumulative Risk ...
Agenda Download the Workshop Agenda (PDF) The workshop included presentations and discussions by scientific experts pertaining to three topics (i.e., epigenetic changes associated with diverse stressors, key science considerations in understanding epigenetic changes, and practical application of epigenetic tools to address cumulative risks from environmental stressors), to address several questions under each topic, and included an opportunity for attendees to participate in break-out groups, provide comments and ask questions. Workshop Goals The workshop seeks to examine the opportunity for use of aggregate epigenetic change as an indicator in cumulative risk assessment for populations exposed to multiple stressors that affect epigenetic status. Epigenetic changes are specific molecular changes around DNA that alter expression of genes. Epigenetic changes include DNA methylation, formation of histone adducts, and changes in micro RNAs. Research today indicates that epigenetic changes are involved in many chronic diseases (cancer, cardiovascular disease, obesity, diabetes, mental health disorders, and asthma). Research has also linked a wide range of stressors including pollution and social factors with occurrence of epigenetic alterations. Epigenetic changes have the potential to reflect impacts of risk factors across multiple stages of life. Only recently receiving attention is the nexus between the factors of cumulative exposure to environmental
IMS/Satellite Situation Center report: Predicted orbit plots for IMP-J-1976
NASA Technical Reports Server (NTRS)
1975-01-01
Predicted orbit plots for the IMP-J satellite were given for the time period January-December 1976. These plots are shown in three projections. The time period covered by each set of projections is 12 days and 6 hours, corresponding approximately to the period of IMP-J. The three coordinate systems used are the Geocentric Solar Ecliptic system (GSE), the Geocentric Solar Magnetospheric system (GSM), and the Solar Magnetic system (SM). For each of the three projections, time ticks and codes are given on the satellite trajectories. The codes are interpreted in the table at the base of each plot. Time is given in the table as year/day/decimal hour, and the total time covered by each plot is shown at the bottom of each table.
CUMULATE ROCKS ASSOCIATED WITH CARBONATE ASSIMILATION, HORTAVÆR COMPLEX, NORTH-CENTRAL NORWAY
NASA Astrophysics Data System (ADS)
Barnes, C. G.; Prestvik, T.; Li, Y.
2009-12-01
The Hortavær igneous complex intruded high-grade metamorphic rocks of the Caledonian Helgeland Nappe Complex at ca. 466 Ma. The complex is an unusual mafic-silicic layered intrusion (MASLI) because the principal felsic rock type is syenite and because the syenite formed in situ rather than by deep-seated partial melting of crustal rocks. Magma differentiation in the complex was by assimilation, primarily of calc-silicate rocks and melts with contributions from marble and semi-pelites, plus fractional crystallization. The effect of assimilation of calcite-rich rocks was to enhance stability of fassaitic clinopyroxene at the expense of olivine, which resulted in alkali-rich residual melts and lowering of silica activity. This combination of MASLI-style emplacement and carbonate assimilation produced three types of cumulate rocks: (1) Syenitic cumulates formed by liquid-crystal separation. As sheets of mafic magma were loaded on crystal-rich syenitic magma, residual liquid was expelled, penetrating the overlying mafic sheets in flame structures, and leaving a cumulate syenite. (2) Reaction cumulates. Carbonate assimilation, illustrated by a simple assimilation reaction: olivine + calcite + melt = clinopyroxene + CO2 resulted in cpx-rich cumulates such as clinopyroxenite, gabbro, and mela-monzodiorite, many of which contain igneous calcite. (3) Magmatic skarns. Calc-silicate host rocks underwent partial melting during assimilation, yielding a Ca-rich melt as the principal assimilated material and permitting extensive reaction with surrounding magma to form Kspar + cpx + garnet-rich ‘cumulate’ rocks. Cumulate types (2) and (3) do not reflect traditional views of cumulate rocks but instead result from a series of melt-present discontinuous (peritectic) reactions and partial melting of calc-silicate xenoliths. In the Hortavær complex, such cumulates are evident because of the distinctive peritectic cumulate assemblages. It is unclear whether assimilation of
Emma Lucy Braun's forest plots in eastern North America.
Ricklefs, Robert E
2018-02-01
Relative abundances of tree species are presented for the 348 forest plots described in E. Lucy Braun's (1950) book, Deciduous Forests of Eastern North America (Hafner, New York, facsimile reprint 1972). Information about the plots includes forest type, location with latitude and longitude, WorldClim climate variables, and sources of original studies where applicable. No copyright restrictions are associated with the use of this data set. Please cite this article when the data are used in other publications. © 2017 by the Ecological Society of America.
Hyperscaling breakdown and Ising spin glasses: The Binder cumulant
NASA Astrophysics Data System (ADS)
Lundow, P. H.; Campbell, I. A.
2018-02-01
Among the Renormalization Group Theory scaling rules relating critical exponents, there are hyperscaling rules involving the dimension of the system. It is well known that in Ising models hyperscaling breaks down above the upper critical dimension. It was shown by Schwartz (1991) that the standard Josephson hyperscaling rule can also break down in Ising systems with quenched random interactions. A related Renormalization Group Theory hyperscaling rule links the critical exponents for the normalized Binder cumulant and the correlation length in the thermodynamic limit. An appropriate scaling approach for analyzing measurements from criticality to infinite temperature is first outlined. Numerical data on the scaling of the normalized correlation length and the normalized Binder cumulant are shown for the canonical Ising ferromagnet model in dimension three where hyperscaling holds, for the Ising ferromagnet in dimension five (so above the upper critical dimension) where hyperscaling breaks down, and then for Ising spin glass models in dimension three where the quenched interactions are random. For the Ising spin glasses there is a breakdown of the normalized Binder cumulant hyperscaling relation in the thermodynamic limit regime, with a return to size independent Binder cumulant values in the finite-size scaling regime around the critical region.
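As a reference point, the fourth-order Binder cumulant used in such analyses is U4 = 1 - <m^4>/(3<m^2>^2); the sketch below evaluates it for toy magnetization samples, giving roughly 0 for Gaussian fluctuations and 2/3 for a fully ordered phase:

    import numpy as np

    def binder_cumulant(m):
        # U4 = 1 - <m^4> / (3 <m^2>^2) for a sample of magnetizations m
        return 1.0 - (m**4).mean() / (3.0 * ((m**2).mean())**2)

    rng = np.random.default_rng(2)
    print(binder_cumulant(rng.normal(size=200_000)))   # ~0 for Gaussian fluctuations
    print(binder_cumulant(np.ones(100)))               # 2/3 for an ordered phase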
Split-plot microarray experiments: issues of design, power and sample size.
Tsai, Pi-Wen; Lee, Mei-Ling Ting
2005-01-01
This article focuses on microarray experiments with two or more factors in which treatment combinations of the factors corresponding to the samples paired together onto arrays are not completely random. A main effect of one (or more) factor(s) is confounded with arrays (the experimental blocks). This is called a split-plot microarray experiment. We utilise an analysis of variance (ANOVA) model to assess differentially expressed genes for between-array and within-array comparisons that are generic under a split-plot microarray experiment. Instead of standard t- or F-test statistics that rely on mean square errors of the ANOVA model, we use a robust method, referred to as 'a pooled percentile estimator', to identify genes that are differentially expressed across different treatment conditions. We illustrate the design and analysis of split-plot microarray experiments based on a case application described by Jin et al. A brief discussion of power and sample size for split-plot microarray experiments is also presented.
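The pooled percentile estimator of the article is not reproduced here, but the split-plot structure itself, arrays as random blocks confounded with the whole-plot factor, can be sketched with a standard mixed model for a single gene; the toy layout, effect sizes, and variance components below are assumptions for illustration:

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Toy split-plot layout: factor A is applied per array (whole plot),
    # factor B varies within arrays via the two dye channels (sub plot).
    rng = np.random.default_rng(3)
    arrays = np.repeat(np.arange(8), 2)
    A = np.repeat(rng.permutation([0, 1] * 4), 2)   # one level of A per array
    B = np.tile([0, 1], 8)                          # both levels of B on each array
    expr = (0.5 * A + 0.8 * B + np.repeat(rng.normal(0, 0.5, 8), 2)
            + rng.normal(0, 0.3, 16))
    df = pd.DataFrame({"expr": expr, "A": A, "B": B, "array": arrays})

    # The random array effect carries the between-array (whole-plot) error stratum
    fit = smf.mixedlm("expr ~ A * B", df, groups=df["array"]).fit()
    print(fit.summary())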
HUMAN EXPOSURE MODELING FOR CUMULATIVE RISK
US EPA's Office of Research and Development (ORD) has identified cumulative risk assessment as a priority research area. This is because humans and other organisms are exposed to a multitude of chemicals, physical agents, and other stressors through multiple pathways, routes, an...
Aeronautical engineering: A cumulative index to a continuing bibliography
NASA Technical Reports Server (NTRS)
1987-01-01
This bibliography is a cumulative index to the abstracts contained in NASA SP-7037 (197) through NASA SP-7037 (208) of Aeronautical Engineering: A Continuing Bibliography. NASA SP-7037 and its supplements have been compiled through the cooperative efforts of the American Institute of Aeronautics and Astronautics (AIAA) and the National Aeronautics and Space Administration (NASA). This cumulative index includes subject, personal author, corporate source, foreign technology, contract, report number, and accession number indexes.
Aeronautical engineering: A cumulative index to a continuing bibliography
NASA Technical Reports Server (NTRS)
1988-01-01
This bibliography is a cumulative index to the abstracts contained in NASA SP-7037(210) through NASA SP-7037(221) of Aeronautical Engineering: A Continuing Bibliography. NASA SP-7037 and its supplements have been compiled through the cooperative efforts of the American Institute of Aeronautics and Astronautics (AIAA) and the National Aeronautics and Space Administration (NASA). This cumulative index includes subject, personal author, corporate source, foreign technology, contract number, report number, and accession number indexes.
NPLOT: an Interactive Plotting Program for NASTRAN Finite Element Models
NASA Technical Reports Server (NTRS)
Jones, G. K.; Mcentire, K. J.
1985-01-01
NPLOT (NASTRAN Plot) is an interactive computer graphics program for plotting undeformed and deformed NASTRAN finite element models. Developed at NASA's Goddard Space Flight Center, the program provides flexible element selection and grid point, ASET, and SPC degree-of-freedom labelling. It is easy to use, with a combined menu- and command-driven user interface. NPLOT also provides very fast hidden-line and haloed-line algorithms. The hidden-line algorithm in NPLOT proved to be both very accurate and several times faster than other existing hidden-line algorithms. A fast spatial bucket sort and horizon edge computation are used to achieve this high level of performance. The hidden-line and haloed-line algorithms are the primary features that distinguish NPLOT from other plotting programs.
Acquisition of choice in concurrent chains: Assessing the cumulative decision model.
Grace, Randolph C
2016-05-01
Concurrent chains is widely used to study pigeons' choice between terminal links that can vary in delay, magnitude, or probability of reinforcement. We review research on the acquisition of choice in this procedure. Acquisition has been studied with a variety of research designs, and some studies have incorporated no-food trials to allow for timing and choice to be observed concurrently. Results show that: Choice can be acquired rapidly within sessions when terminal links change unpredictably; under steady-state conditions, acquisition depends on both initial- and terminal-link schedules; and initial-link responding is mediated by learning about the terminal-link stimulus-reinforcer relations. The cumulative decision model (CDM) proposed by Christensen and Grace (2010) and Grace and McLean (2006, 2015) provides a good description of within-session acquisition, and correctly predicts the effects of initial and terminal-link schedules in steady-state designs (Grace, 2002a). Questions for future research include how abrupt shifts in preference within individual sessions and temporal control of terminal-link responding can be modeled. Copyright © 2016 Elsevier B.V. All rights reserved.
Sylvio Mannel; Mark A. Rumble; Maribeth Price; Thomas M. Juntti; Dong Hua
2006-01-01
Many aspects of ecological research require measurement of characteristics within plots. Often, the time spent establishing plots is small relative to the time spent collecting and recording data. However, some studies require larger numbers of plots, where the time spent establishing the plot is consequential to the field effort. In open habitats, circular plots are...
NASA Technical Reports Server (NTRS)
Mohr, Karen I.; Molinari, John; Thorncroft, Chris D.
2010-01-01
The characteristics of convective system populations in West Africa and the western Pacific tropical cyclone basin were analyzed to investigate whether interannual variability in convective activity in tropical continental and oceanic environments is driven by variations in the number of events during the wet season or by favoring large and/or intense convective systems. Convective systems were defined from TRMM data as a cluster of pixels with an 85 GHz polarization-corrected brightness temperature below 255 K and with an area of at least 64 km². The study database consisted of convective systems in West Africa from May to September 1998-2007 and in the western Pacific from May to November 1998-2007. Annual cumulative frequency distributions for system minimum brightness temperature and system area were constructed for both regions. For both regions, there were no statistically significant differences among the annual curves for system minimum brightness temperature. There were two groups of system area curves, split by the TRMM altitude boost in 2001. Within each set, there was no statistically significant interannual variability. Sub-setting the database revealed some sensitivity in distribution shape to the size of the sampling area, length of sample period, and climate zone. From a regional perspective, the stability of the cumulative frequency distributions implied that the probability that a convective system would attain a particular size or intensity does not change interannually. Variability in the number of convective events appeared to be more important in determining whether a year is wetter or drier than normal.
NASA Technical Reports Server (NTRS)
Mohr, Karen I.; Molinari, John; Thorncroft, Chris
2009-01-01
The characteristics of convective system populations in West Africa and the western Pacific tropical cyclone basin were analyzed to investigate whether interannual variability in convective activity in tropical continental and oceanic environments is driven by variations in the number of events during the wet season or by favoring large and/or intense convective systems. Convective systems were defined from Tropical Rainfall Measuring Mission (TRMM) data as a cluster of pixels with an 85-GHz polarization-corrected brightness temperature below 255 K and with an area of at least 64 square kilometers. The study database consisted of convective systems in West Africa from May to September 1998-2007, and in the western Pacific from May to November 1998-2007. Annual cumulative frequency distributions for system minimum brightness temperature and system area were constructed for both regions. For both regions, there were no statistically significant differences between the annual curves for system minimum brightness temperature. There were two groups of system area curves, split by the TRMM altitude boost in 2001. Within each set, there was no statistically significant interannual variability. Subsetting the database revealed some sensitivity in distribution shape to the size of the sampling area, the length of the sample period, and the climate zone. From a regional perspective, the stability of the cumulative frequency distributions implied that the probability that a convective system would attain a particular size or intensity does not change interannually. Variability in the number of convective events appeared to be more important in determining whether a year is either wetter or drier than normal.
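A minimal sketch of the comparison underlying the two preceding entries: build annual cumulative frequency distributions of a system attribute and test adjacent years for significant differences. The lognormal toy areas and the use of a two-sample Kolmogorov-Smirnov test are illustrative assumptions, not the authors' exact procedure:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(4)
    # Toy convective-system areas (km^2) for two years, above the 64 km^2 floor
    year_a = 64 + rng.lognormal(mean=5.0, sigma=1.2, size=3000)
    year_b = 64 + rng.lognormal(mean=5.0, sigma=1.2, size=2800)

    # Empirical cumulative frequency distributions on a common grid
    grid = np.logspace(np.log10(64), 6, 200)
    cdf_a = np.searchsorted(np.sort(year_a), grid) / year_a.size
    cdf_b = np.searchsorted(np.sort(year_b), grid) / year_b.size

    # Two-sample Kolmogorov-Smirnov test for interannual differences
    ks = stats.ks_2samp(year_a, year_b)
    print(f"max CDF gap: {np.abs(cdf_a - cdf_b).max():.3f}, KS p-value: {ks.pvalue:.2f}")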
Cumulative effective dose associated with radiography and CT of adolescents with spinal injuries.
Lemburg, Stefan P; Peters, Soeren A; Roggenland, Daniela; Nicolas, Volkmar; Heyer, Christoph M
2010-12-01
The purpose of this study was to analyze the quantity and distribution of cumulative effective doses in diagnostic imaging of adolescents with spinal injuries. At a level 1 trauma center from July 2003 through June 2009, imaging procedures during initial evaluation and hospitalization and after discharge of all patients 10-20 years old with spinal fractures were retrospectively analyzed. The cumulative effective doses for all imaging studies were calculated, and the doses to patients with spinal injuries who had multiple traumatic injuries were compared with the doses to patients with spinal injuries but without multiple injuries. The significance level was set at 5%. Imaging studies of 72 patients (32 with multiple injuries; average age, 17.5 years) entailed a median cumulative effective dose of 18.89 mSv. Patients with multiple injuries had a significantly higher total cumulative effective dose (29.70 versus 10.86 mSv, p < 0.001) mainly owing to the significantly higher CT-related cumulative effective dose to multiple injury patients during the initial evaluation (18.39 versus 2.83 mSv, p < 0.001). Overall, CT accounted for 86% of the total cumulative effective dose. Adolescents with spinal injuries receive a cumulative effective dose equal to that of adult trauma patients and nearly three times that of pediatric trauma patients. Areas of focus in lowering cumulative effective dose should be appropriate initial estimation of trauma severity and careful selection of CT scan parameters.
A computer program for obtaining airplane configuration plots from digital Datcom input data
NASA Technical Reports Server (NTRS)
Roy, M. L.; Sliwa, S. M.
1983-01-01
A computer program is described which reads the input file for the Stability and Control Digital Datcom program and generates plots from the aircraft configuration data. These plots can be used to verify the geometric input data to the Digital Datcom program. The program described interfaces with utilities available for plotting aircraft configurations by creating a file from the Digital Datcom input data.
Thomas Shelton
2013-01-01
A small-plot field trial was conducted to examine the area of influence of fipronil at incremental distances away from treated plots on the Harrison Experimental Forest near Saucier, MS. Small treated (water and fipronil) plots were surrounded by untreated wooden boards in an eight-point radial pattern, and examined for evidence of termite feeding every 60 d for 1 yr...
CUMULATIVE RISK ANALYSIS FOR ORGANOPHOSPHORUS PESTICIDES
R. Woodrow Setzer, Jr. NHEERL MD-74, USEPA, RTP, NC 27711
The US EPA has recently completed a risk assessment of the effects of exposure to 33 organophosphorous pesticides (OPs) through the diet, water, and resi...
Plots, pixels, and partnerships: prospects for mapping, monitoring and modeling biodiversity.
H. Gyde Lund; Victor A. Rudis; Kenneth W. Stolte
1998-01-01
Many biodiversity inventories are conducted in relatively small areas, yet information is needed at the national, regional, and global levels. Most nations have forest inventory plot networks. While forest inventories may not contain the detailed species information that biodiversity inventories do, the forest inventory plot networks do represent large areas. Linkages...
Millroth, Philip; Guath, Mona; Juslin, Peter
2018-06-07
The rationality of decision making under risk is of central concern in psychology and other behavioral sciences. In real life, the information relevant to a decision often arrives sequentially or changes over time, implying nontrivial demands on memory. Yet, little is known about how this affects the ability to make rational decisions, and a default assumption is rather that information about outcomes and probabilities is simultaneously available at the time of the decision. In 4 experiments, we show that participants receiving probability and outcome information sequentially report substantially (29 to 83%) higher certainty equivalents than participants with simultaneous presentation. This holds also for monetarily incentivized participants with perfect recall of the information. Participants in the sequential conditions often violate stochastic dominance in the sense that they pay more for a lottery with a low probability of an outcome than participants in the simultaneous condition pay for a high probability of the same outcome. Computational modeling demonstrates that Cumulative Prospect Theory (Tversky & Kahneman, 1992) fails to account for the effects of sequential presentation, but a model assuming anchoring-and-adjustment constrained by memory can account for the data. By implication, established assumptions of rationality may need to be reconsidered to account for the effects of memory in many real-life tasks. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
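For context, a minimal implementation of the Cumulative Prospect Theory pieces relevant here, using the Tversky-Kahneman (1992) functional forms with their published parameter estimates (alpha = 0.88, gamma = 0.61) for a simple gain lottery; the certainty equivalent is the quantity the experiments elicit:

    def weight(p, gamma=0.61):
        # Tversky-Kahneman (1992) probability weighting function
        return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

    def cpt_value(outcome, p, alpha=0.88):
        # CPT valuation of a simple gain lottery (x, p; 0, 1-p)
        return weight(p) * outcome**alpha

    def certainty_equivalent(outcome, p, alpha=0.88):
        # Sure amount judged equal in value to the lottery
        return cpt_value(outcome, p, alpha) ** (1 / alpha)

    for p in (0.05, 0.50, 0.95):
        print(f"p = {p:.2f}: CE = {certainty_equivalent(100.0, p):.1f}")

The overweighting of small probabilities is visible immediately: at p = 0.05 the certainty equivalent of a 100-unit lottery exceeds its expected value of 5.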
Complexity and demographic explanations of cumulative culture.
Querbes, Adrien; Vaesen, Krist; Houkes, Wybo
2014-01-01
Formal models have linked prehistoric and historical instances of technological change (e.g., the Upper Paleolithic transition, cultural loss in Holocene Tasmania, scientific progress since the late nineteenth century) to demographic change. According to these models, cumulation of technological complexity is inhibited by decreasing, and favoured by increasing, population levels. Here we show that these findings are contingent on how complexity is defined: demography plays a much more limited role in sustaining cumulative culture when formal models deploy Herbert Simon's definition of complexity rather than the particular definitions of complexity hitherto assumed. Given that currently available empirical evidence does not afford discriminating proper from improper definitions of complexity, our robustness analyses put into question the force of recent demographic explanations of particular episodes of cultural change.
Luo, Quanzhou; Yue, Guihua; Valaskovic, Gary A; Gu, Ye; Wu, Shiaw-Lin; Karger, Barry L.
2008-01-01
Following on our recent work, on-line one-dimensional (1D) and two-dimensional (2D) PLOT/LC-ESI-MS platforms using 3.2 m × 10 μm i.d. poly(styrene-divinylbenzene) (PS-DVB) porous layer open tubular (PLOT) columns have been developed to provide robust, high-performance, and ultrasensitive proteomic analysis. Using a PicoClear tee, the dead-volume connection between a 50 μm i.d. PS-DVB monolithic microSPE column and the PLOT column was minimized. The microSPE/PLOT column assembly provided a separation performance similar to that obtained with direct injection onto the PLOT column at a mobile phase flow rate of 20 nL/min. The trace analysis potential of the platform was evaluated using an in-gel tryptic digest sample of a gel fraction (15 to 40 kDa) of a cervical cancer (SiHa) cell line. As an example of the sensitivity of the system, ∼2.5 ng of protein in 2 μL solution, an amount corresponding to 20 SiHa cells, was subjected to on-line microSPE-PLOT/LC-ESI-MS/MS analysis using a linear ion trap MS. 237 peptides associated with 163 unique proteins were identified from a single analysis when using stringent criteria associated with a false positive rate less than 1%. The number of identified peptides and proteins increased to 638 and 343, respectively, as the injection amount was raised to ∼45 ng of protein, an amount corresponding to 350 SiHa cells. In comparison, only 338 peptides and 231 unique proteins were identified (false positive rate again less than 1%) from 750 ng of protein from the identical gel fraction, an amount corresponding to 6000 SiHa cells, using a typical 15 cm × 75 μm i.d. packed capillary column. The greater sensitivity, higher recovery, and higher resolving power of the PLOT column resulted in the increased number of identifications from only ∼5% of the injected sample amount. The resolving power of the microSPE/PLOT assembly was further extended by 2D chromatography via combination of the high-efficiency reversed phase PLOT column
Cumulative emission budgets and their implications: the case for SAFE carbon
NASA Astrophysics Data System (ADS)
Allen, Myles; Bowerman, Niel; Frame, David; Mason, Charles
2010-05-01
The risk of dangerous long-term climate change due to anthropogenic carbon dioxide emissions is predominantly determined by cumulative emissions over all time, not the rate of emission in any given year or commitment period. This has profound implications for climate mitigation policy: emission targets for specific years such as 2020 or 2050 provide no guarantee of meeting any overall cumulative emission budget. By focusing attention on short-term measures to reduce the flow of emissions, they may even exacerbate the overall long-term stock. Here we consider how climate policies might be designed explicitly to limit cumulative emissions to, for example, one trillion tonnes of carbon, a figure that has been estimated to give a most likely warming of two degrees above pre-industrial, with a likely range of 1.6-2.6 degrees. Three approaches are considered: tradable emission permits with the possibility of indefinite emission banking, carbon taxes explicitly linked to cumulative emissions and mandatory carbon sequestration. Framing mitigation policy around cumulative targets alleviates the apparent tension between climate protection and short-term consumption that bedevils any attempt to forge global agreement. We argue that the simplest and hence potentially the most effective approach might be a mandatory requirement on the fossil fuel industry to ensure that a steadily increasing fraction of fossil carbon extracted from the ground is artificially removed from the active carbon cycle through some form of sequestration. We define Sequestered Adequate Fraction of Extracted (SAFE) carbon as a source in which this sequestered fraction is anchored to cumulative emissions, increasing smoothly to reach 100% before we release the trillionth tonne. While adopting the use of SAFE carbon would increase the cost of fossil energy much as a system of emission permits or carbon taxes would, it could do so with much less explicit government intervention. We contrast this proposal
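A deliberately simple sketch of what a SAFE-carbon schedule could look like: the sequestered fraction of extracted carbon rises with cumulative emissions and reaches 100% before the trillionth tonne. The linear ramp and the 90%-of-budget completion point below are invented for illustration; the paper does not prescribe this functional form:

    def safe_fraction(cumulative_tc, budget_tc=1.0e12, full_at=0.9):
        # Hypothetical SAFE schedule: sequestered fraction rises linearly
        # with cumulative emissions, reaching 100% at 90% of the
        # one-trillion-tonne budget (i.e., before the trillionth tonne).
        return min(1.0, cumulative_tc / (full_at * budget_tc))

    for tc in (0.25e12, 0.5e12, 0.75e12, 0.9e12):
        print(f"cumulative {tc/1e12:.2f} TtC -> sequester {safe_fraction(tc):.0%}")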
Experimental strategies in carrying out VCU for tobacco crop I: plot design and size.
Toledo, F H R B; Ramalho, M A P; Pulcinelli, C E; Bruzi, A T
2013-09-19
We aimed to establish standards for tobacco Valor de Cultivo e Uso (VCU) trials in Brazil, obtaining information on plot size and design for two varietal groups of tobacco (Virginia and Burley). Ten inbred lines of each varietal group were evaluated in a randomized complete block design with four replications. Each plot contained 42 plants, in six rows of seven plants. For each plant, with its position in the plot (row and column) as a reference, cured leaf weight (g/plant), total sugar content (%), and total alkaloid content (%) were determined. Trials with 2 to 41 plants per plot were simulated, and the point of maximum curvature of the coefficient-of-variation curve was estimated. The use of a border was not justified because the inbred line × position interactions were never significant, showing that the relative behavior of the inbred lines was the same at all positions in the plot. Plant performance did vary with column position; to lessen this effect, the use of plots with more than one row is recommended. Experimental precision, evaluated by the CV%, increased with plot size; nevertheless, the maximum-curvature method showed no appreciable gain in precision beyond seven plants per plot. The plot size sufficient to identify the best inbred line coincided with that indicated by the maximum-curvature method.
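A numerical sketch of the maximum-curvature idea used above: model the coefficient of variation as CV(n) = a * n^(-b) and locate the plot size where the curvature of this curve peaks. The coefficients a and b are invented, so the optimum printed here will not match the paper's seven plants exactly:

    import numpy as np

    a, b = 30.0, 0.45                        # invented decay coefficients for CV(n)
    n = np.linspace(2, 41, 400)
    d1 = -a * b * n**(-b - 1)                # CV'(n)
    d2 = a * b * (b + 1) * n**(-b - 2)       # CV''(n)
    curvature = np.abs(d2) / (1 + d1**2) ** 1.5
    print(f"plants per plot at maximum curvature: {n[np.argmax(curvature)]:.1f}")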
Variable life-adjusted display (VLAD) for hip fracture patients: a prospective trial.
Williams, H; Gwyn, R; Smith, A; Dramis, A; Lewis, J
2015-08-01
With restructuring within the NHS, there is increased public and media interest in surgical outcomes. The Nottingham Hip Fracture Score (NHFS) is a well-validated tool for predicting 30-day mortality after hip fracture. VLAD provides a real-time visual plot of the difference between the cumulative expected mortality and the actual deaths occurring: survivors are incorporated as a positive value equal to 1 minus the probability of survival, and deaths as a negative value equal to the probability of survival. Downward deflections indicate mortality and potentially suboptimal care. We prospectively included every hip fracture admitted to UHW that underwent surgery from January to August 2014. The NHFS was calculated and the predicted survival identified for each patient, and a VLAD plot was produced comparing the predicted with the actual 30-day mortality. Two hundred and seventy-seven patients completed the 30-day follow-up; the actual 30-day mortality (7.2%) was lower than that predicted by the NHFS (8.0%), reflected by a positive trend on the VLAD plot. Variable life-adjusted display provides an easy-to-use graphical representation of risk-adjusted survival over time and can act as an "early warning" system to identify trends in mortality for hip fractures.
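The VLAD construction described above is simple enough to state in a few lines: each survivor moves the curve up by 1 - p and each death moves it down by p, where p is the patient's predicted survival probability. A minimal sketch with toy predictions (not NHFS output):

    import numpy as np

    def vlad(p_survival, died):
        # Cumulative sum where a survivor contributes (1 - p) and a death
        # contributes -p, with p the predicted probability of survival.
        p = np.asarray(p_survival, dtype=float)
        step = np.where(np.asarray(died, dtype=bool), -p, 1.0 - p)
        return np.cumsum(step)

    # Toy series: predicted 92% survival, one death among five patients
    print(vlad([0.92, 0.92, 0.92, 0.92, 0.92], [0, 0, 1, 0, 0]))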
Exploration of the forbidden regions of the Ramachandran plot (ϕ-ψ) with QTAIM.
Momen, Roya; Azizi, Alireza; Wang, Lingling; Ping, Yang; Xu, Tianlv; Kirk, Steven R; Li, Wenxuan; Manzhos, Sergei; Jenkins, Samantha
2017-10-04
A new QTAIM interpretation of the Ramachandran plot is formulated from the most and least facile eigenvectors of the second-derivative matrix of the electron density with a set of 29 magainin-2 peptide conformers. The presence of QTAIM eigenvectors associated with the most and least preferred directions of electronic charge density explained the role of hydrogen bonding, HH contacts and the glycine amino acid monomer in peptide folding. The highest degree of occupation of the QTAIM interpreted Ramachandran plot was found for the glycine amino acid monomer compared with the remaining backbone peptide bonds. The mobility of the QTAIM eigenvectors of the glycine amino acid monomer was higher than for the other amino acids and was comparable to that of the hydrogen bonding, explaining the flexibility of the magainin-2 backbone. We experimented with a variety of hybrid QTAIM-Ramachandran plots to highlight and explain why the glycine amino acid monomer largely occupies the 'forbidden' region on the Ramachandran plot. In addition, the new hybrid QTAIM-Ramachandran plots contained recognizable regions that can be associated with concepts familiar from the conventional Ramachandran plot whilst retaining the character of the QTAIM most and least preferred regions.
9 CFR 108.7 - Filing of plot plans, blueprints, and legends.
Code of Federal Regulations, 2010 CFR
2010-01-01
Title 9, Animals and Animal Products; Animal and Plant Health Inspection Service; Requirements for Licensed Establishments; § 108.7 Filing of plot plans, blueprints, and legends. Three copies of...
Rainfall-runoff model parameter estimation and uncertainty evaluation on small plots
USDA-ARS's Scientific Manuscript database
Four seasonal rainfall simulations in 2009 and 2010 were applied to a field containing 36 plots (0.75 × 2 m each), resulting in 144 runoff events. In all simulations, a constant rate of rainfall was applied, then halted 60 minutes after initiation of runoff, with plot-scale monitoring of runoff ever...
Lee, Minjung; Dignam, James J.; Han, Junhee
2014-01-01
We propose a nonparametric approach for cumulative incidence estimation when causes of failure are unknown or missing for some subjects. Under the missing at random assumption, we estimate the cumulative incidence function using multiple imputation methods. We develop asymptotic theory for the cumulative incidence estimators obtained from multiple imputation methods. We also discuss how to construct confidence intervals for the cumulative incidence function and perform a test for comparing the cumulative incidence functions in two samples with missing cause of failure. Through simulation studies, we show that the proposed methods perform well. The methods are illustrated with data from a randomized clinical trial in early stage breast cancer. PMID:25043107
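A highly simplified sketch of the multiple-imputation idea (no censoring, and a marginal rather than covariate- or time-dependent imputation model, unlike the paper's method): impute each missing cause from the observed cause distribution, compute the cumulative incidence function per imputed data set, and average the estimates:

    import numpy as np

    rng = np.random.default_rng(5)
    n = 400
    time = rng.exponential(5.0, n)
    cause = rng.choice([1, 2], n, p=[0.6, 0.4])        # true cause of failure
    cause = np.where(rng.random(n) < 0.25, 0, cause)   # 0 = missing (MAR, toy)

    def cif(times, causes, cause_of_interest, grid):
        # Nonparametric cumulative incidence with complete, uncensored data:
        # fraction failing from the cause of interest by each grid time.
        return [(causes[times <= t] == cause_of_interest).sum() / len(times)
                for t in grid]

    grid = np.linspace(0, 15, 4)
    p1 = (cause == 1).sum() / (cause > 0).sum()        # P(cause 1 | observed)
    estimates = []
    for _ in range(20):                                # 20 imputations
        imputed = cause.copy()
        miss = imputed == 0
        imputed[miss] = np.where(rng.random(miss.sum()) < p1, 1, 2)
        estimates.append(cif(time, imputed, 1, grid))
    print(np.mean(estimates, axis=0))                  # combined point estimate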
NASA Astrophysics Data System (ADS)
Peters, Bradley J.; Day, James M. D.; Taylor, Lawrence A.
2016-08-01
Ultramafic cumulate rocks form during intrusive crystallization of high-MgO magmas, incorporating relatively high abundances of compatible elements, including Cr and Ni, and high abundances of the highly siderophile elements (HSE: Os, Ir, Ru, Pt, Pd, Re). Here, we utilize a suite of cumulate xenoliths from Piton de la Fournaise, La Réunion (Indian Ocean), to examine the mantle source composition of the Réunion hotspot using HSE abundances and Os isotopes. Dunite and wehrlite xenoliths and associated lavas from the Piton de la Fournaise volcanic complex span a range of MgO contents (46 to 7 wt.%), yet exhibit remarkably homogeneous 187Os/188Os (0.1324 ± 0.0014, 2σ), representing the Os-isotopic composition of Réunion hotspot primary melts. A significant fraction of the xenoliths also have primitive upper-mantle (PUM) normalized HSE patterns with elevated Ru and Pd (PUM-normalized Ru/Ir and Pd/Ir of 0.8-6.3 and 0.2-7.2, respectively). These patterns are not artifacts of alteration, fractional crystallization, or partial melting processes, but rather require a primary magma with similar relative enrichments. Some highly olivine-phyric (>40 modal percent olivine) Piton de la Fournaise lavas also preserve these relative Ru and Pd enrichments, while others preserve a pattern that is likely related to sulfur saturation in evolved melts. The estimate of HSE abundances in PUM indicates high Ru/Ir and Pd/Pt values relative to carbonaceous, ordinary and enstatite chondrite meteorite groups. Thus, the existence of cumulate rocks with even more fractionated HSE patterns relative to PUM suggests that the Réunion hotspot samples a yet unrecognized mantle source. The origin of fractionated HSE patterns in Réunion melts may arise from sampling of a mantle source that experienced limited late accretion (<0.2% by mass) compared with PUM (0.5-0.8%), possibly involving impactors that were distinct from present-day chondrites, or limited core-mantle interactions. Given the
Combining FIA plot data with topographic variables: Are precise locations needed?
Stephen P. Prisley; Huei-Jin Wang; Philip J Radtke; John Coulston
2009-01-01
Plot data from the USFS FIA program could be combined with terrain variables to attempt to explain how terrain characteristics influence forest growth, species composition, productivity, fire behavior, wildlife habitat, and other phenomena. While some types of analyses using FIA data have been shown to be insensitive to precision of plot locations, it has been...
Cumulative irritation potential of topical retinoid formulations.
Leyden, James J; Grossman, Rachel; Nighland, Marge
2008-08-01
Localized irritation can limit treatment success with topical retinoids such as tretinoin and adapalene. The factors that influence irritant reactions have been shown to include individual skin sensitivity, the particular retinoid and concentration used, and the vehicle formulation. To compare the cutaneous tolerability of tretinoin 0.04% microsphere gel (TMG) with that of adapalene 0.3% gel and a standard tretinoin 0.025% cream. The results of 2 randomized, investigator-blinded studies of 2 to 3 weeks' duration, which utilized a split-face method to compare cumulative irritation scores induced by topical retinoids in subjects with healthy skin, were combined. Study 1 compared TMG 0.04% with adapalene 0.3% gel over 2 weeks, while study 2 compared TMG 0.04% with tretinoin 0.025% cream over 3 weeks. In study 1, TMG 0.04% was associated with significantly lower cumulative scores for erythema, dryness, and burning/stinging than adapalene 0.3% gel. However, in study 2, there were no significant differences in cumulative irritation scores between TMG 0.04% and tretinoin 0.025% cream. Measurements of erythema by a chromameter showed no significant differences between the test formulations in either study. Cutaneous tolerance of TMG 0.04% on the face was superior to that of adapalene 0.3% gel and similar to that of a standard tretinoin cream containing a lower concentration of the drug (0.025%).
Expansive Soil Crack Depth under Cumulative Damage
Shi, Bei-xiao; Chen, Sheng-shui; Han, Hua-qiang; Zheng, Cheng-feng
2014-01-01
Crack depth is a key problem for the slope stability of expansive soil and its engineering control; cracks appear under wet-dry cycling and develop gradually. The analysis indicates that, because of its cohesion, expansive soil can deform to a certain extent under tensile stress without cracking; the soil cracks only when the deformation exceeds the ultimate tensile strain. It is further argued that, under the combined effect of various environmental factors, particularly changes in internal water content, the basic physical properties of expansive soil are weakened and irreversible cumulative damage eventually forms, driving the development of expansive soil cracks in depth. Starting from the volumetric strain caused by water loss, and considering the influence of the water-loss rate and wet-dry cycling on crack depth, a crack-depth calculation model incorporating the water-loss rate and cumulative damage is established. Both the use of the water-loss rate and the application of cumulative damage theory to expansive soil cracking avoid the difficulties of measuring matric suction, which should help advance research on unsaturated expansive soil. PMID:24737974
shinyCircos: an R/Shiny application for interactive creation of Circos plot.
Yu, Yiming; Ouyang, Yidan; Yao, Wen
2018-04-01
Creation of Circos plots is one of the most efficient approaches to visualizing genomic data. However, installing and using existing tools to make Circos plots is challenging for users lacking coding experience. To address this issue, we developed shinyCircos, an R/Shiny application providing a graphical user interface for interactive creation of Circos plots. shinyCircos can be easily installed either on personal computers or on local or public servers to provide online use to the community. Furthermore, various types of Circos plots can be easily generated and decorated with simple mouse clicks. shinyCircos and its manual are freely available at https://github.com/venyao/shinyCircos. shinyCircos is deployed at https://yimingyu.shinyapps.io/shinycircos/ and http://shinycircos.ncpgr.cn/ for online use. diana1983941@mail.hzau.edu.cn or yaowen@henau.edu.cn.
Vernon J. LaBau; John W. Hazard
2000-01-01
During an inventory to assess spruce bark beetle impact on the Kenai Peninsula in south-central Alaska, 5-year mortality estimates were made for all growing-stock trees on 0.6 ha areas, on 0.4 ha areas, and on a cluster of four 1/60-ha subplots. The analysis of the results of the comparison between cluster data and the larger plot data highlighted some of the problems...
Programming Details for MDPLOT: A Program for Plotting Multi-Dimensional Data
W.L. Nance; B.H. Polmer; G.C. Keith
1975-01-01
The program is written in ASA FORTRAN IV and consists of the main program (MAIN) with 14 subroutines. Subroutines SETUP, PLOT, GRID, SCALE, and 01s are microfilm-dependent and therefore must be replaced with the equivalent routines written for the high resolution plotting device available at the user's installation. The calls to these subroutines are flagged...
Estimating number and size of forest patches from FIA plot data
Mark D. Nelson; Andrew J. Lister; Mark H. Hansen
2009-01-01
Forest inventory and analysis (FIA) annual plot data provide for estimates of forest area, type, volume, growth, and other attributes. Estimates of forest landscape metrics, such as those describing abundance, size, and shape of forest patches, however, typically are not derived from FIA plot data but from satellite image-based land cover maps. Associating image-based...
Dodd, C.K.; Dorazio, R.M.
2004-01-01
A critical variable in both ecological and conservation field studies is determining how many individuals of a species are present within a defined sampling area. Labor-intensive techniques such as capture-mark-recapture and removal sampling may provide estimates of abundance, but there are many logistical constraints to their widespread application. Many studies on terrestrial and aquatic salamanders use counts as an index of abundance, assuming that detection remains constant while sampling. If this constancy is violated, determination of detection probabilities is critical to the accurate estimation of abundance. Recently, a model was developed that provides a statistical approach that allows abundance and detection to be estimated simultaneously from spatially and temporally replicated counts. We adapted this model to estimate these parameters for salamanders sampled over a six-year period in area-constrained plots in Great Smoky Mountains National Park. Estimates of salamander abundance varied among years, but annual changes in abundance did not vary uniformly among species. Except for one species, abundance estimates were not correlated with site covariates (elevation/soil and water pH, conductivity, air and water temperature). The uncertainty in the estimates was so large as to make correlations ineffectual in predicting which covariates might influence abundance. Detection probabilities also varied among species and sometimes among years for the six species examined. We found such a high degree of variation in our counts and in estimates of detection among species, sites, and years as to cast doubt upon the appropriateness of using count data to monitor population trends using a small number of area-constrained survey plots. Still, the model provided reasonable estimates of abundance that could make it useful in estimating population size from count surveys.
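The model described matches the binomial N-mixture framework for spatially and temporally replicated counts; the sketch below, with invented data and a truncation of the latent abundance sum at n_max = 80, maximizes its likelihood by marginalizing the latent plot abundances:

    import numpy as np
    from scipy import stats, optimize

    rng = np.random.default_rng(6)
    n_sites, n_visits, true_lam, true_p = 60, 4, 6.0, 0.4
    N = rng.poisson(true_lam, n_sites)                         # latent abundance
    y = rng.binomial(N[:, None], true_p, (n_sites, n_visits))  # replicated counts

    def neg_log_lik(theta, y, n_max=80):
        lam, p = np.exp(theta[0]), 1 / (1 + np.exp(-theta[1]))
        ll = 0.0
        for counts in y:                          # marginalize latent N per site
            Ns = np.arange(counts.max(), n_max)
            site = stats.poisson.pmf(Ns, lam)
            for c in counts:
                site = site * stats.binom.pmf(c, Ns, p)
            ll += np.log(site.sum())
        return -ll

    res = optimize.minimize(neg_log_lik, x0=[1.0, 0.0], args=(y,))
    print("lambda, p =", np.exp(res.x[0]), 1 / (1 + np.exp(-res.x[1])))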
Design criteria and eigensequence plots for satellite computed tomography
NASA Technical Reports Server (NTRS)
Wahba, G.
1983-01-01
The use of the degrees of freedom for signal is proposed as a design criterion for comparing different designs for satellite and other measuring systems. It is also proposed that certain eigensequence plots be examined at the design stage, along with appropriate estimates of the parameter lambda, which plays the role of a noise-to-signal ratio. The degrees of freedom for signal and the eigensequence plots may be determined using prior information in the spectral domain, which is presently available, along with a description of the system and simulated data for estimating lambda. This work extends the 1972 work of Weinreb and Crosby.
Aerospace Medicine and Biology: Cumulative index, 1979
NASA Technical Reports Server (NTRS)
1980-01-01
This publication is a cumulative index to the abstracts contained in Supplements 190 through 201 of 'Aerospace Medicine and Biology: A Continuing Bibliography.' It includes three indexes: subject, personal author, and corporate source.
User's manual for THPLOT, A FORTRAN 77 Computer program for time history plotting
NASA Technical Reports Server (NTRS)
Murray, J. E.
1982-01-01
A general purpose FORTRAN 77 computer program (THPLOT) for plotting time histories using Calcomp pen plotters is described. The program is designed to read a time history data file and to generate time history plots for selected time intervals and/or selected data channels. The capabilities of the program are described. The card input required to define the plotting operation is described and examples of card input and the resulting plotted output are given. The examples are followed by a description of the printed output, including both normal output and error messages. Lastly, implementation of the program is described. A complete listing of the program with reference maps produced by the CDC FTN 5.0 compiler is included.
NASA Astrophysics Data System (ADS)
Marwan, Norbert
2003-09-01
In this work, different aspects and applications of recurrence plot analysis are presented. First, a comprehensive overview of recurrence plots and their quantification possibilities is given. New measures of complexity are defined using geometrical structures of recurrence plots; these measures are capable of finding chaos-chaos transitions in processes. Furthermore, a bivariate extension to cross recurrence plots is studied. Cross recurrence plots exhibit characteristic structures which can be used to study differences between two processes, or to align and search for matching sequences in two data series. The selected applications of the introduced techniques to various kinds of data demonstrate their capability. Recurrence plot analysis can be adapted to the specific problem and thus opens a wide field of potential applications. Regarding the quantification of recurrence plots, chaos-chaos transitions can be found in heart rate variability data before the onset of life-threatening cardiac arrhythmias, which may be of importance for the therapy of such arrhythmias. The quantification of recurrence plots also allows the study of transitions in the brain during cognitive experiments on the basis of single trials; traditionally, finding these transitions requires averaging over a collection of single trials. Using cross recurrence plots, the existence of an El Niño/Southern Oscillation-like oscillation 34,000 years ago is traced in northwestern Argentina. In further applications to geological data, cross recurrence plots are used for time-scale alignment of different borehole data and for dating a geological profile against a reference data set. Additional examples from molecular biology and speech recognition emphasize the suitability of cross recurrence plots. This thesis deals with various aspects and applications of recurrence plots. After an overview of methods based on recurrence plots, new
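At its core, a recurrence plot is the thresholded pairwise distance matrix of a trajectory, R(i, j) = 1 when states i and j are closer than a tolerance eps; quantification measures are statistics of this binary matrix. A minimal sketch for a scalar series, with the recurrence rate as the simplest such measure:

    import numpy as np

    def recurrence_matrix(x, eps):
        # R[i, j] = 1 where the states x_i and x_j are closer than eps
        d = np.abs(x[:, None] - x[None, :])     # pairwise distances (scalar series)
        return (d < eps).astype(int)

    def recurrence_rate(R):
        # Simplest recurrence quantification measure: density of recurrence points
        return R.mean()

    t = np.linspace(0, 8 * np.pi, 400)
    R = recurrence_matrix(np.sin(t), eps=0.1)
    print(f"recurrence rate: {recurrence_rate(R):.3f}")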
Evidence-based evaluation of the cumulative effects of ecosystem restoration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Diefenderfer, Heida L.; Johnson, Gary E.; Thom, Ronald M.
Evaluating the cumulative effects of large-scale ecological restoration programs is necessary to inform adaptive ecosystem management and provide society with resilient and sustainable services. However, complex linkages between restorative actions and ecosystem responses make evaluations problematic. Despite long-term federal investments in restoring aquatic ecosystems, no standard evaluation method has been adopted and most programs focus on monitoring and analysis, not synthesis and evaluation. In this paper, we demonstrate a new transdisciplinary approach integrating techniques from evidence-based medicine, critical thinking, and cumulative effects assessment. Tiered hypotheses are identified using an ecosystem conceptual model. The systematic literature review at the core of evidence-based assessment becomes one of many lines of evidence assessed collectively, using critical thinking strategies and causal criteria from a cumulative effects perspective. As a demonstration, we analyzed data from 166 locations on the Columbia River and estuary representing 12 indicators of habitat and fish response to floodplain restoration actions intended to benefit threatened and endangered salmon. Synthesis of seven lines of evidence showed that hydrologic reconnection promoted macrodetritus export, prey availability, and fish access and feeding. The evidence was sufficient to infer cross-boundary, indirect, compounding and delayed cumulative effects, and suggestive of nonlinear, landscape-scale, and spatial density effects. On the basis of causal inferences regarding food web functions, we concluded that the restoration program has a cumulative beneficial effect on juvenile salmon. As a result, this evidence-based approach will enable the evaluation of restoration in complex coastal and riverine ecosystems where data have accumulated without sufficient synthesis.
The relationship between species detection probability and local extinction probability
Alpizar-Jara, R.; Nichols, J.D.; Hines, J.E.; Sauer, J.R.; Pollock, K.H.; Rosenberry, C.S.
2004-01-01
In community-level ecological studies, generally not all species present in sampled areas are detected. Many authors have proposed the use of estimation methods that allow detection probabilities that are < 1 and that are heterogeneous among species. These methods can also be used to estimate community-dynamic parameters such as species local extinction probability and turnover rates (Nichols et al. Ecol Appl 8:1213-1225; Conserv Biol 12:1390-1398). Here, we present an ad hoc approach to estimating community-level vital rates in the presence of joint heterogeneity of detection probabilities and vital rates. The method consists of partitioning the number of species into two groups using the detection frequencies and then estimating vital rates (e.g., local extinction probabilities) for each group. Estimators from each group are combined in a weighted estimator of vital rates that accounts for the effect of heterogeneity. Using data from the North American Breeding Bird Survey, we computed such estimates and tested the hypothesis that detection probabilities and local extinction probabilities were negatively related. Our analyses support the hypothesis that species detection probability covaries negatively with local probability of extinction and turnover rates. A simulation study was conducted to assess the performance of vital parameter estimators as well as other estimators relevant to questions about heterogeneity, such as coefficient of variation of detection probabilities and proportion of species in each group. Both the weighted estimator suggested in this paper and the original unweighted estimator for local extinction probability performed fairly well and provided no basis for preferring one to the other.
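The partition-and-weight idea can be made concrete with a toy sketch. This is a simplified illustration only: the per-group extinction estimate below is the naive fraction of period-1 species not detected in period 2, whereas the actual approach corrects each group's estimate for detection probabilities < 1 using capture-recapture estimators. The threshold and data are invented.

```python
def weighted_extinction(records, threshold):
    """records: (detection_frequency, detected_period1, detected_period2) tuples.

    Species are split into low/high detection-frequency groups; a per-group
    extinction estimate is formed, then the groups are combined in a
    weighted average, weighting by group size.
    """
    groups = {"low": [], "high": []}
    for freq, in_period1, in_period2 in records:
        if in_period1:                     # only species present initially count
            key = "low" if freq <= threshold else "high"
            groups[key].append(in_period2)
    estimates, weights = [], []
    for detections in groups.values():
        if detections:
            estimates.append(1 - sum(detections) / len(detections))  # naive
            weights.append(len(detections))
    total = sum(weights)
    return sum(w / total * e for w, e in zip(weights, estimates))

# Invented example: rarely detected species go locally extinct more often
data = [(1, True, 0), (2, True, 1), (6, True, 1), (7, True, 1), (8, True, 1)]
print(weighted_extinction(data, threshold=3))   # 0.2
```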
Arif, Muhammad
2012-06-01
In pattern classification problems, feature extraction is an important step, and the quality of features in discriminating different classes plays an important role. In real life, pattern classification may require a high-dimensional feature space, and it is impossible to visualize the feature space if its dimension is greater than four. In this paper, we propose a similarity-dissimilarity plot which can project a high-dimensional space onto a two-dimensional space while retaining the characteristics needed to assess the discrimination quality of the features. The similarity-dissimilarity plot can reveal information about the amount of overlap between the features of different classes. Separable data points of different classes are also visible on the plot and can be classified correctly using an appropriate classifier; hence, approximate classification accuracy can be predicted. Moreover, it is possible to see with which class the misclassified data points will be confused by the classifier. Outlier data points can also be located on the similarity-dissimilarity plot. Various examples of synthetic data are used to highlight important characteristics of the proposed plot, and some real-life examples from biomedical data are also analysed. The proposed plot is independent of the number of dimensions of the feature space.
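The paper's exact construction is not reproduced here. As a rough stand-in, the sketch below plots, for each sample, the distance to its nearest same-class neighbour against the distance to its nearest other-class neighbour; points whose other-class distance falls below their same-class distance sit in the overlap region, which conveys the same kind of overlap and separability information in two dimensions regardless of feature dimensionality. The data are synthetic.

```python
import numpy as np
import matplotlib.pyplot as plt

def similarity_dissimilarity(X, y):
    """Per-sample nearest same-class and nearest other-class distances."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)                      # ignore self-distance
    same = np.where(y[:, None] == y[None, :], d, np.inf).min(axis=1)
    other = np.where(y[:, None] != y[None, :], d, np.inf).min(axis=1)
    return same, other

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (50, 5)),        # class 0, 5-D features
               rng.normal(1.5, 1.0, (50, 5))])       # class 1, shifted mean
y = np.repeat([0, 1], 50)
s, dis = similarity_dissimilarity(X, y)
plt.scatter(s, dis, c=y)
plt.xlabel("distance to nearest same-class point")
plt.ylabel("distance to nearest other-class point")
plt.show()
```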
PLOT3D/AMES, DEC VAX VMS VERSION USING DISSPLA (WITHOUT TURB3D)
NASA Technical Reports Server (NTRS)
Buning, P. G.
1994-01-01
PLOT3D is an interactive graphics program designed to help scientists visualize computational fluid dynamics (CFD) grids and solutions. Today, supercomputers and CFD algorithms can provide scientists with simulations of such highly complex phenomena that obtaining an understanding of the simulations has become a major problem. Tools which help the scientist visualize the simulations can be of tremendous aid. PLOT3D/AMES offers more functions and features, and has been adapted for more types of computers, than any other CFD graphics program. Version 3.6b+ is supported for five computers and graphics libraries. Using PLOT3D, CFD physicists can view their computational models from any angle, observing the physics of problems and the quality of solutions. As an aid in designing aircraft, for example, PLOT3D's interactive computer graphics can show vortices, temperature, reverse flow, pressure, and dozens of other characteristics of air flow during flight. As critical areas become obvious, they can easily be studied more closely using a finer grid. PLOT3D is part of a computational fluid dynamics software cycle. First, a program such as 3DGRAPE (ARC-12620) helps the scientist generate computational grids to model an object and its surrounding space. Once the grids have been designed and parameters such as the angle of attack, Mach number, and Reynolds number have been specified, a "flow-solver" program such as INS3D (ARC-11794 or COS-10019) solves the system of equations governing fluid flow, usually on a supercomputer. Grids sometimes have as many as two million points, and the "flow-solver" produces a solution file which contains density, x-, y-, and z-momentum, and stagnation energy for each grid point. With such a solution file and a grid file containing up to 50 grids as input, PLOT3D can calculate and graphically display any one of 74 functions, including shock waves, surface pressure, velocity vectors, and particle traces. PLOT3D's 74 functions are organized into
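The grid-file side of the pipeline described above can be illustrated with a toy reader. This is a minimal sketch, assuming a single-block ASCII PLOT3D grid file (three dimensions on the first line, then all x, all y, and all z coordinates in sequence); production files are commonly multi-block and/or unformatted binary, which this sketch does not handle.

```python
import numpy as np

def read_plot3d_grid(path):
    """Read a single-block ASCII PLOT3D grid file into x, y, z arrays."""
    tokens = open(path).read().split()
    ni, nj, nk = (int(t) for t in tokens[:3])
    npts = ni * nj * nk
    coords = np.array([float(t) for t in tokens[3:3 + 3 * npts]])
    # x, y, z are each stored contiguously, with the i index varying fastest
    x, y, z = (coords[k * npts:(k + 1) * npts].reshape((ni, nj, nk), order="F")
               for k in range(3))
    return x, y, z
```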
PLOT3D/AMES, DEC VAX VMS VERSION USING DISSPLA (WITH TURB3D)
NASA Technical Reports Server (NTRS)
Buning, P.
1994-01-01
PLOT3D is an interactive graphics program designed to help scientists visualize computational fluid dynamics (CFD) grids and solutions. Today, supercomputers and CFD algorithms can provide scientists with simulations of such highly complex phenomena that obtaining an understanding of the simulations has become a major problem. Tools which help the scientist visualize the simulations can be of tremendous aid. PLOT3D/AMES offers more functions and features, and has been adapted for more types of computers, than any other CFD graphics program. Version 3.6b+ is supported for five computers and graphics libraries. Using PLOT3D, CFD physicists can view their computational models from any angle, observing the physics of problems and the quality of solutions. As an aid in designing aircraft, for example, PLOT3D's interactive computer graphics can show vortices, temperature, reverse flow, pressure, and dozens of other characteristics of air flow during flight. As critical areas become obvious, they can easily be studied more closely using a finer grid. PLOT3D is part of a computational fluid dynamics software cycle. First, a program such as 3DGRAPE (ARC-12620) helps the scientist generate computational grids to model an object and its surrounding space. Once the grids have been designed and parameters such as the angle of attack, Mach number, and Reynolds number have been specified, a "flow-solver" program such as INS3D (ARC-11794 or COS-10019) solves the system of equations governing fluid flow, usually on a supercomputer. Grids sometimes have as many as two million points, and the "flow-solver" produces a solution file which contains density, x-, y-, and z-momentum, and stagnation energy for each grid point. With such a solution file and a grid file containing up to 50 grids as input, PLOT3D can calculate and graphically display any one of 74 functions, including shock waves, surface pressure, velocity vectors, and particle traces. PLOT3D's 74 functions are organized into
Short- and Long-Term Effects of Cumulative Finals on Student Learning
ERIC Educational Resources Information Center
Khanna, Maya M.; Brack, Amy S. Badura; Finken, Laura L.
2013-01-01
In two experiments, we examined the benefits of cumulative and noncumulative finals on students' short- and long-term course material retention. In Experiment 1, we examined results from course content exams administered immediately after course finals. Course sections including cumulative finals had higher content exam scores than sections…
Studies of two cumulative effects riddles
R. M. Rice; R. R. Ziemer; J. Lewis; T. E. Lisle
1989-01-01
Although it is unquestionably prudent to consider the cumulative watershed effects (CWEs) of timber harvesting, the presumed CWE phenomena offer limited opportunity for scientific inquiry. We are addressing two questions: are there synergistic sedimentation effects of sufficient magnitude to warrant consideration beyond efforts to reduce on-site erosion; and what are...
Mars Science Laboratory Launch-Arrival Space Study: A Pork Chop Plot Analysis
NASA Technical Reports Server (NTRS)
Cianciolo, Alicia Dwyer; Powell, Richard; Lockwood, Mary Kae
2006-01-01
Launch-Arrival, or "pork chop", plot analysis can provide mission designers with valuable information and insight into a specific launch and arrival space selected for a mission. The study begins with the array of entry states for each pair of selected Earth launch and Mars arrival dates, and nominal entry, descent and landing trajectories are simulated for each pair. Parameters of interest, such as maximum heat rate, are plotted in launch-arrival space. The plots help to quickly identify launch and arrival regions that are not feasible under current constraints or technology and also provide information as to what technologies may need to be developed to reach a desired region. This paper provides a discussion of the development, application, and results of a pork chop plot analysis to the Mars Science Laboratory mission. This technique is easily applicable to other missions at Mars and other destinations.
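The plotting scaffolding of such a study can be sketched briefly. In the sketch below, the expensive part (solving for the interplanetary trajectory between each launch/arrival date pair and simulating entry, descent and landing) is replaced by an invented peak_heat_rate stand-in function with made-up values; only the gridding and contouring carry over to a real analysis.

```python
import numpy as np
import matplotlib.pyplot as plt

launch = np.arange(0, 60)        # days past launch-window open (illustrative)
arrival = np.arange(180, 330)    # flight time in days (illustrative)

def peak_heat_rate(l, a):
    """Stand-in for the EDL-simulation output at each date pair (toy values)."""
    return 40 + 0.01 * (l - 30) ** 2 + 0.002 * (a - 250) ** 2

L, A = np.meshgrid(launch, arrival)
cs = plt.contour(L, A, peak_heat_rate(L, A), levels=10)
plt.clabel(cs)
plt.xlabel("launch date (days past window open)")
plt.ylabel("arrival date (days past launch)")
plt.title("peak heat rate, W/cm^2 (toy values)")
plt.show()
```

Regions where the contoured parameter exceeds a constraint are then ruled out of the launch-arrival space, which is exactly the screening use described above.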
Spiegelhalter, David; Grigg, Olivia; Kinsman, Robin; Treasure, Tom
2003-02-01
To investigate the use of the risk-adjusted sequential probability ratio test in monitoring the cumulative occurrence of adverse clinical outcomes. Retrospective analysis of three longitudinal datasets: patients aged 65 years and over under the care of Harold Shipman between 1979 and 1997; patients under 1 year of age undergoing paediatric heart surgery in Bristol Royal Infirmary between 1984 and 1995; and adult patients receiving cardiac surgery from a team of cardiac surgeons in London, UK. Outcome measures were annual and 30-day mortality rates. Using reasonable boundaries, the procedure could have indicated an 'alarm' in Bristol after publication of the 1991 Cardiac Surgical Register, and in 1985 or 1997 for Harold Shipman depending on the data source and the comparator. The cardiac surgeons showed no significant deviation from expected performance. The risk-adjusted sequential probability ratio test is simple to implement, can be applied in a variety of contexts, and might have been useful to detect specific instances of past divergent performance. The use of this and related techniques deserves further attention in the context of prospectively monitoring adverse clinical outcomes.
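A minimal sketch of the risk-adjusted SPRT as commonly formulated: the alternative hypothesis multiplies each patient's risk-adjusted odds of death by a fixed odds ratio, and the cumulative log-likelihood ratio is compared against Wald boundaries. The odds ratio, error rates, and toy data below are illustrative, not those used in the study.

```python
import math

def ra_sprt(outcomes, risks, odds_ratio=2.0, alpha=0.05, beta=0.05):
    """outcomes: 0/1 per patient; risks: risk-adjusted death probabilities."""
    upper = math.log((1 - beta) / alpha)   # crossing signals divergent performance
    lower = math.log(beta / (1 - alpha))   # crossing accepts expected performance
    llr = 0.0
    for t, (y, p) in enumerate(zip(outcomes, risks), start=1):
        # Under H1 the odds p/(1-p) are multiplied by odds_ratio, so
        # p1 = odds_ratio * p / denom with denom as below.
        denom = 1 - p + odds_ratio * p
        llr += y * math.log(odds_ratio / denom) + (1 - y) * math.log(1 / denom)
        if llr >= upper:
            return t, "alarm"
        if llr <= lower:
            return t, "consistent with expected performance"
    return None, "no decision yet"

# Toy series: 8 patients, each with a risk-adjusted death probability of 0.1
print(ra_sprt([1, 1, 0, 1, 1, 1, 1, 1], [0.1] * 8))   # alarms at patient 7
```

Whether to stop, reset the statistic, or keep accumulating after a boundary crossing is a monitoring design choice discussed in this literature; the sketch simply stops.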
Codes, Liana; de Souza, Ygor Gomes; D'Oliveira, Ricardo Azevedo Cruz; Bastos, Jorge Luiz Andrade; Bittencourt, Paulo Lisboa
2018-04-24
To analyze whether fluid overload is an independent risk factor for adverse outcomes after liver transplantation (LT), one hundred and twenty-one patients who underwent LT were retrospectively evaluated. Data regarding perioperative and postoperative variables previously associated with adverse outcomes after LT were reviewed. Cumulative fluid balance (FB) in the first 12 h and in the first 4 d after surgery was compared with major adverse outcomes after LT. Most of the patients were managed with a liberal approach to fluid administration, with mean cumulative FB over 5 L and 10 L, respectively, in the first 12 h and 4 d after LT. Cumulative FB over 4 d was independently associated with both the occurrence of acute kidney injury (AKI) and the requirement for renal replacement therapy (RRT) (OR = 2.3, 95%CI: 1.37-3.86, P = 0.02 and OR = 2.89, 95%CI: 1.52-5.49, P = 0.001, respectively). Other variables associated on multivariate analysis with AKI and with RRT were, respectively, male sex and APACHE II (Acute Physiology and Chronic Health Evaluation II) score, and sepsis or septic shock. Mortality was independently related to AST and APACHE II levels (OR = 2.35, 95%CI: 1.1-5.05, P = 0.02 and OR = 2.63, 95%CI: 1.0-6.87, P = 0.04, respectively), probably reflecting the degree of graft dysfunction and the severity of the early postoperative course of LT. No effect of FB on mortality after LT was found. In conclusion, cumulative positive FB over the first 4 d after LT is independently associated with the development of AKI and the requirement for RRT. Survival was not independently related to FB, but to surrogate markers of graft dysfunction and severity of the postoperative course of LT.