Sample records for normal probability plots

  1. Normal probability plots with confidence.

    PubMed

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal-probability-plot-based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
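    The abstract does not reproduce the authors' interval construction, but the idea of simultaneous 1-α bands for the ordered sample values can be sketched by Monte Carlo calibration. This is a minimal illustrative sketch, not the paper's exact method: it starts from conservative Bonferroni pointwise bands and narrows them until the joint coverage on simulated normal samples would drop below 1-α.

```python
import numpy as np

def simultaneous_envelope(n, alpha=0.05, reps=2000, seed=0):
    """Monte Carlo envelope for the n order statistics of a standard-normal
    sample, calibrated so ALL n points fall inside simultaneously with
    probability roughly 1 - alpha (estimated on the simulated samples)."""
    rng = np.random.default_rng(seed)
    sims = np.sort(rng.standard_normal((reps, n)), axis=1)
    # Start from the Bonferroni pointwise level alpha/n (wide bands) and
    # narrow the bands until joint coverage would fall below 1 - alpha.
    lo = np.quantile(sims, (alpha / n) / 2, axis=0)
    hi = np.quantile(sims, 1 - (alpha / n) / 2, axis=0)
    for a in np.linspace(alpha / n, alpha, 100):
        lo_try = np.quantile(sims, a / 2, axis=0)
        hi_try = np.quantile(sims, 1 - a / 2, axis=0)
        cover = np.mean(np.all((sims >= lo_try) & (sims <= hi_try), axis=1))
        if cover < 1 - alpha:
            break
        lo, hi = lo_try, hi_try
    return lo, hi
```

A standardized sample sorted and plotted against these bands then gives the "objective" read-out the abstract describes: normality is judged plausible exactly when every point lies inside its interval.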

  2. Applying the log-normal distribution to target detection

    NASA Astrophysics Data System (ADS)

    Holst, Gerald C.

    1992-09-01

    Holst and Pickard experimentally determined that MRT responses tend to follow a log-normal distribution. The log-normal distribution appeared reasonable because nearly all visual psychophysical data are plotted on a logarithmic scale. It has the additional advantage that it is bounded to positive values, an important consideration since probability of detection is often plotted in linear coordinates. Review of published data suggests that the log-normal distribution may have universal applicability. Specifically, the log-normal distribution obtained from MRT tests appears to fit the target transfer function and the probability of detection of rectangular targets.

  3. Investigation into the Use of Normal and Half-Normal Plots for Interpreting Results from Screening Experiments.

    DTIC Science & Technology

    1987-03-25

    by Lloyd (1952) using generalized least squares instead of ordinary least squares, and by Wilk, Gnanadesikan, and Freeny (1963) using a maximum...plot. The half-normal distribution is a special case of the gamma distribution proposed by Wilk, Gnanadesikan, and Huyett (1962). VARIATIONS ON THE... Gnanadesikan, R. Probability plotting methods for the analysis of data. Biometrika, 1968, 55, 1-17. This paper describes and discusses graphical techniques

  4. A Performance Comparison on the Probability Plot Correlation Coefficient Test using Several Plotting Positions for GEV Distribution.

    NASA Astrophysics Data System (ADS)

    Ahn, Hyunjun; Jung, Younghun; Om, Ju-Seong; Heo, Jun-Haeng

    2014-05-01

    It is very important to select an appropriate probability distribution in statistical hydrology. A goodness-of-fit test is a statistical method for selecting an appropriate probability model for a given data set. The probability plot correlation coefficient (PPCC) test, one of the goodness-of-fit tests, was originally developed for the normal distribution. Since then, this test has been widely applied to other probability models. The PPCC test is regarded as one of the best goodness-of-fit tests because it shows higher rejection power than the others. In this study, we focus on the PPCC test for the GEV distribution, which is widely used around the world. For the GEV model, several plotting position formulas have been suggested. However, PPCC statistics have been derived only for the plotting position formulas (Goel and De; In-na and Nguyen; and Kim et al.) in which the skewness coefficient (or shape parameter) is included. Regression equations are then derived as a function of the shape parameter and sample size for a given significance level. In addition, the rejection powers of these formulas are compared using Monte Carlo simulation. Keywords: Goodness-of-fit test, Probability plot correlation coefficient test, Plotting position, Monte Carlo simulation. ACKNOWLEDGEMENTS This research was supported by a grant 'Establishing Active Disaster Management System of Flood Control Structures by using 3D BIM Technique' [NEMA-12-NH-57] from the Natural Hazard Mitigation Research Group, National Emergency Management Agency of Korea.
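    The PPCC statistic itself is simple to compute: the Pearson correlation between the sorted data and the theoretical quantiles evaluated at plotting positions. The GEV-specific plotting position formulas cited in the abstract are not reproduced here; the sketch below uses the normal case with Blom-type positions (a = 0.375) as an illustrative assumption.

```python
import numpy as np
from scipy import stats

def ppcc(sample, dist=stats.norm, a=0.375):
    """Probability plot correlation coefficient: Pearson r between the
    sorted data and theoretical quantiles at plotting positions
    p_i = (i - a) / (n + 1 - 2a).  a = 0.375 is Blom's choice for the
    normal distribution; other formulas plug in differently."""
    x = np.sort(np.asarray(sample, dtype=float))
    n = len(x)
    i = np.arange(1, n + 1)
    p = (i - a) / (n + 1 - 2 * a)
    q = dist.ppf(p)
    return np.corrcoef(x, q)[0, 1]
```

Values near 1 indicate a good fit; the test rejects when the statistic falls below a critical value tabulated (or, as in the abstract, regressed) as a function of sample size and, for the GEV, the shape parameter.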

  5. Statistical characterization of a large geochemical database and effect of sample size

    USGS Publications Warehouse

    Zhang, C.; Manheim, F.T.; Hinde, J.; Grossman, J.N.

    2005-01-01

    The authors investigated statistical distributions for concentrations of chemical elements from the National Geochemical Survey (NGS) database of the U.S. Geological Survey. At the time of this study, the NGS data set encompassed 48,544 stream sediment and soil samples from the conterminous United States analyzed by ICP-AES following a 4-acid near-total digestion. This report includes 27 elements: Al, Ca, Fe, K, Mg, Na, P, Ti, Ba, Ce, Co, Cr, Cu, Ga, La, Li, Mn, Nb, Nd, Ni, Pb, Sc, Sr, Th, V, Y and Zn. The goal and challenge for the statistical overview was to delineate chemical distributions in a complex, heterogeneous data set spanning a large geographic range (the conterminous United States) and many different geological provinces and rock types. After declustering to create a uniform spatial sample distribution with 16,511 samples, histograms and quantile-quantile (Q-Q) plots were employed to delineate subpopulations that have coherent chemical and mineral affinities. Probability groupings are discerned by changes in slope (kinks) on the plots. Major rock-forming elements, e.g., Al, Ca, K and Na, tend to display linear segments on normal Q-Q plots. These segments can commonly be linked to petrologic or mineralogical associations. For example, linear segments on K and Na plots reflect dilution of clay minerals by quartz sand (low in K and Na). Minor and trace element relationships are best displayed on lognormal Q-Q plots. These sensitively reflect discrete relationships in subpopulations within the wide range of the data. For example, small but distinctly log-linear subpopulations for Pb, Cu, Zn and Ag are interpreted to represent ore-grade enrichment of naturally occurring minerals such as sulfides. None of the 27 chemical elements passed the test for either a normal or a lognormal distribution on the declustered data set. The reasons relate in part to the presence of mixtures of subpopulations and of outliers. Random samples of the data set with successively smaller numbers of data points showed that few elements passed standard statistical tests for normality or log-normality until sample size decreased to a few hundred data points. Large sample size enhances the power of statistical tests and leads to rejection of most statistical hypotheses for real data sets. For large sample sizes (e.g., n > 1000), graphical methods such as histograms, stem-and-leaf displays, and probability plots are recommended for rough judgement of the probability distribution, if needed. © 2005 Elsevier Ltd. All rights reserved.
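    The sample-size effect the abstract describes is easy to reproduce: a mildly non-normal distribution passes a formal normality test at small n and fails it decisively at large n. The heavy-tailed synthetic data below merely stands in for the declustered geochemical concentrations; the subsample sizes are illustrative.

```python
import numpy as np
from scipy import stats

def rejection_vs_n(data, sizes, seed=0):
    """For each subsample size, draw a random subsample without replacement
    and run the D'Agostino-Pearson normality test; return p-value per size."""
    rng = np.random.default_rng(seed)
    out = {}
    for n in sizes:
        sub = rng.choice(data, size=n, replace=False)
        out[n] = stats.normaltest(sub).pvalue
    return out

# Mildly heavy-tailed data stands in for a real geochemical variable.
rng = np.random.default_rng(42)
data = rng.standard_t(df=6, size=20000)
pvals = rejection_vs_n(data, sizes=[50, 500, 20000])
```

At the full sample size the test rejects overwhelmingly, while the small subsample typically passes, which is exactly why graphical methods are recommended for rough distributional judgement when n is large.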

  6. Quantitative evaluation of Alzheimer's disease

    NASA Astrophysics Data System (ADS)

    Duchesne, S.; Frisoni, G. B.

    2009-02-01

    We propose a single, quantitative metric called the disease evaluation factor (DEF) and assess its efficiency at estimating disease burden in normal control subjects (CTRL) and probable Alzheimer's disease (AD) patients. The study group consisted of 75 patients with a diagnosis of probable AD and 75 age-matched normal CTRL without neurological or neuropsychological deficit. We calculated a reference eigenspace of MRI appearance from reference data, onto which our CTRL and probable AD subjects were projected. We then calculated the multi-dimensional hyperplane separating the CTRL and probable AD groups. The DEF was estimated via a multidimensional weighted distance of eigencoordinates between a given subject and the CTRL group mean, along salient principal components forming the separating hyperplane. We used quantile plots, Kolmogorov-Smirnov and χ2 tests to compare the DEF values and test that their distribution was normal. We used a linear discriminant test to separate CTRL from probable AD based on the DEF, and reached an accuracy of 87%. A quantitative biomarker in AD would act as an important surrogate marker of disease status and progression.

  7. Behavior of visual field index in advanced glaucoma.

    PubMed

    Rao, Harsha L; Senthil, Sirisha; Choudhari, Nikhil S; Mandal, Anil K; Garudadri, Chandra S

    2013-01-14

    To evaluate the magnitude of Visual Field Index (VFI) change attributable to change in the estimation algorithm from the pattern deviation probability plot (PDPP) to the total deviation probability plot (TDPP) when the mean deviation (MD) crosses -20 decibels (dB). In a retrospective study, 37 stable glaucoma eyes in which the MD of the VFs crossed -20 dB were identified. For each eye, a pair of VFs was selected so that one VF of the pair had an MD better than but close to -20 dB and the other had an MD worse than but again close to -20 dB. The change in VFI in the VF pairs and its associations with the number of points in probability plots with normal threshold sensitivities were evaluated. Similar pairs of VFs from 28 stable glaucoma eyes where the MD crossed -10 dB were chosen as controls. The change in VFI in VF pairs when the MD crossed -20 dB ranged from 3% to 33% (median: 15%), while the change when the MD crossed -10 dB ranged from 1% to 8% (median: 4%). The difference in the number of points with normal threshold sensitivities in the PDPP when the MD was better than -20 dB compared to those in the TDPP when the MD crossed -20 dB significantly influenced the VFI change (R² = 0.65). Considering the eccentricity of these points further explained the VFI change (R² = 0.81). The decrease in VFI when the MD crosses -20 dB can be highly variable. This has to be considered with the use of VFI in clinical and research settings.

  8. Plotting equation for gaussian percentiles and a spreadsheet program for generating probability plots

    USGS Publications Warehouse

    Balsillie, J.H.; Donoghue, J.F.; Butler, K.M.; Koch, J.L.

    2002-01-01

    Two-dimensional plotting tools can be of invaluable assistance in analytical scientific pursuits, and have been widely used in the analysis and interpretation of sedimentologic data. We consider, in this work, the use of arithmetic probability paper (APP). Most statistical computer applications do not allow for the generation of APP plots, because of apparent intractable nonlinearity of the percentile (or probability) axis of the plot. We have solved this problem by identifying an equation(s) for determining plotting positions of Gaussian percentiles (or probabilities), so that APP plots can easily be computer generated. An EXCEL example is presented, and a programmed, simple-to-use EXCEL application template is hereby made publicly available, whereby a complete granulometric analysis including data listing, moment measure calculations, and frequency and cumulative APP plots, is automatically produced.
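    The paper's exact plotting equation and EXCEL template are not reproduced in the abstract, but the standard basis of arithmetic probability paper is the probit transform: mapping cumulative percentages through the inverse normal CDF linearizes the percentile axis, so a Gaussian cumulative curve plots as a straight line. A minimal sketch, using SciPy's inverse normal rather than the authors' own equation:

```python
import numpy as np
from scipy.stats import norm

def app_positions(cum_percent):
    """Map cumulative percentages (0-100) to linear axis positions via the
    inverse normal CDF (probit) -- the transform underlying arithmetic
    probability paper.  50% maps to 0, 97.7% to about +2, etc."""
    p = np.asarray(cum_percent, dtype=float) / 100.0
    return norm.ppf(p)
```

Plotting cumulative grain-size percentages against `app_positions(...)` on an ordinary linear chart reproduces an APP plot in any spreadsheet or plotting package.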

  9. Computer routines for probability distributions, random numbers, and related functions

    USGS Publications Warehouse

    Kirby, W.

    1983-01-01

    Use of previously coded and tested subroutines simplifies and speeds up program development and testing. This report presents routines that can be used to calculate various probability distributions and other functions of importance in statistical hydrology. The routines are designed as general-purpose Fortran subroutines and functions to be called from user-written main programs. The probability distributions provided include the beta, chi-square, gamma, Gaussian (normal), Pearson Type III (tables and approximation), and Weibull. Also provided are the distributions of the Grubbs-Beck outlier test, Kolmogorov's and Smirnov's D, Student's t, noncentral t (approximate), and Snedecor's F. Other mathematical functions include the Bessel function I₀, gamma and log-gamma functions, error functions, and the exponential integral. Auxiliary services include sorting and printer-plotting. Random number generators for uniform and normal numbers are provided and may be used with some of the above routines to generate numbers from other distributions. (USGS)
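    The report's last point, using a uniform generator to draw from other distributions, is the inverse-transform method: if U is Uniform(0,1) and F is an invertible CDF, then F⁻¹(U) follows F. A short sketch in Python (the report's routines are Fortran) for the Weibull, one of the distributions the report covers:

```python
import numpy as np

def weibull_from_uniform(u, shape, scale=1.0):
    """Inverse-CDF transform: if U ~ Uniform(0,1), then
    scale * (-ln(1 - U))**(1/shape) follows a Weibull(shape, scale) law."""
    u = np.asarray(u, dtype=float)
    return scale * (-np.log1p(-u)) ** (1.0 / shape)

rng = np.random.default_rng(0)
x = weibull_from_uniform(rng.random(100000), shape=2.0, scale=1.0)
```

The sample mean of `x` should sit near the theoretical Weibull(2, 1) mean, Γ(1.5) ≈ 0.886.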

  10. Computer routines for probability distributions, random numbers, and related functions

    USGS Publications Warehouse

    Kirby, W.H.

    1980-01-01

    Use of previously coded and tested subroutines simplifies and speeds up program development and testing. This report presents routines that can be used to calculate various probability distributions and other functions of importance in statistical hydrology. The routines are designed as general-purpose Fortran subroutines and functions to be called from user-written main programs. The probability distributions provided include the beta, chi-square, gamma, Gaussian (normal), Pearson Type III (tables and approximation), and Weibull. Also provided are the distributions of the Grubbs-Beck outlier test, Kolmogorov's and Smirnov's D, Student's t, noncentral t (approximate), and Snedecor's F tests. Other mathematical functions include the Bessel function I₀, gamma and log-gamma functions, error functions and the exponential integral. Auxiliary services include sorting and printer plotting. Random number generators for uniform and normal numbers are provided and may be used with some of the above routines to generate numbers from other distributions. (USGS)

  11. Confidence limits for contribution plots in multivariate statistical process control using bootstrap estimates.

    PubMed

    Babamoradi, Hamid; van den Berg, Frans; Rinnan, Åsmund

    2016-02-18

    In Multivariate Statistical Process Control (MSPC), when a fault is expected or detected in the process, contribution plots are essential for operators and optimization engineers in identifying those process variables that were affected by or might be the cause of the fault. The traditional way of interpreting a contribution plot is to examine the largest contributing process variables as the most probable faulty ones. This might result in false readings purely due to differences in natural variation, measurement uncertainties, etc. It is more reasonable to compare variable contributions for new process runs with historical results achieved under Normal Operating Conditions, where confidence limits (CLs) for contribution plots estimated from training data are used to judge new production runs. Asymptotic methods cannot provide confidence limits for contribution plots, leaving re-sampling methods as the only option. We suggest bootstrap re-sampling to build confidence limits for all contribution plots in online PCA-based MSPC. The new strategy to estimate CLs is compared to the previously reported CLs for contribution plots. An industrial batch process dataset was used to illustrate the concepts. Copyright © 2016 Elsevier B.V. All rights reserved.
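    The bootstrap idea can be sketched in a few lines. This is a generic percentile bootstrap over NOC runs, not the paper's exact scheme (the array name `noc_contrib` and the resampling of whole runs are illustrative assumptions):

```python
import numpy as np

def bootstrap_limits(noc_contrib, level=0.95, reps=2000, seed=0):
    """Percentile-bootstrap upper confidence limits for each variable's
    mean contribution under Normal Operating Conditions.
    noc_contrib: 2-D array, rows = NOC runs, columns = process variables."""
    rng = np.random.default_rng(seed)
    n = noc_contrib.shape[0]
    idx = rng.integers(0, n, size=(reps, n))     # resample runs with replacement
    boot_means = noc_contrib[idx].mean(axis=1)   # reps x variables
    return np.quantile(boot_means, level, axis=0)
```

A new run's contribution plot would then be flagged only where a variable exceeds its historical limit, rather than wherever it happens to be largest.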

  12. Normal versus High Tension Glaucoma: A Comparison of Functional and Structural Defects

    PubMed Central

    Thonginnetra, Oraorn; Greenstein, Vivienne C.; Chu, David; Liebmann, Jeffrey M.; Ritch, Robert; Hood, Donald C.

    2009-01-01

    Purpose To compare visual field defects obtained with both multifocal visual evoked potential (mfVEP) and Humphrey visual field (HVF) techniques to topographic optic disc measurements in patients with normal tension glaucoma (NTG) and high tension glaucoma (HTG). Methods We studied 32 patients with NTG and 32 with HTG. All patients had reliable 24-2 HVFs with a mean deviation (MD) of −10 dB or better, a glaucomatous optic disc and an abnormal HVF in at least one eye. Multifocal VEPs were obtained from each eye and probability plots created. The mfVEP and HVF probability plots were divided into a central 10-degree (radius) and an outer arcuate subfield in both superior and inferior hemifields. Cluster analyses and counts of abnormal points were performed in each subfield. Optic disc images were obtained with the Heidelberg Retina Tomograph III (HRT III). Eleven stereometric parameters were calculated. Moorfields regression analysis (MRA) and the glaucoma probability score (GPS) were performed. Results There were no significant differences in MD and PSD values between NTG and HTG eyes. However, NTG eyes had a higher percentage of abnormal test points and clusters of abnormal points in the central subfields on both mfVEP and HVF than HTG eyes. For HRT III, there were no significant differences in the 11 stereometric parameters or in the MRA and GPS analyses of the optic disc images. Conclusions The visual field data suggest more localized and central defects for NTG than HTG. PMID:19223786

  13. Overland flow connectivity on planar patchy hillslopes - modified percolation theory approaches and combinatorial model of urns

    NASA Astrophysics Data System (ADS)

    Nezlobin, David; Pariente, Sarah; Lavee, Hanoch; Sachs, Eyal

    2017-04-01

    Source-sink systems are very common in hydrology; in particular, some land cover types often generate runoff (e.g., embedded rocks, bare soil), while others obstruct it (e.g., vegetation, cracked soil). Surface runoff coefficients of patchy slopes/plots covered by runoff-generating and obstructing covers (e.g., bare soil and vegetation) depend critically on the percentage cover (i.e., sources/sinks abundance) and decrease strongly with observation scale. The classic mathematical percolation theory provides a powerful apparatus for describing the runoff connectivity on patchy hillslopes, but it ignores the strong effect of overland flow directionality. To overcome this and other difficulties, modified percolation theory approaches can be considered, such as straight percolation (for planar slopes), quasi-straight percolation, and models with limited obstruction. These approaches may explain both the observed critical dependence of runoff coefficients on percentage cover and their scale decrease in systems with strong flow directionality (e.g., planar slopes). The contributing area increases sharply when the runoff-generating percentage cover approaches the straight percolation threshold. This explains the strong increase of surface runoff and erosion for relatively low values (normally less than 35%) of the obstructing cover (e.g., vegetation). Combinatorial models of urns with restricted occupancy can be applied for the analytic evaluation of meaningful straight percolation quantities, such as the expected value of the NOGA (Non-Obstructed Generating Area) and the straight percolation probability. It is shown that the nature of the cover-related runoff scale decrease is combinatorial: the probability for the generated runoff to avoid obstruction in a unit area decreases with scale for non-trivial percentage cover values. The magnitude of the scale effect is found to be a skewed non-monotonous function of the percentage cover.
    It is shown that the cover-related scale effect becomes less prominent if the obstructing capacity decreases, as generally occurs during heavy rainfalls. The plot width has a moderate positive statistical effect on runoff and erosion coefficients, since wider patchy plots have, on average, a greater normalized contributing area and a higher probability of having runoff of a certain length. The effect of plot width itself depends on the percentage cover, plot length, and the compared width scales. The contributing area uncertainty brought about by cover spatial arrangement is examined, including its dependence on the percentage cover and scale. In general, modified percolation theory approaches and combinatorial models of urns with restricted occupancy may link the critical dependence of runoff on percentage cover, the cover-related scale effect, and the statistical uncertainty of the observed quantities.
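    In the simplest idealization of straight percolation, where flow runs strictly downslope and each cell generates runoff independently with probability p, the chance that some full downslope column is unobstructed has a closed form, 1 - (1 - p^L)^W for an L-by-W plot. This toy model is an assumption of this sketch, not the authors' full framework (which treats correlated cover and partial obstruction); a Monte Carlo check confirms the closed form under the same independence assumption:

```python
import numpy as np

def straight_percolation_prob(p, length, width):
    """Probability that at least one straight downslope column of a
    width x length plot is entirely runoff-generating, when every cell
    generates runoff independently with probability p."""
    return 1.0 - (1.0 - p ** length) ** width

# Monte Carlo check of the closed form under the independence assumption.
rng = np.random.default_rng(0)
grid = rng.random((5000, 10, 8)) < 0.7        # 5000 plots, length 10, width 8
hits = np.any(np.all(grid, axis=1), axis=1)   # any column all-generating?
```

Even this toy version shows the two effects the abstract emphasizes: the probability rises sharply with percentage cover p and falls with plot length L.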

  14. Dose response explorer: an integrated open-source tool for exploring and modelling radiotherapy dose volume outcome relationships

    NASA Astrophysics Data System (ADS)

    El Naqa, I.; Suneja, G.; Lindsay, P. E.; Hope, A. J.; Alaly, J. R.; Vicic, M.; Bradley, J. D.; Apte, A.; Deasy, J. O.

    2006-11-01

    Radiotherapy treatment outcome models are a complicated function of treatment, clinical and biological factors. Our objective is to provide clinicians and scientists with an accurate, flexible and user-friendly software tool to explore radiotherapy outcomes data and build statistical tumour control or normal tissue complications models. The software tool, called the dose response explorer system (DREES), is based on Matlab, and uses a named-field structure array data type. DREES/Matlab in combination with another open-source tool (CERR) provides an environment for analysing treatment outcomes. DREES provides many radiotherapy outcome modelling features, including (1) fitting of analytical normal tissue complication probability (NTCP) and tumour control probability (TCP) models, (2) combined modelling of multiple dose-volume variables (e.g., mean dose, max dose, etc) and clinical factors (age, gender, stage, etc) using multi-term regression modelling, (3) manual or automated selection of logistic or actuarial model variables using bootstrap statistical resampling, (4) estimation of uncertainty in model parameters, (5) performance assessment of univariate and multivariate analyses using Spearman's rank correlation and chi-square statistics, boxplots, nomograms, Kaplan-Meier survival plots, and receiver operating characteristics curves, and (6) graphical capabilities to visualize NTCP or TCP prediction versus selected variable models using various plots. DREES provides clinical researchers with a tool customized for radiotherapy outcome modelling. DREES is freely distributed. We expect to continue developing DREES based on user feedback.
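    As one concrete example of the analytical NTCP models such a tool fits, a commonly used logistic dose-response form expresses complication probability through two parameters, the 50%-complication dose TD50 and the normalized slope γ50. This is a standard parameterization offered as a sketch, not necessarily DREES's exact implementation:

```python
import numpy as np

def ntcp_logistic(dose, td50, gamma50):
    """Logistic NTCP curve: complication probability versus (uniform) dose,
    parameterized by TD50 (dose giving 50% NTCP) and the normalized slope
    gamma50 at TD50: NTCP = 1 / (1 + (TD50/D)**(4*gamma50))."""
    dose = np.asarray(dose, dtype=float)
    return 1.0 / (1.0 + (td50 / dose) ** (4.0 * gamma50))
```

Fitting amounts to choosing TD50 and γ50 (e.g., by maximum likelihood over observed complication outcomes), after which the curve can be plotted against any selected dose-volume variable.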

  15. Forest expansion and climate change in the Mountain Hemlock (Tsuga mertensiana) zone, Lassen Volcanic National Park, California, U.S.A.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor, A.H.

    1995-08-01

    The relationship between climate change and the dynamics of ecotonal populations of mountain hemlock (Tsuga mertensiana [Bong.] Carr.) was determined by comparing climate and the age structure of trees from 24 plots and seedlings from 13 plots in the subalpine zone of Lassen Volcanic National Park, California. Tree establishment was greatest during periods with above normal annual and summer temperatures, and normal or above normal precipitation. Seedling establishment was positively correlated with above normal annual and summer temperatures and negatively correlated with April snowpack depth. The different responses of trees and seedlings to precipitation variation are probably related to site soil moisture conditions. Mountain hemlock populations began to expand in 1842, and establishment increased dramatically after 1880 and peaked during a warm mesic period between 1895 and 1910. The onset of forest expansion coincides with warming that began at the end of the Little Ice Age (1850-1880). These data indicate that stability of the mountain hemlock ecotone is strongly influenced by climate. If warming induced by greenhouse gases does occur as climate models predict, then the structure and dynamics of near-timberline forests in the Pacific Northwest will change. 52 refs., 8 figs., 3 tabs.

  16. The re-incarnation, re-interpretation and re-demise of the transition probability model.

    PubMed

    Koch, A L

    1999-05-28

    There are two classes of models for the cell cycle that have both a deterministic and a stochastic part: the transition probability (TP) models and the sloppy size control (SSC) models. The hallmarks of the basic TP model are two graphs: the alpha and beta plots. The former is the semi-logarithmic plot of the percentage of cell divisions yet to occur; it gives a horizontal line segment at 100% corresponding to the deterministic phase and a straight sloping tail corresponding to the stochastic part. The beta plot concerns the differences of the ages-at-division of sisters (the beta curve) and gives a straight line parallel to the tail of the alpha curve. For the SSC models, the deterministic part is the time needed for the cell to accumulate a critical amount of some substance(s). The variable part differs in the various variants of the general model, but they do not give alpha and beta curves with linear tails as postulated by the TP model. This paper argues against TP and for an elaboration of the SSC type of model. The main argument against TP is that it assumes that the probability of the transition from the stochastic phase is time invariant even though it is certain that the cells are growing and metabolizing throughout the cell cycle, a fact that should make the transition probability variable. The SSC models presume that cell division is triggered by the cell's success in growing and is not simply the result of elapsed time. The extended model proposed here to accommodate the predictions of the SSC to the straight-tailed parts of the alpha and beta plots depends on the existence of a few percent of the cells in a growing culture that are not growing normally; these are growing much more slowly or are temporarily quiescent. The bulk of the cells, however, grow nearly exponentially. Evidence for a slow-growing component comes from experimental analyses of population size distributions for a variety of cell types by the Collins-Richmond technique. These subpopulations' existence is consistent with the new concept that there is a large class of rapidly reversible mutations occurring in many organisms and at many loci, serving a large range of purposes to enable the cell to survive environmental challenges. These mutations yield special subpopulations of cells within a population. The reversible mutational changes, relevant to the elaboration of SSC models, produce slow-growing cells that are either very large or very small in size; these later revert to normal growth and division. The subpopulations, however, distort the population distribution in such a way as to better fit the exponential tails of the alpha and beta curves of the TP model.
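    The alpha plot's two-phase shape described in the abstract can be written down directly. This is a minimal sketch of the basic TP model's survivor curve (the parameter names `t_det` and `rate` are illustrative): flat at 100% through the deterministic phase, then exponentially decaying, which is log-linear on the semi-logarithmic plot.

```python
import numpy as np

def alpha_curve(ages, t_det, rate):
    """Basic TP-model alpha curve: fraction of cells not yet divided at
    each age -- 1.0 through the deterministic phase of length t_det,
    then exp(-rate * (age - t_det)) in the stochastic phase."""
    ages = np.asarray(ages, dtype=float)
    return np.where(ages < t_det, 1.0, np.exp(-rate * (ages - t_det)))
```

Plotting `alpha_curve` on a log-scaled y-axis reproduces the horizontal segment plus straight sloping tail that the TP model postulates, and that the SSC critique targets.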

  17. Electrochemical oxidation of ampicillin antibiotic at boron-doped diamond electrodes and process optimization using response surface methodology.

    PubMed

    Körbahti, Bahadır K; Taşyürek, Selin

    2015-03-01

    Electrochemical oxidation and process optimization of ampicillin antibiotic at boron-doped diamond (BDD) electrodes were investigated in a batch electrochemical reactor. The influence of operating parameters, such as ampicillin concentration, electrolyte concentration, current density, and reaction temperature, on ampicillin removal, COD removal, and energy consumption was analyzed in order to optimize the electrochemical oxidation process under specified cost-driven constraints using response surface methodology. Quadratic models for the responses satisfied the assumptions of the analysis of variance well according to normal probability, studentized residual, and outlier t residual plots. Residual plots followed a normal distribution, and outlier t values indicated that the approximations of the fitted models to the quadratic response surfaces were very good. Optimum operating conditions were determined at 618 mg/L ampicillin concentration, 3.6 g/L electrolyte concentration, 13.4 mA/cm² current density, and 36 °C reaction temperature. Under response surface optimized conditions, ampicillin removal, COD removal, and energy consumption were obtained as 97.1%, 92.5%, and 71.7 kWh/kg CODr, respectively.

  18. How To ... Guide

    Treesearch

    Duncan C. Lutes; Robert E. Keane; John F. Caratti; Carl H. Key; Nathan C. Benson

    2006-01-01

    This is probably the most critical phase of FIREMON sampling because this plot ID must be unique across all plots that will be entered in the FIREMON database. The plot identifier is made up of three parts: Registration Code, Project Code, and Plot Number.The FIREMON Analysis Tools program will allow summarization and comparison of plots only if...

  19. Maturation of human hypothalamic-pituitary-thyroid function and control.

    PubMed

    Fisher, D A; Nelson, J C; Carlton, E I; Wilcox, R B

    2000-03-01

    Measurements of serum thyrotropin (TSH) and free thyroxine (T4) concentrations were conducted in infants, children, and adults to assess maturation of the hypothalamic-pituitary-thyroid (HPT) feedback control axis. Serum free T4 and TSH concentration data were collated for cord blood of the midgestation fetus, for premature and term infants, and for peripheral blood from newborn infants, children, and adults. Mean values were plotted on a nomogram developed to characterize the reference ranges of the normal axis quantitatively based on data from 522 healthy subjects, 2 weeks to 54 years of age; 83 untreated hypothyroid patients; and 116 untreated hyperthyroid patients. Samples for 75 patients with thyroid hormone resistance were also plotted. The characterized pattern of HPT maturation included a progressive decrease in the TSH/free T4 ratio with age, from 15 in the midterm fetus, to 4.7 in term infants, and 0.97 in adults. Maturation plotted on the nomogram was complex, suggesting increasing hypothalamic-pituitary T4 resistance during fetal development, probably secondary to increasing thyrotropin-releasing hormone (TRH) secretion, the marked, cold-stimulated TRH-TSH surge at birth with reequilibration by 2-20 weeks, and a final maturation phase characterized by a decreasing serum TSH with minimal change in free T4 concentration during childhood and adolescence. The postnatal maturative phase during childhood and adolescence correlates with the progressive decrease in thyroxine secretion rate (on a microg/kg per day basis) and metabolic rate and probably reflects decreasing TRH secretion.

  20. Biostatistics Series Module 3: Comparing Groups: Numerical Variables.

    PubMed

    Hazra, Avijit; Gogtay, Nithya

    2016-01-01

    Numerical data that are normally distributed can be analyzed with parametric tests, that is, tests which are based on the parameters that define a normal distribution curve. If the distribution is uncertain, the data can be plotted as a normal probability plot and visually inspected, or tested for normality using one of a number of goodness of fit tests, such as the Kolmogorov-Smirnov test. The widely used Student's t-test has three variants. The one-sample t-test is used to assess if a sample mean (as an estimate of the population mean) differs significantly from a given population mean. The means of two independent samples may be compared for a statistically significant difference by the unpaired or independent samples t-test. If the data sets are related in some way, their means may be compared by the paired or dependent samples t-test. The t-test should not be used to compare the means of more than two groups. Although it is possible to compare groups in pairs, when there are more than two groups, this will increase the probability of a Type I error. The one-way analysis of variance (ANOVA) is employed to compare the means of three or more independent data sets that are normally distributed. Multiple measurements from the same set of subjects cannot be treated as separate, unrelated data sets. Comparison of means in such a situation requires repeated measures ANOVA. It is to be noted that while a multiple group comparison test such as ANOVA can point to a significant difference, it does not identify exactly between which two groups the difference lies. To do this, multiple group comparison needs to be followed up by an appropriate post hoc test. An example is Tukey's honestly significant difference test following ANOVA. If the assumptions for parametric tests are not met, there are nonparametric alternatives for comparing data sets. These include the Mann-Whitney U-test as the nonparametric counterpart of the unpaired Student's t-test, the Wilcoxon signed-rank test as the counterpart of the paired Student's t-test, the Kruskal-Wallis test as the nonparametric equivalent of ANOVA, and Friedman's test as the counterpart of repeated measures ANOVA.
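
    The t-test variants described above can be sketched with small stdlib-only helpers (illustrative data; the function names are our own, and a statistics library would normally also report degrees of freedom and p-values):

```python
import math
from statistics import mean, stdev

def one_sample_t(sample, mu0):
    """t statistic for H0: the population mean equals mu0."""
    n = len(sample)
    return (mean(sample) - mu0) / (stdev(sample) / math.sqrt(n))

def welch_t(a, b):
    """Unpaired t statistic for two independent samples (Welch's form,
    which does not assume equal variances)."""
    va, vb = stdev(a) ** 2, stdev(b) ** 2
    return (mean(a) - mean(b)) / math.sqrt(va / len(a) + vb / len(b))
```

    A sample whose mean equals the hypothesised mean gives t = 0, and the statistic grows as the observed difference grows relative to the standard error.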

  1. Net present value probability distributions from decline curve reserves estimates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simpson, D.E.; Huffman, C.H.; Thompson, R.S.

    1995-12-31

    This paper demonstrates how reserves probability distributions can be used to develop net present value (NPV) distributions. NPV probability distributions were developed from the rate and reserves distributions presented in SPE 28333. This real-data study used practicing engineers' evaluations of production histories. Two approaches were examined to quantify portfolio risk. The first approach, the NPV Relative Risk Plot, compares the mean NPV with the NPV relative risk ratio for the portfolio. The relative risk ratio is the NPV standard deviation (σ) divided by the mean NPV (μ). The second approach, a Risk-Return Plot, is a plot of the mean (μ) discounted cash flow rate of return (DCFROR) versus the standard deviation (σ) of the DCFROR distribution. This plot provides a risk-return relationship for comparing various portfolios. These methods may help evaluate property acquisition and divestiture alternatives and assess the relative risk of a suite of wells or fields for bank loans.
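
    The relative risk ratio described above (σ divided by μ) is a one-liner; a minimal sketch with a hypothetical function name and made-up NPV samples:

```python
from statistics import mean, pstdev

def npv_relative_risk(npv_samples):
    """NPV relative risk ratio: standard deviation of the NPV
    distribution divided by its mean (sigma / mu)."""
    return pstdev(npv_samples) / mean(npv_samples)
```

    A portfolio whose NPV outcomes are identical has ratio 0; wider spread around the same mean raises the ratio and hence the assessed risk.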

  2. Causal inference, probability theory, and graphical insights.

    PubMed

    Baker, Stuart G

    2013-11-10

    Causal inference from observational studies is a fundamental topic in biostatistics. The causal graph literature typically views probability theory as insufficient to express causal concepts in observational studies. In contrast, the view here is that probability theory is a desirable and sufficient basis for many topics in causal inference for the following two reasons. First, probability theory is generally more flexible than causal graphs: Besides explaining such causal graph topics as M-bias (adjusting for a collider) and bias amplification and attenuation (when adjusting for instrumental variable), probability theory is also the foundation of the paired availability design for historical controls, which does not fit into a causal graph framework. Second, probability theory is the basis for insightful graphical displays including the BK-Plot for understanding Simpson's paradox with a binary confounder, the BK2-Plot for understanding bias amplification and attenuation in the presence of an unobserved binary confounder, and the PAD-Plot for understanding the principal stratification component of the paired availability design. Published 2013. This article is a US Government work and is in the public domain in the USA.

  3. Population dynamics of hispid cotton rats (Sigmodon hispidus) across a nitrogen-amended landscape

    USGS Publications Warehouse

    Clark, J.E.; Hellgren, E.C.; Jorgensen, E.E.; Tunnell, S.J.; Engle, David M.; Leslie, David M.

    2003-01-01

    We conducted a mark-recapture experiment to examine the population dynamics of hispid cotton rats (Sigmodon hispidus) in response to low-level nitrogen amendments (16.4 kg nitrogen/ha per year) and exclosure fencing in an old-field grassland. The experimental design consisted of sixteen 0.16-ha plots with 4 replicates of each treatment combination. We predicted that densities, reproductive success, movement probabilities, and survival rates of cotton rats would be greater on nitrogen-amended plots because of greater aboveground biomass and canopy cover. Population densities of cotton rats tended to be highest on fenced nitrogen plots, but densities on unfenced nitrogen plots were similar to those on control and fenced plots. We observed no distinct patterns in survival rates, reproductive success, or movement probabilities with regard to nitrogen treatments. However, survival rates and reproductive success tended to be higher for cotton rats on fenced plots than for those on unfenced plots and this was likely attributable to decreased predation on fenced plots. As low-level nitrogen amendments continue to be applied, we predict that survival, reproduction, and population-growth rates of cotton rats on control plots, especially fenced plots with no nitrogen amendment, will eventually exceed those on nitrogen-amended plots as a result of higher plant-species diversity, greater food availability, and better quality cover.

  4. The Classicist and the Frequentist Approach to Probability within a "TinkerPlots2" Combinatorial Problem

    ERIC Educational Resources Information Center

    Prodromou, Theodosia

    2012-01-01

    This article seeks to address a pedagogical theory of introducing the classicist and the frequentist approach to probability, by investigating important elements in 9th grade students' learning process while working with a "TinkerPlots2" combinatorial problem. Results from this research study indicate that, after the students had seen…

  5. A fast hidden line algorithm for plotting finite element models

    NASA Technical Reports Server (NTRS)

    Jones, G. K.

    1982-01-01

    Effective plotting of finite element models requires the use of fast hidden line plot techniques that provide interactive response. A high speed hidden line technique was developed to facilitate the plotting of NASTRAN finite element models. Based on testing using 14 different models, the new hidden line algorithm (JONES-D) appears to be very fast: its speed equals that for normal (all lines visible) plotting and when compared to other existing methods it appears to be substantially faster. It also appears to be very reliable: no plot errors were observed using the new method to plot NASTRAN models. The new algorithm was made part of the NPLOT NASTRAN plot package and was used by structural analysts for normal production tasks.

  6. Funnel plot control limits to identify poorly performing healthcare providers when there is uncertainty in the value of the benchmark.

    PubMed

    Manktelow, Bradley N; Seaton, Sarah E; Evans, T Alun

    2016-12-01

    There is an increasing use of statistical methods, such as funnel plots, to identify poorly performing healthcare providers. Funnel plots comprise the construction of control limits around a benchmark and providers with outcomes falling outside the limits are investigated as potential outliers. The benchmark is usually estimated from observed data but uncertainty in this estimate is usually ignored when constructing control limits. In this paper, the use of funnel plots in the presence of uncertainty in the value of the benchmark is reviewed for outcomes from a binomial distribution. Two methods to derive the control limits are shown: (i) prediction intervals; (ii) tolerance intervals. Tolerance intervals formally include the uncertainty in the value of the benchmark while prediction intervals do not. The probability properties of 95% control limits derived using each method were investigated through hypothesised scenarios. Neither prediction intervals nor tolerance intervals produce funnel plot control limits that satisfy the nominal probability characteristics when there is uncertainty in the value of the benchmark. This is not necessarily to say that funnel plots have no role to play in healthcare, but that without the development of intervals satisfying the nominal probability characteristics they must be interpreted with care. © The Author(s) 2014.
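
    A minimal sketch of exact binomial prediction limits for one provider, treating the benchmark proportion p as fixed and known (the very simplification the paper examines; function names are hypothetical):

```python
import math

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def funnel_prediction_limits(n, p, alpha=0.05):
    """Exact binomial prediction limits (as proportions) around a fixed
    benchmark p for a provider with n cases.  Each tail is held at or
    below alpha/2; the discreteness of the distribution means the
    attained tail probabilities are usually below the nominal value."""
    lo = 0
    while binom_cdf(lo, n, p) < alpha / 2:
        lo += 1
    hi = n
    while 1 - binom_cdf(hi - 1, n, p) < alpha / 2:
        hi -= 1
    return lo / n, hi / n
```

    Replacing the fixed p with an estimate is what introduces the extra uncertainty that tolerance intervals attempt to absorb and prediction intervals ignore.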

  7. Mechanism-based model for tumor drug resistance.

    PubMed

    Kuczek, T; Chan, T C

    1992-01-01

    The development of tumor resistance to cytotoxic agents has important implications in the treatment of cancer. If supported by experimental data, mathematical models of resistance can provide useful information on the underlying mechanisms and aid in the design of therapeutic regimens. We report on the development of a model of tumor-growth kinetics based on the assumption that the rates of cell growth in a tumor are normally distributed. We further assumed that the growth rate of each cell is proportional to its rate of total pyrimidine synthesis (de novo plus salvage). Using an ovarian carcinoma cell line (2008) and resistant variants selected for chronic exposure to a pyrimidine antimetabolite, N-phosphonacetyl-L-aspartate (PALA), we derived a simple and specific analytical form describing the growth curves generated in 72 h growth assays. The model assumes that the rate of de novo pyrimidine synthesis, denoted α, is shifted down by an amount proportional to the log₁₀ PALA concentration and that cells whose rate of pyrimidine synthesis falls below a critical level, denoted α₀, can no longer grow. This is described by the equation: Probability(growth) = Probability(α₀ < α − constant × log₁₀[PALA]). This model predicts that when growth curves are plotted on probit paper, they will produce straight lines. This prediction is in agreement with the data we obtained for the 2008 cells. Another prediction of this model is that the same probit plots for the resistant variants should shift to the right in a parallel fashion. Probit plots of the dose-response data obtained for each resistant 2008 line following chronic exposure to PALA again confirmed this prediction. Correlation of the rightward shift of dose responses to uridine transport (r = 0.99) also suggests that salvage metabolism plays a key role in tumor-cell resistance to PALA.
Furthermore, the slope of the regression lines enables the detection of synergy such as that observed between dipyridamole and PALA. Although the rate-normal model was used to study the rate of salvage metabolism in PALA resistance in the present study, it may be widely applicable to modeling of other resistance mechanisms such as gene amplification of target enzymes.
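
    The rate-normal model above (growth occurs when the normally distributed synthesis rate, shifted down by a constant per decade of PALA concentration, stays above a threshold) can be sketched as follows. All parameter values are illustrative, not the study's fitted estimates:

```python
import math
from statistics import NormalDist

def growth_probability(pala, alpha_mean, alpha_sd, alpha0, c):
    """Rate-normal model: P(growth) = P(alpha0 < alpha - c * log10[PALA]),
    with the synthesis rate alpha ~ Normal(alpha_mean, alpha_sd)."""
    shifted_mean = alpha_mean - c * math.log10(pala)
    return 1.0 - NormalDist(shifted_mean, alpha_sd).cdf(alpha0)

# The probit transform used when plotting growth curves on probit paper.
probit = NormalDist().inv_cdf
```

    Applying the probit transform makes the predicted growth fraction linear in log₁₀ concentration, which is exactly the "straight line on probit paper" prediction; a resistant variant with a larger α mean gives the same line shifted rightward in parallel.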

  8. Specifying the Probability Characteristics of Funnel Plot Control Limits: An Investigation of Three Approaches

    PubMed Central

    Manktelow, Bradley N.; Seaton, Sarah E.

    2012-01-01

    Background Emphasis is increasingly being placed on the monitoring and comparison of clinical outcomes between healthcare providers. Funnel plots have become a standard graphical methodology to identify outliers and comprise plotting an outcome summary statistic from each provider against a specified ‘target’ together with upper and lower control limits. With discrete probability distributions it is not possible to specify the exact probability that an observation from an ‘in-control’ provider will fall outside the control limits. However, general probability characteristics can be set and specified using interpolation methods. Guidelines recommend that providers falling outside such control limits should be investigated, potentially with significant consequences, so it is important that the properties of the limits are understood. Methods Control limits for funnel plots for the Standardised Mortality Ratio (SMR) based on the Poisson distribution were calculated using three proposed interpolation methods and the probability calculated of an ‘in-control’ provider falling outside of the limits. Examples using published data were shown to demonstrate the potential differences in the identification of outliers. Results The first interpolation method ensured that the probability of an observation of an ‘in control’ provider falling outside either limit was always less than a specified nominal probability (p). The second method resulted in such an observation falling outside either limit with a probability that could be either greater or less than p, depending on the expected number of events. The third method led to a probability that was always greater than, or equal to, p. Conclusion The use of different interpolation methods can lead to differences in the identification of outliers. This is particularly important when the expected number of events is small. 
We recommend that users of these methods be aware of the differences, and specify which interpolation method is to be used prior to any analysis. PMID:23029202

  9. Repeated count surveys help standardize multi-agency estimates of American Oystercatcher (Haematopus palliatus) abundance

    USGS Publications Warehouse

    Hostetter, Nathan J.; Gardner, Beth; Schweitzer, Sara H.; Boettcher, Ruth; Wilke, Alexandra L.; Addison, Lindsay; Swilling, William R.; Pollock, Kenneth H.; Simons, Theodore R.

    2015-01-01

    The extensive breeding range of many shorebird species can make integration of survey data problematic at regional spatial scales. We evaluated the effectiveness of standardized repeated count surveys coordinated across 8 agencies to estimate the abundance of American Oystercatcher (Haematopus palliatus) breeding pairs in the southeastern United States. Breeding season surveys were conducted across coastal North Carolina (90 plots) and the Eastern Shore of Virginia (3 plots). Plots were visited on 1–5 occasions during April–June 2013. N-mixture models were used to estimate abundance and detection probability in relation to survey date, tide stage, plot size, and plot location (coastal bay vs. barrier island). The estimated abundance of oystercatchers in the surveyed area was 1,048 individuals (95% credible interval: 851–1,408) and 470 pairs (384–637), substantially higher than estimates that did not account for detection probability (maximum counts of 674 individuals and 316 pairs). Detection probability was influenced by a quadratic function of survey date, and increased from mid-April (~0.60) to mid-May (~0.80), then remained relatively constant through June. Detection probability was also higher during high tide than during low, rising, or falling tides. Abundance estimates from N-mixture models were validated at 13 plots by exhaustive productivity studies (2–5 surveys wk⁻¹). Intensive productivity studies identified 78 breeding pairs across 13 productivity plots while the N-mixture model abundance estimate was 74 pairs (62–119) using only 1–5 replicated surveys season⁻¹. Our results indicate that standardized replicated count surveys coordinated across multiple agencies and conducted during a relatively short time window (closure assumption) provide tremendous potential to meet both agency-level (e.g., state) and regional-level (e.g., flyway) objectives in large-scale shorebird monitoring programs.

  10. Determining density of maize canopy. 3: Temporal considerations

    NASA Technical Reports Server (NTRS)

    Stoner, E. R.; Baumgardner, M. F.; Anuta, P. E.; Cipra, J. E.

    1972-01-01

    Multispectral scanner data were collected in two flights over ground cover plots at an altitude of 305 m. Eight ground reflectance panels in close proximity to the ground cover plots were used to normalize the scanner data obtained on different dates. Separate prediction equations were obtained for both flight dates for all eleven reflective wavelength bands of the multispectral scanner. Ratios of normalized scanner data were related to leaf area index over time. Normalized scanner data were used to plot relative reflectance versus wavelength for the ground cover plots. Spectral response curves were similar to those for bare soil and green vegetation as determined by laboratory measurements. The spectral response curves from the normalized scanner data indicated that reflectance in the 0.72 to 1.3 micron wavelength range increased as leaf area index increased. A decrease in reflectance was observed in the 0.65 micron chlorophyll absorption band as leaf area index increased.

  11. Deviation from Boltzmann distribution in excited energy levels of singly-ionized iron in an argon glow discharge plasma for atomic emission spectrometry

    NASA Astrophysics Data System (ADS)

    Zhang, Lei; Kashiwakura, Shunsuke; Wagatsuma, Kazuaki

    2012-01-01

    A Boltzmann plot for many iron ionic lines having excitation energies of 4.7-9.1 eV was investigated in an argon glow discharge plasma when the discharge parameters, such as the voltage/current and the gas pressure, were varied. A Grimm-style radiation source was employed in a DC voltage range of 400-800 V at argon pressures of 400-930 Pa. The plot did not follow a linear relationship over a wide range of the excitation energy, but it yielded a normal Boltzmann distribution in the range of 4.7-5.8 eV and a large overpopulation in higher-lying excitation levels of iron ion. A probable reason for this phenomenon is that excitations for higher excited energy levels of iron ion would be predominantly caused by non-thermal collisions with argon species, the internal energy of which is received by iron atoms for the ionization. Particular intense ionic lines, which gave a maximum peak of the Boltzmann plot, were observed at an excitation energy of ca. 7.7 eV. They were the Fe II 257.297-nm and the Fe II 258.111-nm lines, derived from the 3d⁵4s4p ⁶P excited levels. The 3d⁵4s4p ⁶P excited levels can be highly populated through a resonance charge transfer from the ground state of argon ion, because of good matching in the excitation energy as well as the conservation of the total spin before and after the collision. An enhancement factor of the emission intensity for various Fe II lines could be obtained from a deviation from the normal Boltzmann plot, which comprised the emission lines of 4.7-5.8 eV. It would roughly correspond to a contribution of the charge transfer excitation to the excited levels of iron ion, suggesting that the charge-transfer collision could elevate the number density of the corresponding excited levels by a factor of ca. 10⁴. The Boltzmann plots give important information on the reason why a variety of iron ionic lines can be emitted from glow discharge plasmas.
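
    A Boltzmann plot of the kind described above charts ln(Iλ/gA) against excitation energy; under thermal equilibrium the points are linear with slope −1/(kT), and overpopulated levels show up as points above that line. A minimal sketch with made-up line data (not the paper's measurements):

```python
import math

K_B_EV = 8.617333e-5  # Boltzmann constant in eV/K

def boltzmann_points(lines):
    """Boltzmann-plot coordinates (E_exc, ln(I * lambda / (g * A))).
    Each entry of `lines` is (E_exc_eV, intensity, wavelength, g, A)."""
    return [(E, math.log(I * lam / (g * A))) for (E, I, lam, g, A) in lines]

def excitation_temperature(points):
    """Least-squares slope of the plot is -1/(k*T); returns T in kelvin."""
    n = len(points)
    xbar = sum(x for x, _ in points) / n
    ybar = sum(y for _, y in points) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in points)
             / sum((x - xbar) ** 2 for x, _ in points))
    return -1.0 / (K_B_EV * slope)
```

    Fitting only the linear 4.7-5.8 eV portion and inspecting residuals of the higher-lying lines is one way to quantify the deviation the abstract describes.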

  12. Establishment probability in newly founded populations.

    PubMed

    Gusset, Markus; Müller, Michael S; Grimm, Volker

    2012-06-20

    Establishment success in newly founded populations relies on reaching the established phase, which is defined by characteristic fluctuations of the population's state variables. Stochastic population models can be used to quantify the establishment probability of newly founded populations; however, so far no simple but robust method for doing so existed. To determine a critical initial number of individuals that need to be released to reach the established phase, we used a novel application of the "Wissel plot", where −ln(1 − P₀(t)) is plotted against time t. This plot is based on the equation P₀(t) = 1 − c₁e^(−ω₁t), which relates the probability of extinction by time t, P₀(t), to two constants: c₁ describes the probability of a newly founded population to reach the established phase, whereas ω₁ describes the population's probability of extinction per short time interval once established. For illustration, we applied the method to a previously developed stochastic population model of the endangered African wild dog (Lycaon pictus). A newly founded population reaches the established phase if the intercept of the (extrapolated) linear parts of the "Wissel plot" with the y-axis, which is −ln(c₁), is negative. For wild dogs in our model, this is the case if a critical initial number of four packs, consisting of eight individuals each, are released. The method we present to quantify the establishment probability of newly founded populations is generic and inferences thus are transferable to other systems across the field of conservation biology. In contrast to other methods, our approach disaggregates the components of a population's viability by distinguishing establishment from persistence.
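
    The Wissel-plot construction is a simple transform: plotting −ln(1 − P₀(t)) against t turns the model P₀(t) = 1 − c₁e^(−ω₁t) into a straight line with slope ω₁ and intercept −ln(c₁). A minimal sketch with illustrative constants:

```python
import math

def extinction_probability(t, c1, omega1):
    """P0(t) = 1 - c1 * exp(-omega1 * t), the model behind the Wissel plot."""
    return 1.0 - c1 * math.exp(-omega1 * t)

def wissel_y(p0):
    """Wissel-plot ordinate: -ln(1 - P0(t)).  Once the population is in
    the established phase this is linear in t, with slope omega1 and
    intercept -ln(c1)."""
    return -math.log(1.0 - p0)
```

    In practice P₀(t) comes from repeated simulations of a stochastic population model; the intercept of a line fitted to the linear part of the transformed curve then estimates −ln(c₁).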

  13. Product plots.

    PubMed

    Wickham, Hadley; Hofmann, Heike

    2011-12-01

    We propose a new framework for visualising tables of counts, proportions and probabilities. We call our framework product plots, alluding to the computation of area as a product of height and width, and the statistical concept of generating a joint distribution from the product of conditional and marginal distributions. The framework, with extensions, is sufficient to encompass over 20 visualisations previously described in fields of statistical graphics and infovis, including bar charts, mosaic plots, treemaps, equal area plots and fluctuation diagrams. © 2011 IEEE

  14. The probability of being identified as an outlier with commonly used funnel plot control limits for the standardised mortality ratio.

    PubMed

    Seaton, Sarah E; Manktelow, Bradley N

    2012-07-16

    Emphasis is increasingly being placed on the monitoring of clinical outcomes for health care providers. Funnel plots have become an increasingly popular graphical methodology used to identify potential outliers. It is assumed that a provider only displaying expected random variation (i.e. 'in-control') will fall outside a control limit with a known probability. In reality, the discrete count nature of these data, and the differing methods, can lead to true probabilities quite different from the nominal value. This paper investigates the true probability of an 'in control' provider falling outside control limits for the Standardised Mortality Ratio (SMR). The true probabilities of an 'in control' provider falling outside control limits for the SMR were calculated and compared for three commonly used limits: Wald confidence interval; 'exact' confidence interval; probability-based prediction interval. The probability of falling above the upper limit, or below the lower limit, often varied greatly from the nominal value. This was particularly apparent when there were a small number of expected events: for expected events ≤ 50 the median probability of an 'in-control' provider falling above the upper 95% limit was 0.0301 (Wald), 0.0121 ('exact'), 0.0201 (prediction). It is important to understand the properties and probability of being identified as an outlier by each of these different methods to aid the correct identification of poorly performing health care providers. The limits obtained using probability-based prediction limits have the most intuitive interpretation and their properties can be defined a priori. Funnel plot control limits for the SMR should not be based on confidence intervals.
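
    The discreteness effect described above is easy to reproduce: for a Poisson count of observed events, the attained tail probability of any integer control limit can sit well below the nominal value, especially when the expected count is small. A minimal sketch (function names are our own, not the paper's):

```python
import math

def poisson_cdf(k, lam):
    """P(X <= k) for X ~ Poisson(lam)."""
    return math.exp(-lam) * sum(lam**i / math.factorial(i) for i in range(k + 1))

def true_upper_tail(expected, nominal=0.025):
    """Smallest count c with P(X > c) <= nominal for an in-control
    provider with the given expected number of events, together with
    the tail probability actually attained at that limit."""
    c = 0
    while 1 - poisson_cdf(c, expected) > nominal:
        c += 1
    return c, 1 - poisson_cdf(c, expected)
```

    Comparing the attained tail with the nominal 0.025 for a range of expected counts shows the gap the paper quantifies for the Wald, 'exact', and prediction-based limits.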

  15. Measuring Forest Area Loss Over Time Using FIA Plots and Satellite Imagery

    Treesearch

    Michael L. Hoppus; Andrew J. Lister

    2005-01-01

    How accurately can FIA plots, scattered at 1 per 6,000 acres, identify often rare forest land loss, estimated at less than 1 percent per year in the Northeast? Here we explore this question mathematically, empirically, and by comparing FIA plot estimates of forest change with satellite image based maps of forest loss. The mathematical probability of exactly estimating...

  16. Cowbird removals unexpectedly increase productivity of a brood parasite and the songbird host.

    PubMed

    Kosciuch, Karl L; Sandercock, Brett K

    2008-03-01

    Generalist brood parasites reduce productivity and population growth of avian hosts and have been implicated in population declines of several songbirds of conservation concern. To estimate the demographic effects of brood parasitism on Bell's Vireos (Vireo bellii), we removed Brown-headed Cowbirds (Molothrus ater) in a replicated switchback experimental design. Cowbird removals decreased parasitism frequency from 77% and 85% at unmanipulated plots to 58% and 47% at removal plots in 2004 and 2005, respectively. Vireo productivity per pair was higher at cowbird removal plots when years were pooled (mean = 2.6 +/- 0.2 [SE] young per pair) compared to unmanipulated plots (1.2 +/- 0.1). Nest desertion frequency was lower at cowbird removal plots (35% of parasitized nests) compared to unmanipulated plots (69%) because removal of host eggs was the proximate cue for nest desertion, and vireos experienced lower rates of egg loss at cowbird removal plots. Nest success was higher among unparasitized than parasitized nests, and parasitized nests at cowbird removal plots had a higher probability of success than parasitized nests at unmanipulated plots. Unexpectedly, cowbird productivity from vireo pairs was higher at cowbird removal plots (mean = 0.3 +/- 0.06 young per pair) than at unmanipulated plots (0.1 +/- 0.03) because fewer parasitized nests were deserted and the probability of nest success was higher. Our study provides the first evidence that increases in cowbird productivity may be an unintended consequence of cowbird control programs, especially during the initial years of trapping when parasitism may only be moderately reduced. Thus, understanding the demographic impacts of cowbird removals requires an informed understanding of the behavioral ecology of host-parasite interactions.

  17. Simple reaction time in 8-9-year old children environmentally exposed to PCBs.

    PubMed

    Šovčíková, Eva; Wimmerová, Soňa; Strémy, Maximilián; Kotianová, Janette; Loffredo, Christopher A; Murínová, Ľubica Palkovičová; Chovancová, Jana; Čonka, Kamil; Lancz, Kinga; Trnovec, Tomáš

    2015-12-01

    Simple reaction time (SRT) has been studied in children exposed to polychlorinated biphenyls (PCBs), with variable results. In the current work we examined SRT in 146 boys and 161 girls, aged 8.53 ± 0.65 years (mean ± SD), exposed to PCBs in the environment of eastern Slovakia. We divided the children into tertiles with regard to increasing PCB serum concentration. The mean ± SEM serum concentration of the sum of 15 PCB congeners was 191.15 ± 5.39, 419.23 ± 8.47, and 1315.12 ± 92.57 ng/g lipids in children of the first, second, and third tertiles, respectively. We created probability distribution plots for each child from their multiple trials of the SRT testing. We fitted response time distributions from all valid trials with the ex-Gaussian function, a convolution of a normal and an additional exponential function, providing estimates of three independent parameters μ, σ, and τ. μ is the mean of the normal component, σ is the standard deviation of the normal component, and τ is the mean of the exponential component. Group response time distributions were calculated using the Vincent averaging technique. A Q-Q plot comparing probability distribution of the first vs. third tertile indicated that deviation of the quantiles of the latter tertile from those of the former begins at the 40th percentile and does not show a positive acceleration. This was confirmed in comparison of the ex-Gaussian parameters of these two tertiles adjusted for sex, age, Raven IQ of the child, mother's and father's education, behavior at home and school, and BMI: the results showed that the parameters μ and τ significantly (p ≤ 0.05) increased with PCB exposure. Similar increases of the ex-Gaussian parameter τ in children suffering from ADHD have been previously reported and interpreted as intermittent attentional lapses, but were not seen in our cohort. 
Our study has confirmed that environmental exposure of children to PCBs is associated with prolongation of simple reaction time reflecting impairment of cognitive functions. Copyright © 2015 Elsevier Inc. All rights reserved.
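
    The ex-Gaussian distribution used above is the sum of a normal and an exponential component, so its mean is μ + τ. A minimal sampling sketch (parameter values are illustrative, not the study's estimates):

```python
import random
from statistics import mean

def ex_gaussian_sample(mu, sigma, tau, n, seed=42):
    """Draw n reaction times from an ex-Gaussian distribution: a
    Normal(mu, sigma) component plus an Exponential(mean tau) component.
    Note random.expovariate takes a rate, so the rate is 1/tau."""
    rng = random.Random(seed)
    return [rng.gauss(mu, sigma) + rng.expovariate(1.0 / tau)
            for _ in range(n)]
```

    Fitting μ, σ, and τ to observed response-time distributions (for example by maximum likelihood) is what separates a shift of the whole distribution (μ) from a heavier slow tail (τ), the pattern the study associates with PCB exposure.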

  18. Radionuclide esophageal transit: an evaluation of therapy in achalasia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McKinney, M.K.; Brady, C.E.; Weiland, F.L.

    1983-09-01

    We measured quantitative esophageal transit, expressed as percentage of esophageal retention, before and after pneumatic dilatation in two patients with achalasia. In the sitting position they ingested a 500 ml liquid meal containing 500 μCi technetium Tc 99m sulfur colloid. Radioactivity counts of the entire esophagus were plotted at five-minute intervals for 30 minutes. In five normal control subjects the esophagus essentially cleared in less than one minute. Both patients with achalasia had definite retention 30 minutes before dilatation and had quantitative improvement after dilatation. Radionuclide scintigraphic esophageal transit probably correlates better than other parameters with the physiologic degree of obstruction in achalasia.

  19. Modeling natural regeneration establishment in the northern Rocky Mountains of the U.S.A

    Treesearch

    D. E. Ferguson

    1996-01-01

    Retrospective examination of cutover forests enables the development of models that predict regeneration success as a function of plot conditions and time since disturbance. The modeling process uses a two-state system. In the first state, all plots are analyzed to predict the probability of stocking (at least one established seedling on the plot). In the second state...

  20. Examining robustness of model selection with half-normal and LASSO plots for unreplicated factorial designs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jang, Dae -Heung; Anderson-Cook, Christine Michaela

    When there are constraints on resources, an unreplicated factorial or fractional factorial design can allow efficient exploration of numerous factor and interaction effects. A half-normal plot is a common graphical tool used to compare the relative magnitude of effects and to identify important effects from these experiments when no estimate of error from the experiment is available. An alternative is to use a least absolute shrinkage and selection operation plot to examine the pattern of model selection terms from an experiment. We examine how both the half-normal and least absolute shrinkage and selection operation plots are impacted by the absence of individual observations or an outlier, and the robustness of conclusions obtained from these two techniques for identifying important effects from factorial experiments. The methods are illustrated with two examples from the literature.
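
    A half-normal plot charts the sorted absolute effect estimates against half-normal quantiles; inert effects fall on a line through the origin and active effects break away from it. A minimal sketch of the plotting coordinates (one common quantile convention; others differ slightly):

```python
from statistics import NormalDist

def half_normal_points(effects):
    """Half-normal plot coordinates for factorial effect estimates:
    sorted |effect| against half-normal quantiles
    Phi^-1(0.5 + 0.5 * (i - 0.5) / m) for i = 1..m."""
    abs_effects = sorted(abs(e) for e in effects)
    m = len(abs_effects)
    quantiles = [NormalDist().inv_cdf(0.5 + 0.5 * (i - 0.5) / m)
                 for i in range(1, m + 1)]
    return list(zip(quantiles, abs_effects))
```

    Dropping one observation changes every effect estimate at once in an unreplicated design, which is why the paper studies how sensitive both this plot and the LASSO path are to missing points and outliers.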

  1. Examining robustness of model selection with half-normal and LASSO plots for unreplicated factorial designs

    DOE PAGES

    Jang, Dae -Heung; Anderson-Cook, Christine Michaela

    2017-04-12

    When there are constraints on resources, an unreplicated factorial or fractional factorial design can allow efficient exploration of numerous factor and interaction effects. A half-normal plot is a common graphical tool used to compare the relative magnitude of effects and to identify important effects from these experiments when no estimate of error from the experiment is available. An alternative is to use a least absolute shrinkage and selection operation plot to examine the pattern of model selection terms from an experiment. We examine how both the half-normal and least absolute shrinkage and selection operation plots are impacted by the absence of individual observations or an outlier, and the robustness of conclusions obtained from these two techniques for identifying important effects from factorial experiments. The methods are illustrated with two examples from the literature.

  2. ROBUST: an interactive FORTRAN-77 package for exploratory data analysis using parametric, ROBUST and nonparametric location and scale estimates, data transformations, normality tests, and outlier assessment

    NASA Astrophysics Data System (ADS)

    Rock, N. M. S.

    ROBUST calculates 53 statistics, plus significance levels for 6 hypothesis tests, on each of up to 52 variables. These together allow the following properties of the data distribution for each variable to be examined in detail: (1) Location. Three means (arithmetic, geometric, harmonic) are calculated, together with the midrange and 19 high-performance robust L-, M-, and W-estimates of location (combined, adaptive, trimmed estimates, etc.). (2) Scale. The standard deviation is calculated along with the H-spread/2 (≈ semi-interquartile range), the mean and median absolute deviations from both mean and median, and a biweight scale estimator. The 23 location and 6 scale estimators programmed cover all possible degrees of robustness. (3) Normality. Distributions are tested against the null hypothesis that they are normal, using the 3rd (√b₁) and 4th (b₂) moments, Geary's ratio (mean deviation/standard deviation), Filliben's probability plot correlation coefficient, and a more robust test based on the biweight scale estimator. These statistics collectively are sensitive to most usual departures from normality. (4) Presence of outliers. The maximum and minimum values are assessed individually or jointly using Grubbs' maximum Studentized residuals, Harvey's and Dixon's criteria, and the Studentized range. For a single input variable, outliers can be either winsorized or eliminated and all estimates recalculated iteratively as desired. The following data-transformations also can be applied: linear, log10, generalized Box-Cox power (including log, reciprocal, and square root), exponentiation, and standardization. For more than one variable, all results are tabulated in a single run of ROBUST. Further options are incorporated to assess ratios (of two variables) as well as discrete variables, and to handle missing data. Cumulative S-plots (for assessing normality graphically) also can be generated.
The mutual consistency or inconsistency of all these measures helps to detect errors in data as well as to assess data-distributions themselves.
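Of the normality tests listed in the abstract, Filliben's probability plot correlation coefficient is simple to reproduce: it is the correlation between the sorted sample and the normal order-statistic medians. A minimal Python sketch (using Filliben's 1975 approximation to the uniform order-statistic medians; the function name is ours):

```python
import math
from statistics import NormalDist, mean

def filliben_r(x):
    """Filliben's probability-plot correlation coefficient: the correlation
    between the sorted sample and the normal order-statistic medians.
    Values near 1 are consistent with normality."""
    n = len(x)
    xs = sorted(x)
    # Filliben's approximation to the medians of uniform order statistics
    u = [(i - 0.3175) / (n + 0.365) for i in range(1, n + 1)]
    u[0] = 1 - 0.5 ** (1 / n)
    u[-1] = 0.5 ** (1 / n)
    m = [NormalDist().inv_cdf(p) for p in u]  # normal medians
    mx, mm = mean(xs), mean(m)
    sxy = sum((a - mx) * (b - mm) for a, b in zip(xs, m))
    sxx = sum((a - mx) ** 2 for a in xs)
    syy = sum((b - mm) ** 2 for b in m)
    return sxy / math.sqrt(sxx * syy)
```

A normal sample yields a coefficient very close to 1, while a skewed (e.g. exponential) sample yields a visibly smaller value.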

  3. Fitting Data to Model: Structural Equation Modeling Diagnosis Using Two Scatter Plots

    ERIC Educational Resources Information Center

    Yuan, Ke-Hai; Hayashi, Kentaro

    2010-01-01

    This article introduces two simple scatter plots for model diagnosis in structural equation modeling. One plot contrasts a residual-based M-distance of the structural model with the M-distance for the factor score. It contains information on outliers, good leverage observations, bad leverage observations, and normal cases. The other plot contrasts…

  4. Correlation signatures of wet soils and snows. [algorithm development and computer programming

    NASA Technical Reports Server (NTRS)

    Phillips, M. R.

    1972-01-01

Interpretation, analysis, and development of algorithms have provided the necessary computational programming tools for soil data processing, data handling, and analysis. The algorithms developed thus far are adequate and have proven successful for several preliminary and fundamental applications, such as software interfacing capabilities, probability distributions, grey-level print plotting, contour plotting, isometric data displays, joint probability distributions, boundary mapping, channel registration, and ground scene classification. A description of an Earth Resources Flight Data Processor (ERFDP), which handles and processes earth resources data under a user's control, is provided.

  5. ensembleBMA: An R Package for Probabilistic Forecasting using Ensembles and Bayesian Model Averaging

    DTIC Science & Technology

    2007-08-15

library is used to allow addition of the legend and map outline to the plot. > bluescale <- function(n) hsv(4/6, s = seq(from = 1/8, to = 1, length = n), v = 1) > plotBMAforecast(probFreeze290104, lon = srftGridData$lon, lat = srftGridData$lat, type = "image", col = bluescale(100)) > title("Probability of...probPrecip130103) # used to determine zlim in plots [1] 0.02832709 0.99534860 > plotBMAforecast(probPrecip130103[, …], lon = prcpGridData$lon, lat

  6. Detection of the fracture zone by the method of recurrence plot

    NASA Astrophysics Data System (ADS)

    Hilarov, V. L.

    2017-12-01

Recurrence plots (RPs) and recurrence quantification analysis (RQA) characteristics are analyzed for the normal component of the displacement vector upon excitation of a defect-containing steel plate by a sound pulse. Different cases of the spatial distribution of defects (uniform and normal) are considered, and a difference in the RQA parameters between these cases is revealed.
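The recurrence plot underlying RQA is just a thresholded distance matrix over the time series. A minimal sketch for a scalar series (function names are ours; real RQA toolkits add phase-space embedding and further measures such as determinism and laminarity):

```python
def recurrence_matrix(x, eps):
    """Binary recurrence matrix: R[i][j] = 1 when |x_i - x_j| <= eps."""
    return [[1 if abs(a - b) <= eps else 0 for b in x] for a in x]

def recurrence_rate(R):
    """RQA recurrence rate: the fraction of recurrent points in the matrix."""
    n = len(R)
    return sum(map(sum, R)) / (n * n)
```

For a strictly periodic series the recurrent points form the characteristic diagonal-line structure; a constant series gives a recurrence rate of 1.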

  7. Occupancy in continuous habitat

    USGS Publications Warehouse

    Efford, Murray G.; Dawson, Deanna K.

    2012-01-01

    The probability that a site has at least one individual of a species ('occupancy') has come to be widely used as a state variable for animal population monitoring. The available statistical theory for estimation when detection is imperfect applies particularly to habitat patches or islands, although it is also used for arbitrary plots in continuous habitat. The probability that such a plot is occupied depends on plot size and home-range characteristics (size, shape and dispersion) as well as population density. Plot size is critical to the definition of occupancy as a state variable, but clear advice on plot size is missing from the literature on the design of occupancy studies. We describe models for the effects of varying plot size and home-range size on expected occupancy. Temporal, spatial, and species variation in average home-range size is to be expected, but information on home ranges is difficult to retrieve from species presence/absence data collected in occupancy studies. The effect of variable home-range size is negligible when plots are very large (>100 x area of home range), but large plots pose practical problems. At the other extreme, sampling of 'point' plots with cameras or other passive detectors allows the true 'proportion of area occupied' to be estimated. However, this measure equally reflects home-range size and density, and is of doubtful value for population monitoring or cross-species comparisons. Plot size is ill-defined and variable in occupancy studies that detect animals at unknown distances, the commonest example being unlimited-radius point counts of song birds. We also find that plot size is ill-defined in recent treatments of "multi-scale" occupancy; the respective scales are better interpreted as temporal (instantaneous and asymptotic) rather than spatial. Occupancy is an inadequate metric for population monitoring when it is confounded with home-range size or detection distance.

  8. External validation and comparison of two nomograms predicting the probability of Gleason sum upgrading between biopsy and radical prostatectomy pathology in two patient populations: a retrospective cohort study.

    PubMed

    Utsumi, Takanobu; Oka, Ryo; Endo, Takumi; Yano, Masashi; Kamijima, Shuichi; Kamiya, Naoto; Fujimura, Masaaki; Sekita, Nobuyuki; Mikami, Kazuo; Hiruta, Nobuyuki; Suzuki, Hiroyoshi

    2015-11-01

The aim of this study is to validate and compare the predictive accuracy of two nomograms predicting the probability of Gleason sum upgrading between biopsy and radical prostatectomy pathology among representative patients with prostate cancer. We previously developed a nomogram, as did Chun et al. In this validation study, patients originated from two centers: Toho University Sakura Medical Center (n = 214) and Chibaken Saiseikai Narashino Hospital (n = 216). We assessed predictive accuracy using area-under-the-curve values and constructed calibration plots to assess the tendency for each institution. Both nomograms showed high predictive accuracy in each institution, although the constructed calibration plots of the two nomograms underestimated the actual probability in Toho University Sakura Medical Center. Clinicians need to use calibration plots for each institution to correctly understand the tendency of each nomogram for their patients, even if each nomogram has good predictive accuracy. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  9. Probability of detection of nests and implications for survey design

    USGS Publications Warehouse

    Smith, P.A.; Bart, J.; Lanctot, Richard B.; McCaffery, B.J.; Brown, S.

    2009-01-01

    Surveys based on double sampling include a correction for the probability of detection by assuming complete enumeration of birds in an intensively surveyed subsample of plots. To evaluate this assumption, we calculated the probability of detecting active shorebird nests by using information from observers who searched the same plots independently. Our results demonstrate that this probability varies substantially by species and stage of the nesting cycle but less by site or density of nests. Among the species we studied, the estimated single-visit probability of nest detection during the incubation period varied from 0.21 for the White-rumped Sandpiper (Calidris fuscicollis), the most difficult species to detect, to 0.64 for the Western Sandpiper (Calidris mauri), the most easily detected species, with a mean across species of 0.46. We used these detection probabilities to predict the fraction of persistent nests found over repeated nest searches. For a species with the mean value for detectability, the detection rate exceeded 0.85 after four visits. This level of nest detection was exceeded in only three visits for the Western Sandpiper, but six to nine visits were required for the White-rumped Sandpiper, depending on the type of survey employed. Our results suggest that the double-sampling method's requirement of nearly complete counts of birds in the intensively surveyed plots is likely to be met for birds with nests that survive over several visits of nest searching. Individuals with nests that fail quickly or individuals that do not breed can be detected with high probability only if territorial behavior is used to identify likely nesting pairs. ?? The Cooper Ornithological Society, 2009.
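The visit numbers in the abstract follow from compounding a constant per-visit detection probability p over k independent searches: P_k = 1 − (1 − p)^k. A quick check against the reported figures (this simple constant-p model reproduces the four-visit result for the mean species and nine visits for the White-rumped Sandpiper; the abstract's per-survey-type figures vary slightly):

```python
import math

def cumulative_detection(p, k):
    """Probability a persistent nest is found at least once in k independent visits."""
    return 1 - (1 - p) ** k

def visits_needed(p, target=0.85):
    """Smallest number of visits k with cumulative detection >= target."""
    return math.ceil(math.log(1 - target) / math.log(1 - p))
```

With the mean per-visit probability of 0.46, four visits give a cumulative detection of about 0.915, exceeding the 0.85 level; with 0.21 (White-rumped Sandpiper), nine visits are needed under this model.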

  10. Long-slit Spectroscopy of Edge-on Low Surface Brightness Galaxies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Du, Wei; Wu, Hong; Zhu, Yinan

    2017-03-10

    We present long-slit optical spectra of 12 edge-on low surface brightness galaxies (LSBGs) positioned along their major axes. After performing reddening corrections for the emission-line fluxes measured from the extracted integrated spectra, we measured the gas-phase metallicities of our LSBG sample using both the [N ii]/H α and the R {sub 23} diagnostics. Both sets of oxygen abundances show good agreement with each other, giving a median value of 12 + log(O/H) = 8.26 dex. In the luminosity–metallicity plot, our LSBG sample is consistent with the behavior of normal galaxies. In the mass–metallicity diagram, our LSBG sample has lower metallicitiesmore » for lower stellar mass, similar to normal galaxies. The stellar masses estimated from z -band luminosities are comparable to those of prominent spirals. In a plot of the gas mass fraction versus metallicity, our LSBG sample generally agrees with other samples in the high gas mass fraction space. Additionally, we have studied stellar populations of three LSBGs, which have relatively reliable spectral continua and high signal-to-noise ratios, and qualitatively conclude that they have a potential dearth of stars with ages <1 Gyr instead of being dominated by stellar populations with ages >1 Gyr. Regarding the chemical evolution of our sample, the LSBG data appear to allow for up to 30% metal loss, but we cannot completely rule out the closed-box model. Additionally, we find evidence that our galaxies retain up to about three times as much of their metals compared with dwarfs, consistent with metal retention being related to galaxy mass. In conclusion, our data support the view that LSBGs are probably just normal disk galaxies continuously extending to the low end of surface brightness.« less

  11. Quantile Functions, Convergence in Quantile, and Extreme Value Distribution Theory.

    DTIC Science & Technology

    1980-11-01

Gnanadesikan (1968). Quantile functions are advocated by Parzen (1979) as providing an approach to probability-based data analysis. Quantile functions are... Gnanadesikan, R. (1968). Probability Plotting Methods for the Analysis of Data, Biometrika, 55, 1-17.

  12. A Generalized Approach to the Two Sample Problem: The Quantile Approach.

    DTIC Science & Technology

    1981-04-01

advantages in this regard as remarked in Parzen (1979) and Wilk and Gnanadesikan (1968). One explanation of its statistical virtues is the fact that Q...differences between male and female right congruence kneecap angles. Wilk and Gnanadesikan (1968) have named a plot of q versus G⁻¹[F(q)] a Q-Q plot and...function techniques. 5.3.5 Comparison Function Techniques Wilk and Gnanadesikan (1968) stimulated research in the area of probability plotting where they
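The Q-Q plot named by Wilk and Gnanadesikan pairs the sorted sample with quantiles of a reference distribution G. A minimal sketch (the plotting positions (i − 0.5)/n are one common convention; the function name is ours):

```python
from statistics import NormalDist

def qq_points(sample, inv_cdf=NormalDist().inv_cdf):
    """Return (theoretical, empirical) quantile pairs for a Q-Q plot
    against the reference quantile function inv_cdf, using the plotting
    positions p_i = (i - 0.5)/n."""
    n = len(sample)
    xs = sorted(sample)
    return [(inv_cdf((i - 0.5) / n), xi) for i, xi in enumerate(xs, start=1)]
```

If the sample really is drawn from G (up to location and scale), the points fall near a straight line; curvature signals a distributional mismatch.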

  13. Development of Soil Characteristics and Plant Communities On Reclaimed and Unreclaimed Spoil Heaps After Coal Mining

    NASA Astrophysics Data System (ADS)

    Cudlín, Ondřej; Řehák, Zdeněk; Cudlín, Pavel

    2016-10-01

The aim of this study was to compare soil characteristics, plant communities and the rate of selected ecosystem function performance on reclaimed and unreclaimed plots (left for spontaneous succession) of different ages on spoil heaps. Twelve spoil heaps (three circular plots of radius 12.5 m each) near the town of Kladno, north-west of Prague, created after deep coal mining, were compared. Five mixed soil samples from organo-mineral horizons in each plot were analysed for total content of carbon, nitrogen and phosphorus. In addition, active soil pH (pHH2O) was determined. Plant diversity was determined from vegetation relevés. The biodiversity value of the habitat was assessed according to the Habitat Valuation Method, and the rate of the evapotranspiration function was determined by the Method of Valuation of Functions and Services of Ecosystems in the Czech Republic. Thicker organo-mineral layers and a higher total nitrogen content were found on the older reclaimed and unreclaimed plots than on younger plots. The number of plant species and the total contents of carbon and nitrogen were significantly higher on the unreclaimed plots than on the reclaimed plots. The biodiversity values and the evapotranspiration function rate were also higher on unreclaimed plots. From this perspective, it is possible to recommend the use of spontaneous succession, together with routine reclamation methods, to restore habitats after coal mining. Despite the relatively high age of vegetation in some of the selected plots (90 years), neither the reclaimed nor the unreclaimed plots have reached the stage of potential vegetation near to natural climax. The slow development of vegetation was probably due to the unsuitable substrate of the spoil heaps and a lack of plant and tree species of natural forest habitats in this area. 
However, it is probable that vegetation communities on the observed spoil heaps under both types of management (reclaimed and unreclaimed) will reach the stage of natural climax and will provide ecosystem functions more effectively in the future.

  14. Investigation of Dielectric Breakdown Characteristics for Double-break Vacuum Interrupter and Dielectric Breakdown Probability Distribution in Vacuum Interrupter

    NASA Astrophysics Data System (ADS)

    Shioiri, Tetsu; Asari, Naoki; Sato, Junichi; Sasage, Kosuke; Yokokura, Kunio; Homma, Mitsutaka; Suzuki, Katsumi

To investigate the reliability of vacuum-insulated equipment, a study was carried out to clarify breakdown probability distributions in a vacuum gap. Further, a double-break vacuum circuit breaker was investigated for its breakdown probability distribution. The test results show that the breakdown probability distribution of the vacuum gap can be represented by a Weibull distribution with a location parameter, which represents the voltage that permits a zero breakdown probability. The location parameter obtained from the Weibull plot depends on the electrode area. The shape parameter obtained from the Weibull plot of the vacuum gap was 10∼14, and is constant irrespective of the non-uniform field factor. The breakdown probability distribution after no-load switching can also be represented by a Weibull distribution with a location parameter. The shape parameter after no-load switching was 6∼8.5, and is constant irrespective of gap length. This indicates that the scatter of breakdown voltage is increased by no-load switching. If the vacuum circuit breaker uses a double break, the breakdown probability at low voltage becomes lower than the single-break probability. Although potential distribution is a concern in the double-break vacuum circuit breaker, its insulation reliability is better than that of the single-break vacuum interrupter, even if the bias of the voltage sharing between the vacuum interrupters is taken into account.
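The three-parameter Weibull form used here is easy to sketch: v0 is the location parameter (the voltage permitting zero breakdown probability), beta the shape, and eta the scale. On a Weibull plot, ln(−ln(1−F)) is linear in ln(v − v0) with slope beta, which is how the shape parameter is read off. All numbers below are illustrative, not values from the paper:

```python
import math

def weibull_breakdown_prob(v, beta, eta, v0):
    """Three-parameter Weibull CDF: breakdown probability at voltage v,
    zero for v <= v0 (the location parameter)."""
    if v <= v0:
        return 0.0
    return 1.0 - math.exp(-(((v - v0) / eta) ** beta))

def weibull_plot_point(v, beta, eta, v0):
    """Coordinates of (v, F(v)) on a Weibull probability plot; points from
    one distribution fall on a line whose slope is the shape parameter."""
    f = weibull_breakdown_prob(v, beta, eta, v0)
    return math.log(v - v0), math.log(-math.log(1.0 - f))
```

At v = v0 + eta the breakdown probability is 1 − e⁻¹ ≈ 0.632, the usual Weibull scale-parameter convention.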

  15. The costs of evaluating species densities and composition of snakes to assess development impacts in amazonia.

    PubMed

    Fraga, Rafael de; Stow, Adam J; Magnusson, William E; Lima, Albertina P

    2014-01-01

Studies leading to decision-making for environmental licensing often fail to provide accurate estimates of diversity. Measures of snake diversity are regularly obtained to assess development impacts in the rainforests of the Amazon Basin, but this taxonomic group may be subject to poor detection probabilities. Recently, the Brazilian government tried to standardize sampling designs by the implementation of a system (RAPELD) to quantify biological diversity using spatially-standardized sampling units. Consistency in sampling design allows the detection probabilities to be compared among taxa, and sampling effort and associated cost to be evaluated. The cost effectiveness of detecting snakes has received no attention in Amazonia. Here we tested the effects of reducing sampling effort on estimates of species densities and assemblage composition. We identified snakes in seven plot systems, each standardised with 14 plots. The 250 m long centre line of each plot followed an altitudinal contour. Surveys were repeated four times in each plot and detection probabilities were estimated for the 41 species encountered. Reducing the number of observations, or the size of the sampling modules, caused significant loss of information on species densities and local patterns of variation in assemblage composition. We estimated the cost to find a snake as US$120, but general linear models indicated the possibility of identifying differences in assemblage composition for half the overall survey costs. Decisions to reduce sampling effort depend on the importance of lost information to target issues, and may not be the preferred option if there is the potential for identifying individual snake species requiring specific conservation actions. However, in most studies of human disturbance on species assemblages, it is likely to be more cost-effective to focus on other groups of organisms with higher detection probabilities.

  16. The Costs of Evaluating Species Densities and Composition of Snakes to Assess Development Impacts in Amazonia

    PubMed Central

    de Fraga, Rafael; Stow, Adam J.; Magnusson, William E.; Lima, Albertina P.

    2014-01-01

Studies leading to decision-making for environmental licensing often fail to provide accurate estimates of diversity. Measures of snake diversity are regularly obtained to assess development impacts in the rainforests of the Amazon Basin, but this taxonomic group may be subject to poor detection probabilities. Recently, the Brazilian government tried to standardize sampling designs by the implementation of a system (RAPELD) to quantify biological diversity using spatially-standardized sampling units. Consistency in sampling design allows the detection probabilities to be compared among taxa, and sampling effort and associated cost to be evaluated. The cost effectiveness of detecting snakes has received no attention in Amazonia. Here we tested the effects of reducing sampling effort on estimates of species densities and assemblage composition. We identified snakes in seven plot systems, each standardised with 14 plots. The 250 m long centre line of each plot followed an altitudinal contour. Surveys were repeated four times in each plot and detection probabilities were estimated for the 41 species encountered. Reducing the number of observations, or the size of the sampling modules, caused significant loss of information on species densities and local patterns of variation in assemblage composition. We estimated the cost to find a snake as US$120, but general linear models indicated the possibility of identifying differences in assemblage composition for half the overall survey costs. Decisions to reduce sampling effort depend on the importance of lost information to target issues, and may not be the preferred option if there is the potential for identifying individual snake species requiring specific conservation actions. However, in most studies of human disturbance on species assemblages, it is likely to be more cost-effective to focus on other groups of organisms with higher detection probabilities. PMID:25147930

  17. Smoothed Residual Plots for Generalized Linear Models. Technical Report #450.

    ERIC Educational Resources Information Center

    Brant, Rollin

    Methods for examining the viability of assumptions underlying generalized linear models are considered. By appealing to the likelihood, a natural generalization of the raw residual plot for normal theory models is derived and is applied to investigating potential misspecification of the linear predictor. A smooth version of the plot is also…

  18. Impact of signal scattering and parametric uncertainties on receiver operating characteristics

    NASA Astrophysics Data System (ADS)

    Wilson, D. Keith; Breton, Daniel J.; Hart, Carl R.; Pettit, Chris L.

    2017-05-01

The receiver operating characteristic (ROC) curve, which is a plot of the probability of detection as a function of the probability of false alarm, plays a key role in the classical analysis of detector performance. However, meaningful characterization of the ROC curve is challenging when practically important complications such as variations in source emissions, environmental impacts on the signal propagation, uncertainties in the sensor response, and multiple sources of interference are considered. In this paper, a relatively simple but realistic model for scattered signals is employed to explore how parametric uncertainties impact the ROC curve. In particular, we show that parametric uncertainties in the mean signal and noise power substantially raise the tails of the distributions; since receiver operation with a very low probability of false alarm and a high probability of detection is normally desired, these tails lead to severely degraded performance. Because full a priori knowledge of such parametric uncertainties is rarely available in practice, analyses must typically be based on a finite sample of environmental states, which only partially characterize the range of parameter variations. We show how this effect can lead to misleading assessments of system performance. For the cases considered, approximately 64 or more statistically independent samples of the uncertain parameters are needed to accurately predict the probabilities of detection and false alarm. A connection is also described between the selection of suitable distributions for the uncertain parameters and Bayesian adaptive methods for inferring the parameters.
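For the textbook Gaussian mean-shift detector the ROC curve has a closed form, P_D = 1 − Φ(Φ⁻¹(1 − P_FA) − d), with d the deflection (output signal-to-noise ratio). Averaging P_D over an uncertain d illustrates, in miniature, how parametric uncertainty reshapes the curve; this is a generic sketch, not the paper's scattering model, and the distribution of d is illustrative:

```python
import random
from statistics import NormalDist

_nd = NormalDist()

def roc_pd(p_fa, d):
    """Detection probability of a Gaussian mean-shift detector operated at
    false-alarm probability p_fa: P_D = 1 - Phi(Phi^-1(1 - p_fa) - d)."""
    return 1 - _nd.cdf(_nd.inv_cdf(1 - p_fa) - d)

def roc_pd_uncertain(p_fa, d_mean, d_sd, n=64, seed=0):
    """Average P_D over Monte Carlo samples of an uncertain deflection d;
    n = 64 echoes the sample count the abstract finds necessary."""
    rng = random.Random(seed)
    return sum(roc_pd(p_fa, rng.gauss(d_mean, d_sd)) for _ in range(n)) / n
```

Sweeping p_fa through (0, 1) with either function traces out a full ROC curve.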

  19. Solar cell angle of incidence corrections

    NASA Technical Reports Server (NTRS)

    Burger, Dale R.; Mueller, Robert L.

    1995-01-01

Literature on solar array angle-of-incidence corrections was found to be sparse and contained no supporting tabular data. This lack, along with recent data on 27 GaAs/Ge 4 cm by 4 cm cells, initiated the analysis presented in this paper. The literature cites seven possible contributors to angle-of-incidence effects: cosine, optical front surface, edge, shadowing, UV degradation, particulate soiling, and background color. Only the first three are covered in this paper due to lack of sufficient data. The cosine correction is commonly used but is not sufficient when the incident angle is large. Fresnel reflection calculations require knowledge of the index of refraction of the coverglass front surface. The absolute index of refraction of the coverglass front surface was not known, nor was it measured, due to lack of funds. However, a value for the index of refraction was obtained by examining how the prediction errors varied with different assumed indices and selecting the best fit to the set of measured values. Corrections using front-surface Fresnel reflection along with the cosine correction give very good predictive results when compared to measured data, except that there is a definite trend away from predicted values at the larger incident angles. This trend could be related to edge effects and is illustrated by use of a box plot of the errors and by plotting the deviation of the mean against incidence angle. The trend is toward larger deviations at larger incidence angles, and there may be a fourth-order effect involved. A chi-squared test was used to determine whether the measurement errors were normally distributed. At 10 degrees the chi-squared test failed, probably due to the very small numbers involved or a bias from the measurement procedure. All other angles showed a good fit to the normal distribution, with increasing goodness-of-fit as the angles increased, which reinforces the very-small-numbers hypothesis. 
The contributed data only went to 65 degrees from normal, which prevented any firm conclusions about extreme-angle effects, although a trend in the right direction was seen. Measurement errors were estimated and found to be consistent with the conclusions drawn. A controlled experiment using coverglasses and cells from the same lots and extending to larger incidence angles would probably lead to further insight into the subject area.
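The cosine-plus-Fresnel prediction described above is straightforward to reproduce for a single air-to-coverglass surface. A sketch normalized to unity at normal incidence (n = 1.5 is an assumed index for illustration, not the value fitted in the paper):

```python
import math

def fresnel_transmittance(theta_deg, n):
    """Unpolarized Fresnel transmittance of a single air-to-glass surface
    at incidence angle theta_deg, for refractive index n."""
    ti = math.radians(theta_deg)
    tt = math.asin(math.sin(ti) / n)  # Snell's law: refracted angle
    rs = (math.sin(ti - tt) / math.sin(ti + tt)) ** 2  # s-polarized reflectance
    rp = (math.tan(ti - tt) / math.tan(ti + tt)) ** 2  # p-polarized reflectance
    return 1 - 0.5 * (rs + rp)

def relative_response(theta_deg, n=1.5):
    """Predicted cell output relative to normal incidence: cosine correction
    times the front-surface Fresnel transmittance."""
    if theta_deg == 0:
        return 1.0
    t0 = 1 - ((n - 1) / (n + 1)) ** 2  # normal-incidence transmittance
    return math.cos(math.radians(theta_deg)) * fresnel_transmittance(theta_deg, n) / t0
```

As the abstract notes, the Fresnel term makes the predicted response fall below the pure cosine correction, with the gap widening at large incidence angles.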

  20. Weighted triangulation adjustment

    USGS Publications Warehouse

    Anderson, Walter L.

    1969-01-01

The variation of coordinates method is employed to perform a weighted least-squares adjustment of horizontal survey networks. Geodetic coordinates are required for each fixed and adjustable station. A preliminary inverse geodetic position computation is made for each observed line. Weights associated with each observation equation for direction, azimuth, and distance are applied in the formation of the normal equations in the least-squares adjustment. The number of normal equations that may be solved is twice the number of new stations and must be less than 150. When the normal equations are solved, shifts are produced at adjustable stations. Previously computed correction factors are applied to the shifts and a most probable geodetic position is found for each adjustable station. Final azimuths and distances are computed. These may be written onto magnetic tape for subsequent computation of state plane or grid coordinates. Input consists of punch cards containing project identification, program options, and position and observation information. Results listed include preliminary and final positions, residuals, observation equations, the solution of the normal equations showing magnitudes of shifts, and a plot of each adjusted and fixed station. During processing, data sets containing irrecoverable errors are rejected and the type of error is listed. The computer then resumes processing of additional data sets. Other conditions cause warning errors to be issued, and processing continues with the current data set.
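The core numerical step, forming and solving the weighted normal equations (AᵀWA)·dx = AᵀW·r for the station shifts, can be sketched in a few lines. This is a generic pure-Python illustration of the linear algebra, not the original FORTRAN program; a production adjustment would use a banded or sparse solver:

```python
def wls_shifts(A, w, r):
    """Solve the weighted normal equations (A^T W A) dx = A^T W r, where A is
    the design matrix of observation equations, w the observation weights,
    and r the residual vector; returns the coordinate shifts dx."""
    m, n = len(A), len(A[0])
    # Form N = A^T W A and b = A^T W r (W diagonal with entries w)
    N = [[sum(w[k] * A[k][i] * A[k][j] for k in range(m)) for j in range(n)]
         for i in range(n)]
    b = [sum(w[k] * A[k][i] * r[k] for k in range(m)) for i in range(n)]
    # Gaussian elimination with partial pivoting
    for col in range(n):
        piv = max(range(col, n), key=lambda i: abs(N[i][col]))
        N[col], N[piv] = N[piv], N[col]
        b[col], b[piv] = b[piv], b[col]
        for i in range(col + 1, n):
            f = N[i][col] / N[col][col]
            for j in range(col, n):
                N[i][j] -= f * N[col][j]
            b[i] -= f * b[col]
    # Back substitution
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - sum(N[i][j] * x[j] for j in range(i + 1, n))) / N[i][i]
    return x
```

Raising the weight of one observation pulls the adjusted solution toward satisfying that observation, which is exactly how direction, azimuth, and distance observations of differing quality are balanced.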

  1. On the distribution of scaling hydraulic parameters in a spatially anisotropic banana field

    NASA Astrophysics Data System (ADS)

    Regalado, Carlos M.

    2005-06-01

When modeling soil hydraulic properties at the field scale it is desirable to approximate the variability in a given area by means of scaling transformations which relate spatially variable local hydraulic properties to global reference characteristics. Seventy soil cores were sampled within a drip-irrigated banana plantation greenhouse on a 14×5 array of 2.5 m×5 m rectangles at 15 cm depth, to represent the field-scale variability of flow-related properties. Saturated hydraulic conductivity and water retention characteristics were measured in these 70 soil cores. van Genuchten water retention curves (WRC) with optimized m (m ≠ 1 − 1/n) were fitted to the WR data, and a general Mualem-van Genuchten model was used to predict hydraulic conductivity functions for each soil core. A scaling law of the form ν_i = α_i ν_i* was fitted to the soil hydraulic data, such that the original hydraulic parameters ν_i were scaled down to a reference curve with parameters ν_i*. An analytical expression, in terms of Beta functions, for the average suction value h_c necessary to apply the above scaling method was obtained. A robust optimization procedure with fast convergence to the global minimum was used to find the optimum h_c such that dispersion is minimized in the scaled data set. Via the Box-Cox transformation P(τ) = (α_i^τ − 1)/τ, Box-Cox normality plots showed that the scaling factors for suction (α_h) and hydraulic conductivity (α_K) were approximately log-normally distributed (i.e. τ = 0), as would be expected for such dynamic properties involving flow. By contrast, static soil-related properties such as α_θ were found to be closely Gaussian, although a power τ = 3/4 was best for approaching normality. 
Application of four different normality tests (Anderson-Darling, Shapiro-Wilk, Kolmogorov-Smirnov and χ2 goodness-of-fit tests) rendered somewhat contradictory results, suggesting that this widely extended practice is not recommended for providing a suitable probability density function for the scaling parameters α_i. Some indications of the origin of these disagreements, in terms of population size and test constraints, are pointed out. Visual inspection of normal probability plots can also lead to erroneous results. The scaling parameters α_θ and α_K show a sinusoidal spatial variation coincident with the underlying alignment of banana plants in the field. This anisotropic distribution is explained in terms of porosity variations due to processes promoting soil degradation, such as surface desiccation and soil compaction induced by tillage and the localized irrigation of banana plants, and is quantified by means of cross-correlograms.
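The Box-Cox machinery used above can be reproduced from the transform and its profile log-likelihood: the exponent τ that maximizes the likelihood is the optimum read off a Box-Cox normality plot, with τ = 0 (the log limit) indicating log-normality. A grid-search sketch (function and variable names are ours):

```python
import math

def box_cox(x, tau):
    """Box-Cox power transform P(tau) = (x**tau - 1)/tau, with the
    log limit at tau = 0; x must be positive."""
    if tau == 0:
        return math.log(x)
    return (x ** tau - 1) / tau

def box_cox_loglik(data, tau):
    """Profile log-likelihood of the normal model for exponent tau
    (up to an additive constant), including the Jacobian term."""
    n = len(data)
    y = [box_cox(v, tau) for v in data]
    mu = sum(y) / n
    var = sum((v - mu) ** 2 for v in y) / n
    return -0.5 * n * math.log(var) + (tau - 1) * sum(math.log(v) for v in data)

def best_tau(data, grid=None):
    """Grid search for the likelihood-maximizing Box-Cox exponent."""
    grid = grid or [i / 10 for i in range(-20, 21)]
    return max(grid, key=lambda t: box_cox_loglik(data, t))
```

Applied to log-normally distributed values, the search returns an optimum near τ = 0, mirroring the finding for α_h and α_K above.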

  2. Initial and continued effects of a release spray in a coastal Oregon Douglas-fir plantation.

    Treesearch

    Richard E. Miller; Edmund L. Obermeyer

    1996-01-01

    Portions of a 4-year-old Douglas-fir (Pseudotsuga menziesii var. menziesii (Mirb.) Franco) plantation were sprayed with herbicide. Five years after spraying, we established 18 plots and used several means to determine retrospectively that six plots probably received full spray treatment and six others received no spray. Various...

  3. Survivability Versus Time

    NASA Technical Reports Server (NTRS)

    Joyner, James J., Sr.

    2014-01-01

    Develop Survivability vs Time Model as a decision-evaluation tool to assess various emergency egress methods used at Launch Complex 39B (LC 39B) and in the Vehicle Assembly Building (VAB) on NASAs Kennedy Space Center. For each hazard scenario, develop probability distributions to address statistical uncertainty resulting in survivability plots over time and composite survivability plots encompassing multiple hazard scenarios.

  4. Emergent dynamics of spiking neurons with fluctuating threshold

    NASA Astrophysics Data System (ADS)

    Bhattacharjee, Anindita; Das, M. K.

    2017-05-01

    Role of fluctuating threshold on neuronal dynamics is investigated. The threshold function is assumed to follow a normal probability distribution. Standard deviation of inter-spike interval of the response is computed as an indicator of irregularity in spike emission. It has been observed that, the irregularity in spiking is more if the threshold variation is more. A significant change in modal characteristics of Inter Spike Intervals (ISI) is seen to occur as a function of fluctuation parameter. Investigation is further carried out for coupled system of neurons. Cooperative dynamics of coupled neurons are discussed in view of synchronization. Total and partial synchronization regimes are depicted with the help of contour plots of synchrony measure under various conditions. Results of this investigation may provide a basis for exploring the complexities of neural communication and brain functioning.
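A minimal leaky integrate-and-fire sketch of the single-neuron setup: the firing threshold is redrawn from a normal distribution after each spike, and the standard deviation of the inter-spike intervals grows with the threshold spread, as the abstract reports. All parameter values and the clamping of the threshold are our illustrative choices, not the paper's model:

```python
import random
import statistics

def simulate_isis(sigma_theta, n_spikes=200, theta0=1.0, i_ext=2.5,
                  tau=10.0, dt=0.1, seed=0):
    """Leaky integrate-and-fire neuron: dv/dt = (i_ext - v)/tau (Euler step),
    spike and reset to 0 when v crosses a threshold that is redrawn from
    N(theta0, sigma_theta) after every spike. Returns the inter-spike intervals."""
    rng = random.Random(seed)
    v, t, last = 0.0, 0.0, 0.0
    theta = theta0
    isis = []
    while len(isis) < n_spikes:
        v += dt * (i_ext - v) / tau
        t += dt
        if v >= theta:
            isis.append(t - last)
            last, v = t, 0.0
            # redraw the fluctuating threshold, clamped to stay reachable
            theta = min(0.9 * i_ext, max(0.05, rng.gauss(theta0, sigma_theta)))
    return isis
```

With sigma_theta = 0 the spike train is perfectly regular; increasing sigma_theta increases the ISI standard deviation, the irregularity indicator used in the abstract.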

  5. Advanced probabilistic methods for quantifying the effects of various uncertainties in structural response

    NASA Technical Reports Server (NTRS)

    Nagpal, Vinod K.

    1988-01-01

The effects of actual variations, also called uncertainties, in geometry and material properties on the structural response of a space shuttle main engine turbopump blade are evaluated. A normal distribution was assumed to represent the uncertainties statistically. Uncertainties were assumed to be totally random, partially correlated, or fully correlated. The magnitudes of these uncertainties were represented in terms of mean and variance. Blade responses, recorded in terms of displacements, natural frequencies, and maximum stress, were evaluated and plotted in the form of probabilistic distributions under combined uncertainties. These distributions provide an estimate of the range of response magnitudes and the probability of occurrence of a given response. Most importantly, they provide the information needed to estimate quantitatively the risk in a structural design.
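The general approach (normally distributed input uncertainties propagated to a response distribution) can be sketched with Monte Carlo sampling on a toy response. Here a cantilever first-bending-frequency formula stands in for the turbopump blade model, and every distribution and nominal value is illustrative, not from the paper:

```python
import math
import random
import statistics

def beam_frequency(E, rho, L, h):
    """First bending frequency of a rectangular cantilever (toy stand-in
    for the blade response): f = (1.875^2 / 2*pi) * sqrt(E*h^2 / (12*rho)) / L^2."""
    return (1.875 ** 2 / (2 * math.pi)) * math.sqrt(E * h * h / (12 * rho)) / L ** 2

def propagate(n=5000, seed=0):
    """Monte Carlo propagation of independent normal uncertainties in
    material and geometry; returns the mean and standard deviation of
    the response distribution."""
    rng = random.Random(seed)
    freqs = []
    for _ in range(n):
        E = rng.gauss(200e9, 10e9)   # Young's modulus, Pa (assumed spread)
        rho = rng.gauss(7800, 150)   # density, kg/m^3
        L = rng.gauss(0.10, 0.001)   # length, m
        h = rng.gauss(0.002, 5e-5)   # thickness, m
        freqs.append(beam_frequency(E, rho, L, h))
    return statistics.mean(freqs), statistics.stdev(freqs)
```

A histogram of the sampled responses is the probabilistic distribution the abstract describes: its spread quantifies the range of response magnitudes, and tail probabilities give the design risk.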

  6. Petrology, geochemistry and zircon U-Pb geochronology of a layered igneous complex from Akarui Point in the Lützow-Holm Complex, East Antarctica: Implications for Antarctica-Sri Lanka correlation

    NASA Astrophysics Data System (ADS)

    Kazami, Sou; Tsunogae, Toshiaki; Santosh, M.; Tsutsumi, Yukiyasu; Takamura, Yusuke

    2016-11-01

    The Lützow-Holm Complex (LHC) of East Antarctica forms part of a complex subduction-collision orogen related to the amalgamation of the Neoproterozoic supercontinent Gondwana. Here we report new petrological, geochemical, and geochronological data from a metamorphosed and disrupted layered igneous complex from Akarui Point in the LHC which provide new insights into the evolution of the complex. The complex is composed of mafic orthogneiss (edenite/pargasite + plagioclase ± clinopyroxene ± orthopyroxene ± spinel ± sapphirine ± K-feldspar), meta-ultramafic rock (pargasite + olivine + spinel + orthopyroxene), and felsic orthogneiss (plagioclase + quartz + pargasite + biotite ± garnet). The rocks show obvious compositional layering reflecting the chemical variation possibly through magmatic differentiation. The metamorphic conditions of the rocks were estimated using hornblende-plagioclase geothermometry which yielded temperatures of 720-840 °C. The geochemical data of the orthogneisses indicate fractional crystallization possibly related to differentiation within a magma chamber. Most of the mafic-ultramafic samples show enrichment of LILE, negative Nb, Ta, P and Ti anomalies, and constant HFSE contents in primitive-mantle normalized trace element plots suggesting volcanic arc affinity probably related to subduction. The enrichment of LREE and flat HREE patterns in chondrite-normalized REE plot, with the Nb-Zr-Y, Y-La-Nb, and Th/Yb-Nb/Yb plots also suggest volcanic arc affinity. The felsic orthogneiss plotted on Nb/Zr-Zr diagram (low Nb/Zr ratio) and spider diagrams (enrichment of LILE, negative Nb, Ta, P and Ti anomalies) also show magmatic arc origin. The morphology, internal structure, and high Th/U ratio of zircon grains in felsic orthogneiss are consistent with magmatic origin for most of these grains. Zircon U-Pb analyses suggest Early Neoproterozoic (847.4 ± 8.0 Ma) magmatism and protolith formation. 
Some older grains (1026-882 Ma) are regarded as xenocrysts from basement entrained in the magma through limited crustal reworking. The younger ages (807-667 Ma) might represent subsequent thermal events. The results of this study suggest that the ca. 850 Ma layered igneous complex in Akarui Point was derived from a magma chamber constructed through arc-related magmatism which included components from ca. 1.0 Ga felsic continental crustal basement. The geochemical characteristics and the timing of protolith emplacement from this complex are broadly identical to those of similar orthogneisses from Kasumi Rock and Tama Point in the LHC and the Kadugannawa Complex in Sri Lanka, which record Early Neoproterozoic (ca. 1.0 Ga) arc magmatism. Although the magmatic event in Akarui Point is slightly younger, the thermal event probably continued from ca. 1.0 Ga to ca. 850 Ma or even to ca. 670 Ma. We therefore correlate the Akarui Point igneous complex with those in the LHC and Kadugannawa Complex formed under similar Early Neoproterozoic arc magmatic events during the convergent margin processes prior to the assembly of the Gondwana supercontinent.

  7. Comparison of scalar measures used in magnetic resonance diffusion tensor imaging.

    PubMed

    Bahn, M M

    1999-07-01

    The tensors derived from diffusion tensor imaging describe complex diffusion in tissues. However, it is difficult to compare tensors directly or to produce images that contain all of the information of the tensor. Therefore, it is convenient to produce scalar measures that extract desired aspects of the tensor. These measures map the three-dimensional eigenvalues of the diffusion tensor into scalar values. The measures impose an order on eigenvalue space. Many invariant scalar measures have been introduced in the literature. In the present manuscript, a general approach for producing invariant scalar measures is introduced. Because it is often difficult to determine in clinical practice which of the many measures is best to apply to a given situation, two formalisms are introduced for the presentation, definition, and comparison of measures applied to eigenvalues: (1) normalized eigenvalue space, and (2) parametric eigenvalue transformation plots. All of the anisotropy information contained in the three eigenvalues can be retained and displayed in a two-dimensional plot, the normalized eigenvalue plot. An example is given of how to determine the best measure to use for a given situation by superimposing isometric contour lines from various anisotropy measures on plots of actual measured eigenvalue data points. Parametric eigenvalue transformation plots allow comparison of how different measures impose order on normalized eigenvalue space to determine whether the measures are equivalent and how the measures differ. These formalisms facilitate the comparison of scalar invariant measures for diffusion tensor imaging. Normalized eigenvalue space allows presentation of eigenvalue anisotropy information. Copyright 1999 Academic Press.
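    A minimal illustration of these ideas, assuming only standard definitions: eigenvalues are mapped into normalized eigenvalue space by dividing by the trace, and one familiar invariant scalar measure (fractional anisotropy) imposes a particular order on that space:

```python
import math

def normalized_eigenvalues(l1, l2, l3):
    """Map diffusion-tensor eigenvalues into normalized eigenvalue space
    (divide by the trace), discarding overall diffusion magnitude."""
    s = l1 + l2 + l3
    return (l1 / s, l2 / s, l3 / s)

def fractional_anisotropy(l1, l2, l3):
    """Fractional anisotropy: a common invariant scalar measure that imposes
    one particular order on eigenvalue space."""
    m = (l1 + l2 + l3) / 3.0
    num = (l1 - m) ** 2 + (l2 - m) ** 2 + (l3 - m) ** 2
    den = l1 ** 2 + l2 ** 2 + l3 ** 2
    return math.sqrt(1.5 * num / den)

iso = fractional_anisotropy(1.0, 1.0, 1.0)           # isotropic diffusion -> 0
fib = fractional_anisotropy(1.7e-3, 0.2e-3, 0.2e-3)  # fiber-like diffusion -> near 1
```

    Superimposing iso-contours of such a measure on normalized-eigenvalue plots of measured data is the comparison strategy the abstract describes.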

  8. Attributes associated with probability of infestation by the pinon ips, Ips confusus, (Coleoptera: Scolytidae) in pinon pine, Pinus edulis

    Treesearch

    Jose E. Negron; Jill L. Wilson

    2003-01-01

    We examined attributes of pinon pine (Pinus edulis) associated with the probability of infestation by pinon ips (Ips confusus) in an outbreak in the Coconino National Forest, Arizona. We used data collected from 87 plots, 59 infested and 28 uninfested, and a logistic regression approach to estimate the probability of infestation based on plot- and tree-level attributes....

  9. Committor of elementary reactions on multistate systems

    NASA Astrophysics Data System (ADS)

    Király, Péter; Kiss, Dóra Judit; Tóth, Gergely

    2018-04-01

    In our study, we extend the committor concept on multi-minima systems, where more than one reaction may proceed, but the feasible data evaluation needs the projection onto partial reactions. The elementary reaction committor and the corresponding probability density of the reactive trajectories are defined and calculated on a three-hole two-dimensional model system explored by single-particle Langevin dynamics. We propose a method to visualize more elementary reaction committor functions or probability densities of reactive trajectories on a single plot that helps to identify the most important reaction channels and the nonreactive domains simultaneously. We suggest a weighting for the energy-committor plots that correctly shows the limits of both the minimal energy path and the average energy concepts. The methods also performed well on the analysis of molecular dynamics trajectories of 2-chlorobutane, where an elementary reaction committor, the probability densities, the potential energy/committor, and the free-energy/committor curves are presented.

  10. Coal-seismic, desktop computer programs in BASIC; Part 5, Perform X-square T-square analysis and plot normal moveout lines on seismogram overlay

    USGS Publications Warehouse

    Hasbrouck, W.P.

    1983-01-01

    Processing of data taken with the U.S. Geological Survey's coal-seismic system is done with a desktop, stand-alone computer. Programs for this computer are written in the extended BASIC language used by the Tektronix 4051 Graphic System. This report presents computer programs to perform X-square/T-square analyses and to plot normal moveout lines on a seismogram overlay.
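    The X-square/T-square analysis rests on the hyperbolic moveout relation T² = T0² + X²/V²: plotting T² against X² gives a straight line whose slope is 1/V² and whose intercept is T0². A sketch of the same computation in Python (illustrative only, not the Tektronix BASIC code):

```python
import math

def x2t2_analysis(offsets_m, times_s):
    """Least-squares fit of T^2 against X^2.  For hyperbolic normal moveout,
    T^2 = T0^2 + X^2 / V^2, so the slope is 1/V^2 and the intercept T0^2."""
    xs = [x * x for x in offsets_m]
    ys = [t * t for t in times_s]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return math.sqrt(1.0 / slope), math.sqrt(intercept)  # (velocity, T0)

# Synthetic reflection times for V = 2000 m/s, T0 = 0.5 s
V, T0 = 2000.0, 0.5
offsets = [50.0 * i for i in range(1, 13)]
times = [math.sqrt(T0 ** 2 + x ** 2 / V ** 2) for x in offsets]
v_est, t0_est = x2t2_analysis(offsets, times)
```

    The recovered velocity and zero-offset time are what the USGS program would use to draw normal moveout lines on the seismogram overlay.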

  11. Attributes associated with probability of infestation by the pinon Ips, Ips confusus, (Coleoptera: Scolytidae) in pinon pine, Pinus edulis

    Treesearch

    Jose F. Negron; Jill L. Wilson

    2008-01-01

    (Please note, this is an abstract only) We examined attributes associated with the probability of infestation by pinon ips (Ips confusus), in pinon pine (Pinus edulis), in an outbreak in the Coconino National Forest, Arizona. We used data collected from 87 plots, 59 infested and 28 uninfested, and a logistic regression approach to estimate the probability of...

  12. Evaluation of an Ensemble Dispersion Calculation.

    NASA Astrophysics Data System (ADS)

    Draxler, Roland R.

    2003-02-01

    A Lagrangian transport and dispersion model was modified to generate multiple simulations from a single meteorological dataset. Each member of the simulation was computed by assuming a ±1-gridpoint shift in the horizontal direction and a ±250-m shift in the vertical direction of the particle position, with respect to the meteorological data. The configuration resulted in 27 ensemble members. Each member was assumed to have an equal probability. The model was tested by creating an ensemble of daily average air concentrations for 3 months at 75 measurement locations over the eastern half of the United States during the Across North America Tracer Experiment (ANATEX). Two generic graphical displays were developed to summarize the ensemble prediction and the resulting concentration probabilities for a specific event: a probability-exceed plot and a concentration-probability plot. Although a cumulative distribution of the ensemble probabilities compared favorably with the measurement data, the resulting distribution was not uniform. This result was attributed to release height sensitivity. The trajectory ensemble approach accounts for about 41%-47% of the variance in the measurement data. This residual uncertainty is caused by other model and data errors that are not included in the ensemble design.
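    The ensemble construction and the concentration-probability idea can be sketched as follows; the member concentrations are hypothetical numbers, not ANATEX data:

```python
from itertools import product

# 27 members: every combination of a -1/0/+1 gridpoint shift in each
# horizontal direction and a -250/0/+250 m shift in the vertical
shifts = list(product((-1, 0, 1), (-1, 0, 1), (-250.0, 0.0, 250.0)))

def exceedance_probability(member_concentrations, threshold):
    """Probability that a concentration threshold is exceeded, with every
    ensemble member treated as equally likely."""
    hits = sum(c > threshold for c in member_concentrations)
    return hits / len(member_concentrations)

# Hypothetical member predictions at one receptor (arbitrary units)
members = [0.0] * 12 + [0.4] * 9 + [1.2] * 6
p = exceedance_probability(members, 0.1)
```

    Evaluating this probability over a grid of thresholds (or of locations at a fixed threshold) yields the two graphical summaries the abstract describes.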

  13. CO2 and CH4 Fluxes across Polygon Geomorphic Types, Barrow, Alaska, 2006-2010

    DOE Data Explorer

    Tweedie, Craig; Lara, Mark

    2014-09-17

    Carbon flux data are reported as Net Ecosystem Exchange (NEE), Gross Ecosystem Exchange (GEE), Ecosystem Respiration (ER), and Methane (CH4) flux. Measurements were made at 82 plots across various polygon geomorphic classes at research sites on the Barrow Environmental Observatory (BEO), the Biocomplexity Experiment site on the BEO, and the International Biological Program (IBP) site a little west of the BEO. This product is a compilation of data from 27 plots as presented in Lara et al. (2012), data from six plots presented in Olivas et al. (2010), and data from 49 plots described in Lara et al. (2014). Measurements were made during the peak of the growing seasons from 2006 to 2010. At each of the measurement plots (except Olivas et al., 2010), four different thicknesses of shade cloth were used to generate CO2 light response curves. Light response curves were used to normalize the diurnally variable photosynthetically active radiation to a peak growing-season average of ~400 µmol m-2 s-1. At the Olivas et al. (2010) plots, diurnal patterns were characterized by repeated sampling. CO2 measurements were made using a closed-chamber photosynthesis system and CH4 measurements were made using a photo-acoustic multi-gas analyzer. In addition, plot-level measurements of thaw depth (TD), water table depth (WTD), leaf area index (LAI), and normalized difference vegetation index (NDVI) are summarized by geomorphic polygon type.

  14. Quantifying restoration effectiveness using multi-scale habitat models: implications for sage-grouse in the Great Basin

    USGS Publications Warehouse

    Arkle, Robert S.; Pilliod, David S.; Hanser, Steven E.; Brooks, Matthew L.; Chambers, Jeanne C.; Grace, James B.; Knutson, Kevin C.; Pyke, David A.; Welty, Justin L.

    2014-01-01

    A recurrent challenge in the conservation of wide-ranging, imperiled species is understanding which habitats to protect and whether we are capable of restoring degraded landscapes. For Greater Sage-grouse (Centrocercus urophasianus), a species of conservation concern in the western United States, we approached this problem by developing multi-scale empirical models of occupancy in 211 randomly located plots within a 40 million ha portion of the species' range. We then used these models to predict sage-grouse habitat quality at 826 plots associated with 101 post-wildfire seeding projects implemented from 1990 to 2003. We also compared conditions at restoration sites to published habitat guidelines. Sage-grouse occupancy was positively related to plot- and landscape-level dwarf sagebrush (Artemisia arbuscula, A. nova, A. tripartita) and big sagebrush steppe prevalence, and negatively associated with non-native plants and human development. The predicted probability of sage-grouse occupancy at treated plots was low on average (0.09) and not substantially different from burned areas that had not been treated. Restoration sites with quality habitat tended to occur at higher elevation locations with low annual temperatures, high spring precipitation, and high plant diversity. Of 313 plots seeded after fire, none met all sagebrush guidelines for breeding habitats, but approximately 50% met understory guidelines, particularly for perennial grasses. This pattern was similar for summer habitat. Less than 2% of treated plots met winter habitat guidelines. Restoration actions did not increase the probability of burned areas meeting most guideline criteria. The probability of meeting guidelines was influenced by a latitudinal gradient, climate, and topography. Our results suggest that sage-grouse are relatively unlikely to use many burned areas within 20 years of fire, regardless of treatment. 
Understory habitat conditions are more likely to be adequate than overstory conditions, but in most climates, establishing forbs and reducing cheatgrass dominance is unlikely. Reestablishing sagebrush cover will require more than 20 years using past restoration methods. Given current fire frequencies and restoration capabilities, protection of landscapes containing a mix of dwarf sagebrush and big sagebrush steppe, minimal human development, and low non-native plant cover may provide the best opportunity for conservation of sage-grouse habitats.

  15. Acute Brain Dysfunction: Development and Validation of a Daily Prediction Model.

    PubMed

    Marra, Annachiara; Pandharipande, Pratik P; Shotwell, Matthew S; Chandrasekhar, Rameela; Girard, Timothy D; Shintani, Ayumi K; Peelen, Linda M; Moons, Karl G M; Dittus, Robert S; Ely, E Wesley; Vasilevskis, Eduard E

    2018-03-24

    The goal of this study was to develop and validate a dynamic risk model to predict daily changes in acute brain dysfunction (ie, delirium and coma), discharge, and mortality in ICU patients. Using data from a multicenter prospective ICU cohort, a daily acute brain dysfunction-prediction model (ABD-pm) was developed by using multinomial logistic regression that estimated 15 transition probabilities (from one of three brain function states [normal, delirious, or comatose] to one of five possible outcomes [normal, delirious, comatose, ICU discharge, or died]) using baseline and daily risk factors. Model discrimination was assessed by using predictive characteristics such as negative predictive value (NPV). Calibration was assessed by plotting empirical vs model-estimated probabilities. Internal validation was performed by using a bootstrap procedure. Data were analyzed from 810 patients (6,711 daily transitions). The ABD-pm included individual risk factors: mental status, age, preexisting cognitive impairment, baseline and daily severity of illness, and daily administration of sedatives. The model yielded very high NPVs for "next day" delirium (NPV: 0.823), coma (NPV: 0.892), normal cognitive state (NPV: 0.875), ICU discharge (NPV: 0.905), and mortality (NPV: 0.981). The model demonstrated outstanding calibration when predicting the total number of patients expected to be in any given state across predicted risk. We developed and internally validated a dynamic risk model that predicts the daily risk for one of three cognitive states, ICU discharge, or mortality. The ABD-pm may be useful for predicting the proportion of patients for each outcome state across entire ICU populations to guide quality, safety, and care delivery activities. Copyright © 2018 American College of Chest Physicians. Published by Elsevier Inc. All rights reserved.
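    The NPV figures quoted above follow directly from prediction counts. A minimal sketch, with hypothetical counts chosen only to reproduce the reported next-day-delirium NPV:

```python
def negative_predictive_value(tn, fn):
    """NPV = TN / (TN + FN): among patient-days the model predicts will NOT
    end in a given state, the fraction that indeed do not."""
    return tn / (tn + fn)

# Hypothetical counts chosen to reproduce the reported next-day delirium NPV
tn, fn = 4115, 885  # predicted "no delirium": 4115 correct, 885 missed
npv = negative_predictive_value(tn, fn)
```

    A high NPV means a "low risk" prediction from the ABD-pm can be trusted far more than a "high risk" one, which is why the abstract reports NPVs rather than positive predictive values.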

  16. Detecting early functional damage in glaucoma suspect and ocular hypertensive patients with the multifocal VEP technique.

    PubMed

    Thienprasiddhi, Phamornsak; Greenstein, Vivienne C; Chu, David H; Xu, Li; Liebmann, Jeffrey M; Ritch, Robert; Hood, Donald C

    2006-08-01

    To determine whether the multifocal visual evoked potential (mfVEP) technique can detect early functional damage in ocular hypertensive (OHT) and glaucoma suspect (GS) patients with normal standard achromatic automated perimetry (SAP) results. Twenty-five GS patients (25 eyes), 25 patients with OHT (25 eyes), and 50 normal controls (50 eyes) were enrolled in this study. All GS, OHT and normal control eyes had normal SAP as defined by a pattern standard deviation and mean deviation within the 95% confidence interval and a glaucoma hemifield test within normal limits on the Humphrey visual field 24-2 program. Eyes with GS had optic disc changes consistent with glaucoma with or without raised intraocular pressure (IOP), and eyes with OHT showed no evidence of glaucomatous optic neuropathy and IOPs ≥22 mm Hg. Monocular mfVEPs were obtained from both eyes of each subject using a pattern-reversal dartboard array with 60 sectors. The entire display had a radius of 22.3 degrees. The mfVEPs, for each eye, were defined as abnormal when either the monocular or interocular probability plot had a cluster of 3 or more contiguous points with P<0.05 and at least 2 of these points with P<0.01. The mfVEP results were abnormal in 4% of the eyes from normal subjects. Abnormal mfVEPs were detected in 20% of the eyes of GS patients and 16% of the eyes of OHT patients. Significantly more mfVEP abnormalities were detected in GS patients than in normal controls. However, there was no significant difference in mfVEP results between OHT patients and normal controls. The mfVEP technique can detect visual field deficits in a minority of eyes with glaucomatous optic disks and normal SAP results.
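    The abnormality criterion can be sketched in code; note that the actual criterion uses 2-D adjacency on the 60-sector dartboard probability plot, while this illustration simplifies contiguity to a 1-D strip of sectors:

```python
def abnormal_cluster(p_values):
    """Simplified mfVEP cluster criterion on a 1-D strip of the probability
    plot: some run of >= 3 contiguous points all with p < 0.05, at least 2
    of which have p < 0.01.  (The real criterion uses 2-D adjacency.)"""
    n = len(p_values)
    for i in range(n - 2):
        for j in range(i + 3, n + 1):
            window = p_values[i:j]
            if all(p < 0.05 for p in window) and sum(p < 0.01 for p in window) >= 2:
                return True
    return False

flagged = abnormal_cluster([0.20, 0.04, 0.008, 0.009, 0.30])  # qualifying run of 3
clean = abnormal_cluster([0.04, 0.04, 0.04, 0.20])            # no point below 0.01
```

    Requiring a contiguous cluster, rather than any three significant points, guards against isolated noisy sectors producing a false positive.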

  17. Development of a prognostic nomogram for cirrhotic patients with upper gastrointestinal bleeding.

    PubMed

    Zhou, Yu-Jie; Zheng, Ji-Na; Zhou, Yi-Fan; Han, Yi-Jing; Zou, Tian-Tian; Liu, Wen-Yue; Braddock, Martin; Shi, Ke-Qing; Wang, Xiao-Dong; Zheng, Ming-Hua

    2017-10-01

    Upper gastrointestinal bleeding (UGIB) is a complication with a high mortality rate in critically ill patients presenting with cirrhosis. Today, there exist few accurate scoring models specifically designed for mortality risk assessment in critically ill cirrhotic patients with upper gastrointestinal bleeding (CICGIB). Our aim was to develop and evaluate a novel nomogram-based model specific for CICGIB. Overall, 540 consecutive CICGIB patients were enrolled. On the basis of Cox regression analyses, the nomogram was constructed to estimate the probability of 30-day, 90-day, 270-day, and 1-year survival. An upper gastrointestinal bleeding-chronic liver failure-sequential organ failure assessment (UGIB-CLIF-SOFA) score was derived from the nomogram. Performance assessment and internal validation of the model were performed using Harrell's concordance index (C-index), calibration plot, and bootstrap sample procedures. UGIB-CLIF-SOFA was also compared with other prognostic models, such as CLIF-SOFA and model for end-stage liver disease, using C-indices. Eight independent factors derived from Cox analysis (including bilirubin, creatinine, international normalized ratio, sodium, albumin, mean artery pressure, vasopressin use, and hematocrit decrease >10%) were assembled into the nomogram and the UGIB-CLIF-SOFA score. The calibration plots showed optimal agreement between nomogram prediction and actual observation. The C-index of the nomogram using bootstrap (0.729; 95% confidence interval: 0.689-0.766) was higher than that of the other models for predicting survival of CICGIB. We have developed and internally validated a novel nomogram and an easy-to-use scoring system that accurately predicts the mortality probability of CICGIB on the basis of eight easy-to-obtain parameters. External validation is now warranted in future clinical studies.

  18. Prescribed burning and wildfire risk in the 1998 fire season in Florida

    Treesearch

    John M. Pye; Jeffrey P. Prestemon; David T. Butry; Karen L. Abt

    2003-01-01

    Measures of understory burning activity in and around FIA plots in northeastern Florida were not significantly associated with reduced burning probability in the extreme fire season of 1998. In this unusual year, burn probability was greatest on ordinarily wetter sites, especially baldcypress stands, and positively associated with understory vegetation. Moderate...

  19. PETRO.CALC.PLOT, Microsoft Excel macros to aid petrologic interpretation

    USGS Publications Warehouse

    Sidder, G.B.

    1994-01-01

    PETRO.CALC.PLOT is a package of macros which normalizes whole-rock oxide data to 100%, calculates the cation percentages and molecular proportions used for normative mineral calculations, computes the apices for ternary diagrams, determines sums and ratios of specific elements of petrologic interest, and plots 33 X-Y graphs and five ternary diagrams. PETRO.CALC.PLOT also may be used to create other diagrams as desired by the user. The macros run in Microsoft Excel 3.0 and 4.0 for Macintosh computers and in Microsoft Excel 3.0 and 4.0 for Windows. Macros provided in PETRO.CALC.PLOT minimize repetition and time required to recalculate and plot whole-rock oxide data for petrologic analysis. © 1994.
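    The first two steps such a package performs, normalizing oxides to 100 wt% and locating points on a ternary diagram, can be sketched as follows; this is an illustrative Python reimplementation under standard conventions, not the Excel macro code, and the sample analysis is hypothetical:

```python
import math

def normalize_to_100(oxides):
    """Rescale a whole-rock oxide analysis so the oxides sum to 100 wt%."""
    total = sum(oxides.values())
    return {k: 100.0 * v / total for k, v in oxides.items()}

def ternary_xy(a, b, c):
    """Cartesian plotting coordinates for components (a, b, c) on an
    equilateral ternary diagram with apices A=(0.5, sqrt(3)/2), B=(0, 0),
    C=(1, 0)."""
    s = a + b + c
    return (0.5 * a / s + c / s, math.sqrt(3) / 2.0 * a / s)

sample = {"SiO2": 48.2, "Al2O3": 15.1, "FeO": 9.8, "MgO": 8.4, "CaO": 10.9}
norm = normalize_to_100(sample)   # rescaled so the oxides sum to exactly 100
x, y = ternary_xy(1.0, 1.0, 1.0)  # equal parts plot at the triangle's centroid
```

    Feeding three oxide sums (e.g. the A, F, and M components of an AFM diagram) through `ternary_xy` gives the apices-relative coordinates an X-Y plotter needs.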

  20. User manual for two simple postscript output FORTRAN plotting routines

    NASA Technical Reports Server (NTRS)

    Nguyen, T. X.

    1991-01-01

    Graphics is one of the important tools in engineering analysis and design. However, plotting routines that generate output on high-quality laser printers normally come in graphics packages, which tend to be expensive and system dependent. These factors become important for small computer systems or desktop computers, especially when only some form of simple plotting routine is sufficient. With the PostScript language becoming popular, more and more PostScript laser printers are now available. Simple, versatile, low-cost plotting routines that can generate output on high-quality laser printers are needed, and standard FORTRAN plotting routines that produce output in the PostScript language are a logical choice. The purpose here is to explain two simple FORTRAN plotting routines that generate output in the PostScript language.

  1. The Use of Crow-AMSAA Plots to Assess Mishap Trends

    NASA Technical Reports Server (NTRS)

    Dawson, Jeffrey W.

    2011-01-01

    Crow-AMSAA (CA) plots are used to model reliability growth. Use of CA plots has expanded into other areas, such as tracking events of interest to management, maintenance problems, and safety mishaps. Safety mishaps can often be successfully modeled using a Poisson probability distribution. CA plots show a Poisson process in log-log space. If the safety mishaps are a stable homogenous Poisson process, a linear fit to the points in a CA plot will have a slope of one. Slopes of greater than one indicate a nonhomogenous Poisson process, with increasing occurrence. Slopes of less than one indicate a nonhomogenous Poisson process, with decreasing occurrence. Changes in slope, known as "cusps," indicate a change in process, which could be an improvement or a degradation. After presenting the CA conceptual framework, examples are given of trending slips, trips and falls, and ergonomic incidents at NASA (from Agency-level data). Crow-AMSAA plotting is a robust tool for trending safety mishaps that can provide insight into safety performance over time.
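    The slope diagnostic can be sketched by fitting log(cumulative events) against log(cumulative time); the synthetic event streams below are constructed so the expected slopes (1 for a constant mishap rate, 2 for increasing occurrence) are exact, which is an idealization of real, noisy mishap data:

```python
import math

def crow_amsaa_slope(event_times):
    """Slope of log(cumulative event count) vs log(cumulative time): the
    Crow-AMSAA beta.  beta ~ 1 for a stable homogeneous Poisson process;
    beta > 1 means occurrence is increasing, beta < 1 decreasing."""
    xs = [math.log(t) for t in event_times]
    ys = [math.log(n) for n in range(1, len(event_times) + 1)]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# Constant mishap rate (2 events per unit time): N(t) = 2t, so beta = 1
steady = crow_amsaa_slope([0.5 * n for n in range(1, 201)])
# Accelerating mishaps: N(t) = t^2, so beta = 2 (a degrading process)
growing = crow_amsaa_slope([math.sqrt(n) for n in range(1, 201)])
```

    A cusp would show up as two segments with clearly different fitted slopes, which is why segment-wise fits, rather than one global fit, are used to detect process changes.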

  2. Joint Task Force Two, Test 4.1; B 52 Aircraft Data Book

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Department 9210

    1968-10-01

    This volume contains plots of the aircraft position track in the target area. There are also plots of the aircraft altitude above the terrain, normal accelerations, roll angle, pitch angle, and slant range from the navigation check points and the targets.

  3. Spatial probability of soil water repellency in an abandoned agricultural field in Lithuania

    NASA Astrophysics Data System (ADS)

    Pereira, Paulo; Misiūnė, Ieva

    2015-04-01

    Water repellency is a natural soil property with implications for infiltration, erosion and plant growth. It depends on soil texture, type and amount of organic matter, fungi, microorganisms, and vegetation cover (Doerr et al., 2000). Human activities such as agriculture can affect soil water repellency (SWR) through tillage and the addition of organic compounds and fertilizers (Blanco-Canqui and Lal, 2009; Gonzalez-Penaloza et al., 2012). It is also assumed that SWR has high small-scale variability (Doerr et al., 2000). The aim of this work is to study the spatial probability of SWR in an abandoned field by testing several geostatistical methods: Ordinary Kriging (OK), Simple Kriging (SK), Indicator Kriging (IK), Probability Kriging (PK) and Disjunctive Kriging (DK). The study area is located near the Vilnius urban area (54 49' N, 25 22', 104 masl) in Lithuania (Pereira and Oliva, 2013). An experimental plot of 21 m2 (7 × 3 m) was laid out, and within it SWR was measured every 50 cm using the water drop penetration time (WDPT) test (Wessel, 1988). A total of 105 points were measured. The probability of SWR was classified from 0 (no probability) to 1 (high probability). The accuracy of the methods was assessed with cross-validation; the best interpolation method was the one with the lowest Root Mean Square Error (RMSE). The results showed that the most accurate probability method was SK (RMSE=0.436), followed by DK (RMSE=0.437), IK (RMSE=0.448), PK (RMSE=0.452) and OK (RMSE=0.537). Significant differences were identified among the probability tests (Kruskal-Wallis test = 199.7597, p<0.001). On average, the probability of SWR was highest with OK (0.58±0.08), followed by PK (0.49±0.18), SK (0.32±0.16), DK (0.32±0.15) and IK (0.31±0.16). The most accurate probability methods predicted a lower probability of SWR in the studied plot. The spatial distribution of SWR differed according to the tested technique. 
Simple Kriging, DK, IK and PK identified high SWR probabilities in the northeast and central parts of the plot, while OK placed them mainly in the south-western part. In conclusion, before predicting the spatial probability of SWR, it is important to test several methods in order to identify the most accurate. Acknowledgments: COST action ES1306 (Connecting European connectivity research). References: Blanco-Canqui, H., Lal, R. (2009) Extent of water repellency under long-term no-till soils. Geoderma, 149, 171-180. Doerr, S.H., Shakesby, R.A., Walsh, R.P.D. (2000) Soil water repellency: Its causes, characteristics and hydro-geomorphological significance. Earth-Science Reviews, 51, 33-65. Gonzalez-Penaloza, F.A., Cerda, A., Zavala, L.M., Jordan, A., Gimenez-Morera, A., Arcenegui, V. (2012) Do conservative agriculture practices increase soil water repellency? A case study in citrus-cropped soils. Soil and Tillage Research, 124, 233-239. Pereira, P., Oliva, M. (2013) Modelling soil water repellency in an abandoned agricultural field. Visnyk Geology, 4, 77-80. Wessel, A.T. (1988) On using the effective contact angle and the water drop penetration time for classification of water repellency in dune soils. Earth Surface Processes and Landforms, 13, 555-265.

  4. Analytical and experimental investigation of a 1/8-scale dynamic model of the shuttle orbiter. Volume 3B: Supporting data

    NASA Technical Reports Server (NTRS)

    Mason, P. W.; Harris, H. G.; Zalesak, J.; Bernstein, M.

    1974-01-01

    The NASA Structural Analysis System (NASTRAN) Model 1 finite element idealization, input data, and detailed analytical results are presented. The data presented include: substructuring analysis for normal modes, plots of member data, plots of symmetric free-free modes, plots of antisymmetric free-free modes, analysis of the wing, analysis of the cargo doors, analysis of the payload, and analysis of the orbiter.

  5. Decision curve analysis and external validation of the postoperative Karakiewicz nomogram for renal cell carcinoma based on a large single-center study cohort.

    PubMed

    Zastrow, Stefan; Brookman-May, Sabine; Cong, Thi Anh Phuong; Jurk, Stanislaw; von Bar, Immanuel; Novotny, Vladimir; Wirth, Manfred

    2015-03-01

    To predict outcome of patients with renal cell carcinoma (RCC) who undergo surgical therapy, risk models and nomograms are valuable tools. External validation on independent datasets is crucial for evaluating accuracy and generalizability of these models. The objective of the present study was to externally validate the postoperative nomogram developed by Karakiewicz et al. for prediction of cancer-specific survival. A total of 1,480 consecutive patients with a median follow-up of 82 months (IQR 46-128) were included into this analysis with 268 RCC-specific deaths. Nomogram-estimated survival probabilities were compared with survival probabilities of the actual cohort, and concordance indices were calculated. Calibration plots and decision curve analyses were used for evaluating calibration and clinical net benefit of the nomogram. Concordance between predictions of the nomogram and survival rates of the cohort was 0.911 after 12, 0.909 after 24 months and 0.896 after 60 months. Comparison of predicted probabilities and actual survival estimates with calibration plots showed an overestimation of tumor-specific survival based on nomogram predictions of high-risk patients, although calibration plots showed a reasonable calibration for probability ranges of interest. Decision curve analysis showed a positive net benefit of nomogram predictions for our patient cohort. The postoperative Karakiewicz nomogram provides a good concordance in this external cohort and is reasonably calibrated. It may overestimate tumor-specific survival in high-risk patients, which should be kept in mind when counseling patients. A positive net benefit of nomogram predictions was proven.
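    Harrell's concordance index used above can be sketched for a toy cohort; the follow-up times, event indicators, and risk scores below are hypothetical:

```python
def concordance_index(times, events, risks):
    """Harrell's C: over comparable pairs (the earlier follow-up time is an
    observed event), the fraction in which the patient who died earlier had
    the higher predicted risk (ties in risk count as half)."""
    concordant = ties = comparable = 0
    for i in range(len(times)):
        for j in range(len(times)):
            if events[i] and times[i] < times[j]:
                comparable += 1
                if risks[i] > risks[j]:
                    concordant += 1
                elif risks[i] == risks[j]:
                    ties += 1
    return (concordant + 0.5 * ties) / comparable

# Toy cohort: follow-up in months, 1 = died (0 = censored), hypothetical risks
times = [3, 10, 14, 30, 82]
events = [1, 1, 0, 1, 0]
risks = [0.9, 0.3, 0.4, 0.5, 0.1]
c = concordance_index(times, events, risks)
```

    A C-index of 0.5 is chance-level discrimination and 1.0 is perfect ordering, which puts the nomogram's reported 0.896-0.911 in context.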

  6. Optimized endogenous post-stratification in forest inventories

    Treesearch

    Paul L. Patterson

    2012-01-01

    An example of endogenous post-stratification is the use of remote sensing data with a sample of ground data to build a logistic regression model to predict the probability that a plot is forested and using the predicted probabilities to form categories for post-stratification. An optimized endogenous post-stratified estimator of the proportion of forest has been...

  7. Development of Fourth-Grade Students' Understanding of Experimental and Theoretical Probability

    ERIC Educational Resources Information Center

    English, Lyn; Watson, Jane

    2014-01-01

    Students explored variation and expectation in a probability activity at the end of the first year of a 3-year longitudinal study across grades 4-6. The activity involved experiments in tossing coins both manually and with simulation using the graphing software, "TinkerPlots." Initial responses indicated that the students were aware of…

  8. Multifocal visual evoked potential in optic neuritis, ischemic optic neuropathy and compressive optic neuropathy

    PubMed Central

    Jayaraman, Manju; Gandhi, Rashmin Anilkumar; Ravi, Priya; Sen, Parveen

    2014-01-01

    Purpose: To investigate the effect of optic neuritis (ON), ischemic optic neuropathy (ION) and compressive optic neuropathy (CON) on multifocal visual evoked potential (mfVEP) amplitudes and latencies, and to compare the parameters among the three optic nerve disorders. Materials and Methods: mfVEP was recorded for 71 eyes of controls and 48 eyes with optic nerve disorders, with subgroups of optic neuritis (ON, n = 21 eyes), ischemic optic neuropathy (ION, n = 14 eyes), and compressive optic neuropathy (CON, n = 13 eyes). The size of the defect in mfVEP amplitude probability plots and relative latency plots was analyzed. The pattern of the defect in the amplitude probability plot was classified according to the visual field profile of the optic neuritis treatment trial (ONTT). Results: Median mfVEP amplitude (log SNR) averaged across 60 sectors was reduced in ON (0.17 (0.13-0.33)), ION (0.14 (0.12-0.21)) and CON (0.21 (0.14-0.30)) when compared to controls. The median mfVEP relative latencies compared to controls were significantly prolonged in the ON and CON groups, 10.53 (2.62-15.50) ms and 5.73 (2.67-14.14) ms respectively, compared to the ION group (2.06 (-4.09-13.02) ms). The common mfVEP amplitude defects observed in probability plots were a diffuse pattern in ON, an inferior altitudinal defect in ION and temporal hemianopia in CON eyes. Conclusions: Optic nerve disorders cause reduction in mfVEP amplitudes. The extent of latency delay noted in ischemic optic neuropathy was significantly smaller than in subjects with optic neuritis and compressive optic neuropathy. mfVEP amplitudes can be used to objectively assess the topography of the visual field defect. PMID:24088641

  9. Tree cover at fine and coarse spatial grains interacts with shade tolerance to shape plant species distributions across the Alps

    PubMed Central

    Nieto-Lugilde, Diego; Lenoir, Jonathan; Abdulhak, Sylvain; Aeschimann, David; Dullinger, Stefan; Gégout, Jean-Claude; Guisan, Antoine; Pauli, Harald; Renaud, Julien; Theurillat, Jean-Paul; Thuiller, Wilfried; Van Es, Jérémie; Vittoz, Pascal; Willner, Wolfgang; Wohlgemuth, Thomas; Zimmermann, Niklaus E.; Svenning, Jens-Christian

    2015-01-01

    The role of competition for light among plants has long been recognised at local scales, but its importance for plant species distributions at larger spatial scales has generally been ignored. Tree cover modifies the local abiotic conditions below the canopy, notably by reducing light availability, and thus, also the performance of species that are not adapted to low-light conditions. However, this local effect may propagate to coarser spatial grains, by affecting colonisation probabilities and local extinction risks of herbs and shrubs. To assess the effect of tree cover at both the plot- and landscape-grain sizes (approximately 10-m and 1-km), we fit Generalised Linear Models (GLMs) for the plot-level distributions of 960 species of herbs and shrubs using 6,935 vegetation plots across the European Alps. We ran four models with different combinations of variables (climate, soil and tree cover) at both spatial grains for each species. We used partial regressions to evaluate the independent effects of plot- and landscape-grain tree cover on plot-level plant communities. Finally, the effects on species-specific elevational range limits were assessed by simulating a removal experiment comparing the species distributions under high and low tree cover. Accounting for tree cover improved the model performance, with the probability of the presence of shade-tolerant species increasing with increasing tree cover, whereas shade-intolerant species showed the opposite pattern. The tree cover effect occurred consistently at both the plot and landscape spatial grains, albeit most strongly at the former. Importantly, tree cover at the two grain sizes had partially independent effects on plot-level plant communities. With high tree cover, shade-intolerant species exhibited narrower elevational ranges than with low tree cover whereas shade-tolerant species showed wider elevational ranges at both limits. 
These findings suggest that forecasts of climate-related range shifts for herb and shrub species may be modified by tree cover dynamics. PMID:26290621

  10. A method for developing design diagrams for ceramic and glass materials using fatigue data

    NASA Technical Reports Server (NTRS)

    Heslin, T. M.; Magida, M. B.; Forrest, K. A.

    1986-01-01

    The service lifetime of glass and ceramic materials can be expressed as a plot of time-to-failure versus applied stress that is parametric in percent probability of failure. This type of plot is called a design diagram. Confidence interval estimates for such plots depend on the type of test used to generate the data, on assumptions made concerning the statistical distribution of the test results, and on the type of analysis used. This report outlines the development of design diagrams for glass and ceramic materials in engineering terms using static or dynamic fatigue tests, assuming either no particular statistical distribution of test results or a Weibull distribution, and using either median value or homologous ratio analysis of the test results.

  11. Objective Assessment of Image Quality VI: Imaging in Radiation Therapy

    PubMed Central

    Barrett, Harrison H.; Kupinski, Matthew A.; Müeller, Stefan; Halpern, Howard J.; Morris, John C.; Dwyer, Roisin

    2015-01-01

    Earlier work on Objective Assessment of Image Quality (OAIQ) focused largely on estimation or classification tasks in which the desired outcome of imaging is accurate diagnosis. This paper develops a general framework for assessing imaging quality on the basis of therapeutic outcomes rather than diagnostic performance. By analogy to Receiver Operating Characteristic (ROC) curves and their variants as used in diagnostic OAIQ, the method proposed here utilizes the Therapy Operating Characteristic or TOC curves, which are plots of the probability of tumor control vs. the probability of normal-tissue complications as the overall dose level of a radiotherapy treatment is varied. The proposed figure of merit is the area under the TOC curve, denoted AUTOC. This paper reviews an earlier exposition of the theory of TOC and AUTOC, which was specific to the assessment of image-segmentation algorithms, and extends it to other applications of imaging in external-beam radiation treatment as well as in treatment with internal radioactive sources. For each application, a methodology for computing the TOC is presented. A key difference between ROC and TOC is that the latter can be defined for a single patient rather than a population of patients. PMID:24200954

  12. Ditching Investigation of a 1/11-Scale Model of the Chance Vought F7U-3 Airplane, TED NO. NACA DE 360

    NASA Technical Reports Server (NTRS)

    Fisher, Lloyd J.; Windham, John O.

    1955-01-01

    An investigation was made of a 1/11-scale dynamically similar model of the Chance Vought F7U-3 airplane to study its behavior when ditched. The model was landed in calm water at the Langley tank no. 2 monorail. Various landing attitudes, speeds, and configurations were investigated. The behavior of the model was determined from visual observations, acceleration records, and motion-picture records of the ditchings. Data are presented in tabular form, sequence photographs, time-history acceleration curves, and plots of attitude change against time after contact. From the results of the investigation, it was concluded that the airplane should be ditched at the lowest speed and highest attitude consistent with adequate control. The aft part of the fuselage and the main landing-gear doors will probably be damaged. In a calm-water ditching under these conditions the airplane will probably skip slightly and then porpoise for the remainder of the run. Maximum longitudinal decelerations will be about 3 1/2g and maximum normal accelerations will be about 7g in a landing run of about 500 feet.

  13. Analysis on Flexural Strength of A36 Mild Steel by Design of Experiment (DOE)

    NASA Astrophysics Data System (ADS)

    Nurulhuda, A.; Hafizzal, Y.; Izzuddin, MZM; Sulawati, MRN; Rafidah, A.; Suhaila, Y.; Fauziah, AR

    2017-08-01

    Demand for high-quality, reliable components and materials is increasing, so flexural tests have become a vital test method in research, manufacturing and development for describing in detail a material's ability to withstand deformation under load. There are few recent studies on the effect of thickness, welding type and joint design on flexural behaviour using a DOE approach; this research therefore reports the flexural strength of mild steel, which is not well documented. Using Design of Experiments (DOE), a full factorial design with two replications was used to study the effects of the important parameters: welding type, thickness and joint design. The output response is the flexural strength value. Randomized experiments were conducted based on a table generated via Minitab software. A normality test was carried out using the Anderson-Darling test, giving a P-value < 0.005; the data are therefore not normal, since there is a significant difference between the actual data and the ideal data. In the ANOVA, only the joint design factor is significant, with a P-value less than 0.05. From the main effects plot and interaction plot, the recommended settings for the parameters are high level for welding type, high level for thickness and low level for joint design. A prediction model was developed through regression in order to measure the effect on the output response of any change in parameter settings. In the future, the experiments could be extended using Taguchi methods to verify the results.
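
    The Anderson-Darling normality check mentioned above can be reproduced with SciPy; note that scipy.stats.anderson reports critical values rather than a Minitab-style P-value (the data here are simulated stand-ins for flexural strengths):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
flexural = rng.normal(loc=250.0, scale=15.0, size=32)   # simulated strengths (MPa)

result = stats.anderson(flexural, dist='norm')
print(f"A^2 statistic = {result.statistic:.3f}")
for crit, sig in zip(result.critical_values, result.significance_level):
    print(f"  reject normality at {sig:4.1f}% significance if A^2 > {crit:.3f}")
```

    Normality is rejected at a given significance level when the A^2 statistic exceeds the corresponding critical value.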

  14. Pretest probability of a normal echocardiography: validation of a simple and practical algorithm for routine use.

    PubMed

    Hammoudi, Nadjib; Duprey, Matthieu; Régnier, Philippe; Achkar, Marc; Boubrit, Lila; Preud'homme, Gisèle; Healy-Brucker, Aude; Vignalou, Jean-Baptiste; Pousset, Françoise; Komajda, Michel; Isnard, Richard

    2014-02-01

    Management of increased referrals for transthoracic echocardiography (TTE) examinations is a challenge. Patients with normal TTE examinations take less time to examine than those with heart abnormalities. A reliable method for assessing the pretest probability of a normal TTE may optimize management of requests. The aim was to establish and validate, based on requests for examinations, a simple algorithm for defining the pretest probability of a normal TTE. In a retrospective phase, factors associated with normality were investigated and an algorithm was designed. In a prospective phase, patients were classified in accordance with the algorithm as being at high or low probability of having a normal TTE. In the retrospective phase, 42% of 618 examinations were normal. In multivariable analysis, age and absence of cardiac history were associated with normality. Low pretest probability of a normal TTE was defined by known cardiac history or, in case of doubt about cardiac history, by age > 70 years. In the prospective phase, the prevalences of normality were 72% and 25% in the high (n=167) and low (n=241) pretest probability of normality groups, respectively. The mean duration of normal examinations was significantly shorter than that of abnormal examinations (13.8 ± 9.2 min vs 17.6 ± 11.1 min; P=0.0003). A simple algorithm can classify patients referred for TTE as being at high or low pretest probability of having a normal examination. This algorithm might help to optimize management of requests in routine practice. Copyright © 2014 Elsevier Masson SAS. All rights reserved.
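
    The decision rule described above is simple enough to express directly (a sketch following the abstract's wording; not a clinical tool):

```python
def pretest_normal_tte(known_cardiac_history, age):
    """Classify a TTE request as 'high' or 'low' pretest probability of a
    normal examination. known_cardiac_history: True, False, or None when
    the request leaves doubt about the history."""
    if known_cardiac_history is True:
        return "low"                       # known cardiac history
    if known_cardiac_history is None:      # doubt about cardiac history
        return "low" if age > 70 else "high"
    return "high"                          # no cardiac history

print(pretest_normal_tte(True, 50))        # low
print(pretest_normal_tte(None, 75))        # low
print(pretest_normal_tte(False, 80))       # high
```

    Requests classified "high" could then be scheduled into shorter examination slots, which is the management gain the study targets.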

  15. Unmanned aircraft system-derived crop height and normalized difference vegetation index metrics for sorghum yield and aphid stress assessment

    NASA Astrophysics Data System (ADS)

    Stanton, Carly; Starek, Michael J.; Elliott, Norman; Brewer, Michael; Maeda, Murilo M.; Chu, Tianxing

    2017-04-01

    A small, fixed-wing unmanned aircraft system (UAS) was used to survey a replicated small plot field experiment designed to estimate sorghum damage caused by an invasive aphid. Plant stress varied among 40 plots through manipulation of aphid densities. Equipped with a consumer-grade near-infrared camera, the UAS was flown on a recurring basis over the growing season. The raw imagery was processed using structure-from-motion to generate normalized difference vegetation index (NDVI) maps of the fields and three-dimensional point clouds. NDVI and plant height metrics were averaged on a per plot basis and evaluated for their ability to identify aphid-induced plant stress. Experimental soil signal filtering was performed on both metrics, and a method filtering low near-infrared values before NDVI calculation was found to be the most effective. UAS NDVI was compared with NDVI from sensors onboard a manned aircraft and a tractor. The correlation results showed dependence on the growth stage. Plot averages of NDVI and canopy height values were compared with per-plot yield at 14% moisture and aphid density. The UAS measures of plant height and NDVI were correlated to plot averages of yield and insect density. Negative correlations between aphid density and NDVI were seen near the end of the season in the most damaged crops.
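
    The soil-filtering step, masking low near-infrared pixels before forming NDVI, can be sketched as follows (the reflectance values and the NIR floor are assumptions for illustration, not the study's calibration):

```python
import numpy as np

def ndvi_soil_filtered(nir, red, nir_floor=0.2):
    """NDVI = (NIR - red) / (NIR + red), with pixels whose NIR reflectance
    falls below `nir_floor` masked as soil before the ratio is formed."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    out = np.full(nir.shape, np.nan)
    veg = nir >= nir_floor                 # keep only plausibly vegetated pixels
    out[veg] = (nir[veg] - red[veg]) / (nir[veg] + red[veg])
    return out

nir = np.array([0.45, 0.50, 0.10, 0.60])   # third pixel looks like bare soil
red = np.array([0.08, 0.10, 0.09, 0.07])
vals = ndvi_soil_filtered(nir, red)
print(vals)                                # soil pixel comes out as nan
```

    Masked pixels are then simply excluded from the per-plot NDVI averages.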

  16. Bivariate normal, conditional and rectangular probabilities: A computer program with applications

    NASA Technical Reports Server (NTRS)

    Swaroop, R.; Brownlow, J. D.; Ashworth, G. R.; Winter, W. R.

    1980-01-01

    Some results for bivariate normal distribution analysis are presented. Computer programs for conditional normal probabilities and marginal probabilities, as well as joint probabilities for rectangular regions, are given; routines for computing fractile points and distribution functions are also presented. Some examples from a closed circuit television experiment are included.
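
    A modern equivalent of the rectangular-region routine can be written with SciPy's bivariate normal CDF and inclusion-exclusion (the correlation and limits are illustrative):

```python
from scipy.stats import multivariate_normal

# Standard bivariate normal with correlation rho (values are illustrative)
rho = 0.5
mvn = multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, rho], [rho, 1.0]])

def rect_prob(a1, b1, a2, b2):
    """P(a1 < X <= b1, a2 < Y <= b2) by inclusion-exclusion on the joint CDF."""
    return (mvn.cdf([b1, b2]) - mvn.cdf([a1, b2])
            - mvn.cdf([b1, a2]) + mvn.cdf([a1, a2]))

p = rect_prob(-1.0, 1.0, -1.0, 1.0)
print(round(p, 4))
```

    Conditional and marginal probabilities follow from the same object, since each marginal is a univariate normal and the conditional mean and variance have closed forms.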

  17. Dose and detectability for a cone-beam C-arm CT system revisited

    PubMed Central

    Ganguly, Arundhuti; Yoon, Sungwon; Fahrig, Rebecca

    2010-01-01

    Purpose: The authors had previously published measurements of the detectability of disk-shaped contrast objects in images obtained from a C-arm CT system. A simple approach based on Rose’s criterion was used to scale the data, assuming the threshold for the smallest diameter detected should be inversely proportional to (dose)^(1/2). A more detailed analysis based on recent theoretical modeling of C-arm CT images is presented in this work. Methods: The signal and noise propagations in a C-arm based CT system have been formulated by other authors using cascaded systems analysis. They established a relationship between detectability and the noise equivalent quanta. Based on this model, the authors obtained a relation between x-ray dose and the diameter of the smallest disks detected. A closed form solution was established by assuming no rebinning and no resampling of data, with low additive noise and using a ramp filter. For the case when no such assumptions were made, a numerically calculated solution using previously reported imaging and reconstruction parameters was obtained. The detection probabilities for a range of dose and kVp values had been measured previously. These probabilities were normalized to a single dose of 56.6 mGy using the Rose-criterion-based relation to obtain a universal curve. Normalizations based on the new numerically calculated relationship were compared to the measured results. Results: The theoretical and numerical calculations have similar results and predict the detected diameter size to be inversely proportional to (dose)^(1/3) and (dose)^(1/2.8), respectively. The normalized experimental curves and the associated universal plot using the new relation were not significantly different from those obtained using the Rose-criterion-based normalization. Conclusions: From numerical simulations, the authors found that the diameter of detected disks depends inversely on the cube root of the dose. For observer studies with disks larger than 4 mm, the cube root and square root relations appear to give similar results when used for normalization. PMID:20527560

  18. How to regress and predict in a Bland-Altman plot? Review and contribution based on tolerance intervals and correlated-errors-in-variables models.

    PubMed

    Francq, Bernard G; Govaerts, Bernadette

    2016-06-30

    Two main methodologies for assessing equivalence in method-comparison studies are presented separately in the literature. The first one is the well-known and widely applied Bland-Altman approach with its agreement intervals, where two methods are considered interchangeable if their differences are not clinically significant. The second approach is based on errors-in-variables regression in a classical (X,Y) plot and focuses on confidence intervals, whereby two methods are considered equivalent when providing similar measures notwithstanding the random measurement errors. This paper reconciles these two methodologies and shows their similarities and differences using both real data and simulations. A new consistent correlated-errors-in-variables regression is introduced, as the errors are shown to be correlated in the Bland-Altman plot. Indeed, the coverage probabilities collapse and the biases soar when this correlation is ignored. Novel tolerance intervals are compared with agreement intervals with or without replicated data, and novel predictive intervals are introduced to predict a single measure in an (X,Y) plot or in a Bland-Altman plot with excellent coverage probabilities. We conclude that the (correlated)-errors-in-variables regressions should not be avoided in method comparison studies, although the Bland-Altman approach is usually applied to avert their complexity. We argue that tolerance or predictive intervals are better alternatives than agreement intervals, and we provide guidelines for practitioners regarding method comparison studies. Copyright © 2016 John Wiley & Sons, Ltd.
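
    The classical Bland-Altman agreement interval that the paper starts from can be sketched in a few lines (made-up paired measurements; the paper's tolerance and predictive intervals are refinements not reproduced here):

```python
import numpy as np

def bland_altman_interval(x, y):
    """Classical Bland-Altman agreement interval: mean difference +/- 1.96 sd
    of the paired differences."""
    d = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    bias = d.mean()
    sd = d.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Made-up paired measurements from two methods
method_a = [10.1, 11.3, 9.8, 12.0, 10.7, 11.1]
method_b = [10.0, 11.0, 10.2, 11.7, 10.9, 10.8]
bias, lower, upper = bland_altman_interval(method_a, method_b)
print(f"bias = {bias:.3f}, agreement interval = ({lower:.3f}, {upper:.3f})")
```

    In a Bland-Altman plot these differences are plotted against the pairwise means, which is exactly where the correlated-errors issue discussed in the paper arises.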

  19. Smisc - A collection of miscellaneous functions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Landon Sego, PNNL

    2015-08-31

    A collection of functions for statistical computing and data manipulation. These include routines for rapidly aggregating heterogeneous matrices, manipulating file names, loading R objects, sourcing multiple R files, formatting datetimes, multi-core parallel computing, stream editing, specialized plotting, etc. Smisc-package A collection of miscellaneous functions allMissing Identifies missing rows or columns in a data frame or matrix as.numericSilent Silent wrapper for coercing a vector to numeric comboList Produces all possible combinations of a set of linear model predictors cumMax Computes the maximum of the vector up to the current index cumsumNA Computes the cumulative sum of a vector without propagating NAs d2binom Probability functions for the sum of two independent binomials dataIn A flexible way to import data into R. dbb The Beta-Binomial Distribution df2list Row-wise conversion of a data frame to a list dfplapply Parallelized single row processing of a data frame dframeEquiv Examines the equivalence of two dataframes or matrices dkbinom Probability functions for the sum of k independent binomials factor2character Converts all factor variables in a dataframe to character variables findDepMat Identify linearly dependent rows or columns in a matrix formatDT Converts date or datetime strings into alternate formats getExtension Filename manipulations: remove the extension or path, extract the extension or path getPath Filename manipulations: remove the extension or path, extract the extension or path grabLast Filename manipulations: remove the extension or path, extract the extension or path ifelse1 Non-vectorized version of ifelse integ Simple numerical integration routine interactionPlot Two-way Interaction Plot with Error Bar linearMap Linear mapping of a numerical vector or scalar list2df Convert a list to a data frame loadObject Loads and returns the object(s) in an ".Rdata" file more Display the contents of a file to the R terminal movAvg2 Calculate the moving average using a 2-sided window openDevice Opens a graphics device based on the filename extension p2binom Probability functions for the sum of two independent binomials padZero Pad a vector of numbers with zeros parseJob Parses a collection of elements into (almost) equal sized groups pbb The Beta-Binomial Distribution pcbinom A continuous version of the binomial cdf pkbinom Probability functions for the sum of k independent binomials plapply Simple parallelization of lapply plotFun Plot one or more functions on a single plot PowerData An example of power data pvar Prints the name and value of one or more objects qbb The Beta-Binomial Distribution rbb The Beta-Binomial Distribution; and numerous others (space limits reporting).

  20. Singularity spectrum of intermittent seismic tremor at Kilauea Volcano, Hawaii

    USGS Publications Warehouse

    Shaw, H.R.; Chouet, B.

    1989-01-01

    Fractal singularity analysis (FSA) is used to study a 22-yr record of deep seismic tremor (30-60 km depth) for regions below Kilauea Volcano on the assumption that magma transport and fracture can be treated as a system of coupled nonlinear oscillators. Tremor episodes range from 1 to 100 min (cumulative duration = 1.60 × 10⁴ min; yearly average = 727 min yr⁻¹; mean gradient = 24.2 min yr⁻¹ km⁻¹). Partitioning of probabilities, Pᵢ, in the phase space of normalized durations, xᵢ, is expressed in terms of a function f(α), where α is a variable exponent of a length scale, l. Plots of f(α) vs. α are called multifractal singularity spectra. The spectrum for deep tremor durations is bounded by α values of about 0.4 and 1.9 at f = 0; f_max ≈ 1.0 for α ≈ 1. Results for tremor are similar to those found for systems transitional between complete mode locking and chaos. -Authors

  1. Cross-contact chain

    NASA Technical Reports Server (NTRS)

    Lieneweg, Udo (Inventor)

    1988-01-01

    A system is provided for use with wafers that include multiple integrated circuits that include two conductive layers in contact at multiple interfaces. Contact chains are formed beside the integrated circuits, each contact chain formed of the same two layers as the circuits, in the form of conductive segments alternating between the upper and lower layers and with the ends of the segments connected in series through interfaces. A current source passes a current through the series-connected segments, by way of a pair of current tabs connected to opposite ends of the series of segments. While the current flows, voltage measurements are taken between each of a plurality of pairs of voltage tabs, the two tabs of each pair connected to opposite ends of an interface that lies along the series-connected segments. A plot of interface conductances on a normal probability chart, enables prediction of the yield of good integrated circuits from the wafer.
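
    The yield-prediction idea, plotting interface conductances on a normal probability chart and looking for departures from a straight line, can be sketched with scipy.stats.probplot (the conductances below are simulated, with a few defective interfaces mixed in):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Simulated interface conductances (siemens): one dominant normal population
# plus a few low-conductance, defective interfaces that bend the line.
good = rng.normal(2.0e-3, 1.0e-4, 95)
bad = rng.normal(5.0e-4, 1.0e-4, 5)
conductance = np.concatenate([good, bad])

# probplot pairs ordered data with theoretical normal quantiles and fits a
# least-squares line; r near 1 means the points hug a straight line.
(osm, osr), (slope, intercept, r) = stats.probplot(conductance, dist="norm")
print(f"straight-line correlation r = {r:.3f}")   # defects pull r below 1
```

    The fraction of points falling off the line gives a direct estimate of the defective-interface rate, and hence of the yield of good circuits.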

  2. Cross-contact chain

    NASA Technical Reports Server (NTRS)

    Lieneweg, U. (Inventor)

    1986-01-01

    A system is provided for use with wafers that include multiple integrated circuits that include two conductive layers in contact at multiple interfaces. Contact chains are formed beside the integrated circuits, each contact chain formed of the same two layers as the circuits, in the form of conductive segments alternating between the upper and lower layers and with the ends of the segments connected in series through interfaces. A current source passes a current through the series-connected segments, by way of a pair of current tabs connected to opposite ends of the series of segments. While the current flows, voltage measurements are taken between each of a plurality of pairs of voltage tabs, the two tabs of each pair connected to opposite ends of an interface that lies along the series-connected segments. A plot of interface conductances on a normal probability chart enables prediction of the yield of good integrated circuits from the wafer.

  3. Design optimization of condenser microphone: a design of experiment perspective.

    PubMed

    Tan, Chee Wee; Miao, Jianmin

    2009-06-01

    A well-designed condenser microphone backplate is very important in the attainment of good frequency response characteristics--high sensitivity and wide bandwidth with flat response--and low mechanical-thermal noise. To study the design optimization of the backplate, a 2^6 factorial design with a single replicate, which consists of six backplate parameters and four responses, has been undertaken on a comprehensive condenser microphone model developed by Zuckerwar. Through the elimination of insignificant parameters via normal probability plots of the effect estimates, the projection of an unreplicated factorial design into a replicated one can be performed to carry out an analysis of variance on the factorial design. The air gap and slot have significant effects on the sensitivity, mechanical-thermal noise, and bandwidth while the slot/hole location interaction has major influence over the latter two responses. An organized and systematic approach of designing the backplate is summarized.
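
    The effect-estimation step behind such normal probability plots of effects can be sketched on a smaller 2^3 single-replicate design (the responses are made up for the sketch; the paper's design is 2^6):

```python
import numpy as np
from itertools import product

# Illustrative 2^3 single-replicate factorial in standard order.
levels = np.array(list(product([-1, 1], repeat=3)))   # design matrix, 8 runs
y = np.array([45.0, 71.0, 48.0, 65.0, 68.0, 60.0, 80.0, 65.0])

# Effect estimate for each factor: its +/- contrast divided by half the runs.
effects = {name: float((levels[:, j] * y).sum() / (len(y) / 2))
           for j, name in enumerate("ABC")}
print(effects)
```

    On a normal probability plot of these estimates, negligible effects fall on a straight line through zero while real effects fall off it; that screening is what lets the unreplicated design be projected into a replicated one for ANOVA.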

  4. User's manual for THPLOT, A FORTRAN 77 Computer program for time history plotting

    NASA Technical Reports Server (NTRS)

    Murray, J. E.

    1982-01-01

    A general purpose FORTRAN 77 computer program (THPLOT) for plotting time histories using Calcomp pen plotters is described. The program is designed to read a time history data file and to generate time history plots for selected time intervals and/or selected data channels. The capabilities of the program are described. The card input required to define the plotting operation is described and examples of card input and the resulting plotted output are given. The examples are followed by a description of the printed output, including both normal output and error messages. Lastly, implementation of the program is described. A complete listing of the program with reference maps produced by the CDC FTN 5.0 compiler is included.

  5. Sampling Error in Relation to Cyst Nematode Population Density Estimation in Small Field Plots.

    PubMed

    Župunski, Vesna; Jevtić, Radivoje; Jokić, Vesna Spasić; Župunski, Ljubica; Lalošević, Mirjana; Ćirić, Mihajlo; Ćurčić, Živko

    2017-06-01

    Cyst nematodes are serious plant-parasitic pests which can cause severe yield losses and extensive damage. Since there is still very little information about the error of population density estimation in small field plots, this study contributes to the broad issue of population density assessment. It was shown that there was no significant difference between cyst counts of five or seven bulk samples taken per 1-m² plot if the average cyst count per examined plot exceeded 75 cysts per 100 g of soil. Goodness of fit of the data to probability distributions, tested with the χ² test, confirmed a negative binomial distribution of cyst counts for 21 out of 23 plots. The recommended sampling precision of 17%, expressed through the coefficient of variation (cv), was achieved if plots of 1 m² contaminated with more than 90 cysts per 100 g of soil were sampled with 10-core bulk samples taken in five repetitions. If plots were contaminated with fewer than 75 cysts per 100 g of soil, 10-core bulk samples taken in seven repetitions gave a cv higher than 23%. This study indicates that more attention should be paid to the estimation of sampling error in experimental field plots to ensure more reliable estimation of the population density of cyst nematodes.
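
    The precision measure used in the study, the coefficient of variation across repeated bulk samples, is straightforward to compute (the counts below are hypothetical):

```python
import numpy as np

def sampling_cv(counts):
    """Coefficient of variation (sd / mean) of repeated bulk-sample counts,
    the sampling-precision measure used in the study."""
    c = np.asarray(counts, dtype=float)
    return c.std(ddof=1) / c.mean()

# Hypothetical cyst counts (per 100 g of soil) from five 10-core bulk samples
counts = [95, 110, 88, 102, 97]
cv = sampling_cv(counts)
print(f"cv = {cv:.1%}")                    # around 8%, within the 17% target
```

    A cv below the 17% benchmark indicates the sampling scheme is precise enough for the plot's infestation level.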

  6. CADDIS Volume 4. Data Analysis: Exploratory Data Analysis

    EPA Pesticide Factsheets

    Intro to exploratory data analysis. Overview of variable distributions, scatter plots, correlation analysis, GIS datasets. Use of conditional probability to examine stressor levels and impairment. Exploring correlations among multiple stressors.

  7. A comparison of hands-on inquiry instruction to lecture instruction with special needs high school biology students

    NASA Astrophysics Data System (ADS)

    Jensen-Ruopp, Helga Spitko

    A comparison of hands-on inquiry instruction with lecture instruction was presented to 134 Patterns and Process Biology students. Students participated in seven biology lessons selected from Biology Survey of Living Things (1992). A pre- and post-test paper-and-pencil assessment was used as the data-collecting instrument. The treatment group was taught using hands-on inquiry strategies while the non-treatment group was taught by the lecture method of instruction. The team-teaching model was used as the mode of presentation to both the treatment group and the non-treatment group. Achievement levels using specific criteria: novice (0% to 50%), developing proficiency (51% to 69%), accomplished (70% to 84%) and exceptional or mastery level (85% to 100%) were used as a guideline to tabulate the results of the pre- and post-assessment. Rubric tabulation was done to interpret the testing results. The raw data were plotted as percentage change in total test score versus reading level score by gender, as well as percentage change in total test score versus auditory vocabulary score by gender. A box-and-whisker plot comparison of individual pre- and post-test scores for the treatment and non-treatment groups was performed. Analysis of covariance (ANCOVA) using MINITAB Statistical Software version 14.11 was run on the data from the seven lessons, as well as on results by gender (male results, individual and combined, and female results, individual and combined). Normal probability plots for total scores as well as individual test scores were performed. The results suggest that hands-on inquiry-based instruction, when presented to special needs students, including at-risk, English as a second language, limited English proficiency and special education inclusive students, may enhance individual student achievement.

  8. Thermal buffering of concrete by seaweeds during a prolonged summer heatwave

    NASA Astrophysics Data System (ADS)

    Naylor, Larissa; Coombes, Martin

    2014-05-01

    Hard coastal infrastructure is subject to aggressive environmental conditions, including a suite of weathering processes in the intertidal zone. These processes, along with waves, lead to costly deterioration of coastal structures. Existing methods (e.g. coatings, less porous concrete) to reduce the risk of concrete deterioration rapidly lose their effectiveness in the intertidal zone. Additionally, a changing climate will lead to increased frequency of storms, higher sea level and higher extreme temperatures - and therefore, pose an increased risk of deterioration. Might there be a biogenic solution? New research (Coombes et al. 2013) has shown that fucoid seaweeds reduce microclimatic extremes and variability under normal summer conditions. The results presented here supplement these findings in two ways. First, they demonstrate that fucoid seaweeds act as a thermal buffer during a prolonged summer heatwave in Britain (July 2013). Over 36 days of continuous monitoring at two sites in Cornwall, UK, 19 of which were during the official heatwave, there were statistically significant differences (p = 0.000) in the maximum temperatures between thick seaweed (7.5 - 9.5 cm thickness) and thin seaweed (2 - 2.5 cm thickness) plots. Maximum temperatures reached 22°C and 33°C, for thick seaweed and thin seaweed plots, respectively. Variations in maximum temperatures between the two sites appear to be related to aspect. Second, the significantly different maximum temperature results between plots also demonstrate that seaweed thickness is an important factor influencing thermal buffering capacity. These data clearly demonstrate that fucoid seaweeds buffer concrete seawalls against extreme temperature fluxes during a heatwave, probably limiting the efficiency of deteriorative processes such as thermal expansion and contraction and salt crystallisation.

  9. Composite recovery type curves in normalized time from Theis' exact solution

    USGS Publications Warehouse

    Goode, Daniel J.

    1997-01-01

Type curves derived from Theis’ exact nonequilibrium well function solution are proposed for graphical estimation of aquifer hydraulic properties, transmissivity (T), and storage coefficient (S), from water-level recovery data after cessation of a constant-rate discharge test. Drawdown (on log scale) is plotted versus the ratio of time since pumping stopped to duration of pumping, a normalized time. Under Theis conditions, individual type curves depend on only the dimensionless pumping duration, which depends in turn on S and radial distance from the pumping well. Type curve matching, in contrast to the Theis procedure for pumping data, is performed by shifting only the drawdown axis; the time axis is fixed because it is a relative or normalized time. The match-point for the drawdown axis is used to compute T, and S is determined from matching the curve shape, which depends on early dimensionless-time data. Multiple well data can be plotted and matched simultaneously (a composite plot), with drawdown at different radial distances matching different curves. The ratio of dimensionless pumping durations for any two matched curves is equal to one over the squared ratio of radial distances. Application to two recovery datasets from the literature confirms the utility of these type curves in normalized time for composite estimation of T and S.
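The recovery calculation underlying these type curves can be illustrated directly: under Theis conditions, residual drawdown is the superposition of the pumping well and an injection image well that starts when pumping stops. The sketch below is illustrative only (a series approximation of the well function and invented aquifer parameters, not Goode's program):

```python
import math

def theis_w(u):
    """Theis well function W(u) = E1(u), via its convergent series (small u):
    W(u) = -gamma - ln(u) + sum_{n>=1} (-1)^(n+1) u^n / (n * n!)"""
    gamma = 0.5772156649015329
    total, term = -gamma - math.log(u), 1.0
    for n in range(1, 60):
        term *= -u / n          # term = (-1)^n u^n / n!
        total -= term / n
    return total

def residual_drawdown(Q, T, S, r, t_p, t_rec):
    """Residual drawdown a time t_rec after pumping at rate Q stopped:
    s = Q/(4 pi T) * [W(r^2 S/(4 T (t_p + t_rec))) - W(r^2 S/(4 T t_rec))]."""
    u_pump = r ** 2 * S / (4.0 * T * (t_p + t_rec))
    u_rec = r ** 2 * S / (4.0 * T * t_rec)
    return Q / (4.0 * math.pi * T) * (theis_w(u_pump) - theis_w(u_rec))

# Hypothetical aquifer: Q = 0.01 m3/s, T = 1e-3 m2/s, S = 1e-4, r = 30 m,
# pumped for one day; evaluate at normalized times t_rec / t_p.
t_p = 86400.0
curve = [(ratio, residual_drawdown(0.01, 1e-3, 1e-4, 30.0, t_p, ratio * t_p))
         for ratio in (0.1, 1.0, 10.0)]
```

Because drawdown is a function of the ratio t_rec/t_p alone (for fixed dimensionless pumping duration), the time axis needs no shifting during matching, exactly as the abstract describes.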

  10. On-plot drinking water supplies and health: A systematic review.

    PubMed

    Overbo, Alycia; Williams, Ashley R; Evans, Barbara; Hunter, Paul R; Bartram, Jamie

    2016-07-01

    Many studies have found that household access to water supplies near or within the household plot can reduce the probability of diarrhea, trachoma, and other water-related diseases, and it is generally accepted that on-plot water supplies produce health benefits for households. However, the body of research literature has not been analyzed to weigh the evidence supporting this. A systematic review was conducted to investigate the impacts of on-plot water supplies on diarrhea, trachoma, child growth, and water-related diseases, to further examine the relationship between household health and distance to water source and to assess whether on-plot water supplies generate health gains for households. Studies provide evidence that households with on-plot water supplies experience fewer diarrheal and helminth infections and greater child height. Findings suggest that water-washed (hygiene associated) diseases are more strongly impacted by on-plot water access than waterborne diseases. Few studies analyzed the effects of on-plot water access on quantity of domestic water used, hygiene behavior, and use of multiple water sources, and the lack of evidence for these relationships reveals an important gap in current literature. The review findings indicate that on-plot water access is a useful health indicator and benchmark for the progressive realization of the Sustainable Development Goal target of universal safe water access as well as the human right to safe water. Copyright © 2016 Elsevier GmbH. All rights reserved.

  11. Two tools for applying chromatographic retention data to the mass-based identification of peptides during hydrogen/deuterium exchange experiments by nano-liquid chromatography/matrix-assisted laser desorption/ionization mass spectrometry.

    PubMed

    Gershon, P D

    2010-12-15

    Two tools are described for integrating LC elution position with mass-based data in hydrogen-deuterium exchange (HDX) experiments by nano-liquid chromatography/matrix-assisted laser desorption/ionization mass spectrometry (nanoLC/MALDI-MS, a novel approach to HDX-MS). The first of these, 'TOF2H-Z Comparator', highlights peptides in HDX experiments that are potentially misidentified on the basis of mass alone. The program first calculates normalized values for the organic solvent concentration responsible for the elution of ions in nanoLC/MALDI HDX experiments. It then allows the solvent gradients for the multiple experiments contributing to an MS/MS-confirmed peptic peptide library to be brought into mutual alignment by iteratively re-modeling variables among LC parameters such as gradient shape, solvent species, fraction duration and LC dead time. Finally, using the program, high-probability chromatographic outliers can be flagged within HDX experimental data. The role of the second tool, 'TOF2H-XIC Comparator', is to normalize the LC chromatograms corresponding to all deuteration timepoints of all HDX experiments of a project, to a common reference. Accurate normalization facilitates the verification of chromatographic consistency between all ions whose spectral segments contribute to particular deuterium uptake plots. Gradient normalization in this manner revealed chromatographic inconsistencies between ions whose masses were either indistinguishable or separated by precise isotopic increments. Copyright © 2010 John Wiley & Sons, Ltd.

  12. Poincaré plot analysis of autocorrelation function of RR intervals in patients with acute myocardial infarction.

    PubMed

    Chuang, Shin-Shin; Wu, Kung-Tai; Lin, Chen-Yang; Lee, Steven; Chen, Gau-Yang; Kuo, Cheng-Deng

    2014-08-01

    The Poincaré plot of RR intervals (RRI) is obtained by plotting RRIn+1 against RRIn. The Pearson correlation coefficient (ρRRI), slope (SRRI), Y-intercept (YRRI), standard deviation of instantaneous beat-to-beat RRI variability (SD1RR), and standard deviation of continuous long-term RRI variability (SD2RR) can be defined to characterize the plot. Similarly, the Poincaré plot of autocorrelation function (ACF) of RRI can be obtained by plotting ACFk+1 against ACFk. The corresponding Pearson correlation coefficient (ρACF), slope (SACF), Y-intercept (YACF), SD1ACF, and SD2ACF can be defined similarly to characterize the plot. By comparing the indices of Poincaré plots of RRI and ACF between patients with acute myocardial infarction (AMI) and patients with patent coronary artery (PCA), we found that the ρACF and SACF were significantly larger, whereas the RMSSDACF/SDACF and SD1ACF/SD2ACF were significantly smaller in AMI patients. The ρACF and SACF correlated significantly and negatively with normalized high-frequency power (nHFP), and significantly and positively with normalized very low-frequency power (nVLFP) of heart rate variability in both groups of patients. On the contrary, the RMSSDACF/SDACF and SD1ACF/SD2ACF correlated significantly and positively with nHFP, and significantly and negatively with nVLFP and low-/high-frequency power ratio (LHR) in both groups of patients. We concluded that the ρACF, SACF, RMSSDACF/SDACF, and SD1ACF/SD2ACF, among many other indices of ACF Poincaré plot, can be used to differentiate between patients with AMI and patients with PCA, and that the increase in ρACF and SACF and the decrease in RMSSDACF/SDACF and SD1ACF/SD2ACF suggest an increased sympathetic and decreased vagal modulations in both groups of patients.
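The dispersion indices of such a plot (SD1 and SD2) come from rotating each successive pair 45 degrees, so that one axis lies along the identity line and the other perpendicular to it. A minimal sketch with illustrative RR intervals (not the study's data):

```python
import math
import statistics

def poincare_sd(rri):
    """SD1/SD2 of the Poincare plot of successive RR intervals (ms).
    SD1 is the spread perpendicular to the identity line (beat-to-beat
    variability); SD2 is the spread along it (long-term variability)."""
    pairs = list(zip(rri[:-1], rri[1:]))        # (RRI_n, RRI_n+1)
    perp = [(b - a) / math.sqrt(2) for a, b in pairs]
    along = [(a + b) / math.sqrt(2) for a, b in pairs]
    return statistics.pstdev(perp), statistics.pstdev(along)

rri = [812, 790, 805, 798, 820, 801, 795, 810, 803, 799]  # synthetic, ms
sd1, sd2 = poincare_sd(rri)
```

The same construction applies unchanged to the autocorrelation-function series (plot ACF_k+1 against ACF_k) to obtain SD1ACF and SD2ACF.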

  13. An exploratory drilling exhaustion sequence plot program

    USGS Publications Warehouse

    Schuenemeyer, J.H.; Drew, L.J.

    1977-01-01

    The exhaustion sequence plot program computes the conditional area of influence for wells in a specified rectangular region with respect to a fixed-size deposit. The deposit is represented by an ellipse whose size is chosen by the user. The area of influence may be displayed on computer printer plots consisting of a maximum of 10,000 grid points. At each point, a symbol is presented that indicates the probability of that point being exhausted by nearby wells with respect to a fixed-size ellipse. This output gives a pictorial view of the manner in which oil fields are exhausted. In addition, the exhaustion data may be used to estimate the number of deposits remaining in a basin. ?? 1977.

  14. Diameter distribution in a Brazilian tropical dry forest domain: predictions for the stand and species.

    PubMed

    Lima, Robson B DE; Bufalino, Lina; Alves, Francisco T; Silva, José A A DA; Ferreira, Rinaldo L C

    2017-01-01

Currently, there is a lack of studies on the correct use of continuous distributions for dry tropical forests. This work therefore aims to investigate the diameter structure of a Brazilian tropical dry forest and to select suitable continuous distributions, by means of statistical tools, for the stand and the main species. Two subsets were randomly selected from 40 plots, and diameter at base height was obtained. The following functions were tested: log-normal, gamma, Weibull 2P and Burr. The best fits were selected by Akaike's information criterion. Overall, the diameter distribution of the dry tropical forest was better described by negative exponential curves with positive skewness: the forest showed diameter distributions with decreasing probability for larger trees, a behavior observed for both the main species and the stand. Generalizing the function fitted across the main species shows that individual models need to be developed. The Burr function showed good flexibility in describing the diameter structure of the stand and of the species Mimosa ophthalmocentra and Bauhinia cheilantha. For Poincianella bracteosa, Aspidosperma pyrifolium and Myracrodum urundeuva, a better fit was obtained with the log-normal function.
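The selection step amounts to fitting each candidate by maximum likelihood and ranking by AIC. A minimal two-candidate sketch on simulated diameters (log-normal vs. exponential, both with closed-form MLEs; this is a generic illustration, not the study's fitting code):

```python
import math
import random
import statistics

def aic_lognormal(data):
    """AIC for a log-normal fit; the MLE is the mean/SD of the log-diameters."""
    logs = [math.log(d) for d in data]
    mu, sigma = statistics.fmean(logs), statistics.pstdev(logs)
    loglik = sum(-math.log(d * sigma * math.sqrt(2.0 * math.pi))
                 - (math.log(d) - mu) ** 2 / (2.0 * sigma ** 2) for d in data)
    return 2 * 2 - 2.0 * loglik          # two parameters

def aic_exponential(data):
    """AIC for an exponential fit; the MLE rate is 1/mean."""
    lam = 1.0 / statistics.fmean(data)
    loglik = len(data) * math.log(lam) - lam * sum(data)
    return 2 * 1 - 2.0 * loglik          # one parameter

random.seed(42)
diameters = [random.lognormvariate(2.5, 0.5) for _ in range(500)]  # cm, synthetic
best = min(("log-normal", aic_lognormal(diameters)),
           ("exponential", aic_exponential(diameters)), key=lambda t: t[1])
```

With data actually drawn from a log-normal, AIC correctly prefers it; in the study the same comparison is run across the four candidate families per species and for the stand.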

  15. Probability modeling of high flow extremes in Yingluoxia watershed, the upper reaches of Heihe River basin

    NASA Astrophysics Data System (ADS)

    Li, Zhanling; Li, Zhanjie; Li, Chengcheng

    2014-05-01

Probability modeling of hydrological extremes is a major research area in hydrological science. Most such studies in China have focused on basins in the humid and semi-humid south and east; for the inland river basins, which occupy about 35% of the country's area, such analyses remain scarce, partly because of limited data availability and relatively low mean annual flows. The objective of this study is to carry out probability modeling of high flow extremes in the upper reach of the Heihe River basin, the second largest inland river basin in China, using the peaks-over-threshold (POT) method and the Generalized Pareto Distribution (GPD); the selection of the threshold and the inherent assumptions of the POT series are elaborated in detail. For comparison, other widely used probability distributions, including the generalized extreme value (GEV), lognormal, log-logistic and gamma, are employed as well. Maximum likelihood estimation is used for the parameters. Daily flow data at Yingluoxia station from 1978 to 2008 are used. Results show that, synthesizing the mean excess plot, the stability of the model parameters, the return level plot and the independence assumption of the POT series, an optimum threshold of 340 m3/s is determined for high flow extremes in the Yingluoxia watershed. The resulting POT series is shown to be stationary and independent by the Mann-Kendall test, the Pettitt test and an autocorrelation test. According to the Kolmogorov-Smirnov test, the Anderson-Darling test and several graphical diagnostics such as quantile and cumulative density function plots, the GPD provides the best fit to high flow extremes in the study area. The estimated high flows for long return periods demonstrate that the return level estimates become increasingly uncertain as the return period grows. The frequency of high flow extremes exhibits a slight but not significant decreasing trend from 1978 to 2008, while the intensity of such extremes is increasing, especially at the higher return levels.
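The POT/GPD step can be sketched end to end: fit the GPD to the excesses over the threshold and convert the fit into return levels. The sketch below uses a closed-form method-of-moments fit instead of the paper's maximum likelihood (to stay dependency-free), and the flow data are simulated, not the Yingluoxia record:

```python
import math
import random
import statistics

def gpd_fit_moments(excesses):
    """Method-of-moments GPD fit to threshold excesses:
    shape xi = (1 - mean^2/var) / 2,  scale sigma = mean * (1 - xi)."""
    m = statistics.fmean(excesses)
    v = statistics.pvariance(excesses)
    xi = 0.5 * (1.0 - m * m / v)
    return xi, m * (1.0 - xi)

def return_level(u, xi, sigma, rate, years):
    """T-year return level for `rate` exceedances/year over threshold u."""
    if abs(xi) < 1e-9:                       # exponential limit xi -> 0
        return u + sigma * math.log(rate * years)
    return u + sigma / xi * ((rate * years) ** xi - 1.0)

random.seed(7)
threshold = 340.0                            # m3/s, as selected in the study
excesses = [random.expovariate(1.0 / 60.0) for _ in range(150)]  # synthetic
xi, sigma = gpd_fit_moments(excesses)
q10 = return_level(threshold, xi, sigma, 3.0, 10)
q100 = return_level(threshold, xi, sigma, 3.0, 100)
```

Return levels grow with the return period regardless of the sign of the shape parameter, and the widening uncertainty at long return periods noted in the abstract comes from extrapolating this curve far beyond the observed exceedances.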

  16. EUCLID: an outcome analysis tool for high-dimensional clinical studies

    NASA Astrophysics Data System (ADS)

    Gayou, Olivier; Parda, David S.; Miften, Moyed

    2007-03-01

    Treatment management decisions in three-dimensional conformal radiation therapy (3DCRT) and intensity-modulated radiation therapy (IMRT) are usually made based on the dose distributions in the target and surrounding normal tissue. These decisions may include, for example, the choice of one treatment over another and the level of tumour dose escalation. Furthermore, biological predictors such as tumour control probability (TCP) and normal tissue complication probability (NTCP), whose parameters available in the literature are only population-based estimates, are often used to assess and compare plans. However, a number of other clinical, biological and physiological factors also affect the outcome of radiotherapy treatment and are often not considered in the treatment planning and evaluation process. A statistical outcome analysis tool, EUCLID, for direct use by radiation oncologists and medical physicists was developed. The tool builds a mathematical model to predict an outcome probability based on a large number of clinical, biological, physiological and dosimetric factors. EUCLID can first analyse a large set of patients, such as from a clinical trial, to derive regression correlation coefficients between these factors and a given outcome. It can then apply such a model to an individual patient at the time of treatment to derive the probability of that outcome, allowing the physician to individualize the treatment based on medical evidence that encompasses a wide range of factors. The software's flexibility allows the clinicians to explore several avenues to select the best predictors of a given outcome. Its link to record-and-verify systems and data spreadsheets allows for a rapid and practical data collection and manipulation. A wide range of statistical information about the study population, including demographics and correlations between different factors, is available. 
A large number of one- and two-dimensional plots, histograms and survival curves allow for an easy visual analysis of the population. Several visual and analytical methods are available to quantify the predictive power of the multivariate regression model. The EUCLID tool can be readily integrated with treatment planning and record-and-verify systems.

  17. EUCLID: an outcome analysis tool for high-dimensional clinical studies.

    PubMed

    Gayou, Olivier; Parda, David S; Miften, Moyed

    2007-03-21

    Treatment management decisions in three-dimensional conformal radiation therapy (3DCRT) and intensity-modulated radiation therapy (IMRT) are usually made based on the dose distributions in the target and surrounding normal tissue. These decisions may include, for example, the choice of one treatment over another and the level of tumour dose escalation. Furthermore, biological predictors such as tumour control probability (TCP) and normal tissue complication probability (NTCP), whose parameters available in the literature are only population-based estimates, are often used to assess and compare plans. However, a number of other clinical, biological and physiological factors also affect the outcome of radiotherapy treatment and are often not considered in the treatment planning and evaluation process. A statistical outcome analysis tool, EUCLID, for direct use by radiation oncologists and medical physicists was developed. The tool builds a mathematical model to predict an outcome probability based on a large number of clinical, biological, physiological and dosimetric factors. EUCLID can first analyse a large set of patients, such as from a clinical trial, to derive regression correlation coefficients between these factors and a given outcome. It can then apply such a model to an individual patient at the time of treatment to derive the probability of that outcome, allowing the physician to individualize the treatment based on medical evidence that encompasses a wide range of factors. The software's flexibility allows the clinicians to explore several avenues to select the best predictors of a given outcome. Its link to record-and-verify systems and data spreadsheets allows for a rapid and practical data collection and manipulation. A wide range of statistical information about the study population, including demographics and correlations between different factors, is available. 
A large number of one- and two-dimensional plots, histograms and survival curves allow for an easy visual analysis of the population. Several visual and analytical methods are available to quantify the predictive power of the multivariate regression model. The EUCLID tool can be readily integrated with treatment planning and record-and-verify systems.

  18. Origin of the enhancement of tunneling probability in the nearly integrable system

    NASA Astrophysics Data System (ADS)

    Hanada, Yasutaka; Shudo, Akira; Ikeda, Kensuke S.

    2015-04-01

The enhancement of tunneling probability in a nearly integrable system is closely examined, focusing on tunneling splittings plotted as a function of the inverse of Planck's constant. On the basis of an analysis using an absorber that efficiently suppresses the coupling creating spikes in the plot, we find that the splitting curve should be viewed as a staircase-shaped skeleton accompanied by spikes. We further introduce renormalized integrable Hamiltonians and explore the origin of this staircase structure by closely investigating the nature of the eigenfunctions. The origin of the staircase structure can be traced back to an anomalous structure in the tunneling tail which manifests itself in the representation using renormalized action bases. This also explains why the staircase does not appear in a completely integrable system.

  19. A joint probability approach for coincidental flood frequency analysis at ungauged basin confluences

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Cheng

    2016-03-12

A reliable and accurate flood frequency analysis at the confluence of streams is of importance. Given that long-term peak flow observations are often unavailable at tributary confluences, at a practical level, this paper presents a joint probability approach (JPA) to address the coincidental flood frequency analysis at the ungauged confluence of two streams based on the flow rate data from the upstream tributaries. One case study is performed for comparison against several traditional approaches, including the plotting-position formula, the univariate flood frequency analysis, and the National Flood Frequency Program developed by the US Geological Survey. It shows that the results generated by the JPA approach agree well with the floods estimated by the plotting-position and univariate flood frequency analyses based on the observation data.

  20. Data Analysis and Modeling of Arctic Sea Ice Subsurface Roughness.

    DTIC Science & Technology

    1982-11-01

X in terms of that of Z The above considerations suggest that sculpturing is sometimes a natural way of fitting a Wilk-Gnanadesikan [1968] q-q plot...of ice profiles obtained by submarine sonar in the Beaufort Sea. J. of Glaciology, Vol. 25, pp. 401-424. 12. Wilk, M. B. and Gnanadesikan, R. (1968...obtained by submarine sonar in the Beaufort Sea. J. of Glaciology, Vol. 25, pp. 401-424. 11. Wilk, M. B. and Gnanadesikan, R. (1968). Probability plotting

  1. ACE Design Study and Experiments

    DTIC Science & Technology

    1976-06-01

    orthophoto on off-line printer o Automatically compute contours on UNIVAC 1108 and plot on CALCOMP o Manually trace planimetry and drainage from... orthophoto * o Manually edit and trace plotted contours to obtain completed contour manuscript* - Edit errors - Add missing contour detail - Combine...stereomodels - Contours adjusted to drainage chart and spot elevations - Referring to orthophoto , rectified photos, original photos o Normal

  2. Hyperspectral remote sensing analysis of short rotation woody crops grown with controlled nutrient and irrigation treatments

    Treesearch

    Jungho Im; John R. Jensen; Mark Coleman; Eric Nelson

    2009-01-01

Hyperspectral remote sensing research was conducted to document the biophysical and biochemical characteristics of controlled forest plots subjected to various nutrient and irrigation treatments. The experimental plots were located on the Savannah River Site near Aiken, SC. AISA hyperspectral imagery was analysed using three approaches, including: (1) normalized...

  3. Lethal Trap Trees and Semiochemical Repellents as Area Host Protection Strategies for Spruce Beetle (Coleoptera: Curculionidae, Scolytinae) in Utah.

    PubMed

    Matthew Hansen, E; Steven Munson, A; Blackford, Darren C; Wakarchuk, David; Scott Baggett, L

    2016-10-01

    We tested lethal trap trees and repellent semiochemicals as area treatments to protect host trees from spruce beetle (Dendroctonus rufipennis Kirby) attacks. Lethal trap tree treatments ("spray treatment") combined a spruce beetle bait with carbaryl treatment of the baited spruce. Repellent treatments ("spray-repellent") combined a baited lethal trap tree within a 16-m grid of MCH (3-methylcyclohex-2-en-1-one) and two novel spruce beetle repellents. After beetle flight, we surveyed all trees within 50 m of plot center, stratified by 10-m radius subplots, and compared attack rates to those from baited and unbaited control plots. Compared to the baited controls, spruce in the spray treatment had significantly reduced likelihood of a more severe attack classification (e.g., mass-attacked over strip-attacked or unsuccessful-attacked over unattacked). Because spruce in the spray treatment also had significantly heightened probability of more severe attack classification than those in the unbaited controls, however, we do not recommend lethal trap trees as a stand-alone beetle suppression strategy for epidemic beetle populations. Spruce in the spray-repellent treatment were slightly more likely to be classified as more severely attacked within 30 m of plot center compared to unbaited controls but, overall, had reduced probabilities of beetle attack over the entire 50-m radius plots. The semiochemical repellents deployed in this study were effective at reducing attacks on spruce within treated plots despite the presence of a centrally located spruce beetle bait. Further testing will be required to clarify operational protocols such as dose, elution rate, and release device spacing. Published by Oxford University Press on behalf of Entomological Society of America 2016. This work is written by US Government employees and is in the public domain in the US.

  4. A heat and water transfer model for seasonally frozen soils with application to a precipitation-runoff model

    USGS Publications Warehouse

    Emerson, Douglas G.

    1994-01-01

A model that simulates heat and water transfer in soils during freezing and thawing periods was developed and incorporated into the U.S. Geological Survey's Precipitation-Runoff Modeling System. The model's transfer of heat is based on an equation developed from Fourier's equation for heat flux. The model's transfer of water within the soil profile is based on the concept of capillary forces. Field capacity and infiltration rate can vary throughout the freezing and thawing period, depending on soil conditions and rate and timing of snowmelt. The model can be used to determine the effects of seasonally frozen soils on ground-water recharge and surface-water runoff. Data collected for two winters, 1985-86 and 1986-87, on three runoff plots were used to calibrate and verify the model. The winter of 1985-86 was colder than normal, and snow cover was continuous throughout the winter. The winter of 1986-87 was warmer than normal, and snow accumulated for only short periods of several days. Runoff, snowmelt, and frost depths were used as the criteria for determining the degree of agreement between simulated and measured data. The model was calibrated using the 1985-86 data for plot 2. The calibration simulation agreed closely with the measured data. The verification simulations for plots 1 and 3 using the 1985-86 data and for plots 1 and 2 using the 1986-87 data agreed closely with the measured data. The verification simulation for plot 3 using the 1986-87 data did not agree closely. The recalibration simulations for plots 1 and 3 using the 1985-86 data indicated little improvement because the verification simulations for plots 1 and 3 already agreed closely with the measured data.
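The heat-transfer component rests on Fourier's law of heat conduction. As a generic illustration of the underlying physics (a textbook explicit finite-difference step, not the USGS model's actual discretization; soil properties and temperatures are invented):

```python
def heat_step(temps, alpha, dt, dx):
    """One explicit step of dT/dt = alpha * d2T/dx2 on a 1-D soil column
    with fixed-temperature boundaries; stable when alpha*dt/dx**2 <= 0.5."""
    r = alpha * dt / dx ** 2
    assert r <= 0.5, "explicit scheme unstable"
    out = temps[:]
    for i in range(1, len(temps) - 1):
        out[i] = temps[i] + r * (temps[i - 1] - 2.0 * temps[i] + temps[i + 1])
    return out

# Cold snap: surface node held at -5 C, deep soil at +2 C (hypothetical values).
profile = [-5.0] + [2.0] * 5
for _ in range(200):
    profile = heat_step(profile, alpha=1e-6, dt=1000.0, dx=0.05)
```

The frost front in such a scheme is simply the depth at which the profile crosses 0 °C; the actual model couples this heat transport to the capillary water-transfer component.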

  5. Documentation of a heat and water transfer model for seasonally frozen soils with application to a precipitation-runoff model

    USGS Publications Warehouse

    Emerson, Douglas G.

    1991-01-01

A model that simulates heat and water transfer in soils during freezing and thawing periods was developed and incorporated into the U.S. Geological Survey's Precipitation-Runoff Modeling System. The transfer of heat is based on an equation developed from Fourier's equation for heat flux. Field capacity and infiltration rate can vary throughout the freezing and thawing period, depending on soil conditions and rate and timing of snowmelt. The transfer of water within the soil profile is based on the concept of capillary forces. The model can be used to determine the effects of seasonally frozen soils on ground-water recharge and surface-water runoff. Data collected for two winters, 1985-86 and 1986-87, on three runoff plots were used to calibrate and verify the model. The winter of 1985-86 was colder than normal and snow cover was continuous throughout the winter. The winter of 1986-87 was warmer than normal and snow accumulated for only short periods of several days. Runoff, snowmelt, and frost depths were used as the criteria for determining the degree of agreement between simulated and measured data. The model was calibrated using the 1985-86 data for plot 2. The calibration simulation agreed closely with the measured data. The verification simulations for plots 1 and 3 using the 1985-86 data and for plots 1 and 2 using the 1986-87 data agreed closely with the measured data. The verification simulation for plot 3 using the 1986-87 data did not agree closely. The recalibration simulations for plots 1 and 3 using the 1985-86 data indicated little improvement because the verification simulations for plots 1 and 3 already agreed closely with the measured data.

  6. Petrogenesis of the Majiari ophiolite (western Tibet, China): Implications for intra-oceanic subduction in the Bangong-Nujiang Tethys

    NASA Astrophysics Data System (ADS)

    Huang, Qiang-tai; Liu, Wei-liang; Xia, Bin; Cai, Zhou-rong; Chen, Wei-yan; Li, Jian-feng; Yin, Zheng-xin

    2017-09-01

    The Majiari ophiolite lies in the western Bangong-Nujiang Suture Zone, which separates the Qiangtang and Lhasa blocks in central Tibet. The ophiolite consists of peridotite, gabbro/diabase and basalt. Zircon U-Pb dating yielded an age of 170.5 ± 1.7 Ma for the gabbro, whereas 40Ar/39Ar dating of plagioclase from the same gabbro yielded ages of 108.4 ± 2.6 Ma (plateau age) and 112 ± 2 Ma (isochron age), indicating that the ophiolite was formed during the Middle Jurassic and was probably emplaced during the Early Cretaceous. Zircons from the gabbro have εHf(t) values ranging from +6.9 to +10.6 and f(Lu/Hf) values ranging from -0.92 to -0.98. Mafic lavas plot in the tholeiitic basalt field but are depleted in Nb, Ta and Ti and enriched in Rb, Ba and Th in the N-MORB-normalized trace element spider diagram. These lavas have whole-rock εNd(t) values of +5.9 to +6.6, suggesting that they were derived from a depleted mantle source, which was probably modified by subducted materials. The Majiari ophiolite probably formed in a typical back-arc basin above a supra-subduction zone (SSZ) mantle wedge. Intra-oceanic subduction occurred during the Middle Jurassic and collision of the Lhasa and South Qiangtang terranes likely occurred in the Early Cretaceous. Thus, closure of the Bangong-Nujiang Tethys Ocean likely occurred before the Early Cretaceous.

  7. [Recurrence plot analysis of HRV for brain ischemia and asphyxia].

    PubMed

    Chen, Xiaoming; Qiu, Yihong; Zhu, Yisheng

    2008-02-01

Heart rate variability (HRV) is the small beat-to-beat variability in the cycle lengths of the heart, which reflects the balance between sympathetic and vagal modulation. Since the nonlinear character of HRV is well established, the recurrence plot, a nonlinear dynamic analysis method based on complexity, can be used to analyze HRV. The results showed that the recurrence plot structures and some quantitative indices (L-Mean, L-Entr) during an asphyxia insult vary significantly compared with those under normal conditions, offering a new method to monitor asphyxial brain injury.
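A recurrence plot is built by thresholding pairwise distances in the series and examining the structure of the resulting binary matrix; line-based indices such as L-Mean and L-Entr are then read off its diagonal line segments. A minimal sketch on a synthetic series (not RR-interval data from the study):

```python
def recurrence_matrix(x, eps):
    """Binary recurrence matrix: R[i][j] = 1 when |x[i] - x[j]| < eps."""
    n = len(x)
    return [[1 if abs(x[i] - x[j]) < eps else 0 for j in range(n)]
            for i in range(n)]

def recurrence_rate(R):
    """Fraction of recurrent points: the density of 1s in the matrix."""
    n = len(R)
    return sum(map(sum, R)) / float(n * n)

series = [0.0, 0.7, 1.0, 0.7, 0.0, -0.7, -1.0, -0.7, 0.0, 0.7]  # roughly periodic
R = recurrence_matrix(series, eps=0.2)
rr = recurrence_rate(R)
```

For a truly periodic signal the matrix shows long diagonal lines (large L-Mean); for an irregular signal the lines fragment, which is the kind of structural change the study exploits.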

  8. A Seakeeping Performance and Affordability Tradeoff Study for the Coast Guard Offshore Patrol Cutter

    DTIC Science & Technology

    2016-06-01

    Index Polar Plot for Sea State 4, All Headings Are Relative to the Wave Motion and Velocity is Given in Meters per Second...40 Figure 15. Probability and Cumulative Density Functions of Annual Sea State Occurrences in the Open Ocean, North Pacific...criteria at a given sea state. Probability distribution functions are available that describe the likelihood that an operational area will experience

  9. On the Conformable Fractional Quantum Mechanics

    NASA Astrophysics Data System (ADS)

    Mozaffari, F. S.; Hassanabadi, H.; Sobhani, H.; Chung, W. S.

    2018-05-01

In this paper, a conformable fractional quantum mechanics is introduced using three postulates. In this formalism, the Schrödinger equation, probability density, probability flux and continuity equation are derived. As an application of the formalism, a fractional radial harmonic oscillator is considered. After obtaining its wave function and energy spectrum, the effects of the conformable fractional parameter on several quantities are investigated and plotted for different excited states.

  10. Is this the right normalization? A diagnostic tool for ChIP-seq normalization.

    PubMed

    Angelini, Claudia; Heller, Ruth; Volkinshtein, Rita; Yekutieli, Daniel

    2015-05-09

ChIP-seq experiments are becoming a standard approach for genome-wide profiling of protein-DNA interactions, such as detecting transcription factor binding sites, histone modification marks and RNA Polymerase II occupancy. However, when comparing a ChIP sample versus a control sample, such as Input DNA, normalization procedures have to be applied in order to remove experimental sources of bias. Despite the substantial impact that the choice of the normalization method can have on the results of a ChIP-seq data analysis, the assessment of these methods is not fully explored in the literature. In particular, there are no diagnostic tools that show whether the applied normalization is indeed appropriate for the data being analyzed. In this work we propose a novel diagnostic tool to examine the appropriateness of the estimated normalization procedure. By plotting the empirical densities of log relative risks in bins of equal read count, along with the estimated normalization constant, after logarithmic transformation, the researcher is able to assess the appropriateness of the estimated normalization constant. We use the diagnostic plot to evaluate the appropriateness of the estimates obtained by CisGenome, NCIS and CCAT on several real data examples. Moreover, we show the impact that the choice of the normalization constant can have on standard tools for peak calling such as MACS or SICER. Finally, we propose a novel procedure for controlling the FDR using sample swapping. This procedure makes use of the estimated normalization constant in order to gain power over the naive choice of constant (used in MACS and SICER), which is the ratio of the total number of reads in the ChIP and Input samples. Linear normalization approaches aim to estimate a scale factor, r, to adjust for different sequencing depths when comparing ChIP versus Input samples. The estimated scaling factor can easily be incorporated in many peak caller algorithms to improve the accuracy of the peak identification.
The diagnostic plot proposed in this paper can be used to assess how adequate ChIP/Input normalization constants are, and thus it allows the user to choose the most adequate estimate for the analysis.
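The contrast between the naive ratio-of-totals constant and a background-based estimate can be sketched in a few lines of Python. The simulated Poisson bin counts and the low-count background selection below are a simplified stand-in for NCIS-style estimation, not the published algorithm:

```python
import numpy as np

def background_scale(chip, ctrl, max_quantile=0.5):
    """Estimate a ChIP/Input scaling factor from low-coverage (background)
    bins only -- a simplified sketch of NCIS-style normalization."""
    total = chip + ctrl
    cutoff = np.quantile(total, max_quantile)
    bg = total <= cutoff                 # background = low total-count bins
    return chip[bg].sum() / ctrl[bg].sum()

rng = np.random.default_rng(0)
ctrl = rng.poisson(10, 10_000)           # Input counts per bin
chip = rng.poisson(15, 10_000)           # ChIP at 1.5x sequencing depth
chip[:200] += rng.poisson(80, 200)       # 200 enriched (peak) bins

naive = chip.sum() / ctrl.sum()          # ratio of totals (MACS/SICER default)
bg_r = background_scale(chip, ctrl)      # background-bin estimate
```

Because enriched bins inflate only the ChIP total, the naive ratio overshoots the true sequencing-depth ratio of 1.5, while restricting the sums to background bins recovers it.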

  11. Implications of land-use change on forest carbon stocks in the eastern United States

    NASA Astrophysics Data System (ADS)

    Puhlick, Joshua; Woodall, Christopher; Weiskittel, Aaron

    2017-02-01

    Given the substantial role that forests play in removing CO2 from the atmosphere, there has been a growing need to evaluate the carbon (C) implications of various forest management and land-use decisions. Although assessment of land-use change is central to national-level greenhouse gas monitoring guidelines, it is rarely incorporated into forest stand-level evaluations of C dynamics and trajectories. To better inform the assessment of forest stand C dynamics in the context of potential land-use change, we used a region-wide repeated forest inventory (n = 71 444 plots) across the eastern United States to assess forest land-use conversion and associated changes in forest C stocks. Specifically, the probability of forest area reduction between 2002-2006 and 2007-2012 on these plots was related to key driving factors such as proportion of the landscape in forest land use, distance to roads, and initial forest C. Additional factors influencing the actual reduction in forest area were then used to assess the risk of forest land-use conversion to agriculture, settlement, and water. Plots in forests along the Great Plains had the highest periodic (approximately 5 years) probability of land-use change (0.160 ± 0.075; mean ± SD) with forest conversion to agricultural uses accounting for 70.5% of the observed land-use change. Aboveground forest C stock change for plots with a reduction in forest area was -4.2 ± 17.7 Mg ha-1 (mean ± SD). The finding that poorly stocked stands and/or those with small diameter trees had the highest probability of conversion to non-forest land uses suggests that forest management strategies can maintain the US terrestrial C sink not only in terms of increased net forest growth but also retention of forest area to avoid conversion. This study highlights the importance of considering land-use change in planning and policy decisions that seek to maintain or enhance regional C sinks.

  12. Performance of deep-rooted phreatophytic trees at a site containing total petroleum hydrocarbons.

    PubMed

    Ferro, Ari M; Adham, Tareq; Berra, Brett; Tsao, David

    2013-01-01

Poplar and willow tree stands were installed in 2003 at a site in Raleigh, North Carolina containing total petroleum hydrocarbon (TPH)-contaminated groundwater. The objective was groundwater uptake and plume control. The water table was 5 to 6 m below ground surface (bgs), and methods were therefore used to encourage deep root development. Growth rates, rooting depth and sap flow were measured for trees in Plot A, located in the center of the plume, and in Plot B, peripheral to the plume. The trees were initially sub-irrigated with vertically installed drip lines and by 2005 had roots 4 to 5 m bgs. Water balance calculations suggested groundwater uptake. In 2007, the average sap flow was higher for Plot B (approximately 59 L per day per tree) than for Plot A (approximately 23 L per day per tree), probably as a result of TPH-induced stress in Plot A. Nevertheless, the estimated rate of groundwater uptake for Plot A was sufficient, relative to the calculated rate of groundwater flux beneath the stand, that a high level of plume control was achieved based on MODFLOW modeling results. Down-gradient groundwater monitoring wells installed in late 2011 should provide quantitative data on plume control.

  13. An Unusual Exponential Graph

    ERIC Educational Resources Information Center

    Syed, M. Qasim; Lovatt, Ian

    2014-01-01

This paper is an addition to the series of papers on the exponential function begun by Albert Bartlett. In particular, we ask how the graph of the exponential function y = e^(-t/τ) would appear if y were plotted versus ln t rather than the normal practice of plotting ln y versus t. In answering this question, we find a new way to…
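A quick numerical check of the two plotting choices (a sketch with an assumed decay constant τ = 2; `tau` and `curvature` are illustrative names):

```python
import numpy as np

tau = 2.0
t = np.linspace(0.1, 10, 200)
y = np.exp(-t / tau)

# conventional plot: ln(y) versus t is exactly a straight line of slope -1/tau
slope, intercept = np.polyfit(t, np.log(y), 1)

# the paper's variant: y versus ln(t) is curved -- for three points equally
# spaced in ln(t), the middle value sits well above the chord through the ends
y_lo, y_mid, y_hi = np.exp(-np.array([0.1, 1.0, 10.0]) / tau)
curvature = y_mid - 0.5 * (y_lo + y_hi)
```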

  14. Rapid classification of landsat TM imagery for phase 1 stratification using the automated NDVI threshold supervised classification (ANTSC) methodology

    Treesearch

    William H. Cooke; Dennis M. Jacobs

    2002-01-01

FIA annual inventories require rapid updating of pixel-based Phase 1 estimates. Scientists at the Southern Research Station are developing an automated methodology that uses a Normalized Difference Vegetation Index (NDVI) for identifying and eliminating problem FIA plots from the analysis. Problem plots are those that have questionable land use/land cover information....

  15. Unmanned aircraft system-derived crop height and normalized difference vegetation index metrics for sorghum yield and aphid stress assessment

    USDA-ARS?s Scientific Manuscript database

    A small, fixed-wing UAS was used to survey a replicated small plot field experiment designed to estimate sorghum damage caused by an invasive aphid. Plant stress varied among 40 plots through manipulation of aphid densities. Equipped with a consumer-grade near-infrared camera, the UAS was flown on...

  16. Application of Tryptophan Fluorescence Bandwidth-Maximum Plot in Analysis of Monoclonal Antibody Structure.

    PubMed

    Huang, Cheng-Yen; Hsieh, Ming-Ching; Zhou, Qinwei

    2017-04-01

    Monoclonal antibodies have become the fastest growing protein therapeutics in recent years. The stability and heterogeneity pertaining to its physical and chemical structures remain a big challenge. Tryptophan fluorescence has been proven to be a versatile tool to monitor protein tertiary structure. By modeling the tryptophan fluorescence emission envelope with log-normal distribution curves, the quantitative measure can be exercised for the routine characterization of monoclonal antibody overall tertiary structure. Furthermore, the log-normal deconvolution results can be presented as a two-dimensional plot with tryptophan emission bandwidth vs. emission maximum to enhance the resolution when comparing samples or as a function of applied perturbations. We demonstrate this by studying four different monoclonal antibodies, which show the distinction on emission bandwidth-maximum plot despite their similarity in overall amino acid sequences and tertiary structures. This strategy is also used to demonstrate the tertiary structure comparability between different lots manufactured for one of the monoclonal antibodies (mAb2). In addition, in the unfolding transition studies of mAb2 as a function of guanidine hydrochloride concentration, the evolution of the tertiary structure can be clearly traced in the emission bandwidth-maximum plot.

  17. Statin, testosterone and phosphodiesterase 5-inhibitor treatments and age related mortality in diabetes

    PubMed Central

    Hackett, Geoffrey; Jones, Peter W; Strange, Richard C; Ramachandran, Sudarshan

    2017-01-01

AIM To determine how statins, testosterone (T) replacement therapy (TRT) and phosphodiesterase 5-inhibitors (PDE5I) influence age related mortality in diabetic men. METHODS We studied 857 diabetic men screened for the BLAST study, stratifying them (mean follow-up = 3.8 years) into: (1) Normal T levels/untreated (total T > 12 nmol/L and free T > 0.25 nmol/L), Low T/untreated and Low T/treated; (2) PDE5I/untreated and PDE5I/treated; and (3) statin/untreated and statin/treated groups. The relationship between age and mortality, alone and with T/TRT, statin and PDE5I treatment, was studied using logistic regression. Mortality probability and 95%CI were calculated from the above models for each individual. RESULTS Age was associated with mortality (logistic regression, OR = 1.10, 95%CI: 1.08-1.13, P < 0.001). With all factors included, age (OR = 1.08, 95%CI: 1.06-1.11, P < 0.001), Low T/treated (OR = 0.38, 95%CI: 0.15-0.92, P = 0.033), PDE5I/treated (OR = 0.17, 95%CI: 0.053-0.56, P = 0.004) and statin/treated (OR = 0.59, 95%CI: 0.36-0.97, P = 0.038) were associated with lower mortality. Age related mortality was as described by Gompertz, with r2 = 0.881 when ln(mortality) was plotted against age. The probability of mortality and 95%CI (from logistic regression) for individuals treated/untreated with the drugs, alone and in combination, were plotted against age. Overlap of the 95%CI lines was evident with statins and TRT. No overlap was evident with PDE5I alone or combined with statins and TRT, suggesting a change in the relationship between age and mortality. CONCLUSION We show that statins, PDE5I and TRT reduce mortality in diabetes. PDE5I, alone and with the other treatments, significantly alters age related mortality in diabetic men. PMID:28344753

  18. Therapy operating characteristic curves: tools for precision chemotherapy

    PubMed Central

    Barrett, Harrison H.; Alberts, David S.; Woolfenden, James M.; Caucci, Luca; Hoppin, John W.

    2016-01-01

The therapy operating characteristic (TOC) curve, developed in the context of radiation therapy, is a plot of the probability of tumor control versus the probability of normal-tissue complications as the overall radiation dose level is varied, e.g., by varying the beam current in external-beam radiotherapy or the total injected activity in radionuclide therapy. This paper shows how TOC can be applied to chemotherapy with the administered drug dosage as the variable. The area under a TOC curve (AUTOC) can be used as a figure of merit for therapeutic efficacy, analogous to the area under an ROC curve (AUROC), which is a figure of merit for diagnostic efficacy. In radiation therapy, AUTOC can be computed for a single patient by using image data along with radiobiological models for tumor response and adverse side effects. The mathematical analogy between response of observers to images and the response of tumors to distributions of a chemotherapy drug is exploited to obtain linear discriminant functions from which AUTOC can be calculated. Methods for using mathematical models of drug delivery and tumor response with imaging data to estimate patient-specific parameters that are needed for calculation of AUTOC are outlined. The implications of this viewpoint for clinical trials are discussed. PMID:27175376
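The TOC construction can be illustrated numerically: sweep a dose level, trace (complication probability, tumor-control probability) pairs, and integrate. The logistic dose-response curves below are hypothetical placeholders, not models from the paper:

```python
import numpy as np

dose = np.linspace(0, 120, 2001)                  # swept overall dose levels
p_control = 1 / (1 + np.exp(-(dose - 60) / 6))    # assumed tumor-control curve
p_complic = 1 / (1 + np.exp(-(dose - 85) / 6))    # assumed complication curve

# AUTOC: trapezoidal area under the TOC curve (p_control against p_complic)
autoc = np.sum(0.5 * (p_control[1:] + p_control[:-1]) * np.diff(p_complic))
```

Because the assumed complication curve sits well to the right of the control curve, tumor control is nearly complete before complications set in and the AUTOC approaches 1; overlapping curves would drive it toward 0.5.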

  19. The Tail Exponent for Stock Returns in Bursa Malaysia for 2003-2008

    NASA Astrophysics Data System (ADS)

    Rusli, N. H.; Gopir, G.; Usang, M. D.

    2010-07-01

Econophysics is a developing discipline that applies mathematical tools commonly used in physical models to the study of financial systems. In this study, an analysis of the time series behavior of several blue chip and penny stock companies in the Main Market of Bursa Malaysia has been performed. The basic quantity used is the relative price change, also called the stock price return; the data are daily-sampled from the beginning of 2003 until the end of 2008, covering 1555 recorded trading days. The aim of this paper is to investigate the tail exponent of the distributions of blue chip and penny stock financial returns over this six-year period. Using a standard regression method, it is found that the distribution exhibits double scaling on the log-log plot of the cumulative probability of the normalized returns. We therefore calculate α separately for small-scale and large-scale returns. Based on the results obtained, the probability density functions of the absolute stock price returns show power-law behavior, P(z) ~ z^(-α), with exponents lying both inside and outside the Lévy stable regime, with values α > 2. All the results are discussed in detail.
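The regression estimate of a tail exponent from the log-log cumulative distribution can be sketched as follows; synthetic Pareto returns with a known α = 3 stand in for the actual Bursa Malaysia data:

```python
import numpy as np

rng = np.random.default_rng(1)
alpha = 3.0
z = rng.pareto(alpha, 50_000) + 1.0      # Pareto tail: P(Z > z) = z**-alpha

z_sorted = np.sort(z)
ccdf = 1.0 - np.arange(1, z_sorted.size + 1) / z_sorted.size  # empirical P(Z > z)

# regress log CCDF on log z over the tail, dropping the last points where
# the empirical CCDF approaches zero and its log becomes unstable
tail = (z_sorted > 2.0) & (ccdf > 1e-3)
slope, _ = np.polyfit(np.log(z_sorted[tail]), np.log(ccdf[tail]), 1)
alpha_hat = -slope
```

The fitted slope of the log-log CCDF recovers the tail exponent; applying the same fit separately below and above a crossover point would expose the double scaling reported in the paper.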

  20. Audio feature extraction using probability distribution function

    NASA Astrophysics Data System (ADS)

    Suhaib, A.; Wan, Khairunizam; Aziz, Azri A.; Hazry, D.; Razlan, Zuradzman M.; Shahriman A., B.

    2015-05-01

Voice recognition has been one of the popular applications in the robotics field. It is also known to be used recently in biometric and multimedia information retrieval systems. This technology is attained from successive research on audio feature extraction analysis. The Probability Distribution Function (PDF) is a statistical method which is usually used as one of the processes in complex feature extraction methods such as GMM and PCA. In this paper, a new method for audio feature extraction is proposed which uses only the PDF as a feature extraction method in itself for speech analysis purposes. Certain pre-processing techniques are performed prior to the proposed feature extraction method. Subsequently, the PDF result values for each frame of sampled voice signals obtained from a number of individuals are plotted. From the experimental results obtained, it can be seen visually from the plotted data that each individual's voice has characteristic PDF values and shapes.
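A minimal sketch of the idea of using the per-frame amplitude PDF itself as the feature vector; synthetic Gaussian "voices" with different spreads stand in for real recordings, and the frame and bin counts are arbitrary choices:

```python
import numpy as np

def frame_pdf(signal, frame_len=256, bins=32):
    """Split a signal into frames and return one normalized histogram
    (an empirical amplitude PDF) per frame as the feature matrix."""
    n_frames = len(signal) // frame_len
    frames = signal[: n_frames * frame_len].reshape(n_frames, frame_len)
    edges = np.linspace(-1.0, 1.0, bins + 1)
    return np.array([np.histogram(f, bins=edges, density=True)[0] for f in frames])

rng = np.random.default_rng(2)
voice_a = np.clip(0.2 * rng.standard_normal(4096), -1, 1)  # "speaker A": narrow PDF
voice_b = np.clip(0.6 * rng.standard_normal(4096), -1, 1)  # "speaker B": wide PDF
fa, fb = frame_pdf(voice_a), frame_pdf(voice_b)
```

Each row is a valid density (it integrates to one over the amplitude range), and the two simulated speakers separate visibly in the height of the central bins.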

  1. Estimation of descriptive statistics for multiply censored water quality data

    USGS Publications Warehouse

    Helsel, Dennis R.; Cohn, Timothy A.

    1988-01-01

    This paper extends the work of Gilliom and Helsel (1986) on procedures for estimating descriptive statistics of water quality data that contain “less than” observations. Previously, procedures were evaluated when only one detection limit was present. Here we investigate the performance of estimators for data that have multiple detection limits. Probability plotting and maximum likelihood methods perform substantially better than simple substitution procedures now commonly in use. Therefore simple substitution procedures (e.g., substitution of the detection limit) should be avoided. Probability plotting methods are more robust than maximum likelihood methods to misspecification of the parent distribution and their use should be encouraged in the typical situation where the parent distribution is unknown. When utilized correctly, less than values frequently contain nearly as much information for estimating population moments and quantiles as would the same observations had the detection limit been below them.
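A probability-plotting (regression-on-order-statistics) estimator for censored lognormal data can be sketched as follows. The single detection limit and Blom plotting positions are simplifying assumptions for illustration, whereas the paper treats multiple detection limits:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
true_mu, true_sigma = 1.0, 0.5
x = rng.lognormal(true_mu, true_sigma, 400)

dl = np.exp(true_mu - 0.5)               # single detection limit in the lower tail
detected = np.sort(x[x >= dl])
n, k = x.size, int((x < dl).sum())        # k censored "less than" observations

# plotting positions for the detected values, ranked within the full sample
pp = (np.arange(k + 1, n + 1) - 0.375) / (n + 0.25)   # Blom positions
z = stats.norm.ppf(pp)

# regress log(detected) on normal scores: slope ~ sigma, intercept ~ mu
sigma_hat, mu_hat = np.polyfit(z, np.log(detected), 1)
```

The censored observations still contribute through the ranks k+1..n assigned to the detected values, which is why "less than" values carry nearly as much information as uncensored ones for moment estimation.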

  2. Parsimonious nonstationary flood frequency analysis

    NASA Astrophysics Data System (ADS)

    Serago, Jake M.; Vogel, Richard M.

    2018-02-01

    There is now widespread awareness of the impact of anthropogenic influences on extreme floods (and droughts) and thus an increasing need for methods to account for such influences when estimating a frequency distribution. We introduce a parsimonious approach to nonstationary flood frequency analysis (NFFA) based on a bivariate regression equation which describes the relationship between annual maximum floods, x, and an exogenous variable which may explain the nonstationary behavior of x. The conditional mean, variance and skewness of both x and y = ln (x) are derived, and combined with numerous common probability distributions including the lognormal, generalized extreme value and log Pearson type III models, resulting in a very simple and general approach to NFFA. Our approach offers several advantages over existing approaches including: parsimony, ease of use, graphical display, prediction intervals, and opportunities for uncertainty analysis. We introduce nonstationary probability plots and document how such plots can be used to assess the improved goodness of fit associated with a NFFA.
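The bivariate-regression backbone of the approach can be sketched with simulated annual maxima; the covariate, the coefficients and the lognormal model choice below are illustrative assumptions, not values from the paper:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
years = np.arange(1950, 2020)
covariate = (years - 1950) * 0.01        # hypothetical exogenous index

# annual maxima with a trend in the log-mean: ln(x) = a + b*covariate + noise
a, b, s = 5.0, 1.2, 0.3
x = np.exp(a + b * covariate + rng.normal(0, s, years.size))

# fit the bivariate regression of y = ln(x) on the covariate
b_hat, a_hat = np.polyfit(covariate, np.log(x), 1)
resid = np.log(x) - (a_hat + b_hat * covariate)
s_hat = resid.std(ddof=2)

def q100(c):
    """Conditional 100-year flood (99th percentile) under the lognormal model."""
    return np.exp(a_hat + b_hat * c + stats.norm.ppf(0.99) * s_hat)
```

Evaluating `q100` at different covariate values gives the conditional quantiles that a nonstationary probability plot would display against the observed maxima.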

  3. Systematic design for trait introgression projects.

    PubMed

    Cameron, John N; Han, Ye; Wang, Lizhi; Beavis, William D

    2017-10-01

Using an Operations Research approach, we demonstrate the design of optimal trait introgression projects with respect to competing objectives. We demonstrate an innovative approach for designing Trait Introgression (TI) projects based on optimization principles from Operations Research. If the designs of TI projects are based on clear and measurable objectives, they can be translated into mathematical models with decision variables and constraints, which in turn yield Pareto optimality plots associated with any arbitrary selection strategy. The Pareto plots can be used to make rational decisions concerning the trade-offs between maximizing the probability of success and minimizing costs and time. The systematic rigor associated with a cost, time and probability of success (CTP) framework is well suited to designing TI projects that require dynamic decision making. The CTP framework also revealed that previously identified 'best' strategies can be improved to be at least twice as effective without increasing time or expenses.

  4. Incorporating big data into treatment plan evaluation: Development of statistical DVH metrics and visualization dashboards.

    PubMed

Mayo, Charles S; Yao, John; Eisbruch, Avraham; Balter, James M; Litzenberg, Dale W; Matuszak, Martha M; Kessler, Marc L; Weyburn, Grant; Anderson, Carlos J; Owen, Dawn; Jackson, William C; Ten Haken, Randall

    2017-01-01

    To develop statistical dose-volume histogram (DVH)-based metrics and a visualization method to quantify the comparison of treatment plans with historical experience and among different institutions. The descriptive statistical summary (ie, median, first and third quartiles, and 95% confidence intervals) of volume-normalized DVH curve sets of past experiences was visualized through the creation of statistical DVH plots. Detailed distribution parameters were calculated and stored in JavaScript Object Notation files to facilitate management, including transfer and potential multi-institutional comparisons. In the treatment plan evaluation, structure DVH curves were scored against computed statistical DVHs and weighted experience scores (WESs). Individual, clinically used, DVH-based metrics were integrated into a generalized evaluation metric (GEM) as a priority-weighted sum of normalized incomplete gamma functions. Historical treatment plans for 351 patients with head and neck cancer, 104 with prostate cancer who were treated with conventional fractionation, and 94 with liver cancer who were treated with stereotactic body radiation therapy were analyzed to demonstrate the usage of statistical DVH, WES, and GEM in a plan evaluation. A shareable dashboard plugin was created to display statistical DVHs and integrate GEM and WES scores into a clinical plan evaluation within the treatment planning system. Benchmarking with normal tissue complication probability scores was carried out to compare the behavior of GEM and WES scores. DVH curves from historical treatment plans were characterized and presented, with difficult-to-spare structures (ie, frequently compromised organs at risk) identified. Quantitative evaluations by GEM and/or WES compared favorably with the normal tissue complication probability Lyman-Kutcher-Burman model, transforming a set of discrete threshold-priority limits into a continuous model reflecting physician objectives and historical experience. 
Statistical DVH offers an easy-to-read, detailed, and comprehensive way to visualize the quantitative comparison with historical experiences and among institutions. WES and GEM metrics offer a flexible means of incorporating discrete threshold-prioritizations and historic context into a set of standardized scoring metrics. Together, they provide a practical approach for incorporating big data into clinical practice for treatment plan evaluations.
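The core of a statistical DVH, pointwise quantiles over a cohort of historical DVH curves, can be sketched as follows (the sigmoidal curves are simulated placeholders for real plan data):

```python
import numpy as np

rng = np.random.default_rng(5)
dose_axis = np.linspace(0, 70, 71)       # Gy

# historical cohort: each row is one plan's volume-vs-dose curve (fractional
# volume receiving >= dose), simulated here as shifted sigmoids
d50 = rng.normal(30, 4, 100)             # per-plan dose at 50% volume
cohort = 1.0 / (1.0 + np.exp((dose_axis - d50[:, None]) / 4.0))

# statistical DVH: pointwise median and interquartile band over the cohort
dvh_med = np.median(cohort, axis=0)
dvh_q1, dvh_q3 = np.percentile(cohort, [25, 75], axis=0)
```

Plotting a new plan's DVH against the median and quartile band is the visual comparison the dashboard provides; scoring where the new curve falls within the band is the idea behind the weighted experience score.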

  5. High-resolution Manometry and Globus: Comparison of Globus, Gastroesophageal Reflux Disease and Normal Controls Using High-resolution Manometry.

    PubMed

    Choi, Won Seok; Kim, Tae Wan; Kim, Ja Hyun; Lee, Sang Hyuk; Hur, Woon Je; Choe, Young Gil; Lee, Sang Hyuk; Park, Jung Ho; Sohn, Chong Il

    2013-10-01

Globus is a foreign body sensation in the throat without dysphagia, odynophagia, esophageal motility disorders, or gastroesophageal reflux. The etiology is unclear. Previous studies suggested that increased upper esophageal sphincter pressure, gastroesophageal reflux and hypertonicity of the esophageal body were possible etiologies. This study aimed to quantify the upper esophageal sphincter (UES) pressure, contractile front velocity (CFV), proximal contractile integral (PCI), distal contractile integral (DCI) and transition zone (TZ) in patients with globus, patients with gastroesophageal reflux disease (GERD) without globus, and normal controls, to suggest the correlation of specific high-resolution manometry (HRM) findings with globus. Fifty-seven globus patients, 24 GERD patients and 7 normal controls were studied with HRM since 2009. We reviewed the reports, selected 5 swallowing plots suitable for analysis in each report, and analyzed each individual plot with ManoView. The 5 parameters from each plot in the 57 globus patients were compared with those of the 24 GERD patients and 7 normal controls. There was no significant difference in the UES pressure, CFV, PCI and DCI. TZ (using a 30 mmHg isobaric contour) in globus showed a significant difference compared with normal controls and GERD patients. The median values of TZ were 4.26 cm (interquartile range [IQR], 2.30-5.85) in globus patients, 5.91 cm (IQR, 3.97-7.62) in GERD patients and 2.26 cm (IQR, 1.22-2.92) in normal controls (P = 0.001). HRM analysis suggested that UES pressure, CFV, PCI and DCI were not associated with globus. Instead, increased length of the TZ may be correlated with globus. Further studies comparing HRM results in a larger population of globus patients are needed to confirm this correlation.

  6. High-resolution Manometry and Globus: Comparison of Globus, Gastroesophageal Reflux Disease and Normal Controls Using High-resolution Manometry

    PubMed Central

    Choi, Won Seok; Kim, Tae Wan; Kim, Ja Hyun; Lee, Sang Hyuk; Hur, Woon Je; Choe, Young Gil; Lee, Sang Hyuk; Park, Jung Ho

    2013-01-01

Background/Aims Globus is a foreign body sensation in the throat without dysphagia, odynophagia, esophageal motility disorders, or gastroesophageal reflux. The etiology is unclear. Previous studies suggested that increased upper esophageal sphincter pressure, gastroesophageal reflux and hypertonicity of the esophageal body were possible etiologies. This study aimed to quantify the upper esophageal sphincter (UES) pressure, contractile front velocity (CFV), proximal contractile integral (PCI), distal contractile integral (DCI) and transition zone (TZ) in patients with globus, patients with gastroesophageal reflux disease (GERD) without globus, and normal controls, to suggest the correlation of specific high-resolution manometry (HRM) findings with globus. Methods Fifty-seven globus patients, 24 GERD patients and 7 normal controls were studied with HRM since 2009. We reviewed the reports, selected 5 swallowing plots suitable for analysis in each report, and analyzed each individual plot with ManoView. The 5 parameters from each plot in the 57 globus patients were compared with those of the 24 GERD patients and 7 normal controls. Results There was no significant difference in the UES pressure, CFV, PCI and DCI. TZ (using a 30 mmHg isobaric contour) in globus showed a significant difference compared with normal controls and GERD patients. The median values of TZ were 4.26 cm (interquartile range [IQR], 2.30-5.85) in globus patients, 5.91 cm (IQR, 3.97-7.62) in GERD patients and 2.26 cm (IQR, 1.22-2.92) in normal controls (P = 0.001). Conclusions HRM analysis suggested that UES pressure, CFV, PCI and DCI were not associated with globus. Instead, increased length of the TZ may be correlated with globus. Further studies comparing HRM results in a larger population of globus patients are needed to confirm this correlation. PMID:24199007

  7. Removing Parallax-Induced False Changes in Change Detection

    DTIC Science & Technology

    2014-03-27

Excerpts: Three hypothetical ROC curves, in which the probability of detection (PD) is plotted against the probability of false alarm (PFA); as the curves approach PD = 1 and PFA = 0, the detector performance is said to improve. Spectral bands subject to atmospheric absorption are commonly among those with low SNRs, as the gases and vapor in the atmosphere between the (airborne) sensor and the ground plane tend to…

  8. Cycles till failure of silver-zinc cells with competing failure modes: Preliminary data analysis

    NASA Technical Reports Server (NTRS)

    Sidik, S. M.; Leibecki, H. F.; Bozek, J. M.

    1980-01-01

One hundred twenty-nine cells were run through charge-discharge cycles until failure. The experiment design was a variant of a central composite factorial in five factors. Preliminary data analysis consisted of response surface estimation of life. Batteries fail under two basic modes: a low-voltage condition and an internal shorting condition. A competing failure modes analysis using maximum likelihood estimation for the extreme value life distribution was performed. Extensive diagnostics such as residual plotting and probability plotting were employed to verify data quality and choice of model.

  9. Cycles till failure of silver-zinc cells with competing failure modes - Preliminary data analysis

    NASA Technical Reports Server (NTRS)

    Sidik, S. M.; Leibecki, H. F.; Bozek, J. M.

    1980-01-01

The data analysis of cycles to failure of silver-zinc electrochemical cells with competing failure modes is presented. The test ran 129 cells through charge-discharge cycles until failure; preliminary data analysis consisted of a response surface estimate of life. Batteries fail through a low-voltage condition and an internal shorting condition; a competing failure modes analysis was made using maximum likelihood estimation for the extreme value life distribution. Extensive residual plotting and probability plotting were used to verify data quality and selection of the model.
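The probability-plotting step used in both reports to verify the model choice can be sketched for a Weibull life model (extreme-value in log time); the shape, scale and median-rank plotting positions are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(6)
shape, scale = 2.0, 300.0                 # assumed Weibull cycles-to-failure model
cycles = rng.weibull(shape, 500) * scale

t = np.sort(cycles)
F = (np.arange(1, t.size + 1) - 0.3) / (t.size + 0.4)   # median-rank positions

# on Weibull probability paper, ln(-ln(1-F)) versus ln(t) is a straight line
# with slope = shape and intercept = -shape * ln(scale)
slope, intercept = np.polyfit(np.log(t), np.log(-np.log(1.0 - F)), 1)
scale_hat = np.exp(-intercept / slope)
```

Departure of the plotted points from a straight line is the diagnostic: it signals either a wrong distributional choice or a mixture of failure modes, which is what the competing-modes analysis then separates.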

  10. An improved multiple linear regression and data analysis computer program package

    NASA Technical Reports Server (NTRS)

    Sidik, S. M.

    1972-01-01

    NEWRAP, an improved version of a previous multiple linear regression program called RAPIER, CREDUC, and CRSPLT, allows for a complete regression analysis including cross plots of the independent and dependent variables, correlation coefficients, regression coefficients, analysis of variance tables, t-statistics and their probability levels, rejection of independent variables, plots of residuals against the independent and dependent variables, and a canonical reduction of quadratic response functions useful in optimum seeking experimentation. A major improvement over RAPIER is that all regression calculations are done in double precision arithmetic.

  11. Comparison of pre-processing methods for multiplex bead-based immunoassays.

    PubMed

    Rausch, Tanja K; Schillert, Arne; Ziegler, Andreas; Lüking, Angelika; Zucht, Hans-Dieter; Schulz-Knappe, Peter

    2016-08-11

    High throughput protein expression studies can be performed using bead-based protein immunoassays, such as the Luminex® xMAP® technology. Technical variability is inherent to these experiments and may lead to systematic bias and reduced power. To reduce technical variability, data pre-processing is performed. However, no recommendations exist for the pre-processing of Luminex® xMAP® data. We compared 37 different data pre-processing combinations of transformation and normalization methods in 42 samples on 384 analytes obtained from a multiplex immunoassay based on the Luminex® xMAP® technology. We evaluated the performance of each pre-processing approach with 6 different performance criteria. Three performance criteria were plots. All plots were evaluated by 15 independent and blinded readers. Four different combinations of transformation and normalization methods performed well as pre-processing procedure for this bead-based protein immunoassay. The following combinations of transformation and normalization were suitable for pre-processing Luminex® xMAP® data in this study: weighted Box-Cox followed by quantile or robust spline normalization (rsn), asinh transformation followed by loess normalization and Box-Cox followed by rsn.
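Two of the better-performing building blocks, an asinh transformation followed by quantile normalization, can be sketched as follows. The random lognormal intensities and the artificial per-sample depth bias are a generic illustration, not the exact pipelines compared in the paper:

```python
import numpy as np

def quantile_normalize(mat):
    """Quantile-normalize the columns (samples) of `mat`: each column's sorted
    values are replaced by the row-wise mean of all columns' sorted values."""
    ranks = np.argsort(np.argsort(mat, axis=0), axis=0)
    ref = np.sort(mat, axis=0).mean(axis=1)
    return ref[ranks]

rng = np.random.default_rng(7)
raw = rng.lognormal(5, 1, (384, 3)) * np.array([1.0, 2.0, 0.5])  # depth bias

transformed = np.arcsinh(raw)            # asinh: log-like but defined at zero
norm = quantile_normalize(transformed)
```

After normalization every sample shares the same marginal distribution, so systematic intensity shifts between samples no longer masquerade as differential expression.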

  12. A normalized plot as a novel and time-saving tool in complex enzyme kinetic analysis.

    PubMed

    Bravo, I G; Busto, F; De Arriaga, D; Ferrero, M A; Rodríguez-Aparicio, L B; Martínez-Blanco, H; Reglero, A

    2001-09-15

A new data treatment is described for designing kinetic experiments and analysing kinetic results for multi-substrate enzymes. Normalized velocities are plotted against normalized substrate concentrations. Data are grouped into n + 1 families across the range of substrate or product tested, n being the number of substrates plus products assayed. It has the following advantages over traditional methods: (1) it reduces to less than a half the amount of data necessary for a proper description of the system; (2) it introduces a self-consistency checking parameter that ensures the 'scientific reliability' of the mathematical output; (3) it eliminates the need for prior knowledge of Vmax; (4) the normalization of data allows the use of robust and fuzzy methods suitable for managing really 'noisy' data; (5) it is appropriate for analysing complex systems, as the complete general equation is used, and the actual influence of effectors can be typified; (6) it is amenable to implementation as software that incorporates testing and selecting among rival kinetic models.

  13. Ballistic limit regression analysis for Space Station Freedom meteoroid and space debris protection system

    NASA Technical Reports Server (NTRS)

    Jolly, William H.

    1992-01-01

Relationships defining the ballistic limit of Space Station Freedom's (SSF) dual wall protection systems have been determined. These functions were regressed from empirical data found in Marshall Space Flight Center's (MSFC) Hypervelocity Impact Testing Summary (HITS) for the velocity range between three and seven kilometers per second. A stepwise linear least squares regression was used to determine the coefficients of several expressions that define a ballistic limit surface. Using statistical significance indicators and graphical comparisons to other limit curves, a final set of expressions is recommended for potential use in Probability of No Critical Flaw (PNCF) calculations for Space Station. The three equations listed below represent the mean curves for normal, 45 degree, and 65 degree obliquity ballistic limits, respectively, for a dual wall protection system consisting of a thin 6061-T6 aluminum bumper spaced 4.0 inches from a 0.125 inch thick 2219-T87 rear wall with multiple layer thermal insulation installed between the two walls. Normal obliquity: d_c = 1.0514 v^0.2983 t_1^0.5228. Forty-five degree obliquity: d_c = 0.8591 v^0.0428 t_1^0.2063. Sixty-five degree obliquity: d_c = 0.2824 v^0.1986 t_1^(-0.3874). Plots of these curves are provided. A sensitivity study on the effects of using these new equations in the probability of no critical flaw analysis indicated a negligible increase in the performance of the dual wall protection system for SSF over the current baseline. The magnitude of the increase was 0.17 percent over 25 years on the MB-7 configuration run with the Bumper II program code.
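The three regressed mean curves can be wrapped in a small helper for evaluating the critical particle diameter (assuming, as the regressions imply, v in km/s over the 3-7 km/s range and bumper thickness t_1 in inches):

```python
# ballistic-limit mean curves regressed in the report: d_c = A * v**p * t1**q
BALLISTIC_COEFS = {
    0:  (1.0514, 0.2983, 0.5228),    # normal obliquity
    45: (0.8591, 0.0428, 0.2063),    # 45-degree obliquity
    65: (0.2824, 0.1986, -0.3874),   # 65-degree obliquity
}

def d_crit(v_km_s, t1_in, obliquity_deg):
    """Critical particle diameter d_c on the regressed ballistic limit surface."""
    a, p, q = BALLISTIC_COEFS[obliquity_deg]
    return a * v_km_s**p * t1_in**q
```

Note the negative thickness exponent at 65 degrees: in this fit, a thicker bumper lowers the predicted critical diameter at high obliquity, the opposite of the normal-impact trend.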

  14. Ballistic limit regression analysis for Space Station Freedom meteoroid and space debris protection system

    NASA Astrophysics Data System (ADS)

    Jolly, William H.

    1992-05-01

Relationships defining the ballistic limit of Space Station Freedom's (SSF) dual wall protection systems have been determined. These functions were regressed from empirical data found in Marshall Space Flight Center's (MSFC) Hypervelocity Impact Testing Summary (HITS) for the velocity range between three and seven kilometers per second. A stepwise linear least squares regression was used to determine the coefficients of several expressions that define a ballistic limit surface. Using statistical significance indicators and graphical comparisons to other limit curves, a final set of expressions is recommended for potential use in Probability of No Critical Flaw (PNCF) calculations for Space Station. The three equations listed below represent the mean curves for normal, 45 degree, and 65 degree obliquity ballistic limits, respectively, for a dual wall protection system consisting of a thin 6061-T6 aluminum bumper spaced 4.0 inches from a 0.125 inch thick 2219-T87 rear wall with multiple layer thermal insulation installed between the two walls. Normal obliquity: d_c = 1.0514 v^0.2983 t_1^0.5228. Forty-five degree obliquity: d_c = 0.8591 v^0.0428 t_1^0.2063. Sixty-five degree obliquity: d_c = 0.2824 v^0.1986 t_1^(-0.3874). Plots of these curves are provided. A sensitivity study on the effects of using these new equations in the probability of no critical flaw analysis indicated a negligible increase in the performance of the dual wall protection system for SSF over the current baseline. The magnitude of the increase was 0.17 percent over 25 years on the MB-7 configuration run with the Bumper II program code.

  15. Rapid Classification of Landsat TM Imagery for Phase 1 Stratification Using the Automated NDVI Threshold Supervised Classification (ANTSC) Methodology

    Treesearch

    William H. Cooke; Dennis M. Jacobs

    2005-01-01

    FIA annual inventories require rapid updating of pixel-based Phase 1 estimates. Scientists at the Southern Research Station are developing an automated methodology that uses a Normalized Difference Vegetation Index (NDVI) for identifying and eliminating problem FIA plots from the analysis. Problem plots are those that have questionable land use/land cover information....

  16. Water use and water use efficiency after thinning in Aleppo pine plantation in Southwest of Valencia, Spain

    NASA Astrophysics Data System (ADS)

    Fernandes, Tarcísio José Gualberto; Damaso Del Campo, Antonio; Gonzáles-Sanchís, María

    2014-05-01

    Mediterranean forests require proactive adaptive silviculture in the face of global change, with water use (WU) and water-use efficiency (WUE) being key factors for forest managers. Thinning, as a silvicultural practice, can alter the water potential gradients between soil and canopy, thereby changing the amount of water used by trees. The aim of this study is to analyse the effects of adaptive silviculture on water use and water-use efficiency. To that end, both WU and WUE were measured in an Aleppo pine plantation where different thinning intensities had been applied. The experimental set-up consisted of four plots: three thinning treatments applied in 2008 at high, medium, and low intensity, plus an unthinned control plot. Additionally, a plot adjacent to the treatments, thinned at high intensity in 1998, was sampled to assess the longer-term effects of thinning. The plots are located southwest of Valencia, Spain. WU was measured in four trees per plot over the period April 2009 to May 2011 using HRM sap-flow sensors. WUE was estimated from carbon stable isotopes using a dendrochronological approach; stable isotope analysis was performed on the same trees used to measure sap flow. The rings analysed were those corresponding to the three years preceding the thinning and the years following the treatment. The results indicate that stand WU differed significantly (p<0.05) among treatments, being highest in the control plot, followed by the Low, Medium and Heavy treatments. At the tree level, however, average WU was highest in the Heavy treatment, and no significant differences were found between Low-treatment and control trees. The dendrochronological analyses showed general variability in ring width during initial growth (the first 15 years). In the following years, ring widths were very small, probably constrained by climate. Immediately after thinning, however, all treated trees showed a significant increase in ring width compared with the control. WUE showed different patterns in dry and wet years, and between thinned and control plots. The correlation between WU and WUE was higher in the thinned plots than in the control plot. Different patterns in the relationship between WUE and WU were found during 2009 and 2010: a positive slope in the plots thinned in 2008 (Low, Medium and Heavy), and a negative slope in the Heavy-1998 and control plots. In conclusion, thinning promotes increases in WU (tree transpiration), growth, and WUE alike, whereas in the control plot the increase in WU produced a decrease in WUE, probably reflecting the lower growth rate in that plot. This study clearly shows the impacts of thinning on forest growth, water use, and water-use efficiency. Some effects of thinning have been noted in other studies, but this study makes a novel contribution by relating WU to WUE in a Mediterranean Aleppo pine plantation.

  17. Application of the FINDER system to the search for epithermal vein gold-silver deposits : Kushikino, Japan, a case study

    USGS Publications Warehouse

    Singer, Donald A.; Kouda, Ryoichi

    1991-01-01

    The FINDER system employs geometric probability, Bayesian statistics, and the normal probability density function to integrate spatial and frequency information into a map of probabilities of target centers. Target centers can be mineral deposits, alteration associated with mineral deposits, or any other target that can be represented by a regular shape on a two-dimensional map. The size, shape, mean, and standard deviation for each variable are characterized in a control area and the results applied by means of FINDER to the study area. The Kushikino deposit consists of groups of quartz-calcite-adularia veins that have produced 55 tonnes of gold and 456 tonnes of silver since 1660. Part of a 6 by 10 km area near Kushikino served as a control area. Within the control area, data plotting, contouring, and cluster analysis were used to identify the barren and mineralized populations. Sodium was found to be depleted in an elliptical area 3.1 by 1.6 km, potassium was both depleted and enriched locally in an elliptical area 3.0 by 1.3 km, and sulfur was enriched in an elliptical area 5.8 by 1.6 km. The potassium, sodium, and sulfur contents from 233 surface rock samples were each used in FINDER to produce probability maps for the 12 by 30 km study area, which includes Kushikino. High-probability areas for each of the individual variables lie over the main Kushikino veins and are offset up to 4 km eastward from them. In general, high-probability areas identified by FINDER are displaced from the main veins and cover not only the host andesite and the dacite-andesite that is about the same age as the Kushikino mineralization, but also younger sedimentary rocks, andesite, and tuff units east and northeast of Kushikino.
The maps also display the same patterns observed near Kushikino, but with somewhat lower probabilities, about 1.5 km east of the old gold prospect, Hajima, and in a broad zone 2.5 km east-west and 1 km north-south, centered 2 km west of the old gold prospect, Yaeyama.

  18. Ancient human disturbances may be skewing our understanding of Amazonian forests.

    PubMed

    McMichael, Crystal N H; Matthews-Bird, Frazer; Farfan-Rios, William; Feeley, Kenneth J

    2017-01-17

    Although the Amazon rainforest houses much of Earth's biodiversity and plays a major role in the global carbon budget, estimates of tree biodiversity originate from fewer than 1,000 forest inventory plots, and estimates of carbon dynamics are derived from fewer than 200 recensus plots. It is well documented that the pre-European inhabitants of Amazonia actively transformed and modified the forest in many regions before their population collapse around 1491 AD; however, the impacts of these ancient disturbances remain entirely unaccounted for in the many highly influential studies using Amazonian forest plots. Here we examine whether Amazonian forest inventory plot locations are spatially biased toward areas with high probability of ancient human impacts. Our analyses reveal that forest inventory plots, and especially forest recensus plots, in all regions of Amazonia are located disproportionately near archaeological evidence and in areas likely to have ancient human impacts. Furthermore, regions of the Amazon that are relatively oversampled with inventory plots also contain the highest values of predicted ancient human impacts. Given the long lifespan of Amazonian trees, many forest inventory and recensus sites may still be recovering from past disturbances, potentially skewing our interpretations of forest dynamics and our understanding of how these forests are responding to global change. Empirical data on the human history of forest inventory sites are crucial for determining how past disturbances affect modern patterns of forest composition and carbon flux in Amazonian forests.

  19. Mineral exploration with ERTS imagery. [Colorado

    NASA Technical Reports Server (NTRS)

    Nicolais, S. M.

    1974-01-01

    Ten potential target areas for metallic mineral exploration were selected on the basis of a photo-lineament interpretation of the ERTS image 1172-17141 in central Colorado. An evaluation of bias indicated that prior geologic knowledge of the region had little, if any, effect on target selection. In addition, a contoured plot of the frequency of photo-lineament intersections was made to determine what relationships exist between the photo-lineaments and mineral districts. Comparison of this plot with a plot of the mineral districts indicates that areas with a high frequency of intersections commonly coincide with known mineral districts. The results of this experiment suggest that photo-lineaments are fractures or fracture-controlled features, and their distribution may be a guide to metallic mineral deposits in Colorado, and probably other areas as well.

  20. Variable life-adjusted display (VLAD) for hip fracture patients: a prospective trial.

    PubMed

    Williams, H; Gwyn, R; Smith, A; Dramis, A; Lewis, J

    2015-08-01

    With restructuring within the NHS, there is increased public and media interest in surgical outcomes. The Nottingham Hip Fracture Score (NHFS) is a well-validated tool for predicting 30-day mortality in hip fractures. VLAD provides a real-time visual plot of the difference between the cumulative expected mortality and the actual deaths occurring. Survivors are incorporated as a positive value equal to 1 minus the probability of survival, and deaths as a negative value equal to the probability of survival. Downward deflections indicate mortality and potentially suboptimal care. We prospectively included every hip fracture admitted to UHW that underwent surgery from January to August 2014. The NHFS was then calculated and predicted survival identified. A VLAD plot was then produced comparing the predicted with the actual 30-day mortality. Two hundred and seventy-seven patients completed the 30-day follow-up, and initial results showed that the actual 30-day mortality (7.2%) was lower than that predicted by the NHFS (8.0%). This was reflected by a positive trend on the VLAD plot. Variable life-adjusted display provides an easy-to-use graphical representation of risk-adjusted survival over time and can act as an "early warning" system to identify trends in mortality for hip fractures.
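    The accumulation rule described above (survivors add the predicted probability of death; deaths subtract the predicted probability of survival) can be sketched in a few lines. This is an illustration of the general VLAD construction, not the authors' code, and the example probabilities are made up.

```python
from itertools import accumulate

def vlad(predicted_death_probs, died):
    """Cumulative VLAD statistic after each consecutive case.

    A survivor contributes +p (1 minus the predicted probability of
    survival); a death contributes p - 1 (minus the predicted
    probability of survival), so runs of deaths deflect the curve
    downward.
    """
    increments = (p if not d else p - 1.0
                  for p, d in zip(predicted_death_probs, died))
    return list(accumulate(increments))

# Three cases, each with a 10% NHFS-style predicted mortality;
# the third patient dies.
print(vlad([0.1, 0.1, 0.1], [False, False, True]))
```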

  1. Spatial variability in cost and success of revegetation in a Wyoming big sagebrush community.

    PubMed

    Boyd, Chad S; Davies, Kirk W

    2012-09-01

    The ecological integrity of the Wyoming big sagebrush (Artemisia tridentata Nutt. ssp. wyomingensis Beetle and A. Young) alliance is being severely interrupted by post-fire invasion of non-native annual grasses. To curtail this invasion, successful post-fire revegetation with perennial grasses is required. Environmental factors impacting post-fire restoration success vary across space within the Wyoming big sagebrush alliance; however, most restorative management practices are applied uniformly. Our objectives were to define the probability of revegetation success over space using relevant soil-related environmental factors, use this information to model the cost of successful revegetation, and compare the importance of vegetation competition and soil factors to revegetation success. We studied a burned Wyoming big sagebrush landscape in southeast Oregon that was reseeded with perennial grasses. We collected soil and vegetation data at plots spaced at 30 m intervals along a 1.5 km transect in the first two years post-burn. Plots were classified as successful (>5 seedlings/m²) or unsuccessful based on density of seeded species. Using logistic regression, we found that abundance of competing vegetation correctly predicted revegetation success on 51% of plots, while soil-related variables correctly predicted revegetation performance on 82.4% of plots. Estimated costs of successful revegetation varied from $167.06 to $43,033.94/ha across the 1.5 km transect based on probability of success, but were more homogeneous at larger scales. Our experimental protocol provides managers with a technique to identify important environmental drivers of restoration success, and this process will be of value for spatially allocating logistical and capital expenditures in a variable restoration environment.
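    As a rough sketch of the classification step, here is a minimal logistic regression fitted by Newton-Raphson to simulated plot data. The predictor names (`soil_depth`, `herb_cover`) are hypothetical stand-ins for the soil and vegetation variables actually measured; nothing here reproduces the study's data.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 400
# Hypothetical predictors standing in for the measured variables.
soil_depth = rng.normal(size=n)
herb_cover = rng.normal(size=n)
true_logit = 1.5 * soil_depth - 1.0 * herb_cover
success = rng.random(n) < 1.0 / (1.0 + np.exp(-true_logit))  # >5 seedlings/m2

# Fit logistic regression by Newton-Raphson (iteratively reweighted LS).
X = np.column_stack([np.ones(n), soil_depth, herb_cover])
beta = np.zeros(3)
for _ in range(25):
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    gradient = X.T @ (success - p)
    hessian = (X * (p * (1 - p))[:, None]).T @ X
    beta += np.linalg.solve(hessian, gradient)

predicted = (1.0 / (1.0 + np.exp(-X @ beta))) > 0.5
print((predicted == success).mean())  # fraction of plots classified correctly
```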

  2. Effect of heliotropism on the bidirectional reflectance of irrigated cotton

    NASA Technical Reports Server (NTRS)

    Schutt, J. B.; Kimes, D. S.; Newcomb, W. W.

    1985-01-01

    The dynamic behavior of cotton leaves is described using gyroscopic coordinates. Angular movements represented as pitching, rolling, and yawing are used to follow the movement of leaf normals and their instantaneous relationships to the sun on an individual basis. A sensitivity analysis establishes that the angle between a leaf normal and the sun is most affected by changes in pitch and roll. Plots of the phase angle gamma averaged by quadrant show the pronounced heliotropic behavior of cotton leaves. Plots of pitch versus roll averaged by quadrant demonstrate the differential behavior of cotton leaves relative to the position of the sun. These results are used to interpret sections taken from bidirectional reflectance curves obtained using 0.57-0.69 micron band in terms of the evolution of gamma from sunrise until noon. The measured and experimental values of gamma are in reasonable agreement. Forescattered and backscattered exitances are observed to have distinct leaf normal directions.

  3. Statistical Analysis of an Infrared Thermography Inspection of Reinforced Carbon-Carbon

    NASA Technical Reports Server (NTRS)

    Comeaux, Kayla

    2011-01-01

    Each piece of flight hardware being used on the shuttle must be analyzed and pass NASA requirements before the shuttle is ready for launch. One tool used to detect cracks that lie within flight hardware is Infrared Flash Thermography. This is a non-destructive testing technique which uses an intense flash of light to heat up the surface of a material after which an Infrared camera is used to record the cooling of the material. Since cracks within the material obstruct the natural heat flow through the material, they are visible when viewing the data from the Infrared camera. We used Ecotherm, a software program, to collect data pertaining to the delaminations and analyzed the data using Ecotherm and University of Dayton Log Logistic Probability of Detection (POD) Software. The goal was to reproduce the statistical analysis produced by the University of Dayton software, by using scatter plots, log transforms, and residuals to test the assumption of normality for the residuals.

  4. Failure Maps for Rectangular 17-4PH Stainless Steel Sandwiched Foam Panels

    NASA Technical Reports Server (NTRS)

    Raj, S. V.; Ghosn, L. J.

    2007-01-01

    A new and innovative concept is proposed for designing lightweight fan blades for aircraft engines using commercially available 17-4PH precipitation-hardened stainless steel. Rotating fan blades in aircraft engines experience a complex loading state consisting of combinations of centrifugal, distributed pressure, and torsional loads. Theoretical plastic collapse failure maps, plotting foam relative density against face sheet thickness, t, normalized by the fan blade span length, L, have been generated for rectangular 17-4PH sandwiched foam panels under these three loading modes, assuming three plastic collapse failure modes. These maps show that the 17-4PH sandwiched foam panels can fail by yielding of the face sheets, yielding of the foam core, or wrinkling of the face sheets, depending on the foam relative density, the magnitude of t/L, and the loading mode. The design envelope of a generic fan blade is superimposed on the maps to provide valuable insight into the probable failure modes in a sandwiched foam fan blade.

  5. Glycemic control in diabetes in three Danish counties.

    PubMed

    Jørgensen, Lone G M; Petersen, Per Hyltoft; Heickendorff, Lene; Møller, Holger Jon; Hendel, Jørn; Christensen, Cramer; Schmitz, Anita; Reinholdt, Birgitte; Lund, Erik D; Christensen, Niels J; Hansen, Erik Kjaersgaard; Hastrup, Jens; Skjødt, Hanne; Eriksen, Ebbe Wendel; Brandslund, Ivan

    2005-01-01

    Hemoglobin A1c (HbA1c) is a proxy measure for glycemic control in diabetes. We investigated the trend in glycemic control in patients from three Danish counties using HbA1c measurements. We studied 2454 patients from a population of 807,000 inhabitants for whom routine monitoring of diabetes using DCCT-aligned HbA1c was initiated in 2001. We estimated the incidence of monitored patients in the population. Progress in patients with initially diabetic HbA1c levels was investigated by cumulative probability plots, and the individual trend in clinical outcome by a modified difference plot. The age-standardized incidence of monitored patients was <0.5% in all regions. Patients with diabetic first HbA1c concentrations (≥6.62% HbA1c) showed on average 15% improved glycemic control in the first year; further improvement was limited. The overall percentage above the treatment target (≥6.62% HbA1c) was 51% in 2003 compared to 59% in 2001, and the percentage with poor glycemic control (≥10.0% HbA1c) was reduced from 19% to 4%. Of patients with initially diabetic HbA1c levels, 15% showed progress in glycemic control, and 28% reached treatment targets. In patients with initially normal HbA1c, 75% showed an upward trend in HbA1c levels, which reached diabetic concentrations in 17%. In individual patients, 75% with initially diabetic HbA1c levels showed improved glycemic control after 3 years, while 78% with initially normal concentrations showed an upward trend in HbA1c levels.

  6. Performance of optimized McRAPD in identification of 9 yeast species frequently isolated from patient samples: potential for automation.

    PubMed

    Trtkova, Jitka; Pavlicek, Petr; Ruskova, Lenka; Hamal, Petr; Koukalova, Dagmar; Raclavsky, Vladislav

    2009-11-10

    Rapid, easy, economical, and accurate species identification of yeasts isolated from clinical samples remains an important challenge for routine microbiological laboratories, because susceptibility to antifungal agents, probability of developing resistance, and ability to cause disease vary among species. To overcome the drawbacks of the currently available techniques, we have recently proposed an innovative approach to yeast species identification based on RAPD genotyping, termed McRAPD (Melting curve of RAPD). Here we have evaluated its performance on a broader spectrum of clinically relevant yeast species and also examined the potential of automated and semi-automated interpretation of McRAPD data for yeast species identification. A simple fully automated algorithm based on normalized melting data identified 80% of the isolates correctly. When this algorithm was supplemented by semi-automated matching of decisive peaks in first-derivative plots, 87% of the isolates were identified correctly. However, computer-aided visual matching of derivative plots showed the best performance, with an average of 98.3% of isolates identified accurately, almost matching the 99.4% performance of traditional RAPD fingerprinting. Since the McRAPD technique omits gel electrophoresis and can be performed in a rapid, economical, and convenient way, we believe that it can find its place in routine identification of medically important yeasts in advanced diagnostic laboratories that are able to adopt this technique. It can also serve as a broad-range high-throughput technique for epidemiological surveillance.

  7. Numerical characteristics of recurrence plots as applied to the evaluation of mechanical damage in materials

    NASA Astrophysics Data System (ADS)

    Hilarov, V. L.

    2017-09-01

    The response of a material with a random uniform distribution of pores to a sound impulse was studied. The behavior of the numerical characteristics of the recurrence plots (RP) of the normal displacement vector component depending on the degree of damage was investigated. It was shown that the recurrence quantification analysis (RQA) parameters could be very informative for sonic fault detection.

  8. Stable isotope, chemical, and mineral compositions of the Middle Proterozoic Lijiaying Mn deposit, Shaanxi Province, China

    USGS Publications Warehouse

    Yeh, Hsueh-Wen; Hein, James R.; Ye, Jie; Fan, Delian

    1999-01-01

    The Lijiaying Mn deposit, located about 250 km southwest of Xian, is a high-quality ore characterized by low P and Fe contents and a mean Mn content of about 23%. The ore deposit occurs in shallow-water marine sedimentary rocks of probable Middle Proterozoic age. Carbonate minerals in the ore deposit include kutnahorite, calcite, Mn calcite, and Mg calcite. Carbon (−0.4 to −4.0‰) and oxygen (−3.7 to −12.9‰) isotopes show that, with a few exceptions, those carbonate minerals are not pristine low-temperature marine precipitates. All samples are depleted in rare earth elements (REEs) relative to shale and have negative Eu and positive Ce anomalies on chondrite-normalized plots. The Fe/Mn ratios of representative ore samples range from about 0.034 to <0.008 and P/Mn from 0.0023 to <0.001. Based on mineralogical data, the low ends of those ranges of ratios are probably close to ratios for the pure Mn minerals. Manganese contents have a strong positive correlation with Ce anomaly values and a moderate correlation with total REE contents. Compositional data indicate that kutnahorite is a metamorphic mineral and that most calcites formed as low-temperature marine carbonates that were subsequently metamorphosed. The braunite ore precursor mineral was probably a Mn oxyhydroxide, similar to those that formed on the deep ocean-floor during the Cenozoic. Because the Lijiaying precursor mineral formed in a shallow-water marine environment, the atmospheric oxygen content during the Middle Proterozoic may have been lower than it has been during the Cenozoic.

  9. Modelling spruce bark beetle infestation probability

    Treesearch

    Paulius Zolubas; Jose Negron; A. Steven Munson

    2009-01-01

    Spruce bark beetle (Ips typographus L.) risk model, based on pure Norway spruce (Picea abies Karst.) stand characteristics in experimental and control plots was developed using classification and regression tree statistical technique under endemic pest population density. The most significant variable in spruce bark beetle...

  10. Salmonberry and salal annual aerial stem production: The maintenance of shrub cover in forest stands

    USGS Publications Warehouse

    Tappeiner, J. C.; Zasada, J.; Huffman, D.; Ganio, L.

    2001-01-01

    Annual sprouting of aerial stems and ramets enables populations of salmonberry (Rubus spectabilis Pursh), salal (Gaultheria shallon Pursh), and probably other forest shrubs to maintain dense covers (>20,000 stems/ha). We studied annual stem production of salmonberry on cut (all stems cut within 15 cm of the ground) and uncut (stems not treated) plots for 8 years, and of salal for 5 years, in the understories of Douglas-fir (Pseudotsuga menziesii (Mirb.) Franco), alder, and riparian stands, as well as clearcuts, which are all common stand types in western Oregon. Mean salmonberry stem production on uncut plots ranged from 4.7 stems·m⁻²·year⁻¹ (95% CI 2.9–7.4) in alder stands and clearcuts to 1.6 stems·m⁻²·year⁻¹ (95% CI 1.0–2.6) in conifer stands. Mean salal production was greater, ranging from 58 stems·m⁻²·year⁻¹ (95% CI 25–135) to 8.6 stems·m⁻²·year⁻¹ (95% CI 3.7–20.1) on uncut plots in clearcuts and unthinned Douglas-fir stands, respectively. Annual production of both species was somewhat greater on cut plots. Most stems produced in early spring die by December, but enough are recruited to replace mortality of older stems. Stem density was maintained for 8 years for salmonberry and 5 years for salal on both cut and uncut plots. Based on rhizome length and bud density, we estimate that only 1–5% of the buds in the rhizomes are needed to support this annual stem production. Although these species sprout vigorously after their aerial stems are killed, disturbance is not necessary for maintaining a dense cover. It appears that, once established, salal, salmonberry, and probably other clonal forest shrubs can maintain a dense cover that can interfere with establishment of trees and other shrubs in canopy gaps or other openings.

  11. Evaluation of Visual Field Test Parameters after Artificial Tear Administration in Patients with Glaucoma and Dry Eye.

    PubMed

    Özyol, Pelin; Özyol, Erhan; Karalezli, Aylin

    2018-01-01

    To examine the effect of a single dose of artificial tear administration on automated visual field (VF) testing in patients with glaucoma and dry eye syndrome. A total of 35 patients with primary open-angle glaucoma, experienced in VF testing and with symptoms of dry eye, were enrolled in this study. At the first visit, standard VF testing was performed. At the second and third visits, with an interval of one week, the left eyes served as controls while one drop of artificial tear was administered to each patient's right eye, and VF testing was then performed again. The reliability parameters, VF indices, number of depressed points at the probability levels of the pattern deviation plots, and test times were compared between visits. No significant difference was observed in any VF testing parameter of the control eyes (P>0.05). In the eyes given artificial tears, significant improvement was observed in test duration, mean deviation, and the number of depressed points at probability levels (P<0.5%, P<1%, P<2%) of the pattern deviation plots (P<0.05). The post-hoc test revealed that artificial tear administration elicited an improvement in test duration, mean deviation, and the number of depressed points at probability levels (P<0.5%, P<1%, P<2%) of the pattern deviation plots from the first visit to the second and third visits (P<0.01 for all comparisons). The intraclass correlation coefficient for the three VF test indices was found to be between 0.735 and 0.85 (P<0.001 for all). A single dose of artificial tears administered immediately before VF testing appears to improve test results and decrease test time.

  12. Quantification of the vertical translocation rate of soil solid-phase material by the magnetic tracer method

    NASA Astrophysics Data System (ADS)

    Zhidkin, A. P.; Gennadiev, A. N.

    2016-07-01

    Approaches to the quantification of the vertical translocation rate of soil solid-phase material by the magnetic tracer method have been developed; the tracer penetration depth and rate have been determined, as well as the radial distribution of the tracer, in chernozems (Chernozems) and dark gray forest soils (Luvisols) of Belgorod oblast under natural steppe and forest vegetation and in arable lands under agricultural use of different durations. It was found that the penetration depth of spherical magnetic particles (SMPs) over their 150-year occurrence in soils of the forest plot is 68 cm under forest, 58 cm on a 100-year-old plowland, and only 49 cm on a 150-year-old plowland. In the chernozems of the steppe plot, the penetration depth of SMPs exceeds the studied depth of 70 cm both under natural vegetation and on the plowlands. The penetration rates of SMPs deep into the soil vary significantly among the key plots: 0.92-1.32 mm/year on the forest plot and 1.47-1.63 mm/year on the steppe plot, probably because of the more active recent turbation activity of soil animals.

  13. Constructing inverse probability weights for continuous exposures: a comparison of methods.

    PubMed

    Naimi, Ashley I; Moodie, Erica E M; Auger, Nathalie; Kaufman, Jay S

    2014-03-01

    Inverse probability-weighted marginal structural models with binary exposures are common in epidemiology. Constructing inverse probability weights for a continuous exposure can be complicated by the presence of outliers, and the need to identify a parametric form for the exposure and account for nonconstant exposure variance. We explored the performance of various methods to construct inverse probability weights for continuous exposures using Monte Carlo simulation. We generated two continuous exposures and binary outcomes using data sampled from a large empirical cohort. The first exposure followed a normal distribution with homoscedastic variance. The second exposure followed a contaminated Poisson distribution, with heteroscedastic variance equal to the conditional mean. We assessed six methods to construct inverse probability weights using: a normal distribution, a normal distribution with heteroscedastic variance, a truncated normal distribution with heteroscedastic variance, a gamma distribution, a t distribution (1, 3, and 5 degrees of freedom), and a quantile binning approach (based on 10, 15, and 20 exposure categories). We estimated the marginal odds ratio for a single-unit increase in each simulated exposure in a regression model weighted by the inverse probability weights constructed using each approach, and then computed the bias and mean squared error for each method. For the homoscedastic exposure, the standard normal, gamma, and quantile binning approaches performed best. For the heteroscedastic exposure, the quantile binning, gamma, and heteroscedastic normal approaches performed best. Our results suggest that the quantile binning approach is a simple and versatile way to construct inverse probability weights for continuous exposures.
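    For the simplest of the six candidate methods, the plain normal-density approach with stabilized weights can be sketched as below. The data-generating step is a toy stand-in (one confounder `L`, homoscedastic exposure `A`), not the authors' simulation design.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
L = rng.normal(size=n)             # a single confounder (illustrative)
A = 0.5 * L + rng.normal(size=n)   # continuous exposure, homoscedastic

def normal_pdf(x, mean, sd):
    return np.exp(-0.5 * ((x - mean) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

# Denominator: normal density of A given L, from a linear regression fit.
coef = np.polyfit(L, A, 1)
fitted = np.polyval(coef, L)
dens_cond = normal_pdf(A, fitted, (A - fitted).std(ddof=2))

# Numerator: marginal normal density of A (stabilizes the weights).
dens_marg = normal_pdf(A, A.mean(), A.std(ddof=1))

stabilized_weights = dens_marg / dens_cond
print(stabilized_weights.mean())   # stabilized weights should average near 1
```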

  14. Confidence regions of planar cardiac vectors

    NASA Technical Reports Server (NTRS)

    Dubin, S.; Herr, A.; Hunt, P.

    1980-01-01

    A method for plotting the confidence regions of vectorial data obtained in electrocardiology is presented. The 90%, 95% and 99% confidence regions of cardiac vectors represented in a plane are obtained in the form of an ellipse centered at coordinates corresponding to the means of a sample selected at random from a bivariate normal distribution. An example of such a plot for the frontal plane QRS mean electrical axis for 80 horses is also presented.
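    One standard way to construct such an ellipse, from the sample mean and covariance and a chi-square quantile with 2 degrees of freedom, is sketched below. This follows the usual large-sample bivariate normal construction, not necessarily the exact method of the paper (which may apply a small-sample correction).

```python
import numpy as np

def confidence_ellipse(points, chi2_quantile=5.991):
    """Centre, semi-axis lengths, and major-axis angle (radians) of the
    confidence ellipse for a bivariate sample (an n x 2 array).

    chi2 quantiles with 2 degrees of freedom:
    4.605 -> 90%, 5.991 -> 95%, 9.210 -> 99%.
    """
    centre = points.mean(axis=0)
    cov = np.cov(points, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)        # ascending eigenvalues
    semi_axes = np.sqrt(eigvals * chi2_quantile)  # minor, major
    angle = np.arctan2(eigvecs[1, 1], eigvecs[0, 1])
    return centre, semi_axes, angle

# Toy bivariate sample standing in for planar QRS axis measurements.
rng = np.random.default_rng(1)
sample = rng.multivariate_normal([0.0, 0.0], [[2.0, 0.5], [0.5, 1.0]], 500)
print(confidence_ellipse(sample))
```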

  15. Truck-mounted Area-wide Application of Pyriproxyfen Targeting Aedes aegypti and Aedes albopictus in Northeast Florida

    DTIC Science & Technology

    2014-12-01

    the time and the lower rate of pyriproxyfen applied. Within 2 wk following Spray 2, however, Ae. albopictus collections from the treatment plot...districts/public health agencies. Pyriproxyfen, a juvenile hormone mimic, functions as an insect growth regulator (IGR) preventing normal development of...

  16. Gene Expression Profiling of Monkeypox Virus-Infected Cells Reveals Novel Interfaces for Host-Virus Interactions

    DTIC Science & Technology

    2010-07-28

    expression is plotted on the Y-axis after normalization to mock-treated samples. Results plotted to compare calculated fold change in expression of each gene ...RESEARCH Open Access Gene expression profiling of monkeypox virus-infected cells reveals novel interfaces for host-virus interactions Abdulnaser...suppress antiviral cell defenses, exploit host cell machinery, and delay infection-induced cell death. However, a comprehensive study of all host genes

  17. Effects of ungulate disturbance and weather variation on Pediocactus winkleri: insights from long-term monitoring

    USGS Publications Warehouse

    Clark, Deborah J.; Clark, Thomas O.; Duniway, Michael C.; Flagg, Cody B.

    2015-01-01

    Population dynamics and effects of large ungulate disturbances on Winkler cactus (Pediocactus winkleri K.D. Heil) were documented annually over a 20-year time span at one plot within Capitol Reef National Park, Utah. This cactus species was federally listed as threatened in 1998. The study began in 1995 to gain a better understanding of life history aspects and threats to this species. Data were collected annually in early spring and included diameter, condition, reproductive structures, mortality, recruitment, and disturbance by large ungulates. We used odds ratio and probability model analyses to determine effects of large ungulate trampling and weather on these cacti. During the study, plot population declined by 18%, with trampling of cactus, low precipitation, and cold spring temperatures implicated as causal factors. Precipitation and temperature affected flowering, mortality, and recruitment. Large ungulate disturbances increased mortality and reduced the probability of flowering. These results suggest that large ungulate disturbances and recent climate regimes have had an adverse impact on long-term persistence of this cactus.

  18. Non-Deterministic Dynamic Instability of Composite Shells

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Abumeri, Galib H.

    2004-01-01

    A computationally effective method is described to evaluate the non-deterministic dynamic instability (probabilistic dynamic buckling) of thin composite shells. The method is a judicious combination of available computer codes for finite element, composite mechanics, and probabilistic structural analysis. The solution method is incrementally updated Lagrangian. It is illustrated by applying it to a thin composite cylindrical shell subjected to dynamic loads. Both deterministic and probabilistic buckling loads are evaluated to demonstrate the effectiveness of the method. A universal plot is obtained for the specific shell that can be used to approximate buckling loads for different load rates and different probability levels. Results from this plot show that the faster the rate, the higher the buckling load and the shorter the time. The lower the probability, the lower the buckling load for a specific time. Probabilistic sensitivity results show that the ply thickness, the fiber volume ratio, the fiber longitudinal modulus, dynamic load, and loading rate are the dominant uncertainties, in that order.

  19. Dynamic Probabilistic Instability of Composite Structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2009-01-01

    A computationally effective method is described to evaluate the non-deterministic dynamic instability (probabilistic dynamic buckling) of thin composite shells. The method is a judicious combination of available computer codes for finite element, composite mechanics and probabilistic structural analysis. The solution method is incrementally updated Lagrangian. It is illustrated by applying it to a thin composite cylindrical shell subjected to dynamic loads. Both deterministic and probabilistic buckling loads are evaluated to demonstrate the effectiveness of the method. A universal plot is obtained for the specific shell that can be used to approximate buckling loads for different load rates and different probability levels. Results from this plot show that the faster the rate, the higher the buckling load and the shorter the time. The lower the probability, the lower the buckling load for a specific time. Probabilistic sensitivity results show that the ply thickness, the fiber volume ratio, the fiber longitudinal modulus, dynamic load, and loading rate are the dominant uncertainties, in that order.

  20. Ancient human disturbances may be skewing our understanding of Amazonian forests

    PubMed Central

    McMichael, Crystal N. H.; Matthews-Bird, Frazer; Farfan-Rios, William; Feeley, Kenneth J.

    2017-01-01

    Although the Amazon rainforest houses much of Earth’s biodiversity and plays a major role in the global carbon budget, estimates of tree biodiversity originate from fewer than 1,000 forest inventory plots, and estimates of carbon dynamics are derived from fewer than 200 recensus plots. It is well documented that the pre-European inhabitants of Amazonia actively transformed and modified the forest in many regions before their population collapse around 1491 AD; however, the impacts of these ancient disturbances remain entirely unaccounted for in the many highly influential studies using Amazonian forest plots. Here we examine whether Amazonian forest inventory plot locations are spatially biased toward areas with high probability of ancient human impacts. Our analyses reveal that forest inventory plots, and especially forest recensus plots, in all regions of Amazonia are located disproportionately near archaeological evidence and in areas likely to have ancient human impacts. Furthermore, regions of the Amazon that are relatively oversampled with inventory plots also contain the highest values of predicted ancient human impacts. Given the long lifespan of Amazonian trees, many forest inventory and recensus sites may still be recovering from past disturbances, potentially skewing our interpretations of forest dynamics and our understanding of how these forests are responding to global change. Empirical data on the human history of forest inventory sites are crucial for determining how past disturbances affect modern patterns of forest composition and carbon flux in Amazonian forests. PMID:28049821

  1. A normalized model for the half-bridge series resonant converter

    NASA Technical Reports Server (NTRS)

    King, R.; Stuart, T. A.

    1981-01-01

    Closed-form steady-state equations are derived for the half-bridge series resonant converter with a rectified (dc) load. Normalized curves for various currents and voltages are then plotted as a function of the circuit parameters. Experimental results based on a 10-kHz converter are presented for comparison with the calculations.

  2. Next Generation Quality: Assessing the Physician in Clinical History Completeness and Diagnostic Interpretations Using Funnel Plots and Normalized Deviations Plots in 3,854 Prostate Biopsies.

    PubMed

    Bonert, Michael; El-Shinnawy, Ihab; Carvalho, Michael; Williams, Phillip; Salama, Samih; Tang, Damu; Kapoor, Anil

    2017-01-01

    Observational data and funnel plots are routinely used outside of pathology to understand trends and improve performance. Extract diagnostic rate (DR) information from free text surgical pathology reports with synoptic elements and assess whether inter-rater variation and clinical history completeness information useful for continuous quality improvement (CQI) can be obtained. All in-house prostate biopsies in a 6-year period at two large teaching hospitals were extracted and then diagnostically categorized using string matching, fuzzy string matching, and hierarchical pruning. DRs were then stratified by the submitting physicians and pathologists. Funnel plots were created to assess for diagnostic bias. 3,854 prostate biopsies were found and all could be diagnostically classified. Two audits involving the review of 700 reports and a comparison of the synoptic elements with the free text interpretations suggest a categorization error rate of <1%. Twenty-seven pathologists each read >40 cases and together assessed 3,690 biopsies. There was considerable inter-rater variability and a trend toward more World Health Organization/International Society of Urologic Pathology Grade 1 cancers in older pathologists. Normalized deviations plots, constructed using the median DR, and standard error can elucidate associated over- and under-calls for an individual pathologist in relation to their practice group. Clinical history completeness by submitting medical doctor varied significantly (100% to 22%). Free text data analyses have some limitations; however, they could be used for data-driven CQI in anatomical pathology, and could lead to the next generation in quality of care.

  3. Dielectric, Impedance and Conduction Behavior of Double Perovskite Pr2CuTiO6 Ceramics

    NASA Astrophysics Data System (ADS)

    Mahato, Dev K.; Sinha, T. P.

    2017-01-01

    Polycrystalline Pr2CuTiO6 (PCT) ceramic exhibits dielectric, impedance and modulus characteristics that make it a possible material for microelectronic devices. PCT was synthesized through the standard solid-state reaction method. The dielectric permittivity, impedance and electric modulus of PCT have been studied over a wide frequency (100 Hz-1 MHz) and temperature (303-593 K) range. Structural analysis of the compound revealed a monoclinic phase at room temperature. Complex impedance Cole-Cole plots are used to interpret the relaxation mechanism, and the grain boundary contribution to conductivity has been estimated. Polarization and conductivity relaxation behavior in PCT is discussed within the electric modulus formalism. Normalization of the imaginary part of the impedance (Z″) and of the imaginary part of the modulus (M″) indicates contributions from both long-range and localized relaxation effects. The grain boundary resistance and the corresponding relaxation frequencies are plotted in Arrhenius form, with activation energies of 0.45 eV and 0.46 eV, respectively. The ac conductivity mechanism is also discussed.

  4. Does prescribed fire promote resistance to drought in low elevation forests of the Sierra Nevada, California, USA?

    USGS Publications Warehouse

    van Mantgem, Phillip J.; Caprio, Anthony C.; Stephenson, Nathan L.; Das, Adrian J.

    2016-01-01

    Prescribed fire is a primary tool used to restore western forests following more than a century of fire exclusion, reducing fire hazard by removing dead and live fuels (small trees and shrubs).  It is commonly assumed that the reduced forest density following prescribed fire also reduces competition for resources among the remaining trees, so that the remaining trees are more resistant (more likely to survive) in the face of additional stressors, such as drought.  Yet this proposition remains largely untested, so that managers do not have the basic information to evaluate whether prescribed fire may help forests adapt to a future of more frequent and severe drought.During the third year of drought, in 2014, we surveyed 9950 trees in 38 burned and 18 unburned mixed conifer forest plots at low elevation (<2100 m a.s.l.) in Kings Canyon, Sequoia, and Yosemite national parks in California, USA.  Fire had occurred in the burned plots from 6 yr to 28 yr before our survey.  After accounting for differences in individual tree diameter, common conifer species found in the burned plots had significantly reduced probability of mortality compared to unburned plots during the drought.  Stand density (stems ha-1) was significantly lower in burned versus unburned sites, supporting the idea that reduced competition may be responsible for the differential drought mortality response.  At the time of writing, we are not sure if burned stands will maintain lower tree mortality probabilities in the face of the continued, severe drought of 2015.  Future work should aim to better identify drought response mechanisms and how these may vary across other forest types and regions, particularly in other areas experiencing severe drought in the Sierra Nevada and on the Colorado Plateau.

  5. Exploring and accounting for publication bias in mental health: a brief overview of methods.

    PubMed

    Mavridis, Dimitris; Salanti, Georgia

    2014-02-01

    OBJECTIVE Publication bias undermines the integrity of published research. The aim of this paper is to present a synopsis of methods for exploring and accounting for publication bias. METHODS We discussed the main features of the following methods to assess publication bias: funnel plot analysis; trim-and-fill methods; regression techniques; and selection models. We applied these methods to a well-known example of antidepressant trials submitted to the Food and Drug Administration (FDA) for regulatory approval. RESULTS The funnel plot-related methods (visual inspection, trim-and-fill, regression models) revealed an association between effect size and SE. Contours of statistical significance showed that the asymmetry in the funnel plot is probably due to publication bias. The selection model found a significant correlation between effect size and propensity for publication. CONCLUSIONS Researchers should always consider the possible impact of publication bias. Funnel plot-related methods should be seen as a means of examining for small-study effects and not be directly equated with publication bias. Possible causes of funnel plot asymmetry should be explored. Contours of statistical significance may help disentangle whether asymmetry in a funnel plot is caused by publication bias or not. Selection models, although underused, can be a useful resource when publication bias and heterogeneity are suspected, because they directly address the problem of publication bias rather than that of small-study effects.
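
    One of the regression techniques mentioned above (Egger-style regression of the standardized effect on precision) can be sketched as follows. The simulated trial data are hypothetical; in the absence of publication bias the regression intercept should be near zero, and a clearly non-zero intercept signals funnel-plot asymmetry.

```python
import numpy as np

rng = np.random.default_rng(2)
k = 500         # number of simulated trials (hypothetical)
theta = 0.3     # common true effect, e.g. a log odds ratio (hypothetical)

se = rng.uniform(0.05, 0.5, k)        # per-study standard errors
effect = theta + rng.normal(0.0, se)  # observed effects, no publication bias

# Egger's regression test: standardized effect (z) regressed on precision.
# Slope estimates the effect; intercept tests for small-study effects.
precision = 1.0 / se
z = effect / se
slope, intercept = np.polyfit(precision, z, 1)
```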

  6. Circular Probable Error for Circular and Noncircular Gaussian Impacts

    DTIC Science & Technology

    2012-09-01

    MATLAB excerpt from the report (1M simulated impacts): ph(k) = mean(imp(:,1).^2 + imp(:,2).^2 <= CEP^2); % hit frequency on CEP … phit(j) = mean(ph…); % avg 100 hit frequencies to "incr n" … plot(i, phit, 'r-'); % error exponent versus Ph estimate
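
    The hit-frequency calculation in the excerpt can be reproduced as a self-contained sketch. For a circular Gaussian impact distribution, CEP = σ·sqrt(2 ln 2) is the radius containing half the impacts, so the estimated hit probability on the CEP circle should be close to 0.5 (σ and the sample size below are arbitrary assumptions):

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical circular Gaussian impacts, standard deviation sigma per axis.
sigma = 10.0
n = 1_000_000
imp = rng.normal(0.0, sigma, size=(n, 2))

# CEP for a circular Gaussian: the median miss radius.
cep = sigma * np.sqrt(2.0 * np.log(2.0))

# Empirical hit frequency inside the CEP circle (mirrors the MATLAB line).
p_hit = np.mean(imp[:, 0]**2 + imp[:, 1]**2 <= cep**2)
```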

  7. Accelerated Testing Of Photothermal Degradation Of Polymers

    NASA Technical Reports Server (NTRS)

    Kim, Soon Sam; Liang, Ranty Hing; Tsay, Fun-Dow

    1989-01-01

    Electron-spin-resonance (ESR) spectroscopy and Arrhenius plots used to determine maximum safe temperature for accelerated testing of photothermal degradation of polymers. Aging accelerated by increasing illumination, temperature, or both. Results of aging tests at temperatures higher than those encountered in normal use valid as long as mechanism of degradation same throughout range of temperatures. Transition between different mechanisms at some temperature identified via transition between activation energies, manifesting itself as change in slope of Arrhenius plot at that temperature.
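
    The Arrhenius-plot logic described above can be sketched numerically: fit ln(rate) against 1/T and recover the activation energy from the slope (a change in slope at some temperature would flag a transition between mechanisms). The rate constants and activation energy below are synthetic single-mechanism assumptions, not measured ESR data.

```python
import numpy as np

R = 8.314  # gas constant, J/(mol*K)

# Hypothetical degradation rates following a single Arrhenius mechanism.
Ea_true = 90_000.0   # activation energy in J/mol (assumed)
A = 1.0e12           # pre-exponential factor (assumed)
T = np.array([300.0, 320.0, 340.0, 360.0, 380.0])
rate = A * np.exp(-Ea_true / (R * T))

# Arrhenius plot: ln(rate) vs 1/T is a line with slope -Ea/R.
slope, _ = np.polyfit(1.0 / T, np.log(rate), 1)
Ea_est = -slope * R
```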

  8. Test and Evaluation of the Time/Frequency Collision Avoidance System Concept.

    DTIC Science & Technology

    1973-09-01

    … cumulative distributions were then plotted on "normal" graph paper, i.e., graph paper on which a normal distribution will plot as a straight line … apparent problems. … The time/frequency technique for … instrumentation due to waiting for an event that will not occur, there are time-outs that cause the process to step past the event in question. …

  9. Static Scene Statistical Non-Uniformity Correction

    DTIC Science & Technology

    2015-03-01

    NUC: Non-Uniformity Correction; RMSE: Root Mean Squared Error; RSD: Relative Standard Deviation; S3NUC: Static Scene Statistical Non-Uniformity Correction. … the Relative Standard Deviation (RSD), which normalizes the standard deviation, σ, to the mean estimated value, µ, using the equation RSD = (σ/µ) × 100. The RSD plot of the gain … estimates is shown in Figure 4.1(b). The RSD plot shows that after a sample size of approximately 10, the different photocount values and the inclusion …

  10. Precision measurement of the η → π + π - π 0 Dalitz plot distribution with the KLOE detector

    NASA Astrophysics Data System (ADS)

    Anastasi, A.; Babusci, D.; Bencivenni, G.; Berlowski, M.; Bloise, C.; Bossi, F.; Branchini, P.; Budano, A.; Caldeira Balkeståhl, L.; Cao, B.; Ceradini, F.; Ciambrone, P.; Curciarello, F.; Czerwinski, E.; D'Agostini, G.; Danè, E.; De Leo, V.; De Lucia, E.; De Santis, A.; De Simone, P.; Di Cicco, A.; Di Domenico, A.; Di Salvo, R.; Domenici, D.; D'Uffizi, A.; Fantini, A.; Felici, G.; Fiore, S.; Gajos, A.; Gauzzi, P.; Giardina, G.; Giovannella, S.; Graziani, E.; Happacher, F.; Heijkenskjöld, L.; Ikegami Andersson, W.; Johansson, T.; Kaminska, D.; Krzemien, W.; Kupsc, A.; Loffredo, S.; Mandaglio, G.; Martini, M.; Mascolo, M.; Messi, R.; Miscetti, S.; Morello, G.; Moricciani, D.; Moskal, P.; Papenbrock, M.; Passeri, A.; Patera, V.; Perez del Rio, E.; Ranieri, A.; Santangelo, P.; Sarra, I.; Schioppa, M.; Silarski, M.; Sirghi, F.; Tortora, L.; Venanzoni, G.; Wislicki, W.; Wolke, M.

    2016-05-01

    Using 1.6 fb⁻¹ of e⁺e⁻ → ϕ → ηγ data collected with the KLOE detector at DAΦNE, the Dalitz plot distribution for the η → π⁺π⁻π⁰ decay is studied with the world's largest sample of ∼4.7·10⁶ events. The Dalitz plot density is parametrized as a polynomial expansion up to cubic terms in the normalized dimensionless variables X and Y. The experiment is sensitive to all charge conjugation conserving terms of the expansion, including a gX²Y term. The statistical uncertainty of all parameters is improved by a factor of two with respect to earlier measurements.
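
    A commonly used form of the polynomial expansion referred to above is |A(X, Y)|² ∝ 1 + aY + bY² + cX + dX² + eXY + fY³ + gX²Y, where charge-conjugation conservation forces the X-odd coefficients (c, e) to vanish, leaving the density even in X. A sketch with placeholder coefficients (not the KLOE fit values):

```python
import numpy as np

# Dalitz-plot density up to cubic terms in the normalized variables X, Y.
# Coefficient values used below are placeholders, not KLOE's fitted parameters.
def dalitz_density(X, Y, a, b, c, d, e, f, g):
    return 1.0 + a*Y + b*Y**2 + c*X + d*X**2 + e*X*Y + f*Y**3 + g*X**2*Y

# With the C-violating terms set to zero (c = e = 0) the density is even in X.
pars = dict(a=-1.0, b=0.15, c=0.0, d=0.08, e=0.0, f=0.14, g=-0.05)
X = np.linspace(-1.0, 1.0, 201)
Y = 0.3
symmetric = np.allclose(dalitz_density(X, Y, **pars),
                        dalitz_density(-X, Y, **pars))
```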

  11. Detection of amblyopia utilizing generated retinal reflexes

    NASA Technical Reports Server (NTRS)

    Kerr, J. H.; Hay, S. H.

    1981-01-01

    Investigation confirmed that GRR images can be consistently obtained and that these images contain information required to detect the optical inequality of one eye compared to the fellow eye. Digital analyses, electro-optical analyses, and trained observers were used to evaluate the GRR images. Two and three dimensional plots were made from the digital analyses results. These plotted data greatly enhanced the GRR image content, and it was possible for nontrained observers to correctly identify normal vs abnormal ocular status by viewing the plots. Based upon the criteria of detecting equality or inequality of ocular status of a person's eyes, the trained observer correctly identified the ocular status of 90% of the 232 persons who participated in this program.

  12. Iron status determination in pregnancy using the Thomas plot.

    PubMed

    Weyers, R; Coetzee, M J; Nel, M

    2016-04-01

    Physiological changes during pregnancy affect routine tests for iron deficiency. The reticulocyte haemoglobin equivalent (RET-He) and serum-soluble transferrin receptor (sTfR) assay are newer diagnostic parameters for the detection of iron deficiency, combined in the Thomas diagnostic plot. We used this plot to determine the iron status of pregnant women presenting for their first visit to an antenatal clinic in Bloemfontein, South Africa. Routine laboratory tests (serum ferritin, full blood count and C-reactive protein) and RET-He and sTfR were performed. The iron status was determined using the Thomas plot. For this study, 103 pregnant women were recruited. According to the Thomas plot, 72.8% of the participants had normal iron stores and erythropoiesis. Iron-deficient erythropoiesis was detected in 12.6%. A third of participants were anaemic. Serum ferritin showed excellent sensitivity but poor specificity for detecting depleted iron stores. HIV status had no influence on the iron status of the participants. Our findings reiterate that causes other than iron deficiency should be considered in anaemic individuals. When compared with the Thomas plot, a low serum ferritin is a sensitive but nonspecific indicator of iron deficiency. The Thomas plot may provide useful information to identify pregnant individuals in whom haematologic parameters indicate limited iron availability for erythropoiesis. © 2015 John Wiley & Sons Ltd.

  13. Characterizing rainfall of hot arid region by using time-series modeling and sustainability approaches: a case study from Gujarat, India

    NASA Astrophysics Data System (ADS)

    Machiwal, Deepesh; Kumar, Sanjay; Dayal, Devi

    2016-05-01

    This study aimed at characterization of rainfall dynamics in a hot arid region of Gujarat, India by employing time-series modeling techniques and sustainability approach. Five characteristics, i.e., normality, stationarity, homogeneity, presence/absence of trend, and persistence of 34-year (1980-2013) period annual rainfall time series of ten stations were identified/detected by applying multiple parametric and non-parametric statistical tests. Furthermore, the study involves novelty of proposing sustainability concept for evaluating rainfall time series and demonstrated the concept, for the first time, by identifying the most sustainable rainfall series following reliability ( R y), resilience ( R e), and vulnerability ( V y) approach. Box-whisker plots, normal probability plots, and histograms indicated that the annual rainfall of Mandvi and Dayapar stations is relatively more positively skewed and non-normal compared with that of other stations, which is due to the presence of severe outlier and extreme. Results of Shapiro-Wilk test and Lilliefors test revealed that annual rainfall series of all stations significantly deviated from normal distribution. Two parametric t tests and the non-parametric Mann-Whitney test indicated significant non-stationarity in annual rainfall of Rapar station, where the rainfall was also found to be non-homogeneous based on the results of four parametric homogeneity tests. Four trend tests indicated significantly increasing rainfall trends at Rapar and Gandhidham stations. The autocorrelation analysis suggested the presence of persistence of statistically significant nature in rainfall series of Bhachau (3-year time lag), Mundra (1- and 9-year time lag), Nakhatrana (9-year time lag), and Rapar (3- and 4-year time lag). 
Results of sustainability approach indicated that annual rainfall of Mundra and Naliya stations ( R y = 0.50 and 0.44; R e = 0.47 and 0.47; V y = 0.49 and 0.46, respectively) are the most sustainable and dependable compared with that of other stations. The highest values of sustainability index at Mundra (0.120) and Naliya (0.112) stations confirmed the earlier findings of R y- R e- V y approach. In general, annual rainfall of the study area is less reliable, less resilient, and moderately vulnerable, which emphasizes the need of developing suitable strategies for managing water resources of the area on sustainable basis. Finally, it is recommended that multiple statistical tests (at least two) should be used in time-series modeling for making reliable decisions. Moreover, methodology and findings of the sustainability concept in rainfall time series can easily be adopted in other arid regions of the world.
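
    The reliability-resilience-vulnerability (Ry-Re-Vy) calculation used above can be sketched for an annual rainfall series. The series, the threshold choice, and the Loucks-style composite index below are illustrative assumptions, not the study's exact definitions:

```python
import numpy as np

rng = np.random.default_rng(3)
rain = rng.gamma(shape=2.0, scale=150.0, size=34)  # hypothetical 34-yr series
threshold = np.median(rain)                        # "satisfactory" if >= threshold

ok = rain >= threshold
fail = ~ok

# Reliability: fraction of years in the satisfactory state.
reliability = ok.mean()

# Resilience: probability of a satisfactory year immediately after a failure.
resilience = (ok[1:] & fail[:-1]).sum() / fail[:-1].sum()

# Vulnerability: mean relative deficit during failure years.
vulnerability = np.mean((threshold - rain[fail]) / threshold)

# One common composite (Loucks-style) sustainability index.
sustainability = reliability * resilience * (1.0 - vulnerability)
```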

  14. Segmented Poincaré plot analysis for risk stratification in patients with dilated cardiomyopathy.

    PubMed

    Voss, A; Fischer, C; Schroeder, R; Figulla, H R; Goernig, M

    2010-01-01

    The prognostic value of heart rate variability in patients with dilated cardiomyopathy (DCM) is limited and does not contribute to risk stratification, although the dynamics of ventricular repolarization differ considerably between DCM patients and healthy subjects. Neither linear nor nonlinear methods of heart rate variability analysis could discriminate between patients at high and low risk for sudden cardiac death. The aim of this study was to analyze the suitability of the newly developed segmented Poincaré plot analysis (SPPA) to enhance risk stratification in DCM. In contrast to the usual Poincaré plot analysis, SPPA retains nonlinear features of the investigated beat-to-beat interval time series. Its main features are the rotation of the cloud of points and its subsequent variability-dependent segmentation. Significant row and column probabilities were calculated from the segments and led to discrimination (up to p<0.005) between low- and high-risk DCM patients. For the first time, an index from Poincaré plot analysis of heart rate variability was able to contribute to risk stratification in patients suffering from DCM.

  15. Measurements of Oleic Acid among Individual Kernels Harvested from Test Plots of Purified Runner and Spanish High Oleic Seed

    USDA-ARS?s Scientific Manuscript database

    Normal oleic peanuts are often found within commercial lots of high oleic peanuts when sampling among individual kernels. Kernels not meeting high oleic threshold could be true contamination with normal oleic peanuts introduced via poor handling, or they could be immature and not fully expressing th...

  16. Measurements of Oleic Acid among Individual kernels Harvested from Test Plots of Purified Runner and Spanish High Oleic Seed

    USDA-ARS?s Scientific Manuscript database

    Normal oleic peanuts are often found within commercial lots of high oleic peanuts when sampling among individual kernels. Kernels not meeting high oleic threshold could be true contamination with normal oleic peanuts introduced via poor handling, or they could be immature and not fully expressing th...

  17. Measurements of oleic acid among individual kernels harvested from test plots of purified runner and spanish high oleic seed

    USDA-ARS?s Scientific Manuscript database

    Normal oleic peanuts are often found within commercial lots of high oleic peanuts when sampling among individual kernels. Kernels not meeting high oleic threshold could be true contamination with normal oleic peanuts introduced via poor handling, or kernels not meeting threshold could be immature a...

  18. REGIONAL PATTERNS OF LOCAL DIVERSITY OF TREES: ASSOCIATIONS WITH ANTHROPOGENIC DISTURBANCE

    EPA Science Inventory

    We used a probability-based sampling scheme to survey the forested lands of 14 states in five regions of the US (California, Colorado, and parts of the Southeast, Mid-Atlantic, and Northeast) from 1990 to 1993. Using a nationally consistent plot design, we evaluated the local diversity...

  19. Predicting the Probability of Stand Disturbance

    Treesearch

    Gregory A. Reams; Joseph M. McCollum

    1999-01-01

    Forest managers are often interested in identifying and scheduling future stand treatment opportunities. One of the greatest management opportunities is presented following major stand level disturbances that result from natural or anthropogenic forces. Remeasurement data from the Forest Inventory and Analysis (FIA) permanent plot system are used to fit a set of...

  20. Mortality of trees in loblolly pine plantations

    Treesearch

    Boris Zeide; Yujia Zhang

    2006-01-01

    The annual probability of mortality for planted loblolly pine (Pinus taeda L.) trees was estimated using a set of permanent plots covering the entire native range of the species. The recorded causes of death were infestation by the southern pine beetle (Dendroctonus frontalis Zimmermann) and other insects, lightning, and unknown...

  1. Multiscale Poincaré plots for visualizing the structure of heartbeat time series.

    PubMed

    Henriques, Teresa S; Mariani, Sara; Burykin, Anton; Rodrigues, Filipa; Silva, Tiago F; Goldberger, Ary L

    2016-02-09

    Poincaré delay maps are widely used in the analysis of cardiac interbeat interval (RR) dynamics. To facilitate visualization of the structure of these time series, we introduce multiscale Poincaré (MSP) plots. Starting with the original RR time series, the method employs a coarse-graining procedure to create a family of time series, each of which represents the system's dynamics in a different time scale. Next, the Poincaré plots are constructed for the original and the coarse-grained time series. Finally, as an optional adjunct, color can be added to each point to represent its normalized frequency. We illustrate the MSP method on simulated Gaussian white and 1/f noise time series. The MSP plots of 1/f noise time series reveal relative conservation of the phase space area over multiple time scales, while those of white noise show a marked reduction in area. We also show how MSP plots can be used to illustrate the loss of complexity when heartbeat time series from healthy subjects are compared with those from patients with chronic (congestive) heart failure syndrome or with atrial fibrillation. This generalized multiscale approach to Poincaré plots may be useful in visualizing other types of time series.
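
    The coarse-graining step of the MSP method can be sketched as follows. For white noise the coarse-grained variance falls roughly as 1/scale, so the Poincaré cloud area (taken here, as an assumption, to be proportional to the standard SD1×SD2 descriptors) contracts across scales, matching the behavior the abstract describes:

```python
import numpy as np

def coarse_grain(x, scale):
    """Average consecutive non-overlapping windows of length `scale`."""
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

def poincare_sd(x):
    """SD1/SD2 descriptors of the Poincare plot of pairs (x_i, x_{i+1})."""
    sd1 = np.std(np.diff(x)) / np.sqrt(2.0)        # spread across identity line
    sd2 = np.std(x[:-1] + x[1:]) / np.sqrt(2.0)    # spread along identity line
    return sd1, sd2

rng = np.random.default_rng(4)
white = rng.normal(size=100_000)

# Poincare-cloud area proxy (sd1 * sd2) at increasing scales.
areas = [np.prod(poincare_sd(coarse_grain(white, s))) for s in (1, 2, 4, 8)]
```

    Repeating this with a 1/f noise generator would show the contrast reported above: its cloud area stays comparatively stable across scales.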

  2. Red-shouldered hawk nesting habitat preference in south Texas

    USGS Publications Warehouse

    Strobel, Bradley N.; Boal, Clint W.

    2010-01-01

    We examined nesting habitat preference by red-shouldered hawks Buteo lineatus using conditional logistic regression on characteristics measured at 27 occupied nest sites and 68 unused sites in 2005–2009 in south Texas. We measured vegetation characteristics of individual trees (nest trees and unused trees) and corresponding 0.04-ha plots. We evaluated the importance of tree and plot characteristics to nesting habitat selection by comparing a priori tree-specific and plot-specific models using Akaike's information criterion. Models with only plot variables carried 14% more weight than models with only center tree variables. The model-averaged odds ratios indicated red-shouldered hawks selected to nest in taller trees and in areas with higher average diameter at breast height than randomly available within the forest stand. Relative to randomly selected areas, each 1-m increase in nest tree height and 1-cm increase in the plot average diameter at breast height increased the probability of selection by 85% and 10%, respectively. Our results indicate that red-shouldered hawks select nesting habitat based on vegetation characteristics of individual trees as well as the 0.04-ha area surrounding the tree. Our results indicate forest management practices resulting in tall forest stands with large average diameter at breast height would benefit red-shouldered hawks in south Texas.

  3. Effect of Tillage and Planting Date on Seasonal Abundance and Diversity of Predacious Ground Beetles in Cotton

    PubMed Central

    Shrestha, R. B.; Parajulee, M. N.

    2010-01-01

    A 2-year field study was conducted in the southern High Plains region of Texas to evaluate the effect of tillage system and cotton planting date window on seasonal abundance and activity patterns of predacious ground beetles. The experiment was deployed in a split-plot randomized block design with tillage as the main-plot factor and planting date as the subplot factor. There were two levels for each factor. The two tillage systems were conservation tillage (30% or more of the soil surface is covered with crop residue) and conventional tillage. The two cotton planting date window treatments were early May (normal planting) and early June (late planting). Five prevailing predacious ground beetles, Cicindela sexguttata F., Calosoma scrutator Drees, Pasimachus spp., Pterostichus spp., and Megacephala carolina L. (Coleoptera: Carabidae), were monitored using pitfall traps at 2-week intervals from June 2002 to October 2003. The highest total number of ground beetles (6/trap) was observed on 9 July 2003. Cicindela sexguttata was the dominant ground-dwelling predacious beetle among the five species. A significant difference between the two tillage systems was observed in the abundances of Pterostichus spp. and C. sexguttata. In 2002, significantly more Pterostichus spp. were recorded from conventional plots (0.27/trap) than from conservation tillage plots (0.05/trap). Significantly more C. sexguttata were recorded in 2003 from conservation plots (3.77/trap) than from conventional tillage plots (1.04/trap). There was a significant interaction between year and tillage treatments. However, there was no significant difference in the abundances of M. carolina and Pasimachus spp. between the two tillage practices in either of the two years. M. carolina numbers were significantly higher in late-planted cotton compared with those observed in normal-planted cotton. 
However, planting date window had no significant influence on the activity patterns of the other species. Ground beetle species abundance, diversity, and species richness were significantly higher in conservation tillage plots. This suggests that field conditions arising from the practice of conservation tillage may support higher predacious ground beetle activity than might be observed under field conditions arising from conventional tillage practices. PMID:21062204

  4. A branching process model for the analysis of abortive colony size distributions in carbon ion-irradiated normal human fibroblasts.

    PubMed

    Sakashita, Tetsuya; Hamada, Nobuyuki; Kawaguchi, Isao; Hara, Takamitsu; Kobayashi, Yasuhiko; Saito, Kimiaki

    2014-05-01

    A single cell can form a colony, and ionizing irradiation has long been known to reduce such a cellular clonogenic potential. Analysis of abortive colonies unable to continue to grow should provide important information on the reproductive cell death (RCD) following irradiation. Our previous analysis with a branching process model showed that the RCD in normal human fibroblasts can persist over 16 generations following irradiation with low linear energy transfer (LET) γ-rays. Here we further set out to evaluate the RCD persistency in abortive colonies arising from normal human fibroblasts exposed to high-LET carbon ions (18.3 MeV/u, 108 keV/µm). We found that the abortive colony size distribution determined by biological experiments follows a linear relationship on the log-log plot, and that the Monte Carlo simulation using the RCD probability estimated from such a linear relationship well simulates the experimentally determined surviving fraction and the relative biological effectiveness (RBE). We identified the short-term phase and long-term phase for the persistent RCD following carbon-ion irradiation, which were similar to those previously identified following γ-irradiation. Taken together, our results suggest that subsequent secondary or tertiary colony formation would be invaluable for understanding the long-lasting RCD. Altogether, our framework for analysis with a branching process model and a colony formation assay is applicable to determination of cellular responses to low- and high-LET radiation, and suggests that the long-lasting RCD is a pivotal determinant of the surviving fraction and the RBE.
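    The branching-process view of colony growth can be sketched as a simple Galton-Watson simulation in which, each generation, every cell either undergoes reproductive cell death or divides. The fixed per-generation RCD probability and the 16-generation horizon below are illustrative placeholders, not the paper's fitted values.

```python
import random

def grow_colony(p_rcd, max_generations=16, seed=0):
    """Simulate one colony: each cell dies with probability p_rcd per
    generation, otherwise divides into two daughter cells.
    Returns (peak cell count, True if the colony went extinct)."""
    rng = random.Random(seed)
    cells, peak = 1, 1
    for _ in range(max_generations):
        survivors = sum(1 for _ in range(cells) if rng.random() >= p_rcd)
        cells = 2 * survivors
        peak = max(peak, cells)
        if cells == 0:
            return peak, True   # abortive colony
    return peak, False

# edge cases: no RCD -> colony always survives; certain RCD -> abortive
print(grow_colony(0.0))  # (65536, False)
print(grow_colony(1.0))  # (1, True)
```

    Running many such colonies at intermediate `p_rcd` yields an abortive colony size distribution whose shape can be compared against the log-log relationship described in the abstract.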

  5. Emission characteristics of 6.78-MHz radio-frequency glow discharge plasma in a pulsed mode

    NASA Astrophysics Data System (ADS)

    Zhang, Xinyue; Wagatsuma, Kazuaki

    2017-07-01

    This paper investigated Boltzmann plots for both atomic and ionic emission lines of iron in an argon glow discharge plasma driven by 6.78-MHz radio-frequency (RF) voltage in a pulsed operation, in order to discuss how the excitation/ionization process was affected by the pulsation. For this purpose, the pulse frequency and the duty ratio of the pulsed RF voltage were selected as the experimental parameters. A Grimm-style radiation source was employed at a forward RF power of 70 W and at an argon pressure of 670 Pa. The Boltzmann plot for low-lying excited levels of the iron atom followed a linear relationship, which was probably attributed to thermal collisions with ultimate electrons in the negative glow region; in this case, the excitation temperature was obtained in a narrow range of 3300-3400 K, which was hardly affected by the duty ratio or the pulse frequency of the pulsed RF glow discharge plasma. This observation suggested that the RF plasma could be supported by a self-stabilized negative glow region, where the kinetic energy distribution of the electrons would be changed to a lesser extent. Additional non-thermal excitation processes, such as a Penning-type collision and a charge-transfer collision, led to deviations (overpopulation) of particular energy levels of the iron atom or iron ion from the normal Boltzmann distribution. However, their contributions to the overall excitation/ionization were not greatly altered when the pulse frequency or the duty ratio was varied in the pulsed RF glow discharge plasma.
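    An excitation temperature like the 3300-3400 K quoted here is conventionally read off a Boltzmann plot as the slope of ln(Iλ/gA) versus upper-level energy, T = -1/(k·slope). The line energies and intensities below are synthetic, chosen only so the fit can recover a known temperature.

```python
import numpy as np

K_B = 8.617333e-5  # Boltzmann constant in eV/K

# hypothetical Fe I upper-level energies (eV); ordinates generated from
# a Boltzmann population at T_true so the linear fit can recover it
E_upper = np.array([3.2, 3.6, 4.0, 4.4, 4.8])
T_true = 3350.0
y = -E_upper / (K_B * T_true) + 12.0   # y = ln(I*lambda / (g*A)) + const

slope, intercept = np.polyfit(E_upper, y, 1)
T_exc = -1.0 / (K_B * slope)           # excitation temperature from slope
```

    With measured intensities, deviations of individual points from this line are exactly the overpopulation effects (Penning-type and charge-transfer collisions) the abstract describes.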

  6. Recovery of biological soil crust richness and cover 12-16 years after wildfires in Idaho, USA

    NASA Astrophysics Data System (ADS)

    Root, Heather T.; Brinda, John C.; Dodson, E. Kyle

    2017-09-01

    Changing fire regimes in western North America may impact biological soil crust (BSC) communities that influence many ecosystem functions, such as soil stability and C and N cycling. However, longer-term effects of wildfire on BSC abundance, species richness, functional groups, and ecosystem functions (i.e., BSC resilience) are still poorly understood. We sampled BSC lichen and bryophyte communities at four sites in Idaho, USA, within foothill steppe communities with wildfires 12 to 16 years old. We established six plots outside each burn perimeter and compared them with six plots of varying severity within each fire perimeter at each site. BSC cover was most strongly negatively impacted by wildfire at sites that had well-developed BSC communities in adjacent unburned plots. BSC species richness was estimated to be 65% greater in unburned plots compared with burned plots, and fire effects did not vary among sites. In contrast, there was no evidence that vascular plant functional groups or fire severity (as measured by the satellite metrics differenced normalized burn ratio (dNBR) and relativized differenced normalized burn ratio (RdNBR)) significantly affected longer-term BSC responses. Three large-statured BSC functional groups that may be important in controlling wind and water erosion (squamulose lichens, vagrant lichens, and tall turf mosses) exhibited a significant decrease in abundance in burned areas relative to adjacent unburned areas. The decreases in BSC cover and richness along with decreased abundance of several functional groups suggest that wildfire can negatively impact ecosystem function in these semiarid ecosystems for at least 1 to 2 decades. This is a concern given that increased fire frequency is predicted for the region due to exotic grass invasion and climate change.
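    For context, dNBR is simply the pre-fire minus post-fire Normalized Burn Ratio, computed per pixel from near-infrared and shortwave-infrared reflectance. The reflectance values below are hypothetical single-pixel examples.

```python
def nbr(nir, swir):
    """Normalized Burn Ratio from NIR and SWIR reflectance."""
    return (nir - swir) / (nir + swir)

# hypothetical reflectances for one pixel before and after fire
pre = nbr(nir=0.50, swir=0.10)    # healthy vegetation: high NBR
post = nbr(nir=0.25, swir=0.30)   # burned surface: low or negative NBR
dnbr = pre - post                 # larger dNBR = higher burn severity
```

    RdNBR additionally rescales dNBR by the pre-fire NBR so that severity is comparable across pixels with different pre-fire vegetation amounts.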

  7. Comparison Tools for Assessing the Microgravity Environment of Missions, Carriers and Conditions

    NASA Technical Reports Server (NTRS)

    DeLombard, Richard; McPherson, Kevin; Moskowitz, Milton; Hrovat, Ken

    1997-01-01

    The Principal Component Spectral Analysis (PCSA) and Quasi-steady Three-dimensional Histogram (QTH) techniques provide the means to describe the microgravity acceleration environment of an entire mission on a single plot. This allows a straightforward comparison of the microgravity environment between missions, carriers, and conditions. As shown in this report, the PCSA and QTH techniques bring both the range and median of the microgravity environment onto a single page for an entire mission or another time period or condition of interest. These single pages may then be used to compare similar analyses of other missions, time periods or conditions. The PCSA plot is based on the frequency distribution of the vibrational energy and is normally used for an acceleration data set containing frequencies above the lowest natural frequencies of the vehicle. The QTH plot is based on the direction and magnitude of the acceleration and is normally used for acceleration data sets with frequency content less than 0.1 Hz. Various operating conditions are made evident by using PCSA and QTH plots. Equipment operating either full or part time with sufficient magnitude to be considered a disturbance is very evident, as is equipment contributing to the background acceleration environment. A source's magnitude and/or frequency variability is also evident from the source's appearance on a PCSA plot. The PCSA and QTH techniques are valuable tools for extracting useful information from acceleration data taken over large spans of time. This report shows that these techniques provide a tool for comparison between different sets of microgravity acceleration data, for example different missions, different activities within a mission, and/or different attitudes within a mission. These techniques, as well as others, may be employed in order to derive useful information from acceleration data.

  8. How grazing and soil quality affect native and exotic plant diversity in Rocky Mountain grasslands

    USGS Publications Warehouse

    Stohlgren, T.J.; Schell, L.D.; Vanden Heuvel, B.

    1999-01-01

    We used multiscale plots to sample vascular plant diversity and soil characteristics in and adjacent to 26 long-term grazing exclosure sites in Colorado, Wyoming, Montana, and South Dakota, USA. The exclosures were 7-60 yr old (31.2 ± 2.5 yr, mean ± 1 SE). Plots were also randomly placed in the broader landscape in open rangeland in the same vegetation type at each site to assess spatial variation in grazed landscapes. Consistent sampling in the nine National Parks, Wildlife Refuges, and other management units yielded data from 78 1000-m2 plots and 780 1-m2 subplots. We hypothesized that native species richness would be lower in the exclosures than in grazed sites, due to competitive exclusion in the absence of grazing. We also hypothesized that grazed sites would have higher native and exotic species richness compared to ungrazed areas, due to disturbance (i.e., the intermediate-disturbance hypothesis) and the conventional wisdom that grazing may accelerate weed invasion. Both hypotheses were soundly rejected. Although native species richness in 1-m2 subplots was significantly higher (P < 0.05) in grazed sites, we found nearly identical native or exotic species richness in 1000-m2 plots in exclosures (31.5 ± 2.5 native and 3.1 ± 0.5 exotic species), adjacent grazed plots (32.6 ± 2.8 native and 3.2 ± 0.6 exotic species), and randomly selected grazed plots (31.6 ± 2.9 native and 3.2 ± 0.6 exotic species). We found no significant differences in species diversity (Hill's diversity indices, N1 and N2), evenness (Hill's ratio of evenness, E5), cover of various life-forms (grasses, forbs, and shrubs), soil texture, or soil percentage of N and C between grazed and ungrazed sites at the 1000-m2 plot scale. The species lists of the long-ungrazed and adjacent grazed plots overlapped just 57.9 ± 2.8%. This difference in species composition is commonly attributed solely to the difference in grazing regimes.
However, the species lists between pairs of grazed plots (adjacent and distant 1000-m2 plots) in the same vegetation type overlapped just 48.6 ± 3.6%, and the ungrazed plots and distant grazed plots overlapped 49.4 ± 3.6%. Differences in vegetation and soils between grazed and ungrazed sites were minimal in most cases, but soil characteristics and elevation were strongly correlated with native and exotic plant diversity in the study region. For the 78 1000-m2 plots, 59.4% of the variance in total species richness was explained by percentage of silt (coefficient = 0.647, t = 5.107, P < 0.001), elevation (coefficient = 0.012, t = 5.084, P < 0.001), and total foliar cover (coefficient = 0.110, t = 2.104, P < 0.039). Only 12.8% of the variance in exotic species cover (log10 cover) was explained by percentage of clay (coefficient = -0.011, t = -2.878, P < 0.005), native species richness (coefficient = -0.011, t = -2.156, P < 0.034), and log10 N (coefficient = 2.827, t = 1.860, P < 0.067). Native species cover and exotic species richness and frequency were also significantly positively correlated with percentage of soil N at the 1000-m2 plot scale. Our research led to five broad generalizations about current levels of grazing in these Rocky Mountain grasslands: (1) grazing probably has little effect on native species richness at landscape scales; (2) grazing probably has little effect on the accelerated spread of most exotic plant species at landscape scales; (3) grazing affects local plant species and life-form composition and cover, but spatial variation is considerable; (4) soil characteristics, climate, and disturbances may have a greater effect on plant species diversity than do current levels of grazing; and (5) few plant species show consistent, directional responses to grazing or cessation of grazing.

  9. A further use for the Harvest plot: a novel method for the presentation of data synthesis.

    PubMed

    Crowther, Mark; Avenell, Alison; MacLennan, Graeme; Mowatt, Graham

    2011-06-01

    When performing a systematic review, whether or not a meta-analysis is performed, graphical displays can be useful. Data still need to be described, ideally in graphical form. The Harvest plot was developed to display combined data from several studies in a way that demonstrates not only effect but also study quality. We describe a modification to the Harvest plot that allows the presentation of data that normally could not be included in a forest plot meta-analysis and allows extra information to be displayed. Using specific examples, we describe how the arrangement of studies, the height of the bars, and additional information can be used to enhance the plot. This is an important development which, by fulfilling Tufte's nine requirements for graphical presentation, allows researchers to display evidence in a flexible way. This means readers can follow an argument in a clear and efficient manner without the need for large volumes of descriptive text. Copyright © 2011 John Wiley & Sons, Ltd.

  10. Unintended consequences of invasive predator control in an Australian forest: overabundant wallabies and vegetation change.

    PubMed

    Dexter, Nick; Hudson, Matt; James, Stuart; Macgregor, Christopher; Lindenmayer, David B

    2013-01-01

    Over-abundance of native herbivores is a problem in many forests worldwide. The abundance of native macropod wallabies is extremely high at Booderee National Park (BNP) in south-eastern Australia. This has occurred because of the reduction of exotic predators through an intensive baiting program, coupled with the absence of other predators. The high density of wallabies at BNP may be inhibiting the recruitment of many plant species following fire-induced recruitment events. We experimentally examined the post-fire response of a range of plant species to browsing by wallabies in a forest heavily infested with the invasive species, bitou bush Chrysanthemoides monilifera. We recorded the abundance and size of a range of plant species in 18 unfenced (browsed) and 16 fenced (unbrowsed) plots. We found the abundance and size of bitou bush was suppressed in browsed plots compared to unbrowsed plots. Regenerating seedlings of the canopy or middle storey tree species Eucalyptus pilularis, Acacia implexa, Allocasuarina littoralis, Breynia oblongifolia and Banksia integrifolia were either smaller or fewer in number in grazed plots than treatment plots as were the vines Kennedia rubicunda, Glycine tabacina and Glycine clandestina. In contrast, the understorey fern, Pteridium esculentum increased in abundance in the browsed plots relative to unbrowsed plots probably because of reduced competition with more palatable angiosperms. Twelve months after plots were installed the community structure of the browsed and unbrowsed plots was significantly different (P = 0.023, Global R = 0.091). The relative abundance of C. monilifera and P. esculentum contributed most to the differences. We discuss the possible development of a low diversity bracken fern parkland in Booderee National Park through a trophic cascade, similar to that caused by overabundant deer in the northern hemisphere. We also discuss its implications for broad scale fox control in southern Australian forests.

  12. Detection probability of EBPSK-MODEM system

    NASA Astrophysics Data System (ADS)

    Yao, Yu; Wu, Lenan

    2016-07-01

    Since the impacting filter-based receiver is able to transform phase modulation into an amplitude peak, a simple threshold decision can detect the Extended Binary Phase Shift Keying (EBPSK) modulated ranging signal in a noisy environment. In this paper, an analysis of the EBPSK-MODEM system output gives the probability density function for EBPSK modulated signals plus noise. The equation of detection probability (Pd) for fluctuating and non-fluctuating targets is deduced. A comparison of the Pd for the EBPSK-MODEM system and a pulse radar receiver is made, and some results are plotted. Moreover, the probability curves of the system with several modulation parameters are analysed. When the modulation parameter is at least 6, the detection performance of the EBPSK-MODEM system exceeds that of a traditional radar system. In addition to the theoretical analysis, computer simulations illustrate the performance.
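    The threshold decision described here can be illustrated with the textbook Gaussian approximation for a non-fluctuating target: the false-alarm rate fixes the threshold in noise standard deviations, and Pd follows from the same Q-function shifted by the signal amplitude. This sketch is a generic illustration, not the EBPSK-specific derivation in the paper.

```python
import math

def q_func(x):
    """Gaussian tail probability Q(x) = P(N(0,1) > x)."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def detection_probability(snr_linear, threshold_sigma):
    """Pd for a non-fluctuating amplitude peak in Gaussian noise,
    with the threshold expressed in noise standard deviations."""
    return q_func(threshold_sigma - math.sqrt(snr_linear))

p_fa = q_func(3.0)                      # false-alarm rate at a 3-sigma threshold
p_d = detection_probability(36.0, 3.0)  # strong signal: Pd near 1
```

    With zero signal, Pd collapses to the false-alarm rate, which is the consistency check used when plotting Pd-versus-SNR curves of the kind described in the abstract.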

  13. Assigning the low lying vibronic states of CH3O and CD3O

    NASA Astrophysics Data System (ADS)

    Johnson, Britta A.; Sibert, Edwin L.

    2017-05-01

    The assignment of lines in vibrational spectra in strongly mixing systems is considered. Several low lying vibrational states of the ground electronic X˜ 2E state of the CH3O and CD3O radicals are assigned. Jahn-Teller, spin-orbit, and Fermi couplings mix the normal mode states. The mixing complicates the assignment of the infrared spectra using a zero-order normal mode representation. Alternative zero-order representations, which include specific Jahn-Teller couplings, are explored. These representations allow for definitive assignments. In many instances it is possible to plot the wavefunctions on which the assignments are based. The plots, which are shown in the adiabatic representation, allow one to visualize the effects of various higher order couplings. The plots also enable one to visualize the conical seam and its effect on the wavefunctions. The first and the second order Jahn-Teller couplings in the rocking motion dominate the spectral features in CH3O, while first order and modulated first order couplings dominate the spectral features in CD3O. The methods described here are general and can be applied to other Jahn-Teller systems.

  14. Detection of fallen trees in ALS point clouds using a Normalized Cut approach trained by simulation

    NASA Astrophysics Data System (ADS)

    Polewski, Przemyslaw; Yao, Wei; Heurich, Marco; Krzystek, Peter; Stilla, Uwe

    2015-07-01

    Downed dead wood is regarded as an important part of forest ecosystems from an ecological perspective, which drives the need for investigating its spatial distribution. Based on several studies, Airborne Laser Scanning (ALS) has proven to be a valuable remote sensing technique for obtaining such information. This paper describes a unified approach to the detection of fallen trees from ALS point clouds based on merging short segments into whole stems using the Normalized Cut algorithm. We introduce a new method of defining the segment similarity function for the clustering procedure, where the attribute weights are learned from labeled data. Based on a relationship between Normalized Cut's similarity function and a class of regression models, we show how to learn the similarity function by training a classifier. Furthermore, we propose using an appearance-based stopping criterion for the graph cut algorithm as an alternative to the standard Normalized Cut threshold approach. We set up a virtual fallen tree generation scheme to simulate complex forest scenarios with multiple overlapping fallen stems. This simulated data is then used as a basis to learn both the similarity function and the stopping criterion for Normalized Cut. We evaluate our approach on 5 plots from the strictly protected mixed mountain forest within the Bavarian Forest National Park using reference data obtained via a manual field inventory. The experimental results show that our method is able to detect up to 90% of fallen stems in plots having 30-40% overstory cover with a correctness exceeding 80%, even in quite complex forest scenes. Moreover, the performance for feature weights trained on simulated data is competitive with the case when the weights are calculated using a grid search on the test data, which indicates that the learned similarity function and stopping criterion can generalize well on new plots.
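    The Normalized Cut criterion that drives the segment merging can be written as Ncut(A,B) = cut(A,B)/assoc(A,V) + cut(A,B)/assoc(B,V). The tiny affinity matrix below is invented only to show that the criterion scores the natural two-cluster split lower than a partition that cuts through both clusters.

```python
import numpy as np

def ncut(W, in_a):
    """Normalized Cut value of the partition (A, B) of a graph with
    symmetric affinity matrix W; in_a marks the nodes belonging to A."""
    a = np.asarray(in_a, dtype=bool)
    cut = W[np.ix_(a, ~a)].sum()            # total affinity crossing the cut
    return cut / W[a].sum() + cut / W[~a].sum()

# two tight clusters {0,1} and {2,3} with weak cross-links
W = np.array([[0.0, 1.0, 0.1, 0.1],
              [1.0, 0.0, 0.1, 0.1],
              [0.1, 0.1, 0.0, 1.0],
              [0.1, 0.1, 1.0, 0.0]])

good = ncut(W, [True, True, False, False])  # natural split
bad = ncut(W, [True, False, True, False])   # splits both clusters
```

    In the paper's setting, the entries of `W` are the learned segment similarities, and the stopping criterion decides when further splitting of the stem graph should cease.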

  15. The Coast Artillery Journal. Volume 67, Number 6, December 1927

    DTIC Science & Technology

    1927-12-01

    of some treasonable plot or practise in an Army, the Generall must first assure the place, and then more fully search into the treason, and punish the...in Coast Artillery Memorandum No. 7 for 1928, 106; Changing Zones With Mortars, 138; The Variable Probable Error, 405; Taxation: Taxes Show Staggering

  16. Focus on Rashomon.

    ERIC Educational Resources Information Center

    Richie, Donald S., Ed.

    This Film Focus series is a collection of reviews, essays, and commentaries on the Japanese film Rashomon. The plot consists of an attack, a rape, and a robbery, all of which probably occurred during the Middle Ages. Each character relates his own version of what happened, or might have happened, revealing the outward and inner driving forces,…

  17. Teaching Qualitative Energy-Eigenfunction Shape with Physlets

    ERIC Educational Resources Information Center

    Belloni, Mario; Christian, Wolfgang; Cox, Anne J.

    2007-01-01

    More than 35 years ago, French and Taylor outlined an approach to teach students and teachers alike how to understand "qualitative plots of bound-state wave functions." They described five fundamental statements based on the quantum-mechanical concepts of probability and energy (total and potential), which could be used to deduce the shape of…

  18. Adaptive Dynamics, Control, and Extinction in Networked Populations

    DTIC Science & Technology

    2015-07-09

    network geometries. From the prehistory of paths that go extinct, a density function is created, and a clear local...density plots of Fig. 3b. Using the IAMM to compute the most probable path and comparing it to the prehistory of extinction events on stochastic networks

  19. A survival model for individual shortleaf pine trees in even-aged natural stands

    Treesearch

    Thomas B. Lynch; Michael M. Huebschmann; Paul A. Murphy

    2000-01-01

    A model was developed that predicts the probability of survival for individual shortleaf pine (Pinus echinata Mill.) trees growing in even-aged natural stands. Data for model development were obtained from the first two measurements of permanently established plots located in naturally occurring shortleaf pine forests on the Ouachita and...

  20. Enclosure fire hazard analysis using relative energy release criteria. [burning rate and combustion control

    NASA Technical Reports Server (NTRS)

    Coulbert, C. D.

    1978-01-01

    A method for predicting the probable course of fire development in an enclosure is presented. This fire modeling approach uses a graphic plot of five fire development constraints, the relative energy release criteria (RERC), to bound the heat release rates in an enclosure as a function of time. The five RERC are flame spread rate, fuel surface area, ventilation, enclosure volume, and total fuel load. They may be calculated versus time based on the specified or empirical conditions describing the specific enclosure, the fuel type and load, and the ventilation. The calculation of these five criteria, using the common basis of energy release rates versus time, provides a unifying framework for the utilization of available experimental data from all phases of fire development. The plot of these criteria reveals the probable fire development envelope and indicates which fire constraint will be controlling during a criteria time period. Examples of RERC application to fire characterization and control and to hazard analysis are presented along with recommendations for the further development of the concept.
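    The bounding idea can be sketched numerically: each criterion supplies a heat-release-rate limit versus time, the fire development envelope is their pointwise minimum, and the smallest curve at each instant identifies the controlling constraint. The five curves below are invented placeholders, not empirical RERC data.

```python
import numpy as np

t = np.arange(0, 601, 60.0)  # time (s), illustrative

# hypothetical heat-release-rate limits (kW) from the five RERC
limits = {
    "flame spread": 0.01 * t**2,                     # grows as fire spreads
    "fuel surface area": np.full_like(t, 900.0),
    "ventilation": np.full_like(t, 600.0),
    "enclosure volume": np.full_like(t, 1200.0),
    "total fuel load": np.maximum(1500.0 - 2.0 * t, 0.0),  # budget runs out
}

names = list(limits)
curves = np.vstack([limits[n] for n in names])
envelope = curves.min(axis=0)                  # probable fire envelope
controlling = [names[i] for i in curves.argmin(axis=0)]
```

    Early on, flame spread controls; mid-fire, the ventilation limit caps the envelope; late, the remaining fuel load takes over, which is exactly the kind of hand-off the RERC plot is meant to reveal.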

  1. Adjustments for the display of quantized ion channel dwell times in histograms with logarithmic bins.

    PubMed

    Stark, J A; Hladky, S B

    2000-02-01

    Dwell-time histograms are often plotted as part of patch-clamp investigations of ion channel currents. The advantages of plotting these histograms with a logarithmic time axis were demonstrated in earlier work (J. Physiol. (Lond.) 378:141-174; Pflügers Arch. 410:530-553; Biophys. J. 52:1047-1054). Sigworth and Sine argued that the interpretation of such histograms is simplified if the counts are presented in a manner similar to that of a probability density function. However, when ion channel records are recorded as a discrete time series, the dwell times are quantized. As a result, the mapping of dwell times to logarithmically spaced bins is highly irregular; bins may be empty, and significant irregularities may extend beyond the duration of 100 samples. Using simple approximations based on the nature of the binning process and the transformation rules for probability density functions, we develop adjustments for the display of the counts to compensate for this effect. Tests with simulated data suggest that this procedure provides a faithful representation of the data.
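    The irregular mapping of quantized dwell times into logarithmic bins, and an occupancy-based adjustment of the counts, can be sketched as follows. The sampling interval, bin edges, and exponential dwell distribution are illustrative, and the simple occupancy normalization stands in for the paper's more careful adjustments.

```python
import numpy as np

dt = 1e-4                       # sampling interval (s): dwells are multiples of dt
rng = np.random.default_rng(0)

# hypothetical quantized dwell times from an exponential process
k = np.round(rng.exponential(2e-3, 5000) / dt).astype(int)
dwells = k[k > 0] * dt

edges = np.logspace(-4, -1, 31)            # 10 log-spaced bins per decade
counts, _ = np.histogram(dwells, edges)

# occupancy: how many quantized values k*dt land in each log bin
kvals = np.arange(1, int(edges[-1] / dt) + 1) * dt
occupancy, _ = np.histogram(kvals, edges)

# adjusted counts: normalize by occupancy; bins that can hold no
# quantized value stay empty by construction
adjusted = np.divide(counts, occupancy,
                     out=np.zeros(counts.shape), where=occupancy > 0)
```

    The empty bins in `occupancy` at short times are precisely the irregularity the abstract describes: near the sampling interval, some logarithmic bins are narrower than one sample and can never receive a dwell.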

  2. Multivariate probability distribution for sewer system vulnerability assessment under data-limited conditions.

    PubMed

    Del Giudice, G; Padulano, R; Siciliano, D

    2016-01-01

    The lack of geometrical and hydraulic information about sewer networks often excludes the adoption of in-depth modeling tools to obtain prioritization strategies for funds management. The present paper describes a novel statistical procedure for defining the prioritization scheme for preventive maintenance strategies based on a small sample of failure data collected by the Sewer Office of the Municipality of Naples (IT). Novel aspects include, among others, treating sewer parameters as continuous statistical variables and accounting for their interdependences. After a statistical analysis of maintenance interventions, the most important available factors affecting the process are selected and their mutual correlations identified. Then, after a Box-Cox transformation of the original variables, a methodology is provided for the evaluation of a vulnerability map of the sewer network by adopting a joint multivariate normal distribution with different parameter sets. The goodness-of-fit is eventually tested for each distribution by means of a multivariate plotting position. The developed methodology is expected to assist municipal engineers in identifying critical sewers, prioritizing sewer inspections in order to fulfill rehabilitation requirements.
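    A minimal sketch of the idea, with invented attribute distributions and Box-Cox exponents: transform the variables toward normality, fit a joint normal, and rank sewers by squared Mahalanobis distance so the most atypical pipes surface first. The attribute names and parameter values are hypothetical.

```python
import numpy as np

def boxcox(x, lam):
    """One-parameter Box-Cox transform."""
    return np.log(x) if lam == 0 else (x**lam - 1.0) / lam

rng = np.random.default_rng(1)
# hypothetical sewer attributes: age (yr), diameter (m), hydraulic load
raw = np.column_stack([rng.gamma(5.0, 8.0, 200),
                       rng.gamma(3.0, 0.2, 200),
                       rng.gamma(2.0, 0.5, 200)])
lams = [0.3, 0.0, 0.5]                       # illustrative exponents
X = np.column_stack([boxcox(raw[:, j], lams[j]) for j in range(3)])

mu = X.mean(axis=0)
inv_cov = np.linalg.inv(np.cov(X, rowvar=False))
d2 = np.einsum('ij,jk,ik->i', X - mu, inv_cov, X - mu)  # Mahalanobis^2
priority = np.argsort(d2)[::-1]              # most atypical sewers first
```

    Under a good multivariate normal fit, the ordered `d2` values track chi-square quantiles, which is essentially what a multivariate plotting position checks.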

  3. Parametric study and global sensitivity analysis for co-pyrolysis of rape straw and waste tire via variance-based decomposition.

    PubMed

    Xu, Li; Jiang, Yong; Qiu, Rong

    2018-01-01

    In the present study, the co-pyrolysis behavior of rape straw, waste tire, and their various blends was investigated. TG-FTIR indicated that co-pyrolysis was characterized by a four-step reaction, and H2O, CH, OH, CO2, and CO groups were the main products evolved during the process. Additionally, using BBD-based experimental results, best-fit multiple regression models with high R2-pred values (94.10% for mass loss and 95.37% for reaction heat), which correlated explanatory variables with the responses, were presented. The derived models were analyzed by ANOVA at the 95% confidence level; the F-test, lack-of-fit test, and residual normal probability plots implied that the models described the experimental data well. Finally, the model uncertainties as well as the interactive effects of these parameters were studied, and the total-, first- and second-order sensitivity indices of the operating factors were proposed using Sobol' variance decomposition. To the authors' knowledge, this is the first time global parameter sensitivity analysis has been performed in the (co-)pyrolysis literature. Copyright © 2017 Elsevier Ltd. All rights reserved.
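    First-order Sobol' indices of the kind reported here can be estimated with the standard Saltelli pick-and-freeze scheme. The toy model below (y = x1 + 2·x2 on the unit square) is not from the paper; its analytic indices are 0.2 and 0.8, which the estimator should approximately recover.

```python
import numpy as np

def sobol_first_order(model, dim, n=20000, seed=0):
    """Saltelli-style Monte Carlo estimator of first-order Sobol indices."""
    rng = np.random.default_rng(seed)
    A = rng.random((n, dim))
    B = rng.random((n, dim))
    fA, fB = model(A), model(B)
    total_var = np.var(np.concatenate([fA, fB]))
    s1 = np.empty(dim)
    for i in range(dim):
        ABi = A.copy()
        ABi[:, i] = B[:, i]        # freeze factor i at B's values
        s1[i] = np.mean(fB * (model(ABi) - fA)) / total_var
    return s1

# additive toy model: Var = 1/12 + 4/12, so S1 = 0.2 and S2 = 0.8
s = sobol_first_order(lambda x: x[:, 0] + 2.0 * x[:, 1], dim=2)
```

    The same machinery, applied to the fitted regression surrogate instead of a toy model, yields the operating-factor sensitivity ranking the abstract describes; total-order indices use a closely related estimator.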

  4. External Threat Risk Assessment Algorithm (ExTRAA)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Powell, Troy C.

    Two risk assessment algorithms and philosophies have been augmented and combined to form a new algorithm, the External Threat Risk Assessment Algorithm (ExTRAA), that allows for effective and statistically sound analysis of external threat sources in relation to individual attack methods. In addition to the attack method use probability and the attack method employment consequence, the concept of defining threat sources is added to the risk assessment process. Sample data are tabulated and depicted in radar plots and bar graphs for algorithm demonstration purposes. The largest success of ExTRAA is its ability to visualize the kind of risk posed in a given situation using the radar plot method.

  5. Variation of normal tissue complication probability (NTCP) estimates of radiation-induced hypothyroidism in relation to changes in delineation of the thyroid gland.

    PubMed

    Rønjom, Marianne F; Brink, Carsten; Lorenzen, Ebbe L; Hegedüs, Laszlo; Johansen, Jørgen

    2015-01-01

    To examine the variations of risk estimates of radiation-induced hypothyroidism (HT) from our previously developed normal tissue complication probability (NTCP) model in patients with head and neck squamous cell carcinoma (HNSCC) in relation to variability of delineation of the thyroid gland. In a previous study for development of an NTCP model for HT, the thyroid gland was delineated in 246 treatment plans of patients with HNSCC. Fifty of these plans were randomly chosen for re-delineation for a study of the intra- and inter-observer variability of thyroid volume, Dmean, and estimated risk of HT. Bland-Altman plots were used for assessment of the systematic (mean) and random [standard deviation (SD)] variability of the three parameters, and a method for displaying the spatial variation in delineation differences was developed. Intra-observer variability resulted in a mean difference in thyroid volume and Dmean of 0.4 cm³ (SD ± 1.6) and -0.5 Gy (SD ± 1.0), respectively, and 0.3 cm³ (SD ± 1.8) and 0.0 Gy (SD ± 1.3) for inter-observer variability. The corresponding mean differences of NTCP values for radiation-induced HT due to intra- and inter-observer variations were negligibly small, -0.4% (SD ± 6.0) and -0.7% (SD ± 4.8), respectively, but as the SDs show, for some patients the difference in estimated NTCP was large. For the entire study population, the variation in predicted risk of radiation-induced HT in head and neck cancer was small and our NTCP model was robust against observer variations in delineation of the thyroid gland. However, for the individual patient, there may be large differences in estimated risk, which calls for precise delineation of the thyroid gland to obtain correct dose and NTCP estimates for optimized treatment planning in the individual patient.
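    The Bland-Altman quantities used here (mean difference, SD of differences, and 95% limits of agreement) reduce to a few lines. The paired thyroid volume measurements below are invented for illustration.

```python
import numpy as np

def bland_altman(a, b):
    """Bias, SD of paired differences, and 95% limits of agreement."""
    diff = np.asarray(a, float) - np.asarray(b, float)
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, sd, (bias - 1.96 * sd, bias + 1.96 * sd)

# hypothetical repeat delineations of thyroid volume (cm^3)
first = np.array([12.1, 9.8, 15.3, 11.0, 13.6])
second = np.array([11.7, 10.3, 14.8, 11.2, 13.1])
bias, sd, limits = bland_altman(first, second)
```

    In a Bland-Altman plot these values become the horizontal bias line and the two limit lines drawn over the scatter of per-patient differences against per-patient means.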

  6. How Does Gender Affect Sustainable Intensification of Cereal Production in the West African Sahel? Evidence from Burkina Faso.

    PubMed

    Theriault, Veronique; Smale, Melinda; Haider, Hamza

    2017-04-01

    Better understanding of gender differences in the adoption of agricultural intensification strategies is crucial for designing effective policies to close the gender gap while sustainably enhancing farm productivity. We examine gender differences in adoption rates and in the likelihood and determinants of adopting strategy sets that enhance yields, protect crops, and restore soils in the West African Sahel, based on analysis of cereal production in Burkina Faso. Applying a multivariate probit model to a nationally representative household panel, we exploit the individual plot as the unit of analysis and control for plot manager characteristics along with other covariates. Reflecting the socio-cultural context of farming combined with the economic attributes of inputs, we find that female managers of individual cereal fields are less likely than their male counterparts to adopt yield-enhancing and soil-restoring strategies, although no differential is apparent for yield-protecting strategies. More broadly, gender-disaggregated regressions demonstrate that adoption determinants differ by gender. Plot manager characteristics, including age, marital status, and access to credit or extension services, do influence adoption decisions. Furthermore, household resources influence the probability of adopting intensification strategy sets differently by gender of the plot manager. Variables expressing the availability of household labor strongly influence the adoption of soil-restoring strategies by female plot managers. By contrast, household resources such as extent of livestock owned, value of non-farm income, and area planted to cotton affect the adoption choices of male plot managers. Rectifying the male bias in extension services along with improving access to credit, income, and equipment for female plot managers could contribute to sustainable agricultural intensification.

  7. IMS/Satellite Situation Center report: Orbit plots and bar charts for Prognoz 4, days 1-91 1976

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Orbit plots for the Prognoz 4 satellite for the time period January to March 1976 are given. This satellite was identified as a possible important contributor to the International Magnetospheric Study project. The orbits were based on an element epoch of December 26, 1975, 3h 8min and 17s. In view of the low perigee of this satellite, the Satellite Situation Center (SSC) considered that the effect of atmospheric drag precludes orbit predictions for the length of time normally used by the SSC for high-altitude satellites. Consequently, orbit data are shown for the first 3 months of 1976 only. The orbit generated for this report was based on the earlier epoch, and it positions the satellite within 30s of the ascending node at the later epoch. Therefore, within the accuracy of the plots shown in this report, the orbit used was regarded as an achieved orbit. The orbit information is displayed graphically in four ways: bar charts, geocentric solar ecliptic plots, boundary plots, and solar magnetic latitude versus local time plots. The most concise presentation is the bar charts. The bar charts give the crude three-dimensional position of the satellite for each magnetospheric region.

  8. Quantitative features in the computed tomography of healthy lungs.

    PubMed Central

    Fromson, B H; Denison, D M

    1988-01-01

    This study set out to determine whether quantitative features of lung computed tomography scans could be identified that would lead to a tightly defined normal range for use in assessing patients. Fourteen normal subjects with apparently healthy lungs were studied. A technique was developed for rapid and automatic extraction of lung field data from the computed tomography scans. The Hounsfield unit histograms were constructed and, when normalised for predicted lung volumes, shown to be consistent in shape for all the subjects. A three dimensional presentation of the data in the form of a "net plot" was devised, and from this a logarithmic relationship between the area of each lung slice and its mean density was derived (r = 0.9, n = 545, p < 0.0001). The residual density, calculated as the difference between measured density and density predicted from the relationship with area, was shown to be normally distributed with a mean of 0 and a standard deviation of 25 Hounsfield units (χ² test: p < 0.05). A presentation combining this residual density with the net plot is described. PMID:3353883
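    The "net plot" analysis above fits a logarithmic relationship between slice area and mean density and then examines the residual density. A minimal sketch on synthetic data (all constants here are illustrative stand-ins, not the paper's values):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the paper's data: slice areas (cm^2) and mean
# densities (Hounsfield units) linked by a logarithmic relationship,
# with residual scatter of about 25 HU as reported above.
area = rng.uniform(20.0, 400.0, 200)
density = -120.0 + 35.0 * np.log(area) + rng.normal(0.0, 25.0, 200)

# Fit density = a * log(area) + b, then form the residual density
a, b = np.polyfit(np.log(area), density, 1)
residual = density - (a * np.log(area) + b)
```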

  9. Co-evolution of soils and vegetation in the Aísa Valley Experimental Station (Central Pyrenees)

    NASA Astrophysics Data System (ADS)

    Serrano Muela, Maria Pilar; Nadal Romero, Estela; Lasanta, Teodoro; María García Ruiz, José

    2013-04-01

    Soils and vegetation tend to evolve jointly in relation to climate evolution and the impacts of human activity. This study analyzes soil and vegetation characteristics under various plant covers, using information from the Aísa Valley Experimental Station (AVES), Spanish Pyrenees, from 1991 to 2010. The land uses considered were: dense shrub cover, grazing meadow, abandoned field, cereal (barley), abandoned shifting agriculture, active shifting agriculture, burnt1 and burnt2 plots, and a fallow plot. All the plots were installed on a field abandoned 45 years ago. Some of the plots did not change in plant cover through the study period (e.g., the meadow, cereal and shifting agriculture plots), but others underwent changes in density and composition: (i) The dense shrub cover plot represents the natural evolution of the abandoned field. When the AVES was equipped, this plot was completely dominated by Genista scorpius, with a few stands of Rosa gr. canina. Twenty years later, Genista scorpius is affected by senescence and shows almost no regeneration capacity. (ii) The abandoned field had previously been cultivated with cereals until 1993. Once abandoned, plant colonization progressed very rapidly, first with grasses and, 10 years later, with Genista scorpius. At present, the latter occupies more than 50% of the plot. (iii) Plant colonization in the abandoned shifting agriculture plot was slower than in the 'normal' abandoned field, mainly because of the differences in fertilization when they were cultivated. (iv) One of the burnt plots evolved from 0% to a coverage of almost 100% in a short period, whereas the other remained at a shrub density of about 60% several years after the fire. Soil samples (surface and at depth) were analyzed to obtain physical and chemical properties: structure, texture, pH, CaCO3, organic matter and various anions and cations. The main purpose was to detect differences in the soil properties as a consequence of land cover/land uses.

  10. Algae Tile Data: 2004-2007, BPA-51; Preliminary Report, October 28, 2008.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holderman, Charles

    Multiple files containing 2004 through 2007 Tile Chlorophyll data for the Kootenai River sites designated as: KR1, KR2, KR3, KR4 (Downriver) and KR6, KR7, KR9, KR9.1, KR10, KR11, KR12, KR13, KR14 (Upriver) were received by SCS. For a complete description of the sites covered, please refer to http://ktoi.scsnetw.com. To maintain consistency with the previous SCS algae reports, all analyses were carried out separately for the Upriver and Downriver categories, as defined in the aforementioned paragraph. The Upriver designation, however, now includes three additional sites, KR11, KR12, and the nutrient addition site, KR9.1. Summary statistics and information on the four responses, chlorophyll a, chlorophyll a Accrual Rate, Total Chlorophyll, and Total Chlorophyll Accrual Rate are presented in Print Out 2. Computations were carried out separately for each river position (Upriver and Downriver) and year. For example, the Downriver position in 2004 showed an average Chlorophyll a level of 25.5 mg with a standard deviation of 21.4 and minimum and maximum values of 3.1 and 196 mg, respectively. The Upriver data in 2004 showed a lower overall average chlorophyll a level at 2.23 mg with a lower standard deviation (3.6) and minimum and maximum values of 0.13 and 28.7, respectively. A more comprehensive summary of each variable and position is given in Print Out 3. This lists the information above as well as other summary information such as the variance, standard error, various percentiles and extreme values. Using the 2004 Downriver Chlorophyll a as an example again, the variance of this data was 459.3 and the standard error of the mean was 1.55. The median value or 50th percentile was 21.3, meaning 50% of the data fell above and below this value. It should be noted that this value is somewhat different from the mean of 25.5. This is an indication that the frequency distribution of the data is not symmetrical (skewed). 
The skewness statistic, listed as part of the first section of each analysis, quantifies this. In a symmetric distribution, such as a Normal distribution, the skewness value would be 0. The tile chlorophyll data, however, shows larger values. Chlorophyll a, in the 2004 Downriver example, has a skewness statistic of 3.54, which is quite high. In the last section of the summary analysis, the stem and leaf plot graphically demonstrates the asymmetry, showing most of the data centered around 25 with a large value at 196. The final plot is referred to as a normal probability plot and graphically compares the data to a theoretical normal distribution. For chlorophyll a, the data (asterisks) deviate substantially from the theoretical normal distribution (diagonal reference line of pluses), indicating that the data are non-normal. Other response variables in both the Downriver and Upriver categories also indicated skewed distributions. Because the sample size and mean comparison procedures below require symmetrical, normally distributed data, each response in the data set was logarithmically transformed. The logarithmic transformation, in this case, can help mitigate skewness problems. The summary statistics for the four transformed responses (log-ChlorA, log-TotChlor, and log-accrual) are given in Print Out 4. For the 2004 Downriver Chlorophyll a data, the logarithmic transformation reduced the skewness value to -0.36 and produced a more bell-shaped symmetric frequency distribution. Similar improvements are shown for the remaining variables and river categories. Hence, all subsequent analyses given below are based on logarithmic transformations of the original responses.
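    The skewness reduction described above is easy to reproduce: a right-skewed sample becomes roughly symmetric after a logarithmic transformation. A minimal sketch with a synthetic log-normal sample standing in for the chlorophyll data:

```python
import numpy as np

def skewness(x):
    """Sample skewness: the third standardized central moment."""
    z = (x - np.mean(x)) / np.std(x)
    return float(np.mean(z ** 3))

rng = np.random.default_rng(42)
# Right-skewed synthetic sample standing in for the chlorophyll data
chl = rng.lognormal(mean=3.0, sigma=0.8, size=500)

raw_skew = skewness(chl)          # strongly positive
log_skew = skewness(np.log(chl))  # near zero after the transform
```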

  11. Preliminary Findings of the Photovoltaic Cell Calibration Experiment on Pathfinder Flight 95-3

    NASA Technical Reports Server (NTRS)

    Vargas-Aburto, Carlos

    1997-01-01

    The objective of the photovoltaic (PV) cell calibration experiment for Pathfinder was to develop an experiment, compatible with an ultralight UAV, to predict the performance of PV cells at AM0, the solar spectrum in space, using the Langley plot technique. The Langley plot is a valuable technique for this purpose and requires accurate measurements of air mass (pressure), cell temperature, solar irradiance, and current-voltage (IV) characteristics with the cells directed normal to the direct ray of the sun. Pathfinder's flight 95-3 mission objective of a 65,000 ft maximum altitude is ideal for performing the Langley plot measurements. Miniaturization of electronic data acquisition equipment enabled the design and construction of an accurate and lightweight measurement system that meets Pathfinder's low payload weight requirements.
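    The Langley plot technique mentioned above rests on ln(I) being linear in air mass for a stable atmosphere, so a fit extrapolated to zero air mass estimates the AM0 response. A noiseless sketch with assumed constants (the 0.50 A AM0 current and 0.12 attenuation coefficient are hypothetical):

```python
import numpy as np

# Hypothetical constants: AM0 short-circuit current (A) and effective
# atmospheric attenuation per unit air mass.
i_am0_true = 0.50
tau = 0.12

# Measurements at several air masses; ln(I) is linear in air mass.
air_mass = np.array([1.0, 1.5, 2.0, 3.0, 4.0])
current = i_am0_true * np.exp(-tau * air_mass)

# Langley plot: fit ln(I) vs air mass and extrapolate to air mass 0 (AM0)
slope, intercept = np.polyfit(air_mass, np.log(current), 1)
i_am0_est = np.exp(intercept)
```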

  12. PRODIGEN: visualizing the probability landscape of stochastic gene regulatory networks in state and time space.

    PubMed

    Ma, Chihua; Luciani, Timothy; Terebus, Anna; Liang, Jie; Marai, G Elisabeta

    2017-02-15

    Visualizing the complex probability landscape of stochastic gene regulatory networks can further biologists' understanding of phenotypic behavior associated with specific genes. We present PRODIGEN (PRObability DIstribution of GEne Networks), a web-based visual analysis tool for the systematic exploration of probability distributions over simulation time and state space in such networks. PRODIGEN was designed in collaboration with bioinformaticians who research stochastic gene networks. The analysis tool combines existing, expanded, and new visual encodings in a novel way to capture the time-varying characteristics of probability distributions: spaghetti plots over one-dimensional projections, heatmaps of distributions over 2D projections enhanced with overlaid time curves to display temporal changes, and novel individual glyphs of state information corresponding to particular peaks. We demonstrate the effectiveness of the tool through two case studies on the computed probabilistic landscape of a gene regulatory network and of a toggle-switch network. Domain expert feedback indicates that our visual approach can help biologists: 1) visualize probabilities of stable states, 2) explore the temporal probability distributions, and 3) discover small peaks in the probability landscape that have potential relation to specific diseases.

  13. Analysis of vegetation in an Imperata grassland of Barak valley, Assam.

    PubMed

    Astapati, Ashim Das; Das, Ashesh Kumar

    2012-09-01

    Imperata grassland at Dorgakona, Barak valley, North Eastern India was analyzed for species composition and diversity pattern in relation to traditional management practices. Nineteen families were recorded in the burnt and unburnt plots of the study site, with Poaceae the most dominant. 29 species occurred in the burnt plot and 28 in the unburnt plot; most species were common to both plots. The pattern of frequency diagrams indicated that the vegetation was homogeneous. Imperata cylindrica, a rhizomatous grass, was the dominant species based on density (318.75 and 304.18 nos. m⁻²), basal cover (158.22 and 148.34 cm² m⁻²) and Importance Value Index (IVI) (132.64 and 138.74) for the burnt and unburnt plots, respectively. Borreria pusilla was the co-dominant species, constituting the Imperata-Borreria assemblage of the studied grassland. It was observed that B. pusilla (162.25 and 50.37 nos. m⁻²), I. cylindrica (318.75 and 304.18 nos. m⁻²) and Setaria glauca (24.70 and 16.46 nos. m⁻²) benefited from burning, as shown by the values sequentially placed for the burnt and unburnt plots. Certain grasses like Chrysopogon aciculatus and Sacciolepis indica were restricted to the burnt plot, while Oxalis corniculata was present only in the unburnt plot. Grasses dominated the grassland, contributing a mean percentage cover of 72% in the burnt plot and 76% in the unburnt plot. The dominance-diversity curves in the study site approached a log normal series distribution, suggesting that the resources are shared by the constituent species. Seasonal pattern in the diversity index suggested a definite influence of climatic seasonality on species diversity; the rainy season was conducive to maximum diversity (1.40 and 1.38 in the burnt and unburnt plots, respectively). Dominance increased with concentration of fewer species (0.0021 in the burnt plot and 0.0055 in the unburnt plot) in summer and behaved inversely to the index of diversity. This study showed that the traditional management practices benefit the farmers as they promote grassland regeneration with I. cylindrica as the dominant grass.
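    The abstract does not state which diversity and dominance indices were used; as an illustration, the common Shannon and Simpson indices can be computed from density data like the values quoted above (the two smallest densities below are invented to complete the example):

```python
import numpy as np

def shannon_index(counts):
    """Shannon diversity H' = -sum(p_i * ln p_i) over nonzero proportions."""
    p = np.asarray(counts, float)
    p = p[p > 0] / p.sum()
    return float(-(p * np.log(p)).sum())

def simpson_dominance(counts):
    """Simpson dominance C = sum(p_i^2); larger when few species dominate."""
    p = np.asarray(counts, float)
    p = p / p.sum()
    return float((p ** 2).sum())

# Densities (nos. per m^2): the first three are quoted above for the burnt
# plot; the last two are invented minor species to complete the example.
dens = [318.75, 162.25, 24.70, 5.0, 2.0]
H = shannon_index(dens)
C = simpson_dominance(dens)
```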

  14. Characterizing Decision-Analysis Performances of Risk Prediction Models Using ADAPT Curves.

    PubMed

    Lee, Wen-Chung; Wu, Yun-Chun

    2016-01-01

    The area under the receiver operating characteristic curve is a widely used index to characterize the performance of diagnostic tests and prediction models. However, the index does not explicitly acknowledge the utilities of risk predictions. Moreover, for most clinical settings, what counts is whether a prediction model can guide therapeutic decisions in a way that improves patient outcomes, rather than to simply update probabilities. Based on decision theory, the authors propose an alternative index, the "average deviation about the probability threshold" (ADAPT). An ADAPT curve (a plot of ADAPT value against the probability threshold) neatly characterizes the decision-analysis performances of a risk prediction model. Several prediction models can be compared for their ADAPT values at a chosen probability threshold, for a range of plausible threshold values, or for the whole ADAPT curves. This should greatly facilitate the selection of diagnostic tests and prediction models.

  15. Probability analysis for consecutive-day maximum rainfall for Tiruchirapalli City (south India, Asia)

    NASA Astrophysics Data System (ADS)

    Sabarish, R. Mani; Narasimhan, R.; Chandhru, A. R.; Suribabu, C. R.; Sudharsan, J.; Nithiyanantham, S.

    2017-05-01

    In the design of irrigation and other hydraulic structures, evaluating the magnitude of extreme rainfall for a specific probability of occurrence is of much importance. The capacity of such structures is usually designed to cater to the probability of occurrence of extreme rainfall during its lifetime. In this study, an extreme value analysis of rainfall for Tiruchirapalli City in Tamil Nadu was carried out using 100 years of rainfall data. Statistical methods were used in the analysis. The best-fit probability distribution was evaluated for 1, 2, 3, 4 and 5 days of continuous maximum rainfall. The goodness of fit was evaluated using the Chi-square test. The results of the goodness-of-fit tests indicate that the log-Pearson type III method is the overall best-fit probability distribution for the 1-day maximum rainfall and consecutive 2-, 3-, 4-, 5- and 6-day maximum rainfall series of Tiruchirapalli. For reliability, the forecasted maximum rainfalls for the selected return periods were evaluated against the results of the plotting-position method.
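    The goodness-of-fit step above can be sketched with a binned Chi-square statistic. For a self-contained example, a Gumbel distribution fitted by the method of moments stands in for the log-Pearson type III family used in the study, and the annual maxima are synthetic:

```python
import numpy as np

def chi_square_gof(data, cdf, n_bins=5):
    """Binned Chi-square goodness-of-fit statistic.

    Maps the data through the candidate CDF (probability-integral
    transform), so equal-width bins on [0, 1] have equal expected counts.
    """
    u = cdf(np.asarray(data, float))
    observed, _ = np.histogram(u, bins=np.linspace(0.0, 1.0, n_bins + 1))
    expected = len(data) / n_bins
    return float(((observed - expected) ** 2 / expected).sum())

rng = np.random.default_rng(7)
maxima = rng.gumbel(loc=120.0, scale=30.0, size=100)  # synthetic annual maxima

# Method-of-moments Gumbel fit (Euler-Mascheroni constant ~ 0.5772)
beta = maxima.std(ddof=1) * np.sqrt(6.0) / np.pi
mu = maxima.mean() - 0.5772 * beta

stat = chi_square_gof(maxima, lambda x: np.exp(-np.exp(-(x - mu) / beta)))
```

A small statistic (relative to the Chi-square critical value for the bin count) indicates an acceptable fit; competing distributions can be ranked by this statistic as in the study.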

  16. A new general method for the assessment of the molecular-weight distribution of polydisperse preparations. Its application to an intestinal epithelial glycoprotein and two dextran samples, and comparison with a monodisperse glycoprotein

    PubMed Central

    Gibbons, Richard A.; Dixon, Stephen N.; Pocock, David H.

    1973-01-01

    A specimen of intestinal glycoprotein isolated from the pig and two samples of dextran, all of which are polydisperse (that is, the preparations may be regarded as consisting of a continuous distribution of molecular weights), have been examined in the ultracentrifuge under meniscus-depletion conditions at equilibrium. They are compared with each other and with a glycoprotein from Cysticercus tenuicollis cyst fluid which is almost monodisperse. The quantity c^(−1/3) (c = concentration) is plotted against ξ (the reduced radius); this plot is linear when the molecular-weight distribution approximates to the `most probable', i.e. when Mn:Mw:Mz:M(z+1)… is as 1:2:3:4, etc. The use of this plot, and related procedures, to evaluate qualitatively and semi-quantitatively molecular-weight distribution functions where they can be realistically approximated to Schulz distributions is discussed. The theoretical basis is given in an Appendix. PMID:4778265

  17. Modeled forest inventory data suggest climate benefits from fuels management

    Treesearch

    Jeremy S. Fried; Theresa B. Jain; Jonathan. Sandquist

    2013-01-01

    As part of a recent synthesis addressing fuel management in dry, mixed-conifer forests we analyzed more than 5,000 Forest Inventory and Analysis (FIA) plots, a probability sample that represents 33 million acres of these forests throughout Washington, Oregon, Idaho, Montana, Utah, and extreme northern California. We relied on the BioSum analysis framework that...

  18. Point-Sampling and Line-Sampling Probability Theory, Geometric Implications, Synthesis

    Treesearch

    L.R. Grosenbaugh

    1958-01-01

    Foresters concerned with measuring tree populations on definite areas have long employed two well-known methods of representative sampling. In list or enumerative sampling the entire tree population is tallied with a known proportion being randomly selected and measured for volume or other variables. In area sampling all trees on randomly located plots or strips...

  19. Analyzing the Surface Warfare Operational Effectiveness of an Offshore Patrol Vessel using Agent Based Modeling

    DTIC Science & Technology

    2012-09-01

    [From the list of figures: Figure 6, Marte missile Phit-range profile; Figure 7, Exocet missile Phit-range profile; Figure 8, gun Phit-range profile.] … in the OSN model. Factors like range and Phit probability plots and agent-dependent factors could be directly implemented in MANA with little effort.

  20. Climatic stress increases forest fire severity across the western United States

    Treesearch

    Phillip J. van Mantgem; Jonathan C.B. Nesmith; MaryBeth Keifer; Eric E. Knapp; Alan Flint; Lorraine Flint

    2013-01-01

    Pervasive warming can lead to chronic stress on forest trees, which may contribute to mortality resulting from fire-caused injuries. Longitudinal analyses of forest plots from across the western US show that high pre-fire climatic water deficit was related to increased post-fire tree mortality probabilities. This relationship between climate and fire was present after...

  1. A survival model for individual shortleaf pine trees in even-aged natural stands

    Treesearch

    Thomas B. Lynch; Michael M. Huebschmann; Paul A. Murphy

    2000-01-01

    A model was developed that predicts the probability of survival for individual shortleaf pine (Pinus echinata Mill.) trees growing in even-aged natural stands. Data for model development were obtained from the first two measurements of permanently established plots located in naturally occurring shortleaf pine forests on the Ouachita and Ozark...

  2. Diameter Growth, Survival, and Volume Estimates for Missouri Trees

    Treesearch

    Stephen R. Shifley; W. Brad Smith

    1982-01-01

    Measurements of more than 20,000 Missouri trees were summarized by species and diameter class into tables of mean annual diameter growth, annual probability of survival, net cubic foot volume, and net board foot volume. In the absence of better forecasting techniques, this information can be utilized to project short-term changes for Missouri trees, inventory plots,...

  3. [Establishment of the mathematic model of total quantum statistical moment standard similarity for application to medical theoretical research].

    PubMed

    He, Fu-yuan; Deng, Kai-wen; Huang, Sheng; Liu, Wen-long; Shi, Ji-lian

    2013-09-01

    The paper aims to elucidate and establish a new mathematic model, the total quantum statistical moment standard similarity (TQSMSS), on the basis of the original total quantum statistical moment model, and to illustrate the application of the model to medical theoretical research. The model was established by combining the statistical moment principle with the properties of the normal distribution probability density function, then validated and illustrated by the pharmacokinetics of three ingredients in Buyanghuanwu decoction and of three data analytical methods for them, and by analysis of the chromatographic fingerprints of extracts obtained by dissolving the Buyanghuanwu-decoction extract in solvents of different solubility parameters. The established model consists of five main parameters: (1) the total quantum statistical moment similarity ST, the area overlapped by the two normal distribution probability density curves obtained by conversion of the two TQSM parameters; (2) the total variability DT, a confidence limit of the standard normal accumulation probability equal to the absolute difference between the two normal accumulation probabilities integrated to the intersection of their curves; (3) the total variable probability 1-Ss, the standard normal distribution probability within the interval DT; (4) the total variable probability (1-β)α; and (5) the stable confident probability β(1-α), the correct probability for drawing positive and negative conclusions under confidence coefficient α. 
With the model, the TQSMS similarities of the pharmacokinetics of the three ingredients in Buyanghuanwu decoction, and of the three data analytical methods for them, were in the range 0.3852-0.9875, illuminating their different pharmacokinetic behaviors; the TQSMS similarities (ST) of the chromatographic fingerprints of the extracts obtained with solvents of different solubility parameters were in the range 0.6842-0.9992, showing the different constituents of the various solvent extracts. The TQSMSS can characterize sample similarity, by which we can quantify, with a power test, the correct probability of drawing positive and negative conclusions under confidence coefficient α, whether or not the samples come from the same population; it thus enables analysis at both macroscopic and microscopic levels and serves as an important similarity-analysis method for medical theoretical research.
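    The ST parameter above is described as the area overlapped by two normal probability density curves. A numerical sketch of such an overlap similarity (the integration scheme and example parameters are illustrative, not the paper's procedure):

```python
import numpy as np

def normal_pdf(x, m, s):
    return np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2.0 * np.pi))

def normal_overlap(mu1, sd1, mu2, sd2, n=200001):
    """Overlapped area of two normal pdfs by a simple Riemann sum.

    Returns a similarity in [0, 1]: 1 for identical curves, shrinking as
    they separate. Loosely mirrors the ST parameter described above.
    """
    lo = min(mu1 - 8 * sd1, mu2 - 8 * sd2)
    hi = max(mu1 + 8 * sd1, mu2 + 8 * sd2)
    x = np.linspace(lo, hi, n)
    overlap = np.minimum(normal_pdf(x, mu1, sd1), normal_pdf(x, mu2, sd2))
    return float(overlap.sum() * (x[1] - x[0]))

same = normal_overlap(0.0, 1.0, 0.0, 1.0)   # identical curves
shift = normal_overlap(0.0, 1.0, 3.0, 1.0)  # means separated by 3 SDs
```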

  4. Normalized Coffin-Manson plot in terms of a new life function based on stress relaxation under creep-fatigue conditions

    NASA Astrophysics Data System (ADS)

    Jeong, Chang Yeol; Nam, Soo Woo; Lim, Jong Dae

    2003-04-01

    A new life prediction function based on a model formulated in terms of stress relaxation during hold time under creep-fatigue conditions is proposed. From the idea that the reduction in fatigue life with hold time is due to the creep effect of stress relaxation, which results in additional energy dissipation in the hysteresis loop, it is suggested that the relaxed stress range may serve as a creep-fatigue damage function. Creep-fatigue data from the present and other investigators are used to check the validity of the proposed life prediction equation, and the data confirm the applicability of the life relation model. Accordingly, using this life prediction model, all the Coffin-Manson plots at various levels of hold time in strain-controlled creep-fatigue tests can be normalized onto a single straight line.
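    The normalization result above builds on the Coffin-Manson relation, which is a straight line on log-log axes. A minimal sketch that generates noiseless data from assumed constants and recovers them with a log-log fit (the coefficient and exponent below are hypothetical, not from the paper):

```python
import numpy as np

# Assumed (not from the paper) fatigue ductility coefficient and exponent
ef_true, c_true = 0.35, -0.55

# Coffin-Manson relation: plastic strain amplitude = ef * (2 * Nf)**c,
# where Nf is cycles to failure; a straight line on log-log axes.
nf = np.array([1e2, 1e3, 1e4, 1e5])
strain_amp = ef_true * (2.0 * nf) ** c_true

# Recover the constants with a log-log linear fit (noiseless, so exact)
c_fit, log_ef = np.polyfit(np.log10(2.0 * nf), np.log10(strain_amp), 1)
ef_fit = 10.0 ** log_ef
```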

  5. The steady-state mosaic of disturbance and succession across an old-growth Central Amazon forest landscape.

    PubMed

    Chambers, Jeffrey Q; Negron-Juarez, Robinson I; Marra, Daniel Magnabosco; Di Vittorio, Alan; Tews, Joerg; Roberts, Dar; Ribeiro, Gabriel H P M; Trumbore, Susan E; Higuchi, Niro

    2013-03-05

    Old-growth forest ecosystems comprise a mosaic of patches in different successional stages, with the fraction of the landscape in any particular state relatively constant over large temporal and spatial scales. The size distribution and return frequency of disturbance events, and subsequent recovery processes, determine to a large extent the spatial scale over which this old-growth steady state develops. Here, we characterize this mosaic for a Central Amazon forest by integrating field plot data, remote sensing disturbance probability distribution functions, and individual-based simulation modeling. Results demonstrate that a steady state of patches of varying successional age occurs over a relatively large spatial scale, with important implications for detecting temporal trends on plots that sample a small fraction of the landscape. Long highly significant stochastic runs averaging 1.0 Mg biomass·ha⁻¹·y⁻¹ were often punctuated by episodic disturbance events, resulting in a sawtooth time series of hectare-scale tree biomass. To maximize the detection of temporal trends for this Central Amazon site (e.g., driven by CO2 fertilization), plots larger than 10 ha would provide the greatest sensitivity. A model-based analysis of fractional mortality across all gap sizes demonstrated that 9.1-16.9% of tree mortality was missing from plot-based approaches, underscoring the need to combine plot and remote-sensing methods for estimating net landscape carbon balance. Old-growth tropical forests can exhibit complex large-scale structure driven by disturbance and recovery cycles, with ecosystem and community attributes of hectare-scale plots exhibiting continuous dynamic departures from a steady-state condition.

  6. Key seabird areas in southern New England identified using a community occupancy model

    USGS Publications Warehouse

    O'Connell, Allan F.; Flanders, Nicholas P.; Gardner, Beth; Winiarski, Kristopher J.; Paton, Peter W. C.; Allison, Taber

    2015-01-01

    Seabirds are of conservation concern, and as new potential risks to seabirds are arising, the need to provide unbiased estimates of species’ distributions is growing. We applied community occupancy models to detection/non-detection data collected from repeated aerial strip-transect surveys conducted in 2 large study plots off southern New England, USA; one off the coast of Rhode Island and the other in Nantucket Sound. A total of 17 seabird species were observed at least once in each study plot. We found that detection varied by survey date and effort for most species and the average detection probability across species was less than 0.4. We estimated the influence of water depth, sea surface temperature, and sea surface chl a concentration on species-specific occupancy. Diving species showed large differences between the 2 study plots in their predicted winter distributions, which were largely explained by water depth acting as a stronger predictor of occupancy in Rhode Island than in Nantucket Sound. Conversely, similarities between the 2 study plots in predicted winter distributions of surface-feeding species were explained by sea surface temperature or chlorophyll a concentration acting as predictors of these species’ occupancy in both study plots. We predicted the number of species at each site using the observed data in order to detect ‘hot-spots’ of seabird diversity and use in the 2 study plots. These results provide new information on detection of species, areas of use, and relationships with environmental variables that will be valuable for biologists and planners interested in seabird conservation in the region.

  7. Probability density function of non-reactive solute concentration in heterogeneous porous formations.

    PubMed

    Bellin, Alberto; Tonina, Daniele

    2007-10-30

    Available models of solute transport in heterogeneous formations do not provide a complete characterization of the predicted concentration. This is a serious drawback, especially in risk analysis, where confidence intervals and probabilities of exceeding threshold values are required. To help fill this gap, we propose a probability distribution model for the local concentration of conservative tracers migrating in heterogeneous aquifers. Our model accounts for dilution, mechanical mixing within the sampling volume and spreading due to formation heterogeneity. It is developed by modeling local concentration dynamics with an Ito Stochastic Differential Equation (SDE) that, under the hypothesis of statistical stationarity, leads to the Beta probability distribution function (pdf) for the solute concentration. This model shows large flexibility in capturing the smoothing effect of the sampling volume and the associated reduction of the probability of exceeding large concentrations. Furthermore, it is fully characterized by the first two moments of the solute concentration, the same information required by standard geostatistical techniques employing Normal or Log-Normal distributions. Additionally, we show that in the absence of pore-scale dispersion and for point concentrations the pdf model converges to the binary distribution of [Dagan, G., 1982. Stochastic modeling of groundwater flow by unconditional and conditional probabilities, 2, The solute transport. Water Resour. Res. 18 (4), 835-848.], while it approaches the Normal distribution for sampling volumes much larger than the characteristic scale of the aquifer heterogeneity. Furthermore, we demonstrate that the same model, with spatial moments replacing the statistical moments, can be applied to estimate the proportion of the plume volume where solute concentrations are above or below critical thresholds.
Application of this model to point and vertically averaged bromide concentrations from the first Cape Cod tracer test and to a set of numerical simulations confirms the above findings and shows, for the first time, the superiority of the Beta model over both the Normal and Log-Normal models in interpreting field data. Furthermore, we show that assuming a priori that local concentrations are normally or log-normally distributed may result in a severe underestimate of the probability of exceeding large concentrations.
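Because the Beta model is fully characterized by the first two moments of concentration, its shape parameters follow directly from a method-of-moments fit. A minimal sketch (illustrative, not the authors' code), for a concentration normalized to [0, 1]:

```python
def beta_params_from_moments(mean, var):
    """Method-of-moments fit of Beta(a, b) shape parameters.

    Requires 0 < mean < 1 and var < mean * (1 - mean); otherwise no
    Beta distribution matches the two moments.
    """
    if not (0.0 < mean < 1.0) or var <= 0.0 or var >= mean * (1.0 - mean):
        raise ValueError("moments incompatible with a Beta distribution")
    common = mean * (1.0 - mean) / var - 1.0
    return mean * common, (1.0 - mean) * common

# e.g. a dilute plume with mean concentration 0.2 and variance 0.01
a, b = beta_params_from_moments(0.2, 0.01)  # Beta(3, 12)
```

A Beta(3, 12) distribution indeed has mean 3/15 = 0.2 and variance 0.01, recovering the input moments.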

  8. Evaluation of aided phytostabilization of Pb and Zn in Santa Antonieta tailing pond two years after its remediation

    NASA Astrophysics Data System (ADS)

    Martínez-Martínez, Silvia; Neveu, Aurore; Acosta, Jose A.; Zornoza, Raúl; Gómez, M. Dolores; Faz, Ángel

    2017-04-01

    Mining and its subsequent activities have been found to degrade the land to a significant extent. Phytostabilization aims to generate a functional soil ecosystem that supports plant growth over contaminated wastes, lessening surface and subsurface water flow, providing stability to the soil through the development of extensive root systems, and hastening successional development. A field experiment was carried out in the Santa Antonieta tailing pond, located in the Cartagena-La Unión mining district (SE Spain), to determine why important differences in the percentage of plant cover were observed in the studied areas two years after the end of assisted phytostabilization. The main objectives of this research were to: a) determine the vegetation cover and biodiversity of the four plots selected; b) evaluate which soil physicochemical properties significantly influence the growth and development of plant species; and c) identify the soil fractions in which Pb and Zn are mostly retained. The results of this study showed that the highest percentage of vegetation cover was registered in Plot 1 (85%), while the lowest was observed in Plot 3, where no plants grew, as in the control plot. The physicochemical properties most influential on the growth and development of the plant species were pH, electrical conductivity, inorganic carbon and bioavailable phosphorus. With regard to sequential extraction, a very high percentage of the Pb and Zn was found in the residual fraction. The highest bioavailable metal concentration was observed for Zn in Plot 3, around 15%, probably due to its acidity (pH value of 3.2), which may explain why this plot is devoid of vegetation.
For future research in the study area, a new sampling of the plant species that continue growing on the plots would need to be carried out to determine whether metals continue to accumulate in the rhizosphere or are accumulating in the aerial parts of the plants, in order to avoid possible environmental risks.

  9. Diffusion tensor imaging profiles reveal specific neural tract distortion in normal pressure hydrocephalus

    PubMed Central

    Pena, Alonso; Price, Stephen J.; Czosnyka, Marek; Czosnyka, Zofia; DeVito, Elise E.; Housden, Charlotte R.; Sahakian, Barbara J.; Pickard, John D.

    2017-01-01

    Background The pathogenesis of normal pressure hydrocephalus (NPH) remains unclear, which limits both early diagnosis and prognostication. The responsiveness to intervention of differing, complex and concurrent injury patterns on imaging has not been well characterized. We used diffusion tensor imaging (DTI) to explore the topography and reversibility of white matter injury in NPH pre- and early after shunting. Methods Twenty-five participants (sixteen NPH patients and nine healthy controls) underwent DTI, pre-operatively and at two weeks post-intervention in patients. We interrogated 40 datasets to generate a full panel of DTI measures and corroborated findings with plots of isotropy (p) vs. anisotropy (q). Results Concurrent examination of DTI measures revealed distinct profiles for NPH patients vs. controls. PQ plots demonstrated that patterns of injury occupied discrete white matter districts. DTI profiles for different white matter tracts showed changes consistent with i) predominant transependymal diffusion with stretch/compression, ii) oedema with or without stretch/compression and iii) predominant stretch/compression. Findings were specific to individual tracts and dependent upon their proximity to the ventricles. At two weeks post-intervention, there was a 6·7% drop in axial diffusivity (p = 0·022) in the posterior limb of the internal capsule, compatible with improvement in stretch/compression, that preceded any discernible changes in clinical outcome. On PQ plots, the trajectories of the posterior limb of the internal capsule and inferior longitudinal fasciculus suggested attempted ‘round trips’, i.e., a return to normality. Conclusion DTI profiling with p:q correlation may offer a non-invasive biomarker of the characteristics of potentially reversible white matter injury. PMID:28817574

  10. Diffusion tensor imaging profiles reveal specific neural tract distortion in normal pressure hydrocephalus.

    PubMed

    Keong, Nicole C; Pena, Alonso; Price, Stephen J; Czosnyka, Marek; Czosnyka, Zofia; DeVito, Elise E; Housden, Charlotte R; Sahakian, Barbara J; Pickard, John D

    2017-01-01

    The pathogenesis of normal pressure hydrocephalus (NPH) remains unclear, which limits both early diagnosis and prognostication. The responsiveness to intervention of differing, complex and concurrent injury patterns on imaging has not been well characterized. We used diffusion tensor imaging (DTI) to explore the topography and reversibility of white matter injury in NPH pre- and early after shunting. Twenty-five participants (sixteen NPH patients and nine healthy controls) underwent DTI, pre-operatively and at two weeks post-intervention in patients. We interrogated 40 datasets to generate a full panel of DTI measures and corroborated findings with plots of isotropy (p) vs. anisotropy (q). Concurrent examination of DTI measures revealed distinct profiles for NPH patients vs. controls. PQ plots demonstrated that patterns of injury occupied discrete white matter districts. DTI profiles for different white matter tracts showed changes consistent with i) predominant transependymal diffusion with stretch/compression, ii) oedema with or without stretch/compression and iii) predominant stretch/compression. Findings were specific to individual tracts and dependent upon their proximity to the ventricles. At two weeks post-intervention, there was a 6·7% drop in axial diffusivity (p = 0·022) in the posterior limb of the internal capsule, compatible with improvement in stretch/compression, that preceded any discernible changes in clinical outcome. On PQ plots, the trajectories of the posterior limb of the internal capsule and inferior longitudinal fasciculus suggested attempted 'round trips', i.e., a return to normality. DTI profiling with p:q correlation may offer a non-invasive biomarker of the characteristics of potentially reversible white matter injury.
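The isotropy (p) and anisotropy (q) measures behind these PQ plots are commonly computed from the diffusion tensor eigenvalues, with p proportional to the mean diffusivity and q the norm of the eigenvalue deviations. A minimal sketch of that common p:q decomposition (the exact convention used in this study may differ; variable names are illustrative):

```python
import math

def pq_from_eigenvalues(l1, l2, l3):
    """Split a diffusion tensor into isotropic (p) and anisotropic (q) parts.

    p = sqrt(3) * MD, where MD is the mean diffusivity;
    q = Euclidean norm of the eigenvalue deviations from MD.
    """
    md = (l1 + l2 + l3) / 3.0
    p = math.sqrt(3.0) * md
    q = math.sqrt((l1 - md) ** 2 + (l2 - md) ** 2 + (l3 - md) ** 2)
    return p, q

# a perfectly isotropic voxel (equal eigenvalues) has q = 0
p, q = pq_from_eigenvalues(1.0, 1.0, 1.0)
```

On a PQ plot, a voxel's trajectory over time then traces how injury shifts the balance between isotropic diffusion and directional structure.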

  11. Study of gastric cancer samples using terahertz techniques

    NASA Astrophysics Data System (ADS)

    Wahaia, Faustino; Kasalynas, Irmantas; Seliuta, Dalius; Molis, Gediminas; Urbanowicz, Andrzej; Carvalho Silva, Catia D.; Carneiro, Fatima; Valusis, Gintaras; Granja, Pedro L.

    2014-08-01

    In the present work, samples of healthy and adenocarcinoma-affected human gastric tissue were analyzed using transmission time-domain THz spectroscopy (THz-TDS) and spectroscopic THz imaging at 201 and 590 GHz. The work shows that it is possible to distinguish between normal and cancerous regions in dried and paraffin-embedded samples. Plots of absorption coefficient α and refractive index n of normal and cancer-affected tissues, as well as 2-D transmission THz images, are presented and the conditions for discrimination between normal and affected tissues are discussed.

  12. RNA Thermodynamic Structural Entropy

    PubMed Central

    Garcia-Martin, Juan Antonio; Clote, Peter

    2015-01-01

    Conformational entropy for atomic-level, three-dimensional biomolecules is known experimentally to play an important role in protein-ligand discrimination, yet reliable computation of entropy remains a difficult problem. Here we describe the first two accurate and efficient algorithms to compute the conformational entropy for RNA secondary structures with respect to the Turner energy model, where free energy parameters are determined from UV absorption experiments. An algorithm to compute the derivational entropy for RNA secondary structures had previously been introduced, using stochastic context free grammars (SCFGs). However, the numerical value of derivational entropy depends heavily on the chosen context free grammar and on the training set used to estimate rule probabilities. Using data from the Rfam database, we determine that both of our thermodynamic methods, which agree in numerical value, are substantially faster than the SCFG method. Thermodynamic structural entropy is much smaller than derivational entropy, and the correlation between length-normalized thermodynamic entropy and derivational entropy is moderately weak to poor. In applications, we plot the structural entropy as a function of temperature for known thermoswitches, such as the repression of heat shock gene expression (ROSE) element; we determine that the correlation between hammerhead ribozyme cleavage activity and total free energy is improved by including an additional free energy term arising from conformational entropy; and we plot the structural entropy of windows of the HIV-1 genome. Our software RNAentropy can compute structural entropy for any user-specified temperature, and supports both the Turner’99 and Turner’04 energy parameters. It follows that RNAentropy is state-of-the-art software to compute RNA secondary structure conformational entropy.
Source code is available at https://github.com/clotelab/RNAentropy/; a full web server is available at http://bioinformatics.bc.edu/clotelab/RNAentropy, including source code and ancillary programs. PMID:26555444
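The structural entropy computed by RNAentropy is the Shannon entropy of the Boltzmann ensemble of secondary structures. As a hedged sketch of the underlying quantity (not the paper's dynamic-programming algorithm, which avoids enumerating structures), given an explicit list of structure free energies:

```python
import math

def boltzmann_entropy(energies, rt=0.6):
    """Shannon entropy (in nats) of the Boltzmann distribution over structures.

    energies: free energies in kcal/mol; rt defaults to ~0.6 kcal/mol (37 C).
    """
    weights = [math.exp(-e / rt) for e in energies]
    z = sum(weights)  # partition function
    probs = [w / z for w in weights]
    return -sum(p * math.log(p) for p in probs if p > 0.0)

# a uniform ensemble of n equal-energy structures has entropy ln(n)
s = boltzmann_entropy([0.0, 0.0, 0.0, 0.0])
```

Plotting this quantity against temperature is what reveals the sharp entropy transitions of thermoswitches such as the ROSE element.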

  13. RNA Thermodynamic Structural Entropy.

    PubMed

    Garcia-Martin, Juan Antonio; Clote, Peter

    2015-01-01

    Conformational entropy for atomic-level, three-dimensional biomolecules is known experimentally to play an important role in protein-ligand discrimination, yet reliable computation of entropy remains a difficult problem. Here we describe the first two accurate and efficient algorithms to compute the conformational entropy for RNA secondary structures with respect to the Turner energy model, where free energy parameters are determined from UV absorption experiments. An algorithm to compute the derivational entropy for RNA secondary structures had previously been introduced, using stochastic context free grammars (SCFGs). However, the numerical value of derivational entropy depends heavily on the chosen context free grammar and on the training set used to estimate rule probabilities. Using data from the Rfam database, we determine that both of our thermodynamic methods, which agree in numerical value, are substantially faster than the SCFG method. Thermodynamic structural entropy is much smaller than derivational entropy, and the correlation between length-normalized thermodynamic entropy and derivational entropy is moderately weak to poor. In applications, we plot the structural entropy as a function of temperature for known thermoswitches, such as the repression of heat shock gene expression (ROSE) element; we determine that the correlation between hammerhead ribozyme cleavage activity and total free energy is improved by including an additional free energy term arising from conformational entropy; and we plot the structural entropy of windows of the HIV-1 genome. Our software RNAentropy can compute structural entropy for any user-specified temperature, and supports both the Turner'99 and Turner'04 energy parameters. It follows that RNAentropy is state-of-the-art software to compute RNA secondary structure conformational entropy.
Source code is available at https://github.com/clotelab/RNAentropy/; a full web server is available at http://bioinformatics.bc.edu/clotelab/RNAentropy, including source code and ancillary programs.

  14. Vector wind and vector wind shear models 0 to 27 km altitude for Cape Kennedy, Florida, and Vandenberg AFB, California

    NASA Technical Reports Server (NTRS)

    Smith, O. E.

    1976-01-01

    Techniques for deriving several statistical wind models are presented, based on the properties of the multivariate normal probability distribution function. Assuming that the wind components are bivariate normally distributed, then (1) the wind components and conditional wind components are univariate normally distributed, (2) the wind speed is Rayleigh distributed, (3) the conditional distribution of wind speed given a wind direction is Rayleigh distributed, and (4) the frequency of wind direction can be derived. All of these distributions are derived from the five sample parameters of the bivariate normal distribution of wind. By further assuming that the winds at two altitudes are quadrivariate normally distributed, the vector wind shear is bivariate normally distributed and the modulus of the vector wind shear is Rayleigh distributed. The conditional probability of wind component shears given a wind component is normally distributed. Examples of these and other properties of the multivariate normal probability distribution function, as applied to wind data samples from Cape Kennedy, Florida, and Vandenberg AFB, California, are given. A technique to develop a synthetic vector wind profile model of interest to aerospace vehicle applications is presented.
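Property (2) above — that the wind speed is Rayleigh distributed — holds in the isotropic case of zero-mean, equal-variance, uncorrelated components, and can be checked numerically. A minimal simulation sketch (not from the report; parameter values are illustrative):

```python
import math
import random

random.seed(42)
sigma = 5.0  # common standard deviation of the u and v wind components (m/s)

# wind speed = modulus of two independent N(0, sigma^2) components
speeds = [math.hypot(random.gauss(0, sigma), random.gauss(0, sigma))
          for _ in range(200_000)]

empirical_mean = sum(speeds) / len(speeds)
rayleigh_mean = sigma * math.sqrt(math.pi / 2)  # theoretical Rayleigh mean
```

With sigma = 5 m/s the theoretical mean speed is about 6.27 m/s, and the simulated mean agrees closely.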

  15. Radiocesium discharge from paddy fields with different initial scrapings for decontamination after the Fukushima Dai-ichi Nuclear Power Plant accident.

    PubMed

    Wakahara, Taeko; Onda, Yuichi; Kato, Hiroaki; Sakaguchi, Aya; Yoshimura, Kazuya

    2014-11-01

    To explore the behavior of radionuclides released after the Fukushima Dai-ichi Nuclear Power Plant (FDNPP) accident in March 2011, and the distribution of radiocesium in paddy fields, we monitored radiocesium (Cs) and suspended sediment (SS) discharge from paddy fields. We proposed a rating scale for measuring the effectiveness of surface soil removal. Our experimental plots in paddy fields were located ∼40 km from the FDNPP. Two plots were established: one in a paddy field where surface soil was not removed (the "normally cultivated paddy field") and the second in a paddy field where the top 5-10 cm of soil was removed before cultivation (the "surface-removed paddy field"). The amounts of Cs and SS discharge from the paddy fields were continuously measured from June to August 2011. The Cs soil inventory measured 3 months after the FDNPP accident was approximately 200 kBq m(-2). However, after removing the surface soil, the concentration of Cs-137 decreased to 5 kBq m(-2). SS discharged from the normally cultivated and surface-removed paddy fields after puddling (mixing of soil and water before planting rice) was 11.0 kg and 3.1 kg, respectively, and Cs-137 discharge was 630,000 Bq (1240 Bq m(-2)) and 24,800 Bq (47.8 Bq m(-2)), respectively. The total amount of SS discharge after irrigation (natural rainfall-runoff) was 5.5 kg for the normally cultivated field and 70 kg for the surface-removed field, and the total amounts of Cs-137 discharge were 51,900 Bq (102 Bq m(-2)) and 165,000 Bq (317 Bq m(-2)), respectively. During the irrigation period, the surface-removed plot received approximately twice the inflow of the normally cultivated plot. Thus, Cs inflow may originate from the upper canal. The topsoil removal process eliminated at least approximately 95% of the Cs-137, but upstream water contaminated with Cs-137 flowed into the paddy field.
Therefore, to accurately determine the Cs discharge, it is important to examine Cs inflow from the upper channel. Furthermore, puddling and irrigation processes inhibit the discharge of radiocesium downstream. This indicates that water control in paddy fields is an important process in the prevention of river pollution and radionuclide transfer.

  16. Preliminary stage-discharge relations for Tombigbee River at Aliceville lock and dam, near Pickensville, Alabama

    USGS Publications Warehouse

    Nelson, G.H.; Ming, C.O.

    1983-01-01

    The construction of Aliceville lock and dam and other related channel alterations, completed in 1979, have resulted in changes to the stage-discharge relations in the vicinity. The scarcity of current-meter measurements, coupled with backwater conditions, makes definition of a single stage-discharge relation impossible. However, limit curves can be defined that would encompass such a relation. Backwater is defined as water backed up or retarded in its course as compared with water flowing under normal or natural conditions. This results in a rise in stage above normal water level while the discharge remains unaffected. Backwater is usually caused by temporary obstruction(s) to flow downstream. Backwater at Aliceville Dam results from a variety of river conditions, including large tributary inflow, return of flood plain flows to the main channel during recessions, and operations at Gainesville Dam during low flows. The discharges obtained from 26 current-meter measurements, along with computed discharges through the dam, are plotted versus stage. The scatter of the data points illustrates the variations in backwater. Curves are drawn to envelope the extreme plot patterns, showing possible ranges of several feet in stage for any given discharge. The upper ends of the curves were extrapolated based on the results of a step-backwater analysis.

  17. Presentation of Diagnostic Information to Doctors May Change Their Interpretation and Clinical Management: A Web-Based Randomised Controlled Trial.

    PubMed

    Ben-Shlomo, Yoav; Collin, Simon M; Quekett, James; Sterne, Jonathan A C; Whiting, Penny

    2015-01-01

    There is little evidence on how best to present diagnostic information to doctors and whether this makes any difference to clinical management. We undertook a randomised controlled trial to see if different data presentations altered clinicians' decision to further investigate or treat a patient with a fictitious disorder ("Green syndrome") and their ability to determine post-test probability. We recruited doctors registered with the United Kingdom's largest online network for medical doctors between 10 July and 6 November 2012. Participants were randomised to one of four arms: (a) text summary of sensitivity and specificity, (b) Fagan's nomogram, (c) probability-modifying plot (PMP), (d) natural frequency tree (NFT). The main outcome measure was the decision whether to treat, not treat or undertake a brain biopsy on the hypothetical patient and the correct post-test probability. Secondary outcome measures included knowledge of diagnostic tests. 917 participants attempted the survey and complete data were available from 874 (95.3%). Doctors randomised to the PMP and NFT arms were more likely to treat the patient than those randomised to the text-only arm (ORs 1.49, 95% CI 1.02 to 2.16, and 1.43, 95% CI 0.98 to 2.08, respectively). More participants randomised to the PMP (87/218; 39.9%) and NFT (73/207; 35.3%) arms than the nomogram (50/194; 25.8%) or text-only (30/255; 11.8%) arms reported the correct post-test probability (p < 0.001). Younger age, postgraduate training and higher self-rated confidence all predicted better knowledge performance. Doctors with better knowledge were more likely to view an optional learning tutorial (OR per correct answer 1.18, 95% CI 1.06, 1.31). Presenting diagnostic data using a probability-modifying plot or natural frequency tree influences the threshold for treatment and improves interpretation of test results compared to a text summary of sensitivity and specificity or Fagan's nomogram.
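The post-test probability participants were asked to determine follows from Bayes' theorem via likelihood ratios, which is exactly what Fagan's nomogram encodes graphically. A minimal sketch (the numbers are illustrative, not those of the trial's fictitious test):

```python
def post_test_probability(pre_test_prob, sensitivity, specificity):
    """Post-test probability of disease after a positive test result.

    Converts probability to odds, multiplies by the positive likelihood
    ratio LR+ = sensitivity / (1 - specificity), and converts back.
    """
    lr_positive = sensitivity / (1.0 - specificity)
    pre_odds = pre_test_prob / (1.0 - pre_test_prob)
    post_odds = pre_odds * lr_positive
    return post_odds / (1.0 + post_odds)

# e.g. pre-test probability 25%, sensitivity 90%, specificity 80% (LR+ = 4.5)
post_p = post_test_probability(0.25, 0.90, 0.80)  # -> 0.60
```

Here the positive result raises a 25% pre-test probability to 60%, the kind of update the nomogram, PMP and NFT each present in a different visual form.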

  18. Propagules of arbuscular mycorrhizal fungi in a secondary dry forest of Oaxaca, Mexico.

    PubMed

    Guadarrama, Patricia; Castillo-Argüero, Silvia; Ramos-Zapata, José A; Camargo-Ricalde, Sara L; Alvarez-Sánchez, Javier

    2008-03-01

    Plant cover loss due to changes in land use promotes a decrease in the spore diversity of arbuscular mycorrhizal fungi (AMF), in viable mycelium and, therefore, in AMF colonization; this influences community diversity and, as a consequence, its recovery. To evaluate different AMF propagules, nine plots in a tropical dry forest with secondary vegetation were selected: 0, 1, 7, 10, 14, 18, 22, 25, and 27 years after abandonment in Nizanda, Oaxaca, Mexico. The secondary vegetation at different stages of development is a consequence of slash-and-burn agriculture and subsequent abandonment. Soil samples (six per plot) were collected, and the percentage of AMF field colonization, extraradical mycelium, viable spore density, infectivity and most probable number (MPN) of AMF propagules were quantified through a bioassay. Means for field colonization ranged between 40% and 70%; mean total mycelium length was 15.7 +/- 1.88 m g(-1) dry soil, with significant differences between plots; however, more than 40% of the extracted mycelium was not viable. Between 60 and 456 spores per 100 g of dry soil were recorded, but more than 64% showed some kind of damage. Infectivity values fluctuated between 20% and 50%, while MPN showed a mean value of 85.42 +/- 44.17 propagules per 100 g dry soil. We conclude that secondary communities generated by the elimination of vegetation for agricultural purposes in a dry forest in Nizanda do not show elimination of propagules, probably as a consequence of the low-input agriculture practices in this area, which may encourage natural regeneration.

  19. Influence of elevation and site productivity on conifer distributions across Alaskan temperate rainforests

    Treesearch

    John P. Caouette; Ashley E. Steel; Paul E. Hennon; Pat G. Cunningham; Cathy A. Pohl; Barbara A. Schrader

    2016-01-01

    We investigated the influence of landscape factors on the distribution and life stage stability of coastal tree species near the northern limit of their ranges. Using data from 1465 forest inventory plots, we estimated probability of occurrence and basal area of six common conifer species across three broad latitudinal regions of coastal Alaska. By also comparing...

  20. ERTS-1 observations of sea surface circulation and sediment transport, Cook Inlet, Alaska

    NASA Technical Reports Server (NTRS)

    Wright, F. F.; Sharma, G. D.; Burbank, D. C.

    1973-01-01

    Cook Inlet is a large tide-dominated estuary in southern Alaska. Highly turbid streams enter the upper inlet, providing an excellent tracer for circulation in the lower inlet. MSS 4 and 5 images both can be used in this area to plot sediment and pollutant trajectories, areas of (probable) commercial fish concentration, and the entire circulation regime.

  1. Agents Overcoming Resource Independent Scaling Threats (AORIST)

    DTIC Science & Technology

    2004-10-01

    Table 8: Tilted Consumer Preferences Experiment (m=8, N=61, G=2, C=60, Mean over 13 experiments...probabilities. Non-uniform consumer preferences create a new potential for sub-optimal system performance and thus require an additional adaptive...distribution of the capacities across the supplier population must match the non-uniform consumer preferences. The second plot in Table 8

  2. Building baby universes

    NASA Astrophysics Data System (ADS)

    Coles, Peter

    2017-08-01

    The thought of a scientist trying to design a laboratory experiment in which to create a whole new universe probably sounds like it belongs in the plot of a science-fiction B-movie. But as author Zeeya Merali explains in her new book A Big Bang in a Little Room, there are more than a few eminent physicists who think that this is theoretically possible.

  3. Health Risk Assessment of Ambient Air Concentrations of Benzene, Toluene and Xylene (BTX) in Service Station Environments

    PubMed Central

    Edokpolo, Benjamin; Yu, Qiming Jimmy; Connell, Des

    2014-01-01

    A comprehensive evaluation of the adverse health effects of human exposures to BTX from service station emissions was carried out using BTX exposure data from the scientific literature. The data were grouped into different scenarios based on activity, location and occupation and plotted as Cumulative Probability Distribution (CPD) plots. Health risk was evaluated for each scenario using the Hazard Quotient (HQ) at the 50% (CEXP50) and 95% (CEXP95) exposure levels. HQ50 and HQ95 > 1 were obtained with benzene in the scenario for service station attendants and mechanics repairing petrol dispensing pumps, indicating a possible health risk. The risk was minimized for service stations using vapour recovery systems, which greatly reduced the benzene exposure levels. HQ50 and HQ95 < 1 were obtained for all other scenarios with benzene, suggesting minimal risk for most of the exposed population. Similarly, HQ50 and HQ95 < 1 were found with toluene and xylene for all scenarios, suggesting minimal health risk. The lifetime excess Cancer Risk (CR) and Overall Risk Probability for cancer on exposure to benzene were calculated for all scenarios, and both were higher amongst service station attendants than in any other scenario. PMID:24945191
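The Hazard Quotient used for each scenario is simply the exposure concentration at a given percentile divided by a reference (guideline) concentration, with HQ > 1 flagging a possible health risk. A minimal sketch (the guideline value below is a placeholder, not one used in the study):

```python
def hazard_quotient(exposure_conc, reference_conc):
    """HQ = exposure concentration / reference concentration.

    HQ > 1 indicates a possible health risk; HQ < 1 suggests minimal risk.
    Both concentrations must be in the same units (e.g. ug/m3).
    """
    return exposure_conc / reference_conc

# hypothetical benzene exposure of 10 ug/m3 against a 5 ug/m3 guideline
hq = hazard_quotient(10.0, 5.0)  # HQ = 2.0 -> possible risk
```

Evaluating HQ at both the 50th and 95th percentiles of the exposure distribution, as the study does, distinguishes the typical case from the upper tail of exposures.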

  4. Hypervelocity Impact Test Fragment Modeling: Modifications to the Fragment Rotation Analysis and Lightcurve Code

    NASA Technical Reports Server (NTRS)

    Gouge, Michael F.

    2011-01-01

    Hypervelocity impact tests on test satellites are performed by members of the orbital debris scientific community in order to understand and typify the on-orbit collision breakup process. By analysis of these test satellite fragments, the fragment size and mass distributions are derived and incorporated into various orbital debris models. These same fragments are currently being put to new use using emerging technologies. Digital models of these fragments are created using a laser scanner. A group of computer programs referred to as the Fragment Rotation Analysis and Lightcurve code uses these digital representations in a multitude of ways that describe, measure, and model on-orbit fragments and fragment behavior. The Dynamic Rotation subroutine generates all of the possible reflected intensities from a scanned fragment as if it were observed to rotate dynamically while in orbit about the Earth. This calls an additional subroutine that graphically displays the intensities and the resulting frequency of those intensities as a range of solar phase angles in a Probability Density Function plot. This document reports the additions and modifications to the subset of the Fragment Rotation Analysis and Lightcurve concerned with the Dynamic Rotation and Probability Density Function plotting subroutines.

  5. Predicting cotton yield of small field plots in a cotton breeding program using UAV imagery data

    NASA Astrophysics Data System (ADS)

    Maja, Joe Mari J.; Campbell, Todd; Camargo Neto, Joao; Astillo, Philip

    2016-05-01

    One of the major criteria used for advancing experimental lines in a breeding program is yield performance. Obtaining yield performance data requires machine picking each plot with a cotton picker modified to weigh individual plots. Harvesting thousands of small field plots requires a great deal of time and resources. The efficiency of cotton breeding could be increased significantly, and the cost decreased, with the availability of accurate methods to predict yield performance. This work investigates the feasibility of using an image processing technique with a commercial off-the-shelf (COTS) camera mounted on a small Unmanned Aerial Vehicle (sUAV) to collect normal RGB images for predicting cotton yield on small plots. An orthonormal image was generated from multiple images and used to process multiple, segmented plots. A Gaussian blur was used to eliminate the high-frequency component of the images, which corresponds to the cotton pixels, and an image subtraction technique was used to generate high-frequency pixel images. The cotton pixels were then separated using k-means clustering with 5 classes. Based on the current work, the percentage cotton area was computed as the generated high-frequency image (cotton pixels) divided by the total area of the plot. Preliminary results (five flights, 3 altitudes) showed that cotton cover on multiple pre-selected 227 sq. m plots averaged 8%, which translates to approximately 22.3 kg of cotton. The yield prediction equation generated from the test site was then used on a separate validation site and produced a prediction error of less than 10%. In summary, the results indicate that a COTS camera with an appropriate image processing technique can produce results that are comparable to expensive sensors.
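The blur-and-subtract step described above can be sketched with plain NumPy. A minimal illustration (a box blur stands in for the paper's Gaussian blur, a simple threshold stands in for the k-means step, and the image values are synthetic):

```python
import numpy as np

def box_blur(img, k=5):
    """Separable box blur with edge normalization (a stand-in for the
    Gaussian blur used in the paper)."""
    kernel = np.ones(k) / k
    def smooth(a):
        out = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, a)
        return np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="same"), 0, out)
    # dividing by the blurred ones-image corrects zero-padding at the borders
    return smooth(img) / smooth(np.ones_like(img))

def cotton_cover_fraction(img, thresh=0.1):
    """Fraction of pixels whose high-frequency residual exceeds the threshold."""
    high_freq = img - box_blur(img)  # bright cotton specks survive the subtraction
    return float((high_freq > thresh).mean())

# synthetic plot: uniform dark canopy with 16 isolated bright 'cotton' pixels
field = np.full((100, 100), 0.2)
ys, xs = np.meshgrid(range(10, 90, 20), range(10, 90, 20))
field[ys, xs] = 1.0
frac = cotton_cover_fraction(field)  # recovers 16 / 10000 pixels
```

The blur removes only the slowly varying canopy background, so the residual image isolates the small bright cotton regions whose area fraction feeds the yield prediction equation.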

  6. Evaluation of Critical Bandwidth Using Digitally Processed Speech.

    DTIC Science & Technology

    1982-05-12

    observed after repeating the two tests on persons with confirmed cases of sensorineural hearing impairment. Again, the plotted speech discrimination...quantifying the critical bandwidth of persons on a clinical or pre-employment level. The complex portion of the test design (the computer generation of..."super" normal hearing individuals (i.e., those persons with narrower-than-normal critical bands). This ability of the test shows promise as a valuable

  7. Portfolio effects, climate change, and the persistence of small populations: analyses on the rare plant Saussurea weberi.

    PubMed

    Abbott, Ronald E; Doak, Daniel F; Peterson, Megan L

    2017-04-01

    The mechanisms that stabilize small populations in the face of environmental variation are crucial to their long-term persistence. Building from diversity-stability concepts in community ecology, within-population diversity is gaining attention as an important component of population stability. Genetic and microhabitat variation within populations can generate diverse responses to common environmental fluctuations, dampening temporal variability across the population as a whole through portfolio effects. Yet, the potential for portfolio effects to operate at small scales within populations or to change with systematic environmental shifts, such as climate change, remains largely unexplored. We tracked the abundance of a rare alpine perennial plant, Saussurea weberi, in 49 1-m2 plots within a single population over 20 yr. We estimated among-plot correlations in log annual growth rate to test for population-level synchrony and quantify portfolio effects across the 20-yr study period and also in 5-yr subsets based on June temperature quartiles. Asynchrony among plots, due to different plot-level responses to June temperature, reduced overall fluctuations in abundance and the probability of decline in population models, even when accounting for the effects of density dependence on dynamics. However, plots became more synchronous and portfolio effects decreased during the warmest years of the study, suggesting that future climate warming may erode stabilizing mechanisms in populations of this rare plant. © 2017 by the Ecological Society of America.
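The plot-level synchrony driving these portfolio effects can be quantified with the Loreau-de Mazancourt index, which equals 1 when all plots fluctuate in lockstep and shrinks toward 0 as plot-level fluctuations cancel. A minimal sketch with simulated growth-rate series (not the study's data or code):

```python
import math
import random

def synchrony(series):
    """Loreau-de Mazancourt synchrony: var(community total) / (sum of plot SDs)^2."""
    n_steps = len(series[0])
    totals = [sum(s[t] for s in series) for t in range(n_steps)]
    def var(x):
        m = sum(x) / len(x)
        return sum((v - m) ** 2 for v in x) / len(x)
    denom = sum(math.sqrt(var(s)) for s in series) ** 2
    return var(totals) / denom

random.seed(1)
# 49 plots, 20 annual log growth rates each (mirroring the study's design)
independent = [[random.gauss(0, 1) for _ in range(20)] for _ in range(49)]
identical = [independent[0]] * 49  # perfectly synchronous plots

s_ind = synchrony(independent)  # well below 1: strong portfolio effect
s_same = synchrony(identical)   # exactly 1: no portfolio effect
```

The gap between the two values is the statistical buffering the paper attributes to asynchronous plot-level responses to June temperature.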

  8. Feedback dynamics of grazing lawns: Coupling vegetation change with animal growth

    USGS Publications Warehouse

    Person, Brian T.; Herzog, M.P.; Ruess, Roger W.; Sedinger, J.S.; Anthony, R. Michael; Babcock, C.A.

    2003-01-01

    We studied the effects of grazing by Black Brant (Branta bernicla nigricans) geese (hereafter Brant) on plant community zonation and gosling growth between 1987 and 2000 at a nesting colony in southwestern Alaska. The preferred forage of Brant, Carex subspathacea, is only found as a grazing lawn. An alternate forage species, C. ramenskii, exists primarily as meadow but also forms grazing lawns when heavily grazed. We mowed plots of ungrazed C. ramenskii meadows to create swards that Brant could select and maintain as grazing lawns. Fecal counts were higher on mowed plots than on control plots in the year after plots were mowed. Both nutritional quality and aboveground biomass of C. ramenskii in mowed plots were similar to those of C. subspathacea grazing lawns. The areal extent of grazing lawns depends in part on the population size of Brant. High Brant populations can increase the areal extent of grazing lawns, which favors the growth of goslings. Grazing lawns increased from 3% to 8% of surface area as the areal extent of C. ramenskii meadows declined between 1991 and 1999. Gosling mass was lower early in this time period due to density-dependent effects. As the goose population stabilized, and the area of grazing lawns increased, gosling mass increased between 1993 and 1999. Because larger goslings have increased survival, higher probability of breeding, and higher fecundity, herbivore-mediated changes in the distribution and extent of grazing lawns may result in a numerical increase of the population within the next two decades.

  9. A new computer code for discrete fracture network modelling

    NASA Astrophysics Data System (ADS)

    Xu, Chaoshui; Dowd, Peter

    2010-03-01

    The authors describe a comprehensive software package for two- and three-dimensional stochastic rock fracture simulation using marked point processes. Fracture locations can be modelled by a Poisson, a non-homogeneous, a cluster or a Cox point process; fracture geometries and properties are modelled by their respective probability distributions. Virtual sampling tools such as plane, window and scanline sampling are included in the software together with a comprehensive set of statistical tools including histogram analysis, probability plots, rose diagrams and hemispherical projections. The paper describes in detail the theoretical basis of the implementation and provides a case study in rock fracture modelling to demonstrate the application of the software.
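    A minimal sketch of the core simulation idea named above, a marked Poisson point process with virtual scanline sampling, assuming illustrative intensity and mark distributions (this is not the authors' software; window size, fracture intensity, and mark distributions are invented):

```python
import numpy as np

rng = np.random.default_rng(1)

# Homogeneous Poisson point process for 2-D fracture centres in a
# window [0, L] x [0, L]; marks (orientation, trace length) drawn from
# illustrative distributions. The software described above also supports
# non-homogeneous, cluster, and Cox processes.
L, intensity = 100.0, 0.05          # window size, fractures per unit area
n = rng.poisson(intensity * L * L)  # Poisson-distributed fracture count
centres = rng.uniform(0.0, L, size=(n, 2))
angles = rng.vonmises(mu=0.0, kappa=2.0, size=n)   # preferred orientation
lengths = rng.lognormal(mean=1.0, sigma=0.5, size=n)

# Represent each fracture as a line segment through its centre
dx = 0.5 * lengths * np.cos(angles)
dy = 0.5 * lengths * np.sin(angles)
segments = np.stack([centres[:, 0] - dx, centres[:, 1] - dy,
                     centres[:, 0] + dx, centres[:, 1] + dy], axis=1)

# Virtual scanline sampling: count segments crossing the line y = L/2
y0 = L / 2
crosses = np.sum((np.minimum(segments[:, 1], segments[:, 3]) <= y0) &
                 (np.maximum(segments[:, 1], segments[:, 3]) >= y0))
print(n, crosses)
```

The scanline count divided by the scanline length gives a linear fracture frequency, one of the statistics such software compares against field measurements.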

  10. Waterbird nest-site selection is influenced by neighboring nests and island topography

    USGS Publications Warehouse

    Hartman, Christopher; Ackerman, Joshua T.; Takekawa, John Y.; Herzog, Mark

    2016-01-01

    Avian nest-site selection is influenced by factors operating across multiple spatial scales. Identifying preferred physical characteristics (e.g., topography, vegetation structure) can inform managers to improve nesting habitat suitability. However, social factors (e.g., attraction, territoriality, competition) can complicate understanding physical characteristics preferred by nesting birds. We simultaneously evaluated the physical characteristics and social factors influencing selection of island nest sites by colonial-nesting American avocets (Recurvirostra americana) and Forster's terns (Sterna forsteri) at 2 spatial scales in San Francisco Bay, 2011–2012. At the larger island plot (1 m2) scale, we used real-time kinematics to produce detailed topographies of nesting islands and map the distribution of nests. Nesting probability was greatest in island plots between 0.5 m and 1.5 m above the water surface, at distances <10 m from the water's edge, and of moderately steep (avocets) or flat (terns) slopes. Further, avocet and tern nesting probability increased as the number of nests initiated in adjacent plots increased up to a peak of 11–12 tern nests, and then decreased thereafter. Yet, avocets were less likely to nest in plots adjacent to plots with nesting avocets, suggesting an influence of intra-specific territoriality. At the smaller microhabitat scale, or the area immediately surrounding the nest, we compared topography, vegetation, and distance to nearest nest between nest sites and paired random sites. Topography had little influence on selection of the nest microhabitat. Instead, nest sites were more likely to have vegetation present, and greater cover, than random sites. Finally, avocet, and to a lesser extent tern, nest sites were closer to other active conspecific or heterospecific nests than random sites, indicating that social attraction played a role in selection of nest microhabitat. 
Our results demonstrate key differences in nest-site selection between co-occurring avocets and terns, and indicate the effects of physical characteristics and social factors on selection of nesting habitat are dependent on the spatial scale examined. Moreover, these results indicate that islands with abundant area between 0.5 m and 1.5 m above the water surface, within 10 m of the water's edge, and containing a mosaic of slopes ranging from flat to moderately steep would provide preferred nesting habitat for avocets and terns. © 2016 The Wildlife Society.

  11. Effects of water additions, chemical amendments, and plants on in situ measures of nutrient bioavailability in calcareous soils of southeastern Utah, USA

    USGS Publications Warehouse

    Miller, M.E.; Belnap, J.; Beatty, S.W.; Webb, B.L.

    2006-01-01

    We used ion-exchange resin bags to investigate effects of water additions, chemical amendments, and plant presence on in situ measures of nutrient bioavailability in conjunction with a study examining soil controls of ecosystem invasion by the exotic annual grass Bromus tectorum L. At five dryland sites in southeastern Utah, USA, resin bags were buried in experimental plots randomly assigned to combinations of two watering treatments (wet and dry), four chemical-amendment treatments (KCl, MgO, CaO, and no amendment), and four plant treatments (B. tectorum alone, the perennial bunchgrass Stipa hymenoides R. & S. alone, B. tectorum and S. hymenoides together, and no plants). Resin bags were initially buried in September 1997; replaced in January, April, and June 1998; and removed at the end of the study in October 1998. When averaged across watering treatments, plots receiving KCl applications had lower resin-bag NO3- than plots receiving no chemical amendments during three of four measurement periods, probably due to NO3- displacement from resin bags by Cl- ions. During the January-April period, KCl application in wet plots (but not dry plots) decreased resin-bag NH4+ and increased resin-bag NO3-. This interaction effect likely resulted from displacement of NH4+ from resins by K+ ions, followed by nitrification and enhanced NO3- capture by resin bags. In plots not receiving KCl applications, resin-bag NH4+ was higher in wet plots than in dry plots during the same period. During the January-April period, resin-bag measures for carbonate-related ions HPO42-, Ca2+, and Mn2+ tended to be greater in the presence of B. tectorum than in the absence of B. tectorum. This trend was evident only in wet plots where B. tectorum densities were much higher than in dry plots. We attribute this pattern to the mobilization of carbonate-associated ions by root exudates of B. tectorum. 
These findings indicate the importance of considering potential indirect effects of soil amendments performed in conjunction with resource-limitation studies, and they suggest the need for further research concerning nutrient acquisition mechanisms of B. tectorum. © 2006 Springer Science+Business Media B.V.

  12. Suppression tuning of distortion-product otoacoustic emissions: Results from cochlear mechanics simulation

    PubMed Central

    Liu, Yi-Wen; Neely, Stephen T.

    2013-01-01

    This paper presents the results of simulating the acoustic suppression of distortion-product otoacoustic emissions (DPOAEs) from a computer model of cochlear mechanics. A tone suppressor was introduced, causing the DPOAE level to decrease, and the decrement was plotted against an increasing suppressor level. Suppression threshold was estimated from the resulting suppression growth functions (SGFs), and suppression tuning curves (STCs) were obtained by plotting the suppression threshold as a function of suppressor frequency. Results show that the slope of SGFs is generally higher for low-frequency suppressors than high-frequency suppressors, resembling those obtained from normal-hearing human ears. By comparing responses of normal (100%) vs reduced (50%) outer-hair-cell sensitivities, the model predicts that the tip-to-tail difference of the STCs correlates well with that of intra-cochlear iso-displacement tuning curves. The correlation is poorer, however, between the sharpness of the STCs and that of the intra-cochlear tuning curves. These results agree qualitatively with what was recently reported from normal-hearing and hearing-impaired human subjects, and examination of intra-cochlear model responses can provide the needed insight regarding the interpretation of DPOAE STCs obtained in individual ears. PMID:23363112

  13. Relationship between esophageal clinical symptoms and manometry findings in patients with esophageal motility disorders: a cross-sectional study.

    PubMed

    FakhreYaseri, Hashem; FakhreYaseri, Ali Mohammad; Baradaran Moghaddam, Ali; Soltani Arabshhi, Seyed Kamran

    2015-01-01

    Manometry is the gold-standard diagnostic test for motility disorders in the esophagus. The development of high-resolution manometry catheters and software displays of manometry recordings in color-coded pressure plots have changed the diagnostic assessment of esophageal disease. The diagnostic value of particular esophageal clinical symptoms among patients suspected of esophageal motor disorders (EMDs) is still unknown. The aim of this study was to explore the sensitivity, specificity, and predictive accuracy of presenting esophageal symptoms for discriminating between abnormal and normal esophageal manometry findings. We conducted a cross-sectional study of 623 patients aged 11-80 years. Data were collected from clinical examinations as well as patient questionnaires. The sensitivity, specificity, and accuracy were calculated after high-resolution manometry plots were reviewed according to the most recent Chicago Criteria. The clinical symptoms were not sensitive enough to discriminate between EMDs. Nevertheless, dysphagia, noncardiac chest pain, hoarseness, vomiting, and weight loss had high specificity and high accuracy to distinguish EMDs from normal findings. Regurgitation and heartburn did not have good accuracy for the diagnosis of EMDs. Clinical symptoms are not reliable enough to discriminate between EMDs. Clinical symptoms can, however, discriminate between normal findings and EMDs, especially achalasia.
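    The sensitivity, specificity, and accuracy measures reported here follow from standard confusion-matrix definitions; a minimal sketch with hypothetical counts (not the study's data):

```python
# Hypothetical confusion counts for one symptom vs manometry diagnosis:
# tp/fn = patients with an EMD who do/don't report the symptom,
# tn/fp = patients with normal manometry who don't/do report it.
tp, fn, tn, fp = 80, 20, 90, 10

sensitivity = tp / (tp + fn)               # true-positive rate
specificity = tn / (tn + fp)               # true-negative rate
accuracy = (tp + tn) / (tp + fn + tn + fp)

print(sensitivity, specificity, accuracy)  # 0.8 0.9 0.85
```

A symptom like dysphagia in this study behaves like the example above: high specificity (few false positives among normal-manometry patients) even when sensitivity is modest.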

  14. Deep brain stimulation abolishes slowing of reactions to unlikely stimuli.

    PubMed

    Antoniades, Chrystalina A; Bogacz, Rafal; Kennard, Christopher; FitzGerald, James J; Aziz, Tipu; Green, Alexander L

    2014-08-13

    The cortico-basal-ganglia circuit plays a critical role in decision making on the basis of probabilistic information. Computational models have suggested how this circuit could compute the probabilities of actions being appropriate according to Bayes' theorem. These models predict that the subthalamic nucleus (STN) provides feedback that normalizes the neural representation of probabilities, such that if the probability of one action increases, the probabilities of all other available actions decrease. Here we report the results of an experiment testing a prediction of this theory that disrupting information processing in the STN with deep brain stimulation should abolish the normalization of the neural representation of probabilities. In our experiment, we asked patients with Parkinson's disease to saccade to a target that could appear in one of two locations, and the probability of the target appearing in each location was periodically changed. When the stimulator was switched off, the target probability affected the reaction times (RT) of patients in a similar way to healthy participants. Specifically, the RTs were shorter for more probable targets and, importantly, they were longer for the unlikely targets. When the stimulator was switched on, the patients were still faster for more probable targets, but critically they did not increase RTs as the target was becoming less likely. This pattern of results is consistent with the prediction of the model that the patients on DBS no longer normalized their neural representation of prior probabilities. We discuss alternative explanations for the data in the context of other published results. Copyright © 2014 the authors 0270-6474/14/3410844-09$15.00/0.
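    The normalization that the models attribute to STN feedback can be illustrated by a plain Bayes update across two candidate target locations. The prior and likelihood values below are invented for illustration; the point is that increasing the evidence for one action necessarily lowers the normalized probability of the other:

```python
import numpy as np

# Bayesian update of the probabilities of two candidate actions given
# a noisy observation. The division by the summed evidence is the
# normalization step the models attribute to STN feedback.
prior = np.array([0.8, 0.2])        # target more likely at location 1
likelihood = np.array([0.6, 0.5])   # evidence for each location
posterior = prior * likelihood
posterior /= posterior.sum()        # normalization step
print(posterior.round(3))
```

With the stimulator disrupting the STN, the model's prediction is that this renormalization fails: the representation of the likely action still grows, but the unlikely action's probability is no longer pushed down, so its RT is no longer lengthened.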

  15. NPLOT - NASTRAN PLOT

    NASA Technical Reports Server (NTRS)

    Mcentire, K.

    1994-01-01

    NPLOT is an interactive computer graphics program for plotting undeformed and deformed NASTRAN finite element models (FEMs). Although there are many commercial codes already available for plotting FEMs, these have limited use due to their cost, speed, and lack of features to view BAR elements. NPLOT was specifically developed to overcome these limitations. On a vector type graphics device the two best ways to show depth are by hidden line plotting or haloed line plotting. A hidden line algorithm generates views of models with all hidden lines removed, and a haloed line algorithm displays views with aft lines broken in order to show depth while keeping the entire model visible. A haloed line algorithm is especially useful for plotting models composed of many line elements and few surface elements. The most important feature of NPLOT is its ability to create both hidden line and haloed line views accurately and much more quickly than with any other existing hidden or haloed line algorithms. NPLOT is also capable of plotting a normal wire frame view to display all lines of a model. NPLOT is able to aid in viewing all elements, but it has special features not generally available for plotting BAR elements. These features include plotting of TRUE LENGTH and NORMALIZED offset vectors and orientation vectors. Standard display operations such as rotation and perspective are possible, but different view planes such as X-Y, Y-Z, and X-Z may also be selected. Another display option is the Z-axis cut which allows a portion of the fore part of the model to be cut away to reveal details of the inside of the model. A zoom function is available to terminals with a locator (graphics cursor, joystick, etc.). The user interface of NPLOT is designed to make the program quick and easy to use. A combination of menus and commands with help menus for detailed information about each command allows experienced users greater speed and efficiency. 
Once a plot is on the screen the interface becomes command driven, enabling the user to manipulate the display or execute a command without having to return to the menu. NPLOT is also able to plot deformed shapes allowing it to perform post-processing. The program can read displacements, either static displacements or eigenvectors, from a MSC/NASTRAN F06 file or a UAI/NASTRAN PRT file. The displacements are written into an unformatted scratch file where they are available for rapid access when the user wishes to display a deformed shape. All subcases or mode shapes can be read in at once. Then it is easy to enable the deformed shape, to change subcases or mode shapes and to change the scale factor for subsequent plots. NPLOT is written in VAX FORTRAN for DEC VAX series computers running VMS. As distributed, the NPLOT source code makes calls to the DI3000 graphics package from Precision Visuals; however, a set of interface routines is provided to translate the DI3000 calls into Tektronix PLOT10/TCS graphics library calls so that NPLOT can use the standard Tektronix 4010 which many PC terminal emulation software programs support. NPLOT is available in VAX BACKUP format on a 9-track 1600 BPI DEC VAX BACKUP format magnetic tape (standard media) or a TK50 tape cartridge. This program was developed in 1991. DEC, VAX, VMS, and TK50 are trademarks of Digital Equipment Corporation. Tektronix, PLOT10, and TCS are trademarks of Tektronix, Inc. DI3000 is a registered trademark of Precision Visuals, Inc. NASTRAN is a registered trademark of the National Aeronautics and Space Administration. MSC/ is a trademark of MacNeal-Schwendler Corporation. UAI is a trademark of Universal Analytics, Inc.

  16. Annual survival estimation of migratory songbirds confounded by incomplete breeding site-fidelity: Study designs that may help

    USGS Publications Warehouse

    Marshall, M.R.; Diefenbach, D.R.; Wood, L.A.; Cooper, R.J.

    2004-01-01

    Many species of bird exhibit varying degrees of site-fidelity to the previous year's territory or breeding area, a phenomenon we refer to as incomplete breeding site-fidelity. If the territory they occupy is located beyond the bounds of the study area or search area (i.e., they have emigrated from the study area), the bird will go undetected and is therefore indistinguishable from dead individuals in capture-mark-recapture studies. Differential emigration rates confound inferences regarding differences in survival between sexes and among species if apparent survival rates are used as estimates of true survival. Moreover, the bias introduced by using apparent survival rates for true survival rates can have profound effects on the predictions of population persistence through time, source/sink dynamics, and other aspects of life-history theory. We investigated four study design and analysis approaches that result in apparent survival estimates that are closer to true survival estimates. Our motivation for this research stemmed from a multi-year capture-recapture study of Prothonotary Warblers (Protonotaria citrea) on multiple study plots within a larger landscape of suitable breeding habitat where substantial inter-annual movements of marked individuals among neighboring study plots was documented. We wished to quantify the effects of this type of movement on annual survival estimation. The first two study designs we investigated involved marking birds in a core area and resighting them in the core as well as an area surrounding the core. For the first of these two designs, we demonstrated that as the resighting area surrounding the core gets progressively larger, and more "emigrants" are resighted, apparent survival estimates begin to approximate true survival rates (bias < 0.01). However, given observed inter-annual movements of birds, it is likely to be logistically impractical to resight birds on sufficiently large surrounding areas to minimize bias. 
Therefore, as an alternative protocol, we analyzed the data with subsets of three progressively larger areas surrounding the core. The data subsets provided four estimates of apparent survival that asymptotically approached true survival. This study design and analytical approach is likely to be logistically feasible in field settings and yields estimates of true survival unbiased (bias < 0.03) by incomplete breeding site-fidelity over a range of inter-annual territory movement patterns. The third approach we investigated used a robust design data collection and analysis approach. This approach resulted in estimates of survival that were unbiased (bias < 0.02), but were very imprecise and likely would not yield reliable estimates in field situations. The fourth approach utilized a fixed study area size, but modeled detection probability as a function of bird proximity to the study plot boundary (e.g., those birds closest to the edge are more likely to emigrate). This approach also resulted in estimates of survival that were unbiased (bias < 0.02), but because the individual covariates were normalized, the average capture probability was 0.50, and thus did not provide an accurate estimate of the true capture probability. Our results show that the core-area design with surrounding resight-only areas can provide estimates of survival that are not biased by the effects of incomplete breeding site-fidelity. © 2004 Museu de Ciències Naturals.

  17. Extreme Drought Event and Shrub Invasion Reduce Oak Trees Functioning and Resilience on Water-Limited Ecosystems

    NASA Astrophysics Data System (ADS)

    Caldeira, M. C.; Lobo-do-Vale, R.; Lecomte, X.; David, T. S.; Pinto, J. G.; Bugalho, M. N.; Werner, C.

    2016-12-01

    Extreme droughts and plant invasions are major drivers of global change that can critically affect ecosystem functioning. Shrub encroachment is increasing in many regions worldwide and extreme events are projected to increase in frequency and intensity, namely in the Mediterranean region. Nevertheless, little is known about how these drivers may interact and affect ecosystem functioning and resilience. Using a manipulative shrub removal experiment and the co-occurrence of an extreme drought event in a Mediterranean oak woodland, we show that the combination of native shrub invasion and extreme drought reduced ecosystem transpiration and the resilience of the keystone oak tree species. We established six 25 x 25 m paired plots in a shrub (Cistus ladanifer L.) encroached Mediterranean cork-oak (Quercus suber L.) woodland. We measured sapflow and pre-dawn leaf water potential of trees and shrubs and soil water content in all plots during four years. We determined the resilience of tree transpiration to evaluate to what extent trees recovered from the extreme drought event. From February to November 2011 we conducted baseline measurements for plot comparison. In November 2011 all the shrubs in one plot of each pair were cut and removed. Ecosystem transpiration was dominated by the water use of the invasive shrub, which further increased after the extreme drought. Simultaneously, tree transpiration in invaded plots declined more sharply (67 ± 13 %) than in plots cleared of shrubs (31 ± 11%) relative to the pre-drought year (2011). Trees in invaded plots were not able to recover in the following wetter year showing lower resilience to the extreme drought event. Our results imply that in Mediterranean-type climates invasion by water-spending species coupled with the projected recurrent extreme droughts will cause critical drought tolerance thresholds of trees to be overcome, thus increasing the probability of tree mortality.

  18. Oscillation properties of active and sterile neutrinos and neutrino anomalies at short distances

    NASA Astrophysics Data System (ADS)

    Khruschov, V. V.; Fomichev, S. V.; Titov, O. A.

    2016-09-01

    A generalized phenomenological (3 + 2 + 1) model featuring three active and three sterile neutrinos that is intended for calculating oscillation properties of neutrinos for the case of a normal active-neutrino mass hierarchy and a large splitting between the mass of one sterile neutrino and the masses of the other two sterile neutrinos is considered. A new parametrization and a specific form of the general mixing matrix are proposed for active and sterile neutrinos with allowance for possible CP violation in the lepton sector, and test values are chosen for the neutrino masses and mixing parameters. The probabilities for the transitions between different neutrino flavors are calculated, and graphs representing the probabilities for the disappearance of muon neutrinos/antineutrinos and the appearance of electron neutrinos/antineutrinos in a beam of muon neutrinos/antineutrinos versus the distance from the neutrino source for various values of admissible model parameters at neutrino energies not higher than 50 MeV, as well as versus the ratio of this distance to the neutrino energy, are plotted. It is shown that the short-distance accelerator anomaly in neutrino data (LSND anomaly) can be explained in the case of a specific mixing matrix for active and sterile neutrinos (which belongs to the a2 type) at the chosen parameter values. The same applies to the short-distance reactor and gallium anomalies. The theoretical results obtained in the present study can be used to interpret and predict the results of ground-based neutrino experiments aimed at searches for sterile neutrinos, as well as to analyze some astrophysical observational data.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grantham, K; Santanam, L; Goddu, S

    Purpose: We retrospectively evaluate the dosimetric impact of a 3.5% range uncertainty on CTV coverage and normal organ toxicity for a cohort of brain patients. Methods: Twenty treatment plans involving 20 brain cancer patients treated with the Mevion S250 were reviewed. Forty uncertain plans were made by changing the ranges in original plans by ±3.5% while keeping all devices unchanged. Fidelity to the original plans was evaluated with gamma index. Changes in generalized equivalent uniform dose (gEUD) were reported for the following structures: CTV coverage, brainstem, optic chiasm, and optic nerves. Comparisons were made by plotting the relevant endpoints from the uncertain plans as a function of the same endpoints from the original clinical plan. Results: Gamma-index analysis, using a 3%/3 mm criterion with a 90% passing-rate threshold, resulted in a 50% pass rate for the uncertain plans. A 9.5% decrease in the slope of gEUD plot for the CTV was observed for the 3.5% downward range shift. However, the change in slope did not result in a gEUD change greater than 1.1% for the CTV. The slopes of the gEUD plots for normal structures increased by 3.1%, 3.9%, 2.4%, and 0.2% for the chiasm, brainstem, left optic nerve and right optic nerve respectively. The maximum deviation from the gEUD of the clinical plan for normal structures was: 64% in the chiasm, 31% for the brainstem, and 19% for both optic nerves. Conclusion: A retrospective review shows moderate radiobiological impact of range uncertainty in passively scattered proton therapy with sporadic catastrophic deviations. The linear regression analysis on the statistical data indicates a systematic deviation of gEUD from treatment planning in the light of range uncertainty.
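    The gamma index used above combines a dose-difference criterion with a distance-to-agreement criterion. A minimal 1-D sketch with made-up Gaussian dose profiles follows (real QA tools evaluate 2-D/3-D dose grids and interpolate between grid points; everything here is illustrative):

```python
import numpy as np

# 1-D gamma index with a 3%/3 mm criterion: for each evaluated point,
# gamma = min over reference points of
#         sqrt((distance/dta)^2 + (dose difference / dd)^2),
# with the dose difference normalized to the reference maximum.
dd, dta = 0.03, 3.0                     # 3% dose, 3 mm distance criteria
x = np.arange(0.0, 50.0, 0.1)           # positions in mm
ref = np.exp(-((x - 25.0) / 10.0) ** 2) # reference dose profile
ev = np.exp(-((x - 25.5) / 10.0) ** 2)  # evaluated profile, 0.5 mm shift

gamma = np.empty_like(x)
for i, (xe, de) in enumerate(zip(x, ev)):
    dist = (x - xe) / dta
    dose = (ref - de) / (dd * ref.max())
    gamma[i] = np.sqrt(dist ** 2 + dose ** 2).min()

pass_rate = np.mean(gamma <= 1.0)       # fraction of points with gamma <= 1
print(round(pass_rate, 2))
```

A small spatial shift like this passes easily because the distance term absorbs it; a range error in a proton plan shifts the distal dose falloff far enough that many points exceed gamma = 1, which is how the 50% pass rate above arises.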

  20. Approximating Multivariate Normal Orthant Probabilities. ONR Technical Report. [Biometric Lab Report No. 90-1.]

    ERIC Educational Resources Information Center

    Gibbons, Robert D.; And Others

    The probability integral of the multivariate normal distribution (ND) has received considerable attention since W. F. Sheppard's (1900) and K. Pearson's (1901) seminal work on the bivariate ND. This paper evaluates the formula that represents the "n x n" correlation matrix of the chi_i and the standardized multivariate…
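    For the bivariate case the orthant probability has a closed form, which makes a convenient check on a Monte Carlo estimate; a sketch (the correlation value and sample size are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)

# Monte Carlo estimate of the bivariate normal orthant probability
# P(X > 0, Y > 0) for standardized variables with correlation rho.
# The bivariate closed form 1/4 + arcsin(rho)/(2*pi) provides a check.
rho = 0.5
cov = np.array([[1.0, rho], [rho, 1.0]])
z = rng.multivariate_normal(np.zeros(2), cov, size=200_000)
estimate = np.mean((z[:, 0] > 0) & (z[:, 1] > 0))

exact = 0.25 + np.arcsin(rho) / (2 * np.pi)   # equals 1/3 for rho = 0.5
print(round(estimate, 3), round(exact, 4))
```

Beyond two or three dimensions no closed form exists, which is why approximation methods like the one this report studies are needed.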

  1. Spatial patch occupancy patterns of the Lower Keys marsh rabbit

    USGS Publications Warehouse

    Eaton, Mitchell J.; Hughes, Phillip T.; Nichols, James D.; Morkill, Anne; Anderson, Chad

    2011-01-01

    Reliable estimates of presence or absence of a species can provide substantial information on management questions related to distribution and habitat use but should incorporate the probability of detection to reduce bias. We surveyed for the endangered Lower Keys marsh rabbit (Sylvilagus palustris hefneri) in habitat patches on 5 Florida Key islands, USA, to estimate occupancy and detection probabilities. We derived detection probabilities using spatial replication of plots and evaluated hypotheses that patch location (coastal or interior) and patch size influence occupancy and detection. Results demonstrate that detection probability, given rabbits were present, was <0.5 and suggest that naïve estimates (i.e., estimates without consideration of imperfect detection) of patch occupancy are negatively biased. We found that patch size and location influenced probability of occupancy but not detection. Our findings will be used by Refuge managers to evaluate population trends of Lower Keys marsh rabbits from historical data and to guide management decisions for species recovery. The sampling and analytical methods we used may be useful for researchers and managers of other endangered lagomorphs and cryptic or fossorial animals occupying diverse habitats.
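    The negative bias of naive occupancy estimates under imperfect detection can be demonstrated with a small simulation. The parameter values are illustrative, and formal occupancy models estimate occupancy and detection jointly by maximum likelihood rather than using the ad hoc moment correction sketched here:

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated patches with spatial-replicate plots, as in the survey design
# described above (all parameter values are invented for illustration).
n_patches, n_plots = 2000, 5    # patches, spatial replicates per patch
psi, p = 0.6, 0.4               # true occupancy, per-plot detection probability

occupied = rng.random(n_patches) < psi
detections = (rng.random((n_patches, n_plots)) < p) & occupied[:, None]
detected = detections.any(axis=1)

naive = detected.mean()                  # ignores imperfect detection
p_star = 1 - (1 - p) ** n_plots          # Pr(>= 1 detection | occupied)
corrected = naive / p_star               # simple moment-based correction

print(round(naive, 3), round(corrected, 3))
```

The naive estimate falls below the true occupancy because some occupied patches go undetected in every plot, which is exactly the bias the study quantifies for the marsh rabbit surveys.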

  2. CAN'T MISS--conquer any number task by making important statistics simple. Part 2. Probability, populations, samples, and normal distributions.

    PubMed

    Hansen, John P

    2003-01-01

    Healthcare quality improvement professionals need to understand and use inferential statistics to interpret sample data from their organizations. In quality improvement and healthcare research studies all the data from a population often are not available, so investigators take samples and make inferences about the population by using inferential statistics. This three-part series will give readers an understanding of the concepts of inferential statistics as well as the specific tools for calculating confidence intervals for samples of data. This article, Part 2, describes probability, populations, and samples. The uses of descriptive and inferential statistics are outlined. The article also discusses the properties and probability of normal distributions, including the standard normal distribution.

  3. Spatial-frequency dependent binocular imbalance in amblyopia

    PubMed Central

    Kwon, MiYoung; Wiecek, Emily; Dakin, Steven C.; Bex, Peter J.

    2015-01-01

    While amblyopia involves both binocular imbalance and deficits in processing high spatial frequency information, little is known about the spatial-frequency dependence of binocular imbalance. Here we examined binocular imbalance as a function of spatial frequency in amblyopia using a novel computer-based method. Binocular imbalance at four spatial frequencies was measured with a novel dichoptic letter chart in individuals with amblyopia, or normal vision. Our dichoptic letter chart was composed of band-pass filtered letters arranged in a layout similar to the ETDRS acuity chart. A different chart was presented to each eye of the observer via stereo-shutter glasses. The relative contrast of the corresponding letter in each eye was adjusted by a computer staircase to determine a binocular Balance Point at which the observer reports the letter presented to either eye with equal probability. Amblyopes showed pronounced binocular imbalance across all spatial frequencies, with greater imbalance at high compared to low spatial frequencies (an average increase of 19%, p < 0.01). Good test-retest reliability of the method was demonstrated by the Bland-Altman plot. Our findings suggest that spatial-frequency dependent binocular imbalance may be useful for diagnosing amblyopia and as an outcome measure for recovery of binocular vision following therapy. PMID:26603125

  4. Classification of the degenerative grade of lesions of supraspinatus rotator cuff tendons by FT-Raman spectroscopy

    NASA Astrophysics Data System (ADS)

    Palma Fogazza, Bianca; da Silva Carvalho, Carolina; Godoy Penteado, Sergio; Meneses, Cláudio S.; Abrahão Martin, Airton; da Silva Martinho, Herculano

    2007-02-01

    FT-Raman spectroscopy was employed to access the biochemical alterations occurring on the degenerative process of the rotator cuff supraspinatus tendons. The spectral characteristic variations in the 351 spectra of samples of 39 patients were identified with the help of Principal Components Analysis. The main variations occurred in the 840-911; 1022- 1218; 1257; 1270; 1300; 1452; 1663; and 1751 cm -1 regions corresponding to the vibrational bands of proline, hydroxiproline, lipids, nucleic acids, carbohydrates, collagen, and elastin. These alterations are compatible with the pathology alterations reported on the literature. Scattering plots of PC 4 vs PC 2 and PC 3 vs PC 2 contrasted with histopathological analysis has enabled the spectral classification of the data into normal and degenerated groups of tendons. By depicting empiric lines the estimated sensibility and specificity were 39,6 % and 97,8 %, respectively for PC 4 vs PC 2 and 36,0 % and 100 %, respectively for PC 3 vs PC 2. These results indicate that Raman spectroscopy can be used to probe the general tendon quality and could be applied as co adjuvant element in the usual arthroscopy surgery apparatus to guide the procedure and possibly infer about the probability of rerupture.

  5. Spatial-frequency dependent binocular imbalance in amblyopia.

    PubMed

    Kwon, MiYoung; Wiecek, Emily; Dakin, Steven C; Bex, Peter J

    2015-11-25

    While amblyopia involves both binocular imbalance and deficits in processing high spatial frequency information, little is known about the spatial-frequency dependence of binocular imbalance. Here we examined binocular imbalance as a function of spatial frequency in amblyopia using a novel computer-based method. Binocular imbalance at four spatial frequencies was measured with a novel dichoptic letter chart in individuals with amblyopia, or normal vision. Our dichoptic letter chart was composed of band-pass filtered letters arranged in a layout similar to the ETDRS acuity chart. A different chart was presented to each eye of the observer via stereo-shutter glasses. The relative contrast of the corresponding letter in each eye was adjusted by a computer staircase to determine a binocular Balance Point at which the observer reports the letter presented to either eye with equal probability. Amblyopes showed pronounced binocular imbalance across all spatial frequencies, with greater imbalance at high compared to low spatial frequencies (an average increase of 19%, p < 0.01). Good test-retest reliability of the method was demonstrated by the Bland-Altman plot. Our findings suggest that spatial-frequency dependent binocular imbalance may be useful for diagnosing amblyopia and as an outcome measure for recovery of binocular vision following therapy.

  6. Spatial interpolation of soil organic carbon using apparent electrical conductivity as secondary information

    NASA Astrophysics Data System (ADS)

    Martinez, G.; Vanderlinden, K.; Ordóñez, R.; Muriel, J. L.

    2009-04-01

    Soil organic carbon (SOC) spatial characterization is necessary to evaluate under what circumstances soil acts as a source or sink of carbon dioxide. However, at the field or catchment scale it is hard to accurately characterize its spatial distribution since large numbers of soil samples are necessary. As an alternative, near-surface geophysical sensor-based information can improve the spatial estimation of soil properties at these scales. Electromagnetic induction (EMI) sensors provide non-invasive and non-destructive measurements of the soil apparent electrical conductivity (ECa), which depends under non-saline conditions on clay content, water content, or SOC, among other properties that determine the electromagnetic behavior of the soil. This study deals with the possible use of ECa-derived maps to improve SOC spatial estimation by Simple Kriging with varying local means (SKlm). Field work was carried out in a vertisol in SW Spain. The field is part of a long-term tillage experiment set up in 1982 with three replicates of conventional tillage (CT) and Direct Drilling (DD) plots with unitary dimensions of 15 x 65 m. Shallow and deep (up to 0.8 m depth) apparent electrical conductivity (ECas and ECad, respectively) was measured using the EM38-DD EMI sensor. Soil samples were taken from the upper horizon and analyzed for their SOC content. Correlation coefficients of ECas and ECad with SOC were low (0.331 and 0.175) due to the small range of SOC values and possibly also to the different support of the ECa and SOC data. The ECas values in particular were higher in the DD plots. The normalized ECa difference (ΔECa), calculated as the difference between the normalized ECas and ECad values, clearly distinguished the CT and DD plots, with the DD plots showing positive ΔECa values and the CT plots negative values. The field was stratified using fuzzy k-means (FKM) classification of ΔECa (FKM1), and ECas and ECad (FKM2). 
The FKM1 map mainly showed the difference between CT and DD plots, while the FKM2 map showed both differences between CT and DD and topography-associated features. Using the FKM1 and FKM2 maps as secondary information accounted for 30% of the total SOC variability, whereas plot and management average SOC explained 44 and 41%, respectively. Cross validation of SKlm using FKM2 reduced the RMSE by 8% and increased the efficiency index almost 70% as compared to Ordinary Kriging. This work shows how ECa can improve the spatial characterization of SOC, despite its low correlation and the small size of the plots used in this study.
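    The normalized ECa difference used for stratification can be sketched as below. The abstract does not state the normalization formula, so z-scoring is assumed here, and the readings are invented:

```python
from statistics import mean, stdev

def z_normalize(values):
    """Scale a list to zero mean and unit (sample) standard deviation."""
    m, s = mean(values), stdev(values)
    return [(v - m) / s for v in values]

def delta_eca(ecas, ecad):
    """Normalized ECa difference: z(ECas) - z(ECad), point by point.

    Positive values would flag DD-like locations, negative values CT-like
    ones, following the pattern described in the abstract.
    """
    return [a - b for a, b in zip(z_normalize(ecas), z_normalize(ecad))]

# Hypothetical shallow and deep ECa readings along one transect (mS/m).
ecas = [52.0, 55.0, 58.0, 40.0, 38.0, 36.0]
ecad = [60.0, 61.0, 62.0, 59.0, 60.0, 61.0]
d = delta_eca(ecas, ecad)
```

The ΔECa series (or the two raw series) would then feed a fuzzy k-means classifier to produce the FKM1/FKM2 strata used as secondary information in SKlm.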

  7. A model-based approach to estimating forest area

    Treesearch

    Ronald E. McRoberts

    2006-01-01

    A logistic regression model based on forest inventory plot data and transformations of Landsat Thematic Mapper satellite imagery was used to predict the probability of forest for 15 study areas in Indiana, USA, and 15 in Minnesota, USA. Within each study area, model-based estimates of forest area were obtained for circular areas with radii of 5 km, 10 km, and 15 km and...
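    A sketch of the kind of logistic model described, with made-up coefficients and a single made-up vegetation-index predictor (the study fitted its own coefficients from FIA plot data and Landsat TM transformations):

```python
from math import exp

def forest_probability(x, b0=-6.0, b1=12.0):
    """Logistic model p(forest) = 1 / (1 + exp(-(b0 + b1 * x))).

    b0 and b1 are illustration values only; x stands in for a
    spectral predictor such as a transformed TM band.
    """
    return 1.0 / (1.0 + exp(-(b0 + b1 * x)))

# Probability of forest rises with the (hypothetical) predictor value.
for x in (0.2, 0.5, 0.8):
    print(x, round(forest_probability(x), 3))
```

A model-based forest-area estimate for a circle then follows by summing the predicted probabilities over the pixels it contains and multiplying by the pixel area.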

  8. On the He(plus) triplet line intensities

    NASA Technical Reports Server (NTRS)

    Daltabuit, E.; Cox, D.

    1971-01-01

    The theoretical calculations of helium triplet line strengths, including collisional enhancement, are compared to astronomical observations. Both are plotted on an I(10830)/I(5876) vs I(5876)/I(4471) plane. It appears that the theory of helium triplet line strengths agrees with present observations, and that the question of an additional depopulation mechanism for the 2 3S population is probably predicted correctly within 30%.

  9. Mortality rates associated with crown health for eastern forest tree species

    Treesearch

    Randall S. Morin; KaDonna C. Randolph; Jim Steinman

    2015-01-01

    The condition of tree crowns is an important indicator of tree and forest health. Crown conditions have been evaluated during inventories of the US Forest Service Forest Inventory and Analysis (FIA) program since 1999. In this study, remeasured data from 55,013 trees on 2616 FIA plots in the eastern USA were used to assess the probability of survival among various tree...

  10. Experimental enhancement of pickleweed, Suisun Bay, California

    USGS Publications Warehouse

    Miles, A. Keith; Van Vuren, Dirk H.; Tsao, Danika C.; Yee, Julie L.

    2015-01-01

    As mitigation for habitat impacted by the expansion of a pier on Suisun Bay, California, two vehicle parking lots (0.36 ha and 0.13 ha) were restored by being excavated, graded, and contoured using dredged sediments to the topography or elevation of nearby wetlands. We asked if pickleweed (Sarcocornia pacifica L. [Amaranthaceae]) colonization could be enhanced by experimental manipulation on these new wetlands. Pickleweed dominates ecologically important communities at adjacent San Francisco Bay, but is not typically dominant at Suisun Bay, probably because widely fluctuating water salinity allows it to be outcompeted by other brackish-water plants. Experimental treatments (1.0-m2 plots) included mulching with pickleweed cuttings in either the fall or the spring, tilling in the fall, or applying organic enrichments in the fall. Control plots received no treatment. Pickleweed colonization was most enhanced at treatment plots that were mulched with pickleweed in the fall. Since exotic vegetation can colonize bare sites within the early phases of restoration and reduce habitat quality, we concluded that mulching was most effective in the fall by reducing invasive plant cover while facilitating native plant colonization.

  11. Automatic classification of spectra from the Infrared Astronomical Satellite (IRAS)

    NASA Technical Reports Server (NTRS)

    Cheeseman, Peter; Stutz, John; Self, Matthew; Taylor, William; Goebel, John; Volk, Kevin; Walker, Helen

    1989-01-01

    A new classification of Infrared spectra collected by the Infrared Astronomical Satellite (IRAS) is presented. The spectral classes were discovered automatically by a program called Auto Class 2. This program is a method for discovering (inducing) classes from a data base, utilizing a Bayesian probability approach. These classes can be used to give insight into the patterns that occur in the particular domain, in this case, infrared astronomical spectroscopy. The classified spectra are the entire Low Resolution Spectra (LRS) Atlas of 5,425 sources. There are seventy-seven classes in this classification and these in turn were meta-classified to produce nine meta-classes. The classification is presented as spectral plots, IRAS color-color plots, galactic distribution plots and class commentaries. Cross-reference tables, listing the sources by IRAS name and by Auto Class class, are also given. These classes show some of the well known classes, such as the black-body class, and silicate emission classes, but many other classes were unsuspected, while others show important subtle differences within the well known classes.

  12. Time delay and long-range connection induced synchronization transitions in Newman-Watts small-world neuronal networks.

    PubMed

    Qian, Yu

    2014-01-01

    The synchronization transitions in Newman-Watts small-world neuronal networks (SWNNs) induced by time delay τ and long-range connection (LRC) probability P have been investigated by synchronization parameter and space-time plots. Four distinct parameter regions, that is, asynchronous region, transition region, synchronous region, and oscillatory region have been discovered at certain LRC probability P = 1.0 as time delay is increased. Interestingly, desynchronization is observed in oscillatory region. More importantly, we consider the spatiotemporal patterns obtained in delayed Newman-Watts SWNNs are the competition results between long-range drivings (LRDs) and neighboring interactions. In addition, for moderate time delay, the synchronization of neuronal network can be enhanced remarkably by increasing LRC probability. Furthermore, lag synchronization has been found between weak synchronization and complete synchronization as LRC probability P is a little less than 1.0. Finally, the two necessary conditions, moderate time delay and large numbers of LRCs, are exposed explicitly for synchronization in delayed Newman-Watts SWNNs.

  13. Time Delay and Long-Range Connection Induced Synchronization Transitions in Newman-Watts Small-World Neuronal Networks

    PubMed Central

    Qian, Yu

    2014-01-01

    The synchronization transitions in Newman-Watts small-world neuronal networks (SWNNs) induced by time delay and long-range connection (LRC) probability have been investigated by synchronization parameter and space-time plots. Four distinct parameter regions, that is, asynchronous region, transition region, synchronous region, and oscillatory region have been discovered at certain LRC probability as time delay is increased. Interestingly, desynchronization is observed in oscillatory region. More importantly, we consider the spatiotemporal patterns obtained in delayed Newman-Watts SWNNs are the competition results between long-range drivings (LRDs) and neighboring interactions. In addition, for moderate time delay, the synchronization of neuronal network can be enhanced remarkably by increasing LRC probability. Furthermore, lag synchronization has been found between weak synchronization and complete synchronization as LRC probability is a little less than 1.0. Finally, the two necessary conditions, moderate time delay and large numbers of LRCs, are exposed explicitly for synchronization in delayed Newman-Watts SWNNs. PMID:24810595

  14. Comparison of the different probability distributions for earthquake hazard assessment in the North Anatolian Fault Zone

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yilmaz, Şeyda, E-mail: seydayilmaz@ktu.edu.tr; Bayrak, Erdem, E-mail: erdmbyrk@gmail.com; Bayrak, Yusuf, E-mail: bayrak@ktu.edu.tr

    In this study we examined and compared three different probability distributions to determine the most suitable model for probabilistic assessment of earthquake hazards. We analyzed a reliable homogeneous earthquake catalogue for the period 1900-2015 with magnitude M ≥ 6.0 and estimated the probabilistic seismic hazard in the North Anatolian Fault zone (39°-41° N, 30°-40° E) using three distributions, namely the Weibull, Frechet, and three-parameter Weibull distributions. The suitability of the distribution parameters was evaluated with the Kolmogorov-Smirnov (K-S) goodness-of-fit test. We also compared the estimated cumulative probability and the conditional probabilities of earthquake occurrence for different elapsed times using these three distributions. We used Easyfit and Matlab software to calculate the distribution parameters and plotted the conditional probability curves. We concluded that the Weibull distribution was the most suitable of the three for this region.
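    The conditional occurrence probabilities compared above follow from the fitted CDF: P(event by t+Δ | quiet for t years) = (F(t+Δ) − F(t)) / (1 − F(t)). A sketch for the two-parameter Weibull case, with illustrative shape and scale values rather than the study's fitted NAFZ parameters:

```python
from math import exp

def weibull_cdf(t, alpha, beta):
    """Two-parameter Weibull CDF: F(t) = 1 - exp(-(t/beta)^alpha)."""
    return 1.0 - exp(-((t / beta) ** alpha))

def conditional_probability(t_elapsed, dt, alpha, beta):
    """P(event within the next dt years | no event in the first t_elapsed years)."""
    f_t = weibull_cdf(t_elapsed, alpha, beta)
    f_td = weibull_cdf(t_elapsed + dt, alpha, beta)
    return (f_td - f_t) / (1.0 - f_t)

# Illustrative shape (alpha) and scale (beta, years) values only.
alpha, beta = 1.5, 30.0
print(conditional_probability(t_elapsed=20.0, dt=10.0, alpha=alpha, beta=beta))
```

Plotting this quantity against Δ for each candidate distribution reproduces the conditional probability curves the study compares.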

  15. Gradually truncated log-normal in USA publicly traded firm size distribution

    NASA Astrophysics Data System (ADS)

    Gupta, Hari M.; Campanha, José R.; de Aguiar, Daniela R.; Queiroz, Gabriel A.; Raheja, Charu G.

    2007-03-01

    We study the statistical distribution of firm size for USA and Brazilian publicly traded firms through the Zipf plot technique. Sale size is used to measure firm size. The Brazilian firm size distribution is given by a log-normal distribution without any adjustable parameter. However, we also need to consider different parameters of log-normal distribution for the largest firms in the distribution, which are mostly foreign firms. The log-normal distribution has to be gradually truncated after a certain critical value for USA firms. Therefore, the original hypothesis of proportional effect proposed by Gibrat is valid with some modification for very large firms. We also consider the possible mechanisms behind this distribution.
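    A Zipf plot ranks observations in descending order and plots log size against log rank; a pure log-normal produces a curved trace there, and truncation shows up as a departure in the largest-firm tail. A minimal sketch with simulated, not the study's, sales figures:

```python
import random
from math import log

random.seed(1)

# Hypothetical "firm sizes": log-normally distributed sales figures.
sizes = [random.lognormvariate(mu=10.0, sigma=1.5) for _ in range(1000)]

# Zipf plot coordinates: sort descending, then pair log(rank) with log(size).
sizes.sort(reverse=True)
zipf_points = [(log(rank), log(s)) for rank, s in enumerate(sizes, start=1)]
```

Inspecting where the empirical points fall away from the log-normal prediction at high ranks is how a "gradual truncation" beyond a critical size would be diagnosed.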

  16. Reproducibility Between Brain Uptake Ratio Using Anatomic Standardization and Patlak-Plot Methods.

    PubMed

    Shibutani, Takayuki; Onoguchi, Masahisa; Noguchi, Atsushi; Yamada, Tomoki; Tsuchihashi, Hiroko; Nakajima, Tadashi; Kinuya, Seigo

    2015-12-01

    The Patlak-plot and conventional methods of determining brain uptake ratio (BUR) have some problems with reproducibility. We formulated a method of determining BUR using anatomic standardization (BUR-AS) in a statistical parametric mapping algorithm to improve reproducibility. The objective of this study was to demonstrate the inter- and intraoperator reproducibility of mean cerebral blood flow as determined using BUR-AS in comparison to the conventional-BUR (BUR-C) and Patlak-plot methods. The images of 30 patients who underwent brain perfusion SPECT were retrospectively used in this study. The images were reconstructed using ordered-subset expectation maximization and processed using an automatic quantitative analysis for cerebral blood flow of ECD tool. The mean SPECT count was calculated from axial basal ganglia slices of the normal side (slices 31-40) drawn using a 3-dimensional stereotactic region-of-interest template after anatomic standardization. The mean cerebral blood flow was calculated from the mean SPECT count. Reproducibility was evaluated using coefficient of variation and Bland-Altman plotting. For both inter- and intraoperator reproducibility, the BUR-AS method had the lowest coefficient of variation and smallest error range about the Bland-Altman plot. Mean CBF obtained using the BUR-AS method had the highest reproducibility. Compared with the Patlak-plot and BUR-C methods, the BUR-AS method provides greater inter- and intraoperator reproducibility of cerebral blood flow measurement. © 2015 by the Society of Nuclear Medicine and Molecular Imaging, Inc.
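    The Bland-Altman reproducibility analysis reduces to the bias (mean paired difference) and the limits of agreement, bias ± 1.96 × SD of the differences. A sketch with invented paired CBF values, not the study's measurements:

```python
from statistics import mean, stdev

def bland_altman(method_a, method_b):
    """Return (bias, lower_loa, upper_loa) for paired measurements.

    Bias is the mean difference; the limits of agreement (LoA) are
    bias +/- 1.96 * SD of the differences.
    """
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = mean(diffs)
    s = stdev(diffs)
    return bias, bias - 1.96 * s, bias + 1.96 * s

# Hypothetical paired mean-CBF values (mL/100 g/min) from two operators.
op1 = [42.0, 45.5, 39.8, 47.2, 41.1]
op2 = [41.5, 46.0, 39.0, 46.8, 40.6]
bias, lo, hi = bland_altman(op1, op2)
```

A narrower LoA band around a near-zero bias is what "smallest error range about the Bland-Altman plot" refers to for the BUR-AS method.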

  17. Biodiversity Measurement Using Indices Based on Hyperspectral Reflectance on the Coast of Lagos

    NASA Astrophysics Data System (ADS)

    Omodanisi, E. O.; Salami, A. T.

    2013-12-01

    Hyperspectral measurements provide explicit measurements which can be used in the analysis of biodiversity change. This study was carried out in the coastal area of Lagos State, Nigeria. The objective of this study was to determine if gasoline seepage affects vegetation species distribution and reflectance; with the view to analyzing the vegetation condition. To evaluate the potential of different reflectance spectroscopy of species, the ASD Handheld2 Spectrometer was used. Three identified impacted plots of 30m by 30m were selected randomly and a control plot established in relatively undisturbed vegetated areas away from but perpendicular to the source of seepage. Each identified plot and the control consisted of five transects and measurement were taken at every 2m with about four reflectance measurement per sample point, to average out differences in reflectance as a result of different leaf angles. The radiance output of the spectrometer was converted into reflectance using the reflectance of a white reference over a standardized white spectralon panel. Indices such as Normalized Differential Vegetation Index, RedEdge Normalized Difference Vegetation Index, Soil Adjusted Vegetation Index, Ratio Vegetation Index and Volgelmann RedEdge Index 1 were calculated to accurately estimate the chlorophyll content in the vegetation within optimal band wavelength. Shannon-Weiner's index, Spearman's rank correlation and Analysis of Variance were used to analyze the data. Cocos nucifera was observed to be the most dominant species with a relative abundance of 47.27% while Ananas comosus recorded the lowest relative abundance of 21.8%. In the control plot, Cocos nucifera had the highest relative abundance of 42.3% and Mangifera indica with the least relative abundance of 16.7%. 
The relationships between the indices and the chlorophyll content of the vegetation were significant (p > 0.01) for all the indices in all the plots; however, the RedEdgeNDVI and VOG1 indices had the highest occurring frequency among the plots. They were therefore used to distinguish relatively healthy from relatively unhealthy vegetation, and the difference was statistically significant at F-ratios of 4.825 (p < 0.01) and 3.194 (p < 0.01), respectively. It was concluded that gasoline affected the condition of the vegetation. Table 2 (Spearman's rank correlation analysis relating the indices to chlorophyll content for the field data; values are averages over the entire transect in each plot) is not reproduced here.

  18. Multinomial mixture model with heterogeneous classification probabilities

    USGS Publications Warehouse

    Holland, M.D.; Gray, B.R.

    2011-01-01

    Royle and Link (Ecology 86(9):2505-2512, 2005) proposed an analytical method that allowed estimation of multinomial distribution parameters and classification probabilities from categorical data measured with error. While useful, we demonstrate algebraically and by simulations that this method yields biased multinomial parameter estimates when the probabilities of correct category classifications vary among sampling units. We address this shortcoming by treating these probabilities as logit-normal random variables within a Bayesian framework. We use Markov chain Monte Carlo to compute Bayes estimates from a simulated sample from the posterior distribution. Based on simulations, this elaborated Royle-Link model yields nearly unbiased estimates of multinomial and correct classification probability estimates when classification probabilities are allowed to vary according to the normal distribution on the logit scale or according to the Beta distribution. The method is illustrated using categorical submersed aquatic vegetation data. © 2010 Springer Science+Business Media, LLC.

  19. Effectiveness of oral hydration in preventing contrast-induced acute kidney injury in patients undergoing coronary angiography or intervention: a pairwise and network meta-analysis.

    PubMed

    Zhang, Weidai; Zhang, Jiawei; Yang, Baojun; Wu, Kefei; Lin, Hanfei; Wang, Yanping; Zhou, Lihong; Wang, Huatao; Zeng, Chujuan; Chen, Xiao; Wang, Zhixing; Zhu, Junxing; Songming, Chen

    2018-06-01

    The effectiveness of oral hydration in preventing contrast-induced acute kidney injury (CI-AKI) in patients undergoing coronary angiography or intervention has not been well established. This study aims to evaluate the efficacy of oral hydration compared with intravenous hydration and other frequently used hydration strategies. PubMed, Embase, Web of Science, and the Cochrane central register of controlled trials were searched from inception to 8 October 2017. To be eligible for analysis, studies had to evaluate the relative efficacy of different prophylactic hydration strategies. We selected and assessed the studies that fulfilled the inclusion criteria and carried out a pairwise and network meta-analysis using RevMan5.2 and Aggregate Data Drug Information System 1.16.8 software. A total of four studies (538 participants) were included in our pairwise meta-analysis and 1754 participants from eight studies with four frequently used hydration strategies were included in a network meta-analysis. Pairwise meta-analysis indicated that oral hydration was as effective as intravenous hydration for the prevention of CI-AKI (5.88 vs. 8.43%; odds ratio: 0.73; 95% confidence interval: 0.36-1.47; P>0.05), with no significant heterogeneity between studies. Network meta-analysis showed that there was no significant difference in the prevention of CI-AKI. However, the rank probability plot suggested that oral plus intravenous hydration had a higher probability (51%) of being the best strategy, followed by diuretic plus intravenous hydration (39%) and oral hydration alone (10%). Intravenous hydration alone was the strategy with the highest probability (70%) of being the worst hydration strategy. Our study shows that oral hydration is not inferior to intravenous hydration for the prevention of CI-AKI in patients with normal or mild-to-moderate renal dysfunction undergoing coronary angiography or intervention.
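    The pairwise odds ratio and confidence interval quoted above follow the standard 2x2-table formulas. A sketch with hypothetical counts chosen only to mimic the reported event rates; the meta-analytic OR of 0.73 pools several studies and is not reproduced by any single table:

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI from a 2x2 table.

    a/b = events/non-events in group 1 (oral hydration),
    c/d = events/non-events in group 2 (intravenous hydration).
    The CI uses the usual log-OR standard error sqrt(1/a+1/b+1/c+1/d).
    """
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = exp(log(or_) - z * se)
    hi = exp(log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 16/272 (5.9%) vs 23/273 (8.4%) CI-AKI events.
or_, lo, hi = odds_ratio_ci(a=16, b=256, c=23, d=250)
```

When the CI spans 1.0, as in the study's pooled result, the difference between the hydration strategies is not statistically significant.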

  20. Multivariate methods to visualise colour-space and colour discrimination data.

    PubMed

    Hastings, Gareth D; Rubin, Alan

    2015-01-01

    Despite most modern colour spaces treating colour as three-dimensional (3-D), colour data is usually not visualised in 3-D (and two-dimensional (2-D) projection-plane segments and multiple 2-D perspective views are used instead). The objectives of this article are firstly, to introduce a truly 3-D percept of colour space using stereo-pairs, secondly to view colour discrimination data using that platform, and thirdly to apply formal statistics and multivariate methods to analyse the data in 3-D. This is the first demonstration of the software that generated stereo-pairs of RGB colour space, as well as of a new computerised procedure that investigated colour discrimination by measuring colour just noticeable differences (JND). An initial pilot study and thorough investigation of instrument repeatability were performed. Thereafter, to demonstrate the capabilities of the software, five colour-normal and one colour-deficient subject were examined using the JND procedure and multivariate methods of data analysis. Scatter plots of responses were meaningfully examined in 3-D and were useful in evaluating multivariate normality as well as identifying outliers. The extent and direction of the difference between each JND response and the stimulus colour point was calculated and appreciated in 3-D. Ellipsoidal surfaces of constant probability density (distribution ellipsoids) were fitted to response data; the volumes of these ellipsoids appeared useful in differentiating the colour-deficient subject from the colour-normals. Hypothesis tests of variances and covariances showed many statistically significant differences between the results of the colour-deficient subject and those of the colour-normals, while far fewer differences were found when comparing within colour-normals. 
The 3-D visualisation of colour data using stereo-pairs, as well as the statistics and multivariate methods of analysis employed, were found to be unique and useful tools in the representation and study of colour. Many additional studies using these methods along with the JND and other procedures have been identified and will be reported in future publications. © 2014 The Authors Ophthalmic & Physiological Optics © 2014 The College of Optometrists.
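    The volumes of the distribution ellipsoids used to compare subjects follow from the fitted covariance matrix: for a trivariate normal, the ellipsoid of constant probability density at Mahalanobis radius k has volume V = (4/3) π k³ √det(Σ). A sketch of that computation (the covariance matrix below is illustrative, not fitted JND data):

```python
from math import pi, sqrt

def det3(m):
    """Determinant of a 3x3 matrix given as nested lists (cofactor expansion)."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def ellipsoid_volume(cov, k):
    """Volume of {x : (x - mu)' cov^-1 (x - mu) <= k^2} for a 3-D normal:
    V = (4/3) * pi * k^3 * sqrt(det(cov))."""
    return (4.0 / 3.0) * pi * k ** 3 * sqrt(det3(cov))

# Identity covariance at k = 1 gives the unit sphere's volume.
identity = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
v = ellipsoid_volume(identity, k=1.0)
```

Larger fitted ellipsoid volumes correspond to coarser colour discrimination, which is the basis for separating the colour-deficient subject from the colour-normals.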

  1. Tilt changes of short duration

    USGS Publications Warehouse

    McHugh, Stuart

    1976-01-01

    Section I of this report contains a classification scheme for short period tilt data. For convenience, all fluctuations in the local tilt field of less than 24 hours duration will be designated SP (i.e., short period) tilt events. Three basic categories of waveshape appearance are defined, and the rules for naming the waveforms are outlined. Examples from tilt observations at four central California sites are provided. Section II contains some coseismic tilt data. Fourteen earthquakes in central California, ranging in magnitude from 2.9 to 5.2, were chosen for study on four tiltmeters within 10 source dimensions of the epicenters. The raw records from each of the four tiltmeters at the times of the earthquakes were photographed and are presented in this section. Section III contains documentation of computer programs used in the analysis of the short period tilt data. Program VECTOR computes the difference vector of a tilt event and displays the sequence of events as a head-to-tail vector plot. Program ONSTSP 1) requires two component digitized tilt data as input, 2) scales and plots the data, and 3) computes and displays the amplitude, azimuth, and normalized derivative of the tilt amplitude. Program SHARPS computes the onset sharpness, (i.e., the normalized derivative of the tilt amplitude at the onset of the tilt event) as a function of source-station distance from a model of creep-related tilt changes. Program DSPLAY plots the digitized data.

  2. Stochastic modelling of the monthly average maximum and minimum temperature patterns in India 1981-2015

    NASA Astrophysics Data System (ADS)

    Narasimha Murthy, K. V.; Saravana, R.; Vijaya Kumar, K.

    2018-04-01

    The paper investigates the stochastic modelling and forecasting of monthly average maximum and minimum temperature patterns in India for the period 1981-2015 through a suitable seasonal autoregressive integrated moving average (SARIMA) model. The variations and distributions of monthly maximum and minimum temperatures are analyzed through box plots and cumulative distribution functions. The time series plot indicates that the maximum temperature series contains sharp peaks in almost all the years, which is not true of the minimum temperature series, so the two series are modelled separately. The candidate SARIMA model was chosen by inspecting the autocorrelation function (ACF), partial autocorrelation function (PACF), and inverse autocorrelation function (IACF) of the logarithmically transformed temperature series. The SARIMA (1, 0, 0) × (0, 1, 1)12 model is selected for the monthly average maximum and minimum temperature series based on the minimum Bayesian information criterion. The model parameters are obtained by the maximum-likelihood method with the help of the standard errors of the residuals. The adequacy of the selected model is assessed by correlation diagnostic checking through the ACF, PACF, IACF, and p values of the Ljung-Box test statistic of the residuals, and by normality diagnostic checking through the kernel and normal density curves of the histogram and the Q-Q plot. Finally, monthly maximum and minimum temperature patterns of India are forecast for the next 3 years with the help of the selected model.
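    The normality check via the Q-Q plot amounts to pairing the ordered residuals with theoretical normal quantiles; points near a straight line support the normality assumption. A sketch using the common (i − 0.5)/n plotting positions (one of several conventions; the paper does not state which it uses), with invented residuals:

```python
from statistics import NormalDist, mean, stdev

def qq_points(residuals):
    """Normal Q-Q plot coordinates: theoretical normal quantiles paired
    with the ordered residuals, using (i - 0.5)/n plotting positions."""
    n = len(residuals)
    ordered = sorted(residuals)
    dist = NormalDist(mean(residuals), stdev(residuals))
    theoretical = [dist.inv_cdf((i - 0.5) / n) for i in range(1, n + 1)]
    return list(zip(theoretical, ordered))

# Invented model residuals standing in for SARIMA residuals.
residuals = [0.3, -1.2, 0.8, -0.4, 1.5, -0.9, 0.1, 0.6, -0.2, -0.6]
pts = qq_points(residuals)
```

Plotting `pts` and overlaying the identity line gives the normal diagnostic check described in the abstract.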

  3. Determination of complex electromechanical coefficients for piezoelectric materials

    NASA Astrophysics Data System (ADS)

    Du, Xiao-Hong

    Sugar maple decline, a result of many possible biotic and abiotic causes, has been a problem in northern Pennsylvania since the early 1980s. Several studies have focused on specific causes, yet few have tried to look at a wide array. The purpose of this research was to investigate stresses in sugar maple forest plots in northern Pennsylvania. Three studies were undertaken. The first study examined the spatial extent of sugar maple on 248 plots in Bailey's ecoregions 212F and 212G, which are glaciated and unglaciated regions, respectively. In addition, a health assessment of sugar maple in Pennsylvania was made, with a resulting separation in population between healthy and unhealthy stands occurring at 20 percent dead sugar maple basal area. The second study was conducted to evaluate a statistical sampling design of 28 forested plots, drawn from the first study's population of plots (248), and to provide data on physical and chemical soil variability and sample size estimation for other researchers. The variability of several soil parameters was examined within plots and between health classes of sugar maple, and sample size estimations were derived for these populations. The effect of log-normal transformations on reducing variability and sample sizes was examined, and soil descriptions of the plots sampled in 1998 were compared to the USDA Soil Survey mapping unit series descriptions for the plot locations. Lastly, the effect of sampling intensity on the detection of significant differences between health class treatments was examined. The last study addressed sugar maple decline in northern Pennsylvania during the same period as the first study (approximately 1979-1989) but on 28 plots chosen from the first study's population. These were the same plots used in the second study on soil variability. Recent literature on sugar maple decline has focused on specific causes and few have tried to look at a wide array. 
This paper investigates stresses in sugar maple plots related to moisture and how these interact with other stresses such as chemistry, insect defoliation, geology, aspect, slope, topography, and atmospheric deposition.

  4. Model aerodynamic test results for two variable cycle engine coannular exhaust systems at simulated takeoff and cruise conditions. Comprehensive data report. Volume 3: Graphical data book 1

    NASA Technical Reports Server (NTRS)

    Nelson, D. P.

    1981-01-01

    A graphical presentation of the aerodynamic data acquired during coannular nozzle performance wind tunnel tests is given. The graphical data consist of plots of nozzle gross thrust coefficient, fan nozzle discharge coefficient, and primary nozzle discharge coefficient. Normalized model component static pressure distributions are presented as a function of primary total pressure, fan total pressure, and ambient static pressure for selected operating conditions. In addition, the supersonic cruise configuration data include plots of nozzle efficiency and secondary-to-fan total pressure pumping characteristics. Supersonic and subsonic cruise data are given.

  5. Bayesian anomaly detection in monitoring data applying relevance vector machine

    NASA Astrophysics Data System (ADS)

    Saito, Tomoo

    2011-04-01

    A method for automatically classifying the monitoring data into two categories, normal and anomaly, is developed in order to remove anomalous data included in the enormous amount of monitoring data, applying the relevance vector machine (RVM) to a probabilistic discriminative model with basis functions and their weight parameters whose posterior PDF (probabilistic density function) conditional on the learning data set is given by Bayes' theorem. The proposed framework is applied to actual monitoring data sets containing some anomalous data collected at two buildings in Tokyo, Japan, which shows that the trained models discriminate anomalous data from normal data very clearly, giving high probabilities of being normal to normal data and low probabilities of being normal to anomalous data.

  6. Probability of regenerating a normal limb after bite injury in the Mexican axolotl (Ambystoma mexicanum)

    PubMed Central

    Thompson, Sierra; Muzinic, Laura; Muzinic, Christopher; Niemiller, Matthew L.

    2014-01-01

    Abstract Multiple factors are thought to cause limb abnormalities in amphibian populations by altering processes of limb development and regeneration. We examined adult and juvenile axolotls (Ambystoma mexicanum) in the Ambystoma Genetic Stock Center (AGSC) for limb and digit abnormalities to investigate the probability of normal regeneration after bite injury. We observed that 80% of larval salamanders show evidence of bite injury at the time of transition from group housing to solitary housing. Among 717 adult axolotls that were surveyed, which included solitary‐housed males and group‐housed females, approximately half presented abnormalities, including examples of extra or missing digits and limbs, fused digits, and digits growing from atypical anatomical positions. Bite injury, and not abnormal development, probably explains these limb defects, because limbs with normal anatomy regenerated after rostral amputations were performed. We infer that only 43% of AGSC larvae will present four anatomically normal-looking adult limbs after incurring a bite injury. Our results show regeneration of normal limb anatomy to be less than perfect after bite injury. PMID:25745564

  7. [A modification of the Gompertz plot resulting from the age index by Ries and an approximation of the survivorship curve (author's transl)].

    PubMed

    Lohmann, W

    1978-01-01

    The shape of the survivorship curve can easily be interpreted on the condition that the probability of death is proportional to an exponentially rising function of age. Following the summation used by Ries to determine the age index, it was investigated to what extent the survivorship curve may be approximated by a sum of exponentials. It follows that, for plausible parameter values, the difference between the pure exponential function and a sum of exponentials lies within random variation. Because the probability of death varies among diseases, the new formulation is preferable.

  8. Generalized likelihood ratios for quantitative diagnostic test scores.

    PubMed

    Tandberg, D; Deely, J J; O'Malley, A J

    1997-11-01

    The reduction of quantitative diagnostic test scores to the dichotomous case is a wasteful and unnecessary simplification in the era of high-speed computing. Physicians could make better use of the information embedded in quantitative test results if modern generalized curve estimation techniques were applied to the likelihood functions of Bayes' theorem. Hand calculations could be completely avoided and computed graphical summaries provided instead. Graphs showing posttest probability of disease as a function of pretest probability with confidence intervals (POD plots) would enhance acceptance of these techniques if they were immediately available at the computer terminal when test results were retrieved. Such constructs would also provide immediate feedback to physicians when a valueless test had been ordered.
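    The post-test calculation that such POD plots would display can be sketched with Bayes' theorem in odds form; the two score distributions below are hypothetical, chosen only to illustrate the idea of a likelihood ratio at a quantitative score:

```python
from statistics import NormalDist

def post_test_probability(pretest: float, lr: float) -> float:
    """Bayes' theorem in odds form: pre-test odds times the likelihood
    ratio gives post-test odds, then convert back to a probability."""
    pre_odds = pretest / (1.0 - pretest)
    post_odds = pre_odds * lr
    return post_odds / (1.0 + post_odds)

# For a quantitative score, the likelihood ratio at score x is the ratio
# of the score densities in diseased vs. healthy patients; the two normal
# distributions here are hypothetical, not from any real test.
diseased = NormalDist(mu=60, sigma=10)
healthy = NormalDist(mu=40, sigma=10)

def lr_at_score(x: float) -> float:
    return diseased.pdf(x) / healthy.pdf(x)

print(round(post_test_probability(0.30, lr_at_score(55)), 3))   # → 0.538
```

    Evaluating the likelihood ratio at the observed score, rather than dichotomizing the test, is exactly the information a POD plot would present graphically.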

  9. Synopsis of Mid-latitude Radio Wave Absorption in Europe

    NASA Technical Reports Server (NTRS)

    Torkar, K. M.; Friedrich, M.

    1984-01-01

    Radio wave absorption data covering almost two years from Europe to Central Asia are presented. They are normalized by relating them to a reference absorption. Every day these normalized data are fitted to a mathematical function of geographical location in order to obtain a daily synopsis of radio wave absorption. A film of these absorption charts was made, intended to reveal movements of absorption or absorption anomalies. In addition, radiance (temperature) data from the lower D-region are also plotted onto these charts.

  10. Strategies for Optimal Control Design of Normal Acceleration Command Following on the F-16

    DTIC Science & Technology

    1992-12-01

    Padé approximation. This approximation has a pole at -40, and introduces a nonminimum phase zero at +40. In deriving the equation for normal acceleration...input signal. The mean not being exactly zero will surface in some simulation plots, but does not alter the point of showing general trends. Also...closer to reality, I will know that my goal has been accomplished. My honest belief is that general mixed H2/H-infinity optimization is the methodology of

  11. Columbia: The first five flights entry heating data series. Volume 2: The OMS Pod

    NASA Technical Reports Server (NTRS)

    Williams, S. D.

    1983-01-01

    Entry heating flight data and wind tunnel data on the OMS Pod are presented for the first five flights of the Space Shuttle Orbiter. The heating rate data are presented in terms of normalized film heat transfer coefficients as a function of angle-of-attack, Mach number, and normal shock Reynolds number. The surface heating rates and temperatures were obtained via the JSC NONLIN/INVERSE computer program. Time history plots of the surface heating rates and temperatures are also presented.

  12. Probability of infestation and extent of mortality models for mountain pine beetle in lodgepole pine forests in Colorado

    Treesearch

    Jose F. Negron; Jennifer G. Klutsch

    2017-01-01

    The mountain pine beetle, Dendroctonus ponderosae Hopkins, is a significant agent of tree mortality in lodgepole pine (Pinus contorta Dougl. ex Loud.) forests throughout western North America. A large outbreak of mountain pine beetle caused extensive tree mortality in north-central Colorado beginning in the late 1990s. We use data from a network of plots established in...

  13. Use of the Weibull function to predict future diameter distributions from current plot data

    Treesearch

    Quang V. Cao

    2012-01-01

    The Weibull function has been widely used to characterize diameter distributions in forest stands. The future diameter distribution of a forest stand can be predicted by use of a Weibull probability density function from current inventory data for that stand. The parameter recovery approach has been used to “recover” the Weibull parameters from diameter moments or...
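    The moment-based "parameter recovery" idea can be sketched as follows: given the mean and standard deviation of diameters, solve for the Weibull shape from the coefficient of variation, then for the scale. This is a hedged illustration of the general approach, not the paper's exact recovery equations:

```python
import math

def recover_weibull(mean: float, sd: float) -> tuple[float, float]:
    """Recover Weibull shape k and scale lam from the first two moments.
    Uses CV^2 = Gamma(1+2/k)/Gamma(1+1/k)^2 - 1, which decreases in k,
    so the shape can be found by bisection."""
    cv2 = (sd / mean) ** 2

    def cv2_of(k: float) -> float:
        g1 = math.gamma(1.0 + 1.0 / k)
        g2 = math.gamma(1.0 + 2.0 / k)
        return g2 / g1 ** 2 - 1.0

    lo, hi = 0.1, 50.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if cv2_of(mid) > cv2:
            lo = mid
        else:
            hi = mid
    k = 0.5 * (lo + hi)
    lam = mean / math.gamma(1.0 + 1.0 / k)   # scale from the mean
    return k, lam

# Moments of a Weibull with k=2, lam=10 (e.g. diameters in cm):
true_mean = 10 * math.gamma(1.5)
true_sd = 10 * math.sqrt(math.gamma(2.0) - math.gamma(1.5) ** 2)
k, lam = recover_weibull(true_mean, true_sd)
print(round(k, 3), round(lam, 3))   # ≈ 2.0 and 10.0
```

    In a growth model, predicted future moments would be fed through the same recovery step to obtain the future diameter distribution.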

  14. Target Detection and Identification Using Canonical Correlations Analysis and Subspace Partitioning

    DTIC Science & Technology

    2008-04-01

    Fig. 2. ROCs for DCC, DCC-P, NNLS, and NNLSP (Present chemical=t1, background= t56 , SNR= 5 dB) alarm, or 1−specificity, and PD is the probability of...discrimination values are given in each ROC plot. In Fig. 2, we use t56 as the background, and t1 as the target chemical. The SNR is 5 dB. For each

  15. Utility of tree crown condition indicators to predict tree survival using remeasured Forest Inventory and Analysis data

    Treesearch

    Randall S. Morin; Jim Steinman; KaDonna C. Randolph

    2012-01-01

    The condition of tree crowns is an important indicator of tree and forest health. Crown conditions have been evaluated during surveys of Forest Inventory and Analysis (FIA) Phase 3 (P3) plots since 1999. In this study, remeasured data from 39,357 trees in the northern United States were used to assess the probability of survival among various tree species using the...

  16. Infrasound Signals as Basis for Event Discriminants

    DTIC Science & Technology

    2007-09-01

    tests ( UGT ) at the NTS, some work was done on finding discriminants between UGTs and earthquakes. Plots of wind-corrected infrasound pressure amplitude...probably due to the longer duration of earthquake motion compared to the Figure 1. These figure illustrate the initial comparisons for UGTs and...earthquakes for signal duration (left) and wind corrected amplitude (right) as functions of Mb. relatively short duration for a UGT . On the other hand

  17. Toxicity of nitrogenous fertilizers to eggs of snapping turtles (Chelydra serpentina) in field and laboratory exposures.

    PubMed

    de Solla, Shane Raymond; Martin, Pamela Anne

    2007-09-01

    Many reptiles oviposit in soil of agricultural landscapes. We evaluated the toxicity of two commonly used nitrogenous fertilizers, urea and ammonium nitrate, on the survivorship of exposed snapping turtle (Chelydra serpentina) eggs. Eggs were incubated in a community garden plot in which urea was applied to the soil at realistic rates of up to 200 kg/ha in 2004, and ammonium nitrate was applied at rates of up to 2,000 kg/ha in 2005. Otherwise, the eggs were unmanipulated and were subject to ambient temperature and weather conditions. Eggs were also exposed in the laboratory in covered bins so as to minimize loss of nitrogenous compounds through volatilization or leaching from the soil. Neither urea nor ammonium nitrate had any impact on hatching success or development in the garden plot exposures, despite overt toxicity of ammonium nitrate to endogenous plants. Both laboratory exposures resulted in reduced hatching success, lower body mass at hatching, and reduced posthatching survival compared to controls. The lack of toxicity of these fertilizers in the field was probably due to leaching in the soil and through atmospheric loss. In general, we conclude that nitrogenous fertilizers probably have little direct impact on turtle eggs deposited in agricultural landscapes.

  18. The Rectangle Target Plot: A New Approach to the Graphical Presentation of Accuracy of Systems for Self-Monitoring of Blood Glucose.

    PubMed

    Stephan, Peter; Schmid, Christina; Freckmann, Guido; Pleus, Stefan; Haug, Cornelia; Müller, Peter

    2015-10-09

    The measurement accuracy of systems for self-monitoring of blood glucose (SMBG) is usually analyzed by a method comparison in which the analysis results are displayed using difference plots or similar graphs. However, such plots become difficult to comprehend as the number of data points displayed increases. This article introduces a new approach, the rectangle target plot (RTP), which aims to provide a simplified and comprehensible visualization of accuracy data. The RTP is based on ISO 15197 accuracy evaluations of SMBG systems. Two-sided tolerance intervals for normally distributed data are calculated for absolute and relative differences at glucose concentrations <100 mg/dL and ≥100 mg/dL. These tolerance intervals provide an estimator of where a 90% proportion of results is found with a confidence level of 95%. Plotting these tolerance intervals generates a rectangle whose center indicates the systematic measurement difference of the investigated system relative to the comparison method. The size of the rectangle depends on the measurement variability. The RTP provides a means of displaying measurement accuracy data in a simple and comprehensible manner. The visualization is simplified by reducing the displayed information from typically 200 data points to just 1 rectangle. Furthermore, this allows data for several systems or several lots from 1 system to be displayed clearly and concisely in a single graph. © 2015 Diabetes Technology Society.

  19. 49 CFR 173.50 - Class 1-Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... insensitive that there is very little probability of initiation or of transition from burning to detonation under normal conditions of transport. 1 The probability of transition from burning to detonation is... contain only extremely insensitive detonating substances and which demonstrate a negligible probability of...

  20. An evaluation of procedures to estimate monthly precipitation probabilities

    NASA Astrophysics Data System (ADS)

    Legates, David R.

    1991-01-01

    Many frequency distributions have been used to evaluate monthly precipitation probabilities. Eight of these distributions (including Pearson type III, extreme value, and transform-normal probability density functions) are comparatively examined to determine their ability to accurately represent variations in monthly precipitation totals for global hydroclimatological analyses. Results indicate that a modified version of the Box-Cox transform-normal distribution more adequately describes the 'true' precipitation distribution than does any of the other methods. This assessment was made using a cross-validation procedure for a global network of 253 stations for which at least 100 years of monthly precipitation totals were available.
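    As an illustration of the transform-normal approach (a generic Box-Cox sketch, not the modified estimator evaluated in the paper), monthly totals can be power-transformed, fitted with a normal distribution, and used to estimate exceedance probabilities:

```python
import math
from statistics import NormalDist

def boxcox(x: float, lam: float) -> float:
    """Box-Cox power transform (lam=0 gives the log transform)."""
    return math.log(x) if lam == 0 else (x ** lam - 1.0) / lam

def exceedance_prob(totals: list[float], lam: float, threshold: float) -> float:
    """Probability that a monthly total exceeds `threshold`, assuming the
    Box-Cox-transformed totals are normally distributed."""
    z = [boxcox(t, lam) for t in totals]
    n = len(z)
    mu = sum(z) / n
    sd = math.sqrt(sum((v - mu) ** 2 for v in z) / (n - 1))
    return 1.0 - NormalDist(mu, sd).cdf(boxcox(threshold, lam))

# Hypothetical January totals (mm) at one station:
jan = [31.0, 55.0, 42.0, 18.0, 77.0, 49.0, 25.0, 60.0, 38.0, 44.0]
print(round(exceedance_prob(jan, 0.5, 70.0), 3))
```

    In practice the transform parameter lam would itself be estimated from the data, which is part of what the compared methods differ on.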

  1. Automatic Classification of Station Quality by Image Based Pattern Recognition of Ppsd Plots

    NASA Astrophysics Data System (ADS)

    Weber, B.; Herrnkind, S.

    2017-12-01

    The number of seismic stations is growing, and it has become common practice to share station waveform data in real time with the main data centers such as IRIS, GEOFON, ORFEUS and RESIF. This has made analyzing station performance increasingly important for automatic real-time processing and station selection. The value of a station depends on several factors: the quality and quantity of its data, the location of the site, the general station density in the surrounding area and, finally, the type of application it can be used for. The approach described by McNamara and Boaz (2006) became standard in the last decade. It uses a probability density function (PDF) to display the distribution of seismic power spectral density (PSD). The low noise model (LNM) and high noise model (HNM) introduced by Peterson (1993) are also displayed in the PPSD plots introduced by McNamara and Boaz, allowing an estimation of station quality. Here we describe how we established an automatic station quality classification module using image-based pattern recognition on PPSD plots. The plots were split into 4 bands: short-period characteristics (0.1-0.8 s), body wave characteristics (0.8-5 s), microseismic characteristics (5-12 s) and long-period characteristics (12-100 s). The module sqeval connects to a SeedLink server, checks available stations, and requests PPSD plots through the Mustang service from IRIS, from PQLX/SQLX, or from GIS (gempa Image Server), a module that generates different kinds of images such as trace plots, map plots, helicorder plots and PPSD plots. It compares the image-based quality patterns for the different period bands with the retrieved PPSD plot. The quality of a station is divided into 5 classes for each of the 4 bands. Classes A, B, C and D define regular quality between the LNM and HNM, while the fifth class represents out-of-order stations with gain problems, missing data etc.
Over all period bands, about 100 different patterns are required to classify most of the stations available on the IRIS server. The results are written to a file, and stations can be filtered by quality; AAAA represents the best quality in all 4 bands. A differentiation between instrument types, such as broadband and short-period stations, is also possible. A regular check using the IRIS SeedLink and Mustang services allows users to be informed about new stations with a specific quality.
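    A minimal sketch of such band-wise classification (the quartile thresholds, class rules and noise-model values here are illustrative, not the actual sqeval patterns):

```python
def classify_band(median_psd_db: float, lnm_db: float, hnm_db: float) -> str:
    """Assign a quality class for one period band from the median PSD
    (in dB) relative to the low/high noise model bounds. Quartiles
    between LNM and HNM map to A-D; out-of-range values (gain
    problems, dead channels) get a fifth class X."""
    if not lnm_db <= median_psd_db <= hnm_db:
        return "X"
    frac = (median_psd_db - lnm_db) / (hnm_db - lnm_db)
    return "ABCD"[min(3, int(frac * 4))]

# One hypothetical station, four bands (short-period, body-wave,
# microseism, long-period), with made-up (median, LNM, HNM) dB values:
bands = [(-140.0, -168.0, -91.0),
         (-155.0, -169.0, -98.0),
         (-120.0, -166.0, -102.0),
         (-80.0, -185.0, -110.0)]
code = "".join(classify_band(m, lo, hi) for m, lo, hi in bands)
print(code)   # → BACX
```

    The real module works on the rendered PPSD images rather than the PSD values, but the resulting four-letter code has the same shape.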

  2. A 20-year growth record for three stands of red alder.

    Treesearch

    Carl M. Berntsen

    1962-01-01

    Until very recently, only fragmentary growth information has been available for red alder (Alnus rubra). This gap has now been partially closed through the preparation of normal yield tables based on temporary sample plot data collected in northwestern Oregon, western Washington, and southern British Columbia.

  3. Teaching "Instant Experience" with Graphical Model Validation Techniques

    ERIC Educational Resources Information Center

    Ekstrøm, Claus Thorn

    2014-01-01

    Graphical model validation techniques for linear normal models are often used to check the assumptions underlying a statistical model. We describe an approach to provide "instant experience" in looking at a graphical model validation plot, so it becomes easier to validate if any of the underlying assumptions are violated.

  4. Re‐estimated effects of deep episodic slip on the occurrence and probability of great earthquakes in Cascadia

    USGS Publications Warehouse

    Beeler, Nicholas M.; Roeloffs, Evelyn A.; McCausland, Wendy

    2013-01-01

    Mazzotti and Adams (2004) estimated that rapid deep slip during typically two week long episodes beneath northern Washington and southern British Columbia increases the probability of a great Cascadia earthquake by 30–100 times relative to the probability during the ∼58 weeks between slip events. Because the corresponding absolute probability remains very low at ∼0.03% per week, their conclusion is that though it is more likely that a great earthquake will occur during a rapid slip event than during other times, a great earthquake is unlikely to occur during any particular rapid slip event. This previous estimate used a failure model in which great earthquakes initiate instantaneously at a stress threshold. We refine the estimate, assuming a delayed failure model that is based on laboratory‐observed earthquake initiation. Laboratory tests show that failure of intact rock in shear and the onset of rapid slip on pre‐existing faults do not occur at a threshold stress. Instead, slip onset is gradual and shows a damped response to stress and loading rate changes. The characteristic time of failure depends on loading rate and effective normal stress. Using this model, the probability enhancement during the period of rapid slip in Cascadia is negligible (<10%) for effective normal stresses of 10 MPa or more and only increases by 1.5 times for an effective normal stress of 1 MPa. We present arguments that the hypocentral effective normal stress exceeds 1 MPa. In addition, the probability enhancement due to rapid slip extends into the interevent period. With this delayed failure model for effective normal stresses greater than or equal to 50 kPa, it is more likely that a great earthquake will occur between the periods of rapid deep slip than during them. Our conclusion is that great earthquake occurrence is not significantly enhanced by episodic deep slip events.

  5. Dimensional analysis yields the general second-order differential equation underlying many natural phenomena: the mathematical properties of a phenomenon's data plot then specify a unique differential equation for it.

    PubMed

    Kepner, Gordon R

    2014-08-27

    This study uses dimensional analysis to derive the general second-order differential equation that underlies numerous physical and natural phenomena described by common mathematical functions. It eschews assumptions about empirical constants and mechanisms. It relies only on the data plot's mathematical properties to provide the conditions and constraints needed to specify a second-order differential equation that is free of empirical constants for each phenomenon. A practical example of each function is analyzed using the general form of the underlying differential equation and the observable unique mathematical properties of each data plot, including boundary conditions. This yields a differential equation that describes the relationship among the physical variables governing the phenomenon's behavior. Complex phenomena such as the Standard Normal Distribution, the Logistic Growth Function, and Hill Ligand binding, which are characterized by data plots of distinctly different sigmoidal character, are readily analyzed by this approach. It provides an alternative, simple, unifying basis for analyzing each of these varied phenomena from a common perspective that ties them together and offers new insights into the appropriate empirical constants for describing each phenomenon.

  6. Mining of hospital laboratory information systems: a model study defining age- and gender-specific reference intervals and trajectories for plasma creatinine in a pediatric population.

    PubMed

    Søeby, Karen; Jensen, Peter Bjødstrup; Werge, Thomas; Sørensen, Steen

    2015-09-01

    The knowledge of physiological fluctuation and variation of even commonly used biochemical quantities in extreme age groups and during development is sparse. This challenges the clinical interpretation and utility of laboratory tests in these age groups. To explore the utility of hospital laboratory data as a source of information, we analyzed enzymatic plasma creatinine as a model analyte in two large pediatric hospital samples. Plasma creatinine measurements from 9700 children aged 0-18 years were obtained from hospital laboratory databases and partitioned into high-resolution gender- and age-groups. Normal probability plots were used to deduce parameters of the normal distributions from healthy creatinine values in the mixed hospital datasets. Furthermore, temporal trajectories were generated from repeated measurements to examine developmental patterns in periods of changing creatinine levels. Creatinine shows great age dependence from birth throughout childhood. We computed and replicated 95% reference intervals in narrow gender and age bins and showed them to be comparable to those determined in healthy population studies. We identified pronounced transitions in creatinine levels at different time points after birth and around the early teens, which challenges the establishment and usefulness of reference intervals in those age groups. The study documents that hospital laboratory data may inform on the developmental aspects of creatinine, on periods with pronounced heterogeneity and valid reference intervals. Furthermore, part of the heterogeneity in creatinine distribution is likely due to differences in biological and chronological age of children and should be considered when using age-specific reference intervals.
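    The way a normal probability plot can recover the parameters of the dominant "healthy" component from a mixed hospital data set can be sketched by fitting a line to the central, linear portion of the plot. The simulated data and the 50% central window below are illustrative assumptions, not the paper's exact procedure:

```python
import random
from statistics import NormalDist

def fit_normal_from_qq(values: list[float], central: float = 0.5) -> tuple[float, float]:
    """Estimate (mean, sd) from a normal probability plot: regress the
    ordered values on normal quantiles of Blom plotting positions,
    using only the middle `central` fraction so that outliers in the
    tails (e.g. pathological values) do not distort the fit."""
    xs = sorted(values)
    n = len(xs)
    q = [NormalDist().inv_cdf((i - 0.375) / (n + 0.25)) for i in range(1, n + 1)]
    lo, hi = int(n * (0.5 - central / 2)), int(n * (0.5 + central / 2))
    qc, xc = q[lo:hi], xs[lo:hi]
    m = len(qc)
    qbar, xbar = sum(qc) / m, sum(xc) / m
    slope = sum((a - qbar) * (b - xbar) for a, b in zip(qc, xc)) / \
            sum((a - qbar) ** 2 for a in qc)
    return xbar - slope * qbar, slope    # intercept ≈ mean, slope ≈ sd

random.seed(1)
# Simulated "healthy" creatinine-like values plus a few high outliers:
data = [random.gauss(45.0, 8.0) for _ in range(400)] + [120.0, 150.0, 180.0]
mu, sd = fit_normal_from_qq(data)
print(round(mu, 1), round(sd, 1))
```

    The recovered mean and SD then give the 95% reference interval as mu ± 1.96·sd for that age/gender bin.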

  7. Presentation of Diagnostic Information to Doctors May Change Their Interpretation and Clinical Management: A Web-Based Randomised Controlled Trial

    PubMed Central

    Ben-Shlomo, Yoav; Collin, Simon M.; Quekett, James; Sterne, Jonathan A. C.; Whiting, Penny

    2015-01-01

    Background There is little evidence on how best to present diagnostic information to doctors and whether this makes any difference to clinical management. We undertook a randomised controlled trial to see if different data presentations altered clinicians' decision to further investigate or treat a patient with a fictitious disorder ("Green syndrome") and their ability to determine post-test probability. Methods We recruited doctors registered with the United Kingdom's largest online network for medical doctors between 10 July and 6 November 2012. Participants were randomised to one of four arms: (a) text summary of sensitivity and specificity, (b) Fagan's nomogram, (c) probability-modifying plot (PMP), (d) natural frequency tree (NFT). The main outcome measure was the decision whether to treat, not treat or undertake a brain biopsy on the hypothetical patient, together with the correct post-test probability. Secondary outcome measures included knowledge of diagnostic tests. Results 917 participants attempted the survey and complete data were available from 874 (95.3%). Doctors randomised to the PMP and NFT arms were more likely to treat the patient than those randomised to the text-only arm (ORs 1.49, 95% CI 1.02, 2.16 and 1.43, 95% CI 0.98, 2.08, respectively). More participants randomised to the PMP (87/218, 39.9%) and NFT (73/207, 35.3%) arms than the nomogram (50/194, 25.8%) or text-only (30/255, 11.8%) arms reported the correct post-test probability (p < 0.001). Younger age, postgraduate training and higher self-rated confidence all predicted better knowledge performance. Doctors with better knowledge were more likely to view an optional learning tutorial (OR per correct answer 1.18, 95% CI 1.06, 1.31). Conclusions Presenting diagnostic data using a probability-modifying plot or natural frequency tree influences the threshold for treatment and improves interpretation of test results compared to a text summary of sensitivity and specificity or Fagan's nomogram. PMID:26147744
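    The natural frequency tree used in one trial arm reduces to simple bookkeeping on a hypothetical cohort; the sensitivity, specificity and pretest probability below are made-up values, not Green syndrome data:

```python
def frequency_tree(pretest: float, sens: float, spec: float, n: int = 1000):
    """Natural-frequency tree for a cohort of n patients: returns
    (true_pos, false_neg, false_pos, true_neg) counts and the
    post-test probability of disease given a positive test."""
    diseased = pretest * n
    healthy = n - diseased
    tp, fn = sens * diseased, (1 - sens) * diseased
    fp, tn = (1 - spec) * healthy, spec * healthy
    return (tp, fn, fp, tn), tp / (tp + fp)

# Hypothetical test: 90% sensitive, 80% specific, 10% pretest probability.
counts, post = frequency_tree(0.10, 0.90, 0.80)
print([round(c) for c in counts], round(post, 3))   # → [90, 10, 180, 720] 0.333
```

    Presenting the four counts rather than the conditional probabilities is precisely what the NFT format does for readers.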

  8. Increased determinism in brain electrical activity occurs in association with multiple sclerosis.

    PubMed

    Carrubba, Simona; Minagar, Alireza; Chesson, Andrew L; Frilot, Clifton; Marino, Andrew A

    2012-04-01

    Increased determinism (decreased complexity) of brain electrical activity has been associated with some brain diseases. Our objective was to determine whether a similar association occurred for multiple sclerosis (MS). Ten subjects with a relapsing-remitting course of MS who were in remission were studied; the controls were age- and gender-matched clinically normal subjects. Recurrence plots were calculated using representative electroencephalogram (EEG) epochs (1-7 seconds) from six derivations; the plots were quantified using the nonlinear variables percent recurrence (%R) and percent determinism (%D). The results were averaged over all derivations for each participant, and the means were compared between the groups. As a linear control procedure the groups were also compared using spectral analysis. The mean ± SD of %R for the MS subjects was 6.6 ± 1.3%, compared with 5.1 ± 1.3% in the normal group (P = 0.017), indicating that brain activity in the subjects with MS was less complex, as hypothesized. The groups were not distinguishable using %D or spectral analysis. Taken together with our earlier report that %R could be used to discriminate between MS and normal subjects based on the ability to exhibit evoked potentials, the evidence suggests that complexity analysis of the EEG has potential for development as a diagnostic test for MS.
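    The recurrence quantification used here can be sketched for a scalar series. This is a minimal version without the time-delay embedding used in full RQA, and the threshold and test signal are illustrative:

```python
import math

def recurrence_measures(x: list[float], eps: float, lmin: int = 2):
    """Percent recurrence (%R) and percent determinism (%D) of a scalar
    series: %R is the share of distinct point pairs closer than eps,
    and %D is the share of those recurrent points lying on diagonal
    line segments of length >= lmin in the recurrence plot."""
    n = len(x)
    rec = [[abs(x[i] - x[j]) < eps for j in range(n)] for i in range(n)]
    total = sum(rec[i][j] for i in range(n) for j in range(n) if i != j)
    diag = 0
    for d in range(1, n):               # diagonals above the main one
        run = 0
        for i in range(n - d):
            if rec[i][i + d]:
                run += 1
            else:
                if run >= lmin:
                    diag += run
                run = 0
        if run >= lmin:
            diag += run
    pct_r = 100.0 * total / (n * (n - 1))
    pct_d = 100.0 * (2 * diag) / total if total else 0.0
    return pct_r, pct_d

# A deterministic (periodic) signal yields a high %D:
sine = [math.sin(0.3 * k) for k in range(200)]
r, d = recurrence_measures(sine, eps=0.1)
print(round(r, 1), round(d, 1))
```

    For EEG epochs the same two numbers are computed per derivation and then averaged, as in the study.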

  9. A comparison of Gemini and ERTS imagery obtained over southern Morocco

    NASA Technical Reports Server (NTRS)

    Blodget, H. W.; Anderson, A. T.

    1973-01-01

    A mosaic constructed from three ERTS MSS band 5 images enlarged to 1:500,000 compares favorably with a similar scale geologic map of southern Morocco, and a near-similar scale Gemini 5 photo pair. A comparative plot of lineations and generalized geology on the three formats shows that a significantly greater number of probable fractures are visible on the ERTS imagery than on the Gemini photography, and that both orbital formats show several times more lineaments than were previously mapped. A plot of mineral occurrences on the structural overlays indicates that definite structure-mineralization relationships exist; this finding is used to define underdeveloped areas which are prospective for mineralization. More detailed mapping is possible using MSS imagery than on Gemini 5 photographs, and, in addition, the ERTS format is not restricted to limited coverage.

  10. Status of the desert tortoise in Red Rock Canyon State Park

    USGS Publications Warehouse

    Berry, Kristin H.; Keith, Kevin; Bailey, Tracy Y.

    2008-01-01

    We surveyed for desert tortoises, Gopherus agassizii, in the western part of Red Rock Canyon State Park and watershed in eastern Kern County, California, between 2002 and 2004. We used two techniques: a single demographic plot (~4 km²) and 37 landscape plots (1 ha each). We estimated population densities of tortoises to be between 2.7 and 3.57/km² and the population in the Park to be 108 tortoises. We estimated the death rate at 67% for subadults and adults during the last 4 yrs. Mortality was high for several reasons: gunshot deaths, avian predation, mammalian predation, and probably disease. Historic and recent anthropogenic impacts from State Highway 14, secondary roads, trash, cross-country vehicle tracks, and livestock have contributed to elevated death rates and degradation of habitat. We propose conservation actions to reduce mortality.

  11. Effects of spatial heterogeneity on butterfly species richness in Rocky Mountain National Park, CO, USA

    USGS Publications Warehouse

    Kumar, S.; Simonson, S.E.; Stohlgren, T.J.

    2009-01-01

    We investigated butterfly responses to plot-level characteristics (plant species richness, vegetation height, and range in NDVI [normalized difference vegetation index]) and spatial heterogeneity in topography and landscape patterns (composition and configuration) at multiple spatial scales. Stratified random sampling was used to collect data on butterfly species richness from seventy-six 20 × 50 m plots. The plant species richness and average vegetation height data were collected from 76 modified-Whittaker plots overlaid on 76 butterfly plots. Spatial heterogeneity around sample plots was quantified by measuring topographic variables and landscape metrics at eight spatial extents (radii of 300, 600 to 2,400 m). The number of butterfly species recorded was strongly positively correlated with plant species richness, proportion of shrubland and mean patch size of shrubland. Patterns in butterfly species richness were negatively correlated with other variables including mean patch size, average vegetation height, elevation, and range in NDVI. The best predictive model selected using Akaike's Information Criterion corrected for small sample size (AICc), explained 62% of the variation in butterfly species richness at the 2,100 m spatial extent. Average vegetation height and mean patch size were among the best predictors of butterfly species richness. The models that included plot-level information and topographic variables explained relatively less variation in butterfly species richness, and were improved significantly after including landscape metrics. Our results suggest that spatial heterogeneity greatly influences patterns in butterfly species richness, and that it should be explicitly considered in conservation and management actions. © 2008 Springer Science+Business Media B.V.

  12. A Waveform Detector that Targets Template-Decorrelated Signals and Achieves its Predicted Performance: Demonstration with IMS Data

    NASA Astrophysics Data System (ADS)

    Carmichael, J.

    2016-12-01

    Waveform correlation detectors used in seismic monitoring scan multichannel data to test two competing hypotheses: that data contain (1) a noisy, amplitude-scaled version of a template waveform, or, (2) only noise. In reality, seismic wavefields include signals triggered by non-target sources (background seismicity) and target signals that are only partially correlated with the waveform template. We reform the waveform correlation detector hypothesis test to accommodate deterministic uncertainty in template/target waveform similarity and thereby derive a new detector from convex set projections (the "cone detector") for use in explosion monitoring. Our analyses give probability density functions that quantify the detectors' degraded performance with decreasing waveform similarity. We then apply our results to three announced North Korean nuclear tests and use International Monitoring System (IMS) arrays to determine the probability that low magnitude, off-site explosions can be reliably detected with a given waveform template. We demonstrate that cone detectors provide (1) an improved predictive capability over correlation detectors to identify such spatially separated explosive sources, (2) competitive detection rates, and (3) reduced false alarms on background seismicity. [Figure: observed and predicted receiver operating characteristic (ROC) curves for the correlation statistic r(x) and the cone statistic s(x) versus semi-empirical explosion magnitude, with empirical detection performance averaged over 24 hr of data (8 October 2006) superimposed.]

  13. NEWTONP - CUMULATIVE BINOMIAL PROGRAMS

    NASA Technical Reports Server (NTRS)

    Bowerman, P. N.

    1994-01-01

    The cumulative binomial program, NEWTONP, is one of a set of three programs which calculate cumulative binomial probability distributions for arbitrary inputs. The three programs, NEWTONP, CUMBIN (NPO-17555), and CROSSER (NPO-17557), can be used independently of one another. NEWTONP can be used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. The program has been used for reliability/availability calculations. NEWTONP calculates the probability p required to yield a given system reliability V for a k-out-of-n system. It can also be used to determine the Clopper-Pearson confidence limits (either one-sided or two-sided) for the parameter p of a Bernoulli distribution. NEWTONP can determine Bayesian probability limits for a proportion (if the beta prior has positive integer parameters). It can determine the percentiles of incomplete beta distributions with positive integer parameters. It can also determine the percentiles of F distributions and the median plotting positions in probability plotting. NEWTONP is designed to work well with all integer values 0 < k <= n. To run the program, the user simply runs the executable version and inputs the information requested by the program. NEWTONP is not designed to weed out incorrect inputs, so the user must take care to make sure the inputs are correct. Once all input has been entered, the program calculates and lists the result. It also lists the number of iterations of Newton's method required to calculate the answer within the given error. The NEWTONP program is written in C. It was developed on an IBM AT with a numeric co-processor using Microsoft C 5.0. Because the source code is written using standard C structures and functions, it should compile correctly with most C compilers. The program format is interactive. It has been implemented under DOS 3.2 and has a memory requirement of 26K. NEWTONP was developed in 1988.
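    NEWTONP itself is a C program; its core calculation, the component probability p that yields a required k-out-of-n system reliability, can be sketched in Python (here solved by bisection rather than the program's Newton's method):

```python
import math

def k_out_of_n_reliability(p: float, k: int, n: int) -> float:
    """System reliability: probability that at least k of n independent
    components, each working with probability p, are working."""
    return sum(math.comb(n, j) * p**j * (1 - p)**(n - j)
               for j in range(k, n + 1))

def required_p(v: float, k: int, n: int) -> float:
    """Component probability p yielding system reliability v for a
    k-out-of-n system; reliability is increasing in p, so bisect."""
    lo, hi = 0.0, 1.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if k_out_of_n_reliability(mid, k, n) < v:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

p = required_p(0.90, 2, 3)        # 2-out-of-3 system, 90% reliability
print(round(p, 4), round(k_out_of_n_reliability(p, 2, 3), 4))
```

    For a 2-out-of-3 system this inverts R(p) = 3p² − 2p³ = 0.90, the same root-finding problem NEWTONP solves with Newton iterations.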

  14. Statistical hypothesis tests of some micrometeorological observations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    SethuRaman, S.; Tichler, J.

    Chi-square goodness-of-fit is used to test the hypothesis that the medium scale of turbulence in the atmospheric surface layer is normally distributed. Coefficients of skewness and excess are computed from the data. If the data are not normal, these coefficients are used in Edgeworth's asymptotic expansion of the Gram-Charlier series to determine an alternate probability density function. The observed data are then compared with the modified probability densities and the new chi-square values computed. Seventy percent of the data analyzed was either normal or approximately normal. The coefficient of skewness g1 has a good correlation with the chi-square values. Events with |g1| < 0.21 were normal to begin with, and those with 0.21…
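
    The coefficients of skewness and excess used in this test come directly from the sample moments; a minimal sketch (function name hypothetical):

```python
def skew_excess(x):
    """Coefficient of skewness g1 = m3 / m2**1.5 and excess g2 = m4 / m2**2 - 3,
    where mK is the K-th central sample moment."""
    n = len(x)
    mean = sum(x) / n
    m2 = sum((v - mean) ** 2 for v in x) / n
    m3 = sum((v - mean) ** 3 for v in x) / n
    m4 = sum((v - mean) ** 4 for v in x) / n
    return m3 / m2 ** 1.5, m4 / m2 ** 2 - 3.0
```

    For a normal sample both coefficients are near zero; a symmetric sample has g1 = 0 exactly.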

  15. The recursive combination filter approach of pre-processing for the estimation of standard deviation of RR series.

    PubMed

    Mishra, Alok; Swati, D

    2015-09-01

    Variation in the interval between the R-R peaks of the electrocardiogram represents the modulation of the cardiac oscillations by the autonomic nervous system. This variation is contaminated by anomalous signals called ectopic beats, artefacts or noise which mask the true behaviour of heart rate variability. In this paper, we have proposed a combination filter of a recursive impulse rejection filter and a recursive 20% filter, with recursive application and a preference for replacement over removal of abnormal beats, to improve the pre-processing of the inter-beat intervals. We have tested this novel recursive combinational method with median replacement to estimate the standard deviation of normal to normal (SDNN) beat intervals of congestive heart failure (CHF) and normal sinus rhythm subjects. This work discusses in detail the improvement in pre-processing over single use of the impulse rejection filter and removal of abnormal beats for the estimation of SDNN and the Poincaré plot descriptors (SD1, SD2, and SD1/SD2). We have found the 22 ms value of SDNN and the 36 ms value of the SD2 descriptor of the Poincaré plot to be clinical indicators in discriminating the normal cases from the CHF cases. The pre-processing is also useful in the calculation of the Lyapunov exponent, a nonlinear index: Lyapunov exponents calculated after the proposed pre-processing are modified in a way that they start to follow the notion of less complex behaviour of diseased states.
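
    Under the standard definitions, SDNN and the Poincaré descriptors can be computed from a cleaned RR series as follows (a sketch independent of the paper's filtering pipeline; population estimators are used so that the ellipse identity 2*SDNN^2 = SD1^2 + SD2^2 holds exactly):

```python
import math
import statistics

def poincare_descriptors(rr):
    """SDNN plus the Poincaré plot descriptors SD1 (short-term) and SD2 (long-term)."""
    sdnn = statistics.pstdev(rr)
    diffs = [b - a for a, b in zip(rr, rr[1:])]       # successive differences
    sd1 = math.sqrt(statistics.pvariance(diffs) / 2)  # spread across the identity line
    sd2 = math.sqrt(max(2 * sdnn**2 - sd1**2, 0.0))   # spread along the identity line
    return sdnn, sd1, sd2
```

    The SD1/SD2 ratio mentioned in the abstract is then just sd1 / sd2.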

  16. Bivariate categorical data analysis using normal linear conditional multinomial probability model.

    PubMed

    Sun, Bingrui; Sutradhar, Brajendra

    2015-02-10

    Bivariate multinomial data such as the left and right eyes retinopathy status data are analyzed either by using a joint bivariate probability model or by exploiting certain odds ratio-based association models. However, the joint bivariate probability model yields marginal probabilities, which are complicated functions of marginal and association parameters for both variables, and the odds ratio-based association model treats the odds ratios involved in the joint probabilities as 'working' parameters, which are consequently estimated through certain arbitrary 'working' regression models. Also, this latter odds ratio-based model does not provide any easy interpretation of the correlations between two categorical variables. On the basis of pre-specified marginal probabilities, in this paper, we develop a bivariate normal type linear conditional multinomial probability model to understand the correlations between two categorical variables. The parameters involved in the model are consistently estimated using the optimal likelihood and generalized quasi-likelihood approaches. The proposed model and the inferences are illustrated through an intensive simulation study as well as an analysis of the well-known Wisconsin Diabetic Retinopathy status data. Copyright © 2014 John Wiley & Sons, Ltd.

  17. Randomized path optimization for the mitigated counter detection of UAVs

    DTIC Science & Technology

    2017-06-01

    ...using Bayesian filtering. The KL divergence is used to compare the probability density of aircraft termination to a normal distribution around the true terminal... algorithm's success. A recursive Bayesian filtering scheme is used to assimilate noisy measurements of the UAV's position to predict its terminal location. We
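
    When both the estimated termination density and the reference density are Gaussian, the KL divergence being compared has a closed form; a minimal sketch of that special case (illustrative, not taken from the report):

```python
import math

def kl_normal(mu0, sig0, mu1, sig1):
    """KL divergence D(N(mu0, sig0^2) || N(mu1, sig1^2)) in nats."""
    return math.log(sig1 / sig0) + (sig0**2 + (mu0 - mu1)**2) / (2 * sig1**2) - 0.5
```

    The divergence is zero only when the two normals coincide and grows as the estimated density drifts from the reference.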

  18. Bayes classification of terrain cover using normalized polarimetric data

    NASA Technical Reports Server (NTRS)

    Yueh, H. A.; Swartz, A. A.; Kong, J. A.; Shin, R. T.; Novak, L. M.

    1988-01-01

    The normalized polarimetric classifier (NPC) which uses only the relative magnitudes and phases of the polarimetric data is proposed for discrimination of terrain elements. The probability density functions (PDFs) of polarimetric data are assumed to have a complex Gaussian distribution, and the marginal PDF of the normalized polarimetric data is derived by adopting the Euclidean norm as the normalization function. The general form of the distance measure for the NPC is also obtained. It is demonstrated that for polarimetric data with an arbitrary PDF, the distance measure of NPC will be independent of the normalization function selected even when the classifier is mistrained. A complex Gaussian distribution is assumed for the polarimetric data consisting of grass and tree regions. The probability of error for the NPC is compared with those of several other single-feature classifiers. The classification error of NPCs is shown to be independent of the normalization function.

  19. Critical zone properties control the fate of nitrogen during experimental rainfall in montane forests of the Colorado Front Range

    USGS Publications Warehouse

    Hinckley, Eve-Lyn S.; Ebel, Brian A.; Barnes, Rebecca T.; Murphy, Sheila F.; Anderson, Suzanne P.

    2017-01-01

    Several decades of research in alpine ecosystems have demonstrated links among the critical zone, hydrologic response, and the fate of elevated atmospheric nitrogen (N) deposition. Less research has occurred in mid-elevation forests, which may be important for retaining atmospheric N deposition. To explore the fate of N in the montane zone, we conducted plot-scale experimental rainfall events across a north–south transect within a catchment of the Boulder Creek Critical Zone Observatory. Rainfall events mimicked relatively common storms (20–50% annual exceedance probability) and were labeled with 15N-nitrate (15NO3−) and lithium bromide tracers. For 4 weeks, we measured soil–water and leachate concentrations of Br−, 15NO3−, and NO3− daily, followed by recoveries of 15N species in bulk soils and microbial biomass. Tracers moved immediately into the subsurface of north-facing slope plots, exhibiting breakthrough at 10 and 30 cm over 22 days. Conversely, little transport of Br− or 15NO3− occurred in south-facing slope plots; tracers remained in soil or were lost via pathways not measured. Hillslope position was a significant determinant of soil 15N-NO3− recoveries, while soil depth and time were significant determinants of 15N recovery in microbial biomass. Overall, 15N recovery in microbial biomass and leachate was greater in upper north-facing slope plots than lower north-facing (toeslope) and both south-facing slope plots in August; by October, 15N recovery in microbial N biomass within south-facing slope plots had increased substantially. Our results point to the importance of soil properties in controlling the fate of N in mid-elevation forests during the summer season.

  20. Uptake of cations under two different water regimes in a boreal scots pine forest.

    PubMed

    Plamboeck, A H; Nylén, T; Grip, H

    2000-07-10

    There is still much to find out about how trees react to changing nutrient conditions. In this cation uptake study, 134Cs and 22Na were injected between the humus and the mineral soil, and into a 20-cm depth in the mineral soil, respectively. Half of the experimental site was subjected to desiccation in 1995 and 1996, while the other half was subjected to irrigation in 1995, and desiccation in 1996. One month after the injections, the concentration of 134Cs in the xylem sap was higher in the irrigated plots (ID) than in the desiccated plots (DD). In August 1995, the difference in the 134Cs concentration in the xylem sap was even higher between the treatments. In 1995, 22Na was also higher in the xylem sap on the ID plots than on the DD plots, but not significantly. Exponential relationships were found between the amount of 134Cs and 22Na in the xylem sap; the relative water uptake from humus and 0-10-cm mineral soil (134Cs); and 10-25-cm mineral soil (22Na) in July 1995, when the tracers had not yet reached the top of the boles. The relative uptake of injected 22Na was larger than that of injected 134Cs, probably due to low exchangeability of Cs in the soil. One year after the injection (1996), more 134Cs was found in the wood, bark, needles and cones on the plots irrigated in 1995 than on the desiccated plots. The content of 134Cs in the stem wood and stump amounted to nearly 80% of the total uptake in the trees. The Cs distribution 1 year after the Chernobyl accident was dominated by Cs on/in needles and bark. After 10 years of redistribution, the Chernobyl Cs content of the different parts of the trees approached that of K.

  1. A review of contemporary methods for the presentation of scientific uncertainty.

    PubMed

    Makinson, K A; Hamby, D M; Edwards, J A

    2012-12-01

    Graphic methods for displaying uncertainty are often the most concise and informative way to communicate abstract concepts. Presentation methods currently in use for the display and interpretation of scientific uncertainty are reviewed. Numerous subjective and objective uncertainty display methods are presented, including qualitative assessments, node and arrow diagrams, standard statistical methods, box-and-whisker plots, robustness and opportunity functions, contribution indexes, probability density functions, cumulative distribution functions, and graphical likelihood functions.

  2. RadVel: General toolkit for modeling Radial Velocities

    NASA Astrophysics Data System (ADS)

    Fulton, Benjamin J.; Petigura, Erik A.; Blunt, Sarah; Sinukoff, Evan

    2018-01-01

    RadVel models Keplerian orbits in radial velocity (RV) time series. The code is written in Python with a fast Kepler's equation solver written in C. It provides a framework for fitting RVs using maximum a posteriori optimization and computing robust confidence intervals by sampling the posterior probability density via Markov Chain Monte Carlo (MCMC). RadVel can perform Bayesian model comparison and produces publication quality plots and LaTeX tables.
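
    RadVel's MCMC machinery is considerably more sophisticated, but the underlying idea, sampling the posterior and reading confidence intervals off the sample percentiles, can be sketched with a toy Metropolis sampler on a one-dimensional posterior (everything here is illustrative and is not the RadVel API):

```python
import math
import random

random.seed(1)

def log_post(theta):
    """Toy log-posterior: a standard normal stands in for the real RV posterior."""
    return -0.5 * theta * theta

def metropolis(n, step=1.0):
    """Random-walk Metropolis sampler returning n (correlated) posterior samples."""
    samples, theta, lp = [], 0.0, log_post(0.0)
    for _ in range(n):
        prop = theta + random.gauss(0.0, step)
        lp_prop = log_post(prop)
        if math.log(random.random()) < lp_prop - lp:  # accept/reject
            theta, lp = prop, lp_prop
        samples.append(theta)
    return samples

s = sorted(metropolis(20000))
lo, hi = s[int(0.16 * len(s))], s[int(0.84 * len(s))]  # ~68% credible interval
```

    For the standard normal the 68% interval should come out near (-1, 1), which is the kind of percentile-based interval RadVel reports for orbital parameters.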

  3. Model Description for the SOCRATES Contamination Code

    DTIC Science & Technology

    1988-10-21

    Illustrations listed include: a schematic representation of the major elements of the Shuttle contamination problem; a diagram of the dependence of atmospherically scattered molecules on ambient number density for the 200, 250, and 300 km runs; and a plot of the chi-square probability density function. ... are scaled with respect to the far-field ambient number density, nD, which leaves only the cross section scaling factor to be determined. This factor

  4. Purification of L-[3H]Nicotine eliminates low affinity binding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Romm, E.; Marks, M.J.; Collins, A.C.

    1990-01-01

    Some studies of L-[3H]nicotine binding to rodent and human brain tissue have detected two binding sites, as evidenced by nonlinear Scatchard plots. Evidence presented here indicates that the low affinity binding site is not stereospecific, is not inhibited by low concentrations of cholinergic agonists, and is probably due to breakdown products of nicotine, since purification of the L-[3H]nicotine eliminates the low affinity site.

  5. Comments on PDF methods

    NASA Technical Reports Server (NTRS)

    Chen, J.-Y.

    1992-01-01

    Viewgraphs are presented on the following topics: the grand challenge of combustion engineering; research of probability density function (PDF) methods at Sandia; experiments of turbulent jet flames (Masri and Dibble, 1988); departures from chemical equilibrium; modeling turbulent reacting flows; superequilibrium OH radical; pdf modeling of turbulent jet flames; scatter plot for CH4 (methane) and O2 (oxygen); methanol turbulent jet flames; comparisons between predictions and experimental data; and turbulent C2H4 jet flames.

  6. Composition and Structure of a 1930s-Era Pine-Hardwood Stand in Arkansas

    Treesearch

    Don C. Bragg

    2004-01-01

    This paper describes an unmanaged 1930s-era pine-hardwood stand on a minor stream terrace in Ashley County, AR. Probably inventoried as a part of an early growth and yield study, the sample plot was approximately 3.2 ha in size and contained at least 21 tree species. Loblolly pine comprised 39.1% of all stems, followed by willow oak (12.7%), winged elm (9.6%), sweetgum...

  7. Estimating Mixed Broadleaves Forest Stand Volume Using Dsm Extracted from Digital Aerial Images

    NASA Astrophysics Data System (ADS)

    Sohrabi, H.

    2012-07-01

    In the mixed old-growth broadleaved Hyrcanian forests, it is difficult to estimate stand volume at plot level from remotely sensed data when LiDAR data are absent. In this paper, a new approach is proposed and tested for estimating forest stand volume. The approach is based on the idea that forest volume can be estimated from the variation of tree heights within a plot. In other words, the greater the height variation in a plot, the greater the stand volume that would be expected. To test this idea, 120 circular 0.1 ha sample plots in a systematic random design were collected in the Tonekaon forest located in the Hyrcanian zone. A digital surface model (DSM) measures the height values of the first surface on the ground, including terrain features, trees, buildings, etc., providing a topographic model of the earth's surface. The DSMs were extracted automatically from aerial UltraCamD images, with ground pixel sizes for the extracted DSMs varying from 1 to 10 m in 1 m steps. DSMs were checked manually for probable errors. Corresponding to the ground samples, the standard deviation and range of the DSM pixels were calculated. For modeling, a non-linear regression method was used. The results showed that the standard deviation of plot pixels at 5 m resolution was the most appropriate data for modeling. The relative bias and RMSE of estimation were 5.8 and 49.8 percent, respectively. Compared to other approaches for estimating stand volume from passive remote sensing data in mixed broadleaved forests, these results are more encouraging. One big problem in this method occurs when the tree canopy cover is totally closed: in this situation, the standard deviation of height is low while the stand volume is high. In future studies, applying forest stratification could be studied.
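
    If a power-law form is assumed for the volume model, the nonlinear regression step can be approximated by ordinary least squares in log space (a sketch under that assumption; the paper does not specify its model form, and the data below are made up):

```python
import math

def fit_power(sd_heights, volumes):
    """Fit V = a * sd**b by least squares on log-transformed data; returns (a, b)."""
    xs = [math.log(x) for x in sd_heights]
    ys = [math.log(y) for y in volumes]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = math.exp(my - b * mx)
    return a, b
```

    With plot-level standard deviations of DSM heights as the predictor, the fitted curve then yields a volume estimate per plot.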

  8. Experimental removal of woody vegetation does not increase nesting success or fledgling production in two grassland sparrows (Ammodramus) in Pennsylvania

    USGS Publications Warehouse

    Hill, Jason M.; Diefenbach, Duane R.

    2013-01-01

    The influence of vegetation structure on the probability of daily nest survival (DNS) for grassland passerines has received considerable attention. Some correlative studies suggest that the presence of woody vegetation lowers DNS. Over 3 years (2009–2011), we monitored 215 nests of the Grasshopper Sparrow (Ammodramus savannarum) and Henslow's Sparrow (A. henslowii) on 162 ha of reclaimed surface-mine grasslands in Pennsylvania. We removed shrubs from treatment plots with ≤36% areal coverage of woody vegetation in a before-after-control-impact-pairs (BACIP) design framework. Daily nest survival (95% CI: 0.94–0.96) was as high as previous studies have reported but was not associated with woody vegetative cover, proximity to woody vegetation, or woody stem density. Variation in DNS was best explained by increasing nonwoody-vegetation height. Grasshopper Sparrow fledgling production on treatment plots in 2010 (95% CI: 3.3–4.7) and 2011 (95% CI: 3.8–5.0) was similar to baseline conditions of treatment plots (95% CI: 3.4–4.9) and control plots (95% CI: 3.2–4.5) in 2009. Fledgling production was associated with thatch depth (β ± SE = 0.13 ± 0.09) and bare ground (β ± SE = -2.62 ± 1.29) adjacent to the nest and with plot woody vegetative cover (β ± SE = -3.09 ± 1.02). Our experimental research suggests that overall reproductive success of Grasshopper and Henslow's sparrows on reclaimed surface-mine grasslands is driven by a suite of largely nonwoody-vegetation components, and both species can successfully nest and produce young in habitats with greater amounts of scattered woody vegetation than has generally been considered.

  9. Enantioseparation of angiotensin II receptor type 1 blockers: evaluation of 6-substituted carbamoyl benzimidazoles on immobilized polysaccharide-based chiral stationary phases. Unusual temperature behavior.

    PubMed

    Su, Ran; Hou, Zhun; Sang, Lihong; Zhou, Zhi-Ming; Fang, Hao; Yang, Xinying

    2017-09-15

    Enantioseparation of thirteen 6-substituted carbamoyl benzimidazoles by high-performance liquid chromatography (HPLC) was investigated using two immobilized polysaccharide-based chiral stationary phases (CSPs), Chiralpak IC and Chiralpak IA, in normal-phase mode. Most of the examined compounds were completely resolved. The effects of a polar alcohol modifier, analyte structure, and column temperature on the chiral recognition were investigated. Furthermore, the structure-retention relationship was evaluated, and thermodynamic parameters were calculated from plots of ln k' or ln α versus 1/T. The thermodynamic parameters indicated that the separations were enthalpy-driven. Moreover, nonlinear van't Hoff plots were obtained on Chiralpak IA. However, two unusual phenomena were observed: (1) an unusual increase in retention with increasing temperature, with linear van't Hoff plots, on Chiralpak IC; and (2) an extremely high Tiso value (i.e., several thousand degrees centigrade). Copyright © 2017 Elsevier B.V. All rights reserved.
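
    In the linear regime, a van't Hoff fit of ln k' against 1/T gives a slope of -ΔH°/R, so the enthalpy change follows directly from a least-squares line (a sketch under that standard assumption, with made-up data; the phase-ratio term in the intercept is omitted):

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def vant_hoff_fit(temps_K, ln_k):
    """Least-squares fit ln k' = slope*(1/T) + intercept; returns (dH, slope, intercept),
    where dH = -R*slope is the apparent enthalpy change in J/mol."""
    xs = [1.0 / t for t in temps_K]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ln_k) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ln_k)) / \
        sum((x - mx) ** 2 for x in xs)
    intercept = my - slope * mx
    return -R * slope, slope, intercept
```

    A positive slope (retention falling as temperature rises) gives a negative ΔH°, i.e., an enthalpy-driven separation; the anomaly on Chiralpak IC corresponds to a slope of the opposite sign.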

  10. Understand Centrifugal Compressor stage curves

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stadler, E.L.

    1986-08-01

    Multistage centrifugal compressor performance is generally presented in the form of a composite curve showing discharge pressure and bhp plotted as a function of capacity. This composite curve represents the cumulative performance of each stage performance curve. A simple yet quite accurate means of measuring compressor total performance is to test each stage as a single-stage compressor, usually on air with atmospheric inlets. Stage curves are then generated from the test data and three important variables are plotted: head coefficient, work coefficient and adiabatic efficiency. These variables are plotted against a normalized flow coefficient, Q/N, which is inlet volume flow (cfm) divided by impeller speed (rpm). The nomenclature used to define these stage variables changes from manufacturer to manufacturer; however, the parameters presented are the same. An understanding of each parameter's theoretical derivation and determination from test data will help the engineer reviewing test curves to be more cognizant of the interrelationships between these variables; specifically, how they affect overall machine pressure rise and power consumption.

  11. Absolute calibration technique for broadband ultrasonic transducers

    NASA Technical Reports Server (NTRS)

    Yost, William T. (Inventor); Cantrell, John H. (Inventor)

    1994-01-01

    Calibrating an ultrasonic transducer can be performed with a reduced number of calculations and testing. A wide-band pulser is connected to an ultrasonic transducer under test to generate ultrasonic waves in a liquid. A single frequency is transmitted to the electrostatic acoustic transducer (ESAT) and the voltage change produced is monitored. Then a broadband ultrasonic pulse is generated by the ultrasonic transducer and received by the ESAT. The output of the ESAT is amplified and input to a digitized oscilloscope for fast Fourier transform. The resulting plot is normalized with the monitored signal from the single frequency pulse. The plot is then corrected for characteristics of the membrane and diffraction effects. The transfer function of the final plot is determined. The transfer function gives the final sensitivity of the ultrasonic transducer as a function of frequency. The advantage of the system is the speed of calibrating the transducer by a reduced number of measurements and removal of the membrane and diffraction effects.

  12. Extreme drought event and shrub invasion combine to reduce ecosystem functioning and resilience in water-limited climates

    NASA Astrophysics Data System (ADS)

    Caldeira, Maria; Lecomte, Xavier; David, Teresa; Pinto, Joaquim; Bugalho, Miguel; Werner, Christiane

    2016-04-01

    Extreme droughts and plant invasions are major drivers of global change that can critically affect ecosystem functioning. Shrub encroachment is increasing in many regions worldwide, and extreme events are projected to increase in frequency and intensity, namely in the Mediterranean region. Nevertheless, little is known about how these drivers may interact and affect ecosystem functioning and resilience to extreme droughts. Using a manipulative shrub removal experiment and the co-occurrence of an extreme drought event (2011/2012) in a Mediterranean woodland, we show that the native shrub invasion and extreme drought combined to reduce ecosystem transpiration and the resilience of the keystone oak tree species. We established six 25 x 25 m paired plots in a shrub (Cistus ladanifer L.) encroached Mediterranean cork-oak (Quercus suber L.) woodland. We measured sapflow and pre-dawn leaf water potential of trees and shrubs and soil water content in all plots during three years. We determined the resilience of tree transpiration to evaluate to what extent trees recovered from the extreme drought event. From February to November 2011 we conducted baseline measurements for plot comparison. In November 2011 all the shrubs from one plot of each pair were cut and removed. Ecosystem transpiration was dominated by the water use of the invasive shrub, which further increased after the extreme drought. Simultaneously, tree transpiration in invaded plots declined much more strongly (67 ± 13%) than in plots cleared of shrubs (31 ± 11%) relative to the pre-drought year. Trees in invaded plots were not able to recover in the following wetter year, showing lower resilience to the extreme drought event. Our results imply that in Mediterranean-type climates, invasion by water-spending species can combine with projected recurrent extreme droughts, causing critical drought tolerance thresholds of trees to be overcome and increasing the probability of tree mortality (Caldeira et al., 2015). Caldeira M.C., Lecomte X., David T.S., Pinto J.G., Bugalho M.N. & Werner C. (2015). Synergy of extreme drought and shrub invasion reduce ecosystem functioning and resilience in water-limited climates. Scientific Reports, 5, 15110.

  13. Recovery of tall cotton-grass following real and simulated feeding by snow geese

    USGS Publications Warehouse

    Hupp, Jerry W.; Robertson, Donna G.; Schmutz, Joel A.

    2000-01-01

    Lesser snow geese Anser caerulescens caerulescens from the western Canadian Arctic feed on underground parts of tall cotton-grass Eriophorum angustifolium during autumn staging on the coastal plain of the Beaufort Sea in Canada and Alaska. We studied revegetation of sites where cotton-grass had been removed either by human-imprinted snow geese or by hand to simulate snow goose feeding. Aerial cover of cotton-grass at sites (n = 4) exploited by human-imprinted snow geese averaged 60 and 39% lower than in undisturbed control plots during the first and second year after feeding, respectively. Underground biomass of cotton-grass stembases and rhizomes in hand-treated plots was 80 and 62% less than in control plots 2 and 4 yr after removal, respectively (n = 10 yr-1). Aerial cover and biomass of common non-forage species such as Carex aquatilis did not increase on treated areas. Removal of cotton-grass by geese likely reduces forage availability at exploited sites for at least 2-4 yr after feeding but probably does not affect long-term community composition. Temporal heterogeneity in forage abundance likely contributes to the large spatial requirement of snow geese during staging.

  14. Analyzing Multiple-Choice Questions by Model Analysis and Item Response Curves

    NASA Astrophysics Data System (ADS)

    Wattanakasiwich, P.; Ananta, S.

    2010-07-01

    In physics education research, the main goal is to improve physics teaching so that most students understand physics conceptually and are able to apply concepts in solving problems. Therefore, many multiple-choice instruments have been developed to probe students' conceptual understanding of various topics. Two techniques, model analysis and item response curves, were used to analyze students' responses to the Force and Motion Conceptual Evaluation (FMCE). For this study, FMCE data from more than 1000 students at Chiang Mai University were collected over the past three years. With model analysis, we can obtain students' alternative knowledge and the probabilities for students to use such knowledge in a range of equivalent contexts. The model analysis consists of two algorithms: concentration factor and model estimation. This paper only presents results from using the model estimation algorithm to obtain a model plot. The plot helps to identify whether a class model state is in the misconception region or not. An item response curve (IRC), derived from item response theory, is a plot of the percentage of students selecting a particular choice versus their total score. Pros and cons of both techniques are compared and discussed.
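
    An item response curve as described here is simply the fraction of students at each total score who select each answer option on one item; a minimal sketch (data and names are illustrative):

```python
from collections import defaultdict

def item_response_curve(total_scores, choices):
    """For one item: map total score -> {option: fraction of students choosing it}."""
    counts = defaultdict(lambda: defaultdict(int))
    totals = defaultdict(int)
    for score, choice in zip(total_scores, choices):
        counts[score][choice] += 1
        totals[score] += 1
    return {s: {c: counts[s][c] / totals[s] for c in counts[s]} for s in totals}
```

    Plotting each option's fraction against total score then gives one curve per distractor, with the correct answer's curve rising toward 1 for high scorers.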

  15. Design prediction for long term stress rupture service of composite pressure vessels

    NASA Technical Reports Server (NTRS)

    Robinson, Ernest Y.

    1992-01-01

    Extensive stress rupture studies on glass composites and Kevlar composites were conducted by the Lawrence Radiation Laboratory beginning in the late 1960's and extending to about 8 years in some cases. Some of the data from these studies published over the years were incomplete or were tainted by spurious failures, such as grip slippage. Updated data sets were defined for both fiberglass and Kevlar composite strand test specimens. These updated data are analyzed in this report by a convenient form of the bivariate Weibull distribution, to establish a consistent set of design prediction charts that may be used as a conservative basis for predicting the stress rupture life of composite pressure vessels. The updated glass composite data exhibit an invariant Weibull modulus with lifetime. The data are analyzed in terms of homologous service load (referenced to the observed median strength). The equations relating life, homologous load, and probability are given, and corresponding design prediction charts are presented. A similar approach is taken for Kevlar composites, where the updated strand data do show a turndown tendency at long life accompanied by a corresponding change (increase) of the Weibull modulus. The turndown characteristic is not present in stress rupture test data of Kevlar pressure vessels. A modification of the stress rupture equations is presented to incorporate a latent, but limited, strength drop, and design prediction charts are presented that incorporate such behavior. The methods presented utilize Cartesian plots of the probability distributions (which are a more natural display for the design engineer), based on median normalized data that are independent of statistical parameters and are readily defined for any set of test data.
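
    The life-probability relationship behind such charts is the two-parameter Weibull form: survival to time t is R(t) = exp(-(t/eta)^beta), so the life reached with reliability R is t = eta * (-ln R)^(1/beta). A minimal sketch (the parameter values below are illustrative, not from the report):

```python
import math

def weibull_survival(t, eta, beta):
    """Fraction of specimens surviving to time t under a two-parameter Weibull model
    with scale eta (characteristic life) and shape beta (Weibull modulus)."""
    return math.exp(-((t / eta) ** beta))

def life_at_reliability(R, eta, beta):
    """Time by which the surviving fraction has dropped to R (inverse of the above)."""
    return eta * (-math.log(R)) ** (1.0 / beta)
```

    At t = eta the surviving fraction is exp(-1), about 36.8%, regardless of the modulus; a higher beta concentrates failures near the characteristic life.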

  16. Plant calendar pattern based on rainfall forecast and the probability of its success in Deli Serdang regency of Indonesia

    NASA Astrophysics Data System (ADS)

    Darnius, O.; Sitorus, S.

    2018-03-01

    The objective of this study was to determine the plant calendar pattern of three types of crops, namely palawija, rice, and banana, based on rainfall in Deli Serdang Regency. In the first stage, we forecasted rainfall by using time series analysis and obtained an appropriate model, ARIMA(1,0,0)(1,1,1)12. Based on the forecast result, we designed a plant calendar pattern for the three types of plant. Furthermore, the probability of success for plantings that follow the calendar pattern was calculated by using a Markov process, discretizing the continuous rainfall data into three categories, namely Below Normal (BN), Normal (N), and Above Normal (AN), to form the probability transition matrix. Finally, the combination of the rainfall forecasting model and the Markov process was used to determine the pattern of cropping calendars and the probability of success for the three crops. This research used rainfall data for Deli Serdang Regency taken from the office of BMKG (the Meteorology, Climatology and Geophysics Agency), Sampali Medan, Indonesia.
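
    The Markov step amounts to estimating a 3x3 transition matrix over the BN/N/AN categories from the discretized rainfall series (a sketch with made-up data; the paper's actual counts are not reproduced here):

```python
from collections import Counter

STATES = ("BN", "N", "AN")

def transition_matrix(seq):
    """Row-stochastic transition probabilities estimated from one category sequence."""
    pairs = Counter(zip(seq, seq[1:]))   # observed one-step transitions
    rows = Counter(seq[:-1])             # how often each state was left
    return {a: {b: (pairs[(a, b)] / rows[a] if rows[a] else 0.0) for b in STATES}
            for a in STATES}
```

    Multiplying the appropriate entries along a planned cropping window then gives the probability of the rainfall sequence that the calendar assumes.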

  17. Calibration of micromechanical parameters for DEM simulations by using the particle filter

    NASA Astrophysics Data System (ADS)

    Cheng, Hongyang; Shuku, Takayuki; Thoeni, Klaus; Yamamoto, Haruyuki

    2017-06-01

    The calibration of DEM models is typically accomplished by trial and error. However, the procedure lacks objectivity and has several uncertainties. To deal with these issues, the particle filter is employed as a novel approach to calibrate DEM models of granular soils. The posterior probability distribution of the micro-parameters that give numerical results in good agreement with the experimental response of a Toyoura sand specimen is approximated by independent model trajectories, referred to as `particles', based on Monte Carlo sampling. The soil specimen is modeled by polydisperse packings with different numbers of spherical grains. Prepared in `stress-free' states, the packings are subjected to triaxial quasistatic loading. Given the experimental data, the posterior probability distribution is incrementally updated until convergence is reached. The resulting `particles' with higher weights are identified as the calibration results. The evolutions of the weighted averages and posterior probability distribution of the micro-parameters are plotted to show the advantage of using a particle filter, i.e., multiple solutions are identified for each parameter with known probabilities of reproducing the experimental response.
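
    One incremental update step of a particle filter, reweighting each `particle' by the likelihood of the newly observed data point, can be sketched as follows (illustrative only, assuming a Gaussian measurement error; this is not the paper's DEM coupling):

```python
import math

def pf_reweight(particles, weights, obs, sigma):
    """Multiply prior weights by a Gaussian likelihood of obs, then normalize.
    Each particle here is that trajectory's predicted value of the observable."""
    likes = [math.exp(-0.5 * ((obs - p) / sigma) ** 2) for p in particles]
    new_w = [w * l for w, l in zip(weights, likes)]
    total = sum(new_w)
    return [w / total for w in new_w]
```

    Repeating this step over the loading history concentrates weight on the parameter sets whose simulated response tracks the experiment, which is how the calibrated micro-parameters and their probabilities emerge.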

  18. NASTRAN applications to aircraft propulsion systems

    NASA Technical Reports Server (NTRS)

    White, J. L.; Beste, D. L.

    1975-01-01

    The use of NASTRAN in propulsion system structural integration analysis is described. Computer support programs for modeling, substructuring, and plotting analysis results are discussed. Requirements on interface information and data exchange by participants in a NASTRAN substructure analysis are given. Static and normal modes vibration analysis results are given with comparison to test and other analytical results.

  19. 33 CFR 164.38 - Automatic radar plotting aids (ARPA).

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... ARPA data is clearly visible in general to more than one observer in the conditions of light normally... radar display and, in the case of automatic acquisition, enters within the acquisition area chosen by the observer or, in the case of manual acquisition, has been acquired by the observer, the ARPA should...

  20. 33 CFR 164.38 - Automatic radar plotting aids (ARPA).

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... ARPA data is clearly visible in general to more than one observer in the conditions of light normally... radar display and, in the case of automatic acquisition, enters within the acquisition area chosen by the observer or, in the case of manual acquisition, has been acquired by the observer, the ARPA should...

  1. 33 CFR 164.38 - Automatic radar plotting aids (ARPA).

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... ARPA data is clearly visible in general to more than one observer in the conditions of light normally... radar display and, in the case of automatic acquisition, enters within the acquisition area chosen by the observer or, in the case of manual acquisition, has been acquired by the observer, the ARPA should...

  2. 33 CFR 164.38 - Automatic radar plotting aids (ARPA).

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... ARPA data is clearly visible in general to more than one observer in the conditions of light normally... radar display and, in the case of automatic acquisition, enters within the acquisition area chosen by the observer or, in the case of manual acquisition, has been acquired by the observer, the ARPA should...

  3. Cool-and Unusual-CAD Applications

    ERIC Educational Resources Information Center

    Calhoun, Ken

    2004-01-01

    This article describes several useful applications of AutoCAD that may lie outside its normal scope of application. AutoCAD commands used in this article are based on AutoCAD 2000I. The author and his students used a Hewlett Packard 750C DesignJet plotter for plotting. (Contains 5 figures and 5 photos.)

  4. Estimating leaf area and above-ground biomass of forest regeneration areas using a corrected normalized difference vegetation index

    Treesearch

    Tommy L. Coleman; James H. Miller; Bruce R. Zutter

    1992-01-01

    The objective of this study was to investigate the regression relations between vegetation indices derived from remotely sensed data of single and mixed forest regeneration plots. Loblolly pine (Pinus taeda L.) seedlings, sweetgum (Liquidambar styraciflua L.) seedlings and broomsedge (Andropogon virginicus L.)...

  5. Association of green stem disorder with agronomic traits in soybean

    USDA-ARS?s Scientific Manuscript database

    Green stem disorder of soybean (GSD) is the occurrence of non-senescent, fleshy green stems of plants with normal, fully mature pods and seeds. Data on GSD incidence based on a percentage of plants in plots showing symptoms were collected for soybean cultivars in 86 trials from 2009 to 2012 at seven...

  6. Topography and crop management are key factors for the development of american leaf spot epidemics on coffee in costa rica.

    PubMed

    Avelino, Jacques; Cabut, Sandrine; Barboza, Bernardo; Barquero, Miguel; Alfaro, Ronny; Esquivel, César; Durand, Jean-François; Cilas, Christian

    2007-12-01

    We monitored the development of American leaf spot of coffee, a disease caused by the gemmiferous fungus Mycena citricolor, in 57 plots in Costa Rica for 1 or 2 years in order to gain a clearer understanding of conditions conducive to the disease and improve its control. During the investigation, characteristics of the coffee trees, crop management, and the environment were recorded. For the analyses, we used partial least-squares regression via the spline functions (PLSS), which is a nonlinear extension to partial least-squares regression (PLS). The fungus developed well in areas located between approximately 1,100 and 1,550 m above sea level. Slopes were conducive to its development, but eastern-facing slopes were less affected than the others, probably because they were more exposed to sunlight, especially in the rainy season. The distance between planting rows, the shade percentage, coffee tree height, the type of shade, and the pruning system explained disease intensity due to their effects on coffee tree shading and, possibly, on the humidity conditions in the plot. Forest trees and fruit trees intercropped with coffee provided particularly propitious conditions. Apparently, fertilization was unfavorable for the disease, probably due to dilution phenomena associated with faster coffee tree growth. Finally, series of wet spells interspersed with dry spells, which were frequent in the middle of the rainy season, were critical for the disease, probably because they affected the production and release of gemmae and their viability. These results could be used to draw up a map of epidemic risks taking topographical factors into account. To reduce those risks and improve chemical control, our results suggested that farmers should space planting rows further apart, maintain light shading in the plantation, and prune their coffee trees.

  7. Using counts to simultaneously estimate abundance and detection probabilities in a salamander community

    USGS Publications Warehouse

    Dodd, C.K.; Dorazio, R.M.

    2004-01-01

    A critical variable in both ecological and conservation field studies is determining how many individuals of a species are present within a defined sampling area. Labor-intensive techniques such as capture-mark-recapture and removal sampling may provide estimates of abundance, but there are many logistical constraints to their widespread application. Many studies on terrestrial and aquatic salamanders use counts as an index of abundance, assuming that detection remains constant while sampling. If this constancy is violated, determination of detection probabilities is critical to the accurate estimation of abundance. Recently, a model was developed that provides a statistical approach that allows abundance and detection to be estimated simultaneously from spatially and temporally replicated counts. We adapted this model to estimate these parameters for salamanders sampled over a six-year period in area-constrained plots in Great Smoky Mountains National Park. Estimates of salamander abundance varied among years, but annual changes in abundance did not vary uniformly among species. Except for one species, abundance estimates were not correlated with site covariates (elevation/soil and water pH, conductivity, air and water temperature). The uncertainty in the estimates was so large as to make correlations ineffectual in predicting which covariates might influence abundance. Detection probabilities also varied among species and sometimes among years for the six species examined. We found such a high degree of variation in our counts and in estimates of detection among species, sites, and years as to cast doubt upon the appropriateness of using count data to monitor population trends using a small number of area-constrained survey plots. Still, the model provided reasonable estimates of abundance that could make it useful in estimating population size from count surveys.
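
The replicated-count idea can be made concrete with a toy version of the model. The sketch below assumes a single abundance N shared by all plots (the full model is richer, e.g. plot-level Poisson abundance) and recovers (N, p) by a crude grid-search maximum likelihood; all counts are simulated, not the study's data.

```python
import numpy as np
from math import comb, log

rng = np.random.default_rng(5)

# Simulate T repeated counts at R plots: each count is Binomial(N, p),
# i.e. N animals present, each detected with probability p per visit.
true_N, true_p, n_plots, n_visits = 30, 0.4, 25, 5
counts = rng.binomial(true_N, true_p, size=(n_plots, n_visits))

def log_likelihood(N, p):
    # Joint binomial log-likelihood of all counts given (N, p).
    if counts.max() > N:
        return -np.inf
    return sum(log(comb(N, int(c))) + c * log(p) + (N - c) * log(1 - p)
               for c in counts.ravel())

# Crude grid search over plausible (N, p) pairs.
grid = [(N, p) for N in range(int(counts.max()), 61)
        for p in np.arange(0.05, 0.96, 0.01)]
N_hat, p_hat = max(grid, key=lambda g: log_likelihood(*g))
```

The replication is what separates abundance from detection: the mean count pins down the product N·p, while the visit-to-visit variance carries the extra information needed to split the two.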

  8. A Study of Slipper and Rail Wear Interaction at Low Speed

    DTIC Science & Technology

    2014-06-19

    [Garbled extraction from the report's front matter: figure-list entries (e.g., Figure 2.5, a plot of Montgomery pin-on-disk data with a curve fit for different pressure-velocity values; a preceding figure on gun barrel steel), nomenclature (normal force, normalized pressure, length of contact area, asperity radius, Mie-Grüneisen material parameter), and a fragment noting that the industrial revolution of the early 1900s drove the field of tribology, as many machines with a myriad of moving parts contained surfaces that wore against each other.]

  9. Probable autosomal recessive Marfan syndrome.

    PubMed Central

    Fried, K; Krakowsky, D

    1977-01-01

    A probable autosomal recessive mode of inheritance is described in a family with two affected sisters. The sisters showed the typical picture of Marfan syndrome and were of normal intelligence. Both parents and all four grandparents were personally examined and found to be normal. Homocystinuria was ruled out on repeated examinations. This family suggests genetic heterogeneity in Marfan syndrome and that in some rare families the mode of inheritance may be autosomal recessive. PMID:592353

  10. UAV remote sensing for precision agriculture

    NASA Astrophysics Data System (ADS)

    Vigneau, Nathalie; Chéron, Corentin; Mainfroy, Florent; Faroux, Romain

    2014-05-01

    Airinov offers farmers, scientists and experimenters (plant breeders, etc.) its technical skills in UAVs, cartography and agronomic remote sensing. The UAV is a 2-m-wingspan flying wing. It can carry either an RGB camera or a multispectral sensor, which records reflectance in 4 spectral bands. The spectral characteristics of the sensor are modular. Each spectral band lies between 400 and 850 nm and the FWHM (Full Width at Half Maximum) is between 10 and 40 nm. The spatial resolution varies according to sensor, flying height and user needs, from 15 cm/px for the multispectral sensor at 150 m to 1.5 cm/px for the RGB camera at 50 m. The flight is fully automatic thanks to the on-board autopilot, IMU (Inertial Measurement Unit) and GPS. Data processing (unvignetting, mosaicking, correction to reflectance) yields agronomic variables such as LAI (Leaf Area Index) or chlorophyll content for barley, wheat, rape and maize, as well as vegetation indices such as NDVI (Normalized Difference Vegetation Index). Using these data, Airinov can produce advice for farmers, such as nitrogen recommendations for rape. For scientists, Airinov offers trial plot monitoring by micro-plot vectorisation and numerical data extraction micro-plot by micro-plot. This can yield kinetic curves of LAI or NDVI to compare cover establishment for different genotypes, for example. Airinov's system is a new way to monitor plots, delivering many biophysical and biochemical parameters at high rate, high spatial resolution and high precision.

  11. Acoustic Scattering by Three-Dimensional Stators and Rotors Using the SOURCE3D Code. Volume 1; Analysis and Results

    NASA Technical Reports Server (NTRS)

    Meyer, Harold D.

    1999-01-01

    This report provides a study of rotor and stator scattering using the SOURCE3D Rotor Wake/Stator Interaction Code. SOURCE3D is a quasi-three-dimensional computer program that uses three-dimensional acoustics and two-dimensional cascade load response theory to calculate rotor and stator modal reflection and transmission (scattering) coefficients. SOURCE3D is at the core of the TFaNS (Theoretical Fan Noise Design/Prediction System), developed for NASA, which provides complete fully coupled (inlet, rotor, stator, exit) noise solutions for turbofan engines. The reason for studying scattering is that we must first understand the behavior of the individual scattering coefficients provided by SOURCE3D, before eventually understanding the more complicated predictions from TFaNS. To study scattering, we have derived a large number of scattering curves for vane and blade rows. The curves are plots of output wave power divided by input wave power (in dB units) versus vane/blade ratio. Some of these plots are shown in this report. All of the plots are provided in a separate volume. To assist in understanding the plots, formulas have been derived for special vane/blade ratios for which wavefronts are either parallel or normal to rotor or stator chords. From the plots, we have found that, for the most part, there was strong transmission and weak reflection over most of the vane/blade ratio range for the stator. For the rotor, there was little transmission loss.

  12. Recurrence plots of discrete-time Gaussian stochastic processes

    NASA Astrophysics Data System (ADS)

    Ramdani, Sofiane; Bouchara, Frédéric; Lagarde, Julien; Lesne, Annick

    2016-09-01

    We investigate the statistical properties of recurrence plots (RPs) of data generated by discrete-time stationary Gaussian random processes. We analytically derive the theoretical values of the probabilities of occurrence of recurrence points and consecutive recurrence points forming diagonals in the RP, with an embedding dimension equal to 1. These results allow us to obtain theoretical values of three measures: (i) the recurrence rate (REC), (ii) the percent determinism (DET), and (iii) the RP-based estimation of the ε-entropy κ(ε) in the sense of correlation entropy. We apply these results to two Gaussian processes, namely first-order autoregressive processes and fractional Gaussian noise. For these processes, we simulate a number of realizations and compare the RP-based estimations of the three selected measures to their theoretical values. These comparisons provide useful information on the quality of the estimations, such as the minimum required data length and threshold radius used to construct the RP.
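
The REC and DET measures can be estimated empirically from a simulated AR(1) series. This is a minimal sketch with embedding dimension 1, not the paper's analytical derivation; the DET here is a simple proxy counting points on diagonals of length at least 2, and the threshold choice is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a first-order autoregressive (AR(1)) Gaussian process.
n, phi = 500, 0.5
x = np.empty(n)
x[0] = rng.standard_normal()
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.standard_normal()

r = 0.5 * x.std()  # threshold radius (illustrative choice)

# Recurrence matrix with embedding dimension 1: |x_i - x_j| <= r.
rp = (np.abs(x[:, None] - x[None, :]) <= r).astype(int)

rec = rp.mean()  # recurrence rate REC

# DET proxy: fraction of recurrence points with a neighbour along the
# main-diagonal direction, i.e. lying on a diagonal line of length >= 2.
neighbor = np.zeros_like(rp)
neighbor[1:, 1:] |= rp[:-1, :-1]
neighbor[:-1, :-1] |= rp[1:, 1:]
det = (rp & neighbor).sum() / rp.sum()
```

Comparing such empirical values against the paper's closed-form expressions, over varying series length and radius, is exactly the kind of check the abstract describes.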

  13. Scaling and the frequency dependence of Nyquist plot maxima of the electrical impedance of the human thigh.

    PubMed

    Shiffman, Carl

    2017-11-30

    To define and elucidate the properties of reduced-variable Nyquist plots. Non-invasive measurements of the electrical impedance of the human thigh. A retrospective analysis of the electrical impedances of 154 normal subjects measured over the past decade shows that 'scaling' of the Nyquist plots for human thigh muscles is a property shared by healthy thigh musculature, irrespective of subject and the length of muscle segment. Here the term scaling signifies the near and sometimes 'perfect' coalescence of the separate X versus R plots into one 'reduced' Nyquist plot by the simple expedient of dividing R and X by Xm, the value of X at the reactance maximum. To the extent allowed by noise levels one can say that there is one 'universal' reduced Nyquist plot for the thigh musculature of healthy subjects. There is one feature of the Nyquist curves which is not 'universal', however, namely the frequency fm at which the maximum in X is observed. That is found to vary from 10 to 100 kHz, depending on subject and segment length. Analysis shows, however, that the mean value of 1/fm is an accurately linear function of segment length, though there is a small subject-to-subject random element as well. Also, following the recovery of an otherwise healthy victim of an ankle fracture demonstrates the clear superiority of measurements above about 800 kHz, where scaling is not observed, in contrast to measurements below about 400 kHz, where scaling is accurately obeyed. The ubiquity of 'scaling' casts new light on the interpretation of impedance results as they are used in electrical impedance myography and bioelectric impedance analysis.
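
The reduction step itself is simple to sketch: divide R and X by Xm, the reactance at its maximum, so curves from different segments can coalesce. A single-dispersion Cole-type impedance is assumed below purely for illustration; it is not the measured thigh data, and the parameter values are invented.

```python
import numpy as np

f = np.logspace(3, 6, 400)  # 1 kHz .. 1 MHz

def r_and_x(r_inf, delta_r, fc):
    # Single-dispersion impedance: Z = R_inf + dR / (1 + j f/fc).
    z = r_inf + delta_r / (1 + 1j * f / fc)
    return z.real, -z.imag  # resistance R, reactance X (positive)

curves = []
for r_inf, delta_r, fc in [(40.0, 60.0, 3e4), (60.0, 90.0, 5e4)]:
    R, X = r_and_x(r_inf, delta_r, fc)
    m = int(X.argmax())
    Xm, fm = X[m], f[m]                  # reactance maximum and its frequency
    curves.append((R / Xm, X / Xm, fm))  # reduced Nyquist coordinates
```

Because both invented parameter sets share the same R_inf/dR ratio, the two reduced X-versus-R curves coincide (up to grid resolution) while fm shifts with fc, mirroring the reported behaviour: one 'universal' reduced plot, but a segment-dependent fm.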

  14. A Study of the Magnetic Fingerprint of Tsunami Induced Deposits in the Ixtapa-Zihuatanejo Area (Western Mexico)

    NASA Astrophysics Data System (ADS)

    Goguitchaichrili, A.; Ramirez-Herrera, M.; Calvo-Rathert, M.; Aguilar, B.; Carrancho, Alonso; Morales, J.; Caballero, C. I.; Bautista, F.

    2013-05-01

    The Pacific coast of Mexico has repeatedly been exposed to destructive tsunamis. Recent studies have shown that rock-magnetic methods can be a promising approach for the identification of tsunami or storm induced deposits. We present new rock-magnetic and anisotropy of magnetic susceptibility results to try to distinguish tsunami deposits in the Ixtapa-Zihuatanejo area (Western Mexico). The sampled, 80 cm deep sequence is characterised by the presence of two anomalous sand beds within fine-grained coastal deposits. The lower sand bed is probably associated with the 14th March 1979 Petatlán earthquake (MW = 7.6), while the upper one originated in the September 21st 1985 Mexico earthquake (MW = 8.1). Rock magnetic experiments have shown significant variations within the analysed sequence. Thermomagnetic curves reveal two types of behaviour: one in the upper part of the sequence, above the first tsunami deposit, and another in the lower part, within and below it. Analysis of hysteresis parameter ratios in a Day plot also distinguishes two kinds of behaviour. The samples associated with the second tsunami plot in the PSD area, while specimens associated with the first tsunami and the time between both tsunamis display a very different trend, which can be ascribed to the production of a considerable amount of superparamagnetic grains, possibly due to pedogenic processes after the first tsunami. The studied profile is characterised by a sedimentary fabric with almost vertical minimum principal susceptibilities. The maximum susceptibility axis shows a declination angle D = 27, suggesting a NNE flow direction, which is the same for both tsunamis and normal currents. The standard AMS parameters display a significant enhancement within the transitional zone between both tsunamis. The study of rock-magnetic parameters may represent a useful tool for the identification and understanding of tsunami deposits.

  15. Statistical Analysis of 30 Years Rainfall Data: A Case Study

    NASA Astrophysics Data System (ADS)

    Arvind, G.; Ashok Kumar, P.; Girish Karthi, S.; Suribabu, C. R.

    2017-07-01

    Rainfall is a prime input for various engineering designs such as hydraulic structures, bridges and culverts, canals, storm water sewers and road drainage systems. A detailed statistical analysis of each region is essential to estimate the relevant input values for the design and analysis of engineering structures and also for crop planning. A rain gauge station located in Trichy district, where agriculture is the prime occupation, is selected for statistical analysis. The daily rainfall data for a period of 30 years are used to characterise normal rainfall, deficit rainfall, excess rainfall and seasonal rainfall at the selected circle headquarters. Further, the various plotting position formulae available are used to evaluate the return periods of monthly, seasonal and annual rainfall. This analysis will provide useful information for water resources planners, farmers and urban engineers to assess the availability of water and create storage accordingly. The mean, standard deviation and coefficient of variation of monthly and annual rainfall were calculated to check the rainfall variability. From the calculated results, the rainfall pattern is found to be erratic. The best-fit probability distribution was identified based on the minimum deviation between actual and estimated values. The scientific results and the analysis paved the way to determining the proper onset and withdrawal of the monsoon, which were used for land preparation and sowing.
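
The plotting-position step can be illustrated with two common formulae, Weibull T = (n+1)/m and Gringorten T = (n+0.12)/(m-0.44), applied to ranked annual totals. The rainfall values below are illustrative only, not the study's data.

```python
# Return-period estimation for ranked annual rainfall totals (mm).
annual_rainfall_mm = [780, 950, 640, 1120, 870, 990, 700, 1050, 820, 910]

n = len(annual_rainfall_mm)
ranked = sorted(annual_rainfall_mm, reverse=True)  # rank m = 1 for the largest

def weibull_return_period(m, n):
    return (n + 1) / m              # T = (n + 1) / m, in years

def gringorten_return_period(m, n):
    return (n + 0.12) / (m - 0.44)

# One row per rank: (rank, rainfall, Weibull T, Gringorten T).
table = [(m, value, weibull_return_period(m, n), gringorten_return_period(m, n))
         for m, value in enumerate(ranked, start=1)]
```

Plotting ranked totals against these return periods gives the frequency curve: for instance, the largest observation in a 10-year record gets a Weibull return period of 11 years.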

  16. Multifocal visual evoked potential and automated perimetry abnormalities in strabismic amblyopes.

    PubMed

    Greenstein, Vivienne C; Eggers, Howard M; Hood, Donald C

    2008-02-01

    To compare visual field abnormalities obtained with standard automated perimetry (SAP) to those obtained with the multifocal visual evoked potential (mfVEP) technique in strabismic amblyopes. Humphrey 24-2 visual fields (HVF) and mfVEPs were obtained from each eye of 12 strabismic amblyopes. For the mfVEP, amplitudes and latencies were analyzed and probability plots were derived. Multifocal VEP and HVF hemifields were abnormal if they had clusters of two or more contiguous points at p < 0.01, or three or more contiguous points at p < 0.05 with at least one at p < 0.01. An eye was abnormal if it had an abnormal hemifield. On SAP, amblyopic eyes had significantly higher foveal thresholds (p = 0.003) and lower mean deviation values (p = 0.005) than fellow eyes. For the mfVEP, 11 amblyopic and 6 fellow eyes were abnormal. Of the 11 amblyopic eyes, 6 were abnormal on SAP. The deficits extended from the center to mid periphery. Monocular mfVEP latencies were significantly decreased for amblyopic eyes compared with control eyes (p < 0.0002). Both techniques revealed deficits in visual function across the visual field in strabismic amblyopes, but the mfVEP revealed deficits in fellow eyes and in more amblyopic eyes. In addition, mfVEP response latencies for amblyopic eyes were shorter than normal.

  17. Surface-water-quality assessment of the upper Illinois River basin in Illinois, Indiana, and Wisconsin; spatial distribution of geochemicals in the fine fraction of streambed sediment, 1987

    USGS Publications Warehouse

    Fitzpatrick, Faith A.; Arnold, Terri L.; Colman, John A.

    1998-01-01

    Geochemical data for the upper Illinois River Basin are presented for concentrations of 39 elements in streambed sediment collected by the U.S. Geological Survey in the fall of 1987. These data were collected as part of the pilot phase of the National Water-Quality Assessment Program. A total of 372 sites were sampled, with 238 sites located on first- and second-order streams, and 134 sites located on main stems. Spatial distribution maps and exceedance probability plots are presented for aluminum, antimony, arsenic, barium, beryllium, boron, cadmium, calcium, carbon (total, inorganic, and organic), cerium, chromium, cobalt, copper, gallium, iron, lanthanum, lead, lithium, magnesium, manganese, mercury, molybdenum, neodymium, nickel, niobium, phosphorus, potassium, scandium, selenium, silver, sodium, strontium, sulfur, thorium, titanium, uranium, vanadium, yttrium, and zinc. For spatial distribution maps, concentrations of the elements are grouped into four ranges bounded by the minimum concentration, the 10th, 50th, and 90th percentiles, and the maximum concentrations. These ranges were selected to highlight streambed sediment with very low or very high element concentrations relative to the rest of the streambed sediment in the upper Illinois River Basin. Exceedance probability plots for each element display the differences, if any, in distributions between high- and low-order streams and may be helpful in determining differences between background and elevated concentrations.
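
The four-range grouping used for the spatial distribution maps, splitting concentrations at the 10th, 50th and 90th percentiles to flag very low or very high sites, can be sketched as follows. The concentrations are invented for illustration.

```python
import numpy as np

# Illustrative element concentrations at ten sites.
conc = np.array([3.1, 4.7, 2.2, 9.8, 5.5, 6.1, 1.4, 7.9, 5.0, 12.6])

# Class boundaries at the 10th, 50th and 90th percentiles.
p10, p50, p90 = np.percentile(conc, [10, 50, 90])

# Four ranges: 0 = below p10 (very low) .. 3 = above p90 (very high).
ranges = np.digitize(conc, [p10, p50, p90])

very_high = conc[ranges == 3]  # sites above the 90th percentile
```

The same sorted concentrations, paired with exceedance probabilities by group (low-order streams versus main stems), would give the exceedance-probability plots the abstract describes.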

  18. Stability Investigation of a Blunted Cone and a Blunted Ogive with a Flared Cylinder Afterbody at Mach Numbers from 0.30 to 2.85

    NASA Technical Reports Server (NTRS)

    Coltrane, Lucille C.

    1959-01-01

    A cone with a blunt nose tip and a 10.7 deg cone half angle and an ogive with a blunt nose tip and a 20 deg flared cylinder afterbody have been tested in free flight over a Mach number range of 0.30 to 2.85 and a Reynolds number range of 1 x 10(exp 6) to 23 x 10(exp 6). Time histories, cross plots of force and moment coefficients, and plots of the longitudinal force coefficient, rolling velocity, aerodynamic center, normal-force-curve slope, and dynamic stability are presented. With the center-of-gravity location at about 50 percent of the model length, the models were both statically and dynamically stable throughout the Mach number range. For the cone, the average aerodynamic center moved slightly forward with decreasing speeds and the normal-force-curve slope was fairly constant throughout the speed range. For the ogive, the average aerodynamic center remained practically constant and the normal-force-curve slope remained practically constant to a Mach number of approximately 1.6, where a rising trend is noted. Maximum drag coefficient for the cone, with reference to the base area, was approximately 0.6, and for the ogive, with reference to the area of the cylindrical portion, was approximately 2.1.

  19. Study on probability distribution of prices in electricity market: A case study of zhejiang province, china

    NASA Astrophysics Data System (ADS)

    Zhou, H.; Chen, B.; Han, Z. X.; Zhang, F. Q.

    2009-05-01

    The study of the probability density function and distribution function of electricity prices helps power suppliers and purchasers estimate their own operations accurately, and helps the regulator monitor the periods that deviate from the normal distribution. Based on the assumption of a normally distributed load and the non-linear characteristic of the aggregate supply curve, this paper derives the distribution of electricity prices as a function of the random load. The conclusion has been validated with electricity price data from the Zhejiang market. The results show that electricity prices obey a normal distribution approximately only when the supply-demand relationship is loose, whereas the prices deviate from the normal distribution and present a strong right-skewness characteristic. Finally, real electricity markets also display a narrow-peak characteristic when undersupply occurs.
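
The qualitative conclusion can be reproduced in a few lines: pass a normally distributed load through a nonlinear aggregate supply curve and compare price skewness in loose versus tight conditions. The supply curve and load parameters below are assumptions for illustration, not the Zhejiang market model.

```python
import numpy as np

rng = np.random.default_rng(2)

def clearing_price(load):
    # Assumed supply curve: flat at low load, steeply convex near capacity.
    return 20 + 5 * load + 200 * np.maximum(load - 0.7, 0.0) ** 2

loose = clearing_price(rng.normal(0.30, 0.05, 100_000))  # ample supply
tight = clearing_price(rng.normal(0.80, 0.05, 100_000))  # near capacity

def skewness(x):
    d = x - x.mean()
    return (d ** 3).mean() / (d ** 2).mean() ** 1.5

skew_loose, skew_tight = skewness(loose), skewness(tight)
```

In the loose regime the curve is locally linear, so prices stay approximately normal; near capacity the convexity produces the right-skewed deviation the paper reports.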

  20. Forage site selection by lesser snow geese during autumn staging on the Arctic National Wildlife Refuge, Alaska

    USGS Publications Warehouse

    Hupp, Jerry W.; Robertson, Donna G.

    1998-01-01

    Lesser snow geese (Chen caerulescens caerulescens) of the Western Canadian Arctic Population feed intensively for 2-4 weeks on the coastal plain of the Beaufort Sea in Canada and Alaska at the beginning of their autumn migration. Petroleum leasing proposed for the Alaskan portion of the staging area on the Arctic National Wildlife Refuge (ANWR) could affect staging habitats and their use by geese. Therefore, we studied the availability, distribution, and use by snow geese of tall and russet cotton-grass (Eriophorum angustifolium and E. russeolum, respectively) feeding habitats on the ANWR. We studied selection of feeding habitats at 3 spatial scales (feeding sites [0.06 m2], feeding patches [ca. 100 m2], and feeding areas [>1 ha]) during 1990-93. We used logistic regression analysis to discriminate differences in soil moisture and vegetation between 1,548 feeding sites where snow geese exploited individual cotton-grass plants and 1,143 unexploited sites at 61 feeding patches in 1990. Feeding likelihood increased with greater soil moisture and decreased where nonforage species were present. We tested the logistic regression model in 1991 by releasing human-imprinted snow geese into four 10 × 20-m enclosed plots where plant communities had been mapped, habitats sampled, and feeding probabilities calculated. Geese selected more feeding sites per square meter in areas of predicted high quality feeding habitat (feeding probability ≥ 0.6) than in medium (feeding probability = 0.3-0.59) or poor (feeding probability < 0.3) quality habitat (P < 0.0001). Geese increasingly used medium quality areas and spent more time feeding as trials progressed and forage was presumably reduced in high quality habitats. We examined relationships between underground biomass of plants, feeding probability, and surface microrelief at 474 0.06-m2 sites in 20 thermokarst pits in 1992. 
Feeding probability was correlated with the percentage of underground biomass composed of cotton-grass (r = 0.56). Feeding probability and relative availability of cotton-grass forage were highest in flooded soils along the ecotone of flooded and upland habitats. In 1992, we also used the logistic regression model to estimate availability of high quality feeding sites on 192 80 × 90-m plots that were randomly located on 24 study areas. A mean of 1.6% of the area sampled in each plot was classified as high quality feeding habitat at 23 of the study areas. Relative availability of high quality sites was highest in troughs, thermokarst pits, and water tracks because saturated soils in those microreliefs were dominated by cotton-grass. Relative availability of high quality sites was lower in saturated soils of basins (low-centered polygons, wet meadows, and strangmoor) because that microrelief was dominated by Carex spp. Most (63%) of the saturated area on the ANWR coastal plain was in basins. We examined distribution of feeding patches relative to microrelief in 49 snow goose feeding areas in 1993. Only 2.5% of the tundra in each feeding area was exploited by snow geese. Snow geese preferentially fed in thermokarst pits, water tracks, and troughs, and avoided basins and uplands. Feeding areas had more thermokarst pit but less basin microrelief than adjacent randomly-selected areas. Thermokarst pits and water tracks occurred most frequently in regions of the coastal plain where geese were observed most often during aerial surveys (1982-93). Microrelief influenced selection of feeding patches and feeding areas and may have affected snow goose distribution on the ANWR. Potential feeding patches were widely distributed but composed a small percentage (≤2.5%) of the tundra landscape and were highly interspersed with less suitable habitat. 
The Western Canadian Arctic Population probably used a large staging area on the Beaufort Sea coastal plain because snow geese exploited a spatially and temporally heterogeneous resource.
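
The habitat-quality classification described above, a logistic model of feeding probability thresholded into high (≥ 0.6), medium (0.3-0.59), and poor (< 0.3) classes, can be sketched as follows. The coefficients are invented for illustration and are not the study's fitted values.

```python
import math

def feeding_probability(soil_moisture, nonforage_present):
    # Feeding likelihood rises with soil moisture and falls where
    # nonforage species are present (illustrative coefficients).
    logit = -2.0 + 5.0 * soil_moisture - 1.5 * nonforage_present
    return 1.0 / (1.0 + math.exp(-logit))

def quality_class(p):
    return "high" if p >= 0.6 else "medium" if p >= 0.3 else "poor"

p_wet = feeding_probability(0.9, 0)  # saturated soil, cotton-grass only
p_dry = feeding_probability(0.2, 1)  # drier soil with nonforage species
```

Mapping such per-site probabilities across a plot is what yields the "percent of area in high quality habitat" figures reported in the abstract.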

  1. Derivation of the expressions for γ50 and D50 for different individual TCP and NTCP models

    NASA Astrophysics Data System (ADS)

    Stavreva, N.; Stavrev, P.; Warkentin, B.; Fallone, B. G.

    2002-10-01

    This paper presents a complete set of formulae for the position (D50) and the normalized slope (γ50) of the dose-response relationship based on the most commonly used radiobiological models for tumours as well as for normal tissues. The functional subunit response models (critical element and critical volume) are used in the derivation of the formulae for the normal tissue. Binomial statistics are used to describe the tumour control probability, the functional subunit response, as well as the normal tissue complication probability. The formulae are derived for the single hit and linear quadratic models of cell kill in terms of the number of fractions and dose per fraction. It is shown that the functional subunit models predict very steep, almost step-like, normal tissue individual dose-response relationships. Furthermore, the formulae for the normalized gradient depend on the cellular parameters α and β when written in terms of number of fractions, but not when written in terms of dose per fraction.
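
Such closed-form expressions can be checked numerically for a simple special case. The sketch below uses the one-fraction Poisson TCP model with single-hit cell kill, TCP(D) = exp(-N exp(-αD)), as a stand-in for the paper's full set of formulae, taking the normalized slope as γ50 = D50 · dTCP/dD at the 50% level; α and N are illustrative values.

```python
import numpy as np

alpha, N = 0.35, 1e7  # Gy^-1, initial clonogen number (illustrative)

def tcp(D):
    # Poisson TCP with single-hit cell kill, single fraction.
    return np.exp(-N * np.exp(-alpha * D))

# Position: TCP(D50) = 1/2  =>  D50 = ln(N / ln 2) / alpha
D50 = np.log(N / np.log(2)) / alpha

# Normalized slope: gamma50 = D50 * dTCP/dD|_{D50} = (ln 2 / 2) * alpha * D50
gamma50_analytic = 0.5 * np.log(2) * alpha * D50

# Central-difference check of the analytic slope.
h = 1e-4
gamma50_numeric = D50 * (tcp(D50 + h) - tcp(D50 - h)) / (2 * h)
```

The same numerical cross-check applies to each model family in the paper: derive D50 from the 50% condition, then compare the closed-form γ50 against a finite-difference slope.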

  2. Runoff and Leaching of Metolachlor from Mississippi River Alluvial Soil during Seasons of Average and Below-Average Rainfall

    USDA-ARS?s Scientific Manuscript database

    The movement of metolachlor via runoff and leaching from plots planted to corn on Mississippi River alluvial soil (Commerce silt loam) was measured for a six-year period, 1995-2000. The first three years were characterized by normal rainfall volume, the second three years by reduced rainfall. The ...

  3. 76 FR 28460 - Agency Information Collection Activities; Submission for OMB Review; Comment Request; Rock Burst...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-17

    ... develop a rock burst plan within 90 days after a rock burst has been experienced. Stress data are normally recorded on gauges and plotted on maps. This information is used for work assignments to ensure miner safety and to schedule correction work. This information collection is subject to the PRA. A Federal...

  4. Modeling the full-bridge series-resonant power converter

    NASA Technical Reports Server (NTRS)

    King, R. J.; Stuart, T. A.

    1982-01-01

    A steady state model is derived for the full-bridge series-resonant power converter. Normalized parametric curves for various currents and voltages are then plotted versus the triggering angle of the switching devices. The calculations are compared with experimental measurements made on a 50 kHz converter and a discussion of certain operating problems is presented.

  5. Acidity of Lakes and Impoundments in North-Central Minnesota

    Treesearch

    Elon S. Verry

    1981-01-01

    Measurements of lake and impoundment pH for several years, intensive sampling within years, and pH-calcium plots verify normal pH levels and do not show evidence of changes due to acid precipitation. These data in comparison with general lake data narrow the northern Lake States area in which rain or snow may cause lake acidification.

  6. Anorthosite belts, continental drift, and the anorthosite event

    USGS Publications Warehouse

    Herz, N.

    1969-01-01

    Most anorthosites lie in two principal belts when plotted on a predrift continental reconstruction. Anorthosite ages in the belts cluster around 1300 ± 200 million years and range from 1100 to 1700 million years. This suggests that anorthosites are the product of a unique cataclysmic event or a thermal event that was normal only during the earth's early history.

  7. Anorthosite belts, continental drift, and the anorthosite event.

    PubMed

    Herz, N

    1969-05-23

    Most anorthosites lie in two principal belts when plotted on a predrift continental reconstruction. Anorthosite ages in the belts cluster around 1300 ± 200 million years and range from 1100 to 1700 million years. This suggests that anorthosites are the product of a unique cataclysmic event or a thermal event that was normal only during the earth's early history.

  8. Sugar Beet Activities of the USDA-ARS East Lansing Conducted in Cooperation with Saginaw Valley Bean and Beet Farm During 2009

    USDA-ARS's Scientific Manuscript database

    Two evaluation plots were planted at the Saginaw Valley Research & Extension Center in Frankenmuth, MI in 2009; one agronomic trial and one combined Cercospora evaluation trial. All trials were planted, following normal fall and spring tillage operations, with a USDA-ARS modified John Deere/Almaco ...

  9. Modeling marine oily wastewater treatment by a probabilistic agent-based approach.

    PubMed

    Jing, Liang; Chen, Bing; Zhang, Baiyu; Ye, Xudong

    2018-02-01

    This study developed a novel probabilistic agent-based approach for modeling marine oily wastewater treatment processes. It begins by constructing a probability-based agent simulation model, followed by a global sensitivity analysis and a genetic algorithm-based calibration. The proposed modeling approach was tested through a case study of the removal of naphthalene from marine oily wastewater using UV irradiation. The removal of naphthalene was described by an agent-based simulation model using 8 types of agents and 11 reactions. Each reaction was governed by a probability parameter to determine its occurrence. The modeling results showed that the root mean square errors between modeled and observed removal rates were 8.73 and 11.03% for calibration and validation runs, respectively. Reaction competition was analyzed by comparing agent-based reaction probabilities, while agents' heterogeneity was visualized by plotting their real-time spatial distribution, showing a strong potential for reactor design and process optimization. Copyright © 2017 Elsevier Ltd. All rights reserved.
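
    The reaction-occurrence rule described above — each reaction fires per agent per time step according to its own probability parameter — can be sketched as independent Bernoulli draws. The reaction names and probabilities below are hypothetical, not the paper's calibrated values.

```python
import random

# Hypothetical per-step reaction probabilities (NOT the paper's values).
REACTION_PROBS = {"photolysis": 0.30, "oxidation": 0.15, "adsorption": 0.05}

def step(n_agents, rng):
    """One time step: a naphthalene agent is removed if any reaction
    occurs for it; competing reactions are independent Bernoulli draws."""
    survivors = 0
    for _ in range(n_agents):
        if not any(rng.random() < p for p in REACTION_PROBS.values()):
            survivors += 1
    return survivors

rng = random.Random(42)
n = 1000
history = [n]
for _ in range(10):
    n = step(n, rng)
    history.append(n)
```

    The agent count decays roughly geometrically, at a per-step survival probability equal to the product of the individual non-occurrence probabilities.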

  10. The Probable Ages of Asteroid Families

    NASA Technical Reports Server (NTRS)

    Harris, A. W.

    1993-01-01

    There has been considerable debate recently over the ages of the Hirayama families, and in particular whether some of the families are very young. It is a straightforward task to estimate the characteristic time of a collision between a body of a given diameter, d_o, by another body of diameter greater than or equal to d_1. What is less straightforward is to estimate the critical diameter ratio, d_1/d_o, above which catastrophic disruption occurs, from which one could infer probable ages of the Hirayama families, by knowing the diameter of the parent body, d_o. One can gain some insight into the probable value of d_1/d_o, and of the likely ages of existing families, from the plot below. I have computed the characteristic time between collisions in the asteroid belt of a size ratio greater than or equal to d_1/d_o, for 4 sizes of target asteroids, d_o. The solid curves to the lower right are the characteristic times for a single object...

  11. Geochemistry and structure of the Hawley Formation: Northwestern Massachusetts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, J.; Jacobi, R.

    1993-03-01

    The Hawley Formation in northwestern Massachusetts is composed of mafic and felsic (trondhjemitic) igneous units and black sulfidic schists and quartzites. The dominant lithology is a thinly foliated hbd-plag.-chl-qtz.-Fe carbonate schist with or without hornblende fascicles. Locally, this schist has alternating folia of chl/hbd and plag. and probably has a volcaniclastic protolith. Distinct pillows and tuffs are observed locally. In general, these schists have flat REE patterns at 10X chondrite and plot as IABs on discrimination diagrams. In the eastern part of the Hawley, some amphibolites show concave upward REE patterns, plot in the IAT or boninite field on discrimination diagrams, and appear to have boninitic affinities. The felsic lithologies are trondhjemitic and are intrusive into the IAT/boninite amphibolites. The intrusive nature is based on the presence of mafic xenoliths and intruded rafts of country rock in the trondhjemite as well as the occurrence of thin tabular trondhjemite bodies in sharp contact with the surrounding amphibolite. The trondhjemite varies from coarse-grained weakly foliated qtz-plag.-biotite gneiss with probable relict igneous zoned plagioclases to finer-grained well foliated qtz-plag.-garnet-hbd gneiss. REE patterns for the trondhjemites are weakly U-shaped with moderate to pronounced negative Eu anomalies. The trondhjemites, surrounding amphibolites, and black sulfidic schists and quartzites of the eastern part of the Hawley are intruded by massive, granular, medium-grained, plagioclase phenocryst amphibolites with chilled margins. These intrusive sills predate or are coeval with the dominant foliation in the Hawley. Both sills and country rock contain a contact-parallel foliation as well as a later foliation at a low angle to the earlier foliation. The sill amphibolites are high-TiO2, high-Zr varieties that plot as MORBs to WPBs on discrimination diagrams and exhibit slightly LREE-enriched MORB-like to T-MORB REE patterns.

  12. Saturation behavior: a general relationship described by a simple second-order differential equation.

    PubMed

    Kepner, Gordon R

    2010-04-13

    The numerous natural phenomena that exhibit saturation behavior, e.g., ligand binding and enzyme kinetics, have been approached, to date, via empirical and particular analyses. This paper presents a mechanism-free, and assumption-free, second-order differential equation, designed only to describe a typical relationship between the variables governing these phenomena. It develops a mathematical model for this relation, based solely on the analysis of the typical experimental data plot and its saturation characteristics. Its utility complements the traditional empirical approaches. For the general saturation curve, described in terms of its independent (x) and dependent (y) variables, a second-order differential equation is obtained that applies to any saturation phenomena. It shows that the driving factor for the basic saturation behavior is the probability of the interactive site being free, which is described quantitatively. Solving the equation relates the variables in terms of the two empirical constants common to all these phenomena, the initial slope of the data plot and the limiting value at saturation. A first-order differential equation for the slope emerged that led to the concept of the effective binding rate at the active site and its dependence on the calculable probability the interactive site is free. These results are illustrated using specific cases, including ligand binding and enzyme kinetics. This leads to a revised understanding of how to interpret the empirical constants, in terms of the variables pertinent to the phenomenon under study. The second-order differential equation revealed the basic underlying relations that describe these saturation phenomena, and the basic mathematical properties of the standard experimental data plot. It was shown how to integrate this differential equation, and define the common basic properties of these phenomena. 
The results regarding the importance of the slope and the new perspectives on the empirical constants governing the behavior of these phenomena led to an alternative perspective on saturation behavior kinetics. Their essential commonality was revealed by this analysis, based on the second-order differential equation.
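
    As an illustration (not the paper's derivation), a hyperbolic saturation curve is fully determined by the two empirical constants the abstract names: the initial slope of the data plot and the limiting value at saturation.

```python
def saturation(x, slope0, y_max):
    """Saturation curve parameterized only by its initial slope and its
    limiting value; equivalent to Michaelis-Menten with K = y_max/slope0."""
    return slope0 * x * y_max / (y_max + slope0 * x)
```

    At x = y_max/slope0 the curve reaches exactly half its limiting value, which is how the two constants jointly fix the half-saturation point.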

  13. Wildlife and habitat damage assessment from Hurricane Charley: recommendations for recovery of the J. N. "Ding" Darling National Wildlife Refuge Complex. [Final report to U.S. Fish and Wildlife Service

    USGS Publications Warehouse

    Meyers, J.M.; Langtimm, C.A.; Smith, T. J.; Pednault-Willett, K.

    2005-01-01

    On 13 August 2004, Hurricane Charley, the first of four hurricanes to strike Florida that year, hit the refuge; affected areas had 50% and sometimes 90% of their vegetation severely damaged (dead, broken tree stems, and tipped trees). Shell Mound Trail of JNDDNWR sustained catastrophic damage to its old growth mangrove forests. Direct storm mortality and injury to manatees in the area was probably slight. Because seagrass beds and manatee habitat extend beyond refuge boundaries, we recommended a regional approach with partner agencies to more thoroughly assess storm impacts and monitor recovery of seagrass and manatees. Besides intensive monitoring of waterbirds and their nesting habitat (pre- and post-storm), we recommend that the Mangrove Cuckoo be used as an indicator species for recovery of mangrove forests and also for monitoring songbirds at risk. Black-whiskered Vireo may be another potential indicator species to monitor in mangrove forests. Damaged vegetation should be monitored for recovery (permanent or long-term plots), especially where previous study plots have been established and with additional plots in mangrove forests of waterbird nesting islands and freshwater wetlands. Potential loss of wetlands may be prevented by water level monitoring, locating the positions (GPS-GIS) and maintaining existing water control structures, creating a GIS map of refuge with accurate vertical data, and monitoring and eradicating invasive plants. Invasive species, including Brazilian pepper (Schinus terebinthifolius) and air potato (Dioscorea bulbifera), were common in a very limited survey. As an important monitoring goal, we recommend that species presence-absence data analysis (with probability of detection) be used to determine changes in animal communities. This could possibly be accomplished by comparison to other storm-damaged and undamaged refuges in the Region. This information may be helpful to refuge managers when storms return in the future.

  14. Avoidance of voiding cystourethrography in infants younger than 3 months with Escherichia coli urinary tract infection and normal renal ultrasound.

    PubMed

    Pauchard, Jean-Yves; Chehade, Hassib; Kies, Chafika Zohra; Girardin, Eric; Cachat, Francois; Gehri, Mario

    2017-09-01

    Urinary tract infection (UTI) represents the most common bacterial infection in infants, and its prevalence increases with the presence of high-grade vesicoureteral reflux (VUR). However, voiding cystourethrography (VCUG) is invasive, and its indication in infants <3 months is not yet defined. This study aims to investigate, in infants aged 0-3 months, whether the presence of Escherichia coli versus non-E. coli bacteria and/or a normal or abnormal renal ultrasound (US) could obviate the need for VCUG. One hundred and twenty-two infants with a first febrile UTI were enrolled. High-grade VUR was defined by the presence of VUR grade ≥III. The presence of high-grade VUR was recorded using VCUG, and correlated with the presence of E. coli/non-E. coli UTI and with the presence of a normal/abnormal renal US. The Bayes theorem was used to calculate pretest and post-test probability. The probability of high-grade VUR was 3% in the presence of urinary E. coli infection. Adding a normal renal US finding decreased this probability to 1%. However, in the presence of non-E. coli bacteria, the probability of high-grade VUR was 26%, and adding an abnormal US finding further increased this probability to 55%. In infants aged 0-3 months with a first febrile UTI, the presence of E. coli and normal renal US findings allows VCUG to be safely avoided. Performing VCUG only in infants with UTI secondary to non-E. coli bacteria and/or abnormal US would save many unnecessary invasive procedures and limit radiation exposure, with a very low risk (<1%) of missing a high-grade VUR. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
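
    The pretest-to-post-test update is Bayes' theorem in odds form. As a sketch, a likelihood ratio near 3.5 for an abnormal US would reproduce the reported 26% → 55% shift; that LR is back-calculated here for illustration, not taken from the paper.

```python
def post_test_probability(pretest, likelihood_ratio):
    """Bayes' theorem in odds form: posterior odds = prior odds * LR."""
    prior_odds = pretest / (1.0 - pretest)
    post_odds = prior_odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)
```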

  15. Technology-enhanced Interactive Teaching of Marginal, Joint and Conditional Probabilities: The Special Case of Bivariate Normal Distribution

    PubMed Central

    Dinov, Ivo D.; Kamino, Scott; Bhakhrani, Bilal; Christou, Nicolas

    2014-01-01

    Data analysis requires subtle probability reasoning to answer questions like "What is the chance of event A occurring, given that event B was observed?" This generic question arises in discussions of many intriguing scientific questions such as "What is the probability that an adolescent weighs between 120 and 140 pounds given that they are of average height?" and "What is the probability of (monetary) inflation exceeding 4% and housing price index below 110?" To address such problems, learning some applied, theoretical or cross-disciplinary probability concepts is necessary. Teaching such courses can be improved by utilizing modern information technology resources. Students' understanding of multivariate distributions, conditional probabilities, correlation and causation can be significantly strengthened by employing interactive web-based science educational resources. Independent of the type of a probability course (e.g. majors, minors or service probability course, rigorous measure-theoretic, applied or statistics course) student motivation, learning experiences and knowledge retention may be enhanced by blending modern technological tools within the classical conceptual pedagogical models. We have designed, implemented and disseminated a portable open-source web-application for teaching multivariate distributions, marginal, joint and conditional probabilities using the special case of bivariate Normal distribution. A real adolescent height and weight dataset is used to demonstrate the classroom utilization of the new web-application to address problems of parameter estimation, univariate and multivariate inference. PMID:25419016
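
    A conditional probability of the kind the web-application demonstrates can be computed directly from the bivariate normal conditional distribution. The height/weight parameters below are illustrative placeholders, not estimates from the paper's dataset.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def conditional_weight_prob(lo, hi, height, mu_h=65.0, sd_h=3.5,
                            mu_w=130.0, sd_w=20.0, rho=0.5):
    """P(lo < weight < hi | height) under a bivariate normal model.
    All parameter values here are hypothetical, for illustration only."""
    mu_c = mu_w + rho * sd_w * (height - mu_h) / sd_h   # conditional mean
    sd_c = sd_w * math.sqrt(1.0 - rho ** 2)             # conditional sd
    return norm_cdf((hi - mu_c) / sd_c) - norm_cdf((lo - mu_c) / sd_c)

# Probability of weighing 120-140 lb given average height:
p = conditional_weight_prob(120, 140, height=65)
```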

  16. Technology-enhanced Interactive Teaching of Marginal, Joint and Conditional Probabilities: The Special Case of Bivariate Normal Distribution.

    PubMed

    Dinov, Ivo D; Kamino, Scott; Bhakhrani, Bilal; Christou, Nicolas

    2013-01-01

    Data analysis requires subtle probability reasoning to answer questions like What is the chance of event A occurring, given that event B was observed? This generic question arises in discussions of many intriguing scientific questions such as What is the probability that an adolescent weighs between 120 and 140 pounds given that they are of average height? and What is the probability of (monetary) inflation exceeding 4% and housing price index below 110? To address such problems, learning some applied, theoretical or cross-disciplinary probability concepts is necessary. Teaching such courses can be improved by utilizing modern information technology resources. Students' understanding of multivariate distributions, conditional probabilities, correlation and causation can be significantly strengthened by employing interactive web-based science educational resources. Independent of the type of a probability course (e.g. majors, minors or service probability course, rigorous measure-theoretic, applied or statistics course) student motivation, learning experiences and knowledge retention may be enhanced by blending modern technological tools within the classical conceptual pedagogical models. We have designed, implemented and disseminated a portable open-source web-application for teaching multivariate distributions, marginal, joint and conditional probabilities using the special case of bivariate Normal distribution. A real adolescent height and weight dataset is used to demonstrate the classroom utilization of the new web-application to address problems of parameter estimation, univariate and multivariate inference.

  17. CProb: a computational tool for conducting conditional probability analysis.

    PubMed

    Hollister, Jeffrey W; Walker, Henry A; Paul, John F

    2008-01-01

    Conditional probability is the probability of observing one event given that another event has occurred. In an environmental context, conditional probability helps to assess the association between an environmental contaminant (i.e., the stressor) and the ecological condition of a resource (i.e., the response). These analyses, when combined with controlled experiments and other methodologies, show great promise in evaluating ecological conditions from observational data and in defining water quality and other environmental criteria. Current applications of conditional probability analysis (CPA) are largely done via scripts or cumbersome spreadsheet routines, which may prove daunting to end-users and do not provide access to the underlying scripts. Combining spreadsheets with scripts eases computation through a familiar interface (i.e., Microsoft Excel) and creates a transparent process through full accessibility to the scripts. With this in mind, we developed a software application, CProb, as an Add-in for Microsoft Excel with R, R(D)com Server, and Visual Basic for Applications. CProb calculates and plots scatterplots, empirical cumulative distribution functions, and conditional probability. In this short communication, we describe CPA, our motivation for developing a CPA tool, and our implementation of CPA as a Microsoft Excel Add-in. Further, we illustrate the use of our software with two examples: a water quality example and a landscape example. CProb is freely available for download at http://www.epa.gov/emap/nca/html/regions/cprob.
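
    The core CPA computation — the probability of observing a degraded response given that the stressor exceeds each candidate threshold — reduces to a few lines. The data below are a toy example; CProb itself wraps R scripts behind an Excel interface.

```python
def conditional_probabilities(stressor, impaired, thresholds):
    """For each threshold t, estimate P(impaired | stressor >= t)
    as the impaired fraction among observations meeting the condition."""
    out = []
    for t in thresholds:
        hits = [impaired[i] for i, s in enumerate(stressor) if s >= t]
        out.append(sum(hits) / len(hits) if hits else float("nan"))
    return out

# Toy data: ecological condition worsens as the stressor increases.
probs = conditional_probabilities([1, 2, 3, 4, 5], [0, 0, 1, 1, 1], [0, 3, 5])
```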

  18. Geochemical exploration for copper-nickel deposits in the cool-humid climate of northeastern Minnesota

    USGS Publications Warehouse

    Miller, W.R.; Ficklin, W.H.; McHugh, J.B.

    1992-01-01

    Water was used as a medium for geochemical exploration to detect copper-nickel mineralization along the basal zone of the Duluth Complex. Ni2+ is the most important pathfinder for the detection of the mineralized rocks, followed by Cu2+ and SO42- and to a lesser extent Mg2+ and SiO2. A normalized sum plot using these species defines the mineralization more consistently than a single-element plot, mainly because the absence of one variable does not significantly influence the normalized sum value. A hydrogeochemical survey was conducted in an area of known copper-nickel mineralization in the cool-humid climate of northeastern Minnesota. The area is covered with glacial drift, and wetlands are abundant. Modeling of the chemistry of waters indicates that the waters are oxidizing and have a pH of 7 or less. The most important pathfinder species in the waters, Cu2+, Ni2+, and SO42-, are derived from the simple weathering of sulfide minerals and are mobile in the waters in this environment. Plots of Cu and Ni concentrations in soils show that Cu followed by Ni are the most useful indicator elements for delineating copper-nickel mineralization. The ability of soils and water to delineate the mineralization supports the use of both media for geochemical exploration in this cool-humid environment. In the wetlands, abundant water is available and soils are scarce or absent; where soils are abundant, waters are generally scarce or absent. The use of both media is recommended for geochemical exploration in this environment. ?? 1992.
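
    The normalized sum described above can be sketched as follows: each species is scaled to its own maximum across samples before summing, so the absence or low value of one variable cannot dominate the result. Concentrations and the helper name are hypothetical.

```python
def normalized_sum(samples):
    """samples: dict mapping species -> per-site concentrations.
    Scale each species to its own maximum, then sum scaled values per
    site; a single extreme variable therefore cannot swamp the index."""
    n_sites = len(next(iter(samples.values())))
    sums = [0.0] * n_sites
    for concentrations in samples.values():
        peak = max(concentrations)
        for i, value in enumerate(concentrations):
            sums[i] += value / peak if peak else 0.0
    return sums

scores = normalized_sum({"Ni": [1, 2, 4], "Cu": [10, 20, 40]})
```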

  19. GRAPHIC REANALYSIS OF THE TWO NINDS-TPA TRIALS CONFIRMS SUBSTANTIAL TREATMENT BENEFIT

    PubMed Central

    Saver, Jeffrey L.; Gornbein, Jeffrey; Starkman, Sidney

    2010-01-01

    Background of Comment/Review Multiple statistical analyses of the two NINDS-TPA Trials have confirmed study findings of benefit of fibrinolytic therapy. A recent graphic analysis departed from best practices in the visual display of quantitative information by failing to take into account the skewed functional importance of NIH Stroke Scale raw scores and by scaling change axes at up to twenty times the range achievable by individual patients. Methods Using the publicly available datasets of the 2 NINDS-TPA Trials, we generated a variety of figures appropriate to the characteristics of acute stroke trial data. Results A diverse array of figures all visually delineated substantial benefits of fibrinolytic therapy, including: bar charts of normalized gain and loss; stacked bar, bar, and matrix plots of clinically relevant ordinal ranks; a time series stacked line plot of continuous scale disability weights; and line plot, bubble chart, and person icon array graphs of joint outcome table analysis. The achievable change figure showed substantially greater improvement among TPA than placebo patients, median 66.7% (IQR 0–92.0) vs 50.0% (IQR −7.1 – 80.0), p=0.003. Conclusions On average, under 3 hour patients treated with TPA recovered two-thirds of the way towards fully normal, while placebo patients improved only half of the way. Graphical analyses of the two NINDS-TPA trials, when performed according to best practices, are a useful means of conveying details about patient response to therapy not fully delineated by summary statistics, and confirm a valuable treatment benefit of under 3 hour fibrinolytic therapy in acute stroke. PMID:20829518

  20. Influences of Normalization Method on Biomarker Discovery in Gas Chromatography-Mass Spectrometry-Based Untargeted Metabolomics: What Should Be Considered?

    PubMed

    Chen, Jiaqing; Zhang, Pei; Lv, Mengying; Guo, Huimin; Huang, Yin; Zhang, Zunjian; Xu, Fengguo

    2017-05-16

    Data reduction techniques in gas chromatography-mass spectrometry-based untargeted metabolomics have made the subsequent workflow of data analysis more lucid. However, the normalization process still perplexes researchers, and its effects are often ignored. In order to reveal the influences of normalization method, five representative normalization methods (mass spectrometry total useful signal, median, probabilistic quotient normalization, remove unwanted variation-random, and systematic ratio normalization) were compared in three real data sets with different types. First, data reduction techniques were used to refine the original data. Then, quality control samples and relative log abundance plots were utilized to evaluate the unwanted variations and the efficiencies of the normalization process. Furthermore, the potential biomarkers which were screened out by the Mann-Whitney U test, receiver operating characteristic curve analysis, random forest, and the feature selection algorithm Boruta in different normalized data sets were compared. The results indicated the determination of the normalization method was difficult because the commonly accepted rules were easy to fulfill but different normalization methods had unforeseen influences on both the kind and number of potential biomarkers. Lastly, an integrated strategy for normalization method selection was recommended.
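
    Of the methods named above, probabilistic quotient normalization (PQN) is the most algorithmic: each sample is divided by the median of its feature-wise quotients against a reference profile. A minimal sketch, using the feature-wise median across samples as the reference:

```python
import statistics

def pqn(samples):
    """Probabilistic quotient normalization. samples: list of equal-length
    feature vectors. Each sample is scaled by the median of its quotients
    against a reference profile (here the feature-wise median)."""
    reference = [statistics.median(col) for col in zip(*samples)]
    normalized = []
    for sample in samples:
        quotients = [s / r for s, r in zip(sample, reference) if r]
        factor = statistics.median(quotients)  # most probable dilution
        normalized.append([s / factor for s in sample])
    return normalized

# Two samples differing only by a 2x dilution collapse onto one profile.
result = pqn([[1, 2, 3], [2, 4, 6]])
```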

  1. A theory of the Io phase asymmetry of the Jovian decametric radiation

    NASA Technical Reports Server (NTRS)

    Hashimoto, K.; Goldstein, M. L.

    1982-01-01

    An explanation of an asymmetry in the occurrence probability of the Io-dependent Jovian decametric radiation is proposed. Io generates stronger Alfven waves toward the south when it is in the northern part of the torus. This wave then generates decametric radiation in the northern ionosphere after it reflects in the southern ionosphere. The asymmetry then results from computing the propagation time of the Alfven wave along this trajectory. The ray paths of the decameter radiation are calculated using a three-dimensional ray-tracing program in the Jovian ionosphere. Variations in the expected probability plots are computed for two models of the Jovian ionosphere and global magnetic field, as well as for several choices of the ratio of the radiated frequency to the X-mode cutoff frequency.

  2. Manual discrimination of force

    NASA Technical Reports Server (NTRS)

    Pang, Xiao-Dong; Tan, Hong-Z.; Durlach, Nathaniel I.

    1991-01-01

    Optimal design of human-machine interfaces for teleoperators and virtual-environment systems which involve the tactual and kinesthetic modalities requires knowledge of the human's resolving power in these modalities. The resolution of the interface should be appropriately matched to that of the human operator. We report some preliminary results on the ability of the human hand to distinguish small differences in force under a variety of conditions. Experiments were conducted on force discrimination with the thumb pushing an interface that exerts a constant force over the pushing distance and the index finger pressing against a fixed support. The dependence of the sensitivity index d' on force increment can be fit by a straight line through the origin and the just-noticeable difference (JND) in force can thus be described by the inverse of the slope of this line. The receiver operating characteristic (ROC) was measured by varying the a priori probabilities of the two alternatives, reference force and reference force plus an increment, in one-interval, two-alternative, forced-choice experiments. When plotted on normal deviate coordinates, the ROC's were roughly straight lines of unit slope, thus supporting the assumption of equal-variance normal distributions and the use of the conventional d' measure. The JND was roughly 6-8 percent for reference force ranging from 2.5 to 10 newtons, pushing distance from 5 to 30 mm, and initial finger-span from 45 to 125 mm. Also, the JND remained the same when the subjects were instructed to change the average speed of pushing from 23 to 153 mm/sec. The pushing was terminated by reaching either a wall or a well, and the JND's were essentially the same in both cases.
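
    The sensitivity index used above follows the equal-variance normal model, under which d' is the difference of the normal deviates of the hit and false-alarm rates; a unit-slope ROC on normal-deviate coordinates corresponds to a single d' across criteria. A minimal sketch:

```python
from statistics import NormalDist

def d_prime(hit_rate, fa_rate):
    """Equal-variance normal sensitivity index: d' = z(H) - z(F),
    where z is the inverse standard normal CDF."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)
```

    The JND then falls out as the force increment at which d' reaches a chosen criterion, i.e. the inverse of the fitted slope of d' versus increment.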

  3. Primary Epstein-Barr virus infection and probable parvovirus B19 reactivation resulting in fulminant hepatitis and fulfilling five of eight criteria for hemophagocytic lymphohistiocytosis.

    PubMed

    Karrasch, Matthias; Felber, Jörg; Keller, Peter M; Kletta, Christine; Egerer, Renate; Bohnert, Jürgen; Hermann, Beate; Pfister, Wolfgang; Theis, Bernhard; Petersen, Iver; Stallmach, Andreas; Baier, Michael

    2014-11-01

    A case of primary Epstein-Barr virus (EBV) infection/parvovirus B19 reactivation fulfilling five of eight criteria for hemophagocytic lymphohistiocytosis (HLH) is presented. Despite two coinciding viral infections, massive splenomegaly, and fulminant hepatitis, the patient had a good clinical outcome, probably due to an early onset form of HLH with normal leukocyte count, normal natural killer (NK) cell function, and a lack of hemophagocytosis.

  4. High throughput nonparametric probability density estimation.

    PubMed

    Farmer, Jenny; Jacobs, Donald

    2018-01-01

    In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample size invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists under and over fitting data as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.

  5. High throughput nonparametric probability density estimation

    PubMed Central

    Farmer, Jenny

    2018-01-01

    In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample size invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists under and over fitting data as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference. PMID:29750803

  6. Ordinal probability effect measures for group comparisons in multinomial cumulative link models.

    PubMed

    Agresti, Alan; Kateri, Maria

    2017-03-01

    We consider simple ordinal model-based probability effect measures for comparing distributions of two groups, adjusted for explanatory variables. An "ordinal superiority" measure summarizes the probability that an observation from one distribution falls above an independent observation from the other distribution, adjusted for explanatory variables in a model. The measure applies directly to normal linear models and to a normal latent variable model for ordinal response variables. It equals Φ(β/√2) for the corresponding ordinal model that applies a probit link function to cumulative multinomial probabilities, for standard normal cdf Φ and effect β that is the coefficient of the group indicator variable. For the more general latent variable model for ordinal responses that corresponds to a linear model with other possible error distributions and corresponding link functions for cumulative multinomial probabilities, the ordinal superiority measure equals exp(β)/[1+exp(β)] with the log-log link and equals approximately exp(β/√2)/[1+exp(β/√2)] with the logit link, where β is the group effect. Another ordinal superiority measure generalizes the difference of proportions from binary to ordinal responses. We also present related measures directly for ordinal models for the observed response that need not assume corresponding latent response models. We present confidence intervals for the measures and illustrate with an example. © 2016, The International Biometric Society.
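
    The three closed forms can be sketched directly, assuming the standard latent-variable results: Φ(β/√2) is exact under the probit link, the expit of β is exact under the log-log link, and the logit form is an approximation.

```python
import math

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def ordinal_superiority(beta, link="probit"):
    """P(observation from group 1 exceeds one from group 0), given the
    group-effect coefficient beta of a cumulative link model."""
    if link == "probit":                       # exact
        return norm_cdf(beta / math.sqrt(2.0))
    if link == "loglog":                       # exact
        return math.exp(beta) / (1.0 + math.exp(beta))
    if link == "logit":                        # approximation
        b = beta / math.sqrt(2.0)
        return math.exp(b) / (1.0 + math.exp(b))
    raise ValueError(link)
```

    At β = 0 every link returns 0.5, i.e. no ordinal superiority of either group.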

  7. Location Prediction Based on Transition Probability Matrices Constructing from Sequential Rules for Spatial-Temporal K-Anonymity Dataset

    PubMed Central

    Liu, Zhao; Zhu, Yunhong; Wu, Chenxue

    2016-01-01

    Spatial-temporal k-anonymity has become a mainstream approach among techniques for protection of users’ privacy in location-based services (LBS) applications, and has been applied to several variants such as LBS snapshot queries and continuous queries. Analyzing large-scale spatial-temporal anonymity sets may benefit several LBS applications. In this paper, we propose two location prediction methods based on transition probability matrices constructed from sequential rules for spatial-temporal k-anonymity datasets. First, we define single-step sequential rules mined from sequential spatial-temporal k-anonymity datasets generated from continuous LBS queries for multiple users. We then construct transition probability matrices from mined single-step sequential rules, and normalize the transition probabilities in the transition matrices. Next, we regard a mobility model for an LBS requester as a stationary stochastic process and compute the n-step transition probability matrices by raising the normalized transition probability matrices to the power n. Furthermore, we propose two location prediction methods: rough prediction and accurate prediction. The former yields the probabilities of arriving at target locations along simple paths that include only current locations, target locations and transition steps. By iteratively combining the probabilities for simple paths with n steps and the probabilities for detailed paths with n-1 steps, the latter method calculates transition probabilities for detailed paths with n steps from current locations to target locations. Finally, we conduct extensive experiments that verify the correctness and flexibility of our proposed algorithms. PMID:27508502
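The row-normalization and matrix-power steps described above can be sketched directly; the single-step transition counts below are hypothetical stand-ins for counts mined from sequential rules:

```python
import numpy as np

# Hypothetical single-step transition counts between four anonymized
# locations, as might be mined from sequential rules.
counts = np.array([[10, 5, 0, 5],
                   [ 2, 8, 8, 2],
                   [ 0, 4, 12, 4],
                   [ 6, 0, 6, 8]], dtype=float)

# Row-normalize the counts into a stochastic transition matrix P.
P = counts / counts.sum(axis=1, keepdims=True)

# Treating the mobility model as a stationary Markov chain, the n-step
# transition probabilities are P raised to the power n.
n = 3
P_n = np.linalg.matrix_power(P, n)

# Probability of reaching location 2 from location 0 in n steps:
print(P_n[0, 2])
```

Every row of both P and P_n sums to 1, so each row remains a valid probability distribution over target locations.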

  8. Usefulness of the novel risk estimation software, Heart Risk View, for the prediction of cardiac events in patients with normal myocardial perfusion SPECT.

    PubMed

    Sakatani, Tomohiko; Shimoo, Satoshi; Takamatsu, Kazuaki; Kyodo, Atsushi; Tsuji, Yumika; Mera, Kayoko; Koide, Masahiro; Isodono, Koji; Tsubakimoto, Yoshinori; Matsuo, Akiko; Inoue, Keiji; Fujita, Hiroshi

    2016-12-01

    Myocardial perfusion single-photon emission-computed tomography (SPECT) can predict cardiac events in patients with coronary artery disease with high accuracy; however, pseudo-negative cases sometimes occur. Heart Risk View, which is based on the prospective cohort study (J-ACCESS), is a software package for estimating cardiac event probability. We examined whether Heart Risk View was useful for evaluating cardiac risk in patients with normal myocardial perfusion SPECT (MPS). Of 3461 consecutive patients who underwent MPS to detect myocardial ischemia, those with normal MPS were enrolled in this study (n = 698). We calculated the cardiac event probability with Heart Risk View and followed up the patients for 3.8 ± 2.4 years. Cardiac events were defined as cardiac death, non-fatal myocardial infarction, and heart failure requiring hospitalization. During the follow-up period, 21 patients (3.0%) had cardiac events. The event probability calculated by Heart Risk View was higher in the event group (5.5 ± 2.6 vs. 2.9 ± 2.6%, p < 0.001). According to the receiver-operating characteristic curve, the cut-off point of the event probability for predicting cardiac events was 3.4% (sensitivity 0.76, specificity 0.72, and AUC 0.85). Kaplan-Meier curves revealed a higher event rate in the high-event-probability group by the log-rank test (p < 0.001). Although myocardial perfusion SPECT is useful for the prediction of cardiac events, risk estimation by Heart Risk View adds more prognostic information, especially in patients with normal MPS.

  9. Columbia: The first five flights entry heating data series. Volume 4: The lower windward wing 50 percent and 80 percent semispans

    NASA Technical Reports Server (NTRS)

    Williams, S. D.

    1983-01-01

    Entry heating flight data and wind tunnel data on the lower wing 50% and 80% Semi-Spans are presented for the first five flights of the Space Shuttle Orbiter. The heating rate data is presented in terms of normalized film heat transfer coefficients as a function of angle-of-attack, Mach number, and Normal Shock Reynolds number. The surface heating rates and temperatures were obtained via the JSC NONLIN/INVERSE computer program. Time history plots of the surface heating rates and temperatures are also presented.

  10. NRL Glider Data Report for the Shelf-Slope Experiment

    DTIC Science & Technology

    2017-09-12

    until 16 February, 2016. The glider made a total of 349 survey profiles within the geographical box bounded between 28.95N to 29.25N and 88.6W to 88.25W...12-09-2017 Memorandum Report Gulf of Mexico Oceanography Ocean optics Glider survey 73-1D52-05-5 The University of Southern Mississippi 118 College... survey corresponds to when Salinity increased, probably in water less affected by Mississippi outflow. 14 Figure 13. Chlorophyll waterfall image plot

  11. Robust Rapid Change-Point Detection in Multi-Sensor Data Fusion and Behavior Research

    DTIC Science & Technology

    2011-02-25

    size. The specific data motivating our re- search concerns male thyroid cancer cases (with malignant behavior) in New Mexico during 1973-2005. The data...epidemiology and (bio)surveillance is to determine whether or not the risk for male thyroid cancer increases over time. The term risk here essentially means the...probability of developing thyroid cancer in a given year, which can be characterized by the incidence rate per 100,000 (male) population; see the plot

  12. Nitrogen Fixation Inputs in Pasture and Early Successional Forest in the Brazilian Amazon Region: Evidence From a Claybox Mesocosm Study

    NASA Astrophysics Data System (ADS)

    Davidson, Eric A.; Markewitz, Daniel; de O. Figueiredo, Ricardo; de Camargo, Plínio B.

    2018-02-01

    The role of biological nitrogen fixation (BNF) during secondary forest succession and in tropical pastures has been investigated and debated for several decades. Here we present results of a replicated experimental study in a degraded cattle pasture of eastern Amazonia using mass balance and a 15N tracer in lined soil pit mesocosms with three treatments: (1) plant-free control plots, (2) pasture grass Brachiaria brizantha, and (3) regrowth of early successional secondary forest species. Accumulation of N in grass biomass slightly exceeded estimates of net N mineralization from the plant-free control plots but was within the margin of error, so inputs of BNF may not have been needed. In contrast, the secondary forest vegetation accumulated about 3 times as much biomass N annually as the net N mineralization estimate, suggesting at least some role for BNF. Based on isotopic and mass measurements of N-fixing species, BNF was estimated to contribute at least 27 ± 3% of mean annual plant uptake in the secondary forest regrowth vegetation plots. Although BNF is probably important for recuperation of tropical secondary forests following land use change, the majority of the N taken up by both grasses and secondary forest regrowth arose from mineralization of the stocks of soil N.

  13. A brief introduction to probability.

    PubMed

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

    The theory of probability has been debated for centuries: back in 1600, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution," a function providing the probabilities of occurrence of the different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, which is the most relevant distribution applied in statistical analysis.
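As a concrete instance of reading a probability off a distribution, a minimal sketch (not from the paper) that evaluates normal probabilities through the error function:

```python
import math

def normal_cdf(x, mu=0.0, sigma=1.0):
    """P(X <= x) for X ~ N(mu, sigma^2), via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def normal_prob_between(a, b, mu=0.0, sigma=1.0):
    """P(a <= X <= b) as a difference of CDF values."""
    return normal_cdf(b, mu, sigma) - normal_cdf(a, mu, sigma)

# About 68% of a normal distribution lies within one standard deviation:
print(normal_prob_between(-1.0, 1.0))  # ≈ 0.6827
```

The same difference-of-CDFs pattern applies to any continuous distribution; only the CDF changes.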

  14. Characterization of the efficiency of microbore liquid chromatography columns by van Deemter and kinetic plot analysis.

    PubMed

    Hetzel, Terence; Loeker, Denise; Teutenberg, Thorsten; Schmidt, Torsten C

    2016-10-01

    The efficiency of miniaturized liquid chromatography columns with inner diameters between 200 and 300 μm has been investigated using a dedicated micro-liquid chromatography system. Fully porous, core-shell and monolithic commercially available stationary phases were compared applying van Deemter and kinetic plot analysis. The sub-2 μm fully porous as well as the 2.7 μm core-shell particle packed columns showed superior efficiency and similar values for the minimum reduced plate heights (2.56-2.69) before correction for extra-column contribution compared to normal-bore columns. Moreover, the influence of extra-column contribution was investigated to demonstrate the difference between apparent and intrinsic efficiency by replacing the column by a zero dead volume union to determine the band spreading caused by the system. It was demonstrated that 72% of the intrinsic efficiency could be reached. The results of the kinetic plot analysis indicate the superior performance of the sub-2 μm fully porous particle packed column for ultra-fast liquid chromatography. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Seasonal LAI in slash pine estimated with LANDSAT TM

    NASA Technical Reports Server (NTRS)

    Curran, Paul J.; Dungan, Jennifer L.; Gholz, Henry L.

    1990-01-01

    The leaf area index (LAI, total area of leaves per unit area of ground) of most forest canopies varies throughout the year, yet for logistical reasons it is difficult to estimate anything more detailed than a seasonal maximum LAI. To determine if remotely sensed data can be used to estimate LAI seasonally, field measurements of LAI were compared to normalized difference vegetation index (NDVI) values derived using LANDSAT Thematic Mapper (TM) data, for 16 fertilized and control slash pine plots on 3 dates. Linear relationships existed between NDVI and LAI with R(sup 2) values of 0.35, 0.75, and 0.86 for February 1988, September 1988, and March, 1989, respectively. This is the first reported study in which NDVI is related to forest LAI recorded during the month of sensor overpass. Predictive relationships based on data from eight of the plots were used to estimate the LAI of the other eight plots with a root-mean-square error of 0.74 LAI, which is 15.6 percent of the mean LAI. This demonstrates the potential use of LANDSAT TM data for studying seasonal dynamics in forest canopies.
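The kind of NDVI-LAI relationship reported above can be reproduced in outline with ordinary least squares; the eight plot values below are illustrative, not the study's data:

```python
import numpy as np

# Hypothetical plot-level measurements: NDVI from LANDSAT TM and
# field-measured LAI (values illustrative only).
ndvi = np.array([0.45, 0.50, 0.55, 0.60, 0.65, 0.70, 0.72, 0.78])
lai  = np.array([2.1, 2.6, 3.0, 3.8, 4.1, 4.9, 5.0, 5.8])

# Least-squares line LAI = a*NDVI + b and its coefficient of determination.
a, b = np.polyfit(ndvi, lai, 1)
pred = a * ndvi + b
ss_res = np.sum((lai - pred) ** 2)
ss_tot = np.sum((lai - lai.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot

# Root-mean-square error of the fitted predictions, in LAI units:
rmse = np.sqrt(np.mean((lai - pred) ** 2))
print(round(r2, 3), round(rmse, 3))
```

In the study the fit was done on half the plots and the RMSE evaluated on the held-out half, which is the more honest measure of predictive skill.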

  16. Clinicians' perceptions of the value of ventilation-perfusion scans.

    PubMed

    Siegel, Alan; Holtzman, Stephen R; Bettmann, Michael A; Black, William C

    2004-07-01

    The goal of this investigation was to understand clinicians' perceptions of the probability of pulmonary embolism as a function of V/Q scan results of normal, low, intermediate, and high probability. A questionnaire was developed and distributed to 429 clinicians at a single academic medical center. The response rate was 44% (188 of 429). The questions included level of training, specialty, probability of PE given 1 of the 4 V/Q scan results, and estimations of the charges for V/Q scanning and pulmonary angiography, and estimations of the risks of pulmonary angiography. The medians and ranges for the probability of pulmonary embolism given a normal, low, intermediate, and high probability V/Q scan result were 2.5% (0-30), 12.5% (0.5-52.5), 41.25% (5-75), and 85% (5-100), respectively. Eleven percent (21 of 188) of the respondents listed the probability of PE in patients with a low probability V/Q scan as being 5% or less, and 33% (62 of 188) listed the probability of PE given an intermediate probability scan as 50% or greater. The majority correctly identified the rate of serious complications of pulmonary arteriography, but many respondents underestimated the charge for V/Q scans and pulmonary arteriography. A substantial minority of clinicians do not understand the probability of pulmonary embolism in patients with low and intermediate probability ventilation-perfusion scans. More quantitative reporting of results is recommended. This could be particularly important because VQ scans are used less frequently but are still needed in certain clinical situations.

  17. Probabilities of Dilating Vesicoureteral Reflux in Children with First Time Simple Febrile Urinary Tract Infection, and Normal Renal and Bladder Ultrasound.

    PubMed

    Rianthavorn, Pornpimol; Tangngamsakul, Onjira

    2016-11-01

    We evaluated risk factors and assessed predicted probabilities for grade III or higher vesicoureteral reflux (dilating reflux) in children with a first simple febrile urinary tract infection and normal renal and bladder ultrasound. Data for 167 children 2 to 72 months old with a first febrile urinary tract infection and normal ultrasound were compared between those who had dilating vesicoureteral reflux (12 patients, 7.2%) and those who did not. Exclusion criteria consisted of history of prenatal hydronephrosis or familial reflux and complicated urinary tract infection. The logistic regression model was used to identify independent variables associated with dilating reflux. Predicted probabilities for dilating reflux were assessed. Patient age and prevalence of non-Escherichia coli bacteria were greater in children who had dilating reflux compared to those who did not (p = 0.02 and p = 0.004, respectively). Gender distribution was similar between the 2 groups (p = 0.08). In multivariate analysis older age and non-E. coli bacteria independently predicted dilating reflux, with odds ratios of 1.04 (95% CI 1.01-1.07, p = 0.02) and 3.76 (95% CI 1.05-13.39, p = 0.04), respectively. The impact of non-E. coli bacteria on predicted probabilities of dilating reflux increased with patient age. We support the concept of selective voiding cystourethrogram in children with a first simple febrile urinary tract infection and normal ultrasound. Voiding cystourethrogram should be considered in children with late onset urinary tract infection due to non-E. coli bacteria since they are at risk for dilating reflux even if the ultrasound is normal. Copyright © 2016 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, Shih-Jung

    Dynamic strength of the High Flux Isotope Reactor (HFIR) vessel to resist hypothetical accidents is analyzed by using the method of fracture mechanics. Vessel critical stresses are estimated by applying dynamic pressure pulses of a range of magnitudes and pulse durations. The pulse-versus-time functions are assumed to be step functions. The probability of vessel fracture is then calculated by assuming a distribution of possible surface cracks of different crack depths. The probability distribution function for the crack depths is based on the form that is recommended by the Marshall report. The toughness of the vessel steel used in the analysis is based on the projected and embrittled value after 10 effective full power years from 1986. From the study made by Cheverton, Merkle and Nanstad, the weakest point on the vessel for fracture evaluation is known to be located within the region surrounding the tangential beam tube HB3. The increase in the probability of fracture is obtained as an extension of the result from that report for the regular operating condition to include conditions of higher dynamic pressures due to accident loadings. The increase in the probability of vessel fracture is plotted for a range of hoop stresses to indicate the vessel strength against hypothetical accident conditions.

  19. Goodness of fit of probability distributions for sightings as species approach extinction.

    PubMed

    Vogel, Richard M; Hosking, Jonathan R M; Elphick, Chris S; Roberts, David L; Reed, J Michael

    2009-04-01

    Estimating the probability that a species is extinct and the timing of extinctions is useful in biological fields ranging from paleoecology to conservation biology. Various statistical methods have been introduced to infer the time of extinction and extinction probability from a series of individual sightings. There is little evidence, however, as to which of these models provide adequate fit to actual sighting records. We use L-moment diagrams and probability plot correlation coefficient (PPCC) hypothesis tests to evaluate the goodness of fit of various probabilistic models to sighting data collected for a set of North American and Hawaiian bird populations that have either gone extinct, or are suspected of having gone extinct, during the past 150 years. For our data, the uniform, truncated exponential, and generalized Pareto models performed moderately well, but the Weibull model performed poorly. Of the acceptable models, the uniform distribution performed best based on PPCC goodness of fit comparisons and sequential Bonferroni-type tests. Further analyses using field significance tests suggest that although the uniform distribution is the best of those considered, additional work remains to evaluate the truncated exponential model more fully. The methods we present here provide a framework for evaluating subsequent models.
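A PPCC statistic of the kind used above is straightforward to compute for a candidate uniform model; this sketch uses i/(n+1) plotting positions, one of several common conventions:

```python
import numpy as np

def ppcc_uniform(sample):
    """Probability plot correlation coefficient against a uniform model.

    Correlates the ordered sample with uniform plotting positions
    i/(n+1); values near 1 indicate a good fit of the uniform model.
    """
    x = np.sort(np.asarray(sample, dtype=float))
    n = x.size
    # Theoretical uniform quantiles at the plotting positions.
    q = np.arange(1, n + 1) / (n + 1.0)
    return np.corrcoef(x, q)[0, 1]

# Hypothetical sighting times rescaled to [0, 1]:
rng = np.random.default_rng(0)
print(ppcc_uniform(rng.uniform(size=200)))      # close to 1
print(ppcc_uniform(rng.exponential(size=200)))  # typically lower
```

A formal test compares the observed PPCC against critical values obtained by simulation under the hypothesized model.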

  20. Fragment size distribution in viscous bag breakup of a drop

    NASA Astrophysics Data System (ADS)

    Kulkarni, Varun; Bulusu, Kartik V.; Plesniak, Michael W.; Sojka, Paul E.

    2015-11-01

    In this study we examine the drop size distribution resulting from the fragmentation of a single drop in the presence of a continuous air jet. Specifically, we study the effect of the Weber number, We, and the Ohnesorge number, Oh, on the disintegration process. The regime of breakup considered is observed between 12 <= We <= 16 for Oh <= 0.1. Experiments are conducted using phase Doppler anemometry. Both the number and volume fragment size probability distributions are plotted. The volume probability distribution revealed a bi-modal behavior with two distinct peaks: one corresponding to the rim fragments and the other to the bag fragments. This behavior was suppressed in the number probability distribution. Additionally, we employ an in-house particle detection code to isolate the rim fragment size distribution from the total probability distributions. Our experiments showed that the bag fragments are smaller in diameter and larger in number, while the rim fragments are larger in diameter and smaller in number. Furthermore, with increasing We for a given Oh, we observe a large number of small-diameter drops and a small number of large-diameter drops. On the other hand, with increasing Oh for a fixed We the opposite is seen.

  1. Effect of intercropping period management on runoff and erosion in a maize cropping system.

    PubMed

    Laloy, Eric; Bielders, C L

    2010-01-01

    The management of winter cover crops is likely to influence their performance in reducing runoff and erosion during the intercropping period that precedes spring crops but also during the subsequent spring crop. This study investigated the impact of two dates of destruction and burial of a rye (Secale cereale L.) and ryegrass (Lolium multiflorum Lam.) cover crop on runoff and erosion, focusing on a continuous silage maize (Zea mays L.) cropping system. Thirty erosion plots with various intercrop management options were monitored for 3 yr at two sites. During the intercropping period, cover crops reduced runoff and erosion by more than 94% compared with untilled, post-maize harvest plots. Rough tillage after maize harvest proved equally effective as a late sown cover crop. There was no effect of cover crop destruction and burial dates on runoff and erosion during the intercropping period, probably because rough tillage for cover crop burial compensates for the lack of soil cover. During two of the monitored maize seasons, it was observed that plots that had been covered during the previous intercropping period lost 40 to 90% less soil compared with maize plots that had been left bare during the intercropping period. The burial of an aboveground cover crop biomass in excess of 1.5 t ha(-1) was a necessary, yet not always sufficient, condition to induce a residual effect. Because of the possible beneficial residual effect of cover crop burial on erosion reduction, the sowing of a cover crop should be preferred over rough tillage after maize harvest.

  2. NO versus N2O emissions from an NH4 +-amended Bermuda grass pasture

    NASA Astrophysics Data System (ADS)

    Hutchinson, G. L.; Brams, E. A.

    1992-06-01

    We used an enclosure technique to monitor soil NO and N2O emissions during early summer regrowth of Bermuda grass (Cynodon dactylon) on sandy loam in a humid, subtropical region of southern Texas. The evolution of both gases was substantially higher from plots harvested at the beginning of the experiment and fertilized 5 days later with 52 kg N ha-1 as (NH4)2SO4 than from plots not harvested or fertilized. Emission of NO, but not N2O, was stimulated by clipping and removing the grass, probably because eliminating the shading provided by the dense grass canopy changed these plots from cooler to warmer than unharvested plots, thereby stimulating the activity of soil microorganisms responsible for NO production. Neither gas flux was significantly affected by application of N until the next rainfall dissolved and moved the surface-applied fertilizer into the soil. Immediately thereafter, emissions of NO and N2O increased dramatically to peaks of 160 and 12 g N ha-1 d-1, respectively, and then declined at rates that closely paralleled the nitrification rate of added NH4+, indicating that the gases resulted from the activity of nitrifying microorganisms, rather than denitrifiers. Nitric oxide emissions during the 9-week measurement period averaged 7.2 times greater than N2O emissions and accounted for 3.2% of the added N. The data indicate that humid, subtropical grasslands, which not only have large geographical extent but also have been subject to intense anthropogenic disturbance, contribute significantly to the global atmospheric NOx budget.

  3. NO versus N2O emissions from an NH4(+)-amended Bermuda grass pasture

    NASA Technical Reports Server (NTRS)

    Hutchinson, G. L.; Brams, E. A.

    1992-01-01

    An enclosure technique is used to monitor soil NO and N2O emissions during early summer regrowth of Bermuda grass (Cynodon dactylon) on sandy loam in a humid, subtropical region of southern Texas. The evolution of both gases was substantially higher from plots harvested at the beginning of the experiment and fertilized five days later with 52 kg N/ha as (NH4)2SO4 than from plots not harvested or fertilized. Emission of NO, but not N2O, was stimulated by clipping and removing the grass, probably because eliminating the shading provided by the dense grass canopy changed these plots from cooler to warmer than unharvested plots, thereby stimulating the activity of soil microorganisms responsible for NO production. Neither gas flux was significantly affected by application of N until the next rainfall dissolved and moved the surface-applied fertilizer into the soil. Immediately thereafter, emissions of NO and N2O increased dramatically to peaks of 160 and 12 g N/ha/d, respectively, and then declined at rates that closely parallel the nitrification rate of added NH4(+), indicating that the gases resulted from the activity of nitrifying microorganisms, rather than denitrifiers. Nitric oxide emissions during the nine-week measurement period averaged 7.2 times greater than N2O emissions and accounted for 3.2 percent of the added N. The data indicate that humid, subtropical grasslands, which not only have large geographical extent but also have been subject to intense anthropogenic disturbance, contribute significantly to the global atmospheric NO(x) budget.

  4. Probability of survival of implant-supported metal ceramic and CAD/CAM resin nanoceramic crowns.

    PubMed

    Bonfante, Estevam A; Suzuki, Marcelo; Lorenzoni, Fábio C; Sena, Lídia A; Hirata, Ronaldo; Bonfante, Gerson; Coelho, Paulo G

    2015-08-01

    To evaluate the probability of survival and failure modes of implant-supported resin nanoceramic relative to metal-ceramic crowns. Resin nanoceramic molar crowns (LU) (Lava Ultimate, 3M ESPE, USA) were milled and metal-ceramic (MC) (Co-Cr alloy, Wirobond C+, Bego, USA) with identical anatomy were fabricated (n=21). The metal coping and a burnout-resin veneer were created by CAD/CAM, using an abutment (Stealth-abutment, Bicon LLC, USA) and a milled crown from the LU group as models for porcelain hot-pressing (GC-Initial IQ-Press, GC, USA). Crowns were cemented, the implants (n=42, Bicon) embedded in acrylic-resin for mechanical testing, and subjected to single-load to fracture (SLF, n=3 each) for determination of step-stress profiles for accelerated-life testing in water (n=18 each). Weibull curves (50,000 cycles at 200N, 90% CI) were plotted. Weibull modulus (m) and characteristic strength (η) were calculated and a contour plot used (m versus η) for determining differences between groups. Fractography was performed in SEM and polarized-light microscopy. SLF mean values were 1871N (±54.03) for MC and 1748N (±50.71) for LU. Beta values were 0.11 for MC and 0.49 for LU. Weibull modulus was 9.56 and η=1038.8N for LU, and m=4.57 and η=945.42N for MC (p>0.10). Probability of survival (50,000 and 100,000 cycles at 200 and 300N) was 100% for LU and 99% for MC. Failures were cohesive within LU. In MC crowns, porcelain veneer fractures frequently extended to the supporting metal coping. Probability of survival was not different between crown materials, but failure modes differed. In load bearing regions, similar reliability should be expected for metal ceramics, known as the gold standard, and resin nanoceramic crowns over implants. Failure modes involving porcelain veneer fracture and delamination in MC crowns are less likely to be successfully repaired compared to cohesive failures in resin nanoceramic material. Copyright © 2015 Academy of Dental Materials. 
Published by Elsevier Ltd. All rights reserved.
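The reported Weibull parameters translate into survival probabilities at a given load through the two-parameter Weibull reliability function; a minimal sketch, ignoring the cycle-count dimension of the step-stress analysis:

```python
import math

def weibull_survival(load, m, eta):
    """Two-parameter Weibull reliability R = exp(-(load/eta)^m)."""
    return math.exp(-((load / eta) ** m))

# Weibull modulus m and characteristic strength eta (N) from the abstract:
lu = weibull_survival(200.0, m=9.56, eta=1038.8)  # resin nanoceramic
mc = weibull_survival(200.0, m=4.57, eta=945.42)  # metal ceramic
print(round(lu, 4), round(mc, 4))  # both close to 1 at a 200 N load
```

A load well below the characteristic strength gives near-certain survival for both materials, consistent with the 100% and 99% figures quoted above; the higher modulus of the LU crowns reflects less scatter in their strength.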

  5. Concurrent effects of age class and food distribution on immigration success and population dynamics in a small mammal.

    PubMed

    Rémy, Alice; Le Galliard, Jean-François; Odden, Morten; Andreassen, Harry P

    2014-07-01

    During the settlement stage of dispersal, the outcome of conflicts between residents and immigrants should depend on the social organization of resident populations as well as on individual traits of immigrants, such as their age class, body mass and/or behaviour. We have previously shown that the spatial distribution of food influences the social organization of female bank voles (Myodes glareolus). Here, we aimed to determine the relative impact of food distribution and immigrant age class on the success and demographic consequences of female bank vole immigration. We manipulated the spatial distribution of food within populations having either clumped or dispersed food. After a pre-experimental period, we released either adult immigrants or juvenile immigrants, for which we scored sociability and aggressiveness prior to introduction. We found that immigrant females survived less well and moved more between populations than resident females, which suggests settlement costs. However, settled juvenile immigrants had a higher probability of reproducing than field-born juveniles. Food distribution had little effect on the settlement success of immigrant females. Survival and settlement probabilities of immigrants were influenced by adult female density in opposite ways for adult and juvenile immigrants, suggesting strong adult-adult competition. Moreover, females of higher body mass at release had a lower probability of surviving, and the breeding probability of settled immigrants increased with their aggressiveness and decreased with their sociability. Prior to the introduction of immigrants, resident females were more aggregated in the clumped food treatment than in the dispersed food treatment, but immigration reversed this relationship. In addition, differences in growth trajectories were seen during the breeding season, with populations reaching higher densities when adult immigrants were introduced in a plot with dispersed food, or when juvenile immigrants were introduced in a plot with clumped food. These results indicate the relative importance of intrinsic and extrinsic factors on immigration success and demographic consequences of dispersal and are of relevance to conservation actions, such as reinforcement of small populations. © 2013 The Authors. Journal of Animal Ecology © 2013 British Ecological Society.

  6. Probability models for growth and aflatoxin B1 production as affected by intraspecies variability in Aspergillus flavus.

    PubMed

    Aldars-García, Laila; Berman, María; Ortiz, Jordi; Ramos, Antonio J; Marín, Sonia

    2018-06-01

    The probability of growth and aflatoxin B1 (AFB1) production of 20 isolates of Aspergillus flavus were studied using a full factorial design with eight water activity levels (0.84-0.98 aw) and six temperature levels (15-40 °C). Binary data obtained from growth studies were modelled using linear logistic regression analysis as a function of temperature, water activity and time for each isolate. In parallel, AFB1 was extracted at different times from newly formed colonies (up to 20 mm in diameter). Although a total of 950 AFB1 values over time for all conditions studied were recorded, they were not considered to be enough to build probability models over time, and therefore, only models at 30 days were built. The confidence intervals of the regression coefficients of the probability of growth models showed some differences among the 20 growth models. Further, to assess the growth/no growth and AFB1/no-AFB1 production boundaries, 0.05 and 0.5 probabilities were plotted at 30 days for all of the isolates. The boundaries for growth and AFB1 showed that, in general, the conditions for growth were wider than those for AFB1 production. The probability of growth and AFB1 production seemed to be less variable among isolates than AFB1 accumulation. Apart from the AFB1 production probability models, using growth probability models for AFB1 probability predictions could be, although conservative, a suitable alternative. Predictive mycology should include a number of isolates to generate data to build predictive models and take into account the genetic diversity of the species and thus make predictions as similar as possible to real fungal food contamination. Copyright © 2017 Elsevier Ltd. All rights reserved.
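Growth/no-growth boundaries of the kind described are typically fitted with linear logistic regression; a minimal sketch with hypothetical observations (not the study's data), fitted by gradient ascent on standardized predictors:

```python
import numpy as np

# Hypothetical growth/no-growth records: temperature (deg C), water
# activity, and a binary growth outcome (1 = growth observed).
data = np.array([
    [15, 0.90, 0], [20, 0.90, 0], [25, 0.90, 1], [30, 0.92, 1],
    [35, 0.95, 1], [40, 0.95, 0], [15, 0.95, 0], [20, 0.95, 1],
    [25, 0.98, 1], [30, 0.86, 0], [35, 0.88, 0], [30, 0.98, 1],
])
raw, y = data[:, :2], data[:, 2]

# Standardize the predictors so plain gradient ascent converges quickly.
mu, sd = raw.mean(axis=0), raw.std(axis=0)
X = np.column_stack([np.ones(len(raw)), (raw - mu) / sd])

# Maximize the logistic log-likelihood by gradient ascent.
b = np.zeros(3)
for _ in range(5000):
    p = 1.0 / (1.0 + np.exp(-X @ b))
    b += 0.1 * X.T @ (y - p) / len(y)

def p_growth(temp, aw):
    """Predicted probability of growth at temperature temp, water activity aw."""
    z = np.array([1.0, (temp - mu[0]) / sd[0], (aw - mu[1]) / sd[1]])
    return float(1.0 / (1.0 + np.exp(-z @ b)))

# The p = 0.5 growth/no-growth boundary is the line where z @ b = 0;
# plotting the p = 0.05 and p = 0.5 contours mirrors the paper's approach.
print(p_growth(25, 0.95), p_growth(15, 0.85))
```

The 0.05 contour gives a conservative "fail-safe" boundary, since it marks conditions where growth is predicted to be unlikely rather than merely less likely than not.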

  7. Technology-Enhanced Interactive Teaching of Marginal, Joint and Conditional Probabilities: The Special Case of Bivariate Normal Distribution

    ERIC Educational Resources Information Center

    Dinov, Ivo D.; Kamino, Scott; Bhakhrani, Bilal; Christou, Nicolas

    2013-01-01

    Data analysis requires subtle probability reasoning to answer questions like "What is the chance of event A occurring, given that event B was observed?" This generic question arises in discussions of many intriguing scientific questions such as "What is the probability that an adolescent weighs between 120 and 140 pounds given that…

  8. Runoff and leaching of metolachlor from Mississippi River alluvial soil during seasons of average and below-average rainfall.

    PubMed

    Southwick, Lloyd M; Appelboom, Timothy W; Fouss, James L

    2009-02-25

    The movement of the herbicide metolachlor [2-chloro-N-(2-ethyl-6-methylphenyl)-N-(2-methoxy-1-methylethyl)acetamide] via runoff and leaching from 0.21 ha plots planted to corn on Mississippi River alluvial soil (Commerce silt loam) was measured for a 6-year period, 1995-2000. The first three years received normal rainfall (30 year average); the second three years experienced reduced rainfall. The 4-month periods prior to application plus the following 4 months after application were characterized by 1039 +/- 148 mm of rainfall for 1995-1997 and by 674 +/- 108 mm for 1998-2000. During the normal rainfall years 216 +/- 150 mm of runoff occurred during the study seasons (4 months following herbicide application), accompanied by 76.9 +/- 38.9 mm of leachate. For the low-rainfall years these amounts were 16.2 +/- 18.2 mm of runoff (92% less than the normal years) and 45.1 +/- 25.5 mm of leachate (41% less than the normal seasons). Runoff of metolachlor during the normal-rainfall seasons was 4.5-6.1% of application, whereas leaching was 0.10-0.18%. For the below-normal periods, these losses were 0.07-0.37% of application in runoff and 0.22-0.27% in leachate. When averages over the three normal and the three less-than-normal seasons were taken, a 35% reduction in rainfall was characterized by a 97% reduction in runoff loss and a 71% increase in leachate loss of metolachlor on a percent of application basis. The data indicate an increase in preferential flow in the leaching movement of metolachlor from the surface soil layer during the reduced rainfall periods. Even with increased preferential flow through the soil during the below-average rainfall seasons, leachate loss (percent of application) of the herbicide remained below 0.3%. 
Compared to the average rainfall seasons of 1995-1997, the below-normal seasons of 1998-2000 were characterized by a 79% reduction in total runoff and leachate flow and by a 93% reduction in corresponding metolachlor movement via these routes. An added observation in the study was that neither runoff of rainfall nor runoff loss of metolachlor was influenced by the presence of subsurface drains, compared to the results from plots without such drains that were described in an earlier paper.

  9. Scattering from Rock and Rock Outcrops

    DTIC Science & Technology

    2014-09-30

    orientations and size distributions reflect the internal fault organization of the bedrock. The plot in Fig. 3 displays experimentally determined PFA...mechanisms contributing could be scattering from small scale roughness combined with specular scattering from facets oriented close to normal incidence to...Larvik, Norway made with a stereo photogrammetry system. 7 IMPACT/APPLICATIONS The primary work completed over the course of this project

  10. Cryogenic Research

    DTIC Science & Technology

    1952-05-01

    needed work lies in the ultra low- temperature range available only through use of the demagnetization cycle. SUPERCONDUCTIVITY BELOW 10 ABSOLUTE In...In Figure 1 is plotted, as a function of temperature, the magnetic field required to change hafnium from the superconducting to the normal state. For...fields of crystal physics, properties of metals, and magnetism and magnetic resonance. This article discusses the work of one group, the Cryogenics

  11. Quantifying Forest Ground Flora Biomass Using Close-range Remote Sensing

    Treesearch

    Paul F. Doruska; Robert C. Weih; Matthew D. Lane; Don C. Bragg

    2005-01-01

    Close-range remote sensing was used to estimate biomass of forest ground flora in Arkansas. Digital images of a series of 1-m² plots were taken using Kodak DCS760 and Kodak DCS420CIR digital cameras. ESRI ArcGIS™ and ERDAS Imagine® software was used to calculate the Normalized Difference Vegetation Index (NDVI) and the Average Visible...
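The NDVI mentioned above has a standard definition; a minimal sketch with made-up band reflectances (not the study's imagery or software):

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index: (NIR - Red)/(NIR + Red),
    bounded in [-1, 1]; eps guards against division by zero."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)

# tiny synthetic 2x2 band arrays
nir = np.array([[0.6, 0.5], [0.4, 0.3]])
red = np.array([[0.2, 0.1], [0.4, 0.1]])
v = ndvi(nir, red)
```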

  12. A method to study response of large trees to different amounts of available soil water

    Treesearch

    D.H. Marx; Shi-Jean S. Sung; J.S. Cunningham; M.D. Thompson; L.M. White

    1995-01-01

    A method was developed to manipulate available soil water on large trees by intercepting thrufall with gutters placed under tree canopies and irrigating the intercepted thrufall onto other trees. With this design, trees were exposed for 2 years to either 25% less thrufall, normal thrufall, or 25% additional thrufall. Undercanopy construction in these plots moderately...

  13. A Method to Study Response of Large Trees to Different Amounts of Available Soil Water

    Treesearch

    Donald H. Marx; Shi-jean S. Sung; James S. Cunningham; Michael D. Thompson; Linda M. White

    1995-01-01

    A method was developed to manipulate available soil water on large trees by intercepting thrufall with gutters placed under tree canopies and irrigating the intercepted thrufall onto other trees. With this design, trees were exposed for 2 years to either 25 percent less thrufall, normal thrufall, or 25 percent additional thrufall. Undercanopy construction in these plots...

  14. Impacts of sampling design and estimation methods on nutrient leaching of intensively monitored forest plots in the Netherlands.

    PubMed

    de Vries, W; Wieggers, H J J; Brus, D J

    2010-08-05

    Element fluxes through forest ecosystems are generally based on measurements of concentrations in soil solution at regular time intervals at plot locations sampled in a regular grid. Here we present spatially averaged annual element leaching fluxes in three Dutch forest monitoring plots using a new sampling strategy in which both sampling locations and sampling times are selected by probability sampling. Locations were selected by stratified random sampling with compact geographical blocks of equal surface area as strata. In each sampling round, six composite soil solution samples were collected, consisting of five aliquots, one per stratum. The plot-mean concentration was estimated by linear regression, so that the bias due to one or more strata being unrepresented in the composite samples is eliminated. The sampling times were selected in such a way that the cumulative precipitation surplus of the time interval between two consecutive sampling times was constant, using an estimated precipitation surplus averaged over the past 30 years. The spatially averaged annual leaching flux was estimated by using the modeled daily water flux as an ancillary variable. An important advantage of the new method is that the uncertainty in the estimated annual leaching fluxes due to spatial and temporal variation and the resulting sampling errors can be quantified. Results of this new method were compared with the reference approach, in which daily leaching fluxes were calculated by multiplying daily interpolated element concentrations with daily water fluxes and then aggregating to a year. For the period 2003-2005, across all plots, elements and depths, the annual fluxes calculated with the reference method lie within ±2 standard errors of the new method's mean in only 53% of cases. 
Despite the differences in results, both methods indicate comparable N retention and strong Al mobilization in all plots, with Al leaching being nearly equal to the leaching of SO4 and NO3 when fluxes are expressed in molc ha-1 yr-1. This illustrates that Al release, which is the clearest signal of soil acidification, is mainly due to the external input of SO4 and NO3.

  15. Scaling Laws in Canopy Flows: A Wind-Tunnel Analysis

    NASA Astrophysics Data System (ADS)

    Segalini, Antonio; Fransson, Jens H. M.; Alfredsson, P. Henrik

    2013-08-01

    An analysis of velocity statistics and spectra measured above a wind-tunnel forest model is reported. Several measurement stations downstream of the forest edge have been investigated, and it is observed that, while the mean velocity profile adjusts quickly to the new canopy boundary condition, the turbulence lags behind and shows a continuous penetration towards the free stream along the canopy model. The statistical profiles illustrate this growth and do not collapse when plotted as a function of the vertical coordinate. However, when the statistics are plotted as a function of the local mean velocity (normalized with a characteristic velocity scale), they do collapse, independently of the streamwise position and free-stream velocity. A new scaling for the spectra of all three velocity components is proposed based on the velocity variance and integral time scale. This normalization improves the collapse of the spectra compared to existing scalings adopted in atmospheric measurements, and allows the determination of a universal function that provides the velocity spectrum. Furthermore, a comparison of the proposed scaling laws for two different canopy densities is shown, demonstrating that the vertical velocity variance is the statistical quantity most sensitive to the characteristics of the canopy roughness.

  16. Detecting and correcting for publication bias in meta-analysis - A truncated normal distribution approach.

    PubMed

    Zhu, Qiaohao; Carriere, K C

    2016-01-01

    Publication bias can significantly limit the validity of meta-analysis when trying to draw conclusion about a research question from independent studies. Most research on detection and correction for publication bias in meta-analysis focus mainly on funnel plot-based methodologies or selection models. In this paper, we formulate publication bias as a truncated distribution problem, and propose new parametric solutions. We develop methodologies of estimating the underlying overall effect size and the severity of publication bias. We distinguish the two major situations, in which publication bias may be induced by: (1) small effect size or (2) large p-value. We consider both fixed and random effects models, and derive estimators for the overall mean and the truncation proportion. These estimators will be obtained using maximum likelihood estimation and method of moments under fixed- and random-effects models, respectively. We carried out extensive simulation studies to evaluate the performance of our methodology, and to compare with the non-parametric Trim and Fill method based on funnel plot. We find that our methods based on truncated normal distribution perform consistently well, both in detecting and correcting publication bias under various situations.
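A minimal sketch of the truncated-distribution idea, assuming a known truncation point and synthetic effect sizes (the paper's estimators and selection mechanisms are more elaborate): only effects exceeding a cutoff are "published", and the untruncated mean and standard deviation are recovered by maximum likelihood.

```python
import numpy as np
from scipy.stats import truncnorm
from scipy.optimize import minimize

def neg_loglik(theta, x, cut):
    """Negative log-likelihood of a normal truncated below `cut`."""
    mu, log_sd = theta
    sd = np.exp(log_sd)                   # keep sd positive
    a = (cut - mu) / sd                   # standardized lower bound
    return -np.sum(truncnorm.logpdf(x, a, np.inf, loc=mu, scale=sd))

rng = np.random.default_rng(1)
full = rng.normal(0.3, 0.5, 20000)        # "all" effect sizes
obs = full[full > 0.2]                    # only large effects survive selection
res = minimize(neg_loglik, x0=[obs.mean(), np.log(obs.std())],
               args=(obs, 0.2), method="Nelder-Mead")
mu_hat, sd_hat = res.x[0], np.exp(res.x[1])
```

The naive mean of `obs` is biased upward; the truncated-likelihood fit recovers the underlying mean near 0.3.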

  17. Adsorption properties of argon on Ti doped SBA-15.

    PubMed

    Kim, Euikwoun; Lee, Sang-Hwa; Kim, Jaeyong

    2014-11-01

    Thermodynamic properties of argon on Ti-doped Santa Barbara Amorphous No. 15 (SBA-15) were investigated in the temperature range of 77-89 K to understand the interaction of gas molecules with porous materials. When the total amount of adsorbed molecules is plotted as a function of the equilibrium vapor pressure of the adsorbed Ar, the results exhibit two distinct isotherm steps. The first step appears at the beginning of the isotherm, while the second appears at a normalized pressure of 0.7. The existence of the second isotherm step, which spans normalized pressures from 0.7 to 0.9, is confirmed when the isotherm data are plotted in terms of the 2-dimensional compressibility values. The total amount of adsorbed molecules forming the second isotherm step is 2.5 times greater than that for the first step. These adsorption behaviors are typical of porous materials and far different from those observed for nonporous materials. Our observations demonstrate that most of the adsorbed molecules reside in the pores and that the height of the second isotherm step is strongly associated with the filling of the pores with gas molecules.

  18. Drought forecasting in Luanhe River basin involving climatic indices

    NASA Astrophysics Data System (ADS)

    Ren, Weinan; Wang, Yixuan; Li, Jianzhu; Feng, Ping; Smith, Ronald J.

    2017-11-01

    Drought is regarded as one of the most severe natural disasters globally. This is especially the case in Tianjin City, Northern China, where drought can affect economic development and people's livelihoods. Drought forecasting, the basis of drought management, is an important mitigation strategy. In this paper, we develop a probabilistic forecasting model that forecasts transition probabilities from a current Standardized Precipitation Index (SPI) value to a future SPI class, based on the conditional distribution of the multivariate normal distribution, which allows two large-scale climatic indices to be incorporated at the same time, and apply the forecasting model to 26 rain gauges in the Luanhe River basin in North China. The establishment of the model and the derivation of the SPI are based on the hypothesis that aggregated monthly precipitation is normally distributed. Pearson correlation and Shapiro-Wilk normality tests are used to select an appropriate SPI time scale and large-scale climatic indices. Findings indicated that longer-term aggregated monthly precipitation was, in general, more likely to be considered normally distributed, and that forecasting models should be applied to each gauge individually rather than to the whole basin. Taking Liying Gauge as an example, we illustrate the impact of the SPI time scale and lead time on transition probabilities. Then, the controlling climatic indices for every gauge are selected by the Pearson correlation test, and the multivariate normality of the current month's SPI, the corresponding climatic indices, and the SPI 1, 2, and 3 months later is demonstrated using the Shapiro-Wilk normality test. Subsequently, we illustrate the impact of large-scale oceanic-atmospheric circulation patterns on transition probabilities. Finally, we use a score method to evaluate and compare the performance of the three forecasting models and compare them with two traditional models which forecast transition probabilities from a current to a future SPI class. 
The results show that the three proposed models outperform the two traditional models and involving large-scale climatic indices can improve the forecasting accuracy.
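The conditional-distribution machinery such a model rests on can be sketched directly: if current and future SPI are jointly normal, the chance of landing in a future SPI class is a difference of normal CDFs at the conditional mean and standard deviation. The means, covariance and class bounds below are illustrative, not the basin's:

```python
import numpy as np
from scipy.stats import norm

def transition_prob(x_now, lo, hi, mu, cov):
    """P(lo < Y <= hi | X = x_now) for jointly normal (X, Y) ~ N(mu, cov)."""
    m = mu[1] + cov[0, 1] / cov[0, 0] * (x_now - mu[0])   # conditional mean
    s = np.sqrt(cov[1, 1] - cov[0, 1] ** 2 / cov[0, 0])   # conditional sd
    return norm.cdf(hi, m, s) - norm.cdf(lo, m, s)

mu = np.array([0.0, 0.0])                    # SPI is standardized
cov = np.array([[1.0, 0.6], [0.6, 1.0]])     # assumed lag correlation 0.6
# probability of the future SPI falling in the class (-1.5, -1.0],
# given the current SPI is -1.2
p = transition_prob(-1.2, -1.5, -1.0, mu, cov)
```

Extending the conditioning vector with one or two climatic indices changes only the dimension of `mu` and `cov`, not the structure of the calculation.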

  19. Quality and use of ERTS radiometric information in geologic applications

    NASA Technical Reports Server (NTRS)

    Goetz, A. F. H.; Billingsley, F. C.

    1974-01-01

    Some techniques are described for making full use of the data contained in an ERTS MSS image. Only about one-fourth of the data in a single band can be displayed at one time on a black and white image; therefore, when all four bands are considered, only about 7% of the available data can be used by the interpreter. Selecting the proper subset of information for the photointerpreter is therefore a necessity. Ratio methods exclude the brightness information from the display. A field study in one area using a portable spectrometer has shown only fair correlation with ERTS radiometry after one normalization procedure. Plots of brightness of test areas with sun angle show discrepancies. Plots of ratios show discrepancies of lesser magnitude, although the error limits are large.

  20. On proper linearization, construction and analysis of the Boyle-van't Hoff plots and correct calculation of the osmotically inactive volume.

    PubMed

    Katkov, Igor I

    2011-06-01

    The Boyle-van't Hoff (BVH) law of physics has been widely used in cryobiology for calculation of the key osmotic parameters of cells and optimization of cryo-protocols. The proper use of linearization of the Boyle-van't Hoff relationship for the osmotically inactive volume (vb) has been discussed in a rigorous way in (Katkov, Cryobiology, 2008, 57:142-149). Nevertheless, scientists in the field have continued to use inappropriate methods of linearization (and curve fitting) of the BVH data, plotting of the BVH line, and calculation of vb. Here, we discuss the sources of incorrect linearization of the BVH relationship using concrete examples from recent publications, analyze the properties of the correct BVH line (which is unique for a given vb), provide appropriate statistical formulas for calculation of vb from the experimental data, and propose simple instructions (a standard operating procedure, SOP) for proper normalization of the data, appropriate linearization and construction of the BVH plots, and correct calculation of vb. The possible sources of non-linear behavior or poor fit of the data to the proper BVH line, such as active water and/or solute transport, which can result in large discrepancy between the hyperosmotic and hypoosmotic parts of the BVH plot, are also discussed. Copyright © 2011 Elsevier Inc. All rights reserved.
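As a sketch of the point at issue: the proper BVH line v = vb + (1 - vb)x, with v the normalized cell volume and x the inverse normalized osmolality, must pass through the isotonic point (1, 1), which leaves vb as the single free parameter. Constraining the least-squares fit that way (rather than fitting slope and intercept independently) is the kind of correction argued for above. The data here are synthetic:

```python
import numpy as np

def fit_vb(x, v):
    """Least-squares vb for v - x = vb*(1 - x): a line forced through (1, 1)."""
    return np.sum((v - x) * (1 - x)) / np.sum((1 - x) ** 2)

x = np.array([0.33, 0.5, 1.0, 1.5, 2.0])   # iso-osmolality / osmolality
true_vb = 0.3
v = true_vb + (1 - true_vb) * x            # noise-free synthetic volumes
vb_hat = fit_vb(x, v)
```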

  1. Probability of Regenerating a Normal Limb After Bite Injury in the Mexican Axolotl (Ambystoma mexicanum).

    PubMed

    Thompson, Sierra; Muzinic, Laura; Muzinic, Christopher; Niemiller, Matthew L; Voss, S Randal

    2014-06-01

    Multiple factors are thought to cause limb abnormalities in amphibian populations by altering processes of limb development and regeneration. We examined adult and juvenile axolotls (Ambystoma mexicanum) in the Ambystoma Genetic Stock Center (AGSC) for limb and digit abnormalities to investigate the probability of normal regeneration after bite injury. We observed that 80% of larval salamanders show evidence of bite injury at the time of transition from group housing to solitary housing. Among 717 adult axolotls that were surveyed, which included solitary-housed males and group-housed females, approximately half presented abnormalities, including examples of extra or missing digits and limbs, fused digits, and digits growing from atypical anatomical positions. Bite injury, and not abnormal development, likely explains these limb defects, because limbs with normal anatomy regenerated after rostral amputations were performed. We infer that only 43% of AGSC larvae will present four anatomically normal-looking adult limbs after incurring a bite injury. Our results show regeneration of normal limb anatomy to be less than perfect after bite injury.

  2. Historic Mining and Agriculture as Indicators of Occurrence and Abundance of Widespread Invasive Plant Species

    PubMed Central

    Calinger, Kellen; Calhoon, Elisabeth; Chang, Hsiao-chi; Whitacre, James; Wenzel, John; Comita, Liza; Queenborough, Simon

    2015-01-01

    Anthropogenic disturbances often change ecological communities and provide opportunities for non-native species invasion. Understanding the impacts of disturbances on species invasion is therefore crucial for invasive species management. We used generalized linear mixed effects models to explore the influence of land-use history and distance to roads on the occurrence and abundance of two invasive plant species (Rosa multiflora and Berberis thunbergii) in a 900-ha deciduous forest in the eastern U.S.A., the Powdermill Nature Reserve. Although much of the reserve has been continuously forested since at least 1939, aerial photos revealed a variety of land-uses since then including agriculture, mining, logging, and development. By 2008, both R. multiflora and B. thunbergii were widespread throughout the reserve (occurring in 24% and 13% of 4417 10-m diameter regularly-placed vegetation plots, respectively) with occurrence and abundance of each varying significantly with land-use history. Rosa multiflora was more likely to occur in historically farmed, mined, logged or developed plots than in plots that remained forested, (log odds of 1.8 to 3.0); Berberis thunbergii was more likely to occur in plots with agricultural, mining, or logging history than in plots without disturbance (log odds of 1.4 to 2.1). Mining, logging, and agriculture increased the probability that R. multiflora had >10% cover while only past agriculture was related to cover of B. thunbergii. Proximity to roads was positively correlated with the occurrence of R. multiflora (a 0.26 increase in the log odds for every 1-m closer) but not B. thunbergii, and roads had no impact on the abundance of either species. Our results indicated that a wide variety of disturbances may aid the introduction of invasive species into new habitats, while high-impact disturbances such as agriculture and mining increase the likelihood of high abundance post-introduction. PMID:26046534

  3. Mapping of power consumption and friction reduction in piezoelectrically-assisted ultrasonic lubrication

    NASA Astrophysics Data System (ADS)

    Dong, Sheng; Dapino, Marcelo J.

    2015-04-01

    Ultrasonic lubrication has been proven effective in reducing dynamic friction. This paper investigates the relationship between friction reduction, power consumption, linear velocity, and normal stress. A modified pin-on-disc tribometer was adopted as the experimental set-up, and a LabVIEW system was utilized for signal generation and data acquisition. Friction reduction was quantified for 0.21 to 5.31 W of electric power, 50 to 200 mm/s of linear velocity, and 23 to 70 MPa of normal stress. Friction reduction near 100% can be achieved under certain conditions. Lower linear velocity and higher electric power result in greater friction reduction, while normal stress has little effect on friction reduction. Contour plots of friction reduction, power consumption, linear velocity, and normal stress were created. An efficiency coefficient was proposed to calculate the power required for a certain friction reduction, or the friction reduction attainable for a given electric power.

  4. Quantifying Anderson's fault types

    USGS Publications Warehouse

    Simpson, R.W.

    1997-01-01

    Anderson [1905] explained three basic types of faulting (normal, strike-slip, and reverse) in terms of the shape of the causative stress tensor and its orientation relative to the Earth's surface. Quantitative parameters can be defined which contain information about both shape and orientation [Célérier, 1995], thereby offering a way to distinguish fault-type domains on plots of regional stress fields and to quantify, for example, the degree of normal-faulting tendencies within strike-slip domains. This paper offers a geometrically motivated generalization of Angelier's [1979, 1984, 1990] shape parameters Φ and Ψ to new quantities named AΦ and AΨ. In their simple forms, AΦ varies from 0 to 1 for normal, 1 to 2 for strike-slip, and 2 to 3 for reverse faulting, and AΨ ranges from 0° to 60°, 60° to 120°, and 120° to 180°, respectively. After scaling, AΦ and AΨ agree to within 2% (or 1°), a difference of little practical significance, although AΨ has smoother analytical properties. A formulation distinguishing horizontal axes as well as the vertical axis is also possible, yielding an AΦ ranging from -3 to +3 and AΨ from -180° to +180°. The geometrically motivated derivation in three-dimensional stress space presented here may aid intuition and offers a natural link with traditional ways of plotting yield and failure criteria. Examples are given, based on models of Bird [1996] and Bird and Kong [1994], of the use of Anderson fault parameters AΦ and AΨ for visualizing tectonic regimes defined by regional stress fields. Copyright 1997 by the American Geophysical Union.
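The AΦ parameter is commonly quoted in the compact form sketched below. This is an assumed form consistent with the ranges stated above (0-1 normal, 1-2 strike-slip, 2-3 reverse); see the paper itself for the full derivation:

```python
def a_phi(s1, s2, s3, n):
    """AΦ = (n + 0.5) + (-1)**n * (phi - 0.5), where
    phi = (s2 - s3)/(s1 - s3) is the stress shape ratio and
    n = 0, 1, 2 selects normal, strike-slip or reverse faulting
    (i.e. which principal stress is vertical)."""
    phi = (s2 - s3) / (s1 - s3)
    return (n + 0.5) + (-1) ** n * (phi - 0.5)

# with phi = 0.5, AΦ sits at the middle of each fault-type band
mid_normal = a_phi(3, 2, 1, 0)       # 0.5
mid_strike = a_phi(3, 2, 1, 1)       # 1.5
mid_reverse = a_phi(3, 2, 1, 2)      # 2.5
```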

  5. Decreasing phosphorus runoff losses from land-applied poultry litter with dietary modifications and alum addition.

    PubMed

    Smith, Douglas R; Moore, P A; Miles, D M; Haggard, B E; Daniel, T C

    2004-01-01

    Phosphorus (P) losses from pastures fertilized with poultry litter contribute to the degradation of surface water quality in the United States. Dietary modification and manure amendments may reduce potential P runoff losses from pastures. In the current study, broilers were fed a normal diet, phytase diet, high available phosphorus (HAP) corn diet, or HAP corn + phytase diet. Litter treatments were untreated control and alum added at 10% by weight between flocks. Phytase and HAP corn diets reduced litter dissolved P content in poultry litter by 10 and 35%, respectively, compared with the normal diet (789 mg P kg(-1)). Alum treatment of poultry litter reduced the amount of dissolved P by 47%, while a 74% reduction was noted after alum treatment of litter from the HAP corn + phytase diet. The P concentrations in runoff water were highest from plots receiving poultry litter from the normal diet, whereas plots receiving poultry litter from phytase and HAP corn diets had reduced P concentrations. The addition of alum to the various poultry litters reduced P runoff by 52 to 69%; the greatest reduction occurred when alum was used in conjunction with HAP corn and phytase. This study demonstrates the potential added benefits of using dietary modification in conjunction with manure amendments in poultry operations. Integrators and producers should consider the use of phytase, HAP corn, and alum to reduce potential P losses associated with poultry litter application to pastures.

  6. Cost Effectiveness of Support for People Starting a New Medication for a Long-Term Condition Through Community Pharmacies: An Economic Evaluation of the New Medicine Service (NMS) Compared with Normal Practice.

    PubMed

    Elliott, Rachel A; Tanajewski, Lukasz; Gkountouras, Georgios; Avery, Anthony J; Barber, Nick; Mehta, Rajnikant; Boyd, Matthew J; Latif, Asam; Chuter, Antony; Waring, Justin

    2017-12-01

    The English community pharmacy New Medicine Service (NMS) significantly increases patient adherence to medicines, compared with normal practice. We examined the cost effectiveness of NMS compared with normal practice by combining adherence improvement and intervention costs with the effect of increased adherence on patient outcomes and healthcare costs. We developed Markov models for diseases targeted by the NMS (hypertension, type 2 diabetes mellitus, chronic obstructive pulmonary disease, asthma and antiplatelet regimens) to assess the impact of patients' non-adherence. Clinical event probability, treatment pathway, resource use and costs were extracted from literature and costing tariffs. Incremental costs and outcomes associated with each disease were incorporated additively into a composite probabilistic model and combined with adherence rates and intervention costs from the trial. Costs per extra quality-adjusted life-year (QALY) were calculated from the perspective of NHS England, using a lifetime horizon. NMS generated a mean of 0.05 (95% CI 0.00-0.13) more QALYs per patient, at a mean reduced cost of -£144 (95% CI -769 to 73). The NMS dominates normal practice with a probability of 0.78 [incremental cost-effectiveness ratio (ICER) -£3166 per QALY]. NMS has a 96.7% probability of cost effectiveness compared with normal practice at a willingness to pay of £20,000 per QALY. Sensitivity analysis demonstrated that targeting each disease with NMS has a probability over 0.90 of cost effectiveness compared with normal practice at a willingness to pay of £20,000 per QALY. Our study suggests that the NMS increased patient medicine adherence compared with normal practice, which translated into increased health gain at reduced overall cost. ClinicalTrials.gov Trial reference number NCT01635361 ( http://clinicaltrials.gov/ct2/show/NCT01635361 ). 
Current Controlled trials: Trial reference number ISRCTN 23560818 ( http://www.controlled-trials.com/ISRCTN23560818/ ; DOI 10.1186/ISRCTN23560818 ). UK Clinical Research Network (UKCRN) study 12494 ( http://public.ukcrn.org.uk/Search/StudyDetail.aspx?StudyID=12494 ). Department of Health Policy Research Programme.
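The cost-effectiveness arithmetic quoted above follows from two standard formulas, sketched here with the abstract's rounded means (the rounded inputs do not reproduce the reported ICER of -£3166 exactly; the paper's Markov model works from unrounded, probabilistic quantities):

```python
def icer(d_cost, d_qaly):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY."""
    return d_cost / d_qaly

def net_monetary_benefit(d_cost, d_qaly, wtp=20000):
    """Positive NMB at willingness-to-pay `wtp` favours the intervention."""
    return wtp * d_qaly - d_cost

# rounded means from the abstract: +0.05 QALYs at -£144 per patient
example_icer = icer(-144, 0.05)               # negative: NMS dominates
example_nmb = net_monetary_benefit(-144, 0.05)
```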

  7. Shade tree spatial structure and pod production explain frosty pod rot intensity in cacao agroforests, Costa Rica.

    PubMed

    Gidoin, Cynthia; Avelino, Jacques; Deheuvels, Olivier; Cilas, Christian; Bieng, Marie Ange Ngo

    2014-03-01

    Vegetation composition and plant spatial structure affect disease intensity through resource and microclimatic variation effects. The aim of this study was to evaluate the independent effect and relative importance of host composition and plant spatial structure variables in explaining disease intensity at the plot scale. For that purpose, frosty pod rot intensity, a disease caused by Moniliophthora roreri on cacao pods, was monitored in 36 cacao agroforests in Costa Rica in order to assess the vegetation composition and spatial structure variables conducive to the disease. Hierarchical partitioning was used to identify the most causal factors. Firstly, pod production, cacao tree density and shade tree spatial structure had significant independent effects on disease intensity. In our case study, the amount of susceptible tissue was the most relevant host composition variable for explaining disease intensity by resource dilution. Indeed, cacao tree density probably affected disease intensity more by the creation of self-shading rather than by host dilution. Lastly, only regularly distributed forest trees, and not aggregated or randomly distributed forest trees, reduced disease intensity in comparison to plots with a low forest tree density. A regular spatial structure is probably crucial to the creation of moderate and uniform shade as recommended for frosty pod rot management. As pod production is an important service expected from these agroforests, shade tree spatial structure may be a lever for integrated management of frosty pod rot in cacao agroforests.

  8. Assessment of NDE Reliability Data

    NASA Technical Reports Server (NTRS)

    Yee, B. G. W.; Chang, F. H.; Couchman, J. C.; Lemon, G. H.; Packman, P. F.

    1976-01-01

    Twenty sets of relevant Nondestructive Evaluation (NDE) reliability data have been identified, collected, compiled, and categorized. A criterion for the selection of data for statistical analysis considerations has been formulated. A model to grade the quality and validity of the data sets has been developed. Data input formats, which record the pertinent parameters of the defect/specimen and inspection procedures, have been formulated for each NDE method. A comprehensive computer program has been written to calculate the probability of flaw detection at several confidence levels by the binomial distribution. This program also selects the desired data sets for pooling and tests the statistical pooling criteria before calculating the composite detection reliability. Probability of detection curves at 95 and 50 percent confidence levels have been plotted for individual sets of relevant data as well as for several sets of merged data with common sets of NDE parameters.
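A binomial lower confidence bound of the kind used for such probability-of-detection (POD) curves can be sketched with the Clopper-Pearson formula; the counts below are illustrative, not the report's data sets:

```python
from scipy.stats import beta

def pod_lower_bound(detections, trials, confidence=0.95):
    """One-sided Clopper-Pearson lower bound on detection probability."""
    if detections == 0:
        return 0.0
    return beta.ppf(1 - confidence, detections, trials - detections + 1)

# 29 detections in 29 trials supports POD > 0.90 at 95% confidence,
# the familiar "90/95" demonstration point
lb95 = pod_lower_bound(29, 29, 0.95)
lb50 = pod_lower_bound(29, 29, 0.50)
```

Evaluating the bound at each flaw-size bin and plotting it against flaw size gives the 95% and 50% confidence POD curves described above.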

  9. Surveillance system and method having an adaptive sequential probability fault detection test

    NASA Technical Reports Server (NTRS)

    Herzog, James P. (Inventor); Bickford, Randall L. (Inventor)

    2005-01-01

    System and method providing surveillance of an asset such as a process and/or apparatus by providing training and surveillance procedures that numerically fit a probability density function to an observed residual error signal distribution that is correlative to normal asset operation and then utilizes the fitted probability density function in a dynamic statistical hypothesis test for providing improved asset surveillance.
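A minimal sketch of the idea behind these claims, under simplifying assumptions: fit a normal density to training residuals from normal operation, then run a sequential probability ratio test (SPRT) on new residuals to flag a shifted mean. The fault magnitude, error rates and data are illustrative, not the patent's procedure:

```python
import numpy as np
from scipy.stats import norm

def sprt(residuals, mu0, sigma, shift, alpha=0.01, beta_err=0.01):
    """Wald SPRT: H0 mean mu0 vs H1 mean mu0+shift, known sigma.
    Returns ('fault'|'normal'|'undecided', index of deciding sample)."""
    A = np.log((1 - beta_err) / alpha)   # upper (fault) threshold
    B = np.log(beta_err / (1 - alpha))   # lower (normal) threshold
    llr = 0.0
    for i, r in enumerate(residuals):
        llr += norm.logpdf(r, mu0 + shift, sigma) - norm.logpdf(r, mu0, sigma)
        if llr >= A:
            return "fault", i
        if llr <= B:
            return "normal", i
    return "undecided", len(residuals)

# residuals sitting a full fault-shift above the training mean trip the test
decision, k = sprt(np.full(50, 1.0), mu0=0.0, sigma=1.0, shift=0.5)
```

In practice the density fitted to the training residuals need not be normal; the patent's point is that the hypothesis test uses whatever density was numerically fitted.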

  10. Surveillance system and method having an adaptive sequential probability fault detection test

    NASA Technical Reports Server (NTRS)

    Bickford, Randall L. (Inventor); Herzog, James P. (Inventor)

    2006-01-01

    System and method providing surveillance of an asset such as a process and/or apparatus by providing training and surveillance procedures that numerically fit a probability density function to an observed residual error signal distribution that is correlative to normal asset operation and then utilizes the fitted probability density function in a dynamic statistical hypothesis test for providing improved asset surveillance.

  11. Surveillance System and Method having an Adaptive Sequential Probability Fault Detection Test

    NASA Technical Reports Server (NTRS)

    Bickford, Randall L. (Inventor); Herzog, James P. (Inventor)

    2008-01-01

    System and method providing surveillance of an asset such as a process and/or apparatus by providing training and surveillance procedures that numerically fit a probability density function to an observed residual error signal distribution that is correlative to normal asset operation and then utilizes the fitted probability density function in a dynamic statistical hypothesis test for providing improved asset surveillance.

  12. The Influence of Part-Word Phonotactic Probability/Neighborhood Density on Word Learning by Preschool Children Varying in Expressive Vocabulary

    ERIC Educational Resources Information Center

    Storkel, Holly L.; Hoover, Jill R.

    2011-01-01

    The goal of this study was to examine the influence of part-word phonotactic probability/neighborhood density on word learning by preschool children with normal vocabularies that varied in size. Ninety-eight children (ages 2;11-6;0) were taught consonant-vowel-consonant (CVC) nonwords orthogonally varying in the probability/density of the CV…

  13. An Experimental Study of the Effects of Litter and Duff Consumption and Ash Formation on Post-Fire Runoff.

    NASA Astrophysics Data System (ADS)

    Woods, S. W.; Balfour, V.

    2007-12-01

    Consumption of the litter and duff layers in forest wildfires can lead to substantial increases in the frequency and magnitude of overland flow. These increases result from the loss of storage in the organic surface layer, reduced surface roughness, and from sealing of the exposed mineral soil surface. The presence of an ash layer may accentuate surface sealing by providing an additional source of fine material, or it may reduce runoff by storing rainfall and by protecting the soil surface from raindrop impacts. We used simulated rainfall experiments to assess the effects of litter and duff consumption and the presence of ash layers of varying thickness on post-fire runoff at two forested sites in western Montana, one with sandy loam soils formed out of granodiorite and the other with gravelly silt loam soils formed out of argillite. At each site we measured the runoff from simulated rainfall in replicated 0.5 m² plots before and after application of the following treatments: 1) burning with a fuel load of 90 Mg ha-1, 2) manual removal of the litter and duff layers, 3) addition of 0.5, 2.5 and 5 cm of ash to plots from which the litter and duff had previously been removed, and 4) addition of the same depths of ash to burned plots at the sandy loam site. In the burned plots the surface litter and duff layers were completely consumed and a <1 cm layer of black and gray ash and char was formed, indicating a moderate-severity burn. The mean soil temperature in the upper 1 cm of the mineral soil was 70 °C, and there was no detectable increase in water repellency. The mean final infiltration capacity of the burned sandy loam plots was 35 mm hr-1 compared to a pre-fire mean of 87 mm hr-1, while in the gravelly silt loam plots the pre- and post-burn infiltration capacities (27 and 31 mm hr-1) were not significantly different. 
Manual removal of the litter and duff layers reduced the mean final infiltration capacity in the sandy loam plots from 64 mm hr-1 to 40 mm hr-1 and in the gravelly silt loam plots from 23 mm hr-1 to 16 mm hr-1. We attribute decreases in infiltration due to the burning and duff removal treatments primarily to surface sealing. In the sandy loam plots, burning may have had a greater effect on infiltration than duff removal because the thin ash layer in the burned plots provided an additional source of fine material. In the gravelly silt loam plots, macropores located around rock fragments helped to minimize sealing effects. The addition of 0.5 cm of ash to the burned granitic plots resulted in a 20 mm hr-1 decrease in the final infiltration rate, and this was also probably due to surface sealing. However, the overall effect of ash addition was to increase the cumulative infiltration in proportion to the ash thickness and to maintain a higher average infiltration rate, indicating that while thin (<1 cm) ash layers may promote sealing, thicker ash layers help to reduce the runoff rate by providing additional storage for rainfall and by protecting the soil surface from raindrop impacts.

  14. Terahertz spectroscopy for the study of paraffin-embedded gastric cancer samples

    NASA Astrophysics Data System (ADS)

    Wahaia, Faustino; Kasalynas, Irmantas; Seliuta, Dalius; Molis, Gediminas; Urbanowicz, Andrzej; Carvalho Silva, Catia D.; Carneiro, Fatima; Valusis, Gintaras; Granja, Pedro L.

    2015-01-01

    Terahertz (THz) spectroscopy constitutes a promising technique for biomedical applications, serving as a complementary and powerful tool for disease screening, especially for early cancer diagnosis. THz radiation is not harmful to biological tissues, and the increased blood supply in cancer-affected tissues, with the consequent local increase in tissue water content, makes THz technology potentially attractive. In the present work, samples of healthy and adenocarcinoma-affected gastric tissue were analyzed using transmission time-domain THz spectroscopy (THz-TDS). The work shows the capability of the technique to distinguish between normal and cancerous regions in dried and paraffin-embedded samples. Plots of the absorption coefficient α and refractive index n of normal and cancer-affected tissues are presented, and the conditions for discrimination between normal and affected tissues are discussed.

  15. Avian associations of the Northern Great Plains grasslands

    USGS Publications Warehouse

    Kantrud, H.A.; Kologiski, R.L.

    1983-01-01

    The grassland region of the northern Great Plains was divided into six broad subregions by application of an avian indicator species analysis to data obtained from 582 sample plots censused during the breeding season. Common, ubiquitous species and rare species had little classificatory value and were eliminated from the data set used to derive the avian associations. Initial statistical division of the plots likely reflected structure of the dominant plant species used for nesting; later divisions probably were related to foraging or nesting cover requirements based on vegetation height or density, habitat heterogeneity, or possibly to the existence of mutually similar distributions or shared areas of greater than average abundance for certain groups of species. Knowledge of the effects of grazing, mostly by cattle, on habitat use by the breeding bird species was used to interpret the results of the indicator species analysis. Moderate grazing resulted in greater species richness in nearly all subregions; effects of grazing on total bird density were more variable.

  16. Computation of neutron fluxes in clusters of fuel pins arranged in hexagonal assemblies (2D and 3D)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prabha, H.; Marleau, G.

    2012-07-01

    For computations of fluxes, we have used Carvik's method of collision probabilities. This method requires tracking algorithms. An algorithm to compute tracks (in 2D and 3D) has been developed for seven hexagonal geometries with clusters of fuel pins and implemented in the NXT module of the code DRAGON. The flux distribution in clusters of pins has been computed using this code. For testing, the results are compared when possible with the EXCELT module of the code DRAGON. Tracks are plotted in the NXT module using MATLAB; these plots are also presented here. Results are presented with an increasing number of lines to show their convergence. We have numerically computed volumes, surface areas, and the percentage errors in these computations. These results show that 2D results converge faster than 3D results. Accuracy in the computation of fluxes up to the second decimal is achieved with fewer lines. (authors)

  17. Understanding arsenic mobilization using reactive transport modeling of groundwater hydrochemistry in the Datong basin study plot, China.

    PubMed

    Mapoma, Harold Wilson Tumwitike; Xie, Xianjun; Pi, Kunfu; Liu, Yaqing; Zhu, Yapeng

    2016-03-01

    This paper discusses the reactive transport and evolution of arsenic along a selected flow path in a study plot within the central part of the Datong basin. The simulation used the TOUGHREACT code. The spatial and temporal trends in hydrochemistry and mineral volume fraction along a flow path were observed. Furthermore, initial simulation of major ions and pH fits closely to the measured data. The study shows that equilibrium conditions may be attained at different stress periods for each parameter simulated. It is noted that the variations in ionic chemistry have a greater impact on arsenic distribution while reducing conditions drive the mobilization of arsenic. The study concluded that the reduction of Fe(III) and As(V) and probably SO4/HS cycling are significant factors affecting localized mobilization of arsenic. Besides cation exchange and water-rock interaction, incongruent dissolution of silicates is also a significant control mechanism of the general chemistry of the Datong basin aquifer.

  18. Pairing preferences of the model mono-valence mono-atomic ions investigated by molecular simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Qiang; Department of Chemistry, Bohai University, Jinzhou 121000; Zhang, Ruiting

    2014-05-14

    We carried out a series of potential of mean force calculations to study the pairing preferences of a series of model mono-atomic 1:1 ions with evenly varied sizes. The probabilities of forming the contact ion pair (CIP) and the single water-separated ion pair (SIP) were presented in two-dimensional plots with respect to the ion sizes. The pairing preferences reflected in these plots largely agree with the empirical rule of matching ion sizes in the small and big size regions. In the region where the ion sizes are close to the size of the water molecule, however, a significant deviation from this conventional rule is observed. Our further analysis indicated that this deviation originates from the competition between the CIP and the water-bridging SIP state. The competition is mainly an enthalpy-modulated phenomenon in which the existence of the water bridging plays a significant role.

  19. Normal uniform mixture differential gene expression detection for cDNA microarrays

    PubMed Central

    Dean, Nema; Raftery, Adrian E

    2005-01-01

    Background One of the primary tasks in analysing gene expression data is finding genes that are differentially expressed in different samples. Multiple testing issues due to the thousands of tests run make some of the more popular methods for doing this problematic. Results We propose a simple method, Normal Uniform Differential Gene Expression (NUDGE) detection, for finding differentially expressed genes in cDNA microarrays. The method uses a simple univariate normal-uniform mixture model, in combination with new normalization methods for spread as well as mean that extend the lowess normalization of Dudoit, Yang, Callow and Speed (2002) [1]. It takes account of multiple testing, and gives probabilities of differential expression as part of its output. It can be applied to either single-slide or replicated experiments, and it is very fast. Three datasets are analyzed using NUDGE, and the results are compared to those given by other popular methods: unadjusted and Bonferroni-adjusted t tests, Significance Analysis of Microarrays (SAM), and Empirical Bayes for microarrays (EBarrays) with both Gamma-Gamma and Lognormal-Normal models. Conclusion The method gives a high probability of differential expression to genes known/suspected a priori to be differentially expressed and a low probability to the others. In terms of known false positives and false negatives, the method outperforms all multiple-replicate methods except for the Gamma-Gamma EBarrays method, to which it offers comparable results with the added advantages of greater simplicity, speed, fewer assumptions and applicability to the single-replicate case. An R package called nudge to implement the methods in this paper will be made available soon. PMID:16011807
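The normal-uniform mixture idea behind NUDGE can be illustrated with a small EM fit on synthetic data. This is a simplified sketch of the model class only (a textbook normal-uniform EM), not the published NUDGE estimator or its normalization steps.

```python
import math
import random
import statistics

def normal_pdf(x, sigma):
    """Density of N(0, sigma) at x."""
    return math.exp(-0.5 * (x / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def fit_normal_uniform(data, n_iter=100):
    """EM fit of f(x) = p * Uniform(min, max) + (1 - p) * N(0, sigma):
    the uniform component models differentially expressed genes, the
    normal component the unchanged ones."""
    u = 1.0 / (max(data) - min(data))        # uniform density over data range
    p, sigma = 0.2, statistics.pstdev(data)  # crude starting values
    for _ in range(n_iter):
        # E-step: posterior probability that each value is differential
        w = [p * u / (p * u + (1 - p) * normal_pdf(x, sigma)) for x in data]
        # M-step: update mixing weight and null spread
        p = sum(w) / len(w)
        sigma = math.sqrt(sum((1 - wi) * x * x for wi, x in zip(w, data))
                          / sum(1 - wi for wi in w))
    return p, sigma, u

# Synthetic log-ratios: 900 unchanged genes, 100 differential ones.
random.seed(1)
data = ([random.gauss(0.0, 0.2) for _ in range(900)]
        + [random.uniform(-3.0, 3.0) for _ in range(100)])
p, sigma, u = fit_normal_uniform(data)

def posterior_differential(x):
    """Probability of differential expression for a new log-ratio x."""
    return p * u / (p * u + (1 - p) * normal_pdf(x, sigma))
```

Large log-ratios get posterior probabilities near 1 and values near zero get probabilities near 0, which is the kind of per-gene output the abstract describes.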

  20. Bayesian alternative to the ISO-GUM's use of the Welch-Satterthwaite formula

    NASA Astrophysics Data System (ADS)

    Kacker, Raghu N.

    2006-02-01

    In certain disciplines, uncertainty is traditionally expressed as an interval about an estimate for the value of the measurand. Development of such uncertainty intervals with a stated coverage probability based on the International Organization for Standardization (ISO) Guide to the Expression of Uncertainty in Measurement (GUM) requires a description of the probability distribution for the value of the measurand. The ISO-GUM propagates the estimates and their associated standard uncertainties for various input quantities through a linear approximation of the measurement equation to determine an estimate and its associated standard uncertainty for the value of the measurand. This procedure does not yield a probability distribution for the value of the measurand. The ISO-GUM suggests that under certain conditions motivated by the central limit theorem the distribution for the value of the measurand may be approximated by a scaled-and-shifted t-distribution with effective degrees of freedom obtained from the Welch-Satterthwaite (W-S) formula. The approximate t-distribution may then be used to develop an uncertainty interval with a stated coverage probability for the value of the measurand. We propose an approximate normal distribution based on a Bayesian uncertainty as an alternative to the t-distribution based on the W-S formula. A benefit of the approximate normal distribution based on a Bayesian uncertainty is that it greatly simplifies the expression of uncertainty by eliminating altogether the need for calculating effective degrees of freedom from the W-S formula. In the special case where the measurand is the difference between two means, each evaluated from statistical analyses of independent normally distributed measurements with unknown and possibly unequal variances, the probability distribution for the value of the measurand is known to be a Behrens-Fisher distribution. 
We compare the performance of the approximate normal distribution based on a Bayesian uncertainty and the approximate t-distribution based on the W-S formula with respect to the Behrens-Fisher distribution. The approximate normal distribution is simpler and better in this case. A thorough investigation of the relative performance of the two approximate distributions would require comparison for a range of measurement equations by numerical methods.
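The Welch-Satterthwaite formula itself, and a variance-matching normal approximation in the spirit of the Bayesian alternative, are short enough to sketch. The `normal_equivalent_uncertainty` inflation factor below is an illustrative assumption (matching the variance of a t distribution), not necessarily Kacker's exact development.

```python
import math

def combined_uncertainty(components):
    """Root-sum-square combined standard uncertainty u_c from a list of
    (standard uncertainty, degrees of freedom) pairs."""
    return math.sqrt(sum(u * u for u, _ in components))

def ws_effective_dof(components):
    """Welch-Satterthwaite effective degrees of freedom:
    nu_eff = u_c^4 / sum_i(u_i^4 / nu_i)."""
    uc2 = sum(u * u for u, _ in components)
    return uc2 ** 2 / sum(u ** 4 / nu for u, nu in components)

def normal_equivalent_uncertainty(u, nu):
    """Inflate u so that a normal distribution has the same variance as a
    scaled t distribution with nu dof -- an illustrative stand-in for the
    'approximate normal based on a Bayesian uncertainty' idea."""
    return u * math.sqrt(nu / (nu - 2))
```

With two equal components of 9 degrees of freedom each, the W-S formula gives 18 effective degrees of freedom; the normal-equivalent route skips that calculation entirely, which is the simplification the abstract argues for.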

  1. Thirty-five-year growth of thinned and unthinned ponderosa pine in the Methow Valley of northern Washington.

    Treesearch

    P.H. Cochran; James W. Barrett

    1998-01-01

    It is commonly expected that self-thinning will maintain small-diameter stands at near-normal densities and allow dominant trees to grow reasonably well. Such self-thinning did not occur in the unthinned plots in a thinning study in the Methow Valley of northern Washington, even though there was some suppression-caused mortality. A shift from suppression-caused...

  2. Radiobiological Optimization of Combination Radiopharmaceutical Therapy Applied to Myeloablative Treatment of Non-Hodgkin’s Lymphoma

    PubMed Central

    Hobbs, Robert F; Wahl, Richard L; Frey, Eric C; Kasamon, Yvette; Song, Hong; Huang, Peng; Jones, Richard J; Sgouros, George

    2014-01-01

    Combination treatment is a hallmark of cancer therapy. Although the rationale for combination radiopharmaceutical therapy was described in the mid-'90s, such treatment strategies have only recently been implemented clinically, and without a rigorous methodology for treatment optimization. Radiobiological and quantitative imaging-based dosimetry tools are now available that enable rational implementation of combined targeted radiopharmaceutical therapy. Optimal implementation should simultaneously account for radiobiological normal organ tolerance while optimizing the ratio of two different radiopharmaceuticals required to maximize tumor control. We have developed such a methodology and applied it to hypothetical myeloablative treatment of non-Hodgkin's lymphoma (NHL) patients using 131I-tositumomab and 90Y-ibritumomab tiuxetan. Methods The range of potential administered activities (AA) is limited by the normal organ maximum tolerated biologic effective doses (MTBEDs) arising from the combined radiopharmaceuticals. Dose-limiting normal organs are expected to be the lungs for 131I-tositumomab and the liver for 90Y-ibritumomab tiuxetan in myeloablative NHL treatment regimens. By plotting the limiting normal organ constraints as a function of the AAs and calculating tumor biological effective dose (BED) along the normal organ MTBED limits, the optimal combination of activities is obtained. The model was tested using previously acquired patient normal organ and tumor kinetic data and MTBED values taken from the literature. Results The average AA value based solely on normal organ constraints was (19.0 ± 8.2) GBq with a range of 3.9-36.9 GBq for 131I-tositumomab, and (2.77 ± 1.64) GBq with a range of 0.42-7.54 GBq for 90Y-ibritumomab tiuxetan. Tumor BED optimization results were calculated and plotted as a function of AA for 5 different cases, established using patient normal organ kinetics for the two radiopharmaceuticals. Results included AA ranges which would deliver 95% of the maximum tumor BED, allowing for informed inclusion of clinical considerations, such as a maximum allowable 131I administration. Conclusions A rational approach for combination radiopharmaceutical treatment has been developed within the framework of a proven 3-dimensional personalized dosimetry software, 3D-RD, and applied to the myeloablative treatment of NHL. We anticipate combined radioisotope therapy will ultimately supplant single radioisotope therapy, much as combination chemotherapy has substantially replaced single-agent chemotherapy. PMID:23918734

  3. Columbia: The first 5 flights entry heating data series. Volume 5: The side fuselage and payload bay door

    NASA Technical Reports Server (NTRS)

    Williams, S. D.

    1984-01-01

    Entry heating flight data and wind tunnel data on the side fuselage and payload bay door, Z = 400 and 440 trace aft of X/L = 0.2, for the first five flights of the Space Shuttle Orbiter are presented. The heating rate data are reviewed in terms of normalized film heat transfer coefficients as a function of angle of attack, Mach number, and normal shock Reynolds number. The surface heating rates and temperatures were obtained by the JSC NONLIN/INVERSE computer program. Time history plots of the surface heating rates and temperatures are outlined.

  4. Radiobiological Impact of Reduced Margins and Treatment Technique for Prostate Cancer in Terms of Tumor Control Probability (TCP) and Normal Tissue Complication Probability (NTCP)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jensen, Ingelise, E-mail: inje@rn.d; Carl, Jesper; Lund, Bente

    2011-07-01

    Dose escalation in prostate radiotherapy is limited by normal tissue toxicities. The aim of this study was to assess the impact of margin size on tumor control and side effects for intensity-modulated radiation therapy (IMRT) and 3D conformal radiotherapy (3DCRT) treatment plans with increased dose. Eighteen patients with localized prostate cancer were enrolled. 3DCRT and IMRT plans were compared for a variety of margin sizes. A marker detectable on daily portal images was presupposed for narrow margins. Prescribed dose was 82 Gy within 41 fractions to the prostate clinical target volume (CTV). Tumor control probability (TCP) calculations based on the Poisson model including the linear-quadratic approach were performed. Normal tissue complication probability (NTCP) was calculated for bladder, rectum and femoral heads according to the Lyman-Kutcher-Burman method. All plan types presented essentially identical TCP values and very low NTCP for bladder and femoral heads. Mean doses for these critical structures reached a minimum for IMRT with reduced margins. Two endpoints for rectal complications were analyzed. A marked decrease in NTCP for IMRT plans with narrow margins was seen for mild RTOG grade 2/3 as well as for proctitis/necrosis/stenosis/fistula, for which NTCP <7% was obtained. For equivalent TCP values, sparing of normal tissue was demonstrated with the narrow margin approach. The effect was more pronounced for IMRT than 3DCRT, with respect to NTCP for mild, as well as severe, rectal complications.
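The Lyman-Kutcher-Burman NTCP calculation named in this record reduces to a probit of the generalized equivalent uniform dose. A minimal sketch follows, with illustrative parameter values rather than the study's fitted ones.

```python
import math

def geud(dvh, n):
    """Generalized equivalent uniform dose from a differential DVH given
    as (dose_Gy, volume_fraction) pairs; n is the LKB volume parameter."""
    return sum(v * d ** (1.0 / n) for d, v in dvh) ** n

def lkb_ntcp(dvh, td50, m, n):
    """Lyman-Kutcher-Burman NTCP: standard-normal CDF of
    t = (gEUD - TD50) / (m * TD50)."""
    t = (geud(dvh, n) - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# Uniform whole-organ dose equal to TD50 gives NTCP = 0.5 by construction;
# td50, m, n below are placeholder values, not fitted rectum parameters.
risk = lkb_ntcp([(60.0, 0.3), (30.0, 0.7)], td50=76.9, m=0.13, n=0.09)
```

Margin reduction enters such a calculation only through the DVH: a smaller irradiated rectal volume shifts the gEUD down, which is why the narrow-margin plans in the abstract show lower NTCP at equal TCP.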

  5. Vocalization behavior and response of black rails

    USGS Publications Warehouse

    Legare, M.L.; Eddleman, W.R.; Buckley, P.A.; Kelly, C.

    1999-01-01

    We measured the vocal responses and movements of radio-tagged black rails (Laterallus jamaicensis) (n = 43, 26 males, 17 females) to playback of vocalizations at 2 sites in Florida during the breeding seasons of 1992-95. We used regression coefficients from logistic regression equations to model the probability of a response conditional on the birds' sex, nesting status, distance to playback source, and the time of survey. With a probability of 0.811, non-nesting male black rails were most likely to respond to playback, while nesting females were the least likely to respond (probability = 0.189). Linear regression was used to determine daily, monthly, and annual variation in response from weekly playback surveys along a fixed route during the breeding seasons of 1993-95. Significant sources of variation in the linear regression model were month (F = 3.89, df = 3, p = 0.0140), year (F = 9.37, df = 2, p = 0.0003), temperature (F = 5.44, df = 1, p = 0.0236), and month*year (F = 2.69, df = 5, p = 0.0311). The model was highly significant (p < 0.0001) and explained 53% of the variation in mean response per survey period (R² = 0.5353). Response probability data obtained from the radio-tagged black rails and data from the weekly playback survey route were combined to provide a density estimate of 0.25 birds/ha for the St. Johns National Wildlife Refuge. Density estimates for black rails may be obtained from playback surveys and fixed-radius circular plots. Circular plots should be considered as having a radius of 80 m and be located so the plot centers are 150 m apart. Playback tapes should contain one series of Kic-kic-kerr and Growl vocalizations recorded within the same geographic region as the study area. Surveys should be conducted from 0-2 hours after sunrise or 0-2 hours before sunset, during the pre-nesting season, and when wind velocity is < 20 kph. 
Observers should listen for 3-4 minutes after playing the survey tape and record responses heard during that time. Observers should be trained to identify black rail vocalizations and should have acceptable hearing ability. Given the number of variables that may have large effects on the response behavior of black rails to tape playback, we recommend that future studies using playback surveys should be cautious when presenting estimates of 'absolute' density. Though results did account for variation in response behavior, we believe that additional variation in vocal response between sites, with breeding status, and bird density remains in question. Playback surveys along fixed routes providing a simple index of abundance would be useful to monitor populations over large geographic areas, and over time. Considering the limitations of most agency resources for webless waterbirds, index surveys may be more appropriate. Future telemetry studies of this type on other species and at other sites would be useful to calibrate information obtained from playback surveys whether reporting an index of abundance or density estimate.
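The conditional response probabilities quoted in this record come from a logistic model. The mechanics can be sketched as follows; the coefficients and covariate names are placeholders, not the study's fitted values.

```python
import math

def response_probability(intercept, coefs, x):
    """Logistic-regression response probability:
    P(response) = 1 / (1 + exp(-(b0 + sum_i b_i * x_i)))."""
    z = intercept + sum(coefs[name] * x[name] for name in coefs)
    return 1.0 / (1.0 + math.exp(-z))

# The quoted probabilities correspond to linear scores on the logit scale:
# 0.811 <-> log(0.811 / 0.189), 0.189 <-> log(0.189 / 0.811).
p_male_nonnesting = response_probability(math.log(0.811 / 0.189), {}, {})
p_female_nesting = response_probability(math.log(0.189 / 0.811), {}, {})
```

A distance covariate would enter as, e.g., `coefs={"distance_m": -0.01}` with a negative sign, so that response probability falls as the bird gets farther from the playback source; that sign convention is an assumption for illustration.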

  6. Evaluation of glioblastomas and lymphomas with whole-brain CT perfusion: Comparison between a delay-invariant singular-value decomposition algorithm and a Patlak plot.

    PubMed

    Hiwatashi, Akio; Togao, Osamu; Yamashita, Koji; Kikuchi, Kazufumi; Yoshimoto, Koji; Mizoguchi, Masahiro; Suzuki, Satoshi O; Yoshiura, Takashi; Honda, Hiroshi

    2016-07-01

    Correction of contrast leakage is recommended when enhancing lesions during perfusion analysis. The purpose of this study was to assess the diagnostic performance of computed tomography perfusion (CTP) with a delay-invariant singular-value decomposition algorithm (SVD+) and a Patlak plot in differentiating glioblastomas from lymphomas. This prospective study included 17 adult patients (12 men and 5 women) with pathologically proven glioblastomas (n=10) and lymphomas (n=7). CTP data were analyzed using SVD+ and a Patlak plot. The relative tumor blood volume and flow compared to contralateral normal-appearing gray matter (rCBV and rCBF derived from SVD+, and rBV and rFlow derived from the Patlak plot) were used to differentiate between glioblastomas and lymphomas. The Mann-Whitney U test and receiver operating characteristic (ROC) analyses were used for statistical analysis. Glioblastomas showed significantly higher rFlow (3.05±0.49, mean±standard deviation) than lymphomas (1.56±0.53; P<0.05). There were no statistically significant differences between glioblastomas and lymphomas in rBV (2.52±1.57 vs. 1.03±0.51; P>0.05), rCBF (1.38±0.41 vs. 1.29±0.47; P>0.05), or rCBV (1.78±0.47 vs. 1.87±0.66; P>0.05). ROC analysis showed the best diagnostic performance with rFlow (Az=0.871), followed by rBV (Az=0.771), rCBF (Az=0.614), and rCBV (Az=0.529). CTP analysis with a Patlak plot was helpful in differentiating between glioblastomas and lymphomas, but CTP analysis with SVD+ was not. Copyright © 2016 Elsevier Masson SAS. All rights reserved.

  7. Dependence of normal brain integral dose and normal tissue complication probability on the prescription isodose values for γ-knife radiosurgery

    NASA Astrophysics Data System (ADS)

    Ma, Lijun

    2001-11-01

    A recent multi-institutional clinical study suggested possible benefits of lowering the prescription isodose lines for stereotactic radiosurgery procedures. In this study, we investigate the dependence of the normal brain integral dose and the normal tissue complication probability (NTCP) on the prescription isodose values for γ-knife radiosurgery. An analytical dose model was developed for γ-knife treatment planning. The dose model was commissioned by fitting the measured dose profiles for each helmet size. The dose model was validated by comparing its results with the Leksell gamma plan (LGP, version 5.30) calculations. The normal brain integral dose and the NTCP were computed and analysed for an ensemble of treatment cases. The functional dependence of the normal brain integral dose and the NTCP versus the prescription isodose values was studied for these cases. We found that the normal brain integral dose and the NTCP increase significantly when lowering the prescription isodose lines from 50% to 35% of the maximum tumour dose. Alternatively, the normal brain integral dose and the NTCP decrease significantly when raising the prescription isodose lines from 50% to 65% of the maximum tumour dose. The results may be used as a guideline for designing future dose escalation studies for γ-knife applications.

  8. Using occupancy models to investigate the prevalence of ectoparasitic vectors on hosts: an example with fleas on prairie dogs

    USGS Publications Warehouse

    Eads, David A.; Biggins, Dean E.; Doherty, Paul F.; Gage, Kenneth L.; Huyvaert, Kathryn P.; Long, Dustin H.; Antolin, Michael F.

    2013-01-01

    Ectoparasites are often difficult to detect in the field. We developed a method that can be used with occupancy models to estimate the prevalence of ectoparasites on hosts, and to investigate factors that influence rates of ectoparasite occupancy while accounting for imperfect detection. We describe the approach using a study of fleas (Siphonaptera) on black-tailed prairie dogs (Cynomys ludovicianus). During each primary occasion (monthly trapping events), we combed a prairie dog three consecutive times to detect fleas (15 s/combing). We used robust design occupancy modeling to evaluate hypotheses for factors that might correlate with the occurrence of fleas on prairie dogs, and factors that might influence the rate at which prairie dogs are colonized by fleas. Our combing method was highly effective; dislodged fleas fell into a tub of water and could not escape, and there was an estimated 99.3% probability of detecting a flea on an occupied host when using three combings. While overall detection was high, the probability of detection was always <1.00 during each primary combing occasion, highlighting the importance of considering imperfect detection. The combing method (removal of fleas) caused a decline in detection during primary occasions, and we accounted for that decline to avoid inflated estimates of occupancy. Regarding prairie dogs, flea occupancy was heightened in old/natural colonies of prairie dogs, and on hosts that were in poor condition. Occupancy was initially low in plots with high densities of prairie dogs, but, as the study progressed, the rate of flea colonization increased in plots with high densities of prairie dogs in particular. Our methodology can be used to improve studies of ectoparasites, especially when the probability of detection is low. Moreover, the method can be modified to investigate the co-occurrence of ectoparasite species, and community level factors such as species richness and interspecific interactions.
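The quoted 99.3% detection probability over three combings follows from the complement rule, assuming equal and independent per-pass detection; this is a simplification of the removal design the authors actually modeled (their per-pass probability declined as fleas were removed).

```python
def cumulative_detection(p_per_pass, passes):
    """Probability of detecting at least one flea across a fixed number
    of combing passes, with equal, independent per-pass detection."""
    return 1.0 - (1.0 - p_per_pass) ** passes

def per_pass_from_cumulative(p_star, passes):
    """Invert the relation: per-pass probability implied by a cumulative one."""
    return 1.0 - (1.0 - p_star) ** (1.0 / passes)

# A cumulative probability of 0.993 over three combings implies a per-pass
# detection probability of roughly 0.81 under this simplification.
p_pass = per_pass_from_cumulative(0.993, 3)
```

This also shows why the authors stress imperfect detection: even with p* = 0.993 per primary occasion, naive occupancy estimates that ignore the remaining 0.7% miss rate are biased low.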

  9. Normal myocardial perfusion scan portends a benign prognosis independent from the pretest probability of coronary artery disease. Sub-analysis of the J-ACCESS study.

    PubMed

    Imamura, Yosihiro; Fukuyama, Takaya; Nishimura, Sigeyuki; Nishimura, Tsunehiko

    2009-08-01

    We assessed the usefulness of gated stress/rest 99mTc-tetrofosmin myocardial perfusion single photon emission computed tomography (SPECT) to predict ischemic cardiac events in Japanese patients with various estimated pretest probabilities of coronary artery disease (CAD). Of the 4031 consecutively registered patients for a J-ACCESS (Japanese Assessment of Cardiac Events and Survival Study by Quantitative Gated SPECT) study, 1904 patients without prior cardiac events were selected. Gated stress/rest myocardial perfusion SPECT was performed, and segmental perfusion scores and quantitative gated SPECT results were derived. The pretest probability of having CAD was estimated using the American College of Cardiology/American Heart Association/American College of Physicians-American Society of Internal Medicine guideline data for the management of patients with chronic stable angina, which includes age, gender, and type of chest discomfort. The patients were followed up for three years. During the three-year follow-up period, 96 developed ischemic cardiac events: 17 cardiac deaths, 8 nonfatal myocardial infarctions, and 71 clinically driven revascularizations. The summed stress score (SSS) was the most powerful independent predictor of all ischemic cardiac events (hazard ratio 1.077, CI 1.045-1.110). An abnormal SSS (> 3) was associated with a significantly higher cardiac event rate in patients with an intermediate to high pretest probability of CAD. A normal SSS (≤ 3) was associated with a low event rate in patients with any pretest probability of CAD. Myocardial perfusion SPECT is useful for further risk stratification of patients with suspected CAD. An abnormal scan result (SSS > 3) is discriminative for subsequent cardiac events only in the groups with an intermediate to high pretest probability of CAD. The salient result is that normal scan results portend a benign prognosis independent of the pretest probability of CAD.

  10. Bayes’ theorem, the ROC diagram and reference values: Definition and use in clinical diagnosis

    PubMed Central

    Kallner, Anders

    2017-01-01

    Medicine is diagnosis, treatment and care. To diagnose is to consider the probability of the cause of discomfort experienced by the patient. The physician may face many options and all decisions are liable to uncertainty to some extent. The rational action is to perform selected tests and thereby increase the pre-test probability to reach a superior post-test probability of a particular option. To draw the right conclusions from a test, certain background information about the performance of the test is necessary. We set up a partially artificial dataset with measured results obtained from the laboratory information system and simulated diagnosis attached. The dataset is used to explore the use of contingency tables with a unique graphic design and software to establish and compare ROC graphs. The loss of information in the ROC curve is compensated by a cumulative data analysis (CDA) plot linked to a display of the efficiency and predictive values. A standard for the contingency table is suggested and the use of dynamic reference intervals discussed. PMID:29209139
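    The pre-test to post-test updating described in this record follows directly from Bayes' theorem. A minimal sketch, assuming a hypothetical test characterized only by its sensitivity and specificity (the numeric values in the example are illustrative, not from the record's dataset):

```python
def post_test_probability(pretest, sensitivity, specificity, positive=True):
    """Update a pre-test probability of disease with a test result
    via Bayes' theorem."""
    if positive:
        # P(D | +) = sens * p / (sens * p + (1 - spec) * (1 - p))
        num = sensitivity * pretest
        den = num + (1 - specificity) * (1 - pretest)
    else:
        # P(D | -) = (1 - sens) * p / ((1 - sens) * p + spec * (1 - p))
        num = (1 - sensitivity) * pretest
        den = num + specificity * (1 - pretest)
    return num / den

# Illustrative: 30% pre-test probability, 90% sensitivity, 80% specificity.
p_pos = post_test_probability(0.30, 0.90, 0.80)                  # raised by a positive result
p_neg = post_test_probability(0.30, 0.90, 0.80, positive=False)  # lowered by a negative result
```

    A positive result raises the 30% pre-test probability to about 66%, a negative result lowers it to about 5%, which is exactly the "superior post-test probability" the record describes.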

  11. Local regularity for time-dependent tug-of-war games with varying probabilities

    NASA Astrophysics Data System (ADS)

    Parviainen, Mikko; Ruosteenoja, Eero

    2016-07-01

We study local regularity properties of value functions of time-dependent tug-of-war games. For games with constant probabilities we get local Lipschitz continuity. For more general games with probabilities depending on space and time we obtain Hölder and Harnack estimates. The games have a connection to the normalized p(x,t)-parabolic equation u_t = Δu + (p(x,t) - 2) Δ_∞^N u.

  12. Evaluation of alternative planting strategies to reduce wheat stem sawfly (Hymenoptera: Cephidae) damage to spring wheat in the northern Great Plains.

    PubMed

    Beres, B L; Cárcamo, H A; Bremer, E

    2009-12-01

Wheat, Triticum aestivum L., producers are often reluctant to use solid-stemmed wheat cultivars resistant to the wheat stem sawfly, Cephus cinctus Norton (Hymenoptera: Cephidae), due to concerns regarding yield, efficacy, or market opportunities. We evaluated the impact of several planting strategies on wheat yield and quality and wheat stem sawfly infestation at two locations over a three-year period. Experimental units consisted of large plots (50 by 200 m) located on commercial farms adjacent to wheat stem sawfly-infested fields. Compared with a monoculture of a hollow-stemmed cultivar ('AC Barrie'), planting a monoculture of a solid-stemmed cultivar ('AC Eatonia') increased yield by an average of 16% (0.4 Mg ha−1) and increased the grade of wheat by one unit at the two most heavily infested site-years. Planting a 1:1 blend of AC Eatonia and AC Barrie increased yield by an average of 11%, whereas planting 20- or 40-m plot margins to AC Eatonia increased yield by an average of 8%. High wheat stem sawfly pressure limited the effectiveness of using resistant cultivars in field margins because plants were often infested beyond the plot margin, with uniform infestation down the length of the plots at the two most heavily infested site-years. The effectiveness of AC Eatonia in reducing wheat stem sawfly survivorship was modest in this study, probably due to weather-related factors influencing pith expression and to the high abundance of wheat stem sawfly. Greater benefits from planting field margins to resistant cultivars, or from planting a blend of resistant and susceptible cultivars, might be achievable under lower wheat stem sawfly pressure.

  13. Long-term dynamics of heavy metals in the upper horizons of soils in the region of a copper smelter impacts during the period of reduced emission

    NASA Astrophysics Data System (ADS)

    Vorobeichik, E. L.; Kaigorodova, S. Yu.

    2017-08-01

The 23-year dynamics of actual acidity (pHwater) and acid-soluble heavy metals (Cu, Pb, Cd, Zn) in the forest litter and humus horizons of soils in spruce-fir forests were studied in an area subjected to long-term (since 1940) pollution with atmospheric emissions from the Middle Ural Copper Smelter (Revda, Sverdlovsk oblast). For this purpose, 25 permanent sample plots were established in 1989 on lower slopes at different distances from the enterprise (30, 7, 4, 2, and 1 km; 5 plots at each distance). The emissions from the smelter have decreased since the early 1990s: by 2012, the emissions of sulfur dioxide and dust had decreased by factors of 100 and 40, respectively, compared with the emissions in 1980. Samples of the litter and humus horizons were collected on the permanent plots in 1989, 1999, and 2012. The results indicate that the pH of the litter and humus horizons returned to the background level 10 and 23 years, respectively, after the beginning of the reduction in emissions. However, these characteristics in the impact zone still differ somewhat from those in the background area. In 2012, the content of Cu in the litter had decreased compared to 1989 on all the plots; the content of Cu in the humus horizon decreased only in the close vicinity of the smelter. The contents of other metals in the litter and humus horizons remained constant or increased (probably because of the pH-dependent decrease in migration capacity). The absence of pronounced removal of metals from the soils results in the retention of high contamination risk and the persistence of the suppressed state of biota within the impact zone.

  14. The effect of leaf litter cover on surface runoff and soil erosion in Northern China.

    PubMed

    Li, Xiang; Niu, Jianzhi; Xie, Baoyuan

    2014-01-01

The role of leaf litter in hydrological processes and soil erosion of forest ecosystems is poorly understood. A field experiment was conducted under simulated rainfall in runoff plots with a slope of 10%. Two common types of litter in North China (from Quercus variabilis, representing broadleaf litter, and Pinus tabulaeformis, representing needle leaf litter), four amounts of litter, and five rainfall intensities were tested. Results revealed that the litter reduced runoff and delayed the beginning of runoff, and significantly reduced soil loss (p<0.05). Average runoff yield was 29.5% and 31.3% less than that of the bare-soil plot for Q. variabilis and P. tabulaeformis, respectively, and average sediment yield was 85.1% and 79.9% lower. Rainfall intensity significantly affected runoff (R = 0.99, p<0.05), and the efficiency of runoff reduction by litter decreased considerably as rainfall intensity increased: runoff yield and the runoff coefficient increased dramatically, by 72.9 and 5.4 times, respectively, and the period of time before runoff appeared decreased by approximately 96.7% when rainfall intensity increased from 5.7 to 75.6 mm h−1. Broadleaf and needle leaf litter showed similar effects on runoff and soil erosion control, since no significant differences (p≤0.05) were observed in runoff and sediment variables between the two litter-covered plots. In contrast, litter mass was probably not a main factor in determining runoff and sediment, because a significant correlation was found only with sediment in the Q. variabilis litter plot. Finally, runoff yield was significantly correlated (p<0.05) with sediment yield. These results suggest that the protective role of leaf litter in runoff and erosion processes is crucial, and that both rainfall intensity and litter characteristics affect these processes.

  15. Modelling of OPNMR phenomena using photon energy-dependent 〈Sz〉 in GaAs and InP

    NASA Astrophysics Data System (ADS)

    Wheeler, Dustin D.; Willmering, Matthew M.; Sesti, Erika L.; Pan, Xingyuan; Saha, Dipta; Stanton, Christopher J.; Hayes, Sophia E.

    2016-12-01

    We have modified the model for optically-pumped NMR (OPNMR) to incorporate a revised expression for the expectation value of the z-projection of the electron spin, 〈Sz 〉 and apply this model to both bulk GaAs and a new material, InP. This expression includes the photon energy dependence of the electron polarization when optically pumping direct-gap semiconductors in excess of the bandgap energy, Eg . Rather than using a fixed value arising from coefficients (the matrix elements) for the optical transitions at the k = 0 bandedge, we define a new parameter, Sopt (Eph) . Incorporating this revised element into the expression for 〈Sz 〉 , we have simulated the photon energy dependence of the OPNMR signals from bulk semi-insulating GaAs and semi-insulating InP. In earlier work, we matched calculations of electron spin polarization (alone) to features in a plot of OPNMR signal intensity versus photon energy for optical pumping (Ramaswamy et al., 2010). By incorporating an electron spin polarization which varies with pump wavelength into the penetration depth model of OPNMR signal, we are able to model features in both III-V semiconductors. The agreement between the OPNMR data and the corresponding model demonstrates that fluctuations in the OPNMR intensity have particular sensitivity to light hole-to-conduction band transitions in bulk systems. We provide detailed plots of the theoretical predictions for optical pumping transition probabilities with circularly-polarized light for both helicities of light, broken down into illustrative plots of optical magnetoabsorption and spin polarization, shown separately for heavy-hole and light-hole transitions. These plots serve as an effective roadmap of transitions, which are helpful to other researchers investigating optical pumping effects.

  16. The Effect of Leaf Litter Cover on Surface Runoff and Soil Erosion in Northern China

    PubMed Central

    Li, Xiang; Niu, Jianzhi; Xie, Baoyuan

    2014-01-01

The role of leaf litter in hydrological processes and soil erosion of forest ecosystems is poorly understood. A field experiment was conducted under simulated rainfall in runoff plots with a slope of 10%. Two common types of litter in North China (from Quercus variabilis, representing broadleaf litter, and Pinus tabulaeformis, representing needle leaf litter), four amounts of litter, and five rainfall intensities were tested. Results revealed that the litter reduced runoff and delayed the beginning of runoff, and significantly reduced soil loss (p<0.05). Average runoff yield was 29.5% and 31.3% less than that of the bare-soil plot for Q. variabilis and P. tabulaeformis, respectively, and average sediment yield was 85.1% and 79.9% lower. Rainfall intensity significantly affected runoff (R = 0.99, p<0.05), and the efficiency of runoff reduction by litter decreased considerably as rainfall intensity increased: runoff yield and the runoff coefficient increased dramatically, by 72.9 and 5.4 times, respectively, and the period of time before runoff appeared decreased by approximately 96.7% when rainfall intensity increased from 5.7 to 75.6 mm h−1. Broadleaf and needle leaf litter showed similar effects on runoff and soil erosion control, since no significant differences (p≤0.05) were observed in runoff and sediment variables between the two litter-covered plots. In contrast, litter mass was probably not a main factor in determining runoff and sediment, because a significant correlation was found only with sediment in the Q. variabilis litter plot. Finally, runoff yield was significantly correlated (p<0.05) with sediment yield. These results suggest that the protective role of leaf litter in runoff and erosion processes is crucial, and that both rainfall intensity and litter characteristics affect these processes. PMID:25232858

  17. Modelling of OPNMR phenomena using photon energy-dependent 〈Sz〉 in GaAs and InP.

    PubMed

    Wheeler, Dustin D; Willmering, Matthew M; Sesti, Erika L; Pan, Xingyuan; Saha, Dipta; Stanton, Christopher J; Hayes, Sophia E

    2016-12-01

We have modified the model for optically-pumped NMR (OPNMR) to incorporate a revised expression for the expectation value of the z-projection of the electron spin, 〈Sz〉, and apply this model to both bulk GaAs and a new material, InP. This expression includes the photon energy dependence of the electron polarization when optically pumping direct-gap semiconductors in excess of the bandgap energy, Eg. Rather than using a fixed value arising from coefficients (the matrix elements) for the optical transitions at the k=0 bandedge, we define a new parameter, Sopt(Eph). Incorporating this revised element into the expression for 〈Sz〉, we have simulated the photon energy dependence of the OPNMR signals from bulk semi-insulating GaAs and semi-insulating InP. In earlier work, we matched calculations of electron spin polarization (alone) to features in a plot of OPNMR signal intensity versus photon energy for optical pumping (Ramaswamy et al., 2010). By incorporating an electron spin polarization which varies with pump wavelength into the penetration depth model of OPNMR signal, we are able to model features in both III-V semiconductors. The agreement between the OPNMR data and the corresponding model demonstrates that fluctuations in the OPNMR intensity have particular sensitivity to light hole-to-conduction band transitions in bulk systems. We provide detailed plots of the theoretical predictions for optical pumping transition probabilities with circularly-polarized light for both helicities of light, broken down into illustrative plots of optical magnetoabsorption and spin polarization, shown separately for heavy-hole and light-hole transitions. These plots serve as an effective roadmap of transitions, which are helpful to other researchers investigating optical pumping effects. Copyright © 2016 Elsevier Inc. All rights reserved.

  18. Local Topography Effect on Plant Area Index Profile Calculation from Small Footprint Airborne Laser Scanning

    NASA Astrophysics Data System (ADS)

    Liu, J.; Wang, T.; Skidmore, A. K.; Heurich, M.

    2016-12-01

The plant area index (PAI) profile is a quantitative description of how plants (including leaves and woody materials) are distributed vertically, as a function of height. PAI profiles can be used for many applications including biomass estimation, radiative transfer modelling, fire fuel modelling and wildlife habitat assessment. With airborne laser scanning (ALS), forest structure underneath the canopy surface can be detected. PAI profiles can be calculated through estimates of the vertically resolved gap fraction from ALS data. In this process, a gridding or aggregation step is often involved. Most current research neglects local topographic change and utilizes a height normalization algorithm to achieve a local or relative height, implying a flat local terrain assumption inside the grid or aggregation area. However, in mountainous forest, this assumption is often not valid. Therefore, in this research, the local topographic effect on the PAI profile calculation was studied. Small footprint discrete multi-return ALS data were acquired over the Bavarian Forest National Park under leaf-off and leaf-on conditions. Ground truth data, including tree height, canopy cover and DBH, as well as digital hemispherical photos, were collected in 30 plots. These plots covered a wide range of forest structure, plant species, local topography conditions and understory coverage. PAI profiles were calculated both with and without height normalization. The difference between height-normalized and non-normalized profiles was evaluated with the coefficient of variation of root mean squared difference (CV-RMSD). The derived metric PAI values from the PAI profiles were also evaluated against ground truth PAI from the hemispherical photos. Results showed that change in local topography had significant effects on the PAI profile. The CV-RMSD between PAI profiles calculated with or without height normalization ranged from 24.5% to 163.9%. Height normalization (neglecting topography change) can lead to offsets in the height of plant material that could potentially cause large errors and uncertainty when used in applications utilizing absolute height, such as radiative transfer modelling and fire fuel modelling. This research demonstrates that when calculating the PAI profile from ALS, local topography has to be taken into account.
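    The CV-RMSD comparison above can be sketched as follows, assuming the common definition (RMSD between the two profiles, normalized by the mean of the reference profile and expressed as a percent); the paper may normalize slightly differently:

```python
def cv_rmsd(profile_ref, profile_other):
    """Coefficient of variation of the root mean squared difference
    between two equal-length vertical profiles, in percent:
    100 * RMSD / mean(reference profile)."""
    if len(profile_ref) != len(profile_other):
        raise ValueError("profiles must have the same number of height bins")
    n = len(profile_ref)
    rmsd = (sum((a - b) ** 2 for a, b in zip(profile_ref, profile_other)) / n) ** 0.5
    mean_ref = sum(profile_ref) / n
    return 100.0 * rmsd / mean_ref
```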

  19. The determination of substrate conditions from the orientations of solitary rugose corals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bolton, J.C.; Driese, S.G.

    1990-10-01

The substrate conditions of mudstone strata formed in ancient epicontinental settings may be determined from taphonomic assemblages of solitary rugose corals. Equal-area plots of the orientations of preserved corals can be used to infer whether subsequent hydrodynamic conditions caused any post-mortem reworking of the corals. Mechanically stable positions for curved corals can be determined. Curved corals preserved in mechanically stable positions are interpreted to have been deposited on firm or hard substrates. Curved corals preserved in mechanically unstable positions were probably embedded in soft or soupy substrates.

  20. Structural analysis of vibroacoustical processes

    NASA Technical Reports Server (NTRS)

    Gromov, A. P.; Myasnikov, L. L.; Myasnikova, Y. N.; Finagin, B. A.

    1973-01-01

The method of automatic identification of acoustic signals by means of segmentation was used to investigate noises and vibrations in machines and mechanisms for cybernetic diagnostics. The structural analysis consists of presenting a noise or vibroacoustic signal as a sequence of segments, determined by time quantization, in which each segment is characterized by specific spectral characteristics. The structural spectrum is plotted as a histogram of the segments, i.e., as the probability density of segment occurrence versus segment type. It is assumed that the conditions of ergodic processes are maintained.

  1. A generic interface between COSMIC/NASTRAN and PATRAN (R)

    NASA Technical Reports Server (NTRS)

    Roschke, Paul N.; Premthamkorn, Prakit; Maxwell, James C.

    1990-01-01

Despite its powerful analytical capabilities, COSMIC/NASTRAN lacks adequate post-processing adroitness. PATRAN, on the other hand, is widely accepted for its graphical capabilities. A nonproprietary, public domain code mnemonically titled CPI (for COSMIC/NASTRAN-PATRAN Interface) is designed to manipulate a large number of files rapidly and efficiently between the two parent codes. In addition to preparing PATRAN's results files, CPI also prepares PATRAN's P/PLOT data files for xy plotting. The user is prompted for necessary information during an interactive session. The current implementation supports NASTRAN's displacement approach, including the following rigid formats: (1) static analysis, (2) normal modal analysis, (3) direct transient response, and (4) modal transient response. A wide variety of data blocks are also supported. Error trapping is given special consideration. A sample session with CPI illustrates its simplicity and ease of use.

  2. Testing the Amazon savannization hypothesis: fire effects on invasion of a neotropical forest by native cerrado and exotic pasture grasses

    PubMed Central

    Silvério, Divino V.; Brando, Paulo M.; Balch, Jennifer K.; Putz, Francis E.; Nepstad, Daniel C.; Oliveira-Santos, Claudinei; Bustamante, Mercedes M. C.

    2013-01-01

Changes in climate and land use that interact synergistically to increase fire frequencies and intensities in tropical regions are predicted to drive forests to new grass-dominated stable states. To reveal the mechanisms for such a transition, we established 50 ha plots in a transitional forest in the southwestern Brazilian Amazon and subjected them to different fire treatments (unburned, burned annually (B1yr), or burned at 3-year intervals (B3yr)). Over an 8-year period since the commencement of these treatments, we documented: (i) the annual rate of pasture and native grass invasion in response to increasing fire frequency; (ii) the establishment of Brachiaria decumbens (an African C4 grass) as a function of decreasing canopy cover and (iii) the effects of grass fine fuel on fire intensity. Grasses invaded approximately 200 m from the edge into the interiors of burned plots (B1yr: 4.31 ha; B3yr: 4.96 ha) but invaded less than 10 m into the unburned plot (0.33 ha). The probability of B. decumbens establishment increased with seed availability and decreased with leaf area index. Fine fuel loads along the forest edge were more than three times higher in grass-dominated areas, which resulted in especially intense fires. Our results indicate that synergies between fires and invasive C4 grasses jeopardize the future of tropical forests. PMID:23610179

  3. Estimation of pressure-, temperature- and frictional heating-related effects on proteins' retention under ultra-high-pressure liquid chromatographic conditions.

    PubMed

    Fekete, Szabolcs; Guillarme, Davy

    2015-05-08

The goal of this work was to evaluate the changes in retention induced by frictional heating, pressure and temperature under ultra-high-pressure liquid chromatography (UHPLC) conditions, for four model proteins (i.e. lysozyme, myoglobin, filgrastim and interferon alpha-2A) possessing molecular weights between 14 and 20 kDa. First, because the decrease in molar volume upon adsorption onto a hydrophobic surface is more pronounced for large molecules such as proteins, the impact of pressure appears to overcome the frictional heating effects. Nevertheless, we have also demonstrated that the retention decrease due to frictional heating was not negligible with such large biomolecules in the variable inlet pressure mode. Secondly, it is clearly shown that the modification of retention under various pressure and temperature conditions cannot be explained solely by the frictional heating and pressure effects. Indeed, some very uncommon van't Hoff plots (concave plots with a maximum) were recorded for our model/therapeutic proteins. These maximum retention factor values on the van't Hoff plots indicate a probable change of secondary structure/conformation with pressure and temperature. Based on these observations, it seems that the combination of pressure and temperature causes protein denaturation, and this folding-unfolding process is clearly protein dependent. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. Grounding a natural background level for fluoride in a potentially contaminated crystalline aquifer in south India.

    PubMed

    Sajil Kumar, P J

    2017-12-01

Fluoride contamination is one of the most alarming issues for countries that depend on groundwater for drinking water supply. A careful examination of the hydrogeochemical conditions and routine monitoring of fluoride levels are therefore quintessential. Estimating the natural background level (NBL) of fluoride provides significant information for assessing current and future contamination episodes. Vellore District in Tamil Nadu is a hard rock terrain known for its F-rich groundwater. In this study, we attempted to form a benchmark for fluoride using hydrochemical pre-selection (based on TDS and NO3) and cumulative probability plots (CPP). Principal component analysis (PCA) is applied to evaluate the corresponding factor grouping of the total of 68 samples, which is later mapped using a geostatistical tool in ArcGIS. From the CPP, we derived the NBL of F as 0.75 mg/L. This value is compared with the observed concentration in each sample, and the samples were spatially plotted based on the NBL. The resultant plot suggests that the W-NW part of the study area exceeds the NBL of F and the E-EW regions are below it. Spatial variation of the factor scores also supported this observation. Grounding an NBL and extending it to other parts of potentially contaminated aquifers are highly recommended for better understanding and management of water supply systems.
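    A cumulative probability plot of the kind used in this record pairs the sorted concentrations with plotting positions (and their normal scores). A minimal sketch of that machinery; the hydrochemical pre-selection and the identification of the break point separating background from anomalous populations are separate steps, and the Hazen plotting position used here is an assumption:

```python
from statistics import NormalDist

def cumulative_probability_points(values):
    """Sorted values paired with Hazen plotting positions and normal
    scores, i.e. the coordinates of a cumulative probability plot (CPP)."""
    xs = sorted(values)
    n = len(xs)
    pts = []
    for i, x in enumerate(xs, start=1):
        p = (i - 0.5) / n               # Hazen plotting position
        z = NormalDist().inv_cdf(p)     # normal score (probability axis of the CPP)
        pts.append((x, p, z))
    return pts

def value_at_probability(values, prob):
    """Read a concentration off the CPP at a chosen cumulative probability
    by linear interpolation between plotting positions."""
    pts = cumulative_probability_points(values)
    for (x0, p0, _), (x1, p1, _) in zip(pts, pts[1:]):
        if p0 <= prob <= p1:
            return x0 + (x1 - x0) * (prob - p0) / (p1 - p0)
    return pts[0][0] if prob < pts[0][1] else pts[-1][0]
```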

  5. Catalytic site interactions in yeast OMP synthase.

    PubMed

    Hansen, Michael Riis; Barr, Eric W; Jensen, Kaj Frank; Willemoës, Martin; Grubmeyer, Charles; Winther, Jakob R

    2014-01-15

    The enigmatic kinetics, half-of-the-sites binding, and structural asymmetry of the homodimeric microbial OMP synthases (orotate phosphoribosyltransferase, EC 2.4.2.10) have been proposed to result from an alternating site mechanism in these domain-swapped enzymes [R.W. McClard et al., Biochemistry 45 (2006) 5330-5342]. This behavior was investigated in the yeast enzyme by mutations in the conserved catalytic loop and 5-phosphoribosyl-1-diphosphate (PRPP) binding motif. Although the reaction is mechanistically sequential, the wild-type (WT) enzyme shows parallel lines in double reciprocal initial velocity plots. Replacement of Lys106, the postulated intersubunit communication device, produced intersecting lines in kinetic plots with a 2-fold reduction of kcat. Loop (R105G K109S H111G) and PRPP-binding motif (D131N D132N) mutant proteins, each without detectable enzymatic activity and ablated ability to bind PRPP, complemented to produce a heterodimer with a single fully functional active site showing intersecting initial velocity plots. Equilibrium binding of PRPP and orotidine 5'-monophosphate showed a single class of two binding sites per dimer in WT and K106S enzymes. Evidence here shows that the enzyme does not follow half-of-the-sites cooperativity; that interplay between catalytic sites is not an essential feature of the catalytic mechanism; and that parallel lines in steady-state kinetics probably arise from tight substrate binding. Copyright © 2013. Published by Elsevier Inc.
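    The closing claim, that tight substrate binding can make double-reciprocal lines look parallel even in a sequential mechanism, can be illustrated with a generic ordered bi-bi rate law (a textbook form used here for illustration, not the authors' fitted model):

```python
def recip_velocity(a, b, vmax=1.0, ka=1.0, kb=1.0, kia=1.0):
    """1/v for a generic sequential bi-substrate rate law:
    1/v = (1/vmax) * (1 + ka/a + kb/b + kia*kb/(a*b))."""
    return (1 + ka / a + kb / b + kia * kb / (a * b)) / vmax

def slope_vs_inv_a(b, vmax=1.0, ka=1.0, kb=1.0, kia=1.0):
    """Slope of the 1/v versus 1/a line at fixed b:
    ka/vmax + kia*kb/(vmax*b). As kia -> 0 (tight binding of the first
    substrate), the slope loses its dependence on b, so lines recorded
    at different b appear parallel instead of intersecting."""
    return ka / vmax + kia * kb / (vmax * b)
```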

  6. Integrated Cognitive-neuroscience Architectures for Understanding Sensemaking (ICArUS): Phase 2 Test and Evaluation Development Guide

    DTIC Science & Technology

    2014-11-01

location, based on the evidence provided in Datum (OSINT, IMINT, and the BLUEBOOK). The targetSum and normalizationConstraint attributes indicate that the..."LessThanOrEqualTo" id="Pp" name="P(Attack | IMINT, OSINT)" type="AttackProbabilityReport_Pp"> <Datum locationId="-1" datumType="OSINT"...AttackProbabilityProbe_Ppc targetSum="1.0" normalizationConstraint="LessThanOrEqualTo" id="Ppc" name="P(Attack | HUMINT, IMINT, OSINT)" type

  7. Relationships of Leaf Area Index and NDVI for 12 Brassica Cultivars in Northeastern Montana

    NASA Astrophysics Data System (ADS)

    Jabro, Jay; Allen, Brett; Long, Dan; Isbell, Terry; Gesch, Russ; Brown, Jack; Hatfield, Jerry; Archer, David; Oblath, Emily; Vigil, Merle; Kiniry, Jim; Hunter, Kimberly; Shonnard, David

    2017-04-01

To our knowledge, there is limited information on the relationship between the normalized difference vegetation index (NDVI) and leaf area index (LAI) in spring Brassica oilseed crops. NDVI and LAI of 12 spring varieties of oilseed crops were measured in a 2014 field study conducted in Sidney, Montana, USA under dryland conditions. These 12 varieties were grouped under six species (B. napus, B. rapa, B. juncea, B. carinata, Sinapis alba, and Camelina sativa). The NDVI and LAI were measured weekly throughout the growing season. The NDVI was continually measured at one sample per second across the whole plot using a Crop Circle ACS-470 active crop canopy sensor. The LAI was measured at two locations at 12 samples per plot using an AccuPar model LP-80 Ceptometer. Treatments were replicated four times in a randomized complete block design in plots of 3 m × 9 m. Temporal dynamics of NDVI and LAI at various growth stages of the 12 varieties were evaluated throughout the growing season. Significant relationships and models between NDVI and LAI were obtained when the 12 varieties were grouped under six species.

  8. Timescales and controls on phosphorus loss from a grassland hillslope following a cessation in P application.

    NASA Astrophysics Data System (ADS)

    Cassidy, Rachel; Doody, Donnacha; Watson, Catherine

    2016-04-01

Despite the implementation of EU regulations controlling the use of fertilisers in agriculture, reserves of phosphorus (P) in soils continue to pose a threat to water quality. Mobilisation and transport of legacy P from soil to surface waters has been highlighted as a probable cause of many water bodies continuing to fail to achieve targets under the Water Framework Directive. However, the rates and quantities lost from farmland, and the timescales for positive change to water quality following cessation of P inputs, remain poorly understood. Monitoring data from an instrumented grassland research site in Northern Ireland provide some insights. The site is located in a hydrologically 'flashy' landscape characterised by steep gradients and poorly drained soils over impermeable bedrock. Between 2000 and 2005, soil Olsen P concentrations were altered in five 0.2 ha hydrologically isolated grazed grassland plots through chemical fertiliser applications of 0, 10, 20, 40 and 80 kg P ha−1 yr−1. By 2004 this had resulted in soil Olsen P concentrations of 19, 24, 28, 38 and 67 mg P L−1 across the plots, after which applications ceased. Subsequently, until 2012, changes in soil Olsen P across the plots and losses to overland flow and drainage were monitored, with near-continuous flow measurement and water samples abstracted for chemical analysis. Runoff events were sampled at 20-minute intervals, while drainage flows were taken as a weekly composite of 4-hourly samples. Overland flow events were defined as separated by at least 24 hours without flow recorded at the respective plot outlets. Drainage flow was examined on a weekly basis, as it was continuous except during prolonged dry periods. To examine the hydrological drivers of overland flow and drainage losses, the dissolved reactive P (DRP) and total P (TP) time series were synchronised with rainfall data and modelled soil moisture deficits.
Results demonstrated that from 2005-2012 there was no significant difference among plots in the recorded TP and DRP time series for either overland flow or drainage flow, despite the large variation in soil Olsen P. Flow-weighted mean concentrations for overland flow losses declined slightly over the period but remained in excess of the chemical Environmental Quality Standard (EQS; 0.035 mg/L) in all plots. In individual events, the plot receiving zero P fertiliser inputs since 2000 often lost as much, or more, P than the plot which received 80 kg ha−1 yr−1 up to 2005. Annual loads also reflected this. Drainage losses showed no decline over the period. The hydrological drivers, particularly the antecedent dry period and soil moisture, were observed to have a greater influence on P loss from the plots than soil P status. Given that Olsen P often forms the basis of nutrient management advice, this raises questions about the environmental sustainability of current nutrient advice for some soil types under similar geoclimatic conditions.
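    The flow-weighted mean concentrations referred to above are total load divided by total flow over a period. A minimal sketch (the concentration and flow series in the example are illustrative, not from the monitoring dataset):

```python
def flow_weighted_mean(concentrations, flows):
    """Flow-weighted mean concentration: sum(C_i * Q_i) / sum(Q_i),
    for paired concentration and flow samples over one period."""
    if len(concentrations) != len(flows):
        raise ValueError("concentration and flow series must align")
    total_load = sum(c * q for c, q in zip(concentrations, flows))
    return total_load / sum(flows)

# Illustrative: TP concentrations (mg/L) and flows (L/s) over one event;
# high-flow samples dominate the flow-weighted mean.
fwmc = flow_weighted_mean([0.02, 0.08, 0.05], [1.0, 5.0, 2.0])
```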

  9. Regional probability distribution of the annual reference evapotranspiration and its effective parameters in Iran

    NASA Astrophysics Data System (ADS)

    Khanmohammadi, Neda; Rezaie, Hossein; Montaseri, Majid; Behmanesh, Javad

    2017-10-01

The reference evapotranspiration (ET0) plays an important role in water management plans in arid or semi-arid countries such as Iran. For this reason, the regional analysis of this parameter is important. The ET0 process is affected by several meteorological parameters, such as wind speed, solar radiation, temperature and relative humidity. Therefore, the effect of the distribution type of the effective meteorological variables on the ET0 distribution was analyzed. For this purpose, the regional probability distributions of the annual ET0 and its effective parameters were determined. The data used in this research were recorded at 30 synoptic stations in Iran during 1960-2014. Using the probability plot correlation coefficient (PPCC) test and the L-moment method, five common distributions were compared and the best distribution was selected. The results of the PPCC test and L-moment diagram indicated that the Pearson type III distribution was the best probability distribution for fitting annual ET0 and its four effective parameters. The RMSE results showed that the abilities of the PPCC test and the L-moment method for regional analysis of reference evapotranspiration and its effective parameters were similar. The results also showed that the distribution type of the parameters which affect ET0 can affect the distribution of reference evapotranspiration.
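    The PPCC test used in this record correlates the ordered sample with theoretical quantiles evaluated at plotting positions; a coefficient near 1 indicates a good fit. A minimal sketch for the normal case only (comparing five candidate distributions, including Pearson type III as in the study, would require each distribution's quantile function; the Blom plotting positions below are an assumed convention):

```python
from statistics import NormalDist, fmean

def ppcc_normal(sample):
    """Probability plot correlation coefficient against the normal
    distribution: Pearson correlation between the ordered observations
    and normal quantiles at Blom plotting positions (i - 3/8)/(n + 1/4)."""
    xs = sorted(sample)
    n = len(xs)
    nd = NormalDist()
    ms = [nd.inv_cdf((i - 0.375) / (n + 0.25)) for i in range(1, n + 1)]
    mx, mm = fmean(xs), fmean(ms)
    sxm = sum((x - mx) * (m - mm) for x, m in zip(xs, ms))
    sxx = sum((x - mx) ** 2 for x in xs)
    smm = sum((m - mm) ** 2 for m in ms)
    return sxm / (sxx * smm) ** 0.5
```

    In practice the computed coefficient is compared against tabulated critical values for the sample size; a coefficient below the critical value rejects the candidate distribution.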

  10. QUESPOWR MRI: QUantification of Exchange as a function of Saturation Power On the Water Resonance

    PubMed Central

    Randtke, Edward A.; Pagel, Mark D.; Cárdenas-Rodríguez, Julio

    2018-01-01

QUantification of Exchange as a function of Saturation Power On the Water Resonance (QUESPOWR) MRI is a new method that can estimate chemical exchange rates. This method performs a series of OPARACHEE MRI acquisitions with a range of RF powers for the WALTZ16* pulse train, which is applied on the water resonance. A QUESPOWR plot can be generated from the power dependence of the % water signal, which is similar to a QUESP plot generated from CEST MRI acquisition methods with RF saturation applied off-resonance from water. A QUESPOWR plot can be quantitatively analyzed using linear fitting methods to provide estimates of average chemical exchange rates. Analyses of the shapes of QUESPOWR plots can also be used to estimate relative differences in average chemical exchange rates and concentrations of biomolecules. The performance of QUESPOWR MRI was assessed via simulations, an in vitro study with iopamidol, and an in vivo study with a mouse model of mammary carcinoma. The results showed that QUESPOWR MRI is especially sensitive to chemical exchange between water and biomolecules that have intermediate to fast chemical exchange rates and chemical shifts close to water, which are notoriously difficult to assess with other CEST MRI methods. In addition, in vivo QUESPOWR MRI detected acidic tumor tissues relative to pH-neutral normal tissues, and therefore may be a new paradigm for tumor detection with MRI. PMID:27404128

  11. 14 CFR 417.224 - Probability of failure analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Probability of failure analysis. 417.224 Section 417.224 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION... phase of normal flight or when any anomalous condition exhibits the potential for a stage or its debris...

  12. 14 CFR 417.224 - Probability of failure analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Probability of failure analysis. 417.224 Section 417.224 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION... phase of normal flight or when any anomalous condition exhibits the potential for a stage or its debris...

  13. 14 CFR 417.224 - Probability of failure analysis.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Probability of failure analysis. 417.224 Section 417.224 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION... phase of normal flight or when any anomalous condition exhibits the potential for a stage or its debris...

  14. 14 CFR 417.224 - Probability of failure analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Probability of failure analysis. 417.224 Section 417.224 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION... phase of normal flight or when any anomalous condition exhibits the potential for a stage or its debris...

  15. 12 CFR 700.2 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... that the facts that caused the deficient share-asset ratio no longer exist; and (ii) The likelihood of further depreciation of the share-asset ratio is not probable; and (iii) The return of the share-asset ratio to its normal limits within a reasonable time for the credit union concerned is probable; and (iv...

  16. 14 CFR 417.224 - Probability of failure analysis.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 4 2014-01-01 2014-01-01 false Probability of failure analysis. 417.224 Section 417.224 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION... phase of normal flight or when any anomalous condition exhibits the potential for a stage or its debris...

  17. Use of Flutemetamol F 18-Labeled Positron Emission Tomography and Other Biomarkers to Assess Risk of Clinical Progression in Patients With Amnestic Mild Cognitive Impairment.

    PubMed

    Wolk, David A; Sadowsky, Carl; Safirstein, Beth; Rinne, Juha O; Duara, Ranjan; Perry, Richard; Agronin, Marc; Gamez, Jose; Shi, Jiong; Ivanoiu, Adrian; Minthon, Lennart; Walker, Zuzana; Hasselbalch, Steen; Holmes, Clive; Sabbagh, Marwan; Albert, Marilyn; Fleisher, Adam; Loughlin, Paul; Triau, Eric; Frey, Kirk; Høgh, Peter; Bozoki, Andrea; Bullock, Roger; Salmon, Eric; Farrar, Gillian; Buckley, Christopher J; Zanette, Michelle; Sherwin, Paul F; Cherubini, Andrea; Inglis, Fraser

    2018-05-14

Patients with amnestic mild cognitive impairment (aMCI) may progress to clinical Alzheimer disease (AD), remain stable, or revert to normal. Earlier progression to AD among patients who were β-amyloid positive vs those who were β-amyloid negative has been previously observed. Current research now accepts that a combination of biomarkers could provide greater refinement in the assessment of risk for clinical progression. To evaluate the ability of flutemetamol F 18 and other biomarkers to assess the risk of progression from aMCI to probable AD. In this multicenter cohort study, from November 11, 2009, to January 16, 2014, patients with aMCI underwent positron emission tomography (PET) at baseline followed by local clinical assessments every 6 months for up to 3 years. Patients with aMCI (365 screened; 232 were eligible) were recruited from 28 clinical centers in Europe and the United States. Physicians remained strictly blinded to the results of PET, and the standard of truth was an independent clinical adjudication committee that confirmed or refuted local assessments. Flutemetamol F 18-labeled PET scans were read centrally as either negative or positive by 5 blinded readers with no knowledge of clinical status. Statistical analysis was conducted from February 19, 2014, to January 26, 2018. Flutemetamol F 18-labeled PET at baseline followed by up to 6 clinical visits every 6 months, as well as magnetic resonance imaging and multiple cognitive measures. Time from PET to probable AD or last follow-up was plotted as a Kaplan-Meier survival curve; PET scan results, age, hippocampal volume, and aMCI stage were entered into Cox proportional hazards regression analyses to identify variables associated with progression to probable AD. Of 232 patients with aMCI (118 women and 114 men; mean [SD] age, 71.1 [8.6] years), 98 (42.2%) had positive results detected on PET scan.
By 36 months, the rates of progression to probable AD were 36.2% overall (81 of 224 patients), 53.6% (52 of 97) for patients with positive results detected on PET scan, and 22.8% (29 of 127) for patients with negative results detected on PET scan. Hazard ratios for association with progression were 2.51 (95% CI, 1.57-3.99; P < .001) for a positive β-amyloid scan alone (primary outcome measure), 5.60 (95% CI, 3.14-9.98; P < .001) with additional low hippocampal volume, and 8.45 (95% CI, 4.40-16.24; P < .001) when poorer cognitive status was added to the model. A combination of positive results of flutemetamol F 18-labeled PET, low hippocampal volume, and cognitive status corresponded with a high probability of risk of progression from aMCI to probable AD within 36 months.
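The time-to-progression analysis above relies on the Kaplan-Meier product-limit estimator, which handles right-censored follow-up (patients who leave the study before progressing). A minimal stand-alone sketch of the estimator (not the study's actual analysis code, and with illustrative variable names):

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate.

    times  : follow-up time for each subject
    events : 1 if the event (e.g. progression) was observed at that time,
             0 if the subject was censored
    Returns a list of (time, survival probability) step points.
    """
    surv = 1.0
    curve = []
    # walk through distinct times in increasing order
    for t in sorted(set(times)):
        d = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        n = sum(1 for ti in times if ti >= t)  # number still at risk at t
        if d > 0:
            surv *= 1 - d / n
            curve.append((t, surv))
    return curve
```

Each observed event time multiplies the running survival probability by the fraction of at-risk subjects who did not progress, so censored subjects contribute to the risk set without forcing a drop in the curve.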

  18. A Gaussian Model-Based Probabilistic Approach for Pulse Transit Time Estimation.

    PubMed

    Jang, Dae-Geun; Park, Seung-Hun; Hahn, Minsoo

    2016-01-01

In this paper, we propose a new probabilistic approach to pulse transit time (PTT) estimation using a Gaussian distribution model. It is motivated by the hypothesis that PTTs normalized by RR intervals follow a Gaussian distribution. To verify the hypothesis, we demonstrate the effects of arterial compliance on the normalized PTTs using the Moens-Korteweg equation, and we observe a Gaussian distribution of the normalized PTTs on real data. To estimate the PTT under this hypothesis, we first assume that R-waves in the electrocardiogram (ECG) can be correctly identified. The R-waves limit the search ranges used to detect pulse peaks in the photoplethysmogram (PPG) and synchronize the results with cardiac beats; i.e., the peaks of the PPG are extracted within the corresponding RR interval of the ECG as pulse peak candidates. Their probabilities of being the actual pulse peak are then calculated using a Gaussian probability function. The parameters of the Gaussian function are automatically updated when a new pulse peak is identified; this update makes the probability function adaptive to variations in cardiac cycles. Finally, the pulse peak is identified as the candidate with the highest probability. The proposed approach is tested on a database in which ECG and PPG waveforms were collected simultaneously during a submaximal bicycle ergometer exercise test. The results are promising, suggesting that the method provides a simple but more accurate PTT estimation in real applications.
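The selection step described above can be sketched as scoring each candidate's RR-normalized PTT under a Gaussian model and keeping the most probable one. This is a minimal illustration, not the authors' code; the exponentially weighted update rule and the `alpha` smoothing factor are assumptions standing in for the paper's unspecified parameter update:

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Gaussian probability density at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def pick_pulse_peak(candidate_ptts, rr_interval, mu, sigma):
    """Return the candidate whose RR-normalized PTT is most probable
    under the current Gaussian model (mu, sigma)."""
    return max(candidate_ptts, key=lambda ptt: gaussian_pdf(ptt / rr_interval, mu, sigma))

def update_model(mu, sigma, new_norm_ptt, alpha=0.1):
    """Hypothetical exponentially weighted update so the model adapts
    to beat-to-beat variations in the cardiac cycle."""
    mu = (1 - alpha) * mu + alpha * new_norm_ptt
    sigma = (1 - alpha) * sigma + alpha * abs(new_norm_ptt - mu)
    return mu, sigma
```

After each identified peak, `update_model` nudges the Gaussian parameters toward the newest normalized PTT, which is what lets the probability function track drift during exercise.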

  19. Towards the harmonization between National Forest Inventory and Forest Condition Monitoring. Consistency of plot allocation and effect of tree selection methods on sample statistics in Italy.

    PubMed

    Gasparini, Patrizia; Di Cosmo, Lucio; Cenni, Enrico; Pompei, Enrico; Ferretti, Marco

    2013-07-01

In the frame of a process aiming at harmonizing the National Forest Inventory (NFI) and ICP Forests Level I Forest Condition Monitoring (FCM) in Italy, we investigated (a) the long-term consistency between FCM sample points (a subsample of the first NFI, 1985, NFI_1) and recent forest area estimates (after the second NFI, 2005, NFI_2) and (b) the effect of the tree selection method (tree-based or plot-based) on sample composition and defoliation statistics. The two investigations were carried out on 261 and 252 FCM sites, respectively. Results show that some individual forest categories (larch and stone pine, Norway spruce, other coniferous, beech, temperate oaks and cork oak forests) are over-represented and others (hornbeam and hophornbeam, other deciduous broadleaved and holm oak forests) are under-represented in the FCM sample. This is probably due to a change in forest cover, which increased by 1,559,200 ha from 1985 to 2005. In the case of a shift from a tree-based to a plot-based selection method, 3,130 (46.7%) of the original 6,703 sample trees will be abandoned, and 1,473 new trees will be selected. The balance between exclusion of former sample trees and inclusion of new ones will be particularly unfavourable for conifers (with only 16.4% of excluded trees replaced by new ones) and less so for deciduous broadleaves (with 63.5% of excluded trees replaced). The total number of tree species surveyed will not be affected, but the number of trees per species will, and the resulting (plot-based) sample composition will have a much larger frequency of deciduous broadleaved trees. The newly selected trees have, in general, smaller diameter at breast height (DBH) and defoliation scores. Given the larger rate of turnover, the deciduous broadleaved part of the sample will be more affected.
Our results suggest that both a revision of the FCM network to account for forest area change and a plot-based approach, to permit statistical inference and avoid bias in the tree sample composition in terms of DBH (and likely age and structure), are desirable in Italy. As the adoption of a plot-based approach will keep a large share of the trees formerly selected, direct tree-by-tree comparison will remain possible, thus limiting the impact on the comparability of the time series. In addition, the plot-based design will favour integration with NFI_2.

  20. The fault-tree compiler

    NASA Technical Reports Server (NTRS)

    Martensen, Anna L.; Butler, Ricky W.

    1987-01-01

The Fault Tree Compiler Program is a new reliability tool used to predict the top-event probability for a fault tree. Five gate types are allowed in the fault tree: AND, OR, EXCLUSIVE OR, INVERT, and M OF N gates. The high-level input language is easy to understand and use when describing the system tree, and the hierarchical fault tree capability can simplify the tree description and decrease program execution time. The current solution technique provides an answer precise to five digits (within the limits of double-precision floating-point arithmetic). The user may vary one failure rate or failure probability over a range of values and plot the results for sensitivity analyses. The solution technique is implemented in FORTRAN; the remaining program code is implemented in Pascal. The program is written to run on a Digital Equipment Corporation VAX with the VMS operating system.
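Assuming independent basic events, the five gate types named in the abstract reduce to simple probability formulas. The sketch below is illustrative (the function names are not from the tool, and XOR is shown for the two-input case only, as exactly-one-of-two):

```python
from itertools import combinations
from math import prod

def and_gate(ps):
    """All inputs fail."""
    return prod(ps)

def or_gate(ps):
    """At least one input fails."""
    return 1 - prod(1 - p for p in ps)

def xor_gate(p1, p2):
    """Exactly one of two inputs fails."""
    return p1 * (1 - p2) + p2 * (1 - p1)

def invert_gate(p):
    """Complement of the input event."""
    return 1 - p

def m_of_n_gate(m, ps):
    """At least m of the n independent inputs fail."""
    n = len(ps)
    total = 0.0
    for k in range(m, n + 1):
        for idx in combinations(range(n), k):
            total += prod(ps[i] if i in idx else 1 - ps[i] for i in range(n))
    return total
```

Evaluating a tree then amounts to applying these formulas bottom-up from the basic-event probabilities to the top event; the enumerative M-of-N form above is exponential in n and serves only to state the definition, not as an efficient algorithm.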
