Sample records for universal calibration plots

  1. External validation and comparison of two nomograms predicting the probability of Gleason sum upgrading between biopsy and radical prostatectomy pathology in two patient populations: a retrospective cohort study.

    PubMed

    Utsumi, Takanobu; Oka, Ryo; Endo, Takumi; Yano, Masashi; Kamijima, Shuichi; Kamiya, Naoto; Fujimura, Masaaki; Sekita, Nobuyuki; Mikami, Kazuo; Hiruta, Nobuyuki; Suzuki, Hiroyoshi

    2015-11-01

    The aim of this study is to validate and compare the predictive accuracy of two nomograms predicting the probability of Gleason sum upgrading between biopsy and radical prostatectomy pathology among representative patients with prostate cancer. We previously developed a nomogram, as did Chun et al. In this validation study, patients originated from two centers: Toho University Sakura Medical Center (n = 214) and Chibaken Saiseikai Narashino Hospital (n = 216). We assessed predictive accuracy using area under the curve values and constructed calibration plots to assess the behavior of each nomogram at each institution. Both nomograms showed high predictive accuracy at each institution, although the calibration plots of both nomograms underestimated the actual probability at Toho University Sakura Medical Center. Clinicians need to examine calibration plots for their own institution to correctly understand the behavior of a nomogram in their patients, even if the nomogram has good predictive accuracy overall.

  2. A universal airborne LiDAR approach for tropical forest carbon mapping.

    PubMed

    Asner, Gregory P; Mascaro, Joseph; Muller-Landau, Helene C; Vieilledent, Ghislain; Vaudry, Romuald; Rasamoelina, Maminiaina; Hall, Jefferson S; van Breugel, Michiel

    2012-04-01

    Airborne light detection and ranging (LiDAR) is fast turning the corner from demonstration technology to a key tool for assessing carbon stocks in tropical forests. With its ability to penetrate tropical forest canopies and detect three-dimensional forest structure, LiDAR may prove to be a major component of international strategies to measure and account for carbon emissions from and uptake by tropical forests. To date, however, basic ecological information such as height-diameter allometry and stand-level wood density have not been mechanistically incorporated into methods for mapping forest carbon at regional and global scales. A better incorporation of these structural patterns in forests may reduce the considerable time needed to calibrate airborne data with ground-based forest inventory plots, which presently necessitate exhaustive measurements of tree diameters and heights, as well as tree identifications for wood density estimation. Here, we develop a new approach that can facilitate rapid LiDAR calibration with minimal field data. Throughout four tropical regions (Panama, Peru, Madagascar, and Hawaii), we were able to predict aboveground carbon density estimated in field inventory plots using a single universal LiDAR model (r² = 0.80, RMSE = 27.6 Mg C ha⁻¹). This model is comparable in predictive power to locally calibrated models, but relies on limited inputs of basal area and wood density information for a given region, rather than on traditional plot inventories. With this approach, we propose to radically decrease the time required to calibrate airborne LiDAR data and thus increase the output of high-resolution carbon maps, supporting tropical forest conservation and climate mitigation policy.

  3. Calibration of an electronic counter and pulse height analyzer for plotting erythrocyte volume spectra.

    DOT National Transportation Integrated Search

    1963-03-01

    A simple technique is presented for calibrating an electronic system used in the plotting of erythrocyte volume spectra. The calibration factors, once obtained, apparently remain applicable for some time. Precise estimates of calibration factors appe...

  4. Non-parametric and least squares Langley plot methods

    NASA Astrophysics Data System (ADS)

    Kiedron, P. W.; Michalsky, J. J.

    2016-01-01

    Langley plots are used to calibrate sun radiometers, primarily for measurement of the aerosol component of the atmosphere that attenuates (scatters and absorbs) incoming direct solar radiation. In principle, the calibration of a sun radiometer is a straightforward application of the Bouguer-Lambert-Beer law V = V₀e^(−τm), where a plot of ln(V) (log voltage) versus m (air mass) yields a straight line with intercept ln(V₀). This ln(V₀) can subsequently be used to solve for τ for any measurement of V and calculation of m. This calibration works well at some high mountain sites, but the application of the Langley plot calibration technique is more complicated at other, more interesting, locales. This paper is concerned with ferreting out calibrations at difficult sites and examining and comparing a number of conventional and non-conventional methods for obtaining successful Langley plots. The 11 techniques discussed indicate that both least squares and various non-parametric techniques produce satisfactory calibrations, with no significant differences among them, when the time series of ln(V₀) values are smoothed and interpolated with median and mean moving-window filters.
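
    The straight-line fit described above can be sketched in a few lines. This is a minimal illustration on synthetic, noise-free data; the values of V₀ and τ are invented for the example and are not from the paper.

```python
import numpy as np

# Synthetic Langley-plot data (illustrative values only):
V0_true, tau_true = 2.5, 0.12
m = np.linspace(1.5, 6.0, 25)            # air mass over a clear morning
V = V0_true * np.exp(-tau_true * m)      # Bouguer-Lambert-Beer law

# Least-squares Langley fit: ln(V) = ln(V0) - tau * m
slope, intercept = np.polyfit(m, np.log(V), 1)
V0_est, tau_est = np.exp(intercept), -slope
print(round(V0_est, 3), round(tau_est, 3))  # → 2.5 0.12
```

    With real data, scatter about the line and drift in τ during the morning are exactly what the non-parametric alternatives compared in the paper are designed to handle.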

  5. The Wally plot approach to assess the calibration of clinical prediction models.

    PubMed

    Blanche, Paul; Gerds, Thomas A; Ekstrøm, Claus T

    2017-12-06

    A prediction model is calibrated if, roughly, for any percentage x we can expect that x subjects out of 100 experience the event among all subjects that have a predicted risk of x%. Typically, the calibration assumption is assessed graphically, but in practice it is often challenging to judge whether a "disappointing" calibration plot is the consequence of a departure from the calibration assumption, or alternatively just "bad luck" due to sampling variability. To address this issue, we propose a graphical approach which enables visualization of how much a calibration plot agrees with the calibration assumption. The approach is mainly based on the idea of generating new plots which mimic the available data under the calibration assumption. The method handles the common non-trivial situations in which the data contain censored observations and occurrences of competing events. This is done by building on ideas from constrained non-parametric maximum likelihood estimation methods. Two examples from large cohort data illustrate our proposal. The 'wally' R package is provided to make the methodology easily usable.
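
    The calibration assumption in the opening sentence can be illustrated directly. The sketch below simulates a perfectly calibrated model (uncensored data with no competing events, simpler than the paper's setting) and compares mean predicted risk with observed event frequency in risk deciles; repeating such simulations to show the expected sampling noise is the essence of the mimicked plots the authors propose. All names and values are illustrative.

```python
import numpy as np

# Simulate a perfectly calibrated model: each subject's event occurs with
# exactly its predicted probability.
rng = np.random.default_rng(0)
pred = rng.uniform(0.05, 0.95, 5000)       # predicted risks
event = rng.uniform(size=5000) < pred      # outcomes drawn at those risks

# Bin subjects by predicted risk and compare mean predicted risk with the
# observed event frequency in each bin (the calibration curve).
edges = np.quantile(pred, np.linspace(0, 1, 11))
bin_idx = np.clip(np.digitize(pred, edges) - 1, 0, 9)
expected = np.array([pred[bin_idx == b].mean() for b in range(10)])
observed = np.array([event[bin_idx == b].mean() for b in range(10)])

# Under the calibration assumption the two agree up to sampling noise.
gap = np.abs(observed - expected).max()
```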

  6. Application of the Langley plot for calibration of sun sensors for the Halogen Occultation Experiment (HALOE)

    NASA Technical Reports Server (NTRS)

    Moore, Alvah S., Jr.; Mauldin, L. ED, III; Stump, Charles W.; Reagan, John A.; Fabert, Milton G.

    1989-01-01

    The calibration of the Halogen Occultation Experiment (HALOE) sun sensor is described. This system consists of two energy-balancing silicon detectors, which provide coarse azimuth and elevation control signals, and a silicon photodiode array, which provides top and bottom solar edge data for fine elevation control. All three detectors were calibrated on a mountaintop near Tucson, Ariz., using the Langley plot technique. The conventional Langley plot technique was modified to allow calibration of the two coarse detectors, which operate over a wide band. A brief description of the test setup is given. The HALOE instrument is a gas correlation radiometer that is now being developed for the Upper Atmospheric Research Satellite.

  7. High-Altitude Air Mass Zero Calibration of Solar Cells

    NASA Technical Reports Server (NTRS)

    Woodyard, James R.; Snyder, David B.

    2005-01-01

    Air mass zero (AM0) calibration of solar cells has been carried out for several years by NASA Glenn Research Center using a Lear-25 aircraft and Langley plots. The calibration flights are carried out during early fall and late winter, when the tropopause is at its lowest altitude. Measurements are made starting at about 50,000 feet and continue down to the tropopause. A joint NASA/Wayne State University program called Suntracker is underway to explore the use of weather balloon and communication technologies to characterize solar cells at elevations up to about 100 kft. The balloon flights are low-cost and can be carried out at any time of the year. AM0 solar cell characterization employing the mountaintop, aircraft and balloon methods is reviewed. Results of cell characterization with the Suntracker are reported and compared with the NASA Glenn Research Center aircraft method.

  8. Evaluation of the Applicability of Solar and Lamp Radiometric Calibrations of a Precision Sun Photometer Operating Between 300 and 1025 nm

    NASA Technical Reports Server (NTRS)

    Schmid, Beat; Spyak, Paul R.; Biggar, Stuart F.; Joerg, Sekler; Ingold, Thomas; Maetzler, Christian; Kaempfer, Niklaus

    2000-01-01

    Over a period of 3 years a precision Sun photometer (SPM) operating between 300 and 1025 nm was calibrated four times at three different high-mountain sites in Switzerland, Germany, and the United States by means of the Langley-plot technique. We found that for atmospheric window wavelengths the total error (2σ, statistical plus systematic errors) of the calibration constants V(sub 0)(lambda), the SPM voltage in the absence of any attenuating atmosphere, can be kept below 1.60% in the UV-A and blue, 0.9% in the mid-visible, and 0.6% in the near-infrared spectral region. For SPM channels within strong water-vapor or ozone absorption bands a modified Langley-plot technique was used to determine V(sub 0)(lambda) with a lower accuracy. Within the same period of time, we calibrated the SPM five times using irradiance standard lamps in the optical labs of the Physikalisch-Meteorologisches Observatorium Davos and World Radiation Center, Switzerland, and of the Remote Sensing Group of the Optical Sciences Center, University of Arizona, Tucson, Arizona. The lab calibration method requires knowledge of the extraterrestrial spectral irradiance. When we refer the standard lamp results to the World Radiation Center extraterrestrial solar irradiance spectrum, they agree with the Langley results to within 2% at 6 of 13 SPM wavelengths. The largest disagreement (4.4%) is found for the channel centered at 610 nm. The results of these intercomparisons change significantly when the lamp results are referred to two different extraterrestrial solar irradiance spectra that have recently become available.

  9. Calibration and use of plate meter regressions for pasture mass estimation in an Appalachian silvopasture

    USDA-ARS?s Scientific Manuscript database

    A standardized plate meter for measuring pasture mass was calibrated at the Agroforestry Research and Demonstration Site in Blacksburg, VA, using six ungrazed plots of established tall fescue (Festuca arundinaceae) overseeded with orchardgrass (Dactylis glomerata). Each plot was interplanted with b...

  10. Set-up and calibration of an outdoor nozzle-type rainfall simulator for soil erosion studies at the Masse experimental station (central Italy)

    NASA Astrophysics Data System (ADS)

    Vergni, Lorenzo; Todisco, Francesca

    2016-04-01

    This contribution describes the technical characteristics and the preliminary calibration of a rainfall simulator recently installed by the Department of Agricultural, Food and Environmental Sciences (Perugia University) at the Masse experimental station, located 20 km south of Perugia in the region of Umbria (central Italy). The site includes USLE plots of different lengths (λ = 11 and 22 m) and widths (w = 2, 4 and 8 m), oriented parallel to a 16% slope and kept free of vegetation by frequent ploughing. Since 2008, the station has enabled the collection of data from more than 80 erosive events, which were mainly used to investigate the relationship between rainfall characteristics and soil loss. The considerable soil-loss variability that characterizes erosive storm events with similar overall characteristics (duration and/or depth) can be explained by the different rainfall profiles of erosive storms and by differences in antecedent soil aggregate stability. To analyse these aspects in more detail, the Masse experimental station has recently been equipped with a semi-portable rainfall simulator placed over two micro-plots of 1 x 1 m each, having the same topographic and pedologic conditions as the adjacent USLE plots. The rainfall simulator consists of four full-cone spray nozzles for each micro-plot, placed at the corners of a 0.18-m square, centred over the plot at a height of 2.7 m above the ground. The operating pressure is regulated by pressure-regulating valves and checked by pressure gauges mounted at each nozzle. An electronic control unit regulates the start and stop of the inlet solenoid valves. A range of rainfall intensities can be achieved by activating different combinations of nozzles (15 different intensities), even during the same simulation trial. The particular design of the plots allows the runoff volume from the plots and the water volume falling outside the plots to be collected separately. In this way it is possible to derive, by difference, the actual infiltration volume. The experiments are carried out simultaneously on the two adjacent micro-plots. In particular, this contribution reports the results of the first experimental trials, aimed at assessing the uniformity attainable by single nozzles and its reproducibility (between plots and over time). The interferences between adjacent nozzles (when they work simultaneously) were also evaluated.

  11. Strategies for minimizing sample size for use in airborne LiDAR-based forest inventory

    USGS Publications Warehouse

    Junttila, Virpi; Finley, Andrew O.; Bradford, John B.; Kauranne, Tuomo

    2013-01-01

    Recently, airborne Light Detection And Ranging (LiDAR) has emerged as a highly accurate remote sensing modality to be used in operational-scale forest inventories. Inventories conducted with the help of LiDAR are most often model-based, i.e. they use variables derived from LiDAR point clouds as the predictive variables that are to be calibrated using field plots. The measurement of the necessary field plots is a time-consuming and statistically sensitive process. Because of this, current practice often presumes hundreds of plots to be collected. But since these plots are only used to calibrate regression models, it should be possible to minimize the number of plots needed by carefully selecting the plots to be measured. In the current study, we compare several systematic and random methods for calibration plot selection, with the specific aim that they be used in LiDAR-based regression models for forest parameters, especially above-ground biomass. The primary criteria compared are based both on spatial representativity and on coverage of the variability of the forest features measured. In the former case, it is also important to take into account spatial auto-correlation between the plots. The results indicate that choosing the plots in a way that ensures ample coverage of both spatial and feature-space variability improves the performance of the corresponding models, and that adequate coverage of the variability in the feature space is the most important condition that should be met by the set of plots collected.
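
    A minimal sketch of one family of selection strategies compared above: stratifying candidate plots by a feature-space variable (here a single synthetic LiDAR height metric) and drawing an equal number of calibration plots from each stratum, so the sample spans the feature variability. The metric, stratum count, and sample sizes are invented for illustration.

```python
import numpy as np

# One synthetic LiDAR metric per candidate field plot.
rng = np.random.default_rng(0)
mean_height = rng.gamma(4.0, 5.0, 2000)

# Divide the metric into quantile strata and sample equally from each.
n_strata, per_stratum = 5, 4
edges = np.quantile(mean_height, np.linspace(0, 1, n_strata + 1))
strata = np.clip(np.digitize(mean_height, edges) - 1, 0, n_strata - 1)

chosen = np.concatenate([
    rng.choice(np.flatnonzero(strata == s), per_stratum, replace=False)
    for s in range(n_strata)
])
print(len(chosen))  # → 20
```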

  12. Fitting and Calibrating a Multilevel Mixed-Effects Stem Taper Model for Maritime Pine in NW Spain

    PubMed Central

    Arias-Rodil, Manuel; Castedo-Dorado, Fernando; Cámara-Obregón, Asunción; Diéguez-Aranda, Ulises

    2015-01-01

    Stem taper data are usually hierarchical (several measurements per tree, and several trees per plot), making application of a multilevel mixed-effects modelling approach essential. However, correlation between trees in the same plot/stand has often been ignored in previous studies. Fitting and calibration of a variable-exponent stem taper function were conducted using data from 420 trees felled in even-aged maritime pine (Pinus pinaster Ait.) stands in NW Spain. In the fitting step, the tree level explained much more variability than the plot level, and therefore calibration at plot level was omitted. Several stem heights were evaluated for measurement of the additional diameter needed for calibration at tree level. Calibration with an additional diameter measured at between 40 and 60% of total tree height showed the greatest improvement in volume and diameter predictions. If additional diameter measurement is not available, the fixed-effects model fitted by the ordinary least squares technique should be used. Finally, we also evaluated how the expansion of parameters with random effects affects the stem taper prediction, as we consider this a key question when applying the mixed-effects modelling approach to taper equations. The results showed that correlation between random effects should be taken into account when assessing the influence of random effects in stem taper prediction. PMID:26630156

  13. Numerical computation of Pop plot

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Menikoff, Ralph

    The Pop plot (distance of run to detonation versus initial shock pressure) is a key characterization of shock initiation in a heterogeneous explosive. Reactive burn models for high explosives (HE) must reproduce the experimental Pop plot to have any chance of accurately predicting shock initiation phenomena. This report describes a methodology for automating the computation of a Pop plot for a specific explosive with a given HE model. Illustrative examples of the computation are shown for PBX 9502 with three burn models (SURF, WSD and Forest Fire) utilizing the xRage code, which is the Eulerian ASC hydrocode at LANL. Comparison of the numerical and experimental Pop plot can be the basis for a validation test or an aid in calibrating the burn rate of an HE model. Issues with calibration are discussed.

  14. Energy dispersive X-ray fluorescence determination of cadmium in uranium matrix using Cd Kα line excited by continuum

    NASA Astrophysics Data System (ADS)

    Dhara, Sangita; Misra, N. L.; Aggarwal, S. K.; Venugopal, V.

    2010-06-01

    An energy dispersive X-ray fluorescence method for the determination of cadmium (Cd) in a uranium (U) matrix using a continuum source of excitation was developed. Calibration and sample solutions of cadmium, with and without uranium, were prepared by mixing different volumes of standard solutions of cadmium and uranyl nitrate, both prepared in suprapure nitric acid. The concentration of Cd in calibration solutions and samples was in the range of 6 to 90 µg/mL, whereas the concentration of Cd with respect to U ranged from 90 to 700 µg/g of U. From the calibration solutions and samples containing uranium, the major matrix uranium was selectively extracted using 30% tri-n-butyl phosphate in dodecane. Fixed volumes (1.5 mL) of the aqueous phases thus obtained were taken directly in specially designed, in-house fabricated, leak-proof Perspex sample cells for the energy dispersive X-ray fluorescence measurements, and calibration plots were made by plotting Cd Kα intensity against the respective Cd concentration. For the calibration solutions without uranium, the energy dispersive X-ray fluorescence spectra were measured without any extraction and Cd calibration plots were made accordingly. The results showed a precision of 2% (1σ) and deviated from the expected values by < 4% on average.
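
    The calibration-plot step lends itself to a short sketch: fit a least-squares line of intensity versus concentration, then invert it to read off an unknown. The intensities below follow an assumed, idealized linear detector response; none of the numbers are from the measurements reported above.

```python
import numpy as np

# Hypothetical calibration standards: Cd K-alpha intensity (counts/s)
# for known concentrations (µg/mL). Values are illustrative only.
conc = np.array([6.0, 15.0, 30.0, 50.0, 70.0, 90.0])
intensity = 12.4 * conc + 3.1          # assumed linear response

# Calibration plot: least-squares line of intensity vs. concentration.
slope, intercept = np.polyfit(conc, intensity, 1)

# Inverse prediction: concentration of an unknown from its intensity.
unknown_intensity = 375.1
unknown_conc = (unknown_intensity - intercept) / slope
print(round(unknown_conc, 1))  # → 30.0
```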

  15. Improving LiDAR Biomass Model Uncertainty through Non-Destructive Allometry and Plot-level 3D Reconstruction with Terrestrial Laser Scanning

    NASA Astrophysics Data System (ADS)

    Stovall, A. E.; Shugart, H. H., Jr.

    2017-12-01

    Future NASA and ESA satellite missions plan to better quantify global carbon through detailed observations of forest structure, but ultimately rely on uncertain ground measurement approaches for calibration and validation. A significant amount of the uncertainty in estimating plot-level biomass can be attributed to inadequate and unrepresentative allometric relationships used to convert plot-level tree measurements to estimates of aboveground biomass. These allometric equations are known to have high errors and biases, particularly in carbon-rich forests, because they were calibrated with small and often biased samples of destructively harvested trees. To overcome this issue, a non-destructive methodology for estimating tree and plot-level biomass has been proposed through the use of Terrestrial Laser Scanning (TLS). We investigated the potential for using TLS as a ground validation approach in LiDAR-based biomass mapping through virtual plot-level tree volume reconstruction and biomass estimation. Plot-level biomass estimates were compared at the Virginia-based Smithsonian Conservation Biology Institute's SIGEO forest with full 3D reconstruction, TLS allometry, and Jenkins et al. (2003) allometry. On average, full 3D reconstruction ultimately provided the lowest uncertainty estimate of plot-level biomass (9.6%), followed by TLS allometry (16.9%) and the national equations (20.2%). TLS offered modest improvements to the airborne LiDAR empirical models, reducing RMSE from 16.2% to 14%. Our findings suggest TLS plot acquisitions and non-destructive allometry can play a vital role in reducing uncertainty in calibration and validation data for biomass mapping in the upcoming NASA and ESA missions.

  16. Prediction of non-biochemical recurrence rate after radical prostatectomy in a Japanese cohort: development of a postoperative nomogram.

    PubMed

    Okubo, Hidenori; Ohori, Makoto; Ohno, Yoshio; Nakashima, Jun; Inoue, Rie; Nagao, Toshitaka; Tachibana, Masaaki

    2014-05-01

    To develop a nomogram based on postoperative factors and prostate-specific antigen levels to predict the non-biochemical recurrence rate after radical prostatectomy in a Japanese cohort. A total of 606 Japanese patients with T1-3N0M0 prostate cancer who underwent radical prostatectomy and pelvic lymph node dissection at Tokyo Medical University Hospital from 2000 to 2010 were studied. A nomogram was constructed based on Cox hazard regression analysis evaluating the prognostic significance of serum prostate-specific antigen and pathological factors in the radical prostatectomy specimens. The discriminating ability of the nomogram was assessed by the concordance index (C-index), and the predicted and actual outcomes were compared with a bootstrapped calibration plot. With a mean follow-up of 60.0 months, a total of 187 patients (30.9%) experienced biochemical recurrence, giving a 5-year non-biochemical recurrence rate of 72.3%. Based on a Cox hazard regression model, a nomogram was constructed to predict non-biochemical recurrence using serum prostate-specific antigen level and pathological features in radical prostatectomy specimens. The concordance index was 0.77, and the calibration plots appeared to be accurate. The postoperative nomogram described here can provide valuable information regarding the need for adjuvant/salvage radiation or hormonal therapy in patients after radical prostatectomy.
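
    The concordance index reported above is defined by pairwise comparisons: among usable pairs, how often does the patient who fails earlier carry the higher predicted risk? The sketch below is a simplified version (equal observed times get no special handling, unlike full Harrell's C); the patients and risks in the usage line are invented.

```python
from itertools import combinations

def c_index(times, events, risks):
    """Fraction of usable pairs where the subject who fails earlier has
    the higher predicted risk; risk ties count half. A pair is usable if
    the shorter observed time ended in an event (not censoring)."""
    concordant = ties = usable = 0
    for i, j in combinations(range(len(times)), 2):
        a, b = (i, j) if times[i] < times[j] else (j, i)
        if not events[a]:          # earlier time censored: pair unusable
            continue
        usable += 1
        if risks[a] > risks[b]:
            concordant += 1
        elif risks[a] == risks[b]:
            ties += 1
    return (concordant + 0.5 * ties) / usable

# Four patients, all with observed recurrences, ranked by predicted risk.
print(round(c_index([2, 4, 6, 8], [1, 1, 1, 1], [0.9, 0.7, 0.8, 0.2]), 3))  # → 0.833
```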

  17. Field strategies for the calibration and validation of high-resolution forest carbon maps: Scaling from plots to a three state region MD, DE, & PA, USA.

    NASA Astrophysics Data System (ADS)

    Dolan, K. A.; Huang, W.; Johnson, K. D.; Birdsey, R.; Finley, A. O.; Dubayah, R.; Hurtt, G. C.

    2016-12-01

    In 2010 Congress directed NASA to initiate research towards the development of Carbon Monitoring Systems (CMS). In response, our team has worked to develop a robust, replicable framework to quantify and map aboveground forest biomass at high spatial resolutions. Crucial to this framework has been the collection of field-based estimates of aboveground tree biomass, combined with remotely detected canopy and structural attributes, for calibration and validation. Here we evaluate the field-based calibration and validation strategies within this carbon monitoring framework and discuss the implications for local to national monitoring systems. Through project development, the domain of this research has expanded from two counties in MD (2,181 km2), to the entire state of MD (32,133 km2), and most recently the tri-state region of MD, PA, and DE (157,868 km2), and covers forests in four major USDA ecological provinces. While there are approximately 1000 Forest Inventory and Analysis (FIA) plots distributed across the state of MD, 60% fell in areas considered non-forest or had conditions that precluded them from being measured in the last forest inventory. Across the two pilot counties, where population and land-use competition is high, that proportion rose to 70%. Thus, during the initial phases of this project, 850 independent field plots were established for model calibration following a stratified random design to ensure adequate representation of the height and vegetation classes found across the state, while FIA data were used as an independent data source for validation. As the project expanded to cover the larger tri-state domain, the strategy was flipped to base calibration on more than 3,300 measured FIA plots, as they provide a standardized, consistent and available data source across the nation. An additional 350 stratified random plots were deployed in the Northern Mixed forests of PA and the Coastal Plains forests of DE for validation.

  18. A heat and water transfer model for seasonally frozen soils with application to a precipitation-runoff model

    USGS Publications Warehouse

    Emerson, Douglas G.

    1994-01-01

    A model that simulates heat and water transfer in soils during freezing and thawing periods was developed and incorporated into the U.S. Geological Survey's Precipitation-Runoff Modeling System. The model's transfer of heat is based on an equation developed from Fourier's equation for heat flux. The model's transfer of water within the soil profile is based on the concept of capillary forces. Field capacity and infiltration rate can vary throughout the freezing and thawing period, depending on soil conditions and the rate and timing of snowmelt. The model can be used to determine the effects of seasonally frozen soils on ground-water recharge and surface-water runoff. Data collected for two winters, 1985-86 and 1986-87, on three runoff plots were used to calibrate and verify the model. The winter of 1985-86 was colder than normal, and snow cover was continuous throughout the winter. The winter of 1986-87 was warmer than normal, and snow accumulated for only short periods of several days. Runoff, snowmelt, and frost depths were used as the criteria for determining the degree of agreement between simulated and measured data. The model was calibrated using the 1985-86 data for plot 2. The calibration simulation agreed closely with the measured data. The verification simulations for plots 1 and 3 using the 1985-86 data and for plots 1 and 2 using the 1986-87 data agreed closely with the measured data. The verification simulation for plot 3 using the 1986-87 data did not agree closely. The recalibration simulations for plots 1 and 3 using the 1985-86 data indicated little improvement because the verification simulations for plots 1 and 3 already agreed closely with the measured data.

  19. Documentation of a heat and water transfer model for seasonally frozen soils with application to a precipitation-runoff model

    USGS Publications Warehouse

    Emerson, Douglas G.

    1991-01-01

    A model that simulates heat and water transfer in soils during freezing and thawing periods was developed and incorporated into the U.S. Geological Survey's Precipitation-Runoff Modeling System. The transfer of heat is based on an equation developed from Fourier's equation for heat flux. Field capacity and infiltration rate can vary throughout the freezing and thawing period, depending on soil conditions and the rate and timing of snowmelt. The transfer of water within the soil profile is based on the concept of capillary forces. The model can be used to determine the effects of seasonally frozen soils on ground-water recharge and surface-water runoff. Data collected for two winters, 1985-86 and 1986-87, on three runoff plots were used to calibrate and verify the model. The winter of 1985-86 was colder than normal and snow cover was continuous throughout the winter. The winter of 1986-87 was warmer than normal and snow accumulated for only short periods of several days. Runoff, snowmelt, and frost depths were used as the criteria for determining the degree of agreement between simulated and measured data. The model was calibrated using the 1985-86 data for plot 2. The calibration simulation agreed closely with the measured data. The verification simulations for plots 1 and 3 using the 1985-86 data and for plots 1 and 2 using the 1986-87 data agreed closely with the measured data. The verification simulation for plot 3 using the 1986-87 data did not agree closely. The recalibration simulations for plots 1 and 3 using the 1985-86 data indicated little improvement because the verification simulations for plots 1 and 3 already agreed closely with the measured data.

  20. High-latitude geomagnetic disturbances during ascending solar cycle 24

    NASA Astrophysics Data System (ADS)

    Peitso, Pyry; Tanskanen, Eija; Stolle, Claudia; Berthou Lauritsen, Nynne; Matzka, Jürgen

    2015-04-01

    High-latitude regions are very convenient for the study of several space weather phenomena such as substorms. Large geographic coverage as well as long time series of data are essential due to the global nature of space weather and the long duration of solar cycles. We will examine geomagnetic activity in Greenland from magnetic field measurements taken by DTU (Technical University of Denmark) magnetometers during the years 2010 to 2014. The study uses data from 13 magnetometer stations located on the east coast of Greenland and one located on the west coast. The original measurements are in one-second resolution, so the amount of data is quite large. The magnetic field H component (positive direction towards magnetic north) was used throughout the study. Data processing will be described from the calibration of the original measurements to the plotting of long time series. Calibration consists of determining the quiet hour of a given day and subtracting the average of that hour from all the time steps of the day. This normalizes the measurements and allows for better comparison between different time steps. In addition to the full time line of measurements, daily, monthly and yearly averages will be provided for all stations. Differential calculations on the change of the H component will also be made available for the duration of the full data set. Envelope curve plots will be presented for the duration of the time line. Geomagnetic conditions during winter and summer will be compared to examine seasonal variation. Finally, the measured activity will be compared to NOAA (National Oceanic and Atmospheric Administration) geomagnetic space weather alerts issued from 2010 to 2014. Calculations and plotting of measurement data were done with MATLAB. The M_map toolbox was used for plotting the maps featured in the study (http://www2.ocgy.ubc.ca/~rich/map.html). The study was conducted as a part of the ReSoLVE (Research on Solar Long-term Variability and Effects) Center of Excellence.
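
    The quiet-hour normalization described above amounts to subtracting the mean of the day's calmest hour from every sample. A sketch on synthetic 1-s data (the baseline level, noise, disturbance shape, and the choice of standard deviation as the quietness measure are all illustrative assumptions):

```python
import numpy as np

# One synthetic day of 1-s H-component samples: a quiet baseline plus a
# ramp-like disturbance in the evening (all values invented).
rng = np.random.default_rng(1)
h = 10500.0 + rng.normal(0.0, 2.0, 86400)
h[64800:68400] += np.linspace(0.0, 300.0, 3600)   # substorm-like deviation

# Find the quietest hour (smallest standard deviation) and subtract its
# mean from the whole day, normalizing the measurements.
hours = h.reshape(24, 3600)
quiet = int(np.argmin(hours.std(axis=1)))
h_norm = h - hours[quiet].mean()
```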

  1. A storm-based CSLE incorporating the modified SCS-CN method for soil loss prediction on the Chinese Loess Plateau

    NASA Astrophysics Data System (ADS)

    Shi, Wenhai; Huang, Mingbin

    2017-04-01

    The Chinese Loess Plateau is one of the most erodible areas in the world. In order to reduce soil and water losses, suitable conservation practices need to be designed. For this purpose, there is an increasing demand for an appropriate model that can accurately predict storm-based surface runoff and soil losses on the Loess Plateau. The Chinese Soil Loss Equation (CSLE) has been widely used in this region to assess soil losses from different land use types. However, the CSLE was intended only to predict the mean annual gross soil loss. In this study, a storm-based CSLE was proposed that introduced a new rainfall-runoff erosivity factor. A dataset was compiled that comprised measurements of soil losses during individual storms from three runoff-erosion plots in each of three different watersheds in the gully region of the Plateau over 3-7 years in three different time periods (1956-1959; 1973-1980; 2010-13). The accuracy of the soil loss predictions made by the new storm-based CSLE was determined using the data for the six plots in two of the watersheds measured during 165 storm-runoff events. The performance of the storm-based CSLE was further compared with the performance of the storm-based Revised Universal Soil Loss Equation (RUSLE) for the same six plots. During the calibration (83 storms) and validation (82 storms) of the storm-based CSLE, the model efficiency, E, was 87.7% and 88.9%, respectively, while the root mean square error (RMSE) was 2.7 and 2.3 t ha-1, indicating a high degree of accuracy. Furthermore, the storm-based CSLE performed better than the storm-based RUSLE (E: 75.8% and 70.3%; RMSE: 3.8 and 3.7 t ha-1, for the calibration and validation storms, respectively). The storm-based CSLE was then used to predict the soil losses from the three experimental plots in the third watershed.
For these predictions, the model parameter values, previously determined by the calibration based on the data from the initial six plots, were used in the storm-based CSLE. In addition, the surface runoff used by the storm-based CSLE was either obtained from measurements or from the values predicted by the modified Soil Conservation Service Curve Number (SCS-CN) method. When using the measured runoff, the storm-based CSLE had an E of 76.6%, whereas the use of the predicted runoff gave an E of 76.4%. The high E values indicated that the storm-based CSLE incorporating the modified SCS-CN method could accurately predict storm-event-based soil losses resulting from both sheet and rill erosion at the field scale on the Chinese Loess Plateau. This approach could be applicable to other areas of the world once the model parameters have been suitably calibrated.
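
    The abstract does not reproduce the authors' modified SCS-CN formulation, but the classical Curve Number runoff equations it builds on can be sketched as follows. The initial-abstraction ratio `lam` is the usual tuning knob; the classical value is 0.2, and "modified" variants often lower it (e.g. 0.05) — that choice is an assumption here, not taken from the paper.

```python
def scs_cn_runoff(p_mm, cn, lam=0.2):
    """Storm runoff depth (mm) from the SCS Curve Number method.

    p_mm : storm rainfall depth (mm)
    cn   : curve number (0 < cn <= 100)
    lam  : initial-abstraction ratio (0.2 classical; modified
           variants often use smaller values such as 0.05)
    """
    s = 25400.0 / cn - 254.0      # potential maximum retention (mm)
    ia = lam * s                  # initial abstraction (mm)
    if p_mm <= ia:
        return 0.0                # all rainfall abstracted, no runoff
    return (p_mm - ia) ** 2 / (p_mm - ia + s)
```

    For example, a 50 mm storm on a soil with CN = 80 yields roughly 13.8 mm of runoff with the classical ratio.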

  2. Absolute calibration technique for broadband ultrasonic transducers

    NASA Technical Reports Server (NTRS)

    Yost, William T. (Inventor); Cantrell, John H. (Inventor)

    1994-01-01

Calibration of an ultrasonic transducer can be performed with a reduced number of calculations and tests. A wide-band pulser is connected to the ultrasonic transducer under test to generate ultrasonic waves in a liquid. A single frequency is transmitted to an electrostatic acoustic transducer (ESAT) and the resulting voltage change is monitored. Then a broadband ultrasonic pulse is generated by the ultrasonic transducer and received by the ESAT. The output of the ESAT is amplified and input to a digitizing oscilloscope for a fast Fourier transform. The resulting plot is normalized against the monitored signal from the single-frequency pulse, then corrected for the characteristics of the membrane and for diffraction effects. The transfer function of the final plot gives the sensitivity of the ultrasonic transducer as a function of frequency. The advantages of the system are the speed of calibration, owing to the reduced number of measurements, and the removal of membrane and diffraction effects.
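
    The spectrum-normalization step described above can be sketched minimally, assuming the broadband trace has been digitized at a fixed sampling interval. The function name and arguments are illustrative, not taken from the patent, and the membrane/diffraction corrections are omitted.

```python
import numpy as np

def relative_sensitivity(broadband_trace, dt, ref_freq, ref_amplitude):
    """Frequency-dependent sensitivity from a digitized broadband pulse.

    The FFT magnitude of the received pulse is normalized by its value
    at the single reference frequency, scaled by the amplitude monitored
    during the single-frequency measurement.
    """
    spectrum = np.fft.rfft(broadband_trace)
    freqs = np.fft.rfftfreq(len(broadband_trace), d=dt)
    k = np.argmin(np.abs(freqs - ref_freq))      # bin of the reference tone
    return freqs, np.abs(spectrum) / np.abs(spectrum[k]) * ref_amplitude
```

    With a synthetic trace containing a 1 MHz tone and a half-amplitude 2 MHz tone, the returned curve is 1.0 at the reference bin and 0.5 at 2 MHz, as expected.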

  3. Application of the Langley plot method to the calibration of the solar backscattered ultraviolet instrument on the Nimbus 7 satellite

    NASA Technical Reports Server (NTRS)

    Bhartia, P. K.; Taylor, S.; Mcpeters, R. D.; Wellemeyer, C.

    1995-01-01

    The concept of the well-known Langley plot technique, used for the calibration of ground-based instruments, has been generalized for application to satellite instruments. In polar regions, near summer solstice, the solar backscattered ultraviolet (SBUV) instrument on the Nimbus 7 satellite samples the same ozone field at widely different solar zenith angles. These measurements are compared to assess the long-term drift in the instrument calibration. Although the technique provides only a relative wavelength-to-wavelength calibration, it can be combined with existing techniques to determine the drift of the instrument at any wavelength. Using this technique, we have generated a 12-year data set of ozone vertical profiles from SBUV with an estimated accuracy of +/- 5% at 1 mbar and +/- 2% at 10 mbar (95% confidence) over 12 years. Since the method is insensitive to true changes in the atmospheric ozone profile, it can also be used to compare the calibrations of similar SBUV instruments launched without temporal overlap.
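
    For the classical ground-based case that the satellite technique generalizes, a Langley regression of log signal against airmass recovers both the extrapolated top-of-atmosphere signal and the optical depth. A sketch (variable names are illustrative):

```python
import numpy as np

def langley_fit(airmass, signal):
    """Classical Langley plot: ln V = ln V0 - tau * m.

    Returns (V0, tau): the signal extrapolated to zero airmass
    (top of atmosphere) and the atmospheric optical depth.
    """
    slope, intercept = np.polyfit(airmass, np.log(signal), 1)
    return np.exp(intercept), -slope
```

    Fitting synthetic measurements generated with V0 = 2.0 and tau = 0.3 recovers both values.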

  4. Decision curve analysis and external validation of the postoperative Karakiewicz nomogram for renal cell carcinoma based on a large single-center study cohort.

    PubMed

    Zastrow, Stefan; Brookman-May, Sabine; Cong, Thi Anh Phuong; Jurk, Stanislaw; von Bar, Immanuel; Novotny, Vladimir; Wirth, Manfred

    2015-03-01

To predict outcome of patients with renal cell carcinoma (RCC) who undergo surgical therapy, risk models and nomograms are valuable tools. External validation on independent datasets is crucial for evaluating the accuracy and generalizability of these models. The objective of the present study was to externally validate the postoperative nomogram developed by Karakiewicz et al. for prediction of cancer-specific survival. A total of 1,480 consecutive patients with a median follow-up of 82 months (IQR 46-128) were included in this analysis, with 268 RCC-specific deaths. Nomogram-estimated survival probabilities were compared with survival probabilities of the actual cohort, and concordance indices were calculated. Calibration plots and decision curve analyses were used for evaluating calibration and clinical net benefit of the nomogram. Concordance between predictions of the nomogram and survival rates of the cohort was 0.911 after 12 months, 0.909 after 24 months and 0.896 after 60 months. Comparison of predicted probabilities and actual survival estimates with calibration plots showed an overestimation of tumor-specific survival based on nomogram predictions for high-risk patients, although calibration plots showed reasonable calibration for the probability ranges of interest. Decision curve analysis showed a positive net benefit of nomogram predictions for our patient cohort. The postoperative Karakiewicz nomogram provides good concordance in this external cohort and is reasonably calibrated. It may overestimate tumor-specific survival in high-risk patients, which should be kept in mind when counseling patients. A positive net benefit of nomogram predictions was demonstrated.
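
    The calibration-plot construction used in validation studies like this one — grouping patients by predicted risk and comparing the mean predicted probability with the observed event rate in each group — can be sketched generically. Binning into equal-size quantile groups is one common convention, not the authors' exact procedure.

```python
import numpy as np

def calibration_points(pred, obs, n_groups=4):
    """Points of a calibration plot.

    pred : predicted event probabilities
    obs  : observed binary outcomes (0/1)
    Returns, per predicted-risk group, the pair
    (mean predicted probability, observed event rate).
    """
    order = np.argsort(pred, kind="stable")      # sort by predicted risk
    groups = np.array_split(order, n_groups)     # equal-size risk groups
    return [(float(pred[g].mean()), float(obs[g].mean())) for g in groups]
```

    Plotting these points against the diagonal shows over- or under-estimation: points below the diagonal mean the model overestimates risk in that group.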

  5. Preliminary Findings of the Photovoltaic Cell Calibration Experiment on Pathfinder Flight 95-3

    NASA Technical Reports Server (NTRS)

    Vargas-Aburto, Carlos

    1997-01-01

The objective of the photovoltaic (PV) cell calibration experiment for Pathfinder was to develop an experiment, compatible with an ultralight UAV, to predict the performance of PV cells at AM0, the solar spectrum in space, using the Langley plot technique. The Langley plot is a valuable technique for this purpose and requires accurate measurements of air mass (pressure), cell temperature, solar irradiance, and current-voltage (IV) characteristics with the cells directed normal to the direct ray of the sun. Pathfinder's mission objective for flight 95-3, a maximum altitude of 65,000 ft, is ideal for performing the Langley plot measurements. Miniaturization of electronic data acquisition equipment enabled the design and construction of an accurate and lightweight measurement system that meets Pathfinder's low payload weight requirements.

  6. Hydrologic Impact of Straw Mulch On Runoff from a Burned Area for Various Soil Water Content

    NASA Astrophysics Data System (ADS)

    Carnicle, M. M.; Moody, J. A.; Ahlstrom, A. K.

    2011-12-01

Mountainous watersheds often exhibit increases in runoff and flash floods after wildfires. During 11 days of September 2010, the Fourmile Canyon wildfire burned 2500 hectares of the foothills of the Rocky Mountains near Boulder, Colorado. In an effort to minimize the risk of flash floods after the wildfire, Boulder County aerially applied straw mulch on high-risk areas selected primarily on the basis of their slopes and burn severities. The purpose of this research is to investigate the hydrologic response, specifically runoff, of a burned area where straw mulch is applied. We measured the runoff, at different soil water contents, from 0.8-m diameter plots. Paired plots were installed in June 2011 in a basin burned by the Fourmile Canyon Fire. Two sets of bounded, paired plots (two control and two experimental plots) were calibrated for 35 days without straw on either plot by measuring volumetric soil water content 2-3 times per week and measuring total runoff from each storm. Straw (5 cm thick) was added to the two experimental plots on 19 July 2011 and also to the funnels of two visual rain gages in order to measure the amount of rainfall absorbed by the straw. Initial results during the calibration period showed nearly linear relations between the volumetric soil water contents of the control and experimental plots. The regression line for the runoff from the control versus the runoff from the experimental plot did not fit a linear trend; the variability may have been caused by two intense storms, which produced runoff that exceeded the capacity of the runoff gages. Also, during the calibration period, when soil water content was low, the runoff coefficients were high. It is anticipated that the final results will show that the total runoff is greater on plots with no straw compared to those with straw, under various antecedent soil water contents. We are continuing to collect data during the summer of 2011 to test this hypothesis.

  7. Success and challenges met during the calibration of APEX on large plots

    USDA-ARS?s Scientific Manuscript database

    As the APEX model is increasingly considered for the evaluation of agricultural systems, satisfactory performance of APEX on fields is critical. APEX was applied to 16 replicated large plots established in 1991 in Northeast Missouri. Until 2009, each phase of each rotation was represented every year...

  8. Diameter Growth Models for Inventory Applications

    Treesearch

    Ronald E. McRoberts; Christopher W. Woodall; Veronica C. Lessard; Margaret R. Holdaway

    2002-01-01

Distance-independent, individual-tree, diameter growth models were constructed to update information for forest inventory plots measured in previous years. The models are nonlinear in the parameters and were calibrated using weighted nonlinear least squares techniques and forest inventory plot data. Analyses of residuals indicated that model predictions compare favorably to...

  9. SUMS calibration test report

    NASA Technical Reports Server (NTRS)

    Robertson, G.

    1982-01-01

Calibration was performed on the shuttle upper atmosphere mass spectrometer (SUMS). The results of the calibration and the as-run test procedures are presented. The output data are described, and engineering data conversion factors, tables and curves, and calibration of instrument gauges are included. Static calibration results are given, including: instrument sensitivity versus external pressure for N2 and O2, data from each scan of the calibration, data plots for N2 and O2, sensitivity of SUMS at the inlet for N2 and O2, and the 14/28 ratio for nitrogen and the 16/32 ratio for oxygen.

  10. Conversion of calibration curves for accurate estimation of molecular weight averages and distributions of polyether polyols by conventional size exclusion chromatography.

    PubMed

    Xu, Xiuqing; Yang, Xiuhan; Martin, Steven J; Mes, Edwin; Chen, Junlan; Meunier, David M

    2018-08-17

Accurate measurement of molecular weight averages (M̄n, M̄w, M̄z) and molecular weight distributions (MWD) of polyether polyols by conventional SEC (size exclusion chromatography) is not as straightforward as it would appear. Conventional calibration with polystyrene (PS) standards can only provide PS-apparent molecular weights, which do not provide accurate estimates of polyol molecular weights. Using polyethylene oxide/polyethylene glycol (PEO/PEG) standards for molecular weight calibration could improve the accuracy, but the retention behavior of PEO/PEG is not stable in tetrahydrofuran (THF)-based SEC systems. In this work, two approaches for calibration curve conversion with narrow PS and polyol molecular weight standards were developed. Equations to convert PS-apparent molecular weight to polyol-apparent molecular weight were developed using both a rigorous mathematical analysis and a graphical plot regression method. The conversion equations obtained by the two approaches were in good agreement. Factors influencing the conversion equation were investigated. It was concluded that separation conditions such as column batch and operating temperature did not have a significant impact on the conversion coefficients, and a universal conversion equation could be obtained. With this conversion equation, more accurate estimates of molecular weight averages and MWDs for polyether polyols can be achieved from a conventional PS-THF SEC calibration. Moreover, no additional experimentation is required to convert historical PS-equivalent data to reasonably accurate molecular weight results. Copyright © 2018. Published by Elsevier B.V.
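
    The paper's conversion equation is not reproduced in the abstract, but the underlying idea — two linear log-molecular-weight calibrations sharing the same elution-time axis — implies a linear relation between the apparent log molecular weights. A sketch under that assumption; the calibration coefficients used below are invented for illustration, not the paper's values.

```python
import math

def conversion_coefficients(ps_cal, polyol_cal):
    """Each calibration is (a, b) in log10(M) = a + b*t, where t is
    elution time. Eliminating t between the PS and polyol calibrations
    gives log10(M_polyol) = A + B*log10(M_ps); returns (A, B)."""
    a_ps, b_ps = ps_cal
    a_po, b_po = polyol_cal
    B = b_po / b_ps
    A = a_po - B * a_ps
    return A, B

def ps_to_polyol(m_ps, A, B):
    """Convert a PS-apparent molecular weight to a polyol-apparent one."""
    return 10 ** (A + B * math.log10(m_ps))
```

    Once (A, B) are fixed for a column set, historical PS-equivalent results can be converted without rerunning samples, which is the practical payoff the abstract describes.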

  11. The new camera calibration system at the US Geological Survey

    USGS Publications Warehouse

    Light, D.L.

    1992-01-01

    Modern computerized photogrammetric instruments are capable of utilizing both radial and decentering camera calibration parameters which can increase plotting accuracy over that of older analog instrumentation technology from previous decades. Also, recent design improvements in aerial cameras have minimized distortions and increased the resolving power of camera systems, which should improve the performance of the overall photogrammetric process. In concert with these improvements, the Geological Survey has adopted the rigorous mathematical model for camera calibration developed by Duane Brown. An explanation of the Geological Survey's calibration facility and the additional calibration parameters now being provided in the USGS calibration certificate are reviewed. -Author

  12. Calibration of Hydrophone Stations: Lessons Learned from the Ascension Island Experiment

    DTIC Science & Technology

    2000-09-01

source based on the implosion of a glass sphere for future long-range calibrations. RESEARCH ACCOMPLISHED The J.C. Ross, an icebreaker-class...waters around Ascension Island. The blow-ups show the track in the immediate vicinity of the three hydrophones and plots their nominal location. The...used has practical and cost-driven limitations. Small implosive sources such as lightbulbs have been used from ships as hydrophone calibration sources

  13. Effects of experimental design on calibration curve precision in routine analysis

    PubMed Central

    Pimentel, Maria Fernanda; Neto, Benício de Barros; Saldanha, Teresa Cristina B.

    1998-01-01

A computational program which compares the efficiencies of different experimental designs with those of maximum precision (D-optimized designs) is described. The program produces confidence interval plots for a calibration curve and provides information about the number of standard solutions, concentration levels and suitable concentration ranges needed to achieve an optimum calibration. Some examples of the application of this novel computational program are given, using both simulated and real data. PMID:18924816

  14. An Analytical Calibration Approach for the Polarimetric Airborne C Band Radiometer

    NASA Technical Reports Server (NTRS)

    Pham, Hanh; Kim, Edward J.

    2004-01-01

    Passive microwave remote sensing is sensitive to the quantity and distribution of water in soil and vegetation. During summer 2000, the Microwave Geophysics Group at the University of Michigan conducted the 7th Radiobrightness Energy Balance Experiment (REBEX-7) over a corn canopy in Michigan. Long time series of brightness temperatures, soil moisture and micrometeorology on the plot scale were taken. This paper addresses the calibration of the NASA GSFC polarimetric airborne C band microwave radiometer (ACMR) that participated in REBEX-7. Passive polarimeters are typically calibrated using an end-to-end approach based upon a standard artificial target or a well-known geophysical target. Analyzing the major internal functional subsystems offers a different perspective. The primary goal of this approach is to provide a transfer function that not only describes the system in its entirety but also accounts for the contributions of each subsystem toward the final modified Stokes parameters. This approach also serves as a realistic instrument simulator, a useful tool for future designs. The ACMR architecture can be partitioned into several functional subsystems. Each subsystem was extensively measured and the estimated parameters were imported into the overall system model. We will present the results of polarimetric antenna measurements, the instrument model as well as four Stokes observations from REBEX-7 using a first order inversion.

  15. Validation of the Web-Based IBTR! 2.0 Nomogram to Predict for Ipsilateral Breast Tumor Recurrence After Breast-Conserving Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kindts, Isabelle, E-mail: Isabelle.kindts@uzleuven.be; Department of Radiation Oncology, University Hospitals Leuven, Leuven; Laenen, Annouschka

Purpose: To evaluate the IBTR! 2.0 nomogram, which predicts 10-year ipsilateral breast tumor recurrence (IBTR) after breast-conserving therapy with and without radiation therapy for breast cancer, by using a large, external, and independent cancer center database. Methods and Materials: We retrospectively identified 1898 breast cancer cases, treated with breast-conserving therapy and radiation therapy at the University Hospital Leuven from 2000 to 2007, with requisite data for the nomogram variables. Clinicopathologic factors were assessed. Two definitions of IBTR were considered, where simultaneous regional or distant recurrence was either censored (conform IBTR! 2.0) or included as an event. Validity of the prediction algorithm was tested in terms of discrimination and calibration. Discrimination was assessed by the concordance probability estimate and Harrell's concordance index. The mean predicted and observed 10-year estimates were compared for the entire cohort and for 4 risk groups predefined by nomogram-predicted IBTR risks, and a calibration plot was drawn. Results: Median follow-up was 10.9 years. The 10-year IBTR rates were 1.3% and 2.1%, according to the 2 definitions of IBTR. The validation cohort differed from the development cohort with respect to the administration of hormonal therapy, surgical section margins, lymphovascular invasion, and tumor size. In univariable analysis, younger age (P=.002) and a positive nodal status (P=.048) were significantly associated with IBTR, with a trend for the omission of hormonal therapy (P=.061). The concordance probability estimate and concordance index varied between 0.57 and 0.67 for the 2 definitions of IBTR. In all 4 risk groups the model overestimated the IBTR risk. In particular, between the lowest-risk groups a limited differentiation was suggested by the calibration plot. Conclusions: The IBTR! 2.0 predictive model for IBTR in breast cancer patients shows substandard discriminative ability, with an overestimation of the risk in all subgroups.

  16. Validation of the Web-Based IBTR! 2.0 Nomogram to Predict for Ipsilateral Breast Tumor Recurrence After Breast-Conserving Therapy.

    PubMed

    Kindts, Isabelle; Laenen, Annouschka; Peeters, Stephanie; Janssen, Hilde; Depuydt, Tom; Nevelsteen, Ines; Van Limbergen, Erik; Weltens, Caroline

    2016-08-01

To evaluate the IBTR! 2.0 nomogram, which predicts 10-year ipsilateral breast tumor recurrence (IBTR) after breast-conserving therapy with and without radiation therapy for breast cancer, by using a large, external, and independent cancer center database. We retrospectively identified 1898 breast cancer cases, treated with breast-conserving therapy and radiation therapy at the University Hospital Leuven from 2000 to 2007, with requisite data for the nomogram variables. Clinicopathologic factors were assessed. Two definitions of IBTR were considered, where simultaneous regional or distant recurrence was either censored (conform IBTR! 2.0) or included as an event. Validity of the prediction algorithm was tested in terms of discrimination and calibration. Discrimination was assessed by the concordance probability estimate and Harrell's concordance index. The mean predicted and observed 10-year estimates were compared for the entire cohort and for 4 risk groups predefined by nomogram-predicted IBTR risks, and a calibration plot was drawn. Median follow-up was 10.9 years. The 10-year IBTR rates were 1.3% and 2.1%, according to the 2 definitions of IBTR. The validation cohort differed from the development cohort with respect to the administration of hormonal therapy, surgical section margins, lymphovascular invasion, and tumor size. In univariable analysis, younger age (P=.002) and a positive nodal status (P=.048) were significantly associated with IBTR, with a trend for the omission of hormonal therapy (P=.061). The concordance probability estimate and concordance index varied between 0.57 and 0.67 for the 2 definitions of IBTR. In all 4 risk groups the model overestimated the IBTR risk. In particular, between the lowest-risk groups a limited differentiation was suggested by the calibration plot. The IBTR! 2.0 predictive model for IBTR in breast cancer patients shows substandard discriminative ability, with an overestimation of the risk in all subgroups.
Copyright © 2016 Elsevier Inc. All rights reserved.

  17. Measuring water and sediment discharge from a road plot with a settling basin and tipping bucket

    Treesearch

    Thomas A. Black; Charles H. Luce

    2013-01-01

    A simple empirical method quantifies water and sediment production from a forest road surface, and is well suited for calibration and validation of road sediment models. To apply this quantitative method, the hydrologic technician installs bordered plots on existing typical road segments and measures coarse sediment production in a settling tank. When a tipping bucket...

  18. Accounting for temporal variation in soil hydrological properties when simulating surface runoff on tilled plots

    NASA Astrophysics Data System (ADS)

    Chahinian, Nanée; Moussa, Roger; Andrieux, Patrick; Voltz, Marc

    2006-07-01

Tillage operations are known to greatly influence local overland flow, infiltration and depressional storage by altering soil hydraulic properties and soil surface roughness. The calibration of runoff models for tilled fields is not identical to that for untilled fields, as it has to take into consideration the temporal variability of parameters due to the transient nature of surface crusts. In this paper, we present the application of a rainfall-runoff model and the development of a calibration methodology that takes into account the impact of tillage on overland flow simulation at the scale of a tilled plot (3240 m2) located in southern France. The selected model couples the infiltration equation of Morel-Seytoux (1978, Water Resources Research 14(4), 561-568) to a transfer function based on the diffusive wave equation. The parameters to be calibrated are the hydraulic conductivity at natural saturation Ks, the surface detention Sd and the lag time ω. A two-step calibration procedure is presented. First, eleven rainfall-runoff events are calibrated individually and the variability of the calibrated parameters is analysed. The individually calibrated Ks values decrease monotonically with the total amount of rainfall since tillage. No clear relationship is observed between the two parameters Sd and ω and the date of tillage. However, the lag time ω increases inversely with the peakflow of the events. Fairly good agreement is observed between the simulated and measured hydrographs of the calibration set. Simple mathematical laws describing the evolution of Ks and ω are selected, while Sd is considered constant. The second step involves the collective calibration of the law of evolution of each parameter on the whole calibration set. This procedure is calibrated on 11 events and validated on ten runoff-inducing and four non-runoff-inducing rainfall events.
The suggested calibration methodology seems robust and can be transposed to other gauged sites.
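
    The abstract mentions "simple mathematical laws" for the evolution of Ks with cumulative rainfall since tillage but does not state their form. One plausible hypothetical form — an exponential decay from the freshly tilled value toward a fully crusted floor — can be sketched as below; the functional form and all parameter values are illustrative assumptions, not taken from Chahinian et al. (2006).

```python
import math

def ks_after_tillage(ks_fresh, ks_crusted, decay, cum_rain_mm):
    """Hypothetical crusting law: saturated hydraulic conductivity
    decays exponentially with cumulative rainfall since tillage,
    from the freshly tilled value toward a crusted-soil floor.

    ks_fresh, ks_crusted : conductivities (e.g. mm/h) just after
                           tillage and once fully crusted
    decay                : crusting rate per mm of rainfall
    cum_rain_mm          : cumulative rainfall since tillage (mm)
    """
    return ks_crusted + (ks_fresh - ks_crusted) * math.exp(-decay * cum_rain_mm)
```

    The design choice mirrors the abstract's observation: calibrated Ks values decrease monotonically with rainfall since tillage, so a single decay parameter can replace event-by-event recalibration.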

  19. External Validation of the Updated Partin Tables in a Cohort of French and Italian Men

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhojani, Naeem; Department of Urology, University of Montreal, Montreal, PQ; Salomon, Laurent

    2009-02-01

Purpose: To test the discrimination and calibration properties of the newly developed 2007 Partin Tables in two European cohorts with localized prostate cancer. Methods: Data on clinical and pathologic characteristics were obtained for 1,064 men treated with radical prostatectomy at the Creteil University Health Center in France (n = 839) and at the Milan University Vita-Salute in Italy (n = 225). Overall discrimination was assessed with receiver operating characteristic curve analysis, which quantified the accuracy of stage predictions for each center. Calibration plots graphically explored the relationship between predicted and observed rates of extracapsular extension (ECE), seminal vesicle invasion (SVI) and lymph node invasion (LNI). Results: The rates of ECE, SVI, and LNI were 28%, 14%, and 2% in the Creteil cohort vs. 11%, 5%, and 5% in the Milan cohort. In the Creteil cohort, the accuracy of ECE, SVI, and LNI prediction was 61%, 71%, and 82% vs. 66%, 92% and 75% for the Milan cohort. Important departures were recorded between the Partin Tables' predicted and observed rates of ECE, SVI, and LNI within both cohorts. Conclusions: The 2007 Partin Tables demonstrated worse performance in European men than they originally did in North American men. This indicates that predictive models need to be externally validated before their implementation into clinical practice.

  20. Calibration of multivariate scatter plots for exploratory analysis of relations within and between sets of variables in genomic research.

    PubMed

    Graffelman, Jan; van Eeuwijk, Fred

    2005-12-01

    The scatter plot is a well known and easily applicable graphical tool to explore relationships between two quantitative variables. For the exploration of relations between multiple variables, generalisations of the scatter plot are useful. We present an overview of multivariate scatter plots focussing on the following situations. Firstly, we look at a scatter plot for portraying relations between quantitative variables within one data matrix. Secondly, we discuss a similar plot for the case of qualitative variables. Thirdly, we describe scatter plots for the relationships between two sets of variables where we focus on correlations. Finally, we treat plots of the relationships between multiple response and predictor variables, focussing on the matrix of regression coefficients. We will present both known and new results, where an important original contribution concerns a procedure for the inclusion of scales for the variables in multivariate scatter plots. We provide software for drawing such scales. We illustrate the construction and interpretation of the plots by means of examples on data collected in a genomic research program on taste in tomato.
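
    A minimal sketch of the SVD-based coordinates that underlie such multivariate scatter plots — this is the standard rank-2 principal-component biplot, not the authors' specific calibrated-scale procedure (their contribution, the inclusion of variable scales, is beyond this sketch):

```python
import numpy as np

def biplot_coordinates(X):
    """Rank-2 biplot markers for a data matrix X.

    Returns (rows, cols): observations in principal coordinates and
    variables as loading directions, chosen so that rows @ cols.T
    reproduces the column-centered data exactly when rank(Xc) <= 2.
    """
    Xc = X - X.mean(axis=0)                       # column-center
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    rows = U[:, :2] * s[:2]                       # observation markers
    cols = Vt[:2].T                               # variable markers
    return rows, cols
```

    Plotting `rows` as points and `cols` as arrows from the origin gives the familiar biplot; inner products between a point and an arrow approximate that observation's centered value on that variable.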

  1. SURF Model Calibration Strategy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Menikoff, Ralph

    2017-03-10

SURF and SURFplus are high explosive reactive burn models for shock initiation and propagation of detonation waves. They are engineering models motivated by the ignition & growth concept of hot spots and, for SURFplus, a second slow reaction for the energy release from carbon clustering. A key feature of the SURF model is that there is a partial decoupling between model parameters and detonation properties. This enables reduced sets of independent parameters to be calibrated sequentially for the initiation and propagation regimes. Here we focus on a methodology for fitting the initiation parameters to Pop plot data, based on 1-D simulations to compute a numerical Pop plot. In addition, the strategy for fitting the remaining parameters for the propagation regime and failure diameter is discussed.
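
    A Pop plot is conventionally a log-log line of run distance to detonation versus input shock pressure. Fitting one from (possibly numerical) data can be sketched as follows; the synthetic values in the test are illustrative, not SURF calibration data.

```python
import numpy as np

def pop_plot_fit(pressure, run_distance):
    """Fit the Pop plot line log10(x_run) = a + b*log10(P).

    Returns (a, b). The slope b is negative for real explosives:
    a stronger input shock gives a shorter run distance to detonation.
    """
    b, a = np.polyfit(np.log10(pressure), np.log10(run_distance), 1)
    return a, b
```

    The fitted (a, b) pair is what an initiation-parameter calibration would target: model parameters are adjusted until the numerical Pop plot from 1-D simulations matches the experimental line.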

  2. Calibration of the STEMS diameter growth model using FIA data

    Treesearch

    Veronica C. Lessard

    2000-01-01

    The diameter growth model used in STEMS, the Stand and Tree Evaluation and Modeling System, was originally calibrated using data from permanent growth plots in Minnesota, Wisconsin, and Michigan. Because the model has been applied in predicting growth using Forest Inventory and Analysis (FIA) data, it was appropriate to refit the model to FIA data. The model was...

  3. 40 CFR 61.32 - Emission standard.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... frequency of calibration. (b) Method of sample analysis. (c) Averaging technique for determining 30-day...) Plant and sampling area plots showing emission points and sampling sites. Topographic features...

  4. The impact of forest structure and spatial scale on the relationship between ground plot above ground biomass and GEDI lidar waveforms

    NASA Astrophysics Data System (ADS)

    Armston, J.; Marselis, S.; Hancock, S.; Duncanson, L.; Tang, H.; Kellner, J. R.; Calders, K.; Disney, M.; Dubayah, R.

    2017-12-01

The NASA Global Ecosystem Dynamics Investigation (GEDI) will place a multi-beam waveform lidar instrument on the International Space Station (ISS) to provide measurements of forest vertical structure globally. These measurements of structure will underpin empirical modelling of above ground biomass density (AGBD) at the scale of individual GEDI lidar footprints (25 m diameter). The GEDI pre-launch calibration strategy for footprint-level models relies on linking AGBD estimates from ground plots with GEDI lidar waveforms simulated from coincident discrete return airborne laser scanning data. Currently available ground plot data have variable and often large uncertainty at the spatial resolution of GEDI footprints due to poor colocation, allometric model error, sample size and plot edge effects. The relative importance of these sources of uncertainty partly depends on the quality of ground measurements and region. It is usually difficult to know the magnitude of these uncertainties a priori, so a common approach to mitigate their influence on model training is to aggregate ground plot and waveform lidar data to a coarser spatial scale (0.25-1 ha). Here we examine the impacts of these principal sources of uncertainty using a 3D simulation approach. Sets of realistic tree models generated from terrestrial laser scanning (TLS) data or parametric modelling matched to tree inventory data were assembled from four contrasting forest plots across tropical rainforest, deciduous temperate forest, and sclerophyll eucalypt woodland sites. These tree models were used to simulate geometrically explicit 3D scenes with variable tree density, size class and spatial distribution. GEDI lidar waveforms are simulated over ground plots within these scenes using Monte Carlo ray tracing, allowing the impact of varying ground plot and waveform colocation error, forest structure and edge effects on the relationship between ground plot AGBD and GEDI lidar waveforms to be directly assessed. 
We quantify the sensitivity of calibration equations relating GEDI lidar structure measurements and AGBD to these factors at a range of spatial scales (0.0625-1 ha) and discuss the implications for the expanding use of existing in situ ground plot data by GEDI.

  5. Inferring the demographic history from DNA sequences: An importance sampling approach based on non-homogeneous processes.

    PubMed

    Ait Kaci Azzou, S; Larribe, F; Froda, S

    2016-10-01

In Ait Kaci Azzou et al. (2015) we introduced an Importance Sampling (IS) approach for estimating the demographic history of a sample of DNA sequences, the skywis plot. More precisely, we proposed a new nonparametric estimate of a population size that changes over time. We showed on simulated data that the skywis plot can work well in typical situations where the effective population size does not undergo very steep changes. In this paper, we introduce an iterative procedure which extends the previous method and gives good estimates under such rapid variations. In the iterative calibrated skywis plot, we approximate the effective population size by a piecewise constant function, whose values are re-estimated at each step. These piecewise constant functions are used to generate the waiting times of non-homogeneous Poisson processes related to a coalescent process with mutation under a variable population size model. Moreover, the present IS procedure is based on a modified version of the Stephens and Donnelly (2000) proposal distribution. Finally, we apply the iterative calibrated skywis plot method to a simulated data set from a rapidly expanding exponential model, and we show that the method based on this new IS strategy correctly reconstructs the demographic history. Copyright © 2016. Published by Elsevier Inc.

  6. Data user's notes of the radio astronomy experiment aboard the OGO-V spacecraft

    NASA Technical Reports Server (NTRS)

    Haddock, F. T.; Breckenridge, S. L.

    1970-01-01

    General information concerning the low-frequency radiometer, instrument package launching and operation, and scientific objectives of the flight are provided. Calibration curves and correction factors, with general and detailed information on the preflight calibration procedure are included. The data acquisition methods and the format of the data reduction, both on 35 mm film and on incremental computer plots, are described.

  7. Estimating Aboveground Biomass in Tropical Forests: Field Methods and Error Analysis for the Calibration of Remote Sensing Observations

    DOE PAGES

    Gonçalves, Fabio; Treuhaft, Robert; Law, Beverly; ...

    2017-01-07

    Mapping and monitoring of forest carbon stocks across large areas in the tropics will necessarily rely on remote sensing approaches, which in turn depend on field estimates of biomass for calibration and validation purposes. Here, we used field plot data collected in a tropical moist forest in the central Amazon to gain a better understanding of the uncertainty associated with plot-level biomass estimates obtained specifically for the calibration of remote sensing measurements. In addition to accounting for sources of error that would be normally expected in conventional biomass estimates (e.g., measurement and allometric errors), we examined two sources of uncertainty that are specific to the calibration process and should be taken into account in most remote sensing studies: the error resulting from spatial disagreement between field and remote sensing measurements (i.e., co-location error), and the error introduced when accounting for temporal differences in data acquisition. We found that the overall uncertainty in the field biomass was typically 25% for both secondary and primary forests, but ranged from 16 to 53%. Co-location and temporal errors accounted for a large fraction of the total variance (>65%) and were identified as important targets for reducing uncertainty in studies relating tropical forest biomass to remotely sensed data. Although measurement and allometric errors were relatively unimportant when considered alone, combined they accounted for roughly 30% of the total variance on average and should not be ignored. Lastly, our results suggest that a thorough understanding of the sources of error associated with field-measured plot-level biomass estimates in tropical forests is critical to determine confidence in remote sensing estimates of carbon stocks and fluxes, and to develop strategies for reducing the overall uncertainty of remote sensing approaches.

  8. An Analytical Calibration Approach for the Polarimetric Airborne C Band Radiometer

    NASA Technical Reports Server (NTRS)

    Pham, Hanh; Kim, Edward J.

    2004-01-01

    Passive microwave remote sensing is sensitive to the quantity and distribution of water in soil and vegetation. During summer 2000, the Microwave Geophysics Group at the University of Michigan conducted the seventh Radiobrightness Energy Balance Experiment (REBEX-7) over a corn canopy in Michigan. Long time series of brightness temperatures, soil moisture and micrometeorology on the plot were taken. This paper addresses the calibration of the NASA GSFC polarimetric airborne C band microwave radiometer (ACMR) that participated in REBEX-7. These passive polarimeters are typically calibrated using an end-to-end approach based upon a standard artificial target or a well-known geophysical target. Analyzing the major internal functional subsystems offers a different perspective. The primary goal of this approach is to provide a transfer function that not only describes the system in its entirety but also accounts for the contributions of each subsystem toward the final modified Stokes parameters. This approach does not assume that the radiometric system is linear, as it does not take polarization isolation for granted, and it also serves as a realistic instrument simulator, a useful tool for future designs. The ACMR architecture can be partitioned into functional subsystems. The characteristics of each subsystem were extensively measured and the estimated parameters were imported into the overall closed-form system model. Inversion of the model yields a calibration for the modeled Stokes parameters with uncertainties of 0.2 K for the V and H polarizations and 2.4 K for the 3rd and 4th parameters. Application to the full Stokes parameters over a senescent cornfield is presented.

  9. Integrating the ODI-PPA scientific gateway with the QuickReduce pipeline for on-demand processing

    NASA Astrophysics Data System (ADS)

    Young, Michael D.; Kotulla, Ralf; Gopu, Arvind; Liu, Wilson

    2014-07-01

    As imaging systems improve, the size of astronomical data has continued to grow, making the transfer and processing of data a significant burden. To solve this problem for the WIYN Observatory One Degree Imager (ODI), we developed the ODI-Portal, Pipeline, and Archive (ODI-PPA) science gateway, integrating the data archive, data reduction pipelines, and a user portal. In this paper, we discuss the integration of the QuickReduce (QR) pipeline into PPA's Tier 2 processing framework. QR is a set of parallelized, stand-alone Python routines accessible to all users, and to operators who can create master calibration products and produce standardized calibrated data with a short turn-around time. Upon completion, the data are ingested into the archive and portal, and made available to authorized users. Quality metrics and diagnostic plots are generated and presented via the portal for operator approval and user perusal. Additionally, users can tailor the calibration process to their specific science objective(s) by selecting custom datasets, applying preferred master calibrations or generating their own, and selecting pipeline options. Submission of a QuickReduce job initiates data staging, pipeline execution, and ingestion of output data products, all while allowing the user to monitor the process status, and to download or further process/analyze the output within the portal. User-generated data products are placed into a private user-space within the portal. ODI-PPA leverages cyberinfrastructure at Indiana University including the Big Red II supercomputer, the Scholarly Data Archive tape system and the Data Capacitor shared file system.

  10. A comparison of the abilities of the USLE-M, RUSLE2 and WEPP to model event erosion from bare fallow areas.

    PubMed

    Kinnell, P I A

    2017-10-15

    Traditionally, the Universal Soil Loss Equation (USLE) and its revised version (RUSLE) have been applied to predicting the long-term average soil loss produced by rainfall erosion in many parts of the world. Over time, it has been recognized that there is a need to predict soil losses over shorter time scales, and this has led to the development of WEPP and RUSLE2, which can be used to predict soil losses generated by individual rainfall events. Data currently exist that enable RUSLE2, WEPP and the USLE-M to estimate historic soil losses from bare fallow runoff and soil loss plots recorded in the USLE database. Comparisons of the abilities of the USLE-M and RUSLE2 to estimate event soil losses from bare fallow were undertaken under circumstances where both models produced the same total soil loss as observed for sets of erosion events on 4 different plots at 4 different locations. Likewise, comparisons of the abilities of the USLE-M and WEPP to model event soil loss from bare fallow were undertaken for sets of erosion events on 4 plots at 4 different locations. Despite being calibrated specifically for each plot, WEPP produced the worst estimates of event soil loss for all 4 plots. Generally, the USLE-M using measured runoff to calculate the product of the runoff ratio, storm kinetic energy and the maximum 30-minute rainfall intensity produced the best estimates. As expected, the ability of the USLE-M to estimate event soil loss was reduced when runoff predicted by either RUSLE2 or WEPP was used. Despite this, the USLE-M using runoff predicted by WEPP estimated event soil loss better than WEPP. RUSLE2 also outperformed WEPP. Copyright © 2017 Elsevier B.V. All rights reserved.
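The USLE-M event form referred to above replaces the storm erosivity index EI30 with the runoff-ratio-weighted product QR*E*I30. A minimal sketch, with all parameter values purely illustrative rather than calibrated:

```python
# Hedged sketch of the USLE-M event form: the event erosivity index
# E*I30 is multiplied by the runoff ratio QR (runoff depth / rainfall
# depth), so A_e = K_UM * (QR * E * I30) * L * S with C = P = 1 for
# bare fallow. All parameter values here are illustrative.
def usle_m_event_soil_loss(runoff_mm, rain_mm, E, I30, K_UM, LS):
    QR = runoff_mm / rain_mm      # runoff ratio for the event
    erosivity = QR * E * I30      # QR-weighted EI30 (the USLE-M index)
    return K_UM * erosivity * LS  # event soil loss (e.g. t/ha)

loss = usle_m_event_soil_loss(
    runoff_mm=12.0,  # measured event runoff depth
    rain_mm=30.0,    # event rainfall depth
    E=8.5,           # storm kinetic energy, MJ/ha
    I30=25.0,        # maximum 30-minute intensity, mm/h
    K_UM=0.04,       # hypothetical USLE-M erodibility
    LS=1.2,          # slope length-steepness factor
)
print(f"estimated event soil loss: {loss:.2f} t/ha")
```

Using measured runoff for QR (rather than WEPP- or RUSLE2-predicted runoff) is exactly the configuration the abstract reports as performing best.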

  11. Identifying, monitoring and implementing "sustainable" agricultural practices for smallholder farmers over large geographic areas in India and Vietnam

    NASA Astrophysics Data System (ADS)

    Kritee, K.; Ahuja, R.; Nair, D.; Esteves, T.; Rudek, J.; Thu Ha, T.

    2015-12-01

    Industrial agriculture systems, mostly in developed and some emerging economies, are far different from the small-holder farms (size <1 acre) in Asia and Africa. Along with our partners from non-governmental, corporate, academic and government sectors and tens of thousands of farming families, we have worked actively in five states in India and two provinces in Vietnam for the last five years to understand how sustainable and climate-smart farming practices can be monitored at small-holder farms. Here, any approach to monitor farming must begin by accounting for the tremendous management variability from farm to farm and also the current inability to ground-truth remote sensing data due to lack of reliable basic parameters (e.g., yields, N use, farm boundaries) which are necessary for calibrating empirical/biogeochemical models. While we continue to learn from new research, we have found that it is crucial to follow some steps if sustainable farming programs are to succeed at small-holder farms:
    - Demographic data collection and GPS plot demarcation to establish farm size and ownership
    - Baseline nutrient, water & energy use and crop yield determination via surveys and self-reporting, which are verifiable through farmer networks given the importance of peer-to-peer learning in the dissemination of new techniques in such landscapes
    - "Sustainable" practice determination in consultation with local universities/NGO experts
    - Measurements on representative plots for 3-4 years to help calibrate biogeochemical models and/or empirical equations and establish which practices are truly "sustainable" (e.g., GHG emission reduction varies from 0-7 tCO2e/acre for different sustainable practices)
    - Propagation of sustainable practices across the landscape via local NGOs/governments after analyzing the replicability of identified farming practices in the light of local financial, cultural or socio-political barriers.
We will present results from representative plots (including soil and weather parameters, GHG emissions, yields, inputs, economic and environmental savings), farmer surveys and diary data; and discuss our key conclusions based on our approach and the analysis of the collected data which was enabled by use of a commercially available comprehensive agricultural data collection software.

  12. Geometrical Calibration of the Photo-Spectral System and Digital Maps Retrieval

    NASA Astrophysics Data System (ADS)

    Bruchkouskaya, S.; Skachkova, A.; Katkovski, L.; Martinov, A.

    2013-12-01

    Imaging systems for remote sensing of the Earth are required to demonstrate high metric accuracy of the picture which can be provided through preliminary geometrical calibration of optical systems. Being defined as a result of the geometrical calibration, parameters of internal and external orientation of the cameras are needed while solving such problems of image processing, as orthotransformation, geometrical correction, geographical coordinate fixing, scale adjustment and image registration from various channels and cameras, creation of image mosaics of filmed territories, and determination of geometrical characteristics of objects in the images. The geometrical calibration also helps to eliminate image deformations arising due to manufacturing defects and errors in installation of camera elements and photo receiving matrices as well as those resulted from lens distortions. A Photo-Spectral System (PhSS), which is intended for registering reflected radiation spectra of underlying surfaces in a wavelength range from 350 nm to 1050 nm and recording images of high spatial resolution, has been developed at the A.N. Sevchenko Research Institute of Applied Physical Problems of the Belarusian State University. The PhSS has undergone flight tests over the territory of Belarus onboard the Antonov AN-2 aircraft with the aim to obtain visible range images of the underlying surface. Then we performed the geometrical calibration of the PhSS and carried out the correction of images obtained during the flight tests. Furthermore, we have plotted digital maps of the terrain using the stereo pairs of images acquired from the PhSS and evaluated the accuracy of the created maps. Having obtained the calibration parameters, we apply them for correction of the images from another identical PhSS device, which is located at the Russian Orbital Segment of the International Space Station (ROS ISS), aiming to retrieve digital maps of the terrain with higher accuracy.

  13. Multicenter Evaluation of a Commercial Cytomegalovirus Quantitative Standard: Effects of Commutability on Interlaboratory Concordance

    PubMed Central

    Shahbazian, M. D.; Valsamakis, A.; Boonyaratanakornkit, J.; Cook, L.; Pang, X. L.; Preiksaitis, J. K.; Schönbrunner, E. R.; Caliendo, A. M.

    2013-01-01

    Commutability of quantitative reference materials has proven important for reliable and accurate results in clinical chemistry. As international reference standards and commercially produced calibration material have become available to address the variability of viral load assays, the degree to which such materials are commutable and the effect of commutability on assay concordance have been questioned. To investigate this, 60 archived clinical plasma samples, which previously tested positive for cytomegalovirus (CMV), were retested by five different laboratories, each using a different quantitative CMV PCR assay. Results from each laboratory were calibrated both with lab-specific quantitative CMV standards (“lab standards”) and with common, commercially available standards (“CMV panel”). Pairwise analyses among laboratories were performed using mean results from each clinical sample, calibrated first with lab standards and then with the CMV panel. Commutability of the CMV panel was determined based on difference plots for each laboratory pair showing plotted values of standards that were within the 95% prediction intervals for the clinical specimens. Commutability was demonstrated for 6 of 10 laboratory pairs using the CMV panel. In half of these pairs, use of the CMV panel improved quantitative agreement compared to use of lab standards. Two of four laboratory pairs for which the CMV panel was noncommutable showed reduced quantitative agreement when that panel was used as a common calibrator. Commutability of calibration material varies across different quantitative PCR methods. Use of a common, commutable quantitative standard can improve agreement across different assays; use of a noncommutable calibrator can reduce agreement among laboratories. PMID:24025907
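The commutability assessment described here can be sketched for one laboratory pair: regress the between-lab difference on the sample mean, form a 95% prediction interval, and test whether the candidate standard's difference falls inside it. The data and t-quantile below are synthetic and assumed for illustration:

```python
import math

# Hedged sketch of a commutability check for one laboratory pair:
# regress the lab1-lab2 difference on the sample mean (log10 IU/mL),
# build a 95% prediction interval, and ask whether the candidate
# standard's difference falls inside it. All data are synthetic.
def fit_line(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sxx
    return my - b * mx, b, mx, sxx      # intercept, slope, mean x, Sxx

def commutable(x, y, x0, d0, t95):
    a, b, mx, sxx = fit_line(x, y)
    n = len(x)
    resid = [yi - (a + b * xi) for xi, yi in zip(x, y)]
    s = math.sqrt(sum(r ** 2 for r in resid) / (n - 2))
    half = t95 * s * math.sqrt(1 + 1 / n + (x0 - mx) ** 2 / sxx)
    pred = a + b * x0
    return (pred - half) <= d0 <= (pred + half)

# synthetic clinical samples: means and lab1-lab2 differences (log10)
means = [2.1, 2.8, 3.4, 3.9, 4.5, 5.0, 5.6, 6.2]
diffs = [0.32, 0.28, 0.35, 0.30, 0.27, 0.33, 0.29, 0.31]
# candidate standard at 4.0 log10 with observed difference 0.30;
# t95 ~ 2.447 for n - 2 = 6 degrees of freedom (assumed quantile)
print(commutable(means, diffs, x0=4.0, d0=0.30, t95=2.447))
```

A standard whose difference sits far outside the interval (e.g. d0 = 1.5 here) would be flagged noncommutable for that pair.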

  14. Calibration of the Diameter Distribution Derived from the Area-based Approach with Individual Tree-based Diameter Estimates Using the Airborne Laser Scanning

    NASA Astrophysics Data System (ADS)

    Xu, Q.; Hou, Z.; Maltamo, M.; Tokola, T.

    2015-12-01

    Diameter distributions of trees are important indicators of current forest stand structure and future dynamics. A new method was proposed in this study to combine the diameter distribution derived from the area-based approach (ABA) with that derived from individual tree detection (ITD) in order to obtain more accurate forest stand attributes. Since dominant trees can be reliably detected and measured from the LiDAR data via the ITD, the focus of the study is to retrieve the suppressed trees (trees that were missed by the ITD) from the ABA. Replacement and histogram matching were respectively employed at the plot level to retrieve the suppressed trees. A cut point was detected from the ITD-derived diameter distribution for each sample plot to distinguish dominant trees from suppressed trees. The results showed that calibrated diameter distributions were more accurate in terms of the error index and the estimates of the entire growing stock. Compared with the better performer of the ABA and the ITD, calibrated diameter distributions decreased the relative RMSE of the estimated entire growing stock, saw log and pulpwood fractions by 2.81, 3.05 and 7.73 percentage points respectively. Calibration improved the estimation of the pulpwood fraction significantly, resulting in a negligible bias of the estimated entire growing stock.
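The replacement variant of the calibration, keeping ITD trees above the cut point and filling in ABA-predicted suppressed trees below it, can be sketched minimally (diameters in cm are illustrative):

```python
# Hedged sketch of the replacement-based calibration: trust the ITD for
# trees at or above a cut-point diameter (dominant trees are detected
# reliably) and fill in the suppressed trees below the cut point from
# the ABA-predicted diameter distribution. Numbers are illustrative.
def calibrate_distribution(itd_dbh_cm, aba_dbh_cm, cut_point_cm):
    dominant = [d for d in itd_dbh_cm if d >= cut_point_cm]   # from ITD
    suppressed = [d for d in aba_dbh_cm if d < cut_point_cm]  # from ABA
    return sorted(dominant + suppressed)

itd = [8, 21, 24, 27, 30, 33, 38]     # ITD misses most small trees
aba = [6, 9, 11, 13, 22, 26, 31, 36]  # ABA predicts the full range
merged = calibrate_distribution(itd, aba, cut_point_cm=20)
print(merged)
```

The histogram-matching variant the abstract also mentions would instead reshape the ABA histogram to agree with the ITD above the cut point; this sketch shows only the simpler replacement rule.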

  15. A Summary of The 2000-2001 NASA Glenn Lear Jet AM0 Solar Cell Calibration Program

    NASA Technical Reports Server (NTRS)

    Scheiman, David; Brinker, David; Snyder, David; Baraona, Cosmo; Jenkins, Phillip; Rieke, William J.; Blankenship, Kurt S.; Tom, Ellen M.

    2002-01-01

    Calibration of solar cells for space is extremely important for satellite power system design. Accurate prediction of solar cell performance is critical to solar array sizing, often required to be within 1%. The NASA Glenn Research Center solar cell calibration airplane facility has been in operation since 1963 with 531 flights to date. The calibration includes real data to Air Mass (AM) 0.2 and uses the Langley plot method plus an ozone correction factor to extrapolate to AM0. Comparison of the AM0 calibration data indicates that there is good correlation with Balloon and Shuttle flown solar cells. This paper will present a history of the airplane calibration procedure, flying considerations, and a brief summary of the previous flying season with some measurement results. This past flying season had a record 35 flights. It will also discuss efforts to more clearly define the ozone correction factor.
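The Langley plot method mentioned above fits ln(I_sc) against relative air mass and extrapolates to zero air mass; a minimal sketch with synthetic flight data, ignoring the ozone correction factor the paper discusses:

```python
import math

# Hedged sketch of the Langley-plot extrapolation: at several altitudes
# the cell's short-circuit current I_sc is measured along with relative
# air mass m; ln(I_sc) is assumed linear in m (Beer-Lambert), and the
# intercept at m = 0 gives the AM0 current. Data below are synthetic.
def langley_am0(airmass, isc):
    n = len(airmass)
    y = [math.log(v) for v in isc]
    mx, my = sum(airmass) / n, sum(y) / n
    b = sum((x - mx) * (yi - my) for x, yi in zip(airmass, y)) \
        / sum((x - mx) ** 2 for x in airmass)
    a = my - b * mx           # intercept = ln(I_sc at AM0)
    return math.exp(a)

# synthetic flight data generated from I_AM0 = 150 mA, optical depth 0.1
m = [0.2, 0.3, 0.4, 0.5, 0.6]
isc_ma = [150.0 * math.exp(-0.1 * mi) for mi in m]
print(f"extrapolated AM0 current: {langley_am0(m, isc_ma):.1f} mA")
```

Flying at high altitude makes the low air-mass end of the line directly measurable, which is why the aircraft data down to AM 0.2 shorten the extrapolation considerably.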

  16. IN VITRO MEASUREMENT OF TOTAL ANTIOXIDANT CAPACITY OF CRATAEGUS MACRACANTHA LODD LEAVES.

    PubMed

    Miftode, Alina Monica; Stefanache, Alina; Spac, A F; Miftode, R F; Miron, Anca; Dorneanu, V

    2016-01-01

    Crataegus macracantha Lodd, family Rosaceae, is a very rare species in Europe, and unlike Crataegus monogyna it is less investigated for pharmacologic activity. The aim was to analyze the ability of the lyophilisate of an extract obtained from leaves of Crataegus macracantha Lodd (a single plant at the Iaşi Botanical Garden) to capture free radicals in vitro. The lyophilisate was obtained in the Department of Pharmacognosy, Faculty of Pharmacy, "Grigore T. Popa" University of Medicine and Pharmacy Iaşi. The decrease in absorbance of the chromophore chlorpromazine radical cation in the presence of the lyophilisate solutions was studied spectrophotometrically. The indicator radical cation, obtained by oxidation of chlorpromazine with potassium persulfate, has its maximum absorbance at 525 nm. Ascorbic acid was used as a standard antioxidant. The absorbance of the radical solution was determined after the addition of a certain amount of lyophilisate at different time intervals. The antioxidant activity was calculated using the calibration curve obtained by plotting the variation in radical solution absorbance against ascorbic acid concentration. For each ascorbic acid concentration the area under the curve was calculated from a plot of the percentage inhibition of the absorbance at two pre-established time intervals. The results confirm the antioxidant activity of the leaves of Crataegus macracantha Lodd, and by optimizing the proposed analytical methods the antiradical activity can be quickly evaluated with minimal reagent consumption.

  17. A combined microphone and camera calibration technique with application to acoustic imaging.

    PubMed

    Legg, Mathew; Bradley, Stuart

    2013-10-01

    We present a calibration technique for an acoustic imaging microphone array, combined with a digital camera. Computer vision and acoustic time of arrival data are used to obtain microphone coordinates in the camera reference frame. Our new method allows acoustic maps to be plotted onto the camera images without the need for additional camera alignment or calibration. Microphones and cameras may be placed in an ad-hoc arrangement and, after calibration, the coordinates of the microphones are known in the reference frame of a camera in the array. No prior knowledge of microphone positions, inter-microphone spacings, or air temperature is required. This technique is applied to a spherical microphone array and a mean difference of 3 mm was obtained between the coordinates obtained with this calibration technique and those measured using a precision mechanical method.

  18. Calibration and validation of TRUST MRI for the estimation of cerebral blood oxygenation

    PubMed Central

    Lu, Hanzhang; Xu, Feng; Grgac, Ksenija; Liu, Peiying; Qin, Qin; van Zijl, Peter

    2011-01-01

    Recently, a T2-Relaxation-Under-Spin-Tagging (TRUST) MRI technique was developed to quantitatively estimate blood oxygen saturation fraction (Y) via the measurement of pure blood T2. This technique has shown promise for normalization of fMRI signals, for the assessment of oxygen metabolism, and in studies of cognitive aging and multiple sclerosis. However, a human validation study has not been conducted. In addition, the calibration curve used to convert blood T2 to Y has not accounted for the effects of hematocrit (Hct). In the present study, we first conducted experiments on blood samples under physiologic conditions, and the Carr-Purcell-Meiboom-Gill (CPMG) T2 was determined for a range of Y and Hct values. The data were fitted to a two-compartment exchange model to allow the characterization of a three-dimensional plot that can serve to calibrate the in vivo data. Next, in a validation study in humans, we showed that arterial Y estimated using TRUST MRI was 0.837±0.036 (N=7) during the inhalation of 14% O2, which was in excellent agreement with the gold-standard Y values of 0.840±0.036 based on Pulse-Oximetry. These data suggest that the availability of this calibration plot should enhance the applicability of TRUST MRI for non-invasive assessment of cerebral blood oxygenation. PMID:21590721
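Once such a calibration surface is tabulated, converting a measured blood T2 and hematocrit to Y is an interpolation problem. A minimal sketch using bilinear interpolation; the grid values below are hypothetical placeholders, not the published calibration:

```python
# Hedged sketch: using a (T2, Hct) calibration surface to convert a
# measured blood T2 to oxygenation Y by bilinear interpolation. The grid
# values below are hypothetical placeholders, not the published curves.
T2_GRID = [40.0, 60.0, 80.0, 100.0]   # blood T2, ms
HCT_GRID = [0.35, 0.40, 0.45]         # hematocrit fraction
# Y_TABLE[i][j] = Y at (HCT_GRID[i], T2_GRID[j]); Y rises with T2
Y_TABLE = [
    [0.55, 0.68, 0.78, 0.86],
    [0.52, 0.65, 0.75, 0.83],
    [0.49, 0.62, 0.72, 0.80],
]

def interp_y(t2_ms, hct):
    def bracket(grid, v):
        for k in range(len(grid) - 1):
            if grid[k] <= v <= grid[k + 1]:
                return k, (v - grid[k]) / (grid[k + 1] - grid[k])
        raise ValueError("value outside calibration grid")
    i, fh = bracket(HCT_GRID, hct)
    j, ft = bracket(T2_GRID, t2_ms)
    top = Y_TABLE[i][j] * (1 - ft) + Y_TABLE[i][j + 1] * ft
    bot = Y_TABLE[i + 1][j] * (1 - ft) + Y_TABLE[i + 1][j + 1] * ft
    return top * (1 - fh) + bot * fh

print(f"Y = {interp_y(70.0, 0.42):.3f}")
```

In practice the surface comes from fitting the two-compartment exchange model to the blood-sample CPMG data, rather than from a hand-entered table.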

  19. Development of an Analytical Method for the Determination of Amoxicillin in Commercial Drugs and Wastewater Samples, and Assessing its Stability in Simulated Gastric Digestion.

    PubMed

    Unutkan, Tugçe; Bakirdere, Sezgin; Keyf, Seyfullah

    2018-01-01

    A highly sensitive analytical HPLC-UV method was developed for the determination of amoxicillin in drugs and wastewater samples at a single wavelength (230 nm). In order to substantially predict the in vivo behavior of amoxicillin, drug samples were subjected to simulated gastric conditions. The calibration plot of the method was linear from 0.050 to 500 mg L-1 with a correlation coefficient of 0.9999. The limit of detection and limit of quantitation were found to be 16 and 54 μg L-1, respectively. The percentage recovery of amoxicillin in wastewater was found to be 97.0 ± 1.6%. The method was successfully applied for the qualitative and quantitative determination of amoxicillin in drug samples including tablets and suspensions. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
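The linearity and detection-limit figures reported here follow the standard calibration-curve recipe: fit the line, estimate the residual standard deviation s, and take LOD = 3.3 s/slope and LOQ = 10 s/slope (ICH-style). A sketch with synthetic data, not the paper's amoxicillin measurements:

```python
import math

# Hedged sketch of the ICH-style limit calculation from a linear
# calibration: fit signal = a + b*conc, estimate the residual standard
# deviation s, then LOD = 3.3*s/b and LOQ = 10*s/b. Data are synthetic.
def calibration_limits(conc, signal):
    n = len(conc)
    mx, my = sum(conc) / n, sum(signal) / n
    sxx = sum((x - mx) ** 2 for x in conc)
    b = sum((x - mx) * (y - my) for x, y in zip(conc, signal)) / sxx
    a = my - b * mx
    resid = [y - (a + b * x) for x, y in zip(conc, signal)]
    s = math.sqrt(sum(r ** 2 for r in resid) / (n - 2))
    return 3.3 * s / b, 10 * s / b    # LOD, LOQ in concentration units

c = [0.05, 0.5, 5.0, 50.0, 250.0, 500.0]    # mg/L standards
A = [0.004, 0.041, 0.39, 4.02, 19.8, 40.1]  # detector response
lod, loq = calibration_limits(c, A)
print(f"LOD = {lod:.3f} mg/L, LOQ = {loq:.3f} mg/L")
```

Note the abstract's limits are given in micrograms per litre, three orders of magnitude below the lowest calibration standard, so the paper's s was presumably estimated from blank or low-level replicates rather than from fit residuals as in this sketch.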

  20. Colorimetric Determination of the Iron(III)-Thiocyanate Reaction Equilibrium Constant with Calibration and Equilibrium Solutions Prepared in a Cuvette by Sequential Additions of One Reagent to the Other

    ERIC Educational Resources Information Center

    Nyasulu, Frazier; Barlag, Rebecca

    2011-01-01

    The well-known colorimetric determination of the equilibrium constant of the iron(III)-thiocyanate complex is simplified by preparing solutions in a cuvette. For the calibration plot, 0.10 mL increments of 0.00100 M KSCN are added to 4.00 mL of 0.200 M Fe(NO[subscript 3])[subscript 3], and for the equilibrium solutions, 0.50 mL increments of…
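The downstream calculation such an experiment supports can be sketched: the calibration slope converts the equilibrium absorbance to [FeSCN2+] via Beer's law, and K follows from the equilibrium concentrations. All values below are illustrative classroom-scale numbers, not data from the article:

```python
# Hedged sketch of the equilibrium-constant calculation: the calibration
# plot gives a Beer's-law slope (epsilon * path length), the equilibrium
# absorbance gives [FeSCN2+], and K is formed from the equilibrium
# concentrations. All numbers are illustrative.
def equilibrium_constant(A_eq, slope, fe_init, scn_init):
    x = A_eq / slope       # [FeSCN2+] at equilibrium, M (Beer's law)
    fe = fe_init - x       # remaining Fe3+
    scn = scn_init - x     # remaining SCN-
    return x / (fe * scn)

K = equilibrium_constant(
    A_eq=0.250,      # measured equilibrium absorbance
    slope=5000.0,    # calibration slope, absorbance per M
    fe_init=1.0e-3,  # initial Fe3+ after mixing, M
    scn_init=2.0e-4, # initial SCN- after mixing, M
)
print(f"K = {K:.0f}")
```

The calibration standards themselves use a large Fe3+ excess to drive the complexation to completion, which is what makes the Beer's-law slope usable here.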

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Menikoff, Ralph

    Previously the SURFplus reactive burn model was calibrated for the TATB-based explosive PBX 9502. The calibration was based on fitting Pop plot data, the failure diameter and the limiting detonation speed, and curvature effect data for small curvature. The model failure diameter is determined using 2-D simulations of an unconfined rate stick to find the minimum diameter for which a detonation wave propagates. Here we examine the effect of mesh resolution on an unconfined rate stick with a diameter (10 mm) slightly greater than the measured failure diameter (8 to 9 mm).

  2. Selections from 2017: Image Processing with AstroImageJ

    NASA Astrophysics Data System (ADS)

    Kohler, Susanna

    2017-12-01

    Editor's note: In these last two weeks of 2017, we'll be looking at a few selections that we haven't yet discussed on AAS Nova from among the most-downloaded papers published in AAS journals this year. The usual posting schedule will resume in January. AstroImageJ: Image Processing and Photometric Extraction for Ultra-Precise Astronomical Light Curves (published January 2017). [Figure: the AIJ image display. A wide range of astronomy-specific image display options and image analysis tools are available from the menus, quick-access icons, and interactive histogram. Collins et al. 2017] Main takeaway: AstroImageJ is a new integrated software package presented in a publication led by Karen Collins (Vanderbilt University, Fisk University, and University of Louisville). It enables new users, even at the level of undergraduate student, high school student, or amateur astronomer, to quickly start processing, modeling, and plotting astronomical image data. Why it's interesting: Science doesn't just happen the moment a telescope captures a picture of a distant object. Instead, astronomical images must first be carefully processed to clean up the data, and the data must then be systematically analyzed to learn about the objects within it.
AstroImageJ, as a GUI-driven, easily installed, public-domain tool, is uniquely accessible for this processing and analysis, allowing even non-specialist users to explore and visualize astronomical data. Some features of AstroImageJ (as reported by Astrobites):
- Image calibration: generate master flat, dark, and bias frames
- Image arithmetic: combine images via subtraction, addition, division, multiplication, etc.
- Stack editing: easily perform operations on a series of images
- Image stabilization and image alignment features
- Precise coordinate converters: calculate Heliocentric and Barycentric Julian Dates
- WCS coordinates: determine precisely where a telescope was pointed for an image by plate solving using Astrometry.net
- Macro and plugin support: write your own macros
- Multi-aperture photometry with interactive light curve fitting: plot light curves of a star in real time
Citation: Karen A. Collins et al 2017 AJ 153 77. doi:10.3847/1538-3881/153/2/77

  3. Set-up and calibration of an indoor nozzle-type rainfall simulator for soil erosion studies

    NASA Astrophysics Data System (ADS)

    Lassu, T.; Seeger, M.

    2012-04-01

    Rainfall simulation is one of the most prevalent methods used in soil erosion studies on agricultural land. In-situ simulators have been used to relate soil surface characteristics and management to runoff generation, infiltration and erosion, e.g. the influence of different cultivation systems, and to parameterise erosion models. Laboratory rainfall simulators have been used to determine the impact of soil surface characteristics such as micro-topography, surface roughness, and soil chemistry on infiltration and erosion rates, and to elucidate the processes involved. The purpose of the following study is to demonstrate the set-up and the calibration of a large indoor, nozzle-type rainfall simulator (RS) for soil erosion, surface runoff and rill development studies. This RS is part of the Kraijenhoff van de Leur Laboratory for Water and Sediment Dynamics at Wageningen University. The rainfall simulator consists of a 6 m long and 2.5 m wide plot with a metal lateral frame and one open side. Infiltration can be collected in different segments. The plot can be inclined up to a 15.5° slope. From a height of 3.85 m above the plot, two Lechler 460.788 nozzles sprinkle water onto the surface at constant intensity. A Zehnder HMP 450 pump provides the constant water supply. An automatic pressure switch on the pump keeps the pressure constant during the experiments. The flow rate is controlled for each nozzle by independent valves. Additionally, solenoid valves are mounted at each nozzle to interrupt water flow. The flow is monitored for each nozzle with flow meters and can be recorded within the computer network. To calibrate the RS we measured the rainfall distribution with 60 gauges equally distributed over the plot during 15 minutes, for each nozzle independently and for a combination of 2 identical nozzles. The rainfall energy was recorded on the same grid by measuring drop size distribution and fall velocity with a laser disdrometer.
We applied 2 different flow rates (4.5 l/min and 5.5 l/min), resulting in different rainfall intensities, and made 2 repetitions each. The average rainfall intensity was 36.8 mm/h at the first and 37.6 mm/h at the second repetition with the lower flow rate (4.5 l/min). With the higher flow rate (5.5 l/min) it was 44.4 mm/h at the first repetition and 46 mm/h at the second one. The maximum and minimum gauge values were 22 mm and 2 mm at the lower flow rate (4.5 l/min), and 26 mm and 4 mm at the higher one (5.5 l/min). In the latter case, the resulting average kinetic energy reached 7 J m-2 mm-1, with a maximum of 31.3 J m-2 mm-1 and a minimum of 2.9 J m-2 mm-1. The Christiansen Uniformity coefficient (CU) was 66% and 69% for the lower intensities, and slightly better (70% and 72%) for the higher intensities. These data make the rainfall simulator in Wageningen a promising tool for research into soil erosion processes.
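The Christiansen Uniformity coefficient quoted above is computed from the gauge catch depths as CU = 100 * (1 - sum|x_i - mean| / (n * mean)); a minimal sketch with illustrative gauge readings:

```python
# Hedged sketch of the Christiansen Uniformity coefficient used above:
# CU = 100 * (1 - sum(|x_i - mean|) / (n * mean)), computed over the
# catch depths of the rain gauges. The gauge readings are illustrative,
# not the Wageningen calibration data.
def christiansen_cu(depths_mm):
    n = len(depths_mm)
    mean = sum(depths_mm) / n
    dev = sum(abs(d - mean) for d in depths_mm)
    return 100.0 * (1.0 - dev / (n * mean))

gauges = [9.0, 10.0, 11.0, 8.0, 12.0, 10.0, 9.5, 10.5]  # mm per 15 min
print(f"CU = {christiansen_cu(gauges):.1f}%")
```

CU of 100% means perfectly uniform catch; the 66-72% values reported above are typical for large nozzle-type simulators.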

  4. Psychophysica: Mathematica notebooks for psychophysical experiments (cinematica--psychometrica--quest)

    NASA Technical Reports Server (NTRS)

    Watson, A. B.; Solomon, J. A.

    1997-01-01

    Psychophysica is a set of software tools for psychophysical research. Functions are provided for calibrated visual displays, for fitting and plotting of psychometric functions, and for the QUEST adaptive staircase procedure. The functions are written in the Mathematica programming language.
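A typical psychometric function of the kind such toolkits fit is a Weibull with guess and lapse rates; a minimal sketch (parameter values are illustrative, not Psychophysica's defaults):

```python
import math

# Hedged sketch of a Weibull psychometric function with guess rate
# gamma (e.g. 0.5 for 2AFC) and lapse rate lam:
# P(c) = gamma + (1 - gamma - lam) * (1 - exp(-(c / alpha) ** beta)).
# Parameter values below are illustrative.
def weibull_p(contrast, alpha, beta, gamma=0.5, lam=0.02):
    return gamma + (1 - gamma - lam) * (1 - math.exp(-(contrast / alpha) ** beta))

# at c = alpha the curve passes through gamma + (1 - gamma - lam)*(1 - 1/e)
print(f"P(alpha) = {weibull_p(0.1, alpha=0.1, beta=3.5):.3f}")
```

Adaptive procedures like QUEST place each trial near the current threshold estimate of alpha rather than sampling the whole curve.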

  5. Estimating Aboveground Forest Carbon Stock of Major Tropical Forest Land Uses Using Airborne Lidar and Field Measurement Data in Central Sumatra

    NASA Astrophysics Data System (ADS)

    Thapa, R. B.; Watanabe, M.; Motohka, T.; Shiraishi, T.; shimada, M.

    2013-12-01

    Tropical forests are providing environmental goods and services including carbon sequestration, energy regulation, water fluxes, wildlife habitats, fuel, and building materials. Despite the policy attention, the tropical forest reserve in Southeast Asian region is releasing vast amount of carbon to the atmosphere due to deforestation. Establishing quality forest statistics and documenting aboveground forest carbon stocks (AFCS) are emerging in the region. Airborne and satellite based large area monitoring methods are developed to compliment conventional plot based field measurement methods as they are costly, time consuming, and difficult to implement for large regions. But these methods still require adequate ground measurements for calibrating accurate AFCS model. Furthermore, tropical region comprised of varieties of natural and plantation forests capping higher variability of forest structures and biomass volumes. To address this issue and the needs for ground data, we propose the systematic collection of ground data integrated with airborne light detection and ranging (LiDAR) data. Airborne LiDAR enables accurate measures of vertical forest structure, including canopy height and volume demanding less ground measurement plots. Using an appropriate forest type based LiDAR sampling framework, structural properties of forest can be quantified and treated similar to ground measurement plots, producing locally relevant information to use independently with satellite data sources including synthetic aperture radar (SAR). In this study, we examined LiDAR derived forest parameters with field measured data and developed general and specific AFCS models for tropical forests in central Sumatra. The general model is fitted for all types of natural and plantation forests while the specific model is fitted to the specific forest type. 
The study region consists of natural forests, including peat swamp and dry moist forests, regrowth, and mangrove, and plantation forests, including rubber, acacia, oil palm, and coconut. To cover these variations of forest type, eight LiDAR transects crossing 60 (1-ha size) field plots were acquired for calibrating the models. The field plots spanned AFCS values from 4 - 161 Mg/ha. The calibrated general LiDAR-to-AFCS model predicted AFCS with R2 = 0.87 and a root mean square error (RMSE) of 17.4 Mg/ha. The specific AFCS models provided carbon estimates that varied by forest type, with R2 ranging from 0.72 - 0.97 and uncertainty (RMSE) ranging from 1.4 - 10.7 Mg/ha. Using these models, AFCS maps were prepared for the LiDAR coverage, providing AFCS estimates over 8,000 ha and an enlarged ground-sampling base for calibrating SAR-based carbon mapping models over a wider region of Sumatra.
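The general model above is, in essence, a regression of field-measured AFCS on LiDAR structure metrics, scored by R2 and RMSE. A minimal sketch of that fit-and-score step in Python, using hypothetical canopy heights and carbon densities (not the study's data) and a simple linear form rather than the authors' actual model:

```python
import numpy as np

def fit_afcs_model(height, afcs):
    """Least-squares fit of a linear LiDAR-height-to-carbon model
    AFCS = a * H + b, returning (a, b, R2, RMSE)."""
    a, b = np.polyfit(height, afcs, 1)
    pred = a * height + b
    resid = afcs - pred
    rmse = float(np.sqrt(np.mean(resid ** 2)))
    ss_res = float(np.sum(resid ** 2))
    ss_tot = float(np.sum((afcs - afcs.mean()) ** 2))
    r2 = 1.0 - ss_res / ss_tot
    return a, b, r2, rmse

# Hypothetical plots: mean canopy height (m) vs field AFCS (Mg/ha)
H = np.array([5.0, 10.0, 15.0, 20.0, 25.0, 30.0])
C = np.array([12.0, 35.0, 60.0, 80.0, 110.0, 150.0])
a, b, r2, rmse = fit_afcs_model(H, C)
```

The same scoring applies unchanged to the forest-type-specific models, each fitted to its own subset of plots.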

  6. Results of calibrations of the NOAA-11 AVHRR made by reference to calibrated SPOT imagery at White Sands, N.M

    NASA Technical Reports Server (NTRS)

    Nianzeng, Che; Grant, Barbara G.; Flittner, David E.; Slater, Philip N.; Biggar, Stuart F.; Jackson, Ray D.; Moran, M. S.

    1991-01-01

    The calibration method reported here makes use of the reflectances of several large, uniform areas determined from calibrated and atmospherically corrected SPOT Haute Resolution Visible (HRV) scenes of White Sands, New Mexico. These reflectances were used to predict the radiances in the first two channels of the NOAA-11 Advanced Very High Resolution Radiometer (AVHRR). The digital counts in the AVHRR image corresponding to these known reflectance areas were determined by the use of two image registration techniques. The plots of digital counts versus pixel radiance provided the calibration gains and offsets for the AVHRR. A reduction in the gains of 4 and 13 percent in channels 1 and 2 respectively was found during the period 1988-11-19 to 1990-6-21. An error budget is presented for the method and is extended to the case of cross-calibrating sensors on the same orbital platform in the Earth Observing System (EOS) era.
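The gains and offsets described above come from a straight-line fit of AVHRR digital counts against the SPOT-predicted radiances. A hedged sketch of that step, with invented count/radiance pairs (the gain and offset values here are illustrative, not the NOAA-11 results):

```python
import numpy as np

def calibrate_channel(counts, radiance):
    """Linear radiometric calibration L = gain * DN + offset, fitted by
    least squares from matched digital counts and known radiances."""
    gain, offset = np.polyfit(counts, radiance, 1)
    return gain, offset

# Hypothetical matched pairs for one channel over the White Sands targets
dn = np.array([120.0, 300.0, 480.0, 650.0, 810.0])
L = 0.55 * dn - 40.0  # synthetic "truth": gain 0.55, offset -40
gain, offset = calibrate_channel(dn, L)
```

Repeating the fit on imagery from two dates and comparing the gains gives the percentage degradation quoted in the abstract.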

  7. Recent progress in the joint multisensor mine-signatures database project

    NASA Astrophysics Data System (ADS)

    Lewis, Adam M.; Verlinde, Patrick S. A.; Acheroy, Marc P. J.; Sieber, Alois J.

    2002-08-01

    The MsMs project is a major campaign to collect calibrated and well-documented data, suitable for use by workers developing advanced multisensor algorithms for antipersonnel mine detection. The data, together with a full description of the site layout and measurement protocols, are publicly available via the internet site http://demining.jrc.it/msms. Measurements are made on a test lane consisting of 7 plots of different soils, each 6m by 6m, populated with surrogate mines, calibration objects, simulated clutter and position markers. There are 48 targets in each plot, configured identically for all plots. A first report was presented last year. Since then, laser acoustic vibrometer and magnetometer data have been added and the metal detector and thermal infrared data have been augmented. The database has been reformatted to make it more uniform and user-friendly and to remove typographic mistakes. The test site remains essentially unchanged, apart from some equipment upgrades, and is available for further data collection. In particular, the targets have not been moved, so as to provide stable surrounding soil conditions representative of mines left undisturbed for long periods post-conflict. This presentation will describe the new data and data format, the status of the upgrades and the outlook for the future.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gonçalves, Fabio; Treuhaft, Robert; Law, Beverly

Mapping and monitoring of forest carbon stocks across large areas in the tropics will necessarily rely on remote sensing approaches, which in turn depend on field estimates of biomass for calibration and validation purposes. Here, we used field plot data collected in a tropical moist forest in the central Amazon to gain a better understanding of the uncertainty associated with plot-level biomass estimates obtained specifically for the calibration of remote sensing measurements. In addition to accounting for sources of error that would be normally expected in conventional biomass estimates (e.g., measurement and allometric errors), we examined two sources of uncertainty that are specific to the calibration process and should be taken into account in most remote sensing studies: the error resulting from spatial disagreement between field and remote sensing measurements (i.e., co-location error), and the error introduced when accounting for temporal differences in data acquisition. We found that the overall uncertainty in the field biomass was typically 25% for both secondary and primary forests, but ranged from 16 to 53%. Co-location and temporal errors accounted for a large fraction of the total variance (>65%) and were identified as important targets for reducing uncertainty in studies relating tropical forest biomass to remotely sensed data. Although measurement and allometric errors were relatively unimportant when considered alone, combined they accounted for roughly 30% of the total variance on average and should not be ignored. Lastly, our results suggest that a thorough understanding of the sources of error associated with field-measured plot-level biomass estimates in tropical forests is critical to determine confidence in remote sensing estimates of carbon stocks and fluxes, and to develop strategies for reducing the overall uncertainty of remote sensing approaches.

  9. Electrochemical quantification of iodide ions in synthetic urine using silver nanoparticles: a proof-of-concept.

    PubMed

    Toh, Her Shuang; Tschulik, Kristina; Batchelor-McAuley, Christopher; Compton, Richard G

    2014-08-21

Typical urinary iodide concentrations range from 0.3 μM to 6.0 μM. The conventional analytical method is based on the Sandell-Kolthoff reaction. It involves the toxic reagent arsenic acid and a waiting time of 30 minutes for the iodide ions to reduce the cerium(iv) ions. In the present work, an alternative, fast electrochemical method based on a silver nanoparticle modified electrode is proposed. Cyclic voltammetry was performed with a freshly modified electrode in the presence of iodide ions, and the voltammetric peaks corresponding to the oxidation of silver to silver iodide and the reverse reaction were recorded. The peak height of the reduction signal of silver iodide was used to plot a calibration line for the iodide ions. Two calibration plots for the iodide ions were obtained, one in 0.1 M sodium nitrate (a chloride-ion free environment to circumvent any interference from the other halides) and another in synthetic urine (which contains 0.2 M KCl). In both of the calibration plots, linear relationships were found between the reduction peak height and the iodide ion concentration from 0.3 μM to 6.0 μM. A slope of 1.46 × 10(-2) A M(-1) and an R(2) value of 0.999 were obtained for iodide detection in sodium nitrate. For the synthetic urine experiments, a slope of 3.58 × 10(-3) A M(-1) and an R(2) value of 0.942 were measured. A robust iodide sensor with the potential to be developed into a point-of-care system has been validated.
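The calibration line above is a standard least-squares fit of peak height against standard concentration, reported via slope and R(2). A sketch under hypothetical, idealized data (the standards and noise-free response below are illustrative; only the slope value is borrowed from the abstract):

```python
import numpy as np

def calibration_line(conc, peak_height):
    """Fit peak height = slope * [I-] + intercept; return slope,
    intercept, and the coefficient of determination R^2."""
    slope, intercept = np.polyfit(conc, peak_height, 1)
    pred = slope * conc + intercept
    ss_res = float(np.sum((peak_height - pred) ** 2))
    ss_tot = float(np.sum((peak_height - peak_height.mean()) ** 2))
    return slope, intercept, 1.0 - ss_res / ss_tot

# Hypothetical iodide standards (M) spanning the reported 0.3-6.0 uM range
c = np.array([0.3e-6, 1.0e-6, 2.0e-6, 4.0e-6, 6.0e-6])
i_peak = 1.46e-2 * c  # idealized response at the reported slope (A/M)
slope, intercept, r2 = calibration_line(c, i_peak)
```

An unknown sample's concentration is then read off the line as (peak - intercept) / slope.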

  10. Use of the soil and water assessment tool to scale sediment delivery from field to watershed in an agricultural landscape with topographic depressions.

    PubMed

    Almendinger, James E; Murphy, Marylee S; Ulrich, Jason S

    2014-01-01

For two watersheds in the northern Midwest United States, we show that landscape depressions have a significant impact on watershed hydrology and sediment yields and that the Soil and Water Assessment Tool (SWAT) has appropriate features to simulate these depressions. In our SWAT models of the Willow River in Wisconsin and the Sunrise River in Minnesota, we used Pond and Wetland features to capture runoff from about 40% of the area in each watershed. These depressions trapped considerable sediment, yet further reductions in sediment yield were required for calibration and achieved by reducing the Universal Soil Loss Equation (USLE) cropping-practice (P) factor to 0.40 to 0.45. We suggest terminology to describe annual sediment yields at different conceptual spatial scales and show how SWAT output can be partitioned to extract data at each of these scales. These scales range from plot-scale yields calculated with the USLE to watershed-scale yields measured at the outlet. Intermediate scales include field, upland, pre-riverine, and riverine scales, in descending order along the conceptual flow path from plot to outlet. Sediment delivery ratios, when defined as watershed-scale yields as a percentage of plot-scale yields, ranged from 1% for the Willow watershed (717 km2) to 7% for the Sunrise watershed (991 km2). Sediment delivery ratios calculated from published relations based on watershed area alone were about 5 to 6%, closer to pre-riverine-scale yields in our watersheds. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
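The sediment delivery ratio as defined above is a simple percentage of watershed-scale to plot-scale yield; a one-function sketch with hypothetical yields (the numbers are illustrative, not the Willow or Sunrise results):

```python
def sediment_delivery_ratio(watershed_yield, plot_yield):
    """SDR (%) = watershed-scale sediment yield expressed as a
    percentage of plot-scale (USLE) yield, per the abstract's
    definition. Yields must share units, e.g. Mg/ha/yr."""
    return 100.0 * watershed_yield / plot_yield

# Hypothetical yields at the two conceptual scales (Mg/ha/yr)
sdr = sediment_delivery_ratio(0.07, 1.0)  # roughly 7%
```

The same division can be applied between any pair of the intermediate scales (field, upland, pre-riverine, riverine) once SWAT output is partitioned.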

  11. Small scale rainfall simulators: Challenges for a future use in soil erosion research

    NASA Astrophysics Data System (ADS)

    Ries, Johannes B.; Iserloh, Thomas; Seeger, Manuel

    2013-04-01

Rainfall simulation at micro-plot scale is a method used worldwide to assess the generation of overland flow, soil erosion, infiltration and interrelated processes such as soil sealing, crusting, splash and redistribution of solids and solutes. The produced data are of great significance not only for the analysis of the simulated processes, but also as a source of input data for soil erosion modelling. The reliability of the data is therefore of paramount importance, and quality management of the rainfall simulation procedure a general responsibility of the rainfall simulation community. This was an accepted outcome of the "International Rainfall Simulator Workshop 2011" at Trier University. The challenges for the present and near-future use of small-scale rainfall simulations concern the comparability of results and scales, the quality of the data for soil erosion modelling, and further technical developments to overcome physical limitations and constraints. Given the high number of research questions, the different fields of application, and the great technical creativity of researchers, a large number of different types of rainfall simulators are available. But each device produces a different rainfall, leading to different kinetic energy values influencing soil surface and erosion processes. Plot sizes are also variable, as are the experimental simulation procedures. As a consequence, differing runoff and erosion results are produced. The presentation summarises the three important aspects of rainfall simulations, following a processual order:
1. Input factor "rain" and its calibration
2. Surface factor "plot" and its documentation
3. Output factors "runoff" and "sediment concentration"
Finally, general considerations about the limitations and challenges for further developments and applications of rainfall simulation data are presented.

  12. A comparison of quality of present-day heat flow obtained from BHTs, Horner Plots of Malay Basin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Waples, D.W.; Mahadir, R.

    1994-07-01

Reconciling temperature data obtained from measurements of single BHTs, multiple BHTs at a single depth, RFTs, and DSTs is very difficult. Quality of data varied widely; however, DST data were assumed to be the most reliable. Data from 87 wells were used in this study, but only 47 wells had DST data. The BASINMOD program was used to calculate the present-day heat flow, using measured thermal conductivity and calibrated against the DST data. The heat flows obtained from the DST data were assumed to be correct and representative throughout the basin. Then, heat flows using (1) uncorrected RFT data, (2) multiple BHT data corrected by the Horner plot method, and (3) single BHT values corrected upward by a standard 10% were calculated. All three of these heat-flow populations had standard deviations identical to that of the DST data, but with significantly lower mean values. Correction factors were calculated to give each of the three erroneous populations the same mean value as the DST population. Heat flows calculated from RFT data had to be corrected upward by a factor of 1.12 to be equivalent to DST data; Horner plot data by a factor of 1.18, and single BHT data by a factor of 1.2. These results suggest that present-day subsurface temperatures derived from RFT, Horner plot, and BHT data are considerably lower than they should be. The authors suspect qualitatively similar results would be found in other areas. Hence, they recommend significant corrections be routinely made until local calibration factors are established.
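The correction factors described above amount to the ratio of the reference (DST) population mean to each erroneous population mean. A minimal sketch with hypothetical heat-flow values (invented numbers, chosen only so the ratio lands near the paper's 1.18 Horner-plot factor):

```python
import numpy as np

def correction_factor(reference, population):
    """Multiplicative factor that brings a heat-flow population's mean
    up to the mean of the reference (DST-derived) population."""
    return float(np.mean(reference) / np.mean(population))

# Hypothetical heat flows (mW/m^2): DST reference vs Horner-corrected BHTs
dst = np.array([60.0, 65.0, 70.0, 75.0])
horner = np.array([51.0, 55.0, 59.0, 64.0])
f = correction_factor(dst, horner)  # a bit under 1.2 for these values
```

Multiplying every Horner-plot heat flow by `f` equalizes the means while, as the abstract notes, leaving the population's spread essentially unchanged.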

  13. Indirect Field Measurement of Wine-Grape Vineyard Canopy Leaf Area Index

    NASA Technical Reports Server (NTRS)

    Johnson, Lee F.; Pierce, Lars L.; Skiles, J. W. (Technical Monitor)

    2002-01-01

Leaf area index (LAI) indirect measurements were made at 12 study plots in California's Napa Valley commercial wine-grape vineyards with a LI-COR LI-2000 Plant Canopy Analyzer (PCA). The plots encompassed different trellis systems, varieties, and planting densities. LAI ranged from 0.5 - 2.25 sq m leaf area/sq m ground area according to direct (defoliation) measurements. Indirect LAI reported by the PCA was significantly related to direct LAI (r(exp 2) = 0.78, p less than 0.001). However, the PCA tended to underestimate direct LAI by about a factor of two. Narrowing the instrument's conical field of view from 148 deg to 56 deg served to increase readings by approximately 30%. The PCA offers a convenient way to discern relative differences in vineyard canopy density. Calibration by direct measurement (defoliation) is recommended in cases where absolute LAI is desired. Calibration equations provided herein may be inverted to retrieve actual vineyard LAI from PCA readings.
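Inverting a calibration equation, as the last sentence suggests, just means solving the fitted direct-vs-indirect relation for direct LAI given a PCA reading. A sketch assuming a linear calibration with hypothetical coefficients (the paper's actual equations are not reproduced here; the factor-of-two slope only echoes the abstract's reported underestimation):

```python
def invert_lai_calibration(pca_lai, slope, intercept):
    """Retrieve direct (defoliation) LAI from an indirect PCA reading,
    assuming a linear calibration direct = slope * indirect + intercept."""
    return slope * pca_lai + intercept

# Hypothetical calibration: PCA underestimates direct LAI by ~2x
direct = invert_lai_calibration(0.9, slope=2.0, intercept=0.0)
```

With real data, `slope` and `intercept` would come from regressing defoliation LAI on PCA LAI across the calibration plots.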

  14. Rapid, quantitative analysis of ppm/ppb nicotine using surface-enhanced Raman scattering from polymer-encapsulated Ag nanoparticles (gel-colls).

    PubMed

    Bell, Steven E J; Sirimuthu, Narayana M S

    2004-11-01

Rapid, quantitative SERS analysis of nicotine at ppm/ppb levels has been carried out using stable and inexpensive polymer-encapsulated Ag nanoparticles (gel-colls). The strongest nicotine band (1030 cm(-1)) was measured against a d(5)-pyridine internal standard (974 cm(-1)), which was introduced during preparation of the stock gel-colls. Calibration plots of I(nic)/I(pyr) against the concentration of nicotine were non-linear, but plotting I(nic)/I(pyr) against [nicotine](x) (x = 0.6-0.75, depending on the exact experimental conditions) gave linear calibrations over the range 0.1-10 ppm with R(2) typically ca. 0.998. The RMS prediction error was found to be 0.10 ppm when the gel-colls were used for quantitative determination of unknown nicotine samples at the 1-5 ppm level. The main advantages of the method are that the gel-colls constitute a highly stable and reproducible SERS medium that allows high-throughput (50 samples h(-1)) measurements.
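The linearization described above transforms the concentration axis by a power x before an ordinary straight-line fit. A sketch with synthetic, noise-free intensity ratios (the 0.8 and 0.05 coefficients are invented; only the 0.1-10 ppm range and x = 0.7 exponent echo the abstract):

```python
import numpy as np

def power_law_calibration(conc, ratio, x):
    """Fit I_nic/I_pyr = m * conc**x + c after the power transform;
    return (m, c, R^2) of the linearized calibration."""
    t = conc ** x
    m, c = np.polyfit(t, ratio, 1)
    pred = m * t + c
    ss_res = float(np.sum((ratio - pred) ** 2))
    ss_tot = float(np.sum((ratio - ratio.mean()) ** 2))
    return m, c, 1.0 - ss_res / ss_tot

# Hypothetical SERS intensity ratios over the reported 0.1-10 ppm range
conc = np.array([0.1, 0.5, 1.0, 2.0, 5.0, 10.0])
ratio = 0.8 * conc ** 0.7 + 0.05
m, c, r2 = power_law_calibration(conc, ratio, x=0.7)
```

An unknown is then recovered as ((ratio - c) / m) ** (1 / x), inverting both the line and the power transform.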

  15. The Critical Importance of Russell's Diagram

    NASA Astrophysics Data System (ADS)

    Gingerich, O.

    2013-04-01

The idea of dwarf and giant stars, but not the nomenclature, was first established by Ejnar Hertzsprung in 1905; his first diagrams in support appeared in 1911. In 1913 Henry Norris Russell could demonstrate the effect far more strikingly because he had measured the parallaxes of many stars at Cambridge and could plot absolute magnitude against spectral type for many points. The general concept of dwarf and giant stars was essential in the galactic structure work of Harlow Shapley, Russell's first graduate student. In order to calibrate the period-luminosity relation of Cepheid variables, he was obliged to fall back on statistical parallax using only 11 Cepheids, a very sparse sample. Here the insight provided by the Russell diagram became critical. The presence of yellow K giant stars in globular clusters credentialed his calibration of the period-luminosity relation by showing that the calibrated luminosity of the Cepheids was comparable to the luminosity of the K giants. It is well known that in 1920 Shapley did not believe in the cosmological distances of Heber Curtis' spiral nebulae. It is not so well known that in 1920 Curtis' plot of the period-luminosity relation suggests that he did not believe it was a physical relation, and also that he failed to appreciate the significance of the Russell diagram for understanding the large size of the Milky Way.

  16. External validation of a prediction model for surgical site infection after thoracolumbar spine surgery in a Western European cohort.

    PubMed

    Janssen, Daniël M C; van Kuijk, Sander M J; d'Aumerie, Boudewijn B; Willems, Paul C

    2018-05-16

A prediction model for surgical site infection (SSI) after spine surgery was developed in 2014 by Lee et al. This model was developed to compute an individual estimate of the probability of SSI after spine surgery based on the patient's comorbidity profile and the invasiveness of surgery. Before any prediction model can be validly implemented in daily medical practice, it should be externally validated to assess how the prediction model performs in patients sampled independently from the derivation cohort. We included 898 consecutive patients who underwent instrumented thoracolumbar spine surgery. Overall performance was quantified using Nagelkerke's R2 statistic; discriminative ability was quantified as the area under the receiver operating characteristic curve (AUC). We computed the slope of the calibration plot to judge prediction accuracy. Sixty patients developed an SSI. The overall performance of the prediction model in our population was poor: Nagelkerke's R2 was 0.01. The AUC was 0.61 (95% confidence interval (CI) 0.54-0.68). The estimated slope of the calibration plot was 0.52. The previously published prediction model showed poor performance in our academic external validation cohort. To predict SSI after instrumented thoracolumbar spine surgery for the present population, a better fitting prediction model should be developed.
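The AUC reported in validation studies like this one can be computed directly from predicted risks and observed outcomes via the rank-based (Mann-Whitney) formulation, without fitting anything. A sketch with hypothetical risks and outcomes (not the study's data):

```python
import numpy as np

def auc_mann_whitney(y_true, y_score):
    """AUC as the probability that a randomly chosen positive case is
    assigned a higher score than a randomly chosen negative case
    (ties receive half credit)."""
    pos = y_score[y_true == 1]
    neg = y_score[y_true == 0]
    wins = 0.0
    for p in pos:
        wins += np.sum(p > neg) + 0.5 * np.sum(p == neg)
    return wins / (len(pos) * len(neg))

# Hypothetical predicted SSI risks and observed infections (1 = SSI)
y = np.array([0, 0, 0, 1, 0, 1])
p = np.array([0.05, 0.10, 0.20, 0.30, 0.40, 0.80])
auc = auc_mann_whitney(y, p)
```

An AUC of 0.5 means no discrimination; the study's 0.61 sits only modestly above that floor, consistent with the authors' conclusion.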

  17. Batch Conversion of 1-D FITS Spectra to Common Graphical Display Files

    NASA Astrophysics Data System (ADS)

    MacConnell, Darrell J.; Patterson, A. P.; Wing, R. F.; Costa, E.; Jedrzejewski, R. I.

    2008-09-01

    Authors DJM, RFW, and EC have accumulated about 1000 spectra of cool stars from CTIO, ESO, and LCO over the interval 1985 to 1994 and processed them with the standard IRAF tasks into FITS files of normalized intensity vs. wavelength. With the growth of the Web as a means of exchanging and preserving scientific information, we desired to put the spectra into a Web-readable format. We have searched without success sites such as the Goddard FITS Image Viewer page, http://fits.gsfc.nasa.gov/fits_viewer.html, for a program to convert a large number of 1-d stellar spectra from FITS format into common formats such as PDF, PS, or PNG. Author APP has written a Python script to do this using the PyFITS module and plotting routines from Pylab. The program determines the wavelength calibration using header keywords and creates PNG plots with a legend read from a CSV file that may contain the star name, position, spectral type, etc. It could readily be adapted to perform almost any kind of simple batch processing of astronomical data. The program may be obtained from the first author (jack@stsci.edu). Support for DJM from the research program for CSC astronomers at STScI is gratefully acknowledged. The Space Telescope Science Institute is operated by the Association of Universities for Research in Astronomy Inc. under NASA contract NAS 5-26555.

  18. Development and Validation of a Practical Two-Step Prediction Model and Clinical Risk Score for Post-Thrombotic Syndrome.

    PubMed

    Amin, Elham E; van Kuijk, Sander M J; Joore, Manuela A; Prandoni, Paolo; Cate, Hugo Ten; Cate-Hoek, Arina J Ten

    2018-06-04

    Post-thrombotic syndrome (PTS) is a common chronic consequence of deep vein thrombosis that affects quality of life and is associated with substantial costs. In clinical practice, it is not possible to predict the individual patient risk. We developed and validated a practical two-step prediction tool for PTS in the acute and sub-acute phase of deep vein thrombosis. Multivariable regression modelling was performed with data from two prospective cohorts, in which 479 (derivation) and 1,107 (validation) consecutive patients with objectively confirmed deep vein thrombosis of the leg were included, from the thrombosis outpatient clinic of Maastricht University Medical Centre, the Netherlands (derivation) and Padua University hospital in Italy (validation). PTS was defined as a Villalta score of ≥ 5 at least 6 months after acute thrombosis. Variables in the baseline model in the acute phase were: age, body mass index, sex, varicose veins, history of venous thrombosis, smoking status, provoked thrombosis and thrombus location. For the secondary model, the additional variable was residual vein obstruction. Optimism-corrected areas under the receiver operating characteristic curves (AUCs) were 0.71 for the baseline model and 0.60 for the secondary model. Calibration plots showed well-calibrated predictions. External validation of the derived clinical risk scores was successful: AUC, 0.66 (95% confidence interval [CI], 0.63-0.70) and 0.64 (95% CI, 0.60-0.69). Individual risk for PTS in the acute phase of deep vein thrombosis can be predicted based on readily accessible baseline clinical and demographic characteristics. The individual risk in the sub-acute phase can be predicted with limited additional clinical characteristics. Schattauer GmbH Stuttgart.

  19. (Pre-) calibration of a Reduced Complexity Model of the Antarctic Contribution to Sea-level Changes

    NASA Astrophysics Data System (ADS)

    Ruckert, K. L.; Guan, Y.; Shaffer, G.; Forest, C. E.; Keller, K.

    2015-12-01

Understanding and projecting future sea-level changes poses nontrivial challenges. Sea-level changes are driven primarily by changes in the density of seawater as well as changes in the size of glaciers and ice sheets. Previous studies have demonstrated that a key source of uncertainty surrounding sea-level projections is the response of the Antarctic ice sheet to warming temperatures. Here we calibrate a previously published and relatively simple model of the Antarctic ice sheet over a hindcast period from the last interglacial period to the present. We apply and compare a range of (pre-) calibration methods, including a Bayesian approach that accounts for heteroskedasticity. We compare the model hindcasts and projections for different levels of model complexity and calibration methods. We compare the projections with the upper bounds from previous studies and find our projections have a narrower range in 2100. Furthermore, we discuss the implications for the design of climate risk management strategies.

  20. Monitoring irrigation water consumption using high resolution NDVI image time series (Sentinel-2 like). Calibration and validation in the Kairouan plain (Tunisia)

    NASA Astrophysics Data System (ADS)

    Saadi, Sameh; Simonneaux, Vincent; Boulet, Gilles; Mougenot, Bernard; Zribi, Mehrez; Lili Chabaane, Zohra

    2015-04-01

Water scarcity is one of the main factors limiting agricultural development in semi-arid areas. It is thus of major importance to design tools allowing a better management of this resource. Remote sensing has long been used for computing evapotranspiration estimates, an input for crop water balance monitoring. Up to now, only medium and low resolution data (e.g. MODIS) have been available on a regular basis to monitor cultivated areas. However, the increasing availability of high-resolution, high-repetitivity VIS-NIR remote sensing, like the forthcoming Sentinel-2 mission to be launched in 2015, offers an unprecedented opportunity to improve this monitoring. In this study, regional crop water consumption was estimated with the SAMIR software (Satellite Monitoring of Irrigation) using the FAO-56 dual crop coefficient water balance model fed with high-resolution NDVI image time series providing estimates of both the actual basal crop coefficient (Kcb) and the vegetation fraction cover. The model includes a soil water model, requiring knowledge of the soil water holding capacity, maximum rooting depth, and water inputs. As irrigations are usually not known over large areas, they are simulated based on rules reproducing farmer practices. The main objective of this work is to assess the operationality and accuracy of SAMIR at plot and perimeter scales, when several land use types (winter cereals, summer vegetables…), irrigation and agricultural practices are intertwined in a given landscape, including complex canopies such as sparse orchards. Meteorological ground stations were used to compute the reference evapotranspiration and obtain the rainfall depths. Two time series of ten and fourteen high-resolution SPOT5 images were acquired for the 2008-2009 and 2012-2013 hydrological years over an irrigated area in central Tunisia. They span the various successive crop seasons.
The images were radiometrically corrected, first using the SMAC6s algorithm and second using invariant objects located in the scene, based on visual observation of the images. From these time series, a Normalized Difference Vegetation Index (NDVI) profile was generated for each pixel. SAMIR was first calibrated against ground measurements of evapotranspiration made with eddy-correlation devices installed on irrigated wheat and barley plots. After calibration, the model was run to spatialize irrigation over the whole area, and a validation was done using cumulative seasonal water volumes obtained from ground surveys at both plot and perimeter scales. The results show that although determination of model parameters was successful at plot scale, the irrigation rules required an additional calibration, which was achieved at perimeter scale.
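At the core of the FAO-56 dual crop coefficient approach used by SAMIR is the split of crop evapotranspiration into a transpiration term (Ks·Kcb) and a soil evaporation term (Ke), both scaling the reference evapotranspiration. A minimal sketch; the Kcb-NDVI coefficients below are hypothetical placeholders, not SAMIR's calibrated values:

```python
def dual_crop_et(et0, kcb, ke, ks=1.0):
    """FAO-56 dual-coefficient crop evapotranspiration (mm/day):
    ETc = (Ks * Kcb + Ke) * ET0, where Ks is the water-stress factor."""
    return (ks * kcb + ke) * et0

def kcb_from_ndvi(ndvi, a=1.5, b=-0.2):
    """Illustrative linear Kcb-NDVI relation Kcb = a * NDVI + b,
    clipped at zero; a and b are invented, not calibrated values."""
    return max(0.0, a * ndvi + b)

kcb = kcb_from_ndvi(0.6)            # Kcb around 0.7 for this NDVI
etc = dual_crop_et(5.0, kcb, 0.1)   # roughly 4 mm/day
```

In the full model, Ks and Ke are updated daily from the soil water balance, which is where the soil holding capacity, rooting depth, and simulated irrigations enter.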

  1. The determination of ethanol in blood and urine by mass fragmentography

    NASA Technical Reports Server (NTRS)

    Pereira, W. E.; Summons, R. E.; Rindfleisch, T. C.; Duffield, A. M.

    1974-01-01

A mass fragmentographic technique for the rapid, specific and sensitive determination of ethanol in blood and urine is described. A Varian gas chromatograph, coupled through an all-glass membrane separator to a Finnigan quadrupole mass spectrometer and interfaced to a computer system, is used for ethanol determination in blood and urine samples. A procedure for plotting calibration curves for ethanol quantitation is also described. Quantitation is achieved by plotting the peak area ratios of undeuterated-to-deuterated ethanol fragment ions against the amount of ethanol added. Representative results obtained by this technique are included.
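The calibration procedure described above is a classic internal-standard calibration: fit the area ratio against the spiked amount, then invert the line for unknowns. A sketch with invented standards and an idealized linear response (units and coefficients are hypothetical):

```python
import numpy as np

def internal_standard_calibration(amount_added, area_ratio):
    """Fit (ethanol area / deuterated-ethanol area) = m * amount + c
    by least squares; return the line's slope and intercept."""
    m, c = np.polyfit(amount_added, area_ratio, 1)
    return m, c

def quantify(ratio, m, c):
    """Invert the calibration line to recover the ethanol amount."""
    return (ratio - c) / m

# Hypothetical spiked standards (mg/dL) and measured fragment-ion ratios
added = np.array([0.0, 20.0, 40.0, 80.0, 160.0])
ratios = 0.01 * added + 0.02
m, c = internal_standard_calibration(added, ratios)
unknown = quantify(0.82, m, c)  # recovers the spiked-equivalent amount
```

The deuterated internal standard cancels run-to-run variation in injection volume and detector response, which is why the ratio, not the raw peak area, is calibrated.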

  2. Perennial grass and native wildflowers: a synergistic approach to habitat management

    USDA-ARS?s Scientific Manuscript database

    A total of 19 buffer plots were established on University of Georgia experimental farms and lands near Tifton, GA in 2015. The buffer plots were assigned to a 2 x 2 design of local spatial context and irrigation. For local spatial context, ten plots were located adjacent to woodland (“T”) and ten in...

  3. Design and Implementation of High Precision Temperature Measurement Unit

    NASA Astrophysics Data System (ADS)

    Zeng, Xianzhen; Yu, Weiyu; Zhang, Zhijian; Liu, Hancheng

    2018-03-01

Large-scale neutrino detectors require calibration of the photomultiplier tubes (PMTs) and the electronic system in the detector, performed by positioning a calibration source at a set of designated coordinates in the acrylic sphere. The calibration source positioning is based on the principle of ultrasonic ranging, and the transmission speed of ultrasound in the liquid scintillator of the acrylic sphere depends on temperature. This paper presents a temperature measurement unit based on the STM32L031 and the single-wire-bus digital temperature sensor TSic506. The measurement data from the unit help make the ultrasonic ranging more accurate. The test results show that the temperature measurement error is within ±0.1°C, which satisfies the requirement of calibration source positioning. With energy-saving measures and a 3.7 V/50 mAh lithium battery, the temperature measurement unit can work continuously for more than 24 hours.

  4. Exploration of attenuated total reflectance mid-infrared spectroscopy and multivariate calibration to measure immunoglobulin G in human sera.

    PubMed

    Hou, Siyuan; Riley, Christopher B; Mitchell, Cynthia A; Shaw, R Anthony; Bryanton, Janet; Bigsby, Kathryn; McClure, J Trenton

    2015-09-01

Immunoglobulin G (IgG) is crucial for the protection of the host from invasive pathogens. Due to its importance for human health, tools that enable the monitoring of IgG levels are highly desired. Consequently, there is a need for methods to determine the IgG concentration that are simple, rapid, and inexpensive. This work explored the potential of attenuated total reflectance (ATR) infrared spectroscopy as a method to determine IgG concentrations in human serum samples. Venous blood samples were collected from adults and children, and from the umbilical cord of newborns. The serum was harvested and tested using ATR infrared spectroscopy. Partial least squares (PLS) regression provided the basis to develop the new analytical methods. Three PLS calibrations were determined: one for the combined set of the venous and umbilical cord serum samples, the second for only the umbilical cord samples, and the third for only the venous samples. The number of PLS factors was chosen by critical evaluation of Monte Carlo-based cross validation results. The predictive performance of each PLS calibration was evaluated using the Pearson correlation coefficient, scatter plot and Bland-Altman plot, and percent deviations for independent prediction sets. The repeatability was evaluated by standard deviation and relative standard deviation. The results showed that ATR infrared spectroscopy is potentially a simple, quick, and inexpensive method to measure IgG concentrations in human serum samples. The results also showed that it is possible to build a unified calibration curve for the umbilical cord and the venous samples. Copyright © 2015 Elsevier B.V. All rights reserved.
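Of the evaluation tools listed above, the Bland-Altman plot reduces to two numbers plus limits: the mean bias between predicted and reference values and the 95% limits of agreement. A sketch with hypothetical paired measurements (not the study's data):

```python
import numpy as np

def bland_altman(reference, predicted):
    """Bland-Altman summary: mean bias of (predicted - reference) and
    the 95% limits of agreement, bias +/- 1.96 * SD of the differences."""
    diff = predicted - reference
    bias = float(np.mean(diff))
    sd = float(np.std(diff, ddof=1))
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical IgG concentrations (g/L): lab reference vs ATR/PLS predictions
ref = np.array([8.0, 10.0, 12.0, 9.0, 11.0])
pred = np.array([8.3, 9.8, 12.4, 9.1, 11.2])
bias, lo, hi = bland_altman(ref, pred)
```

The plot itself shows each pair's difference against its mean, with horizontal lines at `bias`, `lo`, and `hi`; points outside the limits flag disagreement beyond expected variation.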

  5. Complete suite of geochemical values computed using wireline logs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lancaster, J.R.; Atkinson, A.

    1996-12-31

    Geochemical values of "black shale" source rocks can be computed from a complete suite of wireline log data. The computed values are: Total Organic Carbon (wt%), S1, S2, S3, Hydrogen Index, Oxygen Index, atomic H/C and O/C ratios, Genetic Potential (S1+S2), S2/S3, and Transformation Ratio (S1/(S1+S2)). The results are most reliable when calibrated to laboratory analyses of samples in the study area. However, in the absence of samples, reasonable estimates can be made using calibration data from analogous depositional and thermal environments and/or professional judgement and experience. The evaluations provide answers to critical geochemical questions relative to: (1) Organic Matter Quantity: T.O.C. (wt%), S1, and S2. (2) Kerogen Types I, II, and III, based on a T.O.C. vs S2 cross plot and the van Krevelen diagram of atomic O/C vs atomic H/C ratios. (3) Thermal Maturation levels: the Transformation Ratio can be converted to Level of Organic Metamorphism (LOM), pyrolysis Tmax (degC), Vitrinite Reflectance (Ro), Time Temperature Index (TTI), and others. Various analog plots and cross plots can be prepared for interpretation. Case history examples are shown and discussed. Lowstand fan deposits on Barbados were studied in outcrop to construct a conceptual reservoir model for prediction of facies assemblages.
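The derived quantities listed above follow directly from Rock-Eval-style inputs (TOC in wt%, S1/S2/S3 in mg per g rock); a minimal sketch, with example values that are hypothetical:

```python
# Standard derived indices from the abstract's list; sample values invented.

def hydrogen_index(s2, toc):
    """HI = S2 normalized to TOC (mg HC / g TOC)."""
    return 100.0 * s2 / toc

def oxygen_index(s3, toc):
    """OI = S3 normalized to TOC (mg CO2 / g TOC)."""
    return 100.0 * s3 / toc

def genetic_potential(s1, s2):
    """GP = S1 + S2."""
    return s1 + s2

def transformation_ratio(s1, s2):
    """TR = S1 / (S1 + S2)."""
    return s1 / (s1 + s2)

# Example sample: TOC = 4 wt%, S1 = 1.2, S2 = 18.0, S3 = 1.5
hi = hydrogen_index(18.0, 4.0)
oi = oxygen_index(1.5, 4.0)
gp = genetic_potential(1.2, 18.0)
tr = transformation_ratio(1.2, 18.0)
```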

  7. Elementary Preservice Teachers' Reasoning about Modeling a "Family Factory" with TinkerPlots--A Pilot Study

    ERIC Educational Resources Information Center

    Biehler, Rolf; Frischemeier, Daniel; Podworny, Susanne

    2017-01-01

    Connecting data and chance is fundamental in statistics curricula. The use of software like TinkerPlots can bridge both worlds because the TinkerPlots Sampler supports learners in expressive modeling. We conducted a study with elementary preservice teachers with a basic university education in statistics. They were asked to set up and evaluate…

  8. Calibrating and testing a gap model for simulating forest management in the Oregon Coast Range

    USGS Publications Warehouse

    Pabst, R.J.; Goslin, M.N.; Garman, S.L.; Spies, T.A.

    2008-01-01

    The complex mix of economic and ecological objectives facing today's forest managers necessitates the development of growth models with a capacity for simulating a wide range of forest conditions while producing outputs useful for economic analyses. We calibrated the gap model ZELIG to simulate stand-level forest development in the Oregon Coast Range as part of a landscape-scale assessment of different forest management strategies. Our goal was to incorporate the predictive ability of an empirical model with the flexibility of a forest succession model. We emphasized the development of commercial-aged stands of Douglas-fir, the dominant tree species in the study area and primary source of timber. In addition, we judged that the ecological approach of ZELIG would be robust to the variety of other forest conditions and practices encountered in the Coast Range, including mixed-species stands, small-scale gap formation, innovative silvicultural methods, and reserve areas where forests grow unmanaged for long periods of time. We parameterized the model to distinguish forest development among two ecoregions, three forest types and two site productivity classes using three data sources: chronosequences of forest inventory data, long-term research data, and simulations from an empirical growth-and-yield model. The calibrated model was tested with independent, long-term measurements from 11 Douglas-fir plots (6 unthinned, 5 thinned), 3 spruce-hemlock plots, and 1 red alder plot. ZELIG closely approximated developmental trajectories of basal area and large trees in the Douglas-fir plots. Differences between simulated and observed conifer basal area for these plots ranged from -2.6 to 2.4 m2/ha; differences in the number of trees/ha ≥50 cm dbh ranged from -8.8 to 7.3 tph. Achieving these results required the use of a diameter-growth multiplier, suggesting some underlying constraints on tree growth such as the temperature response function. 
ZELIG also tended to overestimate regeneration of shade-tolerant trees and underestimate total tree density (i.e., higher rates of tree mortality). However, comparisons with the chronosequences of forest inventory data indicated that the simulated data are within the range of variability observed in the Coast Range. Further exploration and improvement of ZELIG is warranted in three key areas: (1) modeling rapid rates of conifer tree growth without the need for a diameter-growth multiplier; (2) understanding and remedying rates of tree mortality that were higher than those observed in the independent data; and (3) improving the tree regeneration module to account for competition with understory vegetation. © 2008 Elsevier B.V.
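A diameter-growth multiplier of the kind mentioned above is simply a scalar correction applied to the model's base diameter increment; a toy sketch with invented increments (not the paper's calibration values):

```python
# Hypothetical illustration of tuning a diameter-growth multiplier so that
# mean modeled increment matches mean observed increment.

def adjusted_increment(base_increment_cm, multiplier):
    """Gap-model diameter increment after applying a scalar multiplier."""
    return multiplier * base_increment_cm

observed = [0.55, 0.62, 0.48]   # cm/yr, invented field increments
modeled = [0.45, 0.50, 0.41]    # cm/yr, invented unadjusted model output

# Multiplier that matches the total (hence mean) observed increment
m = sum(observed) / sum(modeled)
tuned = [adjusted_increment(x, m) for x in modeled]
```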

  9. Near infrared spectroscopy to estimate the temperature reached on burned soils: strategies to develop robust models.

    NASA Astrophysics Data System (ADS)

    Guerrero, César; Pedrosa, Elisabete T.; Pérez-Bejarano, Andrea; Keizer, Jan Jacob

    2014-05-01

    The temperature reached on soils is an important parameter for describing wildfire effects. However, methods for measuring the temperature reached on burned soils have been poorly developed. Recently, near-infrared (NIR) spectroscopy has been proposed as a valuable tool for this purpose. The NIR spectrum of a soil sample contains information on the organic matter (quantity and quality), clay (quantity and quality), minerals (such as carbonates and iron oxides), and water content. Some of these components are modified by heat, and each temperature causes a distinct set of changes, leaving a characteristic fingerprint on the NIR spectrum. This technique requires a model (or calibration) relating the changes in the NIR spectra to the temperature reached. To develop the model, several aliquots are heated at known temperatures and used as standards in the calibration set. The model then makes it possible to estimate the temperature reached on a burned sample from its NIR spectrum. However, the estimation of the temperature reached using NIR spectroscopy is due to changes in several components and cannot be attributed to changes in a single soil component. Thus, we estimate the temperature reached through the interaction between temperature and the thermo-sensitive soil components. In addition, we cannot expect a uniform distribution of these components, even at small scales. Consequently, the proportions of these soil components can vary spatially across a site. This variation will be present both in the samples used to construct the model and in the samples affected by the wildfire. Therefore, strategies for developing robust models should focus on managing this expected variation. In this work we compared the prediction accuracy of models constructed with different approaches. 
These approaches were designed to provide insight into how to distribute the effort needed to develop robust models, since this step is the bottleneck of the technique. In the first approach, a plot-scale model was used to predict the temperature reached in samples collected in other plots from the same site. In a plot-scale model, all the heated aliquots come from a single plot-scale sample. As expected, the results obtained with this approach were disappointing, because it assumes that a plot-scale model is enough to represent the whole variability of the site. The accuracy (measured as the root mean square error of prediction, hereinafter RMSEP) was 86°C, and the bias was also high (>30°C). In the second approach, the temperatures predicted by several plot-scale models were averaged. The accuracy improved (RMSEP = 65°C) relative to the first approach, because the variability from several plots was considered and biased predictions were partially counterbalanced. However, this approach implies more effort, since several plot-scale models are needed. In the third approach, the predictions were obtained with site-scale models, constructed with aliquots from several plots. In this case the results were accurate: the RMSEP was around 40°C, the bias was very small (<1°C), and the R2 was 0.92. As expected, this approach clearly outperformed the second one, even though the same effort was needed. In a plot-scale model, only one interaction between temperature and soil components is modelled, whereas several different interactions between temperature and soil components are present in the calibration matrix of a site-scale model. Consequently, the site-scale models were able to model the temperature reached while excluding the influence of differences in soil composition, making them more robust to that variation. 
In summary, the results highlight the importance of an adequate strategy for developing robust and accurate models with moderate effort, and how a wrong strategy can result in misleading predictions.
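The two accuracy measures used throughout (RMSEP and bias) are straightforward to compute; a minimal sketch with invented temperatures (the study's figures were roughly 86, 65 and 40°C):

```python
import math

def rmsep(predicted, observed):
    """Root mean square error of prediction."""
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(predicted, observed)) / len(observed))

def bias(predicted, observed):
    """Mean signed prediction error."""
    return sum(p - o for p, o in zip(predicted, observed)) / len(observed)

observed = [200.0, 300.0, 400.0, 500.0]    # temperatures reached, degC (invented)
predicted = [230.0, 340.0, 420.0, 540.0]   # hypothetical model output

rmsep_val = rmsep(predicted, observed)
bias_val = bias(predicted, observed)
```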

  10. Comparison of modelled and empirical atmospheric propagation data

    NASA Technical Reports Server (NTRS)

    Schott, J. R.; Biegel, J. D.

    1983-01-01

    The radiometric integrity of TM thermal infrared channel data was evaluated and monitored to develop improved radiometric preprocessing calibration techniques for removal of atmospheric effects. Modelled atmospheric transmittance and path radiance were compared with empirical values derived from aircraft underflight data. Aircraft thermal infrared imagery and calibration data were available on two dates, as were corresponding atmospheric radiosonde data. The radiosonde data were used as input to the LOWTRAN 5A code, which was modified to output atmospheric path radiance in addition to transmittance. The aircraft data were calibrated and used to generate analogous measurements. These data indicate that there is a tendency for the LOWTRAN model to underestimate atmospheric path radiance and transmittance as compared to empirical data. A plot of transmittance versus altitude for both LOWTRAN and empirical data is presented.

  11. A Guided Inquiry on Hubble Plots and the Big Bang

    NASA Astrophysics Data System (ADS)

    Forringer, Ted

    2014-04-01

    In our science for non-science majors course "21st Century Physics," we investigate modern "Hubble plots" (plots of velocity versus distance for deep space objects) in order to discuss the Big Bang, dark matter, and dark energy. There are two potential challenges that our students face when encountering these topics for the first time. The first challenge is in understanding and interpreting Hubble plots. The second is that some of our students have religious or cultural objections to the concept of a "Big Bang" or a universe that is billions of years old. This paper presents a guided inquiry exercise that was created with the goal of introducing students to Hubble plots and giving them the opportunity to discover for themselves why we believe our universe started with an explosion billions of years ago. The exercise is designed to be completed before the topics are discussed in the classroom. We did the exercise during a one hour and 45 minute "lab" time and it was done in groups of three or four students, but it would also work as an individual take-home assignment.
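The core of such an exercise is fitting the Hubble relation v = H0·d through the origin; a toy sketch with invented data (a real class would use measured distances and recession velocities of deep-space objects):

```python
# Least-squares fit of v = H0 * d through the origin, on invented data chosen
# to lie exactly on a 70 km/s/Mpc line.

distances = [50.0, 100.0, 150.0, 200.0]          # Mpc
velocities = [3500.0, 7000.0, 10500.0, 14000.0]  # km/s

# Slope for a line through the origin: H0 = sum(v*d) / sum(d*d)
h0 = sum(v * d for v, d in zip(velocities, distances)) / sum(d * d for d in distances)

# Rough age of the universe: 1/H0, converting Mpc/(km/s) to years
km_per_mpc = 3.086e19
seconds_per_year = 3.156e7
age_years = km_per_mpc / h0 / seconds_per_year
```

The reciprocal of the slope, converted to years, gives the order-of-magnitude age (about 14 billion years here) that motivates the Big Bang discussion.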

  12. 40 CFR 92.119 - Hydrocarbon analyzer calibration.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... plot of the difference between the span and zero response versus fuel flow will be similar to the one... basic operating adjustment using the appropriate fuel (see § 92.112) and zero-grade air. (2) Optimize on.... Allow at least one-half hour after the oven has reached temperature for the system to equilibrate. (C...

  13. 40 CFR 86.331-79 - Hydrocarbon analyzer calibration.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... plot of the difference between the span and zero response versus fuel flow will be similar to the one... least one-half hour after the oven has reached temperature for the system to equilibrate. (c) Initial... difference between the span-gas response and the zero-gas response. Incrementally adjust the fuel flow above...

  14. Compositional variability of the Martian surface

    NASA Technical Reports Server (NTRS)

    Adams, John B.; Smith, Milton O.

    1991-01-01

    Spectral reflectance data from Viking Landers and Orbiters and from telescopic observations were analyzed with the objective of isolating compositional information about the Martian surface and assessing compositional variability. Two approaches were used to calibrate the data to reflectance to permit direct comparisons with laboratory reference spectra of well characterized materials. In Viking Lander multispectral images (six spectral bands) most of the spectral variation is caused by changes in lighting geometry within individual scenes, from scene to scene, and over time. Lighting variations are both wavelength independent and wavelength dependent. By calibrating lander image radiance values to reflectance using spectral mixture analysis, the possible range of compositions was assessed with reference to a collection of laboratory samples, also resampled to the lander spectral bands. All spectra from the lander images studied plot (in six-space) within a planar triangle having at the apexes the respective spectra of tan basaltic palagonite, gray basalt, and shale. Within this plane all lander spectra fit as mixtures of these three endmembers. Reference spectra that plot outside of the triangle are unable to account for the spectral variation observed in the images.
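Spectral mixture analysis of the kind described can be sketched as a least-squares unmixing problem over the six lander bands; the endmember spectra below are invented placeholders, not the paper's laboratory spectra of palagonite, basalt, and shale:

```python
import numpy as np

# Hypothetical six-band reflectance spectra for three endmembers (columns).
endmembers = np.array([
    [0.30, 0.35, 0.40, 0.45, 0.48, 0.50],   # "tan palagonite" (invented)
    [0.08, 0.09, 0.10, 0.11, 0.12, 0.12],   # "gray basalt" (invented)
    [0.20, 0.22, 0.25, 0.27, 0.30, 0.33],   # "shale" (invented)
]).T                                         # shape: (6 bands, 3 endmembers)

# A noiseless mixed pixel built from known fractions.
true_fractions = np.array([0.6, 0.3, 0.1])
pixel = endmembers @ true_fractions

# Recover the fractions by ordinary least squares.
fractions, *_ = np.linalg.lstsq(endmembers, pixel, rcond=None)
```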

  15. Calibration of a subcutaneous amperometric glucose sensor implanted for 7 days in diabetic patients. Part 2. Superiority of the one-point calibration method.

    PubMed

    Choleau, C; Klein, J C; Reach, G; Aussedat, B; Demaria-Pesce, V; Wilson, G S; Gifford, R; Ward, W K

    2002-08-01

    Calibration, i.e. the transformation in real time of the signal I(t) generated by the glucose sensor at time t into an estimation of glucose concentration G(t), represents a key issue for the development of a continuous glucose monitoring system. To compare two calibration procedures. In the one-point calibration, which assumes that I(o) is negligible, S is simply determined as the ratio I/G, and G(t) = I(t)/S. The two-point calibration consists in the determination of a sensor sensitivity S and of a background current I(o) by plotting two values of the sensor signal versus the concomitant blood glucose concentrations. The subsequent estimation of G(t) is given by G(t) = (I(t)-I(o))/S. A glucose sensor was implanted in the abdominal subcutaneous tissue of nine type 1 diabetic patients during 3 (n = 2) and 7 days (n = 7). The one-point calibration was performed a posteriori either once per day before breakfast, or twice per day before breakfast and dinner, or three times per day before each meal. The two-point calibration was performed each morning during breakfast. The percentages of points present in zones A and B of the Clarke Error Grid were significantly higher when the system was calibrated using the one-point calibration. Use of two one-point calibrations per day before meals was virtually as accurate as three one-point calibrations. This study demonstrates the feasibility of a simple method for calibrating a continuous glucose monitoring system.
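The two calibration rules compared above can be written out directly from the formulas in the abstract; the units and numbers below are illustrative only:

```python
def one_point(i_t, i_cal, g_cal):
    """One-point calibration: assumes I0 is negligible, S = I_cal / G_cal,
    then G(t) = I(t) / S."""
    s = i_cal / g_cal
    return i_t / s

def two_point(i_t, i1, g1, i2, g2):
    """Two-point calibration: S and I0 from two (I, G) pairs,
    then G(t) = (I(t) - I0) / S."""
    s = (i2 - i1) / (g2 - g1)
    i0 = i1 - s * g1
    return (i_t - i0) / s

# Invented sensor with sensitivity 0.1 nA per mg/dL and background 2 nA:
# pairs (12 nA, 100 mg/dL) and (22 nA, 200 mg/dL); a later reading of 17 nA
# corresponds to 150 mg/dL under the two-point rule.
g_two = two_point(17.0, 12.0, 100.0, 22.0, 200.0)
g_one = one_point(15.0, 10.0, 100.0)   # zero-background sensor example
```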

  16. External validation of prognostic models to predict risk of gestational diabetes mellitus in one Dutch cohort: prospective multicentre cohort study.

    PubMed

    Lamain-de Ruiter, Marije; Kwee, Anneke; Naaktgeboren, Christiana A; de Groot, Inge; Evers, Inge M; Groenendaal, Floris; Hering, Yolanda R; Huisjes, Anjoke J M; Kirpestein, Cornel; Monincx, Wilma M; Siljee, Jacqueline E; Van 't Zelfde, Annewil; van Oirschot, Charlotte M; Vankan-Buitelaar, Simone A; Vonk, Mariska A A W; Wiegers, Therese A; Zwart, Joost J; Franx, Arie; Moons, Karel G M; Koster, Maria P H

    2016-08-30

     Objective To perform an external validation and direct comparison of published prognostic models for early prediction of the risk of gestational diabetes mellitus, including predictors applicable in the first trimester of pregnancy. Design External validation of all published prognostic models in a large scale, prospective, multicentre cohort study. Setting 31 independent midwifery practices and six hospitals in the Netherlands. Participants Women recruited in their first trimester (<14 weeks) of pregnancy between December 2012 and January 2014, at their initial prenatal visit. Women with pre-existing diabetes mellitus of any type were excluded. Main outcome measures Discrimination of the prognostic models was assessed by the C statistic, and calibration assessed by calibration plots. Results 3723 women were included for analysis, of whom 181 (4.9%) developed gestational diabetes mellitus in pregnancy. 12 prognostic models for the disorder could be validated in the cohort. C statistics ranged from 0.67 to 0.78. Calibration plots showed that eight of the 12 models were well calibrated. The four models with the highest C statistics included almost all of the following predictors: maternal age, maternal body mass index, history of gestational diabetes mellitus, ethnicity, and family history of diabetes. Prognostic models had a similar performance in a subgroup of nulliparous women only. Decision curve analysis showed that the use of these four models always had a positive net benefit. Conclusions In this external validation study, most of the published prognostic models for gestational diabetes mellitus show acceptable discrimination and calibration. The four models with the highest discriminative abilities in this study cohort, which also perform well in a subgroup of nulliparous women, are easy models to apply in clinical practice and therefore deserve further evaluation regarding their clinical impact. Published by the BMJ Publishing Group Limited. 
For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
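The C statistic reported above is, for a binary outcome, the probability that a randomly chosen case receives a higher predicted risk than a randomly chosen non-case; a minimal sketch computed from that definition on invented risks:

```python
def c_statistic(risks, outcomes):
    """Concordance (C) statistic for binary outcomes: fraction of case/non-case
    pairs in which the case has the higher predicted risk (ties count 1/2)."""
    cases = [r for r, y in zip(risks, outcomes) if y == 1]
    controls = [r for r, y in zip(risks, outcomes) if y == 0]
    wins = sum((c > d) + 0.5 * (c == d) for c in cases for d in controls)
    return wins / (len(cases) * len(controls))

# Invented predicted risks and observed outcomes (1 = developed the disorder).
risks = [0.05, 0.10, 0.20, 0.40, 0.70, 0.90]
outcomes = [0, 0, 0, 1, 0, 1]
auc = c_statistic(risks, outcomes)
```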

  17. Demonstration of an Integrated Pest Management Program for Wheat in Tajikistan.

    PubMed

    Landis, Douglas A; Saidov, Nurali; Jaliov, Anvar; El Bouhssini, Mustapha; Kennelly, Megan; Bahlai, Christie; Landis, Joy N; Maredia, Karim

    2016-01-01

    Wheat is an important food security crop in central Asia but frequently suffers severe damage and yield losses from insect pests, pathogens, and weeds. With funding from the United States Agency for International Development, a team of scientists from three U.S. land-grant universities in collaboration with the International Center for Agricultural Research in Dry Areas and local institutions implemented an integrated pest management (IPM) demonstration program in three regions of Tajikistan from 2011 to 2014. An IPM package was developed and demonstrated in farmer fields using a combination of crop and pest management techniques including cultural practices, host plant resistance, biological control, and chemical approaches. The results from four years of demonstration/research indicated that the IPM package plots almost universally had lower pest abundance and damage and higher yields and were more profitable than the farmer practice plots. Wheat stripe rust infestation ranged from 30% to over 80% in farmer practice plots, while generally remaining below 10% in the IPM package plots. Overall yield varied among sites and years but was always at least 30% to as much as 69% greater in IPM package plots. More than 1,500 local farmers (40% women) were trained through farmer field schools and field days held at the IPM demonstration sites. In addition, students from local agricultural universities participated in on-site data collection. The IPM information generated by the project was widely disseminated to stakeholders through peer-reviewed scientific publications, bulletins and pamphlets in local languages, and via Tajik national television.

  18. Retrodirective Radar Calibration Nanosatellite

    DTIC Science & Technology

    2013-07-01

    Larry K. Martin (Student Program Manager) and Nicholas G. Fisher (Student Systems Engineer), University of Hawaii, July 2013. Final report: Cost-Effective, Rapid Design of a Student-Built Radar Calibration Nanosatellite, by Larry K. Martin, Nicholas G. Fisher, Toy Lim, John... (University of Hawaii), presented at the AIAA Reinventing Space Conference 2012, paper AIAA-RS-2012-3001.

  19. Towards a New Generation of Time-Series Visualization Tools in the ESA Heliophysics Science Archives

    NASA Astrophysics Data System (ADS)

    Perez, H.; Martinez, B.; Cook, J. P.; Herment, D.; Fernandez, M.; De Teodoro, P.; Arnaud, M.; Middleton, H. R.; Osuna, P.; Arviset, C.

    2017-12-01

    During the last decades, a varied set of Heliophysics missions has allowed the scientific community to gain better knowledge of the solar atmosphere and activity. The remote sensing images of missions such as SOHO have paved the ground for helio-based spatial data visualization software such as JHelioviewer/Helioviewer. On the other hand, the huge amount of in-situ measurements provided by other missions such as Cluster offers a wide base for plot visualization software whose reach is still far from being fully exploited. The Heliophysics science archives within the ESAC Science Data Center (ESDC) already provide a first generation of tools for time-series visualization focusing on each mission's needs: visualization of quicklook plots, cross-calibration time series, pre-generated/on-demand multi-plot stacks (Cluster), basic plot zoom in/out options (Ulysses), and easy navigation through the plots in time (Ulysses, Cluster, ISS-Solaces). However, the needs evolve, and scientists involved in new missions require plotting of multi-variable data, interactive synchronization of heat-map stacks, and axis variable selection, among other improvements. The new Heliophysics archives (such as Solar Orbiter) and the evolution of existing ones (Cluster) intend to address these new challenges. This paper provides an overview of the different approaches to visualizing time series followed within the ESA Heliophysics archives and their foreseen evolution.

  20. A Progress Report on X-Ray Diffraction Measurements on New Low-Thermal Conductivity Thermoelectric Materials

    DTIC Science & Technology

    1999-04-01

    ...as the only moving parts and no environmentally unfriendly gases. Thermoelectric generators can also improve fuel efficiency by using the heat lost... [Figure 1. Calibration plot for SRM1976, plotted against 2-Theta (deg).]

  1. A Low Cost Weather Balloon Borne Solar Cell Calibration Payload

    NASA Technical Reports Server (NTRS)

    Snyder, David B.; Wolford, David S.

    2012-01-01

    Calibration of standard sets of solar cell sub-cells is an important step in laboratory verification of the on-orbit performance of new solar cell technologies. This paper looks at the potential capabilities of a lightweight weather balloon payload for solar cell calibration. A 1500 g latex weather balloon can lift a 2.7 kg payload to over 100,000 ft altitude, above 99% of the atmosphere. Data taken between atmospheric pressures of about 30 to 15 mbar may be extrapolated via the Langley plot method to 0 mbar, i.e., AM0. This extrapolation can, in principle, have better than 0.1% error. The launch costs of such a payload are significantly less than those of the much larger, higher altitude balloons or the manned flight facility. The low cost enables a risk-tolerant approach to payload development. Demonstration of 1% standard deviation flight-to-flight variation is the goal of this project. This paper describes the initial concept of the solar cell calibration payload and reports initial test flight results.
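The Langley plot method mentioned above regresses the logarithm of the cell signal against pressure (a proxy for airmass on the ascent) and reads the intercept at 0 mbar as the AM0 value; a sketch on synthetic, noiseless data with an invented attenuation coefficient:

```python
import math

# Synthetic ascent data: I = I0 * exp(-k * p), with I0 = 150 mA (invented).
pressures = [30.0, 25.0, 20.0, 15.0]     # mbar
i0_true, k = 150.0, 0.002
currents = [i0_true * math.exp(-k * p) for p in pressures]

# Ordinary least squares on (pressure, ln I); intercept at p = 0 gives ln(I_AM0).
n = len(pressures)
x_mean = sum(pressures) / n
y = [math.log(i) for i in currents]
y_mean = sum(y) / n
slope = (sum((p - x_mean) * (v - y_mean) for p, v in zip(pressures, y))
         / sum((p - x_mean) ** 2 for p in pressures))
intercept = y_mean - slope * x_mean
i_am0 = math.exp(intercept)              # extrapolated AM0 short-circuit current
```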

  2. Atmospheric Correction and Vicarious Calibration of Oceansat-1 Ocean Color Monitor (OCM) Data in Coastal Case 2 Waters

    DTIC Science & Technology

    2012-06-08

    Earth Scan Laboratory, Louisiana State University. Raw OCM data were calibrated by converting raw counts to radiance values for the eight OCM spectral bands using the SeaSpace TeraScan software. Aerosol path radiance (La(λi)) is the contribution of scattering by particles similar in size to or larger than the wavelength of light, such as dust and pollen...

  3. Carnage interrupted : an analysis of fifteen terrorist plots against public surface transportation.

    DOT National Transportation Integrated Search

    2012-04-01

    This report examines 13 terrorist plots against public surface transportation that were uncovered and foiled by authorities between 1997 and 2010 and two failed attempts to carry out attacks. Certainly, this is not the total universe of foiled or fai...

  4. Data documentation for the bare soil experiment at the University of Arkansas

    NASA Technical Reports Server (NTRS)

    Waite, W. P.; Scott, H. D. (Principal Investigator); Hancock, G. D.

    1980-01-01

    The reflectivities of several controlled-moisture test plots were investigated. The test plots were of similar soil texture (clay loam) and were prepared to give a desired initial soil moisture and density profile. Measurements were conducted on the plots as the soil water redistributed, for both long-term and diurnal cycles. These measurements included reflectivity, gravimetric and volumetric soil moisture, soil moisture potential, and soil temperature.

  5. A Universal Graph Plotting Routine.

    ERIC Educational Resources Information Center

    Bogart, Theodore F., Jr.

    1984-01-01

    Presents a programming subroutine which will create a graphical plot that occupies any number of columns specified by the user and will run with versions of the BASIC programming language. Illustrations of the subroutine's ability to operate successfully for three possibilities (negative values, positive values, and both positive and negative values) are…

  6. Requirements for Calibration in Noninvasive Glucose Monitoring by Raman Spectroscopy

    PubMed Central

    Lipson, Jan; Bernhardt, Jeff; Block, Ueyn; Freeman, William R.; Hofmeister, Rudy; Hristakeva, Maya; Lenosky, Thomas; McNamara, Robert; Petrasek, Danny; Veltkamp, David; Waydo, Stephen

    2009-01-01

    Background In the development of noninvasive glucose monitoring technology, it is highly desirable to derive a calibration that relies on neither person-dependent calibration information nor supplementary calibration points furnished by an existing invasive measurement technique (universal calibration). Method By appropriate experimental design and associated analytical methods, we establish the sufficiency of multiple factors required to permit such a calibration. Factors considered are the discrimination of the measurement technique, stabilization of the experimental apparatus, physics–physiology-based measurement techniques for normalization, the sufficiency of the size of the data set, and appropriate exit criteria to establish the predictive value of the algorithm. Results For noninvasive glucose measurements, using Raman spectroscopy, the sufficiency of the scale of data was demonstrated by adding new data into an existing calibration algorithm and requiring that (a) the prediction error should be preserved or improved without significant re-optimization, (b) the complexity of the model for optimum estimation not rise with the addition of subjects, and (c) the estimation for persons whose data were removed entirely from the training set should be no worse than the estimates on the remainder of the population. Using these criteria, we established guidelines empirically for the number of subjects (30) and skin sites (387) for a preliminary universal calibration. We obtained a median absolute relative difference for our entire data set of 30 mg/dl, with 92% of the data in the Clarke A and B ranges. Conclusions Because Raman spectroscopy has high discrimination for glucose, a data set of practical dimensions appears to be sufficient for universal calibration. 
Improvements based on reducing the variance of blood perfusion are expected to reduce the prediction errors substantially, and the inclusion of supplementary calibration points for the wearable device under development will be permissible and beneficial. PMID:20144354
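The headline accuracy figure above is a median absolute relative difference; a minimal sketch of that statistic on invented paired readings (Clarke error-grid analysis is more involved and omitted here):

```python
def median(values):
    """Median of a list of numbers."""
    s = sorted(values)
    mid = len(s) // 2
    return s[mid] if len(s) % 2 else 0.5 * (s[mid - 1] + s[mid])

reference = [80.0, 100.0, 120.0, 160.0, 200.0]   # blood glucose, mg/dL (invented)
estimated = [90.0, 95.0, 132.0, 150.0, 230.0]    # noninvasive estimates (invented)

abs_rel_diff = [abs(e - r) / r for e, r in zip(estimated, reference)]
mard = median(abs_rel_diff)                      # median absolute relative difference
```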

  7. HPTLC and Spectrophotometric Estimation of Febuxostat and Diclofenac Potassium in Their Combined Tablets.

    PubMed

    El-Yazbi, Fawzi A; Amin, Omayma A; El-Kimary, Eman I; Khamis, Essam F; Younis, Sameh E

    2016-08-01

    An accurate, precise, rapid, specific and economic high-performance thin-layer chromatographic (HPTLC) method has been developed for the simultaneous quantitative determination of febuxostat (FEB) and diclofenac potassium (DIC). The chromatographic separation was performed on precoated silica gel 60 GF254 plates with chloroform-methanol 7:3 (v/v) as the mobile phase. The developed plates were scanned and quantified at 289 nm. Experimental conditions including band size, mobile phase composition and chamber-saturation time were critically studied, and the optimum conditions were selected. A satisfactory resolution (Rs = 2.67) with RF 0.48 and 0.69 and high sensitivity with limits of detection of 4 and 7 ng/band for FEB and DIC, respectively, were obtained. In addition, derivative ratio and ratio difference spectrophotometric methods were established for the analysis of such a mixture. All methods were validated as per the ICH guidelines. In the HPTLC method, the calibration plots were linear between 0.01-0.55 and 0.02-0.60 µg/band, for FEB and DIC, respectively. For the spectrophotometric methods, the calibration graphs were linear between 2-14 and 4-18 µg/mL for FEB and DIC, respectively. The simplicity and specificity of the proposed methods suggest their application in quality control analysis of FEB and DIC in their raw materials and tablets. A comparison of the proposed methods with the existing methods is presented. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
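The linear calibration plots described above amount to a least-squares line of signal versus amount, from which an ICH-style detection limit (3.3·σ/slope) can be estimated; all numbers below are invented, not the paper's data:

```python
import math

# Hypothetical HPTLC-style calibration points: amount per band vs. peak area.
amounts = [0.1, 0.2, 0.3, 0.4, 0.5]              # ug/band (invented)
areas = [120.0, 238.0, 362.0, 478.0, 601.0]      # peak areas (invented)

n = len(amounts)
x_mean = sum(amounts) / n
y_mean = sum(areas) / n
sxx = sum((x - x_mean) ** 2 for x in amounts)
sxy = sum((x - x_mean) * (y - y_mean) for x, y in zip(amounts, areas))
slope = sxy / sxx
intercept = y_mean - slope * x_mean

# Residual standard deviation and ICH-style limit of detection.
residuals = [y - (intercept + slope * x) for x, y in zip(amounts, areas)]
s_res = math.sqrt(sum(r * r for r in residuals) / (n - 2))
lod = 3.3 * s_res / slope                        # ug/band
```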

  8. Updating Indiana Annual Forest Inventory and Analysis Plot Data Using Eastern Broadleaf Forest Diameter Growth Models

    Treesearch

    Veronica C. Lessard

    2001-01-01

    The Forest Inventory and Analysis (FIA) program of the North Central Research Station (NCRS), USDA Forest Service, has developed nonlinear, individual-tree, distance-independent annual diameter growth models. The models are calibrated for species groups and formulated as the product of an average diameter growth component and a modifier component. The regional models...

  9. Determination of thiopental in urine sample with high-performance liquid chromatography using iodine-azide reaction as a postcolumn detection system.

    PubMed

    Zakrzewski, Robert; Ciesielski, Witold

    2005-09-25

    The reaction between iodine and azide ions induced by thiopental was utilized as a postcolumn reaction for the chromatographic determination of thiopental. The method is based on the separation of thiopental on a Nova-Pak CN HP column with an acetonitrile-aqueous sodium azide solution as the mobile phase, followed by spectrophotometric measurement of the residual iodine (λ = 350 nm) from the postcolumn iodine-azide reaction, which is induced by thiopental when an iodine solution containing iodide ions is mixed with the column effluent containing azide ions and thiopental. Chromatograms obtained for thiopental showed negative peaks as a result of the decrease in background absorbance. The detection limit (defined as S/N=3) was 20 nM (0.4 pmol injected) for thiopental. Calibration graphs, plotted as peak area versus concentration, were linear from 40 nM. The method was applied to the determination of thiopental in urine samples, for which the detection limit (defined as S/N=3) was 0.025 nmol/mL urine and the calibration graphs were linear from 0.05 nmol/mL urine. Authentic urine samples were analyzed, and thiopental was determined at the nmol/mL urine level.
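
    As a sketch of the calibration and detection-limit arithmetic described above (all numbers are invented for illustration, not taken from the paper), a linear peak-area calibration and an S/N = 3 detection limit can be computed as:

```python
import numpy as np

# Hypothetical peak-area readings (arbitrary units) at known thiopental
# concentrations (nM); values are illustrative, not from the paper.
conc = np.array([40, 80, 160, 320, 640], dtype=float)      # nM
area = np.array([105, 208, 415, 828, 1660], dtype=float)   # peak area

# Calibration graph: peak area versus concentration, fitted by least squares.
slope, intercept = np.polyfit(conc, area, 1)

# Detection limit defined as S/N = 3: the concentration whose signal equals
# three times the baseline noise (noise level assumed here).
baseline_noise = 17.0          # same units as peak area (assumed)
lod = 3 * baseline_noise / slope

print(f"slope = {slope:.3f} area/nM, LOD ≈ {lod:.1f} nM")
```

With these invented numbers the detection limit comes out near the 20 nM figure quoted above, but only because the noise level was chosen to make it so.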

  10. Demonstration of an Integrated Pest Management Program for Wheat in Tajikistan

    PubMed Central

    Landis, Douglas A.; Saidov, Nurali; Jaliov, Anvar; El Bouhssini, Mustapha; Kennelly, Megan; Bahlai, Christie; Landis, Joy N.; Maredia, Karim

    2016-01-01

    Wheat is an important food security crop in central Asia but frequently suffers severe damage and yield losses from insect pests, pathogens, and weeds. With funding from the United States Agency for International Development, a team of scientists from three U.S. land-grant universities in collaboration with the International Center for Agricultural Research in Dry Areas and local institutions implemented an integrated pest management (IPM) demonstration program in three regions of Tajikistan from 2011 to 2014. An IPM package was developed and demonstrated in farmer fields using a combination of crop and pest management techniques including cultural practices, host plant resistance, biological control, and chemical approaches. The results from four years of demonstration/research indicated that the IPM package plots almost universally had lower pest abundance and damage and higher yields and were more profitable than the farmer practice plots. Wheat stripe rust infestation ranged from 30% to over 80% in farmer practice plots, while generally remaining below 10% in the IPM package plots. Overall yield varied among sites and years but was always at least 30% to as much as 69% greater in IPM package plots. More than 1,500 local farmers—40% women—were trained through farmer field schools and field days held at the IPM demonstration sites. In addition, students from local agricultural universities participated in on-site data collection. The IPM information generated by the project was widely disseminated to stakeholders through peer-reviewed scientific publications, bulletins and pamphlets in local languages, and via Tajik national television. PMID:28446990

  11. 1998 Calibration of the Mach 4.7 and Mach 6 Arc-Heated Scramjet Test Facility Nozzles

    NASA Technical Reports Server (NTRS)

    Witte, David W.; Irby, Richard G.; Auslender, Aaron H.; Rock, Kenneth E.

    2004-01-01

    A calibration of the Arc-Heated Scramjet Test Facility (AHSTF) Mach 4.7 and Mach 6 nozzles was performed in 1998. For each nozzle, three different typical facility operating test points were selected for calibration. Each survey consisted of measurements, at 340 separate locations across the 11 inch square nozzle exit plane, of pitot pressure, static pressure, and total temperature. Measurement density was higher (4/inch) in the boundary layer near the nozzle wall than in the core nozzle flow (1/inch). The results generated for each of these calibration surveys were contour plots at the nozzle exit plane of the measured and calculated flow properties which completely defined the thermodynamic state of the nozzle exit flow. An area integration of the mass flux at the nozzle exit for each survey was compared to the AHSTF mass flow meter results to provide an indication of the overall quality of the calibration performed. The percent difference between the integrated nozzle exit mass flow and the flow meter ranged from 0.0 to 1.3 percent for the six surveys. Finally, a comparison of this 1998 calibration was made with the 1986 calibration. Differences of less than 10 percent were found within the nozzle core flow while in the boundary layer differences on the order of 20 percent were quite common.
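
    The mass-flow consistency check described above can be sketched as follows; the grid, mass-flux field, and flow-meter reading are all assumed values, not the survey data:

```python
import numpy as np

# Minimal sketch of the mass-flow consistency check (all numbers assumed,
# not the survey data). Mass flux rho*u is sampled on a uniform grid over
# the 11 in square nozzle exit plane; a constant synthetic field is used.
side = 11 * 0.0254                      # nozzle exit width, m
n = 23                                  # grid points per side
cell = (side / n) ** 2                  # area of one grid cell, m^2
mass_flux = np.full((n, n), 50.0)       # kg/s/m^2, synthetic survey data

# Area integration of the mass flux over the exit plane (midpoint rule)
integrated = float(np.sum(mass_flux) * cell)

# Compare with the facility flow-meter reading (assumed value)
flow_meter = 3.95                       # kg/s
pct_diff = 100.0 * abs(integrated - flow_meter) / flow_meter
print(f"integrated = {integrated:.2f} kg/s, difference = {pct_diff:.1f}%")
```

In the actual calibration the flux varies across the plane (especially in the boundary layer), so the survey used a denser grid near the wall and a proper area-weighted integration.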

  12. Development of Decision Support Formulas for the Prediction of Bladder Outlet Obstruction and Prostatic Surgery in Patients With Lower Urinary Tract Symptom/Benign Prostatic Hyperplasia: Part II, External Validation and Usability Testing of a Smartphone App.

    PubMed

    Choo, Min Soo; Jeong, Seong Jin; Cho, Sung Yong; Yoo, Changwon; Jeong, Chang Wook; Ku, Ja Hyeon; Oh, Seung-June

    2017-04-01

    We aimed to externally validate the prediction model we developed for having bladder outlet obstruction (BOO) and requiring prostatic surgery using 2 independent data sets from tertiary referral centers, and also aimed to validate a mobile app implementing this model through usability testing. Formulas and nomograms predicting whether a subject has BOO and needs prostatic surgery were validated with an external validation cohort from Seoul National University Bundang Hospital and Seoul Metropolitan Government-Seoul National University Boramae Medical Center between January 2004 and April 2015. A smartphone-based app was developed, and 8 young urologists were enrolled for usability testing to identify any human factor issues with the app. A total of 642 patients were included in the external validation cohort. No significant differences were found in the baseline characteristics of major parameters between the original (n=1,179) and the external validation cohorts, except for the maximal flow rate. Predictions of requiring prostatic surgery in the validation cohort showed a sensitivity of 80.6%, a specificity of 73.2%, a positive predictive value of 49.7%, a negative predictive value of 92.0%, and an area under the receiver operating characteristic curve of 0.84. The calibration plot indicated that the predictions corresponded well with observed outcomes. The decision curve also showed a high net benefit. Similar evaluation results using the external validation cohort were seen in the predictions of having BOO. Overall results of the usability test demonstrated that the app was user-friendly with no major human factor issues. External validation of this newly developed prediction model demonstrated a moderate level of discrimination, adequate calibration, and high net benefit gains for predicting both having BOO and requiring prostatic surgery. The smartphone app implementing the prediction model was also user-friendly, with no major human factor issues.
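
    The reported validation metrics are standard functions of a 2x2 confusion matrix. A minimal sketch, with counts chosen only to roughly reproduce the reported values (they are not the study's data):

```python
# Minimal sketch of the validation metrics reported above, computed from a
# 2x2 confusion matrix. The counts below are illustrative, chosen to
# approximately reproduce the reported percentages; they are not the
# study's actual data.
tp, fn = 100, 24     # surgery needed: predicted positive / negative
fp, tn = 101, 276    # surgery not needed: predicted positive / negative

sensitivity = tp / (tp + fn)                  # true-positive rate
specificity = tn / (tn + fp)                  # true-negative rate
ppv = tp / (tp + fp)                          # positive predictive value
npv = tn / (tn + fn)                          # negative predictive value

print(f"sens={sensitivity:.1%} spec={specificity:.1%} "
      f"PPV={ppv:.1%} NPV={npv:.1%}")
```

Note how a modest PPV can coexist with a high NPV when the condition is relatively uncommon in the cohort, which is why the model is most useful for ruling surgery out.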

  13. SURFplus Model Calibration for PBX 9502

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Menikoff, Ralph

    2017-12-06

    The SURFplus reactive burn model is calibrated for the TATB-based explosive PBX 9502 at three initial temperatures: hot (75 C), ambient (23 C) and cold (-55 C). The CJ state depends on the initial temperature due to the variation in the initial density and initial specific energy of the PBX reactants. For the reactants, a porosity model for full-density TATB is used. This allows the initial PBX density to be set to its measured value even though the coefficients of thermal expansion for the TATB and the PBX differ. The PBX products EOS is taken as independent of the initial PBX state. The initial temperature also affects the sensitivity to shock initiation. The model rate parameters are calibrated to Pop plot data, the failure diameter, the limiting detonation speed just above the failure diameter, and curvature effect data for small curvature.

  14. The standard calibration instrument automation system for the atomic absorption spectrophotometer. Part 3: Program documentation

    NASA Astrophysics Data System (ADS)

    Ryan, D. P.; Roth, G. S.

    1982-04-01

    Complete documentation of the 15 programs and 11 data files of the EPA Atomic Absorption Instrument Automation System is presented. The system incorporates the following major features: (1) multipoint calibration using first, second, or third degree regression or linear interpolation, (2) timely quality control assessments for spiked samples, duplicates, laboratory control standards, reagent blanks, and instrument check standards, (3) reagent blank subtraction, and (4) plotting of calibration curves and raw data peaks. The programs of this system are written in Data General Extended BASIC, Revision 4.3, as enhanced for multi-user, real-time data acquisition. They run in a Data General Nova 840 minicomputer under the operating system RDOS, Revision 6.2. There is a functional description, a symbol definitions table, a functional flowchart, a program listing, and a symbol cross reference table for each program. The structure of every data file is also detailed.
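
    Feature (1), multipoint calibration with a selectable regression degree, combined with the reagent-blank subtraction of feature (3), can be sketched as below; the standards, absorbances, and helper function are invented for illustration:

```python
import numpy as np

# Sketch of multipoint calibration with reagent-blank subtraction and a
# selectable regression degree, as in the system described above.
# Standards and absorbances are invented for illustration.
std_conc = np.array([0.0, 1.0, 2.0, 4.0, 8.0])        # mg/L
absorbance = np.array([0.012, 0.110, 0.205, 0.400, 0.790])
reagent_blank = 0.012

signal = absorbance - reagent_blank                    # blank subtraction

def calibrate(conc, sig, degree=1):
    """Fit sig = f(conc) and return a function mapping signal -> conc.

    Inversion is done by interpolating along the fitted curve, which
    assumes the fit is monotonic over the calibration range.
    """
    coeffs = np.polyfit(conc, sig, degree)
    grid = np.linspace(conc.min(), conc.max(), 1000)
    fitted = np.polyval(coeffs, grid)
    return lambda s: float(np.interp(s, fitted, grid))

to_conc = calibrate(std_conc, signal, degree=1)
sample = to_conc(0.300 - reagent_blank)                # unknown absorbance
print(f"sample concentration ≈ {sample:.2f} mg/L")
```

The system documented above additionally supports second- and third-degree regression and linear interpolation; the `degree` parameter here stands in for that choice.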

  15. Standard Reference Line Combined with One-Point Calibration-Free Laser-Induced Breakdown Spectroscopy (CF-LIBS) to Quantitatively Analyze Stainless and Heat Resistant Steel.

    PubMed

    Fu, Hongbo; Wang, Huadong; Jia, Junwei; Ni, Zhibo; Dong, Fengzhong

    2018-01-01

    Because of the self-absorption of major elements' lines, the scarcity of observable spectral lines of trace elements, and the need for relative efficiency correction of the experimental system, accurate quantitative analysis with calibration-free laser-induced breakdown spectroscopy (CF-LIBS) is in fact not easy. To overcome these difficulties, the standard reference line (SRL) method combined with one-point calibration (OPC) is used to analyze six elements in three stainless-steel and five heat-resistant steel samples. The Stark broadening and the Saha-Boltzmann plot of Fe are used to calculate the electron density and the plasma temperature, respectively. In the present work, we tested the original SRL method, the SRL method with OPC, and the intercept method with OPC. The final calculation results show that the latter two methods can effectively improve the overall accuracy of quantitative analysis and the detection limits of trace elements.
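
    The Boltzmann-plot step (plasma temperature from the slope of line intensities against upper-level energy) can be sketched as follows, with synthetic Fe line data generated for an assumed temperature:

```python
import numpy as np

# Sketch of a Boltzmann plot for the plasma (excitation) temperature from
# line intensities. The line data below are synthetic, generated for an
# assumed temperature, to show the slope -> T relation:
#   ln(I*lambda / (g*A)) = -E_upper / (kB*T) + const
kB_eV = 8.617e-5                 # Boltzmann constant, eV/K
T_true = 11604.5                 # K (roughly kB*T = 1 eV), assumed

E_upper = np.array([3.2, 3.9, 4.6, 5.4, 6.1])   # upper-level energies, eV
y = 12.0 - E_upper / (kB_eV * T_true)           # synthetic ln(I*lam/(g*A))

slope, _ = np.polyfit(E_upper, y, 1)            # slope = -1/(kB*T)
T_est = -1.0 / (kB_eV * slope)
print(f"estimated plasma temperature ≈ {T_est:.0f} K")
```

Real spectra add scatter to the points, so the quality of the linear fit itself is a useful check on the local-thermodynamic-equilibrium assumption behind CF-LIBS.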

  16. Design and initial evaluation of a portable in situ runoff and sediment monitoring device

    NASA Astrophysics Data System (ADS)

    Sun, Tao; Cruse, Richard M.; Chen, Qiang; Li, Hao; Song, Chunyu; Zhang, Xingyi

    2014-11-01

    An inexpensive portable runoff and sediment monitoring device (RSMD) requiring no external electric power was developed for measuring water runoff and associated sediment loss from field plots ranging from 0.005 to 0.1 ha. The device consists of a runoff gauge and sediment mixing and sectional subsampling assemblies. The runoff hydrograph is determined using a calibrated tipping bucket. The sediment mixing assembly minimizes fluid splash while mixing the runoff water/sediment mixture prior to subsampling this material. Automatic flow-proportional sampling utilizes mechanical power supplied by the tipping bucket action, with power transmitted to the sample collection assembly via the tipping bucket pivot bar. Runoff is well mixed and subdivided twice before subsamples are collected for analysis. The resolution of this device for a 100 m2 plot is 0.025 mm of runoff; the device is able to capture maximum flow rates up to 82 mm h-1 in a plot of the same dimensions. Calibration results indicated that the maximum error is 2.1% for estimating flow rate and less than 10% for sediment concentration over most of the flow range. The RSMD was assessed by measuring field runoff and soil loss from different tillage and slope treatments for a single natural rainfall event. Results were in close agreement with those in the published literature, giving additional evidence that the device performs acceptably well. The RSMD is uniquely adapted to a wide range of field sites, especially those without electric power, making it a useful tool for studying soil management strategies.
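
    The stated resolution and peak capacity follow from simple unit arithmetic; a worked check using only the figures quoted above:

```python
# Worked check of the stated tipping-bucket figures: one tip per 0.025 mm
# of runoff from a 100 m^2 plot implies a bucket volume of
#   0.025 mm * 100 m^2 = 2.5e-5 m * 100 m^2 = 2.5e-3 m^3 = 2.5 L per tip.
plot_area_m2 = 100.0
resolution_mm = 0.025
tip_volume_L = resolution_mm / 1000.0 * plot_area_m2 * 1000.0  # mm->m, m^3->L
print(tip_volume_L)               # ≈ 2.5 L per tip

# The stated peak capacity of 82 mm/h over the same plot corresponds to
# 0.082 m/h * 100 m^2 = 8.2 m^3/h, i.e. roughly 2.3 L/s of flow.
peak_flow_L_per_s = 82.0 / 1000.0 * plot_area_m2 * 1000.0 / 3600.0
print(f"peak flow ≈ {peak_flow_L_per_s:.2f} L/s")
```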

  17. Sustained prediction ability of net analyte preprocessing methods using reduced calibration sets. Theoretical and experimental study involving the spectrophotometric analysis of multicomponent mixtures.

    PubMed

    Goicoechea, H C; Olivieri, A C

    2001-07-01

    A newly developed multivariate method involving net analyte preprocessing (NAP) was tested using central composite calibration designs of progressively decreasing size regarding the multivariate simultaneous spectrophotometric determination of three active components (phenylephrine, diphenhydramine and naphazoline) and one excipient (methylparaben) in nasal solutions. Its performance was evaluated and compared with that of partial least-squares (PLS-1). Minimisation of the calibration predicted error sum of squares (PRESS) as a function of a moving spectral window helped to select appropriate working spectral ranges for both methods. The comparison of NAP and PLS results was carried out using two tests: (1) the elliptical joint confidence region for the slope and intercept of a predicted versus actual concentrations plot for a large validation set of samples and (2) the D-optimality criterion concerning the information content of the calibration data matrix. Extensive simulations and experimental validation showed that, unlike PLS, the NAP method is able to furnish highly satisfactory results when the calibration set is reduced from a full four-component central composite to a fractional central composite, as expected from the modelling requirements of net analyte based methods.

  18. True Colors Shining Through

    NASA Technical Reports Server (NTRS)

    2004-01-01

    This image mosaic illustrates how scientists use the color calibration targets (upper left) located on both Mars Exploration Rovers to fine-tune the rovers' sense of color. In the center, spectra, or light signatures, acquired in the laboratory of the colored chips on the targets are shown as lines. Actual data from Mars Exploration Rover Spirit's panoramic camera is mapped on top of these lines as dots. The plot demonstrates that the observed colors of Mars match the colors of the chips, and thus approximate the red planet's true colors. This finding is further corroborated by the picture taken on Mars of the calibration target, which shows the colored chips as they would appear on Earth.

  19. Containerized Nursery Start-up Costs

    Treesearch

    Mike Edwards

    2002-01-01

    About 4 years ago, I seriously began to entertain the idea of opening a containerized forestry nursery. During this period, some of the timberland owned by American Forest Seed Service near Auburn University's Solon Dixon Forestry Center, located in Andalusia, AL, was being used for three out-planting test plots. The test plots compared growth rates between...

  20. Extreme Energy Particle Astrophysics with ANITA-V

    NASA Astrophysics Data System (ADS)

    Wissel, Stephanie

    This proposal is in collaboration with Peter Gorham at the University of Hawaii, who is the PI of the lead proposal. Co-I Wissel and her group at California Polytechnic State University (Cal Poly) will be responsible for calibration equipment upgrades and deployment of the calibration system. The Cal Poly group is planning to provide calibration hardware and software products in support of the analysis of ANITA-V data in search of ultra-high-energy (UHE) neutrinos and cosmic rays. Wissel (now at Cal Poly, a new collaborating institution for ANITA-V) brings significant experience in the detection of high-energy and ultra-high-energy particles to the collaboration, leveraging her thirteen years of experience in particle astrophysics and previous work on ANITA-III and ANITA-IV.

  1. Modelling Water Flow through Paddy Soils under Alternate Wetting and Drying Irrigation Practice

    NASA Astrophysics Data System (ADS)

    Shekhar, S.; Mailapalli, D. R.; Das, B. S.; Raghuwanshi, N. S.

    2017-12-01

    Alternate wetting and drying (AWD) irrigation practice in paddy cultivation requires an optimum soil moisture stress (OSMS) level at which irrigation water savings can be maximized without compromising yield. Determining OSMS experimentally is challenging and is only practical with appropriate modeling tools. In this study, field experiments on paddy were conducted in thirty non-weighing lysimeters during the dry seasons of 2016 and 2017. Ten plots were irrigated using continuous flooding (CF), and the rest were irrigated with AWD practice at 40 mb and 75 mb soil moisture stress levels. Depth of ponding and soil suction at 10, 40 and 70 cm below the soil surface were measured daily in all lysimeter plots. The measured field data were used to calibrate and validate the Hydrus-1D model, which simulated the water flow for both AWD and CF plots. Hydrus-1D is being used to estimate OSMS for AWD practice and to compare the seasonal irrigation water input and deep percolation losses with CF practice.

  2. Measures and Relative Motions of Some Mostly F. G. W. Struve Doubles

    NASA Astrophysics Data System (ADS)

    Wiley, E. O.

    2012-04-01

    Measures of 59 pairs of double stars with long observational histories using "lucky imaging" techniques are reported. Relative motions of the 59 pairs are investigated using histories of observation, scatter plots of relative motion, and ordinary least-squares (OLS) and total proper motion analyses performed in "R," an open-source programming language. A scatter plot of the coefficients of determination derived from the OLS y|epoch and OLS x|epoch regressions clearly separates common proper motion pairs from optical pairs and what are termed "long-period binary candidates." Differences in proper motion separate optical pairs from long-period binary candidates. An Appendix is provided that details how to use known rectilinear pairs as calibration pairs for the program REDUC.

  3. STS-3/OSS-1 Plasma Diagnostics Package (PDP) measurements of Orbiter transmitter and subsystem electromagnetic interference

    NASA Technical Reports Server (NTRS)

    Shawhan, S. D.; Murphy, G.

    1983-01-01

    The plasma diagnostics package receiver system is described to identify the various antennas and to characterize the complement of receivers which cover the frequency range of 30 Hz to 800 Hz and S-band at 2200 ± 300 MHz. Sample results are presented to show the variability of electromagnetic effects associated with the orbiter and the time variability of these effects. The electric field and magnetic field maximum and minimum field strength spectra observed during the mission at the pallet location are plotted. Values are also derived for the maximum UHF transmitter and S-band transmitter field strengths. Calibration data to convert from the survey plots to actual narrowband and broadband field strengths are listed.

  4. MetaPlotR: a Perl/R pipeline for plotting metagenes of nucleotide modifications and other transcriptomic sites.

    PubMed

    Olarerin-George, Anthony O; Jaffrey, Samie R

    2017-05-15

    An increasing number of studies are mapping protein binding and nucleotide modification sites throughout the transcriptome. Often, these sites cluster in certain regions of the transcript, giving clues to their function. Hence, it is informative to summarize where in the transcript these sites occur. A metagene is a simple and effective tool for visualizing the distribution of sites along a simplified transcript model. In this work, we introduce MetaPlotR, a Perl/R pipeline for creating metagene plots. The code and associated tutorial are available at https://github.com/olarerin/metaPlotR. srj2003@med.cornell.edu. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  5. THE USE OF QUENCHING IN A LIQUID SCINTILLATION COUNTER FOR QUANTITATIVE ANALYSIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foster, G.V.

    1963-01-01

    Quenching was used to quantitatively determine the amount of quenching agent present. A sealed promethium-147 source was prepared to be used for the count rate determinations. Two methods to determine the amount of quenching agent present in a sample were developed. One method related the count rate of a sample containing a quenching agent to the amount of quenching agent present. Calibration curves were plotted using both color and chemical quenchers. The quenching agents used were: F.D.C. Orange No. 2, F.D.C. Yellow No. 3, F.D.C. Yellow No. 4, Scarlet Red, acetone, benzaldehyde, and carbon tetrachloride. The color quenchers gave a linear relationship, while the chemical quenchers gave a non-linear relationship. Quantities of the color quenchers between about 0.008 mg and 0.100 mg can be determined with an error of less than 5%. The calibration curves were found to be usable over a long period of time. The other method related the change in the ratio of the count rates in two voltage windows to the amount of quenching agent present. The quenchers mentioned above were used. Calibration curves were plotted for both the color and chemical quenchers. The relationships of ratio versus amount of quencher were non-linear in each case. It was shown that the reproducibility of the count rate and the ratio was independent of the amount of quencher present but was dependent on the count rate. At count rates above 10,000 counts per minute the reproducibility was better than 1%. (TCO)

  6. Improved CRDS δ13C Stability Through New Calibration Application For CO2 and CH4

    NASA Astrophysics Data System (ADS)

    Arata, C.; Rella, C.

    2014-12-01

    Stable carbon isotope ratio measurements of CO2 and CH4 provide valuable insight into global and regional sources and sinks of the two most important greenhouse gases. Methodologies based on Cavity Ring-Down Spectroscopy (CRDS) have been developed that deliver δ13C measurements with a precision of better than 0.12 permil for CO2 and 0.4 permil for CH4 (1 hour window, 5 minute average). Here we present a method to further improve this measurement's stability. We have developed a two-point calibration method which corrects for δ13C drift due to a dependence on carbon species concentration. This method calibrates for both carbon species concentration and δ13C. We go on to show that this added stability is especially valuable when using carbon isotope data in linear regression models such as Keeling plots, where even small amounts of error can be magnified to give inconclusive results. The method is demonstrated in both laboratory and ambient atmospheric conditions, and we demonstrate how to select the calibration frequency.
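
    The Keeling-plot regression mentioned above can be sketched as follows; the background and source values are assumed for illustration, and the intercept of δ13C against 1/[CO2] recovers the source signature:

```python
import numpy as np

# Sketch of a Keeling-plot regression: observed delta-13C is regressed on
# the inverse of CO2 concentration, and the intercept estimates the
# isotopic signature of the added source. Data are synthetic, built from
# an assumed background (400 ppm, -8 permil) and source (-28 permil).
bg_c, bg_d, src_d = 400.0, -8.0, -28.0
added = np.array([10.0, 25.0, 50.0, 100.0, 200.0])       # ppm of source CO2

conc = bg_c + added                                      # total CO2, ppm
delta = (bg_c * bg_d + added * src_d) / conc             # mass-balance mix

slope, intercept = np.polyfit(1.0 / conc, delta, 1)
print(f"Keeling intercept ≈ {intercept:.2f} permil (source signature)")
```

Because the intercept is an extrapolation to 1/[CO2] = 0, small calibration errors in δ13C shift it strongly, which is the motivation for the drift correction described in the abstract.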

  7. Calibration of micromechanical parameters for DEM simulations by using the particle filter

    NASA Astrophysics Data System (ADS)

    Cheng, Hongyang; Shuku, Takayuki; Thoeni, Klaus; Yamamoto, Haruyuki

    2017-06-01

    The calibration of DEM models is typically accomplished by trial and error. However, this procedure lacks objectivity and involves several uncertainties. To deal with these issues, the particle filter is employed as a novel approach to calibrate DEM models of granular soils. The posterior probability distribution of the micro-parameters that give numerical results in good agreement with the experimental response of a Toyoura sand specimen is approximated by independent model trajectories, referred to as 'particles', based on Monte Carlo sampling. The soil specimen is modeled by polydisperse packings with different numbers of spherical grains. Prepared in 'stress-free' states, the packings are subjected to triaxial quasi-static loading. Given the experimental data, the posterior probability distribution is incrementally updated until convergence is reached. The resulting 'particles' with higher weights are identified as the calibration results. The evolutions of the weighted averages and the posterior probability distribution of the micro-parameters are plotted to show the advantage of using a particle filter, i.e., multiple solutions are identified for each parameter with known probabilities of reproducing the experimental response.
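
    The core particle-filter update (weighting Monte Carlo 'particles' by their agreement with the observations) can be sketched with a toy one-parameter model standing in for the DEM; all numbers are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Minimal sketch of the particle-filter idea: candidate parameter sets
# ('particles') are weighted by how well their model response matches the
# observed response. A toy linear model y = k*t stands in for the DEM.
obs = np.array([2.1, 3.9, 6.2])           # "experimental" response
sigma = 0.3                               # assumed observation noise

particles = rng.uniform(0.5, 3.5, size=500)       # candidate parameter k
weights = np.ones_like(particles) / len(particles)

for t, y in enumerate(obs, start=1):
    pred = particles * t                          # toy model response
    # Gaussian likelihood of each particle given the new observation
    weights *= np.exp(-0.5 * ((y - pred) / sigma) ** 2)
    weights /= weights.sum()                      # normalize posterior

estimate = np.sum(weights * particles)            # weighted-average parameter
print(f"posterior mean k ≈ {estimate:.2f}")
```

The surviving high-weight particles approximate the posterior distribution, which is what lets the method report multiple plausible calibrations with known probabilities rather than a single trial-and-error fit.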

  8. Three-dimensional canopy fuel loading predicted using upward and downward sensing LiDAR systems

    Treesearch

    Nicholas S. Skowronski; Kenneth L. Clark; Matthew Duveneck; John. Hom

    2011-01-01

    We calibrated upward sensing profiling and downward sensing scanning LiDAR systems to estimates of canopy fuel loading developed from field plots and allometric equations, and then used the LiDAR datasets to predict canopy bulk density (CBD) and crown fuel weight (CFW) in wildfire prone stands in the New Jersey Pinelands. LiDAR-derived height profiles were also...

  9. Diameter growth models using FIA data from the Northeastern, Southern, and North Central Research Stations

    Treesearch

    Veronica C. Lessard; Ronald E. McRoberts; Margaret R. Holdaway

    2000-01-01

    Nonlinear, individual-tree, distance-independent annual diameter growth models are presented for species in two ecoregions defined by R.G. Bailey in the northern Lake States and in parts of the central and southern regions of the U.S. The models were calibrated using Forest Inventory and Analysis (FIA) data from undisturbed plots on land classified as timberland across...

  10. SouthPro : a computer program for managing uneven-aged loblolly pine stands

    Treesearch

    Benedict Schulte; Joseph Buongiorno; Ching-Rong Lin; Kenneth E. Skog

    1998-01-01

    SouthPro is a Microsoft Excel add-in program that simulates the management, growth, and yield of uneven-aged loblolly pine stands in the Southern United States. The built-in growth model of this program was calibrated from 991 uneven-aged plots in seven states, covering most growing conditions and sites. Stands are described by the number of trees in 13 size classes...

  11. Best Practices to Achieve the Lowest Uncertainty in Measuring with Respect

    Science.gov Websites

    Recalibrate a reference device that has been sitting in a cabinet from time to time; if control charts are used, this interval can be extended. Keep packaged cells or a module for use in control charts to monitor the test bed and any potential drift in the reference device's calibration. Measure the control sample at least once a week and plot the percentage deviation.

  12. Stratified estimates of forest area using the k-nearest neighbors technique and satellite imagery

    Treesearch

    Ronald E. McRoberts; Mark D. Nelson; Daniel Wendt

    2002-01-01

    For two study areas in Minnesota, stratified estimation using Landsat Thematic Mapper satellite imagery as the basis for stratification was used to estimate forest area. Measurements of forest inventory plots obtained for a 12-month period in 1998 and 1999 were used as the source of data for within-strata estimates. These measurements further served as calibration data...

  13. Short- and long-term responses of total soil organic carbon to harvesting in a northern hardwood forest

    Treesearch

    Kristofer Johnson; Frederick N. Scatena; Yude Pan

    2010-01-01

    The long-term response of total soil organic carbon pools ('total SOC', i.e. soil and dead wood) to different harvesting scenarios in even-aged northern hardwood forest stands was evaluated using two soil carbon models, CENTURY and YASSO, that were calibrated with forest plot empirical data in the Green Mountains of Vermont. Overall, 13 different harvesting...

  14. Diagnostic utility of appetite loss in addition to existing prediction models for community-acquired pneumonia in the elderly: a prospective diagnostic study in acute care hospitals in Japan

    PubMed Central

    Yamamoto, Yosuke; Terada, Kazuhiko; Ohta, Mitsuyasu; Mikami, Wakako; Yokota, Hajime; Hayashi, Michio; Miyashita, Jun; Azuma, Teruhisa; Fukuma, Shingo; Fukuhara, Shunichi

    2017-01-01

    Objective Diagnosis of community-acquired pneumonia (CAP) in the elderly is often delayed because of atypical presentation and non-specific symptoms, such as appetite loss, falls and disturbance in consciousness. The aim of this study was to investigate the external validity of existing prediction models and the added value of the non-specific symptoms for the diagnosis of CAP in elderly patients. Design Prospective cohort study. Setting General medicine departments of three teaching hospitals in Japan. Participants A total of 109 elderly patients who consulted for upper respiratory symptoms between 1 October 2014 and 30 September 2016. Main outcome measures The reference standard for CAP was a chest radiograph evaluated by two certified radiologists. The existing models were externally validated for diagnostic performance by calibration plot and discrimination. To evaluate the additional value of the non-specific symptoms to the existing prediction models, we developed an extended logistic regression model. Calibration, discrimination, category-free net reclassification improvement (NRI) and decision curve analysis (DCA) were investigated in the extended model. Results Among the existing models, the model by van Vugt demonstrated the best performance, with an area under the curve of 0.75 (95% CI 0.63 to 0.88); the calibration plot showed good fit despite a significant Hosmer-Lemeshow test (p=0.017). Among the non-specific symptoms, appetite loss had a positive likelihood ratio of 3.2 (2.0–5.3), a negative likelihood ratio of 0.4 (0.2–0.7) and an OR of 7.7 (3.0–19.7). Addition of appetite loss to the model by van Vugt led to improved calibration (p=0.48), an NRI of 0.53 (p=0.019) and a higher net benefit by DCA. Conclusions Information on appetite loss improved the performance of an existing model for the diagnosis of CAP in the elderly. PMID:29122806
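
    A likelihood ratio converts a pre-test probability into a post-test probability on the odds scale; a minimal sketch using the LRs reported above and an assumed pre-test probability:

```python
# Sketch of how the reported likelihood ratios translate into post-test
# probabilities via Bayes' rule on the odds scale. The pre-test
# probability below is assumed for illustration.
lr_pos, lr_neg = 3.2, 0.4       # appetite loss LRs, as reported above

def post_test(p_pre, lr):
    """Convert a pre-test probability to a post-test probability given an LR."""
    odds = p_pre / (1.0 - p_pre)        # probability -> odds
    post_odds = odds * lr               # Bayes update on the odds scale
    return post_odds / (1.0 + post_odds)

p = 0.30                        # assumed pre-test probability of CAP
print(f"appetite loss present: {post_test(p, lr_pos):.1%}")
print(f"appetite loss absent:  {post_test(p, lr_neg):.1%}")
```

With a 30% pre-test probability, the presence or absence of appetite loss moves the estimate in opposite directions by a clinically meaningful margin, which is the intuition behind adding it to the van Vugt model.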

  15. Path Analysis and Residual Plotting as Methods of Environmental Scanning in Higher Education: An Illustration with Applications and Enrollments.

    ERIC Educational Resources Information Center

    Morcol, Goktug; McLaughlin, Gerald W.

    1990-01-01

    The study proposes using path analysis and residual plotting as methods supporting environmental scanning in strategic planning for higher education institutions. Path models of three levels of independent variables are developed. Dependent variables measuring applications and enrollments at Virginia Polytechnic Institute and State University are…

  16. Establishing a Eucalyptus energy plantation on the central coast of California

    Treesearch

    Norman H. Pillsbury; Nelson L. Ayers

    1983-01-01

    A 17.5-acre non-irrigated biomass energy plantation has been established near San Luis Obispo. This joint California Polytechnic State University - California Department of Forestry project is measuring plot growth response of seven eucalyptus species for three spacing trials and for the effect of fertilization. All study plots are replicated. Site preparation strategy...

  17. Space Surveillance Tech Area Benefits from University Partnerships (Postprint)

    DTIC Science & Technology

    2013-12-10

    Michigan Technological University’s Oculus-ASR is a calibration satellite for AMOS’s telescopic non-resolved object characterization program. Another example is the University of Buffalo, which is... time and on-orbit data. Without partnerships such as these, a school would not be able to deliver as high quality a satellite, decreasing relevance

  18. CONCH: A Visual Basic program for interactive processing of ion-microprobe analytical data

    NASA Astrophysics Data System (ADS)

    Nelson, David R.

    2006-11-01

    A Visual Basic program for flexible, interactive processing of ion-microprobe data acquired for quantitative trace element, 26Al-26Mg, 53Mn-53Cr, 60Fe-60Ni and U-Th-Pb geochronology applications is described. Default but editable run-tables enable software identification of the secondary ion species analyzed and characterization of the standard used. Counts obtained for each species may be displayed in plots against analysis time and edited interactively. Count outliers can be automatically identified via a set of editable count-rejection criteria and displayed for assessment. Standard analyses are distinguished from Unknowns by matching of the analysis label with a string specified in the Set-up dialog, and processed separately. A generalized routine writes background-corrected count rates, ratios and uncertainties, plus weighted means and uncertainties for Standards and Unknowns, to a spreadsheet that may be saved as a text-delimited file. Specialized routines process trace-element concentration, 26Al-26Mg, 53Mn-53Cr, 60Fe-60Ni, and Th-U disequilibrium analysis types, and U-Th-Pb isotopic data obtained for zircon, titanite, perovskite, monazite, xenotime and baddeleyite. Correction to measured Pb-isotopic, Pb/U and Pb/Th ratios for the presence of common Pb may be made using measured 204Pb counts, or the 207Pb or 208Pb counts following subtraction from these of the radiogenic component. Common-Pb corrections may be made automatically, using a (user-specified) common-Pb isotopic composition appropriate for that on the sample surface, or for that incorporated within the mineral at the time of its crystallization, depending on whether the 204Pb count rate determined for the Unknown is substantially higher than the average 204Pb count rate for all session standards.
Pb/U inter-element fractionation corrections are determined using an interactive ln-ln plot of common-Pb corrected 206Pb/238U ratios against any nominated fractionation-sensitive species pair (commonly 238U16O+/238U+) for session standards. Also displayed with this plot are calculated Pb/U and Pb/Th calibration line regression slopes, y-intercepts, calibration uncertainties, standard 204Pb- and 208Pb-corrected 207Pb/206Pb dates and other parameters useful for assessment of the calibration-line data. Calibrated data for Unknowns may be automatically grouped according to calculated date and displayed in color on interactive Wetherill Concordia, Tera-Wasserburg Concordia, Linearized Gaussian ("Probability Paper") and Gaussian-summation probability density diagrams.

  19. Investigating and Modeling Ecosystem Response to an Experimental and a Natural Ice Storm

    NASA Astrophysics Data System (ADS)

    Fakhraei, H.; Driscoll, C. T.; Rustad, L.; Campbell, J. L.; Groffman, P.; Fahey, T.; Likens, G.; Swaminathan, R.

    2017-12-01

    Our understanding of ecosystem response to extreme events is generally limited to rare observations of natural historical events. However, investigating extreme events under controlled conditions can improve our understanding of these natural phenomena. A novel field experiment was conducted in a northern hardwood forest at the Hubbard Brook Experimental Forest in New Hampshire in the northeastern United States to quantify the influence of ice storms on ecological processes. During subfreezing conditions in the winters of 2016 and 2017, water from a nearby stream was pumped and sprayed onto the canopy of eight experimental plots to accrete ice to a targeted thickness on the canopy. The experiment was conducted at three levels of icing thickness (0.25, 0.5, 0.75 in.) in 2016, comparable to the naturally occurring 1998 ice storm, and a second 0.5 in. treatment in 2017, all of which were compared with reference plots. The most notable response to the icing treatments was a marked increase in fine and coarse litter fall, which increased exponentially with icing thickness. Post-treatment openings in the canopy caused short-term increases in soil temperature in the ice-treatment plots compared to the reference plots. No response to the ice storm treatments was detected for soil moisture, net N mineralization, net nitrification, or denitrification after either the natural or the experimental ice storms. In contrast to the marked increase in stream water nitrate after the naturally occurring 1998 ice storm, we have not observed any significant change in soil solution N concentrations in the experimental ice storm treatments. The inconsistency in response between the natural and experimental ice storms is likely due to differences in geophysical characteristics of the study sites, including slope, and lateral uptake of nutrients by trees outside the experimental plots.
In order to evaluate the long-term impacts of ice storms on northern hardwood forests, we used the biogeochemical model, PnET-BGC. The model was calibrated to the study watersheds using observations from the natural and experimental ice storms. Future projections for ice storm events were estimated from an advanced climate model and applied to the calibrated PnET-BGC model to simulate future impacts of ice storms on the northern hardwood forests.

  20. FlowCal: A user-friendly, open source software tool for automatically converting flow cytometry data from arbitrary to calibrated units

    PubMed Central

    Castillo-Hair, Sebastian M.; Sexton, John T.; Landry, Brian P.; Olson, Evan J.; Igoshin, Oleg A.; Tabor, Jeffrey J.

    2017-01-01

    Flow cytometry is widely used to measure gene expression and other molecular biological processes with single cell resolution via fluorescent probes. Flow cytometers output data in arbitrary units (a.u.) that vary with the probe, instrument, and settings. Arbitrary units can be converted to the calibrated unit molecules of equivalent fluorophore (MEF) using commercially available calibration particles. However, there is no convenient, non-proprietary tool available to perform this calibration. Consequently, most researchers report data in a.u., limiting interpretation. Here, we report a software tool named FlowCal to overcome current limitations. FlowCal can be run using an intuitive Microsoft Excel interface, or customizable Python scripts. The software accepts Flow Cytometry Standard (FCS) files as inputs and is compatible with different calibration particles, fluorescent probes, and cell types. Additionally, FlowCal automatically gates data, calculates common statistics, and produces publication quality plots. We validate FlowCal by calibrating a.u. measurements of E. coli expressing superfolder GFP (sfGFP) collected at 10 different detector sensitivity (gain) settings to a single MEF value. Additionally, we reduce day-to-day variability in replicate E. coli sfGFP expression measurements due to instrument drift by 33%, and calibrate S. cerevisiae mVenus expression data to MEF units. Finally, we demonstrate a simple method for using FlowCal to calibrate fluorescence units across different cytometers. FlowCal should ease the quantitative analysis of flow cytometry data within and across laboratories and facilitate the adoption of standard fluorescence units in synthetic biology and beyond. PMID:27110723
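    The a.u.-to-MEF conversion that FlowCal automates rests on a standard curve fit to the calibration-bead subpopulations; a minimal sketch of the idea, with made-up bead values and a plain log-log linear fit (FlowCal itself uses a more refined bead model):

```python
import numpy as np

# Median fluorescence (a.u.) of bead subpopulations and their known MEF
# values from the bead datasheet -- both hypothetical numbers.
bead_au  = np.array([120.0, 315.0, 1000.0, 2500.0, 7200.0, 20800.0])
bead_mef = np.array([792.0, 2079.0, 6588.0, 16471.0, 47497.0, 137049.0])

# Standard curve: a straight line in log-log space.
slope, intercept = np.polyfit(np.log10(bead_au), np.log10(bead_mef), 1)

def au_to_mef(au):
    """Convert arbitrary-unit readings to MEF via the bead standard curve."""
    return 10 ** (slope * np.log10(au) + intercept)
```

    Cell measurements taken at the same detector settings as the beads can then be passed through `au_to_mef` to obtain calibrated values.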

  1. Using Calibrated Peer Review to Teach Basic Research Skills

    ERIC Educational Resources Information Center

    Bracke, Marianne S.; Graveel, John G.

    2014-01-01

    Calibrated Peer Review (CPR) is an online tool being used in the class Introduction to Agriculture at Purdue University (AGR 10100) to integrate a writing and research component (http://cpr.molsci.ucla.edu/Home.aspx). Calibrated Peer Review combines the ability to create writing-intensive assignments with an introduction to the peer-review…

  2. Hydrological processes and model representation: impact of soft data on calibration

    Treesearch

    J.G. Arnold; M.A. Youssef; H. Yen; M.J. White; A.Y. Sheshukov; A.M. Sadeghi; D.N. Moriasi; J.L. Steiner; Devendra Amatya; R.W. Skaggs; E.B. Haney; J. Jeong; M. Arabi; P.H. Gowda

    2015-01-01

    Hydrologic and water quality models are increasingly used to determine the environmental impacts of climate variability and land management. Due to differing model objectives and differences in monitored data, there are currently no universally accepted procedures for model calibration and validation in the literature. In an effort to develop accepted model calibration...

  3. The Ferrara hard X-ray facility for testing/calibrating hard X-ray focusing telescopes

    NASA Astrophysics Data System (ADS)

    Loffredo, Gianluca; Frontera, Filippo; Pellicciotta, Damiano; Pisa, Alessandro; Carassiti, Vito; Chiozzi, Stefano; Evangelisti, Federico; Landi, Luca; Melchiorri, Michele; Squerzanti, Stefano

    2005-12-01

    We will report on the current configuration of the X-ray facility of the University of Ferrara, recently used to perform reflectivity tests of mosaic crystals and to calibrate the JEM-X experiment aboard INTEGRAL. The facility is now located in the technological campus of the University of Ferrara in a new building (named the LARIX laboratory, for LARge Italian X-ray facility) that includes a 100 m long tunnel with two large experimental rooms on its sides. The facility is being improved for determining the optical axis of mosaic crystals in the Laue configuration and for calibrating Laue lenses and hard X-ray mirror prototypes.

  4. Photometric calibration of T40 telescope system at Ankara University Kreiken Observatory (AUKR)

    NASA Astrophysics Data System (ADS)

    Karakuş, O.; Ekmekçi, F.

    2017-07-01

    We aim to present the photometric calibration of the T40 telescope system at the Ankara University Kreiken Observatory (AUKR) in the Johnson BVRI band system through CCD observations of selected Landolt stars on the clearest 11 nights. Ten more stars with magnitudes of V < 11 were also observed in order to check the standard transformation coefficients. Using these coefficients, we present standard brightness and color magnitudes for these 10 selected stars. The standard brightness values of these 10 stars are also compared with previously published ones. It is clearly seen that the calibration results are sufficiently reliable.
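    The standard transformation step described above amounts to a linear least-squares fit for coefficients such as a zero point and a colour term; a sketch with synthetic numbers (the actual T40 reduction also fits extinction and per-band terms):

```python
import numpy as np

# Instrumental V-band magnitudes and catalog (B-V) colours for five
# Landolt standards -- synthetic numbers for illustration.
v_inst = np.array([12.31, 13.05, 11.78, 12.90, 13.42])
bv     = np.array([0.45, 0.80, 0.12, 0.63, 1.02])

# Catalog V generated here from a known zero point and colour term,
# so the fit below should recover zp = 21.5 and c = 0.08.
V_cat = v_inst + 21.5 + 0.08 * bv

# Model: V_cat - v_inst = zp + c*(B-V); solve by linear least squares.
A = np.column_stack([np.ones_like(bv), bv])
(zp, c), *_ = np.linalg.lstsq(A, V_cat - v_inst, rcond=None)
```

    A programme star's standard magnitude then follows as v_inst + zp + c*(B-V).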

  5. CalPro: a spreadsheet program for the management of California mixed-conifer stands.

    Treesearch

    Jingjing Liang; Joseph Buongiorno; Robert A. Monserud

    2004-01-01

    CalPro is an add-in program developed to work with Microsoft Excel to simulate the growth and management of uneven-aged mixed-conifer stands in California. Its built-in growth model was calibrated from 177 uneven-aged plots on industry and other private lands. Stands are described by the number of trees per acre in each of nineteen 2-inch diameter classes in...

  6. Calibration and use of an interactive-accounting model to simulate dissolved solids, streamflow, and water-supply operations in the Arkansas River basin, Colorado

    USGS Publications Warehouse

    Burns, A.W.

    1989-01-01

    An interactive-accounting model was used to simulate dissolved solids, streamflow, and water supply operations in the Arkansas River basin, Colorado. Model calibration of specific conductance to streamflow relations at three sites enabled computation of dissolved-solids loads throughout the basin. To simulate streamflow only, all water supply operations were incorporated in the regression relations for streamflow. Calibration for 1940-85 resulted in coefficients of determination that ranged from 0.89 to 0.58, and values in excess of 0.80 were determined for 16 of 20 nodes. The model then incorporated 74 water users and 11 reservoirs to simulate the water supply operations for two periods, 1943-74 and 1975-85. For the 1943-74 calibration, coefficients of determination for streamflow ranged from 0.87 to 0.02. Calibration of the water supply operations resulted in coefficients of determination that ranged from 0.87 to negative for simulated irrigation diversions of 37 selected water users. Calibration for 1975-85 was not evaluated statistically, but average values and plots of reservoir contents indicated reasonableness of the simulation. To demonstrate the utility of the model, six specific alternatives were simulated to consider effects of potential enlargement of Pueblo Reservoir. Three general major alternatives were simulated: the 1975-85 calibrated model data, the calibrated model data with an addition of 30 cu ft/sec in Fountain Creek flows, and the calibrated model data plus additional municipal water in storage. These three major alternatives considered the options of reservoir enlargement or no enlargement. A 40,000-acre-foot reservoir enlargement resulted in average increases of 2,500 acre-ft in transmountain diversions, of 800 acre-ft in storage diversions, and of 100 acre-ft in winter-water storage. (USGS)

  7. NURE aerial gamma-ray and magnetic-reconnaissance survey portions of New Mexico, Arizona, and Texas. Volume I. Instrumentation and data reduction. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    As part of the Department of Energy (DOE) National Uranium Resource Evaluation Program, a rotary-wing high-sensitivity radiometric and magnetic survey was flown covering portions of the states of New Mexico, Arizona, and Texas. The survey encompassed six 1:250,000 scale quadrangles: Holbrook, El Paso, Las Cruces, Carlsbad, Fort Sumner and Roswell. The survey was flown with a Sikorsky S58T helicopter equipped with a high-sensitivity gamma-ray spectrometer which was calibrated at the DOE calibration facilities at Walker Field in Grand Junction, Colorado, and the Dynamic Test Range at Lake Mead, Arizona. The radiometric data were processed to compensate for Compton scattering effects and altitude variations. The data were normalized to 400 feet terrain clearance. The reduced data are presented in the form of stacked profiles, standard deviation anomaly plots, histogram plots and microfiche listings. The results of the geologic interpretation of the radiometric data together with the profiles, anomaly maps and histograms are presented in the individual quadrangle reports. The survey was awarded to LKB Resources, Inc., which completed the data acquisition. In April 1980, Carson Helicopters, Inc. and Carson Geoscience Company agreed to manage the project and complete delivery of this final report.

  8. AstroImageJ: Image Processing and Photometric Extraction for Ultra-precise Astronomical Light Curves

    NASA Astrophysics Data System (ADS)

    Collins, Karen A.; Kielkopf, John F.; Stassun, Keivan G.; Hessman, Frederic V.

    2017-02-01

    ImageJ is a graphical user interface (GUI) driven, public domain, Java-based, software package for general image processing traditionally used mainly in life sciences fields. The image processing capabilities of ImageJ are useful and extendable to other scientific fields. Here we present AstroImageJ (AIJ), which provides an astronomy specific image display environment and tools for astronomy specific image calibration and data reduction. Although AIJ maintains the general purpose image processing capabilities of ImageJ, AIJ is streamlined for time-series differential photometry, light curve detrending and fitting, and light curve plotting, especially for applications requiring ultra-precise light curves (e.g., exoplanet transits). AIJ reads and writes standard Flexible Image Transport System (FITS) files, as well as other common image formats, provides FITS header viewing and editing, and is World Coordinate System aware, including an automated interface to the astrometry.net web portal for plate solving images. AIJ provides research grade image calibration and analysis tools with a GUI driven approach, and easily installed cross-platform compatibility. It enables new users, even at the level of undergraduate student, high school student, or amateur astronomer, to quickly start processing, modeling, and plotting astronomical image data with one tightly integrated software package.

  9. Calibration-Free Laser-Induced Breakdown Spectroscopy (CF-LIBS) with Standard Reference Line for the Analysis of Stainless Steel.

    PubMed

    Fu, Hongbo; Dong, Fengzhong; Wang, Huadong; Jia, Junwei; Ni, Zhibo

    2017-08-01

    In this work, calibration-free laser-induced breakdown spectroscopy (CF-LIBS) is used to analyze a certified stainless steel sample. Due to self-absorption of the spectral lines from the major element Fe and the sparse lines of trace elements, it is usually not easy to construct the Boltzmann plots of all species. A standard reference line method is proposed here to solve this difficulty under the assumption of local thermodynamic equilibrium, so that the same temperature value for all elements present in the plasma can be considered. Based on the concentration and rich spectral lines of Fe, the Stark broadening of Fe(I) 381.584 nm and Saha-Boltzmann plots of this element are used to calculate the electron density and the plasma temperature, respectively. In order to determine the plasma temperature accurately, which is seriously affected by self-absorption, a pre-selection procedure for eliminating those spectral lines with strong self-absorption is employed. Then, one spectral line of each element is selected to calculate its corresponding concentration. The results from the standard reference lines with and without self-absorption of Fe are compared. This method allows us to measure trace element content and effectively avoid the adverse effects due to self-absorption.

  10. The Magnetospheric Multiscale Magnetometers

    NASA Technical Reports Server (NTRS)

    Russell, C. T.; Anderson, B. J.; Baumjohann, W.; Bromund, K. R.; Dearborn, D.; Fischer, D.; Le, G.; Leinweber, H. K.; Leneman, D.; Magnes, W.; hide

    2014-01-01

    The success of the Magnetospheric Multiscale mission depends on the accurate measurement of the magnetic field on all four spacecraft. To ensure this success, two independently designed and built fluxgate magnetometers were developed, avoiding single-point failures. The magnetometers were dubbed the digital fluxgate (DFG), which uses an ASIC implementation and was supplied by the Space Research Institute of the Austrian Academy of Sciences, and the analogue fluxgate (AFG), with a more traditional circuit-board design, supplied by the University of California, Los Angeles. A stringent magnetic cleanliness program was executed under the supervision of the Johns Hopkins University's Applied Physics Laboratory. To achieve mission objectives, the calibration determined on the ground will be refined in space to ensure all eight magnetometers are precisely inter-calibrated. Near real-time data play a key role in the transmission of high-resolution observations stored onboard, so rapid processing of the low-resolution data is required. This article describes these instruments, the magnetic cleanliness program, the instrument pre-launch calibrations, the planned in-flight calibration program, and the information flow that provides the data on the rapid time scale needed for mission success.

  11. HST archive primer, version 4.1

    NASA Technical Reports Server (NTRS)

    Fruchter, A. (Editor); Baum, S. (Editor)

    1994-01-01

    This version of the HST Archive Primer provides the basic information a user needs to know to access the HST archive via StarView, the new user interface to the archive. Using StarView, users can search for observations of interest, find calibration reference files, and retrieve data from the archive. Both the terminal version of StarView and the X-windows version feature a name resolver which simplifies searches of the HST archive based on target name. In addition, the X-windows version of StarView allows preview of all public HST data; compressed versions of public images are displayed via SAOIMAGE, while spectra are plotted using the public plotting package, XMGR. Finally, the version of StarView described here features screens designed for observers preparing Cycle 5 HST proposals.

  12. SpcAudace: Spectroscopic processing and analysis package of Audela software

    NASA Astrophysics Data System (ADS)

    Mauclaire, Benjamin

    2017-11-01

    SpcAudace processes long-slit spectra with automated pipelines and performs astrophysical analysis of the resulting data. These powerful pipelines perform all the required steps in one pass: standard preprocessing, masking of bad pixels, geometric corrections, registration, optimized spectrum extraction, wavelength calibration, and instrumental response computation and correction. Both high- and low-resolution long-slit spectra are managed for stellar and non-stellar targets. Many types of publication-quality figures can be easily produced: pdf and png plots or annotated time-series plots. Astrophysical quantities can be derived from individual spectra or large numbers of spectra with advanced functions: from line-profile characteristics to equivalent widths and periodograms. More than 300 documented functions are available and can be used in TCL scripts for automation. SpcAudace is based on the Audela open source software.

  13. First Results of Field Absolute Calibration of the GPS Receiver Antenna at Wuhan University.

    PubMed

    Hu, Zhigang; Zhao, Qile; Chen, Guo; Wang, Guangxing; Dai, Zhiqiang; Li, Tao

    2015-11-13

    GNSS receiver antenna phase center variations (PCVs), which arise from the non-spherical phase response of GNSS signals, have to be well corrected for high-precision GNSS applications. Without a precise antenna phase center correction (PCC) model, the estimated position of a station monument can be biased by up to several centimeters. The Chinese large-scale research project "Crustal Movement Observation Network of China" (CMONOC), which requires high-precision positions across a comprehensive GPS observational network, motivated the establishment of an absolute field calibration facility for GPS receiver antennas at Wuhan University. In this paper, the calibration facilities are first introduced, and then the multipath elimination and PCV estimation strategies currently used are elaborated. The estimated PCV values of a test antenna are finally validated by comparison with the International GNSS Service (IGS) type values. Examples of TRM57971.00 NONE antenna calibrations from our calibration facility demonstrate that the derived PCVs and the IGS type mean values agree at the 1 mm level.

  14. 47. Historic American Buildings Survey Alex Bush, Photographer, October 16, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    47. Historic American Buildings Survey Alex Bush, Photographer, October 16, 1935 SLAVE CABIN #1 (WESTERNMOST) IN SHEETS, CABIN AT WEST SIDE OF PLOT, FACES EAST, GIRL'S DORMITORY IN REAR - University of Alabama, President's House, University Boulevard, Tuscaloosa, Tuscaloosa County, AL

  15. Quantitative analysis of titanium concentration using calibration-free laser-induced breakdown spectroscopy (LIBS)

    NASA Astrophysics Data System (ADS)

    Zaitun; Prasetyo, S.; Suliyanti, M. M.; Isnaeni; Herbani, Y.

    2018-03-01

    Laser-induced breakdown spectroscopy (LIBS) can be used for quantitative and qualitative analysis. Calibration-free LIBS (CF-LIBS) is a method to quantitatively analyze the concentration of elements in a sample under local thermodynamic equilibrium conditions without using a matrix-matched calibration. In this study, we apply CF-LIBS to the quantitative analysis of Ti in a TiO2 sample. TiO2 powder was mixed with polyvinyl alcohol and formed into pellets. An Nd:YAG pulsed laser at a wavelength of 1064 nm was focused onto the sample to generate a plasma. The plasma spectrum was recorded using a spectrophotometer and compared with NIST spectral lines to determine energy levels and other parameters. The plasma temperature obtained using a Boltzmann plot is 8127.29 K, and the calculated electron density is 2.49×10^16 cm^-3. Finally, the concentration of Ti in the TiO2 sample obtained in this study is 97%, in close agreement with the sample certificate.
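    The Boltzmann-plot temperature used here (and in CF-LIBS generally) comes from the slope of ln(Iλ/gA) against upper-level energy; a sketch with synthetic lines generated at a known temperature, not the paper's measured data:

```python
import numpy as np

K_B_EV = 8.617333262e-5  # Boltzmann constant in eV/K

def boltzmann_temperature(energy_ev, intensity, wavelength_nm, g, a_ki):
    """Plasma temperature (K) from the Boltzmann-plot slope:
    ln(I*lambda/(g*A)) = -E_k/(k_B*T) + const."""
    y = np.log(intensity * wavelength_nm / (g * a_ki))
    slope = np.polyfit(energy_ev, y, 1)[0]
    return -1.0 / (K_B_EV * slope)

# Synthetic emission lines generated at T = 8000 K:
T_TRUE = 8000.0
E   = np.array([3.2, 3.9, 4.6, 5.4])           # upper-level energies, eV
g   = np.array([9.0, 7.0, 5.0, 11.0])          # statistical weights
A   = np.array([6.6e7, 1.0e8, 5.0e7, 8.0e7])   # transition probabilities, 1/s
lam = np.array([382.0, 375.0, 390.0, 370.0])   # wavelengths, nm
I   = (g * A / lam) * np.exp(-E / (K_B_EV * T_TRUE))
```

    With real spectra, the scatter of the points about the fitted line indicates how well the LTE assumption holds.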

  16. Calibration of water-velocity meters

    USGS Publications Warehouse

    Kaehrle, William R.; Bowie, James E.

    1988-01-01

    The U.S. Geological Survey, Department of the Interior, as part of its responsibility to appraise the quantity of water resources in the United States, maintains facilities for the calibration of water-velocity meters at the Gulf Coast Hydroscience Center's Hydraulic Laboratory Facility, NSTL, Mississippi. These meters are used in hydrologic studies by the Geological Survey, U.S. Army Corps of Engineers, U.S. Department of Energy, state agencies, universities, and others in the public and private sector. This paper describes calibration facilities, types of water-velocity meters calibrated, and calibration standards, methods and results.

  17. Prediction of new onset of end stage renal disease in Chinese patients with type 2 diabetes mellitus - a population-based retrospective cohort study.

    PubMed

    Wan, Eric Yuk Fai; Fong, Daniel Yee Tak; Fung, Colman Siu Cheung; Yu, Esther Yee Tak; Chin, Weng Yee; Chan, Anca Ka Chun; Lam, Cindy Lo Kuen

    2017-08-01

    Since diabetes mellitus (DM) is the leading cause of end stage renal disease (ESRD), this study aimed to develop a 5-year ESRD risk prediction model among Chinese patients with Type 2 DM (T2DM) in primary care. A retrospective cohort study was conducted on 149,333 Chinese adult T2DM primary care patients without ESRD in 2010. Using the derivation cohort over a median of 5 years follow-up, gender-specific models including the interaction effect between predictors and age were derived using Cox regression with a forward stepwise approach. Harrell's C-statistic and calibration plots were applied to the validation cohort to assess discrimination and calibration of the models. The prediction models showed better discrimination, with Harrell's C-statistics of 0.866 (males) and 0.862 (females), and better calibration power from the plots than other established models. The predictors included age, usage of anti-hypertensive and anti-glucose drugs, Hemoglobin A1c, blood pressure, urine albumin/creatinine ratio (ACR) and estimated glomerular filtration rate (eGFR). Specific predictors for males were smoking and presence of sight-threatening diabetic retinopathy, while additional predictors for females included longer duration of diabetes and a quadratic effect of body mass index. Interaction factors with age showed a greater weighting of insulin and urine ACR in younger males, and eGFR in younger females. Our newly developed gender-specific models provide more accurate 5-year ESRD risk predictions for Chinese diabetic primary care patients than other existing models. The models included several modifiable risk factors that clinicians can use to counsel patients, and to target in the delivery of care to patients.
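    Harrell's C-statistic used to validate these models can be sketched as a naive pairwise concordance count over right-censored data (toy numbers; production libraries such as lifelines handle ties and efficiency properly):

```python
def harrells_c(time, event, risk):
    """Naive O(n^2) Harrell's C: among usable pairs (subject i has an
    observed event strictly before subject j's time), count how often
    the earlier-event subject carries the higher predicted risk."""
    concordant = tied = usable = 0
    n = len(time)
    for i in range(n):
        for j in range(n):
            if event[i] == 1 and time[i] < time[j]:
                usable += 1
                if risk[i] > risk[j]:
                    concordant += 1
                elif risk[i] == risk[j]:
                    tied += 1
    return (concordant + 0.5 * tied) / usable

# Toy follow-up data: times, event flags (1 = ESRD, 0 = censored), risks.
c = harrells_c([2, 4, 5, 7], [1, 1, 0, 1], [0.9, 0.3, 0.4, 0.2])
```

    A value of 0.5 indicates no discrimination and 1.0 perfect ranking; the reported 0.866 and 0.862 sit well above typical competing models.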

  18. The calibration analysis of soil infiltration formula in farmland scale

    NASA Astrophysics Data System (ADS)

    Qian, Tao; Han, Na Na; Chang, Shuan Ling

    2018-06-01

    Soil infiltration characteristics are an important basis for farmland-scale parameter estimation. This study draws on 12 groups of double-ring infiltration tests conducted in the test field of the west campus of Tianjin Agricultural University. Based on calibration theory combined with statistics, a calibration analysis of Philip's formula was carried out and the spatial variation characteristics of the calibration factor were analyzed. The results show that, in the study area, the calibration factor αA calculated from the stable soil infiltration rate A gives the best calibration effect and is suitable for calibrating the infiltration formula in this area; the coefficient of variation of αA is 0.3234, indicating a certain degree of spatial variability.
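    Philip's two-term formula and a ratio-style calibration factor can be sketched as follows (illustrative values, and one plausible reading of αA; the study's exact definition may differ):

```python
import math

def philip_rate(t_h, sorptivity, a_stable):
    """Philip's two-term infiltration rate f(t) = 0.5*S*t^(-1/2) + A,
    with t in hours, sorptivity S in cm/h^0.5, stable rate A in cm/h."""
    return 0.5 * sorptivity / math.sqrt(t_h) + a_stable

def calibration_factor(a_observed, a_reference):
    """A ratio-style calibration factor: locally observed stable rate
    over the formula's reference stable rate (assumed form of alpha_A)."""
    return a_observed / a_reference

f_1h   = philip_rate(1.0, sorptivity=2.0, a_stable=0.5)  # rate after 1 h, cm/h
alphaA = calibration_factor(0.6, 0.5)
```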

  19. A GIS tool for modelling annual diffuse infiltration on a plot scale

    NASA Astrophysics Data System (ADS)

    España, Salvador; Alcalá, Francisco J.; Vallejos, Ángela; Pulido-Bosch, Antonio

    2013-04-01

    ArcB is a GIS tool for modelling annual diffuse infiltration (RP) from precipitation (P) on a plot scale that uses ArcObjects as the programming language to incorporate equations and boundary conditions for the water-balance consistency. Because detailed weather, soil, and vegetation data are often missing, ArcB uses well-known non-global models such as Hargreaves for daily potential evapotranspiration and Budyko for annual actual evapotranspiration (EA), as well as the SCS Curve Number procedure for 24-h plot runoff (RO). Annual RP is quantified as the difference in annual P, EA, and RO. Because the use of non-global models for EA may induce suboptimal RP results, ArcB allows corrections of EA estimates by comparisons with data from a reference station. In a semiarid heterogeneous region in south-eastern Spain, the uncertainty of RO and RP was lowered to 4% and 2%, respectively, when correcting EA. ArcObjects is a versatile programming language which allows advanced users to incorporate more complex formulations for more accurate results as detailed data is acquired and to develop routines for calibration when reference data exist.
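    The two building blocks ArcB combines, the SCS Curve Number runoff procedure and the annual balance RP = P - EA - RO, can be sketched per cell as (illustrative values):

```python
def scs_runoff_mm(p_mm, cn):
    """SCS Curve Number direct runoff Q (mm) for a 24-h rainfall P (mm),
    with the usual initial abstraction Ia = 0.2*S."""
    s = 25400.0 / cn - 254.0          # potential maximum retention, mm
    ia = 0.2 * s
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

def diffuse_infiltration(p, ea, ro):
    """Annual diffuse infiltration as the water-balance residual RP = P - EA - RO."""
    return p - ea - ro

ro = scs_runoff_mm(60.0, 80.0)                 # runoff from a 60 mm event, CN = 80
rp = diffuse_infiltration(500.0, 400.0, 50.0)  # annual residual, mm
```

    In ArcB these balances run cell by cell over GIS layers, with EA corrected against a reference station before RP is computed.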

  20. High-resolution mapping of forest carbon stocks in the Colombian Amazon

    NASA Astrophysics Data System (ADS)

    Asner, G. P.; Clark, J. K.; Mascaro, J.; Galindo García, G. A.; Chadwick, K. D.; Navarrete Encinales, D. A.; Paez-Acosta, G.; Cabrera Montenegro, E.; Kennedy-Bowdoin, T.; Duque, Á.; Balaji, A.; von Hildebrand, P.; Maatoug, L.; Bernal, J. F. Phillips; Yepes Quintero, A. P.; Knapp, D. E.; García Dávila, M. C.; Jacobson, J.; Ordóñez, M. F.

    2012-07-01

    High-resolution mapping of tropical forest carbon stocks can assist forest management and improve implementation of large-scale carbon retention and enhancement programs. Previous high-resolution approaches have relied on field plot and/or light detection and ranging (LiDAR) samples of aboveground carbon density, which are typically upscaled to larger geographic areas using stratification maps. Such efforts often rely on detailed vegetation maps to stratify the region for sampling, but existing tropical forest maps are often too coarse and field plots too sparse for high-resolution carbon assessments. We developed a top-down approach for high-resolution carbon mapping in a 16.5 million ha region (> 40%) of the Colombian Amazon - a remote landscape seldom documented. We report on three advances for large-scale carbon mapping: (i) employing a universal approach to airborne LiDAR-calibration with limited field data; (ii) quantifying environmental controls over carbon densities; and (iii) developing stratification- and regression-based approaches for scaling up to regions outside of LiDAR coverage. We found that carbon stocks are predicted by a combination of satellite-derived elevation, fractional canopy cover and terrain ruggedness, allowing upscaling of the LiDAR samples to the full 16.5 million ha region. LiDAR-derived carbon maps have 14% uncertainty at 1 ha resolution, and the regional map based on stratification has 28% uncertainty in any given hectare. High-resolution approaches with quantifiable pixel-scale uncertainties will provide the most confidence for monitoring changes in tropical forest carbon stocks. Improved confidence will allow resource managers and decision makers to more rapidly and effectively implement actions that better conserve and utilize forests in tropical regions.
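    The LiDAR-to-carbon calibration step can be sketched as a power-law fit of plot carbon density against top-of-canopy height, here with synthetic data generated from a known exponent (the actual universal model also folds in regional basal-area and wood-density terms):

```python
import numpy as np

# Synthetic plot data generated from a known power law ACD = 0.3*TCH^1.8,
# so the log-log fit below should recover the exponent.
tch = np.array([10.0, 15.0, 20.0, 25.0, 30.0, 35.0])  # canopy height, m
acd = 0.3 * tch ** 1.8                                 # carbon density, Mg C/ha

b_exp, ln_a = np.polyfit(np.log(tch), np.log(acd), 1)

def predict_acd(tch_m):
    """Carbon density predicted from LiDAR top-of-canopy height."""
    return np.exp(ln_a) * tch_m ** b_exp
```

    Once fitted against a limited set of field plots, such a model can be applied wall-to-wall across the LiDAR coverage, which is what keeps the calibration burden low.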

  1. High-resolution Mapping of Forest Carbon Stocks in the Colombian Amazon

    NASA Astrophysics Data System (ADS)

    Asner, G. P.; Clark, J. K.; Mascaro, J.; Galindo García, G. A.; Chadwick, K. D.; Navarrete Encinales, D. A.; Paez-Acosta, G.; Cabrera Montenegro, E.; Kennedy-Bowdoin, T.; Duque, Á.; Balaji, A.; von Hildebrand, P.; Maatoug, L.; Bernal, J. F. Phillips; Knapp, D. E.; García Dávila, M. C.; Jacobson, J.; Ordóñez, M. F.

    2012-03-01

    High-resolution mapping of tropical forest carbon stocks can assist forest management and improve implementation of large-scale carbon retention and enhancement programs. Previous high-resolution approaches have relied on field plot and/or Light Detection and Ranging (LiDAR) samples of aboveground carbon density, which are typically upscaled to larger geographic areas using stratification maps. Such efforts often rely on detailed vegetation maps to stratify the region for sampling, but existing tropical forest maps are often too coarse and field plots too sparse for high-resolution carbon assessments. We developed a top-down approach for high-resolution carbon mapping in a 16.5 million ha region (>40%) of the Colombian Amazon - a remote landscape seldom documented. We report on three advances for large-scale carbon mapping: (i) employing a universal approach to airborne LiDAR-calibration with limited field data; (ii) quantifying environmental controls over carbon densities; and (iii) developing stratification- and regression-based approaches for scaling up to regions outside of LiDAR coverage. We found that carbon stocks are predicted by a combination of satellite-derived elevation, fractional canopy cover and terrain ruggedness, allowing upscaling of the LiDAR samples to the full 16.5 million ha region. LiDAR-derived carbon mapping samples had 14.6% uncertainty at 1 ha resolution, and regional maps based on stratification and regression approaches had 25.6% and 29.6% uncertainty, respectively, in any given hectare. High-resolution approaches with reported local-scale uncertainties will provide the most confidence for monitoring changes in tropical forest carbon stocks. Improved confidence will allow resource managers and decision-makers to more rapidly and effectively implement actions that better conserve and utilize forests in tropical regions.

  2. Improved CRDS δ13C Stability Through New Calibration Application For CO2 And CH4

    NASA Astrophysics Data System (ADS)

    Rella, Chris; Arata, Caleb; Saad, Nabil; Leggett, Graham; Miles, Natasha; Richardson, Scott; Davis, Ken

    2015-04-01

    Stable carbon isotope ratio measurements of CO2 and CH4 provide valuable insight into global and regional sources and sinks of the two most important greenhouse gases. Methodologies based on Cavity Ring-Down Spectroscopy (CRDS) have been developed and are capable of delivering δ13C measurements with a precision better than 0.12 permil for CO2 and 0.4 permil for CH4 (1 hour window, 5 minute average). Here we present a method to further improve this measurement stability. We have developed a two-point calibration method which corrects for δ13C drift due to a dependence on carbon species concentration. This method calibrates for both carbon species concentration as well as δ13C. In addition, we further demonstrate that this added stability is especially valuable when using carbon isotope data in linear regression models such as Keeling plots, where even small amounts of error can be magnified to give inconclusive results. Furthermore, we show how this method is used to validate multiple instruments simultaneously and can be used to create the standard samples needed for field calibrations.

  3. Calibration plots for risk prediction models in the presence of competing risks.

    PubMed

    Gerds, Thomas A; Andersen, Per K; Kattan, Michael W

    2014-08-15

    A predicted risk of 17% can be called reliable if it can be expected that the event will occur to about 17 of 100 patients who all received a predicted risk of 17%. Statistical models can predict the absolute risk of an event such as cardiovascular death in the presence of competing risks such as death due to other causes. For personalized medicine and patient counseling, it is necessary to check that the model is calibrated in the sense that it provides reliable predictions for all subjects. There are three often encountered practical problems when the aim is to display or test if a risk prediction model is well calibrated. The first is lack of independent validation data, the second is right censoring, and the third is that when the risk scale is continuous, the estimation problem is as difficult as density estimation. To deal with these problems, we propose to estimate calibration curves for competing risks models based on jackknife pseudo-values that are combined with a nearest neighborhood smoother and a cross-validation approach to deal with all three problems. Copyright © 2014 John Wiley & Sons, Ltd.
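    The pseudo-value machinery in this paper handles censoring and competing risks; the underlying idea of a calibration curve, however, can be sketched without either complication as binned predicted risks compared against observed event frequencies. `calibration_curve` below is a hypothetical helper for uncensored binary outcomes, not the authors' method or software:

```python
import numpy as np

def calibration_curve(pred_risk, outcome, n_bins=10):
    """Group predicted risks into quantile bins and compare the mean
    predicted risk in each bin with the observed event frequency."""
    pred_risk = np.asarray(pred_risk, dtype=float)
    outcome = np.asarray(outcome, dtype=float)
    edges = np.quantile(pred_risk, np.linspace(0, 1, n_bins + 1))
    idx = np.clip(np.searchsorted(edges, pred_risk, side="right") - 1,
                  0, n_bins - 1)
    mean_pred, obs_freq = [], []
    for b in range(n_bins):
        mask = idx == b
        if mask.any():
            mean_pred.append(pred_risk[mask].mean())
            obs_freq.append(outcome[mask].mean())
    return np.array(mean_pred), np.array(obs_freq)

# A model is well calibrated when obs_freq tracks mean_pred in every bin.
rng = np.random.default_rng(0)
p = rng.uniform(0.05, 0.95, 5000)
y = rng.random(5000) < p        # outcomes drawn at the predicted risk
mp, of = calibration_curve(p, y)
```

Plotting `of` against `mp` and comparing with the diagonal gives the familiar calibration plot; the paper's contribution is replacing the raw outcomes with jackknife pseudo-values so the same idea works under right censoring and competing risks.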

  4. The effects from high-altitude storm discharges in Earth atmosphere

    NASA Astrophysics Data System (ADS)

    Kozak, L.; Odzimek, A.; Ivchenko, V.; Kozak, P.; Gala, I.; Lapchuk, V.

    2016-06-01

    The regularities of appearance of transient luminous events in the Earth's atmosphere and the features of their ground-based observations are considered. Using video observations obtained at the Institute of Geophysics of the Polish Academy of Sciences, the energy of the atmospheric afterglow from these processes in the visual wavelength range has been determined. The calibration curve was plotted using defocused images of Vega; the star's spectrum, the atmospheric absorption coefficient, and the characteristics of the observation camera were used.

  5. WestProPlus: a stochastic spreadsheet program for the management of all-aged Douglas-fir–hemlock forests in the Pacific Northwest.

    Treesearch

    Jingjing Liang; Joseph Buongiorno; Robert A. Monserud

    2006-01-01

    WestProPlus is an add-in program developed to work with Microsoft Excel to simulate the growth and management of all-aged Douglas-fir–western hemlock (Pseudotsuga menziesii (Mirb.) Franco–Tsuga heterophylla (Raf.) Sarg.) stands in Oregon and Washington. Its built-in growth model was calibrated from 2,706 permanent plots in the...

  6. Oculomotor Reflexes as a Test of Visual Dysfunctions in Cognitively Impaired Observers

    DTIC Science & Technology

    2012-10-01

    visual nystagmus much more robust. Because the absolute gaze is not measured in our paradigm (this would require a gaze calibration, involving...the dots were also drifting to the right. Gaze horizontal position is plotted along the y-axis. The red bar indicates a visual nystagmus event... [table fragment: reflex stimulus functions for visual nystagmus - luminance grating (low-level motion), equiluminant grating (color vision), contrast gratings]

  7. Short- and long-term responses of total soil organic carbon to harvesting in a northern hardwood forest

    Treesearch

    K. Johnson; F. N. Scatena; Y. Pan

    2010-01-01

    The long-term response of total soil organic carbon pools (‘total SOC’, i.e. soil and dead wood) to different harvesting scenarios in even-aged northern hardwood forest stands was evaluated using two soil carbon models, CENTURY and YASSO, that were calibrated with forest plot empirical data in the Green Mountains of Vermont. Overall, 13 different harvesting scenarios...

  8. Flow rate calibration to determine cell-derived microparticles and homogeneity of blood components.

    PubMed

    Noulsri, Egarit; Lerdwana, Surada; Kittisares, Kulvara; Palasuwan, Attakorn; Palasuwan, Duangdao

    2017-08-01

    Cell-derived microparticles (MPs) are currently of great interest for screening transfusion donors and blood components. However, the current approach to counting MPs is not affordable for routine laboratory use due to its high cost. The current study aimed to investigate the potential use of flow-rate calibration for counting MPs in whole blood, packed red blood cells (PRBCs), and platelet concentrates (PCs). The accuracy of flow-rate calibration was investigated by comparing the platelet counts of an automated counter and a flow-rate calibrator. The concentration of MPs and their origins in whole blood (n=100), PRBCs (n=100), and PCs (n=92) were determined using a FACSCalibur. The MPs' fold-changes were calculated to assess the homogeneity of the blood components. Comparing the platelet counts conducted by automated counting and flow-rate calibration showed an r2 of 0.6 (y=0.69x+97,620). The CVs of the within-run and between-run variations of flow-rate calibration were 8.2% and 12.1%, respectively. The Bland-Altman plot showed a mean bias of -31,142 platelets/μl. MP enumeration revealed both the difference in MP levels and their origins in whole blood, PRBCs, and PCs. Screening the blood components demonstrated high heterogeneity of the MP levels in PCs when compared to whole blood and PRBCs. The results of the present study suggest the accuracy and precision of flow-rate calibration for enumerating MPs. This flow-rate approach is affordable for assessing the homogeneity of MPs in blood components in routine laboratory practice. Copyright © 2017 Elsevier Ltd. All rights reserved.
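    A Bland-Altman analysis like the one reported here reduces to simple statistics on paired differences: the mean difference is the bias, and bias ± 1.96 standard deviations gives the 95% limits of agreement. A minimal sketch with made-up platelet counts, not the study's data:

```python
import numpy as np

def bland_altman(method_a, method_b):
    """Return the mean bias and 95% limits of agreement between two
    measurement methods (Bland-Altman analysis)."""
    a = np.asarray(method_a, dtype=float)
    b = np.asarray(method_b, dtype=float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired counts (platelets/uL), automated vs flow-rate:
auto = np.array([210e3, 180e3, 250e3, 300e3, 150e3])
flow = np.array([240e3, 200e3, 290e3, 320e3, 190e3])
bias, (lo, hi) = bland_altman(auto, flow)  # negative bias: auto < flow
```

The full plot additionally charts `diff` against the pairwise means `(a + b) / 2` to reveal any concentration-dependent bias.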

  9. Strategies for implementing Climate Smart Agriculture and creating marketable Greenhouse emission reduction credits, for small scale rice farmers in Asia

    NASA Astrophysics Data System (ADS)

    Ahuja, R.; Kritee, K.; Rudek, J.; Van Sanh, N.; Thu Ha, T.

    2014-12-01

    Industrial agriculture systems, mostly in developed and some emerging economies, are far different from the smallholder farms that dot the landscapes of Asia and Africa. At Environmental Defense Fund, along with our partners from non-governmental, corporate, academic and government sectors and farmers, we have worked actively in India and Vietnam for the last four years to better understand how small-scale farmers working on rice paddy (and other upland crops) cultivation can best deal with climate change. Some of the questions we have tried to answer are: What types of implementable best practices, both old and new, on small farm systems lend themselves to improved yields, farm incomes, climate resilience and mitigation? Can these practices be replicated everywhere, or is the change more landscape- and people-driven? What are the institutional, cultural, financial and risk-perception related barriers that prevent scaling up of these practices? How do we innovate and overcome these barriers? The research community needs to work more closely together and leverage multiple scientific, economic and policy disciplines to fully answer these questions. In the case of small farm systems, we find that it helps to follow certain steps if climate-smart (or low-carbon) farming programs are to succeed and the greenhouse credits generated are to be marketed: (1) demographic data collection and plot demarcation; (2) farmer networks and diaries; (3) rigorous baseline determination via surveys; (4) alternative-practice determination via consultation with local universities/experts; (5) measurements on representative plots for 3-4 years (including GHG emissions, yields, inputs, and economic and environmental savings) to help calibrate biogeochemical models and/or calculate regional emission factors; (6) propagation of alternative practices across the landscape via local NGOs/governments; and (7) recording of the parameters necessary to extrapolate representative-plot GHG emission reductions to all farmers in a given landscape under several existing and new carbon offset methodologies. In this presentation, we will discuss our initial encouraging results, on the basis of which our wider team now seeks to identify and recommend policies that will enable local governments to scale up climate-smart agriculture to larger jurisdictional levels.

  10. The Worth of a Sparrow: A Decision Case in University Research and Public Relations.

    ERIC Educational Resources Information Center

    Crookston, R. Kent; And Others

    1993-01-01

    The University of Minnesota trapped and killed birds to reduce bird damage to research grain plots. When the Animal Rights Coalition demanded the practice be stopped, the situation became a public controversy. Presents an abridged form of this case as a focus for consideration of research methods, interest group agenda, and the universities' role…

  11. Calculus detection calibration among dental hygiene faculty members utilizing dental endoscopy: a pilot study.

    PubMed

    Partido, Brian B; Jones, Archie A; English, Dana L; Nguyen, Carol A; Jacks, Mary E

    2015-02-01

    Dental and dental hygiene faculty members often do not provide consistent instruction in the clinical environment, especially in tasks requiring clinical judgment. From previous efforts to calibrate faculty members in calculus detection using typodonts, researchers have suggested using human subjects and emerging technology to improve consistency in clinical instruction. The purpose of this pilot study was to determine if a dental endoscopy-assisted training program would improve intra- and interrater reliability of dental hygiene faculty members in calculus detection. Training included an ODU 11/12 explorer, typodonts, and dental endoscopy. A convenience sample of six participants was recruited from the dental hygiene faculty at a California community college, and a two-group randomized experimental design was utilized. Intra- and interrater reliability was measured before and after calibration training. Pretest and posttest Kappa averages of all participants were compared using repeated measures (split-plot) ANOVA to determine the effectiveness of the calibration training on intra- and interrater reliability. The results showed that both kinds of reliability significantly improved for all participants and the training group improved significantly in interrater reliability from pretest to posttest. Calibration training was beneficial to these dental hygiene faculty members, especially those beginning with less than full agreement. This study suggests that calculus detection calibration training utilizing dental endoscopy can effectively improve interrater reliability of dental and dental hygiene clinical educators. Future studies should include human subjects, involve more participants at multiple locations, and determine whether improved rater reliability can be sustained over time.

  12. Scaling and the frequency dependence of Nyquist plot maxima of the electrical impedance of the human thigh.

    PubMed

    Shiffman, Carl

    2017-11-30

    To define and elucidate the properties of reduced-variable Nyquist plots. Non-invasive measurements of the electrical impedance of the human thigh. A retrospective analysis of the electrical impedances of 154 normal subjects measured over the past decade shows that 'scaling' of the Nyquist plots for human thigh muscles is a property shared by healthy thigh musculature, irrespective of subject and the length of muscle segment. Here the term scaling signifies the near and sometimes 'perfect' coalescence of the separate X versus R plots into one 'reduced' Nyquist plot by the simple expedient of dividing R and X by Xm, the value of X at the reactance maximum. To the extent allowed by noise levels one can say that there is one 'universal' reduced Nyquist plot for the thigh musculature of healthy subjects. There is one feature of the Nyquist curves which is not 'universal', however, namely the frequency fm at which the maximum in X is observed. That is found to vary from 10 to 100 kHz, depending on subject and segment length. Analysis shows, however, that the mean value of 1/fm is an accurately linear function of segment length, though there is a small subject-to-subject random element as well. Also, tracking the recovery of an otherwise healthy ankle-fracture patient demonstrates the clear superiority of measurements above about 800 kHz, where scaling is not observed, in contrast to measurements below about 400 kHz, where scaling is accurately obeyed. The ubiquity of 'scaling' casts new light on the interpretation of impedance results as they are used in electrical impedance myography and bioelectric impedance analysis.
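    The 'reduction' described, dividing both R and X by the reactance maximum, takes only a few lines to express. The segment data below are hypothetical; two curves that differ only by an overall scale factor coalesce after reduction:

```python
import numpy as np

def reduce_nyquist(R, X):
    """Scale a Nyquist plot (reactance X vs resistance R) by Xm,
    the value of X at the reactance maximum."""
    R = np.asarray(R, dtype=float)
    X = np.asarray(X, dtype=float)
    Xm = X.max()
    return R / Xm, X / Xm

# Two hypothetical muscle segments whose raw curves differ only in scale:
R1, X1 = np.array([10.0, 20.0, 30.0]), np.array([2.0, 5.0, 3.0])
R2, X2 = 2 * R1, 2 * X1          # same shape, longer segment
r1, x1 = reduce_nyquist(R1, X1)
r2, x2 = reduce_nyquist(R2, X2)  # after reduction the curves coincide
```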

  13. Comparison of icing cloud instruments for 1982-1983 icing season flight program

    NASA Technical Reports Server (NTRS)

    Ide, R. F.; Richter, G. P.

    1984-01-01

    A number of modern and old style liquid water content (LWC) and droplet sizing instruments were mounted on a DeHavilland DHC-6 Twin Otter and operated in natural icing clouds in order to determine their comparative operating characteristics and their limitations over a broad range of conditions. The evaluation period occurred during the 1982-1983 icing season from January to March 1983. Time histories of all instrument outputs were plotted and analyzed to assess instrument repeatability and reliability. Scatter plots were also generated for comparison of instruments. The measured LWC from four instruments differed by as much as 20 percent. The measured droplet size from two instruments differed by an average of three microns. The overall effort demonstrated the need for additional data, and for some means of calibrating these instruments to known standards.

  14. Prognostic models for renal cell carcinoma recurrence: external validation in a Japanese population.

    PubMed

    Utsumi, Takanobu; Ueda, Takeshi; Fukasawa, Satoshi; Komaru, Atsushi; Sazuka, Tomokazu; Kawamura, Koji; Imamoto, Takashi; Nihei, Naoki; Suzuki, Hiroyoshi; Ichikawa, Tomohiko

    2011-09-01

    The aim of the present study was to compare the accuracy of three prognostic models in predicting recurrence-free survival among Japanese patients who underwent nephrectomy for non-metastatic renal cell carcinoma (RCC). Patients originated from two centers: Chiba University Hospital (n = 152) and Chiba Cancer Center (n = 65). The following data were collected: age, sex, clinical presentation, Eastern Cooperative Oncology Group performance status, surgical technique, 1997 tumor-node-metastasis stage, clinical and pathological tumor size, histological subtype, disease recurrence, and progression. Three western models, including Yaycioglu's model, Cindolo's model and Kattan's nomogram, were used to predict recurrence-free survival. The predictive accuracy of these models was validated using Harrell's concordance index. Concordance indices were 0.795 and 0.745 for Kattan's nomogram, 0.700 and 0.634 for Yaycioglu's model, and 0.700 and 0.634 for Cindolo's model, respectively. Furthermore, the constructed calibration plots of Kattan's nomogram overestimated the predicted probability of recurrence-free survival after 5 years compared with the actual probability. Our findings suggest that despite working better than other predictive tools, Kattan's nomogram needs be used with caution when applied to Japanese patients who have undergone nephrectomy for non-metastatic RCC. © 2011 The Japanese Urological Association.

  15. An overview of the Sierra Ancha Experimental Forest's role in the free-air CO2 enrichment large wood decomposition experiment

    Treesearch

    Peter E. Koestner; Karen Koestner; Daniel G. Neary; Carl C. Trettin

    2012-01-01

    The Duke University FACE facility is located near Chapel Hill, in Orange County, North Carolina on the eastern edge of the North Carolina piedmont. The initial prototype plot was established in June, 1994 and eleven additional treatment plots were activated in August 1996 and operated until October, 2010. To date, 263 publications have reported on results from the...

  16. Universal properties of galactic rotation curves and a first principles derivation of the Tully-Fisher relation

    NASA Astrophysics Data System (ADS)

    O'Brien, James G.; Chiarelli, Thomas L.; Mannheim, Philip D.

    2018-07-01

    In a recent paper McGaugh, Lelli, and Schombert showed that in an empirical plot of the observed centripetal accelerations in spiral galaxies against those predicted by the Newtonian gravity of the luminous matter in those galaxies the data points occupied a remarkably narrow band. While one could summarize the mean properties of the band by drawing a single mean curve through it, by fitting the band with the illustrative conformal gravity theory with fits that fill out the width of the band we show here that the width of the band is just as physically significant. We show that at very low luminous Newtonian accelerations the plot can become independent of the luminous Newtonian contribution altogether, but still be non-trivial due to the contribution of matter outside of the galaxies (viz. the rest of the visible universe). We present a new empirical plot of the difference between the observed centripetal accelerations and the luminous Newtonian expectations as a function of distance from the centers of galaxies, and show that at distances greater than 10 kpc the plot also occupies a remarkably narrow band, one even close to constant. Using the conformal gravity theory we provide a first principles derivation of the empirical Tully-Fisher relation.

  17. Variability of Total Below Ground Carbon Allocation amongst Common Agricultural Land Management Practices: a Case Study

    NASA Astrophysics Data System (ADS)

    Wacha, K. M.; Papanicolaou, T.; Wilson, C. G.

    2010-12-01

    Field measurements and numerical models are currently being used to estimate quantities of Total Belowground Carbon Allocation (TBCA) for three representative land uses, viz. corn, soybeans, and prairie bromegrass for CRP (Conservation Reserve Program), of an agricultural Iowa sub-watershed located within the Clear Creek Watershed (CCW). Since it is difficult to measure TBCA directly, a mass balance approach has been implemented to estimate TBCA as follows: TBCA = FS + FE + Δ(CS + CR + CL) - FA, where the term FS denotes soil respiration; FE is the carbon content of the eroded/deposited soil; ΔCS, ΔCR, ΔCL denote the changes in carbon content of the mineral soil, plant roots, and litter layer, respectively; and FA is the aboveground litter fall of dead plant material to the soil. The terms are hypothesized to have a large impact on TBCA within agricultural settings due to intensive tillage practices, water-driven soil erosion/deposition, and high usage of fertilizer. To test our hypothesis, field measurements are being performed at the plot scale, replicating common agricultural land management practices. Soil respiration (FS) is being measured with an EGM-4 CO2 Gas Analyzer and SRC-1 Soil Respiration Chamber (PP Systems); soil moisture and temperature are recorded in the top 20 cm for each respective soil respiration measurement; and litter fall rates (FA) are acquired by collecting the residue in a calibrated pan. The change in carbon content of the soil (ΔCS), roots (ΔCR) and litter layer (ΔCL) are being analyzed by collecting soil samples throughout the life cycle of the plant. To determine the term FE for the three representative land management practices, a funnel collection system located at the plot outlet was used for collecting the eroded material after natural rainfall events.
Field measurements of TBCA at the plot scale via the mass balance approach are used to calibrate the numerical agronomic process model DAYCENT, which simulates the daily fluxes of carbon (CS) and soil respiration (FS) and incorporates a plant-growth model that allows the determination of the terms FA, CR, and CL. Once calibrated, DAYCENT can be used in conjunction with the Watershed Erosion Prediction Project (WEPP) model, which calculates erosion/deposition rates, to provide estimates of TBCA at a larger global scale.
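    The mass balance above is a direct sum of measured fluxes and stock changes. A minimal sketch with hypothetical annual fluxes (in, say, Mg C per ha per year), not field values from the study:

```python
def tbca(F_S, F_E, dC_S, dC_R, dC_L, F_A):
    """Total Belowground Carbon Allocation from the mass balance
        TBCA = F_S + F_E + Delta(C_S + C_R + C_L) - F_A
    F_S: soil respiration; F_E: carbon in eroded/deposited soil;
    dC_S, dC_R, dC_L: changes in mineral-soil, root, and litter-layer
    carbon; F_A: aboveground litter fall returned to the soil."""
    return F_S + F_E + (dC_S + dC_R + dC_L) - F_A

# Hypothetical annual values for one plot (illustrative only):
value = tbca(F_S=9.0, F_E=0.5, dC_S=1.2, dC_R=0.4, dC_L=-0.1, F_A=3.0)
```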

  18. GenomeD3Plot: a library for rich, interactive visualizations of genomic data in web applications.

    PubMed

    Laird, Matthew R; Langille, Morgan G I; Brinkman, Fiona S L

    2015-10-15

    A simple static image of genomes and associated metadata is very limiting, as researchers expect rich, interactive tools similar to the web applications found in the post-Web 2.0 world. GenomeD3Plot is a lightweight visualization library written in JavaScript using the D3 library. GenomeD3Plot provides a rich API to allow the rapid visualization of complex genomic data using a convenient standards-based JSON configuration file. When integrated into existing web services, GenomeD3Plot allows researchers to interact with data, dynamically alter the view, or even resize or reposition the visualization in their browser window. In addition, GenomeD3Plot has built-in functionality to export any resulting genome visualization in PNG or SVG format for easy inclusion in manuscripts or presentations. GenomeD3Plot is being utilized in the recently released IslandViewer 3 (www.pathogenomics.sfu.ca/islandviewer/) to visualize predicted genomic islands with other genome annotation data. However, its features enable it to be more widely applicable for dynamic visualization of genomic data in general. GenomeD3Plot is licensed under the GNU GPL v3 at https://github.com/brinkmanlab/GenomeD3Plot/. brinkman@sfu.ca. © The Author 2015. Published by Oxford University Press.

  19. Automated system for the calibration of magnetometers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Petrucha, Vojtech; Kaspar, Petr; Ripka, Pavel

    2009-04-01

    A completely nonmagnetic calibration platform has been developed and constructed at DTU Space (Technical University of Denmark). It is intended for on-site scalar calibration of high-precision fluxgate magnetometers. An enhanced version of the same platform is being built at the Czech Technical University. There are three axes of rotation in this design (compared to two axes in the previous version). The addition of the third axis allows us to calibrate more complex devices. An electronic compass based on a vector fluxgate magnetometer and a micro-electro-mechanical systems (MEMS) accelerometer is one example. The new platform can also be used to evaluate the parameters of the compass in all possible variations in azimuth, pitch, and roll. The system is based on piezoelectric motors, which are placed on a platform made of aluminum, brass, plastic, and glass. Position sensing is accomplished through custom-made optical incremental sensors. The system is controlled by a microcontroller, which executes commands from a computer. The properties of the system as well as calibration and measurement results will be presented.

  20. First Results of Field Absolute Calibration of the GPS Receiver Antenna at Wuhan University

    PubMed Central

    Hu, Zhigang; Zhao, Qile; Chen, Guo; Wang, Guangxing; Dai, Zhiqiang; Li, Tao

    2015-01-01

    GNSS receiver antenna phase center variations (PCVs), which arise from the non-spherical phase response of GNSS signals, have to be well corrected for high-precision GNSS applications. Without a precise antenna phase center correction (PCC) model, the estimated position of a station monument can be biased by up to several centimeters. The Chinese large-scale research project "Crustal Movement Observation Network of China" (CMONOC), which requires high-precision positions in a comprehensive GPS observational network, motivated the establishment of an absolute field calibration facility for GPS receiver antennas at Wuhan University. In this paper, the calibration facilities are first introduced, and then the multipath-elimination and PCV-estimation strategies currently used are elaborated. Validation of the estimated PCVs of a test antenna is finally conducted against the International GNSS Service (IGS) type values. Examples of TRM57971.00 NONE antenna calibrations from our calibration facility demonstrate that the derived PCVs and the IGS type mean values agree at the 1 mm level. PMID:26580616

  1. On-plot drinking water supplies and health: A systematic review.

    PubMed

    Overbo, Alycia; Williams, Ashley R; Evans, Barbara; Hunter, Paul R; Bartram, Jamie

    2016-07-01

    Many studies have found that household access to water supplies near or within the household plot can reduce the probability of diarrhea, trachoma, and other water-related diseases, and it is generally accepted that on-plot water supplies produce health benefits for households. However, the body of research literature has not been analyzed to weigh the evidence supporting this. A systematic review was conducted to investigate the impacts of on-plot water supplies on diarrhea, trachoma, child growth, and water-related diseases, to further examine the relationship between household health and distance to water source and to assess whether on-plot water supplies generate health gains for households. Studies provide evidence that households with on-plot water supplies experience fewer diarrheal and helminth infections and greater child height. Findings suggest that water-washed (hygiene associated) diseases are more strongly impacted by on-plot water access than waterborne diseases. Few studies analyzed the effects of on-plot water access on quantity of domestic water used, hygiene behavior, and use of multiple water sources, and the lack of evidence for these relationships reveals an important gap in current literature. The review findings indicate that on-plot water access is a useful health indicator and benchmark for the progressive realization of the Sustainable Development Goal target of universal safe water access as well as the human right to safe water. Copyright © 2016 Elsevier GmbH. All rights reserved.

  2. Learning analytics: Dataset for empirical evaluation of entry requirements into engineering undergraduate programs in a Nigerian university.

    PubMed

    Odukoya, Jonathan A; Popoola, Segun I; Atayero, Aderemi A; Omole, David O; Badejo, Joke A; John, Temitope M; Olowo, Olalekan O

    2018-04-01

    In Nigerian universities, enrolment into any engineering undergraduate program requires that the minimum entry criteria established by the National Universities Commission (NUC) must be satisfied. Candidates seeking admission to study engineering discipline must have reached a predetermined entry age and met the cut-off marks set for Senior School Certificate Examination (SSCE), Unified Tertiary Matriculation Examination (UTME), and the post-UTME screening. However, limited effort has been made to show that these entry requirements eventually guarantee successful academic performance in engineering programs because the data required for such validation are not readily available. In this data article, a comprehensive dataset for empirical evaluation of entry requirements into engineering undergraduate programs in a Nigerian university is presented and carefully analyzed. A total sample of 1445 undergraduates that were admitted between 2005 and 2009 to study Chemical Engineering (CHE), Civil Engineering (CVE), Computer Engineering (CEN), Electrical and Electronics Engineering (EEE), Information and Communication Engineering (ICE), Mechanical Engineering (MEE), and Petroleum Engineering (PET) at Covenant University, Nigeria were randomly selected. Entry age, SSCE aggregate, UTME score, Covenant University Scholastic Aptitude Screening (CUSAS) score, and the Cumulative Grade Point Average (CGPA) of the undergraduates were obtained from the Student Records and Academic Affairs unit. In order to facilitate evidence-based evaluation, the robust dataset is made publicly available in a Microsoft Excel spreadsheet file. On a yearly basis, first-order descriptive statistics of the dataset are presented in tables. Box plot representations, frequency distribution plots, and scatter plots of the dataset are provided to enrich its value.
Furthermore, correlation and linear regression analyses are performed to understand the relationship between the entry requirements and the corresponding academic performance in engineering programs. The data provided in this article will help Nigerian universities, the NUC, engineering regulatory bodies, and relevant stakeholders to objectively evaluate and subsequently improve the quality of engineering education in the country.
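    The correlation and linear regression analyses described can be reproduced on the published spreadsheet with standard tools. A minimal sketch using hypothetical UTME scores and CGPAs (illustrative numbers, not values from the dataset):

```python
import numpy as np

# Hypothetical entry scores and final CGPAs for six students; the real
# data live in the article's Excel file.
utme = np.array([210, 230, 250, 270, 290, 310], dtype=float)
cgpa = np.array([2.8, 3.1, 3.3, 3.6, 4.0, 4.3])

slope, intercept = np.polyfit(utme, cgpa, 1)   # least-squares line
r = np.corrcoef(utme, cgpa)[0, 1]              # Pearson correlation
predicted = slope * utme + intercept           # fitted regression line
```

A strong positive `r` would support the entry criterion's predictive value; a weak one would motivate revisiting the cut-off marks.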

  3. Cyclic injection, storage, and withdrawal of heated water in a sandstone aquifer at St. Paul, Minnesota--Analysis of thermal data and nonisothermal modeling of short-term test cycles

    USGS Publications Warehouse

    Miller, Robert T.; Delin, G.N.

    2002-01-01

    In May 1980, the University of Minnesota began a project to evaluate the feasibility of storing heated water (150 degrees Celsius) in the Franconia-Ironton-Galesville aquifer (183 to 245 meters below land surface) and later recovering it for space heating. The University's steam-generation facilities supplied high-temperature water for injection. The Aquifer Thermal-Energy Storage system is a doublet-well design in which the injection-withdrawal wells are spaced approximately 250 meters apart. Water was pumped from one of the wells through a heat exchanger, where heat was added or removed. This water was then injected back into the aquifer through the other well. Four short-term test cycles were completed. Each cycle consisted of approximately equal durations of injection and withdrawal ranging from 5.25 to 8.01 days. Equal rates of injection and withdrawal, ranging from 17.4 to 18.6 liters per second, were maintained for each short-term test cycle. Average injection temperatures ranged from 88.5 to 117.9 degrees Celsius. Temperature graphs for selected depths at individual observation wells indicate that the Ironton and Galesville Sandstones received and stored more thermal energy than the upper part of the Franconia Formation. Clogging of the Ironton Sandstone was possibly due to precipitation of calcium carbonate, movement of fine-grained material, or both. Vertical-profile plots indicate that the effects of buoyancy flow were small within the aquifer. A three-dimensional, anisotropic, nonisothermal ground-water-flow and thermal-energy-transport model was constructed to simulate the four short-term test cycles. The model was used to simulate the entire short-term testing period of approximately 400 days. The only model properties varied during model calibration were longitudinal and transverse thermal dispersivities, which, for final calibration, were simulated as 3.3 and 0.33 meters, respectively.
The model was calibrated by comparing model-computed results to (1) measured temperatures at selected altitudes in four observation wells, (2) measured temperatures at the production well, and (3) calculated thermal efficiencies of the aquifer. Model-computed withdrawal-water temperatures were within an average of about 3 percent of measured values and model-computed aquifer-thermal efficiencies were within an average of about 5 percent of calculated values for the short-term test cycles. These data indicate that the model accurately simulated thermal-energy storage within the Franconia-Ironton-Galesville aquifer.

  4. Effect of Using Extreme Years in Hydrologic Model Calibration Performance

    NASA Astrophysics Data System (ADS)

    Goktas, R. K.; Tezel, U.; Kargi, P. G.; Ayvaz, T.; Tezyapar, I.; Mesta, B.; Kentel, E.

    2017-12-01

    Hydrological models are useful for predicting system behaviour and developing management strategies to control it. Specifically, they can be used for evaluating streamflow at ungauged catchments, assessing the effects of climate change or best management practices on water resources, or identifying pollution sources in a watershed. This study is part of a TUBITAK project named "Development of a geographical information system based decision-making tool for water quality management of Ergene Watershed using pollutant fingerprints". Within the scope of this project, water resources in the Ergene Watershed are first characterized. Streamgages in the basin are identified and daily streamflow measurements are obtained from the State Hydraulic Works of Turkey. Streamflow data are analysed using box-whisker plots, hydrographs and flow-duration curves, focusing on identification of extreme periods, dry or wet. Then a hydrological model is developed for the Ergene Watershed using HEC-HMS in the Watershed Modeling System (WMS) environment. The model is calibrated for various time periods, including dry and wet ones, and the performance of calibration is evaluated using Nash-Sutcliffe Efficiency (NSE), correlation coefficient, percent bias (PBIAS) and root mean square error. It is observed that the calibration period affects model performance, and the main purpose of the hydrological model's development should guide calibration period selection. Acknowledgement: This study is funded by The Scientific and Technological Research Council of Turkey (TUBITAK) under Project Number 115Y064.
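    The calibration performance metrics named above have standard definitions and can be sketched as follows, applied to hypothetical observed/simulated streamflow values rather than the project's data.

```python
# Minimal sketch of NSE, PBIAS, and RMSE for hydrologic model calibration,
# using hypothetical daily flows (m3/s), not the Ergene Watershed data.
from math import sqrt
from statistics import mean

def nse(obs, sim):
    """Nash-Sutcliffe Efficiency: 1 - SSE / variance of the observations."""
    mo = mean(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    var = sum((o - mo) ** 2 for o in obs)
    return 1.0 - sse / var

def pbias(obs, sim):
    """Percent bias: positive values indicate model underestimation."""
    return 100.0 * sum(o - s for o, s in zip(obs, sim)) / sum(obs)

def rmse(obs, sim):
    """Root mean square error of simulated vs observed values."""
    return sqrt(mean([(o - s) ** 2 for o, s in zip(obs, sim)]))

obs = [12.0, 30.0, 55.0, 24.0, 9.0]   # hypothetical observed flows
sim = [10.0, 33.0, 50.0, 26.0, 8.0]   # hypothetical simulated flows
```

An NSE near 1 indicates a close match to observations; NSE at or below 0 means the model predicts no better than the observed mean.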

  5. Fresh Biomass Estimation in Heterogeneous Grassland Using Hyperspectral Measurements and Multivariate Statistical Analysis

    NASA Astrophysics Data System (ADS)

    Darvishzadeh, R.; Skidmore, A. K.; Mirzaie, M.; Atzberger, C.; Schlerf, M.

    2014-12-01

    Accurate estimation of grassland biomass at peak productivity can provide crucial information regarding the functioning and productivity of rangelands. Hyperspectral remote sensing has proved valuable for estimating vegetation biophysical parameters such as biomass using different statistical techniques. However, in statistical analysis of hyperspectral data, multicollinearity is a common problem because of the large number of correlated hyperspectral reflectance measurements. The aim of this study was to examine the prospect of above-ground biomass estimation in a heterogeneous Mediterranean rangeland employing multivariate calibration methods. Canopy spectral measurements were made in the field using a GER 3700 spectroradiometer, along with concomitant in situ measurements of above-ground biomass for 170 sample plots. Multivariate calibration methods including partial least squares regression (PLSR), principal component regression (PCR), and least-squares support vector machines (LS-SVM) were used to estimate the above-ground biomass. The prediction accuracy of the multivariate calibration methods was assessed using cross-validated R2 and RMSE. The best model performance was obtained using LS-SVM and then PLSR, both calibrated with the first-derivative reflectance dataset, with R2cv = 0.88 & 0.86 and RMSEcv = 1.15 & 1.07, respectively. The weakest prediction accuracy appeared when PCR was used (R2cv = 0.31 and RMSEcv = 2.48). The obtained results highlight the importance of multivariate calibration methods for biomass estimation when hyperspectral data are used.
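    The cross-validated R2 and RMSE used to compare the calibration methods can be sketched as below. A simple univariate least-squares model stands in for PLSR/PCR/LS-SVM, and the reflectance and biomass values are hypothetical, not the field measurements.

```python
# Leave-one-out cross-validated R2 and RMSE, with a univariate least-squares
# model standing in for the multivariate calibrations (hypothetical data).
from math import sqrt
from statistics import mean

def fit_line(x, y):
    """Least-squares slope and intercept for y = a*x + b."""
    mx, my = mean(x), mean(y)
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) \
        / sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx

def loo_cv(x, y):
    """Leave-one-out cross-validation: return (R2cv, RMSEcv)."""
    preds = []
    for i in range(len(x)):
        xt, yt = x[:i] + x[i + 1:], y[:i] + y[i + 1:]
        a, b = fit_line(xt, yt)          # refit without sample i
        preds.append(a * x[i] + b)       # predict the held-out sample
    my = mean(y)
    ss_res = sum((yi - pi) ** 2 for yi, pi in zip(y, preds))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1.0 - ss_res / ss_tot, sqrt(ss_res / len(y))

refl = [0.12, 0.18, 0.25, 0.31, 0.40, 0.47]   # hypothetical reflectance
biomass = [1.1, 1.9, 2.6, 3.2, 4.1, 4.8]      # hypothetical biomass (t/ha)
r2cv, rmsecv = loo_cv(refl, biomass)
```

Because each prediction is made for a sample excluded from fitting, R2cv and RMSEcv penalize overfitting in a way in-sample statistics do not.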

  6. Quantification of calcium using localized normalization on laser-induced breakdown spectroscopy data

    NASA Astrophysics Data System (ADS)

    Sabri, Nursalwanie Mohd; Haider, Zuhaib; Tufail, Kashif; Aziz, Safwan; Ali, Jalil; Wahab, Zaidan Abdul; Abbas, Zulkifly

    2017-03-01

    This paper focuses on localized normalization for improved calibration curves in laser-induced breakdown spectroscopy (LIBS) measurements. The calibration curves were obtained using five samples containing different concentrations of calcium (Ca) in a potassium bromide (KBr) matrix. The work utilized a Q-switched Nd:YAG laser installed in a LIBS2500plus system, operated at the fundamental wavelength with a laser energy of 650 mJ. The gate delay was optimized from the signal-to-background ratio (SBR) of the Ca II 315.9 and 317.9 nm lines, the optimum conditions being those giving high spectral intensity and SBR. The ionic and atomic emission lines of Ca were most intense at a gate delay of 0.83 µs, whereas the SBR-optimized gate delay was 5.42 µs for both Ca II spectral lines. Calibration curves were constructed in three ways: from the original intensities measured in the LIBS experiments, from normalized intensities, and from locally normalized intensities of the spectral lines. The R2 values of the calibration curves plotted using locally normalized intensities of the Ca I 610.3, 612.2 and 616.2 nm spectral lines are 0.96329, 0.97042, and 0.96131, respectively. The improvement in the regression coefficients of the calibration curves allows more accurate analysis in LIBS. At the request of all authors of the paper, and with the agreement of the Proceedings Editor, an updated version of this article was published on 24 May 2017.

  7. Universal test fixture for monolithic mm-wave integrated circuits calibrated with an augmented TRD algorithm

    NASA Technical Reports Server (NTRS)

    Romanofsky, Robert R.; Shalkhauser, Kurt A.

    1989-01-01

    The design and evaluation of a novel fixturing technique for characterizing millimeter wave solid state devices is presented. The technique utilizes a cosine-tapered ridge guide fixture and a one-tier de-embedding procedure to produce accurate and repeatable device level data. Advanced features of this technique include nondestructive testing, full waveguide bandwidth operation, universality of application, and rapid, yet repeatable, chip-level characterization. In addition, only one set of calibration standards is required regardless of the device geometry.

  8. Correction of microplate location effects improves performance of the thrombin generation test

    PubMed Central

    2013-01-01

    Background The microplate-based thrombin generation test (TGT) is widely used as a clinical measure of global hemostatic potential, and it has become a useful tool for control of drug potency and quality by drug manufacturers. However, the convenience of the microtiter plate technology can be deceiving: microplate assays are prone to location-based variability in different parts of the microtiter plate. Methods In this report, we evaluated the well-to-well consistency of a TGT variant specifically applied to the quantitative detection of thrombogenic substances in an immune globulin product. We also studied the utility of previously described microplate layout designs in the TGT experiment. Results Location of the sample on the microplate (location effect) contributes to the variability of TGT measurements. Manual pipetting techniques and applications of the TGT to the evaluation of procoagulant enzymatic substances are especially sensitive. The effects were not sensitive to temperature or choice of microplate reader. The smallest location effects were observed with an automated dispenser-based calibrated thrombogram instrument. Even for an automated instrument, the use of a calibration curve resulted in up to 30% bias in thrombogenic potency assignment. Conclusions A symmetrical version of the strip-plot layout was demonstrated to minimize location artifacts even under worst-case conditions. Strip-plot layouts are required for quantitative thrombin-generation-based bioassays used in the biotechnological field. PMID:23829491

  9. Correction of microplate location effects improves performance of the thrombin generation test.

    PubMed

    Liang, Yideng; Woodle, Samuel A; Shibeko, Alexey M; Lee, Timothy K; Ovanesov, Mikhail V

    2013-07-05

    The microplate-based thrombin generation test (TGT) is widely used as a clinical measure of global hemostatic potential, and it has become a useful tool for control of drug potency and quality by drug manufacturers. However, the convenience of the microtiter plate technology can be deceiving: microplate assays are prone to location-based variability in different parts of the microtiter plate. In this report, we evaluated the well-to-well consistency of a TGT variant specifically applied to the quantitative detection of thrombogenic substances in an immune globulin product. We also studied the utility of previously described microplate layout designs in the TGT experiment. Location of the sample on the microplate (location effect) contributes to the variability of TGT measurements. Manual pipetting techniques and applications of the TGT to the evaluation of procoagulant enzymatic substances are especially sensitive. The effects were not sensitive to temperature or choice of microplate reader. The smallest location effects were observed with an automated dispenser-based calibrated thrombogram instrument. Even for an automated instrument, the use of a calibration curve resulted in up to 30% bias in thrombogenic potency assignment. A symmetrical version of the strip-plot layout was demonstrated to minimize location artifacts even under worst-case conditions. Strip-plot layouts are required for quantitative thrombin-generation-based bioassays used in the biotechnological field.

  10. Quantitative analysis of polyhexamethylene guanidine (PHMG) oligomers via matrix-assisted laser desorption/ionization time-of-flight mass spectrometry with an ionic-liquid matrix.

    PubMed

    Yoon, Donhee; Lee, Dongkun; Lee, Jong-Hyeon; Cha, Sangwon; Oh, Han Bin

    2015-01-30

    Quantifying polymers by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOFMS) with a conventional crystalline matrix generally suffers from poor sample-to-sample or shot-to-shot reproducibility. An ionic-liquid matrix has been demonstrated to mitigate these reproducibility issues by providing a homogeneous sample surface, which is useful for quantifying polymers. In the present study, we evaluated the use of an ionic-liquid matrix, i.e., 1-methylimidazolium α-cyano-4-hydroxycinnamate (1-MeIm-CHCA), to quantify polyhexamethylene guanidine (PHMG), a substance that poses a critical health hazard when inhaled in the form of droplets. MALDI-TOF mass spectra were acquired for PHMG oligomers using a variety of ionic-liquid matrices including 1-MeIm-CHCA. Calibration curves were constructed by plotting the sum of the PHMG oligomer peak areas versus PHMG sample concentration with a variety of peptide internal standards. Compared with the conventional crystalline matrix, the 1-MeIm-CHCA ionic-liquid matrix had much better reproducibility (lower standard deviations). Furthermore, by using an internal peptide standard, good linear calibration plots could be obtained over a range of PHMG concentrations spanning at least 4 orders of magnitude. This study successfully demonstrated that PHMG samples can be quantitatively characterized by MALDI-TOFMS with an ionic-liquid matrix and an internal standard. Copyright © 2014 John Wiley & Sons, Ltd.
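    An internal-standard calibration spanning several orders of magnitude is usually fit on log-log axes. The sketch below uses hypothetical concentration/area-ratio pairs (not the published data) to illustrate the procedure.

```python
# Sketch of internal-standard calibration: the summed oligomer peak area is
# divided by the internal-standard peak area, and the ratio is fit against
# concentration on log-log axes (hypothetical values, not measured data).
from math import log10

# hypothetical (concentration, analyte/internal-standard peak-area ratio)
points = [(1e0, 0.011), (1e1, 0.10), (1e2, 1.05), (1e3, 9.8), (1e4, 101.0)]

logc = [log10(c) for c, _ in points]
logr = [log10(r) for _, r in points]
n = len(points)
mx, my = sum(logc) / n, sum(logr) / n
slope = sum((x - mx) * (y - my) for x, y in zip(logc, logr)) \
        / sum((x - mx) ** 2 for x in logc)
intercept = my - slope * mx

def predict_conc(ratio):
    """Invert the calibration line to estimate concentration from a ratio."""
    return 10 ** ((log10(ratio) - intercept) / slope)
```

A log-log slope near 1 indicates a directly proportional response; the fitted line is then inverted to quantify unknown samples.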

  11. Performance of a Nomogram Predicting Disease-Specific Survival After an R0 Resection for Gastric Cancer in Patients Receiving Postoperative Chemoradiation Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dikken, Johan L.; Department of Surgery, Leiden University Medical Center, Leiden; Coit, Daniel G.

    Purpose: The internationally validated Memorial Sloan-Kettering Cancer Center (MSKCC) gastric carcinoma nomogram was based on patients who underwent curative (R0) gastrectomy, without any other therapy. The purpose of the current study was to assess the performance of this gastric cancer nomogram in patients who received chemoradiation therapy after an R0 resection for gastric cancer. Methods and Materials: In a combined dataset of 76 patients from the Netherlands Cancer Institute (NKI) and 63 patients from MSKCC who received postoperative chemoradiation therapy (CRT) after an R0 gastrectomy, the nomogram was validated by means of the concordance index (CI) and a calibration plot. Results: The concordance index for the nomogram was 0.64, which was lower than the CI of the nomogram for patients who received no adjuvant therapy (0.80). In the calibration plot, observed survival was approximately 20% higher than the nomogram-predicted survival for patients receiving postoperative CRT. Conclusions: The MSKCC gastric carcinoma nomogram significantly underpredicted survival for patients in the current study, suggesting an impact of postoperative CRT on survival in patients who underwent an R0 resection for gastric cancer, which has been demonstrated by randomized controlled trials. This analysis stresses the need for updating nomograms with the incorporation of multimodal strategies.
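    The concordance index used for this validation can be sketched as below. For simplicity this version ignores censoring (Harrell's C handles censored data by discarding ambiguous pairs); the risk scores and survival times are illustrative, not patient data.

```python
# Minimal sketch of a concordance index: the fraction of usable patient
# pairs in which the patient with the higher predicted risk survives
# shorter. Censoring is ignored here for simplicity.
from itertools import combinations

def concordance_index(risk, time):
    """risk: predicted risk scores; time: observed survival times."""
    concordant, usable = 0.0, 0
    for i, j in combinations(range(len(risk)), 2):
        if time[i] == time[j]:
            continue                      # tied times: pair not usable
        usable += 1
        if risk[i] == risk[j]:
            concordant += 0.5             # tied predictions count half
        elif (risk[i] > risk[j]) == (time[i] < time[j]):
            concordant += 1               # higher risk, shorter survival
    return concordant / usable

risk = [0.9, 0.7, 0.4, 0.2]   # hypothetical nomogram risk predictions
time = [10, 14, 30, 28]       # hypothetical survival times (months)
ci = concordance_index(risk, time)
```

A CI of 0.5 corresponds to random predictions and 1.0 to perfect rank ordering, which is why the drop from 0.80 to 0.64 reported above signals degraded discrimination.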

  12. Analysis of a mesoscale infiltration and water seepage test in unsaturated fractured rock: Spatial variabilities and discrete fracture patterns

    USGS Publications Warehouse

    Zhou, Q.; Salve, R.; Liu, H.-H.; Wang, J.S.Y.; Hudson, D.

    2006-01-01

    A mesoscale (21 m in flow distance) infiltration and seepage test was recently conducted in a deep, unsaturated fractured rock system at the crossover point of two underground tunnels. Water was released from a 3 m × 4 m infiltration plot on the floor of an alcove in the upper tunnel, and seepage was collected from the ceiling of a niche in the lower tunnel. Significant temporal and (particularly) spatial variabilities were observed in both measured infiltration and seepage rates. To analyze the test results, a three-dimensional unsaturated flow model was used. A column-based scheme was developed to capture the heterogeneous hydraulic properties reflected by the observed spatial variabilities. Fracture permeability and the van Genuchten α parameter [van Genuchten, M.T., 1980. A closed-form equation for predicting the hydraulic conductivity of unsaturated soils. Soil Sci. Soc. Am. J. 44, 892-898] were calibrated for each rock column in the upper and lower hydrogeologic units in the test bed. The calibrated fracture properties for the infiltration and seepage zone enabled a good match between simulated and measured (spatially varying) seepage rates. The numerical model was also able to capture the general trend of the highly transient seepage processes through a discrete fracture network. The calibrated properties and measured infiltration/seepage rates were further compared with mapped discrete fracture patterns at the top and bottom boundaries. The measured infiltration rates and calibrated fracture permeability of the upper unit were found to be partially controlled by the fracture patterns on the infiltration plot (as indicated by their positive correlations with fracture density). However, no correlation could be established between measured seepage rates and the density of fractures mapped on the niche ceiling. This lack of correlation indicates the complexity of (preferential) unsaturated flow within the discrete fracture network. It also indicates that continuum-based modeling of unsaturated flow in fractured rock at the mesoscale or a larger scale need not be explicitly conditioned on discrete fracture patterns. © 2006 Elsevier B.V. All rights reserved.

  13. Diagnostic utility of appetite loss in addition to existing prediction models for community-acquired pneumonia in the elderly: a prospective diagnostic study in acute care hospitals in Japan.

    PubMed

    Takada, Toshihiko; Yamamoto, Yosuke; Terada, Kazuhiko; Ohta, Mitsuyasu; Mikami, Wakako; Yokota, Hajime; Hayashi, Michio; Miyashita, Jun; Azuma, Teruhisa; Fukuma, Shingo; Fukuhara, Shunichi

    2017-11-08

    Diagnosis of community-acquired pneumonia (CAP) in the elderly is often delayed because of atypical presentation and non-specific symptoms, such as appetite loss, falls and disturbance of consciousness. The aim of this study was to investigate the external validity of existing prediction models and the added value of non-specific symptoms for the diagnosis of CAP in elderly patients. Prospective cohort study. General medicine departments of three teaching hospitals in Japan. A total of 109 elderly patients who consulted for upper respiratory symptoms between 1 October 2014 and 30 September 2016. The reference standard for CAP was a chest radiograph evaluated by two certified radiologists. The existing models were externally validated for diagnostic performance by calibration plot and discrimination. To evaluate the added value of the non-specific symptoms over the existing prediction models, we developed an extended logistic regression model. Calibration, discrimination, category-free net reclassification improvement (NRI) and decision curve analysis (DCA) were investigated for the extended model. Among the existing models, the model by van Vugt demonstrated the best performance, with an area under the curve of 0.75 (95% CI 0.63 to 0.88); the calibration plot showed good fit despite a significant Hosmer-Lemeshow test (p=0.017). Among the non-specific symptoms, appetite loss had a positive likelihood ratio of 3.2 (2.0-5.3), a negative likelihood ratio of 0.4 (0.2-0.7) and an OR of 7.7 (3.0-19.7). Addition of appetite loss to the model by van Vugt led to improved calibration (p=0.48), an NRI of 0.53 (p=0.019) and higher net benefit by DCA. Information on appetite loss improved the performance of an existing model for the diagnosis of CAP in the elderly. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
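    Likelihood ratios like those quoted for appetite loss are derived from sensitivity and specificity in a 2x2 table. The counts below are hypothetical, chosen only to reproduce ratios of roughly the same magnitude, and are not the study's data.

```python
# Sketch of deriving LR+ and LR- for a binary symptom from a 2x2 table of
# symptom presence vs. a reference diagnosis (hypothetical counts).
def likelihood_ratios(tp, fn, fp, tn):
    """Return (LR+, LR-) from true/false positives and negatives."""
    sens = tp / (tp + fn)          # sensitivity
    spec = tn / (tn + fp)          # specificity
    return sens / (1 - spec), (1 - sens) / spec

# hypothetical counts: appetite loss present/absent vs CAP present/absent
lr_pos, lr_neg = likelihood_ratios(tp=28, fn=12, fp=15, tn=54)
```

An LR+ above ~3 moderately raises the post-test probability of disease, while an LR- around 0.4 moderately lowers it.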

  14. Radiometric Characterization Results for the IKONOS, Quickbird, and OrbView-3 Sensor

    NASA Technical Reports Server (NTRS)

    Holekamp, Kara; Aaron, David; Thome, Kurtis

    2006-01-01

    Radiometric calibration of commercial imaging satellite products is required to ensure that science and application communities better understand commercial imaging satellite properties. Inaccurate radiometric calibrations can lead to erroneous decisions and invalid conclusions and can limit intercomparisons with other systems. To address this calibration need, the NASA Applied Sciences Directorate (ASD) at Stennis Space Center established a commercial satellite imaging radiometric calibration team consisting of three independent groups: NASA ASD, the University of Arizona Remote Sensing Group, and South Dakota State University. Each group independently determined the absolute radiometric calibration coefficients of available high-spatial-resolution commercial 4-band multispectral products, in the visible through near-infrared spectrum, from GeoEye (formerly Space Imaging) IKONOS, DigitalGlobe QuickBird, and GeoEye (formerly ORBIMAGE) OrbView-3. Each team member employed some variant of a reflectance-based vicarious calibration approach, requiring ground-based measurements coincident with image acquisitions and radiative transfer calculations. Several study sites throughout the United States that covered a significant portion of each sensor's dynamic range were employed. Satellite at-sensor radiance values were compared to those estimated by each independent team member to evaluate each sensor's radiometric accuracy. The combined results of this evaluation provide the user community with an independent assessment of these sensors' absolute calibration values.

  15. Quantification of polyhydroxyalkanoates in mixed and pure cultures biomass by Fourier transform infrared spectroscopy: comparison of different approaches.

    PubMed

    Isak, I; Patel, M; Riddell, M; West, M; Bowers, T; Wijeyekoon, S; Lloyd, J

    2016-08-01

    Fourier transform infrared (FTIR) spectroscopy was used in this study for the rapid quantification of polyhydroxyalkanoates (PHA) in mixed and pure culture bacterial biomass. Three different statistical analysis methods (regression, partial least squares (PLS) and nonlinear) were applied to the FTIR data, and the results were plotted against the PHA values measured with the reference gas chromatography technique. All methods predicted PHA content in mixed culture biomass with comparable efficiency, indicated by similar residual values. The PHA content in these cultures ranged from low to medium (0-44 wt% of dried biomass content). However, for the analysis of the combined mixed and pure culture biomass with PHA concentrations ranging from low to high (0-93% of dried biomass content), the PLS method was the most efficient. This paper reports, for the first time, the use of a single calibration model constructed with a combination of mixed and pure cultures covering a wide PHA range for predicting PHA content in biomass. Currently no universal method exists for processing FTIR data for polyhydroxyalkanoate (PHA) quantification. This study compares three different methods of analysing FTIR data for quantification of PHAs in biomass. A new data-processing approach was proposed and the results were compared against existing literature methods. Most publications report PHA quantification over a medium range in pure culture. In our study, however, we encompassed both mixed and pure culture biomass containing a broader range of PHA in the calibration curve. The resulting prediction model is useful for rapid quantification of a wider range of PHA content in biomass. © 2016 The Society for Applied Microbiology.

  16. Variations of measured and simulated soil-loss amounts in a semiarid area in Turkey.

    PubMed

    Hacisalihoğlu, Sezgin

    2010-06-01

    The main goal of this research was soil-loss determination and comparison of plot measurement results with simulation model (universal soil loss equation (USLE)) results for different land uses and slope classes. The research took place in three different land-use types (Scotch pine forest, pasture land, and agricultural land) and two different slope classes (15-20%, 35-40%). Six measurement stations (one station per combination of land-use type and slope class) comprising 18 measurement plots in total were established, and soil-loss amounts were measured over the 3-year research period. The USLE simulation model was used to calculate soil-loss amounts for these measurement plots. The results indicated that measured (plot) and simulated (USLE) soil-loss amounts differ significantly in each land-use type and slope class.
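    The USLE referenced above is a simple multiplicative model, A = R · K · LS · C · P. The sketch below uses illustrative factor values, not the values derived in the study.

```python
# Sketch of the Universal Soil Loss Equation: annual soil loss A is the
# product of five empirical factors (illustrative values only).
def usle(R, K, LS, C, P):
    """Annual soil loss A from the Universal Soil Loss Equation."""
    return R * K * LS * C * P

# hypothetical factors, e.g. for a steep pasture plot
A = usle(R=85.0,   # rainfall erosivity
         K=0.32,   # soil erodibility
         LS=5.6,   # slope length-steepness factor
         C=0.10,   # cover-management factor
         P=1.0)    # support-practice factor (no conservation practice)
```

Because the factors multiply, a discrepancy in any single factor (e.g. C for forest vs. agricultural cover) scales the predicted loss directly, which is one reason simulated and plot-measured amounts can diverge by land-use type.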

  17. Recording wildlife locations with the Universal Transverse Mercator (UTM) grid system

    Treesearch

    T. G. Grubb; W. L. Eakle

    1988-01-01

    The Universal Transverse Mercator (UTM) international planar grid system is described and shown to offer greater simplicity, efficiency, and accuracy for plotting wildlife locations than the more familiar Latitude-Longitude (Latilong) and Section-Township-Range (Cadastral) systems and the State planar system. Use of the UTM system is explained with examples.
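    The UTM grid's zone structure can be sketched as follows: zones are 6 degrees of longitude wide, numbered 1-60 eastward from 180°W. (The full easting/northing computation is an ellipsoidal transverse Mercator projection and is omitted here.)

```python
# Sketch of mapping a longitude/latitude pair onto the UTM grid: zone
# number plus hemisphere. Polar and special Scandinavian zone exceptions
# are ignored in this simplified version.
def utm_zone(lon, lat):
    """Return the UTM zone number (1-60) and N/S hemisphere letter."""
    zone = int((lon + 180.0) // 6) + 1
    return zone, "N" if lat >= 0 else "S"

zone, hemi = utm_zone(-111.65, 35.19)   # near Flagstaff, Arizona
```

Within a zone, a location is then recorded as metric easting/northing coordinates, which is what makes UTM simpler for field plotting than Latilong or Cadastral descriptions.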

  18. A flux calibration device for the SuperNova Integral Field Spectrograph (SNIFS)

    NASA Astrophysics Data System (ADS)

    Lombardo, Simona; Aldering, Greg; Hoffmann, Akos; Kowalski, Marek; Kuesters, Daniel; Reif, Klaus; Rigault, Michael

    2014-07-01

    Observational cosmology employing optical surveys often requires precise flux calibration. In this context we present the SNIFS Calibration Apparatus (SCALA), a flux calibration system developed for the SuperNova Integral Field Spectrograph (SNIFS), operating at the University of Hawaii 2.2 m telescope. SCALA consists of a hexagonal array of 18 small parabolic mirrors distributed over the face of, and feeding parallel light to, the telescope entrance pupil. The mirrors are illuminated by integrating spheres and a wavelength-tunable (from UV to IR) light source, generating light beams with opening angles of 1°. These nearly parallel beams are flat and flux-calibrated at a subpercent level, enabling us to calibrate the "telescope + SNIFS" system at the required precision.

  19. Variety identification of brown sugar using short-wave near infrared spectroscopy and multivariate calibration

    NASA Astrophysics Data System (ADS)

    Yang, Haiqing; Wu, Di; He, Yong

    2007-11-01

    Near-infrared spectroscopy (NIRS) is a pollution-free, rapid, quantitative and qualitative analysis method characterized by high speed, non-destructiveness, high precision and reliable detection data. A new approach for variety discrimination of brown sugars using short-wave NIR spectroscopy (800-1050 nm) was developed in this work. The relationship between the absorbance spectra and brown sugar varieties was established. The spectral data were compressed by principal component analysis (PCA). The resulting features can be visualized in principal component (PC) space, which can lead to the discovery of structures correlated with the different classes of spectral samples, and appears to provide a reasonable variety clustering of brown sugars. The 2-D PC plot obtained using the first two PCs can be used for pattern recognition. Least-squares support vector machines (LS-SVM) was applied to solve the multivariate calibration problem in a relatively fast way. The work has shown that the short-wave NIR spectroscopy technique is suitable for brand identification of brown sugar, and that LS-SVM has better identification ability than PLS when the calibration set is small.

  20. Molecular constituents of colorectal cancer metastatic to the liver by imaging infrared spectroscopy

    NASA Astrophysics Data System (ADS)

    Coe, James V.; Chen, Zhaomin; Li, Ran; Nystrom, Steven V.; Butke, Ryan; Miller, Barrie; Hitchcock, Charles L.; Allen, Heather C.; Povoski, Stephen P.; Martin, Edward W.

    2015-03-01

    Infrared (IR) imaging spectroscopy of human liver tissue slices has been used to identify and characterize liver metastasis of colorectal origin which was surgically removed from a consenting patient and frozen without formalin fixation or dehydration procedures, so that lipids and water remain in the tissues. First, a k-means clustering analysis, using metrics from the IR spectra, identified groups within the image. The groups were identified as tumor or nontumor regions by comparing to an H and E stain of the same sample after IR imaging. Then, calibrant IR spectra of protein, several fats, glycogen, and polyvinyl alcohol were isolated by differencing spectra from different regions or groups in the image space. Finally, inner products (or scores) of the IR spectra at each pixel in the image with each of the various calibrants were calculated showing how the calibrant molecules vary in tumor and nontumor regions. In this particular case, glycogen and protein changes enable separation of tumor and nontumor regions as shown with a contour plot of the glycogen scores versus the protein scores.
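    The scoring step described above, projecting each pixel's IR spectrum onto calibrant spectra by an inner product, can be sketched with toy spectra. The five-point vectors below are illustrative only, not measured spectra.

```python
# Sketch of per-pixel calibrant scores: the inner product of a pixel
# spectrum with each calibrant spectrum (toy 5-point spectra).
def score(spectrum, calibrant):
    """Inner product of a pixel spectrum with a calibrant spectrum."""
    return sum(s * c for s, c in zip(spectrum, calibrant))

glycogen = [0.1, 0.8, 0.3, 0.0, 0.1]   # toy calibrant spectra
protein  = [0.0, 0.1, 0.2, 0.9, 0.4]

pixel = [0.2, 0.9, 0.4, 0.3, 0.2]      # toy pixel spectrum
scores = (score(pixel, glycogen), score(pixel, protein))
```

Plotting the glycogen score against the protein score for every pixel yields the kind of contour plot used above to separate tumor from nontumor regions.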

  1. Microbial biosensor for detection of methyl parathion using screen printed carbon electrode and cyclic voltammetry.

    PubMed

    Kumar, Jitendra; D'Souza, S F

    2011-07-15

    Whole cells of recombinant Escherichia coli were immobilized on a screen printed carbon electrode (SPCE) using glutaraldehyde. The recombinant E. coli exhibited high periplasmic expression of the organophosphorus hydrolase enzyme, which hydrolyzes methyl parathion into two products, p-nitrophenol and dimethyl thiophosphoric acid. The cell-immobilized SPCE was examined under SEM, then coupled with cyclic voltammetry, and cyclic voltammograms were recorded before and after hydrolysis of methyl parathion. Detection was calibrated based on the change in current observed at +0.1 V potential, arising from the redox behavior of the hydrolysis product p-nitrophenol. As the concentration of methyl parathion increased, the oxidation current also increased. Only 20 μl of sample was required for analysis. The detection range of the biosensor, taken from the linear range of the calibration plot, was 2-80 μM of methyl parathion. A single immobilized SPCE was reused for 32 reactions with retention of 80% of its initial enzyme activity. Copyright © 2011 Elsevier B.V. All rights reserved.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    MacFarlane, Michael; Battista, Jerry; Chen, Jeff

    Purpose: To develop a radiotherapy dose tracking and plan evaluation technique using cone-beam computed tomography (CBCT) images. Methods: We developed a patient-specific method of calibrating CBCT image sets for dose calculation. The planning CT was first registered with the CBCT using deformable image registration (DIR). A scatter plot was generated between the CT numbers of the planning CT and CBCT for each slice. The CBCT calibration curve was obtained by least-square fitting of the data, and applied to each CBCT slice. The calibrated CBCT was then merged with the original planning CT to extend the small field of view of the CBCT. Finally, the treatment plan was copied to the merged CT for dose tracking and plan evaluation. The proposed patient-specific calibration method was also compared to two methods proposed in the literature. To evaluate the accuracy of each technique, 15 head-and-neck patients requiring plan adaptation were arbitrarily selected from our institution. The original plan was calculated on each method's data set, including a second planning CT acquired within 48 hours of the CBCT (serving as the gold standard). Clinically relevant dose metrics and 3D gamma analysis of dose distributions were compared between the different techniques. Results: Compared to the gold standard of using planning CTs, the patient-specific CBCT calibration method was shown to provide promising results, with gamma pass rates above 95% and average dose metric agreement within 2.5%. Conclusions: The patient-specific CBCT calibration method could potentially be used for on-line dose tracking and plan evaluation, without requiring a re-planning CT session.
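    The per-slice calibration step described above can be sketched as below: after deformable registration, paired CT numbers from the planning CT and CBCT form a scatter, and a least-squares line maps CBCT values onto the planning-CT scale. The HU pairs are hypothetical, not patient data.

```python
# Sketch of per-slice CBCT calibration: fit ct = a * cbct + b by least
# squares on paired CT numbers, then apply it to the slice (hypothetical
# HU values, not patient data).
from statistics import mean

def fit_calibration(cbct_hu, ct_hu):
    """Least-squares line ct = a * cbct + b for one image slice."""
    mx, my = mean(cbct_hu), mean(ct_hu)
    a = sum((x - mx) * (y - my) for x, y in zip(cbct_hu, ct_hu)) \
        / sum((x - mx) ** 2 for x in cbct_hu)
    return a, my - a * mx

def calibrate_slice(cbct_hu, a, b):
    """Map every CBCT number in a slice onto the planning-CT scale."""
    return [a * x + b for x in cbct_hu]

cbct = [-980, -120, 0, 60, 450]     # hypothetical CBCT numbers
ct   = [-1000, -100, 10, 75, 480]   # corresponding planning-CT numbers
a, b = fit_calibration(cbct, ct)
corrected = calibrate_slice(cbct, a, b)
```

Fitting per slice, rather than one global curve, accommodates slice-dependent scatter artifacts in the CBCT.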

  3. Legato: Personal Computer Software for Analyzing Pressure-Sensitive Paint Data

    NASA Technical Reports Server (NTRS)

    Schairer, Edward T.

    2001-01-01

    'Legato' is personal computer software for analyzing radiometric pressure-sensitive paint (PSP) data. The software is written in the C programming language and executes under Windows 95/98/NT operating systems. It includes all operations normally required to convert pressure-paint image intensities to normalized pressure distributions mapped to physical coordinates of the test article. The program can analyze data from both single- and bi-luminophore paints and provides for both in situ and a priori paint calibration. In addition, there are functions for determining paint calibration coefficients from calibration-chamber data. The software is designed as a self-contained, interactive research tool that requires as input only the bare minimum of information needed to accomplish each function, e.g., images, model geometry, and paint calibration coefficients (for a priori calibration) or pressure-tap data (for in situ calibration). The program includes functions that can be used to generate needed model geometry files for simple model geometries (e.g., airfoils, trapezoidal wings, rotor blades) based on the model planform and airfoil section. All data files except images are in ASCII format and thus are easily created, read, and edited. The program does not use database files. This simplifies setup but makes the program inappropriate for analyzing massive amounts of data from production wind tunnels. Program output consists of Cartesian plots, false-colored real and virtual images, pressure distributions mapped to the surface of the model, assorted ASCII data files, and a text file of tabulated results. Graphical output is displayed on the computer screen and can be saved as publication-quality (PostScript) files.

  4. An analysis of new techniques for radiometric correction of LANDSAT-4 Thematic Mapper images. [Terrebonne Bay, Louisiana and Grand Bahamas scenes

    NASA Technical Reports Server (NTRS)

    Kogut, J.; Larduinat, E.; Fitzgerald, M.

    1983-01-01

    The utility of methods for generating TM RLUTs that can improve the quality of the resultant images was investigated. The TM-CCT-ADDS tape was changed to account for a different collection window for the calibration data. Several scenes of Terrebonne Bay, Louisiana and the Grand Bahamas were analyzed to evaluate the radiometric corrections operationally applied to the image data and to investigate several techniques for reducing striping in the images. Printer plots of the TM shutter data were produced, and detector statistics were compiled and plotted. These statistics included various combinations of the average shutter counts for each scan, before and after DC restore, for forward and reverse scans. Results show that striping is caused by detectors becoming saturated when they view a bright cloud, which depresses the DC restore level.

  5. Sound absorption coefficient in situ: an alternative for estimating soil loss factors.

    PubMed

    Freire, Rosane; Meletti de Abreu, Marco Henrique; Okada, Rafael Yuri; Soares, Paulo Fernando; Granhen Tavares, Célia Regina

    2015-01-01

    The relationship between the sound absorption coefficient and factors of the Universal Soil Loss Equation (USLE) was determined in a section of the Maringá Stream basin, Paraná State, by using erosion plots. In the field, four erosion plots were built on a reduced scale, with dimensions of 2.0 × 12.5 m. With respect to plot coverage, one was kept with bare soil and the others contained forage grass (Brachiaria), corn and wheat crops, respectively. Planting was performed without any type of conservation practice in an area with a 9% slope. A sedimentation tank was placed at the end of each plot to collect the material transported. For the acoustic system, pink noise was used in the measurement of the proposed monitoring, for collecting information on incident and reflected sound pressure levels. In general, obtained values of soil loss confirmed that 94.3% of material exported to the basin water came from the bare soil plot, 2.8% from the corn plot, 1.8% from the wheat plot, and 1.1% from the forage grass plot. With respect to the acoustic monitoring, results indicated that at 16 kHz erosion plot coverage type had a significant influence on the sound absorption coefficient. High correlation coefficients were found in estimations of the A and C factors of the USLE, confirming that the acoustic technique is feasible for the determination of soil loss directly in the field. Copyright © 2014 Elsevier B.V. All rights reserved.
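    For context, the USLE factors estimated in this study (A, the soil loss, and C, the cover-management factor) combine multiplicatively as A = R * K * LS * C * P. A minimal sketch with purely illustrative factor values (not the study's measurements):

```python
def usle_soil_loss(R, K, LS, C, P):
    """Universal Soil Loss Equation: A = R * K * LS * C * P.

    A  : average annual soil loss (e.g. t/ha/yr)
    R  : rainfall erosivity        K : soil erodibility
    LS : slope length/steepness    C : cover-management factor
    P  : support-practice factor."""
    return R * K * LS * C * P

# Illustrative values: bare soil has C = 1.0; no conservation practice, P = 1.0
bare = usle_soil_loss(R=6000, K=0.03, LS=1.2, C=1.0, P=1.0)
grass = usle_soil_loss(R=6000, K=0.03, LS=1.2, C=0.01, P=1.0)
print(bare, grass)  # the vegetated plot loses far less soil, via C alone
```

    This multiplicative structure is why the study's plot-coverage treatments map directly onto the C factor while the other factors stay fixed across plots.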

  6. Microwave Comb Generator

    NASA Technical Reports Server (NTRS)

    Sigman, E. H.

    1989-01-01

    Stable reference tones aid the testing and calibration of microwave receivers. The signal generator puts out stable tones in the frequency range of 2 to 10 GHz at all multiples of the reference input frequency, which can be any frequency up to 1 MHz. It is called a "comb generator" because its spectral plot resembles a comb. A DC reverse-bias current is switched on and off at 1 MHz to generate sharp pulses in a step-recovery diode. The microwave components are mounted on the back of a special connector containing a built-in attenuator. The generator is used in testing microwave and spread-spectrum wide-band receivers.

  7. Cheap DECAF: Density Estimation for Cetaceans from Acoustic Fixed Sensors Using Separate, Non-Linked Devices

    DTIC Science & Technology

    2015-09-30

    interpolation was used to estimate fin whale density in between the hydrophone locations , and the result plotted as a density image. This was repeated every 5...singing fin whale density throughout the year for the study location off Portugal. Color indicates whale density, with calibration scale at right; yellow...spots are hydrophone locations ; timeline at top indicates the time of year; circle at lower right is 1000 km 2 , the area used in the unit of whale

  8. Intensity measurement of automotive headlamps using a photometric vision system

    NASA Astrophysics Data System (ADS)

    Patel, Balvant; Cruz, Jose; Perry, David L.; Himebaugh, Frederic G.

    1996-01-01

    Requirements for automotive head lamp luminous intensity tests are introduced. The rationale for developing a non-goniometric photometric test system is discussed. The design of the Ford photometric vision system (FPVS) is presented, including hardware, software, calibration, and system use. Directional intensity plots and regulatory test results obtained from the system are compared to corresponding results obtained from a Ford goniometric test system. Sources of error for the vision system and goniometer are discussed. Directions for new work are identified.

  9. Design and Production of Color Calibration Targets for Digital Input Devices

    DTIC Science & Technology

    2000-07-01

    gamuts . Fourth, color transform form CIELCH to sRGB will be described. Fifth, the relevant target mockups will be created. Sixth, the quality will be...Implement statistical _ • process controls Print, process and measure •, reject Transfer the measured CIEXYZ of I the target patches to SRGB a Genterate...Kodak Royal VII paper and sRGB . This plot shows all points on the a*-b* plane without information about the L*. The sRGB’s color gamut is obtained from

  10. Spectroradiometric calibration of the thematic mapper and multispectral scanner system

    NASA Technical Reports Server (NTRS)

    Slater, P. N. (Principal Investigator); Palmer, J. M.

    1983-01-01

    The design of a spectroradiometer under construction for atmospheric and surface measurements at White Sands, New Mexico is described. The instrument's observation capability encompasses (1) measuring the solar radiance at a number of wavelengths as a function of air mass for Langley plot analysis in order to derive the optical depth; (2) measuring the ground radiance to determine the absolute ground reflectance; and (3) measuring the sky radiance as a method of checking the accuracy of the radiative transfer program.
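    Langley plot analysis, mentioned in item (1), fits ln(signal) against air mass: by Beer-Lambert attenuation, the slope is minus the total optical depth and the intercept gives the extraterrestrial signal. A minimal sketch with synthetic data (the tau and V0 values are assumed for illustration):

```python
import math
import numpy as np

def langley_fit(airmass, signal):
    """Langley plot: ln(signal) vs. air mass is a line with slope -tau.

    Returns (tau, V0) where V0 is the extrapolated extraterrestrial signal."""
    slope, intercept = np.polyfit(np.asarray(airmass, float),
                                  np.log(np.asarray(signal, float)), 1)
    return -slope, math.exp(intercept)

# Synthetic Beer-Lambert data with tau = 0.12 and V0 = 1.5 (arbitrary units)
m = np.array([1.0, 1.5, 2.0, 3.0, 4.0, 5.0])
v = 1.5 * np.exp(-0.12 * m)
tau, v0 = langley_fit(m, v)
print(tau, v0)  # recovers approximately 0.12 and 1.5
```

    Real Langley calibrations require a stable atmosphere over the morning or afternoon of measurement, which is why high-altitude, clear-sky sites such as White Sands are used.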

  11. Upgrading the Arecibo Potassium Lidar Receiver for Meridional Wind Measurements

    NASA Astrophysics Data System (ADS)

    Piccone, A. N.; Lautenbach, J.

    2017-12-01

    Lidar can be used to measure a plethora of variables: temperature, density of metals, and wind. This REU project focused on the setup of a semi-steerable telescope that will allow the measurement of meridional wind in the mesosphere (80-105 km) with Arecibo Observatory's potassium resonance lidar. This includes the basic design concept of a steering system able to turn the telescope to a maximum of 40°, alignment of the mirror with the telescope frame to find the correct focus, and the triggering and programming of a CCD camera. The CCD camera's purpose is twofold: to look through the telescope and match the stars in the field of view with a star map, in order to accurately calibrate the steering system, and to determine the laser beam properties and position. Using LabVIEW, the frames from the CCD camera can be analyzed to identify the most intense pixel in the image (and therefore the brightest point in the laser beam or stars) by plotting average pixel values per row and column and locating the peaks of these plots. The location of this pixel can then be plotted, determining the jitter in the laser and its position within the field of view of the telescope.
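    The peak-finding step described above (averaging pixel values per row and per column, then taking the peak of each profile) can be sketched as follows; the synthetic Gaussian "beam" frame is an assumption for illustration, not the project's LabVIEW code:

```python
import numpy as np

def brightest_pixel(frame):
    """Locate a compact bright source from row- and column-mean profiles.

    Returns (row, col) of the peaks of the per-row and per-column average
    pixel values, which coincide with the brightest spot for a single
    compact source such as a beam image or star."""
    row_profile = frame.mean(axis=1)  # one average value per row
    col_profile = frame.mean(axis=0)  # one average value per column
    return int(np.argmax(row_profile)), int(np.argmax(col_profile))

# Synthetic frame with a Gaussian "beam" centered at row 30, column 45
yy, xx = np.mgrid[0:64, 0:64]
frame = np.exp(-((yy - 30) ** 2 + (xx - 45) ** 2) / 20.0)
print(brightest_pixel(frame))  # (30, 45)
```

    Tracking this (row, col) location frame by frame gives the beam jitter described in the abstract.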

  12. [The characteristics of RR-Lorenz plot in persistent atrial fibrillation patients complicating with escape beats and rhythm].

    PubMed

    Pan, Yunping; Zhang, Fangfang; Liu, Ru; Jing, Yan; Shen, Jihong; Li, Zhongjian; Zhu, Huaijie

    2014-06-01

    To explore the characteristics of the RR-Lorenz plot in persistent atrial fibrillation (AF) patients complicated by escape beats and rhythm through ambulatory electrocardiography. The 24-hour ambulatory electrocardiograms of 291 persistent AF patients in the Second Affiliated Hospital of Zhengzhou University from July 2005 to April 2013 were retrospectively analyzed, and the RR interval and the QRS wave were measured. Patients were divided into two groups according to the distribution of the RR-Lorenz points [AF without escape beats and rhythm (Group A, n = 259) and AF with escape beats and rhythm (Group B, n = 32)]. The characteristics of the RR-Lorenz plot were compared between the two groups. (1) Fan-shaped RR-Lorenz plots were evidenced in Group A. (2) In Group B, 30 cases showed a fan shape combined with an L shape and a short dense rod along the 45° line. The proportion of escape beats and rhythm ranged from 0.28% (275/98 369) to 14.06% (11 263/80 112). The other 2 cases in Group B showed no typical RR-Lorenz plot features. The RR-Lorenz plot could help to quickly diagnose persistent AF complicated by escape beats and rhythm, based on its typical characteristics in the 24-hour ambulatory electrocardiogram.

  13. In-Flight Calibration Processes for the MMS Fluxgate Magnetometers

    NASA Astrophysics Data System (ADS)

    Bromund, K. R.; Leinweber, H. K.; Plaschke, F.; Strangeway, R. J.; Magnes, W.; Fischer, D.; Nakamura, R.; Anderson, B. J.; Russell, C. T.; Baumjohann, W.; Chutter, M.; Torbert, R. B.; Le, G.; Slavin, J. A.; Kepko, L.

    2015-12-01

    The calibration effort for the Magnetospheric Multiscale Mission (MMS) Analog Fluxgate (AFG) and Digital Fluxgate (DFG) magnetometers is a coordinated effort between three primary institutions: University of California, Los Angeles (UCLA); Space Research Institute, Graz, Austria (IWF); and Goddard Space Flight Center (GSFC). Since the successful deployment of all 8 magnetometers on 17 March 2015, the effort to confirm and update the ground calibrations has been underway during the MMS commissioning phase. The in-flight calibration processes evaluate twelve parameters that determine the alignment, orthogonalization, offsets, and gains for all 8 magnetometers using algorithms originally developed by UCLA and the Technical University of Braunschweig and tailored to MMS by IWF, UCLA, and GSFC. We focus on the processes run at GSFC to determine the eight parameters associated with spin tones and harmonics. We will also discuss the processing flow and interchange of parameters between GSFC, IWF, and UCLA. IWF determines the low range spin axis offsets using the Electron Drift Instrument (EDI). UCLA determines the absolute gains and sensor azimuth orientation using Earth field comparisons. We evaluate the performance achieved for MMS and give examples of the quality of the resulting calibrations.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ovard R. Perry; David L. Georgeson

    This report describes the April 2011 calibration of the Accuscan II HpGe in vivo system for high-energy lung counting. The source used for the calibration was a NIST-traceable lung set manufactured at the University of Cincinnati (UCLL43AMEU & UCSL43AMEU) containing Am-241 and Eu-152, with energies from 26 keV to 1408 keV. The lung set was used in conjunction with a Realistic Torso phantom. The phantom was placed on the RMC II counting table (with pins removed) between the v-ridges on the back wall of the Accuscan II counter. The top of the detector housing was positioned perpendicular to the junction of the phantom clavicle with the sternum. This position places the approximate center line of the detector housing at the center of the lungs. The energy and efficiency calibrations were performed using a Realistic Torso phantom (Appendix I) and the University of Cincinnati lung set. This report includes an overview introduction and records for the energy/FWHM and efficiency calibration, including performance verification and validation counting. The Accuscan II system was successfully calibrated for high-energy lung counting and verified in accordance with ANSI/HPS N13.30-1996 criteria.

  15. MEASUREMENT OF THE INTENSITY OF THE PROTON BEAM OF THE HARVARD UNIVERSITY SYNCHROCYCLOTRON FOR ENERGY-SPECTRAL MEASUREMENTS OF NUCLEAR SECONDARIES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Santoro, R.T.; Peelle, R.W.

    1964-03-01

    Two thin helium-filled parallel-plate ionization chambers were designed for use in continuously monitoring the 160-MeV proton beam of the Harvard University Synchrocyclotron over an intensity range from 10^5 to 10^10 protons/sec. The ionization chambers were calibrated by two independent methods. In four calibrations, the charge collected in the ionization chambers was compared with that deposited in a Faraday cup which followed the ionization chambers in the proton beam. In the second method, a calibration was made by individually counting beam protons with a pair of thin scintillation detectors. The ionization chamber response was found to be flat within 2% over a five-decade range of beam intensity. Comparison of the Faraday-cup calibrations with that from proton counting shows agreement to within 5%, which is considered satisfactory. The experimental results were also in agreement, within estimated errors, with the ionization chamber response calculated using an accepted value of the average energy loss per ion pair for helium. A slow shift in the calibrations with time is ascribed to gradual contamination of the helium in the chambers by air leakage. (auth)

  16. The Fossil Calibration Database-A New Resource for Divergence Dating.

    PubMed

    Ksepka, Daniel T; Parham, James F; Allman, James F; Benton, Michael J; Carrano, Matthew T; Cranston, Karen A; Donoghue, Philip C J; Head, Jason J; Hermsen, Elizabeth J; Irmis, Randall B; Joyce, Walter G; Kohli, Manpreet; Lamm, Kristin D; Leehr, Dan; Patané, Josés L; Polly, P David; Phillips, Matthew J; Smith, N Adam; Smith, Nathan D; Van Tuinen, Marcel; Ware, Jessica L; Warnock, Rachel C M

    2015-09-01

    Fossils provide the principal basis for temporal calibrations, which are critical to the accuracy of divergence dating analyses. Translating fossil data into minimum and maximum bounds for calibrations is the most important, and often least appreciated, step of divergence dating. Properly justified calibrations require the synthesis of phylogenetic, paleontological, and geological evidence and can be difficult for nonspecialists to formulate. The dynamic nature of the fossil record (e.g., new discoveries, taxonomic revisions, updates of global or local stratigraphy) requires that calibration data be updated continually lest they become obsolete. Here, we announce the Fossil Calibration Database (http://fossilcalibrations.org), a new open-access resource providing vetted fossil calibrations to the scientific community. Calibrations accessioned into this database are based on individual fossil specimens and follow best practices for phylogenetic justification and geochronological constraint. The associated Fossil Calibration Series, a calibration-themed publication series at Palaeontologia Electronica, will serve as a key pipeline for peer-reviewed calibrations to enter the database. © The Author(s) 2015. Published by Oxford University Press, on behalf of the Society of Systematic Biologists. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  17. Environmental and Water Quality Operational Studies: An Assessment of Reservoir Mixing Processes

    DTIC Science & Technology

    1986-07-01

    Lake Calhoun, Minnesota 1974 Calibration 1975 Verification C. J. Brown Reservoir, 1974 Simulation of filling Ohio 1975 Calibration Lake Coralville , Iowa ...Conference, University of Iowa , Iowa City, pp 289-306. Koberg, G. E. 1962. "Methods to Compute Long Wave Radiation from the Atmosphere and Reflected Solar

  18. Raman water vapor lidar calibration

    NASA Astrophysics Data System (ADS)

    Landulfo, E.; Da Costa, R. F.; Torres, A. S.; Lopes, F. J. S.; Whiteman, D. N.; Venable, D. D.

    2009-09-01

    We show here new results of a Raman LIDAR calibration methodology effort, with emphasis on assessing the cross-section ratio between water vapor and nitrogen through the use of a calibrated, NIST-traceable tungsten lamp. We give a step-by-step procedure for employing this equipment in a mapping/scanning procedure over the receiving optics of a water vapor Raman LIDAR. This methodology has been used independently at the Howard University Raman LIDAR and at the IPEN Raman LIDAR, which strongly supports its reproducibility and points toward an independent calibration methodology that can be carried out within an experimental routine.

  19. Design and calibration of field deployable ground-viewing radiometers.

    PubMed

    Anderson, Nikolaus; Czapla-Myers, Jeffrey; Leisso, Nathan; Biggar, Stuart; Burkhart, Charles; Kingston, Rob; Thome, Kurtis

    2013-01-10

    Three improved ground-viewing radiometers were built to support the Radiometric Calibration Test Site (RadCaTS) developed by the Remote Sensing Group (RSG) at the University of Arizona. Improved over previous light-emitting diode based versions, these filter-based radiometers employ seven silicon detectors and one InGaAs detector covering a wavelength range of 400-1550 nm. They are temperature controlled and designed for greater stability and lower noise. The radiometer systems show signal-to-noise ratios of greater than 1000 for all eight channels at typical field calibration signal levels. Predeployment laboratory radiance calibrations using a 1 m spherical integrating source compare well with in situ field calibrations using the solar radiation based calibration method; all bands are within ±2.7% for the case tested.

  20. An exhaustive survey of regular peptide conformations using a new metric for backbone handedness ( h )

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mannige, Ranjan V.

    The Ramachandran plot is important to structural biology as it describes a peptide backbone in the context of its dominant degrees of freedom, the backbone dihedral angles φ and ψ (Ramachandran, Ramakrishnan & Sasisekharan, 1963). Since its introduction, the Ramachandran plot has been a crucial tool to characterize protein backbone features. However, the conformation or twist of a backbone as a function of φ and ψ has not been completely described for both cis and trans backbones. Additionally, little intuitive understanding is available about a peptide's conformation simply from knowing the φ and ψ values of a peptide (e.g., is the regular peptide defined by φ = ψ = -100° left-handed or right-handed?). This report provides a new metric for backbone handedness (h) based on interpreting a peptide backbone as a helix with axial displacement d and angular displacement θ, both of which are derived from a peptide backbone's internal coordinates, especially the dihedral angles φ, ψ and ω. In particular, h equals sin(θ)·d/|d|, with range [-1, 1] and negative (or positive) values indicating left- (or right-) handedness. The metric h is used to characterize the handedness of every region of the Ramachandran plot for both cis (ω = 0°) and trans (ω = 180°) backbones, which provides the first exhaustive survey of twist handedness in Ramachandran (φ, ψ) space. These maps fill in the 'dead space' within the Ramachandran plot, i.e., regions that are not commonly accessed by structured proteins but may be accessible to intrinsically disordered proteins, short peptide fragments, and protein mimics such as peptoids. Finally, building on the work of Zacharias & Knapp (2013), this report presents a new plot based on d and θ that serves as a universal and intuitive alternative to the Ramachandran plot. The universality arises from the fact that the co-inhabitants of such a plot include every possible peptide backbone, including cis and trans backbones. The intuitiveness arises from the fact that d and θ provide, at a glance, numerous aspects of the backbone, including compactness, handedness, and planarity.
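    The handedness metric h = sin(θ)·d/|d| is simple enough to compute directly from the per-residue axial displacement d and angular displacement θ. A minimal sketch; the α-helix-like inputs are illustrative, and the d = 0 convention is an assumption not stated in the abstract:

```python
import math

def handedness(d, theta_deg):
    """Backbone handedness h = sin(theta) * d / |d|, with range [-1, 1].

    d         : axial displacement per residue along the helix axis
    theta_deg : angular displacement per residue, in degrees
    Negative h indicates a left-handed twist; positive h, right-handed."""
    if d == 0:
        return 0.0  # degenerate flat case; convention assumed here
    return math.sin(math.radians(theta_deg)) * (d / abs(d))

# An alpha-helix-like case: positive rise, roughly 100 deg rotation per residue
print(handedness(1.5, 100.0))  # positive, i.e. right-handed
```

    Flipping the sign of d (traversing the helix the other way) flips the sign of h, matching the left/right interpretation in the abstract.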

  1. An exhaustive survey of regular peptide conformations using a new metric for backbone handedness ( h )

    DOE PAGES

    Mannige, Ranjan V.

    2017-05-16

    The Ramachandran plot is important to structural biology as it describes a peptide backbone in the context of its dominant degrees of freedom, the backbone dihedral angles φ and ψ (Ramachandran, Ramakrishnan & Sasisekharan, 1963). Since its introduction, the Ramachandran plot has been a crucial tool to characterize protein backbone features. However, the conformation or twist of a backbone as a function of φ and ψ has not been completely described for both cis and trans backbones. Additionally, little intuitive understanding is available about a peptide's conformation simply from knowing the φ and ψ values of a peptide (e.g., is the regular peptide defined by φ = ψ = -100° left-handed or right-handed?). This report provides a new metric for backbone handedness (h) based on interpreting a peptide backbone as a helix with axial displacement d and angular displacement θ, both of which are derived from a peptide backbone's internal coordinates, especially the dihedral angles φ, ψ and ω. In particular, h equals sin(θ)·d/|d|, with range [-1, 1] and negative (or positive) values indicating left- (or right-) handedness. The metric h is used to characterize the handedness of every region of the Ramachandran plot for both cis (ω = 0°) and trans (ω = 180°) backbones, which provides the first exhaustive survey of twist handedness in Ramachandran (φ, ψ) space. These maps fill in the 'dead space' within the Ramachandran plot, i.e., regions that are not commonly accessed by structured proteins but may be accessible to intrinsically disordered proteins, short peptide fragments, and protein mimics such as peptoids. Finally, building on the work of Zacharias & Knapp (2013), this report presents a new plot based on d and θ that serves as a universal and intuitive alternative to the Ramachandran plot. The universality arises from the fact that the co-inhabitants of such a plot include every possible peptide backbone, including cis and trans backbones. The intuitiveness arises from the fact that d and θ provide, at a glance, numerous aspects of the backbone, including compactness, handedness, and planarity.

  2. Patient-specific calibration of cone-beam computed tomography data sets for radiotherapy dose calculations and treatment plan assessment.

    PubMed

    MacFarlane, Michael; Wong, Daniel; Hoover, Douglas A; Wong, Eugene; Johnson, Carol; Battista, Jerry J; Chen, Jeff Z

    2018-03-01

    In this work, we propose a new method of calibrating cone beam computed tomography (CBCT) data sets for radiotherapy dose calculation and plan assessment. The motivation for this patient-specific calibration (PSC) method is to develop an efficient, robust, and accurate CBCT calibration process that is less susceptible to deformable image registration (DIR) errors. Instead of mapping the CT numbers voxel by voxel as traditional DIR calibration methods do, the PSC method generates correlation plots between deformably registered planning CT and CBCT voxel values for each image slice. A linear calibration curve specific to each slice is then obtained by least-squares fitting and applied to the CBCT slice's voxel values. This allows each CBCT slice to be corrected using DIR without regional DIR errors altering the patient geometry. A retrospective study was performed on 15 head-and-neck cancer patients, each having routine CBCTs and a middle-of-treatment re-planning CT (reCT). The original treatment plan was re-calculated on the patient's reCT image set (serving as the gold standard) as well as on the image sets produced by voxel-to-voxel DIR, density-overriding, and the new PSC calibration methods. Dose accuracy of each calibration method was compared to the reference reCT data set using common dose-volume metrics and 3D gamma analysis. A phantom study was also performed to assess the accuracy of the DIR and PSC CBCT calibration methods compared with planning CT. Compared with the gold standard using reCT, the average dose metric differences were ≤ 1.1% for all three methods (PSC: -0.3%; DIR: -0.7%; density-override: -1.1%). The average gamma pass rates with thresholds 3%, 3 mm were also similar among the three techniques (PSC: 95.0%; DIR: 96.1%; density-override: 94.4%). An automated patient-specific calibration method was developed which yielded strong dosimetric agreement with the results obtained using a re-planning CT for head-and-neck patients.
© 2018 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.

  3. Increased transpiration and plant water stress in a black spruce bog exposed to whole ecosystem warming

    NASA Astrophysics Data System (ADS)

    Warren, J.; Ward, E. J.; Wullschleger, S. D.; Hanson, P. J.

    2017-12-01

    The Spruce and Peatland Responses Under Changing Environments (SPRUCE) experiment (http://mnspruce.ornl.gov/) in northern Minnesota, USA, has exposed 12.8 m diameter plots of an ombrotrophic Picea mariana-Ericaceous shrub bog to whole-ecosystem warming (0, +2.25, +4.5, +6.75, +9 °C) since August 2015, and to elevated CO2 treatments (ambient or +500 ppm) since June 2016. The mixed-age stand has trees up to 40 years old and a 5-8 m tall canopy. Thermal dissipation sap flow probes were installed into dominant Picea mariana and Larix laricina trees in each of the 10 open-top chambers in fall 2015. This talk will focus on the first two years of sap flux data from the 10 treatment plots and their relationships with seasonal growth and prevailing environmental conditions. Sap flow was scaled to whole-tree and plot-level transpiration based on prior in situ calibrations using cut trees, establishment of a sapwood depth:tree diameter relationship, and the tree size distribution within each plot. We also assessed water potential in the trees and in the two dominant shrubs at the site: Rhododendron groenlandicum and Chamaedaphne calyculata. The warming treatments extended the growing season by up to 6 weeks, with sap flow beginning earlier in spring and lasting later into the fall. The deciduous Larix was the only species exhibiting substantial predawn water stress under the treatments, with water potentials reaching -2.5 MPa in the warmest plots. Elevated CO2 reduced midday water stress in Rhododendron, but not in Chamaedaphne, which could lead to shifts in shrub species composition.

  4. Fusion of GEDI, ICESAT2 & NISAR data for above ground biomass mapping in Sonoma County, California

    NASA Astrophysics Data System (ADS)

    Duncanson, L.; Simard, M.; Thomas, N. M.; Neuenschwander, A. L.; Hancock, S.; Armston, J.; Dubayah, R.; Hofton, M. A.; Huang, W.; Tang, H.; Marselis, S.; Fatoyinbo, T.

    2017-12-01

    Several upcoming NASA missions will collect data sensitive to forest structure (GEDI, ICESAT-2 & NISAR). The LiDAR and SAR data collected by these missions will be used in coming years to map forest aboveground biomass at various resolutions. This research focuses on developing and testing multi-sensor data fusion approaches in advance of these missions. Here, we present the first case study of a CMS-16 grant with results from Sonoma County, California. We simulate lidar and SAR datasets from GEDI, ICESAT-2, and NISAR using airborne discrete-return lidar and UAVSAR data, respectively. GEDI and ICESAT-2 signals are simulated from high-point-density discrete-return lidar that was acquired over the entire county in 2014 through a previous CMS project (Dubayah & Hurtt, CMS-13). NISAR is simulated from L-band UAVSAR data collected in 2014. These simulations are empirically related to 300 field plots of aboveground biomass, as well as to a 30 m biomass map produced from the 2014 airborne lidar data. We model biomass independently for each simulated mission dataset and then test two fusion methods for county-wide mapping: (1) a pixel-based approach and (2) an object-oriented approach. In the pixel-based approach, GEDI and ICESAT-2 biomass models are calibrated over field plots and applied in orbital simulations for a 2-year period of the GEDI and ICESAT-2 missions. These simulated samples are then used to calibrate UAVSAR data to produce a 0.25 ha map. In the object-oriented approach, the GEDI and ICESAT-2 data are identical to the pixel-based approach but calibrate image objects of similar L-band backscatter rather than uniform pixels. The results of this research demonstrate the estimated ability of each of these three missions to independently map biomass in a temperate, high-biomass system, as well as the potential improvement expected from combining mission datasets.

  5. Direct solid analysis of powdered tungsten carbide hardmetal precursors by laser-induced argon spark ablation with inductively coupled plasma atomic emission spectrometry.

    PubMed

    Holá, Markéta; Kanický, Viktor; Mermet, Jean-Michel; Otruba, Vítezslav

    2003-12-01

    The potential of the laser-induced argon spark atomizer (LINA-Spark atomizer) coupled with ICP-AES as a convenient device for the direct analysis of WC/Co powdered precursors of sintered hardmetals was studied. The samples were presented for ablation as pressed pellets prepared by mixing with a powdered silver binder containing GeO2 as internal standard. The pellets were ablated with a Q-switched Nd:YAG laser (1064 nm) focused 16 mm behind the target surface, with a resulting estimated power density of 5 GW cm(-2). Laser ablation ICP-AES signals were studied as a function of ablation time, and the pre-ablation time necessary to obtain reliable results was about 40 s. Linear calibration plots were obtained up to 10% (m/m) Ti, 9% Ta, and 3.5% Nb, both without internal standardization and using germanium as an added internal standard or tungsten as a contained internal standard. The relative uncertainty at the centroid of the calibration line was in the range from +/- 6% to +/- 11% for Nb, Ta, and Ti, both with and without internal standardization by Ge. A higher spread of points about the regression was observed for cobalt, for which the relative uncertainty at the centroid was in the range from +/- 9% to +/- 14%. Repeatability of results was improved by the use of both Ge and W internal standards. The lowest determinable quantities calculated from the calibration plots were 0.060% Co, 0.010% Nb, 0.16% Ta, and 0.030% Ti with internal standardization by Ge. The LA-ICP-AES analyses of real samples showed good agreement with the results obtained by solution-based ICP determination, with a relative bias not exceeding 10%. The elimination of the dissolution procedure for powdered tungsten (Nb, Ta, Ti) carbide is the principal advantage of the developed LA-ICP-AES method.
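    Internal standardization as used here fits the ratio of the analyte signal to the internal-standard signal against concentration, which cancels shot-to-shot ablation yield variations. A sketch with hypothetical values (not the paper's data; the 0.4 sensitivity and yield factors are assumptions):

```python
import numpy as np

def internal_standard_calibration(conc, analyte_signal, istd_signal):
    """Fit the signal ratio (analyte / internal standard) vs. concentration.

    Because both signals scale with the same ablation yield, the ratio
    depends only on concentration, improving calibration linearity."""
    ratio = np.asarray(analyte_signal, float) / np.asarray(istd_signal, float)
    slope, intercept = np.polyfit(np.asarray(conc, float), ratio, 1)
    return slope, intercept

# Hypothetical Ti calibration: true ratio response of 0.4 per %(m/m), with a
# shot-to-shot ablation-yield factor that multiplies BOTH raw signals
conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0])
yield_factor = np.array([0.8, 1.2, 0.9, 1.1, 1.0])  # varies shot to shot
analyte = 0.4 * conc * yield_factor
istd = 1.0 * yield_factor
m, b = internal_standard_calibration(conc, analyte, istd)
print(m, b)  # slope recovers 0.4 despite the yield variation
```

    Fitting the raw analyte signal against concentration with these same data would scatter with the yield factor; the ratio removes that scatter, which mirrors the repeatability improvement the authors report with Ge and W internal standards.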

  6. Forest-Observation-System.net - towards a global in-situ data repository for biomass datasets validation

    NASA Astrophysics Data System (ADS)

    Shchepashchenko, D.; Chave, J.; Phillips, O. L.; Davies, S. J.; Lewis, S. L.; Perger, C.; Dresel, C.; Fritz, S.; Scipal, K.

    2017-12-01

    Forest monitoring is high on the scientific and political agenda. Global measurements of forest height, biomass and how they change with time are urgently needed as essential climate and ecosystem variables. The Forest Observation System - FOS (http://forest-observation-system.net/) is an international cooperation to establish a global in-situ forest biomass database to support earth observation and to encourage investment in relevant field-based observations and science. FOS aims to link the Remote Sensing (RS) community with ecologists who measure forest biomass and estimate biodiversity in the field, for a common benefit. The benefit of FOS for the RS community is the partnering of the most established teams and networks that manage permanent forest plots globally, which overcomes data-sharing issues and introduces a standard biomass data flow from tree-level measurement to plot-level aggregation, served in the form most suitable for the RS community. Ecologists benefit from FOS through improved access to global biomass information, data standards, gap identification and potentially improved funding opportunities to address the known gaps and deficiencies in the data. FOS collaborates closely with the Center for Tropical Forest Science (CTFS-ForestGEO), ForestPlots.net (incl. RAINFOR, AfriTRON and T-FORCES), AusCover, the Tropical managed Forests Observatory and the IIASA network. FOS is an open initiative; other networks and teams are most welcome to join. The online database provides open access to both metadata (e.g. who conducted the measurements, where, and which parameters) and actual data for a subset of plots where the authors have granted access. A minimum set of database values includes: principal investigator and institution, plot coordinates, number of trees, forest type and tree species composition, wood density, canopy height and above-ground biomass of trees. Plot size is 0.25 ha or larger. The database will be essential for validating and calibrating satellite observations and various models.

  7. Prediction of five-year all-cause mortality in Chinese patients with type 2 diabetes mellitus - A population-based retrospective cohort study.

    PubMed

    Wan, Eric Yuk Fai; Fong, Daniel Yee Tak; Fung, Colman Siu Cheung; Yu, Esther Yee Tak; Chin, Weng Yee; Chan, Anca Ka Chun; Lam, Cindy Lo Kuen

    2017-06-01

    This study aimed to develop and validate an all-cause mortality risk prediction model for Chinese primary care patients with type 2 diabetes mellitus (T2DM) in Hong Kong. A population-based retrospective cohort study was conducted on 132,462 Chinese patients who had received public primary care services during 2010. Each gender sample was randomly split on a 2:1 basis into derivation and validation cohorts and was followed-up for a median period of 5 years. Gender-specific mortality risk prediction models showing the interaction effect between predictors and age were derived using Cox proportional hazards regression with a forward stepwise approach. Developed models were compared with pre-existing models by Harrell's C-statistic and calibration plot using the validation cohort. Common predictors of increased mortality risk in both genders included: age; smoking habit; diabetes duration; use of anti-hypertensive agents, insulin and lipid-lowering drugs; body mass index; hemoglobin A1c; systolic blood pressure (BP); total cholesterol to high-density lipoprotein-cholesterol ratio; urine albumin to creatinine ratio (urine ACR); and estimated glomerular filtration rate (eGFR). The prediction models showed better discrimination, with Harrell's C-statistics of 0.768 (males) and 0.782 (females), and better calibration power from the plots than previously established models. Our newly developed gender-specific models provide a more accurate predicted 5-year mortality risk for Chinese diabetic patients than other established models. Copyright © 2017 Elsevier Inc. All rights reserved.
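
The Harrell's C-statistic used to compare the models can be computed directly from pairwise concordance: a pair of subjects is comparable when the one with the shorter follow-up time experienced the event, and concordant when that subject also has the higher predicted risk. A minimal pure-Python sketch on invented survival data (this is not the authors' code):

```python
# Minimal Harrell's C-statistic for right-censored survival data
# (hypothetical follow-up times, event indicators, and risk scores).

def harrell_c(times, events, risks):
    concordant, comparable = 0.0, 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            # Pair is comparable only if subject i died before time j.
            if times[i] < times[j] and events[i] == 1:
                comparable += 1
                if risks[i] > risks[j]:
                    concordant += 1
                elif risks[i] == risks[j]:
                    concordant += 0.5   # ties in risk count half
    return concordant / comparable

# Follow-up times (years), event indicators (1 = died), predicted risks.
times  = [1.0, 2.0, 3.0, 4.0, 5.0]
events = [1,   1,   0,   1,   0]
risks  = [0.9, 0.7, 0.8, 0.2, 0.1]
print(harrell_c(times, events, risks))  # → 0.875
```

A value of 0.5 would mean the risk score discriminates no better than chance; 1.0 would mean perfect ranking, so the reported 0.768-0.782 indicates good discrimination.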

  8. IRISpy: Analyzing IRIS Data in Python

    NASA Astrophysics Data System (ADS)

    Ryan, Daniel; Christe, Steven; Mumford, Stuart; Baruah, Ankit; Timothy, Shelbe; Pereira, Tiago; De Pontieu, Bart

    2017-08-01

    IRISpy is a new community-developed open-source software library for analysing IRIS level 2 data. It is written in Python, a free, cross-platform, general-purpose, high-level programming language. A wide array of scientific computing software packages have already been developed in Python, from numerical computation (NumPy, SciPy, etc.), to visualization and plotting (matplotlib), to solar-physics-specific data analysis (SunPy). IRISpy is currently under development as a SunPy-affiliated package, which means it depends on the SunPy library, follows similar standards and conventions, and is developed with the support of the SunPy development team. IRISpy has two primary data objects, one for analyzing slit-jaw imager data and another for analyzing spectrograph data. Both objects contain basic slicing, indexing, plotting, and animating functionality to allow users to easily inspect, reduce and analyze the data. As part of this functionality the objects can output SunPy Maps, TimeSeries, Spectra, etc. of relevant data slices for easier inspection and analysis. Work is also ongoing to provide additional data analysis functionality including derivation of systematic measurement errors (e.g. readout noise), exposure time correction, residual wavelength calibration, radiometric calibration, and fine scale pointing corrections. IRISpy's code base is publicly available through github.com and can be contributed to by anyone. In this poster we demonstrate IRISpy's functionality and future goals of the project. We also encourage interested users to become involved in further developing IRISpy.

  9. Stability indicating high performance thin-layer chromatographic method for simultaneous estimation of pantoprazole sodium and itopride hydrochloride in combined dosage form

    PubMed Central

    Bageshwar, Deepak; Khanvilkar, Vineeta; Kadam, Vilasrao

    2011-01-01

    A specific, precise and stability indicating high-performance thin-layer chromatographic method for simultaneous estimation of pantoprazole sodium and itopride hydrochloride in pharmaceutical formulations was developed and validated. The method employed TLC aluminium plates precoated with silica gel 60F254 as the stationary phase. The solvent system consisted of methanol:water:ammonium acetate; 4.0:1.0:0.5 (v/v/v). This system was found to give compact and dense spots for both itopride hydrochloride (Rf value of 0.55±0.02) and pantoprazole sodium (Rf value of 0.85±0.04). Densitometric analysis of both drugs was carried out in the reflectance–absorbance mode at 289 nm. The linear regression analysis data for the calibration plots showed a good linear relationship with R2=0.9988±0.0012 in the concentration range of 100–400 ng for pantoprazole sodium. Also, the linear regression analysis data for the calibration plots showed a good linear relationship with R2=0.9990±0.0008 in the concentration range of 200–1200 ng for itopride hydrochloride. The method was validated for specificity, precision, robustness and recovery. Statistical analysis proves that the method is repeatable and selective for the estimation of both the said drugs. As the method could effectively separate the drug from its degradation products, it can be employed as a stability indicating method. PMID:29403710

  10. Stability indicating high performance thin-layer chromatographic method for simultaneous estimation of pantoprazole sodium and itopride hydrochloride in combined dosage form.

    PubMed

    Bageshwar, Deepak; Khanvilkar, Vineeta; Kadam, Vilasrao

    2011-11-01

    A specific, precise and stability indicating high-performance thin-layer chromatographic method for simultaneous estimation of pantoprazole sodium and itopride hydrochloride in pharmaceutical formulations was developed and validated. The method employed TLC aluminium plates precoated with silica gel 60F254 as the stationary phase. The solvent system consisted of methanol:water:ammonium acetate; 4.0:1.0:0.5 (v/v/v). This system was found to give compact and dense spots for both itopride hydrochloride (Rf value of 0.55±0.02) and pantoprazole sodium (Rf value of 0.85±0.04). Densitometric analysis of both drugs was carried out in the reflectance-absorbance mode at 289 nm. The linear regression analysis data for the calibration plots showed a good linear relationship with R2=0.9988±0.0012 in the concentration range of 100-400 ng for pantoprazole sodium. Also, the linear regression analysis data for the calibration plots showed a good linear relationship with R2=0.9990±0.0008 in the concentration range of 200-1200 ng for itopride hydrochloride. The method was validated for specificity, precision, robustness and recovery. Statistical analysis proves that the method is repeatable and selective for the estimation of both the said drugs. As the method could effectively separate the drug from its degradation products, it can be employed as a stability indicating method.
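
The R² values reported for such calibration plots come from an ordinary least-squares regression of densitometric response on the amount applied per band; a sketch with hypothetical peak areas (the published data are not reproduced here):

```python
# Calibration-plot linearity check: fit peak area vs amount applied and
# report the coefficient of determination R^2. Densitometric responses
# below are made up for illustration.

def r_squared(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((b - (slope * a + intercept)) ** 2 for a, b in zip(x, y))
    ss_tot = sum((b - my) ** 2 for b in y)
    return 1.0 - ss_res / ss_tot

# Amount applied per band (ng) and hypothetical peak areas.
amount = [100, 150, 200, 250, 300, 400]
area   = [1050, 1610, 2080, 2590, 3140, 4180]
print(round(r_squared(amount, area), 4))
```

An R² close to 1 over the working range (here 100-400 ng) is what justifies using the straight-line calibration for quantitation.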

  11. Nimbus-7 ERB Solar Analysis Tape (ESAT) user's guide

    NASA Technical Reports Server (NTRS)

    Major, Eugene; Hickey, John R.; Kyle, H. Lee; Alton, Bradley M.; Vallette, Brenda J.

    1988-01-01

    Seven years and five months of Nimbus-7 Earth Radiation Budget (ERB) solar data are available on a single ERB Solar Analysis Tape (ESAT). The period covered is November 16, 1978 through March 31, 1986. The Nimbus-7 satellite performs approximately 14 orbits per day and the ERB solar telescope observes the sun once per orbit as the satellite crosses the southern terminator. The solar data were carefully calibrated and screened. Orbital and daily mean values are given for the total solar irradiance plus other spectral intervals (10 solar channels in all). In addition, selected solar activity indicators are included on the ESAT. The ESAT User's Guide is an update of the previous ESAT User's Guide (NASA TM 86143) and includes more detailed information on the solar data calibration, screening procedures, updated solar data plots, and applications to solar variability. Details of the tape format, including source code to access ESAT, are included.

  12. Calibration of a turbidity meter for making estimates of total suspended solids concentrations and beam attenuation coefficients in field experiments

    NASA Technical Reports Server (NTRS)

    Usry, J. W.; Whitlock, C. H.

    1981-01-01

    Management of water resources such as a reservoir requires using analytical models which describe such parameters as the suspended sediment field. To select or develop an appropriate model requires making many measurements to describe the distribution of this parameter in the water column. One potential method for making those measurements expeditiously is to measure light transmission or turbidity and relate that parameter to total suspended solids concentrations. An instrument which may be used for this purpose was calibrated by generating curves of transmission measurements plotted against measured values of total suspended solids concentrations and beam attenuation coefficients. Results of these experiments indicate that field measurements made with this instrument using curves generated in this study should correlate with total suspended solids concentrations and beam attenuation coefficients in the water column within 20 percent.
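
The link between a transmission reading and the beam attenuation coefficient follows Beer's law, c = -ln(T)/L, after which a lab-derived calibration curve maps c to total suspended solids. The path length and calibration constants below are assumed for illustration; the instrument's actual curves are not given in the abstract.

```python
import math

# Convert a transmissometer reading to a beam attenuation coefficient via
# Beer's law, c = -ln(T) / L, then to a total suspended solids (TSS)
# estimate through a linear calibration curve. All constants hypothetical.

PATH_LENGTH_M = 0.25      # assumed optical path length (m)
TSS_SLOPE = 1.8           # assumed mg/L per m^-1 from a lab calibration
TSS_INTERCEPT = 0.2       # assumed mg/L

def beam_attenuation(transmission):
    """Beam attenuation coefficient c (m^-1) from fractional transmission."""
    return -math.log(transmission) / PATH_LENGTH_M

def tss_estimate(transmission):
    """Total suspended solids (mg/L) from the linear calibration curve."""
    return TSS_SLOPE * beam_attenuation(transmission) + TSS_INTERCEPT

c = beam_attenuation(0.60)    # 60% of the beam transmitted
print(round(c, 3), round(tss_estimate(0.60), 2))
```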

  13. Performance of One-Class Classifiers for Invasive Species Mapping using Hyperspectral Remote Sensing

    NASA Astrophysics Data System (ADS)

    Skowronek, S.; Asner, G. P.; Feilhauer, H.

    2016-12-01

    Reliable distribution maps are crucial for the monitoring and management of invasive plant species. Remote sensing can provide such maps for larger areas. However, most remote sensing approaches focus on species in a prominent phenological stage, and a systematic assessment of the performance of different one-class classifiers for mapping species in a more inconspicuous phenological stage is missing so far. In this study, we used hyperspectral remote sensing data to detect the invasive grass Phalaris aquatica and the invasive herb Centaurea solstitialis in a pre-flowering stage in the Jasper Ridge Biological Preserve in California. We collected presence-only data, 66 plots for C. solstitialis and 30 plots for P. aquatica, to calibrate a distribution model, and additional presence-absence data (166 / 173 plots) to validate model performance. All plots have a size of 3 m x 3 m. The hyperspectral remote sensing imagery was acquired using the Carnegie Airborne Observatory (CAO) visible to shortwave infrared (VSWIR) imaging spectrometer (400-2500 nm range) in May 2015 with a ground sampling distance (pixel size) of 1 m x 1 m. To find the best approach for mapping these species, we compared the performance of three different state-of-the-art classifiers working with presence-only data: Maxent, biased support vector machines (SVM) and boosted regression trees (BRT). The resulting overall accuracies were 72 - 74% for C. solstitialis, and 83 - 88% for P. aquatica. For both species the overall performance was slightly better for Maxent and BRT than for biased SVM. The detection rates for low cover plots were considerably higher for C. solstitialis than for P. aquatica. For C. solstitialis, they ranged between 71 and 75% for plots with less than 15% cover, highlighting the potential of remote sensing to contribute to early detection. The models relied on different areas of the spectrum, but still produced the same general pattern, which implies that more than one property of a species or a mixed plot can be used to create a viable model. We conclude that the different one-class classifiers we tested do allow detecting the target species in a more inconspicuous phenological stage, with similar success rates.
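
The overall accuracies quoted above amount to the fraction of independent presence-absence validation plots whose predicted occurrence matches the field observation; a minimal sketch with invented labels:

```python
# Validating a one-class (presence-only) classifier against an independent
# presence-absence plot set: overall accuracy is the share of validation
# plots predicted correctly. Labels below are hypothetical.

def overall_accuracy(predicted, observed):
    matches = sum(p == o for p, o in zip(predicted, observed))
    return matches / len(observed)

# 1 = species present in the plot, 0 = absent.
observed  = [1, 1, 1, 0, 0, 0, 1, 0, 0, 1]
predicted = [1, 1, 0, 0, 0, 1, 1, 0, 0, 1]  # classifier output per plot
print(overall_accuracy(predicted, observed))  # → 0.8
```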

  14. Calibration of a universal indicated turbulence system

    NASA Technical Reports Server (NTRS)

    Chapin, W. G.

    1977-01-01

    Theoretical and experimental work on a Universal Indicated Turbulence Meter is described. A mathematical transfer function from turbulence input to output indication was developed. A random ergodic process and a Gaussian turbulence distribution were assumed. A calibration technique based on this transfer function was developed. The computer contains a variable gain amplifier to make the system output independent of average velocity. The range over which this independence holds was determined. An optimum dynamic response was obtained for the tubulation between the system pitot tube and pressure transducer by making dynamic response measurements for orifices of various lengths and diameters at the source end.
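
The quantity such a meter indicates independently of average velocity is the relative turbulence intensity, the ratio of the fluctuation standard deviation to the mean velocity, under the abstract's assumption of Gaussian turbulence. A sketch on synthetic samples (the constants are invented, not the instrument's):

```python
import random
import statistics

# Relative turbulence intensity sigma_u / u_mean: the quantity a
# variable-gain stage keeps independent of the average velocity.
# Velocity samples are synthetic Gaussian fluctuations about a mean.

random.seed(1)
u_mean, sigma = 30.0, 2.4                      # m/s, hypothetical
samples = [random.gauss(u_mean, sigma) for _ in range(20000)]

intensity = statistics.pstdev(samples) / statistics.fmean(samples)
print(round(intensity, 3))
```

Doubling `u_mean` while keeping `sigma / u_mean` fixed leaves `intensity` unchanged, which is the independence property the calibration verifies.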

  15. Wavelength selection-based nonlinear calibration for transcutaneous blood glucose sensing using Raman spectroscopy

    PubMed Central

    Dingari, Narahara Chari; Barman, Ishan; Kang, Jeon Woong; Kong, Chae-Ryon; Dasari, Ramachandra R.; Feld, Michael S.

    2011-01-01

    While Raman spectroscopy provides a powerful tool for noninvasive and real time diagnostics of biological samples, its translation to the clinical setting has been impeded by the lack of robustness of spectroscopic calibration models and the size and cumbersome nature of conventional laboratory Raman systems. Linear multivariate calibration models employing full spectrum analysis are often misled by spurious correlations, such as system drift and covariations among constituents. In addition, such calibration schemes are prone to overfitting, especially in the presence of external interferences that may create nonlinearities in the spectra-concentration relationship. To address both of these issues, we incorporate residue error plot-based wavelength selection and nonlinear support vector regression (SVR). Wavelength selection is used to eliminate uninformative regions of the spectrum, while SVR is used to model the curved effects such as those created by tissue turbidity and temperature fluctuations. Using glucose detection in tissue phantoms as a representative example, we show that even a substantial reduction in the number of wavelengths analyzed using SVR leads to calibration models with prediction accuracy equivalent to that of linear full spectrum analysis. Further, with clinical datasets obtained from human subject studies, we also demonstrate the prospective applicability of the selected wavelength subsets without sacrificing prediction accuracy, which has extensive implications for calibration maintenance and transfer. Additionally, such wavelength selection could substantially reduce the collection time of serial Raman acquisition systems. Given the reduced footprint of serial Raman systems in relation to conventional dispersive Raman spectrometers, we anticipate that the incorporation of wavelength selection in such hardware designs will enhance the possibility of miniaturized clinical systems for disease diagnosis in the near future. PMID:21895336
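
The idea behind wavelength selection, discarding regions of the spectrum that carry no concentration information, can be illustrated with a much simpler stand-in than the paper's residue-error-plot method: rank spectral channels by how strongly their intensity tracks the analyte concentration and keep only the top-scoring ones. Data and the ranking criterion below are invented for illustration; the authors' actual procedure uses residue error plots with SVR.

```python
import statistics

# Simplified wavelength-selection sketch: rank channels by absolute
# correlation with concentration and keep the most informative ones.
# Synthetic data; a stand-in for the paper's residue-error-plot method.

def correlation(x, y):
    mx, my = statistics.fmean(x), statistics.fmean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

concentration = [2.0, 4.0, 6.0, 8.0, 10.0]
# Rows: samples; columns: intensities at 4 hypothetical wavelengths.
# Channels 0 and 2 track concentration; channels 1 and 3 are uninformative.
spectra = [
    [0.21, 0.50, 0.50, 0.98],
    [0.40, 0.52, 0.42, 0.05],
    [0.61, 0.49, 0.31, 0.71],
    [0.79, 0.51, 0.20, 0.44],
    [1.01, 0.50, 0.11, 0.62],
]

n_channels = len(spectra[0])
scores = [abs(correlation([row[k] for row in spectra], concentration))
          for k in range(n_channels)]
selected = sorted(range(n_channels), key=lambda k: -scores[k])[:2]
print(sorted(selected))  # → [0, 2]
```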

  16. GOplot: an R package for visually combining expression data with functional analysis.

    PubMed

    Walter, Wencke; Sánchez-Cabo, Fátima; Ricote, Mercedes

    2015-09-01

    Despite the plethora of methods available for the functional analysis of omics data, obtaining a comprehensive yet detailed understanding of the results remains challenging. This is mainly due to the lack of publicly available tools for the visualization of this type of information. Here we present an R package called GOplot, based on ggplot2, for enhanced graphical representation. Our package takes the output of any general enrichment analysis and generates plots at different levels of detail: from a general overview to identify the most enriched categories (bar plot, bubble plot) to a more detailed view displaying different types of information for molecules in a given set of categories (circle plot, chord plot, cluster plot). The package provides a deeper insight into omics data and allows scientists to generate insightful plots with only a few lines of code to easily communicate the findings. The R package GOplot is available via CRAN-The Comprehensive R Archive Network: http://cran.r-project.org/web/packages/GOplot. The shiny web application of the Venn diagram can be found at: https://wwalter.shinyapps.io/Venn/. A detailed manual of the package with sample figures can be found at https://wencke.github.io/. Contact: fscabo@cnic.es or mricote@cnic.es. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  17. Analysis of the Best-Fit Sky Model Produced Through Redundant Calibration of Interferometers

    NASA Astrophysics Data System (ADS)

    Storer, Dara; Pober, Jonathan

    2018-01-01

    21 cm cosmology provides unique insights into the formation of stars and galaxies in the early universe, and particularly the Epoch of Reionization. Detection of the 21 cm line is challenging because it is generally 4-5 magnitudes weaker than the emission from foreground sources, and therefore the instruments used for detection must be carefully designed and calibrated. 21 cm cosmology is primarily conducted using interferometers, which are difficult to calibrate because of their complex structure. Here I explore the relationship between sky-based calibration, which relies on an accurate and comprehensive sky model, and redundancy-based calibration, which makes use of redundancies in the orientation of the interferometer's dishes. In addition to producing calibration parameters, redundant calibration also produces a best fit model of the sky. In this work I examine that sky model and explore the possibility of using that best fit model as an additional input to improve on sky-based calibration.

  18. In-Flight Calibration Processes for the MMS Fluxgate Magnetometers

    NASA Technical Reports Server (NTRS)

    Bromund, K. R.; Leinweber, H. K.; Plaschke, F.; Strangeway, R. J.; Magnes, W.; Fischer, D.; Nakamura, R.; Anderson, B. J.; Russell, C. T.; Baumjohann, W.; hide

    2015-01-01

    The calibration effort for the Magnetospheric Multiscale Mission (MMS) Analog Fluxgate (AFG) and Digital Fluxgate (DFG) magnetometers is a coordinated effort between three primary institutions: University of California, Los Angeles (UCLA); Space Research Institute, Graz, Austria (IWF); and Goddard Space Flight Center (GSFC). Since the successful deployment of all 8 magnetometers on 17 March 2015, the effort to confirm and update the ground calibrations has been underway during the MMS commissioning phase. The in-flight calibration processes evaluate twelve parameters that determine the alignment, orthogonalization, offsets, and gains for all 8 magnetometers using algorithms originally developed by UCLA and the Technical University of Braunschweig and tailored to MMS by IWF, UCLA, and GSFC. We focus on the processes run at GSFC to determine the eight parameters associated with spin tones and harmonics. We will also discuss the processing flow and interchange of parameters between GSFC, IWF, and UCLA. IWF determines the low range spin axis offsets using the Electron Drift Instrument (EDI). UCLA determines the absolute gains and sensor azimuth orientation using Earth field comparisons. We evaluate the performance achieved for MMS and give examples of the quality of the resulting calibrations.
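
One ingredient of the spin-tone analysis mentioned above can be illustrated simply: on a spinning spacecraft, a residual sensor offset in the spin plane appears as a tone at the spin frequency, whose amplitude follows from projecting the measured field onto sine and cosine at that frequency. The data and constants below are synthetic; this is not the MMS pipeline.

```python
import math

# Estimate the spin-tone amplitude in a synthetic spin-plane field signal
# by projecting onto sine/cosine at the (assumed) spin frequency.

SPIN_FREQ = 0.05                      # Hz, hypothetical spin rate
N, DT = 2000, 0.1                     # samples and sample period (s)

# Synthetic field: 20 nT ambient level plus a 1.5 nT offset tone.
t = [k * DT for k in range(N)]
b = [20.0 + 1.5 * math.cos(2 * math.pi * SPIN_FREQ * tk) for tk in t]

w = 2 * math.pi * SPIN_FREQ
a_cos = 2.0 / N * sum(bk * math.cos(w * tk) for bk, tk in zip(b, t))
a_sin = 2.0 / N * sum(bk * math.sin(w * tk) for bk, tk in zip(b, t))
tone_amplitude = math.hypot(a_cos, a_sin)
print(round(tone_amplitude, 2))  # → 1.5
```

Because the record spans an integer number of spin periods, the projection recovers the injected 1.5 nT tone exactly; in practice such an estimate feeds the offset correction.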

  19. The OLI Radiometric Scale Realization Round Robin Measurement Campaign

    NASA Technical Reports Server (NTRS)

    Cutlip, Hansford; Cole, Jerold; Johnson, B. Carol; Maxwell, Stephen; Markham, Brian; Ong, Lawrence; Hom, Milton; Biggar, Stuart

    2011-01-01

    A round robin radiometric scale realization was performed at the Ball Aerospace Radiometric Calibration Laboratory in January/February 2011 in support of the Operational Land Imager (OLI) Program. Participants included Ball Aerospace, NIST, NASA Goddard Space Flight Center, and the University of Arizona. The eight day campaign included multiple observations of three integrating sphere sources by nine radiometers. The objective of the campaign was to validate the radiance calibration uncertainty ascribed to the integrating sphere used to calibrate the OLI instrument. The instrument level calibration source uncertainty was validated by quantifying: (1) the long term stability of the NIST calibrated radiance artifact, (2) the responsivity scale of the Ball Aerospace transfer radiometer and (3) the operational characteristics of the large integrating sphere.

  20. Building baby universes

    NASA Astrophysics Data System (ADS)

    Coles, Peter

    2017-08-01

    The thought of a scientist trying to design a laboratory experiment in which to create a whole new universe probably sounds like it belongs in the plot of a science-fiction B-movie. But as author Zeeya Merali explains in her new book A Big Bang in a Little Room, there are more than a few eminent physicists who think that this is theoretically possible.

  1. High throughput field plant phenotyping facility at University of Nebraska-Lincoln and the first year experience

    NASA Astrophysics Data System (ADS)

    Ge, Y.; Bai, G.; Irmak, S.; Awada, T.; Stoerger, V.; Graef, G.; Scoby, D.; Schnable, J.

    2017-12-01

    The University of Nebraska-Lincoln high throughput field plant phenotyping facility is a cable robot based system built on a 1-ac field. The sensor platform is tethered with eight cables via four poles at the corners of the field for its precise control and positioning. The sensor modules on the platform include a 4-band RGB-NIR camera, a thermal infrared camera, a 3D LiDAR, VNIR spectrometers, and environmental sensors. These sensors are used to collect multifaceted physiological, structural and chemical properties of plants from the field plots. A subsurface drip irrigation system is established in this field, which allows a controlled amount of water and fertilizers to be delivered to individual plots. An extensive soil moisture sensor network is also established to monitor soil water status and serve as a feedback loop for irrigation scheduling. In the first year of operation, the field was planted with maize and soybean. Weekly ground truth data were collected from the plots to validate image and sensor data from the phenotyping system. This presentation will provide an overview of this state-of-the-art field plant phenotyping facility and present preliminary data from the first year of operation of the system.

  2. Development of public science archive system of Subaru Telescope

    NASA Astrophysics Data System (ADS)

    Baba, Hajime; Yasuda, Naoki; Ichikawa, Shin-Ichi; Yagi, Masafumi; Iwamoto, Nobuyuki; Takata, Tadafumi; Horaguchi, Toshihiro; Taga, Masatochi; Watanabe, Masaru; Okumura, Shin-Ichiro; Ozawa, Tomohiko; Yamamoto, Naotaka; Hamabe, Masaru

    2002-09-01

    We have developed a public science archive system, the Subaru-Mitaka-Okayama-Kiso Archive system (SMOKA), as a successor of the Mitaka-Okayama-Kiso Archive (MOKA) system. SMOKA provides access to the public data of Subaru Telescope, the 188 cm telescope at Okayama Astrophysical Observatory, and the 105 cm Schmidt telescope at Kiso Observatory of the University of Tokyo. Since 1997, we have tried to compile the dictionary of FITS header keywords. The accomplishment of the dictionary enabled us to construct a unified public archive of the data obtained with various instruments at the telescopes. SMOKA has two kinds of user interfaces: Simple Search and Advanced Search. Novices can search data by simply selecting the name of the target with the Simple Search interface. Experts would prefer to set detailed constraints on the query, using the Advanced Search interface. In order to improve the efficiency of searching, several new features are implemented, such as archive status plots, calibration data search, an annotation system, and an improved Quick Look Image browsing system. We can efficiently develop and operate SMOKA by adopting a three-tier model for the system. Java servlets and Java Server Pages (JSP) are useful to separate the front-end presentation from the middle and back-end tiers.

  3. Determination of Ankle and Metatarsophalangeal Stiffness During Walking and Jogging.

    PubMed

    Mager, Fabian; Richards, Jim; Hennies, Malika; Dötzel, Eugen; Chohan, Ambreen; Mbuli, Alex; Capanni, Felix

    2018-05-29

    Forefoot stiffness has been shown to influence joint biomechanics. However, little or no data exist on metatarsophalangeal stiffness. Twenty-four healthy rearfoot strike runners were recruited from a staff and student population at the University of Central Lancashire. Five repetitions of shod, self-selected speed level walking and jogging were performed. Kinetic and kinematic data were collected using retro-reflective markers placed on the lower limb and foot to create a three-segment foot model using the Calibrated Anatomical System Technique. Ankle and metatarsophalangeal moments and angles were calculated. Stiffness values were calculated as the slope of a linear best-fit line through the moment-versus-angle plot. Paired t-tests were used to compare values between walking and jogging conditions. Significant differences were seen in ankle range of motion (ROM), but not in metatarsophalangeal ROM. Maximum moments were significantly greater in the ankle during jogging, but were not significantly different at the metatarsophalangeal joint. The ankle joint exhibited significantly lower stiffness during walking than jogging, whereas the metatarsophalangeal joint exhibited significantly greater stiffness during walking. A greater understanding of forefoot stiffness may inform the development of footwear, prosthetic feet and orthotic devices, such as ankle-foot orthoses, for walking and sporting activities.
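
The stiffness computation described above reduces to the least-squares slope of joint moment against joint angle; a sketch with made-up gait data (not the study's measurements):

```python
# Joint stiffness as the slope of a linear best-fit line through
# moment-versus-angle data. Angles and moments below are hypothetical.

def stiffness(angles_deg, moments_nm_kg):
    """Least-squares slope of moment (Nm/kg) against angle (deg)."""
    n = len(angles_deg)
    mx = sum(angles_deg) / n
    my = sum(moments_nm_kg) / n
    sxx = sum((a - mx) ** 2 for a in angles_deg)
    sxy = sum((a - mx) * (m - my) for a, m in zip(angles_deg, moments_nm_kg))
    return sxy / sxx

# Ankle dorsiflexion angle (deg) and plantarflexor moment (Nm/kg) during
# a loading interval of stance (made-up numbers).
angle  = [2.0, 4.0, 6.0, 8.0, 10.0, 12.0]
moment = [0.15, 0.42, 0.70, 0.95, 1.24, 1.50]

print(round(stiffness(angle, moment), 3))  # → 0.135
```

A steeper slope means more moment is generated per degree of joint rotation, i.e. a stiffer joint.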

  4. Development and deployment of an underway radioactive cesium monitor off the Japanese coast near Fukushima Dai-ichi.

    PubMed

    Caffrey, J A; Higley, K A; Farsoni, A T; Smith, S; Menn, S

    2012-09-01

    A custom radiation monitoring system was developed by Oregon State University at the request of the Woods Hole Oceanographic Institute to measure radioactive cesium contaminants in the ocean waters near Fukushima Dai-ichi Nuclear Power Plant. The system was to be used on board the R/V Ka'imikai-O-Kanaloa during a 15 d research cruise to provide real-time approximations of radionuclide concentration and alert researchers to the possible occurrence of highly elevated radionuclide concentrations. A NaI(Tl) scintillation detector was coupled to a custom-built compact digital spectroscopy system and suspended within a sealed tank of continuously flowing seawater. A series of counts were acquired within an energy region corresponding to the main photopeak of (137)Cs. The system was calibrated using known quantities of radioactive (134)Cs and (137)Cs in a ratio equating to that present at the reactors' ocean outlet. The response between net count rate and concentration of (137)Cs was then used to generate temporal and geographic plots of (137)Cs concentration throughout the research cruise in Japanese coastal waters. The concentration of (137)Cs was low but detectable, reaching a peak of 3.8 ± 0.2 Bq/L. Copyright © 2011 Elsevier Ltd. All rights reserved.
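
The conversion from count rate to (137)Cs concentration described above is a linear calibration established with the known standards and then inverted at sea; the calibration constants below are assumed for illustration, not the system's actual response.

```python
# Invert a linear count-rate response to estimate (137)Cs concentration
# from a gross photopeak-window count rate. Constants are hypothetical.

CPS_PER_BQ_L = 4.2      # assumed net counts/s per Bq/L from tank standards
BACKGROUND_CPS = 1.1    # assumed background count rate in the window

def cs137_concentration(gross_cps):
    """Estimate (137)Cs concentration (Bq/L) from the gross count rate."""
    net = gross_cps - BACKGROUND_CPS
    return max(net, 0.0) / CPS_PER_BQ_L

print(round(cs137_concentration(17.0), 2))  # → 3.79
```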

  5. Long-term simulations of water and isoproturon dynamics in a heterogeneous soil receiving different urban waste composts

    NASA Astrophysics Data System (ADS)

    Filipović, Vilim; Coquet, Yves; Pot, Valérie; Romić, Davor; Benoit, Pierre; Houot, Sabine

    2016-04-01

    Implementing various compost amendments and tillage practices has a large influence on soil structure and can create heterogeneities at the plot/field scale. While tillage affects soil physical properties, compost application also influences chemical properties such as pesticide sorption and degradation. A long-term field experiment called "QualiAgro" (https://www6.inra.fr/qualiagro_eng/), conducted since 1998, aims at characterizing the agronomic value of urban waste composts and their environmental impacts. A modeling study was carried out using HYDRUS-2D for the 2004-2010 period to assess the effects of two different compost types, combined with the presence of heterogeneities due to tillage, on water and isoproturon dynamics in soil. A municipal solid waste compost (MSW) and a co-compost of sewage sludge and green wastes (SGW) were applied to experimental plots and compared to a control plot without any compost addition (CONT). Two wick lysimeters, 5 TDR probes, and 7 tensiometers were installed per plot to monitor water and isoproturon dynamics. In the ploughed layer, four zones with differing soil structure were identified: compacted clods (Δ), non-compacted soil (Γ), interfurrows (IF), and the plough pan (PP). These soil structural zones were implemented into HYDRUS-2D according to field observation and using measured soil hydraulic properties. Lysimeter data for the 2004-2010 period showed that the CONT plot had the largest cumulative water outflow (1388 mm) compared to the MSW plot (962 mm) and the SGW plot (979 mm). After calibration of soil hydraulic properties, HYDRUS-2D was able to describe cumulative water outflow for the whole 2004-2010 period with a model efficiency value of 0.99 for all three plots. Isoproturon leaching had the largest cumulative value in the CONT plot (21.31 μg), while similar cumulative amounts were measured in the SGW (0.663 μg) and MSW (0.245 μg) plots. The model was able to simulate isoproturon leaching patterns except for the large preferential flow events that were observed in the MSW and CONT plots; the timing of these preferential flow events could be reproduced by the model, but not their magnitude. Additional simulations were carried out, assuming temporal variation of the IPU degradation rate, to explain the leaching events observed at the end of the monitoring period (2010). Modeling results indicate that spatial and temporal variations in pesticide degradation rate due to tillage and compost application play a major role in the dynamics of isoproturon leaching. Both types of compost were found to reduce isoproturon leaching over the long-term (6 years) duration of the field experiment. Keywords: Compost amendment; Soil heterogeneity; Conventional tillage; Water flow; Isoproturon; HYDRUS-2D
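
The "model efficiency" used to judge the simulated outflow is, conventionally in hydrology, the Nash-Sutcliffe efficiency, which compares model residuals with the variance of the observations (1 = perfect match, 0 = no better than the observed mean). A sketch on illustrative outflow values, not the experiment's data:

```python
# Nash-Sutcliffe model efficiency for a simulated vs observed series.
# Values below are illustrative, not the QualiAgro measurements.

def nash_sutcliffe(observed, simulated):
    mean_obs = sum(observed) / len(observed)
    ss_res = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot

# Cumulative outflow (mm) at successive sampling dates (illustrative).
observed  = [120.0, 340.0, 610.0, 880.0, 1150.0, 1388.0]
simulated = [115.0, 352.0, 598.0, 890.0, 1142.0, 1395.0]

print(round(nash_sutcliffe(observed, simulated), 3))
```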

  6. JUPITER PROJECT - JOINT UNIVERSAL PARAMETER IDENTIFICATION AND EVALUATION OF RELIABILITY

    EPA Science Inventory

    The JUPITER (Joint Universal Parameter IdenTification and Evaluation of Reliability) project builds on the technology of two widely used codes for sensitivity analysis, data assessment, calibration, and uncertainty analysis of environmental models: PEST and UCODE.

  7. Report on International Spaceborne Imaging Spectroscopy Technical Committee Calibration and Validation Workshop, National Environment Research Council Field Spectroscopy Facility, University of Edinburgh

    NASA Technical Reports Server (NTRS)

    Ong, C.; Mueller, A.; Thome, K.; Bachmann, M.; Czapla-Myers, J.; Holzwarth, S.; Khalsa, S. J.; Maclellan, C.; Malthus, T.; Nightingale, J.; et al.

    2016-01-01

    Calibration and validation are fundamental for obtaining quantitative information from Earth Observation (EO) sensor data. Recognising this and the impending launch of at least five sensors in the next five years, the International Spaceborne Imaging Spectroscopy Technical Committee instigated a calibration and validation initiative. A workshop was conducted recently as part of this initiative with the objective of establishing a good practice framework for radiometric and spectral calibration and validation in support of spaceborne imaging spectroscopy missions. This paper presents the outcomes and recommendations for future work arising from the workshop.

  8. DIRBE External Calibrator (DEC)

    NASA Technical Reports Server (NTRS)

    Wyatt, Clair L.; Thurgood, V. Alan; Allred, Glenn D.

    1987-01-01

    Under NASA Contract No. NAS5-28185, the Center for Space Engineering at Utah State University has produced a calibration instrument for the Diffuse Infrared Background Experiment (DIRBE). DIRBE is one of the instruments aboard the Cosmic Background Explorer (COBE). The calibration instrument is referred to as the DEC (DIRBE External Calibrator). DEC produces a steerable infrared beam of controlled spectral content and intensity, with selectable point-source or diffuse-source characteristics, that can be directed into the DIRBE to map fields and determine response characteristics. This report discusses the design of the DEC instrument, its operation and characteristics, and provides an analysis of the system's capabilities and performance.

  9. Experimental Determination of the HPGe Spectrometer Efficiency Calibration Curves for Various Sample Geometry for Gamma Energy from 50 keV to 2000 keV

    NASA Astrophysics Data System (ADS)

    Saat, Ahmad; Hamzah, Zaini; Yusop, Mohammad Fariz; Zainal, Muhd Amiruddin

    2010-07-01

    Detection efficiency of a gamma-ray spectrometry system depends on, among other factors, energy, sample and detector geometry, and the volume and density of the samples. In the present study, efficiency calibration curves of a newly acquired (August 2008) HPGe gamma-ray spectrometry system were determined for four sample container geometries, namely Marinelli beaker, disc, cylindrical beaker and vial, normally used for activity determination of gamma rays from environmental samples. Calibration standards were prepared using a known amount of analytical-grade uranium trioxide ore homogenized in plain flour in the respective containers. The ore produces gamma rays with energies ranging from 53 keV to 1001 keV. Analytical-grade potassium chloride standards were prepared to determine the detection efficiency of the 1460 keV gamma ray emitted by the potassium isotope K-40. Plots of detection efficiency against gamma-ray energy for the four sample geometries were found to fit smoothly to a general form ε = A·E^a + B·E^b, where ε is the efficiency, E is the energy in keV, and A, B, a and b are constants that depend on the sample geometry. All calibration curves showed the presence of a "knee" at about 180 keV. Comparison between the four geometries showed that the efficiency of the Marinelli beaker is higher than those of the cylindrical beaker and vial, while the disc geometry showed the lowest.
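The two-term power-law form ε = A·E^a + B·E^b above is linear in A and B once the exponents are fixed, so one simple way to fit it without nonlinear optimization is to scan an exponent grid and solve a linear least-squares problem at each grid point. A sketch with synthetic calibration points (all coefficients invented for illustration, not the paper's values):

```python
import numpy as np

# Hypothetical calibration points (energy in keV, efficiency), illustration only
E = np.array([53.0, 90.0, 180.0, 350.0, 600.0, 1001.0, 1460.0])
eps = 0.5 * E**-0.8 + 2.0e-4 * E**0.2   # synthetic "measurements"

def fit_two_power(E, eps, exps=np.arange(-1.5, 1.01, 0.05)):
    """Fit eps = A*E**a + B*E**b: for fixed (a, b) the model is linear in
    A and B, so scan an exponent grid and keep the best least-squares fit."""
    best = None
    for a in exps:
        for b in exps:
            if b <= a:
                continue
            X = np.column_stack([E**a, E**b])
            coef, *_ = np.linalg.lstsq(X, eps, rcond=None)
            sse = float(np.sum((X @ coef - eps) ** 2))
            if best is None or sse < best[0]:
                best = (sse, a, b, coef)
    _, a, b, (A, B) = best
    return A, a, B, b

A, a, B, b = fit_two_power(E, eps)
print(round(float(a), 2), round(float(b), 2))
```

With real data the grid resolution can be refined around the best pair, or the grid result used to seed a nonlinear fit.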

  10. Diversity of arbuscular mycorrhiza in the rhizosphere of Cajeput in agroforestry system with different fertilizer management of maize

    NASA Astrophysics Data System (ADS)

    Parwi; Pudjiasmanto, B.; Purnomo, D.; Cahyani, VR

    2017-11-01

    This study investigated the diversity of arbuscular mycorrhiza in the rhizosphere of cajeput under different fertilizer management of maize. The research was conducted by observation of a cajeput agroforestry system in Ponorogo with five fertilizer managements of maize: conventional management (CM), universal management (UM) and alternative managements (AM1, AM2, and AM3). The results showed that the highest infection of arbuscular mycorrhiza was observed in the AM3 plot, while the lowest colonization was observed in the CM plot. Infection of arbuscular mycorrhiza in cajeput roots across the five fertilizer managements ranged from 32.64% to 63.33%. Across all fertilizer managements, there were eight species of arbuscular mycorrhiza, of which five belonged to the genus Glomus, one to Acaulospora and two to Gigaspora. Glomus constrictum was the dominant species under all fertilizer managements. Acaulospora favoeta was found only in the AM3 plot. Spore density varied between 150 and 594 per 100 g of soil; the highest spore density was observed in the AM3 plot and the lowest in the AM1 plot. The highest diversity index values of arbuscular mycorrhiza (species richness and Shannon-Wiener) were observed in the AM3 plot.
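The Shannon-Wiener diversity index mentioned above is H' = −Σ p_i ln p_i over the species proportions p_i. A minimal sketch with hypothetical spore counts (not the study's data):

```python
import math

def shannon_wiener(counts):
    """Shannon-Wiener diversity H' = -sum(p_i * ln p_i) over species proportions."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

# Hypothetical spore counts per species, illustration only
counts = [120, 80, 40, 30, 15, 10, 5, 3]
print(round(shannon_wiener(counts), 3))
```

H' rises both with the number of species and with how evenly individuals are spread among them, which is why it is reported alongside plain species richness.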

  11. In-Situ Field Data Gathering Stations, San Francisco Bay-Delta, Salinity Intrusion with Navigation Channels. Appendices 1-11.

    DTIC Science & Technology

    1981-03-18

    submitted 430 days after the notice to proceed. The final report will discuss instrumentation at the sites and present composite plots of the two years... tightened fittings inside the instrument. This allowed leakage of the very viscous silicon fluid which fills the sensor after the sensor was... [remainder of extracted snippet garbled: vendor contact details and CSTD calibration notes]

  12. Standard method of test for grindability of coal by the Hardgrove-machine method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1975-01-01

    A procedure is described for sampling coal, grinding in a Hardgrove grinding machine, and passing through standard sieves to determine the degree of pulverization of coals. The grindability index of the coal tested is calculated from a calibration chart prepared by plotting weight of material passing a No. 200 sieve versus the Hardgrove Grindability Index for the standard reference samples. The Hardgrove machine is shown schematically. The method for preparing and determining grindability indexes of standard reference samples is given in the appendix. (BLM)
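The grindability determination described above reads the index off a calibration chart of sieve-passing weight versus the certified indices of the standard reference samples. A minimal sketch of that chart lookup by linear interpolation, with invented calibration points (not the standard's actual values):

```python
import numpy as np

# Hypothetical calibration chart: weight (g) passing a No. 200 sieve for four
# standard reference samples vs their certified Hardgrove indices, illustration only
weight_passing = np.array([5.2, 9.8, 15.1, 21.4])   # grams
standard_hgi   = np.array([40.0, 60.0, 80.0, 100.0])

def grindability_index(sample_weight_passing):
    """Read the HGI of a test coal off the calibration chart by interpolation."""
    return float(np.interp(sample_weight_passing, weight_passing, standard_hgi))

print(grindability_index(12.0))
```

In practice the chart is built once per machine from the reference samples, and each test coal's sieve result is then converted through it.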

  13. Relation between the Surface Friction of Plates and their Statistical Microgeometry

    DTIC Science & Technology

    1980-01-01

    ...3-6 and 3-7. Calibration data are taken for each of the unit exponent values and best-fit lines by least squares fitted through each set of... parameter (Clauser 1954, 1956). Data from near-equilibrium flows (Coles & Hurst 1968) were plotted along with some typical non-equilibrium... not too bad a fit even for the non-equilibrium flows. Coles and Hurst (1968) recommended that the fit of the law of the wake to velocity profiles should be...

  14. Scaling Issues Between Plot and Satellite Radiobrightness Observations of Arctic Tundra

    NASA Technical Reports Server (NTRS)

    Kim, Edward J.; England, Anthony W.; Judge, Jasmeet; Zukor, Dorothy J. (Technical Monitor)

    2000-01-01

    Data from the new generation of satellite microwave radiometers will allow the detection of seasonal to decadal changes in the arctic hydrologic cycle as expressed in temporal and spatial patterns of moisture stored in soil and snow. This new capability will require calibrated Land Surface Process/Radiobrightness (LSP/R) models for the principal terrains found in the circumpolar Arctic. These LSP/R models can then be used in weak-constraint Dimensional Data Assimilation (DDA) of the daily satellite observations to estimate temperature and moisture profiles within the permafrost active layer.

  15. Simple and accurate quantification of BTEX in ambient air by SPME and GC-MS.

    PubMed

    Baimatova, Nassiba; Kenessov, Bulat; Koziel, Jacek A; Carlsen, Lars; Bektassov, Marat; Demyanenko, Olga P

    2016-07-01

    Benzene, toluene, ethylbenzene and xylenes (BTEX) comprise one of the most ubiquitous and hazardous groups of ambient air pollutants of concern. Application of standard analytical methods for quantification of BTEX is limited by the complexity of sampling and sample preparation equipment, and by budget requirements. Methods based on SPME represent a simpler alternative, but still require complex calibration procedures. The objective of this research was to develop a simpler, low-budget, and accurate method for quantification of BTEX in ambient air based on SPME and GC-MS. Standard 20-mL headspace vials were used for field air sampling and calibration. To avoid the challenges of obtaining and working with 'zero' air, slope factors of external standard calibration were determined using standard addition and inherently polluted lab air. For the polydimethylsiloxane (PDMS) fiber, differences between the slope factors of calibration plots obtained using lab and outdoor air were below 14%. The PDMS fiber provided higher precision during calibration, while the use of a Carboxen/PDMS fiber resulted in lower detection limits for benzene and toluene. To provide sufficient accuracy, the use of 20-mL vials requires triplicate sampling and analysis. The method was successfully applied to the analysis of 108 ambient air samples from Almaty, Kazakhstan. Average concentrations of benzene, toluene, ethylbenzene and o-xylene were 53, 57, 11 and 14 µg m(-3), respectively. The developed method can be modified for further quantification of a wider range of volatile organic compounds in air. In addition, the new method is amenable to automation. Copyright © 2016 Elsevier B.V. All rights reserved.
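The standard-addition calibration described above reduces to a linear regression: known amounts are spiked into already-polluted air, the slope of response versus added concentration gives the sensitivity (slope factor), and the magnitude of the x-intercept estimates the ambient level. A sketch with invented peak-area numbers (not the paper's data):

```python
import numpy as np

# Hypothetical standard-addition series for one analyte, illustration only:
# added concentration (ug/m3) and corresponding GC-MS peak area
added = np.array([0.0, 20.0, 40.0, 60.0, 80.0])
area  = np.array([1.50e5, 2.10e5, 2.72e5, 3.29e5, 3.91e5])

slope, intercept = np.polyfit(added, area, 1)   # slope = calibration sensitivity
ambient = intercept / slope                     # x-intercept magnitude = ambient level
print(round(float(slope), 1), round(float(ambient), 1))
```

The fitted slope can then serve as the external-standard slope factor for field samples collected in the same vials.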

  16. A new model for bed load sampler calibration to replace the probability-matching method

    Treesearch

    Robert B. Thomas; Jack Lewis

    1993-01-01

    In 1977 extensive data were collected to calibrate six Helley-Smith bed load samplers with four sediment particle sizes in a flume at the St. Anthony Falls Hydraulic Laboratory at the University of Minnesota. Because sampler data cannot be collected at the same time and place as "true" trap measurements, the "probability-matching...

  17. Postgraduate Preferences: A Study of Factors Contributing to Programme Satisfaction amongst Masters Students

    ERIC Educational Resources Information Center

    Frumkin, Lara A.; Milankovic-Atkinson, Maya; Sadler, Chris

    2007-01-01

    Background: Universities have a vested interest in attracting and encouraging enrolment of as many high calibre students as possible. With greater frequency, universities are using marketing techniques to do so. Aims: The study reviewed current student opinions of a programme within a UK university to discover its shortcomings and strengths.…

  18. Mitigating Uncertainty from Vegetation Spatial Complexity with Highly Portable Lidar

    NASA Astrophysics Data System (ADS)

    Paynter, I.; Schaaf, C.; Peri, F.; Saenz, E. J.; Genest, D.; Strahler, A. H.; Li, Z.

    2015-12-01

    To fully utilize the excellent spatial coverage and temporal resolution offered by satellite resources for estimating ecological variables, fine-scale observations are required for comparison, calibration and validation. Lidar instruments have proved effective in estimating the properties of vegetation components of ecosystems, but they are often challenged by occlusion, especially in structurally complex and spatially fragmented ecosystems such as tropical forests. Increasing the range of view angles, both horizontally and vertically, by increasing the number of scans can mitigate occlusion. However, these scans must occur within the window of temporal stability for the ecosystem and vegetation property being measured. The Compact Biomass Lidar (CBL) is a terrestrial laser scanner (TLS) optimized for portability and scanning speed, developed and operated by the University of Massachusetts Boston. This 905 nm wavelength scanner achieves an angular resolution of 0.25 degrees at a rate of 33 seconds per scan. The ability to acquire many scans within narrow windows of temporal stability for ecological variables has facilitated a more complete investigation of ecosystem structural characteristics, and of their expression as a function of view angle. The lightweight CBL has facilitated the use of alternative deployment platforms including towers, trams and masts, allowing analysis of the vertical structure of ecosystems, even in highly enclosed environments such as the sub-canopy of tropical forests where aerial vehicles cannot currently operate. 
We will present results from view angle analyses of lidar surveys of tropical rainforest in La Selva, Costa Rica where the CBL was deployed at heights up to 10m in Carbono long-term research plots utilizing a portable mast, and on a 25m stationary tower; and temperate forest at Harvard Forest, Massachusetts, USA, where the CBL has been deployed biannually at long-term research plots of hardwood and hemlock, as well as at heights of up to 25m utilizing a stationary tower.

  19. Testing and validating environmental models

    USGS Publications Warehouse

    Kirchner, J.W.; Hooper, R.P.; Kendall, C.; Neal, C.; Leavesley, G.

    1996-01-01

    Generally accepted standards for testing and validating ecosystem models would benefit both modellers and model users. Universally applicable test procedures are difficult to prescribe, given the diversity of modelling approaches and the many uses for models. However, the generally accepted scientific principles of documentation and disclosure provide a useful framework for devising general standards for model evaluation. Adequately documenting model tests requires explicit performance criteria, and explicit benchmarks against which model performance is compared. A model's validity, reliability, and accuracy can be most meaningfully judged by explicit comparison against the available alternatives. In contrast, current practice is often characterized by vague, subjective claims that model predictions show 'acceptable' agreement with data; such claims provide little basis for choosing among alternative models. Strict model tests (those that invalid models are unlikely to pass) are the only ones capable of convincing rational skeptics that a model is probably valid. However, 'false positive' rates as low as 10% can substantially erode the power of validation tests, making them insufficiently strict to convince rational skeptics. Validation tests are often undermined by excessive parameter calibration and overuse of ad hoc model features. Tests are often also divorced from the conditions under which a model will be used, particularly when it is designed to forecast beyond the range of historical experience. In such situations, data from laboratory and field manipulation experiments can provide particularly effective tests, because one can create experimental conditions quite different from historical data, and because experimental data can provide a more precisely defined 'target' for the model to hit. 
We present a simple demonstration showing that the two most common methods for comparing model predictions to environmental time series (plotting model time series against data time series, and plotting predicted versus observed values) have little diagnostic power. We propose that it may be more useful to statistically extract the relationships of primary interest from the time series, and test the model directly against them.
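The call above for explicit benchmarks can be made concrete with a simple skill score: compare the model's RMSE against that of a naive alternative, such as always predicting the observed mean. A minimal sketch with hypothetical series (illustration only):

```python
import numpy as np

def rmse(obs, pred):
    """Root-mean-square error between two series."""
    return float(np.sqrt(np.mean((np.asarray(obs) - np.asarray(pred)) ** 2)))

# Hypothetical observed and modeled time series, illustration only
obs   = np.array([2.0, 3.1, 4.2, 3.9, 5.0, 6.1, 5.8])
model = np.array([2.2, 3.0, 4.0, 4.2, 4.8, 6.0, 6.0])

benchmark = np.full_like(obs, obs.mean())   # "no-skill" alternative: predict the mean
skill = 1.0 - rmse(obs, model) / rmse(obs, benchmark)
print(round(skill, 2))   # positive means the model beats the explicit benchmark
```

Reporting skill against a stated benchmark, rather than visual "acceptable agreement", is exactly the kind of explicit performance criterion the abstract argues for.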

  20. A calibration rig for multi-component internal strain gauge balance using the new design-of-experiment (DOE) approach

    NASA Astrophysics Data System (ADS)

    Nouri, N. M.; Mostafapour, K.; Kamran, M.

    2018-02-01

    In a closed water-tunnel circuit, multi-component strain gauge force and moment sensors (also known as balances) are generally used to measure hydrodynamic forces and moments acting on scaled models. These balances are periodically calibrated by static loading, and their performance and accuracy depend significantly on the rig and the method of calibration. In this research, a new calibration rig was designed and constructed to calibrate multi-component internal strain gauge balances. The calibration rig has six degrees of freedom and six different component-loading structures that can be applied separately or synchronously. The system was designed around the applicability of formal experimental design techniques, using gravity for balance loading and for balance positioning and alignment relative to gravity. To evaluate the calibration rig, a six-component internal balance developed by Iran University of Science and Technology was calibrated using response surface methodology. According to the results, the calibration rig met all design criteria. The rig provides the means by which various formal experimental design techniques can be implemented, and its simplicity saves time and money in the design of experiments and in balance calibration while simultaneously increasing the accuracy of these activities.
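Response surface methodology, as used above for the balance calibration, amounts to fitting a low-order polynomial response model to the static-loading data and then inverting it to recover loads from bridge readings. A one-component sketch (the sensitivity and nonlinearity coefficients are invented, not the actual balance's):

```python
import numpy as np

# Hypothetical single-component calibration: applied load F (N) vs bridge
# output (mV/V) with a small quadratic nonlinearity, illustration only
F   = np.array([0.0, 50.0, 100.0, 150.0, 200.0, 250.0])
out = 0.004 * F + 1.0e-6 * F**2

# Second-order response surface (1-D here): out = c0 + c1*F + c2*F^2
X = np.column_stack([np.ones_like(F), F, F**2])
coef, *_ = np.linalg.lstsq(X, out, rcond=None)

# Invert the fitted calibration to recover the load from a new reading
reading = 0.004 * 120.0 + 1.0e-6 * 120.0**2
roots = np.roots([coef[2], coef[1], coef[0] - reading])
load = float(np.real(roots[np.argmin(np.abs(roots - 120.0))]))
print(round(load, 1))
```

A real six-component balance extends this to six coupled response surfaces with cross-product interaction terms between the load components.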

  1. SCALA: In situ calibration for integral field spectrographs

    NASA Astrophysics Data System (ADS)

    Lombardo, S.; Küsters, D.; Kowalski, M.; Aldering, G.; Antilogus, P.; Bailey, S.; Baltay, C.; Barbary, K.; Baugh, D.; Bongard, S.; Boone, K.; Buton, C.; Chen, J.; Chotard, N.; Copin, Y.; Dixon, S.; Fagrelius, P.; Feindt, U.; Fouchez, D.; Gangler, E.; Hayden, B.; Hillebrandt, W.; Hoffmann, A.; Kim, A. G.; Leget, P.-F.; McKay, L.; Nordin, J.; Pain, R.; Pécontal, E.; Pereira, R.; Perlmutter, S.; Rabinowitz, D.; Reif, K.; Rigault, M.; Rubin, D.; Runge, K.; Saunders, C.; Smadja, G.; Suzuki, N.; Taubenberger, S.; Tao, C.; Thomas, R. C.; Nearby Supernova Factory

    2017-11-01

    Aims: The scientific yield of current and future optical surveys is increasingly limited by systematic uncertainties in the flux calibration. This is the case for type Ia supernova (SN Ia) cosmology programs, where an improved calibration directly translates into improved cosmological constraints. Current methodology rests on models of stars. Here we aim to obtain flux calibration that is traceable to state-of-the-art detector-based calibration. Methods: We present the SNIFS Calibration Apparatus (SCALA), a color (relative) flux calibration system developed for the SuperNova integral field spectrograph (SNIFS), operating at the University of Hawaii 2.2 m (UH 88) telescope. Results: By comparing the color trend of the illumination generated by SCALA during two commissioning runs with previous laboratory measurements, we show that we can determine the light emitted by SCALA with a long-term repeatability better than 1%. We describe the calibration procedure necessary to control for system aging. We present measurements of the SNIFS throughput as estimated by SCALA observations. Conclusions: The SCALA calibration unit is now fully deployed at the UH 88 telescope, and with it color calibration between 4000 Å and 9000 Å is stable at the percent level over a one-year baseline.

  2. Adiabatic Shear Bands in Simple and Dipolar Viscoplastic Materials

    DTIC Science & Technology

    1991-08-01

    ...Department of Engineering Mechanics, University of Missouri-Rolla, Rolla, MO 65401-0249... Acta Mechanica 86... points 13, 14, 15 and 17 becomes clear from the results plotted in Fig. 3d. The plots of the temperature rise at other points considered are not...

  3. Frequency and Content of Chat Questions by Time of Semester at the University of Central Florida: Implications for Training, Staffing and Marketing

    ERIC Educational Resources Information Center

    Goda, Donna; Bishop, Corinne

    2008-01-01

    The more than 4,000 "chats" received by the University of Central Florida's (UCF) Ask-A-Librarian digital reference service are the subject of this practitioner-based, descriptive case study. Question content from chats received during four semesters between January 2005 and May 2006 are categorized and plotted, by semester, to show the…

  4. The Effect of Psychological Counselling in Group on Life Orientation and Loneliness Levels of the University Students

    ERIC Educational Resources Information Center

    Gurgan, Ugur

    2013-01-01

    The present study was an experimental investigation aimed at increasing the life orientation and decreasing the loneliness of university students, in which the effect of group psychological counselling on loneliness level was analysed. The study, consisting of mixed measurements, was carried out with a 2x2 split-plot design in order…

  5. Analysis of Resistant Starches in Rat Cecal Contents Using Fourier Transform Infrared Photoacoustic Spectroscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Timothy J.; Ai, Yongfeng; Jones, Roger W.

    Fourier transform infrared photoacoustic spectroscopy (FTIR-PAS) qualitatively and quantitatively measured resistant starch (RS) in rat cecal contents. Fisher 344 rats were fed diets of 55% (w/w, dry basis) starch for 8 weeks. Cecal contents were collected from sacrificed rats. A corn starch control was compared against three RS diets. The RS diets were high-amylose corn starch (HA7), HA7 chemically modified with octenyl succinic anhydride, and stearic-acid-complexed HA7 starch. To calibrate the FTIR-PAS analysis, samples from each diet were analyzed using an enzymatic assay. A partial least-squares cross-validation plot generated from the enzymatic assay and FTIR-PAS spectral results for starch fit the ideal curve with an R2 of 0.997. A principal component analysis plot of components 1 and 2 showed that spectra from the diets clustered separately from each other. This study clearly showed that FTIR-PAS can accurately quantify starch content and identify the form of starch in complex matrices.

  6. Thermal Infrared Spectral Imager for Airborne Science Applications

    NASA Technical Reports Server (NTRS)

    Johnson, William R.; Hook, Simon J.; Mouroulis, Pantazis; Wilson, Daniel W.; Gunapala, Sarath D.; Hill, Cory J.; Mumolo, Jason M.; Eng, Bjorn T.

    2009-01-01

    An airborne thermal hyperspectral imager is under development which utilizes the compact Dyson optical configuration and a quantum well infrared photodetector (QWIP) focal plane array. The Dyson configuration uses a single monolithic prism-like grating design which allows for a high-throughput instrument (F/1.6) with minimal ghosting, stray light and a large swath width. The configuration has the potential to be the optimal imaging spectroscopy solution for lighter-than-air (LTA) vehicles and unmanned aerial vehicles (UAV) due to its small form factor and relatively low power requirements. The planned instrument specifications are discussed as well as design trade-offs. Calibration testing results (noise equivalent temperature difference, spectral linearity and spectral bandwidth) and laboratory emissivity plots from samples are shown using an operational testbed unit which has similar specifications to the final airborne system. Field testing of the testbed unit was performed to acquire plots of apparent emissivity for various known standard minerals (such as quartz). A comparison is made using data from the ASTER spectral library.

  7. Effective Cross Section of Cold Formed Steel Column Under Axial Compression

    NASA Astrophysics Data System (ADS)

    Manikandan, P.; Pradeep, T.

    2018-06-01

    The compressive resistance of cold-formed steel (CFS) sections may be governed by local, distortional or overall buckling, and by any interaction between these modes. A new stiffened CFS section is selected in this study; the cross-section geometries and lengths are chosen such that all types of buckling modes are encountered. Buckling curves are generated using linear elastic buckling analysis software (CUFSM). Using test results from the literature, the developed finite element model is calibrated, and further a parametric study of 126 models is conducted, varying the dimensions and length of the cross section, the thickness and the yield stress. The FEA included relevant material and geometric imperfections. All columns are analyzed under pin-ended conditions with axial compression. The analysis results demonstrate that the DSM equations generally assess the strength of the stiffened section conservatively. Modifications to the DSM equations are recommended to evaluate the strength of the stiffened section more precisely.
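The DSM equations referenced above are standardized in AISI S100; for the local buckling check, the nominal strength follows from the global (overall) strength P_ne and the elastic local buckling load P_crl. A sketch of that check with illustrative loads (the numbers are invented, not the paper's results):

```python
import math

def dsm_local(P_ne, P_crl):
    """AISI Direct Strength Method, local buckling check: nominal strength
    P_nl from the global strength P_ne and elastic local buckling load P_crl."""
    lam = math.sqrt(P_ne / P_crl)          # local slenderness
    if lam <= 0.776:                       # stocky: no local reduction
        return P_ne
    r = (P_crl / P_ne) ** 0.4
    return (1.0 - 0.15 * r) * r * P_ne     # slender: reduced strength

# Example: column with P_ne = 100 kN and P_crl = 60 kN, illustration only
print(round(dsm_local(100.0, 60.0), 1))
```

The distortional check has the same structure with different coefficients, and the governing strength is the minimum over the mode checks.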

  8. An exhaustive survey of regular peptide conformations using a new metric for backbone handedness (h)

    PubMed Central

    2017-01-01

    The Ramachandran plot is important to structural biology as it describes a peptide backbone in the context of its dominant degrees of freedom—the backbone dihedral angles φ and ψ (Ramachandran, Ramakrishnan & Sasisekharan, 1963). Since its introduction, the Ramachandran plot has been a crucial tool to characterize protein backbone features. However, the conformation or twist of a backbone as a function of φ and ψ has not been completely described for both cis and trans backbones. Additionally, little intuitive understanding is available about a peptide's conformation simply from knowing the φ and ψ values of a peptide (e.g., is the regular peptide defined by φ = ψ = −100° left-handed or right-handed?). This report provides a new metric for backbone handedness (h) based on interpreting a peptide backbone as a helix with axial displacement d and angular displacement θ, both of which are derived from a peptide backbone's internal coordinates, especially the dihedral angles φ, ψ and ω. In particular, h = sin(θ)·d/|d|, with range [−1, 1] and negative (or positive) values indicating left- (or right-) handedness. The metric h is used to characterize the handedness of every region of the Ramachandran plot for both cis (ω = 0°) and trans (ω = 180°) backbones, which provides the first exhaustive survey of twist handedness in Ramachandran (φ, ψ) space. These maps fill in the 'dead space' within the Ramachandran plot, which are regions that are not commonly accessed by structured proteins, but which may be accessible to intrinsically disordered proteins, short peptide fragments, and protein mimics such as peptoids. Finally, building on the work of Zacharias & Knapp (2013), this report presents a new plot based on d and θ that serves as a universal and intuitive alternative to the Ramachandran plot. The universality arises from the fact that the co-inhabitants of such a plot include every possible peptide backbone, including cis and trans backbones. 
The intuitiveness arises from the fact that d and θ provide, at a glance, numerous aspects of the backbone including compactness, handedness, and planarity. PMID:28533975
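The handedness metric above, h = sin(θ)·d/|d|, is straightforward to evaluate once the helix parameters θ and d are known. A minimal sketch with hypothetical helix parameters (illustration only; deriving θ and d from φ, ψ, ω is the substance of the paper and is not reproduced here):

```python
import math

def handedness(theta_deg, d):
    """Backbone handedness h = sin(theta) * d/|d|, in [-1, 1].
    Negative values indicate a left-handed twist, positive a right-handed one."""
    if d == 0:
        raise ValueError("axial displacement d must be nonzero")
    return math.sin(math.radians(theta_deg)) * (d / abs(d))

# Illustration: a right-handed twist with ~100 deg angular displacement per
# residue and positive axial displacement (hypothetical values)
print(round(handedness(100.0, 1.5), 3))
```

Flipping the sign of d flips the sign of h, reflecting that the same angular twist traversed in the opposite axial direction has the opposite handedness.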

  9. The Berkeley extreme ultraviolet calibration facility

    NASA Technical Reports Server (NTRS)

    Welsh, Barry Y.; Jelinsky, Patrick; Malina, Roger F.

    1988-01-01

    The vacuum calibration facilities of the Space Sciences Laboratory, University of California at Berkeley are designed for the calibration and testing of EUV and FUV spaceborne instrumentation (spectral range 44-2500 A). The facility includes one large cylindrical vacuum chamber (3 x 5 m) containing two EUV collimators, and it is equipped with a 4-axis manipulator of angular-control resolution 1 arcsec for payloads weighing up to 500 kg. In addition, two smaller cylindrical chambers, each 0.9 x 1.2 m, are available for vacuum and thermal testing of UV detectors, filters, and space electronics hardware. All three chambers open into class-10,000 clean rooms, and all calibrations are referred to NBS secondary standards.

  10. Construction of a 1 MeV Electron Accelerator for High Precision Beta Decay Studies

    NASA Astrophysics Data System (ADS)

    Longfellow, Brenden

    2014-09-01

    Beta decay energy calibration for detectors is typically established using conversion sources. However, the calibration points from conversion sources are not evenly distributed over the beta energy spectrum, and the foil backing of the sources produces perturbations in the calibration spectrum. To improve this, an external, tunable electron beam coupled by a magnetic field can be used to calibrate the detector. The 1 MeV electron accelerator in development at Triangle Universities Nuclear Laboratory (TUNL) utilizes a pelletron charging system. The electron gun delivers 10^4 electrons per second over an energy range of 50 keV to 1 MeV and is pulsed at a 10 kHz rate with a few ns pulse width. The magnetic field in the spectrometer is 1 T, and guiding fields of 0.01 to 0.05 T for the electron gun are used to produce a range of pitch angles. This accelerator can be used to calibrate detectors evenly over its energy range and to determine the detector response over a range of pitch angles. 
TUNL REU Program.
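Electrons in the 50 keV to 1 MeV range described above are relativistic, so the momentum follows from (pc)² = T² + 2T·mc², and the gyroradius in the 1 T spectrometer field from r = p_⊥/(eB). A sketch with an assumed 30° pitch angle (illustrative numbers, not TUNL's actual beam geometry):

```python
import math

M_E_KEV = 511.0   # electron rest energy, keV

def momentum_kev(T_kev):
    """Relativistic momentum p*c in keV from kinetic energy T: (pc)^2 = T^2 + 2*T*mc^2."""
    return math.sqrt(T_kev**2 + 2.0 * T_kev * M_E_KEV)

def gyroradius_mm(T_kev, B_tesla, pitch_deg):
    """Cyclotron radius r = p_perp / (e*B), converting pc from keV to SI momentum."""
    e = 1.602176634e-19                      # elementary charge, C
    pc_si = momentum_kev(T_kev) * 1e3 * e / 2.99792458e8   # kg m/s
    p_perp = pc_si * math.sin(math.radians(pitch_deg))
    return p_perp / (e * B_tesla) * 1e3      # metres -> millimetres

# 1 MeV electron, 1 T spectrometer field, assumed 30 deg pitch angle
print(round(gyroradius_mm(1000.0, 1.0, 30.0), 2))
```

The millimetre-scale gyroradius in a 1 T field illustrates why a strong spectrometer field can guide the calibration beam onto the detector over the full energy range.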

  11. Towards improved characterization of northern wetlands (or other landscapes) by remote sensing - a rapid approach to collect ground truth data

    NASA Astrophysics Data System (ADS)

    Gålfalk, Magnus; Karlson, Martin; Crill, Patrick; Bastviken, David

    2017-04-01

    The calibration and validation of remote sensing land cover products is highly dependent on accurate ground truth data, which are costly and practically challenging to collect. This study evaluates a novel and efficient alternative to the field surveys and UAV imaging commonly applied for this task. The method consists of i) a lightweight, waterproof, remotely controlled RGB camera mounted on an extendable monopod, used to acquire wide-field images of the ground from a height of 4.5 m, and ii) a script for semi-automatic image classification. In the post-processing, the wide-field images are corrected for optical distortion and geometrically rectified so that the spatial resolution is the same over the surface area used for classification. The script distinguishes land surface components by color, brightness and spatial variability. The method was evaluated in wetland areas located around Abisko, northern Sweden. Proportional estimates of the six main surface components in the wetlands (wet and dry Sphagnum, shrub, grass, water, rock) were derived for 200 images, equivalent to 10 × 10 m field plots. These photo plots were then used as calibration data for a regional-scale satellite-based classification which separates the six wetland surface components using a Sentinel-1 time series. The method presented in this study is accurate, rapid, robust and cost-efficient in comparison to field surveys (time-consuming) and drone mapping (which requires low wind speeds and no rain, suffers from battery-limited flight times, has potential GPS/compass errors in the far north, and in some areas is prohibited by law).
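The semi-automatic classification described above distinguishes surface components by color, brightness and spatial variability. A toy per-pixel sketch using only color and brightness, with entirely invented thresholds and class names drawn from the six components listed (the study's actual script is not reproduced here):

```python
import numpy as np

def classify_pixel(r, g, b):
    """Toy rule-based surface classifier from RGB values (invented thresholds)."""
    brightness = (r + g + b) / 3.0
    if b > 1.3 * max(r, g):      # strongly blue -> open water
        return "water"
    if brightness < 60:          # very dark -> rock
        return "rock"
    if g > 1.2 * r:              # strongly green -> grass
        return "grass"
    return "sphagnum"            # default moss-like surface

# A 2x2 "image" of hypothetical RGB pixels, illustration only
img = np.array([[(30, 40, 120), (20, 20, 20)],
                [(80, 140, 60), (150, 130, 100)]], dtype=float)
labels = [[classify_pixel(*px) for px in row] for row in img]
print(labels)
```

A production version would add the spatial-variability criterion (e.g. local texture statistics) and calibrate thresholds against manually labeled patches.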

  12. Corneal surface temperature change as the mode of stimulation of the non-contact corneal aesthesiometer.

    PubMed

    Murphy, P J; Morgan, P B; Patel, S; Marshall, J

    1999-05-01

    The non-contact corneal aesthesiometer (NCCA) assesses corneal sensitivity by using a controlled pulse of air, directed at the corneal surface. The purpose of this paper was to investigate whether corneal surface temperature change was a component in the mode of stimulation. Thermocouple experiment: A simple model corneal surface was developed that was composed of a moistened circle of filter paper placed on a thermocouple and mounted on a glass slide. The temperature change produced by different stimulus pressures was measured for five different ambient temperatures. Thermal camera experiment: Using a thermal camera, the corneal surface temperature change was measured in nine young, healthy subjects after exposure to different stimulus air pulses. Pulse duration was set at 0.9 s but was varied in pressure from 0.5 to 3.5 millibars. Thermocouple experiment: An immediate drop in temperature was detected by the thermocouple as soon as the air flow was incident on the filter paper. A greater temperature change was produced by increasing the pressure of the incident air flow. A relationship was found and a calibration curve plotted. Thermal camera experiment: For each subject, a drop in surface temperature was detected at each stimulus pressure. Furthermore, as the stimulus pressure increased, the induced reduction in temperature also increased. A relationship was found and a calibration curve plotted. The NCCA air-pulse stimulus was capable of producing a localized temperature change on the corneal surface. The principal mode of corneal nerve stimulation, by the NCCA air pulse, was the rate of temperature change of the corneal surface.
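    A calibration curve of the kind described, relating stimulus pressure to induced temperature change, can be fitted by ordinary least squares. The pressure/temperature-drop pairs below are invented for illustration, not the paper's measurements:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    return a, b

# Illustrative (not measured) stimulus pressures (millibars) and
# corresponding corneal surface temperature drops (degrees C).
pressures = [0.5, 1.5, 2.5, 3.5]
temp_drops = [0.2, 0.6, 1.0, 1.4]
a, b = fit_line(pressures, temp_drops)

def pressure_for_drop(dT):
    """Invert the calibration curve: stimulus pressure for a target drop."""
    return (dT - b) / a
```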

  13. TRMM Applications for Rainfall-Induced Landslide Early Warning

    NASA Astrophysics Data System (ADS)

    Dok, A.; Fukuoka, H.; Hong, Y.

    2012-04-01

    An early warning system (EWS), properly implemented in populated areas of landslide-prone nations, is the most effective means of saving lives and reducing property damage from catastrophic landslides. Predicting the occurrence of landslides requires examining the empirical relationship between rainfall characteristics and past landslide occurrence. In developed countries like Japan and the US, precipitation is monitored by rain radars and dense ground-based rain gauge networks. In developing regions like Southeast Asia, however, very few rain gauges are available, and no methodology for issuing effective landslide warnings has yet been implemented. Satellite precipitation monitoring is therefore a promising solution for launching quasi-real-time landslide early warning systems in those countries: TMPA (TRMM Multi-satellite Precipitation Analysis) provides a globally calibrated sequential scheme for combining precipitation estimates from multiple satellites, and gauge analyses where feasible, at fine scales (3-hourly, 0.25° × 0.25° spatial resolution). It is available both after the fact and in quasi-real time, calibrated by the TRMM Combined Instrument and TRMM Microwave Imager precipitation products. Validation between ground-based rain gauges and TRMM satellite data in the vulnerable regions, however, is not yet operational. The snake-line/critical-line method and the Soil Water Index (SWI) are used for issuing landslide warnings in Japan, whereas the Caine criterion is preferred in Europe and western nations. Here we present rainfall behavior from TRMM 3B42RT, correlated with ground-based rain gauge data, for Beichuan city (located on the 2008 Chinese Wenchuan earthquake fault) and for the Japanese cities of Hofu and Shobara, which were struck by localized heavy rainfall in 2009 and 2010, respectively. The 1-day rainfall intensity and the 15-day cumulative rainfall (snake line) were plotted independently to investigate the impact of short-term rainfall intensity and of accumulated effective rainfall, respectively, and to obtain a probabilistic threshold. The Japanese SWI was also tested, to set a threshold for highly nonlinear rainfall patterns in predicting landslide occurrence, through plots of the total water in three serial tank models against daily precipitation. As a result, the snake-line plots using TMPA work well for landslide warning in the selected cities, while the SWI plots show an unusual peak value on the day of debris flow occurrence. The graph of daily precipitation vs. SWI suggests a possible critical-line zone, and a second peak appears one day before, indicating the possibility of early warning.
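    The three-serial-tank idea behind SWI can be sketched in a few lines: rain enters the top tank, each tank loses water to runoff and percolates the rest into the tank below, and SWI is the total stored water. The runoff and percolation coefficients below are illustrative assumptions, not the operational Japanese parameters:

```python
def soil_water_index(rain_mm, dt_h=1.0):
    """Toy 3-serial-tank model: SWI is the total water stored in three tanks.
    (alpha, beta) = (runoff coeff, percolation coeff) per hour for each tank;
    these values are made up for illustration."""
    params = [(0.10, 0.10), (0.05, 0.05), (0.01, 0.0)]
    storage = [0.0, 0.0, 0.0]
    swi = []
    for r in rain_mm:
        inflow = r  # rainfall feeds the top tank
        for i, (alpha, beta) in enumerate(params):
            storage[i] += inflow
            runoff = alpha * storage[i] * dt_h
            perc = beta * storage[i] * dt_h
            storage[i] -= runoff + perc
            inflow = perc  # percolation feeds the next tank down
        swi.append(sum(storage))
    return swi
```

    Plotting this SWI series against daily precipitation is the kind of graph the study uses to look for a critical-line zone.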

  14. 48. Photograph of an original construction drawing, dated August 1927, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    48. Photograph of an original construction drawing, dated August 1927, in the possession of Facilities Planning Office, Iowa State University, Ames, Iowa. ELEVATIONS CROSS SECTIONS THROUGH COURTYARD SHOWING EAST ELEVATION OF FRONT (WEST) PORTION OF BUILDING, SOUTH ELEVATION OF NORTH WING, NORTH ELEVATION OF SOUTH WING, PLOT PLAN, AND DETAILS; SHEET NO. 6 OF 10 - Dairy Industry Building, Iowa State University campus, Ames, Story County, IA

  15. Polymer Electrolyte Based on Poly(ethylene imine) and Lithium Salts.

    DTIC Science & Technology

    1985-10-01

    Plots of AC impedance data were obtained over a frequency range beginning at 100 Hz; AC impedance was determined using a computerized Hewlett-Packard system.

  16. A randomized comparison between league tables and funnel plots to inform health care decision-making.

    PubMed

    Anell, Anders; Hagberg, Oskar; Liedberg, Fredrik; Ryden, Stefan

    2016-12-01

    Comparison of provider performance is commonly used to inform health care decision-making. Little attention has been paid to how data presentations influence decisions. This study analyzes differences in actions suggested by decision-makers informed by league tables or funnel plots. Decision-makers were invited to a survey and randomized to compare hospital performance using either league tables or funnel plots for four different measures within the area of cancer care. For each measure, decision-makers were asked to suggest actions towards 12-16 hospitals (no action, ask for more information, intervene) and to provide feedback on whether the information provided had been useful. Setting: Swedish health care. Participants: 221 decision-makers at administrative and clinical levels. Intervention: data presentations in the form of league tables or funnel plots. Main outcome measures: number of actions suggested by participants and proportion of appropriate actions. For all four measures, decision-makers tended to suggest more actions based on the information provided in league tables compared to funnel plots (44% vs. 21%, P < 0.001). Actions were on average more appropriate for funnel plots. However, when using funnel plots, decision-makers more often failed to react even when action was appropriate. The form of data presentation had an influence on decision-making. With league tables, decision-makers tended to suggest more actions compared to funnel plots. A difference in sensitivity and specificity conditioned by the form of presentation could also be identified, with different implications depending on the purpose of comparisons. Explanations and visualization aids are needed to support appropriate actions. © The Author 2016. Published by Oxford University Press in association with the International Society for Quality in Health Care. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com
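    A funnel plot draws control limits that widen at low volume, so only hospitals outside the funnel trigger action. A minimal sketch using the normal approximation to the binomial; the z threshold and the decision labels are assumptions for illustration, not the study's survey instrument:

```python
import math

def funnel_limits(p0, n, z=1.96):
    """Approximate 95% control limits around target rate p0 for a hospital
    with n cases (normal approximation to the binomial)."""
    se = math.sqrt(p0 * (1 - p0) / n)
    return max(0.0, p0 - z * se), min(1.0, p0 + z * se)

def flag_hospital(events, n, p0):
    """Suggest action only when the observed rate leaves the funnel."""
    lo, hi = funnel_limits(p0, n)
    rate = events / n
    if rate > hi:
        return "above funnel"
    if rate < lo:
        return "below funnel"
    return "within funnel"
```

    A league table, by contrast, ranks all hospitals regardless of volume, which is consistent with the study's finding that league tables provoke more (and less appropriate) actions.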

  17. Research product transfer for local calibration factors of the Highway Safety Manual (HSM) and integrated surrogate safety assessment framework : final report.

    DOT National Transportation Integrated Search

    2015-12-01

    This technology transfer workshop presented transportation planners in the public and private sectors with two successful and closely related studies, conducted respectively by Morgan State University and the University of Virginia. The first module ...

  18. Derivation and validation of the prediabetes self-assessment screening score after acute pancreatitis (PERSEUS).

    PubMed

    Soo, Danielle H E; Pendharkar, Sayali A; Jivanji, Chirag J; Gillies, Nicola A; Windsor, John A; Petrov, Maxim S

    2017-10-01

    Approximately 40% of patients develop abnormal glucose metabolism after a single episode of acute pancreatitis. This study aimed to develop and validate a prediabetes self-assessment screening score for patients after acute pancreatitis. Data from non-overlapping training (n=82) and validation (n=80) cohorts were analysed. Univariate logistic and linear regression identified variables associated with prediabetes after acute pancreatitis. Multivariate logistic regression developed the score, ranging from 0 to 215. The area under the receiver-operating characteristic curve (AUROC), Hosmer-Lemeshow χ² statistic, and calibration plots were used to assess model discrimination and calibration. The developed score was validated using data from the validation cohort. The score had an AUROC of 0.88 (95% CI, 0.80-0.97) and Hosmer-Lemeshow χ² statistic of 5.75 (p=0.676). Patients with a score of ≥75 had a 94.1% probability of having prediabetes, and were 29 times more likely to have prediabetes than those with a score of <75. The AUROC in the validation cohort was 0.81 (95% CI, 0.70-0.92) and the Hosmer-Lemeshow χ² statistic was 5.50 (p=0.599). Model calibration of the score showed good calibration in both cohorts. The developed and validated score, called PERSEUS, is the first instrument to identify individuals who are at high risk of developing abnormal glucose metabolism following an episode of acute pancreatitis. Copyright © 2017 Editrice Gastroenterologica Italiana S.r.l. Published by Elsevier Ltd. All rights reserved.
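    The reported AUROC has a simple probabilistic reading: the chance that a randomly chosen prediabetic patient scores higher than a randomly chosen non-prediabetic one. A minimal sketch via the Mann-Whitney statistic (the function name and data are mine, not the study's code):

```python
def auroc(scores_pos, scores_neg):
    """AUROC via the Mann-Whitney U statistic: P(random positive case
    scores above random negative case), counting ties as half."""
    wins = ties = 0
    for sp in scores_pos:
        for sn in scores_neg:
            if sp > sn:
                wins += 1
            elif sp == sn:
                ties += 1
    return (wins + 0.5 * ties) / (len(scores_pos) * len(scores_neg))
```

    Paired with a calibration plot (predicted vs. observed risk by decile), this covers the discrimination/calibration split the abstract describes.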

  19. Calibration Adjustment of the Mid-infrared Analyzer for an Accurate Determination of the Macronutrient Composition of Human Milk.

    PubMed

    Billard, Hélène; Simon, Laure; Desnots, Emmanuelle; Sochard, Agnès; Boscher, Cécile; Riaublanc, Alain; Alexandre-Gouabau, Marie-Cécile; Boquien, Clair-Yves

    2016-08-01

    Human milk composition analysis seems essential to adapt human milk fortification for preterm neonates. The Miris human milk analyzer (HMA), based on mid-infrared methodology, is convenient for a unique determination of macronutrients. However, HMA measurements are not totally comparable with reference methods (RMs). The primary aim of this study was to compare HMA results with results from biochemical RMs for a large range of protein, fat, and carbohydrate contents and to establish a calibration adjustment. Human milk was fractionated into protein, fat, and skim milk, covering large ranges of protein (0-3 g/100 mL), fat (0-8 g/100 mL), and carbohydrate (5-8 g/100 mL). For each macronutrient, a calibration curve was plotted by linear regression using measurements obtained using HMA and RMs. For fat, 53 measurements were performed, and the linear regression equation was HMA = 0.79RM + 0.28 (R² = 0.92). For true protein (29 measurements), the linear regression equation was HMA = 0.9RM + 0.23 (R² = 0.98). For carbohydrate (15 measurements), the linear regression equation was HMA = 0.59RM + 1.86 (R² = 0.95). A homogenization step with a disruptor coupled to a sonication step was necessary to obtain better accuracy of the measurements. Good repeatability (coefficient of variation < 7%) and reproducibility (coefficient of variation < 17%) were obtained after calibration adjustment. New calibration curves were developed for the Miris HMA, allowing accurate measurements in large ranges of macronutrient content. This is necessary for reliable use of this device in individualizing nutrition for preterm newborns. © The Author(s) 2015.
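    The reported regressions (HMA = a·RM + b) can be inverted to correct an HMA reading back toward the reference-method scale. The coefficients below are the ones quoted in the abstract; the function and key names are mine:

```python
# Regression coefficients (HMA = a*RM + b) reported in the abstract,
# in g/100 mL for each macronutrient.
CALIBRATION = {
    "fat": (0.79, 0.28),
    "true_protein": (0.90, 0.23),
    "carbohydrate": (0.59, 1.86),
}

def adjust(macronutrient, hma_reading):
    """Invert the regression to recover a reference-method estimate."""
    a, b = CALIBRATION[macronutrient]
    return (hma_reading - b) / a
```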

  20. Hands-On PV Experience (HOPE) Workshop - Text Version | Photovoltaic

    Science.gov Websites

    Ryan Ellis, Purdue University: So, one thing I learned quite a bit about was calibrating. In our research group, we do use a reference cell; however, we typically just calibrate it to one sun... so we've kind of learned that you may be getting to one sun on your reference cell, but you may be...

  1. Calibration Of An Active Mammosite Using A Low Activity Sr-90 Radioactive Source

    NASA Astrophysics Data System (ADS)

    Winston, Jacquelyn

    2007-03-01

    The latest involvement of the Brachytherapy research group of the medical physics program at Hampton University is in the development of a scintillating fiber based detector for the breast cancer specific Mammosite (balloon device) from Cytyc Inc. Recent data were acquired at a local hospital to evaluate the possibility of measuring the dose distribution during breast Brachytherapy cancer treatments with this device. Since sub-millimeter accuracy in position is required, precision of the device relies on the accurate calibration of the scintillating fiber element. As part of a collaboration work, data were acquired for that purpose at Hampton University and subsequently analyzed at Morgan State University. An 8 mm diameter strontium-90 radioactive field source with a low activity of 25 μCi was used along with a dedicated LabView data acquisition system. We will discuss the data collected and address some of the features of this novel system.

  2. Calibration Of An Active Mammosite Using A Low Activity Sr-90 Radioactive Source

    NASA Astrophysics Data System (ADS)

    Winston, Jacquelyn

    2006-03-01

    The latest involvement of the Brachytherapy research group of the medical physics program at Hampton University is in the development of a scintillator fiber based detector for the breast cancer specific Mammosite (balloon device) from Cytyc Inc. Recent data were acquired at a local hospital to evaluate the possibility of measuring the dose distribution during breast Brachytherapy cancer treatments with this device. Since sub-millimeter accuracy in position is required, precision of the device relies on the accurate calibration of the scintillating fiber element. As part of a collaboration work, data were acquired for that purpose at Hampton University and subsequently analyzed at Morgan State University. An 8 mm diameter strontium-90 radioactive field source with a low activity of 25 μCi was used along with a dedicated LabView data acquisition system. We will discuss the data collected and address some of the features of this novel system.

  3. Macroeconomic Analysis of Universal Coverage in the U.S.

    NASA Astrophysics Data System (ADS)

    Feng, Zhigang

    In this paper I employ a dynamic general equilibrium model to study the macroeconomic effects and welfare implications of health policies for universal coverage in the U.S. The model is calibrated to U.S. data. Numerical simulations indicate that adopting universal coverage has important macroeconomic effects on health expenditures and hours worked, and increases welfare by improving aggregate health status and removing adverse selection.

  4. Applying the log-normal distribution to target detection

    NASA Astrophysics Data System (ADS)

    Holst, Gerald C.

    1992-09-01

    Holst and Pickard experimentally determined that MRT responses tend to follow a log-normal distribution. The log-normal distribution appeared reasonable because nearly all visual psychophysical data are plotted on a logarithmic scale. It has the additional advantage of being bounded to positive values, an important consideration since probability of detection is often plotted in linear coordinates. Review of published data suggests that the log-normal distribution may have universal applicability. Specifically, the log-normal distribution obtained from MRT tests appears to fit the target transfer function and the probability of detection of rectangular targets.
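    A log-normal detection model is easy to evaluate in closed form: the probability of detection at stimulus level x is the log-normal CDF, which is identically zero for non-positive x (matching the bounded-to-positive-values point above). A sketch with mu and sigma as free parameters:

```python
import math

def lognormal_cdf(x, mu, sigma):
    """P(detection) under a log-normal model: Phi((ln x - mu) / sigma).
    Returns 0 for x <= 0, since the distribution has positive support."""
    if x <= 0:
        return 0.0
    z = (math.log(x) - mu) / (sigma * math.sqrt(2))
    return 0.5 * (1 + math.erf(z))
```

    Fitting mu and sigma to observed detection frequencies would give the kind of curve described for MRT and target-transfer data.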

  5. High Precision 2-D Grating Groove Density Measurement

    NASA Astrophysics Data System (ADS)

    Zhang, Ningxiao; McEntaffer, Randall; Tedesco, Ross

    2017-08-01

    Our research group at Penn State University is working on producing X-ray reflection gratings with high spectral resolving power and high diffraction efficiency. To estimate our fabrication accuracy, we apply a precise 2-D grating groove density measurement to map groove density distributions of gratings on 6-inch wafers. In addition to mapping a static groove density distribution, the method simultaneously measures variation in the groove density. This system can reach a measuring accuracy (ΔN/N) of 10⁻³. Here we present this groove density measurement and some applications.

  6. Seed: a user-friendly tool for exploring and visualizing microbial community data.

    PubMed

    Beck, Daniel; Dennis, Christopher; Foster, James A

    2015-02-15

    In this article we present Simple Exploration of Ecological Data (Seed), a data exploration tool for microbial communities. Seed is written in R using the Shiny library. This provides access to powerful R-based functions and libraries through a simple user interface. Seed allows users to explore ecological datasets using principal coordinate analyses, scatter plots, bar plots, hierarchical clustering and heatmaps. Seed is open source and available at https://github.com/danlbek/Seed. danlbek@gmail.com Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.

  7. Design Modification and Calibration of the Picatinny Activator for Setback Safety Testing of SADARM

    DTIC Science & Technology

    1992-05-01

    Figures include the modified activator; hammer velocity versus gap closing velocity; peak air pressures; peak air temperatures; and pulse durations at half... Variations in P1 and P2 with gap size for 20 KG's and 30 KG's acceleration, and for the 20 KG's case with the heat transfer arbitrarily reduced to 10... The closing velocity at first jump-up is plotted in figure 7; it depends only on gap size and acceleration and appears to be the most...

  8. Interstellar lines in high resolution IUE spectra. Part 1: Groningen data reduction package and technical results

    NASA Astrophysics Data System (ADS)

    Gilra, D. P.; Pwa, T. H.; Arnal, E. M.; de Vries, J.

    1982-06-01

    In order to process and analyze high resolution IUE data on a large number of interstellar lines in a large number of images for a large number of stars, computer programs were developed for 115 lines in the short wavelength range and 40 in the long wavelength range. Programs include extraction, processing, plotting, averaging, and profile fitting. Wavelength calibration in high resolution spectra, fixed pattern noise, instrument profile and resolution, and the background problem in the region where orders are crowding are discussed. All the expected lines are detected in at least one spectrum.

  9. Modeling marine oily wastewater treatment by a probabilistic agent-based approach.

    PubMed

    Jing, Liang; Chen, Bing; Zhang, Baiyu; Ye, Xudong

    2018-02-01

    This study developed a novel probabilistic agent-based approach for modeling of marine oily wastewater treatment processes. It begins by constructing a probability-based agent simulation model, followed by a global sensitivity analysis and a genetic algorithm-based calibration. The proposed modeling approach was tested through a case study of the removal of naphthalene from marine oily wastewater using UV irradiation. The removal of naphthalene was described by an agent-based simulation model using 8 types of agents and 11 reactions. Each reaction was governed by a probability parameter determining its occurrence. The modeling results showed that the root mean square errors between modeled and observed removal rates were 8.73 and 11.03% for calibration and validation runs, respectively. Reaction competition was analyzed by comparing agent-based reaction probabilities, while agent heterogeneity was visualized by plotting the agents' real-time spatial distribution, showing strong potential for reactor design and process optimization. Copyright © 2017 Elsevier Ltd. All rights reserved.
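    The probability-gated reaction idea can be sketched in a few lines: each reaction fires in a step only when its reactant is present and a random draw falls below the reaction's probability. The species names, probabilities, and one-agent-per-step bookkeeping below are illustrative assumptions, not the paper's 8-agent/11-reaction model:

```python
import random

def step(agents, reactions, rng):
    """One simulation step: each (reactant, product, p) reaction converts one
    agent with probability p when a reactant agent is available."""
    for reactant, product, p in reactions:
        if agents.get(reactant, 0) > 0 and rng.random() < p:
            agents[reactant] -= 1
            agents[product] = agents.get(product, 0) + 1
    return agents

rng = random.Random(42)  # fixed seed for reproducibility
agents = {"naphthalene": 100, "intermediate": 0, "mineralized": 0}
reactions = [
    ("naphthalene", "intermediate", 0.3),   # UV photolysis (illustrative p)
    ("intermediate", "mineralized", 0.1),   # further degradation (illustrative p)
]
for _ in range(50):
    step(agents, reactions, rng)
```

    Calibration then amounts to searching the probability parameters (e.g. with a genetic algorithm, as in the study) so that the simulated removal rate matches observations.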

  10. Piezo-thermal Probe Array for High Throughput Applications

    PubMed Central

    Gaitas, Angelo; French, Paddy

    2012-01-01

    Microcantilevers are used in a number of applications including atomic-force microscopy (AFM). In this work, deflection-sensing elements along with heating elements are integrated onto micromachined cantilever arrays to increase sensitivity, and reduce complexity and cost. An array of probes with 5–10 nm gold ultrathin film sensors on silicon substrates for high throughput scanning probe microscopy is developed. The deflection sensitivity is 0.2 ppm/nm. Plots of the change in resistance of the sensing element with displacement are used to calibrate the probes and determine probe contact with the substrate. Topographical scans demonstrate high throughput and nanometer resolution. The heating elements are calibrated and the thermal coefficient of resistance (TCR) is 655 ppm/K. The melting temperature of a material is measured by locally heating the material with the heating element of the cantilever while monitoring the bending with the deflection sensing element. The melting point value measured with this method is in close agreement with the reported value in literature. PMID:23641125
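    A TCR calibration like the quoted 655 ppm/K reduces to fitting the linear resistance-temperature model R(T) = R0·(1 + TCR·(T - T0)). A sketch from two calibration points; the numeric values in the test are invented to be consistent with that TCR:

```python
def tcr_ppm_per_K(r0, t0, r1, t1):
    """Thermal coefficient of resistance from two (R, T) calibration points,
    assuming R(T) = R0 * (1 + TCR*(T - T0)). Returned in ppm/K."""
    return (r1 - r0) / (r0 * (t1 - t0)) * 1e6

def temperature_from_resistance(r, r0, t0, tcr_ppm):
    """Invert the model to read temperature from the heater resistance."""
    return t0 + (r / r0 - 1) / (tcr_ppm * 1e-6)
```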

  11. Summary of suspended-sediment concentration data, San Francisco Bay, California, water year 2010

    USGS Publications Warehouse

    Buchanan, Paul A.; Morgan, Tara L.

    2014-01-01

    Suspended-sediment concentration data were collected by the U.S. Geological Survey in San Francisco Bay during water year 2010 (October 1, 2009–September 30, 2010). Turbidity sensors and water samples were used to monitor suspended-sediment concentration at two sites in Suisun Bay, one site in San Pablo Bay, three sites in Central San Francisco Bay, and one site in South San Francisco Bay. Sensors were positioned at two depths at most sites to help define the vertical variability of suspended sediments. Water samples were collected periodically and analyzed for concentrations of suspended sediment. The results of the analyses were used to calibrate the output of the turbidity sensors so that a record of suspended-sediment concentrations could be computed. This report presents the data-collection methods used and summarizes, in graphs, the suspended-sediment concentration data collected from October 2009 through September 2010. Calibration curves and plots of the processed data for each sensor also are presented.

  12. Erratum: Voyager Color Photometry of Saturn's Main Rings

    NASA Technical Reports Server (NTRS)

    Estrada, Paul R.; Cuzzi, Jeffrey N.; Showalter, Mark R.; DeVincenzi, Donald (Technical Monitor)

    2002-01-01

    We correct a calibration error in our earlier analysis of Voyager color observations of Saturn's main rings at 14 deg phase angle and present thoroughly revised and reanalyzed radial profiles of the brightness of the main rings in Voyager G, V, and UV filters, and ratios of these brightnesses. These results are consistent with more recent HST results at 6 deg phase angle, once allowance is made for plausible phase reddening of the rings. Unfortunately, the Voyager camera calibration factors are simply not sufficiently well known for a combination of the Voyager and HST data to be used to constrain the phase reddening quantitatively. However, some interesting radial variations in reddening between 6-14 deg phase angles are hinted at. We update a ring-and-satellite color vs. albedo plot from Cuzzi and Estrada in several ways. The A and B rings are still found to be in a significantly redder part of color-albedo space than Saturn's icy satellites.

  13. The effect of vegetation type, microrelief, and incidence angle on radar backscatter

    NASA Technical Reports Server (NTRS)

    Owe, M.; Oneill, P. E.; Jackson, T. J.; Schmugge, T. J.

    1985-01-01

    The NASA/JPL Synthetic Aperture Radar (SAR) was flown over a 20 x 110 km test site in the Texas High Plains regions north of Lubbock during February/March 1984. The effect of incidence angle was investigated by comparing the pixel values of the calibrated and uncalibrated images. Ten-pixel-wide transects along the entire azimuth were averaged in each of the two scenes, and plotted against the calculated incidence angle of the center of each range increment. It is evident from the graphs that both the magnitudes and patterns exhibited by the corresponding transect means of the two images are highly dissimilar. For each of the cross-poles, the uncalibrated image displayed very distinct and systematic positive trends through the entire range of incidence angles. The two like-poles, however, exhibited relatively constant returns. In the calibrated image, the cross-poles exhibited a constant return, while the like-poles demonstrated a strong negative trend across the range of look-angles, as might be expected.

  14. Data reduction of digitized images processed from calibrated photographic and spectroscopic films obtained from terrestrial, rocket and space shuttle telescopic instruments

    NASA Technical Reports Server (NTRS)

    Hammond, Ernest C., Jr.

    1990-01-01

    The Microvax 2 computer, the basic software in VMS, and the Mitsubishi High Speed Disk were received and installed. The digital scanning tunneling microscope is fully installed and operational. A new technique was developed for pseudocolor analysis of the line plot images of a scanning tunneling microscope. Computer studies and mathematical modeling of the empirical data associated with many of the film calibration studies were presented. A gas can follow-up experiment, to be launched in September on Space Shuttle STS-50, was prepared and loaded. Papers were presented on the structure of the human hair strand using scanning electron microscopy and x-ray analysis, and on updated research on the annual rings produced by the surf clam of the ocean estuaries of Maryland. Scanning electron microscopy was conducted by the research team in support of Mössbauer and magnetic susceptibility studies on NmNi(4.25)Fe(.85) and its hydride.

  15. Online Resource for Earth-Observing Satellite Sensor Calibration

    NASA Technical Reports Server (NTRS)

    McCorkel, J.; Czapla-Myers, J.; Thome, K.; Wenny, B.

    2015-01-01

    The Radiometric Calibration Test Site (RadCaTS) at Railroad Valley Playa, Nevada is being developed by the University of Arizona to enable improved accuracy and consistency for airborne and satellite sensor calibration. Primary instrumentation at the site consists of ground-viewing radiometers, a sun photometer, and a meteorological station. Measurements made by these instruments are used to calculate surface reflectance, atmospheric properties and a prediction for top-of-atmosphere reflectance and radiance. This work will leverage research for RadCaTS, and describe the requirements for an online database, associated data formats and quality control, and processing levels.

  16. Rainfall erosivity: An historical review

    USDA-ARS?s Scientific Manuscript database

    Rainfall erosivity is the capability of rainfall to cause soil loss from hillslopes by water. Modern definitions of rainfall erosivity began with the development of the Universal Soil Loss Equation (USLE), where rainfall characteristics were statistically related to soil loss from thousands of plot...

  17. Calibrating the SNfactory Integral Field Spectrograph (SNIFS) with SCALA

    NASA Astrophysics Data System (ADS)

    Küsters, Daniel; Lombardo, Simona; Kowalski, Marek; Aldering, Greg; Nordin, Jakob; Rigault, Mickael

    2016-08-01

    The SNIFS CALibration Apparatus (SCALA), a device to calibrate the Supernova Integral Field Spectrograph on the University of Hawaii 2.2 m telescope, was developed and installed in Spring 2014. SCALA produces an artificial planet with a diameter of 1° and a constant surface brightness. The wavelength of the beam can be tuned between 3200 Å and 10000 Å and has a bandwidth of 35 Å. The amount of light injected into the telescope is monitored with NIST-calibrated photodiodes. SCALA was upgraded in 2015 with a mask installed at the entrance pupil of the UH88 telescope, ensuring that the illumination of the telescope by stars is similar to that of SCALA. With this setup, a first calibration run was performed in conjunction with the spectrophotometric observations of standard stars. We present first estimates of the expected systematic uncertainties of the in-situ calibration and discuss the results of tests that examine the influence of stray light produced in the optics.

  18. NIST Standard Reference Material 3600: Absolute Intensity Calibration Standard for Small-Angle X-ray Scattering

    DOE PAGES

    Allen, Andrew J.; Zhang, Fan; Kline, R. Joseph; ...

    2017-03-07

    The certification of a new standard reference material for small-angle scattering [NIST Standard Reference Material (SRM) 3600: Absolute Intensity Calibration Standard for Small-Angle X-ray Scattering (SAXS)], based on glassy carbon, is presented. Creation of this SRM relies on the intrinsic primary calibration capabilities of the ultra-small-angle X-ray scattering technique. This article describes how the intensity calibration has been achieved and validated in the certified Q range, Q = 0.008–0.25 Å⁻¹, together with the purpose, use and availability of the SRM. The intensity calibration afforded by this robust and stable SRM should be applicable universally to all SAXS instruments that employ a transmission measurement geometry, working with a wide range of X-ray energies or wavelengths. As a result, the validation of the SRM SAXS intensity calibration using small-angle neutron scattering (SANS) is discussed, together with the prospects for including SANS in a future renewal certification.
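    In the simplest use of such an intensity standard, calibration reduces to a single multiplicative scale factor between the instrument's measured intensities and the standard's certified intensities over the certified Q range. A least-squares sketch; the helper names and data are mine, not NIST's procedure:

```python
def scale_factor(i_measured, i_certified):
    """Least-squares scalar k minimizing sum((k*m - c)^2) over the
    standard's certified Q points."""
    num = sum(m * c for m, c in zip(i_measured, i_certified))
    den = sum(m * m for m in i_measured)
    return num / den

def to_absolute(i_sample, k):
    """Put a sample measured on the same instrument on an absolute scale."""
    return [k * i for i in i_sample]
```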

  19. NIST Standard Reference Material 3600: Absolute Intensity Calibration Standard for Small-Angle X-ray Scattering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allen, Andrew J.; Zhang, Fan; Kline, R. Joseph

    The certification of a new standard reference material for small-angle scattering [NIST Standard Reference Material (SRM) 3600: Absolute Intensity Calibration Standard for Small-Angle X-ray Scattering (SAXS)], based on glassy carbon, is presented. Creation of this SRM relies on the intrinsic primary calibration capabilities of the ultra-small-angle X-ray scattering technique. This article describes how the intensity calibration has been achieved and validated in the certified Q range, Q = 0.008–0.25 Å⁻¹, together with the purpose, use and availability of the SRM. The intensity calibration afforded by this robust and stable SRM should be applicable universally to all SAXS instruments that employ a transmission measurement geometry, working with a wide range of X-ray energies or wavelengths. As a result, the validation of the SRM SAXS intensity calibration using small-angle neutron scattering (SANS) is discussed, together with the prospects for including SANS in a future renewal certification.

  20. NIST Standard Reference Material 3600: Absolute Intensity Calibration Standard for Small-Angle X-ray Scattering.

    PubMed

    Allen, Andrew J; Zhang, Fan; Kline, R Joseph; Guthrie, William F; Ilavsky, Jan

    2017-04-01

    The certification of a new standard reference material for small-angle scattering [NIST Standard Reference Material (SRM) 3600: Absolute Intensity Calibration Standard for Small-Angle X-ray Scattering (SAXS)], based on glassy carbon, is presented. Creation of this SRM relies on the intrinsic primary calibration capabilities of the ultra-small-angle X-ray scattering technique. This article describes how the intensity calibration has been achieved and validated in the certified Q range, Q = 0.008-0.25 Å-1, together with the purpose, use and availability of the SRM. The intensity calibration afforded by this robust and stable SRM should be applicable universally to all SAXS instruments that employ a transmission measurement geometry, working with a wide range of X-ray energies or wavelengths. The validation of the SRM SAXS intensity calibration using small-angle neutron scattering (SANS) is discussed, together with the prospects for including SANS in a future renewal certification.

  1. Calibrating the IXPE observatory from ground to space

    NASA Astrophysics Data System (ADS)

    Muleri, Fabio; Baldini, Luca; Baumgartner, Wayne; Evangelista, Yuri; Fabiani, Sergio; Kolodziejczak, Jeffery; Latronico, Luca; Lefevre, Carlo; O'Dell, Stephen L.; Ramsey, Brian; Sgrò, Carmelo; Soffitta, Paolo; Tennant, Allyn; Weisskopf, Martin C.

    2017-08-01

    The Imaging X-ray Polarimetry Explorer (IXPE) will be the next SMEX mission launched by NASA in 2021 in collaboration with the Italian Space Agency (ASI). IXPE will perform groundbreaking measurements of imaging polarization in X-rays for a number of different classes of sources with three identical telescopes, finally (re)opening a window on the high-energy Universe more than 40 years after the first pioneering results. The unprecedented sensitivity of IXPE to polarization poses particular requirements on the payload calibration, e.g. the use of polarized and completely unpolarized radiation, both on the ground and in orbit, and cannot rely on a systematic comparison with results obtained by previous observatories. In this paper, we will present the IXPE calibration plan, describing both the calibrations which will be performed on the detectors at INAF-IAPS in Rome (Italy) and the calibration of the mirror and detector assemblies which will be carried out at Marshall Space Flight Center in Huntsville, Alabama. On-orbit calibrations, performed with calibration sources mounted on a filter wheel and placed in front of each detector when necessary, will be presented as well.

  2. Calibrating the IXPE Observatory from Ground to Space

    NASA Technical Reports Server (NTRS)

    Muleri, Fabio; Baldini, Luca; Baumgartner, Wayne; Evangelista, Yuri; Fabiani, Sergio; Kolodziejczak, Jeffery; Latronico, Luca; Lefevre, Carlo; O'Dell, Stephen L.; Ramsey, Brian

    2017-01-01

    The Imaging X-ray Polarimetry Explorer (IXPE) will be the next SMEX mission launched by NASA in 2021 in collaboration with the Italian Space Agency (ASI). IXPE will perform groundbreaking measurements of imaging polarization in X-rays for a number of different classes of sources with three identical telescopes, finally (re)opening a window on the high-energy Universe more than 40 years after the first pioneering results. The unprecedented sensitivity of IXPE to polarization poses particular requirements on the payload calibration, e.g. the use of polarized and completely unpolarized radiation, both on the ground and in orbit, and cannot rely on a systematic comparison with results obtained by previous observatories. In this paper, we will present the IXPE calibration plan, describing both the calibrations which will be performed on the detectors at INAF-IAPS in Rome (Italy) and the calibration of the mirror and detector assemblies which will be carried out at Marshall Space Flight Center in Huntsville, Alabama. On-orbit calibrations, performed with calibration sources mounted on a filter wheel and placed in front of each detector when necessary, will be presented as well.

  3. Optical System Error Analysis and Calibration Method of High-Accuracy Star Trackers

    PubMed Central

    Sun, Ting; Xing, Fei; You, Zheng

    2013-01-01

    The star tracker is a high-accuracy attitude measurement device widely used in spacecraft. Its performance depends largely on the precision of the optical system parameters. Therefore, the analysis of the optical system parameter errors and a precise calibration model are crucial to the accuracy of the star tracker. To date, research in this field has lacked a systematic and universal analysis. This paper presents in detail an approach for the synthetic error analysis of the star tracker, without complicated theoretical derivation. This approach can determine the error propagation relationship of the star tracker and can build an error model intuitively and systematically. The analysis results can be used as a foundation and a guide for the optical design, calibration, and compensation of the star tracker. A calibration experiment is designed and conducted. Excellent calibration results are achieved based on the calibration model. To summarize, the error analysis approach and the calibration method are shown to be adequate and precise, and could provide an important guarantee for the design, manufacture, and measurement of high-accuracy star trackers. PMID:23567527

  4. GTOOLS: an Interactive Computer Program to Process Gravity Data for High-Resolution Applications

    NASA Astrophysics Data System (ADS)

    Battaglia, M.; Poland, M. P.; Kauahikaua, J. P.

    2012-12-01

    An interactive computer program, GTOOLS, has been developed to process gravity data acquired by the Scintrex CG-5 and LaCoste & Romberg EG, G and D gravity meters. The aim of GTOOLS is to provide a validated methodology for computing relative gravity values in a consistent way, accounting for as many environmental factors as possible (e.g., tides, ocean loading, solar constraints, etc.), as well as instrument drift. The program has a modular architecture. Each processing step is implemented in a tool (function) that can be run either independently or within an automated task. The tools allow the user to (a) read the gravity data acquired during field surveys completed using different types of gravity meters; (b) compute Earth tides using an improved version of Longman's (1959) model; (c) compute ocean loading using the HARDISP code by Petit and Luzum (2010) and ocean loading harmonics from the TPXO7.2 ocean tide model; (d) estimate the instrument drift using linear functions as appropriate; and (e) compute the weighted least-squares-adjusted gravity values and their errors. The corrections are performed to microgal (μGal) precision, in accordance with the specifications of high-resolution surveys. The program has the ability to incorporate calibration factors that allow surveys done using different gravimeters to be compared. Two additional tools (functions) allow the user to (1) estimate the instrument calibration factor by processing data collected by a gravimeter on a calibration range; and (2) plot gravity time series at a chosen benchmark. The interactive procedures and the program output (JPEG plots and text files) have been designed to ease data handling and archiving, to provide useful information for future data interpretation or modeling, and to facilitate comparison of gravity surveys conducted at different times. All formulas have been checked for typographical errors in the original reference. GTOOLS, developed using Matlab, is open source and machine independent. We will demonstrate program use and utility with data from multiple microgravity surveys at Kilauea volcano, Hawai'i.
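The drift-removal step in (d) can be sketched as follows. This is a hypothetical illustration of fitting a linear drift to repeated base-station readings and removing it, not the actual GTOOLS code; the function name, variable names, and example values are invented.

```python
import numpy as np

def linear_drift_correction(times_hr, readings_mgal, base_mask):
    """Fit a linear drift to repeated base-station readings and
    remove it from every reading in the survey loop (values in mGal).

    times_hr  : observation times in hours
    base_mask : True where a reading was taken at the base station
    """
    t = np.asarray(times_hr, dtype=float)
    g = np.asarray(readings_mgal, dtype=float)
    # Drift rate estimated from base-station reoccupations only
    slope, _ = np.polyfit(t[base_mask], g[base_mask], 1)
    # Subtract the drift, anchoring the series at the first base reading
    return g - slope * (t - t[base_mask][0])

# Toy loop: base read at t = 0 and t = 4 h, one field station in between.
times = [0.0, 2.0, 4.0]
vals = [1000.000, 1012.350, 1000.020]   # base drifts +0.020 mGal over 4 h
base = np.array([True, False, True])
corrected = linear_drift_correction(times, vals, base)
```

After correction the two base-station readings agree, and the field-station reading has half the total drift removed, as expected for a linear model.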

  5. Validity and reliability of a food frequency questionnaire to estimate dietary intake among Lebanese children.

    PubMed

    Moghames, Patricia; Hammami, Nour; Hwalla, Nahla; Yazbeck, Nadine; Shoaib, Hikma; Nasreddine, Lara; Naja, Farah

    2016-01-12

    Nutritional status during childhood is critical given its effect on growth and development as well as its association with disease risk later in life. The Middle East and North Africa (MENA) region is experiencing alarming rates of childhood malnutrition, both over- and under-nutrition. Hence, there is a need for valid tools to assess dietary intake for children in this region. To date, there are no validated dietary assessment tools for children in any country of the MENA region. The main objective of this study was to examine the validity and reliability of a Food Frequency Questionnaire (FFQ) for the assessment of dietary intake among Lebanese children. Children, aged 5 to 10 years (n = 111), were recruited from public and private schools of Beirut, Lebanon. Mothers (proxies to report their children's dietary intake) completed two FFQs, four weeks apart. Four 24-hour recalls (24-HRs) were collected weekly over the duration of the study. Spearman correlations and Bland-Altman plots were used to assess validity. Linear regression models were used to derive calibration factors for boys and girls. Reproducibility statistics included the Intraclass Correlation Coefficient (ICC) and percent agreement. Correlation coefficients between dietary intake estimates derived from the FFQ and 24-HRs were significant at p < 0.001, with the highest correlation observed for energy (0.54) and the lowest for monounsaturated fatty acids (0.26). The majority of data points in the Bland-Altman plots lay between the limits of agreement, close to the middle horizontal line. After applying the calibration factors for boys and girls, the mean energy and nutrient intakes estimated by the FFQ were similar to those obtained by the mean 24-HRs. As for reproducibility, ICC ranged between 0.31 for trans-fatty acids and 0.73 for calcium intakes. Over 80% of study participants were classified in the same or adjacent quartile of energy and nutrient intake.
    Findings of this study showed that the developed FFQ is reliable, and also valid when used with calibration factors. This FFQ is a useful tool for dietary assessment and for evaluating diet-disease relationships in this age group.
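The calibration step described above can be illustrated with a minimal sketch. This is not the study's code: it assumes a simple through-origin regression of mean 24-HR intakes on FFQ intakes, whereas the paper fit separate linear models for boys and girls, and the numbers below are invented.

```python
import numpy as np

def calibration_factor(ffq, recall_mean):
    """Slope of a through-origin regression of mean 24-HR intakes on
    FFQ intakes; multiplying FFQ estimates by this factor aligns them
    with the recall-based reference values."""
    x = np.asarray(ffq, dtype=float)
    y = np.asarray(recall_mean, dtype=float)
    return float(np.sum(x * y) / np.sum(x * x))

# Toy energy intakes (kcal): the FFQ over-reports by a constant 25%.
ffq_kcal = np.array([2000.0, 1500.0, 1800.0, 2200.0])
recall_kcal = ffq_kcal * 0.8           # mean of four 24-HRs per child
k = calibration_factor(ffq_kcal, recall_kcal)
calibrated = ffq_kcal * k              # FFQ estimates after calibration
```

With a purely multiplicative bias the factor recovers it exactly; real data would scatter around the fitted line.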

  6. Calibration and Application of FOREST-BGC in NorthWestern of Portugal

    NASA Astrophysics Data System (ADS)

    Rodrigues, M. A.; Lopes, D. M.; Leite, M. S.; Tabuada, V. M.

    2010-05-01

    Net primary production (NPP) is one of the most important variables in terms of ecosystem inventory and management, because it quantifies growth and reflects the impact of the biotic and abiotic factors which can affect it. Interest in NPP has increased recently because of growing interest in climate change and the need to understand its impact on the environment. Ecophysiological models such as FOREST-BGC allow NPP to be estimated. These models offer a methodology to test these phenomena across temporal and spatial scales not accessible with traditional inventory methodologies. To analyze FOREST-BGC performance, NPP data obtained with the model were compared with data collected in the field, in the same sampling plots. For parameterization and validation of FOREST-BGC, this study was based on 500 m2 sampling plots from the National Forest Inventory 2006, located in several municipalities of the district of Vila Real, Portugal (Montalegre, Chaves, Valpaços, Boticas, Vila Pouca de Aguiar, Murça, Mondim de Basto, Alijó, Sabrosa and Vila Real). In order to quantify biomass dynamics, we selected 45 sampling plots: 19 from Pinus pinaster stands, 17 from Quercus pyrenaica and 10 from mixed Quercus and Pinus stands. Adaptation strategies for climate change impacts can be proposed based on these research results.

  7. Tools for Implementing Science Practice in a Large Introductory Class

    NASA Astrophysics Data System (ADS)

    Prothero, W. A.

    2008-12-01

    Scientists must have in-depth background knowledge of their subject area and know where current knowledge can be advanced. They perform experiments that gather data to test new or existing theories, present their findings at meetings, publish their results, critically review the results of others, and respond to reviews of their own work. In the context of a course, these activities correspond to learning the background material by listening to lectures or reading a text, formulating a problem, exploring data using student-friendly data access and plotting software, giving brief talks to classmates in a small class or lab setting, writing a science paper or lab report, reviewing the writing of their peers, and receiving feedback (and grades) from their instructors and/or peers. These activities can be supported using course management software and online resources. The "LearningWithData" software system allows exploration and plotting of solid Earth data (focused on plate tectonics). Ocean data access, display, and plotting are also supported. Background material is delivered using animations and slide-show-type displays. Students are accountable for their learning through included homework assignments. Lab and small group activities provide support for data exploration and interpretation. Writing is most efficiently implemented using the "Calibrated Peer Review" method. This methodology is available at http://cpr.molsci.ucla.edu/. These methods have been successfully implemented in a large oceanography class at UCSB.

  8. Primary and Secondary Controls on Measurements of Forest Height Using Large-Footprint Lidar at the Hubbard Brook LTER

    NASA Technical Reports Server (NTRS)

    Knox, Robert G.; Blair, J. Bryan; Schwarz, Paul A.; Hofton, Michelle A.; Dubayah, Ralph; Smith, David E. (Technical Monitor)

    2000-01-01

    On September 26, 1999, we mapped canopy structure over 90% of the Hubbard Brook Experimental Forest in White Mountain National Forest, New Hampshire, using the Laser Vegetation Imaging Sensor (LVIS). This airborne instrument was configured to emulate data expected from the Vegetation Canopy Lidar (VCL) space mission. We compared above ground heights of the tallest surfaces detected by lidar with average forest canopy heights estimated from tree-based measurements in or near 346 0.05 ha plots (made in autumn of 1997 and 1998). Vegetation heights had by far the predominant influence on lidar top heights, but with this large data set we were able to measure two significant secondary effects: those of steepness or slope of the underlying terrain and of tree crown form. The size of the slope effect was intermediate between that expected from models of homogeneous canopy layers and for solitary tree crowns. The first detected surfaces were also proportionately taller for plots with more basal area in broad leaved northern hardwoods than for mostly coniferous plots. We expected this because of the contrast between the shapes of cumulative distributions of surface area for elliptical or hemi-elliptical tree crowns and those for conical crowns. Correcting for these secondary effects, when appropriate data are available for calibration, may improve vegetation structure estimates in regional studies using VCL or similar lidar data sources.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Di Prinzio, Renato; Almeida, Carlos Eduardo de; Laboratorio de Ciencias Radiologicas-Universidade do Estado do Rio de Janeiro

    In Brazil there are over 100 high dose rate (HDR) brachytherapy facilities using well-type chambers for the determination of the air kerma rate of {sup 192}Ir sources. This paper presents the methodology developed and extensively tested by the Laboratorio de Ciencias Radiologicas (LCR) and presently in use to calibrate those types of chambers. The system was initially used to calibrate six well-type chambers of brachytherapy services, and a maximum deviation of only 1.0% was observed between the calibration coefficients obtained and the ones in the calibration certificate provided by the UWADCL. In addition to its traceability to the Brazilian National Standards, the whole system was taken to the University of Wisconsin Accredited Dosimetry Calibration Laboratory (UWADCL) for a direct comparison, and the same formalism to calculate the air kerma was used. The comparison results between the two laboratories show an agreement of 0.9% for the calibration coefficients. Three Brazilian well-type chambers were calibrated at the UWADCL and by LCR, in Brazil, using the developed system and a clinical HDR machine. The results of the calibration of the three well chambers have shown an agreement better than 1.0%. Uncertainty analyses involving the measurements made both at the UWADCL and LCR laboratories are discussed.

  10. Fast hydrological model calibration based on the heterogeneous parallel computing accelerated shuffled complex evolution method

    NASA Astrophysics Data System (ADS)

    Kan, Guangyuan; He, Xiaoyan; Ding, Liuqian; Li, Jiren; Hong, Yang; Zuo, Depeng; Ren, Minglei; Lei, Tianjie; Liang, Ke

    2018-01-01

    Hydrological model calibration has been a hot issue for decades. The shuffled complex evolution method developed at the University of Arizona (SCE-UA) has been proved to be an effective and robust optimization approach. However, its computational efficiency deteriorates significantly when the amount of hydrometeorological data increases. In recent years, the rise of heterogeneous parallel computing has brought hope for the acceleration of hydrological model calibration. This study proposed a parallel SCE-UA method and applied it to the calibration of a watershed rainfall-runoff model, the Xinanjiang model. The parallel method was implemented on heterogeneous computing systems using OpenMP and CUDA. Performance testing and sensitivity analysis were carried out to verify its correctness and efficiency. Comparison results indicated that heterogeneous parallel computing-accelerated SCE-UA converged much more quickly than the original serial version and possessed satisfactory accuracy and stability for the task of fast hydrological model calibration.
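The costly step that parallel SCE-UA accelerates is evaluating many candidate parameter sets against observations. A minimal sketch of that idea follows, with threads standing in for the paper's OpenMP/CUDA backends and a toy one-parameter linear model standing in for the Xinanjiang model; all names and numbers are invented for illustration.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def rmse_objective(params, obs, forcing):
    """Toy stand-in for a rainfall-runoff model run: simulated runoff
    is k * forcing, scored by root-mean-square error against obs."""
    k = params[0]
    sim = k * forcing
    return float(np.sqrt(np.mean((sim - obs) ** 2)))

def evaluate_population(population, obs, forcing, workers=4):
    """Evaluate all parameter sets concurrently -- the inner loop that
    dominates SCE-UA run time and is the natural target for
    heterogeneous parallel computing."""
    with ThreadPoolExecutor(max_workers=workers) as ex:
        return list(ex.map(lambda p: rmse_objective(p, obs, forcing),
                           population))

forcing = np.array([1.0, 2.0, 3.0])
obs = 0.5 * forcing                        # "true" parameter k = 0.5
pop = [np.array([0.1]), np.array([0.5]), np.array([0.9])]
scores = evaluate_population(pop, obs, forcing)
best = pop[int(np.argmin(scores))]
```

The full SCE-UA algorithm adds complex partitioning, shuffling, and simplex evolution around this evaluation step; only the evaluation is parallelized here.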

  11. Camera calibration based on the back projection process

    NASA Astrophysics Data System (ADS)

    Gu, Feifei; Zhao, Hong; Ma, Yueyang; Bu, Penghui

    2015-12-01

    Camera calibration plays a crucial role in 3D measurement tasks of machine vision. In typical calibration processes, camera parameters are iteratively optimized in the forward imaging process (FIP). However, the results can only guarantee the minimum of 2D projection errors on the image plane, but not the minimum of 3D reconstruction errors. In this paper, we propose a universal method for camera calibration, which uses the back projection process (BPP). In our method, a forward projection model is used to obtain initial intrinsic and extrinsic parameters with a popular planar checkerboard pattern. Then, the extracted image points are projected back into 3D space and compared with the ideal point coordinates. Finally, the estimation of the camera parameters is refined by a non-linear function minimization process. The proposed method can obtain a more accurate calibration result, which is more physically useful. Simulation and practical data are given to demonstrate the accuracy of the proposed method.
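The back projection idea can be illustrated with a toy pinhole model: project known 3D points to the image, back-project the pixels into 3D, and measure the 3D residual that a BPP-style refinement would minimize. This is a hedged sketch with invented values, not the authors' implementation; lens distortion and extrinsics are omitted, and points are expressed in the camera frame.

```python
import numpy as np

def project(K, X):
    """Forward pinhole projection of 3D points X (N,3) to pixel coords."""
    x = (K @ X.T).T
    return x[:, :2] / x[:, 2:3]

def back_project(K, pix, depth):
    """Back-project pixels to 3D points at the given depths."""
    ones = np.ones((pix.shape[0], 1))
    rays = (np.linalg.inv(K) @ np.hstack([pix, ones]).T).T  # z = 1 rays
    return rays * depth[:, None]

K = np.array([[800.0, 0.0, 320.0],   # assumed intrinsics (fx, fy, cx, cy)
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
X = np.array([[0.1, -0.2, 2.0], [0.3, 0.1, 3.0]])  # ideal 3D points
pix = project(K, X)
X_hat = back_project(K, pix, X[:, 2])
err3d = np.linalg.norm(X_hat - X, axis=1)  # BPP-style 3D residuals
```

With exact parameters the 3D residual is zero; with perturbed intrinsics it becomes the quantity a back-projection refinement drives down, in contrast to the 2D reprojection error used by FIP-based calibration.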

  12. Solar Cell Short Circuit Current Errors and Uncertainties During High Altitude Calibrations

    NASA Technical Reports Server (NTRS)

    Snyder, David D.

    2012-01-01

    High altitude balloon-based facilities can make solar cell calibration measurements above 99.5% of the atmosphere for use in adjusting laboratory solar simulators. While close to on-orbit illumination, the small attenuation of the spectra may result in under-measurement of solar cell parameters. Variations in stratospheric weather may produce flight-to-flight measurement variations. To support the NSCAP effort, this work quantifies some of the effects on solar cell short circuit current (Isc) measurements on triple junction sub-cells. This work looks at several types of high altitude methods: direct high altitude measurements near 120 kft, and lower stratospheric Langley plots from aircraft. It also looks at Langley extrapolation from altitudes above most of the ozone, for potential small balloon payloads. A convolution of the sub-cell spectral response with the standard solar spectrum, modified by several absorption processes, is used to determine the relative change from AM0, Isc/Isc(AM0). Rayleigh scattering, molecular scattering from uniformly mixed gases, ozone, and water vapor are included in this analysis. A range of atmospheric pressures is examined, from 0.05 to 0.25 atm, to cover the range of atmospheric altitudes where solar cell calibrations are performed. Generally these errors and uncertainties are less than 0.2%.
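A Langley extrapolation of the kind mentioned above can be sketched as a straight-line fit of ln(Isc) against airmass, extrapolated to zero airmass to estimate the exo-atmospheric (AM0) value. The data below are synthetic, not flight measurements.

```python
import numpy as np

def langley_extrapolate(airmass, isc):
    """Classic Langley plot: for a stable atmosphere ln(Isc) is roughly
    linear in airmass, so the intercept at airmass 0 estimates the
    exo-atmospheric (AM0) short-circuit current."""
    slope, intercept = np.polyfit(np.asarray(airmass, dtype=float),
                                  np.log(np.asarray(isc, dtype=float)), 1)
    return float(np.exp(intercept))

# Synthetic data: Isc = 100 mA attenuated as exp(-0.05 * airmass)
m = np.array([1.0, 1.5, 2.0, 3.0])
isc = 100.0 * np.exp(-0.05 * m)
isc_am0 = langley_extrapolate(m, isc)
```

Real aircraft or balloon data add scatter from changing optical depth during the flight, which is one source of the flight-to-flight variations the abstract quantifies.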

  13. Dental hygiene faculty calibration in the evaluation of calculus detection.

    PubMed

    Garland, Kandis V; Newell, Kathleen J

    2009-03-01

    The purpose of this pilot study was to explore the impact of faculty calibration training on intra- and interrater reliability regarding calculus detection. After IRB approval, twelve dental hygiene faculty members were recruited from a pool of twenty-two for voluntary participation and randomized into two groups. All subjects provided two pre- and two posttest scorings of calculus deposits on each of three typodonts by recording yes or no indicating if they detected calculus. Accuracy and consistency of calculus detection were evaluated using an answer key. The experimental group received three two-hour training sessions to practice a prescribed exploring sequence and technique for calculus detection. Participants immediately corrected their answers, received feedback from the trainer, and reconciled missed areas. Intra- and interrater reliability (pre- and posttest) was determined using Cohen's Kappa and compared between groups using repeated measures (split-plot) ANOVA. The groups did not differ from pre- to posttraining (intrarater reliability p=0.64; interrater reliability p=0.20). Training had no effect on reliability levels for simulated calculus detection in this study. Recommendations for future studies of faculty calibration when evaluating students include using patients for assessing rater reliability, employing larger samples at multiple sites, and assessing the impact on students' attitudes and learning outcomes.

  14. Calibration of a subcutaneous amperometric glucose sensor. Part 1. Effect of measurement uncertainties on the determination of sensor sensitivity and background current.

    PubMed

    Choleau, C; Klein, J C; Reach, G; Aussedat, B; Demaria-Pesce, V; Wilson, G S; Gifford, R; Ward, W K

    2002-08-01

    The calibration of a continuous glucose monitoring system, i.e. the transformation of the signal I(t) generated by the glucose sensor at time (t) into an estimation of glucose concentration G(t), represents a key issue. The two-point calibration procedure consists of the determination of a sensor sensitivity S and of a background current I(o) by plotting two values of the sensor signal versus the concomitant blood glucose concentrations. The estimation of G(t) is subsequently given by G(t) = (I(t)-I(o))/S. A glucose sensor was implanted in the subcutaneous tissue of nine type 1 diabetic patients for 3 (n = 2) and 7 days (n = 7). For each individual trial, S and I(o) were determined by taking into account the values of two sets of sensor output and blood glucose concentration separated by at least 1 h, the procedure being repeated for each consecutive set of values. S and I(o) were found to be negatively correlated, the value of I(o) being sometimes negative. Theoretical analysis demonstrates that this phenomenon can be explained by the effect of measurement uncertainties on the determination of capillary glucose concentration and of sensor output.
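The two-point procedure and the formula G(t) = (I(t)-I(o))/S translate directly into code. A minimal sketch with invented example numbers and assumed units (nA for current, mmol/L for glucose):

```python
def two_point_calibration(i1, g1, i2, g2):
    """Solve for sensitivity S and background current Io such that
    G = (I - Io) / S passes through both reference points
    (i1, g1) and (i2, g2)."""
    s = (i1 - i2) / (g1 - g2)
    io = i1 - s * g1
    return s, io

def estimate_glucose(current, s, io):
    """The paper's formalism: G(t) = (I(t) - Io) / S."""
    return (current - io) / s

# Two reference pairs taken at least 1 h apart (invented values):
s, io = two_point_calibration(12.0, 5.0, 22.0, 10.0)  # S = 2 nA per mmol/L
g = estimate_glucose(17.0, s, io)
```

Because S and Io come from only two noisy points, small errors in either reference measurement shift both parameters in opposite directions, which is the negative correlation (and occasionally negative Io) the abstract reports.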

  15. LevelScheme: A level scheme drawing and scientific figure preparation system for Mathematica

    NASA Astrophysics Data System (ADS)

    Caprio, M. A.

    2005-09-01

    LevelScheme is a scientific figure preparation system for Mathematica. The main emphasis is upon the construction of level schemes, or level energy diagrams, as used in nuclear, atomic, molecular, and hadronic physics. LevelScheme also provides a general infrastructure for the preparation of publication-quality figures, including support for multipanel and inset plotting, customizable tick mark generation, and various drawing and labeling tasks. Coupled with Mathematica's plotting functions and powerful programming language, LevelScheme provides a flexible system for the creation of figures combining diagrams, mathematical plots, and data plots.
    Program summary
    Title of program: LevelScheme
    Catalogue identifier: ADVZ
    Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADVZ
    Operating systems: Any which supports Mathematica; tested under Microsoft Windows XP, Macintosh OS X, and Linux
    Programming language used: Mathematica 4
    Number of bytes in distributed program, including test and documentation: 3 051 807
    Distribution format: tar.gz
    Nature of problem: Creation of level scheme diagrams. Creation of publication-quality multipart figures incorporating diagrams and plots.
    Method of solution: A set of Mathematica packages has been developed, providing a library of level scheme drawing objects, tools for figure construction and labeling, and control code for producing the graphics.

  16. Importance of Calibration Method in Central Blood Pressure for Cardiac Structural Abnormalities.

    PubMed

    Negishi, Kazuaki; Yang, Hong; Wang, Ying; Nolan, Mark T; Negishi, Tomoko; Pathan, Faraz; Marwick, Thomas H; Sharman, James E

    2016-09-01

    Central blood pressure (CBP) independently predicts cardiovascular risk, but calibration methods may affect the accuracy of central systolic blood pressure (CSBP). Standard central systolic blood pressure (Stan-CSBP) from peripheral waveforms is usually derived with calibration using brachial SBP and diastolic BP (DBP). However, calibration using oscillometric mean arterial pressure (MAP) and DBP (MAP-CSBP) is purported to provide a more accurate representation of true invasive CSBP. This study sought to determine which derived CSBP could more accurately discriminate cardiac structural abnormalities. A total of 349 community-based patients with risk factors (71±5 years, 161 males) had CSBP measured by brachial oscillometry (Mobil-O-Graph, IEM GmbH, Stolberg, Germany) using 2 calibration methods: MAP-CSBP and Stan-CSBP. Left ventricular hypertrophy (LVH) and left atrial dilatation (LAD) were measured based on standard guidelines. MAP-CSBP was higher than Stan-CSBP (149±20 vs. 128±15 mm Hg, P < 0.0001). Although they were modestly correlated (rho = 0.74, P < 0.001), the Bland-Altman plot demonstrated a large bias (21 mm Hg) and wide limits of agreement (24 mm Hg). In receiver operating characteristic (ROC) curve analyses, MAP-CSBP significantly better discriminated LVH compared with Stan-CSBP (area under the curve (AUC) 0.66 vs. 0.59, P = 0.0063) and brachial SBP (0.62, P = 0.027). Continuous net reclassification improvement (NRI) (P < 0.001) and integrated discrimination improvement (IDI) (P < 0.001) corroborated the superior discrimination of LVH by MAP-CSBP. Similarly, MAP-CSBP better distinguished LAD than Stan-CSBP (AUC 0.63 vs. 0.56, P = 0.005) and conventional brachial SBP (0.58, P = 0.006), whereas Stan-CSBP provided no better discrimination than conventional brachial BP (P = 0.09). CSBP is calibration dependent, and when oscillometric MAP and DBP are used, the derived CSBP is a better discriminator of cardiac structural abnormalities.
© American Journal of Hypertension, Ltd 2016. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
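The Bland-Altman comparison used above (bias and 95% limits of agreement between two measurement methods) can be sketched as follows; the values are invented for illustration and merely mimic the reported ~21 mm Hg bias, not the study's data.

```python
import numpy as np

def bland_altman(a, b):
    """Bias and 95% limits of agreement between two methods measuring
    the same quantity on the same subjects."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    diff = a - b
    bias = float(np.mean(diff))
    sd = float(np.std(diff, ddof=1))      # sample SD of the differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Invented paired CSBP readings (mm Hg), five subjects:
map_csbp = np.array([150.0, 148.0, 152.0, 149.0, 151.0])
stan_csbp = np.array([129.0, 127.0, 130.0, 128.0, 131.0])
bias, lo, hi = bland_altman(map_csbp, stan_csbp)
```

A large systematic bias like this one does not by itself say which method is better; the ROC analyses in the abstract are what link the calibration choice to clinical discrimination.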

  17. Sensor-centric calibration and characterization of the VIIRS Ocean Color bands using Suomi NPP operational data

    NASA Astrophysics Data System (ADS)

    Pratt, P.

    2012-12-01

    Ocean color bands on VIIRS span the visible spectrum and include two NIR bands. There are sixteen detectors per band and two HAM (Half-angle mirror) sides giving a total of thirty two independent systems. For each scan, thirty two hundred pixels are collected and each has a fixed specific optical path and a dynamic position relative to the earth geoid. For a given calibration target where scene variation is minimized, sensor characteristics can be observed. This gives insight into the performance and calibration of the instrument from a sensor-centric perspective. Calibration of the blue bands is especially challenging since there are few blue targets on land. An ocean region called the South Pacific Gyre (SPG) was chosen for its known stability and large area to serve as a calibration target for this investigation. Thousands of pixels from every granule that views the SPG are collected daily through an automated system and tabulated along with the detector, HAM and scan position. These are then collated and organized in a sensor-centric set of tables. The data are then analyzed by slicing by each variable and then plotted in a number of ways over time. Trends in the data show that the VIIRS sensor is largely behaving as expected according to heritage data and also reveals weaknesses where additional characterization of the sensor is possible. This work by Northrop Grumman NPP CalVal Team is supporting the VIIRS on-orbit calibration and validation teams for the sensor and ocean color as well as providing scientists interested in performing ground truth with results that show which detectors and scan angles are the most reliable over time. This novel approach offers a comprehensive sensor-centric on-orbit characterization of the VIIRS instrument on the NASA Suomi NPP mission.

  18. Nondestructive prediction of the drug content of an aspirin suppository by near-infrared spectroscopy.

    PubMed

    Otsuka, Eri; Abe, Hiroyuki; Aburada, Masaki; Otsuka, Makoto

    2010-07-01

    A suppository dosage form has a rapid therapeutic effect, because it dissolves in the rectum, is absorbed into the bloodstream, and bypasses hepatic metabolism. However, the dosage form is unstable, because a suppository is made in a semisolid form, and so it is not easy to mix the bulk drug powder into the base. This article describes a nondestructive method of determining the drug content of suppositories using near-infrared spectrometry (NIR) combined with chemometrics. Suppositories (aspirin content: 1.8, 2.7, 4.5, 7.3, and 9.1%, w/w) were produced by mixing an aspirin bulk powder with hard fat at 50 degrees C and pouring the melted mixture into a plastic mold (2.25 mL). NIR spectra of 12 calibration and 12 validation sample sets were recorded 5 times. A total of 60 spectral data were used as a calibration set to establish a calibration model to predict drug content with a partial least-squares (PLS) regression analysis. NIR data of the suppository samples were divided into two wave number ranges, 4000-12500 cm(-1) (LR) and 5900-6300 cm(-1) (SR). Calibration models for the aspirin content of the suppositories were calculated based on the LR and SR ranges of second-derivative NIR spectra using PLS. The models for LR and SR consisted of five and one principal components (PC), respectively. The plots of predicted values against actual values gave straight lines with regression coefficients of 0.9531 and 0.9749, respectively. The mean bias and mean accuracy of the calibration model calculated based on the SR of the validation data sets were lower than those of LR. Limiting the wave number range of spectral data sets is useful to help understand the calibration model because of noise cancellation, and to measure objective functions.
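Since the SR model used only one latent variable, a one-component PLS1 fit captures the idea. This is a generic NIPALS-style sketch with synthetic "spectra" (three wavenumbers whose absorbance is exactly linear in drug content), not the paper's chemometrics software, and it omits the second-derivative preprocessing.

```python
import numpy as np

def pls1_one_component(X, y):
    """One-latent-variable PLS1 regression (NIPALS form).
    X : (n_samples, n_wavenumbers) spectra; y : drug contents.
    Both are mean-centered internally; returns a predict() function."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc, yc = X - x_mean, y - y_mean
    w = Xc.T @ yc                  # weight vector: covariance with y
    w /= np.linalg.norm(w)
    t = Xc @ w                     # scores of the single latent variable
    b = float(t @ yc / (t @ t))    # inner regression coefficient
    coef = w * b                   # overall regression vector
    def predict(X_new):
        return (X_new - x_mean) @ coef + y_mean
    return predict

# Toy spectra: absorbance at 3 wavenumbers rises linearly with content.
content = np.array([1.8, 2.7, 4.5, 7.3, 9.1])          # % w/w aspirin
spectra = np.outer(content, [0.2, 0.5, 0.1])
predict = pls1_one_component(spectra, content)
pred = predict(spectra)
```

With perfectly linear noise-free data a single component reproduces the contents exactly; on real second-derivative NIR spectra the fit quality appears as the regression coefficients the abstract reports.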

  19. Quantifying Vegetation Composition, Structure and Dynamics in Selected Australian Ecosystems: Science to Management

    NASA Astrophysics Data System (ADS)

    Phinn, S. R.; Scarth, P.; Armston, J.; Witte, C.; Danaher, T.; Flood, N.; Gill, T.; Lucas, R.

    2011-12-01

    Management of Australian ecosystems is carried out by state governments using information derived from satellite image data. The state of Queensland covers approximately 1.8 x 10^6 km^2 and uses satellite remote sensing and field survey programs to support legislated environmental monitoring, management and compliance activities. This poster outlines how the Joint Remote Sensing Research Program (JRSRP) delivered satellite image based data sets to address these activities by mapping foliage projective cover, vegetation height and biomass. Foliage projective cover (FPC), the vertically projected percentage cover of photosynthetic foliage of all strata, is produced from Landsat TM/ETM+ data using 88 scenes and over 1700 field sites. The JRSRP enabled government staff to be seconded to a university research group to work on the project, and the university provided postdoctoral and graduate student support. The JRSRP activities focussed on geometric and topographic corrections, BRDF corrections and time-series approaches for correcting the archive of field survey and Landsat TM/ETM+ images. This has now progressed to a program using the entire Landsat TM/ETM+ archive on an annual basis together with annual state-wide field survey data. The Landsat TM/ETM+ calibrations have been a critical input to the Landsat program's global vicarious calibration activities. Vegetation height is a critical parameter required for a range of state-wide activities and can be mapped accurately from field plots to regional areas using airborne lidar. To develop state-wide height estimates, an approach was developed using ICESat data and existing vegetation community maps. By aggregating the spaceborne ICESat full-waveform data within the mapped vegetation structure polygons, it was possible to retrieve vegetation vertical structure information continuously across the landscape. This was used to derive mean canopy and understorey height, depth and density across Queensland, which was validated using airborne lidar data provided by the JRSRP. Biomass mapping is emerging as a critical environmental parameter for local, state and national agencies in Australia. Staff from the JRSRP developed an approach with the University of Aberystwyth in Wales, through JAXA's Kyoto and Carbon Initiative, for acquiring ALOS PALSAR L-band image data, conducting geometric and radiometric corrections, and normalising for significant scene-to-scene differences in soil and vegetation moisture content. This pre-processing of a 31-image strip time-series generated state-wide mosaics for Queensland that were then used with 1815 field survey sites collected across the state to produce a state-wide biomass estimation model for L-HV data, providing estimates for both remnant and non-remnant forests, with saturation at 263 Mg ha^-1 for 20% estimation error. The Joint Remote Sensing Research Program has enabled a sound approach to research and development for validated operational applications.
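
    The reported saturation behaviour (biomass estimates degrading near 263 Mg ha^-1) is characteristic of a saturating backscatter-biomass relation: the radar response flattens at high biomass, so inversion becomes unreliable there. The exponential form and all parameter values below are a generic illustration, not the published Queensland model.

```python
import math

def backscatter_from_biomass(agb, sat=263.0, g_min=-20.0, g_max=-12.0):
    """Generic saturating backscatter curve (dB): the response
    flattens as above-ground biomass (Mg/ha) approaches the
    saturation scale. Parameter values are hypothetical."""
    frac = 1.0 - math.exp(-agb / sat)
    return g_min + (g_max - g_min) * frac

def biomass_from_backscatter(gamma, sat=263.0, g_min=-20.0, g_max=-12.0):
    """Invert the curve; increasingly unreliable near saturation,
    consistent with the quoted 20% error at 263 Mg/ha."""
    frac = (gamma - g_min) / (g_max - g_min)
    return -sat * math.log(1.0 - frac)

# Round-trip a mid-range biomass value through the model and back.
est = biomass_from_backscatter(backscatter_from_biomass(100.0))
```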

  20. Simulation of water flow and nitrogen transport for a Bulgarian experimental plot using SWAP and ANIMO models.

    PubMed

    Marinov, Dimitar; Querner, Erik; Roelsma, Jan

    2005-04-01

    Unsaturated zone models are useful tools for predicting the effects of measures and can be used to optimise agricultural practice so as to minimise impacts on the environment. However, current soil models have varying levels of abstraction with respect to the processes simulated in time and space. In the framework of an EU-funded project, the SWAP (Soil-Water-Atmosphere-Plant) and ANIMO (Agricultural-Nutrient-Model) models were tested on an experimental arable plot in Bulgaria. SWAP was used to simulate water flow in the soil, while ANIMO describes nitrogen movement and transformations. The objectives of this study are: (i) to show results of the combined application of the originally Dutch SWAP and ANIMO models to water and nitrogen dynamics under specific Bulgarian soil and hydrological conditions; (ii) to calibrate and evaluate the SWAP and ANIMO models by comparing numerical results with field measurements collected for an arable field in western Bulgaria; and (iii) to analyse possible contamination of groundwater due to agricultural practice in the region considered. A short description of the experimental plot, together with information on the parameters of the investigated soil profiles, is also provided. The SWAP results showed that the model reproduces soil water dynamics adequately. The ANIMO simulations of the nitrogen cycle show greater divergence from the observations but are sufficiently precise for the purpose of assessing land-use impacts on groundwater quality. In general, differences between model results and field measurements do not exceed 10-15%. For the experimental plot, predictions indicate nitrate-N concentrations of less than 5 mg/l in the deeper soil compartments and a low downward annual flux of 0.133 kg N/ha. These results indicate that there is no serious pollution of the shallow groundwater by nitrogen resulting from land use and agricultural activities.

  1. Support activities to maintain SUMS flight readiness

    NASA Technical Reports Server (NTRS)

    Wright, Willie

    1992-01-01

    The Shuttle Upper Atmosphere Mass Spectrometer (SUMS), a component experiment of the NASA Orbital Experiments Program (OEX), was flown aboard the shuttle Columbia (OV102), mounted at the forward end of the nose landing gear well with an atmospheric gas inlet system fitted to the lower fuselage (chin panel) surface. The SUMS was designed to provide atmospheric data in flow regimes inaccessible prior to the development of the Space Transportation System (STS). The experiment's mission operation began about one hour prior to the shuttle de-orbit entry maneuver and continued until reaching 1.6 torr (about 86 km altitude). The SUMS mass spectrometer is the spare unit from the Viking mission to Mars. Bendix Aerospace, under contract to NASA LaRC, integrated the Viking mass spectrometer, a microprocessor-based logic card, and a pressurized instrument case, together with a gas inlet system provided by the University of Texas at Dallas, into a configuration suited to interface with the shuttle Columbia. The SUMS experiment underwent static and dynamic calibration as well as vacuum maintenance before and after the STS-40 shuttle flight. The SUMS flew a total of 3 times on the space shuttle Columbia, and between flights it was maintained in flight-ready status. The flight data have been analyzed by the NASA LaRC Aerothermodynamics Branch. Flight data spectrum plots and reports are presented in the Appendices to the Final Technical Report for NAS1-17399.

  2. Mfold web server for nucleic acid folding and hybridization prediction.

    PubMed

    Zuker, Michael

    2003-07-01

    The abbreviated name, 'mfold web server', describes a number of closely related software applications available on the World Wide Web (WWW) for the prediction of the secondary structure of single-stranded nucleic acids. The objective of this web server is to provide easy access to RNA and DNA folding and hybridization software for the scientific community at large. By making use of universally available web GUIs (Graphical User Interfaces), the server circumvents the problem of portability of this software. Detailed output, in the form of structure plots with or without reliability information, single-strand frequency plots and 'energy dot plots', is available for the folding of single sequences. A variety of 'bulk' servers give less information, but in a shorter time and for up to hundreds of sequences at once. The portal for the mfold web server is http://www.bioinfo.rpi.edu/applications/mfold. This URL will be referred to as 'MFOLDROOT'.

  3. Short-term effects of post-fire logging on runoff and soil erosion at two spatial scales

    NASA Astrophysics Data System (ADS)

    Malvar, Maruxa; Silva, Flavio; Prats, Sergio; Vieira, Diana; Keizer, Jacob

    2017-04-01

    Logging is the most common management practice after wildfires in forested areas in Portugal. Clearcutting is undertaken to recover burnt timber resources, to control resprouting, notably in the case of eucalypt plantations, and to reduce the risk of insect plagues, notably in the case of maritime pine because of the nematode plague. Still, relatively little is known about the combined effect of wildfire and post-fire logging on erosion processes. In the framework of the EU-FP7 project RECARE (www.recare-project.eu), the ESP team of the University of Aveiro set up an experiment to quantify the hydrological and erosion impacts of post-fire logging at the scale of both 0.25 m2 micro-plots and 16 m2 plots. A eucalypt slope burnt in August 2015 by a moderate-intensity fire and logged in September 2015 was selected for this study. The burned trees were harvested with a chainsaw, while the logs were piled with a rubber-wheeled forwarder tractor. Following logging, two distinct sub-areas were identified within the logged slope based on soil disturbance: an area where the forwarder wheels had left marked trails ("trail") and an area where such trails were absent ("control"). Three micro-plots and three plots were installed in the control area, while three micro-plots and six plots were installed in the trail area. Generally, the trail area showed greater soil compaction and larger soil surface roughness than the control area. Between October 2015 and September 2016, mean runoff was 500 mm in the control micro-plots and 50% higher in the trail micro-plots; at the plot scale, however, no differences in runoff generation were observed between the two sub-areas. Sediment production over the same period was twice as high in the trail area as in the control area at both plot scales. In the control area, mean sediment production was 8 Mg ha-1 yr-1 at the micro-plot scale and 6 Mg ha-1 yr-1 at the plot scale; in the trail area, these figures were 21 Mg ha-1 yr-1 and 13 Mg ha-1 yr-1, respectively. Post-fire logging activities and their timing should be evaluated against their potential impacts on runoff and erosion, and additional erosion mitigation practices should be considered.

  4. Design of an ultra-portable field transfer radiometer supporting automated vicarious calibration

    NASA Astrophysics Data System (ADS)

    Anderson, Nikolaus; Thome, Kurtis; Czapla-Myers, Jeffrey; Biggar, Stuart

    2015-09-01

    The University of Arizona Remote Sensing Group (RSG) began outfitting the Radiometric Calibration Test Site (RadCaTS) at Railroad Valley, Nevada, in 2004 for automated vicarious calibration of Earth-observing sensors. RadCaTS was upgraded to use RSG custom 8-band ground-viewing radiometers (GVRs) beginning in 2011, and currently four GVRs are deployed, providing an average reflectance for the test site. This measurement of ground reflectance is the most critical component of vicarious calibration using the reflectance-based method. In order to ensure the quality of these measurements, RSG has been exploring more efficient and accurate methods of on-site calibration evaluation. This work describes the design of, and initial results from, a small portable transfer radiometer for validating GVR calibration on site. Prior to deployment, RSG uses high-accuracy laboratory calibration methods in order to provide radiance calibrations with low uncertainties for each GVR. After deployment, a solar-radiation-based calibration has typically been used. That method is highly dependent on a clear, stable atmosphere, requires at least two people to perform, is time-consuming in post-processing, and depends on several large pieces of equipment. In order to provide more regular and more accurate calibration monitoring, the small portable transfer radiometer is designed for quick, one-person operation and on-site field calibration comparison results. The radiometer is also suited for laboratory calibration use and thus could serve as a transfer radiometer calibration standard for ground-viewing radiometers at a RadCalNet site.

  5. European Randomized Study of Screening for Prostate Cancer Risk Calculator: External Validation, Variability, and Clinical Significance.

    PubMed

    Gómez-Gómez, Enrique; Carrasco-Valiente, Julia; Blanca-Pedregosa, Ana; Barco-Sánchez, Beatriz; Fernandez-Rueda, Jose Luis; Molina-Abril, Helena; Valero-Rosa, Jose; Font-Ugalde, Pilar; Requena-Tapia, Maria José

    2017-04-01

    To externally validate the European Randomized Study of Screening for Prostate Cancer (ERSPC) risk calculator (RC) and to evaluate its variability between 2 consecutive prostate-specific antigen (PSA) values. We prospectively catalogued 1021 consecutive patients before prostate biopsy for suspicion of prostate cancer (PCa). The risk of PCa and of significant PCa (Gleason score ≥7) in 749 patients was calculated according to the ERSPC-RC (digital rectal examination-based versions 3 and 4) for 2 consecutive PSA tests per patient. The calculator's predictions were analyzed using calibration plots and the area under the receiver operating characteristic curve (AUC). Cohen's kappa coefficient was used to assess agreement and variability. Of 749 patients, PCa was detected in 251 (33.5%) and significant PCa in 133 (17.8%). Calibration plots showed acceptable parallelism and similar discrimination ability for both PSA levels, with an AUC of 0.69 for PCa and 0.74 for significant PCa. The ERSPC-RC identified 226 (30.2%) unnecessary biopsies, at the cost of missing 10 significant PCa cases. The variability of the RC was 16% for PCa and 20% for significant PCa, and higher variability was associated with a reduced risk of significant PCa. We can conclude that the performance of the ERSPC-RC in the present cohort shows high similarity between the 2 PSA levels; however, the RC variability value is associated with a decreased risk of significant PCa. Use of the ERSPC-RC in our cohort identifies a high number of unnecessary biopsies; thus, incorporation of the ERSPC-RC could support the clinical decision to carry out a prostate biopsy. Copyright © 2016 Elsevier Inc. All rights reserved.
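
    The two evaluation tools named here, the calibration plot and the area under the ROC curve, can both be computed in a few lines. A minimal stdlib sketch on toy data (not the study's cohort): AUC via the rank (Mann-Whitney) statistic, and calibration as per-bin mean predicted risk versus observed event rate.

```python
def auc(scores, labels):
    """Area under the ROC curve as the probability that a random
    positive outranks a random negative; ties count one half."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def calibration_bins(probs, labels, n_bins=5):
    """Pair mean predicted risk with observed event rate per
    equal-size bin: the points plotted in a calibration plot."""
    order = sorted(range(len(probs)), key=lambda i: probs[i])
    size = len(order) // n_bins
    out = []
    for b in range(n_bins):
        idx = order[b * size:(b + 1) * size]
        mean_p = sum(probs[i] for i in idx) / len(idx)
        rate = sum(labels[i] for i in idx) / len(idx)
        out.append((mean_p, rate))
    return out

# Toy predicted risks and outcomes.
probs = [0.1, 0.2, 0.3, 0.4, 0.6, 0.7, 0.8, 0.9]
labels = [0, 0, 0, 1, 0, 1, 1, 1]
a = auc(probs, labels)
bins = calibration_bins(probs, labels, n_bins=4)
```

    A well-calibrated model yields bin points lying near the diagonal; points below it indicate the underestimation of actual probability that calibration plots are used to detect.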

  6. Erratum to: Energy calibration of gamma spectra in plastic scintillators using Compton kinematics [Nucl. Instr. and Meth. A 594 (2008) 232-243]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Siciliano, Edward R.; Ely, James H.; Kouzes, Richard T.

    2009-11-01

    In recent work at our laboratory, we were re-examining our data and found an inconsistency between the values listed for 137Cs in Table 2 (Siciliano et al. 2008) and the results plotted for that source in Figures 11 and 12. In the course of fitting the parabolic function (Equation 4) to the Compton maxima, two ranges of channels were used when determining the parameters for 137Cs. The parabolic fit curve shown in Figure 11 resulted from fitting channels 50 to 70. The parameters for that fit are: A = 0.972(12), B = 1.42(24) x 10^-3, and C0 = 60.2(5). The parameters for 137Cs listed in Table 2 (and also used to determine the calibration relations in Figure 12, the main result of this paper) came from fitting the 137Cs data in channels 40 to 80. Although the curves plotted from these two different sets of parameters would be visually distinguishable in Figure 11, when incorporated with the other isotope values shown in Figure 12 to obtain the linear energy-channel fit, the 50-70 channel parameter set plus the correction from the Compton maximum to the Compton edge gives a negligible change in the slope [6.470(41) as opposed to the reported 6.454(15) keV/channel] and a small change in the intercept [41(8) as opposed to 47(3) keV] for the dashed line. The conclusions of the article therefore do not change as a result of this inconsistency.
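
    The parabolic fit to a Compton maximum over a channel window amounts to a three-parameter least-squares problem. A sketch with synthetic counts peaking at channel 60 (the original article's Equation 4 parameterization is not reproduced here; this is a generic y = a + b*u + c*u^2 fit whose vertex gives the peak channel):

```python
def fit_parabola(xs, ys):
    """Least-squares fit of y = a + b*u + c*u**2 with u = x - mid,
    where mid is the window midpoint; shifting keeps the 3x3 normal
    equations well conditioned. Returns (mid, a, b, c)."""
    mid = (min(xs) + max(xs)) / 2.0
    us = [x - mid for x in xs]
    n = len(us)
    p = lambda k: sum(u ** k for u in us)
    # Augmented normal-equation matrix [M | v] for (a, b, c).
    m = [
        [n, p(1), p(2), sum(ys)],
        [p(1), p(2), p(3), sum(u * y for u, y in zip(us, ys))],
        [p(2), p(3), p(4), sum(u * u * y for u, y in zip(us, ys))],
    ]
    for col in range(3):                      # Gaussian elimination
        piv = max(range(col, 3), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, 3):
            f = m[r][col] / m[col][col]
            m[r] = [v - f * w for v, w in zip(m[r], m[col])]
    coef = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):                       # back substitution
        coef[r] = (m[r][3] - sum(m[r][k] * coef[k]
                                 for k in range(r + 1, 3))) / m[r][r]
    return (mid, *coef)

# Synthetic counts peaking at channel 60, fit over channels 50-70,
# mimicking the Compton-maximum fit window described above.
xs = list(range(50, 71))
ys = [1000.0 - 0.5 * (x - 60) ** 2 for x in xs]
mid, a, b, c = fit_parabola(xs, ys)
peak_channel = mid - b / (2.0 * c)
```

    The erratum's point that two fit windows (50-70 versus 40-80) yield slightly different parameters corresponds here to changing `xs` and observing the shift in `peak_channel`.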

  7. Australian validation of the Cancer of the Prostate Risk Assessment Post-Surgical score to predict biochemical recurrence after radical prostatectomy.

    PubMed

    Beckmann, Kerri; O'Callaghan, Michael; Vincent, Andrew; Roder, David; Millar, Jeremy; Evans, Sue; McNeil, John; Moretti, Kim

    2018-03-01

    The Cancer of the Prostate Risk Assessment Post-Surgical (CAPRA-S) score is a simple post-operative risk assessment tool predicting disease recurrence after radical prostatectomy, easily calculated from available clinical data. To be widely useful, risk tools require multiple external validations. We aimed to validate the CAPRA-S score in an Australian multi-institutional population, including private and public settings and reflecting community practice. The study population comprised all men on the South Australian Prostate Cancer Clinical Outcomes Collaborative Database with localized prostate cancer diagnosed during 1998-2013 who underwent radical prostatectomy without adjuvant therapy (n = 1664). Predictive performance was assessed via Kaplan-Meier and Cox proportional hazards regression analyses, Harrell's concordance index, calibration plots and decision curve analysis. Biochemical recurrence occurred in 342 (21%) cases. Five-year recurrence-free probabilities for CAPRA-S scores indicating low (0-2), intermediate (3-5) and high (≥6) risk were 95, 79 and 46%, respectively. The hazard ratio per CAPRA-S score increment was 1.56 (95% confidence interval 1.49-1.64). The concordance index for 5-year recurrence-free survival was 0.77. The calibration plot showed good correlation between predicted and observed recurrence-free survival across scores. Limitations include the retrospective design and the small numbers with higher CAPRA-S scores. The CAPRA-S score is an accurate predictor of recurrence after radical prostatectomy in our cohort, supporting its utility in the Australian setting. This simple tool can assist in post-surgical selection of patients who would benefit from adjuvant therapy while avoiding morbidity among those less likely to benefit. © 2017 Royal Australasian College of Surgeons.
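
    Harrell's concordance index used in this validation can be sketched directly from its definition on right-censored data; the toy cohort below is illustrative only, not the study's data.

```python
def harrell_c(times, events, risks):
    """Harrell's concordance index for right-censored data: among
    usable pairs (the earlier subject actually had the event), the
    fraction where the earlier failure also carries the higher
    predicted risk; ties in risk count one half."""
    conc = usable = 0.0
    n = len(times)
    for i in range(n):
        for j in range(n):
            # Pair usable only if subject i is observed to fail
            # before subject j's follow-up ends.
            if events[i] == 1 and times[i] < times[j]:
                usable += 1
                if risks[i] > risks[j]:
                    conc += 1
                elif risks[i] == risks[j]:
                    conc += 0.5
    return conc / usable

# Toy cohort: higher CAPRA-S-like score, earlier recurrence; the
# third subject (event=0) is censored.
times = [2.0, 4.0, 6.0, 8.0]
events = [1, 1, 0, 1]
cindex = harrell_c(times, events, [9.0, 7.0, 5.0, 3.0])      # perfect ranking
cindex_rev = harrell_c(times, events, [3.0, 5.0, 7.0, 9.0])  # reversed ranking
```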

  8. Development of a prognostic nomogram for cirrhotic patients with upper gastrointestinal bleeding.

    PubMed

    Zhou, Yu-Jie; Zheng, Ji-Na; Zhou, Yi-Fan; Han, Yi-Jing; Zou, Tian-Tian; Liu, Wen-Yue; Braddock, Martin; Shi, Ke-Qing; Wang, Xiao-Dong; Zheng, Ming-Hua

    2017-10-01

    Upper gastrointestinal bleeding (UGIB) is a complication with a high mortality rate in critically ill patients with cirrhosis. To date, few accurate scoring models have been designed specifically for mortality risk assessment in critically ill cirrhotic patients with upper gastrointestinal bleeding (CICGIB). Our aim was to develop and evaluate a novel nomogram-based model specific to CICGIB. Overall, 540 consecutive CICGIB patients were enrolled. On the basis of Cox regression analyses, a nomogram was constructed to estimate the probability of 30-day, 90-day, 270-day, and 1-year survival. An upper gastrointestinal bleeding-chronic liver failure-sequential organ failure assessment (UGIB-CLIF-SOFA) score was derived from the nomogram. Performance assessment and internal validation of the model were performed using Harrell's concordance index (C-index), calibration plots, and bootstrap sampling procedures. UGIB-CLIF-SOFA was also compared with other prognostic models, such as CLIF-SOFA and the model for end-stage liver disease, using C-indices. Eight independent factors derived from the Cox analysis (bilirubin, creatinine, international normalized ratio, sodium, albumin, mean arterial pressure, vasopressin use, and hematocrit decrease >10%) were assembled into the nomogram and the UGIB-CLIF-SOFA score. The calibration plots showed optimal agreement between nomogram predictions and actual observations. The C-index of the nomogram using bootstrapping (0.729; 95% confidence interval: 0.689-0.766) was higher than that of the other models for predicting survival of CICGIB. We have developed and internally validated a novel nomogram and an easy-to-use scoring system that accurately predicts the mortality probability of CICGIB on the basis of eight easy-to-obtain parameters. External validation is now warranted in future clinical studies.

  9. Clinical Nomograms to Predict Stone-Free Rates after Shock-Wave Lithotripsy: Development and Internal-Validation

    PubMed Central

    Kim, Jung Kwon; Ha, Seung Beom; Jeon, Chan Hoo; Oh, Jong Jin; Cho, Sung Yong; Oh, Seung-June; Kim, Hyeon Hoe; Jeong, Chang Wook

    2016-01-01

    Purpose: Shock-wave lithotripsy (SWL) is accepted as the first-line treatment modality for uncomplicated upper urinary tract stones; however, validated prediction models for stone-free rates (SFRs) are still needed. We aimed to develop nomograms predicting SFRs after the first and within the third session of SWL. Computed tomography (CT) information was also modeled for constructing nomograms. Materials and Methods: From March 2006 to December 2013, 3028 patients were treated with SWL for ureter and renal stones at our three tertiary institutions. Four cohorts were constructed: Total-development, Total-validation, CT-development, and CT-validation cohorts. The nomograms were developed using multivariate logistic regression models with variables found significant in univariate logistic regression. A C-index was used to assess the discrimination accuracy of the nomograms, and calibration plots were used to analyze the consistency of prediction. Results: The SFR after the first and within the third session was 48.3% and 68.8%, respectively. Significant variables were sex, stone location, stone number, and maximal stone diameter in the Total-development cohort; mean Hounsfield unit (HU) and grade of hydronephrosis (HN) were additional parameters in the CT-development cohort. The C-indices were 0.712 and 0.723 for after the first and within the third session of SWL in the Total-development cohort, and 0.755 and 0.756 in the CT-development cohort, respectively. The calibration plots showed good correspondence. Conclusions: We constructed and validated nomograms to predict SFR after SWL. To the best of our knowledge, these are the first graphical nomograms to be modeled with CT information. They may be useful for patient counseling and treatment decision-making. PMID:26890006
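
    A graphical nomogram of the kind described here rescales each predictor's contribution to the logistic model's linear predictor onto a shared 0-100 point axis, and maps total points back to a probability. A sketch with hypothetical two-variable coefficients and ranges (not the published SWL model):

```python
import math

def nomogram_points(coefs, values, ranges):
    """Convert per-variable logistic contributions into nomogram
    points: the variable with the largest possible contribution
    spans 0-100, and the others are scaled to the same axis."""
    spans = [abs(c) * (hi - lo) for c, (lo, hi) in zip(coefs, ranges)]
    top = max(spans)
    pts = []
    for c, v, (lo, hi) in zip(coefs, values, ranges):
        base = lo if c > 0 else hi  # value giving the smallest contribution
        pts.append(abs(c * (v - base)) / top * 100.0)
    return pts

def probability(intercept, coefs, values):
    """Predicted stone-free probability from the linear predictor."""
    lp = intercept + sum(c * v for c, v in zip(coefs, values))
    return 1.0 / (1.0 + math.exp(-lp))

# Hypothetical model: stone diameter (mm) and mean HU (in hundreds),
# both reducing the stone-free probability.
coefs = [-0.15, -0.2]
ranges = [(5.0, 25.0), (3.0, 14.0)]
values = [10.0, 8.0]
pts = nomogram_points(coefs, values, ranges)
prob = probability(3.5, coefs, values)
```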

  10. Joint inversion of 3-PG using eddy-covariance and inventory plot measurements in temperate-maritime conifer forests: Uncertainty in transient carbon-balance responses to climate change

    NASA Astrophysics Data System (ADS)

    Hember, R. A.; Kurz, W. A.; Coops, N. C.; Black, T. A.

    2010-12-01

    Temperate-maritime forests of coastal British Columbia store large amounts of carbon (C) in soil, detritus, and trees. To better understand the sensitivity of these C stocks to climate variability, simulations were conducted using a hybrid version of the model Physiological Principles Predicting Growth (3-PG), combined with algorithms from the Carbon Budget Model of the Canadian Forest Sector, version 3 (CBM-CFS3), to account for full ecosystem C dynamics. The model was optimized, by means of Markov chain Monte Carlo sampling, against a combination of monthly CO2 and H2O flux measurements from three eddy-covariance systems and multi-annual stemwood growth (Gsw) and mortality (Msw) derived from 1300 permanent sample plots. The calibrated model serves as an unbiased estimator of stemwood C with enhanced precision over that of strictly empirical models, minimized reliance on local prescriptions, and the flexibility to study impacts of environmental change on regional C stocks. We report the contribution of each dataset in identifying key physiological parameters and the posterior uncertainty in predictions of net ecosystem production (NEP). The calibrated model was used to spin up pre-industrial C pools and estimate the sensitivity of the regional net carbon balance to a gradient of temperature changes, λ=ΔC/ΔT, during three 62-year harvest rotations spanning 1949-2135. Simulations suggest that regional net primary production, tree mortality, and heterotrophic respiration all began increasing, while NEP began decreasing, in response to warming following the 1976 shift in northeast-Pacific climate. We quantified the uncertainty of λ and how it was mediated by initial dead C, tree mortality, precipitation change, and the time horizon over which it was calculated.
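
    The Markov chain Monte Carlo sampling used for the model inversion can be illustrated with a minimal Metropolis sampler on a toy one-parameter posterior; the actual calibration jointly sampled many physiological parameters against flux and plot likelihoods, which this sketch does not attempt.

```python
import math
import random

def metropolis(log_post, start, n_steps, step=0.5, seed=42):
    """Minimal Metropolis MCMC: propose a Gaussian step, accept
    with probability min(1, posterior ratio), otherwise stay."""
    rng = random.Random(seed)
    x, lp = start, log_post(start)
    chain = []
    for _ in range(n_steps):
        prop = x + rng.gauss(0.0, step)
        lp_prop = log_post(prop)
        if math.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
        chain.append(x)
    return chain

# Toy posterior: one parameter with Gaussian log-posterior centred
# at 2; the chain should settle around that value after burn-in.
chain = metropolis(lambda t: -0.5 * (t - 2.0) ** 2, start=0.0, n_steps=5000)
mean = sum(chain[1000:]) / len(chain[1000:])
```

    The posterior spread of the retained samples is what propagates into the "posterior uncertainty in predictions of NEP" reported above.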

  11. Development and validation of a prediction model for functional decline in older medical inpatients.

    PubMed

    Takada, Toshihiko; Fukuma, Shingo; Yamamoto, Yosuke; Tsugihashi, Yukio; Nagano, Hiroyuki; Hayashi, Michio; Miyashita, Jun; Azuma, Teruhisa; Fukuhara, Shunichi

    2018-05-17

    To prevent functional decline in older inpatients, identification of high-risk patients is crucial. The aim of this study was to develop and validate a prediction model to assess the risk of functional decline in older medical inpatients. In this retrospective cohort study, patients aged ≥65 years admitted acutely to medical wards were included. A healthcare database of 246 acute care hospitals (n = 229,913) was used for derivation, and two acute care hospitals (n = 1767 and 5443, respectively) were used for validation. Data were collected using a national administrative claims and discharge database. Functional decline was defined as a decline in the Katz score at discharge compared with admission. About 6% of patients in the derivation cohort and 9% and 2% in the two validation cohorts developed functional decline. A model with 7 items (age, body mass index, living in a nursing home, ambulance use, need for assistance in walking, dementia, and bedsore) was developed. On internal validation, it demonstrated a c-statistic of 0.77 (95% confidence interval (CI) = 0.767-0.771) and good fit on the calibration plot. On external validation, the c-statistics were 0.79 (95% CI = 0.77-0.81) and 0.75 (95% CI = 0.73-0.77) for the two cohorts, respectively. Calibration plots showed good fit in one cohort and overestimation in the other. A prediction model for functional decline in older medical inpatients was derived and validated. Use of the model is expected to lead to early identification of high-risk patients and the introduction of early interventions. Copyright © 2018 Elsevier B.V. All rights reserved.

  12. Assessing the performance of handheld glucose testing for critical care.

    PubMed

    Kost, Gerald J; Tran, Nam K; Louie, Richard F; Gentile, Nicole L; Abad, Victor J

    2008-12-01

    We assessed the performance of a point-of-care (POC) glucose meter system (GMS) with a multitasking test strip by using the locally-smoothed (LS) median absolute difference (MAD) curve method in conjunction with a modified Bland-Altman difference plot and superimposed International Organization for Standardization (ISO) 15197 tolerance bands. We also analyzed performance for tight glycemic control (TGC). A modified glucose oxidase enzyme with a multilayer-gold, multielectrode, four-well test strip (StatStrip, NOVA Biomedical, Waltham, MA) was used; there was no test strip calibration code. A pragmatic comparison was made of GMS results versus paired plasma glucose measurements from chemistry analyzers in clinical laboratories. Venous samples (n = 1,703) were analyzed at 35 hospitals that used 20 types of chemistry analyzers. Erroneous results were identified using the Bland-Altman plot and ISO 15197 criteria. Discrepant values were analyzed for the TGC interval of 80-110 mg/dL. The GMS met ISO 15197 guidelines; 98.6% (410 of 416) of observations were within tolerance for glucose <75 mg/dL, and for ≥75 mg/dL, 100% were within tolerance. Paired differences (handheld minus reference) averaged -2.2 (SD 9.8) mg/dL; the median was -1 (range, -96 to 45) mg/dL. LS MAD curve analysis revealed satisfactory performance below 186 mg/dL; above 186 mg/dL, the recommended error tolerance limit (5 mg/dL) was not met. No discrepant values appeared, and all points fell in Clarke error grid zone A. Linear regression gave y = 1.018x - 0.716 mg/dL, with r^2 = 0.995. LS MAD curves draw on the human ability to discriminate performance visually. LS MAD curve and ISO 15197 performance were acceptable for TGC. POC and reference glucose calibration should be harmonized and standardized.
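
    The Bland-Altman analysis and ISO 15197 tolerance screening described here reduce to a few summary statistics. A sketch assuming the 2003-edition tolerances implied by the 75 mg/dL threshold above (within ±15 mg/dL below 75 mg/dL, within ±20% otherwise), with made-up paired readings:

```python
def bland_altman(meter, reference):
    """Paired differences (meter minus reference) with mean bias and
    SD: the quantities shown on a Bland-Altman difference plot."""
    diffs = [m - r for m, r in zip(meter, reference)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = (sum((d - bias) ** 2 for d in diffs) / (n - 1)) ** 0.5
    return diffs, bias, sd

def within_iso15197(meter, reference):
    """Fraction of pairs inside ISO 15197:2003-style tolerance:
    +/-15 mg/dL when the reference is below 75 mg/dL, +/-20% of the
    reference value otherwise (assumed limits, see lead-in)."""
    ok = 0
    for m, r in zip(meter, reference):
        tol = 15.0 if r < 75.0 else 0.20 * r
        ok += abs(m - r) <= tol
    return ok / len(meter)

# Made-up paired readings (mg/dL); the last pair falls outside tolerance.
reference = [50.0, 70.0, 100.0, 200.0]
meter = [60.0, 58.0, 115.0, 250.0]
diffs, bias, sd = bland_altman(meter, reference)
frac_ok = within_iso15197(meter, reference)
```

    Plotting `diffs` against the pair means, with the tolerance band superimposed, reproduces the modified difference plot the study used to flag erroneous results.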

  13. Validation of a point-of-care (POC) lactate testing device for fetal scalp blood sampling during labor: clinical considerations, practicalities and realities.

    PubMed

    Reif, Philipp; Lakovschek, Ioanna; Tappauf, Carmen; Haas, Josef; Lang, Uwe; Schöll, Wolfgang

    2014-06-01

    Although fetal blood sampling for pH is well established, the use of lactate has not been widely adopted. This study validated the performance and utility of a handheld point-of-care (POC) lactate device in comparison with the lactate and pH values obtained by the ABL 800 blood gas analyzer. The clinical performance and the influences on accuracy and decision-making criteria were assessed with freshly taken fetal scalp blood samples (n=57) and umbilical cord samples (n=310). A Bland-Altman plot was used to analyze the agreement between the two measurement devices, and correlation coefficients (R²) were determined using Passing-Bablok regression analysis. Sample processing errors were much lower with the test device than with the reference method (0.5% vs. 22.8%). Following a preclinical assessment and calibration offset alignment (0.5 mmol/L), the test POC device showed good correlation with the reference method for lactate FBS (R²=0.977, p<0.0001, 95% CI 0.959-0.988), arterial cord blood (R²=0.976, p<0.0001, 95% CI 0.967-0.983) and venous cord blood (R²=0.977, p<0.0001, 95% CI 0.968-0.984). A POC device that allows a calibration adjustment to be made following preclinical testing can provide results that correlate closely with an incumbent lactate method such as a blood gas analyzer. The use of a POC lactate device can address the impracticalities of pH sample collection and the testing failures experienced in day-to-day clinical practice. For the StatStrip Lactate meter we suggest using a lactate cut-off of 5.1 mmol/L for predicting fetal acidosis (pH<7.20).
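
    Passing-Bablok regression, used here to compare the meter against the blood gas analyzer, estimates the slope from the pairwise slopes of all point pairs. The sketch below takes the plain median (the full estimator additionally shifts the median rank to correct for slopes below -1) and uses illustrative paired values carrying the study's 0.5 mmol/L offset:

```python
import statistics

def passing_bablok_simplified(xs, ys):
    """Simplified Passing-Bablok regression: slope is the median of
    all pairwise slopes, intercept the median of y - slope*x. The
    full estimator offsets the median rank to remain unbiased when
    some pairwise slopes fall below -1."""
    slopes = []
    n = len(xs)
    for i in range(n):
        for j in range(i + 1, n):
            if xs[j] != xs[i]:
                slopes.append((ys[j] - ys[i]) / (xs[j] - xs[i]))
    b = statistics.median(slopes)
    a = statistics.median(y - b * x for y, x in zip(ys, xs))
    return a, b

# Illustrative meter-vs-analyzer lactate pairs (mmol/L) with a
# constant 0.5 mmol/L offset, echoing the calibration alignment.
analyzer = [1.0, 2.0, 3.0, 4.0, 6.0]
meter = [1.5, 2.5, 3.5, 4.5, 6.5]
a, b = passing_bablok_simplified(analyzer, meter)
```

    A slope near 1 with a nonzero intercept, as here, is exactly the pattern a constant calibration offset produces.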

  14. Satellite inventory of Minnesota forest resources

    NASA Technical Reports Server (NTRS)

    Bauer, Marvin E.; Burk, Thomas E.; Ek, Alan R.; Coppin, Pol R.; Lime, Stephen D.; Walsh, Terese A.; Walters, David K.; Befort, William; Heinzen, David F.

    1993-01-01

    The methods and results of using Landsat Thematic Mapper (TM) data to classify and estimate the acreage of forest cover types in northeastern Minnesota are described. Portions of six TM scenes covering five counties with a total area of 14,679 square miles were classified into six forest and five nonforest classes. The approach involved the integration of cluster sampling, image processing, and estimation. Using cluster sampling, 343 plots, each 88 acres in size, were photo-interpreted and field mapped as a source of reference data for classifier training and calibration of the TM classifications. Classification accuracies of up to 75 percent were achieved; most misclassification was between similar or related classes. An inverse method of calibration, based on the error rates obtained from the classifications of the cluster plots, was used to adjust the classified class proportions for classification errors. The resulting area estimates for total forest land in the five-county area were within 3 percent of the estimate made independently by the USDA Forest Service. Area estimates for conifer and hardwood forest types were within 0.8 and 6.0 percent, respectively, of the Forest Service estimates. A trial of a second method of estimating the same classes as the Forest Service resulted in standard errors of 0.002 to 0.015. A study of the use of multidate TM data for change detection showed that forest canopy depletion, canopy increment, and no change could be identified with greater than 90 percent accuracy. The project results have been the basis for the Minnesota Department of Natural Resources and the Forest Service to define and begin to implement an annual system of forest inventory that utilizes Landsat TM data to detect changes in forest cover.
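
    The inverse calibration step, adjusting mapped class proportions by the error rates estimated from the cluster plots, can be sketched for a two-class case: the observed proportions are a linear mix of the true ones through the confusion rates, so the system is inverted directly. The rates and proportions below are hypothetical.

```python
def inverse_calibrate(classified_props, confusion_rates):
    """Inverse calibration of mapped class proportions. If
    confusion_rates[i][j] is the probability that a pixel of true
    class i is classified as j (rows sum to 1), the observed
    proportions satisfy q_j = sum_i p_i * rates[i][j]. Solve the
    2x2 system q = R^T p for the true proportions p."""
    (a, b), (c, d) = confusion_rates  # rows: true class; cols: mapped class
    det = a * d - b * c
    q0, q1 = classified_props
    p0 = (d * q0 - c * q1) / det
    p1 = (a * q1 - b * q0) / det
    return p0, p1

# Hypothetical rates: forest mapped as forest 90% of the time,
# nonforest mistakenly mapped as forest 20% of the time.
rates = [[0.9, 0.1], [0.2, 0.8]]
true_props = (0.6, 0.4)
observed = (0.6 * 0.9 + 0.4 * 0.2, 0.6 * 0.1 + 0.4 * 0.8)
recovered = inverse_calibrate(observed, rates)
```

    Applied to area proportions, this is how classification-error-adjusted estimates can land within a few percent of an independent inventory even when raw map accuracy is 75 percent.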

  15. Hotspots engineering by grafting Au@Ag core-shell nanoparticles on the Au film over slightly etched nanoparticles substrate for on-site paraquat sensing.

    PubMed

    Wang, Chaoguang; Wu, Xuezhong; Dong, Peitao; Chen, Jian; Xiao, Rui

    2016-12-15

    Paraquat (PQ) pollution is highly toxic to humans and difficult to decompose in the environment, thus requiring an on-site detection strategy. Herein, we developed a robust and rapid PQ sensing strategy based on the surface-enhanced Raman scattering (SERS) technique. A hybrid SERS substrate was prepared by grafting the Au@Ag core-shell nanoparticles (NPs) on the Au film over slightly etched nanoparticles (Au FOSEN). Hotspots were engineered at the junctions as indicated by the finite difference time domain calculation. SERS performance of the hybrid substrate was explored using p-ATP as the Raman probe. The hybrid substrate gives a higher enhancement factor compared to either the Au FOSEN substrate or the Au@Ag core-shell NPs, and exhibits excellent reproducibility, homogeneity and stability. The proposed SERS substrates were prepared in batches for practical PQ sensing. The total analysis time for a single sample, including pre-treatment and measurement, was less than 5 min with a PQ detection limit of 10 nM. Peak intensities of the SERS signal were plotted as a function of PQ concentration to calibrate the sensitivity by fitting the Hill equation. The plotted calibration curve showed good log-log linearity with a coefficient of determination of 0.98. The selectivity of the sensing proposal was based on the "finger print" Raman spectra of the analyte. The proposed substrate exhibited good recovery when applied to real water samples, including lab tap water, bottled water, and commercially obtained apple juice and grape juice. This SERS-based PQ detection method is simple, rapid, sensitive and selective, and shows great potential in monitoring pesticide residues and additives abuse. Copyright © 2016 Elsevier B.V. All rights reserved.
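
    The Hill-equation calibration of SERS intensity versus concentration can be sketched as follows; all parameter values here are illustrative assumptions, not the paper's fitted constants. Below the half-saturation concentration the Hill curve is approximately log-log linear, which is the regime the reported calibration exploits:

```python
import numpy as np

def hill(c, i_max, k, n):
    """Hill equation: saturating response (SERS peak intensity) vs concentration."""
    return i_max * c**n / (k**n + c**n)

# Hypothetical parameters -- illustrative only, not the paper's fit.
i_max, k, n = 1000.0, 5e-6, 1.0
conc = np.logspace(-8, -6.5, 8)          # 10 nM to ~300 nM, well below k
intensity = hill(conc, i_max, k, n)

# In the low-concentration regime (c << k) log(I) is linear in log(c)
# with slope approaching the Hill coefficient n:
slope, intercept = np.polyfit(np.log10(conc), np.log10(intensity), 1)
```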

  16. A Microcomputer-Based Program for Printing Check Plots of Integrated Circuits Specified in Caltech Intermediate Form.

    DTIC Science & Technology

    1984-12-01

    only four transistors[5]. Each year since that time, the semiconductor industry has consistently improved the quality of the fabrication techniques...rarely took place at universities and was almost exclusively confined to industry. IC design techniques were developed, tested, and taught only in the...community, it is not uncommon for industry to borrow ideas and even particular programs from these university-designed tools. The Very Large Scale Integration

  17. Measurement of forest disturbance and regrowth with Landsat and forest inventory and analysis data: anticipated benefits from Forest Inventory and Analysis' collaboration with the National Aeronautics and Space Administration and university partners

    Treesearch

    Sean Healey; Gretchen Moisen; Jeff Masek; Warren Cohen; Sam Goward; et al.

    2007-01-01

    The Forest Inventory and Analysis (FIA) program has partnered with researchers from the National Aeronautics and Space Administration, the University of Maryland, and other U.S. Department of Agriculture Forest Service units to identify disturbance patterns across the United States using FIA plot data and time series of Landsat satellite images. Spatially explicit...

  18. Results from Source-Based and Detector-Based Calibrations of a CLARREO Calibration Demonstration System

    NASA Technical Reports Server (NTRS)

    Angal, Amit; Mccorkel, Joel; Thome, Kurt

    2016-01-01

    The Climate Absolute Radiance and Refractivity Observatory (CLARREO) mission is formulated to determine long-term climate trends using SI-traceable measurements. The CLARREO mission will include instruments operating in the reflected solar (RS) wavelength region from 320 nm to 2300 nm. The Solar, Lunar for Absolute Reflectance Imaging Spectroradiometer (SOLARIS) is the calibration demonstration system (CDS) for the reflected solar portion of CLARREO and facilitates testing and evaluation of calibration approaches. The basis of CLARREO and SOLARIS calibration is the Goddard Laser for Absolute Measurement of Response (GLAMR) that provides a radiance-based calibration at reflective solar wavelengths using continuously tunable lasers. SI-traceability is achieved via detector-based standards that, in GLAMR's case, are a set of NIST-calibrated transfer radiometers. A portable version of the SOLARIS, Suitcase SOLARIS, is used to evaluate GLAMR's calibration accuracies. The calibration of Suitcase SOLARIS using GLAMR agrees with that obtained from source-based results of the Remote Sensing Group (RSG) at the University of Arizona to better than 5% (k=2) in the 720-860 nm spectral range. The differences are within the uncertainties of the NIST-calibrated FEL lamp-based approach of RSG and give confidence that GLAMR is operating at 5% (k=2) absolute uncertainties. Limitations of the Suitcase SOLARIS instrument are also discussed, and the next edition of the SOLARIS instrument (Suitcase SOLARIS-2) is expected to provide an improved mechanism to further assess GLAMR and CLARREO calibration approaches. (2016) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).

  19. Calibrated birth-death phylogenetic time-tree priors for Bayesian inference.

    PubMed

    Heled, Joseph; Drummond, Alexei J

    2015-05-01

    Here we introduce a general class of multiple calibration birth-death tree priors for use in Bayesian phylogenetic inference. All tree priors in this class separate ancestral node heights into a set of "calibrated nodes" and "uncalibrated nodes" such that the marginal distribution of the calibrated nodes is user-specified whereas the density ratio of the birth-death prior is retained for trees with equal values for the calibrated nodes. We describe two formulations, one in which the calibration information informs the prior on ranked tree topologies, through the (conditional) prior, and the other which factorizes the prior on divergence times and ranked topologies, thus allowing uniform, or any arbitrary prior distribution on ranked topologies. Although the first of these formulations has some attractive properties, the algorithm we present for computing its prior density is computationally intensive. However, the second formulation is always faster and computationally efficient for up to six calibrations. We demonstrate the utility of the new class of multiple-calibration tree priors using both small simulations and a real-world analysis and compare the results to existing schemes. The two new calibrated tree priors described in this article offer greater flexibility and control of prior specification in calibrated time-tree inference and divergence time dating, and will remove the need for indirect approaches to the assessment of the combined effect of calibration densities and tree priors in Bayesian phylogenetic inference. © The Author(s) 2014. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.

  20. Visualizing Qualitative Information

    ERIC Educational Resources Information Center

    Slone, Debra J.

    2009-01-01

    The abundance of qualitative data in today's society and the need to easily scrutinize, digest, and share this information calls for effective visualization and analysis tools. Yet, no existing qualitative tools have the analytic power, visual effectiveness, and universality of familiar quantitative instruments like bar charts, scatter-plots, and…

  1. Effects of processed oil shale on the element content of Atriplex canescens

    USGS Publications Warehouse

    Anderson, B.M.

    1982-01-01

    Samples of four-wing saltbush were collected from the Colorado State University Intensive Oil Shale Revegetation Study Site test plots in the Piceance basin, Colorado. The test plots were constructed to evaluate the effects of processed oil shale geochemistry on plant growth using various thicknesses of soil cover over the processed shale and/or over a gravel barrier between the shale and soil. Generally, the thicker the soil cover, the less the influence of the shale geochemistry on the element concentrations in the plants. Concentrations of 20 elements were larger in the ash of four-wing saltbush growing on the plot with the gravel barrier (between the soil and processed shale) when compared to the sample from the control plot. A greater water content in the soil in this plot has been reported, and the interaction between the increased, percolating water and shale may have increased the availability of these elements for plant uptake. Concentrations of boron, copper, fluorine, lithium, molybdenum, selenium, silicon, and zinc were larger in the samples grown over processed shale, compared to those from the control plot, and concentrations for barium, calcium, lanthanum, niobium, phosphorus, and strontium were smaller. Concentrations for arsenic, boron, fluorine, molybdenum, and selenium--considered to be potential toxic contaminants--were similar to results reported in the literature for vegetation from the test plots. The copper-to-molybdenum ratios in three of the four samples of four-wing saltbush growing over the processed shale were below the ratio of 2:1, which is judged detrimental to ruminants, particularly cattle. Boron concentrations averaged 140 ppm, well above the phytotoxicity level for most plant species. Arsenic, fluorine, and selenium concentrations were below toxic levels, and thus should not present any problem for revegetation or forage use at this time.

  2. Software for Preprocessing Data from Rocket-Engine Tests

    NASA Technical Reports Server (NTRS)

    Cheng, Chiu-Fu

    2004-01-01

    Three computer programs have been written to preprocess digitized outputs of sensors during rocket-engine tests at Stennis Space Center (SSC). The programs apply exclusively to the SSC E test-stand complex and utilize the SSC file format. The programs are the following: Engineering Units Generator (EUGEN) converts sensor-output-measurement data to engineering units. The inputs to EUGEN are raw binary test-data files, which include the voltage data, a list identifying the data channels, and time codes. EUGEN effects conversion by use of a file that contains calibration coefficients for each channel. QUICKLOOK enables immediate viewing of a few selected channels of data, in contradistinction to viewing only after post-test processing (which can take 30 minutes to several hours depending on the number of channels and other test parameters) of data from all channels. QUICKLOOK converts the selected data into a form in which they can be plotted in engineering units by use of Winplot (a free graphing program written by Rick Paris). EUPLOT provides a quick means for looking at data files generated by EUGEN without the necessity of relying on the PV-WAVE based plotting software.
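
    The per-channel conversion EUGEN performs (raw sensor voltages to engineering units via a file of calibration coefficients) can be sketched as a simple linear mapping; the channel names, coefficients, and file layout here are hypothetical, not SSC's actual format:

```python
# Minimal sketch of per-channel engineering-units conversion in the spirit
# of EUGEN. Coefficients are (offset, gain) pairs: eu = offset + gain * volts.
# All names and values are hypothetical.
CAL = {
    "chamber_pressure": (0.0, 500.0),   # psi per volt
    "fuel_temp":        (-50.0, 25.0),  # degC per volt
}

def to_engineering_units(channel, volts):
    """Apply the channel's calibration coefficients to a raw voltage."""
    offset, gain = CAL[channel]
    return offset + gain * volts

readings = {"chamber_pressure": 1.2, "fuel_temp": 3.0}   # raw volts
converted = {ch: to_engineering_units(ch, v) for ch, v in readings.items()}
```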

  3. PredictABEL: an R package for the assessment of risk prediction models.

    PubMed

    Kundu, Suman; Aulchenko, Yurii S; van Duijn, Cornelia M; Janssens, A Cecile J W

    2011-04-01

    The rapid identification of genetic markers for multifactorial diseases from genome-wide association studies is fuelling interest in investigating the predictive ability and health care utility of genetic risk models. Various measures are available for the assessment of risk prediction models, each addressing a different aspect of performance and utility. We developed PredictABEL, a package in R that covers descriptive tables, measures and figures that are used in the analysis of risk prediction studies such as measures of model fit, predictive ability and clinical utility, and risk distributions, calibration plot and the receiver operating characteristic plot. Tables and figures are saved as separate files in a user-specified format, which include publication-quality EPS and TIFF formats. All figures are available in a ready-made layout, but they can be customized to the preferences of the user. The package has been developed for the analysis of genetic risk prediction studies, but can also be used for studies that only include non-genetic risk factors. PredictABEL is freely available at the websites of GenABEL ( http://www.genabel.org ) and CRAN ( http://cran.r-project.org/).
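
    The calibration plot mentioned above compares observed event rates with mean predicted risks within risk groups. PredictABEL itself is an R package; the following Python sketch only illustrates the underlying computation, using simulated, perfectly calibrated risks:

```python
import numpy as np

def calibration_points(pred, outcome, bins=5):
    """Group predicted risks into quantile bins and return, per bin,
    (mean predicted risk, observed event fraction) -- the points a
    calibration plot draws against the 45-degree line."""
    order = np.argsort(pred)
    pts = []
    for chunk in np.array_split(order, bins):
        pts.append((pred[chunk].mean(), outcome[chunk].mean()))
    return pts

rng = np.random.default_rng(0)
p = rng.uniform(0.05, 0.95, 2000)               # predicted risks
y = (rng.uniform(size=2000) < p).astype(int)    # outcomes drawn at those risks
pts = calibration_points(p, y)
```

    For a well-calibrated model the points lie near the identity line; systematic departure (as in the nomogram validation of record 1 above) indicates over- or underestimation of risk.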

  4. X-ray excited Auger transitions of Pu compounds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, Art J., E-mail: nelson63@llnl.gov; Grant, William K.; Stanford, Jeff A.

    2015-05-15

    X-ray excited Pu core–valence–valence and core–core–valence Auger line-shapes were used in combination with the Pu 4f photoelectron peaks to characterize differences in the oxidation state and local electronic structure for Pu compounds. The evolution of the Pu 4f core-level chemical shift as a function of sputtering depth profiling and hydrogen exposure at ambient temperature was quantified. The combination of the core–valence–valence Auger peak energies with the associated chemical shift of the Pu 4f photoelectron line defines the Auger parameter and results in a reliable method for definitively determining oxidation states independent of binding energy calibration. Results show that PuO₂, Pu₂O₃, PuH₂.₇, and Pu have definitive Auger line-shapes. These data were used to produce a chemical state (Wagner) plot for select plutonium oxides. This Wagner plot allowed us to distinguish between the trivalent hydride and the trivalent oxide, which cannot be differentiated by the Pu 4f binding energy alone.
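
    The Auger-parameter argument can be stated compactly; this is the standard XPS relation, not values from this study:

```latex
\alpha' \;=\; E_{\mathrm{kin}}\!\left(\text{CVV Auger}\right)
        \;+\; E_{\mathrm{B}}\!\left(\text{Pu }4f\right)
```

    Because a reference-level or charging shift adds the same offset to measured binding energies that it subtracts from kinetic energies, the offset cancels in the sum, which is why the Auger parameter is independent of binding-energy calibration.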

  5. Iterative Boltzmann plot method for temperature and pressure determination in a xenon high pressure discharge lamp

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zalach, J.; Franke, St.

    2013-01-28

    The Boltzmann plot method allows plasma temperatures and pressures to be calculated if absolutely calibrated emission coefficients of spectral lines are available. However, xenon arcs are not very well suited to be analyzed this way, as there are only a limited number of lines with atomic data available. These lines have high excitation energies in a small interval between 9.8 and 11.5 eV. Uncertainties in the experimental method and in the atomic data further limit the accuracy of the evaluation procedure. This may result in implausible values of temperature and pressure with inadmissible uncertainty. To overcome these shortcomings, an iterative scheme is proposed that makes use of additional information about the xenon fill pressure. This method is proved to be robust against noisy data and significantly reduces the uncertainties. Intentionally distorted synthetic data are used to illustrate the performance of the method, and measurements performed on a laboratory xenon high pressure discharge lamp are analyzed, resulting in reasonable temperatures and pressures with significantly reduced uncertainties.
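
    The basic (non-iterative) Boltzmann plot step the abstract builds on reduces to a straight-line fit: plot ln(ελ/gA) for each line against its upper-level excitation energy and read the temperature from the slope −1/(k_B T). A sketch with synthetic, noise-free values (not xenon atomic data):

```python
import numpy as np

K_B = 8.617333e-5  # Boltzmann constant, eV/K

# Synthetic Boltzmann-plot ordinates for an assumed temperature --
# illustrative only, not measured xenon line data.
T_true = 6000.0
E_upper = np.array([9.8, 10.2, 10.7, 11.1, 11.5])   # upper-level energies, eV
y = 3.0 - E_upper / (K_B * T_true)                  # ln(eps*lambda/(g*A)) + const

slope, _ = np.polyfit(E_upper, y, 1)   # slope = -1 / (k_B T)
T_est = -1.0 / (K_B * slope)
```

    With real lines confined to the narrow 9.8-11.5 eV interval, small noise on y changes this slope drastically, which is exactly what motivates the paper's iterative scheme using the fill pressure as extra information.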

  6. Software for Preprocessing Data From Rocket-Engine Tests

    NASA Technical Reports Server (NTRS)

    Cheng, Chiu-Fu

    2003-01-01

    Three computer programs have been written to preprocess digitized outputs of sensors during rocket-engine tests at Stennis Space Center (SSC). The programs apply exclusively to the SSC E test-stand complex and utilize the SSC file format. The programs are the following: (1) Engineering Units Generator (EUGEN) converts sensor-output-measurement data to engineering units. The inputs to EUGEN are raw binary test-data files, which include the voltage data, a list identifying the data channels, and time codes. EUGEN effects conversion by use of a file that contains calibration coefficients for each channel. (2) QUICKLOOK enables immediate viewing of a few selected channels of data, in contradistinction to viewing only after post-test processing (which can take 30 minutes to several hours depending on the number of channels and other test parameters) of data from all channels. QUICKLOOK converts the selected data into a form in which they can be plotted in engineering units by use of Winplot. (3) EUPLOT provides a quick means for looking at data files generated by EUGEN without the necessity of relying on the PV-WAVE-based plotting software.

  7. Omega Dante Soft X-Ray Power Diagnostic Component Calibration at the National Synchrotron Light Source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campbell, K; Weber, F; Dewald, E

    2004-04-15

    The Dante soft x-ray spectrometer installed on the Omega laser facility at the Laboratory for Laser Energetics, University of Rochester is a twelve-channel filter-edge-defined x-ray power diagnostic. It is used to measure the absolute flux from direct drive, indirect drive (hohlraums) and other plasma sources. Calibration efforts using two beam lines, U3C (50 eV-1 keV) and X8A (1 keV-6 keV), at the National Synchrotron Light Source (NSLS) have been implemented to ensure the accuracy of these measurements. We have calibrated vacuum x-ray diodes, mirrors and filters.

  8. Calibration of an arbitrarily arranged projection moiré system for 3D shape measurement

    NASA Astrophysics Data System (ADS)

    Tang, Ying; Yao, Jun; Zhou, Yihao; Sun, Chen; Yang, Peng; Miao, Hong; Chen, Jubing

    2018-05-01

    An arbitrarily arranged projection moiré system is presented for three-dimensional shape measurement. We develop a model for the projection moiré system and derive a universal formula expressing the relation between height and the phase variation before and after the object is placed on the reference plane. With so many system parameters involved, a system calibration technique is needed. In this work, we provide a robust and accurate calibration method for an arbitrarily arranged projection moiré system. The system no longer puts restrictions on the configuration of the optical setup. Real experiments have been conducted to verify the validity of this method.

  9. Simulation of hydrodynamics and solute transport in the Pamlico River estuary, North Carolina

    USGS Publications Warehouse

    Bales, Jerad; Robbins, Jeanne C.

    1995-01-01

    An investigation was conducted to characterize flow, circulation, and solute transport in the Pamlico River estuary, North Carolina. The study included a detailed field-measurement program and the calibration, validation, and application of a physically realistic numerical model of hydrodynamics and transport. Water level, salinity, water temperature, wind speed and direction, and current data were collected during March 1988 through September 1992, and were used to characterize physical conditions in the estuary. Data from preexisting streamflow gaging stations and meteorological stations were also used. A two-dimensional vertically averaged hydrodynamic and solute transport model was applied to the 48-kilometer study reach. The model domain was discretized into 5,620 separate 200- by 200-meter computational cells. Model calibration was achieved through adjustment of parameters for June 14-30, 1991. Data from selected periods in 1989 and 1991 were used for model validation. Water levels used for model calibration and validation ranged from -0.052 to 0.698 meter; salinities ranged from 0.1 to 13.1 parts per thousand; and wind speeds ranged from calm to 22 meters per second. The model was tested for stratified and unstratified conditions. Simulated and observed data were used to evaluate model performance. The calibrated model was applied for selected periods in 1989 and 1991. Instantaneous flows were simulated at each boundary and at mid-estuary. Circulation patterns were characterized using vector plots, particle tracking, and solute transport. Particle tracks showed that materials released at mid-estuary may remain in the system for 25 days or longer.

  10. Vicarious Calibration of EO-1 Hyperion

    NASA Technical Reports Server (NTRS)

    McCorkel, Joel; Thome, Kurt; Lawrence, Ong

    2012-01-01

    The Hyperion imaging spectrometer on the Earth Observing-1 satellite is the first high-spatial-resolution imaging spectrometer to routinely acquire science-grade data from orbit. Data gathered with this instrument need to be quantitative and accurate in order to derive meaningful information about ecosystem properties and processes. Also, comprehensive and long-term ecological studies require these data to be comparable over time, between coexisting sensors and between generations of follow-on sensors. One method to assess the radiometric calibration is the reflectance-based approach, a common technique used for several other earth science sensors covering similar spectral regions. This work presents results of radiometric calibration of Hyperion based on the reflectance-based approach of vicarious calibration implemented by the University of Arizona during 2001-2005. These results show repeatability at the 2% level and accuracy at the 3-5% level for spectral regions not affected by strong atmospheric absorption. Knowledge of the stability of the Hyperion calibration from moon observations allows an average absolute calibration based on the reflectance-based results to be determined that is applicable for the lifetime of Hyperion.

  11. Assessing learning in small sized physics courses

    NASA Astrophysics Data System (ADS)

    Ene, Emanuela; Ackerson, Bruce J.

    2018-01-01

    We describe the construction, validation, and testing of a concept inventory for an Introduction to Physics of Semiconductors course offered by the department of physics to undergraduate engineering students. By design, this inventory addresses both content knowledge and the ability to interpret content via different cognitive processes outlined in Bloom's revised taxonomy. The primary challenge comes from the low number of test takers. We describe the Rasch modeling analysis for this concept inventory, and the results of the calibration on a small sample size, with the intention of providing a useful blueprint to other instructors. Our study involved 101 students from Oklahoma State University and fourteen faculty teaching or doing research in the field of semiconductors at seven universities. The items were written in four-option multiple-choice format. It was possible to calibrate a 30-item unidimensional scale precisely enough to characterize the student population enrolled each semester and, therefore, to allow the tailoring of the learning activities of each class. We show that this scale can be employed as an item bank from which instructors could extract short testlets and where we can add new items fitting the existing calibration.
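
    The Rasch (one-parameter logistic) model underlying this calibration places student abilities and item difficulties on a common logit scale; the probability of a correct answer depends only on their difference. A minimal sketch with illustrative values, not the inventory's calibrated difficulties:

```python
import math

def rasch_p(theta, b):
    """Rasch (1PL) model: probability that a student of ability theta
    answers an item of difficulty b correctly (same logit scale)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# Illustrative ability and difficulties -- hypothetical values.
p_easy = rasch_p(0.5, -1.0)   # item well below the student's ability
p_hard = rasch_p(0.5, 2.0)    # item well above it
```

    Because calibrated difficulties are sample-independent under the model, items fitting the existing scale can be banked and assembled into short testlets, as the abstract describes.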

  12. Modeling of soil carbon turnover under different crop management: Calibration of RothC-model for Pannonian climate conditions

    NASA Astrophysics Data System (ADS)

    Rampazzo Todorovic, G.; Stemmer, M.; Tatzber, M.; Katzlberger, C.; Spiegel, H.; Zehetner, F.; Gerzabek, M. H.

    2009-04-01

    Despite our knowledge about soil C dynamics, very few long-term data concerning soil organic C dynamics are available for calibrating and evaluating C models. The long-term 14C turnover field experiment, established by H.-E. Oberländer in 1967 in Fuchsenbigl, Lower Austria, offers the unique opportunity to investigate the mineralization and stabilization of 14C-labeled wheat straw and farmyard manure under different cropping systems (crop rotation CR, spring wheat SW and bare fallow BF). In this work the Roth-C-26.3 model was calibrated for the Pannonian climatic region based on the field experiment results. Decomposition rate constants were modified regarding the possible climatic influence on carbon sequestration in soil C pools. The modeled output based on the calibrated model fitted the measured values better than data obtained with the original Roth-C-26.3 model parameters. The main change was in the decomposition rate constant for the HUM (humified) soil C pool, which is now fitted for different plots from 0.005 to 0.01 y-1 instead of 0.02 y-1 as determined in the original Rothamsted field trial. Moreover, for one plot, in addition to the HUM pool, the decomposition rate constant for the RPM (resistant plant material) pool was fitted at 0.7 y-1 instead of 0.3 y-1 as originally in the Roth-C-26.3 model. These changes yielded a higher HUM pool in the calibrated model because of the longer turnover period (100-200 versus 50 years). Compared with the CR and SW treatments, the decline of TOC was largest in the BF treatment, as expected, because no significant carbon input has occurred since 1967. Nonetheless, the decline was still not as fast as calculated with the original RothC-26.3 model decomposition rate constants.
The specific research question was the long-term effect of residue removal on SOM levels under different crop management, soil conditions and climatic regimes at Fuchsenbigl (Austria), Rothamsted (UK) and Ultuna (Sweden). Modeling results showed that removing crop residues can entail a long-term decline of SOM; however, these impacts are strongly dependent on the crop types, the soil properties, and the climatic conditions at a given location. A comparison of modeling results for winter wheat and spring barley for Rothamsted/UK, Fuchsenbigl/Austria and Ultuna/Sweden indicates slight SOC decreases at the Fuchsenbigl site when 100% of the straw was removed and increasing trends when 50% was removed. However, at the Rothamsted and Ultuna sites, 50% straw removal still resulted in declining SOC stocks.
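
    The effect of the re-fitted decomposition rate constants can be sketched with the first-order pool decay that RothC assumes; the pool size and time horizon below are illustrative, only the two HUM rate constants come from the abstract:

```python
import math

def pool_remaining(c0, k, years):
    """First-order decay of a RothC-style carbon pool: C(t) = C0 * exp(-k t),
    with k the annual decomposition rate constant (y^-1)."""
    return c0 * math.exp(-k * years)

c0 = 10.0  # illustrative HUM pool size, Mg C/ha
hum_original = pool_remaining(c0, 0.02, 100)    # Rothamsted-fitted constant
hum_pannonian = pool_remaining(c0, 0.005, 100)  # re-fitted for Fuchsenbigl
```

    Reducing k from 0.02 to 0.005 y-1 lengthens the turnover time 1/k from 50 to 200 years, which is why the calibrated model retains a larger HUM pool.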

  13. 49. Photocopy of engineering drawing (original drawing located in LBNL ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    49. Photocopy of engineering drawing (original drawing located in LBNL Building 90F Architecture and Engineering As-Built Collection). June 6, 1949. B51A0354. BEVATRON PLOT PLAN (MASTEN AND HURD) - University of California Radiation Laboratory, Bevatron, 1 Cyclotron Road, Berkeley, Alameda County, CA

  14. Radiometric characterization of hyperspectral imagers using multispectral sensors

    NASA Astrophysics Data System (ADS)

    McCorkel, Joel; Thome, Kurt; Leisso, Nathan; Anderson, Nikolaus; Czapla-Myers, Jeff

    2009-08-01

    The Remote Sensing Group (RSG) at the University of Arizona has a long history of using ground-based test sites for the calibration of airborne and satellite-based sensors. Ground-truth measurements at these test sites are not always successful due to weather and funding availability. Therefore, RSG has also employed automated ground instrument approaches and cross-calibration methods to verify the radiometric calibration of a sensor. The goal in the cross-calibration method is to transfer the calibration of a well-known sensor to that of a different sensor. This work studies the feasibility of determining the radiometric calibration of a hyperspectral imager using multispectral imagery. The work relies on the Moderate Resolution Imaging Spectroradiometer (MODIS) as a reference for the hyperspectral sensor Hyperion. Test sites used for comparisons are Railroad Valley in Nevada and a portion of the Libyan Desert in North Africa. Hyperion bands are compared to MODIS by band averaging Hyperion's high spectral resolution data with the relative spectral response of MODIS. The results compare cross-calibration scenarios that differ in image acquisition coincidence, test site used for the calibration, and reference sensor. Cross-calibration results are presented that show agreement between the use of coincident and non-coincident image pairs within 2% in most bands as well as similar agreement between results that employ the different MODIS sensors as a reference.
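
    The band-averaging step, in which Hyperion's fine spectral samples are weighted by a MODIS band's relative spectral response, can be sketched as follows. The spectra below are illustrative stand-ins, not actual Hyperion or MODIS data:

```python
import numpy as np

def band_average(wl, radiance, rsr):
    """Simulate a broad multispectral band from hyperspectral radiance by
    weighting with the broad band's relative spectral response (RSR).
    Assumes a uniform wavelength grid, so sums stand in for integrals."""
    return np.sum(radiance * rsr) / np.sum(rsr)

# Illustrative spectra -- hypothetical red-band-like window.
wl = np.linspace(620.0, 670.0, 51)                 # nm
radiance = 100.0 + 0.2 * (wl - 620.0)              # slowly varying radiance
rsr = np.exp(-0.5 * ((wl - 645.0) / 10.0) ** 2)    # Gaussian band response

L_band = band_average(wl, radiance, rsr)           # effective in-band radiance
```

    The band-averaged Hyperion value can then be compared directly against the corresponding MODIS band measurement over the same test site.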

  15. Radiometric Characterization of Hyperspectral Imagers using Multispectral Sensors

    NASA Technical Reports Server (NTRS)

    McCorkel, Joel; Kurt, Thome; Leisso, Nathan; Anderson, Nikolaus; Czapla-Myers, Jeff

    2009-01-01

    The Remote Sensing Group (RSG) at the University of Arizona has a long history of using ground-based test sites for the calibration of airborne and satellite-based sensors. Ground-truth measurements at these test sites are not always successful due to weather and funding availability. Therefore, RSG has also employed automated ground instrument approaches and cross-calibration methods to verify the radiometric calibration of a sensor. The goal in the cross-calibration method is to transfer the calibration of a well-known sensor to that of a different sensor. This work studies the feasibility of determining the radiometric calibration of a hyperspectral imager using multispectral imagery. The work relies on the Moderate Resolution Imaging Spectroradiometer (MODIS) as a reference for the hyperspectral sensor Hyperion. Test sites used for comparisons are Railroad Valley in Nevada and a portion of the Libyan Desert in North Africa. Hyperion bands are compared to MODIS by band averaging Hyperion's high spectral resolution data with the relative spectral response of MODIS. The results compare cross-calibration scenarios that differ in image acquisition coincidence, test site used for the calibration, and reference sensor. Cross-calibration results are presented that show agreement between the use of coincident and non-coincident image pairs within 2% in most bands as well as similar agreement between results that employ the different MODIS sensors as a reference.

  16. Final Report for "Non-Accelerator Physics – Research in High Energy Physics: Dark Energy Research on DES"

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ritz, Steve; Jeltema, Tesla

    One of the greatest mysteries in modern cosmology is the fact that the expansion of the universe is observed to be accelerating. This acceleration may stem from dark energy, an additional energy component of the universe, or may indicate that the theory of general relativity is incomplete on cosmological scales. The growth rate of large-scale structure in the universe, and particularly of the largest collapsed structures, clusters of galaxies, is highly sensitive to the underlying cosmology. Clusters will provide one of the single most precise methods of constraining dark energy with the ongoing Dark Energy Survey (DES). The accuracy of the cosmological constraints derived from DES clusters necessarily depends on having an optimized and well-calibrated algorithm for selecting clusters as well as an optical richness estimator whose mean relation and scatter compared to cluster mass are precisely known. Calibrating the galaxy cluster richness-mass relation and its scatter was the focus of the funded work. Specifically, we employ X-ray observations and optical spectroscopy with the Keck telescopes of optically-selected clusters to calibrate the relationship between optical richness (the number of galaxies in a cluster) and underlying mass. This work also probes aspects of cluster selection, like the accuracy of cluster centering, which are critical to weak lensing cluster studies.

  17. Spectral interpolation - Zero fill or convolution. [image processing

    NASA Technical Reports Server (NTRS)

    Forman, M. L.

    1977-01-01

    Zero fill, or augmentation by zeros, is a method used in conjunction with fast Fourier transforms to obtain spectral spacing at intervals closer than obtainable from the original input data set. In the present paper, an interpolation technique (interpolation by repetitive convolution) is proposed which yields values accurate enough for plotting purposes and which lie within the limits of calibration accuracies. The technique is shown to operate faster than zero fill, since fewer operations are required. The major advantages of interpolation by repetitive convolution are that efficient use of memory is possible (thus avoiding the difficulties encountered in decimation-in-time FFTs) and that it is easy to implement.
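
    The zero-fill baseline the paper compares against can be sketched directly: appending zeros to the input before the FFT samples the same underlying spectrum on a finer frequency grid, so spectral features are located more precisely (illustrative signal, not the paper's data):

```python
import numpy as np

# A tone whose frequency falls between the coarse FFT bins.
n = 64
t = np.arange(n)
f_true = 0.1671                        # cycles per sample
signal = np.cos(2 * np.pi * f_true * t)

spec_coarse = np.abs(np.fft.rfft(signal))           # original n-point grid
spec_fine = np.abs(np.fft.rfft(signal, n=8 * n))    # 8x zero fill

# Peak location on each grid, in cycles per sample:
f_coarse = np.argmax(spec_coarse) / n
f_fine = np.argmax(spec_fine) / (8 * n)
```

    Zero fill costs a full-length FFT of the padded array; the paper's point is that repetitive convolution reaches comparable plotting accuracy with fewer operations and less memory.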

  18. The Microwave Temperature Profiler (MTP)

    NASA Technical Reports Server (NTRS)

    Lim, Boon; Mahoney, Michael; Haggerty, Julie; Denning, Richard

    2013-01-01

    The JPL-developed Microwave Temperature Profiler (MTP) has recently participated in GloPac, HIPPO (I to V) and TORERO, and the ongoing ATTREX campaigns. The MTP is now capable of supporting the NASA Global Hawk, and a new canister version supports the NCAR G-V. The primary product from the MTP is remote measurement of atmospheric temperature at, above and below the flight path, providing the vertical thermal state of the atmosphere. The NCAR-MTP has demonstrated unprecedented instrument performance and calibration, with a flight-level temperature error of plus or minus 0.2 K. Derived products include curtain plots, isentropes, lapse rate, cold point height and tropopause height.

  19. Spectrophotometric determination of ketoprofen and its application in pharmaceutical analysis.

    PubMed

    Kormosh, Zholt; Hunka, Iryna; Basel, Yaroslav

    2009-01-01

    A new, simple, rapid and sensitive spectrophotometric method has been developed for the determination of ketoprofen in pharmaceutical preparations. The method is based on the reaction of ketoprofen with an analytical reagent, Astra Phloxin FF, at pH 8.0-10.8, followed by extraction of the formed ion associate into toluene with spectrophotometric detection (absorption maximum at 563 nm, epsilon = 7.6 x 10(4) L x mol(-1) x cm(-1)). The calibration plot was linear over 0.8-16.0 microg x mL(-1) of ketoprofen, and the detection limit was 0.037 microg x mL(-1).
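
A linear calibration plot of this kind reduces to a least-squares line and a 3-sigma detection limit. The absorbance data and blank noise below are illustrative values, not the paper's measurements.

```python
import numpy as np

# Least-squares calibration line for a spectrophotometric method, with a
# 3-sigma detection limit. Concentrations (microg/mL), absorbances, and the
# blank standard deviation are invented for illustration.
conc = np.array([0.8, 2.0, 4.0, 8.0, 12.0, 16.0])
absorbance = np.array([0.06, 0.15, 0.30, 0.61, 0.90, 1.21])

slope, intercept = np.polyfit(conc, absorbance, 1)
s_blank = 0.001                      # assumed std. dev. of blank readings
lod = 3 * s_blank / slope            # 3-sigma limit of detection, microg/mL

# Reading an unknown sample's concentration off the line:
unknown_conc = (0.38 - intercept) / slope
```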

  20. A Herschel-SPIRE Survey of the MonR2 Giant Molecular Cloud

    NASA Astrophysics Data System (ADS)

    Pokhrel, Riwaj; Gutermuth, Robert; Ali, Babar; Megeath, Thomas; Pipher, Judith; Myers, Philip; Fischer, William; Henning, Thomas; Wolk, Scott; Allen, Lori; Tobin, John

    2015-08-01

    We present a new survey of the MonR2 giant molecular cloud with SPIRE on the Herschel Space Observatory. We cross-calibrated SPIRE data with Planck-HFI and accounted for its absolute offset and zero-point correction. We fixed the emissivity with the help of flux-error and flux-ratio plots. We performed greybody fits of the SEDs, the best representation of cold dusty molecular clouds. We studied the distribution of column densities above and below a certain critical limit, followed by the mass and temperature distributions for different regions. We used dendrograms to study the hierarchical structures in the GMC.

  1. A Herschel-SPIRE Survey of the MonR2 Giant Molecular Cloud

    NASA Astrophysics Data System (ADS)

    Pokhrel, Riwaj; Gutermuth, Robert A.; Ali, Babar; Megeath, S. Thomas; Pipher, Judith; Myers, Philip C.; Fischer, William J.; Henning, Thomas; Wolk, Scott J.; Allen, Lori; Tobin, John J.

    2014-06-01

    We present a new survey of the MonR2 giant molecular cloud with SPIRE on the Herschel Space Observatory. We cross-calibrated SPIRE data with Planck-HFI and accounted for its absolute offset and zero-point correction. We fixed the emissivity with the help of flux-error and flux-ratio plots. We performed greybody fits of the SEDs, the best representation of cold dusty molecular clouds. We studied the distribution of column densities above and below a certain critical limit, followed by the mass and temperature distributions for different regions. We isolated the filaments and studied their radial column density profiles in this cloud.

  2. Stripping Voltammetry

    NASA Astrophysics Data System (ADS)

    Lovrić, Milivoj

    Electrochemical stripping means the oxidative or reductive removal of atoms, ions, or compounds from an electrode surface (or from the electrode body, as in the case of liquid mercury electrodes with dissolved metals) [1-5]. In general, these atoms, ions, or compounds have been preliminarily immobilized on the surface of an inert electrode (or within it) as the result of a preconcentration step, while the products of the electrochemical stripping will dissolve in the electrolytic solution. Often the product of the electrochemical stripping is identical to the analyte before the preconcentration. However, there are exceptions to these rules. Electroanalytical stripping methods comprise two steps: first, the accumulation of a dissolved analyte onto, or in, the working electrode, and, second, the subsequent stripping of the accumulated substance by a voltammetric [3, 5], potentiometric [6, 7], or coulometric [8] technique. In stripping voltammetry, the condition is that there are two independent linear relationships: the first between the activity of the accumulated substance and the concentration of analyte in the sample, and the second between the maximum stripping current and the activity of the accumulated substance. Hence, a cumulative linear relationship between the maximum response and the analyte concentration exists. However, the electrode capacity for analyte accumulation is limited, and the condition of linearity is satisfied only well below electrode saturation. For this reason, stripping voltammetry is used mainly in trace analysis. The limit of detection depends on the factor of proportionality between the activity of the accumulated substance and the bulk concentration of the analyte. This factor is a constant in the case of a chemical accumulation, but for electrochemical accumulation it depends on the electrode potential. The factor of proportionality between the maximum stripping current and the analyte concentration is rarely known exactly. 
In fact, it is frequently ignored. For the analysis it suffices to establish the linear relationship empirically. The slope of this relationship may vary from one sample to another because of different influences of the matrix. In this case the concentration of the analyte is determined by the method of standard additions [1]. After measuring the response of the sample, the concentration of the analyte is deliberately increased by adding a certain volume of its standard solution. The response is measured again, and this procedure is repeated three or four times. The unknown concentration is determined by extrapolation of the regression line to the concentration axis [9]. However, in many analytical methods, the final measurement is performed in a standard matrix that allows the construction of a calibration plot. Still, the slope of this plot depends on the active area of the working electrode surface. Each solid electrode needs a separate calibration plot, and that plot must be checked from time to time because of possible deterioration of the electrode surface [2].
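
The standard-additions procedure described above reduces to a line fit and an extrapolation. The true concentration and sensitivity below are synthetic, chosen so the arithmetic is transparent.

```python
import numpy as np

# Method of standard additions: measure the sample, spike it with known
# amounts of analyte, fit a line, and extrapolate to the concentration axis.
# Assumed true concentration = 2.0 units, sensitivity = 5.0 response/unit.
added = np.array([0.0, 1.0, 2.0, 3.0, 4.0])   # added standard, conc. units
response = 5.0 * (2.0 + added)                 # idealized instrument response

slope, intercept = np.polyfit(added, response, 1)
unknown_conc = intercept / slope               # |x-axis intercept| of the line
```

Extrapolating the regression line to zero response recovers the unknown concentration directly, sample matrix and all, which is why the method tolerates matrix-dependent slopes.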

  3. The calibration methods for Multi-Filter Rotating Shadowband Radiometer: a review

    NASA Astrophysics Data System (ADS)

    Chen, Maosi; Davis, John; Tang, Hongzhao; Ownby, Carolyn; Gao, Wei

    2013-09-01

    The continuous data record of over two decades from the Multi-Filter Rotating Shadowband Radiometer (MFRSR) is ideal for climate research, which requires timely and accurate information on important atmospheric components such as gases, aerosols, and clouds. Except for parameters derived from MFRSR measurement ratios, which are not affected by calibration error, most applications require accurate calibration factor(s), angular correction, and spectral response function(s) from calibration. Although a laboratory lamp (or reference) calibration can provide all the information needed to convert the instrument readings to actual radiation, in situ calibration methods are implemented routinely (daily) to fill the gaps between lamp calibrations. In this paper, the basic structure and the data collection and pretreatment of the MFRSR are described. The laboratory lamp calibration and its limitations are summarized. The cloud-screening algorithms for MFRSR data are presented. The in situ calibration methods (the standard Langley method and its variants, the ratio-Langley method, the general method, Alexandrov's comprehensive method, and Chen's multi-channel method) are outlined. None of these methods suits all situations, because each assumes that certain properties, such as aerosol optical depth (AOD), total optical depth (TOD), precipitable water vapor (PWV), effective size of aerosol particles, or Angstrom coefficient, are invariant over time. These assumptions are not universally valid, and some of them rarely hold. In practice, daily calibration factors derived from these methods should be smoothed to restrain error.
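
The standard Langley method listed above can be illustrated in a few lines: under a stable atmosphere the log of the direct signal is linear in airmass, and extrapolating to zero airmass recovers the top-of-atmosphere calibration constant. The values below are synthetic.

```python
import numpy as np

# Standard Langley calibration: ln(V) = ln(V0) - tau * m, so a regression of
# log-signal on airmass m over a clear, stable morning extrapolates to the
# top-of-atmosphere constant V0. Assumed V0 = 1.50, optical depth tau = 0.20.
m = np.linspace(1.5, 6.0, 20)                 # airmass during the morning
V = 1.50 * np.exp(-0.20 * m)                  # idealized clear-sky signal

slope, intercept = np.polyfit(m, np.log(V), 1)
V0 = np.exp(intercept)                        # extrapolated calibration factor
tau = -slope                                  # retrieved total optical depth
```

The method's weakness, as the review notes, is exactly the assumption baked into this sketch: tau must stay constant while the airmass range is swept.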

  4. Introducing the fit-criteria assessment plot - A visualisation tool to assist class enumeration in group-based trajectory modelling.

    PubMed

    Klijn, Sven L; Weijenberg, Matty P; Lemmens, Paul; van den Brandt, Piet A; Lima Passos, Valéria

    2017-10-01

    Background and objective: Group-based trajectory modelling is a model-based clustering technique applied for the identification of latent patterns of temporal changes. Despite its manifold applications in clinical and health sciences, potential problems of the model selection procedure are often overlooked. The choice of the number of latent trajectories (class-enumeration), for instance, is to a large degree based on statistical criteria that are not fail-safe. Moreover, the process as a whole is not transparent. To facilitate class enumeration, we introduce a graphical summary display of several fit and model adequacy criteria, the fit-criteria assessment plot. Methods: An R-code that accepts universal data input is presented. The programme condenses relevant group-based trajectory modelling output information of model fit indices in automated graphical displays. Examples based on real and simulated data are provided to illustrate, assess and validate fit-criteria assessment plot's utility. Results: Fit-criteria assessment plot provides an overview of fit criteria on a single page, placing users in an informed position to make a decision. Fit-criteria assessment plot does not automatically select the most appropriate model but eases the model assessment procedure. Conclusions: Fit-criteria assessment plot is an exploratory, visualisation tool that can be employed to assist decisions in the initial and decisive phase of group-based trajectory modelling analysis. Considering group-based trajectory modelling's widespread resonance in medical and epidemiological sciences, a more comprehensive, easily interpretable and transparent display of the iterative process of class enumeration may foster group-based trajectory modelling's adequate use.
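
A minimal sketch of the bookkeeping such a display condenses, assuming illustrative log-likelihoods rather than real group-based trajectory modelling output: compute AIC and BIC per candidate class count and compare them side by side.

```python
import numpy as np

# Information criteria per candidate number of latent classes. The
# log-likelihoods and parameter counts below are invented to show why a
# side-by-side view helps: the criteria need not agree on the best model.
loglik = {1: -5200.0, 2: -5050.0, 3: -4990.0, 4: -4985.0}
n_params = {1: 4, 2: 8, 3: 12, 4: 16}
n_obs = 500

rows = []
for k, ll in loglik.items():
    aic = -2 * ll + 2 * n_params[k]
    bic = -2 * ll + n_params[k] * np.log(n_obs)
    rows.append((k, aic, bic))

best_aic = min(rows, key=lambda r: r[1])[0]   # class count minimizing AIC
best_bic = min(rows, key=lambda r: r[2])[0]   # class count minimizing BIC
```

Here AIC favours four classes while BIC favours three, exactly the kind of disagreement a single-page overview makes visible instead of hiding behind one automated choice.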

  5. Ant Foraging As an Indicator of Tropical Dry Forest Restoration.

    PubMed

    Hernández-Flores, J; Osorio-Beristain, M; Martínez-Garza, C

    2016-08-01

    Variation in foraging behavior may indicate differences in food availability and allow assessment of restoration actions. Ants are prominent bioindicators used in assessing ecological responses to disturbance. However, behavioral data have been poorly incorporated as an index. The foraging performance of red harvester ants was quantified in order to evaluate the success of a restoration ecology experiment in the tropical dry forest of Sierra de Huautla, Morelos, in central Mexico. Foraging performance by granivorous Pogonomyrmex barbatus ants was diminished after 6 and 8 years of exclusion of cattle grazing and wood harvest as part of a restoration experiment in a highly degraded biome. Despite investing more time in foraging, ant colonies in exclusion plots showed lower foraging success and acquired less seed biomass than colonies in control plots. In line with the predictions of optimal foraging theory, in restored plots where ant foraging performance was poor, ants harvested a higher diversity of seeds. Reduced foraging success and increased harvest of non-preferred foods in exclusion plots were likely due to the growth of herbaceous vegetation, which impedes travel by foragers. Moreover, by 8 years of exclusion, 37% of nests in exclusion plots had disappeared compared to 0% of nests in control plots. Ants' foraging success and behavior were sensitive to changes in habitat quality due to the plant successional process triggered by a restoration intervention. This study spotlights the utility of animal foraging behavior in the evaluation of habitat restoration programs. © The Authors 2016. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  6. Calibration of CR-39-based thoron progeny device.

    PubMed

    Fábián, F; Csordás, A; Shahrokhi, A; Somlai, J; Kovács, T

    2014-07-01

    Radon isotopes and their progenies play a proven, significant role in respiratory tumour formation. In most cases, the radiological effect of one of the radon isotopes (thoron) and its progenies has been neglected, together with its measurement techniques; however, recent surveys have shown that thoron is to be expected in dwellings and workplaces in Europe. Detectors based on different track-detector measurement technologies have recently become widespread for measuring thoron progenies; however, their calibration is not yet completely elaborated. This study deals with the calibration of the track-detector measurement method suitable for measuring thoron progenies, using devices capable of measuring several progenies (Pylon AB5 and WLx, Sarad EQF 3220). The calibration factor values for the thoron progeny monitors, the measurement uncertainty, reproducibility and other parameters were determined using the calibration chamber. In the future, the effects of different parameters (aerosol distribution, etc.) will be determined. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  7. Polarized Redundant-Baseline Calibration for 21 cm Cosmology Without Adding Spectral Structure

    NASA Astrophysics Data System (ADS)

    Dillon, Joshua S.; Kohn, Saul A.; Parsons, Aaron R.; Aguirre, James E.; Ali, Zaki S.; Bernardi, Gianni; Kern, Nicholas S.; Li, Wenyang; Liu, Adrian; Nunhokee, Chuneeta D.; Pober, Jonathan C.

    2018-04-01

    21 cm cosmology is a promising new probe of the evolution of visible matter in our universe, especially during the poorly-constrained Cosmic Dawn and Epoch of Reionization. However, in order to separate the 21 cm signal from bright astrophysical foregrounds, we need an exquisite understanding of our telescopes so as to avoid adding spectral structure to spectrally-smooth foregrounds. One powerful calibration method relies on repeated simultaneous measurements of the same interferometric baseline to solve for the sky signal and for instrumental parameters simultaneously. However, certain degrees of freedom are not constrained by asserting internal consistency between redundant measurements. In this paper, we review the origin of these degeneracies of redundant-baseline calibration and demonstrate how they can source unwanted spectral structure in our measurement and show how to eliminate that additional, artificial structure. We also generalize redundant calibration to dual-polarization instruments, derive the degeneracy structure, and explore the unique challenges to calibration and preserving spectral smoothness presented by a polarized measurement.

  8. Polarized redundant-baseline calibration for 21 cm cosmology without adding spectral structure

    NASA Astrophysics Data System (ADS)

    Dillon, Joshua S.; Kohn, Saul A.; Parsons, Aaron R.; Aguirre, James E.; Ali, Zaki S.; Bernardi, Gianni; Kern, Nicholas S.; Li, Wenyang; Liu, Adrian; Nunhokee, Chuneeta D.; Pober, Jonathan C.

    2018-07-01

    21 cm cosmology is a promising new probe of the evolution of visible matter in our universe, especially during the poorly constrained Cosmic Dawn and Epoch of Reionization. However, in order to separate the 21 cm signal from bright astrophysical foregrounds, we need an exquisite understanding of our telescopes so as to avoid adding spectral structure to spectrally smooth foregrounds. One powerful calibration method relies on repeated simultaneous measurements of the same interferometric baseline to solve for the sky signal and for instrumental parameters simultaneously. However, certain degrees of freedom are not constrained by asserting internal consistency between redundant measurements. In this paper, we review the origin of these degeneracies of redundant-baseline calibration and demonstrate how they can source unwanted spectral structure in our measurement and show how to eliminate that additional, artificial structure. We also generalize redundant calibration to dual-polarization instruments, derive the degeneracy structure, and explore the unique challenges to calibration and preserving spectral smoothness presented by a polarized measurement.
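
The core idea of redundant-baseline calibration can be sketched with an amplitude-only toy model; the papers above also treat phases, degeneracies, and polarization, and the array layout, gains, and sky values here are invented. Redundant baselines share one true visibility, which makes the log-amplitude equations linear in the unknowns.

```python
import numpy as np

# Toy amplitude "logcal": for baseline (i, j) of redundant type t,
# ln|v_ij| = ln|g_i| + ln|g_j| + ln|y_t|. A 4-element east-west array with
# unit spacing has baseline types {1, 2, 3} and enough redundancy to solve
# for per-antenna gains up to an overall-amplitude degeneracy, which we fix
# by constraining the mean log-gain to zero.
ants = [0, 1, 2, 3]
pairs = [(i, j) for i in ants for j in ants if i < j]
btype = {p: p[1] - p[0] for p in pairs}            # spacing = baseline type
types = sorted(set(btype.values()))

true_g = np.array([1.1, 0.9, 1.05, 0.95])          # synthetic antenna gains
true_y = {1: 2.0, 2: 1.5, 3: 0.7}                  # synthetic sky visibilities
vis = {p: true_g[p[0]] * true_g[p[1]] * true_y[btype[p]] for p in pairs}

# One linear equation per baseline, plus one constraint row for the degeneracy.
A = np.zeros((len(pairs) + 1, len(ants) + len(types)))
b = np.zeros(len(pairs) + 1)
for r, p in enumerate(pairs):
    A[r, p[0]] = A[r, p[1]] = 1.0
    A[r, len(ants) + types.index(btype[p])] = 1.0
    b[r] = np.log(vis[p])
A[-1, :len(ants)] = 1.0                            # sum of log-gains = 0

x, *_ = np.linalg.lstsq(A, b, rcond=None)
gains = np.exp(x[:len(ants)])                      # gains up to overall scale
```

The constraint row makes the amplitude degeneracy explicit: the data alone fix only gain ratios, which is precisely the class of unconstrained degrees of freedom the paper shows can leak spectral structure if handled carelessly.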

  9. Development of an open source package for the processing of sun-sky photometric data in the European Skyrad Users network (ESR)

    NASA Astrophysics Data System (ADS)

    Estelles, V.; Smyth, T.; Campanelli, M.; Utrillas, M. P.

    2009-04-01

    The European SkyRad users network (ESR) is a joint initiative of the Institute of Atmospheric and Climate Sciences (ISAC) at the National Research Council (CNR) in Italy, the Group of Solar Radiation (GRSV) at the University of Valencia (UV) in Spain, and the Plymouth Marine Laboratory (PML) in the United Kingdom. It was started as a Protocol of Agreement between the three institutions in 2003. The main objective was to collaborate on the improvement of some technical aspects of the Skyrad.pack algorithm. Currently the network is aimed at European research groups that use sun-sky photometers and mainly focus their research on the study of atmospheric aerosols and their application to remote sensing or climatological studies. There exist well-known international networks such as AERONET (Aerosol Robotic Network) or SKYNET (SKYrad NETwork, in Asia), but they have some characteristics that prevent many European research groups from getting involved with them. These limitations mean that a number of European groups are working independently, with no coordination. The resultant databases are not made public, or the employed methodology is not homogeneous. In turn, this means that a great amount of data is being lost for critical regional studies in Europe. One of these limitations is related to the supported instrumentation. International networks usually adopt a given model of sun photometer as a standard. The ESR is a multi-instrumental network using both Prede POM and Cimel CE318 sun-sky photometers. Another limitation is related to the calibration. In the case of AERONET, a centralized and stringent calibration protocol is adopted. This protocol is designed to offer a well-tracked and quality-assured calibration and data elaboration; it is in fact the keystone for the homogeneity of the network results. But centralization raises other problems. 
The instruments must be sent every 6-12 months to the United States or France; therefore, 1) the instrument's absence generates considerable data gaps, 2) transport brings a risk of equipment damage, and 3) the owning group must bear the economic cost of these international insured deliveries. Moreover, the protocol constrains the network's capacity to handle a large number of instruments. In fact, AERONET is at the moment very reluctant to accept new sites in Europe. ESR has developed an improved version of the Langley plot technique (SKYIL) that allows users to perform a continuous in situ calibration. Previous results show that the obtained uncertainties in the calibration factors (1.0-2.5%) are very similar to the uncertainty values for field instruments in AERONET (1.0-2.0%). A third difference that could make ESR more appealing to some European research groups is related to the algorithms themselves. The core inversion code (Skyrad.pack), the calibration codes and all the automation scripts are free, open-source codes that can be further customized by the users. Therefore, an advanced user can easily access and modify the algorithms for new improvements. In conclusion, the ESR users network has been conceived as a flexible network and collaborative platform for European groups whose main research is focused on atmospheric aerosol characterization and model development. The package we have developed for the network is an open-source product that is available for public use, both for Cimel CE318 and Prede POM instruments.

  10. Linear mixed-effects models to describe individual tree crown width for China-fir in Fujian Province, southeast China.

    PubMed

    Hao, Xu; Yujun, Sun; Xinjie, Wang; Jin, Wang; Yao, Fu

    2015-01-01

    A multiple linear model was developed for individual tree crown width of Cunninghamia lanceolata (Lamb.) Hook in Fujian province, southeast China. Data were obtained from 55 sample plots of pure China-fir plantation stands. An Ordinary Linear Least Squares (OLS) regression was used to establish the crown width model. To adjust for correlations between observations from the same sample plots, we developed one-level linear mixed-effects (LME) models based on the multiple linear model, which take into account the random effects of plots. The best random-effects combinations for the LME models were determined by Akaike's information criterion, the Bayesian information criterion and the -2 log-likelihood. Heteroscedasticity was reduced by three residual variance functions: the power function, the exponential function and the constant-plus-power function. The spatial correlation was modeled by three correlation structures: the first-order autoregressive structure [AR(1)], a combination of first-order autoregressive and moving average structures [ARMA(1,1)], and the compound symmetry structure (CS). Then, the LME model was compared to the multiple linear model using the absolute mean residual (AMR), the root mean square error (RMSE), and the adjusted coefficient of determination (adj-R2). For individual tree crown width models, the one-level LME model showed the best performance. An independent dataset was used to test the performance of the models and to demonstrate the advantage of calibrating LME models.
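
The fixed-effects starting point described above can be sketched with synthetic data; the predictors, coefficients, and noise are invented, and the LME extension would add a plot-level random intercept on top of this fit.

```python
import numpy as np

# OLS baseline for a crown-width model: crown width regressed on DBH and tree
# height. All values below are synthetic illustrations, not the paper's
# plot measurements.
rng = np.random.default_rng(0)
dbh = rng.uniform(8, 30, 200)                     # diameter at breast height, cm
height = 1.3 + 0.6 * dbh + rng.normal(0, 1.0, 200)
crown = 0.5 + 0.12 * dbh + 0.05 * height + rng.normal(0, 0.2, 200)

X = np.column_stack([np.ones_like(dbh), dbh, height])  # design matrix
beta, *_ = np.linalg.lstsq(X, crown, rcond=None)       # fitted coefficients
resid = crown - X @ beta
rmse = np.sqrt(np.mean(resid**2))                      # fit criterion from text
```

In the mixed-effects version, observations from the same plot share a random intercept, which is what corrects the within-plot correlation the abstract describes.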

  11. Diffuse sunlight based calibration of the water vapor channel in the upc raman lidar

    NASA Astrophysics Data System (ADS)

    Muñoz-Porcar, Constantino; Comeron, Adolfo; Sicard, Michaël; Barragan, Ruben; Garcia-Vizcaino, David; Rodríguez-Gómez, Alejandro; Rocadenbosch, Francesc

    2018-04-01

    A method for determining the calibration factor of the water vapor channel of a Raman lidar, based on zenith measurements of diffuse sunlight and on assumptions regarding some system parameters and Raman scattering models, has been applied to the lidar system of Universitat Politècnica de Catalunya (UPC; Technical University of Catalonia, Spain). Results will be analyzed in terms of stability and comparison with typical methods relying on simultaneous radiosonde measurements.

  12. X Marks the Plot: Can Cliffs Notes Help Students Find Literary Gold?

    ERIC Educational Resources Information Center

    Lilla, Rick

    1998-01-01

    Examines Villanova University's decision to stop selling Cliffs Notes in its bookstore and attitudes toward Cliffs Notes, highlighting honest work, shortcuts, serious research, critical thinking, and original thinking. Provides the following advice for librarians: avoid being parental; avoid unexamined judgments; and avoid undervaluing Cliffs…

  13. Plotting a New Course for Metasearch

    ERIC Educational Resources Information Center

    Breeding, Marshall

    2005-01-01

    Today's world demands an expansive search environment. The universe of information resources is immense and is growing rapidly. The content needed for research and scholarship is dispersed among publishers, aggregators, repositories, library catalogs, e-print servers, and servers throughout the Web. Users do not want to jump from one interface to…

  14. Estimation of Tree Position and STEM Diameter Using Simultaneous Localization and Mapping with Data from a Backpack-Mounted Laser Scanner

    NASA Astrophysics Data System (ADS)

    Holmgren, J.; Tulldahl, H. M.; Nordlöf, J.; Nyström, M.; Olofsson, K.; Rydell, J.; Willén, E.

    2017-10-01

    A system was developed for automatic estimation of tree positions and stem diameters. The sensor trajectory was first estimated using a positioning system that consists of a low-precision inertial measurement unit supported by image matching with data from a stereo camera. The initial estimate of the sensor trajectory was then calibrated by adjusting the sensor pose using the laser scanner data. Special features suitable for forest environments were used to solve the correspondence and matching problems. Tree stem diameters were estimated for stem sections using laser data from individual scanner rotations and were then used for calibration of the sensor pose. A segmentation algorithm was used to associate stem sections with individual tree stems. The stem diameter estimates of all stem sections associated with the same tree stem were then combined for estimation of stem diameter at breast height (DBH). The system was validated on four 20 m radius circular plots, and manually measured trees were automatically linked to trees detected in the laser data. The DBH could be estimated with an RMSE of 19 mm (6 %) and a bias of 8 mm (3 %). The calibrated sensor trajectory and the combined use of circle fits from individual scanner rotations made it possible to obtain reliable DBH estimates even with a low-precision positioning system.
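
The circle-fit step that turns one scanner rotation's stem returns into a diameter estimate can be illustrated with an algebraic (Kasa) fit; the arc geometry and noise level below are assumptions, not the paper's data.

```python
import numpy as np

# Algebraic (Kasa) circle fit: points on a circle satisfy
# x^2 + y^2 = A*x + B*y + C, which is linear in (A, B, C); the center is
# (A/2, B/2) and the radius sqrt(C + A^2/4 + B^2/4). We simulate a 30 cm
# stem seen from one side with 2 mm range noise.
rng = np.random.default_rng(1)
theta = np.linspace(0.2, 2.6, 40)             # partial arc visible to scanner
r_true, cx, cy = 0.15, 3.0, 1.0               # radius and center, metres
x = cx + r_true * np.cos(theta) + rng.normal(0, 0.002, theta.size)
y = cy + r_true * np.sin(theta) + rng.normal(0, 0.002, theta.size)

M = np.column_stack([x, y, np.ones_like(x)])
A, B, C = np.linalg.lstsq(M, x**2 + y**2, rcond=None)[0]
center = (A / 2, B / 2)                        # estimated stem position
radius = np.sqrt(C + A**2 / 4 + B**2 / 4)
dbh_mm = 2000 * radius                         # diameter in millimetres
```

Averaging such per-rotation fits along a stem is what allows the combined DBH estimate to reach the reported 19 mm RMSE despite noisy individual sections.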

  15. KINEROS2-AGWA: Model Use, Calibration, and Validation

    NASA Technical Reports Server (NTRS)

    Goodrich, D C.; Burns, I. S.; Unkrich, C. L.; Semmens, D. J.; Guertin, D. P.; Hernandez, M.; Yatheendradas, S.; Kennedy, J. R.; Levick, L. R..

    2013-01-01

    KINEROS (KINematic runoff and EROSion) originated in the 1960s as a distributed event-based model that conceptualizes a watershed as a cascade of overland flow model elements that flow into trapezoidal channel model elements. KINEROS was one of the first widely available watershed models that interactively coupled a finite difference approximation of the kinematic overland flow equations to a physically based infiltration model. Development and improvement of KINEROS continued from the 1960s on a variety of projects for a range of purposes, which has resulted in a suite of KINEROS-based modeling tools. This article focuses on KINEROS2 (K2), a spatially distributed, event-based watershed rainfall-runoff and erosion model, and the companion ArcGIS-based Automated Geospatial Watershed Assessment (AGWA) tool. AGWA automates the time-consuming tasks of watershed delineation into distributed model elements and initial parameterization of these elements using commonly available, national GIS data layers. A variety of approaches have been used to calibrate and validate K2 successfully across a relatively broad range of applications (e.g., urbanization, pre- and post-fire, hillslope erosion, erosion from roads, runoff and recharge, and manure transport). The case studies presented in this article (1) compare lumped to stepwise calibration and validation of runoff and sediment at plot, hillslope, and small watershed scales; and (2) demonstrate an uncalibrated application to address relative change in watershed response to wildfire.

  16. KINEROS2/AGWA: Model use, calibration and validation

    USGS Publications Warehouse

    Goodrich, D.C.; Burns, I.S.; Unkrich, C.L.; Semmens, Darius J.; Guertin, D.P.; Hernandez, M.; Yatheendradas, S.; Kennedy, Jeffrey R.; Levick, Lainie R.

    2012-01-01

    KINEROS (KINematic runoff and EROSion) originated in the 1960s as a distributed event-based model that conceptualizes a watershed as a cascade of overland flow model elements that flow into trapezoidal channel model elements. KINEROS was one of the first widely available watershed models that interactively coupled a finite difference approximation of the kinematic overland flow equations to a physically based infiltration model. Development and improvement of KINEROS continued from the 1960s on a variety of projects for a range of purposes, which has resulted in a suite of KINEROS-based modeling tools. This article focuses on KINEROS2 (K2), a spatially distributed, event-based watershed rainfall-runoff and erosion model, and the companion ArcGIS-based Automated Geospatial Watershed Assessment (AGWA) tool. AGWA automates the time-consuming tasks of watershed delineation into distributed model elements and initial parameterization of these elements using commonly available, national GIS data layers. A variety of approaches have been used to calibrate and validate K2 successfully across a relatively broad range of applications (e.g., urbanization, pre- and post-fire, hillslope erosion, erosion from roads, runoff and recharge, and manure transport). The case studies presented in this article (1) compare lumped to stepwise calibration and validation of runoff and sediment at plot, hillslope, and small watershed scales; and (2) demonstrate an uncalibrated application to address relative change in watershed response to wildfire.

  17. Omega Dante soft x-ray power diagnostic component calibration at the National Synchrotron Light Source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campbell, K.M.; Weber, F.A.; Dewald, E.L.

    2004-10-01

    The Dante soft x-ray spectrometer, installed on the Omega laser facility at the Laboratory for Laser Energetics, University of Rochester, is a 12-channel filter-edge defined soft x-ray power diagnostic. It is used to measure the spectrally resolved, absolute flux from direct drive, indirect drive (hohlraums) and other plasma sources. Dante component calibration efforts using two beam lines, U3C (50 eV-1 keV) and X8A (1-6 keV) at the National Synchrotron Light Source have been implemented to improve the accuracy of these measurements. We have calibrated metallic vacuum x-ray diodes, mirrors and filters.

  18. Modeling the effect of soil structure on water flow and isoproturon dynamics in an agricultural field receiving repeated urban waste compost application.

    PubMed

    Filipović, Vilim; Coquet, Yves; Pot, Valérie; Houot, Sabine; Benoit, Pierre

    2014-11-15

    Transport processes in soils are strongly affected by heterogeneity of soil hydraulic properties. Tillage practices and compost amendments can modify soil structure and create heterogeneity at the local scale within agricultural fields. The long-term field experiment QualiAgro (INRA-Veolia partnership 1998-2013) explores the impact of heterogeneity in soil structure created by tillage practices and compost application on transport processes. A modeling study was performed to evaluate how the presence of heterogeneity due to soil tillage and compost application affects water flow and pesticide dynamics in soil during a long-term period. The study was done on a plot receiving a co-compost of green wastes and sewage sludge (SGW) applied once every 2 years since 1998. The plot was cultivated with a biannual rotation of winter wheat-maize (except 1 year of barley) and a four-furrow moldboard plow was used for tillage. In each plot, wick lysimeter outflow and TDR probe data were collected at different depths from 2004, while tensiometer measurements were also conducted during 2007/2008. Isoproturon concentration was measured in lysimeter outflow since 2004. Detailed profile description was used to locate different soil structures in the profile, which was then implemented in the HYDRUS-2D model. Four zones were identified in the plowed layer: compacted clods with no visible macropores (Δ), non-compacted soil with visible macroporosity (Γ), interfurrows created by moldboard plowing containing crop residues and applied compost (IF), and the plow pan (PP) created by plowing repeatedly to the same depth. Isoproturon retention and degradation parameters were estimated from laboratory batch sorption and incubation experiments, respectively, for each structure independently. Water retention parameters were estimated from pressure plate laboratory measurements and hydraulic conductivity parameters were obtained from field tension infiltrometer experiments. 
Soil hydraulic properties were optimized on one calibration year (2007/08) using pressure head, water content, and lysimeter outflow data, and then tested on the whole 2004/2010 period. Lysimeter outflow and water content dynamics in the soil profile were correctly described for the whole period (model efficiency coefficient: 0.99) after some correction of LAI estimates for wheat (2005/06) and barley (2006/07). Using laboratory-measured degradation rates and assuming degradation only in the liquid phase caused a large overestimation of simulated isoproturon losses in lysimeter outflow. The proper order of magnitude of isoproturon losses was obtained after considering that degradation also occurred in the solid (sorbed) phase, at a rate of 75% of that in the liquid phase. Isoproturon concentrations were found to be highly sensitive to degradation rates. Neither the laboratory-measured isoproturon fate parameters nor the independently derived soil hydraulic parameters could describe the actual multiannual field dynamics of water and isoproturon without calibration. However, once calibrated on a limited period of time (9 months), HYDRUS-2D was able to simulate the whole 6-year time series with good accuracy. Copyright © 2014 Elsevier B.V. All rights reserved.
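The degradation treatment described above lends itself to a compact numerical sketch. The snippet below is illustrative only (the rate constant, time span, and concentrations are hypothetical, and HYDRUS-2D solves the full coupled transport problem); it assumes first-order kinetics in each phase, with the sorbed-phase rate set to 75% of the liquid-phase rate as in the calibrated model:

```python
import math

def isoproturon_remaining(c_liquid, c_sorbed, k_liquid, t, sorbed_ratio=0.75):
    """First-order decay in both phases; the sorbed-phase rate is a fixed
    fraction (75% here) of the liquid-phase rate. Units are arbitrary."""
    k_sorbed = sorbed_ratio * k_liquid
    return (c_liquid * math.exp(-k_liquid * t),
            c_sorbed * math.exp(-k_sorbed * t))

# Hypothetical example: equal initial concentrations, k = 0.05 d^-1, 30 days.
liq, sorb = isoproturon_remaining(1.0, 1.0, k_liquid=0.05, t=30.0)
```

Because the sorbed phase degrades more slowly, more pesticide persists there at any t > 0; setting the sorbed-phase rate to zero (degradation only in the liquid phase) is what produced the overestimated outflow losses reported above.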

  19. ALMA Pipeline: Current Status

    NASA Astrophysics Data System (ADS)

    Shinnaga, H.; Humphreys, E.; Indebetouw, R.; Villard, E.; Kern, J.; Davis, L.; Miura, R. E.; Nakazato, T.; Sugimoto, K.; Kosugi, G.; Akiyama, E.; Muders, D.; Wyrowski, F.; Williams, S.; Lightfoot, J.; Kent, B.; Momjian, E.; Hunter, T.; ALMA Pipeline Team

    2015-12-01

The ALMA Pipeline is the automated data reduction tool that runs on ALMA data. The current version of the ALMA Pipeline produces science-quality data products for standard interferometric observing modes, up to and including the calibration process. The ALMA Pipeline comprises (1) heuristics, in the form of Python scripts, that select the best processing parameters, and (2) contexts that are kept for book-keeping of the data processing. The ALMA Pipeline produces a "weblog" that presents detailed plots for users to judge how each step of the calibration process was handled. The ALMA Interferometric Pipeline was conditionally accepted in March 2014 after processing Cycle 0 and Cycle 1 data sets. Since Cycle 2, the ALMA Pipeline has been used for ALMA data reduction and quality assurance for projects whose observing modes it supports. Pipeline tasks are based on CASA version 4.2.2, and the first public pipeline release, called CASA 4.2.2-pipe, has been available since October 2014. ALMA data can be reduced with both CASA tasks and pipeline tasks using CASA version 4.2.2-pipe.

  20. A comparison of directed search target detection versus in-scene target detection in Worldview-2 datasets

    NASA Astrophysics Data System (ADS)

    Grossman, S.

    2015-05-01

Since the events of September 11, 2001, the intelligence focus has moved from large order-of-battle targets to small targets of opportunity. Additionally, the business community has discovered the use of remotely sensed data to anticipate demand and derive data on their competition. This requires the finer spectral and spatial fidelity now available to recognize those targets. This work hypothesizes that directed searches using calibrated data perform at least as well as in-scene, manually intensive target detection searches. It uses calibrated Worldview-2 multispectral images with NEF-generated signatures and standard detection algorithms to compare bespoke directed search capabilities against ENVI™ in-scene search capabilities. Multiple execution runs are performed at increasing thresholds to generate detection rates. These rates are plotted and statistically analyzed. While individual head-to-head comparison results vary, 88% of the directed searches performed at least as well as in-scene searches, with 50% clearly outperforming in-scene methods. The results strongly support the premise that directed searches perform at least as well as comparable in-scene searches.

  1. Playing with LISEM: Experiences from Norway

    NASA Astrophysics Data System (ADS)

    Greipsland, Inga; Krzeminska, Dominika

    2017-04-01

Reducing soil loss from agricultural land is an important environmental challenge that is of relevance for both the European Soil Thematic Strategy (EC 2002) and the Water Framework Directive (EC 2000). Agricultural land in Norway is scarce, covering only around 3% of the total land area (The World Bank, 2015), which puts stress on preserving soil quality for food production. Additionally, reducing sediment loss is a national priority because of the associated transport of pollutants such as phosphorous, which can cause eutrophication in nearby waterbodies. It is necessary to find tools that can estimate the effect of different scenarios on erosion processes in agricultural areas. We would like to present the challenges experienced and the results obtained by using LISEM (Limburg Soil Erosion Model) at the plot, subcatchment, and catchment scales in southeastern Norway. The agricultural catchment has been the subject of long-term monitoring of water quality. Challenges included spatial upscaling of local calibration, calibration on areas with very low soil loss rates, and equifinality. In this poster, we want to facilitate a discussion about the possibilities of and limitations to the model for predicting hydrological and soil erosion processes at different scales.

  2. A compact presentation of DSN array telemetry performance

    NASA Technical Reports Server (NTRS)

    Greenhall, C. A.

    1982-01-01

The telemetry performance of an arrayed receiver system, including radio losses, is often presented as a family of curves of bit error rate vs bit SNR, with the tracking loop SNR at one receiver held constant along each curve. This study shows how to process this information into a more compact, useful format in which the minimal total signal power and optimal carrier suppression, for a given fixed bit error rate, are plotted vs data rate. Examples for baseband-only combining are given. When appropriate dimensionless variables are used for plotting, receiver arrays with different numbers of antennas and different threshold tracking loop bandwidths look much alike, and a universal curve for optimal carrier suppression emerges.
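For orientation, the familiar shape of such a bit-error-rate-vs-bit-SNR curve can be sketched with the textbook relation for ideal uncoded coherent BPSK; this is a standard formula for illustration only, not the arrayed-receiver model (with radio losses) analyzed in the paper:

```python
import math

def bpsk_ber(ebno_db):
    """Ideal (uncoded, coherent) BPSK bit error rate as a function of
    bit SNR Eb/N0 given in dB: BER = 0.5 * erfc(sqrt(Eb/N0))."""
    ebno = 10.0 ** (ebno_db / 10.0)
    return 0.5 * math.erfc(math.sqrt(ebno))

# Sample the curve from 0 to 10 dB; it falls steeply with increasing SNR.
curve = [(db, bpsk_ber(db)) for db in range(0, 11)]
```

Collapsing curve families like this onto dimensionless axes is what lets arrays with different antenna counts and loop bandwidths "look much alike" in the compact format.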

  3. Teleradiology costs in a rural area

    NASA Astrophysics Data System (ADS)

    Chimiak, William J.

    1994-05-01

There have been several excellent papers providing architectures for teleradiology. Effective teleradiology systems can be fielded today. However, cost issues arise which can easily blur the decision to deploy a teleradiology system for a given hospital or regional hospital system. In this paper, a T1 infrastructure is assumed that is comprised of dedicated T1 links as well as fractional T1 links. The effects of teleconferencing are included in the analysis. Plots of the telecommunication costs provide visualization of the cost and performance issues as a function of varying degrees of teleradiology and teleconference utilization. 1993 tariffs in North Carolina are used as a baseline to arrive at some basic teleradiology cost plots and metrics. The graphs are produced by gnuplot, which is freely available on many anonymous ftp sites and runs on Unix workstations as well as personal computers. The plotting commands used for the graphs are available at The Bowman Gray School of Medicine of Wake Forest University anonymous ftp site.

  4. RADON CHAMBER IN THE CENTRAL MINING INSTITUTE-THE CALIBRATION FACILITY FOR RADON AND RADON PROGENY MONITORS.

    PubMed

    Skubacz, K; Chalupnik, S; Urban, P; Wysocka, M

    2017-11-01

The article presents the advantages of the radon chamber, with a volume of 17 m3, belonging to the Silesian Centre for Environmental Radioactivity, and its applicability for the calibration of equipment designed to measure radon concentration and its short-lived decay products. The chamber can be operated under controlled conditions in the temperature range from -20 to 60°C and relative humidity from 20 to 90%. The influence of aerosol concentration and size distribution on the calibration results is also discussed. When calibrating measuring devices in an atmosphere with a large contribution of ultrafine particles, defined as particles with diameter <0.1 μm, their sensitivity may decrease by tens of percent. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  5. Transmittance Measurement of a Heliostat Facility used in the Preflight Radiometric Calibration of Earth-Observing Sensors

    NASA Technical Reports Server (NTRS)

    Czapla-Myers, J.; Thome, K.; Anderson, N.; McCorkel, J.; Leisso, N.; Good, W.; Collins, S.

    2009-01-01

Ball Aerospace and Technologies Corporation in Boulder, Colorado, has developed a heliostat facility that will be used to determine the preflight radiometric calibration of Earth-observing sensors that operate in the solar-reflective regime. While automatically tracking the Sun, the heliostat directs the solar beam inside a thermal vacuum chamber, where the sensor under test resides. The main advantage of using the Sun as the illumination source for preflight radiometric calibration is that it will also be the source of illumination when the sensor is in flight. This minimizes errors in the pre- and post-launch calibration due to spectral mismatches. It also allows the instrument under test to operate at irradiance values similar to those on orbit. The Remote Sensing Group at the University of Arizona measured the transmittance of the heliostat facility using three methods, the first of which is a relative measurement made using a hyperspectral portable spectroradiometer and a well-calibrated reference panel. The second method is also a relative measurement, and uses a 12-channel automated solar radiometer. The final method is an absolute measurement using a hyperspectral spectroradiometer and reference panel combination, where the spectroradiometer is calibrated on site using a solar-radiation-based calibration.

  6. Comparison of Three Contemporary Risk Scores for Mortality Following Elective Abdominal Aortic Aneurysm Repair

    PubMed Central

    Grant, S.W.; Hickey, G.L.; Carlson, E.D.; McCollum, C.N.

    2014-01-01

Objective/background A number of contemporary risk prediction models for mortality following elective abdominal aortic aneurysm (AAA) repair have been developed. Before a model is used either in clinical practice or to risk-adjust surgical outcome data, it is important that its performance is assessed in external validation studies. Methods The British Aneurysm Repair (BAR) score, Medicare, and Vascular Governance North West (VGNW) models were validated using an independent prospectively collected sample of multicentre clinical audit data. Consecutive data on 1,124 patients undergoing elective AAA repair at 17 hospitals in the north-west of England and Wales between April 2011 and March 2013 were analysed. The outcome measure was in-hospital mortality. Model calibration (observed to expected ratio with chi-square test, calibration plots, calibration intercept and slope) and discrimination (area under receiver operating characteristic curve [AUC]) were assessed in the overall cohort and procedural subgroups. Results The mean age of the population was 74.4 years (SD 7.7); 193 (17.2%) patients were women and the majority of patients (759, 67.5%) underwent endovascular aneurysm repair. All three models demonstrated good calibration in the overall cohort and procedural subgroups. Overall discrimination was excellent for the BAR score (AUC 0.83, 95% confidence interval [CI] 0.76–0.89), and acceptable for the Medicare and VGNW models, with AUCs of 0.78 (95% CI 0.70–0.86) and 0.75 (95% CI 0.65–0.84) respectively. Only the BAR score demonstrated good discrimination in procedural subgroups. Conclusion All three models demonstrated good calibration and discrimination for the prediction of in-hospital mortality following elective AAA repair and are potentially useful. The BAR score has a number of advantages, which include being developed on the most contemporaneous data, excellent overall discrimination, and good performance in procedural subgroups. 
Regular model validations and recalibration will be essential. PMID:24837173

  7. Comparison of three contemporary risk scores for mortality following elective abdominal aortic aneurysm repair.

    PubMed

    Grant, S W; Hickey, G L; Carlson, E D; McCollum, C N

    2014-07-01

A number of contemporary risk prediction models for mortality following elective abdominal aortic aneurysm (AAA) repair have been developed. Before a model is used either in clinical practice or to risk-adjust surgical outcome data, it is important that its performance is assessed in external validation studies. The British Aneurysm Repair (BAR) score, Medicare, and Vascular Governance North West (VGNW) models were validated using an independent prospectively collected sample of multicentre clinical audit data. Consecutive data on 1,124 patients undergoing elective AAA repair at 17 hospitals in the north-west of England and Wales between April 2011 and March 2013 were analysed. The outcome measure was in-hospital mortality. Model calibration (observed to expected ratio with chi-square test, calibration plots, calibration intercept and slope) and discrimination (area under receiver operating characteristic curve [AUC]) were assessed in the overall cohort and procedural subgroups. The mean age of the population was 74.4 years (SD 7.7); 193 (17.2%) patients were women and the majority of patients (759, 67.5%) underwent endovascular aneurysm repair. All three models demonstrated good calibration in the overall cohort and procedural subgroups. Overall discrimination was excellent for the BAR score (AUC 0.83, 95% confidence interval [CI] 0.76-0.89), and acceptable for the Medicare and VGNW models, with AUCs of 0.78 (95% CI 0.70-0.86) and 0.75 (95% CI 0.65-0.84) respectively. Only the BAR score demonstrated good discrimination in procedural subgroups. All three models demonstrated good calibration and discrimination for the prediction of in-hospital mortality following elective AAA repair and are potentially useful. The BAR score has a number of advantages, which include being developed on the most contemporaneous data, excellent overall discrimination, and good performance in procedural subgroups. Regular model validations and recalibration will be essential. 
Copyright © 2014 European Society for Vascular Surgery. Published by Elsevier Ltd. All rights reserved.
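The two headline metrics in validations like this, discrimination (AUC) and calibration-in-the-large (observed-to-expected event ratio), can be computed directly from a vector of predicted risks and binary outcomes. A minimal dependency-free sketch with hypothetical data (the studies above additionally used calibration plots, intercept/slope, and chi-square tests):

```python
def auc_and_oe(y_true, y_prob):
    """AUC by pairwise rank comparison (equivalent to the Mann-Whitney U
    statistic), plus the observed-to-expected event ratio."""
    pos = [p for p, y in zip(y_prob, y_true) if y == 1]
    neg = [p for p, y in zip(y_prob, y_true) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    auc = wins / (len(pos) * len(neg))
    oe = sum(y_true) / sum(y_prob)  # observed events / expected events
    return auc, oe
```

An O/E ratio near 1 together with an AUC above roughly 0.8 corresponds to the "good calibration, excellent discrimination" verdict reported for the BAR score.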

  8. NuSTAR on-ground calibration: II. Effective area

    NASA Astrophysics Data System (ADS)

    Brejnholt, Nicolai F.; Christensen, Finn E.; Westergaard, Niels J.; Hailey, Charles J.; Koglin, Jason E.; Craig, William W.

    2012-09-01

The Nuclear Spectroscopic Telescope ARray (NuSTAR) was launched in June 2012 carrying the first focusing hard X-ray (5-80 keV) optics to orbit. The multilayer coating was carried out at the Technical University of Denmark (DTU Space). In this article we introduce the NuSTAR multilayer reference database and its implementation in the NuSTAR optic response model. The database and its implementation are validated using on-ground effective area calibration data and used to estimate in-orbit performance.

  9. Cross-calibration of liquid and solid QCT calibration standards: corrections to the UCSF normative data

    NASA Technical Reports Server (NTRS)

    Faulkner, K. G.; Gluer, C. C.; Grampp, S.; Genant, H. K.

    1993-01-01

    Quantitative computed tomography (QCT) has been shown to be a precise and sensitive method for evaluating spinal bone mineral density (BMD) and skeletal response to aging and therapy. Precise and accurate determination of BMD using QCT requires a calibration standard to compensate for and reduce the effects of beam-hardening artifacts and scanner drift. The first standards were based on dipotassium hydrogen phosphate (K2HPO4) solutions. Recently, several manufacturers have developed stable solid calibration standards based on calcium hydroxyapatite (CHA) in water-equivalent plastic. Due to differences in attenuating properties of the liquid and solid standards, the calibrated BMD values obtained with each system do not agree. In order to compare and interpret the results obtained on both systems, cross-calibration measurements were performed in phantoms and patients using the University of California San Francisco (UCSF) liquid standard and the Image Analysis (IA) solid standard on the UCSF GE 9800 CT scanner. From the phantom measurements, a highly linear relationship was found between the liquid- and solid-calibrated BMD values. No influence on the cross-calibration due to simulated variations in body size or vertebral fat content was seen, though a significant difference in the cross-calibration was observed between scans acquired at 80 and 140 kVp. From the patient measurements, a linear relationship between the liquid (UCSF) and solid (IA) calibrated values was derived for GE 9800 CT scanners at 80 kVp (IA = [1.15 x UCSF] - 7.32).(ABSTRACT TRUNCATED AT 250 WORDS).
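The reported regression makes it straightforward to translate BMD values between the two calibration standards. A minimal sketch of the mapping and its inverse (valid, per the study, for GE 9800 scans at 80 kVp):

```python
def ia_from_ucsf(ucsf_bmd):
    """Solid-standard (IA) BMD from liquid-standard (UCSF) BMD,
    using the reported relation IA = 1.15 * UCSF - 7.32."""
    return 1.15 * ucsf_bmd - 7.32

def ucsf_from_ia(ia_bmd):
    """Inverse mapping, for comparing results across the two standards."""
    return (ia_bmd + 7.32) / 1.15
```

Such a cross-calibration lets normative data collected with the liquid standard be reused with the solid standard, which is the stated purpose of the corrections.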

  10. Ground Albedo Neutron Sensing (GANS) method for measurements of soil moisture in cropped fields

    NASA Astrophysics Data System (ADS)

    Andres Rivera Villarreyes, Carlos; Baroni, Gabriele; Oswald, Sascha E.

    2013-04-01

Measurement of soil moisture at the plot or hill-slope scale is an important link between local vadose zone hydrology and catchment hydrology. However, so far only few methods are on the way to close this gap between point measurements and remote sensing. This study evaluates the applicability of Ground Albedo Neutron Sensing (GANS) for integral quantification of seasonal soil moisture in the root zone at the scale of a field or small watershed, making use of the crucial role of hydrogen as a neutron moderator relative to other landscape materials. GANS measurements were performed at two locations in Germany under different vegetative situations and seasonal conditions. Ground albedo neutrons were measured at (i) a lowland Bornim farmland (Brandenburg) cropped with sunflower in 2011 and winter rye in 2012, and (ii) a mountainous farmland catchment (Schaefertal, Harz Mountains) since mid-2011. At both sites, depth profiles of soil moisture were measured in parallel at several locations by frequency domain reflectometry (FDR) for comparison and calibration. Initially, calibration parameters derived from a previous study with corn cover were tested for the sunflower and winter rye periods at the same farmland. GANS soil moisture based on these parameters showed a large discrepancy compared to classical soil moisture measurements. Therefore, two new calibration approaches and four different ways of integrating the soil moisture profile into an integral value for GANS were evaluated in this study. This included different sets of calibration parameters based on different growing periods of sunflower. The new calibration parameters showed good agreement with the FDR network during the sunflower period (RMSE = 0.023 m3 m-3), but they underestimated soil moisture in the winter rye period. The GANS approach proved to be highly affected by temporal changes in biomass and crop type, which suggests the need for neutron corrections for long-term observations with crop rotation. 
Finally, the Bornim sunflower parameters were transferred to the Schaefertal catchment for further evaluation. This study demonstrates the potential of GANS to close the measurement gap between the point scale and the remote sensing scale; however, its calibration needs to be adapted for vegetation in cropped fields.
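The calibration step in this family of methods typically fits the standard shape function used in cosmic-ray/ground-albedo neutron sensing, which converts a neutron count rate into soil moisture. The sketch below uses the widely cited literature shape parameters, which are an assumption for illustration and not the site-specific values fitted in this study:

```python
def soil_moisture_from_neutrons(n, n0, a0=0.0808, a1=0.372, a2=0.115):
    """Standard calibration shape theta(N) = a0 / (N/N0 - a1) - a2, where
    N is the measured neutron count rate and N0 the count rate over dry
    soil; a0..a2 are the commonly used literature shape parameters."""
    return a0 / (n / n0 - a1) - a2

# Fewer neutrons -> wetter soil: hydrogen in soil water moderates neutrons.
wet = soil_moisture_from_neutrons(700.0, 1000.0)
dry = soil_moisture_from_neutrons(1000.0, 1000.0)
```

Hydrogen in biomass moderates neutrons just like soil water, which is why crop growth and rotation bias the retrieval unless N0 (or an added correction term) is re-estimated, as the study found.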

  11. Validation of the German Diabetes Risk Score among the general adult population: findings from the German Health Interview and Examination Surveys

    PubMed Central

    Paprott, Rebecca; Mühlenbruch, Kristin; Mensink, Gert B M; Thiele, Silke; Schulze, Matthias B; Scheidt-Nave, Christa; Heidemann, Christin

    2016-01-01

    Objective To evaluate the German Diabetes Risk Score (GDRS) among the general adult German population for prediction of incident type 2 diabetes and detection of prevalent undiagnosed diabetes. Methods The longitudinal sample for prediction of incident diagnosed type 2 diabetes included 3625 persons who participated both in the examination survey in 1997–1999 and the examination survey in 2008–2011. Incident diagnosed type 2 diabetes was defined as first-time physician diagnosis or antidiabetic medication during 5 years of follow-up excluding potential incident type 1 and gestational diabetes. The cross-sectional sample for detection of prevalent undiagnosed diabetes included 6048 participants without diagnosed diabetes of the examination survey in 2008–2011. Prevalent undiagnosed diabetes was defined as glycated haemoglobin ≥6.5% (48 mmol/mol). We assessed discrimination as area under the receiver operating characteristic curve (ROC-AUC (95% CI)) and calibration through calibration plots. Results In longitudinal analyses, 82 subjects with incident diagnosed type 2 diabetes were identified after 5 years of follow-up. For prediction of incident diagnosed diabetes, the GDRS yielded an ROC-AUC of 0.87 (0.83 to 0.90). Calibration plots indicated excellent prediction for low diabetes risk and overestimation for intermediate and high diabetes risk. When considering the entire follow-up period of 11.9 years (ROC-AUC: 0.84 (0.82 to 0.86)) and including incident undiagnosed diabetes (ROC-AUC: 0.81 (0.78 to 0.84)), discrimination decreased somewhat. A previously simplified paper version of the GDRS yielded a similar predictive ability (ROC-AUC: 0.86 (0.82 to 0.89)). In cross-sectional analyses, 128 subjects with undiagnosed diabetes were identified. For detection of prevalent undiagnosed diabetes, the ROC-AUC was 0.84 (0.81 to 0.86). Again, the simplified version yielded a similar result (ROC-AUC: 0.83 (0.80 to 0.86)). 
Conclusions The GDRS might be applied for public health monitoring of diabetes risk in the German adult population. Future research needs to evaluate whether the GDRS is useful to improve diabetes risk awareness and prevention among the general population. PMID:27933187
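Calibration plots like those used here are built by binning subjects on predicted risk and comparing each bin's mean predicted probability with its observed event rate. A minimal, dependency-free sketch (equal-size risk groups; the data in the usage below are hypothetical):

```python
def calibration_points(y_true, y_prob, n_bins=10):
    """Sort subjects by predicted risk, split into roughly equal-size bins,
    and return (mean predicted probability, observed event rate) pairs --
    the points of a calibration plot. A well-calibrated model tracks the
    45-degree line."""
    pairs = sorted(zip(y_prob, y_true))
    size = max(1, len(pairs) // n_bins)
    points = []
    for i in range(0, len(pairs), size):
        chunk = pairs[i:i + size]
        mean_pred = sum(p for p, _ in chunk) / len(chunk)
        obs_rate = sum(y for _, y in chunk) / len(chunk)
        points.append((mean_pred, obs_rate))
    return points
```

Points below the diagonal mean the score overestimates risk, which is the pattern the GDRS showed at intermediate and high predicted risk; points above it mean underestimation.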

  12. Satellite Instrument Calibration for Measuring Global Climate Change. Report of a Workshop at the University of Maryland Inn and Conference Center, College Park, MD. , November 12-14, 2002

    NASA Technical Reports Server (NTRS)

    Ohring, G.; Wielicki, B.; Spencer, R.; Emery, B.; Datla, R.

    2004-01-01

Measuring the small changes associated with long-term global climate change from space is a daunting task. To address these challenges and recommend directions for improvements in satellite instrument calibration, some 75 scientists, including researchers who develop and analyze long-term data sets from satellites, experts in the field of satellite instrument calibration, and physicists working on state-of-the-art calibration sources and standards, met on November 12-14, 2002 to discuss the issues. The workshop defined the absolute accuracies and long-term stabilities of global climate data sets that are needed to detect expected trends, translated these data set accuracies and stabilities to required satellite instrument accuracies and stabilities, and evaluated the ability of current observing systems to meet these requirements. The workshop's recommendations include a set of basic axioms or overarching principles that must guide high quality climate observations in general, and a roadmap for improving satellite instrument characterization, calibration, inter-calibration, and associated activities to meet the challenge of measuring global climate change. It is also recommended that a follow-up workshop be conducted to discuss implementation of the roadmap developed at this workshop.

  13. Evaluation of different parameterizations of the spatial heterogeneity of subsurface storage capacity for hourly runoff simulation in boreal mountainous watershed

    NASA Astrophysics Data System (ADS)

    Hailegeorgis, Teklu T.; Alfredsen, Knut; Abdella, Yisak S.; Kolberg, Sjur

    2015-03-01

Identification of proper parameterizations of spatial heterogeneity is required for precipitation-runoff models. However, relevant studies with a specific aim at hourly runoff simulation in boreal mountainous catchments are not common. We conducted calibration and evaluation of hourly runoff simulation in a boreal mountainous watershed based on six different parameterizations of the spatial heterogeneity of subsurface storage capacity for a semi-distributed (subcatchments hereafter called elements) and distributed (1 × 1 km2 grid) setup. We evaluated representation of element-to-element, grid-to-grid, and probabilistic subcatchment/subbasin, subelement and subgrid heterogeneities. The parameterization cases satisfactorily reproduced the streamflow hydrographs, with Nash-Sutcliffe efficiency values for the calibration and validation periods up to 0.84 and 0.86 respectively, and similarly for the log-transformed streamflow up to 0.85 and 0.90. The parameterizations reproduced the flow duration curves, but predictive reliability in terms of quantile-quantile (Q-Q) plots indicated marked over- and under-predictions. The simple and parsimonious parameterizations with no subelement or no subgrid heterogeneities provided equivalent simulation performance compared to the more complex cases. The results indicated that (i) identification of parameterizations requires measurements from denser precipitation stations than what is required for acceptable calibration of the precipitation-streamflow relationships, (ii) there are challenges in the identification of parameterizations based only on calibration to catchment-integrated streamflow observations, and (iii) there is a potential preference for the simple and parsimonious parameterizations for operational forecasting, contingent on their equivalent simulation performance for the available input data. 
In addition, the effects of non-identifiability of parameters (interactions and equifinality) can contribute to the non-identifiability of the parameterizations.
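The Nash-Sutcliffe efficiency used to score these simulations compares the model's squared error against the variance of the observations about their mean. A minimal sketch (applying it to log-transformed flows, as above, weights low-flow periods more heavily):

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - SSE / (variance about the observed
    mean). 1.0 is a perfect fit; 0.0 means the model is no better than
    predicting the observed mean; negative values are worse than that."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    sst = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / sst
```

Because the benchmark is the observed mean, NSE rewards reproducing hydrograph dynamics, not just overall water balance, which is why it is the standard score for hourly runoff simulation.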

  14. Timing considerations for preclinical MRgRT: effects of ion diffusion, SNR and imaging times on FXG gel calibration

    NASA Astrophysics Data System (ADS)

    Welch, M.; Foltz, W. D.; Jaffray, D. A.

    2015-01-01

Sub-millimeter resolution images are required for gel dosimeters to be used in preclinical research, which is challenging for MR-probed ferrous xylenol-orange (FXG) dosimeters due to ion diffusion and inadequate SNR. A preclinical 7 T MR, small animal irradiator and FXG dosimeters were used in all experiments. Ion diffusion was analyzed using high resolution (0.2 mm/pixel) T1 MR images collected every 5 minutes, post-irradiation, for an hour. Using Fick's second law, ion diffusion was approximated for the first hour post-irradiation. SNR, T1 map precision and calibration fit were determined for two MR protocols: (1) 10-minute acquisition, 0.35 mm/pixel and 3 mm slices, (2) 45-minute acquisition, 0.25 mm/pixel and 2 mm slices. SNR and T1 map precision were calculated using a Monte Carlo simulation. Calibration curves were determined by plotting R1 relaxation rates versus depth dose data, and fitting a linear trend line. Ion diffusion was estimated as 0.003 mm2 in the first hour post-irradiation. For protocols (1) and (2) respectively, Monte Carlo simulation predicted T1 precisions of 3% and 5% within individual voxels using experimental SNRs; the corresponding measured T1 precisions were 8% and 12%. The linear trend lines reported slopes of 27 ± 3 Gy*s (R2: 0.80 ± 0.04) and 27 ± 4 Gy*s (R2: 0.90 ± 0.04). Ion diffusion is negligible within the first hour post-irradiation, and an accurate and reproducible calibration can be achieved in a preclinical setting with sub-millimeter resolution.
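The calibration curve in this kind of study is an ordinary least-squares line through paired (dose, R1) points. A dependency-free sketch of the fit (the data passed in the usage below are hypothetical, not the study's measurements):

```python
def linear_fit(x, y):
    """Ordinary least-squares slope and intercept, as used to fit a
    dosimeter calibration trend line through R1-vs-dose data."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical paired readings: x = dose points, y = measured R1 values.
slope, intercept = linear_fit([0.0, 1.0, 2.0], [1.0, 3.0, 5.0])
```

The slope of this line is the dosimeter's sensitivity; the reported agreement of slopes between the two protocols (27 ± 3 vs 27 ± 4 Gy*s) is what makes the calibration reproducible.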

  15. Continuous glucose monitoring in subcutaneous tissue using factory-calibrated sensors: a pilot study.

    PubMed

    Hoss, Udo; Jeddi, Iman; Schulz, Mark; Budiman, Erwin; Bhogal, Claire; McGarraugh, Geoffrey

    2010-08-01

    Commercial continuous subcutaneous glucose monitors require in vivo calibration using capillary blood glucose tests. Feasibility of factory calibration, i.e., sensor batch characterization in vitro with no further need for in vivo calibration, requires a predictable and stable in vivo sensor sensitivity and limited inter- and intra-subject variation of the ratio of interstitial to blood glucose concentration. Twelve volunteers wore two FreeStyle Navigator (Abbott Diabetes Care, Alameda, CA) continuous glucose monitoring systems for 5 days in parallel for two consecutive sensor wears (four sensors per subject, 48 sensors total). Sensors from a prototype sensor lot with a low variability in glucose sensitivity were used for the study. Median sensor sensitivity values based on capillary blood glucose were calculated per sensor and compared for inter- and intra-subject variation. Mean absolute relative difference (MARD) calculation and error grid analysis were performed using a single calibration factor for all sensors to simulate factory calibration and compared to standard fingerstick calibration. Sensor sensitivity variation in vitro was 4.6%, which increased to 8.3% in vivo (P < 0.0001). Analysis of variance revealed no significant inter-subject differences in sensor sensitivity (P = 0.134). Applying a single universal calibration factor retrospectively to all sensors resulted in a MARD of 10.4% and 88.1% of values in Clarke Error Grid Zone A, compared to a MARD of 10.9% and 86% of values in Error Grid Zone A for fingerstick calibration. Factory calibration of sensors for continuous subcutaneous glucose monitoring is feasible with similar accuracy to standard fingerstick calibration. Additional data are required to confirm this result in subjects with diabetes.
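MARD, the headline accuracy metric here, is simple to compute from paired sensor/reference readings. A minimal sketch (the values in the usage below are hypothetical; the study paired sensor glucose with capillary blood glucose):

```python
def mard(sensor, reference):
    """Mean absolute relative difference, in percent, of sensor readings
    against paired reference (e.g. capillary blood) glucose values."""
    rel = [abs(s - r) / r for s, r in zip(sensor, reference)]
    return 100.0 * sum(rel) / len(rel)
```

Under factory calibration, every raw sensor signal would simply be scaled by the same batch-characterized constant before this comparison, instead of by a per-subject fingerstick-derived factor.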

  16. The Origin of DIRT (Detrital Input and Removal Treatments): the Legacy of Dr. Francis D. Hole

    NASA Astrophysics Data System (ADS)

    Townsend, K. L.; Lajtha, K.; Caldwell, B.; Sollins, P.

    2007-12-01

Soil organic matter (SOM) plays a key role in the cycling and retention of nitrogen and carbon within soil. Both above- and belowground detrital inputs determine the nature and quantity of SOM. Studies on detrital impacts on SOM dynamics are underway at several LTER, ILTER and LTER-affiliated sites using a common experimental design, Detrital Input and Removal Treatments (DIRT). The concept for DIRT was originally based on experimental plots established at the University of Wisconsin Arboretum by Dr. Francis D. Hole in 1956 to study the effects of detrital inputs on pedogenesis. These plots are located on two forested sites and two prairie sites within the arboretum. Manipulations of the forested sites include double litter, no litter and removal of the O and A horizons. Manipulations of the prairie sites include harvest, mulch, bare and burn. These original treatments have largely been maintained since 1956. After 40 years of maintenance, there were significant differences in soil carbon between the double and no litter plots. Soil carbon in the double litter plots had increased by nearly 30%, while that in the no litter plots had decreased by over 50%. The original DIRT plots are now 50 years old and have been re-sampled, where possible, for total carbon and nitrogen, labile and recalcitrant carbon fractions, net and gross nitrogen mineralization rates, and SOM bioavailability through CO2 respiration. The soils were fractionated by density to examine the role of carbon in each density fraction. The mean age of carbon in each fraction was determined by radiocarbon dating. This sampling and analysis is of special significance because it provides a glimpse into the future SOM trajectories for the new DIRT sites: Harvard Forest (MA), Bousson (PA), Andrews Experimental Forest (OR) and Sikfokut (Hungary).

  17. Forest disturbance spurs growth of modeling and technology

    NASA Astrophysics Data System (ADS)

    Bohrer, G.; Matheny, A. M.; Mirfenderesgi, G.; Morin, T. H.; Rey Sanchez, A. C.; Gough, C. M.; Vogel, C. S.; Nadelhoffer, K. J.; Curtis, P.

    2016-12-01

    As new opportunities for scientific exploration open, the need for data drives the development of innovative research tools. The Forest Accelerated Succession ExperimenT (FASET) was enacted in 2007 at the University of Michigan Biological Station (UMBS), where continuous flux observations have been made since 2000. FASET is a large-scale ecological experiment testing the immediate and intermediate-term effects of disturbance, and eventually, the role of succession and community composition on forest flux dynamics. Decades-long tree-level observations in the UMBS forest, combined with the long-term flux observations, allowed us to match the bottom-up accumulated response of individual trees with the top-down whole-plot response measured from the flux tower. However, data describing tree-level canopy structure and hydrological response over an entire plot were not readily available. Unintentionally, FASET became both a motivation and a test-bed for new research tools and approaches. We expanded the operation and analysis approach for a portable canopy LiDAR for 3-D measurements of meter-scale canopy structure. We matched canopy LiDAR measurements with root measurements from ground-penetrating radar. To study the hydrological effects of the disturbance, we instrumented a large number of trees with Granier-style sap flux sensors. We further developed an approach to use frequency domain reflectometry sensors for continuous measurements of tree water content. We developed an approach to combine plot census, allometry and sap-flux observations in a bottom-up fashion to compare with plot-level EC transpiration rates. We found that while the transpirational water demand in the disturbance plot increased, overall evapotranspiration decreased. This decrease, however, is not uniform across species. A new individual-plant to ecosystem scale hydrodynamic model (FETCH2) demonstrates how specific traits translate to intra-daily differences in plot-level transpiration dynamics.

  18. Invasive C4 Perennial Grass Alters Net Ecosystem Exchange in Mixed C3/C4 Savanna Grassland

    NASA Astrophysics Data System (ADS)

    Basham, T. S.; Litvak, M.

    2006-12-01

    The invasion of ecosystems by non-native plants that differ from native plants in physiological characteristics and phenology has the potential to alter ecosystem function. In Texas and other regions of the southern central plains of the United States, the introduced C4 perennial grass, Bothriochloa ischaemum, invades C3/C4 mixed grasslands and savannas, resulting in decreased plant community diversity (Gabbard 2003; Harmoney et al. 2004). The objective of this study was to quantify how the conversion of these mixed grass communities to C4-dominated B. ischaemum monocultures impacts carbon cycling and sequestration. Seasonal measurements of Net Ecosystem Exchange (NEE) of CO2, leaf-level gas exchange and soil respiration were compared between savanna grassland plots composed of either naturally occurring B. ischaemum monocultures or native mixed grasses (n=16). NEE was measured using a closed-system chamber attached to permanently installed stainless steel bases. Temperature, soil moisture, aerial percent species cover and leaf area index were also monitored in plots to explain variability in measured responses. Results showed that NEE differed seasonally between invaded and native plots due to 1) greater leaf surface area per unit ground area in invaded plots, 2) differences in phenological patterns of plant activity and 3) differences in responses to water limitation between invaded and native plots. Cold season and summer drought NEE were driven primarily by belowground respiration in both plot types; however, spring uptake activity commenced two months later in invaded plots. This later start in invaded plots was compensated for by greater uptake throughout the growing season, in particular during the drier summer months. Differences in NEE between plot types were not due to differences in soil respiration, nor were they due to greater leaf-level photosynthetic capabilities of B. ischaemum relative to the dominant native grasses.
    NEE, soil respiration and biomass accumulation were limited by temperature and soil moisture in both native and invaded plots; however, invaded areas were less sensitive to both higher temperatures and lower soil moisture. Preliminary modeling results suggest that from January-August 2006, invaded grasslands stored approximately one third more carbon than native grasslands, making them 20% less of a carbon source than native plots during this year of record high temperatures and drought. Gabbard, BL. 2003. The Population Dynamics and Distribution of the Exotic Grass, Bothriochloa ischaemum. PhD Dissertation, University of Texas, Austin, TX. Harmoney et al. 2004. Herbicide Effects on Established Yellow Old World Bluestem (Bothriochloa ischaemum). Weed Technology 18:545-550.

  19. Spatial and Temporal Evaluation of Soil Erosion with RUSLE: A Case Study in an Olive Orchard Microcatchment in Spain

    EPA Science Inventory

    Soil loss is commonly estimated using the Revised Universal Soil Loss Equation (RUSLE). Since RUSLE is an empirically based soil loss model derived from surveys on plots, the high spatial and temporal variability of erosion in Mediterranean environments and scale effects provoke...
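RUSLE, as applied in studies like the one above, estimates average annual soil loss as a product of empirical factors, A = R · K · LS · C · P. A minimal sketch of that factor product (all factor values below are invented for illustration, not results from this study):

```python
def rusle_soil_loss(R, K, LS, C, P):
    """Average annual soil loss A as the RUSLE factor product.

    R: rainfall erosivity, K: soil erodibility, LS: slope length-steepness,
    C: cover-management, P: support practice factor.
    """
    return R * K * LS * C * P

# Hypothetical olive-orchard hillslope: high erosivity, partial ground cover.
A = rusle_soil_loss(R=1200.0, K=0.035, LS=1.8, C=0.45, P=1.0)
print(round(A, 1))  # → 34.0 (t/ha/yr with these illustrative units)
```

Because the factors multiply, halving the cover-management factor C halves the predicted loss, which is why ground-cover management figures so prominently in Mediterranean erosion studies.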

  20. Moviemaking for the Language Acquisition Classroom: Engage Your Students with an Instructional "Soap"

    ERIC Educational Resources Information Center

    Carlson, Gigi; Crowther, Judith

    2004-01-01

    Television melodrama, like grand opera, is constructed to formula. Character interactions are highly charged and plot dominates, initiating excitement, suspense, and raising questions around timeless and universal themes. Despite--or because of--their extreme nature, the soaps remain one of the longest-standing television genres, with the loyal…

  1. The Big Deals in Biofuels

    ERIC Educational Resources Information Center

    Brainard, Jeffrey

    2007-01-01

    Plants that bear less familiar names such as switch grass, "Miscanthus," and kenaf, are not much to look at, having weathered Iowa's winter snows. But Iowa State researchers see these crops as seeds of change in alternative fuels. Rows of experimental crops line the test plots at Iowa State University's research farm. Although corn is…

  2. Theoretical Hammett Plot for the Gas-Phase Ionization of Benzoic Acid versus Phenol: A Computational Chemistry Lab Exercise

    ERIC Educational Resources Information Center

    Ziegler, Blake E.

    2013-01-01

    Computational chemistry undergraduate laboratory courses are now part of the chemistry curriculum at many universities. However, there remains a lack of computational chemistry exercises available to instructors. This exercise is presented for students to develop skills using computational chemistry software while supplementing their knowledge of…

  3. Understanding the Graphical Challenges Faced by Vision-Impaired Students in Australian Universities

    ERIC Educational Resources Information Center

    Butler, Matthew; Holloway, Leona; Marriott, Kim; Goncu, Cagatay

    2017-01-01

    Information graphics such as plots, maps, plans, charts, tables and diagrams form an integral part of the student learning experience in many disciplines. However, for a vision impaired student accessing such graphical materials can be problematic. This research seeks to understand the current state of accessible graphics provision in Australian…

  4. First report of bacterial blight of carrot in Indiana caused by Xanthomonas hortorum pv. carotae

    USDA-ARS?s Scientific Manuscript database

    In summer 2012, bacterial blight symptoms were observed on leaves of carrot plants in 7 out of 70 plots of carrot breeding lines at the Purdue University Meig Horticulture Research Farm, Lafayette, IN. Symptoms included small to large, variably shaped, water soaked to dry, necrotic lesions, with or ...

  5. Growth Results From 20-Year-Old Low Density Pine Plantations

    Treesearch

    A. Gordon Holley; Charles T. Stiff

    2004-01-01

    In 1994, under a cooperative effort between Temple-Inland Forest Products Corporation and Stephen F. Austin State University, 84 permanent research plots were established in two loblolly pine (Pinus taeda) plantations in eastern Texas. The study was designed to evaluate the effects of heavy thinning, pruning, fertilization, and competition control on...

  7. Rarefaction Wave Eliminator Concepts For A Large Blast/Thermal Simulator.

    DTIC Science & Technology

    1985-02-01

    hard copies of the pressure-time records. Final data processing was completed with the computer, printer, and plotter. Plots of pressure-time records...

  8. Quantitative Assessment of Agricultural Runoff and Soil Erosion Using Mathematical Modeling: Applications in the Mediterranean Region

    NASA Astrophysics Data System (ADS)

    Arhonditsis, G.; Giourga, C.; Loumou, A.; Koulouri, M.

    2002-09-01

    Three mathematical models, the runoff curve number equation, the universal soil loss equation, and the mass response functions, were evaluated for predicting nonpoint source nutrient loading from agricultural watersheds of the Mediterranean region. These methodologies were applied to a catchment, the gulf of Gera Basin, that is a typical terrestrial ecosystem of the islands of the Aegean archipelago. The calibration of the model parameters was based on data from experimental plots from which edge-of-field losses of sediment, water runoff, and nutrients were measured. Special emphasis was given to the transport of dissolved and solid-phase nutrients from their sources in the farmers' fields to the outlet of the watershed in order to estimate respective attenuation rates. It was found that nonpoint nutrient loading due to surface losses was high during winter, the contribution being between 50% and 80% of the total annual nutrient losses from the terrestrial ecosystem. The good fit between simulated and experimental data supports the view that these modeling procedures should be considered as reliable and effective methodological tools in Mediterranean areas for evaluating potential control measures, such as management practices for soil and water conservation and changes in land uses, aimed at diminishing soil loss and nutrient delivery to surface waters. Furthermore, the modifications of the general mathematical formulations and the experimental values of the model parameters provided by the study can be used in further application of these methodologies in watersheds with similar characteristics.
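Of the three models evaluated above, the runoff curve number equation has a compact closed form. A sketch of the standard SCS formulation in SI units (the curve number and storm depth below are hypothetical, not values calibrated for this catchment):

```python
def scs_runoff(P_mm, CN):
    """Direct runoff Q (mm) from storm rainfall P (mm) via the SCS
    curve number equation: Q = (P - Ia)**2 / (P - Ia + S) for P > Ia.

    S is the potential maximum retention (mm); Ia = 0.2*S is the
    standard initial-abstraction assumption.
    """
    S = 25400.0 / CN - 254.0
    Ia = 0.2 * S
    if P_mm <= Ia:
        return 0.0
    return (P_mm - Ia) ** 2 / (P_mm - Ia + S)

# Hypothetical 60 mm winter storm on agricultural land with CN = 80.
print(round(scs_runoff(60.0, 80), 1))  # → 20.2 mm of direct runoff
```

Rainfall below the initial abstraction produces no runoff at all, which is consistent with the strong winter-dominance of surface losses reported above: only the larger wet-season storms exceed Ia.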

  9. The near-infrared Tully-Fisher relation - A preliminary study of the Coma and Abell 400 clusters

    NASA Technical Reports Server (NTRS)

    Guhathakurta, Puragra; Bernstein, Gary; Raychaudhury, Somak; Haynes, Martha; Giovanelli, Riccardo; Herter, Terry; Vogt, Nicole

    1993-01-01

    We have started a large project to study the NIR Tully-Fisher (TF) relation using H- and I-band surface photometry of spiral galaxies. A preliminary study of 20 spirals in the Coma and Abell 400 clusters is presented. The NIR images have been used to derive accurate inclinations and total magnitudes, and rotational linewidths are measured from high-quality 21-cm Arecibo data. The scatter in the Coma TF plot is found to be 0.19 mag in the H band and 0.20 mag in the I band for a set of 13 galaxies, if we assume that they are all at the same distance. The deviation of the Coma galaxies from the best-fit Tully-Fisher relation is correlated with their redshift, indicating that some of the galaxies are not bound to the cluster. Indeed, if we treat all the galaxies in the Coma sample as undergoing free Hubble expansion, the TF scatter drops to 0.12 and 0.13 mag for the H- and I-band datasets, respectively. The Abell 400 sample is best fit by a common distance model, yielding a scatter of 0.12 mag for seven galaxies in H using a fixed TF slope. We are in the process of studying cluster and field spirals out to about 10,000 km/s in order to calibrate the NIR TF relation and will apply it to more nearby galaxies to measure the peculiar velocity field in the local universe.
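The quoted scatters are rms residuals about a Tully-Fisher line in the magnitude-linewidth plane; with a fixed slope (as used for Abell 400), only the zero point is fitted. A minimal sketch (the sample magnitudes, linewidths and slope below are invented, not the paper's data):

```python
import math

def tf_scatter(mags, logW, slope):
    """RMS scatter (mag) about a Tully-Fisher line M = slope*logW + zp
    with the slope held fixed; the best-fit zero point zp is then
    simply the mean residual."""
    zp = sum(m - slope * lw for m, lw in zip(mags, logW)) / len(mags)
    resid = [m - (slope * lw + zp) for m, lw in zip(mags, logW)]
    return math.sqrt(sum(r * r for r in resid) / len(resid))

# Hypothetical cluster sample: absolute magnitudes and log 21-cm linewidths.
mags = [-21.1, -22.0, -22.9, -21.6]
logW = [2.45, 2.55, 2.65, 2.50]
print(round(tf_scatter(mags, logW, slope=-9.0), 3))  # → 0.022
```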

  10. Data fusion for a vision-aided radiological detection system: Calibration algorithm performance

    NASA Astrophysics Data System (ADS)

    Stadnikia, Kelsey; Henderson, Kristofer; Martin, Allan; Riley, Phillip; Koppal, Sanjeev; Enqvist, Andreas

    2018-05-01

    In order to improve the ability to detect, locate, track and identify nuclear/radiological threats, the University of Florida nuclear detection community has teamed up with the 3D vision community to collaborate on a low-cost data fusion system. The key is to develop an algorithm to fuse the data from multiple radiological and 3D vision sensors as one system. The system under development at the University of Florida is being assessed with various types of radiological detectors and widely available visual sensors. A series of experiments were devised utilizing two EJ-309 liquid organic scintillation detectors (one primary and one secondary), a Microsoft Kinect for Windows v2 sensor and a Velodyne HDL-32E High Definition LiDAR Sensor, a highly sensitive vision sensor primarily used to generate data for self-driving cars. Each experiment consisted of 27 static measurements of a source arranged in a cube with three different distances in each dimension. The source used was Cf-252. The calibration algorithm developed is utilized to calibrate the relative 3D location of the two different types of sensors without the need to measure it by hand, thus preventing operator manipulation and human error. The algorithm can also account for the facility-dependent deviation from ideal data fusion correlation. Using the vision sensor alone to determine a detector's location would limit the possible placements and would not account for room dependence (facility-dependent deviation) when generating a detector pseudo-location for later data analysis. Using manually measured source location data, our algorithm predicted the offset detector location with an average calibration-difference of 20 cm from its actual location. Calibration-difference is the Euclidean distance from the algorithm-predicted detector location to the measured detector location.
    The Kinect vision sensor data produced an average calibration-difference of 35 cm and the HDL-32E produced an average calibration-difference of 22 cm. Using NaI and He-3 detectors in place of the EJ-309, the calibration-difference was 52 cm for NaI and 75 cm for He-3. The algorithm is not detector-dependent; however, from these results it was determined that detector-dependent adjustments are required.
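The calibration-difference metric defined above is the Euclidean distance between the algorithm-predicted and hand-measured 3D locations. A minimal sketch (the coordinates below are invented, in metres):

```python
import math

def calibration_difference(predicted, measured):
    """Euclidean distance between predicted and measured 3D detector locations."""
    return math.dist(predicted, measured)

def mean_calibration_difference(pairs):
    """Average calibration-difference over (predicted, measured) location pairs."""
    return sum(calibration_difference(p, m) for p, m in pairs) / len(pairs)

# Hypothetical predicted vs. measured detector position (metres).
print(round(calibration_difference((1.00, 2.00, 0.50), (1.12, 2.09, 0.55)), 3))  # → 0.158
```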

  11. C3R2 - Complete Calibration of the Color-Redshift Relation: Keck spectroscopy to train photometric redshifts for Euclid and WFIRST

    NASA Astrophysics Data System (ADS)

    Stern, Daniel; C3R2 Team

    2017-01-01

    A primary objective of both WFIRST and Euclid is to provide a 3D map of the distribution of matter across a significant fraction of the universe from the weak lensing shear field, but to do so requires robust distances to billions of galaxies. I will report on a multi-semester program, expected to total approximately 40 nights with Keck over the next two years. This program, supporting both the NASA PCOS and COR science goals, will obtain the necessary galaxy spectroscopy to calibrate the color-redshift relation for the Euclid mission, and make significant progress towards the WFIRST requirements. The program, called C3R2 or Complete Calibration of the Color-Redshift Relation, already encompasses 10 allocated nights of NASA Keck Key Strategic Mission Support (PI D. Stern), 12 allocated nights from Caltech (PI J. Cohen), 3 allocated nights from the University of Hawaii (PI D. Sanders), and 1.5 allocated nights from UC-Riverside (PI B. Mobasher). We are also pursuing opportunities at additional 8- to 10-meter class telescopes, including Magellan, VLT and GCT. I will present the motivation for this program, the plans, and current results.

  12. Effects of processed oil shale on the element content of Atriplex canescens

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, B.M.

    1982-01-01

    Samples of four-wing saltbush were collected from the Colorado State University Intensive Oil Shale Revegetation Study Site test plots in the Piceance basin, Colorado. The test plots were constructed to evaluate the effects of processed oil shale geochemistry on plant growth using various thicknesses of soil cover over the processed shale and/or over a gravel barrier between the shale and soil. Generally, the thicker the soil cover, the less the influence of the shale geochemistry on the element concentrations in the plants. Concentrations of 20 elements were larger in the ash of four-wing saltbush growing on the plot with the gravel barrier (between the soil and processed shale) when compared to the sample from the control plot. A greater water content in the soil in this plot has been reported, and the interaction between the increased, percolating water and shale may have increased the availability of these elements for plant uptake. Concentrations of boron, copper, fluorine, lithium, molybdenum, selenium, silicon, and zinc were larger in the samples grown over processed shale, compared to those from the control plot, and concentrations for barium, calcium, lanthanum, niobium, phosphorus, and strontium were smaller. Concentrations for arsenic, boron, fluorine, molybdenum, and selenium - considered to be potentially toxic contaminants - were similar to results reported in the literature for vegetation from the test plots. The copper-to-molybdenum ratios in three of the four samples of four-wing saltbush growing over the processed shale were below the ratio of 2:1, which is judged detrimental to ruminants, particularly cattle. Boron concentrations averaged 140 ppm, well above the phytotoxicity level for most plant species. Arsenic, fluorine, and selenium concentrations were below toxic levels, and thus should not present any problem for revegetation or forage use at this time.

  13. YRMR Organizing Committee

    NASA Astrophysics Data System (ADS)

    2011-02-01

    Elena Cannuccia graduated in physics at the University of Rome Tor Vergata in the Condensed Matter Theory Group (http://www.fisica.uniroma2.it/?cmtheo-group/). She is currently finishing her PhD in the same research group. Her research project focused on the investigation of the role played by electron-phonon coupling on the electronic properties of polymers. Through completing her research she made a contribution to the development of YAMBO (http://www.yambo-code.org/), a FORTRAN/C code for Many-Body calculations in solid state and molecular physics. Luca Mazzaferro is a PhD student at the University of Rome "Tor Vergata". He works on the ATLAS experiment and is a member of the ATLAS Calibration group. He graduated from the University of Rome "La Sapienza" in May 2010, and worked with the local ATLAS group, developing the LCDS routines for the calibration of the ATLAS MDT Chambers. This software is now a standard for managing the calibration analysis of ATLAS chambers. He also works in the administrative group of the "Tor Vergata" grid-computing farm. Marina Migliaccio graduated in Universe Science at the University of Rome "Tor Vergata", and she is now a PhD student in Astronomy. During her PhD she has spent eight months as a visiting scholar at the Kavli Institute for Cosmology in Cambridge (UK). The focus of her research is precision cosmology. In this context, her work so far has been devoted to the study of Cosmic Microwave Background (CMB) radiation in order to constrain cosmological models and early universe physics. She has analyzed the BOOMERanG balloon-borne experiment data, searching for a primordial non-Gaussian signature. Since 2008, she has been involved in the Planck mission Core Cosmology program, where her major contribution deals with measuring the statistical properties of CMB intensity and polarization fields in view of realistic (both instrumental and astrophysical) effects. 
    Davide Pietrobon graduated in Astronomy, sharing the PhD between the University of Roma "Tor Vergata" and the Institute of Cosmology and Gravitation at the University of Portsmouth, within the context of the European Cotutela project. His thesis represents a detailed analysis of the cosmological perturbations through needlets, a statistical tool he developed together with his colleagues in Rome. In particular he focused on two main open questions in cosmology: dark energy and non-Gaussianity. He gained his Bachelor's degree in physics from the University of Modena and Reggio Emilia, and his Master's in physics at the University of Roma "Tor Vergata". He spent three months at the University of California Irvine as a visiting student and is now a postdoc at the Jet Propulsion Laboratory. Francesco Stellato has studied the role of metals in the pathogenesis of neurodegenerative diseases such as Parkinson's and Alzheimer's during his PhD. For this purpose, he mainly used synchrotron radiation-based techniques, e.g. X-ray Absorption Spectroscopy. He is interested in the development of new-generation light sources such as high-brilliance synchrotron and Free Electron Lasers, and in their application to the structural and dynamical study of biomolecules. Marcella Veneziani is a postdoc fellow at the California Institute of Technology and the University of Rome "La Sapienza". In February 2009 she gained her PhD in Astronomy at the University of Rome "La Sapienza", and in Physics, Particles and Matter at the University of Paris Diderot. Her fields of interest are: physics of the interstellar medium and star formation; cosmic microwave background radiation; analysis of data from orbital and suborbital experiments, and instrumental calibration. She is a member of the Herschel-HiGal, the Planck-HFI and the BOOMERanG collaborations. Part of her work has been undertaken at the European Space Agency Astronomy Center and at the University of California Irvine.

  14. Calibrating the Galaxy Color-Redshift Relation: A Critical Foundation for Weak Lensing Cosmology with WFIRST and Euclid

    NASA Astrophysics Data System (ADS)

    Stern, Daniel

    2016-08-01

    A primary objective of both WFIRST and Euclid is to provide a 3D map of the distribution of matter across a significant fraction of the universe from the weak lensing shear field, but to do so requires robust distances to billions of galaxies. We propose a 4-semester, 20-night Key Strategic Mission Support program, supporting both the NASA PCOS and COR science goals, to obtain the necessary galaxy spectroscopy to calibrate the color-redshift relation. Combined with a coordinated, similarly sized Caltech Keck proposal, the proposed program will achieve the photometric redshift calibration requirements for Euclid, and make significant progress towards the WFIRST requirements. [2016B is the 2nd semester of our 4-semester request.]

  15. Assessment of the calibration of periodontal diagnosis and treatment planning among dental students at three dental schools.

    PubMed

    Lane, Brittany A; Luepke, Paul; Chaves, Eros; Maupome, Gerardo; Eckert, George J; Blanchard, Steven; John, Vanchit

    2015-01-01

    Calibration in diagnosis and treatment planning is difficult to achieve due to variations that exist in clinical interpretation. Evaluating variation among dental students can indicate whether dental faculty members are consistent in teaching how to diagnose and treat periodontal disease. A previous study reported high variability in diagnoses and treatment plans of periodontal cases at Indiana University School of Dentistry. This study aimed to build on that one by extending the research to two additional schools: Marquette University School of Dentistry and West Virginia University School of Dentistry. Diagnosis and treatment planning by 40 third- and fourth-year dental students were assessed at each of the schools. Students were asked to select the diagnosis and treatment plans on a questionnaire pertaining to 11 cases. Their responses were compared using chi-square tests, and multirater kappa statistics were used to assess agreement between classes and between schools. Logistic regression models were used to evaluate the effects of school, class year, prior experience, and GPA/class rank on correct responses. One case had a statistically significant difference in responses between third- and fourth-year dental students. Kappas for school agreement and class agreement were low. The students from Indiana University had higher diagnosis and treatment agreements than the Marquette University students, and the Marquette students fared better than the West Virginia University students. This study can help restructure future periodontal courses for a better understanding of periodontal diagnosis and treatment planning.

  16. Sky-radiance gradient measurements at narrow bands in the visible.

    PubMed

    Winter, E M; Metcalf, T W; Stotts, L B

    1995-07-01

    Accurate calibrated measurements of the radiance of the daytime sky were made in narrow bands in the visible portion of the spectrum. These measurements were made over several months and were tabulated in a sun-referenced coordinate system. The radiance as a function of wavelength at angles ranging from 5 to 90 deg was plotted. A best-fit inverse power-law fit shows inversely linear behavior of the radiance versus wavelength near the Sun (5 deg) and a slope approaching inverse fourth power far from the Sun (60 deg). This behavior fits a Mie-scattering interpretation near the Sun and a Rayleigh-scattering interpretation away from the Sun. The results are also compared with LOWTRAN models.
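The power-law slopes reported above fall out of a straight-line fit in log-log space. A sketch using synthetic radiances that follow an exact inverse-fourth-power (Rayleigh-like) law, so the recovered exponent is known in advance:

```python
import math

def power_law_exponent(wavelengths, radiances):
    """Least-squares slope of log(radiance) vs log(wavelength); for
    radiance proportional to wavelength**(-n) the returned slope is -n."""
    xs = [math.log(w) for w in wavelengths]
    ys = [math.log(r) for r in radiances]
    xbar = sum(xs) / len(xs)
    ybar = sum(ys) / len(ys)
    sxy = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
    sxx = sum((x - xbar) ** 2 for x in xs)
    return sxy / sxx

# Synthetic narrow-band radiances with an exact inverse-fourth-power law,
# like the Rayleigh-scattering regime far from the Sun.
wl = [450.0, 500.0, 550.0, 600.0, 650.0]
rad = [w ** -4 for w in wl]
print(round(power_law_exponent(wl, rad), 6))  # → -4.0
```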

  17. An Analysis of San Diego's Housing Market Using a Geographically Weighted Regression Approach

    NASA Astrophysics Data System (ADS)

    Grant, Christina P.

    San Diego County real estate transaction data was evaluated with a set of linear models calibrated by ordinary least squares and geographically weighted regression (GWR). The goal of the analysis was to determine whether the spatial effects assumed to be in the data are best studied globally with no spatial terms, globally with a fixed effects submarket variable, or locally with GWR. 18,050 single-family residential sales which closed in the six months between April 2014 and September 2014 were used in the analysis. Diagnostic statistics including AICc, R2, Global Moran's I, and visual inspection of diagnostic plots and maps indicate superior model performance by GWR as compared to both global regressions.

  18. Determination of plutonium in nitric acid solutions using energy dispersive L X-ray fluorescence with a low power X-ray generator

    NASA Astrophysics Data System (ADS)

    Py, J.; Groetz, J.-E.; Hubinois, J.-C.; Cardona, D.

    2015-04-01

    This work presents the development of an in-line energy dispersive L X-ray fluorescence spectrometer set-up, with a low power X-ray generator and a secondary target, for the determination of plutonium concentration in nitric acid solutions. The intensity of the L X-rays from the internal conversion and gamma rays emitted by the daughter nuclei from plutonium is minimized and corrected, in order to eliminate the interferences with the L X-ray fluorescence spectrum. The matrix effects are then corrected by the Compton peak method. A calibration plot for plutonium solutions within the range 0.1-20 g L-1 is given.
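A calibration plot of this kind reduces to fitting a straight line through standards and inverting it to read unknowns. A minimal sketch (the fluorescence counts below are invented; only the 0.1-20 g L-1 working range comes from the abstract):

```python
def fit_calibration(conc, signal):
    """Least-squares line signal = slope*conc + intercept for a calibration plot."""
    n = len(conc)
    cbar = sum(conc) / n
    sbar = sum(signal) / n
    slope = (sum((c - cbar) * (s - sbar) for c, s in zip(conc, signal))
             / sum((c - cbar) ** 2 for c in conc))
    return slope, sbar - slope * cbar

def predict_conc(signal_value, slope, intercept):
    """Invert the calibration line to read concentration from a measured signal."""
    return (signal_value - intercept) / slope

# Hypothetical standards across the 0.1-20 g/L range (counts are invented).
conc = [0.1, 1.0, 5.0, 10.0, 20.0]
counts = [60.0, 510.0, 2510.0, 5010.0, 10010.0]
slope, intercept = fit_calibration(conc, counts)
print(round(predict_conc(3000.0, slope, intercept), 2))  # → 5.98 (g/L)
```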

  19. Determination of virginiamycin M1 residue in tissues of swine and chicken by ultra-performance liquid chromatography tandem mass spectrometry.

    PubMed

    Wang, Xiaoyang; Wang, Mi; Zhang, Keyu; Hou, Ting; Zhang, Lifang; Fei, Chenzong; Xue, Feiqun; Hang, Taijun

    2018-06-01

    A reliable UPLC-MS/MS method with high sensitivity was developed and validated for the determination of virginiamycin M1 in muscle, fat, liver, and kidney samples of chicken and swine. Analytes were extracted using acetonitrile and extracts were defatted with n-hexane. Chromatographic separation was performed on a BEH C18 liquid chromatography column. The analytes were then detected using triple-quadrupole mass spectrometry in positive electrospray ionization and multiple reaction monitoring mode. Calibration plots were constructed using standard working solutions and showed good linearity. Limits of quantification ranged from 2 to 60 ng mL-1. Copyright © 2018 Elsevier Ltd. All rights reserved.

  20. Software for Preprocessing Data From Rocket-Engine Tests

    NASA Technical Reports Server (NTRS)

    Cheng, Chiu-Fu

    2002-01-01

    Three computer programs have been written to preprocess digitized outputs of sensors during rocket-engine tests at Stennis Space Center (SSC). The programs apply exclusively to the SSC "E" test-stand complex and utilize the SSC file format. The programs are the following: 1) Engineering Units Generator (EUGEN) converts sensor-output-measurement data to engineering units. The inputs to EUGEN are raw binary test-data files, which include the voltage data, a list identifying the data channels, and time codes. EUGEN effects conversion by use of a file that contains calibration coefficients for each channel; 2) QUICKLOOK enables immediate viewing of a few selected channels of data, in contradistinction to viewing only after post-test processing (which can take 30 minutes to several hours, depending on the number of channels and other test parameters) of data from all channels. QUICKLOOK converts the selected data into a form in which they can be plotted in engineering units by use of Winplot (a free graphing program written by Rick Paris); and 3) EUPLOT provides a quick means for looking at data files generated by EUGEN without the necessity of relying on the PV-WAVE-based plotting software.
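The EUGEN conversion step, turning raw channel voltages into engineering units via per-channel calibration coefficients, can be sketched as a polynomial evaluation. The channel name, coefficients, and coefficient layout below are hypothetical; the actual SSC file format is not described here:

```python
def to_engineering_units(volts, coeffs):
    """Apply polynomial calibration coefficients c0 + c1*v + c2*v**2 + ...
    to one raw voltage sample from a channel."""
    return sum(c * volts ** i for i, c in enumerate(coeffs))

# Hypothetical pressure channel with a linear calibration: 0.5 + 250*v (psi).
channel_coeffs = {"P-101": [0.5, 250.0]}
raw_volts = [0.0, 1.2, 3.6]
print([to_engineering_units(v, channel_coeffs["P-101"]) for v in raw_volts])
# → [0.5, 300.5, 900.5]
```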

  1. Kinematics of Local, High-Velocity K dwarfs in the SUPERBLINK Proper Motion Catalog

    NASA Astrophysics Data System (ADS)

    Kim, Bokyoung; Lepine, Sebastien

    2018-01-01

    We present a study of the kinematics of 345,480 K stars within 2 kpc of the Sun, based on data from the SUPERBLINK catalog of stars with high proper motions (> 40 mas/yr), combined with data from the 2MASS survey and from the first GAIA release, which together yield proper motions accurate to ~2 mas/yr. All K dwarfs were selected based on their G-K colors, and photometric distances were estimated from a re-calibrated color-magnitude relationship for K dwarfs. We plot transverse velocities VT in various directions on the sky, to examine the local distribution of K dwarfs in velocity space. We have also obtained radial velocity information for a subsample of 10,128 stars, from RAVE and SDSS DR12, which we use to construct spatial velocity (U, V, W) plots. About a third (123,350) of the stars are high-velocity K dwarfs, with motions consistent with the local Galactic halo population. Our kinematic analysis suggests that their velocity-space distribution is very uniform, and we find no evidence of substructure that might arise, e.g., from local streams or moving groups.
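Transverse velocities like the VT values mapped above follow from the standard relation v_T = 4.74 μ d, with proper motion μ in arcsec/yr and distance d in parsecs. A minimal sketch (the example star is hypothetical):

```python
def transverse_velocity(pm_mas_yr, distance_pc):
    """Transverse velocity in km/s from proper motion (mas/yr) and distance (pc),
    using the standard relation v_T = 4.74 * mu[arcsec/yr] * d[pc]."""
    return 4.74 * (pm_mas_yr / 1000.0) * distance_pc

# A star right at the SUPERBLINK proper-motion limit (40 mas/yr), placed at a
# hypothetical photometric distance of 500 pc:
print(round(transverse_velocity(40.0, 500.0), 1))  # → 94.8 km/s
```

The survey's proper-motion floor thus translates into a distance-dependent velocity floor, which is worth keeping in mind when interpreting the high-velocity fraction.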

  2. Size-exclusion chromatography of perfluorosulfonated ionomers.

    PubMed

    Mourey, T H; Slater, L A; Galipo, R C; Koestner, R J

    2011-08-26

    A size-exclusion chromatography (SEC) method in N,N-dimethylformamide containing 0.1 M LiNO3 is shown to be suitable for the determination of molar mass distributions of three classes of perfluorosulfonated ionomers, including Nafion®. Autoclaving sample preparation is optimized to prepare molecular solutions free of aggregates, and a solvent exchange method concentrates the autoclaved samples to enable the use of molar-mass-sensitive detection. Calibration curves obtained from light scattering and viscometry detection suggest minor variation in the specific refractive index increment across the molecular size distributions, which introduces inaccuracies in the calculation of local absolute molar masses and intrinsic viscosities. Conformation plots that combine apparent molar masses from light scattering detection with apparent intrinsic viscosities from viscometry detection partially compensate for the variations in refractive index increment. The conformation plots are consistent with compact polymer conformations, and they provide Mark-Houwink-Sakurada constants that can be used to calculate molar mass distributions without molar-mass-sensitive detection. Unperturbed dimensions and characteristic ratios calculated from viscosity-molar mass relationships indicate unusually free rotation of the perfluoroalkane backbones and may suggest limitations to applying two-parameter excluded volume theories for these ionomers. Copyright © 2011 Elsevier B.V. All rights reserved.
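A Mark-Houwink-Sakurada relationship is linear in log-log space, log[η] = log K + a·log M, so the constants can be recovered by a straight-line fit to a conformation plot. A generic sketch with synthetic, illustrative values (K and a below are invented, not the paper's):

```python
import math

# Recover Mark-Houwink-Sakurada constants K and a from (M, [eta]) pairs by an
# ordinary least-squares line fit in log-log space.  Values are illustrative.

def fit_mhs(molar_masses, intrinsic_viscosities):
    xs = [math.log10(m) for m in molar_masses]
    ys = [math.log10(iv) for iv in intrinsic_viscosities]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    log_k = my - a * mx
    return 10 ** log_k, a

# Synthetic data generated with K = 2e-4 and a = 0.55 (a compact conformation):
masses = [1e4, 5e4, 1e5, 5e5]
etas = [2e-4 * m ** 0.55 for m in masses]
K, a = fit_mhs(masses, etas)
# The fit recovers K ~= 2e-4 and a ~= 0.55 from the noiseless points.
```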

  3. Quantifying differences in the impact of variable chemistry on equilibrium uranium(VI) adsorption properties of aquifer sediments

    USGS Publications Warehouse

    Stoliker, Deborah L.; Kent, Douglas B.; Zachara, John M.

    2011-01-01

    Uranium adsorption-desorption on sediment samples collected from the Hanford 300-Area, Richland, WA, varied extensively over a range of field-relevant chemical conditions, complicating assessment of possible differences in equilibrium adsorption properties. Adsorption equilibrium was achieved in 500-1000 h, although dissolved uranium concentrations increased over thousands of hours owing to changes in aqueous chemical composition driven by sediment-water reactions. A nonelectrostatic surface complexation reaction, >SOH + UO2^2+ + 2CO3^2- = >SOUO2(CO3HCO3)^2-, provided the best fit to experimental data for each sediment sample, resulting in a range of conditional equilibrium constants (logKc) from 21.49 to 21.76. Potential differences in uranium adsorption properties could be assessed in plots based on the generalized mass-action expressions, yielding linear trends displaced vertically by differences in logKc values. Using this approach, logKc values for seven sediment samples were not significantly different. However, a significant difference in adsorption properties was observed between one sediment sample and the fines. Estimates of Kc uncertainty were improved by capturing all data points within experimental errors. The mass-action expression plots demonstrate that applying models outside the range of conditions used in model calibration greatly increases potential errors.
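The conditional constant follows directly from the mass-action expression of the stated surface complexation reaction. A sketch with invented equilibrium concentrations (the paper's sediments cluster near logKc ≈ 21.5-21.8; the numbers below are purely illustrative):

```python
import math

# Mass-action expression for >SOH + UO2^2+ + 2CO3^2- = >SOUO2(CO3HCO3)^2-:
# Kc = [complex] / ([>SOH] [UO2^2+] [CO3^2-]^2).  Concentrations are invented.

def log_kc(adsorbed, soh, uo2, co3):
    """log10 of the conditional equilibrium constant from concentrations (mol/L)."""
    return math.log10(adsorbed / (soh * uo2 * co3 ** 2))

value = log_kc(adsorbed=1e-6, soh=1e-3, uo2=1e-8, co3=1e-5)
# value ~= 15 for these made-up concentrations
```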

  4. Improvements in the Goddard balloon-borne lidar

    NASA Technical Reports Server (NTRS)

    Heaps, W. S.

    1986-01-01

    The Goddard balloon-borne lidar system for the measurement of stratospheric ozone and the hydroxyl radical has made three additional flights since the last laser radar conference. On September 27, 1984, a flight was made from Palestine, Texas, obtaining a measurement of hydroxyl diurnal variation at 36 km. These data are presented on the plot which shows hydroxyl concentration as a function of GMT for the range cell closest to the instrument. Local noon corresponds to 18 hours on the plot. The rapid drop in concentration after noon is not predicted by models of stratospheric chemistry. It may represent the effects of contamination of the sample volume by hydrocarbons outgassed from the balloon. The more recent flights, on June 30, 1985, and December 6, 1985, focused on measurements of concentration in the lower stratosphere (less than 30 km). The June flight succeeded in obtaining an average concentration measurement (1.8 ± 0.0000018 molecules/cm³) over the altitude range 21 to 26 km. The December flight obtained measurements down to 24 km with a better signal-to-noise ratio than that obtained in June. Prospects for further improvement in sensitivity and absolute calibration will be discussed.

  5. Mathematical Model and Calibration Experiment of a Large Measurement Range Flexible Joints 6-UPUR Six-Axis Force Sensor

    PubMed Central

    Zhao, Yanzhi; Zhang, Caifeng; Zhang, Dan; Shi, Zhongpan; Zhao, Tieshi

    2016-01-01

    Nowadays, improving the accuracy and enlarging the measuring range of six-axis force sensors for wider applications in aircraft landing, rocket thrust, and spacecraft docking testing experiments has become an urgent objective. However, it is still difficult to achieve high accuracy and a large measuring range with traditional parallel six-axis force sensors due to the influence of the gap and friction of the joints. Therefore, to overcome these limitations, this paper proposes a 6-Universal-Prismatic-Universal-Revolute (6-UPUR) parallel mechanism with flexible joints for the development of a large-measurement-range six-axis force sensor. The structural characteristics of the sensor are analyzed in comparison with a traditional parallel sensor based on the Stewart platform. The force transfer relation of the sensor is deduced, and the force Jacobian matrix is obtained using screw theory for two cases: the ideal state, and the state in which the flexibility of each flexible joint is considered. The prototype and loading calibration system are designed and developed. The K value method and the least squares method are used to process the experimental data, and the type I and type II linearity errors are obtained. The experimental results show that the calibration error of the K value method is more than 13.4%, whereas the calibration error of the least squares method is 2.67%. The experimental results prove the feasibility of the sensor and the correctness of the theoretical analysis, which are expected to be adopted in practical applications. PMID:27529244
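The least-squares idea behind such a calibration can be shown on a single channel: given applied reference loads and raw outputs, the gain minimizing the squared residuals is k = Σ(F·s)/Σ(s²), and the linearity error is the worst deviation from the fitted line as a fraction of full scale. This is a one-channel illustration with invented numbers; the paper's sensor uses a full six-axis calibration matrix.

```python
# Single-channel least-squares calibration sketch (synthetic data; the real
# sensor relates six loads to six outputs through a calibration matrix).

def least_squares_gain(loads, outputs):
    """Gain k minimizing sum((F - k*s)^2): k = sum(F*s) / sum(s*s)."""
    return sum(f * s for f, s in zip(loads, outputs)) / sum(s * s for s in outputs)

def linearity_error(loads, outputs, k, full_scale):
    """Maximum deviation from the fitted line, as a fraction of full scale."""
    return max(abs(f - k * s) for f, s in zip(loads, outputs)) / full_scale

loads = [0.0, 100.0, 200.0, 300.0]        # N, applied by the loading system
outputs = [0.0, 0.0102, 0.0199, 0.0301]   # V, measured (invented values)
k = least_squares_gain(loads, outputs)
err = linearity_error(loads, outputs, k, full_scale=300.0)
# err is well under 1% of full scale for these synthetic points
```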

  6. The radiation metrology network related to the field of mammography: implementation and uncertainty analysis of the calibration system

    NASA Astrophysics Data System (ADS)

    Peixoto, J. G. P.; de Almeida, C. E.

    2001-09-01

    It is recognized by international guidelines that calibration services for mammography beams are necessary in order to improve the quality of clinical diagnosis. Major efforts have been made by several laboratories to establish an appropriate and traceable calibration infrastructure and to provide the basis for a quality control programme in mammography. The contribution of the radiation metrology network to the users of mammography is reviewed in this work. The steps required for the implementation of a mammography calibration system, using a constant-potential x-ray unit and a clinical mammography x-ray machine, are also presented. The various mammography radiation qualities discussed in this work are in accordance with IEC 61674 and the AAPM recommendations. They are at present available at several primary standard dosimetry laboratories (PSDLs), namely the PTB, NIST and BEV, and at a few secondary standard dosimetry laboratories (SSDLs), such as the University of Wisconsin and the IAEA's SSDL. We discuss the uncertainties involved in all steps of the calibration chain in accordance with the ISO recommendations.

  7. Lamp mapping technique for independent determination of the water vapor mixing ratio calibration factor for a Raman lidar system

    NASA Astrophysics Data System (ADS)

    Venable, Demetrius D.; Whiteman, David N.; Calhoun, Monique N.; Dirisu, Afusat O.; Connell, Rasheen M.; Landulfo, Eduardo

    2011-08-01

    We have investigated a technique that allows for the independent determination of the water vapor mixing ratio calibration factor for a Raman lidar system. This technique utilizes a procedure whereby a light source of known spectral characteristics is scanned across the aperture of the lidar system's telescope and the overall optical efficiency of the system is determined. Direct analysis of the temperature-dependent differential scattering cross sections for vibration and vibration-rotation transitions (convolved with narrowband filters), along with the measured efficiency of the system, leads to a theoretical determination of the water vapor mixing ratio calibration factor. A calibration factor was also obtained experimentally from lidar measurements and radiosonde data. The theoretical and experimentally determined values agree within 5%. We report on the sensitivity of the water vapor mixing ratio calibration factor to uncertainties in the parameters that characterize the narrowband transmission filters, the temperature-dependent differential scattering cross section, and the variability of the system efficiency ratios as the lamp is scanned across the aperture of the telescope used in the Howard University Raman Lidar system.
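The retrieved mixing ratio in a Raman lidar scales with the ratio of the H2O and N2 Raman channel signals, w = k·S_H2O/S_N2, where k is the calibration factor determined two ways in the abstract. All numbers below are invented for illustration; only the 5% agreement level comes from the abstract.

```python
# Water vapor mixing ratio from Raman channel signals and calibration factor k.
# The two k values below are hypothetical stand-ins for the lamp-mapping
# (theoretical) and radiosonde-derived (experimental) determinations.

def mixing_ratio(s_h2o, s_n2, k):
    return k * s_h2o / s_n2

def percent_difference(a, b):
    return 100.0 * abs(a - b) / b

k_theory = 0.82       # hypothetical lamp-mapping value
k_experiment = 0.85   # hypothetical radiosonde-derived value
agreement = percent_difference(k_theory, k_experiment)
w = mixing_ratio(s_h2o=120.0, s_n2=9500.0, k=k_experiment)
# agreement is ~3.5%, i.e., within the 5% level reported
```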

  8. Wind Tunnel Balance Calibration: Are 1,000,000 Data Points Enough?

    NASA Technical Reports Server (NTRS)

    Rhew, Ray D.; Parker, Peter A.

    2016-01-01

    Measurement systems are typically calibrated based on standard practices established by a metrology standards laboratory, for example the National Institute of Standards and Technology (NIST), or dictated by an organization's metrology manual. The calibration is therefore designed and executed according to an established procedure. However, for many aerodynamic research measurement systems a universally accepted, traceable standard approach does not exist. A strategy for developing a calibration protocol is therefore left to the developer or user to define, based on experience and recommended practice in their respective industry. Wind tunnel balances are one such measurement system. Many different calibration systems, load schedules and procedures have been developed for balances, with little consensus on a recommended approach. Especially lacking is guidance on the number of calibration data points needed. Regrettably, the number of data points tends to be correlated with the perceived quality of the calibration. Often, the number of data points is driven by one's ability to generate the data rather than by a defined need in support of measurement objectives. Hence the title of this paper, conceived to challenge recent observations in the wind tunnel balance community that show an ever-increasing appetite for more data points per calibration, absent guidance for determining when there are enough. This paper presents fundamental concepts and theory to aid in the development of calibration procedures for wind tunnel balances and provides a framework that is generally applicable to the characterization and calibration of other measurement systems. Questions that need to be answered include: What constitutes an adequate calibration? How much data are needed in the calibration? How good is the calibration? This paper will assist a practitioner in answering these questions by presenting an underlying theory on how to evaluate a calibration based on objective measures. This will enable the developer and user to design calibrations with quantified performance in terms of their capability to meet the user's objectives, and provides a basis for comparing existing calibrations that may have been developed in an ad hoc manner.
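A back-of-the-envelope way to see why a million points may be overkill (not taken from the paper): the standard error of an estimated calibration parameter typically shrinks as 1/√n, so added points yield rapidly diminishing returns.

```python
import math

# Diminishing returns of calibration point count: standard error ~ sigma/sqrt(n).
# sigma is the per-point measurement scatter, in arbitrary units.

def standard_error(sigma, n):
    return sigma / math.sqrt(n)

sigma = 1.0
errors = {n: standard_error(sigma, n) for n in (100, 1000, 10000, 1000000)}
# Going from 10,000 to 1,000,000 points improves the error only 10-fold,
# at 100x the data-collection cost.
```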

  9. Assessing the performance of a plastic optical fibre turbidity sensor for measuring post-fire erosion from plot to catchment scale

    NASA Astrophysics Data System (ADS)

    Keizer, J. J.; Martins, M. A. S.; Prats, S. A.; Santos, L. F.; Vieira, D. C. S.; Nogueira, R.; Bilro, L.

    2015-09-01

    This study is the first comprehensive testing of a novel plastic optical fibre turbidity sensor with runoff samples collected in the field and, more specifically, with a total of 158 streamflow samples and 925 overland flow samples from a recently burnt forest area in north-central Portugal, collected mainly during the first year after the wildfire, as well as with 56 overland flow samples from a nearby long-unburnt study site. Sediment concentrations differed less between overland flow and streamflow samples than between study sites and, at one study site, between plots with and without effective erosion mitigation treatments. Maximum concentrations ranged from 0.91 to 8.19 g L-1 for the micro-plot overland flow samples from the six burnt sites, from 1.74 to 8.99 g L-1 for the slope-scale overland flow samples from these same sites, and amounted to 4.55 g L-1 for the streamflow samples. Power functions provided reasonably good fits to the expected relationships of increasing normalized light loss with increasing sediment concentrations for the different sample types from individual study sites. The corresponding adjusted R2 values ranged from 0.64 to 0.81 in the case of the micro-plot samples from the six burnt sites and from 0.72 to 0.89 in the case of the slope-scale samples from these same sites, and the value was 0.85 in the case of the streamflow samples. While the overall performance of the sensor was thus rather satisfactory, the results pointed to the need for scale- or site-specific calibrations to maximize the reliability of the predictions of sediment concentration by the POF (plastic optical fibre) sensor. This especially applied to the cases in which sediment concentrations were comparatively low, for example following mulching with forest residues.
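A calibration of this kind fits a power function L = a·C^b relating normalized light loss L to sediment concentration C, and then inverts it to predict concentration from a sensor reading. A sketch of that round trip; the coefficients a and b below are invented, not the paper's.

```python
# Power-law turbidity calibration sketch: forward model and its inverse.
# Coefficients are illustrative, not fitted to the study's samples.

def light_loss(c, a, b):
    """Normalized light loss predicted for sediment concentration c (g/L)."""
    return a * c ** b

def concentration_from_loss(loss, a, b):
    """Invert L = a * C**b  ->  C = (L / a) ** (1 / b)."""
    return (loss / a) ** (1.0 / b)

a, b = 0.05, 0.7
reading = light_loss(2.5, a, b)                # simulated response at 2.5 g/L
predicted = concentration_from_loss(reading, a, b)
# predicted recovers 2.5 g/L (round trip through the fitted model)
```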

  10. Calibrated Passive Sampling--Multi-plot Field Measurements of NH3 Emissions with a Combination of Dynamic Tube Method and Passive Samplers.

    PubMed

    Pacholski, Andreas

    2016-03-21

    Agricultural ammonia (NH3) emissions (90% of total EU emissions) are responsible for about 45% of airborne eutrophication, 31% of soil acidification and 12% of fine dust formation within the EU15. NH3 emissions also represent a considerable loss of nutrients. Many studies on NH3 emission from organic and mineral fertilizer application have been performed in recent decades. Nevertheless, research on NH3 emissions after fertilizer application is still limited, in particular with respect to the relationships between emissions and fertilizer type, site conditions and crop growth. Due to the variable response of crops to treatments, effects can only be validated in experimental designs that include field replication for statistical testing. The dominant ammonia loss methods yielding quantitative emissions require large field areas, expensive equipment or a power supply, which restricts their application in replicated field trials. This protocol describes a new methodology for the measurement of NH3 emissions on many plots, linking a simple semi-quantitative measuring method used in all plots with a quantitative method applied simultaneously alongside it on selected plots. The semi-quantitative measurement method uses passive samplers. The second method is a dynamic chamber method (Dynamic Tube Method) used to obtain a transfer quotient, which converts the semi-quantitative losses of the passive samplers to quantitative losses (kg nitrogen ha(-1)). The principle underlying this approach is that passive samplers placed in a homogeneous experimental field have the same NH3 absorption behavior under identical environmental conditions. Therefore, a transfer coefficient obtained from single passive samplers can be used to scale the values of all passive samplers used in the same field trial. The method proved valid under a wide range of experimental conditions and is recommended for use on bare soil or with small canopies (<0.3 m). Results obtained from experiments with taller plants should be treated more carefully.
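The scaling step can be sketched as follows: quantitative losses measured with the dynamic tube method on a few plots give a transfer coefficient that converts the semi-quantitative passive-sampler values on every plot. All numbers are illustrative, not measured.

```python
# Calibrated passive sampling sketch: derive a transfer coefficient on the
# plots measured with both methods, then scale all passive-sampler values.
# The values below are invented for illustration.

def transfer_coefficient(quantitative_losses, sampler_values):
    """Mean ratio of quantitative loss (kg N/ha) to passive-sampler signal."""
    ratios = [q / s for q, s in zip(quantitative_losses, sampler_values)]
    return sum(ratios) / len(ratios)

def scale_plots(sampler_values, coeff):
    return [coeff * s for s in sampler_values]

dtm = [12.0, 18.0, 9.0]   # kg N/ha from the dynamic tube method (3 plots)
ps = [4.0, 6.0, 3.0]      # matching passive-sampler signals on those plots
coeff = transfer_coefficient(dtm, ps)
all_plot_losses = scale_plots([4.0, 5.5, 2.8, 7.1], coeff)
```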

  12. SEGY to ASCII Conversion and Plotting Program 2.0

    USGS Publications Warehouse

    Goldman, Mark R.

    2005-01-01

    INTRODUCTION SEGY has long been a standard format for storing seismic data and header information. Almost every seismic processing package can read and write seismic data in SEGY format. In the data processing world, however, ASCII format is the 'universal' standard format. Very few general-purpose plotting or computation programs will accept data in SEGY format. The software presented in this report, referred to as SEGY to ASCII (SAC), converts seismic data written in SEGY format (Barry et al., 1975) to an ASCII data file, and then creates a postscript file of the seismic data using a general plotting package (GMT, Wessel and Smith, 1995). The resulting postscript file may be plotted by any standard postscript plotting program. There are two versions of SAC: one version for plotting a SEGY file that contains a single gather, such as a stacked CDP or migrated section, and a second version for plotting multiple gathers from a SEGY file containing more than one gather, such as a collection of shot gathers. Note that if a SEGY file has multiple gathers, then each gather must have the same number of traces per gather, and each trace must have the same sample interval and number of samples per trace. SAC will read several common standards of SEGY data, including SEGY files with sample values written in either IBM or IEEE floating-point format. In addition, utility programs are present to convert non-standard Seismic Unix (.sux) SEGY files and PASSCAL (.rsy) SEGY files to standard SEGY files. SAC allows complete user control over all plotting parameters including label size and font, tick mark intervals, trace scaling, and the inclusion of a title and descriptive text. SAC shell scripts create a postscript image of the seismic data in vector rather than bitmap format, using GMT's pswiggle command. 
Although this can produce a very large postscript file, the image quality is generally superior to that of a bitmap image, and commercial programs such as Adobe Illustrator can manipulate the image more efficiently.
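SEGY trace samples may be stored as 4-byte IBM hexadecimal floats rather than IEEE floats, which is why converters like SAC must handle both. A minimal decoder for the IBM format (sign bit, 7-bit base-16 exponent biased by 64, 24-bit fraction); this illustrates the conversion generically and is not SAC's code.

```python
# IBM single-precision float: value = (-1)^sign * 0.fraction * 16^(exponent-64).

def ibm_to_float(word):
    """Convert a 4-byte big-endian IBM single-precision float to a Python float."""
    u = int.from_bytes(word, "big")
    sign = -1.0 if u >> 31 else 1.0
    exponent = (u >> 24) & 0x7F
    fraction = (u & 0xFFFFFF) / float(1 << 24)
    return sign * fraction * 16.0 ** (exponent - 64)

# 0x41100000 encodes 1.0, and 0xC276A000 encodes -118.625:
assert ibm_to_float(b"\x41\x10\x00\x00") == 1.0
assert ibm_to_float(b"\xc2\x76\xa0\x00") == -118.625
```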

  13. Elizabeth City State University: Elizabeth City, North Carolina (Data)

    DOE Data Explorer

    Stoffel, T.; Andreas, A.

    1985-09-25

    The Historically Black Colleges and Universities (HBCU) Solar Radiation Monitoring Network operated from July 1985 through December 1996. Funded by DOE, the six-station network provided 5-minute averaged measurements of direct normal, global, and diffuse horizontal solar irradiance. The data were processed at NREL to improve the assessment of the solar radiation resources in the southeastern United States. Historical HBCU data available online include quality assessed 5-min data, monthly reports, and plots. In January 1997 the HBCU sites became part of the CONFRRM solar monitoring network and data from the two remaining active stations, Bluefield State College and Elizabeth City State University, are collected by the NREL Measurement & Instrumentation Data Center (MIDC).

  14. Contributions of the observatory of New Mexico State University, Volume 1, no. 4, April 4

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Papers are presented dealing with astronomical observations of the Jupiter Red Spot, Corona Borealis Constellation, and Meteoroids. Calibration of instruments and reduction and analysis of data are discussed.

  15. External Validation and Evaluation of Reliability and Validity of the Modified Seoul National University Renal Stone Complexity Scoring System to Predict Stone-Free Status After Retrograde Intrarenal Surgery.

    PubMed

    Park, Juhyun; Kang, Minyong; Jeong, Chang Wook; Oh, Sohee; Lee, Jeong Woo; Lee, Seung Bae; Son, Hwancheol; Jeong, Hyeon; Cho, Sung Yong

    2015-08-01

    The modified Seoul National University Renal Stone Complexity scoring system (S-ReSC-R) for retrograde intrarenal surgery (RIRS) was developed as a tool to predict the stone-free rate (SFR) after RIRS. We externally validated the S-ReSC-R. We retrospectively reviewed 159 patients who underwent RIRS. S-ReSC-R scores were assigned from 1 to 12 according to the location and number of sites involved. Stone-free status was defined as no evidence of a stone, or clinically insignificant residual stone fragments smaller than 2 mm. Interobserver and test-retest reliabilities were evaluated. Statistical performance of the prediction model was assessed by its predictive accuracy, predictive probability, and clinical usefulness. Overall SFR was 73.0%. The SFRs were 86.7%, 70.2%, and 48.6% in the low-score (1-2), intermediate-score (3-4), and high-score (5-12) groups, respectively (p<0.001). External validation of the S-ReSC-R revealed an area under the curve (AUC) of 0.731 (95% CI 0.650-0.813). The AUC of the three-tiered S-ReSC-R was 0.701 (95% CI 0.609-0.794). The calibration plot showed that the predicted probability of SFR had a concordance comparable to that of the observed frequency. The Hosmer-Lemeshow goodness of fit test revealed a p-value of 0.01 for the S-ReSC-R and 0.90 for the three-tiered S-ReSC-R. Interobserver and test-retest reliabilities revealed an almost perfect level of agreement. The present study proved the predictive value of the S-ReSC-R to predict SFR following RIRS in an independent cohort. Interobserver and test-retest reliabilities confirmed that the S-ReSC-R was reliable and valid.
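A calibration plot of the kind used here compares predicted probabilities against observed event frequencies within probability bins; good calibration puts the points near the diagonal. A generic sketch of the construction with toy data, not the study's code:

```python
# Build calibration-plot points: bin predictions, then compare the mean
# predicted probability with the observed event frequency in each bin.

def calibration_points(predicted, observed, n_bins=5):
    """Return (mean predicted, observed frequency) for each occupied bin."""
    bins = [[] for _ in range(n_bins)]
    for p, y in zip(predicted, observed):
        idx = min(int(p * n_bins), n_bins - 1)
        bins[idx].append((p, y))
    points = []
    for b in bins:
        if b:
            mean_p = sum(p for p, _ in b) / len(b)
            freq = sum(y for _, y in b) / len(b)
            points.append((mean_p, freq))
    return points

# Perfectly calibrated toy data: predictions of 0.1 and 0.9 with matching outcomes.
pred = [0.1] * 10 + [0.9] * 10
obs = [1] + [0] * 9 + [1] * 9 + [0]
pts = calibration_points(pred, obs)
# pts lies on the diagonal: roughly [(0.1, 0.1), (0.9, 0.9)]
```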

  16. Integrated work-flow for quantitative metabolome profiling of plants, Peucedani Radix as a case.

    PubMed

    Song, Yuelin; Song, Qingqing; Liu, Yao; Li, Jun; Wan, Jian-Bo; Wang, Yitao; Jiang, Yong; Tu, Pengfei

    2017-02-08

    Universal acquisition of reliable information regarding the qualitative and quantitative properties of complicated matrices is the premise for the success of a metabolomics study. Liquid chromatography-mass spectrometry (LC-MS) now serves as a workhorse for metabolomics; however, LC-MS-based non-targeted metabolomics suffers from some shortcomings, even though some cutting-edge techniques have been introduced. Aiming to tackle, to some extent, the drawbacks of the conventional approaches, such as redundant information, detector saturation, low sensitivity, and an inconstant signal number among different runs, a novel and flexible work-flow consisting of three progressive steps is proposed here to profile in depth the quantitative metabolome of plants. The roots of Peucedanum praeruptorum Dunn (Peucedani Radix, PR), which are rich in various coumarin isomers, were employed as a case study to verify the applicability. First, offline two-dimensional LC-MS was utilized for in-depth detection of metabolites in a pooled PR extract, termed the universal metabolome standard (UMS). Second, mass fragmentation rules, notably concerning angular-type pyranocoumarins that are the primary chemical homologues in PR, and available databases were integrated for signal assignment and structural annotation. Third, the optimum collision energy (OCE) as well as the ion transition for multiple reaction monitoring (MRM) measurement was optimized online with a reference-compound-free strategy for each annotated component, and large-scale relative quantification of all annotated components was accomplished by plotting calibration curves via serial dilution of the UMS. It is worthwhile to highlight that the potential of OCE for isomer discrimination was described, and that the linearity ranges of the primary ingredients were extended by suppressing their responses. The integrated work-flow is expected to qualify as a promising pipeline for clarifying the quantitative metabolome of plants, because it can not only holistically provide qualitative information but also straightforwardly generate an accurate quantitative dataset. Copyright © 2016 Elsevier B.V. All rights reserved.

  17. A New Outer Galaxy Molecular Cloud Catalog: Applications to Galactic Structure

    NASA Astrophysics Data System (ADS)

    Kerton, C. R.; Brunt, C. M.; Pomerleau, C.

    2001-12-01

    We have generated a new molecular cloud catalog from a reprocessed version of the Five College Radio Astronomy Observatory (FCRAO) Outer Galaxy Survey (OGS) of 12CO (J=1-0) emission. The catalog has been used to develop a technique that uses the observed angular size-linewidth relation (ASLWR) as a distance indicator for molecular cloud ensembles. The new technique is a promising means of mapping out the large-scale structure of our Galaxy using the new high spatial dynamic range CO surveys currently available. The catalog was created using a two-stage object-identification algorithm. We first identified contiguous emission structures of a specified minimum number of pixels above a specified temperature threshold. Each structure so defined was then examined, and localized emission enhancements within each structure were identified as separate objects. The resulting cloud catalog contains basic data on 14,595 objects. From the OGS we identified twenty-three cloud ensembles. For each, bisector fits to angular size vs. linewidth plots were made. The fits vary in a systematic way that allows a calibration of the fit parameters with distance. Our derived distances to the ensembles are consistent with the distance to the Perseus Arm, and the accurate radial velocity measurements available from the same data are in accord with the known non-circular motions at the location of the Perseus Arm. The ASLWR method was also successfully applied to data from the Boston University/FCRAO Galactic Ring Survey (GRS) of 13CO (J=1-0) emission. Based upon our experience with the GRS and OGS, the ASLWR technique should be usable in any data set with sufficient spatial dynamic range to allow it to be properly calibrated. C.P. participated in this study through the Women in Engineering and Science (WES) program of NRC Canada. The Dominion Radio Astrophysical Observatory is a National Facility operated by the National Research Council. The Canadian Galactic Plane Survey is a Canadian project with international partners, and is supported by the Natural Sciences and Engineering Research Council (NSERC).
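The first stage of such an object finder (contiguous pixels above a temperature threshold) can be sketched as a threshold plus flood-fill labeling. This pure-Python illustration uses 4-connectivity and a tiny invented brightness map; the survey's actual algorithm adds a second stage that splits localized emission peaks into separate objects.

```python
# Stage-one sketch of the catalog's object finder: label contiguous groups of
# pixels above a brightness threshold with an iterative flood fill.

def label_clouds(grid, threshold):
    rows, cols = len(grid), len(grid[0])
    labels = [[0] * cols for _ in range(rows)]
    current = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] > threshold and labels[r][c] == 0:
                current += 1
                stack = [(r, c)]
                while stack:
                    i, j = stack.pop()
                    if 0 <= i < rows and 0 <= j < cols \
                            and grid[i][j] > threshold and labels[i][j] == 0:
                        labels[i][j] = current
                        stack.extend([(i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)])
    return labels, current

field = [
    [0, 3, 3, 0, 0],
    [0, 3, 0, 0, 4],
    [0, 0, 0, 4, 4],
]
labels, n = label_clouds(field, threshold=1)
# n == 2: one "cloud" in the upper left, one in the lower right
```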

  18. Flight Test Results of an Angle of Attack and Angle of Sideslip Calibration Method Using Output-Error Optimization

    NASA Technical Reports Server (NTRS)

    Siu, Marie-Michele; Martos, Borja; Foster, John V.

    2013-01-01

    As part of a joint partnership between the NASA Aviation Safety Program (AvSP) and the University of Tennessee Space Institute (UTSI), research on advanced air data calibration methods has been in progress. This research was initiated to expand a novel pitot-static calibration method that was developed to allow rapid in-flight calibration for the NASA Airborne Subscale Transport Aircraft Research (AirSTAR) facility. This approach uses Global Positioning System (GPS) technology coupled with modern system identification methods that rapidly compute optimal pressure error models over a range of airspeed with defined confidence bounds. Subscale flight tests demonstrated small 2-sigma error bounds with a significant reduction in test time compared to other methods. Recent UTSI full-scale flight tests have shown airspeed calibrations with the same or better accuracy than the Federal Aviation Administration (FAA) accepted GPS 'four-leg' method, in a smaller test area and in less time. The current research was motivated by the desire to extend this method to in-flight calibration of angle of attack (AOA) and angle of sideslip (AOS) flow vanes. An instrumented Piper Saratoga research aircraft from UTSI was used to collect the flight test data and evaluate flight test maneuvers. Results showed that the output-error approach produces good results for flow vane calibration. In addition, maneuvers for pitot-static and flow vane calibration can be integrated to enable simultaneous and efficient testing of each system.

  19. Laser induced breakdown spectroscopy (LIBS) applied to stratigraphic elemental analysis and optical coherence tomography (OCT) to damage determination of cultural heritage Brazilian coins

    NASA Astrophysics Data System (ADS)

    M. Amaral, Marcello; Raele, Marcus P.; Z. de Freitas, Anderson; Zahn, Guilherme S.; Samad, Ricardo E.; D. Vieira, Nilson, Jr.; G. Tarelho, Luiz V.

    2009-07-01

    This work presents a compositional characterization of the 1939 Thousand "Réis" and 1945 One "Cruzeiro" Brazilian coins, minted from an aluminum bronze alloy. The coins were irradiated by a Q-switched Nd:YAG laser emitting at 1064 nm, with a 4 ns pulse width and a pulse energy of 25 mJ, reaching 3x10^10 W cm^-2 (an assured condition for stoichiometric ablation) and forming a plasma in a small fraction of the coin. The plasma emission was collected by an optical fiber system connected to an Echelle spectrometer. The capability of LIBS to remove small fractions of material was exploited, and the coins were analyzed by ablating layer by layer from the patina to the bulk. The experimental conditions needed to assure reproducibility were determined by evaluating three plasma parameters: the ionization temperature using a Saha-Boltzmann plot, the excitation temperature using a Boltzmann plot, and the plasma density using a Saha-Boltzmann plot and Stark broadening. The Calibration-Free LIBS (CF-LIBS) technique was applied to both coins and the analytical determination of the elemental composition was performed. In order to confirm the Edict Law elemental composition, the results were corroborated by Neutron Activation Analysis (NAA). In both cases the results determined by CF-LIBS agreed with the Edict Law and the NAA determination. Besides the major components of the bronze alloy, some other impurities were observed. Finally, in order to determine the damage made to the coin by the laser, the OCT (Optical Coherence Tomography) technique was used. After three laser pulses, 54 μg of coin material was removed, reaching 120 μm in depth.
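The Boltzmann-plot step used in CF-LIBS works as follows: for each emission line, ln(Iλ/(gA)) plotted against the upper-level energy E falls on a line of slope -1/(kT), so the excitation temperature comes from a least-squares slope. The synthetic lines below are generated at a known temperature purely to show the recovery; this is a generic sketch, not the authors' code.

```python
import math

# Boltzmann-plot temperature: y = ln(I * lambda / (g * A)) vs upper-level
# energy E is linear with slope -1/(kT).  Synthetic ordinates at T ~ 1 eV.

K_B_EV = 8.617333262e-5  # Boltzmann constant, eV/K

def boltzmann_temperature(energies_ev, y_values):
    """Least-squares slope of y vs E, converted to temperature in kelvin."""
    n = len(energies_ev)
    mx = sum(energies_ev) / n
    my = sum(y_values) / n
    slope = sum((e - mx) * (y - my) for e, y in zip(energies_ev, y_values)) / \
        sum((e - mx) ** 2 for e in energies_ev)
    return -1.0 / (K_B_EV * slope)

t_true = 11604.5                  # K, roughly 1 eV
energies = [2.0, 3.5, 5.0, 6.5]   # eV, upper-level energies (illustrative)
ys = [-e / (K_B_EV * t_true) for e in energies]  # ideal Boltzmann ordinates
t_fit = boltzmann_temperature(energies, ys)
# t_fit recovers t_true from the noiseless synthetic lines
```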

  20. Functional traits help predict post-disturbance demography of tropical trees.

    PubMed

    Flores, Olivier; Hérault, Bruno; Delcamp, Matthieu; Garnier, Éric; Gourlet-Fleury, Sylvie

    2014-01-01

    How tropical tree species respond to disturbance is a central issue of forest ecology, conservation and resource management. We define a hierarchical model to investigate how functional traits measured in control plots relate to the population change rate and to demographic rates for recruitment and mortality after disturbance by logging operations. Population change and demographic rates were quantified over a 12-year period after disturbance and related to seven functional traits measured in control plots. The model was calibrated using a Bayesian network approach on 53 species surveyed in permanent forest plots (37.5 ha) at Paracou in French Guiana. The network analysis allowed us to highlight both direct and indirect relationships among predictive variables. Overall, 89% of the interspecific variability in the population change rate after disturbance was explained by the two demographic rates, the recruitment rate being the most explanatory variable. Three direct drivers explained 45% of the variability in recruitment rates: leaf phosphorus concentration, with a positive effect, and seed size and wood density, with negative effects. Mortality rates were explained by interspecific variability in maximum diameter only (25%). Wood density, leaf nitrogen concentration, maximum diameter and seed size were not explained by variables in the analysis and thus appear as independent drivers of post-disturbance demography. Relationships between functional traits and demographic parameters were consistent with results found in undisturbed forests. Functional traits measured in control conditions can thus help predict the fate of tropical tree species after disturbance. Indirect relationships also suggest how different processes interact to mediate species' demographic response.

  1. Data for Figures in Rainfall-induced release of microbes from manure: model development, parameter estimation, and uncertainty evaluation on small plots

    EPA Pesticide Factsheets

    Figure 1. Ratio of cumulative released cells to cells initially present in the manure at Week 0, as they vary by time, manure type and age, microbe, and Event (i.e., season). The 95% confidence intervals of the observed median number of cells in microbial runoff are shown as the shaded area.
    Figure 2. Typical observed and simulated cumulative microbial runoff for Plots A403 and C209 with individual plot calibration.
    Figure 3. Observed versus simulated microbial runoff associated with Approach 1, adjusted for cumulative results by manure type and Event. Results accounted for counts associated with the field monitoring time intervals described in Section 2.1, Field method. NS = Nash-Sutcliffe modeling efficiency, EC = E. coli, En = enterococci, FC = fecal coliforms.
    Figure 4. Ratio of cumulative released cells/mass to cells/mass initially present in the aged manure, by time and component (e.g., microbe), for solid manure (a) and (b), and for amended, dry litter, and slurry manure (c). Solid lines (Equation (11)) correspond to values in Table 3 for solid manure, and for dry litter and slurry manure, respectively: (a) uses individual b values, and (b) and (c) use the combined values for b. Bounds of the first and third quartiles are associated with the present study's results for cattle, and for poultry and swine. The full color versions of all figures are available in the online version of this paper, at ht

  2. Combined Yamamoto approach for simultaneous estimation of adsorption isotherm and kinetic parameters in ion-exchange chromatography.

    PubMed

    Rüdt, Matthias; Gillet, Florian; Heege, Stefanie; Hitzler, Julian; Kalbfuss, Bernd; Guélat, Bertrand

    2015-09-25

    Application of model-based design is appealing to support the development of protein chromatography in the biopharmaceutical industry. However, the required efforts for parameter estimation are frequently perceived as time-consuming and expensive. In order to speed up this work, a new parameter estimation approach for modelling ion-exchange chromatography under linear conditions was developed. It aims at reducing the time and protein demand for model calibration. The method combines the estimation of kinetic and thermodynamic parameters based on the simultaneous variation of the gradient slope and the residence time in a set of five linear gradient elutions. The parameters are estimated from a Yamamoto plot and a gradient-adjusted Van Deemter plot. The combined approach increases the information extracted per experiment compared to the individual methods. As a proof of concept, the combined approach was successfully applied to a monoclonal antibody on a cation-exchanger and to an Fc-fusion protein on an anion-exchange resin. The individual parameter estimations for the mAb confirmed that the new approach maintains the accuracy of the usual Yamamoto and Van Deemter plots. In the second case, offline size-exclusion chromatography was performed in order to estimate the thermodynamic parameters of an impurity (high-molecular-weight species) simultaneously with the main product. Finally, the parameters obtained from the combined approach were used in a lumped kinetic model to simulate the chromatography runs. The simulated chromatograms obtained for a wide range of gradient lengths and residence times showed only small deviations from the experimental data. Copyright © 2015 Elsevier B.V. All rights reserved.
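The Yamamoto plot underlying the combined approach is a log-log linearization of gradient-elution data. A minimal sketch using the commonly cited form of the relation (the gradient-adjusted Van Deemter part of the method is not reproduced, and the data here are hypothetical):

```python
import numpy as np

def yamamoto_fit(gh, i_elu):
    """Fit the linearized Yamamoto relation
        log10(GH) = (nu + 1)*log10(I_R) - log10(K*(nu + 1)),
    where GH is the normalized gradient slope and I_R the salt
    concentration at peak elution. Returns the characteristic
    charge nu and the equilibrium constant K."""
    slope, intercept = np.polyfit(np.log10(i_elu), np.log10(gh), 1)
    nu = slope - 1.0
    k_eq = 10.0**(-intercept) / (nu + 1.0)
    return nu, k_eq
```

Each of the five linear gradient elutions contributes one (GH, I_R) point, so the fit uses exactly the data the abstract describes.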

  3. Bare soil erosion modelling with rainfall simulations: experiments on crop and recently burned areas

    NASA Astrophysics Data System (ADS)

    Catani, F.; Menci, S.; Moretti, S.; Keizer, J.

    2006-12-01

    The use of numerical models is of fundamental importance for understanding and predicting soil erosion. At the basis of the calibration process of such models are direct measurements of the governing parameters, carried out during field or laboratory tests. Rainfall simulations can be used to measure and model soil erosion, allowing the reproduction of design rainfall with chosen intensity and duration. The main parameters that rainfall simulators can measure are hydraulic conductivity, soil erodibility parameters, the rate and features of splash erosion, the discharge coefficient, and sediment yield. Other important parameters can be estimated during the rainfall simulations using photogrammetric instruments able to record high-definition stereographic models of the soil plot under analysis at different time steps. In this research, rainfall simulator experiments (RSE) were conducted to measure and quantify runoff and erosion processes on selected bare-soil plots. The selected plots are located in vineyards, olive groves, and crops in central Italy, and in recently burned areas in north-central Portugal that were affected by a wildfire in early July 2005 and, at the time, largely covered by commercial eucalypt plantations. On the Italian crop sites, the rainfall intensities and durations were chosen on the basis of previous knowledge of the selected test areas. The procedure consisted of an initial phase of soil wetting followed by three erosion cycles. The first reproduces the effects of a normal rainfall with a return time of 2 years (23 mm/h); the second represents a serious episode with a return time of 10 years (34 mm/h); the third aims to reproduce and understand the effects of an intense precipitation event with a return time of 50 years (41 mm/h). During the vineyard experiments, photogrammetric surveys were carried out as well.
    In the Portuguese burned areas, to measure the influence of rain intensity, two rainfall simulations were carried out simultaneously, one with an intensity of 45 mm/h and one with 85 mm/h. In both cases, the soil and vegetation cover were described and soil samples were taken before the experiments. During the simulations, samples of the material leaving the parcels were taken at suitable time intervals to measure sediment yield and runoff. The RSE data are intended to provide a sufficient basis for erosion modelling at the small-plot scale and, through upscaling, for predicting erosion rates at the slope scale. For this purpose two soil erosion models, WEPP and MEFIDIS, were selected and compared. The comparison showed a certain degree of uncertainty in numerical erosion prediction, due to the nonlinearity of overland erosion processes and to technical and conceptual difficulties, including data collection. In the subsequent laboratory phase, high-resolution (2 by 2 mm) DEMs of the vineyard plots are being produced for each meaningful processing phase. The digital elevation models will then be analysed to assess calibration parameters such as soil roughness (expressed by the standard deviation of elevations, fractal dimension, and local relief energy), soil and sediment transfer (hypsometric curves, local elevation and volume differences), and rill network evolution (Horton ordering, stream lengths, contributing area, drainage density, Hack's law).

  4. Somewhere Between Great and Small: Disentangling the Conceptual Jumble of Middle, Regional, and Niche Powers

    DTIC Science & Technology

    2013-01-01

    and levels of corruption, as well as more ephemeral soft power considerations like national reputation, moral clout, and cultural influence.7 For...of carefully calibrated issues that balance underlying national interests and plausible opportunities for exerting influence. Middle power diplomacy...World (University Park: Pennsylvania State University Press, 1997); Björn Hettne, András Inotai and Osvaldo Sunkel, eds., Globalism and the New

  5. Fully in Silico Calibration of Empirical Predictive Models for Environmental Fate Properties of Novel Munitions Compounds

    DTIC Science & Technology

    2016-04-01

    57) ASTM Standard E 2552 (2008) Standard guide for assessing the environmental and human health impacts of new energetic compounds; ASTM...Project ER-1735 APRIL 2016 Paul G. Tratnyek Alexandra J. Salter-Blanc Oregon Health & Science University Eric J. Bylaska Kurt R...order NEB Nudged Elastic Band NMR Nuclear Magnetic Resonance NOM Natural Organic Matter OHSU Oregon Health & Science University PCM Polarizable

  6. GNSS-Based Space Weather Systems Including COSMIC Ionospheric Measurements

    NASA Technical Reports Server (NTRS)

    Komjathy, Attila; Mandrake, Lukas; Wilson, Brian; Iijima, Byron; Pi, Xiaoqing; Hajj, George; Mannucci, Anthony J.

    2006-01-01

    The presentation outline includes University Corporation for Atmospheric Research (UCAR) and Jet Propulsion Laboratory (JPL) product comparisons, assimilating ground-based global positioning satellites (GPS) and COSMIC into JPL/University of Southern California (USC) Global Assimilative Ionospheric Model (GAIM), and JPL/USC GAIM validation. The discussion of comparisons examines Abel profiles and calibrated TEC. The JPL/USC GAIM validation uses Arecibo ISR, Jason-2 VTEC, and Abel profiles.

  7. Understory composition of hardwood stands in north central West Virginia

    Treesearch

    M.J. Twery

    1991-01-01

    Understory composition was measured on 960 10.5-m² plots in 16 stands on the West Virginia University Forest in north-central West Virginia. The overstory composition was dominated by oaks (Quercus spp.) on 50% of the stands and by a mixture of oaks and yellow-poplar (Liriodendron tulipifera L.) on the other 50%. All...

  8. Russell, Henry Norris (1877-1957)

    NASA Astrophysics Data System (ADS)

    Murdin, P.

    2000-11-01

    Astronomer, born in Oyster Bay, NY, spent nearly all his life working at Princeton University. He spectroscopically studied eclipsing binary stars to determine the masses of their component stars. At first collaborating with the British astronomer Hinks at Cambridge, he started to measure stellar parallaxes and, plotting the absolute magnitudes of stars whose distance he had thus measured, agains...

  9. Teaching Fundamental Skills in Microsoft Excel to First-Year Students in Quantitative Analysis

    ERIC Educational Resources Information Center

    Rubin, Samuel J.; Abrams, Binyomin

    2015-01-01

    Despite their technological savvy, most students entering university lack the necessary computer skills to succeed in a quantitative analysis course, in which they are often expected to input, analyze, and plot results of experiments without any previous formal education in Microsoft Excel or similar programs. This lack of formal education results…

  10. El Niño linked to increase in childhood diarrheal disease, a leading cause of premature death

    NASA Astrophysics Data System (ADS)

    Showstack, Randy

    William Checkley recalled plotting into his computer the data for the number of hospital admissions and the time of year. "When we did the analysis and started looking at the relative increase, that's when it hit us," said Checkley, a medical student at Johns Hopkins University in Maryland.

  11. Wildfire and Oak Regeneration at the Urban Fringe

    Treesearch

    Joan L. Schwan; Herb Fong; Hilary K. Hug

    1997-01-01

    In July 1992, wildfire burned 500 acres of rural lands owned by Stanford University. Within the fire zone are five plots, ranging in size from 0.1 acre to more than 1 acre, on which nearly 600 naturally established juvenile California oaks (Q. agrifolia, Q. douglasii, and Q. lobata) have been monitored since 1990. Surveys following...

  12. The Lolium pathotype of Magnaporthe oryzae recovered from a single blasted wheat plant in the United States

    USDA-ARS?s Scientific Manuscript database

    Wheat blast is a devastating disease that was first identified in Brazil and has subsequently spread to surrounding countries in South America. In May 2011, disease scouting in a University of Kentucky wheat trial plot in Princeton, Kentucky identified a single plant with disease symptoms that diffe...

  13. Management of sheath blight and narrow brown leaf spot with biocontrol agents in organic rice, 2010

    USDA-ARS?s Scientific Manuscript database

    The experiment was established in a field of League-type soil (3% sand, 32% silt, and 64% clay) under organic management for many years at the Texas A&M University System's AgriLife Research and Extension Center, Beaumont. Plots consisted of seven 18-ft rows spaced 7 inches apart. There ...

  14. A microplate assay to measure classical and alternative complement activity.

    PubMed

    Puissant-Lubrano, Bénédicte; Fortenfant, Françoise; Winterton, Peter; Blancher, Antoine

    2017-05-01

    We developed and validated a kinetic microplate hemolytic assay (HA) to quantify classical and alternative complement activity in a single dilution of human plasma or serum. The assay is based on monitoring hemolysis of sensitized sheep (or uncoated rabbit) red blood cells by means of a 96-well microplate reader. The activity of the calibrator was evaluated by reference to 200 healthy adults. The conversion of 50% hemolysis time into a percentage of activity was obtained using a calibration curve plotted daily. The linearity of the assay as well as interference (by hemolysis, bilirubinemia and lipemia) was assessed for the classical pathway (CP). The within-day and between-day precision was satisfactory relative to the performance of commercially available liposome immunoassay (LIA) and ELISA. Patients with hereditary or acquired complement deficiencies were detected (measured activity <30%). We also provide a reference range obtained from 200 blood donors. The agreement of CP evaluated on samples from 48 patients was 94% with LIA and 87.5% with ELISA. The sensitivity of our assay was better than that of LIA, and the cost was lower than either LIA or ELISA. In addition, this assay was less time-consuming than previously reported HAs. It allows the simultaneous measurement of 36 samples in duplicate per run of a 96-well plate. The use of a daily calibration curve allows standardization of the method and leads to good reproducibility. The same technique was also adapted for the quantification of alternative pathway (AP) activity.

  15. BAT3 Analyzer: Real-Time Data Display and Interpretation Software for the Multifunction Bedrock-Aquifer Transportable Testing Tool (BAT3)

    USGS Publications Warehouse

    Winston, Richard B.; Shapiro, Allen M.

    2007-01-01

    The BAT3 Analyzer provides real-time display and interpretation of fluid pressure responses and flow rates measured during geochemical sampling, hydraulic testing, or tracer testing conducted with the Multifunction Bedrock-Aquifer Transportable Testing Tool (BAT3) (Shapiro, 2007). Real-time display of the data collected with the Multifunction BAT3 allows the user to ensure that the downhole apparatus is operating properly, and that test procedures can be modified to correct for unanticipated hydraulic responses during testing. The BAT3 Analyzer can apply calibrations to the pressure transducer and flow meter data to display physically meaningful values. Plots of the time-varying data can be formatted for a specified time interval, and either saved to files, or printed. Libraries of calibrations for the pressure transducers and flow meters can be created, updated and reloaded to facilitate the rapid set up of the software to display data collected during testing with the Multifunction BAT3. The BAT3 Analyzer also has the functionality to estimate calibrations for pressure transducers and flow meters using data collected with the Multifunction BAT3 in conjunction with corroborating check measurements. During testing with the Multifunction BAT3, and also after testing has been completed, hydraulic properties of the test interval can be estimated by comparing fluid pressure responses with model results; a variety of hydrogeologic conceptual models of the formation are available for interpreting fluid-withdrawal, fluid-injection, and slug tests.
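Estimating a transducer or flow-meter calibration from corroborating check measurements, as described above, can be illustrated with a simple least-squares gain/offset fit. This is a sketch only; the BAT3 Analyzer's actual calibration procedure is not specified in this summary:

```python
import numpy as np

def linear_calibration(raw, reference):
    """Least-squares gain/offset calibration: reference ~ gain*raw + offset.
    `raw` are instrument readings, `reference` the corresponding check
    measurements. Returns the gain, the offset, and a function that
    applies the calibration to new readings."""
    gain, offset = np.polyfit(raw, reference, 1)
    return gain, offset, lambda r: gain * np.asarray(r) + offset
```

A calibration library, as the text describes, would simply store these (gain, offset) pairs per sensor so they can be reloaded for later tests.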

  16. Variability in Predictions from Online Tools: A Demonstration Using Internet-Based Melanoma Predictors.

    PubMed

    Zabor, Emily C; Coit, Daniel; Gershenwald, Jeffrey E; McMasters, Kelly M; Michaelson, James S; Stromberg, Arnold J; Panageas, Katherine S

    2018-02-22

    Prognostic models are increasingly being made available online, where they can be publicly accessed by both patients and clinicians. These online tools are an important resource for patients to better understand their prognosis and for clinicians to make informed decisions about treatment and follow-up. The goal of this analysis was to highlight the possible variability in multiple online prognostic tools in a single disease. To demonstrate the variability in survival predictions across online prognostic tools, we applied a single validation dataset to three online melanoma prognostic tools. Data on melanoma patients treated at Memorial Sloan Kettering Cancer Center between 2000 and 2014 were retrospectively collected. Calibration was assessed using calibration plots and discrimination was assessed using the C-index. In this demonstration project, we found important differences across the three models that led to variability in individual patients' predicted survival across the tools, especially in the lower range of predictions. In a validation test using a single-institution data set, calibration and discrimination varied across the three models. This study underscores the potential variability both within and across online tools, and highlights the importance of using methodological rigor when developing a prognostic model that will be made publicly available online. The results also reinforce that careful development and thoughtful interpretation, including understanding a given tool's limitations, are required in order for online prognostic tools that provide survival predictions to be a useful resource for both patients and clinicians.
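The C-index used above to assess discrimination can be computed directly from its definition: among comparable pairs, the fraction where the subject who fails earlier has the higher predicted risk. A minimal O(n²) sketch of Harrell's concordance for right-censored data:

```python
def concordance_index(time, event, risk):
    """Harrell's C: a pair (i, j) is comparable when subject i has an
    observed event strictly before time j; it is concordant when i also
    has the higher predicted risk. Ties in risk count as 0.5."""
    num = den = 0.0
    n = len(time)
    for i in range(n):
        for j in range(n):
            if event[i] and time[i] < time[j]:
                den += 1
                if risk[i] > risk[j]:
                    num += 1.0
                elif risk[i] == risk[j]:
                    num += 0.5
    return num / den
```

A value of 0.5 corresponds to no discrimination and 1.0 to perfect ranking, which is the scale on which the three online tools were compared.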

  17. External validation of a Cox prognostic model: principles and methods

    PubMed Central

    2013-01-01

    Background A prognostic model should not enter clinical practice unless it has been demonstrated that it performs a useful role. External validation denotes evaluation of model performance in a sample independent of that used to develop the model. Unlike for logistic regression models, external validation of Cox models is sparsely treated in the literature. Successful validation of a model means achieving satisfactory discrimination and calibration (prediction accuracy) in the validation sample. Validating Cox models is not straightforward because event probabilities are estimated relative to an unspecified baseline function. Methods We describe statistical approaches to external validation of a published Cox model according to the level of published information, specifically (1) the prognostic index only, (2) the prognostic index together with Kaplan-Meier curves for risk groups, and (3) the first two plus the baseline survival curve (the estimated survival function at the mean prognostic index across the sample). The most challenging task, requiring level 3 information, is assessing calibration, for which we suggest a method of approximating the baseline survival function. Results We apply the methods to two comparable datasets in primary breast cancer, treating one as derivation and the other as validation sample. Results are presented for discrimination and calibration. We demonstrate plots of survival probabilities that can assist model evaluation. Conclusions Our validation methods are applicable to a wide range of prognostic studies and provide researchers with a toolkit for external validation of a published Cox model. PMID:23496923
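With level-3 information (the baseline survival curve, i.e. the estimated survival function at the mean prognostic index), individual survival predictions in the validation sample follow from the proportional-hazards form of the Cox model. A minimal sketch, assuming the published curve s0(t) and the subjects' prognostic indices are available:

```python
import numpy as np

def predicted_survival(s0, pi, pi_mean):
    """Predict each subject's survival curve from the published baseline:
        S(t | x) = s0(t) ** exp(PI_x - PI_mean),
    where s0(t) is the survival curve at the mean prognostic index.
    Returns an array of shape (n_subjects, n_timepoints)."""
    s0 = np.asarray(s0, dtype=float)
    shift = np.exp(np.asarray(pi, dtype=float) - pi_mean)
    return s0[np.newaxis, :] ** shift[:, np.newaxis]
```

Comparing these predicted curves with Kaplan-Meier estimates in risk groups of the validation sample is the calibration check the methods above describe.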

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cernoch, Antonin; Soubusta, Jan; Celechovska, Lucie

    We report on experimental implementation of the optimal universal asymmetric 1->2 quantum cloning machine for qubits encoded into polarization states of single photons. Our linear-optical machine performs asymmetric cloning by partially symmetrizing the input polarization state of signal photon and a blank copy idler photon prepared in a maximally mixed state. We show that the employed method of measurement of mean clone fidelities exhibits strong resilience to imperfect calibration of the relative efficiencies of single-photon detectors used in the experiment. Reliable characterization of the quantum cloner is thus possible even when precise detector calibration is difficult to achieve.

  19. A Monte Carlo modeling alternative for the API Gamma Ray Calibration Facility.

    PubMed

    Galford, J E

    2017-04-01

    The gamma ray pit at the API Calibration Facility, located on the University of Houston campus, defines the API unit for natural gamma ray logs used throughout the petroleum logging industry. Future use of the facility is uncertain. An alternative method is proposed to preserve the gamma ray API unit definition as an industry standard by using Monte Carlo modeling to obtain accurate counting rate-to-API unit conversion factors for gross-counting and spectral gamma ray tool designs. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Rainfall simulation experiments in ecological and conventional vineyards.

    NASA Astrophysics Data System (ADS)

    Adrian, Alexander; Brings, Christine; Rodrigo Comino, Jesús; Iserloh, Thomas; Ries, Johannes B.

    2015-04-01

    In October 2014, Trier University started a measurement series that defines, compares, and evaluates the behavior of runoff and soil erosion under different farming practices in vineyards. The research area is located in Kanzem, a traditional wine village in the Saar Valley (Rhineland-Palatinate, Germany). The test fields represent different cultivation methods: ecological (with natural vegetation cover under and around the vines) and conventionally cultivated rows of vines. Using the small portable rainfall simulator of Trier University, it shall be tested whether the assumption holds that there is more runoff and soil erosion in the conventional part of the tillage system than in the ecological part. The rainfall simulations assess the generation of overland flow, soil erosion, and infiltration, so that trends in soil erosion and runoff under the different cultivation techniques can be noted. The objective of this work is to compare the geomorphological dynamics of the two tillage systems. Therefore, 30 rainfall simulation plots were evenly distributed on a west-facing hillside with different slope angles (8-25°), vegetation covers, and stone covers. Specifically, the plot surfaces range from strongly covered soil across lithoidal surfaces to bare soil, often with lanes compacted by the machines typically used. In addition, using the collected substrate, an estimate of the grain-size distribution of the eroded material shall be given; the eroded substrate is compared to soil samples from the test plots. The first results have shown that there is slightly more runoff and soil erosion in the ecological area than in the conventional part of the vineyard.

  1. Mapping Invasive Plant Species with a Combination of Field and Remote Sensing Data

    NASA Astrophysics Data System (ADS)

    Skowronek, S.; Feilhauer, H.; Van De Kerchove, R.; Ewald, M.; Aerts, R.; Somers, B.; Warrie, J.; Kempeneers, P.; Lenoir, J.; Honnay, O.; Asner, G. P.; Schmidtlein, S.; Hattab, T.; Rocchini, D.

    2015-12-01

    Advanced hyperspectral and LiDAR data offer great potential to map and monitor invasive plant species and their impact on ecosystems. These species are often difficult to detect over large areas with traditional mapping approaches. One challenge is combining the remote sensing data with field data for calibration and validation. Therefore, our goals were to (1) develop an approach that allows species invasions to be mapped efficiently from presence-only data of the target species together with remote sensing data, and (2) use this approach to create distribution maps for invasive plant species in two study areas in western Europe, providing the basis for further analysis of the impact of the invasions and for inferring possible management options. For this purpose, on the island of Sylt in northern Germany, we collected vegetation data on 120 plots of 3 m x 3 m with different cover fractions of two invasive plant species, the moss Campylopus introflexus and the shrub Rosa rugosa. In the forest of Compiègne in northern France, we sampled a total of 50 plots of 25 m x 25 m, targeting the invasive tree Prunus serotina. In both study areas, independent validation datasets containing presence and absence points of the target species were collected. Airborne hyperspectral data (APEX), acquired for both study areas in summer 2014, provided 285 spectral bands covering the visible, near-infrared and short-wave infrared regions with pixel sizes of 1.8 and 3 m. First results showed that mapping using one-class classifiers is possible: for C. introflexus, the AUC was 0.89 and the OAC 0.72; for R. rugosa, the AUC was 0.93 and the OAC 0.92. For both species, however, a few areas were mapped incorrectly. Possible explanations are the different appearance of the target species in biotope types underrepresented in the calibration data, and a high cover of species with similar reflectance properties.

  2. Groundwater flow and solute transport modelling from within R: Development of the RMODFLOW and RMT3DMS packages.

    NASA Astrophysics Data System (ADS)

    Rogiers, Bart

    2015-04-01

    In recent years, an increasing number of contributed R packages has become available in the field of hydrology. Hydrological time series analysis packages, lumped conceptual rainfall-runoff models, distributed hydrological models, weather generators, and different calibration and uncertainty estimation methods are all available, as are a few packages for solving partial differential equations. Subsurface hydrological modelling is, however, still seldom performed in R, or with codes interfaced with R, despite the fact that excellent geostatistical packages, model calibration/inversion options and state-of-the-art visualization libraries are available. Moreover, other popular scientific programming languages like MATLAB and Python have packages for pre- and post-processing files of MODFLOW (Harbaugh 2005) and MT3DMS (Zheng 2010) models. To fill this gap, we present here the development versions of the RMODFLOW and RMT3DMS packages, which allow pre- and post-processing of MODFLOW and MT3DMS input and output files from within R. File reading and writing functions are currently available for several packages, and plotting functions are foreseen making use of the ggplot2 package (a plotting system based on the grammar of graphics; Wickham 2009). The S3 generic-function object-oriented programming style is used for this. An example is provided, making modifications to an existing model and visualizing the model output. References: Harbaugh, A. (2005). MODFLOW-2005: The US Geological Survey Modular Ground-water Model, the Ground-water Flow Process. U.S. Geological Survey Techniques and Methods 6-A16 (p. 253). Wickham, H. (2009). ggplot2: elegant graphics for data analysis. Springer New York. Zheng, C. (2010). MT3DMS v5.3, a modular three-dimensional multispecies transport model for simulation of advection, dispersion and chemical reactions of contaminants in groundwater systems. Supplemental User's Guide (p. 56).

  3. External validation of a prehospital risk score for critical illness.

    PubMed

    Kievlan, Daniel R; Martin-Gill, Christian; Kahn, Jeremy M; Callaway, Clifton W; Yealy, Donald M; Angus, Derek C; Seymour, Christopher W

    2016-08-11

    Identification of critically ill patients during prehospital care could facilitate early treatment and aid in the regionalization of critical care. Tools to consistently identify those in the field with or at higher risk of developing critical illness do not exist. We sought to validate a prehospital critical illness risk score that uses objective clinical variables in a contemporary cohort of geographically and temporally distinct prehospital encounters. We linked prehospital encounters at 21 emergency medical services (EMS) agencies to inpatient electronic health records at nine hospitals in southwestern Pennsylvania from 2010 to 2012. The primary outcome was critical illness during hospitalization, defined as an intensive care unit stay with delivery of organ support (mechanical ventilation or vasopressor use). We calculated the prehospital risk score using demographics and first vital signs from eligible EMS encounters, and we tested the association between score variables and critical illness using multivariable logistic regression. Discrimination was assessed using the area under the receiver operating characteristic curve (AUROC), and calibration was determined by plotting observed versus expected events across score values. Operating characteristics were calculated at score thresholds. Among 42,550 nontrauma, non-cardiac arrest adult EMS patients, 1926 (4.5 %) developed critical illness during hospitalization. We observed moderate discrimination of the prehospital critical illness risk score (AUROC 0.73, 95 % CI 0.72-0.74) and adequate calibration based on observed versus expected plots. At a score threshold of 2, sensitivity was 0.63 (95 % CI 0.61-0.75), specificity was 0.73 (95 % CI 0.72-0.73), negative predictive value was 0.98 (95 % CI 0.98-0.98), and positive predictive value was 0.10 (95 % CI 0.09-0.10). The risk score performance was greater with alternative definitions of critical illness, including in-hospital mortality (AUROC 0.77, 95 % CI 0.7-0.78). In an external validation cohort, a prehospital risk score using objective clinical data had moderate discrimination for critical illness during hospitalization.
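    The evaluation described above pairs discrimination (AUROC) with calibration (observed versus expected events across score values). A minimal sketch of both metrics on a synthetic cohort; the data, bin count, and event rates below are invented for illustration and are not the authors' implementation.

```python
import random

def auroc(labels, scores):
    """Mann-Whitney formulation of the area under the ROC curve:
    the probability that a randomly chosen positive outranks a
    randomly chosen negative (ties count one half)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def calibration_table(labels, probs, bins=5):
    """Observed vs. expected event counts per predicted-probability bin."""
    rows = []
    for b in range(bins):
        lo, hi = b / bins, (b + 1) / bins
        idx = [i for i, p in enumerate(probs) if lo <= p < hi]
        if idx:
            rows.append((lo, hi,
                         sum(labels[i] for i in idx),     # observed events
                         sum(probs[i] for i in idx)))     # expected events
    return rows

# Synthetic cohort: each patient's true event probability equals the
# model's predicted probability, so calibration should look adequate.
random.seed(0)
probs = [random.random() * 0.3 for _ in range(2000)]
labels = [1 if random.random() < p else 0 for p in probs]

print(f"AUROC = {auroc(labels, probs):.2f}")
for lo, hi, obs, exp in calibration_table(labels, probs):
    print(f"  p in [{lo:.1f}, {hi:.1f}): observed {obs}, expected {exp:.1f}")
```

    Plotting the observed against the expected column (identity line = perfect calibration) gives the observed-versus-expected plot the abstract refers to.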

  4. Tests of Sunspot Number Sequences: 3. Effects of Regression Procedures on the Calibration of Historic Sunspot Data

    NASA Astrophysics Data System (ADS)

    Lockwood, M.; Owens, M. J.; Barnard, L.; Usoskin, I. G.

    2016-11-01

    We use sunspot-group observations from the Royal Greenwich Observatory (RGO) to investigate the effects of intercalibrating data from observers with different visual acuities. The tests are made by counting the number of groups [RB] above a variable cut-off threshold of observed total whole spot area (uncorrected for foreshortening) to simulate what a lower-acuity observer would have seen. The synthesised annual means of RB are then re-scaled to the full observed RGO group number [RA] using a variety of regression techniques. It is found that a very high correlation between RA and RB (r_{AB} > 0.98) does not prevent large errors in the intercalibration (for example sunspot-maximum values can be over 30 % too large even for such levels of r_{AB}). In generating the backbone sunspot number [R_{BB}], Svalgaard and Schatten (Solar Phys., 2016) force regression fits to pass through the scatter-plot origin, which generates unreliable fits (the residuals do not form a normal distribution) and causes sunspot-cycle amplitudes to be exaggerated in the intercalibrated data. It is demonstrated that the use of Quantile-Quantile ("Q-Q") plots to test for a normal distribution is a useful indicator of erroneous and misleading regression fits. Ordinary least-squares linear fits, not forced to pass through the origin, are sometimes reliable (although the optimum method is shown to be different when matching peak and average sunspot-group numbers). However, other fits are only reliable if non-linear regression is used. From these results, it is possible that the inflation of solar-cycle amplitudes in the backbone group sunspot number as one goes back in time, relative to related solar-terrestrial parameters, is entirely caused by the use of inappropriate and non-robust regression techniques to calibrate the sunspot data.
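    The core regression pitfall is easy to reproduce: when one observer's counts differ from another's by an additive offset, forcing the fit through the origin inflates the slope, and hence the intercalibrated amplitudes, even though the correlation stays very high. A minimal sketch with synthetic observer data; the offset of 4 groups, the noise level, and the sample size are invented for illustration.

```python
import random

def fit_free(x, y):
    """Ordinary least squares y = a + b*x (intercept free)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

def fit_origin(x, y):
    """Least squares forced through the origin: y = b*x."""
    return sum(xi * yi for xi, yi in zip(x, y)) / sum(xi * xi for xi in x)

# Hypothetical observer pair: the low-acuity observer systematically
# misses ~4 groups, so the true relation is RA = 4 + 1.0*RB plus noise.
random.seed(1)
rb = [random.uniform(0, 30) for _ in range(200)]
ra = [4 + g + random.gauss(0, 1) for g in rb]

a, b_free = fit_free(rb, ra)
b_forced = fit_origin(rb, ra)
print(f"free fit:   RA = {a:.2f} + {b_free:.2f}*RB")
print(f"forced fit: RA = {b_forced:.2f}*RB  (slope inflated by the ignored offset)")
```

    The forced-origin residuals are also systematically signed (negative at low RB, positive at high RB), which is exactly the kind of non-normal residual structure a Q-Q plot exposes.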

  5. The Use of Transfer Radiometers in Validating the Visible through Shortwave Infrared Calibrations of Radiance Sources Used by Instruments in NASA's Earth Observing System

    NASA Technical Reports Server (NTRS)

    Butler, James J.; Barnes, Robert A.

    2002-01-01

    The detection and study of climate change over a time frame of decades requires successive generations of satellite, airborne, and ground-based instrumentation carefully calibrated against a common radiance scale. In NASA's Earth Observing System (EOS) program, the pre-launch radiometric calibration of these instruments in the wavelength region from 400 nm to 2500 nm is accomplished using internally illuminated integrating spheres and diffuse reflectance panels illuminated by irradiance standard lamps. Since 1995, the EOS Calibration Program operating within the EOS Project Science Office (PSO) has enlisted the expertise of national standards laboratories and government and university metrology laboratories in an effort to validate the radiance scales assigned to sphere and panel radiance sources by EOS instrument calibration facilities. This state-of-the-art program has been accomplished using ultra-stable transfer radiometers independently calibrated by the participating institutions. In ten comparisons since February 1995, the agreement between the radiance measurements of the transfer radiometers is plus or minus 1.80% at 411 nm, plus or minus 1.31% at 552.5 nm, plus or minus 1.32% at 868.0 nm, plus or minus 2.54% at 1622 nm, and plus or minus 2.81% at 2200 nm (1 sigma).

  6. Behavioral responses of cotton mice (Peromyscus gossypinus) to large amounts of coarse woody debris.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hinkleman, Travis M.

    Hinkleman, Travis M. 2004. MS Thesis. Clemson University, Clemson, South Carolina. 62 pp. Coarse woody debris (CWD) is any log, snag, or downed branch >10 cm in diameter. As a major structural feature of forest ecosystems, CWD serves as an important habitat component for a variety of organisms. Rodents frequently use CWD for travel routes and daytime refugia. Although rodents are known to use CWD extensively and selectively, the use and selection of CWD by rodents may vary according to the abundance of CWD. The purpose of this project was to determine the effect of CWD abundance on the habitat use patterns of a common terrestrial rodent, the cotton mouse (Peromyscus gossypinus). I tracked cotton mice with fluorescent pigments and radiotelemetry in 6 plots, situated in loblolly pine (Pinus taeda) stands, with manipulated levels of woody debris. Treatment plots had 6x the amount of woody debris as control plots. I determined log use and movement patterns from the paths produced by powder-tracking, and I identified daytime refugia by radio-tracking. Travel along logs was almost exclusively associated with the surface of logs (91%). The proportion of a movement path associated with logs was not the best predictor of path complexity; rather, the sex of the individual was the only significant indicator of relative displacement (i.e., males moved farther from the point of release than females) and vegetation cover was the only significant predictor of mean turning angle (i.e., increasing vegetation cover yielded more convoluted paths). Mice used logs to a greater extent on treatment plots (23.7%) than mice on control plots (4.8%). Mice on treatment plots used logs with less decay, less ground contact, and more bark than logs used by mice on control plots. Differences in log use patterns were largely a result of the attributes of available logs, but mice used logs selectively on treatment plots. Refuges were highly associated with woody debris, including refuges in rotting stumps (65%), root boles (13%), brush piles (8%), and logs (7%). Mice used different frequencies of refuge types between treatments; root bole and brush pile refuges were used more on treatment plots whereas stump and log refuges were used more on control plots. Refuge type, log volume, and tree basal area were significant predictors of refuge selection on control plots whereas refuge type and size were significant predictors of refuge selection on treatment plots. Refuges were significantly more dispersed on treatment plots. Mice used refuges more intensely and switched refuges less in the winter than the summer, regardless of woody debris abundance. The extensive and selective use of logs by cotton mice suggests that logs may be an important resource. However, logs are not a critical habitat component. Over half of the paths on control plots were not associated with logs, and logs were used infrequently as refuges. Nonetheless, refuges were highly associated with woody debris (e.g., stumps, root boles), which suggests that woody debris may be a critical habitat component.

  7. Long-term effects of land application of class B biosolids on the soil microbial populations, pathogens, and activity.

    PubMed

    Zerzghi, Huruy; Gerba, Charles P; Brooks, John P; Pepper, Ian L

    2010-01-01

    This study evaluated the influence of 20 annual land applications of Class B biosolids on the soil microbial community. The potential benefits and hazards of land application were evaluated by analysis of surface soil samples collected following the 20th land application of biosolids. The study was initiated in 1986 at the University of Arizona Marana Agricultural Center, 21 miles north of Tucson, AZ. The final application of biosolids was in March 2005, followed by growth of cotton (Gossypium hirsutum L.) from April through November 2005. Surface soil samples (0-30 cm) were collected monthly from March 2005, 2 wk after the final biosolids application, through December 2005, and analyzed for soil microbial numbers. December samples were analyzed for additional soil microbial properties. Data show that land application of Class B biosolids had no significant long-term effect on indigenous soil microbial numbers including bacteria, actinomycetes, and fungi compared to unamended control plots. Importantly, no bacterial or viral pathogens were detected in soil samples collected from biosolid amended plots in December (10 mo after the last land application) demonstrating that pathogens introduced via Class B biosolids only survived in soil transiently. However, plots that received biosolids had significantly higher microbial activity or potential for microbial transformations, including nitrification, sulfur oxidation, and dehydrogenase activity, than control plots and plots receiving inorganic fertilizers. Overall, the 20 annual land applications showed no long-term adverse effects, and therefore, this study documents that land application of biosolids at this particular site was sustainable throughout the 20-yr period, with respect to soil microbial properties.

  8. Monitoring middle-atmospheric water vapor over Seoul by using a 22 GHz ground-based radiometer SWARA

    NASA Astrophysics Data System (ADS)

    Ka, Soohyun; de Wachter, Evelyn; Kaempfer, Niklaus; Oh, Jung Jin

    2010-10-01

    Water vapor is the strongest natural greenhouse gas in the atmosphere. It is most abundant in the troposphere at low altitudes, due to evaporation at the ocean surface, with maximum values of around 6 g/kg. The amount of water vapor reaches a minimum at the tropopause and increases again in the middle atmosphere through oxidation of methane and vertical transport. Water vapor has both positive and negative effects on global warming, so its role in climate change needs to be studied by monitoring water vapor concentrations in the middle atmosphere. In this paper, we focus on the 22 GHz ground-based radiometer SWARA (Seoul Water vapor Radiometer), which has been operated at Sookmyung Women's University in Seoul, Korea since October 2006. It is a joint project of the University of Bern, Switzerland, and Sookmyung Women's University, Seoul, South Korea. SWARA receives the 22.235 GHz emission line spontaneously emitted by water vapor and down-converts it to 1.5 GHz with a +/- 0.5 GHz bandwidth at 61 kHz resolution. Because the signal at the ground is very weak (~0.1 K), careful calibration is needed to measure the 22.235 GHz water vapor spectrum precisely; for SWARA, we have used the balancing and tipping-curve calibration methods. To retrieve the water vapor profile, we have applied the ARTS and Qpack software packages. In this paper, we present the calibration methods and the water vapor variation over Seoul over the last 4 years.

  9. An IMU-to-Body Alignment Method Applied to Human Gait Analysis.

    PubMed

    Vargas-Valencia, Laura Susana; Elias, Arlindo; Rocon, Eduardo; Bastos-Filho, Teodiano; Frizera, Anselmo

    2016-12-10

    This paper presents a novel calibration procedure as a simple, yet powerful, method to place and align inertial sensors with body segments. The calibration can be easily replicated without the need of any additional tools. The proposed method is validated in three different applications: a computer mathematical simulation; a simplified joint composed of two semi-spheres interconnected by a universal goniometer; and a real gait test with five able-bodied subjects. Simulation results demonstrate that, after the calibration method is applied, the joint angles are correctly measured independently of previous sensor placement on the joint, thus validating the proposed procedure. In the cases of a simplified joint and a real gait test with human volunteers, the method also performs correctly, although secondary plane errors appear when compared with the simulation results. We believe that such errors are caused by limitations of the current inertial measurement unit (IMU) technology and fusion algorithms. In conclusion, the presented calibration procedure is an interesting option to solve the alignment problem when using IMUs for gait analysis.

  10. Students' challenges with polar functions: covariational reasoning and plotting in the polar coordinate system

    NASA Astrophysics Data System (ADS)

    Habre, Samer

    2017-01-01

    Covariational reasoning has been the focus of many studies, but only a few have looked into this reasoning in the polar coordinate system. In fact, research on students' familiarity with polar coordinates and graphing in the polar coordinate system is scarce. This paper examines the challenges that students face when plotting polar curves using the corresponding plot in the Cartesian plane. In particular, it examines how students coordinate the covariation in the polar coordinate system with the covariation in the Cartesian one. The research, conducted in a sophomore-level calculus class at an American university operating in Lebanon, also investigates the challenges students face when synchronizing reasoning between the two coordinate systems. For this, the mental actions that students engage in when performing covariational tasks are examined. Results show that students readily coordinated the value of one polar variable with changes in the other. Coordinating the direction of change of one variable with changes in the other variable was more challenging, especially when the radial distance r is negative.

  11. Omega flight-test data reduction sequence. [computer programs for reduction of navigation data

    NASA Technical Reports Server (NTRS)

    Lilley, R. W.

    1974-01-01

    Computer programs for Omega data conversion, summary, and preparation for distribution are presented. Program logic and sample data formats are included, along with operational instructions for each program. Flight data (or data collected in flight format in the laboratory) is provided by the Ohio University Omega receiver base in the form of 6-bit binary words representing the phase of an Omega station with respect to the receiver's local clock. All eight Omega stations are measured in each 10-second Omega time frame. In addition, an event-marker bit and a time-slot D synchronizing bit are recorded. Program FDCON is used to remove data from the flight recorder tape and place it on data-processing cards for later use. Program FDSUM provides for computer plotting of selected LOP's, for single-station phase plots, and for printout of basic signal statistics for each Omega channel. Mean phase and standard deviation are printed, along with data from which a phase distribution can be plotted for each Omega station. Program DACOP simply copies the Omega data deck a controlled number of times, for distribution to users.

  12. Precision Mapping of the California Connected Vehicle Testbed Corridor

    DOT National Transportation Integrated Search

    2015-11-01

    In this project the University of California Riverside mapping sensor hardware was successfully mounted on an instrumented vehicle to map a segment of the California Connected Vehicle testbed corridor on State Route 82. After calibrating the sensor p...

  13. JUPITER PROJECT - MERGING INVERSE PROBLEM FORMULATION TECHNOLOGIES

    EPA Science Inventory

    The JUPITER (Joint Universal Parameter IdenTification and Evaluation of Reliability) project seeks to enhance and build on the technology and momentum behind two of the most popular sensitivity analysis, data assessment, calibration, and uncertainty analysis programs used in envi...

  14. Simulation of size-exclusion chromatography distribution coefficients of comb-shaped molecules in spherical pores comparison of simulation and experiment.

    PubMed

    Radke, Wolfgang

    2004-03-05

    Simulations of the distribution coefficients of linear polymers and regular combs with various spacings between the arms have been performed. The distribution coefficients were plotted as a function of the number of segments in order to compare the size-exclusion chromatography (SEC) elution behavior of combs relative to linear molecules. By comparing the simulated SEC calibration curves it is possible to predict the elution behavior of comb-shaped polymers relative to linear ones. In order to compare the results of the computer simulations with experimental data, a variety of comb-shaped polymers varying in side-chain length, spacing between the side chains, and backbone molecular weight were analyzed by SEC with light-scattering detection. The computer simulations predicted the molecular weights of linear molecules having the same retention volume with an accuracy of about 10%; i.e., the error in the molecular weight of a comb polymer calculated from a calibration curve constructed with linear standards and the simulation results is of the same magnitude as the experimental error of absolute molecular weight determination.
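    The comparison rests on the conventional SEC calibration curve, in which log10 M is (locally) linear in elution volume for the linear standards. A toy sketch, with invented calibration coefficients and elution volumes, of why a comb polymer's later elution (smaller hydrodynamic size at equal molar mass) yields an underestimated apparent molar mass on a linear-standard calibration:

```python
def sec_calibration(ve, a=10.0, b=0.25):
    """Illustrative linear SEC calibration: log10(M) = a - b*Ve.
    The coefficients a and b are hypothetical, as if fitted from
    a set of linear standards (Ve in mL, M in g/mol)."""
    return 10 ** (a - b * ve)

# A comb polymer elutes later than a linear chain of the same true molar
# mass, so reading its mass off the linear calibration underestimates it.
ve_linear, ve_comb = 16.0, 16.4   # hypothetical elution volumes, mL
m_apparent_linear = sec_calibration(ve_linear)
m_apparent_comb = sec_calibration(ve_comb)
print(f"apparent M (linear chain):        {m_apparent_linear:.0f} g/mol")
print(f"apparent M (comb, same true M):   {m_apparent_comb:.0f} g/mol")
```

    The simulations in the abstract quantify exactly this shift, so that a correction can be predicted rather than measured.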

  15. Ion-selective electrodes in potentiometric titrations; a new method for processing and evaluating titration data.

    PubMed

    Granholm, Kim; Sokalski, Tomasz; Lewenstam, Andrzej; Ivaska, Ari

    2015-08-12

    A new method to convert the potential of an ion-selective electrode to concentration or activity in potentiometric titration is proposed. The advantage of this method is that the electrode standard potential and the slope of the calibration curve do not have to be known. Instead, two activities on the titration curve have to be estimated, e.g., the starting activity before the titration begins and the activity at the end of the titration in the presence of a large excess of titrant. This new method is beneficial when the analyte is in a complex matrix or a harsh environment that affects the properties of the electrode, so that the traditional calibration procedure with standard solutions cannot be used. The new method was implemented both in a linearization method based on the Gran plot and in the determination of the stability constant of a complex and the concentration of the complexing ligand in the sample. The new method gave accurate results with titration data from experiments with samples of known composition and with a real, harsh industrial black liquor sample. A complexometric titration model was also developed. Copyright © 2015 Elsevier B.V. All rights reserved.
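    The two-point idea behind the method, replacing the standard-solution calibration with two activities estimated on the titration curve itself, can be sketched for an ideally Nernstian electrode. All numbers below are invented; real electrodes deviate from the ideal slope, which is precisely why estimating both parameters in situ is attractive.

```python
from math import log10

def two_point_ise(e1, a1, e2, a2):
    """Solve E = E0 + S*log10(a) from two known (potential, activity)
    pairs, e.g. the start of the titration and the end in excess
    titrant. Returns a function converting any measured potential
    (mV) to activity."""
    s = (e2 - e1) / (log10(a2) - log10(a1))   # slope, mV per decade
    e0 = e1 - s * log10(a1)                    # standard potential, mV
    return lambda e: 10 ** ((e - e0) / s)

# Hypothetical cation-selective electrode with a near-Nernstian slope:
to_activity = two_point_ise(e1=100.0, a1=1e-3, e2=159.2, a2=1e-2)
print(f"activity at E = 129.6 mV: {to_activity(129.6):.2e}")
```

    Once every measured potential maps to an activity this way, the converted titration data can feed a Gran-type linearization directly.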

  16. Data-base development for water-quality modeling of the Patuxent River basin, Maryland

    USGS Publications Warehouse

    Fisher, G.T.; Summers, R.M.

    1987-01-01

    Procedures and rationale used to develop a data base and data management system for the Patuxent Watershed Nonpoint Source Water Quality Monitoring and Modeling Program of the Maryland Department of the Environment and the U.S. Geological Survey are described. A detailed data base and data management system has been developed to facilitate modeling of the watershed for water quality planning purposes; statistical analysis; plotting of meteorologic, hydrologic, and water quality data; and geographic data analysis. The system is Maryland's prototype for development of a basinwide water quality management program. A key step in the program is to build a calibrated and verified water quality model of the basin using the Hydrological Simulation Program--FORTRAN (HSPF) hydrologic model, which has been used extensively in large-scale basin modeling. The compilation of the substantial existing data base for preliminary calibration of the basin model is described, including meteorologic, hydrologic, and water quality data from federal and state data bases and a geographic information system containing digital land use and soils data. The data base development is significant in its application of an integrated, uniform approach to data base management and modeling.

  17. Dioxin analysis by gas chromatography-Fourier transform ion cyclotron resonance mass spectrometry (GC-FTICRMS).

    PubMed

    Taguchi, Vince Y; Nieckarz, Robert J; Clement, Ray E; Krolik, Stefan; Williams, Robert

    2010-11-01

    The feasibility of utilizing a gas chromatograph-tandem quadrupole-Fourier transform ion cyclotron resonance mass spectrometer (GC-MS/MS-FTICRMS) to analyze chlorinated-dioxins/furans (CDDs/CDFs) and mixed halogenated dioxins/furans (HDDs/HDFs) was investigated by operating the system in the GC-FTICRMS mode. CDDs/CDFs and mixed HDDs/HDFs could be analyzed at 50,000 to 100,000 resolving power (RP) on the capillary gas chromatographic time scale. Initial experiments demonstrated that 1 pg of 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD) and 5 pg of 2-bromo-3,7,8-trichlorodibenzo-p-dioxin (BTrCDD) could be detected. The feasibility of utilizing an FTICRMS for screening of CDDs/CDFs, HDDs/HDFs and related compounds was also investigated by analyzing an extract from vegetation exposed to fall-out from an industrial fire. CDDs/CDFs, chlorinated pyrenes and chlorinated tetracenes could be detected from a Kendrick plot analysis of the ultrahigh resolution mass spectra. Mass accuracies were of the order of 0.5 ppm on standards with external mass calibration and 1 ppm on a sample with internal mass calibration. Copyright © 2010 American Society for Mass Spectrometry. Published by Elsevier Inc. All rights reserved.
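    A Kendrick plot rescales exact masses so that a chosen repeat unit (classically CH2) has exactly integer mass; members of a homologous series then share the same mass defect and line up horizontally. A minimal CH2-based sketch; note that sign conventions for the defect vary between groups, and the two masses below are approximate values for a pair of CH2 homologues, given for illustration only.

```python
def kendrick_mass_defect(exact_mass, base_exact=14.01565, base_nominal=14):
    """CH2-based Kendrick scale: rescale so one CH2 unit weighs exactly
    14 Da, then take the defect of the Kendrick mass relative to its
    nearest integer. CH2 homologues share (almost) the same defect."""
    km = exact_mass * base_nominal / base_exact
    return round(km) - km

# Two CH2 homologues (illustrative approximate exact masses) fall at
# nearly the same vertical position on a Kendrick plot:
for mass in (282.3288, 296.3445):
    print(f"m = {mass}: KMD = {kendrick_mass_defect(mass):+.4f}")
```

    Halogenated series such as the CDDs/CDFs of the abstract are handled the same way, optionally with a different base unit chosen for the series of interest.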

  18. Qualitative and quantitative analysis of an additive element in metal oxide nanometer film using laser induced breakdown spectroscopy.

    PubMed

    Xiu, Junshan; Liu, Shiming; Sun, Meiling; Dong, Lili

    2018-01-20

    The photoelectric performance of metal ion-doped TiO2 film improves with changes in the compositions and concentrations of additive elements. In this work, TiO2 films doped with different Sn concentrations were obtained with the hydrothermal method. Qualitative and quantitative analysis of the Sn element in TiO2 film was achieved with laser induced breakdown spectroscopy (LIBS), with calibration curves plotted accordingly. The photoelectric characteristics of TiO2 films doped with different Sn contents were observed with UV-visible absorption spectra and J-V curves. All results showed that Sn doping red-shifts the optical absorption and improves the photoelectric properties of the TiO2 films. When the concentration of Sn doping in the TiO2 films was 11.89 mmol/L, as calculated from the LIBS calibration curves, the current density of the film was largest, indicating the best photoelectric performance. This shows that LIBS is a feasible method for qualitative and quantitative analysis of additive elements in metal oxide nanometer films.

  19. The application of Near-Infrared Reflectance Spectroscopy (NIRS) to detect melamine adulteration of soya bean meal.

    PubMed

    Haughey, Simon A; Graham, Stewart F; Cancouët, Emmanuelle; Elliott, Christopher T

    2013-02-15

    Soya bean products are used widely in the animal feed industry as a protein-based feed ingredient and have been found to be adulterated with melamine, as highlighted in the Chinese scandal of 2008. Dehulled soya (GM and non-GM), soya hulls, and toasted soya were contaminated with melamine, and spectra were generated using near-infrared reflectance spectroscopy (NIRS). By applying chemometrics to the spectral data, excellent calibration models and prediction statistics were obtained. The coefficients of determination (R²) were found to be 0.89-0.99, depending on the mathematical algorithm used, the data pre-processing applied, and the sample type. The corresponding values for the root mean square error of calibration and prediction were 0.081-0.276% and 0.134-0.368%, respectively, again depending on the chemometric treatment applied to the data and the sample type. In addition, adopting a qualitative approach with the spectral data and applying PCA, it was possible to discriminate between the four sample types and also, by generating Coomans plots, to distinguish between adulterated and non-adulterated samples. Copyright © 2012 Elsevier Ltd. All rights reserved.

  20. Acropora interbranch skeleton Sr/Ca ratios: Evaluation of a potential new high-resolution paleothermometer

    NASA Astrophysics Data System (ADS)

    Sadler, James; Nguyen, Ai D.; Leonard, Nicole D.; Webb, Gregory E.; Nothdurft, Luke D.

    2016-04-01

    The majority of coral geochemistry-based paleoclimate reconstructions in the Indo-Pacific are conducted on selectively cored colonies of massive Porites. This restriction to a single genus may make it difficult to amass the required paleoclimate data for studies that require deep reef coring techniques. Acropora, however, is a highly abundant coral genus in both modern and fossil reef systems and displays potential as a novel climate archive. Here we present a calibration study for Sr/Ca ratios recovered from interbranch skeleton in corymbose Acropora colonies from Heron Reef, southern Great Barrier Reef. Significant intercolony differences in absolute Sr/Ca ratios were normalized by producing anomaly plots of both coral geochemistry and instrumental water temperature records. Weighted linear regression of these anomalies from the lagoon and fore-reef slope provides a sensitivity of -0.05 mmol/mol °C⁻¹, with a correlation coefficient (r² = 0.65) comparable to those of genera currently used in paleoclimate reconstructions. Reconstructions of lagoon and reef slope mean seasonality in water temperature accurately identify the greater seasonal amplitude observed in the lagoon of Heron Reef. A longer calibration period is, however, required for reliable reconstructions of annual mean water temperatures.
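    Under the reported sensitivity of -0.05 mmol/mol per °C, converting a Sr/Ca anomaly to a temperature anomaly is a one-line inversion of the regression slope. A sketch; the example anomaly value is invented for illustration.

```python
def srca_anomaly_to_temp(srca_anomaly, sensitivity=-0.05):
    """Invert the regression sensitivity (mmol/mol per deg C) to turn a
    Sr/Ca anomaly (mmol/mol, relative to the colony mean) into a
    water-temperature anomaly (deg C). The negative slope means lower
    Sr/Ca implies warmer water."""
    return srca_anomaly / sensitivity

# A -0.10 mmol/mol Sr/Ca anomaly implies ~2 deg C warmer than the mean:
print(f"{srca_anomaly_to_temp(-0.10):+.1f} deg C")
```

    Working in anomalies rather than absolute ratios is what lets the single slope be shared across colonies with different absolute Sr/Ca levels.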

  1. Using Active Learning for Speeding up Calibration in Simulation Models.

    PubMed

    Cevik, Mucahit; Ergun, Mehmet Ali; Stout, Natasha K; Trentham-Dietz, Amy; Craven, Mark; Alagoz, Oguzhan

    2016-07-01

    Most cancer simulation models include unobservable parameters that determine disease onset and tumor growth. These parameters play an important role in matching key outcomes such as cancer incidence and mortality, and their values are typically estimated via a lengthy calibration procedure, which involves evaluating a large number of combinations of parameter values via simulation. The objective of this study is to demonstrate how machine learning approaches can be used to accelerate the calibration process by reducing the number of parameter combinations that are actually evaluated. Active learning is a popular machine learning method that enables a learning algorithm such as artificial neural networks to interactively choose which parameter combinations to evaluate. We developed an active learning algorithm to expedite the calibration process. Our algorithm determines the parameter combinations that are more likely to produce desired outputs and therefore reduces the number of simulation runs performed during calibration. We demonstrate our method using the previously developed University of Wisconsin breast cancer simulation model (UWBCS). In a recent study, calibration of the UWBCS required the evaluation of 378 000 input parameter combinations to build a race-specific model, and only 69 of these combinations produced results that closely matched observed data. By using the active learning algorithm in conjunction with standard calibration methods, we identify all 69 parameter combinations by evaluating only 5620 of the 378 000 combinations. Machine learning methods hold potential in guiding model developers in the selection of more promising parameter combinations and hence speeding up the calibration process. Applying our machine learning algorithm to one model shows that evaluating only 1.49% of all parameter combinations would be sufficient for the calibration. © The Author(s) 2015.
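    The gist of the approach is to let a cheap surrogate model decide which parameter combinations deserve a real (expensive) simulation run, so that only a small fraction of the grid is ever simulated. A toy sketch: everything here (the stand-in simulator, the acceptance tolerance, the 1-nearest-neighbor surrogate, the batch sizes) is invented for illustration; the paper itself uses artificial neural networks with the UWBCS model.

```python
import random

def simulator(p):
    """Stand-in for an expensive simulation run: returns a model output
    (e.g. a predicted-incidence discrepancy) for parameter combination p."""
    x, y = p
    return (x - 0.3) ** 2 + (y - 0.7) ** 2

def acceptable(output, tol=0.02):
    """Does the simulated output closely match the observed data?"""
    return output < tol

random.seed(2)
pool = [(random.random(), random.random()) for _ in range(2000)]

# Seed the surrogate with a small random batch of true evaluations.
evaluated = {p: acceptable(simulator(p)) for p in random.sample(pool, 50)}

def surrogate_score(p):
    """1-NN surrogate: squared distance to the nearest accepted
    evaluated point (smaller = more promising candidate)."""
    accepted = [q for q, ok in evaluated.items() if ok]
    if not accepted:
        return random.random()   # nothing learned yet: explore randomly
    return min((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 for q in accepted)

# Active-learning loop: only the most promising candidates get simulated.
for _ in range(10):
    batch = sorted((p for p in pool if p not in evaluated),
                   key=surrogate_score)[:20]
    evaluated.update({p: acceptable(simulator(p)) for p in batch})

hits = sum(evaluated.values())
total_hits = sum(acceptable(simulator(p)) for p in pool)
print(f"found {hits} of {total_hits} acceptable combinations "
      f"with {len(evaluated)} of {len(pool)} evaluations")
```

    The design choice mirrors the paper's result: the surrogate concentrates the expensive runs near the acceptance region, so most matching combinations are found while simulating only a small percentage of the pool.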

  2. Using Active Learning for Speeding up Calibration in Simulation Models

    PubMed Central

    Cevik, Mucahit; Ali Ergun, Mehmet; Stout, Natasha K.; Trentham-Dietz, Amy; Craven, Mark; Alagoz, Oguzhan

    2015-01-01

    Background Most cancer simulation models include unobservable parameters that determine disease onset and tumor growth. These parameters play an important role in matching key outcomes such as cancer incidence and mortality, and their values are typically estimated via a lengthy calibration procedure, which involves evaluating a large number of combinations of parameter values via simulation. The objective of this study is to demonstrate how machine learning approaches can be used to accelerate the calibration process by reducing the number of parameter combinations that are actually evaluated. Methods Active learning is a popular machine learning method that enables a learning algorithm such as artificial neural networks to interactively choose which parameter combinations to evaluate. We develop an active learning algorithm to expedite the calibration process. Our algorithm determines the parameter combinations that are more likely to produce desired outputs and therefore reduces the number of simulation runs performed during calibration. We demonstrate our method using the previously developed University of Wisconsin Breast Cancer Simulation Model (UWBCS). Results In a recent study, calibration of the UWBCS required the evaluation of 378,000 input parameter combinations to build a race-specific model, and only 69 of these combinations produced results that closely matched observed data. By using the active learning algorithm in conjunction with standard calibration methods, we identify all 69 parameter combinations by evaluating only 5620 of the 378,000 combinations. Conclusion Machine learning methods hold potential in guiding model developers in the selection of more promising parameter combinations and hence speeding up the calibration process. Applying our machine learning algorithm to one model shows that evaluating only 1.49% of all parameter combinations would be sufficient for the calibration. PMID:26471190

  3. Extinction, seeing and sky transparency monitoring at the Observatorio Astrofísico de Javalambre for J-PAS and J-PLUS calibration and scheduling

    NASA Astrophysics Data System (ADS)

    Vázquez Ramió, H.; Díaz-Martín, M. C.; Varela, J.; Ederoclite, A.; Maícas, N.; Lamadrid, J. L.; Abril, J.; Iglesias-Marzoa, R.; Rodríguez, S.; Tilve, V.; Cenarro, A. J.; Antón Bravo, J. L.; Bello Ferrer, R.; Cristóbal-Hornillos, D.; Guillén Civera, L.; Hernández-Fuertes, J.; Jiménez Mejías, D.; Lasso-Cabrera, N. M.; López Alegre, G.; López Sainz, A.; Luis-Simoes, R. M.; Marín-Franch, A.; Moles, M.; Rueda-Teruel, F.; Rueda-Teruel, S.; Suárez López, O.; Yanes-Díaz, A.

    2015-05-01

    The Javalambre-Physics of the Accelerating Universe Astrophysical Survey (J-PAS; see Benítez et al. 2014) and the Javalambre-Photometric Local Universe Survey (J-PLUS) will be conducted at the brand-new Observatorio Astrofísico de Javalambre (OAJ) in Teruel, Spain. J-PLUS is planned to start by the first half of 2015, while J-PAS first light is expected during 2015. Besides the two main telescopes (with 2.5 m and 80 cm apertures), several smaller facilities at the OAJ are devoted to site characterization and to supporting measurements used to calibrate the J-PAS and J-PLUS photometry and to feed the OAJ's Sequencer with the integrated seeing and the sky transparency. These instruments are: i) an extinction monitor, an 11" telescope that estimates the atmospheric extinction and ultimately the OAJ extinction curve, the first step in the overall J-PAS photometric calibration procedure; ii) an 8" telescope implementing the Differential Image Motion Monitor (DIMM) technique to obtain the integrated seeing; and iii) an All-Sky Transmission MONitor (ASTMON), a roughly all-sky instrument providing the sky transparency as well as the sky brightness and the atmospheric extinction.

  4. A Universal Threshold for the Assessment of Load and Output Residuals of Strain-Gage Balance Data

    NASA Technical Reports Server (NTRS)

    Ulbrich, N.; Volden, T.

    2017-01-01

    A new universal residual threshold for the detection of load and gage output residual outliers of wind tunnel strain-gage balance data was developed. The threshold works with both the Iterative and Non-Iterative Methods that are used in the aerospace testing community to analyze and process balance data. It also supports all known load and gage output formats that are traditionally used to describe balance data. The threshold's definition is based on an empirical electrical constant. First, the constant is used to construct a threshold for the assessment of gage output residuals. Then, the related threshold for the assessment of load residuals is obtained by multiplying the empirical electrical constant by the sum of the absolute values of all first partial derivatives of a given load component. The empirical constant equals 2.5 microV/V for the assessment of balance calibration or check load data residuals. A value of 0.5 microV/V is recommended for the evaluation of repeat point residuals because, by design, the calculation of these residuals removes errors that are associated with the regression analysis of the data itself. Data from a calibration of a six-component force balance are used to illustrate the application of the new threshold definitions to real-world balance calibration data.
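    The two-step threshold construction can be written out directly. The partial-derivative values below are made-up numbers standing in for a real balance's sensitivities, not data from the report:

```python
# Threshold construction as described above: the empirical electrical
# constant sets the gage-output threshold directly, and the load threshold
# for a component scales it by the sum of |first partial derivatives| of
# that load with respect to the outputs. Derivative values are illustrative.
EMPIRICAL_CONSTANT = 2.5      # microV/V, calibration or check-load residuals
REPEAT_POINT_CONSTANT = 0.5   # microV/V, repeat-point residuals

# Hypothetical first partial derivatives of one load component with respect
# to the six gage outputs (units: lbs per microV/V).
partials = [0.81, 0.05, 0.02, 0.40, 0.03, 0.11]

output_threshold = EMPIRICAL_CONSTANT                                # microV/V
load_threshold = EMPIRICAL_CONSTANT * sum(abs(d) for d in partials)  # lbs

print(f"gage output residual threshold: {output_threshold} microV/V")
print(f"load residual threshold: {load_threshold:.2f} lbs")
```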

  5. On-orbit characterization of hyperspectral imagers

    NASA Astrophysics Data System (ADS)

    McCorkel, Joel

    The Remote Sensing Group (RSG) at the University of Arizona has a long history of using ground-based test sites for the calibration of airborne and satellite-based sensors. Ground-truth measurements at these test sites are not always possible, owing to weather and funding availability. Therefore, RSG has also employed automated ground instrument approaches and cross-calibration methods to verify the radiometric calibration of a sensor. The goal of the cross-calibration method is to transfer the calibration of a well-known sensor to a different sensor. This dissertation presents a method for determining the radiometric calibration of a hyperspectral imager using multispectral imagery. The work relies on a multispectral sensor, the Moderate-resolution Imaging Spectroradiometer (MODIS), as a reference for the hyperspectral sensor Hyperion. Test sites used for comparisons are Railroad Valley in Nevada and a portion of the Libyan Desert in North Africa. A method to predict hyperspectral surface reflectance using a combination of MODIS data and spectral shape information is developed and applied for the characterization of Hyperion. Spectral shape information is based on RSG's historical in situ data for the Railroad Valley test site and spectral library data for the Libyan test site. Average atmospheric parameters, also based on historical measurements, are used in reflectance prediction and transfer to space. Results of several cross-calibration scenarios that differ in image acquisition coincidence, test site, and reference sensor are found for the characterization of Hyperion. These are compared with results from the reflectance-based approach of vicarious calibration, a well-documented method developed by the RSG that serves as a baseline for calibration performance for the cross-calibration method developed here. Cross-calibration provides results that are within 2% of those of reflectance-based results in most spectral regions. 
Larger disagreements exist for shorter wavelengths studied in this work as well as in spectral areas that experience absorption by the atmosphere.
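    One concrete step in such a cross-calibration is band-averaging a hyperspectral reflectance spectrum with the reference sensor's relative spectral response so the two sensors can be compared band by band. The spectrum and the Gaussian MODIS-like band response below are synthetic placeholders, not data from this work:

```python
# Band-averaging a hyperspectral spectrum over a multispectral band response.
import numpy as np

wl = np.arange(400.0, 1001.0, 10.0)                  # nm, hyperspectral grid
refl = 0.2 + 0.25 / (1 + np.exp(-(wl - 700) / 40))   # synthetic surface spectrum

center, width = 645.0, 25.0                          # hypothetical band (nm)
rsr = np.exp(-0.5 * ((wl - center) / width) ** 2)    # relative spectral response

# Response-weighted mean reflectance as seen by the multispectral band.
band_refl = np.sum(rsr * refl) / np.sum(rsr)
print(f"band-averaged reflectance at {center:.0f} nm: {band_refl:.4f}")
```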

  6. Validity of Torque-Data Collection at Multiple Sites: A Framework for Collaboration on Clinical-Outcomes Research in Sports Medicine.

    PubMed

    Kuenze, Christopher; Eltouhky, Moataz; Thomas, Abbey; Sutherlin, Mark; Hart, Joseph

    2016-05-01

    Collecting torque data using a multimode dynamometer is common in sports-medicine research. The error in torque measurements across multiple sites and dynamometers has not been established. To assess the validity of 2 calibration protocols across 3 dynamometers and the error associated with torque measurement for each system. Observational study. 3 university laboratories at separate institutions. 2 Biodex System 3 dynamometers and 1 Biodex System 4 dynamometer. System calibration was completed using the manufacturer-recommended single-weight method and an experimental calibration method using a series of progressive weights. Both calibration methods were compared with a manually calculated theoretical torque across a range of applied weights. Relative error, absolute error, and percent error were calculated at each weight. Each outcome variable was compared between systems using 95% confidence intervals across low (0-65 Nm), moderate (66-110 Nm), and high (111-165 Nm) torque categorizations. Calibration coefficients were established for each system using both calibration protocols. However, within each system the calibration coefficients generated using the single-weight (System 4 = 2.42 [0.90], System 3a = 1.37 [1.11], System 3b = -0.96 [1.45]) and experimental calibration protocols (System 4 = 3.95 [1.08], System 3a = -0.79 [1.23], System 3b = 2.31 [1.66]) were similar and displayed acceptable mean relative error compared with calculated theoretical torque values. Overall, percent error was greatest for all 3 systems in low-torque conditions (System 4 = 11.66% [6.39], System 3a = 6.82% [11.98], System 3b = 4.35% [9.49]). The System 4 significantly overestimated torque across all 3 weight increments, and the System 3b overestimated torque over the moderate-torque increment. 
Conversion of raw voltage to torque values using the single-calibration-weight method is valid and comparable to a more complex multiweight calibration process; however, it is clear that calibration must be done for each individual system to ensure accurate data collection.
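    The error metrics named in the abstract reduce to a few lines. The theoretical/measured torque pairs below are made-up values for illustration, not the study's data:

```python
# Relative, absolute, and percent error of measured vs. theoretical torque,
# with each pair assigned to the abstract's low/moderate/high categories.
measurements = [          # (theoretical Nm, measured Nm) -- illustrative
    (50.0, 52.1),
    (90.0, 91.2),
    (140.0, 141.0),
]

rows = []
for theo, meas in measurements:
    relative = meas - theo                 # signed error
    absolute = abs(relative)
    percent = 100.0 * absolute / theo
    band = "low" if theo <= 65 else "moderate" if theo <= 110 else "high"
    rows.append((band, relative, absolute, percent))
    print(f"{theo:6.1f} Nm ({band}): rel {relative:+.2f}, abs {absolute:.2f}, pct {percent:.2f}%")
```

    Percent error being largest in the low-torque band, as reported above, falls out naturally: the same absolute error is divided by a smaller theoretical torque.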

  7. TweezPal - Optical tweezers analysis and calibration software

    NASA Astrophysics Data System (ADS)

    Osterman, Natan

    2010-11-01

    Optical tweezers, a powerful tool for optical trapping, micromanipulation and force transduction, have in recent years become a standard technique commonly used in many research laboratories and university courses. Knowledge about the optical force acting on a trapped object can be gained only after a calibration procedure, which has to be performed (by an expert) for each type of trapped object. In this paper we present TweezPal, a user-friendly, standalone Windows software tool for optical tweezers analysis and calibration. Using TweezPal, the procedure can be performed in a matter of minutes even by non-expert users. The calibration is based on the Brownian motion of a particle trapped in a stationary optical trap, which is being monitored using video or photodiode detection. The particle trajectory is imported into the software, which instantly calculates the position histogram, trapping potential, stiffness and anisotropy.
    Program summary
    Program title: TweezPal
    Catalogue identifier: AEGR_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGR_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 44 891
    No. of bytes in distributed program, including test data, etc.: 792 653
    Distribution format: tar.gz
    Programming language: Borland Delphi
    Computer: Any PC running Microsoft Windows
    Operating system: Windows 95, 98, 2000, XP, Vista, 7
    RAM: 12 Mbytes
    Classification: 3, 4.14, 18, 23
    Nature of problem: Quick, robust and user-friendly calibration and analysis of optical tweezers. The optical trap is calibrated from the trajectory of a trapped particle undergoing Brownian motion in a stationary optical trap (input data) using two methods.
    Solution method: Elimination of the experimental drift in position data. Direct calculation of the trap stiffness from the positional variance. Calculation of the 1D optical trapping potential from the positional distribution of data points. Trap stiffness calculation by fitting a parabola to the trapping potential. Presentation of X-Y positional density for close inspection of the 2D trapping potential. Calculation of the trap anisotropy.
    Running time: Seconds
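    The two calibration routes listed in the solution method can be reproduced on a synthetic trajectory: trap stiffness from the positional variance via equipartition, and the trapping potential from the position histogram (Boltzmann inversion) followed by a parabola fit. All numbers below are synthetic, not TweezPal output:

```python
# Optical-trap calibration from a Brownian trajectory, two ways.
import numpy as np

kB, T = 1.380649e-23, 295.0              # Boltzmann constant (J/K), temp (K)
k_true = 1.0e-6                          # N/m, hypothetical trap stiffness
sigma = np.sqrt(kB * T / k_true)         # expected positional spread (m)

rng = np.random.default_rng(1)
x = rng.normal(0.0, sigma, 200_000)      # synthetic trapped-bead positions

# (i) Equipartition: k = kB*T / var(x).
k_var = kB * T / np.var(x)

# (ii) Boltzmann inversion: U(x)/(kB*T) = -ln p(x) up to a constant; a
# parabola fit then recovers the stiffness from the quadratic coefficient.
counts, edges = np.histogram(x, bins=60, range=(-3 * sigma, 3 * sigma), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
mask = counts > 0
xs = centers[mask] * 1e6                 # micrometers, for a well-scaled fit
u = -np.log(counts[mask])                # potential in units of kB*T
a, b, c = np.polyfit(xs, u, 2)           # u ≈ a*xs^2 + b*xs + c
k_fit = 2.0 * a * kB * T * 1e12          # back to N/m (xs was in µm)

print(f"true {k_true:.2e}, variance {k_var:.2e}, potential fit {k_fit:.2e} N/m")
```

    Rescaling positions to micrometers before the fit keeps the Vandermonde matrix well-conditioned; fitting directly in meters (x ~ 1e-7) would make `polyfit` numerically fragile.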

  8. Effect of the improved accelerometer calibration method on AIUB's GRACE monthly gravity field solution

    NASA Astrophysics Data System (ADS)

    Jean, Yoomin; Meyer, Ulrich; Arnold, Daniel; Bentel, Katrin; Jäggi, Adrian

    2017-04-01

    The monthly global gravity field solutions derived using the measurements from the GRACE (Gravity Recovery and Climate Experiment) satellites have been continuously improved by the processing centers. One of the improvements in the processing method is a more detailed calibration of the on-board accelerometers in the GRACE satellites. The accelerometer data calibration is usually restricted to the scale factors and biases. It has been assumed that the three different axes are perfectly orthogonal in the GRACE science reference frame. Recently, it was shown by Klinger and Mayer-Gürr (2016) that a fully-populated scale matrix considering the non-orthogonality of the axes and the misalignment of the GRACE science reference frame and the GRACE accelerometer frame improves the quality of the C20 coefficient in the GRACE monthly gravity field solutions. We investigate the effect of the more detailed calibration of the GRACE accelerometer data on the C20 coefficient in the case of the AIUB (Astronomical Institute of the University of Bern) processing method using the Celestial Mechanics Approach. We also investigate the effect of the new calibration parameters on the stochastic parameters in the Celestial Mechanics Approach.
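    The difference between the two calibration models is easy to show numerically. In the sketch below (illustrative values, not GRACE data), a diagonal scale matrix assumes orthogonal axes, while a fully-populated matrix also absorbs axis non-orthogonality and frame misalignment:

```python
# Accelerometer calibration: a_cal = S @ a_raw + b, with S diagonal
# (scale factors only) vs. fully populated. All values are illustrative.
import numpy as np

a_raw = np.array([1.0e-7, -2.0e-7, 5.0e-8])   # raw reading (m/s^2)
bias = np.array([1.2e-8, -0.5e-8, 0.8e-8])    # per-axis biases

S_diag = np.diag([0.96, 0.98, 0.95])          # orthogonal-axes assumption
S_full = S_diag + np.array([                  # small off-diagonal terms for
    [0.0, 1e-3, -2e-3],                       # non-orthogonality and
    [1e-3, 0.0, 5e-4],                        # frame misalignment
    [-2e-3, 5e-4, 0.0],
])

a_cal_diag = S_diag @ a_raw + bias
a_cal_full = S_full @ a_raw + bias
print("effect of off-diagonal terms (m/s^2):", a_cal_full - a_cal_diag)
```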

  9. A comparison of forest dynamics at two sites in the Southeastern Ozark Mountains of Missouri

    Treesearch

    Michael A. Jenkins; Stephen G. Pallardy

    1993-01-01

    Changes in tree species composition and regeneration patterns were studied in 53 permanent vegetation plots located at two sites (Pioneer Forest and University State Forest) in oak-hickory forests of southeastern Missouri where mortality and decline of red oak species have been identified. The two sites also exhibited differing levels of decline and mortality. Between...

  10. Opera and Poison: A Secret and Enjoyable Approach to Teaching and Learning Chemistry

    ERIC Educational Resources Information Center

    Andre, Joao Paulo

    2013-01-01

    The storylines of operas, with historical or fictional characters, often include potions and poisons. This has prompted a study of the chemistry behind some operatic plots. The results were originally presented as a lecture given at the University of Minho in Portugal, within the context of the International Year of Chemistry. The same lecture was…

  11. Genetics Research Discovered in a Bestseller | Poster

    Cancer.gov

    By Nancy Parrish, Staff Writer One morning in early January, Amar Klar sat down at his computer and found an e-mail with a curious message from a colleague. While reading a bestselling novel, The Marriage Plot by Jeffrey Eugenides, his colleague, a professor at Princeton University, found a description of research on yeast genetics that was surprisingly similar to Klar’s early

  12. Conduct a Cost Effectiveness Study of Postgraduate Medical Education Programs. Part I, Part II (Appendixes). Final Report.

    ERIC Educational Resources Information Center

    Manning, Phillip R.; And Others

    To gather data on the effects of different modes of instruction on physician achievement in plotting mean P, QRS, and T vectors in electrocardiography, researchers chose a random sample of physicians who had taken a correspondence course on electrocardiography during the previous 5 years from the University of Southern California School of…

  13. Sugarbeet Activities of the USDA-ARS East Lansing Conducted in Cooperation with Saginaw Valley Bean and Beet Farm during 2008

    USDA-ARS?s Scientific Manuscript database

    Four evaluation plots were planted at the Michigan State University Saginaw Valley Bean and Beet Research Farm in 2008. Test 08BB01 was specifically designed to evaluate a number of non-traditional yield and physiological measures that had been suggested from earlier trials, in addition to the more ...

  14. Combined micro-droplet and thin-film-assisted pre-concentration of lead traces for on-line monitoring using anodic stripping voltammetry.

    PubMed

    Belostotsky, Inessa; Gridin, Vladimir V; Schechter, Israel; Yarnitzky, Chaim N

    2003-02-01

    An improved analytical method for airborne lead traces is reported. It is based on using a Venturi scrubber sampling device for simultaneous thin-film stripping and droplet entrapment of aerosol influxes. At least a threefold enhancement of the lead-trace pre-concentration is achieved. The sampled traces are analyzed by square-wave anodic stripping voltammetry. The method was tested in a series of pilot experiments performed using contaminant-controlled air intakes. Reproducible calibration plots were obtained. The data were validated by traditional analysis using filter sampling. Limits of detection are comparable with those of conventional techniques. The method was successfully applied to on-line and in situ environmental monitoring of lead.
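    A calibration plot in stripping voltammetry is typically a linear fit of peak current against standard concentration, with the detection limit commonly estimated as 3 × (blank standard deviation) / slope. The standards and currents below are illustrative numbers, not the paper's data:

```python
# Linear calibration fit and a 3-sigma LOD estimate for made-up standards.
import numpy as np

conc = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])          # µg/L lead standards
current = np.array([0.05, 1.02, 2.11, 2.98, 4.05, 5.01])  # µA peak currents

slope, intercept = np.polyfit(conc, current, 1)           # calibration line
blank_sd = 0.04                                           # µA, hypothetical blank noise
lod = 3.0 * blank_sd / slope                              # limit of detection

print(f"slope {slope:.3f} µA/(µg/L), intercept {intercept:.3f} µA, LOD ≈ {lod:.2f} µg/L")
```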

  15. Total ozone observation by sun photometry at Arosa, Switzerland

    NASA Astrophysics Data System (ADS)

    Staehelin, Johannes; Schill, Herbert; Hoegger, Bruno; Viatte, Pierre; Levrat, Gilbert; Gamma, Adrian

    1995-07-01

    The method used for ground-based total ozone observations and the design of two instruments used to monitor atmospheric total ozone at Arosa (Dobson spectrophotometer and Brewer spectrometer) are briefly described. Two different procedures for the calibration of the Dobson spectrometer, both based on the Langley plot method, are presented. Data quality problems that occurred in recent years in the measurements of one Dobson instrument at Arosa are discussed, and two different methods to reassess total ozone observations are compared. Two partially automated Dobson spectrophotometers and two completely automated Brewer spectrometers are currently in operation at Arosa. Careful comparison of the results of the measurements of the different instruments yields valuable information about possible small long-term drifts of the instruments involved in the operational measurements.
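    The Langley plot method mentioned above fits the Beer-Lambert relation ln V = ln V0 − m·τ to measurements taken over a range of airmasses m on a stable day; the intercept at m = 0 recovers the extraterrestrial constant V0 and the slope gives the optical depth τ. A sketch on synthetic data (the constants are hypothetical, not Arosa values):

```python
# Langley-plot extrapolation on a synthetic morning of measurements.
import numpy as np

V0_true, tau_true = 1.50, 0.30          # hypothetical instrument constants
m = np.linspace(1.2, 5.0, 12)           # airmass values over the morning
rng = np.random.default_rng(2)
V = V0_true * np.exp(-tau_true * m) * (1 + rng.normal(0, 0.003, m.size))

slope, intercept = np.polyfit(m, np.log(V), 1)   # ln V = ln V0 - tau * m
print(f"tau ≈ {-slope:.3f}, V0 ≈ {np.exp(intercept):.3f}")
```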

  16. Hollow optical-fiber based infrared spectroscopy for measurement of blood glucose level by using multi-reflection prism.

    PubMed

    Kino, Saiko; Omori, Suguru; Katagiri, Takashi; Matsuura, Yuji

    2016-02-01

    A mid-infrared attenuated total reflection (ATR) spectroscopy system employing hollow optical fibers and a trapezoidal multi-reflection ATR prism has been developed to measure blood glucose levels. Using a multi-reflection prism brought about higher sensitivity, and the flat and wide contact surface of the prism resulted in higher measurement reproducibility. An analysis of in vivo measurements of human inner lip mucosa revealed clear signatures of glucose in the difference spectra between ones taken during the fasting state and ones taken after ingestion of glucose solutions. A calibration plot based on the absorption peak at 1155 cm(-1) that originates from the pyranose ring structure of glucose gave measurement errors less than 20%.

  17. Spectroscopy of late type giant stars

    NASA Astrophysics Data System (ADS)

    Spaenhauer, A.; Thevenin, F.

    1984-06-01

    An attempt to calibrate broadband RGU colors of late type giant stars in terms of the physical parameters of the objects is reported. The parameters comprise the effective temperature, surface gravity and global metal abundance with respect to the sun. A selection of 21 giant star candidates in the Basel fields Plaut 1, Centaurus III and near HD 95540 were examined to obtain a two color plot. Attention is focused on the G-R color range 1.5-2.15 mag, i.e., spectral types K0-K5. A relationship between R and the metallicity is quantified and shown to have a correlation coefficient of 0.93. No correlation is found between metallicity and gravity or R and the effective temperature.

  18. Tropical forests are non-equilibrium ecosystems governed by interspecific competition based on universal 1/6 niche width.

    PubMed

    Fort, Hugo; Inchausti, Pablo

    2013-01-01

    Tropical forests are mega-diverse ecosystems that display complex and non-equilibrium dynamics. However, theoretical approaches have largely focused on explaining steady-state behaviour and fitting snapshots of data. Here we show that local and niche interspecific competition can realistically and parsimoniously explain the observed non-equilibrium regime of permanent plots of nine tropical forests in eight different countries. Our spatially explicit model not only predicts with accuracy the main biodiversity metrics for these plots but can also reproduce their dynamics. A central finding is that tropical tree species have a universal niche width of approximately 1/6 of the niche axis, echoing the observed widespread convergence in their functional traits that enables them to exploit similar resources and to coexist despite having large niche overlap. This niche width yields an average ratio of 0.25 between interspecific and intraspecific competition, an intermediate value between the extreme claims of the neutral model and the classical niche-based model of community assembly (where interspecific competition is dominant). In addition, our model can explain and yield observed spatial patterns that classical niche-based and neutral theories cannot.

  19. The creation and evaluation of a model to simulate the probability of conception in seasonal-calving pasture-based dairy heifers.

    PubMed

    Fenlon, Caroline; O'Grady, Luke; Butler, Stephen; Doherty, Michael L; Dunnion, John

    2017-01-01

    Herd fertility in pasture-based dairy farms is a key driver of farm economics. Models for predicting nulliparous reproductive outcomes are rare, but age, genetics, weight, and BCS have been identified as factors influencing heifer conception. The aim of this study was to create a simulation model of heifer conception to service with thorough evaluation. Artificial Insemination service records from two research herds and ten commercial herds were provided to build and evaluate the models. All were managed as spring-calving pasture-based systems. The factors studied were related to age, genetics, and time of service. The data were split into training and testing sets and bootstrapping was used to train the models. Logistic regression (with and without random effects) and generalised additive modelling were selected as the model-building techniques. Two types of evaluation were used to test the predictive ability of the models: discrimination and calibration. Discrimination, which includes sensitivity, specificity, accuracy and ROC analysis, measures a model's ability to distinguish between positive and negative outcomes. Calibration measures the accuracy of the predicted probabilities with the Hosmer-Lemeshow goodness-of-fit, calibration plot and calibration error. After data cleaning and the removal of services with missing values, 1396 services remained to train the models and 597 were left for testing. Age, breed, genetic predicted transmitting ability for calving interval, month and year were significant in the multivariate models. The regression models also included an interaction between age and month. Year within herd was a random effect in the mixed regression model. Overall prediction accuracy was between 77.1% and 78.9%. All three models had very high sensitivity, but low specificity. The two regression models were very well-calibrated. The mean absolute calibration errors were all below 4%. 
Because the models were not adept at identifying unsuccessful services, they are not suggested for use in predicting the outcome of individual heifer services. Instead, they are useful for the comparison of services with different covariate values or as sub-models in whole-farm simulations. The mixed regression model was identified as the best model for prediction, as the random effects can be ignored and the other variables can be easily obtained or simulated.
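    The calibration checks named above (calibration plot, Hosmer-Lemeshow-style binning, calibration error) share one core computation: group predictions into deciles of risk and compare the mean predicted probability with the observed event rate in each bin. A sketch on synthetic, deliberately well-calibrated predictions (not the study's heifer data):

```python
# Decile-based calibration check for binary outcome predictions.
import numpy as np

rng = np.random.default_rng(3)
p = rng.uniform(0.1, 0.9, 5000)                       # predicted probabilities
y = (rng.uniform(0.0, 1.0, 5000) < p).astype(float)   # outcomes drawn from p

order = np.argsort(p)
bins = np.array_split(order, 10)                      # deciles of predicted risk
cal_err = []
for b in bins:
    predicted, observed = p[b].mean(), y[b].mean()
    cal_err.append(abs(predicted - observed))
    print(f"predicted {predicted:.3f}  observed {observed:.3f}")

print(f"mean absolute calibration error: {np.mean(cal_err):.4f}")
```

    Because the outcomes here are drawn from the predicted probabilities themselves, the per-decile gaps stay small, which is the behaviour a well-calibrated model such as the mixed regression model above should show.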

  20. LAPS Lidar Measurements at the ARM Alaska Northslope Site (Support to FIRE Project)

    NASA Technical Reports Server (NTRS)

    Philbrick, C. Russell; Lysak, Daniel B., Jr.; Petach, Tomas M.; Esposito, Steven T.; Mulik, Karoline R.

    1998-01-01

    This report consists of data summaries of the results obtained during the May 1998 measurement period at Barrow, Alaska. This report does not contain any data interpretation or analysis of the results; these will follow this activity. This report is forwarded with a data set on magnetic media which contains the reduced data from the LAPS lidar in 15-minute intervals. The data were obtained during the period 15-30 May 1998. The measurement period overlapped with several aircraft flights conducted by NASA as part of the FIRE project. The report contains a summary list of the data obtained plus figures that have been prepared to help visualize the measurement periods. The order of the presentation is as follows: Section 1. A copy of the Statement of Work for the planned activity of the second measurement period at the ARM Northslope site is provided. Section 2. A list of the data collection periods shows the number of one-minute data records stored during each hour of operation and the corresponding size (Mbytes) of the one-hour data folders. The folder and file names are composed from the year, month, day, hour and minute. The date/time information is given in UTC for easier comparison with other data sets. Section 3. A set of 4 comparisons between the LAPS lidar results and the sondes released by the ARM scientists from a location near the lidar. The lidar results show the +/- 1 sigma statistical error on each of the independent 75 m altitude bins of the data. This set of 4 comparisons was used to set and validate the calibration value, which was then used for the complete data set. Section 4. A set of false-color figures with up to 10 hours of specific humidity measurements shown in each graph. Two days of measurements are shown on each page. These plots are crude representations of the data and permit a survey which indicates when the clouds were very low or where interesting events may occur in the results. 
These plots are prepared using the real-time sequence plot program, which has no smoothing in either altitude or time (except that you are allowed to pick the integration time and time step). All of these plots were prepared with 15-minute integration and a 5-minute time step. Section 5. A set of time-sequence data for all of the extended observation periods is shown with a smoothing algorithm from the Matlab plotting library. Most of these data are integrated for 5 minutes and stepped at 1-minute intervals, but several plots are shown with both 15-minute integration and 5-minute steps. The upper level on these data was selected and converted to the white background where the error in the specific humidity reached 25%. Section 6. The set of one-hour integrated plots, shown with up to 4 hours per page, is provided from the real-time analysis snapshot program. The only difference between these plots and the real-time display is that the plots are stopped at an altitude where the error appears to be too large for the data to contain any meaningful information.
