Sample records for accurate volume estimation

  1. Direct volume estimation without segmentation

    NASA Astrophysics Data System (ADS)

    Zhen, X.; Wang, Z.; Islam, A.; Bhaduri, M.; Chan, I.; Li, S.

    2015-03-01

    Volume estimation plays an important role in clinical diagnosis. For example, cardiac ventricular volumes, including the left ventricle (LV) and right ventricle (RV), are important clinical indicators of cardiac function. Accurate and automatic estimation of the ventricular volumes is essential to the assessment of cardiac function and the diagnosis of heart disease. Conventional methods depend on an intermediate segmentation step, obtained either manually or automatically. However, manual segmentation is extremely time-consuming, subjective and poorly reproducible, while automatic segmentation remains challenging, computationally expensive, and unsolved for the RV. Towards accurate and efficient direct volume estimation, our group has been researching learning-based methods that require no segmentation, leveraging state-of-the-art machine learning techniques. Our direct estimation methods remove the additional segmentation step and can naturally handle various volume estimation tasks. Moreover, they are flexible enough to be used for volume estimation of either the joint bi-ventricles (LV and RV) or the individual LV/RV. We compare the performance of direct methods for cardiac ventricular volume estimation with that of segmentation-based methods. Experimental results show that direct estimation methods provide more accurate estimates of cardiac ventricular volumes than segmentation-based methods. This indicates that direct estimation methods not only provide a convenient and mature clinical tool for cardiac volume estimation but also enable the diagnosis of cardiac disease to be conducted in a more efficient and reliable way.
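
    The abstract above describes the approach only in prose. As a rough, hypothetical illustration of the general idea of direct (segmentation-free) volume estimation, regressing a volume value straight from image-derived features, the following Python sketch uses synthetic data and a generic kernel regressor; it is not the descriptor learning or regression model used in the cited work, and every name and number in it is made up.

        # Toy sketch of direct (segmentation-free) volume estimation: regress volume
        # straight from image-derived features. Synthetic data; hypothetical example,
        # not the supervised learning pipeline used in the cited papers.
        import numpy as np
        from sklearn.kernel_ridge import KernelRidge
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        n_images, n_features = 200, 64                     # e.g. 64 histogram/texture features per image
        X = rng.normal(size=(n_images, n_features))
        w = rng.normal(size=n_features)
        y = X @ w + rng.normal(scale=0.1, size=n_images)   # stand-in for ventricular volume (mL)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        model = KernelRidge(kernel="rbf", alpha=1.0).fit(X_tr, y_tr)
        print("correlation with reference volumes:",
              round(float(np.corrcoef(model.predict(X_te), y_te)[0, 1]), 3))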

  2. Estimating bark thicknesses of common Appalachian hardwoods

    Treesearch

    R. Edward Thomas; Neal D. Bennett

    2014-01-01

    Knowing the thickness of bark along the stem of a tree is critical to accurately estimate residue and, more importantly, estimate the volume of solid wood available. Determining the volume or weight of bark for a log is important because bark and wood mass are typically separated while processing logs, and accurate determination of volume is problematic. Bark thickness...

  3. Influence of the volume and density functions within geometric models for estimating trunk inertial parameters.

    PubMed

    Wicke, Jason; Dumas, Genevieve A

    2010-02-01

    The geometric method combines a volume and a density function to estimate body segment parameters and has the best opportunity for developing the most accurate models. In the trunk, there are many different tissues that greatly differ in density (e.g., bone versus lung). Thus, the density function for the trunk must be particularly sensitive to capture this diversity, such that accurate inertial estimates are possible. Three different models were used to test this hypothesis by estimating trunk inertial parameters of 25 female and 24 male college-aged participants. The outcome of this study indicates that the inertial estimates for the upper and lower trunk are most sensitive to the volume function and not very sensitive to the density function. Although it appears that the uniform density function has a greater influence on inertial estimates in the lower trunk region than in the upper trunk region, this is likely due to the (overestimated) density value used. When geometric models are used to estimate body segment parameters, care must be taken in choosing a model that can accurately estimate segment volumes. Researchers wanting to develop accurate geometric models should focus on the volume function, especially in unique populations (e.g., pregnant or obese individuals).

  4. Accurately determining log and bark volumes of saw logs using high-resolution laser scan data

    Treesearch

    R. Edward Thomas; Neal D. Bennett

    2014-01-01

    Accurately determining the volume of logs and bark is crucial to estimating the total expected value recovery from a log. Knowing the correct size and volume of a log helps to determine which processing method, if any, should be used on a given log. However, applying volume estimation methods consistently can be difficult. Errors in log measurement and oddly shaped...

  5. Photogrammetry and Laser Imagery Tests for Tank Waste Volume Estimates: Summary Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Field, Jim G.

    2013-03-27

    Feasibility tests were conducted using photogrammetry and laser technologies to estimate the volume of waste in a tank. These technologies were compared with video Camera/CAD Modeling System (CCMS) estimates, the current method used for post-retrieval waste volume estimates. This report summarizes test results and presents recommendations for further development and deployment of technologies to provide more accurate and faster waste volume estimates in support of tank retrieval and closure.

  6. Estimation of feline renal volume using computed tomography and ultrasound.

    PubMed

    Tyson, Reid; Logsdon, Stacy A; Werre, Stephen R; Daniel, Gregory B

    2013-01-01

    Renal volume is an important parameter for the clinical evaluation of kidneys and for research applications. A time efficient, repeatable, and accurate method for volume estimation is required. The purpose of this study was to describe the accuracy of ultrasound and computed tomography (CT) for estimating feline renal volume. Standardized ultrasound and CT scans were acquired for kidneys of 12 cadaver cats, in situ. Ultrasound and CT multiplanar reconstructions were used to record renal length measurements that were then used to calculate volume using the prolate ellipsoid formula for volume estimation. In addition, CT studies were reconstructed at 1 mm, 5 mm, and 1 cm, and transferred to a workstation where the renal volume was calculated using the voxel count method (hand drawn regions of interest). The reference standard kidney volume was then determined ex vivo using water displacement (Archimedes' principle). Ultrasound measurement of renal length accounted for approximately 87% of the variability in renal volume for the study population. The prolate ellipsoid formula exhibited proportional bias and underestimated renal volume by a median of 18.9%. Computed tomography volume estimates using the voxel count method with hand-traced regions of interest provided the most accurate results, with increasing accuracy for smaller voxel sizes in grossly normal kidneys (-10.1 to 0.6%). Findings from this study supported the use of CT and the voxel count method for estimating feline renal volume in future clinical and research studies. © 2012 Veterinary Radiology & Ultrasound.
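
    For reference, the prolate ellipsoid formula mentioned above is, in its standard form, V = π/6 × length × width × depth, with the three orthogonal kidney dimensions in cm giving volume in mL (the exact measurement protocol is the study's own). A minimal sketch with hypothetical dimensions rather than values from the study:

        import math

        def prolate_ellipsoid_volume(length_cm, width_cm, depth_cm):
            """Standard prolate ellipsoid approximation: V = pi/6 * L * W * D (cm -> mL)."""
            return math.pi / 6.0 * length_cm * width_cm * depth_cm

        # Hypothetical feline kidney dimensions (cm), not values from the study:
        print(round(prolate_ellipsoid_volume(4.0, 2.8, 2.5), 1), "mL")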

  7. Accurate method for preoperative estimation of the right graft volume in adult-to-adult living donor liver transplantation.

    PubMed

    Khalaf, H; Shoukri, M; Al-Kadhi, Y; Neimatallah, M; Al-Sebayel, M

    2007-06-01

    Accurate estimation of graft volume is crucial to avoid small-for-size syndrome following adult-to-adult living donor liver transplantation (AALDLT). Herein, we combined radiological and mathematical approaches for preoperative assessment of right graft volume. The right graft volume was preoperatively estimated in 31 live donors using two methods: first, the radiological graft volume (RGV) by computed tomography (CT) volumetry and second, a calculated graft volume (CGV) obtained by multiplying the standard liver volume by the percentage of the right graft volume (given by CT). Both methods were compared to the actual graft volume (AGV) measured during surgery. The graft recipient weight ratio (GRWR) was also calculated using all three volumes (RGV, CGV, and AGV). Lin's concordance correlation coefficient (CCC) was used to assess the agreement between AGV and both RGV and CGV. This was repeated using the GRWR measurements. The mean percentage of right graft volume was 62.4% (range, 55%-68%; SD 3.27%). The CCC between AGV and RGV versus CGV was 0.38 and 0.66, respectively. The CCC between GRWR using AGV and RGV versus CGV was 0.63 and 0.88, respectively (P < .05). According to the Landis and Koch benchmark, the CGV correlated better with AGV when compared to RGV. The better correlation became even more apparent when applied to GRWR. In our experience, CGV showed a better correlation with AGV compared with the RGV. Using CGV in conjunction with RGV may be of value for a more accurate estimation of right graft volume for AALDLT.

  8. Estimating volume, biomass, and potential emissions of hand-piled fuels

    Treesearch

    Clinton S. Wright; Cameron S. Balog; Jeffrey W. Kelly

    2009-01-01

    Dimensions, volume, and biomass were measured for 121 hand-constructed piles composed primarily of coniferous (n = 63) and shrub/hardwood (n = 58) material at sites in Washington and California. Equations using pile dimensions, shape, and type allow users to accurately estimate the biomass of hand piles. Equations for estimating true pile volume from simple geometric...

  9. Validation of equations for pleural effusion volume estimation by ultrasonography.

    PubMed

    Hassan, Maged; Rizk, Rana; Essam, Hatem; Abouelnour, Ahmed

    2017-12-01

    To validate the accuracy of previously published equations that estimate pleural effusion volume using ultrasonography. Only equations using simple measurements were tested. Three measurements were taken at the posterior axillary line for each case with effusion: lateral height of effusion (H), distance between collapsed lung and chest wall (C) and distance between lung and diaphragm (D). Cases whose effusion was aspirated to dryness were included and the drained volume was recorded. The intra-class correlation coefficient (ICC) was used to determine the predictive accuracy of five equations against the actual volume of aspirated effusion. 46 cases with effusion were included. The most accurate equation in predicting effusion volume was (H + D) × 70 (ICC 0.83). The simplest and yet accurate equation was H × 100 (ICC 0.79). Pleural effusion height measured by ultrasonography gives a reasonable estimate of effusion volume. Incorporating the distance between lung base and diaphragm into the estimation improves the ICC from 0.79 with H × 100 to 0.83 with (H + D) × 70.
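
    The two equations singled out above are simple enough to apply directly; a minimal sketch, assuming the sonographic distances are measured in cm and the result is in mL (the unit convention is an assumption, not stated in the abstract):

        def effusion_volume_hd(h_cm, d_cm):
            """(H + D) x 70: the most accurate equation reported above (ICC 0.83)."""
            return (h_cm + d_cm) * 70.0

        def effusion_volume_h(h_cm):
            """H x 100: the simplest equation reported above (ICC 0.79)."""
            return h_cm * 100.0

        # Hypothetical sonographic measurements (cm):
        H, D = 6.0, 2.5
        print(effusion_volume_hd(H, D), "mL using (H + D) x 70")   # 595.0 mL
        print(effusion_volume_h(H), "mL using H x 100")            # 600.0 mL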

  10. The international food unit: a new measurement aid that can improve portion size estimation.

    PubMed

    Bucher, T; Weltert, M; Rollo, M E; Smith, S P; Jia, W; Collins, C E; Sun, M

    2017-09-12

    Portion size education tools, aids and interventions can be effective in helping prevent weight gain. However, consumers have difficulties in estimating food portion sizes and are confused by inconsistencies in measurement units and terminologies currently used. Visual cues are an important mediator of portion size estimation, but standardized measurement units are required. In the current study, we present a new food volume estimation tool and test the ability of young adults to accurately quantify food volumes. The International Food Unit™ (IFU™) is a 4 × 4 × 4 cm cube (64 cm³), subdivided into eight 2 cm sub-cubes for estimating smaller food volumes. Compared with currently used measures such as cups and spoons, the IFU™ standardizes estimation of food volumes with metric measures. The IFU™ design is based on binary dimensional increments and the cubic shape facilitates portion size education and training, memory and recall, and computer processing, which is binary in nature. The performance of the IFU™ was tested in a randomized between-subject experiment (n = 128 adults, 66 men) that estimated volumes of 17 foods using four methods: the IFU™ cube, a deformable modelling clay cube, a household measuring cup or no aid (weight estimation). Estimation errors were compared between groups using Kruskal-Wallis tests and post-hoc comparisons. Estimation errors differed significantly between groups (H(3) = 28.48, p < .001). The volume estimations were most accurate in the group using the IFU™ cube (Mdn = 18.9%, IQR = 50.2) and least accurate using the measuring cup (Mdn = 87.7%, IQR = 56.1). The modelling clay cube led to a median error of 44.8% (IQR = 41.9). Compared with the measuring cup, the estimation errors using the IFU™ were significantly smaller for 12 food portions and similar for 5 food portions. Weight estimation was associated with a median error of 23.5% (IQR = 79.8). The IFU™ improves volume estimation accuracy compared to other methods. The cubic shape was perceived as favourable, with subdivision and multiplication facilitating volume estimation. Further studies should investigate whether the IFU™ can facilitate portion size training and whether portion size education using the IFU™ is effective and sustainable without the aid. A 3-dimensional IFU™ could serve as a reference object for estimating food volume.
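
    The arithmetic behind the IFU™ is straightforward: one cube holds 4 × 4 × 4 = 64 cm³ and each 2 cm sub-cube holds 8 cm³, so a portion expressed in IFU units converts directly to metric volume. A minimal sketch (the helper name and example portion are hypothetical):

        IFU_CM3 = 4 ** 3        # one IFU cube: 4 x 4 x 4 cm = 64 cm^3
        SUB_IFU_CM3 = 2 ** 3    # one sub-cube:  2 x 2 x 2 cm =  8 cm^3

        def portion_volume_cm3(n_cubes, n_subcubes=0):
            """Hypothetical helper: convert an IFU-based estimate to metric volume."""
            return n_cubes * IFU_CM3 + n_subcubes * SUB_IFU_CM3

        # A portion judged to fill one full cube plus three sub-cubes:
        print(portion_volume_cm3(1, 3), "cm^3")   # 88 cm^3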

  11. New model for estimating the relationship between surface area and volume in the human body using skeletal remains.

    PubMed

    Kasabova, Boryana E; Holliday, Trenton W

    2015-04-01

    A new model for estimating human body surface area and body volume/mass from standard skeletal metrics is presented. This model is then tested against both 1) "independently estimated" body surface areas and "independently estimated" body volume/mass (both derived from anthropometric data) and 2) the cylindrical model of Ruff. The model is found to be more accurate in estimating both body surface area and body volume/mass than the cylindrical model, but it is more accurate in estimating body surface area than it is for estimating body volume/mass (as reflected by the standard error of the estimate when "independently estimated" surface area or volume/mass is regressed on estimates derived from the present model). Two practical applications of the model are tested. In the first test, the relative contribution of the limbs versus the trunk to the body's volume and surface area is compared between "heat-adapted" and "cold-adapted" populations. As expected, the "cold-adapted" group has significantly more of its body surface area and volume in its trunk than does the "heat-adapted" group. In the second test, we evaluate the effect of variation in bi-iliac breadth, elongated or foreshortened limbs, and differences in crural index on the body's surface area to volume ratio (SA:V). Results indicate that the effects of bi-iliac breadth on SA:V are substantial, while those of limb lengths and (especially) the crural index are minor, which suggests that factors other than surface area relative to volume are driving morphological variation and ecogeographical patterning in limb prorportions. © 2014 Wiley Periodicals, Inc.

  12. Direct and simultaneous estimation of cardiac four chamber volumes by multioutput sparse regression.

    PubMed

    Zhen, Xiantong; Zhang, Heye; Islam, Ali; Bhaduri, Mousumi; Chan, Ian; Li, Shuo

    2017-02-01

    Cardiac four-chamber volume estimation plays a fundamental and crucial role in the clinical quantitative analysis of whole-heart function. It is a challenging task due to the great complexity of the four chambers, including large appearance variations, large shape deformations and interference between chambers. Direct estimation has recently emerged as an effective and convenient tool for cardiac ventricular volume estimation. However, existing direct estimation methods were specifically developed for a single ventricle, i.e., the left ventricle (LV), or for bi-ventricles; they cannot be directly used for four-chamber volume estimation due to the great combinatorial variability and the highly complex anatomical interdependency of the four chambers. In this paper, we propose a new, general framework for direct and simultaneous four-chamber volume estimation. We address two key issues, i.e., cardiac image representation and simultaneous four-chamber volume estimation, which enables accurate and efficient four-chamber volume estimation. We generate compact and discriminative image representations by supervised descriptor learning (SDL), which removes irrelevant information and extracts discriminative features. We propose direct and simultaneous four-chamber volume estimation by multioutput sparse latent regression (MSLR), which enables jointly modeling nonlinear input-output relationships and capturing four-chamber interdependence. The proposed method is highly general and independent of imaging modality, providing a regression framework that can be extensively used for clinical data prediction to achieve automated diagnosis. Experiments on both MR and CT images show that our method achieves high performance, with a correlation coefficient of up to 0.921 with ground truth obtained manually by human experts, which is clinically significant and enables more accurate, convenient and comprehensive assessment of cardiac function. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. Effects of the liver volume and donor steatosis on errors in the estimated standard liver volume.

    PubMed

    Siriwardana, Rohan Chaminda; Chan, See Ching; Chok, Kenneth Siu Ho; Lo, Chung Mau; Fan, Sheung Tat

    2011-12-01

    An accurate assessment of donor and recipient liver volumes is essential in living donor liver transplantation. Many liver donors are affected by mild to moderate steatosis, and steatotic livers are known to have larger volumes. This study analyzes errors in liver volume estimation by commonly used formulas and the effects of donor steatosis on these errors. Three hundred twenty-five Asian donors who underwent right lobe donor hepatectomy were the subjects of this study. The percentage differences between the liver volumes from computed tomography (CT) and the liver volumes estimated with each formula (ie, the error percentages) were calculated. Five popular formulas were tested. The degrees of steatosis were categorized as follows: no steatosis [n = 178 (54.8%)], ≤ 10% steatosis [n = 128 (39.4%)], and >10% to 20% steatosis [n = 19 (5.8%)]. The median errors ranged from 0.6% (7 mL) to 24.6% (360 mL). The lowest was seen with the locally derived formula. All the formulas showed a significant association between the error percentage and the CT liver volume (P < 0.001). Overestimation was seen with smaller liver volumes, whereas underestimation was seen with larger volumes. The locally derived formula was most accurate when the liver volume was 1001 to 1250 mL. A multivariate analysis showed that the estimation error was dependent on the liver volume (P = 0.001) and the anthropometric measurement that was used in the calculation (P < 0.001) rather than steatosis (P ≥ 0.07). In conclusion, all the formulas have a similar pattern of error that is possibly related to the anthropometric measurement. Clinicians should be aware of this pattern of error and the liver volume with which their formula is most accurate. Copyright © 2011 American Association for the Study of Liver Diseases.
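
    The error metric used above is simply the percentage difference between the formula estimate and the CT-measured volume. A minimal sketch, using one widely cited standard liver volume formula (Urata: SLV in mL = 706.2 × BSA in m² + 2.4) purely as an example; it is an assumption that this formula matches any of the five evaluated in the study:

        def urata_slv_ml(bsa_m2):
            """Urata et al. standard liver volume in mL (illustrative choice of formula;
            not stated to be one of the five formulas evaluated in the study above)."""
            return 706.2 * bsa_m2 + 2.4

        def estimation_error_pct(estimated_ml, ct_volume_ml):
            """Error percentage as used above: (estimate - CT volume) / CT volume * 100."""
            return 100.0 * (estimated_ml - ct_volume_ml) / ct_volume_ml

        # Hypothetical donor: body surface area 1.7 m^2, CT liver volume 1150 mL
        est = urata_slv_ml(1.7)
        print(round(est), "mL estimated,", round(estimation_error_pct(est, 1150.0), 1), "% error")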

  14. Accuracy and variability of tumor burden measurement on multi-parametric MRI

    NASA Astrophysics Data System (ADS)

    Salarian, Mehrnoush; Gibson, Eli; Shahedi, Maysam; Gaed, Mena; Gómez, José A.; Moussa, Madeleine; Romagnoli, Cesare; Cool, Derek W.; Bastian-Jordan, Matthew; Chin, Joseph L.; Pautler, Stephen; Bauman, Glenn S.; Ward, Aaron D.

    2014-03-01

    Measurement of prostate tumour volume can inform prognosis and treatment selection, including an assessment of the suitability and feasibility of focal therapy, which can potentially spare patients the deleterious side effects of radical treatment. Prostate biopsy is the clinical standard for diagnosis but provides limited information regarding tumour volume due to sparse tissue sampling. A non-invasive means for accurate determination of tumour burden could be of clinical value and an important step toward reduction of overtreatment. Multi-parametric magnetic resonance imaging (MPMRI) is showing promise for prostate cancer diagnosis. However, the accuracy and inter-observer variability of prostate tumour volume estimation based on separate expert contouring of T2-weighted (T2W), dynamic contrast-enhanced (DCE), and diffusion-weighted (DW) MRI sequences acquired using an endorectal coil at 3T is currently unknown. We investigated this question using a histologic reference standard based on a highly accurate MPMRI-histology image registration and a smooth interpolation of planimetric tumour measurements on histology. Our results showed that prostate tumour volumes estimated based on MPMRI consistently overestimated histological reference tumour volumes. The variability of tumour volume estimates across the different pulse sequences exceeded inter-observer variability within any sequence. Tumour volume estimates on DCE MRI provided the lowest inter-observer variability and the highest correlation with histology tumour volumes, whereas the apparent diffusion coefficient (ADC) maps provided the lowest volume estimation error. If validated on a larger data set, the observed correlations could support the development of automated prostate tumour volume segmentation algorithms as well as correction schemes for tumour burden estimation on MPMRI.

  15. Uncertainty in peat volume and soil carbon estimated using ground-penetrating radar and probing

    Treesearch

    Andrew D. Parsekian; Lee Slater; Dimitrios Ntarlagiannis; James Nolan; Stephen D. Sebestyen; Randall K. Kolka; Paul J. Hanson

    2012-01-01

    Estimating soil C stock in a peatland is highly dependent on accurate measurement of the peat volume. In this study, we evaluated the uncertainty in calculations of peat volume using high-resolution data to resolve the three-dimensional structure of a peat basin based on both direct (push probes) and indirect geophysical (ground-penetrating radar) measurements. We...

  16. Simple estimate of critical volume

    NASA Technical Reports Server (NTRS)

    Fedors, R. F.

    1980-01-01

    Method for estimating critical molar volume of materials is faster and simpler than previous procedures. Formula sums no more than 18 different contributions from components of chemical structure of material, and is as accurate (within 3 percent) as older more complicated models. Method should expedite many thermodynamic design calculations.

  17. Breast volume estimation from systematic series of CT scans using the Cavalieri principle and 3D reconstruction.

    PubMed

    Erić, Mirela; Anderla, Andraš; Stefanović, Darko; Drapšin, Miodrag

    2014-01-01

    Preoperative breast volume estimation is very important for the success of breast surgery. In the present study, two different breast volume determination methods, the Cavalieri principle and 3D reconstruction, were compared. Consecutive sections were taken at a slice thickness of 5 mm. Every 2nd breast section in a set of consecutive sections was selected. We marked the breast tissue with a blue line on each selected section; the CT scans prepared in this way were used for breast volume estimation. The volumes of the 60 breasts were estimated using the Cavalieri principle and 3D reconstruction. The mean breast volume value was established to be 467.79 ± 188.90 cm(3) with the Cavalieri method and 465.91 ± 191.41 cm(3) with 3D reconstruction. The mean coefficient of error (CE) for the estimates in this study was calculated as 0.25%. Skin-sparing volume was about 91.64% of the whole breast volume. Both methods are very accurate and have a strong linear association. Our results suggest that the calculation of breast volume or its part in vivo from systematic series of CT scans using the Cavalieri principle or 3D breast reconstruction is accurate enough to have a significant clinical benefit in planning reconstructive breast surgery. These methods can help the surgeon choose the most appropriate implant and/or flap preoperatively. Copyright © 2014 Surgical Associates Ltd. Published by Elsevier Ltd. All rights reserved.
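
    The Cavalieri estimator referred to above multiplies the sum of the traced section areas by the spacing between the sections used; since every 2nd 5 mm section was selected, that spacing is 10 mm (1 cm). A minimal sketch with hypothetical section areas:

        def cavalieri_volume_cm3(section_areas_cm2, spacing_cm):
            """Cavalieri estimator: V = section spacing x sum of section areas."""
            return spacing_cm * sum(section_areas_cm2)

        # Hypothetical traced areas (cm^2) on every 2nd 5 mm section, i.e. 1.0 cm apart:
        areas = [12.5, 30.2, 48.7, 55.1, 52.3, 41.0, 22.4, 8.9]
        print(round(cavalieri_volume_cm3(areas, spacing_cm=1.0), 1), "cm^3")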

  18. Estimating Highway Volumes Using Vehicle Probe Data - Proof of Concept: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hou, Yi; Young, Stanley E; Sadabadi, Kaveh

    This paper examines the feasibility of using sampled commercial probe data in combination with validated continuous counter data to accurately estimate vehicle volume across the entire roadway network, for any hour during the year. Currently, either real-time or archived volume data for roadways at specific times are extremely sparse. Most volume data are average annual daily traffic (AADT) measures derived from the Highway Performance Monitoring System (HPMS). Although methods to factor the AADT to hourly averages for a typical day of week exist, actual volume data are limited to a sparse collection of locations in which volumes are continuously recorded. This paper explores the use of commercial probe data to generate accurate volume measures that span the highway network, providing ubiquitous coverage in space, and specific point-in-time measures for a specific date and time. The paper examines the need for the data, fundamental accuracy limitations based on a basic statistical model that takes into account the sampling nature of probe data, and early results from a proof of concept exercise revealing the potential of probe-type data calibrated with public continuous count data to meet end user expectations in terms of accuracy of volume estimates.

  19. Self-Interaction Chromatography of mAbs: Accurate Measurement of Dead Volumes.

    PubMed

    Hedberg, S H M; Heng, J Y Y; Williams, D R; Liddell, J M

    2015-12-01

    Measurement of the second virial coefficient B22 for proteins using self-interaction chromatography (SIC) is becoming an increasingly important technique for studying their solution behaviour. In common with all physicochemical chromatographic methods, measuring the dead volume of the SIC packed column is crucial for accurate retention data; this paper examines best practice for dead volume determination. SIC-type experiments using catalase, BSA, lysozyme and a mAb as model systems are reported, as well as a number of dead column measurements. It was observed that lysozyme and mAb interacted specifically with Toyopearl AF-Formyl dead columns depending upon pH and [NaCl], invalidating their dead volume usage. Toyopearl AF-Amino packed dead columns showed no such problems and acted as suitable dead columns without any solution condition dependency. Dead volume determinations using dextran MW standards with protein immobilised SIC columns provided dead volume estimates close to those obtained using Toyopearl AF-Amino dead columns. It is concluded that specific interactions between proteins, including mAbs, and select SIC support phases can compromise the use of some standard approaches for estimating the dead volume of SIC columns. Two other methods were shown to provide good estimates for the dead volume.

  20. Building generalized tree mass/volume component models for improved estimation of forest stocks and utilization potential

    Treesearch

    David W. MacFarlane

    2015-01-01

    Accurately assessing forest biomass potential is contingent upon having accurate tree biomass models to translate data from forest inventories. Building generality into these models is especially important when they are to be applied over large spatial domains, such as regional, national and international scales. Here, new, generalized whole-tree mass / volume...

  1. Quantifying Standing Dead Tree Volume and Structural Loss with Voxelized Terrestrial Lidar Data

    NASA Astrophysics Data System (ADS)

    Popescu, S. C.; Putman, E.

    2017-12-01

    Standing dead trees (SDTs) are an important forest component and impact a variety of ecosystem processes, yet the carbon pool dynamics of SDTs are poorly constrained in terrestrial carbon cycling models. The ability to model wood decay and carbon cycling in relation to detectable changes in tree structure and volume over time would greatly improve such models. The overall objective of this study was to provide automated aboveground volume estimates of SDTs and automated procedures to detect, quantify, and characterize structural losses over time with terrestrial lidar data. The specific objectives of this study were: 1) develop an automated SDT volume estimation algorithm providing accurate volume estimates for trees scanned in dense forests; 2) develop an automated change detection methodology to accurately detect and quantify SDT structural loss between subsequent terrestrial lidar observations; and 3) characterize the structural loss rates of pine and oak SDTs in southeastern Texas. A voxel-based volume estimation algorithm, "TreeVolX", was developed and incorporates several methods designed to robustly process point clouds of varying quality levels. The algorithm operates on horizontal voxel slices by segmenting the slice into distinct branch or stem sections then applying an adaptive contour interpolation and interior filling process to create solid reconstructed tree models (RTMs). TreeVolX estimated large and small branch volume with an RMSE of 7.3% and 13.8%, respectively. A voxel-based change detection methodology was developed to accurately detect and quantify structural losses and incorporated several methods to mitigate the challenges presented by shifting tree and branch positions as SDT decay progresses. The volume and structural loss of 29 SDTs, composed of Pinus taeda and Quercus stellata, were successfully estimated using multitemporal terrestrial lidar observations over elapsed times ranging from 71 - 753 days. Pine and oak structural loss rates were characterized by estimating the amount of volumetric loss occurring in 20 equal-interval height bins of each SDT. Results showed that large pine snags exhibited more rapid structural loss in comparison to medium-sized oak snags in this study.
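
    TreeVolX itself interpolates slice contours and fills stem interiors, but the core voxel-based idea can be sketched simply: bin the lidar points into a regular 3D grid and multiply the number of occupied voxels by the volume of a single voxel. The following is a crude sketch of that idea only, on a synthetic point cloud; it is not the published algorithm:

        import numpy as np

        def voxel_count_volume_m3(points_xyz, voxel_size_m):
            """Crude voxel-count volume: occupied voxels x single-voxel volume.
            (TreeVolX additionally interpolates slice contours and fills interiors,
            which this sketch does not attempt.)"""
            idx = np.floor(points_xyz / voxel_size_m).astype(np.int64)
            n_occupied = len(np.unique(idx, axis=0))
            return n_occupied * voxel_size_m ** 3

        # Hypothetical lidar points covering a 0.3 x 0.3 x 2.0 m stem section (metres):
        pts = np.random.default_rng(1).uniform([0, 0, 0], [0.3, 0.3, 2.0], size=(5000, 3))
        print(round(voxel_count_volume_m3(pts, voxel_size_m=0.05), 3), "m^3")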

  2. Automatic portion estimation and visual refinement in mobile dietary assessment

    NASA Astrophysics Data System (ADS)

    Woo, Insoo; Otsmo, Karl; Kim, SungYe; Ebert, David S.; Delp, Edward J.; Boushey, Carol J.

    2010-01-01

    As concern for obesity grows, the need for automated and accurate methods to monitor nutrient intake becomes essential as dietary intake provides a valuable basis for managing dietary imbalance. Moreover, as mobile devices with built-in cameras have become ubiquitous, one potential means of monitoring dietary intake is photographing meals using mobile devices and having an automatic estimate of the nutrient contents returned. One of the challenging problems of the image-based dietary assessment is the accurate estimation of food portion size from a photograph taken with a mobile digital camera. In this work, we describe a method to automatically calculate portion size of a variety of foods through volume estimation using an image. These "portion volumes" utilize camera parameter estimation and model reconstruction to determine the volume of food items, from which nutritional content is then extrapolated. In this paper, we describe our initial results of accuracy evaluation using real and simulated meal images and demonstrate the potential of our approach.

  3. Form-class volume tables for estimating board-foot content of northern conifers

    Treesearch

    C. Allen Bickford

    1951-01-01

    The timber cruiser counts volume tables among his most important working tools. He wants - if he can get them - tables that are simple, easy to use, and accurate. Before using a volume table in a new situation, the careful cruiser will check it by comparing table volumes with actual volumes.

  4. Developing a stochastic traffic volume prediction model for public-private partnership projects

    NASA Astrophysics Data System (ADS)

    Phong, Nguyen Thanh; Likhitruangsilp, Veerasak; Onishi, Masamitsu

    2017-11-01

    Transportation projects require an enormous amount of capital investment resulting from their tremendous size, complexity, and risk. Due to the limitation of public finances, the private sector is invited to participate in transportation project development. The private sector can entirely or partially invest in transportation projects in the form of a Public-Private Partnership (PPP) scheme, which has been an attractive option for several developing countries, including Vietnam. There are many factors affecting the success of PPP projects. The accurate prediction of traffic volume is considered one of the key success factors of PPP transportation projects. However, only a few research works have investigated how to predict traffic volume over a long period of time. Moreover, conventional traffic volume forecasting methods are usually based on deterministic models, which predict a single value of traffic volume but do not consider risk and uncertainty. This knowledge gap makes it difficult for concessionaires to estimate PPP transportation project revenues accurately. The objective of this paper is to develop a probabilistic traffic volume prediction model. First, traffic volumes were estimated following the Geometric Brownian Motion (GBM) process. The Monte Carlo technique was then applied to simulate different scenarios. The results show that this stochastic approach can systematically analyze variations in the traffic volume and yield more reliable estimates for PPP projects.
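
    A minimal sketch of the simulation step described above: generate many traffic volume paths under geometric Brownian motion, V(t+dt) = V(t)·exp((μ - σ²/2)dt + σ√dt·Z), and summarize the resulting distribution. The drift, volatility and starting volume below are hypothetical, not values from the study:

        import numpy as np

        def simulate_gbm_paths(v0, mu, sigma, years, n_paths, steps_per_year=12, seed=0):
            """Monte Carlo paths of geometric Brownian motion: dV = mu*V dt + sigma*V dW."""
            rng = np.random.default_rng(seed)
            dt = 1.0 / steps_per_year
            z = rng.standard_normal((n_paths, years * steps_per_year))
            increments = (mu - 0.5 * sigma ** 2) * dt + sigma * np.sqrt(dt) * z
            return v0 * np.exp(np.cumsum(increments, axis=1))

        # Hypothetical toll road: 20,000 vehicles/day today, 3% drift, 10% volatility, 20 years
        final = simulate_gbm_paths(v0=20_000, mu=0.03, sigma=0.10, years=20, n_paths=10_000)[:, -1]
        print("median:", int(np.median(final)),
              "5th-95th percentile:", int(np.percentile(final, 5)), "-", int(np.percentile(final, 95)))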

  5. Developing a method for estimating AADT on all Louisiana roads.

    DOT National Transportation Integrated Search

    2015-07-01

    Traffic flow volumes present key information needed for making transportation engineering and planning decisions. Accurate traffic volume count has many applications including: roadway planning, design, air quality compliance, travel model valida...

  6. Artificial Intelligence Procedures for Tree Taper Estimation within a Complex Vegetation Mosaic in Brazil

    PubMed Central

    Nunes, Matheus Henrique

    2016-01-01

    Tree stem form in native tropical forests is very irregular, posing a challenge to establishing taper equations that can accurately predict the diameter at any height along the stem and subsequently merchantable volume. Artificial intelligence approaches can be useful techniques in minimizing estimation errors within complex variations of vegetation. We evaluated the performance of Random Forest® regression tree and Artificial Neural Network procedures in modelling stem taper. Diameters and volume outside bark were compared to a traditional taper-based equation across a tropical Brazilian savanna, a seasonal semi-deciduous forest and a rainforest. Neural network models were found to be more accurate than the traditional taper equation. Random forest showed trends in the residuals from the diameter prediction and provided the least precise and accurate estimations for all forest types. This study provides insights into the superiority of a neural network, which provided advantages regarding the handling of local effects. PMID:27187074

  7. Artificial Intelligence Procedures for Tree Taper Estimation within a Complex Vegetation Mosaic in Brazil.

    PubMed

    Nunes, Matheus Henrique; Görgens, Eric Bastos

    2016-01-01

    Tree stem form in native tropical forests is very irregular, posing a challenge to establishing taper equations that can accurately predict the diameter at any height along the stem and subsequently merchantable volume. Artificial intelligence approaches can be useful techniques in minimizing estimation errors within complex variations of vegetation. We evaluated the performance of Random Forest® regression tree and Artificial Neural Network procedures in modelling stem taper. Diameters and volume outside bark were compared to a traditional taper-based equation across a tropical Brazilian savanna, a seasonal semi-deciduous forest and a rainforest. Neural network models were found to be more accurate than the traditional taper equation. Random forest showed trends in the residuals from the diameter prediction and provided the least precise and accurate estimations for all forest types. This study provides insights into the superiority of a neural network, which provided advantages regarding the handling of local effects.
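
    A generic sketch of the comparison described above (not the fitted models or data from the study): predict stem diameter from tree size and relative height with a Random Forest and a small neural network on synthetic taper data, then compare RMSE:

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.neural_network import MLPRegressor
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import mean_squared_error

        # Synthetic taper data (toy model, not a published taper form): diameter shrinks
        # with relative height h/H and scales with dbh.
        rng = np.random.default_rng(0)
        n = 2000
        dbh = rng.uniform(10, 60, n)                  # cm
        rel_h = rng.uniform(0.0, 1.0, n)              # height along stem / total height
        diam = dbh * (1.0 - rel_h) ** 0.8 + rng.normal(0, 0.5, n)

        X = np.column_stack([dbh, rel_h])
        X_tr, X_te, y_tr, y_te = train_test_split(X, diam, random_state=0)

        models = {
            "random forest": RandomForestRegressor(n_estimators=200, random_state=0),
            "neural network": make_pipeline(StandardScaler(),
                                            MLPRegressor(hidden_layer_sizes=(32, 32),
                                                         max_iter=2000, random_state=0)),
        }
        for name, model in models.items():
            model.fit(X_tr, y_tr)
            rmse = mean_squared_error(y_te, model.predict(X_te)) ** 0.5
            print(f"{name}: RMSE = {rmse:.2f} cm")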

  8. Lower limb muscle volume estimation from maximum cross-sectional area and muscle length in cerebral palsy and typically developing individuals.

    PubMed

    Vanmechelen, Inti M; Shortland, Adam P; Noble, Jonathan J

    2018-01-01

    Deficits in muscle volume may be a significant contributor to physical disability in young people with cerebral palsy. However, 3D measurements of muscle volume using MRI or 3D ultrasound may be difficult to make routinely in the clinic. We wished to establish whether accurate estimates of muscle volume could be made from a combination of anatomical cross-sectional area and length measurements in samples of typically developing young people and young people with bilateral cerebral palsy. Lower limb MRI scans were obtained from the lower limbs of 21 individuals with cerebral palsy (14.7 ± 3 years, 17 male) and 23 typically developing individuals (16.8 ± 3.3 years, 16 male). The volume, length and anatomical cross-sectional area were estimated for six muscles of the left lower limb. Analysis of covariance demonstrated that the relationship between length × cross-sectional area and volume did not differ significantly between the subject groups. Linear regression analysis demonstrated that the product of anatomical cross-sectional area and length bore a strong and significant relationship to the measured muscle volume (R² values between 0.955 and 0.988), with low standard errors of the estimate of 4.8 to 8.9%. This study demonstrates that muscle volume may be estimated accurately in typically developing individuals and individuals with cerebral palsy by a combination of anatomical cross-sectional area and muscle length. 2D ultrasound may be a convenient method of making these measurements routinely in the clinic. Copyright © 2017 Elsevier Ltd. All rights reserved.
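
    The relationship tested above amounts to a single-predictor linear regression, volume ≈ slope × (ACSA × length). A minimal sketch on synthetic data standing in for the MRI measurements:

        import numpy as np

        # Synthetic data standing in for the MRI measurements (not the study data):
        rng = np.random.default_rng(0)
        acsa_cm2 = rng.uniform(8, 40, 44)          # anatomical cross-sectional area
        length_cm = rng.uniform(20, 45, 44)        # muscle length
        volume_cm3 = 0.55 * acsa_cm2 * length_cm * rng.normal(1.0, 0.03, 44)

        x = acsa_cm2 * length_cm                   # single predictor: ACSA x length
        slope, intercept = np.polyfit(x, volume_cm3, 1)
        pred = slope * x + intercept
        r2 = 1 - np.sum((volume_cm3 - pred) ** 2) / np.sum((volume_cm3 - volume_cm3.mean()) ** 2)
        print(f"V ~ {slope:.2f} x (ACSA x L) + {intercept:.1f},  R^2 = {r2:.3f}")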

  9. The volume and mean depth of Earth's lakes

    NASA Astrophysics Data System (ADS)

    Cael, B. B.; Heathcote, A. J.; Seekell, D. A.

    2017-01-01

    Global lake volume estimates are scarce, highly variable, and poorly documented. We developed a rigorous method for estimating global lake depth and volume based on the Hurst coefficient of Earth's surface, which provides a mechanistic connection between lake area and volume. Volume-area scaling based on the Hurst coefficient is accurate and consistent when applied to lake data sets spanning diverse regions. We applied these relationships to a global lake area census to estimate global lake volume and depth. The volume of Earth's lakes is 199,000 km³ (95% confidence interval 196,000-202,000 km³). This volume is in the range of historical estimates (166,000-280,000 km³), but the overall mean depth of 41.8 m (95% CI 41.2-42.4 m) is significantly lower than previous estimates (62-151 m). These results highlight and constrain the relative scarcity of lake waters in the hydrosphere and have implications for the role of lakes in global biogeochemical cycles.

  10. Efficient Voronoi volume estimation for DEM simulations of granular materials under confined conditions

    PubMed Central

    Frenning, Göran

    2015-01-01

    When the discrete element method (DEM) is used to simulate confined compression of granular materials, the need arises to estimate the void space surrounding each particle with Voronoi polyhedra. This entails recurring Voronoi tessellation with small changes in the geometry, resulting in a considerable computational overhead. To overcome this limitation, we propose a method with the following features:
    • A local determination of the polyhedron volume is used, which considerably simplifies implementation of the method.
    • A linear approximation of the polyhedron volume is utilised, with intermittent exact volume calculations when needed.
    • The method allows highly accurate volume estimates to be obtained at a considerably reduced computational cost.
    PMID:26150975

  11. How large is the typical subarachnoid hemorrhage? A review of current neurosurgical knowledge.

    PubMed

    Whitmore, Robert G; Grant, Ryan A; LeRoux, Peter; El-Falaki, Omar; Stein, Sherman C

    2012-01-01

    Despite the morbidity and mortality of subarachnoid hemorrhage (SAH), the average volume of a typical hemorrhage is not well defined. Animal models of SAH often do not accurately mimic the human disease process. The purpose of this study is to estimate the average SAH volume, allowing standardization of animal models of the disease. We performed a MEDLINE search of SAH volume and erythrocyte counts in human cerebrospinal fluid as well as for volumes of blood used in animal injection models of SAH, from 1956 to 2010. We polled members of the American Association of Neurological Surgeons (AANS) for estimates of typical SAH volume. Using quantitative data from the literature, we calculated the total volume of SAH as equal to the volume of blood clotted in basal cisterns plus the volume of dispersed blood in cerebrospinal fluid. The results of the AANS poll confirmed our estimates. The human literature yielded 322 publications and animal literature, 237 studies. Four quantitative human studies reported blood clot volumes ranging from 0.2 to 170 mL, with a mean of ∼20 mL. There was only one quantitative study reporting cerebrospinal fluid red blood cell counts from serial lumbar puncture after SAH. Dispersed blood volume ranged from 2.9 to 45.9 mL, and we used the mean of 15 mL for our calculation. Therefore, total volume of SAH equals 35 mL. The AANS poll yielded 176 responses, ranging from 2 to 350 mL, with a mean of 33.9 ± 4.4 mL. Based on our estimate of total SAH volume of 35 mL, animal injection models may now become standardized for more accurate portrayal of the human disease process. Copyright © 2012 Elsevier Inc. All rights reserved.

  12. Polynomial Fitting of DT-MRI Fiber Tracts Allows Accurate Estimation of Muscle Architectural Parameters

    PubMed Central

    Damon, Bruce M.; Heemskerk, Anneriet M.; Ding, Zhaohua

    2012-01-01

    Fiber curvature is a functionally significant muscle structural property, but its estimation from diffusion-tensor MRI fiber tracking data may be confounded by noise. The purpose of this study was to investigate the use of polynomial fitting of fiber tracts for improving the accuracy and precision of fiber curvature (κ) measurements. Simulated image datasets were created in order to provide data with known values for κ and pennation angle (θ). Simulations were designed to test the effects of increasing inherent fiber curvature (3.8, 7.9, 11.8, and 15.3 m⁻¹), signal-to-noise ratio (50, 75, 100, and 150), and voxel geometry (13.8 and 27.0 mm³ voxel volume with isotropic resolution; 13.5 mm³ volume with an aspect ratio of 4.0) on κ and θ measurements. In the originally reconstructed tracts, θ was estimated accurately under most curvature and all imaging conditions studied; however, the estimates of κ were imprecise and inaccurate. Fitting the tracts to 2nd-order polynomial functions provided accurate and precise estimates of κ for all conditions except very high curvature (κ = 15.3 m⁻¹), while preserving the accuracy of the θ estimates. Similarly, polynomial fitting of in vivo fiber tracking data reduced the κ values of fitted tracts from those of unfitted tracts and did not change the θ values. Polynomial fitting of fiber tracts allows accurate estimation of physiologically reasonable values of κ, while preserving the accuracy of θ estimation. PMID:22503094
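
    A minimal sketch of the fitting idea described above: fit each coordinate of a 3D tract to a 2nd-order polynomial in a path parameter and evaluate the curvature analytically from the fitted derivatives, κ = |r′ × r″| / |r′|³. The tract below is synthetic; this is not the study's processing pipeline:

        import numpy as np

        def fitted_curvature(points_xyz):
            """Fit x(t), y(t), z(t) to quadratics in a path parameter t and return the
            curvature along the tract: kappa = |r' x r''| / |r'|^3."""
            n = len(points_xyz)
            t = np.linspace(0.0, 1.0, n)
            coeffs = np.array([np.polyfit(t, points_xyz[:, k], 2) for k in range(3)])
            a, b = coeffs[:, 0], coeffs[:, 1]       # per-axis t^2 and t coefficients
            r1 = 2.0 * np.outer(t, a) + b           # first derivative at each t
            r2 = 2.0 * a                            # second derivative (constant)
            cross = np.cross(r1, r2)
            return np.linalg.norm(cross, axis=1) / np.linalg.norm(r1, axis=1) ** 3

        # Synthetic noisy tract: a gently pitched arc of radius 0.25 m (curvature ~4 m^-1)
        theta = np.linspace(0.0, 0.5, 50)
        pts = np.column_stack([0.25 * np.sin(theta), 0.25 * (1 - np.cos(theta)), 0.02 * theta])
        pts += np.random.default_rng(0).normal(0.0, 5e-4, pts.shape)
        print("median curvature (1/m):", round(float(np.median(fitted_curvature(pts))), 2))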

  13. Estimating volumes and costs of forest biomass in western Montana using forest inventory and geospatial data

    Treesearch

    Dan Loeffler; David E. Calkin; Robin P. Silverstein

    2006-01-01

    Utilizing timber harvest residues (biomass) for renewable energy production provides an alternative disposal method to onsite burning that may improve the economic viability of hazardous fuels treatments. Due to the relatively low value of biomass, accurate estimates of biomass volumes and costs of collection and delivery are essential if investment in renewable energy...

  14. Improved estimates of partial volume coefficients from noisy brain MRI using spatial context.

    PubMed

    Manjón, José V; Tohka, Jussi; Robles, Montserrat

    2010-11-01

    This paper addresses the problem of accurate voxel-level estimation of tissue proportions in the human brain magnetic resonance imaging (MRI). Due to the finite resolution of acquisition systems, MRI voxels can contain contributions from more than a single tissue type. The voxel-level estimation of this fractional content is known as partial volume coefficient estimation. In the present work, two new methods to calculate the partial volume coefficients under noisy conditions are introduced and compared with current similar methods. Concretely, a novel Markov Random Field model allowing sharp transitions between partial volume coefficients of neighbouring voxels and an advanced non-local means filtering technique are proposed to reduce the errors due to random noise in the partial volume coefficient estimation. In addition, a comparison was made to find out how the different methodologies affect the measurement of the brain tissue type volumes. Based on the obtained results, the main conclusions are that (1) both Markov Random Field modelling and non-local means filtering improved the partial volume coefficient estimation results, and (2) non-local means filtering was the better of the two strategies for partial volume coefficient estimation. Copyright 2010 Elsevier Inc. All rights reserved.

  15. The validity of ultrasound estimation of muscle volumes.

    PubMed

    Infantolino, Benjamin W; Gales, Daniel J; Winter, Samantha L; Challis, John H

    2007-08-01

    The purpose of this study was to validate ultrasound muscle volume estimation in vivo. To examine validity, vastus lateralis ultrasound images were collected from cadavers before muscle dissection; after dissection, the volumes were determined by hydrostatic weighing. Seven thighs from cadaver specimens were scanned using a 7.5-MHz ultrasound probe (SSD-1000, Aloka, Japan). The perimeter of the vastus lateralis was identified in the ultrasound images and manually digitized. Volumes were then estimated using the Cavalieri principle, by measuring the image areas of sets of parallel two-dimensional slices through the muscles. The muscles were then dissected from the cadavers, and muscle volume was determined via hydrostatic weighing. There was no statistically significant difference between the ultrasound estimation of muscle volume and that estimated using hydrostatic weighing (p > 0.05). The mean percentage error between the two volume estimates was 0.4% ± 6.9%. Three operators all performed four digitizations of all images from one randomly selected muscle; there was no statistical difference between operators or trials and the intraclass correlation was high (>0.8). The results of this study indicate that ultrasound is an accurate method for estimating muscle volumes in vivo.

  16. Open-source Software for Demand Forecasting of Clinical Laboratory Test Volumes Using Time-series Analysis.

    PubMed

    Mohammed, Emad A; Naugler, Christopher

    2017-01-01

    Demand forecasting is the area of predictive analytics devoted to predicting future volumes of services or consumables. Fair understanding and estimation of how demand will vary facilitates the optimal utilization of resources. In a medical laboratory, accurate forecasting of future demand, that is, test volumes, can increase efficiency and facilitate long-term laboratory planning. Importantly, in an era of utilization management initiatives, accurately predicted volumes compared to the realized test volumes can form a precise way to evaluate utilization management initiatives. Laboratory test volumes are often highly amenable to forecasting by time-series models; however, the statistical software needed to do this is generally either expensive or highly technical. In this paper, we describe an open-source web-based software tool for time-series forecasting and explain how to use it as a demand forecasting tool in clinical laboratories to estimate test volumes. This tool has three different models, that is, Holt-Winters multiplicative, Holt-Winters additive, and simple linear regression. Moreover, these models are ranked and the best one is highlighted. This tool will allow anyone with historic test volume data to model future demand.
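
    The cited tool is a stand-alone open-source web application; as an illustration of the same class of models only, the following sketch fits a Holt-Winters model to synthetic monthly test volumes with the statsmodels library (the use of statsmodels and all numbers are assumptions, not part of the cited work):

        import numpy as np
        import pandas as pd
        from statsmodels.tsa.holtwinters import ExponentialSmoothing

        # Synthetic monthly test volumes with trend and yearly seasonality (stand-in data):
        rng = np.random.default_rng(0)
        idx = pd.date_range("2019-01-01", periods=60, freq="MS")
        base = (5000 + 40 * np.arange(60)) * (1 + 0.1 * np.sin(2 * np.pi * np.arange(60) / 12))
        volumes = pd.Series(base * rng.normal(1.0, 0.02, 60), index=idx)

        # Holt-Winters with additive trend and multiplicative seasonality; 12-month forecast
        fit = ExponentialSmoothing(volumes, trend="add", seasonal="mul",
                                   seasonal_periods=12).fit()
        print(fit.forecast(12).round(0))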

  17. Open-source Software for Demand Forecasting of Clinical Laboratory Test Volumes Using Time-series Analysis

    PubMed Central

    Mohammed, Emad A.; Naugler, Christopher

    2017-01-01

    Background: Demand forecasting is the area of predictive analytics devoted to predicting future volumes of services or consumables. Fair understanding and estimation of how demand will vary facilitates the optimal utilization of resources. In a medical laboratory, accurate forecasting of future demand, that is, test volumes, can increase efficiency and facilitate long-term laboratory planning. Importantly, in an era of utilization management initiatives, accurately predicted volumes compared to the realized test volumes can form a precise way to evaluate utilization management initiatives. Laboratory test volumes are often highly amenable to forecasting by time-series models; however, the statistical software needed to do this is generally either expensive or highly technical. Method: In this paper, we describe an open-source web-based software tool for time-series forecasting and explain how to use it as a demand forecasting tool in clinical laboratories to estimate test volumes. Results: This tool has three different models, that is, Holt-Winters multiplicative, Holt-Winters additive, and simple linear regression. Moreover, these models are ranked and the best one is highlighted. Conclusion: This tool will allow anyone with historic test volume data to model future demand. PMID:28400996

  18. Estimating Pinus palustris tree diameter and stem volume from tree height, crown area and stand-level parameters

    Treesearch

    C.A. Gonzalez-Benecke; Salvador A. Gezan; Lisa J. Samuelson; Wendell P. Cropper; Daniel J. Leduc; Timothy A. Martin

    2014-01-01

    Accurate and efficient estimation of forest growth and live biomass is a critical element in assessing potential responses to forest management and environmental change. The objective of this study was to develop models to predict longleaf pine tree diameter at breast height (dbh) and merchantable stem volume (V) using data obtained from field measurements. We used...

  19. Price Estimation Guidelines

    NASA Technical Reports Server (NTRS)

    Chamberlain, R. G.; Aster, R. W.; Firnett, P. J.; Miller, M. A.

    1985-01-01

    Improved Price Estimation Guidelines, IPEG4, program provides comparatively simple, yet relatively accurate estimate of price of manufactured product. IPEG4 processes user supplied input data to determine estimate of price per unit of production. Input data include equipment cost, space required, labor cost, materials and supplies cost, utility expenses, and production volume on industry wide or process wide basis.

  20. Accuracy of surgical wound drainage measurements: an analysis and comparison.

    PubMed

    Yue, Brian; Nizzero, Danielle; Zhang, Chunxiao; van Zyl, Natasha; Ting, Jeannette

    2015-05-01

    Surgical drain tube readings can influence the clinical management of the post-operative patient. The accuracy of these readings has not been documented in the current literature, and this experimental study aims to address this paucity. Aliquots (10, 25, 40 and 90 mL) of black tea solution prepared to mimic haemoserous fluid were injected into UnoVac, RedoVac and Jackson-Pratt drain tubes. Nursing and medical staff from a tertiary hospital were asked to estimate drain volumes by direct observation; analysis of variance was performed on the results and the significance level was set at 0.05. Doctors and nurses are equally accurate in estimating drain tube volumes. Jackson-Pratt systems were found to be the most accurate for intermediate volumes of 25 and 40 mL. At the extremes of volume (both high and low), all drainage systems were inaccurate. This study suggests that for intermediate volumes (25 and 40 mL), the Jackson-Pratt is the drainage system of choice. The accuracy of volume measurement is diminished at the extremes of drain volume; regular emptying is recommended to avoid overfilling of drainage systems. © 2014 Royal Australasian College of Surgeons.

  1. Estimation of gas and tissue lung volumes by MRI: functional approach of lung imaging.

    PubMed

    Qanadli, S D; Orvoen-Frija, E; Lacombe, P; Di Paola, R; Bittoun, J; Frija, G

    1999-01-01

    The purpose of this work was to assess the accuracy of MRI for the determination of lung gas and tissue volumes. Fifteen healthy subjects underwent MRI of the thorax and pulmonary function tests [vital capacity (VC) and total lung capacity (TLC)] in the supine position. MR examinations were performed at inspiration and expiration. Lung volumes were measured by a previously validated technique on phantoms. Both individual and total lung volumes and capacities were calculated. MRI total vital capacity (VC(MRI)) was compared with spirometric vital capacity (VC(SP)). Capacities were correlated to lung volumes. Tissue volume (V(T)) was estimated as the difference between the total lung volume at full inspiration and the TLC. No significant difference was seen between VC(MRI) and VC(SP). Individual capacities were well correlated (r = 0.9) to static volume at full inspiration. The V(T) was estimated to be 836 ± 393 mL. This preliminary study demonstrates that MRI can accurately estimate lung gas and tissue volumes. The proposed approach appears well suited for functional imaging of the lung.

  2. Automatic portion estimation and visual refinement in mobile dietary assessment

    PubMed Central

    Woo, Insoo; Otsmo, Karl; Kim, SungYe; Ebert, David S.; Delp, Edward J.; Boushey, Carol J.

    2011-01-01

    As concern for obesity grows, the need for automated and accurate methods to monitor nutrient intake becomes essential as dietary intake provides a valuable basis for managing dietary imbalance. Moreover, as mobile devices with built-in cameras have become ubiquitous, one potential means of monitoring dietary intake is photographing meals using mobile devices and having an automatic estimate of the nutrient contents returned. One of the challenging problems of the image-based dietary assessment is the accurate estimation of food portion size from a photograph taken with a mobile digital camera. In this work, we describe a method to automatically calculate portion size of a variety of foods through volume estimation using an image. These “portion volumes” utilize camera parameter estimation and model reconstruction to determine the volume of food items, from which nutritional content is then extrapolated. In this paper, we describe our initial results of accuracy evaluation using real and simulated meal images and demonstrate the potential of our approach. PMID:22242198

  3. Automated segmentation of ventricles from serial brain MRI for the quantification of volumetric changes associated with communicating hydrocephalus in patients with brain tumor

    NASA Astrophysics Data System (ADS)

    Pura, John A.; Hamilton, Allison M.; Vargish, Geoffrey A.; Butman, John A.; Linguraru, Marius George

    2011-03-01

    Accurate ventricle volume estimates could improve the understanding and diagnosis of postoperative communicating hydrocephalus. For this category of patients, associated changes in ventricle volume can be difficult to identify, particularly over short time intervals. We present an automated segmentation algorithm that evaluates ventricle size from serial brain MRI examinations. The technique combines serial T1-weighted images to increase SNR and segments the mean image to generate a ventricle template. After pre-processing, the segmentation is initiated by a fuzzy c-means clustering algorithm to find the seeds used in a combination of fast marching methods and geodesic active contours. Finally, the ventricle template is propagated onto the serial data via non-linear registration. Serial volume estimates were obtained in an automated, robust, and accurate manner from difficult data.

  4. Can recording only the day-time voided volumes predict bladder capacity?

    PubMed

    Cho, Won Yeol; Kim, Seong Cheol; Kim, Sun-Ouck; Park, Sungchan; Lee, Sang Don; Chung, Jae Min; Kim, Kyung Do; Moon, Du Geon; Kim, Young Sig; Kim, Jun Mo

    2018-05-01

    This study aimed to demonstrate a method to easily assess bladder capacity from day-time voided volumes, which can be obtained even from patients with nocturnal enuresis, in whom the first morning void (FMV) cannot accurately predict bladder capacity because the bladder empties overnight. We evaluated 177 healthy children from 7 Korean medical centres who entered the study between January 2008 and January 2009. Voided volumes measured over more than 48 hours were recorded in a frequency volume chart (FVC). Most day-time voided volumes fell between 30% and 80% of the maximal voided volume (MVV). The maximal voided volume during day-time (MVVDT) was significantly less than the MVV (179.5±71.1 mL vs. 227.0±79.2 mL, p<0.001). The correlation coefficient between the MVV and the estimated MVV calculated from the MVVDT (MVVDT×1.25) was 0.801, suggesting a fairly strong relationship between MVVDT×1.25 and the MVV. The MVV derived from the FVC excluding the FMV was less than if the FMV had been included. When an accurate first morning voided volume cannot be obtained, as in patients with nocturnal enuresis, calculating MVVDT×1.25 allows estimation of the bladder capacity in place of the MVV.
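
    A minimal sketch of the bladder-capacity estimate described above, assuming only that the frequency volume chart yields a list of day-time voided volumes; the function name and example values are illustrative, not from the paper.

    ```python
    # Hypothetical helper: estimate bladder capacity (MVV) from day-time voids only,
    # using the MVVDT x 1.25 relationship reported in the study above.
    def estimate_mvv_from_daytime(daytime_voided_volumes_ml):
        """Return the estimated maximal voided volume (MVV, mL) from day-time voids."""
        mvvdt = max(daytime_voided_volumes_ml)  # maximal day-time voided volume (MVVDT)
        return 1.25 * mvvdt                     # estimated MVV per the reported factor

    # Example frequency-volume-chart entries (mL) for one child, day-time only:
    print(estimate_mvv_from_daytime([120, 95, 180, 150, 60]))  # -> 225.0
    ```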

  5. Main Trend Extraction Based on Irregular Sampling Estimation and Its Application in Storage Volume of Internet Data Center

    PubMed Central

    Dou, Chao

    2016-01-01

    The storage volume of an internet data center is a classic example of a time series, and predicting it has clear business value. However, the storage volume series from a data center is often “dirty”: it contains noise, missing data, and outliers, so it is necessary to extract the main trend of the series before prediction. In this paper, we propose an irregular sampling estimation method to extract the main trend of the time series, in which a Kalman filter is used to remove the “dirty” data; cubic spline interpolation and averaging are then used to reconstruct the main trend. The developed method is applied to the storage volume series of an internet data center. The experimental results show that the developed method can estimate the main trend of the storage volume series accurately and contributes substantially to predicting future volume values. PMID:28090205

  6. Main Trend Extraction Based on Irregular Sampling Estimation and Its Application in Storage Volume of Internet Data Center.

    PubMed

    Miao, Beibei; Dou, Chao; Jin, Xuebo

    2016-01-01

    The storage volume of an internet data center is a classic example of a time series, and predicting it has clear business value. However, the storage volume series from a data center is often "dirty": it contains noise, missing data, and outliers, so it is necessary to extract the main trend of the series before prediction. In this paper, we propose an irregular sampling estimation method to extract the main trend of the time series, in which a Kalman filter is used to remove the "dirty" data; cubic spline interpolation and averaging are then used to reconstruct the main trend. The developed method is applied to the storage volume series of an internet data center. The experimental results show that the developed method can estimate the main trend of the storage volume series accurately and contributes substantially to predicting future volume values.
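
    A minimal sketch of the trend-extraction pipeline described in the two records above: a simple random-walk Kalman filter to suppress "dirty" samples, followed by cubic-spline reconstruction on a regular grid. The noise parameters and synthetic series are illustrative assumptions, not values from the paper.

    ```python
    import numpy as np
    from scipy.interpolate import CubicSpline

    def kalman_smooth(y, q=0.01, r=1.0):
        """1-D Kalman filter for a random-walk state; returns filtered estimates."""
        x, p = y[0], 1.0
        out = []
        for z in y:
            p += q                   # predict step: state uncertainty grows
            k = p / (p + r)          # Kalman gain
            x += k * (z - x)         # update with the (possibly noisy) observation
            p *= (1 - k)
            out.append(x)
        return np.array(out)

    t = np.sort(np.random.uniform(0, 100, 80))               # irregular sample times
    volume = 50 + 0.3 * t + np.random.normal(0, 2, t.size)   # noisy storage-volume series
    trend = kalman_smooth(volume)                            # suppress "dirty" samples
    main_trend = CubicSpline(t, trend)(np.linspace(0, 100, 200))  # regular-grid trend
    ```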

  7. Performance of sampling methods to estimate log characteristics for wildlife.

    Treesearch

    Lisa J. Bate; Torolf R. Torgersen; Michael J. Wisdom; Edward O. Garton

    2004-01-01

    Accurate estimation of the characteristics of log resources, or coarse woody debris (CWD), is critical to effective management of wildlife and other forest resources. Despite the importance of logs as wildlife habitat, methods for sampling logs have traditionally focused on silvicultural and fire applications. These applications have emphasized estimates of log volume...

  8. A practical model for pressure probe system response estimation (with review of existing models)

    NASA Astrophysics Data System (ADS)

    Hall, B. F.; Povey, T.

    2018-04-01

    The accurate estimation of the unsteady response (bandwidth) of pneumatic pressure probe systems (probe, line and transducer volume) is a common practical problem encountered in the design of aerodynamic experiments. Understanding the bandwidth of the probe system is necessary to capture unsteady flow features accurately. Where traversing probes are used, the desired traverse speed and spatial gradients in the flow dictate the minimum probe system bandwidth required to resolve the flow. Existing approaches for bandwidth estimation are either complex or inaccurate in implementation, so probes are often designed based on experience. Where probe system bandwidth is characterized, it is often done experimentally, requiring careful experimental set-up and analysis. There is a need for a relatively simple but accurate model for estimation of probe system bandwidth. A new model is presented for the accurate estimation of pressure probe bandwidth for simple probes commonly used in wind tunnel environments; experimental validation is provided. An additional, simple graphical method for air is included for convenience.

  9. The preliminary exploration of 64-slice volume computed tomography in the accurate measurement of pleural effusion.

    PubMed

    Guo, Zhi-Jun; Lin, Qiang; Liu, Hai-Tao; Lu, Jun-Ying; Zeng, Yan-Hong; Meng, Fan-Jie; Cao, Bin; Zi, Xue-Rong; Han, Shu-Ming; Zhang, Yu-Huan

    2013-09-01

    Using computed tomography (CT) to rapidly and accurately quantify pleural effusion volume benefits medical and scientific research. However, precise measurement of pleural effusion volume still involves many challenges, and there is currently no recognized, accurate measurement method. The aim was to explore the feasibility of using 64-slice CT volume-rendering technology to accurately measure pleural fluid volume and then to analyze the correlation between the volume of a free pleural effusion and its different diameters. The 64-slice CT volume-rendering technique was used in three parts of the analysis. First, the fluid volume of a self-made thoracic model was measured and compared with the actual injected volume. Second, the pleural effusion volume was measured before and after pleural fluid drainage in 25 patients, and the volume reduction was compared with the actual volume of the liquid extracted. Finally, the free pleural effusion volume was measured in 26 patients to analyze the correlation between it and the diameters of the effusion, which was then used to derive a regression equation. When the fluid volume of the self-made thoracic model measured by the 64-slice CT volume-rendering technique was compared with the actual injection volume, no significant difference was found (P = 0.836). For the 25 patients with drained pleural effusions, the comparison of the measured volume reduction with the actual volume of the liquid extracted revealed no significant difference (P = 0.989). The following linear regression equation related the pleural effusion volume (V), measured by the CT volume-rendering technique, to the greatest depth of the effusion (d): V = 158.16 × d - 116.01 (r = 0.91, P = 0.000). The following linear regression related the volume to the product of the three pleural effusion diameters (l × h × d): V = 0.56 × (l × h × d) + 39.44 (r = 0.92, P = 0.000). The 64-slice CT volume-rendering technique can accurately measure the volume in pleural effusion patients, and a linear regression equation can be used to estimate the volume of a free pleural effusion.
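
    The two regression equations reported above, transcribed as small helpers; units follow the measurements in the study, and both return an estimated effusion volume.

    ```python
    def effusion_volume_from_depth(d):
        """V = 158.16 * d - 116.01, with d the greatest depth of the effusion."""
        return 158.16 * d - 116.01

    def effusion_volume_from_diameters(l, h, d):
        """V = 0.56 * (l * h * d) + 39.44, using the three effusion diameters."""
        return 0.56 * (l * h * d) + 39.44
    ```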

  10. Estimating cardiac fiber orientations in pig hearts using registered ultrasound and MR image volumes

    NASA Astrophysics Data System (ADS)

    Dormer, James D.; Meng, Yuguang; Zhang, Xiaodong; Jiang, Rong; Wagner, Mary B.; Fei, Baowei

    2017-03-01

    Heart fiber mechanics can be important predictors of current and future cardiac function. Accurate knowledge of these mechanics could enable cardiologists to provide a diagnosis before conditions progress. Magnetic resonance diffusion tensor imaging (MR-DTI) has been used to determine cardiac fiber orientations. Ultrasound is capable of providing anatomical information in real time, enabling a physician to quickly adjust parameters to optimize image scans. If known fiber orientations from a template heart measured using DTI can be accurately deformed onto a cardiac ultrasound volume, fiber orientations could be estimated for the patient without the need for a costly MR scan, while still providing cardiologists with valuable information about heart mechanics. In this study, we apply this registration-based approach to pig hearts, whose anatomy closely resembles that of the human heart. Experiments on pig hearts show that the registration method achieved an average Dice similarity coefficient (DSC) of 0.819 +/- 0.050 between the ultrasound and deformed MR volumes and that the proposed ultrasound-based method is able to estimate the cardiac fiber orientation in pig hearts.

  11. The Volume of Earth's Lakes

    NASA Astrophysics Data System (ADS)

    Cael, B. B.

    How much water do lakes on Earth hold? Global lake volume estimates are scarce, highly variable, and poorly documented. We develop a mechanistic null model for estimating global lake mean depth and volume based on a statistical topographic approach to Earth's surface. The volume-area scaling prediction is accurate and consistent within and across lake datasets spanning diverse regions. We applied these relationships to a global lake area census to estimate global lake volume and depth. The volume of Earth's lakes is 199,000 km³ (95% confidence interval 196,000-202,000 km³). This volume is in the range of historical estimates (166,000-280,000 km³), but the overall mean depth of 41.8 m (95% CI 41.2-42.4 m) is significantly lower than previous estimates (62-151 m). These results highlight and constrain the relative scarcity of lake waters in the hydrosphere and have implications for the role of lakes in global biogeochemical cycles. We also evaluate the size (area) distribution of lakes on Earth compared to expectations from percolation theory. This material is based upon work supported by the National Science Foundation Graduate Research Fellowship Program under Grant No. 2388357.

  12. Are EMS call volume predictions based on demand pattern analysis accurate?

    PubMed

    Brown, Lawrence H; Lerner, E Brooke; Larmon, Baxter; LeGassick, Todd; Taigman, Michael

    2007-01-01

    Most EMS systems determine the number of crews they will deploy in their communities, and when those crews will be scheduled, based on anticipated call volumes. Many systems use historical data to calculate their anticipated call volumes, a method of prediction known as demand pattern analysis. The objective was to evaluate the accuracy of call volume predictions calculated using demand pattern analysis. Seven EMS systems provided 73 consecutive weeks of hourly call volume data. The first 20 weeks of data were used to calculate three common demand pattern analysis constructs for call volume prediction: average peak demand (AP), smoothed average peak demand (SAP), and 90th percentile rank (90%R). The 21st week served as a buffer. Actual call volumes in the last 52 weeks were then compared to the predicted call volumes using descriptive statistics. There were 61,152 hourly observations in the test period. All three constructs accurately predicted peaks and troughs in call volume but not exact call volume. Predictions were accurate (+/-1 call) 13% of the time using AP, 10% using SAP, and 19% using 90%R. Call volumes were overestimated 83% of the time using AP, 86% using SAP, and 74% using 90%R. When call volumes were overestimated, predictions exceeded actual call volume by a median (interquartile range) of 4 (2-6) calls for AP, 4 (2-6) for SAP, and 3 (2-5) for 90%R. Call volumes were underestimated 4% of the time using AP, 4% using SAP, and 7% using 90%R predictions. When call volumes were underestimated, call volumes exceeded predictions by a median (interquartile range; maximum underestimation) of 1 (1-2; 18) call for AP, 1 (1-2; 18) for SAP, and 2 (1-3; 20) for 90%R. Results did not vary between systems. Generally, demand pattern analysis estimated or overestimated call volume, making it a reasonable predictor for ambulance staffing patterns. However, it did underestimate call volume between 4% and 7% of the time. Communities need to determine whether these rates of over- and underestimation are acceptable given their resources and local priorities.
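
    A rough illustration of a demand-pattern-analysis prediction, assuming hourly call counts grouped by hour-of-week. Only the 90th-percentile-rank (90%R) construct is sketched here, since its meaning is clear from the name; the AP and SAP constructs follow specific definitions in the paper that are not reproduced. The synthetic history is illustrative.

    ```python
    import numpy as np

    def predict_90th_percentile(history):
        """history: 2-D array, rows = weeks, columns = the 168 hours of the week.
        Returns a 168-element prediction (90th percentile per hour-of-week)."""
        return np.percentile(history, 90, axis=0)

    history = np.random.poisson(lam=3.0, size=(20, 168))  # 20 weeks of synthetic counts
    prediction = predict_90th_percentile(history)         # per-hour staffing estimate
    ```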

  13. Stroke Volume estimation using aortic pressure measurements and aortic cross sectional area: Proof of concept.

    PubMed

    Kamoi, S; Pretty, C G; Chiew, Y S; Pironet, A; Davidson, S; Desaive, T; Shaw, G M; Chase, J G

    2015-08-01

    Accurate stroke volume (SV) monitoring is essential for patients with cardiovascular dysfunction. However, direct SV measurements are not clinically feasible due to the highly invasive nature of measurement devices. Current devices for indirect monitoring of SV are shown to be inaccurate during sudden hemodynamic changes. This paper presents a novel SV estimation method using readily available aortic pressure measurements and aortic cross-sectional area, with data from a porcine experiment in which medical interventions such as fluid replacement, dobutamine infusions, and recruitment maneuvers induced SV changes in a pig with circulatory shock. Left ventricular volume, proximal aortic pressure, and descending aortic pressure waveforms were measured simultaneously during the experiment. From the measured data, proximal aortic pressure was separated into reservoir and excess pressures. Beat-to-beat aortic characteristic impedance values were calculated using both aortic pressure measurements and an estimate of the aortic cross-sectional area. SV was estimated using the calculated aortic characteristic impedance and the excess pressure component of the proximal aorta. The median difference between directly measured SV and estimated SV was -1.4 ml with 95% limits of agreement of +/- 6.6 ml. This method demonstrates that SV can be accurately captured beat-to-beat during sudden changes in hemodynamic state. This novel SV estimation could enable improved cardiac and circulatory treatment in the critical care environment by titrating treatment to its effect on SV.
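
    A schematic sketch of the final step described above: once the excess-pressure component and the aortic characteristic impedance Z_c are known, stroke volume follows from integrating flow = excess pressure / Z_c over the beat. The reservoir/excess separation and the beat-to-beat identification of Z_c in the paper are more involved and are not reproduced here; the waveform and impedance value below are illustrative.

    ```python
    import numpy as np

    def stroke_volume_from_excess_pressure(p_excess_mmhg, z_c, dt):
        """p_excess_mmhg: excess-pressure samples over one beat; z_c: characteristic
        impedance (mmHg*s/mL); dt: sample spacing (s). Returns SV in mL."""
        flow = np.asarray(p_excess_mmhg) / z_c   # instantaneous aortic flow (mL/s)
        return float(np.sum(flow) * dt)          # integrate flow over the beat -> SV

    t = np.linspace(0, 0.3, 301)                 # ejection period of one beat (s)
    p_excess = 40 * np.sin(np.pi * t / 0.3)      # toy excess-pressure waveform (mmHg)
    print(stroke_volume_from_excess_pressure(p_excess, z_c=0.12, dt=t[1] - t[0]))
    ```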

  14. Measurement of lung expansion with computed tomography and comparison with quantitative histology.

    PubMed

    Coxson, H O; Mayo, J R; Behzad, H; Moore, B J; Verburgt, L M; Staples, C A; Paré, P D; Hogg, J C

    1995-11-01

    The total and regional lung volumes were estimated from computed tomography (CT), and the pleural pressure gradient was determined by using the milliliters of gas per gram of tissue estimated from the X-ray attenuation values and the pressure-volume curve of the lung. The data show that CT accurately estimated the volume of the resected lobe but overestimated its weight by 24 +/- 19%. The volume of gas per gram of tissue was less in the gravity-dependent regions due to a pleural pressure gradient of 0.24 +/- 0.08 cmH2O/cm of descent in the thorax. The proportion of tissue to air obtained with CT was similar to that obtained by quantitative histology. We conclude that the CT scan can be used to estimate total and regional lung volumes and that measurements of the proportions of tissue and air within the thorax by CT can be used in conjunction with quantitative histology to evaluate lung structure.
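
    One common approximation for turning CT attenuation into millilitres of gas per gram of tissue (an assumption stated here for illustration, not the paper's exact calibration): treat air as -1000 HU and tissue as roughly 0 HU, split the attenuation of a lung region into gas and tissue fractions, and convert the tissue fraction to mass with an assumed tissue density.

    ```python
    def gas_per_gram_tissue(hu, tissue_density_g_per_ml=1.065):
        """hu: mean CT attenuation of a lung region (expected between -1000 and 0)."""
        gas_fraction = max(0.0, min(1.0, -hu / 1000.0))   # fraction of the region that is gas
        tissue_fraction = 1.0 - gas_fraction              # remainder assumed to be tissue
        tissue_grams_per_ml = tissue_fraction * tissue_density_g_per_ml
        return gas_fraction / tissue_grams_per_ml         # mL gas per g tissue

    print(gas_per_gram_tissue(-850))   # ~5.3 mL of gas per gram of tissue
    ```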

  15. Recent work on material interface reconstruction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mosso, S.J.; Swartz, B.K.

    1997-12-31

    For the last 15 years, many Eulerian codes have relied on a series of piecewise linear interface reconstruction algorithms developed by David Youngs. In a typical Youngs' method, the material interfaces were reconstructed based upon nearby cell values of volume fractions of each material. The interfaces were locally represented by linear segments in two dimensions and by pieces of planes in three dimensions. The first step in such reconstruction was to locally approximate an interface normal. In Youngs' 3D method, a local gradient of a cell-volume-fraction function was estimated and taken to be the local interface normal. A linear interface was moved perpendicular to the now known normal until the mass behind it matched the material volume fraction for the cell in question. But for distorted or nonorthogonal meshes, the gradient normal estimate didn't accurately match that of linear material interfaces. Moreover, curved material interfaces were also poorly represented. The authors will present some recent work in the computation of more accurate interface normals, without necessarily increasing stencil size. Their estimate of the normal is made using an iterative process that, given mass fractions for nearby cells of known but arbitrary variable density, converges in 3 or 4 passes in practice (and quadratically, like Newton's method, in principle). The method reproduces a linear interface in both orthogonal and nonorthogonal meshes. The local linear approximation is generally 2nd-order accurate, with a 1st-order accurate normal for curved interfaces in both two and three dimensional polyhedral meshes. Recent work demonstrating the interface reconstruction for curved surfaces will be discussed.

  16. Back to the future: estimating pre-injury brain volume in patients with traumatic brain injury.

    PubMed

    Ross, David E; Ochs, Alfred L; D Zannoni, Megan; Seabaugh, Jan M

    2014-11-15

    A recent meta-analysis by Hedman et al. allows for accurate estimation of brain volume changes throughout the life span. Additionally, Tate et al. showed that intracranial volume at a later point in life can be used to reliably estimate brain volume at an earlier point in life. These advancements were combined to create a model which allowed the estimation of brain volume just prior to injury in a group of patients with mild or moderate traumatic brain injury (TBI). This volume estimation model was used in combination with actual measurements of brain volume to test hypotheses about progressive brain volume changes in the patients. Twenty-six patients with mild or moderate TBI were compared to 20 normal control subjects. NeuroQuant® was used to measure brain MRI volume. Brain volume after the injury (from MRI scans performed at t1 and t2) was compared to brain volume just before the injury (volume estimated at t0) using longitudinal designs. Groups were compared with respect to volume changes in whole brain parenchyma (WBP) and its 3 major subdivisions: cortical gray matter (GM), cerebral white matter (CWM) and subcortical nuclei+infratentorial regions (SCN+IFT). Using the normal control data, the volume estimation model was tested by comparing measured brain volume to estimated brain volume; reliability ranged from good to excellent. During the initial phase after injury (t0-t1), the TBI patients had abnormally rapid atrophy of WBP and CWM, and abnormally rapid enlargement of SCN+IFT. Rates of volume change during t0-t1 correlated with cross-sectional measures of volume change at t1, supporting the internal reliability of the volume estimation model. A logistic regression analysis using the volume change data produced a function which perfectly predicted group membership (TBI patients vs. normal control subjects). During the first few months after injury, patients with mild or moderate TBI have rapid atrophy of WBP and CWM, and rapid enlargement of SCN+IFT. The magnitude and pattern of the changes in volume may allow for the eventual development of diagnostic tools based on the volume estimation approach. Copyright © 2014 Elsevier Inc. All rights reserved.

  17. Improved biovolume estimation of Microcystis aeruginosa colonies: A statistical approach.

    PubMed

    Alcántara, I; Piccini, C; Segura, A M; Deus, S; González, C; Martínez de la Escalera, G; Kruk, C

    2018-05-27

    The Microcystis aeruginosa complex (MAC) clusters many of the most common freshwater and brackish bloom-forming cyanobacteria. In monitoring protocols, biovolume estimation is a common approach to determining the biomass of MAC colonies and is useful for prediction purposes. Biovolume (μm³ mL⁻¹) is calculated by multiplying organism abundance (org L⁻¹) by colonial volume (μm³ org⁻¹). Colonial volume is estimated based on geometric shapes and requires accurate measurements of dimensions using optical microscopy. A trade-off is posed between easy-to-measure but low-accuracy simple shapes (e.g., sphere) and time-costly but high-accuracy complex shapes (e.g., ellipsoid) for volume estimation. The effects of overestimation on ecological studies and on management decisions associated with harmful blooms are significant because of the large sizes of MAC colonies. In this work, we aimed to increase the precision of MAC biovolume estimations by developing a statistical model based on two easy-to-measure dimensions. We analyzed field data from a wide environmental gradient (800 km) spanning freshwater to estuarine and seawater. We measured length, width and depth for ca. 5700 colonies under an inverted microscope and estimated colonial volume using three recommended geometrical shapes (sphere, prolate spheroid and ellipsoid). Because of the non-spherical shape of MAC, the ellipsoid was the most accurate approximation, whereas the sphere overestimated colonial volume (3-80), especially for large colonies (MLD higher than 300 μm). The ellipsoid requires measuring three dimensions and is time-consuming. Therefore, we constructed different statistical models to predict colony depth from length and width. Splitting the data into training (2/3) and test (1/3) sets, all models resulted in low average training (1.41-1.44%) and testing (1.3-2.0%) errors. The models were also evaluated using three other independent datasets. The multiple linear model was finally selected to calculate MAC volume as an ellipsoid based on length and width. This work contributes to achieving a better estimation of MAC volume applicable to monitoring programs as well as to ecological research. Copyright © 2017. Published by Elsevier B.V.
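
    A sketch of the two steps above: fit a multiple linear model predicting colony depth from length and width, then compute biovolume as an ellipsoid, π/6 × L × W × D. The data and fitted coefficients below are synthetic, not the published model.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    L = rng.uniform(50, 400, 200)                  # colony length (um), synthetic
    W = L * rng.uniform(0.6, 0.9, 200)             # colony width (um), synthetic
    D = 0.8 * W + 5 + rng.normal(0, 5, 200)        # "measured" colony depth (um), synthetic

    X = np.column_stack([np.ones_like(L), L, W])   # design matrix with intercept
    coef, *_ = np.linalg.lstsq(X, D, rcond=None)   # depth ~ b0 + b1*length + b2*width

    def ellipsoid_volume(length, width, depth):
        return np.pi / 6.0 * length * width * depth    # colonial volume (um^3)

    depth_hat = X @ coef                           # predicted depth from two dimensions
    biovolume_per_colony = ellipsoid_volume(L, W, depth_hat)
    ```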

  18. Volume error analysis for lung nodules attached to pulmonary vessels in an anthropomorphic thoracic phantom

    NASA Astrophysics Data System (ADS)

    Kinnard, Lisa M.; Gavrielides, Marios A.; Myers, Kyle J.; Zeng, Rongping; Peregoy, Jennifer; Pritchard, William; Karanian, John W.; Petrick, Nicholas

    2008-03-01

    With high-resolution CT, three-dimensional (3D) methods for nodule volumetry have been introduced, with the hope that such methods will be more accurate and consistent than currently used planar measures of size. However, the error associated with volume estimation methods still needs to be quantified. Volume estimation error is multi-faceted in the sense that it is impacted by characteristics of the patient, the software tool and the CT system. The overall goal of this research is to quantify the various sources of measurement error and, when possible, minimize their effects. In the current study, we estimated nodule volume from ten repeat scans of an anthropomorphic phantom containing two synthetic spherical lung nodules (diameters: 5 and 10 mm; density: -630 HU), using a 16-slice Philips CT scanner with 20, 50, 100 and 200 mAs exposures and 0.8 and 3.0 mm slice thicknesses. True volume was estimated from an average of diameter measurements made using digital calipers. We report variance and bias results for volume measurements as a function of slice thickness, nodule diameter, and X-ray exposure.
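
    A small helper for the reference ("true") volume used above, assuming a spherical nodule whose diameter is the average of repeated caliper measurements; the caliper readings and CT result are illustrative.

    ```python
    import math

    def sphere_volume_from_diameters(diameters_mm):
        d = sum(diameters_mm) / len(diameters_mm)     # average caliper diameter (mm)
        return math.pi / 6.0 * d ** 3                 # sphere volume (mm^3)

    true_v = sphere_volume_from_diameters([9.98, 10.02, 10.01])
    measured_v = 545.0                                # e.g., one CT-derived volume (mm^3)
    print(100.0 * (measured_v - true_v) / true_v)     # percent bias of the measurement
    ```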

  19. Continuous stroke volume estimation from aortic pressure using zero dimensional cardiovascular model: proof of concept study from porcine experiments.

    PubMed

    Kamoi, Shun; Pretty, Christopher; Docherty, Paul; Squire, Dougie; Revie, James; Chiew, Yeong Shiong; Desaive, Thomas; Shaw, Geoffrey M; Chase, J Geoffrey

    2014-01-01

    Accurate, continuous, left ventricular stroke volume (SV) measurements can convey large amounts of information about patient hemodynamic status and response to therapy. However, direct measurements are highly invasive in clinical practice, and current procedures for estimating SV require specialized devices and significant approximation. This study investigates the accuracy of a three-element Windkessel model combined with an aortic pressure waveform to estimate SV. Aortic pressure is separated into two components capturing: 1) resistance and compliance, and 2) characteristic impedance. This separation provides model-element relationships enabling SV to be estimated while requiring only one of the three element values to be known or estimated. Beat-to-beat SV estimation was performed using population-representative optimal values for each model element. This method was validated using measured SV data from porcine experiments (N = 3 female Pietrain pigs, 29-37 kg) in which both ventricular volume and aortic pressure waveforms were measured simultaneously. The median difference between SV measured from left ventricle (LV) output and estimated SV was 0.6 ml, with a 90% range (5th-95th percentile) of -12.4 ml to 14.3 ml. During periods when changes in SV were induced, cross correlations between estimated and measured SV were above R = 0.65 for all cases. The method presented demonstrates that the magnitude and trends of SV can be accurately estimated from pressure waveforms alone, without the need for identification of complex physiological metrics whose strength of correlation may vary significantly from patient to patient.

  20. Analytical study to define a helicopter stability derivative extraction method, volume 1

    NASA Technical Reports Server (NTRS)

    Molusis, J. A.

    1973-01-01

    A method is developed for extracting six degree-of-freedom stability and control derivatives from helicopter flight data. Different combinations of filtering and derivative estimation are investigated and used with a Bayesian approach for derivative identification. The combination of filtering and estimation found to yield the most accurate time-response match to flight test data is determined and applied to CH-53A and CH-54B flight data. The method found to be most accurate consists of (1) filtering flight test data with a digital filter followed by an extended Kalman filter, (2) identifying a derivative estimate with a least-squares estimator, and (3) obtaining derivatives with the Bayesian derivative extraction method.

  1. Recent numerical and algorithmic advances within the volume tracking framework for modeling interfacial flows

    DOE PAGES

    François, Marianne M.

    2015-05-28

    A review of recent advances made in numerical methods and algorithms within the volume tracking framework is presented. The volume tracking method, also known as the volume-of-fluid method, has become an established numerical approach to model and simulate interfacial flows. Its advantage is its strict mass conservation. However, because the interface is not explicitly tracked but captured via the material volume fraction on a fixed mesh, accurate estimation of the interface position, its geometric properties and modeling of interfacial physics in the volume tracking framework remain difficult. Several improvements have been made over the last decade to address these challenges. In this study, the multimaterial interface reconstruction method via power diagram, curvature estimation via heights and mean values and the balanced-force algorithm for surface tension are highlighted.

  2. Optimal back-extrapolation method for estimating plasma volume in humans using the indocyanine green dilution method.

    PubMed

    Polidori, David; Rowley, Clarence

    2014-07-22

    The indocyanine green dilution method is one of the methods available to estimate plasma volume, although some researchers have questioned the accuracy of this method. We developed a new, physiologically based mathematical model of indocyanine green kinetics that more accurately represents indocyanine green kinetics during the first few minutes postinjection than what is assumed when using the traditional mono-exponential back-extrapolation method. The mathematical model is used to develop an optimal back-extrapolation method for estimating plasma volume based on simulated indocyanine green kinetics obtained from the physiological model. Results from a clinical study using the indocyanine green dilution method in 36 subjects with type 2 diabetes indicate that the estimated plasma volumes are considerably lower when using the traditional back-extrapolation method than when using the proposed back-extrapolation method (mean (standard deviation) plasma volume = 26.8 (5.4) mL/kg for the traditional method vs 35.1 (7.0) mL/kg for the proposed method). The results obtained using the proposed method are more consistent with previously reported plasma volume values. Based on the more physiological representation of indocyanine green kinetics and greater consistency with previously reported plasma volume values, the new back-extrapolation method is proposed for use when estimating plasma volume using the indocyanine green dilution method.
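
    A sketch of the traditional mono-exponential back-extrapolation step that the paper improves upon (the proposed model-based method itself is not reproduced here): fit log-concentration against time, extrapolate to the injection time, and divide the dose by the extrapolated concentration. The sampling times, concentrations, and dose are illustrative.

    ```python
    import numpy as np

    def plasma_volume_backextrapolation(t_min, conc_mg_per_l, dose_mg):
        """t_min: sampling times after injection (min); conc_mg_per_l: measured ICG levels."""
        slope, intercept = np.polyfit(t_min, np.log(conc_mg_per_l), 1)
        c0 = np.exp(intercept)            # back-extrapolated concentration at t = 0
        return dose_mg / c0               # plasma volume in litres

    t = np.array([2.0, 3.0, 4.0, 5.0, 6.0])
    conc = 8.0 * np.exp(-0.25 * t)        # synthetic mono-exponential decay (mg/L)
    print(plasma_volume_backextrapolation(t, conc, dose_mg=25.0))  # ~3.1 L
    ```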

  3. Validation of Body Volume Acquisition by Using Elliptical Zone Method.

    PubMed

    Chiu, C-Y; Pease, D L; Fawkner, S; Sanders, R H

    2016-12-01

    The elliptical zone method (E-Zone) can be used to obtain reliable body volume data, including total body volume and segmental volumes, with inexpensive and portable equipment. The purpose of this research was to assess the accuracy of body volume data obtained from E-Zone by comparing them with those acquired from the 3D photonic scanning method (3DPS). 17 male participants with diverse somatotypes were recruited. Each participant was scanned twice on the same day by a 3D whole-body scanner and photographed twice for the E-Zone analysis. The body volume data acquired from 3DPS were regarded as the reference against which the accuracy of E-Zone was assessed. The relative technical error of measurement (TEM) of total body volume estimation was around 3% for E-Zone. E-Zone can estimate the segmental volumes of the upper torso, lower torso, thigh, shank, upper arm and lower arm accurately (relative TEM<10%), but the accuracy for small segments including the neck, hand and foot was poor. In summary, E-Zone provides a reliable, inexpensive, portable, and simple method to obtain reasonable estimates of total body volume and to indicate segmental volume distribution. © Georg Thieme Verlag KG Stuttgart · New York.

  4. Preoperative estimation of the liver graft weight in adult right lobe living donor liver transplantation using maximal portal vein diameters.

    PubMed

    Wang, Frank; Pan, Kuang-Tse; Chu, Sung-Yu; Chan, Kun-Ming; Chou, Hong-Shiue; Wu, Ting-Jung; Lee, Wei-Chen

    2011-04-01

    An accurate preoperative estimate of the graft weight is vital to avoid small-for-size syndrome in the recipient and ensure donor safety after adult living donor liver transplantation (LDLT). Here we describe a simple method for estimating the graft volume (GV) that uses the maximal right portal vein diameter (RPVD) and the maximal left portal vein diameter (LPVD). Between June 2004 and December 2009, 175 consecutive donors undergoing right hepatectomy for LDLT were retrospectively reviewed. The GV was determined with 3 estimation methods: (1) the radiological graft volume (RGV) estimated by computed tomography (CT) volumetry; (2) the computed tomography-calculated graft volume (CGV-CT), obtained by multiplying the standard liver volume (SLV) by the RGV percentage with respect to the total liver volume derived from CT; and (3) the portal vein diameter ratio-calculated graft volume (CGV-PVDR), obtained by multiplying the SLV by the portal vein diameter ratio [PVDR; i.e., PVDR = RPVD²/(RPVD² + LPVD²)]. These values were compared to the actual graft weight (AGW), which was measured intraoperatively. The mean AGW was 633.63 ± 107.51 g, whereas the mean RGV, CGV-CT, and CGV-PVDR values were 747.83 ± 138.59, 698.21 ± 94.81, and 685.20 ± 90.88 cm³, respectively. All 3 estimation methods tended to overestimate the AGW (P < 0.001). The actual graft-to-recipient body weight ratio (GRWR) was 1.00% ± 0.19%, and the GRWRs calculated on the basis of the RGV, CGV-CT, and CGV-PVDR values were 1.19% ± 0.25%, 1.11% ± 0.22%, and 1.09% ± 0.21%, respectively. Overall, the CGV-PVDR values better correlated with the AGW and GRWR values according to Lin's concordance correlation coefficient and the Landis and Koch benchmark. In conclusion, the PVDR method is a simple estimation method that accurately predicts GVs and GRWRs in adult LDLT. Copyright © 2011 American Association for the Study of Liver Diseases.
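
    The portal-vein-diameter-ratio estimate described above, transcribed directly from the reported formulas: PVDR = RPVD²/(RPVD² + LPVD²) and CGV-PVDR = SLV × PVDR. The diameters and standard liver volume in the example are illustrative.

    ```python
    def graft_volume_pvdr(rpvd_mm, lpvd_mm, slv_ml):
        pvdr = rpvd_mm ** 2 / (rpvd_mm ** 2 + lpvd_mm ** 2)   # portal vein diameter ratio
        return slv_ml * pvdr                                  # estimated right-lobe graft volume (mL)

    print(graft_volume_pvdr(rpvd_mm=12.0, lpvd_mm=10.0, slv_ml=1200.0))  # ~708 mL
    ```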

  5. Accurate bulk density determination of irregularly shaped translucent and opaque aerogels

    NASA Astrophysics Data System (ADS)

    Petkov, M. P.; Jones, S. M.

    2016-05-01

    We present a volumetric method for accurate determination of the bulk density of aerogels, calculated from the extrapolated weight of the dry pure solid and from volume estimates based on Archimedes' principle of volume displacement, using packed 100 μm-sized monodispersed glass spheres as a "quasi-fluid" medium. Hard-particle packing theory is invoked to demonstrate the reproducibility of the apparent density of the quasi-fluid. Accuracy rivaling that of the refractive index method is demonstrated for both translucent and opaque aerogels with different absorptive properties, as well as for aerogels with regular and irregular shapes.
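
    A sketch of the displacement bookkeeping, assuming the packed-sphere "quasi-fluid" has a separately calibrated apparent density: the sphere mass excluded by the sample gives the sample volume, and bulk density is dry mass over that volume. The function and the numbers are illustrative, not the authors' exact procedure.

    ```python
    def aerogel_bulk_density(dry_mass_g, sphere_mass_empty_g, sphere_mass_with_sample_g,
                             sphere_apparent_density_g_per_cm3):
        displaced_sphere_mass = sphere_mass_empty_g - sphere_mass_with_sample_g
        sample_volume_cm3 = displaced_sphere_mass / sphere_apparent_density_g_per_cm3
        return dry_mass_g / sample_volume_cm3     # bulk density (g/cm^3)

    print(aerogel_bulk_density(0.30, 95.0, 92.0, 1.50))   # -> 0.15 g/cm^3
    ```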

  6. Novel blood pressure and pulse pressure estimation based on pulse transit time and stroke volume approximation.

    PubMed

    Lee, Joonnyong; Sohn, JangJay; Park, Jonghyun; Yang, SeungMan; Lee, Saram; Kim, Hee Chan

    2018-06-18

    Non-invasive continuous blood pressure monitors are of great interest to the medical community due to their value in hypertension management. Recently, studies have shown the potential of pulse pressure as a therapeutic target for hypertension, but not enough attention has been given to non-invasive continuous monitoring of pulse pressure. Although accurate pulse pressure estimation can be of direct value to hypertension management and indirectly to the estimation of systolic blood pressure, as it is the sum of pulse pressure and diastolic blood pressure, only a few inadequate methods of pulse pressure estimation have been proposed. We present a novel, non-invasive blood pressure and pulse pressure estimation method based on pulse transit time and pre-ejection period. Pre-ejection period and pulse transit time were measured non-invasively using electrocardiogram, seismocardiogram, and photoplethysmogram measured from the torso. The proposed method used the 2-element Windkessel model to model pulse pressure with the ratio of stroke volume, approximated by pre-ejection period, and arterial compliance, estimated by pulse transit time. Diastolic blood pressure was estimated using pulse transit time, and systolic blood pressure was estimated as the sum of the two estimates. The estimation method was verified in 11 subjects in two separate conditions with induced cardiovascular response and the results were compared against a reference measurement and values obtained from a previously proposed method. The proposed method yielded high agreement with the reference (pulse pressure correlation with reference R ≥ 0.927, diastolic blood pressure correlation with reference R ≥ 0.854, systolic blood pressure correlation with reference R ≥ 0.914) and high estimation accuracy in pulse pressure (mean root-mean-squared error ≤ 3.46 mmHg) and blood pressure (mean root-mean-squared error ≤ 6.31 mmHg for diastolic blood pressure and ≤ 8.41 mmHg for systolic blood pressure) over a wide range of hemodynamic changes. The proposed pulse pressure estimation method provides accurate estimates in situations with and without significant changes in stroke volume. The proposed method improves upon the currently available systolic blood pressure estimation methods by providing accurate pulse pressure estimates.

  7. A Comparison of Lung Nodule Segmentation Algorithms: Methods and Results from a Multi-institutional Study.

    PubMed

    Kalpathy-Cramer, Jayashree; Zhao, Binsheng; Goldgof, Dmitry; Gu, Yuhua; Wang, Xingwei; Yang, Hao; Tan, Yongqiang; Gillies, Robert; Napel, Sandy

    2016-08-01

    Tumor volume estimation, as well as accurate and reproducible border segmentation in medical images, is important in the diagnosis, staging, and assessment of response to cancer therapy. The goal of this study was to demonstrate the feasibility of a multi-institutional effort to assess the repeatability and reproducibility of nodule borders and the volume estimation bias of computerized segmentation algorithms in CT images of lung cancer, and to provide results from such a study. The dataset used for this evaluation consisted of 52 tumors in 41 CT volumes (40 patient datasets and 1 dataset containing scans of 12 phantom nodules of known volume) from five collections available in The Cancer Imaging Archive. Three academic institutions developing lung nodule segmentation algorithms submitted results for three repeat runs for each of the nodules. We compared the performance of lung nodule segmentation algorithms by assessing several measures of spatial overlap and volume measurement. Nodule sizes varied from 29 μl to 66 ml and demonstrated a diversity of shapes. Agreement in spatial overlap of segmentations was significantly higher for multiple runs of the same algorithm than between segmentations generated by different algorithms (p < 0.05) and was significantly higher on the phantom dataset compared to the other datasets (p < 0.05). Algorithms differed significantly in the bias of the measured volumes of the phantom nodules (p < 0.05), underscoring the need to assess performance on clinical data in addition to phantoms. Algorithms that most accurately estimated nodule volumes were not the most repeatable, emphasizing the need to evaluate both their accuracy and precision. There were considerable differences between algorithms, especially in a subset of heterogeneous nodules, underscoring the recommendation that the same software be used at all time points in longitudinal studies.

  8. Completely automated estimation of prostate volume for 3-D side-fire transrectal ultrasound using shape prior approach

    NASA Astrophysics Data System (ADS)

    Li, Lu; Narayanan, Ramakrishnan; Miller, Steve; Shen, Feimo; Barqawi, Al B.; Crawford, E. David; Suri, Jasjit S.

    2008-02-01

    Real-time knowledge of the capsule volume of an organ provides a valuable clinical tool for 3D biopsy applications. It is challenging to estimate this capsule volume in real time due to the presence of speckle, shadow artifacts, partial volume effects and patient motion during image scans, all of which are inherent in medical ultrasound imaging. The volumetric ultrasound prostate images are sliced in a rotational manner every three degrees. The automated segmentation method employs a shape model, obtained from training data, to delineate the middle slices of the volumetric prostate images. A "DDC" algorithm is then applied to the remaining images, initialized with the contours obtained. The volume of the prostate is estimated from the segmentation results. Our database consists of 36 prostate volumes acquired on a Philips ultrasound machine with a side-fire transrectal ultrasound (TRUS) probe. We compare our automated method with the semi-automated approach. The mean volumes using the semi-automated and completely automated techniques were 35.16 cc and 34.86 cc, with errors of 7.3% and 7.6%, respectively, compared to the volume obtained from the human-estimated (ideal) boundary. The overall system, which was developed using Microsoft Visual C++, is real-time and accurate.

  9. Improved pressure contour analysis for estimating cardiac stroke volume using pulse wave velocity measurement.

    PubMed

    Kamoi, Shun; Pretty, Christopher; Balmer, Joel; Davidson, Shaun; Pironet, Antoine; Desaive, Thomas; Shaw, Geoffrey M; Chase, J Geoffrey

    2017-04-24

    Pressure contour analysis is commonly used to estimate cardiac performance for patients suffering from cardiovascular dysfunction in the intensive care unit. However, the existing techniques for continuous estimation of stroke volume (SV) from pressure measurement can be unreliable during hemodynamic instability, which is inevitable for patients requiring significant treatment. For this reason, pressure contour methods must be improved to capture changes in vascular properties and thus provide accurate conversion from pressure to flow. This paper presents a novel pressure contour method utilizing pulse wave velocity (PWV) measurement to capture vascular properties. A three-element Windkessel model combined with the reservoir-wave concept is used to decompose the pressure contour into components related to storage and flow. The model parameters are identified beat-to-beat from the water-hammer equation using measured PWV, the wave component of the pressure, and an estimate of the subject-specific aortic dimension. SV is then calculated by converting pressure to flow using the identified model parameters. The accuracy of this novel method is investigated using data from porcine experiments (N = 4 Pietrain pigs, 20-24.5 kg), where hemodynamic properties were significantly altered using dobutamine, fluid administration, and mechanical ventilation. In the experiment, left ventricular volume was measured using an admittance catheter, and aortic pressure waveforms were measured at two locations, the aortic arch and the abdominal aorta. Bland-Altman analysis comparing gold-standard SV measured by the admittance catheter and estimated SV from the novel method showed average limits of agreement of ±26% across significant hemodynamic alterations. This result shows that the method is capable of estimating clinically acceptable absolute SV values according to Critchley and Critchley. The novel pressure contour method presented can accurately estimate and track SV even when hemodynamic properties are significantly altered. Integrating PWV measurements into pressure contour analysis improves identification of beat-to-beat changes in Windkessel model parameters and thus provides an accurate estimate of blood flow from the measured pressure contour. The method has great potential for overcoming weaknesses associated with current pressure contour methods for estimating SV.

  10. How Accurate Are We in Estimating True Stone Volume? A Comparison of Water Displacement, Ellipsoid Formula, and a CT-Based Software Tool.

    PubMed

    Jain, Rajat; Omar, Mohamed; Chaparala, Hemant; Kahn, Adam; Li, Jianbo; Kahn, Leonard; Sivalingam, Sri

    2018-04-23

    To compare the accuracy and reliability of stone volume estimated by the ellipsoid formula (EFv) and a CT-based algorithm (CTv) with true volume (TV) by water displacement in an in vitro model. Ninety stone phantoms were created using clay (0.5-40 cm³, 814 ± 91 HU) and scanned with CT. For each stone, TV was measured by water displacement, CTv was calculated by the region-growing algorithm in the CT-based software AGFA IMPAX Volume Viewer, and EFv was calculated by the standard formula π × L × W × H × 0.167. All measurements were repeated thrice, and the concordance correlation coefficient (CCC) was calculated for the whole group, as well as for subgroups based on volume (<1.5 cm³, 1.5-6 cm³, and >6 cm³). Mean TV, CTv, and EFv were 6.42 ± 6.57 cm³ (range: 0.5-39.37 cm³), 6.24 ± 6.15 cm³ (0.48-36.1 cm³), and 8.98 ± 9.96 cm³ (0.49-47.05 cm³), respectively. When comparing TV to CTv, the CCC was 0.99 (95% confidence interval [CI]: 0.99-0.995), indicating excellent agreement, although TV was slightly underestimated at larger volumes. When comparing TV to EFv, the CCC was 0.82 (95% CI: 0.78-0.86), indicating poor agreement. EFv tended to overestimate the TV, especially as stone volume increased beyond 1.5 cm³, and there was a significant spread between trials. An automated CT-based algorithm more accurately and reliably estimates stone volume than does the ellipsoid formula. While further research is necessary to validate stone volume as a surrogate for stone burden, CT-based algorithmic volume measurement of urinary stones is a promising technology.
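
    The ellipsoid formula used in the comparison above, transcribed directly: EFv = π × L × W × H × 0.167 (0.167 ≈ 1/6), with the three orthogonal stone diameters; the example dimensions are illustrative.

    ```python
    import math

    def stone_volume_ellipsoid(l_cm, w_cm, h_cm):
        return math.pi * l_cm * w_cm * h_cm * 0.167   # estimated stone volume (cm^3)

    print(stone_volume_ellipsoid(2.4, 1.8, 1.5))       # ~3.4 cm^3
    ```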

  11. MRI volumetry of prefrontal cortex

    NASA Astrophysics Data System (ADS)

    Sheline, Yvette I.; Black, Kevin J.; Lin, Daniel Y.; Pimmel, Joseph; Wang, Po; Haller, John W.; Csernansky, John G.; Gado, Mokhtar; Walkup, Ronald K.; Brunsden, Barry S.; Vannier, Michael W.

    1995-05-01

    Prefrontal cortex volumetry by brain magnetic resonance (MR) is required to estimate changes postulated to occur in certain psychiatric and neurologic disorders. A semiautomated method with quantitative characterization of its performance is sought to reliably distinguish small prefrontal cortex volume changes within individuals and between groups. Stereological methods were tested by a blinded comparison of measurements applied to 3D MR scans obtained using an MPRAGE protocol. Fixed grid stereologic methods were used to estimate prefrontal cortex volumes on a graphic workstation, after the images are scaled from 16 to 8 bits using a histogram method. In addition images were resliced into coronal sections perpendicular to the bicommissural plane. Prefrontal cortex volumes were defined as all sections of the frontal lobe anterior to the anterior commissure. Ventricular volumes were excluded. Stereological measurement yielded high repeatability and precision, and was time efficient for the raters. The coefficient of error was

  12. Gastropod shell size and architecture influence the applicability of methods used to estimate internal volume.

    PubMed

    Ragagnin, Marilia Nagata; Gorman, Daniel; McCarthy, Ian Donald; Sant'Anna, Bruno Sampaio; de Castro, Cláudio Campi; Turra, Alexander

    2018-01-11

    Obtaining accurate and reproducible estimates of internal shell volume is a vital requirement for studies into the ecology of a range of shell-occupying organisms, including hermit crabs. Shell internal volume is usually estimated by filling the shell cavity with water or sand; however, there has been no systematic assessment of the reliability of these methods and, moreover, no comparison with modern alternatives, e.g., computed tomography (CT). This study undertakes the first assessment of the measurement reproducibility of three contrasting approaches across a spectrum of shell architectures and sizes. While our results suggested a certain level of variability inherent in all methods, we conclude that a single measurement using sand/water is likely to be sufficient for the majority of studies. However, care must be taken, as precision may decline with increasing shell size and structural complexity. CT provided less variation between repeat measures, but volume estimates were consistently lower than those from sand/water, and the approach will need methodological improvements before it can be used as an alternative. CT also indicated that volume may be underestimated using sand/water, owing to air spaces visible in filled shells scanned by CT. Lastly, we encourage authors to clearly describe how volume estimates were obtained.

  13. Abdominal fat volume estimation by stereology on CT: a comparison with manual planimetry.

    PubMed

    Manios, G E; Mazonakis, M; Voulgaris, C; Karantanas, A; Damilakis, J

    2016-03-01

    To deploy and evaluate a stereological point-counting technique on abdominal CT for the estimation of visceral (VAF) and subcutaneous abdominal fat (SAF) volumes. Stereological volume estimations based on point counting and systematic sampling were performed on images from 14 consecutive patients who had undergone abdominal CT. For optimization of the method, five sampling intensities in combination with 100 and 200 points were tested. The optimal stereological measurements were compared with VAF and SAF volumes derived by the standard technique of manual planimetry on the same scans. Optimization analysis showed that the selection of 200 points along with a sampling intensity of 1/8 provided efficient volume estimations in less than 4 min for VAF and SAF together. The optimized stereology showed strong correlation with planimetry (VAF: r = 0.98; SAF: r = 0.98). No statistical differences were found between the two methods (VAF: P = 0.81; SAF: P = 0.83). The 95% limits of agreement were also acceptable (VAF: -16.5%, 16.1%; SAF: -10.8%, 10.7%) and the repeatability of stereology was good (VAF: CV = 4.5%, SAF: CV = 3.2%). Stereology may be successfully applied to CT images for the efficient estimation of abdominal fat volume and may constitute a good alternative to the conventional planimetric technique. Abdominal obesity is associated with increased risk of disease and mortality. Stereology may quantify visceral and subcutaneous abdominal fat accurately and consistently. The application of stereology to estimating abdominal fat volume reduces processing time. Stereology is an efficient alternative method for estimating abdominal fat volume.
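
    A minimal sketch of the standard Cavalieri point-counting estimator that underlies the stereological approach above: volume = slice spacing × area per grid point × total number of points hitting the compartment. The grid spacing, slice spacing, and counts are illustrative.

    ```python
    def cavalieri_volume(points_per_slice, slice_spacing_cm, area_per_point_cm2):
        """Stereological volume estimate from systematically sampled CT slices."""
        return slice_spacing_cm * area_per_point_cm2 * sum(points_per_slice)  # cm^3

    counts = [14, 22, 31, 28, 19, 9]   # points hitting visceral fat on six sampled slices
    print(cavalieri_volume(counts, slice_spacing_cm=4.0, area_per_point_cm2=2.25))  # ~1107 cm^3
    ```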

  14. Understanding traffic variations by vehicle classifications

    DOT National Transportation Integrated Search

    1998-08-01

    To provide a better understanding of how short-duration truck volume counts can be used to accurately estimate the key variables needed for design, planning, and operational analyses, the Long-Term Pavement Performance (LTPP) program recently complet...

  15. Brain Volume Estimation Enhancement by Morphological Image Processing Tools.

    PubMed

    Zeinali, R; Keshtkar, A; Zamani, A; Gharehaghaji, N

    2017-12-01

    Volume estimation of the brain is important for many neurological applications; it is necessary for measuring brain growth and changes in the brains of normal and abnormal patients. Thus, accurate brain volume measurement is very important. Magnetic resonance imaging (MRI) is the method of choice for volume quantification due to excellent levels of image resolution and between-tissue contrast. The stereology method is a good method for estimating volume, but it requires segmenting enough MRI slices and good image resolution. In this study, we aimed to enhance the stereology method for estimating brain volume using fewer MRI slices with lower resolution. A program for calculating volume using the stereology method is introduced. Morphological dilation was applied, and the stereology estimate was enhanced accordingly. For the evaluation of this method, we used T1-weighted MR images of a digital phantom from BrainWeb, which has a ground truth. The volumes of 20 normal brains extracted from BrainWeb were calculated, and the volumes of white matter, gray matter and cerebrospinal fluid with given dimensions were estimated correctly. Volume calculation with the stereology method was performed in three cases, and the Root Mean Square Error (RMSE) was measured in each: Case I with T=5, d=5; Case II with T=10, d=10; and Case III with T=20, d=20 (T = slice thickness, d = resolution, as stereology parameters). Comparing the results of the two methods shows that the RMSE values for our proposed method are smaller than those of the standard stereology method. Applying morphological dilation thus enhances stereology-based volume estimation: with fewer MRI slices and fewer test points, the proposed method performs much better than the standard stereology method.

  16. Reducing Contingency through Sampling at the Luckey FUSRAP Site - 13186

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frothingham, David; Barker, Michelle; Buechi, Steve

    2013-07-01

    Typically, the greatest risk in developing accurate cost estimates for the remediation of hazardous, toxic, and radioactive waste sites is the uncertainty in the estimated volume of contaminated media requiring remediation. Efforts to address this risk in the remediation cost estimate can result in large cost contingencies that are often considered unacceptable when budgeting for site cleanups. Such was the case for the Luckey Formerly Utilized Sites Remedial Action Program (FUSRAP) site near Luckey, Ohio, which had significant uncertainty surrounding the estimated volume of site soils contaminated with radium, uranium, thorium, beryllium, and lead. Funding provided by the American Recovery and Reinvestment Act (ARRA) allowed the U.S. Army Corps of Engineers (USACE) to conduct additional environmental sampling and analysis at the Luckey Site between November 2009 and April 2010, with the objective to further delineate the horizontal and vertical extent of contaminated soils in order to reduce the uncertainty in the soil volume estimate. Investigative work included radiological, geophysical, and topographic field surveys, subsurface borings, and soil sampling. Results from the investigative sampling were used in conjunction with Argonne National Laboratory's Bayesian Approaches for Adaptive Spatial Sampling (BAASS) software to update the contaminated soil volume estimate for the site. This updated volume estimate was then used to update the project cost-to-complete estimate using the USACE Cost and Schedule Risk Analysis process, which develops cost contingencies based on project risks. An investment of $1.1 M of ARRA funds for additional investigative work resulted in a reduction of 135,000 in-situ cubic meters (177,000 in-situ cubic yards) in the base volume estimate. This refinement of the estimated soil volume resulted in a $64.3 M reduction in the estimated project cost-to-complete, through a reduction in the uncertainty in the contaminated soil volume estimate and the associated contingency costs. (authors)

  17. Descendant root volume varies as a function of root type: estimation of root biomass lost during uprooting in Pinus pinaster.

    PubMed

    Danjon, Frédéric; Caplan, Joshua S; Fortin, Mathieu; Meredieu, Céline

    2013-01-01

    Root systems of woody plants generally display a strong relationship between the cross-sectional area or cross-sectional diameter (CSD) of a root and the dry weight of biomass (DWd) or root volume (Vd) that has grown (i.e., is descendent) from a point. Specification of this relationship allows one to quantify root architectural patterns and estimate the amount of material lost when root systems are extracted from the soil. However, specifications of this relationship generally do not account for the fact that root systems are comprised of multiple types of roots. We assessed whether the relationship between CSD and Vd varies as a function of root type. Additionally, we sought to identify a more accurate and time-efficient method for estimating missing root volume than is currently available. We used a database that described the 3D root architecture of Pinus pinaster root systems (5, 12, or 19 years) from a stand in southwest France. We determined the relationship between CSD and Vd for 10,000 root segments from intact root branches. Models were specified that did and did not account for root type. The relationships were then applied to the diameters of 11,000 broken root ends to estimate the volume of missing roots. CSD was nearly linearly related to the square root of Vd, but the slope of the curve varied greatly as a function of root type. Sinkers and deep roots tapered rapidly, as they were limited by available soil depth. Distal shallow roots tapered gradually, as they were less limited spatially. We estimated that younger trees lost an average of 17% of root volume when excavated, while older trees lost 4%. Missing volumes were smallest in the central parts of root systems and largest in distal shallow roots. The slopes of the curves for each root type are synthetic parameters that account for differentiation due to genetics, soil properties, or mechanical stimuli. Accounting for this differentiation is critical to estimating root loss accurately.

  18. Descendant root volume varies as a function of root type: estimation of root biomass lost during uprooting in Pinus pinaster

    PubMed Central

    Danjon, Frédéric; Caplan, Joshua S.; Fortin, Mathieu; Meredieu, Céline

    2013-01-01

    Root systems of woody plants generally display a strong relationship between the cross-sectional area or cross-sectional diameter (CSD) of a root and the dry weight of biomass (DWd) or root volume (Vd) that has grown (i.e., is descendent) from a point. Specification of this relationship allows one to quantify root architectural patterns and estimate the amount of material lost when root systems are extracted from the soil. However, specifications of this relationship generally do not account for the fact that root systems are comprised of multiple types of roots. We assessed whether the relationship between CSD and Vd varies as a function of root type. Additionally, we sought to identify a more accurate and time-efficient method for estimating missing root volume than is currently available. We used a database that described the 3D root architecture of Pinus pinaster root systems (5, 12, or 19 years) from a stand in southwest France. We determined the relationship between CSD and Vd for 10,000 root segments from intact root branches. Models were specified that did and did not account for root type. The relationships were then applied to the diameters of 11,000 broken root ends to estimate the volume of missing roots. CSD was nearly linearly related to the square root of Vd, but the slope of the curve varied greatly as a function of root type. Sinkers and deep roots tapered rapidly, as they were limited by available soil depth. Distal shallow roots tapered gradually, as they were less limited spatially. We estimated that younger trees lost an average of 17% of root volume when excavated, while older trees lost 4%. Missing volumes were smallest in the central parts of root systems and largest in distal shallow roots. The slopes of the curves for each root type are synthetic parameters that account for differentiation due to genetics, soil properties, or mechanical stimuli. Accounting for this differentiation is critical to estimating root loss accurately. PMID:24167506

  19. Optimal back-extrapolation method for estimating plasma volume in humans using the indocyanine green dilution method

    PubMed Central

    2014-01-01

    Background The indocyanine green dilution method is one of the methods available to estimate plasma volume, although some researchers have questioned the accuracy of this method. Methods We developed a new, physiologically based mathematical model of indocyanine green kinetics that more accurately represents indocyanine green kinetics during the first few minutes postinjection than what is assumed when using the traditional mono-exponential back-extrapolation method. The mathematical model is used to develop an optimal back-extrapolation method for estimating plasma volume based on simulated indocyanine green kinetics obtained from the physiological model. Results Results from a clinical study using the indocyanine green dilution method in 36 subjects with type 2 diabetes indicate that the estimated plasma volumes are considerably lower when using the traditional back-extrapolation method than when using the proposed back-extrapolation method (mean (standard deviation) plasma volume = 26.8 (5.4) mL/kg for the traditional method vs 35.1 (7.0) mL/kg for the proposed method). The results obtained using the proposed method are more consistent with previously reported plasma volume values. Conclusions Based on the more physiological representation of indocyanine green kinetics and greater consistency with previously reported plasma volume values, the new back-extrapolation method is proposed for use when estimating plasma volume using the indocyanine green dilution method. PMID:25052018

  20. Vestibular schwannomas: Accuracy of tumor volume estimated by ice cream cone formula using thin-sliced MR images.

    PubMed

    Ho, Hsing-Hao; Li, Ya-Hui; Lee, Jih-Chin; Wang, Chih-Wei; Yu, Yi-Lin; Hueng, Dueng-Yuan; Ma, Hsin-I; Hsu, Hsian-He; Juan, Chun-Jung

    2018-01-01

    We estimated the volume of vestibular schwannomas by an ice cream cone formula using thin-sliced magnetic resonance images (MRI) and compared the estimation accuracy among different estimating formulas and between different models. The study was approved by a local institutional review board. A total of 100 patients with vestibular schwannomas examined by MRI between January 2011 and November 2015 were enrolled retrospectively. Informed consent was waived. Volumes of vestibular schwannomas were estimated by cuboidal, ellipsoidal, and spherical formulas based on a one-component model, and cuboidal, ellipsoidal, Linskey's, and ice cream cone formulas based on a two-component model. The estimated volumes were compared to the volumes measured by planimetry. Intraobserver reproducibility and interobserver agreement was tested. Estimation error, including absolute percentage error (APE) and percentage error (PE), was calculated. Statistical analysis included intraclass correlation coefficient (ICC), linear regression analysis, one-way analysis of variance, and paired t-tests with P < 0.05 considered statistically significant. Overall tumor size was 4.80 ± 6.8 mL (mean ±standard deviation). All ICCs were no less than 0.992, suggestive of high intraobserver reproducibility and high interobserver agreement. Cuboidal formulas significantly overestimated the tumor volume by a factor of 1.9 to 2.4 (P ≤ 0.001). The one-component ellipsoidal and spherical formulas overestimated the tumor volume with an APE of 20.3% and 29.2%, respectively. The two-component ice cream cone method, and ellipsoidal and Linskey's formulas significantly reduced the APE to 11.0%, 10.1%, and 12.5%, respectively (all P < 0.001). The ice cream cone method and other two-component formulas including the ellipsoidal and Linskey's formulas allow for estimation of vestibular schwannoma volume more accurately than all one-component formulas.
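
    For orientation, the one-component formulas named in the abstract can be written directly from three orthogonal diameters; the two-component "ice cream cone" model is sketched here as a cone plus a half-ellipsoid, which is an assumed geometry rather than the paper's published formula.

    ```python
    import math

    def cuboidal(a, b, c):
        """One-component cuboidal formula: product of the three diameters."""
        return a * b * c

    def ellipsoidal(a, b, c):
        """One-component ellipsoidal formula: (pi/6) * a * b * c."""
        return math.pi / 6.0 * a * b * c

    def spherical(a, b, c):
        """One-component spherical formula using the mean diameter."""
        d = (a + b + c) / 3.0
        return math.pi / 6.0 * d ** 3

    def ice_cream_cone(canal_len, canal_diam, cist_a, cist_b, cist_c):
        """Two-component sketch: a cone for the intracanalicular part plus a
        half-ellipsoid for the cisternal part (assumed geometry)."""
        cone = math.pi * (canal_diam / 2.0) ** 2 * canal_len / 3.0
        half_ellipsoid = 0.5 * math.pi / 6.0 * cist_a * cist_b * cist_c
        return cone + half_ellipsoid

    # Example: diameters in cm -> volumes in mL (cm^3)
    print(ellipsoidal(2.1, 1.8, 1.6), ice_cream_cone(1.0, 0.6, 2.1, 1.8, 1.6))
    ```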

  1. Direct measurement of proximal isovelocity surface area by real-time three-dimensional color Doppler for quantitation of aortic regurgitant volume: an in vitro validation.

    PubMed

    Pirat, Bahar; Little, Stephen H; Igo, Stephen R; McCulloch, Marti; Nosé, Yukihiko; Hartley, Craig J; Zoghbi, William A

    2009-03-01

    The proximal isovelocity surface area (PISA) method is useful in the quantitation of aortic regurgitation (AR). We hypothesized that actual measurement of PISA provided with real-time 3-dimensional (3D) color Doppler yields more accurate regurgitant volumes than those estimated by 2-dimensional (2D) color Doppler PISA. We developed a pulsatile flow model for AR with an imaging chamber in which interchangeable regurgitant orifices with defined shapes and areas were incorporated. An ultrasonic flow meter was used to calculate the reference regurgitant volumes. A total of 29 different flow conditions for 5 orifices with different shapes were tested at a rate of 72 beats/min. 2D PISA was calculated as 2πr², and 3D PISA was measured from 8 equidistant radial planes of the 3D PISA. Regurgitant volume was derived as PISA × aliasing velocity × time velocity integral of AR/peak AR velocity. Regurgitant volumes by flow meter ranged between 12.6 and 30.6 mL/beat (mean 21.4 ± 5.5 mL/beat). Regurgitant volumes estimated by 2D PISA correlated well with volumes measured by flow meter (r = 0.69); however, a significant underestimation was observed (y = 0.5x + 0.6). Correlation with flow meter volumes was stronger for 3D PISA-derived regurgitant volumes (r = 0.83); significantly less underestimation of regurgitant volumes was seen, with a regression line close to identity (y = 0.9x + 3.9). Direct measurement of PISA is feasible, without geometric assumptions, using real-time 3D color Doppler. Calculation of aortic regurgitant volumes with 3D color Doppler using this methodology is more accurate than the conventional 2D method with the hemispheric PISA assumption.
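
    The regurgitant-volume relation stated in the abstract (PISA × aliasing velocity × time velocity integral of AR / peak AR velocity, with the hemispheric 2D PISA taken as 2πr²) translates directly into code; the numbers below are illustrative, not measurements from the flow model.

    ```python
    import math

    def pisa_2d(radius_cm):
        """Hemispheric 2D PISA: 2 * pi * r^2 (cm^2)."""
        return 2.0 * math.pi * radius_cm ** 2

    def regurgitant_volume(pisa_cm2, aliasing_vel_cm_s, vti_ar_cm, peak_ar_vel_cm_s):
        """Regurgitant volume (mL/beat) = PISA * aliasing velocity * VTI / peak velocity,
        as defined in the abstract."""
        flow_rate = pisa_cm2 * aliasing_vel_cm_s          # cm^3/s at the aliasing shell
        return flow_rate * vti_ar_cm / peak_ar_vel_cm_s   # scale by VTI / peak velocity

    # Illustrative (not measured) numbers: r = 0.4 cm, aliasing 40 cm/s,
    # AR VTI 180 cm, peak AR velocity 450 cm/s -> about 16 mL/beat
    print(regurgitant_volume(pisa_2d(0.4), 40.0, 180.0, 450.0))
    ```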

  2. Estimates of the volume of water in five coal aquifers, Northern Cheyenne Indian Reservation, southeastern Montana

    USGS Publications Warehouse

    Tuck, L.K.; Pearson, Daniel K.; Cannon, M.R.; Dutton, DeAnn M.

    2013-01-01

    The Tongue River Member of the Tertiary Fort Union Formation is the primary source of groundwater in the Northern Cheyenne Indian Reservation in southeastern Montana. Coal beds within this formation generally contain the most laterally extensive aquifers in much of the reservation. The U.S. Geological Survey, in cooperation with the Northern Cheyenne Tribe, conducted a study to estimate the volume of water in five coal aquifers. This report presents estimates of the volume of water in five coal aquifers in the eastern and southern parts of the Northern Cheyenne Indian Reservation: the Canyon, Wall, Pawnee, Knobloch, and Flowers-Goodale coal beds in the Tongue River Member of the Tertiary Fort Union Formation. Only conservative estimates of the volume of water in these coal aquifers are presented. The volume of water in the Canyon coal was estimated to range from about 10,400 acre-feet (75 percent saturated) to 3,450 acre-feet (25 percent saturated). The volume of water in the Wall coal was estimated to range from about 14,200 acre-feet (100 percent saturated) to 3,560 acre-feet (25 percent saturated). The volume of water in the Pawnee coal was estimated to range from about 9,440 acre-feet (100 percent saturated) to 2,360 acre-feet (25 percent saturated). The volume of water in the Knobloch coal was estimated to range from about 38,700 acre-feet (100 percent saturated) to 9,680 acre-feet (25 percent saturated). The volume of water in the Flowers-Goodale coal was estimated to be about 35,800 acre-feet (100 percent saturated). Sufficient data are needed to accurately characterize coal-bed horizontal and vertical variability, which is highly complex both locally and regionally. Where data points are widely spaced, the reliability of estimates of the volume of coal beds is decreased. Additionally, reliable estimates of the volume of water in coal aquifers depend heavily on data about water levels and data about coal-aquifer characteristics. Because the data needed to define the volume of water were sparse, only conservative estimates of the volume of water in the five coal aquifers are presented in this report. These estimates need to be used with caution and mindfulness of the uncertainty associated with them.

  3. Analysis on Vertical Scattering Signatures in Forestry with PolInSAR

    NASA Astrophysics Data System (ADS)

    Guo, Shenglong; Li, Yang; Zhang, Jingjing; Hong, Wen

    2014-11-01

    We apply an accurate topographic phase to the Freeman-Durden decomposition of polarimetric SAR interferometry (PolInSAR) data. The cross-correlation matrix obtained from PolInSAR observations can be decomposed into three scattering-mechanism matrices accounting for odd-bounce, double-bounce, and volume scattering. We estimate the topographic phase based on the Random Volume over Ground (RVoG) model and use it as the initial input to the numerical method that solves for the decomposition parameters. In addition, the modified volume scattering model introduced by Y. Yamaguchi, rather than the pure random volume scattering proposed by Freeman-Durden, is applied to the PolInSAR target decomposition in forest areas to better fit the measured data. This method can accurately retrieve the magnitude associated with each mechanism and its location along the vertical dimension. We test the algorithms with L- and P-band simulated data.

  4. Volume estimation of small phantoms and rat kidneys using three-dimensional ultrasonography and a position sensor.

    PubMed

    Strømmen, Kenneth; Stormark, Tor André; Iversen, Bjarne M; Matre, Knut

    2004-09-01

    To evaluate the accuracy of small volume estimation, both in vivo and in vitro, measurements with a three-dimensional (3D) ultrasound (US) system were carried out. A position sensor was used and the transmitting frequency was 10 MHz. Balloons with known volumes were scanned, while rat kidneys were scanned in vivo and in vitro. Archimedes' principle was used to estimate the true volume. For balloons, the 3D US system gave very good agreement with true volumes in the volume range 0.1 to 10.0 mL (r = 0.999, n = 45, mean difference +/- 2SD = 0.245 +/- 0.370 mL). For rat kidneys in vivo (volume range 0.6 to 2.7 mL) the method was less accurate (r = 0.800, n = 10, mean difference +/- 2SD = -0.288 +/- 0.676 mL). For rat kidneys in vitro (volume range 0.3 to 2.7 mL) the results showed good agreement (r = 0.981, n = 23, mean difference +/- 2SD = 0.039 +/- 0.254 mL). For balloons, kidneys in vivo and in vitro, the mean percentage error was 9.3 +/- 4.8%, -17.1 +/- 17.4%, and 4.6 +/- 11.5%, respectively. This method can estimate the volume of small phantoms and rat kidneys and opens new possibilities for volume measurements of small objects and the study of organ function in small animals.
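
    The agreement statistics reported above (mean difference +/- 2SD and mean percentage error against the Archimedes reference) can be reproduced with a few lines; the volumes below are hypothetical stand-ins for the balloon data.

    ```python
    import numpy as np

    def agreement(estimated, true):
        """Mean difference +/- 2SD and mean percentage error, as reported in the abstract."""
        est, tru = np.asarray(estimated, float), np.asarray(true, float)
        diff = est - tru
        pct = 100.0 * diff / tru
        return diff.mean(), 2.0 * diff.std(ddof=1), pct.mean(), pct.std(ddof=1)

    # Hypothetical balloon volumes in mL (3D US estimate vs Archimedes reference)
    est = [0.12, 0.55, 1.10, 2.20, 5.30, 10.2]
    tru = [0.10, 0.50, 1.00, 2.00, 5.00, 10.0]
    mean_diff, two_sd, mean_pct, sd_pct = agreement(est, tru)
    print(f"{mean_diff:.3f} +/- {two_sd:.3f} mL, {mean_pct:.1f} +/- {sd_pct:.1f} %")
    ```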

  5. Tumor Volume Estimation and Quasi-Continuous Administration for Most Effective Bevacizumab Therapy

    PubMed Central

    Sápi, Johanna; Kovács, Levente; Drexler, Dániel András; Kocsis, Pál; Gajári, Dávid; Sápi, Zoltán

    2015-01-01

    Background Bevacizumab is an exogenous inhibitor which inhibits the biological activity of human VEGF. Several studies have investigated the effectiveness of bevacizumab therapy according to different cancer types but these days there is an intense debate on its utility. We have investigated different methods to find the best tumor volume estimation since it creates the possibility for precise and effective drug administration with a much lower dose than in the protocol. Materials and Methods We have examined C38 mouse colon adenocarcinoma and HT-29 human colorectal adenocarcinoma. In both cases, three groups were compared in the experiments. The first group did not receive therapy, the second group received one 200 μg bevacizumab dose for a treatment period (protocol-based therapy), and the third group received 1.1 μg bevacizumab every day (quasi-continuous therapy). Tumor volume measurement was performed by digital caliper and small animal MRI. The mathematical relationship between MRI-measured tumor volume and mass was investigated to estimate accurate tumor volume using caliper-measured data. A two-dimensional mathematical model was applied for tumor volume evaluation, and tumor- and therapy-specific constants were calculated for the three different groups. The effectiveness of bevacizumab administration was examined by statistical analysis. Results In the case of C38 adenocarcinoma, protocol-based treatment did not result in significantly smaller tumor volume compared to the no treatment group; however, there was a significant difference between untreated mice and mice who received quasi-continuous therapy (p = 0.002). In the case of HT-29 adenocarcinoma, the daily treatment with one-twelfth total dose resulted in significantly smaller tumors than the protocol-based treatment (p = 0.038). When the tumor has a symmetrical, solid closed shape (typically without treatment), volume can be evaluated accurately from caliper-measured data with the applied two-dimensional mathematical model. Conclusion Our results provide a theoretical background for a much more effective bevacizumab treatment using optimized administration. PMID:26540189
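
    As a hedged illustration only: a commonly used two-dimensional caliper formula treats the tumor as an ellipsoid, V = k·L·W², with k = π/6. The study instead fits tumor- and therapy-specific constants, so the constant below is a placeholder rather than the authors' model.

    ```python
    import math

    def caliper_volume(length_mm, width_mm, k=math.pi / 6.0):
        """Common two-dimensional caliper approximation V = k * L * W^2.
        k = pi/6 treats the tumor as an ellipsoid whose depth equals its width;
        the study fits tumor- and therapy-specific constants instead, so k here
        is only a placeholder."""
        return k * length_mm * width_mm ** 2  # mm^3

    print(caliper_volume(12.0, 8.0))  # ~402 mm^3 for a 12 x 8 mm tumor
    ```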

  6. Tumor Volume Estimation and Quasi-Continuous Administration for Most Effective Bevacizumab Therapy.

    PubMed

    Sápi, Johanna; Kovács, Levente; Drexler, Dániel András; Kocsis, Pál; Gajári, Dávid; Sápi, Zoltán

    2015-01-01

    Bevacizumab is an exogenous inhibitor which inhibits the biological activity of human VEGF. Several studies have investigated the effectiveness of bevacizumab therapy according to different cancer types but these days there is an intense debate on its utility. We have investigated different methods to find the best tumor volume estimation since it creates the possibility for precise and effective drug administration with a much lower dose than in the protocol. We have examined C38 mouse colon adenocarcinoma and HT-29 human colorectal adenocarcinoma. In both cases, three groups were compared in the experiments. The first group did not receive therapy, the second group received one 200 μg bevacizumab dose for a treatment period (protocol-based therapy), and the third group received 1.1 μg bevacizumab every day (quasi-continuous therapy). Tumor volume measurement was performed by digital caliper and small animal MRI. The mathematical relationship between MRI-measured tumor volume and mass was investigated to estimate accurate tumor volume using caliper-measured data. A two-dimensional mathematical model was applied for tumor volume evaluation, and tumor- and therapy-specific constants were calculated for the three different groups. The effectiveness of bevacizumab administration was examined by statistical analysis. In the case of C38 adenocarcinoma, protocol-based treatment did not result in significantly smaller tumor volume compared to the no treatment group; however, there was a significant difference between untreated mice and mice who received quasi-continuous therapy (p = 0.002). In the case of HT-29 adenocarcinoma, the daily treatment with one-twelfth total dose resulted in significantly smaller tumors than the protocol-based treatment (p = 0.038). When the tumor has a symmetrical, solid closed shape (typically without treatment), volume can be evaluated accurately from caliper-measured data with the applied two-dimensional mathematical model. Our results provide a theoretical background for a much more effective bevacizumab treatment using optimized administration.

  7. Two-dimensional echo-cardiographic estimation of left atrial volume and volume load in patients with congenital heart disease.

    PubMed

    Kawaguchi, A; Linde, L M; Imachi, T; Mizuno, H; Akutsu, H

    1983-12-01

    To estimate the left atrial volume (LAV) and pulmonary blood flow in patients with congenital heart disease (CHD), we employed two-dimensional echocardiography (TDE). The LAV was measured in dimensions other than those obtained in conventional M-mode echocardiography (M-mode echo). Mathematical and geometrical models for LAV calculation using the standard long-axis, short-axis and apical four-chamber planes were devised and found to be reliable in a preliminary study using porcine heart preparations, although length (10%), area (20%) and volume (38%) were significantly and consistently underestimated with echocardiography. Those models were then applied and correlated with angiocardiograms (ACG) in 25 consecutive patients with suspected CHD. In terms of the estimation of the absolute LAV, accuracy seemed commensurate with the number of the dimensions measured. The correlation between data obtained by TDE and ACG varied with changing hemodynamics such as cardiac cycle, absolute LAV and presence or absence of volume load. The left atrium was found to become spherical and progressively underestimated with TDE at ventricular end-systole, at larger LAV, and with increased volume load. Since this tendency became less pronounced when additional dimensions were measured, reliable estimation of the absolute LAV and volume load was possible when 2 or 3 dimensions were measured. Among those calculation models depending on 2- or 3-dimensional measurements, there was only a small difference in terms of accuracy and predictability, although the algorithm used varied from one model to another. This suggests that accurate cross-sectional area measurement is critically important for volume estimation rather than any particular algorithm involved. Cross-sectional area measurement by TDE integrated into a three-dimensional equivalent allowed a reliable estimate of the LAV or volume load in a variety of hemodynamic situations where M-mode echo was not reliable.

  8. Shift level analysis of cable yarder availability, utilization, and productive time

    Treesearch

    James R. Sherar; Chris B. LeDoux

    1989-01-01

    Decision makers, loggers, managers, and planners need to understand and have methods for estimating utilization and productive time of cable logging systems. In making an accurate prediction of how much area and volume a machine will log per unit time and the associated cable yarding costs, a reliable estimate of the availability, utilization, and productive time of...

  9. Bidirectional segmentation of prostate capsule from ultrasound volumes: an improved strategy

    NASA Astrophysics Data System (ADS)

    Wei, Liyang; Narayanan, Ramkrishnan; Kumar, Dinesh; Fenster, Aaron; Barqawi, Albaha; Werahera, Priya; Crawford, E. David; Suri, Jasjit S.

    2008-03-01

    Prostate volume is an indirect indicator for several prostate diseases. Volume estimation is a desired requirement during prostate biopsy, therapy, and clinical follow-up; image segmentation is thus necessary. Previously, the discrete dynamic contour (DDC) was implemented unidirectionally in the orthogonal view on a slice-by-slice basis for prostate boundary estimation. This approach suffered from the drawback that it needed stopping criteria as the segmentation propagated from slice to slice. To overcome this drawback, an axial DDC was implemented, but it suffered from the fact that the central axis never remains fixed and wobbles as the segmentation propagates from slice to slice, producing a multi-fold reconstructed surface. This paper presents a bidirectional DDC approach that removes both drawbacks. Our bidirectional DDC protocol was tested on a clinical dataset of 28 3-D ultrasound image volumes acquired using a side-fire Philips transrectal ultrasound probe. We demonstrate that the orthogonal bidirectional DDC strategy achieved the most accurate volume estimation compared with the previously published orthogonal unidirectional DDC and axial DDC methods. Compared to the ground truth, the mean volume estimation errors were 18.48%, 9.21%, and 7.82% for the unidirectional, axial, and bidirectional DDC methods, respectively. The segmentation architecture is implemented in Visual C++ in a Windows environment.

  10. Emergency Physician Estimation of Blood Loss

    PubMed Central

    Ashburn, Jeffery C.; Harrison, Tamara; Ham, James J.; Strote, Jared

    2012-01-01

    Introduction Emergency physicians (EP) frequently estimate blood loss, which can have implications for clinical care. The objectives of this study were to examine EP accuracy in estimating blood loss on different surfaces and compare attending physician and resident performance. Methods A sample of 56 emergency department (ED) physicians (30 attending physicians and 26 residents) were asked to estimate the amount of moulage blood present in 4 scenarios: 500 mL spilled onto an ED cot; 25 mL spilled onto a 10-pack of 4 × 4-inch gauze; 100 mL on a T-shirt; and 150 mL in a commode filled with water. Standard estimate error (the absolute value of (estimated volume − actual volume)/actual volume × 100) was calculated for each estimate. Results The mean standard error for all estimates was 116% with a range of 0% to 1233%. Only 8% of estimates were within 20% of the true value. Estimates were most accurate for the sheet scenario and worst for the commode scenario. Residents and attending physicians did not perform significantly differently (P > 0.05). Conclusion Emergency department physicians do not estimate blood loss well in a variety of scenarios. Such estimates could potentially be misleading if used in clinical decision making. Clinical experience does not appear to improve estimation ability in this limited study. PMID:22942938
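
    The standard estimate error defined in the abstract, |estimated − actual| / actual × 100, is straightforward to compute; the guesses below are hypothetical.

    ```python
    def standard_estimate_error(estimated_ml, actual_ml):
        """Standard estimate error as defined in the abstract:
        |estimated - actual| / actual * 100."""
        return abs(estimated_ml - actual_ml) / actual_ml * 100.0

    # The four scenarios (actual volumes from the abstract) with hypothetical guesses
    for actual, guess in [(500, 350), (25, 100), (100, 250), (150, 900)]:
        print(actual, guess, round(standard_estimate_error(guess, actual), 1))
    ```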

  11. Segmentation propagation for the automated quantification of ventricle volume from serial MRI

    NASA Astrophysics Data System (ADS)

    Linguraru, Marius George; Butman, John A.

    2009-02-01

    Accurate ventricle volume estimates could potentially improve the understanding and diagnosis of communicating hydrocephalus. Postoperative communicating hydrocephalus has been recognized in patients with brain tumors where the changes in ventricle volume can be difficult to identify, particularly over short time intervals. Because of the complex alterations of brain morphology in these patients, the segmentation of brain ventricles is challenging. Our method evaluates ventricle size from serial brain MRI examinations; we (i) combined serial images to increase SNR, (ii) automatically segmented this image to generate a ventricle template using fast marching methods and geodesic active contours, and (iii) propagated the segmentation using deformable registration of the original MRI datasets. By applying this deformation to the ventricle template, serial volume estimates were obtained in a robust manner from routine clinical images (0.93 overlap) and their variation analyzed.
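
    The reported 0.93 overlap can be read as a Dice coefficient between the propagated and reference ventricle masks (an assumption; the paper may use a different overlap measure). A minimal sketch:

    ```python
    import numpy as np

    def dice(mask_a, mask_b):
        """Dice overlap between two binary ventricle masks."""
        a, b = np.asarray(mask_a, bool), np.asarray(mask_b, bool)
        inter = np.logical_and(a, b).sum()
        return 2.0 * inter / (a.sum() + b.sum())

    # Toy 3D masks standing in for a propagated and a reference segmentation
    ref = np.zeros((32, 32, 32), bool); ref[8:24, 8:24, 8:24] = True
    prop = np.zeros_like(ref);          prop[9:25, 8:24, 8:24] = True
    print(round(dice(prop, ref), 3))
    ```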

  12. Infrasound Waveform Inversion and Mass Flux Validation from Sakurajima Volcano, Japan

    NASA Astrophysics Data System (ADS)

    Fee, D.; Kim, K.; Yokoo, A.; Izbekov, P. E.; Lopez, T. M.; Prata, F.; Ahonen, P.; Kazahaya, R.; Nakamichi, H.; Iguchi, M.

    2015-12-01

    Recent advances in numerical wave propagation modeling and station coverage have permitted robust inversion of infrasound data from volcanic explosions. Complex topography and crater morphology have been shown to substantially affect the infrasound waveform, suggesting that homogeneous acoustic propagation assumptions are invalid. Infrasound waveform inversion provides an exciting tool to accurately characterize emission volume and mass flux from both volcanic and non-volcanic explosions. Mass flux, arguably the most sought-after parameter from a volcanic eruption, can be determined from the volume flux using infrasound waveform inversion if the volcanic flow is well-characterized. Thus far, infrasound-based volume and mass flux estimates have yet to be validated. In February 2015 we deployed six infrasound stations around the explosive Sakurajima Volcano, Japan for 8 days. Here we present our full waveform inversion method and volume and mass flux estimates of numerous high amplitude explosions using a high resolution DEM and 3-D Finite Difference Time Domain modeling. Application of this technique to volcanic eruptions may produce realistic estimates of mass flux and plume height necessary for volcanic hazard mitigation. Several ground-based instruments and methods are used to independently determine the volume, composition, and mass flux of individual volcanic explosions. Specifically, we use ground-based ash sampling, multispectral infrared imagery, UV spectrometry, and multigas data to estimate the plume composition and flux. Unique tiltmeter data from underground tunnels at Sakurajima also provides a way to estimate the volume and mass of each explosion. In this presentation we compare the volume and mass flux estimates derived from the different methods and discuss sources of error and future improvements.

  13. Micro CT based truth estimation of nodule volume

    NASA Astrophysics Data System (ADS)

    Kinnard, L. M.; Gavrielides, M. A.; Myers, K. J.; Zeng, R.; Whiting, B.; Lin-Gibson, S.; Petrick, N.

    2010-03-01

    With the advent of high-resolution CT, three-dimensional (3D) methods for nodule volumetry have been introduced, with the hope that such methods will be more accurate and consistent than currently used planar measures of size. However, the error associated with volume estimation methods still needs to be quantified. Volume estimation error is multi-faceted in the sense that there is variability associated with the patient, the software tool and the CT system. A primary goal of our current research efforts is to quantify the various sources of measurement error and, when possible, minimize their effects. In order to assess the bias of an estimate, the actual value, or "truth," must be known. In this work we investigate the reliability of micro CT to determine the "true" volume of synthetic nodules. The advantage of micro CT over other truthing methods is that it can provide both absolute volume and shape information in a single measurement. In the current study we compare micro CT volume truth to weight-density truth for spherical, elliptical, spiculated and lobulated nodules with diameters from 5 to 40 mm, and densities of -630 and +100 HU. The percent differences between micro CT and weight-density volume for -630 HU nodules range from [-21.7%, -0.6%] (mean= -11.9%) and the differences for +100 HU nodules range from [-0.9%, 3.0%] (mean=1.7%).

  14. Nonlocal Intracranial Cavity Extraction

    PubMed Central

    Manjón, José V.; Eskildsen, Simon F.; Coupé, Pierrick; Romero, José E.; Collins, D. Louis; Robles, Montserrat

    2014-01-01

    Automatic and accurate methods to estimate normalized regional brain volumes from MRI data are valuable tools which may help to obtain an objective diagnosis and followup of many neurological diseases. To estimate such regional brain volumes, the intracranial cavity volume (ICV) is often used for normalization. However, the high variability of brain shape and size due to normal intersubject variability, normal changes occurring over the lifespan, and abnormal changes due to disease makes the ICV estimation problem challenging. In this paper, we present a new approach to perform ICV extraction based on the use of a library of prelabeled brain images to capture the large variability of brain shapes. To this end, an improved nonlocal label fusion scheme based on BEaST technique is proposed to increase the accuracy of the ICV estimation. The proposed method is compared with recent state-of-the-art methods and the results demonstrate an improved performance both in terms of accuracy and reproducibility while maintaining a reduced computational burden. PMID:25328511

  15. Force estimation from OCT volumes using 3D CNNs.

    PubMed

    Gessert, Nils; Beringhoff, Jens; Otte, Christoph; Schlaefer, Alexander

    2018-07-01

    Estimating the interaction forces of instruments and tissue is of interest, particularly to provide haptic feedback during robot-assisted minimally invasive interventions. Different approaches based on external and integrated force sensors have been proposed. These are hampered by friction, sensor size, and sterilizability. We investigate a novel approach to estimate the force vector directly from optical coherence tomography image volumes. We introduce a novel Siamese 3D CNN architecture. The network takes an undeformed reference volume and a deformed sample volume as inputs and outputs the three components of the force vector. We employ a deep residual architecture with bottlenecks for increased efficiency. We compare the Siamese approach to methods using difference volumes and two-dimensional projections. Data were generated using a robotic setup to obtain ground-truth force vectors for silicon tissue phantoms as well as porcine tissue. Our method achieves a mean average error of [Formula: see text] when estimating the force vector. Our novel Siamese 3D CNN architecture outperforms single-path methods that achieve a mean average error of [Formula: see text]. Moreover, the use of volume data leads to significantly higher performance compared to processing only surface information, which achieves a mean average error of [Formula: see text]. Based on the tissue dataset, our method shows good generalization between different subjects. We propose a novel image-based force estimation method using optical coherence tomography. We illustrate that capturing the deformation of subsurface structures substantially improves force estimation. Our approach can provide accurate force estimates in surgical setups when using intraoperative optical coherence tomography.
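
    A minimal PyTorch sketch of a Siamese 3D CNN with residual bottleneck blocks is given below: a shared encoder is applied to the reference and deformed volumes, and the concatenated features are regressed to a three-component force vector. The input size, channel widths, and layer counts are assumptions, not the authors' exact architecture.

    ```python
    import torch
    import torch.nn as nn

    class Bottleneck3d(nn.Module):
        """Residual bottleneck: 1x1x1 -> 3x3x3 -> 1x1x1 convolutions with a skip."""
        def __init__(self, channels, mid):
            super().__init__()
            self.body = nn.Sequential(
                nn.Conv3d(channels, mid, 1, bias=False), nn.BatchNorm3d(mid), nn.ReLU(inplace=True),
                nn.Conv3d(mid, mid, 3, padding=1, bias=False), nn.BatchNorm3d(mid), nn.ReLU(inplace=True),
                nn.Conv3d(mid, channels, 1, bias=False), nn.BatchNorm3d(channels),
            )
            self.act = nn.ReLU(inplace=True)

        def forward(self, x):
            return self.act(x + self.body(x))

    class SiameseForceNet(nn.Module):
        """Shared 3D encoder applied to a reference and a deformed OCT volume;
        concatenated features are regressed to a 3-component force vector."""
        def __init__(self):
            super().__init__()
            self.encoder = nn.Sequential(
                nn.Conv3d(1, 16, 3, stride=2, padding=1), nn.ReLU(inplace=True),
                Bottleneck3d(16, 8),
                nn.Conv3d(16, 32, 3, stride=2, padding=1), nn.ReLU(inplace=True),
                Bottleneck3d(32, 16),
                nn.AdaptiveAvgPool3d(1),
            )
            self.head = nn.Sequential(nn.Linear(64, 64), nn.ReLU(inplace=True), nn.Linear(64, 3))

        def forward(self, reference, deformed):
            f_ref = self.encoder(reference).flatten(1)  # (B, 32)
            f_def = self.encoder(deformed).flatten(1)   # (B, 32), shared weights
            return self.head(torch.cat([f_ref, f_def], dim=1))

    # Hypothetical 32^3 OCT volumes, batch of 2
    net = SiameseForceNet()
    ref, sample = torch.randn(2, 1, 32, 32, 32), torch.randn(2, 1, 32, 32, 32)
    print(net(ref, sample).shape)  # torch.Size([2, 3])
    ```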

  16. Estimation of the sensitive volume for gravitational-wave source populations using weighted Monte Carlo integration

    NASA Astrophysics Data System (ADS)

    Tiwari, Vaibhav

    2018-07-01

    The population analysis and estimation of merger rates of compact binaries is one of the important topics in gravitational wave astronomy. The primary ingredient in these analyses is the population-averaged sensitive volume. Typically, the sensitive volume of a given search to a given simulated source population is estimated by drawing signals from the population model and adding them to the detector data as injections. Subsequently, the injections, which are simulated gravitational waveforms, are searched for by the search pipelines and their signal-to-noise ratio (SNR) is determined. Sensitive volume is estimated, by using Monte-Carlo (MC) integration, from the total number of injections added to the data, the number of injections that cross a chosen threshold on SNR and the astrophysical volume in which the injections are placed. So far, only fixed population models have been used in the estimation of binary black hole (BBH) merger rates. However, as the scope of population analysis broadens in terms of the methodologies and source properties considered, due to an increase in the number of observed gravitational wave (GW) signals, the procedure will need to be repeated multiple times at a large computational cost. In this letter, we address the problem by performing a weighted MC integration. We show how a single set of generic injections can be weighted to estimate the sensitive volume for multiple population models, thereby greatly reducing the computational cost. The weights in this MC integral are the ratios of the output probabilities, determined by the population model and standard cosmology, and the injection probability, determined by the distribution function of the generic injections. Unlike analytical/semi-analytical methods, which usually estimate sensitive volume using single detector sensitivity, the method is accurate within statistical errors, comes at no added cost and requires minimal computational resources.
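
    The weighted Monte-Carlo estimator described above amounts to re-weighting each found injection by the ratio of the target population density to the density the injections were actually drawn from. A minimal sketch with toy numbers (the uncertainty term is an approximate Monte-Carlo error, not the paper's exact expression):

    ```python
    import numpy as np

    def sensitive_volume(found, p_target, p_draw, v_total):
        """Weighted Monte-Carlo estimate of the population-averaged sensitive volume.

        found    : boolean array, True where an injection crossed the SNR threshold
        p_target : target population density at each injection's parameters
        p_draw   : density the generic injections were actually drawn from
        v_total  : astrophysical volume in which the injections were placed
        """
        w = np.asarray(p_target, float) / np.asarray(p_draw, float)  # importance weights
        keep = w[np.asarray(found)]
        n = w.size
        v_hat = v_total * keep.sum() / n
        err = v_total * np.sqrt(np.sum(keep ** 2)) / n  # approximate MC uncertainty
        return v_hat, err

    # Toy example: 10 injections drawn uniformly, re-weighted to a hypothetical model
    rng = np.random.default_rng(0)
    p_draw = np.full(10, 1.0)
    p_target = rng.uniform(0.5, 1.5, 10)
    found = rng.uniform(size=10) > 0.4
    print(sensitive_volume(found, p_target, p_draw, v_total=1.0e9))  # e.g. Mpc^3
    ```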

  17. Stroke onset time estimation from multispectral quantitative magnetic resonance imaging in a rat model of focal permanent cerebral ischemia.

    PubMed

    McGarry, Bryony L; Rogers, Harriet J; Knight, Michael J; Jokivarsi, Kimmo T; Sierra, Alejandra; Gröhn, Olli Hj; Kauppinen, Risto A

    2016-08-01

    Quantitative T2 relaxation magnetic resonance imaging allows estimation of stroke onset time. We aimed to examine the accuracy of quantitative T1 and quantitative T2 relaxation times alone and in combination to provide estimates of stroke onset time in a rat model of permanent focal cerebral ischemia and map the spatial distribution of elevated quantitative T1 and quantitative T2 to assess tissue status. Permanent middle cerebral artery occlusion was induced in Wistar rats. Animals were scanned at 9.4T for quantitative T1, quantitative T2, and Trace of Diffusion Tensor (Dav) up to 4 h post-middle cerebral artery occlusion. Time courses of differentials of quantitative T1 and quantitative T2 in ischemic and non-ischemic contralateral brain tissue (ΔT1, ΔT2) and volumes of tissue with elevated T1 and T2 relaxation times (f1, f2) were determined. TTC staining was used to highlight permanent ischemic damage. ΔT1, ΔT2, f1, f2, and the volume of tissue with both elevated quantitative T1 and quantitative T2 (V(Overlap)) increased with time post-middle cerebral artery occlusion, allowing stroke onset time to be estimated. V(Overlap) provided the most accurate estimate with an uncertainty of ±25 min. At all time-points, regions with elevated relaxation times were smaller than areas with Dav-defined ischemia. Stroke onset time can be determined by quantitative T1 and quantitative T2 relaxation times and tissue volumes. Combining quantitative T1 and quantitative T2 provides the most accurate estimate and potentially identifies irreversibly damaged brain tissue. © 2016 World Stroke Organization.

  18. Performance of dense digital surface models based on image matching in the estimation of plot-level forest variables

    NASA Astrophysics Data System (ADS)

    Nurminen, Kimmo; Karjalainen, Mika; Yu, Xiaowei; Hyyppä, Juha; Honkavaara, Eija

    2013-09-01

    Recent research results have shown that the performance of digital surface model extraction using novel high-quality photogrammetric images and image matching is a highly competitive alternative to laser scanning. In this article, we proceed to compare the performance of these two methods in the estimation of plot-level forest variables. Dense point clouds extracted from aerial frame images were used to estimate the plot-level forest variables needed in a forest inventory covering 89 plots. We analyzed images with 60% and 80% forward overlaps and used test plots with off-nadir angles of between 0° and 20°. When compared to reference ground measurements, the airborne laser scanning (ALS) data proved to be the most accurate: it yielded root mean square error (RMSE) values of 6.55% for mean height, 11.42% for mean diameter, and 20.72% for volume. When we applied a forward overlap of 80%, the corresponding results from aerial images were 6.77% for mean height, 12.00% for mean diameter, and 22.62% for volume. A forward overlap of 60% resulted in slightly deteriorated RMSE values of 7.55% for mean height, 12.20% for mean diameter, and 22.77% for volume. According to our results, the use of higher forward overlap produced only slightly better results in the estimation of these forest variables. Additionally, we found that the estimation accuracy was not significantly impacted by the increase in the off-nadir angle. Our results confirmed that digital aerial photographs were about as accurate as ALS in forest resources estimation as long as a terrain model was available.

  19. Using a traffic simulation model (VISSIM) with an emissions model (MOVES) to predict emissions from vehicles on a limited-access highway.

    PubMed

    Abou-Senna, Hatem; Radwan, Essam; Westerlund, Kurt; Cooper, C David

    2013-07-01

    The Intergovernmental Panel on Climate Change (IPCC) estimates that baseline global GHG emissions may increase 25-90% from 2000 to 2030, with carbon dioxide (CO2) emissions growing 40-110% over the same period. On-road vehicles are a major source of CO2 emissions in all the developed countries, and in many of the developing countries in the world. Similarly, several criteria air pollutants are associated with transportation, for example, carbon monoxide (CO), nitrogen oxides (NO(x)), and particulate matter (PM). Therefore, accurately quantifying transportation-related emissions from vehicles is essential. The new U.S. Environmental Protection Agency (EPA) mobile source emissions model, MOVES2010a (MOVES), can estimate vehicle emissions on a second-by-second basis, creating the opportunity to combine a microscopic traffic simulation model (such as VISSIM) with MOVES to obtain accurate results. This paper presents an examination of four different approaches to capture the environmental impacts of vehicular operations on a 10-mile stretch of Interstate 4 (I-4), an urban limited-access highway in Orlando, FL. First (at the most basic level), emissions were estimated for the entire 10-mile section "by hand" using one average traffic volume and average speed. Then three advanced levels of detail were studied using VISSIM/MOVES to analyze smaller links: average speeds and volumes (AVG), second-by-second link drive schedules (LDS), and second-by-second operating mode distributions (OPMODE). This paper analyzes how the various approaches affect predicted emissions of CO, NO(x), PM2.5, PM10, and CO2. The results demonstrate that obtaining precise and comprehensive operating mode distributions on a second-by-second basis provides more accurate emission estimates. Specifically, emission rates are highly sensitive to stop-and-go traffic and the associated driving cycles of acceleration, deceleration, and idling. Using the AVG or LDS approach may overestimate or underestimate emissions, respectively, compared to an operating mode distribution approach. Transportation agencies and researchers in the past have estimated emissions using one average speed and volume on a long stretch of roadway. With MOVES, there is an opportunity for higher precision and accuracy. Integrating a microscopic traffic simulation model (such as VISSIM) with MOVES allows one to obtain precise and accurate emissions estimates. The proposed emission rate estimation process also can be extended to gridded emissions for ozone modeling, or to localized air quality dispersion modeling, where temporal and spatial resolution of emissions is essential to predict the concentration of pollutants near roadways.
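
    A toy illustration of why the operating-mode approach matters: total link emissions are a sum over mode bins of time-in-mode times a mode-specific rate. The mode names and rates below are made-up placeholders, not MOVES operating-mode IDs or MOVES emission rates.

    ```python
    def link_emissions(seconds_in_opmode, rate_g_per_veh_s, vehicles):
        """Total link emissions (g) = vehicles * sum over operating-mode bins of
        (seconds spent in that mode per vehicle * emission rate for that mode)."""
        return vehicles * sum(seconds_in_opmode[m] * rate_g_per_veh_s[m]
                              for m in seconds_in_opmode)

    # Hypothetical CO2 rates (g/veh-s) and per-vehicle time-in-mode on one link
    rates = {"idle": 1.2, "cruise": 3.0, "accel": 6.5, "decel": 0.9}
    time_in_mode = {"idle": 20, "cruise": 90, "accel": 15, "decel": 15}
    print(f"{link_emissions(time_in_mode, rates, vehicles=1800):,.0f} g CO2")
    ```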

  20. Improved accuracy of aboveground biomass and carbon estimates for live trees in forests of the eastern United States

    Treesearch

    Philip Radtke; David Walker; Jereme Frank; Aaron Weiskittel; Clara DeYoung; David MacFarlane; Grant Domke; Christopher Woodall; John Coulston; James Westfall

    2017-01-01

    Accurate estimation of forest biomass and carbon stocks at regional to national scales is a key requirement in determining terrestrial carbon sources and sinks on United States (US) forest lands. To that end, comprehensive assessment and testing of alternative volume and biomass models were conducted for individual tree models employed in the component ratio method (...

  1. Analysis and evaluation of methods for backcalculation of Mr values : volume 1 : research report : final report.

    DOT National Transportation Integrated Search

    1993-01-01

    Use of the 1986 AASHTO Design Guide requires accurate estimates of the resilient modulus of flexible pavement materials. Traditionally, these properties have been determined from either laboratory testing or by backcalculation from deflection data. S...

  2. Accuracy of MRI volume measurements of breast lesions: comparison between automated, semiautomated and manual assessment.

    PubMed

    Rominger, Marga B; Fournell, Daphne; Nadar, Beenarose Thanka; Behrens, Sarah N M; Figiel, Jens H; Keil, Boris; Heverhagen, Johannes T

    2009-05-01

    The aim of this study was to investigate the efficacy of a dedicated software tool for automated and semiautomated volume measurement in contrast-enhanced (CE) magnetic resonance mammography (MRM). Ninety-six breast lesions with histopathological workup (27 benign, 69 malignant) were re-evaluated by different volume measurement techniques. Volumes of all lesions were extracted automatically (AVM) and semiautomatically (SAVM) from CE 3D MRM and compared with manual 3D contour segmentation (manual volume measurement, MVM, reference measurement technique) and volume estimates based on maximum diameter measurement (MDM). Compared with MVM as reference method MDM, AVM and SAVM underestimated lesion volumes by 63.8%, 30.9% and 21.5%, respectively, with significantly different accuracy for benign (102.4%, 18.4% and 11.4%) and malignant (54.9%, 33.0% and 23.1%) lesions (p < 0.05). Inter- and intraobserver reproducibility was best for AVM (mean difference +/- 2SD, 1.0 +/- 9.7% and 1.8 +/- 12.1%) followed by SAVM (4.3 +/- 25.7% and 4.3 +/- 7.9%), MVM (2.3 +/- 38.2% and 8.6 +/- 31.8%) and MDM (33.9 +/- 128.4% and 9.3 +/- 55.9%). SAVM is more accurate for volume assessment of breast lesions than MDM and AVM. Volume measurement is less accurate for malignant than benign lesions.

  3. Vestibular schwannomas: Accuracy of tumor volume estimated by ice cream cone formula using thin-sliced MR images

    PubMed Central

    Ho, Hsing-Hao; Li, Ya-Hui; Lee, Jih-Chin; Wang, Chih-Wei; Yu, Yi-Lin; Hueng, Dueng-Yuan; Hsu, Hsian-He

    2018-01-01

    Purpose We estimated the volume of vestibular schwannomas by an ice cream cone formula using thin-sliced magnetic resonance images (MRI) and compared the estimation accuracy among different estimating formulas and between different models. Methods The study was approved by a local institutional review board. A total of 100 patients with vestibular schwannomas examined by MRI between January 2011 and November 2015 were enrolled retrospectively. Informed consent was waived. Volumes of vestibular schwannomas were estimated by cuboidal, ellipsoidal, and spherical formulas based on a one-component model, and cuboidal, ellipsoidal, Linskey’s, and ice cream cone formulas based on a two-component model. The estimated volumes were compared to the volumes measured by planimetry. Intraobserver reproducibility and interobserver agreement was tested. Estimation error, including absolute percentage error (APE) and percentage error (PE), was calculated. Statistical analysis included intraclass correlation coefficient (ICC), linear regression analysis, one-way analysis of variance, and paired t-tests with P < 0.05 considered statistically significant. Results Overall tumor size was 4.80 ± 6.8 mL (mean ±standard deviation). All ICCs were no less than 0.992, suggestive of high intraobserver reproducibility and high interobserver agreement. Cuboidal formulas significantly overestimated the tumor volume by a factor of 1.9 to 2.4 (P ≤ 0.001). The one-component ellipsoidal and spherical formulas overestimated the tumor volume with an APE of 20.3% and 29.2%, respectively. The two-component ice cream cone method, and ellipsoidal and Linskey’s formulas significantly reduced the APE to 11.0%, 10.1%, and 12.5%, respectively (all P < 0.001). Conclusion The ice cream cone method and other two-component formulas including the ellipsoidal and Linskey’s formulas allow for estimation of vestibular schwannoma volume more accurately than all one-component formulas. PMID:29438424

  4. Methods for the quantification of coarse woody debris and an examination of its spatial patterning: A study from the Tenderfoot Creek Experimental Forest, MT

    Treesearch

    Paul B. Alaback; Duncan C. Lutes

    1997-01-01

    Methods for the quantification of coarse woody debris volume and the description of spatial patterning were studied in the Tenderfoot Creek Experimental Forest, Montana. The line transect method was found to be an accurate, unbiased estimator of down debris volume (> 10cm diameter) on 1/4 hectare fixed-area plots, when perpendicular lines were used. The Fischer...
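
    For reference, the widely used Van Wagner line-intersect estimator gives down-debris volume per unit area from the squared diameters of pieces crossed by a transect; whether the study applied exactly this form is an assumption.

    ```python
    def line_intersect_volume(diameters_cm, transect_length_m):
        """Van Wagner line-intersect estimator of down woody debris volume per unit area:
        V (m^3/ha) = pi^2 / (8 L) * sum(d_i^2), with d in cm and L in m.
        (The 1.234 constant folds in the cm^2 -> m^2 and per-m^2 -> per-ha conversions.)"""
        return 1.234 * sum(d ** 2 for d in diameters_cm) / transect_length_m

    # Hypothetical transect: pieces > 10 cm crossed on a 50 m line
    print(round(line_intersect_volume([12, 18, 25, 11], 50.0), 2), "m^3/ha")
    ```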

  5. Breast Volume Measurement by Recycling the Data Obtained From 2 Routine Modalities, Mammography and Magnetic Resonance Imaging.

    PubMed

    Itsukage, Shizu; Sowa, Yoshihiro; Goto, Mariko; Taguchi, Tetsuya; Numajiri, Toshiaki

    2017-01-01

    Objective: Preoperative prediction of breast volume is important in the planning of breast reconstructive surgery. In this study, we prospectively estimated the accuracy of measurement of breast volume using data from 2 routine modalities, mammography and magnetic resonance imaging, by comparison with volumes of mastectomy specimens. Methods: The subjects were 22 patients (24 breasts) who were scheduled to undergo total mastectomy for breast cancer. Preoperatively, magnetic resonance imaging volume measurement was performed using a medical imaging system and the mammographic volume was calculated using a previously proposed formula. Volumes of mastectomy specimens were measured intraoperatively using a method based on Archimedes' principle and Newton's third law. Results: The average breast volumes measured on magnetic resonance imaging and mammography were 318.47 ± 199.4 mL and 325.26 ± 217.36 mL, respectively. The correlation coefficients with mastectomy specimen volumes were 0.982 for magnetic resonance imaging and 0.911 for mammography. Conclusions: Breast volume measurement using magnetic resonance imaging was highly accurate but requires data analysis software. In contrast, breast volume measurement with mammography requires only a simple formula and is sufficiently accurate, although the accuracy was lower than that obtained with magnetic resonance imaging. These results indicate that mammography could be an alternative modality for breast volume measurement as a substitute for magnetic resonance imaging.

  6. Breast Volume Measurement by Recycling the Data Obtained From 2 Routine Modalities, Mammography and Magnetic Resonance Imaging

    PubMed Central

    Itsukage, Shizu; Goto, Mariko; Taguchi, Tetsuya; Numajiri, Toshiaki

    2017-01-01

    Objective: Preoperative prediction of breast volume is important in the planning of breast reconstructive surgery. In this study, we prospectively estimated the accuracy of measurement of breast volume using data from 2 routine modalities, mammography and magnetic resonance imaging, by comparison with volumes of mastectomy specimens. Methods: The subjects were 22 patients (24 breasts) who were scheduled to undergo total mastectomy for breast cancer. Preoperatively, magnetic resonance imaging volume measurement was performed using a medical imaging system and the mammographic volume was calculated using a previously proposed formula. Volumes of mastectomy specimens were measured intraoperatively using a method based on Archimedes’ principle and Newton's third law. Results: The average breast volumes measured on magnetic resonance imaging and mammography were 318.47 ± 199.4 mL and 325.26 ± 217.36 mL, respectively. The correlation coefficients with mastectomy specimen volumes were 0.982 for magnetic resonance imaging and 0.911 for mammography. Conclusions: Breast volume measurement using magnetic resonance imaging was highly accurate but requires data analysis software. In contrast, breast volume measurement with mammography requires only a simple formula and is sufficiently accurate, although the accuracy was lower than that obtained with magnetic resonance imaging. These results indicate that mammography could be an alternative modality for breast volume measurement as a substitute for magnetic resonance imaging. PMID:29308107

  7. Constraining explosive volcanism: subjective choices during estimates of eruption magnitude

    USGS Publications Warehouse

    Klawonn, Malin; Houghton, Bruce F.; Swanson, Don; Fagents, Sarah A.; Wessel, Paul; Wolfe, Cecily J.

    2014-01-01

    When estimating the magnitude of explosive eruptions from their deposits, individuals make three sets of critical choices with respect to input data: the spacing of sampling sites, the selection of contour intervals to constrain the field measurements, and the hand contouring of thickness/isomass data, respectively. Volcanologists make subjective calls, as there are no accepted published protocols and few accounts of how these choices will impact estimates of eruption magnitude. Here, for the first time, we took a set of unpublished thickness measurements from the 1959 Kīlauea Iki pyroclastic fall deposit and asked 101 volcanologists worldwide to hand contour the data. First, there were surprisingly consistent volume estimates across maps with three different sampling densities. Second, the variability in volume calculations imparted by individuals’ choices of contours is also surprisingly low and lies between s = 5 and 8 %. Third, volume estimation is insensitive to the extent to which different individuals “smooth” the raw data in constructing contour lines. Finally, large uncertainty is associated with the construction of the thinnest isopachs, which is likely to underestimate the actual trend of deposit thinning. The net result is that researchers can have considerable confidence in using volume or dispersal data from multiple authors and different deposits for comparative studies. These insights should help volcanologists around the world to optimize design and execution of field-based studies to characterize accurately the volume of pyroclastic deposits.
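
    One generic way to turn hand-contoured isopach data into a deposit volume (not necessarily the procedure used in this study) is to integrate the area enclosed by each isopach over thickness; note that material thinner than the thinnest isopach is neglected, which echoes the thin-isopach uncertainty discussed above.

    ```python
    import numpy as np

    def isopach_volume(thickness_cm, enclosed_area_km2):
        """Integrate the area enclosed by each isopach over thickness:
        V = integral of A(T) dT, evaluated here with the trapezoidal rule.
        Thicknesses must be ordered from thinnest (largest area) to thickest;
        material beyond the thinnest isopach is ignored."""
        t_m = np.asarray(thickness_cm, float) / 100.0         # cm -> m
        a_m2 = np.asarray(enclosed_area_km2, float) * 1.0e6   # km^2 -> m^2
        return np.trapz(a_m2, t_m)                            # m^3

    # Hypothetical isopach set (thickness in cm, enclosed area in km^2)
    thickness = [1, 5, 10, 25, 50, 100]
    area      = [30.0, 12.0, 6.0, 2.0, 0.6, 0.1]
    print(f"{isopach_volume(thickness, area):.3e} m^3")
    ```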

  8. Constraining explosive volcanism: subjective choices during estimates of eruption magnitude

    NASA Astrophysics Data System (ADS)

    Klawonn, Malin; Houghton, Bruce F.; Swanson, Donald A.; Fagents, Sarah A.; Wessel, Paul; Wolfe, Cecily J.

    2014-02-01

    When estimating the magnitude of explosive eruptions from their deposits, individuals make three sets of critical choices with respect to input data: the spacing of sampling sites, the selection of contour intervals to constrain the field measurements, and the hand contouring of thickness/isomass data, respectively. Volcanologists make subjective calls, as there are no accepted published protocols and few accounts of how these choices will impact estimates of eruption magnitude. Here, for the first time, we took a set of unpublished thickness measurements from the 1959 Kīlauea Iki pyroclastic fall deposit and asked 101 volcanologists worldwide to hand contour the data. First, there were surprisingly consistent volume estimates across maps with three different sampling densities. Second, the variability in volume calculations imparted by individuals' choices of contours is also surprisingly low and lies between s = 5 and 8 %. Third, volume estimation is insensitive to the extent to which different individuals "smooth" the raw data in constructing contour lines. Finally, large uncertainty is associated with the construction of the thinnest isopachs, which is likely to underestimate the actual trend of deposit thinning. The net result is that researchers can have considerable confidence in using volume or dispersal data from multiple authors and different deposits for comparative studies. These insights should help volcanologists around the world to optimize design and execution of field-based studies to characterize accurately the volume of pyroclastic deposits.

  9. Terrestrial Laser Scanning for Coastal Geomorphologic Research in Western Greece

    NASA Astrophysics Data System (ADS)

    Hoffmeister, D.; Tilly, N.; Curdt, C.; Aasen, H.; Ntageretzis, K.; Hadler, H.; Willershäuser, T.; Vött, A.; Bareth, G.

    2012-07-01

    We used terrestrial laser scanning (TLS) for (i) accurate volume estimation of dislocated boulders moved by high-energy impacts and (ii) monitoring of annual coastal changes. In this contribution, we present three selected sites in Western Greece that were surveyed over a time span of four years (2008-2011). The Riegl LMS-Z420i laser scanner was used in combination with a precise DGPS system (Topcon HiPer Pro). Each scan position and a further target were recorded for georeferencing and merging of the point clouds. For the annual detection of changes, reference points for the base station of the DGPS system were marked. Our studies show that TLS is capable of accurately estimating the volumes of boulders that were dislocated and deposited inland from the littoral zone. The mass of each boulder was calculated from this 3D-reconstructed volume and corresponding density data. The masses turned out to be considerably smaller than the commonly estimated masses based on tape measurements and corresponding density approximations. The accurate mass data were incorporated into wave transport equations, which estimate the wave velocities of high-energy impacts. As expected, these yield smaller wave velocities because of the smaller incorporated masses. Furthermore, TLS is capable of monitoring annual changes in coastal areas. The changes are detected by comparing high-resolution digital elevation models from each year. At one beach site, larger areas of seaweed and sandy sediments were eroded; in contrast, coarser gravel of 30-50 cm diameter accumulated. At the other area, with larger boulders and a different coastal configuration, only slight differences were detectable. In low-lying coastal areas and along recent beaches, post-processing of the point clouds turned out to be more difficult because of noise from water and shadowing effects. However, our studies show that TLS is an appropriate and promising tool in different littoral settings. The combination of both instruments worked well, and the annual positioning procedure with our own survey points is precise for this purpose.
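
    The mass comparison described above reduces to volume × density for the TLS reconstruction versus an axes-product approximation for tape measurements; a minimal sketch with an assumed rock density:

    ```python
    def mass_from_tls(volume_m3, density_kg_m3):
        """Mass directly from the TLS-reconstructed 3D volume."""
        return volume_m3 * density_kg_m3

    def mass_from_tape(a_m, b_m, c_m, density_kg_m3, shape_factor=1.0):
        """Conventional tape-based estimate from the three boulder axes.
        shape_factor = 1.0 treats the boulder as a rectangular block, which is
        one reason tape-based masses tend to exceed the TLS-derived ones."""
        return shape_factor * a_m * b_m * c_m * density_kg_m3

    rho = 2600.0  # kg/m^3, assumed limestone-like density
    print(mass_from_tls(1.8, rho), mass_from_tape(2.0, 1.5, 1.0, rho))
    ```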

  10. Direct Measurement of Proximal Isovelocity Surface Area by Real-Time Three-Dimensional Color Doppler for Quantitation of Aortic Regurgitant Volume: An In Vitro Validation

    PubMed Central

    Pirat, Bahar; Little, Stephen H.; Igo, Stephen R.; McCulloch, Marti; Nosé, Yukihiko; Hartley, Craig J.; Zoghbi, William A.

    2012-01-01

    Objective The proximal isovelocity surface area (PISA) method is useful in the quantitation of aortic regurgitation (AR). We hypothesized that actual measurement of PISA provided with real-time 3-dimensional (3D) color Doppler yields more accurate regurgitant volumes than those estimated by 2-dimensional (2D) color Doppler PISA. Methods We developed a pulsatile flow model for AR with an imaging chamber in which interchangeable regurgitant orifices with defined shapes and areas were incorporated. An ultrasonic flow meter was used to calculate the reference regurgitant volumes. A total of 29 different flow conditions for 5 orifices with different shapes were tested at a rate of 72 beats/min. 2D PISA was calculated as 2π r2, and 3D PISA was measured from 8 equidistant radial planes of the 3D PISA. Regurgitant volume was derived as PISA × aliasing velocity × time velocity integral of AR/peak AR velocity. Results Regurgitant volumes by flow meter ranged between 12.6 and 30.6 mL/beat (mean 21.4 ± 5.5 mL/beat). Regurgitant volumes estimated by 2D PISA correlated well with volumes measured by flow meter (r = 0.69); however, a significant underestimation was observed (y = 0.5x + 0.6). Correlation with flow meter volumes was stronger for 3D PISA-derived regurgitant volumes (r = 0.83); significantly less underestimation of regurgitant volumes was seen, with a regression line close to identity (y = 0.9x + 3.9). Conclusion Direct measurement of PISA is feasible, without geometric assumptions, using real-time 3D color Doppler. Calculation of aortic regurgitant volumes with 3D color Doppler using this methodology is more accurate than conventional 2D method with hemispheric PISA assumption. PMID:19168322

  11. An estimation of vehicle kilometer traveled and on-road emissions using the traffic volume and travel speed on road links in Incheon City.

    PubMed

    Jung, Sungwoon; Kim, Jounghwa; Kim, Jeongsoo; Hong, Dahee; Park, Dongjoo

    2017-04-01

    The objective of this study is to estimate vehicle kilometers traveled (VKT) and on-road emissions using traffic volumes in an urban area. We estimated VKT in two ways: one based on registered vehicles and the other based on observed traffic volumes. VKT based on registered vehicles was 2.11 times greater than VKT based on traffic volumes because the two estimation methods differ; to compare the two values, we therefore defined an inner VKT, i.e., the VKT actually driven within the urban area. We also focused on freight modes because they discharge a large share of air pollutant emissions. The analysis showed that medium and large trucks registered in other regions travel into the target city to carry freight, since the city contains many industrial and logistics areas. Freight is transferred through harbors, large logistics centers or intermediate locations before being moved to its final destination, and during this process most freight is carried by medium and large trucks and trailers rather than by small trucks. Consequently, the inflow of such trucks from other areas exceeds the number of locally registered vehicles. Most emissions from diesel trucks had been overestimated in comparison with estimates based on VKT from observed traffic volumes in the target city. These findings indicate that VKT based on traffic volume and travel speed on road links is essential for accurately estimating diesel truck emissions in the target city, and they support the assessment of the effect of on-road emissions on urban air quality in Korea. Copyright © 2016. Published by Elsevier B.V.
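
    A common link-based formulation consistent with this description computes VKT as traffic volume times link length and multiplies it by a speed-dependent emission factor. The sketch below is a generic illustration under that assumption; it is not the study's actual emission model, and the volumes and factors are made up.

      def link_vkt(traffic_volume_veh, link_length_km):
          """Vehicle kilometers traveled on one road link."""
          return traffic_volume_veh * link_length_km

      def link_emissions_g(traffic_volume_veh, link_length_km, emission_factor_g_per_km):
          """Emissions on one link: VKT * speed-dependent emission factor (g/km)."""
          return link_vkt(traffic_volume_veh, link_length_km) * emission_factor_g_per_km

      # Illustrative network: (daily volume, length in km, NOx factor in g/km at the link's speed)
      links = [(12000, 2.4, 0.9), (8500, 1.1, 1.3), (4300, 3.7, 0.7)]
      total_vkt = sum(link_vkt(v, l) for v, l, _ in links)
      total_nox = sum(link_emissions_g(v, l, ef) for v, l, ef in links)
      print(total_vkt, "veh-km,", round(total_nox / 1000, 1), "kg NOx")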

  12. CT volumetry of the skeletal tissues

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brindle, James M.; Alexandre Trindade, A.; Pichardo, Jose C.

    2006-10-15

    Computed tomography (CT) is an important and widely used modality in the diagnosis and treatment of various cancers. In the field of molecular radiotherapy, the use of spongiosa volume (combined tissues of the bone marrow and bone trabeculae) has been suggested as a means to improve the patient-specificity of bone marrow dose estimates. The noninvasive estimation of an organ volume comes with some degree of error or variation from the true organ volume. The present study explores the ability to obtain estimates of spongiosa volume or its surrogate via manual image segmentation. The variation among different segmentation raters was explored and found not to be statistically significant (p value >0.05). Accuracy was assessed by having several raters manually segment a polyvinyl chloride (PVC) pipe with known volumes. Segmentation of the outer region of the PVC pipe resulted in mean percent errors as great as 15%, while segmentation of the pipe's inner region resulted in mean percent errors within approximately 5%. Differences between volumes estimated with the high-resolution CT data set (typical of ex vivo skeletal scans) and the low-resolution CT data set (typical of in vivo skeletal scans) were also explored using both patient CT images and a PVC pipe phantom. While a statistically significant difference (p value <0.002) between the high-resolution and low-resolution data sets was observed with excised femoral heads obtained following total hip arthroplasty, the mean difference between the high-resolution and low-resolution data sets was found to be only 1.24 and 2.18 cm³ for spongiosa and cortical bone, respectively. With respect to differences observed with the PVC pipe, the variation between the high-resolution and low-resolution mean percent errors was as high as approximately 20% for the outer region volume estimates and only as high as approximately 6% for the inner region volume estimates. The findings from this study suggest that manual segmentation is a reasonably accurate and reliable means for the in vivo estimation of spongiosa volume. This work also provides a foundation for future studies where spongiosa volumes are estimated by various raters in more comprehensive CT data sets.

  13. Volume effects of late term normal tissue toxicity in prostate cancer radiotherapy

    NASA Astrophysics Data System (ADS)

    Bonta, Dacian Viorel

    Modeling of volume effects for treatment toxicity is paramount for the optimization of radiation therapy. This thesis proposes a new model for calculating volume effects in gastro-intestinal and genito-urinary normal tissue complication probability (NTCP) following radiation therapy for prostate carcinoma. The radiobiological and pathological basis for this model and its relationship to other models are detailed. A review of radiobiological experiments and published clinical data identified salient features and specific properties that a biologically adequate model must conform to. The new model was fit to a set of actual clinical data. In order to verify the goodness of fit, two established NTCP models and a non-NTCP measure of complication risk were fitted to the same clinical data. The model parameters were fitted by maximum likelihood estimation. Within the framework of the maximum likelihood approach, I estimated the parameter uncertainties for each complication prediction model. The quality of fit was determined using the Akaike Information Criterion. Based on the model that provided the best fit, I identified the volume effects for both types of toxicities. Computer-based bootstrap resampling of the original dataset was used to estimate the bias and variance of the fitted parameter values. Computer simulation was also used to estimate the population size that yields a specific uncertainty level (3%) in the value of the predicted complication probability. The same method was used to estimate the size of the patient population needed for an accurate choice of the model underlying the NTCP. The results indicate that, depending on the number of parameters of a specific NTCP model, 100 patients (for two-parameter models) or 500 patients (for three-parameter models) are needed for an accurate parameter fit. Correlation of complication occurrence in patients was also investigated. The results suggest that complication outcomes are correlated within a patient, although the correlation coefficient is rather small.

  14. Simulation of DNAPL migration in heterogeneous translucent porous media based on estimation of representative elementary volume

    NASA Astrophysics Data System (ADS)

    Wu, Ming; Wu, Jianfeng; Wu, Jichun

    2017-10-01

    When a dense nonaqueous phase liquid (DNAPL) enters the subsurface environment, its migration behavior is crucially affected by the permeability and entry pressure of the porous media. A prerequisite for accurately simulating DNAPL migration in aquifers is therefore the determination of the permeability, the entry pressure and the corresponding representative elementary volumes (REV) of the porous media; however, these quantities are difficult to determine. This study utilizes the light transmission micro-tomography (LTM) method to determine the permeability and entry pressure of two-dimensional (2D) translucent porous media and integrates LTM with a relative gradient error criterion to quantify the corresponding REV of the porous media. As a result, DNAPL migration in porous media can be accurately simulated by discretizing the model at the REV scale. To validate the quantification methods, an experiment of perchloroethylene (PCE) migration was conducted in a two-dimensional heterogeneous bench-scale aquifer cell. Based on the permeability, entry pressure and REV scales of the 2D porous media determined by LTM and the relative gradient error, models with different discretization grid sizes were used to simulate the PCE migration. The model based on the REV size agrees well with the experimental results over the entire migration period, including the calibration, verification and validation processes. This helps to better understand the microstructure of porous media and to accurately simulate DNAPL migration in aquifers based on REV estimation.

  15. Estimation of diaphragm length in patients with severe chronic obstructive pulmonary disease.

    PubMed

    McKenzie, D K; Gorman, R B; Tolman, J; Pride, N B; Gandevia, S C

    2000-11-01

    In patients with advanced chronic obstructive pulmonary disease (COPD), diaphragm function may be compromised because of reduced muscle fibre length. Diaphragm length (L(Di)) can be estimated from measurements of transverse diameter of the rib cage (D(Rc)) and the length of the zone of apposition (L(Zapp)) in healthy subjects, but this method has not been validated in patients with COPD. Postero-anterior chest radiographs were obtained at total lung capacity (TLC), functional residual capacity (FRC) and residual volume (RV) in nine male patients with severe COPD (mean [S.D.]; FEV(1), 23 [6] %pred.; FRC, 199 [15] %pred.). Radiographs taken at TLC were used to identify the lateral costal insertions of the diaphragm (L(Zapp) assumed to approach zero at TLC). L(Di) was measured directly and also estimated from measurements of L(Zapp) and D(Rc) using a prediction equation derived from healthy subjects. The estimation of L(Di) was highly accurate with an intraclass correlation coefficient of 0.93 and 95% CI of approximately ±8% of the true value. L(Di) decreased from 426 (64) mm at RV to 305 (31) mm at TLC. As there were only small and variable changes in D(Rc) across the lung volume range, most of the L(Di) changes occurred in the zone of apposition. Additional studies showed that measurements of L(Di) from PA and lateral radiographs performed at different lung volumes were tightly correlated. These results suggest that non-invasive measurements of L(Zapp) in the coronal plane (e.g. using ultrasonography) and D(Rc) (e.g. using magnetometers) can be used to provide an accurate estimate of L(Di) in COPD patients.

  16. Vector Competence of Mosquitoes for Arboviruses

    DTIC Science & Technology

    1989-07-30

    dose required to infect 50% of the Rockefeller strain of Ae. aegypti, following feeding on a pledget, has now been more accurately estimated to be...greater surface area for feeding, when compared to gauze pledgets, thus reducing competition between mosquitoes for access to feeding sites. The volume of

  17. Using airborne laser altimetry to estimate Sabellaria alveolata (Polychaeta: Sabellariidae) reefs volume in tidal flat environments

    NASA Astrophysics Data System (ADS)

    Noernberg, Mauricio Almeida; Fournier, Jérôme; Dubois, Stanislas; Populus, Jacques

    2010-12-01

    This study exploited aerial photographs and a LiDAR digital elevation model to quantify the volume of complex intertidal landforms. A first volume estimation of the main sabellariid polychaete reef complex of the Bay of Mont-Saint-Michel, France, is presented. Sabellaria alveolata is an engineering species that heavily modifies its environment. This gregarious tube-building annelid forms dense and solid reefs of bioclastic coarse sand which can reach several km². Since 1970 a very strong decline of the reefs has been observed. The authorities have curbed fishing activities without any noticeable change in reef health status. The S. alveolata reef volume is estimated to be 132 048 m³ (96 301 m³ for the Sainte-Anne reef and 35 747 m³ for the Champeaux reef). Further LiDAR surveys will be needed to understand and quantify the accretion/erosion processes at play in the reef dynamics. Because of the internal variability of the topographic complexity of the reef, characterized by crevices, cracks and holes rather than whole blocks, further studies are needed to calculate the volume of the reef more accurately.
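
    Volume from a LiDAR digital elevation model is typically obtained by summing, over the reef footprint, the height of the reef surface above a reference base surface times the cell area. The sketch below illustrates that generic computation; the grid values and cell size are made up and are not data from the Bay of Mont-Saint-Michel survey.

      import numpy as np

      def dem_volume_m3(surface_dem, base_dem, cell_size_m, footprint_mask):
          """Volume between a LiDAR surface DEM and a base surface inside a footprint mask."""
          thickness = np.clip(surface_dem - base_dem, 0.0, None)  # ignore negative residuals
          return float(np.sum(thickness[footprint_mask]) * cell_size_m ** 2)

      # Toy 4x4 grid, 1 m cells: reef stands 0.3-1.5 m above the interpolated tidal flat
      surface = np.array([[0.0, 0.5, 1.0, 0.0],
                          [0.5, 1.5, 1.2, 0.3],
                          [0.2, 1.0, 0.8, 0.0],
                          [0.0, 0.0, 0.0, 0.0]])
      base = np.zeros_like(surface)
      mask = surface > 0.1
      print(dem_volume_m3(surface, base, 1.0, mask), "m^3")  # 7.0 m^3 for this toy grid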

  18. Scan-based volume animation driven by locally adaptive articulated registrations.

    PubMed

    Rhee, Taehyun; Lewis, J P; Neumann, Ulrich; Nayak, Krishna S

    2011-03-01

    This paper describes a complete system to create anatomically accurate example-based volume deformation and animation of articulated body regions, starting from multiple in vivo volume scans of a specific individual. In order to solve the correspondence problem across volume scans, a template volume is registered to each sample. The wide range of pose variations is first approximated by volume blend deformation (VBD), providing proper initialization of the articulated subject in different poses. A novel registration method is presented to efficiently reduce the computation cost while avoiding strong local minima inherent in complex articulated body volume registration. The algorithm highly constrains the degrees of freedom and search space involved in the nonlinear optimization, using hierarchical volume structures and locally constrained deformation based on the biharmonic clamped spline. Our registration step establishes a correspondence across scans, allowing a data-driven deformation approach in the volume domain. The results provide an occlusion-free person-specific 3D human body model, asymptotically accurate inner tissue deformations, and realistic volume animation of articulated movements driven by standard joint control estimated from the actual skeleton. Our approach also addresses the practical issues arising in using scans from living subjects. The robustness of our algorithms is tested by their applications on the hand, probably the most complex articulated region in the body, and the knee, a frequent subject area for medical imaging due to injuries. © 2011 IEEE

  19. Oxygen transfer rate estimation in oxidation ditches from clean water measurements.

    PubMed

    Abusam, A; Keesman, K J; Meinema, K; Van Straten, G

    2001-06-01

    Standard methods for the determination of oxygen transfer rate are based on assumptions that are not valid for oxidation ditches. This paper presents a realistic and simple new method to be used in the estimation of oxygen transfer rate in oxidation ditches from clean water measurements. The new method uses a loop-of-CSTRs model, which can be easily incorporated within control algorithms, for modelling oxidation ditches. Further, this method assumes zero oxygen transfer rates (KLa) in the unaerated CSTRs. Application of a formal estimation procedure to real data revealed that the aeration constant (k = KLa·VA, where VA is the volume of the aerated CSTR) can be determined significantly more accurately than KLa and VA. Therefore, the new method estimates k instead of KLa. From application to real data, this method proved to be more accurate than the commonly used Dutch standard method (STORA, 1980).

  20. Accuracy of predicted haemoglobin concentration on cardiopulmonary bypass in paediatric cardiac surgery: effect of different formulae for estimating patient blood volume.

    PubMed

    Redlin, Matthias; Boettcher, Wolfgang; Dehmel, Frank; Cho, Mi-Young; Kukucka, Marian; Habazettl, Helmut

    2017-11-01

    When applying a blood-conserving approach in paediatric cardiac surgery with the aim of reducing the transfusion of homologous blood products, the decision to use blood or blood-free priming of the cardiopulmonary bypass (CPB) circuit is often based on the predicted haemoglobin concentration (Hb) as derived from the pre-CPB Hb, the prime volume and the estimated blood volume. We assessed the accuracy of this approach and whether it may be improved by using more sophisticated methods of estimating the blood volume. Data from 522 paediatric cardiac surgery patients treated with CPB with blood-free priming in a 2-year period from May 2013 to May 2015 were collected. Inclusion criteria were body weight <15 kg and available Hb data immediately prior to and after the onset of CPB. The Hb on CPB was predicted according to Fick's principle from the pre-CPB Hb, the prime volume and the patient blood volume. Linear regression analyses and Bland-Altman plots were used to assess the accuracy of the Hb prediction. Different methods to estimate the blood volume were assessed and compared. The initial Hb on CPB correlated well with the predicted Hb (R² = 0.87, p < 0.001). A Bland-Altman plot revealed little bias at 0.07 g/dL, with limits of agreement from -1.35 to 1.48 g/dL. More sophisticated methods of estimating blood volume from lean body mass did not improve the Hb prediction, but rather increased bias. Hb prediction is reasonably accurate, with the best result obtained with the simplest method of estimating the blood volume at 80 mL/kg body weight. When deciding for or against blood-free priming, caution is necessary when the predicted Hb lies within ±2 g/dL of the transfusion trigger.
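
    The prediction described above follows a simple dilution relationship: the pre-CPB haemoglobin is diluted by the prime volume added to the estimated blood volume. The sketch below assumes that formulation and the 80 mL/kg blood volume estimate mentioned in the abstract; it is an illustration, not the authors' exact implementation, and the example case is invented.

      def estimated_blood_volume_ml(body_weight_kg, ml_per_kg=80.0):
          """Simple weight-based blood volume estimate (80 mL/kg performed best in this study)."""
          return body_weight_kg * ml_per_kg

      def predicted_hb_on_cpb(pre_cpb_hb_g_dl, prime_volume_ml, blood_volume_ml):
          """Haemodilution by the crystalloid prime: Hb * EBV / (EBV + prime)."""
          return pre_cpb_hb_g_dl * blood_volume_ml / (blood_volume_ml + prime_volume_ml)

      # Illustrative case: 8 kg infant, pre-CPB Hb 12 g/dL, 300 mL blood-free prime
      ebv = estimated_blood_volume_ml(8.0)
      print(round(predicted_hb_on_cpb(12.0, 300.0, ebv), 1), "g/dL")  # ~8.2 g/dL on CPB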

  1. Alpha shape theory for 3D visualization and volumetric measurement of brain tumor progression using magnetic resonance images.

    PubMed

    Hamoud Al-Tamimi, Mohammed Sabbih; Sulong, Ghazali; Shuaib, Ibrahim Lutfi

    2015-07-01

    Resection of brain tumors is a tricky task in surgery due to its direct influence on the patients' survival rate. Determining the tumor resection extent, with complete information vis-à-vis volume and dimensions in pre- and post-operative Magnetic Resonance Images (MRI), requires accurate estimation and comparison. The active contour segmentation technique is used to segment brain tumors on pre-operative MR images using self-developed software. Tumor volume is acquired from its contours via alpha shape theory. A graphical user interface is developed for rendering, visualizing and estimating the volume of a brain tumor. The Internet Brain Segmentation Repository (IBSR) dataset is employed to analyze and determine the repeatability and reproducibility of tumor volume estimates. The accuracy of the method is validated by comparing the volume estimated using the proposed method with the gold standard. Segmentation by the active contour technique is found to be capable of detecting brain tumor boundaries. Furthermore, the volume description and visualization enable an interactive examination of tumor tissue and its surroundings. Our results demonstrate that, in comparison with other existing standard methods, alpha shape theory is superior for precise volumetric measurement of the tumor. Copyright © 2015 Elsevier Inc. All rights reserved.

  2. Left ventricular endocardial surface detection based on real-time 3D echocardiographic data

    NASA Technical Reports Server (NTRS)

    Corsi, C.; Borsari, M.; Consegnati, F.; Sarti, A.; Lamberti, C.; Travaglini, A.; Shiota, T.; Thomas, J. D.

    2001-01-01

    OBJECTIVE: A new computerized semi-automatic method for left ventricular (LV) chamber segmentation is presented. METHODS: The LV is imaged by real-time three-dimensional echocardiography (RT3DE). The surface detection model, based on level set techniques, is applied to RT3DE data for image analysis. The modified level set partial differential equation we use is solved by applying numerical methods for conservation laws. The initial conditions are manually established on some slices of the entire volume. The solution obtained for each slice is a contour line corresponding to the boundary between the LV cavity and the LV endocardium. RESULTS: The mathematical model has been applied to sequences of frames of human hearts (volume range: 34-109 ml), both from 2D images reconstructed off-line and from RT3DE data. Volume estimates obtained by this new semi-automatic method show an excellent correlation with those obtained by manual tracing (r = 0.992). The dynamic change of LV volume during the cardiac cycle is also obtained. CONCLUSION: The volume estimation method is accurate; edge-based segmentation, image completion and volume reconstruction can be accomplished. The visualization technique also allows navigation into the reconstructed volume and display of any section of the volume.

  3. Evaluating the Accuracy of Common Runoff Estimation Methods for New Impervious Hot-Mix Asphalt

    EPA Science Inventory

    Accurately predicting runoff volume from impervious surfaces for water quality design events (e.g., 25.4 mm) is important for sizing green infrastructure stormwater control measures to meet water quality and infiltration design targets. The objective of this research was to quan...

  4. A combined surface/volume scattering retracking algorithm for ice sheet satellite altimetry

    NASA Technical Reports Server (NTRS)

    Davis, Curt H.

    1992-01-01

    An algorithm that is based upon a combined surface-volume scattering model is developed. It can be used to retrack individual altimeter waveforms over ice sheets. An iterative least-squares procedure is used to fit the combined model to the return waveforms. The retracking algorithm comprises two distinct sections. The first generates initial model parameter estimates from a filtered altimeter waveform. The second uses the initial estimates, the theoretical model, and the waveform data to generate corrected parameter estimates. This retracking algorithm can be used to assess the accuracy of elevations produced from current retracking algorithms when subsurface volume scattering is present. This is extremely important so that repeated altimeter elevation measurements can be used to accurately detect changes in the mass balance of the ice sheets. By analyzing the distribution of the model parameters over large portions of the ice sheet, regional and seasonal variations in the near-surface properties of the snowpack can be quantified.

  5. Runoff measurements and hydrological modelling for the estimation of rainfall volumes in an Alpine basin

    NASA Astrophysics Data System (ADS)

    Ranzi, R.; Bacchi, B.; Grossi, G.

    2003-01-01

    Streamflow data and water levels in reservoirs have been collected at 30 recording sites in the Toce river basin and its surroundings, upstream of Lago Maggiore, one of the target areas of the Mesoscale Alpine Programme (MAP) experiment. These data have been used for two purposes: firstly, the verification of a hydrological model, forced by rain-gauge data and the output of a mesoscale meteorological model, for flood simulation and forecasting; secondly, to solve an inverse problem--to estimate rainfall volumes from the runoff data in mountain areas where the influence of orography and the limits of actual monitoring systems prevent accurate measurement of precipitation. The methods are illustrated for 19-20 September 1999, MAP Intensive Observing Period 2b, an event with a 4-year return period for the Toce river basin. Uncertainties in the estimates of the areal rainfall volumes based on rain-gauge data and via the inverse solution are assessed.

  6. Total and regional body volumes derived from dual-energy X-ray absorptiometry output.

    PubMed

    Wilson, Joseph P; Fan, Bo; Shepherd, John A

    2013-01-01

    Total body volume is an important health metric used to measure body density, shape, and multicompartmental body composition but is currently only available through underwater weighing or air displacement plethysmography (ADP). The objective of this investigation was to derive an accurate body volume from dual-energy X-ray absorptiometry (DXA)-reported measures for advanced body composition models. Volunteers received a whole body DXA scan and an ADP measure at baseline (N = 25) and 6 mo (N = 22). Baseline measures were used to calibrate body volume from the reported DXA masses of fat, lean, and bone mineral content. A second population (N = 385) from the National Health and Nutrition Examination Survey was used to estimate the test-retest precision of regional (arms, legs, head, and trunk) and total body volumes. Overall, we found that DXA-volume was highly correlated to ADP-volume (R² = 0.99). The 6-mo change in total DXA-volume was highly correlated to change in ADP-volume (R² = 0.98). The root mean square percent coefficient of variation precision of DXA-volume measures ranged from 1.1% (total) to 3.2% (head). We conclude that the DXA-volume method can measure body volume accurately and precisely, can be used in body composition models, could be an independent health indicator, and is useful as a prospective or retrospective biomarker of body composition. Copyright © 2013 The International Society for Clinical Densitometry. Published by Elsevier Inc. All rights reserved.
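
    Body volume from DXA output is commonly built by dividing each reported mass compartment (fat, lean soft tissue, bone mineral content) by an assumed compartment density and summing. The calibration in the study was fit against ADP; the density values below are generic literature-style assumptions used purely for illustration, not the authors' fitted coefficients.

      # Assumed compartment densities (g/cm^3); illustrative, not the study's calibrated values.
      DENSITY_FAT = 0.9007
      DENSITY_LEAN = 1.100
      DENSITY_BMC = 2.982

      def dxa_body_volume_l(fat_g, lean_g, bmc_g):
          """Total body volume (litres) as the sum of compartment masses over densities."""
          volume_cm3 = fat_g / DENSITY_FAT + lean_g / DENSITY_LEAN + bmc_g / DENSITY_BMC
          return volume_cm3 / 1000.0

      # Illustrative DXA report: 20 kg fat, 50 kg lean, 2.5 kg bone mineral content
      print(round(dxa_body_volume_l(20000, 50000, 2500), 1), "L")  # ~68.5 L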

  7. Improved dose-volume histogram estimates for radiopharmaceutical therapy by optimizing quantitative SPECT reconstruction parameters

    NASA Astrophysics Data System (ADS)

    Cheng, Lishui; Hobbs, Robert F.; Segars, Paul W.; Sgouros, George; Frey, Eric C.

    2013-06-01

    In radiopharmaceutical therapy, an understanding of the dose distribution in normal and target tissues is important for optimizing treatment. Three-dimensional (3D) dosimetry takes into account patient anatomy and the nonuniform uptake of radiopharmaceuticals in tissues. Dose-volume histograms (DVHs) provide a useful summary representation of the 3D dose distribution and have been widely used for external beam treatment planning. Reliable 3D dosimetry requires an accurate 3D radioactivity distribution as the input. However, activity distribution estimates from SPECT are corrupted by noise and partial volume effects (PVEs). In this work, we systematically investigated OS-EM based quantitative SPECT (QSPECT) image reconstruction in terms of its effect on DVHs estimates. A modified 3D NURBS-based Cardiac-Torso (NCAT) phantom that incorporated a non-uniform kidney model and clinically realistic organ activities and biokinetics was used. Projections were generated using a Monte Carlo (MC) simulation; noise effects were studied using 50 noise realizations with clinical count levels. Activity images were reconstructed using QSPECT with compensation for attenuation, scatter and collimator-detector response (CDR). Dose rate distributions were estimated by convolution of the activity image with a voxel S kernel. Cumulative DVHs were calculated from the phantom and QSPECT images and compared both qualitatively and quantitatively. We found that noise, PVEs, and ringing artifacts due to CDR compensation all degraded histogram estimates. Low-pass filtering and early termination of the iterative process were needed to reduce the effects of noise and ringing artifacts on DVHs, but resulted in increased degradations due to PVEs. Large objects with few features, such as the liver, had more accurate histogram estimates and required fewer iterations and more smoothing for optimal results. Smaller objects with fine details, such as the kidneys, required more iterations and less smoothing at early time points post-radiopharmaceutical administration but more smoothing and fewer iterations at later time points when the total organ activity was lower. The results of this study demonstrate the importance of using optimal reconstruction and regularization parameters. Optimal results were obtained with different parameters at each time point, but using a single set of parameters for all time points produced near-optimal dose-volume histograms.
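
    A cumulative dose-volume histogram of the kind compared in this study reports, for each dose level, the fraction of the organ volume receiving at least that dose. The short sketch below shows that generic computation from a voxelized dose map and an organ mask; the arrays are synthetic and purely illustrative.

      import numpy as np

      def cumulative_dvh(dose, organ_mask, n_bins=100):
          """Fraction of organ volume receiving at least each dose level."""
          organ_dose = dose[organ_mask]
          bins = np.linspace(0.0, organ_dose.max(), n_bins)
          volume_fraction = np.array([(organ_dose >= d).mean() for d in bins])
          return bins, volume_fraction

      # Toy example: random 3D dose-rate map and a cubic "organ" mask
      rng = np.random.default_rng(0)
      dose = rng.gamma(shape=2.0, scale=1.5, size=(32, 32, 32))
      mask = np.zeros_like(dose, dtype=bool)
      mask[8:24, 8:24, 8:24] = True
      d, v = cumulative_dvh(dose, mask)
      print(d[50].round(2), v[50].round(3))  # a dose level and the fraction of volume >= it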

  8. Gravity Recovery and Climate Experiment (GRACE) detection of water storage changes in the Three Gorges Reservoir of China and comparison with in situ measurements

    NASA Astrophysics Data System (ADS)

    Wang, Xianwei; de Linage, Caroline; Famiglietti, James; Zender, Charles S.

    2011-12-01

    Water impoundment in the Three Gorges Reservoir (TGR) of China caused a large mass redistribution from the oceans to a concentrated land area in a short time period. We show that this mass shift is captured by the Gravity Recovery and Climate Experiment (GRACE) unconstrained global solutions at a 400 km spatial resolution after removing correlated errors. The WaterGAP Global Hydrology Model (WGHM) is selected to isolate the TGR contribution from regional water storage changes. For the first time, this study compares the GRACE (minus WGHM) estimated TGR volume changes with in situ measurements from April 2002 to May 2010 at a monthly time scale. During the 8 year study period, GRACE-WGHM estimated TGR volume changes show an increasing trend consistent with the TGR in situ measurements and lead to similar estimates of impounded water volume. GRACE-WGHM estimated total volume increase agrees to within 14% (3.2 km³) of the in situ measurements. This indicates that GRACE can retrieve the true amplitudes of large surface water storage changes in a concentrated area that is much smaller than the spatial resolution of its global harmonic solutions. The GRACE-WGHM estimated TGR monthly volume changes explain 76% (r² = 0.76) of in situ measurement monthly variability and have an uncertainty of 4.62 km³. Our results also indicate reservoir leakage and groundwater recharge due to TGR filling and contamination from neighboring lakes are nonnegligible in the GRACE total water storage changes. Moreover, GRACE observations could provide a relatively accurate estimate of global water volume withheld by newly constructed large reservoirs and their impacts on global sea level rise since 2002.

  9. Development of mapped stress-field boundary conditions based on a Hill-type muscle model.

    PubMed

    Cardiff, P; Karač, A; FitzPatrick, D; Flavin, R; Ivanković, A

    2014-09-01

    Forces generated in the muscles and tendons actuate the movement of the skeleton. Accurate estimation and application of these musculotendon forces in a continuum model is not a trivial matter. Frequently, musculotendon attachments are approximated as point forces; however, accurate estimation of local mechanics requires a more realistic application of musculotendon forces. This paper describes the development of mapped Hill-type muscle models as boundary conditions for a finite volume model of the hip joint, where the calculated muscle fibres map continuously between attachment sites. The applied muscle forces are calculated using active Hill-type models, where input electromyography signals are determined from gait analysis. Realistic muscle attachment sites are determined directly from tomography images. The mapped muscle boundary conditions, implemented in a finite volume structural OpenFOAM (ESI-OpenCFD, Bracknell, UK) solver, are employed to simulate the mid-stance phase of gait using a patient-specific natural hip joint, and a comparison is performed with the standard point load muscle approach. It is concluded that physiological joint loading is not accurately represented by simplistic muscle point loading conditions; however, when contact pressures are of sole interest, simplifying assumptions with regard to muscular forces may be valid. Copyright © 2014 John Wiley & Sons, Ltd.
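
    A generic Hill-type muscle model of the kind referenced here scales a maximum isometric force by an activation level and by normalized force-length and force-velocity factors, plus a passive elastic term. The sketch below is a simplified, self-contained illustration of that structure with assumed curve shapes and parameters; it is not the authors' OpenFOAM boundary-condition implementation.

      import math

      def active_force_length(l_norm):
          """Bell-shaped active force-length curve (assumed Gaussian form)."""
          return math.exp(-((l_norm - 1.0) ** 2) / 0.45)

      def force_velocity(v_norm):
          """Simplified Hill force-velocity factor; v_norm < 0 means shortening."""
          if v_norm < 0.0:                       # concentric contraction
              return max(0.0, (1.0 + v_norm) / (1.0 - 4.0 * v_norm))
          return 1.0 + 0.5 * v_norm              # mild eccentric enhancement

      def passive_force_length(l_norm):
          """Exponential passive stretch response beyond optimal fibre length."""
          return 0.0 if l_norm <= 1.0 else 0.05 * (math.exp(5.0 * (l_norm - 1.0)) - 1.0)

      def hill_muscle_force(activation, f_max_n, l_norm, v_norm):
          """Musculotendon force: a * Fmax * fl(l) * fv(v) + Fmax * fp(l)."""
          active = activation * f_max_n * active_force_length(l_norm) * force_velocity(v_norm)
          return active + f_max_n * passive_force_length(l_norm)

      # EMG-derived activation 0.6, 1500 N max force, fibre slightly stretched, slowly shortening
      print(round(hill_muscle_force(0.6, 1500.0, 1.05, -0.1), 1), "N")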

  10. Validation of Test Weighing Protocol to Estimate Enteral Feeding Volumes in Preterm Infants.

    PubMed

    Rankin, Michael W; Jimenez, Elizabeth Yakes; Caraco, Marina; Collinson, Marie; Lostetter, Lisa; DuPont, Tara L

    2016-11-01

    To evaluate the accuracy of pre- and postfeeding weights to estimate enteral feeding volumes in preterm infants. Single-center prospective cohort study of infants 28-36 weeks' corrected age receiving gavage feedings. For each test weight, 3 pre- and 3 postgavage feeding weights were obtained by study personnel, blinded to feeding volume, via a specific protocol. The correlation between test weight difference and actual volume ingested was assessed by the use of summary statistics, Spearman rho, and graphical analyses. The relationship between categorical predictive variables and a predefined acceptable difference (±5 mL) was assessed with the χ² or Fisher exact test. A total of 101 test weights were performed in 68 infants. Estimated and actual feeding volumes were highly correlated (r = 0.94, P < .001), with a mean absolute difference of 2.95 mL (SD: 2.70; range: 0, 12.3 mL; 5th, 95th percentile: 0, 9.3); 85% of test weights were within ±5 mL of actual feeding volume and did not vary significantly by corrected age, feeding tube or respiratory support type, feeding duration or volume, formula vs breast milk, or caloric density. With adherence to study protocol, 89% of test weights (66/74) were within ±5 mL of actual volume, compared with 71% (19/27, P = .04) when concerns about protocol adherence were noted (eg, difficulty securing oxygen tubing). Via the use of a standard protocol, feeding volumes can be estimated accurately by pre- and postfeeding weights. Test weighing could be a valuable tool to support direct breastfeeding in the neonatal intensive care unit. Copyright © 2016 Elsevier Inc. All rights reserved.

  11. A proposed framework for consensus-based lung tumour volume auto-segmentation in 4D computed tomography imaging

    NASA Astrophysics Data System (ADS)

    Martin, Spencer; Brophy, Mark; Palma, David; Louie, Alexander V.; Yu, Edward; Yaremko, Brian; Ahmad, Belal; Barron, John L.; Beauchemin, Steven S.; Rodrigues, George; Gaede, Stewart

    2015-02-01

    This work aims to propose and validate a framework for tumour volume auto-segmentation based on ground-truth estimates derived from multi-physician input contours to expedite 4D-CT based lung tumour volume delineation. 4D-CT datasets of ten non-small cell lung cancer (NSCLC) patients were manually segmented by 6 physicians. Multi-expert ground truth (GT) estimates were constructed using the STAPLE algorithm for the gross tumour volume (GTV) on all respiratory phases. Next, using a deformable model-based method, multi-expert GT on each individual phase of the 4D-CT dataset was propagated to all other phases providing auto-segmented GTVs and motion encompassing internal gross target volumes (IGTVs) based on GT estimates (STAPLE) from each respiratory phase of the 4D-CT dataset. Accuracy assessment of auto-segmentation employed graph cuts for 3D-shape reconstruction and point-set registration-based analysis yielding volumetric and distance-based measures. STAPLE-based auto-segmented GTV accuracy ranged from (81.51  ±  1.92) to (97.27  ±  0.28)% volumetric overlap of the estimated ground truth. IGTV auto-segmentation showed significantly improved accuracies with reduced variance for all patients ranging from 90.87 to 98.57% volumetric overlap of the ground truth volume. Additional metrics supported these observations with statistical significance. Accuracy of auto-segmentation was shown to be largely independent of selection of the initial propagation phase. IGTV construction based on auto-segmented GTVs within the 4D-CT dataset provided accurate and reliable target volumes compared to manual segmentation-based GT estimates. While inter-/intra-observer effects were largely mitigated, the proposed segmentation workflow is more complex than that of current clinical practice and requires further development.

  12. A proposed framework for consensus-based lung tumour volume auto-segmentation in 4D computed tomography imaging.

    PubMed

    Martin, Spencer; Brophy, Mark; Palma, David; Louie, Alexander V; Yu, Edward; Yaremko, Brian; Ahmad, Belal; Barron, John L; Beauchemin, Steven S; Rodrigues, George; Gaede, Stewart

    2015-02-21

    This work aims to propose and validate a framework for tumour volume auto-segmentation based on ground-truth estimates derived from multi-physician input contours to expedite 4D-CT based lung tumour volume delineation. 4D-CT datasets of ten non-small cell lung cancer (NSCLC) patients were manually segmented by 6 physicians. Multi-expert ground truth (GT) estimates were constructed using the STAPLE algorithm for the gross tumour volume (GTV) on all respiratory phases. Next, using a deformable model-based method, multi-expert GT on each individual phase of the 4D-CT dataset was propagated to all other phases providing auto-segmented GTVs and motion encompassing internal gross target volumes (IGTVs) based on GT estimates (STAPLE) from each respiratory phase of the 4D-CT dataset. Accuracy assessment of auto-segmentation employed graph cuts for 3D-shape reconstruction and point-set registration-based analysis yielding volumetric and distance-based measures. STAPLE-based auto-segmented GTV accuracy ranged from (81.51  ±  1.92) to (97.27  ±  0.28)% volumetric overlap of the estimated ground truth. IGTV auto-segmentation showed significantly improved accuracies with reduced variance for all patients ranging from 90.87 to 98.57% volumetric overlap of the ground truth volume. Additional metrics supported these observations with statistical significance. Accuracy of auto-segmentation was shown to be largely independent of selection of the initial propagation phase. IGTV construction based on auto-segmented GTVs within the 4D-CT dataset provided accurate and reliable target volumes compared to manual segmentation-based GT estimates. While inter-/intra-observer effects were largely mitigated, the proposed segmentation workflow is more complex than that of current clinical practice and requires further development.

  13. Estimation of Right-Lobe Graft Weight From Computed Tomographic Volumetry for Living Donor Liver Transplantation.

    PubMed

    Yang, X; Chu, C W; Yang, J D; Yang, K H; Yu, H C; Cho, B H; You, H

    2017-03-01

    The objective of the study was to establish a right-lobe graft weight (GW) estimation formula for living donor liver transplantation (LDLT) from the right-lobe graft volume without veins (GV_w/o_veins), where the excluded veins include the portal vein and hepatic vein, as measured by computed tomographic (CT) volumetry, and to compare its estimation accuracy with those of existing formulas. Right-lobe GW estimation formulas established with the use of graft volume with veins (GV_w_veins) sacrifice accuracy because the GW measured intra-operatively excludes the weight of blood in the veins. Right-lobe GW estimation formulas have been established with the use of right-lobe GV_w/o_veins, but a more accurate formula must be developed. The present study developed right-lobe GW estimation formulas based on GV_w/o_veins as well as GV_w_veins, using 40 cases of Korean donors: GW = 29.1 + 0.943 × GV_w/o_veins (adjusted R² = 0.94) and GW = 74.7 + 0.773 × GV_w_veins (adjusted R² = 0.87). The proposed GW estimation formulas were compared with existing GV_w_veins- and GV_w/o_veins-based models, using 43 cases additionally obtained from two medical centers for cross-validation. The GV_w/o_veins-based formula developed in the present study was most preferred (absolute error = 21.5 ± 16.5 g and percentage of absolute error = 3.0 ± 2.3%). The GV_w/o_veins-based formula is preferred to the GV_w_veins-based formula in GW estimation. Accurate CT volumetry and alignment between planned and actual surgical cutting lines are crucial for establishing a better GW estimation formula. Copyright © 2016 Elsevier Inc. All rights reserved.
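
    The two regression formulas reported in the abstract translate directly into code. The sketch below implements them as stated, with GW in grams and graft volumes in mL from CT volumetry; the example input is illustrative, not a case from the study.

      def graft_weight_without_veins_g(gv_wo_veins_ml):
          """GW = 29.1 + 0.943 * GV_w/o_veins (adjusted R^2 = 0.94 in the study)."""
          return 29.1 + 0.943 * gv_wo_veins_ml

      def graft_weight_with_veins_g(gv_w_veins_ml):
          """GW = 74.7 + 0.773 * GV_w_veins (adjusted R^2 = 0.87 in the study)."""
          return 74.7 + 0.773 * gv_w_veins_ml

      # Illustrative CT volumetry result for a right-lobe graft
      print(round(graft_weight_without_veins_g(700.0), 1), "g")  # ~689.2 g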

  14. A simple method to estimate restoration volume as a possible predictor for tooth fracture.

    PubMed

    Sturdevant, J R; Bader, J D; Shugars, D A; Steet, T C

    2003-08-01

    Many dentists cite the fracture risk posed by a large existing restoration as a primary reason for their decision to place a full-coverage restoration. However, there is poor agreement among dentists as to when restoration placement is necessary because of the inability to make objective measurements of restoration size. The purpose of this study was to compare a new method to estimate restoration volumes in posterior teeth with analytically determined volumes. True restoration volume proportion (RVP) was determined for 96 melamine typodont teeth: 24 each of maxillary second premolar, mandibular second premolar, maxillary first molar, and mandibular first molar. Each group of 24 was subdivided into 3 groups to receive an O, MO, or MOD amalgam preparation design. Each preparation design was further subdivided into 4 groups of increasingly larger size. The density of amalgam used was calculated according to ANSI/ADA Specification 1. The teeth were weighed before and after restoration with amalgam. Restoration weight was calculated, and the density of amalgam was used to calculate restoration volume. A liquid pycnometer was used to calculate coronal volume after sectioning the anatomic crown from the root horizontally at the cementoenamel junction. True RVP was calculated by dividing restoration volume by coronal volume. An occlusal photograph and a bitewing radiograph were made of each restored tooth to provide 2 perpendicular views. Each image was digitized, and software was used to measure the percentage of the anatomic crown restored with amalgam. Estimated RVP was calculated by multiplying the percentage of the anatomic crown restored from the 2 views together. Pearson correlation coefficients were used to compare estimated RVP with true RVP. The Pearson correlation coefficient of true RVP with estimated RVP was 0.97 overall (P
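
    The estimated restoration volume proportion described above is simply the product of the restored crown fractions seen in the two perpendicular views, while the reference value is the measured restoration volume divided by the coronal volume. A minimal sketch of both calculations, with hypothetical inputs, is shown below.

      def estimated_rvp(fraction_occlusal, fraction_radiograph):
          """Estimated restoration volume proportion from two perpendicular views."""
          return fraction_occlusal * fraction_radiograph

      def true_rvp(restoration_volume_mm3, coronal_volume_mm3):
          """Reference RVP: restoration volume divided by coronal volume."""
          return restoration_volume_mm3 / coronal_volume_mm3

      # Hypothetical tooth: 45% of the crown restored in the occlusal view,
      # 38% in the bitewing view; 52 mm^3 of amalgam in a 320 mm^3 crown.
      print(round(estimated_rvp(0.45, 0.38), 3), round(true_rvp(52.0, 320.0), 3))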

  15. Voxel-Based 3-D Tree Modeling from Lidar Images for Extracting Tree Structual Information

    NASA Astrophysics Data System (ADS)

    Hosoi, F.

    2014-12-01

    Recently, lidar (light detection and ranging) has been used to extract tree structural information. Portable scanning lidar systems can capture the complex shape of individual trees as a 3-D point-cloud image. 3-D tree models reproduced from the lidar-derived 3-D image can be used to estimate tree structural parameters. We have proposed voxel-based 3-D modeling for extracting tree structural parameters. One of the tree parameters derived from the voxel modeling is leaf area density (LAD). We refer to the method as the voxel-based canopy profiling (VCP) method. In this method, several measurement points surrounding the canopy and optimally inclined laser beams are adopted for full laser beam illumination of the whole canopy, including its interior. From the obtained lidar image, the 3-D information is reproduced as voxel attributes in a 3-D voxel array. Based on the voxel attributes, the contact frequency of laser beams on leaves is computed and the LAD in each horizontal layer is obtained. This method offered accurate LAD estimation for individual trees and woody canopy trees. For more accurate LAD estimation, the voxel model was constructed by combining airborne and portable ground-based lidar data. The profiles obtained by the two types of lidar complemented each other, thus eliminating blind regions and yielding more accurate LAD profiles than could be obtained by using each type of lidar alone. Based on the estimation results, we proposed an index named the laser beam coverage index, Ω, which relates to the lidar's laser beam settings and a laser beam attenuation factor. This index can be used for adjusting the measurement set-up of lidar systems and for explaining the LAD estimation error of different types of lidar systems. Moreover, we proposed a method to estimate woody material volume as another application of voxel tree modeling. In this method, a voxel solid model of a target tree is produced from the lidar image, composed of consecutive voxels that fill the outer surface and the interior of the stem and large branches. From the model, the woody material volume of any part of the target tree can be calculated directly by counting the number of corresponding voxels and multiplying the result by the per-voxel volume.
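
    The woody volume computation at the end of this record is a straightforward voxel count. The sketch below shows that step for a boolean voxel solid; the array and voxel size are toy values, not lidar data.

      import numpy as np

      def woody_volume_m3(voxel_solid, voxel_edge_m):
          """Volume of a voxelized solid: number of filled voxels times the per-voxel volume."""
          return int(np.count_nonzero(voxel_solid)) * voxel_edge_m ** 3

      # Toy voxel solid: a 2 m tall, 0.2 m square "stem" on a 5 cm grid
      solid = np.zeros((40, 6, 6), dtype=bool)
      solid[:, 1:5, 1:5] = True
      print(round(woody_volume_m3(solid, 0.05), 3), "m^3")  # 640 voxels * 1.25e-4 m^3 = 0.08 m^3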

  16. TU-G-BRA-05: Predicting Volume Change of the Tumor and Critical Structures Throughout Radiation Therapy by CT-CBCT Registration with Local Intensity Correction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, S; Robinson, A; Kiess, A

    2015-06-15

    Purpose: The purpose of this study is to develop an accurate and effective technique to predict and monitor volume changes of the tumor and organs at risk (OARs) from daily cone-beam CTs (CBCTs). Methods: While CBCT is typically used to minimize the patient setup error, its poor image quality impedes accurate monitoring of daily anatomical changes in radiotherapy. Reconstruction artifacts in CBCT often cause undesirable errors in registration-based contour propagation from the planning CT, a conventional way to estimate anatomical changes. To improve the registration and segmentation accuracy, we developed a new deformable image registration (DIR) that iteratively corrects CBCT intensities using slice-based histogram matching during the registration process. Three popular DIR algorithms (hierarchical B-spline, demons, optical flow) augmented by the intensity correction were implemented on a graphics processing unit for efficient computation, and their performances were evaluated on six head and neck (HN) cancer cases. Four trained scientists manually contoured nodal gross tumor volume (GTV) on the planning CT and every other fraction CBCTs for each case, to which the propagated GTV contours by DIR were compared. The performance was also compared with commercial software, VelocityAI (Varian Medical Systems Inc.). Results: Manual contouring showed significant variations, [-76, +141]% from the mean of all four sets of contours. The volume differences (mean ± std in cc) between the average manual segmentation and the four automatic segmentations are 3.70 ± 2.30 (B-spline), 1.25 ± 1.78 (demons), 0.93 ± 1.14 (optical flow), and 4.39 ± 3.86 (VelocityAI). In comparison to the average volume of the manual segmentations, the proposed approach significantly reduced the estimation error by 9% (B-spline), 38% (demons), and 51% (optical flow) over the conventional mutual information based method (VelocityAI). Conclusion: The proposed CT-CBCT registration with local CBCT intensity correction can accurately predict the tumor volume change with reduced errors. Although demonstrated only on HN nodal GTVs, the results imply improved accuracy for other critical structures. This work was supported by NIH/NCI under grant R42CA137886.

  17. Towards an Optimized Method of Olive Tree Crown Volume Measurement

    PubMed Central

    Miranda-Fuentes, Antonio; Llorens, Jordi; Gamarra-Diezma, Juan L.; Gil-Ribes, Jesús A.; Gil, Emilio

    2015-01-01

    Accurate crown characterization of large isolated olive trees is vital for adjusting spray doses in three-dimensional crop agriculture. Among the many methodologies available, laser sensors have proved to be the most reliable and accurate. However, their operation is time consuming and requires specialist knowledge, so a simpler crown characterization method is required. To this end, three methods were evaluated and compared with LiDAR measurements to determine their accuracy: the Vertical Crown Projected Area method (VCPA), the Ellipsoid Volume method (VE) and the Tree Silhouette Volume method (VTS). Trials were performed in three different kinds of olive tree plantations: intensive, adapted one-trunked traditional and traditional. In total, 55 trees were characterized. Results show that all three methods are appropriate for estimating the crown volume, reaching high coefficients of determination: R² = 0.783, 0.843 and 0.824 for VCPA, VE and VTS, respectively. However, discrepancies arise when evaluating the tree plantations separately, especially for traditional trees. Here, correlations between the LiDAR volume and other parameters indicated that the Mean Vector calculated for the VCPA method had the highest correlation for traditional trees; thus, its use in traditional plantations is highly recommended. PMID:25658396
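
    Of the three methods compared, the Ellipsoid Volume method is the simplest to reproduce: treating the crown as an ellipsoid defined by its height and two horizontal diameters gives V = (π/6)·h·d1·d2. The sketch below assumes that standard ellipsoid formulation; the exact field measurements used by the authors may differ, and the example tree is invented.

      import math

      def ellipsoid_crown_volume_m3(crown_height_m, diameter_1_m, diameter_2_m):
          """Crown volume as an ellipsoid: (pi / 6) * h * d1 * d2."""
          return math.pi / 6.0 * crown_height_m * diameter_1_m * diameter_2_m

      # Illustrative isolated olive tree: 3.5 m crown height, 4.2 m x 3.8 m crown diameters
      print(round(ellipsoid_crown_volume_m3(3.5, 4.2, 3.8), 1), "m^3")  # ~29.2 m^3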

  18. A quantitative evaluation of pleural effusion on computed tomography scans using B-spline and local clustering level set.

    PubMed

    Song, Lei; Gao, Jungang; Wang, Sheng; Hu, Huasi; Guo, Youmin

    2017-01-01

    Estimation of the pleural effusion's volume is an important clinical issue. Existing methods cannot assess it accurately when there is a large volume of liquid in the pleural cavity and/or the patient has another disease (e.g. pneumonia). To help address this issue, the objective of this study is to develop and test a novel algorithm that uses B-spline and local clustering level set methods jointly, namely BLL. The BLL algorithm was applied to a dataset involving 27 pleural effusions detected on chest CT examinations of 18 adult patients with free pleural effusion. Study results showed that the average volumes of pleural effusion computed using the BLL algorithm and assessed manually by the physicians were 586 ± 339 ml and 604 ± 352 ml, respectively. For the same patient, the volume of the pleural effusion segmented semi-automatically was 101.8 ± 4.6% of that segmented manually. Dice similarity was found to be 0.917 ± 0.031. The study demonstrated the feasibility of applying the new BLL algorithm to accurately measure the volume of pleural effusion.
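
    The Dice similarity coefficient reported above measures the overlap between the semi-automatic and manual segmentations. A generic sketch of that metric on binary masks follows; the arrays are toy stand-ins for 3D effusion masks.

      import numpy as np

      def dice_coefficient(mask_a, mask_b):
          """Dice similarity: 2 * |A intersection B| / (|A| + |B|)."""
          a = mask_a.astype(bool)
          b = mask_b.astype(bool)
          intersection = np.logical_and(a, b).sum()
          return 2.0 * intersection / (a.sum() + b.sum())

      # Toy 1D "segmentations" standing in for the 3D effusion masks
      auto = np.array([0, 1, 1, 1, 1, 0, 0, 1])
      manual = np.array([0, 1, 1, 1, 0, 0, 1, 1])
      print(round(dice_coefficient(auto, manual), 3))  # 0.8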

  19. Accuracy in estimation of timber assortments and stem distribution - A comparison of airborne and terrestrial laser scanning techniques

    NASA Astrophysics Data System (ADS)

    Kankare, Ville; Vauhkonen, Jari; Tanhuanpää, Topi; Holopainen, Markus; Vastaranta, Mikko; Joensuu, Marianna; Krooks, Anssi; Hyyppä, Juha; Hyyppä, Hannu; Alho, Petteri; Viitala, Risto

    2014-11-01

    Detailed information about timber assortments and diameter distributions is required in forest management. Forest owners can make better decisions concerning the timing of timber sales and forest companies can utilize more detailed information to optimize their wood supply chain from forest to factory. The objective here was to compare the accuracies of high-density laser scanning techniques for the estimation of tree-level diameter distribution and timber assortments. We also introduce a method that utilizes a combination of airborne and terrestrial laser scanning in timber assortment estimation. The study was conducted in Evo, Finland. Harvester measurements were used as a reference for 144 trees within a single clear-cut stand. The results showed that accurate tree-level timber assortments and diameter distributions can be obtained, using terrestrial laser scanning (TLS) or a combination of TLS and airborne laser scanning (ALS). Saw log volumes were estimated with higher accuracy than pulpwood volumes. The saw log volumes were estimated with relative root-mean-squared errors of 17.5% and 16.8% with TLS and a combination of TLS and ALS, respectively. The respective accuracies for pulpwood were 60.1% and 59.3%. The differences in the bucking method used also caused some large errors. In addition, tree quality factors highly affected the bucking accuracy, especially with pulpwood volume.

  20. Calibration Experiments for a Computer Vision Oyster Volume Estimation System

    ERIC Educational Resources Information Center

    Chang, G. Andy; Kerns, G. Jay; Lee, D. J.; Stanek, Gary L.

    2009-01-01

    Calibration is a technique that is commonly used in science and engineering research that requires calibrating measurement tools for obtaining more accurate measurements. It is an important technique in various industries. In many situations, calibration is an application of linear regression, and is a good topic to be included when explaining and…

  1. 3D Volumetry and its Correlation Between Postoperative Gastric Volume and Excess Weight Loss After Sleeve Gastrectomy.

    PubMed

    Hanssen, Andrés; Plotnikov, Sergio; Acosta, Geylor; Nuñez, José Tomas; Haddad, José; Rodriguez, Carmen; Petrucci, Claudia; Hanssen, Diego; Hanssen, Rafael

    2018-03-01

    The volume of the postoperative gastric remnant is a key factor in excess weight loss (EWL) after sleeve gastrectomy (SG). Traditional methods to estimate gastric volume (GV) after bariatric procedures are often inaccurate; usually, conventional biplanar contrast studies are used. Thirty patients who underwent SG were followed prospectively and evaluated 6 months after the surgical procedure by 3D CT reconstruction and gastric volumetry, to establish its relationship with EWL. The gastric remnant was distended with effervescent sodium bicarbonate given orally. Helical CT images were acquired and reconstructed; GV was estimated with the software of the CT device. The relationship between GV and EWL was analyzed. The study allowed estimation of the GV in all patients. A scatter diagram showed an inverse relationship between GV and %EWL: 55.5% of patients with GV ≤ 100 ml had a %EWL of 25-75% and 38.8% had a %EWL above 75%, whereas patients with GV ≥ 100 ml had a %EWL under 25% (50% of patients) or between 25 and 75% (the other 50%). The Pearson's correlation coefficient was R = 6.62, with bilateral significance (p ≤ .01). The Chi-square test correlating GV and EWL showed a significance of .005 (p ≤ .01). The 3D reconstructions accurately showed the shape and anatomic details of the gastric remnant. 3D volumetric CT scans accurately estimate GV after SG. A significant relationship between GV and EWL 6 months after SG was established, suggesting that a GV ≥ 100 ml at 6 months after SG is associated with poor EWL.

  2. Optical Enhancement of Exoskeleton-Based Estimation of Glenohumeral Angles

    PubMed Central

    Cortés, Camilo; Unzueta, Luis; de los Reyes-Guzmán, Ana; Ruiz, Oscar E.; Flórez, Julián

    2016-01-01

    In Robot-Assisted Rehabilitation (RAR) the accurate estimation of the patient limb joint angles is critical for assessing therapy efficacy. In RAR, the use of classic motion capture systems (MOCAPs) (e.g., optical and electromagnetic) to estimate the Glenohumeral (GH) joint angles is hindered by the exoskeleton body, which causes occlusions and magnetic disturbances. Moreover, the exoskeleton posture does not accurately reflect limb posture, as their kinematic models differ. To address the said limitations in posture estimation, we propose installing the cameras of an optical marker-based MOCAP in the rehabilitation exoskeleton. Then, the GH joint angles are estimated by combining the estimated marker poses and the exoskeleton Forward Kinematics. Such a hybrid system prevents problems related to marker occlusions, reduced camera detection volume, and imprecise joint angle estimation due to the kinematic mismatch of the patient and exoskeleton models. This paper presents the formulation, simulation, and accuracy quantification of the proposed method with simulated human movements. In addition, a sensitivity analysis of the method's accuracy to marker position estimation errors, due to system calibration errors and marker drifts, has been carried out. The results show that, even with significant errors in the marker position estimation, the method's accuracy is adequate for RAR. PMID:27403044

  3. Does a pneumotach accurately characterize voice function?

    NASA Astrophysics Data System (ADS)

    Walters, Gage; Krane, Michael

    2016-11-01

    A study is presented which addresses how a pneumotach might adversely affect clinical measurements of voice function. A pneumotach is a device, typically a mask worn over the mouth, used to measure time-varying glottal volume flow. By measuring the time-varying difference in pressure across a known aerodynamic resistance element in the mask, the glottal volume flow waveform is estimated. Because it adds aerodynamic resistance to the vocal system, there is some concern that using a pneumotach may not accurately portray the behavior of the voice. To test this hypothesis, experiments were performed in a simplified airway model with the principal dimensions of an adult human upper airway. A compliant constriction, fabricated from silicone rubber, modeled the vocal folds. Variations of transglottal pressure, time-averaged volume flow, model vocal fold vibration amplitude, and radiated sound with subglottal pressure were measured, with and without the pneumotach in place, and differences noted. The authors acknowledge support of NIH Grant 2R01DC005642-10A1.

  4. Volume of reaction by the Archibald ultracentrifuge method (lobster hemocyanin).

    PubMed

    Saxena, V P; Kegeles, G; Kikas, R

    1976-07-01

    Samples of lobster hemocyanin (Homarus americanus), under conditions of reversible reaction between whole (25 S) and half (17 S) molecules, have been subjected to accurately known nitrogen pressures in analytical ultracentrifuge cells. A modified pressurization chamber of the type developed by Schumaker and colleagues was constructed for this purpose. The molecular weight was then determined at the top (liquid-gas) meniscus by means of the Archibald method. The logarithmic dependence upon pressure of the derived equilibrium constant then gave directly the volume of reaction. Experiments were performed in veronal-citrate buffers at pH 8, where the molar volume of formation of whole (dodecameric) molecules from half molecules appears to be negative, and at pH 8.46 in veronal-citrate buffer in the presence of 0.003 molar free calcium ion, where the molar volume of formation was estimated to be +390 cm³/mole. In glycine-sodium hydroxide buffer at pH 9.6 containing 0.0047 molar free calcium, the molar volume of formation of whole molecules was estimated to be +120 ± 70 cm³, corresponding to an estimated difference in partial specific volume between whole molecules and half molecules of only 1.3 × 10⁻⁴ cm³/gram. The correctness of the sign of this value in glycine buffer has been verified by pressure-jump light-scattering experiments.
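
    The reaction volume in this approach comes from the thermodynamic relation ΔV = -RT · d(ln K)/dP, i.e., the slope of ln K versus pressure. The sketch below fits that slope by least squares to hypothetical (pressure, K) pairs; the numbers are invented for illustration and are not the paper's data.

      import numpy as np

      R_CM3_BAR = 83.14  # gas constant in cm^3*bar/(mol*K)

      def reaction_volume_cm3_per_mol(pressures_bar, equilibrium_constants, temperature_k=293.15):
          """Delta V = -RT * d(ln K)/dP from a linear fit of ln K against pressure."""
          slope, _ = np.polyfit(np.asarray(pressures_bar), np.log(equilibrium_constants), 1)
          return -R_CM3_BAR * temperature_k * slope

      # Hypothetical data: K decreases slightly as the nitrogen pressure is raised,
      # which corresponds to a positive volume of formation.
      pressures = [1.0, 50.0, 100.0, 150.0, 200.0]
      k_values = [1.00e6, 0.93e6, 0.86e6, 0.80e6, 0.74e6]
      print(round(reaction_volume_cm3_per_mol(pressures, k_values), 0), "cm^3/mol")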

  5. A New Approach to Estimate Forest Parameters Using Dual-Baseline Pol-InSAR Data

    NASA Astrophysics Data System (ADS)

    Bai, L.; Hong, W.; Cao, F.; Zhou, Y.

    2009-04-01

    In POL-InSAR applications using the ESPRIT technique, it is assumed that there exist stable scattering centers in the forest. However, forest observations suffer severely from volume and temporal decorrelation, the forest scatterers are not as stable as assumed, and the obtained interferometric information is not as accurate as expected. Besides, ESPRIT techniques cannot identify the interferometric phases corresponding to the ground and the canopy, and they provide multiple estimates of the height between two scattering centers due to phase unwrapping. Therefore, estimation errors are introduced into the forest height results. To suppress these two types of errors, we use dual-baseline POL-InSAR data to estimate forest height. Dual-baseline coherence optimization is applied to obtain interferometric information of stable scattering centers in the forest. From the interferometric phases for different baselines, estimation errors caused by phase unwrapping are resolved, and other estimation errors can be suppressed as well. Experiments are conducted on ESAR L-band POL-InSAR data. Experimental results show that the proposed method provides more accurate forest height than the ESPRIT technique.

  6. Comparison of left and right atrial volume by echocardiography versus cardiac magnetic resonance imaging using the area-length method.

    PubMed

    Whitlock, Matthew; Garg, Anuj; Gelow, Jill; Jacobson, Timothy; Broberg, Craig

    2010-11-01

    Increased atrial volumes predict adverse cardiovascular events. Accordingly, accurate measurement of atrial size has become increasingly important in clinical practice. The area-length method is commonly used to estimate the volume. Disagreements between atrial volumes using echocardiography and other imaging modalities have been found. It is unclear whether this has resulted from differences in the measurement method or discrepancies among imaging modalities. We compared the right atrial (RA) and left atrial (LA) volume estimates using the area-length method for transthoracic echocardiography and cardiovascular magnetic resonance (CMR) imaging. Patients undergoing echocardiography and CMR imaging within 1 month were identified retrospectively. For both modalities, the RA and LA long-axis dimension and area were measured using standard 2- and 4-chamber views, and the volume was calculated using the area-length method for both atria. The echocardiographic and CMR values were compared using the Bland-Altman method. A total of 85 patients and 18 controls were included in the present study. The atrial volumes estimated using the area-length method were significantly smaller when measured using echocardiography than when measured using CMR imaging (LA volume 35 ± 20 vs 49 ± 30 ml/m², p <0.001, and RA volume 32 ± 23 vs 43 ± 29 ml/m², p = 0.012). The mean difference (CMR imaging minus echocardiography) was 14 ± 14 ml/m² for the LA and 10 ± 16 ml/m² for the RA volume. Similar results were found in the healthy controls. No significant intra- or interobserver variability was found within each modality. In conclusion, echocardiography consistently underestimated the atrial volumes compared to CMR imaging using the area-length method. Copyright © 2010 Elsevier Inc. All rights reserved.
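
    For reference, the biplane area-length estimate is commonly written as V = 8/(3π) · A1 · A2 / L, with the two orthogonal chamber areas A1, A2 and the long-axis length L. A hedged sketch with illustrative numbers (not values from this study):

```python
import math

def area_length_volume(a_2ch_cm2, a_4ch_cm2, length_cm):
    """Biplane area-length volume: V = 8/(3*pi) * A_2ch * A_4ch / L (result in ml)."""
    return (8.0 / (3.0 * math.pi)) * a_2ch_cm2 * a_4ch_cm2 / length_cm

vol_ml = area_length_volume(20.0, 22.0, 5.5)   # illustrative areas (cm^2) and length (cm)
bsa_m2 = 1.9                                   # hypothetical body surface area
print(f"LA volume ~ {vol_ml:.1f} ml, indexed ~ {vol_ml / bsa_m2:.1f} ml/m^2")
```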

  7. Marginal space learning for efficient detection of 2D/3D anatomical structures in medical images.

    PubMed

    Zheng, Yefeng; Georgescu, Bogdan; Comaniciu, Dorin

    2009-01-01

    Recently, marginal space learning (MSL) was proposed as a generic approach for automatic detection of 3D anatomical structures in many medical imaging modalities [1]. To accurately localize a 3D object, we need to estimate nine pose parameters (three for position, three for orientation, and three for anisotropic scaling). Instead of exhaustively searching the original nine-dimensional pose parameter space, only low-dimensional marginal spaces are searched in MSL to improve the detection speed. In this paper, we apply MSL to 2D object detection and perform a thorough comparison between MSL and the alternative full space learning (FSL) approach. Experiments on left ventricle detection in 2D MRI images show MSL outperforms FSL in both speed and accuracy. In addition, we propose two novel techniques, constrained MSL and nonrigid MSL, to further improve the efficiency and accuracy. In many real applications, a strong correlation may exist among pose parameters in the same marginal spaces. For example, a large object may have large scaling values along all directions. Constrained MSL exploits this correlation for further speed-up. The original MSL only estimates the rigid transformation of an object in the image, therefore cannot accurately localize a nonrigid object under a large deformation. The proposed nonrigid MSL directly estimates the nonrigid deformation parameters to improve the localization accuracy. The comparison experiments on liver detection in 226 abdominal CT volumes demonstrate the effectiveness of the proposed methods. Our system takes less than a second to accurately detect the liver in a volume.

  8. A Simple yet Accurate Method for the Estimation of the Biovolume of Planktonic Microorganisms.

    PubMed

    Saccà, Alessandro

    2016-01-01

    Determining the biomass of microbial plankton is central to the study of fluxes of energy and materials in aquatic ecosystems. This is typically accomplished by applying proper volume-to-carbon conversion factors to group-specific abundances and biovolumes. A critical step in this approach is the accurate estimation of biovolume from two-dimensional (2D) data such as those available through conventional microscopy techniques or flow-through imaging systems. This paper describes a simple yet accurate method for the assessment of the biovolume of planktonic microorganisms, which works with any image analysis system allowing for the measurement of linear distances and the estimation of the cross sectional area of an object from a 2D digital image. The proposed method is based on Archimedes' principle about the relationship between the volume of a sphere and that of a cylinder in which the sphere is inscribed, plus a coefficient of 'unellipticity' introduced here. Validation and careful evaluation of the method are provided using a variety of approaches. The new method proved to be highly precise with all convex shapes characterised by approximate rotational symmetry, and combining it with an existing method specific for highly concave or branched shapes allows covering the great majority of cases with good reliability. Thanks to its accuracy, consistency, and low resources demand, the new method can conveniently be used in substitution of any extant method designed for convex shapes, and can readily be coupled with automated cell imaging technologies, including state-of-the-art flow-through imaging devices.
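
    A minimal sketch of the geometric idea described above, assuming a convex cell approximated as a spheroid so that its volume equals two thirds of the circumscribing cylinder, V ≈ (2/3)·A·w, where A is the 2D cross-sectional area and w the minor-axis width; the 'unellipticity' coefficient is left as a placeholder here, and this is an illustration rather than the published implementation:

```python
import math

def biovolume(area_um2, width_um, unellipticity=1.0):
    """Approximate cell volume as 2/3 of the circumscribing cylinder,
    optionally scaled by an 'unellipticity' correction (1.0 = perfect ellipse)."""
    return (2.0 / 3.0) * area_um2 * width_um * unellipticity

# Sanity check against a sphere of diameter 10 um: area = pi*5^2, width = 10
print(biovolume(math.pi * 25.0, 10.0))       # ~523.6 um^3
print((4.0 / 3.0) * math.pi * 5.0 ** 3)      # same value from 4/3*pi*r^3
```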

  9. A Simple yet Accurate Method for the Estimation of the Biovolume of Planktonic Microorganisms

    PubMed Central

    2016-01-01

    Determining the biomass of microbial plankton is central to the study of fluxes of energy and materials in aquatic ecosystems. This is typically accomplished by applying proper volume-to-carbon conversion factors to group-specific abundances and biovolumes. A critical step in this approach is the accurate estimation of biovolume from two-dimensional (2D) data such as those available through conventional microscopy techniques or flow-through imaging systems. This paper describes a simple yet accurate method for the assessment of the biovolume of planktonic microorganisms, which works with any image analysis system allowing for the measurement of linear distances and the estimation of the cross sectional area of an object from a 2D digital image. The proposed method is based on Archimedes’ principle about the relationship between the volume of a sphere and that of a cylinder in which the sphere is inscribed, plus a coefficient of ‘unellipticity’ introduced here. Validation and careful evaluation of the method are provided using a variety of approaches. The new method proved to be highly precise with all convex shapes characterised by approximate rotational symmetry, and combining it with an existing method specific for highly concave or branched shapes allows covering the great majority of cases with good reliability. Thanks to its accuracy, consistency, and low resources demand, the new method can conveniently be used in substitution of any extant method designed for convex shapes, and can readily be coupled with automated cell imaging technologies, including state-of-the-art flow-through imaging devices. PMID:27195667

  10. It’s what’s inside that counts: Egg contaminant concentrations are influenced by estimates of egg density, egg volume, and fresh egg mass

    USGS Publications Warehouse

    Herzog, Mark; Ackerman, Joshua T.; Eagles-Smith, Collin A.; Hartman, Christopher

    2016-01-01

    In egg contaminant studies, it is necessary to calculate egg contaminant concentrations on a fresh wet weight basis and this requires accurate estimates of egg density and egg volume. We show that the inclusion or exclusion of the eggshell can influence egg contaminant concentrations, and we provide estimates of egg density (both with and without the eggshell) and egg-shape coefficients (used to estimate egg volume from egg morphometrics) for American avocet (Recurvirostra americana), black-necked stilt (Himantopus mexicanus), and Forster’s tern (Sterna forsteri). Egg densities (g/cm³) estimated for whole eggs (1.056 ± 0.003) were higher than egg densities estimated for egg contents (1.024 ± 0.001), and were 1.059 ± 0.001 and 1.025 ± 0.001 for avocets, 1.056 ± 0.001 and 1.023 ± 0.001 for stilts, and 1.053 ± 0.002 and 1.025 ± 0.002 for terns. The egg-shape coefficients for egg volume (Kv) and egg mass (Kw) also differed depending on whether the eggshell was included (Kv = 0.491 ± 0.001; Kw = 0.518 ± 0.001) or excluded (Kv = 0.493 ± 0.001; Kw = 0.505 ± 0.001), and varied among species. Although egg contaminant concentrations are rarely meant to include the eggshell, we show that the typical inclusion of the eggshell in egg density and egg volume estimates results in egg contaminant concentrations being underestimated by 6–13%. Our results demonstrate that the inclusion of the eggshell significantly influences estimates of egg density, egg volume, and fresh egg mass, which leads to egg contaminant concentrations that are biased low. We suggest that egg contaminant concentrations be calculated on a fresh wet weight basis using only internal egg-content densities, volumes, and masses appropriate for the species. For the three waterbirds in our study, these corrected coefficients are 1.024 ± 0.001 for egg density, 0.493 ± 0.001 for Kv, and 0.505 ± 0.001 for Kw.
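
    These coefficients plug into the standard egg-shape relations V = Kv·L·B² and fresh mass M = Kw·L·B² (L = length, B = breadth). A short sketch using the egg-content coefficients reported above and hypothetical egg dimensions:

```python
def egg_volume_cm3(length_cm, breadth_cm, kv=0.493):
    """Egg volume from the shape coefficient: V = Kv * L * B^2."""
    return kv * length_cm * breadth_cm ** 2

def fresh_egg_mass_g(length_cm, breadth_cm, kw=0.505):
    """Fresh egg mass from the shape coefficient: M = Kw * L * B^2."""
    return kw * length_cm * breadth_cm ** 2

L_cm, B_cm = 4.4, 3.1   # hypothetical egg dimensions, not measured values
print(egg_volume_cm3(L_cm, B_cm), fresh_egg_mass_g(L_cm, B_cm))
```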

  11. Multi-atlas segmentation enables robust multi-contrast MRI spleen segmentation for splenomegaly

    NASA Astrophysics Data System (ADS)

    Huo, Yuankai; Liu, Jiaqi; Xu, Zhoubing; Harrigan, Robert L.; Assad, Albert; Abramson, Richard G.; Landman, Bennett A.

    2017-02-01

    Non-invasive spleen volume estimation is essential in detecting splenomegaly. Magnetic resonance imaging (MRI) has been used to facilitate splenomegaly diagnosis in vivo. However, achieving accurate spleen volume estimation from MR images is challenging given the great inter-subject variance of human abdomens and the wide variety of clinical images/modalities. Multi-atlas segmentation has been shown to be a promising approach to handle heterogeneous data and difficult anatomical scenarios. In this paper, we propose to use multi-atlas segmentation frameworks for MRI spleen segmentation for splenomegaly. To the best of our knowledge, this is the first work that integrates multi-atlas segmentation for splenomegaly as seen on MRI. To address the particular concerns of spleen MRI, automated and novel semi-automated atlas selection approaches are introduced. The automated approach iteratively selects a subset of atlases using the selective and iterative method for performance level estimation (SIMPLE). To further control the outliers, semi-automated craniocaudal length based SIMPLE atlas selection (L-SIMPLE) is proposed to introduce a spatial prior to guide the iterative atlas selection. A dataset from a clinical trial containing 55 MRI volumes (28 T1 weighted and 27 T2 weighted) was used to evaluate different methods. Both automated and semi-automated methods achieved median DSC > 0.9. The outliers were alleviated by the L-SIMPLE (≈1 min of manual effort per scan), which achieved 0.9713 Pearson correlation compared with the manual segmentation. The results demonstrated that the multi-atlas segmentation is able to achieve accurate spleen segmentation from the multi-contrast splenomegaly MRI scans.

  12. Multi-atlas Segmentation Enables Robust Multi-contrast MRI Spleen Segmentation for Splenomegaly.

    PubMed

    Huo, Yuankai; Liu, Jiaqi; Xu, Zhoubing; Harrigan, Robert L; Assad, Albert; Abramson, Richard G; Landman, Bennett A

    2017-02-11

    Non-invasive spleen volume estimation is essential in detecting splenomegaly. Magnetic resonance imaging (MRI) has been used to facilitate splenomegaly diagnosis in vivo. However, achieving accurate spleen volume estimation from MR images is challenging given the great inter-subject variance of human abdomens and the wide variety of clinical images/modalities. Multi-atlas segmentation has been shown to be a promising approach to handle heterogeneous data and difficult anatomical scenarios. In this paper, we propose to use multi-atlas segmentation frameworks for MRI spleen segmentation for splenomegaly. To the best of our knowledge, this is the first work that integrates multi-atlas segmentation for splenomegaly as seen on MRI. To address the particular concerns of spleen MRI, automated and novel semi-automated atlas selection approaches are introduced. The automated approach iteratively selects a subset of atlases using the selective and iterative method for performance level estimation (SIMPLE). To further control the outliers, semi-automated craniocaudal length based SIMPLE atlas selection (L-SIMPLE) is proposed to introduce a spatial prior to guide the iterative atlas selection. A dataset from a clinical trial containing 55 MRI volumes (28 T1 weighted and 27 T2 weighted) was used to evaluate different methods. Both automated and semi-automated methods achieved median DSC > 0.9. The outliers were alleviated by the L-SIMPLE (≈1 min of manual effort per scan), which achieved 0.9713 Pearson correlation compared with the manual segmentation. The results demonstrated that the multi-atlas segmentation is able to achieve accurate spleen segmentation from the multi-contrast splenomegaly MRI scans.

  13. Log sampling methods and software for stand and landscape analyses.

    Treesearch

    Lisa J. Bate; Torolf R. Torgersen; Michael J. Wisdom; Edward O. Garton; Shawn C. Clabough

    2008-01-01

    We describe methods for efficient, accurate sampling of logs at landscape and stand scales to estimate density, total length, cover, volume, and weight. Our methods focus on optimizing the sampling effort by choosing an appropriate sampling method and transect length for specific forest conditions and objectives. Sampling methods include the line-intersect method and...

  14. Volume estimation using food specific shape templates in mobile image-based dietary assessment

    NASA Astrophysics Data System (ADS)

    Chae, Junghoon; Woo, Insoo; Kim, SungYe; Maciejewski, Ross; Zhu, Fengqing; Delp, Edward J.; Boushey, Carol J.; Ebert, David S.

    2011-03-01

    As obesity concerns mount, dietary assessment methods for prevention and intervention are being developed. These methods include recording, cataloging and analyzing daily dietary records to monitor energy and nutrient intakes. Given the ubiquity of mobile devices with built-in cameras, one possible means of improving dietary assessment is through photographing foods and inputting these images into a system that can determine the nutrient content of foods in the images. One of the critical issues in such an image-based dietary assessment tool is the accurate and consistent estimation of food portion sizes. The objective of our study is to automatically estimate food volumes through the use of food-specific shape templates. In our system, users capture food images using a mobile phone camera. Based on information (i.e., food name and code) determined through food segmentation and classification of the food images, our system chooses a particular food template shape corresponding to each segmented food. Finally, our system reconstructs the three-dimensional properties of the food shape from a single image by extracting feature points in order to size the food shape template. By employing this template-based approach, our system automatically estimates food portion size, providing a consistent method for estimating food volume.

  15. Computer simulation comparison of tripolar, bipolar, and spline Laplacian electrocardiogram estimators.

    PubMed

    Chen, T; Besio, W; Dai, W

    2009-01-01

    The performance of tripolar and bipolar concentric as well as spline Laplacian electrocardiograms (LECGs) and body surface Laplacian mappings (BSLMs) for localizing and imaging cardiac electrical activation has been compared based on computer simulation. In the simulation, a simplified eccentric heart-torso sphere-cylinder homogeneous volume conductor model was developed. Multiple dipoles with different orientations were used to simulate the underlying cardiac electrical activities. Results show that the tripolar concentric ring electrodes produce the most accurate LECG and BSLM estimation among the three estimators, with the best performance in spatial resolution.

  16. Inter-slice bidirectional registration-based segmentation of the prostate gland in MR and CT image sequences

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khalvati, Farzad, E-mail: farzad.khalvati@uwaterloo.ca; Tizhoosh, Hamid R.; Salmanpour, Aryan

    Purpose: Accurate segmentation and volume estimation of the prostate gland in magnetic resonance (MR) and computed tomography (CT) images are necessary steps in diagnosis, treatment, and monitoring of prostate cancer. This paper presents an algorithm for the prostate gland volume estimation based on the semiautomated segmentation of individual slices in T2-weighted MR and CT image sequences. Methods: The proposed Inter-Slice Bidirectional Registration-based Segmentation (iBRS) algorithm relies on interslice image registration of volume data to segment the prostate gland without the use of an anatomical atlas. It requires the user to mark only three slices in a given volume dataset, i.e., the first, middle, and last slices. Next, the proposed algorithm uses a registration algorithm to autosegment the remaining slices. We conducted comprehensive experiments to measure the performance of the proposed algorithm using three registration methods (i.e., rigid, affine, and nonrigid techniques). Results: The results with the proposed technique were compared with manual marking using prostate MR and CT images from 117 patients. Manual marking was performed by an expert user for all 117 patients. The median accuracies for individual slices measured using the Dice similarity coefficient (DSC) were 92% and 91% for MR and CT images, respectively. The iBRS algorithm was also evaluated regarding user variability, which confirmed that the algorithm was robust to interuser variability when marking the prostate gland. Conclusions: The proposed algorithm exploits the interslice data redundancy of the images in a volume dataset of MR and CT images and eliminates the need for an atlas, minimizing the computational cost while producing highly accurate results which are robust to interuser variability.

  17. Inter-slice bidirectional registration-based segmentation of the prostate gland in MR and CT image sequences

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khalvati, Farzad, E-mail: farzad.khalvati@uwaterloo.ca; Tizhoosh, Hamid R.; Salmanpour, Aryan

    2013-12-15

    Purpose: Accurate segmentation and volume estimation of the prostate gland in magnetic resonance (MR) and computed tomography (CT) images are necessary steps in diagnosis, treatment, and monitoring of prostate cancer. This paper presents an algorithm for the prostate gland volume estimation based on the semiautomated segmentation of individual slices in T2-weighted MR and CT image sequences. Methods: The proposed Inter-Slice Bidirectional Registration-based Segmentation (iBRS) algorithm relies on interslice image registration of volume data to segment the prostate gland without the use of an anatomical atlas. It requires the user to mark only three slices in a given volume dataset, i.e., the first, middle, and last slices. Next, the proposed algorithm uses a registration algorithm to autosegment the remaining slices. We conducted comprehensive experiments to measure the performance of the proposed algorithm using three registration methods (i.e., rigid, affine, and nonrigid techniques). Results: The results with the proposed technique were compared with manual marking using prostate MR and CT images from 117 patients. Manual marking was performed by an expert user for all 117 patients. The median accuracies for individual slices measured using the Dice similarity coefficient (DSC) were 92% and 91% for MR and CT images, respectively. The iBRS algorithm was also evaluated regarding user variability, which confirmed that the algorithm was robust to interuser variability when marking the prostate gland. Conclusions: The proposed algorithm exploits the interslice data redundancy of the images in a volume dataset of MR and CT images and eliminates the need for an atlas, minimizing the computational cost while producing highly accurate results which are robust to interuser variability.

  18. Detection and volume estimation of artificial hematomas in the subcutaneous fatty tissue: comparison of different MR sequences at 3.0 T.

    PubMed

    Ogris, Kathrin; Petrovic, Andreas; Scheicher, Sylvia; Sprenger, Hanna; Urschler, Martin; Hassler, Eva Maria; Yen, Kathrin; Scheurer, Eva

    2017-06-01

    In legal medicine, reliable localization and analysis of hematomas in subcutaneous fatty tissue is required for forensic reconstruction. Due to the absence of ionizing radiation, magnetic resonance imaging (MRI) is particularly suited to examining living persons with forensically relevant injuries. However, there is limited experience regarding MRI signal properties of hemorrhage in soft tissue. The aim of this study was to evaluate MR sequences with respect to their ability to show high contrast between hematomas and subcutaneous fatty tissue as well as to reliably determine the volume of artificial hematomas. Porcine tissue models were prepared by injecting blood into the subcutaneous fatty tissue to create artificial hematomas. MR images were acquired at 3T and four blinded observers conducted manual segmentation of the hematomas. To assess segmentability, the agreement of measured volume with the known volume of injected blood was statistically analyzed. A physically motivated normalization taking into account partial volume effect was applied to the data to ensure comparable results among differently sized hematomas. The inversion recovery sequence exhibited the best segmentability rate, whereas the T1T2w turbo spin echo sequence showed the most accurate results regarding volume estimation. Both sequences led to reproducible volume estimations. This study demonstrates that MRI is a promising forensic tool to assess and visualize even very small amounts of blood in soft tissue. The presented results enable the improvement of protocols for detection and volume determination of hemorrhage in forensically relevant cases and also provide fundamental knowledge for future in-vivo examinations.

  19. Space Station Facility government estimating

    NASA Technical Reports Server (NTRS)

    Brown, Joseph A.

    1993-01-01

    This new, unique Cost Engineering Report introduces the 800-page, C-100 government estimate for the Space Station Processing Facility (SSPF) and the Volume IV Aerospace Construction Price Book. At the January 23, 1991, bid opening for the SSPF, the government cost estimate was right on target: the low bid, from prime contractor Metric, Inc., was 1.2 percent below the government estimate. This project contains many different and complex systems. Volume IV is a summary of the cost associated with construction, activation and Ground Support Equipment (GSE) design, estimating, fabrication, installation, testing, termination, and verification of this project. Included are 13 reasons the government estimate was so accurate; an abstract of bids for 8 bidders and the government estimate with additive alternates, special labor and materials, budget comparisons and system summaries; and comments on the energy credit from the local electrical utility. This report adds another project to our continuing study of 'How Does the Low Bidder Get Low and Make Money?', which was started in 1967 and first published in the 1973 AACE Transactions with 18 ways the low bidders get low. The accuracy of this estimate proves the benefits of our Kennedy Space Center (KSC) teamwork efforts and KSC Cost Engineer Tools, which are contributing toward our goals for the Space Station.

  20. A multifractal approach to space-filling recovery for PET quantification.

    PubMed

    Willaime, Julien M Y; Aboagye, Eric O; Tsoumpas, Charalampos; Turkheimer, Federico E

    2014-11-01

    A new image-based methodology is developed for estimating the apparent space-filling properties of an object of interest in PET imaging without need for a robust segmentation step and used to recover accurate estimates of total lesion activity (TLA). A multifractal approach and the fractal dimension are proposed to recover the apparent space-filling index of a lesion (tumor volume, TV) embedded in nonzero background. A practical implementation is proposed, and the index is subsequently used with mean standardized uptake value (SUVmean) to correct TLA estimates obtained from approximate lesion contours. The methodology is illustrated on fractal and synthetic objects contaminated by partial volume effects (PVEs), validated on realistic ¹⁸F-fluorodeoxyglucose PET simulations and tested for its robustness using a clinical ¹⁸F-fluorothymidine PET test-retest dataset. TLA estimates were stable for a range of resolutions typical in PET oncology (4-6 mm). By contrast, the space-filling index and intensity estimates were resolution dependent. TLA was generally recovered within 15% of ground truth on postfiltered PET images affected by PVEs. Volumes were recovered within 15% variability in the repeatability study. Results indicated that TLA is a more robust index than other traditional metrics such as SUVmean or TV measurements across imaging protocols. The fractal procedure reported here is proposed as a simple and effective computational alternative to existing methodologies which require the incorporation of image preprocessing steps (i.e., partial volume correction and automatic segmentation) prior to quantification.
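
    As a rough illustration of how a space-filling index can be computed, here is a generic box-counting sketch on a synthetic binary mask; it is not the paper's multifractal implementation, only the basic fractal-dimension estimate such approaches build on:

```python
import numpy as np

def box_counting_dimension(mask, box_sizes=(2, 4, 8, 16, 32)):
    """Estimate the box-counting dimension of a binary 2D mask."""
    counts = []
    n = mask.shape[0]
    for s in box_sizes:
        m = n - (n % s)                                  # crop so boxes tile evenly
        blocks = mask[:m, :m].reshape(m // s, s, m // s, s)
        counts.append(blocks.any(axis=(1, 3)).sum())     # boxes containing the object
    # slope of log(count) versus log(1/size) approximates the dimension
    slope, _ = np.polyfit(np.log(1.0 / np.array(box_sizes)), np.log(counts), 1)
    return slope

# Synthetic filled disk: its box-counting dimension should be close to 2
yy, xx = np.mgrid[:256, :256]
disk = (xx - 128) ** 2 + (yy - 128) ** 2 < 80 ** 2
print(f"estimated dimension: {box_counting_dimension(disk):.2f}")
```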

  1. An improved bathymetric model for the modern and palaeo Lake Eyre

    NASA Astrophysics Data System (ADS)

    Leon, J. X.; Cohen, T. J.

    2012-11-01

    Here we demonstrate the applicability of using altimetry data and Landsat imagery to provide the most accurate digital elevation model (DEM) of Australia's largest playa lake — Lake Eyre. We demonstrate through the use of geospatial techniques a robust assessment of lake area and volume of recent lake-filling episodes whilst also providing the most accurate estimates of area and volume for larger lake filling episodes that occurred throughout the last glacial cycle. We highlight that at a depth of 25 m Lake Mega-Eyre would merge with the adjacent Lake Mega-Frome to form an immense waterbody with a combined area of almost 35,000 km2 and a combined volume of ~ 520 km3. This would represent a vast water body in what is now the arid interior of the Australian continent. The improved DEM is more reliable from a geomorphological and hydrological perspective and allows a more accurate assessment of water balance under the modern hydrological regime. The results presented using GLAS/ICESat data suggest that earlier historical soundings were correct and the actual lowest topographic point in Australia is - 15.6 m below sea level. The results also contrast nicely the different basin characteristics of two adjacent lake systems: Lake Eyre and Lake Frome.

  2. Forest inventory using multistage sampling with probability proportional to size. [Brazil

    NASA Technical Reports Server (NTRS)

    Parada, N. D. J. (Principal Investigator); Lee, D. C. L.; Hernandezfilho, P.; Shimabukuro, Y. E.; Deassis, O. R.; Demedeiros, J. S.

    1984-01-01

    A multistage sampling technique, with probability proportional to size, for forest volume inventory using remote sensing data is developed and evaluated. The study area is located in southeastern Brazil. The LANDSAT 4 digital data of the study area are used in the first stage for automatic classification of reforested areas. Four classes of pine and eucalypt with different tree volumes are classified utilizing a maximum likelihood classification algorithm. Color infrared aerial photographs are utilized in the second stage of sampling. In the third stage (ground level), the tree volume of each class is determined. The total tree volume of each class is expanded through a statistical procedure taking into account all three stages of sampling. This procedure results in an accurate tree volume estimate with a smaller number of aerial photographs and reduced time in field work.
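
    To illustrate the expansion principle, here is a single-stage probability-proportional-to-size (Hansen-Hurwitz) estimator sketch; the study's actual estimator spans three stages, and all numbers below are hypothetical:

```python
import numpy as np

def pps_total_estimate(sample_volumes, selection_probs):
    """Hansen-Hurwitz estimator of a population total under PPS sampling with
    replacement: Y_hat = (1/n) * sum(y_i / p_i), with p_i the per-draw
    selection probability of unit i."""
    y = np.asarray(sample_volumes, dtype=float)
    p = np.asarray(selection_probs, dtype=float)
    return np.mean(y / p)

# Hypothetical: 3 sampled stands, measured volumes (m^3) and their selection probabilities
print(pps_total_estimate([520.0, 310.0, 450.0], [0.012, 0.007, 0.010]))
```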

  3. Novel Approach to Estimate Kidney and Cyst Volumes using Mid-Slice Magnetic Resonance Images in Polycystic Kidney Disease

    PubMed Central

    Bae, Kyongtae T; Tao, Cheng; Wang, Jinhong; Kaya, Diana; Wu, Zhiyuan; Bae, Junu T; Chapman, Arlene B; Torres, Vicente E; Grantham, Jared J; Mrug, Michal; Bennett, William M; Flessner, Michael F; Landsittel, Doug P

    2013-01-01

    Objective To evaluate whether kidney and cyst volumes can be accurately estimated based on limited area measurements from MR images of patients with autosomal dominant polycystic kidney disease (ADPKD). Materials and Methods MR coronal images of 178 ADPKD participants from the Consortium for Radiologic Imaging Studies of ADPKD (CRISP) were analyzed. For each MR image slice, we measured kidney and renal cyst areas using stereology and region-based thresholding methods, respectively. The kidney and cyst ‘observed’ volumes were calculated by summing up the area measurements of all the slices covering the kidney. To estimate the volume, we selected a coronal mid-slice in each kidney and multiplied its area by the total number of slices (‘PANK2’ for kidney and ‘PANC2’ for cyst). We then compared the kidney and cyst volumes predicted from PANK2 and PANC2, respectively, to the corresponding observed volumes, using a linear regression analysis. Results The kidney volume predicted from PANK2 correlated extremely well with the observed kidney volume: R2=0.994 for right and 0.991 for left kidney. The linear regression coefficient multiplier to PANK2 that best fit the kidney volume was 0.637 (95%CI: 0.629–0.644) for right and 0.624 (95%CI: 0.616–0.633) for left kidney. The correlation between the cyst volume predicted from PANC2 and the observed cyst volume was also very high: R2=0.984 for right and 0.967 for left kidney. The least squares linear regression coefficient for PANC2 was 0.637 (95%CI: 0.624–0.649) for right and 0.608 (95%CI: 0.591–0.625) for left kidney. Conclusion Kidney and cyst volumes can be closely approximated by multiplying the product of the mid-slice area measurement and the total number of slices in the coronal MR images of ADPKD kidneys by 0.61–0.64. This information will help save processing time needed to estimate total kidney and cyst volumes of ADPKD kidneys. PMID:24107679
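
    A minimal sketch of this mid-slice approximation (the 0.624 coefficient is the reported left-kidney value; the area, slice count, and slice spacing are illustrative, and including slice spacing to express the result in cm³ is an assumption of this sketch):

```python
def midslice_volume_cm3(mid_area_cm2, n_slices, slice_spacing_cm, coeff=0.624):
    """Mid-slice approximation: coefficient * mid-slice area * number of slices,
    scaled by the coronal slice spacing to yield cm^3. The reported coefficients
    fall in roughly 0.61-0.64 for kidneys and cysts."""
    return coeff * mid_area_cm2 * n_slices * slice_spacing_cm

# Hypothetical ADPKD kidney: 60 cm^2 mid-slice area, 30 coronal slices, 3 mm spacing
print(midslice_volume_cm3(60.0, 30, 0.3))   # ~337 cm^3
```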

  4. Probabilistic brain tissue segmentation in neonatal magnetic resonance imaging.

    PubMed

    Anbeek, Petronella; Vincken, Koen L; Groenendaal, Floris; Koeman, Annemieke; van Osch, Matthias J P; van der Grond, Jeroen

    2008-02-01

    A fully automated method has been developed for segmentation of four different structures in the neonatal brain: white matter (WM), central gray matter (CEGM), cortical gray matter (COGM), and cerebrospinal fluid (CSF). The segmentation algorithm is based on information from T2-weighted (T2-w) and inversion recovery (IR) scans. The method uses a K nearest neighbor (KNN) classification technique with features derived from spatial information and voxel intensities. Probabilistic segmentations of each tissue type were generated. By applying thresholds on these probability maps, binary segmentations were obtained. These final segmentations were evaluated by comparison with a gold standard. The sensitivity, specificity, and Dice similarity index (SI) were calculated for quantitative validation of the results. High sensitivity and specificity with respect to the gold standard were reached: sensitivity >0.82 and specificity >0.9 for all tissue types. Tissue volumes were calculated from the binary and probabilistic segmentations. The probabilistic segmentation volumes of all tissue types accurately estimated the gold standard volumes. The KNN approach offers valuable ways for neonatal brain segmentation. The probabilistic outcomes provide a useful tool for accurate volume measurements. The described method is based on routine diagnostic magnetic resonance imaging (MRI) and is suitable for large population studies.
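
    A minimal scikit-learn sketch of the KNN idea with intensity-plus-spatial features and probabilistic outputs; the synthetic features and labels below stand in for the study's T2-weighted and IR data:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

# Synthetic training set: features = [T2 intensity, IR intensity, x, y, z],
# labels = tissue class (0=WM, 1=CEGM, 2=COGM, 3=CSF). Purely illustrative.
n = 2000
X_train = rng.normal(size=(n, 5))
y_train = rng.integers(0, 4, size=n)

knn = KNeighborsClassifier(n_neighbors=15)
knn.fit(X_train, y_train)

# Probabilistic segmentation of new voxels, then a binary map by thresholding
X_new = rng.normal(size=(10, 5))
prob = knn.predict_proba(X_new)     # shape (n_voxels, n_classes)
csf_binary = prob[:, 3] > 0.5       # threshold the CSF probability map

# Volume from a probabilistic map: sum of probabilities times voxel volume
voxel_volume_mm3 = 1.0
print("CSF volume (mm^3):", prob[:, 3].sum() * voxel_volume_mm3)
```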

  5. Effects of Lugol's iodine solution and formalin on cell volume of three bloom-forming dinoflagellates

    NASA Astrophysics Data System (ADS)

    Yang, Yang; Sun, Xiaoxia; Zhao, Yongfang

    2017-07-01

    Fixatives are traditionally used in marine ecosystem research. The bias introduced by fixatives on the dimensions of plankton cells may lead to an overestimation or underestimation of the carbon biomass. To determine the impact of traditional fixatives on dinoflagellates during short- and long-term fixation, we analyzed the degree of change in three bloom-forming dinoflagellates (Prorocentrum micans, Scrippsiella trochoidea and Noctiluca scintillans) brought about by Lugol's iodine solution (hereafter Lugol's) and formalin. The fixation effects were species-specific. P. micans cell volume showed no significant change following long-term preservation, whereas S. trochoidea swelled by approximately 8.06% of the live cell volume in Lugol's and by 20.97% in formalin. N. scintillans shrank significantly in both fixatives. The volume change due to formalin in N. scintillans was not concentration-dependent, whereas the volume shrinkage of N. scintillans cells fixed with Lugol's at a concentration of 2% was nearly six-fold that in cells fixed with Lugol's at a concentration of 0.6%-0.8%. To better estimate the volume of N. scintillans fixed in formalin at a concentration of 5%, we suggest the following conversion relationship: volume of live cell = volume of intact fixed cell / 0.61. Apart from size change, the damage induced by fixatives on N. scintillans was obvious. Lugol's is not a suitable fixative for N. scintillans due to the high frequency of broken cells. Accurate carbon biomass estimates of N. scintillans should be based on live samples. These findings help to improve estimates of phytoplankton cell volume and carbon biomass in marine ecosystems.
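
    The suggested correction is a one-line conversion; a tiny sketch with a hypothetical fixed-cell volume:

```python
def live_volume(fixed_volume_um3, conversion=0.61):
    """Live-cell volume from an intact formalin-fixed N. scintillans cell,
    using the reported relation: live = fixed / 0.61 (5% formalin)."""
    return fixed_volume_um3 / conversion

print(live_volume(1.0e6))   # hypothetical 1e6 um^3 fixed cell -> ~1.64e6 um^3 live
```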

  6. Improving Spleen Volume Estimation via Computer Assisted Segmentation on Clinically Acquired CT Scans

    PubMed Central

    Xu, Zhoubing; Gertz, Adam L.; Burke, Ryan P.; Bansal, Neil; Kang, Hakmook; Landman, Bennett A.; Abramson, Richard G.

    2016-01-01

    OBJECTIVES Multi-atlas fusion is a promising approach for computer-assisted segmentation of anatomical structures. The purpose of this study was to evaluate the accuracy and time efficiency of multi-atlas segmentation for estimating spleen volumes on clinically-acquired CT scans. MATERIALS AND METHODS Under IRB approval, we obtained 294 deidentified (HIPAA-compliant) abdominal CT scans on 78 subjects from a recent clinical trial. We compared five pipelines for obtaining splenic volumes: Pipeline 1–manual segmentation of all scans, Pipeline 2–automated segmentation of all scans, Pipeline 3–automated segmentation of all scans with manual segmentation for outliers on a rudimentary visual quality check, Pipelines 4 and 5–volumes derived from a unidimensional measurement of craniocaudal spleen length and three-dimensional splenic index measurements, respectively. Using Pipeline 1 results as ground truth, the accuracy of Pipelines 2–5 (Dice similarity coefficient [DSC], Pearson correlation, R-squared, and percent and absolute deviation of volume from ground truth) were compared for point estimates of splenic volume and for change in splenic volume over time. Time cost was also compared for Pipelines 1–5. RESULTS Pipeline 3 was dominant in terms of both accuracy and time cost. With a Pearson correlation coefficient of 0.99, average absolute volume deviation 23.7 cm3, and 1 minute per scan, Pipeline 3 yielded the best results. The second-best approach was Pipeline 5, with a Pearson correlation coefficient 0.98, absolute deviation 46.92 cm3, and 1 minute 30 seconds per scan. Manual segmentation (Pipeline 1) required 11 minutes per scan. CONCLUSION A computer-automated segmentation approach with manual correction of outliers generated accurate splenic volumes with reasonable time efficiency. PMID:27519156

  7. Ejection fraction in myocardial perfusion imaging assessed with a dynamic phantom: comparison between IQ-SPECT and LEHR.

    PubMed

    Hippeläinen, Eero; Mäkelä, Teemu; Kaasalainen, Touko; Kaleva, Erna

    2017-12-01

    Developments in single photon emission computed tomography instrumentation and reconstruction methods present a potential for decreasing acquisition times. One such recent option for myocardial perfusion imaging (MPI) is IQ-SPECT. This study was motivated by the inconsistency in the reported ejection fraction (EF) and left ventricular (LV) volume results between IQ-SPECT and more conventional low-energy high-resolution (LEHR) collimation protocols. IQ-SPECT and LEHR quantitative results were compared while the equivalent number of iterations (EI) was varied. The end-diastolic (EDV) and end-systolic (ESV) volumes and the derived EF values were investigated. A dynamic heart phantom was used to produce repeatable ESVs, EDVs and EFs. Phantom performance was verified by comparing the set EF values to those measured from a gated multi-slice X-ray computed tomography (CT) scan (EFTrue). The phantom with an EF setting of 45, 55, 65 and 70% was imaged with both IQ-SPECT and LEHR protocols. The data were reconstructed with different EI, and two commonly used clinical myocardium delineation software packages were used to evaluate the LV volumes. The CT verification showed that the phantom EF settings were repeatable and accurate, with EFTrue being within 1% point of the manufacturer's nominal value. Depending on EI, both MPI protocols can be made to produce correct EF estimates, but the IQ-SPECT protocol produced on average 41% and 42% smaller EDV and ESV when compared to the phantom's volumes, while the LEHR protocol underestimated the volumes by 24% and 21%, respectively. The volume results were largely similar between the delineation methods used. The reconstruction parameters can greatly affect the volume estimates obtained from perfusion studies. IQ-SPECT produces systematically smaller LV volumes than the conventional LEHR MPI protocol. The volume estimates are also software dependent.
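
    A worked illustration of why EF can remain correct even when volumes are systematically underestimated: EF = 100·(EDV − ESV)/EDV is a ratio, so a proportional shrinkage of both volumes cancels out (numbers below are illustrative):

```python
def ejection_fraction(edv_ml, esv_ml):
    """EF (%) from end-diastolic and end-systolic volumes: 100*(EDV-ESV)/EDV."""
    return 100.0 * (edv_ml - esv_ml) / edv_ml

print(ejection_fraction(120.0, 54.0))              # 55.0 %
print(ejection_fraction(120.0 * 0.6, 54.0 * 0.6))  # still 55.0 % after a 40% shrink
```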

  8. Radiation dose estimation for marine mussels following exposure to tritium: Best practice for use of the ERICA tool in ecotoxicological studies.

    PubMed

    Dallas, Lorna J; Devos, Alexandre; Fievet, Bruno; Turner, Andrew; Lyons, Brett P; Jha, Awadhesh N

    2016-05-01

    Accurate dosimetry is critically important for ecotoxicological and radioecological studies on the potential effects of environmentally relevant radionuclides, such as tritium (³H). Previous studies have used basic dosimetric equations to estimate dose from ³H exposure in ecologically important organisms, such as marine mussels. This study compares four different methods of estimating dose to adult mussels exposed to 1 or 15 MBq L⁻¹ tritiated water (HTO) under laboratory conditions. These methods were (1) an equation converting seawater activity concentrations to dose rate with fixed parameters; (2) input into the ERICA tool of seawater activity concentrations only; (3) input into the ERICA tool of estimated whole organism activity concentrations (woTACs), comprising dry activity plus estimated tissue free water tritium (TFWT) activity (TFWT volume × seawater activity concentration); and (4) input into the ERICA tool of measured whole organism activity concentrations, comprising dry activity plus measured TFWT activity (TFWT volume × TFWT activity concentration). Methods 3 and 4 are recommended for future ecotoxicological experiments as they produce values for individual animals and are not reliant on transfer predictions (estimation of concentration ratio). Method 1 may be suitable if measured whole organism concentrations are not available, as it produced results between those of methods 3 and 4. As there are technical complications in accurately measuring TFWT, we recommend that future radiotoxicological studies on mussels or other aquatic invertebrates measure whole organism activity in non-dried tissues (i.e. incorporating TFWT and dry activity as one, rather than as separate fractions) and input these data into the ERICA tool. Copyright © 2016 Elsevier Ltd. All rights reserved.
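
    A hedged sketch of how the method 3 and 4 inputs could be assembled; all activities, volumes, and masses below are hypothetical, and normalizing by fresh mass for the ERICA tool input is an assumption of this illustration:

```python
def whole_organism_activity_conc(dry_activity_bq, tfwt_volume_l,
                                 water_activity_bq_per_l, fresh_mass_kg):
    """Whole-organism activity concentration (Bq/kg fresh mass, assumed unit):
    dry-phase activity plus the TFWT term (TFWT volume x activity concentration)."""
    total_bq = dry_activity_bq + tfwt_volume_l * water_activity_bq_per_l
    return total_bq / fresh_mass_kg

# Method 3 uses the seawater activity concentration for the TFWT term,
# method 4 uses the measured TFWT activity concentration; values are made up.
method3 = whole_organism_activity_conc(2.0e4, 0.008, 1.0e6, 0.012)
method4 = whole_organism_activity_conc(2.0e4, 0.008, 8.5e5, 0.012)
print(method3, method4)
```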

  9. Deorientation of PolSAR coherency matrix for volume scattering retrieval

    NASA Astrophysics Data System (ADS)

    Kumar, Shashi; Garg, R. D.; Kushwaha, S. P. S.

    2016-05-01

    Polarimetric SAR data has proven its potential to extract scattering information for different features appearing in a single resolution cell. Several decomposition modelling approaches have been developed to retrieve scattering information from PolSAR data. During scattering power decomposition based on physical scattering models, it becomes very difficult to distinguish volume scattering resulting from randomly oriented vegetation from the scattering of oblique structures, which are responsible for double-bounce and volume scattering, because both are decomposed into the same scattering mechanism. The polarization orientation angle (POA) of an electromagnetic wave is one of its most important characteristics and is changed by scattering from the geometrical structure of topographic slopes, oriented urban areas and randomly oriented features such as vegetation cover. The shift in POA affects the polarimetric radar signatures. So, for accurate estimation of the scattering nature of a feature, compensation of the polarization orientation shift becomes an essential procedure. The prime objective of this work was to investigate the effect of the shift in POA on scattering information retrieval and to explore the effect of deorientation on the regression between field-estimated aboveground biomass (AGB) and volume scattering. For this study, Dudhwa National Park, U.P., India was selected as the study area, and fully polarimetric ALOS PALSAR data were used to retrieve scattering information from the forest area of Dudhwa National Park. Field data for DBH and tree height were collected for AGB estimation using stratified random sampling. AGB was estimated for 170 plots at different locations in the forest area. The Yamaguchi four-component decomposition modelling approach was utilized to retrieve surface, double-bounce, helix and volume scattering information. The shift in polarization orientation angle was estimated, and deorientation of the coherency matrix to compensate for the POA shift was performed. The effect of deorientation on the RGB color composite for the forest area can be easily seen. Overestimation of volume scattering and underestimation of double-bounce scattering were recorded for PolSAR decomposition without deorientation, and an increase in double-bounce scattering and a decrease in volume scattering were noticed after deorientation. This study was mainly focused on volume scattering retrieval and its relation with field-estimated AGB. The change in volume scattering after POA compensation of the PolSAR data was recorded, and a comparison was performed on volume scattering values for all 170 forest plots for which field data were collected. A decrease in volume scattering after deorientation was noted for all the plots. Regression between PolSAR decomposition based volume scattering and AGB was performed. Before deorientation, the coefficient of determination (R²) between volume scattering and AGB was 0.225. After deorientation, the coefficient of determination improved to 0.613. This study recommends deorientation of PolSAR data for decomposition modelling to retrieve reliable volume scattering information from forest areas.
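
    A numerical sketch of the deorientation step, assuming the standard rotation of the coherency matrix about the line of sight by the angle θ = (1/4)·atan2(2·Re(T23), T22 − T33) that minimizes the T33 (volume-related) term; the random scattering vectors below stand in for real PolSAR pixels:

```python
import numpy as np

rng = np.random.default_rng(1)

# Build a sample 3x3 coherency matrix T = <k k^H> from random Pauli vectors
k = rng.normal(size=(500, 3)) + 1j * rng.normal(size=(500, 3))
T = np.einsum('ni,nj->ij', k, k.conj()) / k.shape[0]

# Polarization orientation angle that minimizes T33 after rotation
theta = 0.25 * np.arctan2(2.0 * np.real(T[1, 2]), np.real(T[1, 1] - T[2, 2]))

c, s = np.cos(2.0 * theta), np.sin(2.0 * theta)
R = np.array([[1, 0, 0],
              [0, c, s],
              [0, -s, c]], dtype=complex)

T_rot = R @ T @ R.conj().T   # deoriented coherency matrix

print("POA shift (deg):", np.degrees(theta))
print("T33 before/after:", np.real(T[2, 2]), np.real(T_rot[2, 2]))
# The T33 term should not increase after deorientation
assert np.real(T_rot[2, 2]) <= np.real(T[2, 2]) + 1e-12
```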

  10. Designing a sampling system for concurrently measuring outdoor recreation visitation and describing visitor characteristics

    Treesearch

    Donald B.K. English; Stanley J. Zarnoch; Susan M. Kocis

    2004-01-01

    Two primary information needs for managing recreation areas and the visitors to those areas are: (1) good estimates of visitation volume, and (2) accurate descriptions of visitor characteristics, such as length of stay, frequency of visit, and primary activity. For National Forests in the United States of America with large undeveloped areas, efficient sampling for the...

  11. Evaluation of total aboveground biomass and total merchantable biomass in Missouri

    Treesearch

    Michael E. Goerndt; David R. Larsen; Charles D. Keating

    2014-01-01

    In recent years, the state of Missouri has been converting to biomass weight rather than volume as the standard measurement of wood for buying and selling sawtimber. Therefore, there is a need to identify accurate and precise methods of estimating whole tree biomass and merchantable biomass of harvested trees as well as total standing biomass of live timber for...

  12. Evaluation of Traffic Density Parameters as an Indicator of Vehicle Emission-Related Near-Road Air Pollution: A Case Study with NEXUS Measurement Data on Black Carbon

    EPA Science Inventory

    An important factor in evaluating health risk of near-road air pollution is to accurately estimate the traffic-related vehicle emission of air pollutants. Inclusion of traffic parameters such as road length/area, distance to roads, and traffic volume/intensity into models such as...

  13. Comparative analysis of operational forecasts versus actual weather conditions in airline flight planning, volume 1

    NASA Technical Reports Server (NTRS)

    Keitz, J. F.

    1982-01-01

    The impact of more timely and accurate weather data on airline flight planning with the emphasis on fuel savings is studied. This volume of the report discusses the results of Task 1 of the four major tasks included in the study. Task 1 compares flight plans based on forecasts with plans based on the verifying analysis from 33 days during the summer and fall of 1979. The comparisons show that: (1) potential fuel savings conservatively estimated to be between 1.2 and 2.5 percent could result from using more timely and accurate weather data in flight planning and route selection; (2) the Suitland forecast generally underestimates wind speeds; and (3) the track selection methodology of many airlines operating on the North Atlantic may not be optimum resulting in their selecting other than the optimum North Atlantic Organized Track about 50 percent of the time.

  14. On the relation between testes size and sperm reserves in the one-humped camel (camelus dromedarius).

    PubMed

    Elwisby, A B; Omar, A M

    1975-01-01

    The size, weight and volume as well as the sperm content of the testes of 18 mature, 6-10-year-old camels with unknown breeding history were determined after slaughtering and then related to one another. It was found that the weight and the volume of the testes can be estimated fairly accurately by means of their length. The length of the testes can also be used to calculate the sperm reserves, although more accurate values can be obtained by taking the squares of the testis length, breadth, and thickness into account. The average values for gonadal and epididymal sperm reserves of paired testes and epididymides were 6.278 ± 1.226 × 10⁹ and 12.177 ± 2.316 × 10⁹, respectively. The sperm production per gram of testis tissue was 40.55 ± 15 × 10⁶.

  15. Estimating Equivalency of Explosives Through A Thermochemical Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maienschein, J L

    2002-07-08

    The Cheetah thermochemical computer code provides an accurate method for estimating the TNT equivalency of any explosive, evaluated either with respect to peak pressure or the quasi-static pressure at long time in a confined volume. Cheetah calculates the detonation energy and heat of combustion for virtually any explosive (pure or formulation). Comparing the detonation energy for an explosive with that of TNT allows estimation of the TNT equivalency with respect to peak pressure, while comparison of the heat of combustion allows estimation of TNT equivalency with respect to quasi-static pressure. We discuss the methodology, present results for many explosives, and show comparisons with equivalency data from other sources.
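
    The comparison reduces to an energy ratio; a tiny sketch with hypothetical energy values (not actual Cheetah outputs):

```python
def tnt_equivalency(energy_explosive_kj_per_g, energy_tnt_kj_per_g):
    """Mass-based TNT equivalency as the ratio of the relevant energies
    (detonation energy for peak pressure, heat of combustion for
    quasi-static pressure)."""
    return energy_explosive_kj_per_g / energy_tnt_kj_per_g

# Hypothetical energies (kJ/g): the ratio is the equivalent TNT mass per unit mass
print(tnt_equivalency(6.2, 4.6))   # ~1.35
```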

  16. Signal-to-noise ratio estimation in digital computer simulation of lowpass and bandpass systems with applications to analog and digital communications, volume 3

    NASA Technical Reports Server (NTRS)

    Tranter, W. H.; Turner, M. D.

    1977-01-01

    Techniques are developed to estimate power gain, delay, signal-to-noise ratio, and mean square error in digital computer simulations of lowpass and bandpass systems. The techniques are applied to analog and digital communications. The signal-to-noise ratio estimates are shown to be maximum likelihood estimates in additive white Gaussian noise. The methods are seen to be especially useful for digital communication systems where the mapping from the signal-to-noise ratio to the error probability can be obtained. Simulation results show the techniques developed to be accurate and quite versatile in evaluating the performance of many systems through digital computer simulation.
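
    A hedged sketch of this kind of estimator on synthetic data: delay from the cross-correlation peak, gain by least squares on the aligned signals, and SNR from the residual power (not the report's exact implementation):

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic system: delayed, scaled signal plus white Gaussian noise
n, true_delay, true_gain = 4096, 25, 0.8
x = rng.normal(size=n)
y = true_gain * np.roll(x, true_delay) + 0.3 * rng.normal(size=n)

# Delay estimate: lag of the circular cross-correlation peak (via FFT)
xcorr = np.fft.ifft(np.fft.fft(y) * np.conj(np.fft.fft(x))).real
delay_hat = int(np.argmax(xcorr))

# Gain by least squares on the aligned signals, SNR from the residual power
x_aligned = np.roll(x, delay_hat)
gain_hat = np.dot(y, x_aligned) / np.dot(x_aligned, x_aligned)
residual = y - gain_hat * x_aligned
snr_db = 10.0 * np.log10(np.sum((gain_hat * x_aligned) ** 2) / np.sum(residual ** 2))

print(delay_hat, round(gain_hat, 3), round(snr_db, 1))  # ~25, ~0.8, ~8.5 dB
```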

  17. Is STAPLE algorithm confident to assess segmentation methods in PET imaging?

    NASA Astrophysics Data System (ADS)

    Dewalle-Vignion, Anne-Sophie; Betrouni, Nacim; Baillet, Clio; Vermandel, Maximilien

    2015-12-01

    Accurate tumor segmentation in [18F]-fluorodeoxyglucose positron emission tomography is crucial for tumor response assessment and target volume definition in radiation therapy. Evaluation of segmentation methods from clinical data without ground truth is usually based on physicians’ manual delineations. In this context, the simultaneous truth and performance level estimation (STAPLE) algorithm could be useful to manage the multi-observer variability. In this paper, we evaluated how accurately this algorithm can estimate the ground truth in PET imaging. A complete evaluation study using different criteria was performed on simulated data. The STAPLE algorithm was applied to manual and automatic segmentation results. A specific configuration of the implementation provided by the Computational Radiology Laboratory was used. The consensus obtained by the STAPLE algorithm from manual delineations appeared to be more accurate than the manual delineations themselves (80% overlap). An improvement in accuracy was also observed when applying the STAPLE algorithm to automatic segmentation results. The STAPLE algorithm, with the configuration used in this paper, is more appropriate than manual delineations alone or automatic segmentation results alone to estimate the ground truth in PET imaging. Therefore, it might be preferred to assess the accuracy of tumor segmentation methods in PET imaging.
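
    For intuition, a compact EM sketch of binary STAPLE-style consensus estimation (simplified: a global prior, binary labels, no spatial regularization), run on synthetic rater masks rather than PET delineations:

```python
import numpy as np

def staple_binary(D, prior=0.5, n_iter=50):
    """D: (n_voxels, n_raters) binary decisions. Returns consensus probability W
    and per-rater (sensitivity p, specificity q) estimates via EM."""
    n_vox, n_rat = D.shape
    p = np.full(n_rat, 0.9)          # initial sensitivities
    q = np.full(n_rat, 0.9)          # initial specificities
    W = np.full(n_vox, prior)
    for _ in range(n_iter):
        # E-step: posterior probability that each voxel is truly foreground
        a = prior * np.prod(np.where(D == 1, p, 1.0 - p), axis=1)
        b = (1.0 - prior) * np.prod(np.where(D == 0, q, 1.0 - q), axis=1)
        W = a / (a + b + 1e-12)
        # M-step: update each rater's sensitivity and specificity
        p = (W[:, None] * D).sum(axis=0) / (W.sum() + 1e-12)
        q = ((1.0 - W)[:, None] * (1 - D)).sum(axis=0) / ((1.0 - W).sum() + 1e-12)
    return W, p, q

# Synthetic example: 3 raters with different error rates on a known truth
rng = np.random.default_rng(3)
truth = rng.integers(0, 2, size=5000)
flip = np.array([0.05, 0.10, 0.25])      # per-rater error probability
D = np.array([np.where(rng.random(truth.size) < e, 1 - truth, truth) for e in flip]).T
W, p, q = staple_binary(D, prior=truth.mean())
print("consensus accuracy:", ((W > 0.5) == truth).mean())
print("estimated sensitivities:", np.round(p, 2))
```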

  18. Is STAPLE algorithm confident to assess segmentation methods in PET imaging?

    PubMed

    Dewalle-Vignion, Anne-Sophie; Betrouni, Nacim; Baillet, Clio; Vermandel, Maximilien

    2015-12-21

    Accurate tumor segmentation in [18F]-fluorodeoxyglucose positron emission tomography is crucial for tumor response assessment and target volume definition in radiation therapy. Evaluation of segmentation methods from clinical data without ground truth is usually based on physicians' manual delineations. In this context, the simultaneous truth and performance level estimation (STAPLE) algorithm could be useful to manage the multi-observer variability. In this paper, we evaluated how accurately this algorithm can estimate the ground truth in PET imaging. A complete evaluation study using different criteria was performed on simulated data. The STAPLE algorithm was applied to manual and automatic segmentation results. A specific configuration of the implementation provided by the Computational Radiology Laboratory was used. The consensus obtained by the STAPLE algorithm from manual delineations appeared to be more accurate than the manual delineations themselves (80% overlap). An improvement in accuracy was also observed when applying the STAPLE algorithm to automatic segmentation results. The STAPLE algorithm, with the configuration used in this paper, is more appropriate than manual delineations alone or automatic segmentation results alone to estimate the ground truth in PET imaging. Therefore, it might be preferred to assess the accuracy of tumor segmentation methods in PET imaging.

  19. Snow cover volumes dynamic monitoring during melting season using high topographic accuracy approach for a Lebanese high plateau witness sinkhole

    NASA Astrophysics Data System (ADS)

    Abou Chakra, Charbel; Somma, Janine; Elali, Taha; Drapeau, Laurent

    2017-04-01

    Climate change and its negative impact on water resources are well described. For countries like Lebanon, undergoing a major population rise and already facing decreasing precipitation, effective water resources management is crucial. Continuous and systematic monitoring over long periods of time is therefore an important activity for investigating drought risk scenarios for the Lebanese territory. Snow cover on the Lebanese mountains is the most important water resources reserve. Consequently, systematic observation of snow cover dynamics plays a major role in supporting hydrologic research with accurate data on snow cover volumes over the melting season. For the last 20 years, few studies have been conducted on the Lebanese snow cover. They focused on estimating the snow cover surface using remote sensing and terrestrial measurements, without obtaining accurate maps for the sampled locations. Indeed, estimating both snow cover area and volume is difficult due to the very high variability of snow accumulation and the topographic heterogeneity of the Lebanese mountain chains' slopes. Therefore, measuring the snow cover relief in its three-dimensional aspect and computing its digital elevation model is essential to estimate snow cover volume. Despite the need to cover the whole Lebanese territory, we favored an experimental terrestrial topographic site approach because of the cost of high-resolution satellite imagery, its limited accessibility, and its acquisition restrictions. It is also very challenging to model snow cover at the national scale. We therefore selected a representative witness sinkhole located at Ouyoun el Siman to undertake systematic and continuous observations based on a topographic approach using a total station. After four years of continuous observations, we established the relation between snowmelt rate, date of total melting, and neighboring spring discharges. Consequently, we are able to forecast, early in the season, the dates of total snowmelt and spring low-water flows, which are essentially fed by snowmelt water. Simulations were run to predict the snow level between two sampled dates; they provided promising results for national-scale extrapolation.

  20. Flood type specific construction of synthetic design hydrographs

    NASA Astrophysics Data System (ADS)

    Brunner, Manuela I.; Viviroli, Daniel; Sikorska, Anna E.; Vannier, Olivier; Favre, Anne-Catherine; Seibert, Jan

    2017-02-01

    Accurate estimates of flood peaks, corresponding volumes, and hydrographs are required to design safe and cost-effective hydraulic structures. In this paper, we propose a statistical approach for the estimation of the design variables peak and volume by constructing synthetic design hydrographs for different flood types such as flash-floods, short-rain floods, long-rain floods, and rain-on-snow floods. Our approach relies on the fitting of probability density functions to observed flood hydrographs of a certain flood type and accounts for the dependence between peak discharge and flood volume. It makes use of the statistical information contained in the data and retains the process information of the flood type. The method was tested based on data from 39 mesoscale catchments in Switzerland and provides catchment specific and flood type specific synthetic design hydrographs for all of these catchments. We demonstrate that flood type specific synthetic design hydrographs are meaningful in flood-risk management when combined with knowledge on the seasonality and the frequency of different flood types.
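
    The construction step can be illustrated by fitting a probability density function to a normalized observed hydrograph and rescaling it by a design flood volume. The sketch below is a loose, single-event illustration with a gamma shape and synthetic data; it is not the authors' multivariate model, which additionally accounts for the dependence between peak discharge and flood volume.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.stats import gamma

    # Hypothetical observed hydrograph of one flood event (hourly discharge, m³/s)
    rng = np.random.default_rng(2)
    dt = 1.0                                            # hourly time step
    t = np.arange(0.0, 72.0, dt)                        # hours since event start
    q_obs = 300.0 * gamma.pdf(t, a=3.5, scale=6.0) + rng.normal(0.0, 0.2, t.size)

    # Normalize so the hydrograph integrates to one, then fit a gamma density as its shape
    volume_obs = np.clip(q_obs, 0.0, None).sum() * dt   # event volume in (m³/s)·h
    shape_obs = np.clip(q_obs, 0.0, None) / volume_obs

    def gamma_shape(t, a, scale):
        return gamma.pdf(t, a=a, scale=scale)

    (a_hat, scale_hat), _ = curve_fit(gamma_shape, t, shape_obs, p0=(3.0, 5.0))

    # Synthetic design hydrograph: the fitted shape rescaled by a design flood volume
    design_volume = 1.5 * volume_obs                    # e.g. the volume of the T-year flood
    q_design = design_volume * gamma.pdf(t, a=a_hat, scale=scale_hat)
    print(f"design peak ≈ {q_design.max():.1f} m³/s, "
          f"design volume ≈ {q_design.sum() * dt:.0f} (m³/s)·h")
    ```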

  1. Comparison of epicardial adipose tissue radiodensity threshold between contrast and non-contrast enhanced computed tomography scans: A cohort study of derivation and validation.

    PubMed

    Xu, Lingyu; Xu, Yuancheng; Coulden, Richard; Sonnex, Emer; Hrybouski, Stanislau; Paterson, Ian; Butler, Craig

    2018-05-11

    Epicardial adipose tissue (EAT) volume derived from contrast enhanced (CE) computed tomography (CT) scans is not well validated. We aim to establish a reliable threshold to accurately quantify EAT volume from CE datasets. We analyzed EAT volume on paired non-contrast (NC) and CE datasets from 25 patients to derive appropriate Hounsfield unit (HU) cutpoints that equalize the two EAT volume estimates. The gold standard threshold (-190 HU, -30 HU) was used to assess EAT volume on NC datasets. For CE datasets, EAT volumes were estimated using three previously reported thresholds: (-190 HU, -30 HU), (-190 HU, -15 HU), (-175 HU, -15 HU), and were analyzed with a semi-automated 3D Fat analysis software. Subsequently, we applied a correction to the (-190 HU, -30 HU) threshold based on the mean difference in radiodensity between NC and CE images (ΔEATrd = CE radiodensity - NC radiodensity). We then validated our findings on the EAT threshold in 21 additional patients with paired CT datasets. EAT volume from CE datasets using previously published thresholds consistently underestimated EAT volume relative to the NC dataset standard by 8.2%-19.1%. Using our corrected threshold (-190 HU, -3 HU) in CE datasets yielded statistically identical EAT volume to NC EAT volume in the validation cohort (186.1 ± 80.3 vs. 185.5 ± 80.1 cm³, Δ = 0.6 cm³, 0.3%, p = 0.374). Estimating EAT volume from contrast enhanced CT scans using a corrected threshold of (-190 HU, -3 HU) provided excellent agreement with EAT volume from non-contrast CT scans using a standard threshold of (-190 HU, -30 HU). Copyright © 2018. Published by Elsevier B.V.
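
    Computing an adipose tissue volume from a CT dataset with an HU window reduces to counting voxels inside the window and multiplying by the voxel volume; shifting the window's upper bound by the measured radiodensity difference gives the corrected contrast-enhanced estimate. The sketch below is schematic, with assumed array names, voxel spacing and shift, not the 3D Fat analysis software used in the study.

    ```python
    import numpy as np

    def fat_volume_cm3(hu, voxel_dims_mm, lo=-190.0, hi=-30.0):
        """Volume (cm^3) of voxels whose attenuation lies in [lo, hi] HU."""
        voxel_cm3 = np.prod(voxel_dims_mm) / 1000.0        # mm^3 -> cm^3
        return np.count_nonzero((hu >= lo) & (hu <= hi)) * voxel_cm3

    # Hypothetical epicardial region extracted from paired scans (values invented)
    rng = np.random.default_rng(3)
    nc_roi = rng.normal(-80, 40, size=(60, 60, 40))        # non-contrast HU values
    delta_rd = 27.0                                        # assumed CE - NC radiodensity shift
    ce_roi = nc_roi + delta_rd                             # contrast-enhanced HU values

    vol_nc = fat_volume_cm3(nc_roi, (0.7, 0.7, 2.5))                       # standard (-190, -30) window
    vol_ce = fat_volume_cm3(ce_roi, (0.7, 0.7, 2.5), hi=-30.0 + delta_rd)  # corrected upper bound
    print(f"NC volume {vol_nc:.1f} cm³ vs corrected CE volume {vol_ce:.1f} cm³")
    ```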

  2. A mathematical formula to estimate in vivo thyroid volume from two-dimensional ultrasonography.

    PubMed

    Trimboli, Pierpaolo; Ruggieri, Massimo; Fumarola, Angela; D'Alò, Michele; Straniero, Andrea; Maiuolo, Amelia; Ulisse, Salvatore; D'Armiento, Massimino

    2008-08-01

    The determination of thyroid volume (TV) is required for the management of thyroid diseases. Since two-dimensional ultrasonography (2D-US) has become the accepted method for the assessment of TV (2D-US-TV), we verified whether it accurately assesses postsurgically measured TV (PS-TV). In 92 patients who underwent total thyroidectomy by conventional cervicotomy, 2D-US-TV obtained with the ellipsoid volume formula was compared to PS-TV determined by Archimedes' principle. Mean 2D-US-TV (23.9 +/- 14.8 mL) was significantly lower than mean PS-TV (33.4 +/- 20.1 mL). Underestimation was observed in 77% of cases and was related to gland multinodularity and/or nodular involvement of the isthmus, while 2D-US-TV matched PS-TV in the remaining 21 cases (23%). A mathematical formula to estimate PS-TV from US-TV was derived using a linear model (Calculated-TV = [1.24 x 2D-US-TV] + 3.66). Calculated-TV (mean value 33.4 +/- 18.3 mL) significantly (p < 0.01) increased the number of cases matching PS-TV from 21 (23%) to 31 (34%). In addition, it significantly (p < 0.01) decreased the percentage of cases in which PS-TV was underestimated from 77% to 27%, and reduced the range of the disagreement from 245% to 92%. This study shows that 2D-US does not provide an accurate estimation of TV and suggests that it can be improved by a mathematical model different from the ellipsoid model. If confirmed in prospective studies, this may contribute to a more appropriate management of thyroid diseases.
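
    The two formulas in the abstract compose directly: an ellipsoid estimate from the three ultrasound diameters of each lobe, followed by the reported linear correction Calculated-TV = 1.24 × 2D-US-TV + 3.66. The π/6 ellipsoid coefficient and the example diameters below are assumptions; the abstract does not state which ellipsoid constant was used.

    ```python
    import math

    def us_ellipsoid_volume_ml(length_cm, width_cm, depth_cm):
        """Ellipsoid approximation of a lobe volume from 2D ultrasound (assumes pi/6)."""
        return math.pi / 6.0 * length_cm * width_cm * depth_cm

    def corrected_thyroid_volume_ml(us_tv_ml):
        """Linear correction reported in the abstract: Calculated-TV = 1.24 * 2D-US-TV + 3.66."""
        return 1.24 * us_tv_ml + 3.66

    # Hypothetical right and left lobe diameters (cm)
    us_tv = us_ellipsoid_volume_ml(5.1, 2.4, 2.0) + us_ellipsoid_volume_ml(4.8, 2.2, 1.9)
    print(f"2D-US-TV = {us_tv:.1f} mL, Calculated-TV = {corrected_thyroid_volume_ml(us_tv):.1f} mL")
    ```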

  3. Monitoring of tissue heating with medium intensity focused ultrasound via four dimensional optoacoustic tomography

    NASA Astrophysics Data System (ADS)

    Oyaga Landa, Francisco Javier; Ronda Penacoba, Silvia; Deán-Ben, Xosé Luís.; Montero de Espinosa, Francisco; Razansky, Daniel

    2018-02-01

    Medium intensity focused ultrasound (MIFU) holds promise in important clinical applications. Generally, the aim in MIFU is to stimulate physiological mechanisms that reinforce healing responses, avoiding reaching temperatures that can cause permanent tissue damage. The outcome of interventions is then strongly affected by the temperature distribution in the treated region, and accurate monitoring represents a significant clinical need. In this work, we showcase the capacities of 4D optoacoustic imaging to monitor tissue heating during MIFU. The proposed method allows localizing the ultrasound focus, estimating the peak temperature and measuring the size of the heat-affected volume. Calibration experiments in a tissue-mimicking phantom demonstrate that the optoacoustically-estimated temperature accurately matches thermocouple readings. The good performance of the suggested approach in real tissues is further showcased in experiments with bovine muscle samples.

  4. It's what's inside that counts: egg contaminant concentrations are influenced by estimates of egg density, egg volume, and fresh egg mass.

    PubMed

    Herzog, Mark P; Ackerman, Joshua T; Eagles-Smith, Collin A; Hartman, C Alex

    2016-05-01

    In egg contaminant studies, it is necessary to calculate egg contaminant concentrations on a fresh wet weight basis and this requires accurate estimates of egg density and egg volume. We show that the inclusion or exclusion of the eggshell can influence egg contaminant concentrations, and we provide estimates of egg density (both with and without the eggshell) and egg-shape coefficients (used to estimate egg volume from egg morphometrics) for American avocet (Recurvirostra americana), black-necked stilt (Himantopus mexicanus), and Forster's tern (Sterna forsteri). Egg densities (g/cm³) estimated for whole eggs (1.056 ± 0.003) were higher than egg densities estimated for egg contents (1.024 ± 0.001), and were 1.059 ± 0.001 and 1.025 ± 0.001 for avocets, 1.056 ± 0.001 and 1.023 ± 0.001 for stilts, and 1.053 ± 0.002 and 1.025 ± 0.002 for terns. The egg-shape coefficients for egg volume (Kv) and egg mass (Kw) also differed depending on whether the eggshell was included (Kv = 0.491 ± 0.001; Kw = 0.518 ± 0.001) or excluded (Kv = 0.493 ± 0.001; Kw = 0.505 ± 0.001), and varied among species. Although egg contaminant concentrations are rarely meant to include the eggshell, we show that the typical inclusion of the eggshell in egg density and egg volume estimates results in egg contaminant concentrations being underestimated by 6-13%. Our results demonstrate that the inclusion of the eggshell significantly influences estimates of egg density, egg volume, and fresh egg mass, which leads to egg contaminant concentrations that are biased low. We suggest that egg contaminant concentrations be calculated on a fresh wet weight basis using only internal egg-content densities, volumes, and masses appropriate for the species. For the three waterbirds in our study, these corrected coefficients are 1.024 ± 0.001 for egg density, 0.493 ± 0.001 for Kv, and 0.505 ± 0.001 for Kw.

  5. Two-Dimensional Echocardiography Estimates of Fetal Ventricular Mass throughout Gestation.

    PubMed

    Aye, Christina Y L; Lewandowski, Adam James; Ohuma, Eric O; Upton, Ross; Packham, Alice; Kenworthy, Yvonne; Roseman, Fenella; Norris, Tess; Molloholli, Malid; Wanyonyi, Sikolia; Papageorghiou, Aris T; Leeson, Paul

    2017-08-12

    Two-dimensional (2D) ultrasound quality has improved in recent years. Quantification of cardiac dimensions is important to screen and monitor certain fetal conditions. We assessed the feasibility and reproducibility of fetal ventricular measures using 2D echocardiography, reported normal ranges in our cohort, and compared estimates to other modalities. Mass and end-diastolic volume were estimated by manual contouring in the four-chamber view using TomTec Image Arena 4.6 in end diastole. Nomograms were created from smoothed centiles of measures, constructed using fractional polynomials after log transformation. The results were compared to those of previous studies using other modalities. A total of 294 scans from 146 fetuses from 15+0 to 41+6 weeks of gestation were included. Seven percent of scans were unanalysable and intraobserver variability was good (intraclass correlation coefficients for left and right ventricular mass 0.97 [0.87-0.99] and 0.99 [0.95-1.0], respectively). Mass and volume increased exponentially, showing good agreement with 3D mass estimates up to 28 weeks of gestation, after which our measurements were in better agreement with neonatal cardiac magnetic resonance imaging. There was good agreement with 4D volume estimates for the left ventricle. Current state-of-the-art 2D echocardiography platforms provide accurate, feasible, and reproducible fetal ventricular measures across gestation, and in certain circumstances may be the modality of choice. © 2017 S. Karger AG, Basel.

  6. Oyster's cells regulatory volume decrease: A new tool for evaluating the toxicity of low concentration hydrocarbons in marine waters.

    PubMed

    Ben Naceur, Chiraz; Maxime, Valérie; Ben Mansour, Hedi; Le Tilly, Véronique; Sire, Olivier

    2016-11-01

    Human activities require fossil fuels for transport and energy, a substantial part of which can accidentally or voluntarily (oil spillage) flow into the marine environment and cause adverse effects on human and ecosystem health. This experiment was designed to assess the suitability of an original cellular biomarker for early quantification of the biological risk associated with hydrocarbon pollutants in seawater. Oocytes and hepatopancreas cells isolated from oyster (Crassostrea gigas) were tested for their capacity to regulate their volume following a hypo-osmotic challenge. Cell volumes were estimated from cell images recorded at regular time intervals during a 90-min period. When exposed to diluted seawater (osmolalities from 895 to 712 mosm kg⁻¹), both cell types first swell and then undergo a shrinkage known as Regulatory Volume Decrease (RVD). This process is inversely proportional to the magnitude of the osmotic shock and is best fitted using a first-order exponential decay model. The Recovered Volume Factor (RVF) calculated from this model appears to be an accurate tool for comparing cell responses. As shown by an approximately 50% decrease in RVF, the RVD process was significantly inhibited in cells sampled from oysters previously exposed to a low concentration of diesel oil (8.4 mg L⁻¹ for 24 h). This toxic effect was interpreted as a decreased permeability of the cell membranes resulting from an alteration of their lipid structure by diesel oil compounds. In contrast, previous contact of oysters with diesel did not induce any rise in gill glutathione S-transferase specific activity. Therefore, this work demonstrates that the study of the RVD process of cells selected from sentinel animal species could be an alternative bioassay for the monitoring of hydrocarbons and, probably, of various chemicals in the environment liable to alter cellular regulation. In particular, given the high sensitivity of this biomarker compared with a proven one, it could become a relevant and accurate tool to estimate the biological hazards of micropollutants in water. Copyright © 2016 Elsevier Inc. All rights reserved.
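
    The kinetics described here reduce to fitting a first-order exponential decay to the relative cell volume after the swelling peak. The sketch below is a generic scipy curve fit on synthetic data; the exact definition of the Recovered Volume Factor is not given in the abstract, so the recovery ratio at the end is only one plausible reading, and all numbers are invented.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def rvd_model(t, v_inf, v_peak, k):
        """First-order exponential decay from the swelling peak toward a plateau."""
        return v_inf + (v_peak - v_inf) * np.exp(-k * t)

    # Hypothetical relative cell volumes (V/V0) recorded every 5 min after the osmotic shock
    t_min = np.arange(0.0, 95.0, 5.0)
    rng = np.random.default_rng(4)
    v_rel = rvd_model(t_min, v_inf=1.05, v_peak=1.40, k=0.05) + rng.normal(0, 0.01, t_min.size)

    (v_inf, v_peak, k), _ = curve_fit(rvd_model, t_min, v_rel, p0=(1.0, 1.3, 0.1))
    recovery_ratio = (v_peak - v_inf) / (v_peak - 1.0)   # fraction of the swelling that is recovered
    print(f"plateau={v_inf:.2f}, peak={v_peak:.2f}, k={k:.3f}/min, recovered fraction ≈ {recovery_ratio:.2f}")
    ```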

  7. MULTISCALE ADAPTIVE SMOOTHING MODELS FOR THE HEMODYNAMIC RESPONSE FUNCTION IN FMRI*

    PubMed Central

    Wang, Jiaping; Zhu, Hongtu; Fan, Jianqing; Giovanello, Kelly; Lin, Weili

    2012-01-01

    In event-related functional magnetic resonance imaging (fMRI) data analysis, there is extensive interest in accurately and robustly estimating the hemodynamic response function (HRF) and its associated statistics (e.g., the magnitude and duration of the activation). Most methods to date are developed in the time domain and have utilized almost exclusively the temporal information of fMRI data without accounting for the spatial information. The aim of this paper is to develop a multiscale adaptive smoothing model (MASM) in the frequency domain by integrating the spatial and temporal information to adaptively and accurately estimate HRFs pertaining to each stimulus sequence across all voxels in a three-dimensional (3D) volume. We use two sets of simulation studies and a real data set to examine the finite sample performance of MASM in estimating HRFs. Our real and simulated data analyses confirm that MASM outperforms several other state-of-the-art methods, such as the smooth finite impulse response (sFIR) model. PMID:24533041

  8. A multifractal approach to space-filling recovery for PET quantification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Willaime, Julien M. Y., E-mail: julien.willaime@siemens.com; Aboagye, Eric O.; Tsoumpas, Charalampos

    2014-11-01

    Purpose: A new image-based methodology is developed for estimating the apparent space-filling properties of an object of interest in PET imaging without need for a robust segmentation step and used to recover accurate estimates of total lesion activity (TLA). Methods: A multifractal approach and the fractal dimension are proposed to recover the apparent space-filling index of a lesion (tumor volume, TV) embedded in nonzero background. A practical implementation is proposed, and the index is subsequently used with mean standardized uptake value (SUVmean) to correct TLA estimates obtained from approximate lesion contours. The methodology is illustrated on fractal and synthetic objects contaminated by partial volume effects (PVEs), validated on realistic ¹⁸F-fluorodeoxyglucose PET simulations and tested for its robustness using a clinical ¹⁸F-fluorothymidine PET test–retest dataset. Results: TLA estimates were stable for a range of resolutions typical in PET oncology (4–6 mm). By contrast, the space-filling index and intensity estimates were resolution dependent. TLA was generally recovered within 15% of ground truth on postfiltered PET images affected by PVEs. Volumes were recovered within 15% variability in the repeatability study. Results indicated that TLA is a more robust index than other traditional metrics such as SUVmean or TV measurements across imaging protocols. Conclusions: The fractal procedure reported here is proposed as a simple and effective computational alternative to existing methodologies which require the incorporation of image preprocessing steps (i.e., partial volume correction and automatic segmentation) prior to quantification.

  9. Estimating and bidding for the Space Station Processing Facility

    NASA Technical Reports Server (NTRS)

    Brown, Joseph A.

    1993-01-01

    This new, unique Cost Engineering Report introduces the 800-page C-100 government estimate for the Space Station Processing Facility (SSPF) and the Volume IV Aerospace Construction Price Book. At the January 23, 1991, bid opening for the SSPF, the government cost estimate was right on target: the low bid, by prime contractor Metric, Inc., was 1.2 percent below the government estimate. This project contains many different and complex systems. Volume IV is a summary of the costs associated with construction, activation and Ground Support Equipment (GSE) design, estimating, fabrication, installation, testing, termination, and verification for this project. Included are 13 reasons the government estimate was so accurate; an abstract of bids for 8 bidders and the government estimate with additive alternates, special labor and materials, budget comparison and system summaries; and comments on the energy credit from the local electrical utility. This report adds another project to our continuing study of 'How Does the Low Bidder Get Low and Make Money?', which was started in 1967 and first published in the 1973 AACE Transaction with 10 more ways the low bidder got low. The accuracy of this estimate proves the benefits of our Kennedy Space Center (KSC) teamwork efforts and KSC Cost Engineer Tools, which are contributing toward our goals for the Space Station.

  10. Unobtrusive Estimation of Cardiac Contractility and Stroke Volume Changes Using Ballistocardiogram Measurements on a High Bandwidth Force Plate

    PubMed Central

    Ashouri, Hazar; Orlandic, Lara; Inan, Omer T.

    2016-01-01

    Unobtrusive and inexpensive technologies for monitoring the cardiovascular health of heart failure (HF) patients outside the clinic can potentially improve their continuity of care by enabling therapies to be adjusted dynamically based on the changing needs of the patients. Specifically, cardiac contractility and stroke volume (SV) are two key aspects of cardiovascular health that change significantly for HF patients as their condition worsens, yet these parameters are typically measured only in hospital/clinical settings, or with implantable sensors. In this work, we demonstrate accurate measurement of cardiac contractility (based on pre-ejection period, PEP, timings) and SV changes in subjects using ballistocardiogram (BCG) signals detected via a high bandwidth force plate. The measurement is unobtrusive, as it simply requires the subject to stand still on the force plate while holding electrodes in the hands for simultaneous electrocardiogram (ECG) detection. Specifically, we aimed to assess whether the high bandwidth force plate can provide accuracy beyond what is achieved using modified weighing scales we have developed in prior studies, based on timing intervals, as well as signal-to-noise ratio (SNR) estimates. Our results indicate that the force plate BCG measurement provides more accurate timing information and allows for better estimation of PEP than the scale BCG (r2 = 0.85 vs. r2 = 0.81) during resting conditions. This correlation is stronger during recovery after exercise due to more significant changes in PEP (r2 = 0.92). The improvement in accuracy can be attributed to the wider bandwidth of the force plate. ∆SV (i.e., changes in stroke volume) estimations from the force plate BCG resulted in an average error percentage of 5.3% with a standard deviation of ±4.2% across all subjects. Finally, SNR calculations showed slightly better SNR in the force plate measurements among all subjects but the small difference confirmed that SNR is limited by motion artifacts rather than instrumentation. PMID:27240380

  11. Improvements in lake water budget computations using Landsat data

    NASA Technical Reports Server (NTRS)

    Gervin, J. C.; Shih, S. F.

    1979-01-01

    A supervised multispectral classification was performed on Landsat data for Lake Okeechobee's extensive littoral zone to provide two types of information. First, the acreage of a given plant species as measured by satellite was combined with a more accurate transpiration rate to give a better estimate of evapotranspiration from the littoral zone. Second, the surface area covered by plant communities was used to develop a better estimate of the water surface as a function of lake stage. Based on this information, more detailed representations of evapotranspiration and total water surface (and hence total lake volume) were provided to the water balance budget model for lake volume predictions. The model results based on information derived from the satellite data demonstrated a 94 percent reduction in cumulative lake stage error and a 70 percent reduction in the maximum deviation of the lake stage.

  12. Multi-model ensemble estimation of volume transport through the straits of the East/Japan Sea

    NASA Astrophysics Data System (ADS)

    Han, Sooyeon; Hirose, Naoki; Usui, Norihisa; Miyazawa, Yasumasa

    2016-01-01

    The volume transports measured at the Korea/Tsushima, Tsugaru, and Soya/La Perouse Straits remain quantitatively inconsistent. However, data assimilation models at least provide a self-consistent budget despite subtle differences among the models. This study examined the seasonal variation of the volume transport using multiple linear regression and ridge regression multi-model ensemble (MME) methods to estimate transport at these straits more accurately from four different data assimilation models. The MME outperformed all of the single models by reducing uncertainties, especially the multicollinearity problem with the ridge regression. However, the regression constants turned out to be inconsistent with each other when the MME was applied separately for each strait. The MME for a connected system was thus performed to find common constants for these straits. The estimate from this MME was found to be similar to the MME result for sea level difference (SLD). The estimated mean transport (2.43 Sv) was smaller than the measurement data at the Korea/Tsushima Strait, but the calibrated transport for the Tsugaru Strait (1.63 Sv) was larger than the observed data. The MME results for transport and SLD also suggested that the standard deviation (STD) at the Korea/Tsushima Strait is larger than the STD of the observations, whereas the estimated results were almost identical to those observed for the Tsugaru and Soya/La Perouse Straits. The similarity between the MME results enhances the reliability of the present MME estimation.
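
    The ensemble step amounts to regressing observed transport on the outputs of several assimilation models, with ridge regression tempering the multicollinearity between highly correlated model series. The scikit-learn sketch below uses entirely synthetic series; the number of models and all coefficients are placeholders, not the four systems used in the study.

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression, Ridge

    rng = np.random.default_rng(5)
    n_months = 120
    true_transport = 2.4 + 0.4 * np.sin(2 * np.pi * np.arange(n_months) / 12)   # Sv, seasonal cycle

    # Four strongly correlated "model" estimates of the same transport (hence multicollinearity)
    models = np.column_stack([true_transport + rng.normal(bias, 0.15, n_months)
                              for bias in (0.3, -0.2, 0.1, -0.4)])
    observed = true_transport + rng.normal(0, 0.1, n_months)

    ols = LinearRegression().fit(models, observed)
    ridge = Ridge(alpha=1.0).fit(models, observed)        # penalty shrinks unstable weights

    print("OLS weights  ", ols.coef_.round(2))
    print("ridge weights", ridge.coef_.round(2))
    print("ridge MME mean transport ≈", ridge.predict(models).mean().round(2), "Sv")
    ```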

  13. A Framework for Automating Cost Estimates in Assembly Processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Calton, T.L.; Peters, R.R.

    1998-12-09

    When a product concept emerges, the manufacturing engineer is asked to sketch out a production strategy and estimate its cost. The engineer is given an initial product design, along with a schedule of expected production volumes. The engineer then determines the best approach to manufacturing the product, comparing a variety of alternative production strategies. The engineer must consider capital cost, operating cost, lead-time, and other issues in an attempt to maximize profits. After making these basic choices and sketching the design of overall production, the engineer produces estimates of the required capital, operating costs, and production capacity. This process may iterate as the product design is refined in order to improve its performance or manufacturability. The focus of this paper is on the development of computer tools to aid manufacturing engineers in their decision-making processes. This computer software tool provides a framework in which accurate cost estimates can be seamlessly derived from design requirements at the start of any engineering project. The result is faster cycle times through first-pass success and lower life cycle cost due to requirements-driven design and accurate cost estimates derived early in the process.

  14. Temporal and spatial variation in allocating annual traffic activity across an urban region and implications for air quality assessments

    PubMed Central

    Batterman, Stuart

    2015-01-01

    Patterns of traffic activity, including changes in the volume and speed of vehicles, vary over time and across urban areas and can substantially affect vehicle emissions of air pollutants. Time-resolved activity at the street scale typically is derived using temporal allocation factors (TAFs) that allow the development of emissions inventories needed to predict concentrations of traffic-related air pollutants. This study examines the spatial and temporal variation of TAFs, and characterizes prediction errors resulting from their use. Methods are presented to estimate TAFs and their spatial and temporal variability and used to analyze total, commercial and non-commercial traffic in the Detroit, Michigan, U.S. metropolitan area. The variability of total volume estimates, quantified by the coefficient of variation (COV) representing the percentage departure from expected hourly volume, was 21, 33, 24 and 33% for weekdays, Saturdays, Sundays and holidays, respectively. Prediction errors mostly resulted from hour-to-hour variability on weekdays and Saturdays, and from day-to-day variability on Sundays and holidays. Spatial variability was limited across the study roads, most of which were large freeways. Commercial traffic had different temporal patterns and greater variability than noncommercial vehicle traffic, e.g., the weekday variability of hourly commercial volume was 28%. The results indicate that TAFs for a metropolitan region can provide reasonably accurate estimates of hourly vehicle volume on major roads. While vehicle volume is only one of many factors that govern on-road emission rates, air quality analyses would be strengthened by incorporating information regarding the uncertainty and variability of traffic activity. PMID:26688671

  15. Use of continuous and grab sample data for calculating total maximum daily load (TMDL) in agricultural watersheds.

    PubMed

    Gulati, Shelly; Stubblefield, Ashley A; Hanlon, Jeremy S; Spier, Chelsea L; Stringfellow, William T

    2014-03-01

    Measuring the discharge of diffuse pollution from agricultural watersheds presents unique challenges. Flows in agricultural watersheds, particularly in Mediterranean climates, can be predominantly irrigation runoff and exhibit large diurnal fluctuations in both volume and concentration. Flow and pollutant concentrations in these smaller watersheds dominated by human activity do not conform to a normal distribution, and it is not clear whether parametric methods are appropriate or accurate for load calculations. The objective of this study was to compare the accuracy of five load estimation methods for calculating pollutant loads from agricultural watersheds. Calculation of loads using results from discrete (grab) samples was compared with the true load computed using in situ continuous monitoring measurements. A new method is introduced that uses a non-parametric measure of central tendency (the median) to calculate loads (median-load). The median-load method was compared to more commonly used parametric estimation methods which rely on the mean as a measure of central tendency (mean-load and daily-load), a method that utilizes the total flow volume (volume-load), and a method that uses a measure of flow at the time of sampling (instantaneous-load). Using measurements from ten watersheds in the San Joaquin Valley of California, the average percent error compared to the true load for total dissolved solids (TDS) was 7.3% for the median-load, 6.9% for the mean-load, 6.9% for the volume-load, 16.9% for the instantaneous-load, and 18.7% for the daily-load methods of calculation. The results of this study show that parametric methods are surprisingly accurate, even for data that have starkly non-normal distributions and are highly skewed. Copyright © 2013 Elsevier Ltd. All rights reserved.
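
    The estimators compared here differ mainly in which summary of the grab-sample concentrations is combined with which flow statistic. The sketch below contrasts three of them with a continuous-record true load on synthetic hourly data; the exact formulas follow common usage and are assumptions where the abstract does not spell them out.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)
    n = 24 * 30                                        # 30 days of hourly records
    hours = np.arange(n)
    flow = 2.0 + 1.5 * np.abs(np.sin(2 * np.pi * hours / 24)) + rng.normal(0, 0.1, n)   # m³/s
    conc = 400 + 150 * np.sin(2 * np.pi * hours / 24 + 1.0) + rng.normal(0, 20, n)       # mg/L TDS

    dt_s = 3600.0
    # conc [mg/L] * flow [m³/s] is numerically g/s, so * dt_s gives grams, /1000 gives kg
    true_load_kg = np.sum(conc * flow) * dt_s / 1000.0

    # Weekly daytime grab samples (an assumed sampling scheme)
    grab = np.arange(10, n, 24 * 7)
    total_volume_m3 = np.sum(flow) * dt_s

    mean_load = conc[grab].mean() * total_volume_m3 / 1000.0             # mean conc x total volume
    median_load = np.median(conc[grab]) * total_volume_m3 / 1000.0       # median conc x total volume
    instant_load = (conc[grab] * flow[grab]).mean() * n * dt_s / 1000.0  # mean instantaneous load x period

    for name, est in [("mean", mean_load), ("median", median_load), ("instantaneous", instant_load)]:
        print(f"{name:>13}-load error vs true load: {100 * (est - true_load_kg) / true_load_kg:+.1f}%")
    ```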

  16. Object strength--an accurate measure for small objects that is insensitive to partial volume effects.

    PubMed

    Tofts, P S; Silver, N C; Barker, G J; Gass, A

    2005-07-01

    There are currently four problems in characterising small nonuniform lesions or other objects in Magnetic Resonance images where partial volume effects are significant. Object size is over- or under-estimated; boundaries are often not reproducible; mean object value cannot be measured; and fuzzy borders cannot be accommodated. A new measure, Object Strength, is proposed. This is the sum of all abnormal intensities, above a uniform background value. For a uniform object, this is simply the product of the increase in intensity and the size of the object. Biologically, this could be at least as relevant as existing measures of size or mean intensity. We hypothesise that Object Strength will perform better than traditional area measurements in characterising small objects. In a pilot study, the reproducibility of object strength measurements was investigated using MR images of small multiple sclerosis (MS) lesions. In addition, accuracy was investigated using artificial lesions of known volume (0.3-6.2 ml) and realistic appearance. Reproducibility approached that of area measurements (in 33/90 lesion reports the difference between repeats was less than for area measurements). Total lesion volume was accurate to 0.2%. In conclusion, Object Strength has potential for improved characterisation of small lesions and objects in imaging and possibly spectroscopy.
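
    Object Strength as defined here is the sum of above-background intensity over a region enclosing the object, which for a uniform object reduces to the intensity increase times the object size. A minimal sketch on a synthetic fuzzy-edged object, with all values invented:

    ```python
    import numpy as np

    def object_strength(image, background, roi, pixel_volume=1.0):
        """Sum of (intensity - background) over a region enclosing the object.

        For a uniform object this reduces to (intensity increase) x (object size),
        and no hard boundary is needed for lesions with fuzzy borders.
        """
        return float((image[roi] - background).sum()) * pixel_volume

    # Toy example: a blurred "lesion" about 20 units above a background of 100
    rng = np.random.default_rng(7)
    img = rng.normal(100.0, 2.0, size=(64, 64))
    yy, xx = np.mgrid[:64, :64]
    img += 20.0 * np.exp(-((yy - 32) ** 2 + (xx - 32) ** 2) / (2 * 5.0 ** 2))   # fuzzy-edged object
    roi = (yy - 32) ** 2 + (xx - 32) ** 2 < 20 ** 2                             # generous enclosing region
    print(f"object strength ≈ {object_strength(img, 100.0, roi):.0f} intensity·pixels")
    ```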

  17. Adaptive changes in pancreas post Roux-en-Y gastric bypass induced weight loss.

    PubMed

    Lautenbach, A; Wernecke, M; Riedel, N; Veigel, J; Yamamura, J; Keller, S; Jung, R; Busch, P; Mann, O; Knop, F K; Holst, J J; Meier, J J; Aberle, J

    2018-05-16

    Obesity has been shown to trigger adaptive increases in pancreas parenchymal and fat volume. Consequently, pancreatic steatosis may lead to beta-cell dysfunction. However, it is not known whether the pancreatic tissue components decrease with weight loss and whether pancreatic steatosis is reversible following RYGB. Therefore, the objective of the study was to investigate the effects of RYGB-induced weight loss on pancreatic volume and glucose homeostasis. Eleven patients were recruited at the Obesity Centre of the University Medical Centre Hamburg-Eppendorf. Before and 6 months after RYGB, total GLP-1 levels were measured during OGTT. To assess changes in visceral adipose tissue and pancreatic volume, MRI was performed. Measures of glucose homeostasis and insulin indices were assessed. Fractional beta-cell area was estimated by correlation with the C-peptide-to-glucose ratio, and beta-cell mass was calculated as the product of beta-cell area and pancreas parenchymal weight. Pancreas volume decreased from 83.8 (75.7-92.0) to 70.5 (58.8-82.3) cm³ [mean (95% CI), p=0.001]. The decrease in total volume was associated with a significant decrease in fat volume. Fasting insulin and C-peptide were lower post RYGB. HOMA-IR levels decreased, whereas insulin sensitivity increased (p=0.03). This was consistent with a reduction in the estimated beta-cell area and mass. Following RYGB, pancreatic volume and steatosis adaptively decreased to "normal" levels with accompanying improvement in glucose homeostasis. Moreover, obesity-driven beta-cell expansion seems to be reversible; however, future studies must define a method to more accurately estimate functional beta-cell mass to increase our understanding of glucose homeostasis after RYGB. This article is protected by copyright. All rights reserved.

  18. Feasibility of single-beat full-volume capture real-time three-dimensional echocardiography for quantification of right ventricular volume: validation by cardiac magnetic resonance imaging.

    PubMed

    Zhang, Quan Bin; Sun, Jing Ping; Gao, Rui Feng; Lee, Alex Pui-Wai; Feng, Yan Lin; Liu, Xiao Rong; Sheng, Wei; Liu, Feng; Yang, Xing Sheng; Fang, Fang; Yu, Cheuk-Man

    2013-10-09

    The lack of an accurate noninvasive method for assessing right ventricular (RV) volume and function has been a major deficiency of two-dimensional (2D) echocardiography. The aim of our study was to test the feasibility of single-beat full-volume capture with a real-time three-dimensional echocardiography (3DE) imaging system for the evaluation of RV volumes and function, validated against cardiac magnetic resonance imaging (CMRI). Sixty-one subjects (16 normal subjects, 20 patients with hypertension, 16 patients with pulmonary heart disease and 9 patients with coronary heart disease) were studied. RV volume and function assessments using 3DE were compared with manual tracing with CMRI as the reference method. Fifty-nine of 61 patients (96.7%; 36 male; mean age 62 ± 15 years) had adequate three-dimensional echocardiographic data sets for analysis. The mean RV end-diastolic volume (EDV) was 105 ± 38 ml, end-systolic volume (ESV) was 60 ± 30 ml and RV ejection fraction (EF) was 44 ± 11% by CMRI, and EDV was 103 ± 38 ml, ESV 60 ± 28 ml and RV EF 41 ± 13% by 3DE. The correlations and agreements between measurements obtained by the two methods were acceptable. RV volumes and function can be analyzed with 3DE software in most subjects with or without heart disease and can be estimated with single-beat full-volume capture real-time 3DE with results comparable to CMRI. © 2013.

  19. Visual illusion in mass estimation of cut food.

    PubMed

    Wada, Yuji; Tsuzuki, Daisuke; Kobayashi, Naoki; Hayakawa, Fumiyo; Kohyama, Kaoru

    2007-07-01

    We investigated the effect of the appearance of cut food on visual mass estimation. In this experiment, we manipulated the shape (e.g., a block, fine strips, or small cubes) of food samples of various masses, and presented them on a CRT display as stimuli. Eleven subjects participated in tasks in which they chose the picture of the food sample which they felt indicated a target mass. We used raw carrots and surimi (ground fish) gel as hard and soft samples, respectively. The results clearly confirm the existence of an illusion, indicating that the appearance of food interferes with visual mass estimation. Specifically, participants often overestimated the mass of finely cut food, especially fine strips, whereas they could accurately estimate the mass of block samples, regardless of the physical characteristics of the foods. The overestimation of the mass of cut food increased with the food's actual mass, and was particularly obvious with increases of apparent volume when cut into fine strips. These results suggest that the apparent volume of a food sample affects the visual estimation of its mass. Hence we can conclude that there are illusions associated with the visual presentation of food that may influence various food impressions, including satisfaction and eating behaviour.

  20. Sampling procedures for inventory of commercial volume tree species in Amazon Forest.

    PubMed

    Netto, Sylvio P; Pelissari, Allan L; Cysneiros, Vinicius C; Bonazza, Marcelo; Sanquetta, Carlos R

    2017-01-01

    The spatial distribution of tropical tree species can affect the consistency of the estimators in commercial forest inventories; therefore, appropriate sampling procedures are required to survey species with different spatial patterns in the Amazon Forest. The present study aims to evaluate conventional sampling procedures and to introduce adaptive cluster sampling for volumetric inventories of Amazonian tree species, considering the hypotheses that the density, the spatial distribution and the zero-plots affect the consistency of the estimators, and that adaptive cluster sampling allows more accurate volumetric estimates to be obtained. We used data from a census carried out in Jamari National Forest, Brazil, where trees with diameters equal to or greater than 40 cm were measured in 1,355 plots. Species with different spatial patterns were selected and sampled with simple random sampling, systematic sampling, linear cluster sampling and adaptive cluster sampling, and the accuracy of the volumetric estimation and the presence of zero-plots were evaluated. The sampling procedures applied to these species were affected by the low density of trees and the large number of zero-plots; the adaptive clusters allowed the sampling effort to be concentrated in plots with trees and, thus, aggregated more representative samples for estimating the commercial volume.

  1. Continuous Rapid Quantification of Stroke Volume Using Magnetohydrodynamic Voltages in 3T Magnetic Resonance Imaging.

    PubMed

    Gregory, T Stan; Oshinski, John; Schmidt, Ehud J; Kwong, Raymond Y; Stevenson, William G; Ho Tse, Zion Tsz

    2015-12-01

    To develop a technique to noninvasively estimate stroke volume in real time during magnetic resonance imaging (MRI)-guided procedures, based on induced magnetohydrodynamic voltages (VMHD) that occur in ECG recordings during MRI exams, leaving the MRI scanner free to perform other imaging tasks. Because of the relationship between blood flow (BF) and VMHD, we hypothesized that a method to obtain stroke volume could be derived from extracted VMHD vectors in the vectorcardiogram (VCG) frame of reference (VMHDVCG). To estimate a subject-specific BF-VMHD model, VMHDVCG was acquired during a 20-s breath-hold and calibrated against aortic BF measured using phase-contrast magnetic resonance in 10 subjects (n=10) and 1 subject diagnosed with premature ventricular contractions. Beat-to-beat validation of VMHDVCG-derived BF was performed using real-time phase-contrast imaging in 7 healthy subjects (n=7) during 15-minute cardiac exercise stress tests and 30 minutes after stress relaxation in 3T MRIs. Subject-specific equations were derived to correlate VMHDVCG with BF at rest and validated using real-time phase-contrast imaging. Average errors of 7.22% and 3.69% in stroke volume estimation were found during peak stress and after complete relaxation, respectively. Measured beat-to-beat BF time histories derived from real-time phase-contrast imaging and VMHD were highly correlated, with Spearman rank correlation coefficients of 0.89 during stress tests and 0.86 after stress relaxation. Accurate beat-to-beat stroke volume and BF were estimated using VMHDVCG extracted from intra-MRI 12-lead ECGs, providing a means to enhance patient monitoring during MR imaging and MR-guided interventions. © 2015 American Heart Association, Inc.

  2. Volumes and bulk densities of forty asteroids from ADAM shape modeling

    NASA Astrophysics Data System (ADS)

    Hanuš, J.; Viikinkoski, M.; Marchis, F.; Ďurech, J.; Kaasalainen, M.; Delbo', M.; Herald, D.; Frappa, E.; Hayamizu, T.; Kerr, S.; Preston, S.; Timerson, B.; Dunham, D.; Talbot, J.

    2017-05-01

    Context. Disk-integrated photometric data of asteroids do not contain accurate information on shape details or size scale. Additional data such as disk-resolved images or stellar occultation measurements further constrain asteroid shapes and allow size estimates. Aims: We aim to use all the available disk-resolved images of approximately forty asteroids obtained by the Near-InfraRed Camera (Nirc2) mounted on the W.M. Keck II telescope together with the disk-integrated photometry and stellar occultation measurements to determine their volumes. We can then use the volume, in combination with the known mass, to derive the bulk density. Methods: We downloaded and processed all the asteroid disk-resolved images obtained by the Nirc2 that are available in the Keck Observatory Archive (KOA). We combined optical disk-integrated data and stellar occultation profiles with the disk-resolved images and use the All-Data Asteroid Modeling (ADAM) algorithm for the shape and size modeling. Our approach provides constraints on the expected uncertainty in the volume and size as well. Results: We present shape models and volume for 41 asteroids. For 35 of these asteroids, the knowledge of their mass estimates from the literature allowed us to derive their bulk densities. We see a clear trend of lower bulk densities for primitive objects (C-complex) and higher bulk densities for S-complex asteroids. The range of densities in the X-complex is large, suggesting various compositions. We also identified a few objects with rather peculiar bulk densities, which is likely a hint of their poor mass estimates. Asteroid masses determined from the Gaia astrometric observations should further refine most of the density estimates.

  3. A Comparison of Methods Used to Estimate the Height of Sand Dunes on Mars

    NASA Technical Reports Server (NTRS)

    Bourke, M. C.; Balme, M.; Beyer, R. A.; Williams, K. K.; Zimbelman, J.

    2006-01-01

    The collection of morphometric data on small-scale landforms from other planetary bodies is difficult. We assess four methods that can be used to estimate the height of aeolian dunes on Mars. These are (1) stereography, (2) slip face length, (3) profiling photoclinometry, and (4) Mars Orbiter Laser Altimeter (MOLA). Results show that there is good agreement among the methods when conditions are ideal. However, limitations inherent to each method inhibited their accurate application to all sites. Collectively, these techniques provide data on a range of morphometric parameters, some of which were not previously available for dunes on Mars. They include dune height, width, length, surface area, volume, and longitudinal and transverse profiles. The utilization of these methods will facilitate a more accurate analysis of aeolian dunes on Mars and enable comparison with dunes on other planetary surfaces.

  4. Validating New Software for Semiautomated Liver Volumetry--Better than Manual Measurement?

    PubMed

    Noschinski, L E; Maiwald, B; Voigt, P; Wiltberger, G; Kahn, T; Stumpp, P

    2015-09-01

    This prospective study compared a manual program for liver volumetry with semiautomated software. The hypothesis was that the semiautomated software would be faster, more accurate and less dependent on the evaluator's experience. Ten patients undergoing hemihepatectomy were included in this IRB approved study after written informed consent. All patients underwent a preoperative abdominal 3-phase CT scan, which was used for whole liver volumetry and volume prediction for the liver part to be resected. Two different types of software were used: 1) manual method: borders of the liver had to be defined per slice by the user; 2) semiautomated software: automatic identification of liver volume with manual assistance for definition of Couinaud segments. Measurements were done by six observers with different experience levels. Water displacement volumetry immediately after partial liver resection served as the gold standard. The resected part was examined with a CT scan after displacement volumetry. Volumetry of the resected liver scan showed excellent correlation to water displacement volumetry (manual: ρ = 0.997; semiautomated software: ρ = 0.995). The difference between the predicted volume and the real volume was significantly smaller with the semiautomated software than with the manual method (33% vs. 57%, p = 0.002). The semiautomated software was almost four times faster for volumetry of the whole liver (manual: 6:59 ± 3:04 min; semiautomated: 1:47 ± 1:11 min). Both methods for liver volumetry give an estimated liver volume close to the real one. The tested semiautomated software is faster, more accurate in predicting the volume of the resected liver part, gives more reproducible results and is less dependent on the user's experience. Both tested types of software allow exact volumetry of resected liver parts. Preoperative prediction can be performed more accurately with the semiautomated software. The semiautomated software is nearly four times faster than the tested manual program and less dependent on the user's experience. © Georg Thieme Verlag KG Stuttgart · New York.

  5. Enhanced Photoacoustic Gas Analyser Response Time and Impact on Accuracy at Fast Ventilation Rates during Multiple Breath Washout

    PubMed Central

    Horsley, Alex; Macleod, Kenneth; Gupta, Ruchi; Goddard, Nick; Bell, Nicholas

    2014-01-01

    Background The Innocor device contains a highly sensitive photoacoustic gas analyser that has been used to perform multiple breath washout (MBW) measurements using very low concentrations of the tracer gas SF6. Use in smaller subjects has been restricted by the requirement for a gas analyser response time of <100 ms, in order to ensure accurate estimation of lung volumes at rapid ventilation rates. Methods A series of previously reported and novel enhancements were made to the gas analyser to produce a clinically practical system with a reduced response time. An enhanced lung model system, capable of delivering highly accurate ventilation rates and volumes, was used to assess the in vitro accuracy of functional residual capacity (FRC) volume calculation and the effects of flow and gas signal alignment on this. Results The 10–90% rise time was reduced from 154 to 88 ms. In an adult/child lung model, the accuracy of volume calculation was −0.9 to 2.9% for all measurements, including those with a ventilation rate of 30/min and an FRC of 0.5 L; for the un-enhanced system, accuracy deteriorated at higher ventilation rates and smaller FRC. In a separate smaller lung model (ventilation rate 60/min, FRC 250 ml, tidal volume 100 ml), the mean accuracy of FRC measurement for the enhanced system was −0.95% (range −3.8 to 2.0%). Error sensitivity to flow and gas signal alignment was increased by ventilation rate, smaller FRC and slower analyser response time. Conclusion The Innocor analyser can be enhanced to reliably generate highly accurate FRC measurements down to volumes as low as those simulating infant lung settings. Signal alignment is a critical factor. With these enhancements, the Innocor analyser exceeds key technical component recommendations for MBW apparatus. PMID:24892522

  6. Quantitative three-dimensional transrectal ultrasound (TRUS) for prostate imaging

    NASA Astrophysics Data System (ADS)

    Pathak, Sayan D.; Aarnink, Rene G.; de la Rosette, Jean J.; Chalana, Vikram; Wijkstra, Hessel; Haynor, David R.; Debruyne, Frans M. J.; Kim, Yongmin

    1998-06-01

    With the number of men seeking medical care for prostate diseases rising steadily, clinicians increasingly need a fast and accurate tool for prostate boundary detection and volume estimation. Currently, these measurements are made manually, which results in long examination times. A possible solution is to improve efficiency by automating the boundary detection and volume estimation process with minimal involvement from human experts. In this paper, we present an algorithm based on SNAKES to detect the boundaries. Our approach is to selectively enhance the contrast along the edges using an algorithm called sticks and to integrate it with a SNAKES model. This integrated algorithm requires an initial curve for each ultrasound image to initiate the boundary detection process. We have used different schemes to generate the curves with varying degrees of automation and evaluated their effects on the algorithm's performance. After the boundaries are identified, the prostate volume is calculated using planimetric volumetry. We have tested our algorithm on 6 different prostate volumes and compared the performance against the volumes manually measured by 3 experts. With increased user input, the algorithm's performance improved as expected. The results demonstrate that, given an initial contour reasonably close to the prostate boundaries, the algorithm successfully delineates the prostate boundaries in an image, and the resulting volume measurements are in close agreement with those made by the human experts.
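
    The final step, planimetric volumetry, simply integrates the delineated cross-sectional areas over the slice spacing. A minimal sketch with assumed slice areas and spacing:

    ```python
    import numpy as np

    def planimetric_volume_cc(contour_areas_mm2, slice_spacing_mm):
        """Prostate volume (cc) from per-slice contour areas and the inter-slice spacing."""
        return np.sum(contour_areas_mm2) * slice_spacing_mm / 1000.0   # mm^3 -> cc

    # Hypothetical areas (mm^2) delineated on 12 transverse TRUS slices, 5 mm apart
    areas = np.array([120, 380, 650, 910, 1120, 1230, 1210, 1080, 860, 590, 310, 90], float)
    print(f"planimetric volume ≈ {planimetric_volume_cc(areas, 5.0):.1f} cc")
    ```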

  7. Technique for bone volume measurement from human femur head samples by classification of micro-CT image histograms.

    PubMed

    Marinozzi, Franco; Bini, Fabiano; Marinozzi, Andrea; Zuppante, Francesca; De Paolis, Annalisa; Pecci, Raffaella; Bedini, Rossella

    2013-01-01

    Micro-CT analysis is a powerful technique for a non-invasive evaluation of the morphometric parameters of trabecular bone samples. This analysis requires a prior binarization of the images. A problem which arises from the binarization process is the partial volume artifact: voxels at the external surface of the sample can contain both bone and air, so thresholding produces an incorrect estimate of the volume occupied by the two materials. The aim of this study is the extraction of bone volumetric information directly from the image histograms, by fitting them with a suitable set of functions. Nineteen trabecular bone samples were extracted from the femoral heads of eight patients who underwent hip arthroplasty surgery. The trabecular bone samples were acquired using a micro-CT scanner. Histograms of the acquired images were computed and fitted by Gaussian-like functions accounting for: a) gray levels produced by bone x-ray absorption, b) the portions of the image occupied by air, and c) voxels that contain a mixture of bone and air. This latter contribution can be considered an estimate of the partial volume effect. The comparison of the proposed technique with the bone volumes measured by a reference instrument, a helium pycnometer, shows that the method provides an accurate bone volume calculation for trabecular bone samples.
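
    The histogram decomposition described here can be reproduced with a three-component fit: one Gaussian for air, one for bone, and one intermediate component for partial-volume voxels, with the bone volume fraction following from the fitted areas. The scipy sketch below uses synthetic gray levels; splitting the mixed component half-and-half between bone and air is an assumption, not necessarily the authors' rule.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def three_gaussians(x, a1, m1, s1, a2, m2, s2, a3, m3, s3):
        g = lambda a, m, s: a * np.exp(-(x - m) ** 2 / (2 * s ** 2))
        return g(a1, m1, s1) + g(a2, m2, s2) + g(a3, m3, s3)

    # Synthetic micro-CT gray levels: air, a partial-volume mixture, and bone
    rng = np.random.default_rng(8)
    gray = np.concatenate([rng.normal(40, 10, 60000),     # air
                           rng.normal(120, 25, 15000),    # mixed bone/air voxels
                           rng.normal(200, 12, 25000)])   # bone
    counts, edges = np.histogram(gray, bins=256, range=(0, 255))
    centers = 0.5 * (edges[:-1] + edges[1:])

    p0 = (counts.max(), 40, 10, counts.max() / 5, 120, 25, counts.max() / 2, 200, 12)
    popt, _ = curve_fit(three_gaussians, centers, counts, p0=p0)

    # Component areas are proportional to amplitude * |sigma| (common sqrt(2*pi) factor cancels)
    areas = np.abs([popt[0] * popt[2], popt[3] * popt[5], popt[6] * popt[8]])
    bone_fraction = (areas[2] + 0.5 * areas[1]) / areas.sum()   # assign half of the mixed voxels to bone
    print(f"estimated bone volume fraction ≈ {bone_fraction:.2f}")
    ```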

  8. A new method of cardiographic image segmentation based on grammar

    NASA Astrophysics Data System (ADS)

    Hamdi, Salah; Ben Abdallah, Asma; Bedoui, Mohamed H.; Alimi, Adel M.

    2011-10-01

    The measurement of the most common ultrasound parameters, such as aortic area, mitral area and left ventricle (LV) volume, requires the delineation of the organ in order to estimate the area. In terms of medical image processing, this translates into the need to segment the image and define the contours as accurately as possible. The aim of this work is to segment an image and make an automated area estimation based on grammar. The entity "language" is mapped onto the entity "image" to perform structural analysis and parsing of the image. We show how the idea of segmentation and grammar-based area estimation is applied to real problems of cardiographic image processing.

  9. The lack of adequate quality assurance/quality control data hinders the assessment of potential forest degradation in a national forest inventory

    Treesearch

    Thomas Brandeis; Stanley Zarnoch; Christopher Oswalt; Jeffery Stringer

    2017-01-01

    Hardwood lumber harvested from the temperate broadleaf and mixed broadleaf/conifer forests of the east-central United States is an important economic resource. Forest industry stakeholders in this region have a growing need for accurate, reliable estimates of high-quality wood volume. While lower-graded timber has an increasingly wide array of uses, the forest products...

  10. Efficient Construction of Free Energy Profiles of Breathing Metal–Organic Frameworks Using Advanced Molecular Dynamics Simulations

    PubMed Central

    2017-01-01

    In order to reliably predict and understand the breathing behavior of highly flexible metal–organic frameworks from thermodynamic considerations, an accurate estimation of the free energy difference between their different metastable states is a prerequisite. Herein, a variety of free energy estimation methods are thoroughly tested for their ability to construct the free energy profile as a function of the unit cell volume of MIL-53(Al). The methods comprise free energy perturbation, thermodynamic integration, umbrella sampling, metadynamics, and variationally enhanced sampling. A series of molecular dynamics simulations have been performed in the frame of each of the five methods to describe structural transformations in flexible materials with the volume as the collective variable, which offers a unique opportunity to assess their computational efficiency. Subsequently, the most efficient method, umbrella sampling, is used to construct an accurate free energy profile at different temperatures for MIL-53(Al) from first principles at the PBE+D3(BJ) level of theory. This study yields insight into the importance of the different aspects such as entropy contributions and anharmonic contributions on the resulting free energy profile. As such, this thorough study provides unparalleled insight in the thermodynamics of the large structural deformations of flexible materials. PMID:29131647

  11. Efficient Construction of Free Energy Profiles of Breathing Metal-Organic Frameworks Using Advanced Molecular Dynamics Simulations.

    PubMed

    Demuynck, Ruben; Rogge, Sven M J; Vanduyfhuys, Louis; Wieme, Jelle; Waroquier, Michel; Van Speybroeck, Veronique

    2017-12-12

    In order to reliably predict and understand the breathing behavior of highly flexible metal-organic frameworks from thermodynamic considerations, an accurate estimation of the free energy difference between their different metastable states is a prerequisite. Herein, a variety of free energy estimation methods are thoroughly tested for their ability to construct the free energy profile as a function of the unit cell volume of MIL-53(Al). The methods comprise free energy perturbation, thermodynamic integration, umbrella sampling, metadynamics, and variationally enhanced sampling. A series of molecular dynamics simulations have been performed in the frame of each of the five methods to describe structural transformations in flexible materials with the volume as the collective variable, which offers a unique opportunity to assess their computational efficiency. Subsequently, the most efficient method, umbrella sampling, is used to construct an accurate free energy profile at different temperatures for MIL-53(Al) from first principles at the PBE+D3(BJ) level of theory. This study yields insight into the importance of the different aspects such as entropy contributions and anharmonic contributions on the resulting free energy profile. As such, this thorough study provides unparalleled insight in the thermodynamics of the large structural deformations of flexible materials.

  12. Method for accurate quantitation of background tissue optical properties in the presence of emission from a strong fluorescence marker

    NASA Astrophysics Data System (ADS)

    Bravo, Jaime; Davis, Scott C.; Roberts, David W.; Paulsen, Keith D.; Kanick, Stephen C.

    2015-03-01

    Quantification of targeted fluorescence markers during neurosurgery has the potential to improve and standardize surgical distinction between normal and cancerous tissues. However, quantitative analysis of marker fluorescence is complicated by tissue background absorption and scattering properties. Correction algorithms that transform raw fluorescence intensity into quantitative units, independent of absorption and scattering, require a paired measurement of localized white light reflectance to provide estimates of the optical properties. This study focuses on the unique problem of developing a spectral analysis algorithm to extract tissue absorption and scattering properties from white light spectra that contain contributions from both elastically scattered photons and fluorescence emission from a strong fluorophore (i.e. fluorescein). A fiber-optic reflectance device was used to perform measurements in a small set of optical phantoms, constructed with Intralipid (1% lipid), whole blood (1% volume fraction) and fluorescein (0.16-10 μg/mL). Results show that the novel spectral analysis algorithm yields accurate estimates of tissue parameters independent of fluorescein concentration, with relative errors of blood volume fraction, blood oxygenation fraction (BOF), and the reduced scattering coefficient (at 521 nm) of <7%, <1%, and <22%, respectively. These data represent a first step towards quantification of fluorescein in tissue in vivo.

  13. Validating automated kidney stone volumetry in computed tomography and mathematical correlation with estimated stone volume based on diameter.

    PubMed

    Wilhelm, Konrad; Miernik, Arkadiusz; Hein, Simon; Schlager, Daniel; Adams, Fabian; Benndorf, Matthias; Fritz, Benjamin; Langer, Mathias; Hesse, Albrecht; Schoenthaler, Martin; Neubauer, Jakob

    2018-06-02

    To validate the AutoMated UroLithiasis Evaluation Tool (AMULET) software for kidney stone volumetry and compare its performance to standard clinical practice. The maximum diameter and volume of 96 urinary stones were measured as the reference standard by three independent urologists. The same stones were positioned in an anthropomorphic phantom and CT scans were acquired with standard settings. Three independent radiologists blinded to the reference values took manual measurements of the maximum diameter and automatic measurements of maximum diameter and volume. An "expected volume" was calculated from the manual diameter measurements using the formula V = (4/3)πr³. 96 stones were analyzed in the study; we had initially aimed to assess 100, but nine were replaced during data acquisition due to crumbling and 4 had to be excluded because the automated measurement did not work. The mean reference maximum diameter was 13.3 mm (5.2-32.1 mm). Correlation coefficients among all measured outcomes were compared. The correlations of the manual and automatic diameter measurements with the reference were 0.98 and 0.91, respectively (p<0.001). The mean reference volume was 1200 mm³ (10-9000 mm³). The correlations of the "expected volume" and the automatically measured volume with the reference were 0.95 and 0.99, respectively (p<0.001). Patients' kidney stone burden is usually assessed according to maximum diameter. However, as most stones are not spherical, this entails a potential bias. Automated stone volumetry is possible and significantly more accurate than diameter-based volumetric calculations. To avoid bias in clinical trials, size should be measured as volume. However, automated diameter measurements are not as accurate as manual measurements.
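
    The "expected volume" comparator is simply the volume of a sphere with the measured maximum diameter, which is why non-spherical stones bias it. For completeness, a one-line helper:

    ```python
    import math

    def expected_stone_volume_mm3(max_diameter_mm):
        """Sphere-equivalent volume V = (4/3)*pi*r^3 from the maximum diameter."""
        r = max_diameter_mm / 2.0
        return 4.0 / 3.0 * math.pi * r ** 3

    print(f"{expected_stone_volume_mm3(13.3):.0f} mm³ for a 13.3 mm stone")
    ```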

  14. Effects of cumulative illness severity on hippocampal gray matter volume in major depression: a voxel-based morphometry study.

    PubMed

    Zaremba, Dario; Enneking, Verena; Meinert, Susanne; Förster, Katharina; Bürger, Christian; Dohm, Katharina; Grotegerd, Dominik; Redlich, Ronny; Dietsche, Bruno; Krug, Axel; Kircher, Tilo; Kugel, Harald; Heindel, Walter; Baune, Bernhard T; Arolt, Volker; Dannlowski, Udo

    2018-02-08

    Patients with major depression show reduced hippocampal volume compared to healthy controls. However, the contribution of patients' cumulative illness severity to hippocampal volume has rarely been investigated. It was the aim of our study to find a composite score of cumulative illness severity that is associated with hippocampal volume in depression. We estimated hippocampal gray matter volume using 3-tesla brain magnetic resonance imaging in 213 inpatients with acute major depression according to DSM-IV criteria (employing the SCID interview) and 213 healthy controls. Patients' cumulative illness severity was ascertained by six clinical variables via structured clinical interviews. A principal component analysis was conducted to identify components reflecting cumulative illness severity. Regression analyses and a voxel-based morphometry approach were used to investigate the influence of patients' individual component scores on hippocampal volume. Principal component analysis yielded two main components of cumulative illness severity: Hospitalization and Duration of Illness. While the component Hospitalization incorporated information from the intensity of inpatient treatment, the component Duration of Illness was based on the duration and frequency of illness episodes. We could demonstrate a significant inverse association of patients' Hospitalization component scores with bilateral hippocampal gray matter volume. This relationship was not found for Duration of Illness component scores. Variables associated with patients' history of psychiatric hospitalization seem to be accurate predictors of hippocampal volume in major depression and reliable estimators of patients' cumulative illness severity. Future studies should pay attention to these measures when investigating hippocampal volume changes in major depression.

  15. A deep learning approach for pose estimation from volumetric OCT data.

    PubMed

    Gessert, Nils; Schlüter, Matthias; Schlaefer, Alexander

    2018-05-01

    Tracking the pose of instruments is a central problem in image-guided surgery. For microscopic scenarios, optical coherence tomography (OCT) is increasingly used as an imaging modality. OCT is suitable for accurate pose estimation due to its micrometer range resolution and volumetric field of view. However, OCT image processing is challenging due to speckle noise and reflection artifacts in addition to the images' 3D nature. We address pose estimation from OCT volume data with a new deep learning-based tracking framework. For this purpose, we design a new 3D convolutional neural network (CNN) architecture to directly predict the 6D pose of a small marker geometry from OCT volumes. We use a hexapod robot to automatically acquire labeled data points which we use to train 3D CNN architectures for multi-output regression. We use this setup to provide an in-depth analysis on deep learning-based pose estimation from volumes. Specifically, we demonstrate that exploiting volume information for pose estimation yields higher accuracy than relying on 2D representations with depth information. Supporting this observation, we provide quantitative and qualitative results that 3D CNNs effectively exploit the depth structure of marker objects. Regarding the deep learning aspect, we present efficient design principles for 3D CNNs, making use of insights from the 2D deep learning community. In particular, we present Inception3D as a new architecture which performs best for our application. We show that our deep learning approach reaches errors at our ground-truth label's resolution. We achieve a mean average error of 14.89 ± 9.3 µm and 0.096 ± 0.072° for position and orientation learning, respectively. Copyright © 2018 Elsevier B.V. All rights reserved.
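
    As a rough illustration of direct 6D pose regression from volumes, the sketch below builds a very small 3D CNN in PyTorch that maps a single-channel OCT volume to three translations and three rotation angles. It is not the Inception3D architecture of the paper; layer sizes, input resolution and output parameterization are assumptions.

```python
import torch
import torch.nn as nn

class Pose3DCNN(nn.Module):
    """Minimal 3D CNN regressing a 6D pose (x, y, z and three rotation angles) from a volume."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool3d(2),
            nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool3d(2),
            nn.AdaptiveAvgPool3d(1),
        )
        self.regressor = nn.Linear(32, 6)  # multi-output regression head

    def forward(self, x):
        return self.regressor(self.features(x).flatten(1))

# Example: a batch of two 64x64x64 volumes (random data standing in for real OCT scans).
pose = Pose3DCNN()(torch.randn(2, 1, 64, 64, 64))
print(pose.shape)  # torch.Size([2, 6])
```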

  16. Optimal estimation retrieval of aerosol microphysical properties from SAGE II satellite observations in the volcanically unperturbed lower stratosphere

    NASA Astrophysics Data System (ADS)

    Wurl, D.; Grainger, R. G.; McDonald, A. J.; Deshler, T.

    2010-05-01

    Stratospheric aerosol particles under non-volcanic conditions are typically smaller than 0.1 μm. Due to fundamental limitations of the scattering theory in the Rayleigh limit, these tiny particles are hard to measure by satellite instruments. As a consequence, current estimates of global aerosol properties retrieved from spectral aerosol extinction measurements tend to be strongly biased. Aerosol surface area densities, for instance, are observed to be about 40% smaller than those derived from correlative in situ measurements (Deshler et al., 2003). An accurate knowledge of the global distribution of aerosol properties is, however, essential to better understand and quantify the role they play in atmospheric chemistry, dynamics, radiation and climate. To address this need a new retrieval algorithm was developed, which employs a nonlinear Optimal Estimation (OE) method to iteratively solve for the monomodal size distribution parameters which are statistically most consistent with both the satellite-measured multi-wavelength aerosol extinction data and a priori information. By thus combining spectral extinction measurements (at visible to near infrared wavelengths) with prior knowledge of aerosol properties at background level, even the smallest particles are taken into account which are practically invisible to optical remote sensing instruments. The performance of the OE retrieval algorithm was assessed based on synthetic spectral extinction data generated from both monomodal and small-mode-dominant bimodal sulphuric acid aerosol size distributions. For monomodal background aerosol, the new algorithm was shown to fairly accurately retrieve the particle sizes and associated integrated properties (surface area and volume densities), even in the presence of large extinction uncertainty. The associated retrieved uncertainties are a good estimate of the true errors. In the case of bimodal background aerosol, where the retrieved (monomodal) size distributions naturally differ from the correct bimodal values, the associated surface area (A) and volume densities (V) are, nevertheless, fairly accurately retrieved, except at values larger than 1.0 μm² cm⁻³ (A) and 0.05 μm³ cm⁻³ (V), where they tend to underestimate the true bimodal values. Due to the limited information content in the SAGE II spectral extinction measurements this kind of forward model error cannot be avoided here. Nevertheless, the retrieved uncertainties are a good estimate of the true errors in the retrieved integrated properties, except where the surface area density exceeds the 1.0 μm² cm⁻³ threshold. When applied to near-global SAGE II satellite extinction measured in 1999 the retrieved OE surface area and volume densities are observed to be larger by, respectively, 20-50% and 10-40% compared to those estimates obtained by the SAGE II operational retrieval algorithm. An examination of the OE algorithm biases with in situ data indicates that the new OE aerosol property estimates tend to be more realistic than previous estimates obtained from remotely sensed data through other retrieval techniques. Based on the results of this study we therefore suggest that the new Optimal Estimation retrieval algorithm is able to contribute to an advancement in aerosol research by considerably improving current estimates of aerosol properties in the lower stratosphere under low aerosol loading conditions.
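
    The iterative Optimal Estimation update used in retrievals of this kind is the standard Gauss-Newton step that weighs the measurement misfit against the a priori constraint. The sketch below is generic (forward model, Jacobian and covariances are placeholders supplied by the caller) and is not the SAGE II retrieval code itself.

```python
import numpy as np

def oe_step(x_i, x_a, y, forward, jacobian, S_a, S_e):
    """One Gauss-Newton optimal-estimation update:
    x_{i+1} = x_a + (S_a^-1 + K^T S_e^-1 K)^-1 K^T S_e^-1 (y - F(x_i) + K (x_i - x_a))."""
    K = jacobian(x_i)                      # Jacobian of the forward model at the current state
    S_a_inv = np.linalg.inv(S_a)           # inverse a priori covariance
    S_e_inv = np.linalg.inv(S_e)           # inverse measurement-error covariance
    S_hat = np.linalg.inv(S_a_inv + K.T @ S_e_inv @ K)   # retrieved-state covariance
    x_next = x_a + S_hat @ K.T @ S_e_inv @ (y - forward(x_i) + K @ (x_i - x_a))
    return x_next, S_hat
```

    Iterating this step to convergence yields both the retrieved size-distribution parameters and their covariance, which is what the quoted retrieval uncertainties are based on.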

  17. Using Data Pooling to Measure the Density of Sodas: An Introductory Discovery Experiment

    NASA Astrophysics Data System (ADS)

    Herrick, Richard S.; Nestor, Lisa P.; Benedetto, David A.

    1999-10-01

    We have developed an experiment in which students measure the density of Coke and Diet Coke. In the first part of the experiment they make measurements using a buret, pipet, and graduated cylinder. The density data are pooled and plotted for each type of glassware. Students discover that Coke and Diet Coke have different densities. Discussion of the data also shows students the relative advantages and disadvantages of each type of apparatus and introduces them to the concept of error analysis. In the second half of the experiment each student uses a buret to accurately measure an assigned volume of either Coke or Diet Coke. Volumes in the range of 2 to 30 mL are assigned. These data are pooled. The slope of the mass-vs-volume plot provides an accurate measurement of the density and also shows that density is an intensive property. The difference in densities is due to the large amount of sugar in Coke compared to the relatively small amount of artificial sweetener in Diet Coke. Information read from soda cans is used to estimate the accuracy of these measurements. This experiment is used as the first experiment for college science students.
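
    The pooled-data analysis described above amounts to a straight-line fit of mass against volume, whose slope is the density. A minimal sketch with made-up pooled values (not the class data):

```python
import numpy as np

# Pooled (volume in mL, mass in g) measurements from many students; values are illustrative only.
volume = np.array([2.0, 5.0, 10.0, 15.0, 20.0, 25.0, 30.0])
mass = np.array([2.08, 5.21, 10.39, 15.61, 20.80, 26.01, 31.20])

# The slope of the mass-vs-volume line is the density (an intensive property);
# the intercept should be close to zero.
slope, intercept = np.polyfit(volume, mass, 1)
print(f"density ≈ {slope:.3f} g/mL, intercept ≈ {intercept:.3f} g")
```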

  18. Estimation of lactose interference in vaccines and a proposal of methodological adjustment of total protein determination by the Lowry method.

    PubMed

    Kusunoki, Hideki; Okuma, Kazu; Hamaguchi, Isao

    2012-01-01

    For national regulatory testing in Japan, the Lowry method is used for the determination of total protein content in vaccines. However, many substances are known to interfere with the Lowry method, rendering accurate estimation of protein content difficult. To accurately determine the total protein content in vaccines, it is necessary to identify the major interfering substances and improve the methodology for removing such substances. This study examined the effects of high levels of lactose with low levels of protein in freeze-dried, cell culture-derived Japanese encephalitis vaccine (inactivated). Lactose was selected because it is a reducing sugar that is expected to interfere with the Lowry method. Our results revealed that concentrations of ≥ 0.1 mg/mL lactose interfered with the Lowry assays and resulted in overestimation of the protein content in a lactose concentration-dependent manner. On the other hand, our results demonstrated that it is important for the residual volume to be ≤ 0.05 mL after trichloroacetic acid precipitation in order to avoid the effects of lactose. Thus, the method presented here is useful for accurate protein determination by the Lowry method, even when it is used for determining low levels of protein in vaccines containing interfering substances. In this study, we have reported a methodological adjustment that allows accurate estimation of protein content for national regulatory testing, when the vaccine contains interfering substances.

  19. InSAR Surface Deformation and Source Modelling at Semisopochnoi Island During the 2014 and 2015 Seismic Swarms with Constraints from Geochemical and Seismic Analysis

    NASA Astrophysics Data System (ADS)

    DeGrandpre, K.; Pesicek, J. D.; Lu, Z.

    2017-12-01

    During the summer of 2014 and the early spring of 2015, two notable increases in seismic activity at Semisopochnoi Island in the western Aleutian Islands were recorded on AVO seismometers on Semisopochnoi and neighboring islands. These seismic swarms did not lead to an eruption. This study combines interferometric synthetic aperture radar (InSAR) analysis of TerraSAR-X images with more accurate relocation of the recorded seismic events through simultaneous inversion of event travel times and a three-dimensional velocity model using tomoDD. The InSAR images exhibit surprising coherence and an island-wide spatial distribution of inflation that is then used in Mogi, Okada, spheroid, and ellipsoid source models in order to define the three-dimensional location and volume change required for a source at the volcano to produce the observed surface deformation. The tomoDD relocations provide a more accurate and realistic three-dimensional velocity model as well as a tighter clustering of events for both swarms that clearly outlines a linear seismic void within the larger group of shallow (<10 km) seismicity. The source models are fit to this void, and pressure estimates from geochemical analysis are used to verify the storage depth of magmas at Semisopochnoi. Comparisons of calculated source cavity, magma injection, and surface deformation volumes are made in order to assess the physical plausibility of the various modelling estimates. Incorporating geochemical and seismic data to provide constraints on surface deformation source inversions provides an interdisciplinary approach that can be used to make more accurate interpretations of dynamic observations.
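
    Of the source geometries mentioned, the Mogi point source has the simplest closed form for the vertical surface displacement; the sketch below is a hedged illustration of that forward model only (the Poisson ratio and all numerical values are assumptions), not the study's joint InSAR/seismic inversion.

```python
import numpy as np

def mogi_uz(r_km, depth_km, dV_km3, nu=0.25):
    """Vertical surface displacement of a Mogi point source:
    u_z = (1 - nu) / pi * dV * d / (r^2 + d^2)^(3/2)."""
    return (1.0 - nu) / np.pi * dV_km3 * depth_km / (r_km**2 + depth_km**2) ** 1.5

# Illustrative profile: a 0.005 km^3 volume increase at 6 km depth, sampled out to 15 km.
r = np.linspace(0.0, 15.0, 4)
print(mogi_uz(r, depth_km=6.0, dV_km3=0.005) * 1.0e5)  # displacement in cm
```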

  20. Inverse probability weighting estimation of the volume under the ROC surface in the presence of verification bias.

    PubMed

    Zhang, Ying; Alonzo, Todd A

    2016-11-01

    In diagnostic medicine, the volume under the receiver operating characteristic (ROC) surface (VUS) is a commonly used index to quantify the ability of a continuous diagnostic test to discriminate between three disease states. In practice, verification of the true disease status may be performed only for a subset of subjects under study since the verification procedure is invasive, risky, or expensive. The selection for disease examination might depend on the results of the diagnostic test and other clinical characteristics of the patients, which in turn can cause bias in estimates of the VUS. This bias is referred to as verification bias. Existing verification bias correction in three-way ROC analysis focuses on ordinal tests. We propose verification bias-correction methods to construct ROC surface and estimate the VUS for a continuous diagnostic test, based on inverse probability weighting. By applying U-statistics theory, we develop asymptotic properties for the estimator. A Jackknife estimator of variance is also derived. Extensive simulation studies are performed to evaluate the performance of the new estimators in terms of bias correction and variance. The proposed methods are used to assess the ability of a biomarker to accurately identify stages of Alzheimer's disease. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
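
    The estimator described here can be sketched as a weighted three-sample U-statistic: each verified subject is weighted by the inverse of its verification probability, and the VUS is the weighted fraction of (class 1, class 2, class 3) triples whose test values are correctly ordered. The code below is a hedged, brute-force illustration with simulated data, not the authors' implementation, and it ignores variance estimation.

```python
import numpy as np

def ipw_vus(test, disease, verified, pi):
    """Inverse-probability-weighted VUS: weighted proportion of verified triples
    (one subject per disease state 1 < 2 < 3) with correctly ordered test values."""
    w = verified / pi                         # IPW weights; unverified subjects contribute 0
    idx = [np.flatnonzero(verified & (disease == k)) for k in (1, 2, 3)]
    num = den = 0.0
    for i in idx[0]:
        for j in idx[1]:
            for k in idx[2]:
                wijk = w[i] * w[j] * w[k]
                den += wijk
                num += wijk * (test[i] < test[j] < test[k])
    return num / den

# Simulated example: verification depends on the test result (hence the bias to correct).
rng = np.random.default_rng(0)
n = 90
disease = rng.integers(1, 4, size=n)
test = disease + rng.normal(0.0, 1.0, size=n)
pi = 0.3 + 0.6 * (test > np.median(test))     # higher test values are verified more often
verified = rng.random(n) < pi
print(ipw_vus(test, disease, verified, pi))
```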

  1. Size assessment of breast lesions by means of a computer-aided detection (CAD) system for magnetic resonance mammography.

    PubMed

    Levrini, G; Sghedoni, R; Mori, C; Botti, A; Vacondio, R; Nitrosi, A; Iori, M; Nicoli, F

    2011-10-01

    The aim of this study was to investigate the efficacy of a dedicated software tool for automated volume measurement of breast lesions in contrast-enhanced (CE) magnetic resonance mammography (MRM). The size of 52 breast lesions with a known histopathological diagnosis (three benign, 49 malignant) was automatically evaluated using different techniques. The volume of all lesions was measured automatically (AVM) from CE 3D MRM examinations by means of a computer-aided detection (CAD) system and compared with the size estimates based on maximum diameter measurement (MDM) on MRM, ultrasonography (US), mammography and histopathology. Compared with histopathology as the reference method, AVM underestimated lesion size by 4% on average. This result was similar to MDM (3% underestimation, not significantly different) but significantly better than US and mammographic lesion measurements (24% and 33% size underestimation, respectively). AVM is as accurate as MDM but faster. Both methods are more accurate for size assessment of breast lesions compared with US and mammography.

  2. A low-cost three-dimensional laser surface scanning approach for defining body segment parameters.

    PubMed

    Pandis, Petros; Bull, Anthony Mj

    2017-11-01

    Body segment parameters are used in many different applications in ergonomics as well as in dynamic modelling of the musculoskeletal system. Body segment parameters can be defined using different methods, including techniques that involve time-consuming manual measurements of the human body, used in conjunction with models or equations. In this study, a scanning technique for measuring subject-specific body segment parameters in an easy, fast, accurate and low-cost way was developed and validated. The scanner can obtain the body segment parameters in a single scanning operation, which takes between 8 and 10 s. The results obtained with the system show a standard deviation of 2.5% in volumetric measurements of the upper limb of a mannequin and 3.1% difference between scanning volume and actual volume. Finally, the maximum mean error for the moment of inertia by scanning a standard-sized homogeneous object was 2.2%. This study shows that a low-cost system can provide quick and accurate subject-specific body segment parameter estimates.

  3. EEG source localization: Sensor density and head surface coverage.

    PubMed

    Song, Jasmine; Davey, Colin; Poulsen, Catherine; Luu, Phan; Turovets, Sergei; Anderson, Erik; Li, Kai; Tucker, Don

    2015-12-30

    The accuracy of EEG source localization depends on a sufficient sampling of the surface potential field, an accurate conducting volume estimation (head model), and a suitable and well-understood inverse technique. The goal of the present study is to examine the effect of sampling density and coverage on the ability to accurately localize sources, using common linear inverse weight techniques, at different depths. Several inverse methods are examined, using commonly adopted head conductivity values. Simulation studies were employed to examine the effect of spatial sampling of the potential field at the head surface, in terms of sensor density and coverage of the inferior and superior head regions. In addition, the effects of sensor density and coverage are investigated in the source localization of epileptiform EEG. Greater sensor density improves source localization accuracy. Moreover, across all sampling densities and inverse methods, adding samples on the inferior surface improves the accuracy of source estimates at all depths. More accurate source localization of EEG data can be achieved with high spatial sampling of the head surface electrodes. The most accurate source localization is obtained when the voltage surface is densely sampled over both the superior and inferior surfaces. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.
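
    The "common linear inverse weight techniques" referred to here all take the form of a matrix applied to the sensor data; below is a minimal sketch of one of them, a regularized minimum-norm estimate, with a random lead field standing in for the head model. The sensor and source counts and the regularization value are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_sensors, n_sources = 256, 2000                   # denser sampling adds rows to the lead field
L = rng.standard_normal((n_sensors, n_sources))    # lead-field matrix from the head model
v = rng.standard_normal(n_sensors)                 # measured scalp potentials

# Minimum-norm estimate: s_hat = L^T (L L^T + lambda * I)^-1 v
lam = 0.1
s_hat = L.T @ np.linalg.solve(L @ L.T + lam * np.eye(n_sensors), v)
print(s_hat.shape)  # (2000,) estimated source amplitudes
```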

  4. Unifying framework for multimodal brain MRI segmentation based on Hidden Markov Chains.

    PubMed

    Bricq, S; Collet, Ch; Armspach, J P

    2008-12-01

    In the context of 3D medical imaging, accurate segmentation of multimodal brain MR images is of interest for many brain disorders. However, due to several factors such as noise, imaging artifacts, intrinsic tissue variation and partial volume effects, tissue classification remains a challenging task. In this paper, we present a unifying framework for unsupervised segmentation of multimodal brain MR images including partial volume effect, bias field correction, and information given by a probabilistic atlas. The proposed method takes neighborhood information into account using a Hidden Markov Chain (HMC) model. Due to the limited resolution of imaging devices, voxels may be composed of a mixture of different tissue types; this partial volume effect is included in the model to achieve an accurate segmentation of brain tissues. Instead of assigning each voxel to a single tissue class (i.e., hard classification), we compute the relative amount of each pure tissue class in each voxel (mixture estimation). Further, a bias field estimation step is added to the proposed algorithm to correct intensity inhomogeneities. Furthermore, atlas priors were incorporated using a probabilistic brain atlas containing prior expectations about the spatial localization of different tissue classes. This atlas is considered as a complementary sensor and the proposed method is extended to multimodal brain MRI without any user-tunable parameter (unsupervised algorithm). To validate this new unifying framework, we present experimental results on both synthetic and real brain images, for which the ground truth is available. Comparison with other commonly used techniques demonstrates the accuracy and the robustness of this new Markovian segmentation scheme.

  5. Cardiac chamber volumes by echocardiography using a new mathematical method: A promising technique for zero-G use

    NASA Technical Reports Server (NTRS)

    Buckey, J. C.; Beattie, J. M.; Gaffney, F. A.; Nixon, J. V.; Blomqvist, C. G.

    1984-01-01

    Accurate, reproducible, and non-invasive means for ventricular volume determination are needed for evaluating cardiovascular function in zero gravity. Current echocardiographic methods, particularly for the right ventricle, suffer from a large standard error. A new mathematical approach, recently described by Watanabe et al., was tested on 1 normal formalin-fixed human heart suspended in a mineral oil bath. Volumes are estimated from multiple two-dimensional echocardiographic views recorded from a single point at sequential angles. The product of sectional cavity area and center of mass for each view, summed over the range of angles (using a trapezoidal rule), gives volume. Multiple (8-14) short-axis right ventricle and left ventricle views at 5.0 deg intervals were videotaped. The images were digitized by two independent observers (leading-edge to leading-edge technique) and analyzed using a graphics tablet and microcomputer. Actual volumes were determined by filling the chambers with water. These data were compared to the mean of the two echo measurements.
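
    Taken literally, the described computation integrates (area x centroid distance) over the view angles with the trapezoidal rule. The sketch below renders exactly that arithmetic with placeholder tracings; the angular range and geometric conventions of the original method are not reproduced here.

```python
import numpy as np

# Placeholder tracings: 12 views at 5-degree increments, each with a sectional cavity
# area (cm^2) and the distance of the section's center of mass from the rotation axis (cm).
theta = np.deg2rad(np.arange(12) * 5.0)
area = np.full(12, 20.0)        # illustrative constant area
centroid = np.full(12, 2.5)     # illustrative constant centroid distance

# Volume as the angular integral of area x centroid distance (trapezoidal rule).
volume = np.trapz(area * centroid, theta)
print(f"estimated chamber volume ≈ {volume:.1f} cm^3")
```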

  6. Velocity gradients and reservoir volumes lessons in computational sensitivity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, P.W.

    1995-12-31

    The sensitivity of reservoir volume estimation from depth-converted geophysical time maps to the velocity gradients employed is investigated through a simple model study. The computed volumes are disconcertingly sensitive to gradients, both horizontal and vertical. The need for an accurate method of time to depth conversion is well demonstrated by the model study, in which errors in velocity are magnified 40-fold in the computation of the volume. Thus, if ±10% accuracy in the volume is desired, we must be able to estimate the velocity at the water contact with 0.25% accuracy. Put another way, if the velocity is 8000 feet per second at the well then we have only ±20 feet per second leeway in estimating the velocity at the water contact. Very moderate horizontal and vertical gradients would typically indicate a velocity change of a few hundred feet per second if they are in the same direction. Clearly the interpreter needs to be very careful. A methodology is demonstrated which takes into account all the information that is available: velocities, tops, depositional and lithologic spatial patterns, and common sense. It is assumed that, through appropriate use of check shot and other time-depth information, the interpreter has correctly tied the reflection picks to the well tops. Such ties are ordinarily too soft for direct time-depth conversion to give adequate depth ties. The proposed method uses a common compaction law as its basis and incorporates time picks, tops and stratigraphic maps into the depth conversion process. The resulting depth map ties the known well tops in an optimum fashion.

  7. CT liver volumetry using geodesic active contour segmentation with a level-set algorithm

    NASA Astrophysics Data System (ADS)

    Suzuki, Kenji; Epstein, Mark L.; Kohlbrenner, Ryan; Obajuluwa, Ademola; Xu, Jianwu; Hori, Masatoshi; Baron, Richard

    2010-03-01

    Automatic liver segmentation on CT images is challenging because the liver often abuts other organs of a similar density. Our purpose was to develop an accurate automated liver segmentation scheme for measuring liver volumes. We developed an automated volumetry scheme for the liver in CT based on a five-step schema. First, an anisotropic smoothing filter was applied to portal-venous phase CT images to remove noise while preserving the liver structure, followed by an edge enhancer to enhance the liver boundary. By using the boundary-enhanced image as a speed function, a fast-marching algorithm generated an initial surface that roughly estimated the liver shape. A geodesic-active-contour segmentation algorithm coupled with level-set contour evolution refined the initial surface so as to more precisely fit the liver boundary. The liver volume was calculated based on the refined liver surface. Hepatic CT scans of eighteen prospective liver donors were obtained under a liver transplant protocol with a multi-detector CT system. Automated liver volumes obtained were compared with those manually traced by a radiologist, used as the "gold standard." The mean liver volume obtained with our scheme was 1,520 cc, whereas the mean manual volume was 1,486 cc, with a mean absolute difference of 104 cc (7.0%). CT liver volumetrics based on an automated scheme agreed excellently with "gold-standard" manual volumetrics (intra-class correlation coefficient was 0.95) with no statistically significant difference (p(F<=f)=0.32), and required substantially less completion time. Our automated scheme provides an efficient and accurate way of measuring liver volumes.
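
    The last step of the schema, computing volume from the refined surface, reduces to counting the voxels inside the resulting segmentation and scaling by the voxel size. A hedged sketch with a synthetic mask and assumed CT spacing:

```python
import numpy as np

# Placeholder: a binary liver mask (as produced by the refined level-set surface)
# and the CT voxel spacing in mm (slice thickness, row, column).
mask = np.zeros((40, 512, 512), dtype=bool)
mask[10:30, 150:350, 120:360] = True
spacing_mm = (2.5, 0.74, 0.74)

voxel_volume_cc = np.prod(spacing_mm) / 1000.0     # mm^3 per voxel -> cc
liver_volume_cc = mask.sum() * voxel_volume_cc
print(f"liver volume ≈ {liver_volume_cc:.0f} cc")
```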

  8. Application of decline curve analysis to estimate recovery factors for carbon dioxide enhanced oil recovery

    USGS Publications Warehouse

    Jahediesfanjani, Hossein

    2017-07-17

    In the decline curve analysis (DCA) method of estimating recoverable hydrocarbon volumes, the analyst uses historical production data from a well, lease, group of wells (or pattern), or reservoir and plots production rates against time or cumulative production for the analysis. The DCA of an individual well is founded on the same basis as the fluid-flow principles that are used for pressure-transient analysis of a single well in a reservoir domain and therefore can provide scientifically reasonable and accurate results. However, when used for a group of wells, a lease, or a reservoir, the DCA becomes more of an empirical method. Plots from the DCA reflect the reservoir response to the oil withdrawal (or production) under the prevailing operating and reservoir conditions, and they continue to be good tools for estimating recoverable hydrocarbon volumes and future production rates. For predicting the total recoverable hydrocarbon volume, the DCA results can help the analyst to evaluate the reservoir performance under any of the three phases of reservoir productive life—primary, secondary (waterflood), or tertiary (enhanced oil recovery) phases—so long as the historical production data are sufficient to establish decline trends at the end of the three phases.

  9. Accurate tracking of tumor volume change during radiotherapy by CT-CBCT registration with intensity correction

    NASA Astrophysics Data System (ADS)

    Park, Seyoun; Robinson, Adam; Quon, Harry; Kiess, Ana P.; Shen, Colette; Wong, John; Plishker, William; Shekhar, Raj; Lee, Junghoon

    2016-03-01

    In this paper, we propose a CT-CBCT registration method to accurately predict the tumor volume change based on daily cone-beam CTs (CBCTs) during radiotherapy. CBCT is commonly used to reduce patient setup error during radiotherapy, but its poor image quality impedes accurate monitoring of anatomical changes. Although physician's contours drawn on the planning CT can be automatically propagated to daily CBCTs by deformable image registration (DIR), artifacts in CBCT often cause undesirable errors. To improve the accuracy of the registration-based segmentation, we developed a DIR method that iteratively corrects CBCT intensities by local histogram matching. Three popular DIR algorithms (B-spline, demons, and optical flow) with the intensity correction were implemented on a graphics processing unit for efficient computation. We evaluated their performances on six head and neck (HN) cancer cases. For each case, four trained scientists manually contoured the nodal gross tumor volume (GTV) on the planning CT and on every other fraction CBCT, to which the GTV contours propagated by DIR were compared. The performance was also compared with commercial image registration software based on conventional mutual information (MI), VelocityAI (Varian Medical Systems Inc.). The volume differences (mean ± std in cc) between the average of the manual segmentations and the automatic segmentations are 3.70 ± 2.30 (B-spline), 1.25 ± 1.78 (demons), 0.93 ± 1.14 (optical flow), and 4.39 ± 3.86 (VelocityAI). The proposed method significantly reduced the estimation error by 9% (B-spline), 38% (demons), and 51% (optical flow) over the results using VelocityAI. Although demonstrated only on HN nodal GTVs, the results imply that the proposed method can produce improved segmentation of other critical structures over conventional methods.
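
    The intensity-correction idea, matching CBCT intensities to the registered planning CT block by block, can be sketched with scikit-image's histogram matching. This is a crude stand-in for the authors' iterative local scheme; the block size and the use of 2D slices are assumptions, and the DIR step itself is omitted.

```python
import numpy as np
from skimage.exposure import match_histograms

def block_histogram_match(cbct_slice, ct_slice, block=64):
    """Match each block of a CBCT slice to the corresponding block of the aligned planning-CT slice."""
    corrected = cbct_slice.astype(float).copy()
    for r in range(0, cbct_slice.shape[0], block):
        for c in range(0, cbct_slice.shape[1], block):
            sl = (slice(r, r + block), slice(c, c + block))
            corrected[sl] = match_histograms(cbct_slice[sl].astype(float), ct_slice[sl].astype(float))
    return corrected

# Illustrative call with random arrays standing in for aligned CT/CBCT slices.
rng = np.random.default_rng(1)
print(block_histogram_match(rng.normal(size=(256, 256)), rng.normal(size=(256, 256))).shape)
```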

  10. A Macroecological Analysis of SERA Derived Forest Heights and Implications for Forest Volume Remote Sensing

    PubMed Central

    Brolly, Matthew; Woodhouse, Iain H.; Niklas, Karl J.; Hammond, Sean T.

    2012-01-01

    Individual trees have been shown to exhibit strong relationships between DBH, height and volume. Often such studies are cited as justification for forest volume or standing biomass estimation through remote sensing. With resolution of common satellite remote sensing systems generally too low to resolve individuals, and a need for larger coverage, these systems rely on descriptive heights, which account for tree collections in forests. For remote sensing and allometric applications, this height is not entirely understood in terms of its location. Here, a forest growth model (SERA) analyzes forest canopy height relationships with forest wood volume. Maximum height, mean, H100, and Lorey's height are examined for variability under plant number density, resource and species. Our findings, shown to be allometrically consistent with empirical measurements for forested communities world-wide, are analyzed for implications to forest remote sensing techniques such as LiDAR and RADAR. Traditional forestry measures of maximum height, and to a lesser extent H100 and Lorey's, exhibit little consistent correlation with forest volume across modeled conditions. The implication is that using forest height to infer volume or biomass from remote sensing requires species and community behavioral information to infer accurate estimates using height alone. SERA predicts mean height to provide the most consistent relationship with volume of the height classifications studied and overall across forest variations. This prediction agrees with empirical data collected from conifer and angiosperm forests with plant densities ranging between 10²–10⁶ plants/hectare and heights 6–49 m. Height classifications investigated are potentially linked to radar scattering centers with implications for allometry. These findings may be used to advance forest biomass estimation accuracy through remote sensing. Furthermore, Lorey's height with its specific relationship to remote sensing physics is recommended as a more universal indicator of volume when using remote sensing than achieved using either maximum height or H100. PMID:22457800
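
    For reference, Lorey's height is the basal-area-weighted mean tree height, which is why it behaves differently from the simple mean or maximum height discussed above. A small illustration with made-up stand data:

```python
import numpy as np

def loreys_height(dbh_cm, height_m):
    """Lorey's height: mean tree height weighted by basal area g_i = pi/4 * DBH_i^2."""
    g = np.pi / 4.0 * (np.asarray(dbh_cm) / 100.0) ** 2     # basal area in m^2
    return float(np.sum(g * np.asarray(height_m)) / np.sum(g))

dbh = [12.0, 25.0, 40.0]        # cm, illustrative trees
h = [9.0, 18.0, 27.0]           # m
print(loreys_height(dbh, h), float(np.mean(h)))  # Lorey's height vs simple mean height
```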

  11. Fuzzy hidden Markov chains segmentation for volume determination and quantitation in PET.

    PubMed

    Hatt, M; Lamare, F; Boussion, N; Turzo, A; Collet, C; Salzenstein, F; Roux, C; Jarritt, P; Carson, K; Cheze-Le Rest, C; Visvikis, D

    2007-06-21

    Accurate volume of interest (VOI) estimation in PET is crucial in different oncology applications such as response to therapy evaluation and radiotherapy treatment planning. The objective of our study was to compare the performance of the proposed algorithm for automatic lesion volume delineation, namely fuzzy hidden Markov chains (FHMC), with that of the current state of the art in clinical practice: threshold-based techniques. Like the classical hidden Markov chain (HMC) algorithm, FHMC takes into account noise, voxel intensity and spatial correlation, in order to classify a voxel as background or functional VOI. However, the novelty of the fuzzy model consists of the inclusion of an estimation of imprecision, which should subsequently lead to a better modelling of the 'fuzzy' nature of the object of interest boundaries in emission tomography data. The performance of the algorithms has been assessed on both simulated and acquired datasets of the IEC phantom, covering a large range of spherical lesion sizes (from 10 to 37 mm), contrast ratios (4:1 and 8:1) and image noise levels. Both lesion activity recovery and VOI determination tasks were assessed in reconstructed images using two different voxel sizes (8 mm³ and 64 mm³). In order to account for both the functional volume location and its size, the concept of % classification errors was introduced in the evaluation of volume segmentation using the simulated datasets. Results reveal that FHMC performs substantially better than the threshold-based methodology for functional volume determination or activity concentration recovery considering a contrast ratio of 4:1 and lesion sizes of <28 mm. Furthermore, differences between classification and volume estimation errors evaluated were smaller for the segmented volumes provided by the FHMC algorithm. Finally, the performance of the automatic algorithms was less susceptible to image noise levels in comparison to the threshold-based techniques. The analysis of both simulated and acquired datasets led to similar results and conclusions as far as the performance of segmentation algorithms under evaluation is concerned.
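
    The clinical baseline that FHMC is compared against is fixed-percentage thresholding of the uptake maximum. A hedged sketch of that baseline follows; the threshold fraction and background handling vary between centres and are assumptions here.

```python
import numpy as np

def threshold_voi(pet, fraction=0.42, background=0.0):
    """Baseline PET VOI delineation: keep voxels above a fixed fraction of the
    background-corrected maximum uptake."""
    pet = np.asarray(pet, dtype=float)
    level = background + fraction * (pet.max() - background)
    return pet >= level

# Illustrative: a noisy uptake sphere in a 64^3 volume.
z, y, x = np.ogrid[:64, :64, :64]
sphere = ((x - 32) ** 2 + (y - 32) ** 2 + (z - 32) ** 2) < 10 ** 2
pet = 8.0 * sphere + np.random.default_rng(2).normal(1.0, 0.2, (64, 64, 64))
print(threshold_voi(pet).sum(), "voxels in the VOI")
```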

  12. Multiple-animal MR imaging using a 3T clinical scanner and multi-channel coil for volumetric analysis in a mouse tumor model.

    PubMed

    Mitsuda, Minoru; Yamaguchi, Masayuki; Furuta, Toshihiro; Nabetani, Akira; Hirayama, Akira; Nozaki, Atsushi; Niitsu, Mamoru; Fujii, Hirofumi

    2011-01-01

    Multiple small-animal magnetic resonance (MR) imaging to measure tumor volume may increase the throughput of preclinical cancer research assessing tumor response to novel therapies. We used a clinical scanner and multi-channel coil to evaluate the usefulness of this imaging to assess experimental tumor volume in mice. We performed a phantom study to assess 2-dimensional (2D) geometric distortion using 9-cm spherical and 32-cell (8×4 one-cm² grids) phantoms using a 3-tesla clinical MR scanner and a dedicated multi-channel coil composed of 16 5-cm circular coils. Employing the multi-channel coil, we simultaneously scanned 6 or 8 mice bearing sarcoma 180 tumors. We estimated tumor volume from the sum of the product of tumor area and slice thickness on 2D spin-echo images (repetition time/echo time, 3500/16 ms; voxel size, 0.195×0.195×1 mm³). After MR acquisition, we excised and weighed tumors, calculated reference tumor volumes from actual tumor weight assuming a density of 1.05 g/cm³, and assessed the correlation between the estimated and reference volumes using Pearson's test. Two-dimensional geometric distortion was acceptable below 5% in the 9-cm spherical phantom and in every cell in the 32-cell phantom. We scanned up to 8 mice simultaneously using the multi-channel coil and found 11 tumors larger than 0.1 g in 12 mice. Tumor volumes (average ± standard deviation) were 1.04 ± 0.73 cm³ estimated by MR imaging and 1.04 ± 0.80 cm³ by reference volume, and the two were highly correlated (correlation coefficient, 0.995; P<0.01, Pearson's test). Use of multiple small-animal MR imaging employing a clinical scanner and multi-channel coil enabled accurate assessment of experimental tumor volume in a large number of mice and may facilitate high-throughput monitoring of tumor response to therapy in preclinical research.
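
    Both volume estimates in this study come down to simple arithmetic: summed per-slice tumor areas times the slice thickness for MR, and mass divided by an assumed density for the excised reference. A sketch with illustrative numbers (not the study data):

```python
import numpy as np

# MR estimate: sum of per-slice tumor areas (cm^2) times the 1 mm (0.1 cm) slice thickness.
slice_areas_cm2 = np.array([0.0, 0.8, 1.9, 2.4, 2.1, 1.2, 0.3])
mr_volume_cm3 = slice_areas_cm2.sum() * 0.1

# Reference estimate from excised tumor weight, assuming a density of 1.05 g/cm^3.
reference_cm3 = 0.91 / 1.05                       # a hypothetical 0.91 g tumor

print(f"MR: {mr_volume_cm3:.2f} cm^3, reference: {reference_cm3:.2f} cm^3")
```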

  13. A macroecological analysis of SERA derived forest heights and implications for forest volume remote sensing.

    PubMed

    Brolly, Matthew; Woodhouse, Iain H; Niklas, Karl J; Hammond, Sean T

    2012-01-01

    Individual trees have been shown to exhibit strong relationships between DBH, height and volume. Often such studies are cited as justification for forest volume or standing biomass estimation through remote sensing. With resolution of common satellite remote sensing systems generally too low to resolve individuals, and a need for larger coverage, these systems rely on descriptive heights, which account for tree collections in forests. For remote sensing and allometric applications, this height is not entirely understood in terms of its location. Here, a forest growth model (SERA) analyzes forest canopy height relationships with forest wood volume. Maximum height, mean, H₁₀₀, and Lorey's height are examined for variability under plant number density, resource and species. Our findings, shown to be allometrically consistent with empirical measurements for forested communities world-wide, are analyzed for implications to forest remote sensing techniques such as LiDAR and RADAR. Traditional forestry measures of maximum height, and to a lesser extent H₁₀₀ and Lorey's, exhibit little consistent correlation with forest volume across modeled conditions. The implication is that using forest height to infer volume or biomass from remote sensing requires species and community behavioral information to infer accurate estimates using height alone. SERA predicts mean height to provide the most consistent relationship with volume of the height classifications studied and overall across forest variations. This prediction agrees with empirical data collected from conifer and angiosperm forests with plant densities ranging between 10²-10⁶ plants/hectare and heights 6-49 m. Height classifications investigated are potentially linked to radar scattering centers with implications for allometry. These findings may be used to advance forest biomass estimation accuracy through remote sensing. Furthermore, Lorey's height with its specific relationship to remote sensing physics is recommended as a more universal indicator of volume when using remote sensing than achieved using either maximum height or H₁₀₀.

  14. On the accuracy of estimation of basic pharmacokinetic parameters by the traditional noncompartmental equations and the prediction of the steady-state volume of distribution in obese patients based upon data derived from normal subjects.

    PubMed

    Berezhkovskiy, Leonid M

    2011-06-01

    The steady-state and terminal volumes of distribution, as well as the mean residence time of drug in the body (V(ss), V(β), and MRT) are the common pharmacokinetic parameters calculated using the drug plasma concentration-time profile C(p)(t) following intravenous (i.v. bolus or constant rate infusion) drug administration. These calculations are valid for the linear pharmacokinetic system with central elimination (i.e., elimination rate being proportional to drug concentration in plasma). Formally, the assumption of central elimination is not normally met because the rate of drug elimination is proportional to the unbound drug concentration at the elimination site, although equilibration between the systemic circulation and the site of clearance is fast for the majority of small-molecule drugs. Thus, the assumption of central elimination is practically quite adequate. It appears reasonable to estimate the extent of possible errors in determination of these pharmacokinetic parameters due to the absence of central elimination. The comparison of V(ss), V(β), and MRT calculated by exact equations and the commonly used ones was made considering a simplified physiologically based pharmacokinetic model. It was found that if the drug plasma concentration profile is detected accurately, determination of drug distribution volumes and MRT using the traditional noncompartmental calculations of these parameters from C(p)(t) yields values very close to those obtained from the exact equations. In practice, though, accurate measurement of C(p)(t), especially its terminal phase, may not always be possible. This is particularly applicable for obtaining the distribution volumes of lipophilic compounds in obese subjects, where a late terminal phase at low drug concentrations is quite likely, particularly for compounds with high clearance. An accurate determination of V(ss) is much needed in clinical practice because it is critical for the proper selection of the drug treatment regimen. For that reason, we developed a convenient method for calculation of V(ss) in obese (or underweight) subjects. It is based on using the V(ss) values obtained from pharmacokinetic studies in normal subjects and the physicochemical properties of the drug molecule. A simple criterion that determines either the increase or decrease of V(ss) (per unit body weight) due to obesity is obtained. The accurate determination of the adipose tissue-plasma partition coefficient is crucial for the practical application of the suggested method. Copyright © 2011 Wiley-Liss, Inc.
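
    The "traditional noncompartmental calculations" referred to here are moment integrals of the plasma profile: AUC and AUMC by the trapezoidal rule, MRT = AUMC/AUC, CL = Dose/AUC and V(ss) = CL x MRT for an i.v. bolus. The sketch below uses a synthetic biexponential profile and ignores extrapolation of the terminal phase, which is exactly the practical difficulty the abstract points out.

```python
import numpy as np

def noncompartmental_iv_bolus(t, cp, dose):
    """Traditional noncompartmental parameters from Cp(t) after an i.v. bolus
    (trapezoidal rule, no terminal-phase extrapolation)."""
    auc = np.trapz(cp, t)            # area under Cp(t)
    aumc = np.trapz(t * cp, t)       # area under t * Cp(t)
    mrt = aumc / auc                 # mean residence time
    cl = dose / auc                  # clearance
    vss = cl * mrt                   # steady-state volume of distribution
    return auc, mrt, cl, vss

t = np.array([0.083, 0.25, 0.5, 1.0, 2.0, 4.0, 8.0, 12.0, 24.0])   # h
cp = 10.0 * np.exp(-0.3 * t) + 2.0 * np.exp(-0.05 * t)             # mg/L, synthetic profile
print(noncompartmental_iv_bolus(t, cp, dose=100.0))
```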

  15. Analysis of volumetric response of pituitary adenomas receiving adjuvant CyberKnife stereotactic radiosurgery with the application of an exponential fitting model.

    PubMed

    Yu, Yi-Lin; Yang, Yun-Ju; Lin, Chin; Hsieh, Chih-Chuan; Li, Chiao-Zhu; Feng, Shao-Wei; Tang, Chi-Tun; Chung, Tzu-Tsao; Ma, Hsin-I; Chen, Yuan-Hao; Ju, Da-Tong; Hueng, Dueng-Yuan

    2017-01-01

    Tumor control rates of pituitary adenomas (PAs) receiving adjuvant CyberKnife stereotactic radiosurgery (CK SRS) are high. However, there is currently no uniform way to estimate the time course of the disease. The aim of this study was to analyze the volumetric responses of PAs after CK SRS and to investigate the application of an exponential fitting model for calculating an accurate time course and estimating the eventual outcome. A retrospective review of 34 patients with PAs who received adjuvant CK SRS between 2006 and 2013 was performed. Tumor volume was calculated using the planimetric method. The percent change in tumor volume and the tumor volume rate of change were compared at median 4-, 10-, 20-, and 36-month intervals. Tumor responses were classified as: progression for >15% volume increase, regression for >15% volume decrease, and stabilization for changes within ±15% of the baseline volume at the time of last follow-up. For each patient, the volumetric change versus time was fitted with an exponential model. The overall tumor control rate was 94.1% in the 36-month (range 18-87 months) follow-up period (mean volume change of -43.3%). Volume regression (mean decrease of -50.5%) was demonstrated in 27 (79%) patients, tumor stabilization (mean change of -3.7%) in 5 (15%) patients, and tumor progression (mean increase of 28.1%) in 2 (6%) patients (P = 0.001). Tumors that eventually regressed or stabilized had a temporary volume increase of 1.07% and 41.5% at 4 months after CK SRS, respectively (P = 0.017). The tumor volume estimated using the exponential fitting equation demonstrated high positive correlation with the actual volume calculated by magnetic resonance imaging (MRI), as tested by the Pearson correlation coefficient (0.9). Transient progression of PAs was seen in 62.5% of the patients receiving CK SRS, and it was not predictive of eventual volume regression or progression. A three-point exponential model is of potential predictive value according to relative distribution. An exponential decay model can be used to calculate the time course of tumors that are ultimately controlled.
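
    A hedged sketch of fitting serial tumor volumes with an exponential model of the kind described, here parameterized as a decay from the baseline volume toward a plateau. The parameterization and the follow-up values are assumptions, not the study's fitted equation.

```python
import numpy as np
from scipy.optimize import curve_fit

def exp_model(t, v_plateau, v0, k):
    """Tumor volume decaying exponentially from v0 toward a plateau v_plateau."""
    return v_plateau + (v0 - v_plateau) * np.exp(-k * t)

t_months = np.array([0.0, 4.0, 10.0, 20.0, 36.0])
volume_cc = np.array([3.2, 3.0, 2.3, 1.9, 1.8])        # illustrative follow-up volumes

params, _ = curve_fit(exp_model, t_months, volume_cc, p0=(1.5, 3.2, 0.1))
print(dict(zip(("v_plateau", "v0", "k"), np.round(params, 3))))
```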

  16. Joint Multi-Fiber NODDI Parameter Estimation and Tractography Using the Unscented Information Filter

    PubMed Central

    Reddy, Chinthala P.; Rathi, Yogesh

    2016-01-01

    Tracing white matter fiber bundles is an integral part of analyzing brain connectivity. An accurate estimate of the underlying tissue parameters is also paramount in several neuroscience applications. In this work, we propose to use a joint fiber model estimation and tractography algorithm that uses the NODDI (neurite orientation dispersion and density imaging) model to estimate fiber orientation dispersion consistently and smoothly along the fiber tracts along with estimating the intracellular and extracellular volume fractions from the diffusion signal. While the NODDI model has been used in earlier works to estimate the microstructural parameters at each voxel independently, for the first time, we propose to integrate it into a tractography framework. We extend this framework to estimate the NODDI parameters for two crossing fibers, which is imperative to trace fiber bundles through crossings as well as to estimate the microstructural parameters for each fiber bundle separately. We propose to use the unscented information filter (UIF) to accurately estimate the model parameters and perform tractography. The proposed approach has significant computational performance improvements as well as numerical robustness over the unscented Kalman filter (UKF). Our method not only estimates the confidence in the estimated parameters via the covariance matrix, but also provides the Fisher-information matrix of the state variables (model parameters), which can be quite useful to measure model complexity. Results from in-vivo human brain data sets demonstrate the ability of our algorithm to trace through crossing fiber regions, while estimating orientation dispersion and other biophysical model parameters in a consistent manner along the tracts. PMID:27147956

  17. Joint Multi-Fiber NODDI Parameter Estimation and Tractography Using the Unscented Information Filter.

    PubMed

    Reddy, Chinthala P; Rathi, Yogesh

    2016-01-01

    Tracing white matter fiber bundles is an integral part of analyzing brain connectivity. An accurate estimate of the underlying tissue parameters is also paramount in several neuroscience applications. In this work, we propose to use a joint fiber model estimation and tractography algorithm that uses the NODDI (neurite orientation dispersion and density imaging) model to estimate fiber orientation dispersion consistently and smoothly along the fiber tracts along with estimating the intracellular and extracellular volume fractions from the diffusion signal. While the NODDI model has been used in earlier works to estimate the microstructural parameters at each voxel independently, for the first time, we propose to integrate it into a tractography framework. We extend this framework to estimate the NODDI parameters for two crossing fibers, which is imperative to trace fiber bundles through crossings as well as to estimate the microstructural parameters for each fiber bundle separately. We propose to use the unscented information filter (UIF) to accurately estimate the model parameters and perform tractography. The proposed approach has significant computational performance improvements as well as numerical robustness over the unscented Kalman filter (UKF). Our method not only estimates the confidence in the estimated parameters via the covariance matrix, but also provides the Fisher-information matrix of the state variables (model parameters), which can be quite useful to measure model complexity. Results from in-vivo human brain data sets demonstrate the ability of our algorithm to trace through crossing fiber regions, while estimating orientation dispersion and other biophysical model parameters in a consistent manner along the tracts.

  18. Fast surface-based travel depth estimation algorithm for macromolecule surface shape description.

    PubMed

    Giard, Joachim; Alface, Patrice Rondao; Gala, Jean-Luc; Macq, Benoît

    2011-01-01

    Travel Depth, introduced by Coleman and Sharp in 2006, is a physical interpretation of molecular depth, a term frequently used to describe the shape of a molecular active site or binding site. Travel Depth can be seen as the physical distance a solvent molecule would have to travel from a point of the surface, i.e., the Solvent-Excluded Surface (SES), to its convex hull. Existing algorithms providing an estimation of the Travel Depth are based on a regular sampling of the molecule volume and the use of Dijkstra's shortest-path algorithm. Since Travel Depth is only defined on the molecular surface, this volume-based approach is characterized by a large computational complexity due to the processing of unnecessary samples lying inside or outside the molecule. In this paper, we propose a surface-based approach that restricts the processing to data defined on the SES. This algorithm significantly reduces the complexity of Travel Depth estimation and makes high-resolution surface shape description of large macromolecules practical. Experimental results show that compared to existing methods, the proposed algorithm achieves accurate estimations with considerably reduced processing times.
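
    The surface-based idea is to run a Dijkstra-style propagation over SES vertices only, seeded at the convex hull. Below is a generic sketch of that propagation on a vertex adjacency graph; the mesh construction and hull detection are omitted.

```python
import heapq

def surface_dijkstra(adjacency, hull_vertices):
    """Shortest path distances over a surface graph.
    adjacency: {vertex: [(neighbor, edge_length), ...]}; hull_vertices: seeds with depth 0."""
    dist = {v: float("inf") for v in adjacency}
    heap = []
    for s in hull_vertices:
        dist[s] = 0.0
        heapq.heappush(heap, (0.0, s))
    while heap:
        d, v = heapq.heappop(heap)
        if d > dist[v]:
            continue                      # stale heap entry
        for u, w in adjacency[v]:
            if d + w < dist[u]:
                dist[u] = d + w
                heapq.heappush(heap, (d + w, u))
    return dist

# Tiny illustrative graph: vertex 0 lies on the hull, 1 and 2 are deeper surface points.
print(surface_dijkstra({0: [(1, 1.0)], 1: [(0, 1.0), (2, 2.0)], 2: [(1, 2.0)]}, hull_vertices=[0]))
```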

  19. Equations for calculating hydrogeochemical reactions of minerals and gases such as CO2 at high pressures and temperatures

    USGS Publications Warehouse

    Appelo, C.A.J.; Parkhurst, David L.; Post, V.E.A.

    2014-01-01

    Calculating the solubility of gases and minerals at the high pressures of carbon capture and storage in geological reservoirs requires an accurate description of the molar volumes of aqueous species and the fugacity coefficients of gases. Existing methods for calculating the molar volumes of aqueous species are limited to a specific concentration matrix (often seawater), have been fit for a limited temperature (below 60 °C) or pressure range, apply only at infinite dilution, or are defined for salts instead of individual ions. A more general and reliable calculation of apparent molar volumes of single ions is presented, based on a modified Redlich–Rosenfeld equation. The modifications consist of (1) using the Born equation to calculate the temperature dependence of the intrinsic volumes, following Helgeson–Kirkham–Flowers (HKF), but with Bradley and Pitzer's expression for the dielectric permittivity of water, (2) using the pressure dependence of the extended Debye–Hückel equation to constrain the limiting slope of the molar volume with ionic strength, and (3) adopting the convention that the proton has zero volume at all ionic strengths, temperatures and pressures. The modifications substantially reduce the number of fitting parameters, while maintaining or even extending the range of temperature and pressure over which molar volumes can be accurately estimated. The coefficients in the HKF-modified-Redlich–Rosenfeld equation were fitted by least-squares on measured solution densities. The limiting volume and attraction factor in the Van der Waals equation of state can be estimated with the Peng–Robinson approach from the critical temperature, pressure, and acentric factor of a gas. The Van der Waals equation can then be used to determine the fugacity coefficients for pure gases and gases in a mixture, and the solubility of the gas can be calculated from the fugacity, the molar volume in aqueous solution, and the equilibrium constant. The coefficients for the Peng–Robinson equations are readily available in the literature. The required equations have been implemented in PHREEQC, version 3, and the parameters for calculating the partial molar volumes and fugacity coefficients have been added to the databases that are distributed with PHREEQC. The ease of use and power of the formulation are illustrated by calculating the solubility of CO2 at high pressures and temperatures, and comparing with well-known examples from the geochemical literature. The equations and parameterizations are suitable for wide application in hydrogeochemical systems, especially in the field of carbon capture and storage.
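
    The Peng–Robinson route sketched in the abstract (critical constants and acentric factor, then the attraction and co-volume parameters, then a cubic in Z, then the fugacity coefficient) can be illustrated for a pure gas as below. The formulas are the standard textbook ones rather than the PHREEQC implementation; the CO2 constants and conditions are illustrative.

```python
import numpy as np

R = 8.314462618  # J/(mol K)

def pr_fugacity_coefficient(T, P, Tc, Pc, omega):
    """Fugacity coefficient of a pure gas from the Peng-Robinson equation of state."""
    kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega**2
    alpha = (1.0 + kappa * (1.0 - np.sqrt(T / Tc))) ** 2
    a = 0.45724 * R**2 * Tc**2 / Pc * alpha
    b = 0.07780 * R * Tc / Pc
    A, B = a * P / (R * T) ** 2, b * P / (R * T)
    # Z^3 - (1 - B) Z^2 + (A - 3B^2 - 2B) Z - (A B - B^2 - B^3) = 0; take the largest real root.
    roots = np.roots([1.0, -(1.0 - B), A - 3 * B**2 - 2 * B, -(A * B - B**2 - B**3)])
    Z = max(r.real for r in roots if abs(r.imag) < 1e-10)
    ln_phi = (Z - 1.0 - np.log(Z - B)
              - A / (2.0 * np.sqrt(2.0) * B)
              * np.log((Z + (1 + np.sqrt(2)) * B) / (Z + (1 - np.sqrt(2)) * B)))
    return np.exp(ln_phi)

# CO2 (Tc ≈ 304.13 K, Pc ≈ 7.377 MPa, omega ≈ 0.224) at 323.15 K and 10 MPa.
print(pr_fugacity_coefficient(323.15, 1.0e7, 304.13, 7.377e6, 0.224))
```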

  20. Equations for calculating hydrogeochemical reactions of minerals and gases such as CO2 at high pressures and temperatures

    NASA Astrophysics Data System (ADS)

    Appelo, C. A. J.; Parkhurst, D. L.; Post, V. E. A.

    2014-01-01

    Calculating the solubility of gases and minerals at the high pressures of carbon capture and storage in geological reservoirs requires an accurate description of the molar volumes of aqueous species and the fugacity coefficients of gases. Existing methods for calculating the molar volumes of aqueous species are limited to a specific concentration matrix (often seawater), have been fit for a limited temperature (below 60 °C) or pressure range, apply only at infinite dilution, or are defined for salts instead of individual ions. A more general and reliable calculation of apparent molar volumes of single ions is presented, based on a modified Redlich-Rosenfeld equation. The modifications consist of (1) using the Born equation to calculate the temperature dependence of the intrinsic volumes, following Helgeson-Kirkham-Flowers (HKF), but with Bradley and Pitzer’s expression for the dielectric permittivity of water, (2) using the pressure dependence of the extended Debye-Hückel equation to constrain the limiting slope of the molar volume with ionic strength, and (3) adopting the convention that the proton has zero volume at all ionic strengths, temperatures and pressures. The modifications substantially reduce the number of fitting parameters, while maintaining or even extending the range of temperature and pressure over which molar volumes can be accurately estimated. The coefficients in the HKF-modified-Redlich-Rosenfeld equation were fitted by least-squares on measured solution densities. The limiting volume and attraction factor in the Van der Waals equation of state can be estimated with the Peng-Robinson approach from the critical temperature, pressure, and acentric factor of a gas. The Van der Waals equation can then be used to determine the fugacity coefficients for pure gases and gases in a mixture, and the solubility of the gas can be calculated from the fugacity, the molar volume in aqueous solution, and the equilibrium constant. The coefficients for the Peng-Robinson equations are readily available in the literature. The required equations have been implemented in PHREEQC, version 3, and the parameters for calculating the partial molar volumes and fugacity coefficients have been added to the databases that are distributed with PHREEQC. The ease of use and power of the formulation are illustrated by calculating the solubility of CO2 at high pressures and temperatures, and comparing with well-known examples from the geochemical literature. The equations and parameterizations are suitable for wide application in hydrogeochemical systems, especially in the field of carbon capture and storage.

  1. Automated digital volume measurement of melanoma metastases in sentinel nodes predicts disease recurrence and survival.

    PubMed

    Riber-Hansen, Rikke; Nyengaard, Jens R; Hamilton-Dutoit, Stephen J; Sjoegren, Pia; Steiniche, Torben

    2011-09-01

    Total metastatic volume (TMV) is an important prognostic factor in melanoma sentinel lymph nodes (SLNs) that avoids both the interobserver variation and unidirectional upstaging seen when using semi-quantitative size estimates. However, it is somewhat laborious for routine application. Our aim was to investigate whether digital image analysis can estimate TMV accurately in melanoma SLNs. TMV was measured in 147 SLNs from 95 patients both manually and by automated digital image analysis. The results were compared by Bland-Altman plots (numerical data) and kappa statistics (categorical data). In addition, disease-free and melanoma-specific survivals were calculated. Mean metastatic volume per patient was 10.6 mm³ (median 0.05 mm³; range 0.0001-621.3 mm³) and 9.62 mm³ (median 0.05 mm³; range 0.00001-564.3 mm³) with manual and digital measurement, respectively. The Bland-Altman plot showed an even distribution of the differences, and the kappa statistic was 0.84. In multivariate analysis, both manual and digital metastasis volume measurements were independent progression markers when corrected for primary tumour thickness [manual: hazard ratio (HR): 1.21, 95% confidence interval (CI): 1.07-1.36, P = 0.002; digital: HR: 1.21, 95% CI: 1.06-1.37, P = 0.004]. Stereology-based, automated digital metastasis volume measurement in melanoma SLNs predicts disease recurrence and survival. © 2011 Blackwell Publishing Limited.
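
    The Bland-Altman comparison used above is the mean difference between the paired measurements (the bias) together with limits of agreement at ±1.96 standard deviations. A hedged sketch with made-up paired volumes; in practice, data spanning several orders of magnitude would usually be log-transformed first.

```python
import numpy as np

manual = np.array([0.01, 0.05, 0.4, 2.0, 15.0, 120.0])     # mm^3, illustrative paired volumes
digital = np.array([0.012, 0.04, 0.45, 1.8, 16.5, 110.0])

diff = digital - manual
mean_pair = (digital + manual) / 2.0       # x-axis of a Bland-Altman plot
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)              # half-width of the limits of agreement

print(f"bias = {bias:.2f} mm^3, limits of agreement = [{bias - loa:.2f}, {bias + loa:.2f}] mm^3")
```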

  2. Analysis of area level and unit level models for small area estimation in forest inventories assisted with LiDAR auxiliary information.

    PubMed

    Mauro, Francisco; Monleon, Vicente J; Temesgen, Hailemariam; Ford, Kevin R

    2017-01-01

    Forest inventories require estimates and measures of uncertainty for subpopulations such as management units. These units often times hold a small sample size, so they should be regarded as small areas. When auxiliary information is available, different small area estimation methods have been proposed to obtain reliable estimates for small areas. Unit level empirical best linear unbiased predictors (EBLUP) based on plot or grid unit level models have been studied more thoroughly than area level EBLUPs, where the modelling occurs at the management unit scale. Area level EBLUPs do not require a precise plot positioning and allow the use of variable radius plots, thus reducing fieldwork costs. However, their performance has not been examined thoroughly. We compared unit level and area level EBLUPs, using LiDAR auxiliary information collected for inventorying 98,104 ha coastal coniferous forest. Unit level models were consistently more accurate than area level EBLUPs, and area level EBLUPs were consistently more accurate than field estimates except for large management units that held a large sample. For stand density, volume, basal area, quadratic mean diameter, mean height and Lorey's height, root mean squared errors (rmses) of estimates obtained using area level EBLUPs were, on average, 1.43, 2.83, 2.09, 1.40, 1.32 and 1.64 times larger than those based on unit level estimates, respectively. Similarly, direct field estimates had rmses that were, on average, 1.37, 1.45, 1.17, 1.17, 1.26, and 1.38 times larger than rmses of area level EBLUPs. Therefore, area level models can lead to substantial gains in accuracy compared to direct estimates, and unit level models lead to very important gains in accuracy compared to area level models, potentially justifying the additional costs of obtaining accurate field plot coordinates.
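
    The area-level estimator compared here is, in essence, a Fay-Herriot type EBLUP: the direct field estimate for each management unit is shrunk towards a regression prediction built from the LiDAR auxiliary variables, with the shrinkage weight set by the ratio of the model variance to the total variance. A minimal sketch, assuming the sampling variances and the model variance are already known, is shown below; all numbers are hypothetical.

    ```python
    import numpy as np

    y = np.array([210.0, 180.0, 250.0])      # direct field estimates of volume (m^3/ha), hypothetical
    X = np.array([[1.0, 18.2],               # intercept + a LiDAR covariate (e.g. mean canopy height, m)
                  [1.0, 15.1],
                  [1.0, 22.4]])
    D = np.array([400.0, 900.0, 250.0])      # sampling variances of the direct estimates
    sigma2_v = 300.0                         # area-effect (model) variance, assumed known here

    # Weighted least squares fit of the area-level linking model
    w = 1.0 / (sigma2_v + D)
    beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))

    # EBLUP: shrink each direct estimate towards its synthetic regression prediction
    gamma = sigma2_v / (sigma2_v + D)
    eblup = gamma * y + (1.0 - gamma) * (X @ beta)
    print(np.round(eblup, 1))
    ```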

  3. Analysis of area level and unit level models for small area estimation in forest inventories assisted with LiDAR auxiliary information

    PubMed Central

    Monleon, Vicente J.; Temesgen, Hailemariam; Ford, Kevin R.

    2017-01-01

    Forest inventories require estimates and measures of uncertainty for subpopulations such as management units. These units often times hold a small sample size, so they should be regarded as small areas. When auxiliary information is available, different small area estimation methods have been proposed to obtain reliable estimates for small areas. Unit level empirical best linear unbiased predictors (EBLUP) based on plot or grid unit level models have been studied more thoroughly than area level EBLUPs, where the modelling occurs at the management unit scale. Area level EBLUPs do not require a precise plot positioning and allow the use of variable radius plots, thus reducing fieldwork costs. However, their performance has not been examined thoroughly. We compared unit level and area level EBLUPs, using LiDAR auxiliary information collected for inventorying 98,104 ha coastal coniferous forest. Unit level models were consistently more accurate than area level EBLUPs, and area level EBLUPs were consistently more accurate than field estimates except for large management units that held a large sample. For stand density, volume, basal area, quadratic mean diameter, mean height and Lorey’s height, root mean squared errors (rmses) of estimates obtained using area level EBLUPs were, on average, 1.43, 2.83, 2.09, 1.40, 1.32 and 1.64 times larger than those based on unit level estimates, respectively. Similarly, direct field estimates had rmses that were, on average, 1.37, 1.45, 1.17, 1.17, 1.26, and 1.38 times larger than rmses of area level EBLUPs. Therefore, area level models can lead to substantial gains in accuracy compared to direct estimates, and unit level models lead to very important gains in accuracy compared to area level models, potentially justifying the additional costs of obtaining accurate field plot coordinates. PMID:29216290

  4. Partial volume correction and image analysis methods for intersubject comparison of FDG-PET studies

    NASA Astrophysics Data System (ADS)

    Yang, Jun

    2000-12-01

    The partial volume effect is an artifact caused mainly by the limited resolution of the imaging sensor. It biases the measured activity in small structures and around tissue boundaries. In brain FDG-PET studies, and especially in Alzheimer's disease studies where gray matter atrophy is severe, accurate estimation of the cerebral metabolic rate of glucose is even more problematic because of the large partial volume effect. In this dissertation, we developed a framework enabling inter-subject comparison of partial-volume-corrected brain FDG-PET studies. The framework is composed of the following image processing steps: (1) MRI segmentation, (2) MR-PET registration, (3) MR-based PVE correction, (4) MR 3D inter-subject elastic mapping. Through simulation studies, we showed that the newly developed partial volume correction methods, either pixel based or ROI based, performed better than previous methods. By applying this framework to a real Alzheimer's disease study, we demonstrated that the partial-volume-corrected glucose rates vary significantly among the control, at-risk and disease patient groups, and that this framework is a promising tool for assisting early identification of Alzheimer's patients.
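
    An ROI-based correction of the kind mentioned here can be sketched as a geometric-transfer-matrix style calculation (a generic illustration under simple assumptions, not necessarily the dissertation's exact method): each MRI-derived tissue mask is blurred with the scanner's point-spread function, the cross-contamination matrix between regions is built from those blurred masks, and the true regional activities are recovered by solving the resulting linear system.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    # Hypothetical 2D tissue masks from an MRI segmentation (gray matter, white matter, CSF)
    shape = (64, 64)
    gm = np.zeros(shape); gm[20:44, 20:44] = 1.0
    wm = np.zeros(shape); wm[28:36, 28:36] = 1.0; gm -= wm     # WM nested inside GM
    csf = 1.0 - gm - wm

    masks = [gm, wm, csf]
    true_activity = np.array([8.0, 3.0, 0.5])                  # hypothetical tracer uptake per tissue
    psf_sigma = 2.0                                            # scanner resolution model (pixels)

    # Simulate an observed PET image: true activity blurred by the PSF
    pet = gaussian_filter(sum(a * m for a, m in zip(true_activity, masks)), psf_sigma)

    # Geometric transfer matrix: mean of each blurred mask within each ROI
    blurred = [gaussian_filter(m, psf_sigma) for m in masks]
    G = np.array([[blurred[j][masks[i] > 0].mean() for j in range(3)] for i in range(3)])
    observed_means = np.array([pet[m > 0].mean() for m in masks])

    corrected = np.linalg.solve(G, observed_means)             # PV-corrected regional activities
    print("observed ROI means :", np.round(observed_means, 2))
    print("PV-corrected values:", np.round(corrected, 2))
    ```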

  5. Simultaneous pressure-volume measurements using optical sensors and MRI for left ventricle function assessment during animal experiment.

    PubMed

    Abi-Abdallah Rodriguez, Dima; Durand, Emmanuel; de Rochefort, Ludovic; Boudjemline, Younes; Mousseaux, Elie

    2015-01-01

    Simultaneous pressure and volume measurements enable the extraction of valuable parameters for left ventricle function assessment. Cardiac MR has proven to be the most accurate method for volume estimation. Nonetheless, measuring pressure simultaneously during MRI acquisitions remains a challenge given the magnetic nature of the widely used pressure transducers. In this study we show the feasibility of simultaneous in vivo pressure-volume acquisitions with MRI using optical pressure sensors. Pressure-volume loops were calculated while inducing three inotropic states in a sheep, and functional indices were extracted, using single beat loops, to characterize systolic and diastolic performance. Functional indices evolved as expected in response to positive inotropic stimuli. The end-systolic elastance, representing the contractility index, the diastolic myocardial compliance, and the cardiac work efficiency all increased when inotropic state enhancement was induced. The association of MRI and optical pressure sensors within the left ventricle successfully enabled pressure-volume loop analysis, with both signals recorded simultaneously during the experiment and without the need to move the animal between inotropic states. Copyright © 2014 IPEM. Published by Elsevier Ltd. All rights reserved.
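
    Once pressure and volume are recorded simultaneously, basic beat-by-beat indices follow directly from the loop. The sketch below (synthetic samples, not the animal data) computes the stroke volume and the stroke work as the area enclosed by the pressure-volume loop; more elaborate indices such as the end-systolic elastance are derived from the same paired data.

    ```python
    import numpy as np

    # One synthetic cardiac cycle (volume in mL, pressure in mmHg), ordered in time
    volume   = np.array([120, 118, 117, 116,  80,  60,  58,  60,  90, 115, 120], dtype=float)
    pressure = np.array([ 10,  40,  80, 100, 110, 105,  60,  15,  10,   8,  10], dtype=float)

    stroke_volume = volume.max() - volume.min()          # EDV - ESV

    # Stroke work = area of the closed P-V loop (shoelace formula)
    stroke_work = 0.5 * abs(np.dot(volume, np.roll(pressure, -1))
                            - np.dot(pressure, np.roll(volume, -1)))   # mmHg * mL

    print(f"stroke volume = {stroke_volume:.0f} mL, stroke work = {stroke_work:.0f} mmHg*mL")
    ```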

  6. Improved quantification for local regions of interest in preclinical PET imaging

    NASA Astrophysics Data System (ADS)

    Cal-González, J.; Moore, S. C.; Park, M.-A.; Herraiz, J. L.; Vaquero, J. J.; Desco, M.; Udias, J. M.

    2015-09-01

    In Positron Emission Tomography, there are several causes of quantitative inaccuracy, such as partial volume or spillover effects. The impact of these effects is greater when using radionuclides that have a large positron range, e.g. 68Ga and 124I, which have been increasingly used in the clinic. We have implemented and evaluated a local projection algorithm (LPA), originally evaluated for SPECT, to compensate for both partial-volume and spillover effects in PET. This method is based on the use of a high-resolution CT or MR image, co-registered with a PET image, which permits a high-resolution segmentation of a few tissues within a volume of interest (VOI) centered on a region within which tissue-activity values need to be estimated. The additional boundary information is used to obtain improved activity estimates for each tissue within the VOI, by solving a simple inversion problem. We implemented this algorithm for the preclinical Argus PET/CT scanner and assessed its performance using the radionuclides 18F, 68Ga and 124I. We also evaluated and compared the results obtained when it was applied during the iterative reconstruction, as well as after the reconstruction as a postprocessing procedure. In addition, we studied how LPA can help to reduce the ‘spillover contamination’, which causes inaccurate quantification of lesions in the immediate neighborhood of large, ‘hot’ sources. Quantification was significantly improved by using LPA, which provided more accurate ratios of lesion-to-background activity concentration for hot and cold regions. For 18F, the contrast was improved from 3.0 to 4.0 in hot lesions (when the true ratio was 4.0) and from 0.16 to 0.06 in cold lesions (true ratio  =  0.0), when using the LPA postprocessing. Furthermore, activity values estimated within the VOI using LPA during reconstruction were slightly more accurate than those obtained by post-processing, while also visually improving the image contrast and uniformity within the VOI.

  7. Improved quantification for local regions of interest in preclinical PET imaging

    PubMed Central

    Cal-González, J.; Moore, S. C.; Park, M.-A.; Herraiz, J. L.; Vaquero, J. J.; Desco, M.; Udias, J. M.

    2015-01-01

    In Positron Emission Tomography, there are several causes of quantitative inaccuracy, such as partial volume or spillover effects. The impact of these effects is greater when using radionuclides that have a large positron range, e.g., 68Ga and 124I, which have been increasingly used in the clinic. We have implemented and evaluated a local projection algorithm (LPA), originally evaluated for SPECT, to compensate for both partial-volume and spillover effects in PET. This method is based on the use of a high-resolution CT or MR image, co-registered with a PET image, which permits a high-resolution segmentation of a few tissues within a volume of interest (VOI) centered on a region within which tissue-activity values need to be estimated. The additional boundary information is used to obtain improved activity estimates for each tissue within the VOI, by solving a simple inversion problem. We implemented this algorithm for the preclinical Argus PET/CT scanner and assessed its performance using the radionuclides 18F, 68Ga and 124I. We also evaluated and compared the results obtained when it was applied during the iterative reconstruction, as well as after the reconstruction as a postprocessing procedure. In addition, we studied how LPA can help to reduce the “spillover contamination”, which causes inaccurate quantification of lesions in the immediate neighborhood of large, “hot” sources. Quantification was significantly improved by using LPA, which provided more accurate ratios of lesion-to-background activity concentration for hot and cold regions. For 18F, the contrast was improved from 3.0 to 4.0 in hot lesions (when the true ratio was 4.0) and from 0.16 to 0.06 in cold lesions (true ratio = 0.0), when using the LPA postprocessing. Furthermore, activity values estimated within the VOI using LPA during reconstruction were slightly more accurate than those obtained by post-processing, while also visually improving the image contrast and uniformity within the VOI. PMID:26334312

  8. Determination of void volume in normal phase liquid chromatography.

    PubMed

    Jiang, Ping; Wu, Di; Lucy, Charles A

    2014-01-10

    Void volume is an important fundamental parameter in chromatography. Little prior discussion has focused on the determination of void volume in normal phase liquid chromatography (NPLC). Various methods to estimate the total void volume are compared: pycnometry; the minor disturbance method based on injection of the weak solvent; the tracer pulse method; hold-up volume based on unretained compounds; and accessible volume based on Martin's rule and its descendants. These are applied to NPLC on silica, RingSep and DNAP columns. Pycnometry provides a theoretical maximum value for the total void volume and should be performed at least once for each new column. However, pycnometry does not reflect the volume of adsorbed strong solvent on the stationary phase, and so only yields an accurate void volume for weaker mobile phase conditions. 1,3,5-Tri-t-butyl benzene (TTBB) results in hold-up volumes that are convenient measures of the void volume for all eluent conditions on charge-transfer columns (RingSep and DNAP), but is weakly retained under weak eluent conditions on silica. Injection of the weak mobile phase component (hexane) may be used to determine void volume, but care must be exercised to select the appropriate disturbance feature. Accessible volumes, which are determined using a homologous series, are always biased low and are not recommended as a measure of the void volume. Copyright © 2013 Elsevier B.V. All rights reserved.
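
    The pycnometric determination recommended here as a once-per-column check is simple arithmetic: weigh the column filled successively with two solvents of different density, and divide the mass difference by the density difference. The sketch below uses illustrative numbers and typical handbook densities, not values from the paper.

    ```python
    def pycnometric_void_volume(mass_solvent1_g, mass_solvent2_g,
                                density_solvent1_g_per_ml, density_solvent2_g_per_ml):
        """Total void volume (mL) from the two column weighings."""
        return (mass_solvent1_g - mass_solvent2_g) / (density_solvent1_g_per_ml - density_solvent2_g_per_ml)

    # e.g. a column weighed filled with dichloromethane (~1.325 g/mL) and with methanol (~0.791 g/mL)
    v0_ml = pycnometric_void_volume(95.30, 93.85, 1.325, 0.791)
    print(f"total void volume ~ {v0_ml:.2f} mL")
    ```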

  9. Monitoring and Estimation of Soil Losses from Ephemeral Gully Erosion in Mediterranean Region Using Low Altitude Unmanned Aerial Vehicles

    NASA Astrophysics Data System (ADS)

    Gündoğan, R.; Alma, V.; Dindaroğlu, T.; Günal, H.; Yakupoğlu, T.; Susam, T.; Saltalı, K.

    2017-11-01

    Measuring gullies from remote sensing images obtained from satellite or conventional aerial platforms is often not possible, because gullies in agricultural fields, known as ephemeral (temporary) gullies, are filled in within a very short time by tillage operations. Therefore, fast and accurate estimation of sediment loss from temporary gully erosion is of great importance. In this study, we aimed to monitor and calculate soil losses caused by gully erosion in agricultural areas using low-altitude unmanned aerial vehicles. According to the calculation with Pix4D, the gully volume was estimated to be 10.41 m³ and the total loss of soil was estimated to be 14.47 Mg. The RMSE of the estimations was 0.89. The results indicated that unmanned aerial vehicles can be used to quantify temporary gully erosion and the associated soil losses.
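
    The volume calculation behind such UAV surveys is essentially DEM differencing; note that the reported 10.41 m³ and 14.47 Mg imply an effective soil bulk density of roughly 1.39 Mg m⁻³. The sketch below illustrates the principle on synthetic rasters; the cell size and bulk density are assumptions, and this is not the Pix4D workflow itself.

    ```python
    import numpy as np

    cell_area_m2 = 0.05 * 0.05            # e.g. 5 cm ground sampling distance
    bulk_density_mg_per_m3 = 1.39         # Mg m-3, assumed soil bulk density

    dem_before = np.full((200, 200), 101.00)             # elevations (m), hypothetical
    dem_after = dem_before.copy()
    dem_after[80:120, 40:160] -= 0.22                    # incised gully, 22 cm deep

    depth = np.clip(dem_before - dem_after, 0.0, None)   # keep only lowered (eroded) cells
    gully_volume_m3 = float((depth * cell_area_m2).sum())
    soil_loss_mg = gully_volume_m3 * bulk_density_mg_per_m3
    print(f"volume ~ {gully_volume_m3:.2f} m3, soil loss ~ {soil_loss_mg:.2f} Mg")
    ```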

  10. ARL Summer Student Research Symposium. Volume 2: Compendium of Abstracts

    DTIC Science & Technology

    2012-08-01

    7 Acoustic Localization with Compensation for Wind Au, Brandon Accurate localization of targets is difficult, as we are often unaware of all the...gathered by the arrays. However, many sources of signal interference add noise. While wind will have a negligible effect on sensors close to the...error over 1 m. However, wind data is often not collected or changes rapidly, so blind wind estimation is calculated to best fit the given data. The

  11. Estimating Marine Aerosol Particle Volume and Number from Maritime Aerosol Network Data

    NASA Technical Reports Server (NTRS)

    Sayer, A. M.; Smirnov, A.; Hsu, N. C.; Munchak, L. A.; Holben, B. N.

    2012-01-01

    As well as spectral aerosol optical depth (AOD), aerosol composition and concentration (number, volume, or mass) are of interest for a variety of applications. However, remote sensing of these quantities is more difficult than for AOD, as it is more sensitive to assumptions relating to aerosol composition. This study uses spectral AOD measured on Maritime Aerosol Network (MAN) cruises, with the additional constraint of a microphysical model for unpolluted maritime aerosol based on analysis of Aerosol Robotic Network (AERONET) inversions, to estimate these quantities over open ocean. When the MAN data are subset to those likely to be comprised of maritime aerosol, number and volume concentrations obtained are physically reasonable. Attempts to estimate surface concentration from columnar abundance, however, are shown to be limited by uncertainties in vertical distribution. Columnar AOD at 550 nm and aerosol number for unpolluted maritime cases are also compared with Moderate Resolution Imaging Spectroradiometer (MODIS) data, for both the present Collection 5.1 and forthcoming Collection 6. MODIS provides a best-fitting retrieval solution, as well as the average for several different solutions, with different aerosol microphysical models. The average solution MODIS dataset agrees more closely with MAN than the best solution dataset. Terra tends to retrieve lower aerosol number than MAN, and Aqua higher, linked with differences in the aerosol models commonly chosen. Collection 6 AOD is likely to agree more closely with MAN over open ocean than Collection 5.1. In situations where spectral AOD is measured accurately, and aerosol microphysical properties are reasonably well-constrained, estimates of aerosol number and volume using MAN or similar data would provide for a greater variety of potential comparisons with aerosol properties derived from satellite or chemistry transport model data.

  12. Spatial and temporal skin blood volume and saturation estimation using a multispectral snapshot imaging camera

    NASA Astrophysics Data System (ADS)

    Ewerlöf, Maria; Larsson, Marcus; Salerud, E. Göran

    2017-02-01

    Hyperspectral imaging (HSI) can estimate the spatial distribution of skin blood oxygenation, using visible to near-infrared light. HSI oximeters often use a liquid-crystal tunable filter, an acousto-optic tunable filter or mechanically adjustable filter wheels, whose response/switching times are too long to monitor tissue hemodynamics. This work aims to evaluate a multispectral snapshot imaging system to estimate skin blood volume and oxygen saturation with high temporal and spatial resolution. We use a snapshot imager, the xiSpec camera (MQ022HG-IM-SM4X4-VIS, XIMEA), having 16 wavelength-specific Fabry-Perot filters overlaid on the custom CMOS-chip. The spectral bands, however, overlap substantially, which needs to be taken into account for an accurate analysis. An inverse Monte Carlo analysis is performed using a two-layered skin tissue model, defined by epidermal thickness, haemoglobin concentration and oxygen saturation, melanin concentration and spectrally dependent reduced-scattering coefficient, all parameters relevant for human skin. The analysis takes into account the spectral detector response of the xiSpec camera. At each spatial location in the field-of-view, we compare the simulated output to the detected diffusely backscattered spectra to find the best fit. The imager is evaluated for spatial and temporal variations during arterial and venous occlusion protocols applied to the forearm. Estimated blood volume changes and oxygenation maps at 512x272 pixels show values that are comparable to reference measurements performed in contact with the skin tissue. We conclude that the snapshot xiSpec camera, paired with an inverse Monte Carlo algorithm, permits us to use this sensor for spatial and temporal measurement of varying physiological parameters, such as skin tissue blood volume and oxygenation.
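
    The per-pixel fitting step described here can be approximated by a look-up-table search: precompute (e.g. by Monte Carlo simulation) the camera-band reflectances for many combinations of the tissue parameters, then assign to each pixel the parameters of the simulated spectrum with the smallest sum of squared differences. The sketch below uses random placeholder data in place of the simulated LUT and the measured snapshot image.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_models, n_bands = 2000, 16
    lut_spectra = rng.uniform(0.05, 0.6, size=(n_models, n_bands))   # simulated band reflectances
    lut_params = rng.uniform(size=(n_models, 2))                     # columns: blood volume fraction, SO2

    measured = rng.uniform(0.05, 0.6, size=(64, 64, n_bands))        # placeholder snapshot image
    flat = measured.reshape(-1, n_bands)

    # Squared-distance matrix via |x - y|^2 = |x|^2 + |y|^2 - 2 x.y (memory friendly)
    d2 = ((flat ** 2).sum(1)[:, None] + (lut_spectra ** 2).sum(1)[None, :]
          - 2.0 * flat @ lut_spectra.T)
    best = d2.argmin(axis=1)

    blood_volume_map = lut_params[best, 0].reshape(64, 64)
    oxygenation_map = lut_params[best, 1].reshape(64, 64)
    print(blood_volume_map.mean(), oxygenation_map.mean())
    ```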

  13. Revisit to three-dimensional percolation theory: Accurate analysis for highly stretchable conductive composite materials

    PubMed Central

    Kim, Sangwoo; Choi, Seongdae; Oh, Eunho; Byun, Junghwan; Kim, Hyunjong; Lee, Byeongmoon; Lee, Seunghwan; Hong, Yongtaek

    2016-01-01

    A percolation theory based on variation of the conductive filler fraction has been widely used to explain the behavior of conductive composite materials under both small and large deformation conditions. However, it typically fails to properly describe the materials under large deformation, since its underlying assumptions may not be valid in that regime. Therefore, we proposed a new three-dimensional percolation theory by considering three key factors: nonlinear elasticity, precisely measured strain-dependent Poisson’s ratio, and a strain-dependent percolation threshold. The digital image correlation (DIC) method was used to determine actual Poisson’s ratios at various strain levels, which were then used to accurately estimate the variation of the conductive filler volume fraction under deformation. We also adopted a strain-dependent percolation threshold caused by filler re-location with deformation. When the three key factors were considered, the change in electrical performance was accurately described for composite materials with both isotropic and anisotropic mechanical properties. PMID:27694856
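
    The three ingredients can be combined in a few lines. The sketch below (parameter values are illustrative assumptions, not the fitted values from the paper) dilutes the filler volume fraction using a measured, strain-dependent Poisson's ratio via the first-order volume change (1 + e)(1 - nu*e)^2, and feeds the result into the classical percolation power law sigma ~ (phi - phi_c)^t with a strain-dependent threshold.

    ```python
    import numpy as np

    def filler_fraction(phi0, strain, poisson_ratio):
        """Filler volume fraction after uniaxial stretch (first-order volume change)."""
        volume_ratio = (1.0 + strain) * (1.0 - poisson_ratio * strain) ** 2
        return phi0 / volume_ratio

    def conductivity(phi, phi_c, sigma0=1.0, t=2.0):
        """Classical percolation power law; zero below the threshold."""
        return np.where(phi > phi_c, sigma0 * (phi - phi_c) ** t, 0.0)

    strains = np.linspace(0.0, 0.5, 6)        # engineering strain, 0-50 %
    nu = 0.40 - 0.10 * strains                # strain-dependent Poisson's ratio (assumed, as if from DIC)
    phi_c = 0.15 + 0.03 * strains             # strain-dependent percolation threshold (assumed)
    phi = filler_fraction(phi0=0.25, strain=strains, poisson_ratio=nu)

    for e, s in zip(strains, conductivity(phi, phi_c)):
        print(f"strain {e:.1f}: relative conductivity {s:.4f}")
    ```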

  14. Three-dimensional proximal flow convergence automatic calculation for determining mitral valve area in rheumatic mitral stenosis.

    PubMed

    Sampaio, Francisco; Ladeiras-Lopes, Ricardo; Almeida, João; Fonseca, Paulo; Fontes-Carvalho, Ricardo; Ribeiro, José; Gama, Vasco

    2017-07-01

    Management of patients with mitral stenosis (MS) depends heavily on the accurate quantification of mitral valve area (MVA) using echocardiography. All currently used two-dimensional (2D) methods have limitations. Estimation of MVA using the proximal isovelocity surface area (PISA) method with real-time three-dimensional (3D) echocardiography may circumvent those limitations. We aimed to evaluate the accuracy of 3D direct measurement of PISA in the estimation of MVA. Twenty-seven consecutive patients (median age of 63 years; 77.8% females) with rheumatic MS were prospectively studied. Transthoracic and transesophageal echocardiography with 2D and 3D acquisitions were performed on the same day. The reference method for MVA quantification was valve planimetry after 3D-volume multiplanar reconstruction. Semi-automated software was used to calculate the 3D flow convergence volume. Compared to MVA estimation using 3D planimetry, 3D PISA showed the best correlation (rho=0.78, P<.0001), followed by pressure half-time (PHT: rho=0.66, P<.001), continuity equation (CE: rho=0.61, P=.003), and 2D PISA (rho=0.26, P=.203). Bland-Altman analysis revealed a good agreement for MVA estimation with 3D PISA (mean difference -0.03 cm²; limits of agreement (LOA) -0.40 to 0.35), in contrast to wider LOA for the 2D methods: CE (mean difference 0.02 cm², LOA -0.56 to 0.60); PHT (mean difference 0.31 cm², LOA -0.32 to 0.95); 2D PISA (mean difference -0.03 cm², LOA -0.92 to 0.86). MVA estimation using 3D PISA was feasible and more accurate than 2D methods. Its introduction in daily clinical practice seems possible and may overcome technical limitations of 2D methods. © 2017, Wiley Periodicals, Inc.
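
    For orientation, the PISA arithmetic being compared is straightforward: flow through the convergence zone is its surface area times the aliasing velocity, and MVA is that flow divided by the peak transmitral velocity. In 2D the surface is assumed hemispheric and corrected for the inflow funnel angle; in 3D it is measured directly. The sketch below uses illustrative numbers, not patient data.

    ```python
    import math

    def mva_pisa_2d(radius_cm, aliasing_velocity_cm_s, peak_velocity_cm_s, funnel_angle_deg=120.0):
        """2D PISA: hemispheric convergence surface with an angle correction."""
        flow = 2.0 * math.pi * radius_cm ** 2 * aliasing_velocity_cm_s * (funnel_angle_deg / 180.0)
        return flow / peak_velocity_cm_s          # cm^2

    def mva_pisa_3d(surface_area_cm2, aliasing_velocity_cm_s, peak_velocity_cm_s):
        """3D PISA: the flow-convergence surface area is measured directly."""
        flow = surface_area_cm2 * aliasing_velocity_cm_s
        return flow / peak_velocity_cm_s          # cm^2

    print(f"2D PISA MVA ~ {mva_pisa_2d(1.1, 33.0, 200.0):.2f} cm^2")
    print(f"3D PISA MVA ~ {mva_pisa_3d(5.0, 33.0, 200.0):.2f} cm^2")
    ```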

  15. Estimating the proportion of groundwater recharge from flood events in relation to total annual recharge in a karst aquifer

    NASA Astrophysics Data System (ADS)

    Dvory, N. Z.; Ronen, A.; Livshitz, Y.; Adar, E.; Kuznetsov, M.; Yakirevich, A.

    2017-12-01

    Sustainable groundwater production from karstic aquifers is primarily dictated by the recharge rate. Therefore, in order to limit over-exploitation, it is essential to accurately quantify groundwater recharge. Infiltration during erratic floods in karstic basins may contribute a substantial amount to aquifer recharge. However, the complicated nature of karst systems, which are characterized in part by multiple springs, sinkholes, and losing/gaining streams, presents a large obstacle to accurately assessing the actual contribution of flood water to groundwater recharge. In this study, we aim to quantify the proportion of groundwater recharge during flood events in relation to the annual recharge for karst aquifers. The role of karst conduits in flash flood infiltration was examined during four flood and artificial runoff events in the Sorek creek near Jerusalem, Israel. The events were monitored in short time steps (four minutes). This high-resolution analysis is essential for accurately estimating surface flow volumes, which are of particular importance in arid and semi-arid climates where ephemeral flows may provide a substantial contribution to the groundwater reservoirs. For the present investigation, we distinguished between direct infiltration, percolation through karst conduits and diffuse infiltration, which is most affected by evapotranspiration. A water balance was then calculated for the 2014/15 hydrologic year using the Hydrologic Engineering Center - Hydrologic Modelling System (HEC-HMS). Simulations show that an additional 8% to 24% of the annual recharge volume is added from runoff losses along the creek that infiltrate through the karst system into the aquifer. The results improve the understanding of recharge processes and support the use of the proposed methodology for quantifying groundwater recharge.

  16. Coplanar electrode microfluidic chip enabling accurate sheathless impedance cytometry.

    PubMed

    De Ninno, Adele; Errico, Vito; Bertani, Francesca Romana; Businaro, Luca; Bisegna, Paolo; Caselli, Federica

    2017-03-14

    Microfluidic impedance cytometry offers a simple non-invasive method for single-cell analysis. Coplanar electrode chips are especially attractive due to ease of fabrication, yielding miniaturized, reproducible, and ultimately low-cost devices. However, their accuracy is challenged by the dependence of the measured signal on the particle trajectory within the interrogation volume, which manifests itself as an error in the estimated particle size unless some form of focusing system is used. In this paper, we present an original five-electrode coplanar chip enabling accurate particle sizing without the need for focusing. The chip layout is designed to provide a distinctive signal shape from which a new metric correlating with particle trajectory can be extracted. This metric is exploited to correct the estimated size of polystyrene beads of 5.2, 6 and 7 μm nominal diameter, reaching coefficients of variation lower than the manufacturers' quoted values. The potential impact of the proposed device in the field of life sciences is demonstrated with an application to Saccharomyces cerevisiae yeast.

  17. A novel phenomenological multi-physics model of Li-ion battery cells

    NASA Astrophysics Data System (ADS)

    Oh, Ki-Yong; Samad, Nassim A.; Kim, Youngki; Siegel, Jason B.; Stefanopoulou, Anna G.; Epureanu, Bogdan I.

    2016-09-01

    A novel phenomenological multi-physics model of Lithium-ion battery cells is developed for control and state estimation purposes. The model can capture electrical, thermal, and mechanical behaviors of battery cells under constrained conditions, e.g., battery pack conditions. Specifically, the proposed model predicts the core and surface temperatures and reaction force induced from the volume change of battery cells because of electrochemically- and thermally-induced swelling. Moreover, the model incorporates the influences of changes in preload and ambient temperature on the force considering severe environmental conditions electrified vehicles face. Intensive experimental validation demonstrates that the proposed multi-physics model accurately predicts the surface temperature and reaction force for a wide operational range of preload and ambient temperature. This high fidelity model can be useful for more accurate and robust state of charge estimation considering the complex dynamic behaviors of the battery cell. Furthermore, the inherent simplicity of the mechanical measurements offers distinct advantages to improve the existing power and thermal management strategies for battery management.

  18. Predicting volume of distribution with decision tree-based regression methods using predicted tissue:plasma partition coefficients.

    PubMed

    Freitas, Alex A; Limbu, Kriti; Ghafourian, Taravat

    2015-01-01

    Volume of distribution is an important pharmacokinetic property that indicates the extent of a drug's distribution in the body tissues. This paper addresses the problem of how to estimate the apparent volume of distribution at steady state (Vss) of chemical compounds in the human body using decision tree-based regression methods from the area of data mining (or machine learning). Hence, the pros and cons of several different types of decision tree-based regression methods have been discussed. The regression methods predict Vss using, as predictive features, both the compounds' molecular descriptors and the compounds' tissue:plasma partition coefficients (Kt:p) - often used in physiologically-based pharmacokinetics. Therefore, this work has assessed whether the data mining-based prediction of Vss can be made more accurate by using as input not only the compounds' molecular descriptors but also (a subset of) their predicted Kt:p values. Comparison of the models that used only molecular descriptors, in particular, the Bagging decision tree (mean fold error of 2.33), with those employing predicted Kt:p values in addition to the molecular descriptors, such as the Bagging decision tree using adipose Kt:p (mean fold error of 2.29), indicated that the use of predicted Kt:p values as descriptors may be beneficial for accurate prediction of Vss using decision trees if prior feature selection is applied. Decision tree based models presented in this work have an accuracy that is reasonable and similar to the accuracy of reported Vss inter-species extrapolations in the literature. The estimation of Vss for new compounds in drug discovery will benefit from methods that are able to integrate large and varied sources of data and flexible non-linear data mining methods such as decision trees, which can produce interpretable models. Graphical Abstract: Decision trees for the prediction of tissue partition coefficient and volume of distribution of drugs.
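
    A minimal sketch of this modelling setup is shown below: a bagged ensemble of regression trees (scikit-learn's BaggingRegressor, whose default base estimator is a decision tree) trained on molecular descriptors plus a predicted log Kt:p feature, and scored with the mean fold error on log-scale Vss. The data are randomly generated placeholders, not the study's compound set.

    ```python
    import numpy as np
    from sklearn.ensemble import BaggingRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n = 300
    descriptors = rng.normal(size=(n, 10))                        # molecular descriptors (placeholder)
    log_ktp_adipose = rng.normal(size=(n, 1))                     # predicted adipose log Kt:p (placeholder)
    X = np.hstack([descriptors, log_ktp_adipose])
    log_vss = 0.4 * X[:, 0] - 0.3 * X[:, 3] + 0.8 * X[:, -1] + rng.normal(scale=0.3, size=n)

    X_tr, X_te, y_tr, y_te = train_test_split(X, log_vss, random_state=0)
    model = BaggingRegressor(n_estimators=100, random_state=0)    # bagged decision trees
    model.fit(X_tr, y_tr)

    # Mean fold error: 10 ** mean(|log10(predicted) - log10(observed)|)
    mfe = 10 ** np.mean(np.abs(model.predict(X_te) - y_te))
    print(f"mean fold error ~ {mfe:.2f}")
    ```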

  19. SPHYNX: an accurate density-based SPH method for astrophysical applications

    NASA Astrophysics Data System (ADS)

    Cabezón, R. M.; García-Senz, D.; Figueira, J.

    2017-10-01

    Aims: Hydrodynamical instabilities and shocks are ubiquitous in astrophysical scenarios. Therefore, an accurate numerical simulation of these phenomena is mandatory to correctly model and understand many astrophysical events, such as supernovas, stellar collisions, or planetary formation. In this work, we attempt to address many of the problems that a commonly used technique, smoothed particle hydrodynamics (SPH), has when dealing with subsonic hydrodynamical instabilities or shocks. To that end, we built a new SPH code named SPHYNX, which includes many of the recent advances in the SPH technique as well as some new ones that we present here. Methods: SPHYNX is of Newtonian type and grounded in the Euler-Lagrange formulation of the smoothed-particle hydrodynamics technique. Its distinctive features are: the use of an integral approach to estimating the gradients; the use of a flexible family of interpolators called sinc kernels, which suppress pairing instability; and the incorporation of a new type of volume element which provides a better partition of unity. Unlike other modern formulations, which consider volume elements linked to pressure, our volume element choice relies on density. SPHYNX is, therefore, a density-based SPH code. Results: A novel computational hydrodynamic code oriented to astrophysical applications is described, discussed, and validated in the following pages. The ensuing code conserves mass, linear and angular momentum, energy, and entropy, and preserves kernel normalization even in strong shocks. In our proposal, the estimation of gradients is enhanced using an integral approach. Additionally, we introduce a new family of volume elements which reduce the so-called tensile instability. Both features help to suppress the damping which often prevents the growth of hydrodynamic instabilities in regular SPH codes. Conclusions: On the whole, SPHYNX has passed the verification tests described below. For identical particle settings and initial conditions, the results were similar to (or in some particular cases better than) those obtained with other SPH schemes such as GADGET-2 and PSPH, or with the recent density-independent formulation (DISPH) and conservative reproducing kernel (CRKSPH) techniques.

  20. SU-F-R-44: Modeling Lung SBRT Tumor Response Using Bayesian Network Averaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diamant, A; Ybarra, N; Seuntjens, J

    2016-06-15

    Purpose: The prediction of tumor control after a patient receives lung SBRT (stereotactic body radiation therapy) has proven to be challenging, due to the complex interactions between an individual’s biology and dose-volume metrics. Many of these variables have predictive power when combined, a feature that we exploit using a graph modeling approach based on Bayesian networks. This provides a probabilistic framework that allows for accurate and visually intuitive predictive modeling. The aim of this study is to uncover possible interactions between an individual patient’s characteristics and generate a robust model capable of predicting said patient’s treatment outcome. Methods: We investigated a cohort of 32 prospective patients from multiple institutions who had received curative SBRT to the lung. The number of patients exhibiting tumor failure was observed to be 7 (event rate of 22%). The serum concentration of 5 biomarkers previously associated with NSCLC (non-small cell lung cancer) was measured pre-treatment. A total of 21 variables were analyzed, including dose-volume metrics with BED (biologically effective dose) correction and clinical variables. A Markov Chain Monte Carlo technique estimated the posterior probability distribution of the potential graphical structures. The probability of tumor failure was then estimated by averaging the top 100 graphs and applying Bayes’ rule. Results: The optimal Bayesian model generated throughout this study incorporated the PTV volume, the serum concentration of the biomarker EGFR (epidermal growth factor receptor) and prescription BED. This predictive model recorded an area under the receiver operating characteristic curve of 0.94(1), providing better performance compared to competing methods in other literature. Conclusion: The use of biomarkers in conjunction with dose-volume metrics allows for the generation of a robust predictive model. The preliminary results of this report demonstrate that it is possible to accurately model the prognosis of an individual lung SBRT patient’s treatment.

  1. [Volume assessment in the acute heart and renal failure].

    PubMed

    Vujicić, Bozidar; Ruzić, Alen; Zaputović, Luka; Racki, Sanjin

    2012-10-01

    Acute kidney injury (AKI) is an important clinical issue, especially in the setting of critical care. It has been shown in multiple studies to be a key independent risk factor for mortality, even after adjustment for demographics and severity of illness. There is wide agreement that a generally applicable classification system is required for AKI, one that helps to standardize estimation of the severity of renal dysfunction and to predict the outcome associated with this condition. This led to the RIFLE (Risk-Injury-Failure-Loss-End-stage renal disease) and AKIN (Acute Kidney Injury Network) classifications for AKI, established in 2004 and 2007, respectively. In the clinical setting of heart failure, a positive fluid balance (often expressed in the literature as weight gain) is used by disease management programs as a marker of heart failure decompensation. Oliguria is defined as urine output less than 0.3 ml/kg/h for at least 24 h. Since any delay in treatment can lead to a dangerous progression of AKI, early recognition of oliguria appears to be crucial. Critically ill patients with oliguric AKI are at increased risk for fluid imbalance due to widespread systemic inflammation, reduced plasma oncotic pressure and increased capillary leak. These patients are particularly at risk of fluid overload, and therefore a restrictive strategy of fluid administration should be used. Objective, rapid and accurate volume assessment is important in undiagnosed patients presenting with critical illness, as errors may result in interventions with fatal outcomes. Historical tools such as the physical exam and chest radiography suffer from significant limitations. Although it is the gold standard, radioisotopic measurement of volume is impractical in the acute care environment. Newer technologies offer the promise of both rapid and accurate bedside estimation of volume status with the potential to improve clinical outcomes. Blood volume assessment with bioimpedance vector analysis and bedside ultrasound seem to be promising technologies for this need.

  2. Automatic Measurement of Fetal Brain Development from Magnetic Resonance Imaging: New Reference Data.

    PubMed

    Link, Daphna; Braginsky, Michael B; Joskowicz, Leo; Ben Sira, Liat; Harel, Shaul; Many, Ariel; Tarrasch, Ricardo; Malinger, Gustavo; Artzi, Moran; Kapoor, Cassandra; Miller, Elka; Ben Bashat, Dafna

    2018-01-01

    Accurate fetal brain volume estimation is of paramount importance in evaluating fetal development. The aim of this study was to develop an automatic method for fetal brain segmentation from magnetic resonance imaging (MRI) data, and to create for the first time a normal volumetric growth chart based on a large cohort. A semi-automatic segmentation method based on Seeded Region Growing algorithm was developed and applied to MRI data of 199 typically developed fetuses between 18 and 37 weeks' gestation. The accuracy of the algorithm was tested against a sub-cohort of ground truth manual segmentations. A quadratic regression analysis was used to create normal growth charts. The sensitivity of the method to identify developmental disorders was demonstrated on 9 fetuses with intrauterine growth restriction (IUGR). The developed method showed high correlation with manual segmentation (r² = 0.9183, p < 0.001) as well as mean volume and volume overlap differences of 4.77 and 18.13%, respectively. New reference data on 199 normal fetuses were created, and all 9 IUGR fetuses were at or below the third percentile of the normal growth chart. The proposed method is fast, accurate, reproducible, user independent, applicable with retrospective data, and is suggested for use in routine clinical practice. © 2017 S. Karger AG, Basel.
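
    A growth chart of the kind described can be reproduced with a quadratic fit plus percentile bands derived from the residual spread (assuming roughly normal residuals). The sketch below uses simulated volumes of plausible magnitude, not the 199-fetus cohort.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    ga = rng.uniform(18, 37, 199)                                    # gestational age (weeks), simulated
    volume = 0.27 * ga ** 2 + 3.2 * ga - 110 + rng.normal(scale=12, size=ga.size)  # brain volume (cm^3)

    coeffs = np.polyfit(ga, volume, deg=2)                           # quadratic growth curve
    resid_sd = np.std(volume - np.polyval(coeffs, ga), ddof=3)

    def at_or_below_3rd_percentile(ga_weeks, vol_cm3):
        """Flag a fetus at or below the ~3rd percentile (z = -1.88) of the fitted chart."""
        return vol_cm3 <= np.polyval(coeffs, ga_weeks) - 1.88 * resid_sd

    print(at_or_below_3rd_percentile(30.0, 150.0))                   # flags a clearly small volume
    ```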

  3. Development of deformable moving lung phantom to simulate respiratory motion in radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Jina; Lee, Youngkyu; Shin, Hunjoo

    Radiation treatment requires high accuracy to protect healthy organs and destroy the tumor. However, tumors located near the diaphragm constantly move during treatment. Respiration-gated radiotherapy has significant potential for the improvement of the irradiation of tumor sites affected by respiratory motion, such as lung and liver tumors. To measure and minimize the effects of respiratory motion, a realistic deformable phantom is required for use as a gold standard. The purpose of this study was to develop and study the characteristics of a deformable moving lung (DML) phantom, such as simulation, tissue equivalence, and rate of deformation. The rate of change of the lung volume, target deformation, and respiratory signals were measured in this study; they were accurately measured using a realistic deformable phantom. The measured volume difference was 31%, which closely corresponds to the average difference in human respiration, and the target movement was −30 to +32 mm. The measured signals accurately described human respiratory signals. This DML phantom would be useful for the evaluation of deformable image registration and in respiration-gated radiotherapy. This study shows that the developed DML phantom can exactly simulate the patient's respiratory signal and acts as a deformable 4-dimensional simulation of a patient's lung with sufficient volume change.

  4. Measuring intestinal fluid transport in vitro: Gravimetric method versus non-absorbable marker.

    PubMed

    Whittamore, Jonathan M; Genz, Janet; Grosell, Martin; Wilson, Rod W

    2016-04-01

    The gut sac is a long-standing, widely used in vitro preparation for studying solute and water transport, and calculation of these fluxes requires an accurate assessment of volume. This is commonly determined gravimetrically by measuring the change in mass over time. While convenient, this likely under-estimates actual net water flux (Jv) due to tissue edema. We evaluated whether the popular in vivo volume marker [14C]-PEG 4000 offers a more representative measure of Jv in vitro. We directly compared these two methods in five teleost species (toadfish, flounder, rainbow trout, killifish and tilapia). Net fluid absorption by the toadfish intestine based on PEG was significantly higher, by almost 4-fold, compared to gravimetric measurements, compatible with the latter under-estimating Jv. Despite this, PEG proved inconsistent for all of the other species, frequently resulting in calculation of net secretion, in contrast to absorption seen gravimetrically. Such poor parallelism could not be explained by the absorption of [14C]-PEG (typically <1%). We identified a number of factors impacting the effectiveness of PEG. One was adsorption to the surface of sample tubes. While it was possible to circumvent this using unlabelled PEG 4000, this had a deleterious effect on PEG-based Jv. We also found sequestration of PEG within the intestinal mucus. In conclusion, the shortcomings associated with the accurate representation of Jv by gut sac preparations are not overcome by [14C]-PEG. The gravimetric method therefore remains the most reliable measure of Jv and we urge caution in the use of PEG as a volume marker. Copyright © 2016 Elsevier Inc. All rights reserved.
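
    For reference, the gravimetric flux the authors defend is a simple calculation, sketched below; the numbers and the sign convention (a loss of sac mass counted as fluid absorption) are illustrative, not values from the paper.

    ```python
    def gravimetric_jv(mass_initial_g, mass_final_g, hours, surface_area_cm2,
                       fluid_density_g_per_ml=1.0):
        """Net water flux Jv in uL cm-2 h-1 (positive = absorption, i.e. the sac loses mass)."""
        volume_change_ul = (mass_initial_g - mass_final_g) / fluid_density_g_per_ml * 1000.0
        return volume_change_ul / (hours * surface_area_cm2)

    print(f"Jv ~ {gravimetric_jv(2.450, 2.404, hours=2.0, surface_area_cm2=6.5):.1f} uL cm-2 h-1")
    ```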

  5. Accurate tissue characterization in low-dose CT imaging with pure iterative reconstruction.

    PubMed

    Murphy, Kevin P; McLaughlin, Patrick D; Twomey, Maria; Chan, Vincent E; Moloney, Fiachra; Fung, Adrian J; Chan, Faimee E; Kao, Tafline; O'Neill, Siobhan B; Watson, Benjamin; O'Connor, Owen J; Maher, Michael M

    2017-04-01

    We assess the ability of low-dose hybrid iterative reconstruction (IR) and 'pure' model-based IR (MBIR) images to maintain accurate Hounsfield unit (HU)-determined tissue characterization. Standard-protocol (SP) and low-dose modified-protocol (MP) CTs were contemporaneously acquired in 34 Crohn's disease patients referred for CT. SP image reconstruction was via the manufacturer's recommendations (60% FBP, filtered back projection; 40% ASiR, Adaptive Statistical iterative Reconstruction; SP-ASiR40). MP data sets underwent four reconstructions (100% FBP; 40% ASiR; 70% ASiR; MBIR). Three observers measured tissue volumes using HU thresholds for fat, soft tissue and bone/contrast on each data set. Analysis was via SPSS. Inter-observer agreement was strong for 1530 datapoints (rs > 0.9). MP-MBIR tissue volume measurement was superior to other MP reconstructions and closely correlated with the reference SP-ASiR40 images for all tissue types. MP-MBIR superiority was most marked for fat volume calculation - close SP-ASiR40 and MP-MBIR Bland-Altman plot correlation was seen with the lowest average difference (336 cm³) when compared with other MP reconstructions. Hounsfield unit-determined tissue volume calculations from MP-MBIR images resulted in values comparable to SP-ASiR40 calculations and values that are superior to MP-ASiR images. Accuracy of estimation of volume of tissues (e.g. fat) using segmentation software on low-dose CT images appears optimal when reconstructed with pure IR. © 2016 The Royal Australian and New Zealand College of Radiologists.
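
    The HU-threshold volume measurement itself is a simple counting operation, sketched below on a placeholder array; the HU ranges shown are typical textbook values used as assumptions, not necessarily those of the study.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    ct = rng.normal(loc=0, scale=200, size=(40, 128, 128))   # placeholder CT volume (HU)
    voxel_volume_cm3 = 0.08 * 0.08 * 0.25                    # 0.8 x 0.8 x 2.5 mm voxels

    hu_ranges = {
        "fat":           (-190, -30),
        "soft tissue":   (-29, 150),
        "bone/contrast": (151, 3000),
    }

    for tissue, (lo, hi) in hu_ranges.items():
        n_voxels = np.count_nonzero((ct >= lo) & (ct <= hi))
        print(f"{tissue:13s}: {n_voxels * voxel_volume_cm3:8.1f} cm^3")
    ```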

  6. A feasibility study on estimation of tissue mixture contributions in 3D arterial spin labeling sequence

    NASA Astrophysics Data System (ADS)

    Liu, Yang; Pu, Huangsheng; Zhang, Xi; Li, Baojuan; Liang, Zhengrong; Lu, Hongbing

    2017-03-01

    Arterial spin labeling (ASL) provides a noninvasive measurement of cerebral blood flow (CBF). Due to its relatively low spatial resolution, the accuracy of CBF measurement is affected by the partial volume (PV) effect. To obtain an accurate CBF estimate, the contribution of each tissue type in the mixture is needed. In current ASL studies, this is generally obtained by registering the ASL data to a structural image. That approach yields the probability of each tissue type inside each voxel, but it also introduces errors, including registration error and acquisition differences between the ASL and structural scans. Therefore, estimation of the mixture percentages directly from the ASL data is greatly needed. Under the assumptions that the ASL signal follows a Gaussian distribution and that the tissue types are independent, a maximum a posteriori expectation-maximization (MAP-EM) approach was formulated to estimate the contribution of each tissue type to the observed perfusion signal at each voxel. Given the sensitivity of MAP-EM to initialization, an approximately accurate initialization was obtained using a 3D fuzzy c-means method. Our preliminary results demonstrated that the GM and WM patterns across the perfusion image can be sufficiently visualized from the voxel-wise tissue mixtures, which may be promising for the diagnosis of various brain diseases.

  7. ANGIOCARE: an automated system for fast three-dimensional coronary reconstruction by integrating angiographic and intracoronary ultrasound data.

    PubMed

    Bourantas, Christos V; Kalatzis, Fanis G; Papafaklis, Michail I; Fotiadis, Dimitrios I; Tweddel, Ann C; Kourtis, Iraklis C; Katsouras, Christos S; Michalis, Lampros K

    2008-08-01

    To develop an automated, user-friendly system (ANGIOCARE) for rapid three-dimensional (3D) coronary reconstruction, integrating angiographic and intracoronary ultrasound (ICUS) data. Biplane angiographic and ICUS sequence images are imported into the system, where a prevalidated method is used for coronary reconstruction. This incorporates extraction of the catheter path from two end-diastolic X-ray images and detection of regions of interest (lumen, outer vessel wall) in the ICUS sequence by an automated border detection algorithm. The detected borders are placed perpendicular to the catheter path, and established algorithms are used to estimate their absolute orientation. The resulting 3D object is imported into an advanced visualization module with which the operator can interact, examine plaque distribution (depicted as a color coded map) and assess plaque burden by virtual endoscopy. Data from 19 patients (27 vessels) undergoing biplane angiography and ICUS were examined. The reconstructed vessels were 21.3-80.2 mm long. The mean difference between the plaque volumes measured using linear 3D ICUS analysis and the volumes estimated by taking into account the curvature of the vessel was 0.9 +/- 2.9%. The time required to reconstruct a luminal narrowing of 25 mm was approximately 10 min. The ANGIOCARE system provides rapid coronary reconstruction, allowing the operator to accurately estimate the length of the lesion and determine plaque distribution and volume. (c) 2008 Wiley-Liss, Inc.

  8. Fusion of 4D echocardiography and cine cardiac magnetic resonance volumes using a salient spatio-temporal analysis

    NASA Astrophysics Data System (ADS)

    Atehortúa, Angélica; Garreau, Mireille; Romero, Eduardo

    2017-11-01

    Accurate quantification of left ventricular (LV) and right ventricular (RV) function is important to support the evaluation, diagnosis and prognosis of cardiac pathologies such as the cardiomyopathies. Currently, ultrasound is the most cost-effective examination. However, this modality is highly noisy and operator dependent, and hence prone to errors. Therefore, fusion with other cardiac modalities may provide complementary information and improve the analysis of specific pathologies such as the cardiomyopathies. This paper proposes an automatic registration between two complementary modalities, 4D echocardiography and cine magnetic resonance images, by mapping both modalities to a common saliency space in which an optimal registration between them is estimated. The resulting transformation matrix is then applied to the MRI volume, which is superimposed on the 4D echocardiography. Manually selected landmarks in both modalities are used to evaluate the precision of the superimposition. Preliminary results, in three evaluation cases, show that the distance between these marked points and those estimated with the transformation is about 2 mm.

  9. Homogenization via the strong-permittivity-fluctuation theory with nonzero depolarization volume

    NASA Astrophysics Data System (ADS)

    Mackay, Tom G.

    2004-08-01

    The depolarization dyadic provides the scattering response of a single inclusion particle embedded within a homogenous background medium. These dyadics play a central role in formalisms used to estimate the effective constitutive parameters of homogenized composite mediums (HCMs). Conventionally, the inclusion particle is taken to be vanishingly small; this allows the pointwise singularity of the dyadic Green function associated with the background medium to be employed as the depolarization dyadic. A more accurate approach is pursued in this communication by taking into account the nonzero spatial extent of inclusion particles. Depolarization dyadics corresponding to inclusion particles of nonzero volume are incorporated within the strong-permittivity-fluctuation theory (SPFT). The linear dimensions of inclusion particles are assumed to be small relative to the electromagnetic wavelength(s) and the SPFT correlation length. The influence of the size of inclusion particles upon SPFT estimates of the HCM constitutive parameters is investigated for anisotropic dielectric HCMs. In particular, the interplay between correlation length and inclusion size is explored.

  10. Shape Models of Asteroids as a Missing Input for Bulk Density Determinations

    NASA Astrophysics Data System (ADS)

    Hanuš, Josef

    2015-07-01

    To determine a meaningful bulk density of an asteroid, both accurate volume and mass estimates are necessary. The volume can be computed by scaling the size of the 3D shape model to fit the disk-resolved images or stellar occultation profiles, which are available in the literature or through collaborations. This work provides a list of asteroids, for which (i) there are already mass estimates with reported uncertainties better than 20% or their mass will be most likely determined in the future from Gaia astrometric observations, and (ii) their 3D shape models are currently unknown. Additional optical lightcurves are necessary to determine the convex shape models of these asteroids. The main aim of this article is to motivate the observers to obtain lightcurves of these asteroids, and thus contribute to their shape model determinations. Moreover, a web page https://asteroid-obs.oca.eu, which maintains an up-to-date list of these objects to assure efficiency and to avoid any overlapping efforts, was created.

  11. Intensity-Based Registration for Lung Motion Estimation

    NASA Astrophysics Data System (ADS)

    Cao, Kunlin; Ding, Kai; Amelon, Ryan E.; Du, Kaifang; Reinhardt, Joseph M.; Raghavan, Madhavan L.; Christensen, Gary E.

    Image registration plays an important role within pulmonary image analysis. The task of registration is to find the spatial mapping that brings two images into alignment. Registration algorithms designed for matching 4D lung scans or two 3D scans acquired at different inflation levels can catch the temporal changes in position and shape of the region of interest. Accurate registration is critical to post-analysis of lung mechanics and motion estimation. In this chapter, we discuss lung-specific adaptations of intensity-based registration methods for 3D/4D lung images and review approaches for assessing registration accuracy. Then we introduce methods for estimating tissue motion and studying lung mechanics. Finally, we discuss methods for assessing and quantifying specific volume change, specific ventilation, strain/stretch information and lobar sliding.

  12. Examination of simplified travel demand model. [Internal volume forecasting model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, R.L. Jr.; McFarlane, W.J.

    1978-01-01

    A simplified travel demand model, the Internal Volume Forecasting (IVF) model proposed by Low in 1972, is evaluated as an alternative to the conventional urban travel demand modeling process. The calibration of the IVF model for a county-level study area in Central Wisconsin results in what appears to be a reasonable model; however, analysis of the structure of the model reveals two primary mis-specifications. Correction of the mis-specifications leads to a simplified gravity model version of the conventional urban travel demand models. Application of the original IVF model to "forecast" 1960 traffic volumes based on the model calibrated for 1970 produces accurate estimates. Shortcut and ad hoc models may appear to provide reasonable results in both the base and horizon years; however, as shown by the IVF model, such models will not always provide a reliable basis for transportation planning and investment decisions.

  13. Accuracy of height estimation and tidal volume setting using anthropometric formulas in an ICU Caucasian population.

    PubMed

    L'her, Erwan; Martin-Babau, Jérôme; Lellouche, François

    2016-12-01

    Knowledge of patients' height is essential for daily practice in the intensive care unit. However, actual height measurements are unavailable on a daily routine in the ICU and measured height in the supine position and/or visual estimates may lack consistency. Clinicians do need simple and rapid methods to estimate the patients' height, especially in short height and/or obese patients. The objectives of the study were to evaluate several anthropometric formulas for height estimation on healthy volunteers and to test whether several of these estimates will help tidal volume setting in ICU patients. This was a prospective, observational study in a medical intensive care unit of a university hospital. During the first phase of the study, eight limb measurements were performed on 60 healthy volunteers and 18 height estimation formulas were tested. During the second phase, four height estimates were performed on 60 consecutive ICU patients under mechanical ventilation. In the 60 healthy volunteers, actual height was well correlated with the gold standard, measured height in the erect position. Correlation was low between actual and calculated height, using the hand's length and width, the index, or the foot equations. The Chumlea method and its simplified version, performed in the supine position, provided adequate estimates. In the 60 ICU patients, calculated height using the simplified Chumlea method was well correlated with measured height (r = 0.78; ∂ < 1 %). Ulna and tibia estimates also provided valuable estimates. All these height estimates allowed calculating IBW or PBW that were significantly different from the patients' actual weight on admission. In most cases, tidal volume set according to these estimates was lower than what would have been set using the actual weight. When actual height is unavailable in ICU patients undergoing mechanical ventilation, alternative anthropometric methods to obtain patient's height based on lower leg and on forearm measurements could be useful to facilitate the application of protective mechanical ventilation in a Caucasian ICU population. The simplified Chumlea method is easy to achieve in a bed-ridden patient and provides accurate height estimates, with a low bias.
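
    As a concrete illustration of the intended use, the sketch below chains a knee-height estimate of stature (the classic Chumlea equations are used here as an assumption; the study evaluated a simplified variant) into the ARDSNet predicted body weight formula and a 6 mL/kg protective tidal volume.

    ```python
    def chumlea_height_cm(knee_height_cm, age_years, sex):
        """Stature estimate from knee height (classic Chumlea coefficients, assumed here)."""
        if sex == "male":
            return 64.19 - 0.04 * age_years + 2.02 * knee_height_cm
        return 84.88 - 0.24 * age_years + 1.83 * knee_height_cm

    def predicted_body_weight_kg(height_cm, sex):
        """ARDSNet predicted body weight from height and sex."""
        base = 50.0 if sex == "male" else 45.5
        return base + 0.91 * (height_cm - 152.4)

    def protective_tidal_volume_ml(height_cm, sex, ml_per_kg=6.0):
        return ml_per_kg * predicted_body_weight_kg(height_cm, sex)

    height = chumlea_height_cm(knee_height_cm=52.0, age_years=70, sex="male")
    print(f"estimated height ~ {height:.0f} cm, "
          f"VT ~ {protective_tidal_volume_ml(height, 'male'):.0f} mL at 6 mL/kg PBW")
    ```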

  14. Evaluation of the pulse-contour method of determining stroke volume in man.

    NASA Technical Reports Server (NTRS)

    Alderman, E. L.; Branzi, A.; Sanders, W.; Brown, B. W.; Harrison, D. C.

    1972-01-01

    The pulse-contour method for determining stroke volume has been employed as a continuous rapid method of monitoring the cardiovascular status of patients. Twenty-one patients with ischemic heart disease and 21 patients with mitral valve disease were subjected to a variety of hemodynamic interventions. The pulse-contour estimations, using three different formulas derived by Warner, Kouchoukos, and Herd, were compared with indicator-dilution outputs. A comparison of the results of the two methods for determining stroke volume yielded correlation coefficients ranging from 0.59 to 0.84. The better performing Warner formula yielded a coefficient of variation of about 20%. The type of hemodynamic interventions employed did not significantly affect the results using the pulse-contour method. Although the correlation of the pulse-contour and indicator-dilution stroke volumes is high, the coefficient of variation is such that small changes in stroke volume cannot be accurately assessed by the pulse-contour method. However, the simplicity and rapidity of this method compared to determination of cardiac output by Fick or indicator-dilution methods makes it a potentially useful adjunct for monitoring critically ill patients.

  15. Controls on the physical properties of gas-hydrate-bearing sediments because of the interaction between gas hydrate and porous media

    USGS Publications Warehouse

    Lee, Myung W.; Collett, Timothy S.

    2005-01-01

    Physical properties of gas-hydrate-bearing sediments depend on the pore-scale interaction between gas hydrate and porous media as well as the amount of gas hydrate present. Well log measurements such as proton nuclear magnetic resonance (NMR) relaxation and electromagnetic propagation tool (EPT) techniques depend primarily on the bulk volume of gas hydrate in the pore space irrespective of the pore-scale interaction. However, elastic velocities or permeability depend on how gas hydrate is distributed in the pore space as well as the amount of gas hydrate. Gas-hydrate saturations estimated from NMR and EPT measurements are free of adjustable parameters; thus, the estimations are unbiased estimates of gas hydrate if the measurement is accurate. However, the amount of gas hydrate estimated from elastic velocities or electrical resistivities depends on many adjustable parameters and models related to the interaction of gas hydrate and porous media, so these estimates are model dependent and biased. NMR, EPT, elastic-wave velocity, electrical resistivity, and permeability measurements acquired in the Mallik 5L-38 well in the Mackenzie Delta, Canada, show that all of the well log evaluation techniques considered provide comparable gas-hydrate saturations in clean (low shale content) sandstone intervals with high gas-hydrate saturations. However, in shaly intervals, estimates from log measurement depending on the pore-scale interaction between gas hydrate and host sediments are higher than those estimates from measurements depending on the bulk volume of gas hydrate.

  16. Osmotic potential calculations of inorganic and organic aqueous solutions over wide solute concentration levels and temperatures.

    PubMed

    Cochrane, T T; Cochrane, T A

    2016-01-01

    The objective was to demonstrate that the authors' new "aqueous solution vs pure water" equation to calculate osmotic potential may be used to calculate the osmotic potentials of inorganic and organic aqueous solutions over wide ranges of solute concentrations and temperatures. Currently, the osmotic potentials of solutions used for medical purposes are calculated from equations based on the thermodynamics of the gas laws, which are only accurate at low temperature and solute concentration levels. Some solutions used in medicine may need their osmotic potentials calculated more accurately to take into account solute concentrations and temperatures. The authors experimented with their new equation for calculating the osmotic potentials of inorganic and organic aqueous solutions up to and beyond body temperatures by adjusting three of its factors: (a) the volume property of pure water, (b) the number of "free" water molecules per unit volume of solution, "Nf," and (c) the "t" factor expressing the cooperative structural relaxation time of pure water at given temperatures. Adequate information on the volume property of pure water at different temperatures is available in the literature. However, as little information on the relative densities of inorganic and organic solutions, respectively, at varying temperatures needed to calculate Nf was available, provisional equations were formulated to approximate these values. Those values, together with tentative t values for different temperatures chosen from values calculated by different workers, were substituted into the authors' equation to demonstrate how osmotic potentials could be estimated over temperatures up to and beyond bodily temperatures. The provisional equations formulated to calculate Nf, the number of free water molecules per unit volume of inorganic and organic solute solutions, respectively, over wide concentration ranges compared well with the calculations of Nf using recorded relative density data at 20 °C. They were subsequently used to estimate Nf values at temperatures up to and in excess of body temperatures. Those values, together with t values at temperatures up to and in excess of body temperatures recorded in the literature, were substituted in the authors' equation for the provisional calculation of osmotic potentials. The calculations indicated that solution temperatures and solute concentrations have a marked effect on osmotic potentials. Following work to measure the relative densities of aqueous solutions for the calculation of Nf values and the determination of definitive t values up to and beyond bodily temperatures, the authors' equation would enable accurate estimation of the osmotic potentials of inorganic and organic aqueous solutions over wide concentration ranges and over the temperature range. The study illustrates that not only solute concentrations but also temperatures have a marked effect on osmotic potentials, an observation of medical and biological significance.
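
    For contrast, a minimal sketch of the conventional gas-law (van 't Hoff) calculation that the authors argue holds only at low concentrations and temperatures; the concentration and dissociation factor in the example are illustrative, and the authors' own equation is not reproduced here.

      # Conventional van 't Hoff estimate of osmotic potential (the gas-law-based
      # approach the authors criticize), valid only for dilute solutions.
      R = 0.008314  # gas constant in L·MPa·mol⁻¹·K⁻¹

      def osmotic_potential_mpa(conc_mol_per_l: float, temp_c: float, i: float = 1.0) -> float:
          """Osmotic potential (MPa, negative) = -i * C * R * T."""
          return -i * conc_mol_per_l * R * (temp_c + 273.15)

      # Example: 0.15 mol/L NaCl (i ≈ 2) at body temperature
      psi = osmotic_potential_mpa(0.15, 37.0, i=2.0)  # ≈ -0.77 MPa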

  17. Scaling wood volume estimates from inventory plots to landscapes with airborne LiDAR in temperate deciduous forest.

    PubMed

    Levick, Shaun R; Hessenmöller, Dominik; Schulze, E-Detlef

    2016-12-01

    Monitoring and managing carbon stocks in forested ecosystems requires accurate and repeatable quantification of the spatial distribution of wood volume at landscape to regional scales. Grid-based forest inventory networks have provided valuable records of forest structure and dynamics at individual plot scales, but in isolation they may not represent the carbon dynamics of heterogeneous landscapes encompassing diverse land-management strategies and site conditions. Airborne LiDAR has greatly enhanced forest structural characterisation and, in conjunction with field-based inventories, it provides avenues for monitoring carbon over broader spatial scales. Here we aim to enhance the integration of airborne LiDAR surveying with field-based inventories by exploring the effect of inventory plot size and number on the relationship between field-estimated and LiDAR-predicted wood volume in deciduous broad-leafed forest in central Germany. Estimation of wood volume from airborne LiDAR was most robust (R² = 0.92, RMSE = 50.57 m³ ha⁻¹, ~14.13 Mg C ha⁻¹) when trained and tested with 1 ha experimental plot data (n = 50). Predictions based on a more extensive (n = 1100) plot network with considerably smaller (0.05 ha) plots were inferior (R² = 0.68, RMSE = 101.01 m³ ha⁻¹, ~28.09 Mg C ha⁻¹). Differences between the 1 ha and 0.05 ha LiDAR volume models were, however, negligible at the scale of individual land-management units. Sample size permutation tests showed that increasing the number of inventory plots above 350 for the 0.05 ha plots returned no improvement in R² and RMSE variability of the LiDAR-predicted wood volume model. Our results confirm the utility of LiDAR for estimating wood volume in deciduous broad-leafed forest, but highlight the challenges associated with field plot size and number in establishing robust relationships between airborne LiDAR and field-derived wood volume. We are moving into a forest management era where field-inventory and airborne LiDAR are inextricably linked, and we encourage field inventory campaigns to strive for increased plot size and give greater attention to precise stem geolocation for better integration with remote sensing strategies.
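
    A minimal sketch of the plot-level calibration step described above, assuming hypothetical arrays of a LiDAR canopy-height metric and field-estimated wood volume rather than the study's data or model form.

      # Hypothetical plot-level calibration: regress field wood volume (m³/ha)
      # against a LiDAR canopy-height metric, then report R² and RMSE.
      import numpy as np

      def calibrate(lidar_metric: np.ndarray, field_volume: np.ndarray):
          slope, intercept = np.polyfit(lidar_metric, field_volume, deg=1)
          pred = slope * lidar_metric + intercept
          resid = field_volume - pred
          rmse = float(np.sqrt(np.mean(resid ** 2)))
          r2 = 1.0 - float(np.sum(resid ** 2) / np.sum((field_volume - field_volume.mean()) ** 2))
          return slope, intercept, r2, rmse

      # Made-up values for three 1 ha plots
      metric = np.array([18.0, 24.0, 30.0])      # e.g. mean canopy height (m)
      volume = np.array([250.0, 345.0, 425.0])   # field-estimated wood volume (m³/ha)
      print(calibrate(metric, volume))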

  18. Oligomeric cationic polymethacrylates: a comparison of methods for determining molecular weight.

    PubMed

    Locock, Katherine E S; Meagher, Laurence; Haeussler, Matthias

    2014-02-18

    This study compares three common laboratory methods, size-exclusion chromatography (SEC), ¹H nuclear magnetic resonance (NMR), and matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF), to determine the molecular weight of oligomeric cationic copolymers. The potential bias for each method was examined across a series of polymers that varied in molecular weight and cationic character (both choice of cation (amine versus guanidine) and relative proportion present). SEC was found to be the least accurate, overestimating Mn by an average of 140%, owing to the lack of appropriate cationic standards available, and the complexity involved in estimating the hydrodynamic volume of copolymers. MALDI-TOF approximated Mn well for the highly monodisperse (Đ < 1.1), low molecular weight (degree of polymerization (DP) <50) species but appeared unsuitable for the largest polymers in the series due to the mass bias associated with the technique. ¹H NMR was found to most accurately estimate Mn in this study, differing from theoretical values by only 5.2%. ¹H NMR end-group analysis is therefore an inexpensive, facile, and primary quantitative method to estimate the molecular weight of oligomeric cationic polymethacrylates if suitably distinct end-group signals are present in the spectrum.
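
    A minimal sketch of ¹H NMR end-group analysis under idealized assumptions (one cleanly resolved end-group signal and known proton counts); the integrals, proton counts, and masses below are illustrative rather than values from this study.

      # ¹H NMR end-group analysis sketch: the degree of polymerization follows from
      # the ratio of per-proton integrals of a backbone signal and an end-group signal.
      def mn_from_nmr(backbone_integral: float, backbone_protons_per_unit: int,
                      end_integral: float, end_protons: int,
                      monomer_mass: float, end_group_mass: float) -> float:
          dp = (backbone_integral / backbone_protons_per_unit) / (end_integral / end_protons)
          return dp * monomer_mass + end_group_mass

      # Illustrative numbers only: a 3H backbone signal per repeat unit vs a 9H end group
      mn = mn_from_nmr(backbone_integral=45.0, backbone_protons_per_unit=3,
                       end_integral=9.0, end_protons=9,
                       monomer_mass=142.2, end_group_mass=150.0)
      # DP = 15, Mn ≈ 2283 g/mol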

  19. A comparison of ambient casino sound and music: effects on dissociation and on perceptions of elapsed time while playing slot machines.

    PubMed

    Noseworthy, Theodore J; Finlay, Karen

    2009-09-01

    This research examined the effects of a casino's auditory character on estimates of elapsed time while gambling. More specifically, this study varied whether the sound heard while gambling was ambient casino sound alone or ambient casino sound accompanied by music. The tempo and volume of both the music and ambient sound were varied to manipulate temporal engagement and introspection. One hundred and sixty (males = 91) individuals played slot machines in groups of 5-8, after which they provided estimates of elapsed time. The findings showed that the typical ambient casino auditive environment, which characterizes the majority of gaming venues, promotes understated estimates of elapsed duration of play. In contrast, when music is introduced into the ambient casino environment, it appears to provide a cue of interval from which players can more accurately reconstruct elapsed duration of play. This is particularly the case when the tempo of the music is slow and the volume is high. Moreover, the confidence with which time estimates are held (as reflected by latency of response) is higher in an auditive environment with music than in an environment that is comprised of ambient casino sounds alone. Implications for casino management are discussed.

  20. Analysis of volumetric response of pituitary adenomas receiving adjuvant CyberKnife stereotactic radiosurgery with the application of an exponential fitting model

    PubMed Central

    Yu, Yi-Lin; Yang, Yun-Ju; Lin, Chin; Hsieh, Chih-Chuan; Li, Chiao-Zhu; Feng, Shao-Wei; Tang, Chi-Tun; Chung, Tzu-Tsao; Ma, Hsin-I; Chen, Yuan-Hao; Ju, Da-Tong; Hueng, Dueng-Yuan

    2017-01-01

    Abstract Tumor control rates of pituitary adenomas (PAs) receiving adjuvant CyberKnife stereotactic radiosurgery (CK SRS) are high. However, there is currently no uniform way to estimate the time course of the disease. The aim of this study was to analyze the volumetric responses of PAs after CK SRS and investigate the application of an exponential decay model in calculating an accurate time course and estimation of the eventual outcome. A retrospective review of 34 patients with PAs who received adjuvant CK SRS between 2006 and 2013 was performed. Tumor volume was calculated using the planimetric method. The percent change in tumor volume and tumor volume rate of change were compared at median 4-, 10-, 20-, and 36-month intervals. Tumor responses were classified as: progression for >15% volume increase, regression for >15% decrease, and stabilization for changes within ±15% of the baseline volume at the time of last follow-up. For each patient, the volumetric change versus time was fitted with an exponential model. The overall tumor control rate was 94.1% in the 36-month (range 18–87 months) follow-up period (mean volume change of −43.3%). Volume regression (mean decrease of −50.5%) was demonstrated in 27 (79%) patients, tumor stabilization (mean change of −3.7%) in 5 (15%) patients, and tumor progression (mean increase of 28.1%) in 2 (6%) patients (P = 0.001). Tumors that eventually regressed or stabilized had a temporary volume increase of 1.07% and 41.5% at 4 months after CK SRS, respectively (P = 0.017). The tumor volume estimated using the exponential fitting equation demonstrated high positive correlation with the actual volume calculated by magnetic resonance imaging (MRI) as tested by Pearson correlation coefficient (0.9). Transient progression of PAs post-CK SRS was seen in 62.5% of the patients, and it was not predictive of eventual volume regression or progression. A three-point exponential model is of potential predictive value according to relative distribution. An exponential decay model can be used to calculate the time course of tumors that are ultimately controlled. PMID:28121913
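
    A minimal sketch of fitting an exponential decay to serial tumor volumes with scipy, assuming hypothetical follow-up times and volumes; this shows the general idea only, not the authors' exact three-point formulation.

      # Fit V(t) = V_inf + (V0 - V_inf) * exp(-k t) to serial tumor volumes (hypothetical data).
      import numpy as np
      from scipy.optimize import curve_fit

      def exp_decay(t, v_inf, v0, k):
          return v_inf + (v0 - v_inf) * np.exp(-k * t)

      t_months = np.array([0.0, 4.0, 10.0, 20.0, 36.0])
      volume_cc = np.array([3.2, 3.1, 2.6, 2.1, 1.8])   # made-up volumes

      params, _ = curve_fit(exp_decay, t_months, volume_cc,
                            p0=(volume_cc[-1], volume_cc[0], 0.1))
      v_inf, v0, k = params
      pct_change_36m = 100.0 * (exp_decay(36.0, *params) - v0) / v0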

  1. Global estimates of shark catches using trade records from commercial markets.

    PubMed

    Clarke, Shelley C; McAllister, Murdoch K; Milner-Gulland, E J; Kirkwood, G P; Michielsens, Catherine G J; Agnew, David J; Pikitch, Ellen K; Nakano, Hideki; Shivji, Mahmood S

    2006-10-01

    Despite growing concerns about overexploitation of sharks, lack of accurate, species-specific harvest data often hampers quantitative stock assessment. In such cases, trade studies can provide insights into exploitation unavailable from traditional monitoring. We applied Bayesian statistical methods to trade data in combination with genetic identification to estimate, by species, the annual number of globally traded shark fins, the most commercially valuable product from a group of species often unrecorded in harvest statistics. Our results provide the first fishery-independent estimate of the scale of shark catches worldwide and indicate that shark biomass in the fin trade is three to four times higher than shark catch figures reported in the only global database. Comparison of our estimates to approximated stock assessment reference points for one of the most commonly traded species, blue shark, suggests that current trade volumes in numbers of sharks are close to or possibly exceeding the maximum sustainable yield levels.

  2. A new method for estimating carbon dioxide emissions from transportation at fine spatial scales

    PubMed Central

    Shu, Yuqin; Reams, Margaret

    2016-01-01

    Detailed estimates of carbon dioxide (CO2) emissions at fine spatial scales are useful to both modelers and decision makers who are faced with the problem of global warming and climate change. Globally, transport-related emissions of carbon dioxide are growing. This letter presents a new method based on the volume-preserving principle in the areal interpolation literature to disaggregate transportation-related CO2 emission estimates from the county-level scale to a 1 km² grid scale. The proposed volume-preserving interpolation (VPI) method, together with the distance-decay principle, was used to derive emission weights for each grid based on its proximity to highways, roads, railroads, waterways, and airports. The total CO2 emission value summed from the grids within a county is made to be equal to the original county-level estimate, thus enforcing the volume-preserving property. The method was applied to downscale the transportation-related CO2 emission values by county (i.e. parish) for the state of Louisiana into 1 km² grids. The results reveal a more realistic spatial pattern of CO2 emission from transportation, which can be used to identify the emission ‘hot spots’. Of the four highest transportation-related CO2 emission hotspots in Louisiana, high-emission grids covered the entire East Baton Rouge Parish and Orleans Parish, whereas CO2 emissions in Jefferson Parish (New Orleans suburb) and Caddo Parish (city of Shreveport) were more unevenly distributed. We argue that the new method is sound in principle, flexible in practice, and the resultant estimates are more accurate than previous gridding approaches. PMID:26997973
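
    A minimal sketch of the volume-preserving step, assuming hypothetical distance-decay weights for the grid cells of a single county; normalizing the weights guarantees that the disaggregated values sum back to the county total.

      # Volume-preserving disaggregation sketch: distribute a county CO2 total over
      # its grid cells in proportion to distance-decay weights, preserving the total.
      import numpy as np

      def disaggregate(county_total: float, weights: np.ndarray) -> np.ndarray:
          w = np.asarray(weights, dtype=float)
          return county_total * w / w.sum()

      # Hypothetical weights, e.g. from inverse distance to the nearest highway
      weights = np.array([0.1, 0.4, 0.8, 0.2, 0.05])
      grid_emissions = disaggregate(1000.0, weights)   # sums to 1000.0 by construction
      assert abs(grid_emissions.sum() - 1000.0) < 1e-9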

  3. Evaluation and comparison of diffusion MR methods for measuring apparent transcytolemmal water exchange rate constant

    NASA Astrophysics Data System (ADS)

    Tian, Xin; Li, Hua; Jiang, Xiaoyu; Xie, Jingping; Gore, John C.; Xu, Junzhong

    2017-02-01

    Two diffusion-based approaches, CG (constant gradient) and FEXI (filtered exchange imaging) methods, have been previously proposed for measuring transcytolemmal water exchange rate constant kin, but their accuracy and feasibility have not been comprehensively evaluated and compared. In this work, both computer simulations and cell experiments in vitro were performed to evaluate these two methods. Simulations were done with different cell diameters (5, 10, 20 μm), a broad range of kin values (0.02-30 s-1) and different SNRs, and simulated kin values were directly compared with the ground truth values. Human leukemia K562 cells were cultured and treated with saponin to selectively change cell transmembrane permeability. The agreement between measured kin values of both methods was also evaluated. The results suggest that, without noise, the CG method provides reasonably accurate estimation of kin especially when it is smaller than 10 s-1, which is in the typical physiological range of many biological tissues. However, although the FEXI method overestimates kin even with corrections for the effects of extracellular water fraction, it provides reasonable estimates with practical SNRs and more importantly, the fitted apparent exchange rate AXR showed approximately linear dependence on the ground truth kin. In conclusion, either the CG or FEXI method provides a sensitive means to characterize the variations in transcytolemmal water exchange rate constant kin, although the accuracy and specificity are usually compromised. The non-imaging CG method provides more accurate estimation of kin, but is limited to a large volume of interest. Although the accuracy of FEXI is compromised with extracellular volume fraction, it is capable of spatially mapping kin in practice.

  4. Multi-material decomposition of spectral CT images

    NASA Astrophysics Data System (ADS)

    Mendonça, Paulo R. S.; Bhotika, Rahul; Maddah, Mahnaz; Thomsen, Brian; Dutta, Sandeep; Licato, Paul E.; Joshi, Mukta C.

    2010-04-01

    Spectral Computed Tomography (Spectral CT), and in particular fast kVp switching dual-energy computed tomography, is an imaging modality that extends the capabilities of conventional computed tomography (CT). Spectral CT enables the estimation of the full linear attenuation curve of the imaged subject at each voxel in the CT volume, instead of a scalar image in Hounsfield units. Because the space of linear attenuation curves in the energy ranges of medical applications can be accurately described through a two-dimensional manifold, this decomposition procedure would be, in principle, limited to two materials. This paper describes an algorithm that overcomes this limitation, allowing for the estimation of N-tuples of material-decomposed images. The algorithm works by assuming that the mixing of substances and tissue types in the human body has the physicochemical properties of an ideal solution, which yields a model for the density of the imaged material mix. Under this model the mass attenuation curve of each voxel in the image can be estimated, immediately resulting in a material-decomposed image triplet. Decomposition into an arbitrary number of pre-selected materials can be achieved by automatically selecting adequate triplets from an application-specific material library. The decomposition is expressed in terms of the volume fractions of each constituent material in the mix; this provides for a straightforward, physically meaningful interpretation of the data. One important application of this technique is in the digital removal of contrast agent from a dual-energy exam, producing a virtual nonenhanced image, as well as in the quantification of the concentration of contrast observed in a targeted region, thus providing an accurate measure of tissue perfusion.
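
    A minimal sketch of the per-voxel decomposition idea under simplifying assumptions: two effective attenuation measurements plus the sum-to-one constraint give a linear system for the volume fractions of a pre-selected material triplet. The attenuation coefficients are hypothetical, and the published algorithm's ideal-solution density model is not reproduced here.

      # Per-voxel three-material decomposition sketch: two effective attenuation
      # measurements (e.g. low/high kVp) plus the constraint that volume fractions
      # sum to one yield a 3x3 linear system for the triplet's fractions.
      import numpy as np

      def volume_fractions(mu_voxel_low, mu_voxel_high, mu_materials_low, mu_materials_high):
          # Rows: low-energy mixing, high-energy mixing, sum-to-one constraint.
          a = np.array([mu_materials_low, mu_materials_high, [1.0, 1.0, 1.0]])
          b = np.array([mu_voxel_low, mu_voxel_high, 1.0])
          return np.linalg.solve(a, b)

      # Hypothetical linear attenuation coefficients (1/cm) for water, iodine mix, fat
      fracs = volume_fractions(
          mu_voxel_low=0.255, mu_voxel_high=0.191,
          mu_materials_low=[0.23, 0.60, 0.19],
          mu_materials_high=[0.18, 0.35, 0.16],
      )
      # fracs ≈ (0.60, 0.10, 0.30); in practice fractions may be clipped and re-normalized.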

  5. The contribution of the swimbladder to buoyancy in the adult zebrafish (Danio rerio): a morphometric analysis.

    PubMed

    Robertson, George N; Lindsey, Benjamin W; Dumbarton, Tristan C; Croll, Roger P; Smith, Frank M

    2008-06-01

    Many teleost fishes use a swimbladder, a gas-filled organ in the coelomic cavity, to reduce body density toward neutral buoyancy, thus minimizing the locomotory cost of maintaining a constant depth in the water column. However, for most swimbladder-bearing teleosts, the contribution of this organ to the attainment of neutral buoyancy has not been quantified. Here, we examined the quantitative contribution of the swimbladder to buoyancy and three-dimensional stability in a small cyprinid, the zebrafish (Danio rerio). In aquaria during daylight hours, adult animals were observed at mean depths from 10.1 ± 6.0 to 14.2 ± 5.6 cm below the surface. Fish mass and whole-body volume were linearly correlated (r² = 0.96) over a wide range of body size (0.16-0.73 g); mean whole-body density was 1.01 ± 0.09 g cm⁻³. Stereological estimations of swimbladder volume from linear dimensions of lateral X-ray images and direct measurements of gas volumes recovered by puncture from the same swimbladders showed that results from these two methods were highly correlated (r² = 0.85). The geometric regularity of the swimbladder thus permitted its volume to be accurately estimated from a single lateral image. Mean body density in the absence of the swimbladder was 1.05 ± 0.04 g cm⁻³. The swimbladder occupied 5.1 ± 1.4% of total body volume, thus reducing whole-body density significantly. The location of the centers of mass and buoyancy along rostro-caudal and dorso-ventral axes overlapped near the ductus communicans, a constriction between the anterior and posterior swimbladder chambers. Our work demonstrates that the swimbladder of the adult zebrafish contributes significantly to buoyancy and attitude stability. Furthermore, we describe and verify a stereological method for estimating swimbladder volume that will aid future studies of the functions of this organ. © 2008 Wiley-Liss, Inc.

  6. Number of holes contained within the Fermi surface volume in underdoped high-temperature superconductors

    DOE PAGES

    Harrison, Neil

    2016-08-16

    Here, we provide a potential solution to the longstanding problem relating Fermi surface reconstruction to the number of holes contained within the Fermi surface volume in underdoped high-Tc superconductors. On considering uniaxial and biaxial charge-density wave order, we show that there exists a relationship between the ordering wave vector, the hole doping, and the cross-sectional area of the reconstructed Fermi surface whose precise form depends on the volume of the starting Fermi surface. We consider a “large” starting Fermi surface comprising 1+p hole carriers, as predicted by band structure calculations, and a “small” starting Fermi surface comprising p hole carriers, as proposed in models in which the Coulomb repulsion remains the dominant energy. Using the reconstructed Fermi surface cross-sectional area obtained in quantum oscillation experiments in YBa2Cu3O6+x and HgBa2CuO4+x and the established methods for estimating the chemical hole doping, we find the ordering vectors obtained from x-ray scattering measurements to show a close correspondence with those expected for the small starting Fermi surface. We therefore show the quantum oscillation frequency and charge-density wave vectors provide accurate estimates for the number of holes contributing to the Fermi surface volume in the pseudogap regime.

  8. Feasibility of anomaly detection and characterization using trans-admittance mammography with 60 × 60 electrode array

    NASA Astrophysics Data System (ADS)

    Zhao, Mingkang; Wi, Hun; Lee, Eun Jung; Woo, Eung Je; In Oh, Tong

    2014-10-01

    Electrical impedance imaging has the potential to detect an early stage of breast cancer due to higher admittivity values compared with those of normal breast tissues. The tumor size and extent of axillary lymph node involvement are important parameters to evaluate the breast cancer survival rate. Additionally, the anomaly characterization is required to distinguish a malignant tumor from a benign tumor. In order to overcome the limitation of breast cancer detection using impedance measurement probes, we developed the high density trans-admittance mammography (TAM) system with a 60 × 60 electrode array and produced trans-admittance maps obtained at several frequency pairs. We applied the anomaly detection algorithm to the high density TAM system for estimating the volume and position of a breast tumor. We tested four different sizes of anomaly with three different conductivity contrasts at four different depths. From multifrequency trans-admittance maps, we can readily observe the transversal position of an anomaly and estimate its volume and depth. Specifically, the depth estimates obtained with the new formula based on the Laplacian of the trans-admittance map were accurate and independent of anomaly size and conductivity contrast. The volume estimation was dependent on the conductivity contrast between anomaly and background in the breast phantom. We characterized two testing anomalies using frequency difference trans-admittance data to eliminate the dependency on anomaly position and size. We confirmed the anomaly detection and characterization algorithm with the high density TAM system on bovine breast tissue. Both results showed the feasibility of detecting the size and position of an anomaly and of characterizing tissue for breast cancer screening.

  9. Fusing Continuous-Valued Medical Labels Using a Bayesian Model.

    PubMed

    Zhu, Tingting; Dunkley, Nic; Behar, Joachim; Clifton, David A; Clifford, Gari D

    2015-12-01

    With the rapid increase in volume of time series medical data available through wearable devices, there is a need to employ automated algorithms to label data. Examples of labels include interventions, changes in activity (e.g. sleep) and changes in physiology (e.g. arrhythmias). However, automated algorithms tend to be unreliable, resulting in lower-quality care. Expert annotations are scarce, expensive, and prone to significant inter- and intra-observer variance. To address these problems, a Bayesian Continuous-valued Label Aggregator (BCLA) is proposed to provide a reliable estimation of label aggregation while accurately inferring the precision and bias of each algorithm. The BCLA was applied to QT interval (pro-arrhythmic indicator) estimation from the electrocardiogram using labels from the 2006 PhysioNet/Computing in Cardiology Challenge database. It was compared to mean and median voting and a previously proposed Expectation Maximization (EM) label aggregation approach. While accurately predicting each labelling algorithm's bias and precision, the root-mean-square error of the BCLA was 11.78 ± 0.63 ms, significantly outperforming the best Challenge entry (15.37 ± 2.13 ms) as well as the EM, mean, and median voting strategies (14.76 ± 0.52, 17.61 ± 0.55, and 14.43 ± 0.57 ms, respectively; p < 0.0001). The BCLA could therefore provide accurate estimates for medical continuous-valued label tasks in an unsupervised manner even when the ground truth is not available.
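
    Not the BCLA itself, but a minimal sketch of the underlying idea of bias correction and precision weighting, assuming each algorithm's bias and variance have already been estimated (the BCLA infers these jointly and without ground truth).

      # Precision-weighted fusion sketch: de-bias each algorithm's label, then
      # combine with inverse-variance weights.
      import numpy as np

      def fuse(labels, bias, variance) -> float:
          """labels, bias, variance: one value per algorithm for a single item."""
          precision = 1.0 / np.asarray(variance, dtype=float)
          corrected = np.asarray(labels, dtype=float) - np.asarray(bias, dtype=float)
          return float(np.sum(precision * corrected) / np.sum(precision))

      # Three hypothetical QT estimates (ms) with assumed biases and variances
      qt = fuse(labels=[412.0, 398.0, 405.0], bias=[10.0, -5.0, 0.0], variance=[25.0, 100.0, 49.0])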

  10. Development and Evaluation of a Semi-automated Segmentation Tool and a Modified Ellipsoid Formula for Volumetric Analysis of the Kidney in Non-contrast T2-Weighted MR Images.

    PubMed

    Seuss, Hannes; Janka, Rolf; Prümmer, Marcus; Cavallaro, Alexander; Hammon, Rebecca; Theis, Ragnar; Sandmair, Martin; Amann, Kerstin; Bäuerle, Tobias; Uder, Michael; Hammon, Matthias

    2017-04-01

    Volumetric analysis of the kidney parenchyma provides additional information for the detection and monitoring of various renal diseases. Therefore the purposes of the study were to develop and evaluate a semi-automated segmentation tool and a modified ellipsoid formula for volumetric analysis of the kidney in non-contrast T2-weighted magnetic resonance (MR)-images. Three readers performed semi-automated segmentation of the total kidney volume (TKV) in axial, non-contrast-enhanced T2-weighted MR-images of 24 healthy volunteers (48 kidneys) twice. A semi-automated threshold-based segmentation tool was developed to segment the kidney parenchyma. Furthermore, the three readers measured renal dimensions (length, width, depth) and applied different formulas to calculate the TKV. Manual segmentation served as a reference volume. Volumes of the different methods were compared and time required was recorded. There was no significant difference between the semi-automatically and manually segmented TKV (p = 0.31). The difference in mean volumes was 0.3 ml (95% confidence interval (CI), -10.1 to 10.7 ml). Semi-automated segmentation was significantly faster than manual segmentation, with a mean difference = 188 s (220 vs. 408 s); p < 0.05. Volumes did not differ significantly comparing the results of different readers. Calculation of TKV with a modified ellipsoid formula (ellipsoid volume × 0.85) did not differ significantly from the reference volume; however, the mean error was three times higher (difference of mean volumes -0.1 ml; CI -31.1 to 30.9 ml; p = 0.95). Applying the modified ellipsoid formula was the fastest way to get an estimation of the renal volume (41 s). Semi-automated segmentation and volumetric analysis of the kidney in native T2-weighted MR data delivers accurate and reproducible results and was significantly faster than manual segmentation. Applying a modified ellipsoid formula quickly provides an accurate kidney volume.
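
    A minimal sketch of the modified ellipsoid estimate described above (standard ellipsoid volume, π/6 × length × width × depth, scaled by 0.85); the example dimensions are illustrative.

      import math

      def kidney_volume_ml(length_cm: float, width_cm: float, depth_cm: float,
                           correction: float = 0.85) -> float:
          """Modified ellipsoid formula: (pi/6 * L * W * D) * correction factor."""
          return math.pi / 6.0 * length_cm * width_cm * depth_cm * correction

      # Illustrative dimensions for one kidney (cm); 1 cm³ ≈ 1 ml
      vol = kidney_volume_ml(11.0, 5.0, 4.5)   # ≈ 110 ml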

  11. Accuracy analysis of point cloud modeling for evaluating concrete specimens

    NASA Astrophysics Data System (ADS)

    D'Amico, Nicolas; Yu, Tzuyang

    2017-04-01

    Photogrammetric methods such as structure from motion (SFM) have the capability to acquire accurate information about geometric features, surface cracks, and mechanical properties of specimens and structures in civil engineering. Conventional approaches to verifying the accuracy of photogrammetric models usually require the use of other optical techniques such as LiDAR. In this paper, geometric accuracy of photogrammetric modeling is investigated by studying the effects of the number of photos, radius of curvature, and point cloud density (PCD) on estimated lengths, areas, volumes, and different stress states of concrete cylinders and panels. Four plain concrete cylinders and two plain mortar panels were used for the study. A commercially available mobile phone camera was used in collecting all photographs. Agisoft PhotoScan software was applied in photogrammetric modeling of all concrete specimens. From our results, it was found that increasing the number of photos does not necessarily improve the geometric accuracy of point cloud models (PCM). It was also found that the effect of radius of curvature is not significant when compared with those of the number of photos and PCD. A PCD threshold of 15.7194 pts/cm³ is proposed to construct reliable and accurate PCM for condition assessment. At this PCD threshold, all errors for estimating lengths, areas, and volumes were less than 5%. Finally, from the study of the mechanical properties of a plain concrete cylinder, we have found that the increase of stress level inside the concrete cylinder can be captured by the increase of radial strain in its PCM.

  12. Accurate fluid force measurement based on control surface integration

    NASA Astrophysics Data System (ADS)

    Lentink, David

    2018-01-01

    Nonintrusive 3D fluid force measurements are still challenging to conduct accurately for freely moving animals, vehicles, and deforming objects. Two techniques address this: 3D particle image velocimetry (PIV) and a newer technique, the aerodynamic force platform (AFP). Both rely on the control volume integral for momentum; whereas PIV requires numerical integration of flow fields, the AFP performs the integration mechanically based on rigid walls that form the control surface. The accuracy of both PIV and AFP measurements based on the control surface integration is thought to hinge on determining the unsteady body force associated with the acceleration of the volume of displaced fluid. Here, I introduce a set of non-dimensional error ratios to show which fluid and body parameters make the error negligible. The unsteady body force is insignificant in all conditions where the average density of the body is much greater than the density of the fluid, e.g., in gas. Whenever a strongly deforming body experiences significant buoyancy and acceleration, the error is significant. Remarkably, this error can be entirely corrected for with an exact factor provided that the body has a sufficiently homogeneous density or acceleration distribution, which is common in liquids. The correction factor for omitting the unsteady body force depends only on the fluid density, ρf, and the body density, ρb. Whereas these straightforward solutions work even at the liquid-gas interface in a significant number of cases, they do not work for generalized bodies undergoing buoyancy in combination with appreciable body density inhomogeneity, volume change (PIV), or volume rate-of-change (PIV and AFP). In these less common cases, the 3D body shape needs to be measured and resolved in time and space to estimate the unsteady body force. The analysis shows that, once the unsteady body force is accounted for, fluid force can be determined non-intrusively and accurately in most applications.

  13. Semiautomatic segmentation of liver metastases on volumetric CT images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yan, Jiayong; Schwartz, Lawrence H.; Zhao, Binsheng, E-mail: bz2166@cumc.columbia.edu

    2015-11-15

    Purpose: Accurate segmentation and quantification of liver metastases on CT images are critical to surgery/radiation treatment planning and therapy response assessment. To date, there are no reliable methods to perform such segmentation automatically. In this work, the authors present a method for semiautomatic delineation of liver metastases on contrast-enhanced volumetric CT images. Methods: The first step is to manually place a seed region-of-interest (ROI) in the lesion on an image. This ROI will (1) serve as an internal marker and (2) assist in automatically identifying an external marker. With these two markers, lesion contour on the image can be accurately delineated using traditional watershed transformation. Density information will then be extracted from the segmented 2D lesion and help determine the 3D connected object that is a candidate of the lesion volume. The authors have developed a robust strategy to automatically determine internal and external markers for marker-controlled watershed segmentation. By manually placing a seed region-of-interest in the lesion to be delineated on a reference image, the method can automatically determine dual threshold values to approximately separate the lesion from its surrounding structures and refine the thresholds from the segmented lesion for the accurate segmentation of the lesion volume. This method was applied to 69 liver metastases (1.1–10.3 cm in diameter) from a total of 15 patients. An independent radiologist manually delineated all lesions and the resultant lesion volumes served as the “gold standard” for validation of the method’s accuracy. Results: The algorithm received a median overlap, overestimation ratio, and underestimation ratio of 82.3%, 6.0%, and 11.5%, respectively, and a median average boundary distance of 1.2 mm. Conclusions: Preliminary results have shown that volumes of liver metastases on contrast-enhanced CT images can be accurately estimated by a semiautomatic segmentation method.

  14. Nano-Scale Characterization of Al-Mg Nanocrystalline Alloys

    NASA Astrophysics Data System (ADS)

    Harvey, Evan; Ladani, Leila

    Materials with nano-scale microstructure have become increasingly popular due to their substantially increased strength. The increase in strength as a result of decreasing grain size is described by the Hall-Petch equation. With increased interest in miniaturization of components, methods of mechanical characterization of small volumes of material are necessary because traditional means such as tensile testing become increasingly difficult with such small test specimens. This study seeks to characterize elastic-plastic properties of nanocrystalline Al-5083 through nanoindentation and related data analysis techniques. By using nanoindentation, accurate predictions of the elastic modulus and hardness of the alloy were attained. Also, the employed data analysis model provided reasonable estimates of the plastic properties (strain-hardening exponent and yield stress), lending credibility to this procedure as an accurate, full mechanical characterization method.
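
    A minimal sketch of the Hall-Petch relation mentioned above, sigma_y = sigma_0 + k_y * d^(-1/2); the friction stress and Hall-Petch coefficient used here are placeholders, not measured values for Al-5083.

      # Hall-Petch sketch: yield strength rises as grain size d shrinks.
      def hall_petch_yield_mpa(grain_size_um: float, sigma0_mpa: float = 50.0,
                               k_mpa_um05: float = 100.0) -> float:
          """sigma_y = sigma_0 + k * d^(-1/2); sigma_0 and k are placeholder values."""
          return sigma0_mpa + k_mpa_um05 * grain_size_um ** -0.5

      coarse = hall_petch_yield_mpa(10.0)   # ~82 MPa at 10 µm grains
      nano = hall_petch_yield_mpa(0.1)      # ~366 MPa at 100 nm grains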

  15. Determination of bromine in selected polymer materials by a wavelength-dispersive X-ray fluorescence spectrometric method - Critical thickness problem and solutions

    NASA Astrophysics Data System (ADS)

    Gorewoda, Tadeusz; Mzyk, Zofia; Anyszkiewicz, Jacek; Charasińska, Jadwiga

    2015-04-01

    The purpose of this study was to develop an accurate method for the determination of bromine in polymer materials using X-ray fluorescence spectrometry when the thickness of the sample is less than the bromine critical thickness (tc) value. This is particularly important for analyzing compliance with the Restriction of Hazardous Substances Directive. Mathematically and experimentally estimated tc values in polyethylene and cellulose matrices were up to several millimeters. Four methods were developed to obtain an accurate result. These methods include the addition of an element with a high mass absorption coefficient, the measurement of the total bromine contained in a defined volume of the sample, the exploitation of tube-Rayleigh line intensities, and the use of the Br-Lβ line.

  16. Towards local progression estimation of pulmonary emphysema using CT.

    PubMed

    Staring, M; Bakker, M E; Stolk, J; Shamonin, D P; Reiber, J H C; Stoel, B C

    2014-02-01

    Whole lung densitometry on chest CT images is an accepted method for measuring tissue destruction in patients with pulmonary emphysema in clinical trials. Progression measurement is required for evaluation of change in health condition and the effect of drug treatment. Information about the location of emphysema progression within the lung may be important for the correct interpretation of drug efficacy, or for determining a treatment plan. The purpose of this study is therefore to develop and validate methods that enable the local measurement of lung density changes, which requires proper modeling of the effect of respiration on density. Four methods, all based on registration of baseline and follow-up chest CT scans, are compared. The first naïve method subtracts registered images. The second employs the so-called dry sponge model, where volume correction is performed using the determinant of the Jacobian of the transformation. The third and the fourth introduce a novel adaptation of the dry sponge model that circumvents its constant-mass assumption, which is shown to be invalid. The latter two methods require a third CT scan at a different inspiration level to estimate the patient-specific density-volume slope, where one method employs a global and the other a local slope. The methods were validated on CT scans of a phantom mimicking the lung, where mass and volume could be controlled. In addition, validation was performed on data of 21 patients with pulmonary emphysema. The image registration method was optimized leaving a registration error below half the slice increment (median 1.0 mm). The phantom study showed that the locally adapted slope model most accurately measured local progression. The systematic error in estimating progression, as measured on the phantom data, was below 2 g/l for a 70 ml (6%) volume difference, and 5 g/l for a 210 ml (19%) difference, if volume correction was applied. On the patient data an underlying linearity assumption relating lung volume change with density change was shown to hold (fit R² = 0.94), and globalized versions of the local models are consistent with global results (R² of 0.865 and 0.882 for the two adapted slope models, respectively). In conclusion, image matching and subsequent analysis of differences according to the proposed lung models (i) has good local registration accuracy on patient data, (ii) effectively eliminates a dependency on inspiration level at acquisition time, (iii) accurately predicts progression in phantom data, and (iv) is reasonably consistent with global results in patient data. It is therefore a potential future tool for assessing local emphysema progression in drug evaluation trials and in clinical practice.
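
    A minimal sketch of the dry sponge volume correction described above, assuming the follow-up density map has already been warped to baseline geometry and the Jacobian determinant of that transformation is available; the toy values are illustrative only.

      # Dry sponge sketch: under the constant-mass assumption, local expansion
      # (Jacobian determinant J > 1) dilutes density, so the warped follow-up
      # density is multiplied by J before subtracting the baseline density.
      import numpy as np

      def local_density_change(baseline_gl: np.ndarray,
                               followup_warped_gl: np.ndarray,
                               jacobian_det: np.ndarray) -> np.ndarray:
          return followup_warped_gl * jacobian_det - baseline_gl

      # Toy 2x2 maps in g/l: the upper-left region expanded (J = 1.2) at follow-up
      baseline = np.array([[60.0, 55.0], [50.0, 45.0]])
      followup = np.array([[48.0, 50.0], [49.0, 44.0]])
      jdet = np.array([[1.2, 1.1], [1.0, 1.0]])
      print(local_density_change(baseline, followup, jdet))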

  17. A comparison between a new model and current models for estimating trunk segment inertial parameters.

    PubMed

    Wicke, Jason; Dumas, Genevieve A; Costigan, Patrick A

    2009-01-05

    Modeling of the body segments to estimate segment inertial parameters is required in the kinetic analysis of human motion. A new geometric model for the trunk has been developed that uses various cross-sectional shapes to estimate segment volume and adopts a non-uniform density function that is gender-specific. The goal of this study was to test the accuracy of the new model for estimating the trunk's inertial parameters by comparing it to current models used in biomechanical research. Trunk inertial parameters estimated from dual X-ray absorptiometry (DXA) were used as the standard. Twenty-five female and 24 male college-aged participants were recruited for the study. Comparisons of the new model to the accepted models were accomplished by determining the error between the models' trunk inertial estimates and that from DXA. Results showed that the new model was more accurate across all inertial estimates than the other models. The new model had errors within 6.0% for both genders, whereas the other models had higher average errors ranging from 10% to over 50% and were much more inconsistent between the genders. In addition, there was little consistency in the level of accuracy for the other models when estimating the different inertial parameters. These results suggest that the new model provides more accurate and consistent trunk inertial estimates than the other models for both female and male college-aged individuals. However, similar studies need to be performed using other populations, such as elderly individuals or individuals with a distinct morphology (e.g., obese). In addition, the effect of using different models on the outcome of kinetic parameters, such as joint moments and forces, needs to be assessed.

  18. User-initialized active contour segmentation and golden-angle real-time cardiovascular magnetic resonance enable accurate assessment of LV function in patients with sinus rhythm and arrhythmias.

    PubMed

    Contijoch, Francisco; Witschey, Walter R T; Rogers, Kelly; Rears, Hannah; Hansen, Michael; Yushkevich, Paul; Gorman, Joseph; Gorman, Robert C; Han, Yuchi

    2015-05-21

    Data obtained during arrhythmia is retained in real-time cardiovascular magnetic resonance (rt-CMR), but there is limited and inconsistent evidence to show that rt-CMR can accurately assess beat-to-beat variation in left ventricular (LV) function or during an arrhythmia. Multi-slice, short axis cine and real-time golden-angle radial CMR data was collected in 22 clinical patients (18 in sinus rhythm and 4 patients with arrhythmia). A user-initialized active contour segmentation (ACS) software was validated via comparison to manual segmentation on clinically accepted software. For each image in the 2D acquisitions, slice volume was calculated and global LV volumes were estimated via summation across the LV using multiple slices. Real-time imaging data was reconstructed using different image exposure times and frame rates to evaluate the effect of temporal resolution on measured function in each slice via ACS. Finally, global volumetric function of ectopic and non-ectopic beats was measured using ACS in patients with arrhythmias. ACS provides global LV volume measurements that are not significantly different from manual quantification of retrospectively gated cine images in sinus rhythm patients. With an exposure time of 95.2 ms and a frame rate of > 89 frames per second, golden-angle real-time imaging accurately captures hemodynamic function over a range of patient heart rates. In four patients with frequent ectopic contractions, initial quantification of the impact of ectopic beats on hemodynamic function was demonstrated. User-initialized active contours and golden-angle real-time radial CMR can be used to determine time-varying LV function in patients. These methods will be very useful for the assessment of LV function in patients with frequent arrhythmias.

  19. Automatic estimation of extent of resection and residual tumor volume of patients with glioblastoma.

    PubMed

    Meier, Raphael; Porz, Nicole; Knecht, Urspeter; Loosli, Tina; Schucht, Philippe; Beck, Jürgen; Slotboom, Johannes; Wiest, Roland; Reyes, Mauricio

    2017-10-01

    OBJECTIVE In the treatment of glioblastoma, residual tumor burden is the only prognostic factor that can be actively influenced by therapy. Therefore, an accurate, reproducible, and objective measurement of residual tumor burden is necessary. This study aimed to evaluate the use of a fully automatic segmentation method-brain tumor image analysis (BraTumIA)-for estimating the extent of resection (EOR) and residual tumor volume (RTV) of contrast-enhancing tumor after surgery. METHODS The imaging data of 19 patients who underwent primary resection of histologically confirmed supratentorial glioblastoma were retrospectively reviewed. Contrast-enhancing tumors apparent on structural preoperative and immediate postoperative MR imaging in this patient cohort were segmented by 4 different raters and the automatic segmentation BraTumIA software. The manual and automatic results were quantitatively compared. RESULTS First, the interrater variabilities in the estimates of EOR and RTV were assessed for all human raters. Interrater agreement in terms of the coefficient of concordance (W) was higher for RTV (W = 0.812; p < 0.001) than for EOR (W = 0.775; p < 0.001). Second, the volumetric estimates of BraTumIA for all 19 patients were compared with the estimates of the human raters, which showed that for both EOR (W = 0.713; p < 0.001) and RTV (W = 0.693; p < 0.001) the estimates of BraTumIA were generally located close to or between the estimates of the human raters. No statistically significant differences were detected between the manual and automatic estimates. BraTumIA showed a tendency to overestimate contrast-enhancing tumors, leading to moderate agreement with expert raters with respect to the literature-based, survival-relevant threshold values for EOR. CONCLUSIONS BraTumIA can generate volumetric estimates of EOR and RTV, in a fully automatic fashion, which are comparable to the estimates of human experts. However, automated analysis showed a tendency to overestimate the volume of a contrast-enhancing tumor, whereas manual analysis is prone to subjectivity, thereby causing considerable interrater variability.
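
    A minimal sketch of the two quantities being compared, given pre- and postoperative contrast-enhancing tumor volumes from either manual or automatic segmentation; the example volumes are illustrative.

      def extent_of_resection_pct(preop_volume_cc: float, postop_volume_cc: float) -> float:
          """EOR = (preoperative - postoperative) / preoperative volume, in percent."""
          return 100.0 * (preop_volume_cc - postop_volume_cc) / preop_volume_cc

      def residual_tumor_volume_cc(postop_volume_cc: float) -> float:
          """RTV is simply the postoperative contrast-enhancing volume."""
          return postop_volume_cc

      # Illustrative volumes (cm³)
      eor = extent_of_resection_pct(32.0, 1.6)   # 95.0 %
      rtv = residual_tumor_volume_cc(1.6)        # 1.6 cm³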

  20. Best method for right atrial volume assessment by two-dimensional echocardiography: validation with magnetic resonance imaging.

    PubMed

    Ebtia, Mahasti; Murphy, Darra; Gin, Kenneth; Lee, Pui K; Jue, John; Nair, Parvathy; Mayo, John; Barnes, Marion E; Thompson, Darby J S; Tsang, Teresa S M

    2015-05-01

    Echocardiographic methods for estimating right atrial (RA) volume have not been standardized. Our aim was to evaluate two-dimensional (2D) echocardiographic methods of RA volume assessment, using RA volume by magnetic resonance imaging (MRI) as the reference. Right atrial volume was assessed in 51 patients (mean age 63 ± 14 years, 33 female) who underwent comprehensive 2D echocardiography and cardiac MRI for clinically indicated reasons. Echocardiographic RA volume methods included (1) biplane area length, using four-chamber view twice (biplane 4C-4C); (2) biplane area length, using four-chamber and subcostal views (biplane 4C-subcostal); and (3) single plane Simpson's method of disks (Simpson's). Echocardiographic RA volumes as well as linear RA major and minor dimensions were compared to RA volume by MRI using correlation and Bland-Altman methods, and evaluated for inter-observer reproducibility and accuracy in discriminating RA enlargement. All echocardiography volumetric methods performed well compared to MRI, with Pearson's correlation of 0.98 and concordance correlation ≥0.91 for each. For bias and limits of agreement, biplane 4C-4C (bias -4.81 mL/m², limits of agreement ±9.8 mL/m²) and Simpson's (bias -5.15 mL/m², limits of agreement ±10.1 mL/m²) outperformed biplane 4C-subcostal (bias -8.36 mL/m², limits of agreement ±12.5 mL/m²). Accuracy for discriminating RA enlargement was higher for all volumetric methods than for linear measurements. Inter-observer variability was satisfactory across all methods. Compared to MRI, biplane 4C-4C and single plane Simpson's are highly accurate and reproducible 2D echocardiography methods for estimating RA volume. Linear dimensions are inaccurate and should be abandoned. © 2014, Wiley Periodicals, Inc.
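
    For orientation, a minimal sketch of the standard biplane area-length and single-plane method-of-disks formulas that these echocardiographic estimates are based on; the measurements are illustrative and indexing to body surface area is omitted.

      import math

      def biplane_area_length_ml(area1_cm2: float, area2_cm2: float, length_cm: float) -> float:
          """Biplane area-length: V = 8 * A1 * A2 / (3 * pi * L); using the same
          four-chamber area twice corresponds to the 'biplane 4C-4C' variant."""
          return 8.0 * area1_cm2 * area2_cm2 / (3.0 * math.pi * length_cm)

      def simpsons_single_plane_ml(disk_diameters_cm, length_cm: float) -> float:
          """Single-plane method of disks: stack of N circular disks of equal height."""
          n = len(disk_diameters_cm)
          disk_height = length_cm / n
          return sum(math.pi / 4.0 * d ** 2 * disk_height for d in disk_diameters_cm)

      # Illustrative right atrial measurements
      v_al = biplane_area_length_ml(18.0, 18.0, 5.0)                    # ≈ 55 ml
      v_sd = simpsons_single_plane_ml([2.0, 3.5, 4.0, 3.8, 2.5], 5.0)   # ≈ 42 ml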

  1. Nonintrusive iris image acquisition system based on a pan-tilt-zoom camera and light stripe projection

    NASA Astrophysics Data System (ADS)

    Yoon, Soweon; Jung, Ho Gi; Park, Kang Ryoung; Kim, Jaihie

    2009-03-01

    Although iris recognition is one of the most accurate biometric technologies, it has not yet been widely used in practical applications. This is mainly due to user inconvenience during the image acquisition phase. Specifically, users have to adjust their eye position within a small capture volume at a close distance from the system. To overcome these problems, we propose a novel iris image acquisition system that provides users with an unconstrained environment: a large operating range, freedom of movement from a standing posture, and capture of good-quality iris images in an acceptable time. The proposed system has the following three contributions compared with previous works: (1) the capture volume is significantly increased by using a pan-tilt-zoom (PTZ) camera guided by a light stripe projection, (2) the iris location in the large capture volume is found quickly by 1-D vertical face searching from the user's horizontal position obtained by the light stripe projection, and (3) zooming and focusing on the user's irises at a distance are accurate and fast using the 3-D position of the face estimated by the light stripe projection and the PTZ camera. Experimental results show that the proposed system can capture good-quality iris images in 2.479 s on average at a distance of 1.5 to 3 m, while allowing a limited amount of movement by the user.

  2. Image-derived input function with factor analysis and a-priori information.

    PubMed

    Simončič, Urban; Zanotti-Fregonara, Paolo

    2015-02-01

    Quantitative PET studies often require the cumbersome and invasive procedure of arterial cannulation to measure the input function. This study sought to minimize the number of necessary blood samples by developing a factor-analysis-based image-derived input function (IDIF) methodology for dynamic PET brain studies. IDIF estimation was performed as follows: (a) carotid and background regions were segmented manually on an early PET time frame; (b) blood-weighted and tissue-weighted time-activity curves (TACs) were extracted with factor analysis; (c) factor analysis results were denoised and scaled using the voxels with the highest blood signal; (d) using population data and one blood sample at 40 min, the whole-blood TAC was estimated from postprocessed factor analysis results; and (e) the parent concentration was finally estimated by correcting the whole-blood curve with measured radiometabolite concentrations. The methodology was tested using data from 10 healthy individuals imaged with [¹¹C](R)-rolipram. The accuracy of IDIFs was assessed against full arterial sampling by comparing the area under the curve of the input functions and by calculating the total distribution volume (VT). The shape of the image-derived whole-blood TAC matched the reference arterial curves well, and the whole-blood areas under the curve were accurately estimated (mean error 1.0±4.3%). The relative Logan VT error was -4.1±6.4%. Compartmental modeling and spectral analysis gave less accurate VT results compared with Logan. A factor-analysis-based IDIF for [¹¹C](R)-rolipram brain PET studies that relies on a single blood sample and population data can be used for accurate quantification of Logan VT values.

  3. The description of a method for accurately estimating creatinine clearance in acute kidney injury.

    PubMed

    Mellas, John

    2016-05-01

    Acute kidney injury (AKI) is a common and serious condition encountered in hospitalized patients. The severity of kidney injury is defined by the RIFLE, AKIN, and KDIGO criteria, which attempt to establish the degree of renal impairment. The KDIGO guidelines state that the creatinine clearance should be measured whenever possible in AKI and that the serum creatinine concentration and creatinine clearance remain the best clinical indicators of renal function. Neither the RIFLE, AKIN, nor KDIGO criteria estimate actual creatinine clearance. Furthermore, there are no accepted methods for accurately estimating creatinine clearance (K) in AKI. The present study describes a unique method for estimating K in AKI using urine creatinine excretion over an established time interval (E), an estimate of creatinine production over the same time interval (P), and the estimated static glomerular filtration rate (sGFR), at time zero, utilizing the CKD-EPI formula. Using these variables, estimated creatinine clearance (Ke) = E/P × sGFR. The method was tested for validity using simulated patients where actual creatinine clearance (Ka) was compared to Ke in several patients, both male and female, and of various ages, body weights, and degrees of renal impairment. These measurements were made at several serum creatinine concentrations in an attempt to determine the accuracy of this method in the non-steady state. In addition, E/P and Ke were calculated in hospitalized patients with AKI seen in nephrology consultation by the author. In these patients the accuracy of the method was determined by looking at the following metrics: E/P>1, E/P<1, and E=P, in an attempt to predict progressive azotemia, recovering azotemia, or stabilization in the level of azotemia, respectively. In addition, it was determined whether Ke<10 ml/min agreed with Ka and whether patients with AKI on renal replacement therapy could safely terminate dialysis if Ke was greater than 5 ml/min. In the simulated patients there were 96 measurements in six different patients where Ka was compared to Ke. The estimated proportion of Ke within 30% of Ka was 0.907 with 95% exact binomial proportion confidence limits. The predictive accuracy of E/P in the study patients was also reported as a proportion and the associated 95% confidence limits: 0.848 (0.800, 0.896) for E/P<1; 0.939 (0.904, 0.974) for E/P>1; and 0.907 (0.841, 0.973) for E=P. A Ke greater than 5 ml/min accurately predicted the ability to terminate renal replacement therapy in AKI. Limitations include the need to measure urine volume accurately. Furthermore, the precision of the method requires accurate estimates of sGFR, while a reasonable measure of P is crucial to estimating Ke. The present study provides the practitioner with a new tool to estimate real-time K in AKI with enough precision to predict the severity of the renal injury, including progression, stabilization, or improvement in azotemia. It is the author's belief that this simple method improves on RIFLE, AKIN, and KDIGO for estimating the degree of renal impairment in AKI and allows a more accurate estimate of K in AKI. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
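
    A minimal sketch of the estimate described above; the excretion and production terms and the static GFR are taken as plain inputs here (in practice E comes from a timed urine collection, P from a production estimate, and sGFR from the CKD-EPI equation), and the example values are illustrative.

      def estimated_clearance_ml_min(urine_creatinine_excretion_mg: float,
                                     estimated_production_mg: float,
                                     sgfr_ml_min: float) -> float:
          """Ke = (E / P) * sGFR, with E and P measured over the same time interval."""
          e_over_p = urine_creatinine_excretion_mg / estimated_production_mg
          return e_over_p * sgfr_ml_min

      # Illustrative values: E and P over the same 24 h interval, sGFR at time zero
      ke = estimated_clearance_ml_min(600.0, 1200.0, sgfr_ml_min=40.0)   # 20 ml/min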

  4. Is computed tomography volumetric assessment of the liver reliable in patients with cirrhosis?

    PubMed Central

    Goumard, Claire; Perdigao, Fabiano; Cazejust, Julien; Zalinski, Stéphane; Soubrane, Olivier; Scatton, Olivier

    2014-01-01

    Objectives: The estimation of liver volume (LV) has been widely studied in normal liver, the density of which is considered to be equivalent to 1 kg/l. In cirrhosis, volumetric evaluation and its correlation to liver mass remain unclear. The aim of this study was to evaluate the accuracy of computed tomography (CT) scanning to assess LV in patients with cirrhosis. Methods: Liver volume was evaluated by CT (CTLV) and correlated to the explanted liver weight (LW) in 49 patients. Liver density (LD) and its association with clinical features were analysed. Commonly used formulae for estimating LV were also evaluated. The real density of cirrhotic liver was prospectively measured in explant specimens. Results: Wide variations between CTLV (in ml) and LW (in g) were found (range: 3–748). Cirrhotic livers in patients with hepatitis B virus infection presented significantly increased LD (P = 0.001) with lower CTLV (P = 0.005). Liver volume as measured by CT was also decreased in patients with Model for End-stage Liver Disease scores of >15 (P = 0.023). Formulae estimating LV correlated poorly with CTLV and LW. The density of cirrhotic liver measured prospectively in 15 patients was 1.1 kg/l. Conclusions: In cirrhotic liver, LV assessed by CT did not correspond to real LW. Liver density changed according to the aetiology and severity of liver disease. Commonly used formulae did not accurately assess LV. PMID:23679861

  5. The correlation between preoperative volumetry and real graft weight: comparison of two volumetry programs.

    PubMed

    Mussin, Nadiar; Sumo, Marco; Lee, Kwang-Woong; Choi, YoungRok; Choi, Jin Yong; Ahn, Sung-Woo; Yoon, Kyung Chul; Kim, Hyo-Sin; Hong, Suk Kyun; Yi, Nam-Joon; Suh, Kyung-Suk

    2017-04-01

    Liver volumetry is a vital component in living donor liver transplantation to determine an adequate graft volume that meets the metabolic demands of the recipient and at the same time ensures donor safety. Most institutions use preoperative contrast-enhanced CT image-based software programs to estimate graft volume. The objective of this study was to evaluate the accuracy of 2 liver volumetry programs (Rapidia vs. Dr. Liver) in preoperative right liver graft estimation compared with real graft weight. Data from 215 consecutive right lobe living donors between October 2013 and August 2015 were retrospectively reviewed. One hundred seven patients were enrolled in the Rapidia group and 108 patients were included in the Dr. Liver group. Estimated graft volumes generated by both software programs were compared with real graft weight measured during surgery, and further classified into minimal difference (≤15%) and big difference (>15%). Correlation coefficients and degree of difference were determined. Linear regressions were calculated and results depicted as scatterplots. Minimal difference was observed in 69.4% of cases from the Dr. Liver group and big difference was seen in 44.9% of cases from the Rapidia group (P = 0.035). Linear regression analysis showed positive correlation in both groups (P < 0.01). However, the correlation coefficient was better for the Dr. Liver group (R² = 0.719) than for the Rapidia group (R² = 0.688). Dr. Liver can predict right liver graft size more accurately and faster than Rapidia, and can facilitate preoperative planning in living donor liver transplantation.
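
    The ≤15% / >15% classification used above is just a relative difference between the software estimate and the graft weight measured on the back table. A small sketch of that comparison follows; the paired values are invented, and the direct ml-versus-g comparison mirrors how the abstract reports it.

```python
def percent_difference(estimated_volume_ml, real_graft_weight_g):
    """Relative difference between preoperative volume estimate and real graft weight.

    The abstract compares estimated volume (ml) directly with graft weight (g),
    so a 1 g ~ 1 ml equivalence is implicitly assumed here.
    """
    return abs(estimated_volume_ml - real_graft_weight_g) / real_graft_weight_g * 100.0

# Hypothetical donors: (estimated volume, real graft weight)
cases = [(780, 720), (655, 810), (900, 880)]
for est, real in cases:
    diff = percent_difference(est, real)
    label = "minimal difference (<=15%)" if diff <= 15 else "big difference (>15%)"
    print(f"estimated {est} ml vs real {real} g -> {diff:.1f}% ({label})")
```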

  6. Measurement of limb volume: laser scanning versus volume displacement.

    PubMed

    McKinnon, John Gregory; Wong, Vanessa; Temple, Walley J; Galbraith, Callum; Ferry, Paul; Clynch, George S; Clynch, Colin

    2007-10-01

    Determining the prevalence and treatment success of surgical lymphedema requires accurate and reproducible measurement. A new method of measurement of limb volume is described. A series of inanimate objects of known and unknown volume was measured using digital laser scanning and water displacement. A similar comparison was made with 10 human volunteers. Digital scanning was evaluated by comparison to the established method of water displacement, then to itself to determine reproducibility of measurement. (1) Objects of known volume: Laser scanning accurately measured the calculated volume but water displacement became less accurate as the size of the object increased. (2) Objects of unknown volume: As average volume increased, there was an increasing bias of underestimation of volume by the water displacement method. The coefficient of reproducibility of water displacement was 83.44 ml. In contrast, the reproducibility of the digital scanning method was 19.0 ml. (3) Human data: The mean difference between water displacement volume and laser scanning volume was 151.7 ml (SD ± 189.5). The coefficient of reproducibility of water displacement was 450.8 ml whereas for laser scanning it was 174 ml. Laser scanning is an innovative method of measuring tissue volume that combines precision and reproducibility and may have clinical utility for measuring lymphedema. © 2007 Wiley-Liss, Inc.
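
    The coefficients of reproducibility quoted above can be computed in several ways; a common Bland-Altman-style convention is roughly twice (1.96 times) the standard deviation of the differences between repeated measurements. The sketch below uses that convention with invented repeat measurements, since the study's exact definition is not reproduced here.

```python
import numpy as np

def coefficient_of_reproducibility(first_ml, second_ml):
    """~1.96 x SD of the differences between repeated measurements (Bland-Altman style)."""
    diffs = np.asarray(first_ml, dtype=float) - np.asarray(second_ml, dtype=float)
    return 1.96 * diffs.std(ddof=1)

# Hypothetical repeated limb-volume measurements (ml) from laser scanning.
scan_1 = [2510, 2660, 2433, 2725, 2590]
scan_2 = [2502, 2671, 2440, 2718, 2597]
print(f"laser scanning reproducibility: {coefficient_of_reproducibility(scan_1, scan_2):.1f} ml")
```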

  7. The single water-surface sweep estimation method accurately estimates very low (n = 4) to low-moderate (n = 25-100) and high (n > 100) Aedes aegypti (Diptera: Culicidae) pupae numbers in large water containers up to 13 times faster than the exhaustive sweep and total count method and without any sediment contamination.

    PubMed

    Romero-Vivas, C M; Llinás, H; Falconar, A K

    2015-03-01

    To confirm that a single water-surface sweep-net collection coupled with three calibration factors (2.6, 3.0 and 3.5 for 1/3, 2/3 and 3/3 water levels, respectively) (WSCF) could accurately estimate very low to high Aedes aegypti pupae numbers in water containers more rapidly than the exhaustive 5-sweep and total count (ESTC) method recommended by WHO. Both methods were compared in semi-field trials using low (n = 25) to moderate (n = 50-100) pupae numbers in a 250-l drum at 1/3, 2/3 and 3/3 water levels, and by their mean-time determinations using 200 pupae in three 220- to 1024-l water containers at these water levels. Accuracy was further assessed using 69.1% (393/569) of the field-based drums and tanks which contained <100 pupae. The WSCF method accurately estimated total populations in the semi-field trials up to 13.0 times faster than the ESTC method (all P < 0.001); no significant differences (all P-values ≥ 0.05) were obtained between the methods for very low (n = 4) to low-moderate (n = 25-100) and high (n > 100) pupae numbers/container and without sediment disturbance. The simple WSCF method sensitively, accurately and robustly estimated total pupae numbers in their principal breeding sites worldwide, containers with >20 l water volumes, significantly (2.7- to 13.0-fold: all P-values <0.001) faster than the ESTC method for very low to high pupae numbers/container without contaminating the clean water by sediment disturbance which is generated using the WHO-recommended ESTC method. The WSCF method seems ideal for global community-based surveillance and control programmes. © 2014 John Wiley & Sons Ltd.
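
    Operationally, the WSCF estimate is a single multiplication: the pupae counted in one surface sweep scaled by the calibration factor for the container's water level (2.6, 3.0, or 3.5 for 1/3, 2/3, and 3/3 full). A minimal sketch under that reading, with an invented sweep count:

```python
# Calibration factors reported in the abstract, keyed by container water level.
CALIBRATION = {"1/3": 2.6, "2/3": 3.0, "3/3": 3.5}

def wscf_estimate(sweep_count, water_level):
    """Estimate total pupae in a container from a single water-surface sweep."""
    return sweep_count * CALIBRATION[water_level]

# Hypothetical container, two-thirds full, 18 pupae collected in one sweep.
print(wscf_estimate(18, "2/3"))  # -> 54.0 estimated pupae
```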

  8. A model for estimating passive integrated transponder (PIT) tag antenna efficiencies for interval-specific emigration rates

    USGS Publications Warehouse

    Horton, G.E.; Dubreuil, T.L.; Letcher, B.H.

    2007-01-01

    Our goal was to understand movement and its interaction with survival for populations of stream salmonids at long-term study sites in the northeastern United States by employing passive integrated transponder (PIT) tags and associated technology. Although our PIT tag antenna arrays spanned the stream channel (at most flows) and were continuously operated, we are aware that aspects of fish behavior, environmental characteristics, and electronic limitations influenced our ability to detect 100% of the emigration from our stream site. Therefore, we required antenna efficiency estimates to adjust observed emigration rates. We obtained such estimates by testing a full-scale physical model of our PIT tag antenna array in a laboratory setting. From the physical model, we developed a statistical model that we used to predict efficiency in the field. The factors most important for predicting efficiency were external radio frequency signal and tag type. For most sampling intervals, there was concordance between the predicted and observed efficiencies, which allowed us to estimate the true emigration rate for our field populations of tagged salmonids. One caveat is that the model's utility may depend on its ability to characterize external radio frequency signals accurately. Another important consideration is the trade-off between the volume of data necessary to model efficiency accurately and the difficulty of storing and manipulating large amounts of data.
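
    The adjustment implied above amounts to dividing the observed emigration count by the antenna array's predicted detection efficiency. A one-function sketch with invented numbers (the statistical efficiency model itself is not reproduced here):

```python
def adjusted_emigration(observed_emigrants, predicted_efficiency):
    """Scale the observed emigration count by the antenna array's detection efficiency."""
    return observed_emigrants / predicted_efficiency

# Hypothetical interval: 42 tagged fish detected leaving, efficiency modeled at 0.87.
print(round(adjusted_emigration(42, 0.87)))   # ~48 estimated true emigrants
```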

  9. Nowcasting Intraseasonal Recreational Fishing Harvest with Internet Search Volume

    PubMed Central

    Carter, David W.; Crosson, Scott; Liese, Christopher

    2015-01-01

    Estimates of recreational fishing harvest are often unavailable until after a fishing season has ended. This lag in information complicates efforts to stay within the quota. The simplest way to monitor quota within the season is to use harvest information from the previous year. This works well when fishery conditions are stable, but is inaccurate when fishery conditions are changing. We develop regression-based models to “nowcast” intraseasonal recreational fishing harvest in the presence of changing fishery conditions. Our basic model accounts for seasonality, changes in the fishing season, and important events in the fishery. Our extended model uses Google Trends data on the internet search volume relevant to the fishery of interest. We demonstrate the model with the Gulf of Mexico red snapper fishery where the recreational sector has exceeded the quota nearly every year since 2007. Our results confirm that data for the previous year works well to predict intraseasonal harvest for a year (2012) where fishery conditions are consistent with historic patterns. However, for a year (2013) of unprecedented harvest and management activity our regression model using search volume for the term “red snapper season” generates intraseasonal nowcasts that are 27% more accurate than the basic model without the internet search information and 29% more accurate than the prediction based on the previous year. Reliable nowcasts of intraseasonal harvest could make in-season (or in-year) management feasible and increase the likelihood of staying within quota. Our nowcasting approach using internet search volume might have the potential to improve quota management in other fisheries where conditions change year-to-year. PMID:26348645
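
    A hedged sketch of the extended model's idea follows: regress weekly harvest on a seasonal trend plus a search-volume index, then nowcast the current week. The data, variable names, and regression specification are placeholders, not the authors' model or the actual Google Trends series.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical weekly data for one fishing season.
weeks = np.arange(1, 21)                       # week of the season
search_index = 40 + 30 * np.exp(-0.2 * weeks) + rng.normal(0, 3, weeks.size)
harvest = 5000 + 80 * search_index - 50 * weeks + rng.normal(0, 300, weeks.size)

# Design matrix: intercept, week-of-season trend, and search volume.
X = np.column_stack([np.ones_like(weeks, dtype=float), weeks, search_index])
coef, *_ = np.linalg.lstsq(X, harvest, rcond=None)

# Nowcast the current (hypothetical) week from its observed search volume.
current = np.array([1.0, 21.0, 45.0])
print("nowcast harvest:", round(float(current @ coef)))
```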

  10. Quantitative estimation of infarct size by simultaneous dual radionuclide single photon emission computed tomography: comparison with peak serum creatine kinase activity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kawaguchi, K.; Sone, T.; Tsuboi, H.

    1991-05-01

    To test the hypothesis that simultaneous dual energy single photon emission computed tomography (SPECT) with technetium-99m (99mTc) pyrophosphate and thallium-201 (201Tl) can provide an accurate estimate of the size of myocardial infarction and to assess the correlation between infarct size and peak serum creatine kinase activity, 165 patients with acute myocardial infarction underwent SPECT 3.2 ± 1.3 (SD) days after the onset of acute myocardial infarction. In the present study, the difference in the intensity of 99mTc-pyrophosphate accumulation was assumed to be attributable to difference in the volume of infarcted myocardium, and the infarct volume was corrected by the ratio of the myocardial activity to the osseous activity to quantify the intensity of 99mTc-pyrophosphate accumulation. The correlation of measured infarct volume with peak serum creatine kinase activity was significant (r = 0.60, p < 0.01). There was also a significant linear correlation between the corrected infarct volume and peak serum creatine kinase activity (r = 0.71, p < 0.01). Subgroup analysis showed a high correlation between corrected volume and peak creatine kinase activity in patients with anterior infarctions (r = 0.75, p < 0.01) but a poor correlation in patients with inferior or posterior infarctions (r = 0.50, p < 0.01). In both the early reperfusion and the no reperfusion groups, a good correlation was found between corrected infarct volume and peak serum creatine kinase activity (r = 0.76 and r = 0.76, respectively; p < 0.01).

  11. Astronautic Structures Manual, Volume 3

    NASA Technical Reports Server (NTRS)

    1975-01-01

    This document (Volumes I, II, and III) presents a compilation of industry-wide methods in aerospace strength analysis that can be carried out by hand, that are general enough in scope to cover most structures encountered, and that are sophisticated enough to give accurate estimates of the actual strength expected. It provides analysis techniques for the elastic and inelastic stress ranges. It serves not only as a catalog of methods not usually available, but also as a reference source for the background of the methods themselves. An overview of the manual is as follows: Section A is a general introduction of methods used and includes sections on loads, combined stresses, and interaction curves; Section B is devoted to methods of strength analysis; Section C is devoted to the topic of structural stability; Section D is on thermal stresses; Section E is on fatigue and fracture mechanics; Section F is on composites; Section G is on rotating machinery; and Section H is on statistics. These three volumes supersede Volumes I and II, NASA TM X-60041 and NASA TM X-60042, respectively.

  12. Dynamic-contrast-enhanced-MRI with extravasating contrast reagent: Rat cerebral glioma blood volume determination

    NASA Astrophysics Data System (ADS)

    Li, Xin; Rooney, William D.; Várallyay, Csanád G.; Gahramanov, Seymur; Muldoon, Leslie L.; Goodman, James A.; Tagge, Ian J.; Selzer, Audrey H.; Pike, Martin M.; Neuwelt, Edward A.; Springer, Charles S.

    2010-10-01

    The accurate mapping of the tumor blood volume (TBV) fraction (vb) is a highly desired imaging biometric goal. It is commonly thought that achieving this is difficult, if not impossible, when small molecule contrast reagents (CRs) are used for the T1-weighted (Dynamic-Contrast-Enhanced) DCE-MRI technique. This is because angiogenic malignant tumor vessels allow facile CR extravasation. Here, a three-site equilibrium water exchange model is applied to DCE-MRI data from the cerebrally-implanted rat brain U87 glioma, a tumor exhibiting rapid CR extravasation. Analyses of segments of the DCE data time-course (and of the entire time-course) with this "shutter-speed" pharmacokinetic model, which admits finite water exchange kinetics, allow TBV estimation from the first-pass segment. Pairwise parameter determinances were tested with grid searches of 2D parametric error surfaces. Parametric maps of tumor blood volume (vb), ve (the extracellular, extravascular space volume fraction), and Ktrans (a CR extravasation rate measure) are presented. The role of the Patlak Plot in DCE-MRI is also considered.
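
    Because the abstract closes by considering the Patlak Plot, here is a generic Patlak linear-regression sketch (tissue curve modeled as Ktrans·∫Cp dτ + vp·Cp). The simulated curves and parameters are placeholders; this is not the three-site shutter-speed analysis used in the study.

```python
import numpy as np

def patlak_fit(t, c_tissue, c_plasma):
    """Linear Patlak fit: C_t(t)/C_p(t) = Ktrans * (int C_p dtau)/C_p(t) + v_p."""
    int_cp = np.concatenate(([0.0], np.cumsum(np.diff(t) * (c_plasma[1:] + c_plasma[:-1]) / 2)))
    x = int_cp / c_plasma
    y = c_tissue / c_plasma
    ktrans, vp = np.polyfit(x, y, 1)   # slope = Ktrans, intercept = v_p
    return ktrans, vp

# Simulated example with known Ktrans and vp.
t = np.linspace(0.05, 5, 100)                       # minutes
c_plasma = 6 * np.exp(-1.5 * t) + 1.0 * np.exp(-0.1 * t)
int_cp = np.concatenate(([0.0], np.cumsum(np.diff(t) * (c_plasma[1:] + c_plasma[:-1]) / 2)))
c_tissue = 0.08 * int_cp + 0.04 * c_plasma          # Ktrans = 0.08 /min, vp = 0.04

ktrans, vp = patlak_fit(t, c_tissue, c_plasma)
print(f"Ktrans ~ {ktrans:.3f} /min, vp ~ {vp:.3f}")
```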

  13. Quantitative photoplethysmography: Lambert-Beer law or inverse function incorporating light scatter.

    PubMed

    Cejnar, M; Kobler, H; Hunyor, S N

    1993-03-01

    Finger blood volume is commonly determined from measurement of infra-red (IR) light transmittance using the Lambert-Beer law of light absorption derived for use in non-scattering media, even when such transmission involves light scatter around the phalangeal bone. Simultaneous IR transmittance and finger volume were measured over the full dynamic range of vascular volumes in seven subjects and outcomes compared with data fitted according to the Lambert-Beer exponential function and an inverse function derived for light attenuation by scattering materials. Curves were fitted by the least-squares method and goodness of fit was compared using standard errors of estimate (SEE). The inverse function gave a better data fit in six of the subjects: mean SEE 1.9 (SD 0.7, range 0.7-2.8) and 4.6 (2.2, 2.0-8.0) respectively (p < 0.02, paired t-test). Thus, when relating IR transmittance to blood volume, as occurs in the finger during measurements of arterial compliance, an inverse function derived from a model of light attenuation by scattering media gives more accurate results than the traditional exponential fit.
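
    The comparison described above — fit two candidate models to transmittance-versus-volume data and compare standard errors of estimate — can be sketched as follows. The data are synthetic, and the inverse-type function used here is a generic placeholder because the paper's exact scattering-derived expression is not reproduced in this record.

```python
import numpy as np
from scipy.optimize import curve_fit

def lambert_beer(v, a, b):
    """Exponential attenuation: transmittance falls exponentially with blood volume."""
    return a * np.exp(-b * v)

def inverse_model(v, a, b, c):
    """Generic inverse-type attenuation model (placeholder for the scattering-derived form)."""
    return a / (v + b) + c

def see(y, y_fit, n_params):
    """Standard error of estimate."""
    return np.sqrt(np.sum((y - y_fit) ** 2) / (len(y) - n_params))

# Hypothetical finger-volume (arbitrary units) vs IR transmittance data.
rng = np.random.default_rng(1)
volume = np.linspace(1.0, 10.0, 40)
transmit = 50.0 / (volume + 2.0) + 3.0 + rng.normal(0, 0.3, volume.size)

p_exp, _ = curve_fit(lambert_beer, volume, transmit, p0=(20, 0.2))
p_inv, _ = curve_fit(inverse_model, volume, transmit, p0=(40, 1.0, 1.0))

print("SEE exponential:", round(see(transmit, lambert_beer(volume, *p_exp), 2), 3))
print("SEE inverse    :", round(see(transmit, inverse_model(volume, *p_inv), 3), 3))
```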

  14. Assessment of the Derivative-Moment Transformation method for unsteady-load estimation

    NASA Astrophysics Data System (ADS)

    Mohebbian, Ali; Rival, David

    2011-11-01

    It is often difficult, if not impossible, to measure the aerodynamic or hydrodynamic forces on a moving body. For this reason, a classical control-volume technique is typically applied to extract the unsteady forces instead. However, measuring the acceleration term within the volume of interest using PIV can be limited by optical access, reflections as well as shadows. Therefore in this study an alternative approach, termed the Derivative-Moment Transformation (DMT) method, is introduced and tested on a synthetic data set produced using numerical simulations. The test case involves the unsteady loading of a flat plate in a two-dimensional, laminar periodic gust. The results suggest that the DMT method can accurately predict the acceleration term so long as appropriate spatial and temporal resolutions are maintained. The major deficiency was found to be the determination of pressure in the wake. The effect of control-volume size was investigated suggesting that smaller domains work best by minimizing the associated error with the pressure field. When increasing the control-volume size, the number of calculations necessary for the pressure-gradient integration increases, in turn substantially increasing the error propagation.

  15. Chart of conversion factors: From English to metric system and metric to English system

    USGS Publications Warehouse

    ,

    1976-01-01

    The conversion factors in the following tables are for conversion of our customary (English) units of measurement to SI units, and for convenience, reciprocals are shown for converting SI units back to the English system. The first table contains rule-of-thumb figures, useful for "getting the feel" of SI units or mental estimation. The succeeding tables contain factors accurate to 3 or more significant figures. Please refer to known reference volumes for additional accuracy, as well as for factors dealing with other scientific notation involving SI units.

  16. A 3D Freehand Ultrasound System for Multi-view Reconstructions from Sparse 2D Scanning Planes

    PubMed Central

    2011-01-01

    Background A significant limitation of existing 3D ultrasound systems comes from the fact that the majority of them work with fixed acquisition geometries. As a result, the users have very limited control over the geometry of the 2D scanning planes. Methods We present a low-cost and flexible ultrasound imaging system that integrates several image processing components to allow for 3D reconstructions from limited numbers of 2D image planes and multiple acoustic views. Our approach is based on a 3D freehand ultrasound system that allows users to control the 2D acquisition imaging using conventional 2D probes. For reliable performance, we develop new methods for image segmentation and robust multi-view registration. We first present a new hybrid geometric level-set approach that provides reliable segmentation performance with relatively simple initializations and minimum edge leakage. Optimization of the segmentation model parameters and its effect on performance is carefully discussed. Second, using the segmented images, a new coarse to fine automatic multi-view registration method is introduced. The approach uses a 3D Hotelling transform to initialize an optimization search. Then, the fine scale feature-based registration is performed using a robust, non-linear least squares algorithm. The robustness of the multi-view registration system allows for accurate 3D reconstructions from sparse 2D image planes. Results Volume measurements from multi-view 3D reconstructions are found to be consistently and significantly more accurate than measurements from single view reconstructions. The volume error of multi-view reconstruction is measured to be less than 5% of the true volume. We show that volume reconstruction accuracy is a function of the total number of 2D image planes and the number of views for calibrated phantom. In clinical in-vivo cardiac experiments, we show that volume estimates of the left ventricle from multi-view reconstructions are found to be in better agreement with clinical measures than measures from single view reconstructions. Conclusions Multi-view 3D reconstruction from sparse 2D freehand B-mode images leads to more accurate volume quantification compared to single view systems. The flexibility and low-cost of the proposed system allow for fine control of the image acquisition planes for optimal 3D reconstructions from multiple views. PMID:21251284

  17. A 3D freehand ultrasound system for multi-view reconstructions from sparse 2D scanning planes.

    PubMed

    Yu, Honggang; Pattichis, Marios S; Agurto, Carla; Beth Goens, M

    2011-01-20

    A significant limitation of existing 3D ultrasound systems comes from the fact that the majority of them work with fixed acquisition geometries. As a result, the users have very limited control over the geometry of the 2D scanning planes. We present a low-cost and flexible ultrasound imaging system that integrates several image processing components to allow for 3D reconstructions from limited numbers of 2D image planes and multiple acoustic views. Our approach is based on a 3D freehand ultrasound system that allows users to control the 2D acquisition imaging using conventional 2D probes. For reliable performance, we develop new methods for image segmentation and robust multi-view registration. We first present a new hybrid geometric level-set approach that provides reliable segmentation performance with relatively simple initializations and minimum edge leakage. Optimization of the segmentation model parameters and its effect on performance is carefully discussed. Second, using the segmented images, a new coarse to fine automatic multi-view registration method is introduced. The approach uses a 3D Hotelling transform to initialize an optimization search. Then, the fine scale feature-based registration is performed using a robust, non-linear least squares algorithm. The robustness of the multi-view registration system allows for accurate 3D reconstructions from sparse 2D image planes. Volume measurements from multi-view 3D reconstructions are found to be consistently and significantly more accurate than measurements from single view reconstructions. The volume error of multi-view reconstruction is measured to be less than 5% of the true volume. We show that volume reconstruction accuracy is a function of the total number of 2D image planes and the number of views for calibrated phantom. In clinical in-vivo cardiac experiments, we show that volume estimates of the left ventricle from multi-view reconstructions are found to be in better agreement with clinical measures than measures from single view reconstructions. Multi-view 3D reconstruction from sparse 2D freehand B-mode images leads to more accurate volume quantification compared to single view systems. The flexibility and low-cost of the proposed system allow for fine control of the image acquisition planes for optimal 3D reconstructions from multiple views.

  18. Automatically measuring brain ventricular volume within PACS using artificial intelligence.

    PubMed

    Yepes-Calderon, Fernando; Nelson, Marvin D; McComb, J Gordon

    2018-01-01

    The picture archiving and communications system (PACS) is currently the standard platform to manage medical images but lacks analytical capabilities. Staying within PACS, the authors have developed an automatic method to retrieve the medical data and access it at a voxel level, decrypted and uncompressed, which enables analytical capabilities while not perturbing the system's daily operation. Additionally, the strategy is secure and vendor independent. Cerebral ventricular volume is important for the diagnosis and treatment of many neurological disorders. A significant change in ventricular volume is readily recognized, but subtle changes, especially over longer periods of time, may be difficult to discern. Clinical imaging protocols and parameters are often varied, making it difficult to use a general solution with standard segmentation techniques. Presented is a segmentation strategy based on an algorithm that uses four features extracted from the medical images to create a statistical estimator capable of determining ventricular volume. When compared with manual segmentations, the correlation was 94%, and the approach holds promise for even better accuracy by incorporating the essentially unlimited data available. The volume of any segmentable structure can be accurately determined utilizing the machine learning strategy presented, which runs fully automatically within the PACS.

  19. Biomass, production and woody detritus in an old coast redwood (Sequoia sempervirens) forest

    USGS Publications Warehouse

    Busing, R.T.; Fujimori, T.

    2005-01-01

    We examined aboveground biomass dynamics, aboveground net primary production (ANPP), and woody detritus input in an old Sequoia sempervirens stand over a three-decade period. Our estimates of aboveground biomass ranged from 3300 to 5800 Mg ha-1. Stem biomass estimates ranged from 3000 to 5200 Mg ha-1. Stem biomass declined 7% over the study interval. Biomass dynamics were patchy, with marked declines in recent tree-fall patches <0.05 ha in size. Larger tree-fall patches approaching 0.2 ha in size were observed outside the study plot. Our estimates of ANPP ranged from 6 to 14 Mg ha-1 yr-1. Estimates of 7 to 10 Mg ha-1 yr-1 were considered to be relatively accurate. Thus, our estimates based on long-term data corroborated the findings of earlier short-term studies. ANPP of old, pure stands of Sequoia was not above average for temperate forests. Even though production was potentially high on a per stem basis, it was moderate at the stand level. We obtained values of 797 m3 ha-1 and 262 Mg ha-1 for coarse woody detritus volume and mass, respectively. Fine woody detritus volume and mass were estimated at 16 m3 ha-1 and 5 Mg ha-1, respectively. Standing dead trees (or snags) comprised 7% of the total coarse detritus volume and 8% of the total mass. Coarse detritus input averaged 5.7 to 6.9 Mg ha-1 yr-1. Assuming steady-state input and pool of coarse detritus, we obtained a decay rate constant of 0.022 to 0.026. The old-growth stand of Sequoia studied had extremely high biomass, but ANPP was moderate and the amount of woody detritus was not exceptionally large. Biomass accretion and loss were not rapid in this stand partly because of the slow population dynamics and low canopy turnover rate of Sequoia at the old-growth stage. Nomenclature: Hickman (1993). © Springer 2005.
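
    The decay rate constant follows directly from the steady-state assumption stated above (annual coarse-detritus input divided by the coarse-detritus pool); a one-line check of that arithmetic:

```python
# Steady-state decay constant k = annual coarse woody detritus input / detritus pool.
detritus_pool = 262.0                   # coarse woody detritus mass (Mg ha-1)
for annual_input in (5.7, 6.9):         # reported input range (Mg ha-1 yr-1)
    print(round(annual_input / detritus_pool, 3))   # -> 0.022 and 0.026
```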

  20. Costing the supply chain for delivery of ACT and RDTs in the public sector in Benin and Kenya.

    PubMed

    Shretta, Rima; Johnson, Brittany; Smith, Lisa; Doumbia, Seydou; de Savigny, Don; Anupindi, Ravi; Yadav, Prashant

    2015-02-05

    Studies have shown that supply chain costs are a significant proportion of total programme costs. Nevertheless, the costs of delivering specific products are poorly understood, and ballpark estimates are often used, resulting in inadequate planning for the budgetary implications of supply chain expenses. The purpose of this research was to estimate the country-level costs of the public sector supply chain for artemisinin-based combination therapy (ACT) and rapid diagnostic tests (RDTs) from the central to the peripheral levels in Benin and Kenya. A micro-costing approach was used and primary data on the various cost components of the supply chain was collected at the central, intermediate, and facility levels between September and November 2013. Information sources included central warehouse databases, health facility records, transport schedules, and expenditure reports. Data from document reviews and semi-structured interviews were used to identify cost inputs and estimate actual costs. Sampling was purposive to isolate key variables of interest. Survey guides were developed and administered electronically. Data were extracted into Microsoft Excel, and the supply chain cost per unit of ACT and RDT distributed by function and level of system was calculated. In Benin, supply chain costs added USD 0.2011 to the initial acquisition cost of ACT and USD 0.3375 to RDTs (normalized to USD 1). In Kenya, they added USD 0.2443 to the acquisition cost of ACT and USD 0.1895 to RDTs (normalized to USD 1). Total supply chain costs accounted for more than 30% of the initial acquisition cost of the products in some cases and these costs were highly sensitive to product volumes. The major cost drivers were found to be labour, transport, and utilities with health facilities carrying the majority of the cost per unit of product. Accurate cost estimates are needed to ensure adequate resources are available for supply chain activities. Product volumes should be considered when costing supply chain functions rather than dollar value. Further work is needed to develop extrapolative costing models that can be applied at country level without extensive micro-costing exercises. This will allow other countries to generate more accurate estimates in the future.

  1. Earth rotation excitation mechanisms derived from geodetic space observations

    NASA Astrophysics Data System (ADS)

    Göttl, F.; Schmidt, M.

    2009-04-01

    Earth rotation variations are caused by mass displacements and motions in the subsystems of the Earth. Via the Gravity Recovery and Climate Experiment (GRACE) satellite mission, gravity field variations can be identified which are caused by mass redistribution in the Earth system. Therefore time variable gravity field models (GFZ RL04, CSR RL04, JPL RL04, ITG-Grace03, GRGS, ...) can be used to derive different impacts on Earth rotation. Furthermore satellite altimetry provides accurate information on sea level anomalies (AVISO, DGFI) which are caused by mass and volume changes of seawater. Since Earth rotation is solely affected by mass variations and motions, the volume (steric) effect has to be reduced from the altimetric observations in order to infer oceanic contributions to Earth rotation variations. Therefore the steric effect is estimated from physical ocean parameters such as temperature and salinity changes in the oceans (WOA05, Ishii). In this study specific individual geophysical contributions to Earth rotation variations are identified by means of a multitude of accurate geodetic space observations in combination with a realistic error propagation. It will be shown that the results for polar motion excitations can be improved by adjusting altimetric and/or gravimetric solutions.

  2. Kidney volume measurement methods for clinical studies on autosomal dominant polycystic kidney disease

    PubMed Central

    Sharma, Kanishka; Caroli, Anna; Quach, Le Van; Petzold, Katja; Bozzetto, Michela; Serra, Andreas L.; Remuzzi, Giuseppe; Remuzzi, Andrea

    2017-01-01

    Background In autosomal dominant polycystic kidney disease (ADPKD), total kidney volume (TKV) is regarded as an important biomarker of disease progression and different methods are available to assess kidney volume. The purpose of this study was to identify the most efficient kidney volume computation method to be used in clinical studies evaluating the effectiveness of treatments on ADPKD progression. Methods and findings We measured single kidney volume (SKV) on two series of MR and CT images from clinical studies on ADPKD (experimental dataset) by two independent operators (expert and beginner), twice, using all of the available methods: polyline manual tracing (reference method), free-hand manual tracing, semi-automatic tracing, Stereology, and the Mid-slice and Ellipsoid methods. Additionally, the expert operator also measured the kidney length. We compared different methods for reproducibility, accuracy, precision, and time required. In addition, we performed a validation study to evaluate the sensitivity of these methods to detect the between-treatment group difference in TKV change over one year, using MR images from a previous clinical study. Reproducibility was higher on CT than MR for all methods, being highest for manual and semiautomatic contouring methods (planimetry). On MR, planimetry showed the highest accuracy and precision, while on CT the accuracy and precision of the planimetry and Stereology methods were comparable. The Mid-slice and Ellipsoid methods, as well as kidney length, were fast but provided only a rough estimate of kidney volume. The results of the validation study indicated that planimetry and Stereology allow a substantially lower number of patients to be enrolled to detect changes in kidney volume induced by drug treatment as compared to other methods. Conclusions Planimetry should be preferred over fast and simplified methods for accurately monitoring ADPKD progression and assessing drug treatment effects. Expert operators, especially on MR images, are required for performing reliable estimation of kidney volume. The use of efficient TKV quantification methods considerably reduces the number of patients to enrol in clinical investigations, making them more feasible and significant. PMID:28558028
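
    For orientation, the Ellipsoid method mentioned above approximates kidney volume from three orthogonal diameters; a commonly used form is V = π/6 × length × width × depth. The sketch below uses that generic formula with invented measurements; the study's exact implementation may differ.

```python
import math

def ellipsoid_kidney_volume(length_cm, width_cm, depth_cm):
    """Ellipsoid approximation of single kidney volume (cm^3)."""
    return math.pi / 6.0 * length_cm * width_cm * depth_cm

# Hypothetical polycystic kidney measurements.
left = ellipsoid_kidney_volume(18.0, 9.5, 10.0)
right = ellipsoid_kidney_volume(17.2, 9.0, 9.4)
print(f"TKV ~ {left + right:.0f} cm^3")
```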

  3. Quantitative comparisons of three automated methods for estimating intracranial volume: A study of 270 longitudinal magnetic resonance images.

    PubMed

    Shang, Xiaoyan; Carlson, Michelle C; Tang, Xiaoying

    2018-04-30

    Total intracranial volume (TIV) is often used as a measure of brain size to correct for individual variability in magnetic resonance imaging (MRI) based morphometric studies. An adjustment of TIV can greatly increase the statistical power of brain morphometry methods. As such, an accurate and precise TIV estimation is of great importance in MRI studies. In this paper, we compared three automated TIV estimation methods (multi-atlas likelihood fusion (MALF), Statistical Parametric Mapping 8 (SPM8) and FreeSurfer (FS)) using longitudinal T1-weighted MR images in a cohort of 70 older participants at elevated sociodemographic risk for Alzheimer's disease. Statistical group comparisons in terms of four different metrics were performed. Furthermore, sex, education level, and intervention status were investigated separately for their impacts on the TIV estimation performance of each method. According to our experimental results, MALF was the least susceptible to atrophy, while SPM8 and FS suffered a loss in precision. In group-wise analysis, MALF was the least sensitive method to group variation, whereas SPM8 was particularly sensitive to sex and FS was unstable with respect to education level. In terms of effectiveness, both MALF and SPM8 delivered a user-friendly performance, while FS was relatively computationally intensive. Copyright © 2018 Elsevier B.V. All rights reserved.

  4. Computerized tomography with 3-dimensional reconstruction for the evaluation of renal size and arterial anatomy in the living kidney donor.

    PubMed

    Janoff, Daniel M; Davol, Patrick; Hazzard, James; Lemmers, Michael J; Paduch, Darius A; Barry, John M

    2004-01-01

    Computerized tomography (CT) with 3-dimensional (3-D) reconstruction has gained acceptance as an imaging study to evaluate living renal donors. We report our experience with this technique in 199 consecutive patients to validate its predictions of arterial anatomy and kidney volumes. Between January 1997 and March 2002, 199 living donor nephrectomies were performed at our institution using an open technique. During the operation arterial anatomy was recorded as well as kidney weight in 98 patients and displacement volume in 27. Each donor had been evaluated preoperatively by CT angiography with 3-D reconstruction. Arterial anatomy described by a staff radiologist was compared with intraoperative findings. CT estimated volumes were reported. Linear correlation graphs were generated to assess the reliability of CT volume predictions. The accuracy of CT angiography for predicting arterial anatomy was 90.5%. However, as the number of renal arteries increased, predictive accuracy decreased. The ability of CT to predict multiple arteries remained high with a positive predictive value of 95.2%. Calculated CT volume and kidney weight significantly correlated (0.654). However, the coefficient of variation index (how much average CT volume differed from measured intraoperative volume) was 17.8%. CT angiography with 3-D reconstruction accurately predicts arterial vasculature in more than 90% of patients and it can be used to compare renal volumes. However, accuracy decreases with multiple renal arteries and volume comparisons may be inaccurate when the difference in kidney volumes is within 17.8%.

  5. Measurement of the volume of the pedicled TRAM flap in immediate breast reconstruction.

    PubMed

    Chang, K P; Lin, S D; Hou, M F; Lee, S S; Tsai, C C

    2001-12-01

    The transverse rectus abdominis musculocutaneous (TRAM) flap is now accepted as the standard for breast reconstruction, but achieving symmetrical breast reconstruction is still a challenge. A precise estimate of the volume of the flap is necessary to reconstruct a symmetrical and aesthetically pleasing breast. Many methods have been developed to overcome this problem, but they have not been suitable for the pedicled TRAM flap. By using a self-made device based on Archimedes' principle, the authors can accurately calculate the volume of the pedicled TRAM flap and reliably predict the breast volume intraoperatively. The "procedure" is based on a self-made box into which the pedicled TRAM flap is placed. Warm saline is added to the box and the flap is then removed. Flap volume is calculated easily by determining the difference between the preestimated volume of the box and the volume of the residual water. From February to May 2000, this method was used on 28 patients to predict breast volume for breast reconstruction. This study revealed that the difference in maximal chest circumference (the index of breast volume) correlates positively with the differences in both volume and weight between the mastectomy specimen and the net TRAM flap. However, the correlation was stronger with the difference in volume (r = 0.677) than with the difference in weight (r = 0.618). These data reveal that the reconstructed breast's volume has a closer relationship with the volume of the net pedicled TRAM flap than with its weight.

  6. Validation of graft and standard liver size predictions in right liver living donor liver transplantation.

    PubMed

    Chan, See Ching; Lo, Chung Mau; Chok, Kenneth S H; Sharr, William W; Cheung, Tan To; Tsang, Simon H Y; Chan, Albert C Y; Fan, Sheung Tat

    2011-12-01

    To assess the accuracy of a formula derived from 159 living liver donors to estimate the liver size of a normal subject: standard liver weight (g) = 218 + body weight (kg) × 12.3 + 51 (if male). Standard liver volume (SLV) is attained by a conversion factor of 1.19 mL/g. The total liver volume (TLV) of each of the subsequent consecutive 126 living liver donors was determined using the right liver graft weight (RGW) on the back table, the right/left liver volume ratio on computed tomography, and the conversion factor. The estimated right liver graft weight (ERGW) was determined by the right liver volume on computed tomography (CT) and the conversion factor. SLV and ERGW were compared with TLV and RGW, respectively, by paired sample t test. Donor characteristics of both series were similar. SLV and TLV were 1,099.6 ± 139.6 and 1,108.5 ± 175.2 mL, respectively (R² = 0.476, p = 0.435). The difference between SLV and TLV was only -8.9 ± 128.2 mL (-1.0 ± 11.7%). ERGW and RGW were 601.5 ± 104.1 and 597.1 ± 102.2 g, respectively (R² = 0.781, p = 0.332). The conversion factor from liver weight to volume for this series was 1.20 mL/g. The difference between ERGW and RGW was 4.3 ± 49.8 g (0.3 ± 8.8%). ERGW was smaller than RGW by over 10% (range 0.21-40.66 g) in 18 of the 126 donors. None had an underestimation of RGW by over 20%. SLV and graft weight estimations were accurate using the formula and conversion factor.
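
    The formula being validated is stated explicitly above, so it translates directly into code; only the example body weight is invented.

```python
def standard_liver_weight(body_weight_kg, male):
    """Standard liver weight (g) = 218 + body weight (kg) x 12.3 + 51 (if male)."""
    return 218 + body_weight_kg * 12.3 + (51 if male else 0)

def standard_liver_volume(body_weight_kg, male, conversion_ml_per_g=1.19):
    """Convert standard liver weight to volume using the stated 1.19 mL/g factor."""
    return standard_liver_weight(body_weight_kg, male) * conversion_ml_per_g

# Hypothetical 65 kg male donor.
print(f"SLW = {standard_liver_weight(65, male=True):.0f} g")
print(f"SLV = {standard_liver_volume(65, male=True):.0f} mL")
```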

  7. Osmotic potential calculations of inorganic and organic aqueous solutions over wide solute concentration levels and temperatures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cochrane, T. T., E-mail: agteca@hotmail.com; Cochrane, T. A., E-mail: tom.cochrane@canterbury.ac.nz

    Purpose: To demonstrate that the authors’ new “aqueous solution vs pure water” equation to calculate osmotic potential may be used to calculate the osmotic potentials of inorganic and organic aqueous solutions over wide ranges of solute concentrations and temperatures. Currently, the osmotic potentials of solutions used for medical purposes are calculated from equations based on the thermodynamics of the gas laws which are only accurate at low temperature and solute concentration levels. Some solutions used in medicine may need their osmotic potentials calculated more accurately to take into account solute concentrations and temperatures. Methods: The authors experimented with their new equation for calculating the osmotic potentials of inorganic and organic aqueous solutions up to and beyond body temperatures by adjusting three of its factors: (a) the volume property of pure water, (b) the number of “free” water molecules per unit volume of solution, “N_f,” and (c) the “t” factor expressing the cooperative structural relaxation time of pure water at given temperatures. Adequate information on the volume property of pure water at different temperatures is available in the literature. However, as little information on the relative densities of inorganic and organic solutions, respectively, at varying temperatures needed to calculate N_f was available, provisional equations were formulated to approximate values. Those values together with tentative t values for different temperatures chosen from values calculated by different workers were substituted into the authors’ equation to demonstrate how osmotic potentials could be estimated over temperatures up to and beyond bodily temperatures. Results: The provisional equations formulated to calculate N_f, the number of free water molecules per unit volume of inorganic and organic solute solutions, respectively, over wide concentration ranges compared well with the calculations of N_f using recorded relative density data at 20 °C. They were subsequently used to estimate N_f values at temperatures up to and in excess of body temperatures. Those values, together with t values at temperatures up to and in excess of body temperatures recorded in the literature, were substituted in the authors’ equation for the provisional calculation of osmotic potentials. The calculations indicated that solution temperatures and solute concentrations have a marked effect on osmotic potentials. Conclusions: Following work to measure the relative densities of aqueous solutions for the calculation of N_f values and the determination of definitive t values up to and beyond bodily temperatures, the authors’ equation would enable the accurate estimation of the osmotic potentials of wide concentrations of aqueous solutions of inorganic and organic solutes over the temperature range. The study illustrates that not only solute concentrations but also temperatures have a marked effect on osmotic potentials, an observation of medical and biological significance.

  8. Statistical modeling and MAP estimation for body fat quantification with MRI ratio imaging

    NASA Astrophysics Data System (ADS)

    Wong, Wilbur C. K.; Johnson, David H.; Wilson, David L.

    2008-03-01

    We are developing small animal imaging techniques to characterize the kinetics of lipid accumulation/reduction of fat depots in response to genetic/dietary factors associated with obesity and metabolic syndromes. Recently, we developed an MR ratio imaging technique that approximately yields lipid/(lipid + water). In this work, we develop a statistical model for the ratio distribution that explicitly includes a partial volume (PV) fraction of fat and a mixture of a Rician and multiple Gaussians. Monte Carlo hypothesis testing showed that our model was valid over a wide range of coefficient of variation of the denominator distribution (c.v.: 0-0.20) and correlation coefficient among the numerator and denominator (ρ: 0-0.95), which cover the typical values that we found in MRI data sets (c.v.: 0.027-0.063, ρ: 0.50-0.75). Then a maximum a posteriori (MAP) estimate for the fat percentage per voxel is proposed. Using a digital phantom with many PV voxels, we found that ratio values were not linearly related to PV fat content and that our method accurately described the histogram. In addition, the new method estimated the ground truth within +1.6% vs. +43% for an approach using an uncorrected ratio image, when we simply threshold the ratio image. On the six genetically obese rat data sets, the MAP estimate gave total fat volumes of 279 ± 45 mL, values 21% smaller than those from the uncorrected ratio images, principally due to the non-linear PV effect. We conclude that our algorithm can increase the accuracy of fat volume quantification even in regions having many PV voxels, e.g. ectopic fat depots.

  9. Computed Tomography Volumetry in Preoperative Living Kidney Donor Assessment for Prediction of Split Renal Function.

    PubMed

    Wahba, Roger; Franke, Mareike; Hellmich, Martin; Kleinert, Robert; Cingöz, Tülay; Schmidt, Matthias C; Stippel, Dirk L; Bangard, Christopher

    2016-06-01

    Transplant centers commonly evaluate split renal function (SRF) with Tc-99m-mercapto-acetyltriglycin (MAG3) scintigraphy in living kidney donation. Alternatively, the kidney volume can be measured based on predonation CT scans. The aim of this study was to identify the most accurate CT volumetry technique for SRF and the prediction of postdonation kidney function (PDKF). Three CT volumetry techniques (modified ellipsoid volume [MELV], smart region of interest [ROI] volume, renal cortex volume [RCV]) were performed in 101 living kidney donors. Preoperation CT volumetric SRF was determined and compared with MAG3-SRF, postoperation donor kidney function, and graft function. The correlation between donors' predonation total kidney volume and predonation kidney function was the highest for RCV (0.58 with creatinine clearance, 0.54 with estimated glomerular filtration rate-Cockcroft-Gault). The predonation volume of the preserved kidney was (ROI, MELV, RCV) 148.0 ± 29.1 cm3, 151.2 ± 35.4 cm3, and 93.9 ± 25.2 cm3 (P < 0.005 MELV vs RCV and ROI vs RCV). Bland-Altman analysis showed agreement between CT volumetry SRF and MAG3-SRF (bias, 95% limits of agreement: ROI vs MAG3 0.4%, -7.7% to 8.6%; MELV vs MAG3 0.4%, -8.9% to 9.7%; RCV vs MAG3 0.8%, -9.1% to 10.7%). The correlation between predonation CT volumetric SRF of the preserved kidney and PDKF at day 3 was r = 0.85 to 0.88; between MAG3-SRF and PDKF it was r = 0.84. The difference of predonation SRF between preserved and donated kidney was the lowest for ROI and RCV (median, 3% and 4%; 95th percentile, 9% and 13%). Overall, renal cortex volumetry seems to be the most accurate technique for the evaluation of predonation SRF and allows a reliable prediction of the donor's PDKF.
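
    Volume-based split renal function is each kidney's volume expressed as a fraction of the total; a minimal sketch with invented cortex volumes (the RCV segmentation itself is not reproduced here):

```python
def split_renal_function(volume_left_cm3, volume_right_cm3):
    """Percentage split of renal function attributed to each kidney from CT volumes."""
    total = volume_left_cm3 + volume_right_cm3
    return 100.0 * volume_left_cm3 / total, 100.0 * volume_right_cm3 / total

# Hypothetical renal cortex volumes of a donor's kidneys.
left_pct, right_pct = split_renal_function(95.0, 88.0)
print(f"left {left_pct:.1f}% / right {right_pct:.1f}%")
```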

  10. Comparison of high-pressure CO 2 sorption isotherms on Eastern and Western US coals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Romanov, V; Hur, T -B; Fazio, J

    2013-10-01

    Accurate estimation of the carbon dioxide (CO2) sorption capacity of coal is important for planning CO2 sequestration efforts. In this work, we investigated sorption and swelling behavior of several Eastern and Western US coal samples from the Central Appalachian Basin and from the San Juan Basin. The CO2 sorption isotherms have been completed at 55°C for as-received and dried samples. The role of mineral components in coal, the coal swelling, the effects of temperature and moisture, and the error propagation have been analyzed. Changes in void volume due to dewatering and other factors such as temporary caging of carbon dioxide molecules in the coal matrix were identified among the main factors affecting the accuracy of the carbon dioxide sorption isotherms. The (helium) void volume in the sample cells was measured before and after the sorption isotherm experiments and was used to build the volume-corrected data plots.

  11. A system for accurate and automated injection of hyperpolarized substrate with minimal dead time and scalable volumes over a large range

    NASA Astrophysics Data System (ADS)

    Reynolds, Steven; Bucur, Adriana; Port, Michael; Alizadeh, Tooba; Kazan, Samira M.; Tozer, Gillian M.; Paley, Martyn N. J.

    2014-02-01

    Over recent years hyperpolarization by dissolution dynamic nuclear polarization has become an established technique for studying metabolism in vivo in animal models. Temporal signal plots obtained from the injected metabolite and daughter products, e.g. pyruvate and lactate, can be fitted to compartmental models to estimate kinetic rate constants. Modeling and physiological parameter estimation can be made more robust by consistent and reproducible injections through automation. An injection system previously developed by us was limited in the injectable volume to between 0.6 and 2.4 ml and injection was delayed due to a required syringe filling step. An improved MR-compatible injector system has been developed that measures the pH of injected substrate, uses flow control to reduce dead volume within the injection cannula and can be operated over a larger volume range. The delay time to injection has been minimized by removing the syringe filling step by use of a peristaltic pump. For 100 μl to 10.000 ml, the volume range typically used for mice to rabbits, the average delivered volume was 97.8% of the demand volume. The standard deviation of delivered volumes was 7 μl for 100 μl and 20 μl for 10.000 ml demand volumes (mean S.D. was 9 μl in this range). In three repeat injections through a fixed 0.96 mm O.D. tube the coefficient of variation for the area under the curve was 2%. For in vivo injections of hyperpolarized pyruvate in tumor-bearing rats, signal was first detected in the input femoral vein cannula at 3-4 s post-injection trigger signal and at 9-12 s in tumor tissue. The pH of the injected pyruvate was 7.1 ± 0.3 (mean ± S.D., n = 10). For small injection volumes, e.g. less than 100 μl, the internal diameter of the tubing contained within the peristaltic pump could be reduced to improve accuracy. Larger injection volumes are limited only by the size of the receiving vessel connected to the pump.

  12. Digital Biomass Accumulation Using High-Throughput Plant Phenotype Data Analysis.

    PubMed

    Rahaman, Md Matiur; Ahsan, Md Asif; Gillani, Zeeshan; Chen, Ming

    2017-09-01

    Biomass is an important phenotypic trait in functional ecology and growth analysis. The typical methods for measuring biomass are destructive, and they require numerous individuals to be cultivated for repeated measurements. With the advent of image-based high-throughput plant phenotyping facilities, non-destructive biomass measuring methods have attempted to overcome this problem. Thus, the estimation of plant biomass of individual plants from their digital images is becoming more important. In this paper, we propose an approach to biomass estimation based on image derived phenotypic traits. Several image-based biomass studies state that the estimation of plant biomass is only a linear function of the projected plant area in images. However, we modeled the plant volume as a function of plant area, plant compactness, and plant age to generalize the linear biomass model. The obtained results confirm the proposed model and can explain most of the observed variance during image-derived biomass estimation. Moreover, a small difference was observed between actual and estimated digital biomass, which indicates that our proposed approach can be used to estimate digital biomass accurately.
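
    The proposed generalization — digital volume as a function of projected area, compactness, and plant age rather than area alone — can be sketched as a multiple linear regression. The data and coefficients below are invented placeholders, not the study's dataset or fitted model.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 60

# Hypothetical image-derived traits for n plants.
area = rng.uniform(50, 400, n)          # projected plant area (scaled pixel units)
compactness = rng.uniform(0.3, 0.9, n)  # plant area / convex-hull area
age = rng.uniform(5, 40, n)             # days after sowing

# Simulated "digital biomass" generated from the traits plus noise.
volume = 0.8 * area + 120 * compactness + 2.5 * age + rng.normal(0, 10, n)

# Multiple linear regression: volume ~ area + compactness + age.
X = np.column_stack([np.ones(n), area, compactness, age])
coef, *_ = np.linalg.lstsq(X, volume, rcond=None)
pred = X @ coef

r2 = 1 - np.sum((volume - pred) ** 2) / np.sum((volume - volume.mean()) ** 2)
print("fitted coefficients:", np.round(coef, 2), " R^2 =", round(r2, 3))
```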

  13. Estimation of construction and demolition waste volume generation in new residential buildings in Spain.

    PubMed

    Villoria Sáez, Paola; del Río Merino, Mercedes; Porras-Amores, César

    2012-02-01

    The management planning of construction and demolition (C&D) waste uses a single indicator which does not provide enough detailed information. Therefore the determination and implementation of other innovative and precise indicators should be determined. The aim of this research work is to improve existing C&D waste quantification tools in the construction of new residential buildings in Spain. For this purpose, several housing projects were studied to determine an estimation of C&D waste generated during their construction process. This paper determines the values of three indicators to estimate the generation of C&D waste in new residential buildings in Spain, itemizing types of waste and construction stages. The inclusion of two more accurate indicators, in addition to the global one commonly in use, provides a significant improvement in C&D waste quantification tools and management planning.

  14. Near-infrared spectral tomography integrated with digital breast tomosynthesis: Effects of tissue scattering on optical data acquisition design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michaelsen, Kelly; Krishnaswamy, Venkat; Pogue, Brian W.

    2012-07-15

    Purpose: Design optimization and phantom validation of an integrated digital breast tomosynthesis (DBT) and near-infrared spectral tomography (NIRST) system targeting improvement in sensitivity and specificity of breast cancer detection is presented. Factors affecting instrumentation design include minimization of cost, complexity, and examination time while maintaining high fidelity NIRST measurements with sufficient information to recover accurate optical property maps. Methods: Reconstructed DBT slices from eight patients with abnormal mammograms provided anatomical information for the NIRST simulations. A limited frequency domain (FD) and extensive continuous wave (CW) NIRST system was modeled. The FD components provided tissue scattering estimations used in the reconstruction of the CW data. Scattering estimates were perturbed to study the effects on hemoglobin recovery. Breast mimicking agar phantoms with inclusions were imaged using the combined DBT/NIRST system for comparison with simulation results. Results: Patient simulations derived from DBT images show successful reconstruction of both normal and malignant lesions in the breast. They also demonstrate the importance of accurately quantifying tissue scattering. Specifically, 20% errors in optical scattering resulted in 22.6% or 35.1% error in quantification of total hemoglobin concentrations, depending on whether scattering was over- or underestimated, respectively. Limited frequency-domain optical signal sampling provided two-region scattering estimates (for fat and fibroglandular tissues) that led to hemoglobin concentrations that reduced the error in the tumor region by 31% relative to when a single estimate of optical scattering was used throughout the breast volume of interest. Acquiring frequency-domain data with six wavelengths instead of three did not significantly improve the hemoglobin concentration estimates. Simulation results were confirmed through experiments in two-region breast mimicking gelatin phantoms. Conclusions: Accurate characterization of scattering is necessary for quantification of hemoglobin. Based on this study, a system design is described to optimally combine breast tomosynthesis with NIRST.

  15. A flexible and accurate digital volume correlation method applicable to high-resolution volumetric images

    NASA Astrophysics Data System (ADS)

    Pan, Bing; Wang, Bo

    2017-10-01

    Digital volume correlation (DVC) is a powerful technique for quantifying interior deformation within solid opaque materials and biological tissues. In the last two decades, great efforts have been made to improve the accuracy and efficiency of the DVC algorithm. However, there is still a lack of a flexible, robust and accurate version that can be efficiently implemented in personal computers with limited RAM. This paper proposes an advanced DVC method that can realize accurate full-field internal deformation measurement applicable to high-resolution volume images with up to billions of voxels. Specifically, a novel layer-wise reliability-guided displacement tracking strategy combined with dynamic data management is presented to guide the DVC computation from slice to slice. The displacements at specified calculation points in each layer are computed using the advanced 3D inverse-compositional Gauss-Newton algorithm with the complete initial guess of the deformation vector accurately predicted from the computed calculation points. Since only limited slices of interest in the reference and deformed volume images rather than the whole volume images are required, the DVC calculation can thus be efficiently implemented on personal computers. The flexibility, accuracy and efficiency of the presented DVC approach are demonstrated by analyzing computer-simulated and experimentally obtained high-resolution volume images.

  16. Accuracy of the dose-shift approximation in estimating the delivered dose in SBRT of lung tumors considering setup errors and breathing motions.

    PubMed

    Karlsson, Kristin; Lax, Ingmar; Lindbäck, Elias; Poludniowski, Gavin

    2017-09-01

    Geometrical uncertainties can result in a delivered dose to the tumor different from that estimated in the static treatment plan. The purpose of this project was to investigate the accuracy of the dose calculated to the clinical target volume (CTV) with the dose-shift approximation, in stereotactic body radiation therapy (SBRT) of lung tumors considering setup errors and breathing motion. The dose-shift method was compared with a beam-shift method with dose recalculation. Included were 10 patients (10 tumors) selected to represent a variety of SBRT-treated lung tumors in terms of tumor location, CTV volume, and tumor density. An in-house developed toolkit within a treatment planning system allowed the shift of either the dose matrix or a shift of the beam isocenter with dose recalculation, to simulate setup errors and breathing motion. Setup shifts of different magnitudes (up to 10 mm) and directions, as well as breathing motion with different peak-to-peak amplitudes (up to 10:5:5 mm), were modeled. The resulting dose-volume histograms (DVHs) were recorded and dose statistics were extracted. Generally, both the dose-shift and beam-shift methods resulted in calculated doses lower than the static planned dose, although the minimum dose (D98%) exceeded the prescribed dose in all cases for setup shifts up to 5 mm. The dose-shift method also generally underestimated the dose compared with the beam-shift method. For clinically realistic systematic displacements of less than 5 mm, the results demonstrated that in the minimum dose region within the CTV, the dose-shift method was accurate to 2% (root-mean-square error). Breathing motion only marginally degraded the dose distributions. Averaged over the patients and shift directions, the dose-shift approximation was determined to be accurate to approximately 2% (RMS) within the CTV, for clinically relevant geometrical uncertainties for SBRT of lung tumors.
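
    As a rough illustration of the dose-shift approximation discussed above, the sketch below translates a static dose grid by a setup displacement and reads off the near-minimum dose (D98%, the 2nd dose percentile) within a CTV mask. The grid, dose model, displacement and mask are hypothetical; the study's in-house treatment-planning toolkit and the beam-shift recalculation are not reproduced here.

    ```python
    import numpy as np
    from scipy.ndimage import shift as nd_shift

    def dose_shift_d98(dose, ctv_mask, displacement_mm, voxel_mm):
        """Dose-shift approximation: translate the static dose matrix by the
        geometrical displacement (instead of recalculating the beams) and
        report D98% (the 2nd dose percentile) inside the CTV."""
        shift_vox = [d / v for d, v in zip(displacement_mm, voxel_mm)]
        shifted_dose = nd_shift(dose, shift=shift_vox, order=1, mode="nearest")
        return np.percentile(shifted_dose[ctv_mask], 2)

    # Hypothetical 2 mm grid with a spherical CTV and a smooth dose falloff.
    z, y, x = np.mgrid[-40:40, -40:40, -40:40] * 2.0               # mm
    dose = 50.0 * np.exp(-(x**2 + y**2 + z**2) / (2 * 30.0**2))    # Gy (toy model)
    ctv_mask = (x**2 + y**2 + z**2) <= 15.0**2
    print(dose_shift_d98(dose, ctv_mask, displacement_mm=(0, 3, 4), voxel_mm=(2, 2, 2)))
    ```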

  17. Should early amputation impact initial fluid therapy algorithms in burns resuscitation? A retrospective analysis using 3D modelling.

    PubMed

    Staruch, Robert M T; Beverly, A; Lewis, D; Wilson, Y; Martin, N

    2017-02-01

    While the epidemiology of amputations in patients with burns has been investigated previously, the effect of an amputation on burn size and its impact on fluid management have not been considered in the literature. Fluid resuscitation volumes are based on the percentage of the total body surface area (%TBSA) burned calculated during the primary survey. There is currently no consensus as to whether the fluid volumes should be recalculated after an amputation to compensate for the new body surface area. The aim of this study was to model the impact of an amputation on burn size and predicted fluid requirement. A retrospective search was performed of the database at the Queen Elizabeth Hospital Birmingham Regional Burns Centre to identify all patients who had required an early amputation as a result of their burn injury. The search identified 10 patients over a 3-year period. Burn injuries were then mapped using 3D modelling software. BurnCase3D is a computer program that allows accurate plotting of burn injuries on a digital mannequin adjusted for height and weight. Theoretical fluid requirements were then calculated using the Parkland formula for the first 24 h, and Herndon formula for the second 24 h, taking into consideration the effects of the amputation on residual burn size. This study demonstrated that amputation can have an unpredictable effect on burn size that results in a significant deviation from predicted fluid resuscitation volumes. This discrepancy in fluid estimation may cause iatrogenic complications due to over-resuscitation in burn-injured casualties. Combining a more accurate estimation of postamputation burn size with goal-directed fluid therapy during the resuscitation phase should enable burn care teams to optimise patient outcomes. Published by the BMJ Publishing Group Limited.

  18. SU-F-T-42: MRI and TRUS Image Fusion as a Mode of Generating More Accurate Prostate Contours

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Petronek, M; Purysko, A; Balik, S

    Purpose: Transrectal ultrasound (TRUS) imaging is utilized intra-operatively for LDR permanent prostate seed implant treatment planning. Prostate contouring with TRUS can be challenging at the apex and base. This study attempts to improve the accuracy of prostate contouring with MRI-TRUS fusion to prevent over- or under-estimation of the prostate volume. Methods: 14 patients who had a previous MRI-guided prostate biopsy and underwent an LDR permanent prostate seed implant were selected. The prostate was contoured on the MRI images (1 mm slice thickness) by a radiologist. The prostate was also contoured on TRUS images (5 mm slice thickness) during the LDR procedure by a urologist. MRI and TRUS images were rigidly fused manually, and the prostate contours from MRI and TRUS were compared using the Dice similarity coefficient, percentage volume difference, and length, height and width differences. Results: The prostate volume was overestimated by 8 ± 18% (range: 34% to −25%) in TRUS images compared to MRI. The mean Dice coefficient was 0.77 ± 0.09 (range: 0.53 to 0.88). The mean difference (TRUS−MRI) in prostate width was 0 ± 4 mm (range: −11 to 5 mm), in height was −3 ± 6 mm (range: −13 to 6 mm), and in length was 6 ± 6 mm (range: −10 to 16 mm). The prostate was overestimated with TRUS imaging at the base in 6 cases (mean: 8 ± 4 mm; range: 5 to 14 mm) and at the apex in 6 cases (mean: 11 ± 3 mm; range: 5 to 15 mm), and 1 case was underestimated at both base and apex by 4 mm. Conclusion: Use of intra-operative TRUS and MRI image fusion can help to improve the accuracy of prostate contouring by accurately accounting for prostate over- or under-estimation, especially at the base and apex. The mean discrepancy is within a range that is significant for LDR sources.

  19. The estimation of the thyroid volume before surgery--an important prerequisite for minimally invasive thyroidectomy.

    PubMed

    Ruggieri, M; Fumarola, A; Straniero, A; Maiuolo, A; Coletta, I; Veltri, A; Di Fiore, A; Trimboli, P; Gargiulo, P; Genderini, M; D'Armiento, M

    2008-09-01

    At present, a thyroid volume >25 ml, obtained by preoperative ultrasound evaluation, is a very important exclusion criterion for minimally invasive thyroidectomy. So far, among the different imaging techniques, two-dimensional ultrasonography has become the most widely accepted method for the assessment of thyroid volume (US-TV). The aims of this study were: (1) to estimate the preoperative thyroid volume in patients undergoing minimally invasive total thyroidectomy using a mathematical formula and (2) to verify its validity by comparing it with the postsurgical TV (PS-TV). In 53 patients who underwent minimally invasive total thyroidectomy (from January 2003 to December 2007), US-TV, obtained by the ellipsoid volume formula, was compared to PS-TV determined by Archimedes' principle. A mathematical formula able to predict the TV from the US-TV was applied in 34 cases in the last 2 years. Mean US-TV (14.4 +/- 5.9 ml) was significantly lower than mean PS-TV (21.7 +/- 10.3 ml). This underestimation was related to gland multinodularity and/or nodular involvement of the isthmus. A mathematical formula to reduce US-TV underestimation and predict the real TV was developed using a linear model. Mean predicted TV (16.8 +/- 3.7 ml) matched mean PS-TV well, underestimating PS-TV in only 19% of cases. We verified the accuracy of this mathematical model for assessing patients' eligibility for minimally invasive total thyroidectomy and demonstrated that a predicted TV <25 ml was confirmed post-surgery in 94% of cases. We demonstrated that, using a linear model, it is possible to predict the PS-TV from US with high accuracy, with the mean predicted TV matching the mean PS-TV. In particular, the percentage of cases in which the predicted TV matched the PS-TV increased from 23%, as estimated by US, to 43%. Moreover, the percentage of TV underestimation was reduced from 77% to 19%, and the range of disagreement narrowed from up to 200% to 80%. This study shows that two-dimensional US can provide an accurate estimation of thyroid volume and that it can be improved by a mathematical model. This may contribute to more appropriate surgical management of thyroid diseases.
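
    For reference, ultrasound thyroid volume in studies of this kind is usually obtained per lobe from the ellipsoid approximation sketched below and then summed over both lobes; the coefficients of the corrective linear model developed in this study are not reported in the abstract, so only the standard ellipsoid form is shown here as an assumption about the formula used.

    ```latex
    % Ellipsoid approximation for each thyroid lobe, with length L, width W and
    % depth D measured on two-dimensional ultrasound:
    V_{\mathrm{lobe}} \approx \frac{\pi}{6}\,L\,W\,D \approx 0.524\,L\,W\,D,
    \qquad
    \mathrm{US\text{-}TV} = V_{\mathrm{right}} + V_{\mathrm{left}}
    ```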

  20. Assessment of MRI-Based Automated Fetal Cerebral Cortical Folding Measures in Prediction of Gestational Age in the Third Trimester.

    PubMed

    Wu, J; Awate, S P; Licht, D J; Clouchoux, C; du Plessis, A J; Avants, B B; Vossough, A; Gee, J C; Limperopoulos, C

    2015-07-01

    Traditional methods of dating a pregnancy based on history or sonographic assessment have a large variation in the third trimester. We aimed to assess the ability of various quantitative measures of brain cortical folding on MR imaging to determine fetal gestational age in the third trimester. We evaluated 8 different quantitative cortical folding measures to predict gestational age in 33 healthy fetuses by using T2-weighted fetal MR imaging. We compared the accuracy of the prediction of gestational age by these cortical folding measures with the accuracy of prediction by brain volume measurement and by a previously reported semiquantitative visual scale of brain maturity. Regression models were constructed, and measurement biases and variances were determined via a cross-validation procedure. The cortical folding measures are accurate in the estimation and prediction of gestational age (mean of the absolute error, 0.43 ± 0.45 weeks) and perform better (P = .024) than brain volume (mean of the absolute error, 0.72 ± 0.61 weeks) or sonography measures (SDs approximately 1.5 weeks, as reported in the literature). Prediction accuracy is comparable with that of the semiquantitative visual assessment score (mean, 0.57 ± 0.41 weeks). Quantitative cortical folding measures such as global average curvedness can be an accurate and reliable estimator of gestational age and brain maturity for healthy fetuses in the third trimester and have the potential to be an indicator of brain-growth delays for at-risk fetuses and preterm neonates. © 2015 by American Journal of Neuroradiology.

  1. Estimating TCP Packet Loss Ratio from Sampled ACK Packets

    NASA Astrophysics Data System (ADS)

    Yamasaki, Yasuhiro; Shimonishi, Hideyuki; Murase, Tutomu

    The advent of various quality-sensitive applications has greatly changed the requirements for IP network management and made the monitoring of individual traffic flows more important. Since the processing costs of per-flow quality monitoring are high, especially in high-speed backbone links, packet sampling techniques have been attracting considerable attention. Existing sampling techniques, such as those used in Sampled NetFlow and sFlow, however, focus on the monitoring of traffic volume, and there has been little discussion of the monitoring of such quality indexes as packet loss ratio. In this paper we propose a method for estimating, from sampled packets, packet loss ratios in individual TCP sessions. It detects packet loss events by monitoring duplicate ACK events raised by each TCP receiver. Because sampling reveals only a portion of the actual packet loss, the actual packet loss ratio is estimated statistically. Simulation results show that the proposed method can estimate the TCP packet loss ratio accurately from a 10% sampling of packets.
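
    A minimal sketch of the scaling idea behind such an estimator is shown below. It assumes packets are sampled independently with probability p, that each loss event triggers a burst of k duplicate ACKs (so a loss is seen in the sample with probability 1 − (1 − p)^k), and that roughly one ACK is returned per data packet; the paper's actual estimator and its statistical corrections may differ.

    ```python
    def estimate_tcp_loss_ratio(sampled_acks, sampled_loss_events,
                                sampling_prob, dupacks_per_loss=3):
        """Rough per-flow packet loss ratio estimated from sampled ACK packets.

        sampled_acks: ACKs of the flow observed in the sampled stream.
        sampled_loss_events: loss events detected via duplicate ACKs in the sample.
        sampling_prob: packet sampling probability p (e.g. 0.1 for 10% sampling).
        dupacks_per_loss: assumed duplicate ACKs generated per loss event.
        """
        # Probability that at least one duplicate ACK of a loss event is sampled.
        p_detect = 1.0 - (1.0 - sampling_prob) ** dupacks_per_loss
        estimated_loss_events = sampled_loss_events / p_detect
        # Scale the sampled ACK count back to an estimate of packets sent,
        # assuming roughly one ACK per data packet.
        estimated_packets = sampled_acks / sampling_prob
        return estimated_loss_events / estimated_packets

    # Example: 10% sampling, 5000 sampled ACKs, 12 detected loss events.
    print(f"{estimate_tcp_loss_ratio(5000, 12, 0.1):.4%}")
    ```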

  2. Tropical forest plantation biomass estimation using RADARSAT-SAR and TM data of south China

    NASA Astrophysics Data System (ADS)

    Wang, Chenli; Niu, Zheng; Gu, Xiaoping; Guo, Zhixing; Cong, Pifu

    2005-10-01

    Forest biomass is one of the most important parameters for global carbon stock models, yet it can only be estimated with great uncertainty. Remote sensing, and SAR data in particular, offers the possibility of providing relatively accurate forest biomass estimates at a lower cost than inventory in tropical forests. The goal of this research was to compare the sensitivity of forest biomass to Landsat TM and RADARSAT-SAR data and to assess the efficiency of NDVI, EVI and other vegetation indices in estimating forest biomass, based on field survey data and GIS in south China. Based on vegetation indices and factor analysis, multiple regression models and neural networks were developed to estimate biomass for each plantation species. For each species, the best relationship between predicted biomass and biomass measured in the field survey was obtained with a neural network developed for that species. The relationship between predicted and measured biomass derived from vegetation indices differed between species. This study concludes that single bands and many vegetation indices are only weakly correlated with the selected forest biomass. The RADARSAT-SAR backscatter coefficient has a relatively good logarithmic correlation with forest biomass, but neither TM spectral bands nor vegetation indices alone are sufficient to establish an efficient model for biomass estimation because of band and vegetation-index saturation; multiple regression models that combine spectral and environmental variables improve estimation performance. Compared with TM, relatively good estimates can be achieved with RADARSAT-SAR, but both had limitations for tropical forest biomass estimation. The estimates obtained are not accurate enough for forest management purposes at the forest stand level. However, the approximate volume estimates derived by the method can be useful in areas where no other forest information is available. Therefore, this paper provides a better understanding of the relationships between remote sensing data and the forest stand parameters used in forest parameter estimation models.
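
    The logarithmic relationship mentioned for the RADARSAT-SAR backscatter can be illustrated with a simple fit of the form sigma0 = a + b·ln(biomass), inverted to predict biomass from backscatter. The plot-level values and coefficients below are invented for illustration only and do not reproduce the study's regression or neural-network models.

    ```python
    import numpy as np

    # Hypothetical plot data: backscatter coefficient (dB) and field biomass (t/ha).
    sigma0_db = np.array([-12.0, -10.5, -9.8, -8.9, -8.1, -7.6, -7.0])
    biomass_t_ha = np.array([35.0, 60.0, 80.0, 110.0, 150.0, 175.0, 210.0])

    # Logarithmic model sigma0 = a + b * ln(biomass).
    b, a = np.polyfit(np.log(biomass_t_ha), sigma0_db, deg=1)

    # Invert the fitted relation to predict biomass from backscatter.
    predicted_biomass = np.exp((sigma0_db - a) / b)
    print(round(a, 2), round(b, 2))
    print(predicted_biomass.round(1))
    ```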

  3. A time accurate finite volume high resolution scheme for three dimensional Navier-Stokes equations

    NASA Technical Reports Server (NTRS)

    Liou, Meng-Sing; Hsu, Andrew T.

    1989-01-01

    A time-accurate, three-dimensional, finite volume, high-resolution scheme for solving the compressible full Navier-Stokes equations is presented. The present derivation is based on upwind split formulas, specifically with the application of Roe's (1981) flux difference splitting. A high-order accurate (up to third-order) upwind interpolation formula for the inviscid terms is derived to account for nonuniform meshes. For the viscous terms, discretizations consistent with the finite volume concept are described. A variant of a second-order time-accurate method is proposed that utilizes identical procedures in both the predictor and corrector steps. Avoiding the definition of a midpoint gives a consistent and easy procedure, in the framework of finite volume discretization, for treating viscous transport terms in curvilinear coordinates. For the boundary cells, a new treatment is introduced that not only avoids the use of 'ghost cells' and the associated problems, but also satisfies the tangency conditions exactly and allows easy definition of viscous transport terms at the first interface next to the boundary cells. Numerical tests of steady and unsteady high-speed flows show that the present scheme gives accurate solutions.

  4. The Effect of Bilateral Superior Laryngeal Nerve Lesion on Swallowing – A Novel Method to Quantitate Aspirated Volume and Pharyngeal Threshold in Videofluoroscopy

    PubMed Central

    DING, Peng; FUNG, George Shiu-Kai; LIN, Ming De; HOLMAN, Shaina D.; GERMAN, Rebecca Z.

    2015-01-01

    Purpose To determine the effect of bilateral superior laryngeal nerve (SLN) lesion on swallowing threshold volume and the occurrence of aspiration, using a novel measurement technique for videofluoroscopic swallowing studies (VFSS). Methods and Materials We used a novel radiographic phantom to assess the volume of barium-containing milk from fluoroscopy. The custom-made phantom was first calibrated by comparing its image intensity with known cylinder depths. Second, pouches of milk of known volume placed in a pig cadaver were compared to volumes calculated with the phantom. Using these standards, we calculated the volume of milk in the valleculae, esophagus and larynx for 205 feeding sequences from four infant pigs feeding before and after bilateral SLN lesions. Swallow safety was assessed using the IMPAS scale. Results The log-linear correlation between image intensity values from the phantom filled with barium milk and the known phantom cylinder depths was strong (R2>0.95), as were the calculated volumes of the barium milk pouches. The threshold volume of bolus in the valleculae during feeding was significantly larger after bilateral SLN lesion than in control swallows (p<0.001). The IMPAS score increased in the lesioned swallows relative to the controls (p<0.001). Conclusion Bilateral SLN lesion dramatically increased the aspiration incidence and the threshold volume of bolus in the valleculae. The use of this phantom permits quantification of the aspirated volume of fluid. The custom-made phantom and calibration allow for more accurate 3D volume estimation from 2D x-ray in VFSS. PMID:25270532

  5. Assessing spatial uncertainty in reservoir characterization for carbon sequestration planning using public well-log data: A case study

    USGS Publications Warehouse

    Venteris, E.R.; Carter, K.M.

    2009-01-01

    Mapping and characterization of potential geologic reservoirs are key components in planning carbon dioxide (CO2) injection projects. The geometry of target and confining layers is vital to ensure that the injected CO2 remains in a supercritical state and is confined to the target layer. Also, maps of injection volume (porosity) are necessary to estimate sequestration capacity at undrilled locations. Our study uses publicly filed geophysical logs and geostatistical modeling methods to investigate the reliability of spatial prediction for oil and gas plays in the Medina Group (sandstone and shale facies) in northwestern Pennsylvania. Specifically, the modeling focused on two targets: the Grimsby Formation and Whirlpool Sandstone. For each layer, thousands of data points were available to model structure and thickness but only hundreds were available to support volumetric modeling because of the rarity of density-porosity logs in the public records. Geostatistical analysis based on this data resulted in accurate structure models, less accurate isopach models, and inconsistent models of pore volume. Of the two layers studied, only the Whirlpool Sandstone data provided for a useful spatial model of pore volume. Where reliable models for spatial prediction are absent, the best predictor available for unsampled locations is the mean value of the data, and potential sequestration sites should be planned as close as possible to existing wells with volumetric data. © 2009. The American Association of Petroleum Geologists/Division of Environmental Geosciences. All rights reserved.

  6. Applying Lidar and High-Resolution Multispectral Imagery for Improved Quantification and Mapping of Tundra Vegetation Structure and Distribution in the Alaskan Arctic

    NASA Astrophysics Data System (ADS)

    Greaves, Heather E.

    Climate change is disproportionately affecting high northern latitudes, and the extreme temperatures, remoteness, and sheer size of the Arctic tundra biome have always posed challenges that make application of remote sensing technology especially appropriate. Advances in high-resolution remote sensing continually improve our ability to measure characteristics of tundra vegetation communities, which have been difficult to characterize previously due to their low stature and their distribution in complex, heterogeneous patches across large landscapes. In this work, I apply terrestrial lidar, airborne lidar, and high-resolution airborne multispectral imagery to estimate tundra vegetation characteristics for a research area near Toolik Lake, Alaska. Initially, I explored methods for estimating shrub biomass from terrestrial lidar point clouds, finding that a canopy-volume based algorithm performed best. Although shrub biomass estimates derived from airborne lidar data were less accurate than those from terrestrial lidar data, algorithm parameters used to derive biomass estimates were similar for both datasets. Additionally, I found that airborne lidar-based shrub biomass estimates were just as accurate whether calibrated against terrestrial lidar data or harvested shrub biomass--suggesting that terrestrial lidar potentially could replace destructive biomass harvest. Along with smoothed Normalized Differenced Vegetation Index (NDVI) derived from airborne imagery, airborne lidar-derived canopy volume was an important predictor in a Random Forest model trained to estimate shrub biomass across the 12.5 km2 covered by our lidar and imagery data. The resulting 0.80 m resolution shrub biomass maps should provide important benchmarks for change detection in the Toolik area, especially as deciduous shrubs continue to expand in tundra regions. Finally, I applied 33 lidar- and imagery-derived predictor layers in a validated Random Forest modeling approach to map vegetation community distribution at 20 cm resolution across the data collection area, creating maps that will enable validation of coarser maps, as well as study of fine-scale ecological processes in the area. These projects have pushed the limits of what can be accomplished for vegetation mapping using airborne remote sensing in a challenging but important region; it is my hope that the methods explored here will illuminate potential paths forward as landscapes and technologies inevitably continue to change.

  7. Scale dependence of the 200-mb divergence inferred from EOLE data.

    NASA Technical Reports Server (NTRS)

    Morel, P.; Necco, G.

    1973-01-01

    The EOLE experiment, with 480 constant-volume balloons distributed over the Southern Hemisphere at approximately the 200-mb level, has provided a unique, highly accurate set of tracer trajectories in the general westerly circulation. The trajectories of neighboring balloons are analyzed to estimate the horizontal divergence from the Lagrangian derivative of the area of one cluster. The variance of the divergence estimates results from two almost comparable effects: the true divergence of the horizontal flow and eddy diffusion due to small-scale, two-dimensional turbulence. Taking this into account, the rms divergence is found to be of the order of 0.00001 per second and decreases logarithmically with cluster size. This scale dependence is shown to be consistent with the quasi-geostrophic turbulence model of the general circulation in midlatitudes.

  8. Diffusion amid random overlapping obstacles: Similarities, invariants, approximations

    PubMed Central

    Novak, Igor L.; Gao, Fei; Kraikivski, Pavel; Slepchenko, Boris M.

    2011-01-01

    Efficient and accurate numerical techniques are used to examine similarities of effective diffusion in a void between random overlapping obstacles: essential invariance of effective diffusion coefficients (Deff) with respect to obstacle shapes and applicability of a two-parameter power law over nearly the entire range of excluded volume fractions (ϕ), except in a small vicinity of the percolation threshold. It is shown that while neither of the properties is exact, deviations from them are remarkably small. This allows for quick estimation of void percolation thresholds and approximate reconstruction of Deff(ϕ) for obstacles of any given shape. In 3D, the similarities of effective diffusion yield a simple multiplication “rule” that provides a fast means of estimating Deff for a mixture of overlapping obstacles of different shapes with comparable sizes. PMID:21513372
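
    One common two-parameter form for such a power law, expressed in the excluded volume fraction ϕ with a void percolation threshold ϕ_c and an exponent μ, is sketched below; the abstract does not state the exact parameterization used in the paper, so the symbols are illustrative.

    ```latex
    % Two-parameter power law for the effective diffusion coefficient in the void,
    % valid away from the percolation threshold \phi_c (D_0 is the free diffusion
    % coefficient, \mu a fitted exponent):
    \frac{D_{\mathrm{eff}}(\phi)}{D_{0}}
      \approx \left(1 - \frac{\phi}{\phi_{c}}\right)^{\mu},
    \qquad 0 \le \phi < \phi_{c}.
    ```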

  9. Galaxy two-point covariance matrix estimation for next generation surveys

    NASA Astrophysics Data System (ADS)

    Howlett, Cullan; Percival, Will J.

    2017-12-01

    We perform a detailed analysis of the covariance matrix of the spherically averaged galaxy power spectrum and present a new, practical method for estimating this within an arbitrary survey without the need for running mock galaxy simulations that cover the full survey volume. The method uses theoretical arguments to modify the covariance matrix measured from a set of small-volume cubic galaxy simulations, which are computationally cheap to produce compared to larger simulations and match the measured small-scale galaxy clustering more accurately than is possible using theoretical modelling. We include prescriptions to analytically account for the window function of the survey, which convolves the measured covariance matrix in a non-trivial way. We also present a new method to include the effects of super-sample covariance and modes outside the small simulation volume which requires no additional simulations and still allows us to scale the covariance matrix. As validation, we compare the covariance matrix estimated using our new method to that from a brute-force calculation using 500 simulations originally created for analysis of the Sloan Digital Sky Survey Main Galaxy Sample. We find excellent agreement on all scales of interest for large-scale structure analysis, including those dominated by the effects of the survey window, and on scales where theoretical models of the clustering normally break down, but the new method produces a covariance matrix with significantly better signal-to-noise ratio. Although only formally correct in real space, we also discuss how our method can be extended to incorporate the effects of redshift space distortions.

  10. Concurrent agreement between an anthropometric model to predict thigh volume and dual-energy X-Ray absorptiometry assessment in female volleyball players aged 14-18 years.

    PubMed

    Tavares, Óscar M; Valente-Dos-Santos, João; Duarte, João P; Póvoas, Susana C; Gobbo, Luís A; Fernandes, Rômulo A; Marinho, Daniel A; Casanova, José M; Sherar, Lauren B; Courteix, Daniel; Coelho-E-Silva, Manuel J

    2016-11-24

    A variety of performance outputs are strongly determined by lower limb volume and composition in children and adolescents. The current study aimed to examine the validity of thigh volume (TV) estimated by anthropometry in late adolescent female volleyball players. Dual-energy X-ray absorptiometry (DXA) measures were used as the reference method. Total and regional body composition was assessed with a Lunar DPX NT/Pro/MD+/Duo/Bravo scanner in a cross-sectional sample of 42 Portuguese female volleyball players aged 14-18 years (165.2 ± 0.9 cm; 61.1 ± 1.4 kg). TV was estimated with the reference method (TV-DXA) and with the anthropometric method (TV-ANTH). Agreement between procedures was assessed with Deming regression. The analysis also considered a calibration of the anthropometric approach. The equation that best predicted TV-DXA was: -0.899 + 0.876 × log10(body mass) + 0.113 × log10(TV-ANTH). This new model (NM) was validated using the predicted residual sum of squares (PRESS) method (R²PRESS = 0.838). The correlation between the reference method and the NM was 0.934 (95% CI: 0.880-0.964, Sy·x = 0.325 L). A new and accurate anthropometric method to estimate TV in adolescent female volleyball players was obtained from the equation of Jones and Pearson along with adjustments for body mass.
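
    The reported calibration equation can be evaluated directly, as in the sketch below. The abstract does not state the input units or whether the predicted quantity is the thigh volume itself or its log10, so the function simply returns the raw right-hand-side value and the example inputs are hypothetical.

    ```python
    import math

    def thigh_volume_model(body_mass, tv_anthropometric):
        """Right-hand side of the study's new model (NM):
        -0.899 + 0.876*log10(body mass) + 0.113*log10(TV-ANTH).

        Units and any transformation of the outcome are not specified in the
        abstract, so interpretation of the returned value is left to the
        full paper."""
        return (-0.899
                + 0.876 * math.log10(body_mass)
                + 0.113 * math.log10(tv_anthropometric))

    # Hypothetical inputs: 61 kg body mass, anthropometric thigh volume of 6.5 L.
    print(round(thigh_volume_model(61.0, 6.5), 3))
    ```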

  11. AMDTreat 5.0+ with PHREEQC titration module to compute caustic chemical quantity, effluent quality, and sludge volume

    USGS Publications Warehouse

    Cravotta, Charles A.; Means, Brent P; Arthur, Willam; McKenzie, Robert M; Parkhurst, David L.

    2015-01-01

    Alkaline chemicals are commonly added to discharges from coal mines to increase pH and decrease concentrations of acidity and dissolved aluminum, iron, manganese, and associated metals. The annual cost of chemical treatment depends on the type and quantities of chemicals added and sludge produced. The AMDTreat computer program, initially developed in 2003, is widely used to compute such costs on the basis of the user-specified flow rate and water quality data for the untreated AMD. Although AMDTreat can use results of empirical titration of net-acidic or net-alkaline effluent with caustic chemicals to accurately estimate costs for treatment, such empirical data are rarely available. A titration simulation module using the geochemical program PHREEQC has been incorporated with AMDTreat 5.0+ to improve the capability of AMDTreat to estimate: (1) the quantity and cost of caustic chemicals to attain a target pH, (2) the chemical composition of the treated effluent, and (3) the volume of sludge produced by the treatment. The simulated titration results for selected caustic chemicals (NaOH, CaO, Ca(OH)2, Na2CO3, or NH3) without aeration or with pre-aeration can be compared with or used in place of empirical titration data to estimate chemical quantities, treated effluent composition, sludge volume (precipitated metals plus unreacted chemical), and associated treatment costs. This paper describes the development, evaluation, and potential utilization of the PHREEQC titration module with the new AMDTreat 5.0+ computer program available at http://www.amd.osmre.gov/.

  12. The use of computed tomography for the estimation of DIEP flap weights in breast reconstruction: a simple mathematical formula.

    PubMed

    Nanidis, Theodore G; Ridha, Hyder; Jallali, Navid

    2014-10-01

    Estimation of the volume of abdominal tissue is desirable when planning autologous abdominal-based breast reconstruction. However, this can be difficult clinically. The aim of this study was to develop a simple yet reliable method of calculating the deep inferior epigastric artery perforator (DIEP) flap weight using the routine preoperative computed tomography angiogram (CTA) scan. Our mathematical formula is based on the shape of a DIEP flap resembling that of an isosceles triangular prism; thus its volume can be calculated with a standard mathematical formula. Using bony landmarks, three measurements were acquired from the CTA scan to calculate the flap weight. This was then compared to the actual flap weight harvested in both a retrospective feasibility study and a prospective study. In the retrospective group, 17 DIEP flaps in 17 patients were analyzed. The average predicted flap weight was 667 g (range 293-1254 g). The average actual flap weight was 657 g (range 300-1290 g), giving an average percentage error of 6.8% (p-value for the weight difference, 0.53). In the prospective group, 15 DIEP flaps in 15 patients were analyzed. The average predicted flap weight was 618 g (range 320-925 g). The average actual flap weight was 624 g (range 356-970 g), giving an average percentage error of 6.38% (p-value for the weight difference, 0.57). This formula is a quick, reliable and accurate way of estimating the volume of abdominal tissue using the preoperative CTA scan. Copyright © 2014 British Association of Plastic, Reconstructive and Aesthetic Surgeons. Published by Elsevier Ltd. All rights reserved.
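
    The prism geometry implies the familiar volume formula used in the sketch below. The specific CTA landmarks behind the three measurements are not given in the abstract, so the variable names (triangle base b, triangle height h, prism length L) and the unit-density conversion from volume to weight are assumptions for illustration.

    ```python
    def diep_flap_weight_estimate(base_cm, height_cm, length_cm,
                                  density_g_per_cm3=1.0):
        """Volume of an isosceles triangular prism as a proxy for the DIEP flap,
        converted to weight with an assumed soft-tissue density of ~1 g/cm^3
        (the density value is an assumption, not stated in the abstract)."""
        volume_cm3 = 0.5 * base_cm * height_cm * length_cm
        return volume_cm3 * density_g_per_cm3

    # Hypothetical CTA-derived measurements (cm); yields roughly 675 g.
    print(round(diep_flap_weight_estimate(base_cm=30.0, height_cm=4.5, length_cm=10.0)))
    ```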

  13. The correlation between preoperative volumetry and real graft weight: comparison of two volumetry programs

    PubMed Central

    Mussin, Nadiar; Sumo, Marco; Choi, YoungRok; Choi, Jin Yong; Ahn, Sung-Woo; Yoon, Kyung Chul; Kim, Hyo-Sin; Hong, Suk Kyun; Yi, Nam-Joon; Suh, Kyung-Suk

    2017-01-01

    Purpose Liver volumetry is a vital component in living donor liver transplantation to determine an adequate graft volume that meets the metabolic demands of the recipient and at the same time ensures donor safety. Most institutions use preoperative contrast-enhanced CT image-based software programs to estimate graft volume. The objective of this study was to evaluate the accuracy of 2 liver volumetry programs (Rapidia vs. Dr. Liver) in preoperative right liver graft estimation compared with real graft weight. Methods Data from 215 consecutive right lobe living donors between October 2013 and August 2015 were retrospectively reviewed. One hundred seven patients were enrolled in the Rapidia group and 108 patients were included in the Dr. Liver group. Estimated graft volumes generated by both software programs were compared with the real graft weight measured during surgery, and further classified into minimal difference (≤15%) and big difference (>15%). Correlation coefficients and degrees of difference were determined. Linear regressions were calculated and results depicted as scatterplots. Results Minimal difference was observed in 69.4% of cases from the Dr. Liver group, and big difference was seen in 44.9% of cases from the Rapidia group (P = 0.035). Linear regression analysis showed positive correlation in both groups (P < 0.01). However, the correlation coefficient was better for the Dr. Liver group (R2 = 0.719) than for the Rapidia group (R2 = 0.688). Conclusion Dr. Liver can accurately predict right liver graft size better and faster than Rapidia, and can facilitate preoperative planning in living donor liver transplantation. PMID:28382294

  14. Volume-translated cubic EoS and PC-SAFT density models and a free volume-based viscosity model for hydrocarbons at extreme temperature and pressure conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burgess, Ward A.; Tapriyal, Deepak; Morreale, Bryan D.

    2013-12-01

    This research focuses on providing the petroleum reservoir engineering community with robust models of hydrocarbon density and viscosity at the extreme temperature and pressure conditions (up to 533 K and 276 MPa, respectively) characteristic of ultra-deep reservoirs, such as those associated with the deepwater wells in the Gulf of Mexico. Our strategy is to base the volume-translated (VT) Peng–Robinson (PR) and Soave–Redlich–Kwong (SRK) cubic equations of state (EoSs) and perturbed-chain, statistical associating fluid theory (PC-SAFT) on an extensive data base of high temperature (278–533 K), high pressure (6.9–276 MPa) density rather than fitting the models to low pressure saturated liquid density data. This high-temperature, high-pressure (HTHP) data base consists of literature data for hydrocarbons ranging from methane to C40. The three new models developed in this work, HTHP VT-PR EoS, HTHP VT-SRK EoS, and hybrid PC-SAFT, yield mean absolute percent deviation values (MAPD) for HTHP hydrocarbon density of ~2.0%, ~1.5%, and <1.0%, respectively. An effort was also made to provide accurate hydrocarbon viscosity models based on literature data. Viscosity values are estimated with the frictional theory (f-theory) and free volume (FV) theory of viscosity. The best results were obtained when the PC-SAFT equation was used to obtain both the attractive and repulsive pressure inputs to f-theory, and the density input to FV theory. Both viscosity models provide accurate results at pressures to 100 MPa but experimental and model results can deviate by more than 25% at pressures above 200 MPa.

  15. Real-Time Three-Dimensional Echocardiography: Characterization of Cardiac Anatomy and Function-Current Clinical Applications and Literature Review Update.

    PubMed

    Velasco, Omar; Beckett, Morgan Q; James, Aaron W; Loehr, Megan N; Lewis, Taylor G; Hassan, Tahmin; Janardhanan, Rajesh

    2017-01-01

    Our review of real-time three-dimensional echocardiography (RT3DE) discusses the diagnostic utility of RT3DE and provides a comparison with two-dimensional echocardiography (2DE) in clinical cardiology. A PubMed literature search on RT3DE was performed using the following key words: transthoracic, two-dimensional, three-dimensional, real-time, and left ventricular (LV) function. Articles included prospective clinical studies and meta-analyses in the English language, and focused on the role of RT3DE in human subjects. Applications of RT3DE include analysis of the pericardium, right ventricular (RV) and LV cavities, wall motion, valvular disease, great vessels, congenital anomalies, and traumatic injury, such as myocardial contusion. RT3DE, through transthoracic echocardiography (TTE), allows for increasingly accurate volume and valve motion assessment, estimation of LV ejection fraction, and volume measurements. Chamber motion and LV mass approximation have been more accurately evaluated by RT3DE through improved inclusion of the third dimension and quantification of volumetric movement. Moreover, no statistically significant difference was found when comparing ejection fractions measured by RT3DE with those from cardiac magnetic resonance (CMR). Analysis of RT3DE data sets of the LV endocardial surface allows the volume to be directly quantified for specific phases of the cardiac cycle, ranging from end systole to end diastole, eliminating error from wall motion abnormalities and asymmetrical left ventricles. RT3DE through TTE measures cardiac function with superior diagnostic accuracy in predicting LV mass, systolic function, and LV and RV volumes when compared with 2DE, with results comparable to CMR.

  16. Dynamic-thresholding level set: a novel computer-aided volumetry method for liver tumors in hepatic CT images

    NASA Astrophysics Data System (ADS)

    Cai, Wenli; Yoshida, Hiroyuki; Harris, Gordon J.

    2007-03-01

    Measurement of the volume of focal liver tumors, called liver tumor volumetry, is indispensable for assessing the growth of tumors and for monitoring the response of tumors to oncology treatments. Traditional edge models, such as the maximum gradient and zero-crossing methods, often fail to detect the accurate boundary of a fuzzy object such as a liver tumor. As a result, the computerized volumetry based on these edge models tends to differ from manual segmentation results performed by physicians. In this study, we developed a novel computerized volumetry method for fuzzy objects, called dynamic-thresholding level set (DT level set). An optimal threshold value computed from a histogram tends to shift, relative to the theoretical threshold value obtained from a normal distribution model, toward a smaller region in the histogram. We thus designed a mobile shell structure, called a propagating shell, which is a thick region encompassing the level set front. The optimal threshold calculated from the histogram of the shell drives the level set front toward the boundary of a liver tumor. When the volume ratio between the object and the background in the shell approaches one, the optimal threshold value best fits the theoretical threshold value and the shell stops propagating. Application of the DT level set to 26 hepatic CT cases with 63 biopsy-confirmed hepatocellular carcinomas (HCCs) and metastases showed that the computer measured volumes were highly correlated with those of tumors measured manually by physicians. Our preliminary results showed that DT level set was effective and accurate in estimating the volumes of liver tumors detected in hepatic CT images.

  17. Reperfusion is a more accurate predictor of follow-up infarct volume than recanalization: a proof of concept using CT in acute ischemic stroke patients.

    PubMed

    Soares, Bruno P; Tong, Elizabeth; Hom, Jason; Cheng, Su-Chun; Bredno, Joerg; Boussel, Loic; Smith, Wade S; Wintermark, Max

    2010-01-01

    The purpose of this study was to compare recanalization and reperfusion in terms of their predictive value for imaging outcomes (follow-up infarct volume, infarct growth, salvaged penumbra) and clinical outcome in acute ischemic stroke patients. Twenty-two patients admitted within 6 hours of stroke onset were retrospectively included in this study. These patients underwent a first stroke CT protocol including CT-angiography (CTA) and perfusion-CT (PCT) on admission, and similar imaging after treatment, typically around 24 hours, to assess recanalization and reperfusion. Recanalization was assessed by comparing arterial patency on admission and posttreatment CTAs; reperfusion, by comparing the volumes of CBV, CBF, and MTT abnormality on admission and posttreatment PCTs. Collateral flow was graded on the admission CTA. Follow-up infarct volume was measured on the discharge noncontrast CT. The groups of patients with reperfusion, no reperfusion, recanalization, and no recanalization were compared in terms of imaging and clinical outcomes. Reperfusion (using an MTT reperfusion index >75%) was a more accurate predictor of follow-up infarct volume than recanalization. Collateral flow and recanalization were not accurate predictors of follow-up infarct volume. An interaction term was found between reperfusion and the volume of the admission penumbra >50 mL. Our study provides evidence that reperfusion is a more accurate predictor of follow-up infarct volume in acute ischemic stroke patients than recanalization. We recommend an MTT reperfusion index >75% to assess therapy efficacy in future acute ischemic stroke trials that use perfusion-CT.
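
    For context, a perfusion-CT reperfusion index of the kind quoted above is typically defined from the change in the volume of the MTT abnormality between admission and follow-up imaging; the study's exact definition is not spelled out in the abstract, so the expression below is one plausible reading.

    ```latex
    % MTT reperfusion index from admission and post-treatment perfusion-CT,
    % where V_{MTT} is the volume of the MTT abnormality; reperfusion if R > 75%.
    R_{\mathrm{MTT}}
      = \frac{V_{\mathrm{MTT}}^{\,\mathrm{admission}} - V_{\mathrm{MTT}}^{\,\mathrm{follow\text{-}up}}}
             {V_{\mathrm{MTT}}^{\,\mathrm{admission}}} \times 100\%
    ```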

  18. Muscle parameters estimation based on biplanar radiography.

    PubMed

    Dubois, G; Rouch, P; Bonneau, D; Gennisson, J L; Skalli, W

    2016-11-01

    The evaluation of muscle and joint forces in vivo is still a challenge. Musculo-skeletal models are used to compute forces based on movement analysis. Most of them are built from a scaled-generic model based on cadaver measurements, which provides a low level of personalization, or from magnetic resonance images, which provide a personalized model in the lying position. This study proposed an original two-step method to obtain a subject-specific musculo-skeletal model in 30 min, based solely on biplanar X-rays. First, the subject-specific 3D geometry of bones and skin envelopes was reconstructed from biplanar X-ray radiography. Then, 2200 corresponding control points were identified between a reference model and the subject-specific X-ray model. Finally, the shape of 21 lower limb muscles was estimated using a non-linear transformation between the control points in order to fit the muscle shapes of the reference model to the X-ray model. Twelve musculo-skeletal models were reconstructed and compared to their reference. The muscle volume was not accurately estimated, with a standard deviation (SD) ranging from 10 to 68%. However, this method provided an accurate estimation of the muscle line of action, with an SD of the length difference lower than 2% and a positioning error lower than 20 mm. The moment arm was also well estimated, with an SD lower than 15% for most muscles, which was significantly better than the scaled-generic model for most muscles. This method opens the way to a quick modeling approach for gait analysis based on biplanar radiography.

  19. Terrestrial laser scanning to quantify above-ground biomass of structurally complex coastal wetland vegetation

    NASA Astrophysics Data System (ADS)

    Owers, Christopher J.; Rogers, Kerrylee; Woodroffe, Colin D.

    2018-05-01

    Above-ground biomass represents a small yet significant contributor to carbon storage in coastal wetlands. Despite this, above-ground biomass is often poorly quantified, particularly in areas where vegetation structure is complex. Traditional methods for providing accurate estimates involve harvesting vegetation to develop mangrove allometric equations and to quantify saltmarsh biomass in quadrats. However, broad-scale application of these methods may not capture structural variability in vegetation, resulting in a loss of detail and estimates with considerable uncertainty. Terrestrial laser scanning (TLS) collects high-resolution three-dimensional point clouds capable of providing detailed structural morphology of vegetation. This study demonstrates that TLS is a suitable non-destructive method for estimating biomass of structurally complex coastal wetland vegetation. We compare two volumetric modelling techniques, 3-D surface reconstruction and rasterised volume, with a point cloud elevation histogram modelling technique for estimating biomass. Our results show that current volumetric modelling approaches for estimating TLS-derived biomass are comparable to traditional mangrove allometrics and saltmarsh harvesting. However, volumetric modelling approaches oversimplify vegetation structure by under-utilising the large amount of structural information provided by the point cloud. The point cloud elevation histogram model presented in this study, as an alternative to volumetric modelling, utilises all of the information within the point cloud, as opposed to sub-sampling based on specific criteria. This method is simple but highly effective for both mangrove (r2 = 0.95) and saltmarsh (r2 > 0.92) vegetation. Our results provide evidence that application of TLS in coastal wetlands is an effective non-destructive method to accurately quantify biomass for structurally complex vegetation.

  20. Estimating antimalarial drugs consumption in Africa before the switch to artemisinin-based combination therapies (ACTs).

    PubMed

    Kindermans, Jean-Marie; Vandenbergh, Daniel; Vreeke, Ed; Olliaro, Piero; D'Altilia, Jean-Pierre

    2007-07-10

    Having reliable forecasts is now critical for producers, malaria-endemic countries and agencies in order to adapt production and procurement of artemisinin-based combination treatments (ACTs), the new first-line treatments for malaria. There is no ideal method to quantify drug requirements for malaria, and morbidity data give uncertain estimations. This study uses drug consumption to provide elements to help estimate the quantities and financial requirements of ACTs. The consumption of chloroquine, sulphadoxine/pyrimethamine and quinine through both the private and public sectors was assessed in five sub-Saharan African countries with different epidemiological patterns (Senegal, Rwanda, Tanzania, Malawi, Zimbabwe). From these data the number of adult treatments per capita was calculated, and the volumes and financial implications were derived for the whole of Africa. Identifying and obtaining data from the private sector was difficult, and the quality of information on drug supply and distribution in countries must be improved. The number of adult treatments per capita and per year in the five countries ranged from 0.18 to 0.50. Current adult treatment prices for ACTs range from US$1 to US$1.8. Taking the upper range for both volumes and costs, the highest number of adult treatments consumed for Africa was estimated at 314.5 million, corresponding to an overall maximum annual need for financing ACT procurement of US$566.1 million. In reality, both the number of cases treated and the cost of treatment are likely to be lower (projections for the lowest consumption estimate with the least expensive ACT would require US$113 million per annum). There were substantial variations in the market share between public and private sources among these countries (the public sector share ranging from 98% in Rwanda to 33% in Tanzania). Additional studies are required to build a more robust methodology and to assess current consumption more accurately in order to better quantify volumes and finances for production and procurement of ACTs.
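
    The headline figure follows directly from the consumption and price bounds quoted above; the short check below reproduces the upper-bound arithmetic (the lower-bound figure additionally depends on population and per-capita consumption data not restated here).

    ```python
    # Upper-bound annual financing need: maximum estimated adult ACT treatments
    # for Africa multiplied by the higher quoted ACT price.
    max_adult_treatments = 314.5e6   # treatments per year
    max_price_usd = 1.8              # US$ per adult treatment
    print(f"US$ {max_adult_treatments * max_price_usd / 1e6:.1f} million")  # ~566.1
    ```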

  1. Stereological assessment of normal Persian squirrels (Sciurus anomalus) kidney.

    PubMed

    Akbari, Mohsen; Goodarzi, Nader; Tavafi, Majid

    2017-03-01

    The functions of the mammalian kidney are closely related to its structure. This suggests that renal function can be completely characterized by accurate knowledge of its quantitative morphological features. The aim of this study was to investigate the histomorphometric features of the kidney using design-based and unbiased stereological methods in the Persian squirrel (Sciurus anomalus), which is the only representative of the Sciuridae family in the Middle East. The left kidneys of five animals were examined. Total volume of the kidney, cortex, and medulla were determined to be 960.75 ± 87.4, 754.31 ± 77.09 and 206.1 ± 16.89 mm³, respectively. The glomerular number was 32844.03 ± 1069.19, and the total glomerular volume was estimated to be 36.7 ± 1.45 mm³. The volume and length of the proximal convoluted tubule were estimated at 585.67 ± 60.7 mm³ and 328.8 ± 14.8 m, respectively, with both values being greater than those reported in the rat kidney. The volume and length of the distal convoluted tubule were calculated at 122.34 ± 7.38 mm³ and 234.4 ± 17.45 m, respectively, which are also greater than those reported in the rat kidney. Despite the comparable body weight, the total number and mean individual volume of glomeruli in the Persian squirrel kidney were greater than those in the rat kidney. Overall, the stereological variables of the kidneys elucidated in this study are exclusive to the Persian squirrel. Our findings, together with future renal physiological data, will contribute to a better understanding of the renal structure-function relationship in the Persian squirrel.

  2. A novel method to estimate the volume of bone defects using cone-beam computed tomography: an in vitro study.

    PubMed

    Esposito, Stefano Andrea; Huybrechts, Bart; Slagmolen, Pieter; Cotti, Elisabetta; Coucke, Wim; Pauwels, Ruben; Lambrechts, Paul; Jacobs, Reinhilde

    2013-09-01

    The routine use of high-resolution images derived from 3-dimensional cone-beam computed tomography (CBCT) datasets enables the linear measurement of lesions in the maxillary and mandibular bones on 3 planes of space. Measurements on different planes would make it possible to obtain real volumetric assessments. In this study, we tested, in vitro, the accuracy and reliability of new dedicated software developed for volumetric lesion assessment in clinical endodontics. Twenty-seven bone defects were created around the apices of 8 teeth in 1 young bovine mandible to simulate periapical lesions of different sizes and shapes. The volume of each defect was determined by taking an impression of the defect using a silicone material. The samples were scanned using an Accuitomo 170 CBCT (J. Morita Mfg Co, Kyoto, Japan), and the data were uploaded into a newly developed dedicated software tool. Two endodontists acted as independent and calibrated observers. They analyzed each bone defect for volume. The difference between the direct volumetric measurements and the measurements obtained with the CBCT images was statistically assessed using a lack-of-fit test. A correlation study was undertaken using the Pearson product-moment correlation coefficient. Intra- and interobserver agreement was also evaluated. The results showed a good fit and strong correlation between both volume measurements (ρ > 0.9) with excellent inter- and intraobserver agreement. Using this software, CBCT proved to be a reliable method in vitro for the estimation of endodontic lesion volumes in bovine jaws. Therefore, it may constitute a new, validated technique for the accurate evaluation and follow-up of apical periodontitis. Copyright © 2013 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  3. Comprehensive Assessment of Coronary Artery Disease by Using First-Pass Analysis Dynamic CT Perfusion: Validation in a Swine Model.

    PubMed

    Hubbard, Logan; Lipinski, Jerry; Ziemer, Benjamin; Malkasian, Shant; Sadeghi, Bahman; Javan, Hanna; Groves, Elliott M; Dertli, Brian; Molloi, Sabee

    2018-01-01

    Purpose To retrospectively validate a first-pass analysis (FPA) technique that combines computed tomographic (CT) angiography and dynamic CT perfusion measurement into one low-dose examination. Materials and Methods The study was approved by the animal care committee. The FPA technique was retrospectively validated in six swine (mean weight, 37.3 kg ± 7.5 [standard deviation]) between April 2015 and October 2016. Four to five intermediate-severity stenoses were generated in the left anterior descending artery (LAD), and 20 contrast material-enhanced volume scans were acquired per stenosis. All volume scans were used for maximum slope model (MSM) perfusion measurement, but only two volume scans were used for FPA perfusion measurement. Perfusion measurements in the LAD, left circumflex artery (LCx), right coronary artery, and all three coronary arteries combined were compared with microsphere perfusion measurements by using regression, root-mean-square error, root-mean-square deviation, Lin concordance correlation, and diagnostic outcomes analysis. The CT dose index and size-specific dose estimate per two-volume FPA perfusion measurement were also determined. Results FPA and MSM perfusion measurements (P_FPA and P_MSM) in all three coronary arteries combined were related to the reference standard microsphere perfusion measurements (P_MICRO) as follows: P_FPA_combined = 1.02 × P_MICRO_combined + 0.11 (r = 0.96) and P_MSM_combined = 0.28 × P_MICRO_combined + 0.23 (r = 0.89). The CT dose index and size-specific dose estimate per two-volume FPA perfusion measurement were 10.8 and 17.8 mGy, respectively. Conclusion The FPA technique was retrospectively validated in a swine model and has the potential to be used for accurate, low-dose vessel-specific morphologic and physiologic assessment of coronary artery disease. © RSNA, 2017.

  4. Measuring Compositions in Organic Depth Profiling: Results from a VAMAS Interlaboratory Study.

    PubMed

    Shard, Alexander G; Havelund, Rasmus; Spencer, Steve J; Gilmore, Ian S; Alexander, Morgan R; Angerer, Tina B; Aoyagi, Satoka; Barnes, Jean-Paul; Benayad, Anass; Bernasik, Andrzej; Ceccone, Giacomo; Counsell, Jonathan D P; Deeks, Christopher; Fletcher, John S; Graham, Daniel J; Heuser, Christian; Lee, Tae Geol; Marie, Camille; Marzec, Mateusz M; Mishra, Gautam; Rading, Derk; Renault, Olivier; Scurr, David J; Shon, Hyun Kyong; Spampinato, Valentina; Tian, Hua; Wang, Fuyi; Winograd, Nicholas; Wu, Kui; Wucher, Andreas; Zhou, Yufan; Zhu, Zihua; Cristaudo, Vanina; Poleunis, Claude

    2015-08-20

    We report the results of a VAMAS (Versailles Project on Advanced Materials and Standards) interlaboratory study on the measurement of composition in organic depth profiling. Layered samples with known binary compositions of Irganox 1010 and either Irganox 1098 or Fmoc-pentafluoro-l-phenylalanine in each layer were manufactured in a single batch and distributed to more than 20 participating laboratories. The samples were analyzed using argon cluster ion sputtering and either X-ray photoelectron spectroscopy (XPS) or time-of-flight secondary ion mass spectrometry (ToF-SIMS) to generate depth profiles. Participants were asked to estimate the volume fractions in two of the layers and were provided with the compositions of all other layers. Participants using XPS provided volume fractions within 0.03 of the nominal values. Participants using ToF-SIMS either made no attempt, or used various methods that gave results ranging in error from 0.02 to over 0.10 in volume fraction, the latter representing a 50% relative error for a nominal volume fraction of 0.2. Error was predominantly caused by inadequacy in the ability to compensate for primary ion intensity variations and the matrix effect in SIMS. Matrix effects in these materials appear to be more pronounced as the number of atoms in both the primary analytical ion and the secondary ion increase. Using the participants' data we show that organic SIMS matrix effects can be measured and are remarkably consistent between instruments. We provide recommendations for identifying and compensating for matrix effects. Finally, we demonstrate, using a simple normalization method, that virtually all ToF-SIMS participants could have obtained estimates of volume fraction that were at least as accurate and consistent as XPS.

  5. Measuring Compositions in Organic Depth Profiling: Results from a VAMAS Interlaboratory Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shard, A. G.; Havelund, Rasmus; Spencer, Steve J.

    We report the results of a VAMAS (Versailles Project on Advanced Materials and Standards) interlaboratory study on the measurement of composition in organic depth profiling. Layered samples with known binary compositions of Irganox 1010 and either Irganox 1098 or Fmoc-pentafluoro-L-phenylalanine in each layer were manufactured in a single batch and distributed to more than 20 participating laboratories. The samples were analyzed using argon cluster ion sputtering and either X-ray Photoelectron Spectroscopy (XPS) or Time-of-Flight Secondary Ion Mass Spectrometry (ToF-SIMS) to generate depth profiles. Participants were asked to estimate the volume fractions in two of the layers and were provided with the compositions of all other layers. Participants using XPS provided volume fractions within 0.03 of the nominal values. Participants using ToF-SIMS either made no attempt, or used various methods that gave results ranging in error from 0.02 to over 0.10 in volume fraction, the latter representing a 50% relative error for a nominal volume fraction of 0.2. Error was predominantly caused by inadequacy in the ability to compensate for primary ion intensity variations and the matrix effect in SIMS. Matrix effects in these materials appear to be more pronounced as the number of atoms in both the primary analytical ion and the secondary ion increase. Using the participants’ data we show that organic SIMS matrix effects can be measured and are remarkably consistent between instruments. We provide recommendations for identifying and compensating for matrix effects. Finally we demonstrate, using a simple normalization method, that virtually all ToF-SIMS participants could have obtained estimates of volume fraction that were at least as accurate and consistent as XPS.

  6. The BREAST-V: a unifying predictive formula for volume assessment in small, medium, and large breasts.

    PubMed

    Longo, Benedetto; Farcomeni, Alessio; Ferri, Germano; Campanale, Antonella; Sorotos, Micheal; Santanelli, Fabio

    2013-07-01

    Breast volume assessment enhances preoperative planning of both aesthetic and reconstructive procedures, helping the surgeon in the decision-making process of shaping the breast. Numerous methods of breast size determination are currently reported but are limited by methodologic flaws and variable estimations. The authors aimed to develop a unifying predictive formula for volume assessment in small to large breasts based on anthropomorphic values. Ten anthropomorphic breast measurements and direct volumes of 108 mastectomy specimens from 88 women were collected prospectively. The authors performed a multivariate regression to build the optimal model for development of the predictive formula. The final model was then internally validated. A previously published formula was used as a reference. Mean (±SD) breast weight was 527.9 ± 227.6 g (range, 150 to 1250 g). After model selection, sternal notch-to-nipple, inframammary fold-to-nipple, and inframammary fold-to-fold projection distances emerged as the most important predictors. The resulting formula (the BREAST-V) showed an adjusted R² of 0.73. The estimated expected absolute error on new breasts is 89.7 g (95 percent CI, 62.4 to 119.1 g) and the expected relative error is 18.4 percent (95 percent CI, 12.9 to 24.3 percent). Application of the reference formula to the sample yielded worse predictions than those derived from the BREAST-V, showing an R² of 0.55. The BREAST-V is a reliable tool for predicting small to large breast volumes accurately for use as a complementary device in surgeon evaluation. An app entitled BREAST-V for both iOS and Android devices is currently available for free download in the Apple App Store and Google Play Store. Diagnostic, II.
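
    The model-building step can be illustrated with an ordinary least-squares fit of specimen weight on the three retained predictors. The measurements, weights, and resulting coefficients below are invented for illustration and are not the published BREAST-V coefficients.

    ```python
    # Sketch of building a volume-prediction formula: ordinary least squares on
    # anthropomorphic distances vs. specimen weight (hypothetical data).
    import numpy as np

    # columns: sternal notch-to-nipple, inframammary fold-to-nipple,
    # fold-to-fold projection (cm); y: specimen weight (g)
    X = np.array([[19, 7, 14], [22, 8, 16], [25, 10, 19], [28, 12, 22], [31, 13, 25]], float)
    y = np.array([250, 420, 640, 900, 1150], float)

    A = np.column_stack([X, np.ones(len(X))])        # add intercept column
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    pred = A @ coef
    print("coefficients:", np.round(coef, 1))
    print("mean absolute error (g):", round(float(np.mean(np.abs(pred - y))), 1))
    ```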

  7. Platelet Counts in Insoluble Platelet-Rich Fibrin Clots: A Direct Method for Accurate Determination.

    PubMed

    Kitamura, Yutaka; Watanabe, Taisuke; Nakamura, Masayuki; Isobe, Kazushige; Kawabata, Hideo; Uematsu, Kohya; Okuda, Kazuhiro; Nakata, Koh; Tanaka, Takaaki; Kawase, Tomoyuki

    2018-01-01

    Platelet-rich fibrin (PRF) clots have been used in regenerative dentistry most often, with the assumption that growth factor levels are concentrated in proportion to the platelet concentration. Platelet counts in PRF are generally determined indirectly by platelet counting in other liquid fractions. This study shows a method for direct estimation of platelet counts in PRF. To validate this method by determination of the recovery rate, whole-blood samples were obtained with an anticoagulant from healthy donors, and platelet-rich plasma (PRP) fractions were clotted with CaCl2 by centrifugation and digested with tissue-plasminogen activator. Platelet counts were estimated before clotting and after digestion using an automatic hemocytometer. The method was then tested on PRF clots. The quality of platelets was examined by scanning electron microscopy and flow cytometry. In PRP-derived fibrin matrices, the recovery rate of platelets and white blood cells was 91.6 and 74.6%, respectively, after 24 h of digestion. In PRF clots associated with small and large red thrombi, platelet counts were 92.6 and 67.2% of the respective total platelet counts. These findings suggest that our direct method is sufficient for estimating the number of platelets trapped in an insoluble fibrin matrix and for determining that platelets are distributed in PRF clots and red thrombi roughly in proportion to their individual volumes. Therefore, we propose this direct digestion method for more accurate estimation of platelet counts in most types of platelet-enriched fibrin matrix.

  8. Platelet Counts in Insoluble Platelet-Rich Fibrin Clots: A Direct Method for Accurate Determination

    PubMed Central

    Kitamura, Yutaka; Watanabe, Taisuke; Nakamura, Masayuki; Isobe, Kazushige; Kawabata, Hideo; Uematsu, Kohya; Okuda, Kazuhiro; Nakata, Koh; Tanaka, Takaaki; Kawase, Tomoyuki

    2018-01-01

    Platelet-rich fibrin (PRF) clots have been used in regenerative dentistry most often, with the assumption that growth factor levels are concentrated in proportion to the platelet concentration. Platelet counts in PRF are generally determined indirectly by platelet counting in other liquid fractions. This study shows a method for direct estimation of platelet counts in PRF. To validate this method by determination of the recovery rate, whole-blood samples were obtained with an anticoagulant from healthy donors, and platelet-rich plasma (PRP) fractions were clotted with CaCl2 by centrifugation and digested with tissue-plasminogen activator. Platelet counts were estimated before clotting and after digestion using an automatic hemocytometer. The method was then tested on PRF clots. The quality of platelets was examined by scanning electron microscopy and flow cytometry. In PRP-derived fibrin matrices, the recovery rate of platelets and white blood cells was 91.6 and 74.6%, respectively, after 24 h of digestion. In PRF clots associated with small and large red thrombi, platelet counts were 92.6 and 67.2% of the respective total platelet counts. These findings suggest that our direct method is sufficient for estimating the number of platelets trapped in an insoluble fibrin matrix and for determining that platelets are distributed in PRF clots and red thrombi roughly in proportion to their individual volumes. Therefore, we propose this direct digestion method for more accurate estimation of platelet counts in most types of platelet-enriched fibrin matrix. PMID:29450197

  9. Development of a SPECT-Based Three-Dimensional Treatment Planner for Radionuclide Therapy with Iodine-131.

    NASA Astrophysics Data System (ADS)

    Giap, Huan Bosco

    Accurate calculation of absorbed dose to target tumors and normal tissues in the body is an important requirement for establishing fundamental dose-response relationships for radioimmunotherapy. Two major obstacles have been the difficulty in obtaining an accurate patient-specific 3-D activity map in-vivo and calculating the resulting absorbed dose. This study investigated a methodology for 3-D internal dosimetry, which integrates the 3-D biodistribution of the radionuclide acquired from SPECT with a dose-point kernel convolution technique to provide the 3-D distribution of absorbed dose. Accurate SPECT images were reconstructed with appropriate methods for noise filtering, attenuation correction, and Compton scatter correction. The SPECT images were converted into activity maps using a calibration phantom. The activity map was convolved with an 131I dose-point kernel using a 3-D fast Fourier transform to yield a 3-D distribution of absorbed dose. The 3-D absorbed dose map was then processed to provide the absorbed dose distribution in regions of interest. This methodology can provide heterogeneous distributions of absorbed dose in volumes of any size and shape with nonuniform distributions of activity. Comparison of the activities quantitated by our SPECT methodology to true activities in an Alderson abdominal phantom (with spleen, liver, and spherical tumor) yielded errors of -16.3% to 4.4%. Volume quantitation errors ranged from -4.0 to 5.9% for volumes greater than 88 ml. The percentage differences of the average absorbed dose rates calculated by this methodology and the MIRD S-values were 9.1% for liver, 13.7% for spleen, and 0.9% for the tumor. Good agreement (percent differences were less than 8%) was found between the absorbed dose due to penetrating radiation calculated from this methodology and TLD measurement. More accurate estimates of the 3-D distribution of absorbed dose can be used as a guide in specifying the minimum activity to be administered to patients to deliver a prescribed absorbed dose to tumor without exceeding the toxicity limits of normal tissues.
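
    The dose-calculation step, convolving a quantitative 3-D activity map with an isotropic dose-point kernel via FFTs, can be sketched as follows. The Gaussian kernel is a stand-in for a tabulated 131I dose-point kernel, and the activity map is synthetic.

    ```python
    # Sketch of FFT-based dose-point-kernel convolution on a 3-D activity map.
    import numpy as np
    from scipy.signal import fftconvolve

    activity = np.zeros((64, 64, 64))
    activity[20:30, 20:30, 20:30] = 1.0          # uniform uptake in a small "tumor"

    r = np.linalg.norm(np.indices((15, 15, 15)) - 7, axis=0)
    kernel = np.exp(-(r / 3.0) ** 2)             # placeholder point kernel (arbitrary units)
    kernel /= kernel.sum()

    dose_rate = fftconvolve(activity, kernel, mode="same")   # 3-D absorbed-dose-rate map
    print(dose_rate.shape, float(dose_rate.max()))
    ```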

  10. Is There a Safe Lipoaspirate Volume? A Risk Assessment Model of Liposuction Volume as a Function of Body Mass Index.

    PubMed

    Chow, Ian; Alghoul, Mohammed S; Khavanin, Nima; Hanwright, Philip J; Mayer, Kristen E; Hume, Keith M; Murphy, Robert X; Gutowski, Karol A; Kim, John Y S

    2015-09-01

    No concrete data exist to support a specific volume at which liposuction becomes unsafe; surgeons rely on their own estimates, professional organization advisories, or institutional or government-imposed restrictions. This study represents the first attempt to quantify the comprehensive risk associated with varying liposuction volumes and its interaction with body mass index. Suction-assisted lipectomies were identified from the Tracking Operations and Outcomes for Plastic Surgeons database. Multivariate regression models incorporating the interaction between liposuction volume and body mass index were used to assess the influence of liposuction volume on complications and to develop a tool that returns a single adjusted odds ratio for any combination of body mass index and liposuction volume. Recursive partitioning was used to determine whether exceeding a threshold in liposuction volume per body mass index unit significantly increased complications. Sixty-nine of 4534 patients (1.5 percent) meeting inclusion criteria experienced a postoperative complication. Liposuction volume and body mass index were significant independent risk factors for complications. With progressively higher volumes, increasing body mass index reduced risk (OR, 0.99; 95 percent CI, 0.98 to 0.99; p = 0.007). Liposuction volumes in excess of 100 ml per unit of body mass index were an independent predictor of complications (OR, 4.58; 95 percent CI, 2.60 to 8.05; p < 0.001). Liposuction by board-certified plastic surgeons is safe, with a low risk of life-threatening complications. Traditional liposuction volume thresholds do not accurately convey individualized risk. The authors' risk assessment model demonstrates that volumes in excess of 100 ml per unit of body mass index confer an increased risk of complications. Therapeutic, III.
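
    The reported threshold translates into a simple per-case check: lipoaspirate volume divided by body mass index, flagged when it exceeds 100 mL per BMI unit. A minimal sketch with illustrative numbers:

    ```python
    # Flag cases exceeding the 100 mL-per-BMI-unit threshold reported above.
    def exceeds_volume_threshold(lipoaspirate_ml, bmi, ml_per_bmi_unit=100.0):
        return lipoaspirate_ml / bmi > ml_per_bmi_unit

    print(exceeds_volume_threshold(2500, 30))  # False: about 83 mL per BMI unit
    print(exceeds_volume_threshold(4000, 30))  # True: about 133 mL per BMI unit
    ```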

  11. Pancreatic mucinous cystic neoplasm size using CT volumetry, spherical and ellipsoid formulas: validation study.

    PubMed

    Chalian, Hamid; Seyal, Adeel Rahim; Rezai, Pedram; Töre, Hüseyin Gürkan; Miller, Frank H; Bentrem, David J; Yaghmai, Vahid

    2014-01-10

    The accuracy for determining pancreatic cyst volume with commonly used spherical and ellipsoid methods is unknown. The role of CT volumetry in volumetric assessment of pancreatic cysts needs to be explored. To compare volumes of the pancreatic cysts by CT volumetry, spherical and ellipsoid methods and determine their accuracy by correlating with actual volume as determined by EUS-guided aspiration. Setting This is a retrospective analysis performed at a tertiary care center. Patients Seventy-eight pathologically proven pancreatic cysts evaluated with CT and endoscopic ultrasound (EUS) were included. Design The volume of fourteen cysts that had been fully aspirated by EUS was compared to CT volumetry and the routinely used methods (ellipsoid and spherical volume). Two independent observers measured all cysts using commercially available software to evaluate inter-observer reproducibility for CT volumetry. The volume of pancreatic cysts as determined by various methods was compared using repeated measures analysis of variance. Bland-Altman plot and intraclass correlation coefficient were used to determine mean difference and correlation between observers and methods. The error was calculated as the percentage of the difference between the CT estimated volumes and the aspirated volume divided by the aspirated one. CT volumetry was comparable to aspirated volume (P=0.396) with very high intraclass correlation (r=0.891, P<0.001) and small mean difference (0.22 mL) and error (8.1%). Mean difference with aspirated volume and error were larger for ellipsoid (0.89 mL, 30.4%; P=0.024) and spherical (1.73 mL, 55.5%; P=0.004) volumes than CT volumetry. There was excellent inter-observer correlation in volumetry of the entire cohort (r=0.997, P<0.001). CT volumetry is accurate and reproducible. Ellipsoid and spherical volume overestimate the true volume of pancreatic cysts.
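
    The spherical and ellipsoid estimates follow the standard formulas V = (pi/6) d^3 and V = (pi/6) a b c, and the study's error metric is the difference from the aspirated volume divided by the aspirated volume. The diameters and aspirated volume below are illustrative only:

    ```python
    # Cyst volume by the spherical and ellipsoid formulas, with percent error
    # relative to an aspirated (reference) volume.
    import math

    def sphere_volume(d_cm):
        return math.pi / 6.0 * d_cm ** 3            # mL, since 1 cm^3 = 1 mL

    def ellipsoid_volume(a_cm, b_cm, c_cm):
        return math.pi / 6.0 * a_cm * b_cm * c_cm   # mL

    aspirated = 2.6                                  # mL (EUS-guided aspiration)
    for name, est in [("sphere", sphere_volume(1.9)),
                      ("ellipsoid", ellipsoid_volume(1.9, 1.6, 1.4))]:
        print(f"{name}: {est:.2f} mL, error = {(est - aspirated) / aspirated * 100:+.1f}%")
    ```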

  12. iProphet: Multi-level Integrative Analysis of Shotgun Proteomic Data Improves Peptide and Protein Identification Rates and Error Estimates*

    PubMed Central

    Shteynberg, David; Deutsch, Eric W.; Lam, Henry; Eng, Jimmy K.; Sun, Zhi; Tasman, Natalie; Mendoza, Luis; Moritz, Robert L.; Aebersold, Ruedi; Nesvizhskii, Alexey I.

    2011-01-01

    The combination of tandem mass spectrometry and sequence database searching is the method of choice for the identification of peptides and the mapping of proteomes. Over the last several years, the volume of data generated in proteomic studies has increased dramatically, which challenges the computational approaches previously developed for these data. Furthermore, a multitude of search engines have been developed that identify different, overlapping subsets of the sample peptides from a particular set of tandem mass spectrometry spectra. We present iProphet, the new addition to the widely used open-source suite of proteomic data analysis tools Trans-Proteomics Pipeline. Applied in tandem with PeptideProphet, it provides more accurate representation of the multilevel nature of shotgun proteomic data. iProphet combines the evidence from multiple identifications of the same peptide sequences across different spectra, experiments, precursor ion charge states, and modified states. It also allows accurate and effective integration of the results from multiple database search engines applied to the same data. The use of iProphet in the Trans-Proteomics Pipeline increases the number of correctly identified peptides at a constant false discovery rate as compared with both PeptideProphet and another state-of-the-art tool Percolator. As the main outcome, iProphet permits the calculation of accurate posterior probabilities and false discovery rate estimates at the level of sequence identical peptide identifications, which in turn leads to more accurate probability estimates at the protein level. Fully integrated with the Trans-Proteomics Pipeline, it supports all commonly used MS instruments, search engines, and computer platforms. The performance of iProphet is demonstrated on two publicly available data sets: data from a human whole cell lysate proteome profiling experiment representative of typical proteomic data sets, and from a set of Streptococcus pyogenes experiments more representative of organism-specific composite data sets. PMID:21876204

  13. A comparison of approaches for estimating bottom-sediment mass in large reservoirs

    USGS Publications Warehouse

    Juracek, Kyle E.

    2006-01-01

    Estimates of sediment and sediment-associated constituent loads and yields from drainage basins are necessary for the management of reservoir-basin systems to address important issues such as reservoir sedimentation and eutrophication. One method for the estimation of loads and yields requires a determination of the total mass of sediment deposited in a reservoir. This method involves a sediment volume-to-mass conversion using bulk-density information. A comparison of four computational approaches (partition, mean, midpoint, strategic) for using bulk-density information to estimate total bottom-sediment mass in four large reservoirs indicated that the differences among the approaches were not statistically significant. However, the lack of statistical significance may be a result of the small sample size. Compared to the partition approach, which was presumed to provide the most accurate estimates of bottom-sediment mass, the results achieved using the strategic, mean, and midpoint approaches differed by as much as ±4, ±20, and ±44 percent, respectively. It was concluded that the strategic approach may merit further investigation as a less time-consuming and less costly alternative to the partition approach.
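
    The underlying volume-to-mass conversion multiplies each sediment volume by a bulk density; in the partition approach, the reservoir is split into zones that each get their own measured bulk density. A minimal sketch with invented zone volumes and densities:

    ```python
    # Partition-style volume-to-mass conversion: sum of zone volume x bulk density.
    zones = [
        {"volume_m3": 1.2e6, "bulk_density_t_per_m3": 0.45},   # upstream, fine sediment
        {"volume_m3": 0.8e6, "bulk_density_t_per_m3": 0.70},   # mid-reservoir
        {"volume_m3": 0.5e6, "bulk_density_t_per_m3": 1.05},   # near-dam, coarser/compacted
    ]

    total_mass_t = sum(z["volume_m3"] * z["bulk_density_t_per_m3"] for z in zones)
    print(f"bottom-sediment mass: {total_mass_t:,.0f} metric tons")
    ```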

  14. Retrieval of Ocean Subsurface Particulate Backscattering Coefficient from Space-Borne CALIOP Lidar Measurement

    NASA Technical Reports Server (NTRS)

    Lu, Xiaomei; Hu, Yongxiang; Pelon, Jacques; Trepte, Chip; Liu, Katie; Rodier, Sharon; Zeng, Shan; Luckher, Patricia; Verhappen, Ron; Wilson, Jamie

    2016-01-01

    A new approach has been proposed to determine ocean subsurface particulate backscattering coefficient bbp from CALIOP 30° off-nadir lidar measurements. The new method also provides estimates of the particle volume scattering function at the 180° scattering angle. The CALIOP-based layer-integrated lidar backscatter and particulate backscattering coefficients are compared with the results obtained from MODIS ocean color measurements. The comparison analysis shows that ocean subsurface lidar backscatter and particulate backscattering coefficient bbp can be accurately obtained from CALIOP lidar measurements, thereby supporting the use of space-borne lidar measurements for ocean subsurface studies.

  15. FOCIS: A forest classification and inventory system using LANDSAT and digital terrain data

    NASA Technical Reports Server (NTRS)

    Strahler, A. H.; Franklin, J.; Woodcook, C. E.; Logan, T. L.

    1981-01-01

    Accurate, cost-effective stratification of forest vegetation and timber inventory is the primary goal of a Forest Classification and Inventory System (FOCIS). Conventional timber stratification using photointerpretation can be time-consuming, costly, and inconsistent from analyst to analyst. FOCIS was designed to overcome these problems by using machine processing techniques to extract and process tonal, textural, and terrain information from registered LANDSAT multispectral and digital terrain data. Comparison of samples from timber strata identified by FOCIS and by conventional procedures showed that both have about the same potential to reduce the variance of timber volume estimates over simple random sampling.

  16. Quantifying Water Stress Using Total Water Volumes and GRACE

    NASA Astrophysics Data System (ADS)

    Richey, A. S.; Famiglietti, J. S.; Druffel-Rodriguez, R.

    2011-12-01

    Water will follow oil as the next critical resource leading to unrest and uprisings globally. To better manage this threat, an improved understanding of the distribution of water stress is required today. This study builds upon previous efforts to characterize water stress by improving both the quantification of human water use and the definition of water availability. Current statistics on human water use are often outdated or inaccurately reported nationally, especially for groundwater. This study improves these estimates by defining human water use in two ways. First, we use NASA's Gravity Recovery and Climate Experiment (GRACE) to isolate the anthropogenic signal in water storage anomalies, which we equate to water use. Second, we quantify an ideal water demand by using average water requirements for the domestic, industrial, and agricultural water use sectors. Water availability has traditionally been limited to "renewable" water, which ignores large, stored water sources that humans use. We compare water stress estimates derived using either renewable water or the total volume of water globally. We use the best-available data to quantify total aquifer and surface water volumes, as compared to groundwater recharge and surface water runoff from land-surface models. The work presented here should provide a more realistic image of water stress by explicitly quantifying groundwater, defining water availability as total water supply, and using GRACE to more accurately quantify water use.

  17. Logistics Needs for Potential Deep Space Mission Scenarios Post Asteroid Redirect Crewed Mission

    NASA Technical Reports Server (NTRS)

    Lopez, Pedro, Jr.; Shultz, Eric; Mattfeld, Bryan; Stromgren, Chel; Goodliff, Kandyce

    2015-01-01

    The Asteroid Redirect Mission (ARM) is currently being explored as the next step towards deep space human exploration, with the ultimate goal of reaching Mars. NASA is currently investigating a number of potential human exploration missions, which will progressively increase the distance and duration that humans spend away from Earth. Missions include extended human exploration in cis-lunar space which, as conceived, would involve durations of around 60 days, and human missions to Mars, which are anticipated to be as long as 1000 days. The amount of logistics required to keep the crew alive and healthy for these missions is significant. It is therefore important that the design and planning for these missions include accurate estimates of logistics requirements. This paper provides a description of a process and calculations used to estimate mass and volume requirements for crew logistics, including consumables, such as food, personal items, gasses, and liquids. Determination of logistics requirements is based on crew size, mission duration, and the degree of closure of the environmental control life support system (ECLSS). Details are provided on the consumption rates for different types of logistics and how those rates were established. Results for potential mission scenarios are presented, including a breakdown of mass and volume drivers. Opportunities for mass and volume reduction are identified, along with potential threats that could possibly increase requirements.
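
    The basic bookkeeping scales per-person daily consumption rates by crew size and mission duration, with resupply of recyclable items reduced by the closure fraction of the ECLSS. The rates, items, and closure value below are placeholders, not NASA's published figures:

    ```python
    # Sketch of a logistics mass/volume estimate: crew x duration x per-person
    # daily rates, with ECLSS closure offsetting resupply of recyclable items.
    def logistics_totals(crew, days, rates, recyclable, ecls_closure=0.0):
        # rates: {item: (kg per person-day, m^3 per person-day)}
        mass = volume = 0.0
        for item, (kg, m3) in rates.items():
            factor = (1 - ecls_closure) if item in recyclable else 1.0
            mass += kg * crew * days * factor
            volume += m3 * crew * days * factor
        return mass, volume

    rates = {"food": (1.8, 0.004), "water": (2.5, 0.0025), "gases": (1.0, 0.002)}
    print(logistics_totals(crew=4, days=60, rates=rates,
                           recyclable={"water", "gases"}, ecls_closure=0.5))
    ```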

  18. Breast volume assessment: comparing five different techniques.

    PubMed

    Bulstrode, N; Bellamy, E; Shrotria, S

    2001-04-01

    Breast volume assessment is not routinely performed pre-operatively because as yet there is no accepted technique. There have been a variety of methods published, but this is the first study to compare these techniques. We compared volume measurements obtained from mammograms (previously compared to mastectomy specimens) with estimates of volume obtained from four other techniques: thermoplastic moulding, magnetic resonance imaging, Archimedes principle and anatomical measurements. We also assessed the acceptability of each method to the patient. Measurements were performed on 10 women, which produced results for 20 breasts. We were able to calculate regression lines between volume measurements obtained from mammography and the other four methods: (1) magnetic resonance imaging (MRI), 379+(0.75 MRI) [r=0.48], (2) Thermoplastic moulding, 132+(1.46 Thermoplastic moulding) [r=0.82], (3) Anatomical measurements, 168+(1.55 Anatomical measurements) [r=0.83], (4) Archimedes principle, 359+(0.6 Archimedes principle) [r=0.61], all units in cc. The regression curves for the different techniques are variable and it is difficult to reliably compare results. A standard method of volume measurement should be used when comparing volumes before and after intervention or between individual patients, and it is unreliable to compare volume measurements using different methods. Calculating the breast volume from mammography has previously been compared to mastectomy samples and shown to be reasonably accurate. However, we feel thermoplastic moulding shows promise and should be further investigated as it gives not only a volume assessment but a three-dimensional impression of the breast shape, which may be valuable in assessing cosmesis following breast-conserving surgery.
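
    The reported regression lines can be applied directly to convert an estimate from any of the four techniques onto the mammography-based volume scale (all in cc). A small helper using the slope/intercept pairs quoted above:

    ```python
    # mammography_cc = intercept + slope * method_cc, using the reported fits.
    REGRESSIONS = {
        "MRI":           (379.0, 0.75),  # r = 0.48
        "thermoplastic": (132.0, 1.46),  # r = 0.82
        "anatomical":    (168.0, 1.55),  # r = 0.83
        "archimedes":    (359.0, 0.60),  # r = 0.61
    }

    def to_mammography_scale(method, measured_cc):
        intercept, slope = REGRESSIONS[method]
        return intercept + slope * measured_cc

    print(to_mammography_scale("thermoplastic", 400))  # 132 + 1.46*400 = 716 cc
    ```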

  19. A probabilistic multidimensional approach to quantify large wood recruitment from hillslopes in mountainous-forested catchments

    NASA Astrophysics Data System (ADS)

    Cislaghi, Alessio; Rigon, Emanuel; Lenzi, Mario Aristide; Bischetti, Gian Battista

    2018-04-01

    Large wood (LW) plays a key role in physical, chemical, environmental, and biological processes in most natural and seminatural streams. However, it is also a source of hydraulic hazard in anthropised territories. Recruitment from fluvial processes has been the subject of many studies, whereas less attention has been given to hillslope recruitment, which is linked to episodic and spatially distributed events and requires a reliable and accurate slope stability model and a hillslope-channel transfer model. The purpose of this study is to develop an innovative LW hillslope-recruitment estimation approach that combines forest stand characteristics in a spatially distributed form, a probabilistic multidimensional slope stability model able to include the reinforcement exerted by roots, and a hillslope-channel transfer procedure. The approach was tested on a small mountain headwater catchment in the eastern Italian Alps that is prone to shallow landslide and debris flow phenomena. The slope stability model (which had not been calibrated) provided accurate performance in terms of unstable-area identification according to the landslide inventory (AUC = 0.832) and of LW volume estimation in comparison with the LW volume produced by inventoried landslides (7702 m³, corresponding to a recurrence time of about 30 years in the susceptibility curve). The results showed that most LW potentially mobilised by landslides does not reach the channel network (only about 16%), in agreement with the few data reported by other studies, as well as the data normalized for unit length of channel and unit length of channel per year (0-116 m³/km and 0-4 m³/km per year). This study represents an important contribution to LW research. A rigorous and site-specific estimation of LW hillslope recruitment should, in fact, be an integral part of more general studies on LW dynamics, for forest planning and management, and for positioning in-channel wood retention structures.

  20. How High is that Dune? A Comparison of Methods Used to Constrain the Morphometry of Aeolian Bedforms on Mars

    NASA Technical Reports Server (NTRS)

    Bourke, M.; Balme, M.; Beyer, R. A.; Williams, K. K.

    2004-01-01

    Methods traditionally used to estimate the relative height of surface features on Mars include: photoclinometry, shadow length and stereography. The MOLA data set enables a more accurate assessment of the surface topography of Mars. However, many small-scale aeolian bedforms remain below the sample resolution of the MOLA data set. In response, a number of research teams have adopted and refined existing methods and applied them to high-resolution (2-6 m/pixel) narrow-angle MOC satellite images. Collectively, the methods provide data on a range of morphometric parameters (many not previously available for dunes on Mars). These include dune height, width, length, surface area, volume, and longitudinal and cross profiles. These data will facilitate a more accurate analysis of aeolian bedforms on Mars. In this paper we undertake a comparative analysis of methods used to determine the height of aeolian dunes and ripples.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baer, M.R.; Hobbs, M.L.; McGee, B.C.

    Exponential-13,6 (EXP-13,6) potential parameters for 750 gases composed of 48 elements were determined and assembled in a database, referred to as the JCZS database, for use with the Jacobs Cowperthwaite Zwisler equation of state (JCZ3-EOS). The EXP-13,6 force constants were obtained by using literature values of Lennard-Jones (LJ) potential functions, by using corresponding states (CS) theory, by matching pure liquid shock Hugoniot data, and by using molecular volume to determine the approach radii with the well depth estimated from high-pressure isentropes. The JCZS database was used to accurately predict detonation velocity, pressure, and temperature for 50 different explosives with initial densities ranging from 0.25 g/cm³ to 1.97 g/cm³. Accurate predictions were also obtained for pure liquid shock Hugoniots, static properties of nitrogen, and gas detonations at high initial pressures.

  2. Enumerating Sparse Organisms in Ships’ Ballast Water: Why Counting to 10 Is Not So Easy

    PubMed Central

    2011-01-01

    To reduce ballast water-borne aquatic invasions worldwide, the International Maritime Organization and United States Coast Guard have each proposed discharge standards specifying maximum concentrations of living biota that may be released in ships’ ballast water (BW), but these regulations still lack guidance for standardized type approval and compliance testing of treatment systems. Verifying whether BW meets a discharge standard poses significant challenges. Properly treated BW will contain extremely sparse numbers of live organisms, and robust estimates of rare events require extensive sampling efforts. A balance of analytical rigor and practicality is essential to determine the volume of BW that can be reasonably sampled and processed, yet yield accurate live counts. We applied statistical modeling to a range of sample volumes, plankton concentrations, and regulatory scenarios (i.e., levels of type I and type II errors), and calculated the statistical power of each combination to detect noncompliant discharge concentrations. The model expressly addresses the roles of sampling error, BW volume, and burden of proof on the detection of noncompliant discharges in order to establish a rigorous lower limit of sampling volume. The potential effects of recovery errors (i.e., incomplete recovery and detection of live biota) in relation to sample volume are also discussed. PMID:21434685

  3. Measured body composition and geometrical data of four "virtual family" members for thermoregulatory modeling

    NASA Astrophysics Data System (ADS)

    Xu, Xiaojiang; Rioux, Timothy P.; MacLeod, Tynan; Patel, Tejash; Rome, Maxwell N.; Potter, Adam W.

    2017-03-01

    The purpose of this paper is to develop a database of tissue composition, distribution, volume, surface area, and skin thickness from anatomically correct human models, the virtual family. These models were based on high-resolution magnetic resonance imaging (MRI) of human volunteers, including two adults (male and female) and two children (boy and girl). In the segmented image dataset, each voxel is associated with a label which refers to the tissue type that occupies that specific cubic millimeter of the body. The tissue volume was calculated from the number of voxels with the same label. Volumes of 24 organs in the body and volumes of 7 tissues in 10 specific body regions were calculated. Surface area was calculated from the collection of voxels that are touching the exterior air. Skin thickness was estimated from the skin volume and surface area. The differences between the calculated and original masses were about 3% or less for tissues or organs that are important to thermoregulatory modeling, e.g., muscle, skin, and fat. This accurate database of body tissue distributions and geometry is essential for the development of human thermoregulatory models. Data derived from medical imaging provide new effective tools to enhance thermal physiology research and gain deeper insight into the mechanisms of how the human body maintains heat balance.
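
    The voxel bookkeeping described above (volume from label counts, surface area from body-voxel faces touching air, and skin thickness as skin volume over surface area) can be sketched on a tiny synthetic label array; the real datasets are full-body MRI segmentations at millimeter resolution.

    ```python
    import numpy as np

    AIR, SKIN, FAT = 0, 1, 2
    labels = np.zeros((20, 20, 20), dtype=np.uint8)
    labels[5:15, 5:15, 5:15] = SKIN            # 10 x 10 x 10 mm body block
    labels[6:14, 6:14, 6:14] = FAT             # interior tissue; leaves a 1-voxel skin shell

    voxel_mm3 = 1.0                            # 1 mm isotropic voxels
    skin_volume = np.count_nonzero(labels == SKIN) * voxel_mm3

    body = labels != AIR                       # surface area: body faces touching air
    area_mm2 = 0
    for axis in range(3):
        for shift in (1, -1):
            neighbor = np.roll(body, shift, axis=axis)
            area_mm2 += np.count_nonzero(body & ~neighbor)   # border is all air, so wrap is safe

    print("skin volume (mm^3):", skin_volume)                 # 488
    print("surface area (mm^2):", area_mm2)                   # 600
    print("mean skin thickness (mm):", round(skin_volume / area_mm2, 2))
    ```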

  4. Enumerating sparse organisms in ships' ballast water: why counting to 10 is not so easy.

    PubMed

    Miller, A Whitman; Frazier, Melanie; Smith, George E; Perry, Elgin S; Ruiz, Gregory M; Tamburri, Mario N

    2011-04-15

    To reduce ballast water-borne aquatic invasions worldwide, the International Maritime Organization and United States Coast Guard have each proposed discharge standards specifying maximum concentrations of living biota that may be released in ships' ballast water (BW), but these regulations still lack guidance for standardized type approval and compliance testing of treatment systems. Verifying whether BW meets a discharge standard poses significant challenges. Properly treated BW will contain extremely sparse numbers of live organisms, and robust estimates of rare events require extensive sampling efforts. A balance of analytical rigor and practicality is essential to determine the volume of BW that can be reasonably sampled and processed, yet yield accurate live counts. We applied statistical modeling to a range of sample volumes, plankton concentrations, and regulatory scenarios (i.e., levels of type I and type II errors), and calculated the statistical power of each combination to detect noncompliant discharge concentrations. The model expressly addresses the roles of sampling error, BW volume, and burden of proof on the detection of noncompliant discharges in order to establish a rigorous lower limit of sampling volume. The potential effects of recovery errors (i.e., incomplete recovery and detection of live biota) in relation to sample volume are also discussed.
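
    The statistical argument can be sketched with a Poisson model: the count of organisms in a sampled volume V at concentration c is Poisson(cV), the critical count is set so that a compliant discharge at the standard (here 10 organisms per m^3) fails at most 5% of the time, and power is the probability that a noncompliant discharge exceeds that count. The concentrations and volumes below are illustrative, not the paper's scenarios.

    ```python
    # Poisson power calculation for detecting a noncompliant discharge as a
    # function of sampled volume.
    from scipy.stats import poisson

    def detection_power(sample_m3, standard_per_m3, true_per_m3, alpha=0.05):
        mu0 = standard_per_m3 * sample_m3
        k_crit = poisson.ppf(1 - alpha, mu0)            # fail the test if count > k_crit
        return 1 - poisson.cdf(k_crit, true_per_m3 * sample_m3)

    for volume in (1, 3, 7):                            # m^3 sampled
        print(volume, "m^3 ->", round(detection_power(volume, 10, 15), 3))
    ```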

  5. Pore-scale micro-computed-tomography imaging: Nonwetting-phase cluster-size distribution during drainage and imbibition

    NASA Astrophysics Data System (ADS)

    Georgiadis, A.; Berg, S.; Makurat, A.; Maitland, G.; Ott, H.

    2013-09-01

    We investigated the cluster-size distribution of the residual nonwetting phase in a sintered glass-bead porous medium at two-phase flow conditions, by means of micro-computed-tomography (μCT) imaging with pore-scale resolution. Cluster-size distribution functions and cluster volumes were obtained by image analysis for a range of injected pore volumes under both imbibition and drainage conditions; the field of view was larger than the porosity-based representative elementary volume (REV). We did not attempt to make a definition for a two-phase REV but used the nonwetting-phase cluster-size distribution as an indicator. Most of the nonwetting-phase total volume was found to be contained in clusters that were one to two orders of magnitude larger than the porosity-based REV. The largest observed clusters in fact ranged in volume from 65% to 99% of the entire nonwetting phase in the field of view. As a consequence, the largest clusters observed were statistically not represented and were found to be smaller than the estimated maximum cluster length. The results indicate that the two-phase REV is larger than the field of view attainable by μCT scanning, at a resolution which allows for the accurate determination of cluster connectivity.

  6. Adjusting for unrecorded consumption in survey and per capita sales data: quantification of impact on gender- and age-specific alcohol-attributable fractions for oral and pharyngeal cancers in Great Britain.

    PubMed

    Meier, Petra Sylvia; Meng, Yang; Holmes, John; Baumberg, Ben; Purshouse, Robin; Hill-McManus, Daniel; Brennan, Alan

    2013-01-01

    Large discrepancies are typically found between per capita alcohol consumption estimated via survey data compared with sales, excise or production figures. This may lead to significant inaccuracies when calculating levels of alcohol-attributable harms. Using British data, we demonstrate an approach to adjusting survey data to give more accurate estimates of per capita alcohol consumption. First, sales and survey data are adjusted to account for potential biases (e.g. self-pouring, under-sampled populations) using evidence from external data sources. Secondly, survey and sales data are aligned using different implementations of Rehm et al.'s method [in (2010) Statistical modeling of volume of alcohol exposure for epidemiological studies of population health: the US example. Pop Health Metrics 8, 1-12]. Thirdly, the impact of our approaches is tested by using our revised survey dataset to calculate alcohol-attributable fractions (AAFs) for oral and pharyngeal cancers. British sales data under-estimate per capita consumption by 8%, primarily due to illicit alcohol. Adjustments to survey data increase per capita consumption estimates by 35%, primarily due to under-sampling of dependent drinkers and under-estimation of home-poured spirits volumes. Before aligning sales and survey data, the revised survey estimate remains 22% lower than the revised sales estimate. Revised AAFs for oral and pharyngeal cancers are substantially larger with our preferred method for aligning data sources, yielding increases in an AAF from the original survey dataset of 0.47-0.60 (males) and 0.28-0.35 (females). It is possible to use external data sources to adjust survey data to reduce the under-estimation of alcohol consumption and then account for residual under-estimation using a statistical calibration technique. These revisions lead to markedly higher estimated levels of alcohol-attributable harm.
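
    The AAF step can be illustrated with the standard population-attributable-fraction form, AAF = sum_i p_i (RR_i - 1) / [sum_i p_i (RR_i - 1) + 1], where p_i is the prevalence of drinking category i and RR_i its relative risk. The prevalences and relative risks below are invented, not the study's inputs.

    ```python
    # Standard attributable-fraction calculation over consumption categories.
    def alcohol_attributable_fraction(prevalence, relative_risk):
        excess = sum(p * (rr - 1.0) for p, rr in zip(prevalence, relative_risk))
        return excess / (excess + 1.0)

    p_categories = [0.50, 0.30, 0.15, 0.05]     # abstainer / low / medium / high
    rr_oral_ca   = [1.00, 1.3, 2.5, 5.4]        # illustrative relative risks
    print(round(alcohol_attributable_fraction(p_categories, rr_oral_ca), 2))
    ```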

  7. A predictive nondestructive model for the covariation of tree height, diameter, and stem volume scaling relationships.

    PubMed

    Zhang, Zhongrui; Zhong, Quanlin; Niklas, Karl J; Cai, Liang; Yang, Yusheng; Cheng, Dongliang

    2016-08-24

    Metabolic scaling theory (MST) posits that the scaling exponents among plant height H, diameter D, and biomass M will covary across phyletically diverse species. However, the relationships between scaling exponents and normalization constants remain unclear. Therefore, we developed a predictive model for the covariation of H, D, and stem volume V scaling relationships and used data from Chinese fir (Cunninghamia lanceolata) in Jiangxi province, China to test it. As predicted by the model and supported by the data, normalization constants are positively correlated with their associated scaling exponents for D vs. V and H vs. V, whereas normalization constants are negatively correlated with the scaling exponents of H vs. D. The prediction model also yielded reliable estimations of V (mean absolute percentage error = 10.5 ± 0.32 SE across 12 model calibrated sites). These results (1) support a totally new covariation scaling model, (2) indicate that differences in stem volume scaling relationships at the intra-specific level are driven by anatomical or ecophysiological responses to site quality and/or management practices, and (3) provide an accurate non-destructive method for predicting Chinese fir stem volume.
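
    Scaling relationships of the form V = beta D^alpha are conventionally fit on log-transformed data, with the slope giving the scaling exponent and the intercept the normalization constant. A sketch on synthetic diameter-volume pairs:

    ```python
    # Log-log fit of stem volume against diameter to recover the scaling
    # exponent and normalization constant (synthetic data).
    import numpy as np

    d = np.array([8, 12, 16, 20, 26, 32], float)            # stem diameter (cm)
    noise = np.exp(np.random.default_rng(0).normal(0, 0.05, d.size))
    v = 0.0001 * d ** 2.4 * noise                            # stem volume (m^3)

    alpha, log_beta = np.polyfit(np.log(d), np.log(v), 1)
    print(f"scaling exponent alpha = {alpha:.2f}, normalization beta = {np.exp(log_beta):.5f}")
    ```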

  8. A computational method for sharp interface advection.

    PubMed

    Roenby, Johan; Bredmose, Henrik; Jasak, Hrvoje

    2016-11-01

    We devise a numerical method for passive advection of a surface, such as the interface between two incompressible fluids, across a computational mesh. The method is called isoAdvector, and is developed for general meshes consisting of arbitrary polyhedral cells. The algorithm is based on the volume of fluid (VOF) idea of calculating the volume of one of the fluids transported across the mesh faces during a time step. The novelty of the isoAdvector concept consists of two parts. First, we exploit an isosurface concept for modelling the interface inside cells in a geometric surface reconstruction step. Second, from the reconstructed surface, we model the motion of the face-interface intersection line for a general polygonal face to obtain the time evolution within a time step of the submerged face area. Integrating this submerged area over the time step leads to an accurate estimate for the total volume of fluid transported across the face. The method was tested on simple two-dimensional and three-dimensional interface advection problems on both structured and unstructured meshes. The results are very satisfactory in terms of volume conservation, boundedness, surface sharpness and efficiency. The isoAdvector method was implemented as an OpenFOAM® extension and is published as open source.

  9. Role of bioimpedance vectorial analysis in cardio-renal syndromes.

    PubMed

    Aspromonte, Nadia; Cruz, Dinna N; Ronco, Claudio; Valle, Roberto

    2012-01-01

    The cardio-renal syndromes (CRS) are the result of complex bidirectional organ cross-talk between the heart and kidney, with tremendous overlap of diseases such as coronary heart disease, heart failure (HF), and renal dysfunction in the same patient. Volume overload plays an important role in the pathophysiology of CRS. The appropriate treatment of overhydration, particularly in HF and in chronic kidney disease, has been associated with improved outcomes and blood pressure control. Clinical examination alone is often insufficient for accurate assessment of volume status because significant volume overload can exist even in the absence of peripheral or pulmonary edema on physical examination or radiography. Bioelectrical impedance techniques increasingly are being used in the management of patients with HF and those on chronic dialysis. These methods provide more objective estimates of volume status in such patients. Used in conjunction with standard clinical assessment and biomarkers such as the natriuretic peptides, bioimpedance analysis may be useful in guiding pharmacologic and ultrafiltration therapies and subsequently restoring such patients to a euvolemic or optivolemic state. In this article, we review the use of these techniques in CRS. Copyright © 2012 Elsevier Inc. All rights reserved.

  10. Estimation of gonad volume, fecundity, and reproductive stage of shovelnose sturgeon using sonography and endoscopy with application to the endangered pallid sturgeon

    USGS Publications Warehouse

    Bryan, J.L.; Wildhaber, M.L.; Papoulias, D.M.; DeLonay, A.J.; Tillitt, D.E.; Annis, M.L.

    2007-01-01

    Most species of sturgeon are declining in the Mississippi River Basin of North America, including pallid (Scaphirhynchus albus F. and R.) and shovelnose sturgeons (S. platorynchus R.). Understanding the reproductive cycle of sturgeon in the Mississippi River Basin is important in evaluating the status and viability of sturgeon populations. We used non-invasive, non-lethal methods for examining the internal reproductive organs of shovelnose and pallid sturgeon. We used ultrasound to measure egg diameter, fecundity, and gonad volume; an endoscope was used to visually examine the gonad. We found the ultrasound to accurately measure the gonad volume, but it underestimated egg diameter by 52%. After correcting for the measurement error, the ultrasound accurately measured the gonad volume, but it was higher than the true gonad volume for stages I and II. The ultrasound underestimated the fecundity of shovelnose sturgeon by 5%. The ultrasound fecundity was lower than the true fecundity for stage III and during August. Using the endoscope, we viewed seven different egg color categories. Using a model selection procedure, the presence of four egg categories correctly predicted the reproductive stage ± one reproductive stage of shovelnose sturgeon 95% of the time. For pallid sturgeon, the ultrasound overestimated the density of eggs by 49% and the endoscope was able to view eggs in 50% of the pallid sturgeon. Individually, the ultrasound and endoscope can be used to assess certain reproductive characteristics in sturgeon. The use of both methods at the same time can be complementary depending on the parameter measured. These methods can be used to track gonad characteristics, including measuring the gonadosomatic index in individuals and/or populations through time, which can be very useful when associating gonad characteristics with environmental spawning triggers or with repeated examinations of individual fish throughout the reproductive cycle.

  11. The impact of reliable prebolus T1 measurements or a fixed T1 value in the assessment of glioma patients with dynamic contrast enhancing MRI.

    PubMed

    Tietze, Anna; Mouridsen, Kim; Mikkelsen, Irene Klærke

    2015-06-01

    Accurate quantification of hemodynamic parameters using dynamic contrast enhanced (DCE) MRI requires a measurement of tissue T1 prior to contrast injection. We evaluate (i) T1 estimation using the variable flip angle (VFA) and the saturation recovery (SR) techniques and (ii) investigate whether accurate estimation of DCE parameters outperforms a time-saving approach with a predefined T1 value when differentiating high- from low-grade gliomas. The accuracy and precision of T1 measurements, acquired by VFA and SR, were investigated by computer simulations and in glioma patients using an equivalence test (p > 0.05 showing significant difference). The permeability measure Ktrans, cerebral blood flow (CBF), and plasma volume, Vp, were calculated in 42 glioma patients, using a fixed T1 of 1500 ms or an individual T1 measurement using SR. The areas under the receiver operating characteristic curves (AUCs) were used as measures of accuracy in differentiating tumor grade. The T1 values obtained by VFA showed larger variation compared with those obtained using SR, both in the digital phantom and in the human data (p > 0.05). Although a fixed T1 introduced a bias into the DCE calculation, this had only minor impact on the accuracy of differentiating high-grade from low-grade gliomas (AUCfix = 0.906 and AUCind = 0.884 for Ktrans; AUCfix = 0.863 and AUCind = 0.856 for Vp; p for AUC comparison > 0.05). T1 measurements by VFA were less precise, and the SR method is preferable when accurate parameter estimation is required. Semiquantitative DCE values, based on predefined T1 values, were sufficient to perform tumor grading in our study.

  12. Counting Synapses Using FIB/SEM Microscopy: A True Revolution for Ultrastructural Volume Reconstruction.

    PubMed

    Merchán-Pérez, Angel; Rodriguez, José-Rodrigo; Alonso-Nanclares, Lidia; Schertel, Andreas; Defelipe, Javier

    2009-01-01

    The advent of transmission electron microscopy (TEM) in the 1950s represented a fundamental step in the study of neuronal circuits. The application of this technique soon led to the realization that the number of synapses changes during the course of normal life, as well as under certain pathological or experimental circumstances. Since then, one of the main goals in neurosciences has been to define simple and accurate methods to estimate the magnitude of these changes. In contrast to the analysis of single sections, TEM reconstructions are extremely time-consuming and difficult. Therefore, most quantitative studies use stereological methods to define the three-dimensional characteristics of synaptic junctions that are studied in two dimensions. Here, to count the exact number of synapses per unit of volume, we have applied a new three-dimensional reconstruction method that involves the combination of focused ion beam milling and scanning electron microscopy (FIB/SEM). We show that the images obtained with FIB/SEM are similar to those obtained with TEM, but with the advantage that FIB/SEM permits serial reconstructions of large volumes of tissue to be generated rapidly and automatically. Furthermore, we compared the estimates of the number of synapses obtained with stereological methods with the values obtained by FIB/SEM reconstructions. We concluded that FIB/SEM not only provides the actual number of synapses per volume but is also much easier and faster to use than other currently available TEM methods. More importantly, it also avoids most of the errors introduced by stereological methods and overcomes the difficulties associated with these techniques.

  13. Measurement of the dynamic viscosity of hybrid engine oil-CuO-MWCNT nanofluid, development of a practical viscosity correlation and utilizing the artificial neural network

    NASA Astrophysics Data System (ADS)

    Aghaei, Alireza; Khorasanizadeh, Hossein; Sheikhzadeh, Ghanbar Ali

    2018-01-01

    The main objectives of this study have been measurement of the dynamic viscosity of CuO-MWCNTs/SAE 5w-50 hybrid nanofluid, utilization of artificial neural networks (ANN), and development of a new viscosity model. The new nanofluid has been prepared by a two-stage procedure with volume fractions of 0.05, 0.1, 0.25, 0.5, 0.75 and 1%. Then, utilizing a Brookfield viscometer, its dynamic viscosity has been measured for temperatures of 5, 15, 25, 35, 45, and 55 °C. The experimental results demonstrate that the viscosity increases with increasing nanoparticle volume fraction and decreases with increasing temperature. Based on the experimental data, the maximum and minimum nanofluid viscosity enhancements, when the volume fraction increases from 0.05 to 1%, are 35.52% and 12.92% for constant temperatures of 55 and 15 °C, respectively. The higher viscosity of engine oil at higher temperatures is an advantage; thus, this result is important. The measured nanofluid viscosity magnitudes at various shear rates show that this hybrid nanofluid is Newtonian. An ANN model has been employed to predict the viscosity of the CuO-MWCNTs/SAE 5w-50 hybrid nanofluid, and the results showed that the ANN can estimate the viscosity efficiently and accurately. Finally, for viscosity estimation, a new temperature- and volume-fraction-based third-degree polynomial empirical model has been developed. The comparison shows that this model is in good agreement with the experimental data.
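
    The final correlation can be illustrated by a least-squares fit of viscosity on all temperature and volume-fraction terms up to third degree. The synthetic viscosity surface and the resulting coefficients below are placeholders; the paper's fitted correlation is not reproduced.

    ```python
    # Fit a third-degree polynomial in temperature T and volume fraction phi.
    import itertools
    import numpy as np

    T = np.array([5, 15, 25, 35, 45, 55], float)            # deg C
    phi = np.array([0.05, 0.1, 0.25, 0.5, 0.75, 1.0])       # volume fraction (%)
    TT, PP = np.meshgrid(T, phi)
    mu = 400 * np.exp(-0.03 * TT) * (1 + 0.25 * PP)         # synthetic viscosity (cP)

    # design matrix with all terms T^i * phi^j for i + j <= 3
    terms = [(i, j) for i, j in itertools.product(range(4), range(4)) if i + j <= 3]
    A = np.column_stack([TT.ravel() ** i * PP.ravel() ** j for i, j in terms])
    coef, *_ = np.linalg.lstsq(A, mu.ravel(), rcond=None)

    pred = A @ coef
    print("max relative fit error:", float(np.max(np.abs(pred - mu.ravel()) / mu.ravel())))
    ```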

  14. Estimating floodwater depths from flood inundation maps and topography

    USGS Publications Warehouse

    Cohen, Sagy; Brakenridge, G. Robert; Kettner, Albert; Bates, Bradford; Nelson, Jonathan M.; McDonald, Richard R.; Huang, Yu-Fen; Munasinghe, Dinuke; Zhang, Jiaqi

    2018-01-01

    Information on flood inundation extent is important for understanding societal exposure, water storage volumes, flood wave attenuation, future flood hazard, and other variables. A number of organizations now provide flood inundation maps based on satellite remote sensing. These data products can efficiently and accurately provide the areal extent of a flood event, but do not provide floodwater depth, an important attribute for first responders and damage assessment. Here we present a new methodology and a GIS-based tool, the Floodwater Depth Estimation Tool (FwDET), for estimating floodwater depth based solely on an inundation map and a digital elevation model (DEM). We compare the FwDET results against water depth maps derived from hydraulic simulation of two flood events: a large-scale event for which we use a medium-resolution (10 m) input layer and a small-scale event for which we use a high-resolution (LiDAR; 1 m) input. Further testing is performed for two inundation maps with a number of challenging features that include a narrow valley, a large reservoir, and an urban setting. The results show that FwDET can accurately calculate floodwater depth for diverse flooding scenarios but also leads to considerable bias in locations where the inundation extent does not align well with the DEM. In these locations, manual adjustment or higher-spatial-resolution input is required.
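
    The core idea, carrying the DEM elevation at the flood-extent boundary inward as the local water-surface elevation and subtracting the DEM, can be sketched on a tiny synthetic grid. This is a simplified illustration of the approach, not the published FwDET implementation.

    ```python
    # Floodwater depth from an inundation mask and a DEM: assign each flooded
    # cell the elevation of its nearest flood-boundary cell, then subtract the DEM.
    import numpy as np
    from scipy import ndimage

    dem = np.array([[3.0, 2.5, 2.0, 2.5, 3.0],
                    [3.0, 2.2, 1.5, 2.2, 3.0],
                    [3.0, 2.4, 1.8, 2.4, 3.0]])
    flooded = dem <= 2.5                                   # inundation map (True = wet)

    # boundary cells: flooded cells with at least one dry neighbor
    dry_nearby = ndimage.binary_dilation(~flooded)
    boundary = flooded & dry_nearby

    # for every cell, index of the nearest boundary cell
    _, (bi, bj) = ndimage.distance_transform_edt(~boundary, return_indices=True)
    water_surface = dem[bi, bj]                            # boundary elevation carried inward

    depth = np.where(flooded, np.clip(water_surface - dem, 0, None), 0.0)
    print(np.round(depth, 2))
    ```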

  15. Embedded fiber-optic sensing for accurate internal monitoring of cell state in advanced battery management systems part 1: Cell embedding method and performance

    NASA Astrophysics Data System (ADS)

    Raghavan, Ajay; Kiesel, Peter; Sommer, Lars Wilko; Schwartz, Julian; Lochbaum, Alexander; Hegyi, Alex; Schuh, Andreas; Arakaki, Kyle; Saha, Bhaskar; Ganguli, Anurag; Kim, Kyung Ho; Kim, ChaeAh; Hah, Hoe Jin; Kim, SeokKoo; Hwang, Gyu-Ok; Chung, Geun-Chang; Choi, Bokkyu; Alamgir, Mohamed

    2017-02-01

    A key challenge hindering the mass adoption of Lithium-ion and other next-gen chemistries in advanced battery applications such as hybrid/electric vehicles (xEVs) has been management of their functional performance for more effective battery utilization and control over their life. Contemporary battery management systems (BMS) reliant on monitoring external parameters such as voltage and current to ensure safe battery operation with the required performance usually result in overdesign and inefficient use of capacity. More informative embedded sensors are desirable for internal cell state monitoring, which could provide accurate state-of-charge (SOC) and state-of-health (SOH) estimates and early failure indicators. Here we present a promising new embedded sensing option developed by our team for cell monitoring, fiber-optic sensors. High-performance large-format pouch cells with embedded fiber-optic sensors were fabricated. The first of this two-part paper focuses on the embedding method details and performance of these cells. The seal integrity, capacity retention, cycle life, compatibility with existing module designs, and mass-volume cost estimates indicate their suitability for xEV and other advanced battery applications. The second part of the paper focuses on the internal strain and temperature signals obtained from these sensors under various conditions and their utility for high-accuracy cell state estimation algorithms.

  16. Evaluating the accuracy of wear formulae for acetabular cup liners.

    PubMed

    Wu, James Shih-Shyn; Hsu, Shu-Ling; Chen, Jian-Horng

    2010-02-01

    This study proposes two methods for exploring the wear volume of a worn liner. The first method is a numerical method, in which SolidWorks software is used to create models of the worn out regions of liners at various wear directions and depths. The second method is an experimental one, in which a machining center is used to mill polyoxymethylene to manufacture worn and unworn liner models, then the volumes of the models are measured. The results show that the SolidWorks software is a good tool for presenting the wear pattern and volume of a worn liner. The formula provided by Ilchmann is the most suitable for computing liner volume loss, but is not accurate enough. This study suggests that a more accurate wear formula is required. This is crucial for accurate evaluation of the performance of hip components implanted in patients, as well as for designing new hip components.

  17. Novel imaging analysis system to measure the spatial dimension of engineered tissue construct.

    PubMed

    Choi, Kyoung-Hwan; Yoo, Byung-Su; Park, So Ra; Choi, Byung Hyune; Min, Byoung-Hyun

    2010-02-01

    The measurement of the spatial dimensions of tissue-engineered constructs is very important for their clinical applications. In this study, a novel method to measure the volume of tissue-engineered constructs was developed using iterative mathematical computations. The method measures and analyzes three-dimensional (3D) parameters of a construct to estimate its actual volume using a sequence of software-based mathematical algorithms. The mathematical algorithm is composed of two stages: shape extraction and determination of volume. The shape extraction utilized 3D images of a construct (length, width, and thickness) captured by a high-quality camera with a charge-coupled device. The surface of the 3D images was then divided into fine sections. The area of each section was measured and combined to obtain the total surface area. The 3D volume of the target construct was then mathematically obtained using its total surface area and thickness. The accuracy of the measurement method was verified by comparing the results with those obtained from the hydrostatic weighing method (Korea Research Institute of Standards and Science [KRISS], Korea). The mean difference in volume between the two methods was 0.0313 ± 0.0003% (n = 5, P = 0.523), with no significant statistical difference. In conclusion, our image-based spatial measurement system is a reliable and easy method to obtain an accurate 3D volume of a tissue-engineered construct.

  18. Accuracy of using Diagnosis Procedure Combination administrative claims data for estimating the amount of opioid consumption among cancer patients in Japan.

    PubMed

    Iwamoto, Momoko; Higashi, Takahiro; Miura, Hiroki; Kawaguchi, Takahiro; Tanaka, Shigeyuki; Yamashita, Itsuku; Yoshimoto, Tetsusuke; Yoshida, Shigeaki; Matoba, Motohiro

    2015-11-01

    The state of opioid consumption among cancer patients has never been comprehensively investigated in Japan. The Diagnosis Procedure Combination claims data may be used to measure and monitor opioid consumption among cancer patients, but the accuracy of using the Diagnosis Procedure Combination data for this purpose has never been tested. We aimed to ascertain the accuracy of using the Diagnosis Procedure Combination claims data for estimating total opioid analgesic consumption by cancer patients compared with electronic medical records at Aomori Prefectural Central Hospital. We calculated percent differences between estimates obtained from electronic medical records and Diagnosis Procedure Combination claims data by month and drug type (morphine, oxycodone, fentanyl, buprenorphine, codeine and tramadol) between 1 October 2012 and 30 September 2013, and further examined the causes of discrepancy by reviewing medical and administrative charts between April and July 2013. Percent differences varied by month for drug types with small prescription volumes, but less so for drugs with larger prescription volumes. Differences also tended to diminish when consumption was compared for a year instead of a month. Total percent difference between electronic medical records and Diagnosis Procedure Combination data during the study period was -0.1% (4721 mg per year per hospital), with electronic medical records as the baseline. Half of the discrepancy was caused by errors in data entry. Our study showed that Diagnosis Procedure Combination claims data can be used to accurately estimate opioid consumption among a population of cancer patients, although the same conclusion cannot be made for individual estimates or when making estimates for a group of patients over a short period of time. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  19. A Non-Invasive Method for Estimating Cardiopulmonary Variables Using Breath-by-Breath Injection of Two Tracer Gases.

    PubMed

    Clifton, Lei; Clifton, David A; Hahn, Clive E W; Farmeryy, Andrew D

    2013-01-01

    Conventional methods for estimating cardiopulmonary variables usually require complex gas analyzers and the active co-operation of the patient. Therefore, they are not compatible with the crowded environment of the intensive care unit (ICU) or operating theatre, where patient co-operation is typically impossible. However, it is these patients that would benefit the most from accurate estimation of cardiopulmonary variables, because of their critical condition. This paper describes the results of a collaborative development between anesthesiologists and biomedical engineers to create a compact and non-invasive system for the measurement of cardiopulmonary variables such as lung volume, airway dead space volume, and pulmonary blood flow. In contrast with conventional methods, the compact apparatus and non-invasive nature of the proposed method allow it to be used in the ICU, as well as in general clinical settings. We propose the use of a non-invasive method, in which tracer gases are injected into the patient's inspired breath, and the concentration of the tracer gases is subsequently measured. A novel breath-by-breath tidal ventilation model is then used to estimate the value of a patient's cardiopulmonary variables. Experimental results from an artificial lung demonstrate minimal error in the estimation of known parameters using the proposed method. Results from analysis of a cohort of 20 healthy volunteers (within the Oxford University Hospitals NHS Trust) show that the values of estimated cardiopulmonary variables from these subjects lie within the expected ranges. Advantages of this method are that it is non-invasive, compact, portable, and can perform analysis in real time with less than 1 min of acquired respiratory data.

  20. A new formula for estimation of standard liver volume using computed tomography-measured body thickness.

    PubMed

    Ma, Ka Wing; Chok, Kenneth S H; Chan, Albert C Y; Tam, Henry S C; Dai, Wing Chiu; Cheung, Tan To; Fung, James Y Y; Lo, Chung Mau

    2017-09-01

    The objective of this article is to derive a more accurate and easy-to-use formula for finding estimated standard liver volume (ESLV) using novel computed tomography (CT) measurement parameters. New formulas for ESLV have been emerging that aim to improve the accuracy of estimation. However, many of these formulas contain body surface area measurements and logarithms in the equations that lead to a more complicated calculation. In addition, substantial errors in ESLV using these old formulas have been shown. An improved version of the formula for ESLV is needed. This is a retrospective cohort of consecutive living donor liver transplantations from 2005 to 2016. Donors were randomly assigned to either the formula derivation or validation groups. Total liver volume (TLV) measured by CT was used as the reference for a linear regression analysis against various patient factors. The derived formula was compared with the existing formulas. There were 722 patients (197 from the derivation group, 164 from the validation group, and 361 from the recipient group) involved in the study. The donor's body weight (odds ratio [OR], 10.42; 95% confidence interval [CI], 7.25-13.60; P < 0.01) and body thickness (OR, 2.00; 95% CI, 0.36-3.65; P = 0.02) were found to be independent factors for the TLV calculation. A formula for TLV (cm³) was derived: 2 × thickness (mm) + 10 × weight (kg) + 190, with R² = 0.48, which was the highest when compared with the 4 other most often cited formulas. This formula remained superior to other published formulas in the validation set analysis (R², 5.37; interclass correlation coefficient, 0.74). Graft weight/ESLV values calculated by the new formula were shown to have the highest correlation with delayed graft function (C-statistic, 0.79; 95% CI, 0.69-0.90; P < 0.01). The new formula (2 × thickness + 10 × weight + 190) represents the first study proposing the use of CT-measured body thickness which is novel, easy to use, and the most accurate for ESLV. Liver Transplantation 23 1113-1122 2017 AASLD. © 2017 by the American Association for the Study of Liver Diseases.
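
    The derived relationship is simple enough to compute directly. A minimal sketch in Python, assuming thickness is the CT-measured body thickness in mm and weight is the donor's body weight in kg, exactly as reported in the abstract:

        def estimated_standard_liver_volume(thickness_mm: float, weight_kg: float) -> float:
            """Estimated standard liver volume (cm^3) from the formula in the abstract:
            ESLV = 2 x body thickness (mm) + 10 x body weight (kg) + 190."""
            return 2.0 * thickness_mm + 10.0 * weight_kg + 190.0

        # Example: a 70 kg donor with a CT-measured body thickness of 220 mm
        # gives 2*220 + 10*70 + 190 = 1330 cm^3.
        print(estimated_standard_liver_volume(220, 70))  # 1330.0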

  1. Studies in astronomical time series analysis. IV - Modeling chaotic and random processes with linear filters

    NASA Technical Reports Server (NTRS)

    Scargle, Jeffrey D.

    1990-01-01

    While chaos arises only in nonlinear systems, standard linear time series models are nevertheless useful for analyzing data from chaotic processes. This paper introduces such a model, the chaotic moving average. This time-domain model is based on the theorem that any chaotic process can be represented as the convolution of a linear filter with an uncorrelated process called the chaotic innovation. A technique, minimum phase-volume deconvolution, is introduced to estimate the filter and innovation. The algorithm measures the quality of a model using the volume covered by the phase-portrait of the innovation process. Experiments on synthetic data demonstrate that the algorithm accurately recovers the parameters of simple chaotic processes. Though tailored for chaos, the algorithm can detect both chaos and randomness, distinguish them from each other, and separate them if both are present. It can also recover nonminimum-delay pulse shapes in non-Gaussian processes, both random and chaotic.

  2. Evaluation of Sentinel Lymph Node Dose Distribution in 3D Conformal Radiotherapy Techniques in 67 pN0 Breast Cancer Patients.

    PubMed

    Witucki, Gerlo; Degregorio, Nikolaus; Rempen, Andreas; Schwentner, Lukas; Bottke, Dirk; Janni, Wolfgang; Ebner, Florian

    2015-01-01

    Introduction. The anatomic position of the sentinel lymph node is variable. The purpose of the following study was to assess the dose distribution delivered to the surgically marked sentinel lymph node site by a 3D conformal radiotherapy technique. Material and Method. We retrospectively analysed 70 radiotherapy (RT) treatment plans of consecutive primary breast cancer patients with a successful, disease-free sentinel lymph node resection. Results. In our case series the sentinel node (SN) clip volume received a mean dose of 40.7 Gy (min 28.8 Gy/max 47.6 Gy). Conclusion. By using surgical clip markers in combination with 3D CT images, our data support the delivery of tumouricidal doses to the SN bed. The target volume should be defined by surgical clip markers and 3D CT images to give accurate dose estimations.

  3. The Effects of Reducing the Structural Mass of the Transit Habitat on the Cryogenic Propellant Required for a Human Phobos Mission

    NASA Technical Reports Server (NTRS)

    Zipay, John Joseph

    2016-01-01

    A technique for rapidly determining the relationship between the pressurized volume, structural mass and the cryogenic propellant required to be delivered to Earth orbit for a Mars Transit Habitat is provided. This technique is based on assumptions for the required delta-V's, the Exploration Upper Stage performance and the historical structural masses for human spacecraft from Mercury Program through the International Space Station. If the Mars Transit Habitat is constructed from aluminum, structural mass estimates based on the habitat pressurized volume are accurate to within 15%. Other structural material options for the Mars Transit Habitat are also evaluated. The results show that small, achievable reductions in the structural mass of the Transit Habitat can save tens of thousands of pounds of cryogenic propellant that need to be delivered to Earth orbit for a human Phobos Mission.

  5. Simple formula for the surface area of the body and a simple model for anthropometry.

    PubMed

    Reading, Bruce D; Freeman, Brian

    2005-03-01

    The body surface area (BSA) of any adult, when derived from the arithmetic mean of the different values calculated from four independent accepted formulae, can be expressed accurately in Système International d'Unités (SI) units by the simple equation BSA = (1/6)(WH)^0.5, where W is body weight in kg, H is body height in m, and BSA is in m². This formula, which is derived in part by modeling the body as a simple solid of revolution or a prolate spheroid (i.e., a stretched ellipsoid of revolution), gives students, teachers, and clinicians a simple rule for the rapid estimation of surface area using rational units. The formula was tested independently for human subjects by using it to predict body volume and then comparing this prediction against the actual volume measured by Archimedes' principle. Copyright 2005 Wiley-Liss, Inc.
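
    A minimal sketch of the simplified rule in Python (W in kg, H in m, BSA in m²); the four-formula averaging from which the rule was derived is omitted, since only the final simple equation is given in the abstract:

        import math

        def body_surface_area(weight_kg: float, height_m: float) -> float:
            """Simplified rule from the abstract: BSA = (1/6) * sqrt(W * H)."""
            return math.sqrt(weight_kg * height_m) / 6.0

        # Example: 70 kg, 1.75 m -> sqrt(122.5)/6, approximately 1.84 m^2.
        print(round(body_surface_area(70, 1.75), 2))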

  6. Asynchronous discrete event schemes for PDEs

    NASA Astrophysics Data System (ADS)

    Stone, D.; Geiger, S.; Lord, G. J.

    2017-08-01

    A new class of asynchronous discrete-event simulation schemes for advection-diffusion-reaction equations is introduced, based on the principle of allowing quanta of mass to pass through faces of a (regular, structured) Cartesian finite volume grid. The timescales of these events are linked to the flux on the face. The resulting schemes are self-adaptive, and local in both time and space. Experiments are performed on realistic physical systems related to porous media flow applications, including a large 3D advection-diffusion equation and advection-diffusion-reaction systems. The results are compared to highly accurate reference solutions where the temporal evolution is computed with exponential integrator schemes using the same finite volume discretisation. This allows a reliable estimation of the solution error. Our results indicate a first-order convergence of the error as a control parameter is decreased, and we outline a framework for analysis.
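
    The core idea, moving discrete quanta of mass between neighbouring finite-volume cells at event times set by the local face flux, can be illustrated with a toy 1D diffusion sketch. This is not the authors' scheme, only a hedged illustration of the event-driven principle; the quantum size, flux law and rescheduling rule are assumptions made for this example.

        import heapq

        def event_driven_diffusion(u, D, dx, t_end, quantum=1e-3):
            """Toy asynchronous (event-driven) scheme for 1D diffusion on a finite-volume
            grid: each interior face fires events that move one quantum of mass down the
            local gradient, with the waiting time of a face set to quantum / |flux|, so
            active faces update often and quiescent faces rarely (illustrative sketch)."""
            u = list(u)
            n_faces = len(u) - 1
            seq = [0] * n_faces                     # lazy-invalidation counters per face
            heap = []

            def flux(i):                            # diffusive flux across face i
                return -D * (u[i + 1] - u[i]) / dx

            def schedule(i, now):
                f = abs(flux(i))
                if f > 1e-12:
                    heapq.heappush(heap, (now + quantum / f, seq[i], i))

            for i in range(n_faces):
                schedule(i, 0.0)

            while heap:
                t, s, i = heapq.heappop(heap)
                if t > t_end:
                    break
                if s != seq[i]:                     # stale event: face was rescheduled
                    continue
                step = quantum if flux(i) > 0 else -quantum
                u[i] -= step                        # move one quantum across face i
                u[i + 1] += step
                for j in (i - 1, i, i + 1):         # reschedule the face and its neighbours
                    if 0 <= j < n_faces:
                        seq[j] += 1
                        schedule(j, t)
            return u

        print(event_driven_diffusion([1.0, 0.0, 0.0, 0.0], D=0.1, dx=1.0, t_end=20.0))

    The self-adaptivity described in the abstract appears here as the event queue itself: steep local gradients generate short waiting times and therefore frequent updates, while nearly flat regions are left untouched.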

  7. Bubble behavior characteristics based on virtual binocular stereo vision

    NASA Astrophysics Data System (ADS)

    Xue, Ting; Xu, Ling-shuang; Zhang, Shang-zhen

    2018-01-01

    The three-dimensional (3D) behavior characteristics of bubbles rising in gas-liquid two-phase flow are of great importance for studying the bubbly flow mechanism and guiding engineering practice. Based on the dual-perspective imaging of virtual binocular stereo vision, the 3D behavior characteristics of bubbles in gas-liquid two-phase flow are studied in detail, which effectively increases the projection information available for each bubble and yields more accurate behavior features. In this paper, the variations of bubble equivalent diameter, volume, velocity and trajectory in the rising process are estimated, and the factors affecting bubble behavior characteristics are analyzed. It is shown that the method is real-time and valid, the equivalent diameter of the rising bubble in stagnant water changes periodically, and the crests and troughs in the equivalent diameter curve appear alternately. The bubble behavior characteristics as well as the spiral amplitude are affected by the orifice diameter and the gas volume flow.

  8. Predicting the Effective Elastic Properties of Polymer Bonded Explosives based on Micromechanical Methods

    NASA Astrophysics Data System (ADS)

    Wang, Jingcheng; Luo, Jingrun

    2018-04-01

    Due to the extremely high particle volume fraction (greater than 85%) and damage features of polymer bonded explosives (PBXs), conventional micromechanical methods lead to inaccurate estimates of their effective elastic properties. According to their manufacturing characteristics, a multistep approach based on micromechanical methods is proposed. PBXs are treated as pseudo poly-crystal materials consisting of equivalent composite particles (explosive crystals with binder coating), rather than two-phase composites composed of explosive particles and binder matrix. Moduli of composite spheres are obtained by the generalized self-consistent method first, and the self-consistent method is modified to calculate the effective moduli of PBX. Defects and particle size distribution are considered by the Mori-Tanaka method. Results show that when the multistep approach is applied to PBX 9501, estimates are far more accurate than the conventional micromechanical results. The bulk modulus is 5.75% higher, and the shear modulus is 5.78% lower than the experimental values. Further analyses discover that while particle volume fraction and the binder's property have significant influences on the effective moduli of PBX, the moduli of particles present minor influences. Investigation of another particle size distribution indicates that the use of finer particles will enhance the effective moduli of PBX.

  9. Improved daily precipitation nitrate and ammonium concentration models for the Chesapeake Bay Watershed.

    PubMed

    Grimm, J W; Lynch, J A

    2005-06-01

    Daily precipitation nitrate and ammonium concentration models were developed for the Chesapeake Bay Watershed (USA) using a linear least-squares regression approach and precipitation chemistry data from 29 National Atmospheric Deposition Program/National Trends Network (NADP/NTN) sites. Only weekly samples that comprised a single precipitation event were used in model development. The most significant variables in both ammonium and nitrate models included: precipitation volume, the number of days since the last event, a measure of seasonality, latitude, and the proportion of land within 8km covered by forest or devoted to industry and transportation. Additional variables included in the nitrate model were the proportion of land within 0.8km covered by water and/or forest. Local and regional ammonia and nitrogen oxide emissions were not as well correlated as land cover. Modeled concentrations compared very well with event chemistry data collected at six NADP/AirMoN sites within the Chesapeake Bay Watershed. Wet deposition estimates were also consistent with observed deposition at selected sites. Accurately describing the spatial distribution of precipitation volume throughout the watershed is important in providing critical estimates of wet-fall deposition of ammonium and nitrate.
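
    The kind of model described, a daily concentration fitted by linear least squares to precipitation volume, antecedent dry days, a seasonality term, latitude and land-cover fractions, can be sketched as follows. The predictor values and response values below are made up for illustration, and the exact predictor set is only a paraphrase of the variables named in the abstract, not the published model specification.

        import numpy as np

        # Hypothetical rows: one per single-event weekly sample.  Columns: precip volume (mm),
        # days since last event, seasonality term (sin of day-of-year angle), latitude (deg),
        # forest fraction within 8 km, industry/transport fraction within 8 km.
        X = np.array([
            [12.0, 3,  0.8, 39.5, 0.55, 0.10],
            [ 4.5, 7,  0.2, 40.1, 0.30, 0.25],
            [25.0, 1, -0.5, 38.9, 0.70, 0.05],
            [ 8.0, 5,  0.9, 41.0, 0.45, 0.15],
            [18.0, 2, -0.2, 39.0, 0.60, 0.08],
            [ 6.0, 9,  0.6, 40.5, 0.35, 0.20],
            [30.0, 1, -0.8, 38.5, 0.75, 0.04],
            [10.0, 4,  0.4, 39.8, 0.50, 0.12],
        ])
        y = np.array([18.0, 35.0, 9.0, 22.0, 13.0, 30.0, 7.0, 20.0])   # nitrate conc. (made up)

        # Ordinary least squares with an intercept, i.e. a plain linear regression model.
        A = np.column_stack([np.ones(len(X)), X])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        print("intercept and coefficients:", np.round(coef, 3))

        # Predicted concentration for a new event; wet deposition = concentration x volume.
        x_new = np.array([1.0, 10.0, 2, 0.6, 39.8, 0.50, 0.12])
        print("predicted concentration:", round(float(x_new @ coef), 2))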

  10. Dietary intake assessment using integrated sensors and software

    NASA Astrophysics Data System (ADS)

    Shang, Junqing; Pepin, Eric; Johnson, Eric; Hazel, David; Teredesai, Ankur; Kristal, Alan; Mamishev, Alexander

    2012-02-01

    The area of dietary assessment is becoming increasingly important as obesity rates soar, but valid measurement of food intake in free-living persons is extraordinarily challenging. Traditional paper-based dietary assessment methods have limitations due to bias, user burden and cost, and therefore improved methods are needed to address important hypotheses related to diet and health. In this paper, we will describe the progress of our mobile Diet Data Recorder System (DDRS), where an electronic device is used for objective measurement of dietary intake in real time and at moderate cost. The DDRS consists of (1) a mobile device that integrates a smartphone and an integrated laser package, (2) software on the smartphone for data collection and laser control, (3) an algorithm to process acquired data for food volume estimation, which is the largest source of error in calculating dietary intake, and (4) a database and interface for data storage and management. The estimated food volume, together with direct entries of food questionnaires and voice recordings, could provide dietitians and nutritional epidemiologists with more complete food descriptions and more accurate food portion sizes. In this paper, we will describe the system design of DDRS and initial results of dietary assessment.

  11. HipMatch: an object-oriented cross-platform program for accurate determination of cup orientation using 2D-3D registration of single standard X-ray radiograph and a CT volume.

    PubMed

    Zheng, Guoyan; Zhang, Xuan; Steppacher, Simon D; Murphy, Stephen B; Siebenrock, Klaus A; Tannast, Moritz

    2009-09-01

    The widely used procedure of evaluating cup orientation following total hip arthroplasty using a single standard anteroposterior (AP) radiograph is known to be inaccurate, largely due to the wide variability in individual pelvic orientation relative to the X-ray plate. 2D-3D image registration methods have been introduced for an accurate determination of the post-operative cup alignment with respect to an anatomical reference extracted from the CT data. Although encouraging results have been reported, their widespread use in clinical routine is still limited. This may be explained by their requirement of a CAD model of the prosthesis, which is often difficult to obtain from the manufacturer due to proprietary issues, and by their requirement of either multiple radiographs or a radiograph-specific calibration, neither of which is available for most retrospective studies. To address these issues, we developed and validated an object-oriented cross-platform program called "HipMatch" where a hybrid 2D-3D registration scheme combining an iterative landmark-to-ray registration with a 2D-3D intensity-based registration was implemented to estimate a rigid transformation between a pre-operative CT volume and the post-operative X-ray radiograph for a precise estimation of cup alignment. No CAD model of the prosthesis is required. Quantitative and qualitative results evaluated on cadaveric and clinical datasets are given, which indicate the robustness and the accuracy of the program. HipMatch is written in the object-oriented programming language C++ using the cross-platform software Qt (TrollTech, Oslo, Norway), VTK, and Coin3D and is transportable to any platform.

  12. Differences between Nipher and Alter shielded rain gages at two Colorado deposition monitoring sites

    USGS Publications Warehouse

    Bigelow, David S.; Denning, A. Scott

    1990-01-01

    In the last decade the United States and Canada have made significant progress in establishing spatial and temporal estimates of atmospheric deposition throughout North America. Fundamental to the wet-deposition portion of these estimates is the accurate and precise measurement of precipitation amount. Goodison and others (1-3) have reported on a new type of shielded snow gage known as the Canadian MSC Nipher shielded snow gage. Because this shielded snow gage has been shown to be superior to other precipitation gages for the estimation of snowfall amount, its design was adapted to the Universal Belfort precipitation gage (4), the dominant precipitation gage used at deposition monitoring sites in the United States. Favorable results taken from monitoring sites using this modified Nipher shielded snow gage (3-6) have prompted the U.S. Environmental Protection Agency and the Electric Power Research Institute to adopt the Nipher shielded Belfort gage as a standard piece of equipment in the Acid MODES and Operational Evaluation Network (OEN) monitoring programs and to propose that it be included as a standard snow gage in other North American deposition monitoring programs. This communication details preliminary results from two of nine NADP/NTN deposition monitoring sites selected by the Environmental Protection Agency to compare Nipher shielded Belfort precipitation gage volumes to volumes obtained from the standard Belfort gage used in the NADP/NTN monitoring program.

  13. Oil Formation Volume Factor Determination Through a Fused Intelligence

    NASA Astrophysics Data System (ADS)

    Gholami, Amin

    2016-12-01

    The volume change of oil between reservoir conditions and standard surface conditions is called the oil formation volume factor (FVF), which is time-consuming, costly and labor-intensive to determine. This study proposes an accurate, rapid and cost-effective approach for determining FVF from reservoir temperature, dissolved gas oil ratio, and specific gravity of both oil and dissolved gas. Firstly, the structural risk minimization (SRM) principle of support vector regression (SVR) was employed to construct a robust model for estimating FVF from the aforementioned inputs. Subsequently, alternating conditional expectation (ACE) was used to approximate optimal transformations of the input/output data to more highly correlated data and consequently to develop a sophisticated model between the transformed data. Eventually, a committee machine with SVR and ACE was constructed through the use of a hybrid genetic algorithm-pattern search (GA-PS). The committee machine integrates the ACE and SVR models in an optimal linear combination such that it benefits from both methods. A group of 342 data points was used for model development and a group of 219 data points was used for blind testing of the constructed model. Results indicated that the committee machine performed better than the individual models.
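
    The final step, combining two experts' predictions in an optimal linear combination, can be sketched as below. For brevity the combination weights are found by ordinary least squares on a held-out set rather than by the hybrid GA-PS optimizer used in the study, the ACE expert is replaced by a plain linear fit, and all data are synthetic placeholders.

        import numpy as np
        from sklearn.svm import SVR

        rng = np.random.default_rng(0)

        # Placeholder PVT inputs: temperature, solution GOR, oil gravity, gas gravity.
        X = rng.uniform([60, 100, 0.80, 0.60], [120, 800, 0.95, 0.90], size=(200, 4))
        fvf = 1.0 + 5e-4 * X[:, 1] + 2e-3 * (X[:, 0] - 60) + rng.normal(0, 0.01, 200)  # synthetic FVF

        X_train, X_val, y_train, y_val = X[:150], X[150:], fvf[:150], fvf[150:]

        # Expert 1: support vector regression (structural risk minimization).
        p1 = SVR(C=10.0, epsilon=0.005).fit(X_train, y_train).predict(X_val)

        # Expert 2: stand-in for the ACE model -- here simply a linear least-squares fit.
        A = np.column_stack([np.ones(len(X_train)), X_train])
        beta, *_ = np.linalg.lstsq(A, y_train, rcond=None)
        p2 = np.column_stack([np.ones(len(X_val)), X_val]) @ beta

        # Committee machine: optimal linear combination w0 + w1*p1 + w2*p2, fitted here
        # by least squares (the paper uses a GA-pattern search for this step).
        P = np.column_stack([np.ones(len(p1)), p1, p2])
        w, *_ = np.linalg.lstsq(P, y_val, rcond=None)
        committee = P @ w
        print("combination weights:", np.round(w, 3))
        print("RMSE of committee:", round(float(np.sqrt(np.mean((committee - y_val) ** 2))), 4))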

  14. Comparison of the excretion of sodium and meglumine diatrizoate at urography with simulated compression: an experimental study in the rat.

    PubMed

    Owman, T

    1981-07-01

    In the experimental model in the rabbit, the excretion of sodium and meglumine diatrizoate, respectively, has been compared. Urographic density, which was estimated through renal pelvic volume as calculated according to previous experiments (Owman 1978; Owman & Olin 1980) and urinary iodine concentration, is suggested to be more accurate than mere determination of urine iodine concentration and diuresis when evaluating and comparing urographic contrast media experimentally. More reliable dose optima are probably found when calculating density rather than determining urine concentrations. Of the examined media in this investigation, the sodium salt of diatrizoate was not superior to the meglumine salt in dose ranges up to 320 mg I/kg body weight, while at higher doses sodium diatrizoate gave higher urinary iodine concentrations and higher estimated density.

  15. A novel method for blood volume estimation using trivalent chromium in rabbit models.

    PubMed

    Baby, Prathap Moothamadathil; Kumar, Pramod; Kumar, Rajesh; Jacob, Sanu S; Rawat, Dinesh; Binu, V S; Karun, Kalesh M

    2014-05-01

    Blood volume measurement, though important in the management of critically ill patients, is not routinely performed in clinical practice owing to the labour-intensive, intricate and time-consuming nature of existing methods. The aim was to compare blood volume estimations using trivalent chromium [(51)Cr(III)] and the standard Evans blue dye (EBD) method in New Zealand white rabbit models and to establish a correction factor (CF). Blood volume estimation in 33 rabbits was carried out using the EBD method, with concentration determined by spectrophotometric assay, followed by blood volume estimation using direct injection of (51)Cr(III). Twenty out of 33 rabbits were used to find the CF by dividing the blood volume estimated using EBD by the blood volume estimated using (51)Cr(III). The CF was validated in 13 rabbits by multiplying it by the blood volume values obtained using (51)Cr(III). The mean circulating blood volume of 33 rabbits using EBD was 142.02 ± 22.77 ml or 65.76 ± 9.31 ml/kg and using (51)Cr(III) was estimated to be 195.66 ± 47.30 ml or 89.81 ± 17.88 ml/kg. The CF was found to be 0.77. The mean blood volume of 13 rabbits measured using EBD was 139.54 ± 27.19 ml or 66.33 ± 8.26 ml/kg and using (51)Cr(III) with the CF was 152.73 ± 46.25 ml or 71.87 ± 13.81 ml/kg (P = 0.11). The estimation of blood volume using (51)Cr(III) with the CF was comparable to the standard EBD method. With further research in this direction, we envisage human blood volume estimation using (51)Cr(III) finding its application in acute clinical settings.
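
    The correction-factor logic is simple arithmetic and can be sketched directly; the animal-level numbers below are placeholders, not the study's data.

        # Blood volumes (ml) for the same animals measured by both methods (placeholder values).
        ebd  = [150.2, 138.7, 162.4, 141.0]           # Evans blue dye reference method
        cr51 = [198.5, 180.1, 210.9, 185.3]           # direct 51Cr(III) injection

        # Correction factor = mean ratio of EBD to 51Cr(III) estimates (derivation group).
        cf = sum(e / c for e, c in zip(ebd, cr51)) / len(ebd)
        print("correction factor:", round(cf, 2))     # the study itself reports CF = 0.77

        # Applying it to a new 51Cr(III) estimate in the validation group:
        print("corrected blood volume (ml):", round(cf * 192.0, 1))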

  16. ASTRAL, DRAGON and SEDAN scores predict stroke outcome more accurately than physicians.

    PubMed

    Ntaios, G; Gioulekas, F; Papavasileiou, V; Strbian, D; Michel, P

    2016-11-01

    ASTRAL, SEDAN and DRAGON scores are three well-validated scores for stroke outcome prediction. Whether these scores predict stroke outcome more accurately than physicians interested in stroke was investigated. Physicians interested in stroke were invited to an online anonymous survey to provide outcome estimates in randomly allocated structured scenarios of recent real-life stroke patients. Their estimates were compared to the scores' predictions in the same scenarios. An estimate was considered accurate if it was within the 95% confidence interval of the actual outcome. In all, 244 participants from 32 different countries responded, assessing 720 real scenarios and 2636 outcomes. The majority of physicians' estimates were inaccurate (1422/2636, 53.9%). 400 (56.8%) of physicians' estimates about the percentage probability of 3-month modified Rankin score (mRS) > 2 were accurate compared with 609 (86.5%) of ASTRAL score estimates (P < 0.0001). 394 (61.2%) of physicians' estimates about the percentage probability of post-thrombolysis symptomatic intracranial haemorrhage were accurate compared with 583 (90.5%) of SEDAN score estimates (P < 0.0001). 160 (24.8%) of physicians' estimates about post-thrombolysis 3-month percentage probability of mRS 0-2 were accurate compared with 240 (37.3%) DRAGON score estimates (P < 0.0001). 260 (40.4%) of physicians' estimates about the percentage probability of post-thrombolysis mRS 5-6 were accurate compared with 518 (80.4%) DRAGON score estimates (P < 0.0001). ASTRAL, DRAGON and SEDAN scores predict outcome of acute ischaemic stroke patients with higher accuracy compared to physicians interested in stroke. © 2016 EAN.

  17. Tracking unaccounted water use in data sparse arid environment

    NASA Astrophysics Data System (ADS)

    Hafeez, M. M.; Edraki, M.; Ullah, M. K.; Chemin, Y.; Sixsmith, J.; Faux, R.

    2009-12-01

    Hydrological knowledge of irrigated farms within the inundation plains of the Murray Darling Basin (MDB) is very limited, and the quality and reliability of the observation network have been declining rapidly over the past decade. This paper focuses on Land Surface Diversions (LSD), which encompass all forms of surface water diversion except the direct extraction of water from rivers, watercourses and lakes by farmers for the purposes of irrigation and stock and domestic supply. Their accurate measurement is very challenging, due to the practical difficulties associated with separating the different components of LSD and estimating them accurately for a large catchment. The inadequacy of current methods of measuring and monitoring LSD poses severe limitations on existing and proposed policies for managing such diversions. It is commonly believed that LSD comprises 20-30% of total diversions from river valleys in the MDB areas. However, scientific estimates of LSD do not exist, because they were considered unimportant prior to the onset of the recent drought in Australia. There is a need to develop hydrological water balance models through the coupling of hydrological variables derived from on-ground hydrological measurements and remote sensing techniques to accurately model LSD. Typically, the hydrological water balance components for farm/catchment-scale models include: irrigation inflow, outflow, rainfall, runoff, evapotranspiration, soil moisture change and deep percolation. The actual evapotranspiration (ETa) is the largest and single most important component of the hydrological water balance model. An accurate quantification of all components of the hydrological water balance model at farm/catchment scale is of prime importance for estimating the volume of LSD. A hydrological water balance model was developed to calculate LSD at 6 selected pilot farms. The catchment hydrological water balance model is being developed by using selected parameters derived from the hydrological water balance model at farm scale. LSD results obtained through the modelling process have been compared with LSD estimates measured with ground-observed data at the 6 pilot farms. The differences between the values are between 3 and 5 percent of the water inputs, which is within the confidence limits expected from such an analysis. Similarly, the LSD values at the catchment scale have been estimated with great confidence. The hydrological water balance models at farm and catchment scale provide reliable quantification of LSD. Improved LSD estimates can guide water management decisions at farm to catchment scale and could be instrumental for enhancing the integrity of the water allocation process and making it fairer and more equitable across stakeholders.
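
    One way a farm-scale balance of this kind can back out an unmeasured diversion term is as the closure residual of the other components. The sketch below is a hedged illustration only; the sign convention, the component names and the seasonal totals are assumptions, not the authors' model.

        def residual_diversion_ml(inflow, rainfall, outflow, et_actual,
                                  soil_storage_change, deep_percolation):
            """Close a simple farm water balance (all terms in ML over the same period):
            inflow + rainfall + LSD = outflow + ETa + delta_storage + deep_percolation,
            so the land surface diversion is estimated as the residual of the measured
            and remotely sensed components (sign convention is an assumption here)."""
            return (outflow + et_actual + soil_storage_change + deep_percolation
                    - inflow - rainfall)

        # Placeholder seasonal totals for one pilot farm (ML).
        lsd = residual_diversion_ml(inflow=1200, rainfall=300, outflow=250,
                                    et_actual=1350, soil_storage_change=40,
                                    deep_percolation=110)
        print("estimated land surface diversion (ML):", lsd)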

  18. Effects of computing parameters and measurement locations on the estimation of 3D NPS in non-stationary MDCT images.

    PubMed

    Miéville, Frédéric A; Bolard, Gregory; Bulling, Shelley; Gudinchet, François; Bochud, François O; Verdun, François R

    2013-11-01

    The goal of this study was to investigate the impact of computing parameters and the location of volumes of interest (VOI) on the calculation of the 3D noise power spectrum (NPS) in order to determine an optimal set of computing parameters and propose a robust method for evaluating the noise properties of imaging systems. Noise stationarity in noise volumes acquired with a water phantom on a 128-MDCT and a 320-MDCT scanner was analyzed in the spatial domain in order to define locally stationary VOIs. The influence of the computing parameters on the 3D NPS measurement (the sampling distances bx,y,z, the VOI lengths Lx,y,z, the number of VOIs NVOI and the structured noise) was investigated to minimize measurement errors. The effect of the VOI locations on the NPS was also investigated. Results showed that the noise (standard deviation) varies more in the r-direction (phantom radius) than in the z-direction. A 25 × 25 × 40 mm³ VOI associated with DFOV = 200 mm (Lx,y,z = 64, bx,y = 0.391 mm with a 512 × 512 matrix) and a first-order detrending method to reduce structured noise led to an accurate NPS estimation. NPS estimated from off-centered small VOIs had a directional dependency, contrary to NPS obtained from large VOIs located in the center of the volume or from small VOIs located on a concentric circle. This showed that the VOI size and location play a major role in the determination of NPS when images are not stationary. This study emphasizes the need for consistent measurement methods to assess and compare image quality in CT. Copyright © 2012 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
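
    For reference, the standard way a 3D NPS is estimated from an ensemble of detrended VOIs is sketched below; the voxel sizes and VOI dimensions mirror the bx,y,z and Lx,y,z notation of the abstract, and the first-order detrend is implemented here as a simple least-squares plane fit, which is a stand-in rather than the authors' exact procedure.

        import numpy as np

        def nps_3d(vois, voxel_size):
            """Ensemble 3D noise power spectrum estimate:
            NPS(f) = (bx*by*bz / (Lx*Ly*Lz)) * < |DFT3D(VOI - fitted first-order trend)|^2 >,
            averaged over the available noise VOIs."""
            bx, by, bz = voxel_size
            Lx, Ly, Lz = vois[0].shape
            # Design matrix for a first-order (planar) trend: 1, x, y, z.
            x, y, z = np.meshgrid(np.arange(Lx), np.arange(Ly), np.arange(Lz), indexing="ij")
            A = np.column_stack([np.ones(x.size), x.ravel(), y.ravel(), z.ravel()])
            acc = np.zeros((Lx, Ly, Lz))
            for voi in vois:
                coef, *_ = np.linalg.lstsq(A, voi.ravel(), rcond=None)
                detrended = voi - (A @ coef).reshape(Lx, Ly, Lz)
                acc += np.abs(np.fft.fftn(detrended)) ** 2
            return (bx * by * bz) / (Lx * Ly * Lz) * acc / len(vois)

        # Example with synthetic white-noise VOIs (64^3 voxels, 0.391 mm in-plane spacing).
        rng = np.random.default_rng(1)
        vois = [rng.normal(0, 10, (64, 64, 64)) for _ in range(8)]
        nps = nps_3d(vois, voxel_size=(0.391, 0.391, 0.625))
        print(nps.shape, float(nps.mean()))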

  19. Design and Application of the Exploration Maintainability Analysis Tool

    NASA Technical Reports Server (NTRS)

    Stromgren, Chel; Terry, Michelle; Crillo, William; Goodliff, Kandyce; Maxwell, Andrew

    2012-01-01

    Conducting human exploration missions beyond Low Earth Orbit (LEO) will present unique challenges in the areas of supportability and maintainability. The durations of proposed missions can be relatively long and re-supply of logistics, including maintenance and repair items, will be limited or non-existent. In addition, mass and volume constraints in the transportation system will limit the total amount of logistics that can be flown along with the crew. These constraints will require that new strategies be developed with regards to how spacecraft systems are designed and maintained. NASA is currently developing Design Reference Missions (DRMs) as an initial step in defining future human missions. These DRMs establish destinations and concepts of operation for future missions, and begin to define technology and capability requirements. Because of the unique supportability challenges, historical supportability data and models are not directly applicable for establishing requirements for beyond LEO missions. However, supportability requirements could have a major impact on the development of the DRMs. The mass, volume, and crew resources required to support the mission could all be first order drivers in the design of missions, elements, and operations. Therefore, there is a need for enhanced analysis capabilities to more accurately establish mass, volume, and time requirements for supporting beyond LEO missions. Additionally, as new technologies and operations are proposed to reduce these requirements, it is necessary to have accurate tools to evaluate the efficacy of those approaches. In order to improve the analysis of supportability requirements for beyond LEO missions, the Space Missions Analysis Branch at the NASA Langley Research Center is developing the Exploration Maintainability Analysis Tool (EMAT). This tool is a probabilistic simulator that evaluates the need for repair and maintenance activities during space missions and the logistics and crew requirements to support those activities. Using a Monte Carlo approach, the tool simulates potential failures in defined systems, based on established component reliabilities, and then evaluates the capability of the crew to repair those failures given a defined store of spares and maintenance items. Statistical analysis of Monte Carlo runs provides probabilistic estimates of overall mission safety and reliability. This paper will describe the operation of the EMAT, including historical data sources used to populate the model, simulation processes, and outputs. Analysis results are provided for a candidate exploration system, including baseline estimates of required sparing mass and volume. Sensitivity analysis regarding the effectiveness of proposed strategies to reduce mass and volume requirements and improve mission reliability is included in these results.
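
    The core simulation loop described, sampling component failures from reliability data, checking them against the carried spares and aggregating over Monte Carlo trials, can be sketched as follows. The component list, mission length and repair rule are placeholders for illustration and are not EMAT's actual model or data.

        import random

        # Hypothetical components: (name, mean time between failures in days, spares carried, spare mass kg).
        COMPONENTS = [("pump", 400.0, 2, 12.0), ("fan", 250.0, 3, 4.5), ("valve", 800.0, 1, 2.0)]

        def simulate_mission(duration_days, trials=10_000, seed=42):
            """Monte Carlo sketch: failures arrive as Poisson events (exponential inter-failure
            times); a trial succeeds if every component's failures can be covered by its spares.
            Returns the estimated mission success probability and the mean spare mass consumed."""
            rng = random.Random(seed)
            successes, mass_used = 0, 0.0
            for _ in range(trials):
                ok = True
                for _, mtbf, spares, mass in COMPONENTS:
                    failures, t = 0, rng.expovariate(1.0 / mtbf)
                    while t < duration_days:
                        failures += 1
                        t += rng.expovariate(1.0 / mtbf)
                    mass_used += min(failures, spares) * mass
                    if failures > spares:
                        ok = False
                successes += ok
            return successes / trials, mass_used / trials

        p, m = simulate_mission(duration_days=500)
        print(f"estimated success probability: {p:.3f}, mean spare mass consumed: {m:.1f} kg")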

  20. Variations in respiratory excretion of carbon dioxide can be used to calculate pulmonary blood flow.

    PubMed

    Preiss, David A; Azami, Takafumi; Urman, Richard D

    2015-02-01

    A non-invasive means of measuring pulmonary blood flow (PBF) would have numerous benefits in medicine. Traditionally, respiratory-based methods require breathing maneuvers, partial rebreathing, or foreign gas mixing because exhaled CO2 volume on a per-breath basis does not accurately represent alveolar exchange of CO2. We hypothesized that if the dilutional effect of the functional residual capacity was accounted for, the relationship between the calculated volume of CO2 removed per breath and the alveolar partial pressure of CO2 would be inversely linear. A computer model was developed that uses variable tidal breathing to calculate CO2 removal per breath at the level of the alveoli. We iterated estimates for functional residual capacity to create the best linear fit of alveolar CO2 pressure and CO2 elimination for 10 minutes of breathing and incorporated the volume of CO2 elimination into the Fick equation to calculate PBF. The relationship between alveolar pressure of CO2 and CO2 elimination produced an R² = 0.83. The optimal functional residual capacity differed from the "actual" capacity by 0.25 L (8.3%). The repeatability coefficient leveled at 0.09 at 10 breaths and the difference between the PBF calculated by the model and the preset blood flow was 0.62 ± 0.53 L/minute. With variations in tidal breathing, a linear relationship exists between alveolar CO2 pressure and CO2 elimination. Existing technology may be used to calculate CO2 elimination during quiet breathing and might therefore be used to accurately calculate PBF in humans with healthy lungs.
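
    A hedged sketch of the fitting idea: for each candidate FRC, correct the exhaled CO2 per breath for the gas stored in (or released from) the FRC, regress the corrected eliminations against alveolar PCO2, and keep the FRC that gives the most linear relationship. The storage correction used below (FRC times the change in alveolar CO2 fraction) and the synthetic breath data are assumptions for illustration, not the authors' model; the winning fit's slope would then feed the Fick-type PBF calculation described in the abstract.

        import numpy as np

        def best_fit_frc(paco2, vco2_exhaled, faco2, candidate_frcs):
            """For each candidate functional residual capacity (L), correct the per-breath
            exhaled CO2 volume for CO2 storage in the FRC, fit a line of corrected
            elimination vs. alveolar PCO2, and return the FRC with the highest R^2."""
            paco2 = np.asarray(paco2, dtype=float)
            vco2_exhaled = np.asarray(vco2_exhaled, dtype=float)
            faco2 = np.asarray(faco2, dtype=float)
            best_r2, best_frc = -np.inf, None
            for frc in candidate_frcs:
                valv = vco2_exhaled + frc * np.diff(faco2, prepend=faco2[0])
                slope, intercept = np.polyfit(paco2, valv, 1)
                resid = valv - (slope * paco2 + intercept)
                r2 = 1.0 - np.sum(resid ** 2) / np.sum((valv - valv.mean()) ** 2)
                if r2 > best_r2:
                    best_r2, best_frc = r2, frc
            return best_frc, best_r2

        # Synthetic 30-breath example (all numbers made up).
        rng = np.random.default_rng(0)
        pa = 5.0 + 0.3 * rng.standard_normal(30)            # alveolar PCO2 (kPa)
        fa = pa / 95.0                                       # alveolar CO2 fraction
        vexh = 0.25 - 0.02 * (pa - 5.0) + 0.005 * rng.standard_normal(30)
        print(best_fit_frc(pa, vexh, fa, candidate_frcs=np.arange(1.0, 4.05, 0.1)))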

  1. WEB downloadable software for training in cardiovascular hemodynamics in the (3-D) stress echo lab

    PubMed Central

    2010-01-01

    When a physiological (exercise) stress echo is scheduled, interest focuses on wall motion segmental contraction abnormalities to diagnose ischemic response to stress, and on left ventricular ejection fraction to assess contractile reserve. Echocardiographic evaluation of volumes (plus standard assessment of heart rate and blood pressure) is ideally suited for the quantitative and accurate calculation of a set of parameters allowing a complete characterization of cardiovascular hemodynamics (including cardiac output and systemic vascular resistance), left ventricular elastance (mirroring left ventricular contractility, theoretically independent of the preload and afterload changes that heavily affect the ejection fraction), arterial elastance, ventricular-arterial coupling (a central determinant of net cardiovascular performance in normal and pathological conditions), and diastolic function (through the diastolic mean filling rate). All these parameters were previously inaccessible, inaccurate or labor-intensive and now become, at least in principle, available in the stress echocardiography laboratory, since all of them need an accurate estimation of left ventricular volumes and stroke volume, easily derived from 3D echo. The aims of this paper are: 1) to propose a simple method to assess a set of parameters allowing a complete characterization of cardiovascular hemodynamics in the stress echo lab, from basic measurements to calculations; 2) to propose a simple, web-based software program to learn and train on the calculations, as a phantom of the everyday activity in the busy stress echo lab; 3) to show examples of software testing in a way that proves its value. The informatics infrastructure is available on the web, linking to http://cctrainer.ifc.cnr.it PMID:21073738
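
    Once 3D-echo volumes, heart rate and cuff pressures are available, the parameter set described above reduces to a handful of standard relations. The sketch below uses commonly quoted textbook approximations (for example, end-systolic pressure approximated as 0.9 times systolic pressure, and a single-beat elastance estimate from ESP/ESV); it illustrates the kind of calculation the training software performs and is not the program's actual code.

        def hemodynamics(edv_ml, esv_ml, hr_bpm, sbp_mmhg, dbp_mmhg):
            """Derive a basic hemodynamic profile from 3D-echo volumes and cuff pressures
            using common textbook approximations (illustrative sketch only)."""
            sv = edv_ml - esv_ml                           # stroke volume (ml)
            ef = sv / edv_ml                               # ejection fraction
            co = sv * hr_bpm / 1000.0                      # cardiac output (L/min)
            map_mmhg = dbp_mmhg + (sbp_mmhg - dbp_mmhg) / 3.0
            svr = 80.0 * map_mmhg / co                     # systemic vascular resistance (dyn*s/cm^5), venous pressure neglected
            esp = 0.9 * sbp_mmhg                           # end-systolic pressure approximation
            ea = esp / sv                                  # arterial elastance (mmHg/ml)
            ees = esp / esv_ml                             # simplified LV end-systolic elastance (mmHg/ml)
            vac = ea / ees                                 # ventricular-arterial coupling
            return dict(SV=sv, EF=ef, CO=co, MAP=map_mmhg, SVR=svr, Ea=ea, Ees=ees, VAC=vac)

        # Example: EDV 120 ml, ESV 50 ml, HR 70 bpm, BP 120/80 mmHg.
        for k, v in hemodynamics(120, 50, 70, 120, 80).items():
            print(f"{k}: {v:.2f}")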

  2. Noninvasive evaluation of global and regional left ventricular function using computed tomography and magnetic resonance imaging: a meta-analysis.

    PubMed

    Kaniewska, Malwina; Schuetz, Georg M; Willun, Steffen; Schlattmann, Peter; Dewey, Marc

    2017-04-01

    To compare the diagnostic accuracy of computed tomography (CT) in the assessment of global and regional left ventricular (LV) function with magnetic resonance imaging (MRI). MEDLINE, EMBASE and ISI Web of Science were systematically reviewed. Evaluation included: ejection fraction (EF), end-diastolic volume (EDV), end-systolic volume (ESV), stroke volume (SV) and left ventricular mass (LVM). Differences between modalities were analysed using limits of agreement (LoA). Publication bias was measured by Egger's regression test. Heterogeneity was evaluated using Cochran's Q test and Higgins' I² statistic. In the presence of heterogeneity the DerSimonian-Laird method was used for estimation of heterogeneity variance. Fifty-three studies including 1,814 patients were identified. The mean difference between CT and MRI was -0.56% (LoA, -11.6 to 10.5%) for EF, 2.62 ml (-34.1 to 39.3 ml) for EDV, 1.61 ml (-22.4 to 25.7 ml) for ESV, 3.21 ml (-21.8 to 28.3 ml) for SV and 0.13 g (-28.2 to 28.4 g) for LVM. CT detected wall motion abnormalities on a per-segment basis with 90% sensitivity and 97% specificity. CT is accurate for assessing global LV function parameters but the limits of agreement versus MRI are moderately wide, while wall motion deficits are detected with high accuracy. • CT helps to assess patients with coronary artery disease (CAD). • MRI is the reference standard for evaluation of left ventricular function. • CT provides accurate assessment of global left ventricular function.

  3. On soft clipping of Zernike moments for deblurring and enhancement of optical point spread functions

    NASA Astrophysics Data System (ADS)

    Becherer, Nico; Jödicke, Hanna; Schlosser, Gregor; Hesser, Jürgen; Zeilfelder, Frank; Männer, Reinhard

    2006-02-01

    Blur and noise originating from the physical imaging processes degrade the microscope data. Accurate deblurring techniques require, however, an accurate estimation of the underlying point-spread function (PSF). A good representation of PSFs can be achieved by Zernike polynomials since they offer a compact representation where low-order coefficients represent typical aberrations of optical wavefronts while noise is represented in higher-order coefficients. A quantitative description of the noise distribution (Gaussian) over the Zernike moments of various orders is given, which is the basis for the new soft-clipping approach for denoising of PSFs. Instead of discarding moments beyond a certain order, those Zernike moments that are more sensitive to noise are dampened according to the measured distribution and the present noise model. Further, a new scheme to combine experimental and theoretical PSFs in Zernike space is presented. According to our experimental reconstructions, using the new improved PSF the correlation between reconstructed and original volume is raised by 15% in average cases and by up to 85% in the case of thin fibre structures, compared to reconstructions where a non-improved PSF was used. Finally, we demonstrate the advantages of our approach on 3D images of confocal microscopes by generating visually improved volumes. Additionally, we present a method to render the reconstructed results using a new volume rendering method that is almost artifact-free. The new approach is based on a Shear-Warp technique, wavelet data encoding techniques and a recent approach to approximate the gray value distribution by a Super spline model.
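
    The soft-clipping idea, dampening rather than discarding noise-sensitive higher-order moments, can be illustrated with a Wiener-style shrinkage of the coefficients. The dampening rule below is a hypothetical stand-in for the paper's weighting, which is based on the measured Gaussian noise distribution; the coefficient values, order assignments and per-order noise levels are made up.

        import numpy as np

        def soft_clip_zernike(coeffs, radial_orders, noise_sigma_per_order):
            """Attenuate Zernike coefficients instead of truncating them: each coefficient of
            radial order n is scaled by w_n = max(P_n - sigma_n^2, 0) / P_n, where P_n is the
            mean squared coefficient value at that order (signal plus noise) and sigma_n the
            assumed noise level for that order (a Wiener-style shrinkage used as a stand-in)."""
            coeffs = np.asarray(coeffs, dtype=float)
            radial_orders = np.asarray(radial_orders)
            clipped = np.empty_like(coeffs)
            for n in np.unique(radial_orders):
                idx = radial_orders == n
                power = np.mean(coeffs[idx] ** 2)
                sigma2 = noise_sigma_per_order[int(n)] ** 2
                weight = max(power - sigma2, 0.0) / power if power > 0 else 0.0
                clipped[idx] = weight * coeffs[idx]
            return clipped

        # Example: coefficients up to radial order 4, with noise growing with order.
        coeffs = [1.2, 0.8, -0.5, 0.3, 0.25, -0.2, 0.05, 0.04, -0.03, 0.05, 0.02, -0.01, 0.02, 0.01, -0.02]
        orders = [0, 1, 1, 2, 2, 2, 3, 3, 3, 3, 4, 4, 4, 4, 4]
        noise = {0: 0.01, 1: 0.01, 2: 0.02, 3: 0.04, 4: 0.05}
        print(np.round(soft_clip_zernike(coeffs, orders, noise), 3))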

  4. Limitations of the permeability-limited compartment model in estimating vascular permeability and interstitial volume fraction in DCE-MRI.

    PubMed

    Carreira, Guido Correia; Gemeinhardt, Ole; Gorenflo, Rudolf; Beyersdorff, Dirk; Franiel, Tobias; Plendl, Johanna; Lüdemann, Lutz

    2011-06-01

    Dynamic contrast-enhanced magnetic resonance imaging commonly uses compartment models to estimate tissue parameters in general and perfusion parameters in particular. Compartment models assume a homogeneous distribution of the injected tracer throughout the compartment volume. Since tracer distribution within a compartment cannot be assessed, the parameters obtained by means of a compartment model might differ from the actual physical values. This work systematically examines the widely used permeability-surface-limited one-compartment model to determine the reliability of the parameters obtained by comparing them with their actual values. A computer simulation was used to model spatial tracer distribution within the interstitial volume using diffusion of contrast agent in tissue. Vascular parameters were varied as well as tissue parameters. The vascular parameters used were capillary radius (4 and 12 μm), capillary permeability (from 0.03 to 3.3 μm/s) and intercapillary distances from 30 to 300 μm. The tissue parameters used were tortuosity (λ), porosity (α) and interstitial volume fraction (v(e)). Our results suggest that the permeability-surface-limited compartment model generally underestimates capillary permeability for capillaries with a radius of 4 μm by factors from ≈0.03 for α=0.04, to ≈0.1 for α=0.2, to ≈0.5 for α=1.0. An overestimation of actual capillary permeability for capillaries with a radius of 12 μm by a factor of ≥1.3 was found for α=1.0, while α=0.2 yielded an underestimation by a factor of ≈0.3 and α=0.04 by a factor of ≈0.03. The interstitial volume fraction, v(e), obtained by the compartment model differed from the actual value with increasing intercapillary distances and for low vessel permeability, whereas v(e) was found to be estimated approximately accurately for P=0.3 μm/s and P=3.3 μm/s for vessel distances <100 μm. Copyright © 2011 Elsevier Inc. All rights reserved.

  5. Using MODIS and GLAS Data to Develop Timber Volume Estimates in Central Siberia

    NASA Technical Reports Server (NTRS)

    Ranson, K. Jon; Kimes, Daniel; Sun, Guoqing; Kharuk, Viatcheslav; Hyde, Peter; Nelson, Ross

    2007-01-01

    The boreal forest is the Earth's largest terrestrial biome, covering some 12 million km² and accounting for about one third of this planet's total forest area. Mapping of boreal forest type, structure parameters and biomass is critical for understanding the boreal forest's significance in the carbon cycle and its response to and impact on global climate change. Ground-based forest inventories have much uncertainty in the inventory data, particularly in remote areas of Siberia where sampling is sparse and/or lacking. In addition, many of the forest inventories that do exist for Siberia are now a decade or more old. Thus, available forest inventories fail to capture the current conditions. Forest structure in a particular forest type and region can change significantly due to changing environmental conditions and natural and anthropogenic disturbance. Remote sensing methods can potentially overcome these problems. Multispectral sensors can be used to provide vegetation cover maps that show a timely and accurate geographic distribution of vegetation types rather than decade-old ground-based maps. Lidar sensors can be used to directly obtain measurements that can be used to derive critical forest structure information (e.g., height, density, and volume). These in turn can be used to estimate biomass components using allometric equations without having to rely on outdated forest inventories. Finally, remote sensing data is ideally suited to provide a sampling basis for a rigorous statistical estimate of the variance and error bound on forest structure measures. In this study, new remote sensing methods were applied to develop estimates of timber volume using NASA's MODerate resolution Imaging Spectroradiometer (MODIS) and unique waveform data of the Geoscience Laser Altimeter System (GLAS) for a 10 deg x 10 deg area in central Siberia. Using MODIS and GLAS data, maps were produced for cover type and timber volume for 2003, and a realistic variance (error bound) for timber volume was calculated for the study area. In this study we used only GLAS footprints that had a slope value of less than 10 deg. This was done to avoid large errors due to the effect of slope on the GLAS models. The method requires the integration of new remote sensing methods with available ground studies of forest timber volume conducted in Russian forests. The results were compared to traditional ground forest inventory methods reported in the literature and to ground truth collected in the study area.

  6. Automation of CT-based haemorrhagic stroke assessment for improved clinical outcomes: study protocol and design

    PubMed Central

    Chinda, Betty; Medvedev, George; Siu, William; Ester, Martin; Arab, Ali; Gu, Tao; Moreno, Sylvain; D’Arcy, Ryan C N; Song, Xiaowei

    2018-01-01

    Introduction Haemorrhagic stroke is of significant healthcare concern due to its association with high mortality and lasting impact on the survivors’ quality of life. Treatment decisions and clinical outcomes depend strongly on the size, spread and location of the haematoma. Non-contrast CT (NCCT) is the primary neuroimaging modality for haematoma assessment in haemorrhagic stroke diagnosis. Current procedures do not allow convenient NCCT-based haemorrhage volume calculation in clinical settings, while research-based approaches are yet to be tested for clinical utility; there is a demonstrated need for developing effective solutions. The project under review investigates the development of an automatic NCCT-based haematoma computation tool in support of accurate quantification of haematoma volumes. Methods and analysis Several existing research methods for haematoma volume estimation are studied. Selected methods are tested using NCCT images of patients diagnosed with acute haemorrhagic stroke. For inter-rater and intrarater reliability evaluation, different raters will analyse haemorrhage volumes independently. The efficiency with respect to time of haematoma volume assessments will be examined to compare with the results from routine clinical evaluations and planimetry assessment that are known to be more accurate. The project will target the development of an enhanced solution by adapting existing methods and integrating machine learning algorithms. NCCT-based information of brain haemorrhage (eg, size, volume, location) and other relevant information (eg, age, sex, risk factor, comorbidities) will be used in relation to clinical outcomes with future project development. Validity and reliability of the solution will be examined for potential clinical utility. Ethics and dissemination The project including procedures for deidentification of NCCT data has been ethically approved. The study involves secondary use of existing data and does not require new consent of participation. The team consists of clinical neuroimaging scientists, computing scientists and clinical professionals in neurology and neuroradiology and includes patient representatives. Research outputs will be disseminated following knowledge translation plans towards improving stroke patient care. Significant findings will be published in scientific journals. Anticipated deliverables include computer solutions for improved clinical assessment of haematoma using NCCT. PMID:29674371

  7. Basic physics and doubts about relationship between mammographically determined tissue density and breast cancer risk.

    PubMed

    Kopans, Daniel B

    2008-02-01

    Numerous studies have suggested a link between breast tissue patterns, as defined with mammography, and risk for breast cancer. There may be a relationship, but the author believes all of these studies have methodological flaws. It is impossible, with the parameters used in these studies, to accurately measure the percentage of tissues by volume when two-dimensional x-ray mammographic images are used. Without exposure values, half-value layer information, and knowledge of the compressed thickness of the breast, an accurate volume of tissue cannot be calculated. The great variability in positioning the breast for a mammogram is also an uncontrollable factor in measuring tissue density. Computerized segmentation algorithms can accurately assess the percentage of the x-ray image that is "dense," but this does not accurately measure the true volume of tissue. Since the percentage of dense tissue is ultimately measured in relation to the complete volume of the breast, defining the true boundaries of the breast is also a problem. Studies that purport to show small percentage differences between groups are likely inaccurate. Future investigations need to use three-dimensional information. (c) RSNA, 2008.

  8. Bayesian inference of ice thickness from remote-sensing data

    NASA Astrophysics Data System (ADS)

    Werder, Mauro A.; Huss, Matthias

    2017-04-01

    Knowledge about ice thickness and volume is indispensable for studying ice dynamics, future sea-level rise due to glacier melt or glaciers' contribution to regional hydrology. Accurate measurements of glacier thickness require on-site work, usually employing radar techniques. However, these field measurements are time-consuming, expensive and sometimes downright impossible. Conversely, measurements of the ice surface, namely elevation and flow velocity, are becoming available world-wide through remote sensing. The model of Farinotti et al. (2009) calculates ice thicknesses based on a mass conservation approach paired with shallow ice physics using estimates of the surface mass balance. The presented work applies a Bayesian inference approach to estimate the parameters of a modified version of this forward model by fitting it to measurements of both surface flow speed and ice thickness. The inverse model outputs ice thickness as well as the distribution of the error. We fit the model to ten test glaciers and ice caps and quantify the improvement of thickness estimates through the use of surface ice flow measurements.
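
    A minimal sketch of the inference pattern described, a Metropolis random walk over the parameters of a forward model scored against both thickness and surface-speed observations, is given below. The forward model (a shallow-ice-style deformation speed from thickness and surface slope), the flat priors, the likelihood widths and all data values are deliberately toy placeholders, not the authors' model.

        import numpy as np

        rng = np.random.default_rng(3)

        # Toy "observations": ice thickness (m) at a few radar points and surface speed (m/yr).
        h_obs, h_sigma = np.array([180.0, 230.0, 160.0]), 15.0
        v_obs, v_sigma = np.array([32.0, 55.0, 24.0]), 5.0
        slope = np.array([0.06, 0.07, 0.05])                     # local surface slope (assumed)

        def forward_speed(h, creep):                             # shallow-ice-style deformation speed
            A = 2.4e-24                                          # Glen's-law rate factor (Pa^-3 s^-1)
            rho_g = 910.0 * 9.81
            return creep * (2 * A / 5) * (rho_g * slope) ** 3 * h ** 4 * 3.15e7   # m/yr

        def log_post(theta):
            h, creep = theta[:3], theta[3]
            if np.any(h <= 0) or creep <= 0:
                return -np.inf                                   # flat prior on the positive domain
            ll = -0.5 * np.sum(((h - h_obs) / h_sigma) ** 2)
            ll += -0.5 * np.sum(((forward_speed(h, creep) - v_obs) / v_sigma) ** 2)
            return ll

        # Metropolis random walk over (h1, h2, h3, creep-enhancement factor).
        theta, samples = np.array([200.0, 200.0, 200.0, 1.0]), []
        lp = log_post(theta)
        for _ in range(20000):
            prop = theta + rng.normal(0, [5.0, 5.0, 5.0, 0.05])
            lp_prop = log_post(prop)
            if np.log(rng.uniform()) < lp_prop - lp:
                theta, lp = prop, lp_prop
            samples.append(theta.copy())
        post = np.array(samples[5000:])                          # discard burn-in
        print("posterior mean thickness (m):", np.round(post[:, :3].mean(axis=0), 1))
        print("thickness std (m):", np.round(post[:, :3].std(axis=0), 1))

    The posterior standard deviations are what the abstract refers to as the distribution of the error: the inverse model returns not only a thickness field but also how well the joint thickness and flow-speed data constrain it.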

  9. Discriminative parameter estimation for random walks segmentation.

    PubMed

    Baudin, Pierre-Yves; Goodman, Danny; Kumar, Puneet; Azzabou, Noura; Carlier, Pierre G; Paragios, Nikos; Kumar, M Pawan

    2013-01-01

    The Random Walks (RW) algorithm is one of the most efficient and easy-to-use probabilistic segmentation methods. By combining contrast terms with prior terms, it provides accurate segmentations of medical images in a fully automated manner. However, one of the main drawbacks of using the RW algorithm is that its parameters have to be hand-tuned. We propose a novel discriminative learning framework that estimates the parameters using a training dataset. The main challenge we face is that the training samples are not fully supervised. Specifically, they provide a hard segmentation of the images, instead of a probabilistic segmentation. We overcome this challenge by treating the optimal probabilistic segmentation that is compatible with the given hard segmentation as a latent variable. This allows us to employ the latent support vector machine formulation for parameter estimation. We show that our approach significantly outperforms the baseline methods on a challenging dataset consisting of real clinical 3D MRI volumes of skeletal muscles.

  10. The 1980 US/Canada wheat and barley exploratory experiment, volume 1

    NASA Technical Reports Server (NTRS)

    Bizzell, R. M.; Prior, H. L.; Payne, R. W.; Disler, J. M.

    1983-01-01

    The results from the U.S./Canada Wheat and Barley Exploratory Experiment, which was completed during FY 1980, are presented. The results indicate that the new crop identification procedures performed well for spring small grains and that they are conducive to automation. The performance of the machine processing techniques shows a significant improvement over previously evaluated technology. However, the crop calendars will require additional development and refinement prior to integration into automated area estimation technology. The evaluation showed the integrated technology to be capable of producing accurate and consistent spring small grains proportion estimates. However, barley proportion estimation technology was not satisfactorily evaluated. The low-density segments examined were judged not to give indicative or unequivocal results. It is concluded that, generally, the spring small grains technology is ready for evaluation in a pilot experiment focusing on sensitivity analyses to a variety of agricultural and meteorological conditions representative of the global environment. It is further concluded that a strong potential exists for establishing a highly efficient technology for spring small grains.

  11. Using Airborne LIDAR Data for Assessment of Forest Fire Fuel Load Potential

    NASA Astrophysics Data System (ADS)

    İnan, M.; Bilici, E.; Akay, A. E.

    2017-11-01

    Forest fire incidents are among the most detrimental disasters and may cause long-term effects on forest ecosystems in many parts of the world. In order to minimize the environmental damage of fires to forest ecosystems, forested areas with high fire risk should be identified so that the necessary precautionary measures can be implemented in those areas. Assessment of forest fire fuel load can be used to estimate forest fire risk. In order to estimate fuel load capacity, forestry parameters such as the number of trees, tree height, tree diameter, crown diameter, and tree volume should be accurately measured. In recent years, with advancements in remote sensing technology, it has become possible to estimate these forestry parameters from airborne LIDAR data. In this study, the capability of LIDAR-based point cloud data for assessing forest fire fuel load potential was investigated. The study area was chosen in the Bentler series of the Bahceköy Forest Enterprise Directorate in Istanbul, which is composed of a mixed deciduous forest structure.

  12. Supernovae as probes of cosmic parameters: estimating the bias from under-dense lines of sight

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Busti, V.C.; Clarkson, C.; Holanda, R.F.L., E-mail: vinicius.busti@uct.ac.za, E-mail: holanda@uepb.edu.br, E-mail: chris.clarkson@uct.ac.za

    2013-11-01

    Correctly interpreting observations of sources such as type Ia supernovae (SNe Ia) requires knowledge of the power spectrum of matter on AU scales — which is very hard to model accurately. Because under-dense regions account for much of the volume of the universe, light from a typical source probes a mean density significantly below the cosmic mean. The relative sparsity of sources implies that there could be a significant bias when inferring distances of SNe Ia, and consequently a bias in cosmological parameter estimation. While the weak lensing approximation should in principle give the correct prediction for this, linear perturbation theory predicts an effectively infinite variance in the convergence for ultra-narrow beams. We attempt to quantify the effect typically under-dense lines of sight might have on parameter estimation by considering three alternative methods for estimating distances, in addition to the usual weak lensing approximation. We find that in each case this not only increases the errors in the inferred density parameters, but also introduces a bias in the posterior value.

  13. 3D volumetric modeling of grapevine biomass using Tripod LiDAR

    USGS Publications Warehouse

    Keightley, K.E.; Bawden, G.W.

    2010-01-01

    Tripod mounted laser scanning provides the means to generate high-resolution volumetric measures of vegetation structure and perennial woody tissue for the calculation of standing biomass in agronomic and natural ecosystems. Other than costly destructive harvest methods, no technique exists to rapidly and accurately measure above-ground perennial tissue for woody plants such as Vitis vinifera (common grape vine). Data collected from grapevine trunks and cordons were used to study the accuracy of wood volume derived from laser scanning as compared with volume derived from analog measurements. A set of 10 laser scan datasets was collected for each of 36 vines, from which volume was calculated using combinations of two, three, four, six and 10 scans. Likewise, analog volume measurements were made by submerging the vine trunks and cordons in water and capturing the displaced water. A regression analysis examined the relationship between digital and non-digital techniques among the 36 vines and found that the standard error drops rapidly as additional scans are added to the volume calculation process and stabilizes at the four-view geometry with an average Pearson's product moment correlation coefficient of 0.93. Estimates of digital volumes are systematically greater than those of analog volumes, which can be explained by the manner in which each technique interacts with the vine tissue. This laser scanning technique yields a highly linear relationship between vine volume and tissue mass, revealing a new, rapid and non-destructive method to remotely measure standing biomass. This application shows promise for use in other ecosystems such as orchards and forests. © 2010 Elsevier B.V.

  14. Volume Quantification of Acute Infratentorial Hemorrhage with Computed Tomography: Validation of the Formula 1/2ABC and 2/3SH

    PubMed Central

    Zhang, Yunyun; Yan, Jing; Fu, Yi; Chen, Shengdi

    2013-01-01

    Objective: To compare the accuracy of the formula 1/2ABC with 2/3SH for volume estimation of hypertensive infratentorial hematoma. Methods: One hundred and forty-seven CT scans diagnosed as hypertensive infratentorial hemorrhage were reviewed. Based on shape, hematomas were categorized as regular or irregular; multilobular was treated as a special subtype of irregular. Hematoma volume was calculated employing computer-assisted volumetric analysis (CAVA), 1/2ABC, and 2/3SH, respectively. Results: The correlation coefficients between 1/2ABC (or 2/3SH) and CAVA were greater than 0.900 in all subgroups. There were no significant differences in absolute volume deviation or percentage deviation between 1/2ABC and 2/3SH for regular hemorrhage (P>0.05). For cerebellar, brainstem, and irregular hemorrhages, however, the absolute volume deviation and percentage deviation of formula 1/2ABC were greater than those of 2/3SH (P<0.05). 1/2ABC and 2/3SH underestimated hematoma volume by 10% and 5% for cerebellar hemorrhage, 14% and 9% for brainstem hemorrhage, 19% and 16% for regular hemorrhage, and 9% and 3% for irregular hemorrhage, respectively. In addition, for multilobular hemorrhage, 1/2ABC underestimated the volume by 9% while 2/3SH overestimated it by 2%. Conclusions: For regular hemorrhage volume calculation, the accuracy of 2/3SH is similar to 1/2ABC, while for cerebellar, brainstem, or irregular hemorrhages (including multilobular), 2/3SH is more accurate than 1/2ABC. PMID:23638025
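    A minimal sketch of the two bedside formulas compared above, using their usual CT definitions (A and B are the largest perpendicular hematoma diameters on the slice with the greatest extent, C the craniocaudal extent, S the largest cross-sectional area, H the hematoma height); the example numbers are invented.

```python
def volume_half_abc(a_cm, b_cm, c_cm):
    """ABC/2 ellipsoid approximation of hematoma volume in ml (1 cm^3 = 1 ml)."""
    return a_cm * b_cm * c_cm / 2.0

def volume_two_thirds_sh(s_cm2, h_cm):
    """2/3SH approximation: largest cross-sectional area S times height H times 2/3."""
    return 2.0 / 3.0 * s_cm2 * h_cm

# A roughly 3 cm x 2 cm hematoma spanning 2 cm of slices, max area ~4.7 cm^2:
print(volume_half_abc(3.0, 2.0, 2.0))       # 6.0 ml
print(volume_two_thirds_sh(4.7, 2.0))       # ~6.3 ml
```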

  15. Comparison of air space measurement imaged by CT, small-animal CT, and hyperpolarized Xe MRI

    NASA Astrophysics Data System (ADS)

    Madani, Aniseh; White, Steven; Santyr, Giles; Cunningham, Ian

    2005-04-01

    Lung disease is the third leading cause of death in the western world. Lung air volume measurements are thought to be early indicators of lung disease and markers in pharmaceutical research. The purpose of this work is to develop a lung phantom for assessing and comparing the quantitative accuracy of hyperpolarized xenon-129 magnetic resonance imaging (HP 129Xe MRI), conventional high-resolution computed tomography (HRCT), and high-resolution small-animal CT (μCT) in measuring lung gas volumes. We developed a lung phantom consisting of solid cellulose acetate spheres (1, 2, 3, 4 and 5 mm diameter) uniformly packed in either air or HP 129Xe gas. Air volume is estimated with a simple thresholding algorithm. Truth is calculated from the sphere diameters and validated using μCT. While this phantom is not anthropomorphic, it enables us to directly measure air space volume and compare these imaging methods as a function of sphere diameter for the first time. HP 129Xe MRI requires partial volume analysis to distinguish regions with and without 129Xe gas; its results are within 5% of truth, but settling of the heavy 129Xe gas complicates this analysis. Conventional CT demonstrated partial-volume artifacts for the 1 mm spheres. μCT gives the most accurate air-volume results. Conventional CT and HP 129Xe MRI give similar results, although non-uniform densities of 129Xe require more sophisticated algorithms than simple thresholding. The threshold required to give the true air volume in both HRCT and μCT varies with sphere diameter, calling into question the validity of the thresholding method.

  16. Peak skin and eye lens radiation dose from brain perfusion CT based on Monte Carlo simulation.

    PubMed

    Zhang, Di; Cagnon, Chris H; Villablanca, J Pablo; McCollough, Cynthia H; Cody, Dianna D; Stevens, Donna M; Zankl, Maria; Demarco, John J; Turner, Adam C; Khatonabadi, Maryam; McNitt-Gray, Michael F

    2012-02-01

    The purpose of our study was to accurately estimate the radiation dose to skin and the eye lens from clinical CT brain perfusion studies, investigate how well scanner output (expressed as volume CT dose index [CTDI(vol)]) matches these estimated doses, and investigate the efficacy of eye lens dose reduction techniques. Peak skin dose and eye lens dose were estimated using Monte Carlo simulation methods on a voxelized patient model and 64-MDCT scanners from four major manufacturers. A range of clinical protocols was evaluated. CTDI(vol) for each scanner was obtained from the scanner console. Dose reduction to the eye lens was evaluated for various gantry tilt angles as well as scan locations. Peak skin dose and eye lens dose ranged from 81 mGy to 348 mGy, depending on the scanner and protocol used. Peak skin dose and eye lens dose were observed to be 66-79% and 59-63%, respectively, of the CTDI(vol) values reported by the scanners. The eye lens dose was significantly reduced when the eye lenses were not directly irradiated. CTDI(vol) should not be interpreted as patient dose; this study has shown it to overestimate dose to the skin or eye lens. These results may be used to provide more accurate estimates of actual dose to ensure that protocols are operated safely below thresholds. Tilting the gantry or moving the scanning region further away from the eyes are effective for reducing lens dose in clinical practice. These actions should be considered when they are consistent with the clinical task and patient anatomy.

  17. Comparison of usual and alternative methods to measure height in mechanically ventilated patients: potential impact on protective ventilation.

    PubMed

    Bojmehrani, Azadeh; Bergeron-Duchesne, Maude; Bouchard, Carmelle; Simard, Serge; Bouchard, Pierre-Alexandre; Vanderschuren, Abel; L'Her, Erwan; Lellouche, François

    2014-07-01

    Protective ventilation implementation requires the calculation of predicted body weight (PBW), determined by a formula based on gender and height. Consequently, height inaccuracy may be a limiting factor to correctly set tidal volumes. The objective of this study was to evaluate the accuracy of different methods in measuring heights in mechanically ventilated patients. Before cardiac surgery, actual height was measured with a height gauge while subjects were standing upright (reference method); the height was also estimated by alternative methods based on lower leg and forearm measurements. After cardiac surgery, upon ICU admission, a subject's height was visually estimated by a clinician and then measured with a tape measure while the subject was supine and undergoing mechanical ventilation. One hundred subjects (75 men, 25 women) were prospectively included. Mean PBW was 61.0 ± 9.7 kg, and mean actual weight was 30.3% higher. In comparison with the reference method, estimating the height visually and using the tape measure were less accurate than both lower leg and forearm measurements. Errors above 10% in calculating the PBW were present in 25 and 40 subjects when the tape measure or visual estimation of height was used in the formula, respectively. With lower leg and forearm measurements, 15 subjects had errors above 10% (P < .001). Our results demonstrate that significant variability exists between the different methods used to measure height in bedridden patients on mechanical ventilation. Alternative methods based on lower leg and forearm measurements are potentially interesting solutions to facilitate the accurate application of protective ventilation. Copyright © 2014 by Daedalus Enterprises.
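    The PBW formula itself is not restated in the abstract; the sketch below uses the commonly cited ARDSNet (Devine-type) height- and sex-based formula as an assumption, to show how a height error propagates into the protective tidal volume target.

```python
def predicted_body_weight_kg(height_cm, sex):
    """ARDSNet-style predicted body weight (assumed here; the paper only states
    that PBW is computed from height and gender)."""
    base = 50.0 if sex == "male" else 45.5
    return base + 0.91 * (height_cm - 152.4)

def protective_tidal_volume_ml(height_cm, sex, ml_per_kg=6.0):
    """Tidal volume target for lung-protective ventilation at 6 ml/kg PBW."""
    return ml_per_kg * predicted_body_weight_kg(height_cm, sex)

# A 5 cm overestimate of height inflates PBW by about 4.5 kg and the
# 6 ml/kg tidal volume target by about 27 ml:
print(protective_tidal_volume_ml(175, "male") - protective_tidal_volume_ml(170, "male"))
```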

  18. Guidelines for estimating volume, biomass, and smoke production for piled slash.

    Treesearch

    Colin C. Hardy

    1998-01-01

    Guidelines in the form of a six-step approach are provided for estimating volumes, oven-dry mass, consumption, and particulate matter emissions for piled logging debris. Seven stylized pile shapes and their associated geometric volume formulae are used to estimate gross pile volumes. The gross volumes are then reduced to net wood volume by applying an appropriate wood-...

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Staring, M., E-mail: m.staring@lumc.nl; Bakker, M. E.; Shamonin, D. P.

    Purpose: Whole lung densitometry on chest CT images is an accepted method for measuring tissue destruction in patients with pulmonary emphysema in clinical trials. Progression measurement is required for evaluation of change in health condition and the effect of drug treatment. Information about the location of emphysema progression within the lung may be important for the correct interpretation of drug efficacy, or for determining a treatment plan. The purpose of this study is therefore to develop and validate methods that enable the local measurement of lung density changes, which requires proper modeling of the effect of respiration on density. Methods: Four methods, all based on registration of baseline and follow-up chest CT scans, are compared. The first naïve method subtracts registered images. The second employs the so-called dry sponge model, where volume correction is performed using the determinant of the Jacobian of the transformation. The third and the fourth introduce a novel adaptation of the dry sponge model that circumvents its constant-mass assumption, which is shown to be invalid. The latter two methods require a third CT scan at a different inspiration level to estimate the patient-specific density-volume slope, where one method employs a global and the other a local slope. The methods were validated on CT scans of a phantom mimicking the lung, where mass and volume could be controlled. In addition, validation was performed on data of 21 patients with pulmonary emphysema. Results: The image registration method was optimized leaving a registration error below half the slice increment (median 1.0 mm). The phantom study showed that the locally adapted slope model most accurately measured local progression. The systematic error in estimating progression, as measured on the phantom data, was below 2 gr/l for a 70 ml (6%) volume difference, and 5 gr/l for a 210 ml (19%) difference, if volume correction was applied. On the patient data an underlying linearity assumption relating lung volume change with density change was shown to hold (fit R² = 0.94), and globalized versions of the local models are consistent with global results (R² of 0.865 and 0.882 for the two adapted slope models, respectively). Conclusions: In conclusion, image matching and subsequent analysis of differences according to the proposed lung models (i) has good local registration accuracy on patient data, (ii) effectively eliminates a dependency on inspiration level at acquisition time, (iii) accurately predicts progression in phantom data, and (iv) is reasonably consistent with global results in patient data. It is therefore a potential future tool for assessing local emphysema progression in drug evaluation trials and in clinical practice.
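    The volume correction mentioned above multiplies the registered follow-up density by the determinant of the transformation's Jacobian before subtraction. The snippet below is a finite-difference sketch of that correction under the constant-mass (dry sponge) assumption; array layouts and names are illustrative, not the authors' implementation.

```python
import numpy as np

def jacobian_determinant(disp):
    """det J of T(x) = x + u(x) for a displacement field u of shape (3, Z, Y, X)
    given in voxel units (simple finite-difference approximation)."""
    grads = np.stack([np.stack(np.gradient(disp[i]), axis=0) for i in range(3)])
    jac = grads + np.eye(3).reshape(3, 3, 1, 1, 1)   # dT_i/dx_j = delta_ij + du_i/dx_j
    return np.linalg.det(np.moveaxis(jac, (0, 1), (-2, -1)))

def dry_sponge_progression(baseline_density, warped_followup_density, disp):
    """Local density change with volume correction: the follow-up density warped
    to baseline is scaled by det J (constant-mass assumption) before subtraction."""
    return warped_followup_density * jacobian_determinant(disp) - baseline_density

# Toy example: a uniform 3% volume expansion should largely cancel the
# apparent density drop of a fixed tissue mass.
zyx = np.stack(np.meshgrid(*[np.arange(16)] * 3, indexing="ij")).astype(float)
disp = 0.01 * zyx                                    # u(x) = 0.01 x  ->  det J ~ 1.03
baseline = np.full((16, 16, 16), 100.0)              # g/l
followup_warped = baseline / 1.01 ** 3               # density diluted by the expansion
print(np.abs(dry_sponge_progression(baseline, followup_warped, disp)).max())
```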

  20. Three-dimensional ordered-subset expectation maximization iterative protocol for evaluation of left ventricular volumes and function by quantitative gated SPECT: a dynamic phantom study.

    PubMed

    Ceriani, Luca; Ruberto, Teresa; Delaloye, Angelika Bischof; Prior, John O; Giovanella, Luca

    2010-03-01

    The purposes of this study were to characterize the performance of a 3-dimensional (3D) ordered-subset expectation maximization (OSEM) algorithm in the quantification of left ventricular (LV) function with (99m)Tc-labeled agent gated SPECT (G-SPECT), the QGS program, and a beating-heart phantom and to optimize the reconstruction parameters for clinical applications. A G-SPECT image of a dynamic heart phantom simulating the beating left ventricle was acquired. The exact volumes of the phantom were known and were as follows: end-diastolic volume (EDV) of 112 mL, end-systolic volume (ESV) of 37 mL, and stroke volume (SV) of 75 mL; these volumes produced an LV ejection fraction (LVEF) of 67%. Tomographic reconstructions were obtained after 10-20 iterations (I) with 4, 8, and 16 subsets (S) at full width at half maximum (FWHM) gaussian postprocessing filter cutoff values of 8-15 mm. The QGS program was used for quantitative measurements. Measured values ranged from 72 to 92 mL for EDV, from 18 to 32 mL for ESV, and from 54 to 63 mL for SV, and the calculated LVEF ranged from 65% to 76%. Overall, the combination of 10 I, 8 S, and a cutoff filter value of 10 mm produced the most accurate results. The plot of the measures with respect to the expectation maximization-equivalent iterations (I × S product) revealed a bell-shaped curve for the LV volumes and a reverse distribution for the LVEF, with the best results in the intermediate range. In particular, FWHM cutoff values exceeding 10 mm affected the estimation of the LV volumes. The QGS program is able to correctly calculate the LVEF when used in association with an optimized 3D OSEM algorithm (8 S, 10 I, and FWHM of 10 mm) but underestimates the LV volumes. However, various combinations of technical parameters, including a limited range of I and S (80-160 expectation maximization-equivalent iterations) and low cutoff values (≤10 mm) for the gaussian postprocessing filter, produced results with similar accuracies and without clinically relevant differences in the LV volumes and the estimated LVEF.

  1. Comparison of computer versus manual determination of pulmonary nodule volumes in CT scans

    NASA Astrophysics Data System (ADS)

    Biancardi, Alberto M.; Reeves, Anthony P.; Jirapatnakul, Artit C.; Apanasovitch, Tatiyana; Yankelevitz, David; Henschke, Claudia I.

    2008-03-01

    Accurate nodule volume estimation is necessary in order to estimate the clinically relevant growth rate or change in size over time. An automated nodule volume-measuring algorithm was applied to a set of pulmonary nodules that were documented by the Lung Image Database Consortium (LIDC). The LIDC process model specifies that each scan is assessed by four experienced thoracic radiologists and that boundaries are to be marked around the visible extent of nodules 3 mm and larger. Nodules were selected from the LIDC database with the following inclusion criteria: (a) they must have a solid component on a minimum of three CT image slices and (b) they must be marked by all four LIDC radiologists. A total of 113 nodules met the selection criteria, with diameters ranging from 3.59 mm to 32.68 mm (mean 9.37 mm, median 7.67 mm). The centroid of each marked nodule was used as the seed point for the automated algorithm. 95 nodules (84.1%) were correctly segmented, although one of these was judged by the automated method not to meet the first selection criterion; of the remaining nodules, eight (7.1%) were structurally too complex or extensively attached and 10 (8.8%) were considered not properly segmented after a simple visual inspection by a radiologist. Since the LIDC specifications, as noted above, instruct radiologists to include both solid and sub-solid parts, the automated method's core capability of segmenting solid tissue was augmented to also take into account the sub-solid parts of a nodule. We ranked the distances of the automated estimates and the radiologist-based estimates from the median of the radiologist-based values. In 76.6% of cases the automated estimate was closer to the median than at least one of the values derived from the manual markings, indicating very good agreement with the radiologists' markings.

  2. A Concurrent Flow Model for Extraction during Transcapillary Passage

    PubMed Central

    Bassingthwaighte, James B.

    2010-01-01

    A model for capillary-tissue exchange in a uniformly perfused organ with uniform capillary transit times and no diffusional capillary interactions was designed to permit the exploration of the influences of various parameters on the interpretation of indicator-dilution curves obtained at the venous outflow following the simultaneous injection of tracers into the arterial inflow. These parameters include tissue geometric factors, longitudinal diffusion and volumes of distribution of tracers in blood and tissue, hematocrit, volumes of nonexchanging vessels and the sampling system, capillary permeability, P, capillary surface area, S, and flow of blood- or solute-containing fluid, Fs′. An assumption of instantaneous radial diffusion in the extravascular region is appropriate when intercapillary distances are small, as they are in the heart, or permeabilities are low, as they are for lipophobic solutes. Numerical solutions were obtained for dispersed input functions similar to normal intravascular dye-dilution curves. Axial extravascular diffusion showed a negligible influence at low permeabilities. The “instantaneous extraction” of a permeating solute can provide an estimate of PS/Fs′, the ratio of the capillary permeability–surface area product to the flow, when PS/Fs′ lies between approximately 0.05 and 3.0; the limits of the range depend on the extravascular volume of distribution and the influences of intravascular dispersion. The most accurate estimates were obtained when experiments were designed so that PS/Fs′ was between 0.2 and 1.0 or peak extractions were between 0.1 and 0.6. PMID:4608628
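    As a worked example of the "instantaneous extraction" idea, the classical Crone-Renkin relation E = 1 - exp(-PS/Fs') can be inverted to recover PS/Fs'; the model above is more detailed, so this is only the limiting single-pass relation, shown here for orientation.

```python
import math

def ps_over_flow_from_extraction(extraction):
    """Crone-Renkin relation E = 1 - exp(-PS/Fs'); solves for PS/Fs'.
    Used here only as the classical limiting case of the extraction analysis."""
    if not 0.0 < extraction < 1.0:
        raise ValueError("extraction must lie strictly between 0 and 1")
    return -math.log(1.0 - extraction)

# Peak extractions of 0.1-0.6 (the range recommended above) map to
# PS/Fs' of roughly 0.1-0.9:
for e in (0.1, 0.3, 0.6):
    print(e, round(ps_over_flow_from_extraction(e), 2))
```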

  3. Three-dimensional digital holographic aperture synthesis for rapid and highly-accurate large-volume metrology

    NASA Astrophysics Data System (ADS)

    Crouch, Stephen; Kaylor, Brant M.; Barber, Zeb W.; Reibel, Randy R.

    2015-09-01

    Currently, large-volume, high-accuracy three-dimensional (3D) metrology is dominated by laser trackers, which typically utilize a laser scanner and a cooperative reflector to estimate points on a given surface. The dependency upon the placement of cooperative targets dramatically inhibits the speed at which metrology can be conducted. To increase speed, laser scanners or structured illumination systems can be used directly on the surface of interest. Both approaches are restricted in their axial and lateral resolution at longer stand-off distances due to the diffraction limit of the optics used. Holographic aperture ladar (HAL) and synthetic aperture ladar (SAL) can enhance the lateral resolution of an imaging system by synthesizing much larger apertures, digitally combining measurements from multiple smaller apertures. Both of these approaches only produce two-dimensional imagery and are therefore not suitable for large-volume 3D metrology. We combined the SAL and HAL approaches to create a swept-frequency digital holographic 3D imaging system that provides rapid measurement speed for surface coverage with unprecedented axial and lateral resolution at longer standoff ranges. The technique yields a "data cube" of Fourier-domain data, which can be processed with a 3D Fourier transform to reveal a 3D estimate of the surface. In this paper, we provide the theoretical background for the technique and show experimental results, based on an ultra-wideband frequency modulated continuous wave (FMCW) chirped heterodyne ranging system, demonstrating ~100 micron lateral and axial precisions at >2 m standoff distances.

  4. Trajectory prediction for ballistic missiles based on boost-phase LOS measurements

    NASA Astrophysics Data System (ADS)

    Yeddanapudi, Murali; Bar-Shalom, Yaakov

    1997-10-01

    This paper addresses the problem of estimating the trajectory of a tactical ballistic missile using line of sight (LOS) measurements from one or more passive sensors (typically satellites). The major difficulties of this problem include the estimation of the unknown time of launch, incorporation of (inaccurate) target thrust profiles to model the target dynamics during the boost phase, and an overall ill-conditioning of the estimation problem due to poor observability of the target motion via the LOS measurements. We present a robust estimation procedure based on the Levenberg-Marquardt algorithm that provides both the target state estimate and error covariance, taking into consideration the complications mentioned above. An important consideration in the defense against tactical ballistic missiles is the determination of the target position and error covariance at the acquisition range of a surveillance radar in the vicinity of the impact point. We present a systematic procedure to propagate the target state and covariance to a nominal time, when the target is within the detection range of a surveillance radar, to obtain a cueing volume. Monte Carlo simulation studies on typical single- and two-sensor scenarios indicate that the proposed algorithms are accurate in terms of the estimates, and the estimator-calculated covariances are consistent with the errors.

  5. Population pharmacokinetics and maximum a posteriori probability Bayesian estimator of abacavir: application of individualized therapy in HIV-infected infants and toddlers

    PubMed Central

    Zhao, Wei; Cella, Massimo; Della Pasqua, Oscar; Burger, David; Jacqz-Aigrain, Evelyne

    2012-01-01

    AIMS To develop a population pharmacokinetic model for abacavir in HIV-infected infants and toddlers, which will be used to describe both once and twice daily pharmacokinetic profiles, identify covariates that explain variability and propose optimal time points to optimize the area under the concentration–time curve (AUC) targeted dosage and individualize therapy. METHODS The pharmacokinetics of abacavir was described with plasma concentrations from 23 patients using nonlinear mixed-effects modelling (NONMEM) software. A two-compartment model with first-order absorption and elimination was developed. The final model was validated using bootstrap, visual predictive check and normalized prediction distribution errors. The Bayesian estimator was validated using the cross-validation and simulation–estimation method. RESULTS The typical population pharmacokinetic parameters and relative standard errors (RSE) were apparent systemic clearance (CL) 13.4 l h−1 (RSE 6.3%), apparent central volume of distribution 4.94 l (RSE 28.7%), apparent peripheral volume of distribution 8.12 l (RSE 14.2%), apparent intercompartment clearance 1.25 l h−1 (RSE 16.9%) and absorption rate constant 0.758 h−1 (RSE 5.8%). The covariate analysis identified weight as the individual factor influencing the apparent oral clearance: CL = 13.4 × (weight/12)^1.14. The maximum a posteriori probability Bayesian estimator, based on three concentrations measured at 0, 1 or 2, and 3 h after drug intake, allowed prediction of individual AUC0–t. CONCLUSIONS The population pharmacokinetic model developed for abacavir in HIV-infected infants and toddlers accurately described both once and twice daily pharmacokinetic profiles. The maximum a posteriori probability Bayesian estimator of AUC0–t was developed from the final model and can be used routinely to optimize individual dosing. PMID:21988586
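    The covariate relationship reported above can be applied directly; the short sketch below evaluates the published typical clearance CL = 13.4 × (weight/12)^1.14 for a few body weights (for illustration only, not a dosing tool).

```python
def abacavir_apparent_clearance_l_per_h(weight_kg):
    """Typical apparent oral clearance from the reported covariate model
    CL = 13.4 * (weight / 12)**1.14 (illustration only, not for dosing)."""
    return 13.4 * (weight_kg / 12.0) ** 1.14

# Clearance scales slightly more than proportionally with weight:
for weight in (8, 12, 16):
    print(weight, "kg ->", round(abacavir_apparent_clearance_l_per_h(weight), 1), "l/h")
```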

  6. Finite Volume Methods: Foundation and Analysis

    NASA Technical Reports Server (NTRS)

    Barth, Timothy; Ohlberger, Mario

    2003-01-01

    Finite volume methods are a class of discretization schemes that have proven highly successful in approximating the solution of a wide variety of conservation law systems. They are extensively used in fluid mechanics, porous media flow, meteorology, electromagnetics, models of biological processes, semi-conductor device simulation and many other engineering areas governed by conservative systems that can be written in integral control volume form. This article reviews elements of the foundation and analysis of modern finite volume methods. The primary advantages of these methods are numerical robustness through the obtention of discrete maximum (minimum) principles, applicability on very general unstructured meshes, and the intrinsic local conservation properties of the resulting schemes. Throughout this article, specific attention is given to scalar nonlinear hyperbolic conservation laws and the development of high order accurate schemes for discretizing them. A key tool in the design and analysis of finite volume schemes suitable for non-oscillatory discontinuity capturing is discrete maximum principle analysis. A number of building blocks used in the development of numerical schemes possessing local discrete maximum principles are reviewed in one and several space dimensions, e.g. monotone fluxes, E-fluxes, TVD discretization, non-oscillatory reconstruction, slope limiters, positive coefficient schemes, etc. When available, theoretical results concerning a priori and a posteriori error estimates are given. Further advanced topics are then considered such as high order time integration, discretization of diffusion terms and the extension to systems of nonlinear conservation laws.
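    To make the control-volume idea concrete, the sketch below advances the scalar advection equation u_t + a u_x = 0 with the simplest monotone finite volume scheme (first-order upwind flux, forward Euler, periodic boundaries); it is a textbook illustration rather than anything specific to this review.

```python
import numpy as np

def upwind_advection_step(u, velocity, dx, dt):
    """One forward-Euler step of a first-order finite volume (upwind) scheme for
    u_t + a u_x = 0 with a > 0 and periodic boundaries: cell averages are
    updated by the difference of face fluxes F_{i+1/2} = a * u_i."""
    flux = velocity * u
    return u - dt / dx * (flux - np.roll(flux, 1))

# Advect a square pulse once around a periodic unit domain.
nx, a = 200, 1.0
dx = 1.0 / nx
dt = 0.5 * dx / a                              # CFL number 0.5
x = np.arange(nx) * dx
u = np.where((x > 0.25) & (x < 0.5), 1.0, 0.0)
for _ in range(int(1.0 / (a * dt))):
    u = upwind_advection_step(u, a, dx, dt)
```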

  7. A method to combine target volume data from 3D and 4D planned thoracic radiotherapy patient cohorts for machine learning applications.

    PubMed

    Johnson, Corinne; Price, Gareth; Khalifa, Jonathan; Faivre-Finn, Corinne; Dekker, Andre; Moore, Christopher; van Herk, Marcel

    2018-02-01

    The gross tumour volume (GTV) is predictive of clinical outcome and consequently features in many machine-learned models. 4D-planning, however, has prompted substitution of the GTV with the internal gross target volume (iGTV). We present and validate a method to synthesise GTV data from the iGTV, allowing the combination of 3D and 4D planned patient cohorts for modelling. Expert delineations in 40 non-small cell lung cancer patients were used to develop linear fit and erosion methods to synthesise the GTV volume and shape. Quality was assessed using Dice Similarity Coefficients (DSC) and closest point measurements; by calculating dosimetric features; and by assessing the quality of random forest models built on patient populations with and without synthetic GTVs. Volume estimates were within the magnitudes of inter-observer delineation variability. Shape comparisons produced mean DSCs of 0.8817 and 0.8584 for upper and lower lobe cases, respectively. A model trained on combined true and synthetic data performed significantly better than models trained on GTV alone, or combined GTV and iGTV data. Accurate synthesis of GTV size from the iGTV permits the combination of lung cancer patient cohorts, facilitating machine learning applications in thoracic radiotherapy. Copyright © 2017 Elsevier B.V. All rights reserved.
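    The Dice Similarity Coefficient used for the shape comparison above is straightforward to compute; below is a small sketch on two synthetic binary masks (the mask shapes are invented).

```python
import numpy as np

def dice_similarity(mask_a, mask_b):
    """Dice Similarity Coefficient between two binary volumes:
    2|A ∩ B| / (|A| + |B|); 1.0 for identical masks, 0.0 for disjoint ones."""
    a = np.asarray(mask_a, bool)
    b = np.asarray(mask_b, bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# Two overlapping synthetic "GTV" masks:
true_gtv = np.zeros((32, 32, 32), bool)
true_gtv[8:24, 8:24, 8:24] = True
synth_gtv = np.zeros_like(true_gtv)
synth_gtv[10:24, 8:24, 8:24] = True
print(round(dice_similarity(true_gtv, synth_gtv), 3))   # ~0.93
```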

  8. Measurement of gastric meal and secretion volumes using magnetic resonance imaging

    PubMed Central

    Hoad, C.L.; Parker, H.; Hudders, N.; Costigan, C.; Cox, E.F.; Perkins, A.C.; Blackshaw, P.E.; Marciani, L.; Spiller, R.C.; Fox, M.R.; Gowland, P.A.

    2015-01-01

    MRI can assess multiple gastric functions without ionizing radiation. However, time-consuming image acquisition and analysis of gastric volume data, together with confounding of gastric emptying measurements by gastric secretions mixed with the test meal, have limited its use to research centres. This study presents an MRI acquisition protocol and analysis algorithm suitable for the clinical measurement of gastric volume and secretion volume. Reproducibility of gastric volume measurements was assessed using data from 10 healthy volunteers following a liquid test meal, with rapid MRI acquisition within one breath-hold and semi-automated analysis. Dilution of the ingested meal with gastric secretion was estimated using a respiratory-triggered T1 mapping protocol. Accuracy of the secretion volume measurements was assessed using data from 24 healthy volunteers following a mixed (liquid/solid) test meal, with MRI meal volumes compared to data acquired using gamma scintigraphy (GS) on the same subjects studied on a separate study day. The mean (SD) coefficient of variation between 3 observers for both total gastric contents (including meal, secretions and air) and the gastric contents alone (meal and secretion only) was 3 (2)% at large gastric volumes (>200 ml). Mean (SD) secretion volumes post meal ingestion were 64 (51) ml and 110 (40) ml at 15 and 75 minutes, respectively. Comparison with GS meal volumes showed that MRI meal-only volumes (after correction for secretion volume) were similar to GS, with a linear regression gradient (std err) of 1.06 (0.10) and intercept −11 (24) ml. In conclusion, (i) rapid acquisition removed the requirement to image during a prolonged breath-hold, (ii) semi-automatic analysis greatly reduced the time required to derive measurements and (iii) correction for secretion volumes provides accurate assessment of gastric meal volumes and emptying. Together these features provide the scientific basis of a protocol suitable for clinical practice. PMID:25592405

  9. Using GIS to Estimate Lake Volume from Limited Data

    EPA Science Inventory

    Estimates of lake volume are necessary for estimating residence time or modeling pollutants. Modern GIS methods for calculating lake volume improve upon more dated technologies (e.g. planimeters) and do not require potentially inaccurate assumptions (e.g. volume of a frustum of ...
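    One common GIS-style calculation, consistent with the approach alluded to above, sums per-cell depth times cell area over a gridded bathymetric surface; the grid and cell size below are invented for illustration.

```python
import numpy as np

def lake_volume_from_depth_grid(depth_m, cell_size_m):
    """Lake volume from a gridded bathymetric surface (a sketch of the GIS
    approach): sum of per-cell depth times cell area. Cells outside the lake
    should hold 0 or NaN."""
    depths = np.nan_to_num(np.asarray(depth_m, float), nan=0.0)
    return float(np.sum(np.clip(depths, 0.0, None)) * cell_size_m ** 2)

# A toy 4 x 4 grid of depths (m) on 10 m cells:
depth = np.array([[0, 1, 1, 0],
                  [1, 3, 3, 1],
                  [1, 3, 3, 1],
                  [0, 1, 1, 0]], float)
print(lake_volume_from_depth_grid(depth, 10.0))   # 2000 m^3
```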

  10. Measurement of lung volumes from supine portable chest radiographs.

    PubMed

    Ries, A L; Clausen, J L; Friedman, P J

    1979-12-01

    Lung volumes in supine nonambulatory patients are physiological parameters often difficult to measure with current techniques (plethysmograph, gas dilution). Existing radiographic methods for measuring lung volumes require standard upright chest radiographs. Accordingly, in 31 normal supine adults, we determined helium-dilution functional residual and total lung capacities and measured planimetric lung field areas (LFA) from corresponding portable anteroposterior and lateral radiographs. Low radiation dose methods, which delivered less than 10% of that from standard portable X-ray technique, were utilized. Correlation between lung volume and radiographic LFA was highly significant (r = 0.96, SEE = 10.6%). Multiple-step regressions using height and chest diameter correction factors reduced variance, but weight and radiographic magnification factors did not. In 17 additional subjects studied for validation, the regression equations accurately predicted radiographic lung volume. Thus, this technique can provide accurate and rapid measurement of lung volume in studies involving supine patients.

  11. Optimal ventilation of the anesthetized pediatric patient.

    PubMed

    Feldman, Jeffrey M

    2015-01-01

    Mechanical ventilation of the pediatric patient is challenging because small changes in delivered volume can be a significant fraction of the intended tidal volume. Anesthesia ventilators have traditionally been poorly suited to delivering small tidal volumes accurately, and pressure-controlled ventilation has become used commonly when caring for pediatric patients. Modern anesthesia ventilators are designed to deliver small volumes accurately to the patient's airway by compensating for the compliance of the breathing system and delivering tidal volume independent of fresh gas flow. These technology advances provide the opportunity to implement a lung-protective ventilation strategy in the operating room based upon control of tidal volume. This review will describe the capabilities of the modern anesthesia ventilator and the current understanding of lung-protective ventilation. An optimal approach to mechanical ventilation for the pediatric patient is described, emphasizing the importance of using bedside monitors to optimize the ventilation strategy for the individual patient.

  12. A physical multifield model predicts the development of volume and structure in the human brain

    NASA Astrophysics Data System (ADS)

    Rooij, Rijk de; Kuhl, Ellen

    2018-03-01

    The prenatal development of the human brain is characterized by a rapid increase in brain volume and a development of a highly folded cortex. At the cellular level, these events are enabled by symmetric and asymmetric cell division in the ventricular regions of the brain followed by an outwards cell migration towards the peripheral regions. The role of mechanics during brain development has been suggested and acknowledged in past decades, but remains insufficiently understood. Here we propose a mechanistic model that couples cell division, cell migration, and brain volume growth to accurately model the developing brain between weeks 10 and 29 of gestation. Our model accurately predicts a 160-fold volume increase from 1.5 cm3 at week 10 to 235 cm3 at week 29 of gestation. In agreement with human brain development, the cortex begins to form around week 22 and accounts for about 30% of the total brain volume at week 29. Our results show that cell division and coupling between cell density and volume growth are essential to accurately model brain volume development, whereas cell migration and diffusion contribute mainly to the development of the cortex. We demonstrate that complex folding patterns, including sinusoidal folds and creases, emerge naturally as the cortex develops, even for low stiffness contrasts between the cortex and subcortex.

  13. NOTE Thyroid volume measurement in external beam radiotherapy patients using CT imaging: correlation with clinical and anthropometric characteristics

    NASA Astrophysics Data System (ADS)

    Veres, C.; Garsi, J. P.; Rubino, C.; Pouzoulet, F.; Bidault, F.; Chavaudra, J.; Bridier, A.; Ricard, M.; Ferreira, I.; Lefkopoulos, D.; de Vathaire, F.; Diallo, I.

    2010-11-01

    The aim of this study is to define criteria for accurate representation of the thyroid in human models used to represent external beam radiotherapy (EBRT) patients and evaluate the relationship between the volume of this organ and clinical and anthropometric characteristics. From CT images, we segmented the thyroid gland and calculated its volume for a population of 188 EBRT patients of both sexes, with ages ranging from 1 to 89 years. To evaluate uncertainties linked to measured volumes, experimental studies on the Livermore anthropomorphic phantom were performed. For our population of EBRT patients, we observed that in children, thyroid volume increased rapidly with age, from about 3 cm3 at 2 years to about 16 cm3 at 20. In adults, the mean thyroid gland volume was 23.5 ± 9 cm3 for males and 17.5 ± 8 cm3 for females. According to anthropometric parameters, the best fit for children was obtained by modeling the log of thyroid volume as a linear function of body surface area (BSA) (p < 0.0001) and age (p = 0.04) and for adults, as a linear function of BSA (p < 0.0001) and gender (p = 0.01). This work enabled us to demonstrate that BSA was the best indicator of thyroid volume for both males and females. These results should be taken into account when modeling the volume of the thyroid in human models used to represent EBRT patients for dosimetry in retrospective studies of the relationship between the estimated dose to the thyroid and long-term follow-up data on EBRT patients.

  14. Kinetic Growth Rate after Portal Vein Embolization Predicts Posthepatectomy Outcomes: Toward Zero Liver-Related Mortality in Patients with Colorectal Liver Metastases and Small Future Liver Remnant

    PubMed Central

    Shindoh, Junichi; Truty, Mark J; Aloia, Thomas A; Curley, Steven A; Zimmitti, Giuseppe; Huang, Steven Y; Mahvash, Armeen; Gupta, Sanjay; Wallace, Michael J; Vauthey, Jean-Nicolas

    2013-01-01

    Background Standardized future liver remnant (sFLR) volume and degree of hypertrophy after portal vein embolization (PVE) have been recognized as significant predictors of surgical outcomes after major liver resection. However, the regeneration rate of the FLR after PVE varies among individuals and its clinical significance is unknown. Study Design Degree of hypertrophy at initial volume assessment divided by the number of weeks elapsed after PVE was defined as the kinetic growth rate (KGR). In 107 consecutive patients who underwent liver resection for colorectal liver metastases with an sFLR volume of greater than 20%, the ability of the KGR to predict overall and liver-specific postoperative morbidity and mortality was compared with sFLR volume and degree of hypertrophy. Results Using receiver operating characteristic analysis, the best cut-off values for sFLR volume, degree of hypertrophy, and KGR for predicting postoperative hepatic insufficiency were estimated as, respectively, 29.6%, 7.5%, and 2.0% per week. Among these, KGR was the most accurate predictor (area under the curve, 0.830 [0.736-0.923]; asymptotic significance, 0.002). A KGR of less than 2% per week vs. ≥2% per week correlates with rates of hepatic insufficiency (21.6% vs. 0%, p = 0.0001) and liver-related 90-day mortality (8.1% vs. 0%, p = 0.04). The predictive value of KGR was not influenced by sFLR volume or the timing of initial volume assessment when evaluated within 8 weeks after PVE. Conclusions KGR is a better predictor of postoperative morbidity and mortality after liver resection for small FLR than conventionally measured volume parameters (sFLR volume and degree of hypertrophy). PMID:23219349

  15. Kinetic growth rate after portal vein embolization predicts posthepatectomy outcomes: toward zero liver-related mortality in patients with colorectal liver metastases and small future liver remnant.

    PubMed

    Shindoh, Junichi; Truty, Mark J; Aloia, Thomas A; Curley, Steven A; Zimmitti, Giuseppe; Huang, Steven Y; Mahvash, Armeen; Gupta, Sanjay; Wallace, Michael J; Vauthey, Jean-Nicolas

    2013-02-01

    Standardized future liver remnant (sFLR) volume and degree of hypertrophy after portal vein embolization (PVE) have been recognized as important predictors of surgical outcomes after major liver resection. However, the regeneration rate of the FLR after PVE varies among individuals and its clinical significance is unknown. Kinetic growth rate (KGR) is defined as the degree of hypertrophy at initial volume assessment divided by number of weeks elapsed after PVE. In 107 consecutive patients who underwent liver resection for colorectal liver metastases with an sFLR volume >20%, the ability of the KGR to predict overall and liver-specific postoperative morbidity and mortality was compared with sFLR volume and degree of hypertrophy. Using receiver operating characteristic analysis, the best cutoff values for sFLR volume, degree of hypertrophy, and KGR for predicting postoperative hepatic insufficiency were estimated as 29.6%, 7.5%, and 2.0% per week, respectively. Among these, KGR was the most accurate predictor (area under the curve 0.830 [95% CI, 0.736-0.923]; asymptotic significance, 0.002). A KGR of <2% per week vs ≥2% per week correlates with rates of hepatic insufficiency (21.6% vs 0%; p = 0.0001) and liver-related 90-day mortality (8.1% vs 0%; p = 0.04). The predictive value of KGR was not influenced by sFLR volume or the timing of initial volume assessment when evaluated within 8 weeks after PVE. Kinetic growth rate is a better predictor of postoperative morbidity and mortality after liver resection for small FLR than conventional measured volume parameters (ie, sFLR volume and degree of hypertrophy). Copyright © 2013 American College of Surgeons. Published by Elsevier Inc. All rights reserved.
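    The kinetic growth rate defined above is a simple ratio; the sketch below computes it and applies the reported 2% per week cutoff (the patient numbers are invented).

```python
def kinetic_growth_rate(degree_of_hypertrophy_pct, weeks_since_pve):
    """Kinetic growth rate (% per week) = degree of hypertrophy at first
    volume assessment divided by the weeks elapsed since PVE."""
    return degree_of_hypertrophy_pct / weeks_since_pve

# A patient whose sFLR hypertrophied by 6% over 3 weeks after PVE:
kgr = kinetic_growth_rate(6.0, 3.0)   # 2.0 % per week
high_risk = kgr < 2.0                 # reported cutoff for hepatic insufficiency risk
print(kgr, high_risk)
```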

  16. A Comparison of Height-Accumulation and Volume-Equation Methods for Estimating Tree and Stand Volumes

    Treesearch

    R.B. Ferguson; V. Clark Baldwin

    1995-01-01

    Estimating tree and stand volume in mature plantations is time consuming, involving much manpower and equipment; however, several sampling and volume-prediction techniques are available. This study showed that a well-constructed, volume-equation method yields estimates comparable to those of the often more time-consuming, height-accumulation method, even though the...

  17. Quantifying Golgi structure using EM: combining volume-SEM and stereology for higher throughput.

    PubMed

    Ferguson, Sophie; Steyer, Anna M; Mayhew, Terry M; Schwab, Yannick; Lucocq, John Milton

    2017-06-01

    Investigating organelles such as the Golgi complex depends increasingly on high-throughput quantitative morphological analyses from multiple experimental or genetic conditions. Light microscopy (LM) has been an effective tool for screening but fails to reveal fine details of Golgi structures such as vesicles, tubules and cisternae. Electron microscopy (EM) has sufficient resolution but traditional transmission EM (TEM) methods are slow and inefficient. Newer volume scanning EM (volume-SEM) methods now have the potential to speed up 3D analysis by automated sectioning and imaging. However, they produce large arrays of sections and/or images, which require labour-intensive 3D reconstruction for quantitation on limited cell numbers. Here, we show that the information storage, digital waste and workload involved in using volume-SEM can be reduced substantially using sampling-based stereology. Using the Golgi as an example, we describe how Golgi populations can be sensed quantitatively using single random slices and how accurate quantitative structural data on Golgi organelles of individual cells can be obtained using only 5-10 sections/images taken from a volume-SEM series (thereby sensing population parameters and cell-cell variability). The approach will be useful in techniques such as correlative LM and EM (CLEM) where small samples of cells are treated and where there may be variable responses. For Golgi study, we outline a series of stereological estimators that are suited to these analyses and suggest workflows, which have the potential to enhance the speed and relevance of data acquisition in volume-SEM.

  18. Assessment of the derivative-moment transformation method for unsteady-load estimation

    NASA Astrophysics Data System (ADS)

    Mohebbian, Ali; Rival, David E.

    2012-08-01

    It is often difficult, if not impossible, to measure the aerodynamic or hydrodynamic forces on a moving body. For this reason, a classical control-volume technique is typically applied to extract the unsteady forces. However, measuring the acceleration term within the volume of interest using particle image velocimetry (PIV) can be limited by optical access, reflections, as well as shadows. Therefore, in this study, an alternative approach, termed the derivative-moment transformation (DMT) method, is introduced and tested on a synthetic data set produced using numerical simulations. The test case involves the unsteady loading of a flat plate in a two-dimensional, laminar periodic gust. The results suggest that the DMT method can accurately predict the acceleration term so long as appropriate spatial and temporal resolutions are maintained. The major deficiency, which is more dominant for the direction of drag, was found to be the determination of pressure and unsteady terms in the wake. The effect of control-volume size was investigated, suggesting that larger domains work best by minimizing the associated error in the determination of the pressure field. When decreasing the control-volume size, wake vortices, which produce high gradients across the control surfaces, are found to substantially increase the level of error. On the other hand, it was shown that for large control volumes, and with realistic spatial resolution, the accuracy of the DMT method would also suffer. Therefore, a delicate compromise is required when selecting control-volume size in future experiments.

  19. 3D Myocardial Elastography In Vivo.

    PubMed

    Papadacci, Clement; Bunting, Ethan A; Wan, Elaine Y; Nauleau, Pierre; Konofagou, Elisa E

    2017-02-01

    Strain evaluation is of major interest in clinical cardiology as it can quantify cardiac function. Myocardial elastography, a radio-frequency (RF)-based cross-correlation method, has been developed to evaluate the local strain distribution in the heart in vivo. However, inhomogeneities such as RF ablation lesions or infarction require a three-dimensional approach to be measured accurately. In addition, acquisitions at high volume rate are essential to evaluate cardiac strain in three dimensions. Conventional focused transmit schemes using 2D matrix arrays trade off sufficient volume rate against beam density or sector size when imaging rapidly moving structures such as the heart, which lowers accuracy and precision in the strain estimation. In this study, we developed 3D myocardial elastography at high volume rates using diverging wave transmits to evaluate the local axial strain distribution in three dimensions in three open-chest canines before and after radio-frequency ablation. Acquisitions were performed with a fully programmable 2.5 MHz 2D matrix array used to emit 2000 diverging waves at 2000 volumes/s. Incremental displacements and strains enabled the visualization of rapid events during the QRS complex along with the different phases of the cardiac cycle in entire volumes. Cumulative displacement and strain volumes depict high contrast between non-ablated and ablated myocardium at the lesion location, mapping the tissue coagulation. 3D myocardial strain elastography could thus become an important technique to measure the regional strain distribution in three dimensions in humans.

  20. Preoperative TRAM free flap volume estimation for breast reconstruction in lean patients.

    PubMed

    Minn, Kyung Won; Hong, Ki Yong; Lee, Sang Woo

    2010-04-01

    To obtain pleasing symmetry in breast reconstruction with a transverse rectus abdominis myocutaneous (TRAM) free flap, a large amount of abdominal flap is elevated and the remnant tissue is trimmed in most cases. However, elevation of an abundant abdominal flap can cause excessive tension in donor site closure and increase the possibility of hypertrophic scarring, especially in lean patients. The TRAM flap was divided into 4 zones in the routine manner; the depth and dimensions of the 4 zones were obtained using ultrasound and AutoCAD (Autodesk Inc., San Rafael, CA), respectively. The acquired numbers were then multiplied to obtain an estimated volume for each zone, and the zone volumes were summed. To confirm the relation between the estimated volume and the actual volume, the authors compared intraoperative actual TRAM flap volumes with preoperative estimated volumes in 30 consecutive TRAM free flap breast reconstructions. The estimated volumes and the actual elevated flap volumes were found to be correlated by regression analysis (r = 0.9258, P < 0.01). According to this result, we could confirm the reliability of the preoperative volume estimation using our method. Afterward, the authors applied this method to 7 lean patients by estimating and revising the design and obtained symmetric results with minimal donor site morbidity. Preoperative estimation of TRAM flap volume with ultrasound and AutoCAD (Autodesk Inc.) allows the authors to attain the precise volume desired for elevation. This method provides advantages in terms of minimal flap trimming, easier closure of donor sites, reduced scar widening and symmetry, especially in lean patients.
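    A minimal sketch of the preoperative volume estimate described above: each zone's traced area is multiplied by its ultrasound-measured thickness and the products are summed (the zone areas and depths below are hypothetical).

```python
def estimated_flap_volume_ml(zone_areas_cm2, zone_depths_cm):
    """Preoperative TRAM flap volume estimate: the area of each of the four zones
    (e.g. traced in CAD software) multiplied by its ultrasound-measured
    thickness, then summed. 1 cm^3 = 1 ml."""
    return sum(area * depth for area, depth in zip(zone_areas_cm2, zone_depths_cm))

# Hypothetical zone areas (cm^2) and thicknesses (cm) for the four TRAM zones:
print(estimated_flap_volume_ml([90, 85, 70, 65], [2.5, 2.4, 2.0, 1.8]))  # 686 ml
```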

  1. Errors in Measuring Water Potentials of Small Samples Resulting from Water Adsorption by Thermocouple Psychrometer Chambers

    PubMed Central

    Bennett, Jerry M.; Cortes, Peter M.

    1985-01-01

    The adsorption of water by thermocouple psychrometer assemblies is known to cause errors in the determination of water potential. Experiments were conducted to evaluate the effect of sample size and psychrometer chamber volume on measured water potentials of leaf discs, leaf segments, and sodium chloride solutions. Reasonable agreement was found between soybean (Glycine max L. Merr.) leaf water potentials measured on 5-millimeter radius leaf discs and large leaf segments. Results indicated that while errors due to adsorption may be significant when using small volumes of tissue, if sufficient tissue is used the errors are negligible. Because of the relationship between water potential and volume in plant tissue, the errors due to adsorption were larger with turgid tissue. Large psychrometers which were sealed into the sample chamber with latex tubing appeared to adsorb more water than those sealed with flexible plastic tubing. Estimates are provided of the amounts of water adsorbed by two different psychrometer assemblies and the amount of tissue sufficient for accurate measurements of leaf water potential with these assemblies. It is also demonstrated that water adsorption problems may have generated low water potential values which in prior studies have been attributed to large cut surface area to volume ratios. PMID:16664367

  2. Errors in measuring water potentials of small samples resulting from water adsorption by thermocouple psychrometer chambers.

    PubMed

    Bennett, J M; Cortes, P M

    1985-09-01

    The adsorption of water by thermocouple psychrometer assemblies is known to cause errors in the determination of water potential. Experiments were conducted to evaluate the effect of sample size and psychrometer chamber volume on measured water potentials of leaf discs, leaf segments, and sodium chloride solutions. Reasonable agreement was found between soybean (Glycine max L. Merr.) leaf water potentials measured on 5-millimeter radius leaf discs and large leaf segments. Results indicated that while errors due to adsorption may be significant when using small volumes of tissue, if sufficient tissue is used the errors are negligible. Because of the relationship between water potential and volume in plant tissue, the errors due to adsorption were larger with turgid tissue. Large psychrometers which were sealed into the sample chamber with latex tubing appeared to adsorb more water than those sealed with flexible plastic tubing. Estimates are provided of the amounts of water adsorbed by two different psychrometer assemblies and the amount of tissue sufficient for accurate measurements of leaf water potential with these assemblies. It is also demonstrated that water adsorption problems may have generated low water potential values which in prior studies have been attributed to large cut surface area to volume ratios.

  3. A computational method for sharp interface advection

    PubMed Central

    Bredmose, Henrik; Jasak, Hrvoje

    2016-01-01

    We devise a numerical method for passive advection of a surface, such as the interface between two incompressible fluids, across a computational mesh. The method is called isoAdvector, and is developed for general meshes consisting of arbitrary polyhedral cells. The algorithm is based on the volume of fluid (VOF) idea of calculating the volume of one of the fluids transported across the mesh faces during a time step. The novelty of the isoAdvector concept consists of two parts. First, we exploit an isosurface concept for modelling the interface inside cells in a geometric surface reconstruction step. Second, from the reconstructed surface, we model the motion of the face–interface intersection line for a general polygonal face to obtain the time evolution within a time step of the submerged face area. Integrating this submerged area over the time step leads to an accurate estimate for the total volume of fluid transported across the face. The method was tested on simple two-dimensional and three-dimensional interface advection problems on both structured and unstructured meshes. The results are very satisfactory in terms of volume conservation, boundedness, surface sharpness and efficiency. The isoAdvector method was implemented as an OpenFOAM® extension and is published as open source. PMID:28018619

  4. Magnetic Moment Quantifications of Small Spherical Objects in MRI

    PubMed Central

    Cheng, Yu-Chung N.; Hsieh, Ching-Yi; Tackett, Ronald; Kokeny, Paul; Regmi, Rajesh Kumar; Lawes, Gavin

    2014-01-01

    Purpose The purpose of this work is to develop a method for accurately quantifying effective magnetic moments of spherical-like small objects from magnetic resonance imaging (MRI). A standard 3D gradient echo sequence with only one echo time is intended for our approach to measure the effective magnetic moment of a given object of interest. Methods Our method sums over complex MR signals around the object and equates those sums to equations derived from the magnetostatic theory. With those equations, our method is able to determine the center of the object with subpixel precision. By rewriting those equations, the effective magnetic moment of the object becomes the only unknown to be solved. Each quantified effective magnetic moment has an uncertainty that is derived from the error propagation method. If the volume of the object can be measured from spin echo images, the susceptibility difference between the object and its surrounding can be further quantified from the effective magnetic moment. Numerical simulations, a variety of glass beads in phantom studies with different MR imaging parameters from a 1.5 T machine, and measurements from a SQUID (superconducting quantum interference device) based magnetometer have been conducted to test the robustness of our method. Results Quantified effective magnetic moments and susceptibility differences from different imaging parameters and methods all agree with each other within two standard deviations of estimated uncertainties. Conclusion An MRI method is developed to accurately quantify the effective magnetic moment of a given small object of interest. Most results are accurate within 10% of true values and roughly half of the total results are accurate within 5% of true values using very reasonable imaging parameters. Our method is minimally affected by the partial volume, dephasing, and phase aliasing effects. Our next goal is to apply this method to in vivo studies. PMID:25490517

  5. Magnetic moment quantifications of small spherical objects in MRI.

    PubMed

    Cheng, Yu-Chung N; Hsieh, Ching-Yi; Tackett, Ronald; Kokeny, Paul; Regmi, Rajesh Kumar; Lawes, Gavin

    2015-07-01

    The purpose of this work is to develop a method for accurately quantifying effective magnetic moments of spherical-like small objects from magnetic resonance imaging (MRI). A standard 3D gradient echo sequence with only one echo time is intended for our approach to measure the effective magnetic moment of a given object of interest. Our method sums over complex MR signals around the object and equates those sums to equations derived from the magnetostatic theory. With those equations, our method is able to determine the center of the object with subpixel precision. By rewriting those equations, the effective magnetic moment of the object becomes the only unknown to be solved. Each quantified effective magnetic moment has an uncertainty that is derived from the error propagation method. If the volume of the object can be measured from spin echo images, the susceptibility difference between the object and its surrounding can be further quantified from the effective magnetic moment. Numerical simulations, a variety of glass beads in phantom studies with different MR imaging parameters from a 1.5T machine, and measurements from a SQUID (superconducting quantum interference device) based magnetometer have been conducted to test the robustness of our method. Quantified effective magnetic moments and susceptibility differences from different imaging parameters and methods all agree with each other within two standard deviations of estimated uncertainties. An MRI method is developed to accurately quantify the effective magnetic moment of a given small object of interest. Most results are accurate within 10% of true values, and roughly half of the total results are accurate within 5% of true values using very reasonable imaging parameters. Our method is minimally affected by the partial volume, dephasing, and phase aliasing effects. Our next goal is to apply this method to in vivo studies. Copyright © 2015 Elsevier Inc. All rights reserved.

  6. 40 CFR 86.519-90 - Constant volume sampler calibration.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Title 40, Protection of Environment, Vol. 19, revised as of 2013-07-01. ... Regulations for 1978 and Later New Motorcycles; Test Procedures. § 86.519-90 Constant volume sampler calibration. (a) The CVS (Constant Volume Sampler) is calibrated using an accurate flowmeter and restrictor...

  7. 40 CFR 86.519-90 - Constant volume sampler calibration.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Title 40, Protection of Environment, Vol. 19, revised as of 2012-07-01. ... Regulations for 1978 and Later New Motorcycles; Test Procedures. § 86.519-90 Constant volume sampler calibration. (a) The CVS (Constant Volume Sampler) is calibrated using an accurate flowmeter and restrictor...

  8. 40 CFR 86.519-90 - Constant volume sampler calibration.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Title 40, Protection of Environment, Vol. 18, revised as of 2011-07-01. ... Regulations for 1978 and Later New Motorcycles; Test Procedures. § 86.519-90 Constant volume sampler calibration. (a) The CVS (Constant Volume Sampler) is calibrated using an accurate flowmeter and restrictor...

  9. 40 CFR 86.519-90 - Constant volume sampler calibration.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Title 40, Protection of Environment, Vol. 19, revised as of 2014-07-01. ... Regulations for 1978 and Later New Motorcycles; Test Procedures. § 86.519-90 Constant volume sampler calibration. (a) The CVS (Constant Volume Sampler) is calibrated using an accurate flowmeter and restrictor...

  10. 40 CFR 86.519-90 - Constant volume sampler calibration.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Title 40, Protection of Environment, Vol. 18, revised as of 2010-07-01. ... Regulations for 1978 and Later New Motorcycles; Test Procedures. § 86.519-90 Constant volume sampler calibration. (a) The CVS (Constant Volume Sampler) is calibrated using an accurate flowmeter and restrictor...

  11. Integrated petrophysical and reservoir characterization workflow to enhance permeability and water saturation prediction

    NASA Astrophysics Data System (ADS)

    Al-Amri, Meshal; Mahmoud, Mohamed; Elkatatny, Salaheldin; Al-Yousef, Hasan; Al-Ghamdi, Tariq

    2017-07-01

    Accurate estimation of permeability is essential in reservoir characterization and in determining fluid flow in porous media, and it greatly assists in optimizing the production of a field. Permeability prediction techniques such as porosity-permeability transforms and, more recently, artificial intelligence and neural networks are encouraging but still show only a moderate to good match to core data. This may be because they are limited to homogeneous media, while knowledge of geology and heterogeneity is only indirectly incorporated or absent. Geological information from core description, such as lithofacies (which include diagenetic information), shows a link to permeability when samples are categorized into rock types exposed to similar depositional environments. The objective of this paper is to develop a robust workflow integrating geology, petrophysics, and wireline logs in an extremely heterogeneous carbonate reservoir to accurately predict permeability. Permeability prediction is carried out using a pattern recognition algorithm called multi-resolution graph-based clustering (MRGC). We benchmark the prediction results against hard data from core and well test analysis. We show that permeability prediction improves substantially when geology is integrated into the analysis. Finally, we use the predicted permeability as an input parameter in the J-function to correct for uncertainties in the saturation calculation produced from wireline logs using the classical Archie equation. A high level of confidence in hydrocarbon volume estimation is reached when robust permeability and saturation-height functions are estimated in the presence of geological details that are petrophysically meaningful.
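
    The abstract refers to the J-function without giving its form; the classical Leverett relation, written here in LaTeX as a reminder of how the predicted permeability enters the saturation-height correction, is

      J(S_w) = \frac{P_c(S_w)}{\sigma \cos\theta} \, \sqrt{\frac{k}{\phi}}

    where P_c is capillary pressure, \sigma\cos\theta the interfacial-tension and contact-angle term, k the (MRGC-predicted) permeability and \phi the porosity; fitting J(S_w) against height above the free-water level yields a saturation-height function that can be checked against the Archie log-derived saturations.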

  12. Myocardial strains from 3D displacement encoded magnetic resonance imaging

    PubMed Central

    2012-01-01

    Background: The ability to measure and quantify myocardial motion and deformation provides a useful tool to assist in the diagnosis, prognosis and management of heart disease. The recent development of magnetic resonance imaging methods, such as harmonic phase analysis of tagging and displacement encoding with stimulated echoes (DENSE), makes detailed non-invasive 3D kinematic analyses of human myocardium possible in the clinic and for research purposes. A robust analysis method is required, however. Methods: We propose to estimate strain using a polynomial function which produces local models of the displacement field obtained with DENSE. Given a specific polynomial order, the model is obtained as the least squares fit of the acquired displacement field. These local models are subsequently used to produce estimates of the full strain tensor. Results: The proposed method is evaluated on a numerical phantom as well as in vivo on a healthy human heart. The evaluation showed that the proposed method produced accurate results and showed low sensitivity to noise in the numerical phantom. The method was also demonstrated in vivo by assessing the full strain tensor and resolving transmural strain variations. Conclusions: Strain estimation within a 3D myocardial volume based on polynomial functions yields accurate and robust results when validated on an analytical model. The polynomial field is capable of resolving the measured material positions from the in vivo data, and the obtained in vivo strain values agree with previously reported myocardial strains in normal human hearts. PMID:22533791
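
    As a hedged illustration of the displacement-to-strain step described above, the sketch below fits a first-order (affine) polynomial model to a local cloud of DENSE-like displacement samples and forms the Green-Lagrange strain tensor; the paper uses higher-order local polynomial models, and the sample points and noise level here are synthetic.

      import numpy as np

      def green_lagrange_strain(points, displacements):
          """Least-squares fit of an affine model u(x) = c + G x to locally sampled
          displacements, then E = 0.5 (F^T F - I) with F = I + G.
          points: (N, 3) material coordinates, displacements: (N, 3)."""
          X = np.hstack([np.ones((points.shape[0], 1)), points])  # design matrix [1, x, y, z]
          coeffs, *_ = np.linalg.lstsq(X, displacements, rcond=None)
          G = coeffs[1:].T                                         # displacement gradient du_i/dx_j
          F = np.eye(3) + G                                        # deformation gradient
          return 0.5 * (F.T @ F - np.eye(3))

      # Hypothetical usage on a small synthetic cloud (~5% uniform stretch plus noise):
      pts = np.random.rand(50, 3)
      disp = 0.05 * pts + 0.001 * np.random.randn(50, 3)
      print(green_lagrange_strain(pts, disp))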

  13. Temporal variation of traffic on highways and the development of accurate temporal allocation factors for air pollution analyses

    NASA Astrophysics Data System (ADS)

    Batterman, Stuart; Cook, Richard; Justin, Thomas

    2015-04-01

    Traffic activity encompasses the number, mix, speed and acceleration of vehicles on roadways. The temporal pattern and variation of traffic activity reflects vehicle use, congestion and safety issues, and it represents a major influence on emissions and concentrations of traffic-related air pollutants. Accurate characterization of vehicle flows is critical in analyzing and modeling urban and local-scale pollutants, especially in near-road environments and traffic corridors. This study describes methods to improve the characterization of temporal variation of traffic activity. Annual, monthly, daily and hourly temporal allocation factors (TAFs), which describe the expected temporal variation in traffic activity, were developed using four years of hourly traffic activity data recorded at 14 continuous counting stations across the Detroit, Michigan, U.S. region. Five sites also provided vehicle classification. TAF-based models provide a simple means to apportion annual average estimates of traffic volume to hourly estimates. The analysis shows the need to separate TAFs for total and commercial vehicles, and weekdays, Saturdays, Sundays and observed holidays. Using either site-specific or urban-wide TAFs, nearly all of the variation in historical traffic activity at the street scale could be explained; unexplained variation was attributed to adverse weather, traffic accidents and construction. The methods and results presented in this paper can improve air quality dispersion modeling of mobile sources, and can be used to evaluate and model temporal variation in ambient air quality monitoring data and exposure estimates.
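
    A minimal sketch of the allocation idea, assuming only that a TAF is the ratio of the mean traffic count in a given hour of day to the overall mean, so that multiplying an annual-average hourly volume by the TAF gives an expected hourly volume; the counts below are synthetic, and the study additionally stratifies TAFs by month, day type and vehicle class.

      import numpy as np

      def hourly_tafs(hourly_counts, hours_of_day):
          """Temporal allocation factors: mean count for each hour of day divided
          by the overall mean count."""
          overall = hourly_counts.mean()
          return np.array([hourly_counts[hours_of_day == h].mean()
                           for h in range(24)]) / overall

      # Hypothetical one-week example with synthetic counts:
      hours = np.tile(np.arange(24), 7)
      counts = 500 + 400 * np.sin((hours - 5) / 24 * 2 * np.pi) + 20 * np.random.randn(hours.size)
      taf = hourly_tafs(counts, hours)
      aadt_per_hour = 1000.0            # annual-average hourly volume (hypothetical)
      print(aadt_per_hour * taf)        # expected hourly volumes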

  14. Temporal variation of traffic on highways and the development of accurate temporal allocation factors for air pollution analyses

    PubMed Central

    Batterman, Stuart; Cook, Richard; Justin, Thomas

    2015-01-01

    Traffic activity encompasses the number, mix, speed and acceleration of vehicles on roadways. The temporal pattern and variation of traffic activity reflects vehicle use, congestion and safety issues, and it represents a major influence on emissions and concentrations of traffic-related air pollutants. Accurate characterization of vehicle flows is critical in analyzing and modeling urban and local-scale pollutants, especially in near-road environments and traffic corridors. This study describes methods to improve the characterization of temporal variation of traffic activity. Annual, monthly, daily and hourly temporal allocation factors (TAFs), which describe the expected temporal variation in traffic activity, were developed using four years of hourly traffic activity data recorded at 14 continuous counting stations across the Detroit, Michigan, U.S. region. Five sites also provided vehicle classification. TAF-based models provide a simple means to apportion annual average estimates of traffic volume to hourly estimates. The analysis shows the need to separate TAFs for total and commercial vehicles, and weekdays, Saturdays, Sundays and observed holidays. Using either site-specific or urban-wide TAFs, nearly all of the variation in historical traffic activity at the street scale could be explained; unexplained variation was attributed to adverse weather, traffic accidents and construction. The methods and results presented in this paper can improve air quality dispersion modeling of mobile sources, and can be used to evaluate and model temporal variation in ambient air quality monitoring data and exposure estimates. PMID:25844042

  15. Experimental design and efficient parameter estimation in preclinical pharmacokinetic studies.

    PubMed

    Ette, E I; Howie, C A; Kelman, A W; Whiting, B

    1995-05-01

    A Monte Carlo simulation technique used to evaluate the effect of the arrangement of concentrations on the efficiency of estimation of population pharmacokinetic parameters in the preclinical setting is described. Although the simulations were restricted to the one-compartment model with intravenous bolus input, they provide a basis for discussing some structural aspects involved in designing a destructive ("quantic") preclinical population pharmacokinetic study with a fixed sample size, as is usually the case in such studies. The efficiency of parameter estimation obtained with sampling strategies based on the three and four time point designs was evaluated in terms of the percent prediction error, design number, individual and joint confidence interval coverage for parameter estimates, and correlation analysis. The data sets contained random terms for both inter-animal and residual intra-animal variability. The results showed that the typical population parameter estimates for clearance and volume were efficiently (accurately and precisely) estimated for both designs, while interanimal variability (the only random effect parameter that could be estimated) was inefficiently (inaccurately and imprecisely) estimated with most sampling schedules of the two designs. The exact location of the third and fourth time points for the three and four time point designs, respectively, was not critical to the efficiency of the overall estimation of all population parameters of the model. However, some individual population pharmacokinetic parameters were sensitive to the location of these times.
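
    A minimal sketch of the kind of design being simulated, assuming the one-compartment intravenous-bolus model named in the abstract, log-normal inter-animal variability and proportional residual error; the parameter values and sampling times are hypothetical, and the code is not the authors' implementation.

      import numpy as np

      rng = np.random.default_rng(0)

      def simulate_quantic_design(times, dose=10.0, cl_pop=1.0, v_pop=5.0,
                                  omega=0.3, sigma=0.1, n_per_time=4):
          """One-compartment IV-bolus model C(t) = (Dose/V) * exp(-(CL/V) t) with
          log-normal inter-animal variability (omega) and proportional residual
          error (sigma); each animal contributes a single (destructive) sample."""
          records = []
          for t in times:
              for _ in range(n_per_time):
                  cl = cl_pop * np.exp(rng.normal(0, omega))
                  v = v_pop * np.exp(rng.normal(0, omega))
                  c = (dose / v) * np.exp(-(cl / v) * t)
                  records.append((t, c * (1 + rng.normal(0, sigma))))
          return np.array(records)

      print(simulate_quantic_design(times=[0.5, 2.0, 8.0]))   # a three-time-point design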

  16. Improved ultrasound transducer positioning by fetal heart location estimation during Doppler based heart rate measurements.

    PubMed

    Hamelmann, Paul; Vullings, Rik; Schmitt, Lars; Kolen, Alexander F; Mischi, Massimo; van Laar, Judith O E H; Bergmans, Jan W M

    2017-09-21

    Doppler ultrasound (US) is the most commonly applied method to measure the fetal heart rate (fHR). When the fetal heart is not properly located within the ultrasonic beam, fHR measurements often fail. As a consequence, clinical staff need to reposition the US transducer on the maternal abdomen, which can be a time consuming and tedious task. In this article, a method is presented to aid clinicians with the positioning of the US transducer to produce robust fHR measurements. A maximum likelihood estimation (MLE) algorithm is developed, which provides information on fetal heart location using the power of the Doppler signals received in the individual elements of a standard US transducer for fHR recordings. The performance of the algorithm is evaluated with simulations and in vitro experiments performed on a beating-heart setup. Both the experiments and the simulations show that the heart location can be accurately determined with an error of less than 7 mm within the measurement volume of the employed US transducer. The results show that the developed algorithm can be used to provide accurate feedback on fetal heart location for improved positioning of the US transducer, which may lead to improved measurements of the fHR.

  17. An Evaluation of Fractal Surface Measurement Methods for Characterizing Landscape Complexity from Remote-Sensing Imagery

    NASA Technical Reports Server (NTRS)

    Lam, Nina Siu-Ngan; Qiu, Hong-Lie; Quattrochi, Dale A.; Emerson, Charles W.; Arnold, James E. (Technical Monitor)

    2001-01-01

    The rapid increase in digital data volumes from new and existing sensors necessitates efficient analytical tools for extracting information. We developed an integrated software package called ICAMS (Image Characterization and Modeling System) to provide specialized spatial analytical functions for interpreting remote sensing data. This paper evaluates three fractal dimension measurement methods (isarithm, variogram, and triangular prism), along with the spatial autocorrelation measures Moran's I and Geary's C, all of which have been implemented in ICAMS. A modified triangular prism method was proposed and implemented. Results from analyzing 25 simulated surfaces having known fractal dimensions show that both the isarithm and triangular prism methods can accurately measure a range of fractal surfaces. The triangular prism method is the most accurate at estimating the fractal dimension of surfaces with higher spatial complexity, but it is sensitive to contrast stretching. The variogram method is a comparatively poor estimator for all of the surfaces, particularly those with higher fractal dimensions. Similar to the fractal techniques, the spatial autocorrelation techniques are found to be useful for measuring complex images but not images with low dimensionality. These fractal measurement methods can be applied directly to unclassified images and could serve as a tool for change detection and data mining.
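
    Of the three estimators compared above, the variogram method is the simplest to sketch: the semivariance of a fractal surface grows as gamma(h) ~ h^(2H) at small lags, and the surface dimension follows as D = 3 - H. The code below is a bare-bones illustration on a gridded surface (row-wise lags only) and is not the ICAMS implementation.

      import numpy as np

      def variogram_fractal_dimension(surface, max_lag=20):
          """Variogram estimator of the fractal dimension of a gridded surface:
          gamma(h) ~ h^(2H) for small lags, and D = 3 - H."""
          lags = np.arange(1, max_lag + 1)
          gamma = np.array([np.mean((surface[:, h:] - surface[:, :-h]) ** 2) / 2.0
                            for h in lags])
          slope, _ = np.polyfit(np.log(lags), np.log(gamma), 1)
          return 3.0 - slope / 2.0

      # Synthetic check on uncorrelated noise, whose dimension approaches 3:
      print(variogram_fractal_dimension(np.random.randn(256, 256)))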

  18. A new method for ultrasound detection of interfacial position in gas-liquid two-phase flow.

    PubMed

    Coutinho, Fábio Rizental; Ofuchi, César Yutaka; de Arruda, Lúcia Valéria Ramos; Neves, Flávio; Morales, Rigoberto E M

    2014-05-22

    Ultrasonic measurement techniques for velocity estimation are currently widely used in fluid flow studies and applications. An accurate determination of interfacial position in gas-liquid two-phase flows is still an open problem. The quality of this information directly reflects on the accuracy of void fraction measurement, and it provides a means of discriminating velocity information of both phases. The algorithm known as Velocity Matched Spectrum (VM Spectrum) is a velocity estimator that stands out from other methods by returning a spectrum of velocities for each interrogated volume sample. Interface detection of free-rising bubbles in quiescent liquid presents some difficulties due to abrupt changes in interface inclination. In this work a method based on the velocity spectrum curve shape is used to generate a spatial-temporal mapping which, after spatial filtering, yields an accurate contour of the air-water interface. It is shown that the proposed technique yields an RMS error between 1.71 and 3.39 and a probability of detection failure and false detection between 0.89% and 11.9% in determining the spatial-temporal gas-liquid interface position in the flow of free-rising bubbles in stagnant liquid. This result is valid both for a free path and for a transducer emitting through a metallic plate or a Plexiglas pipe.

  19. A New Method for Ultrasound Detection of Interfacial Position in Gas-Liquid Two-Phase Flow

    PubMed Central

    Coutinho, Fábio Rizental; Ofuchi, César Yutaka; de Arruda, Lúcia Valéria Ramos; Neves Jr., Flávio; Morales, Rigoberto E. M.

    2014-01-01

    Ultrasonic measurement techniques for velocity estimation are currently widely used in fluid flow studies and applications. An accurate determination of interfacial position in gas-liquid two-phase flows is still an open problem. The quality of this information directly reflects on the accuracy of void fraction measurement, and it provides a means of discriminating velocity information of both phases. The algorithm known as Velocity Matched Spectrum (VM Spectrum) is a velocity estimator that stands out from other methods by returning a spectrum of velocities for each interrogated volume sample. Interface detection of free-rising bubbles in quiescent liquid presents some difficulties due to abrupt changes in interface inclination. In this work a method based on the velocity spectrum curve shape is used to generate a spatial-temporal mapping which, after spatial filtering, yields an accurate contour of the air-water interface. It is shown that the proposed technique yields an RMS error between 1.71 and 3.39 and a probability of detection failure and false detection between 0.89% and 11.9% in determining the spatial-temporal gas-liquid interface position in the flow of free-rising bubbles in stagnant liquid. This result is valid both for a free path and for a transducer emitting through a metallic plate or a Plexiglas pipe. PMID:24858961

  20. A novel model for estimating organic chemical bioconcentration in agricultural plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hung, H.; Mackay, D.; Di Guardo, A.

    1995-12-31

    There is increasing recognition that much human and wildlife exposure to organic contaminants can be traced through the food chain to bioconcentration in vegetation. For risk assessment, there is a need for an accurate model to predict organic chemical concentrations in plants. Existing models range from relatively simple correlations of concentrations using octanol-water or octanol-air partition coefficients, to complex models involving extensive physiological data. To satisfy the need for a relatively accurate model of intermediate complexity, a novel approach has been devised to predict organic chemical concentrations in agricultural plants as a function of soil and air concentrations, without the need for extensive plant physiological data. The plant is treated as three compartments, namely, leaves, roots and stems (including fruit and seeds). Data readily available from the literature, including chemical properties, volume, density and composition of each compartment; metabolic and growth rate of the plant; and readily obtainable environmental conditions at the site are required as input. Results calculated from the model are compared with observed and experimentally-determined concentrations. It is suggested that the model, which includes a physiological database for agricultural plants, gives acceptably accurate predictions of chemical partitioning between plants, air and soil.

  1. Rapid perfusion quantification using Welch-Satterthwaite approximation and analytical spectral filtering

    NASA Astrophysics Data System (ADS)

    Krishnan, Karthik; Reddy, Kasireddy V.; Ajani, Bhavya; Yalavarthy, Phaneendra K.

    2017-02-01

    CT and MR perfusion weighted imaging (PWI) enable quantification of perfusion parameters in stroke studies. These parameters are calculated from the residual impulse response function (IRF) based on a physiological model for tissue perfusion. The standard approach for estimating the IRF is deconvolution using oscillatory-limited singular value decomposition (oSVD) or Frequency Domain Deconvolution (FDD). FDD is widely recognized as the fastest approach currently available for deconvolution of CT perfusion/MR PWI. In this work, three faster methods are proposed. The first is a direct (model-based) crude approximation to the final perfusion quantities (blood flow, blood volume, mean transit time and delay) using the Welch-Satterthwaite approximation for gamma-fitted concentration time curves (CTC). The second is a fast and accurate deconvolution method, which we call Analytical Fourier Filtering (AFF). The third is another fast and accurate deconvolution technique using Showalter's method, which we call Analytical Showalter's Spectral Filtering (ASSF). Through systematic evaluation on phantom and clinical data, the proposed methods are shown to be computationally more than twice as fast as FDD. The two deconvolution-based methods, AFF and ASSF, are also shown to be quantitatively accurate compared to FDD and oSVD.
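
    For context, the perfusion quantities named above are tied to the deconvolved, CBF-scaled impulse response by the standard indicator-dilution relations (CBF from the peak of the scaled IRF, CBV from the ratio of the tissue and arterial curve areas, MTT = CBV/CBF). The sketch below encodes those textbook relations with synthetic curves; it is not the paper's Welch-Satterthwaite or spectral-filtering machinery.

      import numpy as np

      def perfusion_parameters(t, irf_scaled, tissue_curve, aif):
          """Summary perfusion parameters from a deconvolved, CBF-scaled impulse
          response k(t) = CBF * R(t): CBF = max k(t), CBV = area(tissue)/area(AIF),
          MTT = CBV / CBF."""
          cbf = irf_scaled.max()
          cbv = np.trapz(tissue_curve, t) / np.trapz(aif, t)
          return cbf, cbv, cbv / cbf

      # Synthetic example (arbitrary units): Gaussian AIF, exponential residue function.
      t = np.linspace(0, 60, 121)
      aif = np.exp(-((t - 15) / 4) ** 2)
      irf = 0.01 * np.exp(-t / 4)                         # CBF-scaled residue, CBF = 0.01
      tissue = np.convolve(aif, irf)[:t.size] * (t[1] - t[0])
      print(perfusion_parameters(t, irf, tissue, aif))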

  2. Prediction of beef carcass salable yield and trimmable fat using bioelectrical impedance analysis.

    PubMed

    Zollinger, B L; Farrow, R L; Lawrence, T E; Latman, N S

    2010-03-01

    Bioelectrical impedance technology (BIA) is capable of providing an objective method of beef carcass yield estimation with the rapidity of yield grading. Electrical resistance (Rs), reactance (Xc), impedance (I), hot carcass weight (HCW), fat thickness between the 12th and 13th ribs (FT), estimated percentage kidney, pelvic, and heart fat (KPH%), longissimus muscle area (LMA), length between electrodes (LGE) as well as three derived carcass values that included electrical volume (EVOL), reactive density (XcD), and resistive density (RsD) were determined for the carcasses of 41 commercially fed cattle. Carcasses were subsequently fabricated into salable beef products reflective of industry standards. Equations were developed to predict percentage salable carcass yield (SY%) and percentage trimmable fat (FT%). Resulting equations accounted for 81% and 84% of variation in SY% and FT%, respectively. These results indicate that BIA technology is an accurate predictor of beef carcass composition. Copyright 2009 Elsevier Ltd. All rights reserved.

  3. Pharmacokinetics of paracetamol (acetaminophen) after intravenous and oral administration.

    PubMed

    Rawlins, M D; Henderson, D B; Hijab, A R

    1977-04-20

    Plasma paracetamol concentrations were measured in 6 volunteers after single intravenous (1000 mg) and oral (500 mg, 1000 mg and 2000 mg) doses of the drug. Paracetamol levels declined multiphasically with a mean clearance after intravenous administration of 352 +/- 40 ml/min. A two-compartment open model appeared to describe the decline adequately. Comparison of the areas under the plasma concentration-time curves (AUC) indicated that oral bioavailability increased from 0.63 +/- 0.02 after 500 mg, to 0.89 +/- 0.04 and 0.87 +/- 0.08 after 1000 mg and 2000 mg, respectively. As a consequence of the incomplete bioavailability of paracetamol, as well as its multicompartmental distribution, accurate estimates of its distribution volume and clearance cannot be obtained if the drug is given orally. However, an estimate of its total plasma clearance may be derived from the AUC after a 500 mg oral dose.
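
    The bioavailability and clearance figures quoted above follow from the standard non-compartmental definitions (stated here for reference, not taken from the paper):

      F = \frac{AUC_{oral} / D_{oral}}{AUC_{iv} / D_{iv}}, \qquad CL = \frac{D_{iv}}{AUC_{iv}}

    which is why an oral dose alone cannot separate CL from F: an oral AUC only determines the ratio CL/F (the apparent clearance).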

  4. Updating stand-level forest inventories using airborne laser scanning and Landsat time series data

    NASA Astrophysics Data System (ADS)

    Bolton, Douglas K.; White, Joanne C.; Wulder, Michael A.; Coops, Nicholas C.; Hermosilla, Txomin; Yuan, Xiaoping

    2018-04-01

    Vertical forest structure can be mapped over large areas by combining samples of airborne laser scanning (ALS) data with wall-to-wall spatial data, such as Landsat imagery. Here, we use samples of ALS data and Landsat time-series metrics to produce estimates of top height, basal area, and net stem volume for two timber supply areas near Kamloops, British Columbia, Canada, using an imputation approach. Both single-year and time series metrics were calculated from annual, gap-free Landsat reflectance composites representing 1984-2014. Metrics included long-term means of vegetation indices, as well as measures of the variance and slope of the indices through time. Terrain metrics, generated from a 30 m digital elevation model, were also included as predictors. We found that imputation models improved with the inclusion of Landsat time series metrics when compared to single-year Landsat metrics (relative RMSE decreased from 22.8% to 16.5% for top height, from 32.1% to 23.3% for basal area, and from 45.6% to 34.1% for net stem volume). Landsat metrics that characterized 30-years of stand history resulted in more accurate models (for all three structural attributes) than Landsat metrics that characterized only the most recent 10 or 20 years of stand history. To test model transferability, we compared imputed attributes against ALS-based estimates in nearby forest blocks (>150,000 ha) that were not included in model training or testing. Landsat-imputed attributes correlated strongly to ALS-based estimates in these blocks (R2 = 0.62 and relative RMSE = 13.1% for top height, R2 = 0.75 and relative RMSE = 17.8% for basal area, and R2 = 0.67 and relative RMSE = 26.5% for net stem volume), indicating model transferability. These findings suggest that in areas containing spatially-limited ALS data acquisitions, imputation models, and Landsat time series and terrain metrics can be effectively used to produce wall-to-wall estimates of key inventory attributes, providing an opportunity to update estimates of forest attributes in areas where inventory information is either out of date or non-existent.

  5. A comparison of two dose calculation algorithms-anisotropic analytical algorithm and Acuros XB-for radiation therapy planning of canine intranasal tumors.

    PubMed

    Nagata, Koichi; Pethel, Timothy D

    2017-07-01

    Although the anisotropic analytical algorithm (AAA) and Acuros XB (AXB) are both radiation dose calculation algorithms that take into account the heterogeneity within the radiation field, Acuros XB is inherently more accurate. The purpose of this retrospective method comparison study was to compare the two algorithms and evaluate the dose discrepancy within the planning target volume (PTV). Radiation therapy (RT) plans of 11 dogs with intranasal tumors treated by radiation therapy at the University of Georgia were evaluated. All dogs were planned for intensity-modulated radiation therapy using nine equally spaced coplanar X-ray beams, and doses were calculated with the anisotropic analytical algorithm. The same plan with the same monitor units was then recalculated using Acuros XB for comparison. Each dog's planning target volume was separated into air, bone, and tissue and evaluated. The mean dose to the planning target volume estimated by Acuros XB was 1.3% lower. It was 1.4% higher for air, 3.7% lower for bone, and 0.9% lower for tissue. The volume of the planning target volume covered by the prescribed dose decreased by 21% when Acuros XB was used, due to increased dose heterogeneity within the planning target volume. The anisotropic analytical algorithm relatively underestimates the dose heterogeneity and relatively overestimates the dose to the bone and tissue within the planning target volume for radiation therapy planning of canine intranasal tumors. This can be clinically significant, especially if tumor cells are present within the bone, because it may result in relative underdosing of the tumor. © 2017 American College of Veterinary Radiology.

  6. An assessment of the impact of FIA's default assumptions on the estimates of coarse woody debris volume and biomass

    Treesearch

    Vicente J. Monleon

    2009-01-01

    Currently, Forest Inventory and Analysis estimation procedures use Smalian's formula to compute coarse woody debris (CWD) volume and assume that logs lie horizontally on the ground. In this paper, the impact of those assumptions on volume and biomass estimates is assessed using 7 years of Oregon's Phase 2 data. Estimates of log volume computed using Smalian...
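
    For reference, Smalian's formula mentioned above computes the volume of a log of length L from the cross-sectional areas A_b and A_t at its two ends:

      V = \frac{L \, (A_b + A_t)}{2}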

  7. A novel scatter separation method for multi-energy x-ray imaging

    NASA Astrophysics Data System (ADS)

    Sossin, A.; Rebuffel, V.; Tabary, J.; Létang, J. M.; Freud, N.; Verger, L.

    2016-06-01

    X-ray imaging coupled with recently emerged energy-resolved photon counting detectors provides the ability to differentiate material components and to estimate their respective thicknesses. However, such techniques require highly accurate images. The presence of scattered radiation leads to a loss of spatial contrast and, more importantly, a bias in radiographic material imaging and artefacts in computed tomography (CT). The aim of the present study was to introduce and evaluate a partial attenuation spectral scatter separation approach (PASSSA) adapted for multi-energy imaging. This evaluation was carried out with the aid of numerical simulations provided by an internal simulation tool, Sindbad-SFFD. A simplified numerical thorax phantom placed in a CT geometry was used. The attenuation images and CT slices obtained from corrected data showed a remarkable increase in local contrast and internal structure detectability when compared to uncorrected images. Scatter induced bias was also substantially decreased. In terms of quantitative performance, the developed approach proved to be quite accurate as well. The average normalized root-mean-square error between the uncorrected projections and the reference primary projections was around 23%. The application of PASSSA reduced this error to around 5%. Finally, in terms of voxel value accuracy, an increase by a factor  >10 was observed for most inspected volumes-of-interest, when comparing the corrected and uncorrected total volumes.

  8. Continuing education: online monitoring of haemodialysis dose.

    PubMed

    Vartia, Aarne

    2018-01-25

    Kt/V urea reflects the efficacy of haemodialysis scaled to patient size (urea distribution volume). The guidelines recommend monthly Kt/V measurements based on blood samples. Modern haemodialysis machines are equipped with accessories that monitor the dose online at every session without extra costs, blood samples or computers. To describe the principles, devices, benefits and shortcomings of online monitoring of haemodialysis dose. A critical literature overview and discussion. UV absorbance methods measure Kt/V; ionic dialysance methods measure Kt (the product of clearance and treatment time, i.e. the cleared volume without scaling). Both are easy and useful methods, but comparison is difficult due to problems in scaling the dialysis dose to the patient's size. The best dose estimation method is the one which predicts quality of life and survival most accurately. There is some evidence on the predictive value of ionic dialysance Kt, but more documentation is required for the UV method. Online monitoring is a useful tool in everyday quality assurance, but blood samples are still required for more accurate kinetic modelling. After reading this article the reader should be able to: understand the elements of the Kt/V equation for dialysis dose; compare and contrast different methods of measurement of dialysis dose; and reflect on the importance of an adequate dialysis dose for patient survival and quality of life. © 2018 European Dialysis and Transplant Nurses Association/European Renal Care Association.
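
    As a point of comparison for the online estimates discussed above, blood-sample-based single-pool Kt/V is commonly computed with the second-generation Daugirdas formula; the sketch below implements that widely used formula (not a method from this article), and the numbers in the example call are hypothetical.

      import math

      def single_pool_ktv(pre_urea, post_urea, session_hours, uf_litres, post_weight_kg):
          """Second-generation Daugirdas single-pool Kt/V from pre/post blood urea:
          Kt/V = -ln(R - 0.008*t) + (4 - 3.5*R) * UF / W, with R = post/pre urea ratio,
          t the session length in hours, UF the ultrafiltration volume in litres and
          W the post-dialysis weight in kilograms."""
          r = post_urea / pre_urea
          return -math.log(r - 0.008 * session_hours) + (4 - 3.5 * r) * uf_litres / post_weight_kg

      print(single_pool_ktv(pre_urea=25.0, post_urea=8.0, session_hours=4.0,
                            uf_litres=2.0, post_weight_kg=70.0))  # hypothetical values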

  9. Optical Breast Shape Capture and Finite Element Mesh Generation for Electrical Impedance Tomography

    PubMed Central

    Forsyth, J.; Borsic, A.; Halter, R.J.; Hartov, A.; Paulsen, K.D.

    2011-01-01

    X-ray mammography is the standard for breast cancer screening. The development of alternative imaging modalities is desirable because mammograms expose patients to ionizing radiation. Electrical Impedance Tomography (EIT) may be used to determine tissue conductivity, a property which is an indicator of cancer presence. EIT is also a low-cost imaging solution and does not involve ionizing radiation. In breast EIT, impedance measurements are made using electrodes placed on the surface of the patient's breast. The complex conductivity of the volume of the breast is estimated by a reconstruction algorithm. EIT reconstruction is a severely ill-posed inverse problem. As a result, noisy instrumentation and incorrect modelling of the electrodes and domain shape produce significant image artefacts. In this paper, we propose a method that has the potential to reduce these errors by accurately modelling the patient breast shape. A 3D hand-held optical scanner is used to acquire the breast geometry and electrode positions. We develop methods for processing the data from the scanner and producing volume meshes that accurately match the breast surface and electrode locations, which can be used for image reconstruction. We demonstrate this method for a plaster breast phantom and a human subject. Using this approach will allow patient-specific finite element meshes to be generated, which has the potential to improve the clinical value of EIT for breast cancer diagnosis. PMID:21646711

  10. Fast decomposition of two ultrasound longitudinal waves in cancellous bone using a phase rotation parameter for bone quality assessment: Simulation study.

    PubMed

    Taki, Hirofumi; Nagatani, Yoshiki; Matsukawa, Mami; Kanai, Hiroshi; Izumi, Shin-Ichi

    2017-10-01

    Ultrasound signals that pass through cancellous bone may be considered to consist of two longitudinal waves, which are called fast and slow waves. Accurate decomposition of these fast and slow waves is considered to be highly beneficial in determining the characteristics of cancellous bone. In the present study, a fast decomposition method using a wave transfer function with a phase rotation parameter was applied to received signals that had passed through bovine bone specimens with various bone volume to total volume (BV/TV) ratios in a simulation study, where the elastic finite-difference time-domain method was used and the ultrasound wave propagated parallel to the bone axes. The proposed method succeeded in decomposing both fast and slow waves accurately; the normalized residual intensity was less than -19.5 dB when the specimen thickness ranged from 4 to 7 mm and the BV/TV value ranged from 0.144 to 0.226. There was a strong relationship between the phase rotation value and the BV/TV value. The ratio of the peak envelope amplitude of the decomposed fast wave to that of the slow wave increased monotonically with increasing BV/TV ratio, indicating the high performance of the proposed method in estimating the BV/TV value in cancellous bone.

  11. Filtering Raw Terrestrial Laser Scanning Data for Efficient and Accurate Use in Geomorphologic Modeling

    NASA Astrophysics Data System (ADS)

    Gleason, M. J.; Pitlick, J.; Buttenfield, B. P.

    2011-12-01

    Terrestrial laser scanning (TLS) represents a new and particularly effective remote sensing technique for investigating geomorphologic processes. Unfortunately, TLS data are commonly characterized by extremely large volume, heterogeneous point distribution, and erroneous measurements, raising challenges for applied researchers. To facilitate efficient and accurate use of TLS in geomorphology, and to improve accessibility for TLS processing in commercial software environments, we are developing a filtering method for raw TLS data to: eliminate data redundancy; produce a more uniformly spaced dataset; remove erroneous measurements; and maintain the ability of the TLS dataset to accurately model terrain. Our method conducts local aggregation of raw TLS data using a 3-D search algorithm based on the geometrical expression of expected random errors in the data. This approach accounts for the estimated accuracy and precision limitations of the instruments and procedures used in data collection, thereby allowing for identification and removal of potential erroneous measurements prior to data aggregation. Initial tests of the proposed technique on a sample TLS point cloud required a modest processing time of approximately 100 minutes to reduce dataset volume over 90 percent (from 12,380,074 to 1,145,705 points). Preliminary analysis of the filtered point cloud revealed substantial improvement in homogeneity of point distribution and minimal degradation of derived terrain models. We will test the method on two independent TLS datasets collected in consecutive years along a non-vegetated reach of the North Fork Toutle River in Washington. We will evaluate the tool using various quantitative, qualitative, and statistical methods. The crux of this evaluation will include a bootstrapping analysis to test the ability of the filtered datasets to model the terrain at roughly the same accuracy as the raw datasets.
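
    The filtering described above aggregates raw points inside a 3-D search neighbourhood shaped by the expected measurement errors. The sketch below substitutes a much simpler uniform voxel-grid thinning (one centroid per cubic cell) purely to illustrate the aggregation step that removes redundancy and evens out point density; it is not the authors' error-geometry-based algorithm, and the point cloud is synthetic.

      import numpy as np

      def voxel_thin(points, cell_size):
          """Keep one representative point (the centroid) per cubic cell of edge
          length cell_size; points is an (N, 3) array of x, y, z coordinates."""
          keys = np.floor(points / cell_size).astype(np.int64)
          _, inverse = np.unique(keys, axis=0, return_inverse=True)
          inverse = inverse.ravel()
          counts = np.bincount(inverse).astype(float)
          out = np.zeros((counts.size, 3))
          for dim in range(3):
              out[:, dim] = np.bincount(inverse, weights=points[:, dim]) / counts
          return out

      cloud = np.random.rand(100000, 3) * 50.0     # synthetic 50 m x 50 m x 50 m cloud
      print(voxel_thin(cloud, cell_size=0.5).shape)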

  12. A new class of accurate, mesh-free hydrodynamic simulation methods

    NASA Astrophysics Data System (ADS)

    Hopkins, Philip F.

    2015-06-01

    We present two new Lagrangian methods for hydrodynamics, in a systematic comparison with moving-mesh, smoothed particle hydrodynamics (SPH), and stationary (non-moving) grid methods. The new methods are designed to simultaneously capture advantages of both SPH and grid-based/adaptive mesh refinement (AMR) schemes. They are based on a kernel discretization of the volume coupled to a high-order matrix gradient estimator and a Riemann solver acting over the volume 'overlap'. We implement and test a parallel, second-order version of the method with self-gravity and cosmological integration, in the code GIZMO; this maintains exact mass, energy and momentum conservation; exhibits superior angular momentum conservation compared to all other methods we study; does not require 'artificial diffusion' terms; and allows the fluid elements to move with the flow, so resolution is automatically adaptive. We consider a large suite of test problems, and find that on all problems the new methods appear competitive with moving-mesh schemes, with some advantages (particularly in angular momentum conservation), at the cost of enhanced noise. The new methods have many advantages versus SPH: proper convergence, good capturing of fluid-mixing instabilities, dramatically reduced 'particle noise' and numerical viscosity, more accurate sub-sonic flow evolution, and sharp shock-capturing. Advantages versus non-moving meshes include: automatic adaptivity, dramatically reduced advection errors and numerical overmixing, velocity-independent errors, accurate coupling to gravity, good angular momentum conservation and elimination of 'grid alignment' effects. We can, for example, follow hundreds of orbits of gaseous discs, while AMR and SPH methods break down in a few orbits. However, fixed meshes minimize 'grid noise'. These differences are important for a range of astrophysical problems.

  13. Digital breast tomosynthesis geometry calibration

    NASA Astrophysics Data System (ADS)

    Wang, Xinying; Mainprize, James G.; Kempston, Michael P.; Mawdsley, Gordon E.; Yaffe, Martin J.

    2007-03-01

    Digital Breast Tomosynthesis (DBT) is a 3D x-ray technique for imaging the breast. The x-ray tube, mounted on a gantry, moves in an arc over a limited angular range around the breast while 7-15 images are acquired over a period of a few seconds. A reconstruction algorithm is used to create a 3D volume dataset from the projection images. This procedure reduces the effects of tissue superposition, often responsible for degrading the quality of projection mammograms. This may help improve sensitivity of cancer detection, while reducing the number of false positive results. For DBT, images are acquired at a set of gantry rotation angles. The image reconstruction process requires several geometrical factors associated with image acquisition to be known accurately, however, vibration, encoder inaccuracy, the effects of gravity on the gantry arm and manufacturing tolerances can produce deviations from the desired acquisition geometry. Unlike cone-beam CT, in which a complete dataset is acquired (500+ projections over 180°), tomosynthesis reconstruction is challenging in that the angular range is narrow (typically from 20°-45°) and there are fewer projection images (~7-15). With such a limited dataset, reconstruction is very sensitive to geometric alignment. Uncertainties in factors such as detector tilt, gantry angle, focal spot location, source-detector distance and source-pivot distance can produce several artifacts in the reconstructed volume. To accurately and efficiently calculate the location and angles of orientation of critical components of the system in DBT geometry, a suitable phantom is required. We have designed a calibration phantom for tomosynthesis and developed software for accurate measurement of the geometric parameters of a DBT system. These have been tested both by simulation and experiment. We will present estimates of the precision available with this technique for a prototype DBT system.

  14. The Application of Fractal and Multifractal Theory in Hydraulic-Flow-Unit Characterization and Permeability Estimation

    NASA Astrophysics Data System (ADS)

    Chen, X.; Yao, G.; Cai, J.

    2017-12-01

    Pore structure characteristics, such as pore-throat ratio, pore connectivity, size distribution and wettability, are important factors influencing the fluid transport behavior of porous media. To accurately characterize the diversity of pore structure among HFUs (hydraulic flow units), five samples selected from different HFUs (with approximately equal porosities but widely varying permeability) were chosen for micro-computerized tomography tests to acquire direct 3D images of pore geometries and for mercury injection experiments to obtain the pore volume-radius distribution. To characterize the complex and highly nonlinear pore structure of all samples, three classic fractal geometry models were applied. Results showed that each HFU has a similar box-counting fractal dimension and generalized fractal dimension in the number-area model, but there are significant differences in the multifractal spectra. In the radius-volume model, there are three obvious linear segments, corresponding to three fractal dimension values, and the middle one is shown to be the actual fractal dimension according to the maximum radius. In the number-radius model, the spherical-pore size distribution extracted by the maximum-ball algorithm shows a decrease in the number of small pores compared with the fractal power law, rather than the traditional linear law. Among the three models, only multifractal analysis can classify the HFUs accurately. Additionally, because reservoir rocks are tight and of low permeability, the connate water film on the inner surface of pore channels commonly forms bound water. The conventional model, known as Yu-Cheng's model, has been shown to be typically inapplicable. Considering the effect of irreducible water saturation, an improved fractal permeability model was also deduced theoretically. The comparison results showed that the improved model can be applied to calculate permeability directly and accurately in such unconventional rocks.

  15. Assessment of dedicated low-dose cardiac micro-CT reconstruction algorithms using the left ventricular volume of small rodents as a performance measure.

    PubMed

    Maier, Joscha; Sawall, Stefan; Kachelrieß, Marc

    2014-05-01

    Phase-correlated microcomputed tomography (micro-CT) imaging plays an important role in the assessment of mouse models of cardiovascular diseases and the determination of functional parameters such as the left ventricular volume. As the current gold standard, the phase-correlated Feldkamp reconstruction (PCF), shows poor performance in the case of low-dose scans, more sophisticated reconstruction algorithms have been proposed to enable low-dose imaging. In this study, the authors focus on the McKinnon-Bates (MKB) algorithm, the low dose phase-correlated (LDPC) reconstruction, and the high-dimensional total variation minimization reconstruction (HDTV) and investigate their potential to accurately determine the left ventricular volume at different dose levels from 50 to 500 mGy. The results were verified in phantom studies of a five-dimensional (5D) mathematical mouse phantom. Micro-CT data of eight mice, each administered with an x-ray dose of 500 mGy, were acquired, retrospectively gated for cardiac and respiratory motion, and reconstructed using PCF, MKB, LDPC, and HDTV. Dose levels down to 50 mGy were simulated by using only a fraction of the projections. Contrast-to-noise ratio (CNR) was evaluated as a measure of image quality. Left ventricular volume was determined using different segmentation algorithms (Otsu, level sets, region growing). Forward projections of the 5D mouse phantom were performed to simulate a micro-CT scan. The simulated data were processed the same way as the real mouse data sets. Compared to the conventional PCF reconstruction, the MKB, LDPC, and HDTV algorithms yield images of increased quality in terms of CNR. While the MKB reconstruction only provides small improvements, a significant increase of the CNR is observed in LDPC and HDTV reconstructions. The phantom studies demonstrate that left ventricular volumes can be determined accurately at 500 mGy. For lower dose levels, which were simulated for the real mouse data sets, the HDTV algorithm shows the best performance. At 50 mGy, the deviation from the reference obtained at 500 mGy was less than 4%. The LDPC algorithm also provides reasonable results, with deviations of less than 10% at 50 mGy, while the PCF and MKB reconstructions show larger deviations even at higher dose levels. LDPC and HDTV increase CNR and allow for quantitative evaluations even at dose levels as low as 50 mGy. The left ventricular volumes exemplarily illustrate that cardiac parameters can be accurately estimated at the lowest dose levels if sophisticated algorithms are used. This allows the dose to be reduced by a factor of 10 compared to today's gold standard and opens new options for longitudinal studies of the heart.

  16. Assessment of dedicated low-dose cardiac micro-CT reconstruction algorithms using the left ventricular volume of small rodents as a performance measure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maier, Joscha, E-mail: joscha.maier@dkfz.de; Sawall, Stefan; Kachelrieß, Marc

    2014-05-15

    Purpose: Phase-correlated microcomputed tomography (micro-CT) imaging plays an important role in the assessment of mouse models of cardiovascular diseases and the determination of functional parameters such as the left ventricular volume. As the current gold standard, the phase-correlated Feldkamp reconstruction (PCF), shows poor performance in the case of low-dose scans, more sophisticated reconstruction algorithms have been proposed to enable low-dose imaging. In this study, the authors focus on the McKinnon-Bates (MKB) algorithm, the low dose phase-correlated (LDPC) reconstruction, and the high-dimensional total variation minimization reconstruction (HDTV) and investigate their potential to accurately determine the left ventricular volume at different dose levels from 50 to 500 mGy. The results were verified in phantom studies of a five-dimensional (5D) mathematical mouse phantom. Methods: Micro-CT data of eight mice, each administered with an x-ray dose of 500 mGy, were acquired, retrospectively gated for cardiac and respiratory motion, and reconstructed using PCF, MKB, LDPC, and HDTV. Dose levels down to 50 mGy were simulated by using only a fraction of the projections. Contrast-to-noise ratio (CNR) was evaluated as a measure of image quality. Left ventricular volume was determined using different segmentation algorithms (Otsu, level sets, region growing). Forward projections of the 5D mouse phantom were performed to simulate a micro-CT scan. The simulated data were processed the same way as the real mouse data sets. Results: Compared to the conventional PCF reconstruction, the MKB, LDPC, and HDTV algorithms yield images of increased quality in terms of CNR. While the MKB reconstruction only provides small improvements, a significant increase of the CNR is observed in LDPC and HDTV reconstructions. The phantom studies demonstrate that left ventricular volumes can be determined accurately at 500 mGy. For lower dose levels, which were simulated for the real mouse data sets, the HDTV algorithm shows the best performance. At 50 mGy, the deviation from the reference obtained at 500 mGy was less than 4%. The LDPC algorithm also provides reasonable results, with deviations of less than 10% at 50 mGy, while the PCF and MKB reconstructions show larger deviations even at higher dose levels. Conclusions: LDPC and HDTV increase CNR and allow for quantitative evaluations even at dose levels as low as 50 mGy. The left ventricular volumes exemplarily illustrate that cardiac parameters can be accurately estimated at the lowest dose levels if sophisticated algorithms are used. This allows the dose to be reduced by a factor of 10 compared to today's gold standard and opens new options for longitudinal studies of the heart.

  17. Empirical Assessment of the Mean Block Volume of Rock Masses Intersected by Four Joint Sets

    NASA Astrophysics Data System (ADS)

    Morelli, Gian Luca

    2016-05-01

    The estimation of a representative value for the rock block volume (Vb) is of great interest in rock engineering for rock mass characterization purposes. However, while mathematical relationships to precisely estimate this parameter from the spacing of joints can be found in the literature for rock masses intersected by three dominant joint sets, corresponding relationships do not exist when more than three sets occur. In these cases, a consistent assessment of Vb can only be achieved by directly measuring the dimensions of several representative natural rock blocks in the field or by means of more sophisticated 3D numerical modeling approaches. However, Palmström's empirical relationship, based on the volumetric joint count Jv and on a block shape factor β, is commonly used in practice, although it is strictly valid only for rock masses intersected by three joint sets. Starting from these considerations, the present paper is primarily intended to investigate the reliability of a set of empirical relationships linking the block volume with the indexes most commonly used to characterize the degree of jointing in a rock mass (i.e. Jv and the mean value of the joint set spacings), specifically applicable to rock masses intersected by four sets of persistent discontinuities. Based on the analysis of artificial 3D block assemblies generated using the software AutoCAD, the most accurate best-fit regression was found between the mean block volume (Vbm) of the tested rock mass samples and the geometric mean value of the spacings of the joint sets delimiting the blocks, indicating this mean value as a promising parameter for the preliminary characterization of block size. Tests on field outcrops have demonstrated that the proposed empirical methodology has the potential to predict the mean block volume of multiple-set jointed rock masses with an accuracy acceptable for most practical rock engineering applications.
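
    Palmström's relationship referred to above, strictly valid for three joint sets, expresses the block volume through the volumetric joint count J_v (joints per cubic metre) and a block shape factor \beta:

      V_b = \beta \, J_v^{-3}

    The paper's contribution is an analogous empirical fit for four joint sets, built on the geometric mean of the joint set spacings rather than on this three-set form.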

  18. Water activity and mobility in solutions of glycerol and small molecular weight sugars: Implication for cryo- and lyopreservation

    NASA Astrophysics Data System (ADS)

    He, Xiaoming; Fowler, Alex; Toner, Mehmet

    2006-10-01

    In this study, the free volume models, originally developed for large molecular weight polymer-solvent systems, were used to study the water activity and mobility in solutions of four small molecular weight cryo-/lyoprotectants, viz., glycerol, a monosaccharide (fructose), and two disaccharides (sucrose and trehalose). The free volume model parameters were determined by fitting the models to available experimental data using a nonlinear optimization procedure. It was found that free volume models could accurately predict the available experimental data, which suggests that the free volume models might be generally applicable to aqueous solutions of small molecular weight cryo-/lyoprotectants. Furthermore, several models for estimating the mutual diffusion coefficient were tested using available experimental data for aqueous solutions of glycerol and a better method to estimate the mutual diffusion coefficient was proposed. Free volume models were used to predict and analyze the water activity and mobility in solutions of four cryo-/lyoprotectants under conditions frequently encountered in cryo-/lyopreservation applications. It was found that the water mobility in the glassy state of the above four solutions is essentially negligible in the case of cryopreservation with storage temperature lower than -110°C. However, the water mobility in a glass at higher temperature (>-80°C) may be significant. As a result, a subcooling of up to 50°C may be necessary for the long-term cryo-/lyopreservation of biomaterials depending on the water content and the type of cryo-/lyoprotectants. It was further shown that trehalose might be the best of the four protectants studied for lyopreservation (water mass fraction ⩽0.1) when the storage temperature is above the room temperature. The results from this study might be useful for the development of more effective protocols for both cryopreservation and lyopreservation of living cells and other biomaterials.

  19. An extension of the Saltykov method to quantify 3D grain size distributions in mylonites

    NASA Astrophysics Data System (ADS)

    Lopez-Sanchez, Marco A.; Llana-Fúnez, Sergio

    2016-12-01

    The estimation of 3D grain size distributions (GSDs) in mylonites is key to understanding the rheological properties of crystalline aggregates and to constraining dynamic recrystallization models. This paper investigates whether a common stereological method, the Saltykov method, is appropriate for the study of GSDs in mylonites. In addition, we present a new stereological method, named the two-step method, which estimates a lognormal probability density function describing the 3D GSD. Both methods are tested for reproducibility and accuracy using natural and synthetic data sets. The main conclusion is that both methods are accurate and simple enough to be systematically used in recrystallized aggregates with near-equant grains. The Saltykov method is particularly suitable for estimating the volume percentage of particular grain-size fractions with an absolute uncertainty of ±5 in the estimates. The two-step method is suitable for quantifying the shape of the actual 3D GSD in recrystallized rocks using a single value, the multiplicative standard deviation (MSD) parameter, and providing a precision in the estimate typically better than 5%. The novel method provides a MSD value in recrystallized quartz that differs from previous estimates based on apparent 2D GSDs, highlighting the inconvenience of using apparent GSDs for such tasks.

  20. Patient-specific dose calculations for pediatric CT of the chest, abdomen and pelvis

    PubMed Central

    Fraser, Nicholas D.; Carver, Diana E.; Pickens, David R.; Price, Ronald R.; Hernanz-Schulman, Marta; Stabin, Michael G.

    2015-01-01

    Background: Organ dose is essential for accurate estimates of patient dose from CT. Objective: To determine organ doses from a broad range of pediatric patients undergoing diagnostic chest-abdomen-pelvis CT and investigate how these relate to patient size. Materials and methods: We used a previously validated Monte Carlo simulation model of a Philips Brilliance 64 multi-detector CT scanner (Philips Healthcare, Best, The Netherlands) to calculate organ doses for 40 pediatric patients (M:F=21:19; range 0.6-17 years). Organ volumes and positions were determined from the images using standard segmentation techniques. Non-linear regression was performed to determine the relationship between volume CT dose index (CTDIvol)-normalized organ doses and abdominopelvic diameter. We then compared results with values obtained from independent studies. Results: We found that CTDIvol-normalized organ dose correlated strongly with exponentially decreasing abdominopelvic diameter (R2>0.8 for most organs). A similar relationship was determined for effective dose when normalized by dose-length product (R2=0.95). Our results agreed with previous studies within 12% using similar scan parameters (i.e. bowtie filter size, beam collimation); however results varied up to 25% when compared to studies using different bowtie filters. Conclusion: Our study determined that organ doses can be estimated from measurements of patient size, namely body diameter, and CTDIvol prior to CT examination. This information provides an improved method for patient dose estimation. PMID:26142256
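
    A minimal sketch of the size-based estimation described above: fit the reported exponential form dose/CTDIvol = a * exp(-b * diameter) and scale by the scanner-reported CTDIvol. The diameters, dose ratios and CTDIvol below are hypothetical values for illustration, not the study's data or coefficients.

      import numpy as np
      from scipy.optimize import curve_fit

      def fit_normalized_organ_dose(diameter_cm, organ_dose_over_ctdivol):
          """Fit dose/CTDIvol = a * exp(-b * abdominopelvic diameter) and return
          the coefficients together with the fitted model."""
          model = lambda d, a, b: a * np.exp(-b * d)
          (a, b), _ = curve_fit(model, diameter_cm, organ_dose_over_ctdivol, p0=(2.0, 0.05))
          return a, b, model

      d = np.array([12, 16, 20, 24, 28], dtype=float)            # hypothetical diameters (cm)
      y = 1.8 * np.exp(-0.04 * d) * (1 + 0.02 * np.random.randn(d.size))
      a, b, model = fit_normalized_organ_dose(d, y)
      ctdivol = 5.0                                               # mGy, hypothetical scan
      print(ctdivol * model(18.0, a, b))                          # estimated organ dose at 18 cm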
