Benefit-Cost Analysis of Integrated Paratransit Systems: Volume 6. Technical Appendices.
DOT National Transportation Integrated Search
1979-09-01
This final volume includes five technical appendices that document the methodologies used in the benefit-cost analysis. They are the following: Scenario analysis methodology; Impact estimation; Example of impact estimation; Sensitivity analysis; Agg...
Tunnel and Station Cost Methodology Volume II: Stations
DOT National Transportation Integrated Search
1981-01-01
The main objective of this study was to develop a model for estimating the cost of subway station and tunnel construction. This report describes a cost estimating methodology for subway tunnels that can be used by planners, designers, owners, and gov...
Market projections of cellulose nanomaterial-enabled products-- Part 2: Volume estimates
John Cowie; E.M. (Ted) Bilek; Theodore H. Wegner; Jo Anne Shatkin
2014-01-01
Nanocellulose has enormous potential to provide an important materials platform in numerous product sectors. This study builds on previous work by the same authors in which likely high-volume, low-volume, and novel applications for cellulosic nanomaterials were identified. In particular, this study creates a transparent methodology and estimates the potential annual...
A multifractal approach to space-filling recovery for PET quantification.
Willaime, Julien M Y; Aboagye, Eric O; Tsoumpas, Charalampos; Turkheimer, Federico E
2014-11-01
A new image-based methodology is developed for estimating the apparent space-filling properties of an object of interest in PET imaging without need for a robust segmentation step and used to recover accurate estimates of total lesion activity (TLA). A multifractal approach and the fractal dimension are proposed to recover the apparent space-filling index of a lesion (tumor volume, TV) embedded in nonzero background. A practical implementation is proposed, and the index is subsequently used with mean standardized uptake value (SUV mean) to correct TLA estimates obtained from approximate lesion contours. The methodology is illustrated on fractal and synthetic objects contaminated by partial volume effects (PVEs), validated on realistic (18)F-fluorodeoxyglucose PET simulations and tested for its robustness using a clinical (18)F-fluorothymidine PET test-retest dataset. TLA estimates were stable for a range of resolutions typical in PET oncology (4-6 mm). By contrast, the space-filling index and intensity estimates were resolution dependent. TLA was generally recovered within 15% of ground truth on postfiltered PET images affected by PVEs. Volumes were recovered within 15% variability in the repeatability study. Results indicated that TLA is a more robust index than other traditional metrics such as SUV mean or TV measurements across imaging protocols. The fractal procedure reported here is proposed as a simple and effective computational alternative to existing methodologies which require the incorporation of image preprocessing steps (i.e., partial volume correction and automatic segmentation) prior to quantification.
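The space-filling index at the heart of this approach can be illustrated with a standard box-counting estimate of fractal dimension. The sketch below is a minimal 2D illustration under hypothetical inputs, not the authors' implementation, which works on 3D PET volumes with nonzero background.

```python
import numpy as np

def box_count_dimension(mask, box_sizes=(2, 4, 8, 16, 32)):
    """Estimate a box-counting (fractal) dimension of a binary 2D mask.

    For each box size s, count the boxes containing any foreground
    pixel; the dimension is the slope of log N(s) versus log(1/s).
    """
    counts = []
    for s in box_sizes:
        # Trim the mask so it tiles exactly into s-by-s boxes.
        h, w = (mask.shape[0] // s) * s, (mask.shape[1] // s) * s
        tiled = mask[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(np.count_nonzero(tiled.any(axis=(1, 3))))
    slope, _ = np.polyfit(np.log(1.0 / np.array(box_sizes)),
                          np.log(np.array(counts)), 1)
    return slope

# Hypothetical example: a filled disk is space-filling in 2D (dimension ~2).
yy, xx = np.mgrid[:256, :256]
disk = (xx - 128) ** 2 + (yy - 128) ** 2 < 80 ** 2
print(box_count_dimension(disk))  # close to 2.0
```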
Ahlgren, André; Wirestam, Ronnie; Petersen, Esben Thade; Ståhlberg, Freddy; Knutsson, Linda
2014-09-01
Quantitative perfusion MRI based on arterial spin labeling (ASL) is hampered by partial volume effects (PVEs), arising due to voxel signal cross-contamination between different compartments. To address this issue, several partial volume correction (PVC) methods have been presented. Most previous methods rely on segmentation of a high-resolution T1-weighted morphological image volume that is coregistered to the low-resolution ASL data, making the result sensitive to errors in the segmentation and coregistration. In this work, we present a methodology for partial volume estimation and correction, using only low-resolution ASL data acquired with the QUASAR sequence. The methodology consists of a T1-based segmentation method, with no spatial priors, and a modified PVC method based on linear regression. The presented approach thus avoids prior assumptions about the spatial distribution of brain compartments, while also avoiding coregistration between different image volumes. Simulations based on a digital phantom as well as in vivo measurements in 10 volunteers were used to assess the performance of the proposed segmentation approach. The simulation results indicated that QUASAR data can be used for robust partial volume estimation, and this was confirmed by the in vivo experiments. The proposed PVC method yielded plausible perfusion maps, comparable to a reference method based on segmentation of a high-resolution morphological scan. Corrected gray matter (GM) perfusion was 47% higher than uncorrected values, suggesting a significant amount of PVEs in the data. Whereas the reference method failed to completely eliminate the dependence of perfusion estimates on the volume fraction, the novel approach produced GM perfusion values independent of GM volume fraction. The intra-subject coefficient of variation of corrected perfusion values was lowest for the proposed PVC method. As shown in this work, low-resolution partial volume estimation in connection with ASL perfusion estimation is feasible, and provides a promising tool for decoupling perfusion and tissue volume. Copyright © 2014 John Wiley & Sons, Ltd.
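The linear-regression flavor of PVC referenced here can be sketched as follows: within a small neighborhood around each voxel, the measured perfusion-weighted signal is modeled as a tissue-fraction-weighted sum of locally constant GM and WM magnetizations and solved by least squares. This is a generic illustration in the spirit of such methods, with hypothetical arrays and a single 2D slice; it is not the QUASAR-specific implementation.

```python
import numpy as np

def pvc_linear_regression(signal, gm_frac, wm_frac, kernel=5):
    """Per-voxel GM/WM signal estimates via local least squares.

    signal, gm_frac, wm_frac: 2D arrays (one slice) holding the ASL
    perfusion-weighted signal and the partial volume fractions.
    Within each kernel x kernel window, solve
        signal ~ gm_frac * m_gm + wm_frac * m_wm
    for the locally constant magnetizations m_gm, m_wm.
    """
    half = kernel // 2
    m_gm = np.zeros_like(signal, dtype=float)
    m_wm = np.zeros_like(signal, dtype=float)
    ny, nx = signal.shape
    for j in range(ny):
        for i in range(nx):
            js = slice(max(j - half, 0), min(j + half + 1, ny))
            is_ = slice(max(i - half, 0), min(i + half + 1, nx))
            A = np.column_stack([gm_frac[js, is_].ravel(),
                                 wm_frac[js, is_].ravel()])
            y = signal[js, is_].ravel()
            coef, *_ = np.linalg.lstsq(A, y, rcond=None)
            m_gm[j, i], m_wm[j, i] = coef
    return m_gm, m_wm
```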
Methodology update for estimating volume to service flow ratio.
DOT National Transportation Integrated Search
2015-12-01
Volume/service flow ratio (VSF) is calculated by the Highway Performance Monitoring System (HPMS) software as an indicator of peak hour congestion. It is an essential input to the Kentucky Transportation Cabinet's (KYTC) key planning applications, ...
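As a back-of-the-envelope illustration of the quantity being updated, VSF is the peak-hour volume divided by the service flow (capacity) of a segment. The field names and numbers below are hypothetical stand-ins, not HPMS data.

```python
def volume_service_flow_ratio(aadt, k_factor, service_flow):
    """Peak-hour congestion indicator: peak-hour volume / service flow.

    aadt: annual average daily traffic (veh/day)
    k_factor: share of daily traffic occurring in the peak hour (e.g., 0.09)
    service_flow: peak capacity of the segment (veh/h)
    """
    peak_hour_volume = aadt * k_factor
    return peak_hour_volume / service_flow

# Hypothetical segment: 30,000 veh/day, 9% peak share, 3,200 veh/h capacity.
print(round(volume_service_flow_ratio(30_000, 0.09, 3_200), 2))  # 0.84
```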
From field data to volumes: constraining uncertainties in pyroclastic eruption parameters
NASA Astrophysics Data System (ADS)
Klawonn, Malin; Houghton, Bruce F.; Swanson, Donald A.; Fagents, Sarah A.; Wessel, Paul; Wolfe, Cecily J.
2014-07-01
In this study, we aim to understand the variability in eruption volume estimates derived from field studies of pyroclastic deposits. We distributed paper maps of the 1959 Kīlauea Iki tephra to 101 volcanologists worldwide, who produced hand-drawn isopachs. Across the returned maps, uncertainty in isopach areas is 7% across the well-sampled deposit but increases to over 30% for isopachs that are governed by the largest and smallest thickness measurements. We fit the exponential, power-law, and Weibull functions through the isopach thickness versus area^(1/2) values and find volume estimate variations up to a factor of 4.9 for a single map. Across all maps and methodologies, we find an average standard deviation for a total volume of s = 29%. The volume uncertainties are largest for the most proximal (s = 62%) and distal field (s = 53%) and small for the densely sampled intermediate deposit (s = 8%). For the Kīlauea Iki 1959 eruption, we find that the deposit beyond the 5-cm isopach contains only 2% of the total erupted volume, whereas the near-source deposit contains 48% and the intermediate deposit 50% of the total volume. Thus, the relative uncertainty within each zone impacts the total volume estimates differently. The observed uncertainties for the different deposit regions in this study illustrate a fundamental problem of estimating eruption volumes: while some methodologies may provide better fits to the isopach data or rely on fewer free parameters, the main issue remains the predictive capabilities of the empirical functions for the regions where measurements are missing.
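For the exponential thinning law used here, the fitted relation T(x) = T₀·exp(-k·x), with x the square root of isopach area, integrates analytically to a total volume V = 2·T₀/k². The sketch below fits that law to hypothetical isopach data; it covers one of the three functional forms the study compares and is not a reproduction of the authors' code.

```python
import numpy as np

def exponential_tephra_volume(areas_km2, thicknesses_m):
    """Fit T(x) = T0 * exp(-k * x), with x = sqrt(isopach area), then
    integrate analytically: V = 2 * T0 / k**2. Returns volume in km3."""
    x = np.sqrt(np.asarray(areas_km2, dtype=float))   # km
    t = np.asarray(thicknesses_m, dtype=float) / 1000.0  # km
    # Linear fit of ln(T) versus sqrt(area): slope = -k, intercept = ln(T0).
    slope, intercept = np.polyfit(x, np.log(t), 1)
    k, t0 = -slope, np.exp(intercept)
    return 2.0 * t0 / k**2

# Hypothetical isopachs: 3 m over 2 km2 out to 5 cm over 400 km2.
areas = [2, 10, 50, 150, 400]        # km2 enclosed by each isopach
thick = [3.0, 1.2, 0.4, 0.12, 0.05]  # m
print(exponential_tephra_volume(areas, thick))  # km3
```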
Traffic volume estimation using network interpolation techniques.
DOT National Transportation Integrated Search
2013-12-01
The kriging method is a frequently used interpolation methodology in geography, which enables estimation of unknown values at certain places based on the distances among locations. When used in the transportation field, network distanc...
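The core idea, predicting an uncounted link's volume from nearby counts weighted by distance, can be shown with a simplified inverse-distance stand-in for kriging; full kriging would additionally fit a variogram to model spatial covariance. The distances here would be network (path) distances rather than straight-line ones, as the abstract suggests; all numbers are hypothetical.

```python
import numpy as np

def idw_estimate(network_dists, known_volumes, power=2.0):
    """Inverse-distance-weighted estimate of traffic volume at an
    uncounted location, given network distances (km) to counted
    locations. A simplified stand-in for kriging."""
    d = np.asarray(network_dists, dtype=float)
    v = np.asarray(known_volumes, dtype=float)
    w = 1.0 / d**power
    return float(np.sum(w * v) / np.sum(w))

# Hypothetical counted links at 1.2, 3.5, and 6.0 km along the network.
print(idw_estimate([1.2, 3.5, 6.0], [14_000, 9_500, 22_000]))
```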
Estimating Monetized Benefits of Groundwater Recharge from Stormwater Retention Practices
The purpose of the study is to inform valuation of groundwater recharge from stormwater retention in areas projected for new development and redevelopment. This study examined a simplified methodology for estimating recharge volume.
Improved estimates of partial volume coefficients from noisy brain MRI using spatial context.
Manjón, José V; Tohka, Jussi; Robles, Montserrat
2010-11-01
This paper addresses the problem of accurate voxel-level estimation of tissue proportions in the human brain magnetic resonance imaging (MRI). Due to the finite resolution of acquisition systems, MRI voxels can contain contributions from more than a single tissue type. The voxel-level estimation of this fractional content is known as partial volume coefficient estimation. In the present work, two new methods to calculate the partial volume coefficients under noisy conditions are introduced and compared with current similar methods. Concretely, a novel Markov Random Field model allowing sharp transitions between partial volume coefficients of neighbouring voxels and an advanced non-local means filtering technique are proposed to reduce the errors due to random noise in the partial volume coefficient estimation. In addition, a comparison was made to find out how the different methodologies affect the measurement of the brain tissue type volumes. Based on the obtained results, the main conclusions are that (1) both Markov Random Field modelling and non-local means filtering improved the partial volume coefficient estimation results, and (2) non-local means filtering was the better of the two strategies for partial volume coefficient estimation. Copyright 2010 Elsevier Inc. All rights reserved.
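Of the two strategies compared, the non-local means idea is simple to sketch: each voxel's value is replaced by a weighted average of voxels whose surrounding patches look similar, so edges are preserved while noise averages out. The sketch below is a minimal brute-force 2D version with hypothetical parameters, far from the optimized, MRI-adapted filter used in the paper.

```python
import numpy as np

def nlm_denoise(img, patch=3, search=7, h=0.1):
    """Brute-force non-local means on a 2D array.

    patch: odd patch width used to compare neighborhoods
    search: odd width of the search window around each pixel
    h: filtering strength (larger = smoother)
    """
    p, s = patch // 2, search // 2
    pad = np.pad(img, p + s, mode="reflect")
    out = np.zeros_like(img, dtype=float)
    for j in range(img.shape[0]):
        for i in range(img.shape[1]):
            jj, ii = j + p + s, i + p + s  # index into padded image
            ref = pad[jj - p:jj + p + 1, ii - p:ii + p + 1]
            weights, total = 0.0, 0.0
            for dj in range(-s, s + 1):
                for di in range(-s, s + 1):
                    cand = pad[jj + dj - p:jj + dj + p + 1,
                               ii + di - p:ii + di + p + 1]
                    w = np.exp(-np.mean((ref - cand) ** 2) / h**2)
                    weights += w
                    total += w * pad[jj + dj, ii + di]
            out[j, i] = total / weights
    return out

noisy = np.eye(32) + np.random.default_rng(0).normal(0, 0.1, (32, 32))
print(nlm_denoise(noisy).shape)  # (32, 32)
```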
Assessment of undiscovered oil and gas resources of the North Sakhalin Basin Province, Russia, 2011
Klett, T.R.; Schenk, Christopher J.; Wandrey, Craig J.; Charpentier, Ronald R.; Brownfield, Michael E.; Pitman, Janet K.; Pollastro, Richard M.; Cook, Troy A.; Tennyson, Marilyn E.
2011-01-01
Using a geology-based assessment methodology, the U.S. Geological Survey estimated volumes of undiscovered, technically recoverable, conventional petroleum resources for the North Sakhalin Basin Province of Russia. The mean volumes were estimated at 5.3 billion barrels of crude oil, 43.8 trillion cubic feet of natural gas, and 0.8 billion barrels of natural gas liquids.
1975-06-01
the Air Force Flight Dynamics Laboratory for use in conceptual and preliminary design phases of weapon system development. The methods are a...trade study method provides an iterative capability stemming from a direct interface with design synthesis programs. A detailed cost data base and...system for data expansion is provided. The methods are designed for ease in changing cost estimating relationships and estimating coefficients
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
The final report for the project is presented in five volumes. This volume, Detailed Methodology Review, presents a discussion of the methods considered and used to estimate the impacts of Outer Continental Shelf (OCS) oil and gas development on coastal recreation in California. The purpose is to provide the Minerals Management Service with data and methods to improve their ability to analyze the socio-economic impacts of OCS development. Chapter II provides a review of previous attempts to evaluate the effects of OCS development and of oil spills on coastal recreation. The review also discusses the strengths and weaknesses of different approaches and presents the rationale for the methodology selection made. Chapter III presents a detailed discussion of the methods actually used in the study. The volume contains the bibliography for the entire study.
Quantifying Standing Dead Tree Volume and Structural Loss with Voxelized Terrestrial Lidar Data
NASA Astrophysics Data System (ADS)
Popescu, S. C.; Putman, E.
2017-12-01
Standing dead trees (SDTs) are an important forest component and impact a variety of ecosystem processes, yet the carbon pool dynamics of SDTs are poorly constrained in terrestrial carbon cycling models. The ability to model wood decay and carbon cycling in relation to detectable changes in tree structure and volume over time would greatly improve such models. The overall objective of this study was to provide automated aboveground volume estimates of SDTs and automated procedures to detect, quantify, and characterize structural losses over time with terrestrial lidar data. The specific objectives of this study were: 1) develop an automated SDT volume estimation algorithm providing accurate volume estimates for trees scanned in dense forests; 2) develop an automated change detection methodology to accurately detect and quantify SDT structural loss between subsequent terrestrial lidar observations; and 3) characterize the structural loss rates of pine and oak SDTs in southeastern Texas. A voxel-based volume estimation algorithm, "TreeVolX", was developed and incorporates several methods designed to robustly process point clouds of varying quality levels. The algorithm operates on horizontal voxel slices by segmenting the slice into distinct branch or stem sections then applying an adaptive contour interpolation and interior filling process to create solid reconstructed tree models (RTMs). TreeVolX estimated large and small branch volume with an RMSE of 7.3% and 13.8%, respectively. A voxel-based change detection methodology was developed to accurately detect and quantify structural losses and incorporated several methods to mitigate the challenges presented by shifting tree and branch positions as SDT decay progresses. The volume and structural loss of 29 SDTs, composed of Pinus taeda and Quercus stellata, were successfully estimated using multitemporal terrestrial lidar observations over elapsed times ranging from 71 - 753 days. Pine and oak structural loss rates were characterized by estimating the amount of volumetric loss occurring in 20 equal-interval height bins of each SDT. Results showed that large pine snags exhibited more rapid structural loss in comparison to medium-sized oak snags in this study.
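The voxel-based volume idea underlying TreeVolX can be reduced to its simplest form: discretize the point cloud into fixed-size voxels and sum occupied voxels times the single-voxel volume. The snippet below shows only that counting step on a hypothetical point cloud; the published algorithm's contour interpolation and interior filling, which turn a surface shell into a solid model, are deliberately omitted.

```python
import numpy as np

def voxel_volume(points, voxel_size=0.02):
    """Approximate object volume (m3) from a lidar point cloud (meters):
    count unique occupied voxels and multiply by the voxel volume.
    No interior filling is performed, so this measures the shell only."""
    idx = np.floor(points / voxel_size).astype(np.int64)
    occupied = np.unique(idx, axis=0).shape[0]
    return occupied * voxel_size**3

# Hypothetical cloud: 10,000 points on a 0.3 m-radius, 1 m trunk section.
rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 10_000)
z = rng.uniform(0, 1.0, 10_000)
pts = np.column_stack([0.3 * np.cos(theta), 0.3 * np.sin(theta), z])
print(voxel_volume(pts))  # shell only; underestimates the solid volume
```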
IUS/TUG orbital operations and mission support study. Volume 5: Cost estimates
NASA Technical Reports Server (NTRS)
1975-01-01
The costing approach, methodology, and rationale utilized for generating cost data for composite IUS and space tug orbital operations are discussed. Summary cost estimates are given along with cost data initially derived for the IUS program and space tug program individually, and cost estimates for each work breakdown structure element.
Klett, T.R.; Schenk, Christopher J.; Wandrey, Craig J.; Charpentier, Ronald R.; Brownfield, Michael E.; Pitman, Janet K.; Pollastro, Richard M.; Cook, Troy A.; Tennyson, Marilyn E.
2012-01-01
Using a geology-based assessment methodology, the U.S. Geological Survey estimated volumes of undiscovered, technically recoverable, conventional petroleum resources for the southern Siberian craton provinces of Russia. The mean volumes were estimated at 3.0 billion barrels of crude oil, 63.3 trillion cubic feet of natural gas, and 1.2 billion barrels of natural gas liquids.
Patrick D. Miles; Andrew D. Hill
2010-01-01
The U.S. Forest Service's Forest Inventory and Analysis (FIA) program collects sample plot data on all forest ownerships across the United States. This report documents the methodology used to estimate live-tree gross, net, and sound volume for the 24 States inventoried by the Northern Research Station's (NRS) FIA unit. Sound volume is of particular interest...
Demougeot-Renard, Helene; De Fouquet, Chantal
2004-10-01
Assessing the volume of soil requiring remediation and the accuracy of this assessment constitutes an essential step in polluted site management. If this remediation volume is not properly assessed, misclassification may lead both to environmental risks (polluted soils may not be remediated) and financial risks (unexpected discovery of polluted soils may generate additional remediation costs). To minimize such risks, this paper proposes a geostatistical methodology based on stochastic simulations that allows the remediation volume and the uncertainty to be assessed using investigation data. The methodology thoroughly reproduces the conditions in which the soils are classified and extracted at the remediation stage. The validity of the approach is tested by applying it to the data collected during the investigation phase of a former lead smelting works and by comparing the results with the volume that has actually been remediated. This real remediated volume was composed of all the remediation units that were classified as polluted after systematic sampling and analysis during the clean-up stage. The volume estimated from the 75 samples collected during site investigation slightly overestimates (5.3% relative error) the remediated volume deduced from 212 remediation units.
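The decision logic of this approach can be caricatured in a few lines: given many equiprobable simulated concentration fields, classify each remediation unit as polluted when its probability of exceeding the threshold passes a chosen level, and report the spread of resulting volumes. The conditional simulations themselves (e.g., sequential Gaussian simulation conditioned on the samples) are assumed to exist already; everything below is a hypothetical stand-in, not the paper's case study.

```python
import numpy as np

def remediation_volume(simulations, threshold, unit_volume, p_cut=0.5):
    """simulations: array (n_realizations, n_units) of simulated
    concentrations per remediation unit. A unit is classified polluted
    when the fraction of realizations above `threshold` exceeds `p_cut`.
    Returns the classified volume plus its spread across realizations."""
    exceed = simulations > threshold
    p_polluted = exceed.mean(axis=0)
    classified = np.count_nonzero(p_polluted > p_cut) * unit_volume
    per_real = exceed.sum(axis=1) * unit_volume  # volume per realization
    return classified, per_real.mean(), per_real.std()

rng = np.random.default_rng(1)
sims = rng.lognormal(mean=5.0, sigma=1.0, size=(200, 212))  # hypothetical
print(remediation_volume(sims, threshold=300.0, unit_volume=25.0))
```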
Nguyen, Hien D; Ullmann, Jeremy F P; McLachlan, Geoffrey J; Voleti, Venkatakaushik; Li, Wenze; Hillman, Elizabeth M C; Reutens, David C; Janke, Andrew L
2018-02-01
Calcium is a ubiquitous messenger in neural signaling events. An increasing number of techniques are enabling visualization of neurological activity in animal models via luminescent proteins that bind to calcium ions. These techniques generate large volumes of spatially correlated time series. A model-based functional data analysis methodology via Gaussian mixtures is proposed for clustering data from such visualizations. The methodology is theoretically justified and a computationally efficient approach to estimation is suggested. An example analysis of a zebrafish imaging experiment is presented.
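The paper's functional clustering model is more elaborate, but the basic move, fitting a Gaussian mixture to per-pixel time series and reading cluster labels off the posterior, looks like the following sketch. The data are synthetic stand-ins, not the zebrafish experiment, and scikit-learn's generic mixture model replaces the authors' tailored estimator.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Synthetic stand-in for calcium-imaging data: 300 pixels, 50 time
# points, two latent activity patterns plus noise.
rng = np.random.default_rng(2)
t = np.linspace(0, 1, 50)
pattern_a = np.sin(2 * np.pi * 3 * t)
pattern_b = np.exp(-((t - 0.5) ** 2) / 0.01)
labels_true = rng.integers(0, 2, 300)
series = np.where(labels_true[:, None] == 0, pattern_a, pattern_b)
series = series + 0.3 * rng.standard_normal((300, 50))

# Cluster the time series with a two-component Gaussian mixture.
gmm = GaussianMixture(n_components=2, covariance_type="diag",
                      random_state=0).fit(series)
labels = gmm.predict(series)
print(np.bincount(labels))  # cluster sizes
```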
Estimating the volume of glaciers in the Himalayan-Karakoram region using different methods
NASA Astrophysics Data System (ADS)
Frey, H.; Machguth, H.; Huss, M.; Huggel, C.; Bajracharya, S.; Bolch, T.; Kulkarni, A.; Linsbauer, A.; Salzmann, N.; Stoffel, M.
2014-12-01
Ice volume estimates are crucial for assessing water reserves stored in glaciers. Due to its large glacier coverage, such estimates are of particular interest for the Himalayan-Karakoram (HK) region. In this study, different existing methodologies are used to estimate the ice reserves: three area-volume relations, one slope-dependent volume estimation method, and two ice-thickness distribution models are applied to a recent, detailed, and complete glacier inventory of the HK region, spanning the period 2000-2010 and revealing an ice coverage of 40,775 km². An uncertainty and sensitivity assessment is performed to investigate the influence of the observed glacier area and important model parameters on the resulting total ice volume. Results of the two ice-thickness distribution models are validated with local ice-thickness measurements at six glaciers. The resulting ice volumes for the entire HK region range from 2955 to 4737 km³, depending on the approach. This range is lower than most previous estimates. Results from the ice-thickness distribution models and the slope-dependent thickness estimations agree well with measured local ice thicknesses. However, total volume estimates from area-related relations are larger than those from other approaches. The study provides evidence on the significant effect of the selected method on results and underlines the importance of a careful and critical evaluation.
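One of the compared approaches, the area-volume relation, is a one-line power law V = c·A^γ applied per glacier and summed. The parameters below (c ≈ 0.034 for V in km³ and A in km², γ ≈ 1.375) are commonly cited literature values, not the calibrations tested in the study.

```python
def glacier_volume_km3(area_km2, c=0.034, gamma=1.375):
    """Volume-area scaling V = c * A**gamma (V in km3, A in km2).
    c and gamma are illustrative literature values."""
    return c * area_km2**gamma

# The scaling is nonlinear: many small glaciers hold far less ice
# than one large glacier of the same total area.
print(glacier_volume_km3(100.0))        # one 100 km2 glacier (~19 km3)
print(100 * glacier_volume_km3(1.0))    # a hundred 1 km2 glaciers (~3.4 km3)
```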
U.S. Geological Survey assessment of reserve growth outside of the United States
Klett, Timothy R.; Cook, Troy A.; Charpentier, Ronald R.; Tennyson, Marilyn E.; Le, Phuong A.
2015-12-21
The U.S. Geological Survey estimated volumes of technically recoverable, conventional petroleum resources resulting from reserve growth for discovered fields outside the United States that have reported in-place oil and gas volumes of 500 million barrels of oil equivalent or greater. The mean volumes of reserve growth were estimated at 665 billion barrels of crude oil; 1,429 trillion cubic feet of natural gas; and 16 billion barrels of natural gas liquids. These volumes constitute a significant portion of the world’s oil and gas resources and represent the potential future growth of current global reserves over time based on better assessment methodology, new technologies, and greater understanding of reservoirs.
Initial retrieval sequence and blending strategy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pemwell, D.L.; Grenard, C.E.
1996-09-01
This report documents the initial retrieval sequence and the methodology used to select it. Waste retrieval, storage, pretreatment, and vitrification were modeled for candidate single-shell tank retrieval sequences. Performance of the sequences was measured by a set of metrics (for example, high-level waste glass volume, relative risk, and schedule). Computer models were used to evaluate estimated glass volumes, process rates, retrieval dates, and blending strategy effects. The models were based on estimates of component inventories and concentrations, sludge wash factors and timing, retrieval annex limitations, etc.
A Compatible Stem Taper-Volume-Weight System For Intensively Managed Fast Growing Loblolly Pine
Yugia Zhang; Bruce E. Borders; Robert L Bailey
2002-01-01
A geometry-oriented methodology yielded a compatible taper-volume-weight system of models whose parameters were estimated using data from intensively managed loblolly pine (Pinus taeda L.) plantations in the lower coastal plain of Georgia. Data analysis showed that fertilization has significantly reduced taper (inside and outside bark) on the upper...
Army Training Study: Battalion Training Survey. Volumes 1 and 2.
1978-08-08
mathematical logic in the methodology. II. MAGNITUDE-ESTIMATION SCALING. A. General Description. A unique methodology, Magnitude-Estimation...to 142.) The base condition (represented in Tables IA, IIA, and IIIA
ERIC Educational Resources Information Center
Kaskowitz, David H.
The booklet provides detailed estimates on handicapping conditions for school-aged populations. The figures are intended to help the federal government validate state child count data as required by P.L. 94-142, the Education for All Handicapped Children Act. Section I covers the methodology used to arrive at the estimates, and it identifies the…
ERIC Educational Resources Information Center
Gutmanis, Ivars; And Others
The report presents the methodology used by the National Planning Association (NPA), under contract to the Federal Energy Administration (FEA), to estimate direct labor usage coefficients in some sixty different occupational categories involved in construction, operation, and maintenance of energy facilities. Volume 1 presents direct labor usage…
Detection and quantification of MS lesions using fuzzy topological principles
NASA Astrophysics Data System (ADS)
Udupa, Jayaram K.; Wei, Luogang; Samarasekera, Supun; Miki, Yukio; van Buchem, M. A.; Grossman, Robert I.
1996-04-01
Quantification of the severity of the multiple sclerosis (MS) disease through estimation of lesion volume via MR imaging is vital for understanding and monitoring the disease and its treatment. This paper presents a novel methodology and a system that can be routinely used for segmenting and estimating the volume of MS lesions via dual-echo spin-echo MR imagery. An operator indicates a few points in the images by pointing to the white matter, the gray matter, and the CSF. Each of these objects is then detected as a fuzzy connected set. The holes in the union of these objects correspond to potential lesion sites which are utilized to detect each potential lesion as a fuzzy connected object. These 3D objects are presented to the operator who indicates acceptance/rejection through the click of a mouse button. The volume of accepted lesions is then computed and output. Based on several evaluation studies and over 300 3D data sets that were processed, we conclude that the methodology is highly reliable and consistent, with a coefficient of variation (due to subjective operator actions) of less than 1.0% for volume.
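Fuzzy connectedness assigns each voxel the strength of its best path to a seed, where a path's strength is the weakest affinity along it; an object is the set of voxels whose strength exceeds a threshold. A minimal Dijkstra-style 2D version with a simple intensity-difference affinity is sketched below; the actual system's dual-echo affinities and 3D handling are more sophisticated.

```python
import heapq
import numpy as np

def fuzzy_connectedness(img, seed, sigma=0.1):
    """Max-min path strength from `seed` (row, col) to every pixel of
    the 2D array `img`. Neighbor affinity decays with intensity
    difference; path strength is the minimum affinity along the path."""
    strength = np.zeros(img.shape)
    strength[seed] = 1.0
    heap = [(-1.0, seed)]
    while heap:
        neg, (j, i) = heapq.heappop(heap)
        if -neg < strength[j, i]:
            continue  # stale heap entry
        for dj, di in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nj, ni = j + dj, i + di
            if 0 <= nj < img.shape[0] and 0 <= ni < img.shape[1]:
                affinity = np.exp(-abs(img[j, i] - img[nj, ni]) / sigma)
                s = min(-neg, affinity)  # path strength = weakest link
                if s > strength[nj, ni]:
                    strength[nj, ni] = s
                    heapq.heappush(heap, (-s, (nj, ni)))
    return strength

img = np.pad(np.ones((8, 8)), 12) + np.random.default_rng(3).normal(0, 0.02, (32, 32))
print((fuzzy_connectedness(img, (16, 16)) > 0.5).sum())  # object size
```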
Ragagnin, Marilia Nagata; Gorman, Daniel; McCarthy, Ian Donald; Sant'Anna, Bruno Sampaio; de Castro, Cláudio Campi; Turra, Alexander
2018-01-11
Obtaining accurate and reproducible estimates of internal shell volume is a vital requirement for studies into the ecology of a range of shell-occupying organisms, including hermit crabs. Shell internal volume is usually estimated by filling the shell cavity with water or sand, however, there has been no systematic assessment of the reliability of these methods and moreover no comparison with modern alternatives, e.g., computed tomography (CT). This study undertakes the first assessment of the measurement reproducibility of three contrasting approaches across a spectrum of shell architectures and sizes. While our results suggested a certain level of variability inherent for all methods, we conclude that a single measure using sand/water is likely to be sufficient for the majority of studies. However, care must be taken as precision may decline with increasing shell size and structural complexity. CT provided less variation between repeat measures but volume estimates were consistently lower compared to sand/water and will need methodological improvements before it can be used as an alternative. CT indicated volume may be also underestimated using sand/water due to the presence of air spaces visible in filled shells scanned by CT. Lastly, we encourage authors to clearly describe how volume estimates were obtained.
Ruscher-Hill, Brandi; Kirkham, Amy L.; Burns, Jennifer M.
2018-01-01
Body mass dynamics of animals can indicate critical associations between extrinsic factors and population vital rates. Photogrammetry can be used to estimate mass of individuals in species whose life histories make it logistically difficult to obtain direct body mass measurements. Such studies typically use equations to relate volume estimates from photogrammetry to mass; however, most fail to identify the sources of error between the estimated and actual mass. Our objective was to identify the sources of error that prevent photogrammetric mass estimation from directly predicting actual mass, and develop a methodology to correct this issue. To do this, we obtained mass, body measurements, and scaled photos for 56 sedated Weddell seals (Leptonychotes weddellii). After creating a three-dimensional silhouette in the image processing program PhotoModeler Pro, we used horizontal scale bars to define the ground plane, then removed the below-ground portion of the animal’s estimated silhouette. We then re-calculated body volume and applied an expected density to estimate animal mass. We compared the body mass estimates derived from this silhouette slice method with estimates derived from two other published methodologies: body mass calculated using photogrammetry coupled with a species-specific correction factor, and estimates using elliptical cones and measured tissue densities. The estimated mass values (mean ± standard deviation 345±71 kg for correction equation, 346±75 kg for silhouette slice, 343±76 kg for cones) were not statistically distinguishable from each other or from actual mass (346±73 kg) (ANOVA with Tukey HSD post-hoc, p>0.05 for all pairwise comparisons). We conclude that volume overestimates from photogrammetry are likely due to the inability of photo modeling software to properly render the ventral surface of the animal where it contacts the ground. Due to logistical differences between the “correction equation”, “silhouette slicing”, and “cones” approaches, researchers may find one technique more useful for certain study programs. In combination or exclusively, these three-dimensional mass estimation techniques have great utility in field studies with repeated measures sampling designs or where logistic constraints preclude weighing animals. PMID:29320573
NASA Technical Reports Server (NTRS)
Lummus, J. R.; Joyce, G. T.; Omalley, C. D.
1980-01-01
An evaluation of current prediction methodologies to estimate the aerodynamic uncertainties identified for the E205 configuration is presented. This evaluation was accomplished by comparing predicted and wind tunnel test data in three major categories: untrimmed longitudinal aerodynamics; trimmed longitudinal aerodynamics; and lateral-directional aerodynamic characteristics.
NASA Technical Reports Server (NTRS)
Frassinelli, G. J.
1972-01-01
Cost estimates and funding schedules are presented for a given configuration and costing ground rules. Cost methodology is described and the cost evolution from a baseline configuration to a selected configuration is given, emphasizing cases in which cost was a design driver. Programmatic cost avoidance techniques are discussed.
A Novel Application for the Cavalieri Principle: A Stereological and Methodological Study
Altunkaynak, Berrin Zuhal; Altunkaynak, Eyup; Unal, Deniz; Unal, Bunyamin
2009-01-01
Objective: The Cavalieri principle was applied to consecutive pathology sections that were photographed at the same magnification and used to estimate tissue volumes via superimposing a point counting grid on these images. The goal of this study was to perform the Cavalieri method quickly and practically. Materials and Methods: In this study, 10 adult female Sprague Dawley rats were used. Brain tissue was removed and sampled both systematically and randomly. Brain volumes were estimated using two different methods. First, all brain slices were scanned with an HP ScanJet 3400C scanner, and their images were shown on a PC monitor. Brain volume was then calculated based on these images. Second, all brain slices were photographed at 10× magnification with a microscope camera, and brain volumes were estimated based on these micrographs. Results: There was no statistically significant difference between the volume measurements of the two techniques (P>0.05; paired samples t-test). Conclusion: This study demonstrates that personal computer scanning of serial tissue sections allows for easy and reliable volume determination based on the Cavalieri method. PMID:25610077
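The Cavalieri estimator itself is one line: V = t · (a/p) · ΣP, where t is the section interval, a/p the area associated with each grid point, and ΣP the total number of points hitting the structure across all sections. A worked version under hypothetical counts:

```python
def cavalieri_volume(points_per_section, section_interval_mm,
                     area_per_point_mm2):
    """V = t * (a/p) * sum(P): section interval times area per grid
    point times total points counted over all sections."""
    return section_interval_mm * area_per_point_mm2 * sum(points_per_section)

# Hypothetical rat-brain example: 10 sections 1.2 mm apart, a 2 x 2 mm
# point grid (4 mm2 per point), with these counts per section.
counts = [18, 31, 42, 47, 49, 46, 40, 33, 24, 12]
print(cavalieri_volume(counts, 1.2, 4.0), "mm3")  # 1641.6 mm3
```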
Bivariate Heritability of Total and Regional Brain Volumes: the Framingham Study
DeStefano, Anita L.; Seshadri, Sudha; Beiser, Alexa; Atwood, Larry D.; Massaro, Joe M.; Au, Rhoda; Wolf, Philip A.; DeCarli, Charles
2009-01-01
Heritability and genetic and environmental correlations of total and regional brain volumes were estimated from a large, generally healthy, community-based sample, to determine if there are common elements to the genetic influence of brain volumes and white matter hyperintensity volume. There were 1538 Framingham Heart Study participants with brain volume measures from quantitative magnetic resonance imaging (MRI) who were free of stroke and other neurological disorders that might influence brain volumes and who were members of families with at least two Framingham Heart Study participants. Heritability was estimated using variance component methodology and adjusting for the components of the Framingham stroke risk profile. Genetic and environmental correlations between traits were obtained from bivariate analysis. Heritability estimates ranging from 0.46 to 0.60, were observed for total brain, white matter hyperintensity, hippocampal, temporal lobe, and lateral ventricular volumes. Moderate, yet significant, heritability was observed for the other measures. Bivariate analyses demonstrated that relationships between brain volume measures, except for white matter hyperintensity, reflected both moderate to strong shared genetic and shared environmental influences. This study confirms strong genetic effects on brain and white matter hyperintensity volumes. These data extend current knowledge by showing that these two different types of MRI measures do not share underlying genetic or environmental influences. PMID:19812462
Assessment of undiscovered conventional oil and gas resources of Thailand
Schenk, Chris
2011-01-01
The U.S. Geological Survey estimated mean volumes of 1.6 billion barrels of undiscovered conventional oil and 17 trillion cubic feet of undiscovered conventional natural gas in three geologic provinces of Thailand using a geology-based methodology. Most of the undiscovered conventional oil and gas resource is estimated to be in the area known as offshore Thai Basin province.
Development of estimation methodology for bicycle and pedestrian volumes based on existing counts.
DOT National Transportation Integrated Search
2013-10-01
The Colorado Department of Transportation (CDOT) adopted the Bicycle and Pedestrian Policy directive in 2009, stating that "...the needs of bicyclists and pedestrians shall be included in the planning, design, and operation of transportation facil...
NASA Astrophysics Data System (ADS)
Ebrahim, Girma Y.; Villholth, Karen G.
2016-10-01
Groundwater is an important resource for multiple uses in South Africa. Hence, setting limits to its sustainable abstraction while assuring basic human needs is required. Due to prevalent data scarcity related to groundwater replenishment, which is the traditional basis for estimating groundwater availability, the present article presents a novel method for determining allocatable groundwater in quaternary (fourth-order) catchments through information on streamflow. Building on established methodologies for assessing baseflow, recession flow, and instream ecological flow requirements, a combined stepwise methodology is developed to determine the annual available groundwater storage volume using linear reservoir theory, essentially linking low flows proportionally to upstream groundwater storages. The approach was trialled for twenty-one perennial and relatively undisturbed catchments with long-term and reliable streamflow records. Using the Desktop Reserve Model, instream flow requirements necessary to meet the present ecological state of the streams were determined, and baseflows in excess of these flows were converted into conservative estimates of allocatable groundwater storage on an annual basis. Results show that groundwater development potential exists in fourteen of the catchments, with upper limits to allocatable groundwater volumes (including present uses) ranging from 0.02 to 3.54 × 10⁶ m³ a⁻¹ (0.10-11.83 mm a⁻¹) per catchment. With the availability of these volumes secured in 75% of years, variability between years is assumed to be manageable. A significant (R² = 0.88) correlation between baseflow index and the drainage time scale for the catchments underscores the physical basis of the methodology and also enables the reduction of the procedure by one step, omitting recession flow analysis. The method serves as an important complementary tool for the assessment of the groundwater part of the Reserve and the Groundwater Resource Directed Measures in South Africa and could be adapted and applied elsewhere.
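The linear-reservoir link between low flows and storage that the method exploits is S = k·Q, with recession Q(t) = Q₀·e^(-t/k); an allocatable volume then follows from the baseflow in excess of the ecological requirement. The sketch below is schematic, with hypothetical numbers, and collapses the paper's stepwise procedure into its two central relations.

```python
import numpy as np

def recession_timescale(q):
    """Estimate the linear-reservoir time scale k (days) from a daily
    recession segment: dQ/dt = -Q/k, so regress Q on -dQ/dt."""
    dqdt = np.diff(q)                     # per-day flow change
    qmid = (q[1:] + q[:-1]) / 2.0
    k, *_ = np.linalg.lstsq(-dqdt[:, None], qmid, rcond=None)
    return float(k[0])

# Hypothetical recession: Q0 = 2.0 m3/s decaying with k = 45 days.
t = np.arange(60)
q = 2.0 * np.exp(-t / 45.0)
k = recession_timescale(q)
storage = k * 86_400 * q[-1]              # S = k*Q, in m3
eco_flow = 0.3                            # m3/s ecological requirement
allocatable = max(q[-1] - eco_flow, 0.0) * k * 86_400
print(round(k, 1), round(storage), round(allocatable))
```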
NASA Technical Reports Server (NTRS)
1990-01-01
Cost estimates for phase C/D of the laser atmospheric wind sounder (LAWS) program are presented. This information provides a framework for cost, budget, and program planning estimates for LAWS. Volume 3 is divided into three sections. Section 1 details the approach taken to produce the cost figures, including the assumptions regarding the schedule for phase C/D and the methodology and rationale for costing the various work breakdown structure (WBS) elements. Section 2 shows a breakdown of the cost by WBS element, with the cost divided into non-recurring and recurring expenditures. Note that throughout this volume the cost is given in 1990 dollars, with bottom-line totals also expressed in 1988 dollars (1 dollar (1988) = 0.931 dollars (1990)). Section 3 shows a breakdown of the cost by year. The WBS and WBS dictionary are included as an attachment to this report.
Projected electric power demands for the Potomac Electric Power Company. Volume 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Estomin, S.; Kahal, M.
1984-03-01
This three-volume report presents the results of an econometric forecast of peak and electric power demands for the Potomac Electric Power Company (PEPCO) through the year 2002. Volume I describes the methodology, the results of the econometric estimations, the forecast assumptions, and the calculated forecasts of peak demand and energy usage. Separate sets of models were developed for the Maryland suburbs (Montgomery and Prince George's counties), the District of Columbia, and Southern Maryland (served by a wholesale customer of PEPCO). For each of the three jurisdictions, energy equations were estimated for residential and commercial/industrial customers for both summer and winter seasons. For the District of Columbia, summer and winter equations for energy sales to the federal government were also estimated. Equations were also estimated for street lighting and energy losses. Noneconometric techniques were employed to forecast energy sales to the Northern Virginia suburbs, Metrorail, and federal government facilities located in Maryland.
The real estate factor: quantifying the impact of infarct location on stroke severity.
Menezes, Nina M; Ay, Hakan; Wang Zhu, Ming; Lopez, Chloe J; Singhal, Aneesh B; Karonen, Jari O; Aronen, Hannu J; Liu, Yawu; Nuutinen, Juho; Koroshetz, Walter J; Sorensen, A Gregory
2007-01-01
The severity of the neurological deficit after ischemic stroke is moderately correlated with infarct volume. In the current study, we sought to quantify the impact of location on neurological deficit severity and to delineate this impact from that of volume. We developed atlases consisting of location-weighted values indicating the relative importance, in terms of neurological deficit severity, of every voxel of the brain. These atlases were applied to 80 first-ever ischemic stroke patients to produce estimates of clinical deficit severity. Each patient had an MRI and National Institutes of Health Stroke Scale (NIHSS) examination just before or soon after hospital discharge. The correlation between the location-based deficit predictions and the measured neurological deficit (NIHSS) scores was compared with the correlation obtained using volume alone to predict the neurological deficit. Volume-based estimates of neurological deficit severity were only moderately correlated with measured NIHSS scores (r=0.62). The combination of volume and location resulted in a significantly better correlation with clinical deficit severity (r=0.79, P=0.032). The atlas methodology is a feasible way of integrating infarct size and location to predict stroke severity. It can estimate stroke severity better than volume alone.
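The atlas idea reduces to a weighted overlap: each voxel carries a weight reflecting how much damage there typically contributes to the NIHSS, and a lesion's predicted severity combines those weights with its volume. A schematic version, with entirely hypothetical weights and lesion, is sketched below; the study's atlas construction and regression are more involved.

```python
import numpy as np

def predicted_severity(lesion_mask, atlas_weights, voxel_volume_ml):
    """Combine infarct location and size: sum the atlas weight over
    lesioned voxels, and report the lesion volume alongside it."""
    location_score = float(atlas_weights[lesion_mask].sum())
    volume_ml = lesion_mask.sum() * voxel_volume_ml
    return location_score, volume_ml

# Hypothetical 3D atlas (importance per voxel) and box-shaped lesion.
rng = np.random.default_rng(4)
atlas = rng.random((32, 32, 32)) * 0.01
lesion = np.zeros((32, 32, 32), dtype=bool)
lesion[10:14, 8:16, 12:20] = True
print(predicted_severity(lesion, atlas, voxel_volume_ml=0.008))
```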
Assessment of undiscovered, technically recoverable oil and gas resources of Armenia, 2014
Klett, Timothy R.; Schenk, Christopher J.; Wandrey, Craig J.; Brownfield, Michael E.; Charpentier, Ronald R.; Tennyson, Marilyn E.; Gautier, Donald L.
2014-01-01
Using a geology-based assessment methodology, the U.S. Geological Survey estimated mean volumes of 1 million barrels of undiscovered, technically recoverable conventional oil and 6 billion cubic feet of undiscovered, technically recoverable conventional natural gas in Armenia.
Assessment of Undiscovered Oil and Gas Resources of the Red Sea Basin Province
2010-01-01
The U.S. Geological Survey estimated mean volumes of 5 billion barrels of undiscovered technically recoverable oil and 112 trillion cubic feet of recoverable gas in the Red Sea Basin Province using a geology-based assessment methodology.
NASA Astrophysics Data System (ADS)
Tiwari, Vaibhav
2018-07-01
The population analysis and estimation of merger rates of compact binaries is one of the important topics in gravitational wave astronomy. The primary ingredient in these analyses is the population-averaged sensitive volume. Typically, the sensitive volume of a given search to a given simulated source population is estimated by drawing signals from the population model and adding them to the detector data as injections. Subsequently, injections, which are simulated gravitational waveforms, are searched for by the search pipelines and their signal-to-noise ratio (SNR) is determined. Sensitive volume is estimated, by using Monte-Carlo (MC) integration, from the total number of injections added to the data, the number of injections that cross a chosen threshold on SNR, and the astrophysical volume in which the injections are placed. So far, only fixed population models have been used in the estimation of binary black hole (BBH) merger rates. However, as the scope of population analysis broadens in terms of the methodologies and source properties considered, due to an increase in the number of observed gravitational wave (GW) signals, the procedure will need to be repeated multiple times at a large computational cost. In this letter we address the problem by performing a weighted MC integration. We show how a single set of generic injections can be weighted to estimate the sensitive volume for multiple population models, thereby greatly reducing the computational cost. The weights in this MC integral are the ratios of the output probabilities, determined by the population model and standard cosmology, and the injection probability, determined by the distribution function of the generic injections. Unlike analytical/semi-analytical methods, which usually estimate sensitive volume using single-detector sensitivity, the method is accurate within statistical errors, comes at no added cost, and requires minimal computational resources.
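The central trick is a change of measure inside the Monte Carlo volume integral: with injections drawn from a broad distribution p_inj, the sensitive volume under any target population p_pop is V ≈ (V_tot/N)·Σ w_i·1[SNR_i > ρ*] with weights w_i = p_pop(θ_i)/p_inj(θ_i). A toy one-parameter version (mass only, fixed comoving volume, made-up detectability) is sketched below; the real analysis weights over many parameters and folds in cosmology.

```python
import numpy as np

rng = np.random.default_rng(5)
N, V_TOT = 100_000, 10.0          # number of injections; volume they populate

# Generic injections: masses drawn broadly, uniform over [5, 100] Msun.
m = rng.uniform(5.0, 100.0, N)
p_inj = np.full(N, 1.0 / 95.0)

# Toy detectability: heavier binaries are louder; pass/fail on an SNR proxy.
snr = 8.0 * (m / 40.0) ** (5.0 / 6.0) * rng.uniform(0.2, 1.8, N)
found = snr > 8.0

def sensitive_volume(p_pop):
    """Weighted MC estimate: V ~ (V_tot/N) * sum_i w_i * found_i with
    w_i = p_pop(m_i) / p_inj(m_i). The same injection set is reused
    for any population model."""
    w = p_pop(m) / p_inj
    return V_TOT * np.sum(w * found) / N

# Two population models evaluated from the SAME injections.
norm = (5.0 ** -1.35 - 100.0 ** -1.35) / 1.35   # normalizes x**-2.35 on [5, 100]
powerlaw = lambda x: x ** -2.35 / norm
flat = lambda x: np.full_like(x, 1.0 / 95.0)
print(sensitive_volume(flat), sensitive_volume(powerlaw))
```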
DOE Office of Scientific and Technical Information (OSTI.GOV)
Belzer, D.B.; Serot, D.E.; Kellogg, M.A.
1991-03-01
Development of integrated mobilization preparedness policies requires planning estimates of available productive capacity during national emergency conditions. Such estimates must be developed in a manner that allows evaluation of current trends in capacity and the consideration of uncertainties in various data inputs and in engineering assumptions. This study developed estimates of emergency operating capacity (EOC) for 446 manufacturing industries at the 4-digit Standard Industrial Classification (SIC) level of aggregation and for 24 key nonmanufacturing sectors. This volume lays out the general concepts and methods used to develop the emergency operating estimates. The historical analysis of capacity extends from 1974 through 1986. Some nonmanufacturing industries are included. In addition to mining and utilities, key industries in transportation, communication, and services were analyzed. Physical capacity and efficiency of production were measured. 3 refs., 2 figs., 12 tabs. (JF)
On-time reliability impacts of ATIS. Volume III, Implications for ATIS investment strategies
DOT National Transportation Integrated Search
2003-05-01
The effect of ATIS accuracy and the extent of ATIS roadway instrumentation on the on-time reliability benefits to routine users of ATIS are evaluated through the application of the Heuristic On-line Web-linked Arrival Time Estimation (HOWLATE) methodology. T...
GEOSTATISTICAL INTERPOLATION OF CHEMICAL CONCENTRATION. (R825689C037)
Measurements of contaminant concentration at a hazardous waste site typically vary over many orders of magnitude and have highly skewed distributions. This work presents a practical methodology for the estimation of solute concentration contour maps and volume...
Application of Control Volume Analysis to Cerebrospinal Fluid Dynamics
NASA Astrophysics Data System (ADS)
Wei, Timothy; Cohen, Benjamin; Anor, Tomer; Madsen, Joseph
2011-11-01
Hydrocephalus is among the most common birth defects and can be neither prevented nor cured. Afflicted individuals face serious issues, which at present are too complicated and not well enough understood to treat via systematic therapies. This talk outlines the framework and application of a control volume methodology to clinical Phase Contrast MRI data. Specifically, integral control volume analysis utilizes a fundamental fluid dynamics methodology to quantify intracranial dynamics within a precise, direct, and physically meaningful framework. A chronically shunted, hydrocephalic patient in need of a revision procedure was used as an in vivo case study. Magnetic resonance velocity measurements within the patient's aqueduct were obtained in four biomedical states and were analyzed using the methods presented here. Pressure force estimates were obtained, showing distinct differences in amplitude, phase, and waveform shape for different intracranial states within the same individual. Thoughts on the physiological and diagnostic research and development implications/opportunities will be presented.
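In integral form, the momentum balance over a control volume enclosing the aqueduct reads F ≈ ρ·d/dt ∫u dV plus net momentum flux terms; with phase-contrast MRI giving the lumen-averaged velocity waveform u(t) over a segment of area A and length L, the unsteady term alone gives a first-cut pressure-force estimate. The calculation below is schematic, with hypothetical geometry and waveform, and keeps only that unsteady term.

```python
import numpy as np

RHO = 1007.0     # CSF density, kg/m3 (approximate)
A = 4.0e-6       # aqueduct lumen area, m2 (hypothetical)
L = 0.012        # control volume length, m (hypothetical)

# Hypothetical PC-MRI waveform: mean velocity over one cardiac cycle.
t = np.linspace(0.0, 1.0, 40)           # s
u = 0.03 * np.sin(2 * np.pi * t)        # m/s, oscillatory CSF flow

# Unsteady term of the integral momentum balance:
# F(t) ~ rho * d/dt (integral of u over the CV) = rho * A * L * du/dt.
force = RHO * A * L * np.gradient(u, t)  # newtons
print(f"peak unsteady pressure force ~ {np.max(np.abs(force)):.2e} N")
```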
NASA Astrophysics Data System (ADS)
Ryu, B. Y.; Jung, H. J.; Bae, S. H.; Choi, C. U.
2013-12-01
CO2 emissions on roads in urban centers substantially affect global warming. It is important to quantify CO2 emissions at the link level in order to reduce these emissions on the roads. Therefore, in this study, we utilized real-time traffic data and developed a methodology for estimating CO2 emissions per link unit. Because of the recent development of vehicle-to-infrastructure (V2I) communication technology, data from probe vehicles (PVs) can be collected and speed per link unit can be calculated. Among the existing emission calculation methodologies, mesoscale modeling, which is a representative modeling measurement technique, requires speed and traffic data per link unit. As it is not feasible to install fixed detectors at every link for traffic data collection, in this study we developed a model for traffic volume estimation by utilizing the number of PVs that can be additionally collected when the PV data are collected. Multiple linear regression and an artificial neural network (ANN) were used for estimating the traffic volume. The independent variables and input data for each model are the number of PVs, the travel time index (TTI), the number of lanes, and time slots. The results from the traffic volume estimation model show that the mean absolute percentage error (MAPE) of the ANN is 18.67%, indicating that it is the more effective approach. The ANN-based traffic volume estimation served as the basis for the calculation of emissions per link unit. The daily average emissions for Daejeon, where this study was based, were 2210.19 ton/day. By vehicle type, passenger cars accounted for 71.28% of the total emissions. By road, Gyeryongro emitted 125.48 ton/day, accounting for 5.68% of the total emissions, the highest percentage of all roads. In terms of emissions per kilometer, Hanbatdaero had the highest emission volume, with 7.26 ton/day/km on average. This study shows that real-time traffic data allow emissions to be estimated at the link level. Furthermore, an analysis of CO2 emissions can support traffic management in making decisions related to the reduction of carbon emissions.
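The estimation step pairs naturally with a MAPE check: regress link volume on the probe-vehicle count, the travel time index, and lane count, then score with MAPE = mean(|y − ŷ|/y). The sketch below uses a linear-regression stand-in for the paper's ANN on synthetic data with hypothetical coefficients.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 500
pv = rng.poisson(30, n)               # probe vehicles observed on the link
tti = rng.uniform(1.0, 2.5, n)        # travel time index
lanes = rng.integers(1, 5, n)         # number of lanes
# Synthetic "true" link volume (veh/h) with noise.
volume = 40 * pv + 300 * lanes - 150 * (tti - 1) + rng.normal(0, 120, n)

# Linear-regression stand-in for the ANN used in the study.
X = np.column_stack([pv, tti, lanes, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, volume, rcond=None)
pred = X @ coef

mape = np.mean(np.abs((volume - pred) / volume)) * 100
print(f"MAPE: {mape:.2f}%")
```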
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
Transportation Sector Model of the National Energy Modeling System. Volume 2 -- Appendices: Part 2
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
The attachments contained within this appendix provide additional details about the model development and estimation process which do not easily lend themselves to incorporation in the main body of the model documentation report. The information provided in these attachments is not integral to the understanding of the model's operation, but provides the reader with an opportunity to gain a deeper understanding of some of the model's underlying assumptions. There will be a slight degree of replication of materials found elsewhere in the documentation, made unavoidable by the dictates of internal consistency. Each attachment is associated with a specific component of the transportation model; the presentation follows the same sequence of modules employed in Volume 1. The following attachments are contained in Appendix F: Fuel Economy Model (FEM)--provides a discussion of the FEM vehicle demand and performance by size class models; Alternative Fuel Vehicle (AFV) Model--describes data input sources and extrapolation methodologies; Light-Duty Vehicle (LDV) Stock Model--discusses the fuel economy gap estimation methodology; Light-Duty Vehicle Fleet Model--presents the data development for business, utility, and government fleet vehicles; Light Commercial Truck Model--describes the stratification methodology and data sources employed in estimating the stock and performance of LCTs; Air Travel Demand Model--presents the derivation of the demographic index, used to modify estimates of personal travel demand; and Airborne Emissions Model--describes the derivation of emissions factors used to associate transportation measures to levels of airborne emissions of several pollutants.
2012-01-01
Using a performance-based geologic assessment methodology, the U.S. Geological Survey estimated a technically recoverable mean volume of 6.1 trillion cubic feet of potential shale gas in the Bombay, Cauvery, and Krishna-Godavari Provinces of India.
Vittorazzi, C; Amaral Junior, A T; Guimarães, A G; Viana, A P; Silva, F H L; Pena, G F; Daher, R F; Gerhardt, I F S; Oliveira, G H F; Pereira, M G
2017-09-27
Selection indices commonly utilize economic weights, which can make the resulting genetic gains arbitrary. In popcorn, this is even more evident due to the negative correlation between the main characteristics of economic importance - grain yield and popping expansion. As an alternative to classical biometric selection indices, the optimal procedure restricted maximum likelihood/best linear unbiased predictor (REML/BLUP) allows the simultaneous estimation of genetic parameters and the prediction of genotypic values. Based on the mixed-model methodology, the objective of this study was to investigate the comparative efficiency of eight selection indices estimated by REML/BLUP for the effective selection of superior popcorn families in the eighth intrapopulation recurrent selection cycle. We also investigated the efficiency of including the variable "expanded popcorn volume per hectare" in the selection of superior progenies. In total, 200 full-sib families were evaluated in two different areas in the North and Northwest regions of the State of Rio de Janeiro, Brazil. The REML/BLUP procedure resulted in higher estimated gains than those obtained with classical biometric selection index methodologies and should be incorporated into the selection of progenies. The following indices resulted in higher gains in the characteristics of greatest economic importance: the classical selection index/values attributed by trial, via REML/BLUP, and the greatest genotypic values/expanded popcorn volume per hectare, via REML. The expanded popcorn volume per hectare characteristic enabled satisfactory gains in grain yield and popping expansion; this characteristic should be considered a super-trait in popcorn breeding programs.
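A minimal sketch of REML/BLUP family selection on simulated data, using the statsmodels mixed-model API: the random-intercept BLUPs stand in for predicted genotypic values used to rank families. Trait values, variance components, and the top-five cutoff are illustrative assumptions, not the study's data or software.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
families = np.repeat(np.arange(40), 6)     # 40 full-sib families, 6 plots each
u = rng.normal(0, 1.0, 40)                 # true family (genotypic) effects
trait = 30 + u[families] + rng.normal(0, 2.0, families.size)
data = pd.DataFrame({"family": families, "grain_yield": trait})

# REML fit of a random-intercept model; the estimated random effects are
# the BLUPs of family genotypic values used to rank and select families.
fit = smf.mixedlm("grain_yield ~ 1", data, groups=data["family"]).fit(reml=True)
blups = {g: re["Group"] for g, re in fit.random_effects.items()}
best_families = sorted(blups, key=blups.get, reverse=True)[:5]
print(best_families)
```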
NASA Astrophysics Data System (ADS)
Sepúlveda, J.; Hoyos Ortiz, C. D.
2017-12-01
An adequate quantification of precipitation over land is critical for many societal applications including agriculture, hydroelectricity generation, water supply, and risk management associated with extreme events. The rain gauge, the traditional method for precipitation estimation and an excellent one for estimating the volume of liquid water during a particular precipitation event, does not fully capture the high spatial variability of the phenomenon, which is a requirement for almost all practical applications. The weather radar, an active remote sensor, provides a proxy for rainfall with fine spatial resolution and adequate temporal sampling; however, it does not measure surface precipitation. In order to fully exploit the capabilities of the weather radar, it is necessary to develop quantitative precipitation estimation (QPE) techniques combining radar information with in-situ measurements. Different QPE methodologies are explored and adapted to local observations in a highly complex terrain region in tropical Colombia using a C-band radar and a relatively dense network of rain gauges and disdrometers. One important result is that the expressions reported in the literature for extratropical locations are not representative of the conditions found in the tropical region studied. In addition to reproducing the state-of-the-art techniques, a new multi-stage methodology based on radar-derived variables and disdrometer data is proposed in order to achieve the best QPE possible. The main motivation for this new methodology is that most traditional QPE methods do not directly take into account the different uncertainty sources involved in the process. The main advantage of the multi-stage model compared to traditional models is that it allows assessing and quantifying the uncertainty in the surface rain rate estimation. The sub-hourly rainfall estimates from the multi-stage methodology are realistic compared to observed data in spite of the many sources of uncertainty, including the sampling volume, the different physical principles of the sensors, the incomplete understanding of the microphysics of precipitation and, most importantly, the rapidly varying droplet size distribution.
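For context, the classical starting point that such studies re-fit locally is a power-law Z-R relation. The sketch below inverts the textbook Marshall-Palmer form; the coefficients a=200 and b=1.6 are standard mid-latitude values, which, per the abstract, are exactly the kind of extratropical assumptions that need local recalibration, so treat them as placeholders.

```python
import numpy as np

def rain_rate_from_reflectivity(dbz, a=200.0, b=1.6):
    """Invert a power-law Z-R relation, Z = a * R**b, for rain rate R (mm/h).

    dbz is radar reflectivity in dBZ; a and b are the classic mid-latitude
    Marshall-Palmer coefficients, which this study suggests should be
    re-fitted against local gauge and disdrometer data in the tropics.
    """
    z_linear = 10.0 ** (dbz / 10.0)      # dBZ -> mm^6 m^-3
    return (z_linear / a) ** (1.0 / b)

print(rain_rate_from_reflectivity(np.array([20.0, 35.0, 50.0])))
```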
1983-02-01
...successfully modeled to enhance future computer design simulations; (2) a new methodology for conducting dynamic analysis of vehicle mechanics was...to preliminary design methodology for tilt rotors, advancing blade concept configuration helicopters, and compound helicopters in conjunction with...feasibility of low-level personnel parachutes has been demonstrated. A study was begun to design a free-fall water container. An experimental program to...
NASA Technical Reports Server (NTRS)
Chamberlain, R. G.; Mcmaster, K. M.
1981-01-01
The methodology presented is a derivation of the utility-owned solar electric systems model. The net present value of the system is determined by consideration of all financial benefits and costs, including a specified return on investment. Life cycle costs, life cycle revenues, and residual system values are obtained. Break-even values of system parameters are estimated by setting the net present value to zero.
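A minimal sketch of the break-even logic: compute net present value from discounted benefit and cost streams, then solve for the parameter value (here a hypothetical energy price) that drives NPV to zero. All cash-flow numbers and the discount rate are invented for illustration, not values from the model.

```python
import numpy as np
from scipy.optimize import brentq

def npv(benefits, costs, discount_rate):
    """Net present value of yearly benefit and cost streams (year 0 first)."""
    years = np.arange(len(benefits))
    dfac = (1.0 + discount_rate) ** -years
    return float(np.sum((np.asarray(benefits) - np.asarray(costs)) * dfac))

# Break-even energy price: scale yearly revenues until NPV = 0.
costs = [5000.0] + [200.0] * 19          # capital outlay, then yearly O&M ($)
energy_kwh = [0.0] + [8000.0] * 19       # yearly energy output (kWh)
breakeven_price = brentq(
    lambda p: npv([p * e for e in energy_kwh], costs, 0.08), 1e-6, 10.0)
print(f"break-even price: {breakeven_price:.3f} $/kWh")
```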
Assessment of potential shale gas and shale oil resources of the Norte Basin, Uruguay, 2011
Schenk, Christopher J.; Kirschbaum, Mark A.; Charpentier, Ronald R.; Cook, Troy; Klett, Timothy R.; Gautier, Donald L.; Pollastro, Richard M.; Weaver, Jean N.; Brownfield, Michael
2011-01-01
Using a performance-based geological assessment methodology, the U.S. Geological Survey estimated mean volumes of 13.4 trillion cubic feet of potential technically recoverable shale gas and 0.5 billion barrels of technically recoverable shale oil resources in the Norte Basin of Uruguay.
Haines, Seth S.
2015-07-13
The quantities of water and hydraulic fracturing proppant required for producing petroleum (oil, gas, and natural gas liquids) from continuous accumulations, and the quantities of water extracted during petroleum production, can be quantitatively assessed using a probabilistic approach. The water and proppant assessment methodology builds on the U.S. Geological Survey methodology for quantitative assessment of undiscovered technically recoverable petroleum resources in continuous accumulations. The U.S. Geological Survey assessment methodology for continuous petroleum accumulations includes fundamental concepts such as geologically defined assessment units, and probabilistic input values including well-drainage area, sweet- and non-sweet-spot areas, and success ratio within the untested area of each assessment unit. In addition to petroleum-related information, required inputs for the water and proppant assessment methodology include probabilistic estimates of per-well water usage for drilling, cementing, and hydraulic-fracture stimulation; the ratio of proppant to water for hydraulic fracturing; the percentage of hydraulic fracturing water that returns to the surface as flowback; and the ratio of produced water to petroleum over the productive life of each well. Water and proppant assessments combine information from recent or current petroleum assessments with water- and proppant-related input values for the assessment unit being studied, using Monte Carlo simulation, to yield probabilistic estimates of the volume of water for drilling, cementing, and hydraulic fracture stimulation; the quantity of proppant for hydraulic fracture stimulation; and the volumes of water produced as flowback shortly after well completion, and produced over the life of the well.
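A minimal Monte Carlo sketch in the spirit of the described methodology: draw the probabilistic inputs, combine them per realization, and report percentiles. The distribution shapes and every numeric value are illustrative assumptions, not USGS inputs.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 50_000  # Monte Carlo realizations

# Illustrative probabilistic inputs (triangular min/mode/max), not USGS values.
n_wells = rng.triangular(500, 1500, 4000, N)           # untested wells drilled
water_per_well = rng.triangular(10e3, 20e3, 40e3, N)   # m3: drill+cement+frac
proppant_ratio = rng.triangular(0.05, 0.10, 0.15, N)   # t proppant per m3 water
flowback_frac = rng.triangular(0.05, 0.25, 0.60, N)    # fraction returning

water_total = n_wells * water_per_well                 # m3
proppant_total = water_total * proppant_ratio          # tonnes
flowback_total = water_total * flowback_frac           # m3

for name, x in [("water", water_total), ("proppant", proppant_total),
                ("flowback", flowback_total)]:
    p5, p50, p95 = np.percentile(x, [5, 50, 95])
    print(f"{name}: F95={p5:.3g}  F50={p50:.3g}  F5={p95:.3g}")
```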
DOE Office of Scientific and Technical Information (OSTI.GOV)
Levine, Jonathan S.; Fukai, Isis; Soeder, Daniel J.
2016-05-31
While the majority of shale formations will serve as reservoir seals for stored anthropogenic carbon dioxide (CO2), hydrocarbon-bearing shale formations may be potential geologic sinks after depletion through primary production. Here we present the United States Department of Energy-National Energy Technology Laboratory (US-DOE-NETL) methodology for screening-level assessment of prospective CO2 storage resources in shale using a volumetric equation. Volumetric resource estimates are produced from the bulk volume, porosity, and sorptivity of the shale and storage efficiency factors based on formation-scale properties and petrophysical limitations on fluid transport. Prospective shale formations require: (1) prior hydrocarbon production using horizontal drilling and stimulation via staged, high-volume hydraulic fracturing, (2) depths sufficient to maintain CO2 in a supercritical state, generally >800 m, and (3) an overlying seal. The US-DOE-NETL methodology accounts for storage of CO2 in shale as a free fluid phase within fractures and matrix pores and as a sorbed phase on organic matter and clays. Uncertainties include but are not limited to poorly constrained geologic variability in formation thickness, porosity, existing fluid content, organic richness, and mineralogy. Knowledge of how these parameters may be linked to depositional environments, facies, and diagenetic history of the shale will improve the understanding of pore-to-reservoir scale behavior and provide improved estimates of prospective CO2 storage.
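A screening-level volumetric calculation of the general form described above can be sketched as follows; the function covers only the free-phase term (the sorbed-phase term is omitted for brevity), and the inputs are illustrative, not values for any assessed formation.

```python
def co2_storage_mt(area_km2, thickness_m, porosity, rho_co2_kg_m3, efficiency):
    """Screening-level free-phase CO2 storage (megatonnes) from a volumetric
    equation of the form G = A * h * phi * rho_CO2 * E, where E is a storage
    efficiency factor reflecting formation-scale and petrophysical limits."""
    volume_m3 = area_km2 * 1e6 * thickness_m
    return volume_m3 * porosity * rho_co2_kg_m3 * efficiency / 1e9  # kg -> Mt

# Illustrative inputs only (not values for any assessed formation):
print(co2_storage_mt(area_km2=2000, thickness_m=50, porosity=0.06,
                     rho_co2_kg_m3=600, efficiency=0.02))  # ~72 Mt
```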
Klett, T.R.
2011-01-01
The U.S. Geological Survey, using a geology-based assessment methodology, estimated mean volumes of technically recoverable, conventional, undiscovered petroleum resources at 218 million barrels of crude oil, 4.1 trillion cubic feet of natural gas, and 94 million barrels of natural gas liquids for the Azov-Kuban Basin Province as part of a program to estimate petroleum resources for priority basins throughout the world.
Klett, T.R.; Schenk, Christopher J.; Wandrey, Craig J.; Charpentier, Ronald R.; Brownfield, Michael E.; Pitman, Janet K.; Pollastro, Richard M.; Cook, Troy A.; Tennyson, Marilyn E.
2012-01-01
Using a geology-based assessment methodology, the U.S. Geological Survey estimated volumes of undiscovered, technically recoverable, conventional petroleum resources for the Amu Darya Basin and Afghan–Tajik Basin Provinces of Afghanistan, Iran, Tajikistan, Turkmenistan, and Uzbekistan. The mean volumes were estimated at 962 million barrels of crude oil, 52 trillion cubic feet of natural gas, and 582 million barrels of natural gas liquids for the Amu Darya Basin Province and at 946 million barrels of crude oil, 7 trillion cubic feet of natural gas, and 85 million barrels of natural gas liquids for the Afghan–Tajik Basin Province.
Roberts-Ashby, Tina; Ashby, Brandon N.
2016-01-01
This paper demonstrates a geospatial modification of the USGS methodology for assessing geologic CO2 storage resources, applied to the Pre-Punta Gorda Composite and Dollar Bay reservoirs of the South Florida Basin. The study provides a detailed evaluation of porous intervals within these reservoirs and utilizes GIS to evaluate the potential spatial distribution of reservoir parameters and the volume of CO2 that can be stored. This study also shows that incorporating spatial variation of parameters using detailed and robust datasets may improve estimates of storage resources when compared to applying uniform values derived from small datasets across the study area, as many assessment methodologies do. Geospatially derived estimates of storage resources presented here (Pre-Punta Gorda Composite = 105,570 MtCO2; Dollar Bay = 24,760 MtCO2) were greater than previous assessments, largely because detailed evaluation of these reservoirs resulted in higher estimates of porosity and net-porous thickness, and areas of high porosity and thick net-porous intervals were incorporated into the model, likely increasing the calculated volume of storage space available for CO2 sequestration. The geospatial method for evaluating CO2 storage resources also provides the ability to identify areas that potentially contain higher volumes of storage resources, as well as areas that might be less favorable.
Tug fleet and ground operations schedules and controls. Volume 3: Program cost estimates
NASA Technical Reports Server (NTRS)
1975-01-01
Cost data for the tug DDT&E and operations phases are presented. Option 6 is the recommended option, selected from seven options considered, and was used as the basis for ground processing estimates. Option 6 provides for processing the tug in a factory-clean environment in the low bay area of the VAB, with subsequent cleaning to visibly clean. The basis and results of the trade study to select the Option 6 processing plan are included. Cost estimating methodology, a work breakdown structure, and a dictionary of WBS definitions are also provided.
Space Tug Docking Study. Volume 5: Cost Analysis
NASA Technical Reports Server (NTRS)
1976-01-01
The cost methodology, summary cost data, resulting cost estimates by Work Breakdown Structure (WBS), technical characteristics data, program funding schedules, and the WBS for the costing are discussed. Cost estimates for two tasks of the study are reported. The first developed cost estimates for design, development, test and evaluation (DDT&E) and theoretical first unit (TFU) at the component level (Level 7) for all items reported in the data base. Task B developed total subsystem DDT&E costs and funding schedules for the three candidate Rendezvous and Docking Systems: manual, autonomous, and hybrid.
Methodology for Software Reliability Prediction. Volume 2.
1987-11-01
The overall acquisition program shall include the resources, schedule, management, structure, and controls necessary to ensure that specified AD... Independent Verification/Validation - Programming Team Structure - Educational Level of Team Members - Experience Level of Team Members - Methods Used... Prediction or Estimation Parameter Supported: Software Characteristics. 3. Objectives: Structured programming studies and Government ... procurement
Assessment of undiscovered conventional oil and gas resources of six geologic provinces of China
Charpentier, Ronald R.; Schenk, Christopher J.; Brownfield, Michael E.; Cook, Troy A.; Klett, Timothy R.; Pitman, Janet K.; Pollastro, Richard M.
2012-01-01
Using a geology-based assessment methodology, the U.S. Geological Survey estimated mean volumes of undiscovered conventional petroleum resources in six geologic provinces of China at 14.9 billion barrels of oil, 87.6 trillion cubic feet of natural gas, and 1.4 billion barrels of natural-gas liquids.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hauf, M.J.; Vance, J.N.; James, D.
1991-01-01
A number of nuclear utilities and industry organizations in the United States have evaluated the requirements for reactor decommissioning. These broad scope studies have addressed the major issues of technology, methodology, safety and costs of decommissioning and have produced substantial volumes of data to describe, in detail, the issues and impacts which result. The objective of this paper to provide CECo a reasonable basis for discussion low-level waste burial volumes for the most likely decommissioning options and to show how various decontamination and VR technologies can be applied to provide additional reduction of the volumes required to be buried atmore » low-level waste burial grounds.« less
SRB ascent aerodynamic heating design criteria reduction study, volume 1
NASA Technical Reports Server (NTRS)
Crain, W. K.; Frost, C. L.; Engel, C. D.
1989-01-01
An independent set of solid rocket booster (SRB) convective ascent design environments were produced which would serve as a check on the Rockwell IVBC-3 environments used to design the ascent phase of flight. In addition, support was provided for lowering the design environments such that Thermal Protection System (TPS), based on conservative estimates, could be removed leading to a reduction in SRB refurbishment time and cost. Ascent convective heating rates and loads were generated at locations in the SRB where lowering the thermal environment would impact the TPS design. The ascent thermal environments are documented along with the wind tunnel/flight test data base used as well as the trajectory and environment generation methodology. Methodology, as well as, environment summaries compared to the 1980 Design and Rockwell IVBC-3 Design Environment are presented in this volume, 1.
The report gives results of a study, the objective of which was to significantly improve engineering cost estimates currently being used to evaluate the economic effects of applying SO2 and NOx controls at 200 large SO2-emitting coal-fired utility plants. To accomplish the object...
A Multinomial Logit Approach to Estimating Regional Inventories by Product Class
Lawrence Teeter; Xiaoping Zhou
1998-01-01
Current timber inventory projections generally lack information on inventory by product classes. Most models available for inventory projection and linked to supply analyses are limited to projecting aggregate softwood and hardwood. The objective of this research is to develop a methodology to distribute the volume on each FIA survey plot to product classes and...
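A sketch of the core allocation step such a methodology implies: multinomial-logit probabilities computed from plot attributes, used to split a plot's volume across product classes. The attribute and coefficient values are invented placeholders, not estimates from FIA data.

```python
import numpy as np

def product_class_shares(x, beta):
    """Multinomial-logit shares across product classes for one FIA plot.

    x    : (k,) plot attributes (e.g., species group, stand age, site class)
    beta : (n_classes, k) coefficients, one row per product class
    Returns probabilities summing to 1; multiplying the plot's volume by
    these shares distributes its inventory across product classes.
    """
    u = beta @ x                      # linear utility of each class
    e = np.exp(u - u.max())           # subtract max for numerical stability
    return e / e.sum()

shares = product_class_shares(np.array([1.0, 42.0, 3.0]),
                              np.array([[0.2, 0.01, -0.1],
                                        [0.0, 0.02, 0.05],
                                        [-0.3, 0.015, 0.2]]))
print(shares, shares.sum())
```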
Toward quantifying the effectiveness of water trading under uncertainty.
Luo, B; Huang, G H; Zou, Y; Yin, Y Y
2007-04-01
This paper presents a methodology for quantifying the effectiveness of water trading under uncertainty, by developing an optimization model based on the interval-parameter two-stage stochastic program (TSP) technique. In the study, the effectiveness of a water-trading program is measured by the water volume that can be released through trading from a statistical point of view. The methodology can also deal with recourse water allocation problems generated by randomness in water availability and, at the same time, tackle uncertainties expressed as intervals in the trading system. The developed methodology was tested with a hypothetical water-trading program in an agricultural system in the Swift Current Creek watershed, Canada. Study results indicate that the methodology can effectively measure the effectiveness of a trading program by estimating, over the long term, the water volume released through trading. A sensitivity analysis was also conducted to analyze the effects of different trading costs on the trading program. It shows that the trading efforts would become ineffective when the trading costs are too high. The case study also demonstrates that the trading program is more effective in a dry season when total water availability is in shortage.
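The two-stage structure can be illustrated with a scenario-based toy: commit a first-stage allocation, then pay a recourse penalty on shortfalls once random water availability is realized. This ignores the interval parameters and the trading network of the actual model; all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
q = rng.normal(100.0, 15.0, 10_000).clip(min=0)  # availability scenarios (units)

def expected_net_benefit(target, benefit=5.0, penalty=12.0):
    """First-stage benefit of committing `target`, minus the expected
    second-stage (recourse) penalty on shortfalls across scenarios."""
    shortfall = np.clip(target - q, 0.0, None)
    return benefit * target - penalty * shortfall.mean()

targets = np.linspace(60, 140, 81)
best = max(targets, key=expected_net_benefit)
print(f"optimal first-stage allocation target: {best:.1f}")
```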
St Charles, Frank Kelley; McAughey, John; Shepperd, Christopher J
2013-06-01
Methodologies have been developed, described and demonstrated that convert mouth exposure estimates of cigarette smoke constituents to dose by accounting for smoke spilled from the mouth prior to inhalation (mouth-spill (MS)) and the respiratory retention (RR) during the inhalation cycle. The methodologies are applicable to just about any chemical compound in cigarette smoke that can be measured analytically and can be used with ambulatory population studies. Conversion of exposure to dose improves the relevancy for risk assessment paradigms. Except for urinary nicotine plus metabolites, biomarkers generally do not provide quantitative exposure or dose estimates. In addition, many smoke constituents have no reliable biomarkers. We describe methods to estimate the RR of chemical compounds in smoke based on their vapor pressure (VP) and to estimate the MS for a given subject. Data from two clinical studies were used to demonstrate dose estimation for 13 compounds, of which only 3 have urinary biomarkers. Compounds with VP > 10^-5 Pa generally have RRs of 88% or greater, which do not vary appreciably with inhalation volume (IV). Compounds with VP < 10^-7 Pa generally have RRs dependent on IV and lung exposure time. For MS, mean subject values from both studies were slightly greater than 30%. For constituents with urinary biomarkers, correlations with the calculated dose were significantly improved over correlations with mouth exposure. Of toxicological importance is that the dose correlations provide an estimate of the metabolic conversion of a constituent to its respective biomarker.
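The exposure-to-dose conversion described above reduces to a simple product once MS and RR are known; the combination below, and the example numbers echoing the roughly 30% MS and 88% RR figures, are a sketch of that logic, not the authors' exact computational procedure.

```python
def constituent_dose(mouth_exposure_ug, mouth_spill_frac, retention_frac):
    """Convert per-cigarette mouth exposure to dose, per the two corrections
    described above: remove the fraction spilled from the mouth before
    inhalation (MS), then apply respiratory retention (RR)."""
    return mouth_exposure_ug * (1.0 - mouth_spill_frac) * retention_frac

# Illustrative values: MS ~ 30%; RR ~ 88% for compounds with vapor
# pressure above ~1e-5 Pa.
print(constituent_dose(150.0, 0.30, 0.88))  # -> 92.4 ug
```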
Acoustic measurement method of the volume flux of a seafloor hydrothermal plume
NASA Astrophysics Data System (ADS)
Xu, G.; Jackson, D. R.; Bemis, K. G.; Rona, P. A.
2011-12-01
Measuring fluxes (volume, chemical, heat, etc.) of deep-sea hydrothermal vents has been a crucial but challenging task faced by the scientific community since the discovery of the vent systems. The great depths and complexities of hydrothermal vents make traditional sampling methods laborious, almost daunting missions. Furthermore, the samples, in most cases both sparse in space and sporadic in time, are hardly enough to provide a result with moderate uncertainty. In September 2010, our Cabled Observatory Vent Imaging Sonar System (COVIS, http://vizlab.rutgers.edu/AcoustImag/covis.html) was connected to the Neptune Canada underwater ocean observatory network (http://www.neptunecanada.ca) at the Main Endeavour vent field on the Endeavour segment of the Juan de Fuca Ridge. During the experiment, the COVIS system produced 3D images of the buoyant plume discharged from the vent complex Grotto by measuring the back-scattering intensity of the acoustic signal. Building on the methodology developed in our previous work, the vertical flow velocity of the plume is estimated from the Doppler shift of the acoustic signal using a geometric correction to compensate for the ambient horizontal currents. A Gaussian distribution curve is fitted to the horizontal back-scattering intensity profile to determine the back-scattering intensity at the boundary of the plume. This boundary value is used as the threshold in a window function for separating the plume from background signal. Finally, the volume flux is obtained by integrating the resulting 2D vertical velocity profile over the horizontal cross-section of the plume. In this presentation, we discuss preliminary results from the COVIS experiment. In addition, several alternative approaches are applied to determine the accuracy of the estimated plume vertical velocity in the absence of direct measurements. First, the results from our previous experiment (conducted in 2000 at the same vent complex using a similar methodology but a different sonar system) provide references for the consistency of the methodology. Second, the vertical flow rate measurement made in 2007 at an adjacent vent complex (Dante) using a different acoustic method (acoustic scintillation) can serve as a first-order estimate of the plume vertical velocity. Third, another first-order estimate can be obtained by combining the plume bending angle with the horizontal current measured by a current meter array deployed to the north of the vent field. Finally, statistical techniques are used to quantify the errors due to ambient noise, inherent uncertainties of the methodology, and the fluctuation of the plume structure.
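The final integration step described above (window the plume with a backscatter threshold from the Gaussian fit, then integrate Doppler vertical velocity over the horizontal cross-section) can be sketched as follows; the array shapes, names, and use of a rectilinear slice grid are assumptions for illustration.

```python
import numpy as np

def plume_volume_flux(w, x, y, intensity, threshold):
    """Integrate Doppler vertical velocity over a plume cross-section.

    w         : (ny, nx) vertical velocity on a horizontal slice (m/s)
    x, y      : 1-D coordinate vectors of the slice (m)
    intensity : (ny, nx) backscatter used to window the plume
    threshold : boundary backscatter level (e.g., from a Gaussian fit)
    Returns volume flux in m^3/s as sum(w * dA) inside the window.
    """
    dA = np.gradient(x)[None, :] * np.gradient(y)[:, None]  # cell areas
    inside = intensity >= threshold
    return float(np.sum(w[inside] * dA[inside]))
```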
Timber resource statistics for the upper Tanana block, Tanana inventory unit, Alaska, 1974.
Karl M. Hegg
1983-01-01
This report for the 3.6-million-acre Upper Tanana block is the third of four on the 14-million-acre Tanana Valley forest inventory unit. Descriptions of area, climate, forest, general resource use, and inventory methodology are presented. Area and volume tables are provided for commercial and operable noncommercial forest lands. Estimates for commercial forest land...
Klett, T.R.; Schenk, Christopher J.; Charpentier, Ronald R.; Brownfield, Michael E.; Pitman, Janet K.; Cook, Troy A.; Tennyson, Marilyn E.
2010-01-01
The U.S. Geological Survey estimated mean volumes of technically recoverable, conventional, undiscovered petroleum resources at 1.4 billion barrels of crude oil, 2.4 trillion cubic feet of natural gas, and 85 million barrels of natural gas liquids for the Volga-Ural Region Province, using a geology-based assessment methodology.
ERIC Educational Resources Information Center
Health Resources Administration (DHHS/PHS), Hyattsville Md. Office of Graduate Medical Education.
Results of a three-year study to estimate the future supply and requirements for physicians, which was conducted by the Graduate Medical Education National Advisory Committee (GMENAC), are summarized. The research methodology, which consisted of three mathematical models to project physician supply and requirements, is described, and 40…
Gautier, Donald L.; Pitman, Janet K.; Charpentier, Ronald R.; Cook, Troy; Klett, Timothy R.; Schenk, Christopher J.
2012-01-01
Using a performance-based geological assessment methodology, the U.S. Geological Survey estimated mean volumes of 1,345 billion cubic feet of potentially technically recoverable gas and 168 million barrels of technically recoverable oil and natural gas liquids in Ordovician- and Silurian-age shales in the Polish-Ukrainian Foredeep basin of Poland.
Drake II, Ronald M.; Hatch, Joseph R.; Schenk, Christopher J.; Charpentier, Ronald R.; Klett, Timothy R.; Le, Phuong A.; Leathers, Heidi M.; Brownfield, Michael E.; Gaswirth, Stephanie B.; Marra, Kristen R.; Pitman, Janet K.; Potter, Christopher J.; Tennyson, Marilyn E.
2015-09-01
Using a geology-based assessment methodology, the U.S. Geological Survey estimated mean volumes of undiscovered, technically recoverable resources of 463 million barrels of oil, 11.2 trillion cubic feet of gas, and 35 million barrels of natural gas liquids in the Cherokee Platform Province area of Kansas, Oklahoma, and Missouri.
NASA Technical Reports Server (NTRS)
Chamberlain, R. G.; Mcmaster, K. M.
1981-01-01
The utility-owned solar electric system methodology is generalized and updated. The net present value of the system is determined by consideration of all financial benefits and costs (including a specified return on investment). Life cycle costs, life cycle revenues, and residual system values are obtained. Break-even values of system parameters are estimated by setting the net present value to zero. While the model was designed for photovoltaic generators with a possible thermal energy byproduct, its applicability is not limited to such systems. The resulting owner-dependent methodology for energy generation system assessment consists of a few equations that can be evaluated without the aid of a high-speed computer.
Estimating Equivalency of Explosives Through A Thermochemical Approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maienschein, J L
2002-07-08
The Cheetah thermochemical computer code provides an accurate method for estimating the TNT equivalency of any explosive, evaluated either with respect to peak pressure or the quasi-static pressure at long time in a confined volume. Cheetah calculates the detonation energy and heat of combustion for virtually any explosive (pure or formulation). Comparing the detonation energy for an explosive with that of TNT allows estimation of the TNT equivalency with respect to peak pressure, while comparison of the heat of combustion allows estimation of TNT equivalency with respect to quasi-static pressure. We discuss the methodology, present results for many explosives, and show comparisons with equivalency data from other sources.
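The comparison step reduces to a ratio of specific energies; a sketch, with the TNT reference value an assumed round number rather than a Cheetah output:

```python
def tnt_equivalency(energy_explosive_kj_g, energy_tnt_kj_g=4.6):
    """Mass-based TNT equivalency as a ratio of specific energies.

    Use detonation energies for peak-pressure equivalency, or heats of
    combustion for quasi-static-pressure equivalency, as described above.
    The 4.6 kJ/g TNT detonation energy is a commonly quoted round number,
    not a value from the study.
    """
    return energy_explosive_kj_g / energy_tnt_kj_g

print(tnt_equivalency(5.9))  # e.g., a notional high-energy formulation
```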
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mead, H; St. Jude Children’s Research Hospital, Memphis, TN; Brady, S
Purpose: To discover if a previously published methodology for estimating patient-specific organ dose in a pediatric population (5–55 kg) is translatable to the adult-sized patient population (>55 kg). Methods: An adult male anthropomorphic phantom was scanned with metal oxide semiconductor field effect transistor (MOSFET) dosimeters placed at 23 organ locations in the chest and abdominopelvic regions to determine absolute organ dose. Organ-dose-to-SSDE correlation factors were developed by dividing individual phantom organ doses by the SSDE of the phantom, where SSDE was calculated at the center of the scan volume of the chest and abdomen/pelvis separately. Organ dose correlation factors developed in phantom were multiplied by 28 chest and 22 abdominopelvic patient SSDE values to estimate organ dose. The median patient weight from the CT examinations was 68.9 kg (range 57–87 kg) and the median age was 17 years (range 13–28 years). Calculated organ dose estimates were compared to published Monte Carlo simulated patient and phantom results. Results: Organ-dose-to-SSDE correlation was determined for a total of 23 organs in the chest and abdominopelvic regions. For organs fully covered by the scan volume, correlation in the chest (median 1.3; range 1.1–1.5) and abdominopelvic (median 0.9; range 0.7–1.0) regions was 1.0 ± 10%. For organs that extended beyond the scan volume (i.e., skin, bone marrow, and bone surface), correlation was determined to be a median of 0.3 (range 0.1–0.4). Calculated patient organ dose using patient SSDE agreed to better than 6% (chest) and 15% (abdominopelvic) with published values. Conclusion: This study demonstrated that our previously published methodology for calculating organ dose using patient-specific SSDE for the chest and abdominopelvic regions is translatable to adult-sized patients for organs fully covered by the scan volume.
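The organ-dose estimation step is a one-line product of the patient's SSDE and the phantom-derived correlation factor; a sketch using the median factors quoted above, with the SSDE value itself illustrative:

```python
def organ_dose_mgy(ssde_mgy, correlation_factor):
    """Patient organ dose from a size-specific dose estimate (SSDE).

    correlation_factor is the phantom-derived organ-dose-to-SSDE ratio:
    per the abstract, ~1.3 (median) for fully covered chest organs, ~0.9
    for abdominopelvic organs, and ~0.3 for organs extending beyond the
    scan volume (skin, bone marrow, bone surface).
    """
    return ssde_mgy * correlation_factor

print(organ_dose_mgy(12.0, 1.3))  # chest organ, illustrative SSDE of 12 mGy
```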
DOE Office of Scientific and Technical Information (OSTI.GOV)
Augustine, Chad
Existing methodologies for estimating the electricity generation potential of Enhanced Geothermal Systems (EGS) assume thermal recovery factors of 5% or less, resulting in relatively low volumetric electricity generation potentials for EGS reservoirs. This study proposes and develops a methodology for calculating EGS electricity generation potential based on the Gringarten conceptual model and analytical solution for heat extraction from fractured rock. The electricity generation potential of a cubic kilometer of rock as a function of temperature is calculated assuming limits on the allowed produced-water temperature decline and reservoir lifetime based on surface power plant constraints. The resulting estimates of EGS electricity generation potential can be one to nearly two orders of magnitude larger than those from existing methodologies. The flow per unit fracture surface area from the Gringarten solution is found to be a key term in describing the conceptual reservoir behavior. The methodology can be applied to aid in the design of EGS reservoirs by giving the minimum reservoir volume, fracture spacing, number of fractures, and flow requirements for a target reservoir power output. Limitations of the idealized model compared to actual reservoir performance and the implications for reservoir design are discussed.
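For contrast, the existing volumetric, recovery-factor style of calculation that the study argues understates EGS potential can be sketched as below. This is explicitly not the Gringarten fractured-rock solution; every default value is an illustrative assumption.

```python
def egs_potential_mwe(volume_km3, t_reservoir_c, t_reject_c=80.0,
                      recovery=0.14, eta_thermal=0.12, lifetime_yr=30.0,
                      rho_c_kj_m3_k=2700.0):
    """Volumetric heat-in-place estimate of EGS electric potential (MWe).

    Heat in place = (volumetric heat capacity) * volume * (T - T_reject);
    multiply by a thermal recovery factor and a heat-to-electricity
    efficiency, then average over the assumed plant lifetime. Capping
    `recovery` near 0.05, as existing methodologies do, shrinks the result.
    """
    heat_kj = rho_c_kj_m3_k * volume_km3 * 1e9 * (t_reservoir_c - t_reject_c)
    electric_kj = heat_kj * recovery * eta_thermal
    seconds = lifetime_yr * 365.25 * 24 * 3600
    return electric_kj / seconds / 1e3  # kJ/s = kW -> MW

print(egs_potential_mwe(volume_km3=1.0, t_reservoir_c=200.0))  # ~5.7 MWe
```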
Audebert, M; Oxarango, L; Duquennoi, C; Touze-Foltz, N; Forquet, N; Clément, R
2016-09-01
Leachate recirculation is a key process in the operation of municipal solid waste landfills as bioreactors. To ensure optimal water content distribution, bioreactor operators need tools to design leachate injection systems. Prediction of leachate flow by subsurface flow modelling could provide useful information for the design of such systems. However, hydrodynamic models require additional data to constrain them and to assess hydrodynamic parameters. Electrical resistivity tomography (ERT) is a suitable method to study leachate infiltration at the landfill scale. It can provide spatially distributed information which is useful for constraining hydrodynamic models. However, this geophysical method does not allow ERT users to directly measure water content in waste. The MICS (multiple inversions and clustering strategy) methodology was proposed to delineate the infiltration area precisely during time-lapse ERT surveys, in order to avoid the use of empirical petrophysical relationships, which are not adapted to a medium as heterogeneous as waste. The infiltration shapes and hydrodynamic information extracted with MICS were used to constrain hydrodynamic models in assessing parameters. The constraint methodology developed in this paper was tested on two hydrodynamic models: an equilibrium model, where flow within the waste medium is estimated using a single-continuum approach, and a non-equilibrium model, where flow is estimated using a dual-continuum approach. The latter represents leachate flow into fractures. Finally, this methodology provides insight for identifying the advantages and limitations of hydrodynamic models. Furthermore, we suggest an explanation for the large volume detected by MICS when a small volume of leachate is injected. Copyright © 2016 Elsevier Ltd. All rights reserved.
Assessment of undiscovered oil and gas resources of the Susitna Basin, southern Alaska, 2017
Stanley, Richard G.; Potter, Christopher J.; Lewis, Kristen A.; Lillis, Paul G.; Shah, Anjana K.; Haeussler, Peter J.; Phillips, Jeffrey D.; Valin, Zenon C.; Schenk, Christopher J.; Klett, Timothy R.; Brownfield, Michael E.; Drake II, Ronald M.; Finn, Thomas M.; Haines, Seth S.; Higley, Debra K.; Houseknecht, David W.; Le, Phuong A.; Marra, Kristen R.; Mercier, Tracey J.; Leathers-Miller, Heidi M.; Paxton, Stanley T.; Pearson, Ofori N.; Tennyson, Marilyn E.; Woodall, Cheryl A.; Zyrianova, Margarita V.
2018-05-01
The U.S. Geological Survey (USGS) recently completed an assessment of undiscovered, technically recoverable oil and gas resources in the Susitna Basin of southern Alaska. Using a geology-based methodology, the USGS estimates that mean undiscovered volumes of about 2 million barrels of oil and nearly 1.7 trillion cubic feet of gas may be found in this area.
Forecasting the demand potential for STOL air transportation
NASA Technical Reports Server (NTRS)
Fan, S.; Horonjeff, R.; Kanafani, A.; Mogharabi, A.
1973-01-01
A process for predicting the potential demand for STOL aircraft was investigated to provide a conceptual framework and an analytical methodology for estimating the STOL air transportation market. It was found that: (1) schedule frequency has the strongest effect on the traveler's choice among available routes, (2) work-related business constitutes approximately 50% of total travel volume, and (3) air travel demand follows economic trends.
2011-03-28
Residue of an ICD-9 fracture code table; recoverable rows:
807.2   Closed Fracture of Sternum          FRAC CL CHEST
808.8   Fracture of Pelvis Unspec, Closed   FRAC CL PELVIS+UROGENITAL
810     Clavicle Fracture ...
...     Fracture of Pelvis Unspec, Open     FRAC OP PELVIS+UROGENITAL
810.1   Clavicle Fracture, Open             FRAC OP SHOULDER & UPPER ARM
810.12  Open Fracture of Shaft of Clavicle  ...
Methodology for Software Reliability Prediction. Volume 1.
1987-11-01
[Residue of a system-category figure: manned spacecraft, unmanned spacecraft, batch systems, airborne avionics, event control, and real-time closed-loop operations.] ...software reliability. A Software Reliability Measurement Framework was established which spans the life cycle of a software system and includes the...specification, prediction, estimation, and assessment of software reliability. Data from 59 systems, representing over 5 million lines of code, were...
Assessment of Undiscovered Petroleum Resources of Southern and Western Afghanistan, 2009
Wandrey, C.J.; Kosti, Amir Zada; Selab, Amir Mohammad; Omari, Mohammad Karim; Muty, Salam Abdul; Nakshband, Haidari Gulam; Hosine, Abdul Aminulah; Wahab, Abdul; Hamidi, Abdul Wasy; Ahmadi, Nasim; Agena, Warren F.; Charpentier, Ronald R.; Cook, Troy; Drenth, B.J.
2009-01-01
Using a geology-based assessment methodology, the U.S. Geological Survey--Afghanistan Ministry of Mines Joint Oil and Gas Resource Assessment Team estimated mean undiscovered resource volumes of 21.55 million barrels of oil, 44.76 billion cubic feet of non-associated natural gas, and 0.91 million barrels of natural gas liquids in the western Afghanistan Tirpul Assessment Unit (AU) (80230101).
Marra, Kristen R.; Charpentier, Ronald R.; Schenk, Christopher J.; Lewan, Michael D.; Leathers-Miller, Heidi M.; Klett, Timothy R.; Gaswirth, Stephanie B.; Le, Phuong A.; Mercier, Tracey J.; Pitman, Janet K.; Tennyson, Marilyn E.
2015-12-17
Using a geology-based assessment methodology, the U.S. Geological Survey estimated mean volumes of 53 trillion cubic feet of shale gas, 172 million barrels of shale oil, and 176 million barrels of natural gas liquids in the Barnett Shale of the Bend Arch–Fort Worth Basin Province of Texas.
NASA Astrophysics Data System (ADS)
Molon, Michelle; Boyce, Joseph I.; Arain, M. Altaf
2017-01-01
Coarse root biomass was estimated in a temperate pine forest using high-resolution (1 GHz) 3-D ground-penetrating radar (GPR). GPR survey grids were acquired across a 400 m2 area with varying line spacing (12.5 and 25 cm). Root volume and biomass were estimated directly from the 3-D radar volume by using isometric surfaces calculated with the marching cubes algorithm. Empirical relations between GPR reflection amplitude and root diameter were determined for 14 root segments (0.1-10 cm diameter) reburied in a 6 m2 experimental test plot and surveyed at 5-25 cm line spacing under dry and wet soil conditions. Reburied roots >1.4 cm diameter were detectable as continuous root structures with 5 cm line separation. Reflection amplitudes were strongly controlled by soil moisture and decreased by 40% with a twofold increase in soil moisture. GPR line intervals of 12.5 and 25 cm produced discontinuous mapping of roots, and GPR coarse root biomass estimates (0.92 kgC m-2) were lower than those obtained previously with a site-specific allometric equation due to nondetection of vertical roots and roots <1.5 cm diameter. The results show that coarse root volume and biomass can be estimated directly from interpolated 3-D GPR volumes by using a marching cubes approach, but mapping of roots as continuous structures requires high inline sampling and line density (<5 cm). The results demonstrate that 3-D GPR is a viable approach for estimating belowground carbon and for mapping tree root architecture. This methodology can be applied more broadly in other disciplines (e.g., archaeology and civil engineering) for imaging buried structures.
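A minimal sketch of the volume-from-isosurface step, assuming scikit-image's marching cubes and a closed triangle mesh (objects touching the grid boundary would need padding); the grid values and voxel spacing are placeholders, not the survey geometry.

```python
import numpy as np
from skimage import measure

def isosurface_volume(grid, level, voxel_mm=(5.0, 25.0, 125.0)):
    """Root volume from a 3-D GPR amplitude grid via marching cubes.

    Extracts the isosurface at `level`, then sums signed tetrahedron
    volumes (divergence theorem) over the resulting triangle mesh.
    """
    verts, faces, _, _ = measure.marching_cubes(grid, level=level,
                                                spacing=voxel_mm)
    v0, v1, v2 = verts[faces[:, 0]], verts[faces[:, 1]], verts[faces[:, 2]]
    # Signed volume of each tetrahedron formed with the origin: dot(v0, v1 x v2)/6
    return abs(np.einsum('ij,ij->i', v0, np.cross(v1, v2)).sum()) / 6.0
```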
Lyden, Hannah; Gimbel, Sarah I; Del Piero, Larissa; Tsai, A Bryna; Sachs, Matthew E; Kaplan, Jonas T; Margolin, Gayla; Saxbe, Darby
2016-01-01
Associations between brain structure and early adversity have been inconsistent in the literature. These inconsistencies may be partially due to methodological differences. Different methods of brain segmentation may produce different results, obscuring the relationship between early adversity and brain volume. Moreover, adolescence is a time of significant brain growth and certain brain areas have distinct rates of development, which may compromise the accuracy of automated segmentation approaches. In the current study, 23 adolescents participated in two waves of a longitudinal study. Family aggression was measured when the youths were 12 years old, and structural scans were acquired an average of 4 years later. Bilateral amygdalae and hippocampi were segmented using three different methods (manual tracing, FSL, and NeuroQuant). The segmentation estimates were compared, and linear regressions were run to assess the relationship between early family aggression exposure and all three volume segmentation estimates. Manual tracing results showed a positive relationship between family aggression and right amygdala volume, whereas FSL segmentation showed negative relationships between family aggression and both the left and right hippocampi. However, results indicate poor overlap between methods, and different associations were found between early family aggression exposure and brain volume depending on the segmentation method used.
Verma, Mahendra K.; Warwick, Peter D.
2011-01-01
The Energy Independence and Security Act of 2007 (Public Law 110-140) authorized the U.S. Geological Survey (USGS) to conduct a national assessment of geologic storage resources for carbon dioxide (CO2) and requested that the USGS estimate the "potential volumes of oil and gas recoverable by injection and sequestration of industrial carbon dioxide in potential sequestration formations" (121 Stat. 1711). The USGS developed a noneconomic, probability-based methodology to assess the Nation's technically assessable geologic storage resources available for sequestration of CO2 (Brennan and others, 2010) and is currently using the methodology to assess the Nation's CO2 geologic storage resources. Because the USGS has not developed a methodology to assess the potential volumes of technically recoverable hydrocarbons that could be produced by injection and sequestration of CO2, the Geologic Carbon Sequestration project initiated an effort in 2010 to develop a methodology for the assessment of the technically recoverable hydrocarbon potential in the sedimentary basins of the United States using enhanced oil recovery (EOR) techniques with CO2 (CO2-EOR). In collaboration with Stanford University, the USGS hosted a 2-day CO2-EOR workshop in May 2011, attended by 28 experts from academia, natural resource agencies and laboratories of the Federal Government, State and international geologic surveys, and representatives from the oil and gas industry. The geologic and the reservoir engineering and operations working groups formed during the workshop discussed various aspects of geology, reservoir engineering, and operations to make recommendations for the methodology.
Klett, T.R.; Schenk, Christopher J.; Charpentier, Ronald R.; Gautier, Donald L.; Brownfield, Michael E.; Pitman, Janet K.; Cook, Troy A.; Tennyson, Marilyn E.
2010-01-01
The U.S. Geological Survey estimated mean volumes of technically recoverable, conventional, undiscovered petroleum resources at 19.6 billion barrels of crude oil, 243 trillion cubic feet of natural gas, and 9.3 billion barrels of natural gas liquids for the Caspian Sea area, using a geology-based assessment methodology.
NASA Astrophysics Data System (ADS)
George, D. L.; Iverson, R. M.; Cannon, C. M.
2016-12-01
Landslide-generated tsunamis pose significant hazards to coastal communities and infrastructure, but developing models to assess these hazards presents challenges beyond those confronted when modeling seismically generated tsunamis. We present a new methodology in which our depth-averaged two-phase model D-Claw (Proc. Roy. Soc. A, 2014, doi: 10.1098/rspa.2013.0819 and doi:10.1098/rspa.2013.0820) is used to simulate all stages of landslide dynamics and subsequent tsunami generation and propagation. D-Claw was developed to simulate landslides and debris flows, but if granular solids are absent, then the D-Claw equations reduce to the shallow-water equations commonly used to model tsunamis. Because the model describes the evolution of solid and fluid volume fractions, it treats both landslides and tsunamis as special cases of a more general class of phenomena, and the landslide and tsunami can be simulated as a single-layer continuum with spatially and temporally evolving solid-grain concentrations. This seamless approach accommodates wave generation via mass displacement and longitudinal momentum transfer, the dominant mechanisms producing impulse waves when large subaerial landslides impact relatively shallow bodies of water. To test our methodology, we used D-Claw to model a large subaerial landslide and resulting tsunami that occurred on October 17, 2015, in Taan Fjord near the terminus of Tyndall Glacier, Alaska. The estimated landslide volume derived from radiated long-period seismicity (C. Stark (2015), Abstract EP51D-08, AGU Fall Meeting) was about 70-80 million cubic meters. Guided by satellite imagery and this volume estimate, we inferred an approximate landslide basal slip surface, and we used material property values identical to those used in our previous modeling of the 2014 Oso, Washington, landslide. With these inputs, the modeled tsunami inundation patterns on shorelines compare well with observations derived from satellite imagery.
Petroleum supply annual, 1990. [Contains Glossary
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1991-05-30
The Petroleum Supply Annual (PSA) contains information on the supply and disposition of crude oil and petroleum products. The publication reflects data that were collected from the petroleum industry during 1990 through annual and monthly surveys. The PSA is divided into two volumes. This first volume contains three sections, Summary Statistics, Detailed Statistics, and Refinery Capacity, each with final annual data. The second volume contains final statistics for each month of 1990, and replaces data previously published in the Petroleum Supply Monthly (PSM). The tables in Volumes 1 and 2 are similarly numbered to facilitate comparison between them. Explanatory Notes, located at the end of this publication, present information describing data collection, sources, estimation methodology, data quality control procedures, modifications to reporting requirements, and interpretation of tables. Industry terminology and product definitions are listed alphabetically in the Glossary. 35 tabs.
Petroleum supply annual 1992. [Contains glossary
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1993-05-27
The Petroleum Supply Annual (PSA) contains information on the supply and disposition of crude oil and petroleum products. The publication reflects data that were collected from the petroleum industry during 1992 through annual and monthly surveys. The PSA is divided into two volumes. The first volume contains four sections: Summary Statistics, Detailed Statistics, Refinery Capacity, and Oxygenate Capacity, each with final annual data. This second volume contains final statistics for each month of 1992, and replaces data previously published in the Petroleum Supply Monthly (PSM). The tables in Volumes 1 and 2 are similarly numbered to facilitate comparison between them. Explanatory Notes, located at the end of this publication, present information describing data collection, sources, estimation methodology, data quality control procedures, modifications to reporting requirements, and interpretation of tables. Industry terminology and product definitions are listed alphabetically in the Glossary.
NASA Astrophysics Data System (ADS)
Hadwin, Paul J.; Sipkens, T. A.; Thomson, K. A.; Liu, F.; Daun, K. J.
2016-01-01
Auto-correlated laser-induced incandescence (AC-LII) infers the soot volume fraction (SVF) of soot particles by comparing the spectral incandescence from laser-energized particles to the pyrometrically inferred peak soot temperature. This calculation requires detailed knowledge of model parameters such as the absorption function of soot, which may vary with combustion chemistry, soot age, and the internal structure of the soot. This work presents a Bayesian methodology to quantify such uncertainties. This technique treats the additional "nuisance" model parameters, including the soot absorption function, as stochastic variables and incorporates the current state of knowledge of these parameters into the inference process through maximum entropy priors. While standard AC-LII analysis provides a point estimate of the SVF, Bayesian techniques infer the posterior probability density, which will allow scientists and engineers to better assess the reliability of AC-LII inferred SVFs in the context of environmental regulations and competing diagnostics.
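A minimal grid-based sketch of the Bayesian idea: treat the absorption function E(m) as a nuisance variable with a prior encoding the current state of knowledge, multiply by the likelihood of the measured signal, and marginalize to obtain a posterior density for the soot volume fraction. The linear forward model, prior width, and noise level are invented for illustration and are much simpler than the actual AC-LII model.

```python
import numpy as np

# Toy forward model: measured incandescence S ~ f_v * E(m), constants folded in.
S_meas, sigma_S = 1.0, 0.05

fv = np.linspace(0.2, 3.0, 400)          # soot volume fraction (arb. units)
Em = np.linspace(0.2, 0.5, 300)          # absorption function E(m), "nuisance"

FV, EM = np.meshgrid(fv, Em, indexing="ij")
S_model = FV * EM

prior_Em = np.exp(-0.5 * ((Em - 0.35) / 0.05) ** 2)  # knowledge of E(m)
like = np.exp(-0.5 * ((S_model - S_meas) / sigma_S) ** 2)

post = like * prior_Em[None, :]          # flat prior assumed on f_v
post_fv = post.sum(axis=1)               # marginalize out the nuisance E(m)
post_fv /= np.trapz(post_fv, fv)
print(f"posterior mean f_v = {np.trapz(fv * post_fv, fv):.2f}")
```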
Multi-atlas and label fusion approach for patient-specific MRI based skull estimation.
Torrado-Carvajal, Angel; Herraiz, Joaquin L; Hernandez-Tamames, Juan A; San Jose-Estepar, Raul; Eryaman, Yigitcan; Rozenholc, Yves; Adalsteinsson, Elfar; Wald, Lawrence L; Malpica, Norberto
2016-04-01
MRI-based skull segmentation is a useful procedure for many imaging applications. This study describes a methodology for automatic segmentation of the complete skull from a single T1-weighted volume. The skull is estimated using a multi-atlas segmentation approach. Using a whole head computed tomography (CT) scan database, the skull in a new MRI volume is detected by nonrigid image registration of the volume to every CT, and combination of the individual segmentations by label fusion. We have compared Majority Voting, Simultaneous Truth and Performance Level Estimation (STAPLE), Shape Based Averaging (SBA), and the Selective and Iterative Method for Performance Level Estimation (SIMPLE) algorithms. The pipeline has been evaluated quantitatively using images from the Retrospective Image Registration Evaluation database (reaching an overlap of 72.46 ± 6.99%), a clinical CT-MR dataset (maximum overlap of 78.31 ± 6.97%), and a whole head CT-MRI pair (maximum overlap 78.68%). A qualitative evaluation has also been performed on MRI acquisitions of volunteers. It is possible to automatically segment the complete skull from MRI data using a multi-atlas and label fusion approach. This will allow the creation of complete MRI-based tissue models that can be used in electromagnetic dosimetry applications and attenuation correction in PET/MR. © 2015 Wiley Periodicals, Inc.
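As a minimal illustration of the simplest fusion rule compared above, a sketch of majority voting over pre-registered binary atlas masks (illustrative code, not the authors' pipeline; STAPLE, SBA and SIMPLE are more elaborate):

```python
import numpy as np

def majority_vote(atlas_masks: np.ndarray) -> np.ndarray:
    """atlas_masks: (n_atlases, X, Y, Z) binary arrays, already registered.
    A voxel is labeled 'skull' if more than half of the atlases agree."""
    votes = atlas_masks.sum(axis=0)
    return votes > (atlas_masks.shape[0] / 2.0)

# Toy usage: 5 atlases of a 4x4x4 volume
masks = np.random.default_rng(1).integers(0, 2, size=(5, 4, 4, 4))
fused = majority_vote(masks)
print(fused.shape, fused.dtype)  # (4, 4, 4) bool
```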
Pirat, Bahar; Little, Stephen H; Igo, Stephen R; McCulloch, Marti; Nosé, Yukihiko; Hartley, Craig J; Zoghbi, William A
2009-03-01
The proximal isovelocity surface area (PISA) method is useful in the quantitation of aortic regurgitation (AR). We hypothesized that actual measurement of PISA provided with real-time 3-dimensional (3D) color Doppler yields more accurate regurgitant volumes than those estimated by 2-dimensional (2D) color Doppler PISA. We developed a pulsatile flow model for AR with an imaging chamber in which interchangeable regurgitant orifices with defined shapes and areas were incorporated. An ultrasonic flow meter was used to calculate the reference regurgitant volumes. A total of 29 different flow conditions for 5 orifices with different shapes were tested at a rate of 72 beats/min. 2D PISA was calculated as 2πr², and 3D PISA was measured from 8 equidistant radial planes of the 3D PISA. Regurgitant volume was derived as PISA × aliasing velocity × time-velocity integral of AR / peak AR velocity. Regurgitant volumes by flow meter ranged between 12.6 and 30.6 mL/beat (mean 21.4 ± 5.5 mL/beat). Regurgitant volumes estimated by 2D PISA correlated well with volumes measured by flow meter (r = 0.69); however, a significant underestimation was observed (y = 0.5x + 0.6). Correlation with flow meter volumes was stronger for 3D PISA-derived regurgitant volumes (r = 0.83); significantly less underestimation of regurgitant volumes was seen, with a regression line close to identity (y = 0.9x + 3.9). Direct measurement of PISA is feasible, without geometric assumptions, using real-time 3D color Doppler. Calculation of aortic regurgitant volumes with 3D color Doppler using this methodology is more accurate than the conventional 2D method with its hemispheric PISA assumption.
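The volume computation stated in the abstract is simple enough to sketch directly; the numbers below are illustrative, not study data:

```python
import math

# Regurgitant volume = PISA * aliasing velocity * (VTI of AR / peak AR velocity).
# For the 2D method, PISA is the hemisphere 2*pi*r^2; for 3D it is measured
# directly from the radial planes. Units: cm and cm/s, giving mL per beat.

def regurgitant_volume(pisa_cm2: float, v_alias_cm_s: float,
                       vti_cm: float, v_peak_cm_s: float) -> float:
    return pisa_cm2 * v_alias_cm_s * vti_cm / v_peak_cm_s  # cm^3 = mL

r = 0.5                                   # measured PISA radius, cm
pisa_2d = 2.0 * math.pi * r ** 2          # hemispheric assumption
print(regurgitant_volume(pisa_2d, v_alias_cm_s=38.0,
                         vti_cm=150.0, v_peak_cm_s=450.0))  # ~20 mL/beat
```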
A methodology to derive Synthetic Design Hydrographs for river flood management
NASA Astrophysics Data System (ADS)
Tomirotti, Massimo; Mignosa, Paolo
2017-12-01
The design of flood protection measures requires in many cases not only the estimation of the peak discharges, but also of the volume of the floods and its time distribution. A typical solution to this kind of problem is the formulation of Synthetic Design Hydrographs (SDHs). In this paper a methodology to derive SDHs is proposed on the basis of the estimation of the Flow Duration Frequency (FDF) reduction curve and of a Peak-Duration (PD) relationship, furnishing respectively the quantiles of the maximum average discharge and the average peak position for each duration. The methodology is intended to synthesize the main features of the historical floods into a single SDH for each return period. The shape of the SDH is not selected a priori but results from the behaviour of the FDF and PD curves, which conveniently accounts for the variability of the shapes of the observed hydrographs at the local time scale. The validation of the methodology is performed with reference to flood routing problems in reservoirs, lakes and rivers. The results obtained demonstrate the capability of the SDHs to describe the effects of different hydraulic systems on the statistical regime of floods, even in the presence of strong modifications induced on the probability distribution of peak flows.
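A minimal sketch of the FDF ingredient (only; the PD relationship and SDH assembly are not reproduced): for each historical flood and each duration D, take the maximum D-step moving-average discharge, then form per-duration quantiles over the sample of floods. All data below are synthetic:

```python
import numpy as np

def max_mean_discharge(q: np.ndarray, d: int) -> float:
    """Maximum average discharge over any window of d steps."""
    kernel = np.ones(d) / d
    return np.convolve(q, kernel, mode="valid").max()

def fdf_points(hydrographs: list[np.ndarray], durations: list[int], p: float):
    """Empirical p-quantile of the max mean discharge for each duration."""
    return {d: np.quantile([max_mean_discharge(q, d) for q in hydrographs], p)
            for d in durations}

rng = np.random.default_rng(2)
floods = [rng.gamma(2.0, 50.0, size=240) for _ in range(30)]  # toy hydrographs
print(fdf_points(floods, durations=[1, 6, 24, 72], p=0.9))
```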
Olea, R.A.; Houseknecht, D.W.; Garrity, C.P.; Cook, T.A.
2011-01-01
Shale gas is a form of continuous unconventional hydrocarbon accumulation whose resource estimation is unfeasible through the inference of pore volume. Under these circumstances, the usual approach is to base the assessment on well productivity through estimated ultimate recovery (EUR). Unconventional resource assessments that consider uncertainty are typically done by applying analytical procedures based on classical statistics theory that ignore geographical location, do not take into account spatial correlation, and assume independence of EUR from other variables that may enter into the modeling. We formulate a new, more comprehensive approach based on sequential simulation to test methodologies known to be capable of more fully utilizing the data and overcoming unrealistic simplifications. Theoretical requirements demand modeling of EUR as areal density instead of well EUR. The new experimental methodology is illustrated by evaluating a gas play in the Woodford Shale in the Arkoma Basin of Oklahoma. Unlike previous assessments, we used net thickness and vitrinite reflectance as secondary variables correlated to cell EUR. In addition to the traditional probability distribution for undiscovered resources, the new methodology provides maps of EUR density and maps with probabilities of reaching any given cell EUR, which are useful to visualize geographical variations in prospectivity.
NASA Technical Reports Server (NTRS)
Campbell, B. H.
1974-01-01
A methodology developed for the balanced design of spacecraft subsystems, interrelating cost, performance, safety, and schedule considerations, was refined. The methodology consists of a two-step process: the first step is one of selecting all hardware designs which satisfy the given performance and safety requirements; the second step is one of estimating the cost and schedule required to design, build, and operate each spacecraft design. Using this methodology to develop a systems cost/performance model allows the user of such a model to establish specific designs and the related costs and schedule. The user is able to determine the sensitivity of design, costs, and schedules to changes in requirements. The resulting systems cost/performance model is described and implemented as a digital computer program.
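A toy sketch of the two-step process just described, screening candidates against requirements and then costing the survivors; all field names, thresholds, and estimating relationships are invented for illustration:

```python
# Step 1: keep only designs meeting performance and safety requirements.
# Step 2: apply parametric cost/schedule estimating relationships to the rest.

designs = [
    {"name": "A", "performance": 0.92, "safety": 0.99, "mass_kg": 410},
    {"name": "B", "performance": 0.85, "safety": 0.995, "mass_kg": 380},
    {"name": "C", "performance": 0.95, "safety": 0.97, "mass_kg": 450},
]

def feasible(d, min_perf=0.9, min_safety=0.98):
    return d["performance"] >= min_perf and d["safety"] >= min_safety

def cost_and_schedule(d):
    # Placeholder mass-driven estimating relationships (assumed).
    cost_musd = 2.5 * d["mass_kg"] ** 0.8
    schedule_months = 18 + 0.02 * d["mass_kg"]
    return cost_musd, schedule_months

for d in filter(feasible, designs):
    print(d["name"], cost_and_schedule(d))   # only design A survives step 1
```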
Volume and Mass Estimation of Three-Phase High Power Transformers for Space Applications
NASA Technical Reports Server (NTRS)
Kimnach, Greg L.
2004-01-01
Spacecraft historically have had sub-1-kWe electrical requirements for GN&C, science, and communications: Galileo at 600 We and Cassini at 900 We, for example. Because most missions have had the same order of magnitude power requirements, the Power Distribution Systems (PDS) use existing, space-qualified technology and are DC. As science payload and mission duration requirements increase, however, the required electrical power increases. Subsequently, this requires a change from passive energy conversion (solar arrays and batteries) to dynamic (alternator, solar dynamic, etc.), because dynamic conversion has higher thermal and conversion efficiencies, has higher power densities, and scales more readily to higher power levels. Furthermore, increased power requirements and physical distribution lengths are best served with high-voltage, multi-phase AC to maintain distribution efficiency and minimize voltage drops. The generated AC voltage must be stepped up (or down) to interface with various subsystems or electrical hardware. Part of the trade-space design for AC distribution systems is volume and mass estimation of high-power transformers. The volume and mass are functions of the power rating, operating frequency, the ambient and allowable temperature rise, the types and amount of heat transfer available, the core material and shape, the required flux density in a core, the maximum current density, etc. McLyman has tabulated the performance of a number of transformer cores and derived a "cookbook" methodology to determine the volume of transformers, whereas Schwarze derived an empirical method to estimate the mass of single-phase transformers. Based on the work of McLyman and Schwarze, it is the intent herein to derive an empirical solution to the volume and mass estimation of three-phase, laminated EI-core power transformers, having radiated and conducted heat transfer mechanisms available. Estimation of the mounting hardware, connectors, etc., is not included.
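A hedged sketch of an area-product estimate in the spirit of McLyman's "cookbook" method cited above, in its single-phase form (the paper extends this to three-phase EI cores). The constants are representative values for EI laminations and are assumptions, not taken from the paper:

```python
def area_product_cm4(p_va, f_hz, b_tesla, j_a_cm2, k_f=4.44, k_u=0.4):
    """Core area product Ap from apparent power and operating conditions
    (k_f: sine waveform coefficient, k_u: window utilization, assumed)."""
    return (p_va * 1.0e4 / (k_f * k_u * b_tesla * j_a_cm2 * f_hz)) ** (4.0 / 3.0)

def volume_cm3(ap_cm4, k_v=19.7):   # EI-lamination volume constant (assumed)
    return k_v * ap_cm4 ** 0.75

def mass_g(ap_cm4, k_w=68.2):       # EI-lamination weight constant (assumed)
    return k_w * ap_cm4 ** 0.75

ap = area_product_cm4(p_va=5000.0, f_hz=1000.0, b_tesla=1.2, j_a_cm2=300.0)
print(f"Ap = {ap:.1f} cm^4, V ~ {volume_cm3(ap):.0f} cm^3, "
      f"m ~ {mass_g(ap) / 1000:.1f} kg")
```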
NASA Technical Reports Server (NTRS)
Caluori, V. A.; Conrad, R. T.; Jenkins, J. C.
1980-01-01
Technological requirements and forecasts of rocket engine parameters and launch vehicles for future Earth-to-geosynchronous-orbit transportation systems are presented. The parametric performance, weight, and envelope data for the LOX/CH4, fuel-cooled, staged combustion cycle and the hydrogen-cooled, expander bleed cycle engine concepts are discussed. The costing methodology and ground rules used to develop the engine study are summarized. The weight estimating methodology for winged launch vehicles is described, and summary data, used to evaluate and compare weight data for dedicated and integrated O2/H2 subsystems for the SSTO, HLLV, and POTV, are presented. Detail weights, comparisons, and weight scaling equations are provided.
Velocity gradients and reservoir volumes lessons in computational sensitivity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, P.W.
1995-12-31
The sensitivity of reservoir volume estimation from depth-converted geophysical time maps to the velocity gradients employed is investigated through a simple model study. The computed volumes are disconcertingly sensitive to gradients, both horizontal and vertical. The need for an accurate method of time-to-depth conversion is well demonstrated by the model study, in which errors in velocity are magnified 40-fold in the computation of the volume. Thus if ±10% accuracy in the volume is desired, we must be able to estimate the velocity at the water contact with 0.25% accuracy. Put another way, if the velocity is 8000 feet per second at the well, then we have only ±20 feet per second leeway in estimating the velocity at the water contact. Very moderate horizontal and vertical gradients would typically indicate a velocity change of a few hundred feet per second if they are in the same direction. Clearly the interpreter needs to be very careful. A methodology is demonstrated which takes into account all the information that is available: velocities, tops, depositional and lithologic spatial patterns, and common sense. It is assumed that, through appropriate use of check shot and other time-depth information, the interpreter has correctly tied the reflection picks to the well tops. Such ties are ordinarily too soft for direct time-depth conversion to give adequate depth ties. The proposed method uses a common compaction law as its basis and incorporates time picks, tops, and stratigraphic maps into the depth conversion process. The resulting depth map ties the known well tops in an optimum fashion.
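A toy illustration of the sensitivity argument (not the author's model): depth-convert a time horizon with a linear velocity law V(z) = V0 + kz, for which depth at one-way time t is z = V0(e^{kt} - 1)/k, and see how a small contact-velocity error moves the computed hydrocarbon column. All values are invented:

```python
import math

def depth_ft(v0_ft_s, k_per_s, t_one_way_s):
    # Depth at one-way time t for a linear velocity law V(z) = V0 + k z.
    return v0_ft_s * (math.exp(k_per_s * t_one_way_s) - 1.0) / k_per_s

v0, k = 8000.0, 0.3                       # ft/s at the well, 1/s gradient
t_crest, t_contact = 0.50, 0.55           # one-way times, s

z_crest = depth_ft(v0, k, t_crest)        # tied at the well, held fixed
column = depth_ft(v0, k, t_contact) - z_crest

# A +0.25% error in the velocity used at the water contact:
column_err = depth_ft(v0 * 1.0025, k, t_contact) - z_crest
print(f"column {column:.0f} ft -> {column_err:.0f} ft "
      f"({100 * (column_err / column - 1):+.1f}%)")
# The ~10x magnification into the column is amplified further when the
# column is integrated into a closure volume.
```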
Gaswirth, Stephanie B.; Marra, Kristen R.; Cook, Troy A.; Charpentier, Ronald R.; Gautier, Donald L.; Higley, Debra K.; Klett, Timothy R.; Lewan, Michael D.; Lillis, Paul G.; Schenk, Christopher J.; Tennyson, Marilyn E.; Whidden, Katherine J.
2013-01-01
Using a geology-based assessment methodology, the U.S. Geological Survey estimated mean undiscovered volumes of 7.4 billion barrels of oil, 6.7 trillion cubic feet of associated/dissolved natural gas, and 0.53 billion barrels of natural gas liquids in the Bakken and Three Forks Formations in the Williston Basin Province of Montana, North Dakota, and South Dakota.
Fuzzy hidden Markov chains segmentation for volume determination and quantitation in PET.
Hatt, M; Lamare, F; Boussion, N; Turzo, A; Collet, C; Salzenstein, F; Roux, C; Jarritt, P; Carson, K; Cheze-Le Rest, C; Visvikis, D
2007-06-21
Accurate volume of interest (VOI) estimation in PET is crucial in different oncology applications such as response-to-therapy evaluation and radiotherapy treatment planning. The objective of our study was to compare the performance of the proposed algorithm for automatic lesion volume delineation, namely the fuzzy hidden Markov chains (FHMC) algorithm, with that of the threshold-based techniques that are the current state of the art in clinical practice. Like the classical hidden Markov chain (HMC) algorithm, FHMC takes into account noise, voxel intensity and spatial correlation in order to classify a voxel as background or functional VOI. However, the novelty of the fuzzy model consists of the inclusion of an estimation of imprecision, which should subsequently lead to a better modelling of the 'fuzzy' nature of the object-of-interest boundaries in emission tomography data. The performance of the algorithms has been assessed on both simulated and acquired datasets of the IEC phantom, covering a large range of spherical lesion sizes (from 10 to 37 mm), contrast ratios (4:1 and 8:1) and image noise levels. Both lesion activity recovery and VOI determination tasks were assessed in reconstructed images using two different voxel sizes (8 mm³ and 64 mm³). In order to account for both the functional volume location and its size, the concept of % classification errors was introduced in the evaluation of volume segmentation using the simulated datasets. Results reveal that FHMC performs substantially better than the threshold-based methodology for functional volume determination or activity concentration recovery considering a contrast ratio of 4:1 and lesion sizes of <28 mm. Furthermore, differences between the classification and volume estimation errors evaluated were smaller for the segmented volumes provided by the FHMC algorithm. Finally, the performance of the automatic algorithms was less susceptible to image noise levels in comparison to the threshold-based techniques. The analysis of both simulated and acquired datasets led to similar results and conclusions as far as the performance of the segmentation algorithms under evaluation is concerned.
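For context, a minimal sketch of the clinical baseline the FHMC algorithm is compared against: fixed-threshold VOI delineation at some percentage of the maximum voxel value. The phantom and threshold fraction below are illustrative:

```python
import numpy as np

def threshold_voi(image: np.ndarray, fraction: float = 0.4) -> np.ndarray:
    """Binary VOI mask: voxels above fraction * max uptake."""
    return image >= fraction * image.max()

def voi_volume_ml(mask: np.ndarray, voxel_mm3: float) -> float:
    return mask.sum() * voxel_mm3 / 1000.0

rng = np.random.default_rng(3)
img = rng.normal(1.0, 0.2, size=(64, 64, 32))      # background
img[28:36, 28:36, 12:20] += 4.0                    # 8x8x8-voxel "lesion"
mask = threshold_voi(img, 0.42)
print(f"VOI volume: {voi_volume_ml(mask, voxel_mm3=64.0):.1f} mL")  # ~32.8 mL
```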
Patouillard, Edith; Kleinschmidt, Immo; Hanson, Kara; Pok, Sochea; Palafox, Benjamin; Tougher, Sarah; O'Connell, Kate; Goodman, Catherine
2013-09-05
There is increased interest in using commercial providers for improving access to quality malaria treatment. Understanding their current role is an essential first step, notably in terms of the volume of diagnostics and anti-malarials they sell. Sales volume data can be used to measure the importance of different provider and product types, frequency of parasitological diagnosis and impact of interventions. Several methods for measuring sales volumes are available, yet all have methodological challenges and evidence is lacking on the comparability of different methods. Using sales volume data on anti-malarials and rapid diagnostic tests (RDTs) for malaria collected through provider recall (RC) and retail audits (RA), this study measures the degree of agreement between the two methods at wholesale and retail commercial providers in Cambodia following the Bland-Altman approach. Relative strengths and weaknesses of the methods were also investigated through qualitative research with fieldworkers. A total of 67 wholesalers and 107 retailers were sampled. Wholesale sales volumes were estimated through both methods for 62 anti-malarials and 23 RDTs, and retail volumes for 113 anti-malarials and 33 RDTs. At wholesale outlets, RA estimates for anti-malarial sales were on average higher than RC estimates (mean difference of four adult equivalent treatment doses, 95% CI 0.6 to 7.2), equivalent to 30% of mean sales volumes. For RDTs at wholesalers, the between-method mean difference was not statistically significant (one test, 95% CI −6.0 to 4.0). At retail outlets, between-method differences for both anti-malarials and RDTs increased with larger volumes being measured, so mean differences were not a meaningful measure of agreement between the methods. Qualitative research revealed that in Cambodia, where sales volumes are small, RC had key advantages: providers were perceived to remember their sales volumes more easily and to find RC less invasive; fieldworkers found it more convenient; and it was cheaper to implement than RA. Both RA and RC had implementation challenges and were prone to data collection errors. The choice of empirical methods is likely to have important implications for data quality depending on the study context.
2013-01-01
Background There is increased interest in using commercial providers for improving access to quality malaria treatment. Understanding their current role is an essential first step, notably in terms of the volume of diagnostics and anti-malarials they sell. Sales volume data can be used to measure the importance of different provider and product types, frequency of parasitological diagnosis and impact of interventions. Several methods for measuring sales volumes are available, yet all have methodological challenges and evidence is lacking on the comparability of different methods. Methods Using sales volume data on anti-malarials and rapid diagnostic tests (RDTs) for malaria collected through provider recall (RC) and retail audits (RA), this study measures the degree of agreement between the two methods at wholesale and retail commercial providers in Cambodia following the Bland-Altman approach. Relative strengths and weaknesses of the methods were also investigated through qualitative research with fieldworkers. Results A total of 67 wholesalers and 107 retailers were sampled. Wholesale sales volumes were estimated through both methods for 62 anti-malarials and 23 RDTs, and retail volumes for 113 anti-malarials and 33 RDTs. At wholesale outlets, RA estimates for anti-malarial sales were on average higher than RC estimates (mean difference of four adult equivalent treatment doses, 95% CI 0.6 to 7.2), equivalent to 30% of mean sales volumes. For RDTs at wholesalers, the between-method mean difference was not statistically significant (one test, 95% CI −6.0 to 4.0). At retail outlets, between-method differences for both anti-malarials and RDTs increased with larger volumes being measured, so mean differences were not a meaningful measure of agreement between the methods. Qualitative research revealed that in Cambodia, where sales volumes are small, RC had key advantages: providers were perceived to remember their sales volumes more easily and to find RC less invasive; fieldworkers found it more convenient; and it was cheaper to implement than RA. Discussion/conclusions Both RA and RC had implementation challenges and were prone to data collection errors. The choice of empirical methods is likely to have important implications for data quality depending on the study context. PMID:24010526
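The Bland-Altman computation used in this study is short enough to sketch; the synthetic data below stand in for the RA/RC volumes and are not study data:

```python
import numpy as np

def bland_altman(a: np.ndarray, b: np.ndarray):
    """Mean between-method difference (bias) with 95% limits of agreement."""
    diff = a - b
    bias = diff.mean()
    loa = 1.96 * diff.std(ddof=1)
    return bias, (bias - loa, bias + loa)

rng = np.random.default_rng(4)
ra = rng.gamma(2.0, 6.0, size=60)                  # audit volumes (AETDs)
rc = ra - 4.0 + rng.normal(0.0, 9.0, size=60)      # recall, biased lower
bias, (lo, hi) = bland_altman(ra, rc)
print(f"RA - RC bias: {bias:.1f} doses, 95% LoA: [{lo:.1f}, {hi:.1f}]")
```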
NASA Astrophysics Data System (ADS)
Martínez-Sánchez, J.; Puente, I.; GonzálezJorge, H.; Riveiro, B.; Arias, P.
2016-06-01
When ground conditions are weak, particularly in free-formed tunnel linings or retaining walls, sprayed concrete (shotcrete) can be applied to the exposed rock surfaces immediately after excavation. In these situations, shotcrete is normally applied conjointly with rock bolts and mesh, thereby supporting the loose material that causes many of the small ground falls. On the other hand, contractors want to determine the thickness and volume of sprayed concrete for both technical and economic reasons: to guarantee its structural strength but also to avoid delivering excess material that they will not be paid for. In this paper, we first introduce a terrestrial LiDAR-based method for the automatic detection of rock bolts, as typically used in anchored retaining walls. These ground support elements are segmented based on their geometry, and they serve as control points for the co-registration of two successive scans, before and after shotcreting. We then compare both point clouds to estimate the sprayed concrete thickness and the expended volume on the wall. This novel methodology is demonstrated on repeated scan data from a retaining wall in the city of Vigo (Spain), resulting in a rock bolt detection rate of 91%, which permits detailed thickness information to be obtained and a total volume of 3597 litres of concrete to be calculated. These results verify the effectiveness of the developed approach, increasing productivity and improving on previous empirical proposals for real-time thickness estimation.
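A minimal sketch of the cloud-to-cloud comparison step, assuming co-registration (via the detected rock bolts) has already been done: approximate sprayed thickness at each post-shotcrete point by the distance to the nearest pre-shotcrete point, then volume as mean thickness times wall area. The synthetic wall below is illustrative:

```python
import numpy as np
from scipy.spatial import cKDTree

def thickness_and_volume(before: np.ndarray, after: np.ndarray,
                         wall_area_m2: float):
    tree = cKDTree(before)
    thickness_m, _ = tree.query(after)             # nearest-neighbour distance
    mean_t = thickness_m.mean()
    return mean_t, mean_t * wall_area_m2 * 1000.0  # litres

rng = np.random.default_rng(5)
wall = rng.uniform(0.0, 5.0, size=(2000, 2))           # 5 m x 5 m wall grid
before = np.column_stack([wall, np.zeros(2000)])
after = np.column_stack([wall, np.full(2000, 0.14)])   # 14 cm of shotcrete
t, litres = thickness_and_volume(before, after, wall_area_m2=25.0)
print(f"mean thickness {t * 100:.1f} cm, volume {litres:.0f} L")  # 3500 L
```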
Stanley, Richard G.; Pierce, Brenda S.; Houseknecht, David W.
2011-01-01
The U.S. Geological Survey (USGS) has completed an assessment of the volumes of undiscovered, technically recoverable oil and gas resources in conventional and continuous accumulations in Cook Inlet. The assessment used a geology-based methodology and results from new scientific research by the USGS and the State of Alaska, Department of Natural Resources, Division of Geological and Geophysical Surveys and Division of Oil and Gas (DOG). In the Cook Inlet region, the USGS estimates mean undiscovered volumes of nearly 600 million barrels of oil, about 19 trillion cubic feet of gas, and about 46 million barrels of natural gas liquids.
Evaluating performance of stormwater sampling approaches using a dynamic watershed model.
Ackerman, Drew; Stein, Eric D; Ritter, Kerry J
2011-09-01
Accurate quantification of stormwater pollutant levels is essential for estimating overall contaminant discharge to receiving waters. Numerous sampling approaches exist that attempt to balance accuracy against the costs associated with the sampling method. This study employs a novel and practical approach of evaluating the accuracy of different stormwater monitoring methodologies using stormflows and constituent concentrations produced by a fully validated continuous-simulation watershed model. A major advantage of using a watershed model to simulate pollutant concentrations is that a large number of storms representing a broad range of conditions can be applied in testing the various sampling approaches. Seventy-eight distinct methodologies were evaluated by "virtual samplings" of 166 simulated storms of varying size, intensity and duration, representing 14 years of storms in Ballona Creek near Los Angeles, California. The 78 methods can be grouped into four general strategies: volume-paced compositing, time-paced compositing, pollutograph sampling, and microsampling. The performance of each sampling strategy was evaluated by comparing (1) the median relative error between the virtually sampled and the true modeled event mean concentration (EMC) of each storm (accuracy), (2) the median absolute deviation about the median (MAD) of the relative error (precision), and (3) the percentage of storms where sampling methods were within 10% of the true EMC (a combined measure of accuracy and precision). Finally, costs associated with site setup, sampling, and laboratory analysis were estimated for each method. Pollutograph sampling consistently outperformed the other three methods both in terms of accuracy and precision, but was the most costly method evaluated. Time-paced sampling consistently underestimated, while volume-paced sampling overestimated, the storm EMCs. Microsampling performance approached that of pollutograph sampling at a substantial cost savings. The most efficient method for routine stormwater monitoring in terms of a balance between performance and cost was volume-paced microsampling, with variable sample pacing to ensure that the entirety of the storm was captured. Pollutograph sampling is recommended if the data are to be used for detailed analysis of runoff dynamics.
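To illustrate the quantity being estimated, a minimal sketch of the true flow-weighted EMC from a modeled pollutograph, and a "virtually sampled" volume-paced composite of it. The synthetic storm and pacing are illustrative, not the study's model output:

```python
import numpy as np

def true_emc(q: np.ndarray, c: np.ndarray, dt_s: float = 300.0) -> float:
    """Flow-weighted EMC: total pollutant mass over total runoff volume."""
    return (q * c * dt_s).sum() / (q * dt_s).sum()

def volume_paced_emc(q: np.ndarray, c: np.ndarray, pace_m3: float,
                     dt_s: float = 300.0) -> float:
    """Composite of grab samples taken each time pace_m3 of flow passes."""
    cum_vol = np.cumsum(q * dt_s)
    idx = np.searchsorted(cum_vol, np.arange(pace_m3, cum_vol[-1], pace_m3))
    return c[idx].mean()

t = np.arange(0, 288)                                    # 24 h of 5-min steps
q = np.exp(-0.5 * ((t - 60) / 25.0) ** 2) * 8.0 + 0.2    # storm hydrograph
c = 40.0 * np.exp(-t / 80.0) + 5.0                       # first-flush pollutograph
print(true_emc(q, c), volume_paced_emc(q, c, pace_m3=20000.0))
```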
Sanz-Requena, Roberto; Moratal, David; García-Sánchez, Diego Ramón; Bodí, Vicente; Rieta, José Joaquín; Sanchis, Juan Manuel
2007-03-01
Intravascular ultrasound (IVUS) imaging is used along with X-ray coronary angiography to detect vessel pathologies. Manual analysis of IVUS images is slow and time-consuming and is not feasible for clinical purposes. A semi-automated method is proposed to generate 3D reconstructions from IVUS video sequences, so that a fast diagnosis can easily be made, quantifying plaque length and severity as well as the plaque volume of the vessels under study. The methodology described in this work has four steps: pre-processing of IVUS images, segmentation of the media-adventitia contour, detection of the intima and plaque, and 3D reconstruction of the vessel. Pre-processing is intended to remove noise from the images without blurring the edges. Segmentation of the media-adventitia contour is achieved using active contours (snakes). In particular, we use the gradient vector flow (GVF) as the external force for the snakes. The detection of the lumen border is obtained taking into account gray-level information of the inner part of the previously detected contours. A knowledge-based approach is used to determine which gray level corresponds statistically to each of the regions of interest: intima, plaque and lumen. The catheter region is automatically discarded. An estimate of plaque type is also given. Finally, 3D reconstruction of all detected regions is made. The suitability of this methodology has been verified for the analysis and visualization of plaque length, stenosis severity, automatic detection of the most problematic regions, calculation of plaque volumes, and a preliminary estimation of plaque type. Automatic measures of lumen and vessel area show an average error smaller than 1 mm² (approximately 10% of the average measure), calculations of plaque and lumen volume show errors smaller than 0.5 mm³ (approximately 20% of the average measure), and plaque type estimates show a mismatch of less than 8% in the analysed frames.
NASA Astrophysics Data System (ADS)
Lacava, T.; Faruolo, M.; Coviello, I.; Filizzola, C.; Pergola, N.; Tramutoli, V.
2014-12-01
Gas flaring is one of the most controversial energy and environmental issues the Earth is facing, contributing to global warming and climate change. According to the World Bank, each year about 150 billion cubic meters of gas are flared globally, equivalent to the annual gas use of Italy and France combined. Besides, about 400 million tons of CO2 (about 1.2% of global CO2 emissions) are added annually to the atmosphere. Efforts to evaluate the impact of flaring on the surrounding environment are hampered by the lack of official information on flare locations and volumes. Suitable satellite-based techniques could offer a potential solution to this problem through the detection and subsequent mapping of flare locations as well as the estimation of gas emissions. In this paper a new methodological approach, based on the Robust Satellite Techniques (RST), a multi-temporal scheme of satellite data analysis, was developed to analyze and characterize the flaring activity of the largest Italian gas and oil pre-treatment plant (ENI-COVA), located in Val d'Agri (Basilicata). For this site, located in an anthropized area characterized by a large environmental complexity, flaring emissions are mainly related to emergency conditions (i.e., waste flaring), the industrial process being regulated by strict regional laws. With reference to the peculiar characteristics of COVA flaring, the RST approach was implemented on 13 years of EOS-MODIS (Earth Observing System - Moderate Resolution Imaging Spectroradiometer) infrared data to detect COVA-related thermal anomalies and to develop a regression model for estimating flared gas volumes. The methodological approach, the whole processing chain and the preliminary results are shown and discussed in this paper. In addition, the possible implementation of the proposed approach on data acquired by the Suomi NPP VIIRS (National Polar-orbiting Partnership - Visible Infrared Imaging Radiometer Suite) and the expected improvements are also discussed.
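As a sketch of the final regression step described above, a linear fit relating a satellite radiative signal to flared volumes; the linear form, variable names, and all data are assumptions for illustration, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(10)
radiant_signal = rng.uniform(0.5, 5.0, size=40)          # anomaly intensity
flared_m3 = 1.2e4 * radiant_signal + rng.normal(0, 5e3, size=40)  # synthetic

slope, intercept = np.polyfit(radiant_signal, flared_m3, 1)
pred = slope * radiant_signal + intercept
ss_res = ((flared_m3 - pred) ** 2).sum()
ss_tot = ((flared_m3 - flared_m3.mean()) ** 2).sum()
print(f"volume ~ {slope:.0f} * signal + {intercept:.0f}, "
      f"R^2 = {1 - ss_res / ss_tot:.2f}")
```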
Klett, T.R.
2011-01-01
The U.S. Geological Survey, using a geology-based assessment methodology, estimated mean volumes of technically recoverable, conventional, undiscovered petroleum resources at 84 million barrels of crude oil, 4.7 trillion cubic feet of natural gas, and 130 million barrels of natural gas liquids for the Dnieper-Donets Basin Province and 39 million barrels of crude oil, 48 billion cubic feet of natural gas, and 1 million barrels of natural gas liquids for the Pripyat Basin Province. The assessments are part of a program to estimate these resources for priority basins throughout the world.
Multi-observation PET image analysis for patient follow-up quantitation and therapy assessment
NASA Astrophysics Data System (ADS)
David, S.; Visvikis, D.; Roux, C.; Hatt, M.
2011-09-01
In positron emission tomography (PET) imaging, an early therapeutic response is usually characterized by variations of semi-quantitative parameters restricted to the maximum SUV measured in PET scans during the treatment. Such measurements do not reflect overall tumor volume and radiotracer uptake variations. The proposed approach is based on multi-observation image analysis, merging several PET acquisitions to assess tumor metabolic volume and uptake variations. The fusion algorithm is based on iterative estimation using a stochastic expectation maximization (SEM) algorithm. The proposed method was applied to simulated and clinical follow-up PET images. We compared the multi-observation fusion performance to threshold-based methods proposed for the assessment of the therapeutic response based on functional volumes. On simulated datasets the adaptive threshold applied independently on both images led to higher errors than the ASEM fusion, and on clinical datasets it failed to provide coherent measurements for four patients out of seven due to aberrant delineations. The ASEM method demonstrated improved and more robust estimation, leading to more pertinent measurements. Future work will consist of extending the methodology and applying it to clinical multi-tracer datasets in order to evaluate its potential impact on biological tumor volume definition for radiotherapy applications.
NASA Astrophysics Data System (ADS)
Shpotyuk, Ya; Cebulski, J.; Ingram, A.; Shpotyuk, O.
2017-12-01
Methodological possibilities of positron annihilation lifetime (PAL) spectroscopy applied to nanostructurized substances treated within a three-term fitting procedure are reconsidered to parameterize their atomic-deficient structural arrangement. In contrast to conventional three-term fitting analysis of the detected PAL spectra based on admixed positron trapping and positronium (Ps) decaying, the nanostructurization due to guest nanoparticles embedded in a host matrix is considered as producing modified trapping, which involves conversion between these channels. The developed approach, referred to as the x3-x2-coupling decomposition algorithm, allows estimation of the free volumes of interfacial voids responsible for positron trapping, and of bulk lifetimes, in nanoparticle-embedded substances. This methodology is validated using experimental data of Chakraverty et al. [Phys. Rev. B 71 (2005) 024115] on a PAL study of composites formed by guest NiFe2O4 nanocrystals grown in a host SiO2 matrix.
Assessment of undiscovered oil and gas resources of the Sud Province, north-central Africa
Brownfield, M.E.; Klett, T.R.; Schenk, C.J.; Charpentier, R.R.; Cook, T.A.; Pollastro, R.M.; Tennyson, Marilyn E.
2011-01-01
The Sud Province located in north-central Africa recently was assessed for undiscovered, technically recoverable oil, natural gas, and natural gas liquids resources as part of the U.S. Geological Survey's (USGS) World Oil and Gas Assessment. Using a geology-based assessment methodology, the USGS estimated mean volumes of 7.31 billion barrels of oil, 13.42 trillion cubic feet of natural gas, and 353 million barrels of natural gas liquids.
Assessment of undiscovered oil and gas resources of the Chad Basin Province, North-Central Africa
Brownfield, Michael E.; Schenk, Christopher J.; Charpentier, Ronald R.; Klett, Timothy R.; Cook, Troy A.; Pollastro, Richard M.; Tennyson, Marilyn E.
2010-01-01
The Chad Basin Province located in north-central Africa recently was assessed for undiscovered, technically recoverable oil, natural gas, and natural gas liquids resources as part of the U.S. Geological Survey's (USGS) World Oil and Gas Assessment. Using a geology-based assessment methodology, the USGS estimated mean volumes of 2.32 billion barrels of oil, 14.65 trillion cubic feet of natural gas, and 391 million barrels of natural gas liquids.
Proceedings of the Workshop on Identification and Control of Flexible Space Structures, volume 1
NASA Technical Reports Server (NTRS)
Rodriguez, G. (Editor)
1985-01-01
Identification and control of flexible space structures were studied. The application of the most advanced modeling, estimation, identification, and control methodologies to flexible space structures was discussed. The following general areas were covered: space platforms, antennas, and flight experiments; control/structure interactions (modeling, integrated design and optimization, control and stabilization, and shape control); control technology; control of space stations; and large antenna control, dynamics and control experiments, and control/structure interaction experiments.
Stanley, Richard G.; Charpentier, Ronald R.; Cook, Troy A.; Houseknecht, David W.; Klett, Timothy R.; Lewis, Kristen A.; Lillis, Paul G.; Nelson, Philip H.; Phillips, Jeffrey D.; Pollastro, Richard M.; Potter, Christopher J.; Rouse, William A.; Saltus, Richard W.; Schenk, Christopher J.; Shah, Anjana K.; Valin, Zenon C.
2011-01-01
The U.S. Geological Survey (USGS) recently completed a new assessment of undiscovered, technically recoverable oil and gas resources in the Cook Inlet region of south-central Alaska. Using a geology-based assessment methodology, the USGS estimates that mean undiscovered volumes of nearly 600 million barrels of oil, about 19 trillion cubic feet of natural gas, and 46 million barrels of natural gas liquids remain to be found in this area.
Assessment of undiscovered oil and gas resources of four East Africa Geologic Provinces
Brownfield, Michael E.; Schenk, Christopher J.; Charpentier, Ronald R.; Klett, Timothy R.; Cook, Troy A.; Pollastro, Richard M.; Tennyson, Marilyn E.
2012-01-01
Four geologic provinces along the east coast of Africa recently were assessed for undiscovered, technically recoverable oil, natural gas, and natural gas liquids resources as part of the U.S. Geological Survey's (USGS) World Oil and Gas Assessment. Using a geology-based assessment methodology, the USGS estimated mean volumes of 27.6 billion barrels of oil, 441.1 trillion cubic feet of natural gas, and 13.77 billion barrels of natural gas liquids.
Airborne Electromagnetic Mapping of Peatlands: a Case Study in Norway.
NASA Astrophysics Data System (ADS)
Silvestri, S.; Viezzoli, A.; Pfaffhuber, A. A.; Vettore, A.
2017-12-01
Peatlands are extraordinary reservoirs of organic carbon that can be found over a wide range of latitudes, in tropical, temperate and (sub)polar climates. According to some estimates, the carbon stored in peatlands almost matches the atmospheric carbon pool. Peatland degradation due to natural and anthropogenic factors releases large amounts of CO2 and other greenhouse gases into the atmosphere every year. The conservation of peatlands is therefore a key measure to reduce emissions and to mitigate climate change. An effective plan to prevent peatland degradation must start from a precise estimate of the volume of peat stored across vast territories around the world. One example is the many bogs that cover large areas of Norway. Our research combines the use of high-spatial-resolution satellite optical data with Airborne Electromagnetic (AEM) and field measurements in order to map the extent and thickness of peat in Brøttum, Ringsaker province, Norway. The methodology allows us to quantify the volume of peat as well as the organic carbon stock. The variable thickness typical of Norwegian bogs allows us to test the limits of the AEM methodology in resolving near-surface peat layers. This project has received funding from the European Union's Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No. 747809. Start date: 1 June 2017. Duration: 24 months.
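Once AEM inversion and satellite mapping yield a peat-thickness grid, the final bookkeeping is straightforward; a minimal sketch follows, with bulk density and carbon fraction as assumed illustrative values, not figures from the study:

```python
import numpy as np

def peat_carbon_stock(thickness_m: np.ndarray, cell_area_m2: float,
                      bulk_density_kg_m3: float = 100.0,
                      c_fraction: float = 0.5):
    """Peat volume (m^3) and organic carbon stock (t C) from a thickness grid."""
    volume_m3 = np.nansum(thickness_m) * cell_area_m2
    carbon_t = volume_m3 * bulk_density_kg_m3 * c_fraction / 1000.0
    return volume_m3, carbon_t

rng = np.random.default_rng(6)
thickness = rng.uniform(0.5, 4.0, size=(200, 200))   # toy 10 m grid of a bog
vol, c_t = peat_carbon_stock(thickness, cell_area_m2=100.0)
print(f"peat volume {vol / 1e6:.2f} Mm^3, carbon stock {c_t / 1e3:.1f} kt C")
```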
DOT National Transportation Integrated Search
1974-08-01
Volume 3 describes the methodology for man-machine task allocation. It contains a description of man and machine performance capabilities and an explanation of the methodology employed to allocate tasks to human or automated resources. It also presen...
A regional-scale estimation of ice wedge ice volumes in the Canadian High Arctic
NASA Astrophysics Data System (ADS)
Templeton, M.; Pollard, W. H.; Grand'Maison, C. B.
2016-12-01
Ice wedges are both prominent and environmentally vulnerable features in continuous permafrost environments. As the world's Arctic regions begin to warm, concern over the potential effects of ice wedge melt-out has become an immediate issue, receiving much attention in the permafrost literature. In this study we estimate the volume of ice wedge ice for large areas in the Canadian High Arctic through the use of high-resolution satellite imagery and the improved capabilities of Geographic Information Systems (GIS). The methodology used for this study is similar to that applied in Siberia and Alaska by Ulrich et al. in 2014. Utilizing Ulrich's technique, this study detected ice wedge polygons from satellite imagery using ArcGIS. The average width and depth of these ice wedges were obtained from a combination of field data and long-term field studies for the same location. The assumptions used in the analysis of ice wedge volume have been tested, including trough width being representative of ice wedge width, and ice wedge ice content (Pollard and French 1980). This study used specific field sites located near Eureka on Ellesmere Island (N80°01', W85°43') and at Expedition Fiord on Axel Heiberg Island (N79°23', W90°59'). The preliminary results indicate that the methodology used by Ulrich et al. (2014) is transferable to the Canadian High Arctic, and that ice wedge volumes range between 3 and 10% of the upper part of the permafrost. These findings are similar to those of previous studies, and their importance is made all the more evident by the dynamic nature of ice wedges, which are arguably a key driver of thermokarst terrain. The ubiquitous nature of ice wedges across Arctic terrain highlights the need to improve our understanding of ice wedge dynamics, as subsidence from ice wedge melt-out could lead to large-scale landscape change.
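A hedged sketch of the geometric calculation behind this kind of estimate, after the general approach of Ulrich et al. (2014): treat each wedge as a triangular prism of width w and depth d running along the polygon troughs, and express total wedge ice as a fraction of the upper-permafrost volume. All numbers are invented for illustration:

```python
def wedge_ice_fraction(trough_length_m: float, area_m2: float,
                       width_m: float, depth_m: float,
                       layer_thickness_m: float) -> float:
    # Each metre of trough hosts one triangular wedge cross-section w*d/2.
    wedge_volume = 0.5 * width_m * depth_m * trough_length_m
    return wedge_volume / (area_m2 * layer_thickness_m)

# Toy numbers: 15 m polygons mapped over 1 km^2; w, d from field observations.
area = 1.0e6
trough_length = 2 * area / 15.0          # approx. for a square polygon net
frac = wedge_ice_fraction(trough_length, area, width_m=1.5, depth_m=3.0,
                          layer_thickness_m=4.0)
print(f"wedge-ice fraction: {100 * frac:.1f}%")   # ~7.5%, within 3-10%
```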
Catanuto, Giuseppe; Taher, Wafa; Rocco, Nicola; Catalano, Francesca; Allegra, Dario; Milotta, Filippo Luigi Maria; Stanco, Filippo; Gallo, Giovanni; Nava, Maurizio Bruno
2018-03-20
Breast shape is defined utilizing mainly qualitative assessments (full, flat, ptotic) or estimates, such as volume or distances between reference points, that cannot describe it reliably. We quantitatively describe breast shape with two parameters derived from a statistical methodology known as principal component analysis (PCA). We created a heterogeneous dataset of breast shapes acquired with a commercial infrared 3-dimensional scanner on which PCA was performed. We plotted on a Cartesian plane the two highest values of PCA for each breast (principal components 1 and 2). Testing of the methodology on a preoperative and postoperative surgical case, and test-retest, were performed by two operators. The first two principal components derived from PCA are able to characterize the shapes of the breasts included in the dataset. The test-retest demonstrated that different operators are able to obtain very similar values of PCA. The system is also able to identify major changes between the preoperative and postoperative stages of a two-stage reconstruction. Even minor changes were correctly detected by the system. This methodology can reliably describe the shape of a breast. An expert operator and a newly trained operator can reach similar results in a test-retest validation. Once developed and after further validation, this methodology could be employed as a good tool for outcome evaluation, auditing, and benchmarking.
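A minimal sketch of the shape-description step, assuming the 3D surfaces are pre-aligned and consistently sampled (preprocessing the abstract does not detail): stack each scan as a row, run PCA, and keep each case's first two component scores as its coordinates on the shape plane:

```python
import numpy as np

def pca_scores(shapes: np.ndarray, n_components: int = 2) -> np.ndarray:
    """shapes: (n_cases, 3 * n_vertices) matrix of flattened surface points.
    Returns each case's scores on the first n_components principal axes."""
    centered = shapes - shapes.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:n_components].T          # (n_cases, n_components)

rng = np.random.default_rng(7)
dataset = rng.normal(size=(40, 3 * 500))           # 40 toy scans, 500 vertices
xy = pca_scores(dataset)
print(xy.shape)                                    # (40, 2) -> plot as points
```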
Soil Bulk Density by Soil Type, Land Use and Data Source: Putting the Error in SOC Estimates
NASA Astrophysics Data System (ADS)
Wills, S. A.; Rossi, A.; Loecke, T.; Ramcharan, A. M.; Roecker, S.; Mishra, U.; Waltman, S.; Nave, L. E.; Williams, C. O.; Beaudette, D.; Libohova, Z.; Vasilas, L.
2017-12-01
An important part of SOC stock and pool assessment is the estimation and application of bulk density. The concept of bulk density is relatively simple (the mass of soil in a given volume), but bulk density can be difficult to measure in soils due to logistical and methodological constraints. While many estimates of SOC pools use legacy data, few concerted efforts have been made to assess the process used to convert laboratory carbon concentration measurements and bulk density collection into volumetrically based SOC estimates. The methodologies used are particularly sensitive in wetlands and organic soils with high amounts of carbon and very low bulk densities. We will present an analysis across four databases: NCSS, the National Cooperative Soil Survey Characterization dataset; RaCA, the Rapid Carbon Assessment sample dataset; NWCA, the National Wetland Condition Assessment; and ISCN, the International Soil Carbon Network. The relationship between bulk density and soil organic carbon will be evaluated by dataset and by land use/land cover information. Prediction methods (both regression and machine learning) will be compared and contrasted across datasets and available input information. The assessment and application of bulk density, including modeling, aggregation and error propagation, will be evaluated. Finally, recommendations will be made about both the use of new data in soil survey products (such as SSURGO) and the use of that information as legacy data in SOC pool estimates.
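To show why bulk density matters, a sketch of the standard conversion from a laboratory carbon concentration to a volumetric SOC stock; any bulk-density error propagates directly into the stock. Parameter values are illustrative:

```python
def soc_stock_t_ha(oc_pct: float, bulk_density_g_cm3: float,
                   depth_cm: float, coarse_frag_fraction: float = 0.0) -> float:
    """SOC stock in t/ha for one horizon:
    OC% x bulk density (g/cm^3) x depth (cm) x (1 - coarse fragments)."""
    return oc_pct * bulk_density_g_cm3 * depth_cm * (1.0 - coarse_frag_fraction)

# Same 30 cm mineral horizon, +/-10% bulk density error:
for bd in (1.17, 1.30, 1.43):
    print(bd, round(soc_stock_t_ha(oc_pct=2.0, bulk_density_g_cm3=bd,
                                   depth_cm=30.0), 1))   # 70.2, 78.0, 85.8
```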
NASA Technical Reports Server (NTRS)
Bobick, J. C.; Braun, R. L.; Denny, R. E.
1979-01-01
The Analysis of Benefits and Costs of Aeronautical Research and Technology (ABC-ART) models are documented. These models were developed by NASA for use in analyzing the economic feasibility of applying advanced aeronautical technology to future civil aircraft. The methodology is composed of three major modules: the fleet accounting module, the airframe manufacturer module, and the air carrier module. The fleet accounting module is used to estimate the number of new aircraft required as a function of time to meet demand. This estimation is based primarily upon the expected retirement age of existing aircraft and the expected change in revenue passenger miles demanded. Fuel consumption estimates are also generated by this module. The airframe manufacturer module is used to analyze the feasibility of manufacturing the new aircraft demanded. The module includes logic for production scheduling and estimating manufacturing costs. For a series of aircraft selling prices, a cash flow analysis is performed and a rate of return on investment is calculated. The air carrier module provides a tool for analyzing the financial feasibility of an airline purchasing and operating the new aircraft. This module includes a methodology for computing the air carrier direct and indirect operating costs, performing a cash flow analysis, and estimating the internal rate of return on investment for a set of aircraft purchase prices.
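A toy sketch of the fleet accounting module's core bookkeeping as described above: new aircraft demanded equals replacements for retirements plus the capacity growth implied by revenue-passenger-mile (RPM) demand. All parameters are invented for illustration:

```python
def new_aircraft_needed(fleet_size: int, retirement_rate: float,
                        rpm_growth: float, rpm_per_aircraft: float,
                        current_rpm: float) -> int:
    retired = fleet_size * retirement_rate
    growth_aircraft = current_rpm * rpm_growth / rpm_per_aircraft
    return round(retired + growth_aircraft)

# 2,000-aircraft fleet, 4% retired per year, 5% RPM growth:
print(new_aircraft_needed(2000, 0.04, 0.05, rpm_per_aircraft=2.5e8,
                          current_rpm=4.0e11))   # 160 aircraft
```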
Assessment of undiscovered oil and gas resources of the West African Coastal Province, West Africa
Brownfield, Michael E.; Charpentier, Ronald R.; Schenk, Christopher J.; Klett, Timothy R.; Cook, Troy A.; Pollastro, Richard M.
2011-01-01
The West African Coastal Province along the west African coastline recently was assessed for undiscovered, technically recoverable oil, natural gas, and natural gas liquids resources as part of the U.S. Geological Survey's (USGS) World Oil and Gas Assessment. Using a geology-based assessment methodology, the USGS estimated mean volumes of 3.2 billion barrels of oil, 23.63 trillion cubic feet of natural gas, and 721 million barrels of natural gas liquids.
Assessment of Undiscovered Oil and Gas Resources of Four West Africa Geologic Provinces
Brownfield, Michael E.; Charpentier, Ronald R.; Cook, Troy A.; Klett, Timothy R.; Pitman, Janet K.; Pollastro, Richard M.; Schenk, Christopher J.; Tennyson, Marilyn E.
2010-01-01
Four geologic provinces located along the northwest and west-central coast of Africa recently were assessed for undiscovered oil, natural gas, and natural gas liquids resources as part of the U.S. Geological Survey's (USGS) World Oil and Gas Assessment. Using a geology-based assessment methodology, the USGS estimated mean volumes of 71.7 billion barrels of oil, 187.2 trillion cubic feet of natural gas, and 10.9 billion barrels of natural gas liquids.
NASA Astrophysics Data System (ADS)
Acín, V.; Bird, I.; Boccali, T.; Cancio, G.; Collier, I. P.; Corney, D.; Delaunay, B.; Delfino, M.; dell'Agnello, L.; Flix, J.; Fuhrmann, P.; Gasthuber, M.; Gülzow, V.; Heiss, A.; Lamanna, G.; Macchi, P.-E.; Maggi, M.; Matthews, B.; Neissner, C.; Nief, J.-Y.; Porto, M. C.; Sansum, A.; Schulz, M.; Shiers, J.
2015-12-01
Several scientific fields, including Astrophysics, Astroparticle Physics, Cosmology, Nuclear and Particle Physics, and Research with Photons, are estimating that by the 2020 decade they will require data handling systems with data volumes approaching the Zettabyte, distributed amongst as many as 10^18 individually addressable data objects (Zettabyte-Exascale systems). It may be convenient or necessary to deploy such systems using multiple physical sites. This paper describes the findings of a working group composed of experts from several
Assessment of undiscovered oil and gas resources of the South Africa Coastal Province, Africa
Brownfield, Michael E.; Schenk, Christopher J.; Charpentier, Ronald R.; Klett, Timothy R.; Cook, Troy A.; Pollastro, Richard M.
2012-01-01
The South Africa Coastal Province along the South Africa coast recently was assessed for undiscovered, technically recoverable oil, natural gas, and natural gas liquids resources as part of the U.S. Geological Survey's (USGS) World Oil and Gas Assessment. Using a geology-based assessment methodology, the USGS estimated mean volumes of 2.13 billion barrels of oil, 35.96 trillion cubic feet of natural gas, and 1,115 million barrels of natural gas liquids.
Wandrey, Craig J.; Schenk, Christopher J.; Klett, Timothy R.; Brownfield, Michael E.; Charpentier, Ronald R.; Cook, Troy A.; Pollastro, Richard M.; Tennyson, Marilyn E.
2012-01-01
The Irrawaddy-Andaman and Indo-Burman Geologic Provinces were recently assessed for undiscovered technically recoverable oil, natural gas, and natural gas liquids resources as part of the U.S. Geological Survey's (USGS) World Oil and Gas Assessment. Using a geology-based assessment methodology, the USGS estimated mean volumes of 2.3 billion barrels of oil, 79.6 trillion cubic feet of gas, and 2.1 billion barrels of natural gas liquids.
Evolution of Mobil's methods to evaluate exploration and producing opportunities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gaynor, C.B.; Cook, D.M. Jr.
1996-08-01
Over the past decade, Mobil has changed significantly in size, structure and focus to improve profitability. Concurrently, work processes and methodologies have been modified to improve resource utilization and opportunity selection. The key imperative has been recognition of the full range of hydrocarbon volume uncertainty, its risk and value. Exploration has focused on increasing success through improved geotechnical estimates and demonstrating value addition. For Exploration and Producing, the important tasks were: (1) A centralized Exploration and Producing team was formed to help ensure an integrated, consistent worldwide approach to prospect and field assessments. Monte Carlo simulation was instituted to recognize probability-weighted ranges of possible outcomes for prospects and fields, and hydrocarbon volume category definitions were standardized. (2) Exploration instituted a global Prospect Inventory, tracking wildcat predictions vs. results. Performance analyses led to initiatives to improve the quality and consistency of assessments. Process improvement efforts included the use of multidisciplinary teams and peer reviews. Continued overestimates of hydrocarbon volumes prompted methodology changes such as the use of "reality checks" and log-normal distributions. The communication of value predictions and additions became paramount. (3) Producing now recognizes the need for Exploration's commercial discoveries and new Producing ventures, notwithstanding the associated risk. Multi-disciplinary teams of engineers and geoscientists work on post-discovery assessments to optimize field development and maximize the value of opportunities. Mobil now integrates volume and risk assessment with correlative future capital investment programs to make proactive strategic choices to maximize shareholder value.
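A hedged sketch of the probability-weighted volumes idea referred to above: Monte Carlo sampling of log-normal and truncated-normal inputs to a standard prospect volumetric equation, reported as a P90/P50/P10 range rather than a single deterministic number. All distribution parameters are invented, not Mobil's:

```python
import numpy as np

rng = np.random.default_rng(8)
n = 100_000

area_acres = rng.lognormal(mean=np.log(2000.0), sigma=0.5, size=n)
net_pay_ft = rng.lognormal(mean=np.log(80.0), sigma=0.4, size=n)
porosity = rng.normal(0.18, 0.03, size=n).clip(0.05, 0.35)
oil_sat = rng.normal(0.70, 0.08, size=n).clip(0.3, 0.9)
recovery = rng.normal(0.25, 0.05, size=n).clip(0.05, 0.5)
bo = 1.2                                     # formation volume factor, rb/stb

# 7758 reservoir barrels per acre-ft; recoverable stock-tank barrels:
stb = 7758.0 * area_acres * net_pay_ft * porosity * oil_sat * recovery / bo

p90, p50, p10 = np.percentile(stb, [10, 50, 90]) / 1.0e6
print(f"P90 {p90:.0f}  P50 {p50:.0f}  P10 {p10:.0f} (MMstb recoverable)")
```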
Transportation Energy Conservation Data Book: A Selected Bibliography. Edition 3,
1978-11-01
Pirat, Bahar; Little, Stephen H.; Igo, Stephen R.; McCulloch, Marti; Nosé, Yukihiko; Hartley, Craig J.; Zoghbi, William A.
2012-01-01
Objective The proximal isovelocity surface area (PISA) method is useful in the quantitation of aortic regurgitation (AR). We hypothesized that actual measurement of PISA provided with real-time 3-dimensional (3D) color Doppler yields more accurate regurgitant volumes than those estimated by 2-dimensional (2D) color Doppler PISA. Methods We developed a pulsatile flow model for AR with an imaging chamber in which interchangeable regurgitant orifices with defined shapes and areas were incorporated. An ultrasonic flow meter was used to calculate the reference regurgitant volumes. A total of 29 different flow conditions for 5 orifices with different shapes were tested at a rate of 72 beats/min. 2D PISA was calculated as 2πr², and 3D PISA was measured from 8 equidistant radial planes of the 3D PISA. Regurgitant volume was derived as PISA × aliasing velocity × time-velocity integral of AR / peak AR velocity. Results Regurgitant volumes by flow meter ranged between 12.6 and 30.6 mL/beat (mean 21.4 ± 5.5 mL/beat). Regurgitant volumes estimated by 2D PISA correlated well with volumes measured by flow meter (r = 0.69); however, a significant underestimation was observed (y = 0.5x + 0.6). Correlation with flow meter volumes was stronger for 3D PISA-derived regurgitant volumes (r = 0.83); significantly less underestimation of regurgitant volumes was seen, with a regression line close to identity (y = 0.9x + 3.9). Conclusion Direct measurement of PISA is feasible, without geometric assumptions, using real-time 3D color Doppler. Calculation of aortic regurgitant volumes with 3D color Doppler using this methodology is more accurate than the conventional 2D method with its hemispheric PISA assumption. PMID:19168322
C3I system modification and EMC (electromagnetic compatibility) methodology, volume 1
NASA Astrophysics Data System (ADS)
Wilson, J. L.; Jolly, M. B.
1984-01-01
A methodology (i.e., a consistent set of procedures) for assessing the electromagnetic compatibility (EMC) of RF subsystem modifications on C3I aircraft was generated during this study (Volume 1). An IEMCAP (Intrasystem Electromagnetic Compatibility Analysis Program) database for the E-3A (AWACS) C3I aircraft RF subsystem was extracted to support the design of the EMC assessment methodology (Volume 2). Mock modifications were performed on the E-3A database to assess, using a preliminary form of the methodology, the resulting EMC impact. Application of the preliminary assessment methodology to modifications in the E-3A database served to fine-tune the form of a final assessment methodology. The resulting final assessment methodology is documented in this report in conjunction with the overall study goals, procedures, and database. It is recommended that a similar EMC assessment methodology be developed for the power subsystem within C3I aircraft. It is further recommended that future EMC assessment methodologies be developed around expert systems (i.e., computer intelligent agents) to manage both the extensive detail involved and the user requirement for transparency.
Development of a Probabilistic Assessment Methodology for Evaluation of Carbon Dioxide Storage
Burruss, Robert A.; Brennan, Sean T.; Freeman, P.A.; Merrill, Matthew D.; Ruppert, Leslie F.; Becker, Mark F.; Herkelrath, William N.; Kharaka, Yousif K.; Neuzil, Christopher E.; Swanson, Sharon M.; Cook, Troy A.; Klett, Timothy R.; Nelson, Philip H.; Schenk, Christopher J.
2009-01-01
This report describes a probabilistic assessment methodology developed by the U.S. Geological Survey (USGS) for evaluation of the resource potential for storage of carbon dioxide (CO2) in the subsurface of the United States as authorized by the Energy Independence and Security Act (Public Law 110-140, 2007). The methodology is based on USGS assessment methodologies for oil and gas resources created and refined over the last 30 years. The resource that is evaluated is the volume of pore space in the subsurface in the depth range of 3,000 to 13,000 feet that can be described within a geologically defined storage assessment unit consisting of a storage formation and an enclosing seal formation. Storage assessment units are divided into physical traps (PTs), which in most cases are oil and gas reservoirs, and the surrounding saline formation (SF), which encompasses the remainder of the storage formation. The storage resource is determined separately for these two types of storage. Monte Carlo simulation methods are used to calculate a distribution of the potential storage size for individual PTs and the SF. To estimate the aggregate storage resource of all PTs, a second Monte Carlo simulation step is used to sample the size and number of PTs. The probability of successful storage for individual PTs or the entire SF, defined in this methodology by the likelihood that the amount of CO2 stored will be greater than a prescribed minimum, is based on an estimate of the probability of containment using present-day geologic knowledge. The report concludes with a brief discussion of needed research data that could be used to refine assessment methodologies for CO2 sequestration.
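A hedged sketch of the Monte Carlo step described above for a saline formation: sample uncertain pore-volume inputs, apply a storage-efficiency factor, and report a distribution of storable CO2 mass. All numbers and distribution shapes are illustrative, not USGS parameters:

```python
import numpy as np

rng = np.random.default_rng(9)
n = 50_000

area_m2 = rng.uniform(0.8e9, 1.2e9, size=n)              # ~1000 km^2 SF
net_thickness_m = rng.triangular(20.0, 50.0, 90.0, size=n)
porosity = rng.triangular(0.08, 0.15, 0.22, size=n)
efficiency = rng.triangular(0.005, 0.02, 0.06, size=n)   # fraction of pore vol.
rho_co2 = 700.0                                          # kg/m^3 at depth

# Storable mass = pore volume x efficiency x CO2 density; 1 Gt = 1e12 kg.
mass_gt = area_m2 * net_thickness_m * porosity * efficiency * rho_co2 / 1.0e12
print("P90/P50/P10 storage (Gt CO2):",
      np.round(np.percentile(mass_gt, [10, 50, 90]), 2))
```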
A study of the radiobiological modeling of the conformal radiation therapy in cancer treatment
NASA Astrophysics Data System (ADS)
Pyakuryal, Anil Prasad
Cancer is one of the leading causes of mortality in the world. Precise diagnosis of the disease helps patients select the appropriate treatment modality, such as surgery, chemotherapy, or radiation therapy. The physics of X-radiation and advanced imaging technologies such as positron emission tomography (PET) and computed tomography (CT) play an important role in efficient diagnosis and therapeutic treatment of cancer. However, the accuracy of measurements of metabolic target volumes (MTVs) in the PET/CT dual-imaging modality is always limited. Similarly, external beam radiation therapy (XRT), such as 3D conformal radiotherapy (3DCRT) and intensity modulated radiation therapy (IMRT), is the most common modality in radiotherapy treatment. These treatments are simulated and evaluated using XRT plans and standard methodologies in a commercial planning system. However, normal organs remain susceptible to radiation toxicity in these treatments owing to the lack of appropriate radiobiological models for estimating clinical outcomes. We explored several methodologies for estimating MTVs by reviewing various techniques of target volume delineation using static phantoms in PET scans. The review suggests that a more precise and practical method of delineating the PET MTV is an intermediate volume between the volume covered by a standardized uptake value (SUV) of 2.5 for glucose and the 50% (40%) threshold of the maximum SUV for smaller (larger) volume delineations in radiotherapy applications. Similarly, various optimal XRT plans were designed using CT and PET/CT scans for the treatment of various types of cancer patients. The quality of these plans was assessed using universal plan indices. Dose-volume criteria were also examined in the targets and organs by analyzing conventional dose-volume histograms (DVHs). Biological models, such as tumor control probability based on a Poisson statistics model and normal tissue complication probability based on the Lyman-Kutcher-Burman model, were efficient in estimating the radiobiological outcomes of the treatments by taking into account dose-volume effects in the organs. Furthermore, a novel technique of spatial DVH analysis was found to be useful for determining the primary cause of complications in critical organs. The study also showed that the 3DCRT and IMRT techniques offer promising results in the XRT treatment of left-breast and prostate cancer patients, respectively. Unfortunately, several organs, such as the salivary glands, larynx, and esophagus, were found to be significantly vulnerable to radiation toxicity in the treatment of head and neck (HN) and left-lung cancer patients, respectively. The radiobiological outcomes were also found to be consistent with the clinical results of IMRT-based treatments of a significant number of HN cancer patients.
Development of Probabilistic Rigid Pavement Design Methodologies for Military Airfields.
1983-12-01
4A161102AT22, Task AO, Work Unit 009, "Methodology for Considering Material Variability in Pavement Design." OCE Project Monitor was Mr. S. S. Gillespie. Contents: Volume I, State of the Art: Variability of Airfield Pavement Materials; Volume II, Mathematical Formulation of…; Volume IV, Probabilistic Analysis of Rigid Airfield Design by Elastic Layered Theory.
Kusunoki, Hideki; Okuma, Kazu; Hamaguchi, Isao
2012-01-01
For national regulatory testing in Japan, the Lowry method is used for the determination of total protein content in vaccines. However, many substances are known to interfere with the Lowry method, rendering accurate estimation of protein content difficult. To accurately determine the total protein content in vaccines, it is necessary to identify the major interfering substances and improve the methodology for removing such substances. This study examined the effects of high levels of lactose with low levels of protein in freeze-dried, cell culture-derived Japanese encephalitis vaccine (inactivated). Lactose was selected because it is a reducing sugar that is expected to interfere with the Lowry method. Our results revealed that concentrations of ≥ 0.1 mg/mL lactose interfered with the Lowry assays and resulted in overestimation of the protein content in a lactose concentration-dependent manner. On the other hand, our results demonstrated that it is important for the residual volume to be ≤ 0.05 mL after trichloroacetic acid precipitation in order to avoid the effects of lactose. Thus, the method presented here is useful for accurate protein determination by the Lowry method, even when it is used for determining low levels of protein in vaccines containing interfering substances. In this study, we have reported a methodological adjustment that allows accurate estimation of protein content for national regulatory testing, when the vaccine contains interfering substances.
NASA Astrophysics Data System (ADS)
Pergola, Nicola; Faruolo, Mariapia; Coviello, Irina; Filizzola, Carolina; Lacava, Teodosio; Tramutoli, Valerio
2014-05-01
Different kinds of atmospheric pollution affect human health and the environment at local and global scales. The petroleum industry represents one of the most important environmental pollution sources, accounting for about 18% of well-to-wheels greenhouse gas (GHG) emissions. The main pollution source is the flaring of gas, one of the most challenging energy and environmental problems facing the world today. The World Bank has estimated that 150 billion cubic meters of natural gas are flared annually, equivalent to 30% of the European Union's gas consumption. Since 2002, satellite-based methodologies have shown their capability to provide independent and reliable estimates of gas flaring emissions at both national and global scales. In this paper, for the first time, the potential of satellite data for estimating gas flaring volumes emitted from a single on-shore crude oil pre-treatment plant, the Ente Nazionale Idrocarburi (ENI) Val d'Agri Oil Center (COVA), located in the Basilicata Region (South of Italy), was assessed. Specifically, thirteen years of night-time Moderate Resolution Imaging Spectroradiometer (MODIS) data acquired in the medium and thermal infrared (MIR and TIR, respectively) bands were processed. The Robust Satellite Techniques (RST) approach was implemented to identify anomalous values of the signal under investigation (i.e., the MIR-TIR difference) associated with emergency discharges of the COVA flares. The Fire Radiative Power (FRP), computed for the thermal anomalies thus identified, was then correlated with the gas flaring volumes reported for the COVA in the period 2003-2009, defining a satellite-based regression model for estimating COVA gas flaring volumes. The strategy used and the preliminary results of this analysis are described in detail in this work.
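A minimal sketch of the described workflow, assuming a simple standardized-anomaly form of the RST index and entirely illustrative FRP and volume values (the study calibrates against the declared 2003-2009 COVA volumes):

```python
import numpy as np

def rst_index(signal, clim_mean, clim_std):
    """Robust Satellite Techniques-style local variation index:
    standardized anomaly of the MIR-TIR difference signal."""
    return (signal - clim_mean) / clim_std

# Time series of night-time MIR-TIR differences for the plant pixel (illustrative)
mir_tir = np.array([1.2, 1.1, 1.4, 5.8, 1.3, 6.4, 1.2])
clim_mean, clim_std = 1.25, 0.15                 # multi-year reference fields
anomalous = rst_index(mir_tir, clim_mean, clim_std) > 3.0  # flag emergency discharges

# Linear regression of Fire Radiative Power against reported flared volumes
frp = np.array([12.0, 35.0, 48.0, 80.0])         # MW, from flagged anomalies
volume = np.array([0.4, 1.1, 1.6, 2.7])          # Mm^3, hypothetical declared volumes
slope, intercept = np.polyfit(frp, volume, 1)
print(slope * 60.0 + intercept)                  # volume predicted for FRP = 60 MW
```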
Bunegin, L; Wahl, D; Albin, M S
1994-03-01
Cerebral embolism has been implicated in the development of cognitive and neurological deficits following bypass surgery. This study proposes a methodology for estimating cerebral air embolus volume using transcranial Doppler sonography. Transcranial Doppler audio signals of air bubbles in the middle cerebral artery obtained from in vivo experiments were subjected to fast-Fourier transform analysis. Audio segments with no air present, as well as artifact resulting from electrocautery and sensor movement, were also subjected to fast-Fourier transform analysis. Spectra were compared, and frequency and power differences were noted and used to develop audio band-pass filters for isolating frequencies associated with air emboli. In a bench model of the middle cerebral artery circulation, repetitive injections of air volumes between 0.5 and 500 microL were made. Transcranial Doppler audio output was band-pass filtered, acquired digitally, then subjected to fast-Fourier transform power spectrum analysis and power spectrum integration. A linear least-squares correlation was performed on the data. Fast-Fourier transform analysis of audio segments indicated that frequencies between 250 and 500 Hz are consistently dominant in the spectrum when air emboli are present. Background frequencies appear to be below 240 Hz, and artifact resulting from sensor movement and electrocautery appears to be below 300 Hz. Data from the middle cerebral artery model filtered through a 307- to 450-Hz band-pass filter yielded a linear relation between embolus volume and the integrated value of the power spectrum up to approximately 40 microL. Detection of emboli less than 0.5 microL was inconsistent, and embolus volumes greater than 40 microL were indistinguishable from one another. The preliminary technique described in this study may represent a starting point from which automated detection and volume estimation of cerebral emboli might be approached.
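The band-pass filtering and power-spectrum integration step might look like the following, assuming an 8 kHz audio sampling rate (not stated above) and a Butterworth filter as a stand-in for the study's filter design:

```python
import numpy as np
from scipy.signal import butter, filtfilt, welch

def emboli_power(audio, fs):
    """Band-pass the Doppler audio to the embolus band (307-450 Hz in the study),
    then integrate the power spectral density over that band; the integral
    tracked embolus volume up to roughly 40 microliters."""
    b, a = butter(4, [307.0, 450.0], btype="bandpass", fs=fs)
    filtered = filtfilt(b, a, audio)
    freqs, psd = welch(filtered, fs=fs, nperseg=1024)
    band = (freqs >= 307.0) & (freqs <= 450.0)
    return np.sum(psd[band]) * (freqs[1] - freqs[0])  # rectangle-rule integral

fs = 8000  # assumed audio sampling rate (Hz)
t = np.arange(0, 1.0, 1.0 / fs)
rng = np.random.default_rng(0)
# Synthetic test signal: a 380 Hz tone (inside the embolus band) plus noise
audio = np.sin(2 * np.pi * 380 * t) + 0.5 * rng.normal(size=t.size)
print(emboli_power(audio, fs))
```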
NASA Astrophysics Data System (ADS)
Giap, Huan Bosco
Accurate calculation of absorbed dose to target tumors and normal tissues in the body is an important requirement for establishing fundamental dose-response relationships for radioimmunotherapy. Two major obstacles have been the difficulty of obtaining an accurate patient-specific 3-D activity map in vivo and of calculating the resulting absorbed dose. This study investigated a methodology for 3-D internal dosimetry, which integrates the 3-D biodistribution of the radionuclide acquired from SPECT with a dose-point kernel convolution technique to provide the 3-D distribution of absorbed dose. Accurate SPECT images were reconstructed with appropriate methods for noise filtering, attenuation correction, and Compton scatter correction. The SPECT images were converted into activity maps using a calibration phantom. The activity map was convolved with a (131)I dose-point kernel using a 3-D fast Fourier transform to yield a 3-D distribution of absorbed dose. The 3-D absorbed dose map was then processed to provide the absorbed dose distribution in regions of interest. This methodology can provide heterogeneous distributions of absorbed dose in volumes of any size and shape with nonuniform distributions of activity. Comparison of the activities quantitated by our SPECT methodology to true activities in an Alderson abdominal phantom (with spleen, liver, and spherical tumor) yielded errors of -16.3% to 4.4%. Volume quantitation errors ranged from -4.0% to 5.9% for volumes greater than 88 ml. The percentage differences between the average absorbed dose rates calculated by this methodology and the MIRD S-values were 9.1% for the liver, 13.7% for the spleen, and 0.9% for the tumor. Good agreement (percent differences less than 8%) was found between the absorbed dose due to penetrating radiation calculated by this methodology and TLD measurements. More accurate estimates of the 3-D distribution of absorbed dose can be used as a guide in specifying the minimum activity to be administered to patients to deliver a prescribed absorbed dose to tumor without exceeding the toxicity limits of normal tissues.
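The kernel convolution step can be sketched with NumPy FFTs as below; the activity map and exponential kernel are toy stand-ins for the SPECT-derived map and the (131)I dose-point kernel:

```python
import numpy as np

def absorbed_dose(activity_map, kernel):
    """3D dose-point-kernel convolution via FFT, in the spirit of the described
    methodology. activity_map: SPECT-derived activity per voxel; kernel: dose
    per unit activity versus distance, sampled on the same voxel grid."""
    # Zero-padding avoids circular wrap-around at the volume edges
    shape = [a + k - 1 for a, k in zip(activity_map.shape, kernel.shape)]
    dose = np.fft.irfftn(np.fft.rfftn(activity_map, shape) *
                         np.fft.rfftn(kernel, shape), shape)
    # Crop back to the original grid (kernel assumed centered)
    start = [k // 2 for k in kernel.shape]
    sl = tuple(slice(s, s + a) for s, a in zip(start, activity_map.shape))
    return dose[sl]

rng = np.random.default_rng(1)
activity = rng.random((32, 32, 32))          # toy activity map
r = np.linalg.norm(np.mgrid[-3:4, -3:4, -3:4], axis=0)
kernel = np.exp(-r)                          # toy stand-in for a (131)I point kernel
print(absorbed_dose(activity, kernel).shape)
```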
DOT National Transportation Integrated Search
1981-06-01
This study provides information about public attitudes towards proposed highway safety countermeasures in three program areas: alcohol and drugs, unsafe driving behaviors, and pedestrian safety. This volume describes the three research methodologies ...
Experimental investigation of the mass flow gain factor in a draft tube with cavitation vortex rope
NASA Astrophysics Data System (ADS)
Landry, C.; Favrel, A.; Müller, A.; Yamamoto, K.; Alligné, S.; Avellan, F.
2017-04-01
At off-design operating conditions, cavitating flow is often observed in hydraulic machines. The presence of a cavitation vortex rope may induce draft tube surge and electrical power swings at part load and full load operation. The stability analysis of these operating conditions requires a numerical pipe model taking into account the complexity of the two-phase flow. Among the hydroacoustic parameters describing the cavitating draft tube flow in the numerical model, the mass flow gain factor, representing the mass excitation source expressed as the rate of change of the cavitation volume with respect to the discharge, remains difficult to model. This paper presents a quasi-static method to estimate the mass flow gain factor in the draft tube for a given cavitation vortex rope volume in the case of a reduced-scale physical model of a ν = 0.27 Francis turbine. The methodology is based on experimental identification of the natural frequency of the test rig hydraulic system for different Thoma numbers. With the natural frequency identified, it is possible to model the wave speed, the cavitation compliance, and the volume of the cavitation vortex rope. By applying this methodology for different discharge values, it becomes possible to identify the mass flow gain factor and improve the accuracy of the system stability analysis.
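A quasi-static finite-difference reading of the mass flow gain factor, here taken as chi = -dVc/dQ under the sign convention that the rope volume shrinks with increasing discharge; all numbers are illustrative, with the rope volumes standing in for values identified from the natural-frequency measurements:

```python
import numpy as np

# Quasi-static estimate of the mass flow gain factor chi = -dV_c/dQ:
# for each discharge Q, the rope volume V_c is deduced from the identified
# natural frequency of the test rig (via wave speed and cavitation compliance).
Q = np.array([0.20, 0.22, 0.24, 0.26, 0.28])                  # discharge (m^3/s)
V_rope = np.array([4.1e-3, 3.4e-3, 2.8e-3, 2.1e-3, 1.6e-3])   # rope volume (m^3)

chi = -np.gradient(V_rope, Q)   # mass flow gain factor (units of seconds)
for q, c in zip(Q, chi):
    print(f"Q = {q:.2f} m^3/s  ->  chi = {c:.3f} s")
```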
NASA Astrophysics Data System (ADS)
Haines, S. S.; Varela, B. A.; Thamke, J.; Hawkins, S. J.; Gianoutsos, N. J.; Tennyson, M. E.
2017-12-01
Water is used for several stages of oil and gas production, in particular for the hydraulic fracturing typically used during production of petroleum from low-permeability shales and other rock types (referred to as "continuous" petroleum accumulations). Proppant, often sand, is also consumed during hydraulic fracturing. Water is then produced from the reservoir along with the oil and gas, representing either a disposal consideration or a possible source of water for further petroleum development or other purposes. The U.S. Geological Survey (USGS) has developed an approach for regional-scale estimation of these water and proppant quantities in order to provide an improved understanding of possible impacts and to help with planning and decision-making. Using the new methodology, the USGS has conducted a quantitative assessment of water and proppant requirements, and water production volumes, associated with possible future production of undiscovered petroleum resources in the Bakken and Three Forks Formations, Williston Basin, USA. This water and proppant assessment builds directly on the 2013 USGS petroleum assessment for the Bakken and Three Forks Formations. USGS petroleum assessments incorporate all available geologic and petroleum production information, and include the definition of assessment units (AUs) that specify the geographic regions and geologic formations for the assessment. The 2013 petroleum assessment included 5 continuous AUs for the Bakken Formation and one continuous AU for the Three Forks Formation. The assessment inputs are defined probabilistically, and a Monte Carlo approach provides outputs that include uncertainty bounds. We summarize the assessment outputs with the mean values of the associated distributions. The mean estimated total volume of water for well drilling and cementing for all six continuous AUs is 5.9 billion gallons, and the mean estimated volume of water for hydraulic fracturing for all AUs is 164.3 billion gallons. The mean estimated quantity of proppant for hydraulic fracturing is 101.3 million tons. Summing over all of the AUs, the mean estimated total flowback water volume is 9.9 billion gallons and the mean estimated total produced water volume is 414.5 billion gallons.
NASA Astrophysics Data System (ADS)
Muñoz, Randy; Paredes, Javier; Huggel, Christian; Drenkhan, Fabian; García, Javier
2017-04-01
The availability and consistency of data is a determining factor for the reliability of any hydrological model and its simulated results. Unfortunately, there are many regions worldwide where data are not available in the desired quantity and quality. The Santa River basin (SRB), located within a complex topographic and climatic setting in the tropical Andes of Peru, is a clear example of this challenging situation. A monitoring network of in-situ stations in the SRB recorded series of hydro-meteorological variables but ceased to operate in 1999. In the following years, several researchers evaluated and completed many of these series. This database was used by multiple research and policy-oriented projects in the SRB. However, hydroclimatic information remains limited, making it difficult to perform research, especially when assessing current and future water resources. In this context, we present an evaluation of different methodologies to interpolate temperature and precipitation data at a monthly time step, as well as ice volume data, in glacierized basins with limited data. The methodologies were evaluated for the Quillcay River, a tributary of the SRB, where hydro-meteorological data have been available from nearby monitoring stations since 1983. The study period was 1983-1999, with a validation period of 1993-1999. For the temperature series, the aim was to extend the observed data and interpolate them. Data from the NCEP reanalysis were used to extend the observed series: 1) using a simple correlation with multiple field stations, or 2) applying the altitudinal correction proposed in previous studies. The interpolation was then applied as a function of altitude. Both methodologies provide very close results; by parsimony, simple correlation is the viable choice. For the precipitation series, the aim was to interpolate observed data. Two methodologies were evaluated: 1) Inverse Distance Weighting, whose results underestimate the amount of precipitation in high-altitude zones, and 2) ordinary Kriging (OK), whose variograms were calculated from the multi-annual monthly mean precipitation and applied to the whole study period. OK leads to better results in both low- and high-altitude zones. For ice volume, the aim was to estimate values from historical data: 1) with the GlabTop algorithm, which requires digital elevation models that are available at an appropriate scale only since 2009, and 2) with a widely applied but controversially discussed glacier area-volume relation whose parameters were calibrated against results from the GlabTop model. Both methodologies provide reasonable results, but for historical data the area-volume scaling requires only the glacier area, which is easy to calculate from satellite images available since 1986. In conclusion, the simple correlation, OK, and the calibrated area-volume relation for ice volume proved the best ways to interpolate glacio-climatic information. However, these methods must be carefully applied and revisited for specific situations of high complexity. This is a first step towards identifying the most appropriate methods to interpolate and extend observed data in glacierized basins with limited information. New research should evaluate other methodologies and meteorological data in order to improve hydrological models and water management policies.
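Two of the building blocks named above, IDW interpolation and glacier area-volume scaling, are sketched below; the scaling coefficients shown are commonly cited literature values, whereas the study recalibrated them against GlabTop results:

```python
import numpy as np

def idw(xy_obs, z_obs, xy_new, power=2.0):
    """Inverse Distance Weighting interpolation (one of the two methods evaluated
    for precipitation; the study found ordinary kriging performed better)."""
    d = np.linalg.norm(xy_obs - xy_new, axis=1)
    if np.any(d == 0):
        return z_obs[np.argmin(d)]
    w = 1.0 / d**power
    return np.sum(w * z_obs) / np.sum(w)

def glacier_volume(area_km2, c=0.034, gamma=1.375):
    """Area-volume scaling V = c * A**gamma; c and gamma here are commonly
    cited literature values, not the study's calibrated parameters."""
    return c * area_km2**gamma   # volume in km^3

stations = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 12.0]])   # station coords (km)
precip = np.array([80.0, 55.0, 120.0])                        # monthly precipitation (mm)
print(idw(stations, precip, np.array([4.0, 5.0])))
print(glacier_volume(2.5))
```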
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moore, Bria M.; Brady, Samuel L., E-mail: samuel.brady@stjude.org; Kaufman, Robert A.
Purpose: To investigate the correlation of size-specific dose estimate (SSDE) with absorbed organ dose, and to develop a simple methodology for estimating patient organ dose in a pediatric population (5–55 kg). Methods: Four physical anthropomorphic phantoms representing a range of pediatric body habitus were scanned with metal oxide semiconductor field effect transistor (MOSFET) dosimeters placed at 23 organ locations to determine absolute organ dose. Phantom absolute organ dose was divided by phantom SSDE to determine the correlation between organ dose and SSDE. Organ dose correlation factors (CF_SSDE^organ) were then multiplied by patient-specific SSDE to estimate patient organ dose. The CF_SSDE^organ were used to retrospectively estimate individual organ doses from 352 chest and 241 abdominopelvic pediatric CT examinations, where mean patient weight was 22 kg ± 15 (range 5–55 kg) and mean patient age was 6 yrs ± 5 (range 4 months to 23 yrs). Patient organ dose estimates were compared to published pediatric Monte Carlo study results. Results: Phantom effective diameters were matched with patient population effective diameters to within 4 cm, showing appropriate scalability of the phantoms across the entire pediatric population in this study. Individual CF_SSDE^organ were determined for a total of 23 organs in the chest and abdominopelvic region across nine weight subcategories. For organs fully covered by the scan volume, correlation in the chest (average 1.1; range 0.7–1.4) and abdominopelvic region (average 0.9; range 0.7–1.3) was near unity. For organs/tissues that extended beyond the scan volume (i.e., skin, bone marrow, and bone surface), correlation was poor (average 0.3; range 0.1–0.4) for both the chest and abdominopelvic regions. A means to estimate patient organ dose was demonstrated. Calculated patient organ dose, using patient SSDE and CF_SSDE^organ, was compared to previously published pediatric patient doses that accounted for patient size in their dose calculation, and was found to agree in the chest to better than an average of 5% (27.6/26.2) and in the abdominopelvic region to better than 2% (73.4/75.0). Conclusions: For organs fully covered within the scan volume, the average correlation of SSDE and absolute organ dose was found to be better than ±10%. In addition, this study provides a complete list of organ dose correlation factors (CF_SSDE^organ) for the chest and abdominopelvic regions, and describes a simple methodology to estimate individual pediatric patient organ dose based on patient SSDE.
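The paper's dose relation, organ dose = CF_SSDE^organ × SSDE, reduces to a table lookup; the factors below are placeholders, since the actual CFs are tabulated per organ and weight class in the report:

```python
# Organ dose estimation from SSDE, following the relation
# organ dose = CF_SSDE^organ * SSDE. The correlation factors below are
# hypothetical values for illustration only.
CF_CHEST = {"lung": 1.2, "heart": 1.1, "thyroid": 0.9}

def organ_doses(ssde_mgy, cf_table):
    """Multiply the patient-specific SSDE by each organ's correlation factor."""
    return {organ: cf * ssde_mgy for organ, cf in cf_table.items()}

print(organ_doses(4.8, CF_CHEST))   # SSDE of 4.8 mGy from a chest CT
```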
Estimates of Fossil Fuel Carbon Dioxide Emissions From Mexico at Monthly Time Intervals
NASA Astrophysics Data System (ADS)
Losey, L. M.; Andres, R. J.
2003-12-01
Human consumption of fossil fuels has greatly contributed to the rise of carbon dioxide in the Earth's atmosphere. To better understand the global carbon cycle, it is important to identify the major sources of these emissions. Mexico is among the top fifteen nations in the world in fossil fuel carbon dioxide emissions. Based on this and the fact that emissions from Mexico are a focus of the North American Carbon Program, Mexico was selected for this study. Mexican monthly inland sales volumes of natural gas and liquid fuels for January 1988-May 2003 were collected from the Energy Information Administration in the United States Department of Energy. These sales figures represent a major portion of the total fossil fuel consumption in Mexico. The fraction of a particular fossil fuel consumed in a given month was determined by dividing the monthly sales volume by the annual sum of monthly sales volumes for a given year. This fraction was then multiplied by the annual carbon dioxide values reported by the Carbon Dioxide Information Analysis Center (CDIAC) at Oak Ridge National Laboratory (ORNL) to estimate the monthly carbon dioxide emissions from the respective fuels. The advantages of this methodology are: 1) monthly fluxes are consistent with the annual flux as determined by the widely accepted CDIAC values, and 2) its general application can be easily adapted to other nations for determining their sub-annual emissions. The major disadvantage of this methodology is the proxy nature inherent to it: only a fraction of the total emissions is used as an estimate in determining the seasonal cycle, and the error inherent in this approach increases as the fraction of total emissions represented by the proxy decreases. These data are part of a long-term project between researchers at the University of North Dakota and ORNL which attempts to identify and understand the source(s) of seasonal variations of global, fossil-fuel derived carbon dioxide emissions. Better knowledge of the temporal variation of the annual fossil fuel flux will lead to a better understanding of the global carbon cycle. This research will be archived at CDIAC for public access.
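The proxy disaggregation reduces to a few lines; the sales figures and annual total below are illustrative:

```python
import numpy as np

def monthly_emissions(monthly_sales, annual_co2):
    """Proxy disaggregation used in the study: each month's share of annual
    fuel sales scales the CDIAC annual CO2 total for that fuel."""
    fractions = monthly_sales / monthly_sales.sum()
    return fractions * annual_co2

# Twelve months of fuel sales (arbitrary units) and an annual CO2 total (Tg C),
# both illustrative rather than the study's data
sales = np.array([8.1, 7.9, 8.6, 8.2, 8.9, 9.4, 9.6, 9.3, 8.8, 8.5, 8.0, 8.3])
print(monthly_emissions(sales, annual_co2=110.0))
```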
Caffeine as an indicator for the quantification of untreated wastewater in karst systems.
Hillebrand, Olav; Nödler, Karsten; Licha, Tobias; Sauter, Martin; Geyer, Tobias
2012-02-01
Contamination from untreated wastewater leakage and related bacterial contamination poses a threat to drinking water quality. However, a quantification of the magnitude of leakage is difficult. The objective of this work is to provide a highly sensitive methodology for the estimation of the mass of untreated wastewater entering karst aquifers with rapid recharge. For this purpose a balance approach is adapted. It is based on the mass flow of caffeine in spring water, the load of caffeine in untreated wastewater and the daily water consumption per person in a spring catchment area. Caffeine is a source-specific indicator for wastewater, consumed and discharged in quantities allowing detection in a karst spring. The methodology was applied to estimate the amount of leaking and infiltrating wastewater to a well investigated karst aquifer on a daily basis. The calculated mean volume of untreated wastewater entering the aquifer was found to be 2.2 ± 0.5 m(3) d(-1) (undiluted wastewater). It corresponds to approximately 0.4% of the total amount of wastewater within the spring catchment. Copyright © 2011 Elsevier Ltd. All rights reserved.
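A sketch of the mass balance, with all inputs illustrative rather than taken from the study: the spring's caffeine mass flow is divided by the caffeine concentration expected in raw wastewater (per-capita load over per-capita water use):

```python
# Mass-balance estimate of untreated wastewater entering the karst aquifer:
# Q_ww = (caffeine mass flow at the spring) / (caffeine concentration of raw
# wastewater), the latter derived from per-capita load and water consumption.
spring_discharge_m3_d = 10_000.0       # spring discharge (m^3/day), illustrative
caffeine_conc_ng_L = 15.0              # caffeine measured at the spring (ng/L)
caffeine_load_mg_person_d = 16.0       # caffeine discharged per inhabitant (mg/day)
water_use_L_person_d = 120.0           # daily water consumption (L/person)

mass_flow_mg_d = spring_discharge_m3_d * 1_000 * caffeine_conc_ng_L * 1e-6
ww_conc_mg_L = caffeine_load_mg_person_d / water_use_L_person_d
q_ww_m3_d = mass_flow_mg_d / ww_conc_mg_L / 1_000
print(f"untreated wastewater: {q_ww_m3_d:.1f} m^3/d")
```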
NASA Astrophysics Data System (ADS)
Nazir, Mohd Yusuf Mohd; Al-Shorgani, Najeeb Kaid Nasser; Kalil, Mohd Sahaid; Hamid, Aidil Abdul
2015-09-01
In this study, three factors (fructose concentration, agitation speed, and monosodium glutamate (MSG) concentration) were optimized to enhance DHA production by Schizochytrium SW1 using response surface methodology (RSM). A central composite design was applied as the experimental design, and analysis of variance (ANOVA) was used to analyze the data. The experiments were conducted in 500-mL flasks with a 100-mL working volume at 30°C for 96 hours. The ANOVA revealed that the process was adequately represented by the quadratic model (p<0.0001) and that two of the factors, agitation speed and MSG concentration, significantly affected DHA production (p<0.005). The level of influence of each variable and a quadratic polynomial equation for DHA production were obtained by multiple regression analysis. The estimated optimum conditions for maximizing DHA production by SW1 were 70 g/L fructose, 250 rpm agitation speed, and 12 g/L MSG. The quadratic model was then validated by applying the estimated optimum conditions, which confirmed the model's validity; 52.86% DHA was produced.
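A generic least-squares fit of the full quadratic RSM model is sketched below with synthetic design points; the study's actual central composite design and responses are not reproduced here:

```python
import numpy as np

# Fit the quadratic response surface y ~ b0 + sum(bi xi) + sum(bii xi^2)
# + sum(bij xi xj) used in RSM; data below are synthetic, not the study's.
X = np.array([[40, 200, 8], [70, 200, 8], [40, 300, 8], [70, 300, 8],
              [40, 200, 16], [70, 200, 16], [40, 300, 16], [70, 300, 16],
              [55, 250, 12], [55, 250, 12]], dtype=float)  # fructose, rpm, MSG
y = np.array([18.2, 21.5, 24.9, 27.8, 20.1, 23.0, 26.3, 29.5, 31.2, 30.8])

f1, f2, f3 = X.T
design = np.column_stack([np.ones(len(X)), f1, f2, f3,
                          f1**2, f2**2, f3**2, f1*f2, f1*f3, f2*f3])
coef, *_ = np.linalg.lstsq(design, y, rcond=None)
print(coef)   # quadratic polynomial coefficients from multiple regression
```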
NASA Technical Reports Server (NTRS)
Andrews, J.; Donziger, A.; Hazelrigg, G. A., Jr.; Heiss, K. P.; Sand, F.; Stevenson, P.
1974-01-01
The economic value of an ERS system with a technical capability similar to ERTS, allowing for increased coverage through the use of multiple active satellites in orbit, is presented. A detailed breakdown of the benefits achievable from an ERS system is given, and a methodology for their estimation is established. The ECON case studies in agriculture, water use, and land cover are described, along with the current ERTS system. The cost of a projected ERS system is given.
Wandrey, Craig J.; Schenk, Christopher J.; Klett, Timothy R.; Brownfield, Michael E.; Charpentier, Ronald R.; Cook, Troy A.; Pollastro, Richard M.; Tennyson, Marilyn E.
2013-01-01
The Taranaki Basin Assessment Unit, coincident with the Cretaceous-Tertiary Composite Total Petroleum System, was recently assessed for undiscovered technically recoverable oil, natural gas, and natural gas liquids resources as part of the U.S. Geological Survey (USGS) World Energy Resources Project, World Oil and Gas Assessment. Using a geology-based assessment methodology, the USGS estimated mean volumes of 487 million barrels of oil, 9.8 trillion cubic feet of gas, and 408 million barrels of natural gas liquids.
Towards an Optimized Method of Olive Tree Crown Volume Measurement
Miranda-Fuentes, Antonio; Llorens, Jordi; Gamarra-Diezma, Juan L.; Gil-Ribes, Jesús A.; Gil, Emilio
2015-01-01
Accurate crown characterization of large isolated olive trees is vital for adjusting spray doses in three-dimensional crop agriculture. Among the many methodologies available, laser sensors have proved to be the most reliable and accurate. However, their operation is time consuming and requires specialist knowledge, so a simpler crown characterization method is required. To this end, three methods were evaluated and compared with LiDAR measurements to determine their accuracy: the Vertical Crown Projected Area method (VCPA), the Ellipsoid Volume method (VE), and the Tree Silhouette Volume method (VTS). Trials were performed in three different kinds of olive tree plantations: intensive, adapted one-trunked traditional, and traditional. In total, 55 trees were characterized. Results show that all three methods are appropriate for estimating crown volume, reaching high coefficients of determination: R2 = 0.783, 0.843 and 0.824 for VCPA, VE and VTS, respectively. However, discrepancies arise when evaluating tree plantations separately, especially for traditional trees. Here, correlations between LiDAR volume and other parameters showed that the Mean Vector calculated for the VCPA method had the highest correlation for traditional trees; thus, its use in traditional plantations is highly recommended. PMID:25658396
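Assuming a simple reading of two of the estimators (an ellipsoid from two crown diameters and height for VE, and projected area scaled by a characteristic vertical dimension for the VCPA-based estimate, the latter being a guess at the role of the Mean Vector), a sketch:

```python
import numpy as np

def volume_ellipsoid(d_ns, d_ew, height):
    """VE: crown modeled as an ellipsoid from two crown diameters and height,
    V = (pi/6) * d1 * d2 * h."""
    return (np.pi / 6.0) * d_ns * d_ew * height

def volume_projected_area(area_m2, mean_vector_m):
    """VCPA-style estimate: vertical crown projected area scaled by a
    characteristic vertical dimension (an assumed simplification)."""
    return area_m2 * mean_vector_m

print(volume_ellipsoid(4.2, 3.8, 3.5))      # m^3, illustrative olive crown
print(volume_projected_area(12.6, 2.4))     # m^3, hypothetical mean vector
```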
Klett, T.R.; Schenk, Christopher J.; Wandrey, Craig J.; Charpentier, Ronald R.; Cook, Troy A.; Brownfield, Michael E.; Pitman, Janet K.; Pollastro, Richard M.
2012-01-01
Using a geology-based assessment methodology, the U.S. Geological Survey estimated volumes of undiscovered, technically recoverable, conventional petroleum resources for the Assam, Bombay, Cauvery, and Krishna–Godavari Provinces, South Asia. The estimated mean volumes are as follows: (1) Assam Province, 273 million barrels of crude oil, 1,559 billion cubic feet of natural gas, and 43 million barrels of natural gas liquids; (2) Bombay Province, 1,854 million barrels of crude oil, 15,417 billion cubic feet of natural gas, and 498 million barrels of natural gas liquids; (3) Cauvery Province, 941 million barrels of crude oil, 25,208 billion cubic feet of natural gas, and 654 million barrels of natural gas liquids; and (4) Krishna–Godavari Province, 466 million barrels of crude oil, 37,168 billion cubic feet of natural gas, and 484 million barrels of natural gas liquids. The totals for the four provinces are 3,534 million barrels of crude oil, 79,352 billion cubic feet of natural gas, and 1,679 million barrels of natural gas liquids.
Simulating the minimum core for hydrophobic collapse in globular proteins.
Tsai, J.; Gerstein, M.; Levitt, M.
1997-01-01
To investigate the nature of hydrophobic collapse, considered to be the driving force in protein folding, we have simulated aqueous solutions of two model hydrophobic solutes, methane and isobutylene. Using a novel methodology for determining contacts, we can precisely follow hydrophobic aggregation as it proceeds through three stages: dispersed, transition, and collapsed. Theoretical modeling of the cluster formation observed by simulation indicates that this aggregation is cooperative and that the simulations favor the formation of a single cluster midway through the transition stage. This defines a minimum solute hydrophobic core volume. We compare this with protein hydrophobic core volumes determined from solved crystal structures. Our analysis shows that the solute core volume roughly estimates the minimum core size required for independent hydrophobic stabilization of a protein and defines a limiting concentration of nonpolar residues that can cause hydrophobic collapse. These results suggest that the physical forces driving aggregation of hydrophobic molecules in water are indeed responsible for protein folding. PMID:9416609
Estimating Wood Volume for Pinus Brutia Trees in Forest Stands from QUICKBIRD-2 Imagery
NASA Astrophysics Data System (ADS)
Patias, Petros; Stournara, Panagiota
2016-06-01
Knowledge of forest parameters, such as wood volume, is required for sustainable forest management. Collecting such information in the field is laborious and even infeasible in inaccessible areas. In this study, tree wood volume is estimated using remote sensing techniques, which can facilitate the extraction of relevant information. The study area is the University Forest of Taxiarchis, which is located in central Chalkidiki, Northern Greece, and covers an area of 58 km2. The tree species under study is the evergreen conifer P. brutia (Calabrian pine). Three plot surfaces of 10-m radius were used. VHR Quickbird-2 images are used in combination with an allometric relationship connecting tree crown diameter with diameter at breast height (Dbh), and a volume table developed for Greece. The overall methodology is based on individual tree crown delineation using (a) the marker-controlled watershed segmentation approach and (b) the GEographic Object-Based Image Analysis approach. The aim of the first approach is to extract separate segments, each of them including a single tree and eventual lower vegetation, shadows, etc. The aim of the second approach is to detect and remove the "noisy" background. In the application of the first approach, the Blue, Green, Red, Infrared and PCA-1 bands are tested separately. In the application of the second approach, NDVI and image brightness thresholds are utilized. The achieved results are evaluated against field plot data. The observed differences are between -5% and +10%.
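A hypothetical version of the crown-to-volume chain, with assumed allometric coefficients and a cylindrical form-factor formula standing in for the Greek volume table:

```python
import numpy as np

def crown_diameter_m(segment_area_m2):
    """Equivalent-circle diameter of a delineated crown segment."""
    return 2.0 * np.sqrt(segment_area_m2 / np.pi)

def dbh_cm(crown_d_m, a=2.5, b=4.8):
    """Linear crown-to-Dbh allometry; form and coefficients are assumptions."""
    return a + b * crown_d_m

def stem_volume_m3(dbh, height_m, form_factor=0.45):
    """Form-factor volume formula standing in for the volume table lookup."""
    return form_factor * np.pi * (dbh / 200.0)**2 * height_m  # dbh in cm

area = 18.0   # m^2, delineated crown segment (illustrative)
d = crown_diameter_m(area)
print(stem_volume_m3(dbh_cm(d), height_m=12.0))
```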
NASA Technical Reports Server (NTRS)
Czabaj, M. W.; Riccio, M. L.; Whitacre, W. W.
2014-01-01
A combined experimental and computational study aimed at high-resolution 3D imaging, visualization, and numerical reconstruction of fiber-reinforced polymer microstructures at the fiber length scale is presented. To this end, a sample of graphite/epoxy composite was imaged at sub-micron resolution using a 3D X-ray computed tomography microscope. Next, a novel segmentation algorithm was developed, based on concepts adopted from computer vision and multi-target tracking, to detect and estimate, with high accuracy, the position of individual fibers in a volume of the imaged composite. In the current implementation, the segmentation algorithm was based on a Global Nearest Neighbor data-association architecture, a Kalman filter estimator, and several novel algorithms for virtual-fiber stitching, smoothing, and overlap removal. The segmentation algorithm was used on a sub-volume of the imaged composite, detecting 508 individual fibers. The segmentation data were qualitatively compared to the tomographic data, demonstrating high accuracy of the numerical reconstruction. Moreover, the data were used to quantify a) the relative distribution of individual-fiber cross sections within the imaged sub-volume, and b) the local fiber misorientation relative to the global fiber axis. Finally, the segmentation data were converted using commercially available finite element (FE) software to generate a detailed FE mesh of the composite volume. The methodology described herein demonstrates the feasibility of realizing an FE-based, virtual-testing framework for graphite/epoxy composites at the constituent level.
How uncertain is model-based prediction of copper loads in stormwater runoff?
Lindblom, E; Ahlman, S; Mikkelsen, P S
2007-01-01
In this paper, we conduct a systematic analysis of the uncertainty in estimating the total load of pollution (copper) from a separate stormwater drainage system, conditioned on a specific combination of input data, a dynamic conceptual pollutant accumulation-washout model, and measurements (runoff volumes and pollutant masses). We use the generalized likelihood uncertainty estimation (GLUE) methodology and generate posterior parameter distributions that result in model outputs encompassing a significant number of the highly variable measurements. Given the applied pollutant accumulation-washout model and a total of 57 measurements during one month, the total copper mass can be predicted within a range of +/-50% of the median value. The message is that this relatively large uncertainty should be acknowledged when making statements about micropollutant loads estimated from dynamic models, even when calibrated with on-site concentration data.
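A compact GLUE sketch in the spirit of the analysis, with a stand-in accumulation-washout model and synthetic measurements; the likelihood form, threshold, and parameter ranges are all assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def model(params, n_events=57):
    """Stand-in accumulation-washout model returning copper mass per event;
    a real application would run the dynamic model here."""
    accum, washout = params
    return accum * (1 - np.exp(-washout * np.linspace(0.5, 5, n_events)))

observed = model((1.0, 0.8)) * rng.lognormal(0, 0.3, 57)   # synthetic measurements

# GLUE: sample parameters from broad priors, keep 'behavioural' sets whose
# likelihood exceeds a threshold, then weight predictions by likelihood.
samples = rng.uniform([0.2, 0.1], [3.0, 2.0], size=(20_000, 2))
sims = np.array([model(p) for p in samples])
likelihood = np.exp(-np.sum((sims - observed)**2, axis=1) / observed.var() / 57)
behavioural = likelihood > np.quantile(likelihood, 0.9)

w = likelihood[behavioural] / likelihood[behavioural].sum()
total_loads = sims[behavioural].sum(axis=1)
order = np.argsort(total_loads)
cdf = np.cumsum(w[order])
lo, med, hi = np.interp([0.05, 0.5, 0.95], cdf, total_loads[order])
print(f"total copper load: {med:.1f} ({lo:.1f}-{hi:.1f})")
```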
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kost, S; Yu, N; Lin, S
2016-06-15
Purpose: To compare mean lung dose (MLD) estimates from 99mTc macroaggregated albumin (MAA) SPECT/CT using two published methodologies for patients treated with 90Y radioembolization for liver cancer. Methods: MLD was estimated retrospectively using two methodologies for 40 patients from SPECT/CT images of 99mTc-MAA administered prior to radioembolization. In these two methods, lung shunt fractions (LSFs) were calculated as the ratio of scanned lung activity to the activity in the entire scan volume, or to the sum of activity in the lung and liver, respectively. Misregistration of liver activity into the lungs during SPECT acquisition was overcome by excluding lung counts within either 2 or 1.5 cm of the diaphragm apex, respectively. Patient lung density was assumed to be 0.3 g/cm3 or derived from CT densitovolumetry, respectively. Results from both approaches were compared to MLD determined by planar scintigraphy (PS). The effect of patient size on the difference between MLD from PS and SPECT/CT was also investigated. Results: Lung density from CT densitovolumetry was not different from the reference density (p = 0.68). The second method resulted in lung doses on average 1.5 times larger than the first method; however, the difference between the means of the two estimates was not significant (p = 0.07). Lung doses from both methods were statistically different from those estimated from 2D PS (p < 0.001). There was no correlation between patient size and the difference between MLD from PS and either SPECT/CT method (r < 0.22, p > 0.17). Conclusion: There is no statistically significant difference between MLD estimated by the two techniques. Both methods are statistically different from conventional PS, with PS overestimating dose by a factor of three or more. The difference between lung doses estimated from 2D planar and 3D SPECT/CT imaging does not depend on patient size.
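The two LSF definitions and the MLD calculation can be summarized as below; the 49.67 Gy·kg/GBq factor is the widely used 90Y MIRD-type constant, and all input numbers are illustrative:

```python
# Lung shunt fraction and mean lung dose estimation from 99mTc-MAA imaging.
def lung_shunt_fraction(lung_counts, liver_counts):
    """Second method in the study: LSF = lung / (lung + liver)."""
    return lung_counts / (lung_counts + liver_counts)

def mean_lung_dose_gy(activity_gbq, lsf, lung_mass_kg):
    """MIRD-type 90Y dose relation; 49.67 Gy*kg/GBq is the commonly used constant."""
    return 49.67 * activity_gbq * lsf / lung_mass_kg

lsf = lung_shunt_fraction(lung_counts=4.2e5, liver_counts=3.9e6)
print(mean_lung_dose_gy(activity_gbq=1.5, lsf=lsf, lung_mass_kg=0.9))
```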
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simons, B.A.; Woldt, W.E.; Jones, D.D.
The environmental and health risks posed by unregulated waste disposal sites are potential concerns of Pacific Rim regions and island areas because of the need to protect aquifers and other valuable water resources. A non-intrusive screening methodology to determine site characteristics, including possible soil and/or groundwater contamination and the areal extent of waste, is being developed and tested at waste disposal sites in Nebraska. This type of methodology would be beneficial to Pacific Rim regions in investigating and/or locating unknown or poorly documented contamination areas for hazard assessment and groundwater protection. Traditional assessment methods are generally expensive, time consuming, and potentially exacerbate the problem. Ideally, a quick and inexpensive assessment method to reliably characterize these sites is desired. Electromagnetic (EM) conductivity surveying and soil-vapor sampling techniques, combined with innovative three-dimensional geostatistical methods, are used to map the data, develop a site characterization of the subsurface, and aid in tracking any contaminant plumes. The EM data are analyzed to estimate the extent and volume of waste and/or leachate. Soil-vapor data are analyzed to estimate a site's volatile organic compound (VOC) emission rate to the atmosphere. The combined information could then be incorporated as one part of an overall hazard assessment system.
An Approach to the Use of Depth Cameras for Weed Volume Estimation
Andújar, Dionisio; Dorado, José; Fernández-Quintanilla, César; Ribeiro, Angela
2016-01-01
The use of depth cameras in precision agriculture is increasing day by day. This type of sensor has been used for plant structure characterization of several crops. However, the discrimination of small plants, such as weeds, is still a challenge within agricultural fields. Improvements in the new Microsoft Kinect v2 sensor make it possible to capture the details of plants. The use of a dual methodology combining height selection and RGB (Red, Green, Blue) segmentation can separate crops, weeds, and soil. This paper explores the possibilities of this sensor by using Kinect Fusion algorithms to reconstruct 3D point clouds of weed-infested maize crops under real field conditions. The processed models showed good consistency between the 3D depth images and the actual structural parameters measured on the ground. Maize plants were identified in the samples by height selection of the connected faces and showed a correlation of 0.77 with maize biomass. The lower height of the weeds made RGB recognition necessary to separate them from the soil microrelief of the samples, achieving a good correlation of 0.83 with weed biomass. In addition, weed density showed good correlation with volumetric measurements. The canonical discriminant analysis showed promising results for classification into monocots and dicots. These results suggest that estimating volume using the Kinect methodology can be a highly accurate method for crop status determination and weed detection. It offers several possibilities for the automation of agricultural processes through the construction of a new system integrating these sensors and the development of algorithms to properly process the information provided by them. PMID:27347972
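A sketch of the dual height/RGB classification, using an excess-green index as an assumed stand-in for the paper's unspecified RGB segmentation rule:

```python
import numpy as np

def classify_points(points_xyz, rgb, crop_height_m=0.4):
    """Dual methodology sketch: height selection separates the crop from the
    rest; an excess-green RGB index (an assumption here) separates low weeds
    from soil. points_xyz in metres, z = height above ground; rgb in 0-255."""
    r, g, b = rgb[:, 0], rgb[:, 1], rgb[:, 2]
    exg = 2.0 * g - r - b                     # excess-green vegetation index
    is_crop = points_xyz[:, 2] > crop_height_m
    is_weed = ~is_crop & (exg > 20)           # green but below the crop threshold
    is_soil = ~is_crop & ~is_weed
    return is_crop, is_weed, is_soil

rng = np.random.default_rng(0)
pts = rng.uniform([0, 0, 0], [1, 1, 1.2], (1000, 3))       # toy point cloud
cols = rng.integers(0, 256, (1000, 3)).astype(float)       # toy colors
crop, weed, soil = classify_points(pts, cols)
print(crop.sum(), weed.sum(), soil.sum())
```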
NASA Astrophysics Data System (ADS)
Dung Nguyen, The; Kappas, Martin
2017-04-01
In the last several years, interest in forest biomass and carbon stock estimation has increased due to its importance for forest management, modelling of the carbon cycle, and other ecosystem services. However, no estimates of the biomass and carbon stocks of the different forest cover types exist for the Xuan Lien Nature Reserve, Thanh Hoa, Viet Nam. This study investigates the relationship between above-ground carbon stock and different vegetation indices and identifies the vegetation index that best correlates with forest carbon stock. The terrestrial inventory data come from 380 randomly sampled plots. Individual tree parameters such as DBH and tree height were collected to calculate the above-ground volume, biomass, and carbon for the different forest types. SPOT6 2013 satellite data were used to obtain five vegetation indices: NDVI, RDVI, MSR, RVI, and EVI. The relationships between forest carbon stock and the vegetation indices were investigated using multiple linear regression analysis. R-squared and RMSE values and cross-validation were used to measure the strength and validate the performance of the models. The methodology presented here demonstrates the possibility of estimating forest volume, biomass, and carbon stock. It can be further improved by incorporating additional spectral bands and/or elevation data.
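The index calculation and regression step might look like this; the index formulas are standard, while the reflectances and carbon values are synthetic:

```python
import numpy as np

def indices(red, nir):
    """Two of the five indices used (NDVI and RVI); formulas are standard."""
    ndvi = (nir - red) / (nir + red)
    rvi = nir / red
    return ndvi, rvi

# Multiple linear regression of plot carbon stock on vegetation indices,
# mirroring the study's approach (synthetic numbers, not the SPOT6 data).
rng = np.random.default_rng(0)
red = rng.uniform(0.05, 0.15, 380)
nir = rng.uniform(0.3, 0.6, 380)
ndvi, rvi = indices(red, nir)
carbon = 120 * ndvi + 5 * rvi + rng.normal(0, 5, 380)   # t C/ha, synthetic

X = np.column_stack([np.ones(380), ndvi, rvi])
coef, *_ = np.linalg.lstsq(X, carbon, rcond=None)
pred = X @ coef
r2 = 1 - np.sum((carbon - pred)**2) / np.sum((carbon - carbon.mean())**2)
print(coef, r2)
```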
Automated lung volumetry from routine thoracic CT scans: how reliable is the result?
Haas, Matthias; Hamm, Bernd; Niehues, Stefan M
2014-05-01
Today, lung volumes can be easily calculated from chest computed tomography (CT) scans. Modern postprocessing workstations allow automated volume measurement of the acquired data sets. However, there are challenges in the use of lung volume as an indicator of pulmonary disease when it is obtained from routine CT: intra-individual variation and methodologic aspects have to be considered. Our goal was to assess the reliability of volumetric measurements in routine CT lung scans. Forty adult cancer patients whose lungs were unaffected by the disease underwent routine chest CT scans at 3-month intervals, resulting in a total of 302 chest CT scans. Lung volume was calculated by automatic volumetry software. On average, 7.2 CT scans were successfully evaluable per patient (range 2-15). Intra-individual changes were assessed. In the set of patients investigated, lung volume was approximately normally distributed, with a mean of 5283 cm(3) (standard deviation = 947 cm(3), skewness = -0.34, and kurtosis = 0.16). Between different scans in one and the same patient, the median intra-individual standard deviation in lung volume was 853 cm(3) (16% of the mean lung volume). Automatic lung segmentation of routine chest CT scans allows a technically stable estimation of lung volume. However, substantial intra-individual variations have to be considered: a median intra-individual deviation of 16% in lung volume between different routine scans was found. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Bethel, James; Green, James L.; Nord, Christine; Kalton, Graham; West, Jerry
2005-01-01
This report is Volume 2 of the methodology report that provides information about the development, design, and conduct of the 9-month data collection of the Early Childhood Longitudinal Study, Birth Cohort (ECLS-B). This volume begins with a brief overview of the ECLS-B, but focuses on the sample design, calculation of response rates, development…
A Probabilistic Assessment Methodology for the Evaluation of Geologic Carbon Dioxide Storage
Brennan, Sean T.; Burruss, Robert A.; Merrill, Matthew D.; Freeman, P.A.; Ruppert, Leslie F.
2010-01-01
In 2007, the Energy Independence and Security Act (Public Law 110-140) authorized the U.S. Geological Survey (USGS) to conduct a national assessment of potential geologic storage resources for carbon dioxide (CO2) in cooperation with the U.S. Environmental Protection Agency and the U.S. Department of Energy. The first year of that activity was specified for development of a methodology to estimate storage potential that could be applied uniformly to geologic formations across the United States. After its release, the methodology was to receive public comment and external expert review. An initial methodology was developed and published in March 2009 (Burruss and others, 2009), and public comments were received. The report was then sent to a panel of experts for external review. The external review report was received by the USGS in December 2009. This report is in response to those external comments and reviews and describes how the previous assessment methodology (Burruss and others, 2009) was revised. The resource that is assessed is the technically accessible storage resource, which is defined as the mass of CO2 that can be stored in the pore volume of a storage formation. The methodology that is presented in this report is intended to be used for assessments at scales ranging from regional to subbasinal in which storage assessment units are defined on the basis of common geologic and hydrologic characteristics. The methodology does not apply to site-specific evaluation of storage resources or capacity.
The RAAF Logistics Study. Volume 4,
1986-10-01
Contents include: Use of Issue-Based Root Definitions; Application of Soft Systems Methodology to Information Systems Analysis; Conclusion; List of Abbreviations. Cites 'Management Control Systems', Journal of Applied Systems Analysis, Volume 6, 1979, pages 51 to 67. The soft systems methodology was developed to tackle… While the soft systems methodology has many advantages which recommend it to this type of study area, it does not model the time evolution of a system.
Magnetic resonance fingerprinting based on realistic vasculature in mice
Pouliot, Philippe; Gagnon, Louis; Lam, Tina; Avti, Pramod K.; Bowen, Chris; Desjardins, Michèle; Kakkar, Ashok K.; Thorin, E.; Sakadzic, Sava; Boas, David A.; Lesage, Frédéric
2017-01-01
Magnetic resonance fingerprinting (MRF) was recently proposed as a novel strategy for MR data acquisition and analysis. A variant of MRF called vascular MRF (vMRF) followed, which extracts maps of three parameters of physiological importance: cerebral oxygen saturation (SatO2), mean vessel radius, and cerebral blood volume (CBV). However, this estimation was based on idealized 2-dimensional simulations of vascular networks using random cylinders and the empirical Bloch equations convolved with a diffusion kernel. Here we focus on studying the vascular MR fingerprint using real mouse angiograms and physiological values as the substrate for the MR simulations. The MR signal is calculated ab initio with a Monte Carlo approximation, by tracking the accumulated phase from a large number of protons diffusing within the angiogram. We first study the identifiability of parameters in simulations, showing that parameters are fully estimable at realistically high signal-to-noise ratios (SNR) when the same angiogram is used for dictionary generation and parameter estimation, but that large biases in the estimates persist when the angiograms are different. Despite these biases, simulations show that differences in parameters remain estimable. We then applied this methodology to data acquired using the GESFIDE sequence with SPIONs injected into 9 young wild-type and 9 old atherosclerotic mice. Both the pre-injection signal and the ratio of post-to-pre-injection signals were modeled, using 5-dimensional dictionaries. The vMRF methodology extracted significant differences in SatO2, mean vessel radius, and CBV between the two groups, consistent across brain regions and dictionaries. Further validation work is essential before vMRF can gain wider application. PMID:28043909
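A toy version of the ab initio Monte Carlo signal computation, with a random field map standing in for one derived from a real angiogram and with diffusion reduced to a periodic random walk:

```python
import numpy as np

# Protons random-walk through a magnetic field perturbation map and accumulate
# phase; the net transverse magnetization is the mean complex phasor.
rng = np.random.default_rng(0)
gamma = 2.675e8                                   # proton gyromagnetic ratio (rad/s/T)
dt, n_steps, n_spins = 0.2e-3, 500, 20_000
field = rng.normal(0, 5e-9, (64, 64, 64))         # toy field-offset map (T)
step_sigma = 1.0                                  # random-walk step (voxels)

pos = rng.uniform(0, 64, (n_spins, 3))
phase = np.zeros(n_spins)
for _ in range(n_steps):
    pos = (pos + rng.normal(0, step_sigma, pos.shape)) % 64   # periodic volume
    idx = tuple(pos.astype(int).T)
    phase += gamma * field[idx] * dt
signal = np.abs(np.exp(1j * phase).mean())        # net transverse magnetization
print(signal)
```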
NASA Astrophysics Data System (ADS)
Giorli, Giacomo; Drazen, Jeffrey C.; Neuheimer, Anna B.; Copeland, Adrienne; Au, Whitlow W. L.
2018-01-01
Pelagic animals that form deep sea scattering layers (DSLs) represent an important link in the food web between zooplankton and top predators. While estimating the composition, density, and location of the DSL is important for understanding mesopelagic ecosystem dynamics and predicting top predators' distribution, DSL composition and density are often estimated from trawls, which may be biased in terms of extrusion, avoidance, and gear-associated effects. Instead, the location and biomass of DSLs can be estimated with active acoustic techniques, though estimates are often in aggregate, without regard to size- or taxon-specific information. For the first time in the open ocean, we used a DIDSON sonar to characterize the fauna in DSLs. Estimates of the numerical density and length of animals at different depths and locations along the Kona coast of the Island of Hawaii were determined. Data were collected below and inside the DSLs with the sonar mounted on a profiler. A total of 7068 animals were counted and sized. We estimated numerical densities ranging from 1 to 7 animals/m3, and individuals as long as 3 m were detected. These numerical densities were orders of magnitude higher than those estimated from trawls, and the average sizes of animals were much larger as well. A mixed model was used to characterize the numerical density and length of animals as a function of deep sea layer sampled, location, time of day, and day of the year. Numerical density and length of animals varied by month, with numerical density also a function of depth. The DIDSON proved to be a good tool for open-ocean/deep-sea estimation of the numerical density and size of marine animals, especially larger ones. Further work is needed to understand how this methodology relates to estimates of volume backscatter obtained with standard echosounding techniques and density measures obtained with other sampling methodologies, and to precisely evaluate sampling biases.
Longo, Benedetto; Farcomeni, Alessio; Ferri, Germano; Campanale, Antonella; Sorotos, Micheal; Santanelli, Fabio
2013-07-01
Breast volume assessment enhances preoperative planning of both aesthetic and reconstructive procedures, helping the surgeon in the decision-making process of shaping the breast. Numerous methods of breast size determination are currently reported but are limited by methodologic flaws and variable estimations. The authors aimed to develop a unifying predictive formula for volume assessment in small to large breasts based on anthropomorphic values. Ten anthropomorphic breast measurements and direct volumes of 108 mastectomy specimens from 88 women were collected prospectively. The authors performed a multivariate regression to build the optimal model for development of the predictive formula. The final model was then internally validated. A previously published formula was used as a reference. Mean (±SD) breast weight was 527.9 ± 227.6 g (range, 150 to 1250 g). After model selection, the sternal notch-to-nipple, inframammary fold-to-nipple, and inframammary fold-to-fold projection distances emerged as the most important predictors. The resulting formula (the BREAST-V) showed an adjusted R2 of 0.73. The estimated expected absolute error on new breasts is 89.7 g (95 percent CI, 62.4 to 119.1 g) and the expected relative error is 18.4 percent (95 percent CI, 12.9 to 24.3 percent). Application of the reference formula to the sample yielded worse predictions than those derived from the new formula, with an R2 of 0.55. The BREAST-V is a reliable tool for accurately predicting small to large breast volumes, for use as a complementary device in surgical evaluation. An app entitled BREAST-V for both iOS and Android devices is currently available for free download in the Apple App Store and Google Play Store. Diagnostic, II.
Kindermans, Jean-Marie; Vandenbergh, Daniel; Vreeke, Ed; Olliaro, Piero; D'Altilia, Jean-Pierre
2007-07-10
Having reliable forecasts is now critical for producers, malaria-endemic countries and agencies in order to adapt production and procurement of the artemisinin-based combination treatments (ACTs), the new first-line treatments of malaria. There is no ideal method to quantify drug requirements for malaria, and morbidity data give uncertain estimations. This study uses drug consumption to provide elements to help estimate quantities and financial requirements of ACTs. The consumption of chloroquine, sulphadoxine/pyrimethamine and quinine through both the private and public sector was assessed in five sub-Saharan Africa countries with different epidemiological patterns (Senegal, Rwanda, Tanzania, Malawi, Zimbabwe). From these data the number of adult treatments per capita was calculated and the volumes and financial implications derived for the whole of Africa. Identifying and obtaining data from the private sector was difficult, and the quality of information on drug supply and distribution in countries must be improved. The number of adult treatments per capita and per year in the five countries ranged from 0.18 to 0.50. Current adult treatment prices for ACTs range from US$ 1 to US$ 1.8. Taking the upper range for both volumes and costs, the highest number of adult treatments consumed for Africa was estimated at 314.5 million, corresponding to an overall maximum annual need for financing ACT procurement of US$ 566.1 million. In reality, both the number of cases treated and the cost of treatment are likely to be lower (projections for the lowest consumption estimate with the least expensive ACT would require US$ 113 million per annum). There were substantial variations in the market share between public and private sources among these countries (the public sector share ranging from 98% in Rwanda to 33% in Tanzania). Additional studies are required to build a more robust methodology, and to assess current consumption more accurately in order to better quantify volumes and finances for production and procurement of ACTs.
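The reported bounds can be reproduced with back-of-envelope arithmetic; note that the population base below (~629 million) is inferred from the abstract's own figures rather than stated in it.

```python
# Reproducing the reported financing bounds (US$ millions).
upper_treatments = 314.5e6            # adult treatments/year, upper bound
price_high, price_low = 1.8, 1.0      # US$ per adult ACT course
print(upper_treatments * price_high / 1e6)   # -> 566.1

pop_base = upper_treatments / 0.50    # ~629 million, implied by the figures
lower_treatments = 0.18 * pop_base    # ~113.2 million treatments
print(lower_treatments * price_low / 1e6)    # -> ~113
```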
ERIC Educational Resources Information Center
Wylie, Ruth C.
This volume of the revised edition describes and evaluates measurement methods, research designs, and procedures which have been or might appropriately be used in self-concept research. Working from the perspective that self-concept or phenomenal personality theories can be scientifically investigated, methodological flaws and questionable…
Proceedings of the Workshop on Identification and Control of Flexible Space Structures, Volume 2
NASA Technical Reports Server (NTRS)
Rodriguez, G. (Editor)
1985-01-01
The results of a workshop on identification and control of flexible space structures held in San Diego, CA, July 4 to 6, 1984 are discussed. The main objective of the workshop was to provide a forum for exchanging ideas on applying the most advanced modeling, estimation, identification and control methodologies to flexible space structures. The workshop responded to the rapidly growing interest within NASA in large space systems (space station, platforms, antennas, flight experiments) currently under design. Dynamic structural analysis, control theory, structural vibration and stability, and distributed parameter systems are discussed.
1992-01-01
...the uncertainty. The above method can give an estimate of the precision of the analysis. However, determining the accuracy cannot be done as... speciation has been determined from analyzing model samples as well as by comparison with other methods and combinations of other methods with this method... laboratory. The output of the sensor is characterized over its working range and an appropriate response factor determined by linear regression of the...
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
This report compares the energy use, oil use and emissions of electric vehicles (EVs) with those of conventional, gasoline-powered vehicles (CVs) over the total life cycle of the vehicles. The stages included in the vehicles' life cycles are vehicle manufacture, fuel production, and vehicle operation. Disposal is not included. An inventory of the air emissions associated with each stage of the life cycle is estimated. Water pollutants and solid wastes are reported for individual processes, but no comprehensive inventory is developed. Volume II contains additional details on the vehicle, utility, and materials analyses and discusses several details of the methodology.
Eoff, Jennifer D.; Biewick, Laura R.H.; Brownfield, Michael E.; Burke, Lauri; Charpentier, Ronald R.; Dubiel, Russell F.; Gaswirth, Stephanie B.; Gianoutsos, Nicholas J.; Kinney, Scott A.; Klett, Timothy R.; Leathers, Heidi M.; Mercier, Tracey J.; Paxton, Stanley T.; Pearson, Ofori N.; Pitman, Janet K.; Schenk, Christopher J.; Tennyson, Marilyn E.; Whidden, Katherine J.
2015-08-11
Using a geology-based assessment methodology, the U.S. Geological Survey estimated undiscovered mean volumes of 14 million barrels of conventional oil, 430 billion cubic feet of conventional gas, 34,028 billion cubic feet of continuous gas, and a mean total of 391 million barrels of natural gas liquids in sandstone reservoirs of the Upper Jurassic–Lower Cretaceous Cotton Valley Group in onshore lands and State waters of the U.S. Gulf Coast region.
3D motion and strain estimation of the heart: initial clinical findings
NASA Astrophysics Data System (ADS)
Barbosa, Daniel; Hristova, Krassimira; Loeckx, Dirk; Rademakers, Frank; Claus, Piet; D'hooge, Jan
2010-03-01
The quantitative assessment of regional myocardial function remains an important goal in clinical cardiology. As such, tissue Doppler imaging and speckle tracking based methods have been introduced to estimate local myocardial strain. Recently, volumetric ultrasound has become more readily available, allowing the 3D estimation of motion and myocardial deformation. Our lab has previously presented a method based on spatio-temporal elastic registration of ultrasound volumes to estimate myocardial motion and deformation in 3D, overcoming the spatial limitations of the existing methods. This method was optimized on simulated data sets in previous work and is currently being tested in a clinical setting. In this manuscript, 10 healthy volunteers, 10 patients with myocardial infarction and 10 patients with arterial hypertension were included. The cardiac strain values extracted with the proposed method were compared with the ones estimated with 1D tissue Doppler imaging and 2D speckle tracking in all patient groups. Although the absolute values of the 3D strain components assessed by this new methodology were not identical to those of the reference methods, the relationship between the different patient groups was similar.
Effects of obesity on lung volume and capacity in children and adolescents: a systematic review
Winck, Aline Dill; Heinzmann-Filho, João Paulo; Soares, Rafaela Borges; da Silva, Juliana Severo; Woszezenki, Cristhiele Taís; Zanatta, Letiane Bueno
2016-01-01
Abstract Objective: To assess the effects of obesity on lung volume and capacity in children and adolescents. Data source: This is a systematic review, carried out in the Pubmed, Lilacs, Scielo and PEDro databases, using the following keywords: Plethysmography; Whole Body OR Lung Volume Measurements OR Total Lung Capacity OR Functional Residual Capacity OR Residual Volume AND Obesity. Observational studies or clinical trials that assessed the effects of obesity on lung volume and capacity in children and adolescents (0-18 years) without any other associated disease, written in English, Portuguese or Spanish, were selected. Methodological quality was assessed by the Agency for Healthcare Research and Quality. Data synthesis: Of the 1,030 articles, only four were included in the review. The studies amounted to 548 participants, predominantly males, with sample sizes ranging from 45 to 327 individuals. All of the studies evaluated nutritional status through BMI (z-score), and 50.0% reported data on abdominal circumference. All demonstrated that obesity causes negative effects on lung volume and capacity, causing a reduction mainly in functional residual capacity in 75.0% of the studies, in the expiratory reserve volume in 50.0% and in the residual volume in 25.0%. The methodological quality ranged from moderate to high, with 75.0% of the studies classified as having high methodological quality. Conclusions: Obesity causes deleterious effects on lung volume and capacity in children and adolescents, mainly by reducing functional residual capacity, expiratory reserve volume and residual volume. PMID:27130483
NASA Technical Reports Server (NTRS)
Wang, Jianzhong Jay; Datta, Koushik; Landis, Michael R. (Technical Monitor)
2002-01-01
This paper describes the development of a life-cycle cost (LCC) estimating methodology for air traffic control Decision Support Tools (DSTs) under development by the National Aeronautics and Space Administration (NASA), using a combination of parametric, analogy, and expert opinion methods. There is no one standard methodology and technique that is used by NASA or by the Federal Aviation Administration (FAA) for LCC estimation of prospective Decision Support Tools. Some of the frequently used methodologies include bottom-up, analogy, top-down, parametric, expert judgement, and Parkinson's Law. The developed LCC estimating methodology can be visualized as a three-dimensional matrix where the three axes represent coverage, estimation, and timing. This paper focuses on the three characteristics of this methodology that correspond to the three axes.
Stabilized Finite Elements in FUN3D
NASA Technical Reports Server (NTRS)
Anderson, W. Kyle; Newman, James C.; Karman, Steve L.
2017-01-01
A Streamline Upwind Petrov-Galerkin (SUPG) stabilized finite-element discretization has been implemented as a library into the FUN3D unstructured-grid flow solver. Motivation for the selection of this methodology is given, details of the implementation are provided, and the discretization for the interior scheme is verified for linear and quadratic elements by using the method of manufactured solutions. A methodology is also described for capturing shocks, and simulation results are compared to the finite-volume formulation that is currently the primary method employed for routine engineering applications. The finite-element methodology is demonstrated to be more accurate than the finite-volume scheme, particularly on tetrahedral meshes where the solutions obtained using the finite-volume scheme can suffer from adverse effects caused by bias in the grid. Although no effort has been made to date to optimize computational efficiency, the finite-element scheme is competitive with the finite-volume scheme in terms of computer time to reach convergence.
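For reference, the textbook SUPG weak form for steady advection-diffusion (a generic statement of the technique, not FUN3D's exact discretization) reads: find $u_h$ such that, for all test functions $w_h$,

$$
\int_\Omega \left( w_h\,\mathbf{a}\cdot\nabla u_h + \nu\,\nabla w_h\cdot\nabla u_h \right) d\Omega
\;+\; \sum_{e} \int_{\Omega_e} \tau_e\,(\mathbf{a}\cdot\nabla w_h)\,R(u_h)\, d\Omega
\;=\; \int_\Omega w_h f \, d\Omega,
$$

with element residual $R(u_h)=\mathbf{a}\cdot\nabla u_h-\nu\Delta u_h-f$ and a common choice of stabilization parameter $\tau_e=\frac{h_e}{2\lVert\mathbf{a}\rVert}\left(\coth \mathrm{Pe}_e-\frac{1}{\mathrm{Pe}_e}\right)$, where $\mathrm{Pe}_e=\lVert\mathbf{a}\rVert h_e/(2\nu)$ is the element Péclet number. The stabilization term acts only on the residual, so consistency is preserved: it vanishes for the exact solution.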
Chirico, Peter G.; Malpeli, Katherine C.
2012-01-01
In May of 2000, a meeting was convened in Kimberley, South Africa, by representatives of the diamond industry and leaders of African governments to develop a certification process intended to assure that export shipments of rough diamonds were free of conflict concerns. Outcomes of the meeting were formally supported later in December of 2000 by the United Nations in a resolution adopted by the General Assembly. By 2002, the Kimberley Process Certification Scheme was ratified and signed by diamond-producing and diamond-importing countries. As of August 2012, the Kimberley Process (KP) had 51 participants representing 77 countries. It is often difficult to obtain independent verification of the diamond production statistics that are provided to the KP. However, some degree of independent verification can be obtained through an understanding of a country’s naturally occurring endowment of diamonds and the intensity of mining activities. Studies that integrate these two components can produce a range of estimated values for a country’s diamond production, and these estimates can then be compared to the production statistics released by that country. This methodology is used to calculate (1) the diamond resource potential of a country, which refers to the total number of carats estimated to be remaining in the country, and (2) the diamond production capacity of a country, which is the current volume of diamonds that may realistically be produced per year utilizing current human and physical resources. The following sections outline the methodology used by the U.S. Geological Survey (USGS) to perform diamond assessments in Mali, the Central African Republic, Ghana, and Guinea.
NASA Astrophysics Data System (ADS)
Shpotyuk, O.; Ingram, A.; Shpotyuk, Ya.
2018-02-01
Methodological possibilities of positron annihilation lifetime (PAL) spectroscopy are examined to parameterize free-volume structural evolution processes in some nanostructurized substances obeying conversion from positronium (Ps) decaying to positron trapping. Unlike conventional x3-term fitting analysis based on admixed positron trapping and Ps decaying, the effect of nanostructurization is considered as occurring due to conversion from preferential Ps decaying in the initial host matrix to positron trapping in the modified (nanostructurized) host-guest matrix. The developed approach, referred to as x3-x2-CDA (coupling decomposition algorithm), allows estimation of defect-free bulk and defect-specific positron lifetimes of the free-volume elements responsible for nanostructurization. The applicability of this approach is proved for some nanostructurized materials allowing free-volume changes through Ps-to-positron trapping conversion, such as (i) metallic Ag nanoparticles embedded in a polymer matrix, (ii) structure-modification processes caused by swift heavy ion irradiation in polystyrene, and (iii) host-guest chemistry problems such as water immersion in alumomagnesium spinel ceramics. This approach can serve as a test indicator, separating processes of host-matrix nanostructurization due to embedded nanoparticles from uncorrelated changes in the positron-trapping and Ps-decaying channels.
NASA Astrophysics Data System (ADS)
Voumard, Jérémie; Jaboyedoff, Michel
2016-04-01
On 22-23 July 2015, two severe storms one day apart caused major damage in Scuol, Lower Engadine (Canton of Graubünden, Switzerland). The static storm cells produced up to 150 mm of rain in three hours, generating several debris flows. On 22 July 2015, three buildings in the Pradella hamlet near Scuol were damaged by a debris flow. The occupants of two holiday camps, 100 children and 40 adults, were evacuated. Nobody was injured, but the damage to the buildings was substantial. A day later, about 200 mm of rain in a short time was measured in the same area. A car was swept away by a debris flow in the Scuol village, and its driver escaped at the last moment. The S-charl valley was cut off for more than a week by seven big debris flows and several smaller ones. About 100 people, mostly holidaymakers, were stranded in the S-charl hamlet without power supply for a few days. Until the Swiss army built a provisional emergency bridge to reopen the valley access, the only way to reach the S-charl valley was by helicopter. Overall damage (road infrastructure, buildings, drinking water supply, power supply and other) is estimated at one million Swiss francs, and the debris flow volume is estimated at 100'00 cubic meters. The S-charl valley roadsides were photographed fifteen days before the extreme storm event from a moving vehicle, and the same roadsides were photographed twenty days after the event with the same acquisition methodology. 3D point clouds were produced from Structure-from-Motion (SfM) processing of the before- and after-event pictures and compared. It was thus possible to evaluate the number of debris flows that occurred in the S-charl valley and to estimate their volume along the roadsides. This case study allows evaluation of the low-cost on-motion SfM methodology and identification of its main advantages and disadvantages when used to estimate roadside changes due to a natural hazard event.
How well do we understand oil spill hazard mapping?
NASA Astrophysics Data System (ADS)
Sepp Neves, Antonio Augusto; Pinardi, Nadia
2017-04-01
In simple terms, the marine oil spill hazard can be described as related to three main factors: the spill event itself, the spill trajectory, and the arrival and adsorption of oil on the shore (beaching). Regarding the first factor, spill occurrence rates, magnitude distributions and their respective uncertainties have been estimated mainly from maritime casualty reports. Abascal et al. (2010) and Sepp Neves et al. (2015) demonstrated for the Prestige (Spain, 2002) and Jiyeh (Lebanon, 2006) spills that ensemble numerical oil spill simulations can generate reliable estimates of the most likely oil trajectories and impacted coasts. Although paramount for estimating spill impacts on coastal resources, the third component of the oil spill hazard (i.e., oil beaching) is still a subject of discussion. Analysts have employed different methodologies to estimate the coastal component of the hazard, relying, for instance, on the beaching frequency alone, on the time during which a given coastal segment is subject to oil concentrations above a preset threshold, on percentages of oil beached relative to the originally spilled volume, and on many others. Obviously, the results are not comparable and are sometimes not consistent with present knowledge about the environmental impacts of oil spills. The observed inconsistency in hazard mapping methodologies suggests that there is still a lack of understanding of the beaching component of the oil spill hazard itself. A careful statistical description of the beaching process could finally set a common ground for oil spill hazard mapping studies, as observed for other hazards such as earthquakes and landslides. This paper is the last of a series of efforts to standardize oil spill hazard and risk assessments through an ISO-compliant framework (IT-OSRA; see Sepp Neves et al., 2015). We performed two large ensemble oil spill experiments addressing uncertainties in the spill characteristics and location and in meteocean conditions for two different areas (Algarve and Uruguay), aiming at quantifying the hazard due to accidental (large volumes and rare events) and operational (frequent and usually involving small volumes) spills associated with maritime traffic. In total, over 60,000 240-h-long simulations were run, and the statistical behavior of the beached concentrations was described. The concentration distributions for both study areas were successfully fit using a Gamma distribution, demonstrating the generality of our conclusions. The oil spill hazard and its uncertainties were quantified for accidental and operational events from the fitted distribution parameters. The hazard estimates were therefore comparable between areas and allowed us to identify priority coastal segments for protection and to rank sources of hazard.
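The distribution-fitting step lends itself to a short sketch. The following Python code fits a Gamma law to synthetic beached-concentration values with SciPy; the shape/scale values and the 99th-percentile hazard level are illustrative stand-ins for the ensemble output.

```python
# Fit a Gamma law to (synthetic) beached-oil concentrations with SciPy.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
beached = rng.gamma(shape=0.8, scale=50.0, size=5000)   # e.g., tonnes per segment

shape, loc, scale = stats.gamma.fit(beached, floc=0.0)  # location fixed at zero
p99 = stats.gamma.ppf(0.99, shape, loc=loc, scale=scale)
print(f"shape={shape:.2f}, scale={scale:.1f}, 99th percentile={p99:.1f}")
```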
Jha, Abhinav K.; Mena, Esther; Caffo, Brian; Ashrafinia, Saeed; Rahmim, Arman; Frey, Eric; Subramaniam, Rathan M.
2017-01-01
Abstract. Recently, a class of no-gold-standard (NGS) techniques has been proposed to evaluate quantitative imaging methods using patient data. These techniques provide figures of merit (FoMs) quantifying the precision of the estimated quantitative value without requiring repeated measurements and without requiring a gold standard. However, applying these techniques to patient data presents several practical difficulties, including assessing the underlying assumptions, accounting for patient-sampling-related uncertainty, and assessing the reliability of the estimated FoMs. To address these issues, we propose statistical tests that provide confidence in the underlying assumptions and in the reliability of the estimated FoMs. Furthermore, the NGS technique is integrated within a bootstrap-based methodology to account for patient-sampling-related uncertainty. The developed NGS framework was applied to evaluate four methods for segmenting lesions from (18)F-fluoro-2-deoxyglucose positron emission tomography images of patients with head-and-neck cancer on the task of precisely measuring the metabolic tumor volume. The NGS technique consistently predicted the same segmentation method as the most precise method. The proposed framework provided confidence in these results, even when gold-standard data were not available. The bootstrap-based methodology indicated improved performance of the NGS technique with larger numbers of patient studies, as expected, and yielded consistent results as long as data from more than 80 lesions were available for the analysis. PMID:28331883
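A minimal sketch of the bootstrap layer, assuming per-lesion estimates are available: resample with replacement and recompute a figure of merit per replicate. The stand-in FoM below is a placeholder, not the NGS estimator itself.

```python
# Bootstrap confidence interval for a stand-in figure of merit (FoM).
import numpy as np

rng = np.random.default_rng(3)
lesion_values = rng.normal(20.0, 4.0, size=120)   # per-lesion estimates (synthetic)

def figure_of_merit(x):
    return x.std() / x.mean()    # placeholder FoM, not the NGS estimator

reps = np.array([
    figure_of_merit(rng.choice(lesion_values, size=lesion_values.size, replace=True))
    for _ in range(2000)
])
lo, hi = np.percentile(reps, [2.5, 97.5])
print(f"FoM 95% bootstrap interval: [{lo:.3f}, {hi:.3f}]")
```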
Membranes with artificial free-volume for biofuel production
Petzetakis, Nikos; Doherty, Cara M.; Thornton, Aaron W.; Chen, X. Chelsea; Cotanda, Pepa; Hill, Anita J.; Balsara, Nitash P.
2015-01-01
Free-volume of polymers governs transport of penetrants through polymeric films. Control over free-volume is thus important for the development of better membranes for a wide variety of applications such as gas separations, pharmaceutical purifications and energy storage. To date, methodologies used to create materials with different amounts of free-volume are based primarily on chemical synthesis of new polymers. Here we report a simple methodology for generating free-volume based on the self-assembly of polyethylene-b-polydimethylsiloxane-b-polyethylene triblock copolymers. We have used this method to fabricate a series of membranes with identical compositions but with different amounts of free-volume. We use the term artificial free-volume to refer to the additional free-volume created by self-assembly. The effect of artificial free-volume on selective transport through the membranes was tested using butanol/water and ethanol/water mixtures due to their importance in biofuel production. We found that the introduction of artificial free-volume improves both alcohol permeability and selectivity. PMID:26104672
Membranes with artificial free-volume for biofuel production
NASA Astrophysics Data System (ADS)
Petzetakis, Nikos; Doherty, Cara M.; Thornton, Aaron W.; Chen, X. Chelsea; Cotanda, Pepa; Hill, Anita J.; Balsara, Nitash P.
2015-06-01
Free-volume of polymers governs transport of penetrants through polymeric films. Control over free-volume is thus important for the development of better membranes for a wide variety of applications such as gas separations, pharmaceutical purifications and energy storage. To date, methodologies used to create materials with different amounts of free-volume are based primarily on chemical synthesis of new polymers. Here we report a simple methodology for generating free-volume based on the self-assembly of polyethylene-b-polydimethylsiloxane-b-polyethylene triblock copolymers. We have used this method to fabricate a series of membranes with identical compositions but with different amounts of free-volume. We use the term artificial free-volume to refer to the additional free-volume created by self-assembly. The effect of artificial free-volume on selective transport through the membranes was tested using butanol/water and ethanol/water mixtures due to their importance in biofuel production. We found that the introduction of artificial free-volume improves both alcohol permeability and selectivity.
Membranes with artificial free-volume for biofuel production
Petzetakis, Nikos; Doherty, Cara M.; Thornton, Aaron W.; ...
2015-06-24
Free-volume of polymers governs transport of penetrants through polymeric films. Control over free-volume is thus important for the development of better membranes for a wide variety of applications such as gas separations, pharmaceutical purifications and energy storage. To date, methodologies used to create materials with different amounts of free-volume are based primarily on chemical synthesis of new polymers. Here we report a simple methodology for generating free-volume based on the self-assembly of polyethylene-b-polydimethylsiloxane-b-polyethylene triblock copolymers. We have used this method to fabricate a series of membranes with identical compositions but with different amounts of free-volume. We use the term artificial free-volume to refer to the additional free-volume created by self-assembly. The effect of artificial free-volume on selective transport through the membranes was tested using butanol/water and ethanol/water mixtures due to their importance in biofuel production. We found that the introduction of artificial free-volume improves both alcohol permeability and selectivity.
Bissessor, Liselle; Wilson, Janet; McAuliffe, Gary; Upton, Arlo
2017-06-16
Trichomonas vaginalis (TV) prevalence varies among different communities and populations. The availability of robust molecular platforms for the detection of TV has advanced diagnosis; however, molecular tests are more costly than phenotypic methodologies, and testing all urogenital samples is expensive. We recently replaced culture methods with the Aptima Trichomonas vaginalis nucleic acid amplification test, performed on specific request and as reflex testing by the laboratory, and have audited this change. Data were collected for August 2015 (microbroth culture and microscopy) and August 2016 (Aptima TV assay), including referrer, testing volumes, results and test cost estimates. In August 2015, 10,299 vaginal swabs were tested, and in August 2016, 2,189 specimens (urogenital swabs and urines). The positivity rate rose from 0.9% to 5.3%, and overall more TV infections were detected in 2016. The number needed to test and the cost for one positive TV result were 111 and $902.55 in 2015, and 19 and $368.92 in 2016, respectively. Request volumes and positivity rates differed among referrers. The methodology change was associated with higher overall detection of TV and reductions in the number needed to test and the cost for one TV diagnosis. Our audit suggests that there is room for improvement in TV test requesting in our community.
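The audit arithmetic is easy to verify; the per-test costs in the comments below are implied by the reported figures rather than stated in the abstract.

```python
# Number needed to test (NNT) and implied per-test cost from the audit figures.
tested_2015, positivity_2015, cost_per_pos_2015 = 10_299, 0.009, 902.55
tested_2016, positivity_2016, cost_per_pos_2016 = 2_189, 0.053, 368.92

nnt_2015 = round(1 / positivity_2015)             # -> 111
nnt_2016 = round(1 / positivity_2016)             # -> 19
print(nnt_2015, cost_per_pos_2015 / nnt_2015)     # ~8.1 per culture test (implied)
print(nnt_2016, cost_per_pos_2016 / nnt_2016)     # ~19.4 per molecular test (implied)
```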
Application of Modern Design of Experiments to CARS Thermometry in a Model Scramjet Engine
NASA Technical Reports Server (NTRS)
Danehy, P. M.; DeLoach, R.; Cutler, A. D.
2002-01-01
We have applied formal experiment design and analysis to optimize the measurement of temperature in a supersonic combustor at NASA Langley Research Center. We used the coherent anti-Stokes Raman spectroscopy (CARS) technique to map the temperature distribution in the flowfield downstream of an 1160 K, Mach 2 freestream into which supersonic hydrogen fuel is injected at an angle of 30 degrees. CARS thermometry is inherently a single-point measurement technique; it was used to map the flow by translating the measurement volume through the flowfield. The method known as "Modern Design of Experiments" (MDOE) was used to estimate the required data volume, design the test matrix, perform the experiment, and analyze the resulting data. MDOE allowed us to match the volume of data acquired to the precision requirements of the customer. Furthermore, one aspect of MDOE, known as response surface methodology, allowed us to develop precise maps of the flowfield temperature, allowing interpolation between measurement points. An analytic function in two spatial variables was fit to the data from a single measurement plane. Fitting with a Cosine Series Bivariate Function allowed the mean temperature to be mapped with 95% confidence interval half-widths of +/- 30 Kelvin, comfortably meeting the +/- 50 Kelvin requirement specified prior to performing the experiments. We estimate that applying MDOE to the present experiment saved a factor of 5 in data volume acquired, compared to experiments executed in the traditional manner. Furthermore, the precision requirements could have been met with less than half the data acquired.
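A hedged sketch of a response-surface fit with a two-variable cosine-series basis follows; the basis order, geometry, and data are illustrative and do not reproduce the study's exact Cosine Series Bivariate Function.

```python
# Least-squares fit of a temperature map with a 2D cosine-series basis.
import numpy as np

rng = np.random.default_rng(4)
n = 300
y, z = rng.uniform(0, 1, n), rng.uniform(0, 1, n)     # normalized coordinates
temp = 1200 + 250*np.cos(np.pi*y)*np.cos(2*np.pi*z) + rng.normal(0, 30, n)

order = 3
basis = [np.cos(np.pi*m*y) * np.cos(np.pi*k*z)
         for m in range(order + 1) for k in range(order + 1)]
A = np.column_stack(basis)                  # the m = k = 0 term is the constant
coef, *_ = np.linalg.lstsq(A, temp, rcond=None)
rms = np.sqrt(np.mean((temp - A @ coef) ** 2))
print(f"RMS residual: {rms:.1f} K")
```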
A Practical Methodology for Disaggregating the Drivers of Drug Costs Using Administrative Data.
Lungu, Elena R; Manti, Orlando J; Levine, Mitchell A H; Clark, Douglas A; Potashnik, Tanya M; McKinley, Carol I
2017-09-01
Prescription drug expenditures represent a significant component of health care costs in Canada, with an estimated $28.8 billion spent in 2014. Identifying the major cost drivers and the effect they have on prescription drug expenditures allows policy makers and researchers to interpret current cost pressures and anticipate future expenditure levels. The objectives were to identify the major drivers of prescription drug costs and to develop a methodology to disaggregate the impact of each individual driver. The methodology proposed in this study uses the Laspeyres approach for cost decomposition. This approach isolates the effect of the change in a specific factor (e.g., price) by holding the other factor(s) (e.g., quantity) constant at base-period values. The Laspeyres approach is expanded to a multi-factorial framework to isolate and quantify several factors that drive prescription drug costs. Three broad categories of effects are considered: volume, price and drug-mix effects, and for each category important sub-effects are quantified, as illustrated in the sketch below. This study presents a new and comprehensive methodology for decomposing the change in prescription drug costs over time, including step-by-step demonstrations of how the formulas were derived. The methodology has practical applications for health policy decision makers and can aid researchers in conducting cost driver analyses. It can be adjusted depending on the purpose and analytical depth of the research and data availability. © 2017 Journal of Population Therapeutics and Clinical Pharmacology. All rights reserved.
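A minimal sketch of a Laspeyres-style decomposition on hypothetical data: each effect isolates one factor while the others are held at base-period values (the paper's full framework additionally splits out drug-mix sub-effects).

```python
# Laspeyres-style decomposition of a drug-cost change (hypothetical data).
import numpy as np

p0 = np.array([10.0, 25.0, 5.0])       # base-period price per unit, by drug
q0 = np.array([1000., 400., 2000.])    # base-period quantities
p1 = np.array([11.0, 24.0, 6.0])       # current-period prices
q1 = np.array([1100., 500., 1900.])    # current-period quantities

total_change  = (p1 * q1).sum() - (p0 * q0).sum()
price_effect  = ((p1 - p0) * q0).sum()    # prices move, quantities at base values
volume_effect = (p0 * (q1 - q0)).sum()    # quantities move, prices at base values
residual      = total_change - price_effect - volume_effect   # interaction/mix
print(total_change, price_effect, volume_effect, residual)
```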
NASA Technical Reports Server (NTRS)
Keitz, J. F.
1982-01-01
The impact of more timely and accurate weather data on airline flight planning, with emphasis on fuel savings, is studied. This volume of the report discusses the results of Task 1 of the four major tasks included in the study. Task 1 compares flight plans based on forecasts with plans based on the verifying analysis from 33 days during the summer and fall of 1979. The comparisons show that: (1) potential fuel savings, conservatively estimated at between 1.2 and 2.5 percent, could result from using more timely and accurate weather data in flight planning and route selection; (2) the Suitland forecast generally underestimates wind speeds; and (3) the track selection methodology of many airlines operating on the North Atlantic may not be optimal, resulting in selection of a track other than the optimal North Atlantic Organized Track about 50 percent of the time.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
The objectives of the Human Health Exposure Assessment include: (1) estimate the type and magnitude of exposures to contaminants; (2) identify contaminants of concern; (3) identify sites for remedial action; (4) recommend sites for the no-action remedial alternative; and (5) provide a basis for detailed characterization of the risk associated with all sites. This document consists of the following: an executive summary; Vol. I - Land use and exposed population evaluations; Vols. II-III - Toxicity assessment (includes Army and Shell toxicity profiles); Vol. IV - PPLV methodology; Vol. V - PPLV calculations; Vol. VI - Study area exposure analysis (A. Introduction, B. Western study area, C. Southern study area, D. North Central study area, E. Central study area, F. Eastern study area, G. South Plants study area, and H. North Plants study area); Vol. VII - Summary exposure assessment.
Direct volume estimation without segmentation
NASA Astrophysics Data System (ADS)
Zhen, X.; Wang, Z.; Islam, A.; Bhaduri, M.; Chan, I.; Li, S.
2015-03-01
Volume estimation plays an important role in clinical diagnosis. For example, the cardiac ventricular volumes of the left ventricle (LV) and right ventricle (RV) are important clinical indicators of cardiac function. Accurate and automatic estimation of the ventricular volumes is essential to the assessment of cardiac function and the diagnosis of heart disease. Conventional methods depend on an intermediate segmentation step, performed either manually or automatically. However, manual segmentation is extremely time-consuming, subjective and poorly reproducible, while automatic segmentation remains challenging, computationally expensive, and completely unsolved for the RV. Toward accurate and efficient direct volume estimation, our group has been investigating segmentation-free, learning-based methods that leverage state-of-the-art machine learning techniques. These direct estimation methods remove the intermediate segmentation step and can naturally handle various volume estimation tasks. Moreover, they are flexible enough to be used for volume estimation of either the joint bi-ventricles (LV and RV) or the individual LV/RV. We comparatively study the performance of direct methods on cardiac ventricular volume estimation against segmentation-based methods. Experimental results show that direct estimation methods provide more accurate estimates of cardiac ventricular volumes than segmentation-based methods. This indicates that direct estimation not only provides a convenient and mature clinical tool for cardiac volume estimation but also enables diagnosis of cardiac disease to be conducted more efficiently and reliably.
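As an illustration of the direct (segmentation-free) strategy, the sketch below regresses a volume directly from image-level features using scikit-learn; the features, target, and model choice are placeholders, not the methods evaluated in the paper.

```python
# Direct (segmentation-free) volume regression from image features.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
n_cases, n_features = 150, 64
X = rng.normal(size=(n_cases, n_features))       # stand-in image descriptors
y = 120 + 30 * X[:, :4].sum(axis=1) + rng.normal(0, 8, n_cases)  # volume (mL)

model = RandomForestRegressor(n_estimators=300, random_state=0)
mae = -cross_val_score(model, X, y, cv=5,
                       scoring="neg_mean_absolute_error").mean()
print(f"cross-validated MAE: {mae:.1f} mL")
```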
Antoniou, Stavros A; Andreou, Alexandros; Antoniou, George A; Koch, Oliver O; Köhler, Gernot; Luketina, Ruzica-R; Bertsias, Antonios; Pointner, Rudolph; Granderath, Frank-Alexander
2015-11-01
Measures have been taken to improve the methodological quality of randomized controlled trials (RCTs). This review systematically assessed trends in the volume and methodological quality of RCTs on minimally invasive surgery over a 10-year period. RCTs on minimally invasive surgery were searched in the 10 most cited general surgical journals and the 5 most cited journals of laparoscopic interest for the years 2002 and 2012. Bibliometric and methodological quality components were abstracted using the Scottish Intercollegiate Guidelines Network criteria. RCTs from traditionally low-contribution regions made up an increasing proportion of the total published RCTs, compensating for a concomitant decrease in the respective contributions from Europe and North America. International collaborations were more frequent in 2012. RCTs of acceptable or high quality accounted for 37.9% and 54.4% of RCTs published in 2002 and 2012, respectively. Components of external validity were poorly reported. Both the volume and the reporting quality of laparoscopic RCTs increased from 2002 to 2012, but there seems to be ample room for improvement in methodological quality. Copyright © 2015 Elsevier Inc. All rights reserved.
Lin, Xiaojun; Tao, Hongbing; Cai, Miao; Liao, Aihua; Cheng, Zhaohui; Lin, Haifeng
2016-01-01
Abstract Previous reviews have suggested that hospital volume is inversely related to in-hospital mortality. However, percutaneous coronary intervention (PCI) practices have changed substantially in recent years, and whether this relationship persists remains controversial. A systematic search was performed using PubMed, Embase, and the Cochrane Library to identify studies that describe the effect of hospital volume on the outcomes of PCI. Critical appraisals of the methodological quality and the risk of bias were conducted independently by 2 authors. Fourteen of 96 potentially relevant articles were included in the analysis. Twelve of the articles described the relationship between hospital volume and mortality and included data regarding odds ratios (ORs); 3 studies described the relationship between hospital volume and long-term survival, and only 1 study included data regarding hazard ratios (HRs). A meta-analysis of postoperative mortality was performed using a random effects model, and the pooled effect estimate significantly favored high volume providers (OR: 0.79; 95% confidence interval [CI], 0.72–0.86; P < 0.001). A systematic review of long-term survival was performed, and a trend toward better long-term survival in high volume hospitals was observed. This meta-analysis only included studies published after 2006 and revealed that postoperative mortality following PCI correlates significantly and inversely with hospital volume. However, the magnitude of the effect of volume on long-term survival is difficult to assess. Additional research is necessary to confirm our findings and to elucidate the mechanism underlying the volume–outcome relationship. PMID:26844508
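For readers unfamiliar with the pooling step, here is a compact DerSimonian-Laird random-effects sketch on hypothetical log odds ratios; the inputs are invented and do not reproduce the meta-analysis data.

```python
# DerSimonian-Laird random-effects pooling of log odds ratios (hypothetical).
import numpy as np

log_or = np.array([-0.30, -0.15, -0.35, -0.20, -0.25])   # per-study log ORs
se = np.array([0.10, 0.12, 0.15, 0.08, 0.20])            # per-study std. errors

w = 1 / se**2                                   # fixed-effect weights
ybar = (w * log_or).sum() / w.sum()
Q = (w * (log_or - ybar) ** 2).sum()            # heterogeneity statistic
k = len(log_or)
tau2 = max(0.0, (Q - (k - 1)) / (w.sum() - (w**2).sum() / w.sum()))

w_star = 1 / (se**2 + tau2)                     # random-effects weights
pooled = (w_star * log_or).sum() / w_star.sum()
se_pooled = np.sqrt(1 / w_star.sum())
ci = np.exp([pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled])
print("pooled OR:", np.exp(pooled), "95% CI:", ci)
```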
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
This report develops and applies a methodology for estimating strong earthquake ground motion. The motivation was to develop a much needed tool for use in developing the seismic requirements for structural designs. An earthquake's ground motion is a function of the earthquake's magnitude and the physical properties of the earth through which the seismic waves travel from the earthquake fault to the site of interest. The emphasis of this study is on ground motion estimation in Eastern North America (east of the Rocky Mountains), with particular emphasis on the Eastern United States and southeastern Canada. Eastern North America is a stable continental region, having sparse earthquake activity with rare occurrences of large earthquakes. While large earthquakes are of interest for assessing seismic hazard, little data exists from the region to empirically quantify their effects. The focus of the report is on the attributes of ground motion in Eastern North America that are of interest for the design of facilities such as nuclear power plants. This document, Volume II, contains Appendices 2, 3, 5, 6, and 7, covering the following topics: Eastern North American empirical ground motion data; examination of variance of seismographic network data; soil amplification and vertical-to-horizontal ratios from analysis of strong motion data from active tectonic regions; revision and calibration of the Ou and Herrmann method; generalized ray procedure for modeling ground motion attenuation; crustal models for velocity regionalization; depth distribution models; development of a generic site effects model; validation and comparison of one-dimensional site response methodologies; plots of amplification factors; assessment of coupling between vertical and horizontal motions in nonlinear site response analysis; and modeling of dynamic soil properties.
Back to the future: estimating pre-injury brain volume in patients with traumatic brain injury.
Ross, David E; Ochs, Alfred L; Zannoni, Megan D; Seabaugh, Jan M
2014-11-15
A recent meta-analysis by Hedman et al. allows for accurate estimation of brain volume changes throughout the life span. Additionally, Tate et al. showed that intracranial volume at a later point in life can be used to reliably estimate brain volume at an earlier point in life. These advancements were combined to create a model which allowed estimation of brain volume just prior to injury in a group of patients with mild or moderate traumatic brain injury (TBI). This volume estimation model was used in combination with actual measurements of brain volume to test hypotheses about progressive brain volume changes in the patients. Twenty-six patients with mild or moderate TBI were compared to 20 normal control subjects. NeuroQuant® was used to measure brain MRI volume. Brain volume after the injury (from MRI scans performed at t1 and t2) was compared to brain volume just before the injury (volume estimated at t0) using longitudinal designs. Groups were compared with respect to volume changes in whole brain parenchyma (WBP) and its 3 major subdivisions: cortical gray matter (GM), cerebral white matter (CWM), and subcortical nuclei + infratentorial regions (SCN+IFT). Using the normal control data, the volume estimation model was tested by comparing measured brain volume to estimated brain volume; reliability ranged from good to excellent. During the initial phase after injury (t0-t1), the TBI patients had abnormally rapid atrophy of WBP and CWM, and abnormally rapid enlargement of SCN+IFT. Rates of volume change during t0-t1 correlated with cross-sectional measures of volume change at t1, supporting the internal reliability of the volume estimation model. A logistic regression analysis using the volume change data produced a function which perfectly predicted group membership (TBI patients vs. normal control subjects). During the first few months after injury, patients with mild or moderate TBI have rapid atrophy of WBP and CWM, and rapid enlargement of SCN+IFT. The magnitude and pattern of the changes in volume may allow for the eventual development of diagnostic tools based on the volume estimation approach. Copyright © 2014 Elsevier Inc. All rights reserved.
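A schematic of the two-step logic, with loudly hypothetical constants: estimate the t0 volume from intracranial volume via a normative brain-to-ICV ratio, then project it forward with an age-dependent annual change rate. The published model uses the Hedman and Tate results; the placeholders below are not those values.

```python
# Two-step estimate of pre-injury brain volume (all constants are placeholders).
def estimate_preinjury_volume(icv_ml, age_years, months_elapsed,
                              brain_to_icv=0.79):
    # Hypothetical normative annual rate of whole-brain volume change.
    annual_change = -0.002 if age_years < 60 else -0.005
    v_t0 = brain_to_icv * icv_ml                    # volume at time of injury
    v_expected = v_t0 * (1 + annual_change) ** (months_elapsed / 12)
    return v_t0, v_expected

v0, v1_expected = estimate_preinjury_volume(icv_ml=1450, age_years=35,
                                            months_elapsed=4)
# Comparing v1_expected with the measured t1 volume isolates atrophy beyond
# what normal aging would predict.
print(round(v0, 1), round(v1_expected, 1))
```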
Effects of obesity on lung volume and capacity in children and adolescents: a systematic review.
Winck, Aline Dill; Heinzmann-Filho, João Paulo; Soares, Rafaela Borges; da Silva, Juliana Severo; Woszezenki, Cristhiele Taís; Zanatta, Letiane Bueno
2016-12-01
To assess the effects of obesity on lung volume and capacity in children and adolescents. This is a systematic review, carried out in the Pubmed, Lilacs, Scielo and PEDro databases, using the following keywords: Plethysmography; Whole Body OR Lung Volume Measurements OR Total Lung Capacity OR Functional Residual Capacity OR Residual Volume AND Obesity. Observational studies or clinical trials that assessed the effects of obesity on lung volume and capacity in children and adolescents (0-18 years) without any other associated disease, written in English, Portuguese or Spanish, were selected. Methodological quality was assessed by the Agency for Healthcare Research and Quality. Of the 1,030 articles, only four were included in the review. The studies amounted to 548 participants, predominantly males, with sample sizes ranging from 45 to 327 individuals. All of the studies evaluated nutritional status through BMI (z-score), and 50.0% reported data on abdominal circumference. All demonstrated that obesity causes negative effects on lung volume and capacity, causing a reduction mainly in functional residual capacity in 75.0% of the studies, in the expiratory reserve volume in 50.0% and in the residual volume in 25.0%. The methodological quality ranged from moderate to high, with 75.0% of the studies classified as having high methodological quality. Obesity causes deleterious effects on lung volume and capacity in children and adolescents, mainly by reducing functional residual capacity, expiratory reserve volume and residual volume. Copyright © 2016 Sociedade de Pediatria de São Paulo. Published by Elsevier Editora Ltda. All rights reserved.
ERIC Educational Resources Information Center
Instituto Nacional para la Educacion de los Adultos, Mexico City (Mexico).
The series "Self-instructional Notes on Social Participation" is a six-volume set of teaching aids for adult educators. The theoretical, methodological, informative and practical elements of this series will assist professionals in their work and help them achieve greater success. The specific purpose of each notebook is…
Effect of body composition methodology on heritability estimation of body fatness
USDA-ARS?s Scientific Manuscript database
Heritability estimates of human body fatness vary widely and the contribution of body composition methodology to this variability is unknown. The effect of body composition methodology on estimations of genetic and environmental contributions to body fatness variation was examined in 78 adult male ...
Magnetic resonance fingerprinting based on realistic vasculature in mice.
Pouliot, Philippe; Gagnon, Louis; Lam, Tina; Avti, Pramod K; Bowen, Chris; Desjardins, Michèle; Kakkar, Ashok K; Thorin, Eric; Sakadzic, Sava; Boas, David A; Lesage, Frédéric
2017-04-01
Magnetic resonance fingerprinting (MRF) was recently proposed as a novel strategy for MR data acquisition and analysis. A variant of MRF called vascular MRF (vMRF) followed, extracting maps of three parameters of physiological importance: cerebral oxygen saturation (SatO2), mean vessel radius and cerebral blood volume (CBV). However, this estimation was based on idealized two-dimensional simulations of vascular networks using random cylinders and the empirical Bloch equations convolved with a diffusion kernel. Here we focus on studying the vascular MR fingerprint using real mouse angiograms and physiological values as the substrate for the MR simulations. The MR signal is calculated ab initio with a Monte Carlo approximation, by tracking the accumulated phase from a large number of protons diffusing within the angiogram. We first study the identifiability of the parameters in simulations, showing that parameters are fully estimable at realistically high signal-to-noise ratios (SNR) when the same angiogram is used for dictionary generation and parameter estimation, but that large biases in the estimates persist when the angiograms differ. Despite these biases, simulations show that differences in parameters remain estimable. We then applied this methodology to data acquired using the GESFIDE sequence with SPIONs injected into 9 young wild-type and 9 old atherosclerotic mice. Both the pre-injection signal and the ratio of post-to-pre-injection signals were modeled, using 5-dimensional dictionaries. The vMRF methodology extracted significant differences in SatO2, mean vessel radius and CBV between the two groups, consistent across brain regions and dictionaries. Further validation work is essential before vMRF can gain wider application. Copyright © 2017 Elsevier Inc. All rights reserved.
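The ab initio signal model can be sketched compactly: protons random-walk through a field-perturbation map and accumulate phase, and the voxel signal is the magnitude of the ensemble average of exp(i*phase). The toy field below stands in for the perturbation computed from a real angiogram; all constants are illustrative.

```python
# Monte Carlo phase accumulation for diffusing protons in a toy field map.
import numpy as np

rng = np.random.default_rng(6)
n_protons, n_steps, dt = 5000, 400, 0.2e-3     # steps of 0.2 ms
D = 1e-9                                       # m^2/s, water diffusion coefficient
step_sigma = np.sqrt(2 * D * dt)               # per-axis displacement per step (m)
gamma = 2.675e8                                # 1H gyromagnetic ratio (rad/s/T)
voxel = 50e-6                                  # 50-um toy voxel (m)

def delta_B(p):
    # Toy field perturbation (T); a real run samples the angiogram-derived map.
    return 1e-8 * np.sin(2 * np.pi * p[:, 0] / voxel)

pos = rng.uniform(0, voxel, size=(n_protons, 3))
phase = np.zeros(n_protons)
for _ in range(n_steps):
    pos += rng.normal(0, step_sigma, size=pos.shape)
    phase += gamma * delta_B(pos) * dt

signal = np.abs(np.exp(1j * phase).mean())     # normalized voxel signal
print(f"signal magnitude: {signal:.4f}")
```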
Bonnet, Benjamin; Jourdan, Franck; du Cailar, Guilhem; Fesler, Pierre
2017-08-01
End-systolic left ventricular (LV) elastance (Ees) has previously been calculated and validated invasively using LV pressure-volume (P-V) loops. Noninvasive methods have been proposed, but clinical application remains complex. The aims of the present study were to 1) estimate Ees by modeling the LV P-V curve during ejection (the "ejection P-V curve" method) and validate our method against existing published LV P-V loop data and 2) test the clinical applicability of noninvasively detecting a difference in Ees between normotensive and hypertensive subjects. On the basis of the ejection P-V curve and a linear relationship between elastance and time during ejection, we used a nonlinear least-squares method to fit the pressure waveform. We then computed the slope and intercept of the time-varying elastance as well as the volume intercept (V0). As a validation, 22 P-V loops obtained from previous invasive studies were digitized and analyzed using the ejection P-V curve method. To test clinical applicability, ejection P-V curves were obtained from 33 hypertensive subjects and 32 normotensive subjects with carotid tonometry and real-time three-dimensional echocardiography during the same procedure. A good univariate relationship (r2 = 0.92, P < 0.005) and good limits of agreement were found between the invasive calculation of Ees and our new proposed ejection P-V curve method. In hypertensive patients, an increase in arterial elastance (Ea) was compensated by a parallel increase in Ees without change in Ea/Ees. In addition, the clinical reproducibility of our method was similar to that of another noninvasive method. In conclusion, Ees and V0 can be estimated noninvasively by modeling the P-V curve during ejection. This approach was found to be reproducible and sensitive enough to detect an expected increase in LV contractility in hypertensive patients. Because of its noninvasive nature, this methodology may have clinical implications in various disease states. NEW & NOTEWORTHY The use of real-time three-dimensional echocardiography-derived left ventricular volumes in conjunction with carotid tonometry was found to be reproducible and sensitive enough to detect expected differences in left ventricular elastance in arterial hypertension. Because of its noninvasive nature, this methodology may have clinical implications in various disease states. Copyright © 2017 the American Physiological Society.
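A minimal sketch of the fitting step under the stated assumption that elastance is linear in time during ejection, E(t) = alpha*t + beta, so that P(t) = E(t)*(V(t) - V0); synthetic waveforms stand in for the tonometry and echocardiography data.

```python
# Fit P(t) = (alpha*t + beta) * (V(t) - V0) on synthetic ejection-phase data.
import numpy as np
from scipy.optimize import least_squares

t = np.linspace(0.0, 0.30, 60)                    # s, ejection phase
V = 130 - 200 * t * (1 - t / 0.6)                 # mL, toy ejection volume curve
alpha_true, beta_true, V0_true = 2.0, 1.0, 15.0   # mmHg/mL/s, mmHg/mL, mL
rng = np.random.default_rng(7)
P = (alpha_true * t + beta_true) * (V - V0_true) + rng.normal(0, 1.0, t.size)

def residuals(params):
    alpha, beta, V0 = params
    return (alpha * t + beta) * (V - V0) - P

fit = least_squares(residuals, x0=[1.0, 0.5, 5.0])
alpha, beta, V0 = fit.x
print("alpha, beta, V0:", fit.x.round(2))
print("end-ejection elastance:", round(alpha * t[-1] + beta, 2), "mmHg/mL")
```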
Connectivity Measures in EEG Microstructural Sleep Elements.
Sakellariou, Dimitris; Koupparis, Andreas M; Kokkinos, Vasileios; Koutroumanidis, Michalis; Kostopoulos, George K
2016-01-01
During Non-Rapid Eye Movement (NREM) sleep the brain is relatively disconnected from the environment, while connectedness between brain areas is also decreased. Evidence indicates that these dynamic connectivity changes are delivered by microstructural elements of sleep: short periods of environmental stimuli evaluation followed by sleep-promoting procedures. The connectivity patterns of the latter, among other aspects of sleep microstructure, are still to be fully elucidated. We suggest here a methodology for the assessment and investigation of the connectivity patterns of EEG microstructural elements, such as sleep spindles. The methodology combines techniques at the preprocessing, estimation, error-assessment and results-visualization levels in order to allow detailed examination of the connectivity aspects (levels and directionality of information flow) over frequency and time with notable resolution, while dealing with volume conduction and EEG reference assessment. The high temporal and frequency resolution of the methodology will allow the association between the microelements and the dynamically forming networks that characterize them, and consequently possibly reveal aspects of the EEG microstructure. The proposed methodology is initially tested on artificially generated signals for proof of concept and subsequently applied to real EEG recordings via a custom-built MATLAB-based tool developed for such studies. Preliminary results from 843 fast sleep spindles recorded in whole-night sleep of 5 healthy volunteers indicate a prevailing pattern of interactions between centroparietal and frontal regions. We demonstrate hereby what is, to our knowledge, a first attempt to estimate the scalp EEG connectivity that characterizes fast sleep spindles via the "EEG-element connectivity" methodology we propose. Its application, via the computational tool we developed, suggests that it is able to investigate the connectivity patterns related to the occurrence of EEG microstructural elements. Network characterization of specific physiological or pathological EEG microstructural elements can potentially be of great importance in the understanding, identification, and prediction of health and disease.
Connectivity Measures in EEG Microstructural Sleep Elements
Sakellariou, Dimitris; Koupparis, Andreas M.; Kokkinos, Vasileios; Koutroumanidis, Michalis; Kostopoulos, George K.
2016-01-01
During Non-Rapid Eye Movement (NREM) sleep the brain is relatively disconnected from the environment, while connectedness between brain areas is also decreased. Evidence indicates that these dynamic connectivity changes are delivered by microstructural elements of sleep: short periods of environmental stimuli evaluation followed by sleep-promoting procedures. The connectivity patterns of the latter, among other aspects of sleep microstructure, are still to be fully elucidated. We suggest here a methodology for the assessment and investigation of the connectivity patterns of EEG microstructural elements, such as sleep spindles. The methodology combines techniques at the preprocessing, estimation, error-assessment and results-visualization levels in order to allow detailed examination of the connectivity aspects (levels and directionality of information flow) over frequency and time with notable resolution, while dealing with volume conduction and EEG reference assessment. The high temporal and frequency resolution of the methodology will allow the association between the microelements and the dynamically forming networks that characterize them, and consequently possibly reveal aspects of the EEG microstructure. The proposed methodology is initially tested on artificially generated signals for proof of concept and subsequently applied to real EEG recordings via a custom-built MATLAB-based tool developed for such studies. Preliminary results from 843 fast sleep spindles recorded in whole-night sleep of 5 healthy volunteers indicate a prevailing pattern of interactions between centroparietal and frontal regions. We demonstrate hereby what is, to our knowledge, a first attempt to estimate the scalp EEG connectivity that characterizes fast sleep spindles via the “EEG-element connectivity” methodology we propose. Its application, via the computational tool we developed, suggests that it is able to investigate the connectivity patterns related to the occurrence of EEG microstructural elements. Network characterization of specific physiological or pathological EEG microstructural elements can potentially be of great importance in the understanding, identification, and prediction of health and disease. PMID:26924980
Harinipriya, S; Sangaranarayanan, M V
2006-01-31
The evaluation of the free energy of activation pertaining to the electron-transfer reactions occurring at liquid/liquid interfaces is carried out employing a diffuse boundary model. The interfacial solvation numbers are estimated using a lattice gas model under the quasichemical approximation. The standard reduction potentials of the redox couples, appropriate inner potential differences, dielectric permittivities, as well as the width of the interface are included in the analysis. The methodology is applied to the reaction between [Fe(CN)6](3-/4-) and [Lu(biphthalocyanine)](3+/4+) at water/1,2-dichloroethane interface. The rate-determining step is inferred from the estimated free energy of activation for the constituent processes. The results indicate that the solvent shielding effect and the desolvation of the reactants at the interface play a central role in dictating the free energy of activation. The heterogeneous electron-transfer rate constant is evaluated from the molar reaction volume and the frequency factor.
Uncertainty quantification applied to the radiological characterization of radioactive waste.
Zaffora, B; Magistris, M; Saporta, G; Chevalier, J-P
2017-09-01
This paper describes the process adopted at the European Organization for Nuclear Research (CERN) to quantify uncertainties affecting the characterization of very-low-level radioactive waste. Radioactive waste is a by-product of the operation of high-energy particle accelerators. Radioactive waste must be characterized to ensure its safe disposal in final repositories. Characterizing radioactive waste means establishing the list of radionuclides together with their activities. The estimated activity levels are compared to the limits given by the national authority of the waste disposal. Quantifying the uncertainty affecting the concentrations of the radionuclides is therefore essential to estimate the acceptability of the waste in the final repository, and also to control the sorting, volume reduction and packaging phases of the characterization process. The characterization method consists of estimating the activity of the radionuclides produced, either by experimental methods or by statistical approaches. The uncertainties are estimated using classical statistical methods and uncertainty propagation. A mixed multivariate random vector is built to generate random input parameters for the activity calculations. The random vector is a robust tool for accounting for the unknown radiological history of legacy waste. This analytical technique is also particularly useful for generating random chemical compositions of materials when the trace element concentrations are not available or cannot be measured. The methodology was validated using a waste population of legacy copper activated at CERN. The methodology introduced here represents a first approach to uncertainty quantification (UQ) for the characterization process of waste produced at particle accelerators. Copyright © 2017 Elsevier Ltd. All rights reserved.
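A minimal sketch of the propagation step: draw the uncertain inputs from assumed distributions, compute the resulting activity, and report percentiles. The distributions, the activation constant, and the Co-60 example are illustrative, not CERN's actual parameters.

```python
# Monte Carlo propagation of input uncertainty to an activity estimate.
import numpy as np

rng = np.random.default_rng(8)
n = 100_000
cobalt_ppm = rng.lognormal(mean=np.log(0.2), sigma=0.7, size=n)  # trace Co in steel
irradiation_factor = rng.uniform(0.5, 1.5, size=n)               # unknown history

k_activation = 3.0       # hypothetical Bq per ppm per unit flux factor
activity = k_activation * cobalt_ppm * irradiation_factor        # Co-60 (Bq/g)

p2_5, p50, p97_5 = np.percentile(activity, [2.5, 50, 97.5])
print(f"Co-60: median {p50:.2f} Bq/g, 95% interval [{p2_5:.2f}, {p97_5:.2f}]")
```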
Estimating daily forest carbon fluxes using a combination of ground and remotely sensed data
NASA Astrophysics Data System (ADS)
Chirici, Gherardo; Chiesi, Marta; Corona, Piermaria; Salvati, Riccardo; Papale, Dario; Fibbi, Luca; Sirca, Costantino; Spano, Donatella; Duce, Pierpaolo; Marras, Serena; Matteucci, Giorgio; Cescatti, Alessandro; Maselli, Fabio
2016-02-01
Several studies have demonstrated that Monteith's approach can efficiently predict forest gross primary production (GPP), while the modeling of net ecosystem production (NEP) is more critical, requiring the additional simulation of forest respiration. The NEP of different forest ecosystems in Italy is simulated here by the combined use of a remote sensing driven parametric model (modified C-Fix) and a biogeochemical model (BIOME-BGC). The outputs of the two models, which simulate forests in quasi-equilibrium conditions, are combined to estimate the carbon fluxes under actual conditions using information regarding the existing woody biomass. The estimates derived from the methodology have been tested against daily reference GPP and NEP data collected through the eddy correlation technique at five study sites in Italy. The first test concerned the theoretical validity of the simulation approach at both annual and daily time scales and was performed using optimal model drivers (i.e., collected or calibrated over the site measurements). Next, the test was repeated to assess the operational applicability of the methodology, which was driven by spatially extended data sets (i.e., data derived from existing wall-to-wall digital maps). A good estimation accuracy was generally obtained for GPP and NEP when using optimal model drivers. The use of spatially extended data sets worsens the accuracy to a varying degree, which is properly characterized. The model drivers with the most influence on the flux modeling strategy are, in increasing order of importance, forest type, soil features, meteorology, and forest woody biomass (growing stock volume).
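Monteith's light-use efficiency logic can be summarized in a few lines. The sketch below is a generic, hypothetical formulation (all parameter values invented, not those of modified C-Fix): daily GPP is the product of absorbed PAR and a maximum light-use efficiency down-regulated by meteorological scalars.

```python
def gpp_monteith(par, fapar, eps_max, f_temp, f_water):
    """Daily GPP (gC/m2/day) as light-use efficiency times absorbed PAR.

    par:     incident photosynthetically active radiation (MJ/m2/day)
    fapar:   fraction of PAR absorbed by the canopy (e.g., from NDVI)
    eps_max: maximum light-use efficiency (gC/MJ)
    f_temp, f_water: 0-1 meteorological down-regulation scalars
    """
    return eps_max * f_temp * f_water * fapar * par

# Hypothetical values for a single summer day at a forest site
print(gpp_monteith(par=10.5, fapar=0.85, eps_max=1.2, f_temp=0.9, f_water=0.7))
```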
Automatic tree parameter extraction by a Mobile LiDAR System in an urban context.
Herrero-Huerta, Mónica; Lindenbergh, Roderik; Rodríguez-Gonzálvez, Pablo
2018-01-01
In an urban context, tree data are used in city planning, in locating hazardous trees and in environmental monitoring. This study focuses on developing an innovative methodology to automatically estimate the most relevant individual structural parameters of urban trees sampled by a Mobile LiDAR System at city level. These parameters include the Diameter at Breast Height (DBH), which was estimated by circle fitting of the points belonging to different height bins using RANSAC. In the case of non-circular trees, DBH is calculated by the maximum distance between extreme points. Tree sizes were extracted through a connectivity analysis. Crown Base Height, defined as the length until the bottom of the live crown, was calculated by voxelization techniques. For estimating Canopy Volume, procedures of mesh generation and α-shape methods were implemented. Also, tree location coordinates were obtained by means of Principal Component Analysis. The workflow has been validated on 29 trees of different species sampling a stretch of road 750 m long in Delft (The Netherlands) and tested on a larger dataset containing 58 individual trees. The validation was done against field measurements. The DBH estimates had a correlation (R2) of 0.92 for the 20 cm height bin, which provided the best results. Moreover, the influence of the number of points used for DBH estimation, considering different height bins, was investigated. The assessment of the other inventory parameters yielded correlation coefficients higher than 0.91. The quality of the results confirms the feasibility of the proposed methodology, providing scalability to a comprehensive analysis of urban trees. PMID:29689076
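To make the circle-fitting step concrete, here is a minimal RANSAC sketch under stated assumptions: it takes a 2D slice of trunk points for one height bin (hypothetical input, in meters), repeatedly fits circles through random point triples, and returns the diameter of the best-supported circle. It illustrates the general technique, not the authors' implementation.

```python
import numpy as np

def fit_circle_3pts(p1, p2, p3):
    # Circumcenter of three 2D points; returns (center, radius) or None.
    ax, ay = p1; bx, by = p2; cx, cy = p3
    d = 2 * (ax*(by-cy) + bx*(cy-ay) + cx*(ay-by))
    if abs(d) < 1e-12:
        return None  # collinear sample
    ux = ((ax**2+ay**2)*(by-cy) + (bx**2+by**2)*(cy-ay) + (cx**2+cy**2)*(ay-by)) / d
    uy = ((ax**2+ay**2)*(cx-bx) + (bx**2+by**2)*(ax-cx) + (cx**2+cy**2)*(bx-ax)) / d
    center = np.array([ux, uy])
    return center, np.linalg.norm(p1 - center)

def ransac_dbh(points, n_iter=500, tol=0.01, rng=None):
    """Estimate stem diameter (m) from a 2D slice of trunk points."""
    rng = rng if rng is not None else np.random.default_rng(0)
    best_inliers, best_r = 0, 0.0
    for _ in range(n_iter):
        sample = points[rng.choice(len(points), 3, replace=False)]
        fit = fit_circle_3pts(*sample)
        if fit is None:
            continue
        center, r = fit
        # Inliers lie within `tol` (here 1 cm) of the fitted circle.
        residuals = np.abs(np.linalg.norm(points - center, axis=1) - r)
        inliers = int(np.sum(residuals < tol))
        if inliers > best_inliers:
            best_inliers, best_r = inliers, r
    return 2 * best_r  # diameter
```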
A novel method for blood volume estimation using trivalent chromium in rabbit models.
Baby, Prathap Moothamadathil; Kumar, Pramod; Kumar, Rajesh; Jacob, Sanu S; Rawat, Dinesh; Binu, V S; Karun, Kalesh M
2014-05-01
Blood volume measurement, though important in the management of critically ill patients, is not routinely performed in clinical practice owing to the labour-intensive, intricate and time-consuming nature of existing methods. The aim was to compare blood volume estimations using trivalent chromium [(51)Cr(III)] and the standard Evans blue dye (EBD) method in New Zealand white rabbit models and to establish a correction factor (CF). Blood volume estimation in 33 rabbits was carried out using the EBD method, with concentration determined by spectrophotometric assay, followed by blood volume estimation using direct injection of (51)Cr(III). Twenty of the 33 rabbits were used to derive the CF by dividing the blood volume estimated using EBD by that estimated using (51)Cr(III). The CF was validated in 13 rabbits by multiplying it with the blood volume values obtained using (51)Cr(III). The mean circulating blood volume of the 33 rabbits using EBD was 142.02 ± 22.77 ml or 65.76 ± 9.31 ml/kg, and using (51)Cr(III) was estimated to be 195.66 ± 47.30 ml or 89.81 ± 17.88 ml/kg. The CF was found to be 0.77. The mean blood volume of the 13 rabbits measured using EBD was 139.54 ± 27.19 ml or 66.33 ± 8.26 ml/kg, and using (51)Cr(III) with the CF was 152.73 ± 46.25 ml or 71.87 ± 13.81 ml/kg (P = 0.11). The estimation of blood volume using (51)Cr(III) was comparable to the standard EBD method when using the CF. With further research in this direction, we envisage human blood volume estimation using (51)Cr(III) finding application in acute clinical settings.
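Both tracer methods rest on the indicator-dilution principle: volume equals injected dose divided by equilibrium concentration. The sketch below uses invented dose and concentration values to illustrate the computation and the correction factor; it is not the study's protocol.

```python
def dilution_volume(dose, concentration):
    """Blood volume (ml) = tracer dose / equilibrium concentration."""
    return dose / concentration

# Hypothetical single-animal numbers chosen to mimic the reported means.
v_ebd = dilution_volume(dose=2.5, concentration=0.0176)   # mg, mg/ml -> ~142 ml
v_cr = dilution_volume(dose=50.0, concentration=0.2556)   # kBq, kBq/ml -> ~196 ml

cf = v_ebd / v_cr   # per-animal ratio; the study's group-mean CF was 0.77
print(f"CF = {cf:.2f}, corrected Cr volume = {cf * v_cr:.0f} ml")
```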
Passenger rail vehicle safety assessment methodology. Volume I, Summary of safe performance limits.
DOT National Transportation Integrated Search
2000-04-01
This report presents a methodology based on computer simulation that assesses the safe dynamic performance limits of commuter passenger vehicles. The methodology consists of determining the critical design parameters and characteristic properties of bo...
Methodology for estimating soil carbon for the forest carbon budget model of the United States, 2001
L. S. Heath; R. A. Birdsey; D. W. Williams
2002-01-01
The largest carbon (C) pool in United States forests is the soil C pool. We present methodology and soil C pool estimates used in the FORCARB model, which estimates and projects forest carbon budgets for the United States. The methodology balances knowledge, uncertainties, and ease of use. The estimates are calculated using the USDA Natural Resources Conservation...
Railroad Classification Yard Technology Manual: Volume II : Yard Computer Systems
DOT National Transportation Integrated Search
1981-08-01
This volume (Volume II) of the Railroad Classification Yard Technology Manual documents the railroad classification yard computer systems methodology. The subjects covered are: functional description of process control and inventory computer systems,...
Calculation of the radiative properties of photosynthetic microorganisms
NASA Astrophysics Data System (ADS)
Dauchet, Jérémi; Blanco, Stéphane; Cornet, Jean-François; Fournier, Richard
2015-08-01
A generic methodological chain for the predictive calculation of the light-scattering and absorption properties of photosynthetic microorganisms within the visible spectrum is presented here. This methodology has been developed in order to provide the radiative properties needed for the analysis of radiative transfer within photobioreactor processes, with a view to enabling their optimization for large-scale sustainable production of chemicals for energy and chemistry. It gathers an electromagnetic model of light-particle interaction along with detailed and validated protocols for the determination of input parameters: morphological and structural characteristics of the studied microorganisms as well as their photosynthetic-pigment content. The microorganisms are described as homogeneous equivalent particles whose shape and size distribution is characterized by image analysis. The imaginary part of their refractive index is obtained thanks to a new and quite extended database of the in vivo absorption spectra of photosynthetic pigments (that is made available to the reader). The real part of the refractive index is then calculated by using the singly subtractive Kramers-Kronig approximation, for which the anchor point is determined with the Bruggeman mixing rule, based on the volume fraction of the microorganism internal structures and their refractive indices (extracted from a database). Afterwards, the radiative properties are estimated using the Schiff approximation for spheroidal or cylindrical particles, as a first step toward the description of the complexity and diversity of the shapes encountered within the microbial world. Finally, these predictive results are compared with experimental normal-hemispherical transmittance spectra for validation. This entire procedure is implemented for Rhodospirillum rubrum, Arthrospira platensis and Chlamydomonas reinhardtii, each representative of the three main kinds of photosynthetic microorganisms, i.e. respectively photosynthetic bacteria, cyanobacteria and eukaryotic microalgae. The obtained results are in very good agreement with the experimental measurements when the shape of the microorganisms is well described (in comparison to the standard volume-equivalent sphere approximation). As a main perspective, the consideration of the helical shape of Arthrospira platensis appears to be key to an accurate estimation of its radiative properties. On the whole, the presented methodological chain also appears of great interest for other scientific communities such as atmospheric science, oceanography, astrophysics and engineering.
NASA Technical Reports Server (NTRS)
Liu, Hua-Kuang (Editor); Schenker, Paul (Editor)
1987-01-01
The papers presented in this volume provide an overview of current research in both optical and digital pattern recognition, with a theme of identifying overlapping research problems and methodologies. Topics discussed include image analysis and low-level vision, optical system design, object analysis and recognition, real-time hybrid architectures and algorithms, high-level image understanding, and optical matched filter design. Papers are presented on synthetic estimation filters for a control system; white-light correlator character recognition; optical AI architectures for intelligent sensors; interpreting aerial photographs by segmentation and search; and optical information processing using a new photopolymer.
Houseknecht, David W.; Coleman, James L.; Milici, Robert C.; Garrity, Christopher P.; Rouse, William A.; Fulk, Bryant R.; Paxton, Stanley T.; Abbott, Marvin M.; Mars, John L.; Cook, Troy A.; Schenk, Christopher J.; Charpentier, Ronald R.; Klett, Timothy R.; Pollastro, Richard M.; Ellis, Geoffrey S.
2010-01-01
Using a geology-based assessment methodology, the U.S. Geological Survey estimated mean volumes of 38 trillion cubic feet (TCF) of undiscovered natural gas, 159 million barrels of natural gas liquid (MMBNGL), and no oil in accumulations of 0.5 million barrels (MMBO) or larger in the Arkoma Basin Province and related areas. More than 97 percent of the undiscovered gas occurs in continuous accumulations: 70 percent in shale gas formations, 18 percent in a basin-centered accumulation with tight sandstone reservoirs, and 9 percent in coal beds. Less than 3 percent of the natural gas occurs in conventional accumulations.
Shpotyuk, Oleh; Filipecki, Jacek; Ingram, Adam; Golovchak, Roman; Vakiv, Mykola; Klym, Halyna; Balitska, Valentyna; Shpotyuk, Mykhaylo; Kozdras, Andrzej
2015-01-01
Methodological possibilities of positron annihilation lifetime (PAL) spectroscopy applied to characterize different types of nanomaterials treated within a three-term fitting procedure are critically reconsidered. In contrast to conventional three-term analysis based on admixed positron- and positronium-trapping modes, the process of nanostructurization is considered as substitutional positron-positronium trapping within the same host matrix. The developed formalism allows estimation of the interfacial void volumes responsible for positron trapping, as well as the characteristic bulk positron lifetimes, in nanoparticle-affected inhomogeneous media. The algorithm was validated using the example of thermally induced nanostructurization occurring in 80GeSe2-20Ga2Se3 glass.
K-Means Subject Matter Expert Refined Topic Model Methodology
2017-01-01
K-means Subject Matter Expert Refined Topic Model Methodology: Topic Model Estimation via K-means. U.S. Army TRADOC Analysis Center-Monterey, 700 Dyer Road... January 2017. Theodore T. Allen, Ph.D.; Zhenhuan... Contract number W9124N-15-P-0022.
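As a generic illustration of topic estimation via k-means (a sketch, not the report's code), documents can be embedded as TF-IDF vectors and clustered; each centroid's top-weighted terms then label a candidate topic for subject matter experts to refine. The tiny document list below is hypothetical, and scikit-learn is assumed.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# Hypothetical mini-corpus; real inputs would be free-text survey comments.
docs = ["troops move supplies north", "supply convoy logistics", "radio comms failure"]

vec = TfidfVectorizer(stop_words="english")
X = vec.fit_transform(docs)                       # sparse TF-IDF matrix

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
terms = vec.get_feature_names_out()
for k, centroid in enumerate(km.cluster_centers_):
    top = centroid.argsort()[::-1][:3]            # highest-weight terms
    print(f"topic {k}:", [terms[i] for i in top])
```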
Evaluation Methodology. The Evaluation Exchange. Volume 11, Number 2, Summer 2005
ERIC Educational Resources Information Center
Coffman, Julia, Ed.
2005-01-01
This is the third issue of "The Evaluation Exchange" devoted entirely to the theme of methodology, though every issue tries to identify new methodological choices, the instructive ways in which people have applied or combined different methods, and emerging methodological trends. For example, lately "theories of change" have gained almost…
Wetzel, Hermann
2006-01-01
In a large number of mostly retrospective association studies, a statistical relationship between volume and quality of health care has been reported. However, the relevance of these results is frequently limited by methodological shortcomings. In this article, criteria for the evidence and definition of thresholds for volume-outcome relations are proposed, e.g. the specification of relevant outcomes for quality indicators, analysis of volume as a continuous variable with an adequate case-mix and risk adjustment, accounting for cluster effects and considering mathematical models for the derivation of cut-off values. Moreover, volume thresholds are regarded as surrogate parameters for the indirect classification of the quality of care, whose diagnostic validity and effectiveness in improving health care quality need to be evaluated in prospective studies.
Guidelines for estimating volume, biomass, and smoke production for piled slash.
Colin C. Hardy
1998-01-01
Guidelines in the form of a six-step approach are provided for estimating volumes, oven-dry mass, consumption, and particulate matter emissions for piled logging debris. Seven stylized pile shapes and their associated geometric volume formulae are used to estimate gross pile volumes. The gross volumes are then reduced to net wood volume by applying an appropriate wood-...
Mesospheric gravity wave momentum flux estimation using hybrid Doppler interferometry
NASA Astrophysics Data System (ADS)
Spargo, Andrew J.; Reid, Iain M.; MacKinnon, Andrew D.; Holdsworth, David A.
2017-06-01
Mesospheric gravity wave (GW) momentum flux estimates using data from multibeam Buckland Park MF radar (34.6° S, 138.5° E) experiments (conducted from July 1997 to June 1998) are presented. On transmission, five Doppler beams were symmetrically steered about the zenith (one zenith beam and four off-zenith beams in the cardinal directions). The received beams were analysed with hybrid Doppler interferometry (HDI) (Holdsworth and Reid, 1998), principally to determine the radial velocities of the effective scattering centres illuminated by the radar. The methodology of Thorsen et al. (1997), later re-introduced by Hocking (2005) and since extensively applied to meteor radar returns, was used to estimate components of Reynolds stress due to propagating GWs and/or turbulence in the radar resolution volume. Physically reasonable momentum flux estimates are derived from the Reynolds stress components, which are also verified using a simple radar model incorporating GW-induced wind perturbations. On the basis of these results, we recommend the intercomparison of momentum flux estimates between co-located meteor radars and vertical-beam interferometric MF radars. It is envisaged that such intercomparisons will assist with the clarification of recent concerns (e.g. Vincent et al., 2010) of the accuracy of the meteor radar technique.
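A simplified special case of the Reynolds stress estimation is the classic dual-beam formula, in which the vertical flux of zonal momentum follows from the variance difference between east- and west-pointing off-zenith beams. The sketch below assumes radial-velocity perturbation time series and a 15 degree zenith angle (both hypothetical); the paper's actual method solves the more general least-squares problem across all five beams.

```python
import numpy as np

def uw_flux(vr_east, vr_west, zenith_deg=15.0):
    """Kinematic momentum flux <u'w'> (m2/s2) from opposing off-zenith beams.

    Dual-beam estimate: <u'w'> = (var_E - var_W) / (2 sin(2*theta)),
    applied to radial-velocity time series from east/west beams.
    """
    theta = np.radians(zenith_deg)
    return (np.var(vr_east) - np.var(vr_west)) / (2 * np.sin(2 * theta))
```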
Varughese, J K; Wentzel-Larsen, T; Vassbotn, F; Moen, G; Lund-Johansen, M
2010-04-01
In this volumetric study of the vestibular schwannoma (VS), we evaluated the accuracy and reliability of several approximation methods that are in use, and determined the minimum volume difference that needs to be measured for it to be attributable to an actual difference rather than a retest error. We also found empirical proportionality coefficients for the different methods. This was a methodological study investigating three VS measurement methods against a reference method based on serial slice volume estimates. The approximation methods were based on (i) a single diameter, (ii) three orthogonal diameters or (iii) the maximal slice area. Altogether 252 T1-weighted MRI images with gadolinium contrast, from 139 VS patients, were examined. The retest errors, in terms of relative percentages, were determined by undertaking repeated measurements on 63 scans for each method. Intraclass correlation coefficients were used to assess the agreement between each of the approximation methods and the reference method. The tendency for approximation methods to systematically overestimate/underestimate different-sized tumours was also assessed, with the help of Bland-Altman plots. The most commonly used approximation method, the maximum diameter, was the least reliable measurement method and has inherent weaknesses that need to be considered. These include greater retest errors than area-based measurements (25% and 15%, respectively), and the fact that it was the only approximation method that could not easily be converted into volumetric units. Area-based measurements can furthermore be more reliable for smaller volume differences than diameter-based measurements. All our findings suggest that the maximum diameter should not be used as an approximation method. We propose the use of measurement modalities that take into account growth in multiple dimensions instead.
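The diameter-based approximations have simple closed forms. A minimal sketch with hypothetical tumour dimensions, showing the sphere-equivalent (single-diameter) and ellipsoid (three-diameter) estimates:

```python
import numpy as np

def vol_single_diameter(d):
    """Sphere-equivalent volume from one maximum diameter (cm -> cm3)."""
    return np.pi / 6 * d**3

def vol_three_diameters(a, b, c):
    """Ellipsoid volume from three orthogonal diameters (cm -> cm3)."""
    return np.pi / 6 * a * b * c

# Hypothetical 2.1 x 1.8 x 1.5 cm tumour
print(vol_single_diameter(2.1), vol_three_diameters(2.1, 1.8, 1.5))
```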
NASA Astrophysics Data System (ADS)
Bosca, Ryan J.; Jackson, Edward F.
2016-01-01
Assessing and mitigating the various sources of bias and variance associated with image quantification algorithms is essential to the use of such algorithms in clinical research and practice. Assessment is usually accomplished with grid-based digital reference objects (DRO) or, more recently, digital anthropomorphic phantoms based on normal human anatomy. Publicly available digital anthropomorphic phantoms can provide a basis for generating realistic model-based DROs that incorporate the heterogeneity commonly found in pathology. Using a publicly available vascular input function (VIF) and digital anthropomorphic phantom of a normal human brain, a methodology was developed to generate a DRO based on the general kinetic model (GKM) that represented realistic and heterogeneously enhancing pathology. GKM parameters were estimated from a deidentified clinical dynamic contrast-enhanced (DCE) MRI exam. This clinical imaging volume was co-registered with a discrete tissue model, and model parameters estimated from clinical images were used to synthesize a DCE-MRI exam that consisted of normal brain tissues and a heterogeneously enhancing brain tumor. An example application of spatial smoothing was used to illustrate potential applications in assessing quantitative imaging algorithms. A voxel-wise Bland-Altman analysis demonstrated negligible differences between the parameters estimated with and without spatial smoothing (using a small radius Gaussian kernel). In this work, we reported an extensible methodology for generating model-based anthropomorphic DROs containing normal and pathological tissue that can be used to assess quantitative imaging algorithms.
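For readers unfamiliar with the GKM, the sketch below synthesizes a tissue concentration curve by convolving a vascular input function with the model's exponential impulse response; the gamma-variate VIF and parameter values are hypothetical stand-ins for the published VIF and the estimated parameter maps.

```python
import numpy as np

def tofts_concentration(t, cp, ktrans, ve):
    """Standard Tofts model: Ct(t) = Ktrans * int Cp(tau) exp(-kep (t - tau)) dtau."""
    kep = ktrans / ve                               # efflux rate constant (1/s)
    dt = t[1] - t[0]
    irf = np.exp(-kep * t)                          # impulse response function
    return ktrans * np.convolve(cp, irf)[:len(t)] * dt

t = np.arange(0, 300, 1.0)                          # seconds
cp = 5.0 * (t / 30.0) * np.exp(1 - t / 30.0)        # hypothetical gamma-variate VIF (mM)
ct = tofts_concentration(t, cp, ktrans=0.12 / 60, ve=0.3)  # Ktrans given in 1/s
```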
NASA Astrophysics Data System (ADS)
Vulpiani, Gianfranco; Ripepe, Maurizio
2017-04-01
The detection and quantitative retrieval of ash plumes is of significant interest due to the environmental, climatic, and socioeconomic effects of ash fallout, which can cause hardship and damage in areas surrounding volcanoes and represents a serious hazard to aircraft. Real-time monitoring of such phenomena is crucial for initializing ash dispersion models. Ground-based and space-borne remote sensing observations provide essential information for scientific and operational applications. Satellite visible-infrared radiometric observations from geostationary platforms are usually exploited for long-range trajectory tracking and for measuring low-level eruptions. Their imagery is available every 10-30 min and suffers from a relatively poor spatial resolution. Moreover, the field of view of geostationary radiometric measurements may be blocked by water and ice clouds at higher levels, and the observations' overall utility is reduced at night. Ground-based microwave weather radars may represent an important tool for detecting and, to a certain extent, mitigating the hazards presented by ash clouds. The possibility of monitoring in all weather conditions at a fairly high spatial resolution (less than a few hundred meters) and every few minutes after the eruption is the major advantage of using ground-based microwave radar systems. Ground-based weather radar systems can also provide data for estimating the ash volume, total mass, and height of eruption clouds. Previous methodological studies have investigated the possibility of using ground-based single- and dual-polarization radar systems for the remote sensing of volcanic ash clouds. In the present work, the methodology was revised to overcome some limitations related to the assumed microphysics. New scattering simulations based on the T-matrix solution technique were used to set up the parametric algorithms adopted to estimate the mass concentration and ash mean diameter. Furthermore, because quantitative estimation of the erupted materials in the proximity of the volcano's vent is crucial for initializing transportation models, a novel methodology for estimating a volcano eruption's mass discharge rate (MDR), based on the combination of radar and a thermal camera, was developed. We show how it is possible to calculate the mass flow using radar-derived ash concentration and particle diameter at the base of the eruption column, together with the exit velocity estimated by the thermal camera. The proposed procedure was tested on four Etna eruption episodes that occurred in December 2015, as observed by the available network of C- and X-band radar systems. The results are congruent with other independent methodologies and observations. The agreement between the total erupted mass derived from the retrieved MDR and the plume concentration can be considered a self-consistent methodological assessment. Interestingly, the analysis of the polarimetric radar observations allowed us to derive some features of the ash plume, including the size of the eruption column and the height of the gas thrust region.
Sommers, A D
2011-05-03
Liquid droplets on micropatterned surfaces consisting of parallel grooves tens of micrometers in width and depth are considered, and a method for calculating the droplet volume on these surfaces is presented. This model, which utilizes the elongated and parallel-sided nature of droplets condensed on these microgrooved surfaces, requires inputs from two droplet images at ϕ = 0° and ϕ = 90°--namely, the droplet major axis, minor axis, height, and two contact angles. In this method, a circular cross-sectional area is extruded the length of the droplet where the chord of the extruded circle is fixed by the width of the droplet. The maximum apparent contact angle is assumed to occur along the side of the droplet because of the surface energy barrier to wetting imposed by the grooves--a behavior that was observed experimentally. When applied to water droplets condensed onto a microgrooved aluminum surface, this method was shown to calculate the actual droplet volume to within 10% for 88% of the droplets analyzed. This method is useful for estimating the volume of retained droplets on topographically modified, anisotropic surfaces where both heat and mass transfer occur and the surface microchannels are aligned parallel to gravity to assist in condensate drainage.
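The extruded-segment construction reduces to elementary circle geometry. A minimal sketch under the stated assumption of a circular cross-section whose chord equals the droplet width, with hypothetical droplet dimensions:

```python
import numpy as np

def droplet_volume(width, height, length):
    """Volume of an elongated droplet as an extruded circular segment.

    The cross-section is a circular cap of chord `width` and cap height
    `height`; the cap area is extruded over the droplet `length`.
    """
    r = width**2 / (8 * height) + height / 2   # circle radius from chord + height
    # Segment area = R^2 arccos((R-h)/R) - (R-h) * (w/2)
    area = r**2 * np.arccos((r - height) / r) - (r - height) * (width / 2)
    return area * length

# Hypothetical 1.0 mm wide, 0.4 mm tall, 3.0 mm long condensed droplet
print(droplet_volume(1.0, 0.4, 3.0), "mm^3")
```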
Helal-Neto, Edward; Cabezas, Santiago Sánchez; Sancenón, Félix; Martínez-Máñez, Ramón; Santos-Oliveira, Ralph
2018-05-10
The use of monoclonal antibodies (Mab) in current medicine is increasing. Antibody-drug conjugates (ADCs) represent an increasingly important modality for treating several types of cancer. In this area, the use of Mab associated with nanoparticles is a valuable strategy. However, the methodologies used to calculate Mab entrapment, efficiency and content are extremely expensive. In this study we developed and tested a novel, very simple one-step methodology to calculate monoclonal antibody entrapment in mesoporous silica (magnetic core) nanoparticles using the radiolabeling process as the primary methodology. The magnetic-core mesoporous silica nanoparticles were successfully developed and characterised. The PXRD analysis at high angles confirmed the presence of magnetic cores in the structures, and transmission electron microscopy allowed determination of the structure size (58.9 ± 8.1 nm). From the isotherm curve, a specific surface area of 872 m²/g was estimated, along with a pore volume of 0.85 cm³/g and an average pore diameter of 3.15 nm. The radiolabeling process used for the indirect determination performed well. Trastuzumab was successfully labeled (>97%) with Tc-99m, generating a clear suspension. Moreover, almost all the Tc-99m used (labeling the trastuzumab) remained trapped on the surface of the mesoporous silica for a period as long as 8 h. The indirect methodology demonstrated high entrapment of Tc-99m-trastuzumab on the magnetic-core mesoporous silica surface. The results confirmed the potential of the indirect entrapment-efficiency methodology using the radiolabeling process as a one-step, easy and cheap methodology. Copyright © 2018 Elsevier B.V. All rights reserved.
A new methodology for estimating nuclear casualties as a function of time.
Zirkle, Robert A; Walsh, Terri J; Disraelly, Deena S; Curling, Carl A
2011-09-01
The Human Response Injury Profile (HRIP) nuclear methodology provides an estimate of casualties occurring as a consequence of nuclear attacks against military targets for planning purposes. The approach develops user-defined, time-based casualty and fatality estimates based on progressions of underlying symptoms and their severity changes over time. This paper provides a description of the HRIP nuclear methodology and its development, including inputs, human response and the casualty estimation process.
Analyzing the requirements for mass production of small wind turbine generators
NASA Astrophysics Data System (ADS)
Anuskiewicz, T.; Asmussen, J.; Frankenfield, O.
Mass producibility of small wind turbine generators to give manufacturers design and cost data for profitable production operations is discussed. A 15 kW wind turbine generator for production in annual volumes from 1,000 to 50,000 units is discussed. Methodology to cost the systems effectively is explained. The process estimate sequence followed is outlined with emphasis on the process estimate sheets compiled for each component and subsystem. These data enabled analysts to develop cost breakdown profiles crucial in manufacturing decision-making. The appraisal also led to various design recommendations including replacement of aluminum towers with cost effective carbon steel towers. Extensive cost information is supplied in tables covering subassemblies, capital requirements, and levelized energy costs. The physical layout of the plant is depicted to guide manufacturers in taking advantage of the growing business opportunity now offered in conjunction with the national need for energy development.
NASA Technical Reports Server (NTRS)
Scargle, Jeffrey D.; Way, M. J.; Gazis, P. G.
2017-01-01
We demonstrate the effectiveness of a relatively straightforward analysis of the complex 3D Fourier transform of galaxy coordinates derived from redshift surveys. Numerical demonstrations of this approach are carried out on a volume-limited sample of the Sloan Digital Sky Survey redshift survey. The direct unbinned transform yields a complex 3D data cube quite similar to that from the Fast Fourier Transform of finely binned galaxy positions. In both cases, deconvolution of the sampling window function yields estimates of the true transform. Simple power spectrum estimates from these transforms are roughly consistent with those using more elaborate methods. The complex Fourier transform characterizes spatial distributional properties beyond the power spectrum in a manner different from (and we argue is more easily interpreted than) the conventional multipoint hierarchy. We identify some threads of modern large-scale inference methodology that will presumably yield detections in new wider and deeper surveys.
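A toy version of the binned-transform branch of this analysis (not the authors' pipeline): bin hypothetical galaxy positions onto a grid, take the 3D FFT, and sphere-average the squared modulus into a crude power spectrum, ignoring window-function deconvolution.

```python
import numpy as np

rng = np.random.default_rng(1)
pos = rng.uniform(0, 500.0, size=(10_000, 3))      # Mpc/h, fake catalogue

n = 64
counts, _ = np.histogramdd(pos, bins=(n, n, n))
delta = counts / counts.mean() - 1.0               # overdensity field

power = np.abs(np.fft.fftn(delta))**2              # |FT|^2 on the grid

# Spherically average over shells of |k| (k in cycles per cell here).
k = np.fft.fftfreq(n)
kx, ky, kz = np.meshgrid(k, k, k, indexing="ij")
kk = np.sqrt(kx**2 + ky**2 + kz**2)
edges = np.linspace(0, kk.max(), 16)
which = np.digitize(kk.ravel(), edges)
pk = [power.ravel()[which == i].mean() if np.any(which == i) else 0.0
      for i in range(1, len(edges))]
```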
Using GIS to Estimate Lake Volume from Limited Data
Estimates of lake volume are necessary for estimating residence time or modeling pollutants. Modern GIS methods for calculating lake volume improve upon more dated technologies (e.g. planimeters) and do not require potentially inaccurate assumptions (e.g. volume of a frustum of ...
Photogrammetry and Laser Imagery Tests for Tank Waste Volume Estimates: Summary Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Field, Jim G.
2013-03-27
Feasibility tests were conducted using photogrammetry and laser technologies to estimate the volume of waste in a tank. These technologies were compared with video Camera/CAD Modeling System (CCMS) estimates; the current method used for post-retrieval waste volume estimates. This report summarizes test results and presents recommendations for further development and deployment of technologies to provide more accurate and faster waste volume estimates in support of tank retrieval and closure.
Cost benefits of advanced software: A review of methodology used at Kennedy Space Center
NASA Technical Reports Server (NTRS)
Joglekar, Prafulla N.
1993-01-01
To assist rational investments in advanced software, a formal, explicit, and multi-perspective cost-benefit analysis methodology is proposed. The methodology can be implemented through a six-stage process which is described and explained. The current practice of cost-benefit analysis at KSC is reviewed in the light of this methodology. The review finds that there is a vicious circle operating. Unsound methods lead to unreliable cost-benefit estimates. Unreliable estimates convince management that cost-benefit studies should not be taken seriously. Then, given external demands for cost-benefit estimates, management encourages software engineers to somehow come up with the numbers for their projects. Lacking the expertise needed to do a proper study, courageous software engineers with vested interests use ad hoc and unsound methods to generate some estimates. In turn, these estimates are unreliable, and the cycle continues. The proposed methodology should help KSC to break out of this cycle.
R.B. Ferguson; V. Clark Baldwin
1995-01-01
Estimating tree and stand volume in mature plantations is time consuming, involving much manpower and equipment; however, several sampling and volume-prediction techniques are available. This study showed that a well-constructed, volume-equation method yields estimates comparable to those of the often more time-consuming, height-accumulation method, even though the...
Mesquita, D P; Dias, O; Amaral, A L; Ferreira, E C
2009-04-01
In recent years, a great deal of attention has been focused on research into activated sludge processes, where the solid-liquid separation phase is frequently considered of critical importance, due to the different problems that severely affect the compaction and the settling of the sludge. Bearing that in mind, in this work, image analysis routines were developed in the Matlab environment, allowing the identification and characterization of microbial aggregates and protruding filaments in eight different wastewater treatment plants over a combined period of 2 years. The monitoring of the activated sludge contents allowed the detection of bulking events, showing that the developed image analysis methodology is adequate for a continuous examination of the morphological changes in microbial aggregates and subsequent estimation of the sludge volume index. The obtained results confirmed that the developed image analysis methodology is a feasible method for the continuous monitoring of activated sludge systems and the identification of disturbances.
Cost benefit analysis of space communications technology: Volume 1: Executive summary
NASA Technical Reports Server (NTRS)
Holland, L. D.; Sassone, P. G.; Gallagher, J. J.; Robinette, S. L.; Vogler, F. H.; Zimmer, R. P.
1976-01-01
The questions of (1) whether or not NASA should support the further development of space communications technology, and, if so, (2) which technology's support should be given the highest priority are addressed. Insofar as the issues deal principally with resource allocation, an economics perspective is adopted. The resultant cost benefit methodology utilizes the net present value concept in three distinct analysis stages to evaluate and rank those technologies which pass a qualification test based upon probable (private sector) market failure. User-preference and technology state-of-the-art surveys were conducted (in 1975) to form a data base for the technology evaluation. The program encompassed near-future technologies in space communications earth stations and satellites, including the noncommunication subsystems of the satellite (station keeping, electrical power system, etc.). Results of the research program include confirmation of the applicability of the methodology as well as a list of space communications technologies ranked according to the estimated net present value of their support (development) by NASA.
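Since the ranking rests on the net present value concept, a minimal generic helper may clarify the computation; the cash flows and discount rate below are hypothetical, not the study's figures.

```python
def npv(rate, cash_flows):
    """Discounted sum of (benefit - cost) cash flows, year 0 first."""
    return sum(cf / (1 + rate)**t for t, cf in enumerate(cash_flows))

# Hypothetical: -10 M$ development cost now, 3 M$/yr net benefits for
# 5 years, discounted at 8%.
print(f"NPV = {npv(0.08, [-10] + [3] * 5):.2f} M$")
```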
NASA Astrophysics Data System (ADS)
Katz, B. G.; Eppert, S.; Lohmann, D.; Li, S.; Goteti, G.; Kaheil, Y. H.
2011-12-01
At 4,400 meters, Mount Rainier has been the point of origin for several major lahar events. The largest event, termed the "Osceola Mudflow," occurred 5,500 years ago and covered an area of approximately 550 km2 with a total volume of deposited material of 2 to 4 km3. Particularly deadly, large lahars are estimated to have maximum flow velocities of 100 km/h, with a density often described as "flowing concrete." While rare, these events typically cause total destruction within a lahar inundation zone. It is estimated that approximately 150,000 people live on top of previous deposits left by lahars, which can be triggered by anything from earthquakes to glacial and chemical erosion of volcanic bedrock over time to liquefaction caused by extreme rainfall events. A novel methodology utilizing a 2-dimensional hydraulic model has been implemented, allowing high-resolution (30 m) lahar inundation maps to be generated. The utility of this model above or in addition to other methodologies, such as that of Iverson (1998), lies in its portability to other lahar zones as well as its ability to model any total volume specified by the user. The process for generating lahar flood plains requires few inputs: a digital terrain map (DTM) of any resolution, a mask defining the locations of lahar genesis, a raster of friction coefficients, and a time series depicting uniform material accumulation over the genesis mask which is allowed to flow down-slope. Finally, a significant improvement in speed has been made in solving the two-dimensional model by utilizing the latest in graphics processing unit (GPU) technology, which has resulted in a greater than 200 times speed-up in model run time over previous CPU-based methods. The model runs for the Osceola Mudflow compare favorably with USGS inundation regions derived using field measurements and GIS-based approaches such as the LAHARZ program suite. The overall gradation of low to high risk matches well; however, the new method allows lahars to flow over a wider range of terrain, affecting 800 to 1,700 km2 for 2 to 4 km3 of source material, a 150 to 300% increase over the literature estimate of 550 km2. To demonstrate the portability of this methodology, total set-up time for this region was measured in single days, while total run time for a single simulation was 1.5 days for the 42M grid cells within the Mount Rainier study area. Areas of improvement for this methodology include reducing the total affected area by increasing friction coefficients to account for the thicker material within the lahar, as well as utilizing equations specifically for mudflow, such as Meunier's mud/debris equations, for situations where sediment concentration is sufficiently high. The addition of the above may allow for differentiation between cohesive and non-cohesive lahars, which has important implications for the length and width of lahar inundation. Additionally, using DTMs corrected to pre-lahar levels, while not usable for estimating future risk, would allow a higher degree of confidence to be placed on modeled versus estimated accumulation map comparisons.
An integrated study to evaluate debris flow hazard in alpine environment
NASA Astrophysics Data System (ADS)
Tiranti, Davide; Crema, Stefano; Cavalli, Marco; Deangeli, Chiara
2018-05-01
Debris flows are among the most dangerous natural processes affecting the alpine environment due to their magnitude (volume of transported material) and their long runout. The presence of structures and infrastructures on alluvial fans can lead to severe problems in terms of interactions between debris flows and human activities. Risk mitigation in these areas requires identifying the magnitude, triggers, and propagation of debris flows. Here, we propose an integrated methodology to characterize these phenomena. The methodology consists of three complementary procedures. Firstly, we adopt a classification method based on the propensity of the catchment bedrocks to produce clayey-grained material. The classification allows us to identify the most likely rheology of the process. Secondly, we calculate a sediment connectivity index to estimate the topographic control on the possible coupling between the sediment source areas and the catchment channel network. This step allows for the assessment of the debris supply, which is most likely available for the channelized processes. Finally, with the data obtained in the previous steps, we modelled the propagation and depositional pattern of debris flows with a 3D code based on Cellular Automata. The results of the numerical runs allow us to identify the depositional patterns and the areas potentially involved in the flow processes. This integrated methodology is applied to a test-bed catchment located in the Northwestern Alps. The results indicate that this approach can be regarded as a useful tool to estimate debris flow related potential hazard scenarios in an alpine environment in an expeditious way without possessing an exhaustive knowledge of the investigated catchment, including data on historical debris flow events.
Preoperative TRAM free flap volume estimation for breast reconstruction in lean patients.
Minn, Kyung Won; Hong, Ki Yong; Lee, Sang Woo
2010-04-01
To obtain pleasing symmetry in breast reconstruction with the transverse rectus abdominis myocutaneous (TRAM) free flap, a large amount of abdominal flap is elevated and the remnant tissue is trimmed in most cases. However, elevation of an abundant abdominal flap can cause excessive tension in donor site closure and increase the possibility of hypertrophic scarring, especially in lean patients. The TRAM flap was divided into 4 zones in the routine manner; the depth and dimensions of the 4 zones were obtained using ultrasound and AutoCAD (Autodesk Inc., San Rafael, CA), respectively. The acquired numbers were then multiplied to obtain an estimate of the volume of each zone, and the zone volumes were summed. To confirm the relation between the estimated volume and the actual volume, the authors compared intraoperative actual TRAM flap volumes with preoperative estimated volumes in 30 consecutive TRAM free flap breast reconstructions. The estimated volumes and the actual elevated flap volumes were found to be correlated by regression analysis (r = 0.9258, P < 0.01). This result confirmed the reliability of the preoperative volume estimation using our method. Afterward, the authors applied this method to 7 lean patients, estimating and revising the design, and obtained symmetric results with minimal donor site morbidity. Preoperative estimation of TRAM flap volume with ultrasound and AutoCAD (Autodesk Inc.) allows the precise volume desired for elevation to be attained. This method provides advantages in terms of minimal flap trimming, easier closure of donor sites, reduced scar widening and symmetry, especially in lean patients.
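The zone-wise estimate is simple arithmetic; a minimal sketch with hypothetical ultrasound depths (cm) and AutoCAD-measured areas (cm2) for the four flap zones, where each zone volume is area times mean depth and the zones are summed:

```python
depths = [2.8, 2.5, 2.2, 2.0]          # cm, one per zone (hypothetical)
areas = [110.0, 95.0, 80.0, 60.0]      # cm2, one per zone (hypothetical)

flap_volume = sum(a * d for a, d in zip(areas, depths))
print(f"estimated flap volume: {flap_volume:.0f} cm3")
```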
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
The final report for the project comprises five volumes. This volume presents the study conclusions, summarizes the methodology used (more detail is found in Volume 3), discusses four case study applications of the model, and contains profiles of coastal communities in an appendix.
Empirical Assessment of the Mean Block Volume of Rock Masses Intersected by Four Joint Sets
NASA Astrophysics Data System (ADS)
Morelli, Gian Luca
2016-05-01
The estimation of a representative value for the rock block volume (Vb) is of huge interest in rock engineering for rock mass characterization purposes. However, while mathematical relationships to precisely estimate this parameter from the spacing of joints can be found in the literature for rock masses intersected by three dominant joint sets, corresponding relationships do not exist when more than three sets occur. In these cases, a consistent assessment of Vb can only be achieved by directly measuring the dimensions of several representative natural rock blocks in the field or by means of more sophisticated 3D numerical modeling approaches. However, Palmström's empirical relationship, based on the volumetric joint count Jv and on a block shape factor β, is commonly used in practice, although it is strictly valid only for rock masses intersected by three joint sets. Starting from these considerations, the present paper is primarily intended to investigate the reliability of a set of empirical relationships linking the block volume with the indexes most commonly used to characterize the degree of jointing in a rock mass (i.e. Jv and the mean value of the joint set spacings), specifically applicable to rock masses intersected by four sets of persistent discontinuities. Based on the analysis of artificial 3D block assemblies generated using the software AutoCAD, the most accurate best-fit regression was found between the mean block volume (Vbm) of tested rock mass samples and the geometric mean value of the spacings of the joint sets delimiting blocks, thus indicating this mean value as a promising parameter for the preliminary characterization of the block size. Tests on field outcrops have demonstrated that the proposed empirical methodology has the potential to predict the mean block volume of multiple-set jointed rock masses with an acceptable accuracy for common uses in most practical rock engineering applications.
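A hedged sketch of the two estimates discussed: Palmström's relation Vb = β·Jv⁻³ (strictly valid for three joint sets) and a geometric-mean-of-spacings estimate of the kind suggested here for four sets. The spacings are hypothetical, and the 4-set coefficient `c` below is a placeholder, not the paper's fitted regression constant.

```python
import math

spacings = [0.4, 0.6, 0.5, 0.8]                    # m, one per joint set (hypothetical)

jv = sum(1 / s for s in spacings)                  # volumetric joint count, joints/m
beta = 36.0                                        # block shape factor (typical value)
vb_palmstrom = beta * jv**-3                       # m3, three-set relation

s_gm = math.prod(spacings) ** (1 / len(spacings))  # geometric mean spacing
c = 1.0                                            # placeholder regression coefficient
vb_mean = c * s_gm**3                              # m3, four-set estimate

print(f"Palmstrom: {vb_palmstrom:.3f} m3, geometric-mean: {vb_mean:.3f} m3")
```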
Tuck, L.K.; Pearson, Daniel K.; Cannon, M.R.; Dutton, DeAnn M.
2013-01-01
The Tongue River Member of the Tertiary Fort Union Formation is the primary source of groundwater in the Northern Cheyenne Indian Reservation in southeastern Montana. Coal beds within this formation generally contain the most laterally extensive aquifers in much of the reservation. The U.S. Geological Survey, in cooperation with the Northern Cheyenne Tribe, conducted a study to estimate the volume of water in five coal aquifers. This report presents estimates of the volume of water in five coal aquifers in the eastern and southern parts of the Northern Cheyenne Indian Reservation: the Canyon, Wall, Pawnee, Knobloch, and Flowers-Goodale coal beds in the Tongue River Member of the Tertiary Fort Union Formation. Only conservative estimates of the volume of water in these coal aquifers are presented. The volume of water in the Canyon coal was estimated to range from about 10,400 acre-feet (75 percent saturated) to 3,450 acre-feet (25 percent saturated). The volume of water in the Wall coal was estimated to range from about 14,200 acre-feet (100 percent saturated) to 3,560 acre-feet (25 percent saturated). The volume of water in the Pawnee coal was estimated to range from about 9,440 acre-feet (100 percent saturated) to 2,360 acre-feet (25 percent saturated). The volume of water in the Knobloch coal was estimated to range from about 38,700 acre-feet (100 percent saturated) to 9,680 acre-feet (25 percent saturated). The volume of water in the Flowers-Goodale coal was estimated to be about 35,800 acre-feet (100 percent saturated). Sufficient data are needed to accurately characterize coal-bed horizontal and vertical variability, which is highly complex both locally and regionally. Where data points are widely spaced, the reliability of estimates of the volume of coal beds is decreased. Additionally, reliable estimates of the volume of water in coal aquifers depend heavily on data about water levels and data about coal-aquifer characteristics. Because the data needed to define the volume of water were sparse, only conservative estimates of the volume of water in the five coal aquifers are presented in this report. These estimates need to be used with caution and mindfulness of the uncertainty associated with them.
Estimation of feline renal volume using computed tomography and ultrasound.
Tyson, Reid; Logsdon, Stacy A; Werre, Stephen R; Daniel, Gregory B
2013-01-01
Renal volume estimation is an important parameter for clinical evaluation of kidneys and research applications. A time efficient, repeatable, and accurate method for volume estimation is required. The purpose of this study was to describe the accuracy of ultrasound and computed tomography (CT) for estimating feline renal volume. Standardized ultrasound and CT scans were acquired for kidneys of 12 cadaver cats, in situ. Ultrasound and CT multiplanar reconstructions were used to record renal length measurements that were then used to calculate volume using the prolate ellipsoid formula for volume estimation. In addition, CT studies were reconstructed at 1 mm, 5 mm, and 1 cm, and transferred to a workstation where the renal volume was calculated using the voxel count method (hand drawn regions of interest). The reference standard kidney volume was then determined ex vivo using water displacement with the Archimedes' principle. Ultrasound measurement of renal length accounted for approximately 87% of the variability in renal volume for the study population. The prolate ellipsoid formula exhibited proportional bias and underestimated renal volume by a median of 18.9%. Computed tomography volume estimates using the voxel count method with hand-traced regions of interest provided the most accurate results, with increasing accuracy for smaller voxel sizes in grossly normal kidneys (-10.1 to 0.6%). Findings from this study supported the use of CT and the voxel count method for estimating feline renal volume in future clinical and research studies. © 2012 Veterinary Radiology & Ultrasound.
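Both estimators compared in this study have simple forms; a minimal sketch with a hypothetical kidney and a boolean segmentation mask:

```python
import numpy as np

def prolate_ellipsoid_volume(length, width, height):
    """Prolate ellipsoid estimate: V = pi/6 * L * W * H (cm -> cm3)."""
    return np.pi / 6 * length * width * height

def voxel_count_volume(mask, voxel_mm3):
    """Voxel-count estimate: segmented voxels times voxel volume (-> cm3)."""
    return mask.sum() * voxel_mm3 / 1000.0

# Hypothetical feline kidney measuring 3.8 x 2.5 x 2.3 cm
print(prolate_ellipsoid_volume(3.8, 2.5, 2.3))
```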
Jiang, Zheng; Wang, Hong; Wu, Qi-nan
2015-06-01
To optimize the processing of polysaccharide extraction from Spirodela polyrrhiza, five factors related to the polysaccharide extraction rate were screened using a Plackett-Burman design. Based on this screening, three factors (alcohol volume fraction, extraction temperature and material-to-liquid ratio) were investigated with Box-Behnken response surface methodology. The order of influence of the three factors on the extraction rate of polysaccharide from Spirodela polyrrhiza was: extraction temperature, alcohol volume fraction, material-to-liquid ratio. According to the Box-Behnken response surface, the best extraction conditions were: alcohol volume fraction of 81%, material-to-liquid ratio of 1:42, extraction temperature of 100 degrees C, and four extractions of 60 min each. The combination of Plackett-Burman design and Box-Behnken response surface methodology used to optimize the extraction process for the polysaccharide in this study is effective and stable.
Vicente J. Monleon
2009-01-01
Currently, Forest Inventory and Analysis estimation procedures use Smalian's formula to compute coarse woody debris (CWD) volume and assume that logs lie horizontally on the ground. In this paper, the impact of those assumptions on volume and biomass estimates is assessed using 7 years of Oregon's Phase 2 data. Estimates of log volume computed using Smalian...
Wong, M; Wuethrich, P; Eggli, P; Hunziker, E
1996-05-01
A new methodology was developed to measure spatial variations in chondrocyte/matrix structural parameters and chondrocyte biosynthetic activity in articular cartilage. This technique is based on the use of a laser scanning confocal microscope that can "optically" section chemically fixed, unembedded tissue. The confocal images are used for morphometric measurement of stereologic parameters such as cell density (cells/mm3), cell volume fraction (%), surface density (1/cm), mean cell volume (micron3), and mean cell surface area (micron2). Adjacent pieces of tissue are simultaneously processed for conventional liquid emulsion autoradiography, and a semiautomated grain counting program is used to measure the silver grain density at regions corresponding to the same sites used for structural measurements. An estimate of chondrocyte biosynthetic activity in terms of grains per cell is obtained by dividing the value for grain density by that for cell density. In this paper, the newly developed methodology was applied to characterize the zone-specific behavior of adult articular cartilage in the free-swelling state. Cylinders of young adult bovine articular cartilage were labelled with either [3H]proline or [35S]sulfate, and chondrocyte biosynthesis and structural parameters were measured from the articular surface to the tidemark. The results showed that chondrocytes of the radial zone occupied twice the volume and surface area of the chondrocytes of the superficial zone but were 10 times more synthetically active. This efficient and unbiased technique may prove useful in studying the correlation between mechanically induced changes in cell form and biosynthetic activity within inhomogeneous tissue as well as metabolic changes in cartilage due to ageing and disease.
Goudeketting, Seline R; Heinen, Stefan G H; Ünlü, Çağdaş; van den Heuvel, Daniel A F; de Vries, Jean-Paul P M; van Strijen, Marco J; Sailer, Anna M
2017-08-01
To systematically review and meta-analyze the added value of 3-dimensional (3D) image fusion technology in endovascular aortic repair for its potential to reduce contrast media volume, radiation dose, procedure time, and fluoroscopy time. Electronic databases were systematically searched for studies published between January 2010 and March 2016 that included a control group describing 3D fusion imaging in endovascular aortic procedures. Two independent reviewers assessed the methodological quality of the included studies and extracted data on iodinated contrast volume, radiation dose, procedure time, and fluoroscopy time. The contrast use for standard and complex endovascular aortic repairs (fenestrated, branched, and chimney) were pooled using a random-effects model; outcomes are reported as the mean difference with 95% confidence intervals (CIs). Seven studies, 5 retrospective and 2 prospective, involving 921 patients were selected for analysis. The methodological quality of the studies was moderate (median 17, range 15-18). The use of fusion imaging led to an estimated mean reduction in iodinated contrast of 40.1 mL (95% CI 16.4 to 63.7, p=0.002) for standard procedures and a mean 70.7 mL (95% CI 44.8 to 96.6, p<0.001) for complex repairs. Secondary outcome measures were not pooled because of potential bias in nonrandomized data, but radiation doses, procedure times, and fluoroscopy times were lower, although not always significantly, in the fusion group in 6 of the 7 studies. Compared with the control group, 3D fusion imaging is associated with a significant reduction in the volume of contrast employed for standard and complex endovascular aortic procedures, which can be particularly important in patients with renal failure. Radiation doses, procedure times, and fluoroscopy times were reduced when 3D fusion was used.
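The pooling step can be illustrated with a generic DerSimonian-Laird random-effects sketch; the per-study mean differences and standard errors below are hypothetical, not the review's data.

```python
import numpy as np

def random_effects_pool(means, ses):
    """DerSimonian-Laird random-effects pooled mean difference and 95% CI."""
    means, ses = np.asarray(means, float), np.asarray(ses, float)
    w = 1 / ses**2                                   # fixed-effect weights
    mu_fe = np.sum(w * means) / np.sum(w)
    q = np.sum(w * (means - mu_fe)**2)               # Cochran's Q
    df = len(means) - 1
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                    # between-study variance
    w_re = 1 / (ses**2 + tau2)                       # random-effects weights
    mu = np.sum(w_re * means) / np.sum(w_re)
    se = np.sqrt(1 / np.sum(w_re))
    return mu, (mu - 1.96 * se, mu + 1.96 * se)

# Hypothetical per-study contrast savings (ml) and standard errors
print(random_effects_pool([35, 60, 28], [12, 20, 9]))
```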
Aguilar, Hector N; Battié, Michele C
2017-01-01
Osteoarthritis is a common hip joint disease, involving loss of articular cartilage. The prevalence and prognosis of hip osteoarthritis have been difficult to determine, with various clinical and radiological methods used to derive epidemiological estimates exhibiting significant heterogeneity. MRI-based methods directly visualise hip joint cartilage, and offer potential to more reliably define presence and severity of osteoarthritis, but have been underused. We performed a systematic review of MRI-based estimates of hip articular cartilage in the general population and in patients with established osteoarthritis, using MEDLINE, EMBASE and SCOPUS current to June 2016, with search terms such as ‘hip’, ‘femoral head’, ‘cartilage’, ‘volume’, ‘thickness’, ‘MRI’, etc. Ultimately, 11 studies were found appropriate for inclusion, but they were heterogeneous in osteoarthritis assessment methodology and composition. Overall, the studies consistently demonstrate the reliability and potential clinical utility of MRI-based estimates. However, no longitudinal data or reference values for hip cartilage thickness or volume have been published, limiting the ability of MRI to define or risk-stratify hip osteoarthritis. MRI-based techniques are available to quantify articular cartilage signal, volume, thickness and defects, which could establish the sequence and rate of articular cartilage changes at the hip that yield symptomatic osteoarthritis. However, prevalence and rates of progression of hip osteoarthritis have not been established in any MRI studies in the general population. Future investigations could fill this important knowledge gap using robust MRI methods in population-based cross-sectional and longitudinal studies. PMID:28405471
NASA Astrophysics Data System (ADS)
Larrea, Patricia; Salinas, Sergio; Widom, Elisabeth; Siebe, Claus; Abbitt, Robbyn J. F.
2017-12-01
Paricutin volcano is the youngest and most studied monogenetic volcano in the Michoacán-Guanajuato volcanic field (Mexico), with an excellent historical record of its nine years (February 1943 to March 1952) of eruptive activity. This eruption offered a unique opportunity to observe the birth of a new volcano and document its entire eruption. Geologists surveyed all of the eruptive phases in progress, providing maps depicting the volcano's sequential growth. We have combined all of those previous results and present a new methodological approach, which utilizes state-of-the-art GIS mapping tools to outline and identify the 23 different eruptive phases as originally defined by Luhr and Simkin (1993). Using these detailed lava flow distribution maps, the volume of each of the flows was estimated with the aid of pre- and post-eruption digital elevation models. Our procedure yielded a total lava flow volume ranging between 1.59 and 1.68 km3 DRE, which is larger than previous estimates based on simpler methods. In addition, compositional data allowed us to estimate magma effusion rates and to determine variations in the relative proportions of the different magma compositions issued during the eruption. These results represent the first comprehensive documentation of the combined chemical, temporal, and volumetric evolution of the Paricutin lava field and provide key constraints for petrological interpretations of the nature of the magmatic plumbing system that fed the eruption.
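The DEM-differencing step lends itself to a short illustration: subtract a co-registered pre-eruption elevation grid from the post-eruption grid inside the mapped flow outline and integrate thickness over cell area. The grids, mask, and cell size below are invented for the sketch; a dense-rock-equivalent (DRE) figure would further scale the result by the deposit-to-magma density ratio.

```python
import numpy as np

cell = 30.0  # DEM cell size in metres (illustrative)

# Co-registered pre- and post-eruption elevation grids (synthetic values, m)
pre  = np.array([[100.0, 102.0], [101.0, 103.0]])
post = np.array([[130.0, 128.0], [101.0, 140.0]])
mask = np.array([[True, True], [False, True]])  # cells inside the mapped flow outline

thickness = np.where(mask, post - pre, 0.0)     # emplaced flow thickness per cell
volume_m3 = thickness.sum() * cell ** 2         # integrate thickness over cell area
print(f"flow volume ≈ {volume_m3 / 1e9:.6f} km^3")
```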
Kasabova, Boryana E; Holliday, Trenton W
2015-04-01
A new model for estimating human body surface area and body volume/mass from standard skeletal metrics is presented. This model is then tested against both 1) "independently estimated" body surface areas and "independently estimated" body volume/mass (both derived from anthropometric data) and 2) the cylindrical model of Ruff. The model is found to be more accurate in estimating both body surface area and body volume/mass than the cylindrical model, but it is more accurate in estimating body surface area than it is for estimating body volume/mass (as reflected by the standard error of the estimate when "independently estimated" surface area or volume/mass is regressed on estimates derived from the present model). Two practical applications of the model are tested. In the first test, the relative contribution of the limbs versus the trunk to the body's volume and surface area is compared between "heat-adapted" and "cold-adapted" populations. As expected, the "cold-adapted" group has significantly more of its body surface area and volume in its trunk than does the "heat-adapted" group. In the second test, we evaluate the effect of variation in bi-iliac breadth, elongated or foreshortened limbs, and differences in crural index on the body's surface area to volume ratio (SA:V). Results indicate that the effects of bi-iliac breadth on SA:V are substantial, while those of limb lengths and (especially) the crural index are minor, which suggests that factors other than surface area relative to volume are driving morphological variation and ecogeographical patterning in limb proportions. © 2014 Wiley Periodicals, Inc.
Borque, Paloma; Luke, Edward; Kollias, Pavlos
2016-05-27
Coincident profiling observations from Doppler lidars and radars are used to estimate the turbulence energy dissipation rate (ε) using three different data sources: (i) Doppler radar velocity (DRV), (ii) Doppler lidar velocity (DLV), and (iii) Doppler radar spectrum width (DRW) measurements. Likewise, the agreement between the derived ε estimates is examined at the cloud base height of stratiform warm clouds. Collocated ε estimates based on power spectra analysis of DRV and DLV measurements show good agreement (correlation coefficients of 0.86 and 0.78 for the two cases analyzed here) during both drizzling and nondrizzling conditions. This suggests that unified (below and above cloud base) time-height estimates of ε in cloud-topped boundary layer conditions can be produced. This also suggests that the eddy dissipation rate can be estimated throughout the cloud layer without the constraint that clouds need to be nonprecipitating. Eddy dissipation rate estimates based on DRW measurements compare well with the estimates based on Doppler velocity, but their performance deteriorates as precipitation-size particles are introduced in the radar volume and broaden the DRW values. Based on this finding, a methodology to estimate the Doppler spectra broadening due to the spread of the drop size distribution is presented. Furthermore, the uncertainties in ε introduced by signal-to-noise conditions, the estimation of the horizontal wind, the selection of the averaging time window, and the presence of precipitation are discussed in detail.
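A hedged sketch of the standard power-spectrum route to ε, of the kind used for the DRV and DLV estimates: under Taylor's frozen-turbulence hypothesis the inertial subrange follows S(f) = α ε^(2/3) (U/2π)^(2/3) f^(-5/3), so ε can be read off a compensated spectrum. The Kolmogorov-constant value α, the frequency band, the advection speed U, and the synthetic input series are all assumptions of the sketch, not values from the study.

```python
import numpy as np
from scipy.signal import welch

def dissipation_rate(v, fs, U, f_band=(0.1, 1.0), alpha=0.5):
    """ε from the inertial subrange of a velocity power spectrum (sketch)."""
    f, S = welch(v, fs=fs, nperseg=min(len(v), 512))
    sel = (f >= f_band[0]) & (f <= f_band[1])
    comp = np.mean(S[sel] * f[sel] ** (5.0 / 3.0))   # compensated spectrum level
    return (comp / (alpha * (U / (2.0 * np.pi)) ** (2.0 / 3.0))) ** 1.5

# White noise stands in for a measured Doppler-velocity time series (synthetic)
rng = np.random.default_rng(0)
print(dissipation_rate(rng.standard_normal(4096), fs=2.0, U=8.0))
```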
Methodology for Estimating Total Automotive Manufacturing Costs
DOT National Transportation Integrated Search
1983-04-01
A number of methodologies for estimating manufacturing costs have been developed. This report discusses the different approaches and shows that an approach to estimating manufacturing costs in the automobile industry based on surrogate plants is pref...
Image-derived input function with factor analysis and a-priori information.
Simončič, Urban; Zanotti-Fregonara, Paolo
2015-02-01
Quantitative PET studies often require the cumbersome and invasive procedure of arterial cannulation to measure the input function. This study sought to minimize the number of necessary blood samples by developing a factor-analysis-based image-derived input function (IDIF) methodology for dynamic PET brain studies. IDIF estimation was performed as follows: (a) carotid and background regions were segmented manually on an early PET time frame; (b) blood-weighted and tissue-weighted time-activity curves (TACs) were extracted with factor analysis; (c) factor analysis results were denoised and scaled using the voxels with the highest blood signal; (d) using population data and one blood sample at 40 min, the whole-blood TAC was estimated from the postprocessed factor analysis results; and (e) the parent concentration was finally estimated by correcting the whole-blood curve with measured radiometabolite concentrations. The methodology was tested using data from 10 healthy individuals imaged with [(11)C](R)-rolipram. The accuracy of IDIFs was assessed against full arterial sampling by comparing the area under the curve of the input functions and by calculating the total distribution volume (VT). The shape of the image-derived whole-blood TAC matched the reference arterial curves well, and the whole-blood areas under the curve were accurately estimated (mean error 1.0±4.3%). The relative Logan-VT error was -4.1±6.4%. Compartmental modeling and spectral analysis gave less accurate VT results compared with Logan. A factor-analysis-based IDIF for [(11)C](R)-rolipram brain PET studies that relies on a single blood sample and population data can be used for accurate quantification of Logan-VT values.
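Most of steps (a)-(e) are study-specific, but the single-sample scaling in step (d) reduces to a one-line calibration: rescale the unitless factor-analysis blood curve so it passes through the measured 40-min sample. The time grid, curve shape, and sample value below are invented for illustration.

```python
import numpy as np

def scale_idif(t, blood_factor, t_sample, measured_activity):
    """Calibrate a unitless factor-analysis blood curve to one late blood sample."""
    k = measured_activity / np.interp(t_sample, t, blood_factor)
    return k * blood_factor

t = np.linspace(0.0, 90.0, 181)            # minutes (illustrative grid)
raw = t * np.exp(-t / 30.0)                # made-up blood-shaped factor curve
idif = scale_idif(t, raw, t_sample=40.0, measured_activity=5.2)  # kBq/mL, invented
print(idif[80])                            # scaled whole-blood activity at 40 min
```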
Volume, conservation and instruction: A classroom based solomon four group study of conflict
NASA Astrophysics Data System (ADS)
Rowell, J. A.; Dawson, C. J.
The research reported is an attempt to widen the applicability of Piagetian theory-based conflict methodology from individual situations to whole classes. A Solomon four group experimental design, augmented by a delayed posttest, was used to provide a controlled framework for studying the effects of conflict instruction on Grade 8 students' ability to conserve volume of noncompressible matter, and to apply that knowledge to gas volume. The results, reported for individuals and groups, show the methodology can be effective, particularly when instruction is preceded by a pretest. Immediate posttest differences in knowledge of gas volume between spontaneous (pretest) conservers and instructed conservers of volume of noncompressible matter were no longer in evidence on the delayed posttest. This observation, together with the effects of pretesting and of the instructional sequence, is shown to have a consistent Piagetian interpretation. Practical implications are discussed.
NASA Astrophysics Data System (ADS)
Ruchkinova, O.; Shchuckin, I.
2017-06-01
It is shown that phytofilters are an environmentally friendly solution to the problem of treating surface runoff from urbanized territories, and that they meet current requirements for land-drainage treatment systems. Their main limitation is associated with operation under cold-climate conditions. A technology and installation were developed that provide year-round treatment and storage of surface runoff. An optimal composition of the filtering medium (peat, zeolite, and sand, in percent by volume) that provides the required hydraulic characteristics was established experimentally. The sorption and ion-exchange capacity of the composite filtering medium of ordered composition was determined under dynamic conditions. Dependences of the effluent concentrations of oil products and heavy metals on temperature during filtration through the composite medium were estimated. The treatment effectiveness of the phytofiltration installation was determined, and the influence of plants on the filtration process and on the capacity of the filtering medium was established; swamp iris, mace reed, and reed grass are recommended. A phytofilter design methodology was developed, and the economic effect of the phytofiltration technology was calculated in comparison with traditional block-modular installations.
Exact solutions of a two parameter flux model and cryobiological applications.
Benson, James D; Chicone, Carmen C; Critser, John K
2005-06-01
Solute-solvent transmembrane flux models are used throughout biological sciences with applications in plant biology, cryobiology (transplantation and transfusion medicine), as well as circulatory and kidney physiology. Using a standard two parameter differential equation model of solute and solvent transmembrane flux described by Jacobs [The simultaneous measurement of cell permeability to water and to dissolved substances, J. Cell. Comp. Physiol. 2 (1932) 427-444], we determine the functions that describe the intracellular water volume and moles of intracellular solute for every time t and every set of initial conditions. Here, we provide several novel biophysical applications of this theory to important biological problems. These include using this result to calculate the value of cell volume excursion maxima and minima along with the time at which they occur, a novel result that is of significant relevance to the addition and removal of permeating solutes during cryopreservation. We also present a methodology that produces extremely accurate sum of squares estimates when fitting data for cellular permeability parameter values. Finally, we show that this theory allows a significant increase in both accuracy and speed of finite element methods for multicellular volume simulations, which has critical clinical biophysical applications in cryosurgical approaches to cancer treatment.
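A minimal nondimensional sketch of the Jacobs two-parameter system described above, with lumped water- and solute-permeability rate constants; all parameter values are illustrative rather than taken from the paper. Integrating it numerically reproduces the shrink-swell volume excursion whose extrema the authors characterize analytically.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Lumped, nondimensional parameters (illustrative, not from the paper):
b1, b2 = 1.0, 0.2        # water (Lp*A*RT) and solute (Ps*A) rate constants, 1/min
me_n, me_s = 1.0, 2.0    # external impermeant / permeant osmolality (isotonic = 1)
n_in = 1.0               # intracellular impermeant content (constant)

def rhs(t, y):
    w, s = y                                   # water volume, permeant moles
    mi = (n_in + s) / w                        # internal osmolality (ideal, dilute)
    dw = -b1 * ((me_n + me_s) - mi)            # water leaves a hypertonic bath
    ds = b2 * (me_s - s / w)                   # permeant solute enters down gradient
    return [dw, ds]

t = np.linspace(0.0, 20.0, 2001)
sol = solve_ivp(rhs, (0.0, 20.0), [1.0, 0.0], t_eval=t)
i = np.argmin(sol.y[0])
print(f"volume minimum {sol.y[0][i]:.3f} at t = {sol.t[i]:.2f} (CPA addition)")
```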
The validity of ultrasound estimation of muscle volumes.
Infantolino, Benjamin W; Gales, Daniel J; Winter, Samantha L; Challis, John H
2007-08-01
The purpose of this study was to validate ultrasound muscle volume estimation in vivo. To examine validity, vastus lateralis ultrasound images were collected from cadavers before muscle dissection; after dissection, the volumes were determined by hydrostatic weighing. Seven thighs from cadaver specimens were scanned using a 7.5-MHz ultrasound probe (SSD-1000, Aloka, Japan). The perimeter of the vastus lateralis was identified in the ultrasound images and manually digitized. Volumes were then estimated using the Cavalieri principle, by measuring the image areas of sets of parallel two-dimensional slices through the muscles. The muscles were then dissected from the cadavers, and muscle volume was determined via hydrostatic weighing. There was no statistically significant difference between the ultrasound estimation of muscle volume and that estimated using hydrostatic weighing (p > 0.05). The mean percentage error between the two volume estimates was 0.4% +/- 6.9. Three operators all performed four digitizations of all images from one randomly selected muscle; there was no statistical difference between operators or trials and the intraclass correlation was high (>0.8). The results of this study indicate that ultrasound is an accurate method for estimating muscle volumes in vivo.
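The Cavalieri computation itself is a one-liner: the volume estimate is the sum of the digitized section areas multiplied by the inter-slice spacing. The areas and spacing below are invented for illustration.

```python
import numpy as np

def cavalieri_volume(slice_areas_cm2, spacing_cm):
    """Cavalieri estimate: sum of section areas times inter-slice spacing."""
    return np.sum(slice_areas_cm2) * spacing_cm

# Digitized vastus lateralis cross-sections (cm^2) on parallel slices (invented)
areas = [4.1, 9.8, 14.2, 16.0, 13.5, 8.7, 3.2]
print(f"muscle volume ≈ {cavalieri_volume(areas, spacing_cm=3.0):.1f} cm^3")
```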
Determination of Time Dependent Virus Inactivation Rates
NASA Astrophysics Data System (ADS)
Chrysikopoulos, C. V.; Vogler, E. T.
2003-12-01
A methodology is developed for estimating temporally variable virus inactivation rate coefficients from experimental virus inactivation data. The methodology consists of a technique for slope estimation of normalized virus inactivation data in conjunction with a resampling parameter estimation procedure. The slope estimation technique is based on a relatively flexible geostatistical method known as universal kriging. Drift coefficients are obtained by nonlinear fitting of bootstrap samples and the corresponding confidence intervals are obtained by bootstrap percentiles. The proposed methodology yields more accurate time dependent virus inactivation rate coefficients than those estimated by fitting virus inactivation data to a first-order inactivation model. The methodology is successfully applied to a set of poliovirus batch inactivation data. Furthermore, the importance of accurate inactivation rate coefficient determination on virus transport in water saturated porous media is demonstrated with model simulations.
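The following sketch illustrates only the bootstrap-percentile machinery described above, applied for simplicity to a first-order decay fit rather than to the paper's universal-kriging slope estimates; the data and rate constant are synthetic.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)

# Synthetic normalized survival data, C/C0 = exp(-lam*t), with 5% noise
t = np.linspace(0.0, 30.0, 16)
c = np.exp(-0.12 * t) * (1.0 + 0.05 * rng.standard_normal(t.size))

model = lambda tt, lam: np.exp(-lam * tt)
boot = []
for _ in range(1000):                          # resample (t, c) pairs with replacement
    idx = rng.integers(0, t.size, t.size)
    lam, _ = curve_fit(model, t[idx], c[idx], p0=[0.1])
    boot.append(lam[0])
lo, hi = np.percentile(boot, [2.5, 97.5])      # bootstrap percentile interval
print(f"lambda ≈ {np.mean(boot):.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```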
Klett, Timothy R.; Schenk, Christopher J.; Wandrey, Craig J.; Brownfield, Michael E.; Charpentier, Ronald R.; Tennyson, Marilyn E.; Gautier, Donald L.
2014-01-01
Using a well performance-based geologic assessment methodology, the U.S. Geological Survey estimated a technically recoverable mean volume of 62 million barrels of oil in shale oil reservoirs, and more than 3,700 billion cubic feet of gas in tight sandstone gas reservoirs in the Bombay and Krishna-Godavari Provinces of India. The term “provinces” refers to geologically defined units assessed by the USGS for the purposes of this report and carries no political or diplomatic connotation. Shale oil and tight sandstone gas reservoirs were evaluated in the Assam and Cauvery Provinces, but these reservoirs were not quantitatively assessed.
Pannopnut, Papinwit; Kitporntheranunt, Maethaphan; Paritakul, Panwara; Kongsomboon, Kittipong
2015-01-01
To investigate the correlation between ultrasound-measured placental volume and collected umbilical cord blood (UCB) volume in term pregnancy. An observational cross-sectional study of term singleton pregnant women in the labor ward at Maha Chakri Sirindhorn Medical Center was conducted. Placental thickness, height, and width were measured using two-dimensional (2D) ultrasound, and placental volume was calculated using a volumetric mathematical model. After the delivery of the baby, UCB was collected and its volume measured immediately. Then, birth weight, placental weight, and the actual placental volume were analyzed. Pearson's correlation was used to determine the correlation between each pair of variables. A total of 35 pregnant women were eligible for the study. The mean and standard deviation of estimated placental volume and actual placental volume were 534±180 mL and 575±118 mL, respectively. The median UCB volume was 140 mL (range 98-220 mL). The UCB volume did not have a statistically significant correlation with the estimated placental volume (correlation coefficient 0.15; p=0.37). However, the UCB volume was significantly correlated with the actual placental volume (correlation coefficient 0.62; p<0.001) and birth weight (correlation coefficient 0.38; p=0.02). The estimated placental volume by 2D ultrasound was not significantly correlated with the UCB volume. Further studies to establish the correlation between the UCB volume and the estimated placental volume using other types of placental imaging may be needed.
Expanded uncertainty estimation methodology in determining the sandy soils filtration coefficient
NASA Astrophysics Data System (ADS)
Rusanova, A. D.; Malaja, L. D.; Ivanov, R. N.; Gruzin, A. V.; Shalaj, V. V.
2018-04-01
A methodology for estimating the combined standard uncertainty in determining the filtration coefficient of sandy soils has been developed. Laboratory studies were carried out, resulting in determination of the filtration coefficient and an estimate of its combined uncertainty.
Quantitative CT: technique dependence of volume estimation on pulmonary nodules
NASA Astrophysics Data System (ADS)
Chen, Baiyu; Barnhart, Huiman; Richard, Samuel; Colsher, James; Amurao, Maxwell; Samei, Ehsan
2012-03-01
Current estimation of lung nodule size typically relies on uni- or bi-dimensional techniques. While new three-dimensional volume estimation techniques using MDCT have improved size estimation of nodules with irregular shapes, the effect of acquisition and reconstruction parameters on accuracy (bias) and precision (variance) of the new techniques has not been fully investigated. To characterize the volume estimation performance dependence on these parameters, an anthropomorphic chest phantom containing synthetic nodules was scanned and reconstructed with protocols across various acquisition and reconstruction parameters. Nodule volumes were estimated by a clinical lung analysis software package, LungVCAR. Precision and accuracy of the volume assessment were calculated across the nodules and compared between protocols via a generalized estimating equation analysis. Results showed that the precision and accuracy of nodule volume quantifications were dependent on slice thickness, with different dependences for different nodule characteristics. Other parameters including kVp, pitch, and reconstruction kernel had lower impact. Determining these technique dependences enables better volume quantification via protocol optimization and highlights the importance of consistent imaging parameters in sequential examinations.
Estimating sugar maple bark thickness and volume.
Charles L. Stayton; Michael Hoffman
1970-01-01
Sugar maple bark thickness and volume were estimated using first a published method, then equations developed by the authors. Both methods gave estimates that compared closely with measured values. Information is also presented on variation in bark thickness and on weight and volume of bark as a percentage of total merchantable stem weight and volume.
The Development of a Methodology for Estimating the Cost of Air Force On-the-Job Training.
ERIC Educational Resources Information Center
Samers, Bernard N.; And Others
The Air Force uses a standardized costing methodology for resident technical training schools (TTS); no comparable methodology exists for computing the cost of on-the-job training (OJT). This study evaluates three alternative survey methodologies and a number of cost models for estimating the cost of OJT for airmen training in the Administrative…
Comparison of volume estimation methods for pancreatic islet cells
NASA Astrophysics Data System (ADS)
Dvořák, Jiří; Švihlík, Jan; Habart, David; Kybic, Jan
2016-03-01
In this contribution we study different methods of automatic volume estimation for pancreatic islets which can be used in the quality control step prior to islet transplantation. The total islet volume is an important criterion in the quality control. The individual islet volume distribution is also of interest -- it has been indicated that smaller islets can be more effective. A 2D image of a microscopy slice containing the islets is acquired. The inputs to the volume estimation methods are segmented images of individual islets. The segmentation step is not discussed here. We consider simple methods of volume estimation assuming that the islets have spherical or ellipsoidal shape. We also consider a local stereological method, namely the nucleator. The nucleator does not rely on any shape assumptions and provides unbiased estimates if isotropic sections through the islets are observed. We present a simulation study comparing the performance of the volume estimation methods in different scenarios and an experimental study comparing the methods on a real dataset.
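The shape-based estimators reduce to closed-form expressions: a sphere recovered from the projected area, or an ellipsoid from the fitted axes. A quick sketch (dimensions invented):

```python
import numpy as np

def sphere_volume_from_area(area):
    """Assume a spherical islet: recover r from its projected (segmented) area."""
    r = np.sqrt(area / np.pi)
    return 4.0 / 3.0 * np.pi * r ** 3

def ellipsoid_volume(major, minor):
    """Prolate ellipsoid of revolution about the major axis."""
    return np.pi / 6.0 * major * minor ** 2

# A segmented islet with projected area 0.031 mm^2 and axes 0.22 x 0.18 mm (invented)
print(sphere_volume_from_area(0.031), ellipsoid_volume(0.22, 0.18))
```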
ERIC Educational Resources Information Center
Genovesi, Giovanni, Ed.
This collection, the last of four volumes on the history of compulsory education among the nations of Europe and the western hemisphere, analyzes statistics, methodology, reforms, and new tendencies. Twelve of the document's 18 articles are written in English, 3 are written in French and 3 are in Italian. Summaries accompany most articles; three…
Inverse analysis of turbidites by machine learning
NASA Astrophysics Data System (ADS)
Naruse, H.; Nakao, K.
2017-12-01
This study aims to propose a method to estimate the paleo-hydraulic conditions of turbidity currents from ancient turbidites by using a machine-learning technique. In this method, numerical simulation is repeated under various initial conditions, which produces a data set of characteristic features of turbidites. This data set is then used for supervised training of a deep-learning neural network (NN). Quantities of characteristic features of turbidites in the training data set are given to the input nodes of the NN, and the output nodes are expected to provide estimates of the initial conditions of the turbidity current. The optimization of the weight coefficients of the NN is then conducted to reduce the root-mean-square difference between the true conditions and the output values of the NN. The empirical relationship between numerical results and initial conditions is explored in this method, and the discovered relationship is used for inversion of turbidity currents. This machine learning can potentially produce an NN that estimates paleo-hydraulic conditions from data of ancient turbidites. We produced a preliminary implementation of this methodology. A forward model based on 1D shallow-water equations with a correction for density-stratification effects was employed. This model calculates the behavior of a surge-like turbidity current transporting mixed-size sediment, and outputs the spatial distribution of volume per unit area of each grain-size class on a uniform slope. The grain-size distribution was discretized into 3 classes. Numerical simulation was repeated 1000 times, and thus 1000 beds of turbidites were used as the training data for an NN that has 21000 input nodes and 5 output nodes with two hidden layers. After the machine learning finished, independent simulations were conducted 200 times in order to evaluate the performance of the NN. As a result of this test, the initial conditions of the validation data were successfully reconstructed by the NN. The estimated values show very small deviation from the true parameters. Compared to previous inverse modeling of turbidity currents, our methodology is superior especially in the efficiency of computation. Our methodology also has advantages in extensibility and applicability to various sediment transport processes such as pyroclastic flows or debris flows.
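As a rough illustration of the supervised-inversion idea (simulate forward, then regress initial conditions from deposit features), the sketch below swaps the shallow-water forward model for a cheap synthetic stand-in and uses a small scikit-learn multilayer perceptron; the sizes loosely echo the abstract, but everything else is invented.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n_runs, n_features, n_conditions = 1000, 210, 5   # sizes loosely echo the abstract

# Cheap synthetic stand-in for the shallow-water forward model: a fixed random
# basis plus a smooth nonlinearity maps initial conditions to "deposit" features.
basis = rng.standard_normal((n_conditions, n_features))

def forward(cond):
    return np.tanh(cond @ basis) + 0.01 * rng.standard_normal(n_features)

conds = rng.uniform(0.0, 1.0, (n_runs, n_conditions))   # sampled initial conditions
deposits = np.vstack([forward(c) for c in conds])       # simulated turbidite beds

net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
net.fit(deposits[:800], conds[:800])                    # supervised training
mae = np.abs(net.predict(deposits[800:]) - conds[800:]).mean()
print(f"mean absolute error on held-out runs: {mae:.3f}")
```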
Methodology to Estimate the Quantity, Composition, and ...
This report, Methodology to Estimate the Quantity, Composition and Management of Construction and Demolition Debris in the US, was developed to expand access to data on CDD in the US and to support research on CDD and sustainable materials management. Since past US EPA CDD estimates have been limited to building-related CDD, a goal in the development of this methodology was to use data originating from CDD facilities and contractors to better capture the current picture of total CDD management, including materials from roads, bridges and infrastructure.
Christopher M. Oswalt; Adam M. Saunders
2009-01-01
Sound estimation procedures are essential for generating credible population estimates to evaluate the status and trends in resource conditions. As such, volume estimation is an integral component of the U.S. Department of Agriculture, Forest Service, Forest Inventory and Analysis (FIA) program's reporting. In effect, reliable volume estimation procedures are...
Assessing the quality of the volume-outcome relationship in uro-oncology.
Mayer, Erik K; Purkayastha, Sanjay; Athanasiou, Thanos; Darzi, Ara; Vale, Justin A
2009-02-01
To assess systematically the quality of evidence for the volume-outcome relationship in uro-oncology, and thus facilitate the formulating of health policy within this speciality, as 'Implementation of Improving Outcome Guidance' has led to centralization of uro-oncology based on published studies that have supported a 'higher volume-better outcome' relationship, but improved awareness of methodological drawbacks in health service research has questioned the strength of this proposed volume-outcome relationship. We systematically searched previous relevant reports and extracted all articles from 1980 onwards assessing the volume-outcome relationship for cystectomy, prostatectomy and nephrectomy at the institution and/or surgeon level. Studies were assessed for their methodological quality using a previously validated rating system. Where possible, meta-analytical methods were used to calculate overall differences in outcome measures between low and high volume healthcare providers. In all, 22 studies were included in the final analysis; 19 of these were published in the last 5 years. Only four studies appropriately explored the effect of both the institution and surgeon volume on outcome measures. Mortality and length of stay were the most frequently measured outcomes. The median total quality scores within each of the operation types were 8.5, 9 and 8 for cystectomy, prostatectomy and nephrectomy, respectively (possible maximum score 18). Random-effects modelling showed a higher risk of mortality in low-volume institutions than in higher-volume institutions for both cystectomy and nephrectomy (odds ratio 1.88, 95% confidence interval 1.54-2.29, and 1.28, 1.10-1.49, respectively). The methodological quality of volume-outcome research as applied to cystectomy, prostatectomy and nephrectomy is only modest at best. Accepting several limitations, pooled analysis confirms a higher-volume, lower-mortality relationship for cystectomy and nephrectomy. Future research should focus on the development of a quality framework with a validated scoring system for the bench-marking of data to improve validity and facilitate rational policy-making within the speciality of uro-oncology.
NASA Astrophysics Data System (ADS)
Sun, Qiliang; Alves, Tiago M.; Lu, Xiangyang; Chen, Chuanxu; Xie, Xinong
2018-03-01
Submarine slope failure can mobilize large amounts of seafloor sediment, as shown in varied offshore locations around the world. Submarine landslide volumes are usually estimated by mapping their tops and bases on seismic data. However, two essential components of the total volume of failed sediments are overlooked in most estimates: (a) the volume of subseismic turbidites generated during slope failure and (b) the volume of shear compaction occurring during the emplacement of failed sediment. In this study, the true volume of a large submarine landslide in the northern South China Sea is estimated using seismic, multibeam bathymetry and Ocean Drilling Program/Integrated Ocean Drilling Program well data. The submarine landslide was evacuated on the continental slope and deposited in an ocean basin connected to the slope through a narrow moat. This particular character of the sea floor provides an opportunity to estimate the amount of strata remobilized by slope instability. The imaged volume of the studied landslide is 1035 ± 64 km3, 406 ± 28 km3 on the slope and 629 ± 36 km3 in the ocean basin. The volume of subseismic turbidites is 86 km3 (median value), and the volume of shear compaction is 100 km3, which are 8.6% and 9.7% of the landslide volume imaged on seismic data, respectively. This study highlights that the original volume of the failed sediments is significantly larger than that estimated using seismic and bathymetric data. Volume loss related to the generation of landslide-related turbidites and shear compaction must be considered when estimating the total volume of failed strata in the submarine realm.
NASA Astrophysics Data System (ADS)
Saide, P. E.; Steinhoff, D.; Kosovic, B.; Weil, J.; Smith, N.; Blewitt, D.; Delle Monache, L.
2017-12-01
There are a wide variety of methods that have been proposed and used to estimate methane emissions from oil and gas production by using air composition and meteorology observations in conjunction with dispersion models. Although there has been some verification of these methodologies using controlled releases and concurrent atmospheric measurements, it is difficult to assess the accuracy of these methods for more realistic scenarios considering factors such as terrain, emissions from multiple components within a well pad, and time-varying emissions representative of typical operations. In this work we use a large-eddy simulation (LES) to generate controlled but realistic synthetic observations, which can be used to test multiple source term estimation methods, also known as an Observing System Simulation Experiment (OSSE). The LES is based on idealized simulations of the Weather Research & Forecasting (WRF) model at 10 m horizontal grid-spacing covering an 8 km by 7 km domain with terrain representative of a region located in the Barnett shale. Well pads are setup in the domain following a realistic distribution and emissions are prescribed every second for the components of each well pad (e.g., chemical injection pump, pneumatics, compressor, tanks, and dehydrator) using a simulator driven by oil and gas production volume, composition and realistic operational conditions. The system is setup to allow assessments under different scenarios such as normal operations, during liquids unloading events, or during other prescribed operational upset events. Methane and meteorology model output are sampled following the specifications of the emission estimation methodologies and considering typical instrument uncertainties, resulting in realistic observations (see Figure 1). We will show the evaluation of several emission estimation methods including the EPA Other Test Method 33A and estimates using the EPA AERMOD regulatory model. We will also show source estimation results from advanced methods such as variational inverse modeling, and Bayesian inference and stochastic sampling techniques. Future directions including other types of observations, other hydrocarbons being considered, and assessment of additional emission estimation methods will be discussed.
The volume and mean depth of Earth's lakes
NASA Astrophysics Data System (ADS)
Cael, B. B.; Heathcote, A. J.; Seekell, D. A.
2017-01-01
Global lake volume estimates are scarce, highly variable, and poorly documented. We developed a rigorous method for estimating global lake depth and volume based on the Hurst coefficient of Earth's surface, which provides a mechanistic connection between lake area and volume. Volume-area scaling based on the Hurst coefficient is accurate and consistent when applied to lake data sets spanning diverse regions. We applied these relationships to a global lake area census to estimate global lake volume and depth. The volume of Earth's lakes is 199,000 km3 (95% confidence interval 196,000-202,000 km3). This volume is in the range of historical estimates (166,000-280,000 km3), but the overall mean depth of 41.8 m (95% CI 41.2-42.4 m) is significantly lower than previous estimates (62-151 m). These results highlight and constrain the relative scarcity of lake waters in the hydrosphere and have implications for the role of lakes in global biogeochemical cycles.
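The volume-area scaling behind such estimates can be sketched in a few lines: fit log V against log A for lakes with known bathymetry, then apply the fitted power law to an area census and sum. The synthetic data and coefficients below are placeholders, not the Hurst-based values of the study.

```python
import numpy as np

rng = np.random.default_rng(2)

# Lakes with known bathymetry: areas (m^2) and volumes (m^3), synthetic
area = 10.0 ** rng.uniform(4, 9, 200)
volume = 0.03 * area ** 1.25 * 10.0 ** (0.1 * rng.standard_normal(200))

k, logc = np.polyfit(np.log10(area), np.log10(volume), 1)   # log-log regression

census = 10.0 ** rng.uniform(4, 9, 10_000)      # areas of unmeasured lakes
total = np.sum(10.0 ** logc * census ** k)      # apply the fitted power law
print(f"fitted exponent {k:.2f}; summed volume ≈ {total / 1e9:,.0f} km^3")
```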
Exsanguinated blood volume estimation using fractal analysis of digital images.
Sant, Sonia P; Fairgrieve, Scott I
2012-05-01
The estimation of bloodstain volume using fractal analysis of digital images of passive blood stains is presented. Binary digital photos of bloodstains of known volumes (ranging from 1 to 7 mL), dispersed in a defined area, were subjected to image analysis using FracLac V. 2.0 for ImageJ. The box-counting method was used to generate a fractal dimension for each trial. A positive correlation between the generated fractal number and the volume of blood was found (R(2) = 0.99). Regression equations were produced to estimate the volume of blood in blind trials. An error rate ranging from 78% for 1 mL to 7% for 6 mL demonstrated that as the volume increases so does the accuracy of the volume estimation. This method used in the preliminary study proved that bloodstain patterns may be deconstructed into mathematical parameters, thus removing the subjective element inherent in other methods of volume estimation. © 2012 American Academy of Forensic Sciences.
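A minimal box-counting sketch of the kind of fractal-dimension measurement described above; the synthetic disc stands in for a binarized stain image, and in the study the resulting dimension would then be regressed against known volumes to build the calibration.

```python
import numpy as np

def box_count_dimension(img):
    """Box-counting fractal dimension of a binary image (sketch)."""
    n = min(img.shape)
    img = img[:n, :n]
    sizes = [s for s in (2, 4, 8, 16, 32, 64) if s < n]
    counts = []
    for s in sizes:
        m = (n // s) * s
        blocks = img[:m, :m].reshape(m // s, s, m // s, s)
        counts.append(blocks.any(axis=(1, 3)).sum())     # occupied boxes at scale s
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
    return slope

# Synthetic binary "stain": a filled disc standing in for a thresholded photo
yy, xx = np.mgrid[:256, :256]
stain = (xx - 128) ** 2 + (yy - 128) ** 2 < 60 ** 2
print(f"fractal dimension ≈ {box_count_dimension(stain):.2f}")
```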
Reevaluation of tephra volumes for the 1982 eruption of El Chichón volcano, Mexico
NASA Astrophysics Data System (ADS)
Nathenson, M.; Fierstein, J.
2012-12-01
In a recent numerical simulation of tephra transport and deposition for the 1982 eruption, Bonasia et al. (2012) used masses for the tephra layers (A-1, B, and C) based on the volume data of Carey and Sigurdsson (1986) calculated by the methodology of Rose et al. (1973). For reasons not clear, using the same methodology we obtained volumes for layers A-1 and B much less than those previously reported. For example, for layer A-1, Carey and Sigurdsson (1986) reported a volume of 0.60 km3, whereas we obtain a volume of 0.23 km3. Moreover, applying the more recent methodology of tephra-volume calculation (Pyle, 1989; Fierstein and Nathenson, 1992) and using the isopach maps in Carey and Sigurdsson (1986), we calculate a total tephra volume of 0.52 km3 (A-1, 0.135; B, 0.125; and C, 0.26 km3). In contrast, Carey and Sigurdsson (1986) report a much larger total volume of 2.19 km3. Such disagreement not only reflects the differing methodologies, but we propose that the volumes calculated with the methodology of Pyle and of Fierstein and Nathenson, involving the use of straight lines on a plot of log thickness versus square root of area, better represent the actual fall deposits. After measuring the areas for the isomass contours for the HAZMAP and FALL3D simulations in Bonasia et al. (2012), we applied the Pyle-Fierstein and Nathenson methodology to calculate the tephra masses deposited on the ground. These masses from five of the simulations range from 70% to 110% of those reported by Carey and Sigurdsson (1986), whereas that for layer B in the HAZMAP calculation is 160%. In the Bonasia et al. (2012) study, the mass erupted by the volcano is a critical input used in the simulation to produce an ash cloud that deposits tephra on the ground. Masses on the ground (as calculated by us) for five of the simulations range from 20% to 46% of the masses used as simulation inputs, whereas that for layer B in the HAZMAP calculation is 74%. It is not clear why the percentages are so variable, nor why the output volumes are such small percentages of the input erupted mass. From our volume calculations, the masses on the ground from the simulations are factors of 2.3 to 10 times what was actually deposited. Given this finding from our reevaluation of volumes, the simulations appear to overestimate the hazards from eruptions of the sizes that occurred at El Chichón. References: Bonasia, R., A. Costa, A. Folch, G. Macedonio, and L. Capra (2012), Numerical simulation of tephra transport and deposition of the 1982 El Chichón eruption and implications for hazard assessment, J. Volc. Geotherm. Res., 231-232, 39-49. Carey, S., and H. Sigurdsson (1986), The 1982 eruptions of El Chichón volcano, Mexico: Observations and numerical modelling of tephra-fall distribution, Bull. Volcanol., 48, 127-141. Fierstein, J., and M. Nathenson (1992), Another look at the calculation of fallout tephra volumes, Bull. Volcanol., 54, 156-167. Pyle, D.M. (1989), The thickness, volume and grainsize of tephra fall deposits, Bull. Volcanol., 51, 1-15. Rose, W.I., Jr., S. Bonis, R.E. Stoiber, M. Keller, and T. Bickford (1973), Studies of volcanic ash from two recent Central American eruptions, Bull. Volcanol., 37, 338-364.
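For readers wanting to reproduce the exponential-thinning calculation, a hedged sketch follows: fit a straight line to ln(thickness) against the square root of isopach area, T = T0 exp(-k sqrt(A)), and integrate to V = 2 T0/k² (the single-segment case of the Fierstein and Nathenson approach). The isopach numbers are illustrative, not the El Chichón data.

```python
import numpy as np

# Illustrative isopach data for one layer (not the El Chichon measurements):
thick_cm = np.array([50.0, 20.0, 10.0, 5.0, 1.0])          # isopach thicknesses
area_km2 = np.array([30.0, 150.0, 500.0, 1500.0, 9000.0])  # enclosed areas

# Straight line on ln(thickness) vs sqrt(area): T = T0 * exp(-k * sqrt(A))
slope, lnT0 = np.polyfit(np.sqrt(area_km2), np.log(thick_cm), 1)
k = -slope                                # thinning rate, 1/km
T0_km = np.exp(lnT0) * 1e-5               # cm -> km
volume_km3 = 2.0 * T0_km / k**2           # integral of T dA for a single segment
print(f"tephra volume ≈ {volume_km3:.2f} km^3")
```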
A New Approach for Deep Gray Matter Analysis Using Partial-Volume Estimation.
Bonnier, Guillaume; Kober, Tobias; Schluep, Myriam; Du Pasquier, Renaud; Krueger, Gunnar; Meuli, Reto; Granziera, Cristina; Roche, Alexis
2016-01-01
The existence of partial volume effects in brain MR images makes it challenging to understand physio-pathological alterations underlying signal changes due to pathology across groups of healthy subjects and patients. In this study, we implement a new approach to disentangle gray and white matter alterations in the thalamus and the basal ganglia. The proposed method was applied to a cohort of early multiple sclerosis (MS) patients and healthy subjects to evaluate tissue-specific alterations related to diffuse inflammatory or neurodegenerative processes. Forty-three relapsing-remitting MS patients and nineteen healthy controls underwent 3T MRI including: (i) fluid-attenuated inversion recovery, double inversion recovery, and magnetization-prepared gradient echo for lesion count, and (ii) T1 relaxometry. We applied a partial volume estimation algorithm to the T1 relaxometry maps to estimate local gray and white matter concentrations, as well as T1 values characteristic of gray and white matter, in the thalamus and the basal ganglia. Statistical tests were performed to compare groups in terms of global T1 values, tissue characteristic T1 values, and tissue concentrations. Significant increases in global T1 values were observed in the thalamus (p = 0.038) and the putamen (p = 0.026) in RRMS patients compared to HC. In the thalamus, the T1 increase was associated with a significant increase in gray matter characteristic T1 (p = 0.0016) with no significant effect in white matter. The presented methodology provides additional information to standard MR signal averaging approaches that holds promise to identify the presence and nature of diffuse pathology in neuro-inflammatory and neurodegenerative diseases.
NASA Astrophysics Data System (ADS)
Xu, Robert S.; Michailovich, Oleg V.; Solovey, Igor; Salama, Magdy M. A.
2010-03-01
Prostate specific antigen density is an established parameter for indicating the likelihood of prostate cancer. To this end, the size and volume of the gland have become pivotal quantities used by clinicians during the standard cancer screening process. As an alternative to manual palpation, an increasing number of volume estimation methods are based on imagery data of the prostate. The necessity to process large volumes of such data requires automatic segmentation algorithms, which can accurately and reliably identify the true prostate region. In particular, transrectal ultrasound (TRUS) imaging has become a standard means of assessing the prostate due to its safe nature and high benefit-to-cost ratio. Unfortunately, modern TRUS images are still plagued by many ultrasound imaging artifacts such as speckle noise and shadowing, which result in relatively low contrast and reduced SNR of the acquired images. Consequently, many modern segmentation methods incorporate prior knowledge about the prostate geometry to enhance traditional segmentation techniques. In this paper, a novel approach to the problem of TRUS segmentation, particularly the definition of the prostate shape prior, is presented. The proposed approach is based on the concept of distribution tracking, which provides a unified framework for tracking both photometric and morphological features of the prostate. In particular, the tracking of morphological features defines a novel type of "weak" shape priors. The latter acts as a regularization force, which minimally biases the segmentation procedure while rendering the final estimate stable and robust. The value of the proposed methodology is demonstrated in a series of experiments.
Hazardous waste management and weight-based indicators--the case of Haifa Metropolis.
Elimelech, E; Ayalon, O; Flicstein, B
2011-01-30
The quantity control of hazardous waste in Israel relies primarily on the Environmental Services Company (ESC) reports. With limited management tools, the Ministry of Environmental Protection (MoEP) has no applicable methodology to confirm or monitor the actual amounts of hazardous waste produced by various industrial sectors. The main goal of this research was to develop a method for estimating the amounts of hazardous waste produced by various sectors. In order to achieve this goal, sector-specific indicators were tested on three hazardous waste producing sectors in the Haifa Metropolis: petroleum refineries, dry cleaners, and public hospitals. The findings reveal poor practice of hazardous waste management in the dry cleaning sector and in the public hospitals sector. Large discrepancies were found in the dry cleaning sector, between the quantities of hazardous waste reported and the corresponding indicator estimates. Furthermore, a lack of documentation on hospitals' pharmaceutical and chemical waste production volume was observed. Only in the case of petroleum refineries, the reported amount was consistent with the estimate. Copyright © 2010 Elsevier B.V. All rights reserved.
43 CFR 11.83 - Damage determination phase-use value methodologies.
Code of Federal Regulations, 2010 CFR
2010-10-01
... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...
43 CFR 11.83 - Damage determination phase-use value methodologies.
Code of Federal Regulations, 2013 CFR
2013-10-01
... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...
43 CFR 11.83 - Damage determination phase-use value methodologies.
Code of Federal Regulations, 2014 CFR
2014-10-01
... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...
43 CFR 11.83 - Damage determination phase-use value methodologies.
Code of Federal Regulations, 2012 CFR
2012-10-01
... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...
43 CFR 11.83 - Damage determination phase-use value methodologies.
Code of Federal Regulations, 2011 CFR
2011-10-01
... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...
Houseknecht, D.W.; Bird, K.J.; Schuenemeyer, J.H.; Attanasi, E.D.; Garrity, C.P.; Schenk, C.J.; Charpentier, R.R.; Pollastro, R.M.; Cook, T.A.; and Klett, T.R.
2010-01-01
Using a geology-based assessment methodology, the U.S. Geological Survey estimated mean volumes of 896 million barrels of oil (MMBO) and about 53 trillion cubic feet (TCFG) of nonassociated natural gas in conventional, undiscovered accumulations within the National Petroleum Reserve in Alaska and adjacent State waters. The estimated volume of undiscovered oil is significantly lower than estimates released in 2002, owing primarily to recent exploration drilling that revealed an abrupt transition from oil to gas and reduced reservoir quality in the Alpine sandstone 15-20 miles west of the giant Alpine oil field. The National Petroleum Reserve in Alaska (NPRA) has been the focus of oil exploration during the past decade, stimulated by the mid-1990s discovery of the adjacent Alpine field, the largest onshore oil discovery in the United States during the past 25 years. Recent activities in NPRA, including extensive 3-D seismic surveys, six Federal lease sales totaling more than $250 million in bonus bids, and completion of more than 30 exploration wells on Federal and Native lands, indicate more gas than oil in key formations and poorer reservoir quality than anticipated. In the absence of a gas pipeline from northern Alaska, exploration has waned and several petroleum companies have relinquished assets in the NPRA. This fact sheet updates U.S. Geological Survey (USGS) estimates of undiscovered oil and gas in NPRA, based on publicly released information from exploration wells completed during the past decade and on the results of research that documents significant Cenozoic uplift and erosion in NPRA. The results included in this fact sheet, released in October 2010, supersede those of a previous assessment completed by the USGS in 2002.
DOT National Transportation Integrated Search
1995-01-01
This report describes the development of a methodology designed to assure that a sufficiently high level of safety is achieved and maintained in computer-based systems which perform safety critical functions in high-speed rail or magnetic levitation ...
DOT National Transportation Integrated Search
1995-09-01
This report describes the development of a methodology designed to assure that a sufficiently high level of safety is achieved and maintained in computer-based systems which perform safety critical functions in high-speed rail or magnetic levitation ...
Trabant, Dennis C.
1999-01-01
The volume of four of the largest glaciers on Iliamna Volcano was estimated using the volume model developed for evaluating glacier volumes on Redoubt Volcano. The volume model is controlled by simulated valley cross sections that are constructed by fitting third-order polynomials to the shape of the valley walls exposed above the glacier surface. Critical cross sections were field checked by sounding with ice-penetrating radar during July 1998. The estimated volumes of perennial snow and glacier ice for Tuxedni, Lateral, Red, and Umbrella Glaciers are 8.6, 0.85, 4.7, and 0.60 cubic kilometers, respectively. The estimated volume of snow and ice on the upper 1,000 meters of the volcano is about 1 cubic kilometer. The volume estimates are thought to have errors of no more than ±25 percent. The volumes estimated for the four largest glaciers are more than three times the total volume of snow and ice on Mount Rainier and about 82 times the total volume of snow and ice that was on Mount St. Helens before its May 18, 1980 eruption. Volcanoes mantled by substantial snow and ice covers have produced the largest and most catastrophic lahars and floods. Therefore, it is prudent to expect that, during an eruptive episode, flooding and lahars threaten all of the drainages heading on Iliamna Volcano. On the other hand, debris avalanches can happen any time. Fortunately, their influence is generally limited to the area within a few kilometers of the summit.
ERIC Educational Resources Information Center
Construction Systems Management, Inc., Anchorage, AK.
Volume II of a 3-volume report demonstrates the use of Design Determinants and Options (presented in Volume I) in the planning and design of small rural Alaskan secondary schools. Section I, a checklist for gathering site-specific information to be used as a data base for facility design, is organized in the same format as Volume I, which can be…
Glacier volume estimation of Cascade Volcanoes—an analysis and comparison with other methods
Driedger, Carolyn L.; Kennard, P.M.
1986-01-01
During the 1980 eruption of Mount St. Helens, the occurrence of floods and mudflows made apparent a need to assess mudflow hazards on other Cascade volcanoes. A basic requirement for such analysis is information about the volume and distribution of snow and ice on these volcanoes. An analysis was made of the volume-estimation methods developed by previous authors and a volume estimation method was developed for use in the Cascade Range. A radio echo-sounder, carried in a backpack, was used to make point measurements of ice thickness on major glaciers of four Cascade volcanoes (Mount Rainier, Washington; Mount Hood and the Three Sisters, Oregon; and Mount Shasta, California). These data were used to generate ice-thickness maps and bedrock topographic maps for developing and testing volume-estimation methods. Subsequently, the methods were applied to the unmeasured glaciers on those mountains and, as a test of the geographical extent of applicability, to glaciers beyond the Cascades having measured volumes. Two empirical relationships were required in order to predict volumes for all the glaciers. Generally, for glaciers less than 2.6 km in length, volume was found to be estimated best by using glacier area, raised to a power. For longer glaciers, volume was found to be estimated best by using a power law relationship, including slope and shear stress. The necessary variables can be estimated from topographic maps and aerial photographs.
Fast left ventricle tracking in CMR images using localized anatomical affine optical flow
NASA Astrophysics Data System (ADS)
Queirós, Sandro; Vilaça, João L.; Morais, Pedro; Fonseca, Jaime C.; D'hooge, Jan; Barbosa, Daniel
2015-03-01
In daily cardiology practice, assessment of left ventricular (LV) global function using non-invasive imaging remains central for the diagnosis and follow-up of patients with cardiovascular diseases. Despite the different methodologies currently accessible for LV segmentation in cardiac magnetic resonance (CMR) images, a fast and complete LV delineation is still limitedly available for routine use. In this study, a localized anatomically constrained affine optical flow method is proposed for fast and automatic LV tracking throughout the full cardiac cycle in short-axis CMR images. Starting from an automatically delineated LV in the end-diastolic frame, the endocardial and epicardial boundaries are propagated by estimating the motion between adjacent cardiac phases using optical flow. In order to reduce the computational burden, the motion is only estimated in an anatomical region of interest around the tracked boundaries and subsequently integrated into a local affine motion model. Such localized estimation enables capturing complex motion patterns while still being spatially consistent. The method was validated on 45 CMR datasets taken from the 2009 MICCAI LV segmentation challenge. The proposed approach proved to be robust and efficient, with an average distance error of 2.1 mm and a correlation with reference ejection fraction of 0.98 (1.9 +/- 4.5%). Moreover, it proved to be fast, taking 5 seconds for the tracking of a full 4D dataset (30 ms per image). Overall, a novel fast, robust and accurate LV tracking methodology was proposed, enabling accurate assessment of relevant global function cardiac indices, such as volumes and ejection fraction.
NASA Astrophysics Data System (ADS)
Raju, Subramanian; Saibaba, Saroja
2016-09-01
The enthalpy of formation ΔH°f is an important thermodynamic quantity, which sheds significant light on fundamental cohesive and structural characteristics of an alloy. However, being a difficult one to determine accurately through experiments, simple estimation procedures are often desirable. In the present study, a modified prescription for estimating ΔH°f of liquid transition metal alloys is outlined, based on the Macroscopic Atom Model of cohesion. This prescription relies on self-consistent estimation of liquid-specific model parameters, namely electronegativity (φL) and bonding electron density (nbL). Such unique identification is made through the use of well-established relationships connecting surface tension, compressibility, and molar volume of a metallic liquid with bonding charge density. The electronegativity is obtained through a consistent linear scaling procedure. The preliminary set of values for φL and nbL, together with other auxiliary model parameters, is subsequently optimized to obtain good numerical agreement between calculated and experimental values of ΔH°f for sixty liquid transition metal alloys. It is found that, with few exceptions, the use of liquid-specific model parameters in the Macroscopic Atom Model yields a physically consistent methodology for reliable estimation of mixing enthalpies of liquid alloys.
A field test of cut-off importance sampling for bole volume
Jeffrey H. Gove; Harry T. Valentine; Michael J. Holmes
2000-01-01
Cut-off importance sampling has recently been introduced as a technique for estimating bole volume to some point below the tree tip, termed the cut-off point. A field test of this technique was conducted on a small population of eastern white pine trees using dendrometry as the standard for volume estimation. Results showed that the differences in volume estimates...
NASA Astrophysics Data System (ADS)
Rodrigues, Lineu; Senzanje, Aidan; Cecchi, Philippe; Liebe, Jens
2010-05-01
People living in areas with highly variable rainfall, experience droughts and floods and often have insecure livelihoods. Small multi-purpose reservoirs (SR) are a widely used form of infrastructures to provide people in such areas with water during the dry season, e.g. in the basins of São Francisco, Brazil, Limpopo, Zimbabwe, Bandama, Ivory Coast and Volta, Ghana. In these areas, the available natural flow in the streams is sometimes less than the flow required for water supply or irrigation, however water can be stored in times of surplus, for example, from a wet season to a dry season. Efficient water management and sound reservoir planning are hindered by the lack of information about the functioning of these reservoirs. Reservoirs in these regions were constructed in a series of projects funded by different agencies, at different times, with little or no coordination among the implementing partners. Poor record keeping and the lack of appropriate institutional support result in deficiencies of information on the capacity, operation, and maintenance of these structures. Estimating the storage capacity of dams is essential to the responsible management of water diversion. Most of SR in these basins have never been evaluated, possibly because the tools currently used for such measurement are labor-intensive, costly and time-consuming. The objective of this research was to develop methodology to estimate small reservoir capacities as a function of their remotely sensed surface areas in the São Francisco, Limpopo, Bandama and Volta basins, as a way to contribute to improve the water resource management in those catchments. Remote sensing was used to identify, localize and characterize small reservoirs. The surface area of each was calculated from satellite images. A sub-set of reservoirs was selected. For each reservoir in the sub-set, the surface area was estimated from field surveys, and storage capacity was estimated using information on reservoir surface area, depth and shape. Depth was measured using a stadia rod or a manual echosounder. For reservoirs in the sub-set, estimated surface area was used as an input into the triangulated irregular network model. With the surface area and depth, measured volume was calculated. Comparisons were made between estimates of surface area from field surveys and estimates of surface area from remote sensing. A linear regression analysis was carried out to establish the relationship between surface area and storage capacities. Within geomorphologically homogenous regions, one may expect a good correlation between the surface area, which may be determined through satellite observations, and the stored volume. Such a relation depends on the general shape of the slopes (convex, through straight, to concave). The power relationships between remotely sensed surface areas (m^2) and storage capacities of reservoirs (m^3) obtained were - Limpopo basin (Lower Mzingwane sub-catchment): Volume = 0.023083 x Area^1.3272 (R2 = 95%); Bandama basin (North of the basin in Ivory Coast): Volume = 0.00405 x Area^1.4953 (R2 = 88.9%); Volta basin (Upper East region of the Volta Basin in Ghana): Volume = 0.00857 × Area^1.43 (R2 = 97.5%); São Francisco basin (Preto river sub-catchment): Volume = 0.2643 x Area^1.1632 (R2 = 92.1%). Remote sensing was found to be a suitable means to detect small reservoirs and accurately measure their surface areas. 
The general relationship between measured reservoir volumes and their remotely sensed surface areas showed good accuracy for all four basins. Combining such relationships with periodic satellite-based measurements of reservoir area may give hydrologists and planners a clear picture of the water resource system in these basins, especially in ungauged sub-basins.
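As a minimal illustration of how such relations can be applied, the sketch below (Python) evaluates the basin-specific power laws reported above; the coefficients are copied from the abstract, while the function name and the example area are ours.

```python
# Coefficients (a, b) of V = a * A**b from the abstract; A in m^2, V in m^3.
COEFFS = {
    "Limpopo (Lower Mzingwane)": (0.023083, 1.3272),
    "Bandama (north)":           (0.00405,  1.4953),
    "Volta (Upper East)":        (0.00857,  1.43),
    "Sao Francisco (Preto)":     (0.2643,   1.1632),
}

def storage_from_area(area_m2: float, basin: str) -> float:
    """Estimate reservoir storage (m^3) from a remotely sensed surface area (m^2)."""
    a, b = COEFFS[basin]
    return a * area_m2 ** b

# Example: a 5-hectare (50,000 m^2) reservoir in the Upper East region of the Volta basin.
print(round(storage_from_area(50_000, "Volta (Upper East)")))   # ~45,000 m^3
```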
Methodologies for Estimating Cumulative Human Exposures to Current-Use Pyrethroid Pesticides
We estimated cumulative residential pesticide exposures for a group of nine young children (4–6 years) using three different methodologies developed by the US Environmental Protection Agency and compared the results with estimates derived from measured urinary metabolite concentr...
CONCEPTUAL DESIGNS FOR A NEW HIGHWAY VEHICLE EMISSIONS ESTIMATION METHODOLOGY
The report discusses six conceptual designs for a new highway vehicle emissions estimation methodology and summarizes the recommendations of each design for improving the emissions and activity factors in the emissions estimation process. The complete design reports are included a...
Elia, Antonio; Conversa, Giulia
2015-01-01
Reduced water availability and environmental pollution caused by nitrogen (N) losses have increased the need for rational management of irrigation and N fertilization in horticultural systems. Decision support systems (DSS) can be powerful tools to assist farmers in improving irrigation and N fertilization efficiency. Currently, fertilization through drip irrigation systems (fertigation) is used for many vegetable crops around the world. The paper illustrates the theoretical basis, the methodological approach and the structure of a DSS called GesCoN for fertigation management in open-field vegetable crops. The DSS is based on a daily water and N balance, treating the water lost by evapotranspiration (ET) and the N content in the aerial part of the crop (N uptake) as subtractions, and the availability of water and N in the wet soil volume most affected by roots as the positive terms. For the water balance, reference ET can be estimated using the Penman-Monteith (PM) model or the Priestley-Taylor and Hargreaves models, specifically calibrated under local conditions. Either the single or the dual Kc approach can be used to calculate crop ET. Rain runoff and deep percolation are considered in calculating the effective rainfall. The soil volume most affected by the roots, the wet soil under the emitters, and their interactions are modeled. Crop growth is modeled by a non-linear logistic function on the basis of thermal time, but the model takes into account thermal and water stresses and allows in-season calibration through a dynamic adaptation of the growth rate to the specific genetic and environmental conditions. N crop demand is related to DM accumulation by the N critical curve. N mineralization from soil organic matter is estimated daily. The DSS helps users evaluate the daily amounts of water and N fertilizer that must be applied to fulfill the water and N crop requirements and achieve the maximum potential yield, while reducing the risk of nitrate outflows.
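The daily-balance idea lends itself to a compact sketch. The Python fragment below is a heavily simplified, hypothetical version of such a balance, assuming a single-Kc crop ET term and a fixed refill target for the wetted soil volume; GesCoN itself additionally models runoff, percolation, the wetted-bulb geometry, crop growth and N dynamics.

```python
def daily_water_balance(stored_mm, et0_mm, kc, eff_rain_mm, target_mm):
    """One day of a toy water balance; all quantities in mm of water."""
    etc = kc * et0_mm                              # crop ET (single-Kc approach)
    stored = stored_mm + eff_rain_mm - etc         # effective rain adds, ETc subtracts
    irrigation = max(0.0, target_mm - stored)      # refill the wetted volume to target
    return stored + irrigation, irrigation

storage = 60.0                                     # assumed plant-available water (mm)
for et0, rain in [(4.8, 0.0), (5.1, 2.0), (4.5, 0.0)]:
    storage, dose = daily_water_balance(storage, et0, kc=0.9,
                                        eff_rain_mm=rain, target_mm=60.0)
    print(f"irrigate {dose:.1f} mm")
```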
DWI filtering using joint information for DTI and HARDI.
Tristán-Vega, Antonio; Aja-Fernández, Santiago
2010-04-01
The filtering of Diffusion Weighted Images (DWI) prior to the estimation of the diffusion tensor or other fiber Orientation Distribution Functions (ODF) has been shown to be of paramount importance in the recent literature. More precisely, it has been shown that estimating the diffusion tensor without a previous filtering stage induces errors which cannot be recovered by further regularization of the tensor field. A number of approaches have been proposed to overcome this problem, most of them based on restoring each DWI gradient image separately. In this paper we propose a methodology that takes advantage of the joint information in the DWI volumes, i.e., the sum of the information given by all DWI channels plus the correlations between them. This way, all the gradient images are filtered together, exploiting the first- and second-order information they share. We adapt this methodology to two filters, namely the Linear Minimum Mean Squared Error (LMMSE) and the Unbiased Non-Local Means (UNLM). These new filters are tested over a wide variety of synthetic and real data, demonstrating the benefit of the new approach, especially for High Angular Resolution Diffusion Imaging (HARDI). Among the techniques presented, the joint LMMSE proves to be a very attractive approach, since it shows an accuracy similar to UNLM (or even better in some situations) with a much lighter computational load. Copyright 2009 Elsevier B.V. All rights reserved.
A radiographic method to estimate lung volume and its use in small mammals.
Canals, Mauricio; Olivares, Ricardo; Rosenmann, Mario
2005-01-01
In this paper we develop a method to estimate lung volume using chest x-rays of small mammals, and we apply it to assess the lung volume of several rodents. We show that a good estimator of the lung volume is V*L = 0.496 × VRX ≈ ½ VRX, where VRX is a measurement obtained from the x-ray that represents the volume of a rectangular box containing the lungs and mediastinum organs. The proposed formula may be interpreted as the volume of an ellipsoid formed by both lungs joined at their bases. When this relationship was used to estimate lung volume, values similar to those expected from allometric relationships were found in four rodents. In two others, M. musculus and R. norvegicus, lung volume was similar to reported data, although values were lower than expected.
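A minimal sketch of the reported estimator, assuming VRX is obtained as the product of the three box dimensions measured on orthogonal chest films (our reading of the abstract):

```python
def lung_volume_cm3(width_cm: float, height_cm: float, depth_cm: float) -> float:
    """V*L = 0.496 x VRX, with VRX the radiographic bounding box of the lungs."""
    v_rx = width_cm * height_cm * depth_cm   # box volume from orthogonal films
    return 0.496 * v_rx                      # ~ half the box, per the abstract
```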
Cihan, Abdullah; Birkholzer, Jens; Bianchi, Marco
2014-12-31
Large-scale pressure increases resulting from carbon dioxide (CO2) injection in the subsurface can potentially impact caprock integrity, induce reactivation of critically stressed faults, and drive CO2 or brine through conductive features into shallow groundwater. Pressure management involving the extraction of native fluids from storage formations can be used to minimize pressure increases while maximizing CO2 storage. However, brine extraction requires pumping, transportation, possibly treatment, and disposal of substantial volumes of extracted brackish or saline water, all of which can be technically challenging and expensive. This paper describes a constrained differential evolution (CDE) algorithm for optimal well placement and injection/extraction control with the goal of minimizing brine extraction while achieving predefined pressure constraints. The CDE methodology was tested on a simple optimization problem whose solution can be partially obtained with a gradient-based optimization methodology. The CDE successfully estimated the true global optimum for both the extraction well location and the extraction rate needed for the test problem. A more complex example application of the developed strategy was also presented for a hypothetical CO2 storage scenario in a heterogeneous reservoir containing a critically stressed fault near an injection zone. Through the CDE optimization algorithm coupled to a numerical vertically averaged reservoir model, we successfully estimated optimal rates and locations for CO2 injection and brine extraction wells while simultaneously satisfying multiple pressure buildup constraints to avoid fault activation and caprock fracturing. The study shows that the CDE methodology is a very promising tool for solving other optimization problems related to geologic CO2 storage as well, such as reducing the 'Area of Review', designing monitoring, reducing the risk of leakage, and increasing storage capacity and trapping.
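The authors' CDE code is not reproduced here; the sketch below illustrates the same idea, constrained differential evolution via a penalty term, using SciPy's stock optimizer and an invented one-line pressure surrogate in place of the reservoir model. The fault location, pressure limit, and surrogate coefficients are all assumptions for illustration.

```python
import numpy as np
from scipy.optimize import differential_evolution

P_MAX = 0.5   # allowed pressure buildup at the fault (MPa, assumed)

def pressure_buildup(x):
    """Stand-in surrogate, not a reservoir model: a fixed buildup from CO2
    injection, relieved in proportion to the brine extraction rate x[2] and
    inversely with the well's distance to the fault at (5, 5) km."""
    dist = np.hypot(x[0] - 5.0, x[1] - 5.0) + 0.1
    return 1.2 - 0.05 * x[2] / dist

def objective(x):
    penalty = 1e3 * max(0.0, pressure_buildup(x) - P_MAX) ** 2
    return x[2] + penalty          # minimize extraction subject to the constraint

bounds = [(0.0, 10.0), (0.0, 10.0), (0.0, 50.0)]   # well x, y (km) and rate
result = differential_evolution(objective, bounds, seed=1)
print(result.x, result.fun)
```

As in the paper's test problem, the optimizer trades off well placement against extraction rate: placing the well close to the fault lets the pressure constraint be met with the least brine extracted.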
Kolokotroni, Eleni; Dionysiou, Dimitra; Veith, Christian; Kim, Yoo-Jin; Franz, Astrid; Grgic, Aleksandar; Bohle, Rainer M.; Stamatakos, Georgios
2016-01-01
The 5-year survival of non-small cell lung cancer patients can be as low as 1% in advanced stages. For patients with resectable disease, the successful choice of preoperative chemotherapy is critical to eliminate micrometastasis and improve operability. In silico experimentation can suggest the optimal treatment protocol for each patient based on their own multiscale data. A determinant of reliable predictions is the a priori estimation of the drugs' cytotoxic efficacy on cancer cells for a given treatment. In the present work a mechanistic model of cancer response to treatment is applied for the estimation of a plausible value range for the cell-killing efficacy of various cisplatin-based doublet regimens. Among others, the model incorporates the cancer-related mechanisms of uncontrolled proliferation, population heterogeneity, hypoxia and treatment resistance. The methodology is based on the provision of tumor volumetric data at two time points, before and after or during treatment. It takes into account the effect of the tumor microenvironment and cell repopulation on treatment outcome. A thorough sensitivity analysis based on one-factor-at-a-time and Latin hypercube sampling/partial rank correlation coefficient approaches has established the volume growth rate and the growth fraction at diagnosis as key features for more accurate estimates. The methodology is applied to the retrospective data of thirteen patients with non-small cell lung cancer who received cisplatin in combination with gemcitabine, vinorelbine or docetaxel in the neoadjuvant context. The selection of model input values has been guided by a comprehensive literature survey on cancer-specific proliferation kinetics. Latin hypercube sampling has been employed to compensate for patient-specific uncertainties. In conclusion, the present work provides a quantitative framework for the estimation of the in vivo cell-killing ability of various chemotherapies. Correlation studies of such estimates with the molecular profiles of patients could serve as a basis for reliable personalized predictions. PMID:27657742
Peláez-Coca, M. D.; Orini, M.; Lázaro, J.; Bailón, R.; Gil, E.
2013-01-01
A methodology that combines information from several nonstationary biological signals is presented. The methodology is based on time-frequency coherence, which quantifies the similarity of two signals in the time-frequency domain. A cross time-frequency analysis method, based on quadratic time-frequency distributions, has been used to combine information from several nonstationary biomedical signals. To evaluate this methodology, the respiratory rate is estimated from the photoplethysmographic (PPG) signal. Respiration provokes simultaneous changes in the pulse interval, amplitude, and width of the PPG signal, which suggests that combining information from these sources will improve the accuracy of the respiratory rate estimate. A further aim of this paper is to implement an algorithm that provides robust estimation; therefore, the respiratory rate was estimated only in those intervals where the features extracted from the PPG signal are linearly coupled. In 38 spontaneously breathing subjects, among whom 7 were characterized by a respiratory rate lower than 0.15 Hz, this methodology provided accurate estimates, with a median error of {0.00; 0.98} mHz ({0.00; 0.31}%) and an interquartile range error of {4.88; 6.59} mHz ({1.60; 1.92}%). The estimation error of the presented methodology was substantially lower than the estimation error obtained without combining the different PPG features related to respiration. PMID:24363777
NASA Technical Reports Server (NTRS)
Baker, T. C. (Principal Investigator)
1982-01-01
A general methodology is presented for estimating a stratum's at-harvest crop acreage proportion for a given crop year (target year) from the crop's estimated acreage proportion for sample segments from within the stratum. Sample segments from crop years other than the target year are (usually) required for use in conjunction with those from the target year. In addition, the stratum's (identifiable) crop acreage proportion may be estimated for times other than at-harvest in some situations. A by-product of the procedure is a methodology for estimating the change in the stratum's at-harvest crop acreage proportion from crop year to crop year. An implementation of the proposed procedure as a statistical analysis system routine using the system's matrix language module, PROC MATRIX, is described and documented. Three examples illustrating use of the methodology and algorithm are provided.
Reducing Contingency through Sampling at the Luckey FUSRAP Site - 13186
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frothingham, David; Barker, Michelle; Buechi, Steve
2013-07-01
Typically, the greatest risk in developing accurate cost estimates for the remediation of hazardous, toxic, and radioactive waste sites is the uncertainty in the estimated volume of contaminated media requiring remediation. Efforts to address this risk in the remediation cost estimate can result in large cost contingencies that are often considered unacceptable when budgeting for site cleanups. Such was the case for the Luckey Formerly Utilized Sites Remedial Action Program (FUSRAP) site near Luckey, Ohio, which had significant uncertainty surrounding the estimated volume of site soils contaminated with radium, uranium, thorium, beryllium, and lead. Funding provided by the American Recovery and Reinvestment Act (ARRA) allowed the U.S. Army Corps of Engineers (USACE) to conduct additional environmental sampling and analysis at the Luckey Site between November 2009 and April 2010, with the objective of further delineating the horizontal and vertical extent of contaminated soils in order to reduce the uncertainty in the soil volume estimate. Investigative work included radiological, geophysical, and topographic field surveys, subsurface borings, and soil sampling. Results from the investigative sampling were used in conjunction with Argonne National Laboratory's Bayesian Approaches for Adaptive Spatial Sampling (BAASS) software to update the contaminated soil volume estimate for the site. This updated volume estimate was then used to update the project cost-to-complete estimate using the USACE Cost and Schedule Risk Analysis process, which develops cost contingencies based on project risks. An investment of $1.1 M of ARRA funds for additional investigative work resulted in a reduction of 135,000 in-situ cubic meters (177,000 in-situ cubic yards) in the estimated base soil volume. This refinement of the estimated soil volume resulted in a $64.3 M reduction in the estimated project cost-to-complete, through a reduction in the uncertainty in the contaminated soil volume estimate and the associated contingency costs. (authors)
Okariz, Ana; Guraya, Teresa; Iturrondobeitia, Maider; Ibarretxe, Julen
2017-02-01
The SIRT (Simultaneous Iterative Reconstruction Technique) algorithm is commonly used in electron tomography to calculate the original volume of the sample from noisy images, but the results of this iterative procedure are strongly dependent on the specific implementation of the algorithm, as well as on the number of iterations employed for the reconstruction. In this work, a methodology for selecting the iteration number of the SIRT reconstruction that provides the most accurate segmentation is proposed. The methodology is based on the statistical analysis of the intensity profiles at the edges of the objects in the reconstructed volume. A phantom resembling a carbon black aggregate was created to validate the methodology, and the SIRT implementations of two free software packages (TOMOJ and TOMO3D) were used. Copyright © 2016 Elsevier B.V. All rights reserved.
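For reference, a minimal dense-matrix SIRT iteration in its generic textbook form (not the TOMOJ or TOMO3D implementations, which use sparse projectors) is sketched below; per the study, the choice of n_iters strongly affects how accurately the reconstruction can be segmented.

```python
import numpy as np

def sirt(A, b, n_iters):
    """x_{k+1} = x_k + C A^T R (b - A x_k); R, C = inverse row/column sums of A.
    A: (n_rays, n_voxels) nonnegative projection matrix, b: measured projections."""
    R = 1.0 / np.maximum(A.sum(axis=1), 1e-12)
    C = 1.0 / np.maximum(A.sum(axis=0), 1e-12)
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        x += C * (A.T @ (R * (b - A @ x)))
    return x
```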
NASA Astrophysics Data System (ADS)
Rosenberg, Eliott N.; Head, James W., III
2015-11-01
Our goal is to quantify the cumulative water volume that was required to carve the Late Noachian valley networks on Mars. We employ an improved methodology in which fluid/sediment flux ratios are based on empirical data, not assumed. We use a large quantity of data from terrestrial rivers to assess the variability of actual fluid/sediment flux ratios. We find the flow depth by using an empirical relationship to estimate the fluid flux from the estimated channel width, and then using estimated grain sizes (theoretical sediment grain size predictions and comparison with observations by the Curiosity rover) to find the flow depth to which the resulting fluid flux corresponds. Assuming that the valley networks contained alluvial bed rivers, we find, from their current slopes and widths, that the onset of suspended transport occurs near the sand-gravel boundary. Thus, any bed sediment must have been fine gravel or coarser, whereas finer sediment would have been carried downstream. Subsequent to the cessation of fluvial activity, aeolian processes have partially redistributed fine-grained particles in the valleys, often forming dunes. It seems likely that the dominant bed sediment size was near the threshold for suspension, and assuming that this was the case could make our final results underestimates, which is the same tendency as our other assumptions. Making this assumption, we find a global equivalent layer (GEL) of 3-100 m of water to be the most probable cumulative volume that passed through the valley networks. This value is similar to the ∼34 m water GEL currently on the surface and in the near-surface in the form of ice. Note that the amount of water required to carve the valley networks could represent the same water recycled through a surface valley network hydrological system many times in separate or continuous precipitation/runoff/collection/evaporation/precipitation cycles.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moore, B; Brady, S; Kaufman, R
Purpose: To investigate the correlation of SSDE with organ dose in a pediatric population. Methods: Four anthropomorphic phantoms, representing a range of pediatric body habitus, were scanned with MOSFET dosimeters placed at 23 organ locations to determine absolute organ dosimetry. Phantom organ dosimetry was divided by phantom SSDE to determine the correlation between organ dose and SSDE. Correlation factors were then multiplied by patient SSDE to estimate patient organ dose. Patient demographics consisted of 352 chest and 241 abdominopelvic CT examinations, with a mean weight of 22 ± 15 kg (range 5-55 kg) and a mean age of 6 ± 5 years (range 4 months to 23 years). Patient organ dose estimates were compared to published pediatric Monte Carlo study results. Results: Phantom effective diameters were matched with patient population effective diameters to within 4 cm. 23 organ correlation factors were determined in the chest and abdominopelvic regions across nine pediatric weight subcategories. For organs fully covered by the scan volume, correlation in the chest (average 1.1; range 0.7-1.4) and abdominopelvic (average 0.9; range 0.7-1.3) regions was near unity. For organs that extended beyond the scan volume (i.e., skin, bone marrow, and bone surface), correlation was poor (average 0.3; range 0.1-0.4) in both the chest and abdominopelvic regions. Pediatric organ dosimetry was compared to published values and was found to agree in the chest to better than an average of 5% (27.6/26.2) and in the abdominopelvic region to better than 2% (73.4/75.0). Conclusion: The average correlation of SSDE and organ dosimetry was found to be better than ±10% for organs fully covered by the scan volume. This study provides a list of organ dose correlation factors for the chest and abdominopelvic regions, and describes a simple methodology to estimate individual pediatric patient organ dose based on patient SSDE.
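The estimation step described in the conclusion reduces to a lookup and a multiplication; the sketch below uses invented placeholder factors, not the study's published table.

```python
# Hypothetical chest correlation factors, NOT the study's published values.
CHEST_FACTORS = {"lung": 1.1, "thymus": 1.2, "skin": 0.3}

def organ_dose_mgy(ssde_mgy: float, organ: str) -> float:
    """Organ dose estimate: phantom-derived correlation factor x patient SSDE."""
    return CHEST_FACTORS[organ] * ssde_mgy

print(organ_dose_mgy(8.0, "lung"))   # 8.8 mGy for a patient SSDE of 8 mGy
```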
Acer, N; Bayar, B; Basaloglu, H; Oner, E; Bayar, K; Sankur, S
2008-11-20
The size and shape of the tarsal bones are especially relevant when considering some orthopedic diseases such as clubfoot. For this reason, the measurements of the tarsal bones have been the subject of many studies, none of which has used stereological methods to estimate volume. In the present stereological study, we estimated the volume of the calcaneal bone in normal feet and in dry bones. We used a combination of the Cavalieri principle and computed tomographic scans taken from eight males, together with nine dry calcanei, to estimate the volumes of calcaneal bones. The mean volume of the dry calcaneal bones was estimated at 49.11 ± 10.7 cm³ using the point-counting method and 48.22 ± 11.92 cm³ using the Archimedes principle. A positive correlation was found between anthropometric measurements and the volume of the calcaneal bones. The findings of the present study using stereological methods could provide data for the evaluation of normal and pathological volumes of calcaneal bones.
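A minimal sketch of the Cavalieri estimate used here: volume is section spacing times area-per-grid-point times the total point count over all sections. The counts and dimensions in the example are made up.

```python
def cavalieri_volume(points_per_section, spacing_cm, area_per_point_cm2):
    """V = t x a_p x sum(P): section spacing x area per grid point x point count."""
    return spacing_cm * area_per_point_cm2 * sum(points_per_section)

# made-up counts on consecutive CT sections through a calcaneus
print(cavalieri_volume([12, 25, 31, 28, 17, 6], spacing_cm=0.5, area_per_point_cm2=0.7))
```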
Pacilio, M; Basile, C; Shcherbinin, S; Caselli, F; Ventroni, G; Aragno, D; Mango, L; Santini, E
2011-06-01
Positron emission tomography (PET) and single-photon emission computed tomography (SPECT) imaging play an important role in the segmentation of functioning parts of organs or tumours, but accurate and reproducible delineation is still a challenging task. In this work, an innovative iterative thresholding method for tumour segmentation has been proposed and implemented for a SPECT system. This method, which is based on experimental threshold-volume calibrations, also incorporates the recovery coefficients (RC) of the imaging system, and has therefore been called the recovering iterative thresholding method (RIThM). The possibility of employing Monte Carlo (MC) simulations for system calibration was also investigated. The RIThM is an iterative algorithm coded in MATLAB: after an initial rough estimate of the volume of interest, the following calculations are repeated: (i) the corresponding source-to-background ratio (SBR) is measured and corrected by means of the RC curve; (ii) the threshold corresponding to the amended SBR value and the volume estimate is then found using the threshold-volume data; (iii) a new volume estimate is obtained by image thresholding. The process continues until convergence. The RIThM was implemented for an Infinia Hawkeye 4 (GE Healthcare) SPECT/CT system, using a Jaszczak phantom and several test objects. Two MC codes were tested to simulate the calibration images: SIMIND and SimSET. For validation, test images consisting of hot spheres and some anatomical structures of the Zubal head phantom were simulated with the SIMIND code. Additional test objects (flasks and vials) were also imaged experimentally. Finally, the RIThM was applied to evaluate three cases of brain metastases and two cases of high-grade gliomas. Comparing experimental thresholds and those obtained by MC simulations, a maximum difference of about 4% was found, within the errors (±2% and ±5% for volumes ≥ 5 ml and < 5 ml, respectively). Also for the RC data, the comparison showed differences (up to 8%) within the assigned error (±6%). An ANOVA test demonstrated that the calibration results (in terms of thresholds or RCs at various volumes) obtained by MC simulations were indistinguishable from those obtained experimentally. The accuracy of volume determination for the simulated hot spheres was between -9% and 15% in the range 4-270 ml, whereas for volumes less than 4 ml (in the range 1-3 ml) the difference increased abruptly, reaching values greater than 100%. For the Zubal head phantom, errors ranged between 9% and 18%. For the experimental test images, the accuracy was within ±10% for volumes in the range 20-110 ml. The preliminary test of application on patients demonstrated the suitability of the method in a clinical setting. MC-guided delineation of tumour volume may reduce the acquisition time required for experimental calibration. Analysis of images of several simulated and experimental test objects, the Zubal head phantom and clinical cases demonstrated the robustness, suitability, accuracy, and speed of the proposed method. Nevertheless, studies concerning tumours of irregular shape and/or nonuniform distribution of background activity are still in progress.
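A schematic of the RIThM loop, written in Python for illustration: the calibration curves and the exact way the SBR is measured are system-specific, so they are passed in as placeholder callables here, and the initial contour and convergence rule are our assumptions.

```python
import numpy as np

def rithm(image, background, voxel_ml, threshold_curve, rc_curve,
          tol=0.01, max_iter=50):
    """image: 3D activity array; returns the converged volume estimate (ml).
    threshold_curve(sbr, volume) and rc_curve(volume) stand in for the
    system-specific threshold-volume and recovery-coefficient calibrations."""
    level = 0.5 * image.max()                                # rough initial contour
    volume = np.count_nonzero(image > level) * voxel_ml
    for _ in range(max_iter):
        sbr = (image.max() / background) / rc_curve(volume)      # (i) RC-corrected SBR
        level = threshold_curve(sbr, volume) * image.max()       # (ii) calibrated threshold
        new_volume = np.count_nonzero(image > level) * voxel_ml  # (iii) re-threshold
        if abs(new_volume - volume) <= tol * volume:
            break
        volume = new_volume
    return volume
```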
The international food unit: a new measurement aid that can improve portion size estimation.
Bucher, T; Weltert, M; Rollo, M E; Smith, S P; Jia, W; Collins, C E; Sun, M
2017-09-12
Portion size education tools, aids and interventions can be effective in helping prevent weight gain. However, consumers have difficulty estimating food portion sizes and are confused by inconsistencies in the measurement units and terminologies currently used. Visual cues are an important mediator of portion size estimation, but standardized measurement units are required. In the current study, we present a new food volume estimation tool and test the ability of young adults to accurately quantify food volumes. The International Food Unit™ (IFU™) is a 4 × 4 × 4 cm cube (64 cm³), subdivided into eight 2 cm sub-cubes for estimating smaller food volumes. Compared with currently used measures such as cups and spoons, the IFU™ standardizes the estimation of food volumes with metric measures. The IFU™ design is based on binary dimensional increments, and the cubic shape facilitates portion size education and training, memory and recall, and computer processing, which is binary in nature. The performance of the IFU™ was tested in a randomized between-subject experiment (n = 128 adults, 66 men) that estimated the volumes of 17 foods using four methods: the IFU™ cube, a deformable modelling clay cube, a household measuring cup, or no aid (weight estimation). Estimation errors were compared between groups using Kruskal-Wallis tests and post-hoc comparisons. Estimation errors differed significantly between groups (H(3) = 28.48, p < .001). Volume estimates were most accurate in the group using the IFU™ cube (Mdn = 18.9%, IQR = 50.2) and least accurate using the measuring cup (Mdn = 87.7%, IQR = 56.1). The modelling clay cube led to a median error of 44.8% (IQR = 41.9). Compared with the measuring cup, estimation errors using the IFU™ were significantly smaller for 12 food portions and similar for 5 food portions. Weight estimation was associated with a median error of 23.5% (IQR = 79.8). The IFU™ improves volume estimation accuracy compared with other methods. The cubic shape was perceived as favourable, with subdivision and multiplication facilitating volume estimation. Further studies should investigate whether the IFU™ can facilitate portion size training and whether portion size education using the IFU™ is effective and sustainable without the aid. A 3-dimensional IFU™ could serve as a reference object for estimating food volume.
Using LiDAR to Estimate Surface Erosion Volumes within the Post-storm 2012 Bagley Fire
NASA Astrophysics Data System (ADS)
Mikulovsky, R. P.; De La Fuente, J. A.; Mondry, Z. J.
2014-12-01
The total post-storm 2012 Bagley fire sediment budget of the Squaw Creek watershed in the Shasta-Trinity National Forest was estimated using several methods. A portion of the budget was quantitatively estimated using LiDAR. Simple workflows were designed to estimate the eroded volumes of debris slides, fill failures, gullies, altered channels and streams. LiDAR was also used to estimate depositional volumes. Thorough manual mapping of large erosional features in the ArcGIS 10.1 Geographic Information System was required, as these mapped features determined the eroded volume boundaries in 3D space. The 3D pre-erosion surface for each mapped feature was interpolated from the boundary elevations. A surface-difference calculation was run between the estimated pre-erosion surfaces and the LiDAR surfaces to determine the volume of sediment potentially delivered into the stream system. In addition, cross sections of altered channels and streams were taken using stratified random selection based on channel gradient and stream order, respectively. The original pre-storm surfaces of channel features were estimated using the cross sections and erosion depth criteria. The open source software Inkscape was used to estimate cross-sectional areas for randomly selected channel features, which were then averaged for each channel gradient and stream order class. The average areas were then multiplied by the length of each class to estimate the total eroded volume of altered channels and streams. Finally, reservoir and in-channel depositional volumes were estimated by mapping channel forms and generating specific reservoir elevation zones associated with depositional events. The in-channel areas and zones within the reservoir were multiplied by estimated and field-observed sediment thicknesses to attain a best-estimate sediment volume. In-channel estimates included re-occupying stream channel cross sections established before the fire. Once volumes were calculated, other erosion processes of the Bagley sedimentation study, such as surface soil erosion, were combined to estimate the total fire and storm sediment budget for the Squaw Creek watershed. The LiDAR-based measurement workflows can be easily applied to other sediment budget studies using a single high-resolution LiDAR dataset.
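The core surface-difference step reduces to a masked DEM subtraction; a minimal sketch, assuming co-registered pre- and post-storm grids and a uniform cell size:

```python
import numpy as np

def eroded_volume_m3(pre_dem, post_dem, mask, cell_area_m2=1.0):
    """Volume lost inside `mask`; cells with net deposition are excluded.
    pre_dem, post_dem: co-registered elevation grids (m); mask: feature footprint."""
    dz = np.where(mask, pre_dem - post_dem, 0.0)   # positive dz = surface lowering
    return dz[dz > 0].sum() * cell_area_m2
```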
Accuracy of Standing-Tree Volume Estimates Based on McClure Mirror Caliper Measurements
Noel D. Cost
1971-01-01
The accuracy of standing-tree volume estimates, calculated from diameter measurements taken by a mirror caliper and with sectional aluminum poles for height control, was compared with volume estimates calculated from felled-tree measurements. Twenty-five trees which varied in species, size, and form were used in the test. The results showed that two estimates of total...
Direct and simultaneous estimation of cardiac four chamber volumes by multioutput sparse regression.
Zhen, Xiantong; Zhang, Heye; Islam, Ali; Bhaduri, Mousumi; Chan, Ian; Li, Shuo
2017-02-01
Cardiac four-chamber volume estimation plays a fundamental and crucial role in clinical quantitative analysis of whole-heart function. It is a challenging task due to the huge complexity of the four chambers, including great appearance variations, large shape deformations and interference between chambers. Direct estimation has recently emerged as an effective and convenient tool for cardiac ventricular volume estimation. However, existing direct estimation methods were specifically developed for a single ventricle, i.e., the left ventricle (LV), or for bi-ventricles; they cannot be directly used for four-chamber volume estimation due to the great combinatorial variability and highly complex anatomical interdependency of the four chambers. In this paper, we propose a new, general framework for direct and simultaneous four-chamber volume estimation. We address two key issues, i.e., cardiac image representation and simultaneous four-chamber volume estimation, which enables accurate and efficient four-chamber volume estimation. We generate compact and discriminative image representations by supervised descriptor learning (SDL), which can remove irrelevant information and extract discriminative features. We achieve direct and simultaneous four-chamber volume estimation by multioutput sparse latent regression (MSLR), which enables jointly modeling nonlinear input-output relationships and capturing four-chamber interdependence. The proposed method is highly generalized and independent of imaging modalities, providing a general regression framework that can be extensively used for clinical data prediction to achieve automated diagnosis. Experiments on both MR and CT images show that our method achieves high performance, with a correlation coefficient of up to 0.921 with ground truth obtained manually by human experts, which is clinically significant and enables more accurate, convenient and comprehensive assessment of cardiac function. Copyright © 2016 Elsevier B.V. All rights reserved.
Airport Landside. Volume I. Planning Guide.
DOT National Transportation Integrated Search
1982-01-01
This volume describes a methodology for performing airport landside planning by applying the Airport Landside Simulation Model (ALSIM) developed by TSC. For this analysis, the airport landside is defined as extending from the airport boundary to the ...
NASA Astrophysics Data System (ADS)
Solari, Sebastián.; Egüen, Marta; Polo, María. José; Losada, Miguel A.
2017-04-01
Threshold estimation in the Peaks Over Threshold (POT) method and the impact of the estimation method on the calculation of high return period quantiles and their uncertainty (or confidence intervals) are issues that are still unresolved. In the past, methods based on goodness of fit tests and EDF-statistics have yielded satisfactory results, but their use has not yet been systematized. This paper proposes a methodology for automatic threshold estimation, based on the Anderson-Darling EDF-statistic and goodness of fit test. When combined with bootstrapping techniques, this methodology can be used to quantify both the uncertainty of threshold estimation and its impact on the uncertainty of high return period quantiles. This methodology was applied to several simulated series and to four precipitation/river flow data series. The results obtained confirmed its robustness. For the measured series, the estimated thresholds corresponded to those obtained by nonautomatic methods. Moreover, even though the uncertainty of the threshold estimation was high, this did not have a significant effect on the width of the confidence intervals of high return period quantiles.
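A simplified sketch of the automatic selection idea: fit a generalized Pareto distribution (GPD) to the exceedances of each candidate threshold and score the fit with the Anderson-Darling statistic. Taking the smallest A² below, rather than the full goodness-of-fit test with bootstrapped uncertainty used in the paper, is a deliberate simplification; the function names are ours.

```python
import numpy as np
from scipy.stats import genpareto

def ad_statistic(z):
    """Anderson-Darling A^2 for sorted values z, assumed ~ Uniform(0, 1)."""
    n = len(z)
    i = np.arange(1, n + 1)
    return -n - np.mean((2 * i - 1) * (np.log(z) + np.log1p(-z[::-1])))

def select_threshold(data, candidates, min_exceedances=30):
    """Return (threshold, A2) with the smallest A2 among the candidates."""
    data = np.asarray(data)
    best = None
    for u in candidates:
        exc = np.sort(data[data > u] - u)
        if len(exc) < min_exceedances:
            continue
        c, _, scale = genpareto.fit(exc, floc=0.0)        # GPD fit to exceedances
        z = np.clip(genpareto.cdf(exc, c, loc=0.0, scale=scale), 1e-12, 1 - 1e-12)
        a2 = ad_statistic(z)
        if best is None or a2 < best[1]:
            best = (u, a2)
    return best
```

Resampling the data and rerunning select_threshold would give the bootstrap spread of both the threshold and the downstream return-period quantiles, in the spirit of the uncertainty quantification described above.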
Orbital flight test shuttle external tank aerothermal flight evaluation, volume 1
NASA Technical Reports Server (NTRS)
Praharaj, Sarat C.; Engel, Carl D.; Warmbrod, John D.
1986-01-01
This 3-volume report discusses the evaluation of aerothermal flight measurements made on the orbital flight test Space Shuttle External Tanks (ETs). Six ETs were instrumented to measure various quantities during flight, including heat transfer, pressure, and structural temperature. The flight data were reduced and analyzed against math models established from an extensive wind tunnel data base and empirical heat-transfer relationships. This analysis has supported the validity of the current aeroheating methodology and existing data base, and has also identified some problem areas which require methodology modifications. This is Volume 1, an Executive Summary. Volume 2 contains Appendices A (Aerothermal Comparisons) and B (Flight-Derived h sub i/h sub u vs. M sub inf. Plots), and Volume 3 contains Appendix C (Comparison of Interference Factors among OFT Flight, Prediction and 1H-97A Data), Appendix D (Freestream Stanton Number and Reynolds Number Correlation for Flight and Tunnel Data), and Appendix E (Flight-Derived h sub i/h sub u Tables).
Orbital flight test shuttle external tank aerothermal flight evaluation, volume 3
NASA Technical Reports Server (NTRS)
Praharaj, Sarat C.; Engel, Carl D.; Warmbrod, John D.
1986-01-01
This 3-volume report discusses the evaluation of aerothermal flight measurements made on the orbital flight test Space Shuttle External Tanks (ETs). Six ETs were instrumented to measure various quantities during flight, including heat transfer, pressure, and structural temperature. The flight data were reduced and analyzed against math models established from an extensive wind tunnel data base and empirical heat-transfer relationships. This analysis has supported the validity of the current aeroheating methodology and existing data base, and has also identified some problem areas which require methodology modifications. Volume 1 is the Executive Summary. Volume 2 contains Appendix A (Aerothermal Comparisons) and Appendix B (Flight-Derived h sub i/h sub u vs. M sub inf. Plots). This is Volume 3, containing Appendix C (Comparison of Interference Factors between OFT Flight, Prediction and 1H-97A Data), Appendix D (Freestream Stanton Number and Reynolds Number Correlation for Flight and Tunnel Data), and Appendix E (Flight-Derived h sub i/h sub u Tables).
Orbital flight test shuttle external tank aerothermal flight evaluation, volume 2
NASA Technical Reports Server (NTRS)
Praharaj, Sarat C.; Engel, Carl D.; Warmbrod, John D.
1986-01-01
This 3-volume report discusses the evaluation of aerothermal flight measurements made on the orbital flight test Space Shuttle External Tanks (ETs). Six ETs were instrumented to measure various quantities during flight, including heat transfer, pressure, and structural temperature. The flight data were reduced and analyzed against math models established from an extensive wind tunnel data base and empirical heat-transfer relationships. This analysis has supported the validity of the current aeroheating methodology and existing data base, and has also identified some problem areas which require methodology modifications. Volume 1 is the Executive Summary. This is Volume 2, containing Appendix A (Aerothermal Comparisons) and Appendix B (Flight-Derived h sub i/h sub u vs. M sub inf. Plots). Volume 3 contains Appendix C (Comparison of Interference Factors between OFT Flight, Prediction and 1H-97A Data), Appendix D (Freestream Stanton Number and Reynolds Number Correlation for Flight and Tunnel Data), and Appendix E (Flight-Derived h sub i/h sub u Tables).
Working Papers in Dialogue Modeling, Volume 2.
ERIC Educational Resources Information Center
Mann, William C.; And Others
The technical working papers that comprise the two volumes of this document are related to the problem of creating a valid process model of human communication in dialogue. In Volume 2, the first paper concerns study methodology, and raises such issues as the choice between system-building and process-building, and the advantages of studying cases…
Human Rehabilitation Techniques. Project Papers. Volume IV, Part B.
ERIC Educational Resources Information Center
Dudek, R. A.; And Others
Volume IV, Part B of a six-volume final report (which covers the findings of a research project on policy and technology related to rehabilitation of disabled individuals) presents a continuation of papers (Part A) giving an overview of project methodology, much of the data used in projecting consequences and policymaking impacts in project…
Burruss, Robert
2009-01-01
Geologically based methodologies to assess the possible volumes of subsurface CO2 storage must apply clear and uniform definitions of resource and reserve concepts to each assessment unit (AU). Application of the current state of knowledge of the geologic, hydrologic, geochemical, and geophysical parameters (contingencies) that control storage volume and injectivity allows definition of the contingent resource (CR) of storage. The parameters known with the greatest certainty are based on observations of known traps (KTs) within the AU that have produced oil, gas, and water. The aggregate volume of KTs within an AU defines the most conservative volume of the contingent resource. Application of the concept of reserve growth to the CR volume provides a logical path for subsequent reevaluation of the total resource as knowledge of CO2 storage processes increases during implementation of storage projects. Increased knowledge of storage performance over time will probably allow the volume of the contingent resource of storage to grow, although negative growth is possible.
Reproducibility of isopach data and estimates of dispersal and eruption volumes
NASA Astrophysics Data System (ADS)
Klawonn, M.; Houghton, B. F.; Swanson, D.; Fagents, S. A.; Wessel, P.; Wolfe, C. J.
2012-12-01
Total erupted volume and deposit thinning relationships are key parameters in characterizing explosive eruptions and evaluating the potential risk from a volcano, as well as inputs to volcanic plume models. Volcanologists most commonly estimate these parameters by hand-contouring deposit data, representing these contours in thickness versus square-root-area plots, fitting empirical laws to the thinning relationships, and integrating over the square root of area to arrive at volume estimates. In this study we analyze the extent to which variability in hand-contouring thickness data for pyroclastic fall deposits influences the resulting estimates, and we investigate the effects of different fitting laws. 96 volcanologists (3% MA students, 19% PhD students, 20% postdocs, 27% professors, and 30% professional geologists) from 11 countries (Australia, Ecuador, France, Germany, Iceland, Italy, Japan, New Zealand, Switzerland, UK, USA) participated in our study and produced hand-contours on identical maps using our unpublished thickness measurements of the Kilauea Iki 1959 fall deposit. We computed volume estimates by (A) integrating over a surface fitted through the contour lines, as well as using the established methods of integrating over the thinning relationships of (B) an exponential fit with one to three segments, (C) a power law fit, and (D) a Weibull function fit. To focus on the differences arising from the hand-contours of the well-constrained deposit, and to eliminate the effects of extrapolations to great but unmeasured thicknesses near the vent, we removed the volume contribution of the near-vent deposit (defined as the deposit above 3.5 m) from the volume estimates. The remaining volume is approximately 1.76 × 10^6 m^3 (geometric mean for all methods), with maximum and minimum estimates of 2.5 × 10^6 m^3 and 1.1 × 10^6 m^3. Different integration methods applied to identical isopach maps result in volume estimate differences of up to 50% and, on average, a maximum variation between integration methods of 14%. Volume estimates with methods (A), (C) and (D) show strong correlation (r = 0.8 to r = 0.9), while the correlation of (B) with the other methods is weaker (r = 0.2 to r = 0.6) and the correlation between (B) and (C) is not statistically significant. We find that the choice of larger maximum contours leads to smaller volume estimates under method (C), but larger estimates with the other methods. We do not find statistically significant correlation between volume estimates and participants' experience level, the number of chosen contour levels, or the smoothness of contours. Overall, application of the different methods to the same maps leads to similar mean volume estimates, but the different methods show different dependencies and varying spread of volume estimates. The results indicate that these key parameters are less critically dependent on the operator and their choices of contour values, intervals etc., and more sensitive to the selection of the technique used to integrate these data.
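As an illustration of method (B) with a single segment, thickness can be fitted as T(x) = T0·exp(-kx) against x = sqrt(isopach area) and integrated analytically over area to V = 2·T0/k². The code and the three example isopachs below are ours, not the study's data.

```python
import numpy as np

def exponential_volume(areas_m2, thicknesses_m):
    """Fit ln T vs sqrt(area) and integrate T(x) = T0 exp(-k x) to V = 2 T0 / k^2."""
    x = np.sqrt(np.asarray(areas_m2, dtype=float))
    slope, intercept = np.polyfit(x, np.log(thicknesses_m), 1)
    t0, k = np.exp(intercept), -slope
    return 2.0 * t0 / k**2

# e.g. isopachs of 2 m over 0.1 km^2, 1 m over 0.5 km^2, 0.25 m over 2 km^2
print(f"{exponential_volume([1e5, 5e5, 2e6], [2.0, 1.0, 0.25]):.2g} m^3")
```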
NASA Technical Reports Server (NTRS)
Martino, J. P.; Lenz, R. C., Jr.; Chen, K. L.; Kahut, P.; Sekely, R.; Weiler, J.
1979-01-01
The appendices for the cross impact methodology are presented. These include: user's guide, telecommunication events, cross impacts, projection of historical trends, and projection of trends in satellite communications.
NASA Technical Reports Server (NTRS)
1981-01-01
Pioneer Engineering and Manufacturing Company estimated the cost of manufacturing an Air Brayton Receiver for a Solar Thermal Electric Power System as designed by the AiResearch Division of the Garrett Corporation. Production costs were estimated at annual volumes of 100; 1,000; 5,000; 10,000; 50,000; 100,000 and 1,000,000 units. These costs included direct labor, direct material and manufacturing burden. A make-or-buy analysis was made for each part at each volume. At high volumes, special fabrication concepts were used to reduce operation cycle times. All costs were estimated at an assumed 100% plant capacity. Economic feasibility determined the level of production at which special concepts were introduced. Estimated costs were based on the economics of the last half of 1980. Tooling and capital equipment costs were estimated for each volume. Infrastructure and personnel requirements were also estimated.
New Methodology for Natural Gas Production Estimates
2010-01-01
A new methodology is implemented with the monthly natural gas production estimates from the EIA-914 survey this month. The estimates, to be released April 29, 2010, include revisions for all of 2009. The fundamental changes in the new process include the timeliness of the historical data used for estimation and the frequency of sample updates, both of which are improved.
Revised estimates for direct-effect recreational jobs in the interior Columbia River basin.
Lisa K. Crone; Richard W. Haynes
1999-01-01
This paper reviews the methodology used to derive the original estimates for direct employment associated with recreation on Federal lands in the interior Columbia River basin (the basin), and details the changes in methodology and data used to derive new estimates. The new analysis resulted in an estimate of 77,655 direct-effect jobs associated with recreational...
Dae-Kwan Kim; Daniel M. Spotts; Donald F. Holecek
1998-01-01
This paper compares estimates of pleasure trip volume and expenditures derived from a regional telephone survey to those derived from the TravelScope mail panel survey. Significantly different estimates emerged, suggesting that survey-based estimates of pleasure trip volume and expenditures, at least in the case of the two surveys examined, appear to be affected by...
Timber resource statistics for the Upper Yukon inventory unit, Alaska, 1980.
Willem W.S. van Hees
1987-01-01
The 1980 inventory of the forest resources of the Upper Yukon unit was designed to produce inventory estimates of timberland area, volume of timber, and volumes of timber growth and mortality. Timberland area is estimated at 742,000 acres. Cubic-foot volume on all timberland is estimated at 475 million cubic feet. Timber growth and mortality are estimated at -615,000...
NASA Technical Reports Server (NTRS)
1974-01-01
The results of the updated 30-day life sciences dedicated laboratory scheduling and costing activities are documented, and the 'low cost' methodology used to establish individual equipment item costs is explained in terms of its allowances for equipment that is commercial off-the-shelf, modified commercial, or laboratory prototype; a method which significantly lowers program costs. The costs generated include estimates for non-recurring development, recurring production, and recurring operations costs. A cost for a biomedical-emphasis laboratory and a delta cost to provide a bioscience and technology laboratory were also generated. All costs reported are commensurate with the design and schedule definitions available.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1999-03-01
The purpose of this guidance document is to provide the site team--risk assessor, remedial project manager (RPM), and community involvement coordinator--with information to improve community involvement in the Superfund risk assessment process. Specifically, the document: provides suggestions for how Superfund staff and community members can work together during the early stages of Superfund cleanup; identifies where, within the framework of the human health risk assessment methodology, community input can augment and improve EPA's estimates of exposure and risk; recommends questions the site team should ask the community; and illustrates why community involvement is valuable during the human health assessment at Superfund sites.
Flow-type landslides magnitude evaluation: the case study of the Campania Region (Southern Italy)
NASA Astrophysics Data System (ADS)
Santo, Antonio; De Falco, Melania; Di Crescenzo, Giuseppe
2015-04-01
In recent years, studies concerning the triggering and run-out susceptibility of different kinds of landslides have become more and more precise. In most cases, the methodological approach involves the production of detailed thematic maps (at a scale of at least 1:5000), which represent a very useful tool for territorial planning, especially in urbanized areas. More recently, this research has been accompanied by a growing number of studies dealing with landslide magnitude evaluation (especially in terms of volume and velocity estimates). In this paper, the results of a magnitude evaluation for flow-type landslides are presented. The study area is located in Southern Italy and is very wide (1,500 square kilometres), covering the Campania region. In this context, flow-type landslides represent the most frequent instabilities, as shown by the large number of victims and the huge economic damage caused in the last few centuries. These shallow landslides involve thin, cohesionless, unsaturated pyroclastic soils found over steep slopes around Somma-Vesuvio and the Phlegrean district, affecting a wide area where over 100 towns are located. Since the potential volume of a flow-type landslide is a measure of event magnitude, we propose to estimate the potential volume at the scale of the slope or basin for about 90 municipalities, covering 850 hierarchized drainage basins and 900 regular slopes. An empirical approach recently proposed in the literature (De Falco et al., 2012) allows estimation of the volume of the pyroclastic cover that can be displaced along the slope. The method derives from the interpretation of numerous geological and geomorphological data gathered from a vast number of case histories of landslides in volcanic and carbonatic contexts, and it is based on determining the thickness of the pyroclastic cover and the width of the detachment and erosion-transport zone. Thickness can be evaluated with a good degree of approximation since, in these landslides, the failure surface is always very shallow (from 0.3 to 2 m) and positioned in pyroclastic covers resting on a generally rigid bedrock (calcareous rocks, lava or tuffs). The area of the detachment and erosion-transport zone (Af) is calculated by a mathematical function (statistical correlation) which links this factor to the difference in height (H) between the point on the slope with the highest susceptibility and the first break at the foot of the slope, where deposition starts to take place and the landslide loses velocity. Finally, potential volumes are calculated using Af and a constant thickness of the pyroclastic cover for the whole slope. The estimated volumes were classified using the size classification proposed by Jacob (2005) to visualize their spatial distribution at regional and municipal scales. At the regional scale, the study showed that the potentially mobilized volumes range from 500 to 200,000 cubic meters, and that their non-random distribution reveals macro-areas with different degrees of hazard. At the municipal scale, the distribution of mobilized volumes allows identification of the most dangerous landslide scenarios. The results could represent a useful tool for Civil Protection purposes, defining the most critical areas and detecting the main areas where risk mitigation works are required.
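Schematically, the volume estimate reduces to an empirical area term times a cover thickness. In the sketch below, the power-law form and the coefficients of Af(H) are invented placeholders, not the calibrated correlation of De Falco et al. (2012).

```python
def landslide_volume(height_drop_m, cover_thickness_m, a=25.0, b=1.2):
    """Potential volume (m^3): detachment/erosion-transport area Af x cover thickness."""
    area_f = a * height_drop_m ** b   # hypothetical Af(H) correlation, NOT the published one
    return area_f * cover_thickness_m

print(landslide_volume(height_drop_m=150.0, cover_thickness_m=1.0))  # ~1e4 m^3
```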
Alternative occupied volume integrity (OVI) tests and analyses.
DOT National Transportation Integrated Search
2013-10-01
FRA, supported by the Volpe Center, conducted research on alternative methods of evaluating occupied volume integrity (OVI) in passenger railcars. Guided by this research, an alternative methodology for evaluating OVI that ensures an equivalent or gr...
Airport Landside - Volume III : ALSIM Calibration and Validation.
DOT National Transportation Integrated Search
1982-06-01
This volume discusses calibration and validation procedures applied to the Airport Landside Simulation Model (ALSIM), using data obtained at Miami, Denver and LaGuardia Airports. Criteria for the selection of a validation methodology are described. T...
Prospect evaluation of shallow I-35 reservoir of NE Malay Basin offshore, Terengganu, Malaysia
DOE Office of Scientific and Technical Information (OSTI.GOV)
Janjua, Osama Akhtar, E-mail: janjua945@hotmail.com; Wahid, Ali, E-mail: ali.wahid@live.com; Salim, Ahmed Mohamed Ahmed, E-mail: mohamed.salim@petronas.com.my
2016-02-01
A prospect is a potential hydrocarbon accumulation that represents a significant and feasible drilling target. Prospect evaluation activities are based on estimating the probability of success and, assuming a discovery of hydrocarbons, the range of potentially recoverable quantities under a commercial development program. The objective was to find new shallow prospects in the reservoir sandstone of the I-Formation in the Malay basin. The prospects in the study area consist mostly of faulted structures and stratigraphic channels. The methodology follows seismic interpretation and mapping, attribute analysis, and evaluation of nearby well data, i.e., based on well-log correlation. Petrophysical parameters analogous to those of nearby wells were used as input for the volumetric assessment. Based on an analysis of presence and effectiveness, the prospect has a complete petroleum system. Two wells, O-1 and O-2, have been proposed to be drilled in the I-35 reservoir near the major fault and the stratigraphic channel, respectively. The probability of geological success is 35% for prospect O-1 and 24% for O-2. Finally, hydrocarbon-in-place volumes were calculated: the best-estimate oil volume is 4.99 MMSTB for prospect O-1 and 28.70 MMSTB for O-2, while the gas volumes are 29.27 BSCF and 25.59 BSCF, respectively.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Young, S.C.
1993-08-01
This report discusses a field demonstration of a methodology for characterizing an aquifer's geohydrology in the detail required to design an optimum network of wells and/or infiltration galleries for bioreclamation systems. The project work was conducted on a 1-hectare test site at Columbus AFB, Mississippi. The technical report is divided into two volumes. Volume I describes the test site and the well network, the assumptions and application of the equations that define groundwater flow to a well, the results of three large-scale aquifer tests, and the results of 160 single-pump tests. Volume II describes the borehole flowmeter tests, the tracer tests, the geological investigations, the geostatistical analysis, and the guidelines for using groundwater models to design bioreclamation systems. Keywords: Site characterization, Hydraulic conductivity, Groundwater flow, Geostatistics, Geohydrology, Monitoring wells.
Scargle, Jeffrey D; Way, M J; Gazis, P R
2017-04-10
We demonstrate the effectiveness of a relatively straightforward analysis of the complex 3D Fourier transform of galaxy coordinates derived from redshift surveys. Numerical demonstrations of this approach are carried out on a volume-limited sample of the Sloan Digital Sky Survey redshift survey. The direct unbinned transform yields a complex 3D data cube quite similar to that from the Fast Fourier Transform (FFT) of finely binned galaxy positions. In both cases deconvolution of the sampling window function yields estimates of the true transform. Simple power spectrum estimates from these transforms are roughly consistent with those using more elaborate methods. The complex Fourier transform characterizes spatial distributional properties beyond the power spectrum in a manner different from (and we argue is more easily interpreted than) the conventional multi-point hierarchy. We identify some threads of modern large scale inference methodology that will presumably yield detections in new wider and deeper surveys.
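A crude sketch of the binned-FFT variant mentioned above: grid the galaxy positions, Fourier transform the overdensity field, and average |delta_k|² in spherical shells. Window deconvolution and shot-noise correction, central to the paper, are omitted, and the function name and defaults are ours.

```python
import numpy as np

def power_spectrum(positions, box_size, n_grid=64, n_bins=20):
    """positions: (N, 3) galaxy coordinates in [0, box_size)^3."""
    grid, _ = np.histogramdd(positions, bins=(n_grid,) * 3,
                             range=[(0, box_size)] * 3)
    delta = grid / grid.mean() - 1.0                        # overdensity field
    power = np.abs(np.fft.fftn(delta)) ** 2
    freqs = 2 * np.pi * np.fft.fftfreq(n_grid, d=box_size / n_grid)
    kx, ky, kz = np.meshgrid(freqs, freqs, freqs, indexing="ij")
    kmag = np.sqrt(kx**2 + ky**2 + kz**2).ravel()
    bins = np.linspace(0, kmag.max(), n_bins + 1)
    idx = np.digitize(kmag, bins)
    pk = np.array([power.ravel()[idx == i].mean() if np.any(idx == i) else np.nan
                   for i in range(1, n_bins + 1)])          # shell-averaged P(k)
    return 0.5 * (bins[:-1] + bins[1:]), pk
```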
NASA Technical Reports Server (NTRS)
Goldstein, Melvyn L.; Parks, George; Gurgiolo, C.; Fazakerley, Andrew N.
2008-01-01
We present determinations of compressibility and vorticity in the magnetosheath and plasma sheet using moments from the four PEACE thermal electron instruments on CLUSTER. The methodology used assumes a linear variation of the moments throughout the volume defined by the four satellites, which allows spatially independent estimates of the divergence, curl, and gradient. Once the vorticity has been computed, it is possible to estimate directly the Taylor microscale. We have shown previously that the technique works well in the solar wind. Because the background flow speed in the magnetosheath and plasma sheet is usually less than the Alfven speed, the Taylor frozen-in-flow approximation cannot be used. Consequently, this four spacecraft approach is the only viable method for obtaining the wave number properties of the ambient fluctuations. Our results using electron velocity moments will be compared with previous work using magnetometer data from the FGM experiment on Cluster.
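A minimal sketch of the linear-variation estimate: with four simultaneous measurements, the 3×3 velocity-gradient tensor follows from solving the three spacecraft-difference equations, after which divergence (a compressibility proxy) and vorticity are immediate. The function names are ours.

```python
import numpy as np

def velocity_gradient(positions, velocities):
    """positions, velocities: (4, 3) arrays for the four spacecraft.
    Assumes v(r) = v0 + G (r - r0) across the tetrahedron; returns G,
    with G[i, j] = dv_i / dx_j."""
    dr = (positions[1:] - positions[0]).T     # 3x3, columns are separations
    dv = (velocities[1:] - velocities[0]).T   # 3x3, columns are velocity differences
    return dv @ np.linalg.inv(dr)

def div_and_curl(G):
    div = np.trace(G)                         # divergence of the flow
    curl = np.array([G[2, 1] - G[1, 2],       # vorticity components
                     G[0, 2] - G[2, 0],
                     G[1, 0] - G[0, 1]])
    return div, curl
```

As the abstract notes, the vorticity obtained this way can then be used to estimate the Taylor microscale directly, without invoking the Taylor frozen-in-flow hypothesis.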
Bayesian Revision of Residual Detection Power
NASA Technical Reports Server (NTRS)
DeLoach, Richard
2013-01-01
This paper addresses some issues with quality assessment and quality assurance in response surface modeling experiments executed in wind tunnels. The role of data volume in quality assurance for response surface models is reviewed. Specific wind tunnel response surface modeling experiments are considered for which apparent discrepancies exist between fit quality expectations based on implemented quality assurance tactics and the actual fit quality achieved in those experiments. These discrepancies are resolved by using Bayesian inference to account for certain imperfections in the assessment methodology. Estimates of the fraction of out-of-tolerance model predictions based on traditional frequentist methods are revised to account for uncertainty in the residual assessment process. The number of sites in the design space for which residuals are out of tolerance is seen to exceed the number of sites where the model actually fails to fit the data. A method is presented to estimate how much of the design space is inadequately modeled by low-order polynomial approximations to the true but unknown underlying response function.
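A minimal sketch of the kind of Bayesian revision described above, under assumed numbers: the observed count of out-of-tolerance residuals mixes true model failures with false calls from noise in the assessment itself, and a simple grid posterior separates the two. The false-call and miss rates below are illustrative assumptions, not DeLoach's actual formulation.

```python
# Minimal Bayesian sketch (not the paper's exact model): the observed rate of
# out-of-tolerance residuals mixes the true failure fraction p with an assumed
# false-call probability from noise in the residual assessment. Grid posterior
# over p with a flat prior; all numbers are illustrative assumptions.
import numpy as np

n_sites, n_flagged = 200, 30      # validation sites and out-of-tolerance calls (assumed)
fp = 0.08                         # prob. a good prediction is falsely flagged (assumed)
fn = 0.05                         # prob. a true failure is missed (assumed)

p = np.linspace(0, 1, 1001)       # grid over the true out-of-tolerance fraction
q = p * (1 - fn) + (1 - p) * fp   # prob. a site gets flagged, given p

# Binomial likelihood of the observed flag count, flat prior on p
log_like = n_flagged * np.log(q) + (n_sites - n_flagged) * np.log(1 - q)
post = np.exp(log_like - log_like.max())
post /= post.sum()

mean_p = (p * post).sum()
print(f"raw flagged fraction: {n_flagged / n_sites:.3f}")
print(f"posterior mean of true failure fraction: {mean_p:.3f}")
```

Consistent with the abstract, the flagged fraction overstates the true failure fraction once false calls are accounted for.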
Remote sensing applied to forest resources
NASA Technical Reports Server (NTRS)
Hernandezfilho, P. (Principal Investigator)
1984-01-01
The development of methodologies to classify reforested areas using remotely sensed data is discussed. A preliminary study was carried out in the northeast of Sao Paulo State in 1978. The reforested areas of Pinus spp. and Eucalyptus spp. were identified based on the spectral, spatial, and temporal characteristics of LANDSAT imagery. Afterwards, a more detailed study was carried out in Mato Grosso do Sul State. The reforested areas were mapped as functions of stand age (0 to 1, 1 to 2, 2 to 3, 3 to 4, 4 to 5, and 5 to 6 years) and stand heterogeneity (0 to 20%, 20 to 40%, 40 to 60%, 60 to 80%, and 80 to 100%). The relative differences between the artificial forest areas estimated from LANDSAT data and ground information varied from -8.72% to +9.49%. The estimation of forest volume through a multistage sampling technique, with probability proportional to size, is also discussed.
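The closing sentence refers to sampling with probability proportional to size (PPS). A hedged sketch of that estimator on synthetic stands follows; the single-stage Hansen-Hurwitz form used here is one standard choice, not necessarily the study's exact multistage procedure.

```python
# Hedged sketch of volume estimation by sampling with probability proportional
# to size (PPS), as mentioned above. Stand areas and volumes are synthetic;
# the Hansen-Hurwitz estimator is one standard choice.
import numpy as np

rng = np.random.default_rng(2)

# Population: forest stands with a size measure (area, ha) and a true volume
area = rng.uniform(10, 200, size=500)                 # size measure from imagery
volume = 120.0 * area * rng.normal(1.0, 0.15, 500)    # m^3, correlated with size

p = area / area.sum()                                  # selection prob. proportional to size
n = 30
idx = rng.choice(500, size=n, replace=True, p=p)       # PPS sample with replacement

# Hansen-Hurwitz estimator of the population total: mean of volume_i / p_i
total_hat = np.mean(volume[idx] / p[idx])
print(f"estimated total volume: {total_hat/1e6:.2f} million m^3")
print(f"true total volume:      {volume.sum()/1e6:.2f} million m^3")
```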
Three approaches for estimating recovery factors in carbon dioxide enhanced oil recovery
Verma, Mahendra K.
2017-07-17
Preface: The Energy Independence and Security Act of 2007 authorized the U.S. Geological Survey (USGS) to conduct a national assessment of geologic storage resources for carbon dioxide (CO2) and requested the USGS to estimate the “potential volumes of oil and gas recoverable by injection and sequestration of industrial carbon dioxide in potential sequestration formations” (42 U.S.C. 17271(b)(4)). Geologic CO2 sequestration associated with enhanced oil recovery (EOR) using CO2 in existing hydrocarbon reservoirs has the potential to increase the U.S. hydrocarbon recoverable resource. The objective of this report is to provide detailed information on three approaches that can be used to calculate the incremental recovery factors for CO2-EOR. Therefore, the contents of this report could form an integral part of an assessment methodology that can be used to assess the sedimentary basins of the United States for the hydrocarbon recovery potential using CO2-EOR methods in conventional oil reservoirs.
OPUS: Optimal Projection for Uncertain Systems. Volume 1
1991-09-01
unified control-design methodology that directly addresses these technology issues. In particular, optimal projection theory addresses the need for... effects, and limited identification accuracy in a 1-g environment. The principal contribution of OPUS is a unified design methodology that... characterizing solutions to constrained control-design problems. Transforming OPUS into a practical design methodology requires the development of
1984-08-01
produce even the most basic binary cloud data and methodologies needed to support the evaluation programs." In view of this recognized deficiency, the... There was an exchange of information with non-DoD agencies, with presentations made by NASA and NOAA (see pp. 537, 569). A brief report by the steering... on cloud data bases and methodologies for users. To achieve these actions requires explicit support. *See classified supplementary volume.
Overcoming bias in estimating the volume-outcome relationship.
Tsai, Alexander C; Votruba, Mark; Bridges, John F P; Cebul, Randall D
2006-02-01
To examine the effect of hospital volume on 30-day mortality for patients with congestive heart failure (CHF) using administrative and clinical data in conventional regression and instrumental variables (IV) estimation models. The primary data consisted of longitudinal information on comorbid conditions, vital signs, clinical status, and laboratory test results for 21,555 Medicare-insured patients aged 65 years and older hospitalized for CHF in northeast Ohio in 1991-1997. The patient was the primary unit of analysis. We fit a linear probability model to the data to assess the effects of hospital volume on patient mortality within 30 days of admission. Both administrative and clinical data elements were included for risk adjustment. Linear distances between patients and hospitals were used to construct the instrument, which was then used to assess the endogeneity of hospital volume. When only administrative data elements were included in the risk adjustment model, the estimated volume-outcome effect was statistically significant (p=.029) but small in magnitude. The estimate was markedly attenuated in magnitude and statistical significance when clinical data were added to the model as risk adjusters (p=.39). IV estimation shifted the estimate in a direction consistent with selective referral, but we were unable to reject the consistency of the linear probability estimates. Use of only administrative data for volume-outcomes research may generate spurious findings. The IV analysis further suggests that conventional estimates of the volume-outcome relationship may be contaminated by selective referral effects. Taken together, our results suggest that efforts to concentrate hospital-based CHF care in high-volume hospitals may not reduce mortality among elderly patients.
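The IV strategy described above reduces to a compact two-stage least squares computation. The sketch below performs it on synthetic data with a distance-style instrument; it mirrors the logic (unobserved case mix biases the naive estimate, the instrument removes the bias) rather than the study's full specification.

```python
# Hedged sketch of the instrumental-variables idea described above: hospital
# volume is instrumented by a patient-to-hospital distance measure, and the
# mortality equation is a linear probability model. Data are synthetic; this
# is plain two-stage least squares, not the study's full specification.
import numpy as np

rng = np.random.default_rng(3)
n = 5000

severity = rng.normal(size=n)                      # unobserved case mix
dist_iv = rng.exponential(1.0, n)                  # distance-based instrument
volume = 2.0 - 0.8 * dist_iv + 0.5 * severity + rng.normal(size=n)
p_death = 0.15 - 0.02 * volume + 0.05 * severity   # true volume effect = -0.02
death = (rng.uniform(size=n) < np.clip(p_death, 0, 1)).astype(float)

def ols(X, y):
    return np.linalg.lstsq(X, y, rcond=None)[0]

X_naive = np.column_stack([np.ones(n), volume])
b_naive = ols(X_naive, death)

# 2SLS: first stage predicts volume from the instrument, second stage uses it
Z = np.column_stack([np.ones(n), dist_iv])
vol_hat = Z @ ols(Z, volume)
b_iv = ols(np.column_stack([np.ones(n), vol_hat]), death)

print(f"naive OLS volume effect: {b_naive[1]:+.4f} (biased by case mix)")
print(f"2SLS volume effect:      {b_iv[1]:+.4f} (true: -0.0200)")
```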
Estimating Lake Volume from Limited Data: A Simple GIS Approach
Lake volume provides key information for estimating residence time or modeling pollutants. Methods for calculating lake volume have relied on dated technologies (e.g. planimeters) or used potentially inaccurate assumptions (e.g. volume of a frustum of a cone). Modern GIS provid...
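A minimal sketch of the GIS calculation implied above, assuming a depth surface has already been interpolated onto a raster: volume is simply the sum of cell depth times cell area. The bathymetry here is a synthetic bowl, not survey data.

```python
# Minimal sketch of the GIS idea behind the abstract above: once a depth
# surface is interpolated onto a raster, lake volume is just the sum of
# cell depth times cell area. The bathymetry grid here is synthetic.
import numpy as np

cell = 10.0                                    # raster resolution, m (assumed)
x, y = np.meshgrid(np.arange(-250, 250, cell), np.arange(-250, 250, cell))

# Synthetic bowl-shaped lake: maximum depth 12 m at the center
depth = np.maximum(0.0, 12.0 * (1 - (x**2 + y**2) / 200.0**2))

volume = depth.sum() * cell**2                 # m^3
area = (depth > 0).sum() * cell**2             # m^2
print(f"surface area: {area/1e4:.1f} ha, volume: {volume/1e6:.3f} million m^3")
print(f"mean depth: {volume/area:.2f} m")
```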
DOT National Transportation Integrated Search
2000-04-01
This report presents detailed analytic tools and results on dynamic response which are used to develop the safe dynamic performance limits of commuter passenger vehicles. The methodology consists of determining the critical parameters and characteris...
Using Photogrammetry to Estimate Tank Waste Volumes from Video
DOE Office of Scientific and Technical Information (OSTI.GOV)
Field, Jim G.
Washington River Protection Solutions (WRPS) contracted with HiLine Engineering & Fabrication, Inc. to assess the accuracy of photogrammetry tools as compared to video Camera/CAD Modeling System (CCMS) estimates. This test report documents the results of using photogrammetry to estimate the volume of waste in tank 241-C-104 from post-retrieval videos and the results of using photogrammetry to estimate the volume of waste piles in the CCMS test video.
Gleason, Robert A.; Tangen, Brian A.; Laubhan, Murray K.; Kermes, Kevin E.; Euliss, Ned H.
2007-01-01
Executive Summary Concern over flooding along rivers in the Prairie Pothole Region has stimulated interest in developing spatially distributed hydrologic models to simulate the effects of wetland water storage on peak river flows. Such models require spatial data on the storage volume and interception area of existing and restorable wetlands in the watershed of interest. In most cases, information on these model inputs is lacking because the resolution of existing topographic maps is inadequate to estimate volumes and areas of existing and restorable wetlands. Consequently, most studies have relied on wetland area-to-volume or area-to-interception-area relationships to estimate wetland basin storage characteristics, using available surface area data obtained as a product from remotely sensed data (e.g., National Wetlands Inventory). Though application of areal input data to estimate volumes and interception areas is widely used, a drawback is that there is little information available to provide guidance regarding the application, limitations, and biases associated with such approaches. Another limitation of previous modeling efforts is that water stored by wetlands within a watershed is treated as a simple lumped storage component that is filled prior to routing overflow to a pour point or gaging station. This approach does not account for dynamic wetland processes that influence water stored in prairie wetlands. Further, most models have not considered the influence of human-induced hydrologic changes, such as land use, that greatly influence the quantity of surface water inputs and, ultimately, the rate at which a wetland basin fills and spills. The goals of this study were to (1) develop and improve methodologies for estimating and spatially depicting wetland storage volumes and interception areas and (2) develop models and approaches for estimating/simulating the water storage capacity of potentially restorable and existing wetlands under various restoration, land use, and climatic scenarios. To address these goals, we developed models and approaches to spatially represent storage volumes and interception areas of existing and potentially restorable wetlands in the upper Mustinka subbasin within Grant County, Minn. We then developed and applied a model to simulate wetland water storage increases that would result from restoring 25 and 50 percent of the farmed and drained wetlands in the upper Mustinka subbasin. The model simulations were performed during the growing season (May-October) for relatively wet (1993; 0.79 m of precipitation) and dry (1987; 0.40 m of precipitation) years. Results from the simulations indicated that the 25 percent restoration scenario would increase water storage by 21-24 percent and that a 50 percent scenario would increase storage by 34-38 percent. Additionally, we estimated that wetlands in the subbasin have the potential to store 11.57-20.98 percent of the total precipitation that fell over the entire subbasin area (52,758 ha). Our simulation results indicated that there is considerable potential to enhance water storage in the subbasin; however, evaluation and calibration of the model are necessary before simulation results can be applied to management and planning decisions. In this report we present guidance for the development and application of models (e.g., surface area-volume predictive models, hydrology simulation model) to simulate wetland water storage to provide a basis from which to understand and predict the effects of natural or human-induced hydrologic alterations.
In developing these approaches, we tried to use simple and widely available input data to simulate wetland hydrology and predict wetland water storage for a specific precipitation event or a series of events. Further, the hydrology simulation model accounted for land use and soil type, which influence surface water inputs to wetlands. Although information presented in this report is specific to the Mustinka subbasin, the approaches
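One common form for the surface area-volume predictive models mentioned in this summary is a power law fitted in log-log space. The sketch below shows that fit on synthetic wetlands; the coefficients are illustrative, not the report's fitted values.

```python
# Hedged sketch of an area-to-volume predictive model of the kind discussed
# above: a power law V = a * A^b fitted in log-log space. Coefficients and
# data are synthetic, not the report's fitted values.
import numpy as np

rng = np.random.default_rng(4)

# Synthetic training wetlands: basin area (ha) and storage volume (m^3)
area_ha = rng.lognormal(mean=0.0, sigma=1.0, size=300)
vol_m3 = 2500.0 * area_ha**1.4 * rng.lognormal(0.0, 0.25, 300)

# Fit log V = log a + b log A by least squares
b_hat, loga_hat = np.polyfit(np.log(area_ha), np.log(vol_m3), 1)
a_hat = np.exp(loga_hat)
print(f"fitted model: V = {a_hat:.0f} * A^{b_hat:.2f}")

# Apply to mapped (e.g., National Wetlands Inventory) basin areas
new_areas = np.array([0.5, 1.0, 5.0])
print("predicted storage (m^3):", np.round(a_hat * new_areas**b_hat, 0))
```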
Volume, Conservation and Instruction: A Classroom Based Solomon Four Group Study of Conflict.
ERIC Educational Resources Information Center
Rowell, J. A.; Dawson, C. J.
1981-01-01
Summarizes a study to widen the applicability of Piagetian theory-based conflict methodology from individual situations to entire classes. A Solomon four group design was used to study effects of conflict instruction on students' (N=127) ability to conserve volume of noncompressible matter and to apply that knowledge to gas volume. (Author/JN)
ERIC Educational Resources Information Center
Weinberg, Jessica P., Ed.; O'Bryan, Erin L., Ed.; Moll, Laura A., Ed.; Haugan, Jason D., Ed.
The five papers included in this volume approach the study of American Indian languages from a diverse array of methodological and theoretical approaches to linguistics. Two papers focus on approaches that come from the applied linguistics tradition, emphasizing ethnolinguistics and discourse analysis: Sonya Bird's paper "A Cross Cultural…
ERIC Educational Resources Information Center
Schaumann, Leif
Intended as a companion piece to volume 7 in the Method Series, Pharmaceutical Supply System Planning (CE 024 234), this fifth of six volumes in the International Health Planning Reference Series is a combined literature review and annotated bibliography dealing with alternative methodologies for planning and analyzing pharmaceutical supply…
ERIC Educational Resources Information Center
Tao, Fumiyo; And Others
This volume contains technical and supporting materials that supplement Volume I, which describes upward mobility programs for disadvantaged and dislocated workers in the service sector. Appendix A is a detailed description of the project methodology, including data collection methods and information on data compilation, processing, and analysis.…
NASA Astrophysics Data System (ADS)
Richardson, Chris T.; Kannappan, Sheila; Bittner, Ashley; Isaac, Rohan; RESOLVE
2017-01-01
We present a novel methodology for modeling emission line galaxy samples that span the entire BPT diagram. Our methodology has several advantages over current modeling schemes: the free variables in the model are identical for both AGN and SF galaxies; these free variables are more closely linked to observable galaxy properties; and the ionizing spectra including an AGN and starlight are handled self-consistently rather than empirically. We show that our methodology is capable of fitting the vast majority of SDSS galaxies that fall within the traditional regions of galaxy classification on the BPT diagram. We also present current results for relaxing classification boundaries and extending our galaxies into the dwarf regime, using the REsolved Spectroscopy of a Local VolumE (RESOLVE) survey and the Environmental COntext (ECO) catalog, with special attention to compact blue E/S0s. We compare this methodology to PCA decomposition of the spectra. This work is supported by National Science Foundation awards AST-0955368 and CISE/ACI-1156614.
Assessing the Effects of Software Platforms on Volumetric Segmentation of Glioblastoma
Dunn, William D.; Aerts, Hugo J.W.L.; Cooper, Lee A.; Holder, Chad A.; Hwang, Scott N.; Jaffe, Carle C.; Brat, Daniel J.; Jain, Rajan; Flanders, Adam E.; Zinn, Pascal O.; Colen, Rivka R.; Gutman, David A.
2017-01-01
Background Radiological assessments of biologically relevant regions in glioblastoma have been associated with genotypic characteristics, implying a potential role in personalized medicine. Here, we assess the reproducibility and association with survival of two volumetric segmentation platforms and explore how methodology could impact subsequent interpretation and analysis. Methods Post-contrast T1- and T2-weighted FLAIR MR images of 67 TCGA patients were segmented into five distinct compartments (necrosis, contrast-enhancement, FLAIR, post contrast abnormal, and total abnormal tumor volumes) by two quantitative image segmentation platforms: 3D Slicer and a method based on Velocity AI and FSL. We investigated the internal consistency of each platform by correlation statistics, association with survival, and concordance with consensus neuroradiologist ratings using ordinal logistic regression. Results We found high correlations between the two platforms for FLAIR, post contrast abnormal, and total abnormal tumor volumes (Spearman's r(67) = 0.952, 0.959, and 0.969 respectively). Only modest agreement was observed for necrosis and contrast-enhancement volumes (r(67) = 0.693 and 0.773 respectively), likely arising from differences in manual and automated segmentation methods of these regions by 3D Slicer and Velocity AI/FSL, respectively. Survival analysis based on AUC revealed significant predictive power of both platforms for the following volumes: contrast-enhancement, post contrast abnormal, and total abnormal tumor volumes. Finally, ordinal logistic regression demonstrated correspondence to manual ratings for several features. Conclusion Tumor volume measurements from both volumetric platforms produced highly concordant and reproducible estimates across platforms for general features. As automated or semi-automated volumetric measurements replace manual linear or area measurements, it will become increasingly important to keep in mind that measurement differences between segmentation platforms for more detailed features could influence downstream survival or radiogenomic analyses.
Estimation of truck volumes and flows
DOT National Transportation Integrated Search
2004-08-01
This research presents a statistical approach for estimating truck volumes, based : primarily on classification counts and information on roadway functionality, employment, : sales volume and number of establishments within the state. Models have bee...
DOT National Transportation Integrated Search
2017-02-01
As part of the Federal Highway Administration (FHWA) Traffic Analysis Toolbox (Volume XIII), this guide was designed to help corridor stakeholders implement the Integrated Corridor Management (ICM) Analysis, Modeling, and Simulation (AMS) methodology...
Estimation of gas and tissue lung volumes by MRI: functional approach of lung imaging.
Qanadli, S D; Orvoen-Frija, E; Lacombe, P; Di Paola, R; Bittoun, J; Frija, G
1999-01-01
The purpose of this work was to assess the accuracy of MRI for the determination of lung gas and tissue volumes. Fifteen healthy subjects underwent MRI of the thorax and pulmonary function tests [vital capacity (VC) and total lung capacity (TLC)] in the supine position. MR examinations were performed at inspiration and expiration. Lung volumes were measured by a previously validated technique on phantoms. Both individual and total lung volumes and capacities were calculated. MRI total vital capacity (VC(MRI)) was compared with spirometric vital capacity (VC(SP)). Capacities were correlated to lung volumes. Tissue volume (V(T)) was estimated as the difference between the total lung volume at full inspiration and the TLC. No significant difference was seen between VC(MRI) and VC(SP). Individual capacities were well correlated (r = 0.9) to static volume at full inspiration. The V(T) was estimated to be 836+/-393 ml. This preliminary study demonstrates that MRI can accurately estimate lung gas and tissue volumes. The proposed approach appears well suited for functional imaging of the lung.
Regression to fuzziness method for estimation of remaining useful life in power plant components
NASA Astrophysics Data System (ADS)
Alamaniotis, Miltiadis; Grelle, Austin; Tsoukalas, Lefteri H.
2014-10-01
Mitigation of severe accidents in power plants requires the reliable operation of all systems and the on-time replacement of mechanical components. Therefore, the continuous surveillance of power systems is a crucial concern for the overall safety, cost control, and on-time maintenance of a power plant. In this paper a methodology called regression to fuzziness is presented that estimates the remaining useful life (RUL) of power plant components. The RUL is defined as the difference between the time that a measurement was taken and the estimated failure time of that component. The methodology aims to compensate for a potential lack of historical data by modeling an expert's operational experience and expertise applied to the system. It initially identifies critical degradation parameters and their associated value range. Once completed, the operator's experience is modeled through fuzzy sets which span the entire parameter range. This model is then synergistically used with linear regression and a component's failure point to estimate the RUL. The proposed methodology is tested on estimating the RUL of a turbine (the basic electrical generating component of a power plant) in three different cases. Results demonstrate the benefits of the methodology for components for which operational data are not readily available and emphasize the significance of the selection of fuzzy sets and the effect of knowledge representation on the predicted output. To verify its effectiveness, the methodology was benchmarked against a data-based simple linear regression model used for predictions, which was shown to perform equally well or worse than the presented methodology. Furthermore, the methodology comparison highlighted the improvement in estimation offered by the adoption of appropriate fuzzy sets for parameter representation.
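A compressed sketch of the two ingredients the abstract names, under illustrative assumptions: triangular fuzzy sets spanning a degradation parameter's range (standing in for expert knowledge), and a linear regression extrapolated to an assumed failure level to yield the RUL.

```python
# Compressed sketch of the two ingredients named above: triangular fuzzy sets
# spanning a degradation parameter's range (expert knowledge), and a linear
# regression extrapolated to an assumed failure threshold to obtain RUL.
# All values are illustrative, not the paper's turbine data.
import numpy as np

def tri(x, a, b, c):
    """Triangular fuzzy membership over [a, c] with peak at b."""
    return np.maximum(0.0, np.minimum((x - a) / (b - a), (c - x) / (c - b)))

# Degradation measurements (e.g., vibration amplitude) over operating time
t = np.array([0., 100., 200., 300., 400.])
x = np.array([1.0, 1.3, 1.9, 2.2, 2.8])
failure_level = 5.0                        # expert-defined failure point (assumed)

# Expert fuzzy sets over the parameter range: low / medium / high degradation
sets = {"low": (-1.0, 0.0, 2.5), "medium": (1.5, 3.0, 4.5), "high": (3.5, 5.0, 6.5)}
grades = {name: tri(x[-1], *abc) for name, abc in sets.items()}
print("current state memberships:", {k: round(float(v), 2) for k, v in grades.items()})

# Linear trend extrapolated to the failure level gives the RUL estimate
slope, intercept = np.polyfit(t, x, 1)
t_fail = (failure_level - intercept) / slope
print(f"estimated RUL: {t_fail - t[-1]:.0f} hours")
```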
Using GIS to Estimate Lake Volume from Limited Data (Lake and Reservoir Management)
Estimates of lake volume are necessary for calculating residence time and modeling pollutants. Modern GIS methods for calculating lake volume improve upon more dated technologies (e.g. planimeters) and do not require potentially inaccurate assumptions (e.g. volume of a frustum of...
Richard L. Williamson; Robert O. Curtis
1980-01-01
Equations are given for estimating merchantable volumes of second-growth Douglas-fir stands to specified breast-high and top-diameter limits, in cubic feet or board feet, from total volume in cubic feet and certain associated stand characteristics.
Software Size Estimation Using Expert Estimation: A Fuzzy Logic Approach
ERIC Educational Resources Information Center
Stevenson, Glenn A.
2012-01-01
For decades software managers have been using formal methodologies such as the Constructive Cost Model and Function Points to estimate the effort of software projects during the early stages of project development. While some research shows these methodologies to be effective, many software managers feel that they are overly complicated to use and…
Poggi, L A; Malizia, A; Ciparisse, J F; Gaudio, P
2016-10-01
An open issue still under investigation by several international entities working in the safety and security field for the foreseen nuclear fusion reactors is the estimation of source terms that are a hazard for operators and the public, and for the machine itself in terms of efficiency and integrity in case of severe accident scenarios. Source term estimation is a crucial safety issue to be addressed in future reactor safety assessments, and the estimates available at present are not sufficiently satisfactory. The lack of neutronic data, along with the insufficiently accurate methodologies used until now, calls for an integrated methodology for source term estimation that can provide predictions with adequate accuracy. This work proposes a complete methodology to estimate dust source terms, starting from a broad information gathering. The wide number of parameters that can influence dust source term production is reduced with statistical tools using a combination of screening, sensitivity analysis, and uncertainty analysis. Finally, a preliminary and simplified methodology for predicting dust source term production in future devices is presented.
The effects of survey question wording on rape estimates: evidence from a quasi-experimental design.
Fisher, Bonnie S
2009-02-01
The measurement of rape is among the leading methodological issues in the violence against women field. Methodological discussion continues to focus on decreasing measurement errors and improving the accuracy of rape estimates. The current study used a quasi-experimental design to examine the effect of survey question wording on estimates of completed and attempted rape and verbal threats of rape. Specifically, the study statistically compares self-reported rape estimates from two nationally representative studies of college women's sexual victimization experiences, the National College Women Sexual Victimization study and the National Violence Against College Women study. Results show significant differences between the two sets of rape estimates, with National Violence Against College Women study rape estimates ranging from 4.4% to 10.4% lower than the National College Women Sexual Victimization study rape estimates. Implications for future methodological research are discussed.
Methodology for estimating helicopter performance and weights using limited data
NASA Technical Reports Server (NTRS)
Baserga, Claudio; Ingalls, Charles; Lee, Henry; Peyran, Richard
1990-01-01
Methodology is developed and described for estimating the flight performance and weights of a helicopter for which limited data are available. The methodology is based on assumptions which couple knowledge of the technology of the helicopter under study with detailed data from well documented helicopters thought to be of similar technology. The approach, analysis assumptions, technology modeling, and the use of reference helicopter data are discussed. Application of the methodology is illustrated with an investigation of the Agusta A129 Mangusta.
Accuracy and variability of tumor burden measurement on multi-parametric MRI
NASA Astrophysics Data System (ADS)
Salarian, Mehrnoush; Gibson, Eli; Shahedi, Maysam; Gaed, Mena; Gómez, José A.; Moussa, Madeleine; Romagnoli, Cesare; Cool, Derek W.; Bastian-Jordan, Matthew; Chin, Joseph L.; Pautler, Stephen; Bauman, Glenn S.; Ward, Aaron D.
2014-03-01
Measurement of prostate tumour volume can inform prognosis and treatment selection, including an assessment of the suitability and feasibility of focal therapy, which can potentially spare patients the deleterious side effects of radical treatment. Prostate biopsy is the clinical standard for diagnosis but provides limited information regarding tumour volume due to sparse tissue sampling. A non-invasive means for accurate determination of tumour burden could be of clinical value and an important step toward reduction of overtreatment. Multi-parametric magnetic resonance imaging (MPMRI) is showing promise for prostate cancer diagnosis. However, the accuracy and inter-observer variability of prostate tumour volume estimation based on separate expert contouring of T2-weighted (T2W), dynamic contrast-enhanced (DCE), and diffusion-weighted (DW) MRI sequences acquired using an endorectal coil at 3T is currently unknown. We investigated this question using a histologic reference standard based on a highly accurate MPMRI-histology image registration and a smooth interpolation of planimetric tumour measurements on histology. Our results showed that prostate tumour volumes estimated based on MPMRI consistently overestimated histological reference tumour volumes. The variability of tumour volume estimates across the different pulse sequences exceeded inter-observer variability within any sequence. Tumour volume estimates on DCE MRI provided the lowest inter-observer variability and the highest correlation with histology tumour volumes, whereas the apparent diffusion coefficient (ADC) maps provided the lowest volume estimation error. If validated on a larger data set, the observed correlations could support the development of automated prostate tumour volume segmentation algorithms as well as correction schemes for tumour burden estimation on MPMRI.
Brain Volume Estimation Enhancement by Morphological Image Processing Tools.
Zeinali, R; Keshtkar, A; Zamani, A; Gharehaghaji, N
2017-12-01
Volume estimation of the brain is important for many neurological applications; it is necessary for measuring brain growth and changes in the brains of normal and abnormal patients. Thus, accurate brain volume measurement is very important. Magnetic resonance imaging (MRI) is the method of choice for volume quantification due to excellent levels of image resolution and between-tissue contrast. The stereology method is a good method for estimating volume, but it requires segmenting enough MRI slices at good resolution. In this study, the aim was to enhance the stereology method for brain volume estimation using fewer MRI slices at lower resolution. A program for calculating volume using the stereology method is introduced. By applying the morphological operation of dilation, the stereology method was enhanced. For evaluation, we used T1-weighted MR images from a digital phantom in BrainWeb, which provides ground truth. The volumes of 20 normal brains extracted from BrainWeb were calculated, and the volumes of white matter, gray matter, and cerebrospinal fluid with given dimensions were estimated correctly. Volume calculations with the stereology method were made in different cases, and the Root Mean Square Error (RMSE) was measured in three of them: Case I with T=5, d=5; Case II with T=10, d=10; and Case III with T=20, d=20 (T = slice thickness, d = resolution, as stereology parameters). Comparing the results of the two methods shows that the RMSE values for the proposed method are smaller than those of the standard stereology method. Using the morphological operation of dilation thus enhances the stereological volume estimation method; in cases with fewer MRI slices and fewer test points, the proposed method performs much better than the standard stereology method.
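The point-counting estimator underlying this abstract is the Cavalieri method: volume equals slice spacing times area-per-point times the number of grid points hitting the object. A sketch on a synthetic sphere (so the truth is known) follows; it shows the base estimator only, not the authors' dilation enhancement.

```python
# Sketch of the Cavalieri point-counting estimator underlying the stereology
# method above: volume = slice spacing * area-per-point * total points hit.
# The "brain" here is a synthetic sphere, so the truth is known exactly.
import numpy as np

rng = np.random.default_rng(5)

R = 60.0                       # sphere radius, mm -> true V = 4/3 pi R^3
T = 10.0                       # spacing between sampled slices, mm
d = 5.0                        # test-point grid spacing, mm (area per point d^2)

z0 = rng.uniform(0, T)         # random start position of the slice stack
z = np.arange(-R + z0, R, T)   # sampled slice positions

hits = 0
for zi in z:
    r2 = R**2 - zi**2          # squared radius of the circular slice profile
    gx, gy = np.meshgrid(np.arange(-R, R, d), np.arange(-R, R, d))
    hits += int(np.sum(gx**2 + gy**2 <= r2))

v_est = T * d**2 * hits
v_true = 4.0 / 3.0 * np.pi * R**3
print(f"stereology estimate: {v_est/1e3:.1f} cm^3, truth: {v_true/1e3:.1f} cm^3")
```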
Load and resistance factor rating (LRFR) in New York State : volume II.
DOT National Transportation Integrated Search
2011-09-01
This report develops a Load and Resistance Factor Rating (NYS-LRFR) methodology : for New York bridges. The methodology is applicable for the rating of existing : bridges, the posting of under-strength bridges, and checking Permit trucks. The : propo...
Load and resistance factor rating (LRFR) in NYS : volume I final report.
DOT National Transportation Integrated Search
2011-09-01
This report develops a Load and Resistance Factor Rating (NYS-LRFR) methodology for New York bridges. The methodology is applicable for the rating of existing bridges, the posting of under-strength bridges, and checking Permit trucks. The proposed LR...
ERIC Educational Resources Information Center
Nordstrum, Lee E.; LeMahieu, Paul G.; Dodd, Karen
2017-01-01
Purpose: This paper is one of seven in this volume elaborating different approaches to quality improvement in education. This paper aims to delineate a methodology called Deliverology. Design/methodology/approach: The paper presents the origins, theoretical foundations, core principles and a case study showing an application of Deliverology in the…
1998 motor vehicle occupant safety survey. Volume 1, methodology report
DOT National Transportation Integrated Search
2000-03-01
This is the Methodology Report for the 1998 Motor Vehicle Occupant Safety Survey. The survey is conducted on a biennial basis (initiated in 1994), and is administered by telephone to a randomly selected national sample. Two questionnaires are used, e...
National survey of drinking and driving attitudes and behaviors : 2008. Volume 3, methodology report
DOT National Transportation Integrated Search
2010-08-01
This report presents the details of the methodology used for the 2008 National Survey of Drinking and Driving Attitudes and Behaviors conducted by Gallup, Inc. for : the National Highway Traffic Safety Administration (NHTSA). This survey represents t...
Dias, Jorge; Malheiro, Jorge; Almeida, Manuela; Dias, Leonídio; Silva-Ramos, Miguel; Martins, La Salete; Xambre, Luís; Castro-Henriques, António
2015-05-01
Donated kidney volume influences post-transplant outcomes and graft survival. We evaluated the relationship between living-donor kidney volume and recipient graft function at 12 months post-transplantation, exploring a volume threshold for a suboptimal graft function, and compared two different formulas of volume estimation. A retrospective analysis of 82 pairs of living-donor kidney transplants was conducted. Donor renal volumes were estimated from computerized tomography scans using the ellipsoid formula and the voxel counting technique. Linear and restricted cubic regression spline was used to analyze the association of volume with graft function. Additionally, we determined the correlation between the two volume estimation formulas and established a correction factor for the ellipsoid formula. Renal volume (adjusted to recipient BSA) had the strongest independent effect (B = 1.65 per 10 ml/m(2) increase, p value <0.001) on graft function at 12 months. The eGFR at 12 months was 52.5, 63.6 and 67.6 ml/min/1.73 m(2) for the low, medium and high volume ratio terciles, respectively (p value <0.001). The odds of a GFR <50 ml/min became significantly reduced with volumes above 145 cc/1.73 m(2). A strong positive correlation between the two formulas was identified (R(2) = 0.705), but the optimal correction factor for our cohort was 0.566. In a Caucasian population, higher donor kidney volumes estimated from preoperative CT scans are associated with higher recipient eGFRs at 12 months after live-donor transplantation. Using this criterion, transplant teams can potentially improve selection of living donors if multiple donors are available. However, the need for precise estimation of donor kidney volumes should not be overlooked.
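The two volume formulas compared above differ in a scale factor applied to the same product of dimensions: the classical ellipsoid constant pi/6 (about 0.524) versus the cohort-optimal 0.566 reported in the study. A small sketch with hypothetical kidney dimensions makes the difference explicit.

```python
# Sketch of the two estimators compared above: the ellipsoid formula applied
# to three orthogonal kidney dimensions, and the same product rescaled by the
# cohort-specific correction factor (0.566) the authors report against voxel
# counting. The dimensions below are hypothetical, not study data.
import math

length, width, depth = 11.2, 5.5, 4.8     # cm, hypothetical CT measurements

product = length * width * depth
v_ellipsoid = math.pi / 6.0 * product      # classic ellipsoid: pi/6 ~ 0.524
v_corrected = 0.566 * product              # cohort-optimal factor from the study

print(f"ellipsoid formula:  {v_ellipsoid:.1f} cc")
print(f"corrected estimate: {v_corrected:.1f} cc")
# The study's graft-function modeling used volume adjusted to recipient body
# surface area, i.e. volume / BSA with BSA in m^2.
```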
Aerial photo volume tables for Douglas-fir in the Pacific Northwest.
Robert B. Pope
1961-01-01
The aerial photo volume tables in this report are tools to be used in obtaining better timber inventories. Volume estimates based on tables such as these, properly field checked, are generally cheaper than ground cruises of comparable accuracy. Photo volume tables also permit rough volume estimates to be made from aerial photos alone when limited time, bad weather, or...
Boka, Vasiliki-Ioanna; Argyropoulou, Aikaterini; Gikas, Evangelos; Angelis, Apostolis; Aligiannis, Nektarios; Skaltsounis, Alexios-Leandros
2015-11-01
A high-performance thin-layer chromatographic methodology was developed and validated for the isolation and quantitative determination of oleuropein in two extracts of Olea europaea leaves. OLE_A was a crude acetone extract, while OLE_AA was its defatted residue. Initially, high-performance thin-layer chromatography was employed in the purification of oleuropein by fast centrifugal partition chromatography, replacing high-performance liquid chromatography at the stage of determining the distribution coefficient and the retention volume. A densitometric method was developed for the determination of the distribution coefficients, K(C) = C(S)/C(M). The total concentrations of the target compound in the stationary phase (C(S)) and in the mobile phase (C(M)) were calculated from the areas measured in the high-performance thin-layer chromatogram. The estimated K(C) was also used for the calculation of the retention volume, V(R), with a chromatographic retention equation. The obtained data were successfully applied to the purification of oleuropein, and the experimental results confirmed the theoretical predictions, indicating that high-performance thin-layer chromatography could be an important counterpart in the phytochemical study of natural products. The isolated oleuropein (purity > 95%) was subsequently used for the estimation of its content in each extract with a simple, sensitive, and accurate high-performance thin-layer chromatography method. The best-fit calibration curve from 1.0 µg/track to 6.0 µg/track of oleuropein was polynomial, and quantification was achieved by UV detection at λ 240 nm. The method was validated, giving an efficient and high-throughput procedure, with the relative standard deviation of repeatability and intermediate precision not exceeding 4.9% and accuracy between 92% and 98% (recovery rates). Moreover, the method was validated for robustness, limit of quantitation, and limit of detection. The amount of oleuropein in OLE_A, OLE_AA, and an aqueous extract of olive leaves was estimated to be 35.5% ± 2.7, 51.5% ± 1.4, and 12.5% ± 0.12, respectively. Statistical analysis proved that the method is repeatable and selective, and can be effectively applied for the estimation of oleuropein in olive leaf extracts, potentially replacing the high-performance liquid chromatography methodologies developed so far. Thus, the phytochemical investigation of oleuropein could be based on high-performance thin-layer chromatography coupled with separation processes, such as fast centrifugal partition chromatography, showing efficacy and credibility.
Inferring the risk factors behind the geographical spread and transmission of Zika in the Americas
Bóta, András; Gangavarapu, Karthik; Kraemer, Moritz U. G.; Grubaugh, Nathan D.
2018-01-01
Background An unprecedented Zika virus epidemic occurred in the Americas during 2015-2016. The size of the epidemic in conjunction with newly recognized health risks associated with the virus attracted significant attention across the research community. Our study complements several recent studies which have mapped epidemiological elements of Zika, by introducing a newly proposed methodology to simultaneously estimate the contribution of various risk factors for geographic spread resulting in local transmission and to compute the risk of spread (or re-introductions) between each pair of regions. The focus of our analysis is on the Americas, where the set of regions includes all countries, overseas territories, and the states of the US. Methodology/Principal findings We present a novel application of the Generalized Inverse Infection Model (GIIM). The GIIM model uses real observations from the outbreak and seeks to estimate the risk factors driving transmission. The observations are derived from the dates of reported local transmission of Zika virus in each region, the network structure is defined by the passenger air travel movements between all pairs of regions, and the risk factors considered include regional socioeconomic factors, vector habitat suitability, travel volumes, and epidemiological data. The GIIM relies on a multi-agent based optimization method to estimate the parameters, and utilizes a data driven stochastic-dynamic epidemic model for evaluation. As expected, we found that mosquito abundance, incidence rate at the origin region, and human population density are risk factors for Zika virus transmission and spread. Surprisingly, air passenger volume was less impactful, and the most significant factor was (a negative relationship with) the regional gross domestic product (GDP) per capita. Conclusions/Significance Our model generates country level exportation and importation risk profiles over the course of the epidemic and provides quantitative estimates for the likelihood of introduced Zika virus resulting in local transmission, between all origin-destination travel pairs in the Americas. Our findings indicate that local vector control, rather than travel restrictions, will be more effective at reducing the risks of Zika virus transmission and establishment. Moreover, the inverse relationship between Zika virus transmission and GDP suggests that Zika cases are more likely to occur in regions where people cannot afford to protect themselves from mosquitoes. The modeling framework is not specific for Zika virus, and could easily be employed for other vector-borne pathogens with sufficient epidemiological and entomological data.
Estimating volume, biomass, and potential emissions of hand-piled fuels
Clinton S. Wright; Cameron S. Balog; Jeffrey W. Kelly
2009-01-01
Dimensions, volume, and biomass were measured for 121 hand-constructed piles composed primarily of coniferous (n = 63) and shrub/hardwood (n = 58) material at sites in Washington and California. Equations using pile dimensions, shape, and type allow users to accurately estimate the biomass of hand piles. Equations for estimating true pile volume from simple geometric...
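A hedged sketch of the kind of geometric equation referred to above: a dome-shaped pile idealized as a half-ellipsoid, with biomass obtained from gross volume, a packing ratio, and wood density. The packing ratio and density below are illustrative assumptions, not the paper's fitted coefficients.

```python
# Hedged sketch: one common geometric idealization for a dome-shaped hand
# pile is a half-ellipsoid, V = pi/6 * length * width * height. The packing
# ratio and wood density below are illustrative assumptions only.
import math

def half_ellipsoid_volume(length_m, width_m, height_m):
    """Gross pile volume (m^3) for a dome-shaped pile."""
    return math.pi / 6.0 * length_m * width_m * height_m

v_gross = half_ellipsoid_volume(3.0, 2.0, 1.5)
packing_ratio = 0.20          # fraction of gross volume that is wood (assumed)
wood_density = 450.0          # kg/m^3, oven-dry (assumed)

biomass_kg = v_gross * packing_ratio * wood_density
print(f"gross volume: {v_gross:.2f} m^3, biomass: {biomass_kg:.0f} kg")
```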
Estimating bark thicknesses of common Appalachian hardwoods
R. Edward Thomas; Neal D. Bennett
2014-01-01
Knowing the thickness of bark along the stem of a tree is critical to accurately estimate residue and, more importantly, estimate the volume of solid wood available. Determining the volume or weight of bark for a log is important because bark and wood mass are typically separated while processing logs, and accurate determination of volume is problematic. Bark thickness...
Preparation of modified semi-coke by microwave heating and adsorption kinetics of methylene blue.
Wang, Xin; Peng, Jin-Hui; Duan, Xin-Hui; Srinivasakannan, Chandrasekar
2013-01-01
Preparation of modified semi-coke has been achieved, using phosphoric acid as the modifying agent, by microwave heating of virgin semi-coke. Process optimization using a Central Composite Design (CCD) of the Response Surface Methodology (RSM) technique for the preparation of modified semi-coke is presented in this paper. The optimum conditions for producing modified semi-coke were: phosphoric acid concentration 2.04, heating time 20 minutes, and temperature 587 degrees C, with an optimum iodine value of 862 mg/g and a yield of 47.48%. The textural characteristics of the modified semi-coke were analyzed using scanning electron microscopy (SEM) and the nitrogen adsorption isotherm. The BET surface area of the modified semi-coke was estimated to be 989.60 m2/g, with a pore volume of 0.74 cm3/g and a pore diameter of 3.009 nm, with micro-pore volume contributing 62.44%. The Methylene Blue monolayer adsorption capacity was found to be mg/g at K. The adsorption capacity of the modified semi-coke highlights its suitability for liquid-phase adsorption applications, with potential usage in wastewater treatment.
Stereology techniques in radiation biology
NASA Technical Reports Server (NTRS)
Kubinova, Lucie; Mao, XiaoWen; Janacek, Jiri; Archambeau, John O.; Nelson, G. A. (Principal Investigator)
2003-01-01
Clinicians involved in conventional radiation therapy are very concerned about the dose-response relationships of normal tissues. Before proceeding to new clinical protocols, radiation biologists involved with conformal proton therapy believe it is necessary to quantify the dose response and tolerance of the organs and tissues that will be irradiated. An important focus is on the vasculature. This presentation reviews the methodology and format of using confocal microscopy and stereological methods to quantify tissue parameters, cell number, tissue volume and surface area, and vessel length using the microvasculature as a model tissue. Stereological methods and their concepts are illustrated using an ongoing study of the dose response of the microvessels in proton-irradiated hemibrain. Methods for estimating the volume of the brain and the brain cortex, the total number of endothelial cells in cortical microvessels, the length of cortical microvessels, and the total surface area of cortical microvessel walls are presented step by step in a way understandable for readers with little mathematical background. It is shown that stereological techniques, based on a sound theoretical basis, are powerful and reliable and have been used successfully.
ERIC Educational Resources Information Center
Borthwick, J.; Knight, B.; Bender, A.; Loveder, P.
These two volumes provide information on the scope of adult and community education (ACE) in Australia and implications for improved data collection and reporting. Volume 1 begins with a glossary. Chapter 1 addresses project objectives and processes and methodology. Chapter 2 analyzes the scope and diversity of ACE in terms of what is currently…
Surrogate Plant Data Base : Volume 2. Appendix C : Facilities Planning Baseline Data
DOT National Transportation Integrated Search
1983-05-01
This four volume report consists of a data base describing "surrogate" automobile and truck manufacturing plants developed as part of a methodology for evaluating capital investment requirements in new manufacturing facilities to build new fleets of ...
Surrogate Plant Data Base : Volume 4. Appendix E : Medium and Heavy Truck Manufacturing
DOT National Transportation Integrated Search
1983-05-01
This four volume report consists of a data base describing "surrogate" automobile and truck manufacturing plants developed as part of a methodology for evaluating capital investment requirements in new manufacturing facilities to build new fleets of ...
Supplementary Computer Generated Cueing to Enhance Air Traffic Controller Efficiency
2013-03-01
assess the complexity of air traffic control (Mogford, Guttman, Morrow, & Kopardekar, 1995; Laudeman, Shelden, Branstrom, & Brasil, 1998). Controllers... Behavioral Sciences: Volume 1: Methodological Issues; Volume 2: Statistical Issues, 1, 257. Laudeman, I. V., Shelden, S. G., Branstrom, R., & Brasil
NASA Astrophysics Data System (ADS)
Reynerson, Charles Martin
This research has been performed to create concept design and economic feasibility data for space business parks. A space business park is a commercially run multi-use space station facility designed for use by a wide variety of customers. Both space hardware and crew are considered as revenue producing payloads. Examples of commercial markets may include biological and materials research, processing, and production, space tourism habitats, and satellite maintenance and resupply depots. This research develops a design methodology and an analytical tool to create feasible preliminary design information for space business parks. The design tool is validated against a number of real facility designs. Appropriate model variables are adjusted to ensure that statistical approximations are valid for subsequent analyses. The tool is used to analyze the effect of various payload requirements on the size, weight and power of the facility. The approach for the analytical tool was to input potential payloads as simple requirements, such as volume, weight, power, crew size, and endurance. In creating the theory, basic principles are used and combined with parametric estimation of data when necessary. Key system parameters are identified for overall system design. Typical ranges for these key parameters are identified based on real human spaceflight systems. To connect the economics to design, a life-cycle cost model is created based upon facility mass. This rough cost model estimates potential return on investments, initial investment requirements and number of years to return on the initial investment. Example cases are analyzed for both performance and cost driven requirements for space hotels, microgravity processing facilities, and multi-use facilities. In combining both engineering and economic models, a design-to-cost methodology is created for more accurately estimating the commercial viability for multiple space business park markets.
Filgueiras, Paulo R; Terra, Luciana A; Castro, Eustáquio V R; Oliveira, Lize M S L; Dias, Júlio C M; Poppi, Ronei J
2015-09-01
This paper aims to estimate the temperature equivalent to 10% (T10%), 50% (T50%) and 90% (T90%) of distilled volume in crude oils using (1)H NMR and support vector regression (SVR). Confidence intervals for the predicted values were calculated using a boosting-type ensemble method in a procedure called ensemble support vector regression (eSVR). The estimated confidence intervals obtained by eSVR were compared with previously accepted calculations from partial least squares (PLS) models and a boosting-type ensemble applied in the PLS method (ePLS). By using the proposed boosting strategy, it was possible to identify outliers in the T10% property dataset. The eSVR procedure improved the accuracy of the distillation temperature predictions in relation to standard PLS, ePLS and SVR. For T10%, a root mean square error of prediction (RMSEP) of 11.6°C was obtained in comparison with 15.6°C for PLS, 15.1°C for ePLS and 28.4°C for SVR. The RMSEPs for T50% were 24.2°C, 23.4°C, 22.8°C and 14.4°C for PLS, ePLS, SVR and eSVR, respectively. For T90%, the values of RMSEP were 39.0°C, 39.9°C and 39.9°C for PLS, ePLS, SVR and eSVR, respectively. The confidence intervals calculated by the proposed boosting methodology presented acceptable values for the three properties analyzed; however, they were lower than those calculated by the standard methodology for PLS.
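The eSVR idea reduces to training SVR models on bootstrap resamples and reading a confidence band off the spread of their predictions. The sketch below does this with scikit-learn on synthetic stand-ins for the NMR/distillation data; hyperparameters and data are illustrative, not the paper's configuration.

```python
# Sketch of the ensemble-SVR (eSVR) idea described above: train SVR models on
# bootstrap resamples and derive a confidence band from the spread of their
# predictions. Data are synthetic stand-ins for the NMR/distillation pairs.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(6)
n, p = 120, 10
X = rng.normal(size=(n, p))                          # stand-in spectral features
y = 200 + 30 * X[:, 0] - 15 * X[:, 1] + rng.normal(0, 8, n)   # e.g. T50%, deg C

X_train, y_train, X_test = X[:100], y[:100], X[100:]

preds = []
for _ in range(200):                                 # bootstrap ensemble
    idx = rng.integers(0, 100, size=100)
    model = SVR(kernel="rbf", C=100.0, epsilon=1.0).fit(X_train[idx], y_train[idx])
    preds.append(model.predict(X_test))
preds = np.array(preds)                              # shape (200, n_test)

mean = preds.mean(axis=0)
lo, hi = np.percentile(preds, [2.5, 97.5], axis=0)   # 95% ensemble interval
for m, a, b in zip(mean[:3], lo[:3], hi[:3]):
    print(f"predicted T50%: {m:6.1f} degC  (95% interval {a:6.1f} .. {b:6.1f})")
```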
Elci, Hakan; Turk, Necdet
2014-01-01
Block volumes are generally estimated by analyzing discontinuity spacing measurements obtained either from scan lines placed over rock exposures or from borehole cores. Discontinuity spacing measurements made at the Mesozoic limestone quarries in the Karaburun Peninsula were used to estimate the average block volumes that could be produced from them, using the methods suggested in the literature. The Block Quality Designation (BQD) ratio method proposed by the authors was found to give rock block volumes of the same order as the volumetric joint count (J(v)) method. Moreover, the dimensions of the 2378 blocks produced between 2009 and 2011 in the working quarries were recorded. Assuming that each block surface is a discontinuity, the mean block volume (V(b)), the mean volumetric joint count (J(vb)), and the mean block shape factor of the blocks were determined and compared with the mean in situ block volume (V(in)) and volumetric joint count (J(vi)) values estimated from the in situ discontinuity measurements. The established relations are presented as a chart to be used in practice for estimating the mean volume of blocks that can be obtained from a quarry site by analyzing rock mass discontinuity spacing measurements.
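A hedged sketch of the volumetric joint count arithmetic referenced in this abstract: J(v) is the sum of reciprocal mean spacings over the joint sets, and mean block volume is commonly taken as V(b) = beta * J(v)^-3, a Palmstrom-type relation. The spacings and shape factor below are illustrative, not the Karaburun quarry data.

```python
# Hedged sketch of the volumetric joint count approach referenced above.
# Spacings and the block shape factor are illustrative assumptions.
spacings_m = [0.8, 1.1, 1.5]          # mean spacing of three joint sets (assumed)
beta = 36.0                           # block shape factor, ~27-40 typical (assumed)

jv = sum(1.0 / s for s in spacings_m) # volumetric joint count, joints/m
vb = beta * jv**-3                    # mean block volume, m^3

print(f"Jv = {jv:.2f} joints/m, estimated mean block volume = {vb:.2f} m^3")
```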
Ho, Hsing-Hao; Li, Ya-Hui; Lee, Jih-Chin; Wang, Chih-Wei; Yu, Yi-Lin; Hueng, Dueng-Yuan; Ma, Hsin-I; Hsu, Hsian-He; Juan, Chun-Jung
2018-01-01
We estimated the volume of vestibular schwannomas by an ice cream cone formula using thin-sliced magnetic resonance images (MRI) and compared the estimation accuracy among different estimating formulas and between different models. The study was approved by a local institutional review board. A total of 100 patients with vestibular schwannomas examined by MRI between January 2011 and November 2015 were enrolled retrospectively. Informed consent was waived. Volumes of vestibular schwannomas were estimated by cuboidal, ellipsoidal, and spherical formulas based on a one-component model, and cuboidal, ellipsoidal, Linskey's, and ice cream cone formulas based on a two-component model. The estimated volumes were compared to the volumes measured by planimetry. Intraobserver reproducibility and interobserver agreement were tested. Estimation error, including absolute percentage error (APE) and percentage error (PE), was calculated. Statistical analysis included intraclass correlation coefficient (ICC), linear regression analysis, one-way analysis of variance, and paired t-tests with P < 0.05 considered statistically significant. Overall tumor size was 4.80 ± 6.8 mL (mean ± standard deviation). All ICCs were no less than 0.992, suggestive of high intraobserver reproducibility and high interobserver agreement. Cuboidal formulas significantly overestimated the tumor volume by a factor of 1.9 to 2.4 (P ≤ 0.001). The one-component ellipsoidal and spherical formulas overestimated the tumor volume with an APE of 20.3% and 29.2%, respectively. The two-component ice cream cone method and the ellipsoidal and Linskey's formulas significantly reduced the APE to 11.0%, 10.1%, and 12.5%, respectively (all P < 0.001). The ice cream cone method and the other two-component formulas, including the ellipsoidal and Linskey's formulas, allow for estimation of vestibular schwannoma volume more accurately than all one-component formulas.
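One plausible reading of the two-component "ice cream cone" model is an ellipsoidal cisternal "scoop" plus a conical intracanalicular stem; the paper's exact dimension conventions may differ. A small sketch with hypothetical MRI measurements follows.

```python
# Hedged sketch of a two-component "ice cream cone" volume: the cisternal part
# of the schwannoma modeled as an ellipsoid ("scoop") plus the intracanalicular
# part modeled as a cone. One plausible reading of the two-component model;
# all measurements below are hypothetical.
import math

# Hypothetical measurements (cm) from thin-slice MRI
scoop_d1, scoop_d2, scoop_d3 = 2.4, 2.0, 1.9   # cisternal component diameters
cone_base_d, cone_len = 0.7, 1.0               # canal component base and length

v_scoop = math.pi / 6.0 * scoop_d1 * scoop_d2 * scoop_d3    # ellipsoid volume
v_cone = math.pi / 12.0 * cone_base_d**2 * cone_len         # cone = pi r^2 h / 3

print(f"estimated tumor volume: {v_scoop + v_cone:.2f} mL")
```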
Development of regional stump-to-mill logging cost estimators
Chris B. LeDoux; John E. Baumgras
1989-01-01
Planning logging operations requires estimating the logging costs for the sale or tract being harvested. Decisions need to be made on equipment selection and its application to terrain. In this paper a methodology is described that has been developed and implemented to solve the problem of accurately estimating logging costs by region. The methodology blends field time...
Abdominal fat volume estimation by stereology on CT: a comparison with manual planimetry.
Manios, G E; Mazonakis, M; Voulgaris, C; Karantanas, A; Damilakis, J
2016-03-01
To deploy and evaluate a stereological point-counting technique on abdominal CT for the estimation of visceral (VAF) and subcutaneous abdominal fat (SAF) volumes. Stereological volume estimations based on point counting and systematic sampling were performed on images from 14 consecutive patients who had undergone abdominal CT. For the optimization of the method, five sampling intensities in combination with 100 and 200 points were tested. The optimum stereological measurements were compared with VAF and SAF volumes derived by the standard technique of manual planimetry on the same scans. Optimization analysis showed that the selection of 200 points along with a sampling intensity of 1/8 provided efficient volume estimations in less than 4 min for VAF and SAF together. The optimized stereology showed strong correlation with planimetry (VAF: r = 0.98; SAF: r = 0.98). No statistical differences were found between the two methods (VAF: P = 0.81; SAF: P = 0.83). The 95% limits of agreement were also acceptable (VAF: -16.5%, 16.1%; SAF: -10.8%, 10.7%) and the repeatability of stereology was good (VAF: CV = 4.5%; SAF: CV = 3.2%). Stereology may be successfully applied to CT images for the efficient estimation of abdominal fat volume and may constitute a good alternative to the conventional planimetric technique. Abdominal obesity is associated with increased risk of disease and mortality. Stereology may quantify visceral and subcutaneous abdominal fat accurately and consistently. The application of stereology to estimating abdominal fat volume reduces processing time. Stereology is an efficient alternative method for estimating abdominal fat volume.
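The estimator behind this kind of point counting is the Cavalieri principle: counted grid points, times the area each point represents, times the section spacing. A minimal sketch with made-up counts:

```python
def cavalieri_volume(points_per_section, area_per_point_cm2, spacing_cm):
    """V = (total counted grid points) * (area per point) * (section spacing)."""
    return sum(points_per_section) * area_per_point_cm2 * spacing_cm

# e.g., grid points falling on visceral fat in five sampled CT sections
vaf_cm3 = cavalieri_volume([18, 24, 31, 27, 20],
                           area_per_point_cm2=4.0, spacing_cm=2.5)
print(f"VAF ~ {vaf_cm3:.0f} cm^3")
```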
A Radial Basis Function Approach to Financial Time Series Analysis
1993-12-01
including efficient methods for parameter estimation and pruning, a pointwise prediction error estimator, and a methodology for controlling the "data... collection of practical techniques to address these issues for a modeling methodology. Radial Basis Function networks. These techniques include efficient... methodology often then amounts to a careful consideration of the interplay between model complexity and reliability. These will be recurrent themes
ERIC Educational Resources Information Center
Bateman, Donald R.; Zidonis, Frank J.
In the introduction to this volume of a two volume document (See also TE 002 131.) written for curriculum developers, Donald Bateman identifies the recent periods in the development of linguistic thought and methodology, and presents language curriculum development as the continuing exploration of the processes of evolving linguistic structures.…
A prototype software methodology for the rapid evaluation of biomanufacturing process options.
Chhatre, Sunil; Francis, Richard; O'Donovan, Kieran; Titchener-Hooker, Nigel J; Newcombe, Anthony R; Keshavarz-Moore, Eli
2007-10-01
A three-layered simulation methodology is described that rapidly evaluates biomanufacturing process options. In each layer, inferior options are screened out, while more promising candidates are evaluated further in the subsequent, more refined layer, which uses more rigorous models that require more data from time-consuming experimentation. Screening ensures laboratory studies are focused only on options showing the greatest potential. To simplify the screening, outputs of production level, cost and time are combined into a single value using multi-attribute-decision-making techniques. The methodology was illustrated by evaluating alternatives to an FDA (U.S. Food and Drug Administration)-approved process manufacturing rattlesnake antivenom. Currently, antivenom antibodies are recovered from ovine serum by precipitation/centrifugation and proteolyzed before chromatographic purification. Alternatives included increasing the feed volume, replacing centrifugation with microfiltration and replacing precipitation/centrifugation with a Protein G column. The best alternative used a higher feed volume and a Protein G step. By rapidly evaluating the attractiveness of options, the methodology facilitates efficient and cost-effective process development.
Gingerich, W.H.; Pityer, R.A.; Rach, J.J.
1987-01-01
1. Total blood volume and relative blood volumes in selected tissues were determined in non-anesthetized, confined rainbow trout by using 51Cr-labelled trout erythrocytes as a vascular space marker.2. Mean total blood volume was estimated to be 4.09 ± 0.55 ml/100 g, or about 75% of that estimated with the commonly used plasma space marker Evans blue dye.3. Relative tissue blood volumes were greatest in highly perfused tissues such as kidney, gills, brain and liver and least in mosaic muscle.4. Estimates of tissue vascular spaces, made using radiolabelled erythrocytes, were only 25–50% of those based on plasma space markers.5. The consistently smaller vascular volumes obtained with labelled erythrocytes could be explained by assuming that commonly used plasma space markers diffuse from the vascular compartment.
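The vascular-space arithmetic in this abstract follows the indicator-dilution principle: the distribution volume is the injected tracer divided by its concentration after mixing. A minimal Python sketch; the counts below are hypothetical, chosen to land near the reported ~4 ml/100 g:

```python
def dilution_volume_ml(injected_counts_per_min, counts_per_min_per_ml):
    """V = injected tracer activity / equilibrium activity concentration."""
    return injected_counts_per_min / counts_per_min_per_ml

v_ml = dilution_volume_ml(5.0e6, 4.9e5)   # hypothetical 51Cr-RBC counts
body_mass_g = 250.0
print(f"blood volume ~ {v_ml / body_mass_g * 100:.2f} ml/100 g")
```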
Proximity Navigation of Highly Constrained Spacecraft
NASA Technical Reports Server (NTRS)
Scarritt, S.; Swartwout, M.
2007-01-01
Bandit is a 3-kg automated spacecraft in development at Washington University in St. Louis. Bandit's primary mission is to demonstrate proximity navigation, including docking, around a 25-kg student-built host spacecraft. However, because of extreme constraints in mass, power and volume, traditional sensing and actuation methods are not available. In particular, Bandit carries only 8 fixed-magnitude cold-gas thrusters to control its 6 DOF motion. Bandit lacks true inertial sensing, and the ability to sense position relative to the host has error bounds that approach the size of the Bandit itself. Some of the navigation problems are addressed through an extremely robust, error-tolerant soft dock. In addition, we have identified a control methodology that performs well in this constrained environment: behavior-based velocity potential functions, which use a minimum-seeking method similar to Lyapunov functions. We have also adapted the discrete Kalman filter for use on Bandit for position estimation and have developed a similar measurement vs. propagation weighting algorithm for attitude estimation. This paper provides an overview of Bandit and describes the control and estimation approach. Results using our 6DOF flight simulator are provided, demonstrating that these methods show promise for flight use.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Apostoaei, A.I.; Burns, R.E.; Hoffman, F.O.
1999-07-01
In the early 1990s, concern about the Oak Ridge Reservation's past releases of contaminants to the environment prompted Tennessee's public health officials to pursue an in-depth study of potential off-site health effects at Oak Ridge. This study, the Oak Ridge dose reconstruction, was supported by an agreement between the U.S. Department of Energy (DOE) and the State of Tennessee, and was overseen by a 12-member panel appointed by Tennessee's Commissioner of Health. One of the major contaminants studied in the dose reconstruction was radioactive iodine, which was released to the air by X-10 (now called Oak Ridge National Laboratory) as it processed spent nuclear reactor fuel from 1944 through 1956. The process recovered radioactive lanthanum for use in weapons development. Iodine concentrates in the thyroid gland, so health concerns include various diseases of the thyroid, such as thyroid cancer. The large report, ''Iodine-131 Releases from Radioactive Lanthanum Processing at the X-10 Site in Oak Ridge, Tennessee (1944-1956) - An Assessment of Quantities Released, Off-site Radiation Doses, and Potential Excess Risks of Thyroid Cancer,'' is in two volumes. Volume 1 is the main body of the report, and Volume 1A, which has the same title, consists of 22 supporting appendices. Together, these reports serve the following purposes: (1) describe the methodologies used to estimate the amount of iodine-131 (I-131) released; (2) evaluate I-131's pathway from air to vegetation to food to humans; (3) estimate doses received by human thyroids; (4) estimate the excess risk of acquiring thyroid cancer during one's lifetime; and (5) provide equations, examples of historical documents used, and tables of calculated values. Results indicate that females born in 1952 who consumed milk from a goat pastured a few miles east of X-10 received the highest doses from I-131 and would have had the highest risks of contracting thyroid cancer. Doses from cow's milk are considerably less. Detailed dose and risk estimates, and associated uncertainties, for other contaminants studied for the Oak Ridge dose reconstruction are presented in several other technical reports. One way to easily locate them in OSTI's Information Bridge is by searching the ''report number field'' for the number DOE/OR/21981*. Be sure to place the asterisk after the base number so your search can list the complete series of reports related to the Oak Ridge dose reconstruction.
Estimating tree bole volume using artificial neural network models for four species in Turkey.
Ozçelik, Ramazan; Diamantopoulou, Maria J; Brooks, John R; Wiant, Harry V
2010-01-01
Tree bole volumes of 89 Scots pine (Pinus sylvestris L.), 96 Brutian pine (Pinus brutia Ten.), 107 Cilicica fir (Abies cilicica Carr.), and 67 Cedar of Lebanon (Cedrus libani A. Rich.) trees were estimated using Artificial Neural Network (ANN) models. Neural networks offer a number of advantages, including the ability to implicitly detect complex nonlinear relationships between input and output variables, which is very helpful in tree volume modeling. Two different neural network architectures were used, producing the Back-Propagation (BPANN) and Cascade Correlation (CCANN) Artificial Neural Network models. In addition, tree bole volume estimates were compared to other established tree bole volume estimation techniques, including the centroid method, taper equations, and existing standard volume tables. An overview of the features of ANNs and traditional methods is presented, and the advantages and limitations of each are discussed. For validation purposes, actual volumes were determined by aggregating the volumes of measured short sections (average 1 meter) of the tree bole using Smalian's formula. The results reported in this research suggest that the selected Cascade Correlation Artificial Neural Network (CCANN) models are reliable for estimating the tree bole volume of the four examined tree species, since they gave unbiased results and were superior to almost all other methods in terms of error (%), expressed as the mean of the percentage errors. 2009 Elsevier Ltd. All rights reserved.
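Smalian's formula, used above to build the validation volumes, averages the cross-sectional areas at the two ends of each section. A self-contained sketch with example diameters:

```python
import math

def smalian_section_m3(d1_cm, d2_cm, length_m):
    """V = (A1 + A2) / 2 * L; d/200 converts a diameter in cm to a radius in m."""
    a1 = math.pi * (d1_cm / 200.0) ** 2
    a2 = math.pi * (d2_cm / 200.0) ** 2
    return (a1 + a2) / 2.0 * length_m

def bole_volume_m3(diameters_cm, section_length_m=1.0):
    """Aggregate successive ~1 m sections, as in the paper's validation data."""
    return sum(smalian_section_m3(d1, d2, section_length_m)
               for d1, d2 in zip(diameters_cm, diameters_cm[1:]))

print(f"{bole_volume_m3([32.0, 30.5, 28.8, 26.0, 22.1, 16.4]):.3f} m^3")
```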
Assessing Conifer Ray Parenchyma for Ecological Studies: Pitfalls and Guidelines.
von Arx, Georg; Arzac, Alberto; Olano, José M; Fonti, Patrick
2015-01-01
Ray parenchyma is an essential tissue for tree functioning and survival. This living tissue plays a major role in the storage and transport of water, nutrients, and non-structural carbohydrates (NSC), thus regulating xylem hydraulics and growth. However, despite the importance of rays for tree carbon and water relations, methodological challenges hamper knowledge about ray intra- and inter-tree variability and its ecological meaning. In this study we provide a methodological toolbox for soundly quantifying the spatial and temporal variability of different ray features. Anatomical ray features were surveyed in different cutting planes (cross-sectional, tangential, and radial) using quantitative image analysis on stem-wood micro-sections sampled from 41 mature Scots pines (Pinus sylvestris). The percentage of ray surface (PERPAR), a proxy for ray volume, was compared among cutting planes and between early- and latewood to assess measurement-induced variability. Different tangential ray metrics were correlated to assess their similarities. The accuracy of cross-sectional and tangential measurements for PERPAR estimates, as a function of the number of samples and the measured wood surface, was assessed using a bootstrap statistical technique. Tangential sections offered the best 3D insight into ray integration in the xylem and provided the most accurate estimates of PERPAR, with 10 samples of 4 mm(2) giving an estimate within ±6.0% of the true mean PERPAR (relative 95% confidence interval, CI95) and 20 samples of 4 mm(2) giving a CI95 of ±4.3%. Cross-sections were most efficient for the establishment of time series and facilitated comparisons with other widely used xylem anatomical features. Earlywood had significantly lower PERPAR (5.77 vs. 6.18%) and marginally fewer initiating rays than latewood. In cross-sections, PERPAR was systematically overestimated relative to tangential sections (6.50 vs. 4.92%), and approximately twice the sample area was required for similar accuracy. Radial cuttings provided the least accurate PERPAR estimates. This evaluation of ray parenchyma in conifers and the presented guidelines regarding data accuracy as a function of measured wood surface and number of samples represent an important methodological reference for ray quantification, which will ultimately improve the understanding of the fundamental role of ray parenchyma tissue for the performance and survival of trees growing in stressed environments.
Inference for lidar-assisted estimation of forest growing stock volume
Ronald E. McRoberts; Erik Næsset; Terje Gobakken
2013-01-01
Estimates of growing stock volume are reported by the national forest inventories (NFI) of most countries and may serve as the basis for aboveground biomass and carbon estimates as required by an increasing number of international agreements. The probability-based (design-based) statistical estimators traditionally used by NFIs to calculate estimates are generally...
NASA Astrophysics Data System (ADS)
Nasri, S.; Cudennec, C.; Albergel, J.; Berndtsson, R.
2004-02-01
In the beginning of the 1990s, the Tunisian Ministry of Agriculture launched an ambitious program for constructing small hillside reservoirs in the northern and central regions of the country. At present, more than 720 reservoirs have been created. They consist of small compacted earth dams supplied with a horizontal overflow weir. Due to the lack of hydrological data and the area's extreme floods, however, it is very difficult to design the overflow weirs. Also, the catchments are very sensitive to erosion and the reservoirs are rapidly silted up. Consequently, prediction of flood volumes for important rainfall events becomes crucial. Few hydrological observations, however, exist for the catchment areas. For this purpose a geomorphological model methodology is presented to predict the shape and volume of hydrographs for important floods. This model is built around a production function that defines the net storm rainfall (the portion of rainfall during a storm which reaches a stream channel as direct runoff) from the total rainfall (observed rainfall in the catchment) and a transfer function based on the most complete possible definition of the surface drainage system. Observed rainfall during 5-min time steps was used in the model. The model runoff generation is based on surface drainage characteristics which can be easily extracted from maps. The model was applied to two representative experimental catchments in central Tunisia. The conceptual rainfall-runoff model based on surface topography and drainage network was seen to reproduce observed runoff satisfactorily. The calibrated model was used to estimate runoff for 5-, 10-, 20-, and 50-year rainfall return periods in terms of runoff volume, maximum runoff, and the general shape of the runoff hydrograph. Practical conclusions for designing hillside reservoirs and for extrapolating results using this model methodology to ungauged small catchments in semiarid Tunisia are made.
Understanding and managing disaster evacuation on a transportation network.
Lambert, James H; Parlak, Ayse I; Zhou, Qian; Miller, John S; Fontaine, Michael D; Guterbock, Thomas M; Clements, Janet L; Thekdi, Shital A
2013-01-01
Uncertain population behaviors in a regional emergency could potentially harm the performance of the region's transportation system and the subsequent evacuation effort. The integration of behavioral survey data with travel demand modeling enables an assessment of transportation system performance and the identification of operational and public health countermeasures. This paper analyzes transportation system demand and system performance for emergency management in three disaster scenarios. A two-step methodology first estimates the number of trips evacuating the region, thereby capturing behavioral aspects in a scientifically defensible manner based on survey results, and second, assigns these trips to a regional highway network using geographic information systems software, thereby making the methodology transferable to other locations. Performance measures are generated for each scenario, including maps of volume-to-capacity ratios, geographic contours of evacuation time from the center of the region, and link-specific metrics such as weighted average speed and traffic volume. The methods are demonstrated on a 600-segment transportation network in Washington, DC (USA) and are applied to three scenarios involving attacks from radiological dispersion devices (e.g., dirty bombs). The results suggest that: (1) a single detonation would degrade transportation system performance two to three times more than that which occurs during a typical weekday afternoon peak hour, (2) volume on several critical arterials within the network would exceed capacity in the represented scenarios, and (3) resulting travel times to reach intended destinations imply that un-aided evacuation is impractical. These results assist decisions made by two categories of emergency responders: (1) transportation managers who provide traveler information and who make operational adjustments to improve the network (e.g., signal retiming) and (2) public health officials who maintain shelters, food and water stations, or first aid centers along evacuation routes. This approach may also interest decisionmakers who are in a position to influence the allocation of emergency resources, including healthcare providers, infrastructure owners, transit providers, and regional or local planning staff. Copyright © 2012 Elsevier Ltd. All rights reserved.
Determination of fractional flow reserve (FFR) based on scaling laws: a simulation study
NASA Astrophysics Data System (ADS)
Wong, Jerry T.; Molloi, Sabee
2008-07-01
Fractional flow reserve (FFR) provides an objective physiological evaluation of stenosis severity. A technique that can measure FFR using only angiographic images would be a valuable tool in the cardiac catheterization laboratory. To perform this, the diseased blood flow can be measured with a first pass distribution analysis and the theoretical normal blood flow can be estimated from the total coronary arterial volume based on scaling laws. A computer simulation of the coronary arterial network was used to gain a better understanding of how hemodynamic conditions and coronary artery disease can affect blood flow, arterial volume and FFR estimation. Changes in coronary arterial flow and volume due to coronary stenosis, aortic pressure and venous pressure were examined to evaluate the potential use of flow and volume for FFR determination. This study showed that FFR can be estimated using arterial volume and a scaling coefficient corrected for aortic pressure. However, variations in venous pressure were found to introduce some error in FFR estimation. A relative form of FFR was introduced and was found to cancel out the influence of pressure on coronary flow, arterial volume and FFR estimation. The use of coronary flow and arterial volume for FFR determination appears promising.
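A compact way to see the idea: measured (diseased) flow is divided by a theoretical normal flow predicted from total arterial volume through a scaling law. The coefficient and the 3/4 exponent below are illustrative assumptions; the study additionally corrects the scaling coefficient for aortic pressure:

```python
def ffr_estimate(q_measured, arterial_volume, k=1.0, exponent=0.75):
    """FFR ~ Q_diseased / Q_normal, with Q_normal = k * V**exponent
    (a volume-flow scaling law; k and exponent are illustrative)."""
    q_normal = k * arterial_volume ** exponent
    return q_measured / q_normal

print(ffr_estimate(q_measured=5.2, arterial_volume=12.0))  # arbitrary units
```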
NASA Astrophysics Data System (ADS)
Cael, B. B.
How much water do lakes on Earth hold? Global lake volume estimates are scarce, highly variable, and poorly documented. We develop a mechanistic null model for estimating global lake mean depth and volume based on a statistical topographic approach to Earth's surface. The volume-area scaling prediction is accurate and consistent within and across lake datasets spanning diverse regions. We applied these relationships to a global lake area census to estimate global lake volume and depth. The volume of Earth's lakes is 199,000 km3 (95% confidence interval 196,000-202,000 km3). This volume is in the range of historical estimates (166,000-280,000 km3), but the overall mean depth of 41.8 m (95% CI 41.2-42.4 m) is significantly lower than previous estimates (62-151 m). These results highlight and constrain the relative scarcity of lake waters in the hydrosphere and have implications for the role of lakes in global biogeochemical cycles. We also evaluate the size (area) distribution of lakes on Earth compared to expectations from percolation theory. This material is based upon work supported by the National Science Foundation Graduate Research Fellowship Program under Grant No. 2388357.
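The mechanics of such an estimate amount to applying a volume-area scaling law lake-by-lake to an area census, then dividing total volume by total area for mean depth. A sketch with placeholder coefficients (the fitted values are not given in the abstract):

```python
def total_volume_km3(areas_km2, c=0.2, b=1.2):
    """Power-law scaling V = c * A**b summed over a lake-area census;
    c and b are placeholders, not the study's fitted parameters."""
    return sum(c * a ** b for a in areas_km2)

areas = [0.01, 0.5, 12.0, 350.0]                 # example census (km^2)
v = total_volume_km3(areas)
print(f"V ~ {v:.1f} km^3, mean depth ~ {v / sum(areas) * 1000:.0f} m")
```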
NASA Technical Reports Server (NTRS)
Coppolino, Robert N.
2018-01-01
Verification and validation (V&V) is a highly challenging undertaking for SLS structural dynamics models due to the magnitude and complexity of SLS assemblies and subassemblies. Responses to challenges associated with V&V of Space Launch System (SLS) structural dynamics models are presented in Volume I of this paper. Four methodologies addressing specific requirements for V&V are discussed: (1) Residual Mode Augmentation (RMA); (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976); (3) Mode Consolidation (MC); and (4) Experimental Mode Verification (EMV). This document contains the appendices to Volume I.
Smith, S. Jerrod
2013-01-01
From the 1890s through the 1970s the Picher mining district in northeastern Ottawa County, Oklahoma, was the site of mining and processing of lead and zinc ore. When mining ceased in about 1979, as much as 165–300 million tons of mine tailings, locally referred to as “chat,” remained in the Picher mining district. Since 1979, some chat piles have been mined for aggregate materials and have decreased in volume and mass. Currently (2013), the land surface in the Picher mining district is covered by thousands of acres of chat, much of which remains on Indian trust land owned by allottees. The Bureau of Indian Affairs manages these allotted lands and oversees the sale and removal of chat from these properties. To help the Bureau of Indian Affairs better manage the sale and removal of chat, the U.S. Geological Survey, in cooperation with the Bureau of Indian Affairs, estimated the 2005 and 2010 volumes and masses of selected chat piles remaining on allotted lands in the Picher mining district. The U.S. Geological Survey also estimated the changes in volume and mass of these chat piles for the period 2005 through 2010. The 2005 and 2010 chat-pile volume and mass estimates were computed for 34 selected chat piles on 16 properties in the study area. All computations of volume and mass were performed on individual chat piles and on groups of chat piles in the same property. The Sooner property had the greatest estimated volume (4.644 million cubic yards) and mass (5.253 ± 0.473 million tons) of chat in 2010. Five of the selected properties (Sooner, Western, Lawyers, Skelton, and St. Joe) contained estimated chat volumes exceeding 1 million cubic yards and estimated chat masses exceeding 1 million tons in 2010. Four of the selected properties (Lucky Bill Humbah, Ta Mee Heh, Bird Dog, and St. Louis No. 6) contained estimated chat volumes of less than 0.1 million cubic yards and estimated chat masses of less than 0.1 million tons in 2010. The total volume of all selected chat piles was estimated to be 18.073 million cubic yards in 2005 and 16.171 million cubic yards in 2010. The total mass of all selected chat piles was estimated to be 20.445 ± 1.840 million tons in 2005 and 18.294 ± 1.646 million tons in 2010. All of the selected chat piles decreased in volume and mass for the period 2005 through 2010. Chat piles CP022 (Ottawa property) and CP013 (Sooner property) had some within-property chat-pile redistribution, with both chat piles having net decreases in volume and mass for the period 2005 through 2010. The Sooner property and the St. Joe property had the greatest volume (and mass) changes, with 1.266 million cubic yards and 0.217 million cubic yards (1.432 ± 0.129 million tons and 0.246 ± 0.022 million tons) of chat being removed, respectively. The chat removed from the Sooner and St. Joe properties accounts for about 78 percent of the chat removed from all selected chat piles and properties. The total volume and mass removed from all selected chat piles for the period 2005 through 2010 were estimated to be 1.902 million cubic yards and 2.151 ± 0.194 million tons, respectively.
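The abstract does not describe the survey computation itself, but volume estimates of this kind typically reduce to differencing a pile surface against a base surface on a grid and summing cell thicknesses. A generic sketch with invented elevations:

```python
import numpy as np

def pile_volume_yd3(surface_ft, base_ft, cell_area_ft2):
    """Sum positive (surface - base) thicknesses over the grid; 27 ft^3 = 1 yd^3."""
    thickness_ft = np.clip(np.asarray(surface_ft) - np.asarray(base_ft), 0, None)
    return float(thickness_ft.sum()) * cell_area_ft2 / 27.0

surface = np.array([[812.0, 815.5], [810.2, 808.9]])   # example elevations (ft)
base = np.array([[800.0, 801.0], [800.5, 800.8]])
print(f"{pile_volume_yd3(surface, base, cell_area_ft2=100.0):.0f} yd^3")
```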
NASA Astrophysics Data System (ADS)
Krokhin, G.; Pestunov, A.
2017-11-01
Power stations are increasingly operated in variable modes, and the related changes in their technical state have made it urgent to build decision-making and state-recognition models based on diagnostics that use fuzzy logic to identify equipment state and to manage recovery processes. There is no unified methodological approach for obtaining the relevant information when the raw information about the equipment state is fuzzy and inhomogeneous. Existing methods for extracting knowledge are usually unable to ensure correspondence between the parameters of the aggregate models and the actual object state. The switchover of power engineering from preventive repair to repair scheduled according to the actual technical state has increased the responsibility of those who estimate the volume and duration of the work. If the corresponding methodological preparations do not take fuzziness into account, the diagnostics and decision-making models may be inadequate, because the state information is inherently of this kind. In this paper, we introduce a new model that formalizes the equipment state using not only exact information but fuzzy information as well. This model reflects the actual state more adequately than traditional analogs and may be used to increase the efficiency and the service period of power installations.
Using Mobile Device Samples to Estimate Traffic Volumes
DOT National Transportation Integrated Search
2017-12-01
In this project, TTI worked with StreetLight Data to evaluate a beta version of its traffic volume estimates derived from global positioning system (GPS)-based mobile devices. TTI evaluated the accuracy of average annual daily traffic (AADT) volume...
NASA Astrophysics Data System (ADS)
Mubako, S. T.; Fullerton, T. M.; Walke, A.; Collins, T.; Mubako, G.; Walker, W. S.
2014-12-01
Water productivity is an area of growing interest in assessing the impact of human economic activities on water resources, especially in arid regions. Indicators of water productivity can assist water users in evaluating sectoral water use efficiency, identifying sources of pressure on water resources, and in supporting water allocation rationale under scarcity conditions. This case study for the water-scarce Middle Rio Grande River Basin aims to develop an environmental-economic accounting approach for water use in arid river basins through a methodological framework that relates water use to human economic activities impacting regional water resources. Water uses are coupled to economic transactions, and the complex but mutual relations between various water using sectors estimated. A comparison is made between the calculated water productivity indicators and representative cost/price per unit volume of water for the main water use sectors. Preliminary results confirm that Irrigation, although it contributes very little to regional economic output, is among the sectors with the largest direct water use intensities. High economic value and low water use intensity economic sectors in the study region include Manufacturing, Mining, and Steam Electric Power. Water accounting challenges revealed by the study include differences in water management regimes between jurisdictions, and little understanding of the impact of major economic activities on the interaction between surface and groundwater systems in this region. A more comprehensive assessment would require the incorporation of environmental and social sustainability indicators to the calculated water productivity indicators.
Space transfer vehicle concepts and requirements study. Volume 3, book 1: Program cost estimates
NASA Technical Reports Server (NTRS)
Peffley, Al F.
1991-01-01
The Space Transfer Vehicle (STV) Concepts and Requirements Study cost estimate and program planning analysis is presented. The cost estimating technique used to support STV system, subsystem, and component cost analysis is a mixture of parametric cost estimating and selective cost analogy approaches. The parametric cost analysis is aimed at developing cost-effective aerobrake, crew module, tank module, and lander designs using parametric cost estimating data. This is accomplished using cost as a design parameter in an iterative process with conceptual design input information. The parametric estimating approach segregates costs by major program life cycle phase (development, production, integration, and launch support). These phases are further broken out into major hardware subsystems, software functions, and tasks according to the STV preliminary program work breakdown structure (WBS). The WBS is defined to a low enough level of detail by the study team to highlight STV system cost drivers. This level of cost visibility provided the basis for cost sensitivity analysis against various design approaches aimed at achieving a cost-effective design. The cost approach, methodology, and rationale are described. A chronological record of the interim review material relating to cost analysis is included along with a brief summary of the study contract tasks accomplished during that period of review and the key conclusions or observations identified that relate to STV program cost estimates. The STV life cycle costs are estimated on the proprietary parametric cost model (PCM) with inputs organized by a project WBS. Preliminary life cycle schedules are also included.
An economic analysis of robotically assisted hysterectomy.
Wright, Jason D; Ananth, Cande V; Tergas, Ana I; Herzog, Thomas J; Burke, William M; Lewin, Sharyn N; Lu, Yu-Shiang; Neugut, Alfred I; Hershman, Dawn L
2014-05-01
To perform an econometric analysis to examine the influence of procedure volume, variation in hospital accounting methodology, and use of various analytic methodologies on cost of robotically assisted hysterectomy for benign gynecologic disease and endometrial cancer. A national sample was used to identify women who underwent laparoscopic or robotically assisted hysterectomy for benign indications or endometrial cancer from 2006 to 2012. Surgeon and hospital volume were classified as the number of procedures performed before the index surgery. Total costs as well as fixed and variable costs were modeled using multivariable quantile regression methodology. A total of 180,230 women, including 169,324 women who underwent minimally invasive hysterectomy for benign indications and 10,906 patients whose hysterectomy was performed for endometrial cancer, were identified. The unadjusted median cost of robotically assisted hysterectomy for benign indications was $8,152 (interquartile range [IQR] $6,011-10,932) compared with $6,535 (IQR $5,127-8,357) for laparoscopic hysterectomy (P<.001). The cost differential decreased with increasing surgeon and hospital volume. The unadjusted median cost of robotically assisted hysterectomy for endometrial cancer was $9,691 (IQR $7,591-12,428) compared with $8,237 (IQR $6,400-10,807) for laparoscopic hysterectomy (P<.001). The cost differential decreased with increasing hospital volume from $2,471 for the first 5 to 15 cases to $924 for more than 50 cases. Based on surgeon volume, robotically assisted hysterectomy for endometrial cancer was $1,761 more expensive than laparoscopy for those who had performed fewer than five cases; the differential declined to $688 for more than 50 procedures compared with laparoscopic hysterectomy. The cost of robotic gynecologic surgery decreases with increased procedure volume. However, in all of the scenarios modeled, robotically assisted hysterectomy remained substantially more costly than laparoscopic hysterectomy.
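The study's central tool, quantile regression of cost on procedure volume, can be sketched with statsmodels; the data frame and variable names below are invented toy data, not the study's national sample:

```python
import pandas as pd
import statsmodels.formula.api as smf

# toy data: total cost (USD), robotic indicator, surgeon procedure volume
df = pd.DataFrame({
    "cost":    [8150, 6540, 9700, 8240, 7900, 7100, 9100, 6900, 8600, 6300],
    "robotic": [1, 0, 1, 0, 1, 0, 1, 0, 1, 0],
    "volume":  [3, 40, 8, 25, 55, 12, 20, 60, 35, 50],
})
median_fit = smf.quantreg("cost ~ robotic * volume", df).fit(q=0.5)
print(median_fit.params)  # robotic cost premium and its change with volume
```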
Trommer, J.T.; Loper, J.E.; Hammett, K.M.; Bowman, Georgia
1996-01-01
Hydrologists use several traditional techniques for estimating peak discharges and runoff volumes from ungaged watersheds. However, applying these techniques to watersheds in west-central Florida requires that empirical relationships be extrapolated beyond tested ranges. As a result there is some uncertainty as to their accuracy. Sixty-six storms in 15 west-central Florida watersheds were modeled using (1) the rational method, (2) the U.S. Geological Survey regional regression equations, (3) the Natural Resources Conservation Service (formerly the Soil Conservation Service) TR-20 model, (4) the Army Corps of Engineers HEC-1 model, and (5) the Environmental Protection Agency SWMM model. The watersheds ranged between fully developed urban and undeveloped natural watersheds. Peak discharges and runoff volumes were estimated using standard or recommended methods for determining input parameters. All model runs were uncalibrated and the selection of input parameters was not influenced by observed data. The rational method, only used to calculate peak discharges, overestimated 45 storms, underestimated 20 storms, and estimated the same discharge for 1 storm. The mean estimation error for all storms indicates the method overestimates the peak discharges. Estimation errors were generally smaller in the urban watersheds and larger in the natural watersheds. The U.S. Geological Survey regression equations provide peak discharges for storms of specific recurrence intervals. Therefore, direct comparison with observed data was limited to sixteen observed storms that had precipitation equivalent to specific recurrence intervals. The mean estimation error for all storms indicates the method overestimates both peak discharges and runoff volumes. Estimation errors were smallest for the larger natural watersheds in Sarasota County, and largest for the small watersheds located in the eastern part of the study area. The Natural Resources Conservation Service TR-20 model overestimated peak discharges for 45 storms and underestimated 21 storms, and overestimated runoff volumes for 44 storms and underestimated 22 storms. The mean estimation error for all storms modeled indicates that the model overestimates peak discharges and runoff volumes. The smaller estimation errors in both peak discharges and runoff volumes were for storms occurring in the urban watersheds, and the larger errors were for storms occurring in the natural watersheds. The HEC-1 model overestimated peak discharge rates for 55 storms and underestimated 11 storms. Runoff volumes were overestimated for 44 storms and underestimated for 22 storms using the Army Corps of Engineers HEC-1 model. The mean estimation error for all the storms modeled indicates that the model overestimates peak discharge rates and runoff volumes. Generally, the smaller estimation errors in peak discharges were for storms occurring in the urban watersheds, and the larger errors were for storms occurring in the natural watersheds. Estimation errors in runoff volumes, however, were smallest for the 3 natural watersheds located in the southernmost part of Sarasota County. The Environmental Protection Agency Storm Water Management model produced similar peak discharges and runoff volumes when using both the Green-Ampt and Horton infiltration methods. Estimated peak discharge and runoff volume data calculated with the Horton method were only slightly higher than those calculated with the Green-Ampt method.
The mean estimation error for all the storms modeled indicates the model using the Green-Ampt infiltration method overestimates peak discharges and slightly underestimates runoff volumes. Using the Horton infiltration method, the model overestimates both peak discharges and runoff volumes. The smaller estimation errors in both peak discharges and runoff volumes were for storms occurring in the five natural watersheds in Sarasota County with the least amount of impervious cover and the lowest slopes. The largest er
DOT National Transportation Integrated Search
2009-08-01
The Federal Railroad Administration tasked the Volpe Center with developing a methodology for determining the avoidable and fully allocated costs of Amtrak routes. Avoidable costs are costs that would not be incurred if an Amtrak route were discontin...
ERIC Educational Resources Information Center
LeMahieu, Paul G.; Nordstrum, Lee E.; Greco, Patricia
2017-01-01
Purpose: This paper is one of seven in this volume that aims to elaborate different approaches to quality improvement in education. It delineates a methodology called Lean for Education. Design/methodology/approach: The paper presents the origins, theoretical foundations, core concepts and a case study demonstrating an application in US education,…
Moore, Bria M.; Brady, Samuel L.; Mirro, Amy E.; Kaufman, Robert A.
2014-01-01
Purpose: To investigate the correlation of size-specific dose estimate (SSDE) with absorbed organ dose, and to develop a simple methodology for estimating patient organ dose in a pediatric population (5–55 kg). Methods: Four physical anthropomorphic phantoms representing a range of pediatric body habitus were scanned with metal oxide semiconductor field effect transistor (MOSFET) dosimeters placed at 23 organ locations to determine absolute organ dose. Phantom absolute organ dose was divided by phantom SSDE to determine the correlation between organ dose and SSDE. Organ dose correlation factors (CF_SSDE^organ) were then multiplied by patient-specific SSDE to estimate patient organ dose. The CF_SSDE^organ were used to retrospectively estimate individual organ doses from 352 chest and 241 abdominopelvic pediatric CT examinations, where mean patient weight was 22 kg ± 15 (range 5–55 kg) and mean patient age was 6 yrs ± 5 (range 4 months to 23 yrs). Patient organ dose estimates were compared to published pediatric Monte Carlo study results. Results: Phantom effective diameters were matched with patient population effective diameters to within 4 cm, showing appropriate scalability of the phantoms across the entire pediatric population in this study. Individual CF_SSDE^organ were determined for a total of 23 organs in the chest and abdominopelvic region across nine weight subcategories. For organs fully covered by the scan volume, correlation in the chest (average 1.1; range 0.7–1.4) and abdominopelvic region (average 0.9; range 0.7–1.3) was near unity. For organs/tissues that extended beyond the scan volume (i.e., skin, bone marrow, and bone surface), correlation was poor (average 0.3; range 0.1–0.4) for both the chest and abdominopelvic regions. A means to estimate patient organ dose was demonstrated. Calculated patient organ dose, using patient SSDE and CF_SSDE^organ, was compared to previously published pediatric patient doses that accounted for patient size in their dose calculation, and was found to agree in the chest to better than an average of 5% (27.6/26.2) and in the abdominopelvic region to better than 2% (73.4/75.0). Conclusions: For organs fully covered within the scan volume, the average correlation of SSDE and organ absolute dose was found to be better than ±10%.
In addition, this study provides a complete list of organ dose correlation factors (CF_SSDE^organ) for the chest and abdominopelvic regions, and describes a simple methodology to estimate individual pediatric patient organ dose based on patient SSDE. PMID:24989395
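Once the correlation factors are tabulated, the resulting dose methodology is a one-line multiplication. A sketch with invented factors (the abstract only reports that chest factors average ~1.1 and abdominopelvic factors ~0.9 for fully covered organs):

```python
# illustrative correlation factors, not the published table
CF_SSDE_ORGAN = {"liver": 0.9, "lungs": 1.1, "kidneys": 0.95}

def organ_dose_mgy(ssde_mgy, organ):
    """Patient organ dose ~ CF_SSDE^organ * patient-specific SSDE."""
    return CF_SSDE_ORGAN[organ] * ssde_mgy

print(f"{organ_dose_mgy(ssde_mgy=7.2, organ='liver'):.1f} mGy")
```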
Tunnel and Station Cost Methodology : Mined Tunnels
DOT National Transportation Integrated Search
1983-01-01
The main objective of this study was to develop a model for estimating the cost of subway station and tunnel construction. This report describes a cost estimating methodology for subway tunnels that can be used by planners, designers, owners, and gov...
Lunar Architecture Team - Phase 2 Habitat Volume Estimation: "Caution When Using Analogs"
NASA Technical Reports Server (NTRS)
Rudisill, Marianne; Howard, Robert; Griffin, Brand; Green, Jennifer; Toups, Larry; Kennedy, Kriss
2008-01-01
The lunar surface habitat will serve as the astronauts' home on the moon, providing a pressurized facility for all crew living functions and serving as the primary location for a number of crew work functions. Adequate volume is required for each of these functions in addition to that devoted to housing the habitat systems and crew consumables. The time constraints of the LAT-2 schedule precluded the Habitation Team from conducting a complete "bottoms-up" design of a lunar surface habitation system from which to derive true volumetric requirements. The objective of this analysis was to quickly derive an estimated total pressurized volume and pressurized net habitable volume per crewmember for a lunar surface habitat, using a principled, methodical approach in the absence of a detailed design. Five "heuristic methods" were used: historical spacecraft volumes, human/spacecraft integration standards and design guidance, Earth-based analogs, parametric "sizing" tools, and conceptual point designs. Estimates for total pressurized volume, total habitable volume, and volume per crewmember were derived using these methods. All methods were found to provide some basis for volume estimates, but values were highly variable across a wide range, with no obvious convergence. Best current assumptions for required crew volume were provided as a range. Results of these analyses and future work are discussed.
Kanter, Michael H; Huang, Yii-Chieh; Kally, Zina; Gordon, Margo A; Meltzer, Charles
2018-06-01
A well-documented association exists between higher surgeon volumes and better outcomes for many procedures, but surgeons may be reluctant to change practice patterns without objective, credible, and near real-time data on their performance. In addition, published thresholds for procedure volumes may be biased or perceived as arbitrary; typical reports compare surgeons grouped into discrete procedure volume categories, even though the volume-outcomes relationship is likely continuous. The concentration curves methodology, which has been used to analyze whether health outcomes vary with socioeconomic status, was adapted to explore the association between procedure volume and outcomes as a continuous relationship so that data for all surgeons within a health care organization could be included. Using widely available software and requiring minimal analytic expertise, this approach plots cumulative percentages of two variables of interest against each other and assesses the characteristics of the resulting curve. Organization-specific relationships between surgeon volumes and outcomes were examined for three example types of procedures: uncomplicated hysterectomies, infant circumcisions, and total thyroidectomies. The concentration index was used to assess whether outcomes were equally distributed unrelated to volumes. For all three procedures, the concentration curve methodology identified associations between surgeon procedure volumes and selected outcomes that were specific to the organization. The concentration indices confirmed the higher prevalence of examined outcomes among low-volume surgeons. The curves supported organizational discussions about surgical quality. Concentration curves require minimal resources to identify organization- and procedure-specific relationships between surgeon procedure volumes and outcomes and can support quality improvement. Copyright © 2018 The Joint Commission. Published by Elsevier Inc. All rights reserved.
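Because the method needs only sorting and cumulative sums, it is easy to reproduce with widely available software, as the authors note. A sketch with invented surgeon data, using the usual health-economics sign convention (a negative index means adverse outcomes concentrate among low-volume surgeons):

```python
import numpy as np

def concentration_index(volumes, events):
    """Concentration curve: surgeons ranked by procedure volume (low to high);
    x = cumulative share of procedures, y = cumulative share of adverse events.
    Index C = 1 - 2 * (area under the curve)."""
    order = np.argsort(volumes)
    v = np.asarray(volumes, dtype=float)[order]
    e = np.asarray(events, dtype=float)[order]
    x = np.concatenate(([0.0], np.cumsum(v) / v.sum()))
    y = np.concatenate(([0.0], np.cumsum(e) / e.sum()))
    area = float(((y[1:] + y[:-1]) / 2.0 * np.diff(x)).sum())  # trapezoid rule
    return 1.0 - 2.0 * area

volumes = [5, 12, 20, 44, 90]   # annual procedures per surgeon (illustrative)
events  = [3, 5, 6, 7, 8]       # complication counts (illustrative)
print(round(concentration_index(volumes, events), 3))
```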
Economic Effects of Increased Control Zone Sizes in Conflict Resolution
NASA Technical Reports Server (NTRS)
Datta, Koushik
1998-01-01
A methodology for estimating the economic effects of different control zone sizes used in conflict resolutions between aircraft is presented in this paper. The methodology is based on estimating the difference in flight times of aircraft with and without the control zone, and converting the difference into a direct operating cost. Using this methodology the effects of increased lateral and vertical control zone sizes are evaluated.
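The costing step described here reduces to pricing the flight-time difference at a direct-operating-cost rate. A minimal sketch; the hourly DOC rate is illustrative, not taken from the paper:

```python
def conflict_resolution_cost_usd(time_with_zone_min, time_without_zone_min,
                                 doc_usd_per_hr=2500.0):
    """Economic effect = (extra flight time caused by the control zone) * DOC."""
    return (time_with_zone_min - time_without_zone_min) / 60.0 * doc_usd_per_hr

print(conflict_resolution_cost_usd(47.0, 45.2))  # ~75 USD for 1.8 extra minutes
```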
Predicting Vessel Trajectories from Ais Data Using R
2017-06-01
future position at the expectation level set by the user, therefore producing a valid methodology for both estimating the future vessel location and... methodology for both estimating the future vessel location and for assessing anomalous vessel behavior... methodology, that brings them one step closer to attaining these goals. A key idea in the current literature is that the series of vessel locations
Performance-based methodology for assessing seismic vulnerability and capacity of buildings
NASA Astrophysics Data System (ADS)
Shibin, Lin; Lili, Xie; Maosheng, Gong; Ming, Li
2010-06-01
This paper presents a performance-based methodology for the assessment of the seismic vulnerability and capacity of buildings. The vulnerability assessment methodology is based on the HAZUS methodology and the improved capacity-demand-diagram method. The spectral displacement (Sd) of performance points on a capacity curve is used to estimate the damage level of a building. The relationship between Sd and peak ground acceleration (PGA) is established, and a new vulnerability function is then expressed in terms of PGA. Furthermore, the expected value of the seismic capacity index (SCev) is provided to estimate the seismic capacity of buildings, based on the probability distribution of damage levels and the corresponding seismic capacity index. The results indicate that the proposed vulnerability methodology is able to assess the seismic damage of a large building stock directly and quickly following an earthquake. The SCev provides an effective index to measure the seismic capacity of buildings and illustrates the relationship between the seismic capacity of buildings and seismic action. The estimated results are compared with damage surveys of the cities of Dujiangyan and Jiangyou in the M8.0 Wenchuan earthquake, revealing that the methodology is acceptable for seismic risk assessment and decision making. The primary reasons for discrepancies between the estimated results and the damage surveys are discussed.
Hope, William W; Goodwin, Joanne; Felton, Timothy W; Ellis, Michael; Stevens, David A
2012-10-01
There is increased interest in intermittent regimens of liposomal amphotericin B, which may facilitate use in ambulatory settings. Little is known, however, about the most appropriate dosage and schedule of administration. Plasma pharmacokinetic data were acquired from 30 patients receiving liposomal amphotericin B for empirical treatment of suspected invasive fungal infection. Two cohorts were studied. The first cohort received 3 mg of liposomal amphotericin B/kg of body weight/day; the second cohort received 10 mg of liposomal amphotericin B/kg at time zero, followed by 5 mg/kg at 48 and 120 h. The levels of liposomal amphotericin B were measured by high-pressure liquid chromatography (HPLC). The pharmacokinetics were estimated by using a population methodology. Monte Carlo simulations were performed. D-optimal design was used to identify maximally informative sampling times for both conventional and intermittent regimens for future studies. A three-compartment pharmacokinetic model best described the data. The pharmacokinetics for both conventional and intermittent dosing were linear. The estimates for the mean (standard deviation) for clearance and the volume of the central compartment were 1.60 (0.85) liters/h and 20.61 (15.27) liters, respectively. Monte Carlo simulations demonstrated considerable variability in drug exposure. Bayesian estimates for clearance and volume increased in a linear manner with weight, but only the former was statistically significant (P = 0.039). D-optimal design provided maximally informative sampling times for future pharmacokinetic studies. The pharmacokinetics of a conventional and an intermittently administered high-dose regimen of liposomal amphotericin B are linear. Further pharmacokinetic-pharmacodynamic preclinical and clinical studies are required to identify safe and effective intermittent regimens.
CT volumetry of the skeletal tissues
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brindle, James M.; Alexandre Trindade, A.; Pichardo, Jose C.
2006-10-15
Computed tomography (CT) is an important and widely used modality in the diagnosis and treatment of various cancers. In the field of molecular radiotherapy, the use of spongiosa volume (combined tissues of the bone marrow and bone trabeculae) has been suggested as a means to improve the patient-specificity of bone marrow dose estimates. The noninvasive estimation of an organ volume comes with some degree of error or variation from the true organ volume. The present study explores the ability to obtain estimates of spongiosa volume or its surrogate via manual image segmentation. The variation among different segmentation raters was explored and found not to be statistically significant (p value >0.05). Accuracy was assessed by having several raters manually segment a polyvinyl chloride (PVC) pipe with known volumes. Segmentation of the outer region of the PVC pipe resulted in mean percent errors as great as 15%, while segmentation of the pipe's inner region resulted in mean percent errors within ~5%. Differences between volumes estimated with the high-resolution CT data set (typical of ex vivo skeletal scans) and the low-resolution CT data set (typical of in vivo skeletal scans) were also explored using both patient CT images and a PVC pipe phantom. While a statistically significant difference (p value <0.002) between the high-resolution and low-resolution data sets was observed with excised femoral heads obtained following total hip arthroplasty, the mean difference between high-resolution and low-resolution data sets was found to be only 1.24 and 2.18 cm3 for spongiosa and cortical bone, respectively. With respect to differences observed with the PVC pipe, the variation between the high-resolution and low-resolution mean percent errors was as high as ~20% for the outer region volume estimates and only as high as ~6% for the inner region volume estimates. The findings from this study suggest that manual segmentation is a reasonably accurate and reliable means for the in vivo estimation of spongiosa volume. This work also provides a foundation for future studies where spongiosa volumes are estimated by various raters in more comprehensive CT data sets.
Estimating aspen volume and weight for individual trees, diameter classes, or entire stands.
Bryce E. Schlaegel
1975-01-01
Presents allometric volume and weight equations for Minnesota quaking aspen. Volume, green weight, and dry weight estimates can be made for wood, bark, and limbs on the basis of individual trees, diameter classes, or entire stands.
Critical length sampling: a method to estimate the volume of downed coarse woody debris
Göran Ståhl; Jeffrey H. Gove; Michael S. Williams; Mark J. Ducey
2010-01-01
In this paper, critical length sampling for estimating the volume of downed coarse woody debris is presented. Using this method, the volume of downed wood in a stand can be estimated by summing the critical lengths of down logs included in a sample obtained using a relascope or wedge prism; typically, the instrument should be tilted 90° from its usual...
Ambros Berger; Thomas Gschwantner; Ronald E. McRoberts; Klemens Schadauer
2014-01-01
National forest inventories typically estimate individual tree volumes using models that rely on measurements of predictor variables such as tree height and diameter, both of which are subject to measurement error. The aim of this study was to quantify the impacts of these measurement errors on the uncertainty of the model-based tree stem volume estimates. The impacts...
Harry V., Jr. Wiant; Michael L. Spangler; John E. Baumgras
2002-01-01
Various taper systems and the centroid method were compared to unbiased volume estimates made by importance sampling for 720 hardwood trees selected throughout the state of West Virginia. Only the centroid method consistently gave volumes estimates that did not differ significantly from those made by importance sampling, although some taper equations did well for most...
Map of assessed tight-gas resources in the United States
Biewick, Laura R. H.
2014-01-01
This report presents a digital map of tight-gas resource assessments in the United States as part of the U.S. Geological Survey’s (USGS) National Assessment of Oil and Gas Project. Using a geology-based assessment methodology, the USGS quantitatively estimated potential volumes of undiscovered, technically recoverable natural gas resources within tight-gas assessment units (AUs). This is the second digital map product in a series of USGS unconventional oil and gas resource maps. The map plate included in this report can be printed in hard-copy form or downloaded in a Geographic Information System (GIS) data package, including an ArcGIS ArcMap document (.mxd), geodatabase (.gdb), and published map file (.pmf). In addition, the publication access table contains hyperlinks to current USGS tight-gas assessment publications and web pages.
Map of assessed coalbed-gas resources in the United States, 2014
Biewick, Laura R. H.
2014-01-01
This report presents a digital map of coalbed-gas resource assessments in the United States as part of the U.S. Geological Survey’s (USGS) National Assessment of Oil and Gas Project. Using a geology-based assessment methodology, the USGS quantitatively estimated potential volumes of undiscovered, technically recoverable natural gas resources within coalbed-gas assessment units (AUs). This is the third digital map product in a series of USGS unconventional oil and gas resource maps. The map plate included in this report can be printed in hardcopy form or downloaded in a Geographic Information System (GIS) data package, including an ArcGIS ArcMap document (.mxd), geodatabase (.gdb), and published map file (.pmf). In addition, the publication access table contains hyperlinks to current USGS coalbed-gas assessment publications and web pages.
NASA Technical Reports Server (NTRS)
Ferber, R. (Editor); Evans, D. (Editor)
1978-01-01
The background, objectives and methodology used for the Small Power Systems Solar Electric Workshop are described, and a summary of the results and conclusions developed at the workshop regarding small solar thermal electric power systems is presented.
DOT National Transportation Integrated Search
2012-05-05
As part of the Federal Highway Administration (FHWA) Traffic Analysis Toolbox (Volume XIII), this guide was designed to help corridor stakeholders implement the ICM AMS methodology successfully and effectively. It provides a step-by-step approach to ...
Russkij jazyk za rubezom. Jahrgang 1974 ("The Russian Language Abroad." Volume 1974)
ERIC Educational Resources Information Center
Huebner, Wolfgang
1975-01-01
Articles in the 1974 volume of this periodical are briefly reviewed, preponderantly under the headings of teaching materials, methodology, linguistics, scientific reports, and chronicle. Reviews and supplements, tapes and other materials are also included. (Text is in German.) (IFS/WGA)
Three-dimensional analysis of anisotropic spatially reinforced structures
NASA Technical Reports Server (NTRS)
Bogdanovich, Alexander E.
1993-01-01
The material-adaptive three-dimensional analysis of inhomogeneous structures based on the meso-volume concept and the application of deficient spline functions for displacement approximations is proposed. The general methodology is demonstrated on the example of a brick-type mosaic parallelepiped arbitrarily composed of anisotropic meso-volumes. A partition of each meso-volume into sub-elements, the application of deficient spline functions for a local approximation of displacements and, finally, the use of the variational principle allow one to obtain displacements, strains, and stresses at any point within the structural part. All of the necessary external and internal boundary conditions (including the conditions of continuity of transverse stresses at interfaces between adjacent meso-volumes) can be satisfied with requisite accuracy by increasing the density of the sub-element mesh. The application of the methodology to textile composite materials is described. Several numerical examples for woven and braided rectangular composite plates and stiffened panels under transverse bending are considered. Some typical effects of stress concentrations due to the material inhomogeneities are demonstrated.
Matias-Guiu, Pau; Rodríguez-Bencomo, Juan José; Pérez-Correa, José R; López, Francisco
2018-04-15
Developing new distillation strategies can help the spirits industry to improve quality, safety and process efficiency. Batch stills equipped with a packed column and an internal partial condenser are an innovative experimental system, allowing a fast and flexible management of the rectification. In this study, the impact of four factors (heart-cut volume, head-cut volume, pH and cooling flow rate of the internal partial condenser during the head-cut fraction) on 18 major volatile compounds of Muscat spirits was optimized using response surface methodology and desirability function approaches. Results have shown that high rectification at the beginning of the heart-cut enhances the overall positive aroma compounds of the product, reducing off-flavor compounds. In contrast, optimum levels of heart-cut volume, head-cut volume and pH factors varied depending on the process goal. Finally, three optimal operational conditions (head off-flavors reduction, flowery terpenic enhancement and fruity ester enhancement) were evaluated by chemical and sensory analysis. Copyright © 2017 Elsevier Ltd. All rights reserved.
Security Events and Vulnerability Data for Cybersecurity Risk Estimation.
Allodi, Luca; Massacci, Fabio
2017-08-01
Current industry standards for estimating cybersecurity risk are based on qualitative risk matrices as opposed to quantitative risk estimates. In contrast, risk assessment in most other industry sectors aims at deriving quantitative risk estimations (e.g., Basel II in finance). This article presents a model and methodology that leverage the large amount of data available from the IT infrastructure of an organization's security operation center to quantitatively estimate the probability of attack. Our methodology specifically addresses untargeted attacks delivered by automatic tools that make up the vast majority of attacks in the wild against users and organizations. We consider two-stage attacks whereby the attacker first breaches an Internet-facing system and then escalates the attack to internal systems by exploiting local vulnerabilities in the target. Our methodology factors in the power of the attacker as the number of "weaponized" vulnerabilities he/she can exploit, and can be adjusted to match the risk appetite of the organization. We illustrate our methodology using data from a large financial institution, and discuss the significant mismatch between traditional qualitative risk assessments and our quantitative approach. © 2017 Society for Risk Analysis.
Seismic Hazard Estimates Using Ill-defined Macroseismic Data at Site
NASA Astrophysics Data System (ADS)
Albarello, D.; Mucciarelli, M.
A new approach is proposed for seismic hazard estimation based on documentary data concerning the local history of seismic effects. The adopted methodology allows for the use of "poor" data, such as macroseismic observations, within a formally coherent approach that overcomes a number of problems connected with forcing the available information into the frame of "standard" methodologies calibrated on instrumental data. The proposed methodology allows full exploitation of all the available information (which for many towns in Italy covers several centuries), making possible a correct use of macroseismic data characterized by different levels of completeness and reliability. As an application of the proposed methodology, seismic hazard estimates are presented for two towns located in Northern Italy: Bologna and Carpi.
Ambient Vibration Testing for Story Stiffness Estimation of a Heritage Timber Building
Min, Kyung-Won; Kim, Junhee; Park, Sung-Ah; Park, Chan-Soo
2013-01-01
This paper investigates the dynamic characteristics of a historic wooden structure by ambient vibration testing, presenting a novel estimation methodology for story stiffness for the purpose of vibration-based structural health monitoring. In the ambient vibration testing, measured structural responses are analyzed by two output-only system identification methods (i.e., frequency domain decomposition and stochastic subspace identification) to estimate modal parameters. The proposed story stiffness estimation methodology is based on an eigenvalue problem derived from a vibratory rigid body model. Using the identified natural frequencies, the eigenvalue problem is efficiently solved and uniquely yields story stiffness. It is noteworthy that application of the proposed methodology is not necessarily confined to the wooden structure exemplified in the paper. PMID:24227999
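As a rough illustration of recovering story stiffness from identified natural frequencies, the sketch below solves the eigenvalue relations of a two-story shear-building model. The two-story idealization, masses, and frequencies are assumptions for illustration, not the paper's structure or data.

```python
import numpy as np
from scipy.optimize import fsolve

# Hypothetical lumped masses (kg) and identified natural frequencies (Hz)
m1, m2 = 4000.0, 3500.0
f1, f2 = 2.1, 5.6
lam1, lam2 = (2 * np.pi * f1) ** 2, (2 * np.pi * f2) ** 2  # eigenvalues of M^-1 K

def residuals(k):
    k1, k2 = k
    # Shear-building stiffness matrix K = [[k1 + k2, -k2], [-k2, k2]];
    # eigenvalues of M^-1 K must match the identified ones via trace and determinant.
    trace = (k1 + k2) / m1 + k2 / m2
    det = (k1 * k2) / (m1 * m2)
    return [trace - (lam1 + lam2), det - lam1 * lam2]

k1, k2 = fsolve(residuals, x0=[1e6, 1e6])
print(f"story stiffness k1 = {k1:.3e} N/m, k2 = {k2:.3e} N/m")
```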
Fracture mechanics approach to estimate rail wear limits
DOT National Transportation Integrated Search
2009-10-01
This paper describes a systematic methodology to estimate allowable limits for rail head wear in terms of vertical head-height loss, gage-face side wear, and/or the combination of the two. This methodology is based on the principles of engineering fr...
ERIC Educational Resources Information Center
LeMahieu, Paul G.; Nordstrum, Lee E.; Cudney, Elizabeth A.
2017-01-01
Purpose: This paper is one of seven in this volume that aims to elaborate different approaches to quality improvement in education. It delineates a methodology called Six Sigma. Design/methodology/approach: The paper presents the origins, theoretical foundations, core principles and a case study demonstrating an application of Six Sigma in a…
Positive Deviance: Learning from Positive Anomalies
ERIC Educational Resources Information Center
LeMahieu, Paul G.; Nordstrum, Lee E.; Gale, Dick
2017-01-01
Purpose: This paper is one of seven in this volume, each elaborating different approaches to quality improvement in education. The purpose of this paper is to delineate a methodology called positive deviance. Design/methodology/approach: The paper presents the origins, theoretical foundations, core principles and a case study demonstrating an…
A methodology for producing reliable software, volume 1
NASA Technical Reports Server (NTRS)
Stucki, L. G.; Moranda, P. B.; Foshee, G.; Kirchoff, M.; Omre, R.
1976-01-01
An investigation into the areas having an impact on producing reliable software including automated verification tools, software modeling, testing techniques, structured programming, and management techniques is presented. This final report contains the results of this investigation, analysis of each technique, and the definition of a methodology for producing reliable software.
NASA Technical Reports Server (NTRS)
Greitzer, E. M.; Bonnefoy, P. A.; delaRosaBlanco, E.; Dorbian, C. S.; Drela, M.; Hall, D. K.; Hansman, R. J.; Hileman, J. I.; Liebeck, R. H.; Lovegren, J.;
2010-01-01
Appendices A to F present the theory behind the TASOPT methodology and code. Appendix A describes the bulk of the formulation, while Appendices B to F develop the major sub-models for the engine, fuselage drag, BLI accounting, etc.
Goodman, Angela; Hakala, J. Alexandra; Bromhal, Grant; Deel, Dawn; Rodosta, Traci; Frailey, Scott; Small, Michael; Allen, Doug; Romanov, Vyacheslav; Fazio, Jim; Huerta, Nicolas; McIntyre, Dustin; Kutchko, Barbara; Guthrie, George
2011-01-01
A detailed description of the United States Department of Energy (US-DOE) methodology for estimating CO2 storage potential for oil and gas reservoirs, saline formations, and unmineable coal seams is provided. The oil and gas reservoirs are assessed at the field level, while saline formations and unmineable coal seams are assessed at the basin level. The US-DOE methodology is intended for external users such as the Regional Carbon Sequestration Partnerships (RCSPs), future project developers, and governmental entities to produce high-level CO2 resource assessments of potential CO2 storage reservoirs in the United States and Canada at the regional and national scale; however, this methodology is general enough that it could be applied globally. The purpose of the US-DOE CO2 storage methodology, definitions of storage terms, and a CO2 storage classification are provided. Methodology for CO2 storage resource estimate calculation is outlined. The Log Odds Method when applied with Monte Carlo Sampling is presented in detail for estimation of CO2 storage efficiency needed for CO2 storage resource estimates at the regional and national scale. CO2 storage potential reported in the US-DOE's assessment are intended to be distributed online by a geographic information system in NatCarb and made available as hard-copy in the Carbon Sequestration Atlas of the United States and Canada. US-DOE's methodology will be continuously refined, incorporating results of the Development Phase projects conducted by the RCSPs from 2008 to 2018. Estimates will be formally updated every two years in subsequent versions of the Carbon Sequestration Atlas of the United States and Canada.
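The volumetric core of the US-DOE methodology for saline formations is often summarized as G = A·h·φ·ρ·E, with the storage-efficiency factor E treated statistically. The sketch below is a simplified stand-in for the Log Odds/Monte Carlo procedure: it samples E over an assumed range, and all inputs are hypothetical rather than values from the DOE atlas.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical saline-formation inputs (illustrative, not DOE atlas values)
A = 5.0e9        # areal extent, m^2
h = 80.0         # net thickness, m
phi = 0.15       # total porosity
rho = 700.0      # CO2 density at reservoir conditions, kg/m^3

# Storage efficiency sampled over an assumed range (the DOE reports percentiles)
E = rng.uniform(0.004, 0.055, size=100_000)

G = A * h * phi * rho * E / 1e12   # storage resource, Gt CO2 (1 Gt = 1e12 kg)
print(f"P10/P50/P90 storage: {np.percentile(G, [10, 50, 90]).round(2)} Gt CO2")
```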
Wicke, Jason; Dumas, Genevieve A
2010-02-01
The geometric method combines a volume and a density function to estimate body segment parameters and offers the best opportunity for developing highly accurate models. In the trunk, there are many different tissues that greatly differ in density (e.g., bone versus lung). Thus, the density function for the trunk must be particularly sensitive to capture this diversity, such that accurate inertial estimates are possible. Three different models were used to test this hypothesis by estimating trunk inertial parameters of 25 female and 24 male college-aged participants. The outcome of this study indicates that the inertial estimates for the upper and lower trunk are most sensitive to the volume function and not very sensitive to the density function. Although it appears that the uniform density function has a greater influence on inertial estimates in the lower trunk region than in the upper trunk region, this is likely due to the (overestimated) density value used. When geometric models are used to estimate body segment parameters, care must be taken in choosing a model that can accurately estimate segment volumes. Researchers wanting to develop accurate geometric models should focus on the volume function, especially in unique populations (e.g., pregnant or obese individuals).
Erić, Mirela; Anderla, Andraš; Stefanović, Darko; Drapšin, Miodrag
2014-01-01
Preoperative breast volume estimation is very important for the success of breast surgery. In the present study, two different breast volume determination methods, the Cavalieri principle and 3D reconstruction, were compared. Consecutive sections were taken at a slice thickness of 5 mm, and every 2nd breast section in the set of consecutive sections was selected. Breast tissue was marked with a blue line on each selected section, and the CT scans thus prepared were used for breast volume estimation. The volumes of 60 breasts were estimated using the Cavalieri principle and 3D reconstruction. The mean breast volume was 467.79 ± 188.90 cm(3) with the Cavalieri method and 465.91 ± 191.41 cm(3) with 3D reconstruction. The mean CE for the estimates in this study was calculated as 0.25%. Skin-sparing volume was about 91.64% of the whole breast volume. Both methods are very accurate and show a strong linear association. Our results suggest that the calculation of breast volume or its part in vivo from a systematic series of CT scans using the Cavalieri principle or 3D breast reconstruction is accurate enough to be of significant clinical benefit in planning reconstructive breast surgery. These methods can help the surgeon choose the most appropriate implant and/or flap preoperatively. Copyright © 2014 Surgical Associates Ltd. Published by Elsevier Ltd. All rights reserved.
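The Cavalieri principle reduces to multiplying the summed section areas by the distance between sampled sections, here 1.0 cm since every 2nd slice of a 5 mm series is used. A minimal sketch with made-up section areas:

```python
# Cavalieri-principle sketch: hypothetical breast section areas (cm^2) traced
# on every 2nd CT slice; slice thickness 5 mm -> sampling distance 1.0 cm.
section_areas_cm2 = [12.4, 28.9, 41.7, 52.3, 48.1, 33.6, 15.2]
d_cm = 1.0  # distance between sampled sections

volume_cm3 = d_cm * sum(section_areas_cm2)
print(f"Cavalieri volume estimate: {volume_cm3:.1f} cm^3")  # -> 232.2 cm^3
```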
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-25
...] Request for Comments on Methodology for Conducting an Independent Study of the Burden of Patent-Related... methodologies for performing such a study (Methodology Report). ICF has now provided the USPTO with its Methodology Report, in which ICF recommends methodologies for addressing various topics about estimating the...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ma, C; Yin, Y
Purpose: The purpose of this study was to compare radiation therapy treatment plans designed to spare active bone marrow and whole pelvic bone marrow using 18F FLT PET/CT images. Methods: We developed an IMRT planning methodology to incorporate functional PET imaging using 18F FLT/CT scans. Plans were generated for two cervical cancer patients, in which the pelvic active bone marrow region was incorporated as an avoidance region based on the range SUV>2; another region was the whole pelvic bone marrow. Dose objectives were set to reduce the volume of active bone marrow and whole bone marrow irradiated. The volumes receiving 10 Gy (V10) and 20 Gy (V20) for active bone marrow were evaluated. Results: Active bone marrow regions identified by 18F FLT with an SUV>2 represented an average of 48.0% of the total osseous pelvis for the two cases studied. Improved dose-volume histograms for the identified bone marrow SUV volumes and decreases in V10 (average 18%) and V20 (average 14%) were achieved without clinically significant changes to PTV or OAR doses. Conclusion: Incorporation of 18F FLT/CT PET in IMRT planning provides a methodology to reduce radiation dose to active bone marrow without compromising PTV or OAR dose objectives in cervical cancer.
NASA Technical Reports Server (NTRS)
Onwubiko, Chinyere; Onyebueke, Landon
1996-01-01
This program report is the final report covering all the work done on this project. The goal of this project is technology transfer of methodologies to improve the design process. The specific objectives are: 1. To learn and understand probabilistic design analysis using NESSUS. 2. To assign design projects on the application of NESSUS to either undergraduate or graduate students. 3. To integrate the application of NESSUS into selected senior-level courses in the Civil and Mechanical Engineering curricula. 4. To develop courseware in probabilistic design methodology to be included in a graduate-level design methodology course. 5. To study the relationship between probabilistic design methodology and axiomatic design methodology.
Kawaguchi, A; Linde, L M; Imachi, T; Mizuno, H; Akutsu, H
1983-12-01
To estimate the left atrial volume (LAV) and pulmonary blood flow in patients with congenital heart disease (CHD), we employed two-dimensional echocardiography (TDE). The LAV was measured in dimensions other than those obtained in conventional M-mode echocardiography (M-mode echo). Mathematical and geometrical models for LAV calculation using the standard long-axis, short-axis, and apical four-chamber planes were devised and found to be reliable in a preliminary study using porcine heart preparations, although length (10%), area (20%), and volume (38%) were significantly and consistently underestimated with echocardiography. These models were then applied and correlated with angiocardiograms (ACG) in 25 consecutive patients with suspected CHD. In terms of the estimation of the absolute LAV, accuracy seemed commensurate with the number of dimensions measured. The correlation between data obtained by TDE and ACG varied with changing hemodynamics such as cardiac cycle, absolute LAV, and the presence or absence of volume load. The left atrium was found to become spherical and progressively underestimated with TDE at ventricular end-systole, at larger LAV, and with increased volume load. Since this tendency became less pronounced when additional dimensions were measured, reliable estimation of the absolute LAV and volume load was possible when 2 or 3 dimensions were measured. Among the calculation models depending on 2- or 3-dimensional measurements, there was only a small difference in accuracy and predictability, although the algorithms used varied from one model to another. This suggests that accurate cross-sectional area measurement is critically important for volume estimation, rather than any particular algorithm. Cross-sectional area measurement by TDE integrated into a three-dimensional equivalent allowed a reliable estimate of the LAV and volume load in a variety of hemodynamic situations where M-mode echo was not reliable.
Estimated maximal and current brain volume predict cognitive ability in old age
Royle, Natalie A.; Booth, Tom; Valdés Hernández, Maria C.; Penke, Lars; Murray, Catherine; Gow, Alan J.; Maniega, Susana Muñoz; Starr, John; Bastin, Mark E.; Deary, Ian J.; Wardlaw, Joanna M.
2013-01-01
Brain tissue deterioration is a significant contributor to lower cognitive ability in later life; however, few studies have appropriate data to establish how much influence prior brain volume and prior cognitive performance have on this association. We investigated the associations between structural brain imaging biomarkers, including an estimate of maximal brain volume, and detailed measures of cognitive ability at age 73 years in a large (N = 620), generally healthy, community-dwelling population. Cognitive ability data were available from age 11 years. We found positive associations (r) between general cognitive ability and estimated brain volume in youth (males, 0.28; females, 0.12), and measured brain volume in later life (males, 0.27; females, 0.26). Our findings show that cognitive ability in youth is a strong predictor of estimated prior and measured current brain volume in old age, and that these effects were the same for both white and gray matter. As one of the largest studies of associations between brain volume and cognitive ability in normal aging, this work contributes to the wider understanding of how some early-life factors influence cognitive aging. PMID:23850342
Prediction and error of baldcypress stem volume from stump diameter
Bernard R. Parresol
1998-01-01
The need to estimate the volume of removals occurs for many reasons, such as in trespass cases, severance tax reports, and post-harvest assessments. A logarithmic model is presented for prediction of baldcypress total stem cubic foot volume using stump diameter as the independent variable. Because the error of prediction is as important as the volume estimate, the...
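A logarithmic stump-diameter model of this kind typically takes the form ln(V) = b0 + b1·ln(D), back-transformed with a bias correction. The sketch below uses placeholder coefficients, not Parresol's published fit:

```python
import math

# Hedged sketch of a logarithmic stump-diameter model: ln(V) = b0 + b1 * ln(D).
# Coefficients and MSE are hypothetical placeholders, not the published values.
b0, b1 = -2.35, 2.58
mse = 0.04  # residual mean squared error on the log scale (assumed)

def stem_volume_ft3(stump_diameter_in: float) -> float:
    # exp(mse / 2) is the standard correction for log-transformation bias
    return math.exp(b0 + b1 * math.log(stump_diameter_in)) * math.exp(mse / 2)

print(f"{stem_volume_ft3(18.0):.1f} cubic feet")
```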
Taper-based system for estimating stem volumes of upland oaks
Donald E. Hilt
1980-01-01
A taper-based system for estimating stem volumes is developed for Central States upland oaks. Inside-bark diameters up the stem are predicted as a function of dbh inside bark, total height, and powers of relative height. A Fortran IV computer program, OAKVOL, is used to predict cubic and board-foot volumes to any desired merchantable top dib. Volumes of...
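The idea behind any taper-based volume system is to integrate predicted cross-sectional area along the stem. The sketch below uses a hypothetical power-form taper function rather than the OAKVOL equations:

```python
import numpy as np

# Hedged taper-based volume sketch: dib(h) is an assumed power-form taper
# function, not the OAKVOL equations. Volume = integral of cross-sectional
# area over height, evaluated here with a manual trapezoidal rule.
DBH_IB, TOTAL_HT = 16.0, 80.0  # inside-bark diameter (in), total height (ft)

def dib(h):
    rel = 1.0 - h / TOTAL_HT          # relative distance from the tip
    return DBH_IB * rel ** 0.7        # assumed taper exponent

h = np.linspace(0.0, 0.99 * TOTAL_HT, 500)
area_ft2 = np.pi * (dib(h) / 12.0) ** 2 / 4.0        # inches -> feet
volume_ft3 = float(np.sum((area_ft2[:-1] + area_ft2[1:]) / 2 * np.diff(h)))
print(f"stem volume ~ {volume_ft3:.1f} ft^3")
```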
Siozopoulos, Achilleas; Thomaidis, Vasilios; Prassopoulos, Panos; Fiska, Aliki
2018-02-01
The literature includes a number of studies using structural MRI (sMRI) to determine the volume of the amygdala, which is modified in various pathologic conditions. The reported values vary widely, mainly because of different anatomical approaches to the complex. This study aims at estimating the normal amygdala volume from sMRI scans using a recent anatomical definition described in a study based on post-mortem material. The amygdala volume was calculated in 106 healthy subjects using sMRI and anatomically based segmentation. The resulting volumes were analyzed for differences related to hemisphere, sex, and age. The mean amygdalar volume was estimated at 1.42 cm^3. The mean right amygdala volume was found to be larger than the left, but the difference for the raw values was within the limits of the method error. No intersexual differences or age-related alterations were observed. The study provides a method for determining the boundaries of the amygdala in sMRI scans based on recent anatomical considerations, and an estimate of the mean normal amygdala volume from a large number of scans for future use in comparative studies.
DOT National Transportation Integrated Search
1982-09-01
This project provides information about norms and attitudes related to alcohol use and driving. This volume reports the methodology, findings, discussions, and conclusions of individual interviews conducted with early adolescents (ages 13-14), middle...
DOT National Transportation Integrated Search
1983-05-01
This four volume report consists of a data base describing "surrogate" automobile and truck manufacturing plants developed as part of a methodology for evaluating capital investment requirements in new manufacturing facilities to build new fleets of ...
DOT National Transportation Integrated Search
2014-04-01
This report describes the methodology and results of analyses performed to identify and evaluate : alternative methods to control traffic entering a lane closure on a two-lane, two-way road from low-volume : access points. Researchers documented the ...
ERIC Educational Resources Information Center
Gutmanis, Ivars; And Others
The primary purpose of the study was to develop and apply a methodology for estimating the need for scientists and engineers by specialty in energy and energy-related industries. The projections methodology was based on the Case 1 estimates by the National Petroleum Council of the results of "maximum efforts" to develop domestic fuel sources by…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reimund, Kevin K.; McCutcheon, Jeffrey R.; Wilson, Aaron D.
A general method was developed for estimating the volumetric energy efficiency of pressure retarded osmosis via pressure-volume analysis of a membrane process. The resulting model requires only the osmotic pressure, π, and mass fraction, w, of water in the concentrated and dilute feed solutions to estimate the maximum achievable specific energy density, u, as a function of operating pressure. The model is independent of any membrane or module properties. This method utilizes equilibrium analysis to specify the volumetric mixing fraction of concentrated and dilute solution as a function of operating pressure, and provides results for the total volumetric energy density of similar order to more complex models for the mixing of seawater and river water. Within the framework of this analysis, the total volumetric energy density is maximized, for an idealized case, when the operating pressure is π/(1+√w⁻¹), which is lower than the maximum-power-density operating pressure, Δπ/2, derived elsewhere, and is a function of the solute osmotic pressure at a given mass fraction. It was also found that a minimum of 1.45 kmol of ideal solute is required to produce 1 kWh of energy, while a system operating at the "maximum power density operating pressure" requires at least 2.9 kmol. Utilizing this methodology, it is possible to examine the effects of volumetric solution cost, operation of a module at various pressures, and operation of a constant-pressure module with various feeds.
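A quick numerical check makes the comparison concrete: for seawater-like values (assumed here purely for illustration, with a nearly fresh dilute feed so that Δπ ≈ π), the reported optimum π/(1+√(1/w)) falls just below Δπ/2.

```python
import math

# Illustrative check of the energy-optimal operating pressure reported above,
# P_opt = pi / (1 + sqrt(1/w)), against the max-power-density choice dpi/2.
# Seawater-like inputs are assumed; the dilute feed is taken as nearly fresh.
pi_conc = 28.0   # osmotic pressure of the concentrated feed, bar (assumed)
w = 0.965        # mass fraction of water in the concentrated feed (assumed)

p_opt = pi_conc / (1.0 + math.sqrt(1.0 / w))
p_half = pi_conc / 2.0   # dpi/2 with pi_dilute ~ 0

print(f"energy-optimal pressure    ~ {p_opt:.2f} bar")   # ~13.88 bar
print(f"max-power-density pressure ~ {p_half:.2f} bar")  # ~14.00 bar
```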
Polidori, David; Rowley, Clarence
2014-07-22
The indocyanine green dilution method is one of the methods available to estimate plasma volume, although some researchers have questioned the accuracy of this method. We developed a new, physiologically based mathematical model of indocyanine green kinetics that more accurately represents indocyanine green kinetics during the first few minutes postinjection than what is assumed when using the traditional mono-exponential back-extrapolation method. The mathematical model is used to develop an optimal back-extrapolation method for estimating plasma volume based on simulated indocyanine green kinetics obtained from the physiological model. Results from a clinical study using the indocyanine green dilution method in 36 subjects with type 2 diabetes indicate that the estimated plasma volumes are considerably lower when using the traditional back-extrapolation method than when using the proposed back-extrapolation method (mean (standard deviation) plasma volume = 26.8 (5.4) mL/kg for the traditional method vs 35.1 (7.0) mL/kg for the proposed method). The results obtained using the proposed method are more consistent with previously reported plasma volume values. Based on the more physiological representation of indocyanine green kinetics and greater consistency with previously reported plasma volume values, the new back-extrapolation method is proposed for use when estimating plasma volume using the indocyanine green dilution method.
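For contrast with the proposed model-based approach, the traditional mono-exponential back-extrapolation can be sketched in a few lines: fit ln C(t) over the early samples, extrapolate to t = 0, and divide the dose by C(0). The data below are invented for illustration.

```python
import numpy as np

# Sketch of the *traditional* mono-exponential back-extrapolation (the proposed
# model-based method is more involved). Concentrations are hypothetical.
dose_mg = 25.0
t_min = np.array([2.0, 3.0, 4.0, 5.0, 6.0])   # sampling times, min
conc = np.array([6.1, 5.5, 5.0, 4.5, 4.1])    # ICG plasma concentration, mg/L

slope, intercept = np.polyfit(t_min, np.log(conc), 1)
c0 = np.exp(intercept)                        # back-extrapolated C(0)
plasma_volume_L = dose_mg / c0
print(f"C0 = {c0:.2f} mg/L, plasma volume = {plasma_volume_L:.2f} L")
```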
Concept for a Satellite-Based Advanced Air Traffic Management System : Volume 7. System Cost.
DOT National Transportation Integrated Search
1973-02-01
The volume presents estimates of the federal government and user costs for the Satellite-Based Advanced Air Traffic Management System and the supporting rationale. The system configuration is that presented in volumes II and III. The cost estimates a...
Validation of equations for pleural effusion volume estimation by ultrasonography.
Hassan, Maged; Rizk, Rana; Essam, Hatem; Abouelnour, Ahmed
2017-12-01
To validate the accuracy of previously published equations that estimate pleural effusion volume using ultrasonography. Only equations using simple measurements were tested. Three measurements were taken at the posterior axillary line for each case with effusion: lateral height of effusion (H), distance between collapsed lung and chest wall (C), and distance between lung and diaphragm (D). Cases whose effusion was aspirated to dryness were included, and the drained volume was recorded. The intra-class correlation coefficient (ICC) was used to determine the predictive accuracy of five equations against the actual volume of aspirated effusion. 46 cases with effusion were included. The most accurate equation in predicting effusion volume was (H + D) × 70 (ICC 0.83). The simplest and yet accurate equation was H × 100 (ICC 0.79). Pleural effusion height measured by ultrasonography gives a reasonable estimate of effusion volume. Incorporating the distance between lung base and diaphragm into the estimation improves accuracy from 79% with the first method to 83% with the latter.
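The two best-performing equations are simple enough to apply at the bedside. A sketch, assuming H and D are measured in centimetres and volumes come out in millilitres (the usual convention for these sonographic formulas):

```python
# The two best-performing equations from the abstract, assuming H and D in cm
# and volumes in mL (the usual convention for these sonographic formulas).
def effusion_volume_hd(h_cm: float, d_cm: float) -> float:
    return (h_cm + d_cm) * 70.0      # ICC 0.83 in the study

def effusion_volume_h(h_cm: float) -> float:
    return h_cm * 100.0              # simpler; ICC 0.79

print(effusion_volume_hd(9.0, 4.0))  # -> 910.0 mL
print(effusion_volume_h(9.0))        # -> 900.0 mL
```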
SU-E-T-129: Are Knowledge-Based Planning Dose Estimates Valid for Distensible Organs?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lalonde, R; Heron, D; Huq, M
2015-06-15
Purpose: Knowledge-based planning programs have become available to assist treatment planning in radiation therapy. Such programs can be used to generate estimated DVHs and planning constraints for organs at risk (OARs), based upon a model generated from previous plans. These estimates are based upon the planning CT scan. However, for distensible OARs like the bladder and rectum, daily variations in volume may make the dose estimates invalid. The purpose of this study is to determine whether knowledge-based DVH dose estimates may be valid for distensible OARs. Methods: The Varian RapidPlan™ knowledge-based planning module was used to generate OAR dose estimates and planning objectives for 10 prostate cases previously planned with VMAT, and final plans were calculated for each. Five weekly setup CBCT scans of each patient were then downloaded and contoured (assuming no change in size and shape of the target volume), and rectum and bladder DVHs were recalculated for each scan. Dose volumes were then compared at 75, 60, and 40 Gy for the bladder and rectum between the planning scan and the CBCTs. Results: Plan doses and estimates matched well at all dose points. Volumes of the rectum and bladder varied widely between the planning CT and the CBCTs, ranging from 0.46 to 2.42 for the bladder and 0.71 to 2.18 for the rectum, causing relative dose volumes to vary between planning CT and CBCT, but absolute dose volumes were more consistent. The overall ratio of CBCT/plan dose volumes was 1.02 ± 0.27 for the rectum and 0.98 ± 0.20 for the bladder in these patients. Conclusion: Knowledge-based planning dose volume estimates for distensible OARs are still valid, in absolute volume terms, between treatment planning scans and CBCTs taken during daily treatment. Further analysis of the data is being undertaken to determine how differences depend upon rectum and bladder filling state. This work has been supported by Varian Medical Systems.
Ho, Hsing-Hao; Li, Ya-Hui; Lee, Jih-Chin; Wang, Chih-Wei; Yu, Yi-Lin; Hueng, Dueng-Yuan; Hsu, Hsian-He
2018-01-01
Purpose: We estimated the volume of vestibular schwannomas by an ice cream cone formula using thin-sliced magnetic resonance images (MRI) and compared the estimation accuracy among different estimating formulas and between different models. Methods: The study was approved by a local institutional review board. A total of 100 patients with vestibular schwannomas examined by MRI between January 2011 and November 2015 were enrolled retrospectively. Informed consent was waived. Volumes of vestibular schwannomas were estimated by cuboidal, ellipsoidal, and spherical formulas based on a one-component model, and cuboidal, ellipsoidal, Linskey's, and ice cream cone formulas based on a two-component model. The estimated volumes were compared to the volumes measured by planimetry. Intraobserver reproducibility and interobserver agreement were tested. Estimation error, including absolute percentage error (APE) and percentage error (PE), was calculated. Statistical analysis included the intraclass correlation coefficient (ICC), linear regression analysis, one-way analysis of variance, and paired t-tests, with P < 0.05 considered statistically significant. Results: Overall tumor size was 4.80 ± 6.8 mL (mean ± standard deviation). All ICCs were no less than 0.992, suggestive of high intraobserver reproducibility and high interobserver agreement. Cuboidal formulas significantly overestimated the tumor volume by a factor of 1.9 to 2.4 (P ≤ 0.001). The one-component ellipsoidal and spherical formulas overestimated the tumor volume with an APE of 20.3% and 29.2%, respectively. The two-component ice cream cone method, ellipsoidal formula, and Linskey's formula significantly reduced the APE to 11.0%, 10.1%, and 12.5%, respectively (all P < 0.001). Conclusion: The ice cream cone method and the other two-component formulas, including the ellipsoidal and Linskey's formulas, allow for estimation of vestibular schwannoma volume more accurately than all one-component formulas. PMID:29438424
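For orientation, the one-component formulas are standard functions of three orthogonal diameters, while the two-component "ice cream cone" idea splits the tumor into an intracanalicular and a cisternal part. The decomposition sketched below (cone plus half-ellipsoid) is an assumption for illustration; the paper's exact construction may differ.

```python
import math

# One-component volume formulas for a lesion with orthogonal diameters a, b, c (cm)
def cuboidal(a, b, c):    return a * b * c                   # tends to overestimate
def ellipsoidal(a, b, c): return math.pi / 6.0 * a * b * c
def spherical(a, b, c):
    d = (a + b + c) / 3.0
    return math.pi / 6.0 * d ** 3

# Two-component "ice cream cone" sketch: intracanalicular cone plus a cisternal
# half-ellipsoid. This decomposition is assumed here, not taken from the paper.
def ice_cream_cone(cone_base_d, cone_len, a, b, c):
    cone = math.pi / 12.0 * cone_base_d ** 2 * cone_len      # (1/3) * pi r^2 h
    half_ellipsoid = math.pi / 12.0 * a * b * c
    return cone + half_ellipsoid

print(f"{ellipsoidal(2.4, 1.9, 2.1):.2f} cm^3")
```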
Frenning, Göran
2015-01-01
When the discrete element method (DEM) is used to simulate confined compression of granular materials, the need arises to estimate the void space surrounding each particle with Voronoi polyhedra. This entails recurring Voronoi tessellation with small changes in the geometry, resulting in a considerable computational overhead. To overcome this limitation, we propose a method with the following features: (i) a local determination of the polyhedron volume is used, which considerably simplifies implementation of the method; (ii) a linear approximation of the polyhedron volume is utilised, with intermittent exact volume calculations when needed; (iii) the method allows highly accurate volume estimates to be obtained at a considerably reduced computational cost. PMID:26150975
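A generic sketch of the linear-update idea follows, with `exact_volume` and `volume_rate` as stand-ins for the tessellation and its local linearisation; the paper's actual update rule and recomputation trigger may differ.

```python
def track_volume(exact_volume, volume_rate, steps, dt, tol=0.05):
    """Advance a cell volume by cheap linear updates; retessellate on drift."""
    v = v_exact = exact_volume(0)
    for n in range(1, steps + 1):
        v += volume_rate(n) * dt                  # first-order (linear) update
        if abs(v - v_exact) > tol * v_exact:      # drifted beyond tolerance?
            v = v_exact = exact_volume(n)         # intermittent exact calculation
    return v

# Stub demo: a slowly shrinking cell with a slightly inaccurate linearised rate.
print(track_volume(lambda n: 1.0 * 0.999 ** n, lambda n: -0.001, 100, 1.0))
```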
Hoogman, Martine; Bralten, Janita; Hibar, Derrek P.; Mennes, Maarten; Zwiers, Marcel P.; Schweren, Lizanne; van Hulzen, Kimm J.E.; Medland, Sarah E.; Shumskaya, Elena; Jahanshad, Neda; de Zeeuw, Patrick; Szekely, Eszter; Sudre, Gustavo; Wolfers, Thomas; Onnink, Alberdingk M.H.; Dammers, Janneke T.; Mostert, Jeanette C.; Vives-Gilabert, Yolanda; Kohls, Gregor; Oberwelland, Eileen; Seitz, Jochen; Schulte-Rüther, Martin; di Bruttopilo, Sara Ambrosino; Doyle, Alysa E.; Høvik, Marie F.; Dramsdahl, Margaretha; Tamm, Leanne; van Erp, Theo G.M.; Dale, Anders; Schork, Andrew; Conzelmann, Annette; Zierhut, Kathrin; Baur, Ramona; McCarthy, Hazel; Yoncheva, Yuliya N.; Cubillo, Ana; Chantiluke, Kaylita; Mehta, Mitul A.; Paloyelis, Yannis; Hohmann, Sarah; Baumeister, Sarah; Bramati, Ivanei; Mattos, Paulo; Tovar-Moll, Fernanda; Douglas, Pamela; Banaschewski, Tobias; Brandeis, Daniel; Kuntsi, Jonna; Asherson, Phil; Rubia, Katya; Kelly, Clare; Di Martino, Adriana; Milham, Michael P.; Castellanos, Francisco X.; Frodl, Thomas; Zentis, Mariam; Lesch, Klaus-Peter; Reif, Andreas; Pauli, Paul; Jernigan, Terry; Haavik, Jan; Plessen, Kerstin J.; Lundervold, Astri J.; Hugdahl, Kenneth; Seidman, Larry J.; Biederman, Joseph; Rommelse, Nanda; Heslenfeld, Dirk J.; Hartman, Catharina; Hoekstra, Pieter J.; Oosterlaan, Jaap; von Polier, Georg; Konrad, Kerstin; Vilarroya, Oscar; Ramos-Quiroga, Josep-Antoni; Soliva, Joan Carles; Durston, Sarah; Buitelaar, Jan K.; Faraone, Stephen V.; Shaw, Philip; Thompson, Paul; Franke, Barbara
2017-01-01
BACKGROUND: Neuroimaging studies show structural alterations in several brain regions in children and adults with attention-deficit/hyperactivity disorder (ADHD). Through the formation of the worldwide ENIGMA ADHD Working Group, we addressed weaknesses of prior imaging studies and meta-analyses in sample size and methodological heterogeneity. METHODS: Our sample comprised 1713 participants with ADHD and 1529 controls from 23 sites (age range: 4-63 years; 66% males). Individual sites analyzed magnetic resonance imaging brain scans with harmonized protocols. Case-control differences in subcortical structures and intracranial volume (ICV) were assessed through mega- and meta-analysis. FINDINGS: The volumes of the accumbens (Cohen's d = -0.15), amygdala (d = -0.19), caudate (d = -0.11), hippocampus (d = -0.11), putamen (d = -0.14), and ICV (d = -0.10) were found to be smaller in cases relative to controls. Effect sizes were largest in children; case-control differences were not present in adults. Explorative lifespan modeling suggested a delay of maturation and a delay of degeneration. Psychostimulant medication use and the presence of comorbid psychiatric disorders did not influence the results, nor did symptom scores correlate with brain volume. INTERPRETATION: Using the largest data set to date, we extend the brain maturation delay theory for ADHD to include subcortical structures and refute medication effects on brain volume suggested by earlier meta-analyses. We add new knowledge about bilateral amygdala, accumbens, and hippocampus reductions in ADHD, and provide unprecedented precision in effect size estimates. Lifespan analyses suggest that, in the absence of well-powered longitudinal studies, the ENIGMA cross-sectional sample across six decades of life provides a means to generate hypotheses about lifespan trajectories in brain phenotypes. FUNDING: National Institutes of Health PMID:28219628
Local deformation for soft tissue simulation
Omar, Nadzeri; Zhong, Yongmin; Smith, Julian; Gu, Chengfan
2016-01-01
This paper presents a new methodology to localize the deformation range in order to improve the computational efficiency of soft tissue simulation. This methodology identifies the local deformation range from the stress distribution in soft tissues due to an external force. A stress estimation method based on elastic theory is used to estimate the stress in soft tissues according to depth from the contact surface. The proposed methodology can be used with both mass-spring and finite element modeling approaches for soft tissue deformation. Experimental results show that the proposed methodology can improve computational efficiency while maintaining modeling realism. PMID:27286482
Updated Magmatic Flux Rate Estimates for the Hawaii Plume
NASA Astrophysics Data System (ADS)
Wessel, P.
2013-12-01
Several studies have estimated the magmatic flux rate along the Hawaiian-Emperor Chain using a variety of methods and arriving at different results. These flux rate estimates have weaknesses because of incomplete data sets and different modeling assumptions, especially for the youngest portion of the chain (<3 Ma). While they generally agree on the 1st order features, there is less agreement on the magnitude and relative size of secondary flux variations. Some of these differences arise from the use of different methodologies, but the significance of this variability is difficult to assess due to a lack of confidence bounds on the estimates obtained with these disparate methods. All methods introduce some error, but to date there has been little or no quantification of error estimates for the inferred melt flux, making an assessment problematic. Here we re-evaluate the melt flux for the Hawaii plume with the latest gridded data sets (SRTM30+ and FAA 21.1) using several methods, including the optimal robust separator (ORS) and directional median filtering techniques (DiM). We also compute realistic confidence limits on the results. In particular, the DiM technique was specifically developed to aid in the estimation of surface loads that are superimposed on wider bathymetric swells and it provides error estimates on the optimal residuals. Confidence bounds are assigned separately for the estimated surface load (obtained from the ORS regional/residual separation techniques) and the inferred subsurface volume (from gravity-constrained isostasy and plate flexure optimizations). These new and robust estimates will allow us to assess which secondary features in the resulting melt flux curve are significant and should be incorporated when correlating melt flux variations with other geophysical and geochemical observations.
Geothermal resources and reserves in Indonesia: an updated revision
NASA Astrophysics Data System (ADS)
Fauzi, A.
2015-02-01
More than 300 high- to low-enthalpy geothermal sources have been identified throughout Indonesia. From the early 1980s until the late 1990s, the geothermal potential for power production in Indonesia was estimated to be about 20 000 MWe. The most recent estimate exceeds 29 000 MWe derived from the 300 sites (Geological Agency, December 2013). This resource estimate has been obtained by adding all of the estimated geothermal potential resources and reserves classified as "speculative", "hypothetical", "possible", "probable", and "proven" from all sites where such information is available. However, this approach to estimating the geothermal potential is flawed because it includes double counting of some reserve estimates as resource estimates, thus giving an inflated figure for the total national geothermal potential. This paper describes an updated revision of the geothermal resource estimate in Indonesia using a more realistic methodology. The methodology proposes that the preliminary "Speculative Resource" category should cover the full potential of a geothermal area and form the base reference figure for the resource of the area. Further investigation of this resource may improve the level of confidence of the category of reserves but will not necessarily increase the figure of the "preliminary resource estimate" as a whole, unless the result of the investigation is higher. A previous paper (Fauzi, 2013a, b) redefined and revised the geothermal resource estimate for Indonesia. The methodology, adopted from Fauzi (2013a, b), will be fully described in this paper. As a result of using the revised methodology, the potential geothermal resources and reserves for Indonesia are estimated to be about 24 000 MWe, some 5000 MWe less than the 2013 national estimate.
NASA Technical Reports Server (NTRS)
Doneaud, Andre A.; Miller, James R., Jr.; Johnson, L. Ronald; Vonder Haar, Thomas H.; Laybe, Patrick
1987-01-01
The use of the area-time-integral (ATI) technique, based only on satellite data, to estimate convective rain volume over a moving target is examined. The technique is based on the correlation between the radar echo area coverage integrated over the lifetime of the storm and the radar-estimated rain volume. The processing of the GOES and radar data collected in 1981 is described. The radar and satellite parameters for six convective clusters from storm events occurring on June 12 and July 2, 1981 are analyzed and compared in terms of time steps and cluster lifetimes. Rain volume is calculated by first using regression analysis to generate the equation relating the ATI to rain volume; the ATI-versus-rain-volume relation is then employed to compute rain volume. The data reveal that the ATI technique using satellite data is applicable to the calculation of rain volume.
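In outline, the ATI is the echo (or cold-cloud) area integrated over the cluster lifetime, and a fitted regression maps ATI to rain volume. The sketch below uses invented area snapshots and placeholder regression coefficients, not the paper's fit:

```python
import numpy as np

# Hedged ATI sketch: area snapshots (km^2) at fixed time steps are integrated
# over the cluster lifetime, then a fitted regression maps ATI to rain volume.
# Regression coefficients here are placeholders, not the paper's values.
areas_km2 = np.array([150.0, 420.0, 880.0, 1020.0, 760.0, 310.0])
dt_h = 0.5                                  # time step between images, hours

ati = np.sum(areas_km2 * dt_h)              # area-time integral, km^2 h
a, b = 3.7e-2, 1.01                         # assumed power-law fit V = a * ATI^b
rain_volume = a * ati ** b                  # units fixed by the fit
print(f"ATI = {ati:.0f} km^2 h, rain volume ~ {rain_volume:.0f} (fit units)")
```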
Steppan, Martin; Kraus, Ludwig; Piontek, Daniela; Siciliano, Valeria
2013-01-01
Prevalence estimation of cannabis use is usually based on self-report data. Although there is evidence on the reliability of this data source, its cross-cultural validity is still a major concern. External objective criteria are needed for this purpose. In this study, cannabis-related search engine query data are used as an external criterion. Data on cannabis use were taken from the 2007 European School Survey Project on Alcohol and Other Drugs (ESPAD). Provincial data came from three Italian nation-wide studies using the same methodology (2006-2008; ESPAD-Italia). Information on cannabis-related search engine query data was based on Google search volume indices (GSI). (1) Reliability analysis was conducted for GSI. (2) Latent measurement models of "true" cannabis prevalence were tested using perceived availability, web-based cannabis searches and self-reported prevalence as indicators. (3) Structure models were set up to test the influences of response tendencies and geographical position (latitude, longitude). In order to test the stability of the models, analyses were conducted on country level (Europe, US) and on provincial level in Italy. Cannabis-related GSI were found to be highly reliable and constant over time. The overall measurement model was highly significant in both data sets. On country level, no significant effects of response bias indicators and geographical position on perceived availability, web-based cannabis searches and self-reported prevalence were found. On provincial level, latitude had a significant positive effect on availability indicating that perceived availability of cannabis in northern Italy was higher than expected from the other indicators. Although GSI showed weaker associations with cannabis use than perceived availability, the findings underline the external validity and usefulness of search engine query data as external criteria. The findings suggest an acceptable relative comparability of national (provincial) prevalence estimates of cannabis use that are based on a common survey methodology. Search engine query data are a too weak indicator to base prevalence estimations on this source only, but in combination with other sources (waste water analysis, sales of cigarette paper) they may provide satisfactory estimates. Copyright © 2012. Published by Elsevier B.V.
NASA Astrophysics Data System (ADS)
Mitchell, G. A.; Gharib, J. J.; Doolittle, D. F.
2015-12-01
Methane gas flux from the seafloor to the atmosphere is an important variable for global carbon cycle and climate models, yet it is poorly constrained. Methodologies used to estimate seafloor gas flux commonly employ a combination of acoustic and optical techniques. These techniques often use hull-mounted multibeam echosounders (MBES) to quickly ensonify large volumes of the water column for acoustic backscatter anomalies indicative of gas bubble plumes. Detection of these water column anomalies with a MBES provides information on the lateral distribution of the plumes, the midwater dimensions of the plumes, and their positions on the seafloor. Seafloor plume locations are targeted for visual investigation using a remotely operated vehicle (ROV) to determine bubble emission rates, venting behaviors, bubble sizes, and ascent velocities. Once these variables are measured in situ, an extrapolation of gas flux is made over the survey area using the number of remotely mapped flares. This methodology was applied to a geophysical survey conducted in 2013 over a large seafloor crater that developed in response to an oil well blowout in 1983 offshore Papua New Guinea. The site was investigated by multibeam and sidescan mapping, sub-bottom profiling, 2-D high-resolution multi-channel seismic reflection, and ROV video and coring operations. Numerous water column plumes were detected in the data, suggesting vigorously active vents within and near the seafloor crater (Figure 1). This study uses dual-frequency MBES datasets (Reson 7125, 200/400 kHz) and ROV video imagery of the active hydrocarbon seeps to estimate the total gas flux from the crater. Plumes of bubbles were extracted from the water column data using threshold filtering techniques. Analysis of video images of the seep emission sites within the crater provided estimates of bubble size, expulsion frequency, and ascent velocity. The average gas flux characteristics derived from ROV video observations are extrapolated over the number of individual flares detected acoustically to estimate the gas flux from the survey area. The gas flux estimate from the water column filtering and ROV observations yields a range of 2.2-6.6 mol CH4/min.
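The flux arithmetic can be sketched as bubble volume times release rate, converted to moles with the ideal gas law at seafloor pressure and scaled by the acoustic flare count. Every input below is illustrative rather than a measured survey value.

```python
import math

# Hedged per-seep flux sketch: bubble volume x release rate -> ideal-gas molar
# flux at seafloor conditions, scaled by the number of acoustic flares.
R = 8.314                                    # J/(mol K)
depth_m, T_K = 120.0, 285.0                  # assumed site conditions
P_Pa = 101_325 + 1025 * 9.81 * depth_m       # atmospheric + hydrostatic

r_mm = 2.5                                   # assumed mean bubble radius
bubbles_per_s = 30.0                         # assumed release rate at one vent
v_bubble = 4.0 / 3.0 * math.pi * (r_mm / 1000.0) ** 3   # bubble volume, m^3

mol_per_s = P_Pa * v_bubble * bubbles_per_s / (R * T_K)  # n = PV/RT per second
n_flares = 30                                # flares detected acoustically
print(f"total flux ~ {mol_per_s * n_flares * 60:.2f} mol CH4/min")
```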
ERIC Educational Resources Information Center
Grasso, Janet; Fosburg, Steven
Fifth in a series of seven volumes reporting the design, methodology, and findings of the 4-year National Day Care Home Study (NDCHS), this volume presents a descriptive and statistical analysis of the day care institutions that administer day care systems. These systems, such as Learning Unlimited in Los Angeles and the family day care program of…
NASA Astrophysics Data System (ADS)
Michelon, M. F.; Antonelli, A.
2010-03-01
We have developed a methodology to study the thermodynamics of order-disorder transformations in n-component substitutional alloys that combines nonequilibrium methods, which can efficiently compute free energies, with Monte Carlo simulations, in which configurational and vibrational degrees of freedom are simultaneously considered on an equal footing. Furthermore, with this methodology one can easily perform simulations in the canonical and isobaric-isothermal ensembles, which allows investigation of the bulk volume effect. We have applied this methodology to calculate configurational and vibrational contributions to the entropy of the Ni3Al alloy as functions of temperature. The simulations show that when the volume of the system is kept constant, the vibrational entropy does not change upon transition, while constant-pressure calculations indicate that the volume increase at the order-disorder transition causes a vibrational entropy increase of 0.08 kB/atom. This is significant when compared to the configurational entropy increase of 0.27 kB/atom. Our calculations also indicate that the inclusion of vibrations reduces the order-disorder transition temperature, determined solely considering the configurational degrees of freedom, by about 30%.
Bigger is Better, but at What Cost? Estimating the Economic Value of Incremental Data Assets.
Dalessandro, Brian; Perlich, Claudia; Raeder, Troy
2014-06-01
Many firms depend on third-party vendors to supply data for commercial predictive modeling applications. An issue that has received very little attention in the prior research literature is the estimation of a fair price for purchased data. In this work we present a methodology for estimating the economic value of adding incremental data to predictive modeling applications and present two case studies. The methodology starts with estimating the effect that incremental data have on model performance in terms of common classification evaluation metrics. This effect is then translated into economic units, which gives the expected economic value that the firm might realize through the acquisition of a particular data asset. With this estimate, a firm can then set a data acquisition price that targets a particular return on investment. This article presents the methodology in full detail and illustrates it in the context of two marketing case studies.
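The valuation logic reduces to translating a performance lift into dollars and backing out a price for a target return on investment. A sketch with entirely made-up numbers:

```python
# Hedged sketch of the valuation logic: translate a performance lift from
# incremental data into dollars, then back out a price for a target ROI.
# Every number below is a hypothetical illustration.
baseline_conversion = 0.020      # response rate without the extra data
lift = 0.0024                    # absolute gain with the data asset added
volume = 1_000_000               # targeted customers per year
margin = 40.0                    # profit per conversion, $

value = lift * volume * margin                 # expected annual value, $
target_roi = 2.0                               # require value = (1 + roi) * price
max_price = value / (1.0 + target_roi)
print(f"expected value ${value:,.0f}/yr -> pay at most ${max_price:,.0f}")
```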
Reliability study on high power 638-nm triple emitter broad area laser diode
NASA Astrophysics Data System (ADS)
Yagi, T.; Kuramoto, K.; Kadoiwa, K.; Wakamatsu, R.; Miyashita, M.
2016-03-01
The reliability of the 638-nm triple-emitter broad area laser diode (BA-LD) with the window-mirror structure was studied. A methodology to estimate the mean time to failure (MTTF) due to catastrophic optical mirror degradation (COMD) within a reasonable aging duration was newly proposed. The power at which the LD failed due to COMD (PCOMD) was measured for the aged LDs under several aging conditions. It was revealed that PCOMD is proportional to the logarithm of aging duration, and that the MTTF due to COMD (MTTF(COMD)) could be estimated using this relation. The MTTF(COMD) estimated by the methodology with an aging duration of approximately 2,000 hours was consistent with that estimated by long-term aging. Using this methodology, the MTTF of the BA-LD was estimated to exceed 100,000 hours at an output of 2.5 W and a duty cycle of 30%.
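The proposed extrapolation can be sketched as a linear fit of failure power against log aging time, solved for the time at which the fit crosses the operating power. The measurements below are invented, not the paper's aging data.

```python
import numpy as np

# Hedged sketch of the proposed extrapolation: P_COMD degrades linearly with
# log(aging time); MTTF(COMD) is where the fit crosses the operating power.
t_h = np.array([100.0, 300.0, 1000.0, 2000.0])   # aging durations, hours
p_comd_W = np.array([4.9, 4.6, 4.3, 4.1])        # failure power after aging (assumed)

b, a = np.polyfit(np.log10(t_h), p_comd_W, 1)    # fit P = a + b * log10(t)
p_op = 2.5                                        # operating power, W
mttf_h = 10 ** ((p_op - a) / b)                   # solve a + b*log10(t) = p_op
print(f"estimated MTTF(COMD) ~ {mttf_h:.2e} h")   # ~1e6 h with these inputs
```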