Sample records for map produced volume

  1. Regional Disparities in Online Map User Access Volume and Determining Factors

    NASA Astrophysics Data System (ADS)

    Li, R.; Yang, N.; Li, R.; Huang, W.; Wu, H.

    2017-09-01

    Regional disparity in online map user access volume (referred to here simply as `user access volume') is a topic of growing interest as public use of online maps increases, and understanding it helps target the construction of geographic information services for different areas. We first statistically analysed online map user access logs and quantified these regional access disparities at different scales. The results show that, for China as a whole, user access volume decreases from east to west, with East China producing the most access volume; the cities generating the heaviest access are also crucial economic and transport centres. Principal Component Regression (PCR) is then applied to explore the regional disparities in user access volume, and a determining model for Online Map access volume is proposed. The model indicates that area scale is the primary determining factor for regional disparities, followed by public transport development level and public service development level; other factors such as the user quality index and financial index have very limited influence on user access volume. Based on these regional disparities, map providers can reasonably dispatch and allocate data and service resources in each area and improve the operational efficiency of the Online Map server cluster.
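    As a minimal illustration of the Principal Component Regression step described in this record, the sketch below regresses a synthetic access-volume response on principal components of candidate regional factors. The factor names and data are placeholders, not the study's dataset.

      # Hedged sketch of Principal Component Regression (PCR): regress access
      # volume on principal components of candidate regional factors.
      # Feature meanings and data are hypothetical placeholders.
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.linear_model import LinearRegression
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(0)
      X = rng.normal(size=(100, 5))   # e.g. area scale, transport, public service, user quality, finance
      y = X @ np.array([3.0, 2.0, 1.5, 0.2, 0.1]) + rng.normal(scale=0.5, size=100)

      pcr = make_pipeline(StandardScaler(), PCA(n_components=3), LinearRegression())
      pcr.fit(X, y)
      print("R^2 on training data:", pcr.score(X, y))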

  2. Bathymetric map and area/capacity table for Castle Lake, Washington

    USGS Publications Warehouse

    Mosbrucker, Adam R.; Spicer, Kurt R.

    2017-11-14

    The May 18, 1980, eruption of Mount St. Helens produced a 2.5-cubic-kilometer debris avalanche that dammed South Fork Castle Creek, causing Castle Lake to form behind a 20-meter-tall blockage. Risk of a catastrophic breach of the newly impounded lake led to outlet channel stabilization work, aggressive monitoring programs, mapping efforts, and blockage stability studies. Despite relatively large uncertainty, early mapping efforts adequately supported several lake breakout models, but have limited applicability to current lake monitoring and hazard assessment. Here, we present the results of a bathymetric survey conducted in August 2012 with the purpose of (1) verifying previous volume estimates, (2) computing an area/capacity table, and (3) producing a bathymetric map. Our survey found seasonal lake volume ranges between 21.0 and 22.6 million cubic meters with a fundamental vertical accuracy representing 0.88 million cubic meters. Lake surface area ranges between 1.13 and 1.16 square kilometers. Relationships developed by our results allow the computation of lake volume from near real-time lake elevation measurements or from remotely sensed imagery.

  3. Quantifying Mesoscale Neuroanatomy Using X-Ray Microtomography

    PubMed Central

    Gray Roncal, William; Prasad, Judy A.; Fernandes, Hugo L.; Gürsoy, Doga; De Andrade, Vincent; Fezzaa, Kamel; Xiao, Xianghui; Vogelstein, Joshua T.; Jacobsen, Chris; Körding, Konrad P.

    2017-01-01

    Methods for resolving the three-dimensional (3D) microstructure of the brain typically start by thinly slicing and staining the brain, followed by imaging numerous individual sections with visible light photons or electrons. In contrast, X-rays can be used to image thick samples, providing a rapid approach for producing large 3D brain maps without sectioning. Here we demonstrate the use of synchrotron X-ray microtomography (µCT) for producing mesoscale (∼1 µm³ resolution) brain maps from millimeter-scale volumes of mouse brain. We introduce a pipeline for µCT-based brain mapping that develops and integrates methods for sample preparation, imaging, and automated segmentation of cells, blood vessels, and myelinated axons, in addition to statistical analyses of these brain structures. Our results demonstrate that X-ray tomography achieves rapid quantification of large brain volumes, complementing other brain mapping and connectomics efforts. PMID:29085899

  4. Large-area Mapping of Forest Cover and Biomass using ALOS PALSAR

    NASA Astrophysics Data System (ADS)

    Cartus, O.; Kellndorfer, J. M.; Walker, W. S.; Goetz, S. J.; Laporte, N.; Bishop, J.; Cormier, T.; Baccini, A.

    2011-12-01

    As part of a pan-tropical mapping project, we aim to produce high-resolution forest cover maps from ALOS PALSAR. The ALOS data were obtained through the Americas ALOS Data Node (AADN) at ASF. For the forest cover classification, a pan-tropical network of calibrated reference data was generated from ancillary satellite data (ICESat GLAS). These data are used to classify PALSAR swath data, which are then combined into continental forest probability maps. The maps are validated with withheld training data as well as through independent operator verification against very high-resolution imagery. In addition, we aim to develop robust algorithms for mapping forest biophysical parameters such as stem volume or biomass using a synergy of PALSAR, optical, and lidar data, and we are currently testing different approaches. 1) For the showcase scenario of Mexico, where we have access to ~1400 PALSAR FBD images as well as the 30 m Landsat Vegetation Continuous Fields (VCF) product, we test a traditional ground-data-based approach. The PALSAR HH/HV intensity data and VCF are used as predictor layers in RandomForest to predict aboveground forest biomass. A network of 40,000 in situ biomass plots is used for model development (one model per PALSAR swath) as well as for validation. With this approach, a first 30 m biomass map for all of Mexico was produced. An initial validation of the map yielded an RMSE of 41 t/ha and an R2 of 0.42, with pronounced differences between ecozones: in some areas the retrieval reached an R2 of 0.6 (e.g., pine-oak forests), whereas in dry woodlands, for instance, the retrieval accuracy was much lower (R2 of 0.1). A further limitation of the approach was that, in some cases, too few sample plots were available to develop a model for each ALOS swath. 2) Chile: At a forest site in central Chile dominated by Pinus radiata plantations, the synergy of ALOS PALSAR, Landsat, and small-footprint lidar is investigated for mapping forest growing stock volume and canopy height. Canopy height models with 1 m pixel size, generated from first/last-return lidar data, were used to produce surrogate sampling plots for upscaling stand-level inventory measurements to wall-to-wall maps with the aid of multi-temporal ALOS and Landsat data. The lidar data allowed the estimation of volume and canopy height with high accuracy: 23 % error for volume and 7 % for height. Using the lidar estimates as surrogate training data for models relating ALOS backscatter to volume and height, we obtained retrieval errors of ~60 % for volume and 31 % for height when using only one ALOS FBD image. Significant improvements were achieved by 1) using three ALOS images for the retrieval (50 % error for volume and 26 % for height) and 2) also including Landsat data (42 % error for volume and 20 % for height).
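    The ground-data approach described for Mexico can be illustrated with a short Random Forest regression sketch: radar backscatter and VCF tree cover as predictors of plot biomass, evaluated with RMSE and R². The synthetic arrays below stand in for the in situ plots; the authors' actual workflow and variables may differ.

      # Illustrative Random Forest biomass regression under assumed, synthetic inputs.
      import numpy as np
      from sklearn.ensemble import RandomForestRegressor
      from sklearn.metrics import mean_squared_error, r2_score
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(42)
      n = 2000
      hh = rng.normal(-7.0, 1.5, n)      # PALSAR HH backscatter (dB), synthetic
      hv = rng.normal(-13.0, 1.5, n)     # PALSAR HV backscatter (dB), synthetic
      vcf = rng.uniform(0, 100, n)       # Landsat VCF tree cover (%), synthetic
      agb = 2.0 * vcf + 5.0 * (hv + 13.0) + rng.normal(0, 20, n)  # biomass (t/ha)

      X = np.column_stack([hh, hv, vcf])
      X_tr, X_te, y_tr, y_te = train_test_split(X, agb, test_size=0.3, random_state=1)
      rf = RandomForestRegressor(n_estimators=300, random_state=1).fit(X_tr, y_tr)
      pred = rf.predict(X_te)
      print("RMSE (t/ha):", mean_squared_error(y_te, pred) ** 0.5)
      print("R^2:", r2_score(y_te, pred))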

  5. Using an ultrasound elasticity microscope to map three-dimensional strain in a porcine cornea.

    PubMed

    Hollman, Kyle W; Shtein, Roni M; Tripathy, Sakya; Kim, Kang

    2013-08-01

    An ultrasound elasticity microscope was used to map 3-D strain volume in an ex vivo porcine cornea to illustrate its ability to measure the mechanical properties of this tissue. Mechanical properties of the cornea play an important role in its function and, therefore, also in ophthalmic diseases such as keratoconus and corneal ectasia. The ultrasound elasticity microscope combines a tightly focused high-frequency transducer with confocal scanning to produce high-quality speckle over the entire volume of tissue. This system and the analysis were able to generate volume maps of compressional strain in all three directions for porcine corneal tissue, more information than any previous study has reported. Strain volume maps indicated features of the cornea and mechanical behavior as expected. These results constitute a step toward better understanding of corneal mechanics and better treatment of corneal diseases.

  6. Interpreting The Unresolved Intensity Of Cosmologically Redshifted Line Radiation

    NASA Technical Reports Server (NTRS)

    Switzer, E. R.; Chang, T.-C.; Masui, K. W.; Pen, U.-L.; Voytek, T. C.

    2016-01-01

    Intensity mapping experiments survey the spectrum of diffuse line radiation rather than detect individual objects at high signal-to-noise ratio. Spectral maps of unresolved atomic and molecular line radiation contain three-dimensional information about the density and environments of emitting gas and efficiently probe cosmological volumes out to high redshift. Intensity mapping survey volumes also contain all other sources of radiation at the frequencies of interest. Continuum foregrounds are typically approximately 10²-10³ times brighter than the cosmological signal. The instrumental response to bright foregrounds will produce new spectral degrees of freedom that are not known in advance, nor necessarily spectrally smooth. The intrinsic spectra of foregrounds may also not be well known in advance. We describe a general class of quadratic estimators to analyze data from single-dish intensity mapping experiments and determine contaminated spectral modes from the data themselves. The key attribute of foregrounds is not that they are spectrally smooth, but instead that they have fewer bright spectral degrees of freedom than the cosmological signal. Spurious correlations between the signal and foregrounds produce additional bias. Compensation for signal attenuation must estimate and correct this bias. A successful intensity mapping experiment will control instrumental systematics that spread variance into new modes, and it must observe a large enough volume that contaminant modes can be determined independently from the signal on scales of interest.

  7. Conformal mapping in optical biosensor applications.

    PubMed

    Zumbrum, Matthew E; Edwards, David A

    2015-09-01

    Optical biosensors are devices used to investigate surface-volume reaction kinetics. Current mathematical models for reaction dynamics rely on the assumption of unidirectional flow within these devices. However, new devices, such as the Flexchip, include a geometry that introduces two-dimensional flow, complicating the depletion of the volume reactant. To account for this, a previous mathematical model is extended to include two-dimensional flow, and the Schwarz-Christoffel mapping is used to relate the physical device geometry to that for a device with unidirectional flow. Mappings for several Flexchip dimensions are considered, and the ligand depletion effect is investigated for one of these mappings. Estimated rate constants are produced for simulated data to quantify the effect of including two-dimensional flow in the mathematical model.

  8. Using remote sensing and GIS techniques to estimate discharge and recharge fluxes for the Death Valley regional groundwater flow system, USA

    USGS Publications Warehouse

    D'Agnese, F. A.; Faunt, C.C.; Turner, A.K.

    1996-01-01

    The recharge and discharge components of the Death Valley regional groundwater flow system were defined by remote sensing and GIS techniques that integrated disparate data types to develop a spatially complex representation of near-surface hydrological processes. Image classification methods were applied to multispectral satellite data to produce a vegetation map. This map provided a basis for subsequent evapotranspiration and infiltration estimations. The vegetation map was combined with ancillary data in a GIS to delineate different types of wetlands, phreatophytes and wet playa areas. Existing evapotranspiration-rate estimates were then used to calculate discharge volumes for these areas. A previously used empirical method of groundwater recharge estimation was modified by GIS methods to incorporate data describing soil-moisture conditions, and a recharge potential map was produced. These discharge and recharge maps were readily converted to data arrays for numerical modelling codes. Inverse parameter estimation techniques also used these data to evaluate the reliability and sensitivity of estimated values.

  9. Combining µXANES and µXRD mapping to analyse the heterogeneity in calcium carbonate granules excreted by the earthworm Lumbricus terrestris

    PubMed Central

    Brinza, Loredana; Schofield, Paul F.; Hodson, Mark E.; Weller, Sophie; Ignatyev, Konstantin; Geraki, Kalotina; Quinn, Paul D.; Mosselmans, J. Frederick W.

    2014-01-01

    The use of fluorescence full spectral micro-X-ray absorption near-edge structure (µXANES) mapping is becoming more widespread in the hard energy regime. This experimental method using the Ca K-edge combined with micro-X-ray diffraction (µXRD) mapping of the same sample has been enabled on beamline I18 at Diamond Light Source. This combined approach has been used to probe both long- and short-range order in calcium carbonate granules produced by the earthworm Lumbricus terrestris. In granules produced by earthworms cultured in a control artificial soil, calcite and vaterite are observed. However, granules produced by earthworms cultivated in the same artificial soil amended with 500 p.p.m. Mg also contain aragonite. The two techniques, µXRD and µXANES, probe different sample volumes but there is good agreement in the phase maps produced. PMID:24365942

  10. From field data to volumes: constraining uncertainties in pyroclastic eruption parameters

    NASA Astrophysics Data System (ADS)

    Klawonn, Malin; Houghton, Bruce F.; Swanson, Donald A.; Fagents, Sarah A.; Wessel, Paul; Wolfe, Cecily J.

    2014-07-01

    In this study, we aim to understand the variability in eruption volume estimates derived from field studies of pyroclastic deposits. We distributed paper maps of the 1959 Kīlauea Iki tephra to 101 volcanologists worldwide, who produced hand-drawn isopachs. Across the returned maps, uncertainty in isopach areas is 7 % across the well-sampled deposit but increases to over 30 % for isopachs that are governed by the largest and smallest thickness measurements. We fit the exponential, power-law, and Weibull functions through the isopach thickness versus area^1/2 values and find volume estimate variations up to a factor of 4.9 for a single map. Across all maps and methodologies, we find an average standard deviation for a total volume of s = 29 %. The volume uncertainties are largest for the most proximal (s = 62 %) and distal field (s = 53 %) and small for the densely sampled intermediate deposit (s = 8 %). For the Kīlauea Iki 1959 eruption, we find that the deposit beyond the 5-cm isopach contains only 2 % of the total erupted volume, whereas the near-source deposit contains 48 % and the intermediate deposit 50 % of the total volume. Thus, the relative uncertainty within each zone impacts the total volume estimates differently. The observed uncertainties for the different deposit regions in this study illustrate a fundamental problem of estimating eruption volumes: while some methodologies may provide better fits to the isopach data or rely on fewer free parameters, the main issue remains the predictive capabilities of the empirical functions for the regions where measurements are missing.
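    For readers unfamiliar with the volume step, the sketch below shows one common way such isopach data are integrated: fit an exponential thinning law T(A) = T0 exp(-k sqrt(A)) and integrate it analytically, giving V = 2 T0 / k^2. The thickness and area values are invented placeholders, not the Kīlauea Iki data, and the study also uses power-law and Weibull fits not shown here.

      # Exponential-thinning volume estimate from hypothetical isopach data.
      import numpy as np

      thickness_m = np.array([2.0, 1.0, 0.5, 0.2, 0.05])   # isopach thicknesses (m), made up
      area_km2 = np.array([0.5, 1.8, 4.0, 9.0, 30.0])      # enclosed areas (km^2), made up

      sqrt_a = np.sqrt(area_km2)
      slope, intercept = np.polyfit(sqrt_a, np.log(thickness_m), 1)
      k, T0 = -slope, np.exp(intercept)                    # thinning rate (1/km), extrapolated max thickness (m)
      volume_km3 = 2.0 * (T0 / 1000.0) / k**2              # convert T0 from m to km before integrating
      print(f"T0 = {T0:.2f} m, k = {k:.2f} 1/km, volume ≈ {volume_km3:.4f} km^3")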

  11. Maps for America: cartographic products of the U.S. Geological Survey and others

    USGS Publications Warehouse

    Thompson, Morris M.

    1988-01-01

    "Maps for America" was originally published in 1979 as a Centennial Volume commemorating the Geological Survey's hundred years of service (1879 - 1979) in the earth sciences. It was an eminently fitting Centennial Year publication, for, since its establishment, the Geological Survey has continuously carried on an extensive program of mapping to provide knowledge of the topography, geology, hydrology, and natural resources of our nation.This volume contains an organized presentation of information about the map produced by the Geological Survey and other American organizations, public and private. Such maps are important tools for those in government and in private endeavors who are working to assure the wisest choices in managing the Nation's resources. They are particularly supportive of the Department of the Interior's role as the Nation's principal conservation agency.The third edition of "Maps for America," like the second edition, is intended primarily to replenish the supply of copies of the book, but it also contains a number of changes to correct or update the text.

  12. Three-dimensional mapping of equiprobable hydrostratigraphic units at the Frenchman Flat Corrective Action Unit, Nevada Test Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shirley, C.; Pohlmann, K.; Andricevic, R.

    1996-09-01

    Geological and geophysical data are used with the sequential indicator simulation algorithm of Gomez-Hernandez and Srivastava to produce multiple, equiprobable, three-dimensional maps of informal hydrostratigraphic units at the Frenchman Flat Corrective Action Unit, Nevada Test Site. The upper 50 percent of the Tertiary volcanic lithostratigraphic column comprises the study volume. Semivariograms are modeled from indicator-transformed geophysical tool signals. Each equiprobable study volume is subdivided into discrete classes using the ISIM3D implementation of the sequential indicator simulation algorithm. Hydraulic conductivity is assigned within each class using the sequential Gaussian simulation method of Deutsch and Journel. The resulting maps show the contiguity of high and low hydraulic conductivity regions.

  13. Glove-talk II - a neural-network interface which maps gestures to parallel formant speech synthesizer controls.

    PubMed

    Fels, S S; Hinton, G E

    1997-01-01

    Glove-Talk II is a system which translates hand gestures to speech through an adaptive interface. Hand gestures are mapped continuously to ten control parameters of a parallel formant speech synthesizer. The mapping allows the hand to act as an artificial vocal tract that produces speech in real time. This gives an unlimited vocabulary in addition to direct control of fundamental frequency and volume. Currently, the best version of Glove-Talk II uses several input devices, a parallel formant speech synthesizer, and three neural networks. The gesture-to-speech task is divided into vowel and consonant production by using a gating network to weight the outputs of a vowel and a consonant neural network. The gating network and the consonant network are trained with examples from the user. The vowel network implements a fixed user-defined relationship between hand position and vowel sound and does not require any training examples from the user. Volume, fundamental frequency, and stop consonants are produced with a fixed mapping from the input devices. With Glove-Talk II, the subject can speak slowly but with far more natural sounding pitch variations than a text-to-speech synthesizer.

  14. Surface-Constrained Volumetric Brain Registration Using Harmonic Mappings

    PubMed Central

    Joshi, Anand A.; Shattuck, David W.; Thompson, Paul M.; Leahy, Richard M.

    2015-01-01

    In order to compare anatomical and functional brain imaging data across subjects, the images must first be registered to a common coordinate system in which anatomical features are aligned. Intensity-based volume registration methods can align subcortical structures well, but the variability in sulcal folding patterns typically results in misalignment of the cortical surface. Conversely, surface-based registration using sulcal features can produce excellent cortical alignment but the mapping between brains is restricted to the cortical surface. Here we describe a method for volumetric registration that also produces an accurate one-to-one point correspondence between cortical surfaces. This is achieved by first parameterizing and aligning the cortical surfaces using sulcal landmarks. We then use a constrained harmonic mapping to extend this surface correspondence to the entire cortical volume. Finally, this mapping is refined using an intensity-based warp. We demonstrate the utility of the method by applying it to T1-weighted magnetic resonance images (MRI). We evaluate the performance of our proposed method relative to existing methods that use only intensity information; for this comparison we compute the inter-subject alignment of expert-labeled sub-cortical structures after registration. PMID:18092736

  15. From field data to volumes: constraining uncertainties in pyroclastic eruption parameters

    USGS Publications Warehouse

    Klawonn, Malin; Houghton, Bruce F.; Swanson, Don; Fagents, Sarah A.; Wessel, Paul; Wolfe, Cecily J.

    2014-01-01

    In this study, we aim to understand the variability in eruption volume estimates derived from field studies of pyroclastic deposits. We distributed paper maps of the 1959 Kīlauea Iki tephra to 101 volcanologists worldwide, who produced hand-drawn isopachs. Across the returned maps, uncertainty in isopach areas is 7 % across the well-sampled deposit but increases to over 30 % for isopachs that are governed by the largest and smallest thickness measurements. We fit the exponential, power-law, and Weibull functions through the isopach thickness versus area^1/2 values and find volume estimate variations up to a factor of 4.9 for a single map. Across all maps and methodologies, we find an average standard deviation for a total volume of s = 29 %. The volume uncertainties are largest for the most proximal (s = 62 %) and distal field (s = 53 %) and small for the densely sampled intermediate deposit (s = 8 %). For the Kīlauea Iki 1959 eruption, we find that the deposit beyond the 5-cm isopach contains only 2 % of the total erupted volume, whereas the near-source deposit contains 48 % and the intermediate deposit 50 % of the total volume. Thus, the relative uncertainty within each zone impacts the total volume estimates differently. The observed uncertainties for the different deposit regions in this study illustrate a fundamental problem of estimating eruption volumes: while some methodologies may provide better fits to the isopach data or rely on fewer free parameters, the main issue remains the predictive capabilities of the empirical functions for the regions where measurements are missing.

  16. A Voxel-by-Voxel Comparison of Deformable Vector Fields Obtained by Three Deformable Image Registration Algorithms Applied to 4DCT Lung Studies.

    PubMed

    Fatyga, Mirek; Dogan, Nesrin; Weiss, Elizabeth; Sleeman, William C; Zhang, Baoshe; Lehman, William J; Williamson, Jeffrey F; Wijesooriya, Krishni; Christensen, Gary E

    2015-01-01

    Commonly used methods of assessing the accuracy of deformable image registration (DIR) rely on image segmentation or landmark selection. These methods are very labor intensive and thus limited to a relatively small number of image pairs. The direct voxel-by-voxel comparison can be automated to examine fluctuations in DIR quality on a long series of image pairs. A voxel-by-voxel comparison of three DIR algorithms applied to lung patients is presented. Registrations are compared by comparing volume histograms formed both with individual DIR maps and with a voxel-by-voxel subtraction of the two maps. When two DIR maps agree, one concludes that both maps are interchangeable in treatment planning applications, though one cannot conclude that either one agrees with the ground truth. If two DIR maps significantly disagree, one concludes that at least one of the maps deviates from the ground truth. We use the method to compare three DIR algorithms applied to peak inhale-peak exhale registrations of 4DFBCT data obtained from 13 patients. All three algorithms appear to be nearly equivalent when compared using DICE similarity coefficients. A comparison based on Jacobian volume histograms shows that all three algorithms measure changes in total volume of the lungs with reasonable accuracy, but show large differences in the variance of the Jacobian distribution on contoured structures. Analysis of voxel-by-voxel subtraction of DIR maps shows differences between algorithms that exceed a centimeter for some registrations. Deformation maps produced by DIR algorithms must be treated as mathematical approximations of physical tissue deformation that are not self-consistent and may thus be useful only in applications for which they have been specifically validated. The three algorithms tested in this work perform fairly robustly for the task of contour propagation, but produce potentially unreliable results for the task of DVH accumulation or measurement of local volume change. Performance of DIR algorithms varies significantly from one image pair to the next; hence, validation efforts that are exhaustive but performed on a small number of image pairs may not reflect the performance of the same algorithm in practical clinical situations. Such efforts should be supplemented by validation based on a longer series of images of clinical quality.
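    The voxel-by-voxel comparison the record describes can be sketched directly: subtract two deformation vector fields and histogram the magnitude of their disagreement, and compute the Jacobian determinant of one field as a local volume-change measure. The fields below are synthetic stand-ins for real DIR output, and unit voxel spacing is assumed.

      # Synthetic voxel-by-voxel DIR comparison and Jacobian volume-change sketch.
      import numpy as np

      shape = (64, 64, 64)
      dvf_a = np.random.normal(scale=2.0, size=shape + (3,))                 # DIR map A, displacement (mm)
      dvf_b = dvf_a + np.random.normal(scale=0.5, size=shape + (3,))         # DIR map B

      diff_mag = np.linalg.norm(dvf_a - dvf_b, axis=-1)                      # voxelwise disagreement (mm)
      hist, edges = np.histogram(diff_mag, bins=50)

      # Jacobian determinant of field A (identity + displacement gradient), voxel spacing = 1
      grads = [np.gradient(dvf_a[..., i]) for i in range(3)]                 # grads[i][j] = d u_i / d x_j
      J = np.stack([np.stack(g, axis=-1) for g in grads], axis=-2) + np.eye(3)
      jac_det = np.linalg.det(J)                                             # >1 expansion, <1 contraction
      print("mean |ΔDVF| (mm):", diff_mag.mean(), " mean Jacobian:", jac_det.mean())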

  17. Bathymetric surveys of Morse and Geist Reservoirs in central Indiana made with acoustic Doppler current profiler and global positioning system technology, 1996

    USGS Publications Warehouse

    Wilson, J.T.; Morlock, S.E.; Baker, N.T.

    1997-01-01

    Acoustic Doppler current profiler, global positioning system, and geographic information system technology were used to map the bathymetry of Morse and Geist Reservoirs, two artificial lakes used for public water supply in central Indiana. The project was a pilot study to evaluate the use of the technologies for bathymetric surveys. Bathymetric surveys were last conducted in 1978 on Morse Reservoir and in 1980 on Geist Reservoir; those surveys were done with conventional methods using networks of fathometer transects. The 1996 bathymetric surveys produced updated estimates of reservoir volumes that will serve as base-line data for future estimates of storage capacity and sedimentation rates.An acoustic Doppler current profiler and global positioning system receiver were used to collect water-depth and position data from April 1996 through October 1996. All water-depth and position data were imported to a geographic information system to create a data base. The geographic information system then was used to generate water-depth contour maps and to compute the volumes for each reservoir.The computed volume of Morse Reservoir was 22,820 acre-feet (7.44 billion gallons), with a surface area of 1,484 acres. The computed volume of Geist Reservoir was 19,280 acre-feet (6.29 billion gallons), with a surface area of 1,848 acres. The computed 1996 reservoir volumes are less than the design volumes and indicate that sedimentation has occurred in both reservoirs. Cross sections were constructed from the computer-generated surfaces for 1996 and compared to the fathometer profiles from the 1978 and 1980 surveys; analysis of these cross sections also indicates that some sedimentation has occurred in both reservoirs.The acoustic Doppler current profiler, global positioning system, and geographic information system technologies described in this report produced bathymetric maps and volume estimates more efficiently and with comparable or greater resolution than conventional bathymetry methods.
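    As a quick arithmetic check of the volume figures quoted above, the standard conversion 1 acre-foot ≈ 325,851 US gallons reproduces the reported gallon totals.

      # Acre-feet to billions of gallons for the two reservoirs.
      ACRE_FT_TO_GAL = 325_851

      for name, acre_ft in [("Morse Reservoir", 22_820), ("Geist Reservoir", 19_280)]:
          billions_gal = acre_ft * ACRE_FT_TO_GAL / 1e9
          print(f"{name}: {acre_ft} acre-ft ≈ {billions_gal:.2f} billion gallons")
      # Morse ≈ 7.44 and Geist ≈ 6.28 billion gallons, consistent with the reported
      # 7.44 and 6.29 given rounding of the acre-foot figures.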

  18. Maps for America: cartographic products of the U.S. Geological Survey and others

    USGS Publications Warehouse

    Thompson, Morris M.

    1981-01-01

    "Maps for America" was originally published in 1979 as a Centennial Volume commemorating the Geological Survey's hundred years of service (1879-1979) in the earth sciences. It was an eminently fitting Centennial Year publication, for, since its establishment, the Geological Survey has continuously carried on an extensive program of mapping to provide knowledge of the topography, geology, hydrology, and natural resources of our Nation. This volume contains an organized presentation of information about the maps produced by the Geological Survey and other American organizations, public and private. Such maps are important tools for those in government and in private endeavors who are working to assure the wisest choices in managing the Nation's resources. They are particularly supportive of the Department of the Interior's role as the Nation's principal conservation agency. The second edition of "Maps for America" is intended primarily to replenish the dwindling supply of copies of the book, but it also contains a number of changes to correct or update the text and to provide more suitable illustrations in certain instances.

  19. Stratifying FIA Ground Plots Using A 3-Year Old MRLC Forest Cover Map and Current TM Derived Variables Selected By "Decision Tree" Classification

    Treesearch

    Michael Hoppus; Stan Arner; Andrew Lister

    2001-01-01

    A reduction in variance for estimates of forest area and volume in the state of Connecticut was accomplished by stratifying FIA ground plots using raw, transformed and classified Landsat Thematic Mapper (TM) imagery. A US Geological Survey (USGS) Multi-Resolution Landscape Characterization (MRLC) vegetation cover map for Connecticut was used to produce a forest/non-...

  20. Glove-TalkII--a neural-network interface which maps gestures to parallel formant speech synthesizer controls.

    PubMed

    Fels, S S; Hinton, G E

    1998-01-01

    Glove-TalkII is a system which translates hand gestures to speech through an adaptive interface. Hand gestures are mapped continuously to ten control parameters of a parallel formant speech synthesizer. The mapping allows the hand to act as an artificial vocal tract that produces speech in real time. This gives an unlimited vocabulary in addition to direct control of fundamental frequency and volume. Currently, the best version of Glove-TalkII uses several input devices (including a Cyberglove, a ContactGlove, a three-space tracker, and a foot pedal), a parallel formant speech synthesizer, and three neural networks. The gesture-to-speech task is divided into vowel and consonant production by using a gating network to weight the outputs of a vowel and a consonant neural network. The gating network and the consonant network are trained with examples from the user. The vowel network implements a fixed user-defined relationship between hand position and vowel sound and does not require any training examples from the user. Volume, fundamental frequency, and stop consonants are produced with a fixed mapping from the input devices. One subject has trained to speak intelligibly with Glove-TalkII. He speaks slowly but with far more natural sounding pitch variations than a text-to-speech synthesizer.

  1. Direct Volume Rendering with Shading via Three-Dimensional Textures

    NASA Technical Reports Server (NTRS)

    VanGelder, Allen; Kim, Kwansik

    1996-01-01

    A new and easy-to-implement method for direct volume rendering that uses 3D texture maps for acceleration, and incorporates directional lighting, is described. The implementation, called Voltx, produces high-quality images at nearly interactive speeds on workstations with hardware support for three-dimensional texture maps. Previously reported methods did not incorporate a light model, and did not address issues of multiple texture maps for large volumes. Our research shows that these extensions impact performance by about a factor of ten. Voltx supports orthographic, perspective, and stereo views. This paper describes the theory and implementation of this technique, and compares it to the shear-warp factorization approach. A rectilinear data set is converted into a three-dimensional texture map containing color and opacity information. Quantized normal vectors and a lookup table provide efficiency. A new tessellation of the sphere is described, which serves as the basis for normal-vector quantization. A new gradient-based shading criterion is described, in which the gradient magnitude is interpreted in the context of the field-data value and the material classification parameters, and not in isolation. In the rendering phase, the texture map is applied to a stack of parallel planes, which effectively cut the texture into many slabs. The slabs are composited to form an image.
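    The final compositing step described above can be illustrated with a minimal CPU sketch of back-to-front "over" blending of view-aligned RGBA slabs. A Voltx-style renderer performs this in 3D texture hardware; the synthetic volume below is only illustrative.

      # Back-to-front "over" compositing of RGBA slices from a synthetic volume.
      import numpy as np

      rgba = np.random.rand(32, 64, 64, 4)      # 32 view-aligned slices, RGBA in [0, 1]
      rgba[..., 3] *= 0.1                        # low per-slice opacity

      image = np.zeros((64, 64, 3))
      for slice_rgba in rgba[::-1]:              # iterate back to front
          color, alpha = slice_rgba[..., :3], slice_rgba[..., 3:4]
          image = color * alpha + image * (1.0 - alpha)  # "over" operator
      print("composited image shape:", image.shape)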

  2. Automatic characterization and segmentation of human skin using three-dimensional optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Hori, Yasuaki; Yasuno, Yoshiaki; Sakai, Shingo; Matsumoto, Masayuki; Sugawara, Tomoko; Madjarova, Violeta; Yamanari, Masahiro; Makita, Shuichi; Yasui, Takeshi; Araki, Tsutomu; Itoh, Masahide; Yatagai, Toyohiko

    2006-03-01

    A set of fully automated algorithms that is specialized for analyzing a three-dimensional optical coherence tomography (OCT) volume of human skin is reported. The algorithm set first determines the skin surface of the OCT volume, and a depth-oriented algorithm provides the mean epidermal thickness, distribution map of the epidermis, and a segmented volume of the epidermis. Subsequently, an en face shadowgram is produced by an algorithm to visualize the infundibula in the skin with high contrast. The population and occupation ratio of the infundibula are provided by a histogram-based thresholding algorithm and a distance mapping algorithm. En face OCT slices at constant depths from the sample surface are extracted, and the histogram-based thresholding algorithm is again applied to these slices, yielding a three-dimensional segmented volume of the infundibula. The dermal attenuation coefficient is also calculated from the OCT volume in order to evaluate the skin texture. The algorithm set examines swept-source OCT volumes of the skins of several volunteers, and the results show the high stability, portability and reproducibility of the algorithm.

  3. Blood pressure, brain structure, and cognition: opposite associations in men and women.

    PubMed

    Cherbuin, Nicolas; Mortby, Moyra E; Janke, Andrew L; Sachdev, Perminder S; Abhayaratna, Walter P; Anstey, Kaarin J

    2015-02-01

    Research on associations between blood pressure, brain structure, and cognitive function has produced somewhat inconsistent results. In part, this may be due to differences in age ranges studied and because of sex differences in physiology and/or exposure to risk factors, which may lead to different time course or patterns in cardiovascular disease progression. The aim of this study was to investigate the impact of sex on associations between blood pressure, regional cerebral volumes, and cognitive function in older individuals. In this cohort study, brachial blood pressure was measured twice at rest in 266 community-based individuals free of dementia aged 68-73 years who had also undergone a brain scan and a neuropsychological assessment. Associations between mean blood pressure (MAP), regional brain volumes, and cognition were investigated with voxel-wise regression analyses. Positive associations between MAP and regional volumes were detected in men, whereas negative associations were found in women. Similarly, there were sex differences in the brain-volume cognition relationship, with a positive relationship between regional brain volumes associated with MAP in men and a negative relationship in women. In this cohort of older individuals, higher MAP was associated with larger regional volume and better cognition in men, whereas opposite findings were demonstrated in women. These effects may be due to different lifetime risk exposure or because of physiological differences between men and women. Future studies investigating the relationship between blood pressure and brain structure or cognitive function should evaluate the potential for differential sex effects.

  4. Shape information from a critical point analysis of calculated electron density maps: application to DNA-drug systems

    NASA Astrophysics Data System (ADS)

    Leherte, L.; Allen, F. H.; Vercauteren, D. P.

    1995-04-01

    A computational method is described for mapping the volume within the DNA double helix accessible to a groove-binding antibiotic, netropsin. Topological critical point analysis is used to locate maxima in electron density maps reconstructed from crystallographically determined atomic coordinates. The peaks obtained in this way are represented as ellipsoids with axes related to local curvature of the electron density function. Combining the ellipsoids produces a single electron density function which can be probed to estimate effective volumes of the interacting species. Close complementarity between host and ligand in this example shows the method to be a good representation of the electron density function at various resolutions, while at the atomic level the ellipsoid method gives results which are in close agreement with those from the conventional, spherical, van der Waals approach.

  5. Shape information from a critical point analysis of calculated electron density maps: Application to DNA-drug systems

    NASA Astrophysics Data System (ADS)

    Leherte, Laurence; Allen, Frank H.

    1994-06-01

    A computational method is described for mapping the volume within the DNA double helix accessible to the groove-binding antibiotic netropsin. Topological critical point analysis is used to locate maxima in electron density maps reconstructed from crystallographically determined atomic coordinates. The peaks obtained in this way are represented as ellipsoids with axes related to local curvature of the electron density function. Combining the ellipsoids produces a single electron density function which can be probed to estimate effective volumes of the interacting species. Close complementarity between host and ligand in this example shows the method to give a good representation of the electron density function at various resolutions. At the atomic level, the ellipsoid method gives results which are in close agreement with those from the conventional spherical van der Waals approach.

  6. Combining satellite imagery with forest inventory data to assess damage severity following a major blowdown event in northern Minnesota, USA

    Treesearch

    Mark D. Nelson; Sean P. Healey; W. Keith Moser; Mark H. Hansen

    2009-01-01

    Effects of a catastrophic blowdown event in northern Minnesota, USA were assessed using field inventory data, aerial sketch maps and satellite image data processed through the North American Forest Dynamics programme. Estimates were produced for forest area and net volume per unit area of live trees pre- and post-disturbance, and for changes in volume per unit area and...

  7. Using remote sensing and GIS techniques to estimate discharge and recharge fluxes for the Death Valley regional groundwater flow system, USA

    USGS Publications Warehouse

    D'Agnese, F. A.; Faunt, C.C.; Keith, Turner A.

    1996-01-01

    The recharge and discharge components of the Death Valley regional groundwater flow system were defined by remote sensing and GIS techniques that integrated disparate data types to develop a spatially complex representation of near-surface hydrological processes. Image classification methods were applied to multispectral satellite data to produce a vegetation map. This map provided a basis for subsequent evapotranspiration and infiltration estimations. The vegetation map was combined with ancillary data in a GIS to delineate different types of wetlands, phreatophytes and wet playa areas. Existing evapotranspiration-rate estimates were then used to calculate discharge volumes for these areas. A previously used empirical method of groundwater recharge estimation was modified by GIS methods to incorporate data describing soil-moisture conditions, and a recharge potential map was produced. These discharge and recharge maps were readily converted to data arrays for numerical modelling codes. Inverse parameter estimation techniques also used these data to evaluate the reliability and sensitivity of estimated values.

  8. CARETS: A prototype regional environmental information system. Volume 6: Cost, accuracy and consistency comparisons of land use maps made from high-altitude aircraft photography and ERTS imagery

    NASA Technical Reports Server (NTRS)

    Alexander, R. H. (Principal Investigator); Fitzpatrick, K. A.

    1975-01-01

    The author has identified the following significant results. Level 2 land use maps produced at three scales (1:24,000, 1:100,000, and 1:250,000) from high altitude photography were compared with each other and with point data obtained in the field. The same procedures were employed to determine the accuracy of the Level 1 land use maps produced at 1:250,000 from high altitude photography and color composite ERTS imagery. Accuracy of the Level 2 maps was 84.9 percent at 1:24,000, 77.4 percent at 1:100,000 and 73.0 percent at 1:250,000. Accuracy of the Level 1 1:250,000 maps was 76.5 percent for aerial photographs and 69.5 percent for ERTS imagery. The cost of Level 2 land use mapping at 1:24,000 was found to be high ($11.93 per sq km). Mapping at 1:100,000 ($1.75 per sq km) was about twice as expensive as mapping at 1:250,000 ($0.88 per sq km), yet the accuracy increased by only 4.4 percent.

  9. Long-term comparison of Kuparuk Watershed active layer maps, northern Alaska, USA

    NASA Astrophysics Data System (ADS)

    Nyland, K. E.; Queen, C.; Nelson, F. E.; Shiklomanov, N. I.; Streletskiy, D. A.; Klene, A. E.

    2017-12-01

    The active layer, or the uppermost soil horizon that thaws seasonally, is among the most dynamic components of the permafrost system. Evaluation of the thickness and spatial variation of the active layer is critical to many components of Arctic research, including climatology, ecology, environmental monitoring, and engineering. In this study we mapped active-layer thickness (ALT) across the 22,278 sq. km Kuparuk River basin on Alaska's North Slope throughout the summer of 2016. The Kuparuk River extends from the Brooks Range through the Arctic Foothills and across the Arctic Coastal Plain physiographic provinces, and drains into the Beaufort Sea. Methodology followed procedures used to produce an ALT map of the basin in 1995, accounting for the effects of topography, vegetation, topoclimate, and soils, using the same spatial sampling scheme for direct ALT and temperature measurement at representative locations and relating these parameters to vegetation-soil associations. A simple semi-empirical engineering solution was used to estimate thaw rates for the different associations. An improved lapse-rate formulation and a higher-resolution DEM were used to relate temperature to elevation. Three ALT maps were generated for the 2016 summer, combining measured thaw depth, temperature records, the 25 m ArcticDEM, high-resolution remotely sensed data, empirical lapse rates, and a topoclimatic index through the thaw solution. These maps were used to track the spatial progression of thaw through the 2016 summer season and estimate a total volume of thawed soil. Maps produced in this study were compared to the 1995 map to track areas of significant geographic change in patterns of ALT and total volume of thawed soil.

  10. A systematical analysis of in vivo contact forces on virtual catheter tip/tissue surface contact during cardiac mapping and intervention.

    PubMed

    Okumura, Yasuo; Johnson, Susan B; Bunch, T Jared; Henz, Benhur D; O'Brien, Christine J; Packer, Douglas L

    2008-06-01

    While catheter tip/tissue contact has been shown to be an important determinant of ablative lesions in in vitro studies, the impact of contact on the outcomes of mapping and ablation in the intact heart has not been evaluated. Twelve dogs underwent atrial ablation guided by the Sensei™ robotic catheter remote control system. After intracardiac ultrasound (ICE) validation of contact force measured by an in-line mechanical sensor, the relationship between contact force and individual lesion formation was established during irrigated-tip ablation (flow 17 mL/sec) at 15 watts for 30 seconds. Minimal contact by ICE correlated with a force of 4.7 +/- 5.8 grams, consistent contact with 9.9 +/- 8.6 grams, and tissue tenting produced 25.0 +/- 14.0 grams. Conversely, catheter tip/tissue contact by ICE was predicted by contact force. A contact force of 10-20 and > or =20 grams generated full-thickness, larger volume ablative lesions than those created with <10 grams (98 +/- 69 and 89 +/- 70 mm(3) vs 40 +/- 42 mm(3), P < 0.05). Moderate (10 grams) and marked contact (15-20 grams) application produced 1.5× greater electroanatomic map volumes than were seen with minimal contact (5 grams) (26 +/- 3 cm(3) vs 33 +/- 6, 39 +/- 3 cm(3), P < 0.05). The electroanatomic map/CT merge process was also more distorted when mapping was performed at moderate to marked contact force. This study shows that mapping and ablation using a robotic sheath guidance system are critically dependent on the generated force. These findings suggest that ablative lesion size is optimized by the application of 10-20 grams of contact force, although mapping requires lower-force application to avoid image distortions.

  11. Novel evidence that nitric oxide of the medial septal area influences the salivary secretion induced by pilocarpine.

    PubMed

    Saad, Wilson Abrão; Guarda, Ismael Francisco Motta Siqueira; Camargo, Luiz Antonio de Arruda; dos Santos, Talmir Augusto Faria Brisola; Saad, William Abrão; Simões, Sylvio; Guarda, Renata Saad

    2002-04-05

    Our studies focused on the effect of injection of L-NAME and sodium nitroprusside (SNP) on the salivary secretion, arterial blood pressure, sodium excretion and urinary volume induced by pilocarpine injected into the medial septal area (MSA). Rats were anesthetized with urethane (1.25 g/kg b. wt.) and a stainless steel cannula was implanted into their MSA. The amount of saliva secreted was studied over a five-minute period after injection of pilocarpine into the MSA. Injection of pilocarpine (10, 20, 40, 80, 160 microg/microl) into the MSA produced a dose-dependent increase in salivary secretion. L-NG-nitro arginine methyl ester (L-NAME) (40 microg/microl), a nitric oxide (NO) synthase inhibitor, injected into the MSA prior to pilocarpine increased the salivary secretion produced by pilocarpine. Sodium nitroprusside (30 microg/microl) injected into the MSA prior to pilocarpine attenuated the increase in salivary secretion induced by pilocarpine. Mean arterial pressure (MAP) increased after injection of pilocarpine into the MSA. L-NAME injected into the MSA prior to pilocarpine increased the MAP, whereas SNP injected into the MSA prior to pilocarpine attenuated the effect of pilocarpine on MAP. Pilocarpine (40 microg/microl) injected into the MSA induced an increase in sodium excretion and urinary volume. L-NAME injected prior to pilocarpine into the MSA increased the urinary sodium excretion and urinary volume induced by pilocarpine, whereas SNP injected prior to pilocarpine into the MSA decreased them. All these actions of pilocarpine depend on the release of nitric oxide in the MSA. We may also conclude that the MSA is involved in the cholinergic excitatory mechanism that induces salivary secretion and increases in MAP, sodium excretion and urinary volume.

  12. Mapping site index and volume increment from forest inventory, Landsat, and ecological variables in Tahoe National Forest, California, USA

    USGS Publications Warehouse

    Huang, Shengli; Ramirez, Carlos; Conway, Scott; Kennedy, Kama; Kohler, Tanya; Liu, Jinxun

    2016-01-01

    High-resolution site index (SI) and mean annual increment (MAI) maps are desired for local forest management. We integrated field inventory, Landsat, and ecological variables to produce 30 m SI and MAI maps for the Tahoe National Forest (TNF) where different tree species coexist. We converted species-specific SI using adjustment factors. Then, the SI map was produced by (i) intensifying plots to expand the training sets to more climatic, topographic, soil, and forest reflective classes, (ii) using results from a stepwise regression to enable a weighted imputation that minimized the effects of outlier plots within classes, and (iii) local interpolation and strata median filling to assign values to pixels without direct imputations. The SI (reference age is 50 years) map had an R2 of 0.7637, a root-mean-square error (RMSE) of 3.60, and a mean absolute error (MAE) of 3.07 m. The MAI map was similarly produced with an R2 of 0.6882, an RMSE of 1.73, and a MAE of 1.20 m3·ha−1·year−1. Spatial patterns and trends of SI and MAI were analyzed to be related to elevation, aspect, slope, soil productivity, and forest type. The 30 m SI and MAI maps can be used to support decisions on fire, plantation, biodiversity, and carbon.
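    The accuracy figures quoted above (R², RMSE, MAE) can be reproduced for any pair of observed and imputed values with standard metrics; the arrays below are placeholders, not the Tahoe National Forest data.

      # Map-accuracy metrics for paired observed/predicted values (placeholder data).
      import numpy as np
      from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score

      observed = np.array([24.0, 30.5, 27.2, 33.1, 21.8])    # e.g. field-measured site index (m)
      predicted = np.array([25.1, 29.0, 28.4, 31.0, 23.5])   # map-imputed site index (m)

      print("R^2 :", r2_score(observed, predicted))
      print("RMSE:", mean_squared_error(observed, predicted) ** 0.5)
      print("MAE :", mean_absolute_error(observed, predicted))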

  13. Airborne gamma-ray spectrometer and magnetometer survey, Durango A, B, C, and D, Colorado. Volume I. Detail area. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1983-01-01

    An airborne combined radiometric and magnetic survey was performed for the Department of Energy (DOE) over the Durango A, Durango B, Durango C, and Durango D Detail Areas of southwestern Colorado. The Durango A Detail Area is within the coverage of the Needle Mountains and Silverton 15' map sheets, and the Pole Creek Mountain, Rio Grande Pyramid, Emerald Lake, Granite Peak, Vallecito Reservoir, and Lemon Reservoir 7.5' map sheets of the National Topographic Map Series (NTMS). The Durango B Detail Area is within the coverage of the Silverton 15' map sheet and the Wetterhorn Peak, Uncompahgre Peak, Lake City, Redcloud Peak, Lake San Cristobal, Pole Creek Mountain, and Finger Mesa 7.5' map sheets of the NTMS. The Durango C Detail Area is within the coverage of the Platoro and Wolf Creek Pass 15' map sheets of the NTMS. The Durango D Detail Area is within the coverage of the Granite Lake, Cimarrona Peak, Bear Mountain, and Oakbrush Ridge 7.5' map sheets of the NTMS. Radiometric data were corrected for live time, aircraft and equipment background, cosmic background, atmospheric radon, Compton scatter, and altitude dependence. The corrected data were statistically evaluated, gridded, and contoured to produce maps of the radiometric variables, uranium, potassium, and thorium; their ratios; and the residual magnetic field. These maps have been analyzed in order to produce a multi-variant analysis contour map based on the radiometric response of the individual geological units. A geochemical analysis has been performed, using the radiometric and magnetic contour maps, the multi-variant analysis map, and factor analysis techniques, to produce a geochemical analysis map for the area.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buehler, Marc; Tartaglia, Michael; Tompkins, John

    The Mu2e experiment at Fermilab is designed to explore charged lepton flavor violation by searching for muon-to-electron conversion. The magnetic field generated by a system of solenoids is crucial for Mu2e and requires accurate characterization to detect any flaws and to produce a detailed field map. Stringent physics goals are driving magnetic field specifications for the Mu2e solenoids. A field mapper is being designed, which will produce detailed magnetic field maps. The uniform field region of the spectrometer volume requires the highest level of precision (1 Gauss per 1 Tesla). During commissioning, multiple magnetic field maps will be generated to verify proper alignment of all magnet coils, and to create the final magnetic field map. In order to design and build a precise field mapping system consisting of Hall and NMR probes, tolerances and precision for such a system need to be evaluated. In this paper we present a design for the Mu2e field mapping hardware, and discuss results from OPERA-3D simulations to specify parameters for Hall and NMR probes. We also present a fitting procedure for the analytical treatment of our expected magnetic measurements.

  15. MapFactory - Towards a mapping design pattern for big geospatial data

    NASA Astrophysics Data System (ADS)

    Rautenbach, Victoria; Coetzee, Serena

    2018-05-01

    With big geospatial data emerging, cartographers and geographic information scientists have to find new ways of dealing with the volume, variety, velocity, and veracity (4Vs) of the data. This requires the development of tools that allow processing, filtering, analysing, and visualising of big data through multidisciplinary collaboration. In this paper, we present the MapFactory design pattern that will be used for the creation of different maps according to the (input) design specification for big geospatial data. The design specification is based on elements from ISO19115-1:2014 Geographic information - Metadata - Part 1: Fundamentals that would guide the design and development of the map or set of maps to be produced. The results of the exploratory research suggest that the MapFactory design pattern will help with software reuse and communication. The MapFactory design pattern will aid software developers to build the tools that are required to automate map making with big geospatial data. The resulting maps would assist cartographers and others to make sense of big geospatial data.
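    The record describes a factory-style design pattern in which a design specification selects the map product to build. Below is a minimal, hypothetical Python sketch of such a MapFactory; class and field names are invented for illustration and are not taken from the paper.

      # Hypothetical factory pattern: a design spec selects which map product to create.
      from dataclasses import dataclass

      @dataclass
      class DesignSpec:
          theme: str      # e.g. "choropleth" or "heatmap"; stands in for ISO 19115-style metadata
          title: str

      class Map:
          def __init__(self, spec: DesignSpec) -> None:
              self.spec = spec
          def render(self) -> str:
              return f"{self.__class__.__name__}: {self.spec.title}"

      class ChoroplethMap(Map):
          pass

      class HeatMap(Map):
          pass

      class MapFactory:
          _registry = {"choropleth": ChoroplethMap, "heatmap": HeatMap}
          @classmethod
          def create(cls, spec: DesignSpec) -> Map:
              return cls._registry[spec.theme](spec)

      print(MapFactory.create(DesignSpec("heatmap", "Tweet density")).render())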

  16. NASA/BLM Applications Pilot Test (APT), phase 2. Volume 1: Executive summary. [vegetation mapping and production estimation in northwestern Arizona

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Data from LANDSAT, low altitude color aerial photography, and ground visits were combined and used to produce vegetation cover maps and to estimate productivity of range, woodland, and forest resources in northwestern Arizona. A planning session, two workshops, and four status reviews were held to assist technology transfer from NASA. Computer aided digital classification of LANDSAT data was selected as a major source of input data. An overview is presented of the data processing, data collection, productivity estimation, and map verification techniques used. Cost analysis and digital LANDSAT digital products are also considered.

  17. Application of laboratory reflectance spectroscopy to target and map expansive soils: example of the western Loiret, France

    NASA Astrophysics Data System (ADS)

    Hohmann, Audrey; Dufréchou, Grégory; Grandjean, Gilles; Bourguignon, Anne

    2014-05-01

    Swelling soils contain clay minerals that change volume with water content and cause extensive and expensive damage to infrastructure. Based on the spatial distribution of infrastructure damage and existing geological maps, the Bureau de Recherches Géologiques et Minières (BRGM, i.e. the French Geological Survey) published in 2010 a 1:50 000 swelling hazard map of France, indexing the territory as low, moderate, or high swelling risk. This study aims to use SWIR (1100-2500 nm) reflectance spectra of soils acquired under controlled laboratory conditions to estimate the swelling potential of soils and improve the swelling risk map of France. A total of 332 samples were collected west of Orléans (France) in various geological formations and swelling risk areas. Comparison of the swelling potential of soil samples with the swelling risk areas of the map shows several inconsistent associations, confirming the need to redraw the current swelling risk map of France. New swelling risk maps of the sampling area were produced from the soil samples using three interpolation methods. Maps produced using kriging and natural neighbour interpolation did not resolve discrete lithological units, introduced unsupported swelling risk zones, and did not appear useful for refining the swelling risk map of France. Voronoi polygons were also used to produce a map in which the swelling potential estimated from each sample was extrapolated to a polygon, so that every polygon is supported by field information. Of the methods tested here, Voronoi polygons thus appear the most suitable for producing expansive soil maps. However, polygon size is highly dependent on sample spacing, and a sample may not be representative of its entire polygon; more samples are therefore needed to provide a reliable map at the scale of the sampling area. Soils were also sampled along two sections with sampling intervals of ca. 260 m and ca. 50 m. A sampling interval of 50 m appears better suited to mapping the smallest lithological units. Several nearby samples indicating the same swelling potential are a good indication of a zone with constant swelling potential. The combination of the Voronoi method and a sampling interval of ca. 50 m appears well adapted to producing local swelling potential maps in areas where doubt remains or where infrastructure damage attributed to expansive soils is known.

  18. An appraisal of Indonesia's immense peat carbon stock using national peatland maps: uncertainties and potential losses from conversion.

    PubMed

    Warren, Matthew; Hergoualc'h, Kristell; Kauffman, J Boone; Murdiyarso, Daniel; Kolka, Randall

    2017-12-01

    A large proportion of the world's tropical peatlands occur in Indonesia where rapid conversion and associated losses of carbon, biodiversity and ecosystem services have brought peatland management to the forefront of Indonesia's climate mitigation efforts. We evaluated peat volume from two commonly referenced maps of peat distribution and depth published by Wetlands International (WI) and the Indonesian Ministry of Agriculture (MoA), and used regionally specific values of carbon density to calculate carbon stocks. Peatland extent and volume published in the MoA maps are lower than those in the WI maps, resulting in lower estimates of carbon storage. We estimate Indonesia's total peat carbon store to be between 13.6 GtC (the low MoA map estimate) and 40.5 GtC (the high WI map estimate) with a best estimate of 28.1 GtC: the midpoint of medium carbon stock estimates derived from WI (30.8 GtC) and MoA (25.3 GtC) maps. This estimate is about half that of previous assessments, which used an assumed average value of peat thickness for all Indonesian peatlands, and revises the current global tropical peat carbon pool to 75 GtC. Yet, these results do not diminish the significance of Indonesia's peatlands, which store an estimated 30% more carbon than the biomass of all Indonesian forests. The largest discrepancy between maps is for the Papua province, which accounts for 62-71% of the overall differences in peat area, volume and carbon storage. According to the MoA map, 80% of Indonesian peatlands are <300 cm thick and thus vulnerable to conversion outside of protected areas according to environmental regulations. The carbon contained in these shallower peatlands is conservatively estimated to be 10.6 GtC, equivalent to 42% of Indonesia's total peat carbon and about 12 years of global emissions from land use change at current rates. Considering the high uncertainties in peatland extent, volume and carbon storage revealed in this assessment of current maps, a systematic revision of Indonesia's peat maps to produce a single geospatial reference that is universally accepted would improve national peat carbon storage estimates and greatly benefit carbon cycle research, land use management and spatial planning.

  19. Forest resources of the United States, 1992

    Treesearch

    Douglas S. Powell; Joanne L. Faulkner; David R. Darr; Zhiliang Zhu; Douglas W. MacCleery

    1993-01-01

    The 1987 Resources Planning Act (RPA) Assessment forest resources statistics are updated to 1992, to provide current information on the Nation's forests. Resource tables present estimates of forest area, volume, mortality, growth, removals, and timber products output. Resource data are analyzed, and trends since 1987 are noted. A forest type map produced from...

  20. Gazetteer of planetary nomenclature 1994

    USGS Publications Warehouse

    Batson, Raymond M.; Russell, Joel F.

    1995-01-01

    Planetary nomenclature, like terrestrial nomenclature, is used to uniquely identify a feature on the surface of a planet or satellite so that the feature can be easily located, described, and discussed. This volume contains detailed information about all names of topographic and albedo features on planets and satellites (and some planetary ring and ring-gap systems) that the International Astronomical Union has named and approved from its founding in 1919 through its triennial meeting in 1994. This edition of the Gazetteer of Planetary Nomenclature supersedes an earlier informal volume distributed by the U.S. Geological Survey in 1986 as Open-File Report 84-692 (Masursky and others, 1986). Named features are depicted on maps of the Moon published first by the U.S. Defense Mapping Agency or the Aeronautical Chart and Information Center and more recently by the U.S. Geological Survey; on maps of Mercury, Venus, Mars, and the satellites of Jupiter, Saturn, and Uranus published by the U.S. Geological Survey; and on maps of the Moon, Venus, and Mars produced by the U.S.S.R. Although we have attempted to check the accuracy of all data in this volume, we realize that some errors will remain in a work of this size. Readers noting errors or omissions are urged to communicate them to the U.S. Geological Survey, Branch of Astrogeology, Rm. 409, 2255 N. Gemini Drive, Flagstaff, AZ 86001.

  1. Modeling and forecasting tephra hazards at Redoubt Volcano, Alaska, during 2009 unrest and eruption

    NASA Astrophysics Data System (ADS)

    Mastin, L. G.; Denlinger, R. P.; Wallace, K. L.; Schaefer, J. R.

    2009-12-01

    In late 2008, Redoubt Volcano, on the west coast of Alaska’s Cook Inlet, began a period of unrest that culminated in more than 19 small tephra-producing events between March 19 and April 4, 2009, followed by growth of a lava dome whose volume now exceeds 70 million cubic meters. The explosive events lasted from <1 to 31 minutes, sent tephra columns to heights of 19 km asl, and emitted dense-rock (DRE) tephra volumes up to several million cubic meters. Tephra fall affected transportation and infrastructure throughout Cook Inlet, including the Anchorage metropolitan area. The months of unrest that preceded the first explosive event allowed us to develop tools to forecast tephra hazards. As described in an accompanying abstract, colleagues at the University of Pisa produced automated, daily tephra-fall forecast maps using the 3-D VOL-CALPUFF model with input scenarios that represented likely event sizes and durations. Tephra-fall forecast maps were also generated every six hours for hypothetical events of 10M m^3 volume DRE using the 2-D model ASHFALL, and relationships between hypothetical plume height and eruption rate were evaluated four times daily under then-current atmospheric conditions using the program PLUMERIA. Eruptive deposits were mapped and isomass contours constructed for the two largest events, March 24 (0340-0355Z) and April 4 (1358-1429Z), which produced radar-determined plume heights of 18.3 and 15.2 km asl (~15.6 and 12.5 km above the vent), and tephra volumes (DRE) of 6.3M and 3.1M m^3, respectively. For the volumetric eruption rates calculated from mapped erupted volume and seismic duration (V = 6.2×10^3 and 1.7×10^3 m^3/s DRE), measured plume heights H above the vent fall within 10% of the empirical best-fit curve H = 1.67V^0.259 published in the book Volcanic Plumes by Sparks et al. (1997, eq. 5.1). The plume heights are slightly higher than (but still within 13% of) the 14.6 and 11.1 km predicted by PLUMERIA under the existing atmospheric conditions. We have also modeled these two events using the 3-D transient model FALL3D, which considers topographic effects on wind and tephra dispersal. Using the eruption rates and plume heights constrained by deposit mapping, seismic data, and Doppler radar, and an archived wind field obtained from the NOAA GDAS model for these dates, modeled isomass contours from the April 4 event closely resemble measured values, but modeled contours from the March 24 event extend only about half to three fourths as far from the volcano as measured. This discrepancy may result from inaccuracies in the modeled wind pattern, the grain-size distribution, or turbulent entrainment algorithms. The deposit pattern may also have been affected by a lateral blast which is thought to have accompanied this event.
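
    As a quick check of the empirical relation quoted above, a minimal sketch (assuming H in km above the vent and V in m^3/s DRE, as in the abstract):

      # Empirical best-fit curve cited in the abstract: H = 1.67 * V**0.259
      # (Sparks et al., 1997, eq. 5.1), H in km above the vent, V in m^3/s DRE.
      def plume_height_km(eruption_rate_m3_s: float) -> float:
          return 1.67 * eruption_rate_m3_s ** 0.259

      # Eruption rates reported for the March 24 and April 4, 2009 events.
      for label, rate in [("March 24", 6.2e3), ("April 4", 1.7e3)]:
          print(f"{label}: V = {rate:.1e} m^3/s DRE -> H ~ {plume_height_km(rate):.1f} km above vent")
      # ~16.0 km and ~11.5 km, within ~10% of the radar-derived 15.6 and 12.5 km above the vent.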

  2. Timber Volume and Biomass Estimates in Central Siberia from Satellite Data

    NASA Technical Reports Server (NTRS)

    Ranson, K. Jon; Kimes, Daniel S.; Kharuk, Vyetcheslav I.

    2007-01-01

    Mapping of boreal forest type, structural parameters, and biomass is critical for understanding the boreal forest's significance in the carbon cycle and its response to, and impact on, global climate change. The biggest deficiency of the existing ground-based forest inventories is the uncertainty in the inventory data, particularly in remote areas of Siberia where sampling is sparse or lacking and often decades old. Remote sensing methods can help overcome these problems. In this joint US and Russian study, we used the Moderate Resolution Imaging Spectroradiometer (MODIS) and unique waveform data of the Geoscience Laser Altimeter System (GLAS) to produce a map of timber volume for a 10° x 12° area in Central Siberia. Using these methods, the mean timber volume for the forested portion of the total study area was 203 m3/ha. The new remote sensing methods used in this study provide a truly independent estimate of forest structure that does not depend on traditional ground forest inventory methods.

  3. Connection between encounter volume and diffusivity in geophysical flows

    NASA Astrophysics Data System (ADS)

    Rypina, Irina I.; Smith, Stefan G. Llewellyn; Pratt, Larry J.

    2018-04-01

    Trajectory encounter volume - the volume of fluid that passes close to a reference fluid parcel over some time interval - has been recently introduced as a measure of mixing potential of a flow. Diffusivity is the most commonly used characteristic of turbulent diffusion. We derive the analytical relationship between the encounter volume and diffusivity under the assumption of an isotropic random walk, i.e., diffusive motion, in one and two dimensions. We apply the derived formulas to produce maps of encounter volume and the corresponding diffusivity in the Gulf Stream region of the North Atlantic based on satellite altimetry, and discuss the mixing properties of Gulf Stream rings. Advantages offered by the derived formula for estimating diffusivity from oceanographic data are discussed, as well as applications to other disciplines.
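
    As a rough, hypothetical illustration of the encounter-volume concept defined above (not the paper's analytical formula), one can estimate the encounter area of a two-dimensional isotropic random walk by Monte Carlo: seed tracers at a known number density, let them and a reference parcel diffuse with a prescribed diffusivity, and count how many distinct tracers pass within a small radius of the reference parcel. All parameter values below are assumptions.

      import numpy as np

      rng = np.random.default_rng(0)

      kappa = 1.0          # diffusivity (m^2/s), assumed
      dt, n_steps = 0.1, 2000
      density = 5.0        # tracer number density (particles per m^2), assumed
      half_width = 20.0    # initial seeding domain half-width (m)
      radius = 0.5         # "close encounter" radius (m)

      n_tracers = int(density * (2 * half_width) ** 2)
      tracers = rng.uniform(-half_width, half_width, size=(n_tracers, 2))
      reference = np.zeros(2)
      encountered = np.zeros(n_tracers, dtype=bool)

      step_std = np.sqrt(2.0 * kappa * dt)   # per-axis step of a diffusive random walk
      for _ in range(n_steps):
          tracers += rng.normal(0.0, step_std, size=tracers.shape)
          reference += rng.normal(0.0, step_std, size=2)
          encountered |= np.linalg.norm(tracers - reference, axis=1) < radius

      # Encounter "volume" (an area in 2-D): distinct encountered tracers / number density.
      print(f"encounter area ~ {encountered.sum() / density:.1f} m^2 over {n_steps * dt:.0f} s")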

  4. Snow water equivalent mapping in Norway

    NASA Astrophysics Data System (ADS)

    Tveito, O. E.; Udnæs, H.-C.; Engeset, R.; Førland, E. J.; Isaksen, K.; Mengistu, Z.

    2003-04-01

    In high-latitude areas snow covers the ground for large parts of the year. Information about the water volume stored as snow is of major importance in many respects. Flood forecasters at NVE need it to assess possible flood risks, hydropower producers need it to plan the most efficient use of the water in their reservoirs, and traders need it to estimate the potential energy available for the market. Meteorologists, for their part, use the information as boundary conditions in weather forecasting models. The Norwegian Meteorological Institute has provided snow accumulation maps for Norway for more than 50 years. These maps are now produced twice a month in the winter season. They show the accumulated precipitation in the winter season from the day the permanent snow cover is established. They do not, however, take melting into account and therefore do not give a good description of the actual snow amounts during and after periods of snowmelt. Due to an increased need for a direct measure of the water volume stored as snow, met.no and NVE initiated a joint project to establish maps of the actual snow cover expressed in water equivalents. The project utilizes recent developments in the use of GIS in spatial modeling. Daily precipitation and temperature are distributed in space using objective spatial interpolation methods. The interpolation considers topographical and other geographical parameters as well as weather-type information. A degree-day model is used at each modeling point to calculate snow accumulation and snowmelt. The maps represent a spatial scale of 1x1 km2. The modeled snow reservoir is validated against snow pillow values as well as traditional snow depth observations. Preliminary results show that the new snow modeling approach reproduces the snow water equivalent well. The spatial approach also opens for wide use in areal analysis.
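
    The degree-day step at a single grid point can be sketched as follows; this is a simplified, hypothetical implementation with an assumed temperature threshold and degree-day factor, not met.no/NVE's operational model.

      def update_swe(swe_mm: float, precip_mm: float, temp_c: float,
                     t_threshold_c: float = 0.0, ddf_mm_per_degday: float = 3.0) -> float:
          """One daily step of a simple degree-day snow model at a grid point.

          Precipitation falling at or below the temperature threshold accumulates as snow;
          melt is proportional to positive degree-days. Threshold and degree-day factor
          (DDF) are assumed, illustrative values.
          """
          if temp_c <= t_threshold_c:
              swe_mm += precip_mm                      # snowfall adds to snow water equivalent
          melt_mm = ddf_mm_per_degday * max(temp_c - t_threshold_c, 0.0)
          return max(swe_mm - melt_mm, 0.0)            # SWE cannot become negative

      # Usage: a cold, snowy day followed by a mild day.
      swe = update_swe(swe_mm=120.0, precip_mm=15.0, temp_c=-4.0)   # -> 135 mm
      swe = update_swe(swe_mm=swe, precip_mm=0.0, temp_c=5.0)       # -> 120 mm
      print(swe)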

  5. USGS Map-on-Demand Printing

    USGS Publications Warehouse

    ,

    1999-01-01

    Currently, the U.S. Geological Survey (USGS) uses conventional lithographic printing techniques to produce paper copies of most of its mapping products. This practice is not economical for those products that are in low demand. With the advent of newer technologies, high-speed, large-format printers have been coupled with innovative computer software to turn digital map data into a printed map. It is now possible to store and retrieve data from vast geospatial data bases and print a map on an as-needed basis; that is, print on demand, thereby eliminating the need to warehouse an inventory of paper maps for which there is low demand. Using print-on-demand technology, the USGS is implementing map-on-demand (MOD) printing for certain infrequently requested maps. By providing MOD, the USGS can offer an alternative to traditional, large-volume printing and can improve its responsiveness to customers by giving them greater access to USGS scientific data in a format that otherwise might not be available.

  6. Fully automatized renal parenchyma volumetry using a support vector machine based recognition system for subject-specific probability map generation in native MR volume data

    NASA Astrophysics Data System (ADS)

    Gloger, Oliver; Tönnies, Klaus; Mensel, Birger; Völzke, Henry

    2015-11-01

    In epidemiological studies as well as in clinical practice the amount of produced medical image data strongly increased in the last decade. In this context organ segmentation in MR volume data gained increasing attention for medical applications. Especially in large-scale population-based studies organ volumetry is highly relevant requiring exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automatized methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework of a two-stepped probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are refined subsequently by using several, extended segmentation strategies. We present a three class-based support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high quality subject-specific parenchyma probability maps. Several refinement strategies including a final shape-based 3D level set segmentation technique are used in subsequent processing modules to segment renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from parenchymal volume, which is important to analyze renal functions. Volume errors and Dice coefficients show that our presented framework outperforms existing approaches.
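
    The shape-feature idea mentioned above can be sketched with Fourier descriptors of a closed 2-D contour fed to a support vector machine. This is a generic, two-class toy example standing in for the paper's three-class parenchyma recognition system; the descriptor normalization, contour generator, and SVM settings are assumptions.

      import numpy as np
      from sklearn.svm import SVC


      def fourier_descriptors(contour_xy: np.ndarray, n_coeffs: int = 10) -> np.ndarray:
          """Translation- and scale-normalized Fourier descriptor magnitudes of a closed contour."""
          z = contour_xy[:, 0] + 1j * contour_xy[:, 1]   # contour as a complex signal
          coeffs = np.fft.fft(z - z.mean())              # subtracting the mean removes translation
          mags = np.abs(coeffs[1:n_coeffs + 1])          # magnitudes ignore rotation/start point
          return mags / (mags[0] + 1e-12)                # normalize by the first harmonic (scale)


      # Toy training set: smooth ellipses vs. lobed contours standing in for two shape classes.
      rng = np.random.default_rng(1)
      t = np.linspace(0, 2 * np.pi, 128, endpoint=False)


      def make_contour(lobes: int, noise: float) -> np.ndarray:
          r = 1.0 + 0.3 * np.cos(lobes * t) + noise * rng.normal(size=t.size)
          return np.column_stack([r * np.cos(t), 0.6 * r * np.sin(t)])


      X = np.array([fourier_descriptors(make_contour(lobes, 0.02))
                    for lobes in (0, 4) for _ in range(30)])
      y = np.array([0] * 30 + [1] * 30)

      clf = SVC(kernel="rbf", C=1.0).fit(X, y)
      print("predicted class of a new lobed contour:",
            clf.predict([fourier_descriptors(make_contour(4, 0.02))])[0])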

  7. A framework to analyze cerebral mean diffusivity using surface guided diffusion mapping in diffusion tensor imaging

    PubMed Central

    Kwon, Oh-Hun; Park, Hyunjin; Seo, Sang-Won; Na, Duk L.; Lee, Jong-Min

    2015-01-01

    The mean diffusivity (MD) value has been used to describe microstructural properties in Diffusion Tensor Imaging (DTI) in cortical gray matter (GM). Recently, researchers have applied a cortical surface generated from the T1-weighted volume. When the DTI data are analyzed using the cortical surface, it is important to assign an accurate MD value from the volume space to the vertex of the cortical surface, considering the anatomical correspondence between the DTI and the T1-weighted image. Previous studies usually sampled the MD value using the nearest-neighbor (NN) method or Linear method, even though there are geometric distortions in diffusion-weighted volumes. Here we introduce a Surface Guided Diffusion Mapping (SGDM) method to compensate for such geometric distortions. We compared our SGDM method with results using NN and Linear methods by investigating differences in the sampled MD value. We also projected the tissue classification results of non-diffusion-weighted volumes to the cortical midsurface. The CSF probability values provided by the SGDM method were lower than those produced by the NN and Linear methods. The MD values provided by the NN and Linear methods were significantly greater than those of the SGDM method in regions suffering from geometric distortion. These results indicate that the NN and Linear methods assigned the MD value in the CSF region to the cortical midsurface (GM region). Our results suggest that the SGDM method is an effective way to correct such mapping errors. PMID:26236180
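
    The sampling step being compared can be illustrated generically: given vertex coordinates expressed in voxel space, nearest-neighbor versus trilinear interpolation of a volume at those vertices. The sketch below uses scipy's map_coordinates on synthetic data; SGDM itself additionally corrects for geometric distortion, which is not shown here.

      import numpy as np
      from scipy.ndimage import map_coordinates

      rng = np.random.default_rng(0)
      md_volume = rng.random((40, 48, 36))                # stand-in MD volume (z, y, x)
      vertices_vox = rng.uniform(1, 34, size=(1000, 3))   # surface vertices in voxel coordinates

      # map_coordinates expects one row per volume axis, hence the transpose.
      coords = vertices_vox.T

      md_nearest = map_coordinates(md_volume, coords, order=0)  # nearest-neighbor sampling
      md_linear = map_coordinates(md_volume, coords, order=1)   # trilinear sampling

      print("mean |NN - linear| per vertex:", float(np.abs(md_nearest - md_linear).mean()))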

  8. Fully automatized renal parenchyma volumetry using a support vector machine based recognition system for subject-specific probability map generation in native MR volume data.

    PubMed

    Gloger, Oliver; Tönnies, Klaus; Mensel, Birger; Völzke, Henry

    2015-11-21

    In epidemiological studies as well as in clinical practice the amount of produced medical image data strongly increased in the last decade. In this context organ segmentation in MR volume data gained increasing attention for medical applications. Especially in large-scale population-based studies organ volumetry is highly relevant requiring exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automatized methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework of a two-stepped probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are refined subsequently by using several, extended segmentation strategies. We present a three class-based support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high quality subject-specific parenchyma probability maps. Several refinement strategies including a final shape-based 3D level set segmentation technique are used in subsequent processing modules to segment renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from parenchymal volume, which is important to analyze renal functions. Volume errors and Dice coefficients show that our presented framework outperforms existing approaches.

  9. A comparison of FIA plot data derived from image pixels and image objects

    Treesearch

    Charles E. Werstak

    2012-01-01

    The use of Forest Inventory and Analysis (FIA) plot data for producing continuous and thematic maps of forest attributes (e.g., forest type, canopy cover, volume, and biomass) at the regional level from satellite imagery can be challenging due to differences in scale. Specifically, classification errors that may result from assumptions made between what the field data...

  10. Airborne gamma-ray spectrometer and magnetometer survey, Durango B, Colorado. Final report Volume II C. Detail area

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1983-01-01

    This volume contains eight appendices: flight line maps, geology maps, explanation of geologic legend, flight line/geology maps, radiometric contour maps, magnetic contour maps, multi-variant analysis maps, and geochemical factor analysis maps. These appendices pertain to the Durango B detail area.

  11. The informatics of a C57BL/6J mouse brain atlas.

    PubMed

    MacKenzie-Graham, Allan; Jones, Eagle S; Shattuck, David W; Dinov, Ivo D; Bota, Mihail; Toga, Arthur W

    2003-01-01

    The Mouse Atlas Project (MAP) aims to produce a framework for organizing and analyzing the large volumes of neuroscientific data produced by the proliferation of genetically modified animals. Atlases provide an invaluable aid in understanding the impact of genetic manipulations by providing a standard for comparison. We use a digital atlas as the hub of an informatics network, correlating imaging data, such as structural imaging and histology, with text-based data, such as nomenclature, connections, and references. We generated brain volumes using magnetic resonance microscopy (MRM), classical histology, and immunohistochemistry, and registered them into a common and defined coordinate system. Specially designed viewers were developed in order to visualize multiple datasets simultaneously and to coordinate between textual and image data. Researchers can navigate through the brain interchangeably, in either a text-based or image-based representation that automatically updates information as they move. The atlas also allows the independent entry of other types of data, the facile retrieval of information, and the straight-forward display of images. In conjunction with centralized servers, image and text data can be kept current and can decrease the burden on individual researchers' computers. A comprehensive framework that encompasses many forms of information in the context of anatomic imaging holds tremendous promise for producing new insights. The atlas and associated tools can be found at http://www.loni.ucla.edu/MAP.

  12. Airborne gamma-ray spectrometer and magnetometer survey, Durango D, Colorado. Final report Volume II B. Detail area

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1983-01-01

    This volume comprises eight appendices containing the following information for the Durango D detail area: flight line maps, geology maps, explanation of geologic legend, flight line/geology maps, radiometric contour maps, magnetic contour maps, multi-variant analysis maps, and geochemical factor analysis maps.

  13. Airborne gamma-ray spectrometer and magnetometer survey, Durango C, Colorado. Final report Volume II B. Detail area

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1983-01-01

    This volume comprises eight appendices containing the following information for the Durango C detail area: flight line maps, geology maps, explanation of geologic legend, flight line/geology maps, radiometric contour maps, magnetic contour maps, multi-variant analysis maps, and geochemical factor analysis maps.

  14. Vector Doppler: spatial sampling analysis and presentation techniques for real-time systems

    NASA Astrophysics Data System (ADS)

    Capineri, Lorenzo; Scabia, Marco; Masotti, Leonardo F.

    2001-05-01

    The aim of the vector Doppler (VD) technique is the quantitative reconstruction of a velocity field independently of the angle between the ultrasonic probe axis and the flow. In particular, vector Doppler is interesting for studying vascular pathologies related to complex blood flow conditions. Clinical applications require a real-time operating mode and the capability to perform Doppler measurements over a defined volume. The combination of these two characteristics produces a real-time vector velocity map. In previous works the authors investigated the theory of pulsed wave (PW) vector Doppler and developed an experimental system capable of producing off-line 3D vector velocity maps. Afterwards, for producing dynamic velocity vector maps, we realized a new 2D vector Doppler system based on a modified commercial echograph. The measurement and presentation of a vector velocity field requires a correct spatial sampling that must satisfy the Shannon criterion. In this work we tackled this problem, establishing a relationship between sampling steps and scanning system characteristics. Another problem posed by the vector Doppler technique is the real-time data representation, which should be easy for the physician to interpret. With this in mind we attempted a multimedia solution that uses both interpolated images and sound to represent the information of the measured vector velocity map. These presentation techniques were tested in real-time scanning on flow phantoms and in preliminary in vivo measurements on a human carotid artery.

  15. Ischemic lesion volume determination on diffusion weighted images vs. apparent diffusion coefficient maps.

    PubMed

    Bråtane, Bernt Tore; Bastan, Birgul; Fisher, Marc; Bouley, James; Henninger, Nils

    2009-07-07

    Though diffusion weighted imaging (DWI) is frequently used for identifying the ischemic lesion in focal cerebral ischemia, the understanding of spatiotemporal evolution patterns observed with different analysis methods remains imprecise. DWI and calculated apparent diffusion coefficient (ADC) maps were serially obtained in rat stroke models (MCAO): permanent, 90 min, and 180 min temporary MCAO. Lesion volumes were analyzed in a blinded and randomized manner by 2 investigators using (i) a previously validated ADC threshold, (ii) visual determination of hypointense regions on ADC maps, and (iii) visual determination of hyperintense regions on DWI. Lesion volumes were correlated with 24 hour 2,3,5-triphenyltetrazolium chloride (TTC)-derived infarct volumes. TTC-derived infarct volumes were not significantly different from the ADC and DWI-derived lesion volumes at the last imaging time points except for significantly smaller DWI lesions in the pMCAO model (p=0.02). TTC-derived infarct volumes also correlated significantly more strongly with lesion volumes derived from the last imaging time point on ADC maps than with those on DWI (p<0.05). Following reperfusion, lesion volumes on the ADC maps were significantly reduced, but no change was observed on DWI. Visually determined lesion volumes on ADC maps and DWI by both investigators correlated significantly with threshold-derived lesion volumes on ADC maps, with the former method demonstrating a stronger correlation. There was also better interrater agreement for ADC map analysis than for DWI analysis. Ischemic lesion determination by ADC was more accurate in final infarct prediction, rater-independent, and provided exclusive information on ischemic lesion reversibility.
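
    The threshold-based volumetry referred to above reduces to counting voxels at or below an ADC cutoff and multiplying by the voxel volume. A minimal sketch; the threshold and voxel dimensions below are placeholders, not the validated values from the study.

      import numpy as np


      def lesion_volume_mm3(adc_map: np.ndarray, voxel_volume_mm3: float,
                            adc_threshold: float) -> float:
          """Lesion volume: voxels at or below the ADC threshold times the voxel volume."""
          return float(np.count_nonzero(adc_map <= adc_threshold) * voxel_volume_mm3)


      # Toy example: a synthetic ADC map (x10^-3 mm^2/s) with a low-ADC "lesion" block.
      adc = np.full((64, 64, 12), 0.80)
      adc[20:30, 20:35, 4:8] = 0.45                    # hypothetical ischemic region
      voxel_mm3 = 0.25 * 0.25 * 1.0                    # assumed 0.25 mm in-plane, 1 mm slices
      print(lesion_volume_mm3(adc, voxel_mm3, adc_threshold=0.53), "mm^3")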

  16. Coronelli, Vincenzo Maria (1650-1718)

    NASA Astrophysics Data System (ADS)

    Murdin, P.

    2000-11-01

    Cosmographer, from Venice. Produced globes of the Earth and of the heavens. Made two beautiful globes for Louis XIV, 3.9 m in diameter, the biggest in the world until the present century, and which are currently kept in the palace of Versailles. Designed hundreds of maps and published the first six of 45 volumes of the first encyclopedia in alphabetical order, the Biblioteca Universale Sacro-Prof...

  17. Mapping from Space - Ontology Based Map Production Using Satellite Imageries

    NASA Astrophysics Data System (ADS)

    Asefpour Vakilian, A.; Momeni, M.

    2013-09-01

    Determining the maximum feature-extraction capability of satellite imagery, based on an ontology procedure using cartographic feature determination, is the main objective of this research. Therefore, a special ontology has been developed to extract the maximum volume of information available in different high resolution satellite imageries and compare it to the map information layers required at each specific scale according to the unified specification for surveying and mapping. Ontology seeks to provide an explicit and comprehensive classification of entities in all spheres of being. This study proposes a new method for automatic maximum map feature extraction and reconstruction from high resolution satellite images. For example, in order to extract building blocks to produce maps at 1 : 5000 scale and smaller, the road networks located around the building blocks should be determined. Thus, a new building index has been developed based on concepts obtained from the ontology. Building blocks have been extracted with a completeness of about 83%. Then, road networks have been extracted and reconstructed to create a uniform network with less discontinuity. In this case, building blocks have been extracted with good performance and the false positive value from the confusion matrix was reduced by about 7%. Results showed that vegetation cover and water features have been extracted completely (100%) and about 71% of limits have been extracted. Also, the proposed method had the ability to produce a map at the largest scale possible, equal to or smaller than 1 : 5000, from any multi spectral high resolution satellite imagery.

  19. Integration of In Situ Radon Modeling with High Resolution Aerial Remote Sensing for Mapping and Quantifying Local to Regional Flow and Transport of Submarine Groundwater Discharge from Coastal Aquifers

    NASA Astrophysics Data System (ADS)

    Glenn, C. R.; Kennedy, J. J.; Dulaiova, H.; Kelly, J. L.; Lucey, P. G.; Lee, E.; Fackrell, J.

    2015-12-01

    Submarine groundwater discharge (SGD) is a principal conduit for huge volumes of fresh groundwater loss and is a key transport mechanism for nutrient and contaminant pollution to coastal zones worldwide. However, the volumes and spatially and temporally variable nature of SGD is poorly known and requires rapid and high-resolution data acquisition at the scales in which it is commonly observed. Airborne thermal infrared (TIR) remote sensing, using high-altitude manned aircraft and low-altitude remote-controlled unmanned aerial vehicles (UAVs or "Drones") are uniquely qualified for this task, and applicable wherever 0.1°C temperature contrasts exist between discharging and receiving waters. We report on the use of these technologies in combination with in situ radon model studies of SGD volume and nutrient flux from three of the largest Hawaiian Islands. High altitude manned aircraft results produce regional (~300m wide x 100s km coastline) 0.5 to 3.2 m-resolution sea-surface temperature maps accurate to 0.7°C that show point-source and diffuse flow in exquisite detail. Using UAVs offers cost-effective advantages of higher spatial and temporal resolution and instantaneous deployments that can be coordinated simultaneously with any ground-based effort. We demonstrate how TIR-mapped groundwater discharge plume areas may be linearly and highly correlated to in situ groundwater fluxes. We also illustrate how in situ nutrient data may be incorporated into infrared imagery to produce nutrient distribution maps of regional worth. These results illustrate the potential for volumetric quantification and up-scaling of small- to regional-scale SGD. These methodologies provide a tremendous advantage for identifying and differentiating spring-fed, point-sourced, and/or diffuse groundwater discharge into oceans, estuaries, and streams. The integrative techniques are also important precursors for developing best-use and cost-effective strategies for otherwise time-consuming in situ studies, and represent a substantial new asset for land use and coastal zone research and management.

  20. Molecules to maps: tools for visualization and interaction in support of computational biology.

    PubMed

    Kraemer, E T; Ferrin, T E

    1998-01-01

    The volume of data produced by genome projects, X-ray crystallography, NMR spectroscopy, and electron and confocal microscopy presents the bioinformatics community with new challenges for analyzing, understanding, and exchanging these data. At the 1998 Pacific Symposium on Biocomputing, a track entitled 'Molecules to Maps: Tools for Visualization and Interaction in Computational Biology' provided tool developers and users with the opportunity to discuss advances in tools and techniques to assist scientists in evaluating, absorbing, navigating, and correlating this sea of information, through visualization and user interaction. In this paper we present these advances and discuss some of the challenges that remain to be solved.

  1. High-energy surface and volume plasmons in nanopatterned sub-10 nm aluminum nanostructures

    DOE PAGES

    Hobbs, Richard G.; Manfrinato, Vitor R.; Yang, Yujia; ...

    2016-06-13

    In this paper, we use electron energy-loss spectroscopy to map the complete plasmonic spectrum of aluminum nanodisks with diameters ranging from 3 to 120 nm fabricated by high-resolution electron-beam lithography. Our nanopatterning approach allows us to produce localized surface plasmon resonances across a wide spectral range spanning 2–8 eV. Electromagnetic simulations using the finite element method support the existence of dipolar, quadrupolar, and hexapolar surface plasmon modes as well as centrosymmetric breathing modes depending on the location of the electron-beam excitation. In addition, we have developed an approach using nanolithography that is capable of meV control over the energy and attosecond control over the lifetime of volume plasmons in these nanodisks. The precise measurement of volume plasmon lifetime may also provide an opportunity to probe and control the DC electrical conductivity of highly confined metallic nanostructures. Lastly, we show the strong influence of the nanodisk boundary in determining both the energy and lifetime of surface plasmons and volume plasmons locally across individual aluminum nanodisks, and we have compared these observations to similar effects produced by scaling the nanodisk diameter.

  2. Lunar cartographic dossier, volume 1

    NASA Technical Reports Server (NTRS)

    Schimerman, L. A. (Editor)

    1975-01-01

    The dossier is designed to provide an up-to-date summary of the extent and quality of cartographic information, as well as a description of materials available to support lunar scientific investigation and study. It covers the specific photographic, selenodetic, and cartographic data considered to be of continuing significance to users of lunar cartographic information. Historical background data are included. Descriptive and evaluative information is presented concerning lunar maps, photomaps, and photomosaics. Discussion comprises identification of series or individual sheet characteristics, control basis, source materials, and compilation methodology used. The global, regional, and local selenodetic control networks produced for lunar feature location in support of lunar mapping or positional studies are described. Further discussion covers the fundamental basis for each control system, the number of points produced, the techniques employed, and the evaluated accuracy. Although lunar photography is an informational source rather than a cartographic product, a photography section was included to facilitate correlation with the mapping and control works described. Descriptions of lunar photographic systems, photography, and photo support data are presented from a cartographic-photogrammetric viewpoint, with commentary on cartographic applications.

  3. Four-dimensional ultrasound current source density imaging of a dipole field

    NASA Astrophysics Data System (ADS)

    Wang, Z. H.; Olafsson, R.; Ingram, P.; Li, Q.; Qin, Y.; Witte, R. S.

    2011-09-01

    Ultrasound current source density imaging (UCSDI) potentially transforms conventional electrical mapping of excitable organs, such as the brain and heart. For this study, we demonstrate volume imaging of a time-varying current field by scanning a focused ultrasound beam and detecting the acoustoelectric (AE) interaction signal. A pair of electrodes produced an alternating current distribution in a special imaging chamber filled with a 0.9% NaCl solution. A pulsed 1 MHz ultrasound beam was scanned near the source and sink, while the AE signal was detected on remote recording electrodes, resulting in time-lapsed volume movies of the alternating current distribution.

  4. Comparison of CT perfusion summary maps to early diffusion-weighted images in suspected acute middle cerebral artery stroke.

    PubMed

    Benson, John; Payabvash, Seyedmehdi; Salazar, Pascal; Jagadeesan, Bharathi; Palmer, Christopher S; Truwit, Charles L; McKinney, Alexander M

    2015-04-01

    To assess the accuracy and reliability of one vendor's (Vital Images, Toshiba Medical, Minnetonka, MN) automated CT perfusion (CTP) summary maps in identification and volume estimation of infarcted tissue in patients with acute middle cerebral artery (MCA) distribution infarcts. From 1085 CTP examinations over 5.5 years, 43 diffusion-weighted imaging (DWI)-positive patients were included who underwent both CTP and DWI <12 h after symptom onset, with another 43 age-matched patients as controls (DWI-negative). Automated delay-corrected postprocessing software (DC-SVD) generated both infarct "core only" and "core+penumbra" CTP summary maps. Three reviewers independently tabulated Alberta Stroke Program Early CT scores (ASPECTS) of both CTP summary maps and coregistered DWI. Of 86 included patients, 36 had DWI infarct volumes ≤70 ml, 7 had volumes >70 ml, and 43 were negative; the automated CTP "core only" map correctly classified each as >70 ml or ≤70 ml, while the "core+penumbra" map misclassified 4 as >70 ml. There were strong correlations between DWI volume and both summary map-based volumes: "core only" (r=0.93) and "core+penumbra" (r=0.77) (both p<0.0001). Agreement between ASPECTS scores of infarct core on DWI and the summary maps was 0.65-0.74 for the "core only" map and 0.61-0.65 for "core+penumbra" (both p<0.0001). Using DWI-based ASPECTS scores as the standard, the accuracy of the CTP-based maps was 79.1-86.0% for the "core only" map and 83.7-88.4% for "core+penumbra." Automated CTP summary maps appear to be relatively accurate in both the detection of acute MCA distribution infarcts and the discrimination of volumes using a 70 ml threshold. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  5. Warping of a computerized 3-D atlas to match brain image volumes for quantitative neuroanatomical and functional analysis

    NASA Astrophysics Data System (ADS)

    Evans, Alan C.; Dai, Weiqian; Collins, D. Louis; Neelin, Peter; Marrett, Sean

    1991-06-01

    We describe the implementation, experience and preliminary results obtained with a 3-D computerized brain atlas for topographical and functional analysis of brain sub-regions. A volume-of-interest (VOI) atlas was produced by manual contouring on 64 adjacent 2 mm-thick MRI slices to yield 60 brain structures in each hemisphere which could be adjusted, originally by global affine transformation or local interactive adjustments, to match individual MRI datasets. We have now added a non-linear deformation (warp) capability (Bookstein, 1989) into the procedure for fitting the atlas to the brain data. Specific target points are identified in both atlas and MRI spaces which define a continuous 3-D warp transformation that maps the atlas on to the individual brain image. The procedure was used to fit MRI brain image volumes from 16 young normal volunteers. Regional volume and positional variability were determined, the latter in such a way as to assess the extent to which previous linear models of brain anatomical variability fail to account for the true variation among normal individuals. Using a linear model for atlas deformation yielded 3-D fits of the MRI data which, when pooled across subjects and brain regions, left a residual mis-match of 6 - 7 mm as compared to the non-linear model. The results indicate a substantial component of morphometric variability is not accounted for by linear scaling. This has profound implications for applications which employ stereotactic coordinate systems which map individual brains into a common reference frame: quantitative neuroradiology, stereotactic neurosurgery and cognitive mapping of normal brain function with PET. In the latter case, the combination of a non-linear deformation algorithm would allow for accurate measurement of individual anatomic variations and the inclusion of such variations in inter-subject averaging methodologies used for cognitive mapping with PET.

  6. Parametric mapping of [18F]fluoromisonidazole positron emission tomography using basis functions.

    PubMed

    Hong, Young T; Beech, John S; Smith, Rob; Baron, Jean-Claude; Fryer, Tim D

    2011-02-01

    In this study, we show a basis function method (BAFPIC) for voxelwise calculation of kinetic parameters (K(1), k(2), k(3), K(i)) and blood volume using an irreversible two-tissue compartment model. BAFPIC was applied to rat ischaemic stroke micro-positron emission tomography data acquired with the hypoxia tracer [(18)F]fluoromisonidazole because irreversible two-tissue compartmental modelling provided good fits to data from both hypoxic and normoxic tissues. Simulated data show that BAFPIC produces kinetic parameters with significantly lower variability and bias than nonlinear least squares (NLLS) modelling in hypoxic tissue. The advantage of BAFPIC over NLLS is less pronounced in normoxic tissue. K(i) determined from BAFPIC has lower variability than that from the Patlak-Gjedde graphical analysis (PGA) by up to 40% and lower bias, except for normoxic tissue at mid-high noise levels. Consistent with the simulation results, BAFPIC parametric maps of real data suffer less noise-induced variability than do NLLS and PGA. Delineation of hypoxia on BAFPIC k(3) maps is aided by low variability in normoxic tissue, which matches that in K(i) maps. BAFPIC produces K(i) values that correlate well with those from PGA (r(2)=0.93 to 0.97; slope 0.99 to 1.05, absolute intercept <0.00002 mL/g per min). BAFPIC is a computationally efficient method of determining parametric maps with low bias and variance.
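
    For reference, the Patlak-Gjedde graphical analysis mentioned above reduces to a linear regression once irreversible uptake dominates: plotting Ct(t)/Cp(t) against the integral of Cp divided by Cp(t) gives Ki as the slope. A minimal sketch with synthetic curves; the input function, blood-volume term, and start of the linear segment are assumptions.

      import numpy as np

      # Synthetic plasma input and tissue curves for an irreversible tracer (arbitrary units).
      t = np.linspace(0.1, 60.0, 120)                 # minutes
      cp = 10.0 * np.exp(-0.15 * t) + 1.0             # plasma concentration
      true_ki, vb = 0.02, 0.30
      int_cp = np.cumsum(cp) * (t[1] - t[0])          # running integral of Cp
      ct = true_ki * int_cp + vb * cp                 # tissue curve: trapping + reversible part

      # Patlak plot: y = Ct/Cp versus x = int(Cp)/Cp; Ki is the slope of the late linear part.
      x, y = int_cp / cp, ct / cp
      late = t >= 20.0                                # assumed start of the linear segment
      ki_est, intercept = np.polyfit(x[late], y[late], 1)
      print(f"estimated Ki = {ki_est:.4f} /min (true value {true_ki})")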

  7. On the accuracy and reproducibility of a novel probabilistic atlas-based generation for calculation of head attenuation maps on integrated PET/MR scanners.

    PubMed

    Chen, Kevin T; Izquierdo-Garcia, David; Poynton, Clare B; Chonde, Daniel B; Catana, Ciprian

    2017-03-01

    To propose an MR-based method for generating continuous-valued head attenuation maps and to assess its accuracy and reproducibility. Demonstrating that novel MR-based photon attenuation correction methods are both accurate and reproducible is essential prior to using them routinely in research and clinical studies on integrated PET/MR scanners. Continuous-valued linear attenuation coefficient maps ("μ-maps") were generated by combining atlases that provided the prior probability of voxel positions belonging to a certain tissue class (air, soft tissue, or bone) and an MR intensity-based likelihood classifier to produce posterior probability maps of tissue classes. These probabilities were used as weights to generate the μ-maps. The accuracy of this probabilistic atlas-based continuous-valued μ-map ("PAC-map") generation method was assessed by calculating the voxel-wise absolute relative change (RC) between the MR-based and scaled CT-based attenuation-corrected PET images. To assess reproducibility, we performed pair-wise comparisons of the RC values obtained from the PET images reconstructed using the μ-maps generated from the data acquired at three time points. The proposed method produced continuous-valued μ-maps that qualitatively reflected the variable anatomy in patients with brain tumor and agreed well with the scaled CT-based μ-maps. The absolute RC comparing the resulting PET volumes was 1.76 ± 2.33 %, quantitatively demonstrating that the method is accurate. Additionally, we also showed that the method is highly reproducible, the mean RC value for the PET images reconstructed using the μ-maps obtained at the three visits being 0.65 ± 0.95 %. Accurate and highly reproducible continuous-valued head μ-maps can be generated from MR data using a probabilistic atlas-based approach.
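
    The weighting step described above, combining posterior tissue-class probabilities with per-class attenuation coefficients, can be sketched as follows. The posteriors here are random stand-ins, and the linear attenuation coefficients at 511 keV are typical literature values used as assumptions.

      import numpy as np

      # Posterior probability maps for three tissue classes (air, soft tissue, bone),
      # e.g. from an atlas prior combined with an MR intensity likelihood; shape (class, z, y, x).
      rng = np.random.default_rng(0)
      logits = rng.normal(size=(3, 8, 16, 16))
      posteriors = np.exp(logits) / np.exp(logits).sum(axis=0)   # probabilities sum to 1 per voxel

      # Approximate linear attenuation coefficients at 511 keV (1/cm); assumed values.
      mu_class = np.array([0.0, 0.096, 0.151])        # air, soft tissue, bone

      # Continuous-valued mu-map: probability-weighted average of the class coefficients.
      mu_map = np.tensordot(mu_class, posteriors, axes=1)
      print(mu_map.shape, float(mu_map.min()), float(mu_map.max()))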

  8. Towards the Crowdsourcing of Massive Smartphone Assisted-GPS Sensor Ground Observations for the Production of Digital Terrain Models

    PubMed Central

    Massad, Ido

    2018-01-01

    Digital Terrain Models (DTMs) used for the representation of the bare earth are produced from elevation data obtained using high-end mapping platforms and technologies. These require the handling of complex post-processing performed by authoritative and commercial mapping agencies. In this research, we aim to exploit user-generated data to produce DTMs by handling massive volumes of position and elevation data collected using ubiquitous smartphone devices equipped with Assisted-GPS sensors. As massive position and elevation data are collected passively and straightforwardly by pedestrians, cyclists, and drivers, it can be transformed into valuable topographic information. Specifically, in dense and concealed built and vegetated areas, where other technologies fail, handheld devices have an advantage. Still, Assisted-GPS measurements are not as accurate as high-end technologies, requiring pre- and post-processing of observations. We propose the development and implementation of a 2D Kalman filter and smoothing on the acquired crowdsourced observations for topographic representation production. When compared to an authoritative DTM, results obtained are very promising in producing good elevation values. Today, open-source mapping infrastructures, such as OpenStreetMap, rely primarily on the global authoritative SRTM (Shuttle Radar Topography Mission), which shows similar accuracy but inferior resolution when compared to the results obtained in this research. Accordingly, our crowdsourced methodology has the capacity for reliable topographic representation production that is based on ubiquitous volunteered user-generated data. PMID:29562627
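
    A deliberately simplified illustration of the filtering idea: a one-dimensional Kalman filter applied to noisy Assisted-GPS elevation observations along a track, with a random-walk terrain model. The paper's 2-D filter and smoother are more involved, and the noise variances below are assumptions.

      import numpy as np


      def kalman_filter_elevation(observations, obs_var=25.0, process_var=0.5):
          """Filter noisy elevation observations (m) with a random-walk terrain model.

          obs_var: Assisted-GPS vertical error variance (m^2), assumed.
          process_var: allowed terrain change between consecutive samples (m^2), assumed.
          """
          estimate, variance = observations[0], obs_var
          filtered = []
          for z in observations:
              variance += process_var                   # predict: terrain may drift slightly
              gain = variance / (variance + obs_var)    # update: weight observation by precision
              estimate += gain * (z - estimate)
              variance *= (1.0 - gain)
              filtered.append(estimate)
          return np.array(filtered)


      # Usage: a gently rising slope observed with ~5 m vertical noise.
      rng = np.random.default_rng(3)
      true_profile = 100.0 + 0.2 * np.arange(200)
      noisy_obs = true_profile + rng.normal(0.0, 5.0, size=200)
      filtered = kalman_filter_elevation(noisy_obs)
      print("RMSE raw:", round(float(np.sqrt(np.mean((noisy_obs - true_profile) ** 2))), 2),
            " filtered:", round(float(np.sqrt(np.mean((filtered - true_profile) ** 2))), 2))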

  9. Real-time volume rendering of 4D image using 3D texture mapping

    NASA Astrophysics Data System (ADS)

    Hwang, Jinwoo; Kim, June-Sic; Kim, Jae Seok; Kim, In Young; Kim, Sun Il

    2001-05-01

    Four-dimensional images are 3D volume data that vary with time. They are used to represent deforming or moving objects in applications such as virtual surgery or 4D ultrasound. It is difficult to render 4D images with conventional ray-casting or shear-warp factorization methods because of their long rendering times or the pre-processing required whenever the volume data change. Even when 3D texture mapping is used, repeatedly loading volumes is time-consuming in 4D image rendering. In this study, we propose a method to reduce data loading time by exploiting coherence between the currently loaded volume and the previously loaded volume, in order to achieve real-time rendering based on 3D texture mapping. Volume data are divided into small bricks, and each brick being loaded is tested for similarity to the one already loaded in memory. If the brick passes the test, it is defined as a 3D texture by OpenGL functions. The texture slices of the brick are then mapped onto polygons and blended by OpenGL blending functions. All bricks undergo this test. Fifty continuously deforming volumes are rendered at interactive rates on an SGI ONYX. Real-time volume rendering based on 3D texture mapping is currently available on PC.
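
    One plausible reading of the brick-level coherence test can be sketched in a backend-agnostic way: bricks that differ from the previously loaded frame beyond a tolerance are flagged for re-upload, while unchanged bricks are reused. The brick size and tolerance are assumptions, and the OpenGL texture upload itself is omitted.

      import numpy as np


      def changed_bricks(prev_volume, next_volume, brick=16, tol=1e-3):
          """Return indices of bricks whose content differs enough to require re-upload."""
          assert prev_volume.shape == next_volume.shape
          nz, ny, nx = (s // brick for s in prev_volume.shape)
          dirty = []
          for k in range(nz):
              for j in range(ny):
                  for i in range(nx):
                      sl = (slice(k * brick, (k + 1) * brick),
                            slice(j * brick, (j + 1) * brick),
                            slice(i * brick, (i + 1) * brick))
                      if np.mean(np.abs(next_volume[sl] - prev_volume[sl])) > tol:
                          dirty.append((k, j, i))   # only these bricks are redefined as 3D textures
          return dirty


      # Usage: two consecutive frames of a deforming volume that differ in one corner region.
      rng = np.random.default_rng(0)
      frame0 = rng.random((64, 64, 64), dtype=np.float32)
      frame1 = frame0.copy()
      frame1[:16, :16, :16] += 0.1
      print(len(changed_bricks(frame0, frame1)), "of", (64 // 16) ** 3, "bricks need reloading")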

  10. Compensation of missing wedge effects with sequential statistical reconstruction in electron tomography.

    PubMed

    Paavolainen, Lassi; Acar, Erman; Tuna, Uygar; Peltonen, Sari; Moriya, Toshio; Soonsawad, Pan; Marjomäki, Varpu; Cheng, R Holland; Ruotsalainen, Ulla

    2014-01-01

    Electron tomography (ET) of biological samples is used to study the organization and the structure of the whole cell and subcellular complexes in great detail. However, projections cannot be acquired over the full tilt angle range with biological samples in electron microscopy. ET image reconstruction can be considered an ill-posed problem because of this missing information. This results in artifacts, seen as the loss of three-dimensional (3D) resolution in the reconstructed images. The goal of this study was to achieve isotropic resolution with a statistical reconstruction method, sequential maximum a posteriori expectation maximization (sMAP-EM), using no prior morphological knowledge about the specimen. The missing wedge effects on sMAP-EM were examined with a synthetic cell phantom to assess the effects of noise. An experimental dataset of a multivesicular body containing a number of gold particles was evaluated. An ellipsoid-fitting-based method was developed to realize the quantitative measures elongation and contrast in an automated, objective, and reliable way. The method statistically evaluates the sub-volumes containing gold particles randomly located in various parts of the whole volume, thus giving information about the robustness of the volume reconstruction. The quantitative results were also compared with reconstructions made with the widely used weighted backprojection and simultaneous iterative reconstruction technique methods. The results showed that the proposed sMAP-EM method significantly suppresses the effects of the missing information, producing isotropic resolution. Furthermore, this method improves the contrast ratio, enhancing the applicability of further automatic and semi-automatic analysis. These improvements in ET reconstruction by sMAP-EM enable analysis of subcellular structures with higher three-dimensional resolution and contrast than conventional methods.

  11. First Results of the Performance of the Global Forest/Non-Forest Map derived from TanDEM-X Interferometric Data

    NASA Astrophysics Data System (ADS)

    Gonzalez, Carolina; Rizzoli, Paola; Martone, Michele; Wecklich, Christopher; Bueso Bello, Jose Luis; Krieger, Gerhard; Zink, Manfred

    2017-04-01

    The globally acquired interferometric synthetic aperture radar (SAR) data set, used for the recently completed primary goal of the TanDEM-X mission, offers a major opportunity for scientific geo-applications. Of great importance for land characterization, classification, and monitoring is that the data set is globally acquired without gaps and includes multiple acquisitions of every region, with comparable parameters. One of the most valuable maps that can be derived from interferometric SAR data for land classification describes the presence/absence of vegetation. In particular, here we report on the deployment of the Global Forest/Non-Forest Map, derived from TanDEM-X interferometric SAR quick-look data, at a ground resolution of 50 m by 50 m. The presence of structures, and in particular vegetation, produces multiple scattering known as volume decorrelation. Its contribution can be directly estimated from the assessment of coherence loss in the interferometric bistatic pair, by compensating for all other decorrelation sources, such as poor signal-to-noise ratio or quantization noise. Three different forest types have been characterized based on the estimated volume decorrelation: tropical, temperate, and boreal forest. This characterization was then used in a fuzzy clustering approach for the discrimination of vegetated areas on a global scale. Water and cities are filtered out from the generated maps in order to distinguish volume decorrelation from other decorrelation sources. The validation and performance comparison of the delivered product is also presented, and represents a fundamental tool for optimizing the whole algorithm at all different stages. Furthermore, as the time interval of the acquisitions is almost 4 years, change detection can be performed as well, and examples of deforestation are also going to be included in the final paper.

  12. CARETS: A prototype regional environmental information system. Volume 5: Interpretation, compilation and field verification procedures in the CARETS project

    NASA Technical Reports Server (NTRS)

    Alexander, R. H. (Principal Investigator); Deforth, P. W.; Fitzpatrick, K. A.; Lins, H. F., Jr.; Mcginty, H. K., III

    1975-01-01

    The author has identified the following significant results. Level 2 land use mapping from high altitude aircraft photography at a scale of 1:100,000 required production of a photomosaic mapping base for each of the 48, 50 x 50 km sheets, and the interpretation and coding of land use polygons on drafting film overlays. To enhance the value of the land use sheets, a series of overlays was compiled, showing cultural features, county boundaries and census tracts, surface geology, and drainage basins. In producing level 1 land use maps from LANDSAT imagery, at a scale of 1:250,000 drafting film was directly overlaid on LANDSAT color composite transparencies. Numerous areas of change were identified, but extensive areas of false changes were also noted.

  13. Quantitative, nondestructive estimates of coarse root biomass in a temperate pine forest using 3-D ground-penetrating radar (GPR)

    NASA Astrophysics Data System (ADS)

    Molon, Michelle; Boyce, Joseph I.; Arain, M. Altaf

    2017-01-01

    Coarse root biomass was estimated in a temperate pine forest using high-resolution (1 GHz) 3-D ground-penetrating radar (GPR). GPR survey grids were acquired across a 400 m2 area with varying line spacing (12.5 and 25 cm). Root volume and biomass were estimated directly from the 3-D radar volume by using isosurfaces calculated with the marching cubes algorithm. Empirical relations between GPR reflection amplitude and root diameter were determined for 14 root segments (0.1-10 cm diameter) reburied in a 6 m2 experimental test plot and surveyed at 5-25 cm line spacing under dry and wet soil conditions. Reburied roots >1.4 cm diameter were detectable as continuous root structures with 5 cm line separation. Reflection amplitudes were strongly controlled by soil moisture and decreased by 40% with a twofold increase in soil moisture. GPR line intervals of 12.5 and 25 cm produced discontinuous mapping of roots, and GPR coarse root biomass estimates (0.92 kgC m-2) were lower than those obtained previously with a site-specific allometric equation due to nondetection of vertical roots and roots <1.5 cm diameter. The results show that coarse root volume and biomass can be estimated directly from interpolated 3-D GPR volumes by using a marching cubes approach, but mapping of roots as continuous structures requires high inline sampling and line density (<5 cm). The results demonstrate that 3-D GPR is a viable approach for estimating belowground carbon and for mapping tree root architecture. This methodology can be applied more broadly in other disciplines (e.g., archaeology and civil engineering) for imaging buried structures.
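
    The isosurface-based volume estimate can be sketched with scikit-image's marching cubes followed by the signed-tetrahedron (divergence theorem) volume of the closed mesh. The amplitude threshold, sample spacing, and synthetic "root" below are placeholders, not the calibrated values from the study.

      import numpy as np
      from skimage import measure


      def mesh_volume(verts: np.ndarray, faces: np.ndarray) -> float:
          """Volume enclosed by a closed triangular mesh via signed tetrahedra."""
          v0, v1, v2 = verts[faces[:, 0]], verts[faces[:, 1]], verts[faces[:, 2]]
          return float(np.abs(np.einsum("ij,ij->i", v0, np.cross(v1, v2)).sum()) / 6.0)


      # Toy GPR amplitude volume with a buried "root segment": an elongated Gaussian blob.
      z, y, x = np.mgrid[0:40, 0:60, 0:60]
      amplitude = np.exp(-((x - 30) ** 2 / 200.0 + (y - 30) ** 2 / 20.0 + (z - 20) ** 2 / 20.0))

      voxel_mm = (5.0, 12.5, 12.5)   # assumed (depth, crossline, inline) sample spacing in mm
      verts, faces, _, _ = measure.marching_cubes(amplitude, level=0.5, spacing=voxel_mm)
      print(f"estimated root volume: {mesh_volume(verts, faces) / 1e3:.1f} cm^3")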

  14. Fluoride Induces a Volume Reduction in CA1 Hippocampal Slices Via MAP Kinase Pathway Through Volume Regulated Anion Channels

    PubMed Central

    Lee, Jaekwang; Han, Young-Eun; Favorov, Oleg; Tommerdahl, Mark; Whitsel, Barry

    2016-01-01

    Regulation of cell volume is an important aspect of cellular homeostasis during neural activity. This volume regulation is thought to be mediated by activation of specific transporters, aquaporins, and volume regulated anion channels (VRAC). In cultured astrocytes, it was reported that swelling-induced mitogen-activated protein (MAP) kinase activation is required to open VRAC, which are thought to be important in regulatory volume decrease and in the response of the CNS to trauma and excitotoxicity. It has also been described that sodium fluoride (NaF), a recognized G-protein activator and protein phosphatase inhibitor, leads to significant MAP kinase activation in endothelial cells. However, NaF's effect on volume regulation in the brain is not yet known. Here, we investigated the mechanism of NaF-induced volume change in rat and mouse hippocampal slices using intrinsic optical signal (IOS) recording, in which we measured relative changes in intracellular and extracellular volume as changes in light transmittance through brain slices. We found that NaF (1~5 mM) application induced a reduction in light transmittance (decreased volume) in CA1 hippocampus, which was completely reversed by the MAP kinase inhibitor U0126 (10 µM). We also observed that the NaF-induced volume reduction was blocked by anion channel blockers, suggesting that it could be mediated by VRAC. Overall, our results propose a novel molecular mechanism of NaF-induced volume reduction via the MAP kinase signaling pathway through activation of VRAC. PMID:27122993

  15. Semi-automatic mapping of geological Structures using UAV-based photogrammetric data: An image analysis approach

    NASA Astrophysics Data System (ADS)

    Vasuki, Yathunanthan; Holden, Eun-Jung; Kovesi, Peter; Micklethwaite, Steven

    2014-08-01

    Recent advances in data acquisition technologies, such as Unmanned Aerial Vehicles (UAVs), have led to a growing interest in capturing high-resolution rock surface images. However, due to the large volumes of data that can be captured in a short flight, efficient analysis of this data brings new challenges, especially the time it takes to digitise maps and extract orientation data. We outline a semi-automated method that allows efficient mapping of geological faults using photogrammetric data of rock surfaces, which was generated from aerial photographs collected by a UAV. Our method harnesses advanced automated image analysis techniques and human data interaction to rapidly map structures and then calculate their dip and dip directions. Geological structures (faults, joints and fractures) are first detected from the primary photographic dataset and the equivalent three dimensional (3D) structures are then identified within a 3D surface model generated by structure from motion (SfM). From this information the location, dip and dip direction of the geological structures are calculated. A structure map generated by our semi-automated method obtained a recall rate of 79.8% when compared against a fault map produced using expert manual digitising and interpretation methods. The semi-automated structure map was produced in 10 min whereas the manual method took approximately 7 h. In addition, the dip and dip direction calculations using our automated method show a mean ± standard error of 1.9° ± 2.2° and 4.4° ± 2.6°, respectively, relative to field measurements. This shows the potential of using our semi-automated method for accurate and efficient mapping of geological structures, particularly from remote, inaccessible or hazardous sites.
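
    The dip and dip direction step can be illustrated with a plane fit. The sketch below fits a plane to 3-D points by SVD and converts the upward normal to dip and dip direction; it is a minimal sketch under the assumption of an east/north/up coordinate frame, not the authors' implementation.

    ```python
    import numpy as np

    def dip_and_dip_direction(points):
        """Fit a plane to 3-D points (x=east, y=north, z=up) and return
        (dip, dip_direction) in degrees."""
        pts = np.asarray(points, dtype=float)
        centered = pts - pts.mean(axis=0)
        # The plane normal is the singular vector with the smallest singular value.
        _, _, vt = np.linalg.svd(centered, full_matrices=False)
        normal = vt[-1]
        if normal[2] < 0:                     # make the normal point upward
            normal = -normal
        nx, ny, nz = normal
        dip = np.degrees(np.arccos(np.clip(nz, -1.0, 1.0)))       # plane's angle from horizontal
        dip_direction = np.degrees(np.arctan2(nx, ny)) % 360.0    # azimuth the plane dips toward
        return dip, dip_direction

    # Example: points scattered on a plane dipping ~30 degrees toward the east (090).
    rng = np.random.default_rng(0)
    xy = rng.uniform(-10, 10, size=(200, 2))
    z = -np.tan(np.radians(30.0)) * xy[:, 0] + rng.normal(0, 0.05, 200)
    print(dip_and_dip_direction(np.column_stack([xy, z])))  # ~ (30.0, 90.0)
    ```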

  16. Direct dose mapping versus energy/mass transfer mapping for 4D dose accumulation: fundamental differences and dosimetric consequences.

    PubMed

    Li, Haisen S; Zhong, Hualiang; Kim, Jinkoo; Glide-Hurst, Carri; Gulam, Misbah; Nurushev, Teamour S; Chetty, Indrin J

    2014-01-06

    The direct dose mapping (DDM) and energy/mass transfer (EMT) mapping are two essential algorithms for accumulating the dose from different anatomic phases to the reference phase when there is organ motion or tumor/tissue deformation during the delivery of radiation therapy. DDM is based on interpolation of the dose values from one dose grid to another and thus lacks rigor in defining the dose when there are multiple dose values mapped to one dose voxel in the reference phase due to tissue/tumor deformation. On the other hand, EMT counts the total energy and mass transferred to each voxel in the reference phase and calculates the dose by dividing the energy by mass. Therefore it is based on fundamentally sound physics principles. In this study, we implemented the two algorithms and integrated them within the Eclipse treatment planning system. We then compared the clinical dosimetric difference between the two algorithms for ten lung cancer patients receiving stereotactic radiosurgery treatment, by accumulating the delivered dose to the end-of-exhale (EE) phase. Specifically, the respiratory period was divided into ten phases and the dose to each phase was calculated and mapped to the EE phase and then accumulated. The displacement vector field generated by Demons-based registration of the source and reference images was used to transfer the dose and energy. The DDM and EMT algorithms produced noticeably different cumulative dose in the regions with sharp mass density variations and/or high dose gradients. For the planning target volume (PTV) and internal target volume (ITV) minimum dose, the difference was up to 11% and 4%, respectively. This suggests that DDM might not be adequate for obtaining an accurate dose distribution of the cumulative plan; instead, EMT should be considered.
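
    The energy/mass bookkeeping behind EMT can be sketched as follows: for each phase, energy (dose × mass) and mass of every source voxel are transferred to the reference voxel indicated by the deformation field, the phase dose is recovered as energy divided by mass, and the phase doses are summed. This is a schematic sketch only, not the Eclipse-integrated implementation; the voxel-to-voxel mapping, masses and doses are assumed to be precomputed inputs.

    ```python
    import numpy as np

    def emt_accumulate(dose_phases, mass_phases, target_index, n_ref_voxels):
        """Schematic energy/mass-transfer style 4D dose accumulation.

        dose_phases  : list of 1-D arrays, dose per source voxel for each phase
        mass_phases  : list of 1-D arrays, mass per source voxel for each phase
        target_index : list of 1-D int arrays giving, for each source voxel, the
                       reference-phase voxel it maps to (from the deformation field)
        """
        cumulative = np.zeros(n_ref_voxels)
        for dose, mass, idx in zip(dose_phases, mass_phases, target_index):
            energy = np.zeros(n_ref_voxels)
            m_ref = np.zeros(n_ref_voxels)
            np.add.at(energy, idx, dose * mass)   # energy transferred to reference voxels
            np.add.at(m_ref, idx, mass)           # mass transferred to reference voxels
            # Dose contributed by this phase = transferred energy / transferred mass.
            phase_dose = np.where(m_ref > 0, energy / np.where(m_ref > 0, m_ref, 1.0), 0.0)
            cumulative += phase_dose
        return cumulative
    ```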

  17. Direct dose mapping versus energy/mass transfer mapping for 4D dose accumulation: fundamental differences and dosimetric consequences

    NASA Astrophysics Data System (ADS)

    Li, Haisen S.; Zhong, Hualiang; Kim, Jinkoo; Glide-Hurst, Carri; Gulam, Misbah; Nurushev, Teamour S.; Chetty, Indrin J.

    2014-01-01

    The direct dose mapping (DDM) and energy/mass transfer (EMT) mapping are two essential algorithms for accumulating the dose from different anatomic phases to the reference phase when there is organ motion or tumor/tissue deformation during the delivery of radiation therapy. DDM is based on interpolation of the dose values from one dose grid to another and thus lacks rigor in defining the dose when there are multiple dose values mapped to one dose voxel in the reference phase due to tissue/tumor deformation. On the other hand, EMT counts the total energy and mass transferred to each voxel in the reference phase and calculates the dose by dividing the energy by mass. Therefore it is based on fundamentally sound physics principles. In this study, we implemented the two algorithms and integrated them within the Eclipse treatment planning system. We then compared the clinical dosimetric difference between the two algorithms for ten lung cancer patients receiving stereotactic radiosurgery treatment, by accumulating the delivered dose to the end-of-exhale (EE) phase. Specifically, the respiratory period was divided into ten phases and the dose to each phase was calculated and mapped to the EE phase and then accumulated. The displacement vector field generated by Demons-based registration of the source and reference images was used to transfer the dose and energy. The DDM and EMT algorithms produced noticeably different cumulative dose in the regions with sharp mass density variations and/or high dose gradients. For the planning target volume (PTV) and internal target volume (ITV) minimum dose, the difference was up to 11% and 4%, respectively. This suggests that DDM might not be adequate for obtaining an accurate dose distribution of the cumulative plan; instead, EMT should be considered.

  18. Parametric mapping using spectral analysis for 11C-PBR28 PET reveals neuroinflammation in mild cognitive impairment subjects.

    PubMed

    Fan, Zhen; Dani, Melanie; Femminella, Grazia D; Wood, Melanie; Calsolaro, Valeria; Veronese, Mattia; Turkheimer, Federico; Gentleman, Steve; Brooks, David J; Hinz, Rainer; Edison, Paul

    2018-07-01

    Neuroinflammation and microglial activation play an important role in amnestic mild cognitive impairment (MCI) and Alzheimer's disease. In this study, we investigated the spatial distribution of neuroinflammation in MCI subjects, using spectral analysis (SA) to generate parametric maps and quantify 11C-PBR28 PET, and compared these with compartmental and other kinetic models of quantification. Thirteen MCI subjects and nine healthy controls (HC) were enrolled in this study. Subjects underwent 11C-PBR28 PET scans with arterial cannulation. Spectral analysis with an arterial plasma input function was used to generate 11C-PBR28 parametric maps. These maps were then compared with regional 11C-PBR28 VT (volume of distribution) using a two-tissue compartment model and Logan graphical analysis. Amyloid load was also assessed with 18F-flutemetamol PET. With SA, three component peaks were identified in addition to blood volume. The 11C-PBR28 impulse response function (IRF) at 90 min produced the lowest coefficient of variation. Single-subject analysis using this IRF demonstrated microglial activation in five out of seven amyloid-positive MCI subjects. IRF parametric maps of 11C-PBR28 uptake revealed a group-wise significant increase in neuroinflammation in amyloid-positive MCI subjects versus HC in multiple cortical association areas, and particularly in the temporal lobe. Interestingly, compartmental analysis detected a group-wise increase in 11C-PBR28 binding in the thalamus of amyloid-positive MCI subjects, while Logan parametric maps did not perform well. This study demonstrates for the first time that spectral analysis can be used to generate parametric maps of 11C-PBR28 uptake, and is able to detect microglial activation in amyloid-positive MCI subjects. IRF parametric maps of 11C-PBR28 uptake allow voxel-wise single-subject analysis and could be used to evaluate microglial activation in individual subjects.
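
    One of the comparison methods, Logan graphical analysis, estimates the total volume of distribution VT as the late-time slope of the plot of ∫CT/CT against ∫Cp/CT. The sketch below is a minimal, generic implementation of that standard method (frame times, time-activity curve, plasma input and the linearity start time t_star are assumed inputs); it is not the spectral-analysis pipeline proposed in the study.

    ```python
    import numpy as np

    def logan_vt(t, ct, cp, t_star):
        """Estimate total volume of distribution (VT) by Logan graphical analysis.

        t      : frame mid-times (min)
        ct     : tissue time-activity curve
        cp     : metabolite-corrected arterial plasma input function
        t_star : time (min) after which the Logan plot is assumed linear
        """
        # Cumulative integrals via the trapezoidal rule.
        int_ct = np.concatenate([[0.0], np.cumsum(np.diff(t) * (ct[1:] + ct[:-1]) / 2)])
        int_cp = np.concatenate([[0.0], np.cumsum(np.diff(t) * (cp[1:] + cp[:-1]) / 2)])
        mask = (t >= t_star) & (ct > 0)
        x = int_cp[mask] / ct[mask]
        y = int_ct[mask] / ct[mask]
        slope, intercept = np.polyfit(x, y, 1)   # the slope approximates VT
        return slope
    ```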

  19. Masked areas in shear peak statistics. A forward modeling approach

    DOE PAGES

    Bard, D.; Kratochvil, J. M.; Dawson, W.

    2016-03-09

    The statistics of shear peaks have been shown to provide valuable cosmological information beyond the power spectrum, and will be an important constraint of models of cosmology in forthcoming astronomical surveys. Surveys include masked areas due to bright stars, bad pixels etc., which must be accounted for in producing constraints on cosmology from shear maps. We advocate a forward-modeling approach, where the impacts of masking and other survey artifacts are accounted for in the theoretical prediction of cosmological parameters, rather than correcting survey data to remove them. We use masks based on the Deep Lens Survey, and explore the impact of up to 37% of the survey area being masked on LSST and DES-scale surveys. By reconstructing maps of aperture mass the masking effect is smoothed out, resulting in up to 14% smaller statistical uncertainties compared to simply reducing the survey area by the masked area. We show that, even in the presence of large survey masks, the bias in cosmological parameter estimation produced in the forward-modeling process is ≈1%, dominated by bias caused by limited simulation volume. We also explore how this potential bias scales with survey area and evaluate how much small survey areas are impacted by the differences in cosmological structure in the data and simulated volumes, due to cosmic variance.

  20. MASKED AREAS IN SHEAR PEAK STATISTICS: A FORWARD MODELING APPROACH

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bard, D.; Kratochvil, J. M.; Dawson, W., E-mail: djbard@slac.stanford.edu

    2016-03-10

    The statistics of shear peaks have been shown to provide valuable cosmological information beyond the power spectrum, and will be an important constraint of models of cosmology in forthcoming astronomical surveys. Surveys include masked areas due to bright stars, bad pixels etc., which must be accounted for in producing constraints on cosmology from shear maps. We advocate a forward-modeling approach, where the impacts of masking and other survey artifacts are accounted for in the theoretical prediction of cosmological parameters, rather than correcting survey data to remove them. We use masks based on the Deep Lens Survey, and explore the impact of up to 37% of the survey area being masked on LSST and DES-scale surveys. By reconstructing maps of aperture mass the masking effect is smoothed out, resulting in up to 14% smaller statistical uncertainties compared to simply reducing the survey area by the masked area. We show that, even in the presence of large survey masks, the bias in cosmological parameter estimation produced in the forward-modeling process is ≈1%, dominated by bias caused by limited simulation volume. We also explore how this potential bias scales with survey area and evaluate how much small survey areas are impacted by the differences in cosmological structure in the data and simulated volumes, due to cosmic variance.

  1. Geologic map of Lake Mead and surrounding regions, southern Nevada, southwestern Utah, and northwestern Arizona

    USGS Publications Warehouse

    Felger, Tracey J.; Beard, Sue

    2010-01-01

    Regional stratigraphic units and structural features of the Lake Mead region are presented as a 1:250,000 scale map, and as a Geographic Information System database. The map, which was compiled from existing geologic maps of various scales, depicts geologic units, bedding and foliation attitudes, faults and folds. Units and structural features were generalized to highlight the regional stratigraphic and tectonic aspects of the geology of the Lake Mead region. This map was prepared in support of the papers presented in this volume, Special Paper 463, as well as to facilitate future investigations in the region. Stratigraphic units exposed within the area record 1800 million years of geologic history and include Proterozoic crystalline rocks, Paleozoic and Mesozoic sedimentary rocks, Mesozoic plutonic rocks, Cenozoic volcanic and intrusive rocks, sedimentary rocks and surficial deposits. Following passive margin sedimentation in the Paleozoic and Mesozoic, late Mesozoic (Sevier) thrusting and Late Cretaceous and early Tertiary compression produced major folding, reverse faulting, and thrust faulting in the Basin and Range, and resulted in regional uplift and monoclinal folding in the Colorado Plateau. Cenozoic extensional deformation, accompanied by sedimentation and volcanism, resulted in large-magnitude high- and low-angle normal faulting and strike-slip faulting in the Basin and Range; on the Colorado Plateau, extension produced north-trending high-angle normal faults. The latest history includes integration of the Colorado River system, dissection, development of alluvial fans, extensive pediment surfaces, and young faulting.

  2. The universe unveiled : instruments and images through history

    NASA Astrophysics Data System (ADS)

    Stephenson, Bruce; Bolt, Marvin; Friedman, Anna Felicity

    2000-11-01

    The search for understanding creates more than answers; it also produces instruments, books, maps, and other tools made and used by those seeking knowledge. The Universe Unveiled uniquely focuses on these artifacts and devices resulting from the attempts to decipher the Universe from the late fifteenth to the early twentieth century. Beautiful, full-color photographs capture these extremely rare and sometimes unusual curios. Beginning with the discovery of ways to keep time, The Universe Unveiled depicts the shift from an Earth-centered understanding of the Universe to a Sun-centered view, the mapping of the stars, and the ever-expanding knowledge of the heavens using telescopes. It also examines the developing technologies of navigation and of the measuring and mapping of the Earth. In addition to rare European curios, the book is illustrated with non-Western and American works. With more than 250 full-color images, this unique volume will delight the inventive as well as the curious.

  3. Analysis of regolith electromagnetic scattering as constrained by high resolution Earth-based measurements of the lunar microwave emission

    NASA Technical Reports Server (NTRS)

    Keihm, S. J.

    1983-01-01

    When high resolution measurements of the phase variation of the lunar disk center brightness temperature revealed that in situ regolith electrical losses were larger than those measured on returned samples by a factor of 1.5 to 2.0 at centimeter wavelengths, the need for a refinement of the regolith model to include realistic treatment of scattering effects was identified. Two distinct scattering regimes are considered: vertical variations in dielectric constant and volume scattering due to subsurface rock fragments. Models of lunar regolith energy transport processes have now reached a state at which maximum scientific return could be realized from a lunar orbiter microwave mapping experiment. A detailed analysis including the effects of scattering produced a set of nominal brightness temperature spectra for lunar equatorial regions, which can be used as a calibration reference for mapping variations in mineralogy and heat flow.

  4. Fit of lithium disilicate crowns fabricated from conventional and digital impressions assessed with micro-CT.

    PubMed

    Kim, Jae-Hyun; Jeong, Ji-Hye; Lee, Jin-Han; Cho, Hye-Won

    2016-10-01

    Although the number of lithium disilicate crowns fabricated with computer-aided design and computer-aided manufacturing (CAD-CAM) technology has increased, the accuracy of the prostheses produced by using digital pathways remains unknown. The purpose of this in vitro study was to compare marginal and internal discrepancies of lithium disilicate crowns fabricated from digital and conventional impressions. A typodont mandibular first molar was prepared for a lithium disilicate crown, and 20 duplicate dies were fabricated by milling poly(methyl methacrylate) resin blocks from laboratory scans. Four groups of 5 lithium disilicate crowns each were created by using a CS3500 (Carestream Dental) intraoral digital impression; Trios (3shape) intraoral digital impression; Ceramill Map400 (Amann Girrbach) extraoral digital impression; and a heat-press technique as a control group. All of the IPS e.max CAD (Ivoclar Vivadent AG) crowns were produced using a 5-axis milling engine (Ceramill Motion2). The lithium disilicate crowns were cemented with zinc phosphate cement under finger pressure. Marginal and internal discrepancies were measured using micro-computed tomography (SkyScan1172). One-way ANOVAs with the Tukey honestly significant difference test were used for statistical analysis of the data (α=.05). The mean marginal discrepancies were 129.6 μm for the CS3500, 200.9 μm for the Ceramill Map400, 207.8 μm for the Trios, and 176.1 μm for the heat-press technique; the internal discrepancy volumes were 25.3 mm^3 for the CS3500, 40.7 mm^3 for the Trios, 29.1 mm^3 for the Ceramill Map400, and 31.4 mm^3 for the heat-press technique. The CS3500 group showed a significantly better marginal discrepancy than the other 3 groups and a smaller internal discrepancy volume than the Trios group (P<.05). Significant differences were found between IPS e.max CAD crowns produced using 2 intraoral digital impressions, whereas no differences were found between IPS e.max CAD crowns produced from an extraoral digital impression and IPS e.max Press crowns produced using a heat-press technique. Copyright © 2016 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.

  5. Re-assessing Rainwater Harvesting Volume by CHIRPS Satellite in Semarang Settlement Area

    NASA Astrophysics Data System (ADS)

    Prihanto, Yosef; Koestoer, Raldi H.; Sutjiningsih, Dwita

    2017-12-01

    Semarang City is one of the most influential coastal cities on Java Island. The city is facing increasingly high water demand due to its development, as well as water problems due to climate change. The spatial physiography and landscape of Semarang City also expose the city to water security problems. Hence, rainwater harvesting is an urgent effort to help meet the city’s water needs. However, planning, implementation and management of rainwater harvesting depend heavily on multitemporal rainfall data, which have not yet been fully compiled because of the limited number of rain stations. This study aims to examine the extent to which CHIRPS satellite data can be utilized to estimate the volume of rainwater harvesting in 16 sub-districts of Semarang and to determine their water security status. This study uses a descriptive statistical method based on spatial analyses. The method was developed through spatial modeling of rainfall using an isohyetal model. The parameters used are rainfall, residential rooftop area, administrative area, population, physiographic units and altitude. Validation is carried out using monthly data from 10 rain stations. The results show the level of validity achieved by utilizing CHIRPS satellite data and by mapping the rainfall distribution. This study also produces a map of the potential rainfall volume that can be harvested in the 16 sub-districts of Semarang.
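
    The harvesting-volume estimate itself reduces to a simple product of rainfall depth, rooftop area and a runoff coefficient. The sketch below shows that standard formula; the 0.8 runoff coefficient and the example numbers are illustrative assumptions, not values from the study.

    ```python
    def harvestable_volume_m3(rainfall_mm, roof_area_m2, runoff_coeff=0.8):
        """Rooftop rainwater harvesting potential for one sub-district.

        rainfall_mm  : rainfall depth over the period (mm), e.g., from CHIRPS
        roof_area_m2 : total residential rooftop area in the sub-district (m^2)
        runoff_coeff : fraction of rainfall actually captured (assumed value)
        """
        return (rainfall_mm / 1000.0) * roof_area_m2 * runoff_coeff

    # Example: 2,000 mm of annual rain over 1.2 km^2 of rooftops.
    print(f"{harvestable_volume_m3(2000, 1.2e6):,.0f} m^3 per year")
    ```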

  6. Deconstructing the timber volume paradigm in management of the Tongass National Forest.

    Treesearch

    John P. Caouette; Marc G. Kramer; Gregory J. Nowacki

    2000-01-01

    Timber volume information and associated maps have been widely used in the Tongass National Forest for land-use planning and timber and wildlife management. Although considerable effort has been expended to improve timber volume maps, little has been done to evaluate the suitability of timber volume as a descriptor of forest character. We established a rough indicator...

  7. Improving sand and gravel utilization and land-use planning. - 3D-modelling gravel resources with geospatial data.

    NASA Astrophysics Data System (ADS)

    Rolstad Libach, Lars; Wolden, Knut; Dagestad, Atle; Eskil Larsen, Bjørn

    2017-04-01

    The Norwegian aggregate industry produces approximately 14 million tons of sand and gravel aggregates annually, to a value of approximately 100 million Euros. Utilization of aggregates is often linked to land-use conflicts and complex environmental impacts at the extraction site. These topics are managed at the local municipal level in Norway. The Geological Survey of Norway has a database and a web map service with information about sand and gravel deposits of considerable volume, together with an evaluation of their importance. Some of the deposits cover large areas where land-use conflicts are high. To ease and improve land-use planning, safeguard other important resources such as groundwater, and support sustainable utilization of sand and gravel, there is a need for more detailed information on the important resources that have already been mapped. Detailed 3D models of gravel deposits are a tool for better land-use and resource management. By combining seismic, GPR and resistivity geophysical profile data, borehole data, Quaternary maps and lidar surface data, it has been possible to make 3D models of deposits and to investigate the possibilities for distinguishing different qualities and volumes. Good datasets and a detailed resource map are prerequisites for assessing geological resources for planners, extractors and neighbours. Future challenges lie in the use of often old geophysical data and in combining these datasets: what information can be extracted from depth data that actually supports a more detailed delineation of resources?

  8. Lightning activity observed in upper and lower portions of storms and its relationship to storm structure from VHF mapping and Doppler radar

    NASA Technical Reports Server (NTRS)

    Taylor, W. L.; Rust, W. D.; Macgorman, D. R.; Brandes, E. A.

    1983-01-01

    Space-time mapping of very high frequency (VHF) sources reveals that lightning processes for cloud-to-ground (CG) and large intracloud (IC) flashes are confined to altitudes below about 10 km and closely associated with the central high-reflectivity region of a storm. Another class of IC flashes was identified that produces a splattering of small sources within the main electrically active volume of a storm and also within a large divergent wind canopy at the top of a storm. There is no apparent temporal association between the small high-altitude IC flashes, which occur almost continuously, and the large IC and CG flashes, which occur sporadically in the lower portions of storms.

  9. Malignant induction probability maps for radiotherapy using X-ray and proton beams.

    PubMed

    Timlin, C; Houston, M; Jones, B

    2011-12-01

    The aim of this study was to display malignant induction probability (MIP) maps alongside dose distribution maps for radiotherapy using X-ray and charged particles such as protons. Dose distributions for X-rays and protons are used in an interactive MATLAB® program (MathWorks, Natick, MA). The MIP is calculated using a published linear quadratic model, which incorporates fractionation effects, cell killing and cancer induction as a function of dose, as well as relative biological effectiveness. Two virtual situations are modelled: (a) a tumour placed centrally in a cubic volume of normal tissue and (b) the same tumour placed closer to the skin surface. The MIP is calculated for a variety of treatment field options. The results show that, for protons, the MIP increases with the number of fields. In such cases, proton MIP can be higher than that for X-rays. Protons produce the lowest MIPs for superficial targets because of the lack of exit dose. The addition of a dose bath to all normal tissues increases the MIP by up to an order of magnitude. This exploratory study shows that it is possible to achieve three-dimensional displays of carcinogenesis risk. Treatment geometry, including the length and volume of tissue traversed by each beam, can influence MIP. Reducing the volume of tissue irradiated is advantageous, as reducing the number of cells at risk reduces the total MIP. This finding lends further support to the use of treatment gantries as well as the use of simpler field arrangements for particle therapy provided normal tissue tolerances are respected.
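
    The general shape of such LQ-based induction models (initiation proportional to dose, multiplied by LQ survival of the mutated cells, combined over voxels) can be sketched as below. This is a toy illustration with made-up parameters, not the published model used in the study.

    ```python
    import numpy as np

    def malignant_induction_probability(dose_gy, alpha=0.1, beta=0.01, k=1e-3):
        """Toy per-voxel malignant induction model (illustrative only).

        Induction ~ (initiating events proportional to dose) x (probability the
        mutated cell survives LQ cell killing), combined over irradiated voxels.
        dose_gy : array of total dose per voxel (Gy); alpha, beta, k are assumed values
        """
        d = np.asarray(dose_gy, dtype=float)
        per_voxel = k * d * np.exp(-(alpha * d + beta * d**2))
        # Combine voxel risks assuming independence.
        return 1.0 - np.prod(1.0 - per_voxel)

    # Example: a few-field plan (higher dose to less tissue) versus a many-field
    # plan that spreads a low-dose bath over more tissue.
    few_fields = np.concatenate([np.full(1000, 2.0), np.zeros(9000)])
    many_fields = np.full(10000, 0.2)
    print(malignant_induction_probability(few_fields),
          malignant_induction_probability(many_fields))
    ```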

  10. Feasibility of anomaly detection and characterization using trans-admittance mammography with 60 × 60 electrode array

    NASA Astrophysics Data System (ADS)

    Zhao, Mingkang; Wi, Hun; Lee, Eun Jung; Woo, Eung Je; In Oh, Tong

    2014-10-01

    Electrical impedance imaging has the potential to detect an early stage of breast cancer due to higher admittivity values compared with those of normal breast tissues. The tumor size and extent of axillary lymph node involvement are important parameters for evaluating the breast cancer survival rate. Additionally, anomaly characterization is required to distinguish a malignant tumor from a benign tumor. In order to overcome the limitations of breast cancer detection using impedance measurement probes, we developed a high-density trans-admittance mammography (TAM) system with a 60 × 60 electrode array and produced trans-admittance maps obtained at several frequency pairs. We applied the anomaly detection algorithm to the high-density TAM system to estimate the volume and position of breast tumors. We tested four different sizes of anomaly with three different conductivity contrasts at four different depths. From multifrequency trans-admittance maps, we can readily observe the transversal position and estimate the anomaly's volume and depth. In particular, the estimated depths were accurate and independent of anomaly size and conductivity contrast when applying the new formula based on the Laplacian of the trans-admittance map. The volume estimation was dependent on the conductivity contrast between anomaly and background in the breast phantom. We characterized two test anomalies using frequency-difference trans-admittance data to eliminate the dependency on anomaly position and size. We confirmed the anomaly detection and characterization algorithm with the high-density TAM system on bovine breast tissue. Both results showed the feasibility of detecting the size and position of an anomaly and of characterizing tissue for breast cancer screening.

  11. How unusual is the long-runout of the earthquake-triggered giant Luanshibao landslide, Tibetan Plateau, China?

    NASA Astrophysics Data System (ADS)

    Guo, Changbao; Zhang, Yongshuang; Montgomery, David R.; Du, Yuben; Zhang, Guangze; Wang, Shifeng

    2016-04-01

    In the Tibetan Plateau, active tectonic deformation triggers frequent earthquakes, and giant landslides associated with active faults produce serious consequences. A study of the characteristics and mechanism of a historical long-runout landslide in Luanshibao (LSB), Tibetan Plateau, China, finds a maximum sliding distance (L) of 3.83 km with an elevation drop (H) of 820 m. The landslide volume (V) was ~0.64-0.94 × 10^8 m^3, and it produced a long runout (H/L = 0.21). Recent surface offset along the sinistral strike-slip Litang-Dewu fault passes through the middle part of the landslide, which initiated on the hanging wall of the fault. Geological mapping, geophysical prospecting, trenching, and 14C dating together indicate that the LSB landslide occurred in jointed granite ca. 1980 ± 30 YBP, probably triggered by a large earthquake. Compilation of volume and runout distance data for this landslide and other previously published data for volcanic and nonvolcanic long-runout landslides yields a composite runout length-volume relation (L = 12.52V^0.37) that closely predicts runout of the LSB landslide, although substantial variation is noted in runout length around the central tendency.

  12. POVME 2.0: An Enhanced Tool for Determining Pocket Shape and Volume Characteristics

    PubMed Central

    2015-01-01

    Analysis of macromolecular/small-molecule binding pockets can provide important insights into molecular recognition and receptor dynamics. Since its release in 2011, the POVME (POcket Volume MEasurer) algorithm has been widely adopted as a simple-to-use tool for measuring and characterizing pocket volumes and shapes. We here present POVME 2.0, which is an order of magnitude faster, has improved accuracy, includes a graphical user interface, and can produce volumetric density maps for improved pocket analysis. To demonstrate the utility of the algorithm, we use it to analyze the binding pocket of RNA editing ligase 1 from the unicellular parasite Trypanosoma brucei, the etiological agent of African sleeping sickness. The POVME analysis characterizes the full dynamics of a potentially druggable transient binding pocket and so may guide future antitrypanosomal drug-discovery efforts. We are hopeful that this new version will be a useful tool for the computational- and medicinal-chemist community. PMID:25400521
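
    The grid-counting idea behind pocket-volume tools of this kind can be sketched as follows: lay a grid over an inclusion region, delete grid points that clash with receptor atoms, and multiply the surviving points by the per-point volume. This is an illustration of the general approach, with assumed radii and spacing, not the POVME 2.0 code.

    ```python
    import numpy as np

    def pocket_volume(atom_xyz, atom_radii, center, region_radius, spacing=0.5, probe=1.09):
        """Estimate a binding-pocket volume by grid-point counting.

        atom_xyz      : (N, 3) receptor atom coordinates (angstroms)
        atom_radii    : (N,) van der Waals radii
        center        : centre of the spherical inclusion region
        region_radius : radius of the inclusion region
        spacing       : grid spacing; probe: extra clearance added to each atom radius
        """
        center = np.asarray(center, dtype=float)
        lo, hi = center - region_radius, center + region_radius
        axes = [np.arange(l, h + spacing, spacing) for l, h in zip(lo, hi)]
        grid = np.stack(np.meshgrid(*axes, indexing="ij"), axis=-1).reshape(-1, 3)
        # Keep only points inside the inclusion sphere.
        grid = grid[np.linalg.norm(grid - center, axis=1) <= region_radius]
        # Remove points that clash with any receptor atom.
        d2 = ((grid[:, None, :] - atom_xyz[None, :, :]) ** 2).sum(-1)
        free = ~(d2 <= (atom_radii + probe)[None, :] ** 2).any(axis=1)
        return free.sum() * spacing ** 3   # volume in cubic angstroms

    # Example: an empty 5-angstrom spherical region should give ~ (4/3)*pi*125 ≈ 524 A^3.
    print(pocket_volume(np.zeros((1, 3)) + 100.0, np.array([1.7]),
                        center=(0.0, 0.0, 0.0), region_radius=5.0))
    ```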

  13. Automatized spleen segmentation in non-contrast-enhanced MR volume data using subject-specific shape priors

    NASA Astrophysics Data System (ADS)

    Gloger, Oliver; Tönnies, Klaus; Bülow, Robin; Völzke, Henry

    2017-07-01

    To develop the first fully automated 3D spleen segmentation framework derived from T1-weighted magnetic resonance (MR) imaging data and to verify its performance for spleen delineation and volumetry. This approach addresses the issue of low contrast between spleen and adjacent tissue in non-contrast-enhanced MR images. Native T1-weighted MR volume data were acquired on a 1.5 T MR system in an epidemiological study. We analyzed random subsamples of MR examinations without pathologies to develop and verify the spleen segmentation framework. The framework is modularized to include different kinds of prior knowledge in the segmentation pipeline. Classification by support vector machines differentiates between five different shape types in computed foreground probability maps and recognizes characteristic spleen regions in axial slices of MR volume data. A spleen-shape space generated by training produces subject-specific prior shape knowledge that is then incorporated into a final 3D level set segmentation method. Individually adapted shape-driven forces as well as image-driven forces resulting from refined foreground probability maps steer the level set successfully to segment the spleen. The framework achieves promising segmentation results with mean Dice coefficients of nearly 0.91 and low volumetric mean errors of 6.3%. The presented spleen segmentation approach can delineate spleen tissue in native MR volume data. Several kinds of prior shape knowledge, including subject-specific 3D prior shape knowledge, can be used to guide segmentation processes and achieve promising results.
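
    The reported accuracy figures rest on two standard measures, the Dice overlap and the relative volume error, which can be computed from binary segmentation volumes as in the minimal sketch below (illustrative helpers, not part of the authors' framework).

    ```python
    import numpy as np

    def dice_coefficient(seg, ref):
        """Dice overlap between two binary volumes (True/1 = spleen)."""
        seg, ref = np.asarray(seg, bool), np.asarray(ref, bool)
        intersection = np.logical_and(seg, ref).sum()
        denom = seg.sum() + ref.sum()
        return 2.0 * intersection / denom if denom else 1.0

    def relative_volume_error(seg, ref, voxel_volume_ml=1.0):
        """Relative volume error (%) of a segmentation against a reference."""
        v_seg = np.asarray(seg, bool).sum() * voxel_volume_ml
        v_ref = np.asarray(ref, bool).sum() * voxel_volume_ml
        return 100.0 * abs(v_seg - v_ref) / v_ref
    ```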

  14. Integration of rainfall/runoff and geomorphological analyses flood hazard in small catchments: case studies from the southern Apennines (Italy)

    NASA Astrophysics Data System (ADS)

    Palumbo, Manuela; Ascione, Alessandra; Santangelo, Nicoletta; Santo, Antonio

    2017-04-01

    We present the first results of an analysis of flood hazard in ungauged mountain catchments that are associated with intensely urbanized alluvial fans. Assessment of hydrological hazard has been based on the integration of rainfall/runoff modelling of drainage basins with geomorphological analysis and mapping. Some small, steep, ungauged mountain catchments located in various areas of the southern Apennines, in southern Italy, have been chosen as test sites. In recent centuries, the selected basins have been subject to heavy and intense precipitation events, which have caused flash floods with serious damage in the associated alluvial fan areas. Available spatial information (regional technical maps, DEMs, land use maps, geological/lithological maps, orthophotos) and an automated GIS-based procedure (ArcGIS tools and ArcHydro tools) have been used to extract morphological, hydrological and hydraulic parameters. These parameters have been used to run the HEC (Hydrologic Engineering Center of the US Army Corps of Engineers) software (GeoHMS, GeoRAS, HMS and RAS), based on rainfall-runoff models, which has allowed the hydrological and hydraulic simulations. As the floods that occurred in the studied catchments were debris-flow dominated, solid load simulation has also been performed. In order to validate the simulations, we have compared the modelling results with the effects produced by past floods. Such effects have been quantified through estimates of both the sediment volumes within each catchment that have the potential to be mobilised during a sediment transfer event (pre-event), and the volume of sediment delivered by the debris flows at the basins' outlets (post-event). The post-event sediment volume has been quantified through post-event surveys and Lidar data. Evaluation of the pre-event sediment volumes in individual catchments has been based on mapping of sediment storages that may constitute source zones of bed load transport and debris flows. This approach uses a process-based geomorphological mapping methodology, based on data derived from GIS analysis of high-resolution DEMs, field measurements and aerial photograph interpretation. Our integrated approach, which allows quantification of the flow rate and a semi-quantitative assessment of the sediment that can be mobilized during hydro-meteorological events, is applied for the first time to torrential catchments of the southern Apennines and may significantly contribute to forecasting studies aimed at risk mitigation in the study region.

  15. Lighting design for globally illuminated volume rendering.

    PubMed

    Zhang, Yubo; Ma, Kwan-Liu

    2013-12-01

    With the evolution of graphics hardware, high quality global illumination becomes available for real-time volume rendering. Compared to local illumination, global illumination can produce realistic shading effects that are closer to real-world scenes and has proven useful for enhancing volume data visualization, enabling better depth and shape perception. However, setting up optimal lighting can be a nontrivial task for average users, and previous lighting design work for volume visualization did not consider global light transport. In this paper, we present a lighting design method for volume visualization employing global illumination. The resulting system takes into account view- and transfer-function-dependent content of the volume data to automatically generate an optimized three-point lighting environment. Our method fully exploits the back light, which is not used by previous volume visualization systems. By also including global shadows and multiple scattering, our lighting system can effectively enhance the depth and shape perception of volumetric features of interest. In addition, we propose an automatic tone mapping operator which recovers visual details from overexposed areas while maintaining sufficient contrast in the dark areas. We show that our method is effective for visualizing volume datasets with complex structures. The structural information is more clearly and correctly presented under the automatically generated light sources.
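
    The paper proposes its own automatic tone mapping operator; as a generic illustration of what a global tone mapping step does, the sketch below implements the classic Reinhard operator (luminance compression scaled by the log-average luminance). It is not the operator described in the abstract, and the Rec. 709 luminance weights and key value are conventional assumptions.

    ```python
    import numpy as np

    def reinhard_tonemap(hdr_rgb, key=0.18, eps=1e-6):
        """Classic Reinhard global tone mapping (illustration only).

        hdr_rgb : float array (H, W, 3) of linear high-dynamic-range radiance values
        key     : target 'middle grey' of the mapped image
        """
        # Luminance of each pixel (Rec. 709 weights).
        lum = 0.2126 * hdr_rgb[..., 0] + 0.7152 * hdr_rgb[..., 1] + 0.0722 * hdr_rgb[..., 2]
        # Scale by the log-average luminance, then compress to [0, 1).
        log_avg = np.exp(np.mean(np.log(lum + eps)))
        scaled = key * lum / (log_avg + eps)
        mapped = scaled / (1.0 + scaled)
        # Apply the luminance change back to the colour channels.
        ratio = (mapped / (lum + eps))[..., None]
        return np.clip(hdr_rgb * ratio, 0.0, 1.0)
    ```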

  16. Using MODIS and GLAS Data to Develop Timber Volume Estimates in Central Siberia

    NASA Technical Reports Server (NTRS)

    Ranson, K. Jon; Kimes, Daniel; Sun, Guoqing; Kharuk, Viatcheslav; Hyde, Peter; Nelson, Ross

    2007-01-01

    The boreal forest is the Earth's largest terrestrial biome, covering some 12 million km^2 and accounting for about one third of this planet's total forest area. Mapping of boreal forest type, structure parameters and biomass is critical for understanding the boreal forest's significance in the carbon cycle and its response to and impact on global climate change. Ground-based forest inventories carry much uncertainty, particularly in remote areas of Siberia where sampling is sparse or lacking. In addition, many of the forest inventories that do exist for Siberia are now a decade or more old. Thus, available forest inventories fail to capture current conditions. Forest structure in a particular forest type and region can change significantly due to changing environmental conditions and natural and anthropogenic disturbance. Remote sensing methods can potentially overcome these problems. Multispectral sensors can be used to provide vegetation cover maps that show a timely and accurate geographic distribution of vegetation types rather than decade-old ground-based maps. Lidar sensors can be used to directly obtain measurements from which critical forest structure information (e.g., height, density, and volume) can be derived. These in turn can be used to estimate biomass components using allometric equations without having to rely on outdated forest inventories. Finally, remote sensing data are ideally suited to provide a sampling basis for a rigorous statistical estimate of the variance and error bounds on forest structure measures. In this study, new remote sensing methods were applied to develop estimates of timber volume using NASA's MODerate resolution Imaging Spectroradiometer (MODIS) and the unique waveform data of the Geoscience Laser Altimeter System (GLAS) for a 10 deg x 10 deg area in central Siberia. Using MODIS and GLAS data, maps were produced for cover type and timber volume for 2003, and a realistic variance (error bound) for timber volume was calculated for the study area. In this study we used only GLAS footprints that had a slope value of less than 10 deg. This was done to avoid large errors due to the effect of slope on the GLAS models. The method requires the integration of new remote sensing methods with available ground studies of forest timber volume conducted in Russian forests. The results were compared to traditional ground forest inventory methods reported in the literature and to ground truth collected in the study area.

  17. Hekla Volcano, Iceland, in the 20th Century: Lava Volumes, Production Rates, and Effusion Rates

    NASA Astrophysics Data System (ADS)

    Pedersen, G. B. M.; Belart, J. M. C.; Magnússon, E.; Vilmundardóttir, O. K.; Kizel, F.; Sigurmundsson, F. S.; Gísladóttir, G.; Benediktsson, J. A.

    2018-02-01

    Lava flow thicknesses, volumes, and effusion rates provide essential information for understanding the behavior of eruptions and their associated deformation signals. Preeruption and posteruption elevation models were generated from historical stereo photographs to produce the lava flow thickness maps for the last five eruptions at Hekla volcano, Iceland. These results provide precise estimation of lava bulk volumes: V1947-1948 = 0.742 ± 0.138 km^3, V1970 = 0.205 ± 0.012 km^3, V1980-1981 = 0.169 ± 0.016 km^3, V1991 = 0.241 ± 0.019 km^3, and V2000 = 0.095 ± 0.005 km^3, and reveal variable production rate through the 20th century. These new volumes improve the linear correlation between erupted volume and coeruption tilt change, indicating that tilt may be used to determine eruption volume. During eruptions the active vents migrate 325-480 m downhill, suggesting rough excess pressures of 8-12 MPa and that the gradient of this excess pressure increases from 0.4 to 11 Pa s^-1 during the 20th century. We suggest that this is related to increased resistance along the eruptive conduit.
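
    The bulk-volume step follows directly from differencing the pre- and post-eruption elevation models: the thickness map is summed over the flow area and multiplied by the cell area. The sketch below shows that calculation with a very simple, assumed uncorrelated-error propagation; it is illustrative only, not the authors' photogrammetric workflow or uncertainty treatment.

    ```python
    import numpy as np

    def lava_volume_km3(dem_pre, dem_post, cell_size_m, thickness_sigma_m=0.0):
        """Bulk lava volume from pre-/post-eruption DEMs on the same grid.

        dem_pre, dem_post : 2-D elevation arrays (m); NaN outside the lava flow
        cell_size_m       : DEM cell size (m)
        thickness_sigma_m : per-cell thickness uncertainty (m), assumed uncorrelated
        """
        thickness = dem_post - dem_pre
        inside = np.isfinite(thickness)
        volume_m3 = np.sum(np.where(inside, thickness, 0.0)) * cell_size_m ** 2
        # Simple (optimistic) error propagation for uncorrelated per-cell errors.
        sigma_m3 = np.sqrt(inside.sum()) * thickness_sigma_m * cell_size_m ** 2
        return volume_m3 / 1e9, sigma_m3 / 1e9
    ```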

  18. EOS Data Products Handbook. Volume 2

    NASA Technical Reports Server (NTRS)

    Parkinson, Claire L. (Editor); Greenstone, Reynold (Editor); Closs, James (Technical Monitor)

    2000-01-01

    The EOS Data Products Handbook provides brief descriptions of the data products that will be produced from a range of missions of the Earth Observing System (EOS) and associated projects. Volume 1, originally published in 1997, covers the Tropical Rainfall Measuring Mission (TRMM), the Terra mission (formerly named EOS AM-1), and the Data Assimilation System, while this volume, Volume 2, covers the Active Cavity Radiometer Irradiance Monitor Satellite (ACRIMSAT), Aqua, Jason-1, Landsat 7, Meteor 3M/Stratospheric Aerosol and Gas Experiment III (SAGE III), the Quick Scatterometer (QuikScat), the Quick Total Ozone Mapping Spectrometer (Quik-TOMS), and the Vegetation Canopy Lidar (VCL) missions. Volume 2 follows closely the format of Volume 1, providing a list of products and an introduction and overview descriptions of the instruments and data processing, all introductory to the core of the book, which presents the individual data product descriptions, organized into 11 topical chapters. The product descriptions are followed by five appendices, which provide contact information for the EOS data centers that will be archiving and distributing the data sets, contact information for the science points of contact for the data products, references, acronyms and abbreviations, and a data products index.

  19. Automated Classification of Thermal Infrared Spectra Using Self-organizing Maps

    NASA Technical Reports Server (NTRS)

    Roush, Ted L.; Hogan, Robert

    2006-01-01

    Existing and planned space missions to a variety of planetary and satellite surfaces produce an ever increasing volume of spectral data. Understanding the scientific informational content in this large data volume is a daunting task. Fortunately, various statistical approaches are available to assess such data sets. Here we discuss an automated classification scheme based on Kohonen self-organizing maps (SOM) that we have developed. The SOM process produces an output layer where spectra having similar properties lie in close proximity to each other. One major effort is partitioning this output layer into appropriate regions. This is performed by defining closed regions based upon the strength of the boundaries between adjacent cells in the SOM output layer. We use the Davies-Bouldin index as a measure of the inter-class similarities and intra-class dissimilarities that determines the optimum partition of the output layer, and hence the number of SOM clusters. This allows us to identify the natural number of clusters formed from the spectral data. Mineral spectral libraries prepared at Arizona State University (ASU) and Johns Hopkins University (JHU) are used to test and evaluate the classification scheme. We label the library sample spectra in a hierarchical scheme with class, subclass, and mineral group names. We use a portion of the spectra to train the SOM, i.e. produce the output layer, while the remaining spectra are used to test the SOM. The test spectra are presented to the SOM output layer and assigned membership to the appropriate cluster. We then evaluate these assignments to assess the scientific meaning and accuracy of the derived SOM classes as they relate to the labels. We demonstrate that unsupervised classification by SOMs can be a useful component in autonomous systems designed to identify mineral species from reflectance and emissivity spectra in the thermal IR.
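
    The overall flow (train a SOM, partition its output layer, pick the number of clusters with the Davies-Bouldin index, then assign spectra via their best-matching nodes) can be sketched as below. The sketch uses the third-party MiniSom library and simplifies the partitioning step to k-means on the node weights rather than the boundary-strength method described in the abstract, so it is an assumption-laden illustration rather than the authors' scheme.

    ```python
    import numpy as np
    from minisom import MiniSom                      # third-party SOM library (illustrative choice)
    from sklearn.cluster import KMeans
    from sklearn.metrics import davies_bouldin_score

    def som_cluster_spectra(spectra, grid=(10, 10), candidate_k=range(2, 11)):
        """Train a SOM on spectra, then partition the output layer into clusters."""
        spectra = np.asarray(spectra, dtype=float)
        som = MiniSom(grid[0], grid[1], spectra.shape[1], sigma=1.0, learning_rate=0.5)
        som.train_random(spectra, 5000)

        # Group SOM node weights and use the Davies-Bouldin index (lower is better)
        # to choose the number of clusters.
        weights = som.get_weights().reshape(-1, spectra.shape[1])
        best_k = min(candidate_k,
                     key=lambda k: davies_bouldin_score(
                         weights, KMeans(n_clusters=k, n_init=10).fit_predict(weights)))
        node_labels = KMeans(n_clusters=best_k, n_init=10).fit_predict(weights).reshape(grid)

        # Assign each spectrum the cluster of its best-matching SOM node.
        return np.array([node_labels[som.winner(s)] for s in spectra]), best_k
    ```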

  20. Arbitrary-Order Conservative and Consistent Remapping and a Theory of Linear Maps: Part II

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ullrich, Paul A.; Devendran, Dharshi; Johansen, Hans

    2016-04-01

    The focus of this series of articles is on the generation of accurate, conservative, consistent, and (optionally) monotone linear offline maps. This paper is the second in the series. It extends the first part by describing four examples of 2D linear maps that can be constructed in accordance with the theory of the earlier work. The focus is again on spherical geometry, although these techniques can be readily extended to arbitrary manifolds. The four maps include conservative, consistent, and (optionally) monotone linear maps (i) between two finite-volume meshes, (ii) from finite-volume to finite-element meshes using a projection-type approach, (iii) from finite-volume to finite-element meshes using volumetric integration, and (iv) between two finite-element meshes. Arbitrary order of accuracy is supported for each of the described nonmonotone maps.
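
    The conservative-and-consistent property is easiest to see in one dimension: target cell averages are overlap-weighted combinations of source cell averages, so integrals are preserved and constants map to constants. The sketch below is a first-order 1-D illustration of that idea only; the maps described in the paper are two-dimensional, spherical and of arbitrary order.

    ```python
    import numpy as np

    def conservative_remap_1d(src_edges, src_avg, tgt_edges):
        """First-order conservative remap of cell averages between 1-D meshes.

        src_edges : (N+1,) increasing cell edges of the source mesh
        src_avg   : (N,)  cell-average values on the source mesh
        tgt_edges : (M+1,) increasing cell edges of the target mesh (same span)
        """
        src_edges, tgt_edges = np.asarray(src_edges, float), np.asarray(tgt_edges, float)
        tgt_avg = np.zeros(len(tgt_edges) - 1)
        for j in range(len(tgt_avg)):
            lo, hi = tgt_edges[j], tgt_edges[j + 1]
            # Overlap of target cell j with every source cell.
            overlap = np.clip(np.minimum(hi, src_edges[1:]) - np.maximum(lo, src_edges[:-1]),
                              0.0, None)
            tgt_avg[j] = np.dot(overlap, src_avg) / (hi - lo)
        return tgt_avg

    # Conservation check: the total integral is preserved; a constant maps to itself.
    src_e, tgt_e = np.linspace(0, 1, 11), np.linspace(0, 1, 7)
    src = np.sin(np.pi * (src_e[:-1] + src_e[1:]) / 2)
    tgt = conservative_remap_1d(src_e, src, tgt_e)
    assert np.isclose(np.dot(np.diff(src_e), src), np.dot(np.diff(tgt_e), tgt))
    ```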

  1. Recovery of chemical Estimates by Field Inhomogeneity Neighborhood Error Detection (REFINED): Fat/Water Separation at 7T

    PubMed Central

    Narayan, Sreenath; Kalhan, Satish C.; Wilson, David L.

    2012-01-01

    Purpose: To reduce swaps in fat-water separation methods, a particular issue on 7T small animal scanners due to field inhomogeneity, using image postprocessing innovations that detect and correct errors in the B0 field map. Materials and Methods: Fat-water decompositions and B0 field maps were computed for images of mice acquired on a 7T Bruker BioSpec scanner, using a computationally efficient method for solving the Markov Random Field formulation of the multi-point Dixon model. The B0 field maps were processed with a novel hole-filling method, based on edge strength between regions, and a novel k-means method, based on field-map intensities, which were iteratively applied to automatically detect and reinitialize error regions in the B0 field maps. Errors were manually assessed in the B0 field maps and chemical parameter maps both before and after error correction. Results: Partial swaps were found in 6% of images when processed with FLAWLESS. After REFINED correction, only 0.7% of images contained partial swaps, resulting in an 88% decrease in error rate. Complete swaps were not problematic. Conclusion: Ex post facto error correction is a viable supplement to a priori techniques for producing globally smooth B0 field maps, without partial swaps. With our processing pipeline, it is possible to process image volumes rapidly, robustly, and almost automatically. PMID:23023815

  2. Recovery of chemical estimates by field inhomogeneity neighborhood error detection (REFINED): fat/water separation at 7 tesla.

    PubMed

    Narayan, Sreenath; Kalhan, Satish C; Wilson, David L

    2013-05-01

    To reduce swaps in fat-water separation methods, a particular issue on 7 Tesla (T) small animal scanners due to field inhomogeneity, using image postprocessing innovations that detect and correct errors in the B0 field map. Fat-water decompositions and B0 field maps were computed for images of mice acquired on a 7T Bruker BioSpec scanner, using a computationally efficient method for solving the Markov Random Field formulation of the multi-point Dixon model. The B0 field maps were processed with a novel hole-filling method, based on edge strength between regions, and a novel k-means method, based on field-map intensities, which were iteratively applied to automatically detect and reinitialize error regions in the B0 field maps. Errors were manually assessed in the B0 field maps and chemical parameter maps both before and after error correction. Partial swaps were found in 6% of images when processed with FLAWLESS. After REFINED correction, only 0.7% of images contained partial swaps, resulting in an 88% decrease in error rate. Complete swaps were not problematic. Ex post facto error correction is a viable supplement to a priori techniques for producing globally smooth B0 field maps, without partial swaps. With our processing pipeline, it is possible to process image volumes rapidly, robustly, and almost automatically. Copyright © 2012 Wiley Periodicals, Inc.

  3. WE-FG-206-05: New Arterial Spin Labeling Method for Simultaneous Estimation of Arterial Cerebral Blood Volume, Cerebral Blood Flow and Arterial Transit Time

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnston, M; Whitlow, C; Jung, Y

    Purpose: To demonstrate the feasibility of a novel Arterial Spin Labeling (ASL) method for simultaneously measuring cerebral blood flow (CBF), arterial transit time (ATT), and arterial cerebral blood volume (aCBV) without the use of a contrast agent. Methods: A series of multi-TI ASL images were acquired from one healthy subject on a 3T Siemens Skyra, with the following parameters: PCASL labeling with variable TI [300, 400, 500, 600, 700, 800, 900, 1000, 1500, 2000, 2500, 3000, 3500, 4000] ms, labeling bolus 1400 ms when TI allows, otherwise 100 ms less than TI, TR was minimized for each TI, two sinc-shaped pre-saturation pulses were applied in the imaging plane immediately before 2D EPI acquisition. 64×64×24 voxels, 5 mm slice thickness, 1 mm gap, full brain coverage, 6 averages per TI, no crusher gradients, 11 ms TE, scan time of 4:56. The perfusion weighted time-series was created for each voxel and fit to a novel model. The model has two components: 1) the traditional model developed by Buxton et al., accounting for CBF and ATT, and 2) a box car function characterizing the width of the labeling bolus, with variable timing and height in proportion to the aCBV. All three parameters were fit using a nonlinear fitting routine that constrained all parameters to be positive. The main purpose of the high-temporal resolution TI sampling for the first second of data acquisition was to precisely estimate the blood volume component for better detection of arrival time and magnitude of signal. Results: Whole brain maps of CBF, ATT, and aCBV were produced, and all three parameter maps are consistent with similar maps described in the literature. Conclusion: Simultaneous mapping of CBF, ATT, and aCBV is feasible with a clinically tractable scan time (under 5 minutes).

  4. Middle Atmosphere Program. Handbook for MAP, volume 11

    NASA Technical Reports Server (NTRS)

    Sechrist, C. F., Jr. (Editor)

    1984-01-01

    An overview is presented of the research activities and objectives of the Middle Atmosphere Program (MAP). Status reports are presented of projects underway in the areas of middle atmosphere climatology and atmospheric chemistry; condensed minutes of MAP steering committee meetings are also contained in this volume. Research recommendations for increased U.S. participation in the Middle Atmosphere Program are given.

  5. Automatic Perceptual Color Map Generation for Realistic Volume Visualization

    PubMed Central

    Silverstein, Jonathan C.; Parsad, Nigel M.; Tsirline, Victor

    2008-01-01

    Advances in computed tomography imaging technology and inexpensive high-performance computer graphics hardware are making high-resolution, full color (24-bit) volume visualizations commonplace. However, many of the color maps used in volume rendering provide questionable value in knowledge representation and are non-perceptual, thus biasing data analysis or even obscuring information. These drawbacks, coupled with our need for realistic anatomical volume rendering for teaching and surgical planning, have motivated us to explore the auto-generation of color maps that combine natural colorization with the perceptual discriminating capacity of grayscale. As evidenced by the examples created by the algorithm described here, the merging of perceptually accurate and realistically colorized virtual anatomy appears to insightfully interpret and impartially enhance volume-rendered patient data. PMID:18430609
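
    One way to make the 'perceptual discriminating capacity of grayscale' requirement concrete is to check that a candidate color map's luminance increases monotonically along the data range, as in the sketch below. The Rec. 709 luminance weights are a conventional assumption, and this check is only an illustration, not the paper's color-map auto-generation algorithm.

    ```python
    import numpy as np

    def luminance_is_monotone(colormap_rgb):
        """Check that a color map's grayscale luminance increases monotonically.

        colormap_rgb : (N, 3) array of RGB samples in [0, 1], ordered from the
                       low end to the high end of the data range.
        """
        rgb = np.asarray(colormap_rgb, dtype=float)
        luminance = rgb @ np.array([0.2126, 0.7152, 0.0722])   # Rec. 709 weights
        return bool(np.all(np.diff(luminance) >= 0.0)), luminance

    # Example: a naive blue-to-red ramp through green is not luminance-monotone.
    t = np.linspace(0.0, 1.0, 256)
    ramp = np.stack([t, 1.0 - np.abs(2 * t - 1.0), 1.0 - t], axis=1)
    print(luminance_is_monotone(ramp)[0])   # False
    ```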

  6. Modeling the October 2005 lahars at Panabaj (Guatemala)

    NASA Astrophysics Data System (ADS)

    Charbonnier, S. J.; Connor, C. B.; Connor, L. J.; Sheridan, M. F.; Oliva Hernández, J. P.; Richardson, J. A.

    2018-01-01

    An extreme rainfall event in October of 2005 triggered two deadly lahars on the flanks of Tolimán volcano (Guatemala) that caused many fatalities in the village of Panabaj. We mapped the deposits of these lahars, then developed computer simulations of the lahars using the geologic data and compared simulated area inundated by the flows to mapped area inundated. Computer simulation of the two lahars was dramatically improved after calibration with geological data. Specifically, detailed field measurements of flow inundation area, flow thickness, flow direction, and velocity estimates, collected after lahar emplacement, were used to calibrate the rheological input parameters for the models, including deposit volume, yield strength, sediment and water concentrations, and Manning roughness coefficients. Simulations of the two lahars, with volumes of 240,200 ± 55,400 and 126,000 ± 29,000 m3, using the FLO-2D computer program produced models of lahar runout within 3% of measured runouts and produced reasonable estimates of flow thickness and velocity along the lengths of the simulated flows. We compare areas inundated using the Jaccard fit, model sensitivity, and model precision metrics, all related to Bayes' theorem. These metrics show that false negatives (areas inundated by the observed lahar where not simulated) and false positives (areas not inundated by the observed lahar where inundation was simulated) are reduced using a model calibrated by rheology. The metrics offer a procedure for tuning model performance that will enhance model accuracy and make numerical models a more robust tool for natural hazard reduction.
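
    The map-comparison metrics named above (Jaccard fit, model sensitivity, model precision) can be computed directly from binary observed and simulated inundation rasters, as in the minimal sketch below; variable names and the raster representation are illustrative assumptions.

    ```python
    import numpy as np

    def inundation_metrics(simulated, observed):
        """Compare binary inundation maps (True = inundated cell)."""
        sim, obs = np.asarray(simulated, bool), np.asarray(observed, bool)
        tp = np.logical_and(sim, obs).sum()          # correctly simulated inundation
        fp = np.logical_and(sim, ~obs).sum()         # false positives (over-prediction)
        fn = np.logical_and(~sim, obs).sum()         # false negatives (missed inundation)
        return {
            "jaccard_fit": tp / (tp + fp + fn),      # overlap / union of inundated areas
            "sensitivity": tp / (tp + fn),           # fraction of observed area captured
            "precision": tp / (tp + fp),             # fraction of simulated area confirmed
        }
    ```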

  7. The big data challenges of connectomics.

    PubMed

    Lichtman, Jeff W; Pfister, Hanspeter; Shavit, Nir

    2014-11-01

    The structure of the nervous system is extraordinarily complicated because individual neurons are interconnected to hundreds or even thousands of other cells in networks that can extend over large volumes. Mapping such networks at the level of synaptic connections, a field called connectomics, began in the 1970s with the study of the small nervous system of a worm and has recently garnered general interest thanks to technical and computational advances that automate the collection of electron-microscopy data and offer the possibility of mapping even large mammalian brains. However, modern connectomics produces 'big data', unprecedented quantities of digital information at unprecedented rates, and will require, as with genomics at the time, breakthrough algorithmic and computational solutions. Here we describe some of the key difficulties that may arise and provide suggestions for managing them.

  8. Robust reconstruction of B1(+) maps by projection into a spherical functions space.

    PubMed

    Sbrizzi, Alessandro; Hoogduin, Hans; Lagendijk, Jan J; Luijten, Peter; van den Berg, Cornelis A T

    2014-01-01

    Several parallel transmit MRI techniques require knowledge of the transmit radiofrequency field profiles (B1(+)). During the past years, various methods have been developed to acquire this information. Often, these methods suffer from long measurement times and produce maps exhibiting regions with poor signal-to-noise ratio and artifacts. In this article, a model-based reconstruction procedure is introduced that improves the robustness of B1(+) mapping. The missing information from undersampled B1(+) maps and the regions of poor signal to noise ratio are reconstructed through projection into the space of spherical functions that arise naturally from the solution of the Helmholtz equations in the spherical coordinate system. As a result, B1(+) data over a limited range of the field of view/volume is sufficient to reconstruct the B1(+) over the full spatial domain in a fast and robust way. The same model is exploited to filter the noise of the measured maps. Results from simulations and in vivo measurements confirm the validity of the proposed method. A spherical functions model can well approximate the magnetic fields inside the body with few basis terms. Exploiting this compression capability, B1(+) maps are reconstructed in regions of unknown or corrupted values. Copyright © 2013 Wiley Periodicals, Inc.
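
    The underlying idea, fitting the reliable part of a measured map to a small set of smooth basis functions and evaluating the fit everywhere, can be sketched with an ordinary least-squares projection. The sketch below uses a low-order 3-D polynomial basis purely as a stand-in for the spherical (Helmholtz-solution) basis used in the paper, so it illustrates the projection step but not the paper's actual basis.

    ```python
    import numpy as np

    def fit_smooth_basis(coords, values, mask, order=2):
        """Fit map values on reliable voxels to a low-order polynomial basis,
        then evaluate the fit on every voxel (filling unreliable regions).

        coords : (N, 3) voxel coordinates; values : (N,) measured map values
        mask   : (N,) boolean, True where the measurement is reliable
        """
        x, y, z = np.asarray(coords, float).T
        # Monomial basis up to the requested total order (order=2 -> 10 terms).
        powers = [(i, j, k) for i in range(order + 1)
                  for j in range(order + 1) for k in range(order + 1)
                  if i + j + k <= order]
        A = np.stack([x**i * y**j * z**k for i, j, k in powers], axis=1)
        coeffs, *_ = np.linalg.lstsq(A[mask], np.asarray(values, float)[mask], rcond=None)
        return A @ coeffs   # smooth reconstruction over the full volume
    ```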

  9. The genetic control of milling yield, dough rheology and baking quality of wheat.

    PubMed

    Kuchel, H; Langridge, P; Mosionek, L; Williams, K; Jefferies, S P

    2006-05-01

    Improving the end-use quality of wheat is a key target for many breeding programmes. With the exception of the relationship between glutenin alleles and some dough rheological characters, knowledge concerning the genetic control of wheat quality traits is somewhat limited. A doubled haploid population produced from a cross between two Australian cultivars 'Trident' and 'Molineux' has been used to construct a linkage map based largely on microsatellite molecular markers. 'Molineux' is superior to 'Trident' for a number of milling, dough rheology and baking quality characteristics, although by international standards 'Trident' would still be regarded as possessing moderately good end-use quality. This population was therefore deemed useful for investigation of wheat end-use quality. A number of significant QTL identified for dough rheological traits mapped to HMW and LMW glutenin loci on chromosomes 1A and 1B. However, QTL associated with dough strength and loaf volume were also identified on chromosome 2A and a significant QTL associated with loaf volume and crumb quality was identified on chromosome 3A. A QTL for flour protein content and milling yield was identified on chromosome 6A and a QTL associated with flour colour reported previously on chromosome 7B was confirmed in this population. The detection of loci affecting dough strength, loaf volume and flour protein content may provide fresh opportunities for the application of marker-assisted selection to improve bread-making quality.

  10. Reproducibility of isopach data and estimates of dispersal and eruption volumes

    NASA Astrophysics Data System (ADS)

    Klawonn, M.; Houghton, B. F.; Swanson, D.; Fagents, S. A.; Wessel, P.; Wolfe, C. J.

    2012-12-01

    Total erupted volume and deposit thinning relationships are key parameters in characterizing explosive eruptions and evaluating the potential risk from a volcano as well as inputs to volcanic plume models. Volcanologists most commonly estimate these parameters by hand-contouring deposit data, then representing these contours in thickness versus square root area plots, fitting empirical laws to the thinning relationships and integrating over the square root area to arrive at volume estimates. In this study we analyze the extent to which variability in hand-contouring thickness data for pyroclastic fall deposits influences the resulting estimates and investigate the effects of different fitting laws. 96 volcanologists (3% MA students, 19% PhD students, 20% postdocs, 27% professors, and 30% professional geologists) from 11 countries (Australia, Ecuador, France, Germany, Iceland, Italy, Japan, New Zealand, Switzerland, UK, USA) participated in our study and produced hand-contours on identical maps using our unpublished thickness measurements of the Kilauea Iki 1959 fall deposit. We computed volume estimates by (A) integrating over a surface fitted through the contour lines, as well as using the established methods of integrating over the thinning relationships of (B) an exponential fit with one to three segments, (C) a power law fit, and (D) a Weibull function fit. To focus on the differences from the hand-contours of the well constrained deposit and eliminate the effects of extrapolations to great but unmeasured thicknesses near the vent, we removed the volume contribution of the near vent deposit (defined as the deposit above 3.5 m) from the volume estimates. The remaining volume is approximately 1.76 × 10^6 m^3 (geometric mean for all methods), with maximum and minimum estimates of 2.5 × 10^6 m^3 and 1.1 × 10^6 m^3. Different integration methods of identical isopach maps result in volume estimate differences of up to 50% and, on average, maximum variation between integration methods of 14%. Volume estimates with methods (A), (C) and (D) show strong correlation (r = 0.8 to r = 0.9), while correlation of (B) with the other methods is weaker (r = 0.2 to r = 0.6) and correlation between (B) and (C) is not statistically significant. We find that the choice of larger maximum contours leads to smaller volume estimates with method (C), but larger estimates with the other methods. We do not find statistically significant correlation between volume estimates and participants' experience level, number of chosen contour levels, or smoothness of contours. Overall, application of the different methods to the same maps leads to similar mean volume estimates, but the different methods show different dependencies and varying spread of volume estimates. The results indicate that these key parameters are less critically dependent on the operator and their choices of contour values, intervals etc., and more sensitive to the selection of technique to integrate these data.
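
    For readers unfamiliar with the integration step, a minimal sketch of the single-segment exponential method (method B) is given below, with made-up isopach values standing in for real data: fit ln(thickness) against the square root of isopach area, then integrate the fitted exponential analytically, which for a single segment gives V = 2*T0/k^2.

        # Minimal sketch of the single-segment exponential method (method B):
        # fit ln(T) vs. sqrt(isopach area), then integrate T over area analytically.
        import numpy as np

        # Hypothetical isopach data: thickness (m) and enclosed area (m^2) per contour.
        thickness = np.array([3.5, 2.0, 1.0, 0.5, 0.2])
        area      = np.array([2e5, 5e5, 1.2e6, 2.5e6, 5e6])

        x = np.sqrt(area)                        # "square root area" axis
        slope, lnT0 = np.polyfit(x, np.log(thickness), 1)
        k = -slope                               # decay rate (1/m): T(x) = T0*exp(-k*x)
        T0 = np.exp(lnT0)

        # V = integral of T dA, with A = x^2, dA = 2*x dx:
        # V = 2*T0 * integral_0^inf x*exp(-k*x) dx = 2*T0 / k^2
        volume = 2.0 * T0 / k**2
        print(f"fitted T0 = {T0:.2f} m, k = {k:.2e} 1/m, volume ~ {volume:.3e} m^3")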

  11. Middle Atmosphere Program. Handbook for MAP, volume 6

    NASA Technical Reports Server (NTRS)

    Sechrist, C. F., Jr. (Editor)

    1982-01-01

    A directory of scientists associated with the Middle Atmosphere Program (MAP) is presented. The MAP steering committee, the standing committees, MAP study groups, and MAP projects are mentioned along with the MAP secretariat and regional consultative group.

  12. Debris-flow deposits and watershed erosion rates near southern Death Valley, CA, United States

    USGS Publications Warehouse

    Schmidt, K.M.; Menges, C.M.; ,

    2003-01-01

    Debris flows from the steep, granitic hillslopes of the Kingston Range, CA are commensurate in age with nearby fluvial deposits. Quaternary chronostratigraphic differentiation of debris-flow deposits is based upon time-dependent characteristics such as relative boulder strength, derived from Schmidt Hammer measurements, degree of surface desert varnish, pedogenesis, and vertical separation. Rock strength is highest for Holocene-aged boulders and decreases for Pleistocene-aged boulders weathering to grus. Volumes of age-stratified debris-flow deposits, constrained by deposit thickness above bedrock, GPS surveys, and geologic mapping, are greatest for Pleistocene deposits. Shallow landslide susceptibility, derived from a topographically based GIS model, in conjunction with deposit volumes produces watershed-scale erosion rates of ~2-47 mm ka^-1, with time-averaged Holocene rates exceeding Pleistocene rates. © 2003 Millpress.

  13. a Hadoop-Based Distributed Framework for Efficient Managing and Processing Big Remote Sensing Images

    NASA Astrophysics Data System (ADS)

    Wang, C.; Hu, F.; Hu, X.; Zhao, S.; Wen, W.; Yang, C.

    2015-07-01

    Various sensors from airborne and satellite platforms are producing large volumes of remote sensing images for mapping, environmental monitoring, disaster management, military intelligence, and others. However, it is challenging to efficiently store, query and process such big data due to data- and computing-intensive issues. In this paper, a Hadoop-based framework is proposed to manage and process the big remote sensing data in a distributed and parallel manner. In particular, remote sensing data can be directly fetched from other data platforms into the Hadoop Distributed File System (HDFS). The Orfeo toolbox, a ready-to-use tool for large image processing, is integrated into MapReduce to provide affluent image processing operations. With the integration of HDFS, the Orfeo toolbox and MapReduce, these remote sensing images can be processed directly and in parallel in a scalable computing environment. The experiment results show that the proposed framework can efficiently manage and process such big remote sensing data.
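
    As a rough illustration of the MapReduce pattern the paper builds on (not the authors' code), the Hadoop Streaming style mapper below reads one HDFS image reference per input line and emits a tab-separated key/value pair; process_tile is a hypothetical placeholder for a real image operation, such as one provided by the Orfeo toolbox.

        #!/usr/bin/env python3
        # Rough illustration of a Hadoop Streaming mapper for image processing:
        # one image reference per input line, one key/value result per output line.
        import sys

        def process_tile(path):
            # Hypothetical placeholder: fetch the tile and compute something useful.
            return {"path": path, "status": "processed"}

        def main():
            for line in sys.stdin:
                path = line.strip()
                if not path:
                    continue
                result = process_tile(path)
                # Emit tab-separated key/value pairs for the reducer to aggregate.
                print(f"{result['path']}\t{result['status']}")

        if __name__ == "__main__":
            main()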

  14. Aeromagnetic survey map of the central California Coast Ranges

    USGS Publications Warehouse

    Langenheim, V.E.; Jachens, R.C.; Moussaoui, K.

    2009-01-01

    This aeromagnetic survey was flown as part of a Cooperative Research and Development Agreement (CRADA) with the Pacific Gas and Electric Company and is intended to promote further understanding of the geology and structure in the central California Coast Ranges by serving as a basis for geophysical interpretations and by supporting geological mapping, mineral and water resource investigations, and other topical studies. Local spatial variations in the Earth's magnetic field (evident as anomalies on aeromagnetic maps) reflect the distribution of magnetic minerals, primarily magnetite, in the underlying rocks. In many cases the volume content of magnetic minerals can be related to rock type, and abrupt spatial changes in the amount of magnetic minerals can commonly mark lithologic or structural boundaries. Bodies of serpentinite and other mafic and ultramafic rocks tend to produce the most intense magnetic anomalies, but such generalizations must be applied with caution because rocks with more felsic compositions, such as the porphyritic granodiorite-granite of the La Panza Range, and even some sedimentary units, also can cause measurable magnetic anomalies.

  15. Site-specific microtubule-associated protein 4 dephosphorylation causes microtubule network densification in pressure overload cardiac hypertrophy.

    PubMed

    Chinnakkannu, Panneerselvam; Samanna, Venkatesababa; Cheng, Guangmao; Ablonczy, Zsolt; Baicu, Catalin F; Bethard, Jennifer R; Menick, Donald R; Kuppuswamy, Dhandapani; Cooper, George

    2010-07-09

    In severe pressure overload-induced cardiac hypertrophy, a dense, stabilized microtubule network forms that interferes with cardiocyte contraction and microtubule-based transport. This is associated with persistent transcriptional up-regulation of cardiac alpha- and beta-tubulin and microtubule-stabilizing microtubule-associated protein 4 (MAP4). There is also extensive microtubule decoration by MAP4, suggesting greater MAP4 affinity for microtubules. Because the major determinant of this affinity is site-specific MAP4 dephosphorylation, we characterized this in hypertrophied myocardium and then assessed the functional significance of each dephosphorylation site found by mimicking it in normal cardiocytes. We first isolated MAP4 from normal and pressure overload-hypertrophied feline myocardium; volume-overloaded myocardium, which has an equal degree and duration of hypertrophy but normal functional and cytoskeletal properties, served as a control for any nonspecific growth-related effects. After cloning cDNA-encoding feline MAP4 and obtaining its deduced amino acid sequence, we characterized by mass spectrometry any site-specific MAP4 dephosphorylation. Solely in pressure overload-hypertrophied myocardium, we identified striking MAP4 dephosphorylation at Ser-472 in the MAP4 N-terminal projection domain and at Ser-924 and Ser-1056 in the assembly-promoting region of the C-terminal microtubule-binding domain. Site-directed mutagenesis of MAP4 cDNA was then used to switch each serine to non-phosphorylatable alanine. Wild-type and mutated cDNAs were used to construct adenoviruses; microtubule network density, stability, and MAP4 decoration were assessed in normal cardiocytes following an equivalent level of MAP4 expression. The Ser-924 --> Ala MAP4 mutant produced a microtubule phenotype indistinguishable from that seen in pressure overload hypertrophy, such that Ser-924 MAP4 dephosphorylation during pressure overload hypertrophy may be central to this cytoskeletal abnormality.

  16. Exploration of Piezoelectric Bimorph Deflection in Synthetic Jet Actuators

    NASA Astrophysics Data System (ADS)

    Housley, Kevin; Amitay, Michael

    2017-11-01

    The design of piezoelectric bimorphs for synthetic jet actuators could be improved by a greater understanding of the deflection of the bimorphs, both their mode shapes and the resulting volume change inside the actuator. The velocity performance of synthetic jet actuators is dependent on this volume change and the associated internal pressure changes. Knowledge of these could aid in refining the geometry of the cavity to improve efficiency. Phase-locked jet velocities and maps of displacement of the surface of the bimorph were compared between actuators of varying diameter. Results from a bimorph of alternate stiffness were also compared. Bimorphs with higher stiffness exhibited a more desirable (0,1) mode shape, which produced a high volume change inside the actuator cavity. Those with lower stiffness allowed for greater displacement of the surface, initially increasing the volume change, but exhibited higher mode shapes at certain frequency ranges. These higher mode shapes sharply reduced the volume change and negatively impacted the velocity of the jet at those frequencies. Adjustments to the distribution of stiffness along the radius of the bimorph could prevent this and allow for improved deflection without the risk of reaching higher modes.

  17. Examining a Half Century of Northwestern North American Glacier Behavior

    NASA Astrophysics Data System (ADS)

    Molnia, B. F.; Fahey, M. J.; Friesen, B.; Josberger, E. G.

    2015-12-01

    In 1957, as part of the United States' contribution to the International Geophysical Year (IGY), the American Geographical Society (AGS) initiated a multi-institutional mapping project to produce 1:10,000-scale topographic maps of nine northwestern North American glaciers. The project's goal was to prepare precise maps at large scales of selected small glaciers to form a permanent record of the condition of these glaciers so that at a future date they could be resurveyed and compared. Continued surveys would give the history of wastage and accumulation, and more accurate interpretation of the response of these glaciers to meteorological and other factors. The resulting maps and a descriptive summary brochure were published in 1960 by the American Geographical Society. The USGS Global Fiducials Program (GFP) began to systematically image the same nine glaciers approximately a half-century after the IGY mapping. The results of the GFP analyses would permit the types of comparisons that were envisioned by the IGY project. Imagery of each of these nine glaciers has been collected from multiple sources, including Next View licensed commercial imagery, vertical and oblique aerial photography, Landsat, and US National Imagery Systems. Exploitation of the imagery has resulted in the production of new 21st century maps that can be compared and contrasted with the vintage AGS map set. Comparison will permit the calculation of a number of parameters which will provide a direct insight into the changes that northwestern North American glaciers have been experiencing during the past half century. Specifically, these comparisons will permit the calculation of changes in glacier length, area, thickness, and volume; computation of rates of glacier advance and/or retreat, rates of glacier thickening and/or thinning, and rates of volume change; production of digital elevation models (DEMs); and generation of velocity fields from crevasse migration. The subsequent re-mapping and comparison to the 1950s maps will provide a unique survey of glacier change across western North America from Alaska to northwestern Washington. Each pair of glacier maps will be accompanied by a summary document describing the changes that have occurred at that glacier. From north to south, the nine IGY glaciers span a distance of more than 2,600 km.

  18. High-Throughput 3-D Monitoring of Agricultural-Tree Plantations with Unmanned Aerial Vehicle (UAV) Technology

    PubMed Central

    Torres-Sánchez, Jorge; López-Granados, Francisca; Serrano, Nicolás; Arquero, Octavio; Peña, José M.

    2015-01-01

    The geometric features of agricultural trees such as canopy area, tree height and crown volume provide useful information about plantation status and crop production. However, these variables are mostly estimated after time-consuming and laborious field work and by applying equations that treat the trees as geometric solids, which produces inconsistent results. As an alternative, this work presents an innovative procedure for computing the 3-dimensional geometric features of individual trees and tree-rows by applying two consecutive phases: 1) generation of Digital Surface Models with Unmanned Aerial Vehicle (UAV) technology and 2) use of object-based image analysis techniques. Our UAV-based procedure produced successful results both in single-tree and in tree-row plantations, reporting up to 97% accuracy on area quantification and minimal deviations compared to in-field estimations of tree heights and crown volumes. The maps generated could be used to understand the linkages between tree growth and field-related factors or to optimize crop management operations in the context of precision agriculture, with relevant agro-environmental implications. PMID:26107174
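
    The kind of geometry extracted by such a pipeline can be sketched simply once a Digital Surface Model (DSM) and a terrain model (DTM) are available: the canopy height model is their difference, and per-tree area, height and a column-sum crown volume follow from a crown mask. The snippet below is a simplified stand-in for the object-based analysis described above; the crown mask is assumed to come from a prior segmentation step.

        # Sketch of per-tree geometric features from UAV-derived surface models:
        # canopy height model (CHM) = DSM - DTM, then area, height and volume.
        import numpy as np

        def crown_metrics(dsm, dtm, crown_mask, pixel_size):
            """dsm, dtm: elevation grids (m); crown_mask: boolean grid for one tree;
            pixel_size: ground sample distance (m)."""
            chm = dsm - dtm                                     # canopy height model
            cell_area = pixel_size ** 2
            heights = chm[crown_mask]
            projected_area = heights.size * cell_area           # m^2
            tree_height = float(heights.max()) if heights.size else 0.0
            crown_volume = float(heights.sum()) * cell_area     # m^3 (column sum)
            return projected_area, tree_height, crown_volume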

  19. High-Throughput 3-D Monitoring of Agricultural-Tree Plantations with Unmanned Aerial Vehicle (UAV) Technology.

    PubMed

    Torres-Sánchez, Jorge; López-Granados, Francisca; Serrano, Nicolás; Arquero, Octavio; Peña, José M

    2015-01-01

    The geometric features of agricultural trees such as canopy area, tree height and crown volume provide useful information about plantation status and crop production. However, these variables are mostly estimated after time-consuming and laborious field work and by applying equations that treat the trees as geometric solids, which produces inconsistent results. As an alternative, this work presents an innovative procedure for computing the 3-dimensional geometric features of individual trees and tree-rows by applying two consecutive phases: 1) generation of Digital Surface Models with Unmanned Aerial Vehicle (UAV) technology and 2) use of object-based image analysis techniques. Our UAV-based procedure produced successful results both in single-tree and in tree-row plantations, reporting up to 97% accuracy on area quantification and minimal deviations compared to in-field estimations of tree heights and crown volumes. The maps generated could be used to understand the linkages between tree growth and field-related factors or to optimize crop management operations in the context of precision agriculture, with relevant agro-environmental implications.

  20. Four-dimensional Doppler ultrasound measurements in carotid bifurcation models: effect of concentric versus eccentric stenosis

    NASA Astrophysics Data System (ADS)

    Poepping, Tamie L.; Rankin, Richard N.; Holdsworth, David W.

    2001-05-01

    A unique in-vitro system has been developed that incorporates both realistic phantoms and flow. The anthropomorphic carotid phantoms are fabricated in agar with a stenosis severity of 30% or 70% (by NASCET standards) and one of two geometric configurations: concentric or eccentric. The phantoms are perfused with a flow waveform that simulates normal common carotid flow. Pulsed Doppler ultrasound data are acquired at a 1 mm grid spacing throughout the lumen of the carotid bifurcation. Obtaining a half-lumen volume, symmetric about the midplane, requires a 13-hour acquisition over 3238 interrogation sites, producing 5.6 Gbytes of data. The spectral analysis produces estimates of parameters such as the peak velocity, mean velocity, spectral-broadening index, and turbulence intensity. Color-encoded or grayscale-encoded maps of these spectral parameters show distinctly different flow patterns resulting from stenoses of equal severity but different eccentricity. The most noticeable differences are seen in the volumes of the recirculation zones and the paths of the high-velocity jets. Elevated levels of turbulence intensity are also seen distal to the stenosis in the 70%-stenosed models.

  1. Anthropogenic contribution to the geological and geomorphological record: A case study from Great Yarmouth, Norfolk, UK

    NASA Astrophysics Data System (ADS)

    Jordan, Hannah; Hamilton, Ken; Lawley, Russell; Price, Simon James

    2016-01-01

    Reconstruction of artificial or anthropogenic topographies, sediment thicknesses and volumes provides a mechanism for quantifying anthropogenic changes to sedimentary systems in the context of the proposed Anthropocene epoch. We present a methodology for determining the volumetric contribution of anthropogenic deposits to the geological and geomorphological record and apply it to the Great Yarmouth area of Norfolk, UK. 115 boreholes, drilled to a maximum depth of 6 m below ground level, were used to determine the thickness and distribution of seven geo-archaeological units comprising natural and anthropogenic deposits in the central Great Yarmouth area. This was supplemented by additional depth information derived from 467 existing ground investigation boreholes and published 1:50 000 scale geological maps. The top and base of each geo-archaeological unit were modelled from elevations recorded in the borehole data. Grids were produced using a natural neighbour analysis with a 25 m cell size using MapInfo 8.0 Vertical Mapper 3.1 to produce palaeotopographical surfaces. Maximum, minimum and average elevations for each geo-archaeological unit generally increase with decreasing age, with the exception of the Early-Medieval palaeotopographical surface, which locally occurs at higher elevations than that of the younger Late-Medieval unit. The total sediment volume for the combined Modern, Post-Medieval, Late-Medieval and Early-Medieval geo-archaeological units is 10.91 × 10^5 m^3. The total sediment volume for the Aeolian, River Terrace and Marine geo-archaeological units combined is 65.58 × 10^5 m^3. Anthropogenic sedimentation rates were calculated to increase from 590 m^3/yr during the Early-Medieval period to 1500 m^3/yr during the Post-Medieval period and 2300 m^3/yr during the Modern period. It is estimated that the combined anthropogenic geo-archaeological units contribute approximately 15% of the total volume of sediments that would have been traditionally considered natural Holocene deposits in the Great Yarmouth area. The results indicate that an approach combining geological and archaeological deposit modelling can be used to quantify anthropogenic landscape impact and its associated sediment flux.
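
    The unit volumes reported above reduce to a thickness grid multiplied by the cell area. A minimal sketch on the 25 m grid used in the study, assuming NaN marks cells where a unit is absent:

        # Sketch: volume of one geo-archaeological unit from gridded top and base
        # elevations (25 m cells), ignoring cells where the unit is absent (NaN).
        import numpy as np

        def unit_volume(top, base, cell_size=25.0):
            thickness = top - base
            thickness = np.where(np.isfinite(thickness) & (thickness > 0), thickness, 0.0)
            return thickness.sum() * cell_size ** 2    # m^3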

  2. Comparison of a fully mapped plot design to three alternative designs for volume and area estimates using Maine inventory data

    Treesearch

    Stanford L. Arner

    1998-01-01

    A fully mapped plot design is compared to three alternative designs using data collected for the recent inventory of Maine's forest resources. Like the fully mapped design, one alternative eliminates the bias of previous procedures, and should be less costly and more consistent. There was little difference in volume and area estimates or in sampling errors among...

  3. The big data challenges of connectomics

    PubMed Central

    Lichtman, Jeff W; Pfister, Hanspeter; Shavit, Nir

    2015-01-01

    The structure of the nervous system is extraordinarily complicated because individual neurons are interconnected to hundreds or even thousands of other cells in networks that can extend over large volumes. Mapping such networks at the level of synaptic connections, a field called connectomics, began in the 1970s with the study of the small nervous system of a worm and has recently garnered general interest thanks to technical and computational advances that automate the collection of electron-microscopy data and offer the possibility of mapping even large mammalian brains. However, modern connectomics produces ‘big data’, unprecedented quantities of digital information at unprecedented rates, and will require, as with genomics at the time, breakthrough algorithmic and computational solutions. Here we describe some of the key difficulties that may arise and provide suggestions for managing them. PMID:25349911

  4. The big data challenges of connectomics

    DOE PAGES

    Lichtman, Jeff W.; Pfister, Hanspeter; Shavit, Nir

    2014-10-28

    The structure of the nervous system is extraordinarily complicated because individual neurons are interconnected to hundreds or even thousands of other cells in networks that can extend over large volumes. Mapping such networks at the level of synaptic connections, a field called connectomics, began in the 1970s with the study of the small nervous system of a worm and has recently garnered general interest thanks to technical and computational advances that automate the collection of electron-microscopy data and offer the possibility of mapping even large mammalian brains. However, modern connectomics produces ‘big data’, unprecedented quantities of digital information at unprecedented rates, and will require, as with genomics at the time, breakthrough algorithmic and computational solutions. In this paper we describe some of the key difficulties that may arise and provide suggestions for managing them.

  5. The big data challenges of connectomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lichtman, Jeff W.; Pfister, Hanspeter; Shavit, Nir

    The structure of the nervous system is extraordinarily complicated because individual neurons are interconnected to hundreds or even thousands of other cells in networks that can extend over large volumes. Mapping such networks at the level of synaptic connections, a field called connectomics, began in the 1970s with the study of the small nervous system of a worm and has recently garnered general interest thanks to technical and computational advances that automate the collection of electron-microscopy data and offer the possibility of mapping even large mammalian brains. However, modern connectomics produces ‘big data’, unprecedented quantities of digital information at unprecedented rates, and will require, as with genomics at the time, breakthrough algorithmic and computational solutions. In this paper we describe some of the key difficulties that may arise and provide suggestions for managing them.

  6. Lateral extrusion of a thermally weakened pluton overburden (Campiglia Marittima, Tuscany)

    NASA Astrophysics Data System (ADS)

    Vezzoni, Simone; Rocchi, Sergio; Dini, Andrea

    2017-10-01

    The ascent and emplacement of magmas in the upper crust modify the local pre-existing thermal and rheological settings. Such changes have important effects in producing anomalous structures, mass extrusion, rock fracturing, and in some conditions, hydrothermal mineralizations. In the Campiglia Marittima area, detailed field mapping led to the reconstruction of a local deformation history that overlaps, chronologically and spatially, with regional extension. This local deformation was triggered at the Miocene-Pliocene boundary by the intrusion of a monzogranitic pluton beneath a carbonate sedimentary sequence. The emplacement of the pluton produced a perturbation in the rheological behaviour of the carbonate host rocks, producing transient ductile conditions in the very shallow crust. The carbonate rocks were thermally weakened and flowed laterally, accumulating downslope of the pluton roof, mainly toward the east. As the thermal anomaly was decaying, the brittle-ductile boundary moved progressively back towards the pluton, and large tension gash-shaped volumes of fractured marble were generated. These fractured volumes were exploited by rising hydrothermal fluids generating sigmoidal skarn bodies and ore shoots. This work presents the Campiglia Marittima case study as a prime example of structural interference between regional extensional structures and local, lateral mass extrusion in a transient ductile rheological regime triggered by pluton emplacement.

  7. Lateral extrusion of a thermally weakened pluton overburden (Campiglia Marittima, Tuscany)

    NASA Astrophysics Data System (ADS)

    Vezzoni, Simone; Rocchi, Sergio; Dini, Andrea

    2018-06-01

    The ascent and emplacement of magmas in the upper crust modify the local pre-existing thermal and rheological settings. Such changes have important effects in producing anomalous structures, mass extrusion, rock fracturing, and in some conditions, hydrothermal mineralizations. In the Campiglia Marittima area, detailed field mapping led to the reconstruction of a local deformation history that overlaps, chronologically and spatially, with regional extension. This local deformation was triggered at the Miocene-Pliocene boundary by the intrusion of a monzogranitic pluton beneath a carbonate sedimentary sequence. The emplacement of the pluton produced a perturbation in the rheological behaviour of the carbonate host rocks, producing transient ductile conditions in the very shallow crust. The carbonate rocks were thermally weakened and flowed laterally, accumulating downslope of the pluton roof, mainly toward the east. As the thermal anomaly was decaying, the brittle-ductile boundary moved progressively back towards the pluton, and large tension gash-shaped volumes of fractured marble were generated. These fractured volumes were exploited by rising hydrothermal fluids generating sigmoidal skarn bodies and ore shoots. This work presents the Campiglia Marittima case study as a prime example of structural interference between regional extensional structures and local, lateral mass extrusion in a transient ductile rheological regime triggered by pluton emplacement.

  8. Rapid mapping of landslide disaster using UAV- photogrammetry

    NASA Astrophysics Data System (ADS)

    Cahyono, A. B.; Zayd, R. A.

    2018-03-01

    Unmanned Aerial Vehicle (UAV) systems offer many advantages in mapping applications such as slope mapping and geohazard studies. This study applies a UAV system to a landslide disaster that occurred in Jombang Regency, East Java. A rotor-wing UAV was used because rotor-wing platforms are stable and able to capture images easily. Aerial photographs were acquired in strips following standard aerial photography procedure, with 60 photos taken. Ground control points were surveyed with geodetic GPS, and check points were established using a total station. The digital camera was calibrated using close-range photogrammetric software, and the recovered camera calibration parameters were then used in processing the digital images. All aerial photographs were processed using digital photogrammetric software, and an orthophoto was produced. The final result is a 1:1500-scale orthophoto map, produced with an SfM processing algorithm, with a GSD accuracy of 3.45 cm. The landslide volume calculated from contour-line delineation is 10,527.03 m^3, a difference of 964.67 m^3, or about 8.4%, from the result of the terrestrial method.

  9. SU-G-IeP1-12: Size Selective Arterial Cerebral Blood Volume Mapping Using Multiple Inversion Time Arterial Spin Labeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jung, Y; Johnston, M; Whitlow, C

    Purpose: To demonstrate the feasibility of a novel method for size-specific arterial cerebral blood volume (aCBV) mapping using pseudo-continuous arterial spin labeling (PCASL) with multiple TI. Methods: Multiple PCASL images were obtained from a subject with TI of [300, 400, 500, 600, 700, 800, 900, 1000, 1500, 2000, 2500, 3000, 3500, 4000] ms. Each TI pair was averaged six times. Two scans were performed: one without a flow crusher gradient and the other with a crusher gradient (10 cm/s in three directions) to remove signals from large arteries. Scan times were 5 min without a crusher gradient and 5.5 min with a crusher gradient. A non-linear fitting algorithm finds the minimum mean-squared-error solution for per-voxel aCBV, cerebral blood flow (CBF), and arterial transit time (ATT), fitting the data to a hemodynamic model that represents the superposition of blood volume and flow components within a single voxel. Results: aCBV maps with a crusher gradient represent signals from medium and small arteries, while those without a crusher gradient represent signals from arteries of all sizes, indicating that flow crusher gradients can be effectively employed to achieve size-specific aCBV mapping. Regardless of the flow crusher, the CBF and ATT maps are very similar in appearance. Conclusion: Quantitative size-selective blood volume mapping controlled by a flow crusher is feasible without additional information, because the ASL quantification process does not require an arterial input function measured from a large artery. Signals from large arteries do not interfere with size-specific aCBV mapping for applications in which only medium or small arteries are of interest.
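
    A toy illustration of the per-voxel fitting step is sketched below: multi-TI signals are fit by non-linear least squares to a parametric model with aCBV, CBF and ATT terms. The model here (a boxcar arterial bolus plus a linearly accumulating tissue term, with an assumed label duration) is a deliberately simplified stand-in, not the hemodynamic model used in the abstract.

        # Toy illustration of per-voxel non-linear fitting of multi-TI ASL data.
        # The model is a simplified stand-in, not the authors' hemodynamic model.
        import numpy as np
        from scipy.optimize import curve_fit

        TI = np.array([0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0,
                       1.5, 2.0, 2.5, 3.0, 3.5, 4.0])   # inversion times (s)
        TAU = 1.5                                        # assumed label duration (s)

        def asl_model(ti, acbv, cbf, att):
            arterial = acbv * ((ti > att) & (ti < att + TAU))   # bolus still in arteries
            tissue = cbf * np.clip(ti - att, 0.0, TAU)          # label accumulated in tissue
            return arterial + tissue

        def fit_voxel(signal):
            p0 = (0.01, 0.01, 0.7)                              # initial guess
            popt, _ = curve_fit(asl_model, TI, signal, p0=p0, maxfev=5000)
            return popt                                         # aCBV, CBF, ATT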

  10. What, When, Where, and Why of Secondary Hawaiian Hotspot Volcanism

    NASA Astrophysics Data System (ADS)

    Garcia, M. O.; Ito, G.; Applegate, B.; Weis, D.; Swinnard, L.; Flinders, A.; Hanano, D.; Nobre-Silva, I.; Bianco, T.; Naumann, T.; Geist, D.; Blay, C.; Sciaroni, L.; Maerschalk, C.; Harpp, K.; Christensen, B.

    2007-12-01

    Secondary hotspot volcanism occurs on most oceanic island groups (Hawaii, Canary, Society) but its origins remain enigmatic. A 28-day marine expedition used multibeam bathymetry and acoustic imagery to map the extent of submarine volcanic fields around the northern Hawaiian Islands (Kauai, Niihau and Kaula), and the JASON2 ROV to sample many volcanoes to characterize the petrology, geochemistry (major and trace elements, and isotopes) and ages of the lavas from these volcanoes. Our integrated geological, geochemical and geophysical study attempts to examine the what (compositions and source), where (distribution and volumes), when (ages), and why (mechanisms) of secondary volcanism on and around the northern Hawaiian Islands. A first-order objective was to establish how the submarine volcanism relates in space, time, volume, and composition to the nearby shield volcanoes and their associated onshore secondary volcanism. Our surveying and sampling revealed major fields of submarine volcanoes extending from the shallow slopes of these islands to more than 100 km offshore. These discoveries dramatically expand the volumetric importance, distribution and geodynamic framework for Hawaiian secondary volcanism. New maps and rock petrology on the samples collected will be used to evaluate currently proposed mechanisms for secondary volcanism and to consider new models such as small-scale mantle convection driven by thermal and melt-induced buoyancy to produce the huge volume of newly discovered lava. Our results seem to indicate substantial revisions are needed to our current perceptions of hotspot dynamics for Hawaii and possibly elsewhere.

  11. Robust isotropic super-resolution by maximizing a Laplace posterior for MRI volumes

    NASA Astrophysics Data System (ADS)

    Han, Xian-Hua; Iwamoto, Yutaro; Shiino, Akihiko; Chen, Yen-Wei

    2014-03-01

    Magnetic resonance imaging can only acquire volume data with finite resolution due to various factors. In particular, the resolution in one direction (such as the slice direction) is much lower than in others (such as the in-plane direction), yielding unrealistic visualizations. This study explores the reconstruction of MRI isotropic-resolution volumes from three orthogonal scans. The proposed super-resolution reconstruction is formulated as a maximum a posteriori (MAP) problem, which relies on the generation model of the acquired scans from the unknown high-resolution volume. Generally, the deviation ensemble of the reconstructed high-resolution (HR) volume from the available low-resolution (LR) ones in the MAP is represented as a Gaussian distribution, which usually results in some noise and artifacts in the reconstructed HR volume. Therefore, this paper investigates a robust super-resolution approach that formulates the deviation set as a Laplace distribution, which assumes sparsity in the deviation ensemble based on the insight that large values appear only around some unexpected regions. In addition, in order to achieve a reliable HR MRI volume, we integrate priors such as bilateral total variation (BTV) and non-local means (NLM) into the proposed MAP framework for suppressing artifacts and enriching visual detail. We validate the proposed robust SR strategy using MRI mouse data with high-definition resolution in two directions and low resolution in one direction, imaged in three orthogonal scans: axial, coronal and sagittal planes. Experiments verify that the proposed strategy achieves much better HR MRI volumes than the conventional MAP method, even with a very high magnification factor of 10.

  12. Whole-tumor histogram analysis of the cerebral blood volume map: tumor volume defined by 11C-methionine positron emission tomography image improves the diagnostic accuracy of cerebral glioma grading.

    PubMed

    Wu, Rongli; Watanabe, Yoshiyuki; Arisawa, Atsuko; Takahashi, Hiroto; Tanaka, Hisashi; Fujimoto, Yasunori; Watabe, Tadashi; Isohashi, Kayako; Hatazawa, Jun; Tomiyama, Noriyuki

    2017-10-01

    This study aimed to compare tumor volume definitions based on conventional magnetic resonance (MR) and 11C-methionine positron emission tomography (MET/PET) images for differentiating pre-operative glioma grade using whole-tumor histogram analysis of normalized cerebral blood volume (nCBV) maps. Thirty-four patients with histopathologically proven primary brain low-grade gliomas (n = 15) and high-grade gliomas (n = 19) underwent pre-operative or pre-biopsy MET/PET, fluid-attenuated inversion recovery, dynamic susceptibility contrast perfusion-weighted magnetic resonance imaging, and contrast-enhanced T1-weighted imaging at 3.0 T. The histogram distribution derived from the nCBV maps was obtained by co-registering the whole tumor volume delineated on conventional MR or MET/PET images, and eight histogram parameters were assessed. The mean nCBV value had the highest AUC value (0.906) based on MET/PET images. Diagnostic accuracy significantly improved when the tumor volume was measured from MET/PET images compared with conventional MR images for the mean, 50th, and 75th percentile nCBV values (p = 0.0246, 0.0223, and 0.0150, respectively). Whole-tumor histogram analysis of the CBV map provides more valuable histogram parameters and increases diagnostic accuracy in the differentiation of pre-operative cerebral gliomas when the tumor volume is derived from MET/PET images.
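
    The histogram parameters in such an analysis reduce to simple statistics of the nCBV values inside the tumor volume of interest, whichever way that volume is defined. A minimal numpy sketch, assuming the mask comes from either the MR- or the MET/PET-delineated volume:

        # Sketch: whole-tumor histogram parameters from an nCBV map and a tumor mask
        # (the mask may come from conventional MR or from the MET/PET-defined volume).
        import numpy as np

        def histogram_parameters(ncbv, tumor_mask):
            values = ncbv[tumor_mask]
            return {
                "mean": float(values.mean()),
                "p50": float(np.percentile(values, 50)),
                "p75": float(np.percentile(values, 75)),
                "max": float(values.max()),
            }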

  13. Reconstructing time series water volumes of drying lakes in Central Asia with ZY-3 stereo remote sensing data

    NASA Astrophysics Data System (ADS)

    Li, J.; Warner, T.; Bao, A.

    2017-12-01

    Central Asia is one of the world's most vulnerable regions to global change. Lakes in the arid regions of Central Asia are sensitive to climatic change and fluctuate with variations in temperature and precipitation. Studies have shown that some Central Asian inland lakes shrank or disappeared over recent decades. Quantitative analysis of the spatio-temporal processes of lake volume change will improve our understanding of water resource utilization in arid regions and of lake responses to regional climate change. However, due to the lack of lake bathymetry or observation data, the volumes of these lakes remain unknown. In this paper, three lakes in Central Asia, Chaiwopu Lake, Alik Lake and Selectyteniz Lake, are used to reconstruct lake volume changes. First, stereo mapping technologies based on ZY-3 high resolution data are used to map high-precision 3-D lake bathymetry and to create "Area-Level-Volume" curves from the bathymetric contours. Second, time series of lake areas over the last 50 years are mapped with multi-source, multi-temporal remote sensing images. Based on the lake storage curves and the time series of lake areas, lake volumes over the last five decades are reconstructed, and the spatio-temporal characteristics of lake volume changes and their mechanisms are analyzed. The results show that high-precision hydrological elements of drying lakes in arid regions can be reconstructed through the application of stereo mapping technology in remote sensing.
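
    The "Area-Level-Volume" relationship can be sketched as follows: tabulate inundated area and stored volume for a range of water levels over the bathymetric grid, then interpolate historical volumes from remotely sensed areas along that storage curve. This is a simplified illustration of the idea, not the paper's workflow.

        # Sketch: build an "Area-Level-Volume" table from a bathymetric DEM, then
        # estimate historical volumes from remotely sensed lake areas.
        import numpy as np

        def area_level_volume(bed_elev, cell_area, levels):
            areas, volumes = [], []
            for level in levels:
                depth = np.clip(level - bed_elev, 0.0, None)
                areas.append((depth > 0).sum() * cell_area)     # inundated area (m^2)
                volumes.append(depth.sum() * cell_area)         # stored volume (m^3)
            return np.array(areas), np.array(volumes)

        def volume_from_area(observed_area, areas, volumes):
            # Interpolate along the storage curve (areas must be increasing).
            return np.interp(observed_area, areas, volumes)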

  14. Volumetric evolution of Surtsey, Iceland, from topographic maps and scanning airborne laser altimetry

    USGS Publications Warehouse

    Garvin, J.B.; Williams, R.S.; Frawley, J.J.; Krabill, W.B.

    2000-01-01

    The volumetric evolution of Surtsey has been estimated on the basis of digital elevation models derived from NASA scanning airborne laser altimeter surveys (20 July 1998), as well as digitized 1:5,000-scale topographic maps produced by the National Land Survey of Iceland and by Norrman. Subaerial volumes have been computed from co-registered digital elevation models (DEMs) from 6 July 1968, 11 July 1975, 16 July 1993, and 20 July 1998 (scanning airborne laser altimetry), as well as true surface area (above mean sea level). Our analysis suggests that the subaerial volume of Surtsey has been reduced from nearly 0.100 km^3 on 6 July 1968 to 0.075 km^3 on 20 July 1998. Linear regression analysis of the temporal evolution of Surtsey's subaerial volume indicates that most of its subaerial surface will be at or below mean sea-level by approximately 2100. This assumes a conservative estimate of continuation of the current pace of marine erosion and mass-wasting on the island, including the indurated core of the conduits of the Surtur I and Surtur II eruptive vents. If the conduits are relatively resistant to marine erosion they will become sea stacks after the rest of the island has become a submarine shoal, and some portions of the island could survive for centuries. The 20 July 1998 scanning laser altimeter surveys further indicate rapid enlargement of erosional canyons in the northeastern portion of the partial tephra ring associated with Surtur I. Continued airborne and eventually spaceborne topographic surveys of Surtsey are planned to refine the inter-annual change of its subaerial volume.
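
    The "at or below mean sea level by approximately 2100" statement follows from a linear regression of subaerial volume against survey year. The two-point illustration below uses only the volumes quoted in the abstract (the published estimate uses all four DEM epochs), so its extrapolated date is indicative only.

        # Illustration of the linear-extrapolation step, using only the two volumes
        # quoted above (0.100 km^3 in 1968, 0.075 km^3 in 1998).
        import numpy as np

        years  = np.array([1968.5, 1998.5])
        volume = np.array([0.100, 0.075])            # km^3

        slope, intercept = np.polyfit(years, volume, 1)
        year_zero = -intercept / slope               # volume extrapolates to zero
        print(f"rate {slope*1e3:.2f} x 10^-3 km^3/yr; volume reaches zero around {year_zero:.0f}")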

  15. Development of Fracture Mechanics Maps for Composite Materials. Volume 3.

    DTIC Science & Technology

    1985-12-01

    Development of Fracture Mechanics Maps for Composite Materials, Volume 3. Dr. H. W. Bergmann, DFVLR - Institute for Structural Mechanics, Braunschweig, West Germany.

  16. Atlas Based Segmentation and Mapping of Organs at Risk from Planning CT for the Development of Voxel-Wise Predictive Models of Toxicity in Prostate Radiotherapy

    NASA Astrophysics Data System (ADS)

    Acosta, Oscar; Dowling, Jason; Cazoulat, Guillaume; Simon, Antoine; Salvado, Olivier; de Crevoisier, Renaud; Haigron, Pascal

    The prediction of toxicity is crucial to managing prostate cancer radiotherapy (RT). This prediction is classically organ-wise and based on dose-volume histograms (DVH) computed during the planning step, using, for example, the mathematical Lyman Normal Tissue Complication Probability (NTCP) model. However, these models lack spatial accuracy, do not take deformations into account, and may be inappropriate for explaining toxicity events related to the distribution of the delivered dose. Producing voxel-wise statistical models of toxicity might help to explain the risks linked to the spatial distribution of dose, but is challenging because of the difficulty of mapping organs and dose onto a common template. In this paper we investigate the use of atlas-based methods to perform non-rigid mapping and segmentation of the individuals' organs at risk (OAR) from CT scans. To build a labeled atlas, 19 CT scans were selected from a population of patients treated for prostate cancer by radiotherapy. The prostate and the OAR (rectum, bladder, bones) were then manually delineated by an expert and constituted the training data. After a number of affine and non-rigid registration iterations, an average image (template) representing the whole population was obtained. The amount of consensus between labels was used to generate probabilistic maps for each organ. We validated the accuracy of the approach by segmenting the organs using the training data in a leave-one-out scheme. The agreement between the volumes after deformable registration and the manually segmented organs was on average above 60% for the organs at risk. The proposed methodology provides a way to map the organs from a whole population onto a single template and sets the stage for further voxel-wise analysis. With this method, new and accurate predictive models of toxicity will be built.
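
    Once all training labels are warped to the template, the probabilistic map is simply the voxel-wise fraction of registered labels, and leave-one-out accuracy can be scored with an overlap measure such as the Dice coefficient. The sketch below covers only these two steps (the registration itself is not shown) and is an illustration rather than the authors' implementation.

        # Sketch of the label-fusion and validation steps (registration not shown):
        # a probabilistic organ map is the voxel-wise fraction of registered binary
        # labels, and leave-one-out accuracy can be scored with the Dice overlap.
        import numpy as np

        def probabilistic_map(registered_labels):
            # registered_labels: list of binary masks already warped to the template.
            return np.mean(np.stack(registered_labels, axis=0), axis=0)

        def dice(a, b):
            a, b = a.astype(bool), b.astype(bool)
            denom = a.sum() + b.sum()
            return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0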

  17. SU-F-J-93: Automated Segmentation of High-Resolution 3D WholeBrain Spectroscopic MRI for Glioblastoma Treatment Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schreibmann, E; Shu, H; Cordova, J

    Purpose: We report on an automated segmentation algorithm for defining radiation therapy target volumes using spectroscopic MR images (sMRI) acquired at a nominal voxel resolution of 100 microliters. Methods: Whole-brain sMRI combining 3D echo-planar spectroscopic imaging, generalized auto-calibrating partially-parallel acquisitions, and elliptical k-space encoding was performed on a 3T MRI scanner with a 32-channel head coil array. Metabolite maps generated include choline (Cho), creatine (Cr), and N-acetylaspartate (NAA), as well as Cho/NAA, Cho/Cr, and NAA/Cr ratio maps. Automated segmentation was achieved by concomitantly considering sMRI metabolite maps with standard contrast-enhancing (CE) imaging in a pipeline that first uses the water signal for skull stripping. Subsequently, an initial blob of the tumor region is identified by searching for regions of FLAIR abnormality that also display reduced NAA activity, using a mean ratio correlation and morphological filters. These regions are used as the starting point for a geodesic level-set refinement that adapts the initial blob to the fine details specific to each metabolite. Results: Accuracy of the segmentation model was tested on a cohort of 12 patients with sMRI datasets acquired pre-, mid- and post-treatment, providing a broad range of enhancement patterns. Whereas heterogeneity in tumor appearance and shape posed a greater challenge to the algorithm on classical imaging, regions of abnormal activity were easily detected in the sMRI metabolite maps when combining the detail available in the standard imaging with the local enhancement produced by the metabolites. Results can be imported into treatment planning, leading in general to an increase in the target volumes (GTV60) when using sMRI+CE MRI compared to the standard CE MRI alone. Conclusion: Integration of automated segmentation of sMRI metabolite maps into planning is feasible and will likely streamline acceptance of this new acquisition modality in clinical practice.
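
    The seeding step described above can be sketched as a combination of a FLAIR abnormality mask with metabolite evidence; the snippet below uses an elevated Cho/NAA ratio with a placeholder threshold, and the geodesic level-set refinement is not shown.

        # Simplified sketch of the seeding step only: combine a FLAIR abnormality
        # mask with metabolite evidence (elevated Cho/NAA). The threshold is a
        # placeholder, not a value from the study.
        import numpy as np

        def seed_region(cho, naa, flair_mask, ratio_threshold=2.0):
            ratio = np.divide(cho, naa, out=np.zeros_like(cho, dtype=float),
                              where=naa > 0)
            return flair_mask & (ratio > ratio_threshold)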

  18. Objective rapid delineation of areas at risk from block-and-ash pyroclastic flows and surges

    USGS Publications Warehouse

    Widiwijayanti, C.; Voight, B.; Hidayat, D.; Schilling, S.P.

    2009-01-01

    Assessments of pyroclastic flow (PF) hazards are commonly based on mapping of PF and surge deposits and estimations of inundation limits, and/or computer models of varying degrees of sophistication. In volcanic crises a PF hazard map may be sorely needed, but limited time, exposures, or safety aspects may preclude fieldwork, and insufficient time or baseline data may be available for reliable dynamic simulations. We have developed a statistically constrained simulation model for block-and-ash type PFs to estimate potential areas of inundation by adapting methodology from Iverson et al. (Geol Soc America Bull 110:972-984, 1998) for lahars. The predictive equations for block-and-ash PFs are calibrated with data from several volcanoes and given by A = (0.05 to 0.1) V^(2/3), B = (35 to 40) V^(2/3), where A is the cross-sectional area of inundation, B is the planimetric area and V is the deposit volume. The proportionality coefficients were obtained from regression analyses and comparison of simulations to mapped deposits. The method embeds the predictive equations in a GIS program coupled with DEM topography, using the LAHARZ program of Schilling (1998). Although the method is objective and reproducible, any PF hazard zone so computed should be considered as an approximate guide only, due to uncertainties on the coefficients applicable to individual PFs, the authenticity of DEM details, and the volume of future collapses. The statistical uncertainty of the predictive equations, which implies a factor of two or more in predicting A or B for a specified V, is superposed on the uncertainty of forecasting V for the next PF to descend a particular valley. Multiple inundation zones, produced by simulations using a selected range of volumes, partly accommodate these uncertainties. The resulting maps show graphically that PF inundation potentials are highest nearest volcano sources and along valley thalwegs, and diminish with distance from source and lateral distance from thalweg. The model does not explicitly consider dynamic behavior, which can be important. Ash-cloud surge impact limits must be extended beyond PF hazard zones and we provide several approaches to do this. The method has been used to supply PF and surge hazard maps in two crises: Merapi 2006; and Montserrat 2006-2007. © Springer-Verlag 2008.
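
    The predictive equations quoted above translate directly into a few lines of code for a chosen scenario volume; the coefficient ranges are the calibrated values given in the abstract.

        # Direct use of the calibrated ranges quoted above:
        # A = (0.05-0.1) V^(2/3), B = (35-40) V^(2/3), for a scenario volume V (m^3).
        def inundation_areas(volume_m3):
            v23 = volume_m3 ** (2.0 / 3.0)
            cross_section = (0.05 * v23, 0.10 * v23)   # m^2, low/high estimate
            planimetric   = (35.0 * v23, 40.0 * v23)   # m^2, low/high estimate
            return cross_section, planimetric

        # Example: a 1 x 10^6 m^3 block-and-ash flow.
        print(inundation_areas(1e6))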

  19. Dynamic three-dimensional phase-contrast technique in MRI: application to complex flow analysis around the artificial heart valve

    NASA Astrophysics Data System (ADS)

    Kim, Soo Jeong; Lee, Dong Hyuk; Song, Inchang; Kim, Nam Gook; Park, Jae-Hyeung; Kim, JongHyo; Han, Man Chung; Min, Byong Goo

    1998-07-01

    The phase-contrast (PC) method of magnetic resonance imaging (MRI) has been used for quantitative measurements of flow velocity and volume flow rate. It is a noninvasive technique which provides an accurate two-dimensional velocity image. Moreover, phase-contrast cine MRI combines the flow-dependent contrast of PC-MRI with the ability of cardiac cine imaging to produce images throughout the cardiac cycle. However, the accuracy of data acquired with a single through-plane velocity encoding can be reduced by the effect of flow direction, because in many practical cases flow directions are not uniform throughout the region of interest. In this study, we present a dynamic three-dimensional velocity vector mapping method using PC-MRI which can visualize complex flow patterns through dynamically displayed 3D volume-rendered images. The direction of velocity mapping can be selected along any of three orthogonal axes. By vector summation, the three maps can be combined to form a velocity vector map that determines the velocity regardless of the flow direction. At the same time, the cine method is used to observe the dynamic change of flow. We performed a phantom study to evaluate the accuracy of the proposed PC-MRI approach in continuous and pulsatile flow measurement. The pulsatile flow waveform is generated by the ventricular assist device (VAD) HEMO-PULSA (Biomedlab, Seoul, Korea). We varied flow velocity, pulsatile flow waveform, and pulsing rate. The PC-MRI-derived velocities were compared with Doppler-derived results, and the velocities of the two measurements showed a significant linear correlation. Dynamic three-dimensional velocity vector mapping was carried out for two cases. First, we applied it to flow analysis around the artificial heart valve in a flat phantom; the flow pattern around the valve could be observed in the 3-dimensional cine image. Next, it was applied to the complex flow inside the polymer sac used as the ventricle in a totally implantable artificial heart (TAH). We could observe the flow pattern around the valves of the sac, even though such complex flow cannot be detected correctly with the conventional phase-contrast method. In addition, we could calculate the cardiac output from the TAH sac by quantitative measurement of the flow volume across the outlet valve.
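
    The vector-summation step is elementwise: the three orthogonal velocity-encoded components combine into a per-voxel speed and unit direction, as in the short sketch below (an illustration, not the authors' reconstruction code).

        # Sketch of the vector-summation step: combine the three orthogonal
        # velocity-encoded components into speed and direction per voxel.
        import numpy as np

        def velocity_vectors(vx, vy, vz):
            speed = np.sqrt(vx**2 + vy**2 + vz**2)
            direction = np.stack([vx, vy, vz], axis=-1) / np.maximum(speed, 1e-12)[..., None]
            return speed, direction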

  20. Computer simulation comparison of tripolar, bipolar, and spline Laplacian electrocardiogram estimators.

    PubMed

    Chen, T; Besio, W; Dai, W

    2009-01-01

    A comparison of the performance of the tripolar and bipolar concentric as well as spline Laplacian electrocardiograms (LECGs) and body surface Laplacian mappings (BSLMs) for localizing and imaging the cardiac electrical activation has been investigated based on computer simulation. In the simulation, a simplified eccentric heart-torso sphere-cylinder homogeneous volume conductor model was developed. Multiple dipoles with different orientations were used to simulate the underlying cardiac electrical activities. Results show that the tripolar concentric ring electrodes produce the most accurate LECG and BSLM estimation among the three estimators, with the best performance in spatial resolution.

  1. Middle Atmosphere Program. Handbook for MAP. Volume 31: Reference models of trace species for the COSPAR international reference atmosphere

    NASA Technical Reports Server (NTRS)

    Keating, G. M. (Editor)

    1989-01-01

    A set of preliminary reference atmosphere models of significant trace species which play important roles in controlling the chemistry, radiation budget, and circulation patterns of the atmosphere were produced. These models of trace species distributions are considered to be reference models rather than standard models; thus, it was not crucial that they be correct in an absolute sense. These reference models can serve as a means of comparison between individual observations, as a first guess in inversion algorithms, and as an approximate representation of observations for comparison to theoretical calculations.

  2. Forest Dragon-3: Decadal Trends of Northeastern Forests in China from Earth Observation Synergy

    NASA Astrophysics Data System (ADS)

    Schmullius, C.; Balling, J.; Schratz, P.; Thiel, C.; Santoro, M.; Wegmuller, U.; Li, Z.; Yong, P.

    2016-08-01

    In Forest DRAGON 3, the synergy of Earth Observation products was investigated to derive information on decadal trends of forests in northeast China. Following up on the results of Forest-DRAGON 1 and 2, Growing Stock Volume (GSV) products from different years were investigated to derive information on vegetation trends in northeast China. The BIOMASAR maps of 2005 and 2010, produced within the previous DRAGON projects, formed the basis for all analyses. We took a closer look at scale problems in GSV derivation, which are introduced by differing landcover within one pixel, to investigate differences across pixel classes with varying landcover class percentages. We developed an approach to select pixels containing forest only, with the aim of undertaking a detailed analysis of retrieved GSV values for such pixels for the years 2005 and 2010. Using existing land cover products at different scales, the plausibility of changes in the BIOMASAR maps was checked.

  3. Partitioning of Electromagnetic Energy Inputs to the Thermosphere during Geomagnetic Disturbances

    DTIC Science & Technology

    2012-06-01

    ...boundary of a local flux tube volume is an equipotential. Figure 4 contains maps of Poynting flux normal to a 500 km altitude surface and maps of height... as a cell quantity throughout its computational volume, we are able to generate maps of the Poynting flux on altitude surfaces at... the top of the thermosphere. We used separate modules to integrate the Poynting flux over this surface to compute the total electromagnetic energy...

  4. An enhanced trend surface analysis equation for regional-residual separation of gravity data

    NASA Astrophysics Data System (ADS)

    Obasi, A. I.; Onwuemesi, A. G.; Romanus, O. M.

    2016-12-01

    Trend surface analysis is a geological term for a mathematical technique which separates a given map set into a regional component and a local component. This work extends the steps for deriving the constants in the trend surface analysis equation from the widely known matrix and simultaneous-equation form to a simpler, more easily implemented format. To achieve this, matrix inversion was applied to the existing equations, and the outcome was tested for suitability using a large volume of gravity data acquired from the Anambra Basin, south-eastern Nigeria. Tabulation of the field data set was done using a Microsoft Excel spreadsheet, while gravity maps were generated from the data set using Oasis Montaj software. A comparison of the residual gravity map produced using the new equations with its software-derived counterpart shows that the former has a higher enhancing capacity than the latter. The equation is well suited to application in the separation of gravity data sets into their regional and residual components.
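
    For a first-order trend surface the constants solve a small least-squares system, whether by explicit matrix inversion as in the paper or by a numerically safer least-squares routine; the residual is then the observed value minus the regional trend. The sketch below illustrates the standard first-order case, not the enhanced equation derived in the paper.

        # Sketch of first-order trend surface separation: solve for the plane
        # g ~ a + b*x + c*y by least squares, then residual = observed - regional.
        import numpy as np

        def regional_residual(x, y, g):
            """x, y: station coordinates; g: observed gravity at the stations."""
            A = np.column_stack([np.ones_like(x), x, y])
            coeffs, *_ = np.linalg.lstsq(A, g, rcond=None)   # same solution as (A^T A)^-1 A^T g
            regional = A @ coeffs
            residual = g - regional
            return coeffs, regional, residual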

  5. Communications among elements of a space construction ensemble

    NASA Technical Reports Server (NTRS)

    Davis, Randal L.; Grasso, Christopher A.

    1989-01-01

    Space construction projects will require careful coordination between managers, designers, manufacturers, operators, astronauts, and robots with large volumes of information of varying resolution, timeliness, and accuracy flowing between the distributed participants over computer communications networks. Within the CSC Operations Branch, we are researching the requirements and options for such communications. Based on our work to date, we feel that communications standards being developed by the International Standards Organization, the CCITT, and other groups can be applied to space construction. We are currently studying in depth how such standards can be used to communicate with robots and automated construction equipment used in a space project. Specifically, we are looking at how the Manufacturing Automation Protocol (MAP) and the Manufacturing Message Specification (MMS), which tie together computers and machines in automated factories, might be applied to space construction projects. Together with our CSC industrial partner Computer Technology Associates, we are developing a MAP/MMS companion standard for space construction and we will produce software to allow the MAP/MMS protocol to be used in our CSC operations testbed.

  6. Glacier volume estimation of Cascade Volcanoes—an analysis and comparison with other methods

    USGS Publications Warehouse

    Driedger, Carolyn L.; Kennard, P.M.

    1986-01-01

    During the 1980 eruption of Mount St. Helens, the occurrence of floods and mudflows made apparent a need to assess mudflow hazards on other Cascade volcanoes. A basic requirement for such analysis is information about the volume and distribution of snow and ice on these volcanoes. An analysis was made of the volume-estimation methods developed by previous authors and a volume estimation method was developed for use in the Cascade Range. A radio echo-sounder, carried in a backpack, was used to make point measurements of ice thickness on major glaciers of four Cascade volcanoes (Mount Rainier, Washington; Mount Hood and the Three Sisters, Oregon; and Mount Shasta, California). These data were used to generate ice-thickness maps and bedrock topographic maps for developing and testing volume-estimation methods. Subsequently, the methods were applied to the unmeasured glaciers on those mountains and, as a test of the geographical extent of applicability, to glaciers beyond the Cascades having measured volumes. Two empirical relationships were required in order to predict volumes for all the glaciers. Generally, for glaciers less than 2.6 km in length, volume was found to be estimated best by using glacier area, raised to a power. For longer glaciers, volume was found to be estimated best by using a power law relationship, including slope and shear stress. The necessary variables can be estimated from topographic maps and aerial photographs.
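
    A hedged sketch of the two kinds of relationship described above, with placeholder coefficients rather than the calibrated values of Driedger and Kennard's report; the long-glacier branch uses the standard perfect-plasticity thickness h = tau / (rho * g * sin(alpha)) as a stand-in for the report's slope/shear-stress power law:

    ```python
    import numpy as np

    RHO_ICE = 917.0      # kg m^-3
    G = 9.81             # m s^-2

    def volume_short_glacier(area_km2, c=0.03, gamma=1.36):
        """Illustrative area-scaling estimate V = c * A**gamma (km^3).

        c and gamma are placeholder values, not the calibrated coefficients
        of the 1986 report."""
        return c * area_km2 ** gamma

    def volume_long_glacier(area_km2, surface_slope_deg, tau_kpa=100.0):
        """Thickness from a perfect-plasticity assumption, h = tau / (rho g sin(alpha)),
        multiplied by area; a stand-in for the report's slope/shear-stress power law."""
        h_m = (tau_kpa * 1e3) / (RHO_ICE * G * np.sin(np.radians(surface_slope_deg)))
        return area_km2 * (h_m / 1000.0)   # km^3

    print(volume_short_glacier(2.0))            # small glacier, area only
    print(volume_long_glacier(8.0, 12.0))       # longer glacier, slope + shear stress
    ```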

  7. Assembling a biogenic hydrocarbon emissions inventory for the SCOS97-NARSTO modeling domain

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benjamin, M.T.; Winer, A.M.; Karlik, J.

    1998-12-31

    To assist in developing ozone control strategies for Southern California, the California Air Resources Board is developing a biogenic hydrocarbon (BHC) emissions inventory model for the SCOS97-NARSTO domain. The basis for this bottom-up model is SCOS97-NARSTO-specific landuse and landcover maps, leafmass constants, and BHC emission rates. In urban areas, landuse maps developed by the Southern California Association of Governments, San Diego Association of Governments, and other local governments are used while in natural areas, landcover and plant community databases produced by the GAP Analysis Project (GAP) are employed. Plant identities and canopy volumes for species in each landuse and landcover category are based on the most recent botanical field survey data. Where possible, experimentally determined leafmass constant and BHC emission rate measurements reported in the literature are used or, for those species where experimental data are not available, values are assigned based on taxonomic methods. A geographic information system is being used to integrate these databases, as well as the most recent environmental correction algorithms and canopy shading factors, to produce a spatially- and temporally-resolved BHC emission inventory suitable for input into the Urban Airshed Model.

  8. Three-Dimensional Geologic Map of the Hayward Fault Zone, San Francisco Bay Region, California

    USGS Publications Warehouse

    Phelps, G.A.; Graymer, R.W.; Jachens, R.C.; Ponce, D.A.; Simpson, R.W.; Wentworth, C.M.

    2008-01-01

    A three-dimensional (3D) geologic map of the Hayward Fault zone was created by integrating the results from geologic mapping, potential field geophysics, and seismology investigations. The map volume is 100 km long, 20 km wide, and extends to a depth of 12 km below sea level. The map volume is oriented northwest and is approximately bisected by the Hayward Fault. The complex geologic structure of the region makes it difficult to trace many geologic units into the subsurface. Therefore, the map units are generalized from 1:24,000-scale geologic maps. Descriptions of geologic units and structures are offered, along with a discussion of the methods used to map them and incorporate them into the 3D geologic map. The map spatial database and associated viewing software are provided. Elements of the map, such as individual fault surfaces, are also provided in a non-proprietary format so that the user can access the map via open-source software. The sheet accompanying this manuscript shows views taken from the 3D geologic map for the user to access. The 3D geologic map is designed as a multi-purpose resource for further geologic investigations and process modeling.

  9. Site-specific Microtubule-associated Protein 4 Dephosphorylation Causes Microtubule Network Densification in Pressure Overload Cardiac Hypertrophy*

    PubMed Central

    Chinnakkannu, Panneerselvam; Samanna, Venkatesababa; Cheng, Guangmao; Ablonczy, Zsolt; Baicu, Catalin F.; Bethard, Jennifer R.; Menick, Donald R.; Kuppuswamy, Dhandapani; Cooper, George

    2010-01-01

    In severe pressure overload-induced cardiac hypertrophy, a dense, stabilized microtubule network forms that interferes with cardiocyte contraction and microtubule-based transport. This is associated with persistent transcriptional up-regulation of cardiac α- and β-tubulin and microtubule-stabilizing microtubule-associated protein 4 (MAP4). There is also extensive microtubule decoration by MAP4, suggesting greater MAP4 affinity for microtubules. Because the major determinant of this affinity is site-specific MAP4 dephosphorylation, we characterized this in hypertrophied myocardium and then assessed the functional significance of each dephosphorylation site found by mimicking it in normal cardiocytes. We first isolated MAP4 from normal and pressure overload-hypertrophied feline myocardium; volume-overloaded myocardium, which has an equal degree and duration of hypertrophy but normal functional and cytoskeletal properties, served as a control for any nonspecific growth-related effects. After cloning cDNA-encoding feline MAP4 and obtaining its deduced amino acid sequence, we characterized by mass spectrometry any site-specific MAP4 dephosphorylation. Solely in pressure overload-hypertrophied myocardium, we identified striking MAP4 dephosphorylation at Ser-472 in the MAP4 N-terminal projection domain and at Ser-924 and Ser-1056 in the assembly-promoting region of the C-terminal microtubule-binding domain. Site-directed mutagenesis of MAP4 cDNA was then used to switch each serine to non-phosphorylatable alanine. Wild-type and mutated cDNAs were used to construct adenoviruses; microtubule network density, stability, and MAP4 decoration were assessed in normal cardiocytes following an equivalent level of MAP4 expression. The Ser-924 → Ala MAP4 mutant produced a microtubule phenotype indistinguishable from that seen in pressure overload hypertrophy, such that Ser-924 MAP4 dephosphorylation during pressure overload hypertrophy may be central to this cytoskeletal abnormality. PMID:20436166

  10. High resolution global flood hazard map from physically-based hydrologic and hydraulic models.

    NASA Astrophysics Data System (ADS)

    Begnudelli, L.; Kaheil, Y.; McCollum, J.

    2017-12-01

    The global flood map published online at http://www.fmglobal.com/research-and-resources/global-flood-map at 90-m resolution is being used worldwide to understand flood risk exposure, exercise certain measures of mitigation, and/or transfer the residual risk financially through flood insurance programs. The modeling system is based on a physically based hydrologic model to simulate river discharges and a 2D shallow-water hydrodynamic model to simulate inundation. The model can be applied to large-scale flood hazard mapping thanks to several solutions that maximize its efficiency and the use of parallel computing. The hydrologic component of the modeling system is the Hillslope River Routing (HRR) hydrologic model. HRR simulates hydrological processes using a Green-Ampt parameterization and is calibrated against observed discharge data from several publicly available datasets. For inundation mapping, we use a 2D finite-volume shallow-water model with wetting/drying. We introduce here a grid Up-Scaling Technique (UST) for hydraulic modeling to perform high-resolution simulations at global scale with relatively short computational times. A 30-m SRTM DEM is now available worldwide, along with higher-accuracy and/or higher-resolution local Digital Elevation Models (DEMs) in many countries and regions. UST consists of aggregating computational cells, thus forming a coarser grid, while retaining the topographic information from the original full-resolution mesh. The full-resolution topography is used for building relationships between volume and free-surface elevation inside cells and for computing inter-cell fluxes. This approach almost achieves the computational speed typical of coarse grids while preserving, to a significant extent, the accuracy offered by the much higher-resolution DEM. The simulations are carried out along each river of the network by forcing the hydraulic model with the streamflow hydrographs generated by HRR. Hydrographs are scaled so that the peak matches the return period of the hazard map being produced (e.g., 100 years or 500 years). Each numerical simulation models one river reach, except for the longest reaches, which are split into smaller parts. Here we show results for selected river basins worldwide.
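
    To make the up-scaling idea concrete, a minimal sketch (with hypothetical array names, not the operational implementation) of how a coarse cell can keep a volume versus free-surface-elevation relationship built from the full-resolution DEM cells it aggregates:

    ```python
    import numpy as np

    def stage_volume_curve(fine_dem_elevations, cell_area_m2, n_levels=20):
        """Build a free-surface-elevation vs. stored-volume relationship for one
        coarse cell from the elevations of the fine DEM cells it aggregates."""
        z = np.asarray(fine_dem_elevations, dtype=float)
        levels = np.linspace(z.min(), z.max() + 5.0, n_levels)   # candidate water surfaces (m)
        # Volume stored when the water surface sits at each candidate level:
        volumes = np.array([(np.clip(h - z, 0.0, None) * cell_area_m2).sum() for h in levels])
        return levels, volumes

    def surface_elevation_from_volume(volume_m3, levels, volumes):
        """Invert the curve: interpolate the free-surface elevation for a given volume."""
        return np.interp(volume_m3, volumes, levels)

    # Example: a 10x10 block of 30-m DEM cells aggregated into one coarse cell.
    fine_dem = 100.0 + np.random.rand(10, 10) * 3.0
    levels, volumes = stage_volume_curve(fine_dem, cell_area_m2=30.0 * 30.0)
    print(surface_elevation_from_volume(5.0e4, levels, volumes))
    ```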

  11. Dem Reconstruction Using Light Field and Bidirectional Reflectance Function from Multi-View High Resolution Spatial Images

    NASA Astrophysics Data System (ADS)

    de Vieilleville, F.; Ristorcelli, T.; Delvit, J.-M.

    2016-06-01

    This paper presents a method for dense DSM reconstruction from a high-resolution, single-sensor, passive spaceborne panchromatic image sequence. Our approach is of interest in four respects. Firstly, we extend the core of light-field approaches with an explicit BRDF model from the image-synthesis community that is more realistic than the Lambertian model; the chosen model is the Cook-Torrance BRDF, which enables us to model rough surfaces with specular effects through specific material parameters. Secondly, we extend light-field approaches to non-pinhole sensors and non-rectilinear motion by applying a suitable geometric transformation to the image sequence. Thirdly, we produce a 3D cost volume spanning all tested candidate heights and filter it using simple methods such as cost-volume filtering or variational optimization methods. We have tested our method on Pleiades image sequences over several locations with dense urban buildings and report encouraging results with respect to classic multi-label methods such as MIC-MAC, as well as more recent pipelines such as S2P. Last but not least, our method also produces maps of material parameters at the estimated points, which simplifies building classification or road extraction.
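
    For reference, a minimal sketch of the standard Cook-Torrance specular term (Beckmann distribution, Schlick Fresnel approximation, Cook-Torrance geometric attenuation); the roughness and reflectance parameters are illustrative and not necessarily the parameterization used by the authors:

    ```python
    import numpy as np

    def cook_torrance_specular(n, l, v, roughness=0.3, f0=0.04):
        """Standard Cook-Torrance specular BRDF term (Beckmann D, Schlick F,
        Cook-Torrance G). All vectors are unit 3-vectors; parameters are illustrative."""
        h = l + v
        h = h / np.linalg.norm(h)
        nl = max(np.dot(n, l), 1e-6)
        nv = max(np.dot(n, v), 1e-6)
        nh = max(np.dot(n, h), 1e-6)
        vh = max(np.dot(v, h), 1e-6)

        # Beckmann microfacet distribution
        m2 = roughness ** 2
        d = np.exp((nh ** 2 - 1.0) / (m2 * nh ** 2)) / (np.pi * m2 * nh ** 4)
        # Schlick approximation to the Fresnel term
        f = f0 + (1.0 - f0) * (1.0 - vh) ** 5
        # Cook-Torrance geometric attenuation
        g = min(1.0, 2.0 * nh * nv / vh, 2.0 * nh * nl / vh)

        return d * f * g / (4.0 * nl * nv)

    n = np.array([0.0, 0.0, 1.0])
    l = np.array([0.3, 0.0, 1.0]); l /= np.linalg.norm(l)
    v = np.array([-0.2, 0.1, 1.0]); v /= np.linalg.norm(v)
    print(cook_torrance_specular(n, l, v))
    ```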

  12. Integrating multidisciplinary science, modelling and impact data into evolving, syn-event volcanic hazard mapping and communication: A case study from the 2012 Tongariro eruption crisis, New Zealand

    NASA Astrophysics Data System (ADS)

    Leonard, Graham S.; Stewart, Carol; Wilson, Thomas M.; Procter, Jonathan N.; Scott, Bradley J.; Keys, Harry J.; Jolly, Gill E.; Wardman, Johnny B.; Cronin, Shane J.; McBride, Sara K.

    2014-10-01

    New Zealand's Tongariro National Park volcanoes produce hazardous eruptions every few years to decades. On 6 August 2012 the Te Maari vent of Tongariro Volcano erupted, producing a series of explosions and a fine ash of minor volume which was dispersed rapidly to the east. This manuscript presents a summary of the eruption impacts and the way these supported science communication during the crisis, particularly in terms of hazard map development. The most significant proximal impact was damage from pyroclastic surges and ballistics to the popular and economically-important Tongariro Alpine Crossing track. The only hazard to affect the medial impact zone was a few mms of ashfall with minor impacts. Field testing indicated that the Te Maari ash had extremely low resistivity when wetted, implying a very high potential to cause disruption to nationally-important power transmission networks via the mechanism of insulator flashover. This was not observed, presumably due to insufficient ash accumulation on insulators. Virtually no impacts from distal ashfall were reported. Post-event analysis of PM10 data demonstrates the additional value of regional air quality monitoring networks in quantifying population exposure to airborne respirable ash. While the eruption was minor, it generated a high level of public interest and a demand for information on volcanic hazards and impacts from emergency managers, the public, critical infrastructure managers, health officials, and the agriculture sector. Meeting this demand fully taxed available resources. We present here aspects of the New Zealand experience which may have wider applicability in moving towards improved integration of hazard impact information, mapping, and communication. These include wide use of a wiki technical clearinghouse and email listservs, a focus on multi-agency consistent messages, and a recently developed environment of collaboration and alignment of both research funding and technical science advice. Hazard maps were integral to science communication during the crisis, but there is limited international best practice information available on hazard maps as communication devices, as most volcanic hazard mapping literature is concerned with defining hazard zones. We propose that hazard maps are only as good as the communications framework and inter-agency relationships in which they are embedded, and we document in detail the crisis hazard map development process. We distinguish crisis hazard maps from background hazard maps and ashfall prediction maps, illustrating the complementary nature of these three distinct communication mechanisms. We highlight issues that arose and implications for the development of future maps.

  13. Mapping rice areas of South Asia using MODIS multitemporal data

    NASA Astrophysics Data System (ADS)

    Gumma, Murali Krishna; Nelson, Andrew; Thenkabail, Prasad S.; Singh, Amrendra N.

    2011-01-01

    Our goal is to map the rice areas of six South Asian countries using moderate-resolution imaging spectroradiometer (MODIS) time-series data for the time period 2000 to 2001. South Asia accounts for almost 40% of the world's harvested rice area and is also home to 74% of the population that lives on less than $2.00 a day. The population of the region is growing faster than its ability to produce rice. Thus, accurate and timely assessment of where and how rice is cultivated is important to craft food security and poverty alleviation strategies. We used a time series of eight-day, 500-m spatial resolution composite images from the MODIS sensor to produce rice maps and rice characteristics (e.g., intensity of cropping, cropping calendar) taking data for the years 2000 to 2001 and by adopting a suite of methods that include spectral matching techniques, decision trees, and ideal temporal profile data banks to rapidly identify and classify rice areas over large spatial extents. These methods are used in conjunction with ancillary spatial data sets (e.g., elevation, precipitation), national statistics, and maps, and a large volume of field-plot data. The resulting rice maps and statistics are compared against a subset of independent field-plot points and the best available subnational statistics on rice areas for the main crop growing season (kharif season). A fuzzy classification accuracy assessment for the 2000 to 2001 rice-map product, based on field-plot data, demonstrated accuracies from 67% to 100% for individual rice classes, with an overall accuracy of 80% for all classes. Most of the mixing was within rice classes. The derived physical rice area was highly correlated with the subnational statistics with R2 values of 97% at the district level and 99% at the state level for 2000 to 2001. These results suggest that the methods, approaches, algorithms, and data sets we used are ideal for rapid, accurate, and large-scale mapping of paddy rice as well as for generating their statistics over large areas.
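
    A minimal sketch of the spectral-matching idea, assuming hypothetical ideal NDVI temporal profiles and an 8-point pixel time series; the class names, profiles, and correlation threshold are placeholders, not the values used in the study:

    ```python
    import numpy as np

    # Hypothetical ideal temporal profiles (class name -> 8-point NDVI time series)
    ideal_profiles = {
        "irrigated_rice_single_crop": np.array([0.2, 0.3, 0.5, 0.7, 0.8, 0.7, 0.5, 0.3]),
        "irrigated_rice_double_crop": np.array([0.3, 0.6, 0.4, 0.3, 0.6, 0.8, 0.6, 0.4]),
        "other_vegetation":           np.array([0.4, 0.45, 0.5, 0.55, 0.55, 0.5, 0.45, 0.4]),
    }

    def classify_pixel(ndvi_series, profiles, min_correlation=0.7):
        """Assign a pixel to the ideal profile with the highest correlation,
        or to 'unclassified' if no profile matches well enough."""
        best_class, best_r = "unclassified", min_correlation
        for name, profile in profiles.items():
            r = np.corrcoef(ndvi_series, profile)[0, 1]
            if r > best_r:
                best_class, best_r = name, r
        return best_class, best_r

    pixel = np.array([0.25, 0.35, 0.55, 0.72, 0.78, 0.66, 0.48, 0.28])
    print(classify_pixel(pixel, ideal_profiles))
    ```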

  14. Mapping rice areas of South Asia using MODIS multitemporal data

    USGS Publications Warehouse

    Gumma, M.K.; Nelson, A.; Thenkabail, P.S.; Singh, A.N.

    2011-01-01

    Our goal is to map the rice areas of six South Asian countries using moderate-resolution imaging spectroradiometer (MODIS) time-series data for the time period 2000 to 2001. South Asia accounts for almost 40% of the world's harvested rice area and is also home to 74% of the population that lives on less than $2.00 a day. The population of the region is growing faster than its ability to produce rice. Thus, accurate and timely assessment of where and how rice is cultivated is important to craft food security and poverty alleviation strategies. We used a time series of eight-day, 500-m spatial resolution composite images from the MODIS sensor to produce rice maps and rice characteristics (e.g., intensity of cropping, cropping calendar) taking data for the years 2000 to 2001 and by adopting a suite of methods that include spectral matching techniques, decision trees, and ideal temporal profile data banks to rapidly identify and classify rice areas over large spatial extents. These methods are used in conjunction with ancillary spatial data sets (e.g., elevation, precipitation), national statistics, and maps, and a large volume of field-plot data. The resulting rice maps and statistics are compared against a subset of independent field-plot points and the best available subnational statistics on rice areas for the main crop growing season (kharif season). A fuzzy classification accuracy assessment for the 2000 to 2001 rice-map product, based on field-plot data, demonstrated accuracies from 67% to 100% for individual rice classes, with an overall accuracy of 80% for all classes. Most of the mixing was within rice classes. The derived physical rice area was highly correlated with the subnational statistics with R2 values of 97% at the district level and 99% at the state level for 2000 to 2001. These results suggest that the methods, approaches, algorithms, and data sets we used are ideal for rapid, accurate, and large-scale mapping of paddy rice as well as for generating their statistics over large areas. ?? 2011 Society of Photo-Optical Instrumentation Engineers (SPIE).

  15. Preliminary Aeromagnetic Map of Joshua Tree National Park and Vicinity, Southern California

    USGS Publications Warehouse

    Langenheim, V.E.; Hill, P.L.

    2010-01-01

    This aeromagnetic map of Joshua Tree National Park and vicinity is intended to promote further understanding of the geology and structure in the region by serving as a basis for geophysical interpretations and by supporting geological mapping, water-resource investigations, and various topical studies. Local spatial variations in the Earth's magnetic field (evident as anomalies on aeromagnetic maps) reflect the distribution of magnetic minerals, primarily magnetite, in the underlying rocks. In many cases the volume content of magnetic minerals can be related to rock type, and abrupt spatial changes in the amount of magnetic minerals commonly mark lithologic or structural boundaries. Bodies of mafic and ultramafic rocks tend to produce the most intense magnetic anomalies, but such generalizations must be applied with caution because rocks with more felsic compositions, or even some sedimentary units, also can cause measurable magnetic anomalies. The database includes two ASCII files containing new aeromagnetic data and two ASCII files with point locations of the local maximum horizontal gradient derived from the aeromagnetic data. This metadata file describes the horizontal gradient locations derived from new and existing aeromagnetic data. This aeromagnetic map identifies magnetic features as a basis for geophysical interpretations; the gradients help define the edges of magnetic sources. This database updates geophysical information originally presented in smaller-scale formats and includes detailed aeromagnetic data collected by EON Geosciences, Inc.

  16. The evolution of mapping habitat for northern spotted owls (Strix occidentalis caurina): A comparison of photo-interpreted, Landsat-based, and lidar-based habitat maps

    USGS Publications Warehouse

    Ackers, Steven H.; Davis, Raymond J.; Olsen, K.; Dugger, Catherine

    2015-01-01

    Wildlife habitat mapping has evolved at a rapid pace over the last few decades. Beginning with simple, often subjective, hand-drawn maps, habitat mapping now involves complex species distribution models (SDMs) using mapped predictor variables derived from remotely sensed data. For species that inhabit large geographic areas, remote sensing technology is often essential for producing range-wide maps. Habitat monitoring for northern spotted owls (Strix occidentalis caurina), whose geographic range covers about 23 million ha, is based on SDMs that use Landsat Thematic Mapper imagery to create forest vegetation data layers using gradient nearest neighbor (GNN) methods. Vegetation data layers derived from GNN are modeled relationships between forest inventory plot data, climate and topographic data, and the spectral signatures acquired by the satellite. When used as predictor variables for SDMs, there is some transference of the GNN modeling error to the final habitat map. Recent increases in the use of light detection and ranging (lidar) data, coupled with the need to produce spatially accurate and detailed forest vegetation maps, have spurred interest in its use for SDMs and habitat mapping. Instead of modeling predictor variables from remotely sensed spectral data, lidar provides direct measurements of vegetation height for use in SDMs. We expect an SDM habitat map produced from directly measured predictor variables to be more accurate than one produced from modeled predictors. We used maximum entropy (Maxent) SDM modeling software to compare predictive performance and estimates of habitat area between Landsat-based and lidar-based northern spotted owl SDMs and habitat maps. We explored the differences and similarities between these maps and a pre-existing aerial photo-interpreted habitat map produced by local wildlife biologists. The lidar-based map had the highest predictive performance based on 10 bootstrapped replicate models (AUC = 0.809 ± 0.011), but the performance of the Landsat-based map was within acceptable limits (AUC = 0.717 ± 0.021). As is common with photo-interpreted maps, there was no accuracy assessment available for comparison. The photo-interpreted map produced the highest and lowest estimates of habitat area, depending on which habitat classes were included (nesting, roosting, and foraging habitat = 9962 ha; nesting habitat only = 6036 ha). The Landsat-based map produced an estimate of habitat area that was within this range (95% CI: 6679–9592 ha), while the lidar-based map produced an area estimate similar to what was interpreted by local wildlife biologists as nesting (i.e., high quality) habitat using aerial imagery (95% CI: 5453–7216 ha). Confidence intervals of habitat area estimates from the SDMs based on Landsat and lidar overlapped. We concluded that both Landsat- and lidar-based SDMs produced reasonable maps and area estimates for northern spotted owl habitat within the study area. The lidar-based map was more precise and spatially similar to what local wildlife biologists considered spotted owl nesting habitat. The Landsat-based map provided a less precise spatial representation of habitat within the relatively small geographic confines of the study area, but habitat area estimates were similar to both the photo-interpreted and lidar-based maps. Photo-interpreted maps are time consuming to produce, subjective in nature, and difficult to replicate.
SDMs provide a framework for efficiently producing habitat maps that can be replicated as habitat conditions change over time, provided that comparable remotely sensed data are available. When the SDM uses predictor variables extracted from lidar data, it can produce a habitat map that is both accurate and useful at large and small spatial scales. In comparison, SDMs using Landsat-based data are more appropriate for large scale analyses of amounts and general spatial patterns of habitat at regional scales.
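
    As a sketch of how the reported AUC summaries over bootstrapped replicates could be reproduced (using synthetic presence/background data and scikit-learn utilities; this is not Maxent's internal evaluation procedure):

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score
    from sklearn.utils import resample

    rng = np.random.default_rng(0)

    # Hypothetical evaluation data: 1 = owl presence, 0 = background point,
    # and the SDM's predicted habitat suitability for each point.
    labels = rng.integers(0, 2, size=500)
    suitability = np.clip(labels * 0.6 + rng.normal(0.3, 0.25, size=500), 0, 1)

    aucs = []
    for _ in range(10):                       # 10 bootstrapped replicates, as in the abstract
        y_b, p_b = resample(labels, suitability)
        aucs.append(roc_auc_score(y_b, p_b))

    print(f"AUC = {np.mean(aucs):.3f} ± {np.std(aucs, ddof=1):.3f}")
    ```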

  17. Cartography for lunar exploration: 2008 status and mission plans

    USGS Publications Warehouse

    Kirk, R.L.; Archinal, B.A.; Gaddis, L.R.; Rosiek, M.R.; Chen, Jun; Jiang, Jie; Nayak, Shailesh

    2008-01-01

    The initial spacecraft exploration of the Moon in the 1960s-70s yielded extensive data, primarily in the form of film and television images, which were used to produce a large number of hardcopy maps by conventional techniques. A second era of exploration, beginning in the early 1990s, has produced digital data including global multispectral imagery and altimetry, from which a new generation of digital map products tied to a rapidly evolving global control network has been made. Efforts are also underway to scan the earlier hardcopy maps for online distribution and to digitize the film images so that modern processing techniques can be used to make high-resolution digital terrain models (DTMs) and image mosaics consistent with the current global control. The pace of lunar exploration is accelerating dramatically, with as many as eight new missions already launched or planned for the current decade. These missions, of which the most important for cartography are SMART-1 (Europe), Kaguya/SELENE (Japan), Chang'e-1 (China), Chandrayaan-1 (India), and Lunar Reconnaissance Orbiter (USA), will return a volume of data exceeding that of all previous lunar and planetary missions combined. Framing and scanner camera images, including multispectral and stereo data, hyperspectral images, synthetic aperture radar (SAR) images, and laser altimetry will all be collected, including, in most cases, multiple data sets of each type. Substantial advances in international standardization and cooperation, development of new and more efficient data processing methods, and availability of resources for processing and archiving will all be needed if the next generation of missions are to fulfill their potential for high-precision mapping of the Moon in support of subsequent exploration and scientific investigation.

  18. Next generation of global land cover characterization, mapping, and monitoring

    USGS Publications Warehouse

    Giri, Chandra; Pengra, Bruce; Long, J.; Loveland, Thomas R.

    2013-01-01

    Land cover change is increasingly affecting the biophysics, biogeochemistry, and biogeography of the Earth's surface and the atmosphere, with far-reaching consequences to human well-being. However, our scientific understanding of the distribution and dynamics of land cover and land cover change (LCLCC) is limited. Previous global land cover assessments performed using coarse spatial resolution (300 m–1 km) satellite data did not provide enough thematic detail or change information for global change studies and for resource management. High resolution (∼30 m) land cover characterization and monitoring is needed that permits detection of land change at the scale of most human activity and offers the increased flexibility of environmental model parameterization needed for global change studies. However, there are a number of challenges to overcome before producing such data sets including unavailability of consistent global coverage of satellite data, sheer volume of data, unavailability of timely and accurate training and validation data, difficulties in preparing image mosaics, and high performance computing requirements. Integration of remote sensing and information technology is needed for process automation and high-performance computing needs. Recent developments in these areas have created an opportunity for operational high resolution land cover mapping, and monitoring of the world. Here, we report and discuss these advancements and opportunities in producing the next generations of global land cover characterization, mapping, and monitoring at 30-m spatial resolution primarily in the context of United States, Group on Earth Observations Global 30 m land cover initiative (UGLC).

  19. Modeling Approach for Estimating Co-Produced Water Volumes and Saltwater Disposal Volumes in Oklahoma

    NASA Astrophysics Data System (ADS)

    Murray, K. E.

    2016-12-01

    Management of produced fluids has become an important issue in Oklahoma because large volumes of saltwater are co-produced with oil and gas, and disposed into saltwater disposal wells at high rates. Petroleum production increased from 2009-2015, especially in central and north-central Oklahoma where the Mississippian and Hunton zones were redeveloped using horizontal wells and dewatering techniques that have led to a disproportional increase in produced water volumes. Improved management of co-produced water, including desalination for beneficial reuse and decreased saltwater disposal volumes, is only possible if spatial and temporal trends can be defined and related to the producing zones. It is challenging to quantify the volumes of co-produced water by region or production zone because co-produced water volumes are generally not reported. Therefore, the goal of this research is to estimate co-produced water volumes for 2008-present with an approach that can be replicated as petroleum production shifts to other regions. Oil and gas production rates from subsurface zones were multiplied by ratios of H2O:oil and H2O:gas for the respective zones. Initial H2O:oil and H2O:gas ratios were adjusted/calibrated, by zone, to maximize correlation of county-scale produced H2O estimates versus saltwater disposal volumes from 2013-2015. These calibrated ratios were then used to compute saltwater disposal volumes from 2008-2012 because of apparent data gaps in reported saltwater disposal volumes during that timeframe. This research can be used to identify regions that have the greatest need for produced water treatment systems. The next step in management of produced fluids is to explore optimal energy-efficient strategies that reduce deleterious effects.
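
    A minimal sketch of the estimation step, with hypothetical county totals for a single producing zone; here the two ratios are fitted by least squares against reported disposal volumes as a simplified stand-in for the correlation-maximizing, multi-zone calibration described above:

    ```python
    import numpy as np

    # Hypothetical county-level annual totals for one producing zone:
    oil_bbl = np.array([1.2e6, 0.8e6, 2.5e6, 1.9e6])     # oil produced per county (bbl)
    gas_mcf = np.array([3.0e6, 5.5e6, 1.0e6, 2.2e6])     # gas produced per county (Mcf)
    swd_bbl = np.array([9.1e6, 9.4e6, 13.6e6, 11.6e6])   # reported saltwater disposal (bbl)

    # Estimate co-produced water as production multiplied by H2O:oil and H2O:gas ratios,
    # and fit the two ratios against reported disposal volumes.
    A = np.column_stack([oil_bbl, gas_mcf])
    (r_oil, r_gas), *_ = np.linalg.lstsq(A, swd_bbl, rcond=None)

    produced_water = r_oil * oil_bbl + r_gas * gas_mcf
    print(f"calibrated H2O:oil = {r_oil:.2f} bbl/bbl, H2O:gas = {r_gas:.3f} bbl/Mcf")
    print("estimated co-produced water by county (Mbbl):", np.round(produced_water / 1e6, 2))
    ```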

  20. Map of assessed shale gas in the United States, 2012

    USGS Publications Warehouse

    ,; Biewick, Laura R. H.

    2013-01-01

    The U.S. Geological Survey has compiled a map of shale-gas assessments in the United States that were completed by 2012 as part of the National Assessment of Oil and Gas Project. Using a geology-based assessment methodology, the U.S. Geological Survey quantitatively estimated potential volumes of undiscovered gas within shale-gas assessment units. These shale-gas assessment units are mapped, and square-mile cells are shown to represent proprietary shale-gas wells. The square-mile cells include gas-producing wells from shale intervals. In some cases, shale-gas formations contain gas in deeper parts of a basin and oil at shallower depths (for example, the Woodford Shale and the Eagle Ford Shale). Because a discussion of shale oil is beyond the scope of this report, only shale-gas assessment units and cells are shown. The map can be printed as a hardcopy map or downloaded for interactive analysis in a Geographic Information System data package using the ArcGIS map document (file extension MXD) and published map file (file extension PMF). Also available is a publications access table with hyperlinks to current U.S. Geological Survey shale gas assessment publications and web pages. Assessment results and geologic reports are available as completed at the U.S. Geological Survey Energy Resources Program Web Site, http://energy.usgs.gov/OilGas/AssessmentsData/NationalOilGasAssessment.aspx. A historical perspective of shale gas activity in the United States is documented and presented in a video clip included as a PowerPoint slideshow.

  1. STATEWIDE MAPPING OF FLORIDA SOIL RADON POTENTIALS VOLUME 2. APPENDICES A-P

    EPA Science Inventory

    The report gives results of a statewide mapping of Florida soil radon potentials. Statewide maps identify Florida Regions with different levels of soil radon potential. The maps provide scientific estimates of regional radon potentials that can serve as a basis for implementing r...

  2. STATEWIDE MAPPING OF FLORIDA SOIL RADON POTENTIALS VOLUME 1. TECHNICAL REPORT

    EPA Science Inventory

    The report gives results of a statewide mapping of Florida soil radon potentials. Statewide maps identify Florida Regions with different levels of soil radon potential. The maps provide scientific estimates of regional radon potentials that can serve as a basis for implementing r...

  3. Manifestation of a neuro-fuzzy model to produce landslide susceptibility map using remote sensing data derived parameters

    NASA Astrophysics Data System (ADS)

    Pradhan, Biswajeet; Lee, Saro; Buchroithner, Manfred

    Landslides are the most common natural hazards in Malaysia. Preparation of landslide susceptibility maps is important for engineering geologists and geomorphologists. However, due to the complex nature of landslides, producing a reliable susceptibility map is not easy. In this study, a new attempt is made to produce a landslide susceptibility map of part of the Cameron Valley of Malaysia. This paper develops an adaptive neuro-fuzzy inference system (ANFIS) within a geographic information system (GIS) environment for landslide susceptibility mapping. To obtain the neuro-fuzzy relations for producing the landslide susceptibility map, landslide locations were identified from interpretation of aerial photographs and high-resolution satellite images, field surveys, and historical inventory reports. Landslide conditioning factors such as slope, plan curvature, distance to drainage lines, soil texture, lithology, and distance to lineaments were extracted from topographic, soil, and lineament maps. Landslide-susceptible areas were analyzed by the ANFIS model and mapped using the conditioning factors. Furthermore, we applied various membership functions (MFs) and fuzzy relations to produce landslide susceptibility maps. The prediction performance of the susceptibility maps was checked against actual landslides in the study area. Results show that triangular, trapezoidal, and polynomial MFs were the best individual MFs for modelling landslide susceptibility maps (86…

  4. Status and trends of land change in the Eastern United States—1973 to 2000

    USGS Publications Warehouse

    Sayler, Kristi L.; Acevedo, William; Taylor, Janis

    2016-09-28

    PrefaceU.S. Geological Survey (USGS) Professional Paper 1794–D is the fourth in a four-volume series on the status and trends of the Nation’s land use and land cover, providing an assessment of the rates and causes of land-use and land-cover change in the Eastern United States between 1973 and 2000. Volumes A, B, and C provide similar analyses for the Western United States, the Great Plains of the United States, and the Midwest–South Central United States, respectively. The assessments of land-use and land-cover trends are conducted on an ecoregion-by-ecoregion basis, and each ecoregion assessment is guided by a nationally consistent study design that includes mapping, statistical methods, field studies, and analysis. Individual assessments provide a picture of the characteristics of land change occurring in a given ecoregion; in combination, they provide a framework for understanding the complex national mosaic of change and also the causes and consequences of change. Thus, each volume in this series provides a regional assessment of how (and how fast) land use and land cover are changing, and why. The four volumes together form the first comprehensive picture of land change across the Nation.Geographic understanding of land-use and land-cover change is directly relevant to a wide variety of stakeholders, including land and resource managers, policymakers, and scientists. The chapters in this volume present brief summaries of the patterns and rates of land change observed in each ecoregion in the Eastern United States, together with field photographs, statistics, and comparisons with other assessments. In addition, a synthesis chapter summarizes the scope of land change observed across the entire Eastern United States. The studies provide a way of integrating information across the landscape, and they form a critical component in the efforts to understand how land use and land cover affect important issues such as the provision of ecological goods and services and also the determination of risks to, and vulnerabilities of, human communities. Results from this project also are published in peer-reviewed journals, and they are further used to produce maps of change and other tools for land management, as well as to provide inputs for carbon-cycle modeling and other climate change research.This report is only one of the products produced by USGS on land-use and land-cover change in the United States. Other reports and land-cover statistics are available online at http://landcovertrends.usgs.gov.

  5. Status and trends of land change in the Western United States--1973 to 2000

    USGS Publications Warehouse

    Sleeter, Benjamin M.; Wilson, Tamara S.; Acevedo, William

    2012-12-05

    U.S. Geological Survey (USGS) Professional Paper 1794–A is the first in a four-volume series on the status and trends of the Nation’s land use and land cover, providing an assessment of the rates and causes of land-use and land-cover change in the Western United States between 1973 and 2000. Volumes B, C, and D provide similar analyses for the Great Plains, the Midwest–South Central United States, and the Eastern United States, respectively. The assessments of land-use and land-cover trends are conducted on an ecoregion-by-ecoregion basis, and each ecoregion assessment is guided by a nationally consistent study design that includes mapping, statistical methods, field studies, and analysis. Individual assessments provide a picture of the characteristics of land change occurring in a given ecoregion; in combination, they provide a framework for understanding the complex national mosaic of change and also the causes and consequences of change. Thus, each volume in this series provides a regional assessment of how (and how fast) land use and land cover are changing, and why. The four volumes together form the first comprehensive picture of land change across the Nation. Geographic understanding of land-use and land-cover change is directly relevant to a wide variety of stakeholders, including land and resource managers, policymakers, and scientists. The chapters in this volume present brief summaries of the patterns and rates of land change observed in each ecoregion in the Western United States, together with field photographs, statistics, and comparisons with other assessments. In addition, a synthesis chapter summarizes the scope of land change observed across the entire Western United States. The studies provide a way of integrating information across the landscape, and they form a critical component in the efforts to understand how land use and land cover affect important issues such as the provision of ecological goods and services and also the determination of risks to, and vulnerabilities of, human communities. Results from this project also are published in peer-reviewed journals, and they are further used to produce maps of change and other tools for land management, as well as to provide inputs for carbon-cycle modeling and other climate change research. This report is only one of the products produced by USGS on land-use and land-cover change in the United States. Other reports and land-cover statistics are available online at http://landcovertrends.usgs.gov.

  6. Atlas of United States Trees, Volume 2: Alaska Trees and Common Shrubs.

    ERIC Educational Resources Information Center

    Viereck, Leslie A.; Little, Elbert L., Jr.

    This volume is the second in a series of atlases describing the natural distribution or range of native tree species in the United States. The 82 species maps include 32 of trees in Alaska, 6 of shrubs rarely reaching tree size, and 44 more of common shrubs. More than 20 additional maps summarize environmental factors and furnish general…

  7. MRT letter: Guided filtering of image focus volume for 3D shape recovery of microscopic objects.

    PubMed

    Mahmood, Muhammad Tariq

    2014-12-01

    In this letter, a shape-from-focus (SFF) method is proposed that utilizes guided image filtering to enhance the image focus volume efficiently. First, the image focus volume is computed using a conventional focus measure. Then each layer of the image focus volume is filtered using guided filtering. In this work, the all-in-focus image, which can be obtained from the initial focus volume, is used as the guidance image. Finally, an improved depth map is obtained from the filtered image focus volume by maximizing the focus measure along the optical axis. The proposed SFF method is efficient and provides better depth maps. The improved performance is highlighted by conducting several experiments using image sequences of simulated and real microscopic objects. The comparative analysis demonstrates the effectiveness of the proposed SFF method. © 2014 Wiley Periodicals, Inc.
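
    A minimal sketch of the pipeline described in the abstract (focus volume from a simple focus measure, guided filtering of each layer with the all-in-focus image as guidance, then an argmax along the optical axis); the focus measure and filter parameters are illustrative, not the letter's exact choices:

    ```python
    import numpy as np
    from scipy.ndimage import uniform_filter

    def guided_filter(guide, src, radius=4, eps=1e-3):
        """He et al. guided filter, implemented with box filters."""
        size = 2 * radius + 1
        mean_I  = uniform_filter(guide, size)
        mean_p  = uniform_filter(src, size)
        corr_Ip = uniform_filter(guide * src, size)
        corr_II = uniform_filter(guide * guide, size)
        var_I  = corr_II - mean_I ** 2
        cov_Ip = corr_Ip - mean_I * mean_p
        a = cov_Ip / (var_I + eps)
        b = mean_p - a * mean_I
        return uniform_filter(a, size) * guide + uniform_filter(b, size)

    def modified_laplacian(img):
        """A simple focus measure (sum-modified-Laplacian style)."""
        dxx = np.abs(2 * img - np.roll(img, 1, 1) - np.roll(img, -1, 1))
        dyy = np.abs(2 * img - np.roll(img, 1, 0) - np.roll(img, -1, 0))
        return dxx + dyy

    def shape_from_focus(image_stack):
        """Depth map from a focal stack: focus volume -> guided filtering -> argmax."""
        stack = np.asarray(image_stack, dtype=float)
        focus_volume = np.stack([modified_laplacian(f) for f in stack])
        rows = np.arange(stack.shape[1])[:, None]
        cols = np.arange(stack.shape[2])[None, :]
        all_in_focus = stack[np.argmax(focus_volume, axis=0), rows, cols]   # guidance image
        filtered = np.stack([guided_filter(all_in_focus, layer) for layer in focus_volume])
        return np.argmax(filtered, axis=0)   # depth index per pixel

    depth = shape_from_focus(np.random.rand(10, 64, 64))
    print(depth.shape)
    ```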

  8. Does Preinterventional Flat-Panel Computer Tomography Pooled Blood Volume Mapping Predict Final Infarct Volume After Mechanical Thrombectomy in Acute Cerebral Artery Occlusion?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wagner, Marlies, E-mail: marlies.wagner@kgu.de; Kyriakou, Yiannis, E-mail: yiannis.kyriakou@siemens.com; Mesnil de Rochemont, Richard du, E-mail: mesnil@em.uni-frankfurt.de

    2013-08-01

    Purpose: Decreased cerebral blood volume is known to be a predictor of final infarct volume in acute cerebral artery occlusion. To evaluate the predictability of final infarct volume in patients with acute occlusion of the middle cerebral artery (MCA) or the distal internal carotid artery (ICA) and successful endovascular recanalization, pooled blood volume (PBV) was measured using flat-panel detector computed tomography (FPD CT). Materials and Methods: Twenty patients with acute unilateral occlusion of the MCA or distal ICA without demarcated infarction, as proven by CT at admission, and successful endovascular thrombectomy (Thrombolysis in Cerebral Infarction [TICI] score 2b or 3) were included. Cerebral PBV maps were acquired from each patient immediately before endovascular thrombectomy. Twenty-four hours after recanalization, each patient underwent multislice CT to visualize final infarct volume. The extent of the areas of decreased PBV was compared with the final infarct volume proven by follow-up CT the next day. Results: In 15 of 20 patients, areas of distinct PBV decrease corresponded to final infarct volume. In 5 patients, areas of decreased PBV overestimated the final extent of ischemia, probably due to inappropriate timing of data acquisition and misery perfusion. Conclusion: PBV mapping using FPD CT is a promising tool for predicting areas of irrecoverable brain parenchyma in acute thromboembolic stroke. Further validation is necessary before routine use in decision making for interventional thrombectomy.

  9. Automatic detection of artifacts in converted S3D video

    NASA Astrophysics Data System (ADS)

    Bokov, Alexander; Vatolin, Dmitriy; Zachesov, Anton; Belous, Alexander; Erofeev, Mikhail

    2014-03-01

    In this paper we present algorithms for automatically detecting issues specific to converted S3D content. When a depth-image-based rendering approach produces a stereoscopic image, the quality of the result depends on both the depth maps and the warping algorithms. The most common problem with converted S3D video is edge-sharpness mismatch. This artifact may appear owing to depth-map blurriness at semitransparent edges: after warping, the object boundary becomes sharper in one view and blurrier in the other, yielding binocular rivalry. To detect this problem we estimate the disparity map, extract boundaries with noticeable differences, and analyze edge-sharpness correspondence between views. We pay additional attention to cases involving a complex background and large occlusions. Another problem is the detection of scenes that lack depth volume: we present algorithms for detecting flat scenes and scenes with flat foreground objects. To identify these problems we analyze the features of the RGB image as well as uniform areas in the depth map. Testing of our algorithms involved examining 10 Blu-ray 3D releases with converted S3D content, including Clash of the Titans, The Avengers, and The Chronicles of Narnia: The Voyage of the Dawn Treader. The algorithms we present enable improved automatic quality assessment during the production stage.
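
    A hedged sketch of the edge-sharpness comparison step, assuming a rectified stereo pair and a precomputed left-to-right disparity map; the sharpness measure and mismatch threshold are placeholders, not the authors' detector:

    ```python
    import numpy as np
    from scipy.ndimage import sobel, uniform_filter

    def edge_sharpness(img, window=7):
        """Local edge sharpness: gradient magnitude averaged over a small window."""
        grad = np.hypot(sobel(img, axis=0), sobel(img, axis=1))
        return uniform_filter(grad, window)

    def sharpness_mismatch(left, right, disparity, threshold=0.15):
        """Flag pixels whose sharpness differs noticeably between views.

        `disparity` maps left-view pixels to the right view (hypothetical input,
        e.g. from a stereo matcher); positive values shift columns to the left."""
        sl = edge_sharpness(left)
        sr = edge_sharpness(right)
        h, w = left.shape
        cols = np.clip(np.arange(w)[None, :] - np.round(disparity).astype(int), 0, w - 1)
        sr_warped = sr[np.arange(h)[:, None], cols]        # right-view sharpness at matched pixels
        rel_diff = np.abs(sl - sr_warped) / (np.maximum(sl, sr_warped) + 1e-6)
        return rel_diff > threshold                         # candidate edge-sharpness-mismatch map

    left = np.random.rand(64, 64)
    right = np.roll(left, -2, axis=1)                       # toy rectified stereo pair
    print(sharpness_mismatch(left, right, np.full((64, 64), 2.0)).mean())
    ```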

  10. Maps and Mapmaking: The Young Oxford Companion.

    ERIC Educational Resources Information Center

    Stefoff, Rebecca

    This volume gathers all the information necessary to understand how maps depict the world. The book is a concise, but comprehensive guide to maps and mapmaking and includes photographs, drawings, maps, graphics, artifacts, and cartoons that make the articles accessible and entertaining. More than 350 entries, arranged alphabetically and thoroughly…

  11. Truck Traffic Iowa : 2010

    DOT National Transportation Integrated Search

    2011-01-01

    Truck volumes represented on this map are Annual Average Daily Traffic Volumes between major traffic generators: i.e., Highway Junctions and Cities. : Truck volumes include 6-Tire and 3 Axle single unit trucks, buses and all multiple unit trucks.

  12. Correlation of Tumor Immunohistochemistry with Dynamic Contrast-Enhanced and DSC-MRI Parameters in Patients with Gliomas.

    PubMed

    Nguyen, T B; Cron, G O; Bezzina, K; Perdrizet, K; Torres, C H; Chakraborty, S; Woulfe, J; Jansen, G H; Thornhill, R E; Zanette, B; Cameron, I G

    2016-12-01

    Tumor CBV is a prognostic and predictive marker for patients with gliomas. Tumor CBV can be measured noninvasively with different MR imaging techniques; however, it is not clear which of these techniques most closely reflects histologically-measured tumor CBV. Our aim was to investigate the correlations between dynamic contrast-enhanced and DSC-MR imaging parameters and immunohistochemistry in patients with gliomas. Forty-three patients with a new diagnosis of glioma underwent a preoperative MR imaging examination with dynamic contrast-enhanced and DSC sequences. Unnormalized and normalized cerebral blood volume was obtained from DSC MR imaging. Two sets of plasma volume and volume transfer constant maps were obtained from dynamic contrast-enhanced MR imaging. Plasma volume obtained from the phase-derived vascular input function and bookend T1 mapping (Vp_Φ) and volume transfer constant obtained from phase-derived vascular input function and bookend T1 mapping (Ktrans_Φ) were determined. Plasma volume obtained from magnitude-derived vascular input function (Vp_SI) and volume transfer constant obtained from magnitude-derived vascular input function (Ktrans_SI) were acquired, without T1 mapping. Using CD34 staining, we measured microvessel density and microvessel area within 3 representative areas of the resected tumor specimen. The Mann-Whitney U test was used to test for differences according to grade and degree of enhancement. The Spearman correlation was performed to determine the relationship between dynamic contrast-enhanced and DSC parameters and histopathologic measurements. Microvessel area, microvessel density, dynamic contrast-enhanced, and DSC-MR imaging parameters varied according to the grade and degree of enhancement (P < .05). A strong correlation was found between microvessel area and Vp_Φ and between microvessel area and unnormalized blood volume (r_s ≥ 0.61). A moderate correlation was found between microvessel area and normalized blood volume, microvessel area and Vp_SI, microvessel area and Ktrans_Φ, microvessel area and Ktrans_SI, microvessel density and Vp_Φ, microvessel density and unnormalized blood volume, and microvessel density and normalized blood volume (0.44 ≤ r_s ≤ 0.57). A weaker correlation was found between microvessel density and Ktrans_Φ and between microvessel density and Ktrans_SI (r_s ≤ 0.41). With dynamic contrast-enhanced MR imaging, use of a phase-derived vascular input function and bookend T1 mapping improves the correlation between immunohistochemistry and plasma volume, but not between immunohistochemistry and the volume transfer constant. With DSC-MR imaging, normalization of tumor CBV could decrease the correlation with microvessel area. © 2016 by American Journal of Neuroradiology.

  13. Magnetic map of the Irish Hills and surrounding areas, San Luis Obispo County, central California

    USGS Publications Warehouse

    Langenheim, V.E.; Watt, J.T.; Denton, K.M.

    2012-01-01

    A magnetic map of the Irish Hills and surrounding areas was created as part of a cooperative research and development agreement with the Pacific Gas and Electric Company and is intended to promote further understanding of the areal geology and structure by serving as a basis for geophysical interpretations and by supporting geological mapping, mineral and water resource investigations, and other topical studies. Local spatial variations in the Earth's magnetic field (evident as anomalies on magnetic maps) reflect the distribution of magnetic minerals, primarily magnetite, in the underlying rocks. In many cases the volume content of magnetic minerals can be related to rock type, and abrupt spatial changes in the amount of magnetic minerals can be related to either lithologic or structural boundaries. Magnetic susceptibility measurements from the area indicate that bodies of serpentinite and other mafic and ultramafic rocks tend to produce the most intense magnetic anomalies, but such generalizations must be applied with caution because some sedimentary units also can produce measurable magnetic anomalies. Remanent magnetization does not appear to be a significant source for magnetic anomalies because it is an order of magnitude less than the induced magnetization. The map is a mosaic of three separate surveys collected by (1) fixed-wing aircraft at a nominal height of 305 m, (2) by boat with the sensor at sea level, and (3) by helicopter. The helicopter survey was flown by New-Sense Geophysics in October 2009 along flight lines spaced 150-m apart and at a nominal terrain clearance of 50 to 100 m. Tie lines were flown 1,500-m apart. Data were adjusted for lag error and diurnal field variations. Further processing included microleveling using the tie lines and subtraction of the reference field defined by International Geomagnetic Reference Field (IGRF) 2005 extrapolated to August 1, 2008.

  14. Magnetic resonance studies of dissolving particulate solids.

    PubMed

    Johns, M L; Gladden, L F

    2003-01-01

    Magnetic resonance methods have been used to elucidate the internal pore structure of particulate solids, in particular detergent tablets. Such information is essential to a comprehensive understanding of the dissolution characteristics of these materials and how this property is related to processing conditions during tablet formation. In particular 3-D images of porosity are produced and 2-D self-diffusion maps are acquired as a function of observation time, which enables pore size to be quantified as a function of position via the extracted surface-to-volume ratio of the pore space. These properties are determined as a function of processing parameters, in particular the compression force used in tablet formation.
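
    One standard way to extract a surface-to-volume ratio from time-dependent diffusion measurements is the short-time (Mitra) expansion; the sketch below assumes hypothetical D(t) values and is offered as an illustration of that kind of analysis, not as the authors' exact processing:

    ```python
    import numpy as np

    # Hypothetical measurements: apparent diffusion coefficient D(t) at several
    # observation times t, plus the bulk diffusivity D0 of the free liquid.
    D0 = 2.3e-9                                    # m^2/s (roughly water near room temperature)
    t = np.array([5e-3, 10e-3, 20e-3, 40e-3])      # s
    D_t = np.array([2.08e-9, 2.00e-9, 1.88e-9, 1.74e-9])

    # Short-time (Mitra) expansion: D(t)/D0 ~ 1 - (4 / (9*sqrt(pi))) * (S/V) * sqrt(D0*t),
    # so a linear fit of D(t)/D0 against sqrt(D0*t) yields S/V from the slope.
    x = np.sqrt(D0 * t)
    slope, intercept = np.polyfit(x, D_t / D0, 1)
    S_over_V = -slope * 9.0 * np.sqrt(np.pi) / 4.0   # surface-to-volume ratio, 1/m

    print(f"S/V = {S_over_V:.3e} 1/m  ->  pore size ~ V/S = {1.0 / S_over_V * 1e6:.1f} um")
    ```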

  15. Archiving of interferometric radio and mm/submm data at the National Radio Astronomy Observatory

    NASA Astrophysics Data System (ADS)

    Lacy, Mark

    2018-06-01

    Modern radio interferometers such as ALMA and the VLA are capable of producing ~1 TB/day of data for processing into image products of comparable size. Besides the sheer volume of data, the products themselves can be complicated and are sometimes hard to map into standard astronomical archive metadata. We also face similar issues to those faced by archives at other wavelengths, namely the role of archives as the basis of reprocessing platforms and facilities, and the validation and ingestion of user-derived products. In this talk I shall discuss the plans of NRAO in these areas over the next decade.

  16. Metal Matrix Composite Material by Direct Metal Deposition

    NASA Astrophysics Data System (ADS)

    Novichenko, D.; Marants, A.; Thivillon, L.; Bertrand, P. H.; Smurov, I.

    Direct Metal Deposition (DMD) is a laser cladding process for producing a protective coating on the surface of a metallic part or for manufacturing parts layer by layer in a single-step process. The objective of this work is to demonstrate the possibility of creating carbide-reinforced metal matrix composite objects. Powders of steel 16NCD13 with different volume contents of titanium carbide are tested. On the basis of statistical analysis, a laser cladding processing map is constructed. Relationships between the titanium carbide content of the powder mixture and the resulting material microstructure are established. The mechanism of formation of the various precipitated titanium carbides is investigated.

  17. Lightning activity and severe storm structure

    NASA Technical Reports Server (NTRS)

    Taylor, W. L.; Brandes, E. A.; Rust, W. D.; Macgorman, D. R.

    1984-01-01

    Space-time mapping of VHF sources from four severe storms on June 19, 1980 reveals that lightning processes for cloud-to-ground (CG) and large intracloud (IC) flashes are confined to an altitude below about 10 km and closely associated with the central regions of high reflectivity. Another class of IC flashes produces a splattering of sources within the storms' main electrically active volumes and also within the large divergent wind canopy aloft. There is no apparent temporal association between the small high altitude IC flashes that occur almost continuously and the large IC and CG flashes that occur sporadically in the lower portions of storms.

  18. Mapping Prehistoric, Historic, and Channel Sediment Distribution, South Fork Noyo River: A Tool For Understanding Sources, Storage, and Transport

    Treesearch

    Rich D. Koehler; Keith I. Kelson; Graham Matthews; K.H. Kang; Andrew D. Barron

    2007-01-01

    The South Fork Noyo River (SFNR) watershed in coastal northern California contains large volumes of historic sediment that were delivered to channels in response to past logging operations. This sediment presently is stored beneath historic terraces and in present-day channels. We conducted geomorphic mapping on the SFNR valley floor to assess the volume and location...

  19. IMAGING AND MEASUREMENT OF THE PRERETINAL SPACE IN VITREOMACULAR ADHESION AND VITREOMACULAR TRACTION BY A NEW SPECTRAL DOMAIN OPTICAL COHERENCE TOMOGRAPHY ANALYSIS.

    PubMed

    Stopa, Marcin; Marciniak, Elżbieta; Rakowicz, Piotr; Stankiewicz, Agnieszka; Marciniak, Tomasz; Dąbrowski, Adam

    2017-10-01

    To evaluate a new method for volumetric imaging of the preretinal space (also known as the subhyaloid, subcortical, or retrocortical space) and investigate differences in preretinal space volume in vitreomacular adhesion (VMA) and vitreomacular traction (VMT). Nine patients with VMA and 13 with VMT were prospectively evaluated. Automatic inner limiting membrane line segmentation, which exploits a graph-search implementation, and posterior cortical vitreous line segmentation were performed on 141 horizontal spectral domain optical coherence tomography B-scans per patient. Vertical distances (depths) between the posterior cortical vitreous and inner limiting membrane lines were calculated for each optical coherence tomography B-scan acquired. The derived distances were merged and visualized as a color depth map that represented the preretinal space between the posterior surface of the hyaloid and the anterior surface of the retina. The Early Treatment Diabetic Retinopathy Study (ETDRS) macular map was overlaid onto the final virtual maps, and preretinal space volumes were calculated for each ETDRS map sector. Volumetric maps representing preretinal space volumes were created for each patient in the VMA and VMT groups. Preretinal space volumes were larger in all ETDRS macular regions in the VMT group compared with those in the VMA group. The differences reached statistical significance in all ETDRS sectors, except for the superior outer macula and temporal outer macula, where significance values were P = 0.05 and P = 0.08, respectively. Overall, the relative differences in preretinal space volumes between the VMT and VMA groups varied from 2.7 to 4.3 in inner regions and 1.8 to 2.9 in outer regions. Our study provides evidence of significant differences in preretinal space volume between eyes with VMA and those with VMT. This may be useful not only in the investigation of preretinal space properties in VMA and VMT, but also in other conditions, such as age-related macular degeneration, diabetic retinopathy, and central retinal vein occlusion.
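
    A minimal sketch of the depth-map and volume computation from the two segmented surfaces, with hypothetical scan geometry (ETDRS sector masking would be applied on top of the resulting depth map):

    ```python
    import numpy as np

    def preretinal_volume(ilm_depth_um, pcv_depth_um, pixel_area_mm2):
        """Preretinal-space depth map and volume from two segmented surfaces.

        ilm_depth_um, pcv_depth_um: axial positions (um) of the inner limiting
        membrane and posterior cortical vitreous on a B-scan grid (n_bscans x n_ascans)."""
        depth_um = np.clip(ilm_depth_um - pcv_depth_um, 0.0, None)   # vertical distance per A-scan
        volume_mm3 = (depth_um * 1e-3 * pixel_area_mm2).sum()        # depth (mm) x footprint area
        return depth_um, volume_mm3

    # Toy 141-B-scan cube over a 6 x 6 mm field; pixel_area_mm2 is one A-scan's en-face footprint.
    ilm = np.full((141, 512), 900.0) + np.random.rand(141, 512) * 10
    pcv = ilm - np.random.rand(141, 512) * 150
    depth_map, vol = preretinal_volume(ilm, pcv, pixel_area_mm2=(6.0 / 512) * (6.0 / 141))
    print(f"preretinal space volume: {vol:.2f} mm^3")
    ```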

  20. JPSS VIIRS level 3 uncollated sea surface temperature product at NOAA

    NASA Astrophysics Data System (ADS)

    Ignatov, Alexander; Gladkova, Irina; Ding, Yanni; Shahriar, Fazlul; Kihai, Yury; Zhou, Xinjia

    2017-07-01

    Following the launch of the Suomi National Polar-orbiting Partnership satellite in October 2011 with the Visible Infrared Imager Radiometer Suite (VIIRS) sensor onboard, National Oceanic and Atmospheric Administration (NOAA) started generating a global level 2 preprocessed (L2P) sea surface temperature (SST) product. The NOAA Advanced Clear-Sky Processor for Ocean (ACSPO) L2P data are organized into 144 10-min granules per day, with a total volume of ~27 GB. The L2P product has been successfully assimilated in several level 4 (L4) analyses. At the same time, some other users requested a gridded level 3 (L3) product with a reduced data volume. An L3U "uncollated" product (in which multiple passes over the same grid are independently saved) was produced by mapping the L2P product into equal 0.02° grids. Similar to the L2P, the L3U data are also reported in 10-min granules, but with a daily volume <1 GB. Currently, the NOAA VIIRS L3U SST product is operationally used or tested in several major international numerical weather prediction centers. The L3U shows comparable performance with L2P, suggesting that both products can be used interchangeably as input into L4 analyses. The original L2P pixel-level swath data continue to be produced and available to interested users from NOAA (NCEI) and JPL (Physical Oceanography) data archives.
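
    The following sketch illustrates, under stated assumptions, the kind of binning that turns L2P swath pixels into an L3U-style gridded field: each clear-sky pixel is assigned to a regular latitude-longitude cell and the cell mean is kept. The aggregation rule and the coarse 0.25° toy grid are illustrative; the operational product uses 0.02° cells and its own quality-based selection.

```python
import numpy as np

def grid_l2p_to_l3u(lat, lon, sst, cell_deg):
    """Average L2P swath SST pixels onto a regular cell_deg x cell_deg global grid.

    lat, lon, sst : 1-D arrays of clear-sky swath pixels (lon in [-180, 180)).
    Returns (grid, counts) with shape (180/cell_deg, 360/cell_deg).
    """
    n_lat = int(round(180.0 / cell_deg))
    n_lon = int(round(360.0 / cell_deg))
    i = np.clip(((lat + 90.0) / cell_deg).astype(int), 0, n_lat - 1)
    j = np.clip(((lon + 180.0) / cell_deg).astype(int), 0, n_lon - 1)
    flat = i * n_lon + j
    sums = np.bincount(flat, weights=sst, minlength=n_lat * n_lon)
    counts = np.bincount(flat, minlength=n_lat * n_lon)
    grid = np.where(counts > 0, sums / np.maximum(counts, 1), np.nan)
    return grid.reshape(n_lat, n_lon), counts.reshape(n_lat, n_lon)

# Toy granule on a coarse 0.25-degree grid (the operational product uses 0.02 degrees)
rng = np.random.default_rng(1)
lat = rng.uniform(-60, 60, 5000)
lon = rng.uniform(-180, 180, 5000)
sst = 290.0 + 10.0 * np.cos(np.radians(lat)) + rng.normal(0.0, 0.3, 5000)
l3u, n = grid_l2p_to_l3u(lat, lon, sst, cell_deg=0.25)
print("filled cells:", int((n > 0).sum()))
```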

  1. Modeling the Lyα Forest in Collisionless Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sorini, Daniele; Oñorbe, José; Lukić, Zarija

    2016-08-11

    Cosmological hydrodynamic simulations can accurately predict the properties of the intergalactic medium (IGM), but only under the condition of retaining the high spatial resolution necessary to resolve density fluctuations in the IGM. This resolution constraint prohibits simulating large volumes, such as those probed by BOSS and future surveys, like DESI and 4MOST. To overcome this limitation, we present in this paper "Iteratively Matched Statistics" (IMS), a novel method to accurately model the Lyα forest with collisionless N-body simulations, where the relevant density fluctuations are unresolved. We use a small-box, high-resolution hydrodynamic simulation to obtain the probability distribution function (PDF) and the power spectrum of the real-space Lyα forest flux. These two statistics are iteratively mapped onto a pseudo-flux field of an N-body simulation, which we construct from the matter density. We demonstrate that our method can reproduce the PDF, line of sight and 3D power spectra of the Lyα forest with good accuracy (7%, 4%, and 7% respectively). We quantify the performance of the commonly used Gaussian smoothing technique and show that it has significantly lower accuracy (20%–80%), especially for N-body simulations with achievable mean inter-particle separations in large-volume simulations. Finally, we show that IMS produces reasonable and smooth spectra, making it a powerful tool for modeling the IGM in large cosmological volumes and for producing realistic "mock" skies for Lyα forest surveys.
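
    As one illustrative ingredient of such a scheme, the sketch below shows a single PDF-matching (rank-order mapping) pass that forces a pseudo-flux field to follow a reference flux distribution; the alternating power-spectrum matching step of IMS is only indicated in a comment. The toy fields and seeds are assumptions, not the simulations used in the paper.

```python
import numpy as np

def match_pdf(pseudo_flux, reference_flux):
    """Rank-order map `pseudo_flux` so its one-point PDF matches `reference_flux`.

    This is only the PDF-matching half of an IMS-like iteration; the power-spectrum
    matching step (rescaling Fourier amplitudes) would alternate with it.
    """
    flat = pseudo_flux.ravel()
    order = np.argsort(flat)                         # ranks of the pseudo-flux values
    probs = (np.arange(flat.size) + 0.5) / flat.size
    target = np.quantile(reference_flux, probs)      # reference quantiles, lowest to highest
    matched = np.empty_like(target)
    matched[order] = target                          # assign reference quantiles by rank
    return matched.reshape(pseudo_flux.shape)

# Toy fields: a "high-resolution" reference flux in (0, 1) and a smoother pseudo-flux
rng = np.random.default_rng(2)
reference = np.exp(-np.exp(rng.normal(0.0, 1.0, 64 ** 3)))
pseudo = rng.normal(0.0, 1.0, (64, 64, 64))
flux = match_pdf(pseudo, reference)
print("matched flux range: %.3f to %.3f" % (flux.min(), flux.max()))
```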

  2. MODELING THE Ly α FOREST IN COLLISIONLESS SIMULATIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sorini, Daniele; Oñorbe, José; Hennawi, Joseph F.

    2016-08-20

    Cosmological hydrodynamic simulations can accurately predict the properties of the intergalactic medium (IGM), but only under the condition of retaining the high spatial resolution necessary to resolve density fluctuations in the IGM. This resolution constraint prohibits simulating large volumes, such as those probed by BOSS and future surveys, like DESI and 4MOST. To overcome this limitation, we present “Iteratively Matched Statistics” (IMS), a novel method to accurately model the Lyα forest with collisionless N-body simulations, where the relevant density fluctuations are unresolved. We use a small-box, high-resolution hydrodynamic simulation to obtain the probability distribution function (PDF) and the power spectrum of the real-space Lyα forest flux. These two statistics are iteratively mapped onto a pseudo-flux field of an N-body simulation, which we construct from the matter density. We demonstrate that our method can reproduce the PDF, line of sight and 3D power spectra of the Lyα forest with good accuracy (7%, 4%, and 7% respectively). We quantify the performance of the commonly used Gaussian smoothing technique and show that it has significantly lower accuracy (20%–80%), especially for N-body simulations with achievable mean inter-particle separations in large-volume simulations. In addition, we show that IMS produces reasonable and smooth spectra, making it a powerful tool for modeling the IGM in large cosmological volumes and for producing realistic “mock” skies for Lyα forest surveys.

  3. Estimating the eruptive volume of a large pyroclastic body: the Otowi Member of the Bandelier Tuff, Valles caldera, New Mexico

    NASA Astrophysics Data System (ADS)

    Cook, Geoffrey W.; Wolff, John A.; Self, Stephen

    2016-02-01

    The 1.60 Ma caldera-forming eruption of the Otowi Member of the Bandelier Tuff produced Plinian and coignimbrite fall deposits, outflow and intracaldera ignimbrite, all of it deposited on land. We present a detailed approach to estimating and reconstructing the original volume of the eroded, partly buried large ignimbrite and distal ash-fall deposits. Dense rock equivalent (DRE) volume estimates for the eruption are 89 +33/-10 km3 of outflow ignimbrite and 144 ± 72 km3 of intracaldera ignimbrite. Also, there was at least 65 km3 (DRE) of Plinian fall when extrapolated distally, and 107 +40/-12 km3 of coignimbrite ash was "lost" from the outflow sheet to form an unknown proportion of the distal ash fall. The minimum total volume is 216 km3 and the maximum is 550 km3; hence, the eruption overlaps the low end of the super-eruption spectrum (VEI ~8.0). Despite an abundance of geological data for the Otowi Member, the errors attached to these estimates do not allow us to constrain the proportions of intracaldera (IC), outflow (O), and distal ash (A) to better than a factor of three. We advocate caution in applying the IC/O/A = 1:1:1 relation of Mason et al. (2004) to scaling up mapped volumes of imperfectly preserved caldera-forming ignimbrites.

  4. Transmission imaging for integrated PET-MR systems.

    PubMed

    Bowen, Spencer L; Fuin, Niccolò; Levine, Michael A; Catana, Ciprian

    2016-08-07

    Attenuation correction for PET-MR systems continues to be a challenging problem, particularly for body regions outside the head. The simultaneous acquisition of transmission scan based μ-maps and MR images on integrated PET-MR systems may significantly increase the performance of and offer validation for new MR-based μ-map algorithms. For the Biograph mMR (Siemens Healthcare), however, use of conventional transmission schemes is not practical as the patient table and relatively small diameter scanner bore significantly restrict radioactive source motion and limit source placement. We propose a method for emission-free coincidence transmission imaging on the Biograph mMR. The intended application is not for routine subject imaging, but rather to improve and validate MR-based μ-map algorithms; particularly for patient implant and scanner hardware attenuation correction. In this study we optimized source geometry and assessed the method's performance with Monte Carlo simulations and phantom scans. We utilized a Bayesian reconstruction algorithm, which directly generates μ-map estimates from multiple bed positions, combined with a robust scatter correction method. For simulations with a pelvis phantom a single torus produced peak noise equivalent count rates (34.8 kcps) dramatically larger than a full axial length ring (11.32 kcps) and conventional rotating source configurations. Bias in reconstructed μ-maps for head and pelvis simulations was ⩽4% for soft tissue and ⩽11% for bone ROIs. An implementation of the single torus source was filled with (18)F-fluorodeoxyglucose and the proposed method quantified for several test cases alone or in comparison with CT-derived μ-maps. A volume average of 0.095 cm(-1) was recorded for an experimental uniform cylinder phantom scan, while a bias of <2% was measured for the cortical bone equivalent insert of the multi-compartment phantom. Single torus μ-maps of a hip implant phantom showed significantly fewer artifacts and improved dynamic range, and differed greatly for highly attenuating materials in the case of the patient table, compared to CT results. Use of a fixed torus geometry, in combination with translation of the patient table to perform complete tomographic sampling, generated highly quantitative measured μ-maps and is expected to produce images with significantly higher SNR than competing fixed geometries at matched total acquisition time.

  5. Transmission imaging for integrated PET-MR systems

    NASA Astrophysics Data System (ADS)

    Bowen, Spencer L.; Fuin, Niccolò; Levine, Michael A.; Catana, Ciprian

    2016-08-01

    Attenuation correction for PET-MR systems continues to be a challenging problem, particularly for body regions outside the head. The simultaneous acquisition of transmission scan based μ-maps and MR images on integrated PET-MR systems may significantly increase the performance of and offer validation for new MR-based μ-map algorithms. For the Biograph mMR (Siemens Healthcare), however, use of conventional transmission schemes is not practical as the patient table and relatively small diameter scanner bore significantly restrict radioactive source motion and limit source placement. We propose a method for emission-free coincidence transmission imaging on the Biograph mMR. The intended application is not for routine subject imaging, but rather to improve and validate MR-based μ-map algorithms; particularly for patient implant and scanner hardware attenuation correction. In this study we optimized source geometry and assessed the method’s performance with Monte Carlo simulations and phantom scans. We utilized a Bayesian reconstruction algorithm, which directly generates μ-map estimates from multiple bed positions, combined with a robust scatter correction method. For simulations with a pelvis phantom a single torus produced peak noise equivalent count rates (34.8 kcps) dramatically larger than a full axial length ring (11.32 kcps) and conventional rotating source configurations. Bias in reconstructed μ-maps for head and pelvis simulations was ⩽4% for soft tissue and ⩽11% for bone ROIs. An implementation of the single torus source was filled with 18F-fluorodeoxyglucose and the proposed method quantified for several test cases alone or in comparison with CT-derived μ-maps. A volume average of 0.095 cm-1 was recorded for an experimental uniform cylinder phantom scan, while a bias of <2% was measured for the cortical bone equivalent insert of the multi-compartment phantom. Single torus μ-maps of a hip implant phantom showed significantly fewer artifacts and improved dynamic range, and differed greatly for highly attenuating materials in the case of the patient table, compared to CT results. Use of a fixed torus geometry, in combination with translation of the patient table to perform complete tomographic sampling, generated highly quantitative measured μ-maps and is expected to produce images with significantly higher SNR than competing fixed geometries at matched total acquisition time.

  6. Alaska Interim Land Cover Mapping Program; final report

    USGS Publications Warehouse

    Fitzpatrick-Lins, Katherine; Doughty, E.F.; Shasby, Mark; Benjamin, Susan

    1989-01-01

    In 1985, the U.S. Geological Survey initiated a research project to develop an interim land cover data base for Alaska as an alternative to the nationwide Land Use and Land Cover Mapping Program. The Alaska Interim Land Cover Mapping Program was subsequently created to develop methods for producing a series of land cover maps that utilized the existing Landsat digital land cover classifications produced by and for the major land management agencies for mapping the vegetation of Alaska. The program was successful in producing digital land cover classifications and statistical summaries using a common statewide classification and in reformatting these data to produce 1:250,000-scale quadrangle-based maps directly from the Scitex laser plotter. A Federal and State agency review of these products found considerable user support for the maps. Presently the Geological Survey is committed to digital processing of six to eight quadrangles each year.

  7. Discovering a "True" Map of the World--Learning Activities.

    ERIC Educational Resources Information Center

    Hantula, James

    "True" maps of the world, as seen from the perspective of the time in which they were produced, remain an ethnocentric visual language in modern times. Students can gain insight into such "true" maps by studying maps produced in the great traditions of the West and East. Teachers can determine a map's appropriateness by identifying its title,…

  8. Bound-preserving Legendre-WENO finite volume schemes using nonlinear mapping

    NASA Astrophysics Data System (ADS)

    Smith, Timothy; Pantano, Carlos

    2017-11-01

    We present a new method to enforce field bounds in high-order Legendre-WENO finite volume schemes. The strategy consists of reconstructing each field through an intermediate mapping, which by design satisfies realizability constraints. Determination of the coefficients of the polynomial reconstruction involves nonlinear equations that are solved using Newton's method. The selection between the original or mapped reconstruction is implemented dynamically to minimize computational cost. The method has also been generalized to fields that exhibit interdependencies, requiring multi-dimensional mappings. Further, the method does not depend on the existence of a numerical flux function. We will discuss details of the proposed scheme and show results for systems in conservation and non-conservation form. This work was funded by the NSF under Grant DMS 1318161.
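
    A minimal one-dimensional, single-cell sketch of the general idea follows: the reconstruction is expressed through a bounded intermediate mapping (here a logistic function onto (0, 1)), and Newton's method recovers the free coefficient so that the mapped reconstruction reproduces the prescribed cell average. The particular mapping, quadrature, and single-coefficient setup are illustrative assumptions, not the authors' scheme.

```python
import numpy as np

def logistic(z):
    """Intermediate mapping onto the admissible interval (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def solve_offset(cell_avg, slope, n_newton=25):
    """Newton's method for the offset a such that the cell average of
    logistic(a + slope * x) over x in [-1/2, 1/2] equals `cell_avg`."""
    nodes, weights = np.polynomial.legendre.leggauss(5)
    nodes, weights = 0.5 * nodes, 0.5 * weights      # quadrature on [-1/2, 1/2]
    a = np.log(cell_avg / (1.0 - cell_avg))          # initial guess: ignore the slope
    for _ in range(n_newton):
        u = logistic(a + slope * nodes)
        residual = np.sum(weights * u) - cell_avg    # quadrature of the mapped reconstruction
        jacobian = np.sum(weights * u * (1.0 - u))   # derivative of that quadrature w.r.t. a
        a -= residual / jacobian
        if abs(residual) < 1e-13:
            break
    return a

# A cell average close to the lower bound with a steep reconstructed slope:
a = solve_offset(cell_avg=0.02, slope=6.0)
x = np.linspace(-0.5, 0.5, 7)
print("mapped reconstruction stays inside (0, 1):", np.round(logistic(a + 6.0 * x), 4))
```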

  9. Water volume and sediment accumulation in Lake Linganore, Frederick County, Maryland, 2009

    USGS Publications Warehouse

    Sekellick, Andrew J.; Banks, S.L.

    2010-01-01

    To assist in understanding sediment and phosphorus loadings and the management of water resources, a bathymetric survey was conducted at Lake Linganore in Frederick County, Maryland in June 2009 by the U.S. Geological Survey, in cooperation with the City of Frederick and Frederick County, Maryland. Position data and water-depth data were collected using a survey grade echo sounder and a differentially corrected global positioning system. Data were compiled and edited using geographic information system software. A three-dimensional triangulated irregular network model of the lake bottom was created to calculate the volume of stored water in the reservoir. Large-scale topographic maps of the valley prior to inundation in 1972 were provided by the City of Frederick and digitized. The two surfaces were compared and a sediment volume was calculated. Cartographic representations of both water depth and sediment accumulation were produced along with an area/capacity table. An accuracy assessment was completed on the resulting bathymetric model. Vertical accuracy at the 95-percent confidence level for the collected data, the bathymetric surface model, and the bathymetric contour map was calculated to be 0.95 feet, 1.53 feet, and 3.63 feet, respectively. The water storage volume of Lake Linganore was calculated to be 1,860 acre-feet at full pool elevation. Water volume in the reservoir has decreased by 350 acre-feet (about 16 percent) in the 37 years since the dam was constructed. The total calculated volume of sediment deposited in the lake since 1972 is 313 acre-feet. This represents an average rate of sediment accumulation of 8.5 acre-feet per year since Linganore Creek was impounded. A sectional analysis of sediment distribution indicates that the most upstream third of Lake Linganore contains the largest volume of sediment whereas the section closest to the dam contains the largest amount of water. In comparison to other Maryland Piedmont reservoirs, Lake Linganore was found to have one of the lowest sedimentation rates at 0.26 cubic yards per year per acre of drainage area. Sedimentation rates in other comparable Maryland reservoirs were Prettyboy Reservoir (filling at a rate of 2.26 cubic yards per year per acre), Loch Raven Reservoir (filling at a rate of 0.88 cubic yards per year per acre) and Piney Run Reservoir (filling at a negligible rate).

  10. Spatial and Temporal Analysis of Eruption Locations, Compositions, and Styles in Northern Harrat Rahat, Kingdom of Saudi Arabia

    NASA Astrophysics Data System (ADS)

    Dietterich, H. R.; Stelten, M. E.; Downs, D. T.; Champion, D. E.

    2017-12-01

    Harrat Rahat is a predominantly mafic, 20,000 km2 volcanic field in western Saudi Arabia with an elongate volcanic axis extending 310 km north-south. Prior mapping suggests that the youngest eruptions were concentrated in northernmost Harrat Rahat, where our new geologic mapping and geochronology reveal >300 eruptive vents with ages ranging from 1.2 Ma to a historic eruption in 1256 CE. Eruption compositions and styles vary spatially and temporally within the volcanic field, where extensive alkali basaltic lavas dominate, but more evolved compositions erupted episodically as clusters of trachytic domes and small-volume pyroclastic flows. Analysis of vent locations, compositions, and eruption styles shows the evolution of the volcanic field and allows assessment of the spatio-temporal probabilities of vent opening and eruption styles. We link individual vents and fissures to eruptions and their deposits using field relations, petrography, geochemistry, paleomagnetism, and 40Ar/39Ar and 36Cl geochronology. Eruption volumes and deposit extents are derived from geologic mapping and topographic analysis. Spatial density analysis with kernel density estimation captures vent densities of up to 0.2 %/km2 along the north-south running volcanic axis, decaying quickly away to the east but reaching a second, lower high along a secondary axis to the west. Temporal trends show slight younging of mafic eruption ages to the north in the past 300 ka, as well as clustered eruptions of trachytes over the past 150 ka. Vent locations, timing, and composition are integrated through spatial probability weighted by eruption age for each compositional range to produce spatio-temporal models of vent opening probability. These show that the next mafic eruption is most probable within the north end of the main (eastern) volcanic axis, whereas more evolved compositions are most likely to erupt within the trachytic centers further to the south. These vent opening probabilities, combined with corresponding eruption properties, can be used as the basis for lava flow and tephra fall hazard maps.
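
    The sketch below shows, under stated assumptions, how a kernel density estimate of vent locations can be expressed in the same units quoted above (percent of vents per square kilometre); the coordinates, bandwidth, and toy vent clusters are hypothetical.

```python
import numpy as np
from scipy.stats import gaussian_kde

def vent_density_percent_per_km2(easting_km, northing_km, grid_x_km, grid_y_km):
    """Kernel density of vent locations on a grid, in percent of all vents per km^2."""
    kde = gaussian_kde(np.vstack([easting_km, northing_km]))
    gx, gy = np.meshgrid(grid_x_km, grid_y_km)
    density = kde(np.vstack([gx.ravel(), gy.ravel()])).reshape(gx.shape)  # fraction per km^2
    return 100.0 * density

# Toy vent field: a north-south elongated main cluster plus a weaker western cluster
rng = np.random.default_rng(3)
east = np.concatenate([rng.normal(0.0, 3.0, 250), rng.normal(-12.0, 2.0, 50)])
north = np.concatenate([rng.normal(0.0, 20.0, 250), rng.normal(5.0, 10.0, 50)])
density = vent_density_percent_per_km2(
    east, north, np.linspace(-25.0, 15.0, 81), np.linspace(-60.0, 60.0, 121))
print("peak vent density: %.2f %%/km^2" % density.max())
```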

  11. Development of AHPDST Vulnerability Indexing Model for Groundwater Vulnerability Assessment Using Hydrogeophysical Derived Parameters and GIS Application

    NASA Astrophysics Data System (ADS)

    Mogaji, K. A.

    2017-04-01

    Producing a bias-free vulnerability assessment map model is much needed for planning a scheme of groundwater quality protection. This study developed a GIS-based AHPDST vulnerability index model for producing a groundwater vulnerability model map in the hard-rock terrain of Nigeria by exploiting the potentials of the analytic hierarchy process (AHP) and Dempster-Shafer theory (DST) data mining models. The acquired borehole and geophysical data in the study area were processed to derive five groundwater vulnerability conditioning factors (GVCFs), namely recharge rate, aquifer transmissivity, hydraulic conductivity, transverse resistance and longitudinal conductance. The produced GVCFs' thematic maps were multi-criterially analyzed by employing the mechanisms of the AHP and DST models to determine the normalized weight (W) parameter for the GVCFs and the mass function factors (MFFs) parameter for the GVCFs' thematic maps' class boundaries, respectively. Based on the application of the weighted linear average technique, the determined W and MFFs parameters were synthesized to develop a groundwater vulnerability potential index (GVPI)-based AHPDST model algorithm. The developed model was applied to establish four GVPI mass/belief function indices. The estimates based on the applied GVPI belief function indices were processed in a GIS environment to create prospective groundwater vulnerability potential index maps. The most representative of the resulting vulnerability maps (the GVPI-Bel map) was considered for producing the groundwater vulnerability potential zones (GVPZ) map for the area. The produced GVPZ map established 48 and 52% of the areal extent to be covered by the low/moderate and high vulnerability zones, respectively. The success and prediction rates of the produced GVPZ map were determined using the relative operating characteristics technique to give 82.3 and 77.7%, respectively. The analyzed results reveal that the developed GVPI-based AHPDST model algorithm is capable of producing an efficient groundwater vulnerability potential zone prediction map and of characterizing the uncertainty of the predicted zones via the DST mechanism in the area. The produced GVPZ map in this study can be used by decision makers to formulate appropriate groundwater management strategies, and the approach may be readily adopted in other hard-rock regions of the world, especially in economically poor nations.
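
    As a schematic of the weighted-linear-average step described above, the sketch below combines rated thematic layers with factor weights into a single index raster. The factor names, class ratings, and weights are hypothetical placeholders, not the AHP weights or DST mass functions of the study.

```python
import numpy as np

def vulnerability_index(class_rasters, class_ratings, weights):
    """Weighted linear average of rated thematic layers.

    class_rasters : dict of factor name -> integer raster of class codes
    class_ratings : dict of factor name -> {class code: rating in [0, 1]}
    weights       : dict of factor name -> weight (should sum to 1)
    """
    shape = next(iter(class_rasters.values())).shape
    index = np.zeros(shape)
    for name, raster in class_rasters.items():
        rated = np.vectorize(class_ratings[name].get)(raster)  # class codes -> ratings
        index += weights[name] * rated
    return index

# Hypothetical 3 x 3 rasters for two of the five conditioning factors
recharge = np.array([[1, 1, 2], [2, 3, 3], [3, 3, 2]])
transmissivity = np.array([[2, 2, 2], [1, 1, 3], [3, 2, 1]])
gvpi = vulnerability_index(
    {"recharge": recharge, "transmissivity": transmissivity},
    {"recharge": {1: 0.2, 2: 0.5, 3: 0.9}, "transmissivity": {1: 0.3, 2: 0.6, 3: 0.8}},
    {"recharge": 0.6, "transmissivity": 0.4},
)
print(np.round(gvpi, 2))
```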

  12. Mapping debris-flow hazard in Honolulu using a DEM

    USGS Publications Warehouse

    Ellen, Stephen D.; Mark, Robert K.; ,

    1993-01-01

    A method for mapping hazard posed by debris flows has been developed and applied to an area near Honolulu, Hawaii. The method uses studies of past debris flows to characterize sites of initiation, volume at initiation, and volume-change behavior during flow. Digital simulations of debris flows based on these characteristics are then routed through a digital elevation model (DEM) to estimate degree of hazard over the area.

  13. Data Base Design with GIS in Ecosystem Based Multiple Use Forest Management in Artvin, Turkey: A Case Study in Balcı Forest Management Planning Unit.

    PubMed

    Yolasığmaz, Hacı Ahmet; Keleş, Sedat

    2009-01-01

    In Turkey, a planning philosophy focused on timber production has given way to Multiple Use Management (MUM). Because the whole forestry infrastructure, with the inventory system leading the way, depends on timber production, some bottlenecks are expected during the transition period. Database design, probably the most important stage in the transition to MUM, together with the digital base maps that underpin this infrastructure, is the main subject of this article. First, Turkey's past forest management philosophy is briefly reviewed, and Ecosystem Based Multiple Use Forest Management (EBMUFM) approaches are introduced. The second stage of the EBMUFM process, database design, is described by examining the classical planning infrastructure, and the coverages to be produced and consumed are suggested in the form of lists. At the application stage, two geographical databases were established with GIS for the Balcı Planning Unit for the years 1984 and 2006, and the related base maps were then produced. The 20-year change in the planning unit is presented comparatively with regard to stand parameters such as tree species, age class, development stage, canopy closure, mixture, volume and increment.

  14. High-resolution AUV mapping of the 2015 flows at Axial Seamount, Juan de Fuca Ridge

    NASA Astrophysics Data System (ADS)

    Paduan, J. B.; Chadwick, W. W., Jr.; Clague, D. A.; Le Saout, M.; Caress, D. W.; Thomas, H. J.; Yoerger, D.

    2016-12-01

    Lava flows erupted in April 2015 at Axial Seamount were mapped at 1-m resolution with the AUV Sentry in August 2015 and the MBARI Mapping AUVs in July 2016 and observed and sampled with ROVs on those same expeditions. Thirty percent of terrain covered by new flows had been mapped by the MBARI AUVs prior to the eruption. Differencing of before and after maps (using ship-collected bathymetry where the AUV had not mapped before) allows calculation of extents and volumes of flows and shows new fissures. The maps reveal unexpected fissure patterns and shifts in the style of flow emplacement through a single eruption. There were 11 separate flows totaling 1.48 × 10⁸ m³ of lava erupted from numerous en echelon fissures over 19 km on the NE caldera floor, on the NE flank, and down the N rift zone. Flows in and around the caldera have maximum thicknesses of 5-19 m. Most erupted as sheet flows and spread along intricate channels that terminated in thin margins. Some utilized pre-existing fissures. Some flows erupted from short fissures, while at least two longer new fissures produced little or no lava. A flow on the upper N rift has a spectacular lava channel flanked by narrow lava pillars supporting a thin roof left after the flow drained. A shatter ring still emanating warm fluid is visible in the map as a 15-m wide low cone. Hundreds of exploded pillows were observed but are not discernable in the bathymetry. The northern-most three flows deep on the N rift are similar in area to the others but comprise the bulk of the eruption volume. Differencing of ship-based bathymetry shows only these flows. Near the eruptive fissures they are sheet flows, but as they flowed downslope they built complexes of coalesced pillow mounds up to 67-128 m thick. Changes in flow morphology occurred through the course of the eruption. Large pillow mounds had molten cores that deformed as the eruption progressed. One flow began as a thin, effusive sheet flow but as the eruption rate decreased, a pillow mound built over the fissure. As the eruption waned on the caldera floor, near the fissure a small inflated margin developed on top of channels from an earlier phase of the flow. Several landslides occurred at the caldera wall. One is near where a 2015 fissure on the caldera floor cut through the caldera-bounding fault into the flank of the volcano.
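
    A minimal sketch of the map-differencing step follows: co-registered before and after grids are subtracted, small differences are treated as survey noise, and the remaining positive change is integrated into a flow volume. The 1 m cell size and 0.5 m noise floor are assumptions for the example, not the values used in the study.

```python
import numpy as np

def flow_thickness_and_volume(pre_dem, post_dem, cell_size_m=1.0, noise_floor_m=0.5):
    """Difference co-registered before/after grids and integrate the positive change.

    Cells with change below `noise_floor_m` are treated as unchanged, a crude way
    to suppress survey noise between the two surveys.
    """
    diff = post_dem - pre_dem
    thickness = np.where(diff > noise_floor_m, diff, 0.0)   # keep new lava only
    volume_m3 = float(np.nansum(thickness)) * cell_size_m ** 2
    return thickness, volume_m3

# Toy example: a lava lobe up to 19 m thick on a gently sloping seafloor
y, x = np.mgrid[0:400, 0:400]
pre = -1500.0 - 0.02 * x
lobe = 19.0 * np.exp(-(((x - 200) / 60.0) ** 2 + ((y - 200) / 90.0) ** 2))
post = pre + np.where(lobe > 1.0, lobe, 0.0)
thickness, volume = flow_thickness_and_volume(pre, post)
print("max thickness %.1f m, flow volume %.2e m^3" % (thickness.max(), volume))
```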

  15. Coal seam gas water: potential hazards and exposure pathways in Queensland.

    PubMed

    Navi, Maryam; Skelly, Chris; Taulis, Mauricio; Nasiri, Shahram

    2015-01-01

    The extraction of coal seam gas (CSG) produces large volumes of potentially contaminated water. It has raised concerns about the environmental health impacts of the co-produced CSG water. In this paper, we review CSG water contaminants and their potential health effects in the context of exposure pathways in Queensland's CSG basins. The hazardous substances associated with CSG water in Queensland include fluoride, boron, lead and benzene. The exposure pathways for CSG water are (1) water used for municipal purposes; (2) recreational water activities in rivers; (3) occupational exposures; (4) water extracted from contaminated aquifers; and (5) indirect exposure through the food chain. We recommend mapping of exposure pathways into communities in CSG regions to determine the potentially exposed populations in Queensland. Future efforts to monitor chemicals of concern and consolidate them into a central database will build the necessary capability to undertake a much needed environmental health impact assessment.

  16. Voluminous eruption from a zoned magma body after an increase in supply rate at Axial Seamount

    NASA Astrophysics Data System (ADS)

    Chadwick, W. W.; Paduan, J. B.; Clague, D. A.; Dreyer, B. M.; Merle, S. G.; Bobbitt, A. M.; Caress, D. W.; Philip, B. T.; Kelley, D. S.; Nooner, S. L.

    2016-12-01

    Axial Seamount is the best monitored submarine volcano in the world, providing an exceptional window into the dynamic interactions between magma storage, transport, and eruption processes in a mid-ocean ridge setting. An eruption in April 2015 produced the largest volume of erupted lava since monitoring and mapping began in the mid-1980s after the shortest repose time, due to a recent increase in magma supply. The higher rate of magma replenishment since 2011 resulted in the eruption of the most mafic lava in the last 500-600 years. Eruptive fissures at the volcano summit produced pyroclastic ash that was deposited over an area of at least 8 km2. A systematic spatial distribution of compositions is consistent with a single dike tapping different parts of a thermally and chemically zoned magma reservoir that can be directly related to previous multichannel seismic-imaging results.

  17. Basal Topography of the South Polar Layered Deposits

    NASA Astrophysics Data System (ADS)

    Davies, C. W.; Murray, B. C.; Byrne, S.

    2004-12-01

    The ice inferred to comprise the south polar layered deposits (SPLD) represents a significant fraction of the total water reservoir of the planet. The basal topography of these deposits is currently unconstrained but may be expected to contain considerable relief based on the heavily cratered nature of the surrounding terrain. In this work we report on our efforts to characterize the overall nature of this basal topography and in so doing better constrain the volume of this important volatile reservoir. Our approach has been to measure elevations at the periphery of the SPLD (defined by [1]) and use various interpolation techniques to estimate the basal topography. We used 1300 control points from the edges and areas surrounding the SPLD and included extensive control points from within the Chasmae and other features to fit a surface beneath the SPLD. No assumptions were made about any lithospheric flexure, nor did the results suggest that possibility. We first tested a variety of surface interpolation routines on a comparable area of cratered terrain immediately adjacent to the SPLD, using the same spatial distribution of 1300 height control points as we used for the SPLD itself, and found that the topography was broadly reproducible (ignoring craters) to within a few hundred meters. The SPLD basal topography we derive can be subtracted from the current spatial topography to produce isopach maps of the layered deposits. All interpolation methods we tested (within ArcMap 8.3) indicate a lower total SPLD volume than that previously published [Smith et al., 2001]. Our best estimate for the SPLD volume is ~1 million km3, with a formal error in volume of ~5%, corresponding to an average thickness of ~950 meters. In comparison, [2] estimated this volume to be ~1.2-1.7 million km3. The Prometheus impact basin is present as a rimmed depression, consistent with the inference by [3]. More unexpected is the presence of a broad ridge underlying nearly the entire eastern half of the SPLD, which makes those deposits relatively thin. Our isopach maps show the northwestern portion of the Ultimi lobe to be an isolated thick region, in agreement with [1]. [1] Kolb, E. J., and K. L. Tanaka (2001), Icarus, 154, 22-39. [2] Smith, D. E., et al. (2001), J. Geophys. Res., 106(E10), 23,689-23,722. [3] Byrne, S., and A. B. Ivanov (2004), J. Geophys. Res., In press.

  18. The Application of MRI for Depiction of Subtle Blood Brain Barrier Disruption in Stroke

    PubMed Central

    Israeli, David; Tanne, David; Daniels, Dianne; Last, David; Shneor, Ran; Guez, David; Landau, Efrat; Roth, Yiftach; Ocherashvilli, Aharon; Bakon, Mati; Hoffman, Chen; Weinberg, Amit; Volk, Talila; Mardor, Yael

    2011-01-01

    The development of imaging methodologies for detecting blood-brain-barrier (BBB) disruption may help predict stroke patient's propensity to develop hemorrhagic complications following reperfusion. We have developed a delayed contrast extravasation MRI-based methodology enabling real-time depiction of subtle BBB abnormalities in humans with high sensitivity to BBB disruption and high spatial resolution. The increased sensitivity to subtle BBB disruption is obtained by acquiring T1-weighted MRI at relatively long delays (~15 minutes) after contrast injection and subtracting from them images acquired immediately after contrast administration. In addition, the relatively long delays allow for acquisition of high resolution images resulting in high resolution BBB disruption maps. The sensitivity is further increased by image preprocessing with corrections for intensity variations and with whole body (rigid+elastic) registration. Since only two separate time points are required, the time between the two acquisitions can be used for acquiring routine clinical data, keeping the total imaging time to a minimum. A proof of concept study was performed in 34 patients with ischemic stroke and 2 patients with brain metastases undergoing high resolution T1-weighted MRI acquired at 3 time points after contrast injection. The MR images were pre-processed and subtracted to produce BBB disruption maps. BBB maps of patients with brain metastases and ischemic stroke presented different patterns of BBB opening. The significant advantage of the long extravasation time was demonstrated by a dynamic-contrast-enhancement study performed continuously for 18 min. The high sensitivity of our methodology enabled depiction of clear BBB disruption in 27% of the stroke patients who did not have abnormalities on conventional contrast-enhanced MRI. In 36% of the patients, who had abnormalities detectable by conventional MRI, the BBB disruption volumes were significantly larger in the maps than in conventional MRI. These results demonstrate the advantages of delayed contrast extravasation in increasing the sensitivity to subtle BBB disruption in ischemic stroke patients. The calculated disruption maps provide clear depiction of significant volumes of BBB disruption unattainable by conventional contrast-enhanced MRI. PMID:21209786

  19. The application of MRI for depiction of subtle blood brain barrier disruption in stroke.

    PubMed

    Israeli, David; Tanne, David; Daniels, Dianne; Last, David; Shneor, Ran; Guez, David; Landau, Efrat; Roth, Yiftach; Ocherashvilli, Aharon; Bakon, Mati; Hoffman, Chen; Weinberg, Amit; Volk, Talila; Mardor, Yael

    2010-12-26

    The development of imaging methodologies for detecting blood-brain-barrier (BBB) disruption may help predict stroke patient's propensity to develop hemorrhagic complications following reperfusion. We have developed a delayed contrast extravasation MRI-based methodology enabling real-time depiction of subtle BBB abnormalities in humans with high sensitivity to BBB disruption and high spatial resolution. The increased sensitivity to subtle BBB disruption is obtained by acquiring T1-weighted MRI at relatively long delays (~15 minutes) after contrast injection and subtracting from them images acquired immediately after contrast administration. In addition, the relatively long delays allow for acquisition of high resolution images resulting in high resolution BBB disruption maps. The sensitivity is further increased by image preprocessing with corrections for intensity variations and with whole body (rigid+elastic) registration. Since only two separate time points are required, the time between the two acquisitions can be used for acquiring routine clinical data, keeping the total imaging time to a minimum. A proof of concept study was performed in 34 patients with ischemic stroke and 2 patients with brain metastases undergoing high resolution T1-weighted MRI acquired at 3 time points after contrast injection. The MR images were pre-processed and subtracted to produce BBB disruption maps. BBB maps of patients with brain metastases and ischemic stroke presented different patterns of BBB opening. The significant advantage of the long extravasation time was demonstrated by a dynamic-contrast-enhancement study performed continuously for 18 min. The high sensitivity of our methodology enabled depiction of clear BBB disruption in 27% of the stroke patients who did not have abnormalities on conventional contrast-enhanced MRI. In 36% of the patients, who had abnormalities detectable by conventional MRI, the BBB disruption volumes were significantly larger in the maps than in conventional MRI. These results demonstrate the advantages of delayed contrast extravasation in increasing the sensitivity to subtle BBB disruption in ischemic stroke patients. The calculated disruption maps provide clear depiction of significant volumes of BBB disruption unattainable by conventional contrast-enhanced MRI.

  20. MapEdit: solution to continuous raster map creation

    NASA Astrophysics Data System (ADS)

    Rančić, Dejan; Djordjevi-Kajan, Slobodanka

    2003-03-01

    The paper describes MapEdit, MS Windows™ software for georeferencing and rectification of scanned paper maps. The software produces continuous raster maps that can be used as backgrounds in geographical information systems. The process of continuous raster map creation using MapEdit's "mosaicking" function is also described, as well as the georeferencing and rectification algorithms used in MapEdit. Our approach to georeferencing and rectification, which uses four control points and two linear transformations for each scanned map part together with nearest-neighbor resampling, represents a low-cost, high-speed solution that produces continuous raster maps of satisfactory quality for many purposes (±1 pixel). A quality assessment of several continuous raster maps at different scales created with our software and methodology has been undertaken, and the results are presented in the paper. For quality control of the produced raster maps we referred to three widely adopted standards: the US Standard for Digital Cartographic Data, the National Standard for Spatial Data Accuracy, and the US National Map Accuracy Standard. The results obtained during the quality assessment show that our maps meet all three standards.
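
    The sketch below illustrates the general approach under stated assumptions: a linear (affine) transform is fitted to four control points by least squares, and the scanned image is resampled onto a regular map grid with nearest-neighbour lookup. MapEdit's specific two-transformation formulation per map part is not reproduced; the control points and grid parameters are hypothetical.

```python
import numpy as np

def fit_affine(src_xy, dst_xy):
    """Least-squares linear (affine) transform from scan pixel coords to map coords.

    src_xy, dst_xy : (n, 2) arrays of control points, n >= 3 (four in this example)."""
    a = np.hstack([src_xy, np.ones((len(src_xy), 1))])
    coeffs, *_ = np.linalg.lstsq(a, dst_xy, rcond=None)     # shape (3, 2)
    return coeffs

def resample_nearest(image, coeffs, out_shape, out_origin, out_pixel):
    """Nearest-neighbour resampling of the scanned image onto a regular map grid."""
    rows, cols = np.mgrid[0:out_shape[0], 0:out_shape[1]]
    map_x = out_origin[0] + cols * out_pixel
    map_y = out_origin[1] - rows * out_pixel                # map y decreases down the grid
    m = np.column_stack([coeffs, [0.0, 0.0, 1.0]])          # 3x3 form, row-vector convention
    src = np.column_stack([map_x.ravel(), map_y.ravel(),
                           np.ones(map_x.size)]) @ np.linalg.inv(m)
    src_col = np.rint(src[:, 0]).astype(int)
    src_row = np.rint(src[:, 1]).astype(int)
    out = np.zeros(out_shape, dtype=image.dtype)
    ok = ((src_row >= 0) & (src_row < image.shape[0]) &
          (src_col >= 0) & (src_col < image.shape[1]))
    out.reshape(-1)[ok] = image[src_row[ok], src_col[ok]]
    return out

# Four hypothetical control points: scan pixel (col, row) -> map coordinate (X, Y) in metres
src = np.array([[10, 10], [990, 12], [988, 1490], [12, 1492]], dtype=float)
dst = np.array([[500000, 4200000], [502000, 4200000],
                [502000, 4197000], [500000, 4197000]], dtype=float)
coeffs = fit_affine(src, dst)
scan = np.random.default_rng(4).integers(0, 255, size=(1500, 1000), dtype=np.uint8)
tile = resample_nearest(scan, coeffs, out_shape=(600, 400),
                        out_origin=(500000.0, 4200000.0), out_pixel=5.0)
print("resampled tile:", tile.shape)
```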

  1. Cost, accuracy, and consistency comparisons of land use maps made from high-altitutde aircraft photography and ERTS imagery

    USGS Publications Warehouse

    Fitzpatrick, Katherine A.

    1975-01-01

    Accuracy analyses for the land use maps of the Central Atlantic Regional Ecological Test Site were performed for a 1-percent sample of the area. Researchers compared Level II land use maps produced at three scales, 1:24,000, 1:100,000, and 1:250,000 from high-altitude photography, with each other and with point data obtained in the field. They employed the same procedures to determine the accuracy of the Level I land use maps produced at 1:250,000 from high-altitude photography and color composite ERTS imagery. The accuracy of the Level II maps was 84.9 percent at 1:24,000, 77.4 percent at 1:100,000, and 73.0 percent at 1:250,000. The accuracy of the Level I 1:250,000 maps produced from high-altitude aircraft photography was 76.5 percent and for those produced from ERTS imagery was 69.5 percent. The cost of Level II land use mapping at 1:24,000 was found to be high ($11.93 per km2). The cost of mapping at 1:100,000 ($1.75) was about 2 times as expensive as mapping at 1:250,000 ($.88), and the accuracy increased by only 4.4 percent. Level I land use maps, when mapped from high-altitude photography, were about 4 times as expensive as the maps produced from ERTS imagery, although the accuracy is 7.0 percent greater. The Level I land use category that is least accurately mapped from ERTS imagery is urban and built-up land in the non-urban areas; in the urbanized areas, built-up land is more reliably mapped.

  2. 3-D ultrafast Doppler imaging applied to the noninvasive mapping of blood vessels in vivo.

    PubMed

    Provost, Jean; Papadacci, Clement; Demene, Charlie; Gennisson, Jean-Luc; Tanter, Mickael; Pernot, Mathieu

    2015-08-01

    Ultrafast Doppler imaging was introduced as a technique to quantify blood flow in an entire 2-D field of view, expanding the field of application of ultrasound imaging to the highly sensitive anatomical and functional mapping of blood vessels. We have recently developed 3-D ultrafast ultrasound imaging, a technique that can produce thousands of ultrasound volumes per second, based on a 3-D plane and diverging wave emissions, and demonstrated its clinical feasibility in human subjects in vivo. In this study, we show that noninvasive 3-D ultrafast power Doppler, pulsed Doppler, and color Doppler imaging can be used to perform imaging of blood vessels in humans when using coherent compounding of 3-D tilted plane waves. A customized, programmable, 1024-channel ultrasound system was designed to perform 3-D ultrafast imaging. Using a 32 × 32, 3-MHz matrix phased array (Vermon, Tours, France), volumes were beamformed by coherently compounding successive tilted plane wave emissions. Doppler processing was then applied in a voxel-wise fashion. The proof of principle of 3-D ultrafast power Doppler imaging was first performed by imaging Tygon tubes of various diameters, and in vivo feasibility was demonstrated by imaging small vessels in the human thyroid. Simultaneous 3-D color and pulsed Doppler imaging using compounded emissions were also applied in the carotid artery and the jugular vein in one healthy volunteer.

  3. The Shoreline Management Tool - an ArcMap tool for analyzing water depth, inundated area, volume, and selected habitats, with an example for the lower Wood River Valley, Oregon

    USGS Publications Warehouse

    Snyder, Daniel T.; Haluska, Tana L.; Respini-Irwin, Darius

    2013-01-01

    The Shoreline Management Tool is a geographic information system (GIS) based program developed to assist water- and land-resource managers in assessing the benefits and effects of changes in surface-water stage on water depth, inundated area, and water volume. Additionally, the Shoreline Management Tool can be used to identify aquatic or terrestrial habitat areas where conditions may be suitable for specific plants or animals as defined by user-specified criteria including water depth, land-surface slope, and land-surface aspect. The tool can also be used to delineate areas for use in determining a variety of hydrologic budget components such as surface-water storage, precipitation, runoff, or evapotranspiration. The Shoreline Management Tool consists of two parts, a graphical user interface for use with Esri™ ArcMap™ GIS software to interact with the user to define scenarios and map results, and a spreadsheet in Microsoft® Excel® developed to display tables and graphs of the results. The graphical user interface allows the user to define a scenario consisting of an inundation level (stage), land areas (parcels), and habitats (areas meeting user-specified conditions) based on water depth, slope, and aspect criteria. The tool uses data consisting of land-surface elevation, tables of stage/volume and stage/area, and delineated parcel boundaries to produce maps (data layers) of inundated areas and areas that meet the habitat criteria. The tool can be run in a Single-Time Scenario mode or in a Time-Series Scenario mode, which uses an input file of dates and associated stages. The spreadsheet part of the tool uses a macro to process the results from the graphical user interface to create tables and graphs of inundated water volume, inundated area, dry area, and mean water depth for each land parcel based on the user-specified stage. The macro also creates tables and graphs of the area, perimeter, and number of polygons comprising the user-specified habitat areas within each parcel. The Shoreline Management Tool is highly transferable, using easily generated or readily available data. The capabilities of the tool are demonstrated using data from the lower Wood River Valley adjacent to Upper Klamath and Agency Lakes in southern Oregon.
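
    The core computation such a tool performs can be sketched as follows, with illustrative criteria and cell size: depth, inundated area, and volume are derived from a land-surface elevation grid and a stage, and a habitat mask is built from depth, slope, and aspect conditions. None of the numbers below come from the Wood River Valley application.

```python
import numpy as np

def inundation_summary(dem, stage, cell_size_m):
    """Depth grid, inundated area (m^2), and stored volume (m^3) at a given stage."""
    depth = np.clip(stage - dem, 0.0, None)
    wet = depth > 0.0
    area_m2 = wet.sum() * cell_size_m ** 2
    volume_m3 = depth.sum() * cell_size_m ** 2
    return depth, area_m2, volume_m3

def habitat_mask(dem, depth, cell_size_m, depth_range, slope_max_deg, aspect_range_deg):
    """Cells meeting user-specified depth, slope, and aspect criteria."""
    dz_dy, dz_dx = np.gradient(dem, cell_size_m)
    slope_deg = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
    # Angle of the surface gradient measured from the +y grid axis; which compass
    # direction 0 degrees represents is an illustrative choice here.
    aspect_deg = (np.degrees(np.arctan2(dz_dx, dz_dy)) + 360.0) % 360.0
    ok_depth = (depth >= depth_range[0]) & (depth <= depth_range[1])
    ok_aspect = (aspect_deg >= aspect_range_deg[0]) & (aspect_deg <= aspect_range_deg[1])
    return ok_depth & (slope_deg <= slope_max_deg) & ok_aspect

# Toy DEM (metres): a gently sloping valley floor
y, x = np.mgrid[0:200, 0:200]
dem = 1262.0 + 0.01 * x + 0.002 * (y - 100.0) ** 2 / 100.0
depth, area, volume = inundation_summary(dem, stage=1263.0, cell_size_m=10.0)
habitat = habitat_mask(dem, depth, 10.0, depth_range=(0.1, 0.6),
                       slope_max_deg=2.0, aspect_range_deg=(0.0, 360.0))
print("area %.0f m^2, volume %.0f m^3, habitat cells %d" % (area, volume, habitat.sum()))
```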

  4. Providing open access data online to advance malaria research and control.

    PubMed

    Moyes, Catherine L; Temperley, William H; Henry, Andrew J; Burgert, Clara R; Hay, Simon I

    2013-05-16

    To advance research on malaria, the outputs from existing studies and the data that fed into them need to be made freely available. This will ensure new studies can build on the work that has gone before. These data and results also need to be made available to groups who are developing public health policies based on up-to-date evidence. The Malaria Atlas Project (MAP) has collated and geopositioned over 50,000 parasite prevalence and vector occurrence survey records contributed by over 3,000 sources including research groups, government agencies and non-governmental organizations worldwide. This paper describes the results of a project set up to release data gathered, used and generated by MAP. Requests for permission to release data online were sent to 236 groups who had contributed unpublished prevalence (parasite rate) surveys. An online explorer tool was developed so that users can visualize the spatial distribution of the vector and parasite survey data before downloading it. In addition, a consultation group was convened to provide advice on the mode and format of release for data generated by MAP's modelling work. New software was developed to produce a suite of publication-quality map images for download from the internet for use in external publications. More than 40,000 survey records can now be visualized on a set of dynamic maps and downloaded from the MAP website on a free and unrestricted basis. As new data are added and new permissions to release existing data come in, the volume of data available for download will increase. The modelled data output from MAP's own analyses are also available online in a range of formats, including image files and GIS surface data, for use in advocacy, education, further research and to help parameterize or validate other mathematical models.

  5. Predictive Mapping of Forest Attributes on the Fishlake National Forest

    Treesearch

    Tracey S. Frescino; Gretchen G. Moisen

    2005-01-01

    Forest land managers increasingly need maps of forest characteristics to aid in planning and management. A set of 30-m resolution maps was prepared for the Fishlake National Forest by modeling FIA plot variables as nonparametric functions of ancillary digital data. The set includes maps of volume, biomass, growth, stand age, size, crown cover, and various aspen...

  6. Uranium in US surface, ground, and domestic waters. Volume 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Drury, J.S.; Reynolds, S.; Owen, P.T.

    1981-04-01

    The report Uranium in US Surface, Ground, and Domestic Waters comprises four volumes. Volumes 2, 3, and 4 contain data characterizing the location, sampling date, type, use, and uranium concentrations of 89,994 individual samples presented in tabular form. The tabular data in volumes 2, 3, and 4 are summarized in volume 1 in narrative form and with maps and histograms.

  7. Influences on the variability of eruption sequences and style transitions in the Auckland Volcanic Field, New Zealand

    NASA Astrophysics Data System (ADS)

    Kereszturi, Gábor; Németh, Károly; Cronin, Shane J.; Procter, Jonathan; Agustín-Flores, Javier

    2014-10-01

    Monogenetic basaltic volcanism is characterised by a complex array of eruptive behaviours, reflecting spatial and temporal variability of the magmatic properties (e.g. composition, eruptive volume, magma flux) as well as environmental factors at the vent site (e.g. availability of water, country rock geology, faulting). These combine to produce changes in eruption style over brief periods (minutes to days) in many eruption episodes. Monogenetic eruptions in some volcanic fields often start with a phreatomagmatic vent-opening phase that later transforms into "dry" magmatic explosive or effusive activity, with a strong variation in the duration and importance of this first phase. Such an eruption sequence pattern occurred in 83% of the known eruptions in the 0.25 My-old Auckland Volcanic Field (AVF), New Zealand. In this investigation, the eruptive volumes were compared with the sequences of eruption styles preserved in the pyroclastic record at each volcano of the AVF, as well as environmental influencing factors, such as the distribution and thickness of water-saturated semi- to unconsolidated sediments, topographic position, and distances from known fault lines. The AVF data show that there is no correlation between ejecta ring volumes and environmental influencing factors that is valid for the entire field. In contrast, using a set of comparisons of single volcanoes with well-known and documented sequences, the resulting eruption sequences could be explained by the predominant environmental conditions in which these volcanoes erupted. Based on the spatial variability of these environmental factors, a first-order susceptibility hazard map was constructed for the AVF that forecasts the areas with the greatest likelihood of phreatomagmatic eruptions by overlaying topographical and shallow geological information. Combining detailed phase-by-phase breakdowns of eruptive volumes and the event sequences of the AVF, along with the new susceptibility map, more realistic eruption scenarios can be developed for different parts of the volcanic field. This approach can be applied to tailor field- and sub-field-specific hazard forecasting at similar volcanic fields worldwide.

  8. Middle Atmosphere Program. Handbook for MAP, volume 4

    NASA Technical Reports Server (NTRS)

    Sechrist, C. F., Jr. (Editor)

    1982-01-01

    Topics include winter in the Northern Hemisphere, temperature measurement, geopotential heights, wind measurement, atmospheric motions, photochemical reactions, solar spectral irradiance, trace constituents, tides, gravity waves, and turbulence. Highlights from the MAP Steering Committee and a MAP Open Meeting, including the organizational structure, are also given.

  9. Hazard analysis of landslides triggered by Typhoon Chata'an on July 2, 2002, in Chuuk State, Federated States of Micronesia

    USGS Publications Warehouse

    Harp, Edwin L.; Reid, Mark E.; Michael, John A.

    2004-01-01

    More than 250 landslides were triggered across the eastern volcanic islands of Chuuk State in the Federated States of Micronesia by torrential rainfall from tropical storm Chata'an on July 2, 2002. Landslides triggered during nearly 20 inches of rainfall in less than 24 hours caused 43 fatalities and the destruction or damage of 231 structures, including homes, schools, community centers, and medical dispensaries. Landslides also buried roads, crops, and water supplies. The landslides ranged in volume from a few cubic meters to more than 1 million cubic meters. Most of the failures began as slumps and transformed into debris flows, some of which traveled several hundred meters across coastal flatlands into populated areas. A landslide-inventory map produced after the storm shows that the island of Tonoas had the largest area affected by landslides, although the islands of Weno, Fefan, Etten, Uman, Siis, Udot, Eot, and Fanapanges also had significant landslides. Based on observations since the storm, we estimate the continuing hazard from landslides triggered by Chata'an to be relatively low. However, tropical storms and typhoons similar to Chata'an frequently develop in Micronesia and are likely to affect the islands of Chuuk in the future. To assess the landslide hazard from future tropical storms, we produced a hazard map that identifies landslide-source areas of high, moderate, and low hazard. This map can be used to identify relatively safe areas for relocating structures or establishing areas where people could gather for shelter in relative safety during future typhoons or tropical storms similar to Chata'an.

  10. Using LiDAR to Estimate Surface Erosion Volumes within the Post-storm 2012 Bagley Fire

    NASA Astrophysics Data System (ADS)

    Mikulovsky, R. P.; De La Fuente, J. A.; Mondry, Z. J.

    2014-12-01

    The total post-storm 2012 Bagley fire sediment budget of the Squaw Creek watershed in the Shasta-Trinity National Forest was estimated using many methods. A portion of the budget was quantitatively estimated using LiDAR. Simple workflows were designed to estimate the eroded volumes of debris slides, fill failures, gullies, altered channels and streams. LiDAR was also used to estimate depositional volumes. Thorough manual mapping of large erosional features using the ArcGIS 10.1 Geographic Information System was required as these mapped features determined the eroded volume boundaries in 3D space. The 3D pre-erosional surface for each mapped feature was interpolated based on the boundary elevations. A surface difference calculation was run using the estimated pre-erosional surfaces and LiDAR surfaces to determine the volume of sediment potentially delivered into the stream system. In addition, cross sections of altered channels and streams were taken using stratified random selection based on channel gradient and stream order respectively. The original pre-storm surfaces of channel features were estimated using the cross sections and erosion depth criteria. The open source software Inkscape was used to estimate cross sectional areas for randomly selected channel features, which were then averaged for each channel gradient and stream order class. The average areas were then multiplied by the length of each class to estimate total eroded altered channel and stream volume. Finally, reservoir and in-channel depositional volumes were estimated by mapping channel forms and generating specific reservoir elevation zones associated with depositional events. The in-channel areas and zones within the reservoir were multiplied by estimated and field observed sediment thicknesses to attain a best-guess sediment volume. In-channel estimates included re-occupying stream channel cross sections established before the fire. Once volumes were calculated, other erosion processes of the Bagley sedimentation study, such as surface soil erosion, were combined to estimate the total fire and storm sediment budget for the Squaw Creek watershed. The LiDAR-based measurement workflows can be easily applied to other sediment budget studies using one high resolution LiDAR dataset.
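
    The channel portion of this budget reduces to simple arithmetic, sketched below with hypothetical numbers: the average eroded cross-sectional area for each gradient or stream-order class is multiplied by the total channel length mapped in that class.

```python
# Channel-volume arithmetic with hypothetical numbers: mean eroded cross-sectional
# area per class multiplied by the total channel length mapped in that class.
cross_section_areas_m2 = {                 # areas measured on randomly selected cross sections
    "channel_gradient_0_4_pct": [3.2, 2.7, 4.1],
    "channel_gradient_4_10_pct": [1.9, 2.4],
}
class_lengths_m = {"channel_gradient_0_4_pct": 5200.0, "channel_gradient_4_10_pct": 3100.0}

total_m3 = 0.0
for cls, areas in cross_section_areas_m2.items():
    mean_area = sum(areas) / len(areas)
    volume = mean_area * class_lengths_m[cls]
    total_m3 += volume
    print(f"{cls}: mean area {mean_area:.2f} m^2 -> {volume:.0f} m^3")
print(f"total eroded channel volume: {total_m3:.0f} m^3")
```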

  11. An Efficient Method of Sharing Mass Spatio-Temporal Trajectory Data Based on Cloudera Impala for Traffic Distribution Mapping in an Urban City.

    PubMed

    Zhou, Lianjie; Chen, Nengcheng; Yuan, Sai; Chen, Zeqiang

    2016-10-29

    The efficient sharing of spatio-temporal trajectory data is important to understand traffic congestion in mass data. However, the data volumes of bus networks in urban cities are growing rapidly, reaching daily volumes of one hundred million datapoints. Accessing and retrieving mass spatio-temporal trajectory data in any field is hard and inefficient due to limited computational capabilities and incomplete data organization mechanisms. Therefore, we propose an optimized and efficient spatio-temporal trajectory data retrieval method based on the Cloudera Impala query engine, called ESTRI, to enhance the efficiency of mass data sharing. As an excellent query tool for mass data, Impala can be applied for mass spatio-temporal trajectory data sharing. In ESTRI we extend the spatio-temporal trajectory data retrieval function of Impala and design a suitable data partitioning method. In our experiments, the Taiyuan BeiDou (BD) bus network is selected, containing 2300 buses with BD positioning sensors, producing 20 million records every day, resulting in the two difficulties described in the Introduction section. In addition, ESTRI and MongoDB are applied in experiments. The experiments show that ESTRI achieves the most efficient data retrieval compared to retrieval using MongoDB for data volumes of fifty million, one hundred million, one hundred and fifty million, and two hundred million. The performance of ESTRI is approximately seven times higher than that of MongoDB. The experiments show that ESTRI is an effective method for retrieving mass spatio-temporal trajectory data. Finally, bus distribution mapping in Taiyuan city is achieved, describing the density of buses in different regions at different times throughout the day, which can be applied in future studies of transport, such as traffic scheduling, traffic planning and traffic behavior management in intelligent public transportation systems.
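
    A hedged sketch of the kind of partition-pruned query such a retrieval layer might issue is shown below. The table name, columns, and partitioning by an obs_date column are assumptions chosen for illustration, and the cursor can come from any Impala-compatible DB-API client; none of this reproduces the ESTRI implementation.

```python
def fetch_trajectories(cursor, min_lon, min_lat, max_lon, max_lat, day):
    """Retrieve bus trajectory points inside a bounding box for one day.

    `cursor` is a DB-API cursor from an Impala-compatible client. The table
    `bus_trajectory`, its columns, and its partitioning by `obs_date` are
    hypothetical; the date predicate is what lets Impala prune partitions
    before applying the spatial filter. Parameter style may differ by client.
    """
    sql = """
        SELECT bus_id, obs_time, lon, lat, speed
        FROM bus_trajectory
        WHERE obs_date = %(day)s
          AND lon BETWEEN %(min_lon)s AND %(max_lon)s
          AND lat BETWEEN %(min_lat)s AND %(max_lat)s
        ORDER BY bus_id, obs_time
    """
    cursor.execute(sql, {"day": day, "min_lon": min_lon, "max_lon": max_lon,
                         "min_lat": min_lat, "max_lat": max_lat})
    return cursor.fetchall()

# Usage with an impyla-style client (connection details are placeholders):
#   from impala.dbapi import connect
#   conn = connect(host="impalad.example.org", port=21050)
#   rows = fetch_trajectories(conn.cursor(), 112.4, 37.7, 112.7, 37.95, "2016-05-01")
```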

  12. Semantic layers for illustrative volume rendering.

    PubMed

    Rautek, Peter; Bruckner, Stefan; Gröller, Eduard

    2007-01-01

    Direct volume rendering techniques map volumetric attributes (e.g., density, gradient magnitude, etc.) to visual styles. Commonly this mapping is specified by a transfer function. The specification of transfer functions is a complex task and requires expert knowledge about the underlying rendering technique. In the case of multiple volumetric attributes and multiple visual styles, the specification of the multi-dimensional transfer function becomes even more challenging and non-intuitive. We present a novel methodology for specifying a mapping from several volumetric attributes to multiple illustrative visual styles. We introduce semantic layers that allow a domain expert to specify the mapping in the natural language of the domain. A semantic layer defines the mapping of volumetric attributes to one visual style. Volumetric attributes and visual styles are represented as fuzzy sets. The mapping is specified by rules that are evaluated with fuzzy logic arithmetic. The user specifies the fuzzy sets and the rules without special knowledge about the underlying rendering technique. Semantic layers allow for a linguistic specification of the mapping from attributes to visual styles, replacing the traditional transfer function specification.
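
    A minimal sketch of the rule-evaluation idea (fuzzy memberships combined with a min operator for conjunction, with the rule strength weighting one visual style); the attribute names, membership shapes, and style parameter are illustrative, not the paper's actual rule base:

        import numpy as np

        def high(x, lo, hi):
            """Piecewise-linear fuzzy membership for 'x is high'."""
            return np.clip((x - lo) / (hi - lo), 0.0, 1.0)

        def low(x, lo, hi):
            return 1.0 - high(x, lo, hi)

        # Per-voxel attributes (illustrative values)
        density = np.array([0.1, 0.5, 0.9])
        gradient_magnitude = np.array([0.8, 0.4, 0.1])

        # Rule: IF density is high AND gradient magnitude is low THEN apply the 'tissue' style.
        # Fuzzy AND is taken here as the minimum of the antecedent memberships.
        rule_strength = np.minimum(high(density, 0.2, 0.8),
                                   low(gradient_magnitude, 0.2, 0.8))

        # The rule strength weights how strongly the style (here, its opacity) is applied.
        style_opacity = 0.7 * rule_strength
        print(style_opacity.round(3))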

  13. Clearance detector and method for motion and distance

    DOEpatents

    Xavier, Patrick G [Albuquerque, NM

    2011-08-09

    A method for correct and efficient detection of clearances between three-dimensional bodies in computer-based simulations, where one or both of the bodies is subject to translations and/or rotations. The method conservatively determines the size of such clearances and whether there is a collision between the bodies. Given two bodies, each undergoing separate motions, the method utilizes bounding-volume-hierarchy representations of the two bodies, together with mappings and inverse mappings for their motions. The method uses the representations, mappings, and direction vectors to determine the directionally furthest locations of points on the convex hulls of the volumes virtually swept by the bodies, and hence the clearance between the bodies, without having to calculate the convex hulls themselves. The method includes clearance detection for bodies composed of convex geometrical primitives and more specific techniques for bodies composed of convex polyhedra.
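
    As a much-simplified illustration of the conservative idea (not the patented swept-hull method itself), the clearance between two bodies can be bounded by the gap between bounding volumes that enclose them; axis-aligned boxes are used here purely for brevity:

        import numpy as np

        def aabb_clearance(min_a, max_a, min_b, max_b):
            """Conservative clearance between two axis-aligned bounding boxes.

            Returns 0.0 if the boxes overlap; otherwise the Euclidean gap between
            them. Because each box encloses its body, this never overestimates the
            true body-to-body clearance.
            """
            gap = np.maximum(0.0, np.maximum(min_b - max_a, min_a - max_b))
            return float(np.linalg.norm(gap))

        # Two unit cubes, the second shifted 3 units along x
        a_min, a_max = np.zeros(3), np.ones(3)
        b_min, b_max = np.array([3.0, 0.0, 0.0]), np.array([4.0, 1.0, 1.0])
        print(aabb_clearance(a_min, a_max, b_min, b_max))   # 2.0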

  14. Mapping Isobaric Aging onto the Equilibrium Phase Diagram.

    PubMed

    Niss, Kristine

    2017-09-15

    The linear volume relaxation and the nonlinear volume aging of a glass-forming liquid are measured, directly compared, and used to extract the out-of-equilibrium relaxation time. This opens a window to investigate how the relaxation time depends on temperature, structure, and volume in parts of phase space that are not accessed by the equilibrium liquid. It is found that the temperature dependence of the relaxation time is non-Arrhenius even in the isostructural case, challenging the Adam-Gibbs entropy model. Based on the presented data and the idea that aging happens through quasiequilibrium states, we suggest a mapping of the out-of-equilibrium states during isobaric aging to the equilibrium phase diagram. This mapping implies the existence of isostructural lines in the equilibrium phase diagram. The relaxation time is found to depend on the bath temperature, density, and just a single structural parameter, referred to as an effective temperature.

  15. Study of Abrasive Wear Volume Map for PTFE and PTFE Composites

    NASA Astrophysics Data System (ADS)

    Unal, H.; Sen, U.; Mimaroglu, A.

    2007-11-01

    The potential of this work is based on the use of wear volume maps for evaluating the abrasive wear performance of polytetrafluoroethylene (PTFE) and PTFE composites. The fillers used in the composites are 25% bronze, 35% graphite, and 17% glass fibre (GFR). The influence of filler material, abrasion surface roughness, and applied load on the abrasive wear performance of PTFE and PTFE composites was studied and evaluated. Experimental abrasive wear tests were carried out under atmospheric conditions on a pin-on-disc wear tribometer. Tests were performed under loads of 4, 6, 8, and 10 N, a travelling speed of 1 m/s, and abrasion surface roughness values of 5, 20, and 45 µm. Wear volume maps were obtained, and the results showed that the lowest wear volume rate for PTFE is reached using the GFR filler. Furthermore, the higher the applied load and the rougher the abrasion surface, the higher the wear rate. Finally, it is concluded that the abrasive wear process involves ploughing and cutting mechanisms.

  16. Wildland resource information system: user's guide

    Treesearch

    Robert M. Russell; David A. Sharpnack; Elliot L. Amidon

    1975-01-01

    This user's guide provides detailed information about how to use the computer programs of WRIS, a computer system for storing and manipulating data about land areas. Instructions explain how to prepare maps, digitize by automatic scanners or by hand, produce polygon maps, and combine map layers. Support programs plot maps, store them on tapes, produce summaries,...

  17. Glof Study in Tawang River Basin, Arunachal Pradesh, India

    NASA Astrophysics Data System (ADS)

    Panda, R.; Padhee, S. K.; Dutta, S.

    2014-11-01

    Glacial lake outburst flood (GLOF) is one of the major unexpected hazards in high mountain regions susceptible to climate change. The Tawang river basin in Arunachal Pradesh is an unexplored region of the Eastern Himalayas where several hydro-electric projects (HEPs) are planned. The main source of the river system is snowmelt in the Eastern Himalayas, fed by several lakes located at glacier snouts and dammed by lateral or end moraines. These lakes may pose a threat to the planned projects, as outbursts tend to produce flash floods carrying large quantities of sediment. This study provides a methodology to detect lakes that endanger the HEP sites in the basin, to quantify the volume of discharge from a potentially dangerous lake, and to predict the hydrograph at the lake site. Because the lakes are remote, remote sensing data were used, specifically Landsat-8 imagery with minimal cloud cover. Suitable reflectance bands, chosen on the basis of their spectral responses, were used to produce informational layers (NDWI, a potential snow cover map, and a supervised classification map) in a GIS environment for discriminating different land features. The product of a vector overlay of these layers, representing possible water areas, was then used in combination with Google Earth to identify the lakes within the watershed. The identified lakes were classified as potentially dangerous based on elevation, area, proximity to the stream network, slope, and volume of water held. The HEC-RAS simulation model, with cross sections from Google Earth and field survey as input, was used to simulate a dam-break scenario; hydrodynamic channel routing of the outburst hydrograph along the river reach was carried out to obtain the GLOF hydrograph at the project sites. It was concluded that the assessed GLOF provides a first-order, qualitative approximation of the bed load transported along the river reach and thus of the hazard to the hydropower project sites.
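
    One of the informational layers mentioned above, NDWI, is a simple band ratio, (Green - NIR) / (Green + NIR), which is positive over open water. A minimal sketch assuming the Landsat-8 green and near-infrared reflectance arrays are already loaded; the sample values and the water threshold are illustrative only:

        import numpy as np

        def ndwi(green, nir):
            """Normalized Difference Water Index: (Green - NIR) / (Green + NIR)."""
            green = green.astype(float)
            nir = nir.astype(float)
            return (green - nir) / (green + nir + 1e-10)   # epsilon avoids division by zero

        # Illustrative reflectance patches (water is brighter in green, dark in NIR)
        green = np.array([[0.12, 0.10], [0.25, 0.30]])
        nir = np.array([[0.30, 0.28], [0.05, 0.04]])

        index = ndwi(green, nir)
        possible_water = index > 0.2     # threshold chosen for illustration only
        print(index.round(2))
        print(possible_water)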

  18. Improving the quality of mass produced maps

    USGS Publications Warehouse

    Simley, J.

    2001-01-01

    Quality is critical in cartography because key decisions are often made based on the information the map communicates. The mass production of digital cartographic information to support geographic information science has now added a new dimension to the problem of cartographic quality, as problems once limited to small volumes can now proliferate in mass production programs. These problems can also affect the economics of map production by diverting a sizeable portion of production cost to rework on maps with poor quality. Such problems are common across industry; in response, the quality engineering profession has developed a number of successful methods to overcome them. Two important methods are the reduction of error through statistical analysis and attention to the quality environment in which people work. Once initial and obvious quality problems have been solved, outside influences periodically appear that cause adverse variations in quality and consequently increase production costs. Such errors can be difficult to detect before the customer is affected. However, a number of statistical techniques can be employed to detect variation so that the problem is eliminated before significant damage is caused. Additionally, the environment in which the workforce operates must be conducive to quality. Managers have a powerful responsibility to create this environment. Two sets of guidelines, known as Deming's Fourteen Points and ISO 9000, provide models for this environment.
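
    One classic example of the statistical techniques referred to above is a control chart that flags batches whose error counts fall outside expected limits before the product reaches the customer. A minimal sketch of a c-chart check, with invented inspection counts purely for illustration:

        import numpy as np

        # Defects found per inspected map batch (illustrative data only)
        defects_per_batch = np.array([3, 4, 2, 5, 3, 4, 3, 11, 4, 3])

        mean = defects_per_batch.mean()
        ucl = mean + 3 * np.sqrt(mean)            # c-chart upper control limit
        lcl = max(0.0, mean - 3 * np.sqrt(mean))  # lower limit, floored at zero

        out_of_control = np.where(defects_per_batch > ucl)[0]
        print(f"mean={mean:.1f}, UCL={ucl:.1f}, LCL={lcl:.1f}")
        print("batches needing investigation:", out_of_control)   # batch index 7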

  19. A coarse-to-fine approach for pericardial effusion localization and segmentation in chest CT scans

    NASA Astrophysics Data System (ADS)

    Liu, Jiamin; Chellamuthu, Karthik; Lu, Le; Bagheri, Mohammadhadi; Summers, Ronald M.

    2018-02-01

    Pericardial effusion on CT scans demonstrates very high shape and volume variability and very low contrast to adjacent structures, which prevents traditional automated segmentation methods from achieving high accuracy. Deep neural networks have been widely used for image segmentation in CT scans. In this work, we present a two-stage method for pericardial effusion localization and segmentation. In the first stage, we localize the pericardial area within the entire CT volume, providing a reliable bounding box for the more refined segmentation stage. A coarse-scaled holistically-nested convolutional network (HNN) model is trained on the entire CT volume. The resulting HNN per-pixel probability maps are then thresholded to produce a bounding box covering the pericardial area. In the second stage, a fine-scaled HNN model is trained only on the bounding box region for effusion segmentation, reducing background distraction. Quantitative evaluation is performed on a dataset of 25 CT scans of patients (1206 images) with pericardial effusion. The segmentation accuracy of our two-stage method, measured by the Dice Similarity Coefficient (DSC), is 75.59 ± 12.04%, which is significantly better than the segmentation accuracy (62.74 ± 15.20%) obtained using only the coarse-scaled HNN model.
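
    A minimal sketch of the two operations the pipeline relies on, thresholding a per-pixel probability map into a bounding box and scoring a segmentation with the Dice Similarity Coefficient; the arrays and the threshold value are illustrative, not the paper's:

        import numpy as np

        def bounding_box(prob_map, threshold=0.5):
            """Smallest (rmin, rmax, cmin, cmax) box covering pixels above threshold."""
            mask = prob_map > threshold
            rows = np.any(mask, axis=1)
            cols = np.any(mask, axis=0)
            rmin, rmax = np.where(rows)[0][[0, -1]]
            cmin, cmax = np.where(cols)[0][[0, -1]]
            return int(rmin), int(rmax), int(cmin), int(cmax)

        def dice(pred, truth):
            """Dice Similarity Coefficient between two binary masks."""
            pred, truth = pred.astype(bool), truth.astype(bool)
            intersection = np.logical_and(pred, truth).sum()
            return 2.0 * intersection / (pred.sum() + truth.sum())

        prob = np.zeros((8, 8))
        prob[2:6, 3:7] = 0.9
        print(bounding_box(prob))                  # (2, 5, 3, 6)

        truth = np.zeros((8, 8)); truth[2:6, 2:6] = 1
        print(round(dice(prob > 0.5, truth), 3))   # 0.75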

  20. The Syrtis Major volcano, Mars: A multidisciplinary approach to interpreting its magmatic evolution and structural development

    NASA Astrophysics Data System (ADS)

    Lillis, Robert J.; Dufek, Josef; Kiefer, Walter S.; Black, Benjamin A.; Manga, Michael; Richardson, Jacob A.; Bleacher, Jacob E.

    2015-09-01

    Very weak crustal magnetic fields over the Syrtis Major volcanic complex imply almost total thermal demagnetization, via magmatic intrusions over a large area, within the last ~4 Gyr. We fit a model of these intrusions and the resulting thermal demagnetization to maps of crustal magnetic field strength at 185 km altitude. The best fits are most consistent with a "dog bone"-shaped region of intrusive material, elongated approximately north-south, with an area of ~350,000 km² and an inferred volume of ~4-19 × 10⁶ km³. Such a large volume is best explained by a long-lived mantle plume beneath the Syrtis edifice. A free-air gravity anomaly high over the Syrtis Major caldera is consistent with dense mafic residue remaining at depth following the crystal fractionation that produced the silicic magmas seen at the surface. The elongation of this region is consistent with ascent and north-south emplacement of magma enabled by structures parallel to and associated with the preexisting Isidis impact basin.

  1. Food waste volume and origin: Case studies in the Finnish food service sector.

    PubMed

    Silvennoinen, Kirsi; Heikkilä, Lotta; Katajajuuri, Juha-Matti; Reinikainen, Anu

    2015-12-01

    We carried out a project to map the volume and composition of food waste in the Finnish food service sector. The amount, type and origin of avoidable food waste were investigated in 51 food service outlets, including schools, day-care centres, workplace canteens, petrol stations, restaurants and diners. Food service outlet personnel kept diaries and weighed the food produced and wasted during a one-week or one-day period. For weighing and sorting, the food waste was divided into two categories: originally edible (OE) food waste was separated from originally inedible (OIE) waste, such as vegetable peelings, bones and coffee grounds. In addition, food waste (OE) was divided into three categories in accordance with its origins: kitchen waste, service waste and customer leftovers. According to the results, about 20% of all food handled and prepared in the sector was wasted. The findings also suggest that the main drivers of wasted food are buffet services and overproduction.

  2. Genetic Mapping of Quantitative Trait Loci Controlling Growth and Wood Quality Traits in Eucalyptus Grandis Using a Maternal Half-Sib Family and RAPD Markers

    PubMed Central

    Grattapaglia, D.; Bertolucci, FLG.; Penchel, R.; Sederoff, R. R.

    1996-01-01

    Quantitative trait loci (QTL) mapping of forest productivity traits was performed using an open-pollinated half-sib family of Eucalyptus grandis. For volume growth, a sequential QTL mapping approach was applied using bulk segregant analysis (BSA), selective genotyping (SG) and cosegregation analysis (CSA). Despite the low heritability of this trait and the heterogeneous genetic background employed for mapping, BSA detected one putative QTL and SG detected two of the three later found by CSA. The three putative QTL for volume growth were found to control 13.7% of the phenotypic variation, corresponding to an estimated 43.7% of the genetic variation. For wood specific gravity, five QTL were identified, controlling 24.7% of the phenotypic variation, corresponding to 49% of the genetic variation. Overlapping QTL for CBH, WSG and percentage dry weight of bark were observed. A significant case of digenic epistasis was found, involving unlinked QTL for volume. Our results demonstrate the applicability of the within-half-sib design for QTL mapping in forest trees and indicate the existence of major genes involved in the expression of economically important traits related to forest productivity in Eucalyptus grandis. These findings have important implications for marker-assisted tree breeding. PMID:8913761

  3. Partial volume correction of brain perfusion estimates using the inherent signal data of time-resolved arterial spin labeling.

    PubMed

    Ahlgren, André; Wirestam, Ronnie; Petersen, Esben Thade; Ståhlberg, Freddy; Knutsson, Linda

    2014-09-01

    Quantitative perfusion MRI based on arterial spin labeling (ASL) is hampered by partial volume effects (PVEs), arising from voxel signal cross-contamination between different compartments. To address this issue, several partial volume correction (PVC) methods have been presented. Most previous methods rely on segmentation of a high-resolution T1-weighted morphological image volume that is coregistered to the low-resolution ASL data, making the result sensitive to errors in the segmentation and coregistration. In this work, we present a methodology for partial volume estimation and correction using only low-resolution ASL data acquired with the QUASAR sequence. The methodology consists of a T1-based segmentation method, with no spatial priors, and a modified PVC method based on linear regression. The presented approach thus avoids prior assumptions about the spatial distribution of brain compartments, while also avoiding coregistration between different image volumes. Simulations based on a digital phantom as well as in vivo measurements in 10 volunteers were used to assess the performance of the proposed segmentation approach. The simulation results indicated that QUASAR data can be used for robust partial volume estimation, and this was confirmed by the in vivo experiments. The proposed PVC method yielded plausible perfusion maps, comparable to a reference method based on segmentation of a high-resolution morphological scan. Corrected gray matter (GM) perfusion was 47% higher than uncorrected values, suggesting a significant amount of PVEs in the data. Whereas the reference method failed to completely eliminate the dependence of perfusion estimates on the volume fraction, the novel approach produced GM perfusion values independent of GM volume fraction. The intra-subject coefficient of variation of corrected perfusion values was lowest for the proposed PVC method. As shown in this work, low-resolution partial volume estimation in connection with ASL perfusion estimation is feasible and provides a promising tool for decoupling perfusion and tissue volume.
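
    The regression-based correction referenced above rests on modeling each low-resolution ASL voxel as a mixture of gray matter (GM) and white matter (WM) signal weighted by their partial volume fractions, and solving for the tissue-specific signals by least squares over a small neighbourhood. The following is only a schematic sketch of that idea, not the authors' exact implementation; the kernel size and synthetic values are illustrative:

        import numpy as np

        def pv_correct(signal, f_gm, f_wm, kernel=2):
            """Schematic linear-regression partial volume correction.

            For each voxel, fit signal ~ f_gm*M_gm + f_wm*M_wm by least squares over
            a (2*kernel+1)^2 neighbourhood and return the GM magnetization map.
            """
            ny, nx = signal.shape
            m_gm = np.zeros_like(signal)
            for y in range(ny):
                for x in range(nx):
                    ys = slice(max(0, y - kernel), min(ny, y + kernel + 1))
                    xs = slice(max(0, x - kernel), min(nx, x + kernel + 1))
                    A = np.column_stack([f_gm[ys, xs].ravel(), f_wm[ys, xs].ravel()])
                    b = signal[ys, xs].ravel()
                    coef, *_ = np.linalg.lstsq(A, b, rcond=None)
                    m_gm[y, x] = coef[0]
            return m_gm

        # Synthetic test: true GM signal 100, WM signal 20, mixed by known fractions
        rng = np.random.default_rng(0)
        f_gm = rng.uniform(0.2, 0.8, (16, 16))
        f_wm = 1.0 - f_gm
        signal = 100.0 * f_gm + 20.0 * f_wm + rng.normal(0, 1, (16, 16))
        print(round(float(pv_correct(signal, f_gm, f_wm)[8, 8]), 1))   # close to 100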

  4. Unwrapping 3D complex hollow organs for spatial dose surface analysis.

    PubMed

    Witztum, A; George, B; Warren, S; Partridge, M; Hawkins, M A

    2016-11-01

    Toxicity dose-response models describe the correlation between the dose delivered to an organ and a given toxic endpoint. Duodenal toxicity is a dose-limiting factor in the treatment of pancreatic cancer with radiation, but the relationship between dose and toxicity in the duodenum is not well understood. While there have been limited studies of duodenal toxicity that investigate the volume of the organ receiving dose above a specific threshold, both dose-volume and dose-surface histograms lack spatial information about the dose distribution, which may be important in determining normal tissue response. Due to the complex geometry of the duodenum, previous methods for unwrapping tubular organs for spatial modeling of toxicity are insufficient. A geometrically robust method for producing 2D dose surface maps (DSMs), specifically for the duodenum, has been developed and tested in order to characterize the spatial dose distribution. The organ contour is defined using Delaunay triangulation. The user selects a start and end coordinate in the structure, and a path is found by regulating both length and curvature. This path is discretized; from each point, rays are cast in the plane normal to the vector between the previous and the next point on the path, and the dose at the closest perimeter point is recorded. These angular perimeter slices are "unwrapped" from the edge distal to the pancreas to ensure the high-dose region (proximal to the tumor) falls in the centre of the dose map. Gamma analysis is used to quantify the robustness of this method and the effect of overlapping planes. This method was used to extract DSMs for 15 duodena, with one esophagus case to illustrate the application to simpler geometries. Visual comparison indicates that a 30 × 30 map provides sufficient resolution to view gross spatial features of interest. A lookup table is created to store the area (cm²) represented by each pixel in the DSMs, allowing spatial descriptors to be reported in absolute size. The method described in this paper is robust, requires minimal human interaction, has been shown to be generalizable to simpler geometries, and uses readily available commercial software. The difference seen in DSMs due to overlapping planes is large and justifies the need for a solution that removes such planes. This is the first time 2D dose surface maps have been produced for the duodenum; they provide spatial dose distribution information that can be explored to create models that may improve toxicity prediction in treatments for locally advanced pancreatic cancer.

  5. Assessment of histological differentiation in gastric cancers using whole-volume histogram analysis of apparent diffusion coefficient maps.

    PubMed

    Zhang, Yujuan; Chen, Jun; Liu, Song; Shi, Hua; Guan, Wenxian; Ji, Changfeng; Guo, Tingting; Zheng, Huanhuan; Guan, Yue; Ge, Yun; He, Jian; Zhou, Zhengyang; Yang, Xiaofeng; Liu, Tian

    2017-02-01

    To investigate the efficacy of histogram analysis of the entire tumor volume in apparent diffusion coefficient (ADC) maps for differentiating between histological grades in gastric cancer. Seventy-eight patients with gastric cancer were enrolled in a retrospective 3.0T magnetic resonance imaging (MRI) study. ADC maps were obtained at two different b values (0 and 1000 s/mm²) for each patient. Tumors were delineated on each slice of the ADC maps, and a histogram for the entire tumor volume was subsequently generated. A series of histogram parameters (e.g., skewness and kurtosis) were calculated and correlated with the histological grade of the surgical specimen. The diagnostic performance of each parameter for distinguishing poorly from moderately well-differentiated gastric cancers was assessed using the area under the receiver operating characteristic curve (AUC). There were significant differences in the 5th, 10th, 25th, and 50th percentiles, skewness, and kurtosis between poorly and well-differentiated gastric cancers (P < 0.05). There were correlations between the degree of differentiation and histogram parameters, including the 10th percentile, skewness, kurtosis, and max frequency; the correlation coefficients were 0.273, -0.361, -0.339, and -0.370, respectively. Among all the histogram parameters, the max frequency had the largest AUC value, 0.675. Histogram analysis of ADC maps over the entire tumor volume can be useful in differentiating between histological grades of gastric cancer.
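
    A minimal sketch of the whole-volume histogram features used here, computed from the ADC values inside the delineated tumor mask; the synthetic volume and mask are illustrative only:

        import numpy as np
        from scipy import stats

        def adc_histogram_features(adc_map, tumor_mask):
            """Whole-volume histogram features of ADC values inside a tumor mask.

            adc_map    : 3D array of ADC values
            tumor_mask : boolean 3D array delineating the tumor on every slice
            """
            values = adc_map[tumor_mask]
            return {
                "p5": np.percentile(values, 5),
                "p10": np.percentile(values, 10),
                "p25": np.percentile(values, 25),
                "p50": np.percentile(values, 50),
                "skewness": stats.skew(values),
                "kurtosis": stats.kurtosis(values),
            }

        # Illustrative synthetic volume (units of 10^-3 mm^2/s)
        rng = np.random.default_rng(1)
        adc = rng.normal(1.2, 0.3, (20, 64, 64))
        mask = np.zeros_like(adc, dtype=bool)
        mask[5:15, 20:40, 20:40] = True
        print(adc_histogram_features(adc, mask))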

  6. Long-term Plan for Concrete Pavement Research and Technology--The Concrete Pavement Road Map : Volume II, Tracks

    DOT National Transportation Integrated Search

    2005-09-01

    The Long-Term Plan for Concrete Pavement Research and Technology (CP Road Map) is a holistic, strategic : plan for concrete pavement research and technology transfer. The CP Road Map is a 7- to 10-year plan that : includes 12 distinct but integrated ...

  7. Nure aerial gamma-ray and magnetic reconnaissance survey: Chugach/Yakutat area, Alaska, Mt. Saint Elias Quadrangle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1978-10-01

    Volume II contains the following data on Mt. Saint Elias, Alaska: geologic base map, flight path map, anomaly maps (U, Th, K, U/Th, U/K, Th/K), radiometric multiple-parameter stacked profiles, magnetic and ancillary profile data, and statistical data. (LK)

  8. Long-term Plan for Concrete Pavement Research and Technology--the Concrete Pavement Road Map (second generation) : Volume II, Tracks

    DOT National Transportation Integrated Search

    2012-07-01

    The Long-Term Plan for Concrete Pavement Research and Technology (CP Road Map) is a holistic strategic plan for : concrete pavement research and technology transfer. The CP Road Map is a living plan that includes 12 distinct but : integrated research...

  9. Middle Atmosphere Program: Handbook for MAP, volume 24

    NASA Technical Reports Server (NTRS)

    Edwards, Belva (Editor)

    1987-01-01

    This MAP handbook lists the MAP steering and standing committees. Also listed is the regional consultative group, the secretariat, study groups, projects, and national representatives. The major portion of the handbook is taken up with the listing of scientist members by country and the alphabetical index and addresses of scientists.

  10. AAFE man-made noise experiment project. Volume 1: Introduction experiment definition and requirements

    NASA Technical Reports Server (NTRS)

    1974-01-01

    An experiment was conducted to measure and map the man-made radio frequency emanations that exist at earth-orbital altitudes. The major objectives of the program are to develop a complete conceptual experiment and developmental hardware for the collection and processing of data required to produce meaningful statistics on man-made noise level variations as functions of time, frequency, and geographic location. A wide-dispersion measurement receiver mounted in a spacecraft operating in a specialized orbit is used to obtain the data. A summary of the experiment design goals and constraints is provided. The recommended orbit for the spacecraft is defined. The characteristics of the receiver and the antennas are analyzed.

  11. Middle Atmosphere Program. Handbook for MAP, Volume 17

    NASA Technical Reports Server (NTRS)

    Sechrist, C. F., Jr. (Editor)

    1985-01-01

    The Middle Atmosphere Program (MAP) handbook is divided into three parts. Part 1 consists of minutes of the MAP steering committee meeting and the MAP assembly. Part 2 consists of project and study group reports, such as: (1) the Atmospheric Tides Middle Atmosphere Program (ATMAP) report on the Nov./Dec. 1981 and May 1982 observational campaigns; (2) the MAP/WINE experimenters meeting at Berlin, 1985; (3) the MAP/WINE experimenters meeting at Loen, Norway, 1985; and (4) the penetration of ultraviolet solar radiation into the middle atmosphere. Part 3 consists of national reports.

  12. DCE-MRI using small-molecular and albumin-binding contrast agents in experimental carcinomas with different stromal content.

    PubMed

    Farace, Paolo; Merigo, Flavia; Fiorini, Silvia; Nicolato, Elena; Tambalo, Stefano; Daducci, Alessandro; Degrassi, Anna; Sbarbati, Andrea; Rubello, Domenico; Marzola, Pasquina

    2011-04-01

    To compare DCE-MRI experiments performed using a standard small-molecular (Gd-DTPA) and an albumin-binding (MS-325) contrast agent in two carcinoma models with different stromal content. DU-145 or BXPC-3 cancer cells were subcutaneously injected into nude mice. DCE-MRI was performed with a bolus injection of Gd-DTPA or MS-325 about 2 weeks after inoculation. For quantitative analysis, a volume of interest was manually drawn over each tumor. To address the heterogeneous enhancement, each tumor volume was then divided into the 20% most-enhancing and the remaining 80% least-enhancing fractions. Mean tumor enhancement was calculated over these selected tumor volumes and compared between tumor groups and contrast agents. Maps of differential enhancement, peak enhancement, and time-to-peak were used for visual evaluation. CD31 and VEGF immunohistochemistry were performed on excised tumors. In the 80% least-enhancing volume, at late time points of the dynamic scan, the mean enhancement elicited by MS-325 was higher in BXPC-3 than in DU-145 tumors. In the 20% most-enhancing volume, using either contrast agent, significant differences between the two tumor types were observed only at early time points, while at later time points of the dynamic scan the differences were obscured by the faster washout observed in the BXPC-3 tumors. Enhancement maps confirmed that BXPC-3 tumors were characterized by a marked washout rate with either contrast agent, particularly in the strongly enhancing peripheral rim. With MS-325 this washout pattern appeared to be specific to the BXPC-3 carcinomas, since it was not observed in the DU-145 tumors. Finally, in both tumor types, MS-325 produced significantly higher enhancement than Gd-DTPA in the late phase of the dynamic scan. Ex vivo analysis confirmed the marked presence of aberrant infiltrative stroma in BXPC-3 tumors, in which tumor vessels were embedded. In all tumors the central portion was less viable and less infiltrated by stromal tissue than the peripheral areas. Contrast distribution proved to be related to stromal content, which presumably produced the higher enhancement and faster washout observed in the BXPC-3 tumors. In particular, 'early' contrast-enhanced MRI appeared to be the most sensitive technique for detecting the tumor portions characterized by high stromal content, i.e. the peripheral rim of the BXPC-3 tumors. Since the same tumor models were recently investigated using FDG-PET imaging, showing an inverse relationship between FDG uptake and stromal content, contrast-enhanced MRI and FDG-PET could provide complementary and comprehensive sensitivity in the assessment of carcinomas.
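
    The peak-enhancement and time-to-peak maps mentioned above are simple per-voxel reductions of the dynamic series. A minimal sketch assuming a 4D signal array and a pre-contrast baseline frame count (both the array and the frame count are illustrative):

        import numpy as np

        def enhancement_maps(dynamic, n_baseline=3):
            """Per-voxel peak enhancement and time-to-peak from a DCE-MRI series.

            dynamic    : 4D array (time, z, y, x) of signal intensities
            n_baseline : number of pre-contrast frames averaged as the baseline
            """
            baseline = dynamic[:n_baseline].mean(axis=0)
            rel_enh = (dynamic - baseline) / (baseline + 1e-9)   # relative enhancement
            peak_enhancement = rel_enh.max(axis=0)
            time_to_peak = rel_enh.argmax(axis=0)                # frame index of the peak
            return peak_enhancement, time_to_peak

        # Illustrative series: one voxel enhances after frame 4 and then washes out
        rng = np.random.default_rng(2)
        series = np.full((10, 2, 4, 4), 100.0) + rng.normal(0, 0.5, (10, 2, 4, 4))
        series[4:, 0, 1, 1] += np.array([60, 80, 70, 55, 40, 30])
        peak, ttp = enhancement_maps(series)
        print(round(float(peak[0, 1, 1]), 2), int(ttp[0, 1, 1]))   # ~0.8 at frame 5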

  13. Integrated seismic stochastic inversion and multi-attributes to delineate reservoir distribution: Case study MZ fields, Central Sumatra Basin

    NASA Astrophysics Data System (ADS)

    Haris, A.; Novriyani, M.; Suparno, S.; Hidayat, R.; Riyanto, A.

    2017-07-01

    This study presents the integration of seismic stochastic inversion and multi-attribute analysis for delineating the reservoir distribution, in terms of lithology and porosity, in the formation within the depth interval between the Top Sihapas and Top Pematang. The method used is stochastic inversion integrated with seismic multi-attribute analysis based on a Probabilistic Neural Network (PNN). The stochastic method is used to predict the probability of sandstone from the inverted impedance, varied over 50 realizations to produce a robust probability estimate. Stochastic seismic inversion is more interpretive because it directly gives the value of the rock property. Our experiment shows that the acoustic impedance (AI) from stochastic inversion captures a more diverse range of uncertainty, so that the probability values are close to the actual values. The resulting AI is then used as an input to a multi-attribute analysis, which is used to predict the gamma ray, density, and porosity logs. A stepwise regression algorithm is applied to select the attributes to be used; the selected attributes are then used in the PNN. The PNN method is chosen because it shows the best correlation among the neural network methods tested. Finally, we interpret the products of the multi-attribute analysis, in the form of pseudo-gamma ray, density, and pseudo-porosity volumes, to delineate the reservoir distribution. Our interpretation shows that the structural trap is identified in the southeastern part of the study area, along the anticline.

  14. Volumetric calculation using low cost unmanned aerial vehicle (UAV) approach

    NASA Astrophysics Data System (ADS)

    Rahman, A. A. Ab; Maulud, K. N. Abdul; Mohd, F. A.; Jaafar, O.; Tahar, K. N.

    2017-12-01

    Unmanned Aerial Vehicle (UAV) technology has evolved dramatically in the 21st century. It is used by both the military and the general public for recreational purposes and mapping work. The operating cost of a UAV is much lower than that of a conventional aircraft, and it does not require a large work space. UAV systems offer functions similar to LiDAR and satellite imaging technologies, which require substantial cost, labour, and time to produce elevation and dimension data. Difficult objects such as water tanks can also be measured using a UAV. The purpose of this paper is to show the capability of a UAV to compute the volume of a water tank from different numbers of images and control points. The results were compared with the actual volume of the tank to validate the measurement. In this study, the image acquisition was done using a Phantom 3 Professional, a low-cost UAV. The analysis is based on volume computations using two and four control points with varying sets of UAV images. The results show that more images provide a better quality measurement. With 95 images and four ground control points (GCPs), the error relative to the actual volume is about 5%. Four control points are enough to obtain good results, but more images are needed, estimated at about 115 to 220. All in all, it can be concluded that a low-cost UAV has the potential to be used for water volume and dimension measurement.

  15. A method to estimate the effect of deformable image registration uncertainties on daily dose mapping

    PubMed Central

    Murphy, Martin J.; Salguero, Francisco J.; Siebers, Jeffrey V.; Staub, David; Vaman, Constantin

    2012-01-01

    Purpose: To develop a statistical sampling procedure for spatially-correlated uncertainties in deformable image registration and then use it to demonstrate their effect on daily dose mapping. Methods: Sequential daily CT studies are acquired to map anatomical variations prior to fractionated external beam radiotherapy. The CTs are deformably registered to the planning CT to obtain displacement vector fields (DVFs). The DVFs are used to accumulate the dose delivered each day onto the planning CT. Each DVF has spatially-correlated uncertainties associated with it. Principal components analysis (PCA) is applied to measured DVF error maps to produce decorrelated principal component modes of the errors. The modes are sampled independently and reconstructed to produce synthetic registration error maps. The synthetic error maps are convolved with dose mapped via deformable registration to model the resulting uncertainty in the dose mapping. The results are compared to the dose mapping uncertainty that would result from uncorrelated DVF errors that vary randomly from voxel to voxel. Results: The error sampling method is shown to produce synthetic DVF error maps that are statistically indistinguishable from the observed error maps. Spatially-correlated DVF uncertainties modeled by our procedure produce patterns of dose mapping error that are different from that due to randomly distributed uncertainties. Conclusions: Deformable image registration uncertainties have complex spatial distributions. The authors have developed and tested a method to decorrelate the spatial uncertainties and make statistical samples of highly correlated error maps. The sample error maps can be used to investigate the effect of DVF uncertainties on daily dose mapping via deformable image registration. An initial demonstration of this methodology shows that dose mapping uncertainties can be sensitive to spatial patterns in the DVF uncertainties. PMID:22320766
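
    A schematic sketch of the sampling idea described above: decompose the observed DVF error maps into principal components, draw the mode weights independently, and reconstruct synthetic, spatially correlated error maps. The array shapes, toy data, and number of retained modes are illustrative:

        import numpy as np

        # Each row is one observed DVF error map, flattened (n_observations x n_voxels)
        rng = np.random.default_rng(3)
        n_obs, n_vox = 40, 500
        observed = rng.normal(0, 1, (n_obs, 5)) @ rng.normal(0, 1, (5, n_vox))  # correlated toy data

        # PCA via SVD of the mean-centred observations
        mean = observed.mean(axis=0)
        U, s, Vt = np.linalg.svd(observed - mean, full_matrices=False)

        n_modes = 5                                   # retained modes (illustrative)
        modes = Vt[:n_modes]                          # decorrelated principal component modes
        std = s[:n_modes] / np.sqrt(n_obs - 1)        # standard deviation of each mode weight

        def synthetic_error_map(rng):
            """Sample mode weights independently, then reconstruct a correlated error map."""
            weights = rng.normal(0.0, std)
            return mean + weights @ modes

        print(synthetic_error_map(rng).shape)         # (500,)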

  16. 3D Volume and Morphology of Perennial Cave Ice and Related Geomorphological Models at Scăriloara Ice Cave, Romania, from Structure from Motion, Ground Penetrating Radar and Total Station Surveys

    NASA Astrophysics Data System (ADS)

    Hubbard, J.; Onac, B. P.; Kruse, S.; Forray, F. L.

    2017-12-01

    Research at Scăriloara Ice Cave has proceeded for over 150 years, primarily driven by the presence and paleoclimatic importance of the large perennial ice block and various ice speleothems located within its galleries. Previous observations of the ice block led to rudimentary volume estimates of 70,000 to 120,000 cubic meters (m³), prospectively placing it among the world's largest cave ice deposits. The cave morphology and the surface of the ice block are now recreated in a total-station-survey-validated 3D model, produced using Structure from Motion (SfM) software. With the total station survey and the novel use of ArcGIS tools, the SfM validation process is drastically simplified, producing a scaled, georeferenced, and photo-texturized 3D model of the cave environment with a root-mean-square error (RMSE) of 0.24 m. Furthermore, ground penetrating radar data were collected and spatially oriented with the total station survey to recreate the ice block basal surface, and were combined with the SfM model to create a model of the ice block itself. The resulting ice block model has a volume of over 118,000 m³ with an uncertainty of 9.5%, with additional volumes left unsurveyed. The varying elevation of the ice block basal surface model reflects specific features of the cave roof, such as areas of enlargement, shafts, and potential joints, which offer further validation and inform theories on cave and ice genesis. Specifically, a large depression was identified as a potential area of initial ice growth. Finally, an ice thickness map was produced that will aid the design of future ice-coring projects. This methodology presents a powerful means to observe, characterize, and measure cave and cave-ice morphologies with ease and affordability. The results further establish the significance of Scăriloara's ice block to paleoclimate research, provide insights into cave and ice block genesis, and aid future study design.

  17. The Derivation of Fault Volumetric Properties from 3D Trace Maps Using Outcrop Constrained Discrete Fracture Network Models

    NASA Astrophysics Data System (ADS)

    Hodgetts, David; Seers, Thomas

    2015-04-01

    Fault systems are important structural elements within many petroleum reservoirs, acting as potential conduits, baffles, or barriers to hydrocarbon migration. Large, seismic-scale faults often serve as reservoir-bounding seals, forming structural traps that have proved to be prolific plays in many petroleum provinces. Though inconspicuous in most seismic datasets, smaller subsidiary faults, commonly within the damage zones of parent structures, may also play an important role. These smaller faults typically form narrow, tabular low-permeability zones that compartmentalize the reservoir, negatively impacting hydrocarbon recovery. Though considerable improvements have been made in the visualization of reservoir-scale fault systems with the advent of 3D seismic surveys, the occlusion of smaller-scale faults in such datasets is a source of significant uncertainty during prospect evaluation. The limited capacity of conventional subsurface datasets to probe the spatial distribution of these smaller-scale faults has given rise to a large number of outcrop-based studies, allowing their intensity, connectivity, and size distributions to be explored in detail. Whilst these studies have yielded an improved theoretical understanding of the style and distribution of sub-seismic-scale faults, the ability to transform observations from outcrop into quantities that are relatable to reservoir volumes remains elusive. These issues arise because outcrops essentially offer a pseudo-3D window into the rock volume, making the extrapolation of surficial fault properties, such as areal density (fracture length per unit area: P21), to equivalent volumetric measures applicable to fracture modelling (i.e. fracture area per unit volume: P32) extremely challenging. Here, we demonstrate an approach that harnesses advances in the extraction of 3D trace maps from surface reconstructions using calibrated image sequences, in combination with a novel semi-deterministic, outcrop-constrained discrete fracture network modeling code, to derive volumetric fault intensity measures (fault area per unit volume / fault volume per unit volume). Producing per-vertex measures of volumetric intensity, our method captures the spatial variability in 3D fault density across a surveyed outcrop, enabling first-order controls to be probed. We demonstrate our approach on pervasively faulted exposures of a Permian-aged reservoir analogue from the Vale of Eden Basin, UK.
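
    For reference, the two intensity measures involved are straightforward ratios: areal intensity P21 is total trace length per unit sampling area on the outcrop, and volumetric intensity P32 is total fracture surface area per unit sampling volume. A minimal sketch using invented trace lengths, fracture areas, and sampling sizes purely to illustrate the definitions (the non-trivial step in the paper, converting one to the other via outcrop-constrained DFN models, is not reproduced here):

        def p21(trace_lengths_m, sampling_area_m2):
            """Areal intensity P21: total fault trace length per unit sampling area."""
            return sum(trace_lengths_m) / sampling_area_m2

        def p32(fracture_areas_m2, sampling_volume_m3):
            """Volumetric intensity P32: total fault surface area per unit sampling volume."""
            return sum(fracture_areas_m2) / sampling_volume_m3

        traces = [12.0, 7.5, 3.2, 9.1]           # trace lengths on an outcrop face (m)
        print(p21(traces, sampling_area_m2=400.0))          # 0.0795 per metre

        dfn_areas = [55.0, 80.0, 21.0, 34.0]     # fault plane areas in a DFN realisation (m^2)
        print(p32(dfn_areas, sampling_volume_m3=8000.0))    # 0.02375 per metre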

  18. Numerical Tsunami Hazard Assessment of the Only Active Lesser Antilles Arc Submarine Volcano: Kick 'em Jenny.

    NASA Astrophysics Data System (ADS)

    Dondin, F. J. Y.; Dorville, J. F. M.; Robertson, R. E. A.

    2015-12-01

    The Lesser Antilles Volcanic Arc has potentially been hit by prehistoric regional tsunamis generated by voluminous volcanic landslides (volume > 1 km³) among the 53 events recognized so far. No field evidence of these tsunamis has been found in the vicinity of the sources. Such a scenario taking place today would trigger hazardous tsunami waves bearing potentially catastrophic consequences for the closest islands and regional offshore oil platforms. Here we applied a complete hazard assessment method to the only active submarine volcano of the arc, Kick 'em Jenny (KeJ). KeJ is the southernmost edifice with recognized associated volcanic landslide deposits. Of the three identified landslide episodes, one is associated with a collapse volume of ca. 4.4 km³. Numerical simulations considering a single-pulse collapse revealed that this episode would have produced a regional tsunami. The current edifice volume is estimated at ca. 1.5 km³. A previous study assessed regional tsunami hazard in terms of shoreline surface elevation (run-up) for a potential flank collapse scenario at KeJ, but that assessment was based on an inferred volume of collapse material. We aim firstly to quantify potential initial volumes of collapse material using relative slope instability analysis (RSIA), and secondly to assess first-order run-ups and maximum inland inundation distance for Barbados and Trinidad and Tobago, two important economic centers of the Lesser Antilles. In this framework we present, for seven geomechanical models tested in the RSIA step, maps of the critical failure surface associated with the factor of safety (Fs) for twelve sectors of 30° each; we then introduce maps of expected potential run-ups (run-up × the probability of failure in a sector) at the shoreline. The RSIA evaluates critical potential failure surfaces associated with Fs < 1, compared to areas of deficit/surplus of mass/volume identified on the volcanic edifice (using VolcanoFit 2.0 & SSAP 4.5). Tsunami source characteristics are retrieved from numerical simulation using a hydraulic-equations-based code (VolcFlow-Matlab). The tsunami propagation towards the coasts is computed using the open-source Boussinesq-equations-based code FUNWAVE, taking into account high-order nonlinear effects including dissipation.

  19. Maps for the nation: The current federal mapping establishment

    USGS Publications Warehouse

    North, G.W.

    1983-01-01

    The U.S. Government annually produces an estimated 53,000 new maps and charts and distributes about 160 million copies. A large number of these maps are produced under the national mapping program, a decentralized Federal/State cooperative approach to mapping the country at standard scales. Circular A-16, issued by the Office of Management and Budget in 1953 and revised in 1967, delegates the mapping responsibilities to various federal agencies. The U.S. Department of the Interior's Geological Survey is the principal federal agency responsible for implementing the national mapping program. Other major federal map producing agencies include the Departments of Agriculture, Commerce, Defense, Housing and Urban Development, and Transportation, and the Tennessee Valley Authority. To make maps and mapping information more readily available, the National Cartographic Information Center was established in 1974 and an expanded National Map Library Depository Program in 1981. The most recent of many technological advances made under the mapping program are in the areas of digital cartography and video disc and optical disc information storage systems. Future trends and changes in the federal mapping program will involve expanded information and customer service operations, further developments in the production and use of digital cartographic data, and consideration of a Federal Mapping Agency.

  20. Forming maps of targets having multiple reflectors with a biomimetic audible sonar.

    PubMed

    Kuc, Roman

    2018-05-01

    A biomimetic audible sonar mimics human echolocation by emitting clicks and sensing echoes binaurally to investigate the limitations of acoustic mapping of 2.5-dimensional targets. A monaural sonar that provides only echo time-of-flight values produces biased maps that lie outside the target surfaces. Reflector bearing estimates derived from the first echoes detected by a binaural sonar are employed to form unbiased maps. Multiple echoes from a target introduce phantom-reflector (PR) artifacts into its map because later echoes are produced by reflectors at bearings different from those determined from the first echoes. In addition, overlapping echoes interfere to produce bearing errors. Addressing the causes of these bearing errors motivates a processing approach that employs template matching to extract valid echoes. Interfering echoes can mimic a valid echo and also form PR artifacts. These artifacts are eliminated by recognizing the bearing fluctuations that characterize echo interference. Removing PR artifacts produces a map that resembles the physical target shape to within the resolution capabilities of the sonar. The remaining differences between the target shape and the final map are void artifacts caused by invalid or missing echoes.
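
    A minimal sketch of how a binaural receiver pair converts the interaural arrival-time difference of a first echo into a bearing estimate, using the textbook far-field relation sin(theta) = c * dt / d; the speed of sound and receiver spacing are illustrative, and this is not the sonar's full processing chain:

        import math

        def bearing_from_itd(delta_t_s, receiver_spacing_m, speed_of_sound=343.0):
            """Bearing (degrees from straight ahead) from the interaural time difference.

            Far-field relation sin(theta) = c * delta_t / d, clipped to [-1, 1].
            """
            x = max(-1.0, min(1.0, speed_of_sound * delta_t_s / receiver_spacing_m))
            return math.degrees(math.asin(x))

        # Echo arrives 100 microseconds earlier at one receiver of a 0.15 m pair
        print(round(bearing_from_itd(100e-6, 0.15), 1))   # ~13.2 degrees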

  1. Precise Ortho Imagery as the Source for Authoritative Airport Mapping

    NASA Astrophysics Data System (ADS)

    Howard, H.; Hummel, P.

    2016-06-01

    As the aviation industry moves from paper maps and charts to the digital cockpit and electronic flight bag, producers of these products need current and accurate data to ensure flight safety. The FAA (Federal Aviation Administration) and ICAO (International Civil Aviation Organization) require certified suppliers to follow a defined protocol to produce authoritative map data for the aerodrome. Typical airport maps have been produced to meet 5 m accuracy requirements. The new digital aviation world is moving to 1 m accuracy maps to provide better situational awareness on the aerodrome. The commercial availability of 0.5 m satellite imagery, combined with accurate ground control, is enabling the production of avionics-certified 0.85 m orthophotos of airports around the globe. CompassData maintains an archive of more than 400 airports as source data to support producers of 1 m certified Aerodrome Mapping Databases (AMDB) critical to flight safety and automated situational awareness. CompassData is a DO-200A certified supplier of authoritative orthoimagery, and attendees will learn how to utilize current airport imagery to build digital aviation mapping products.

  2. Long-term Plan for Concrete Pavement Research and Technology--the Concrete Pavement Road Map (second generation) : Volume I, Background and Summary

    DOT National Transportation Integrated Search

    2012-04-01

    The Long-Term Plan for Concrete Pavement Research and Technology (CP Road Map) is a holistic strategic plan for concrete : pavement research and technology transfer. The CP Road Map is a living plan that includes 12 distinct but integrated research :...

  3. Long-term Plan for Concrete Pavement Research and Technology--the Concrete Pavement Road Map : Volume I, Background and summary

    DOT National Transportation Integrated Search

    2005-09-01

    The Long-Term Plan for Concrete Pavement Research and Technology (CP Road Map) is a holistic, strategic plan : for concrete pavement research and technology transfer. The CP Road Map is a 7- to 10-year plan that includes 12 : distinct but integrated ...

  4. The Map Corner.

    ERIC Educational Resources Information Center

    Cheyney, Arnold B.; Capone, Donald L.

    This teaching resource is aimed at helping students develop the skills necessary to locate places on the earth. Designed as a collection of map skill exercises rather than a sequential program of study, this program expects that students have access to and some knowledge of how to use globes, maps, atlases, and encyclopedias. The volume contains 6…

  5. Test functions for three-dimensional control-volume mixed finite-element methods on irregular grids

    USGS Publications Warehouse

    Naff, R.L.; Russell, T.F.; Wilson, J.D.

    2000-01-01

    Numerical methods based on unstructured grids, with irregular cells, usually require discrete shape functions to approximate the distribution of quantities across cells. For control-volume mixed finite-element methods, vector shape functions are used to approximate the distribution of velocities across cells and vector test functions are used to minimize the error associated with the numerical approximation scheme. For a logically cubic mesh, the lowest-order shape functions are chosen in a natural way to conserve intercell fluxes that vary linearly in logical space. Vector test functions, while somewhat restricted by the mapping into the logical reference cube, admit a wider class of possibilities. Ideally, an error minimization procedure to select the test function from an acceptable class of candidates would be the best procedure. Lacking such a procedure, we first investigate the effect of possible test functions on the pressure distribution over the control volume; specifically, we look for test functions that allow for the elimination of intermediate pressures on cell faces. From these results, we select three forms for the test function for use in a control-volume mixed method code and subject them to an error analysis for different forms of grid irregularity; errors are reported in terms of the discrete L2 norm of the velocity error. Of these three forms, one appears to produce optimal results for most forms of grid irregularity.
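
    For reference, a discrete L2 norm of the velocity error of the kind reported here commonly takes the volume-weighted form below; this is the standard definition, stated as an assumption since the abstract does not give the exact weighting used:

        \[
          \| \mathbf{u} - \mathbf{u}_h \|_{L^2}
          = \Bigl( \sum_{i} |V_i| \, \bigl| \mathbf{u}(x_i) - \mathbf{u}_{h,i} \bigr|^2 \Bigr)^{1/2},
        \]

    where $|V_i|$ is the volume of cell $i$, $\mathbf{u}(x_i)$ is the exact velocity at the cell evaluation point, and $\mathbf{u}_{h,i}$ is the computed discrete velocity.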

  6. Status and trends of land change in the United States--1973 to 2000

    USGS Publications Warehouse

    ,

    2012-01-01

    U.S. Geological Survey (USGS) Professional Paper 1794 is a four-volume series on the status and trends of the Nation’s land use and land cover, providing an assessment of the rates and causes of land-use and land-cover change in the United States between 1973 and 2000. Volumes A, B, C, and D provide analyses for the Western United States, the Great Plains, the Midwest–South Central United States, and the Eastern United States, respectively. The assessments of land-use and land-cover trends are conducted on an ecoregion-by-ecoregion basis, and each ecoregion assessment is guided by a nationally consistent study design that includes mapping, statistical methods, field studies, and analysis. Individual assessments provide a picture of the characteristics of land change occurring in a given ecoregion; in combination, they provide a framework for understanding the complex national mosaic of change and also the causes and consequences of change. Thus, each volume in this series provides a regional assessment of how (and how fast) land use and land cover are changing, and why. The four volumes together form the first comprehensive picture of land change across the Nation. This report is only one of the products produced by USGS on land-use and land-cover change in the United States. Other reports and land-cover statistics are available online at http://landcovertrends.usgs.gov.

  7. Preconstruction Engineering and Design Phase Reevaluation Report: Olcott Harbor Project. Vol 1: Main Report and Supplemental Environmental Impact Statement

    DTIC Science & Technology

    1992-09-01

    Only fragmentary indexing snippets are available for this record; they reference the U.S. Fish and Wildlife Service National Wetland Inventory Maps, New York State Department of Environmental Conservation maps, the Bureau of the Census Census of Population and Housing (STF1A, 1980), and the Inventory of Community Water Systems, New York State (Volume I - Municipal, Volume II - Non-Municipal), 1984.

  8. 1994 ASPRS/ACSM annual convention exposition. Volume 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1994-01-01

    This report is Volume II of the papers presented at the joint 1994 convention of the American Society for Photogrammetry and Remote Sensing and the American Congress on Surveying and Mapping. Topic areas covered include the following: Data Base/GPS Issues; Survey Management Issues; Surveying Computations; Surveying Education; Digital Mapping; Global Change, EOS and NALC Issues; GPS Issues; Battelle Research in Remote Sensing and in GIS; Advanced Image Processing; GIS Issues; Surveying and Geodesy Issues; Water Resource Issues; Advanced Applications of Remote Sensing; and Landsat Pathfinder I.

  9. Three-dimensional mapping of the lateral ventricles in autism

    PubMed Central

    Vidal, Christine N.; Nicolson, Rob; Boire, Jean-Yves; Barra, Vincent; DeVito, Timothy J.; Hayashi, Kiralee M.; Geaga, Jennifer A.; Drost, Dick J.; Williamson, Peter C.; Rajakumar, Nagalingam; Toga, Arthur W.; Thompson, Paul M.

    2009-01-01

    In this study, a computational mapping technique was used to examine the three-dimensional profile of the lateral ventricles in autism. T1-weighted three-dimensional magnetic resonance images of the brain were acquired from 20 males with autism (age: 10.1 ± 3.5 years) and 22 male control subjects (age: 10.7 ± 2.5 years). The lateral ventricles were delineated manually and ventricular volumes were compared between the two groups. Ventricular traces were also converted into statistical three-dimensional maps, based on anatomical surface meshes. These maps were used to visualize regional morphological differences in the thickness of the lateral ventricles between patients and controls. Although ventricular volumes measured using traditional methods did not differ significantly between groups, statistical surface maps revealed subtle, highly localized reductions in ventricular size in patients with autism in the left frontal and occipital horns. These localized reductions in the lateral ventricles may result from exaggerated brain growth early in life. PMID:18502618

  10. Novel cardiac magnetic resonance biomarkers: native T1 and extracellular volume myocardial mapping.

    PubMed

    Cannaò, Paola Maria; Altabella, Luisa; Petrini, Marcello; Alì, Marco; Secchi, Francesco; Sardanelli, Francesco

    2016-04-28

    Cardiac magnetic resonance (CMR) is a non-invasive diagnostic tool playing a key role in the assessment of cardiac morphology and function as well as in tissue characterization. Late gadolinium enhancement is a fundamental CMR technique for detecting focal or regional abnormalities, such as scar tissue, replacement fibrosis, or inflammation, using qualitative, semi-quantitative, or quantitative methods, but it does not allow evaluation of the whole myocardium in the presence of diffuse disease. The novel T1 mapping approach permits a quantitative assessment of the entire myocardium, providing a voxel-by-voxel map of the native T1 relaxation time obtained before the intravenous administration of gadolinium-based contrast material. By combining T1 data obtained before and after contrast injection, it is also possible to calculate the voxel-by-voxel extracellular volume (ECV), resulting in another myocardial parametric map. This article describes the technical challenges and clinical perspectives of these two novel CMR biomarkers: myocardial native T1 and ECV mapping.
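
    The ECV calculation mentioned above is commonly done voxel-by-voxel with the published relation ECV = (1 - hematocrit) × (dR1_myocardium / dR1_blood), where dR1 = 1/T1_post - 1/T1_pre; this formula is standard in the CMR literature but is not spelled out in the abstract, and the values below are illustrative:

        import numpy as np

        def ecv_map(t1_myo_pre, t1_myo_post, t1_blood_pre, t1_blood_post, hematocrit):
            """Voxel-wise extracellular volume fraction from native and post-contrast T1.

            ECV = (1 - hematocrit) * (dR1_myocardium / dR1_blood),
            with dR1 = 1/T1_post - 1/T1_pre (T1 values in milliseconds).
            """
            d_r1_myo = 1.0 / t1_myo_post - 1.0 / t1_myo_pre
            d_r1_blood = 1.0 / t1_blood_post - 1.0 / t1_blood_pre
            return (1.0 - hematocrit) * d_r1_myo / d_r1_blood

        # Illustrative values: native and post-contrast myocardial T1 maps, blood pool T1s
        t1_myo_pre = np.full((4, 4), 1000.0)
        t1_myo_post = np.full((4, 4), 450.0)
        ecv = ecv_map(t1_myo_pre, t1_myo_post, 1600.0, 300.0, hematocrit=0.42)
        print(round(float(ecv.mean()), 2))   # ~0.26, within the commonly reported normal range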

  11. Designing and Testing a UAV Mapping System for Agricultural Field Surveying

    PubMed Central

    Skovsen, Søren

    2017-01-01

    A Light Detection and Ranging (LiDAR) sensor mounted on an Unmanned Aerial Vehicle (UAV) can map the overflown environment in point clouds. Mapped canopy heights allow for the estimation of crop biomass in agriculture. The work presented in this paper contributes to sensory UAV setup design for mapping and textual analysis of agricultural fields. LiDAR data are combined with data from Global Navigation Satellite System (GNSS) and Inertial Measurement Unit (IMU) sensors to conduct environment mapping for point clouds. The proposed method facilitates LiDAR recordings in an experimental winter wheat field. Crop height estimates ranging from 0.35-0.58 m are correlated to the applied nitrogen treatments of 0-300 kg N ha⁻¹. The LiDAR point clouds are recorded, mapped, and analysed using the functionalities of the Robot Operating System (ROS) and the Point Cloud Library (PCL). Crop volume estimation is based on a voxel grid with a spatial resolution of 0.04 × 0.04 × 0.001 m. Two different flight patterns are evaluated at an altitude of 6 m to determine the impacts of the mapped LiDAR measurements on crop volume estimations. PMID:29168783
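
    A minimal sketch of the voxel-grid volume estimate: LiDAR returns are binned into a regular grid and the canopy volume is approximated as the number of occupied voxels times the voxel volume. The abstract gives the grid resolution but not the exact aggregation, so the counting rule and the synthetic point cloud below are illustrative assumptions:

        import numpy as np

        def voxel_grid_volume(points, resolution=(0.04, 0.04, 0.001)):
            """Approximate canopy volume as (number of occupied voxels) x (voxel volume).

            points     : (N, 3) array of LiDAR returns (x, y, z) in metres
            resolution : voxel edge lengths in metres
            """
            res = np.asarray(resolution)
            indices = np.floor(points / res).astype(int)
            occupied = np.unique(indices, axis=0).shape[0]
            return occupied * float(np.prod(res))

        # Illustrative point cloud: a 1 m x 1 m patch of canopy about 0.4 m tall
        rng = np.random.default_rng(4)
        pts = rng.uniform([0.0, 0.0, 0.0], [1.0, 1.0, 0.4], size=(50000, 3))
        print(round(voxel_grid_volume(pts), 3))
        # Note: sparse sampling makes the occupied-voxel count underestimate the
        # enclosed canopy volume, so point density matters for this estimator.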

  12. Intrinsic optical signal imaging of the blood volume changes is sufficient for mapping the resting state functional connectivity in the rodent cortex.

    PubMed

    Kura, Sreekanth; Xie, Hongyu; Fu, Buyin; Ayata, Cenk; Boas, David A; Sakadžić, Sava

    2018-06-01

    Resting state functional connectivity (RSFC) allows the study of functional organization in normal and diseased brain by measuring the spontaneous brain activity generated under resting conditions. Intrinsic optical signal imaging (IOSI) based on multiple illumination wavelengths has been used successfully to compute RSFC maps in animal studies. The IOSI setup complexity would be greatly reduced if only a single wavelength could be used to obtain comparable RSFC maps. We used anesthetized mice and performed various comparisons between RSFC maps based on a single wavelength as well as on oxy-, deoxy-, and total hemoglobin concentration changes. The RSFC maps based on IOSI at a single wavelength selected for sensitivity to blood volume changes are quantitatively comparable to the RSFC maps based on oxy- and total hemoglobin concentration changes obtained by the more complex IOSI setups. Moreover, RSFC maps do not require CCD cameras with very high frame acquisition rates, since our results demonstrate that they can be computed from data obtained at frame rates as low as 5 Hz. Our results will have general utility for guiding future RSFC studies based on IOSI and making decisions about IOSI system designs.

  13. Designing and Testing a UAV Mapping System for Agricultural Field Surveying.

    PubMed

    Christiansen, Martin Peter; Laursen, Morten Stigaard; Jørgensen, Rasmus Nyholm; Skovsen, Søren; Gislum, René

    2017-11-23

    A Light Detection and Ranging (LiDAR) sensor mounted on an Unmanned Aerial Vehicle (UAV) can map the overflown environment in point clouds. Mapped canopy heights allow for the estimation of crop biomass in agriculture. The work presented in this paper contributes to sensory UAV setup design for mapping and textual analysis of agricultural fields. LiDAR data are combined with data from Global Navigation Satellite System (GNSS) and Inertial Measurement Unit (IMU) sensors to conduct environment mapping for point clouds. The proposed method facilitates LiDAR recordings in an experimental winter wheat field. Crop height estimates ranging from 0.35-0.58 m are correlated to the applied nitrogen treatments of 0-300 kg N ha⁻¹. The LiDAR point clouds are recorded, mapped, and analysed using the functionalities of the Robot Operating System (ROS) and the Point Cloud Library (PCL). Crop volume estimation is based on a voxel grid with a spatial resolution of 0.04 × 0.04 × 0.001 m. Two different flight patterns are evaluated at an altitude of 6 m to determine the impacts of the mapped LiDAR measurements on crop volume estimations.
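
    The crop volume estimate described above is based on a voxel grid. The exact ROS/PCL pipeline is not reproduced here, but the idea can be sketched in a few lines of Python; the occupancy rule (count each voxel once) and the synthetic point cloud are assumptions.

```python
import numpy as np

def crop_volume_from_points(points, cell=(0.04, 0.04, 0.001)):
    """Estimate crop volume by counting occupied voxels in a 3D grid.

    points : (N, 3) array of LiDAR returns (x, y, z) in metres, z relative to ground.
    cell   : voxel size; 0.04 x 0.04 x 0.001 m follows the resolution quoted above.
    A voxel is counted once no matter how many points fall inside it.
    """
    cell = np.asarray(cell)
    idx = np.floor(points / cell).astype(np.int64)   # voxel index of each point
    occupied = np.unique(idx, axis=0).shape[0]       # number of distinct occupied voxels
    return occupied * np.prod(cell)                  # occupied voxels times voxel volume

# Illustrative usage with a synthetic 1 m x 1 m canopy patch:
rng = np.random.default_rng(1)
pts = np.column_stack([rng.uniform(0, 1, 5000),
                       rng.uniform(0, 1, 5000),
                       rng.uniform(0, 0.5, 5000)])   # canopy heights up to 0.5 m
print(f"estimated crop volume: {crop_volume_from_points(pts):.4f} m^3")
```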

  14. Geologic map and structure sections of the Clear Lake Volcanics, Northern California

    USGS Publications Warehouse

    Hearn, B.C.; Donnelly-Nolan, J. M.; Goff, F.E.

    1995-01-01

    The Clear Lake Volcanics are located in the California Coast Ranges about 150 km north of San Francisco. This Quaternary volcanic field has erupted intermittently since 2.1 million years ago. This volcanic field is considered a high-threat volcanic system (Ewert and others, 2005). The adjacent Geysers geothermal field, the largest power-producing geothermal field in the world, is powered by the magmatic heat source for the volcanic field. This report consists of three sheets that include the geologic map, one table, two figures, three cross sections, description of map units, charts of standard and diagrammatic correlation of map units, and references. This map supersedes U.S. Geological Survey Open-File Report 76-751. Descriptions of map units are grouped by geographic area. Summaries of the evolution, chemistry, structure, and tectonic setting of the Clear Lake Volcanics are given in Hearn and others (1981) and Donnelly-Nolan and others (1981). The geology of parts of the area underlain by the Cache Formation is based on mapping by Rymer (1981); the geology of parts of the areas underlain by the Sonoma Volcanics, Franciscan assemblage, and Great Valley sequence is based on mapping by McLaughlin (1978). Volcanic compositional map units are basalt, basaltic andesite, andesite, dacite, rhyodacite, and rhyolite, based on SiO2 content. Included in this report are maps showing the distribution of volcanic rocks through time and a chart showing erupted volumes of different lava types through time. A table gives petrographic data for each map unit by mineral type, abundance, and size. Most ages are potassium-argon (K/Ar) ages determined for whole-rock samples and mineral separates by Donnelly-Nolan and others (1981), unless otherwise noted. A few ages are carbon-14 ages or were estimated from geologic relationships. Magnetic polarities are from Mankinen and others (1978; 1981) or were determined in the field by B.C. Hearn, Jr., using a portable fluxgate magnetometer. Thickness for most units is estimated from topographic relief except where drill-hole data were available.

  15. Tissue microstructure features derived from anomalous diffusion measurements in magnetic resonance imaging.

    PubMed

    Yu, Qiang; Reutens, David; O'Brien, Kieran; Vegh, Viktor

    2017-02-01

    Tissue microstructure features, namely axon radius and volume fraction, provide important information on the function of white matter pathways. These parameters vary on a scale much smaller than imaging voxels (microscale) yet influence the magnetic resonance imaging diffusion signal at the image voxel scale (macroscale) in an anomalous manner. Researchers have already mapped anomalous diffusion parameters from magnetic resonance imaging data, but macroscopic variations have not been related to microscale influences. With the aid of a tissue model, we aimed to connect anomalous diffusion parameters to axon radius and volume fraction using diffusion-weighted magnetic resonance imaging measurements. An ex vivo human brain experiment was performed to directly validate axon radius and volume fraction measurements in the human brain. These findings were validated using electron microscopy. Additionally, we performed an in vivo study on nine healthy participants to map axon radius and volume fraction along different regions of the corpus callosum projecting into various cortical areas identified using tractography. We found a clear relationship between anomalous diffusion parameters and axon radius and volume fraction. We were also able to accurately map the trend in axon radius along the corpus callosum, and in vivo findings resembled the low-high-low-high behaviour in axon radius demonstrated previously. Axon radius and volume fraction measurements can potentially be used in brain connectivity studies and to understand the implications of white matter structure in brain diseases and disorders. Hum Brain Mapp 38:1068-1081, 2017. © 2016 Wiley Periodicals, Inc.
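
    One widely used anomalous-diffusion signal model is the stretched exponential, S(b) = S0·exp(−(bD)^α), where α < 1 departs from ordinary monoexponential decay. The sketch below fits that form with SciPy as an illustration of estimating anomalous diffusion parameters from diffusion-weighted signals; it is not the authors' specific tissue model, and the b-values and noise level are assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def stretched_exp(b, s0, D, alpha):
    """Stretched-exponential (anomalous) diffusion model: S = S0 * exp(-(b*D)**alpha)."""
    return s0 * np.exp(-(b * D) ** alpha)

# Illustrative fit to synthetic diffusion-weighted signals (values are assumptions):
b_values = np.array([0, 500, 1000, 2000, 3000, 4000, 5000], dtype=float)  # s/mm^2
rng = np.random.default_rng(2)
signal = stretched_exp(b_values, 1.0, 0.7e-3, 0.8) + 0.01 * rng.standard_normal(b_values.size)

popt, _ = curve_fit(stretched_exp, b_values, signal,
                    p0=[1.0, 1.0e-3, 0.9],                  # initial guess: S0, D (mm^2/s), alpha
                    bounds=([0, 1e-5, 0.1], [2, 5e-3, 1.0]))
s0_fit, d_fit, alpha_fit = popt
print(f"S0={s0_fit:.3f}, D={d_fit:.2e} mm^2/s, alpha={alpha_fit:.2f}")
```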

  16. Dynamics of vascular volume and hemodilution of lactated Ringer’s solution in patients during induction of general and epidural anesthesia*

    PubMed Central

    Li, Yu-hong; Lou, Xian-feng; Bao, Fang-ping

    2006-01-01

    Objective: To investigate the dynamics of vascular volume and the plasma dilution of lactated Ringer’s solution in patients during the induction of general and epidural anesthesia. Methods: The hemodilution of i.v. infusion of 1000 ml of lactated Ringer’s solution over 60 min was studied in patients undergoing general (n=31) and epidural (n=22) anesthesia. Heart rate, arterial blood pressure and hemoglobin (Hb) concentration were measured every 5 min during the study. Surgery was not started until the study period had been completed. Results: General anesthesia caused a greater decrease in mean arterial blood pressure (MAP) (mean 15% versus 9%; P<0.01) and was consequently followed by more pronounced plasma dilution, blood volume expansion (VE) and blood volume expansion efficiency (VEE). A strong linear correlation between hemodilution and the reduction in MAP (r=−0.50; P<0.01) was found. At the end of infusion, patients undergoing general anesthesia retained 47% (SD 19%) of the infused fluid in the circulation, while those undergoing epidural anesthesia retained 29% (SD 13%) (P<0.001). Correspondingly, lower urine output (mean 89 ml versus 156 ml; P<0.05) and less extravascular expansion (454 ml versus 551 ml; P<0.05) were found during general anesthesia. Conclusion: We concluded that the induction of general anesthesia caused greater hemodilution, volume expansion and volume expansion efficiency than epidural anesthesia, an effect triggered only by the lower MAP. PMID:16909476

  17. SU-F-J-113: Multi-Atlas Based Automatic Organ Segmentation for Lung Radiotherapy Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, J; Han, J; Ailawadi, S

    Purpose: Normal organ segmentation is a time-consuming and labor-intensive step in lung radiotherapy treatment planning. The aim of this study is to evaluate the performance of a multi-atlas based segmentation approach for automatic organs at risk (OAR) delineation. Methods: Fifteen lung stereotactic body radiation therapy patients were randomly selected. Planning CT images and OAR contours of the heart (HT), aorta (AO), vena cava (VC), pulmonary trunk (PT), and esophagus (ES) were exported and used as reference and atlas sets. For automatic organ delineation for a given target CT, 1) all atlas sets were deformably warped to the target CT, 2) the deformed sets were accumulated and normalized to produce organ probability density (OPD) maps, and 3) the OPD maps were converted to contours via image thresholding. The optimal threshold for each organ was empirically determined by comparing the auto-segmented contours against their respective reference contours. The delineated results were evaluated by measuring contour similarity metrics: DICE, mean distance (MD), and true detection rate (TD), where DICE = (intersection volume/sum of two volumes) and TD = {1.0 - (false positive + false negative)/2.0}. The Diffeomorphic Demons algorithm was employed for CT-CT deformable image registrations. Results: Optimal thresholds were determined to be 0.53 for HT, 0.38 for AO, 0.28 for PT, 0.43 for VC, and 0.31 for ES. The mean similarity metrics (DICE[%], MD[mm], TD[%]) were (88, 3.2, 89) for HT, (79, 3.2, 82) for AO, (75, 2.7, 77) for PT, (68, 3.4, 73) for VC, and (51, 2.7, 60) for ES. Conclusion: The investigated multi-atlas based approach produced reliable segmentations for the organs with large and relatively clear boundaries (HT and AO). However, the detection of small and narrow organs with diffuse boundaries (ES) was challenging. Sophisticated atlas selection and multi-atlas fusion algorithms may further improve the quality of segmentations.
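
    The similarity metrics quoted above can be computed directly from binary masks. The sketch below uses the conventional Dice coefficient (with the factor of 2 that the abstract's shorthand omits) and interprets the false-positive and false-negative terms of TD as fractions of the reference volume; that normalisation, like the array names, is an assumption.

```python
import numpy as np

def dice(pred, ref):
    """Conventional Dice coefficient, 2*|A∩B| / (|A| + |B|)."""
    pred, ref = pred.astype(bool), ref.astype(bool)
    inter = np.logical_and(pred, ref).sum()
    return 2.0 * inter / (pred.sum() + ref.sum())

def true_detection_rate(pred, ref):
    """TD = 1 - (FP + FN)/2, with FP and FN expressed as fractions of the
    reference volume (this normalisation is an assumption, not stated above)."""
    pred, ref = pred.astype(bool), ref.astype(bool)
    fp = np.logical_and(pred, ~ref).sum() / ref.sum()
    fn = np.logical_and(~pred, ref).sum() / ref.sum()
    return 1.0 - (fp + fn) / 2.0

# Illustrative comparison of an auto-segmented mask against a reference mask:
ref = np.zeros((64, 64, 64), dtype=bool); ref[20:40, 20:40, 20:40] = True
pred = np.zeros_like(ref);                pred[22:42, 20:40, 20:40] = True
print(f"DICE={dice(pred, ref):.2f}, TD={true_detection_rate(pred, ref):.2f}")
```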

  18. WE-AB-209-06: Dynamic Collimator Trajectory Algorithm for Use in VMAT Treatment Deliveries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MacDonald, L; Thomas, C; Syme, A

    2016-06-15

    Purpose: To develop advanced dynamic collimator positioning algorithms for optimal beam’s-eye-view (BEV) fitting of targets in VMAT procedures, including multiple metastases stereotactic radiosurgery procedures. Methods: A trajectory algorithm was developed, which can dynamically modify the angle of the collimator as a function of VMAT control point to provide optimized collimation of target volume(s). Central to this algorithm is a concept denoted “whitespace”, defined as the area within the jaw-defined BEV field, outside of the PTV, and not shielded by the MLC when fit to the PTV. Calculating whitespace at all collimator angles and every control point, a two-dimensional topographical map depicting the tightness-of-fit of the MLC was generated. A variety of novel searching algorithms identified a number of candidate trajectories of continuous collimator motion. Ranking these candidate trajectories according to their accrued whitespace value produced an optimal solution for navigation of this map. Results: All trajectories were normalized to minimum possible (i.e. calculated without consideration of collimator motion constraints) accrued whitespace. On an acoustic neuroma case, a random walk algorithm generated a trajectory with 151% whitespace; random walk including a mandatory anchor point improved this to 148%; gradient search produced a trajectory with 137%; and bi-directional gradient search generated a trajectory with 130% whitespace. For comparison, a fixed collimator angle of 30° and 330° accumulated 272% and 228% of whitespace, respectively. The algorithm was tested on a clinical case with two metastases (single isocentre) and identified collimator angles that allow for simultaneous irradiation of the PTVs while minimizing normal tissue irradiation. Conclusion: Dynamic collimator trajectories have the potential to improve VMAT deliveries through increased efficiency and reduced normal tissue dose, especially in treatment of multiple cranial metastases, without significant safety concerns that hinder immediate clinical implementation.
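
    The core idea above is to search a 2D whitespace map (control point × collimator angle) for a continuous, low-cost trajectory. The authors used random-walk and gradient searches; the minimal dynamic-programming sketch below is a different, illustrative search under an assumed per-control-point rotation limit, not a reproduction of their algorithms.

```python
import numpy as np

def best_collimator_trajectory(whitespace, max_step=3):
    """Minimum-accrued-whitespace trajectory through a (n_control_points, n_angles) map.

    whitespace[k, a] is the whitespace area at control point k and collimator angle
    index a; max_step limits how many angle bins the collimator may rotate between
    consecutive control points (a stand-in for the real motion constraint).
    """
    n_cp, n_ang = whitespace.shape
    cost = np.full((n_cp, n_ang), np.inf)
    prev = np.zeros((n_cp, n_ang), dtype=int)
    cost[0] = whitespace[0]
    for k in range(1, n_cp):
        for a in range(n_ang):
            lo, hi = max(0, a - max_step), min(n_ang, a + max_step + 1)
            j = lo + int(np.argmin(cost[k - 1, lo:hi]))   # best reachable previous angle
            cost[k, a] = cost[k - 1, j] + whitespace[k, a]
            prev[k, a] = j
    # Backtrack the optimal angle sequence from the cheapest final angle.
    path = [int(np.argmin(cost[-1]))]
    for k in range(n_cp - 1, 0, -1):
        path.append(int(prev[k, path[-1]]))
    return path[::-1], float(cost[-1].min())

# Illustrative usage on a random whitespace map (178 control points x 90 angle bins):
rng = np.random.default_rng(3)
trajectory, accrued = best_collimator_trajectory(rng.random((178, 90)))
```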

  19. Operational monitoring of land-cover change using multitemporal remote sensing data

    NASA Astrophysics Data System (ADS)

    Rogan, John

    2005-11-01

    Land-cover change, manifested as land-cover modification and/or conversion, can occur at all spatial scales, and changes at local scales can have profound, cumulative impacts at broader scales. The implication of operational land-cover monitoring is that researchers have access to a continuous stream of remote sensing data, with the long-term goal of providing for consistent and repetitive mapping. Effective large area monitoring of land-cover (i.e., >1000 km2) can only be accomplished by using remotely sensed images as an indirect data source in land-cover change mapping and as a source for land-cover change model projections. Large area monitoring programs face several challenges: (1) choice of appropriate classification scheme/map legend over large, topographically and phenologically diverse areas; (2) issues concerning data consistency and map accuracy (i.e., calibration and validation); (3) very large data volumes; (4) time-consuming data processing and interpretation. Therefore, this dissertation research broadly addresses these challenges in the context of examining state-of-the-art image pre-processing, spectral enhancement, classification, and accuracy assessment techniques to assist the California Land-cover Mapping and Monitoring Program (LCMMP). The results of this dissertation revealed that spatially varying haze can be effectively corrected from Landsat data for the purposes of change detection. The Multitemporal Spectral Mixture Analysis (MSMA) spectral enhancement technique produced more accurate land-cover maps than those derived from the Multitemporal Kauth Thomas (MKT) transformation in northern and southern California study areas. A comparison of machine learning classifiers showed that Fuzzy ARTMAP outperformed two classification tree algorithms, based on map accuracy and algorithm robustness. Variation in spatial data error (positional and thematic) was explored in relation to environmental variables using geostatistical interpolation techniques. Finally, the land-cover modification maps generated for three time intervals (1985-1990, 1990-1996, 1996-2000), with nine change classes, revealed important variations in land-cover gain and loss between northern and southern California study areas.

  20. LiDAR observations of an Earth magmatic plumbing system as an analog for Venus and Mars distributed volcanism

    NASA Astrophysics Data System (ADS)

    Richardson, Jacob; Connor, Charles; Malservisi, Rocco; Bleacher, Jacob; Connor, Laura

    2014-05-01

    Clusters of tens to thousands of small volcanoes (diameters generally <30 km) are common features on the surface of Mars, Venus, and the Earth. These clusters may be described as distributed-style volcanism. Better characterizing the magmatic plumbing system of these clusters can constrain magma ascent processes as well as the regional magma production budget and heat flux beneath each cluster. Unfortunately, directly observing the plumbing systems of volcano clusters on Mars and Venus eludes our current geologic abilities. Because erosion exposes such systems at the Earth's surface, a better understanding of magmatic processes and migration can be achieved via field analysis. The terrestrial plumbing system of an eroded volcanic field may be a valuable planetary analog for Venus and Mars clusters. The magmatic plumbing system of a Pliocene-aged monogenetic volcanic field, emplaced at 0.8 km depth, is currently exposed as a sill and dike swarm in the San Rafael Desert of Central Utah, USA. The mafic bodies in this region intruded into Mesozoic sedimentary units and now make up the most erosion-resistant units as sills, dikes, and plug-like conduits. Light Detection and Ranging (LiDAR) can identify volcanic units (sills, dikes, and conduits) at high resolution, both geomorphologically and with near-infrared return intensity values. Two Terrestrial LiDAR Surveys and an Airborne LiDAR Survey have been carried out over the San Rafael volcanic swarm, producing a three dimensional point cloud over approximately 36 sq. km. From the point clouds of these surveys, 1-meter DEMs are produced and volcanic intrusions have been mapped. Here we present reconstructions of the volcanic intrusions of the San Rafael Swarm. We create this reconstruction by extrapolating mapped intrusions from the LiDAR surveys into a 3D space around the current surface. We compare the estimated intrusive volume to the estimated conduit density and estimates of extrusive volume at volcano clusters of similar density. The extrapolated reconstruction and conduit mapping provide a first-order estimate of the final intrusive/extrusive volume ratio for the now eroded volcanic field. Earth, Venus and Mars clusters are compared using Kernel Density Estimation (KDE), which objectively compares cluster area, complexity, and vent density per sq. km. We show that Martian clusters are less dense than Venus clusters, which in turn are less dense than those on Earth. KDE and previous models of intrusive morphology for Mars and Venus are here used to calibrate the San Rafael plumbing system model to clusters on the two planets. The results from the calibrated Mars and Venus plumbing system models can be compared to previous estimates of magma budget and intrusive/extrusive ratios on Venus and Mars.
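
    Kernel Density Estimation of vent locations, used above to compare cluster density between planets, can be illustrated with SciPy's Gaussian KDE. The coordinates, bandwidth choice, and output units below are illustrative assumptions; real work would use the conduit locations mapped from the LiDAR point clouds.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Illustrative vent/conduit coordinates in km (easting, northing), shape (2, N):
rng = np.random.default_rng(4)
vents = np.vstack([rng.normal(0, 2.0, 60), rng.normal(0, 1.5, 60)])

kde = gaussian_kde(vents)                       # Gaussian KDE with Scott's-rule bandwidth
xs, ys = np.meshgrid(np.linspace(-6, 6, 121), np.linspace(-6, 6, 121))
density = kde(np.vstack([xs.ravel(), ys.ravel()])).reshape(xs.shape)  # probability density per km^2

# Vent density (vents per square kilometre) at the cluster centre:
print(vents.shape[1] * kde([[0.0], [0.0]])[0])
```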

  1. Crossing the roof of the world: Trade in medicinal plants from Nepal to China.

    PubMed

    He, Jun; Yang, Bin; Dong, Min; Wang, Yunshang

    2018-04-26

    Trade in medicinal and aromatic plants (MAPs) between Nepal and China has taken place for centuries along the Tibetan border. While there is anecdotal evidence that economic development in China over the past decades, coupled with regional infrastructure development and increasing market integration, has substantially changed this trade, there are no current published studies investigating this, e.g. in terms of species and market structure. This knowledge gap impedes the development of public interventions, e.g. in support of sustainable trade. The primary objective of this study is to provide the first informative insights into the Nepal-China trade in MAPs, with particular emphasis on the value chain in Tibet. Data was collected from December 2015 to August 2016 in Nepal and Tibet. The data collection included standardized questionnaires to understand and map the value chain of MAPs, including the actors involved and key governance issues. Data was collected from Kathmandu-based MAPs wholesalers exporting from Nepal to China (n = 6) and from regional wholesalers (n = 40) based in Tibet. The questionnaires contained quantitative and qualitative components focusing on key elements of the MAPs value chain, e.g. the traded species and their values. This was augmented with qualitative interviews with Lhasa-based processors (n = 4) and government officials (n = 12) working in border controls, customs, and/or drug administration. We also collected official statistics on the Nepal-China MAPs trade and conducted a workshop in Lhasa with traders and government officials to discuss the nature of the Nepal-China MAPs trade. The Nepal-China MAPs trade boomed after 2011, when the value of traded plants increased more than nine-fold. This rapid increase reflected both a broader species composition and higher unit prices in response to increasing demand from China. The trade expansion was also driven by increasing demand in China and facilitated by improved infrastructure in Nepal and Tibet, including direct flight connections for the transport of high-value products. Official records on both sides of the border under-document both the value and the volume of the trade, implying that much of it is extra-legal. The value chain is thus governed by both legal and extra-legal mechanisms. This study provides the first structured overview of the current trade in commercial MAPs from Nepal to China. While the trade is thriving, growing in both volume and value to the benefit of producers in Nepal and consumers in China, there is little empirical data or research to support policy formulation on sustainable trading. This study provides informative insights into the value chain and makes public policy recommendations to increase the transparency and sustainability of trade by improving traditional border markets and removing market barriers. Copyright © 2018 Elsevier B.V. All rights reserved.

  2. Cognitive Implications of Deep Gray Matter Iron in Multiple Sclerosis.

    PubMed

    Fujiwara, E; Kmech, J A; Cobzas, D; Sun, H; Seres, P; Blevins, G; Wilman, A H

    2017-05-01

    Deep gray matter iron accumulation is increasingly recognized in association with multiple sclerosis and can be measured in vivo with MR imaging. The cognitive implications of this pathology are not well-understood, especially vis-à-vis deep gray matter atrophy. Our aim was to investigate the relationships between cognition and deep gray matter iron in MS by using 2 MR imaging-based iron-susceptibility measures. Forty patients with multiple sclerosis (relapsing-remitting, n = 16; progressive, n = 24) and 27 healthy controls were imaged at 4.7T by using the transverse relaxation rate and quantitative susceptibility mapping. The transverse relaxation rate and quantitative susceptibility mapping values and volumes (atrophy) of the caudate, putamen, globus pallidus, and thalamus were determined by multiatlas segmentation. Cognition was assessed with the Brief Repeatable Battery of Neuropsychological Tests. Relationships between cognition and deep gray matter iron were examined by hierarchic regressions. Compared with controls, patients showed reduced memory (P < .001) and processing speed (P = .02) and smaller putamen (P < .001), globus pallidus (P = .002), and thalamic volumes (P < .001). Quantitative susceptibility mapping values were increased in patients compared with controls in the putamen (P = .003) and globus pallidus (P = .003). In patients only, thalamus (P < .001) and putamen (P = .04) volumes were related to cognitive performance. After we controlled for volume effects, quantitative susceptibility mapping values in the globus pallidus (P = .03; trend for transverse relaxation rate, P = .10) were still related to cognition. Quantitative susceptibility mapping was more sensitive compared with the transverse relaxation rate in detecting deep gray matter iron accumulation in the current multiple sclerosis cohort. Atrophy and iron accumulation in deep gray matter both have negative but separable relationships to cognition in multiple sclerosis. © 2017 by American Journal of Neuroradiology.

  3. Investigating the state-of-the-art in whole-body MR-based attenuation correction: an intra-individual, inter-system, inventory study on three clinical PET/MR systems.

    PubMed

    Beyer, Thomas; Lassen, Martin L; Boellaard, Ronald; Delso, Gaspar; Yaqub, Maqsood; Sattler, Bernhard; Quick, Harald H

    2016-02-01

    We assess inter- and intra-subject variability of magnetic resonance (MR)-based attenuation maps (MRμMaps) of human subjects for state-of-the-art positron emission tomography (PET)/MR imaging systems. Four healthy male subjects underwent repeated MR imaging with Siemens Biograph mMR, Philips Ingenuity TF and GE SIGNA PET/MR systems using product-specific MR sequences and image processing algorithms for generating MRμMaps. Total lung volumes and mean attenuation values in nine thoracic reference regions were calculated. Linear regression was used for comparing lung volumes on MRμMaps. Intra- and inter-system variability was investigated using a mixed effects model. Intra-system variability was seen for the lung volume of some subjects (p = 0.29). Mean attenuation values across subjects were significantly different (p < 0.001) due to different segmentations of the trachea. Differences in the attenuation values caused noticeable intra-individual and inter-system differences that translated into a subsequent bias of the corrected PET activity values, as verified by independent simulations. Significant differences of MRμMaps generated for the same subjects but different PET/MR systems resulted in differences in attenuation correction factors, particularly in the thorax. These differences currently limit the quantitative use of PET/MR in multi-center imaging studies.

  4. Towards large-scale mapping of urban three-dimensional structure using Landsat imagery and global elevation datasets

    NASA Astrophysics Data System (ADS)

    Wang, P.; Huang, C.

    2017-12-01

    The three-dimensional (3D) structure of buildings and infrastructures is fundamental to understanding and modelling the impacts and challenges of urbanization in terms of energy use, carbon emissions, and earthquake vulnerabilities. However, spatially detailed maps of urban 3D structure have been scarce, particularly in fast-changing developing countries. We present here a novel methodology to map the volume of buildings and infrastructures at 30 meter resolution using a synergy of Landsat imagery and openly available global digital surface models (DSMs), including the Shuttle Radar Topography Mission (SRTM), ASTER Global Digital Elevation Map (GDEM), ALOS World 3D - 30m (AW3D30), and the recently released global DSM from the TanDEM-X mission. Our method builds on the concept of an object-based height profile to extract height metrics from the DSMs and uses a machine learning algorithm to predict height and volume from the height metrics. We tested this algorithm across the whole of England and assessed our results using lidar measurements in 25 English cities. Our initial assessments achieved an RMSE of 1.4 m (R2 = 0.72) for building height and an RMSE of 1208.7 m3 (R2 = 0.69) for building volume, demonstrating the potential of large-scale applications and fully automated mapping of urban structure.
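
    The abstract names neither the specific height metrics nor the learner, so the sketch below is an assumption-laden illustration of the general workflow: derive height metrics per 30 m cell from a DSM/terrain pair, then regress building height with a machine-learning model (a random forest via scikit-learn is assumed here).

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def height_metrics(dsm_patch, dtm_patch):
    """Height metrics for one 30 m cell from a DSM/terrain pair; the specific
    percentile set is an assumption, not taken from the abstract."""
    nh = (dsm_patch - dtm_patch).ravel()                  # normalised heights above terrain, metres
    return np.r_[np.percentile(nh, [10, 25, 50, 75, 90, 100]), nh.mean(), nh.std()]

# Build an illustrative training set from synthetic cells; in the real workflow the
# DSMs would be SRTM/ASTER GDEM/AW3D30/TanDEM-X and the targets lidar building heights.
rng = np.random.default_rng(5)
cells, targets = [], []
for _ in range(300):
    true_height = rng.uniform(3.0, 30.0)                  # nominal building height for this cell
    dtm = rng.normal(50.0, 0.5, (10, 10))                 # terrain elevations
    dsm = dtm + rng.uniform(0.0, true_height, (10, 10))   # surface = terrain + structures
    cells.append(height_metrics(dsm, dtm))
    targets.append(true_height)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(np.array(cells), targets)
print(model.predict(np.array(cells[:3])))                 # predicted heights for three cells
```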

  5. Role of angiotensin in renal sympathetic activation in cirrhotic rats.

    PubMed

    Voigt, M D; Jones, S Y; DiBona, G F

    1999-08-01

    Central nervous system (CNS) renin-angiotensin activity influences the basal level of renal sympathetic nerve activity (RSNA) and its reflex regulation. The effect of type 1 angiotensin II (ANG II)-receptor antagonist treatment (losartan) on cardiac baroreflex regulation of RSNA and renal sodium handling was examined in rats with cirrhosis due to common bile duct ligation (CBDL). Basal levels of heart rate, mean arterial pressure (MAP), RSNA, and urinary sodium excretion were not affected by intracerebroventricular administration of either losartan or vehicle to CBDL rats. After acute intravenous isotonic saline loading (10% body wt) in vehicle-treated CBDL rats, MAP was unchanged and the decrease in RSNA seen in normal rats did not occur. However, in losartan-treated CBDL rats, there were significant concurrent but transient decreases in MAP (-20 +/- 2 mmHg) and RSNA (-25 +/- 3%). The natriuretic response to acute volume loading in losartan-treated CBDL rats was significantly less than that in vehicle-treated CBDL rats only at those time points where there were significant decreases in MAP. Antagonism of CNS ANG II type 1 receptors augments the renal sympathoinhibitory response to acute volume loading in CBDL. However, the natriuretic response to the acute volume loading is not improved, likely due to the strong antinatriuretic influence of the concomitant marked decrease in MAP (renal perfusion pressure) mediated by widespread sympathetic withdrawal from the systemic vasculature.

  6. Using satellite data in map design and production

    USGS Publications Warehouse

    Hutchinson, John A.

    2002-01-01

    Satellite image maps have been produced by the U.S. Geological Survey (USGS) since shortly after the launch of the first Landsat satellite in 1972. Over the years, the use of image data to design and produce maps has developed from a manual and photographic process to one that incorporates geographic information systems, desktop publishing, and digital prepress techniques. At the same time, the content of most image-based maps produced by the USGS has shifted from raw image data to land cover or other information layers derived from satellite imagery, often portrayed in combination with shaded relief.

  7. Forecasting inundation from debris flows that grow during travel, with application to the Oregon Coast Range, USA

    USGS Publications Warehouse

    Reid, Mark E.; Coe, Jeffrey A.; Brien, Dianne

    2016-01-01

    Many debris flows increase in volume as they travel downstream, enhancing their mobility and hazard. Volumetric growth can result from diverse physical processes, such as channel sediment entrainment, stream bank collapse, adjacent landsliding, hillslope erosion and rilling, and coalescence of multiple debris flows; incorporating these varied phenomena into physics-based debris-flow models is challenging. As an alternative, we embedded effects of debris-flow growth into an empirical/statistical approach to forecast potential inundation areas within digital landscapes in a GIS framework. Our approach used an empirical debris-growth function to account for the effects of growth phenomena. We applied this methodology to a debris-flow-prone area in the Oregon Coast Range, USA, where detailed mapping revealed areas of erosion and deposition along paths of debris flows that occurred during a large storm in 1996. Erosion was predominant in stream channels with slopes > 5°. Using pre- and post-event aerial photography, we derived upslope contributing area and channel-length growth factors. Our method reproduced the observed inundation patterns produced by individual debris flows; it also generated reproducible, objective potential inundation maps for entire drainage networks. These maps better matched observations than those using previous methods that focus on proximal or distal regions of a drainage network.
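
    The abstract describes only the concept of an empirical growth function applied along the drainage network, with erosion concentrated on channel reaches steeper than about 5°. The sketch below is a generic illustration of that idea; the linear growth-per-length rule, the LAHARZ-style area-volume scaling, and all coefficients are assumptions, not the authors' calibrated relations.

```python
import numpy as np

def grown_volume(initial_volume, channel_lengths, growth_per_km, slopes, erode_above_deg=5.0):
    """Accumulate debris-flow volume along a channel path.

    Growth is applied only on reaches steeper than `erode_above_deg` (the abstract
    reports erosion predominantly in channels with slopes > 5 degrees); the linear
    growth-per-length rule is an illustrative assumption.
    """
    growing = slopes > erode_above_deg
    return initial_volume + growth_per_km * np.sum(channel_lengths[growing])

def planimetric_inundation_area(volume_m3, coeff=200.0):
    """Empirical area-volume scaling B = c * V**(2/3); the coefficient is a placeholder."""
    return coeff * volume_m3 ** (2.0 / 3.0)

# Illustrative run for one flow path split into 1 km reaches:
lengths = np.full(5, 1.0)                      # reach lengths, km
slopes = np.array([12.0, 9.0, 6.0, 4.0, 2.0])  # reach slopes, degrees
v = grown_volume(500.0, lengths, growth_per_km=800.0, slopes=slopes)   # m^3
print(v, planimetric_inundation_area(v))
```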

  8. Action-based Dynamical Modeling for the Milky Way Disk: The Influence of Spiral Arms

    NASA Astrophysics Data System (ADS)

    Trick, Wilma H.; Bovy, Jo; D'Onghia, Elena; Rix, Hans-Walter

    2017-04-01

    RoadMapping is a dynamical modeling machinery developed to constrain the Milky Way’s (MW) gravitational potential by simultaneously fitting an axisymmetric parametrized potential and an action-based orbit distribution function (DF) to discrete 6D phase-space measurements of stars in the Galactic disk. In this work, we demonstrate RoadMapping's robustness in the presence of spiral arms by modeling data drawn from an N-body simulation snapshot of a disk-dominated galaxy of MW mass with strong spiral arms (but no bar), exploring survey volumes with radii 500 pc ≤ r_max ≤ 5 kpc. The potential constraints are very robust, even though we use a simple action-based DF, the quasi-isothermal DF. The best-fit RoadMapping model always recovers the correct gravitational forces where most of the stars that entered the analysis are located, even for small volumes. For data from large survey volumes, RoadMapping finds axisymmetric models that average well over the spiral arms. Unsurprisingly, the models are slightly biased by the excess of stars in the spiral arms. Gravitational potential models derived from survey volumes with at least r_max = 3 kpc can be reliably extrapolated to larger volumes. However, a large radial survey extent, r_max ~ 5 kpc, is needed to correctly recover the halo scale length. In general, the recovery and extrapolability of potentials inferred from data sets that were drawn from inter-arm regions appear to be better than those of data sets drawn from spiral arms. Our analysis implies that building axisymmetric models for the Galaxy with upcoming Gaia data will lead to sensible and robust approximations of the MW’s potential.

  9. Pressure-dependent changes in haematocrit and plasma volume during anaesthesia, a randomised clinical trial.

    PubMed

    Damén, T; Reinsfelt, B; Redfors, B; Nygren, A

    2016-05-01

    Induction of general anaesthesia has been shown to cause haemodilution and an increase in plasma volume. The aim of this study was to evaluate whether prevention of hypotension during anaesthesia induction could avoid haemodilution. Twenty-four cardiac surgery patients, 66 ± 10 years, were randomised either to receive norepinephrine in a dose needed to maintain mean arterial blood pressure (MAP) at pre-anaesthesia levels after induction or to a control group that received vasopressor if MAP decreased below 60 mmHg. No fluids were infused. Changes in plasma volume were calculated with the standard formula: 100 × (Hct(pre)/Hct(post) - 1)/(1 - Hct(pre)). Arterial blood gas was analysed every 10 minutes and non-invasive haemoglobin (SpHb) was measured continuously. Pre-anaesthesia MAP was 98 ± 7 mmHg. Ten minutes after anaesthesia induction, the haematocrit decreased by 5.0 ± 2.5% in the control group compared with 1.2 ± 1.4% in the intervention group, which corresponds to increases in plasma volume of 310 ml and 85 ml, respectively. MAP decreased to 69 ± 15 mmHg in the control group compared to 92 ± 10 mmHg in the intervention group. The difference was maintained throughout the 70 min intervention period. The change in haemoglobin level measured by blood gas analysis could not be detected by SpHb measurement. The mean bias between the SpHb and blood gas haemoglobin was 15 g/l. During anaesthesia induction, haematocrit decreases and plasma volume increases early and parallel to a decrease in blood pressure. This autotransfusion is blunted when blood pressure is maintained at pre-induction levels with norepinephrine. © 2016 The Acta Anaesthesiologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.
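
    The plasma-volume change formula quoted above can be applied directly to serial haematocrit measurements; a minimal sketch (the example haematocrit values are illustrative, not from the study):

```python
def plasma_volume_change_percent(hct_pre, hct_post):
    """Percentage change in plasma volume from pre/post haematocrit, using the
    formula quoted in the abstract: 100 * (Hct_pre / Hct_post - 1) / (1 - Hct_pre)."""
    return 100.0 * (hct_pre / hct_post - 1.0) / (1.0 - hct_pre)

# Illustrative values: a haematocrit fall from 0.40 to 0.38 after induction
print(round(plasma_volume_change_percent(0.40, 0.38), 1))  # ~8.8 % plasma volume increase
```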

  10. Ada Integrated Environment III Computer Program Development Specification. Volume III. Ada Optimizing Compiler.

    DTIC Science & Technology

    1981-12-01

    [OCR fragment from the specification appendix; recoverable content only:] The compiler produces map files named library-file.library-unit{.subunit}.SYMAP (Symbol Map), library-file.library-unit{.subunit}.SMAP (Statement Map), and library-file.library-unit{.subunit}.TMAP (Type Map), generated by the SYMAP, SMAP, and TMAP components of the code generator. Section A.3.5 describes the PUNIT command; Example A-3 gives a compiler command stream for the code generator (Texas Instruments, Ada Optimizing Compiler).

  11. Automated segmentation of chronic stroke lesions using LINDA: Lesion Identification with Neighborhood Data Analysis

    PubMed Central

    Pustina, Dorian; Coslett, H. Branch; Turkeltaub, Peter E.; Tustison, Nicholas; Schwartz, Myrna F.; Avants, Brian

    2015-01-01

    The gold standard for identifying stroke lesions is manual tracing, a method that is known to be observer-dependent and time-consuming, thus impractical for big data studies. We propose LINDA (Lesion Identification with Neighborhood Data Analysis), an automated segmentation algorithm capable of learning the relationship between existing manual segmentations and a single T1-weighted MRI. A dataset of 60 left hemispheric chronic stroke patients is used to build the method and test it with k-fold and leave-one-out procedures. With respect to manual tracings, predicted lesion maps showed a mean Dice overlap of 0.696 ± 0.16, a Hausdorff distance of 17.9 ± 9.8 mm, and an average displacement of 2.54 ± 1.38 mm. The manual and predicted lesion volumes correlated at r = 0.961. An additional dataset of 45 patients was utilized to test LINDA with independent data, achieving high accuracy rates and confirming its cross-institutional applicability. To investigate the cost of moving from manual tracings to automated segmentation, we performed comparative lesion-to-symptom mapping (LSM) on five behavioral scores. Predicted and manual lesions produced similar neuro-cognitive maps, albeit with some discrepancies, which are discussed. Of note, region-wise LSM was more robust to the prediction error than voxel-wise LSM. Our results show that, while several limitations exist, our current results compete with or exceed the state-of-the-art, producing consistent predictions, very low failure rates, and transferable knowledge between labs. This work also establishes a new viewpoint on evaluating automated methods not only with segmentation accuracy but also with brain-behavior relationships. LINDA is made available online with trained models from over 100 patients. PMID:26756101

  12. Computer-composite mapping for geologists

    USGS Publications Warehouse

    van Driel, J.N.

    1980-01-01

    A computer program for overlaying maps has been tested and evaluated as a means for producing geologic derivative maps. Four maps of the Sugar House Quadrangle, Utah, were combined, using the Multi-Scale Data Analysis and Mapping Program, in a single composite map that shows the relative stability of the land surface during earthquakes. Computer-composite mapping can provide geologists with a powerful analytical tool and a flexible graphic display technique. Digitized map units can be shown singly, grouped with different units from the same map, or combined with units from other source maps to produce composite maps. The mapping program permits the user to assign various values to the map units and to specify symbology for the final map. Because of its flexible storage, easy manipulation, and capabilities of graphic output, the composite-mapping technique can readily be applied to mapping projects in sedimentary and crystalline terranes, as well as to maps showing mineral resource potential. © 1980 Springer-Verlag New York Inc.

  13. The New Albany Shale Petroleum System, Illinois Basin - Data and Map Image Archive from the Material-Balance Assessment

    USGS Publications Warehouse

    Higley, Debra K.; Henry, M.E.; Lewan, M.D.; Pitman, Janet K.

    2003-01-01

    The data files and explanations presented in this report were used to generate published material-balance estimates of the amounts of petroleum 1) expelled from a source rock, and the sum of 2) petroleum discovered in-place plus that lost due to 3) secondary migration within, or leakage or erosion from, a petroleum system. This study includes assessment of cumulative production, known petroleum volume, and original oil in place for hydrocarbons that were generated from the New Albany Shale source rocks. More than 4.00 billion barrels of oil (BBO) have been produced from Pennsylvanian-, Mississippian-, Devonian-, and Silurian-age reservoirs in the New Albany Shale petroleum system. Known petroleum volume is 4.16 BBO; the average recovery factor is 103.9% of the current cumulative production. Known petroleum volume of oil is 36.22% of the total original oil in place of 11.45 BBO. More than 140.4 BBO have been generated from the Upper Devonian and Lower Mississippian New Albany Shale in the Illinois Basin. Approximately 86.29 billion barrels of oil that was trapped south of the Cottage Grove fault system were lost by erosion of reservoir intervals. The remaining 54.15 BBO are 21% of the hydrocarbons that were generated in the basin and are accounted for using production data. Included in this publication are 2D maps that show the distribution of production for different formations versus the Rock-Eval pyrolysis hydrogen-indices (HI) contours, and 3D images that show the close association between burial depth and HI values. The primary vertical migration pathway of oil and gas was through faults and fractures into overlying reservoir strata. About 66% of the produced oil is located within the generative basin, which is outlined by an HI contour of 400. The remaining production is concentrated within 30 miles (50 km) outside the 400 HI contour. The generative basin is subdivided by contours of progressively lower hydrogen indices that represent increased levels of thermal maturity and generative capacity of New Albany Shale source rocks. The generative basin was also divided into seven oil-migration catchments. The catchments were determined using a surface-flow hydrologic model with contoured HI values as input to the model.

  14. Image registration with auto-mapped control volumes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schreibmann, Eduard; Xing Lei

    2006-04-15

    Many image registration algorithms rely on the use of homologous control points on the two input image sets to be registered. In reality, the interactive identification of the control points on both images is tedious, difficult, and often a source of error. We propose a two-step algorithm to automatically identify homologous regions that are used as a priori information during the image registration procedure. First, a number of small control volumes having distinct anatomical features are identified on the model image in a somewhat arbitrary fashion. Instead of attempting to find their correspondences in the reference image through user interaction, in the proposed method each of the control regions is mapped to the corresponding part of the reference image by using an automated image registration algorithm. A normalized cross-correlation (NCC) function or mutual information was used as the auto-mapping metric and a limited memory Broyden-Fletcher-Goldfarb-Shanno algorithm (L-BFGS) was employed to optimize the function to find the optimal mapping. For rigid registration, the transformation parameters of the system are obtained by averaging those derived from the individual control volumes. In our deformable calculation, the mapped control volumes are treated as the nodes or control points with known positions on the two images. If the number of control volumes is not enough to cover the whole image to be registered, additional nodes are placed on the model image and then located on the reference image in a manner similar to the conventional BSpline deformable calculation. For deformable registration, the established correspondence by the auto-mapped control volumes provides valuable guidance for the registration calculation and greatly reduces the dimensionality of the problem. The two-step registration was applied to three rigid registration cases (two PET-CT registrations and a brain MRI-CT registration) and one deformable registration of inhale and exhale phases of a lung 4D CT. Algorithm convergence was confirmed by starting the registration calculations from a large number of initial transformation parameters. An accuracy of approximately 2 mm was achieved for both deformable and rigid registration. The proposed image registration method greatly reduces the complexity involved in the determination of homologous control points and allows us to minimize the subjectivity and uncertainty associated with the current manual interactive approach. Patient studies have indicated that the two-step registration technique is fast, reliable, and provides a valuable tool to facilitate both rigid and nonrigid image registrations.
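
    A heavily reduced version of the control-volume mapping step can be sketched as translation-only registration that maximises NCC with SciPy's L-BFGS-B optimiser. This is an illustrative simplification of the method described above (which also supports mutual information and full rigid/BSpline transforms); the data, smoothing, and function names are assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, shift as nd_shift
from scipy.optimize import minimize

def ncc(a, b):
    """Normalized cross-correlation between two equally sized image volumes."""
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    return float(np.mean(a * b))

def map_control_volume(control_vol, reference_vol, init_offset=(0.0, 0.0, 0.0)):
    """Translation-only mapping of a control volume onto the reference image by
    maximising NCC with L-BFGS-B (a simplification of the published approach)."""
    def cost(offset):
        moved = nd_shift(reference_vol, offset, order=1)   # resample reference under the trial offset
        return -ncc(control_vol, moved)                    # maximise NCC by minimising its negative
    return minimize(cost, x0=np.asarray(init_offset), method="L-BFGS-B").x

# Illustrative usage: recover a known 2-voxel shift on a smooth synthetic volume.
rng = np.random.default_rng(6)
reference = gaussian_filter(rng.random((32, 32, 32)), sigma=3.0)
control = nd_shift(reference, (2.0, 0.0, 0.0), order=1)
print(map_control_volume(control, reference))              # expect an offset near (2, 0, 0)
```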

  15. Multiadaptive Plan (MAP) IMRT to Accommodate Independent Movement of the Prostate and Pelvic Lymph Nodes

    DTIC Science & Technology

    2013-06-01

    [Report-documentation fragment; recoverable content only:] Award W81XWH-08-1-0358, "Multi-Adaptive Plan (MAP) IMRT to Accommodate Independent Movement of the Prostate and Pelvic Lymph Nodes." The project developed MAP IMRT to accommodate independent movement of the two targeted tumor volumes and evaluated two adaptive [text truncated in source].

  16. High-resolution geological mapping at 3D Environments: A case study from the fold-and-thrust belt in northern Taiwan

    NASA Astrophysics Data System (ADS)

    Chan, Y. C.; Shih, N. C.; Hsieh, Y. C.

    2016-12-01

    Geologic maps have provided fundamental information for many scientific and engineering applications in human societies. Geologic maps directly influence the reliability of research results or the robustness of engineering projects. In the past, geologic maps were mainly produced by field geologists through direct field investigations and 2D topographic maps. However, the quality of traditional geologic maps was significantly compromised by field conditions, particularly when the map area is covered by heavy forest canopies. Recent developments in airborne LiDAR technology may virtually remove trees or buildings, thus providing a useful data set for improving geological mapping. Because high-quality topographic information still needs to be interpreted in terms of geology, there are many fundamental questions regarding how to best apply the data set for high-resolution geological mapping. In this study, we aim to test the quality and reliability of high-resolution geologic maps produced by recent technological methods through an example from the fold-and-thrust belt in northern Taiwan. We performed the geological mapping by applying the LiDAR-derived DEM, self-developed program tools and many layers of relevant information in interactive 3D environments. Our mapping results indicate that the proposed methods will considerably improve the quality and consistency of the geologic maps. The study also shows that in order to gain consistent mapping results, future high-resolution geologic maps should be produced in interactive 3D environments on the basis of existing geologic maps.

  17. DistMap: a toolkit for distributed short read mapping on a Hadoop cluster.

    PubMed

    Pandey, Ram Vinay; Schlötterer, Christian

    2013-01-01

    With the rapid and steady increase of next generation sequencing data output, the mapping of short reads has become a major data analysis bottleneck. On a single computer, it can take several days to map the vast quantity of reads produced from a single Illumina HiSeq lane. In an attempt to ameliorate this bottleneck we present a new tool, DistMap - a modular, scalable and integrated workflow to map reads in the Hadoop distributed computing framework. DistMap is easy to use, currently supports nine different short read mapping tools and can be run on all Unix-based operating systems. It accepts reads in FASTQ format as input and provides mapped reads in a SAM/BAM format. DistMap supports both paired-end and single-end reads thereby allowing the mapping of read data produced by different sequencing platforms. DistMap is available from http://code.google.com/p/distmap/

  18. DistMap: A Toolkit for Distributed Short Read Mapping on a Hadoop Cluster

    PubMed Central

    Pandey, Ram Vinay; Schlötterer, Christian

    2013-01-01

    With the rapid and steady increase of next generation sequencing data output, the mapping of short reads has become a major data analysis bottleneck. On a single computer, it can take several days to map the vast quantity of reads produced from a single Illumina HiSeq lane. In an attempt to ameliorate this bottleneck we present a new tool, DistMap - a modular, scalable and integrated workflow to map reads in the Hadoop distributed computing framework. DistMap is easy to use, currently supports nine different short read mapping tools and can be run on all Unix-based operating systems. It accepts reads in FASTQ format as input and provides mapped reads in a SAM/BAM format. DistMap supports both paired-end and single-end reads thereby allowing the mapping of read data produced by different sequencing platforms. DistMap is available from http://code.google.com/p/distmap/ PMID:24009693

  19. The south-central United States magnetic anomaly

    NASA Technical Reports Server (NTRS)

    Hinze, W. J.; Braile, L. W. (Principal Investigator); Starich, P. J.

    1984-01-01

    The South-Central United States Magnetic Anomaly is the most prominent positive feature in the MAGSAT scalar magnetic field over North America. The anomaly correlates with increased crustal thickness, above average crustal velocity, negative free air gravity anomalies and an extensive zone of Middle Proterozoic anorogenic felsic basement rocks. Spherical dipole source inversion of the MAGSAT scalar data and subsequent calculation of reduced to pole and derivative maps provide constraints for a crustal magnetic model which corresponds geographically to the extensive Middle Proterozoic felsic rocks trending northeasterly across the United States. These felsic rocks contain insufficient magnetization or volume to produce the anomaly, but are rather indicative of a crustal zone which was disturbed during a Middle Proterozoic thermal event which enriched magnetic material deep in the crust.

  20. Use of satellite data in volcano monitoring

    NASA Technical Reports Server (NTRS)

    Mcclelland, Lindsay

    1987-01-01

    It is argued that Total Ozone Mapping Spectrometer (TOMS) data, especially data on sulfur dioxide detection in volcanic clouds, and weather satellite data complement each other. TOMS data is most useful for discovering previously unknown eruptions and indicating a minimum volume of SO₂ produced by a given eruption. Once an eruption has been reported, weather satellite data can be used to accurately monitor its progress. To be used effectively, these data need to be analyzed jointly and in real time. Toward this end, it is hoped that full and timely utilization can be made of existing TOMS data, a polar orbiting TOMS can be launched in the near future, and that TOMS type instruments can be included on future geostationary satellites.

  1. Destroying the Art of Cartography: Teaching Illustrations Using ArcView.

    ERIC Educational Resources Information Center

    Hill, Miriam Helen

    Introductory courses in Geographic Information Systems and cartography cover the fundamentals of map design. Students are given guidelines for producing a good map, but visual demonstrations are much more impressive. ArcView was used to produce illustrations of bad mapping practices and placed in a Microsoft PowerPoint presentation to demonstrate…

  2. Comparing the efficiency of digital and conventional soil mapping to predict soil types in a semi-arid region in Iran

    NASA Astrophysics Data System (ADS)

    Zeraatpisheh, Mojtaba; Ayoubi, Shamsollah; Jafari, Azam; Finke, Peter

    2017-05-01

    The efficiency of different digital and conventional soil mapping approaches to produce categorical maps of soil types is determined by cost, sample size, accuracy and the selected taxonomic level. The efficiency of digital and conventional soil mapping approaches was examined in the semi-arid region of Borujen, central Iran. This research aimed to (i) compare two digital soil mapping approaches, multinomial logistic regression and random forest, with the conventional soil mapping approach at four soil taxonomic levels (order, suborder, great group and subgroup levels), (ii) validate the predicted soil maps by the same validation data set to determine the best method for producing the soil maps, and (iii) select the best soil taxonomic level by different approaches at three sample sizes (100, 80, and 60 point observations), in two scenarios with and without a geomorphology map as a spatial covariate. In most predicted maps, using both digital soil mapping approaches, the best results were obtained using the combination of terrain attributes and the geomorphology map, although differences between the scenarios with and without the geomorphology map were not significant. Employing the geomorphology map increased map purity and the Kappa index, and led to a decrease in the 'noisiness' of soil maps. Multinomial logistic regression had better performance at higher taxonomic levels (order and suborder levels); however, random forest showed better performance at lower taxonomic levels (great group and subgroup levels). Multinomial logistic regression was less sensitive than random forest to a decrease in the number of training observations. The conventional soil mapping method produced a map with a larger minimum polygon size because of traditional cartographic criteria used to make the 1:100,000 geological map (on which the conventional soil map was largely based). Likewise, the conventional soil map also had a larger average polygon size, which resulted in a lower level of detail. Multinomial logistic regression at the order level (map purity of 0.80), random forest at the suborder (map purity of 0.72) and great group level (map purity of 0.60), and conventional soil mapping at the subgroup level (map purity of 0.48) produced the most accurate maps in the study area. The multinomial logistic regression method was identified as the most effective approach based on a combined index of map purity, map information content, and map production cost. The combined index also showed that a smaller sample size led to a preference for the order level, while a larger sample size led to a preference for the great group level.
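
    The two digital approaches compared above are standard supervised classifiers. Neither the covariate set nor the hyperparameters are given in the abstract, so the sketch below only illustrates the comparison on synthetic stand-in covariates using scikit-learn; plain cross-validated accuracy is used as a rough analogue of the study's map purity.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for terrain attributes (slope, curvature, wetness index, ...)
# plus a categorical geomorphology covariate; real inputs would come from the DEM
# and the geomorphology map mentioned above.
rng = np.random.default_rng(7)
n = 100                                            # matches the largest sample size used in the study
X = np.column_stack([rng.random((n, 5)), rng.integers(0, 4, n)])
y = rng.integers(0, 3, n)                          # soil classes at a chosen taxonomic level

logit = LogisticRegression(max_iter=1000)          # lbfgs solver handles the multinomial case
forest = RandomForestClassifier(n_estimators=300, random_state=0)

print("multinomial logistic:", cross_val_score(logit, X, y, cv=5).mean())
print("random forest:       ", cross_val_score(forest, X, y, cv=5).mean())
```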

  3. Forces and dynamics in epithelial domes of controlled size and shape

    NASA Astrophysics Data System (ADS)

    Latorre-Ibars, Ernest; Casares, Laura; Gomez-Gonzalez, Manuel; Uroz, Marina; Arroyo, Marino; Trepat, Xavier

    Mechanobiology of epithelia plays a central role in morphogenesis, wound healing, and tumor progression. Its current understanding relies on mechanical measurements on flat epithelial layers. However, most epithelia in vivo exhibit a curved 3D shape enclosing a pressurized lumen. Using soft micropatterned substrates, we produce massively parallel arrays of epithelial domes with controlled size and basal shape. We measure epithelial traction, tension, and luminal pressure in epithelial domes. The local stress tensor on the freestanding epithelial membrane is then mapped by combining measured luminal pressure and local curvature. We show that tension and cell shape are highly anisotropic and vary along the meridional position of the domes. Finally, we establish constitutive relations between shape, tension, and pressure during perturbations of the contractile machinery, osmotic shocks, and spontaneous fluctuations of dome volume. Our findings contradict a description of the epithelium as a fluid capillary surface. Cells in the dome are unable to relax into a uniform and isotropic tensional state through sub- and supra-cellular rearrangements. Mapping epithelial shape, tension, and pressure will enable quantitative studies of mechanobiology in 3D epithelia of controlled size and shape.
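
    For a thin pressurised shell, the Young-Laplace relation ties tension, pressure, and curvature together, which is the basis of mapping stress from measured pressure and local curvature. The sketch below computes only an isotropic-equivalent tension, a deliberate simplification of the anisotropic stress mapping described above; the pressure and radius values are assumptions.

```python
import numpy as np

def laplace_tension(pressure_pa, principal_curvatures):
    """Nominal membrane tension (N/m) from the Young-Laplace relation P = T * (k1 + k2),
    assuming isotropic tension; the study maps the full anisotropic stress, so this is
    only the isotropic-equivalent value."""
    k1, k2 = principal_curvatures
    return pressure_pa / (k1 + k2)

# Illustrative numbers (assumed, not from the abstract): a 100 Pa luminal pressure
# and a locally spherical cap of radius 50 micrometres (k1 = k2 = 1/R).
R = 50e-6
print(laplace_tension(100.0, (1.0 / R, 1.0 / R)))   # -> 2.5e-3 N/m
```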

  4. Discrimination of fluoride and phosphate contamination in central Florida for analyses of environmental effects

    NASA Technical Reports Server (NTRS)

    Coker, A. E.; Marshall, R.; Thomson, F.

    1972-01-01

    A study was made of the spatial registration of fluoride and phosphate pollution parameters in central Florida by utilizing remote sensing techniques. Multispectral remote sensing data were collected over the area and processed to produce multispectral recognition maps. These processed data were used to map land areas and waters containing concentrations of fluoride and phosphate. Maps showing distribution of affected and unaffected vegetation were produced. In addition, the multispectral data were processed by single band radiometric slicing to produce radiometric maps used to delineate areas of high ultraviolet radiance, which indicates high fluoride concentrations. The multispectral parameter maps and radiometric maps in combination showed distinctive patterns, which are correlated with areas known to be affected by fluoride and phosphate contamination. These remote sensing techniques have the potential for regional use to assess the environmental impact of fluoride and phosphate wastes in central Florida.

  5. Openings: On the Journal of Homosexuality, Volume 1, Issue 1.

    PubMed

    Gotkin, Kevin

    2016-01-01

    This article serves as one of the supplementary pieces of this special issue on "Mapping Queer Bioethics," in which we take a solipsistic turn to "map" the Journal of Homosexuality itself. Here, the author examines Volume 1, Issue 1 of the Journal of Homosexuality and asks whether the journal's first contributors might reveal a historically problematic relationship whereby the categories of front-line LGBT health advocates in the 1970s might be incommensurate with the post-AIDS, queer politics that would follow in decades to come.

  6. Can Atmospheric Reanalysis Data Sets Be Used to Reproduce Flooding Over Large Scales?

    NASA Astrophysics Data System (ADS)

    Andreadis, Konstantinos M.; Schumann, Guy J.-P.; Stampoulis, Dimitrios; Bates, Paul D.; Brakenridge, G. Robert; Kettner, Albert J.

    2017-10-01

    Floods are costly to global economies and can be exceptionally lethal. The ability to produce consistent flood hazard maps over large areas could provide a significant contribution to reducing such losses, as the lack of knowledge concerning flood risk is a major factor in the transformation of river floods into flood disasters. In order to accurately reproduce flooding in river channels and floodplains, high spatial resolution hydrodynamic models are needed. Despite being computationally expensive, recent advances have made their continental to global implementation feasible, although inputs for long-term simulations may require the use of reanalysis meteorological products, especially in data-poor regions. We employ a coupled hydrologic/hydrodynamic model cascade forced by the 20CRv2 reanalysis data set and evaluate its ability to reproduce flood inundation area and volume for Australia during the 1973-2012 period. Ensemble simulations using the reanalysis data were performed to account for uncertainty in the meteorology and compared with a validated benchmark simulation. Results show that the reanalysis ensemble captures the inundated areas and volumes relatively well, with correlations for the ensemble mean of 0.82 and 0.85 for area and volume, respectively, although the meteorological ensemble spread propagates into large uncertainty in the simulated flood characteristics.

  7. Building Better Volcanic Hazard Maps Through Scientific and Stakeholder Collaboration

    NASA Astrophysics Data System (ADS)

    Thompson, M. A.; Lindsay, J. M.; Calder, E.

    2015-12-01

All across the world information about natural hazards such as volcanic eruptions, earthquakes and tsunami is shared and communicated using maps that show which locations are potentially exposed to hazards of varying intensities. Unlike earthquakes and tsunami, which typically produce one dominant hazardous phenomenon (ground shaking and inundation, respectively), volcanic eruptions can produce a wide variety of phenomena that range from near-vent (e.g. pyroclastic flows, ground shaking) to distal (e.g. volcanic ash, inundation via tsunami), and that vary in intensity depending on the type and location of the volcano. This complexity poses challenges in depicting volcanic hazard on a map, and to date there has been no consistent approach, with a wide range of hazard maps produced and little evaluation of their relative efficacy. Moreover, in traditional hazard mapping practice, scientists analyse data about a hazard, and then display the results on a map that is then presented to stakeholders. This one-way, top-down approach to hazard communication does not necessarily translate into effective hazard education, or, as tragically demonstrated by Nevado del Ruiz, Colombia, in 1985, its use in risk mitigation by civil authorities. Furthermore, messages taken away from a hazard map can be strongly influenced by its visual design. Thus, hazard maps are more likely to be useful, usable and used if relevant stakeholders are engaged during the hazard map process to ensure a) the map is designed in a relevant way and b) the map takes into account how users interpret and read different map features and designs. The IAVCEI Commission on Volcanic Hazards and Risk has recently launched a Hazard Mapping Working Group to collate some of these experiences in graphically depicting volcanic hazard from around the world, including Latin America and the Caribbean, with the aim of preparing some Considerations for Producing Volcanic Hazard Maps that may help map makers in the future.

  8. Soot Volume Fraction Maps for Normal and Reduced Gravity Laminar Acetylene Jet Diffusion Flames

    NASA Technical Reports Server (NTRS)

    Greenberg, Paul S.; Ku, Jerry C.

    1997-01-01

The study of soot particulate distribution inside gas jet diffusion flames is important to the understanding of fundamental soot particle and thermal radiative transport processes, as well as providing findings relevant to spacecraft fire safety, soot emissions, and radiant heat loads for combustors used in air-breathing propulsion systems. Compared to those under normal gravity (1-g) conditions, the elimination of buoyancy-induced flows is expected to significantly change the flow field in microgravity (0-g) flames, resulting in taller and wider flames with longer particle residence times. Work by Bahadori and Edelman demonstrates many previously unreported qualitative and semi-quantitative results, including flame shape and radiation, for sooting laminar gas jet diffusion flames. Work by Ku et al. reports soot aggregate size and morphology analyses and data and model predictions of soot volume fraction maps for various gas jet diffusion flames. In this study, we present the first 1-g and 0-g comparisons of soot volume fraction maps for laminar acetylene and nitrogen-diluted acetylene jet diffusion flames. Volume fraction is one of the most useful properties in the study of sooting diffusion flames. The amount of radiation heat transfer depends directly on the volume fraction and this parameter can be measured from line-of-sight extinction measurements. Although most soot aggregates are submicron in size, the primary particles (20 to 50 nm in diameter) are in the Rayleigh limit, so the extinction (absorption) cross section of aggregates can be accurately approximated by the Rayleigh solution as a function of incident wavelength, particles' complex refractive index, and particles' volume fraction.
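    The extinction relation described above can be inverted directly. The snippet below is a minimal sketch assuming Beer-Lambert attenuation in the Rayleigh (absorption-dominated) limit; the refractive index and measurement values are illustrative assumptions, and this is not the authors' data-reduction code.

    ```python
    import numpy as np

    def soot_volume_fraction(transmission, wavelength_m, path_length_m, m=1.57 + 0.56j):
        """Estimate soot volume fraction from line-of-sight extinction,
        assuming Rayleigh-limit absorption and the Beer-Lambert law.

        transmission  : I/I0, measured laser transmission (0 < T <= 1)
        wavelength_m  : laser wavelength in metres
        path_length_m : optical path length through the flame in metres
        m             : complex refractive index of soot (illustrative value)
        """
        # Absorption function E(m) = Im[(m^2 - 1) / (m^2 + 2)]
        E_m = ((m**2 - 1) / (m**2 + 2)).imag
        # Beer-Lambert: T = exp(-6*pi*E(m)*fv*L / lambda)  =>  solve for fv
        return -np.log(transmission) * wavelength_m / (6 * np.pi * E_m * path_length_m)

    # Example: 80% transmission at 632.8 nm over a 10 mm path -> a few ppm
    print(soot_volume_fraction(0.80, 632.8e-9, 0.010))
    ```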

  9. Diving in Head First: Finding the Volume of Norris lake

    ERIC Educational Resources Information Center

    Foster, Drew W.

    2008-01-01

    This article allows students to apply their knowledge and experience of area and volume to find the volume of Norris Lake, a large reservoir lake in Tennessee. Students have the opportunity to demonstrate their skills in using maps and scales as well as to incorporate the use of technology in developing the solution. This project satisfied the…

  10. New version of 1 km global river flood hazard maps for the next generation of Aqueduct Global Flood Analyzer

    NASA Astrophysics Data System (ADS)

    Sutanudjaja, Edwin; van Beek, Rens; Winsemius, Hessel; Ward, Philip; Bierkens, Marc

    2017-04-01

The Aqueduct Global Flood Analyzer, launched in 2015, is an open-access and free-of-charge web-based interactive platform which assesses and visualises current and future projections of river flood impacts across the globe. One of the key components in the Analyzer is a set of river flood inundation hazard maps derived from the global hydrological model simulation of PCR-GLOBWB. For the current version of the Analyzer, accessible on http://floods.wri.org/#/, the early generation of PCR-GLOBWB 1.0 was used and simulated at 30 arc-minute (~50 km at the equator) resolution. In this presentation, we will show the new version of these hazard maps. This new version is based on the latest version of PCR-GLOBWB 2.0 (https://github.com/UU-Hydro/PCR-GLOBWB_model, Sutanudjaja et al., 2016, doi:10.5281/zenodo.60764) simulated at 5 arc-minute (~10 km at the equator) resolution. The model simulates daily hydrological and water resource fluxes and storages, including the simulation of overbank volume that ends up on the floodplain (if flooding occurs). The simulation was performed for the present day situation (from 1960) and future climate projections (until 2099) using the climate forcing created in the ISI-MIP project. From the simulated flood inundation volume time series, we then extract annual maxima for each cell, and fit these maxima to a Gumbel extreme value distribution. This allows us to derive flood volume maps of any hazard magnitude (ranging from 2-year to 1000-year flood events) and for any time period (e.g. 1960-1999, 2010-2049, 2030-2069, and 2060-2099). The derived flood volumes (at 5 arc-minute resolution) are then spread over the high resolution terrain model using an updated GLOFRIS downscaling module (Winsemius et al., 2013, doi:10.5194/hess-17-1871-2013). The updated version performs a volume spreading sequentially from more upstream basins to downstream basins, hence enabling a better inclusion of smaller streams, and takes into account spreading of water over diverging deltaic regions. This results in a set of high resolution hazard maps of flood inundation depth at 30 arc-second (~1 km at the equator) resolution. Together with many other updates and new features, the resulting flood hazard maps will be used in the next generation of the Aqueduct Global Flood Analyzer.
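    The annual-maxima Gumbel fit at the core of this workflow is straightforward to reproduce. The snippet below is a minimal sketch with made-up annual maxima for a single grid cell; it is not the GLOFRIS code, only an illustration of deriving return-period flood volumes from a fitted Gumbel distribution.

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical annual-maximum flood volumes for one grid cell (e.g., million m^3)
    annual_maxima = np.array([1.2, 0.8, 2.5, 1.9, 3.1, 0.6, 1.4, 2.2, 4.0, 1.7,
                              2.8, 0.9, 1.1, 3.6, 2.0, 1.5, 2.4, 0.7, 1.8, 2.9])

    # Fit a Gumbel (extreme value type I) distribution to the annual maxima
    loc, scale = stats.gumbel_r.fit(annual_maxima)

    # Flood volume for a given return period T is the (1 - 1/T) quantile
    for T in (2, 10, 100, 1000):
        volume = stats.gumbel_r.ppf(1.0 - 1.0 / T, loc=loc, scale=scale)
        print(f"{T:>4}-year flood volume: {volume:.2f}")
    ```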

  11. Applications systems verification and transfer project. Volume 8: Satellite snow mapping and runoff prediction handbook

    NASA Technical Reports Server (NTRS)

    Bowley, C. J.; Barnes, J. C.; Rango, A.

    1981-01-01

    The purpose of the handbook is to update the various snowcover interpretation techniques, document the snow mapping techniques used in the various ASVT study areas, and describe the ways snowcover data have been applied to runoff prediction. Through documentation in handbook form, the methodology developed in the Snow Mapping ASVT can be applied to other areas.

  12. Intrinsic optical signal imaging of the blood volume changes is sufficient for mapping the resting state functional connectivity in the rodent cortex

    NASA Astrophysics Data System (ADS)

    Kura, Sreekanth; Xie, Hongyu; Fu, Buyin; Ayata, Cenk; Boas, David A.; Sakadžić, Sava

    2018-06-01

    Objective. Resting state functional connectivity (RSFC) allows the study of functional organization in normal and diseased brain by measuring the spontaneous brain activity generated under resting conditions. Intrinsic optical signal imaging (IOSI) based on multiple illumination wavelengths has been used successfully to compute RSFC maps in animal studies. The IOSI setup complexity would be greatly reduced if only a single wavelength can be used to obtain comparable RSFC maps. Approach. We used anesthetized mice and performed various comparisons between the RSFC maps based on single wavelength as well as oxy-, deoxy- and total hemoglobin concentration changes. Main results. The RSFC maps based on IOSI at a single wavelength selected for sensitivity to the blood volume changes are quantitatively comparable to the RSFC maps based on oxy- and total hemoglobin concentration changes obtained by the more complex IOSI setups. Moreover, RSFC maps do not require CCD cameras with very high frame acquisition rates, since our results demonstrate that they can be computed from the data obtained at frame rates as low as 5 Hz. Significance. Our results will have general utility for guiding future RSFC studies based on IOSI and making decisions about the IOSI system designs.
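    A common way to turn such a single-wavelength image time series into a connectivity map is seed-based correlation: correlate every pixel's time course with that of a chosen seed pixel. The sketch below uses synthetic data and omits the filtering and regression steps a real IOSI pipeline would include; it is illustrative and does not reproduce the authors' analysis.

    ```python
    import numpy as np

    def seed_correlation_map(frames, seed_rc):
        """Seed-based resting-state connectivity map from an image time series
        of shape (n_frames, rows, cols): Pearson r of every pixel's time
        course with the seed pixel's time course."""
        n, r, c = frames.shape
        data = frames.reshape(n, -1).astype(float)
        data -= data.mean(axis=0)                      # remove temporal mean per pixel
        seed = data[:, seed_rc[0] * c + seed_rc[1]]    # seed pixel time course
        num = data.T @ seed
        den = np.linalg.norm(data, axis=0) * np.linalg.norm(seed) + 1e-12
        return (num / den).reshape(r, c)

    # Example with synthetic frames at a 5 Hz frame rate
    rng = np.random.default_rng(0)
    frames = rng.standard_normal((300, 64, 64))
    rsfc = seed_correlation_map(frames, seed_rc=(32, 32))
    print(rsfc.shape, rsfc[32, 32])   # (64, 64); correlation of the seed with itself ~ 1.0
    ```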

  13. Insights into earthquake hazard map performance from shaking history simulations

    NASA Astrophysics Data System (ADS)

    Stein, S.; Vanneste, K.; Camelbeeck, T.; Vleminckx, B.

    2017-12-01

    Why recent large earthquakes caused shaking stronger than predicted by earthquake hazard maps is under debate. This issue has two parts. Verification involves how well maps implement probabilistic seismic hazard analysis (PSHA) ("have we built the map right?"). Validation asks how well maps forecast shaking ("have we built the right map?"). We explore how well a map can ideally perform by simulating an area's shaking history and comparing "observed" shaking to that predicted by a map generated for the same parameters. The simulations yield shaking distributions whose mean is consistent with the map, but individual shaking histories show large scatter. Infrequent large earthquakes cause shaking much stronger than mapped, as observed. Hence, PSHA seems internally consistent and can be regarded as verified. Validation is harder because an earthquake history can yield shaking higher or lower than that predicted while being consistent with the hazard map. The scatter decreases for longer observation times because the largest earthquakes and resulting shaking are increasingly likely to have occurred. For the same reason, scatter is much less for the more active plate boundary than for a continental interior. For a continental interior, where the mapped hazard is low, even an M4 event produces exceedances at some sites. Larger earthquakes produce exceedances at more sites. Thus many exceedances result from small earthquakes, but infrequent large ones may cause very large exceedances. However, for a plate boundary, an M6 event produces exceedance at only a few sites, and an M7 produces them in a larger, but still relatively small, portion of the study area. As reality gives only one history, and a real map involves assumptions about more complicated source geometries and occurrence rates, which are unlikely to be exactly correct and thus will contribute additional scatter, it is hard to assess whether misfit between actual shaking and a map — notably higher-than-mapped shaking — arises by chance or reflects biases in the map. Due to this problem, there are limits to how well we can expect hazard maps to predict future shaking, as well as to our ability to test the performance of a hazard map based on available observations.
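    The simulation idea can be illustrated with a toy model: draw earthquake counts from a Poisson process, assign each event a scattered shaking value, and count how often a mapped level is exceeded in each synthetic history. All rates and ground-motion numbers below are invented for illustration and do not reproduce the authors' hazard model.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def simulate_exceedances(rate_per_year, years, mapped_shaking, sigma=0.6, n_histories=1000):
        """Toy shaking-history simulation: events occur as a Poisson process and
        each produces a lognormally scattered shaking value at a site; count
        exceedances of the mapped level in each simulated history."""
        exceed_counts = np.empty(n_histories, dtype=int)
        for i in range(n_histories):
            n_events = rng.poisson(rate_per_year * years)
            # median shaking of 0.1 g with lognormal scatter (assumed values)
            shaking = 0.1 * np.exp(sigma * rng.standard_normal(n_events))
            exceed_counts[i] = np.count_nonzero(shaking > mapped_shaking)
        return exceed_counts

    counts = simulate_exceedances(rate_per_year=0.2, years=50, mapped_shaking=0.2)
    print("mean exceedances per 50-yr history:", counts.mean())
    print("fraction of histories with at least one exceedance:", (counts > 0).mean())
    ```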

  14. Cerebellar contribution to motor and cognitive performance in multiple sclerosis: An MRI sub-regional volumetric analysis.

    PubMed

    D'Ambrosio, Alessandro; Pagani, Elisabetta; Riccitelli, Gianna C; Colombo, Bruno; Rodegher, Mariaemma; Falini, Andrea; Comi, Giancarlo; Filippi, Massimo; Rocca, Maria A

    2017-08-01

To investigate the role of cerebellar sub-regions in motor and cognitive performance in multiple sclerosis (MS) patients. Whole and sub-regional cerebellar volumes, brain volumes, T2 hyperintense lesion volumes (LV), and motor performance scores were obtained from 95 relapse-onset MS patients and 32 healthy controls (HC). MS patients also underwent an evaluation of working memory and processing speed functions. Cerebellar anterior and posterior lobes were segmented using the Spatially Unbiased Infratentorial Toolbox (SUIT) from Statistical Parametric Mapping (SPM12). Multivariate linear regression models assessed the relationship between magnetic resonance imaging (MRI) measures and motor/cognitive scores. Compared to HC, only secondary progressive multiple sclerosis (SPMS) patients had lower cerebellar volumes (total and posterior cerebellum). In MS patients, lower anterior cerebellar volume and brain T2 LV predicted worse motor performance, whereas lower posterior cerebellar volume and brain T2 LV predicted poor cognitive performance. Global measures of brain volume and infratentorial T2 LV were not selected by the final multivariate models. Cerebellar volumetric abnormalities are likely to make an important contribution to explaining motor and cognitive performance in MS patients. Consistent with functional mapping studies, cerebellar posterior-inferior volume accounted for variance in cognitive measures, whereas anterior cerebellar volume accounted for variance in motor performance, supporting the assessment of cerebellar damage at the sub-regional level.

  15. Estimating Residual Solids Volume In Underground Storage Tanks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clark, Jason L.; Worthy, S. Jason; Martin, Bruce A.

    2014-01-08

The Savannah River Site liquid waste system consists of multiple facilities to safely receive and store legacy radioactive waste, treat, and permanently dispose waste. The large underground storage tanks and associated equipment, known as the 'tank farms', include a complex interconnected transfer system which includes underground transfer pipelines and ancillary equipment to direct the flow of waste. The waste in the tanks is present in three forms: supernatant, sludge, and salt. The supernatant is a multi-component aqueous mixture, while sludge is a gel-like substance which consists of insoluble solids and entrapped supernatant. The waste from these tanks is retrieved and treated as sludge or salt. The high level (radioactive) fraction of the waste is vitrified into a glass waste form, while the low-level waste is immobilized in a cementitious grout waste form called saltstone. Once the waste is retrieved and processed, the tanks are closed via removing the bulk of the waste, chemical cleaning, heel removal, stabilizing remaining residuals with tailored grout formulations and severing/sealing external penetrations. The comprehensive liquid waste disposition system, currently managed by Savannah River Remediation, consists of (1) safe storage and retrieval of the waste as it is prepared for permanent disposition; (2) definition of the waste processing techniques utilized to separate the high-level waste fraction/low-level waste fraction; (3) disposition of LLW in saltstone; (4) disposition of the HLW in glass; and (5) closure state of the facilities, including tanks. This paper focuses on determining the effectiveness of waste removal campaigns through monitoring the volume of residual solids in the waste tanks. Volume estimates of the residual solids are performed by creating a map of the residual solids on the waste tank bottom using video and still digital images. The map is then used to calculate the volume of solids remaining in the waste tank. The ability to accurately determine a volume is a function of the quantity and quality of the waste tank images. Currently, mapping is performed remotely with closed circuit video cameras and still photograph cameras due to the hazardous environment. There are two methods that can be used to create a solids volume map. These methods are: liquid transfer mapping / post transfer mapping and final residual solids mapping. The task is performed during a transfer because the liquid level (which is a known value determined by a level measurement device) is used as a landmark to indicate solids accumulation heights. The post transfer method is primarily utilized after the majority of waste has been removed. This method relies on video and still digital images of the waste tank after the liquid transfer is complete to obtain the relative height of solids across a waste tank in relation to known and usable landmarks within the waste tank (cooling coils, column base plates, etc.). In order to accurately monitor solids over time across various cleaning campaigns, and provide a technical basis to support final waste tank closure, a consistent methodology for volume determination has been developed and implemented at SRS.
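    Once a map of residual-solids heights has been built from the tank imagery, the remaining volume is essentially a numerical integration of height over the mapped floor area. The sketch below shows that integration on a toy grid; the grid size, cell area, and unit conversions are illustrative and are not SRS-specific values.

    ```python
    import numpy as np

    def residual_solids_volume(height_map_in, cell_area_ft2):
        """Integrate a gridded map of residual-solids heights (inches) over the
        tank floor to estimate the remaining solids volume in US gallons."""
        heights_ft = np.asarray(height_map_in, dtype=float) / 12.0   # inches -> feet
        volume_ft3 = np.nansum(heights_ft) * cell_area_ft2           # sum(height * cell area)
        return volume_ft3 * 7.48052                                  # cubic feet -> gallons

    # Example: a coarse 4x4 grid of mapped solids heights (inches), 25 ft^2 cells
    heights = [[0.5, 1.0, 0.0, 0.2],
               [2.0, 3.5, 1.5, 0.0],
               [0.0, 1.2, 0.8, 0.3],
               [0.1, 0.0, 0.4, 0.6]]
    print(f"{residual_solids_volume(heights, cell_area_ft2=25.0):.0f} gallons")
    ```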

  16. FASTMap v. 2010.01

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bynum, Leo

FASTMap is a mapping application available for the web or on mobile devices (iOS and Android) that browses geospatial data and produces detailed reports of objects within any area of analysis. FASTMap can access any geospatial dataset. The software can provide immediate access to the selected data through a fully symbolized interactive mapping interface. FASTMap can load arbitrary contours that represent a region of interest and can dynamically identify and geospatially select objects that reside within the region. The software can produce a report listing the objects and aggregations for the region, as well as producing publication quality maps. FASTMap also has the ability to post and maintain authored maps, any GIS data included in the map, areas of interest, as well as any titles and labels. These defining ingredients of a map are called map contexts. These map contexts can be instantly broadcast via the internet through any of an infinite number of named channels to small or large numbers of users monitoring any of the channels being posted to, so a user can author a map and immediately share it with others, whether they are on a traditional desktop computer, laptop, mobile tablet, or smartphone. Further, users receiving broadcast maps can also alter them, or create new ones, and publish back to the channel in a collaborative manner. FASTMap can be configured to access virtually any geospatial data.

  17. Chernobyl doses. Volume 1. Analysis of forest canopy radiation response from multispectral imagery and the relationship to doses. Technical report, 29 July 1987-30 September 1993

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McClennan, G.E.; Anno, G.H.; Whicker, F.W.

    1994-09-01

This volume of the report Chernobyl Doses presents details of a new, quantitative method for remotely sensing ionizing radiation dose to vegetation. Analysis of Landsat imagery of the area within a few kilometers of the Chernobyl nuclear reactor station provides maps of radiation dose to pine forest canopy resulting from the accident of April 26, 1986. Detection of the first date of significant, persistent deviation from normal of the spectral reflectance signature of pine foliage produces contours of radiation dose in the 20 to 80 Gy range extending up to 4 km from the site of the reactor explosion. The effective duration of exposure for the pine foliage is about 3 weeks. For this exposure time, the LD50 of Pinus sylvestris (Scotch pine) is about 23 Gy. The practical lower dose limit for the remote detection of radiation dose to pine foliage with the Landsat Thematic Mapper is about 5 Gy or 1/4 of the LD50.

  18. Prostate tissue ablation with MRI guided transurethral therapeutic ultrasound and intraoperative assessment of the integrity of the neurovascular bundle

    NASA Astrophysics Data System (ADS)

    Sammet, Steffen; Partanen, Ari; Yousuf, Ambereen; Wardrip, Craig; Niekrasz, Marek; Antic, Tatjana; Razmaria, Aria; Sokka, Sham; Karczmar, Gregory; Oto, Aytekin

    2017-03-01

OBJECTIVES: Evaluation of the precision of prostate tissue ablation with MRI guided therapeutic ultrasound by intraoperative objective assessment of the neurovascular bundle in canines in-vivo. METHODS: In this ongoing IACUC approved study, eight male canines were scanned in a clinical 3T Achieva MRI scanner (Philips) before, during, and after ultrasound therapy with a prototype MR-guided ultrasound therapy system (Philips). The system includes a therapy console to plan treatment, to calculate real-time temperature maps, and to control ultrasound exposures with temperature feedback. A transurethral ultrasound applicator with eight transducer elements was used to ablate canine prostate tissue in-vivo. Ablated prostate tissue volumes were compared to the prescribed target volumes to evaluate technical effectiveness. The ablated volumes determined by MRI (T1, T2, diffusion, dynamic contrast enhanced and 240 CEM43 thermal dose maps) were compared to H&E stained histological slides after prostatectomy. Potential nerve damage of the neurovascular bundle was objectively assessed intraoperatively during prostatectomy with a CaverMap Surgical Aid nerve stimulator (Blue Torch Medical Technologies). RESULTS: Transurethral MRI-guided ultrasound therapy can effectively ablate canine prostate tissue in vivo. Coronal MR-imaging confirmed the correct placement of the HIFU transducer. MRI temperature maps were acquired during HIFU treatment, and subsequently used for calculating thermal dose. Prescribed target volumes corresponded to the 240 CEM43 thermal dose maps during HIFU treatment in all canines. Ablated volumes on high resolution anatomical, diffusion weighted, and contrast enhanced MR images matched corresponding histological slides after prostatectomy. MRI guidance with real-time temperature monitoring showed no damage to surrounding tissues, especially to the neurovascular bundle (assessed intra-operatively with a nerve stimulator) or to the rectum wall. CONCLUSIONS: Our study demonstrates the effectiveness and precision of transurethral ultrasound ablation of prostatic tissue in canines with MRI monitoring and guidance. The canine prostate is an excellent model for the human prostate with similar anatomical characteristics and diseases. MRI guidance with real-time, intraoperative temperature monitoring reduces the risk of damaging critical surrounding anatomical structures in ultrasound therapy of the prostate.
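    The 240 CEM43 threshold referred to above is the standard Sapareto-Dewey cumulative-equivalent-minutes thermal dose, which can be computed per voxel from the MR temperature history. The sketch below uses an invented temperature history and sampling interval; it illustrates the formula, not the therapy console's implementation.

    ```python
    import numpy as np

    def cem43(temps_c, dt_s):
        """Cumulative equivalent minutes at 43 degC for a voxel temperature history.
        Sapareto-Dewey formulation: CEM43 = sum(dt * R**(43 - T)),
        with R = 0.5 for T >= 43 degC and R = 0.25 below 43 degC."""
        temps_c = np.asarray(temps_c, dtype=float)
        R = np.where(temps_c >= 43.0, 0.5, 0.25)
        return np.sum((dt_s / 60.0) * R ** (43.0 - temps_c))   # minutes

    # Example: MR thermometry samples every 4 s for a voxel heated to ~55 degC
    history = [37, 40, 45, 52, 55, 56, 55, 50, 44, 39]
    dose = cem43(history, dt_s=4.0)
    print(f"thermal dose: {dose:.0f} CEM43 (ablation threshold often taken as 240 CEM43)")
    ```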

  19. Water volume and sediment volume and density in Lake Linganore between Boyers Mill Road Bridge and Bens Branch, Frederick County, Maryland, 2012

    USGS Publications Warehouse

    Sekellick, Andrew J.; Banks, William S.L.; Myers, Michael K.

    2013-01-01

    To assist in understanding sediment loadings and the management of water resources, a bathymetric survey was conducted in the part of Lake Linganore between Boyers Mill Road Bridge and Bens Branch in Frederick County, Maryland. The bathymetric survey was performed in January 2012 by the U.S. Geological Survey, in cooperation with the City of Frederick and Frederick County. A separate, but related, field effort to collect 18 sediment cores was conducted in March and April 2012. Depth and location data from the bathymetric survey and location data for the sediment cores were compiled and edited by using geographic information system (GIS) software. A three-dimensional triangulated irregular network (TIN) model of the lake bottom was created to calculate the volume of stored water in the reservoir. Large-scale topographic maps of the valley prior to inundation in 1972 were provided by the Frederick County Division of Utilities and Solid Waste Management and digitized for comparison with current (2012) conditions in order to calculate sediment volume. Cartographic representations of both water depth and sediment accumulation were produced, along with an accuracy assessment for the resulting bathymetric model. Vertical accuracies at the 95-percent confidence level for the collected data, the bathymetric surface model, and the bathymetric contour map were calculated to be 0.64 feet (ft), 1.77 ft, and 2.30 ft, respectively. A dry bulk sediment density was calculated for each of the 18 sediment cores collected during March and April 2012, and used to determine accumulated sediment mass. Water-storage capacity in the study area is 110 acre-feet (acre-ft) at a full-pool elevation 308 ft above the National Geodetic Vertical Datum of 1929, whereas total sediment volume in the study area is 202 acre-ft. These totals indicate a loss of about 65 percent of the original water-storage capacity in the 40 years since dam construction. This corresponds to an average rate of sediment accumulation of 5.1 acre-ft per year since Linganore Creek was impounded. Sediment thicknesses ranged from 0 to 16.7 ft. Sediment densities ranged from 0.38 to 1.08 grams per cubic centimeter, and generally decreased in the downstream direction. The total accumulated-sediment mass was 156,000 metric tons between 1972 and 2012.
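    The headline figures in this survey follow from simple bookkeeping: original capacity is remaining capacity plus accumulated sediment, and sediment mass is volume times dry bulk density. The short calculation below reproduces those numbers; the mean bulk density used is a back-calculated illustrative value, not one reported in the study.

    ```python
    # Reproduce the headline numbers reported above (values from the abstract).
    ACRE_FT_TO_M3 = 1233.48                 # cubic metres per acre-foot

    water_capacity_acre_ft = 110.0          # remaining storage at full pool
    sediment_volume_acre_ft = 202.0         # accumulated sediment, 1972-2012
    years = 2012 - 1972

    original_capacity = water_capacity_acre_ft + sediment_volume_acre_ft
    capacity_loss_pct = 100.0 * sediment_volume_acre_ft / original_capacity
    accumulation_rate = sediment_volume_acre_ft / years

    # Sediment mass = volume x mean dry bulk density; densities ranged 0.38-1.08 g/cm^3,
    # and 0.63 g/cm^3 (630 kg/m^3) is an illustrative back-calculated mean, not a reported value
    sediment_mass_t = sediment_volume_acre_ft * ACRE_FT_TO_M3 * 630.0 / 1000.0

    print(f"capacity lost: {capacity_loss_pct:.0f}%")                  # ~65%
    print(f"accumulation rate: {accumulation_rate:.1f} acre-ft/yr")    # ~5.1
    print(f"approximate sediment mass: {sediment_mass_t:,.0f} metric tons")
    ```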

  20. Status and trends of land change in the Midwest–South Central United States—1973 to 2000

    USGS Publications Warehouse

    Auch, Roger F.; Karstensen, Krista A.; Auch, Roger F.; Karstensen, Krista A.

    2015-12-10

    U.S. Geological Survey (USGS) Professional Paper 1794–C is the third in a four-volume series on the status and trends of the Nation’s land use and land cover, providing an assessment of the rates and causes of land-use and land-cover change in the Midwest–South Central United States between 1973 and 2000. Volumes A, B, and D provide similar analyses for the Western United States, the Great Plains of the United States, and the Eastern United States, respectively. The assessments of land-use and land-cover trends are conducted on an ecoregion-by-ecoregion basis, and each ecoregion assessment is guided by a nationally consistent study design that includes mapping, statistical methods, field studies, and analysis. Individual assessments provide a picture of the characteristics of land change occurring in a given ecoregion; in combination, they provide a framework for understanding the complex national mosaic of change and also the causes and consequences of change. Thus, each volume in this series provides a regional assessment of how (and how fast) land use and land cover are changing, and why. The four volumes together form the first comprehensive picture of land change across the Nation.Geographic understanding of land-use and land-cover change is directly relevant to a wide variety of stakeholders, including land and resource managers, policymakers, and scientists. The chapters in this volume present brief summaries of the patterns and rates of land change observed in each ecoregion in the Midwest–South Central United States, together with field photographs, statistics, and comparisons with other assessments. In addition, a synthesis chapter summarizes the scope of land change observed across the entire Midwest–South Central United States. The studies provide a way of integrating information across the landscape, and they form a critical component in the efforts to understand how land use and land cover affect important issues such as the provision of ecological goods and services and also the determination of risks to, and vulnerabilities of, human communities. Results from this project also are published in peer-reviewed journals, and they are further used to produce maps of change and other tools for land management, as well as to provide inputs for carbon-cycle modeling and other climate change research.

  1. The relative effectiveness of empirical and physical models for simulating the dense undercurrent of pyroclastic flows under different emplacement conditions

    USGS Publications Warehouse

    Ogburn, Sarah E.; Calder, Eliza S

    2017-01-01

    High concentration pyroclastic density currents (PDCs) are hot avalanches of volcanic rock and gas and are among the most destructive volcanic hazards due to their speed and mobility. Mitigating the risk associated with these flows depends upon accurate forecasting of possible impacted areas, often using empirical or physical models. TITAN2D, VolcFlow, LAHARZ, and ΔH/L or energy cone models each employ different rheologies or empirical relationships and therefore differ in appropriateness of application for different types of mass flows and topographic environments. This work seeks to test different statistically- and physically-based models against a range of PDCs of different volumes, emplaced under different conditions, over different topography in order to test the relative effectiveness, operational aspects, and ultimately, the utility of each model for use in hazard assessments. The purpose of this work is not to rank models, but rather to understand the extent to which the different modeling approaches can replicate reality in certain conditions, and to explore the dynamics of PDCs themselves. In this work, these models are used to recreate the inundation areas of the dense-basal undercurrent of all 13 mapped, land-confined, Soufrière Hills Volcano dome-collapse PDCs emplaced from 1996 to 2010 to test the relative effectiveness of different computational models. Best-fit model results and their input parameters are compared with results using observation- and deposit-derived input parameters. Additional comparison is made between best-fit model results and those using empirically-derived input parameters from the FlowDat global database, which represent “forward” modeling simulations as would be completed for hazard assessment purposes. Results indicate that TITAN2D is able to reproduce inundated areas well using flux sources, although velocities are often unrealistically high. VolcFlow is also able to replicate flow runout well, but does not capture the lateral spreading in distal regions of larger-volume flows. Both models are better at reproducing the inundated area of single-pulse, valley-confined, smaller-volume flows than sustained, highly unsteady, larger-volume flows, which are often partially unchannelized. The simple rheological models of TITAN2D and VolcFlow are not able to recreate all features of these more complex flows. LAHARZ is fast to run and can give a rough approximation of inundation, but may not be appropriate for all PDCs and the designation of starting locations is difficult. The ΔH/L cone model is also very quick to run and gives reasonable approximations of runout distance, but does not inherently model flow channelization or directionality and thus unrealistically covers all interfluves. Empirically-based models like LAHARZ and ΔH/L cones can be quick, first-approximations of flow runout, provided a database of similar flows, e.g., FlowDat, is available to properly calculate coefficients or ΔH/L. For hazard assessment purposes, geophysical models like TITAN2D and VolcFlow can be useful for producing both scenario-based or probabilistic hazard maps, but must be run many times with varying input parameters. LAHARZ and ΔH/L cones can be used to produce simple modeling-based hazard maps when run with a variety of input volumes, but do not explicitly consider the probability of occurrence of different volumes. 
For forward modeling purposes, the ability to derive potential input parameters from global or local databases is crucial, though important input parameters for VolcFlow cannot be empirically estimated. Not only does this work provide a useful comparison of the operational aspects and behavior of various models for hazard assessment, but it also enriches conceptual understanding of the dynamics of the PDCs themselves.
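    Of the approaches compared above, the ΔH/L (energy cone) model is the simplest to reproduce: the runout length follows directly from the drop height and an assumed mobility ratio. The sketch below uses invented numbers; in practice the ratio would be drawn from a database of comparable flows such as FlowDat, and many scenarios would be run for a hazard map.

    ```python
    def energy_cone_runout(drop_height_m, h_over_l):
        """Energy-cone runout estimate: a flow descending a height dH is assumed
        to stop where the cone of slope dH/L intersects the ground, so the
        horizontal runout is L = dH / (dH/L)."""
        return drop_height_m / h_over_l

    # Example: a 900 m drop with an assumed mobility ratio dH/L of 0.2
    print(f"estimated runout: {energy_cone_runout(900.0, 0.2) / 1000.0:.1f} km")  # 4.5 km
    ```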

  2. A Fast and Scalable Radiation Hybrid Map Construction and Integration Strategy

    PubMed Central

    Agarwala, Richa; Applegate, David L.; Maglott, Donna; Schuler, Gregory D.; Schäffer, Alejandro A.

    2000-01-01

This paper describes a fast and scalable strategy for constructing a radiation hybrid (RH) map from data on different RH panels. The maps on each panel are then integrated to produce a single RH map for the genome. Recurring problems in using maps from several sources are that the maps use different markers, the maps do not place the overlapping markers in the same order, and the objective functions for map quality are incomparable. We use methods from combinatorial optimization to develop a strategy that addresses these issues. We show that by the standard objective functions of obligate chromosome breaks and maximum likelihood, software for the traveling salesman problem produces RH maps with better quality much more quickly than using software specifically tailored for RH mapping. We use known algorithms for the longest common subsequence problem as part of our map integration strategy. We demonstrate our methods by reconstructing and integrating maps for markers typed on the Genebridge 4 (GB4) and the Stanford G3 panels publicly available from the RH database. We compare map quality of our integrated map with published maps for the GB4 and G3 panels by considering whether markers occur in the same order on a map and in DNA sequence contigs submitted to GenBank. We find that all of the maps are inconsistent with the sequence data for at least 50% of the contigs, but our integrated maps are more consistent. The map integration strategy not only scales to multiple RH maps but also to any maps that have comparable criteria for measuring map quality. Our software improves on current technology for doing RH mapping in areas of computation time and algorithms for considering a large number of markers for mapping. The essential impediments to producing dense high-quality RH maps are data quality and panel size, not computation. PMID:10720576
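    The longest-common-subsequence step mentioned above extracts the markers whose relative order agrees between two panel maps, which can then anchor the integrated map. Below is a small, self-contained dynamic-programming sketch; the marker orders shown are hypothetical examples, not data from the GB4 or G3 maps.

    ```python
    def longest_common_subsequence(a, b):
        """Longest common subsequence of two marker orderings: the markers whose
        relative order agrees between two RH maps.  O(len(a)*len(b)) dynamic program."""
        m, n = len(a), len(b)
        dp = [[0] * (n + 1) for _ in range(m + 1)]
        for i in range(m):
            for j in range(n):
                dp[i + 1][j + 1] = dp[i][j] + 1 if a[i] == b[j] else max(dp[i][j + 1], dp[i + 1][j])
        # backtrack to recover one LCS
        out, i, j = [], m, n
        while i and j:
            if a[i - 1] == b[j - 1]:
                out.append(a[i - 1]); i -= 1; j -= 1
            elif dp[i - 1][j] >= dp[i][j - 1]:
                i -= 1
            else:
                j -= 1
        return out[::-1]

    # Hypothetical marker orders from two panels
    panel_a = ["D1S243", "D1S468", "D1S2845", "D1S214", "D1S2893"]
    panel_b = ["D1S468", "D1S214", "D1S2893", "D1S2660"]
    print(longest_common_subsequence(panel_a, panel_b))   # ['D1S468', 'D1S214', 'D1S2893']
    ```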

  3. Application of Modern Design of Experiments to CARS Thermometry in a Model Scramjet Engine

    NASA Technical Reports Server (NTRS)

    Danehy, P. M.; DeLoach, R.; Cutler, A. D.

    2002-01-01

We have applied formal experiment design and analysis to optimize the measurement of temperature in a supersonic combustor at NASA Langley Research Center. We used the coherent anti-Stokes Raman spectroscopy (CARS) technique to map the temperature distribution in the flowfield downstream of a 1160 K, Mach 2 freestream into which supersonic hydrogen fuel is injected at an angle of 30 degrees. CARS thermometry is inherently a single-point measurement technique; it was used to map the flow by translating the measurement volume through the flowfield. The method known as "Modern Design of Experiments" (MDOE) was used to estimate the data volume required, design the test matrix, perform the experiment and analyze the resulting data. MDOE allowed us to match the volume of data acquired to the precision requirements of the customer. Furthermore, one aspect of MDOE, known as response surface methodology, allowed us to develop precise maps of the flowfield temperature, allowing interpolation between measurement points. An analytic function in two spatial variables was fit to the data from a single measurement plane. Fitting with a Cosine Series Bivariate Function allowed the mean temperature to be mapped with 95% confidence interval half-widths of +/- 30 Kelvin, comfortably meeting the confidence of +/- 50 Kelvin specified prior to performing the experiments. We estimate that applying MDOE to the present experiment saved a factor of 5 in data volume acquired, compared to experiments executed in the traditional manner. Furthermore, the precision requirements could have been met with less than half the data acquired.
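    The response-surface step amounts to an ordinary least-squares fit of an analytic two-variable function to the scattered measurements. The sketch below fits a truncated bivariate cosine series to synthetic temperature data; the basis order, domain lengths, and data are assumptions for illustration and do not reproduce the exact function used in the experiment.

    ```python
    import numpy as np

    def cosine_design_matrix(y, z, order, ly, lz):
        """Design matrix for a truncated bivariate cosine series
        T(y, z) ~ sum_{m,n} a_mn * cos(m*pi*y/ly) * cos(n*pi*z/lz)."""
        cols = [np.cos(m * np.pi * y / ly) * np.cos(n * np.pi * z / lz)
                for m in range(order + 1) for n in range(order + 1)]
        return np.column_stack(cols)

    # Synthetic "measurement plane": temperatures at scattered (y, z) points
    rng = np.random.default_rng(1)
    y = rng.uniform(-20, 20, 200)          # mm
    z = rng.uniform(0, 30, 200)            # mm
    T = 1200 + 400 * np.exp(-((y / 10) ** 2) - ((z - 15) / 8) ** 2) + rng.normal(0, 30, y.size)

    X = cosine_design_matrix(y, z, order=3, ly=40.0, lz=30.0)
    coef, *_ = np.linalg.lstsq(X, T, rcond=None)          # least-squares surface fit
    T_fit = X @ coef
    print("RMS residual (K):", np.sqrt(np.mean((T - T_fit) ** 2)))
    ```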

  4. Specification for the U.S. Geological Survey Historical Topographic Map Collection

    USGS Publications Warehouse

    Allord, Gregory J.; Walter, Jennifer L.; Fishburn, Kristin A.; Shea, Gale A.

    2014-01-01

    This document provides the detailed requirements for producing, archiving, and disseminating a comprehensive digital collection of topographic maps for the U.S. Geological Survey (USGS) Historical Topographic Map Collection (HTMC). The HTMC is a digital archive of about 190,000 printed topographic maps published by the USGS from the inception of the topographic mapping program in 1884 until the last paper topographic map using lithographic printing technology was published in 2006. The HTMC provides a comprehensive digital repository of all scales and all editions of USGS printed topographic maps that is easily discovered, browsed, and downloaded by the public at no cost. The HTMC provides ready access to maps that are no longer available for distribution in print. A digital file representing the original paper historical topographic map is produced for each historical map in the HTMC in georeferenced PDF (GeoPDF) format (a portable document format [PDF] with a geospatial extension).

  5. Comparing Geologic Data Sets Collected by Planetary Analog Traverses and by Standard Geologic Field Mapping: Desert Rats Data Analysis

    NASA Technical Reports Server (NTRS)

    Feng, Wanda; Evans, Cynthia; Gruener, John; Eppler, Dean

    2014-01-01

    Geologic mapping involves interpreting relationships between identifiable units and landforms to understand the formative history of a region. Traditional field techniques are used to accomplish this on Earth. Mapping proves more challenging for other planets, which are studied primarily by orbital remote sensing and, less frequently, by robotic and human surface exploration. Systematic comparative assessments of geologic maps created by traditional mapping versus photogeology together with data from planned traverses are limited. The objective of this project is to produce a geologic map from data collected on the Desert Research and Technology Studies (RATS) 2010 analog mission using Apollo-style traverses in conjunction with remote sensing data. This map is compared with a geologic map produced using standard field techniques.

  6. Aeromagnetic survey map of Sacramento Valley, California

    USGS Publications Warehouse

    Langenheim, Victoria E.

    2015-01-01

    Three aeromagnetic surveys were flown to improve understanding of the geology and structure in the Sacramento Valley. The resulting data serve as a basis for geophysical interpretations, and support geological mapping, water and mineral resource investigations, and other topical studies. Local spatial variations in the Earth's magnetic field (evident as anomalies on aeromagnetic maps) reflect the distribution of magnetic minerals, primarily magnetite, in the underlying rocks. In many cases the volume content of magnetic minerals can be related to rock type, and abrupt spatial changes in the amount of magnetic minerals commonly mark lithologic or structural boundaries. Bodies of serpentinite and other mafic and ultramafic rocks tend to produce the most intense positive magnetic anomalies (for example, in the northwest part of the map). These rock types are the inferred sources, concealed beneath weakly magnetic, valley-fill deposits, of the most prominent magnetic features in the map area, the magnetic highs that extend along the valley axis. Cenozoic volcanic rocks are also an important source of magnetic anomalies and coincide with short-wavelength anomalies that can be either positive (strong central positive anomaly flanked by lower-amplitude negative anomalies) or negative (strong central negative anomaly flanked by lower-amplitude positive anomalies), reflecting the contribution of remanent magnetization. Rocks with more felsic compositions or even some sedimentary units also can cause measurable magnetic anomalies. For example, the long, linear, narrow north-trending anomalies (with amplitudes of <50 nanoteslas [nT]) along the western margin of the valley coincide with exposures of the Mesozoic Great Valley sequence. Note that isolated, short-wavelength anomalies, such as those in the city of Sacramento and along some of the major roads, are caused by manmade features.

  7. Exploring Google Earth Engine platform for big data processing: classification of multi-temporal satellite imagery for crop mapping

    NASA Astrophysics Data System (ADS)

    Shelestov, Andrii; Lavreniuk, Mykola; Kussul, Nataliia; Novikov, Alexei; Skakun, Sergii

    2017-02-01

Many applied problems arising in agricultural monitoring and food security require reliable crop maps at national or global scale. Large scale crop mapping requires processing and management of a large amount of heterogeneous satellite imagery acquired by various sensors, which consequently leads to a “Big Data” problem. The main objective of this study is to explore the efficiency of using the Google Earth Engine (GEE) platform when classifying multi-temporal satellite imagery with potential to apply the platform for a larger scale (e.g. country level) and multiple sensors (e.g. Landsat-8 and Sentinel-2). In particular, multiple state-of-the-art classifiers available in the GEE platform are compared to produce a high resolution (30 m) crop classification map for a large territory (~28,100 km2 and 1.0 M ha of cropland). Though this study does not involve large volumes of data, it does address efficiency of the GEE platform to effectively execute complex workflows of satellite data processing required with large scale applications such as crop mapping. The study discusses strengths and weaknesses of classifiers, assesses accuracies that can be achieved with different classifiers for the Ukrainian landscape, and compares them to the benchmark classifier using a neural network approach that was developed in our previous studies. The study is carried out for the Joint Experiment of Crop Assessment and Monitoring (JECAM) test site in Ukraine covering the Kyiv region (North of Ukraine) in 2013. We found that Google Earth Engine (GEE) provides very good performance in terms of enabling access to the remote sensing products through the cloud platform and providing pre-processing; however, in terms of classification accuracy, the neural network based approach outperformed support vector machine (SVM), decision tree and random forest classifiers available in GEE.
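    The classifier comparison itself is independent of the platform. The sketch below runs an analogous comparison with scikit-learn on synthetic feature vectors standing in for multi-temporal pixel spectra; it is not GEE code, and the accuracy numbers it prints mean nothing beyond the toy data.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.svm import SVC
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.neural_network import MLPClassifier
    from sklearn.model_selection import cross_val_score

    # Synthetic "pixels": rows are multi-temporal spectral feature vectors,
    # labels are crop classes (a stand-in for real training data).
    rng = np.random.default_rng(0)
    X = rng.standard_normal((1000, 24))                    # 24 features (e.g. 4 bands x 6 dates)
    y = rng.integers(0, 5, 1000)                           # 5 crop classes
    X += y[:, None] * 0.5                                  # make classes separable

    classifiers = {
        "SVM": SVC(kernel="rbf", gamma="scale"),
        "decision tree": DecisionTreeClassifier(max_depth=10),
        "random forest": RandomForestClassifier(n_estimators=100),
        "neural network": MLPClassifier(hidden_layer_sizes=(50,), max_iter=1000),
    }
    for name, clf in classifiers.items():
        acc = cross_val_score(clf, X, y, cv=5).mean()
        print(f"{name:>14}: {acc:.3f}")
    ```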

  8. Design and Applications of Rapid Image Tile Producing Software Based on Mosaic Dataset

    NASA Astrophysics Data System (ADS)

    Zha, Z.; Huang, W.; Wang, C.; Tang, D.; Zhu, L.

    2018-04-01

Map tile technology is widely used in web geographic information services, and producing map tiles efficiently is a key technology for rapid delivery of imagery on the web. In this paper, rapid tile-producing software for image data based on a mosaic dataset is designed, and the tile production workflow is described. Key technologies such as cluster processing, map representation, tile checking, tile conversion, and in-memory compression are discussed. After software development and testing with actual image data, the results show that the software has a high degree of automation, effectively reduces the number of I/O operations, and improves tile production efficiency. Manual operations are also reduced significantly.
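    For background, web map tiles are almost always indexed with the standard Web-Mercator XYZ scheme, which maps a longitude/latitude and zoom level to integer tile coordinates. The function below implements that common scheme as context for the tiling discussed above; it is not taken from the paper's software.

    ```python
    import math

    def lonlat_to_tile(lon_deg, lat_deg, zoom):
        """Standard Web-Mercator (XYZ) tile indices for a longitude/latitude at a
        given zoom level, as used by most web map tile services."""
        n = 2 ** zoom
        x = int((lon_deg + 180.0) / 360.0 * n)
        lat_rad = math.radians(lat_deg)
        y = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
        return x, y

    # Example: tile containing Beijing at zoom 12
    print(lonlat_to_tile(116.40, 39.90, 12))   # approximately (3372, 1552)
    ```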

  9. 3D PATTERN OF BRAIN ABNORMALITIES IN FRAGILE X SYNDROME VISUALIZED USING TENSOR-BASED MORPHOMETRY

    PubMed Central

    Lee, Agatha D.; Leow, Alex D.; Lu, Allen; Reiss, Allan L.; Hall, Scott; Chiang, Ming-Chang; Toga, Arthur W.; Thompson, Paul M.

    2007-01-01

    Fragile X syndrome (FraX), a genetic neurodevelopmental disorder, results in impaired cognition with particular deficits in executive function and visuo-spatial skills. Here we report the first detailed 3D maps of the effects of the Fragile X mutation on brain structure, using tensor-based morphometry. TBM visualizes structural brain deficits automatically, without time-consuming specification of regions-of-interest. We compared 36 subjects with FraX (age: 14.66+/−1.58SD, 18 females/18 males), and 33 age-matched healthy controls (age: 14.67+/−2.2SD, 17 females/16 males), using high-dimensional elastic image registration. All 69 subjects' 3D T1-weighted brain MRIs were spatially deformed to match a high-resolution single-subject average MRI scan in ICBM space, whose geometry was optimized to produce a minimal deformation target. Maps of the local Jacobian determinant (expansion factor) were computed from the deformation fields. Statistical maps showed increased caudate (10% higher; p=0.001) and lateral ventricle volumes (19% higher; p=0.003), and trend-level parietal and temporal white matter excesses (10% higher locally; p=0.04). In affected females, volume abnormalities correlated with reduction in systemically measured levels of the fragile X mental retardation protein (FMRP; Spearman's r<−0.5 locally). Decreased FMRP correlated with ventricular expansion (p=0.042; permutation test), and anterior cingulate tissue reductions (p=0.0026; permutation test) supporting theories that FMRP is required for normal dendritic pruning in fronto-striatal-limbic pathways. No sex differences were found; findings were confirmed using traditional volumetric measures in regions of interest. Deficit patterns were replicated using Lie group statistics optimized for tensor-valued data. Investigation of how these anomalies emerge over time will accelerate our understanding of FraX and its treatment. PMID:17161622
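    The local Jacobian determinant used as the expansion factor above can be approximated by finite differences of the displacement field. The sketch below does this with NumPy on a synthetic uniform expansion, where the expected determinant is (1.05)^3; it illustrates the measure only, not the elastic registration pipeline used in the study.

    ```python
    import numpy as np

    def jacobian_determinant_map(disp):
        """Local expansion factor (Jacobian determinant) of a 3D deformation
        x -> x + u(x), given a displacement field disp of shape (3, X, Y, Z)
        in voxel units.  det > 1 means local expansion, det < 1 contraction."""
        grads = np.stack([np.stack(np.gradient(disp[i]), axis=0) for i in range(3)])
        # grads[i, j] = d u_i / d x_j ; Jacobian of the mapping is I + grad(u)
        J = np.moveaxis(grads, (0, 1), (-2, -1)) + np.eye(3)
        return np.linalg.det(J)

    # Example: a synthetic displacement field with a small uniform expansion
    shape = (32, 32, 32)
    coords = np.stack(np.meshgrid(*[np.arange(s) for s in shape], indexing="ij"))
    disp = 0.05 * (coords - 16.0)                  # u(x) = 0.05 * (x - centre)
    print(jacobian_determinant_map(disp).mean())   # ~ (1.05)^3 = 1.158
    ```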

  10. Large area 3-D optical coherence tomography imaging of lumpectomy specimens for radiation treatment planning

    NASA Astrophysics Data System (ADS)

    Wang, Cuihuan; Kim, Leonard; Barnard, Nicola; Khan, Atif; Pierce, Mark C.

    2016-02-01

Our long term goal is to develop a high-resolution imaging method for comprehensive assessment of tissue removed during lumpectomy procedures. By identifying regions of high-grade disease within the excised specimen, we aim to develop patient-specific post-operative radiation treatment regimens. We have assembled a benchtop spectral-domain optical coherence tomography (SD-OCT) system with 1320 nm center wavelength. Automated beam scanning enables "sub-volumes" spanning 5 mm x 5 mm x 2 mm (500 A-lines x 500 B-scans x 2 mm in depth) to be collected in under 15 seconds. A motorized sample positioning stage enables multiple sub-volumes to be acquired across an entire tissue specimen. Sub-volumes are rendered from individual B-scans in 3D Slicer software and en face (XY) images are extracted at specific depths. These images are then tiled together using MosaicJ software to produce a large area en face view (up to 40 mm x 25 mm). After OCT imaging, specimens were sectioned and stained with H&E, allowing comparison between OCT image features and disease markers on histopathology. This manuscript describes the technical aspects of image acquisition and reconstruction, and reports initial qualitative comparison between large area en face OCT images and H&E-stained tissue sections. Future goals include developing image reconstruction algorithms for mapping an entire sample, and registering OCT image volumes with clinical CT and MRI images for post-operative treatment planning.

  11. Application of Remote Sensing Data to Improve the Water and Soil Resource Management of Rwanda

    NASA Astrophysics Data System (ADS)

    Csorba, Ádám; Bukombe, Benjamin; Naramabuye, Francois Xavier; Szegi, Tamás; Vekerdy, Zoltán; Michéli, Erika

    2017-04-01

Rwandan agriculture relies strongly on water stored in artificial reservoirs of various sizes for irrigation during the dry seasons. Furthermore, the success of irrigation depends on a wide range of soil properties which directly affect the moisture regime of the growing medium. By integrating remote sensing and auxiliary data, the objectives of our study are to monitor water level fluctuations in the reservoirs, estimate the volume of water available for irrigation, and combine this information with soil property maps to support decision making for sustainable irrigation water management in a study area in Southern Rwanda. For water level and volume estimation, a series of Sentinel-1 (product type: GRD, acquisition mode: IW, polarizations HH and VH) data were obtained covering the study area and spanning a period of two years. To map the extent of water bodies, the Radar-Based Water Body Mapping module of the Water Observation and Information System (WOIS) was used. High-resolution optical data (Sentinel-2) were used for validation in cloud-free periods. To estimate the volume changes in the reservoirs, we combined the information derived from the water body mapping procedure with digital elevation models. For sustainable irrigation water management, digital soil property maps were developed by applying a wide range of environmental covariates related to soil-forming factors. To develop covariates representing land use, a time series analysis of the two years of Sentinel-1 data was performed. As auxiliary soil data, the ISRIC-WISE harmonized soil profile database was used. The developed digital soil mapping approach is integrated into a new WOIS workflow.
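    Combining a radar-derived water extent with an elevation model reduces the volume estimate to summing water depth over flooded cells. The toy example below shows that bookkeeping on a 4 x 4 grid; the DEM values, mask, and cell size are invented, and the real workflow (WOIS water-body mapping plus DEMs) is considerably more involved.

    ```python
    import numpy as np

    def reservoir_volume(dem, water_mask, water_level, cell_area_m2):
        """Stored water volume estimated as the sum over flooded cells of
        (water surface level - bed elevation) * cell area."""
        depth = np.clip(water_level - dem, 0.0, None)       # ignore cells above the water line
        return float(np.sum(depth * water_mask) * cell_area_m2)

    # Toy example: a 4x4 DEM (m a.s.l.), water detected on the low cells, level = 100 m
    dem = np.array([[101, 100, 99, 98],
                    [100, 99, 97, 96],
                    [99, 98, 96, 95],
                    [99, 97, 96, 95]], dtype=float)
    mask = (dem < 100).astype(float)                        # water extent from SAR classification
    print(reservoir_volume(dem, mask, water_level=100.0, cell_area_m2=100.0), "m^3")
    ```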

  12. Submarine radial vents on Mauna Loa Volcano, Hawaìi

    NASA Astrophysics Data System (ADS)

    Wanless, V. Dorsey; Garcia, M. O.; Trusdell, F. A.; Rhodes, J. M.; Norman, M. D.; Weis, Dominique; Fornari, D. J.; Kurz, M. D.; Guillou, Hervé

    2006-05-01

A 2002 multibeam sonar survey of Mauna Loa's western flank revealed ten submarine radial vents and three submarine lava flows. Only one submarine radial vent was known previously. The ages of these vents are constrained by eyewitness accounts, geologic relationships, Mn-Fe coatings, and geochemical stratigraphy; they range from 128 years B.P. to possibly 47 ka. Eight of the radial vents produced degassed lavas despite eruption in water depths sufficient to inhibit sulfur degassing. These vents formed truncated cones and short lava flows. Two vents produced undegassed lavas that created "irregular" cones and longer lava flows. Compositionally and isotopically, the submarine radial vent lavas are typical of Mauna Loa lavas, except two cones that erupted alkalic lavas. He-Sr isotopes for the radial vent lavas follow Mauna Loa's evolutionary trend. The compositional and isotopic heterogeneity of these lavas indicates most had distinct parental magmas. Bathymetry and acoustic backscatter results, along with photography and sampling during four JASON2 dives, are used to produce a detailed geologic map to evaluate Mauna Loa's submarine geologic history. The new map shows that the 1877 submarine eruption was much larger than previously thought, resulting in a 10% increase for recent volcanism. Furthermore, although alkalic lavas were found at two radial vents, there is no systematic increase in alkalinity among these or other Mauna Loa lavas as expected for a dying volcano. These results refute an interpretation that Mauna Loa's volcanism is waning. The submarine radial vents and flows cover 29 km² of seafloor and comprise a total volume of ~2 × 10⁹ m³ of lava, reinforcing the idea that submarine lava eruptions are important in the growth of oceanic island volcanoes even after they emerged above sea level.

  13. Submarine radial vents on Mauna Loa Volcano, Hawai'i

    USGS Publications Warehouse

    Wanless, V. Dorsey; Garcia, M.O.; Trusdell, F.A.; Rhodes, J.M.; Norman, M.D.; Weis, Dominique; Fornari, D.J.; Kurz, M.D.; Guillou, Herve

    2006-01-01

A 2002 multibeam sonar survey of Mauna Loa's western flank revealed ten submarine radial vents and three submarine lava flows. Only one submarine radial vent was known previously. The ages of these vents are constrained by eyewitness accounts, geologic relationships, Mn-Fe coatings, and geochemical stratigraphy; they range from 128 years B.P. to possibly 47 ka. Eight of the radial vents produced degassed lavas despite eruption in water depths sufficient to inhibit sulfur degassing. These vents formed truncated cones and short lava flows. Two vents produced undegassed lavas that created “irregular” cones and longer lava flows. Compositionally and isotopically, the submarine radial vent lavas are typical of Mauna Loa lavas, except two cones that erupted alkalic lavas. He-Sr isotopes for the radial vent lavas follow Mauna Loa's evolutionary trend. The compositional and isotopic heterogeneity of these lavas indicates most had distinct parental magmas. Bathymetry and acoustic backscatter results, along with photography and sampling during four JASON2 dives, are used to produce a detailed geologic map to evaluate Mauna Loa's submarine geologic history. The new map shows that the 1877 submarine eruption was much larger than previously thought, resulting in a 10% increase for recent volcanism. Furthermore, although alkalic lavas were found at two radial vents, there is no systematic increase in alkalinity among these or other Mauna Loa lavas as expected for a dying volcano. These results refute an interpretation that Mauna Loa's volcanism is waning. The submarine radial vents and flows cover 29 km² of seafloor and comprise a total volume of ∼2 × 10⁹ m³ of lava, reinforcing the idea that submarine lava eruptions are important in the growth of oceanic island volcanoes even after they emerged above sea level.

  14. High-resolution mapping and modelling of surface albedo in Norwegian boreal forests: from remotely sensed data to predictions

    NASA Astrophysics Data System (ADS)

    Cherubini, Francesco; Hu, Xiangping; Vezhapparambu, Sajith; Stromman, Anders

    2017-04-01

    Surface albedo, a key parameter of the Earth's climate system, has high variability in space, time, and land cover and its parameterization is among the most important variables in climate models. The lack of extensive estimates for model improvement is one of the main limitations for accurately quantifying the influence of surface albedo changes on the planetary radiation balance. We use multi-year satellite retrievals of MODIS surface albedo (MCD43A3), high resolution land cover maps, and meteorological records to characterize albedo variations in Norway across latitude, seasons, land cover type, and topography. We then use this dataset to elaborate semi-empirical models to predict albedo values as a function of tree species, age, volume and climate variables like temperature and snow water equivalents (SWE). Given the complexity of the dataset and model formulation, we apply an innovative non-linear programming approach simultaneously coupled with linear un-mixing. The MODIS albedo products are at a resolution of about 500 m and 8 days. The land cover maps provide vegetation structure information on relative abundance of tree species, age, and biomass volumes at 16 m resolution (for both deciduous and coniferous species). Daily observations of meteorological information on air temperature and SWE are produced at 1 km resolution from interpolation of meteorological weather stations in Norway. These datasets have different resolution and projection, and are harmonized by identifying, for each MODIS pixel, the intersecting land cover polygons and the percentage area of the MODIS pixel represented by each land cover type. We then filter the subplots according to the following criteria: i) at least 96% of the total pixel area is covered by a single land cover class (either forest or cropland); ii) if forest area, at least 98% of the forest area is covered by spruce, deciduous or pine. Forested pixels are then categorized as spruce, deciduous, or pine dominant if the fraction of the respective tree species is greater than 75%. Results show averages of albedo estimates for forests and cropland depicting spatial (along a latitudinal gradient) and temporal (daily, monthly, and seasonal) variations across Norway. As the case study region is a country with heterogeneous topography, we also study the sensitivity of the albedo estimates to the slope and aspect of the terrain. The mathematical programming approach uses a variety of functional forms, constraints and variables, leading to many different model outputs. There are several models with relatively high performances, allowing for a flexibility in the model selection, with different model variants suitable for different situations. This approach produces albedo predictions at the same resolution of the land cover dataset (16 m, notably higher than the MODIS estimates), can incorporate changes in climate conditions, and is robust to cross-validation between different locations. By integrating satellite measurements and high-resolution vegetation maps, we can thus produce semi-empirical models that can predict albedo values for boreal forests using a variety of input variables representing climate and/or vegetation structure. Further research can explore the possible advantages of its implementation in land surface schemes over existing approaches.
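    A much simpler stand-in for the semi-empirical modelling step is a non-linear least-squares fit of albedo against a few covariates. The sketch below uses scipy's curve_fit on synthetic data; the functional form, covariates, and coefficients are assumptions for illustration and are not the models selected by the study.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def albedo_model(X, a0, a1, a2, a3):
        """Illustrative semi-empirical form: albedo decreases with canopy volume
        and increases with snow water equivalent, with a linear temperature term."""
        volume, swe, temp = X
        return a0 + a1 * np.exp(-a2 * volume) * swe / (swe + 50.0) + a3 * temp

    # Synthetic training data standing in for MODIS albedo + forest/climate covariates
    rng = np.random.default_rng(3)
    volume = rng.uniform(0, 400, 500)          # m^3/ha
    swe = rng.uniform(0, 300, 500)             # mm
    temp = rng.uniform(-15, 10, 500)           # deg C
    albedo = 0.12 + 0.55 * np.exp(-0.01 * volume) * swe / (swe + 50.0) - 0.002 * temp
    albedo += rng.normal(0, 0.02, 500)

    params, _ = curve_fit(albedo_model, (volume, swe, temp), albedo, p0=[0.1, 0.5, 0.01, 0.0])
    print("fitted coefficients:", np.round(params, 3))
    ```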

  15. Isomap transform for segmenting human body shapes.

    PubMed

    Cerveri, P; Sarro, K J; Marchente, M; Barros, R M L

    2011-09-01

    Segmentation of the 3D human body is a very challenging problem in applications exploiting volume capture data. Direct clustering in the Euclidean space is usually complex or even unsolvable. This paper presents an original method based on the Isomap (isometric feature mapping) transform of the volume dataset. The 3D articulated posture is mapped by Isomap into the pose of Da Vinci's Vitruvian man. The limbs are unrolled from each other and separated from the trunk and pelvis, and the topology of the human body shape is recovered. In such a configuration, Hoshen-Kopelman clustering applied to concentric spherical shells is used to automatically group points into the labelled principal curves. Shepard interpolation is utilised to back-map points of the principal curves into the original volume space. Experimental results on many different postures prove the validity of the proposed method. A reliability of less than 2 cm in the location of the joint centres and less than 3° in the direction axes of rotation has been obtained, which qualifies this procedure as a potential tool for markerless motion analysis.
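
    A minimal sketch of the first stage described above, assuming the volume capture data are available as an (N, 3) point cloud; scikit-learn's Isomap stands in for the paper's implementation, and the shell-based clustering and back-mapping stages are not shown.

    ```python
    # Embed the 3D body point cloud with Isomap so the posture is "unrolled" before clustering.
    import numpy as np
    from sklearn.manifold import Isomap

    def unroll_body(points: np.ndarray, n_neighbors: int = 12) -> np.ndarray:
        """points: (N, 3) coordinates from volume capture; returns (N, 3) Isomap coordinates."""
        return Isomap(n_neighbors=n_neighbors, n_components=3).fit_transform(points)

    # usage sketch:
    # unrolled = unroll_body(volume_points)
    # ...cluster concentric spherical shells in the unrolled space, then interpolate back...
    ```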

  16. Moving NSDC's Staff Development Standards into Practice: Innovation Configurations, Volume II. [CD-ROMs]

    ERIC Educational Resources Information Center

    National Staff Development Council, 2005

    2005-01-01

    The second volume of "Moving NSDC's Staff Development Standards into Practice: Innovation Configurations" builds on the work that began with the first volume published in 2003. An Innovation Configuration map is a device that identifies and describes the major components of a new practice such as the standards and details of how it would look in…

  17. Respiratory effort correction strategies to improve the reproducibility of lung expansion measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Du, Kaifang; Reinhardt, Joseph M.; Christensen, Gary E.

    2013-12-15

    Purpose: Four-dimensional computed tomography (4DCT) can be used to make measurements of pulmonary function longitudinally. The sensitivity of such measurements to identify change depends on measurement uncertainty. Previously, intrasubject reproducibility of Jacobian-based measures of lung tissue expansion was studied in two repeat prior-RT 4DCT human acquisitions. Differences in respiratory effort, such as breathing amplitude and frequency, may affect longitudinal function assessment. In this study, the authors present normalization schemes that correct ventilation images for variations in respiratory effort and assess the reproducibility improvement after effort correction. Methods: Repeat 4DCT image data acquired within a short time interval from 24 patients prior to radiation therapy (RT) were used for this analysis. Using a tissue-volume-preserving deformable image registration algorithm, Jacobian ventilation maps from the two scanning sessions were computed and compared on the same coordinate system for reproducibility analysis. In addition to computing the ventilation maps from end expiration to end inspiration, the authors investigated effort normalization strategies using other intermediate inspiration phases based on the principles of equivalent tidal volume (ETV) and equivalent lung volume (ELV). Scatter plots and mean square error of the repeat ventilation maps and the Jacobian ratio map were generated for four conditions: no effort correction, global normalization, ETV, and ELV. In addition, a gamma pass rate was calculated from a modified gamma index evaluation between the two ventilation maps, using acceptance criteria of 2 mm distance-to-agreement and 5% ventilation difference. Results: The pattern of regional pulmonary ventilation changes as lung volume changes. All effort correction strategies improved reproducibility when changes in respiratory effort were greater than 150 cc (p < 0.005 with regard to the gamma pass rate). Improvement of reproducibility was correlated with the respiratory effort difference (R = 0.744 for ELV in the cohort with tidal volume difference greater than 100 cc). In general for all subjects, global normalization, ETV, and ELV significantly improved reproducibility compared to no effort correction (p = 0.009, 0.002, 0.005, respectively). When the tidal volume difference was small (less than 100 cc), none of the three effort correction strategies improved reproducibility significantly (p = 0.52, 0.46, 0.46, respectively). For the cohort (N = 13) with tidal volume difference greater than 100 cc, the average gamma pass rate improved from 57.3% before correction to 66.3% after global normalization and 76.3% after ELV. ELV was found to be significantly better than global normalization (p = 0.04 for all subjects, and p = 0.003 for the cohort with tidal volume difference greater than 100 cc). Conclusions: All effort correction strategies improve the reproducibility of the authors' pulmonary ventilation measures, and the improvement of reproducibility is highly correlated with the changes in respiratory effort. ELV gives better results as the effort difference increases, followed by ETV, then global normalization. However, given the spatial and temporal heterogeneity of the lung expansion rate, a single scaling factor (e.g., global normalization) appears to be less accurate for correcting the ventilation map when changes in respiratory effort are large.
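
    A minimal sketch of the simplest strategy compared above (global normalization), assuming Jacobian maps and a lung mask as NumPy arrays; the ETV/ELV intermediate-phase schemes and the gamma evaluation are not reproduced here.

    ```python
    # Scale one Jacobian ventilation map so both scans represent the same total volume change.
    import numpy as np

    def global_normalization(jac1: np.ndarray, jac2: np.ndarray, lung_mask: np.ndarray):
        """jac1, jac2: Jacobian maps (end expiration -> inspiration); lung_mask: boolean array."""
        tv1 = np.sum(jac1[lung_mask] - 1.0)   # proxy for tidal volume of scan 1 (voxel units)
        tv2 = np.sum(jac2[lung_mask] - 1.0)   # proxy for tidal volume of scan 2
        scale = tv1 / tv2                     # single global scaling factor
        return 1.0 + scale * (jac2 - 1.0)     # effort-corrected second ventilation map
    ```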

  18. Detection And Mapping (DAM) package. Volume 4B: Software System Manual, part 2

    NASA Technical Reports Server (NTRS)

    Schlosser, E. H.

    1980-01-01

    Computer programs, graphic devices, and an integrated set of manual procedures designed for efficient production of precisely registered and formatted maps from digital data are presented. The software can be used on any Univac 1100 series computer. The software includes pre-defined spectral limits for use in classifying and mapping surface water for LANDSAT-1, LANDSAT-2, and LANDSAT-3.

  19. Production and Distribution of NASA MODIS Remote Sensing Products

    NASA Technical Reports Server (NTRS)

    Wolfe, Robert

    2007-01-01

    The two Moderate Resolution Imaging Spectroradiometer (MODIS) instruments on-board NASA's Earth Observing System (EOS) Terra and Aqua satellites make key measurements for understanding the Earth's terrestrial ecosystems. Global time-series of terrestrial geophysical parameters have been produced from MODIS/Terra for over 7 years and from MODIS/Aqua for more than 4 1/2 years. These well-calibrated instruments, a team of scientists, and a large data production, archive, and distribution system have allowed the development of a new suite of high-quality product variables at spatial resolutions as fine as 250 m in support of global change research and natural resource applications. This talk describes the MODIS Science Team's products, with a focus on the terrestrial (land) products, the data processing approach, and the process for monitoring and improving product quality. The original MODIS science team was formed in 1989. The team's primary role is the development and implementation of the geophysical algorithms. In addition, the team provided feedback on the design and pre-launch testing of the instrument and helped guide the development of the data processing system. The key challenges the science team dealt with before launch were developing algorithms for a new instrument and providing guidance for the large and complex multi-discipline processing system. The Land, Ocean, and Atmosphere discipline teams drove the processing system requirements, particularly in the area of the processing loads and volumes needed to produce daily geophysical maps of the Earth at resolutions as fine as 250 m. The processing system had to handle a large number of data products, large data volumes and processing loads, and complex processing requirements. Prior to MODIS, daily global maps from heritage instruments, such as the Advanced Very High Resolution Radiometer (AVHRR), were not produced at resolutions finer than 5 km. The processing solution evolved into a combination of processing the lower-level (Level 1) products and the higher-level discipline-specific Land and Atmosphere products in the MODIS Science Investigator-led Processing System (SIPS), the MODIS Adaptive Processing System (MODAPS), and archiving and distributing the Land products to the user community through two of NASA's EOS Distributed Active Archive Centers (DAACs). Recently, a part of MODAPS, the Level 1 and Atmosphere Archive and Distribution System (LAADS), took over the role of archiving and distributing the Level 1 and Atmosphere products to the user community.

  20. The national tree-list layer

    Treesearch

    Stacy A. Drury; Jason M. Herynk

    2011-01-01

    The National Tree-List Layer (NTLL) project used LANDFIRE map products to produce the first national tree-list map layer that represents tree populations at stand and regional levels. The NTLL was produced in a short time frame to address the needs of Fire and Aviation Management for a map layer that could be used as input for simulating fire-caused tree mortality...

  1. Volcanism on Io: Results from Global Geologic Mapping

    NASA Technical Reports Server (NTRS)

    Williams, David A.; Keszthelyi, L. P.; Crown, D. A.; Geissler, P. E.; Schenk, P. M.; Yff, Jessica; Jaeger, W. L.

    2010-01-01

    We have completed a new 1:15,000,000 global geologic map of Jupiter's volcanic moon, Io, based on a set of 1 km/pixel combined Galileo-Voyager mosaics produced by the U.S. Geological Survey. The map was produced over the last three years using ArcGIS(TM) software, and has undergone peer review. Here we report some of the key results from our global mapping efforts, and how these results relate to questions regarding the volcano-tectonic evolution of Io.

  2. Developing prehospital clinical practice guidelines for resource limited settings: why re-invent the wheel?

    PubMed

    McCaul, Michael; de Waal, Ben; Hodkinson, Peter; Pigoga, Jennifer L; Young, Taryn; Wallis, Lee A

    2018-02-05

    Methods for developing new (de novo) clinical practice guidelines (CPGs) have received substantial attention. However, the volume of literature is not matched by research into alternative methods of CPG development using existing CPG documents, a specific issue for guideline development groups in low- and middle-income countries. We report on how we developed a context-specific prehospital CPG using an alternative guideline development method. Difficulties experienced and lessons learnt in applying existing global guidelines' recommendations to a national context are highlighted. The project produced the first emergency care CPG for prehospital providers in Africa. It included more than 270 CPGs and produced over 1000 recommendations for prehospital emergency care. We encountered various difficulties, including (1) applicability issues: few prehospital CPGs applicable to Africa, (2) evidence synthesis: heterogeneous levels of evidence classifications, and (3) guideline quality. Learning points included (1) focusing on key CPGs and evidence mapping, (2) searching other resources for CPGs, (3) broad representation on CPG advisory boards, and (4) transparency and knowledge translation. Re-inventing the wheel to produce CPGs is not always feasible. We hope this paper will encourage further projects to use existing CPGs in developing guidance to improve patient care in resource-limited settings.

  3. Comparison of Aerosol Volume Size Distributions between Surface and Ground-based Remote Sensing Measurements Downwind of Seoul, Korea during MAPS-Seoul

    NASA Astrophysics Data System (ADS)

    Kim, P.; Choi, Y.; Ghim, Y. S.

    2016-12-01

    Both a sunphotometer (Cimel, CE-318) and a skyradiometer (Prede, POM-02) were operated in May 2015 as part of the Megacity Air Pollution Studies-Seoul (MAPS-Seoul) campaign. These instruments were collocated at the Hankuk University of Foreign Studies (Hankuk_UFS) site of the AErosol RObotic NETwork (AERONET) and the Yongin (YGN) site of the SKYradiometer NETwork (SKYNET). The aerosol volume size distribution at the surface was measured using a wide-range aerosol spectrometer (WRAS) system consisting of a scanning mobility particle sizer (Grimm, Model 5.416; 45 bins, 0.01-1.09 μm) and an optical particle counter (Grimm, Model 1.109; 31 bins, 0.27-34 μm). The measurement site (37.34°N, 127.27°E, 167 m above sea level) is located about 35 km southeast of downtown Seoul. To investigate the discrepancies in volume concentrations, effective diameters, and fine-mode volume fractions, we compared the volume size distributions from the sunphotometer, skyradiometer, and WRAS system when the measurement times coincided within 5 minutes, considering that the measurement intervals differed between instruments.

  4. Regulatory T cells in cattle and their potential role in bovine paratuberculosis.

    PubMed

    Coussens, Paul M; Sipkovsky, Sue; Murphy, Brooke; Roussey, Jon; Colvin, Christopher J

    2012-05-01

    The intracellular bacterium Mycobacterium avium subspecies paratuberculosis (MAP) causes Johne's disease in wild and domestic ruminants. Johne's disease presents as a chronic enteritis with severe inflammation of intestinal tissues, characterized by widespread infiltration of macrophages, the target cell of MAP. Clinical signs of Johne's disease are typically accompanied by a loss of peripheral CD4+ T cell responses to MAP antigens and an increase in anti-MAP serum IgG levels. Recently, it was proposed that regulatory T cells might develop over the lengthy course of subclinical MAP infection. In the past five years, significant progress in defining bovine regulatory T cells has been made. These studies grew out of observations that IL-10 is produced by PBMCs in response to MAP antigen stimulation and that neutralization of this IL-10 could enhance IFN-γ production from MAP-antigen reactive effector T cells. Depletion studies revealed that MAP responsive cell populations producing IL-10 were largely CD4+ and CD25+, although monocytes have also been shown to produce IL-10 in response to MAP. In addition, evidence for a regulatory population of γδ T cells has also begun to accumulate. We summarize current thinking regarding regulatory T cells in MAP infection and provide data suggesting a potential link between regulatory T cells, bovine leukemia virus, and MAP. Copyright © 2012 Elsevier Ltd. All rights reserved.

  5. Multifrequency Aperture-Synthesizing Microwave Radiometer System (MFASMR). Volume 2: Appendix

    NASA Technical Reports Server (NTRS)

    Wiley, C. A.; Chang, M. U.

    1981-01-01

    A number of topics supporting the systems analysis of a multifrequency aperture-synthesizing microwave radiometer system are discussed. Fellgett's (multiple) advantage, interferometer mapping behavior, mapping geometry, image processing programs, and sampling errors are among the topics discussed. A FORTRAN program code is given.

  6. Repeatability and sensitivity of high resolution blood volume mapping in mouse kidney disease.

    PubMed

    Wang, Feng; Jiang, Rosie T; Tantawy, Mohammed Noor; Borza, Dorin B; Takahashi, Keiko; Gore, John C; Harris, Raymond C; Takahashi, Takamune; Quarles, C Chad

    2014-04-01

    To evaluate the repeatability of MRI-derived relative blood volume (RBV) measurements in mouse kidneys across subjects and days and to evaluate sensitivity of this approach to renal pathology. A 7 Tesla MRI system and an intravascular iron-oxide contrast agent were used to acquire spin-echo-based renal RBV maps in 10 healthy mice on 2 consecutive days. Renal RBV maps were also acquired in the Alport and unilateral ureteral obstruction mouse models of renal disease. The average renal RBV measured on consecutive days was 19.97 ± 1.50 and 19.86 ± 1.62, yielding a concordance correlation coefficient of 0.94, indicating that this approach is highly repeatable. In the disease models, the RBV values were regionally dissimilar and substantially lower than those found in control mice. In vivo renal iron-oxide-based RBV mapping in mice complements the physiological information obtained from conventional assays of kidney function and could shed new insights into the pathological mechanisms of kidney disease. Copyright © 2013 Wiley Periodicals, Inc.
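
    A short sketch of the repeatability statistic quoted above (Lin's concordance correlation coefficient between day-1 and day-2 RBV values); array names are placeholders.

    ```python
    import numpy as np

    def concordance_correlation(x: np.ndarray, y: np.ndarray) -> float:
        """Lin's concordance correlation coefficient between two paired measurement sets."""
        mx, my = x.mean(), y.mean()
        vx, vy = x.var(), y.var()                    # population variances
        cov = ((x - mx) * (y - my)).mean()
        return 2.0 * cov / (vx + vy + (mx - my) ** 2)

    # e.g. concordance_correlation(rbv_day1, rbv_day2) was about 0.94 for the healthy cohort
    ```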

  7. An Efficient Method of Sharing Mass Spatio-Temporal Trajectory Data Based on Cloudera Impala for Traffic Distribution Mapping in an Urban City

    PubMed Central

    Zhou, Lianjie; Chen, Nengcheng; Yuan, Sai; Chen, Zeqiang

    2016-01-01

    The efficient sharing of spatio-temporal trajectory data is important for understanding traffic congestion from mass data. However, the data volumes of bus networks in urban cities are growing rapidly, reaching daily volumes of one hundred million datapoints. Accessing and retrieving mass spatio-temporal trajectory data in any field is hard and inefficient due to limited computational capabilities and incomplete data organization mechanisms. Therefore, we propose an optimized and efficient spatio-temporal trajectory data retrieval method based on the Cloudera Impala query engine, called ESTRI, to enhance the efficiency of mass data sharing. As an excellent query tool for mass data, Impala can be applied for mass spatio-temporal trajectory data sharing. In ESTRI we extend the spatio-temporal trajectory data retrieval function of Impala and design a suitable data partitioning method. In our experiments, the Taiyuan BeiDou (BD) bus network is selected, containing 2300 buses with BD positioning sensors and producing 20 million records every day, presenting the two difficulties described in the introduction. In addition, ESTRI and MongoDB are applied in experiments. The experiments show that ESTRI achieves the most efficient data retrieval compared to retrieval using MongoDB for data volumes of fifty million, one hundred million, one hundred and fifty million, and two hundred million records. The performance of ESTRI is approximately seven times higher than that of MongoDB. The experiments show that ESTRI is an effective method for retrieving mass spatio-temporal trajectory data. Finally, bus distribution mapping in Taiyuan city is achieved, describing bus density in different regions at different times throughout the day, which can be applied in future studies of transport, such as traffic scheduling, traffic planning, and traffic behavior management in intelligent public transportation systems.

  8. Three-dimensional characterization of ODS ferritic steel by the FIB-SEM serial sectioning method.

    PubMed

    Endo, T; Sugino, Y; Ohono, N; Ukai, S; Miyazaki, N; Wang, Y; Ohnuki, S

    2014-11-01

    Considerable attention has been paid to electron tomography because of its ability to determine the three-dimensional (3D) structure of materials [1]. Among electron tomography techniques, focused ion beam/scanning electron microscopy (FIB-SEM) imaging has the advantages of high resolution (10 nm), large-area observation (μm order), and simultaneous energy-dispersive x-ray microanalysis (EDS)/electron backscatter diffraction (EBSD) analysis. In this study, three-dimensional EBSD analysis of cold-worked ODS ferritic steel was conducted using FIB-SEM equipment, with the aim of analyzing the resulting microstructure. Zone annealing tests were conducted on ODS ferritic steel [2,3] produced through mechanical alloying and hot extrusion. After zone annealing, specimens were mechanically polished with #400∼4000 emery paper, 1 µm diamond paste, and alumina colloidal silica. Serial sectioning and 3D electron backscattering diffraction (3D-EBSD) analysis were then carried out. A micro-pillar (30 x 30 x 15 µm) was prepared. EBSD measurements were carried out on each layer after serial sectioning, with a step size and milling depth of 80 nm over 30 slices. After EBSD analysis, the series of cross-sectional images was aligned according to arbitrarily specified areas and then stacked to form a volume. Consequently, we obtained 3D inverse pole figure (IPF) maps for the ODS ferritic steel. In this specimen, the {111} and {001} grains are layered by turns, and the volume fractions of both planes are similar. The aspect ratio increases with specimen depth. 3D-EBSD mapping is useful for analyzing bulk material because it provides much microstructural information, such as the shape, volume, and orientation of crystals and grain boundaries. © The Author 2014. Published by Oxford University Press on behalf of The Japanese Society of Microscopy. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  9. Neural networks for satellite remote sensing and robotic sensor interpretation

    NASA Astrophysics Data System (ADS)

    Martens, Siegfried

    Remote sensing of forests and robotic sensor fusion can be viewed, in part, as supervised learning problems, mapping from sensory input to perceptual output. This dissertation develops ARTMAP neural networks for real-time category learning, pattern recognition, and prediction tailored to remote sensing and robotics applications. Three studies are presented. The first two use ARTMAP to create maps from remotely sensed data, while the third uses an ARTMAP system for sensor fusion on a mobile robot. The first study uses ARTMAP to predict vegetation mixtures in the Plumas National Forest based on spectral data from the Landsat Thematic Mapper satellite. While most previous ARTMAP systems have predicted discrete output classes, this project develops new capabilities for multi-valued prediction. On the mixture prediction task, the new network is shown to perform better than maximum likelihood and linear mixture models. The second remote sensing study uses an ARTMAP classification system to evaluate the relative importance of spectral and terrain data for map-making. This project has produced a large-scale map of remotely sensed vegetation in the Sierra National Forest. Network predictions are validated with ground truth data, and maps produced using the ARTMAP system are compared to a map produced by human experts. The ARTMAP Sierra map was generated in an afternoon, while the labor intensive expert method required nearly a year to perform the same task. The robotics research uses an ARTMAP system to integrate visual information and ultrasonic sensory information on a B14 mobile robot. The goal is to produce a more accurate measure of distance than is provided by the raw sensors. ARTMAP effectively combines sensory sources both within and between modalities. The improved distance percept is used to produce occupancy grid visualizations of the robot's environment. The maps produced point to specific problems of raw sensory information processing and demonstrate the benefits of using a neural network system for sensor fusion.
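
    To make the category-learning idea concrete, the toy fuzzy-ART categorizer below shows the kind of prototype learning that ARTMAP builds on; it is a simplified stand-in (no map field, no match tracking) and not the networks used in the dissertation. In ARTMAP proper, a map field links these categories to output classes and match tracking raises the vigilance when a prediction is wrong.

    ```python
    # Minimal fuzzy-ART sketch: complement coding, choice function, vigilance test, learning.
    import numpy as np

    class FuzzyART:
        def __init__(self, rho=0.75, alpha=0.001, beta=1.0):
            self.rho, self.alpha, self.beta = rho, alpha, beta
            self.w = []                                   # one weight vector per category

        def _code(self, x):
            return np.concatenate([x, 1.0 - x])           # complement coding, inputs in [0, 1]

        def train(self, x):
            i = self._code(np.asarray(x, dtype=float))
            scores = [np.minimum(i, w).sum() / (self.alpha + w.sum()) for w in self.w]
            for j in np.argsort(scores)[::-1]:            # best-matching category first
                match = np.minimum(i, self.w[j]).sum() / i.sum()
                if match >= self.rho:                     # vigilance test passed: learn
                    self.w[j] = self.beta * np.minimum(i, self.w[j]) + (1 - self.beta) * self.w[j]
                    return j
            self.w.append(i.copy())                       # otherwise create a new category
            return len(self.w) - 1
    ```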

  10. Integrating laser-range finding, electronic compass measurements and GPS to rapidly map vertical changes in volcanic stratigraphy and constrain unit thicknesses and volumes: two examples from the northern Cordilleran volcanic province

    NASA Astrophysics Data System (ADS)

    Nogier, M.; Edwards, B. R.; Wetherell, K.

    2005-12-01

    We present preliminary results of laser-range finding-GPS surveys from two separate locations in northern British Columbia, in the south-central northern Cordilleran volcanic province: Hoodoo Mountain volcano and Craven Lake cone. This technique, described in detail below, is appropriate for rapidly measuring changes in vertical thicknesses of units that would be difficult or impossible to measure by most other techniques. The ability to accurately measure thicknesses of geologic units in otherwise difficult-to-access locations will aid in generating better quantitative estimates of deposit geometries and eruption volumes. Such data are particularly important for constraining quantitative models of magma production and eruption dynamics. The deposits of interest in this study comprised at least partly inaccessible, largely pyroclastic units, although the technique could be used to map any vertical surfaces. The first field location was the northern side of Hoodoo Mountain volcano (56°47'23.72"N, 131°17'36.97"W, 1208 m asl), where a sequence of welded to unwelded, trachytic-phonolitic tephra was deposited in a paleovalley. This deposit is informally referred to as the Pointer Ridge deposit, and it comprises at least 7 distinct subunits. The horizontal limit of the exposures is approximately 1.5 km, and the vertical limit is approximately 250 m. Three different GPS base stations were used to map the lateral and vertical variations in the deposit. The second field location is north of Craven Lake (56°54'44.55"N, 129°21'42.17"W, 1453 m asl), along Craven Creek, where a sequence of basaltic tephra is overlain by pillow lava and glacial diamicton. This exposure is 200 m long and approximately 30 m high, much smaller than the area mapped at Hoodoo Mountain. The basaltic tephra appears to comprise 4 distinct sequences (measured thicknesses vary from 3-4 m), not including the overlying pillow lava (measured thickness varies from 2 to 10 m), and measurements of the sequences give average thicknesses of 5-10 m. The laser-range finding-GPS setup used in these studies comprises an IMPULSE LR laser-range finder, a MapStar Module II electronic compass, and a Trimble ProXL global positioning system (GPS) unit attached to a tripod specially designed to hold all three pieces of equipment. The three units communicate via a variety of cables. The maximum distance over which the laser can be used is 500 m (this distance varies for different lasers); the tripod was set at a relatively short distance from the exposure of interest at Craven Lake (95-115 m), but farther away at Hoodoo Mountain (up to 450 m). The range finder was used to 'shoot' the bottom and top contacts of each unit within the vertical faces. The distance and relative bearing were automatically transferred to the compass module and then to the GPS unit, producing a map of the vertical face with horizontal and vertical coordinates. Analysis of the data provides detailed estimates of unit thicknesses across the vertical faces. The data collected can be imported into ArcGIS as a SHAPE file and overlain on DEM models for the areas of interest. ArcGIS extensions such as Spatial Analyst and 3D Analyst can be used to estimate surface areas and volumes for units mapped with the laser-GPS setup.
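
    A minimal sketch of the geometry behind the setup described above: each "shot" (range, bearing, inclination) from a surveyed tripod position is converted to east/north/up offsets and added to the station coordinate. Instrument interfaces, declination, and datum handling are omitted.

    ```python
    import math

    def shot_to_enu(range_m, bearing_deg, inclination_deg, station_e, station_n, station_z):
        """bearing: clockwise from north; inclination: positive above horizontal."""
        horiz = range_m * math.cos(math.radians(inclination_deg))
        de = horiz * math.sin(math.radians(bearing_deg))        # east offset
        dn = horiz * math.cos(math.radians(bearing_deg))        # north offset
        dz = range_m * math.sin(math.radians(inclination_deg))  # vertical offset
        return station_e + de, station_n + dn, station_z + dz

    # a unit thickness then follows from the elevation difference between the shots at the
    # bottom and top contacts of that unit on the vertical face
    ```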

  11. A medical device-grade T1 and ECV phantom for global T1 mapping quality assurance: the T1 Mapping and ECV Standardization in cardiovascular magnetic resonance (T1MES) program.

    PubMed

    Captur, Gabriella; Gatehouse, Peter; Keenan, Kathryn E; Heslinga, Friso G; Bruehl, Ruediger; Prothmann, Marcel; Graves, Martin J; Eames, Richard J; Torlasco, Camilla; Benedetti, Giulia; Donovan, Jacqueline; Ittermann, Bernd; Boubertakh, Redha; Bathgate, Andrew; Royet, Celine; Pang, Wenjie; Nezafat, Reza; Salerno, Michael; Kellman, Peter; Moon, James C

    2016-09-22

    T1 mapping and extracellular volume (ECV) have the potential to guide patient care and serve as surrogate end-points in clinical trials, but measurements differ between cardiovascular magnetic resonance (CMR) scanners and pulse sequences. To help deliver T1 mapping to global clinical care, we developed a phantom-based quality assurance (QA) system for verification of measurement stability over time at individual sites, with further aims of generalization of results across sites, vendor systems, software versions, and imaging sequences. We thus created T1MES: the T1 Mapping and ECV Standardization Program. A design collaboration consisting of a specialist MRI small-medium enterprise, clinicians, physicists, and national metrology institutes was formed. A phantom was designed covering clinically relevant ranges of T1 and T2 in blood and myocardium, pre- and post-contrast, for 1.5 T and 3 T. Reproducible mass manufacture was established. The device received regulatory clearance by the Food and Drug Administration (FDA) and Conformité Européenne (CE) marking. The T1MES phantom is an agarose gel-based phantom using nickel chloride as the paramagnetic relaxation modifier. It was reproducibly specified and mass-produced with a rigorously repeatable process. Each phantom contains nine differently-doped agarose gel tubes embedded in a gel/beads matrix. Phantoms were free of air bubbles and susceptibility artifacts at both field strengths, and T1 maps were free from off-resonance artifacts. The incorporation of high-density polyethylene beads in the main gel fill was effective at flattening the B1 field. T1 and T2 values measured in T1MES showed coefficients of variation of 1% or less between repeat scans, indicating good short-term reproducibility. Temperature dependency experiments confirmed that over the range 15-30 °C the short-T1 tubes were more stable with temperature than the long-T1 tubes. A batch of 69 phantoms was mass-produced, with random sampling of ten of these showing coefficients of variation for T1 of 0.64 ± 0.45% and 0.49 ± 0.34% at 1.5 T and 3 T, respectively. The T1MES program has developed a T1 mapping phantom to CE/FDA manufacturing standards. An initial 69 phantoms with a multi-vendor user manual are now being scanned fortnightly in centers worldwide. Future results will explore T1 mapping sequences, platform performance, stability, and the potential for standardization.

  12. The Effective Dynamics of the Volume Preserving Mean Curvature Flow

    NASA Astrophysics Data System (ADS)

    Chenn, Ilias; Fournodavlos, G.; Sigal, I. M.

    2018-04-01

    We consider the dynamics of small closed submanifolds ('bubbles') under the volume preserving mean curvature flow. We construct a map from (n+1)-dimensional Euclidean space into a given (n+1)-dimensional Riemannian manifold which characterizes the existence, stability and dynamics of constant mean curvature submanifolds. This is done in terms of a reduced area function on the Euclidean space, which is given constructively and can be computed perturbatively. This allows us to derive adiabatic and effective dynamics of the bubbles. The results can be mapped by rescaling to the dynamics of fixed-size bubbles in almost Euclidean Riemannian manifolds.
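
    For reference, a standard statement of the flow studied here (the usual definition, given as background rather than quoted from the paper): a closed hypersurface Σ_t with outward unit normal ν and mean curvature H evolves by

    ```latex
    \partial_t x \;=\; \big(\bar{H}(t) - H(x,t)\big)\,\nu(x,t),
    \qquad
    \bar{H}(t) \;=\; \frac{1}{|\Sigma_t|}\int_{\Sigma_t} H \,\mathrm{d}A ,
    ```

    so the enclosed volume is constant in time while the area is non-increasing; constant mean curvature submanifolds (H ≡ H̄) are exactly the equilibria whose existence, stability, and dynamics the reduced area function encodes.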

  13. Split-step eigenvector-following technique for exploring enthalpy landscapes at absolute zero.

    PubMed

    Mauro, John C; Loucks, Roger J; Balakrishnan, Jitendra

    2006-03-16

    The mapping of enthalpy landscapes is complicated by the coupling of particle position and volume coordinates. To address this issue, we have developed a new split-step eigenvector-following technique for locating minima and transition points in an enthalpy landscape at absolute zero. Each iteration is split into two steps in order to independently vary system volume and relative atomic coordinates. A separate Lagrange multiplier is used for each eigendirection in order to provide maximum flexibility in determining step sizes. This technique will be useful for mapping the enthalpy landscapes of bulk systems such as supercooled liquids and glasses.
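
    As background on the quantity being mapped (standard zero-temperature thermodynamics, not a formula quoted from the paper), the enthalpy landscape explored by the split-step scheme is

    ```latex
    H(\mathbf{r}, V) \;=\; U(\mathbf{r}, V) + P\,V ,
    \qquad
    \frac{\partial H}{\partial V} \;=\; \frac{\partial U}{\partial V} + P ,
    \qquad
    \nabla_{\mathbf{r}} H \;=\; \nabla_{\mathbf{r}} U ,
    ```

    and the split step alternates an update of the volume V driven by ∂H/∂V with an eigenvector-following update of the relative atomic coordinates r at fixed V, which is what decouples the two sets of coordinates.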

  14. Harry S. Truman Dam and Reservoir, Missouri, Mitigation of the Adverse Effects Upon the Local Paleontological Resources. Volume 2. Plan Maps of Excavations.

    DTIC Science & Technology

    1983-01-01

    Report AD-A147 693: Mitigation of the Adverse Effects upon the Local Paleontological Resources of the Harry S. Truman Dam and Reservoir, Osage River Basin, Missouri. Final Report of Findings, Volume II: Plan Maps of Excavations, by Jeffrey J. Saunders, Ph.D., Illinois State Museum Society, Springfield.

  15. Bathymetric contour maps of lakes surveyed in Iowa in 2004

    USGS Publications Warehouse

    Linhart, S. Mike; Lund, Kris D.

    2006-01-01

    Bathymetric data were collected using a boat-mounted, differential global positioning system, echo depth-sounding equipment, and computer software. Data were processed with commercial hydrographic software and exported into a geographic information system for mapping and calculating area and volume. Lake volume estimates ranged from 83,924,000 cubic feet (1,930 acre-feet) at Lake Darling to 5,967,000 cubic feet (140 acre-feet) at Upper Gar Lake. Surface area estimates ranged from 10,660,000 square feet (240 acres) at Lake Darling to 1,557,000 square feet (36 acres) at Upper Gar Lake.
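
    A minimal sketch of the area/volume computation that follows once the bathymetric soundings have been gridded in a GIS; the grid layout and no-data convention are assumptions, not the USGS workflow.

    ```python
    # Integrate water depth over grid cells to obtain surface area and volume.
    import numpy as np

    def lake_area_volume(depth_grid: np.ndarray, cell_size_ft: float):
        """depth_grid: water depth in feet at each grid cell (NaN outside the lake)."""
        wet = np.isfinite(depth_grid) & (depth_grid > 0)
        cell_area = cell_size_ft ** 2
        area_sqft = wet.sum() * cell_area                             # lake surface area
        volume_cuft = float(np.sum(np.where(wet, depth_grid, 0.0))) * cell_area
        return area_sqft, volume_cuft

    # dividing volume_cuft by 43,560 converts cubic feet to the acre-feet reported above
    ```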

  16. Bathymetric contour maps for lakes surveyed in Iowa in 2003

    USGS Publications Warehouse

    Linhart, S. Mike; Lund, Kris D.

    2006-01-01

    Bathymetric data were collected using boat-mounted, differential global positioning system (GPS) equipment, echo depth-sounding equipment, and computer software. Data were processed with commercial hydrographic software and exported into a geographic information system (GIS) for mapping and calculation of area and volume. Lake volume estimates ranged from 590,501,000 cubic feet (13,600 acre-feet) at Lake Macbride to 17,831,000 cubic feet (410 acre-feet) at Lake Meyer. Surface area estimates ranged from 38,118,000 square feet (875 acres) at Lake Macbride to 1,373,000 square feet (32 acres) at Lake Meyer.

  17. Comparison of manually produced and automated cross country movement maps using digital image processing techniques

    NASA Technical Reports Server (NTRS)

    Wynn, L. K.

    1985-01-01

    The Image-Based Information System (IBIS) was used to automate the cross country movement (CCM) mapping model developed by the Defense Mapping Agency (DMA). Existing terrain factor overlays and a CCM map, produced by DMA for the Fort Lewis, Washington area, were digitized and reformatted into geometrically registered images. Terrain factor data from Slope, Soils, and Vegetation overlays were entered into IBIS, and were then combined utilizing IBIS-programmed equations to implement the DMA CCM model. The resulting IBIS-generated CCM map was then compared with the digitized manually produced map to test similarity. The numbers of pixels comprising each CCM region were compared between the two map images, and the percent agreement between each pair of regional counts was computed. The mean percent agreement equalled 86.21%, with an areally weighted standard deviation of 11.11%. Calculation of Pearson's correlation coefficient yielded +0.9997. In some cases, the IBIS-calculated map code differed from the DMA codes: analysis revealed that IBIS had calculated the codes correctly. These highly positive results demonstrate the power and accuracy of IBIS in automating models which synthesize a variety of thematic geographic data.
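
    A small sketch of the comparison statistic described above, assuming the two CCM maps are co-registered integer rasters; "percent agreement" is interpreted here as the ratio of the smaller to the larger regional pixel count, which is an assumption about the original metric.

    ```python
    import numpy as np

    def per_class_agreement(map_a: np.ndarray, map_b: np.ndarray):
        """map_a, map_b: class-code rasters of identical shape; returns percent agreement per class."""
        classes = np.union1d(np.unique(map_a), np.unique(map_b))
        results = {}
        for c in classes:
            count_a = int((map_a == c).sum())
            count_b = int((map_b == c).sum())
            results[int(c)] = 100.0 * min(count_a, count_b) / max(count_a, count_b)
        return results
    ```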

  18. Use of generalized linear models and digital data in a forest inventory of Northern Utah

    USGS Publications Warehouse

    Moisen, Gretchen G.; Edwards, Thomas C.

    1999-01-01

    Forest inventories, like those conducted by the Forest Service's Forest Inventory and Analysis Program (FIA) in the Rocky Mountain Region, are under increased pressure to produce better information at reduced costs. Here we describe our efforts in Utah to merge satellite-based information with forest inventory data for the purposes of reducing the costs of estimates of forest population totals and providing spatial depiction of forest resources. We illustrate how generalized linear models can be used to construct approximately unbiased and efficient estimates of population totals while providing a mechanism for prediction in space for mapping of forest structure. We model forest type and timber volume of five tree species groups as functions of a variety of predictor variables in the northern Utah mountains. Predictor variables include elevation, aspect, slope, geographic coordinates, as well as vegetation cover types based on satellite data from both the Advanced Very High Resolution Radiometer (AVHRR) and Thematic Mapper (TM) platforms. We examine the relative precision of estimates of area by forest type and mean cubic-foot volumes under six different models, including the traditional double sampling for stratification strategy. Only very small gains in precision were realized through the use of expensive photointerpreted or TM-based data for stratification, while models based on topography and spatial coordinates alone were competitive. We also compare the predictive capability of the models through various map accuracy measures. The models including the TM-based vegetation performed best overall, while topography and spatial coordinates alone provided substantial information at very low cost.
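
    A hedged sketch of the modelling idea (not the authors' exact specification): a binomial GLM for forest/non-forest presence and a log-link gamma GLM for positive volumes, with topographic and locational predictors. It assumes statsmodels and made-up column names.

    ```python
    import pandas as pd
    import statsmodels.api as sm

    PREDICTORS = ["elevation", "aspect", "slope", "easting", "northing"]

    def fit_models(plots: pd.DataFrame):
        X = sm.add_constant(plots[PREDICTORS])

        # presence/absence of a forest type: binomial family, logit link (default)
        forest_model = sm.GLM(plots["is_forest"], X, family=sm.families.Binomial()).fit()

        # strictly positive cubic-foot volumes: gamma family with a log link
        vol = plots.loc[plots["volume_cuft"] > 0]
        Xv = sm.add_constant(vol[PREDICTORS])
        volume_model = sm.GLM(vol["volume_cuft"], Xv,
                              family=sm.families.Gamma(link=sm.families.links.Log())).fit()
        return forest_model, volume_model
    ```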

  19. Jet-torus connection in radio galaxies. Relativistic hydrodynamics and synthetic emission

    NASA Astrophysics Data System (ADS)

    Fromm, C. M.; Perucho, M.; Porth, O.; Younsi, Z.; Ros, E.; Mizuno, Y.; Zensus, J. A.; Rezzolla, L.

    2018-01-01

    Context. High resolution very long baseline interferometry observations of active galactic nuclei have revealed asymmetric structures in the jets of radio galaxies. These asymmetric structures may be due to internal asymmetries in the jets or they may be induced by the different conditions in the surrounding ambient medium, including the obscuring torus, or a combination of the two. Aims: In this paper we investigate the influence of the ambient medium, including the obscuring torus, on the observed properties of jets from radio galaxies. Methods: We performed special-relativistic hydrodynamic (SRHD) simulations of over-pressured and pressure-matched jets using the special-relativistic hydrodynamics code Ratpenat, which is based on a second-order accurate finite-volume method and an approximate Riemann solver. Using a newly developed radiative transfer code to compute the electromagnetic radiation, we modelled several jets embedded in various ambient medium and torus configurations and subsequently computed the non-thermal emission produced by the jet and thermal absorption from the torus. To better compare the emission simulations with observations we produced synthetic radio maps, taking into account the properties of the observatory. Results: The detailed analysis of our simulations shows that the observed properties such as core shift could be used to distinguish between over-pressured and pressure matched jets. In addition to the properties of the jets, insights into the extent and density of the obscuring torus can be obtained from analyses of the single-dish spectrum and spectral index maps.

  20. Black-blood native T1 mapping: Blood signal suppression for reduced partial voluming in the myocardium.

    PubMed

    Weingärtner, Sebastian; Meßner, Nadja M; Zöllner, Frank G; Akçakaya, Mehmet; Schad, Lothar R

    2017-08-01

    To study the feasibility of black-blood contrast in native T1 mapping for reduction of partial voluming at the blood-myocardium interface. A saturation pulse prepared heart-rate-independent inversion recovery (SAPPHIRE) T1 mapping sequence was combined with motion-sensitized driven-equilibrium (MSDE) blood suppression for black-blood T1 mapping at 3 Tesla. Phantom scans were performed to assess the T1 time accuracy. In vivo black-blood and conventional SAPPHIRE T1 mapping was performed in eight healthy subjects and analyzed for T1 times, precision, and inter- and intraobserver variability. Furthermore, manually drawn regions of interest (ROIs) in all T1 maps were dilated and eroded to analyze the dependence of septal T1 times on the ROI thickness. Phantom results and in vivo myocardial T1 times show comparable accuracy with black-blood compared to conventional SAPPHIRE (in vivo: black-blood: 1562 ± 56 ms vs. conventional: 1583 ± 58 ms, P = 0.20). Using black-blood SAPPHIRE, precision was significantly lower (standard deviation: 133.9 ± 24.6 ms vs. 63.1 ± 6.4 ms, P < 0.0001), and blood T1 time measurement was not possible. A significantly increased interobserver intraclass correlation coefficient (ICC) (0.996 vs. 0.967, P = 0.011) and a similar intraobserver ICC (0.979 vs. 0.939, P = 0.11) were obtained with the black-blood sequence. Conventional SAPPHIRE showed a strong dependence on the ROI thickness (R² = 0.99). No such trend was observed using the black-blood approach (R² = 0.29). Black-blood SAPPHIRE successfully eliminates partial voluming at the blood pool in native myocardial T1 mapping while providing accurate T1 times, albeit at a reduced precision. Magn Reson Med 78:484-493, 2017. © 2016 International Society for Magnetic Resonance in Medicine.

  1. The Past 20,000 Years of Plinian Explosive Activity at Mt Pelée Volcano (Lesser Antilles)

    NASA Astrophysics Data System (ADS)

    Carazzo, G.; Michaud-Dubuy, A.; Kaminski, E. C.; Tait, S.

    2017-12-01

    Major volcanic hazards in the Lesser Antilles arc include powerful Plinian explosive eruptions that inject ash into the atmosphere and produce dangerous pyroclastic density currents (PDC) on the ground. Reconstructions of past eruptive activity based on stratigraphic records are crucial to assessing specific hazards in this region, where large eruptions do not occur frequently. The present study focuses on the dynamics of the last Plinian eruptions of Mount Pelée volcano in Martinique. Previous field-based studies identified 6 major Plinian eruptions over the past 5,000 years, but limited information on their dynamics exists, except for the most recent one, dated at AD 1300. Based on a new comprehensive field study and physical models of volcanic plumes, we greatly improve our knowledge of the number of Plinian eruptions that occurred in Martinique over the past 20,000 years. We also provide a detailed reconstruction of important eruptive parameters such as mass eruption rates, maximum column heights, volumes, and impacted areas. Among the 6 Plinian eruptions newly identified during our field campaign, one is found to have produced voluminous pyroclastic density currents that reached the sea and partially rose as a co-PDC plume above a region that is beyond the existing hazard map. The estimated mass eruption rates for the 12 Plinian eruptions identified over the last 20,000 years range from 10⁷ to 10⁸ kg/s, producing 15-30 km high Plinian columns, initially stable but ultimately collapsing and forming PDC. Empirical models of deposit thinning suggest that the minimum volume of pyroclastic deposits systematically ranges between 0.1 and 1 km³, corresponding to VEI 4 to 5 events. Archaeological evidence suggests that the impact of several eruptions forced the first Caribbean inhabitants to flee to other islands for decades.

  2. GPR as a Low Impact Paleontogical Survey Technique

    NASA Astrophysics Data System (ADS)

    Sturdevant, G. C.; Leverence, R.; Stewart, R.

    2013-12-01

    The Deweyville Formation, a Pleistocene fluvial sandstone, is a prolific source of megafaunal fossils from periods of low-stand environmental conditions. GPR was employed in an environmentally sensitive area in close proximity to a salt dome in Northwest Harris County, Texas, as a method of evaluating the probable paleo-depositional environment and to prospect for potential further site development of two distinct fossiliferous zones. The primary zone of interest is a lag-gravel-bounded sand responsible for producing a regionally unique fossil assemblage including South American megafauna (Lundelius et al., 2013). The secondary zone of interest contains undisturbed mammoth remains housed in coarse white sand emplaced on top of a clay drape, which has been hypothesized to represent an oxbow lake formed by the meandering paleo-Brazos River. With an accurate map of the paleo-channel, planning of future activity can focus on maximizing fossil recovery and minimizing site impact. A Pulse EKKO system with 250 MHz, 400 MHz, and 1 GHz antennas was employed in a prospect area proximal to the secondary site to calibrate and evaluate these systems for their resolution and penetration depth in the modern sediments. The data were processed using the EKKO Mapper and EKKO View Deluxe software packages, and 3D volumes were produced and sliced. Preliminary results from the 250 MHz antenna demonstrate successful imaging of the sand-clay interface. After these surveys were run, a small portion of the site was excavated to confirm the estimated velocities and the observed anomalies, refine our modeling and interpretation, and improve grid design for further surveys. It was confirmed that the sand-clay interface was easily observable using GPR; however, the grid spacing proved to be too wide, leading to artifacts in the 3D volume produced.

  3. Fluence map optimization (FMO) with dose-volume constraints in IMRT using the geometric distance sorting method.

    PubMed

    Lan, Yihua; Li, Cunhua; Ren, Haozheng; Zhang, Yong; Min, Zhifang

    2012-10-21

    A new heuristic algorithm based on the so-called geometric distance sorting technique is proposed for solving the fluence map optimization problem with dose-volume constraints, one of the most essential tasks in inverse planning for IMRT. The framework of the proposed method is basically an iterative process which begins with a simple linearly constrained quadratic optimization model without considering any dose-volume constraints; dose constraints for the voxels violating the dose-volume constraints are then gradually added into the quadratic optimization model, step by step, until all the dose-volume constraints are satisfied. In each iteration step, an interior point method is adopted to solve each new linearly constrained quadratic program. For choosing the proper candidate voxels for the current constraint-adding step, a so-called geometric distance, defined in the transformed standard quadratic form of the fluence map optimization model, is used to guide the selection of the voxels. The new geometric distance sorting technique can greatly reduce the unexpected increase in the objective function value inevitably caused by constraint adding, and can be regarded as an upgrade of the traditional dose sorting technique. A geometric explanation of the proposed method is also given, and a proposition is proved to support our heuristic idea. In addition, a smart constraint adding/deleting strategy is designed to ensure stable iteration convergence. The new algorithm is tested on four cases, including a head-and-neck, a prostate, a lung, and an oropharyngeal case, and compared with the algorithm based on the traditional dose sorting technique. Experimental results showed that the proposed method is more suitable for guiding the selection of new constraints than the traditional dose sorting method, especially for cases whose target regions have non-convex shapes, and is to some extent a more efficient technique for choosing constraints. By integrating a smart constraint adding/deleting scheme within the iteration framework, the new technique builds up an improved algorithm for solving the fluence map optimization problem with dose-volume constraints.
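
    A very small sketch of the iterative idea described above: start from a non-negative quadratic fit of the beamlet weights, then repeatedly add hard dose caps for a few voxels violating a dose-volume limit and re-solve. The geometric-distance ranking and the interior point solver of the paper are replaced here by a plain dose ranking and generic SciPy machinery, so this is an illustration of the constraint-adding loop only.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def fmo_with_dvc(A, d_presc, dvc_dose, dvc_frac, n_add=5, max_iter=20):
        """A: dose-influence matrix (voxels x beamlets); at most dvc_frac of the voxels
        may exceed dvc_dose. Returns a non-negative beamlet weight vector."""
        n_vox, n_beam = A.shape
        obj = lambda x: np.sum((A @ x - d_presc) ** 2)
        constrained = []                                    # voxels given hard dose caps
        x = np.zeros(n_beam)
        for _ in range(max_iter):
            cons = [{"type": "ineq", "fun": (lambda x, a=A[v]: dvc_dose - a @ x)}
                    for v in constrained]
            x = minimize(obj, x, method="SLSQP",
                         bounds=[(0, None)] * n_beam, constraints=cons).x
            dose = A @ x
            over = np.where(dose > dvc_dose)[0]
            if len(over) <= dvc_frac * n_vox:               # dose-volume constraint satisfied
                break
            candidates = np.setdiff1d(over, constrained)
            worst = candidates[np.argsort(dose[candidates])[::-1][:n_add]]
            constrained.extend(worst.tolist())              # the paper ranks candidates by
        return x                                            # geometric distance instead
    ```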

  4. Automated customized retrieval of radiotherapy data for clinical trials, audit and research.

    PubMed

    Romanchikova, Marina; Harrison, Karl; Burnet, Neil G; Hoole, Andrew Cf; Sutcliffe, Michael Pf; Parker, Michael Andrew; Jena, Rajesh; Thomas, Simon James

    2018-02-01

    To enable fast and customizable automated collection of radiotherapy (RT) data from tomotherapy storage. Human-readable data maps (TagMaps) were created to generate DICOM-RT (Digital Imaging and Communications in Medicine standard for Radiation Therapy) data from tomotherapy archives, and provided access to "hidden" information comprising delivery sinograms, positional corrections and adaptive-RT doses. 797 data sets totalling 25,000 scans were batch-exported in 31.5 h. All archived information was restored, including the data not available via commercial software. The exported data were DICOM-compliant and compatible with major commercial tools including RayStation, Pinnacle and ProSoma. The export ran without operator interventions. The TagMap method for DICOM-RT data modelling produced software that was many times faster than the vendor's solution, required minimal operator input and delivered high volumes of vendor-identical DICOM data. The approach is applicable to many clinical and research data processing scenarios and can be adapted to recover DICOM-RT data from other proprietary storage types such as Elekta, Pinnacle or ProSoma. Advances in knowledge: A novel method to translate data from proprietary storage to DICOM-RT is presented. It provides access to the data hidden in electronic archives, offers a working solution to the issues of data migration and vendor lock-in and paves the way for large-scale imaging and radiomics studies.

  5. Quantifying the water storage volume of major aquifers in the US

    NASA Astrophysics Data System (ADS)

    Jame, S. A.; Bowling, L. C.

    2017-12-01

    Groundwater is one of our most valuable natural resources, affecting not only the food and energy nexus but also ecosystem and human health through the availability of drinking water. Quantification of current groundwater storage is required not only to better understand groundwater flow and its role in the hydrologic cycle, but also for sustainable use. In this study, a new high-resolution map (5 arc-minutes) of groundwater properties is created for US major aquifers to provide an estimate of total groundwater storage. The estimation was done using information on the spatial extent of the principal aquifers of the US from the USGS Groundwater Atlas, the average porosity of different hydrolithologic groups, and the current saturated thickness of each aquifer. Saturated thickness varies within aquifers and has been calculated by superimposing current water-table contour maps over the base aquifer altitude provided by the USGS. The average saturated thickness has been computed by interpolating available data on saturated thickness for an aquifer using the kriging method. The total storage of each aquifer cell was then calculated by multiplying the spatial extent, porosity, and thickness of the saturated layer. The resulting aquifer storage estimates were compared with current groundwater withdrawal rates to produce an estimate of how many years' worth of water are stored in the aquifers. The resulting storage map will serve as a national dataset for stakeholders to make decisions for sustainable use of groundwater.
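
    A minimal sketch of the per-cell storage calculation described above; variable names are placeholders and the kriging of saturated thickness is not shown.

    ```python
    import numpy as np

    def groundwater_storage(cell_area_m2, porosity, sat_thickness_m, withdrawal_m3_per_yr):
        """All inputs are arrays on the same 5-arc-minute grid (NaN outside the aquifer)."""
        storage_m3 = np.nansum(cell_area_m2 * porosity * sat_thickness_m)  # total stored volume
        years_of_water = storage_m3 / np.nansum(withdrawal_m3_per_yr)      # crude "years remaining"
        return storage_m3, years_of_water
    ```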

  6. Consistency mapping of 16 lymph node stations in gastric cancer by CT-based vessel-guided delineation of 255 patients.

    PubMed

    Xu, Shuhang; Feng, Lingling; Chen, Yongming; Sun, Ying; Lu, Yao; Huang, Shaomin; Fu, Yang; Zheng, Rongqin; Zhang, Yujing; Zhang, Rong

    2017-06-20

    In order to refine the location and metastasis-risk density of 16 lymph node stations of gastric cancer for neoadjuvant radiotherapy, we retrospectively reviewed the initial images and pathological reports of 255 gastric cancer patients with lymphatic metastasis. Metastatic lymph nodes identified in the initial computed tomography images were investigated by two radiologists with gastrointestinal specialty. A circle with a diameter of 5 mm was used to identify the central position of each metastatic lymph node, defined as the LNc (the central position of the lymph node). The LNc was drawn at the equivalent location on the reference images of a standard patient based on the relative distances to the same reference vessels and the gastric wall using a Monaco® version 5.0 workstation. The image manipulation software Medi-capture was programmed for image analysis to produce a contour and density atlas of 16 lymph node stations. Based on a total of 2846 LNcs contoured (31-599 per lymph node station), we created a density distribution map of 16 lymph node drainage stations of the stomach on computed tomography images, showing the detailed radiographic delineation of each lymph node station as well as high-risk areas for lymph node metastasis. Our mapping can serve as a template for the delineation of gastric lymph node stations when defining clinical target volume in pre-operative radiotherapy for gastric cancer.

  7. Evaluation of satellite remote sensing and automatic data techniques for characterization of wetlands and marshlands

    NASA Technical Reports Server (NTRS)

    Cartmill, R. H. (Principal Investigator)

    1973-01-01

    The author has identified the following significant results. Using the 12S Digicol color additive viewer, an eight-color classification map of a portion of the study area has been produced. Channel 3 of the MSS produced the best map. Enlargements of the MSS data have been accomplished by using the Data Analysis Station. The attached film recorder has three color guns which are capable of placing 2400 square elements across a 9 inch film. It has been found that, by repeating each ERTS element 9 times and each scan line 13 times, a map at a scale of approximately 1:62,000 can be produced as a color negative film strip. This can be contact printed to produce a color map at that scale. As yet this procedure does not correct for image skew caused by rotation, which is believed to be the major source of distortion and blockiness in the image. However, the final product, which has not undergone any photographic enlargement, is superior to photographically enlarged maps of the same scale.
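
    A sketch of the pixel-replication step described above, with NumPy standing in for the original film-recorder pipeline: repeating each MSS element 9 times and each scan line 13 times before writing the film strip brings the image to roughly 1:62,000 scale.

    ```python
    import numpy as np

    def replicate_for_film(mss_band: np.ndarray) -> np.ndarray:
        """mss_band: 2-D array of MSS pixel values (scan lines x elements)."""
        enlarged = np.repeat(mss_band, 13, axis=0)   # repeat each scan line 13 times
        return np.repeat(enlarged, 9, axis=1)        # repeat each element 9 times
    ```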

  8. Serial turbo trellis coded modulation using a serially concatenated coder

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush (Inventor); Dolinar, Samuel J. (Inventor); Pollara, Fabrizio (Inventor)

    2010-01-01

    Serial concatenated trellis coded modulation (SCTCM) includes an outer coder, an interleaver, a recursive inner coder and a mapping element. The outer coder receives data to be coded and produces outer coded data. The interleaver permutes the outer coded data to produce interleaved data. The recursive inner coder codes the interleaved data to produce inner coded data. The mapping element maps the inner coded data to a symbol. The recursive inner coder has a structure which facilitates iterative decoding of the symbols at a decoder system. The recursive inner coder and the mapping element are selected to maximize the effective free Euclidean distance of a trellis coded modulator formed from the recursive inner coder and the mapping element. The decoder system includes a demodulation unit, an inner SISO (soft-input soft-output) decoder, a deinterleaver, an outer SISO decoder, and an interleaver.
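
    A toy illustration of the serial structure in the abstract (outer coder, interleaver, recursive inner coder, mapping element), using a repeat-accumulate style chain with BPSK mapping. This is a didactic sketch under simplified assumptions, not the patented SCTCM construction, whose inner coder and mapper are chosen to maximize the effective free Euclidean distance.

    ```python
    import numpy as np

    def serial_concatenated_encode(bits, q: int = 3, seed: int = 0) -> np.ndarray:
        bits = np.asarray(bits, dtype=np.int64) % 2
        rng = np.random.default_rng(seed)
        outer = np.repeat(bits, q)                       # outer coder: rate-1/q repetition
        perm = rng.permutation(outer.size)               # interleaver (seed shared with decoder)
        interleaved = outer[perm]
        inner = np.bitwise_xor.accumulate(interleaved)   # recursive inner coder: 1/(1+D) accumulator
        return 1.0 - 2.0 * inner                         # mapping element: BPSK (0 -> +1, 1 -> -1)
    ```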

  9. A regional land use survey based on remote sensing and other data: A report on a LANDSAT and computer mapping project, volume 1. [Arizona, Colorado, Montana, New Mexico, Utah, and Wyoming

    NASA Technical Reports Server (NTRS)

    Nez, G. (Principal Investigator); Mutter, D.

    1977-01-01

    The author has identified the following significant results. New LANDSAT analysis software and linkages with other computer mapping software were developed. Significant results were also achieved in training, communication, and identification of needs for developing the LANDSAT/computer mapping technologies into operational tools for use by decision makers.

  10. Onboard Algorithms for Data Prioritization and Summarization of Aerial Imagery

    NASA Technical Reports Server (NTRS)

    Chien, Steve A.; Hayden, David; Thompson, David R.; Castano, Rebecca

    2013-01-01

    Many current and future NASA missions are capable of collecting enormous amounts of data, of which only a small portion can be transmitted to Earth. Communications are limited due to distance, visibility constraints, and competing mission downlinks. Long missions and high-resolution, multispectral imaging devices easily produce data exceeding the available bandwidth. To address this situation, computationally efficient algorithms were developed for analyzing science imagery onboard the spacecraft. These algorithms autonomously cluster the data into classes of similar imagery, enabling selective downlink of representatives of each class and of a map classifying the imaged terrain rather than the full dataset, reducing the volume of the downlinked data. A range of approaches was examined, including k-means clustering using image features based on color, texture, and temporal and spatial arrangement.
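
    A sketch of the prioritization idea, assuming per-image feature vectors have already been extracted: cluster them with k-means, then mark the image nearest each cluster centre as that class's downlink representative. scikit-learn is assumed; shapes and names are placeholders.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    def select_representatives(features: np.ndarray, n_classes: int = 8):
        """features: (n_images, n_features) colour/texture/spatial descriptors."""
        km = KMeans(n_clusters=n_classes, n_init=10, random_state=0).fit(features)
        reps = []
        for c in range(n_classes):
            members = np.where(km.labels_ == c)[0]
            dists = np.linalg.norm(features[members] - km.cluster_centers_[c], axis=1)
            reps.append(int(members[np.argmin(dists)]))   # image closest to the centroid
        return km.labels_, reps                           # class map + downlink candidates
    ```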

  11. Three-dimensional information hierarchical encryption based on computer-generated holograms

    NASA Astrophysics Data System (ADS)

    Kong, Dezhao; Shen, Xueju; Cao, Liangcai; Zhang, Hao; Zong, Song; Jin, Guofan

    2016-12-01

    A novel approach for encrypting three-dimensional (3-D) scene information hierarchically based on computer-generated holograms (CGHs) is proposed. The CGHs of the layer-oriented 3-D scene information are produced by angular-spectrum propagation algorithm at different depths. All the CGHs are then modulated by different chaotic random phase masks generated by the logistic map. Hierarchical encryption encoding is applied when all the CGHs are accumulated one by one, and the reconstructed volume of the 3-D scene information depends on permissions of different users. The chaotic random phase masks could be encoded into several parameters of the chaotic sequences to simplify the transmission and preservation of the keys. Optical experiments verify the proposed method and numerical simulations show the high key sensitivity, high security, and application flexibility of the method.
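
    A minimal sketch of generating a chaotic random phase mask from the logistic map, as described above; the seed x0, the control parameter r, and the burn-in length are placeholder values (in the scheme, only such chaotic-sequence parameters need to be kept as keys):

      import numpy as np

      def logistic_phase_mask(shape, x0=0.37, r=3.99, burn_in=1000):
          """Build a phase-only mask exp(i*2*pi*x) from a logistic-map sequence.

          shape:   (rows, cols) of the CGH to be modulated
          x0, r:   seed and control parameter of x_{n+1} = r * x_n * (1 - x_n)
          burn_in: iterations discarded so the sequence settles into the chaotic regime
          """
          n = shape[0] * shape[1]
          x = x0
          seq = np.empty(n)
          for i in range(burn_in + n):
              x = r * x * (1.0 - x)
              if i >= burn_in:
                  seq[i - burn_in] = x
          return np.exp(1j * 2 * np.pi * seq).reshape(shape)

      mask = logistic_phase_mask((256, 256))
      encrypted_cgh = np.ones((256, 256), dtype=complex) * mask  # modulate a (placeholder) CGH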

  12. PNAS Plus: Mapping patterns of long-term settlement in Northern Mesopotamia at a large scale

    NASA Astrophysics Data System (ADS)

    Menze, Bjoern H.; Ur, Jason A.

    2012-04-01

    The landscapes of the Near East show both the first settlements and the longest trajectories of settlement systems. Mounding is a characteristic property of these settlement sites, resulting from millennia of continuing settlement activity at distinguished places. So far, however, this defining feature of ancient settlements has not received much attention, or even been the subject of systematic evaluation. We propose a remote sensing approach for comprehensively mapping the pattern of human settlement at large scale and establish the largest archaeological record for a landscape in Mesopotamia, mapping about 14,000 settlement sites, spanning eight millennia, at 15-m resolution in a 23,000-km2 area in northeastern Syria. To map both low- and high-mounded places (the latter of which are often referred to as "tells"), we develop a strategy for detecting anthrosols in time series of multispectral satellite images and measure the volume of settlement sites in a digital elevation model. Using this volume as a proxy for continued occupation, we find a dependency of the long-term attractiveness of a site on local water availability, but also a strong relation to the relevance within a basin-wide exchange network that we can infer from our record and from third millennium B.C. intersite routes visible on the ground until recent times. We believe it is possible to establish a nearly comprehensive map of human settlements in the fluvial plains of northern Mesopotamia and beyond, and site volume may be a key quantity to uncover long-term trends in human settlement activity from such a record.

  13. Design and development of a medical big data processing system based on Hadoop.

    PubMed

    Yao, Qin; Tian, Yu; Li, Peng-Fei; Tian, Li-Li; Qian, Yang-Ming; Li, Jing-Song

    2015-03-01

    Secondary use of medical big data is increasingly popular in healthcare services and clinical research. Understanding the logic behind medical big data demonstrates tendencies in hospital information technology and shows great significance for hospital information systems that are designing and expanding services. Big data has four characteristics--Volume, Variety, Velocity and Value (the 4 Vs)--that make traditional systems incapable of processing these data on standalone machines. Apache Hadoop MapReduce is a promising software framework for developing applications that process vast amounts of data in parallel on large clusters of commodity hardware in a reliable, fault-tolerant manner. With the Hadoop framework and MapReduce application program interface (API), we can more easily develop our own MapReduce applications to run on a Hadoop framework that can scale up from a single node to thousands of machines. This paper investigates a practical case of a Hadoop-based medical big data processing system. We developed this system to intelligently process medical big data and uncover some features of hospital information system user behaviors. This paper studies user behaviors regarding various data produced by different hospital information systems for daily work. In this paper, we also built a five-node Hadoop cluster to execute distributed MapReduce algorithms. Our distributed algorithms show promise in facilitating efficient data processing with medical big data in healthcare services and clinical research compared with single nodes. Additionally, with medical big data analytics, we can design our hospital information systems to be much more intelligent and easier to use by making personalized recommendations.
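
    As a rough sketch of the kind of MapReduce job described (counting log events per hospital-information-system user), the mapper and reducer below follow the Hadoop Streaming convention of reading tab-separated records from standard input and writing key-value pairs to standard output; the log field layout is hypothetical, not the study's actual schema:

      #!/usr/bin/env python
      # ---- mapper.py: emit (user_id, 1) for every log record ----
      # hypothetical record layout: user_id \t action \t timestamp
      import sys

      for line in sys.stdin:
          fields = line.rstrip("\n").split("\t")
          if fields and fields[0]:
              print(f"{fields[0]}\t1")

      # ---- reducer.py: sum counts per user ----
      # Hadoop Streaming delivers keys sorted, so each user's records arrive contiguously
      import sys

      current_user, count = None, 0
      for line in sys.stdin:
          if "\t" not in line:
              continue
          user, value = line.rstrip("\n").split("\t")
          if user != current_user:
              if current_user is not None:
                  print(f"{current_user}\t{count}")
              current_user, count = user, 0
          count += int(value)
      if current_user is not None:
          print(f"{current_user}\t{count}")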

  14. Prediction of subacute infarct size in acute middle cerebral artery stroke: comparison of perfusion-weighted imaging and apparent diffusion coefficient maps.

    PubMed

    Drier, Aurélie; Tourdias, Thomas; Attal, Yohan; Sibon, Igor; Mutlu, Gurkan; Lehéricy, Stéphane; Samson, Yves; Chiras, Jacques; Dormont, Didier; Orgogozo, Jean-Marc; Dousset, Vincent; Rosso, Charlotte

    2012-11-01

    To compare perfusion-weighted (PW) imaging and apparent diffusion coefficient (ADC) maps in prediction of infarct size and growth in patients with acute middle cerebral artery infarct. This study was approved by the local institutional review board. Written informed consent was obtained from all 80 patients. Subsequent infarct volume and growth on follow-up magnetic resonance (MR) images obtained within 6 days were compared with the predictions based on PW images by using a time-to-peak threshold greater than 4 seconds and ADC maps obtained less than 12 hours after middle cerebral artery infarct. ADC- and PW imaging-predicted infarct growth areas and infarct volumes were correlated with subsequent infarct growth and follow-up diffusion-weighted (DW) imaging volumes. The impact of MR imaging time delay on the correlation coefficient between the predicted and subsequent infarct volumes and individual predictions of infarct growth by using receiver operating characteristic curves were assessed. The infarct volume measurements were highly reproducible (concordance correlation coefficient [CCC] of 0.965 and 95% confidence interval [CI]: 0.949, 0.976 for acute DW imaging; CCC of 0.995 and 95% CI: 0.993, 0.997 for subacute DW imaging). The subsequent infarct volume correlated (P<.0001) with ADC- (ρ=0.853) and PW imaging- (ρ=0.669) predicted volumes. The correlation was higher for ADC-predicted volume than for PW imaging-predicted volume (P<.005), but not when the analysis was restricted to patients without recanalization (P=.07). The infarct growth correlated (P<.0001) with PW imaging-DW imaging mismatch (ρ=0.470) and ADC-DW imaging mismatch (ρ=0.438), without significant differences between both methods (P=.71). The correlations were similar among time delays with ADC-predicted volumes but decreased with PW imaging-based volumes beyond the therapeutic window. Accuracies of ADC- and PW imaging-based predictions of infarct growth in an individual prediction were similar (area under the receiver operating characteristic curve [AUC] of 0.698 and 95% CI: 0.585, 0.796 vs AUC of 0.749 and 95% CI: 0.640, 0.839; P=.48). The ADC-based method was as accurate as the PW imaging-based method for evaluating infarct growth and size in the subacute phase. © RSNA, 2012
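
    For reference, the concordance correlation coefficient used above to report reproducibility can be computed from paired volume measurements as in the sketch below (Lin's CCC; the example values are hypothetical, not the study's data):

      import numpy as np

      def concordance_cc(x, y):
          """Lin's concordance correlation coefficient between paired measurements."""
          x, y = np.asarray(x, float), np.asarray(y, float)
          sxy = np.mean((x - x.mean()) * (y - y.mean()))
          return 2 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

      # e.g. two readers' infarct volume measurements (mL), hypothetical values
      reader1 = [12.0, 35.5, 8.2, 60.1]
      reader2 = [11.4, 36.0, 9.0, 58.7]
      print(round(concordance_cc(reader1, reader2), 3))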

  15. Geomorphological relationships through the use of 2-D seismic reflection data, Lidar, and aerial imagery

    NASA Astrophysics Data System (ADS)

    Alesce, Meghan Elizabeth

    Barrier islands are crucial in protecting coastal environments. This study focuses on Dauphin Island, Alabama, located within the Northern Gulf of Mexico (NGOM) barrier island complex. It is one of many islands serving as natural protection for NGOM ecosystems and coastal cities. The NGOM barrier islands formed around 4 kya in response to a decrease in the rate of sea level rise. The morphology of these islands changes with hurricanes, anthropogenic activity, and tidal and wave action. This study focuses on ancient incised valleys and their impact on island morphology and hurricane breaches. Using high-frequency 2-D seismic reflection data, four horizons, including the present seafloor, were interpreted. Subaerial portions of Dauphin Island were imaged using Lidar data and aerial imagery over a ten-year time span, as well as historical maps. Historical shorelines of Dauphin Island were extracted from aerial imagery and historical maps and were compared to the location of incised valleys seen within the 2-D seismic reflection data. Erosion and deposition volumes of Dauphin Island from 1998 to 2010 (the time span covering hurricanes Ivan and Katrina) in the vicinity of Katrina Cut and Pelican Island were quantified using Lidar data. For the time period prior to Hurricane Ivan, an erosional volume of 46,382,552 m3 and a depositional volume of 16,113.6 m3 were quantified from Lidar data. The effects of Hurricane Ivan produced a total erosion volume of 4,076,041.5 m3. The erosional and depositional volumes at Katrina Cut were 7,562,068.5 m3 and 510,936.7 m3, respectively. More volume change was found within Pelican Pass. For the period between hurricanes Ivan and Katrina, the erosion volume was 595,713.8 m3, mostly located within Katrina Cut. Total deposition for the same period, including in Pelican Pass, was 15,353,961 m3. Hurricane breaches were compared to ancient incised valleys seen within the 2-D seismic reflection results. Breaches from the hurricanes of 1849, 1916, and 2005 all correlated with incised valleys. Interpretations from horizons A and P correlated well with the 1849 shoreline, while horizon B correlated best with the 1916 and 2005 hurricane breaches. The correlation of incised valleys and breaches suggests a probable cause of breach locations. With further investigation, the impacts of these relict valleys can be validated and established. Preferential subsidence within the relict incised valleys would account for the lower elevations on the island surface. Very fine to fine-grained sediment deposits from Mobile Bay would contribute to preferential subsidence. The lower elevations are more likely to erode from overwash and surges during strong storm systems.

  16. Documentation of the GLAS fourth order general calculation model. Volume 3: Vectorized code for the Cyber 205

    NASA Technical Reports Server (NTRS)

    Kalnay, E.; Balgovind, R.; Chao, W.; Edelmann, D.; Pfaendtner, J.; Takacs, L.; Takano, K.

    1983-01-01

    Volume 3 of a three-volume technical memorandum containing documentation of the GLAS fourth-order general circulation model is presented. The volume contains the CYBER 205 scalar and vector codes of the model, lists of variables, and cross references. A dictionary of FORTRAN variables used in the scalar version, and listings of the FORTRAN code compiled with the C-option, are included. Cross-reference maps of local variables are included for each subroutine.

  17. Storage and residence time of suspended sediment in gravel bars of Difficult Run, VA

    NASA Astrophysics Data System (ADS)

    George, J.; Benthem, A.; Pizzuto, J. E.; Skalak, K.

    2016-12-01

    Reducing the export of suspended sediment is an important consideration for restoring water quality to the Chesapeake Bay, but sediment budgets for in-channel landforms are poorly constrained. We quantified fine (< 2 mm) sediment storage and residence times for gravel bars at two reaches along Difficult Run, a 5th order tributary to the Potomac River. Eight gravel bars were mapped in a 150m headwater reach at Miller Heights (bankfull width 11m; total bar volume 114 m3) and 6 gravel bars were mapped in a 160m reach downstream near Leesburg Pike (bankfull width 19m; total bar volume 210 m3). Grain size analyses of surface and subsurface samples from 2 bars at each reach indicate an average suspended sediment content of 55%, suggesting a total volume of suspended sediment stored in the mapped bars to be 178 m3, or 283000 kg, comprising 5% of the average annual suspended sediment load of the two study reaches. Estimates of the annual bedload flux at Miller Heights based on stream gaging records and the Wilcock-Crowe bedload transport equation imply that the bars are entirely reworked at least annually. Scour chains installed in 2 bars at each site (a total of 50 chains) recorded scour and fill events during the winter and spring of 2016. These data indicate that 38% of the total volume of the bars is exchanged per year, for a residence time of 2.6 ± 1.2 years, a value we interpret as the residence time of suspended sediment stored in the bars. These results are supported by mapping of topographic changes derived from structure-from-motion analyses of digital aerial imagery. Storage in alluvial bars therefore represents a significant component of the suspended sediment budget of mid-Atlantic streams.
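
    The storage and residence-time figures above follow from simple budget arithmetic, sketched below using the values reported in the abstract:

      # Fine-sediment storage and residence time, using the values reported above
      bar_volume_m3 = 114 + 210            # total mapped gravel-bar volume at both reaches
      suspended_fraction = 0.55            # average suspended-sediment (fine) content
      stored_m3 = bar_volume_m3 * suspended_fraction
      print(round(stored_m3))              # about 178 m3 of stored fine sediment

      annual_exchange_fraction = 0.38      # fraction of bar volume exchanged per year (scour chains)
      residence_time_yr = 1.0 / annual_exchange_fraction
      print(round(residence_time_yr, 1))   # about 2.6 years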

  18. 44 CFR 65.13 - Mapping and map revisions for areas subject to alluvial fan flooding.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... sound engineering analyses which demonstrate that the measures will effectively eliminate alluvial fan... necessarily limited to, the following: (1) Engineering analyses that quantify the discharges and volumes of... movement must be assessed using an engineering method acceptable to FEMA. The assessment should consider...

  19. Three-dimensional radar imaging of structures and craters in the Martian polar caps.

    PubMed

    Putzig, Nathaniel E; Smith, Isaac B; Perry, Matthew R; Foss, Frederick J; Campbell, Bruce A; Phillips, Roger J; Seu, Roberto

    2018-07-01

    Over the last decade, observations acquired by the Shallow Radar (SHARAD) sounder on individual passes of the Mars Reconnaissance Orbiter have revealed the internal structure of the Martian polar caps and provided new insights into the formation of the icy layers within and their relationship to climate. However, a complete picture of the cap interiors has been hampered by interfering reflections from off-nadir surface features and signal losses associated with sloping structures and scattering. Foss et al. (2017) addressed these limitations by assembling three-dimensional data volumes of SHARAD observations from thousands of orbital passes over each polar region and applying geometric corrections simultaneously. The radar volumes provide unprecedented views of subsurface features, readily imaging structures previously inferred from time-intensive manual analysis of single-orbit data (e.g., trough-bounding surfaces, a buried chasma, and a basal unit in the north, massive carbon-dioxide ice deposits and discontinuous layered sequences in the south). Our new mapping of the carbon-dioxide deposits yields a volume of 16,500 km3, 11% larger than the prior estimate. In addition, the radar volumes newly reveal other structures, including what appear to be buried impact craters with no surface expression. Our first assessment of 21 apparent craters at the base of the north polar layered deposits suggests a Hesperian age for the substrate, consistent with that of the surrounding plains as determined from statistics of surface cratering rates. Planned mapping of similar features throughout both polar volumes may provide new constraints on the age of the icy layered deposits. The radar volumes also provide new topographic data between the highest latitudes observed by the Mars Orbiter Laser Altimeter and those observed by SHARAD. In general, mapping of features in these radar volumes is placing new constraints on the nature and evolution of the polar deposits and associated climate changes.

  20. Three-dimensional radar imaging of structures and craters in the Martian polar caps

    NASA Astrophysics Data System (ADS)

    Putzig, Nathaniel E.; Smith, Isaac B.; Perry, Matthew R.; Foss, Frederick J.; Campbell, Bruce A.; Phillips, Roger J.; Seu, Roberto

    2018-07-01

    Over the last decade, observations acquired by the Shallow Radar (SHARAD) sounder on individual passes of the Mars Reconnaissance Orbiter have revealed the internal structure of the Martian polar caps and provided new insights into the formation of the icy layers within and their relationship to climate. However, a complete picture of the cap interiors has been hampered by interfering reflections from off-nadir surface features and signal losses associated with sloping structures and scattering. Foss et al. (The Leading Edge 36, 43-57, 2017, https://doi.org/10.1190/tle36010043.1) addressed these limitations by assembling three-dimensional data volumes of SHARAD observations from thousands of orbital passes over each polar region and applying geometric corrections simultaneously. The radar volumes provide unprecedented views of subsurface features, readily imaging structures previously inferred from time-intensive manual analysis of single-orbit data (e.g., trough-bounding surfaces, a buried chasma, and a basal unit in the north, massive carbon-dioxide ice deposits and discontinuous layered sequences in the south). Our new mapping of the carbon-dioxide deposits yields a volume of 16,500 km3, 11% larger than the prior estimate. In addition, the radar volumes newly reveal other structures, including what appear to be buried impact craters with no surface expression. Our first assessment of 21 apparent craters at the base of the north polar layered deposits suggests a Hesperian age for the substrate, consistent with that of the surrounding plains as determined from statistics of surface cratering rates. Planned mapping of similar features throughout both polar volumes may provide new constraints on the age of the icy layered deposits. The radar volumes also provide new topographic data between the highest latitudes observed by the Mars Orbiter Laser Altimeter and those observed by SHARAD. In general, mapping of features in these radar volumes is placing new constraints on the nature and evolution of the polar deposits and associated climate changes.

  1. Temporal and spatial distribution of landslides in the Redwood Creek Basin, Northern California

    USGS Publications Warehouse

    Madej, Mary Ann; Medley, C. Nicholas; Patterson, Glenn; Parker, Melanie J.

    2011-01-01

    Mass movement processes are a dominant means of supplying sediment to mountainous rivers of north coastal California, but the episodic nature of landslides represents a challenge to interpreting patterns of slope instability. This study compares two major landslide events occurring in 1964-1975 and in 1997 in the Redwood Creek basin in north coastal California. In 1997, a moderate-intensity, long-duration storm with high antecedent precipitation triggered 317 landslides with areas greater than 400 m2 in the 720-km2 Redwood Creek basin. The intensity-duration threshold for landslide initiation in 1997 was consistent with previously published values. Aerial photographs (1:6,000 scale) taken a few months after the 1997 storm facilitated the mapping of shallow debris slides, debris flows, and bank failures. The magnitude and location of the 1997 landslides were compared to the distributions of landslides generated by larger floods in 1964, 1972, and 1975. The volume of landslide material produced by the 1997 storm was an order of magnitude less than that generated in the earlier period. During both periods, inner gorge hillslopes produced many landslides, but the relative contribution of tributary basins to overall landslide production differed. Slope stability models can help identify areas susceptible to failure. The 22 percent of the watershed area classified as moderately to highly unstable by the SHALSTAB slope stability model included locations that generated almost 90 percent of the landslide volume during the 1997 storm.

  2. LOD map--A visual interface for navigating multiresolution volume visualization.

    PubMed

    Wang, Chaoli; Shen, Han-Wei

    2006-01-01

    In multiresolution volume visualization, a visual representation of level-of-detail (LOD) quality is important for us to examine, compare, and validate different LOD selection algorithms. While traditional methods rely on ultimate images for quality measurement, we introduce the LOD map--an alternative representation of LOD quality and a visual interface for navigating multiresolution data exploration. Our measure for LOD quality is based on the formulation of entropy from information theory. The measure takes into account the distortion and contribution of multiresolution data blocks. A LOD map is generated through the mapping of key LOD ingredients to a treemap representation. The ordered treemap layout is used for relatively stable updates of the LOD map when the view or LOD changes. This visual interface not only indicates the quality of LODs in an intuitive way, but also provides immediate suggestions for possible LOD improvement through visually-striking features. It also allows us to compare different views and perform rendering budget control. A set of interactive techniques is proposed to make the LOD adjustment a simple and easy task. We demonstrate the effectiveness and efficiency of our approach on large scientific and medical data sets.
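
    As a loose sketch of an entropy-style quality measure over multiresolution blocks (a generic stand-in, not the paper's exact formulation), one can weight each block's distortion by its contribution, normalize, and take the Shannon entropy; a flatter distribution (higher entropy) means no single block dominates the error budget:

      import numpy as np

      def lod_entropy(contribution, distortion):
          """Entropy of contribution-weighted distortion across the blocks of one LOD.

          A rough stand-in for an information-theoretic LOD quality measure:
          p_i is each block's share of the total weighted distortion.
          """
          w = np.asarray(contribution, float) * np.asarray(distortion, float)
          p = w / w.sum()
          p = p[p > 0]                        # treat 0 * log 0 as 0
          return float(-(p * np.log2(p)).sum())

      # Two hypothetical LOD selections over four blocks
      print(lod_entropy([0.4, 0.3, 0.2, 0.1], [1.0, 0.5, 0.2, 0.1]))
      print(lod_entropy([0.4, 0.3, 0.2, 0.1], [0.3, 0.3, 0.3, 0.3]))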

  3. Semantics by analogy for illustrative volume visualization

    PubMed Central

    Gerl, Moritz; Rautek, Peter; Isenberg, Tobias; Gröller, Eduard

    2012-01-01

    We present an interactive graphical approach for the explicit specification of semantics for volume visualization. This explicit and graphical specification of semantics for volumetric features allows us to visually assign meaning to both input and output parameters of the visualization mapping. This is in contrast to the implicit way of specifying semantics using transfer functions. In particular, we demonstrate how to realize a dynamic specification of semantics which allows users to flexibly explore a wide range of mappings. Our approach is based on three concepts. First, we use semantic shader augmentation to automatically add rule-based rendering functionality to static visualization mappings in a shader program, while preserving the visual abstraction that the initial shader encodes. With this technique we extend recent developments that define a mapping between data attributes and visual attributes with rules, which are evaluated using fuzzy logic. Second, we let users define the semantics by analogy through brushing on renderings of the data attributes of interest. Third, the rules are specified graphically in an interface that provides visual clues for potential modifications. Together, the presented methods offer a high degree of freedom in the specification and exploration of rule-based mappings and avoid the limitations of a linguistic rule formulation. PMID:23576827

  4. The Colorado River and its deposits downstream from Grand Canyon in Arizona, California, and Nevada

    USGS Publications Warehouse

    Crow, Ryan S.; Block, Debra L.; Felger, Tracey J.; House, P. Kyle; Pearthree, Philip A.; Gootee, Brian F.; Youberg, Ann M.; Howard, Keith A.; Beard, L. Sue

    2018-02-05

    Understanding the evolution of the Colorado River system has direct implications for (1) the processes and timing of continental-scale river system integration, (2) the formation of iconic landscapes like those in and around Grand Canyon, and (3) the availability of groundwater resources. Spatial patterns in the position and type of Colorado River deposits, only discernible through geologic mapping, can be used to test models related to Colorado River evolution. This is particularly true downstream from Grand Canyon where ancestral Colorado River deposits are well-exposed. We are principally interested in (1) regional patterns in the minimum and maximum elevation of each depositional unit, which are affected by depositional mechanism and postdepositional deformation; and (2) the volume of each unit, which reflects regional changes in erosion, transport efficiency, and accommodation space. The volume of Colorado River deposits below Grand Canyon has implications for groundwater resources, as the primary regional aquifer there is composed of those deposits. To this end, we are presently mapping Colorado River deposits and compiling and updating older mapping. This preliminary data release shows the current status of our mapping and compilation efforts. We plan to update it at regular intervals in conjunction with ongoing mapping.

  5. SeaWiFS Postlaunch Calibration and Validation Analyses

    NASA Technical Reports Server (NTRS)

    Hooker, Stanford B. (Editor); Firestone, Elaine (Editor); McClain, Charles R.; Barnes, Robert A.; Eplee, Robert E., Jr.; Franz, Bryan A.; Hsu, N. Christina; Patt, Frederick S.; Pietras, Christophe M.; Robinson, Wayne D.

    2000-01-01

    The effort to resolve data quality issues and improve on the initial data evaluation methodologies of the SeaWiFS Project was an extensive one. These evaluations have resulted, to date, in three major reprocessings of the entire data set, where each reprocessing addressed the data quality issues that could be identified up to the time of the reprocessing. Three volumes of the SeaWiFS Postlaunch Technical Report Series (Volumes 9, 10, and 11) are needed to document the improvements implemented since launch. Volume 10 continues the sequential presentation of postlaunch data analysis and algorithm descriptions begun in Volume 9. Chapter 1 of Volume 10 describes an absorbing aerosol index, similar to that produced by the Total Ozone Mapping Spectrometer (TOMS) Project, which is used to flag pixels contaminated by absorbing aerosols, such as dust and smoke. Chapter 2 discusses the algorithm being used to remove SeaWiFS out-of-band radiance from the water-leaving radiances. Chapter 3 provides an itemization of all significant changes in the processing algorithms for each of the first three reprocessings. Chapter 4 shows the time series of global clear water and deep-water (depths greater than 1,000 m) bio-optical and atmospheric properties (normalized water-leaving radiances, chlorophyll, atmospheric optical depth, etc.) based on the eight-day composites as a check on the sensor calibration stability. Chapter 5 examines the variation in the derived products with scan angle using high resolution data around Hawaii to test for residual scan modulation effects and atmospheric correction biases. Chapter 6 provides a methodology for evaluating the atmospheric correction algorithm and atmospheric derived products using ground-based observations. Similarly, Chapter 7 presents match-up comparisons of coincident satellite and in situ data to determine the accuracy of the water-leaving radiances, chlorophyll a, and K(490) products.

  6. Spatial distribution of erosion and deposition during a glacier surge: Brúarjökull, Iceland

    NASA Astrophysics Data System (ADS)

    Korsgaard, Niels J.; Schomacker, Anders; Benediktsson, Ívar Örn; Larsen, Nicolaj K.; Ingólfsson, Ólafur; Kjær, Kurt H.

    2015-12-01

    Time-series of digital elevation models (DEMs) of the forefield of the Brúarjökull surge-type glacier in Iceland were used to quantify the volume of material that was mobilized by the 1963-1964 surge. The DEMs were produced by stereophotogrammetry on aerial photographs from before the surge (1961) and after (1988 and 2003). The analysis was performed on two DEMs of Difference (DoDs), i.e., a 1961-2003 DoD documenting the impact of the surge and a 1988-2003 DoD documenting the post-surge modification of the juvenile surging glacier landsystem. Combined with a digital geomorphological map, the DoDs allow us to quantify the impact of the surge on a landsystem scale down to individual landforms. A total of 34.2 ± 11.3 × 10^6 m3 of material was mobilized in the 30.7-km2 study area as a result of the most recent surge event. Of these, 17.4 ± 6.6 × 10^6 m3 of the material were eroded and 16.8 ± 4.7 × 10^6 m3 were deposited. More than half of the deposited volume was ice-cored landforms. This study demonstrates that although the total mobilized mass volume is high, the net volume gain of ice and sediment deposited as landforms in the forefield caused by the surge is low. Furthermore, deposition of new dead-ice from the 1963-1964 surge constitutes as much as 64% of the volume gain in the forefield. The 1988-2003 DoD is used to quantify the melt-out of this dead-ice and other paraglacial modification of the recently deglaciated forefield of Brúarjökull.
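
    The erosion and deposition volumes above come from summing a DEM of Difference cell by cell. A minimal sketch of that step, assuming two co-registered elevation grids with a common cell size (array names and numbers are illustrative):

      import numpy as np

      def dod_volumes(dem_before, dem_after, cell_size):
          """Erosion and deposition volumes from a DEM of Difference (after minus before)."""
          dod = dem_after - dem_before                  # elevation change per cell (m)
          cell_area = cell_size ** 2                    # m2 per cell
          deposition = dod[dod > 0].sum() * cell_area   # m3 gained
          erosion = -dod[dod < 0].sum() * cell_area     # m3 lost (reported as a positive number)
          return erosion, deposition

      before = np.array([[10.0, 10.0], [10.0, 10.0]])
      after  = np.array([[ 9.5, 10.0], [10.4, 10.0]])
      print(dod_volumes(before, after, cell_size=5.0))  # (12.5, 10.0) m3 on a 5 m grid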

  7. A statistically valid method for using FIA plots to guide spectral class rejection in producing stratification maps

    Treesearch

    Michael L. Hoppus; Andrew J. Lister

    2002-01-01

    A Landsat TM classification method (iterative guided spectral class rejection) produced a forest cover map of southern West Virginia that provided the stratification layer for producing estimates of timberland area from Forest Service FIA ground plots using a stratified sampling technique. These same high quality and expensive FIA ground plots provided ground reference...

  8. Using National Drug Codes and drug knowledge bases to organize prescription records from multiple sources.

    PubMed

    Simonaitis, Linas; McDonald, Clement J

    2009-10-01

    The utility of National Drug Codes (NDCs) and drug knowledge bases (DKBs) in the organization of prescription records from multiple sources was studied. The master files of most pharmacy systems include NDCs and local codes to identify the products they dispense. We obtained a large sample of prescription records from seven different sources. These records carried a national product code or a local code that could be translated into a national product code via their formulary master. We obtained mapping tables from five DKBs. We measured the degree to which the DKB mapping tables covered the national product codes carried in or associated with the sample of prescription records. Considering the total prescription volume, DKBs covered 93.0-99.8% of the product codes from three outpatient sources and 77.4-97.0% of the product codes from four inpatient sources. Among the in-patient sources, invented codes explained 36-94% of the noncoverage. Outpatient pharmacy sources rarely invented codes, which comprised only 0.11-0.21% of their total prescription volume, compared with inpatient pharmacy sources for which invented codes comprised 1.7-7.4% of their prescription volume. The distribution of prescribed products was highly skewed, with 1.4-4.4% of codes accounting for 50% of the message volume and 10.7-34.5% accounting for 90% of the message volume. DKBs cover the product codes used by outpatient sources sufficiently well to permit automatic mapping. Changes in policies and standards could increase coverage of product codes used by inpatient sources.
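
    The coverage percentages above are volume-weighted, i.e., each product code counts in proportion to how many prescription records carry it. A small sketch of that calculation (the example codes and field contents are hypothetical):

      from collections import Counter

      def volume_weighted_coverage(prescriptions, dkb_codes):
          """Share of total prescription volume whose product code is covered by a DKB.

          prescriptions: iterable of national product codes, one per prescription record
          dkb_codes:     set of product codes present in the DKB mapping table
          """
          counts = Counter(prescriptions)
          total = sum(counts.values())
          covered = sum(n for code, n in counts.items() if code in dkb_codes)
          return 100.0 * covered / total

      sample = ["11111-2222-33", "11111-2222-33", "LOCAL-123", "44444-5555-66"]  # hypothetical codes
      print(volume_weighted_coverage(sample, {"11111-2222-33", "44444-5555-66"}))  # 75.0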

  9. Velocity gradients and reservoir volumes lessons in computational sensitivity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, P.W.

    1995-12-31

    The sensitivity of reservoir volume estimation from depth-converted geophysical time maps to the velocity gradients employed is investigated through a simple model study. The computed volumes are disconcertingly sensitive to gradients, both horizontal and vertical. The need for an accurate method of time-to-depth conversion is well demonstrated by the model study, in which errors in velocity are magnified 40-fold in the computation of the volume. Thus if +/- 10% accuracy in the volume is desired, we must be able to estimate the velocity at the water contact with 0.25% accuracy. Put another way, if the velocity is 8000 feet per second at the well, then we have only +/- 20 feet per second leeway in estimating the velocity at the water contact. Very moderate horizontal and vertical gradients would typically indicate a velocity change of a few hundred feet per second if they are in the same direction. Clearly the interpreter needs to be very careful. A methodology is demonstrated which takes into account all the information that is available: velocities, tops, depositional and lithologic spatial patterns, and common sense. It is assumed that, through appropriate use of check shot and other time-depth information, the interpreter has correctly tied the reflection picks to the well tops. Such ties are ordinarily too soft for direct time-depth conversion to give adequate depth ties. The proposed method uses a common compaction law as its basis and incorporates time picks, tops, and stratigraphic maps into the depth conversion process. The resulting depth map ties the known well tops in an optimum fashion.
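
    The leeway quoted above can be checked with one line of sensitivity arithmetic: if a relative velocity error is amplified 40 times in the computed volume, a +/- 10% volume tolerance allows only a 0.25% velocity error, or about +/- 20 feet per second at 8,000 feet per second. A sketch of that calculation, using the abstract's numbers:

      # Sensitivity of reservoir volume to velocity error (values from the abstract)
      amplification = 40            # relative volume error per unit relative velocity error
      volume_tolerance = 0.10       # desired +/- 10% accuracy in volume

      velocity_tolerance = volume_tolerance / amplification
      print(velocity_tolerance)                   # 0.0025, i.e. 0.25%

      velocity_ft_s = 8000.0
      print(velocity_ft_s * velocity_tolerance)   # +/- 20 ft/s leeway at the water contact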

  10. Application of an imputation method for geospatial inventory of forest structural attributes across multiple spatial scales in the Lake States, U.S.A

    NASA Astrophysics Data System (ADS)

    Deo, Ram K.

    Credible spatial information characterizing the structure and site quality of forests is critical to sustainable forest management and planning, especially given the increasing demands and threats to forest products and services. Forest managers and planners are required to evaluate forest conditions over a broad range of scales, contingent on operational or reporting requirements. Traditionally, forest inventory estimates are generated via a design-based approach that involves generalizing sample plot measurements to characterize an unknown population across a larger area of interest. However, field plot measurements are costly and as a consequence spatial coverage is limited. Remote sensing technologies have shown remarkable success in augmenting limited sample plot data to generate stand- and landscape-level spatial predictions of forest inventory attributes. Further enhancement of forest inventory approaches that couple field measurements with cutting-edge remotely sensed and geospatial datasets is essential to sustainable forest management. We evaluated a novel Random Forest based k Nearest Neighbors (RF-kNN) imputation approach to couple remote sensing and geospatial data with field inventory collected by different sampling methods to generate forest inventory information across large spatial extents. The forest inventory data collected by the FIA program of the US Forest Service were integrated with optical remote sensing and other geospatial datasets to produce biomass distribution maps for a part of the Lake States and species-specific site index maps for the entire Lake States. Targeting a small-area application of state-of-the-art remote sensing, LiDAR (light detection and ranging) data were integrated with field data collected by an inexpensive method, called variable plot sampling, in the Ford Forest of Michigan Tech to derive a standing volume map in a cost-effective way. The outputs of the RF-kNN imputation were compared with independent validation datasets and extant map products based on different sampling and modeling strategies. The RF-kNN modeling approach was found to be very effective, especially for large-area estimation, and produced results statistically equivalent to the field observations or the estimates derived from secondary data sources. The models are useful to resource managers for operational and strategic purposes.
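
    A compact sketch of one way to implement RF-kNN imputation with scikit-learn: train a random forest on the reference plots, define proximity between two observations as the fraction of trees in which they fall in the same leaf, and impute each target location from its k most proximate reference plots. The variable names, single response, and example data are illustrative, not the study's actual model setup:

      import numpy as np
      from sklearn.ensemble import RandomForestRegressor

      def rf_knn_impute(X_ref, y_ref, X_target, k=5, n_trees=200, seed=0):
          """Impute a response for target observations from their k nearest
          reference plots, with nearness defined by random-forest leaf proximity."""
          rf = RandomForestRegressor(n_estimators=n_trees, random_state=seed)
          rf.fit(X_ref, y_ref)

          # Leaf index of every observation in every tree: shape (n_obs, n_trees)
          leaves_ref = rf.apply(X_ref)
          leaves_tgt = rf.apply(X_target)

          imputed = np.empty(len(X_target))
          for i, leaf_row in enumerate(leaves_tgt):
              # proximity = fraction of trees sharing a terminal node with this target
              proximity = (leaves_ref == leaf_row).mean(axis=1)
              nearest = np.argsort(proximity)[-k:]      # k most proximate reference plots
              imputed[i] = y_ref[nearest].mean()        # impute (here: plain mean of neighbors)
          return imputed

      # Hypothetical example: predictors are remote-sensing bands, response is biomass (t/ha)
      rng = np.random.default_rng(1)
      X_ref = rng.random((60, 4))
      y_ref = 100 * X_ref[:, 0] + 10 * rng.random(60)
      X_new = rng.random((5, 4))
      print(rf_knn_impute(X_ref, y_ref, X_new))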

  11. Multinational Activities of Major U.S. Automotive Producers : Volume 5. Diffusion of Production and Sales Operations Abroad.

    DOT National Transportation Integrated Search

    1978-09-01

    This is Volume V on the multi-national activities of the major U.S. automotive producers. The purpose of this Volume is to evaluate the foreign manufacturing and sales activities of the General Motors Corporation, Ford Motor Company, and Chrysler Cor...

  12. Multinational Activities of Major U.S. Automotive Producers : Volume 3. Research, Development, and Engineering Abroad

    DOT National Transportation Integrated Search

    1978-09-01

    This is Volume III of the report on the multinational activities of the four major U.S. automotive producers. The primary objective of Volume III is to evaluate the RD&E activities performed abroad by the U.S. automotive multinationals in order to de...

  13. Low current plasmatron fuel converter having enlarged volume discharges

    DOEpatents

    Rabinovich, Alexander; Alexeev, Nikolai; Bromberg, Leslie; Cohn, Daniel R.; Samokhin, Andrei

    2005-04-19

    A novel apparatus and method is disclosed for a plasmatron fuel converter ("plasmatron") that efficiently uses electrical energy to produce hydrogen rich gas. The volume and shape of the plasma discharge is controlled by a fluid flow established in a plasma discharge volume. A plasmatron according to this invention produces a substantially large effective plasma discharge volume allowing for substantially greater volumetric efficiency in the initiation of chemical reactions within a volume of bulk fluid reactant flowing through the plasmatron.

  14. Low current plasmatron fuel converter having enlarged volume discharges

    DOEpatents

    Rabinovich, Alexander [Swampscott, MA; Alexeev, Nikolai [Moscow, RU; Bromberg, Leslie [Sharon, MA; Cohn, Daniel R [Chestnut Hill, MA; Samokhin, Andrei [Moscow, RU

    2009-10-06

    A novel apparatus and method is disclosed for a plasmatron fuel converter ("plasmatron") that efficiently uses electrical energy to produce hydrogen rich gas. The volume and shape of the plasma discharge is controlled by a fluid flow established in a plasma discharge volume. A plasmatron according to this invention produces a substantially large effective plasma discharge volume allowing for substantially greater volumetric efficiency in the initiation of chemical reactions within a volume of bulk fluid reactant flowing through the plasmatron.

  15. Narrowing the Search After Gravitational-Wave Detections

    NASA Astrophysics Data System (ADS)

    Kohler, Susanna

    2016-11-01

    Now that we're able to detect gravitational waves, the next challenge is to spot electromagnetic signatures associated with gravitational-wave events. A team of scientists has proposed a new algorithm that might narrow the search. [Figure: artist's illustrations of the stellar-merger model for short gamma-ray bursts. In the model, 1) two neutron stars inspiral, 2) they merge and produce a gamma-ray burst, 3) a small fraction of their mass is flung out and radiates as a kilonova, 4) a massive neutron star or black hole with a disk remains after the event. NASA, ESA, and A. Feild (STScI).] Light from Neutron-Star Mergers: Just over a year ago, LIGO detected its first gravitational-wave signal: GW150914, produced when two black holes merged. While we didn't expect to see any sort of light-based signal from this merger, we could expect to see transient electromagnetic signatures in the case of a neutron star-black hole merger or a neutron star-neutron star merger, in the form of a kilonova or a short gamma-ray burst. While we haven't yet detected any mergers involving neutron stars, LIGO has the sensitivity to make these detections in the local universe, and we hope to start seeing them soon! Finding the electromagnetic companions to gravitational-wave signals would be the best way to probe the evolution history of the universe and learn what happens when evolved stars collide. So how do we hunt them down? [Figure: 2D localization maps for LIGO's detection of GW150914 (black contours), as well as the footprints of follow-up observations (red for radio, green for optical/IR, blue for X-ray). Abbott et al. 2016.] Pinpointing a Volume: The two LIGO detectors can already provide rough 2D localization of where the gravitational-wave signal came from, but the region predicted for GW150914 still covered 600 square degrees, which is a pretty hefty patch of sky! In light of this, the simplest follow-up strategy of tiling large survey observations of the entire predicted region is somewhat impractical and time-consuming. Could we possibly take a more targeted approach? The key, says a team of scientists led by Leo Singer (NASA Goddard SFC), is in using 3D estimates of the source location rather than 2D sky maps: we need to produce distance estimates for the gravitational-wave source as well. Singer and collaborators have developed an algorithm that, from a gravitational-wave signal, can produce a fast full-volume estimate of the probability distribution for its source's location. [Figure: volume rendering of the 90% credible region for a simulated gravitational-wave event, superimposed over a galaxy map for the region. Green crosshairs represent the true location of the source; the most massive galaxies inside the credible region are highlighted. Searching only these galaxies could significantly reduce the observing time needed to detect an electromagnetic counterpart. Singer et al. 2016.] We can then easily cross-reference the volume predicted to contain the source against a galaxy catalog to identify the most probable host galaxies for the signal. This approach allows us to target specific galaxies for rapid observations with follow-up campaigns in the search for a counterpart. Targeted Efficiency: Singer and collaborators' approach would make searching for electromagnetic counterparts to gravitational-wave events a much more efficient process.
One particular advantage would be in reducing the number of false positives: for a typical wide-field follow-up campaign searching 100 square degrees, hundreds of contaminating supernovae would be in the field. Targeting only 10 x 10 patches around 100 nearby galaxies, however, reduces the background to fewer than 10 contaminating supernovae. An additional benefit is that this targeted strategy opens the door of gravitational-wave follow-up to many small-field-of-view, large-aperture telescopes, instead of limiting the task to broad synoptic surveys. This permits the involvement of many more campaigns in the hunt for the important electromagnetic counterparts to gravitational waves. Note: want to check out the team's data? It's publicly available here! Citation: Leo P. Singer et al. 2016 ApJL 829 L15. doi:10.3847/2041-8205/829/1/L15
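
    A highly simplified sketch of the catalog cross-referencing step: given some callable that evaluates the 3D posterior probability density at a sky position and distance (here a stand-in Gaussian blob, not the output of the actual localization code), rank catalog galaxies by density times a mass weight and observe the top candidates first. The catalog entries and all numbers are hypothetical, and the toy ignores spherical geometry:

      import numpy as np

      def rank_galaxies(catalog, posterior_density, top_n=5):
          """Order candidate host galaxies by 3D posterior density times stellar mass."""
          scores = [posterior_density(g["ra"], g["dec"], g["dist"]) * g["mass"] for g in catalog]
          order = np.argsort(scores)[::-1]
          return [catalog[i] for i in order[:top_n]]

      # Stand-in posterior: an isotropic Gaussian blob centered on (ra0, dec0, d0)
      def toy_posterior(ra, dec, dist, ra0=130.0, dec0=-40.0, d0=75.0, sig_sky=5.0, sig_d=20.0):
          return np.exp(-((ra - ra0) ** 2 + (dec - dec0) ** 2) / (2 * sig_sky ** 2)
                        - (dist - d0) ** 2 / (2 * sig_d ** 2))

      catalog = [  # hypothetical galaxies: deg, deg, Mpc, arbitrary mass units
          {"name": "G1", "ra": 128.2, "dec": -41.0, "dist": 70.0, "mass": 5.0},
          {"name": "G2", "ra": 133.5, "dec": -38.9, "dist": 90.0, "mass": 1.0},
          {"name": "G3", "ra": 150.0, "dec": -20.0, "dist": 75.0, "mass": 8.0},
      ]
      for g in rank_galaxies(catalog, toy_posterior):
          print(g["name"])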

  16. Mapping the energy footprint of produced water management in New Mexico

    NASA Astrophysics Data System (ADS)

    Zemlick, Katie; Kalhor, Elmira; Thomson, Bruce M.; Chermak, Janie M.; Sullivan Graham, Enid J.; Tidwell, Vincent C.

    2018-02-01

    Hydraulic fracturing (HF) and horizontal drilling have revolutionized the fossil fuel industry by enabling production from unconventional oil and gas (UOG) reserves. However, UOG development requires large volumes of water, and subsequent oil and gas production from both conventional and unconventional wells generates large volumes of produced water (PW). While PW is usually considered a waste product, its reuse may lessen demand for freshwater supplies, reduce costs for transportation and disposal, and reduce the risks for injection-induced seismicity. Whether this water is disposed of or treated and reused, both methods require significant amounts of energy. The objective of this study was to identify the primary energy demands of alternative water management strategies, and to characterize and quantify their geographic variability in four oil and gas producing basins in New Mexico using a single year of production. Results illustrate the importance of each component of each produced water management strategy in determining its total energy footprint. Based on 2015 production and water use data, the energy to extract fresh groundwater for hydraulic fracturing (34 GWh-th yr^-1) exceeds the energy that would be required if the same volume of PW were treated chemically (19 GWh-th yr^-1). In addition, the energy required to transport fresh water and dispose of PW (167 GWh-th yr^-1) is far greater than that required to move treated PW (8 GWh-th yr^-1) to a point of reuse. Furthermore, transportation distances, which contribute significantly to the total energy footprint of a given management strategy, are underestimated by nearly 50% state-wide. This indicates that reuse may be an even more energy efficient way to manage PW, even with energy-intensive treatment strategies like electrocoagulation. Reuse of PW for HF is not only more energy efficient than conventional management techniques, it also reduces both demand for scarce fresh water resources and use of disposal wells. By evaluating components of each management strategy individually, this work illustrates how the energy footprint of regional PW management can be reduced. The advent of UOG recovery in the last decade highlights the need to understand existing water management in the industry, identify opportunities and strategies for improvement, and recognize that these dynamics are likely to change into the future.
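
    The comparisons above amount to summing, for each management strategy, water volume times a per-unit energy intensity for each component (extraction, transport, treatment or disposal). A schematic of that accounting, with placeholder volumes and intensities that are not the study's coefficients:

      # Energy footprint of a produced-water management strategy (schematic)
      # The intensities below are placeholders, NOT the values estimated in the study.
      components = {
          "freshwater_extraction": {"volume_m3": 2.0e6, "kwh_th_per_m3": 1.5},
          "truck_transport":       {"volume_m3": 2.0e6, "kwh_th_per_m3": 7.0},
          "disposal_injection":    {"volume_m3": 2.0e6, "kwh_th_per_m3": 0.8},
      }

      def strategy_energy_gwh(components):
          """Total thermal energy (GWh-th) = sum over components of volume * intensity."""
          return sum(c["volume_m3"] * c["kwh_th_per_m3"] for c in components.values()) / 1.0e6

      print(round(strategy_energy_gwh(components), 1))  # GWh-th for this hypothetical strategy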

  17. Highly dissipative Hénon map behavior in the four-level model of the CO2 laser with modulated losses

    NASA Astrophysics Data System (ADS)

    Pando L., C. L.; Acosta, G. A. Luna; Meucci, R.; Ciofini, M.

    1995-02-01

    We show that the four-level model for the CO2 laser with modulated losses behaves in a qualitatively similar way to the highly dissipative Hénon map. The ubiquity of elements of the universal sequence, their related symbolic dynamics, and the presence of reverse bifurcations of chaotic bands in the model are reminiscent of the logistic map, which is the limit of the Hénon map when the Jacobian equals zero. The coexistence of attractors, the dynamics related to contraction of volumes in phase space, and the associated return maps can be correlated with those of the highly dissipative Hénon map.
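
    The limit mentioned above is easy to see numerically: iterating the Hénon map x_{n+1} = 1 - a*x_n^2 + y_n, y_{n+1} = b*x_n with a small |b| (strong dissipation, Jacobian magnitude |b| near zero) gives orbits close to those of the one-dimensional quadratic (logistic-type) map x_{n+1} = 1 - a*x_n^2. A short sketch with illustrative parameters:

      def henon_orbit(a=1.4, b=0.05, x=0.1, y=0.0, n=10):
          """Iterate the Henon map; small |b| means strong dissipation (|Jacobian| = |b|)."""
          pts = []
          for _ in range(n):
              x, y = 1.0 - a * x * x + y, b * x
              pts.append(x)
          return pts

      def quadratic_orbit(a=1.4, x=0.1, n=10):
          """The b -> 0 limit: the one-dimensional quadratic (logistic-type) map."""
          pts = []
          for _ in range(n):
              x = 1.0 - a * x * x
              pts.append(x)
          return pts

      for h, q in zip(henon_orbit(), quadratic_orbit()):
          print(f"{h:+.4f}  {q:+.4f}")   # the two orbits track each other closely at first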

  18. A regional land use survey based on remote sensing and other data: A report on a LANDSAT and computer mapping project, volume 2

    NASA Technical Reports Server (NTRS)

    Nez, G. (Principal Investigator); Mutter, D.

    1977-01-01

    The author has identified the following significant results. The project mapped land use/cover classifications from LANDSAT computer compatible tape data and combined those results with other multisource data via computer mapping/compositing techniques to analyze various land use planning/natural resource management problems. Data were analyzed on 1:24,000 scale maps at 1.1 acre resolution. LANDSAT analysis software and linkages with other computer mapping software were developed. Significant results were also achieved in training, communication, and identification of needs for developing the LANDSAT/computer mapping technologies into operational tools for use by decision makers.

  19. Grid-based mapping: A method for rapidly determining the spatial distributions of small features over very large areas

    NASA Astrophysics Data System (ADS)

    Ramsdale, Jason D.; Balme, Matthew R.; Conway, Susan J.; Gallagher, Colman; van Gasselt, Stephan A.; Hauber, Ernst; Orgel, Csilla; Séjourné, Antoine; Skinner, James A.; Costard, Francois; Johnsson, Andreas; Losiak, Anna; Reiss, Dennis; Swirad, Zuzanna M.; Kereszturi, Akos; Smith, Isaac B.; Platz, Thomas

    2017-06-01

    The increased volume, spatial resolution, and areal coverage of high-resolution images of Mars over the past 15 years have led to an increased quantity and variety of small-scale landform identifications. Though many such landforms are too small to represent individually on regional-scale maps, determining their presence or absence across large areas helps form the observational basis for developing hypotheses on the geological nature and environmental history of a study area. The combination of improved spatial resolution and near-continuous coverage significantly increases the time required to analyse the data. This becomes problematic when attempting regional or global-scale studies of metre and decametre-scale landforms. Here, we describe an approach for mapping small features (from decimetre to kilometre scale) across large areas, formulated for a project to study the northern plains of Mars, and provide context on how this method was developed and how it can be implemented. Rather than "mapping" with points and polygons, grid-based mapping uses a "tick box" approach to efficiently record the locations of specific landforms (we use an example suite of glacial landforms, including viscous flow features, the latitude-dependent mantle, and polygonised ground). A grid of squares (e.g. 20 km by 20 km) is created over the mapping area. Then the basemap data are systematically examined, grid-square by grid-square at full resolution, in order to identify the landforms while recording the presence or absence of selected landforms in each grid-square to determine spatial distributions. The result is a series of grids recording the distribution of all the mapped landforms across the study area. In some ways, these are equivalent to raster images, as they show a continuous distribution-field of the various landforms across a defined (rectangular, in most cases) area. When overlain on context maps, these form a coarse, digital landform map. We find that grid-based mapping provides an efficient solution to the problems of mapping small landforms over large areas, by providing a consistent and standardised approach to spatial data collection. The simplicity of the grid-based mapping approach makes it extremely scalable and workable for group efforts, requiring minimal user experience and producing consistent and repeatable results. The discrete nature of the datasets, simplicity of approach, and divisibility of tasks open up the possibility for citizen science, in which crowdsourcing large grid-based mapping areas could be applied.
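
    A minimal sketch of the bookkeeping behind grid-based mapping: given the coordinates of identified landforms of one type, mark presence or absence in fixed-size grid squares (20 km squares, as in the example above). The coordinates and mapping extent are hypothetical:

      import numpy as np

      def presence_grid(x, y, extent, cell=20_000.0):
          """Boolean presence/absence grid for one landform type.

          x, y:   arrays of landform coordinates (same projected units as extent)
          extent: (xmin, ymin, xmax, ymax) of the mapping area
          cell:   grid-square size (20 km squares in the example above)
          """
          xmin, ymin, xmax, ymax = extent
          ncols = int(np.ceil((xmax - xmin) / cell))
          nrows = int(np.ceil((ymax - ymin) / cell))
          grid = np.zeros((nrows, ncols), dtype=bool)
          cols = ((np.asarray(x) - xmin) // cell).astype(int)
          rows = ((np.asarray(y) - ymin) // cell).astype(int)
          grid[rows, cols] = True   # tick the box for every square that contains the landform
          return grid

      # Hypothetical: three viscous-flow features inside a 100 km x 60 km area
      g = presence_grid([5_000, 47_000, 48_000], [10_000, 55_000, 56_000], (0, 0, 100_000, 60_000))
      print(g.astype(int))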

  20. Use of Major- and Minor-Element Mapping to Measure Chemical Variability in Diogenite Pyroxenes

    NASA Astrophysics Data System (ADS)

    Spilde, M. N.; Papike, J. J.

    1993-07-01

    Diogenite orthopyroxene grains have been shown to exhibit chemical variability within individual meteorite samples, e.g., the population groups reported by Hewins [1]. Our previous work [2] has shown a great deal of inter- and intragrain variability in OPX. The Garland diogenite, for example, appears to have two distinct populations of OPX, based on Cr/Al ratios. However, within individual crystals, excursions of Cr/Al ratios are present that may span a wide range within each population group. We are presently conducting SIMS analysis of OPX in diogenites and, therefore, the chemical variability of analyzed pyroxenes must be completely determined in order to find that portion of the crystal that most accurately records the igneous, rather than the metamorphic history. Optical analysis alone is not sufficient to categorize the pyroxene crystals. For example, numerous grains in EET 83246 appear to be zoned, with changes in interference colors toward the rims. However, EDS mapping indicates that the rims are chemically similar to the cores; the interference colors are interpreted to be due to some sort of grain abrasion process that occurred during brecciation. Backscattered image mapping at low magnification can identify intergrain variations and gross intragrain chemical variations, but this must be followed up with more detailed elemental mapping. Our microprobe employs a Link (Oxford) eXL II analyzer with full-stage automation, so that we can combine EDS maps of major elements with WDS maps of minor elements (Al, Ti, Mn, Cr). Large area maps (>1 mm) are produced using stage rastering to avoid defocusing of the WDS spectrometers. In a final step, quantitative elemental maps of target grains are produced, whereby the characteristic X-ray intensity collected at each map pixel is background suppressed and fitted against a standard intensity to yield an apparent concentration. Appropriate standards, close to the composition of the mapped grain, are used such that the apparent concentration is close to the real (ZAF-corrected) concentration. A number of grains have been mapped in Roda, Garland, EET 83246 and LEW 88008. Except for Garland, OPX grains are relatively homogeneous. Some large Roda grains show very fine (100) augite lamellae along with small augite blebs and scattered spinel along relict grain boundaries. Orthopyroxenes in Garland exhibit rare zoning and thin augite lamellae. Both symmetrical and asymmetrical zoning are evident in elemental maps; an example of nearly symmetrical zoning is shown in the Fe and Mg quantitative maps in Fig. 1, which appears in the hard copy (note a companion abstract [3] that illustrates a microprobe traverse across this grain). In addition, some grains show evidence of partial relict rims, indicating that the grains are brecciated fragments of larger, zoned grains. These brecciated pieces may explain some, but not all, of the intragrain variability seen in Garland. Because Cr decreases toward the rim while Al remains relatively constant, those fragments from the grain core will have higher Cr/Al ratios than fragments from the rim. Acknowledgment: This research was supported by NASA Grant NAGW-3347 and the Institute of Meteoritics. References: [1] Hewins R. (1980) LPSC XI, 441-443. [2] Papike J. J. et al. (1993) LPSC XXIV, 1109-110. [3] Papike J. J. and Spilde M. N. (1993), this volume.

  1. A study of an orbital radar mapping mission to Venus. Volume 3: Parametric studies and subsystem comparisons

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Parametric studies and subsystem comparisons for the orbital radar mapping mission to planet Venus are presented. Launch vehicle requirements and primary orbiter propulsion system requirements are evaluated. The systems parametric analysis indicated that orbit size and orientation interrelated with almost all of the principal spacecraft systems and influenced significantly the definition of orbit insertion propulsion requirements, weight in orbit capability, radar system design, and mapping strategy.

  2. Laser Scanning for the definition of high resolution topography in the Apuan Alps (IT) marble district

    NASA Astrophysics Data System (ADS)

    Riccucci, Silvia; Salvini, Riccardo; Francioni, Mirko

    2010-05-01

    The present paper describes the results of five laser scanning surveys performed with a Leica™ ScanStation2 with the aim of producing a high-definition topography of a quarry at the scale of 1:1,000. The project stems from joint research carried out by the authors in collaboration with the Local Sanitary Unit Agency (ASL n.1) of Massa and Carrara (IT). The main objective of this work has been the survey of both the exploited quarry walls and the upper residual and natural slopes. The area of interest is characterized by several quarry fronts that develop in variable, alternating directions, creating very evident rock spurs trending from NW-SE to NE-SW, over a total extent of about 2 hectares. Moreover, the quarry walls are sub-vertical and sometimes overhanging, with a total height of 100 m with respect to the quarry floor. Differential GPS and orthometric correction have been applied in order to co-register and georeference the five point clouds; for these purposes a series of optical targets was measured using a laser total station. The subsequent phase has been mesh construction and editing, from which a topographic map with 25 cm contour equidistance has been created. The planimetric map shows the position and geometry of crests, banks, escarpments, walls, and all other exploitation features, as well as the characteristics of the upper natural slopes. Morphological profiles along the maximum slope have been produced in order to better plan future extractive activities according to the regional law. To make perspective observation of detailed and overall areas easier, 3D views with multi-directional orientation have been produced. The resulting data have been published using the free Leica™ TruView plug-in for Internet Explorer, allowing users to easily view the photographs and measure the laser scan point clouds. Processing results have highlighted the higher spatial resolution of data from laser scanning with respect to traditional topographic methodologies. Such information can be used for multitemporal evaluation of mining volumes. For local authorities, the control of extraction volumes can be executed more easily and rapidly by a map algebra image-differencing operation between bi-temporal Digital Dense Surface Models (DDSMs) of the quarry walls. Finally, the DDSM has been utilized by the authors to analyze rock slope stability, to collect engineering-geological data, such as joint attitude and spacing, and to perform detailed analysis of rockfall trajectories.

  3. Digital data sets for map products produced as part of the Black Hills Hydrology Study, western South Dakota

    USGS Publications Warehouse

    Williamson, Joyce E.; Jarrell, Gregory J.; Clawges, Rick M.; Galloway, Joel M.; Carter, Janet M.

    2000-01-01

    This compact disk contains digital data produced as part of the 1:100,000-scale map products for the Black Hills Hydrology Study conducted in western South Dakota. The digital data include 28 individual Geographic Information System (GIS) data sets: data sets for the hydrogeologic unit map including all mapped hydrogeologic units within the study area (1 data set) and major geologic structure including anticlines and synclines (1 data set); data sets for potentiometric maps including the potentiometric contours for the Inyan Kara, Minnekahta, Minnelusa, Madison, and Deadwood aquifers (5 data sets), wells used as control points for each aquifer (5 data sets), and springs used as control points for the potentiometric contours (1 data set); and data sets for the structure-contour maps including the structure contours for the top of each formation that contains major aquifers (5 data sets), wells and test holes used as control points for each formation (5 data sets), and surficial deposits (alluvium and terrace deposits) that directly overlie each of the major aquifer outcrops (5 data sets). These data sets were used to produce the maps published by the U.S. Geological Survey.

  4. Comparison of SAM and OBIA as Tools for Lava Morphology Classification - A Case Study in Krafla, NE Iceland

    NASA Astrophysics Data System (ADS)

    Aufaristama, Muhammad; Hölbling, Daniel; Höskuldsson, Ármann; Jónsdóttir, Ingibjörg

    2017-04-01

    The Krafla volcanic system is part of the Icelandic North Volcanic Zone (NVZ). During the Holocene, two eruptive events occurred in Krafla, in 1724-1729 and 1975-1984. The last eruptive episode (1975-1984), known as the "Krafla Fires", resulted in nine volcanic eruption episodes. The total area covered by the lavas from this eruptive episode is 36 km2 and the volume is about 0.25-0.3 km3. Lava morphology refers to the characteristics of the surface morphology of a lava flow after solidification. The typical morphology of lava can be used as a primary basis for the classification of lava flows when rheological properties cannot be directly observed during emplacement, and also for better understanding the behavior of lava flow models. Although mapping of lava flows in the field is relatively accurate, such traditional methods are time-consuming, especially when the lava covers large areas, as is the case in Krafla. Semi-automatic mapping methods that make use of satellite remote sensing data allow for efficient and fast mapping of lava morphology. In this study, two semi-automatic methods for lava morphology classification are presented and compared using Landsat 8 (30 m spatial resolution) and SPOT-5 (10 m spatial resolution) satellite images. For assessing the classification accuracy, the results from semi-automatic mapping were compared to the respective results from visual interpretation. On the one hand, the Spectral Angle Mapper (SAM) classification method was used. With this method an image is classified according to the spectral similarity between the image reflectance spectra and reference reflectance spectra. SAM successfully produced detailed lava surface morphology maps. However, the pixel-based approach partly leads to a salt-and-pepper effect. On the other hand, we applied the Random Forest (RF) classification method within an object-based image analysis (OBIA) framework. This statistical classifier uses a randomly selected subset of training samples to produce multiple decision trees. For final classification of pixels or - in the present case - image objects, the average class-assignment probability predicted by the different decision trees is used. While the resulting OBIA classification of lava morphology types shows a high coincidence with the reference data, the approach is sensitive to the segmentation-derived image objects that constitute the base units for classification. Both semi-automatic methods produce reasonable results in the Krafla lava field, although identifying the different pahoehoe and aa lava types proved difficult. The use of satellite remote sensing data shows a high potential for fast and efficient classification of lava morphology, particularly over large and inaccessible areas.
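
    A minimal sketch of the SAM rule described above: each pixel is assigned to the reference class whose spectrum subtends the smallest angle with the pixel spectrum. The band count, class names and reference spectra below are illustrative assumptions, not values from the study.

    ```python
    # Hedged sketch of Spectral Angle Mapper (SAM) classification.
    import numpy as np

    def spectral_angle(pixel: np.ndarray, reference: np.ndarray) -> float:
        """Angle (radians) between a pixel spectrum and a reference spectrum."""
        cos = np.dot(pixel, reference) / (np.linalg.norm(pixel) * np.linalg.norm(reference))
        return float(np.arccos(np.clip(cos, -1.0, 1.0)))

    def sam_classify(image: np.ndarray, references: dict) -> np.ndarray:
        """image: (rows, cols, bands); references: class name -> (bands,) spectrum."""
        names = list(references)
        angles = np.stack([
            np.apply_along_axis(spectral_angle, 2, image, references[n]) for n in names
        ])
        return np.array(names)[np.argmin(angles, axis=0)]

    # Toy example: 2 classes, 4 spectral bands
    refs = {"aa": np.array([0.10, 0.12, 0.15, 0.18]),
            "pahoehoe": np.array([0.20, 0.25, 0.28, 0.30])}
    img = np.random.rand(5, 5, 4) * 0.3
    print(sam_classify(img, refs))
    ```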

  5. Database for the geologic map of Upper Geyser Basin, Yellowstone National Park, Wyoming

    USGS Publications Warehouse

    Abendini, Atosa A.; Robinson, Joel E.; Muffler, L. J. Patrick; White, D. E.; Beeson, Melvin H.; Truesdell, A. H.

    2015-01-01

    This dataset contains contacts, geologic units, and map boundaries from Miscellaneous Investigations Series Map I-1371, "Geologic map of Upper Geyser Basin, Yellowstone National Park, Wyoming". The dataset was constructed to produce a digital geologic map as a basis for ongoing studies of hydrothermal processes.

  6. Deciding Optimal Noise Monitoring Sites with Matrix Gray Absolute Relation Degree Theory

    NASA Astrophysics Data System (ADS)

    Gao, Zhihua; Li, Yadan; Zhao, Limin; Wang, Shuangwei

    2015-08-01

    Noise maps are applied to assess noise levels in cities all around the world. There are mainly two ways of producing noise maps: one is to produce them through theoretical simulations using the surrounding conditions, such as traffic flow and building distribution; the other is to calculate noise levels from actual measurement data collected by noise monitors. Current literature mainly focuses on considering more factors that affect sound propagation in theoretical simulations, and on interpolation methods for producing noise maps based on noise measurements. Although many factors are considered during simulation, noise maps still have to be calibrated against actual noise measurements. Therefore, the way noise data are obtained is significant to both producing and calibrating a noise map. However, little has been published on rules for selecting the right monitoring sites when a specified number of noise sensors is to be placed, or on the deviation of a noise map produced with data from them. In this work, using matrix Gray Absolute Relation Degree Theory, we calculated the relation degrees between the most precise noise surface and surfaces interpolated from different combinations of a specified number of noise measurements. We found that surfaces plotted from different combinations of noise data produced different relation degrees with the most precise one. We then identified the least significant measurement among the total and calculated the corresponding deviation when it was excluded from the noise surface. Processing the remaining noise data in the same way, we identified the least significant datum among the remaining data one by one. With this method, we optimized the noise sensor distribution in an area of about 2 km2. We also calculated the bias of the surfaces produced with the least significant data removed. Our approach offers a practical solution for the situation faced by most governments, in which only a limited financial budget is available for noise monitoring, especially in less developed regions.
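
    A minimal sketch of the leave-one-out ranking loop described above, under stated assumptions: the paper ranks sensors with matrix Gray Absolute Relation Degree theory, whereas here a simple RMSE between interpolated surfaces stands in as the similarity measure and inverse-distance weighting replaces whatever interpolator was actually used, so this only illustrates the iterative removal idea, not the theory itself.

    ```python
    # Hedged sketch: greedily drop the station whose removal changes the map least.
    import numpy as np

    def idw_surface(xy, values, grid_xy, power=2.0):
        """Inverse-distance-weighted noise surface at grid points grid_xy."""
        d = np.linalg.norm(grid_xy[:, None, :] - xy[None, :, :], axis=2)
        w = 1.0 / np.maximum(d, 1e-9) ** power
        return (w * values).sum(axis=1) / w.sum(axis=1)

    def rank_stations(xy, values, grid_xy):
        """Return stations in removal order, least informative first."""
        reference = idw_surface(xy, values, grid_xy)
        keep, order = list(range(len(values))), []
        while len(keep) > 1:
            losses = []
            for i in keep:
                sub = [j for j in keep if j != i]
                surf = idw_surface(xy[sub], values[sub], grid_xy)
                losses.append(np.sqrt(np.mean((surf - reference) ** 2)))
            drop = keep[int(np.argmin(losses))]     # removing it changes the map least
            order.append((drop, min(losses)))
            keep.remove(drop)
        return order  # (station index, deviation when removed)

    # Toy example: 8 stations on a 2 km x 1 km area, evaluated on a coarse grid
    rng = np.random.default_rng(0)
    xy = rng.uniform([0, 0], [2000, 1000], size=(8, 2))
    values = 55 + 10 * rng.random(8)                # dB(A) readings
    gx, gy = np.meshgrid(np.linspace(0, 2000, 20), np.linspace(0, 1000, 10))
    grid = np.column_stack([gx.ravel(), gy.ravel()])
    print(rank_stations(xy, values, grid))
    ```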

  7. Measuring Waves and Erosion in Underwater Oil Blobs and Monitoring Other Arbitrary Surfaces with a Kinect v2 Time-of-Flight Camera

    NASA Astrophysics Data System (ADS)

    Butkiewicz, T.

    2014-12-01

    We developed free software that enables researchers to utilize Microsoft's new Kinect for Windows v2 sensor for a range of coastal and ocean mapping applications, as well as for monitoring and measuring experimental scenes. While the original Kinect device used structured light and had very poor resolution, many geophysical researchers found uses for it in their experiments. This next generation of the sensor uses time-of-flight technology and can produce higher resolution depth measurements with an order of magnitude more accuracy. It is also capable of measuring through and under water. An analysis tool in our application lets users quickly select any arbitrary surface in the sensor's view. The tool automatically scans the surface, then calibrates and aligns a measurement volume to it. Depth readings from the sensor are converted into 3D point clouds, and points falling within this volume are projected into surface coordinates. Raster images can be output that consist of height fields aligned to the surface, generated from these projected measurements and interpolations between them. Images have a simple 1 pixel = 1 mm resolution and intensity values representing mm in height from the base plane, which enables measurements and calculations to be conducted easily on the images in other analysis packages. Single snapshots can be taken manually on demand, or the software can monitor the surface automatically, capturing frames at preset intervals. This produces time lapse animations of dynamically changing surfaces. We apply this analysis tool to an experiment studying the behavior of underwater oil in response to flowing water of different speeds and temperatures. Blobs of viscous oils are placed in a flume apparatus, which circulates water past them. Over the course of a couple of hours, the oil blobs spread out, waves slowly ripple across their surfaces, and erosion occurs as smaller blobs break off from the main blob. All of this can be captured in 3D, with mm accuracy, through the water using the Kinect for Windows v2 sensor and our K2MapKit software.
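
    A minimal sketch of the surface-aligned rasterization step described above: points inside a measurement volume are projected onto a base plane and binned into a 1 px = 1 mm height field. The plane parameters, units and function names are illustrative assumptions, not the K2MapKit implementation.

    ```python
    # Hedged sketch: project a 3D point cloud into a plane-aligned height field.
    import numpy as np

    def height_field(points_mm: np.ndarray, origin: np.ndarray,
                     u: np.ndarray, v: np.ndarray, n: np.ndarray,
                     width_mm: int, height_mm: int) -> np.ndarray:
        """Project 3D points (N,3) into plane coords (u, v) and grid heights along n."""
        rel = points_mm - origin
        cols = np.floor(rel @ u).astype(int)          # 1 mm bins along u
        rows = np.floor(rel @ v).astype(int)          # 1 mm bins along v
        heights = rel @ n                             # mm above the base plane
        grid = np.full((height_mm, width_mm), np.nan)
        inside = (cols >= 0) & (cols < width_mm) & (rows >= 0) & (rows < height_mm)
        for r, c, h in zip(rows[inside], cols[inside], heights[inside]):
            if np.isnan(grid[r, c]) or h > grid[r, c]:
                grid[r, c] = h                        # keep the highest return per cell
        return grid

    # Toy example: a blob of points over a 100 mm x 100 mm base plane
    pts = np.random.rand(5000, 3) * [100, 100, 20]
    hf = height_field(pts, origin=np.zeros(3),
                      u=np.array([1.0, 0, 0]), v=np.array([0, 1.0, 0]),
                      n=np.array([0, 0, 1.0]), width_mm=100, height_mm=100)
    print(np.nanmax(hf))
    ```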

  8. Recent Developments in Film and Gas Research in Modified Atmosphere Packaging of Fresh Foods.

    PubMed

    Zhang, Min; Meng, Xiangyong; Bhandari, Bhesh; Fang, Zhongxiang

    2016-10-02

    Due to rising consumer awareness of the health benefits of fresh foods, the consumption of fresh and fresh-cut produce has increased steadily in the past few years. Modified atmosphere packaging (MAP) has the potential to become one of the most appropriate technologies for packaging fresh and fresh-cut produce. MAP has the advantages of extending shelf life, preserving or stabilizing the desired properties of fresh produce, and providing convenience in handling and distribution. The success of MAP for fresh foods depends on many factors, including the type of fresh food, storage temperature and humidity, gas composition, and the characteristics of the packaging materials. This paper reviews recent developments, highlighting the most critical film and gas factors affecting the quality of MAP fresh foods. Although innovation in food packaging technology will continue to promote the development of novel MAP, concentrated research effort from scientists and engineers is still needed to develop MAP that focuses on consumer requirements, enhanced product quality, environmentally friendly design, and cost-effective application.

  9. Algebraic grid generation using tensor product B-splines. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Saunders, B. V.

    1985-01-01

    Finite difference methods are more successful if the accompanying grid has lines that are smooth and nearly orthogonal. This thesis describes the development of an algorithm that produces such a grid when given the boundary description. Topological considerations in structuring the grid generation mapping are discussed. The concept of the degree of a mapping, and how it can be used to determine what requirements are necessary if a mapping is to produce a suitable grid, is examined. The grid generation algorithm uses a mapping composed of bicubic B-splines. Boundary coefficients are chosen so that the splines produce Schoenberg's variation diminishing spline approximation to the boundary. Interior coefficients are initially chosen to give a variation diminishing approximation to the transfinite bilinear interpolant of the function mapping the boundary of the unit square onto the boundary grid. The practicality of optimizing the grid by minimizing a functional involving the Jacobian of the grid generation mapping at each interior grid point and the dot product of vectors tangent to the grid lines is investigated. Grids generated by using the algorithm are presented.
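
    A minimal sketch of the transfinite bilinear interpolant mentioned above: the four boundary curves of a region, parameterized over [0, 1], are blended into an interior grid. The boundary curves here are illustrative, and the thesis additionally approximates everything with variation-diminishing B-splines, which is not shown.

    ```python
    # Hedged sketch: transfinite (Coons) bilinear blending of four boundary curves.
    import numpy as np

    def transfinite_grid(bottom, top, left, right, nu=21, nv=21):
        """Blend four boundary curves into an interior grid (nu x nv x 2)."""
        grid = np.zeros((nu, nv, 2))
        c00, c10 = bottom(0.0), bottom(1.0)          # corners of the region
        c01, c11 = top(0.0), top(1.0)
        for i, u in enumerate(np.linspace(0, 1, nu)):
            for j, v in enumerate(np.linspace(0, 1, nv)):
                grid[i, j] = ((1 - v) * bottom(u) + v * top(u)
                              + (1 - u) * left(v) + u * right(v)
                              - ((1 - u) * (1 - v) * c00 + u * (1 - v) * c10
                                 + (1 - u) * v * c01 + u * v * c11))
        return grid

    # Toy example: unit square with a bulged top boundary
    bottom = lambda u: np.array([u, 0.0])
    top    = lambda u: np.array([u, 1.0 + 0.2 * np.sin(np.pi * u)])
    left   = lambda v: np.array([0.0, v])
    right  = lambda v: np.array([1.0, v])
    print(transfinite_grid(bottom, top, left, right)[10, 10])  # an interior point
    ```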

  10. Evaluation of a color-coded Landsat 5/6 ratio image for mapping lithologic differences in western South Dakota

    USGS Publications Warehouse

    Raines, Gary L.; Bretz, R.F.; Shurr, George W.

    1979-01-01

    From analysis of a color-coded Landsat 5/6 ratio image, a map of the vegetation density distribution covering 25,000 sq km of western South Dakota has been produced by Raines. This 5/6 ratio image is produced by digitally calculating the ratio of bands 5 and 6 of the Landsat data and then color coding these ratios in an image. Bretz and Shurr compared this vegetation density map with published and unpublished data, primarily from the U.S. Geological Survey and the South Dakota Geological Survey; good correspondence is seen between this map and existing geologic maps, especially the soils map. We believe that this Landsat ratio image can be used as a tool to refine existing maps of surficial geology and bedrock, where bedrock is exposed, and to improve mapping accuracy in areas of poor exposure common in South Dakota. In addition, this type of image could be a useful additional tool in mapping areas that are currently unmapped.
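
    A minimal sketch of the band-ratio step described above: the band 5 / band 6 ratio is computed per pixel and binned into a small set of classes for color coding. The class breaks and bin count are illustrative assumptions, not those used by Raines for the published image.

    ```python
    # Hedged sketch: Landsat band ratio followed by simple class binning.
    import numpy as np

    def ratio_classes(band5: np.ndarray, band6: np.ndarray, n_classes: int = 8):
        """Return (ratio image, integer class image) ready for color coding."""
        ratio = band5.astype(float) / np.maximum(band6.astype(float), 1e-6)
        breaks = np.quantile(ratio, np.linspace(0, 1, n_classes + 1)[1:-1])
        classes = np.digitize(ratio, breaks)      # values 0 .. n_classes-1
        return ratio, classes

    # Toy example with synthetic 8-bit digital numbers
    b5 = np.random.randint(10, 120, size=(100, 100))
    b6 = np.random.randint(20, 160, size=(100, 100))
    ratio, classes = ratio_classes(b5, b6)
    print(classes.min(), classes.max())
    ```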

  11. Assessing Volunteered Geographic Information (vgi) Quality Based on CONTRIBUTORS' Mapping Behaviours

    NASA Astrophysics Data System (ADS)

    Bégin, D.; Devillers, R.; Roche, S.

    2013-05-01

    VGI changed the mapping landscape by allowing people who are not professional cartographers to contribute to large mapping projects, resulting at the same time in concerns about the quality of the data produced. While a number of early VGI studies used conventional methods to assess data quality, such approaches are not always well adapted to VGI. Since VGI is user-generated content, we posit that the features and places mapped by contributors largely reflect contributors' personal interests. This paper proposes studying contributors' mapping processes to understand the characteristics and quality of the data produced. We argue that contributors' behaviour when mapping reflects their motivation and individual preferences in selecting mapped features and delineating mapped areas. Such knowledge of contributors' behaviour could allow for the derivation of information about the quality of VGI datasets. This approach was tested using a sample area from OpenStreetMap, leading to a better understanding of data completeness for contributors' preferred features.

  12. Land use and land cover mapping: City of Palm Bay, Florida

    NASA Technical Reports Server (NTRS)

    Barile, D. D.; Pierce, R.

    1977-01-01

    Two different computer systems were compared for use in making land use and land cover maps. The Honeywell 635 with the LANDSAT signature development program (LSDP) produced a map depicting general patterns, but themes were difficult to classify as specific land use. Urban areas were unclassified. The General Electric Image 100 produced a map depicting eight land cover categories, classifying 68 percent of the total area. Ground truth, LSDP, and Image 100 maps were all made to the same scale for comparison. LSDP agreed with the ground truth for 60 percent and 64 percent of the two test areas compared, while Image 100 was in agreement for 70 percent and 80 percent.

  13. Comparison of an Atomic Model and Its Cryo-EM Image at the Central Axis of a Helix

    PubMed Central

    He, Jing; Zeil, Stephanie; Hallak, Hussam; McKaig, Kele; Kovacs, Julio; Wriggers, Willy

    2016-01-01

    Cryo-electron microscopy (cryo-EM) is an important biophysical technique that produces three-dimensional (3D) density maps at different resolutions. Because more and more models are being produced from cryo-EM density maps, validation of the models is becoming important. We propose a method for measuring local agreement between a model and the density map using the central axis of the helix. This method was tested using 19 helices from cryo-EM density maps between 5.5 Å and 7.2 Å resolution and 94 helices from simulated density maps. This method distinguished most of the well-fitting helices, although challenges exist for shorter helices. PMID:27280059

  14. Automatic transfer function design for medical visualization using visibility distributions and projective color mapping.

    PubMed

    Cai, Lile; Tay, Wei-Liang; Nguyen, Binh P; Chui, Chee-Kong; Ong, Sim-Heng

    2013-01-01

    Transfer functions play a key role in volume rendering of medical data, but transfer function manipulation is unintuitive and can be time-consuming; achieving an optimal visualization of patient anatomy or pathology is difficult. To overcome this problem, we present a system for automatic transfer function design based on visibility distribution and projective color mapping. Instead of assigning opacity directly based on voxel intensity and gradient magnitude, the opacity transfer function is automatically derived by matching the observed visibility distribution to a target visibility distribution. An automatic color assignment scheme based on projective mapping is proposed to assign colors that allow for the visual discrimination of different structures, while also reflecting the degree of similarity between them. When our method was tested on several medical volumetric datasets, the key structures within the volume were clearly visualized with minimal user intervention.
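
    A minimal sketch of the visibility-matching idea described above, reduced to a single ray: per-intensity-bin visibility is measured under front-to-back compositing, and bin opacities are iteratively scaled toward a target visibility distribution. The update rule, bin count and parameters are illustrative assumptions, not the authors' exact algorithm.

    ```python
    # Hedged sketch: derive an opacity transfer function by visibility matching.
    import numpy as np

    def visibility_per_bin(intensities, opacity_tf, n_bins):
        """Fraction of total visibility received by each intensity bin along a ray."""
        bins = np.clip((intensities * n_bins).astype(int), 0, n_bins - 1)
        alpha = opacity_tf[bins]
        transparency = np.concatenate(([1.0], np.cumprod(1.0 - alpha)[:-1]))
        vis = alpha * transparency                     # visibility of each sample
        out = np.bincount(bins, weights=vis, minlength=n_bins)
        return out / max(out.sum(), 1e-12)

    def match_visibility(intensities, target, n_bins=8, iters=50, step=0.5):
        tf = np.full(n_bins, 0.1)                      # initial flat opacity TF
        for _ in range(iters):
            observed = visibility_per_bin(intensities, tf, n_bins)
            ratio = (target + 1e-6) / (observed + 1e-6)
            tf = np.clip(tf * ratio**step, 1e-4, 1.0)  # nudge opacities toward target
        return tf

    ray = np.random.rand(200)                          # normalized sample intensities
    target = np.array([0.0, 0.0, 0.1, 0.1, 0.2, 0.2, 0.2, 0.2])  # emphasize high bins
    print(match_visibility(ray, target))
    ```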

  15. A qualitative enquiry into OpenStreetMap making

    NASA Astrophysics Data System (ADS)

    Lin, Yu-Wei

    2011-04-01

    Based on a case study of the OpenStreetMap community, this paper provides a contextual and embodied understanding of the user-led, user-participatory and user-generated produsage phenomenon. It employs Grounded Theory, Social Worlds Theory, and qualitative methods to illuminate and explore the produsage processes of OpenStreetMap making, and how knowledge artefacts such as maps can be collectively and collaboratively produced by a community of people who are situated in different places around the world but engaged with the same repertoire of mapping practices. The empirical data illustrate that OpenStreetMap itself acts as a boundary object that enables actors from different social worlds to co-produce the Map through interacting with each other and negotiating the meanings of mapping, the mapping data and the Map itself. The discourses also show that unlike traditional maps, which black-box cartographic knowledge and offer a single dominant perspective of cities or places, OpenStreetMap is an embodied epistemic object that embraces different world views. The paper also explores how contributors build their identities as OpenStreetMappers alongside the other identities they have. Understanding the identity-building process helps to understand mapping as an embodied activity with emotional, cognitive and social repertoires.

  16. Layout-aware simulation of soft errors in sub-100 nm integrated circuits

    NASA Astrophysics Data System (ADS)

    Balbekov, A.; Gorbunov, M.; Bobkov, S.

    2016-12-01

    A Single Event Transient (SET), caused by a charged particle traveling through the sensitive volume of an integrated circuit (IC), may in some cases lead to various errors in digital circuits. In technologies below 180 nm, a single particle can affect multiple devices, causing multiple SETs. This adds complexity to the design of fault-tolerant devices, because schematic-level design techniques are of little use without consideration of the layout. The most common layout mitigation technique is spatial separation of the sensitive nodes of hardened circuits. Spatial separation decreases circuit performance and increases power consumption. Spacing should thus be reasonable, and its scaling follows the scaling trend of device dimensions. This paper presents the development of a SET simulation approach comprising SPICE simulation with a "double exponential" current source as the SET model. The technique uses the layout in GDSII format to locate nearby devices that can be affected by a single particle and that can share the generated charge. The developed software tool automates multiple simulations and gathers the produced data to present it as a sensitivity map. Examples of simulations of fault-tolerant cells and their sensitivity maps are presented in this paper.
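
    A minimal sketch of the "double exponential" SET current source named above, the common analytical pulse I(t) = Q/(tau_f - tau_r) * (exp(-t/tau_f) - exp(-t/tau_r)) used to inject a collected charge Q in SPICE fault-injection runs. The charge and time constants below are illustrative, not values from the paper.

    ```python
    # Hedged sketch: double-exponential SET current pulse.
    import numpy as np

    def double_exponential(t, q_coulomb, tau_rise, tau_fall):
        """SET current pulse (A) at times t (s); integrates to q_coulomb."""
        t = np.asarray(t, dtype=float)
        pulse = (q_coulomb / (tau_fall - tau_rise)) * (
            np.exp(-t / tau_fall) - np.exp(-t / tau_rise))
        return np.where(t >= 0.0, pulse, 0.0)

    # Example: 100 fC collected charge, 5 ps rise and 50 ps fall constants
    t = np.linspace(0, 500e-12, 1001)
    i = double_exponential(t, 100e-15, 5e-12, 50e-12)
    print(np.trapz(i, t))   # ~1e-13 C, i.e. the injected charge
    ```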

  17. Investigation of the relative orientation of the system of optical sensors to monitor the technosphere objects

    NASA Astrophysics Data System (ADS)

    Petrochenko, Andrey; Konyakhin, Igor

    2017-06-01

    With the development of robotics, a variety of systems for three-dimensional reconstruction and mapping from sets of images received from optical sensors have become increasingly popular. The main objective of technical and robot vision is the detection, tracking and classification of objects in the space in which these systems and robots operate [15,16,18]. Two-dimensional images sometimes do not contain sufficient information to address these and other problems: constructing a map of the surrounding area for route planning; identifying objects and tracking their relative positions and movements; and selecting objects and their attributes to complement a knowledge base. Three-dimensional reconstruction of the surrounding space provides information on the relative positions of objects, their shapes, and their surface textures. Systems trained on the basis of three-dimensional reconstruction can compare two-dimensional images against the three-dimensional model, which allows volumetric objects to be recognized in flat images. The problem of the relative orientation of industrial robots capable of building three-dimensional scenes of monitored surfaces is therefore becoming increasingly relevant.

  18. Evaluation of satellite remote sensing and automatic data techniques for characterization of wetlands and coastal marshlands. [Atchafalaya River Basin, Louisiana

    NASA Technical Reports Server (NTRS)

    Cartmill, R. H. (Principal Investigator)

    1974-01-01

    The author has identified the following significant results. The evaluation was conducted in a humid swamp and marsh area of southern Louisiana. ERTS digital multispectral scanner data was compared with similar data gathered by intermediate altitude aircraft. Automatic data processing was applied to several data sets to produce simulated color infrared images, analysis of single bands, thematic maps, and surface classifications. These products were used to determine the effectiveness of satellites to monitor accretion of land, locate aquatic plants, determine water characteristics, and identify marsh and forest species. The results show that to some extent all of these can be done with satellite data. It is most effective for monitoring accretion and least effective in locating aquatic plants. The data sets used show that the ERTS data is superior in mapping quality and accuracy to the aircraft data. However, in some applications requiring high resolution or maximum use of intermittent clear weather conditions, data gathering by aircraft is preferable. Data processing costs for equivalent areas are about three times greater for aircraft data than ERTS data. This is primarily because of the larger volume of data generated by the high resolution aircraft system.

  19. Assessment of self-organizing maps to analyze sole-carbon source utilization profiles.

    PubMed

    Leflaive, Joséphine; Céréghino, Régis; Danger, Michaël; Lacroix, Gérard; Ten-Hage, Loïc

    2005-07-01

    The use of community-level physiological profiles obtained with Biolog microplates is widely employed to consider the functional diversity of bacterial communities. Biolog produces a large amount of data, the analysis of which has been the subject of many studies. In most cases, after some transformations, these data were investigated with classical multivariate analyses. Here we provide an alternative to this method: the use of an artificial intelligence technique, the Self-Organizing Map (SOM, an unsupervised neural network). We used data from a microcosm study of algae-associated bacterial communities placed in various nutritive conditions. Analyses were carried out on the net absorbances at two incubation times for each substrate and on the chemical guild categorization of the total bacterial activity. Compared to Principal Components Analysis and cluster analysis, SOMs appeared to be a valuable tool for community classification and for establishing clear relationships between clusters of bacterial communities and sole-carbon source utilization. Specifically, SOMs offered a clear two-dimensional projection of a relatively large volume of data and were easier to interpret than plots commonly obtained with multivariate analyses. They can be recommended for revealing patterns in the temporal evolution of communities' functional diversity.
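
    A minimal sketch of a Self-Organizing Map of the kind used above, applied to a samples x substrates matrix of Biolog absorbances. The grid size, learning rate and training length are illustrative assumptions, not the settings used in the study.

    ```python
    # Hedged sketch: a tiny SOM trained on a community x substrate matrix.
    import numpy as np

    def train_som(data, rows=5, cols=5, iters=2000, lr0=0.5, sigma0=2.0, seed=0):
        """Return a (rows, cols, n_features) codebook trained on data (n, n_features)."""
        rng = np.random.default_rng(seed)
        codebook = rng.random((rows, cols, data.shape[1]))
        grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"), axis=-1)
        for t in range(iters):
            x = data[rng.integers(len(data))]
            dists = np.linalg.norm(codebook - x, axis=2)          # best-matching unit
            bmu = np.unravel_index(np.argmin(dists), dists.shape)
            frac = t / iters                                       # decaying parameters
            lr, sigma = lr0 * (1 - frac), sigma0 * (1 - frac) + 0.5
            h = np.exp(-np.sum((grid - np.array(bmu)) ** 2, axis=2) / (2 * sigma**2))
            codebook += lr * h[..., None] * (x - codebook)
        return codebook

    def map_samples(data, codebook):
        """Assign each sample to its best-matching map unit."""
        flat = codebook.reshape(-1, codebook.shape[-1])
        return [np.unravel_index(np.argmin(np.linalg.norm(flat - x, axis=1)),
                                 codebook.shape[:2]) for x in data]

    # Toy example: 30 "communities" x 31 "substrates"
    data = np.random.rand(30, 31)
    cb = train_som(data)
    print(map_samples(data, cb)[:5])
    ```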

  20. Delineation and segmentation of cerebral tumors by mapping blood-brain barrier disruption with dynamic contrast-enhanced CT and tracer kinetics modeling-a feasibility study.

    PubMed

    Bisdas, S; Yang, X; Lim, C C T; Vogl, T J; Koh, T S

    2008-01-01

    Dynamic contrast-enhanced (DCE) imaging is a promising approach for in vivo assessment of tissue microcirculation. Twenty patients with clinical and routine computed tomography (CT) evidence of intracerebral neoplasm were examined with DCE-CT imaging. Using a distributed-parameter model for tracer kinetics modeling of DCE-CT data, voxel-level maps of cerebral blood flow (F), intravascular blood volume (vi) and intravascular mean transit time (t1) were generated. Permeability-surface area product (PS), extravascular extracellular blood volume (ve) and extraction ratio (E) maps were also calculated to reveal pathologic locations of tracer extravasation, which are indicative of disruptions in the blood-brain barrier (BBB). All maps were visually assessed for quality of tumor delineation and measurement of tumor extent by two radiologists. Kappa (κ) coefficients and their 95% confidence intervals (CI) were calculated to determine the interobserver agreement for each DCE-CT map. There was substantial agreement on tumor delineation quality in the F, ve and t1 maps. The agreement on the quality of the tumor delineation was excellent for the vi, PS and E maps. Concerning the measurement of tumor extent, excellent and nearly excellent agreement was achieved only for the E and PS maps, respectively. According to these results, we performed a segmentation of the cerebral tumors on the basis of the E maps. The interobserver agreement for the tumor extent quantification based on manual segmentation of tumor in the E maps vs. the computer-assisted segmentation was excellent (κ = 0.96, CI: 0.93-0.99). The interobserver agreement for the tumor extent quantification based on computer segmentation in the mean images and the E maps was substantial (κ = 0.52, CI: 0.42-0.59). This study illustrates the diagnostic usefulness of parametric maps associated with BBB disruption on a physiology-based approach and highlights the feasibility of automatic segmentation of cerebral tumors.
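
    A minimal sketch of the agreement statistic reported above, Cohen's kappa for two raters over the same set of labels; the toy voxel labels are illustrative, and the study's exact computation over segmentation extents may differ in detail.

    ```python
    # Hedged sketch: Cohen's kappa for interobserver agreement.
    import numpy as np

    def cohens_kappa(rater_a, rater_b):
        """Kappa = (p_observed - p_chance) / (1 - p_chance)."""
        a, b = np.asarray(rater_a), np.asarray(rater_b)
        labels = np.union1d(a, b)
        p_obs = np.mean(a == b)
        p_chance = sum(np.mean(a == l) * np.mean(b == l) for l in labels)
        return (p_obs - p_chance) / (1.0 - p_chance)

    # Toy example: voxel-wise tumor (1) / background (0) labels from two readers
    a = np.array([1, 1, 0, 0, 1, 0, 1, 1, 0, 0])
    b = np.array([1, 1, 0, 0, 1, 0, 0, 1, 0, 1])
    print(round(cohens_kappa(a, b), 2))   # 0.6 for this toy case
    ```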

  1. Global land ice measurements from space (GLIMS): remote sensing and GIS investigations of the Earth's cryosphere

    USGS Publications Warehouse

    Bishop, Michael P.; Olsenholler, Jeffrey A.; Shroder, John F.; Barry, Roger G.; Raup, Bruce H.; Bush, Andrew B. G.; Copland, Luke; Dwyer, John L.; Fountain, Andrew G.; Haeberli, Wilfried; Kääb, Andreas; Paul, Frank; Hall, Dorothy K.; Kargel, Jeffrey S.; Molnia, Bruce F.; Trabant, Dennis C.; Wessels, Rick L.

    2004-01-01

    Concerns over greenhouse‐gas forcing and global temperatures have initiated research into understanding climate forcing and associated Earth‐system responses. A significant component is the Earth's cryosphere, as glacier‐related feedback mechanisms govern atmospheric, hydrospheric and lithospheric response. Predicting the human and natural dimensions of climate‐induced environmental change requires global, regional and local information about ice‐mass distribution, volumes, and fluctuations. The Global Land‐Ice Measurements from Space (GLIMS) project is specifically designed to produce and augment baseline information to facilitate glacier‐change studies. This requires addressing numerous issues, including the generation of topographic information, anisotropic‐reflectance correction of satellite imagery, data fusion and spatial analysis, and GIS‐based modeling. Field and satellite investigations indicate that many small glaciers and glaciers in temperate regions are downwasting and retreating, although detailed mapping and assessment are still required to ascertain regional and global patterns of ice‐mass variations. Such remote sensing/GIS studies, coupled with field investigations, are vital for producing baseline information on glacier changes, and improving our understanding of the complex linkages between atmospheric, lithospheric, and glaciological processes.

  2. Toward a Digital Thread and Data Package for Metals-Additive Manufacturing.

    PubMed

    Kim, D B; Witherell, P; Lu, Y; Feng, S

    2017-01-01

    Additive manufacturing (AM) has been envisioned by many as a driving factor of the next industrial revolution. Potential benefits of AM adoption include the production of low-volume, customized, complicated parts/products, supply chain efficiencies, shortened time-to-market, and environmental sustainability. Work remains, however, for AM to reach the status of a full production-ready technology. Whereas the ability to create unique 3D geometries has been generally proven, production challenges remain, including lack of (1) data manageability through information management systems, (2) traceability to promote product producibility, process repeatability, and part-to-part reproducibility, and (3) accountability through mature certification and qualification methodologies. To address these challenges in part, this paper discusses the building of data models to support the development of validation and conformance methodologies in AM. We present an AM information map that leverages informatics to facilitate part producibility, process repeatability, and part-to-part reproducibility in an AM process. We present three separate case studies to demonstrate the importance of establishing baseline data structures and part provenance through an AM digital thread.

  3. Toward a Digital Thread and Data Package for Metals-Additive Manufacturing

    PubMed Central

    Kim, D. B.; Witherell, P.; Lu, Y.; Feng, S.

    2017-01-01

    Additive manufacturing (AM) has been envisioned by many as a driving factor of the next industrial revolution. Potential benefits of AM adoption include the production of low-volume, customized, complicated parts/products, supply chain efficiencies, shortened time-to-market, and environmental sustainability. Work remains, however, for AM to reach the status of a full production-ready technology. Whereas the ability to create unique 3D geometries has been generally proven, production challenges remain, including lack of (1) data manageability through information management systems, (2) traceability to promote product producibility, process repeatability, and part-to-part reproducibility, and (3) accountability through mature certification and qualification methodologies. To address these challenges in part, this paper discusses the building of data models to support the development of validation and conformance methodologies in AM. We present an AM information map that leverages informatics to facilitate part producibility, process repeatability, and part-to-part reproducibility in an AM process. We present three separate case studies to demonstrate the importance of establishing baseline data structures and part provenance through an AM digital thread. PMID:28691115

  4. Tensor-based Dictionary Learning for Spectral CT Reconstruction

    PubMed Central

    Zhang, Yanbo; Wang, Ge

    2016-01-01

    Spectral computed tomography (CT) produces an energy-discriminative attenuation map of an object, extending a conventional image volume with a spectral dimension. In spectral CT, the image can be sparsely represented in each of multiple energy channels, and the channel images are highly correlated with one another. Based on these characteristics, we propose a tensor-based dictionary learning method for spectral CT reconstruction. In our method, tensor patches are extracted from an image tensor, which is reconstructed using filtered backprojection (FBP), to form a training dataset. With the Candecomp/Parafac decomposition, a tensor-based dictionary is trained, in which each atom is a rank-one tensor. Then, the trained dictionary is used to sparsely represent image tensor patches during an iterative reconstruction process, and an alternating minimization scheme is adopted for optimization. The effectiveness of our proposed method is validated with both numerically simulated and real preclinical mouse datasets. The results demonstrate that the proposed tensor-based method generally produces superior image quality and leads to more accurate material decomposition than currently popular methods. PMID:27541628

  5. Constraints of subducted slab geometries on trench migration and subduction velocities: flat slabs and slab curtains in the mantle under Asia

    NASA Astrophysics Data System (ADS)

    Wu, J. E.; Suppe, J.; Renqi, L.; Lin, C.; Kanda, R. V.

    2013-12-01

    The past locations, shapes and polarity of subduction trenches provide first-order constraints for plate tectonic reconstructions. Analogue and numerical models of subduction zones suggest that relative subducting (Vs) and overriding (Vor) plate velocities may strongly influence final subducted slab geometries. Here we have mapped the 3D geometries of subducted slabs in the upper and lower mantle of Asia from global seismic tomography. We have incorporated these slabs into plate tectonic models, which allows us to infer the subducting and overriding plate velocities. We describe two distinct slab geometry styles, 'flat slabs' and 'slab curtains', and show their implications for paleo-trench positions and subduction geometries in plate tectonic reconstructions. When compared to analogue and numerical models, the mapped slab styles show similarities to modeled slabs that occupy very different locations within Vs:Vor parameter space. 'Flat slabs' include large swaths of sub-horizontal slabs in the lower mantle that underlie the well-known northward paths of India and Australia from Eastern Gondwana, viewed in a moving hotspot reference frame. At India the flat slabs account for a significant proportion of the predicted lost Ceno-Tethys Ocean since ~100 Ma, whereas at Australia they record the existence of a major 8000 km by 2500-3000 km ocean that existed at ~43 Ma between East Asia, the Pacific and Australia. Plate reconstructions incorporating the slab constraints imply these flat slab geometries were generated when a continent overran oceanic lithosphere to produce rapid trench retreat, or in other words, when subducting and overriding velocities were equal (i.e. Vs ~ Vor). 'Slab curtains' include subvertical Pacific slabs near the Izu-Bonin and Marianas trenches that extend from the surface down to 1500 km in the lower mantle and are 400 to 500 km thick. Reconstructed slab lengths were assessed from tomographic volumes calculated at serial cross-sections. The 'slab curtain' geometry and restored slab lengths indicate a nearly stationary Pacific trench since ~43 Ma. In contrast to the flat slabs, here the reconstructed subduction zone had large subducting plate velocities relative to very small overriding plate velocities (i.e. Vs >> Vor). In addition to flat slabs and slab curtains, we also find other less widespread local subduction settings that lie at other locations in Vs:Vor parameter space or involved other processes. Slabs were mapped using Gocad software. Mapped slabs were restored to a spherical model Earth surface by two approaches: unfolding (i.e. piecewise flattening) to minimize shape and area distortions, and by evaluating mapped slab volumes. GPlates software was used to integrate the mapped slabs with plate tectonic reconstructions.

  6. CIS-lunar space infrastructure lunar technologies: Executive summary

    NASA Technical Reports Server (NTRS)

    Faller, W.; Hoehn, A.; Johnson, S.; Moos, P.; Wiltberger, N.

    1989-01-01

    Technologies necessary for the creation of a cis-Lunar infrastructure, namely: (1) automation and robotics; (2) life support systems; (3) fluid management; (4) propulsion; and (5) rotating technologies, are explored. The technological focal point is on the development of automated and robotic systems for the implementation of a Lunar Oasis produced by Automation and Robotics (LOAR). Under direction from the NASA Office of Exploration, automation and robotics were extensively utilized as an initiating stage in the return to the Moon. A pair of autonomous rovers, modular in design and built from interchangeable and specialized components, is proposed. Utilizing a buddy system, these rovers will be able to support each other and to enhance their individual capabilities. One rover primarily explores and maps while the second rover tests the feasibility of various materials-processing techniques. The automated missions emphasize availability and potential uses of Lunar resources, and the deployment and operations of the LOAR program. An experimental bio-volume is put into place as the precursor to a Lunar environmentally controlled life support system. The bio-volume will determine the reproduction, growth and production characteristics of various life forms housed on the Lunar surface. Physicochemical regenerative technologies and stored resources will be used to buffer biological disturbances of the bio-volume environment. The in situ Lunar resources will be both tested and used within this bio-volume. Second phase development on the Lunar surface calls for manned operations. Repairs and re-configuration of the initial framework will ensue. An autonomously-initiated manned Lunar oasis can become an essential component of the United States space program.

  7. Metateaching and the Instructional Map. Teaching Techniques/Strategies Series, Volume 1.

    ERIC Educational Resources Information Center

    Timpson, William M.

    This book describes a conceptual framework, the "Instructional Map"--a metaphor for envisioning the interconnectedness of teacher and student, teaching and learning, and content and process--that can help teachers plan, sort their way through course material and instructional options, interact with students, and reflect upon progress made and what…

  8. Mapping Surface Features Produced by an Active Landslide

    NASA Astrophysics Data System (ADS)

    Parise, Mario; Gueguen, Erwan; Vennari, Carmela

    2016-10-01

    A large landslide reactivated in December 2013 at Montescaglioso, southern Italy, after 56 hours of rainfall. The landslide disrupted over 500 m of a freeway and involved a few warehouses, a supermarket, and private homes. After the event, field surveys, aided by visual analysis of terrestrial and helicopter photographs, were performed to compile a map of the surface deformations. The geomorphological features mapped included single fractures, sets of fractures, tension cracks, trenches, and pressure ridges. In this paper we present the methodology used and the map obtained through intensive field work, and discuss the main surface features produced by the landslide.

  9. Non-invasive assessment of vasospasm following aneurysmal SAH using C-arm FDCT parenchymal blood volume measurement in the neuro-interventional suite: Technical feasibility

    PubMed Central

    Downer, Jonathan; Corkill, Rufus; Byrne, James V

    2015-01-01

    Introduction: Cerebral vasospasm is the leading cause of morbidity and mortality in patients with aneurysmal subarachnoid haemorrhage (SAH) surviving the initial ictus. Commonly used techniques for vasospasm assessment are digital subtraction angiography and transcranial Doppler sonography. These techniques can reliably identify only the major vessel spasm and fail to estimate its haemodynamic significance. To overcome these issues and to enable comprehensive non-invasive assessment of vasospasm inside the interventional suite, a novel protocol involving measurement of parenchymal blood volume (PBV) using C-arm flat detector computed tomography (FDCT) was implemented. Materials and methods: Patients from the neuro-intensive treatment unit (ITU) with suspected vasospasm following aneurysmal SAH were scanned with a biplane C-arm angiography system using an intravenous contrast injection protocol. The PBV maps were generated using prototype software. Contemporaneous clinically indicated MR scan including the diffusion- and perfusion-weighted sequences was performed. C-arm PBV maps were compared against the MR perfusion maps. Results: Distribution of haemodynamic impairment on C-arm PBV maps closely matched the pattern of abnormality on MR perfusion maps. On visual comparison between the two techniques, the extent of abnormality indicated PBV to be both cerebral blood flow and cerebral blood volume weighted. Conclusion: C-arm FDCT PBV measurements allow an objective assessment of the severity and localisation of cerebral hypoperfusion resulting from vasospasm. The technique has proved feasible and useful in very sick patients after aneurysmal SAH. The promise shown in this early study indicates that it deserves further evaluation both for post-SAH vasospasm and in other relevant clinical settings. PMID:26017197

  10. Non-invasive assessment of vasospasm following aneurysmal SAH using C-arm FDCT parenchymal blood volume measurement in the neuro-interventional suite: Technical feasibility.

    PubMed

    Kamran, Mudassar; Downer, Jonathan; Corkill, Rufus; Byrne, James V

    2015-08-01

    Cerebral vasospasm is the leading cause of morbidity and mortality in patients with aneurysmal subarachnoid haemorrhage (SAH) surviving the initial ictus. Commonly used techniques for vasospasm assessment are digital subtraction angiography and transcranial Doppler sonography. These techniques can reliably identify only the major vessel spasm and fail to estimate its haemodynamic significance. To overcome these issues and to enable comprehensive non-invasive assessment of vasospasm inside the interventional suite, a novel protocol involving measurement of parenchymal blood volume (PBV) using C-arm flat detector computed tomography (FDCT) was implemented. Patients from the neuro-intensive treatment unit (ITU) with suspected vasospasm following aneurysmal SAH were scanned with a biplane C-arm angiography system using an intravenous contrast injection protocol. The PBV maps were generated using prototype software. Contemporaneous clinically indicated MR scan including the diffusion- and perfusion-weighted sequences was performed. C-arm PBV maps were compared against the MR perfusion maps. Distribution of haemodynamic impairment on C-arm PBV maps closely matched the pattern of abnormality on MR perfusion maps. On visual comparison between the two techniques, the extent of abnormality indicated PBV to be both cerebral blood flow and cerebral blood volume weighted. C-arm FDCT PBV measurements allow an objective assessment of the severity and localisation of cerebral hypoperfusion resulting from vasospasm. The technique has proved feasible and useful in very sick patients after aneurysmal SAH. The promise shown in this early study indicates that it deserves further evaluation both for post-SAH vasospasm and in other relevant clinical settings.

  11. The prediction and mapping of geoidal undulations from GEOS-3 altimetry. [gravity anomalies

    NASA Technical Reports Server (NTRS)

    Kearsley, W.

    1978-01-01

    From the adjusted altimeter data an approximation to the geoid height in ocean areas is obtained. Methods are developed to produce geoid maps in these areas. Geoid heights are obtained for grid points in the region to be mapped, and two of the parameters critical to the production of an accurate map are investigated. These are the spacing of the grid, which must be related to the half-wavelength of the altimeter signal whose amplitude is the desired accuracy of the contour; and the method adopted to predict the grid values. Least squares collocation was used to find geoid undulations on a 1 deg grid in the mapping area. Twenty maps, with their associated precisions, were produced and are included. These maps cover the Indian Ocean, Southwestern and Northeastern portions of the Pacific Ocean, and Southwest Atlantic and the U.S. Calibration Area.
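
    A minimal sketch of least-squares collocation prediction as used above: grid values are predicted from scattered geoid heights with s_hat = C_sx (C_xx + D)^-1 x. The Gaussian covariance model, correlation length and noise level are illustrative assumptions, not the values used in the study.

    ```python
    # Hedged sketch: least-squares collocation onto a regular grid.
    import numpy as np

    def gaussian_cov(d, variance=1.0, corr_len=1.0):
        """Simple isotropic covariance model C(d) = variance * exp(-(d/L)^2)."""
        return variance * np.exp(-(d / corr_len) ** 2)

    def collocate(obs_xy, obs_vals, grid_xy, noise_var=0.01, variance=1.0, corr_len=1.0):
        """Predict values at grid_xy from scattered observations (obs_xy, obs_vals)."""
        d_oo = np.linalg.norm(obs_xy[:, None] - obs_xy[None, :], axis=2)
        d_go = np.linalg.norm(grid_xy[:, None] - obs_xy[None, :], axis=2)
        c_xx = gaussian_cov(d_oo, variance, corr_len) + noise_var * np.eye(len(obs_xy))
        c_sx = gaussian_cov(d_go, variance, corr_len)
        return c_sx @ np.linalg.solve(c_xx, obs_vals)

    # Toy example: predict a 1-degree grid from 50 scattered "altimeter" heights
    rng = np.random.default_rng(1)
    obs_xy = rng.uniform(0, 10, size=(50, 2))                        # degrees
    obs_vals = np.sin(obs_xy[:, 0]) + 0.1 * rng.standard_normal(50)  # metres
    gx, gy = np.meshgrid(np.arange(0, 10), np.arange(0, 10))
    grid_xy = np.column_stack([gx.ravel(), gy.ravel()]).astype(float)
    print(collocate(obs_xy, obs_vals, grid_xy).reshape(10, 10)[0, :3])
    ```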

  12. a Conceptual Framework for Indoor Mapping by Using Grammars

    NASA Astrophysics Data System (ADS)

    Hu, X.; Fan, H.; Zipf, A.; Shang, J.; Gu, F.

    2017-09-01

    Maps are the foundation of indoor location-based services. Many automatic indoor mapping approaches have been proposed, but they rely heavily on sensor data, such as point clouds and users' location traces. To address this issue, this paper presents a conceptual framework to represent the layout principles of research buildings by using grammars. This framework can benefit the indoor mapping process by improving the accuracy of generated maps and by dramatically reducing the volume of sensor data required by traditional reconstruction approaches. In addition, we present details of several core modules of the framework. An example using the proposed framework is given to show the generation process of a semantic map. This framework is part of ongoing research on the development of an approach for reconstructing semantic maps.

  13. Map of assessed tight-gas resources in the United States

    USGS Publications Warehouse

    Biewick, Laura R. H.

    2014-01-01

    This report presents a digital map of tight-gas resource assessments in the United States as part of the U.S. Geological Survey’s (USGS) National Assessment of Oil and Gas Project. Using a geology-based assessment methodology, the USGS quantitatively estimated potential volumes of undiscovered, technically recoverable natural gas resources within tight-gas assessment units (AUs). This is the second digital map product in a series of USGS unconventional oil and gas resource maps. The map plate included in this report can be printed in hard-copy form or downloaded in a Geographic Information System (GIS) data package, including an ArcGIS ArcMap document (.mxd), geodatabase (.gdb), and published map file (.pmf). In addition, the publication access table contains hyperlinks to current USGS tight-gas assessment publications and web pages.

  14. Map of assessed coalbed-gas resources in the United States, 2014

    USGS Publications Warehouse

    Biewick, Laura R. H.

    2014-01-01

    This report presents a digital map of coalbed-gas resource assessments in the United States as part of the U.S. Geological Survey’s (USGS) National Assessment of Oil and Gas Project. Using a geology-based assessment methodology, the USGS quantitatively estimated potential volumes of undiscovered, technically recoverable natural gas resources within coalbed-gas assessment units (AUs). This is the third digital map product in a series of USGS unconventional oil and gas resource maps. The map plate included in this report can be printed in hardcopy form or downloaded in a Geographic Information System (GIS) data package, including an ArcGIS ArcMap document (.mxd), geodatabase (.gdb), and published map file (.pmf). In addition, the publication access table contains hyperlinks to current USGS coalbed-gas assessment publications and web pages.

  15. The eruption in Holuhraun, NE Iceland 2014-2015: Real-time monitoring and influence of landscape on lava flow

    NASA Astrophysics Data System (ADS)

    Jónsdóttir, Ingibjörg; Höskuldsson, Ármann; Thordarson, Thor; Bartolini, Stefania; Becerril, Laura; Marti Molist, Joan; Þorvaldsson, Skúli; Björnsson, Daði; Höskuldsson, Friðrik

    2016-04-01

    The largest eruption in Iceland since the Laki 1783-84 event began in Holuhraun, NE Iceland, on 31 August 2014, producing a lava flow field which, by the end of the eruption on 27 February 2015, covered 84.5 km2 with a volume of 1.44 km3. Throughout the event, various satellite images (NOAA AVHRR, MODIS, SUOMI NPP VIIRS, ASTER, LANDSAT 7 & 8, EO-1 ALI & HYPERION, RADARSAT-2, SENTINEL-1, COSMO SKYMED, TERRASAR-X) were analysed to monitor the development of activity, identify active flow fronts and channels, and map the lava extent in close collaboration with the on-site field group. Aerial photographs and radar images from the Icelandic Coast Guard Dash 8 aircraft supported this effort. By the end of 2015, Loftmyndir ehf had produced a detailed 3D model of the lava using aerial photographs from 2013 and 2015. The importance of carrying out real-time monitoring of a volcanic eruption lies in: i) locating sites of elevated temperature that may register new areas of activity within the lava or the opening of vents or fissures; ii) establishing and verifying the timing of events at the vents and within the lava; iii) identifying potential volcanic hazards caused by lava movements, eruption-induced flash flooding, tephra fallout or gas pollution; iv) providing up-to-date regional information to field groups concerning safety, as well as locating sites for sampling lava, tephra and polluted water; and v) producing quantitative information on magma discharge and lava flow advance, mapping the lava extent, and documenting the flow morphology and plume/tephra dispersal. During the eruption, these efforts supported mapping of the lava extent every 3-4 days on average, underpinning the time series of magma discharge calculations. Digital elevation models from before and after the event, combined with the real-time data series, support detailed analysis of how landscape affects lava flow in flat terrain (<0.4°), and provide important input to further developing lava flow models within the EU VETOOLS project, which aims to improve response to future events. Monitoring of the site was carried out throughout 2015, including the cooling of the lava in relation to its thickness and inflation history. This also included mapping the hydrology of the Dyngjujökull outwash plain and the development of ponds where the lava blocked previous river channels.
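
    A minimal sketch of the mean-discharge bookkeeping described above: bulk lava volumes are estimated from successive mapped extents and mean thicknesses, and the time-averaged discharge is the volume change over the interval. The numbers below are illustrative, not the published Holuhraun series.

    ```python
    # Hedged sketch: time-averaged magma discharge from successive lava mappings.
    import numpy as np

    def mean_discharge(area_km2, mean_thickness_m, days_since_start):
        """Return time-averaged discharge (m^3/s) between successive mappings."""
        volume_m3 = np.asarray(area_km2) * 1e6 * np.asarray(mean_thickness_m)
        dt_s = np.diff(np.asarray(days_since_start, dtype=float)) * 86400.0
        return np.diff(volume_m3) / dt_s

    # Toy series: three mappings a few days apart
    area = [10.0, 25.0, 37.0]          # km^2
    thickness = [8.0, 10.0, 11.0]      # m
    days = [3, 7, 11]
    print(mean_discharge(area, thickness, days))   # m^3/s per interval
    ```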

  16. Mapping the MRI voxel volume in which thermal noise matches physiological noise--implications for fMRI.

    PubMed

    Bodurka, J; Ye, F; Petridou, N; Murphy, K; Bandettini, P A

    2007-01-15

    This work addresses the choice of the imaging voxel volume in blood oxygen level dependent (BOLD) functional magnetic resonance imaging (fMRI). Noise of physiological origin present in the voxel time course is a prohibitive factor in the detection of small activation-induced BOLD signal changes. If the physiological noise contribution dominates the temporal fluctuations in the imaging voxel, further increases in the voxel signal-to-noise ratio (SNR) will yield diminishing increases in the temporal signal-to-noise ratio (TSNR), and therefore diminishing gains in the ability to detect activation-induced signal changes. On the other hand, if thermal and system noise dominate (suggesting a relatively low SNR), further decreases in SNR can prohibit detection of activation-induced signal changes. Here we propose the "suggested" voxel volume for fMRI: the volume at which the thermal-plus-system noise variance and the physiological noise variance are equal. Based on this condition we have created maps of the fMRI suggested voxel volume from our experimental data at 3T, since this value varies spatially depending on the contribution of physiologic noise in each voxel. Based on our fast EPI segmentation technique we have found that for the gray matter (GM), white matter (WM), and cerebrospinal fluid (CSF) brain compartments the mean suggested cubical voxel volumes are (1.8 mm)3, (2.1 mm)3 and (1.4 mm)3, respectively. Serendipitously, the (1.8 mm)3 cubical voxel volume for GM approximately matches the cortical thickness, thus optimizing BOLD contrast by minimizing partial volume averaging. The introduced suggested fMRI voxel volume can be a useful parameter for the choice of imaging volume for functional studies.
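
    A minimal sketch of the trade-off, under the commonly used assumption that physiological noise scales with signal (sigma_p = lambda * S, with sigma_0 the thermal-plus-system noise); this is consistent with, but not necessarily identical to, the formulation used in the paper.

    ```latex
    % Sketch under the assumption sigma_p = lambda * S.
    \[
      \mathrm{TSNR} \;=\; \frac{S}{\sqrt{\sigma_0^{2} + \sigma_p^{2}}}
      \;=\; \frac{\mathrm{SNR}}{\sqrt{1 + \lambda^{2}\,\mathrm{SNR}^{2}}},
      \qquad \mathrm{SNR} = \frac{S}{\sigma_0}.
    \]
    \[
      \sigma_0 = \sigma_p \;\Longrightarrow\; \mathrm{SNR} = \frac{1}{\lambda},
      \qquad \mathrm{TSNR} = \frac{1}{\lambda\sqrt{2}}.
    \]
    ```

    Since SNR grows roughly in proportion to voxel volume, the equality condition picks out the voxel volume at which SNR falls to about 1/lambda, which is the sense in which a "suggested" volume can be mapped voxel by voxel.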

  17. Comparison of spatial association approaches for landscape mapping of soil organic carbon stocks

    NASA Astrophysics Data System (ADS)

    Miller, B. A.; Koszinski, S.; Wehrhan, M.; Sommer, M.

    2015-03-01

    The distribution of soil organic carbon (SOC) can be variable at small analysis scales, but consideration of its role in regional and global issues demands the mapping of large extents. There are many different strategies for mapping SOC, among which are to model the variables needed to calculate the SOC stock indirectly or to model the SOC stock directly. The purpose of this research is to compare direct and indirect approaches to mapping SOC stocks from rule-based, multiple linear regression models applied at the landscape scale via spatial association. The final products for both strategies are high-resolution maps of SOC stocks (kg m-2), covering an area of 122 km2, with accompanying maps of estimated error. For the direct modelling approach, the estimated error map was based on the internal error estimations from the model rules. For the indirect approach, the estimated error map was produced by spatially combining the error estimates of component models via standard error propagation equations. We compared these two strategies for mapping SOC stocks on the basis of the qualities of the resulting maps as well as the magnitude and distribution of the estimated error. The direct approach produced a map with less spatial variation than the map produced by the indirect approach. The increased spatial variation represented by the indirect approach improved R2 values for the topsoil and subsoil stocks. Although the indirect approach had a lower mean estimated error for the topsoil stock, the mean estimated error for the total SOC stock (topsoil + subsoil) was lower for the direct approach. For these reasons, we recommend the direct approach to modelling SOC stocks be considered a more conservative estimate of the SOC stocks' spatial distribution.
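
    A minimal sketch of the indirect-approach error propagation mentioned above: the SOC stock is computed from component maps and the standard first-order propagation formula combines their estimated errors. The multiplicative stock formula, variable names and error-free thickness are illustrative assumptions about the component models.

    ```python
    # Hedged sketch: SOC stock from component maps with propagated standard error.
    import numpy as np

    def soc_stock_with_error(conc, conc_se, bulk_density, bd_se, thickness_m):
        """SOC stock (kg m^-2) and propagated standard error, per pixel.

        conc          : SOC concentration (kg kg^-1) map
        bulk_density  : soil bulk density (kg m^-3) map
        thickness_m   : layer thickness (m), assumed error-free here
        *_se          : standard errors of the corresponding maps
        """
        stock = conc * bulk_density * thickness_m
        # relative errors add in quadrature for a product of independent terms
        rel_err = np.sqrt((conc_se / conc) ** 2 + (bd_se / bulk_density) ** 2)
        return stock, stock * rel_err

    # Toy 2x2 maps
    conc = np.array([[0.020, 0.015], [0.030, 0.010]])
    conc_se = 0.1 * conc
    bd = np.full((2, 2), 1400.0)
    bd_se = np.full((2, 2), 70.0)
    stock, stock_se = soc_stock_with_error(conc, conc_se, bd, bd_se, 0.3)
    print(stock, stock_se)
    ```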

  18. Comparison of spatial association approaches for landscape mapping of soil organic carbon stocks

    NASA Astrophysics Data System (ADS)

    Miller, B. A.; Koszinski, S.; Wehrhan, M.; Sommer, M.

    2014-11-01

    The distribution of soil organic carbon (SOC) can be variable at small analysis scales, but consideration of its role in regional and global issues demands the mapping of large extents. There are many different strategies for mapping SOC, among which are to model the variables needed to calculate the SOC stock indirectly or to model the SOC stock directly. The purpose of this research is to compare direct and indirect approaches to mapping SOC stocks from rule-based, multiple linear regression models applied at the landscape scale via spatial association. The final products for both strategies are high-resolution maps of SOC stocks (kg m-2), covering an area of 122 km2, with accompanying maps of estimated error. For the direct modelling approach, the estimated error map was based on the internal error estimations from the model rules. For the indirect approach, the estimated error map was produced by spatially combining the error estimates of component models via standard error propagation equations. We compared these two strategies for mapping SOC stocks on the basis of the qualities of the resulting maps as well as the magnitude and distribution of the estimated error. The direct approach produced a map with less spatial variation than the map produced by the indirect approach. The increased spatial variation represented by the indirect approach improved R2 values for the topsoil and subsoil stocks. Although the indirect approach had a lower mean estimated error for the topsoil stock, the mean estimated error for the total SOC stock (topsoil + subsoil) was lower for the direct approach. For these reasons, we recommend the direct approach to modelling SOC stocks be considered a more conservative estimate of the SOC stocks' spatial distribution.

  19. Global forest cover mapping for the United Nations Food and Agriculture Organization forest resources assessment 2000 program

    USGS Publications Warehouse

    Zhu, Z.; Waller, E.

    2003-01-01

    Many countries periodically produce national reports on the status and changes of forest resources, using statistical surveys and spatial mapping of remotely sensed data. At the global level, the Food and Agriculture Organization (FAO) of the United Nations has conducted a Forest Resources Assessment (FRA) program every 10 yr since 1980, producing statistics and analysis that give a global synopsis of forest resources in the world. For the year 2000 of the FRA program (FRA2000), a global forest cover map was produced to provide spatial context to the extensive survey. The forest cover map, produced at the U.S. Geological Survey (USGS) EROS Data Center (EDC), has five classes: closed forest, open or fragmented forest, other wooded land, other land cover, and water. The first two forested classes at the global scale were delineated using combinations of temporal compositing, modified mixture analysis, geographic stratification, and other classification techniques. The remaining three FAO classes were derived primarily from the USGS global land cover characteristics database (Loveland et al. 1999). Validated on the basis of existing reference data sets, the map is estimated to be 77% accurate for the first four classes (no reference data were available for water), and 86% accurate for the forest and nonforest classification. The final map will be published as an insert to the FAO FRA2000 report.
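
    For readers unfamiliar with how such map accuracies are computed, the short Python sketch below derives an overall accuracy and a collapsed forest/non-forest accuracy from a confusion matrix; the counts are synthetic placeholders, not the FRA2000 validation data.

        import numpy as np

        # Synthetic confusion matrix (rows = reference class, columns = mapped class).
        # Classes: closed forest, open/fragmented forest, other wooded land, other land cover.
        confusion = np.array([
            [120,  15,   5,  10],
            [ 18,  95,  12,   8],
            [  6,  14,  70,  20],
            [  9,   7,  16, 140],
        ])

        overall_accuracy = np.trace(confusion) / confusion.sum()
        print(f"four-class accuracy: {overall_accuracy:.1%}")

        # Collapse to forest (first two classes) versus non-forest (last two classes).
        binary_correct = confusion[:2, :2].sum() + confusion[2:, 2:].sum()
        print(f"forest / non-forest accuracy: {binary_correct / confusion.sum():.1%}")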

  20. Flexible, reconfigurable, power efficient transmitter and method

    NASA Technical Reports Server (NTRS)

    Bishop, James W. (Inventor); Zaki, Nazrul H. Mohd (Inventor); Newman, David Childress (Inventor); Bundick, Steven N. (Inventor)

    2011-01-01

    A flexible, reconfigurable, power efficient transmitter device and method is provided. In one embodiment, the method includes receiving outbound data and determining a mode of operation. When operating in a first mode, the method may include modulation mapping the outbound data according to a modulation scheme to provide first modulation mapped digital data, converting the first modulation mapped digital data to an analog signal that comprises an intermediate frequency (IF) analog signal, upconverting the IF analog signal to produce a first modulated radio frequency (RF) signal based on a local oscillator signal, amplifying the first modulated RF signal to produce a first RF output signal, and outputting the first RF output signal via an isolator. In a second mode of operation, the method may include modulation mapping the outbound data according to a modulation scheme to provide second modulation mapped digital data, converting the second modulation mapped digital data to a first digital baseband signal, conditioning the first digital baseband signal to provide a first analog baseband signal, modulating one or more carriers with the first analog baseband signal to produce a second modulated RF signal based on a local oscillator signal, amplifying the second modulated RF signal to produce a second RF output signal, and outputting the second RF output signal via the isolator. The digital baseband signal may comprise an in-phase (I) digital baseband signal and a quadrature (Q) baseband signal.
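
    The "modulation mapping" step in both modes can be pictured with a generic Gray-coded QPSK mapper, as in the Python sketch below; this is an illustrative stand-in, not the patented transmitter's actual scheme.

        import numpy as np

        # Generic QPSK modulation mapping: pairs of outbound bits -> complex
        # baseband symbols whose real part is the in-phase (I) component and
        # imaginary part is the quadrature (Q) component.
        QPSK_MAP = {
            (0, 0): ( 1 + 1j) / np.sqrt(2),
            (0, 1): (-1 + 1j) / np.sqrt(2),
            (1, 1): (-1 - 1j) / np.sqrt(2),
            (1, 0): ( 1 - 1j) / np.sqrt(2),
        }

        def modulation_map(bits):
            assert len(bits) % 2 == 0, "QPSK maps two bits per symbol"
            return np.array([QPSK_MAP[pair] for pair in zip(bits[0::2], bits[1::2])])

        symbols = modulation_map([0, 0, 1, 0, 1, 1, 0, 1])
        print("I:", symbols.real, "Q:", symbols.imag)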

  1. Development of the geoCamera, a System for Mapping Ice from a Ship

    NASA Astrophysics Data System (ADS)

    Arsenault, R.; Clemente-Colon, P.

    2012-12-01

    The geoCamera produces maps of the ice surrounding an ice-capable ship by combining images from one or more digital cameras with the ship's position and attitude data. Maps are produced along the ship's path, with the achievable width and resolution depending on camera mounting height as well as camera resolution and lens parameters. Our system has produced maps up to 2000 m wide at 1 m resolution. Once installed and calibrated, the system is designed to operate automatically, producing maps in near real-time and making them available to on-board users via existing information systems. The resulting small-scale maps complement existing satellite-based products as well as on-board observations. Development versions were temporarily deployed in Antarctica on the RV Nathaniel B. Palmer in 2010 and in the Arctic on the USCGC Healy in 2011. A permanent system was deployed during the summer of 2012 on the USCGC Healy. To make the system attractive to other ships of opportunity, design goals include using existing ship systems when practical, using low-cost commercial off-the-shelf components if additional hardware is necessary, automating the process to virtually eliminate adding to the workload of the ship's technicians, and making the software components modular and flexible enough to allow more seamless integration with a ship's particular IT system.
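
    A minimal sketch of the geometry involved, assuming a pinhole camera and a flat ice surface (the actual geoCamera processing is not documented here), is given below in Python: a pixel ray is rotated by the ship's attitude and intersected with the surface plane a known mounting height below the camera.

        import numpy as np

        # Assumed simplification: pinhole camera, flat ice/sea surface, attitude
        # given as yaw-pitch-roll. Returns the ground offset (metres) of a pixel
        # relative to the point directly below the camera.
        def pixel_to_ground(u, v, cx, cy, focal_px, cam_height_m, yaw, pitch, roll):
            ray_cam = np.array([(u - cx) / focal_px, (v - cy) / focal_px, 1.0])
            cy_, sy_ = np.cos(yaw), np.sin(yaw)
            cp, sp = np.cos(pitch), np.sin(pitch)
            cr, sr = np.cos(roll), np.sin(roll)
            Rz = np.array([[cy_, -sy_, 0.0], [sy_, cy_, 0.0], [0.0, 0.0, 1.0]])
            Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
            Rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
            ray = Rz @ Ry @ Rx @ ray_cam          # rotate ray into a local level frame
            scale = cam_height_m / ray[2]         # intersect with the surface plane
            return scale * ray[0], scale * ray[1]

        print(pixel_to_ground(u=1800, v=1400, cx=1024, cy=768, focal_px=1500.0,
                              cam_height_m=20.0, yaw=0.0, pitch=0.0, roll=0.05))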

  2. One-dimensional turbulence modeling for cylindrical and spherical flows: model formulation and application

    NASA Astrophysics Data System (ADS)

    Lignell, David O.; Lansinger, Victoria B.; Medina, Juan; Klein, Marten; Kerstein, Alan R.; Schmidt, Heiko; Fistler, Marco; Oevermann, Michael

    2018-06-01

    The one-dimensional turbulence (ODT) model resolves a full range of time and length scales and is computationally efficient. ODT has been applied to a wide range of complex multi-scale flows, such as turbulent combustion. Previous ODT comparisons to experimental data have focused mainly on planar flows. Applications to cylindrical flows, such as round jets, have been based on rough analogies, e.g., by exploiting the fortuitous consistency of the similarity scalings of temporally developing planar jets and spatially developing round jets. To obtain a more systematic treatment, a new formulation of the ODT model in cylindrical and spherical coordinates is presented here. The model is written in terms of a geometric factor so that planar, cylindrical, and spherical configurations are represented in the same way. Temporal and spatial versions of the model are presented. A Lagrangian finite-volume implementation is used with a dynamically adaptive mesh. The adaptive mesh facilitates the implementation of cylindrical and spherical versions of the triplet map, which is used to model turbulent advection (eddy events) in the one-dimensional flow coordinate. In cylindrical and spherical coordinates, geometric stretching of the three triplet map images occurs due to the radial dependence of volume, with the stretching being strongest near the centerline. Two triplet map variants, TMA and TMB, are presented. In TMA, the three map images have the same volume, but different radial segment lengths. In TMB, the three map images have the same radial segment lengths, but different segment volumes. Cylindrical results are presented for temporal pipe flow, a spatial nonreacting jet, and a spatial nonreacting jet flame. These results compare very well to direct numerical simulation for the pipe flow, and to experimental data for the jets. The nonreacting jet treatment overpredicts velocity fluctuations near the centerline, due to the geometric stretching of the triplet maps and its effect on the eddy event rate distribution. TMB performs better than TMA. A hybrid planar-TMB (PTMB) approach is also presented, which further improves the results. TMA, TMB, and PTMB are nearly identical in the pipe flow where the key dynamics occur near the wall away from the centerline. The jet flame illustrates effects of variable density and viscosity, including dilatational effects.
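
    The planar triplet map at the core of the eddy events has a compact discrete form; the Python sketch below applies it to a 1-D profile (three compressed copies of the profile, with the middle copy reversed). The cylindrical and spherical TMA and TMB variants described in the paper add the radial geometric stretching, which is not reproduced here.

        import numpy as np

        # Discrete planar triplet map: permute the cells of an eddy interval into
        # three compressed images of the original profile, reversing the middle one.
        def triplet_map(profile):
            n = len(profile)
            assert n % 3 == 0, "eddy interval must contain a multiple of 3 cells"
            first  = profile[0::3]          # cells 0, 3, 6, ...
            middle = profile[1::3][::-1]    # cells 1, 4, 7, ... in reverse order
            last   = profile[2::3]          # cells 2, 5, 8, ...
            return np.concatenate([first, middle, last])

        # A linear velocity profile becomes the characteristic zig-zag after one eddy.
        u = np.linspace(0.0, 1.0, 12)
        print(triplet_map(u))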

  3. Proton echo-planar spectroscopic imaging of J-coupled resonances in human brain at 3 and 4 Tesla.

    PubMed

    Posse, Stefan; Otazo, Ricardo; Caprihan, Arvind; Bustillo, Juan; Chen, Hongji; Henry, Pierre-Gilles; Marjanska, Malgorzata; Gasparovic, Charles; Zuo, Chun; Magnotta, Vincent; Mueller, Bryon; Mullins, Paul; Renshaw, Perry; Ugurbil, Kamil; Lim, Kelvin O; Alger, Jeffry R

    2007-08-01

    In this multicenter study, 2D spatial mapping of J-coupled resonances at 3T and 4T was performed using short-TE (15 ms) proton echo-planar spectroscopic imaging (PEPSI). Water-suppressed (WS) data were acquired in 8.5 min with 1-cm(3) spatial resolution from a supraventricular axial slice. Optimized outer volume suppression (OVS) enabled mapping in close proximity to peripheral scalp regions. Constrained spectral fitting in reference to a non-WS (NWS) scan was performed with LCModel using correction for relaxation attenuation and partial-volume effects. The concentrations of total choline (tCho), creatine + phosphocreatine (Cr+PCr), glutamate (Glu), glutamate + glutamine (Glu+Gln), myo-inositol (Ins), NAA, NAA+NAAG, and two macromolecular resonances at 0.9 and 2.0 ppm were mapped with mean Cramer-Rao lower bounds (CRLBs) between 6% and 18% and approximately 150-cm(3) sensitive volumes. Aspartate, GABA, glutamine (Gln), glutathione (GSH), phosphoethanolamine (PE), and macromolecules (MMs) at 1.2 ppm were also mapped, although with larger mean CRLBs between 30% and 44%. The CRLBs at 4T were 19% lower on average as compared to 3T, consistent with a higher signal-to-noise ratio (SNR) and increased spectral resolution. Metabolite concentrations were in the ranges reported in previous studies. Glu concentration was significantly higher in gray matter (GM) compared to white matter (WM), as anticipated. The short acquisition time makes this methodology suitable for clinical studies.

  4. A measurement of material in the ATLAS tracker using secondary hadronic interactions in 7 TeV pp collisions

    DOE PAGES

    Aaboud, M.; Aad, G.; Abbott, B.; ...

    2016-11-30

    Knowledge of the material in the ATLAS inner tracking detector is crucial in understanding the reconstruction of charged-particle tracks, the performance of algorithms that identify jets containing b-hadrons and is also essential to reduce background in searches for exotic particles that can decay within the inner detector volume. Interactions of primary hadrons produced in pp collisions with the material in the inner detector are used to map the location and amount of this material. The hadronic interactions of primary particles may result in secondary vertices, which in this analysis are reconstructed by an inclusive vertex-finding algorithm. Data were collected using minimum-bias triggers by the ATLAS detector operating at the LHC during 2010 at centre-of-mass energy √s = 7 TeV, and correspond to an integrated luminosity of 19 nb⁻¹. Kinematic properties of these secondary vertices are used to study the validity of the modelling of hadronic interactions in simulation. Finally, secondary-vertex yields are compared between data and simulation over a volume of about 0.7 m³ around the interaction point, and agreement is found within overall uncertainties.

  5. A measurement of material in the ATLAS tracker using secondary hadronic interactions in 7 TeV pp collisions

    NASA Astrophysics Data System (ADS)

    Aaboud, M.; Aad, G.; Abbott, B.; Abdallah, J.; Abdinov, O.; Abeloos, B.; Aben, R.; AbouZeid, O. S.; Abraham, N. L.; Abramowicz, H.; Abreu, H.; Abreu, R.; Abulaiti, Y.; Acharya, B. S.; Adamczyk, L.; Adams, D. L.; Adelman, J.; Adomeit, S.; Adye, T.; Affolder, A. A.; Agatonovic-Jovin, T.; Agricola, J.; Aguilar-Saavedra, J. A.; Ahlen, S. P.; Ahmadov, F.; Aielli, G.; Akerstedt, H.; Åkesson, T. P. A.; Akimov, A. V.; Alberghi, G. L.; Albert, J.; Albrand, S.; Alconada Verzini, M. J.; Aleksa, M.; Aleksandrov, I. N.; Alexa, C.; Alexander, G.; Alexopoulos, T.; Alhroob, M.; Ali, B.; Aliev, M.; Alimonti, G.; Alison, J.; Alkire, S. P.; Allbrooke, B. M. M.; Allen, B. W.; Allport, P. P.; Aloisio, A.; Alonso, A.; Alonso, F.; Alpigiani, C.; Alstaty, M.; Alvarez Gonzalez, B.; Álvarez Piqueras, D.; Alviggi, M. G.; Amadio, B. T.; Amako, K.; Amaral Coutinho, Y.; Amelung, C.; Amidei, D.; Amor Dos Santos, S. P.; Amorim, A.; Amoroso, S.; Amundsen, G.; Anastopoulos, C.; Ancu, L. S.; Andari, N.; Andeen, T.; Anders, C. F.; Anders, G.; Anders, J. K.; Anderson, K. J.; Andreazza, A.; Andrei, V.; Angelidakis, S.; Angelozzi, I.; Anger, P.; Angerami, A.; Anghinolfi, F.; Anisenkov, A. V.; Anjos, N.; Annovi, A.; Antel, C.; Antonelli, M.; Antonov, A.; Anulli, F.; Aoki, M.; Aperio Bella, L.; Arabidze, G.; Arai, Y.; Araque, J. P.; Arce, A. T. H.; Arduh, F. A.; Arguin, J.-F.; Argyropoulos, S.; Arik, M.; Armbruster, A. J.; Armitage, L. J.; Arnaez, O.; Arnold, H.; Arratia, M.; Arslan, O.; Artamonov, A.; Artoni, G.; Artz, S.; Asai, S.; Asbah, N.; Ashkenazi, A.; Åsman, B.; Asquith, L.; Assamagan, K.; Astalos, R.; Atkinson, M.; Atlay, N. B.; Augsten, K.; Avolio, G.; Axen, B.; Ayoub, M. K.; Azuelos, G.; Baak, M. A.; Baas, A. E.; Baca, M. J.; Bachacou, H.; Bachas, K.; Backes, M.; Backhaus, M.; Bagiacchi, P.; Bagnaia, P.; Bai, Y.; Baines, J. T.; Baker, O. K.; Baldin, E. M.; Balek, P.; Balestri, T.; Balli, F.; Balunas, W. K.; Banas, E.; Banerjee, Sw.; Bannoura, A. A. E.; Barak, L.; Barberio, E. L.; Barberis, D.; Barbero, M.; Barillari, T.; Barisits, M.-S.; Barklow, T.; Barlow, N.; Barnes, S. L.; Barnett, B. M.; Barnett, R. M.; Barnovska, Z.; Baroncelli, A.; Barone, G.; Barr, A. J.; Barranco Navarro, L.; Barreiro, F.; Barreiro Guimarães da Costa, J.; Bartoldus, R.; Barton, A. E.; Bartos, P.; Basalaev, A.; Bassalat, A.; Bates, R. L.; Batista, S. J.; Batley, J. R.; Battaglia, M.; Bauce, M.; Bauer, F.; Bawa, H. S.; Beacham, J. B.; Beattie, M. D.; Beau, T.; Beauchemin, P. H.; Bechtle, P.; Beck, H. P.; Becker, K.; Becker, M.; Beckingham, M.; Becot, C.; Beddall, A. J.; Beddall, A.; Bednyakov, V. A.; Bedognetti, M.; Bee, C. P.; Beemster, L. J.; Beermann, T. A.; Begel, M.; Behr, J. K.; Belanger-Champagne, C.; Bell, A. S.; Bella, G.; Bellagamba, L.; Bellerive, A.; Bellomo, M.; Belotskiy, K.; Beltramello, O.; Belyaev, N. L.; Benary, O.; Benchekroun, D.; Bender, M.; Bendtz, K.; Benekos, N.; Benhammou, Y.; Benhar Noccioli, E.; Benitez, J.; Benjamin, D. P.; Bensinger, J. R.; Bentvelsen, S.; Beresford, L.; Beretta, M.; Berge, D.; Bergeaas Kuutmann, E.; Berger, N.; Beringer, J.; Berlendis, S.; Bernard, N. R.; Bernius, C.; Bernlochner, F. U.; Berry, T.; Berta, P.; Bertella, C.; Bertoli, G.; Bertolucci, F.; Bertram, I. A.; Bertsche, C.; Bertsche, D.; Besjes, G. J.; Bessidskaia Bylund, O.; Bessner, M.; Besson, N.; Betancourt, C.; Bethani, A.; Bethke, S.; Bevan, A. J.; Bianchi, R. M.; Bianchini, L.; Bianco, M.; Biebel, O.; Biedermann, D.; Bielski, R.; Biesuz, N. V.; Biglietti, M.; Bilbao De Mendizabal, J.; Billoud, T. R. 
V.; Bilokon, H.; Bindi, M.; Binet, S.; Bingul, A.; Bini, C.; Biondi, S.; Bisanz, T.; Bjergaard, D. M.; Black, C. W.; Black, J. E.; Black, K. M.; Blackburn, D.; Blair, R. E.; Blanchard, J.-B.; Blazek, T.; Bloch, I.; Blocker, C.; Blum, W.; Blumenschein, U.; Blunier, S.; Bobbink, G. J.; Bobrovnikov, V. S.; Bocchetta, S. S.; Bocci, A.; Bock, C.; Boehler, M.; Boerner, D.; Bogaerts, J. A.; Bogavac, D.; Bogdanchikov, A. G.; Bohm, C.; Boisvert, V.; Bokan, P.; Bold, T.; Boldyrev, A. S.; Bomben, M.; Bona, M.; Boonekamp, M.; Borisov, A.; Borissov, G.; Bortfeldt, J.; Bortoletto, D.; Bortolotto, V.; Bos, K.; Boscherini, D.; Bosman, M.; Bossio Sola, J. D.; Boudreau, J.; Bouffard, J.; Bouhova-Thacker, E. V.; Boumediene, D.; Bourdarios, C.; Boutle, S. K.; Boveia, A.; Boyd, J.; Boyko, I. R.; Bracinik, J.; Brandt, A.; Brandt, G.; Brandt, O.; Bratzler, U.; Brau, B.; Brau, J. E.; Braun, H. M.; Breaden Madden, W. D.; Brendlinger, K.; Brennan, A. J.; Brenner, L.; Brenner, R.; Bressler, S.; Bristow, T. M.; Britton, D.; Britzger, D.; Brochu, F. M.; Brock, I.; Brock, R.; Brooijmans, G.; Brooks, T.; Brooks, W. K.; Brosamer, J.; Brost, E.; Broughton, J. H.; Bruckman de Renstrom, P. A.; Bruncko, D.; Bruneliere, R.; Bruni, A.; Bruni, G.; Bruni, L. S.; Brunt, BH; Bruschi, M.; Bruscino, N.; Bryant, P.; Bryngemark, L.; Buanes, T.; Buat, Q.; Buchholz, P.; Buckley, A. G.; Budagov, I. A.; Buehrer, F.; Bugge, M. K.; Bulekov, O.; Bullock, D.; Burckhart, H.; Burdin, S.; Burgard, C. D.; Burghgrave, B.; Burka, K.; Burke, S.; Burmeister, I.; Burr, J. T. P.; Busato, E.; Büscher, D.; Büscher, V.; Bussey, P.; Butler, J. M.; Buttar, C. M.; Butterworth, J. M.; Butti, P.; Buttinger, W.; Buzatu, A.; Buzykaev, A. R.; Cabrera Urbán, S.; Caforio, D.; Cairo, V. M.; Cakir, O.; Calace, N.; Calafiura, P.; Calandri, A.; Calderini, G.; Calfayan, P.; Callea, G.; Caloba, L. P.; Calvente Lopez, S.; Calvet, D.; Calvet, S.; Calvet, T. P.; Camacho Toro, R.; Camarda, S.; Camarri, P.; Cameron, D.; Caminal Armadans, R.; Camincher, C.; Campana, S.; Campanelli, M.; Camplani, A.; Campoverde, A.; Canale, V.; Canepa, A.; Cano Bret, M.; Cantero, J.; Cantrill, R.; Cao, T.; Capeans Garrido, M. D. M.; Caprini, I.; Caprini, M.; Capua, M.; Caputo, R.; Carbone, R. M.; Cardarelli, R.; Cardillo, F.; Carli, I.; Carli, T.; Carlino, G.; Carminati, L.; Caron, S.; Carquin, E.; Carrillo-Montoya, G. D.; Carter, J. R.; Carvalho, J.; Casadei, D.; Casado, M. P.; Casolino, M.; Casper, D. W.; Castaneda-Miranda, E.; Castelijn, R.; Castelli, A.; Castillo Gimenez, V.; Castro, N. F.; Catinaccio, A.; Catmore, J. R.; Cattai, A.; Caudron, J.; Cavaliere, V.; Cavallaro, E.; Cavalli, D.; Cavalli-Sforza, M.; Cavasinni, V.; Ceradini, F.; Cerda Alberich, L.; Cerio, B. C.; Cerqueira, A. S.; Cerri, A.; Cerrito, L.; Cerutti, F.; Cerv, M.; Cervelli, A.; Cetin, S. A.; Chafaq, A.; Chakraborty, D.; Chan, S. K.; Chan, Y. L.; Chang, P.; Chapman, J. D.; Charlton, D. G.; Chatterjee, A.; Chau, C. C.; Chavez Barajas, C. A.; Che, S.; Cheatham, S.; Chegwidden, A.; Chekanov, S.; Chekulaev, S. V.; Chelkov, G. A.; Chelstowska, M. A.; Chen, C.; Chen, H.; Chen, K.; Chen, S.; Chen, S.; Chen, X.; Chen, Y.; Cheng, H. C.; Cheng, H. J.; Cheng, Y.; Cheplakov, A.; Cheremushkina, E.; Cherkaoui El Moursli, R.; Chernyatin, V.; Cheu, E.; Chevalier, L.; Chiarella, V.; Chiarelli, G.; Chiodini, G.; Chisholm, A. S.; Chitan, A.; Chizhov, M. V.; Choi, K.; Chomont, A. R.; Chouridou, S.; Chow, B. K. B.; Christodoulou, V.; Chromek-Burckhart, D.; Chudoba, J.; Chuinard, A. J.; Chwastowski, J. J.; Chytka, L.; Ciapetti, G.; Ciftci, A. 
K.; Cinca, D.; Cindro, V.; Cioara, I. A.; Ciocca, C.; Ciocio, A.; Cirotto, F.; Citron, Z. H.; Citterio, M.; Ciubancan, M.; Clark, A.; Clark, B. L.; Clark, M. R.; Clark, P. J.; Clarke, R. N.; Clement, C.; Coadou, Y.; Cobal, M.; Coccaro, A.; Cochran, J.; Colasurdo, L.; Cole, B.; Colijn, A. P.; Collot, J.; Colombo, T.; Compostella, G.; Conde Muiño, P.; Coniavitis, E.; Connell, S. H.; Connelly, I. A.; Consorti, V.; Constantinescu, S.; Conti, G.; Conventi, F.; Cooke, M.; Cooper, B. D.; Cooper-Sarkar, A. M.; Cormier, K. J. R.; Cornelissen, T.; Corradi, M.; Corriveau, F.; Corso-Radu, A.; Cortes-Gonzalez, A.; Cortiana, G.; Costa, G.; Costa, M. J.; Costanzo, D.; Cottin, G.; Cowan, G.; Cox, B. E.; Cranmer, K.; Crawley, S. J.; Cree, G.; Crépé-Renaudin, S.; Crescioli, F.; Cribbs, W. A.; Crispin Ortuzar, M.; Cristinziani, M.; Croft, V.; Crosetti, G.; Cueto, A.; Cuhadar Donszelmann, T.; Cummings, J.; Curatolo, M.; Cúth, J.; Czirr, H.; Czodrowski, P.; D'amen, G.; D'Auria, S.; D'Onofrio, M.; Da Cunha Sargedas De Sousa, M. J.; Da Via, C.; Dabrowski, W.; Dado, T.; Dai, T.; Dale, O.; Dallaire, F.; Dallapiccola, C.; Dam, M.; Dandoy, J. R.; Dang, N. P.; Daniells, A. C.; Dann, N. S.; Danninger, M.; Dano Hoffmann, M.; Dao, V.; Darbo, G.; Darmora, S.; Dassoulas, J.; Dattagupta, A.; Davey, W.; David, C.; Davidek, T.; Davies, M.; Davison, P.; Dawe, E.; Dawson, I.; Daya-Ishmukhametova, R. K.; De, K.; de Asmundis, R.; De Benedetti, A.; De Castro, S.; De Cecco, S.; De Groot, N.; de Jong, P.; De la Torre, H.; De Lorenzi, F.; De Maria, A.; De Pedis, D.; De Salvo, A.; De Sanctis, U.; De Santo, A.; De Vivie De Regie, J. B.; Dearnaley, W. J.; Debbe, R.; Debenedetti, C.; Dedovich, D. V.; Dehghanian, N.; Deigaard, I.; Del Gaudio, M.; Del Peso, J.; Del Prete, T.; Delgove, D.; Deliot, F.; Delitzsch, C. M.; Dell'Acqua, A.; Dell'Asta, L.; Dell'Orso, M.; Della Pietra, M.; della Volpe, D.; Delmastro, M.; Delsart, P. A.; DeMarco, D. A.; Demers, S.; Demichev, M.; Demilly, A.; Denisov, S. P.; Denysiuk, D.; Derendarz, D.; Derkaoui, J. E.; Derue, F.; Dervan, P.; Desch, K.; Deterre, C.; Dette, K.; Deviveiros, P. O.; Dewhurst, A.; Dhaliwal, S.; Di Ciaccio, A.; Di Ciaccio, L.; Di Clemente, W. K.; Di Donato, C.; Di Girolamo, A.; Di Girolamo, B.; Di Micco, B.; Di Nardo, R.; Di Simone, A.; Di Sipio, R.; Di Valentino, D.; Diaconu, C.; Diamond, M.; Dias, F. A.; Diaz, M. A.; Diehl, E. B.; Dietrich, J.; Diglio, S.; Dimitrievska, A.; Dingfelder, J.; Dita, P.; Dita, S.; Dittus, F.; Djama, F.; Djobava, T.; Djuvsland, J. I.; do Vale, M. A. B.; Dobos, D.; Dobre, M.; Doglioni, C.; Dolejsi, J.; Dolezal, Z.; Donadelli, M.; Donati, S.; Dondero, P.; Donini, J.; Dopke, J.; Doria, A.; Dova, M. T.; Doyle, A. T.; Drechsler, E.; Dris, M.; Du, Y.; Duarte-Campderros, J.; Duchovni, E.; Duckeck, G.; Ducu, O. A.; Duda, D.; Dudarev, A.; Dudder, A. Chr.; Duffield, E. M.; Duflot, L.; Dührssen, M.; Dumancic, M.; Dunford, M.; Duran Yildiz, H.; Düren, M.; Durglishvili, A.; Duschinger, D.; Dutta, B.; Dyndal, M.; Eckardt, C.; Ecker, K. M.; Edgar, R. C.; Edwards, N. C.; Eifert, T.; Eigen, G.; Einsweiler, K.; Ekelof, T.; El Kacimi, M.; Ellajosyula, V.; Ellert, M.; Elles, S.; Ellinghaus, F.; Elliot, A. A.; Ellis, N.; Elmsheuser, J.; Elsing, M.; Emeliyanov, D.; Enari, Y.; Endner, O. C.; Ennis, J. S.; Erdmann, J.; Ereditato, A.; Ernis, G.; Ernst, J.; Ernst, M.; Errede, S.; Ertel, E.; Escalier, M.; Esch, H.; Escobar, C.; Esposito, B.; Etienvre, A. I.; Etzion, E.; Evans, H.; Ezhilov, A.; Fabbri, F.; Fabbri, L.; Facini, G.; Fakhrutdinov, R. M.; Falciano, S.; Falla, R. 
J.; Faltova, J.; Fang, Y.; Fanti, M.; Farbin, A.; Farilla, A.; Farina, C.; Farina, E. M.; Farooque, T.; Farrell, S.; Farrington, S. M.; Farthouat, P.; Fassi, F.; Fassnacht, P.; Fassouliotis, D.; Faucci Giannelli, M.; Favareto, A.; Fawcett, W. J.; Fayard, L.; Fedin, O. L.; Fedorko, W.; Feigl, S.; Feligioni, L.; Feng, C.; Feng, E. J.; Feng, H.; Fenyuk, A. B.; Feremenga, L.; Fernandez Martinez, P.; Fernandez Perez, S.; Ferrando, J.; Ferrari, A.; Ferrari, P.; Ferrari, R.; Ferreira de Lima, D. E.; Ferrer, A.; Ferrere, D.; Ferretti, C.; Ferretto Parodi, A.; Fiedler, F.; Filipčič, A.; Filipuzzi, M.; Filthaut, F.; Fincke-Keeler, M.; Finelli, K. D.; Fiolhais, M. C. N.; Fiorini, L.; Firan, A.; Fischer, A.; Fischer, C.; Fischer, J.; Fisher, W. C.; Flaschel, N.; Fleck, I.; Fleischmann, P.; Fletcher, G. T.; Fletcher, R. R. M.; Flick, T.; Floderus, A.; Flores Castillo, L. R.; Flowerdew, M. J.; Forcolin, G. T.; Formica, A.; Forti, A.; Foster, A. G.; Fournier, D.; Fox, H.; Fracchia, S.; Francavilla, P.; Franchini, M.; Francis, D.; Franconi, L.; Franklin, M.; Frate, M.; Fraternali, M.; Freeborn, D.; Fressard-Batraneanu, S. M.; Friedrich, F.; Froidevaux, D.; Frost, J. A.; Fukunaga, C.; Fullana Torregrosa, E.; Fusayasu, T.; Fuster, J.; Gabaldon, C.; Gabizon, O.; Gabrielli, A.; Gabrielli, A.; Gach, G. P.; Gadatsch, S.; Gadomski, S.; Gagliardi, G.; Gagnon, L. G.; Gagnon, P.; Galea, C.; Galhardo, B.; Gallas, E. J.; Gallop, B. J.; Gallus, P.; Galster, G.; Gan, K. K.; Gao, J.; Gao, Y.; Gao, Y. S.; Garay Walls, F. M.; García, C.; García Navarro, J. E.; Garcia-Sciveres, M.; Gardner, R. W.; Garelli, N.; Garonne, V.; Gascon Bravo, A.; Gasnikova, K.; Gatti, C.; Gaudiello, A.; Gaudio, G.; Gauthier, L.; Gavrilenko, I. L.; Gay, C.; Gaycken, G.; Gazis, E. N.; Gecse, Z.; Gee, C. N. P.; Geich-Gimbel, Ch.; Geisen, M.; Geisler, M. P.; Gemme, C.; Genest, M. H.; Geng, C.; Gentile, S.; Gentsos, C.; George, S.; Gerbaudo, D.; Gershon, A.; Ghasemi, S.; Ghazlane, H.; Ghneimat, M.; Giacobbe, B.; Giagu, S.; Giannetti, P.; Gibbard, B.; Gibson, S. M.; Gignac, M.; Gilchriese, M.; Gillam, T. P. S.; Gillberg, D.; Gilles, G.; Gingrich, D. M.; Giokaris, N.; Giordani, M. P.; Giorgi, F. M.; Giorgi, F. M.; Giraud, P. F.; Giromini, P.; Giugni, D.; Giuli, F.; Giuliani, C.; Giulini, M.; Gjelsten, B. K.; Gkaitatzis, S.; Gkialas, I.; Gkougkousis, E. L.; Gladilin, L. K.; Glasman, C.; Glatzer, J.; Glaysher, P. C. F.; Glazov, A.; Goblirsch-Kolb, M.; Godlewski, J.; Goldfarb, S.; Golling, T.; Golubkov, D.; Gomes, A.; Gonçalo, R.; Goncalves Pinto Firmino Da Costa, J.; Gonella, G.; Gonella, L.; Gongadze, A.; González de la Hoz, S.; Gonzalez Parra, G.; Gonzalez-Sevilla, S.; Goossens, L.; Gorbounov, P. A.; Gordon, H. A.; Gorelov, I.; Gorini, B.; Gorini, E.; Gorišek, A.; Gornicki, E.; Goshaw, A. T.; Gössling, C.; Gostkin, M. I.; Goudet, C. R.; Goujdami, D.; Goussiou, A. G.; Govender, N.; Gozani, E.; Graber, L.; Grabowska-Bold, I.; Gradin, P. O. J.; Grafström, P.; Gramling, J.; Gramstad, E.; Grancagnolo, S.; Gratchev, V.; Gravila, P. M.; Gray, H. M.; Graziani, E.; Greenwood, Z. D.; Grefe, C.; Gregersen, K.; Gregor, I. M.; Grenier, P.; Grevtsov, K.; Griffiths, J.; Grillo, A. A.; Grimm, K.; Grinstein, S.; Gris, Ph.; Grivaz, J.-F.; Groh, S.; Grohs, J. P.; Gross, E.; Grosse-Knetter, J.; Grossi, G. C.; Grout, Z. J.; Guan, L.; Guan, W.; Guenther, J.; Guescini, F.; Guest, D.; Gueta, O.; Guido, E.; Guillemin, T.; Guindon, S.; Gul, U.; Gumpert, C.; Guo, J.; Guo, Y.; Gupta, R.; Gupta, S.; Gustavino, G.; Gutierrez, P.; Gutierrez Ortiz, N. 
G.; Gutschow, C.; Guyot, C.; Gwenlan, C.; Gwilliam, C. B.; Haas, A.; Haber, C.; Hadavand, H. K.; Haddad, N.; Hadef, A.; Hageböck, S.; Hajduk, Z.; Hakobyan, H.; Haleem, M.; Haley, J.; Halladjian, G.; Hallewell, G. D.; Hamacher, K.; Hamal, P.; Hamano, K.; Hamilton, A.; Hamity, G. N.; Hamnett, P. G.; Han, L.; Hanagaki, K.; Hanawa, K.; Hance, M.; Haney, B.; Hanisch, S.; Hanke, P.; Hanna, R.; Hansen, J. B.; Hansen, J. D.; Hansen, M. C.; Hansen, P. H.; Hara, K.; Hard, A. S.; Harenberg, T.; Hariri, F.; Harkusha, S.; Harrington, R. D.; Harrison, P. F.; Hartjes, F.; Hartmann, N. M.; Hasegawa, M.; Hasegawa, Y.; Hasib, A.; Hassani, S.; Haug, S.; Hauser, R.; Hauswald, L.; Havranek, M.; Hawkes, C. M.; Hawkings, R. J.; Hayakawa, D.; Hayden, D.; Hays, C. P.; Hays, J. M.; Hayward, H. S.; Haywood, S. J.; Head, S. J.; Heck, T.; Hedberg, V.; Heelan, L.; Heim, S.; Heim, T.; Heinemann, B.; Heinrich, J. J.; Heinrich, L.; Heinz, C.; Hejbal, J.; Helary, L.; Hellman, S.; Helsens, C.; Henderson, J.; Henderson, R. C. W.; Heng, Y.; Henkelmann, S.; Henriques Correia, A. M.; Henrot-Versille, S.; Herbert, G. H.; Herget, V.; Hernández Jiménez, Y.; Herten, G.; Hertenberger, R.; Hervas, L.; Hesketh, G. G.; Hessey, N. P.; Hetherly, J. W.; Hickling, R.; Higón-Rodriguez, E.; Hill, E.; Hill, J. C.; Hiller, K. H.; Hillier, S. J.; Hinchliffe, I.; Hines, E.; Hinman, R. R.; Hirose, M.; Hirschbuehl, D.; Hobbs, J.; Hod, N.; Hodgkinson, M. C.; Hodgson, P.; Hoecker, A.; Hoeferkamp, M. R.; Hoenig, F.; Hohn, D.; Holmes, T. R.; Homann, M.; Hong, T. M.; Hooberman, B. H.; Hopkins, W. H.; Horii, Y.; Horton, A. J.; Hostachy, J.-Y.; Hou, S.; Hoummada, A.; Howarth, J.; Hrabovsky, M.; Hristova, I.; Hrivnac, J.; Hryn'ova, T.; Hrynevich, A.; Hsu, C.; Hsu, P. J.; Hsu, S.-C.; Hu, D.; Hu, Q.; Hu, S.; Huang, Y.; Hubacek, Z.; Hubaut, F.; Huegging, F.; Huffman, T. B.; Hughes, E. W.; Hughes, G.; Huhtinen, M.; Huo, P.; Huseynov, N.; Huston, J.; Huth, J.; Iacobucci, G.; Iakovidis, G.; Ibragimov, I.; Iconomidou-Fayard, L.; Ideal, E.; Idrissi, Z.; Iengo, P.; Igonkina, O.; Iizawa, T.; Ikegami, Y.; Ikeno, M.; Ilchenko, Y.; Iliadis, D.; Ilic, N.; Ince, T.; Introzzi, G.; Ioannou, P.; Iodice, M.; Iordanidou, K.; Ippolito, V.; Ishijima, N.; Ishino, M.; Ishitsuka, M.; Ishmukhametov, R.; Issever, C.; Istin, S.; Ito, F.; Ponce, J. M. Iturbe; Iuppa, R.; Iwanski, W.; Iwasaki, H.; Izen, J. M.; Izzo, V.; Jabbar, S.; Jackson, B.; Jackson, P.; Jain, V.; Jakobi, K. B.; Jakobs, K.; Jakobsen, S.; Jakoubek, T.; Jamin, D. O.; Jana, D. K.; Jansen, E.; Jansky, R.; Janssen, J.; Janus, M.; Jarlskog, G.; Javadov, N.; Javůrek, T.; Jeanneau, F.; Jeanty, L.; Jejelava, J.; Jeng, G.-Y.; Jennens, D.; Jenni, P.; Jeske, C.; Jézéquel, S.; Ji, H.; Jia, J.; Jiang, H.; Jiang, Y.; Jiggins, S.; Jimenez Pena, J.; Jin, S.; Jinaru, A.; Jinnouchi, O.; Jivan, H.; Johansson, P.; Johns, K. A.; Johnson, W. J.; Jon-And, K.; Jones, G.; Jones, R. W. L.; Jones, S.; Jones, T. J.; Jongmanns, J.; Jorge, P. M.; Jovicevic, J.; Ju, X.; Juste Rozas, A.; Köhler, M. K.; Kaczmarska, A.; Kado, M.; Kagan, H.; Kagan, M.; Kahn, S. J.; Kaji, T.; Kajomovitz, E.; Kalderon, C. W.; Kaluza, A.; Kama, S.; Kamenshchikov, A.; Kanaya, N.; Kaneti, S.; Kanjir, L.; Kantserov, V. A.; Kanzaki, J.; Kaplan, B.; Kaplan, L. S.; Kapliy, A.; Kar, D.; Karakostas, K.; Karamaoun, A.; Karastathis, N.; Kareem, M. J.; Karentzos, E.; Karnevskiy, M.; Karpov, S. N.; Karpova, Z. M.; Karthik, K.; Kartvelishvili, V.; Karyukhin, A. N.; Kasahara, K.; Kashif, L.; Kass, R. 
D.; Kastanas, A.; Kataoka, Y.; Kato, C.; Katre, A.; Katzy, J.; Kawagoe, K.; Kawamoto, T.; Kawamura, G.; Kazanin, V. F.; Keeler, R.; Kehoe, R.; Keller, J. S.; Kempster, J. J.; Kawade, K.; Keoshkerian, H.; Kepka, O.; Kerševan, B. P.; Kersten, S.; Keyes, R. A.; Khader, M.; Khalil-zada, F.; Khanov, A.; Kharlamov, A. G.; Khoo, T. J.; Khovanskiy, V.; Khramov, E.; Khubua, J.; Kido, S.; Kilby, C. R.; Kim, H. Y.; Kim, S. H.; Kim, Y. K.; Kimura, N.; Kind, O. M.; King, B. T.; King, M.; King, S. B.; Kirk, J.; Kiryunin, A. E.; Kishimoto, T.; Kisielewska, D.; Kiss, F.; Kiuchi, K.; Kivernyk, O.; Kladiva, E.; Klein, M. H.; Klein, M.; Klein, U.; Kleinknecht, K.; Klimek, P.; Klimentov, A.; Klingenberg, R.; Klinger, J. A.; Klioutchnikova, T.; Kluge, E.-E.; Kluit, P.; Kluth, S.; Knapik, J.; Kneringer, E.; Knoops, E. B. F. G.; Knue, A.; Kobayashi, A.; Kobayashi, D.; Kobayashi, T.; Kobel, M.; Kocian, M.; Kodys, P.; Koehler, N. M.; Koffas, T.; Koffeman, E.; Koi, T.; Kolanoski, H.; Kolb, M.; Koletsou, I.; Komar, A. A.; Komori, Y.; Kondo, T.; Kondrashova, N.; Köneke, K.; König, A. C.; Kono, T.; Konoplich, R.; Konstantinidis, N.; Kopeliansky, R.; Koperny, S.; Köpke, L.; Kopp, A. K.; Korcyl, K.; Kordas, K.; Korn, A.; Korol, A. A.; Korolkov, I.; Korolkova, E. V.; Kortner, O.; Kortner, S.; Kosek, T.; Kostyukhin, V. V.; Kotwal, A.; Kourkoumeli-Charalampidi, A.; Kourkoumelis, C.; Kouskoura, V.; Kowalewska, A. B.; Kowalewski, R.; Kowalski, T. Z.; Kozakai, C.; Kozanecki, W.; Kozhin, A. S.; Kramarenko, V. A.; Kramberger, G.; Krasnopevtsev, D.; Krasny, M. W.; Krasznahorkay, A.; Kravchenko, A.; Kretz, M.; Kretzschmar, J.; Kreutzfeldt, K.; Krieger, P.; Krizka, K.; Kroeninger, K.; Kroha, H.; Kroll, J.; Kroseberg, J.; Krstic, J.; Kruchonak, U.; Krüger, H.; Krumnack, N.; Kruse, A.; Kruse, M. C.; Kruskal, M.; Kubota, T.; Kucuk, H.; Kuday, S.; Kuechler, J. T.; Kuehn, S.; Kugel, A.; Kuger, F.; Kuhl, A.; Kuhl, T.; Kukhtin, V.; Kukla, R.; Kulchitsky, Y.; Kuleshov, S.; Kuna, M.; Kunigo, T.; Kupco, A.; Kurashige, H.; Kurochkin, Y. A.; Kus, V.; Kuwertz, E. S.; Kuze, M.; Kvita, J.; Kwan, T.; Kyriazopoulos, D.; La Rosa, A.; La Rosa Navarro, J. L.; La Rotonda, L.; Lacasta, C.; Lacava, F.; Lacey, J.; Lacker, H.; Lacour, D.; Lacuesta, V. R.; Ladygin, E.; Lafaye, R.; Laforge, B.; Lagouri, T.; Lai, S.; Lammers, S.; Lampl, W.; Lançon, E.; Landgraf, U.; Landon, M. P. J.; Lanfermann, M. C.; Lang, V. S.; Lange, J. C.; Lankford, A. J.; Lanni, F.; Lantzsch, K.; Lanza, A.; Laplace, S.; Lapoire, C.; Laporte, J. F.; Lari, T.; Lasagni Manghi, F.; Lassnig, M.; Laurelli, P.; Lavrijsen, W.; Law, A. T.; Laycock, P.; Lazovich, T.; Lazzaroni, M.; Le, B.; Le Dortz, O.; Le Guirriec, E.; Le Quilleuc, E. P.; LeBlanc, M.; LeCompte, T.; Ledroit-Guillon, F.; Lee, C. A.; Lee, S. C.; Lee, L.; Lefebvre, B.; Lefebvre, G.; Lefebvre, M.; Legger, F.; Leggett, C.; Lehan, A.; Lehmann Miotto, G.; Lei, X.; Leight, W. A.; Leisos, A.; Leister, A. G.; Leite, M. A. L.; Leitner, R.; Lellouch, D.; Lemmer, B.; Leney, K. J. C.; Lenz, T.; Lenzi, B.; Leone, R.; Leone, S.; Leonidopoulos, C.; Leontsinis, S.; Lerner, G.; Leroy, C.; Lesage, A. A. J.; Lester, C. G.; Levchenko, M.; Levêque, J.; Levin, D.; Levinson, L. J.; Levy, M.; Lewis, D.; Leyko, A. M.; Leyton, M.; Li, B.; Li, C.; Li, H.; Li, H. L.; Li, L.; Li, L.; Li, Q.; Li, S.; Li, X.; Li, Y.; Liang, Z.; Liberti, B.; Liblong, A.; Lichard, P.; Lie, K.; Liebal, J.; Liebig, W.; Limosani, A.; Lin, S. C.; Lin, T. H.; Lindquist, B. E.; Lionti, A. E.; Lipeles, E.; Lipniacka, A.; Lisovyi, M.; Liss, T. M.; Lister, A.; Litke, A. 
M.; Liu, B.; Liu, D.; Liu, H.; Liu, H.; Liu, J.; Liu, J. B.; Liu, K.; Liu, L.; Liu, M.; Liu, M.; Liu, Y. L.; Liu, Y.; Livan, M.; Lleres, A.; Llorente Merino, J.; Lloyd, S. L.; Lo Sterzo, F.; Lobodzinska, E.; Loch, P.; Lockman, W. S.; Loebinger, F. K.; Loevschall-Jensen, A. E.; Loew, K. M.; Loginov, A.; Lohse, T.; Lohwasser, K.; Lokajicek, M.; Long, B. A.; Long, J. D.; Long, R. E.; Longo, L.; Looper, K. A.; Lopes, L.; Lopez Mateos, D.; Lopez Paredes, B.; Lopez Paz, I.; Lopez Solis, A.; Lorenz, J.; Martinez, N. Lorenzo; Losada, M.; Lösel, P. J.; Lou, X.; Lounis, A.; Love, J.; Love, P. A.; Lu, H.; Lu, N.; Lubatti, H. J.; Luci, C.; Lucotte, A.; Luedtke, C.; Luehring, F.; Lukas, W.; Luminari, L.; Lundberg, O.; Lund-Jensen, B.; Luzi, P. M.; Lynn, D.; Lysak, R.; Lytken, E.; Lyubushkin, V.; Ma, H.; Ma, L. L.; Ma, Y.; Maccarrone, G.; Macchiolo, A.; Macdonald, C. M.; Maček, B.; Machado Miguens, J.; Madaffari, D.; Madar, R.; Maddocks, H. J.; Mader, W. F.; Madsen, A.; Maeda, J.; Maeland, S.; Maeno, T.; Maevskiy, A.; Magradze, E.; Mahlstedt, J.; Maiani, C.; Maidantchik, C.; Maier, A. A.; Maier, T.; Maio, A.; Majewski, S.; Makida, Y.; Makovec, N.; Malaescu, B.; Malecki, Pa.; Maleev, V. P.; Malek, F.; Mallik, U.; Malon, D.; Malone, C.; Maltezos, S.; Malyukov, S.; Mamuzic, J.; Mancini, G.; Mandelli, B.; Mandelli, L.; Mandić, I.; Maneira, J.; Filho, L. Manhaes de Andrade; Manjarres Ramos, J.; Mann, A.; Manousos, A.; Mansoulie, B.; Mansour, J. D.; Mantifel, R.; Mantoani, M.; Manzoni, S.; Mapelli, L.; Marceca, G.; March, L.; Marchiori, G.; Marcisovsky, M.; Marjanovic, M.; Marley, D. E.; Marroquim, F.; Marsden, S. P.; Marshall, Z.; Marti-Garcia, S.; Martin, B.; Martin, T. A.; Martin, V. J.; dit Latour, B. Martin; Martinez, M.; Martinez Outschoorn, V. I.; Martin-Haugh, S.; Martoiu, V. S.; Martyniuk, A. C.; Marx, M.; Marzin, A.; Masetti, L.; Mashimo, T.; Mashinistov, R.; Masik, J.; Maslennikov, A. L.; Massa, I.; Massa, L.; Mastrandrea, P.; Mastroberardino, A.; Masubuchi, T.; Mättig, P.; Mattmann, J.; Maurer, J.; Maxfield, S. J.; Maximov, D. A.; Mazini, R.; Mazza, S. M.; McFadden, N. C.; McGoldrick, G.; McKee, S. P.; McCarn, A.; McCarthy, R. L.; McCarthy, T. G.; McClymont, L. I.; McDonald, E. F.; Mcfayden, J. A.; Mchedlidze, G.; McMahon, S. J.; McPherson, R. A.; Medinnis, M.; Meehan, S.; Mehlhase, S.; Mehta, A.; Meier, K.; Meineck, C.; Meirose, B.; Melini, D.; Mellado Garcia, B. R.; Melo, M.; Meloni, F.; Mengarelli, A.; Menke, S.; Meoni, E.; Mergelmeyer, S.; Mermod, P.; Merola, L.; Meroni, C.; Merritt, F. S.; Messina, A.; Metcalfe, J.; Mete, A. S.; Meyer, C.; Meyer, C.; Meyer, J.-P.; Meyer, J.; Theenhausen, H. Meyer Zu; Miano, F.; Middleton, R. P.; Miglioranzi, S.; Mijović, L.; Mikenberg, G.; Mikestikova, M.; Mikuž, M.; Milesi, M.; Milic, A.; Miller, D. W.; Mills, C.; Milov, A.; Milstead, D. A.; Minaenko, A. A.; Minami, Y.; Minashvili, I. A.; Mincer, A. I.; Mindur, B.; Mineev, M.; Ming, Y.; Mir, L. M.; Mistry, K. P.; Mitani, T.; Mitrevski, J.; Mitsou, V. A.; Miucci, A.; Miyagawa, P. S.; Mjörnmark, J. U.; Moa, T.; Mochizuki, K.; Mohapatra, S.; Molander, S.; Moles-Valls, R.; Monden, R.; Mondragon, M. C.; Mönig, K.; Monk, J.; Monnier, E.; Montalbano, A.; Montejo Berlingen, J.; Monticelli, F.; Monzani, S.; Moore, R. W.; Morange, N.; Moreno, D.; Moreno Llácer, M.; Morettini, P.; Mori, D.; Mori, T.; Morii, M.; Morinaga, M.; Morisbak, V.; Moritz, S.; Morley, A. K.; Mornacchi, G.; Morris, J. D.; Mortensen, S. S.; Morvaj, L.; Mosidze, M.; Moss, J.; Motohashi, K.; Mount, R.; Mountricha, E.; Mouraviev, S. V.; Moyse, E. J. 
W.; Muanza, S.; Mudd, R. D.; Mueller, F.; Mueller, J.; Mueller, R. S. P.; Mueller, T.; Muenstermann, D.; Mullen, P.; Mullier, G. A.; Munoz Sanchez, F. J.; Murillo Quijada, J. A.; Murray, W. J.; Musheghyan, H.; Muškinja, M.; Myagkov, A. G.; Myska, M.; Nachman, B. P.; Nackenhorst, O.; Nagai, K.; Nagai, R.; Nagano, K.; Nagasaka, Y.; Nagata, K.; Nagel, M.; Nagy, E.; Nairz, A. M.; Nakahama, Y.; Nakamura, K.; Nakamura, T.; Nakano, I.; Namasivayam, H.; Naranjo Garcia, R. F.; Narayan, R.; Narrias Villar, D. I.; Naryshkin, I.; Naumann, T.; Navarro, G.; Nayyar, R.; Neal, H. A.; Nechaeva, P. Yu.; Neep, T. J.; Negri, A.; Negrini, M.; Nektarijevic, S.; Nellist, C.; Nelson, A.; Nemecek, S.; Nemethy, P.; Nepomuceno, A. A.; Nessi, M.; Neubauer, M. S.; Neumann, M.; Neves, R. M.; Nevski, P.; Newman, P. R.; Nguyen, D. H.; Nguyen Manh, T.; Nickerson, R. B.; Nicolaidou, R.; Nielsen, J.; Nikiforov, A.; Nikolaenko, V.; Nikolic-Audit, I.; Nikolopoulos, K.; Nilsen, J. K.; Nilsson, P.; Ninomiya, Y.; Nisati, A.; Nisius, R.; Nobe, T.; Nomachi, M.; Nomidis, I.; Nooney, T.; Norberg, S.; Nordberg, M.; Norjoharuddeen, N.; Novgorodova, O.; Nowak, S.; Nozaki, M.; Nozka, L.; Ntekas, K.; Nurse, E.; Nuti, F.; O'grady, F.; O'Neil, D. C.; O'Rourke, A. A.; O'Shea, V.; Oakham, F. G.; Oberlack, H.; Obermann, T.; Ocariz, J.; Ochi, A.; Ochoa, I.; Ochoa-Ricoux, J. P.; Oda, S.; Odaka, S.; Ogren, H.; Oh, A.; Oh, S. H.; Ohm, C. C.; Ohman, H.; Oide, H.; Okawa, H.; Okumura, Y.; Okuyama, T.; Olariu, A.; Oleiro Seabra, L. F.; Olivares Pino, S. A.; Oliveira Damazio, D.; Olszewski, A.; Olszowska, J.; Onofre, A.; Onogi, K.; Onyisi, P. U. E.; Oreglia, M. J.; Oren, Y.; Orestano, D.; Orlando, N.; Orr, R. S.; Osculati, B.; Ospanov, R.; Garzon, G. Otero y.; Otono, H.; Ouchrif, M.; Ould-Saada, F.; Ouraou, A.; Oussoren, K. P.; Ouyang, Q.; Owen, M.; Owen, R. E.; Ozcan, V. E.; Ozturk, N.; Pachal, K.; Pacheco Pages, A.; Pacheco Rodriguez, L.; Padilla Aranda, C.; Pagáčová, M.; Pagan Griso, S.; Paige, F.; Pais, P.; Pajchel, K.; Palacino, G.; Palazzo, S.; Palestini, S.; Palka, M.; Pallin, D.; Panagiotopoulou, E. St.; Pandini, C. E.; Panduro Vazquez, J. G.; Pani, P.; Panitkin, S.; Pantea, D.; Paolozzi, L.; Papadopoulou, Th. D.; Papageorgiou, K.; Paramonov, A.; Paredes Hernandez, D.; Parker, A. J.; Parker, M. A.; Parker, K. A.; Parodi, F.; Parsons, J. A.; Parzefall, U.; Pascuzzi, V. R.; Pasqualucci, E.; Passaggio, S.; Pastore, Fr.; Pásztor, G.; Pataraia, S.; Pater, J. R.; Pauly, T.; Pearce, J.; Pearson, B.; Pedersen, L. E.; Pedersen, M.; Pedraza Lopez, S.; Pedro, R.; Peleganchuk, S. V.; Penc, O.; Peng, C.; Peng, H.; Penwell, J.; Peralva, B. S.; Perego, M. M.; Perepelitsa, D. V.; Perez Codina, E.; Perini, L.; Pernegger, H.; Perrella, S.; Peschke, R.; Peshekhonov, V. D.; Peters, K.; Peters, R. F. Y.; Petersen, B. A.; Petersen, T. C.; Petit, E.; Petridis, A.; Petridou, C.; Petroff, P.; Petrolo, E.; Petrov, M.; Petrucci, F.; Pettersson, N. E.; Peyaud, A.; Pezoa, R.; Phillips, P. W.; Piacquadio, G.; Pianori, E.; Picazio, A.; Piccaro, E.; Piccinini, M.; Pickering, M. A.; Piegaia, R.; Pilcher, J. E.; Pilkington, A. D.; Pin, A. W. J.; Pinamonti, M.; Pinfold, J. L.; Pingel, A.; Pires, S.; Pirumov, H.; Pitt, M.; Plazak, L.; Pleier, M.-A.; Pleskot, V.; Plotnikova, E.; Plucinski, P.; Pluth, D.; Poettgen, R.; Poggioli, L.; Pohl, D.; Polesello, G.; Poley, A.; Policicchio, A.; Polifka, R.; Polini, A.; Pollard, C. S.; Polychronakos, V.; Pommès, K.; Pontecorvo, L.; Pope, B. G.; Popeneciu, G. A.; Poppleton, A.; Pospisil, S.; Potamianos, K.; Potrap, I. N.; Potter, C. 
J.; Potter, C. T.; Poulard, G.; Poveda, J.; Pozdnyakov, V.; Pozo Astigarraga, M. E.; Pralavorio, P.; Pranko, A.; Prell, S.; Price, D.; Price, L. E.; Primavera, M.; Prince, S.; Prokofiev, K.; Prokoshin, F.; Protopopescu, S.; Proudfoot, J.; Przybycien, M.; Puddu, D.; Purohit, M.; Puzo, P.; Qian, J.; Qin, G.; Qin, Y.; Quadt, A.; Quayle, W. B.; Queitsch-Maitland, M.; Quilty, D.; Raddum, S.; Radeka, V.; Radescu, V.; Radhakrishnan, S. K.; Radloff, P.; Rados, P.; Ragusa, F.; Rahal, G.; Raine, J. A.; Rajagopalan, S.; Rammensee, M.; Rangel-Smith, C.; Ratti, M. G.; Rauscher, F.; Rave, S.; Ravenscroft, T.; Ravinovich, I.; Raymond, M.; Read, A. L.; Readioff, N. P.; Reale, M.; Rebuzzi, D. M.; Redelbach, A.; Redlinger, G.; Reece, R.; Reeves, K.; Rehnisch, L.; Reichert, J.; Reisin, H.; Rembser, C.; Ren, H.; Rescigno, M.; Resconi, S.; Rezanova, O. L.; Reznicek, P.; Rezvani, R.; Richter, R.; Richter, S.; Richter-Was, E.; Ricken, O.; Ridel, M.; Rieck, P.; Riegel, C. J.; Rieger, J.; Rifki, O.; Rijssenbeek, M.; Rimoldi, A.; Rimoldi, M.; Rinaldi, L.; Ristić, B.; Ritsch, E.; Riu, I.; Rizatdinova, F.; Rizvi, E.; Rizzi, C.; Robertson, S. H.; Robichaud-Veronneau, A.; Robinson, D.; Robinson, J. E. M.; Robson, A.; Roda, C.; Rodina, Y.; Rodriguez Perez, A.; Rodriguez Rodriguez, D.; Roe, S.; Rogan, C. S.; RØhne, O.; Romaniouk, A.; Romano, M.; Romano Saez, S. M.; Romero Adam, E.; Rompotis, N.; Ronzani, M.; Roos, L.; Ros, E.; Rosati, S.; Rosbach, K.; Rose, P.; Rosenthal, O.; Rosien, N.-A.; Rossetti, V.; Rossi, E.; Rossi, L. P.; Rosten, J. H. N.; Rosten, R.; Rotaru, M.; Roth, I.; Rothberg, J.; Rousseau, D.; Royon, C. R.; Rozanov, A.; Rozen, Y.; Ruan, X.; Rubbo, F.; Rudolph, M. S.; Rühr, F.; Ruiz-Martinez, A.; Rurikova, Z.; Rusakovich, N. A.; Ruschke, A.; Russell, H. L.; Rutherfoord, J. P.; Ruthmann, N.; Ryabov, Y. F.; Rybar, M.; Rybkin, G.; Ryu, S.; Ryzhov, A.; Rzehorz, G. F.; Saavedra, A. F.; Sabato, G.; Sacerdoti, S.; Sadrozinski, H. F.-W.; Sadykov, R.; Safai Tehrani, F.; Saha, P.; Sahinsoy, M.; Saimpert, M.; Saito, T.; Sakamoto, H.; Sakurai, Y.; Salamanna, G.; Salamon, A.; Salazar Loyola, J. E.; Salek, D.; Sales De Bruin, P. H.; Salihagic, D.; Salnikov, A.; Salt, J.; Salvatore, D.; Salvatore, F.; Salvucci, A.; Salzburger, A.; Sammel, D.; Sampsonidis, D.; Sanchez, A.; Sánchez, J.; Sanchez Martinez, V.; Sandaker, H.; Sandbach, R. L.; Sander, H. G.; Sandhoff, M.; Sandoval, C.; Sandstroem, R.; Sankey, D. P. C.; Sannino, M.; Sansoni, A.; Santoni, C.; Santonico, R.; Santos, H.; Santoyo Castillo, I.; Sapp, K.; Sapronov, A.; Saraiva, J. G.; Sarrazin, B.; Sasaki, O.; Sasaki, Y.; Sato, K.; Sauvage, G.; Sauvan, E.; Savage, G.; Savard, P.; Savic, N.; Sawyer, C.; Sawyer, L.; Saxon, J.; Sbarra, C.; Sbrizzi, A.; Scanlon, T.; Scannicchio, D. A.; Scarcella, M.; Scarfone, V.; Schaarschmidt, J.; Schacht, P.; Schachtner, B. M.; Schaefer, D.; Schaefer, L.; Schaefer, R.; Schaeffer, J.; Schaepe, S.; Schaetzel, S.; Schäfer, U.; Schaffer, A. C.; Schaile, D.; Schamberger, R. D.; Scharf, V.; Schegelsky, V. A.; Scheirich, D.; Schernau, M.; Schiavi, C.; Schier, S.; Schillo, C.; Schioppa, M.; Schlenker, S.; Schmidt-Sommerfeld, K. R.; Schmieden, K.; Schmitt, C.; Schmitt, S.; Schmitz, S.; Schneider, B.; Schnoor, U.; Schoeffel, L.; Schoening, A.; Schoenrock, B. D.; Schopf, E.; Schott, M.; Schovancova, J.; Schramm, S.; Schreyer, M.; Schuh, N.; Schulte, A.; Schultens, M. J.; Schultz-Coulon, H.-C.; Schulz, H.; Schumacher, M.; Schumm, B. A.; Schune, Ph.; Schwartzman, A.; Schwarz, T. 
A.; Schweiger, H.; Schwemling, Ph.; Schwienhorst, R.; Schwindling, J.; Schwindt, T.; Sciolla, G.; Scuri, F.; Scutti, F.; Searcy, J.; Seema, P.; Seidel, S. C.; Seiden, A.; Seifert, F.; Seixas, J. M.; Sekhniaidze, G.; Sekhon, K.; Sekula, S. J.; Seliverstov, D. M.; Semprini-Cesari, N.; Serfon, C.; Serin, L.; Serkin, L.; Sessa, M.; Seuster, R.; Severini, H.; Sfiligoj, T.; Sforza, F.; Sfyrla, A.; Shabalina, E.; Shaikh, N. W.; Shan, L. Y.; Shang, R.; Shank, J. T.; Shapiro, M.; Shatalov, P. B.; Shaw, K.; Shaw, S. M.; Shcherbakova, A.; Shehu, C. Y.; Sherwood, P.; Shi, L.; Shimizu, S.; Shimmin, C. O.; Shimojima, M.; Shiyakova, M.; Shmeleva, A.; Shoaleh Saadi, D.; Shochet, M. J.; Shojaii, S.; Shrestha, S.; Shulga, E.; Shupe, M. A.; Sicho, P.; Sickles, A. M.; Sidebo, P. E.; Sidiropoulou, O.; Sidorov, D.; Sidoti, A.; Siegert, F.; Sijacki, Dj.; Silva, J.; Silverstein, S. B.; Simak, V.; Simic, Lj.; Simion, S.; Simioni, E.; Simmons, B.; Simon, D.; Simon, M.; Sinervo, P.; Sinev, N. B.; Sioli, M.; Siragusa, G.; Sivoklokov, S. Yu.; Sjölin, J.; Skinner, M. B.; Skottowe, H. P.; Skubic, P.; Slater, M.; Slavicek, T.; Slawinska, M.; Sliwa, K.; Slovak, R.; Smakhtin, V.; Smart, B. H.; Smestad, L.; Smiesko, J.; Smirnov, S. Yu.; Smirnov, Y.; Smirnova, L. N.; Smirnova, O.; Smith, M. N. K.; Smith, R. W.; Smizanska, M.; Smolek, K.; Snesarev, A. A.; Snyder, S.; Sobie, R.; Socher, F.; Soffer, A.; Soh, D. A.; Sokhrannyi, G.; Solans Sanchez, C. A.; Solar, M.; Soldatov, E. Yu.; Soldevila, U.; Solodkov, A. A.; Soloshenko, A.; Solovyanov, O. V.; Solovyev, V.; Sommer, P.; Son, H.; Song, H. Y.; Sood, A.; Sopczak, A.; Sopko, V.; Sorin, V.; Sosa, D.; Sotiropoulou, C. L.; Soualah, R.; Soukharev, A. M.; South, D.; Sowden, B. C.; Spagnolo, S.; Spalla, M.; Spangenberg, M.; Spanò, F.; Sperlich, D.; Spettel, F.; Spighi, R.; Spigo, G.; Spiller, L. A.; Spousta, M.; St. Denis, R. D.; Stabile, A.; Stamen, R.; Stamm, S.; Stanecka, E.; Stanek, R. W.; Stanescu, C.; Stanescu-Bellu, M.; Stanitzki, M. M.; Stapnes, S.; Starchenko, E. A.; Stark, G. H.; Stark, J.; Staroba, P.; Starovoitov, P.; Stärz, S.; Staszewski, R.; Steinberg, P.; Stelzer, B.; Stelzer, H. J.; Stelzer-Chilton, O.; Stenzel, H.; Stewart, G. A.; Stillings, J. A.; Stockton, M. C.; Stoebe, M.; Stoicea, G.; Stolte, P.; Stonjek, S.; Stradling, A. R.; Straessner, A.; Stramaglia, M. E.; Strandberg, J.; Strandberg, S.; Strandlie, A.; Strauss, M.; Strizenec, P.; Ströhmer, R.; Strom, D. M.; Stroynowski, R.; Strubig, A.; Stucci, S. A.; Stugu, B.; Styles, N. A.; Su, D.; Su, J.; Suchek, S.; Sugaya, Y.; Suk, M.; Sulin, V. V.; Sultansoy, S.; Sumida, T.; Sun, S.; Sun, X.; Sundermann, J. E.; Suruliz, K.; Susinno, G.; Sutton, M. R.; Suzuki, S.; Svatos, M.; Swiatlowski, M.; Sykora, I.; Sykora, T.; Ta, D.; Taccini, C.; Tackmann, K.; Taenzer, J.; Taffard, A.; Tafirout, R.; Taiblum, N.; Takai, H.; Takashima, R.; Takeshita, T.; Takubo, Y.; Talby, M.; Talyshev, A. A.; Tan, K. G.; Tanaka, J.; Tanaka, M.; Tanaka, R.; Tanaka, S.; Tannenwald, B. B.; Tapia Araya, S.; Tapprogge, S.; Tarem, S.; Tartarelli, G. F.; Tas, P.; Tasevsky, M.; Tashiro, T.; Tassi, E.; Tavares Delgado, A.; Tayalati, Y.; Taylor, A. C.; Taylor, G. N.; Taylor, P. T. E.; Taylor, W.; Teischinger, F. A.; Teixeira-Dias, P.; Temming, K. K.; Temple, D.; Ten Kate, H.; Teng, P. K.; Teoh, J. J.; Tepel, F.; Terada, S.; Terashi, K.; Terron, J.; Terzo, S.; Testa, M.; Teuscher, R. J.; Theveneaux-Pelzer, T.; Thomas, J. P.; Thomas-Wilsker, J.; Thompson, E. N.; Thompson, P. D.; Thompson, A. S.; Thomsen, L. A.; Thomson, E.; Thomson, M.; Tibbetts, M. 
J.; Ticse Torres, R. E.; Tikhomirov, V. O.; Tikhonov, Yu. A.; Timoshenko, S.; Tipton, P.; Tisserant, S.; Todome, K.; Todorov, T.; Todorova-Nova, S.; Tojo, J.; Tokár, S.; Tokushuku, K.; Tolley, E.; Tomlinson, L.; Tomoto, M.; Tompkins, L.; Toms, K.; Tong, B.; Torrence, E.; Torres, H.; Torró Pastor, E.; Toth, J.; Touchard, F.; Tovey, D. R.; Trefzger, T.; Tricoli, A.; Trigger, I. M.; Trincaz-Duvoid, S.; Tripiana, M. F.; Trischuk, W.; Trocmé, B.; Trofymov, A.; Troncon, C.; Trottier-McDonald, M.; Trovatelli, M.; Truong, L.; Trzebinski, M.; Trzupek, A.; Tseng, J. C.-L.; Tsiareshka, P. V.; Tsipolitis, G.; Tsirintanis, N.; Tsiskaridze, S.; Tsiskaridze, V.; Tskhadadze, E. G.; Tsui, K. M.; Tsukerman, I. I.; Tsulaia, V.; Tsuno, S.; Tsybychev, D.; Tu, Y.; Tudorache, A.; Tudorache, V.; Tuna, A. N.; Tupputi, S. A.; Turchikhin, S.; Turecek, D.; Turgeman, D.; Turra, R.; Turvey, A. J.; Tuts, P. M.; Tyndel, M.; Ucchielli, G.; Ueda, I.; Ughetto, M.; Ukegawa, F.; Unal, G.; Undrus, A.; Unel, G.; Ungaro, F. C.; Unno, Y.; Unverdorben, C.; Urban, J.; Urquijo, P.; Urrejola, P.; Usai, G.; Usanova, A.; Vacavant, L.; Vacek, V.; Vachon, B.; Valderanis, C.; Valdes Santurio, E.; Valencic, N.; Valentinetti, S.; Valero, A.; Valery, L.; Valkar, S.; Valls Ferrer, J. A.; Van Den Wollenberg, W.; Van Der Deijl, P. C.; van der Graaf, H.; van Eldik, N.; van Gemmeren, P.; Van Nieuwkoop, J.; van Vulpen, I.; van Woerden, M. C.; Vanadia, M.; Vandelli, W.; Vanguri, R.; Vaniachine, A.; Vankov, P.; Vardanyan, G.; Vari, R.; Varnes, E. W.; Varol, T.; Varouchas, D.; Vartapetian, A.; Varvell, K. E.; Vasquez, J. G.; Vazeille, F.; Vazquez Schroeder, T.; Veatch, J.; Veeraraghavan, V.; Veloce, L. M.; Veloso, F.; Veneziano, S.; Ventura, A.; Venturi, M.; Venturi, N.; Venturini, A.; Vercesi, V.; Verducci, M.; Verkerke, W.; Vermeulen, J. C.; Vest, A.; Vetterli, M. C.; Viazlo, O.; Vichou, I.; Vickey, T.; Boeriu, O. E. Vickey; Viehhauser, G. H. A.; Viel, S.; Vigani, L.; Villa, M.; Villaplana Perez, M.; Vilucchi, E.; Vincter, M. G.; Vinogradov, V. B.; Vittori, C.; Vivarelli, I.; Vlachos, S.; Vlasak, M.; Vogel, M.; Vokac, P.; Volpi, G.; Volpi, M.; von der Schmitt, H.; von Toerne, E.; Vorobel, V.; Vorobev, K.; Vos, M.; Voss, R.; Vossebeld, J. H.; Vranjes, N.; Vranjes Milosavljevic, M.; Vrba, V.; Vreeswijk, M.; Vuillermet, R.; Vukotic, I.; Vykydal, Z.; Wagner, P.; Wagner, W.; Wahlberg, H.; Wahrmund, S.; Wakabayashi, J.; Walder, J.; Walker, R.; Walkowiak, W.; Wallangen, V.; Wang, C.; Wang, C.; Wang, F.; Wang, H.; Wang, H.; Wang, J.; Wang, J.; Wang, K.; Wang, R.; Wang, S. M.; Wang, T.; Wang, T.; Wang, W.; Wang, X.; Wanotayaroj, C.; Warburton, A.; Ward, C. P.; Wardrope, D. R.; Washbrook, A.; Watkins, P. M.; Watson, A. T.; Watson, M. F.; Watts, G.; Watts, S.; Waugh, B. M.; Webb, S.; Weber, M. S.; Weber, S. W.; Webster, J. S.; Weidberg, A. R.; Weinert, B.; Weingarten, J.; Weiser, C.; Weits, H.; Wells, P. S.; Wenaus, T.; Wengler, T.; Wenig, S.; Wermes, N.; Werner, M.; Werner, M. D.; Werner, P.; Wessels, M.; Wetter, J.; Whalen, K.; Whallon, N. L.; Wharton, A. M.; White, A.; White, M. J.; White, R.; Whiteson, D.; Wickens, F. J.; Wiedenmann, W.; Wielers, M.; Wienemann, P.; Wiglesworth, C.; Wiik-Fuchs, L. A. M.; Wildauer, A.; Wilk, F.; Wilkens, H. G.; Williams, H. H.; Williams, S.; Willis, C.; Willocq, S.; Wilson, J. A.; Wingerter-Seez, I.; Winklmeier, F.; Winston, O. J.; Winter, B. T.; Wittgen, M.; Wittkowski, J.; Wolf, T. M. H.; Wolter, M. W.; Wolters, H.; Worm, S. D.; Wosiek, B. K.; Wotschack, J.; Woudstra, M. J.; Wozniak, K. W.; Wu, M.; Wu, M.; Wu, S. 
L.; Wu, X.; Wu, Y.; Wyatt, T. R.; Wynne, B. M.; Xella, S.; Xu, D.; Xu, L.; Yabsley, B.; Yacoob, S.; Yamaguchi, D.; Yamaguchi, Y.; Yamamoto, A.; Yamamoto, S.; Yamanaka, T.; Yamauchi, K.; Yamazaki, Y.; Yan, Z.; Yang, H.; Yang, H.; Yang, Y.; Yang, Z.; Yao, W.-M.; Yap, Y. C.; Yasu, Y.; Yatsenko, E.; Yau Wong, K. H.; Ye, J.; Ye, S.; Yeletskikh, I.; Yen, A. L.; Yildirim, E.; Yorita, K.; Yoshida, R.; Yoshihara, K.; Young, C.; Young, C. J. S.; Youssef, S.; Yu, D. R.; Yu, J.; Yu, J. M.; Yu, J.; Yuan, L.; Yuen, S. P. Y.; Yusuff, I.; Zabinski, B.; Zaidan, R.; Zaitsev, A. M.; Zakharchuk, N.; Zalieckas, J.; Zaman, A.; Zambito, S.; Zanello, L.; Zanzi, D.; Zeitnitz, C.; Zeman, M.; Zemla, A.; Zeng, J. C.; Zeng, Q.; Zengel, K.; Zenin, O.; Ženiš, T.; Zerwas, D.; Zhang, D.; Zhang, F.; Zhang, G.; Zhang, H.; Zhang, J.; Zhang, L.; Zhang, R.; Zhang, R.; Zhang, X.; Zhang, Z.; Zhao, X.; Zhao, Y.; Zhao, Z.; Zhemchugov, A.; Zhong, J.; Zhou, B.; Zhou, C.; Zhou, L.; Zhou, L.; Zhou, M.; Zhou, N.; Zhu, C. G.; Zhu, H.; Zhu, J.; Zhu, Y.; Zhuang, X.; Zhukov, K.; Zibell, A.; Zieminska, D.; Zimine, N. I.; Zimmermann, C.; Zimmermann, S.; Zinonos, Z.; Zinser, M.; Ziolkowski, M.; Živković, L.; Zobernig, G.; Zoccoli, A.; zur Nedden, M.; Zwalinski, L.

    2016-11-01

    Knowledge of the material in the ATLAS inner tracking detector is crucial in understanding the reconstruction of charged-particle tracks, the performance of algorithms that identify jets containing b-hadrons and is also essential to reduce background in searches for exotic particles that can decay within the inner detector volume. Interactions of primary hadrons produced in pp collisions with the material in the inner detector are used to map the location and amount of this material. The hadronic interactions of primary particles may result in secondary vertices, which in this analysis are reconstructed by an inclusive vertex-finding algorithm. Data were collected using minimum-bias triggers by the ATLAS detector operating at the LHC during 2010 at centre-of-mass energy √s = 7 TeV, and correspond to an integrated luminosity of 19 nb⁻¹. Kinematic properties of these secondary vertices are used to study the validity of the modelling of hadronic interactions in simulation. Secondary-vertex yields are compared between data and simulation over a volume of about 0.7 m³ around the interaction point, and agreement is found within overall uncertainties.

  6. Methane clathrate stability zone variations and gas transport in the Martian subsurface

    NASA Astrophysics Data System (ADS)

    Karatekin, O.; Gloesener, E.; Dehant, V. M. A.; Temel, O.

    2016-12-01

    In recent years, several detections of methane in the atmosphere of Mars have been reported from Earth-based and Mars-orbit instruments, with abundances ranging up to tens of parts per billion by volume (ppbv). Recently, the Curiosity rover detected methane with background levels of 0.7 ppbv and episodic releases of 7 ppbv. Although the methane sources are still unknown, this gas may have been stored in reservoirs of clathrate hydrate in the Martian subsurface where thermodynamic conditions are favourable to their presence. Clathrate hydrates are crystalline compounds consisting of cages formed by hydrogen-bonded water molecules, inside of which guest gas molecules are trapped. In this study, methane clathrate stability in the Martian subsurface is investigated and its temporal and spatial variations are studied. Present-day maps of the methane clathrate stability zone are produced by coupling the stability conditions of methane clathrate with a subsurface model using available observations such as the thermal inertia derived from TES MGS data. A gas transport model is then used to study the methane flux at the surface due to the diffusion of different plausible methane volumes released by clathrate hydrates at variable depths under the Martian surface.
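
    A deliberately simplified sketch of such a stability-zone check is shown below in Python: a conductive subsurface temperature profile is compared with a placeholder methane-clathrate dissociation curve evaluated at the local overburden pressure. All constants (surface temperature, thermal gradient, regolith density, curve coefficients) are assumptions for illustration, not values used in the study.

        import numpy as np

        G_MARS = 3.71            # m s^-2
        RHO_REGOLITH = 1800.0    # kg m^-3 (assumed)
        T_SURFACE = 200.0        # K, assumed mean surface temperature
        DT_DZ = 0.01             # K m^-1, assumed geothermal gradient

        def dissociation_temperature(pressure_pa, a=9.0, b=2700.0):
            # Placeholder Clausius-Clapeyron-style curve: P [MPa] = exp(a - b / T)
            return b / (a - np.log(pressure_pa / 1.0e6))

        def clathrate_stable(depth_m):
            temperature = T_SURFACE + DT_DZ * depth_m            # conductive profile
            pressure = RHO_REGOLITH * G_MARS * depth_m           # overburden pressure
            return temperature < dissociation_temperature(pressure)

        for z in (50.0, 200.0, 1000.0, 5000.0):
            print(f"depth {z:6.0f} m: stable = {clathrate_stable(z)}")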

  7. Characterization and classification of vegetation canopy structure and distribution within the Great Smoky Mountains National Park using LiDAR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumar, Jitendra; HargroveJr., William Walter; Norman, Steven P

    Vegetation canopy structure is a critically important habitat characteristic for many threatened and endangered birds and other animal species, and it is key information needed by forest and wildlife managers for monitoring and managing forest resources, conservation planning, and fostering biodiversity. Advances in Light Detection and Ranging (LiDAR) technologies have enabled remote sensing-based studies of vegetation canopies by capturing three-dimensional structures, yielding information not available in the two-dimensional images of the landscape provided by traditional multi-spectral remote sensing platforms. However, the large-volume data sets produced by airborne LiDAR instruments pose a significant computational challenge, requiring algorithms to identify and analyze patterns of interest buried within LiDAR point clouds in a computationally efficient manner, utilizing state-of-the-art computing infrastructure. We developed and applied a computationally efficient approach to analyze a large volume of LiDAR data and to characterize and map the vegetation canopy structures for 139,859 hectares (540 sq. miles) in the Great Smoky Mountains National Park. This study helps improve our understanding of the distribution of vegetation and animal habitats in this extremely diverse ecosystem.
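
    As a toy illustration of canopy-structure characterization from LiDAR (not the study's actual algorithm), the Python sketch below grids height-above-ground returns and reports a canopy height and a coarse vertical-layer profile for each cell.

        import numpy as np

        def canopy_metrics(points, cell_size=10.0, layer_bins=(0.0, 5.0, 15.0, 30.0, 60.0)):
            """points: (N, 3) array of x, y, height-above-ground in metres."""
            x, y, z = points[:, 0], points[:, 1], points[:, 2]
            ix = np.floor((x - x.min()) / cell_size).astype(int)
            iy = np.floor((y - y.min()) / cell_size).astype(int)
            metrics = {}
            for cell in set(zip(ix.tolist(), iy.tolist())):
                in_cell = (ix == cell[0]) & (iy == cell[1])
                counts, _ = np.histogram(z[in_cell], bins=layer_bins)
                metrics[cell] = {
                    "canopy_height_m": float(z[in_cell].max()),
                    "layer_fractions": counts / in_cell.sum(),   # fraction of returns per layer
                }
            return metrics

        rng = np.random.default_rng(0)
        pts = np.column_stack([rng.uniform(0, 30, 500),        # x
                               rng.uniform(0, 30, 500),        # y
                               rng.gamma(2.0, 6.0, 500)])      # synthetic heights
        print(next(iter(canopy_metrics(pts).items())))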

  8. Going beyond the flood insurance rate map: insights from flood hazard map co-production

    NASA Astrophysics Data System (ADS)

    Luke, Adam; Sanders, Brett F.; Goodrich, Kristen A.; Feldman, David L.; Boudreau, Danielle; Eguiarte, Ana; Serrano, Kimberly; Reyes, Abigail; Schubert, Jochen E.; AghaKouchak, Amir; Basolo, Victoria; Matthew, Richard A.

    2018-04-01

    Flood hazard mapping in the United States (US) is deeply tied to the National Flood Insurance Program (NFIP). Consequently, publicly available flood maps provide essential information for insurance purposes, but they do not necessarily provide relevant information for non-insurance aspects of flood risk management (FRM) such as public education and emergency planning. Recent calls for flood hazard maps that support a wider variety of FRM tasks highlight the need to deepen our understanding about the factors that make flood maps useful and understandable for local end users. In this study, social scientists and engineers explore opportunities for improving the utility and relevance of flood hazard maps through the co-production of maps responsive to end users' FRM needs. Specifically, two-dimensional flood modeling produced a set of baseline hazard maps for stakeholders of the Tijuana River valley, US, and Los Laureles Canyon in Tijuana, Mexico. Focus groups with natural resource managers, city planners, emergency managers, academia, non-profit, and community leaders refined the baseline hazard maps by triggering additional modeling scenarios and map revisions. Several important end user preferences emerged, such as (1) legends that frame flood intensity both qualitatively and quantitatively, and (2) flood scenario descriptions that report flood magnitude in terms of rainfall, streamflow, and its relation to an historic event. Regarding desired hazard map content, end users' requests revealed general consistency with mapping needs reported in European studies and guidelines published in Australia. However, requested map content that is not commonly produced included (1) standing water depths following the flood, (2) the erosive potential of flowing water, and (3) pluvial flood hazards, or flooding caused directly by rainfall. We conclude that the relevance and utility of commonly produced flood hazard maps can be most improved by illustrating pluvial flood hazards and by using concrete reference points to describe flooding scenarios rather than exceedance probabilities or frequencies.
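
    One of the end-user preferences noted above, legends that pair qualitative and quantitative descriptions of flood intensity, can be mocked up with a few lines of Python; the depth thresholds and labels below are illustrative assumptions, not values from the study.

        import numpy as np

        # Assumed depth thresholds (m) paired with qualitative legend labels.
        BOUNDS = np.array([0.1, 0.5, 1.0])
        LABELS = ["nuisance (< 0.1 m)",
                  "hazardous to pedestrians (0.1-0.5 m)",
                  "hazardous to vehicles (0.5-1.0 m)",
                  "extreme (> 1.0 m)"]

        def legend_classes(depth_m):
            """Map simulated flood depths (array, metres) to legend class indices."""
            return np.searchsorted(BOUNDS, depth_m, side="right")

        depths = np.array([[0.05, 0.30], [0.80, 2.40]])
        for d, idx in zip(depths.ravel(), legend_classes(depths).ravel()):
            print(f"{d:4.2f} m -> {LABELS[idx]}")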

  9. Book review: Handbook of the birds of the world, Volume 8, Broadbills to Tapaculos

    USGS Publications Warehouse

    Gustafson, Mary

    2004-01-01

    No abstract available. Review info: Handbook of the Birds of the World, Volume 8, Broadbills to Tapaculos. Edited by Josep del Hoyo, Andrew Elliott, and David Christie. Lynx Edicions, Barcelona, Spain. 2003: 845 pp., 81 color plates, over 470 color photographs, 672 maps. ISBN: 8487334504, $195.00 (cloth)

  10. Bibliographies of Northern and Central California Indians. Volume 1--Tribal Bibliographies A-N.

    ERIC Educational Resources Information Center

    Brandt, Randal S., Ed.; Davis-Kimball, Jeannine, Ed.

    This document is the first of a three-volume set made up of bibliographic citations to published texts, unpublished manuscripts, photographs, sound recordings, motion pictures, and maps concerning Native American tribal groups that inhabit, or have traditionally inhabited, northern and central California. Introductory material in this volume…

  11. Dynamic-contrast-enhanced-MRI with extravasating contrast reagent: Rat cerebral glioma blood volume determination

    NASA Astrophysics Data System (ADS)

    Li, Xin; Rooney, William D.; Várallyay, Csanád G.; Gahramanov, Seymur; Muldoon, Leslie L.; Goodman, James A.; Tagge, Ian J.; Selzer, Audrey H.; Pike, Martin M.; Neuwelt, Edward A.; Springer, Charles S.

    2010-10-01

    The accurate mapping of the tumor blood volume (TBV) fraction (vb) is a highly desired imaging biometric goal. It is commonly thought that achieving this is difficult, if not impossible, when small molecule contrast reagents (CRs) are used for the T1-weighted Dynamic-Contrast-Enhanced (DCE) MRI technique. This is because angiogenic malignant tumor vessels allow facile CR extravasation. Here, a three-site equilibrium water exchange model is applied to DCE-MRI data from the cerebrally-implanted rat brain U87 glioma, a tumor exhibiting rapid CR extravasation. Analyses of segments of the DCE data time-course (and of the entire time-course) with this "shutter-speed" pharmacokinetic model, which admits finite water exchange kinetics, allow TBV estimation from the first-pass segment. Pairwise parameter determinances were tested with grid searches of 2D parametric error surfaces. Tumor blood volume (vb), as well as ve (the extracellular, extravascular space volume fraction), and Ktrans (a CR extravasation rate measure) parametric maps are presented. The role of the Patlak Plot in DCE-MRI is also considered.

  12. 27 CFR 24.301 - Bulk still wine record.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    .... The bulk still wine record will contain the following: (a) The volume produced by fermentation in wine... fermentation, amelioration, sweetening, addition of spirits, blending; (d) The volume of wine used and produced...

  13. 27 CFR 24.301 - Bulk still wine record.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    .... The bulk still wine record will contain the following: (a) The volume produced by fermentation in wine... fermentation, amelioration, sweetening, addition of spirits, blending; (d) The volume of wine used and produced...

  14. 27 CFR 24.301 - Bulk still wine record.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    .... The bulk still wine record will contain the following: (a) The volume produced by fermentation in wine... fermentation, amelioration, sweetening, addition of spirits, blending; (d) The volume of wine used and produced...

  15. 27 CFR 24.301 - Bulk still wine record.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    .... The bulk still wine record will contain the following: (a) The volume produced by fermentation in wine... fermentation, amelioration, sweetening, addition of spirits, blending; (d) The volume of wine used and produced...

  16. 27 CFR 24.301 - Bulk still wine record.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    .... The bulk still wine record will contain the following: (a) The volume produced by fermentation in wine... fermentation, amelioration, sweetening, addition of spirits, blending; (d) The volume of wine used and produced...

  17. TU-AB-202-03: Prediction of PET Transfer Uncertainty by DIR Error Estimating Software, AUTODIRECT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, H; Chen, J; Phillips, J

    2016-06-15

    Purpose: Deformable image registration (DIR) is a powerful tool, but DIR errors can adversely affect its clinical applications. To estimate voxel-specific DIR uncertainty, a software tool, called AUTODIRECT (automated DIR evaluation of confidence tool), has been developed and validated. This work tests the ability of this software to predict uncertainty for the transfer of standard uptake values (SUV) from positron-emission tomography (PET) with DIR. Methods: Virtual phantoms are used for this study. Each phantom has a planning computed tomography (CT) image and a diagnostic PET-CT image set. A deformation was digitally applied to the diagnostic CT to create the planning CT image and establish a known deformation between the images. One lung and three rectum patient datasets were employed to create the virtual phantoms. Both of these sites have difficult deformation scenarios associated with them, which can affect DIR accuracy (lung tissue sliding and changes in rectal filling). The virtual phantoms were created to simulate these scenarios by introducing discontinuities in the deformation field at the lung and rectum borders. The DIR algorithm from the Plastimatch software was applied to these phantoms. The SUV mapping errors from the DIR were then compared to those predicted by AUTODIRECT. Results: The SUV error distributions closely followed the AUTODIRECT-predicted error distribution for the 4 test cases. The minimum and maximum PET SUVs were produced from AUTODIRECT at a 95% confidence interval before applying gradient-based SUV segmentation for each of these volumes. Notably, 93.5% of the target volume warped by the true deformation was included within the AUTODIRECT-predicted maximum SUV volume after the segmentation, while 78.9% of the target volume was within the target volume warped by Plastimatch. Conclusion: The AUTODIRECT framework is able to predict PET transfer uncertainty caused by DIR, which enables an understanding of the associated target volume uncertainty.

  18. Mapping internal connectivity through human migration in malaria endemic countries.

    PubMed

    Sorichetta, Alessandro; Bird, Tom J; Ruktanonchai, Nick W; Zu Erbach-Schoenberg, Elisabeth; Pezzulo, Carla; Tejedor, Natalia; Waldock, Ian C; Sadler, Jason D; Garcia, Andres J; Sedda, Luigi; Tatem, Andrew J

    2016-08-16

    Human mobility continues to increase in terms of volumes and reach, producing growing global connectivity. This connectivity hampers efforts to eliminate infectious diseases such as malaria through reintroductions of pathogens, and thus accounting for it becomes important in designing global, continental, regional, and national strategies. Recent works have shown that census-derived migration data provides a good proxy for internal connectivity, in terms of relative strengths of movement between administrative units, across temporal scales. To support global malaria eradication strategy efforts, here we describe the construction of an open access archive of estimated internal migration flows in endemic countries built through pooling of census microdata. These connectivity datasets, described here along with the approaches and methods used to create and validate them, are available both through the WorldPop website and the WorldPop Dataverse Repository.

  19. Mapping internal connectivity through human migration in malaria endemic countries

    PubMed Central

    Sorichetta, Alessandro; Bird, Tom J.; Ruktanonchai, Nick W.; zu Erbach-Schoenberg, Elisabeth; Pezzulo, Carla; Tejedor, Natalia; Waldock, Ian C.; Sadler, Jason D.; Garcia, Andres J.; Sedda, Luigi; Tatem, Andrew J.

    2016-01-01

    Human mobility continues to increase in terms of volumes and reach, producing growing global connectivity. This connectivity hampers efforts to eliminate infectious diseases such as malaria through reintroductions of pathogens, and thus accounting for it becomes important in designing global, continental, regional, and national strategies. Recent works have shown that census-derived migration data provides a good proxy for internal connectivity, in terms of relative strengths of movement between administrative units, across temporal scales. To support global malaria eradication strategy efforts, here we describe the construction of an open access archive of estimated internal migration flows in endemic countries built through pooling of census microdata. These connectivity datasets, described here along with the approaches and methods used to create and validate them, are available both through the WorldPop website and the WorldPop Dataverse Repository. PMID:27529469

  20. Artificial intelligence for geologic mapping with imaging spectrometers

    NASA Technical Reports Server (NTRS)

    Kruse, F. A.

    1993-01-01

    This project was a three year study at the Center for the Study of Earth from Space (CSES) within the Cooperative Institute for Research in Environmental Science (CIRES) at the University of Colorado, Boulder. The goal of this research was to develop an expert system to allow automated identification of geologic materials based on their spectral characteristics in imaging spectrometer data such as the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS). This requirement was dictated by the volume of data produced by imaging spectrometers, which prohibits manual analysis. The research described is based on the development of automated techniques for analysis of imaging spectrometer data that emulate the analytical processes used by a human observer. The research tested the feasibility of such an approach, implemented an operational system, and tested the validity of the results for selected imaging spectrometer data sets.

  1. Secondary oil recovery from selected Carter sandstone oilfields, Black Warrior Basin, Alabama

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, J.C.

    1993-04-15

    The objectives of this secondary oil recovery project involving the Carter sandstone in northwest Alabama are: (1) To increase the ultimate economic recovery of oil from the Carter reservoirs, thereby increasing domestic reserves and lessening US dependence on foreign oil; (2) To extensively model, test, and monitor the reservoirs so their management is optimized; and (3) To assimilate and transfer the information and results gathered to other US oil companies to encourage them to attempt similar projects. Start-up water injection began on 01/12/93 at the Central Bluff Field, and daily operations began on 01/13/93. These operations include monitoring wellhead pressures at the injector and two producers, and injection water treatment. Water injection was running 200-300 bbl/day at the end of February. Once the unit is pressured up, well testing will be performed. Unitization was approved on 03/01/93. For the North Fairview Field, correlations and log analyses were used to determine the fluid and rock properties. A summary of these properties is included in Table 1. The results of the log analysis were used to construct the hydrocarbon pore volume map shown on Figure 1. The map was planimetered to determine original oil-in-place (OOIP) values and the hydrocarbon pore volume by tract. The OOIP summed over all tracts by this method is 824.7 Mbbl (Figure 2). Original oil-in-place was also calculated directly: two such independent calculations gave 829.4 Mbbl (Table 1) and 835.6 Mbbl (Table 2). Thus, the three estimates of OOIP are within one percent. The approximately 88% of OOIP remaining provides an attractive target for secondary recovery. Injection start-up is planned for mid-June.
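    For orientation, the volumetric arithmetic behind such OOIP estimates can be sketched as follows; the 7758 constant converts acre-feet of hydrocarbon pore volume to reservoir barrels, and the tract volumes and formation volume factor in the sketch are illustrative assumptions, not the report's planimetered values.

    ```python
    # Hedged sketch of a volumetric OOIP estimate from a hydrocarbon pore volume
    # map summed by tract; all numbers are illustrative, not the report's data.
    ACRE_FT_TO_BBL = 7758.0   # reservoir barrels per acre-foot

    def ooip_stb(hcpv_acre_ft, bo):
        """Original oil in place (stock-tank barrels) from hydrocarbon pore volume
        and the oil formation volume factor Bo (reservoir bbl per stock-tank bbl)."""
        return ACRE_FT_TO_BBL * hcpv_acre_ft / bo

    tract_hcpv = {"tract_1": 50.0, "tract_2": 40.0, "tract_3": 45.0}  # acre-ft (assumed)
    total_stb = sum(ooip_stb(v, bo=1.25) for v in tract_hcpv.values())
    print(f"OOIP ~ {total_stb / 1e3:.1f} Mbbl")   # Mbbl = thousands of barrels
    ```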

  2. Mapping edge-based traffic measurements onto the internal links in MPLS network

    NASA Astrophysics Data System (ADS)

    Zhao, Guofeng; Tang, Hong; Zhang, Yi

    2004-09-01

    Applying multi-protocol label switching (MPLS) techniques to an IP-based backbone for traffic engineering has proven advantageous. Obtaining the volume of load on each internal link of the network is crucial for applying traffic engineering. Although measurements can be collected for each link, for example with the traditional SNMP scheme, such an approach may impose a heavy processing load and sharply degrade the throughput of the core routers. Monitoring only at the edge of the network and mapping the measurements onto the core therefore provides a good alternative. In this paper, we explore a scheme for traffic mapping with edge-based measurements in an MPLS network, in which the volume of traffic on each internal link over the domain is estimated from measurements available only at ingress nodes. We apply path-based measurements at ingress nodes without enabling measurements in the core of the network. We propose a method that can infer a path from the ingress to the egress node using the label distribution protocol without collecting routing data from core routers. Based on flow theory and queuing theory, we prove that our approach is effective and present the algorithm for traffic mapping. We also show performance simulation results that indicate the potential of our approach.
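    A toy version of the core idea, mapping per-path ingress measurements onto internal links, is sketched below; the label-switched paths, link sequences, and byte counts are invented for illustration and do not reproduce the paper's inference algorithm.

    ```python
    # Sketch: accumulate per-LSP byte counts measured at ingress nodes onto the
    # internal links each labeled path traverses (illustrative paths and counts).
    from collections import defaultdict

    path_links = {                       # LSP id -> ordered internal links it crosses
        "lsp1": [("A", "B"), ("B", "C")],
        "lsp2": [("A", "B"), ("B", "D")],
        "lsp3": [("E", "B"), ("B", "C")],
    }
    ingress_bytes = {"lsp1": 4_000_000, "lsp2": 1_500_000, "lsp3": 2_250_000}

    link_load = defaultdict(int)
    for lsp, links in path_links.items():
        for link in links:
            link_load[link] += ingress_bytes[lsp]   # edge measurement mapped onto core link
    print(dict(link_load))
    ```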

  3. Interpretation of a 3D Seismic-Reflection Volume in the Basin and Range, Hawthorne, Nevada

    NASA Astrophysics Data System (ADS)

    Louie, J. N.; Kell, A. M.; Pullammanappallil, S.; Oldow, J. S.; Sabin, A.; Lazaro, M.

    2009-12-01

    A collaborative effort by the Great Basin Center for Geothermal Energy at the University of Nevada, Reno, and Optim Inc. of Reno has interpreted a 3d seismic data set recorded by the U.S. Navy Geothermal Programs Office (GPO) at the Hawthorne Army Depot, Nevada. The 3d survey incorporated about 20 NNW-striking lines covering an area of approximately 3 by 10 km. The survey covered an alluvial area below the eastern flank of the Wassuk Range. In the reflection volume the most prominent events are interpreted to be the base of Quaternary alluvium, the Quaternary Wassuk Range-front normal fault zone, and sequences of intercalated Tertiary volcanic flows and sediments. Such a data set is rare in the Basin and Range. Our interpretation reveals structural and stratigraphic details that form a basis for rapid development of the geothermal-energy resources underlying the Depot. We interpret a map of the time-elevation of the Wassuk Range fault and its associated splays and basin-ward step faults. The range-front fault is the deepest, and its isochron map provides essentially a map of "economic basement" under the prospect area. There are three faults that are the most readily picked through vertical sections. The fault reflections show an uncertainty in the time-depth that we can interpret for them of 50 to 200 ms, due to the over-migrated appearance of the processing contractor’s prestack time-migrated data set. Proper assessment of velocities for mitigating the migration artifacts through prestack depth migration is not possible from this data set alone, as the offsets are not long enough for sufficiently deep velocity tomography. The three faults we interpreted appear as gradients in potential-field maps. In addition, the southern boundary of a major Tertiary graben may be seen within the volume as the northward termination of the strong reflections from older Tertiary volcanics. Using a transparent volume view across the survey gives a view of the volcanics in full, providing a clear picture of prominent structures. Potential drill targets and areas of development are defined within the data volume by the intersections of the fault surfaces with the tracked, strong stratigraphic reflections. Target volumes for drilling and development are defined by the intersections of the faults and bright-spot stratigraphy, and their uncertainty bounds. There are a few such intersections present within the 3d volume. Analyzing seismic attributes gives the opportunity to identify characteristics common in geothermal environments.

  4. The reliability of the physical examination to guide fluid therapy in adults with severe falciparum malaria: an observational study.

    PubMed

    Hanson, Josh; Lam, Sophia W K; Alam, Shamsul; Pattnaik, Rajyabardhan; Mahanta, Kishore C; Uddin Hasan, Mahatab; Mohanty, Sanjib; Mishra, Saroj; Cohen, Sophie; Day, Nicholas; White, Nicholas; Dondorp, Arjen

    2013-10-01

    Adults with severe malaria frequently require intravenous fluid therapy to restore their circulating volume. However, fluid must be delivered judiciously as both under- and over-hydration increase the risk of complications and, potentially, death. As most patients will be cared for in a resource-poor setting, management guidelines necessarily recommend that physical examination should guide fluid resuscitation. However, the reliability of this strategy is uncertain. To determine the ability of physical examination to identify hypovolaemia, volume responsiveness, and pulmonary oedema, clinical signs and invasive measures of volume status were collected independently during an observational study of 28 adults with severe malaria. The physical examination defined volume status poorly. Jugular venous pressure (JVP) did not correlate with intravascular volume as determined by global end diastolic volume index (GEDVI; r(s) = 0.07, p = 0.19), neither did dry mucous membranes (p = 0.85), or dry axillae (p = 0.09). GEDVI was actually higher in patients with decreased tissue turgor (p < 0.001). Poor capillary return correlated with GEDVI, but was present infrequently (7% of observations) and, therefore, insensitive. Mean arterial pressure (MAP) correlated with GEDVI (rs = 0.16, p = 0.002), but even before resuscitation patients with a low GEDVI had a preserved MAP. Anuria on admission was unrelated to GEDVI and although liberal fluid resuscitation led to a median hourly urine output of 100 ml in 19 patients who were not anuric on admission, four (21%) developed clinical pulmonary oedema subsequently. MAP was unrelated to volume responsiveness (p = 0.71), while a low JVP, dry mucous membranes, dry axillae, increased tissue turgor, prolonged capillary refill, and tachycardia all had a positive predictive value for volume responsiveness of ≤50%. Extravascular lung water ≥11 ml/kg indicating pulmonary oedema was present on 99 of the 353 times that it was assessed during the study, but was identified on less than half these occasions by tachypnoea, chest auscultation, or an elevated JVP. A clear chest on auscultation and a respiratory rate <30 breaths/minute could exclude pulmonary oedema on 82% and 72% of occasions respectively. Findings on physical examination correlate poorly with true volume status in adults with severe malaria and must be used with caution to guide fluid therapy. Clinicaltrials.gov identifier: NCT00692627.

  5. Calibration of the DRASTIC ground water vulnerability mapping method

    USGS Publications Warehouse

    Rupert, M.G.

    2001-01-01

    Ground water vulnerability maps developed using the DRASTIC method have been produced in many parts of the world. Comparisons of those maps with actual ground water quality data have shown that the DRASTIC method is typically a poor predictor of ground water contamination. This study significantly improved the effectiveness of a modified DRASTIC ground water vulnerability map by calibrating the point rating schemes to actual ground water quality data by using nonparametric statistical techniques and a geographic information system. Calibration was performed by comparing data on nitrite plus nitrate as nitrogen (NO2 + NO3-N) concentrations in ground water to land-use, soils, and depth to first-encountered ground water data. These comparisons showed clear statistical differences between NO2 + NO3-N concentrations and the various categories. Ground water probability point ratings for NO2 + NO3-N contamination were developed from the results of these comparisons, and a probability map was produced. This ground water probability map was then correlated with an independent set of NO2 + NO3-N data to demonstrate its effectiveness in predicting elevated NO2 + NO3-N concentrations in ground water. This correlation demonstrated that the probability map was effective, but a vulnerability map produced with the uncalibrated DRASTIC method in the same area and using the same data layers was not effective. Considerable time and expense have been outlaid to develop ground water vulnerability maps with the DRASTIC method. This study demonstrates a cost-effective method to improve and verify the effectiveness of ground water vulnerability maps.
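    The calibration step described above rests on testing whether NO2 + NO3-N concentrations differ between categories of a map layer. A minimal sketch of such a nonparametric comparison (here a Kruskal-Wallis test over invented land-use groups, using SciPy) is shown below; it does not reproduce the study's actual statistical procedure or data.

    ```python
    # Hedged sketch: nonparametric comparison of NO2+NO3-N concentrations across
    # categories of a map layer, as a basis for assigning calibrated point ratings.
    import numpy as np
    from scipy.stats import kruskal

    rng = np.random.default_rng(0)
    agricultural = rng.lognormal(mean=1.5, sigma=0.5, size=60)   # mg/L, invented
    urban        = rng.lognormal(mean=1.0, sigma=0.5, size=60)
    rangeland    = rng.lognormal(mean=0.3, sigma=0.5, size=60)

    stat, p = kruskal(agricultural, urban, rangeland)
    # A significant difference supports giving each category its own probability
    # point rating in the calibrated vulnerability map.
    print(stat, p)
    ```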

  6. Obesity-induced hepatic hypoperfusion primes for hepatic dysfunction after resuscitated hemorrhagic shock.

    PubMed

    Matheson, Paul J; Hurt, Ryan T; Franklin, Glen A; McClain, Craig J; Garrison, R Neal

    2009-10-01

    Obese patients (BMI>35) after blunt trauma are at increased risk compared to non-obese for organ dysfunction, prolonged hospital stay, infection, prolonged mechanical ventilation, and mortality. Obesity and non-alcoholic fatty liver disease (NAFLD) produce a low grade systemic inflammatory response syndrome (SIRS) with compromised hepatic blood flow, which increases with body mass index. We hypothesized that obesity further aggravates liver dysfunction by reduced hepatic perfusion following resuscitated hemorrhagic shock (HEM). Age-matched Zucker rats (Obese, 314-519 g & Lean, 211-280 g) were randomly assigned to 4 groups (n = 10-12/group): (1) Lean-Sham; (2) Lean, HEM, and resuscitation (HEM/RES); (3) Obese-Sham; and (4) Obese-HEM/RES. HEM was 40% of mean arterial pressure (MAP) for 60 min; RES was return of shed blood/5 min and 2 volumes of saline/25 min. Hepatic blood flow (HBF) using galactose clearance, liver enzymes and complete metabolic panel were measured over 4 h after completion of RES. Obese rats had increased MAP, heart rate, and fasting blood glucose and BUN concentrations compared to lean controls, required less blood withdrawal (mL/g) to maintain 40% MAP, and RES did not restore BL MAP. Obese rats had decreased HBF at BL and during HEM/RES, which persisted 4 h post RES. ALT and BUN were increased compared to Lean-HEM/RES at 4 h post-RES. These data suggest that obesity significantly contributes to trauma outcomes through compromised vascular control or through fat-induced sinusoidal compression to impair hepatic blood flow after HEM/RES resulting in a greater hepatic injury. The pro-inflammatory state of NAFLD seen in obesity appears to prime the liver for hepatic ischemia after resuscitated hemorrhagic shock, perhaps intensified by insidious and ongoing hepatic hypoperfusion established prior to the traumatic injury or shock.

  7. Use of the Box and Jenkins time series technique in traffic forecasting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nihan, N.L.; Holmesland, K.O.

    The use of recently developed time series techniques for short-term traffic volume forecasting is examined. A data set containing monthly volumes on a freeway segment for 1968-76 is used to fit a time series model. The resultant model is used to forecast volumes for 1977. The forecast volumes are then compared with actual volumes in 1977. Time series techniques can be used to develop highly accurate and inexpensive short-term forecasts. The feasibility of using these models to evaluate the effects of policy changes or other outside impacts is considered. (1 diagram, 1 map, 14 references, 2 tables)
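    For readers wanting a concrete starting point, a Box-Jenkins-style workflow can be reproduced today with statsmodels; the synthetic monthly series and the seasonal ARIMA orders below are assumptions, not the study's data or identified model.

    ```python
    # Illustrative Box-Jenkins workflow: fit a seasonal ARIMA to monthly volumes
    # for 1968-1976 and forecast the 12 months of 1977 (synthetic data).
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.statespace.sarimax import SARIMAX

    idx = pd.date_range("1968-01", periods=9 * 12, freq="MS")   # monthly, 1968-1976
    noise = np.random.default_rng(0).normal(0, 1_000, len(idx))
    volumes = pd.Series(50_000 + 5_000 * np.sin(2 * np.pi * idx.month / 12) + noise,
                        index=idx)

    model = SARIMAX(volumes, order=(1, 1, 1), seasonal_order=(0, 1, 1, 12))
    fit = model.fit(disp=False)
    forecast_1977 = fit.forecast(steps=12)   # to be compared against observed 1977 volumes
    print(forecast_1977.round(0))
    ```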

  8. Mapping Subcortical Brain Maturation during Adolescence: Evidence of Hemisphere-and Sex-Specific Longitudinal Changes

    ERIC Educational Resources Information Center

    Dennison, Meg; Whittle, Sarah; Yücel, Murat; Vijayakumar, Nandita; Kline, Alexandria; Simmons, Julian; Allen, Nicholas B.

    2013-01-01

    Early to mid-adolescence is an important developmental period for subcortical brain maturation, but longitudinal studies of these neurodevelopmental changes are lacking. The present study acquired repeated magnetic resonance images from 60 adolescent subjects (28 female) at ages 12.5 and 16.5 years to map changes in subcortical structure volumes.…

  9. Modelling of Carbon Monoxide Air Pollution in Large Cities by Evaluation of Spectral LANDSAT8 Images

    NASA Astrophysics Data System (ADS)

    Hamzelo, M.; Gharagozlou, A.; Sadeghian, S.; Baikpour, S. H.; Rajabi, A.

    2015-12-01

    Air pollution in large cities is one of the major problems whose resolution and reduction require multiple applications and environmental management. The main sources of this pollution are industrial, urban, and transport activities, which release large amounts of contaminants into the air and reduce its quality. Given the variety of pollutants, the high volume of production, and the local distribution of manufacturing centers, testing and measuring emissions is difficult. Substances such as carbon monoxide, sulfur dioxide, unburned hydrocarbons, and lead compounds cause air pollution, and carbon monoxide is the most important of them. Today, systems for data exchange, processing, analysis, and modeling are important pillars of air quality management and control. In this study, using the spectral signature of carbon monoxide as the most significant pollutant, LANDSAT8 images (which offer appropriate spectral bands and better spatial resolution than meteorological sensors), the SAM classification algorithm, and a Geographic Information System (GIS), the spatial distribution of carbon monoxide in Tehran over a period of one year from the beginning of 2014 until the beginning of 2015 was modeled in 11 maps. For model validation, the resulting maps were compared with the map provided by the Tehran Air Quality Control Company. The comparison was carried out with an error matrix, and four accuracy measures were examined: overall accuracy, producer's accuracy, user's accuracy, and the kappa coefficient. The average accuracy was about 80%, which indicates that the method and data used are suitable for this modeling.
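    The four accuracy measures named above can be computed directly from an error matrix, as in the short sketch below; the matrix entries are illustrative, not the study's validation counts.

    ```python
    # Sketch: overall, producer's, and user's accuracy plus kappa from an
    # error (confusion) matrix with rows = reference classes, columns = mapped.
    import numpy as np

    cm = np.array([[80, 10],    # illustrative counts
                   [12, 98]])

    n = cm.sum()
    overall = np.trace(cm) / n
    producers = np.diag(cm) / cm.sum(axis=1)   # per reference class (omission complement)
    users = np.diag(cm) / cm.sum(axis=0)       # per mapped class (commission complement)
    p_e = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n**2   # chance agreement
    kappa = (overall - p_e) / (1 - p_e)
    print(overall, producers, users, kappa)
    ```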

  10. Detection And Mapping (DAM) package. Volume 4A: Software System Manual, part 1

    NASA Technical Reports Server (NTRS)

    Schlosser, E. H.

    1980-01-01

    The package is an integrated set of manual procedures, computer programs, and graphic devices designed for efficient production of precisely registered and formatted maps from digital LANDSAT multispectral scanner (MSS) data. The software can be readily implemented on any Univac 1100 series computer with standard peripheral equipment. This version of the software includes predefined spectral limits for use in classifying and mapping surface water for LANDSAT-1, LANDSAT-2, and LANDSAT-3. Tape formats supported include X, AM, and PM.

  11. Landslide susceptibility mapping for a landslide-prone area (Findikli, NE of Turkey) by likelihood-frequency ratio and weighted linear combination models

    NASA Astrophysics Data System (ADS)

    Akgun, Aykut; Dag, Serhat; Bulut, Fikri

    2008-05-01

    Landslides are very common natural problems in the Black Sea Region of Turkey due to the steep topography, improper use of land cover and adverse climatic conditions for landslides. In the western part of the region, many studies have been carried out especially in the last decade for landslide susceptibility mapping using different evaluation methods such as deterministic approaches, landslide distribution, qualitative, statistical and distribution-free analyses. The purpose of this study is to produce landslide susceptibility maps of a landslide-prone area (Findikli district, Rize) located at the eastern part of the Black Sea Region of Turkey by the likelihood-frequency ratio (LRM) model and the weighted linear combination (WLC) model and to compare the results obtained. For this purpose, landslide inventory maps of the area were prepared for the years 1983 and 1995 by detailed field surveys and aerial-photography studies. Slope angle, slope aspect, lithology, distance from drainage lines, distance from roads and the land cover of the study area are considered as the landslide-conditioning parameters. The differences between the susceptibility maps derived by the LRM and the WLC models are relatively minor when broad-based classifications are taken into account. However, the WLC map showed more detail, whereas the map produced by the LRM model gave weaker results. The reason for this is that the majority of pixels in the LRM map have higher values than those in the WLC-derived susceptibility map. In order to validate the two susceptibility maps, both of them were compared with the landslide inventory map. Although no landslides fall within the very high susceptibility class of either map, 79% of the landslides fall into the high and very high susceptibility zones of the WLC map while this is 49% for the LRM map. This shows that the WLC model exhibited higher performance than the LRM model.
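    For context, the likelihood-frequency ratio of a class of a conditioning factor is its share of landslide pixels divided by its share of all pixels; the sketch below illustrates this with invented counts for a single slope-angle factor and does not reproduce the study's parameters or data.

    ```python
    # Sketch of the frequency-ratio idea for one conditioning factor
    # (illustrative pixel counts, not the study's).
    def frequency_ratio(landslide_count, class_count, total_landslides, total_pixels):
        return (landslide_count / total_landslides) / (class_count / total_pixels)

    slope_classes = {                      # class: (landslide pixels, class pixels)
        "0-15 deg":  (120, 400_000),
        "15-30 deg": (540, 350_000),
        ">30 deg":   (340, 250_000),
    }
    total_ls = sum(v[0] for v in slope_classes.values())
    total_px = sum(v[1] for v in slope_classes.values())
    fr = {c: frequency_ratio(ls, px, total_ls, total_px)
          for c, (ls, px) in slope_classes.items()}
    # A susceptibility index is then obtained by summing, over all conditioning
    # factors, the FR of the class each pixel falls into.
    print(fr)
    ```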

  12. Map showing principal drainage basins, principal runoff-producing areas, and selected stream flow data in the Kaiparowits coal-basin area, Utah

    USGS Publications Warehouse

    Price, Don

    1978-01-01

    This is one of a series of maps that describe the geology and related natural resources in the Kaiparowits coal-basin area. Streamflow records used to compile this map and the accompanying table were collected by the U.S. Geological Survey in cooperation with the Utah State Engineer and the Utah Department of Transportation. The principal runoff-producing areas were delineated from a work map (scale 1:250,000) compiled to estimate water yields in Utah (Bagley and others, 1964). Information about Lake Powell was furnished by the U.S. Bureau of Reclamation.

  13. Harmonization of forest disturbance datasets of the conterminous USA from 1986 to 2011

    USGS Publications Warehouse

    Soulard, Christopher E.; Acevedo, William; Cohen, Warren B.; Yang, Zhiqiang; Stehman, Stephen V.; Taylor, Janis L.

    2017-01-01

    Several spatial forest disturbance datasets exist for the conterminous USA. The major problem with forest disturbance mapping is that variability between map products leads to uncertainty regarding the actual rate of disturbance. In this article, harmonized maps were produced from multiple data sources (i.e., Global Forest Change, LANDFIRE Vegetation Disturbance, National Land Cover Database, Vegetation Change Tracker, and Web-Enabled Landsat Data). The harmonization process involved fitting common class ontologies and determining spatial congruency to produce forest disturbance maps for four time intervals (1986–1992, 1992–2001, 2001–2006, and 2006–2011). Pixels mapped as disturbed for two or more datasets were labeled as disturbed in the harmonized maps. The primary advantage gained by harmonization was improvement in commission error rates relative to the individual disturbance products. Disturbance omission errors were high for both harmonized and individual forest disturbance maps due to underlying limitations in mapping subtle disturbances with Landsat classification algorithms. To enhance the value of the harmonized disturbance products, we used fire perimeter maps to add information on the cause of disturbance.
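    The agreement rule described above is simple to express in code; the sketch below applies it to five illustrative boolean disturbance masks (stand-ins for the co-registered products) rather than the actual datasets.

    ```python
    # Sketch: label a pixel disturbed in the harmonized map when two or more
    # of the input disturbance maps flag it (illustrative arrays only).
    import numpy as np

    rng = np.random.default_rng(1)
    products = [rng.random((100, 100)) > 0.9 for _ in range(5)]  # boolean disturbance masks

    agreement = np.sum(products, axis=0)   # number of products flagging each pixel
    harmonized = agreement >= 2            # harmonized forest-disturbance map
    print(harmonized.mean())               # fraction of area mapped as disturbed
    ```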

  14. Harmonization of forest disturbance datasets of the conterminous USA from 1986 to 2011.

    PubMed

    Soulard, Christopher E; Acevedo, William; Cohen, Warren B; Yang, Zhiqiang; Stehman, Stephen V; Taylor, Janis L

    2017-04-01

    Several spatial forest disturbance datasets exist for the conterminous USA. The major problem with forest disturbance mapping is that variability between map products leads to uncertainty regarding the actual rate of disturbance. In this article, harmonized maps were produced from multiple data sources (i.e., Global Forest Change, LANDFIRE Vegetation Disturbance, National Land Cover Database, Vegetation Change Tracker, and Web-Enabled Landsat Data). The harmonization process involved fitting common class ontologies and determining spatial congruency to produce forest disturbance maps for four time intervals (1986-1992, 1992-2001, 2001-2006, and 2006-2011). Pixels mapped as disturbed for two or more datasets were labeled as disturbed in the harmonized maps. The primary advantage gained by harmonization was improvement in commission error rates relative to the individual disturbance products. Disturbance omission errors were high for both harmonized and individual forest disturbance maps due to underlying limitations in mapping subtle disturbances with Landsat classification algorithms. To enhance the value of the harmonized disturbance products, we used fire perimeter maps to add information on the cause of disturbance.

  15. Effects of international football matches on ambulance call profiles and volumes during the 2006 World Cup.

    PubMed

    Deakin, Charles D; Thompson, Fizz; Gibson, Caroline; Green, Mark

    2007-06-01

    Prompt ambulance attendance is aimed at improving patient care. With finite resources struggling to meet performance targets, unforeseen demand precludes the ability to tailor resources to cope with increased call volumes, and can have a marked detrimental effect on performance and hence patient care. The effects of the 2006 World Cup football matches on call volumes and profiles were analysed to understand how public events can influence demands on the ambulance service. All emergency calls to the Hampshire Ambulance Service NHS Trust (currently the Hampshire Division of South Central Ambulance Service, Winchester, UK) during the first weekend of the 2006 World Cup football matches were analysed by call volume and classification of call (call type). On the day of the first football match, call volume was over 50% higher than that on a typical Saturday, with distinct peaks before and after the inaugural match. Call profile analysis showed increases in alcohol-related emergencies, including collapse, unconsciousness, assault and road traffic accidents. The increase in assaults was particularly marked at the end of each match and increased again into the late evening. A detailed mapping of call volumes and profiles during the World Cup football shows a significant increase in overall emergency calls, mostly alcohol related. Mapping of limited resources to these patterns will allow improved responses to emergency calls.

  16. Effects of international football matches on ambulance call profiles and volumes during the 2006 World Cup

    PubMed Central

    Deakin, Charles D; Thompson, Fizz; Gibson, Caroline; Green, Mark

    2007-01-01

    Background Prompt ambulance attendance is aimed at improving patient care. With finite resources struggling to meet performance targets, unforeseen demand precludes the ability to tailor resources to cope with increased call volumes, and can have a marked detrimental effect on performance and hence patient care. The effects of the 2006 World Cup football matches on call volumes and profiles were analysed to understand how public events can influence demands on the ambulance service. Methods All emergency calls to the Hampshire Ambulance Service NHS Trust (currently the Hampshire Division of South Central Ambulance Service, Winchester, UK) during the first weekend of the 2006 World Cup football matches were analysed by call volume and classification of call (call type). Results On the day of the first football match, call volume was over 50% higher than that on a typical Saturday, with distinct peaks before and after the inaugural match. Call profile analysis showed increases in alcohol‐related emergencies, including collapse, unconsciousness, assault and road traffic accidents. The increase in assaults was particularly marked at the end of each match and increased again into the late evening. Conclusion A detailed mapping of call volumes and profiles during the World Cup football shows a significant increase in overall emergency calls, mostly alcohol related. Mapping of limited resources to these patterns will allow improved responses to emergency calls. PMID:17513536

  17. An automated mapping satellite system (Mapsat).

    USGS Publications Warehouse

    Colvocoresses, A.P.

    1982-01-01

    The favorable environment of space permits a satellite to orbit the Earth with very high stability as long as no local perturbing forces are involved. Solid-state linear-array sensors have no moving parts and create no perturbing force on the satellite. Digital data from highly stabilized stereo linear arrays are amenable to simplified processing to produce both planimetric imagery and elevation data. A satellite imaging system incorporating this concept, called Mapsat, has been proposed to produce data from which automated mapping in near real time can be accomplished. Image maps as large as 1:50 000 scale with contours as close as a 20-m interval may be produced from Mapsat data. -from Author

  18. Translation from the collaborative OSM database to cartography

    NASA Astrophysics Data System (ADS)

    Hayat, Flora

    2018-05-01

    The OpenStreetMap (OSM) database includes original items very useful for geographical analysis and for creating thematic maps. Contributors record in the open database various themes regarding amenities, leisure, transport, buildings and boundaries. The Michelin mapping department develops map prototypes to test the feasibility of mapping based on OSM. To translate the OSM database structure into a database structure fitted to Michelin graphic guidelines, a research project is in development. It aims at defining the right structure for Michelin's uses. The research project relies on the analysis of semantic and geometric heterogeneities in OSM data. To that end, Michelin implements methods to transform the input geographical database into a cartographic image dedicated to specific uses (routing and tourist maps). The paper focuses on the mapping tools available to produce a personalised spatial database. Based on processed data, paper and Web maps can be displayed. Two prototypes are described in this article: a vector tile web map and a mapping method to produce paper maps on a regional scale. The vector tile mapping method offers easy navigation within the map and within graphic and thematic guidelines. Paper maps can be partly automatically drawn. The drawing automation and data management are part of the mapping creation as well as the final hand-drawing phase. Both prototypes have been set up using the OSM technical ecosystem.

  19. The Regionalization of Lumbar Spine Procedures in New York State: A 10-Year Analysis.

    PubMed

    Jancuska, Jeffrey; Adrados, Murillo; Hutzler, Lorraine; Bosco, Joseph

    2016-01-01

    A retrospective review of an administrative database. The purpose of this study is to determine the current extent of regionalization by mapping lumbar spine procedures according to hospital and patient zip code, as well as examine the rate of growth of lumbar spine procedures performed at high-, medium-, and low-volume institutions in New York State. The association between hospital and spine surgeon volume and improved patient outcomes is well established. There is no study investigating the actual process of patient migration to high-volume hospitals. New York Statewide Planning and Research Cooperative System (SPARCS) administrative data were used to identify 228,695 lumbar spine surgery patients from 2005 to 2014. The data included the patients' zip code, hospital of operation, and year of discharge. The volume of lumbar spine surgery in New York State was mapped according to patient and hospital 3-digit zip code. New York State hospitals were categorized as low, medium, and high volume and descriptive statistics were used to determine trends in changes in hospital volume. Lumbar spine surgery recipients are widely distributed throughout the state. Procedures are regionalized on a select few metropolitan centers. The total number of procedures grew 2.5% over the entire 10-year-period. High-volume hospital caseload increased 50%, from 7253 procedures in 2005 to 10,915 procedures in 2014. The number of procedures at medium and low-volume hospitals decreased 30% and 13%, respectively. Despite any concerted effort aimed at moving orthopedic patients to high-volume hospitals, migration to high-volume centers occurred. Public interest in quality outcomes and cost, as well as financial incentives among medical centers to increase market share, potentially influence the migration of patients to high-volume centers. Further regionalization has the potential to exacerbate the current level of disparities among patient populations at low and high-volume hospitals. 3.

  20. Redesigning a joint replacement program using Lean Six Sigma in a Veterans Affairs hospital.

    PubMed

    Gayed, Benjamin; Black, Stephen; Daggy, Joanne; Munshi, Imtiaz A

    2013-11-01

    In April 2009, an analysis of joint replacement surgical procedures at the Richard L. Roudebush Veterans Affairs Medical Center, Indianapolis, Indiana, revealed that total hip and knee replacements incurred $1.4 million in non-Veterans Affairs (VA) care costs with an average length of stay of 6.1 days during fiscal year 2008. The Joint Replacement Program system redesign project was initiated following the Vision-Analysis-Team-Aim-Map-Measure-Change-Sustain (VA-TAMMCS) model to increase efficiency, decrease length of stay, and reduce non-VA care costs. To determine the effectiveness of Lean Six Sigma process improvement methods applied in a VA hospital. Perioperative processes for patients undergoing total joint replacement were redesigned following the VA-TAMMCS model--the VA's official, branded method of Lean Six Sigma process improvement. A multidisciplinary team including the orthopedic surgeons, frontline staff, and executive management identified waste in the current processes and initiated changes to reduce waste and increase efficiency. Data collection included a 1-year baseline period and a 20-month sustainment period. The primary endpoint was length of stay; a secondary analysis considered non-VA care cost reductions. Length of stay decreased 36% overall, decreasing from 5.3 days during the preproject period to 3.4 days during the 20-month sustainment period (P < .001). Non-VA care was completely eliminated for patients undergoing total hip and knee replacement at the Richard L. Roudebush Veterans Affairs Medical Center, producing an estimated return on investment of $1 million annually when compared with baseline cost and volumes. In addition, the volume of total joint replacements at this center increased during the data collection period. The success of the Joint Replacement Program demonstrates that VA-TAMMCS is an effective tool for Lean and Six Sigma process improvement initiatives in a surgical practice, producing a 36% sustained reduction in length of stay and completely eliminating non-VA care for total hip and knee replacements while increasing total joint replacement volume at this medical center.

  1. Increasing the Accuracy of Volume and ADC Delineation for Heterogeneous Tumor on Diffusion-Weighted MRI: Correlation with PET/CT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gong, Nan-Jie; Wong, Chun-Sing, E-mail: drcswong@gmail.com; Chu, Yiu-Ching

    2013-10-01

    Purpose: To improve the accuracy of volume and apparent diffusion coefficient (ADC) measurements in diffusion-weighted magnetic resonance imaging (MRI), we proposed a method based on thresholding both the b0 images and the ADC maps. Methods and Materials: In 21 heterogeneous lesions from patients with metastatic gastrointestinal stromal tumors (GIST), gross lesions were manually contoured, and corresponding volumes and ADCs were denoted as gross tumor volume (GTV) and gross ADC (ADC_g), respectively. Using a k-means clustering algorithm, the probable high-cellularity tumor tissues were selected based on b0 images and ADC maps. ADC and volume of the tissues selected using the proposed method were denoted as thresholded ADC (ADC_thr) and high-cellularity tumor volume (HCTV), respectively. The metabolic tumor volume (MTV) in positron emission tomography (PET)/computed tomography (CT) was measured using 40% maximum standard uptake value (SUV_max) as the lower threshold, and the corresponding mean SUV (SUV_mean) was also measured. Results: HCTV had excellent concordance with MTV according to Pearson's correlation (r=0.984, P<.001) and linear regression (slope = 1.085, intercept = −4.731). In contrast, GTV overestimated the volume and differed significantly from MTV (P=.005). ADC_thr correlated significantly and strongly with SUV_mean (r=−0.807, P<.001) and SUV_max (r=−0.843, P<.001); both correlations were stronger than those of ADC_g. Conclusions: The proposed lesion-adaptive semiautomatic method can help segment high-cellularity tissues that match hypermetabolic tissues in PET/CT and enables more accurate volume and ADC delineation on diffusion-weighted MR images of GIST.
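    A simplified sketch in the spirit of the clustering step described above (not its actual implementation) is shown below: k-means on paired (b0, ADC) voxel values within the gross lesion, keeping the lowest-ADC cluster as the putative high-cellularity tissue. The voxel values, feature scaling, and number of clusters are assumptions.

    ```python
    # Hedged sketch: k-means over (b0, ADC) voxel pairs, retaining the cluster
    # with the lowest mean ADC as the high-cellularity tissue (synthetic data).
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    b0 = rng.normal(600, 80, 2000)            # b0 signal inside the gross lesion (a.u.)
    adc = rng.normal(1.4e-3, 0.4e-3, 2000)    # ADC in mm^2/s
    features = np.column_stack([b0, adc])
    features = (features - features.mean(axis=0)) / features.std(axis=0)  # z-score

    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
    low_adc_cluster = int(np.argmin([adc[labels == k].mean() for k in range(2)]))
    hct_mask = labels == low_adc_cluster       # voxels retained for HCTV and ADC_thr
    print(hct_mask.sum(), adc[hct_mask].mean())
    ```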

  2. Volume of Valley Networks on Mars and Its Hydrologic Implications

    NASA Astrophysics Data System (ADS)

    Luo, W.; Cang, X.; Howard, A. D.; Heo, J.

    2015-12-01

    Valley networks on Mars are river-like features that offer the best evidence for water activities in its geologic past. Previous studies have extracted valley network lines automatically from digital elevation model (DEM) data and manually from remotely sensed images. The volume of material removed by valley networks is an important parameter that could help us infer the amount of water needed to carve the valleys. A progressive black top hat (PBTH) transformation algorithm has been adapted from image processing to extract valley volume and successfully applied to a simulated landform and Ma'adim Valles, Mars. However, the volume of valley network excavation on Mars has not been estimated on a global scale. In this study, the PBTH method was applied to the whole of Mars to estimate this important parameter. The process was automated with Python in ArcGIS. Polygons delineating the valley-associated depressions were generated by using a multi-flow direction growth method, which started with selected high point seeds on a depth grid (essentially an inverted valley) created by the PBTH transformation and grew outward following multi-flow direction on the depth grid. Two published versions of valley network lines were integrated to automatically select depression polygons that represent the valleys. Some crater depressions that are connected with valleys and thus selected in the previous step were removed by using information from a crater database. Because of the large distortion associated with global datasets in projected maps, the volume of each cell within a valley was calculated using the depth of the cell multiplied by the spherical area of the cell. The volumes of all the valley cells were then summed to produce the estimate of global valley excavation volume. Our initial result for this estimate was ~2.4×10^14 m^3. Assuming a sediment density of 2900 kg/m^3, a porosity of 0.35, and a sediment load of 1.5 kg/m^3, the global volume of water needed to carve the valleys was estimated to be ~7.1×10^17 m^3. Because of the coarse resolution of MOLA data, this is a conservative lower bound. Compared with the hypothesized northern ocean volume of 2.3×10^16 m^3 estimated by Carr and Head (2003), our estimate of water volume suggests and confirms an active hydrologic cycle for early Mars. Further hydrologic analysis will improve the estimate accuracy.
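    The per-cell volume arithmetic described above (cell depth times the spherical area of the cell, summed over valley cells) can be sketched as follows; the grid spacing, latitude handling, and radius value are assumptions for illustration.

    ```python
    # Sketch: excavated volume as PBTH depth times the spherical area of each
    # lat/lon cell, summed over all valley cells (illustrative grid only).
    import numpy as np

    R_MARS = 3_389_500.0          # mean Mars radius, m (assumed value)

    def cell_area(lat_deg, dlat_deg, dlon_deg):
        """Area of a lat/lon cell on a sphere, in m^2."""
        lat1 = np.radians(lat_deg - dlat_deg / 2)
        lat2 = np.radians(lat_deg + dlat_deg / 2)
        return R_MARS**2 * np.radians(dlon_deg) * (np.sin(lat2) - np.sin(lat1))

    def valley_volume(depth, lat, dlat_deg, dlon_deg):
        """depth[i, j]: PBTH depth grid (m); lat[i]: cell-center latitudes (deg)."""
        areas = cell_area(lat, dlat_deg, dlon_deg)[:, None]   # broadcast along longitude
        return np.nansum(depth * areas)

    # Example: a 3 x 4 grid of 1-degree cells near the equator, 100 m deep valleys.
    depth = np.full((3, 4), 100.0)
    lat = np.array([-1.0, 0.0, 1.0])
    print(valley_volume(depth, lat, dlat_deg=1.0, dlon_deg=1.0))
    ```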

  3. Updating the Geologic Maps of the Apollo 15, 16, and 17 Landing Sites

    NASA Astrophysics Data System (ADS)

    Garry, W. B.; Mest, S. C.; Yingst, R. A.; Ostrach, L. R.; Petro, N. E.; Cohen, B. A.

    2018-06-01

    Our team is funded through NASA's Planetary Data Archiving, Restoration, and Tools (PDART) program to produce two new USGS Special Investigation Maps (SIM) for the Apollo 15, 16, and 17 missions: a regional map (1:200K) and a landing-site map (1:24K).

  4. Clinical high-resolution mapping of the proteoglycan-bound water fraction in articular cartilage of the human knee joint.

    PubMed

    Bouhrara, Mustapha; Reiter, David A; Sexton, Kyle W; Bergeron, Christopher M; Zukley, Linda M; Spencer, Richard G

    2017-11-01

    We applied our recently introduced Bayesian analytic method to achieve clinically-feasible in-vivo mapping of the proteoglycan water fraction (PgWF) of human knee cartilage with improved spatial resolution and stability as compared to existing methods. Multicomponent driven equilibrium single-pulse observation of T1 and T2 (mcDESPOT) datasets were acquired from the knees of two healthy young subjects and one older subject with previous knee injury. Each dataset was processed using Bayesian Monte Carlo (BMC) analysis incorporating a two-component tissue model. We assessed the performance and reproducibility of BMC and of the conventional analysis of stochastic region contraction (SRC) in the estimation of PgWF. Stability of the BMC analysis of PgWF was tested by comparing independent high-resolution (HR) datasets from each of the two young subjects. Unlike SRC, the BMC-derived maps from the two HR datasets were essentially identical. Furthermore, SRC maps showed substantial random variation in estimated PgWF, and mean values that differed from those obtained using BMC. In addition, PgWF maps derived from conventional low-resolution (LR) datasets exhibited partial volume and magnetic susceptibility effects. These artifacts were absent in HR PgWF images. Finally, our analysis showed regional variation in PgWF estimates, and substantially higher values in the younger subjects as compared to the older subject. BMC-mcDESPOT permits HR in-vivo mapping of PgWF in human knee cartilage in a clinically-feasible acquisition time. HR mapping reduces the impact of partial volume and magnetic susceptibility artifacts compared to LR mapping. Finally, BMC-mcDESPOT demonstrated excellent reproducibility in the determination of PgWF. Published by Elsevier Inc.

  5. Temporal similarity perfusion mapping: A standardized and model-free method for detecting perfusion deficits in stroke

    PubMed Central

    Song, Sunbin; Luby, Marie; Edwardson, Matthew A.; Brown, Tyler; Shah, Shreyansh; Cox, Robert W.; Saad, Ziad S.; Reynolds, Richard C.; Glen, Daniel R.; Cohen, Leonardo G.; Latour, Lawrence L.

    2017-01-01

    Introduction Interpretation of the extent of perfusion deficits in stroke MRI is highly dependent on the method used for analyzing the perfusion-weighted signal intensity time-series after gadolinium injection. In this study, we introduce a new model-free standardized method of temporal similarity perfusion (TSP) mapping for perfusion deficit detection and test its ability and reliability in acute ischemia. Materials and methods Forty patients with an ischemic stroke or transient ischemic attack were included. Two blinded readers compared real-time generated interactive maps and automatically generated TSP maps to traditional TTP/MTT maps for presence of perfusion deficits. Lesion volumes were compared for volumetric inter-rater reliability, spatial concordance between perfusion deficits and healthy tissue and contrast-to-noise ratio (CNR). Results Perfusion deficits were correctly detected in all patients with acute ischemia. Inter-rater reliability was higher for TSP when compared to TTP/MTT maps and there was a high similarity between the lesion volumes depicted on TSP and TTP/MTT (r(18) = 0.73). The Pearson's correlation between lesions calculated on TSP and traditional maps was high (r(18) = 0.73, p<0.0003), however the effective CNR was greater for TSP compared to TTP (352.3 vs 283.5, t(19) = 2.6, p<0.03.) and MTT (228.3, t(19) = 2.8, p<0.03). Discussion TSP maps provide a reliable and robust model-free method for accurate perfusion deficit detection and improve lesion delineation compared to traditional methods. This simple method is also computationally faster and more easily automated than model-based methods. This method can potentially improve the speed and accuracy in perfusion deficit detection for acute stroke treatment and clinical trial inclusion decision-making. PMID:28973000

  6. Rapid mapping of hurricane damage to forests

    Treesearch

    Erik M. Nielsen

    2009-01-01

    The prospects for producing rapid, accurate delineations of the spatial extent of forest wind damage were evaluated using Hurricane Katrina as a test case. A damage map covering the full spatial extent of Katrina's impact was produced from Moderate Resolution Imaging Spectroradiometer (MODIS) satellite imagery using higher resolution training data. Forest damage...

  7. Measuring pedestrian volumes and conflicts. Volume 1, Pedestrian volume sampling

    DOT National Transportation Integrated Search

    1987-12-01

    This final report presents the findings, conclusions, and recommendations of the study conducted to develop a model to predict pedestrian volumes using small sampling schemes. This research produced four pedestrian volume prediction models (i.e., 1-,...

  8. Effect of left atrial volume and pulmonary vein anatomy on outcome of nMARQ™ catheter ablation of paroxysmal atrial fibrillation.

    PubMed

    Stabile, Giuseppe; Anselmino, Matteo; Soldati, Ezio; De Ruvo, Ermengildo; Solimene, Francesco; Iuliano, Assunta; Sciarra, Luigi; Bongiorni, Maria Grazia; Calò, Leonardo; Gaita, Fiorenzo

    2017-03-01

    Left atrial volume (LA) and pulmonary vein (PV) anatomy may potentially relate to technical challenges in achieving stable and effective catheter position in case of atrial fibrillation (AF) ablation by means of "one-shot" catheters. The aim of this study was to investigate whether LA volume and PV anatomy, evaluated by computed tomography (CT) or magnetic resonance (MR) prior to ablation, predict acute and midterm outcome of AF ablation by nMARQ™. We included 75 patients (mean age 58 ± 11 years, 67 % male) with symptomatic paroxysmal AF. All patients underwent CT/MR scanning prior to catheter ablation to evaluate LA volume and PV anatomy. All the patients underwent PV isolation by nMARQ™, an open-irrigated mapping and radiofrequency (RF) decapolar ablation catheter. Ablation was guided by electroanatomic mapping allowing RF energy delivery in the antral region of PVs from ten irrigated electrodes simultaneously. Mean LA volume was 75 ± 40 ml. A normal anatomy (4 PVs) was documented in 40 (53 %) patients and abnormal anatomy (common truncus or accessory PVs) in 35 patients. Mean procedural and fluoroscopy times were 94 ± 55 and 8 ± 5 min, respectively, without significant differences among patients with normal or abnormal anatomy (92 ± 45 vs 95 ± 64 min, p = 0.85 and 6 ± 3 vs 8 ± 4 min, p = 0.65, respectively). Mean ablation time was 14 ± 3 min, and 99 % of the targeted veins were isolated with a mean of 23 ± 5 RF pulses per patient. After a mean follow-up of 17 ± 8 months, 23 (31 %) patients had an atrial arrhythmia recurrence. Neither LA volume nor PV anatomy was a predictor of outcome. LA volume and PV anatomy did not affect procedural data and outcome in patients who underwent PV isolation by an open-irrigated mapping and RF decapolar ablation catheter.

  9. Linkage disequilibrium, SNP frequency change due to selection, and association mapping in popcorn chromosome regions containing QTLs for quality traits

    PubMed Central

    Paes, Geísa Pinheiro; Viana, José Marcelo Soriano; Silva, Fabyano Fonseca e; Mundim, Gabriel Borges

    2016-01-01

    Abstract The objectives of this study were to assess linkage disequilibrium (LD) and selection-induced changes in single nucleotide polymorphism (SNP) frequency, and to perform association mapping in popcorn chromosome regions containing quantitative trait loci (QTLs) for quality traits. Seven tropical and two temperate popcorn populations were genotyped for 96 SNPs chosen in chromosome regions containing QTLs for quality traits. The populations were phenotyped for expansion volume, 100-kernel weight, kernel sphericity, and kernel density. The LD statistics were the difference between the observed and expected haplotype frequencies (D), the proportion of D relative to the expected maximum value in the population, and the square of the correlation between the values of alleles at two loci. Association mapping was based on least squares and Bayesian approaches. In the tropical populations, D-values greater than 0.10 were observed for SNPs separated by 100-150 Mb, while most of the D-values in the temperate populations were less than 0.05. Selection for expansion volume indirectly led to increase in LD values, population differentiation, and significant changes in SNP frequency. Some associations were observed for expansion volume and the other quality traits. The candidate genes are involved with starch, storage protein, lipid, and cell wall polysaccharides synthesis. PMID:27007903
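    For reference, the three LD statistics described above can be computed for a pair of biallelic loci from haplotype and allele frequencies as in the sketch below; the frequencies used are illustrative, not the study's estimates.

    ```python
    # Sketch of the three pairwise LD statistics: D, D relative to its maximum
    # (D'), and the squared correlation between allele values (r^2).
    def ld_stats(p_ab, p_a, p_b):
        """p_ab: frequency of the A-B haplotype; p_a, p_b: allele frequencies."""
        d = p_ab - p_a * p_b                                   # D
        if d > 0:
            d_max = min(p_a * (1 - p_b), (1 - p_a) * p_b)
        else:
            d_max = min(p_a * p_b, (1 - p_a) * (1 - p_b))
        d_prime = d / d_max if d_max > 0 else 0.0              # proportion of maximum D
        r2 = d**2 / (p_a * (1 - p_a) * p_b * (1 - p_b))        # squared allele correlation
        return d, d_prime, r2

    print(ld_stats(p_ab=0.42, p_a=0.6, p_b=0.5))   # D = 0.12, D' = 0.6, r^2 = 0.24
    ```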

  10. Linkage disequilibrium, SNP frequency change due to selection, and association mapping in popcorn chromosome regions containing QTLs for quality traits.

    PubMed

    Paes, Geísa Pinheiro; Viana, José Marcelo Soriano; Silva, Fabyano Fonseca E; Mundim, Gabriel Borges

    2016-03-01

    The objectives of this study were to assess linkage disequilibrium (LD) and selection-induced changes in single nucleotide polymorphism (SNP) frequency, and to perform association mapping in popcorn chromosome regions containing quantitative trait loci (QTLs) for quality traits. Seven tropical and two temperate popcorn populations were genotyped for 96 SNPs chosen in chromosome regions containing QTLs for quality traits. The populations were phenotyped for expansion volume, 100-kernel weight, kernel sphericity, and kernel density. The LD statistics were the difference between the observed and expected haplotype frequencies (D), the proportion of D relative to the expected maximum value in the population, and the square of the correlation between the values of alleles at two loci. Association mapping was based on least squares and Bayesian approaches. In the tropical populations, D-values greater than 0.10 were observed for SNPs separated by 100-150 Mb, while most of the D-values in the temperate populations were less than 0.05. Selection for expansion volume indirectly led to an increase in LD values, population differentiation, and significant changes in SNP frequency. Some associations were observed for expansion volume and the other quality traits. The candidate genes are involved in the synthesis of starch, storage proteins, lipids, and cell wall polysaccharides.

  11. Register cardiac fiber orientations from 3D DTI volume to 2D ultrasound image of rat hearts

    NASA Astrophysics Data System (ADS)

    Qin, Xulei; Wang, Silun; Shen, Ming; Zhang, Xiaodong; Lerakis, Stamatios; Wagner, Mary B.; Fei, Baowei

    2015-03-01

    Two-dimensional (2D) ultrasound or echocardiography is one of the most widely used examinations for the diagnosis of cardiac diseases. However, it supplies only the geometric and structural information of the myocardium. In order to supply more detailed microstructural information of the myocardium, this paper proposes a registration method to map cardiac fiber orientations from a three-dimensional (3D) magnetic resonance diffusion tensor imaging (MR-DTI) volume to the 2D ultrasound image. It utilizes a 2D/3D intensity-based registration procedure including rigid, log-demons, and affine transformations to search for the most similar slice in the template volume. After registration, the cardiac fiber orientations are mapped to the 2D ultrasound image via fiber relocations and reorientations. The method was validated on six ex vivo rat heart images. The evaluation results indicated that the final Dice similarity coefficient (DSC) exceeded 90% after geometric registration, and the inclination angle errors (IAE) between the mapped fiber orientations and the gold standards were less than 15 degrees. This method may provide a practical tool for cardiologists to examine cardiac fiber orientations on ultrasound images and has the potential to supply additional information for the diagnosis of cardiac diseases.
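
    As a small, self-contained illustration of the evaluation metric reported above (not code from the paper), the Dice similarity coefficient between two binary segmentation masks can be computed directly with NumPy:

    ```python
    import numpy as np

    def dice_coefficient(mask_a, mask_b):
        """Dice similarity coefficient between two binary masks of equal shape."""
        a = np.asarray(mask_a, dtype=bool)
        b = np.asarray(mask_b, dtype=bool)
        denom = a.sum() + b.sum()
        return 1.0 if denom == 0 else 2.0 * np.logical_and(a, b).sum() / denom

    # Toy 2D example: two overlapping rectangles
    a = np.zeros((64, 64), dtype=bool); a[10:40, 10:40] = True
    b = np.zeros((64, 64), dtype=bool); b[15:45, 15:45] = True
    print(f"DSC = {dice_coefficient(a, b):.3f}")
    ```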

  12. Mapping growing stock volume and forest live biomass: a case study of the Polissya region of Ukraine

    NASA Astrophysics Data System (ADS)

    Bilous, Andrii; Myroniuk, Viktor; Holiaka, Dmytrii; Bilous, Svitlana; See, Linda; Schepaschenko, Dmitry

    2017-10-01

    Forest inventory and biomass mapping are important tasks that require inputs from multiple data sources. In this paper we implement two methods for the Ukrainian region of Polissya: random forest (RF) for tree species prediction and k-nearest neighbors (k-NN) for growing stock volume and biomass mapping. We examined the suitability of the five-band RapidEye satellite image to predict the distribution of six tree species. The accuracy of RF is quite high: ~99% for the forest/non-forest mask and 89% for tree species prediction. Our results demonstrate that inclusion of elevation as a predictor variable in the RF model improved the performance of tree species classification. We evaluated different distance metrics for the k-NN method, including Euclidean and Mahalanobis distances, most similar neighbor (MSN), gradient nearest neighbor, and independent component analysis. MSN with the four nearest neighbors (k = 4) is the most precise (according to the root-mean-square deviation) for predicting forest attributes across the study area. The k-NN method allowed us to estimate growing stock volume with an accuracy of 3 m3 ha-1 and live biomass with an accuracy of about 2 t ha-1 over the study area.
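
    The k-NN imputation described here (MSN with k = 4) depends on inventory plot data and canonical-correlation distances that are not reproduced here; the sketch below is only a generic nearest-neighbour estimator with Euclidean distance in standardized feature space, with synthetic arrays standing in for the RapidEye predictors and plot-level volumes.

    ```python
    import numpy as np

    def knn_predict(x_train, y_train, x_query, k=4):
        """Predict a response (e.g. growing stock volume, m3/ha) as the mean of
        the k nearest reference plots in standardized predictor space."""
        mu, sd = x_train.mean(axis=0), x_train.std(axis=0) + 1e-12
        xt = (x_train - mu) / sd
        xq = (x_query - mu) / sd
        dist = np.linalg.norm(xq[:, None, :] - xt[None, :, :], axis=2)  # (n_query, n_train)
        nearest = np.argsort(dist, axis=1)[:, :k]
        return y_train[nearest].mean(axis=1)

    rng = np.random.default_rng(0)
    x_plots = rng.normal(size=(200, 5))                  # hypothetical spectral predictors
    volume = 150 + 40 * x_plots[:, 0] + rng.normal(scale=10, size=200)
    x_pixels = rng.normal(size=(3, 5))                   # pixels to impute
    print(knn_predict(x_plots, volume, x_pixels, k=4))
    ```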

  13. Object Tracking Vision System for Mapping the UCN τ Apparatus Volume

    NASA Astrophysics Data System (ADS)

    Lumb, Rowan; UCNtau Collaboration

    2016-09-01

    The UCN τ collaboration has an immediate goal to measure the lifetime of the free neutron to within 0.1%, i.e. about 1 s. The UCN τ apparatus is a magneto-gravitational "bottle" system. This system holds low-energy, or ultracold, neutrons in the apparatus with the constraint of gravity, and keeps these low-energy neutrons from interacting with the bottle via a strong 1 T surface magnetic field created by a bowl-shaped array of permanent magnets. The apparatus is wrapped with energized coils to supply a magnetic field throughout the bottle volume to prevent depolarization of the neutrons. An object-tracking stereo-vision system will be presented that precisely tracks a Hall probe and allows mapping of the magnetic field throughout the volume of the UCN τ bottle. The stereo-vision system utilizes two cameras and open-source OpenCV software to track an object's 3D position in space in real time. The desired resolution is ±1 mm along each axis. The vision system is being used as part of an even larger system to map the magnetic field of the UCN τ apparatus and expose any possible systematic effects due to field cancellation or low-field points, which could allow neutrons to depolarize and possibly escape from the apparatus undetected.
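
    The abstract describes a two-camera OpenCV system for tracking the probe in 3D. A minimal sketch of the core triangulation step, assuming the cameras have already been calibrated so that their 3 x 4 projection matrices are known, might look like the following; the intrinsics, extrinsics, and pixel coordinates below are placeholders, not the collaboration's calibration.

    ```python
    import cv2
    import numpy as np

    # Hypothetical shared intrinsics and a small relative rotation/translation (mm).
    K = np.array([[800.0, 0.0, 320.0],
                  [0.0, 800.0, 240.0],
                  [0.0, 0.0, 1.0]])
    R = cv2.Rodrigues(np.array([[0.0], [0.05], [0.0]]))[0]
    t = np.array([[-100.0], [0.0], [0.0]])
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])   # camera 1 at the origin
    P2 = K @ np.hstack([R, t])                          # camera 2

    # Matched pixel coordinates of the tracked marker in each camera (2 x N arrays).
    pts1 = np.array([[320.0], [240.0]])
    pts2 = np.array([[305.0], [241.0]])

    # Linear triangulation returns 4 x N homogeneous coordinates.
    X_h = cv2.triangulatePoints(P1, P2, pts1, pts2)
    X = (X_h[:3] / X_h[3]).ravel()
    print("Estimated 3D position (same units as the baseline):", X)
    ```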

  14. Geological Mapping Uses Landsat 4-5TM Satellite Data in Manlai Soum of Omnogovi Aimag

    NASA Astrophysics Data System (ADS)

    Norovsuren, B.

    2014-12-01

    Authors: Bayanmonkh N., Undram G., Tsolmon R., Ariunzul Ya., and Bayartungalag B. (Environmental Research Information and Study Center; NUM-ITC-UNESCO Space Science and Remote Sensing International Laboratory, National University of Mongolia; Geology and Hydrology School, Korea University). Keywords: geology, mineral resources, fracture, structure, lithology. The geological map is the most important map for mineral exploration in mining. In Mongolia, the existing geological maps were compiled by Russian geologists using earlier technology and no longer satisfy present requirements. We therefore aimed to improve the geological map, including its fracture, structural, and lithological layers, using Landsat 4-5 TM satellite data; a geological map produced from satellite data with greater detail allows geologists to interpret mineralogy more easily. We reviewed methodologies for each element of geological mapping and then applied three different remote sensing methods to produce structural, lithological, and fracture maps within geographic information system software. The results show a visible improvement in lithological boundaries and a more readable structural map, and we found that the fracture layer of the Russian geological map contains considerable distortion. From the satellite imagery, geologists can read mineralogical elements more easily, and three important features not shown on the existing map were identified.

  15. Polymer gel dosimeters for pretreatment radiotherapy verification using the three-dimensional gamma evaluation and pass rate maps.

    PubMed

    Hsieh, Ling-Ling; Shieh, Jiunn-I; Wei, Li-Ju; Wang, Yi-Chun; Cheng, Kai-Yuan; Shih, Cheng-Ting

    2017-05-01

    Polymer gel dosimeters (PGDs) have been widely studied for use in the pretreatment verification of clinical radiation therapy. However, the readability of PGDs in three-dimensional (3D) dosimetry remains unclear. In this study, pretreatment verifications of clinical radiation therapy were performed using an N-isopropyl-acrylamide (NIPAM) PGD, and the results were used to evaluate the performance of the NIPAM PGD for 3D dose measurement. A gel phantom was used to measure the dose distribution of a clinical case of intensity-modulated radiation therapy. Magnetic resonance imaging scans were performed for dose readouts. The measured dose volumes were compared with the planned dose volume. The relative volume histograms showed that relative volumes with a negative percent dose difference decreased as time elapsed; furthermore, the histograms revealed few changes beyond 24 h post-irradiation. For the 3%/3 mm and 2%/2 mm criteria, the pass rates of the 12- and 24-h dose volumes were higher than 95%. This study thus concludes that the pass rate map can be used to evaluate the dose-temporal readability of PGDs and that the NIPAM PGD can be used for clinical pretreatment verifications.
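
    The 3%/3 mm and 2%/2 mm figures refer to the gamma evaluation. A brute-force global gamma pass rate on a small grid can be sketched as below; this is an illustration only, not the authors' analysis pipeline, and the dose arrays are synthetic.

    ```python
    import numpy as np
    from itertools import product

    def gamma_pass_rate(dose_ref, dose_eval, spacing_mm, dd=0.03, dta_mm=3.0, cutoff=0.10):
        """Simplified global 3D gamma analysis by brute-force neighbourhood search.

        dd     : dose-difference criterion as a fraction of the maximum reference dose
        dta_mm : distance-to-agreement criterion in mm
        cutoff : ignore reference voxels below this fraction of the maximum dose
        """
        norm = dd * dose_ref.max()
        win = [int(np.ceil(2 * dta_mm / s)) for s in spacing_mm]   # search window in voxels
        offsets = [np.array(o) for o in product(*[range(-w, w + 1) for w in win])]
        shape = np.array(dose_ref.shape)
        gammas = []
        for idx in np.argwhere(dose_ref >= cutoff * dose_ref.max()):
            best = np.inf
            for off in offsets:
                j = idx + off
                if np.any(j < 0) or np.any(j >= shape):
                    continue
                dist2 = np.sum((off * np.asarray(spacing_mm)) ** 2) / dta_mm ** 2
                ddiff2 = ((dose_eval[tuple(j)] - dose_ref[tuple(idx)]) / norm) ** 2
                best = min(best, dist2 + ddiff2)
            gammas.append(np.sqrt(best))
        return 100.0 * np.mean(np.array(gammas) <= 1.0)

    # Toy example: the evaluated dose is the reference scaled by 2% plus a little noise
    rng = np.random.default_rng(1)
    ref = np.fromfunction(lambda z, y, x: np.exp(-((x - 10) ** 2 + (y - 10) ** 2) / 40.0),
                          (3, 21, 21))
    ev = ref * 1.02 + rng.normal(scale=0.005, size=ref.shape)
    print(f"pass rate: {gamma_pass_rate(ref, ev, spacing_mm=(2.0, 2.0, 2.0)):.1f}%")
    ```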

  16. The Future of Food Service: A Basis for Planning.

    ERIC Educational Resources Information Center

    Powers, Thomas F., Ed.; Swinton, John R., Ed.

    Designed as a basic reference document, the report has mapped the economic and technological territory of the food service industry, examined the dynamics shaping the industry today, and examined questions in need of further research. It is a volume that might be useful to food service teachers and curriculum planners as well as a volume which…

  17. ENGLISH FOR TODAY. BOOK TWO, THE WORLD WE LIVE IN.

    ERIC Educational Resources Information Center

    SLAGER, WILLIAM R.; AND OTHERS

    THIS SECOND VOLUME (SECOND YEAR) OF THE "ENGLISH FOR TODAY" SERIES FOLLOWS THE SAME GENERAL FIVE-UNIT FORMAT AND AUDIO-LINGUAL APPROACH AS BOOK ONE. THE 25 READINGS CONTAINED IN THIS VOLUME ARE GRADED AND CONTROLLED, DEALING WITH SUCH GEOGRAPHICAL AND SOCIAL TOPICS AS MAPS, TRAVEL, EXPLORING, FAMOUS SITES AND MONUMENTS, AND PROGRESS IN…

  18. Historical and Political Gazeteer of Afghanistan. Volume 1, Badakhshan Province and Northeastern Afghanistan.

    ERIC Educational Resources Information Center

    Adamec, Ludwig W., Ed.

    Combining the most important geographical data with historical, political, and cultural information, this work, one of six volumes designed as a tool of research and a general reference source, updates and includes a previous publication compiled in 1914 with corrections and additions of maps and considerable new material to take into account…

  19. Alteration, slope-classified alteration, and potential lahar inundation maps of volcanoes for the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) Volcano Archive

    USGS Publications Warehouse

    Mars, John C.; Hubbard, Bernard E.; Pieri, David; Linick, Justin

    2015-01-01

    This study was undertaken during 2012–2013 in cooperation with the National Aeronautics and Space Administration (NASA). Since completion of this study, a new lahar modeling program (LAHAR_pz) has been released, which may produce slightly different modeling results from the LAHARZ model used in this study. The maps and data from this study should not be used in place of existing volcano hazard maps published by local authorities. For volcanoes without hazard maps and (or) published lahar-related hazard studies, this work will provide a starting point from which more accurate hazard maps can be produced. This is the first dataset to provide digital maps of altered volcanoes and adjacent watersheds that can be used for assessing volcanic hazards, hydrothermal alteration, and other volcanic processes in future studies.

  20. High-resolution multibeam bathymetry of East and West Flower Gardens and Stetson Banks, Gulf of Mexico

    USGS Publications Warehouse

    Gardner, J.V.; Mayer, L.A.; Hughes, Clarke J.E.; Kleiner, A.

    1998-01-01

    The 1990s have seen rapid advances in seafloor mapping technology. Multibeam sonars are now capable of mapping a wide range of water depths with beams as narrow as 1°, and provide up to a 150° swath. When these multibeam sonars are coupled with an extremely accurate vehicle motion sensor and very precise navigation, they are capable of producing unprecedented images of the seafloor. This technology was used in December 1997 to map the East and West Flower Gardens and Stetson Banks, Gulf of Mexico. The results from this survey provide the most accurate maps of these areas yet produced and reveal features at submeter resolution never mapped in these areas before. The digital data provide a database that should become the fundamental base maps for all subsequent work in this recently established National Marine Sanctuary.

  1. Comparing maps of mean monthly surface temperature and precipitation for Alaska and adjacent areas of Canada produced by two different methods

    USGS Publications Warehouse

    Simpson, James J.; Hufford, Gary L.; Daly, Christopher; Berg, Jared S.; Fleming, Michael D.

    2005-01-01

    Maps of mean monthly surface temperature and precipitation for Alaska and adjacent areas of Canada, produced by Oregon State University's Spatial Climate Analysis Service (SCAS) and the Alaska Geospatial Data Clearinghouse (AGDC), were analyzed. Because both sets of maps are generally available and in use by the community, there is a need to document differences between the processes and input data sets used by the two groups to produce their respective set of maps and to identify similarities and differences between the two sets of maps and possible reasons for the differences. These differences do not affect the observed large-scale patterns of seasonal and annual variability. Alaska is divided into interior and coastal zones, with consistent but different variability, separated by a transition region. The transition region has high interannual variability but low long-term mean variability. Both data sets support the four major ecosystems and ecosystem transition zone identified in our earlier work. Differences between the two sets of maps do occur, however, on the regional scale; they reflect differences in physiographic domains and in the treatment of these domains by the two groups (AGDC, SCAS). These differences also provide guidance for an improved observational network for Alaska. On the basis of validation with independent in situ data, we conclude that the data set produced by SCAS provides the best spatial coverage of Alaskan long-term mean monthly surface temperature and precipitation currently available. © The Arctic Institute of North America.

  2. Fusion of pan-tropical biomass maps using weighted averaging and regional calibration data

    NASA Astrophysics Data System (ADS)

    Ge, Yong; Avitabile, Valerio; Heuvelink, Gerard B. M.; Wang, Jianghao; Herold, Martin

    2014-09-01

    Biomass is a key environmental variable that influences many biosphere-atmosphere interactions. Recently, a number of biomass maps at national, regional and global scales have been produced using different approaches with a variety of input data, such as from field observations, remotely sensed imagery and other spatial datasets. However, the accuracy of these maps varies regionally and is largely unknown. This research proposes a fusion method to increase the accuracy of regional biomass estimates by using higher-quality calibration data. In this fusion method, the biases in the source maps were first adjusted to correct for over- and underestimation by comparison with the calibration data. Next, the biomass maps were combined linearly using weights derived from the variance-covariance matrix associated with the accuracies of the source maps. Because each map may have different biases and accuracies for different land use types, the biases and fusion weights were computed for each of the main land cover types separately. The conceptual arguments are substantiated by a case study conducted in East Africa. Evaluation analysis shows that fusing multiple source biomass maps may produce a more accurate map than when only one biomass map or unweighted averaging is used.
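
    A stripped-down version of the two fusion steps described above (bias adjustment against calibration data, then a variance-weighted linear combination) might look as follows. The maps, calibration pixels, and resulting weights are synthetic placeholders, and the full method additionally computes biases and weights per land cover class.

    ```python
    import numpy as np

    def fuse_biomass_maps(maps, calib_mask, calib_values):
        """Fuse several co-registered biomass maps (n_maps, H, W) using
        higher-quality calibration data at the pixels flagged in calib_mask."""
        # 1) additive bias correction estimated from the calibration pixels
        biases = np.array([m[calib_mask].mean() - calib_values.mean() for m in maps])
        adjusted = maps - biases[:, None, None]
        # 2) weights inversely proportional to each map's residual variance
        var = np.array([np.var(m[calib_mask] - calib_values) for m in adjusted])
        weights = (1.0 / var) / np.sum(1.0 / var)
        fused = np.tensordot(weights, adjusted, axes=1)
        return fused, weights

    rng = np.random.default_rng(2)
    truth = rng.uniform(50, 300, size=(60, 60))                       # "true" biomass, t/ha
    maps = np.stack([truth + 20 + rng.normal(0, 30, truth.shape),     # biased, noisy map
                     truth - 10 + rng.normal(0, 15, truth.shape)])    # less noisy map
    mask = np.zeros_like(truth, dtype=bool); mask[::6, ::6] = True    # sparse calibration plots
    fused, weights = fuse_biomass_maps(maps, mask, truth[mask])
    print("fusion weights:", np.round(weights, 2))
    ```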

  3. [The experiment of participatory mapping in order to construct a cartographical alternative to the FHS].

    PubMed

    Goldstein, Roberta Argento; Barcellos, Christovam; Magalhães, Monica de Avelar Figueiredo Mafra; Gracie, Renata; Viacava, Francisco

    2013-01-01

    Maps and mapping procedures are useful tools for systematic interpretation and evaluation and for reporting of results to management. Applied to the Family Health Strategy (FHS), these maps permit the demarcation of the territory and the establishment of links between the territory, its population and health services. In this paper the use of maps by the FHS in 17 municipalities in northern and northeastern Brazil is studied and the process of demarcation and digitization of areas with the participation of teams is described. The survey conducted using questionnaires and discussion workshops showed that difficulties still prevail in reconciling the map (drawing) produced at the local level with maps produced by other government sectors. In general, the maps used at local level employ their own references, which prevent the interplay of information with other cartographic documents and their full use as a tool for evaluation and management. The combination of participatory mapping tools, associated with Geographic Information Systems (GIS) applications proposed in this paper, represents an alternative to mapping the territory of operations of FHS teams, as well as a reflection on the concept of territory and operation by the FHS.

  4. Topography- and nightlight-based national flood risk assessment in Canada

    NASA Astrophysics Data System (ADS)

    Elshorbagy, Amin; Bharath, Raja; Lakhanpal, Anchit; Ceola, Serena; Montanari, Alberto; Lindenschmidt, Karl-Erich

    2017-04-01

    In Canada, flood analysis and water resource management, in general, are tasks conducted at the provincial level; therefore, unified national-scale approaches to water-related problems are uncommon. In this study, a national-scale flood risk assessment approach is proposed and developed. The study focuses on using global and national datasets available with various resolutions to create flood risk maps. First, a flood hazard map of Canada is developed using topography-based parameters derived from digital elevation models, namely, elevation above nearest drainage (EAND) and distance from nearest drainage (DFND). This flood hazard mapping method is tested on a smaller area around the city of Calgary, Alberta, against a flood inundation map produced by the city using hydraulic modelling. Second, a flood exposure map of Canada is developed using a land-use map and the satellite-based nightlight luminosity data as two exposure parameters. Third, an economic flood risk map is produced, and subsequently overlaid with population density information to produce a socioeconomic flood risk map for Canada. All three maps of hazard, exposure, and risk are classified into five classes, ranging from very low to severe. A simple way to include flood protection measures in hazard estimation is also demonstrated using the example of the city of Winnipeg, Manitoba. This could be done for the entire country if information on flood protection across Canada were available. The evaluation of the flood hazard map shows that the topography-based method adopted in this study is both practical and reliable for large-scale analysis. Sensitivity analysis regarding the resolution of the digital elevation model is needed to identify the resolution that is fine enough for reliable hazard mapping, but coarse enough for computational tractability. The nightlight data are found to be useful for exposure and risk mapping in Canada; however, uncertainty analysis should be conducted to investigate the effect of the overglow phenomenon on flood risk mapping.
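
    Of the two topographic parameters, elevation above nearest drainage (EAND) has a particularly compact formulation: for every cell of the digital elevation model, find the nearest drainage cell and subtract its elevation (the distance from nearest drainage, DFND, falls out of the same query). The sketch below uses a planar nearest-neighbour search on a toy DEM and is not the drainage-delineation workflow used in the study.

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    def eand_dfnd(dem, drainage_mask, cell_size=30.0):
        """Elevation above / distance from the nearest drainage cell."""
        drain_idx = np.argwhere(drainage_mask)
        tree = cKDTree(drain_idx * cell_size)
        all_idx = np.argwhere(np.ones_like(dem, dtype=bool))          # every cell, row-major
        dist, nearest = tree.query(all_idx * cell_size)
        eand = dem.ravel() - dem[drainage_mask][nearest]
        return eand.reshape(dem.shape), dist.reshape(dem.shape)

    # Toy DEM: a valley along the central column acts as the drainage network
    y, x = np.mgrid[0:50, 0:50]
    dem = 5.0 * np.abs(x - 25) + 0.1 * y
    eand, dfnd = eand_dfnd(dem, x == 25)
    print(eand[10, 40], dfnd[10, 40])   # higher above and farther from the stream
    ```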

  5. Mapping broom snakeweed through image analysis of color-infrared photography and digital imagery.

    PubMed

    Everitt, J H; Yang, C

    2007-11-01

    A study was conducted on a south Texas rangeland area to evaluate aerial color-infrared (CIR) photography and CIR digital imagery combined with unsupervised image analysis techniques to map broom snakeweed [Gutierrezia sarothrae (Pursh.) Britt. and Rusby]. Accuracy assessments performed on computer-classified maps of photographic images from two sites had mean producer's and user's accuracies for broom snakeweed of 98.3 and 88.3%, respectively; whereas, accuracy assessments performed on classified maps from digital images of the same two sites had mean producer's and user's accuracies for broom snakeweed of 98.3 and 92.8%, respectively. These results indicate that CIR photography and CIR digital imagery combined with image analysis techniques can be used successfully to map broom snakeweed infestations on south Texas rangelands.
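
    Producer's and user's accuracies are read directly from the confusion matrix of the classified map against the reference data: correct pixels divided by the reference totals and by the map totals, respectively. A small NumPy illustration with made-up counts:

    ```python
    import numpy as np

    def producer_user_accuracy(confusion):
        """Per-class accuracies from a confusion matrix where confusion[i, j] is the
        number of reference samples of class i labelled as class j by the map."""
        correct = np.diag(confusion).astype(float)
        producers = correct / confusion.sum(axis=1)   # 1 - omission error
        users = correct / confusion.sum(axis=0)       # 1 - commission error
        return producers, users

    # Hypothetical two-class counts: class 0 = broom snakeweed, class 1 = other cover
    cm = np.array([[59, 1],
                   [7, 83]])
    prod, user = producer_user_accuracy(cm)
    print(f"producer's accuracy: {100 * prod[0]:.1f}%, user's accuracy: {100 * user[0]:.1f}%")
    ```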

  6. The Relationship between Self-Assembly and Conformal Mappings

    NASA Astrophysics Data System (ADS)

    Duque, Carlos; Santangelo, Christian

    The isotropic growth of a thin sheet has been used as a way to generate programmed shapes through controlled buckling. We discuss how conformal mappings, which are transformations that locally preserve angles, provide a way to quantify the area growth needed to produce a particular shape. A discrete version of the conformal map can be constructed from circle packings, which are maps between packings of circles whose contact network is preserved. This provides a link to the self-assembly of particles on curved surfaces. We performed simulations of attractive particles on a curved surface using molecular dynamics. The resulting particle configurations were used to generate the corresponding discrete conformal map, allowing us to quantify the degree of area distortion required to produce a particular shape by finding particle configurations that minimize the area distortion.
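
    For a conformal map f, the local area growth factor is |f'(z)|^2, and integrating it over a region gives the total area change that the programmed growth must supply. A short numerical check of this identity, using the arbitrarily chosen map f(z) = z^2 on the unit disk, is:

    ```python
    import numpy as np

    def area_growth_factor(f_prime, z):
        """Local area scale factor of a conformal map: |f'(z)|**2."""
        return np.abs(f_prime(z)) ** 2

    # For f(z) = z**2 the image covers the unit disk twice, so the integral of the
    # growth factor over the unit disk should equal 2 * pi.
    h = 0.002
    x = np.arange(-1, 1, h)
    X, Y = np.meshgrid(x, x)
    Z = X + 1j * Y
    inside = np.abs(Z) < 1
    integral = np.sum(area_growth_factor(lambda z: 2 * z, Z[inside])) * h ** 2
    print(integral, 2 * np.pi)   # ~6.28 vs 6.283...
    ```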

  7. Diviner lunar radiometer gridded brightness temperatures from geodesic binning of modeled fields of view

    NASA Astrophysics Data System (ADS)

    Sefton-Nash, E.; Williams, J.-P.; Greenhagen, B. T.; Aye, K.-M.; Paige, D. A.

    2017-12-01

    An approach is presented to efficiently produce high quality gridded data records from the large, global point-based dataset returned by the Diviner Lunar Radiometer Experiment aboard NASA's Lunar Reconnaissance Orbiter. The need to minimize data volume and processing time in production of science-ready map products is increasingly important with the growth in data volume of planetary datasets. Diviner makes on average >1400 observations per second of radiance that is reflected and emitted from the lunar surface, using 189 detectors divided into 9 spectral channels. Data management and processing bottlenecks are amplified by modeling every observation as a probability distribution function over the field of view, which can increase the required processing time by 2-3 orders of magnitude. Geometric corrections, such as projection of data points onto a digital elevation model, are numerically intensive and therefore it is desirable to perform them only once. Our approach reduces bottlenecks through parallel binning and efficient storage of a pre-processed database of observations. Database construction is via subdivision of a geodesic icosahedral grid, with a spatial resolution that can be tailored to suit the field of view of the observing instrument. Global geodesic grids with high spatial resolution are normally impractically memory intensive. We therefore demonstrate a minimum-storage and highly parallel method to bin very large numbers of data points onto such a grid. A database of the pre-processed and binned points is then used for production of mapped data products, which is significantly faster than if unprocessed points were used. We explore quality controls in the production of gridded data records by conditional interpolation, allowed only where data density is sufficient. The resultant effects on the spatial continuity and uncertainty in maps of lunar brightness temperatures are illustrated. We identify four binning regimes based on trades between the spatial resolution of the grid, the size of the FOV and the on-target spacing of observations. Our approach may be applicable and beneficial for many existing and future point-based planetary datasets.
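
    The core binning step (assign each observation to its nearest grid vertex, then accumulate per-vertex statistics) can be sketched independently of the icosahedral grid construction. The example below uses random unit vectors as a stand-in for the subdivided geodesic grid and a k-d tree for the nearest-vertex lookup (chord distance is monotonic in angular distance, so the nearest vertex is the same); the brightness temperatures are synthetic.

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    def bin_to_grid(obs_dirs, obs_values, grid_dirs):
        """Accumulate point observations onto the nearest grid vertex.

        obs_dirs, grid_dirs : unit vectors, shapes (N, 3) and (M, 3)
        obs_values          : observed quantity per point, shape (N,)
        Returns the per-vertex mean value and observation count.
        """
        nearest = cKDTree(grid_dirs).query(obs_dirs)[1]
        counts = np.bincount(nearest, minlength=len(grid_dirs))
        sums = np.bincount(nearest, weights=obs_values, minlength=len(grid_dirs))
        means = np.where(counts > 0, sums / np.maximum(counts, 1), np.nan)
        return means, counts

    rng = np.random.default_rng(3)
    def random_unit_vectors(n):
        v = rng.normal(size=(n, 3))
        return v / np.linalg.norm(v, axis=1, keepdims=True)

    grid = random_unit_vectors(500)            # stand-in for geodesic grid vertices
    obs = random_unit_vectors(20000)           # observation boresight directions
    tb = 100 + 150 * (obs[:, 2] + 1) / 2       # hypothetical brightness temperatures (K)
    means, counts = bin_to_grid(obs, tb, grid)
    print(counts.min(), counts.max(), round(float(np.nanmean(means)), 1))
    ```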

  8. A risk-based framework to assess long-term effects of policy and water supply changes on water resources systems

    NASA Astrophysics Data System (ADS)

    Hassanzadeh, Elmira; Elshorbagy, Amin; Wheater, Howard; Gober, Patricia

    2015-04-01

    Climate uncertainty can affect water resources availability and management decisions. Sustainable water resources management therefore requires evaluation of policy and management decisions under a wide range of possible future water supply conditions. This study proposes a risk-based framework to integrate water supply uncertainty into a forward-looking decision making context. To apply this framework, a stochastic reconstruction scheme is used to generate a large ensemble of flow series. For the Rocky Mountain basins considered here, two key characteristics of the annual hydrograph are its annual flow volume and the timing of the seasonal flood peak. These are perturbed to represent natural randomness and potential changes due to future climate. 30-year series of perturbed flows are used as input to the SWAMP model - an integrated water resources model that simulates the regional water supply-demand system and estimates the economic productivity of water and other sustainability indicators, including system vulnerability and resilience. The simulation results are used to construct 2D maps of net revenue for a particular water sector, e.g., hydropower, or for all sectors combined. Each map cell represents a risk scenario of net revenue based on a particular annual flow volume, timing of the peak flow, and 200 stochastic realizations of flow series. This framework is demonstrated for a water resources system in the Saskatchewan River Basin (SaskRB) in Saskatchewan, Canada. Critical historical drought sequences, derived from tree-ring reconstructions of several hundred years of annual river flows, are used to evaluate the system's performance (net revenue risk) under extremely low flow conditions and to locate these drought scenarios on the previously produced 2D risk maps. This simulation and analysis framework is repeated under various reservoir operation strategies (e.g., maximizing flood protection or maximizing water supply security); development proposals, such as irrigation expansion; and changes in energy prices. Such risk-based analysis demonstrates the relative reduction or increase of risk associated with management and policy decisions and allows decision makers to explore the relative importance of policy versus natural water supply change in a water resources system.
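
    The 2D risk maps described here aggregate, for each (annual volume, peak timing) cell, the outcomes of many stochastic flow realizations run through the SWAMP model. A heavily simplified sketch of that aggregation is shown below; the net_revenue function and the gamma-distributed flows are placeholders invented for the example, not the SWAMP economics.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    def net_revenue(volume, peak_week, flows):
        """Placeholder economic model standing in for the SWAMP simulation output."""
        return volume * 0.8 - 5.0 * abs(peak_week - 24) + flows.mean()

    volumes = np.linspace(5, 15, 21)       # annual flow volume scenarios (arbitrary units)
    peak_weeks = np.arange(18, 31)         # timing of the seasonal peak (week of year)
    risk_map = np.empty((len(volumes), len(peak_weeks)))
    for i, v in enumerate(volumes):
        for j, w in enumerate(peak_weeks):
            flows = rng.gamma(shape=2.0, scale=v / 2.0, size=(200, 30))   # 200 realizations
            revenues = [net_revenue(v, w, f) for f in flows]
            risk_map[i, j] = np.mean(revenues)   # or a low percentile as a risk measure
    print(risk_map.shape)                        # one value per (volume, timing) cell
    ```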

  9. A new service offered by rural environment to the city: stormwater reception.

    NASA Astrophysics Data System (ADS)

    Chiaradia, Enrico Antonio; Weber, Enrico; Masseroni, Daniele; Battista Bischetti, Gian; Gandolfi, Claudio

    2017-04-01

    Stormwater is the main cause of urban floods in many urbanized areas. Historically, stormwater management practices have focused on building infrastructure that achieves runoff attenuation through the storage of water volumes in large detention basins. However, this approach has proven insufficient to resolve the problem, and it is difficult to implement in areas with a dense urban fabric. Nowadays, water managers around the world are increasingly embracing "soft path" approaches that aim to manage the excess of urban runoff through green infrastructure, where detention capacity is provided by the retention properties of soil and vegetation. In line with these new sustainable stormwater management practices, the aim of this study is to promote a further paradigm shift with respect to traditional practice, i.e. to investigate the possibility of using the already existing green infrastructure of peri-urban rural areas as a reception element for the surplus of urban runoff. Many territories in northern Italy, for example, are characterized by a high density of irrigation canals and agricultural fields that, in some cases, are isolated or pent-up inside urbanized areas. Both these elements may provide storage volumes for accumulating stormwater from urban areas. In this work, we implemented a holistic framework based on the Self-Organizing Map (SOM) technique with the objective of producing a spatial map of the stormwater reception level that can be provided by the rural environment. We processed the physiographic characteristics of irrigation canals and agricultural fields through the SOM algorithm, obtaining as output a series of cluster groups with the same level of receptivity. This procedure was applied to an area of 1933 km2 around the city of Milan, and a map of 250 x 250 m resolution was obtained with three different levels of stormwater reception capacity. About 50% of the rural environment has a good level of reception, and only 30% and 20% of rural areas have a moderate and scarce level of reception, respectively. From the results we conclude that the rural environment could become a valuable structural alternative to traditional stormwater control methods, ascribing to the rural environment a new role in urban flood protection.
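
    The clustering step here is a Self-Organizing Map over physiographic descriptors of canals and fields. A minimal NumPy SOM with a 1D lattice (a toy configuration, not the one used in the study, and with invented descriptors) illustrates the training rule:

    ```python
    import numpy as np

    def train_som(data, n_units=3, epochs=50, lr0=0.5, sigma0=1.0, seed=0):
        """Train a 1D Self-Organizing Map; return unit weights and cluster labels."""
        rng = np.random.default_rng(seed)
        weights = data[rng.choice(len(data), n_units, replace=False)].astype(float)
        units = np.arange(n_units)
        for epoch in range(epochs):
            lr = lr0 * (1 - epoch / epochs)                       # decaying learning rate
            sigma = max(sigma0 * (1 - epoch / epochs), 1e-3)      # shrinking neighbourhood
            for x in data[rng.permutation(len(data))]:
                bmu = np.argmin(np.linalg.norm(weights - x, axis=1))     # best-matching unit
                h = np.exp(-((units - bmu) ** 2) / (2 * sigma ** 2))     # neighbourhood kernel
                weights += lr * h[:, None] * (x - weights)
        labels = np.argmin(np.linalg.norm(data[:, None, :] - weights[None], axis=2), axis=1)
        return weights, labels

    # Hypothetical descriptors per element: [normalized storage volume, distance to urban area]
    rng = np.random.default_rng(4)
    features = np.vstack([rng.normal([1.0, 0.5], 0.1, (40, 2)),
                          rng.normal([0.5, 1.5], 0.1, (40, 2)),
                          rng.normal([0.1, 3.0], 0.1, (40, 2))])
    weights, reception_class = train_som(features, n_units=3)
    print(np.bincount(reception_class))   # elements per receptivity level
    ```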

  10. Hydraulic modeling of clay ceramic water filters for point-of-use water treatment.

    PubMed

    Schweitzer, Ryan W; Cunningham, Jeffrey A; Mihelcic, James R

    2013-01-02

    The acceptability of ceramic filters for point-of-use water treatment depends not only on the quality of the filtered water, but also on the quantity of water the filters can produce. This paper presents two mathematical models for the hydraulic performance of ceramic water filters under typical usage. A model is developed for two common filter geometries: paraboloid- and frustum-shaped. Both models are calibrated and evaluated by comparison to experimental data. The hydraulic models are able to predict the following parameters as functions of time: water level in the filter (h), instantaneous volumetric flow rate of filtrate (Q), and cumulative volume of water produced (V). The models' utility is demonstrated by applying them to estimate how the volume of water produced depends on factors such as the filter shape and the frequency of filling. Both models predict that the volume of water produced can be increased by about 45% if users refill the filter three times per day versus only once per day. Also, the models predict that filter geometry affects the volume of water produced: for two filters with equal volume, equal wall thickness, and equal hydraulic conductivity, a filter that is tall and thin will produce as much as 25% more water than one which is shallow and wide. We suggest that the models can be used as tools to help optimize filter performance.
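
    The paper's models are specific to paraboloid- and frustum-shaped filters and are calibrated against experiments; purely as an illustration of the general approach, the falling water level in an idealized cylindrical filter can be stepped forward in time with a Darcy-type flux through the base and the wetted side wall. Every parameter value below is invented for the example.

    ```python
    import numpy as np

    def simulate_cylindrical_filter(radius=0.12, wall=0.012, k=1e-7,
                                    h0=0.20, dt=10.0, t_end=12 * 3600):
        """Forward-Euler water level in an idealized cylindrical ceramic filter.

        radius, wall : pot radius and wall/base thickness (m)
        k            : hydraulic conductivity of the ceramic (m/s)
        h0           : initial water level (m)
        Returns time (h), water level (m), and cumulative filtrate volume (L).
        """
        times = np.arange(0.0, t_end + dt, dt)
        h = np.empty_like(times); h[0] = h0
        v = np.zeros_like(times)
        base_area = np.pi * radius ** 2
        for i in range(1, len(times)):
            q = (k / wall) * (base_area * h[i - 1]                  # flux through the base
                              + np.pi * radius * h[i - 1] ** 2)     # flux through the side wall
            h[i] = max(h[i - 1] - q * dt / base_area, 0.0)
            v[i] = v[i - 1] + q * dt
        return times / 3600.0, h, v * 1000.0

    t_h, level, litres = simulate_cylindrical_filter()
    print(f"after {t_h[-1]:.0f} h: level = {level[-1] * 100:.1f} cm, filtrate = {litres[-1]:.2f} L")
    ```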

  11. Specific degradation of the mucus adhesion-promoting protein (MapA) of Lactobacillus reuteri to an antimicrobial peptide.

    PubMed

    Bøhle, Liv Anette; Brede, Dag Anders; Diep, Dzung B; Holo, Helge; Nes, Ingolf F

    2010-11-01

    The intestinal flora of mammals contains lactic acid bacteria (LAB) that may provide positive health effects for the host. Such bacteria are referred to as probiotic bacteria. From a pig, we have isolated a Lactobacillus reuteri strain that produces an antimicrobial peptide (AMP). The peptide was purified and characterized, and it was unequivocally shown that the AMP was a well-defined degradation product obtained from the mucus adhesion-promoting protein (MapA); it was therefore termed AP48-MapA. This finding demonstrates how large proteins might inherit unexpected pleiotropic functions by conferring antimicrobial capacities on the producer. The MapA/AP48-MapA system is the first example where a large protein of an intestinal LAB is shown to give rise to such an AMP. It is also of particular interest that the protein that provides this AMP is associated with the binding of the bacterium producing it to the surface/lining of the gut. This finding gives us new perspective on how some probiotic bacteria may successfully compete in this environment and thereby contribute to a healthy microbiota.

  12. New Delhi Metallo-beta-lactamase around the world: an eReview using Google Maps.

    PubMed

    Berrazeg, M; Diene, Sm; Medjahed, L; Parola, P; Drissi, M; Raoult, D; Rolain, Jm

    2014-05-22

    Gram-negative carbapenem-resistant bacteria, in particular those producing New Delhi metallo-beta-lactamase-1 (NDM-1), are a major global health problem. To inform the scientific and medical community in real time about the worldwide dissemination of NDM-1-producing bacteria, we used the PubMed database to review all available publications from the first description in 2009 up to 31 December 2012, and created a regularly updated worldwide dissemination map using a web-based mapping application. We retrieved 33 reviews and 136 case reports describing 950 isolates of NDM-1-producing bacteria. Klebsiella pneumoniae (n = 359) and Escherichia coli (n = 268) were the most commonly reported bacteria producing the NDM-1 enzyme. Several case reports of infections due to imported NDM-1-producing bacteria have been reported in a number of countries, including the United Kingdom, Italy, and Oman. In most cases (132/153, 86.3%), patients had connections with the Indian subcontinent or Balkan countries. Those infected were originally from these areas, had spent time and/or been hospitalised there, or were potentially linked to other patients who had been hospitalised in these regions. By using Google Maps, we were able to trace the spread of NDM-1-producing bacteria. We strongly encourage epidemiologists to use these types of interactive tools for surveillance purposes and to use the information to prevent the spread and outbreaks of such bacteria.
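
    A web map of reported isolates, analogous in spirit to the Google Maps application described in this review, can be generated with the open-source folium library; the coordinates and isolate counts below are placeholders, not the review's data.

    ```python
    import folium

    # Hypothetical (city, latitude, longitude, number of reported NDM-1 isolates)
    reports = [
        ("New Delhi", 28.61, 77.21, 150),
        ("London", 51.51, -0.13, 40),
        ("Rome", 41.90, 12.50, 12),
    ]

    m = folium.Map(location=[20, 10], zoom_start=2)
    for city, lat, lon, n in reports:
        folium.CircleMarker(
            location=[lat, lon],
            radius=max(4, n ** 0.5),          # scale marker size with isolate count
            popup=f"{city}: {n} isolates",
            fill=True,
        ).add_to(m)
    m.save("ndm1_dissemination_map.html")
    ```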

  13. Measurement of gastric meal and secretion volumes using magnetic resonance imaging

    NASA Astrophysics Data System (ADS)

    Hoad, C. L.; Parker, H.; Hudders, N.; Costigan, C.; Cox, E. F.; Perkins, A. C.; Blackshaw, P. E.; Marciani, L.; Spiller, R. C.; Fox, M. R.; Gowland, P. A.

    2015-02-01

    MRI can assess multiple gastric functions without ionizing radiation. However, time-consuming image acquisition and analysis of gastric volume data, plus confounding of gastric emptying measurements by gastric secretions mixed with the test meal, have limited its use to research centres. This study presents an MRI acquisition protocol and analysis algorithm suitable for the clinical measurement of gastric volume and secretion volume. Reproducibility of gastric volume measurements was assessed using data from 10 healthy volunteers following a liquid test meal, with rapid MRI acquisition within one breath-hold and semi-automated analysis. Dilution of the ingested meal with gastric secretion was estimated using a respiratory-triggered T1 mapping protocol. Accuracy of the secretion volume measurements was assessed using data from 24 healthy volunteers following a mixed (liquid/solid) test meal, with MRI meal volumes compared to data acquired using gamma scintigraphy (GS) in the same subjects studied on a separate day. The mean ± SD coefficient of variance between 3 observers for both total gastric contents (including meal, secretions and air) and gastric contents alone (meal and secretion only) was 3 ± 2% at large gastric volumes (>200 ml). Mean ± SD secretion volumes post meal ingestion were 64 ± 51 ml and 110 ± 40 ml at 15 and 75 min, respectively. Comparison with GS meal volumes showed that MRI meal-only volumes (after correction for secretion volume) were similar to GS, with a linear regression gradient ± standard error of 1.06 ± 0.10 and intercept -11 ± 24 ml. In conclusion, (i) rapid volume acquisition and respiratory-triggered T1 mapping removed the requirement to image during prolonged breath-holds; (ii) semi-automatic analysis greatly reduced the time required to derive measurements; and (iii) correction for secretion volumes provided accurate assessment of gastric meal volumes and emptying. Together these features provide the scientific basis of a protocol that would be suitable in clinical practice.

  14. 40 CFR 63.6620 - What performance tests and other procedures must I use?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... based on the ratio of oxygen volume to the ultimate CO2 volume produced by the fuel at zero percent... volume of CO2 produced to the gross calorific value of the fuel from Method 19, dsm3/J (dscf/106 Btu... equivalent percent carbon dioxide (CO2). If pollutant concentrations are to be corrected to 15 percent oxygen...

  15. 40 CFR 63.6620 - What performance tests and other procedures must I use?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... based on the ratio of oxygen volume to the ultimate CO2 volume produced by the fuel at zero percent... volume of CO2 produced to the gross calorific value of the fuel from Method 19, dsm3/J (dscf/106 Btu... equivalent percent carbon dioxide (CO2). If pollutant concentrations are to be corrected to 15 percent oxygen...

  16. Regional Ventilation Changes in the Lung: Treatment Response Mapping by Using Hyperpolarized Gas MR Imaging as a Quantitative Biomarker.

    PubMed

    Horn, Felix C; Marshall, Helen; Collier, Guilhem J; Kay, Richard; Siddiqui, Salman; Brightling, Christopher E; Parra-Robles, Juan; Wild, Jim M

    2017-09-01

    Purpose: To assess the magnitude of regional response to respiratory therapeutic agents in the lungs by using treatment response mapping (TRM) with hyperpolarized gas magnetic resonance (MR) imaging. TRM was used to quantify regional physiologic response in adults with asthma who underwent a bronchodilator challenge. Materials and Methods: This study was approved by the national research ethics committee and was performed with informed consent. Imaging was performed in 20 adult patients with asthma by using hyperpolarized helium-3 (3He) ventilation MR imaging. Two sets of baseline images were acquired before inhalation of a bronchodilating agent (salbutamol 400 μg), and one set was acquired after. All images were registered for voxelwise comparison. Regional treatment response, ΔR(r), was calculated as the difference in regional gas distribution (R(r) = ratio of inhaled gas to total volume of a voxel when normalized for lung inflation volume) before and after intervention. A voxelwise activation threshold derived from the variability of the baseline images was applied to the ΔR(r) maps. The summed global treatment response (ΔRnet) was then used as a global lung index for comparison with metrics of bronchodilator response measured by spirometry and with the global imaging metric percentage ventilated volume (%VV). Results: ΔRnet showed significant correlation (P < .01) with changes in forced expiratory volume in 1 second (r = 0.70), forced vital capacity (r = 0.84), and %VV (r = 0.56). A significant (P < .01) positive treatment effect was detected with all metrics; however, ΔRnet showed a lower intersubject coefficient of variation (64%) than all of the other tests (coefficient of variation, ≥99%). Conclusion: TRM provides regional quantitative information on changes in inhaled gas ventilation in response to therapy. This method could be used as a sensitive regional outcome metric for novel respiratory interventions.
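
    A voxelwise sketch of the treatment response map defined above (normalized gas distribution before and after bronchodilator, differenced, then thresholded by baseline variability) is shown below; the image arrays are synthetic, whereas the real analysis operates on registered hyperpolarized 3He ventilation images.

    ```python
    import numpy as np

    def treatment_response_map(pre1, pre2, post):
        """Voxelwise treatment response from three registered ventilation images.

        Each image is normalized so that R(r) is the fraction of total signal per
        voxel; the activation threshold comes from the variability between the two
        baseline acquisitions."""
        r_pre1, r_pre2, r_post = (img / img.sum() for img in (pre1, pre2, post))
        delta_r = r_post - 0.5 * (r_pre1 + r_pre2)
        threshold = np.std(r_pre1 - r_pre2)          # baseline reproducibility
        activated = np.abs(delta_r) > threshold
        delta_r_net = delta_r[activated].sum()       # summed global treatment response
        return delta_r, activated, delta_r_net

    rng = np.random.default_rng(6)
    pre1 = rng.gamma(2.0, 1.0, size=(32, 32, 24))
    pre2 = pre1 * rng.normal(1.0, 0.05, pre1.shape)      # repeat baseline
    post = pre1 * rng.normal(1.10, 0.05, pre1.shape)     # ventilation gain after salbutamol
    _, activated, dr_net = treatment_response_map(pre1, pre2, post)
    print(f"activated voxels: {activated.mean():.1%}, net response: {dr_net:.4f}")
    ```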

  17. Automated land-use mapping from spacecraft data. [Oakland County, Michigan

    NASA Technical Reports Server (NTRS)

    Chase, P. E. (Principal Investigator); Rogers, R. H.; Reed, L. E.

    1974-01-01

    The author has identified the following significant results. In response to the need for a faster, more economical means of producing land use maps, this study evaluated the suitability of using ERTS-1 computer compatible tape (CCT) data as a basis for automatic mapping. Significant findings are: (1) automatic classification accuracy greater than 90% is achieved on categories of deep and shallow water, tended grass, rangeland, extractive (bare earth), urban, forest land, and nonforested wet lands; (2) computer-generated printouts by target class provide a quantitative measure of land use; and (3) the generation of map overlays showing land use from ERTS-1 CCTs offers a significant breakthrough in the rate at which land use maps are generated. Rather than uncorrected classified imagery or computer line printer outputs, the processing results in geometrically-corrected computer-driven pen drawing of land categories, drawn on a transparent material at a scale specified by the operator. These map overlays are economically produced and provide an efficient means of rapidly updating maps showing land use.

  18. Numerical Conformal Mapping Using Cross-Ratios and Delaunay Triangulation

    NASA Technical Reports Server (NTRS)

    Driscoll, Tobin A.; Vavasis, Stephen A.

    1996-01-01

    We propose a new algorithm for computing the Riemann mapping of the unit disk to a polygon, also known as the Schwarz-Christoffel transformation. The new algorithm, CRDT, is based on cross-ratios of the prevertices, and also on cross-ratios of quadrilaterals in a Delaunay triangulation of the polygon. The CRDT algorithm produces an accurate representation of the Riemann mapping even in the presence of arbitrarily long, thin regions in the polygon, unlike any previous conformal mapping algorithm. We believe that CRDT can never fail to converge to the correct Riemann mapping, but the correctness and convergence proofs depend on conjectures that we have so far not been able to prove. We demonstrate convergence with computational experiments. The Riemann mapping has applications to problems in two-dimensional potential theory and to finite-difference mesh generation. We use CRDT to produce a mapping and solve a boundary value problem on long, thin regions that no other algorithm can handle.
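
    The cross-ratio that gives CRDT its name is a simple function of four complex points; one common convention is shown below. (The CRDT algorithm itself additionally requires the Delaunay triangulation and the solution of the Schwarz-Christoffel parameter problem, which are not reproduced here.)

    ```python
    def cross_ratio(z1, z2, z3, z4):
        """Cross-ratio (z1, z2; z3, z4) = ((z1 - z3)(z2 - z4)) / ((z1 - z4)(z2 - z3)).

        It is invariant under Moebius transformations, which is what makes it a robust
        parameterization for the prevertices of the Schwarz-Christoffel map."""
        return ((z1 - z3) * (z2 - z4)) / ((z1 - z4) * (z2 - z3))

    # Invariance check: apply the Moebius map w = (2z + 1) / (z + 3) to four points
    pts = [0 + 0j, 1 + 0j, 1 + 1j, 0 + 1j]
    moebius = lambda z: (2 * z + 1) / (z + 3)
    print(cross_ratio(*pts))                             # 2+0j
    print(cross_ratio(*[moebius(z) for z in pts]))       # same value (up to rounding)
    ```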

  19. 40 CFR 80.1403 - Which fuels are not subject to the 20% GHG thresholds?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... baseline volume of ethanol that is produced from facilities and any expansions all of which commenced... produced by those facilities described in paragraph (d) of this section, only the ethanol volume (to the...

  20. 40 CFR 80.1403 - Which fuels are not subject to the 20% GHG thresholds?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... baseline volume of ethanol that is produced from facilities and any expansions all of which commenced... produced by those facilities described in paragraph (d) of this section, only the ethanol volume (to the...
