Sample records for cloud slicing technique

  1. The Cloud Detection and Ultraviolet Monitoring Experiment (CLUE)

    NASA Technical Reports Server (NTRS)

    Barbier, Louis M.; Loh, Eugene C.; Krizmanic, John F.; Sokolsky, Pierre; Streitmatter, Robert E.

    2004-01-01

    In this paper we describe a new balloon instrument - CLUE - which is designed to monitor ultraviolet (UV) nightglow levels and to determine cloud cover and cloud heights with a CO2-slicing technique. The CO2-slicing technique is based on that used by the MODIS instruments on NASA's Aqua and Terra spacecraft. CLUE will provide higher spatial resolution (0.5 km) and correlations between the UV nightglow and the cloud cover.

  2. Evaluation of Multilayer Cloud Detection Using a MODIS CO2-Slicing Algorithm With CALIPSO-CloudSat Measurements

    NASA Technical Reports Server (NTRS)

    Viudez-Mora, Antonio; Kato, Seiji

    2015-01-01

    This work evaluates the multilayer cloud (MCF) algorithm, based on CO2-slicing techniques, against CALIPSO-CloudSat (CLCS) measurements. The evaluation showed that the MCF underestimates the presence of multilayered clouds compared with CLCS and is restricted to cloud emissivities below 0.8 and cloud optical depths no larger than 0.3.

  3. Derivation of Tropospheric Column Ozone from the EPTOMS/GOES Co-Located Data Sets using the Cloud Slicing Technique

    NASA Technical Reports Server (NTRS)

    Ahn, C.; Ziemke, J. R.; Chandra, S.; Bhartia, P. K.

    2002-01-01

    A recently developed technique called cloud slicing, used for deriving upper tropospheric ozone from the Nimbus 7 Total Ozone Mapping Spectrometer (TOMS) instrument combined with the Temperature-Humidity Infrared Radiometer (THIR), is no longer applicable to the Earth Probe TOMS (EPTOMS) because EPTOMS does not have a companion instrument to measure cloud-top temperatures. To continue monitoring tropospheric ozone between 200-500 hPa and to test the feasibility of this technique across spacecraft, EPTOMS data are co-located in time and space with Geostationary Operational Environmental Satellite (GOES)-8 infrared data for 2001 and early 2002, covering most of North and South America (45S-45N and 120W-30W). The maximum column amounts for the mid-latitude sites of the northern hemisphere are found in the March-May season. For the mid-latitude sites of the southern hemisphere, the highest column amounts are found in the September-November season, although the overall seasonal variability is smaller than that of the northern hemisphere. The tropical sites show the weakest seasonal variability compared to higher latitudes. Because ozonesonde measurements at the study sites are not available for 2001, the derived results for selected sites are cross-validated qualitatively against the seasonality of ozonesonde observations and against THIR analyses over the 1979-1984 period. These comparisons show reasonably good agreement among THIR, ozonesonde observations, and cloud slicing-derived column ozone. Even with very limited co-located EPTOMS/GOES data sets, the cloud slicing technique remains viable for deriving upper tropospheric column ozone. Two new variant approaches, High-Low (HL) cloud slicing and ozone profile derivation from cloud slicing, are introduced to estimate column ozone amounts using the entire cloud information in the troposphere.

  4. First Look at the Upper Tropospheric Ozone Mixing Ratio from OMI Estimated using the Cloud Slicing Technique

    NASA Technical Reports Server (NTRS)

    Bhartia, Pawan K.; Ziemke, Jerry; Chandra, Sushil; Joiner, Joanna; Vassilkov, Alexandra; Taylor, Steven; Yang, Kai; Ahn, Chang-Woo

    2004-01-01

    The Cloud Slicing technique has emerged as a powerful tool for the study of ozone in the upper troposphere. In this technique one examines the variation, with cloud height, of the above-cloud column ozone derived from backscattered-ultraviolet instruments such as TOMS to determine the ozone mixing ratio. For this technique to work properly one needs an instrument with relatively good horizontal resolution and very good signal-to-noise in measuring above-cloud column ozone. In addition, one needs the (radiatively) effective cloud pressure rather than the cloud-top pressure, because the ultraviolet photons received by a satellite instrument are scattered from inside the cloud rather than from the top. For this study we use data from the OMI sensor, which was recently launched on the EOS Aura satellite. OMI is a UV-Visible backscattering instrument with a nadir pixel size of 13 x 24 km. The effective cloud pressure is derived from a new algorithm based on Rotational Raman Scattering and O2-O2 absorption in the 340-400 nm band of OMI.

  5. Improvement in thin cirrus retrievals using an emissivity-adjusted CO2 slicing algorithm

    NASA Astrophysics Data System (ADS)

    Zhang, Hong; Menzel, W. Paul

    2002-09-01

    CO2 slicing has been generally accepted as a useful algorithm for determining cloud top pressure (CTP) and effective cloud amount (ECA) for tropospheric clouds above 600 hPa. To date, the technique has assumed that the surface emits as a blackbody at long-wavelength infrared radiances and that the cloud emissivities in spectrally close bands are approximately equal. The modified CO2 slicing algorithm adjusts both the surface emissivity and the cloud emissivity ratio. Surface emissivity is adjusted according to surface type. The ratio of cloud emissivities in spectrally close bands is adjusted away from unity according to radiative transfer calculations. The new CO2 slicing algorithm is examined with Moderate Resolution Imaging Spectroradiometer (MODIS) Airborne Simulator (MAS) CO2 band radiance measurements over thin clouds and validated against Cloud Lidar System (CLS) measurements of the same clouds; it is also applied to Geostationary Operational Environmental Satellite (GOES) Sounder data to study the overall impact on cloud property determinations. For high thin clouds an improved product emerges, while for thick and opaque clouds there is little change. For very thin clouds, the CTP increases by about 10-20 hPa and the root-mean-square (RMS) difference is approximately 50 hPa; for thin clouds, the CTP increase is about 10 hPa and the RMS difference is approximately 30 hPa. The new CO2 slicing algorithm places the clouds lower in the troposphere.
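    The CTP selection step common to the CO2-slicing records above can be sketched as follows: the observed ratio of cloud-minus-clear radiance differences in two spectrally close CO2 bands is compared against the same ratio precomputed from a radiative transfer model at a set of trial cloud pressures, and the best-matching pressure is taken as the CTP. This is a minimal illustration, not any one paper's implementation; the function name, band labels, and the synthetic radiances below are hypothetical, and the modeled ratio table is assumed to come from a radiative transfer calculation.

```python
import numpy as np

def co2_slicing_ctp(I_cloudy, I_clear, ratio_model, pressures):
    """Pick the trial cloud-top pressure whose modeled band-pair ratio
    best matches the observed ratio of cloud-minus-clear radiance
    differences in two spectrally close CO2 bands.

    I_cloudy, I_clear : dicts {band_name: radiance} for the two bands
    ratio_model       : modeled ratio at each trial pressure (from RT code)
    pressures         : trial cloud pressures (hPa)
    """
    b1, b2 = sorted(I_cloudy)  # the two spectrally close CO2 bands
    obs_ratio = (I_clear[b1] - I_cloudy[b1]) / (I_clear[b2] - I_cloudy[b2])
    i = int(np.argmin(np.abs(ratio_model - obs_ratio)))
    return pressures[i]

# Synthetic demo: six trial pressures and a monotonic modeled ratio.
pressures = np.linspace(100.0, 600.0, 6)    # hPa
ratio_model = np.linspace(0.2, 1.0, 6)      # assumed precomputed
ctp = co2_slicing_ctp({"b1": 50.0, "b2": 60.0},   # cloudy radiances
                      {"b1": 55.0, "b2": 70.0},   # clear-sky radiances
                      ratio_model, pressures)
# observed ratio = 5/10 = 0.5; nearest modeled ratio is 0.52 -> 300 hPa
```

In practice the modeled ratio involves integrals of transmittance-weighted Planck derivatives from the surface to each trial pressure, and the effective cloud amount is then recovered from an infrared-window band; this sketch only shows the matching step.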

  6. Refinements to HIRS CO2 Slicing Algorithm with Results Compared to CALIOP and MODIS

    NASA Astrophysics Data System (ADS)

    Frey, R.; Menzel, P.

    2012-12-01

    This poster reports on the refinement of a cloud top property algorithm using High-resolution Infrared Radiation Sounder (HIRS) measurements. The HIRS sensor has been flown on fifteen satellites, from TIROS-N through NOAA-19 and MetOp-A, forming a continuous 30-year cloud data record. Cloud top pressure and effective emissivity (cloud fraction multiplied by cloud emissivity) are derived using the 15 μm spectral bands in the CO2 absorption band, implementing the CO2 slicing technique, which is strong for high semi-transparent clouds but weak for low clouds with little thermal contrast from clear skies. We report on algorithm adjustments suggested by MODIS cloud record validations and on the inclusion of collocated AVHRR cloud fraction data from the PATMOS-x algorithm. Reprocessing results for 2008 are shown using NOAA-18 HIRS, with collocated CALIOP data for validation, as well as comparisons to MODIS monthly mean values. Adjustments to the cloud algorithm include (a) using CO2 slicing for all ice and mixed-phase clouds and infrared-window determinations for all water clouds, (b) determining the cloud top pressure from the most opaque CO2 spectral band pair seeing the cloud, (c) reducing the cloud detection threshold for the CO2 slicing algorithm to include conditions of smaller radiance differences that are often due to thin ice clouds, and (d) identifying stratospheric clouds when an opaque band is warmer than a less opaque band.

  7. Global Free-tropospheric NO2 Abundances Derived Using a Cloud Slicing Technique from AURA OMI

    NASA Technical Reports Server (NTRS)

    Choi, S.; Joiner, J.; Choi, Y.; Duncan, B.N.; Vasilkov, A.; Krotkov, N.; Bucsela, E.J.

    2014-01-01

    We derive free-tropospheric NO2 volume mixing ratios (VMRs) by applying a cloud-slicing technique to data from the Ozone Monitoring Instrument (OMI) on the Aura satellite. In the cloud-slicing approach, the slope of the above-cloud NO2 column versus the cloud scene pressure is proportional to the NO2 VMR. In this work, we use a sample of nearby OMI pixel data from a single orbit for the linear fit. The OMI data include cloud scene pressures from the rotational-Raman algorithm and above-cloud NO2 vertical column density (VCD) (defined as the NO2 column from the cloud scene pressure to the top of the atmosphere) from a differential optical absorption spectroscopy (DOAS) algorithm. We compare OMI-derived NO2 VMRs with in situ aircraft profiles measured during the NASA Intercontinental Chemical Transport Experiment Phase B (INTEX-B) campaign in 2006. The agreement is generally within the estimated uncertainties when appropriate data screening is applied. We then derive a global seasonal climatology of free-tropospheric NO2 VMR in cloudy conditions. Enhanced NO2 in the free troposphere commonly appears near polluted urban locations, where NO2 produced in the boundary layer may be transported vertically out of the boundary layer and then horizontally away from the source. Signatures of lightning NO2 also appear throughout low- and middle-latitude regions in summer months. A profile analysis of our cloud-slicing data indicates signatures of lightning-generated NO2 in the upper troposphere. Comparison of the climatology with simulations from the Global Modeling Initiative (GMI) for cloudy conditions (cloud optical depth less than 10) shows similarities in the spatial patterns of continental pollution outflow. However, there are also some differences in the seasonal variation of free-tropospheric NO2 VMRs near highly populated regions and in areas affected by lightning-generated NOx.
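    The linear-fit step at the heart of this cloud-slicing approach can be illustrated with a short sketch: regress above-cloud VCD against cloud scene pressure over a sample of nearby pixels, then convert the slope to a VMR by dividing by the number of air molecules per cm2 per hPa of pressure thickness. The function name and the synthetic "orbit sample" below are hypothetical, and the unit constant assumes dry air and standard gravity; the intercept of the fit absorbs the column above the highest cloud tops.

```python
import numpy as np

M_AIR = 0.02896   # kg/mol, mean molar mass of dry air (assumed)
G0 = 9.80665      # m/s^2, standard gravity (assumed)
N_A = 6.02214e23  # 1/mol
# Air molecules per cm^2 per hPa of pressure thickness (~2.12e22):
# dN/dp = N_A / (M_air * g), converted from m^-2 Pa^-1 to cm^-2 hPa^-1.
AIR_COL_PER_HPA = N_A / (M_AIR * G0) * 1e-4 * 100.0

def cloud_slice_vmr(cloud_pressure_hpa, above_cloud_vcd):
    """Fit above-cloud VCD (molecules/cm^2) vs. cloud scene pressure
    (hPa); the slope, divided by the air column per hPa, is the VMR."""
    slope, _intercept = np.polyfit(cloud_pressure_hpa, above_cloud_vcd, 1)
    return slope / AIR_COL_PER_HPA

# Synthetic sample: 100 pptv NO2 well mixed across the cloud-top range.
rng = np.random.default_rng(0)
p = rng.uniform(400.0, 800.0, 50)        # cloud scene pressures (hPa)
true_vmr = 100e-12                        # 100 pptv
vcd = true_vmr * AIR_COL_PER_HPA * p + rng.normal(0.0, 1e13, 50)
vmr = cloud_slice_vmr(p, vcd)             # recovers ~100 pptv
```

Real retrievals add extensive screening (cloud fraction, pressure range, fit quality) before trusting the slope; this sketch shows only why the slope is proportional to the mixing ratio.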

  8. Global Free Tropospheric NO2 Abundances Derived Using a Cloud Slicing Technique Applied to Satellite Observations from the Aura Ozone Monitoring Instrument (OMI)

    NASA Technical Reports Server (NTRS)

    Choi, S.; Joiner, J.; Choi, Y.; Duncan, B. N.; Bucsela, E.

    2014-01-01

    We derive free-tropospheric NO2 volume mixing ratios (VMRs) and stratospheric column amounts of NO2 by applying a cloud slicing technique to data from the Ozone Monitoring Instrument (OMI) on the Aura satellite. In the cloud-slicing approach, the slope of the above-cloud NO2 column versus the cloud scene pressure is proportional to the NO2 VMR. In this work, we use a sample of nearby OMI pixel data from a single orbit for the linear fit. The OMI data include cloud scene pressures from the rotational-Raman algorithm and above-cloud NO2 vertical column density (VCD) (defined as the NO2 column from the cloud scene pressure to the top-of-the-atmosphere) from a differential optical absorption spectroscopy (DOAS) algorithm. Estimates of stratospheric column NO2 are obtained by extrapolating the linear fits to the tropopause. We compare OMI-derived NO2 VMRs with in situ aircraft profiles measured during the NASA Intercontinental Chemical Transport Experiment Phase B (INTEX-B) campaign in 2006. The agreement is generally within the estimated uncertainties when appropriate data screening is applied. We then derive a global seasonal climatology of free-tropospheric NO2 VMR in cloudy conditions. Enhanced NO2 in the free troposphere commonly appears near polluted urban locations where NO2 produced in the boundary layer may be transported vertically out of the boundary layer and then horizontally away from the source. Signatures of lightning NO2 are also shown throughout low and middle latitude regions in summer months. A profile analysis of our cloud slicing data indicates signatures of uplifted and transported anthropogenic NO2 in the middle troposphere as well as lightning-generated NO2 in the upper troposphere. Comparison of the climatology with simulations from the Global Modeling Initiative (GMI) for cloudy conditions (cloud optical thicknesses > 10) shows similarities in the spatial patterns of continental pollution outflow. 
However, there are also some differences in the seasonal variation of free-tropospheric NO2 VMRs near highly populated regions and in areas affected by lightning-generated NOx. Stratospheric column NO2 obtained from cloud slicing agrees well with other independently-generated estimates, providing further confidence in the free-tropospheric results.

  9. Evaluation of multi-layer cloud detection based on MODIS CO2-slicing algorithm with CALIPSO-CloudSat measurements.

    NASA Astrophysics Data System (ADS)

    Viudez-Mora, A.; Kato, S.; Smith, W. L., Jr.; Chang, F. L.

    2016-12-01

    Knowledge of the vertical cloud distribution is important for a variety of climate and weather applications. Cloud overlapping variations greatly influence the atmospheric heating/cooling rates, with implications for the surface-troposphere radiative balance, global circulation, and precipitation. Additionally, accurate real-time knowledge of the multi-layer cloud distribution can be used in applications such as ensuring safe aviation through storms and adverse weather conditions. In this study, we evaluate a multi-layered cloud algorithm (MCF; Chang et al. 2005) based on MODIS measurements aboard the Aqua satellite. This algorithm uses the CO2-slicing technique combined with cloud properties determined from VIS, IR, and NIR channels to locate high thin clouds over low-level clouds and to retrieve the τ of each layer. We use CALIPSO (Winker et al. 2010) and CloudSat (Stephens et al. 2002) (CLCS) derived cloud vertical profiles included in the C3M data product (Kato et al. 2010) to evaluate MCF-derived multi-layer cloud properties. We focus on two-layer overlapping and single-layer clouds identified by the active sensors and investigate how well these systems are identified by the MODIS multi-layer technique. The results show that for the multi-layered clouds identified by CLCS, the MCF correctly identifies about 83% of the cases as multi-layer. However, the upper CTH is underestimated by about 2.6±0.4 km, because the CO2-slicing technique is not as sensitive to the physical cloud top as the CLCS. The lower CTH agrees better, with differences of about 1.2±0.5 km. Another outstanding issue for the MCF approach is the large number of multi-layer false alarms that occur in single-layer conditions. References: Chang, F.-L., and Z. Li, 2005: A new method for detection of cirrus overlapping water clouds and determination of their optical properties. J. Atmos. Sci., 62. Kato, S., et al., 2010: Relationships among cloud occurrence frequency, overlap, and effective thickness derived from CALIPSO and CloudSat merged cloud vertical profiles. J. Geophys. Res., 115. Stephens, G. L., et al., 2002: The CloudSat mission and the A-Train. Bull. Am. Meteorol. Soc., 83. Winker, D. M., et al., 2010: The CALIPSO mission: A global 3D view of aerosols and clouds. Bull. Amer. Meteor. Soc., 91.

  10. "Cloud Slicing" : A New Technique to Derive Tropospheric Ozone Profile Information from Satellite Measurements

    NASA Technical Reports Server (NTRS)

    Ziemke, J. R.; Chandra, S.; Bhartia, P. K.; Einaudi, Franco (Technical Monitor)

    2000-01-01

    A new technique denoted cloud slicing has been developed for estimating tropospheric ozone profile information. All previous methods using satellite data were capable of estimating only the total column of ozone in the troposphere. Cloud slicing takes advantage of the opacity of water clouds to ultraviolet-wavelength radiation. Measurements of above-cloud column ozone from the Nimbus 7 Total Ozone Mapping Spectrometer (TOMS) instrument are combined with Nimbus 7 Temperature Humidity Infrared Radiometer (THIR) cloud-top pressure data to derive ozone column amounts in the upper troposphere. In this study tropical TOMS and THIR data for the period 1979-1984 are analyzed. By combining total tropospheric column ozone (denoted TCO) measurements from the convective cloud differential (CCD) method with 100-400 hPa upper tropospheric column ozone amounts from cloud slicing, it is possible to estimate 400-1000 hPa lower tropospheric column ozone and evaluate its spatial and temporal variability. Results for both the upper and lower tropical troposphere show a year-round zonal wavenumber 1 pattern in column ozone with the largest amounts in the Atlantic region (up to approx. 15 DU in the 100-400 hPa pressure band and approx. 25-30 DU in the 400-1000 hPa pressure band). Upper tropospheric ozone derived from cloud slicing shows maximum column amounts in the Atlantic region in the June-August and September-November seasons, similar to the seasonal variability of CCD-derived TCO in the region. For the lower troposphere, the largest column amounts occur in the September-November season over Brazil in South America and also over southern Africa. Localized increases in lower tropospheric ozone in the tropics are found over the northern region of South America around August and off the west coast of equatorial Africa in the March-May season.
    Time series analyses for several regions in South America and Africa show an anomalous increase in ozone in the lower troposphere around the month of March which is not observed in the upper troposphere. The eastern Pacific shows weak seasonal variability of upper, lower, and total tropospheric ozone compared to the western Pacific, which shows the largest TCO amounts in both hemispheres around the spring months. Ozone in the western Pacific is expected to show greater variability because of strong convection, pollution and biomass burning, land/sea contrast, and monsoon developments.
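    The residual step this abstract describes is simple arithmetic: the 400-1000 hPa lower-tropospheric column is the CCD-derived total tropospheric column minus the 100-400 hPa cloud-slicing column. A toy illustration, using round numbers of the same magnitude as the Atlantic values quoted above (the function name and the 40 DU total are hypothetical):

```python
def lower_trop_ozone_du(tco_du, upper_100_400_du):
    """400-1000 hPa column ozone, in Dobson units (DU), as the residual
    of CCD total tropospheric column minus the 100-400 hPa column
    obtained from cloud slicing."""
    return tco_du - upper_100_400_du

# ~15 DU in the 100-400 hPa band (cloud slicing) subtracted from a
# hypothetical 40 DU total tropospheric column (CCD):
lower = lower_trop_ozone_du(40.0, 15.0)  # -> 25.0 DU
```

The residual inherits the uncertainties of both inputs, which is why the abstract evaluates its spatial and temporal variability rather than single values.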

  11. Properties of CIRRUS Overlapping Clouds as Deduced from the GOES-12 Imagery Data

    NASA Technical Reports Server (NTRS)

    Chang, Fu-Lung; Minnis, Patrick; Lin, Bing; Sun-Mack, Sunny; Khaiyer, Mandana

    2006-01-01

    Understanding the impact of cirrus clouds in modifying both the reflected solar and emitted terrestrial radiation is crucial for climate studies. Unlike most boundary layer stratus and stratocumulus clouds, which have a net cooling effect on the climate, high-level thin cirrus clouds can have a warming effect on our climate. Many research efforts have been devoted to retrieving cirrus cloud properties due to their ubiquitous presence. However, using satellite observations to detect and/or retrieve cirrus cloud properties faces two major challenges. First, cirrus clouds are often semitransparent at visible to infrared wavelengths; second, they often occur over a lower cloud system. The overlapping of high-level cirrus and low-level stratus clouds poses a difficulty in determining the individual cloud top altitudes and optical properties, especially when the signals from cirrus clouds are overwhelmed by the signals of stratus clouds. Moreover, the operational satellite retrieval algorithms, which often assume only a single cloud layer in the development of cloud retrieval techniques, cannot resolve the cloud overlapping situation properly. The new geostationary satellites, starting with the Twelfth Geostationary Operational Environmental Satellite (GOES-12), are providing a new suite of imager bands in which the conventional 12-micron channel is replaced with a 13.3-micron CO2 absorption channel. The new 13.3-micron channel allows for the application of a CO2-slicing retrieval technique (Chahine et al. 1974; Smith and Platt 1978), one of the important passive satellite methods for remotely sensing the altitudes of mid- to high-level clouds. The CO2-slicing technique is more effective in detecting semitransparent cirrus clouds than the conventional infrared-window method.

  12. Upper Tropospheric Ozone Between Latitudes 60S and 60N Derived from Nimbus 7 TOMS/THIR Cloud Slicing

    NASA Technical Reports Server (NTRS)

    Ziemke, Jerald R.; Chandra, Sushil; Bhartia, P. K.

    2002-01-01

    This study evaluates the spatial distributions and seasonal cycles of upper tropospheric ozone (pressure range 200-500 hPa) from low to high latitudes (60S to 60N) derived from the satellite retrieval method called "Cloud Slicing." Cloud Slicing is a unique technique for determining ozone profile information in the troposphere by combining co-located measurements of cloud-top pressure and above-cloud column ozone. For upper tropospheric ozone, co-located measurements of Nimbus 7 Total Ozone Mapping Spectrometer (TOMS) above-cloud column ozone and Nimbus 7 Temperature Humidity Infrared Radiometer (THIR) cloud-top pressure during 1979-1984 were used. In the tropics, upper tropospheric ozone shows year-round enhancement in the Atlantic region and evidence of a possible semiannual variability. Upper tropospheric ozone outside the tropics shows the greatest abundance in the winter and spring seasons in both hemispheres, with the largest seasonal variability and largest amounts in the NH. These characteristics are similar to those of lower stratospheric ozone. Comparisons of upper tropospheric column ozone with both stratospheric ozone and a proxy of lower stratospheric air mass (i.e., tropopause pressure) from the National Centers for Environmental Prediction (NCEP) suggest that stratosphere-troposphere exchange (STE) may be a significant source for the seasonal variability of upper tropospheric ozone almost everywhere between 60S and 60N, except at low latitudes around 10S to 25N where other sources (e.g., tropospheric transport, biomass burning, aerosol effects, lightning, etc.) may have a greater role.

  13. Cloud Overlapping Detection Algorithm Using Solar and IR Wavelengths With GOES Data Over ARM/SGP Site

    NASA Technical Reports Server (NTRS)

    Kawamoto, Kazuaki; Minnis, Patrick; Smith, William L., Jr.

    2001-01-01

    One of the most perplexing problems in satellite cloud remote sensing is the overlapping of cloud layers. Although most techniques assume a 1-layer cloud system in a given retrieval of cloud properties, many observations are affected by radiation from more than one cloud layer. As such, cloud overlap can cause errors in the retrieval of many properties including cloud height, optical depth, phase, and particle size. A variety of methods have been developed to identify overlapped clouds in a given satellite imager pixel. Baum et al. (1995) used CO2 slicing and a spatial coherence method to demonstrate a possible analysis method for nighttime detection of multilayered clouds. Jin and Rossow (1997) also used a multispectral CO2 slicing technique for a global analysis of overlapped cloud amount. Lin et al. (1999) used a combination of infrared, visible, and microwave data to detect overlapped clouds over water. Recently, Baum and Spinhirne (2000) proposed a 1.6- and 11-micron bispectral threshold method. While all of these methods have made progress in solving this stubborn problem, none have yet proven satisfactory for continuous and consistent monitoring of multilayer cloud systems. It is clear that detection of overlapping clouds from passive instruments such as satellite radiometers is in an immature stage of development and requires additional research. Overlapped cloud systems also affect the retrievals of cloud properties over the ARM domains (e.g., Minnis et al. 1998) and hence should be identified as accurately as possible. To reach this goal, it is necessary to determine which information can be exploited for detecting multilayered clouds from operational meteorological satellite data used by ARM.
    This paper examines the potential information in spectral data available on the Geostationary Operational Environmental Satellite (GOES) imager and the NOAA Advanced Very High Resolution Radiometer (AVHRR) used over the ARM SGP and NSA sites to study the capability of detecting overlapping clouds.

  14. Cloud Overlapping Detection Algorithm Using Solar and IR Wavelengths with GOES Data Over ARM/SGP Site

    NASA Technical Reports Server (NTRS)

    Kawamoto, K.; Minnis, P.; Smith, W. L., Jr.

    2001-01-01

    One of the most perplexing problems in satellite cloud remote sensing is the overlapping of cloud layers. Although most techniques assume a one-layer cloud system in a given retrieval of cloud properties, many observations are affected by radiation from more than one cloud layer. As such, cloud overlap can cause errors in the retrieval of many properties including cloud height, optical depth, phase, and particle size. A variety of methods have been developed to identify overlapped clouds in a given satellite imager pixel. Baum et al. used CO2 slicing and a spatial coherence method to demonstrate a possible analysis method for nighttime detection of multilayered clouds. Jin and Rossow also used a multispectral CO2 slicing technique for a global analysis of overlapped cloud amount. Lin et al. used a combination of infrared (IR), visible (VIS), and microwave data to detect overlapped clouds over water. Recently, Baum and Spinhirne proposed a 1.6- and 11-micron bispectral threshold method. While all of these methods have made progress in solving this stubborn problem, none have yet proven satisfactory for continuous and consistent monitoring of multilayer cloud systems. It is clear that detection of overlapping clouds from passive instruments such as satellite radiometers is in an immature stage of development and requires additional research. Overlapped cloud systems also affect the retrievals of cloud properties over the Atmospheric Radiation Measurement (ARM) domains and hence should be identified as accurately as possible. To reach this goal, it is necessary to determine which information can be exploited for detecting multilayered clouds from operational meteorological satellite data used by ARM.
    This paper examines the potential information in spectral data available on the Geostationary Operational Environmental Satellite (GOES) imager and the National Oceanic and Atmospheric Administration (NOAA) Advanced Very High Resolution Radiometer (AVHRR) used over the ARM Program's Southern Great Plains (SGP) and North Slope of Alaska (NSA) sites to study the capability of detecting overlapping clouds.

  15. Detection and Retrieval of Multi-Layered Cloud Properties Using Satellite Data

    NASA Technical Reports Server (NTRS)

    Minnis, Patrick; Sun-Mack, Sunny; Chen, Yan; Yi, Helen; Huang, Jian-Ping; Nguyen, Louis; Khaiyer, Mandana M.

    2005-01-01

    Four techniques for detecting multilayered clouds and retrieving the cloud properties using satellite data are explored to help address the need for better quantification of cloud vertical structure. A new technique was developed using multispectral imager data with secondary imager products (infrared brightness temperature differences, BTD). The other methods examined here use atmospheric sounding data (CO2-slicing, CO2), BTD, or microwave radiometer (MWR) data. The CO2 and BTD methods are limited to optically thin cirrus over low clouds, while the MWR methods are limited to ocean areas only. This paper explores the use of the BTD and CO2 methods as applied to Moderate Resolution Imaging Spectroradiometer (MODIS) and Advanced Microwave Scanning Radiometer EOS (AMSR-E) data taken from the Aqua satellite over ocean surfaces. Cloud properties derived from MODIS data for the Clouds and the Earth's Radiant Energy System (CERES) Project are used to classify cloud phase and optical properties. The preliminary results focus on a MODIS image taken off the Uruguayan coast. The combined microwave visible infrared (MVI) method is taken as the reference for detecting multilayered ice-over-water clouds. The BTD and CO2 techniques match the MVI classifications in only 51% and 41% of the cases, respectively. Much additional study is needed to determine the uncertainties in the MVI method and to analyze many more overlapped cloud scenes.

  16. Detection and retrieval of multi-layered cloud properties using satellite data

    NASA Astrophysics Data System (ADS)

    Minnis, Patrick; Sun-Mack, Sunny; Chen, Yan; Yi, Helen; Huang, Jianping; Nguyen, Louis; Khaiyer, Mandana M.

    2005-10-01

    Four techniques for detecting multilayered clouds and retrieving the cloud properties using satellite data are explored to help address the need for better quantification of cloud vertical structure. A new technique was developed using multispectral imager data with secondary imager products (infrared brightness temperature differences, BTD). The other methods examined here use atmospheric sounding data (CO2-slicing, CO2), BTD, or microwave radiometer (MWR) data. The CO2 and BTD methods are limited to optically thin cirrus over low clouds, while the MWR methods are limited to ocean areas only. This paper explores the use of the BTD and CO2 methods as applied to Moderate Resolution Imaging Spectroradiometer (MODIS) and Advanced Microwave Scanning Radiometer EOS (AMSR-E) data taken from the Aqua satellite over ocean surfaces. Cloud properties derived from MODIS data for the Clouds and the Earth's Radiant Energy System (CERES) Project are used to classify cloud phase and optical properties. The preliminary results focus on a MODIS image taken off the Uruguayan coast. The combined microwave visible infrared (MVI) method is taken as the reference for detecting multilayered ice-over-water clouds. The BTD and CO2 techniques match the MVI classifications in only 51% and 41% of the cases, respectively. Much additional study is needed to determine the uncertainties in the MVI method and to analyze many more overlapped cloud scenes.

  17. Evaluation of Passive Multilayer Cloud Detection Using Preliminary CloudSat and CALIPSO Cloud Profiles

    NASA Astrophysics Data System (ADS)

    Minnis, P.; Sun-Mack, S.; Chang, F.; Huang, J.; Nguyen, L.; Ayers, J. K.; Spangenberg, D. A.; Yi, Y.; Trepte, C. R.

    2006-12-01

    During the last few years, several algorithms have been developed to detect and retrieve multilayered clouds using passive satellite data. Assessing these techniques has been difficult due to the need for active sensors such as cloud radars and lidars that can "see" through different layers of clouds. Such sensors have been available only at a few surface sites and on aircraft during field programs. With the launch of the CALIPSO and CloudSat satellites on April 28, 2006, it is now possible to observe multilayered systems all over the globe using collocated cloud radar and lidar data. As part of the A-Train, these new active sensors are also matched in time and space with passive measurements from the Aqua Moderate Resolution Imaging Spectroradiometer (MODIS) and Advanced Microwave Scanning Radiometer - EOS (AMSR-E). The Clouds and the Earth's Radiant Energy System (CERES) team has been developing and testing algorithms to detect ice-over-water overlapping cloud systems and to retrieve the cloud liquid water path (LWP) and ice water path (IWP) for those systems. One technique uses a combination of the CERES cloud retrieval algorithm applied to MODIS data and a microwave retrieval method applied to AMSR-E data. The combination of a CO2-slicing cloud retrieval technique with the CERES algorithms applied to MODIS data (Chang et al., 2005) is used to detect and analyze overlapped systems that contain thin ice clouds. A third technique uses brightness temperature differences and the CERES algorithms to detect similar overlapped systems. This paper uses preliminary CloudSat and CALIPSO data to begin a global-scale assessment of these different methods. The long-term goals are to assess and refine the algorithms to aid the development of an optimal combination of the techniques to better monitor ice and liquid water clouds in overlapped conditions.

  18. Evaluation of Passive Multilayer Cloud Detection Using Preliminary CloudSat and CALIPSO Cloud Profiles

    NASA Astrophysics Data System (ADS)

    Minnis, P.; Sun-Mack, S.; Chang, F.; Huang, J.; Nguyen, L.; Ayers, J. K.; Spangenberg, D. A.; Yi, Y.; Trepte, C. R.

    2005-05-01

    During the last few years, several algorithms have been developed to detect and retrieve multilayered clouds using passive satellite data. Assessing these techniques has been difficult due to the need for active sensors such as cloud radars and lidars that can "see" through different layers of clouds. Such sensors have been available only at a few surface sites and on aircraft during field programs. With the launch of the CALIPSO and CloudSat satellites on April 28, 2006, it is now possible to observe multilayered systems all over the globe using collocated cloud radar and lidar data. As part of the A-Train, these new active sensors are also matched in time and space with passive measurements from the Aqua Moderate Resolution Imaging Spectroradiometer (MODIS) and Advanced Microwave Scanning Radiometer - EOS (AMSR-E). The Clouds and the Earth's Radiant Energy System (CERES) project has been developing and testing algorithms to detect ice-over-water overlapping cloud systems and to retrieve the cloud liquid water path (LWP) and ice water path (IWP) for those systems. One technique uses a combination of the CERES cloud retrieval algorithm applied to MODIS data and a microwave retrieval method applied to AMSR-E data. The combination of a CO2-slicing cloud retrieval technique with the CERES algorithms applied to MODIS data (Chang et al., 2005) is used to detect and analyze such overlapped systems that contain thin ice clouds. A third technique uses brightness temperature differences and the CERES algorithms to detect similar overlapped systems. This paper uses preliminary CloudSat and CALIPSO data to begin a global-scale assessment of these different methods. The long-term goals are to assess and refine the algorithms to aid the development of an optimal combination of the techniques to better monitor ice and liquid water clouds in overlapped conditions.

  19. Volcanic Ash Cloud Altitude retrievals from passive satellite sensors: the 03-09 December 2015 Etna eruption.

    NASA Astrophysics Data System (ADS)

    corradini, stefano; merucci, luca; guerrieri, lorenzo; pugnaghi, sergio; mcgarragh, greg; carboni, elisa; ventress, lucy; grainger, roy; scollo, simona; pardini, federica; zaksek, klemen; langmann, baerbel; bancalá, severin; stelitano, dario

    2016-04-01

    The volcanic ash cloud altitude is one of the most important parameters needed for volcanic ash cloud estimations (mass, effective radius and optical depth). It is needed by modelers to initialize ash cloud transport models, and by volcanologists to gain insight into eruption dynamics. Moreover, it is extremely important for reducing the disruption to flights caused by volcanic activity whilst still ensuring safe travel. In this work, the volcanic ash cloud altitude is computed from remote sensing passive satellite data (SEVIRI, MODIS, IASI and MISR) using most of the existing retrieval techniques. A novel approach, based on the CO2 slicing procedure, is also shown. Comparisons among the different techniques are presented and their advantages and drawbacks emphasized. As test cases, the Etna eruptions in the period between 03 and 09 December 2015 are considered. During this time four lava fountain events occurred at the Voragine crater, forming eruption columns higher than 12 km asl and producing copious tephra fallout on the volcano flanks. These events, among the biggest of the last 20 years, generated emissions that reached the stratosphere and underwent circum-global transport throughout the northern hemisphere.

  20. A Comparison of High Spectral Resolution Infrared Cloud-Top Pressure Altitude Algorithms Using S-HIS Measurements

    NASA Technical Reports Server (NTRS)

    Holz, Robert E.; Ackerman, Steve; Antonelli, Paolo; Nagle, Fred; McGill, Matthew; Hlavka, Dennis L.; Hart, William D.

    2005-01-01

    This paper presents a comparison of cloud-top altitude retrieval methods applied to S-HIS (Scanning High Resolution Interferometer Sounder) measurements. Included in this comparison is an improvement to the traditional CO2 Slicing method. The new method, CO2 Sorting, determines optimal channel pairs to which to apply CO2 Slicing. Measurements from collocated samples of the Cloud Physics Lidar (CPL) and MODIS Airborne Simulator (MAS) instruments assist in the comparison. For optically thick clouds, good correlation between the S-HIS and lidar cloud-top retrievals is found. For tenuous ice clouds there can be large differences between lidar (CPL) and S-HIS retrieved cloud tops. It is found that CO2 Sorting significantly reduces the cloud height biases for optically thin clouds (total optical depths less than 1.0). For geometrically thick but optically thin cirrus clouds, large differences between the S-HIS infrared cloud-top retrievals and the CPL-detected cloud top were found. For these cases the cloud height retrieved by S-HIS correlated closely with the level at which the CPL integrated cloud optical depth reached approximately 1.0.

  1. Limitations on Space-based Air Fluorescence Detector Apertures obtained from IR Cloud Measurements

    NASA Technical Reports Server (NTRS)

    White, Nicholas E. (Technical Monitor); Krizmanic, John; Sokolsky, Pierre; Streitmatter, Robert

    2003-01-01

    The presence of clouds between an airshower and a space-based detector can dramatically alter the measured signal characteristics due to absorption and scattering of the photonic signals. Furthermore, knowledge of the cloud cover in the observed atmosphere is needed to determine the instantaneous aperture of such a detector. Before exploring the complex nature of cloud-airshower interactions, we examine a simpler issue. We investigate the fraction of ultra-high energy cosmic ray events that may be expected to occur in volumes of the viewed atmosphere that are not obscured by clouds. To this end, we use space-based IR data in concert with Monte Carlo simulated 10(exp 20) eV airshowers to determine the acceptable event fractions. Earth-observing instruments, such as MODIS, measure detailed cloud configurations via a CO2-slicing technique that can be used to determine cloud-top altitudes over large areas. Thus, events can be accepted if their observed 3-dimensional endpoints occur above low clouds as well as in areas of cloud-free atmosphere. An initial analysis has determined that by accepting airshowers that occur above low clouds, the unobscured acceptance can be increased by approximately a factor of 3 over that obtained using a cloud-free criterion.
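
    The acceptance criterion described above can be sketched as a toy Monte Carlo. Everything here (the function name, the uniform altitude distributions, the probabilities) is an illustrative stand-in, not the paper's actual simulation:

```python
import random

def accepted_fraction(n_events, p_cloud, cloud_top_km, shower_floor_km,
                      accept_above_low_clouds=True, seed=0):
    """Toy acceptance Monte Carlo: keep an event if its pixel is
    cloud-free, or (optionally) if the lowest point of the observed
    shower track lies above the IR-derived cloud-top altitude."""
    rng = random.Random(seed)
    kept = 0
    for _ in range(n_events):
        if rng.random() >= p_cloud:                  # cloud-free pixel
            kept += 1
        elif accept_above_low_clouds:
            top = rng.uniform(*cloud_top_km)         # cloud-top altitude
            floor = rng.uniform(*shower_floor_km)    # shower track endpoint
            if floor > top:
                kept += 1
    return kept / n_events
```

Comparing the relaxed criterion against the strict cloud-free one shows the kind of acceptance gain the abstract reports, though the exact factor depends entirely on the assumed distributions.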

  2. Spatiotemporal Visualization of Time-Series Satellite-Derived CO2 Flux Data Using Volume Rendering and Gpu-Based Interpolation on a Cloud-Driven Digital Earth

    NASA Astrophysics Data System (ADS)

    Wu, S.; Yan, Y.; Du, Z.; Zhang, F.; Liu, R.

    2017-10-01

    The ocean carbon cycle has a significant influence on global climate, and is commonly evaluated using time-series satellite-derived CO2 flux data. Location-aware and globe-based visualization is an important technique for analyzing and presenting the evolution of climate change. To achieve realistic simulation of the spatiotemporal dynamics of ocean carbon, a cloud-driven digital earth platform is developed to support the interactive analysis and display of multi-geospatial data, and an original visualization method based on our digital earth is proposed to demonstrate the spatiotemporal variations of carbon sinks and sources using time-series satellite data. Specifically, a volume rendering technique using half-angle slicing and a particle system is implemented to dynamically display the released or absorbed CO2 gas. To enable location-aware visualization within the virtual globe, we present a 3D particle-mapping algorithm to render particle-slicing textures onto geospace. In addition, a GPU-based interpolation framework using CUDA during real-time rendering is designed to obtain smooth effects in both the spatial and temporal dimensions. To demonstrate the capabilities of the proposed method, a time series of satellite data is applied to simulate the air-sea carbon cycle in the China Sea. The results show that the suggested strategies provide realistic simulation effects and acceptable interactive performance on the digital earth.

  3. CloudSat Takes a 3D Slice of Hurricane Matthew

    NASA Image and Video Library

    2016-10-07

    NASA's CloudSat flew east of Hurricane Matthew's center on Oct. 6 at 11:30 a.m. PDT (2:30 p.m. EDT), intersecting parts of Matthew's outer rain bands and revealing Matthew's anvil clouds (thick cirrus cloud cover), with cumulus and cumulonimbus clouds beneath (lower image). Reds/pinks are larger water/ice droplets. http://photojournal.jpl.nasa.gov/catalog/PIA21095

  4. A novel technique for evaluating the volcanic cloud top altitude using GPS Radio Occultation data

    NASA Astrophysics Data System (ADS)

    Biondi, Riccardo; Corradini, Stefano; Guerrieri, Lorenzo; Merucci, Luca; Stelitano, Dario; Pugnaghi, Sergio

    2017-04-01

    Volcanic ash and sulfuric gases are a major hazard to aviation since they can damage aircraft engines even at large distances from the eruption. Many challenges posed by explosive volcanic eruptions are still under discussion and several issues are far from solved. The cloud top altitude can be detected with different techniques, but the accuracy is still quite coarse. This parameter tells air traffic which altitudes can be expected to be ash free, and it assumes a key role in the contribution of the eruption to climate change. Moreover, the cloud top altitude is also strictly related to the mass ejected by the eruption and represents a key parameter for ash and SO2 retrievals using several techniques. The Global Positioning System (GPS) Radio Occultation (RO) technique enables real-time measurement of atmospheric density structure in any meteorological condition, in remote areas and during extreme atmospheric events, with high vertical resolution and accuracy, and this makes RO an interesting tool for this kind of study. In this study we have tracked the Eyjafjöll 2010 eruption by using MODIS satellite measurements and retrieved the volcanic cloud top altitudes by using two different procedures exploiting the thermal infrared CO2 absorption bands around 13.4 micrometers. The first approach is a modification of the standard CO2 slicing method while the second is based on look-up table computations. We have then selected all the RO profiles co-located with the volcanic cloud and implemented an algorithm based on the variation of the bending angle for detecting the cloud top altitude with high accuracy. The results of the comparison between the MODIS and RO volcanic height retrievals are encouraging and suggest that, due to their independence from weather conditions and their high vertical resolution, RO observations can contribute to improved detection and monitoring of volcanic clouds and support warning systems.

  5. Automatic segmentation of coronary arteries from computed tomography angiography data cloud using optimal thresholding

    NASA Astrophysics Data System (ADS)

    Ansari, Muhammad Ahsan; Zai, Sammer; Moon, Young Shik

    2017-01-01

    Manual analysis of the bulk data generated by computed tomography angiography (CTA) is time consuming, and interpretation of such data requires the prior knowledge and expertise of the radiologist. Therefore, an automatic method that can isolate the coronary arteries from a given CTA dataset is required. We present an automatic yet effective segmentation method to delineate the coronary arteries from a three-dimensional CTA data cloud. Instead of a region-growing process, which is usually time consuming and prone to leakages, the method is based on optimal thresholding, applied to the Hessian-based vesselness measure in a localized way (slice by slice) to track the coronaries carefully to their distal ends. Moreover, to make the process automatic, we detect the aorta using the Hough transform technique. The proposed segmentation method is independent of the starting point used to initiate its process and is fast in the sense that the coronary arteries are obtained without any preprocessing or postprocessing steps. We used 12 real clinical datasets to show the efficiency and accuracy of the presented method. Experimental results reveal that the proposed method achieves 95% average accuracy.
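
    The slice-by-slice thresholding idea can be sketched as follows. Otsu's between-class-variance criterion is used here as a stand-in for the paper's "optimal thresholding", and the vesselness volume is assumed to be precomputed; both function names are ours:

```python
import numpy as np

def otsu_threshold(values, nbins=256):
    """Minimal Otsu threshold: pick the bin center that maximizes
    between-class variance of the value histogram."""
    hist, edges = np.histogram(values, bins=nbins)
    p = hist / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(p)                      # class-0 probability
    mu = np.cumsum(p * centers)            # class-0 cumulative mean
    mu_t = mu[-1]                          # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * w0 - mu) ** 2 / (w0 * (1.0 - w0))
    return centers[int(np.nanargmax(sigma_b))]

def segment_slice_by_slice(vesselness_volume):
    """Threshold the vesselness measure per axial slice rather than
    with one global cutoff, as the abstract describes."""
    return np.stack([v > otsu_threshold(v.ravel()) for v in vesselness_volume])
```

Thresholding per slice lets the cutoff adapt as vessel contrast drops toward the distal ends, which is the stated motivation for the localized application.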

  6. Slicing Method for curved façade and window extraction from point clouds

    NASA Astrophysics Data System (ADS)

    Iman Zolanvari, S. M.; Laefer, Debra F.

    2016-09-01

    Laser scanning technology is a fast and reliable method to survey structures. However, the automatic conversion of such data into solid models for computation remains a major challenge, especially where non-rectilinear features are present. Since openings and the overall dimensions of a building are the most critical elements in computational models for structural analysis, this article introduces the Slicing Method as a new, computationally efficient method for extracting overall façade and window boundary points and reconstructing a façade into a geometry compatible with computational modelling. After finding a principal plane, the technique slices a façade into limited portions, with each slice representing a unique, imaginary section passing through the building. This is done along a façade's principal axes to segregate window and door openings from structural portions of the load-bearing masonry walls. The method detects each opening area's boundaries, as well as the overall boundary of the façade, in part by using a one-dimensional projection to accelerate processing. Slice densities were optimised as 14.3 slices per vertical metre of building and 25 slices per horizontal metre, irrespective of building configuration or complexity. The proposed procedure was validated by its application to three highly decorative, historic brick buildings. Accuracy in excess of 93% was achieved with no manual intervention on highly complex buildings and nearly 100% on simple ones. Furthermore, computational times were less than 3 sec for datasets of up to 2.6 million points, while similar existing approaches required more than 16 hr for such datasets.
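
    The one-dimensional projection step can be sketched as follows: project the points of one horizontal slice onto the façade's principal axis and treat runs of empty histogram bins as candidate openings. The function name, bin width, and gap threshold are illustrative choices, not values from the article:

```python
import numpy as np

def find_openings_1d(slice_x, facade_min, facade_max,
                     bin_width=0.1, min_gap_bins=2):
    """Histogram the projected coordinates of one facade slice; runs of
    empty bins inside the facade extent are candidate window/door
    openings, returned as (start, end) intervals in metres."""
    nbins = int(np.ceil((facade_max - facade_min) / bin_width))
    counts, edges = np.histogram(slice_x, bins=nbins,
                                 range=(facade_min, facade_max))
    empty = counts == 0
    openings, start = [], None
    for i, is_empty in enumerate(empty):
        if is_empty and start is None:
            start = i                      # gap begins
        elif not is_empty and start is not None:
            if i - start >= min_gap_bins:  # ignore single-bin noise gaps
                openings.append((edges[start], edges[i]))
            start = None
    return openings
```

Repeating this per slice, along both principal axes, yields the opening boundaries that the method then assembles into a façade geometry.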

  7. The topology of large-scale structure. VI - Slices of the universe

    NASA Astrophysics Data System (ADS)

    Park, Changbom; Gott, J. R., III; Melott, Adrian L.; Karachentsev, I. D.

    1992-03-01

    Results of an investigation of the topology of large-scale structure in two observed slices of the universe are presented. Both slices pass through the Coma cluster and their depths are 100 and 230/h Mpc. The present topology study shows that the largest void in the CfA slice is divided into two smaller voids by a statistically significant line of galaxies. The topology of toy models like the white noise and bubble models is shown to be inconsistent with that of the observed slices. A large N-body simulation of the biased cold dark matter model was made, and the observed slices are simulated by matching their selection functions and boundary conditions. The genus curves for these simulated slices are spongelike and have a small shift in the direction of a meatball topology, like those of the observed slices.

  8. The topology of large-scale structure. VI - Slices of the universe

    NASA Technical Reports Server (NTRS)

    Park, Changbom; Gott, J. R., III; Melott, Adrian L.; Karachentsev, I. D.

    1992-01-01

    Results of an investigation of the topology of large-scale structure in two observed slices of the universe are presented. Both slices pass through the Coma cluster and their depths are 100 and 230/h Mpc. The present topology study shows that the largest void in the CfA slice is divided into two smaller voids by a statistically significant line of galaxies. The topology of toy models like the white noise and bubble models is shown to be inconsistent with that of the observed slices. A large N-body simulation of the biased cold dark matter model was made, and the observed slices are simulated by matching their selection functions and boundary conditions. The genus curves for these simulated slices are spongelike and have a small shift in the direction of a meatball topology, like those of the observed slices.

  9. Cloud Ablation by a Relativistic Jet and the Extended Flare in CTA 102 in 2016 and 2017

    NASA Astrophysics Data System (ADS)

    Zacharias, M.; Böttcher, M.; Jankowsky, F.; Lenain, J.-P.; Wagner, S. J.; Wierzcholska, A.

    2017-12-01

    In late 2016 and early 2017, the flat spectrum radio quasar CTA 102 exhibited a very strong and long-lasting outburst. The event can be described by a roughly two-month-long increase of the baseline flux in the monitored energy bands (optical to γ-rays) by a factor of 8, and a subsequent decrease over another two months back to pre-flare levels. The long-term trend was superposed by short but very strong flares, resulting in a peak flux a factor of 50 above pre-flare levels in the γ-ray domain and almost a factor of 100 above pre-flare levels in the optical domain. In this paper, we explain the long-term evolution of the outburst by the ablation of a gas cloud penetrating the relativistic jet. The slice-by-slice ablation results in a gradual increase of the particle injection until the center of the cloud is reached, after which the injected number of particles decreases again. With reasonable cloud parameters, we obtain excellent fits to the long-term trend.
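
    For a spherical cloud, the rise-then-fall injection profile from slice-by-slice ablation can be illustrated geometrically: the injection rate scales with the area of the circular slice currently entering the jet. This is a toy sketch; the function name and normalization factors are ours, not the paper's model:

```python
import math

def slice_injection_rate(x, radius, density=1.0, speed=1.0):
    """Injection rate proportional to the sliced cross-section of a
    spherical cloud of radius R at penetration depth x in [0, 2R]:
    A(x) = pi * (R^2 - (x - R)^2), peaking at the cloud centre."""
    if not 0.0 <= x <= 2.0 * radius:
        return 0.0                      # cloud fully outside / consumed
    return density * speed * math.pi * (radius**2 - (x - radius)**2)
```

The profile is zero at first contact, maximal when the centre of the cloud crosses the jet boundary, and symmetric on the way out, matching the qualitative behaviour described in the abstract.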

  10. Atmospheric parameterization schemes for satellite cloud property retrieval during FIRE IFO 2

    NASA Technical Reports Server (NTRS)

    Titlow, James; Baum, Bryan A.

    1993-01-01

    Satellite cloud retrieval algorithms generally require atmospheric temperature and humidity profiles to determine such cloud properties as pressure and height. For instance, the CO2 slicing technique called the ratio method requires the calculation of theoretical upwelling radiances both at the surface and at a prescribed number (40) of atmospheric levels. This technique has been applied to data from, for example, the High Resolution Infrared Radiometer Sounder (HIRS/2, henceforth HIRS) flown aboard the NOAA series of polar orbiting satellites and the High Resolution Interferometer Sounder (HIS). In this particular study, four NOAA-11 HIRS channels in the 15-micron region are used. The ratio method may be applied to various channel combinations to estimate cloud top heights using channels in the 15-micron region. Presently, the multispectral, multiresolution (MSMR) scheme uses 4 HIRS channel combination estimates for mid- to high-level cloud pressure retrieval and Advanced Very High Resolution Radiometer (AVHRR) data for low-level (>700 mb) cloud level retrieval. In order to determine theoretical upwelling radiances, atmospheric temperature and water vapor profiles must be provided as well as profiles of other radiatively important gas absorber constituents such as CO2, O3, and CH4. The assumed temperature and humidity profiles have a large effect on transmittance and radiance profiles, which in turn are used with HIRS data to calculate cloud pressure, and thus cloud height and temperature. For large spatial scale satellite data analysis, atmospheric parameterization schemes for cloud retrieval algorithms are usually based on a gridded product such as that provided by the European Center for Medium Range Weather Forecasting (ECMWF) or the National Meteorological Center (NMC). These global, gridded products prescribe temperature and humidity profiles for a limited number of pressure levels (up to 14) in a vertical atmospheric column.
The FIRE IFO 2 experiment provides an opportunity to investigate current atmospheric profile parameterization schemes, compare satellite cloud height results using both gridded products (ECMWF) and high vertical resolution sonde data from the National Weather Service (NWS) and Cross Chain Loran Atmospheric Sounding System (CLASS), and suggest modifications in atmospheric parameterization schemes based on these results.
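
    The ratio method described above can be sketched in a few lines. This is a minimal illustration, not the operational HIRS algorithm: the function name and all radiance values are invented, and the theoretical radiances for an opaque cloud at each level are assumed precomputed from the temperature/humidity profiles:

```python
import numpy as np

def ratio_method_cloud_pressure(p_levels, opaque_rad_ch1, opaque_rad_ch2,
                                meas_ch1, meas_ch2, clear_ch1, clear_ch2):
    """CO2-slicing 'ratio method' sketch: the ratio of cloud-induced
    radiance depressions in two nearby 15-micron channels depends, to
    first order, only on cloud pressure (the unknown effective cloud
    fraction cancels in the ratio). opaque_rad_chN holds the theoretical
    upwelling radiance for an opaque cloud at each candidate level."""
    obs_ratio = (clear_ch1 - meas_ch1) / (clear_ch2 - meas_ch2)
    theo_ratio = (clear_ch1 - opaque_rad_ch1) / (clear_ch2 - opaque_rad_ch2)
    # Cloud pressure = level whose theoretical ratio best matches observation
    return p_levels[int(np.argmin(np.abs(theo_ratio - obs_ratio)))]
```

Because the cloud fraction cancels, a partially cloudy scene whose radiances are a 50/50 blend of clear and opaque-cloud values still matches the correct level's theoretical ratio, which is the central trick of CO2 slicing.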

  11. Four years of global cirrus cloud statistics using HIRS

    NASA Technical Reports Server (NTRS)

    Wylie, Donald P.; Menzel, W. Paul; Woolf, Harold M.; Strabala, Kathleen I.

    1994-01-01

    Trends in global upper-tropospheric transmissive cirrus cloud cover are beginning to emerge from a four-year cloud climatology using NOAA polar-orbiting High-Resolution Infrared Radiation Sounder (HIRS) multispectral data. Cloud occurrence, height, and effective emissivity are determined with the CO2 slicing technique on the four years of data (June 1989-May 1993). There is a global preponderance of transmissive high clouds, 42% on average; about three-fourths of these are above 500 hPa and presumed to be cirrus. In the Inter-tropical Convergence Zone (ITCZ), a high frequency of cirrus (greater than 50%) is found at all times; a modest seasonal movement tracks the sun. Large seasonal changes in cloud cover occur over the oceans in the storm belts at midlatitudes; the concentrations of these clouds migrate north and south with the seasons following the progressions of the subtropical highs (anticyclones). More cirrus is found in the summer than in the winter in each hemisphere. A significant change in cirrus cloud cover occurs in 1991, the third year of the study. Cirrus observations increase from 35% to 43% of the data, a change of eight percentage points. Other cloud forms, opaque to terrestrial radiation, decrease by nearly the same amount. Most of the increase is thinner cirrus with infrared optical depths below 0.7. The increase in cirrus happens at the same time as the 1991-92 El Niño/Southern Oscillation (ENSO) and the eruption of Mt. Pinatubo. The cirrus changes occur at the start of the ENSO and persist into 1993, in contrast to other climatic indicators that return to near pre-ENSO and pre-volcanic levels in 1993.

  12. Atmospheric CO2 Concentration Measurements with Clouds from an Airborne Lidar

    NASA Astrophysics Data System (ADS)

    Mao, J.; Abshire, J. B.; Kawa, S. R.; Riris, H.; Allan, G. R.; Hasselbrack, W. E.; Numata, K.; Chen, J. R.; Sun, X.; DiGangi, J. P.; Choi, Y.

    2017-12-01

    Globally distributed atmospheric CO2 concentration measurements with high precision, low bias and full seasonal sampling are crucial to advance carbon cycle science. However, two thirds of the Earth's surface is typically covered by clouds, and passive remote sensing approaches from space are limited to cloud-free scenes. NASA Goddard is developing a pulsed, integrated-path differential absorption (IPDA) lidar approach to measure atmospheric column CO2 concentrations, XCO2, from space as a candidate for NASA's ASCENDS mission. Measurements of time-resolved laser backscatter profiles from the atmosphere also allow this technique to estimate XCO2 and range to cloud tops in addition to the ground, with precise knowledge of the photon path length. We demonstrate this measurement capability using airborne lidar measurements from the summer 2017 ASCENDS airborne science campaign in Alaska. We show retrievals of XCO2 to the ground and to a variety of cloud tops. We also demonstrate how partial-column XCO2 to cloud tops and the cloud slicing approach help resolve vertical and horizontal gradients of CO2 in cloudy conditions. The XCO2 retrievals from the lidar are validated against in situ measurements and compared to Goddard Parameterized Chemistry Transport Model (PCTM) simulations. Adding this measurement capability to a future lidar mission for XCO2 will provide full global and seasonal data coverage and some information about the vertical structure of CO2. This unique facility is expected to benefit atmospheric transport process studies, carbon data assimilation in models, and global and regional carbon flux estimation.
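
    The lidar cloud-slicing idea can be illustrated with pressure-weighted column differencing: subtracting the partial column to a cloud top from the full column to the ground isolates the mean CO2 of the layer in between. A hedged sketch, with our own function name and hydrostatic simplifications (pressure measured from ~0 hPa at the top of the atmosphere):

```python
def layer_xco2(x_full_ppm, p_surface_hpa, x_above_ppm, p_cloudtop_hpa):
    """Layer-mean XCO2 between a cloud top and the surface from two
    column-average measurements: the full column (TOA to ground) and
    the partial column (TOA to cloud top). Each column average is
    weighted by its pressure thickness before differencing."""
    dp_full = p_surface_hpa      # TOA-to-surface pressure thickness
    dp_above = p_cloudtop_hpa    # TOA-to-cloud-top pressure thickness
    return (x_full_ppm * dp_full - x_above_ppm * dp_above) / (dp_full - dp_above)
```

For example, a 410 ppm full column over 1000 hPa and a 405 ppm partial column above a 600 hPa cloud top imply an enhanced 417.5 ppm layer below the cloud, which is the kind of vertical gradient information the abstract refers to.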

  13. Recent developments in multi-wire fixed abrasive slicing technique (FAST). [for low cost silicon wafer production from ingots

    NASA Technical Reports Server (NTRS)

    Schmid, F.; Khattak, C. P.; Smith, M. B.; Lynch, L. D.

    1982-01-01

    Slicing is an important processing step for all technologies based on the use of ingots. A comparison of the economics of three slicing techniques shows that the fixed abrasive slicing technique (FAST) is superior to the internal diameter (ID) and the multiblade slurry (MBS) techniques. Factors affecting contact length are discussed, taking into account kerf width, rocking angle, ingot size, and surface speed. Aspects of blade development are also considered. A high concentration of diamonds on wire has been obtained in the wire packs used for FAST slicing. The material removal rate was found to be directly proportional to the pressure at the diamond tips.

  14. Nitrogen oxides in the global upper troposphere interpreted with cloud-sliced NO2 from the Ozone Monitoring Instrument

    NASA Astrophysics Data System (ADS)

    Marais, Eloise A.; Jacob, Daniel J.; Choi, Sungyeon; Joiner, Joanna; Belmonte-Rivas, Maria; Cohen, Ronald C.; Ryerson, Thomas B.; Weinheimer, Andrew J.; Volz-Thomas, Andreas

    2017-04-01

    Nitrogen oxides (NOx ≡ NO + NO2) are long lived in the upper troposphere (UT), and so have a large impact on ozone formation where ozone is a powerful greenhouse gas. Measurements of UT NOx are limited to summertime aircraft campaigns predominantly in North America. There are year-round NOx measurements from instruments onboard commercial aircraft, but NO2 measurements are susceptible to large interferences. Satellites provide global coverage, but traditional space-based NO2 observations only provide one piece of vertical information in the troposphere. New cloud-sliced satellite NO2 products offer additional vertical information by retrieving partial NO2 columns above clouds and further exploit differences in cloud heights to calculate UT NO2 mixing ratios. Two new cloud-sliced NO2 products from the Ozone Monitoring Instrument (OMI; 2004 launch) provide seasonal UT NO2 data centered at 350 hPa for 2005-2007 (NASA product) and 380 hPa for 2006 only (KNMI). Differences between the products include spectral fitting to obtain NO2 along the viewing path (slant column), the air mass factor calculation to convert slant columns to true vertical columns, treatment of the stratospheric NO2 component, and the choice of cloud products. The resultant NASA NO2 mixing ratios are 30% higher than KNMI NO2 and are consistent with summertime aircraft NO2 observations over North America. Comparison between NASA NO2 and the GEOS-Chem chemical transport model exposes glaring inadequacies in the model. In summer in the eastern US lightning NOx emissions are overestimated by at least a factor of 2, corroborated by comparison of GEOS-Chem and MOZAIC aircraft observations of reactive nitrogen (NOy). Too fast heterogeneous hydrolysis of dinitrogen pentoxide (N2O5) leads to an underestimate in UT NO2 in winter across the northern hemisphere. 
Absence of interannual variability in lightning flashes in the lightning NOx parameterization induces biases in UT NO2 in the tropics due to anomalous lightning activity linked to the El Niño Southern Oscillation. Ongoing work is to use GEOS-Chem to investigate the implications of updated representation of UT NOx on ozone.
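
    The cloud-slicing retrieval underlying these products can be sketched as a regression: above-cloud NO2 partial columns grow linearly with cloud-top pressure, and the slope of that line, converted with hydrostatic constants, is the mean volume mixing ratio over the sampled pressure range. A minimal sketch assuming a well-mixed layer; the function name and synthetic numbers are ours:

```python
import numpy as np

G = 9.81         # gravitational acceleration, m s-2
M_AIR = 0.02896  # molar mass of dry air, kg mol-1
N_A = 6.022e23   # Avogadro's number, mol-1

def cloud_sliced_vmr(cloud_top_p_pa, above_cloud_col_molec_m2):
    """Cloud slicing: regress above-cloud partial columns (molec m-2)
    against cloud-top pressure (Pa); the slope d(column)/dp equals
    vmr * N_A / (M_AIR * g), so invert for the mixing ratio."""
    slope, _ = np.polyfit(cloud_top_p_pa, above_cloud_col_molec_m2, 1)
    return slope * M_AIR * G / N_A   # dimensionless volume mixing ratio
```

Exploiting scenes with different cloud heights in this way is what gives the OMI products their extra piece of vertical information compared with a single tropospheric column.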

  15. A novel dehydration technique for carrot slices implementing ultrasound and vacuum drying methods.

    PubMed

    Chen, Zhi-Gang; Guo, Xiao-Yu; Wu, Tao

    2016-05-01

    A novel drying technique using a combination of ultrasound and vacuum dehydration was developed to shorten the drying time and improve the quality of carrot slices. Carrot slices were dried with ultrasonic vacuum (USV) drying and vacuum drying at 65 °C and 75 °C. The drying rate was significantly influenced by the drying technique and temperature. Compared with vacuum drying, USV drying resulted in a 41-53% decrease in the drying time. The drying times for the USV and vacuum drying techniques at 75 °C were determined to be 140 and 340 min, respectively. The rehydration potential, nutritional value (retention of β-carotene and ascorbic acid), color, and textural properties of USV-dried carrot slices are predominantly better than those of vacuum-dried carrot slices. Moreover, the USV technique consumed less energy. The drying data (time versus moisture ratio) were successfully fitted to the Wang and Singh model. Copyright © 2015. Published by Elsevier B.V.
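
    The Wang and Singh thin-layer drying model is the quadratic MR(t) = 1 + a*t + b*t^2, so the fit mentioned above reduces to a linear least-squares problem. A sketch with an invented function name and synthetic drying curve:

```python
import numpy as np

def fit_wang_singh(t_min, moisture_ratio):
    """Fit the Wang and Singh model MR(t) = 1 + a*t + b*t^2 by linear
    least squares on (MR - 1) against the basis [t, t^2]; returns the
    drying constants (a, b)."""
    A = np.column_stack([t_min, t_min**2])
    coeffs, *_ = np.linalg.lstsq(A, moisture_ratio - 1.0, rcond=None)
    a, b = coeffs
    return a, b
```

With time in minutes, a is negative (initial drying rate) and b positive (rate slowing as the slices dry), so MR falls from 1 toward 0 over the drying run.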

  16. High-order multiband encoding in the heart.

    PubMed

    Cunningham, Charles H; Wright, Graham A; Wood, Michael L

    2002-10-01

    Spatial encoding with multiband selective excitation (e.g., Hadamard encoding) has been restricted to a small number of slices because the RF pulse becomes unacceptably long when more than about eight slices are encoded. In this work, techniques to shorten multiband RF pulses, and thus allow larger numbers of slices, are investigated. A method for applying the techniques while retaining the capability of adaptive slice thickness is outlined. A tradeoff between slice thickness and pulse duration is shown. Simulations and experiments with the shortened pulses confirmed that motion-induced excitation profile blurring and phase accrual were reduced. The connection between gradient hardware limitations, slice thickness, and flow sensitivity is shown. Excitation profiles for encoding 32 contiguous slices of 1-mm thickness were measured experimentally, and the artifact resulting from errors in the timing of the RF pulse relative to the gradient was investigated. A multiband technique for imaging 32 contiguous 2-mm slices, with adaptive slice thickness, was developed and demonstrated for coronary artery imaging in healthy subjects. With the ability to image high numbers of contiguous slices, using relatively short (1-2 ms) RF pulses, multiband encoding has been advanced further toward practical application. Copyright 2002 Wiley-Liss, Inc.
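
    The Hadamard encoding mentioned above works because each multiband excitation sums the slice signals with the +/-1 weights of one row of a Hadamard matrix, and the orthogonality of that matrix lets the individual slices be recovered exactly. A numerical sketch (function names ours; real acquisitions add noise and relaxation effects that this ignores):

```python
import numpy as np

def sylvester_hadamard(n):
    """Hadamard matrix of order n (n a power of 2), built by the
    Sylvester recursion H_{2n} = [[H, H], [H, -H]]."""
    H = np.array([[1.0]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

def hadamard_encode_decode(slice_signals):
    """Encode n slice signals as n multiband acquisitions (rows of H),
    then decode with H^T / n, exploiting H^T H = n * I."""
    n = len(slice_signals)
    H = sylvester_hadamard(n)
    encoded = H @ slice_signals   # what the n multiband shots would measure
    return (H.T @ encoded) / n    # recovered per-slice signals
```

The number of acquisitions equals the number of slices, which is why pulse duration, not encoding efficiency, is the limiting factor the paper attacks.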

  17. The Cloud Detection and UV Monitoring Experiment (CLUE)

    NASA Technical Reports Server (NTRS)

    Barbier, L.; Loh, E.; Sokolsky, P.; Streitmatter, R.

    2004-01-01

    We propose a large-area, low-power instrument to perform CLoud detection and Ultraviolet monitoring, CLUE. CLUE will combine the UV detection capabilities of the NIGHTGLOW payload with an array of infrared sensors to perform cloud slicing measurements. Missions such as EUSO and OWL, which seek to measure UHE cosmic rays at 10(exp 20) eV, use the atmosphere as a fluorescence detector. CLUE will provide several important correlated measurements for these missions, including: monitoring the atmospheric UV emissions from 330-400 nm, determining the ambient cloud cover during those UV measurements (with active LIDAR), measuring the optical depth of the clouds (with an array of narrow band-pass IR sensors), and correlating the LIDAR and IR cloud cover measurements. This talk will describe the instrument as we envision it.

  18. A Slice of Orion

    NASA Image and Video Library

    2006-08-15

    This image composite shows a part of the Orion constellation surveyed by NASA's Spitzer Space Telescope. The shape of the main image was designed by astronomers to roughly follow the shape of Orion cloud A, an enormous star-making factory.

  19. Organotypic Slice Cultures for Studies of Postnatal Neurogenesis

    PubMed Central

    Mosa, Adam J.; Wang, Sabrina; Tan, Yao Fang; Wojtowicz, J. Martin

    2015-01-01

    Here we describe a technique for studying hippocampal postnatal neurogenesis in the rodent brain using the organotypic slice culture technique. This method maintains the characteristic topographical morphology of the hippocampus while allowing direct application of pharmacological agents to the developing hippocampal dentate gyrus. Additionally, slice cultures can be maintained for up to 4 weeks and thus allow one to study the maturation process of newborn granule neurons. Slice cultures allow for efficient pharmacological manipulation of hippocampal slices while excluding complex variables such as uncertainties related to the deep anatomic location of the hippocampus as well as the blood-brain barrier. For these reasons, we sought to optimize organotypic slice cultures specifically for postnatal neurogenesis research. PMID:25867138

  20. Strain analysis in CRT candidates using the novel segment length in cine (SLICE) post-processing technique on standard CMR cine images.

    PubMed

    Zweerink, Alwin; Allaart, Cornelis P; Kuijer, Joost P A; Wu, LiNa; Beek, Aernout M; van de Ven, Peter M; Meine, Mathias; Croisille, Pierre; Clarysse, Patrick; van Rossum, Albert C; Nijveldt, Robin

    2017-12-01

    Although myocardial strain analysis is a potential tool to improve patient selection for cardiac resynchronization therapy (CRT), there is currently no validated clinical approach to derive segmental strains. We evaluated the novel segment length in cine (SLICE) technique to derive segmental strains from standard cardiovascular MR (CMR) cine images in CRT candidates. Twenty-seven patients with left bundle branch block underwent CMR examination including cine imaging and myocardial tagging (CMR-TAG). SLICE was performed by measuring segment length between anatomical landmarks throughout all phases on short-axis cines. This measure of frame-to-frame segment length change was compared to CMR-TAG circumferential strain measurements. Subsequently, conventional markers of CRT response were calculated. Segmental strains showed good to excellent agreement between SLICE and CMR-TAG (septum strain, intraclass correlation coefficient (ICC) 0.76; lateral wall strain, ICC 0.66). Conventional markers of CRT response also showed close agreement between both methods (ICC 0.61-0.78). Reproducibility of SLICE was excellent for intra-observer testing (all ICC ≥0.76) and good for inter-observer testing (all ICC ≥0.61). The novel SLICE post-processing technique on standard CMR cine images offers both accurate and robust segmental strain measures compared to the 'gold standard' CMR-TAG technique, and has the advantage of being widely available. • Myocardial strain analysis could potentially improve patient selection for CRT. • Currently a well-validated clinical approach to derive segmental strains is lacking. • The novel SLICE technique derives segmental strains from standard CMR cine images. • SLICE-derived strain markers of CRT response showed close agreement with CMR-TAG. • Future studies will focus on the prognostic value of SLICE in CRT candidates.
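    The arithmetic behind SLICE is essentially frame-wise fractional length change. As a minimal sketch (not the authors' implementation; the reference-frame convention is an assumption here), Lagrangian strain per cine frame could be computed from measured segment lengths as:

```python
import numpy as np

def segment_strain(lengths, reference_index=0):
    """Lagrangian strain (%) of one myocardial segment per cine frame.

    lengths : segment lengths (e.g. in mm) across frames, measured
    between anatomical landmarks on short-axis cines. The reference
    frame (index 0, assumed here to be end-diastole) defines L0.
    """
    lengths = np.asarray(lengths, dtype=float)
    L0 = lengths[reference_index]
    return 100.0 * (lengths - L0) / L0

# A segment shortening from 50 mm to 42.5 mm yields -15% strain.
strains = segment_strain([50.0, 47.5, 42.5])
```

    Negative values indicate shortening, matching the sign convention of circumferential strain from tagging.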

  1. The Improvement of the Closed Bounded Volume (CBV) Evaluation Methods to Compute a Feasible Rough Machining Area Based on Faceted Models

    NASA Astrophysics Data System (ADS)

    Hadi Sutrisno, Himawan; Kiswanto, Gandjar; Istiyanto, Jos

    2017-06-01

    Rough machining is aimed at shaping a workpiece toward its final form. This process removes the bulk of the material and therefore accounts for a large share of the total machining time. For certain models, rough machining has limitations, especially on surfaces such as turbine blades and impellers. CBV evaluation is one concept used to detect areas admissible for machining. Whereas previous research detected the CBV area using a pair of normal vectors, this research simplifies the process by using a slicing line for each point of the point cloud. The method consists of three steps: 1. triangulation of the CAD design model; 2. generation of CC points from the point cloud; 3. slicing-line evaluation of each point's position (under the CBV or outside it). The result of this evaluation can be used to set up the tool orientation at each CC point within the feasible rough-machining areas.

  2. Study on super-resolution three-dimensional range-gated imaging technology

    NASA Astrophysics Data System (ADS)

    Guo, Huichao; Sun, Huayan; Wang, Shuai; Fan, Youchen; Li, Yuanmiao

    2018-04-01

    Range-gated three-dimensional imaging has become a research hotspot in recent years because of its high spatial resolution, high range accuracy, long range, and simultaneous capture of target reflectivity information. Based on a study of the principle of the intensity-related method, this paper presents both theoretical analysis and experimental research. The experimental system uses a high-power pulsed semiconductor laser as the light source and a gated ICCD as the imaging device, and allows flexible adjustment of imaging depth and distance to achieve different working modes. A small-depth imaging experiment was carried out on a building 500 m away, and 26 groups of images were obtained with a distance step of 1.5 m. The calculation of the 3D point cloud based on the triangle method is analyzed, and a 15 m depth slice of the target's 3D point cloud is obtained from two frames of images, with a distance precision better than 0.5 m. The influence of signal-to-noise ratio, illumination uniformity, and image brightness on distance accuracy is analyzed. Based on a comparison with the time-slicing method, a method for improving the linearity of the point cloud is proposed.
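    The abstract names the triangle method without giving its formula, so the following is an assumption: under the common triangular range-intensity-profile model, the ratio of the two gated intensities localizes each pixel within the shared gate interval. A hedged per-pixel sketch:

```python
import numpy as np

def triangle_range(i1, i2, r_near, depth):
    """Toy range map from two overlapping gated images.

    Assumes intensity falls linearly with range in image 1 and rises
    linearly in image 2 across a shared interval starting at r_near
    (metres) with extent `depth` (metres); the intensity ratio then
    gives the in-gate position. Names and model are illustrative only.
    """
    i1 = np.asarray(i1, dtype=float)
    i2 = np.asarray(i2, dtype=float)
    total = i1 + i2
    # Avoid division by zero where neither gate saw any signal.
    frac = np.divide(i2, total, out=np.zeros_like(total), where=total > 0)
    return r_near + depth * frac

# A pixel seen equally by both gates sits at mid-interval,
# e.g. 507.5 m for a 15 m slice starting at 500 m.
```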

  3. Taking a 3-D Slice of Hurricane Maria's Cloud Structure

    NASA Image and Video Library

    2017-09-20

    NASA's CloudSat satellite flew over Hurricane Maria on Sept. 17, 2017, at 1:23 p.m. EDT (17:23 UTC) as the storm had just strengthened into a hurricane in the Atlantic Ocean. Hurricane Maria contained estimated maximum sustained winds of 75 miles per hour (65 knots) and had a minimum barometric pressure of 986 millibars. CloudSat flew over Maria through the center of the rapidly intensifying storm, directly through an overshooting cloud top (a dome-shaped protrusion that shoots out of the top of the anvil cloud of a thunderstorm). CloudSat reveals the vertical extent of the overshooting cloud top, showing the estimated height of the cloud to be 11 miles (18 kilometers). Areas of high reflectivity with deep red and pink colors extend well above 9 miles (15 kilometers) in height, showing large amounts of water being drawn upward high into the atmosphere. A movie is available at https://photojournal.jpl.nasa.gov/catalog/PIA21961

  4. GOES Cloud Detection at the Global Hydrology and Climate Center

    NASA Technical Reports Server (NTRS)

    Laws, Kevin; Jedlovec, Gary J.; Arnold, James E. (Technical Monitor)

    2002-01-01

    The bi-spectral threshold (BTH) technique for cloud detection and height assignment is now operational at NASA's Global Hydrology and Climate Center (GHCC). This new approach is similar in principle to the bi-spectral spatial coherence (BSC) method, with improvements made to produce a more robust cloud-filtering algorithm for nighttime cloud detection and subsequent 24-hour operational cloud top pressure assignment. The method capitalizes on cloud and surface emissivity differences between the GOES 3.9 and 10.7-micrometer channels to distinguish cloudy from clear pixels. Separate threshold values are determined for day and nighttime detection, and applied to a 20-day minimum composite difference image to better filter background effects and enhance differences in cloud properties. A cloud top pressure is assigned to each cloudy pixel by referencing the 10.7-micrometer channel temperature to a thermodynamic profile from a locally-run regional forecast model. This paper and supplemental poster will present an objective validation of nighttime cloud detection by the BTH approach in comparison with previous methods. The cloud top pressure will be evaluated by comparison with the NESDIS operational CO2 slicing approach.
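    The BTH decision itself can be pictured as a per-pixel threshold test on the 3.9-minus-10.7 micrometer brightness-temperature difference relative to a clear-sky composite. A hedged sketch (the GHCC's actual thresholds and 20-day compositing are not reproduced here):

```python
import numpy as np

def bth_cloud_mask(bt39, bt107, background_diff, threshold_k):
    """Toy bi-spectral threshold cloud mask.

    Flags a pixel cloudy when the 3.9 - 10.7 micrometer brightness-
    temperature difference departs from the clear-sky background
    composite by more than a day- or night-specific threshold (in
    kelvin). All inputs share one array shape; names are illustrative.
    """
    diff = np.asarray(bt39, dtype=float) - np.asarray(bt107, dtype=float)
    return np.abs(diff - np.asarray(background_diff, dtype=float)) > threshold_k
```

    Cloud-top pressure assignment would then look up each cloudy pixel's 10.7-micrometer temperature in a model thermodynamic profile, as the abstract describes.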

  5. Point clouds segmentation as base for as-built BIM creation

    NASA Astrophysics Data System (ADS)

    Macher, H.; Landes, T.; Grussenmeyer, P.

    2015-08-01

    In this paper, a three-step segmentation approach is proposed in order to create 3D models from point clouds acquired by TLS inside buildings. The three scales of segmentation are floors, rooms, and the planes composing the rooms. First, floor segmentation is performed based on an analysis of the point distribution along the Z axis. Then, for each floor, room segmentation is achieved by considering a slice of the point cloud at ceiling level. Finally, planes are segmented for each room, and the planes corresponding to ceilings and floors are identified. The results of each step are analysed and potential improvements are proposed. Based on the segmented point clouds, the creation of as-built BIM is considered in a future work section. Not only is the classification of planes into several categories proposed, but the potential use of point clouds acquired outside buildings is also considered.
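    The first step, floor segmentation from the point distribution along Z, amounts to finding strong peaks in a height histogram, since horizontal slabs concentrate many points at one elevation. A minimal sketch, with the bin size and peak criterion as illustrative assumptions rather than the authors' values:

```python
import numpy as np

def z_peaks(z, bin_size=0.1, factor=10.0):
    """Locate candidate floor/ceiling elevations in an indoor TLS cloud.

    Bins point heights (metres) and returns the centres of bins whose
    counts stand far above the typical bin count. `factor` is an
    illustrative peak criterion, not the paper's exact one.
    """
    z = np.asarray(z, dtype=float)
    edges = np.arange(z.min(), z.max() + bin_size, bin_size)
    counts, edges = np.histogram(z, bins=edges)
    centres = 0.5 * (edges[:-1] + edges[1:])
    return centres[counts > factor * max(np.median(counts), 1)]
```

    Room segmentation would then operate on a thin slice of points just below each detected ceiling elevation.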

  6. Wire blade development for Fixed Abrasive Slicing Technique (FAST) slicing

    NASA Technical Reports Server (NTRS)

    Khattak, C. P.; Schmid, F.; Smith, M. B.

    1982-01-01

    A low-cost, effective slicing method is essential to make ingot technology viable for photovoltaics in terrestrial applications. The fixed abrasive slicing technique (FAST) combines the advantages of the three commercially developed techniques. In its development stage, FAST demonstrated cutting effectiveness on 10 cm and 15 cm diameter workpieces. Wire blade development is still the critical element for commercialization of FAST technology. Both impregnated and electroplated wire blades have been developed; techniques have been developed to fix diamonds only in the cutting edge of the wire. Electroplated wires show the most near-term promise, and this approach is emphasized. With plated wires it has been possible to control the size and shape of the electroplating; this feature is expected to reduce kerf and prolong the life of the wirepack.

  7. Inter-slice Leakage Artifact Reduction Technique for Simultaneous Multi-Slice Acquisitions

    PubMed Central

    Cauley, Stephen F.; Polimeni, Jonathan R.; Bhat, Himanshu; Wang, Dingxin; Wald, Lawrence L.; Setsompop, Kawin

    2015-01-01

    Purpose: Controlled aliasing techniques for simultaneously acquired EPI slices have been shown to significantly increase the temporal efficiency for both diffusion-weighted imaging (DWI) and fMRI studies. The “slice-GRAPPA” (SG) method has been widely used to reconstruct such data. We investigate robust optimization techniques for SG to ensure image reconstruction accuracy through a reduction of leakage artifacts. Methods: Split slice-GRAPPA (SP-SG) is proposed as an alternative kernel optimization method. The performance of SP-SG is compared to standard SG using data collected on a spherical phantom and in-vivo on two subjects at 3T. Slice accelerated and non-accelerated data were collected for a spin-echo diffusion weighted acquisition. Signal leakage metrics and time-series SNR were used to quantify the performance of the kernel fitting approaches. Results: The SP-SG optimization strategy significantly reduces leakage artifacts for both phantom and in-vivo acquisitions. In addition, a significant boost in time-series SNR for in-vivo diffusion weighted acquisitions with in-plane 2× and slice 3× accelerations was observed with the SP-SG approach. Conclusion: By minimizing the influence of leakage artifacts during the training of slice-GRAPPA kernels, we have significantly improved reconstruction accuracy. Our robust kernel fitting strategy should enable better reconstruction accuracy and higher slice-acceleration across many applications. PMID:23963964

  8. The Reliability and Validity of the Thin Slice Technique: Observational Research on Video Recorded Medical Interactions

    ERIC Educational Resources Information Center

    Foster, Tanina S.

    2014-01-01

    Introduction: Observational research using the thin slice technique has been routinely incorporated into observational research methods; however, there is limited evidence supporting the use of this technique compared to full interaction coding. The purpose of this study was to determine if this technique could be reliably coded, if ratings are…

  9. An experimental comparison of standard stereo matching algorithms applied to cloud top height estimation from satellite IR images

    NASA Astrophysics Data System (ADS)

    Anzalone, Anna; Isgrò, Francesco

    2016-10-01

    The JEM-EUSO (Japanese Experiment Module-Extreme Universe Space Observatory) telescope will measure Ultra High Energy Cosmic Ray properties by detecting the UV fluorescent light generated in the interaction between cosmic rays and the atmosphere. Cloud information is crucial for a proper interpretation of these data. The problem of recovering the cloud-top height from satellite images in infrared has attracted attention over the last few decades as a valuable tool for atmospheric monitoring. A number of radiative methods do exist, like CO2 slicing and Split Window algorithms, using one or more infrared bands. A different way to tackle the problem is, when possible, to exploit the availability of multiple views, and recover the cloud-top height through stereo imaging and triangulation. A crucial step in the 3D reconstruction is the process that attempts to match a characteristic point or feature selected in one image with one of those detected in the second image. In this article the performance of a group of matching algorithms, including both area-based and global techniques, has been tested. They are applied to stereo pairs of satellite IR images with the final aim of evaluating the cloud-top height. Cloudy images from SEVIRI on the geostationary Meteosat Second Generation 9 and 10 (MSG-2, MSG-3) satellites have been selected. After applying the stereo matching algorithms to the cloudy scenes, the resulting disparity maps are transformed into depth maps according to the geometry of the reference data system. As ground truth we have used the height maps provided by the database of MODIS (Moderate Resolution Imaging Spectroradiometer) on board the Terra/Aqua polar satellites, which contains images quasi-synchronous with the imaging provided by MSG.
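    The triangulation step rests on the textbook rectified-stereo relation depth = f·B/d. The real MSG-2/MSG-3 viewing geometry is more involved, so this sketch shows only the underlying principle:

```python
def stereo_depth(disparity_px, focal_px, baseline_m):
    """Classic two-view triangulation: depth = focal * baseline / disparity.

    disparity_px : disparity of a matched feature, in pixels
    focal_px     : focal length expressed in pixels
    baseline_m   : separation of the two viewpoints, in metres
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# e.g. a 4-pixel disparity with f = 1000 px and B = 2 m gives 500 m.
```

    Smaller disparities map to larger depths, which is why matching accuracy dominates the height error for high clouds.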

  10. A Registration Method Based on Contour Point Cloud for 3D Whole-Body PET and CT Images

    PubMed Central

    Yang, Qiyao; Wang, Zhiguo; Zhang, Guoxu

    2017-01-01

    The PET and CT fusion image, combining the anatomical and functional information, has important clinical meaning. An effective registration of PET and CT images is the basis of image fusion. This paper presents a multithread registration method based on contour point cloud for 3D whole-body PET and CT images. Firstly, a geometric feature-based segmentation (GFS) method and a dynamic threshold denoising (DTD) method are creatively proposed to preprocess CT and PET images, respectively. Next, a new automated trunk slices extraction method is presented for extracting feature point clouds. Finally, the multithreaded Iterative Closest Point algorithm is adopted to drive an affine transform. We compare our method with a multiresolution registration method based on Mattes Mutual Information on 13 pairs (246~286 slices per pair) of 3D whole-body PET and CT data. Experimental results demonstrate the registration effectiveness of our method with lower negative normalization correlation (NC = −0.933) on feature images and less Euclidean distance error (ED = 2.826) on landmark points, outperforming the source data (NC = −0.496, ED = 25.847) and the compared method (NC = −0.614, ED = 16.085). Moreover, our method is about ten times faster than the compared one. PMID:28316979
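    The registration is driven by Iterative Closest Point. The paper's variant is multithreaded and affine; the core loop can nevertheless be sketched in a single-threaded rigid form (Kabsch/SVD for the per-iteration transform), as an illustration only:

```python
import numpy as np

def icp_rigid(src, dst, iters=20):
    """Minimal rigid ICP sketch (the paper drives an affine, multithreaded
    variant; this shows only the correspondence/estimate/apply loop).

    src, dst : (N, 3) and (M, 3) point arrays.
    Returns a copy of src aligned onto dst. Nearest neighbours are
    brute force, so this is for small clouds only.
    """
    P = np.asarray(src, dtype=float).copy()
    Q = np.asarray(dst, dtype=float)
    for _ in range(iters):
        # 1. correspondence: nearest dst point for every src point
        d2 = ((P[:, None, :] - Q[None, :, :]) ** 2).sum(-1)
        matched = Q[d2.argmin(axis=1)]
        # 2. best rigid transform via the Kabsch/SVD solution
        pc, qc = P.mean(0), matched.mean(0)
        H = (P - pc).T @ (matched - qc)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:   # guard against reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        # 3. apply the estimated rotation and translation
        P = (P - pc) @ R.T + qc
    return P
```

    In practice the feature point clouds extracted from the trunk slices would be downsampled first, and a k-d tree would replace the brute-force distance matrix.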

  11. Improved sliced velocity map imaging apparatus optimized for H photofragments.

    PubMed

    Ryazanov, Mikhail; Reisler, Hanna

    2013-04-14

    Time-sliced velocity map imaging (SVMI), a high-resolution method for measuring kinetic energy distributions of products in scattering and photodissociation reactions, is challenging to implement for atomic hydrogen products. We describe an ion optics design aimed at achieving SVMI of H fragments in a broad range of kinetic energies (KE), from a fraction of an electronvolt to a few electronvolts. In order to enable consistently thin slicing for any imaged KE range, an additional electrostatic lens is introduced in the drift region for radial magnification control without affecting temporal stretching of the ion cloud. Time slices of ∼5 ns out of a cloud stretched to ⩾50 ns are used. An accelerator region with variable dimensions (using multiple electrodes) is employed for better optimization of radial and temporal space focusing characteristics at each magnification level. The implemented system was successfully tested by recording images of H fragments from the photodissociation of HBr, H2S, and the CH2OH radical, with kinetic energies ranging from <0.4 eV to >3 eV. It demonstrated KE resolution ≲1%-2%, similar to that obtained in traditional velocity map imaging followed by reconstruction, and to KE resolution achieved previously in SVMI of heavier products. We expect it to perform just as well up to at least 6 eV of kinetic energy. The tests showed that numerical simulations of the electric fields and ion trajectories in the system, used for optimization of the design and operating parameters, provide an accurate and reliable description of all aspects of system performance. This offers the advantage of selecting the best operating conditions in each measurement without the need for additional calibration experiments.

  12. Integrating interface slicing into software engineering processes

    NASA Technical Reports Server (NTRS)

    Beck, Jon

    1993-01-01

    Interface slicing is a tool which was developed to facilitate software engineering. As previously presented, it was described in terms of its techniques and mechanisms. The integration of interface slicing into specific software engineering activities is considered by discussing a number of potential applications of interface slicing. The applications discussed specifically address the problems, issues, or concerns raised in a previous project. Because a complete interface slicer is still under development, these applications must be phrased in future tenses. Nonetheless, the interface slicing techniques which were presented can be implemented using current compiler and static analysis technology. Whether implemented as a standalone tool or as a module in an integrated development or reverse engineering environment, they require analysis no more complex than that required for current system development environments. By contrast, conventional slicing is a methodology which, while showing much promise and intuitive appeal, has yet to be fully implemented in a production language environment despite 12 years of development.
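    Conventional slicing, which the article contrasts with interface slicing, reduces to reachability over a dependence graph. A minimal backward-slice sketch (statement-level, whereas interface slicing as described operates on module interfaces; names are illustrative):

```python
def backward_slice(deps, criterion):
    """Backward slice over a statement-level dependence graph.

    deps maps each statement to the statements it depends on (data or
    control dependences). The slice is everything reachable from the
    slicing criterion.
    """
    seen, stack = set(), [criterion]
    while stack:
        node = stack.pop()
        if node in seen:
            continue
        seen.add(node)
        stack.extend(deps.get(node, ()))
    return seen

# For  s1: x = 1;  s2: y = 2;  s3: z = x + 1
# the backward slice on s3 is {s1, s3}: s2 is irrelevant to z.
```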

  13. Geometry Processing of Conventionally Produced Mouse Brain Slice Images.

    PubMed

    Agarwal, Nitin; Xu, Xiangmin; Gopi, M

    2018-04-21

    Brain mapping research in most neuroanatomical laboratories relies on conventional processing techniques, which often introduce histological artifacts such as tissue tears and tissue loss. In this paper we present techniques and algorithms for automatic registration and 3D reconstruction of conventionally produced mouse brain slices in a standardized atlas space. This is achieved first by constructing a virtual 3D mouse brain model from annotated slices of the Allen Reference Atlas (ARA). Virtual re-slicing of the reconstructed model generates ARA-based slice images corresponding to the microscopic images of histological brain sections. These image pairs are aligned using a geometric approach through contour images. Histological artifacts in the microscopic images are detected and removed using Constrained Delaunay Triangulation before performing global alignment. Finally, non-linear registration is performed by solving Laplace's equation with Dirichlet boundary conditions. Our methods provide significant improvements over previously reported registration techniques for the tested slices in 3D space, especially on slices with significant histological artifacts. Further, as one application we count the number of neurons in various anatomical regions using a dataset of 51 microscopic slices from a single mouse brain. To the best of our knowledge the presented work is the first that automatically registers both clean and highly damaged high-resolution histological slices of mouse brain to a 3D annotated reference atlas space. This work represents a significant contribution to this subfield of neuroscience as it provides tools to neuroanatomists for analyzing and processing histological data.

  14. Ripple artifact reduction using slice overlap in slice encoding for metal artifact correction.

    PubMed

    den Harder, J Chiel; van Yperen, Gert H; Blume, Ulrike A; Bos, Clemens

    2015-01-01

    Multispectral imaging (MSI) significantly reduces metal artifacts. Yet, especially in techniques that use gradient selection, such as slice encoding for metal artifact correction (SEMAC), a residual ripple artifact may be prominent. Here, an analysis is presented of the ripple artifact and of slice overlap as an approach to reduce the artifact. The ripple artifact was analyzed theoretically to clarify its cause. Slice overlap, conceptually similar to spectral bin overlap in multi-acquisition with variable resonances image combination (MAVRIC), was achieved by reducing the selection gradient and, thus, increasing the slice profile width. Time domain simulations and phantom experiments were performed to validate the analyses and proposed solution. Discontinuities between slices are aggravated by signal displacement in the frequency encoding direction in areas with deviating B0. Specifically, it was demonstrated that ripple artifacts appear only where B0 varies both in-plane and through-plane. Simulations and phantom studies of metal implants confirmed the efficacy of slice overlap to reduce the artifact. The ripple artifact is an important limitation of gradient selection based MSI techniques, and can be understood using the presented simulations. At a scan-time penalty, slice overlap effectively addressed the artifact, thereby improving image quality near metal implants.

  15. Computing and Visualizing Reachable Volumes for Maneuvering Satellites

    NASA Astrophysics Data System (ADS)

    Jiang, M.; de Vries, W.; Pertica, A.; Olivier, S.

    2011-09-01

    Detecting and predicting maneuvering satellites is an important problem for Space Situational Awareness. The spatial envelope of all possible locations within reach of such a maneuvering satellite is known as the Reachable Volume (RV). As soon as custody of a satellite is lost, calculating the RV and its subsequent time evolution is a critical component in the rapid recovery of the satellite. In this paper, we present a Monte Carlo approach to computing the RV for a given object. Essentially, our approach samples all possible trajectories by randomizing thrust-vectors, thrust magnitudes and time of burn. At any given instance, the distribution of the "point-cloud" of the virtual particles defines the RV. For short orbital time-scales, the temporal evolution of the point-cloud can result in complex, multi-reentrant manifolds. Visualization plays an important role in gaining insight and understanding into this complex and evolving manifold. In the second part of this paper, we focus on how to effectively visualize the large number of virtual trajectories and the computed RV. We present a real-time out-of-core rendering technique for visualizing the large number of virtual trajectories. We also examine different techniques for visualizing the computed volume of probability density distribution, including volume slicing, convex hull and isosurfacing. We compare and contrast these techniques in terms of computational cost and visualization effectiveness, and describe the main implementation issues encountered during our development process. Finally, we will present some of the results from our end-to-end system for computing and visualizing RVs using examples of maneuvering satellites.
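    The sampling structure of the Monte Carlo approach can be sketched with a toy propagator. Field-free drift stands in here for the orbital propagation the paper actually requires, and all names and parameters are illustrative:

```python
import numpy as np

def reachable_cloud(r0, v0, dv_max, t, n=10000, rng=None):
    """Monte Carlo point cloud of reachable positions (toy version).

    Samples isotropic impulsive delta-v directions with uniform
    magnitudes up to dv_max, then propagates FIELD-FREE for time t.
    A real RV computation would integrate two-body (or higher-fidelity)
    dynamics and also randomize the time of burn.
    """
    rng = rng or np.random.default_rng(0)
    u = rng.normal(size=(n, 3))
    u /= np.linalg.norm(u, axis=1, keepdims=True)     # unit directions
    dv = u * rng.uniform(0.0, dv_max, size=(n, 1))    # delta-v vectors
    return np.asarray(r0, dtype=float) + (np.asarray(v0, dtype=float) + dv) * t
```

    At any instant the distribution of this point cloud defines the RV; visualizing it then proceeds via the volume slicing, convex hull, or isosurfacing techniques the paper compares.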

  16. Finite slice analysis (FINA) of sliced and velocity mapped images on a Cartesian grid

    NASA Astrophysics Data System (ADS)

    Thompson, J. O. F.; Amarasinghe, C.; Foley, C. D.; Rombes, N.; Gao, Z.; Vogels, S. N.; van de Meerakker, S. Y. T.; Suits, A. G.

    2017-08-01

    Although time-sliced imaging yields improved signal-to-noise and resolution compared with unsliced velocity mapped ion images, for finite slice widths as encountered in real experiments there is a loss of resolution and recovered intensities for the slow fragments. Recently, we reported a new approach that permits correction of these effects for an arbitrarily sliced distribution of a 3D charged particle cloud. This finite slice analysis (FinA) method utilizes basis functions that model the out-of-plane contribution of a given velocity component to the image for sequential subtraction in a spherical polar coordinate system. However, the original approach suffers from a slow processing time due to the weighting procedure needed to accurately model the out-of-plane projection of an anisotropic angular distribution. To overcome this issue we present a variant of the method in which the FinA approach is performed in a cylindrical coordinate system (Cartesian in the image plane) rather than a spherical polar coordinate system. Dubbed C-FinA, we show how this method is applied in much the same manner. We compare this variant to the polar FinA method and find that the processing time (of a 510 × 510 pixel image) in its most extreme case improves by a factor of 100. We also show that although the resulting velocity resolution is not quite as high as the polar version, this new approach shows superior resolution for fine structure in the differential cross sections. We demonstrate the method on a range of experimental and synthetic data at different effective slice widths.

  17. A Long Data Record (1979-2003) of Stratospheric Ozone Derived from TOMS Cloud Slicing: Comparison with SAGE and Implications for Ozone Recovery

    NASA Technical Reports Server (NTRS)

    Ziemke, Jerry R.; Chandra, Sushil; Bhartia, Pawan K.

    2004-01-01

    It is generally recognized that Stratospheric Aerosol and Gas Experiment (SAGE) stratospheric ozone data have become a standard long-record reference field for comparison with other stratospheric ozone measurements. This study demonstrates that stratospheric column ozone (SCO) derived from total ozone mapping spectrometer (TOMS) Cloud Slicing may be used to supplement SAGE data as a stand-alone long-record reference field in the tropics extending to middle and high latitudes over the Pacific. Comparisons of SAGE II version 6.2 SCO and TOMS version 8 Cloud Slicing SCO for 1984-2003 exhibit remarkable agreement in monthly ensemble means to within 1-3 DU (1-1.5% of SCO) despite being independently-calibrated measurements. An important component of our study is to incorporate these column ozone measurements to investigate long-term trends for the period 1979-2003. Our study includes Solar Backscatter Ultraviolet (SBUV) version 8 measurements of upper stratospheric column ozone (i.e., zero to 32 hPa column ozone) to characterize seasonal cycles and seasonal trends in this region, as well as the lower stratosphere and troposphere when combined with TOMS SCO and total column ozone. The trend analyses suggest that most ozone reduction in the atmosphere since 1979 in mid-to-high latitudes has occurred in the lower stratosphere below approx. 25 km. The delineation of upper and lower stratospheric column ozone indicates that trends in the upper stratosphere during the latter half of the 1979-2003 period have reduced to near zero globally, while trends in the lower stratosphere have become larger by approx. 5 DU per decade from the tropics extending to mid-latitudes in both hemispheres. For tropospheric column ozone (TCO), the trend analyses suggest moderate increases over the 25-year time record in the extra-tropics of both hemispheres of around 4-6 DU (Northern Hemisphere) and 6-8 DU (Southern Hemisphere).
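    Trend analyses of this kind typically fit a linear term plus annual and semi-annual harmonics by least squares. The paper's exact regression model is not reproduced here; the standard form looks like:

```python
import numpy as np

def ozone_trend(months, y):
    """Least-squares linear trend with annual + semi-annual harmonics,
    the usual regression form for multi-year column-ozone records.

    months : time in months from the record start
    y      : column ozone (DU) at those times
    Returns the fitted trend in DU per decade.
    """
    t = np.asarray(months, dtype=float)
    w = 2.0 * np.pi * t / 12.0
    # Design matrix: offset, linear trend, annual and semi-annual cycles.
    X = np.column_stack([np.ones_like(t), t,
                         np.cos(w), np.sin(w),
                         np.cos(2 * w), np.sin(2 * w)])
    coef, *_ = np.linalg.lstsq(X, np.asarray(y, dtype=float), rcond=None)
    return coef[1] * 120.0   # DU/month -> DU/decade
```

    Removing the seasonal harmonics before estimating the linear term is what allows seasonal cycles and long-term trends to be characterized separately, as the abstract describes.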

  18. The SLICE, CHESS, and SISTINE Ultraviolet Spectrographs: Rocket-Borne Instrumentation Supporting Future Astrophysics Missions

    NASA Astrophysics Data System (ADS)

    France, Kevin; Hoadley, Keri; Fleming, Brian T.; Kane, Robert; Nell, Nicholas; Beasley, Matthew; Green, James C.

    2016-03-01

    NASA’s suborbital program provides an opportunity to conduct unique science experiments above Earth’s atmosphere and is a pipeline for the technology and personnel essential to future space astrophysics, heliophysics, and atmospheric science missions. In this paper, we describe three astronomy payloads developed (or in development) by the Ultraviolet Rocket Group at the University of Colorado. These far-ultraviolet (100-160 nm) spectrographic instruments are used to study a range of scientific topics, from gas in the interstellar medium (accessing diagnostics of material spanning five orders of magnitude in temperature in a single observation) to the energetic radiation environment of nearby exoplanetary systems. The three instruments, Suborbital Local Interstellar Cloud Experiment (SLICE), Colorado High-resolution Echelle Stellar Spectrograph (CHESS), and Suborbital Imaging Spectrograph for Transition region Irradiance from Nearby Exoplanet host stars (SISTINE) form a progression of instrument designs and component-level technology maturation. SLICE is a pathfinder instrument for the development of new data handling, storage, and telemetry techniques. CHESS and SISTINE are testbeds for technology and instrument design enabling high-resolution (R > 10^5) point source spectroscopy and high throughput imaging spectroscopy, respectively, in support of future Explorer, Probe, and Flagship-class missions. The CHESS and SISTINE payloads support the development and flight testing of large-format photon-counting detectors and advanced optical coatings: NASA’s top two technology priorities for enabling a future flagship observatory (e.g. the LUVOIR Surveyor concept) that offers factors of ~50-100 gain in UV spectroscopy capability over the Hubble Space Telescope. We present the design, component-level laboratory characterization, and flight results for these instruments.

  19. Retrieval of radiative and microphysical properties of clouds from multispectral infrared measurements

    NASA Astrophysics Data System (ADS)

    Iwabuchi, Hironobu; Saito, Masanori; Tokoro, Yuka; Putri, Nurfiena Sagita; Sekiguchi, Miho

    2016-12-01

    Satellite remote sensing of the macroscopic, microphysical, and optical properties of clouds is useful for studying spatial and temporal variations of clouds at various scales and for constraining cloud physical processes in climate and weather prediction models. Instead of using separate independent algorithms for different cloud properties, a unified, optimal estimation-based cloud retrieval algorithm is developed and applied to moderate resolution imaging spectroradiometer (MODIS) observations using ten thermal infrared bands. The model considers sensor configurations, background surface and atmospheric profile, microphysical and optical models of ice and liquid cloud particles, and radiative transfer in a plane-parallel, multilayered atmosphere. Measurement and model errors are thoroughly quantified from direct comparisons of clear-sky observations over the ocean with model calculations. Performance tests by retrieval simulations show that ice cloud properties are retrieved with high accuracy when cloud optical thickness (COT) is between 0.1 and 10. Cloud-top pressure is inferred with uncertainty lower than 10% when COT is larger than 0.3. Applying the method to a tropical cloud system and comparing the results with the MODIS Collection 6 cloud product shows good agreement for ice cloud optical thickness when COT is less than about 5. Cloud-top height agrees well with estimates obtained by the CO2 slicing method used in the MODIS product. The present algorithm detects optically thin parts at the edges of high clouds well in comparison with the MODIS product, in which these parts are recognized as low clouds by the infrared window method. The cloud thermodynamic phase in the present algorithm is constrained by cloud-top temperature, which tends not to produce results with an ice cloud that is too warm or a liquid cloud that is too cold.
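    The optimal-estimation core is the standard maximum-a-posteriori (Rodgers-style) update. The algorithm described iterates it with a nonlinear radiative-transfer forward model; for a linear model y = Kx + noise it is a single solve. A sketch (not the authors' code):

```python
import numpy as np

def oe_retrieval(y, K, Se, xa, Sa):
    """Optimal-estimation retrieval for a LINEAR forward model y = K x + e.

    y  : measurement vector (e.g. brightness temperatures)
    K  : Jacobian / forward-model matrix
    Se : measurement + model error covariance
    xa : a priori state;  Sa : a priori covariance
    Solves  x = xa + (K^T Se^-1 K + Sa^-1)^-1 K^T Se^-1 (y - K xa).
    """
    Sei = np.linalg.inv(Se)
    A = K.T @ Sei @ K + np.linalg.inv(Sa)
    return xa + np.linalg.solve(A, K.T @ Sei @ (y - K @ xa))
```

    The prior covariance Sa is what keeps weakly constrained state elements (for example cloud-top pressure at very small COT) near their a priori values instead of fitting noise.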

  20. The mouse cerebellar cortex in organotypic slice cultures: an in vitro model to analyze the consequences of mutations and pathologies on neuronal survival, development, and function.

    PubMed

    Lonchamp, Etienne; Dupont, Jean-Luc; Beekenkamp, Huguette; Poulain, Bernard; Bossu, Jean-Louis

    2006-01-01

    Thin acute slices and dissociated cell cultures taken from different parts of the brain have been widely used to examine the function of the nervous system, neuron-specific interactions, and neuronal development (specifically, neurobiology, neuropharmacology, and neurotoxicology studies). Here, we focus on an alternative in vitro model: brain-slice cultures in roller tubes, initially introduced by Beat Gähwiler for studies with rats, that we have recently adapted for studies of mouse cerebellum. Cultured cerebellar slices afford many of the advantages of dissociated cultures of neurons and thin acute slices. Organotypic slice cultures were established from newborn or 10-15-day-old mice. After 3-4 weeks in culture, the slices flattened to form a cell monolayer. The main types of cerebellar neurons could be identified with immunostaining techniques, while their electrophysiological properties could be easily characterized with the patch-clamp recording technique. When slices were taken from newborn mice and cultured for 3 weeks, aspects of the cerebellar development were displayed. A functional neuronal network was established despite the absence of mossy and climbing fibers, which are the two excitatory afferent projections to the cerebellum. When slices were made from 10-15-day-old mice, which are at a developmental stage when cerebellum organization is almost established, the structure and neuronal pathways were intact after 3-4 weeks in culture. These unique characteristics make organotypic slice cultures of mouse cerebellar cortex a valuable model for analyzing the consequences of gene mutations that profoundly alter neuronal function and compromise postnatal survival.

  1. Slice sampling technique in Bayesian extreme of gold price modelling

    NASA Astrophysics Data System (ADS)

    Rostami, Mohammad; Adam, Mohd Bakri; Ibrahim, Noor Akma; Yahya, Mohamed Hisham

    2013-09-01

    In this paper, a simulation study of Bayesian extreme values using Markov chain Monte Carlo via the slice sampling algorithm is implemented. We compared the accuracy of slice sampling with other methods for a Gumbel model. This study revealed that the slice sampling algorithm offers more accurate and closer estimates, with less RMSE, than the other methods. Finally, we successfully employed this procedure to estimate the parameters of Malaysian extreme gold prices from 2000 to 2011.
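
    The slice sampling algorithm evaluated in this study can be sketched generically. Below is a minimal univariate slice sampler with stepping-out and shrinkage (Neal's formulation), not the authors' code; the Gumbel log-density in the usage note simply mirrors the paper's model class.

    ```python
    import math
    import random

    def slice_sample(logpdf, x0, n, w=1.0, rng=None):
        """Univariate slice sampler with stepping-out and shrinkage."""
        rng = rng or random.Random()
        samples, x = [], x0
        for _ in range(n):
            # Vertical step: draw the auxiliary height defining the slice
            # {x' : logpdf(x') > logy}; -Exp(1) is the log of a Uniform(0,1) draw.
            logy = logpdf(x) - rng.expovariate(1.0)
            # Stepping out: expand [l, r] until both ends leave the slice
            l = x - w * rng.random()
            r = l + w
            while logpdf(l) > logy:
                l -= w
            while logpdf(r) > logy:
                r += w
            # Shrinkage: sample uniformly in [l, r], shrinking on rejection
            while True:
                x1 = rng.uniform(l, r)
                if logpdf(x1) > logy:
                    x = x1
                    break
                if x1 < x:
                    l = x1
                else:
                    r = x1
            samples.append(x)
        return samples
    ```

    Run on a standard Gumbel log-density, `lambda z: -z - math.exp(-z)`, the sample mean approaches the Euler-Mascheroni constant (about 0.577), the known mean of that distribution.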

  2. Preparation of Acute Brain Slices Using an Optimized N-Methyl-D-glucamine Protective Recovery Method.

    PubMed

    Ting, Jonathan T; Lee, Brian R; Chong, Peter; Soler-Llavina, Gilberto; Cobbs, Charles; Koch, Christof; Zeng, Hongkui; Lein, Ed

    2018-02-26

    This protocol is a practical guide to the N-methyl-D-glucamine (NMDG) protective recovery method of brain slice preparation. Numerous recent studies have validated the utility of this method for enhancing neuronal preservation and overall brain slice viability. The implementation of this technique by early adopters has facilitated detailed investigations into brain function using diverse experimental applications and spanning a wide range of animal ages, brain regions, and cell types. Steps are outlined for carrying out the protective recovery brain slice technique using an optimized NMDG artificial cerebrospinal fluid (aCSF) media formulation and enhanced procedure to reliably obtain healthy brain slices for patch clamp electrophysiology. With this updated approach, a substantial improvement is observed in the speed and reliability of gigaohm seal formation during targeted patch clamp recording experiments while maintaining excellent neuronal preservation, thereby facilitating challenging experimental applications. Representative results are provided from multi-neuron patch clamp recording experiments to assay synaptic connectivity in neocortical brain slices prepared from young adult transgenic mice and mature adult human neurosurgical specimens. Furthermore, the optimized NMDG protective recovery method of brain slicing is compatible with both juvenile and adult animals, thus resolving a limitation of the original methodology. In summary, a single media formulation and brain slicing procedure can be implemented across various species and ages to achieve excellent viability and tissue preservation.

  3. Preparation of Acute Brain Slices Using an Optimized N-Methyl-D-glucamine Protective Recovery Method

    PubMed Central

    Chong, Peter; Soler-Llavina, Gilberto; Cobbs, Charles; Koch, Christof; Zeng, Hongkui; Lein, Ed

    2018-01-01

    This protocol is a practical guide to the N-methyl-D-glucamine (NMDG) protective recovery method of brain slice preparation. Numerous recent studies have validated the utility of this method for enhancing neuronal preservation and overall brain slice viability. The implementation of this technique by early adopters has facilitated detailed investigations into brain function using diverse experimental applications and spanning a wide range of animal ages, brain regions, and cell types. Steps are outlined for carrying out the protective recovery brain slice technique using an optimized NMDG artificial cerebrospinal fluid (aCSF) media formulation and enhanced procedure to reliably obtain healthy brain slices for patch clamp electrophysiology. With this updated approach, a substantial improvement is observed in the speed and reliability of gigaohm seal formation during targeted patch clamp recording experiments while maintaining excellent neuronal preservation, thereby facilitating challenging experimental applications. Representative results are provided from multi-neuron patch clamp recording experiments to assay synaptic connectivity in neocortical brain slices prepared from young adult transgenic mice and mature adult human neurosurgical specimens. Furthermore, the optimized NMDG protective recovery method of brain slicing is compatible with both juvenile and adult animals, thus resolving a limitation of the original methodology. In summary, a single media formulation and brain slicing procedure can be implemented across various species and ages to achieve excellent viability and tissue preservation. PMID:29553547

  4. Inter-Annual and Decadal Changes in Tropospheric and Stratospheric Ozone

    NASA Technical Reports Server (NTRS)

    Ziemke, J. R.; Chandra, S.

    2011-01-01

    Ozone data beginning October 2004 from the Aura Ozone Monitoring Instrument (OMI) and Aura Microwave Limb Sounder (MLS) are used to evaluate the accuracy of the cloud slicing technique in an effort to develop long data records of tropospheric and stratospheric ozone and to study their long-term changes. Using this technique, we have produced a 32-year (1979-2010) record of tropospheric and stratospheric ozone from the combined Total Ozone Mapping Spectrometer (TOMS) and OMI. The analyses of these time series suggest that the quasi-biennial oscillation (QBO) is the dominant source of inter-annual changes of 30-40 Dobson Units (DU). Tropospheric ozone also indicates a QBO signal, with peak-to-peak changes varying from 2 to 7 DU. Decadal changes in global stratospheric ozone indicate a turnaround in ozone loss around the mid-1990s, with most of these changes occurring in the Northern Hemisphere from the subtropics to high latitudes. The trend results are generally consistent with the predictions of chemistry-climate models, which include the reduction of ozone-destroying substances beginning in the late 1980s mandated by the Montreal Protocol.
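
    In its simplest form, the cloud slicing technique reduces to a linear fit of above-cloud column ozone against cloud-top pressure, with the slope converted to a mean volume mixing ratio. A minimal sketch, assuming column ozone in Dobson Units and pressure in hPa (the factor 1.27 is the standard DU-per-hPa-to-ppmv conversion); this is not the operational OMI/MLS algorithm.

    ```python
    import numpy as np

    def cloud_slice_vmr(ctp_hpa, above_cloud_o3_du):
        """Least-squares slope of above-cloud column ozone (DU) versus
        cloud-top pressure (hPa), converted to a mean ozone volume
        mixing ratio (ppmv) for the layer spanned by the cloud tops."""
        slope, _intercept = np.polyfit(ctp_hpa, above_cloud_o3_du, 1)
        return 1.27 * slope
    ```

    For example, a set of co-located scenes whose above-cloud column grows by 0.05 DU per hPa of cloud-top pressure implies a mean mixing ratio of about 0.064 ppmv over that pressure range.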

  5. Cloud Size Distributions from Multi-sensor Observations of Shallow Cumulus Clouds

    NASA Astrophysics Data System (ADS)

    Kleiss, J.; Riley, E.; Kassianov, E.; Long, C. N.; Riihimaki, L.; Berg, L. K.

    2017-12-01

    Combined radar-lidar observations have been used for almost two decades to document temporal changes of shallow cumulus clouds at the U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Facility's Southern Great Plains (SGP) site in Oklahoma, USA. Since the ARM zenith-pointed radars and lidars have a narrow field-of-view (FOV), the documented cloud statistics, such as distributions of cloud chord length (or horizontal length scale), represent only a slice along the wind direction of a region surrounding the SGP site, and thus may not be representative for this region. To investigate this impact, we compare cloud statistics obtained from wide-FOV sky images collected by ground-based observations at the SGP site to those from the narrow FOV active sensors. The main wide-FOV cloud statistics considered are cloud area distributions of shallow cumulus clouds, which are frequently required to evaluate model performance, such as routine large eddy simulation (LES) currently being conducted by the ARM LASSO (LES ARM Symbiotic Simulation and Observation) project. We obtain complementary macrophysical properties of shallow cumulus clouds, such as cloud chord length, base height and thickness, from the combined radar-lidar observations. To better understand the broader observational context where these narrow FOV cloud statistics occur, we compare them to collocated and coincident cloud area distributions from wide-FOV sky images and high-resolution satellite images. We discuss the comparison results and illustrate the possibility to generate a long-term climatology of cloud size distributions from multi-sensor observations at the SGP site.
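
    The cloud chord length from a zenith-pointing sensor, as used in this record, is commonly estimated as the cloud passage duration times the advection speed. A minimal sketch under that assumption (the function name and inputs are illustrative):

    ```python
    import numpy as np

    def chord_lengths(cloud_mask, dt_s, wind_ms):
        """Chord lengths (m) of clouds advected over a zenith-pointing
        sensor: each contiguous cloudy run of n samples maps to a chord
        of n * dt_s * wind_ms."""
        m = np.asarray(cloud_mask, dtype=int)
        # Pad with clear sky so every run has a detectable start and end
        edges = np.flatnonzero(np.diff(np.concatenate(([0], m, [0]))))
        starts, ends = edges[::2], edges[1::2]
        return (ends - starts) * dt_s * wind_ms
    ```

    A time series sampled every 10 s with a 5 m/s advection speed turns a 2-sample and a 3-sample cloudy run into 100 m and 150 m chords, respectively.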

  6. Dispersion-based Fresh-slice Scheme for Free-Electron Lasers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guetg, Marc

    The fresh-slice technique has improved the performance of several self-amplified spontaneous emission free-electron laser schemes by granting selective control of the temporal lasing slice without spoiling the other electron bunch slices. So far, the implementation has required a special insertion device, called a dechirper, to create the beam yaw. We demonstrate a novel scheme to enable fresh-slice operation based on electron energy chirp and orbit dispersion that can be implemented at any free-electron laser facility without additional hardware.

  7. 3D acquisition and modeling for flint artefacts analysis

    NASA Astrophysics Data System (ADS)

    Loriot, B.; Fougerolle, Y.; Sestier, C.; Seulin, R.

    2007-07-01

    In this paper, we are interested in the accurate acquisition and modeling of flint artefacts. Archaeologists need accurate geometry measurements to refine their understanding of the flint artefact manufacturing process. Current techniques require several operations. First, a copy of a flint artefact is reproduced. The copy is then sliced, and a picture is taken of each slice. Eventually, geometric information is manually determined from the pictures. Such a technique is very time consuming, and the processing applied to the original, as well as to the reproduced object, induces several measurement errors (prototyping approximations, slicing, image acquisition, and measurement). By using 3D scanners, we significantly reduce the number of operations related to data acquisition and completely suppress the prototyping step to obtain an accurate 3D model. The 3D models are segmented into sliced parts that are then analyzed. Each slice is then automatically fitted by a mathematical representation. Such a representation offers several interesting properties: geometric features can be characterized (e.g., shape, curvature, sharp edges), and the shape of the original piece of stone can be extrapolated. The contributions of this paper are an acquisition technique using 3D scanners that strongly reduces human intervention, acquisition time, and measurement errors, and the representation of flint artefacts as mathematical 2D sections that enables accurate analysis.

  8. A Climatology of Fair-Weather Cloud Statistics at the Atmospheric Radiation Measurement Program Southern Great Plains Site: Temporal and Spatial Variability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berg, Larry K.; Kassianov, Evgueni I.; Long, Charles N.

    2006-03-30

    In previous work, Berg and Stull (2005) developed a new parameterization for fair-weather cumuli (FWC). Preliminary testing of the new scheme used data collected during a field experiment conducted during the summer of 1996. This campaign included a few research flights conducted over three locations within the Atmospheric Radiation Measurement (ARM) Climate Research Facility (ACRF) Southern Great Plains (SGP) site. A more comprehensive verification of the new scheme requires a detailed climatology of FWC. Several cloud climatologies have been completed for the ACRF SGP, but these efforts have focused on either broad categories of clouds grouped by height and season (e.g., Lazarus et al. 1999) or height and time of day (e.g., Dong et al. 2005). In these two examples, the low clouds were not separated by type of cloud, either stratiform or cumuliform, nor were the horizontal chord length (the length of the cloud slice that passed directly overhead) or cloud aspect ratio (defined as the ratio of the cloud thickness to the cloud chord length) reported. Lane et al. (2002) presented distributions of cloud chord length, but only for one year. The work presented here addresses these shortcomings by looking explicitly at cases with FWC over five summers. Specifically, we will address the following questions: • Does the cloud fraction (CF), cloud-base height (CBH), and cloud-top height (CTH) of FWC change with the time of day or the year? • What is the distribution of FWC chord lengths? • Is there a relationship between the cloud chord length and the cloud thickness?

  9. ERTS-1 image enhancement by optically combining density slices

    NASA Technical Reports Server (NTRS)

    Tapper, G. O.; Pease, R. W.

    1973-01-01

    The technique of density slicing using photographic film, and its application to the enhancement of ERTS-1 imagery, has proved useful for mapping variegated areal phenomena and provides a useful supplement to the I2S MiniAddcol viewing system. The initial experiments conducted with this film were encouraging and indicated that this technique of density slicing, using readily accessible darkroom facilities and simple darkroom procedures, allows rapid, accurate, and facile interpretation of certain areal phenomena to be made from the imagery. The distribution of the tree yucca, Yucca brevifolia Jaegeriana, in the eastern Mojave Desert of southern California and southern Nevada was used as an example to test the accuracy of the technique for mapping purposes. The distribution was mapped at a relatively high level of accuracy.
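
    A digital analogue of the photographic density slicing described above can be sketched as simple level binning; the slice bounds and the highlighting step are illustrative, not the original darkroom procedure.

    ```python
    import numpy as np

    def density_slice(image, bounds):
        """Assign each pixel to slice k, where bounds[k-1] <= value < bounds[k]
        (slice 0 lies below the first bound)."""
        return np.digitize(image, bounds)

    def highlight(slices, keep):
        """Binary mask of pixels falling in any of the selected slices,
        analogous to printing only chosen density ranges."""
        return np.isin(slices, keep)
    ```

    With bounds `[50, 150]`, an 8-bit scene splits into three slices, and highlighting slice 1 isolates the mid-density features (here, the hypothetical yucca signature).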

  10. Above-Cloud Precipitable Water Retrievals using the MODIS 0.94 micron Band with Applications for Multi-Layer Cloud Detection

    NASA Technical Reports Server (NTRS)

    Platnick, S.; Wind, G.

    2004-01-01

    In order to perform satellite retrievals of cloud properties, it is important to account for the effect of the above-cloud atmosphere on the observations. The solar bands used in the operational MODIS Terra and Aqua cloud optical and microphysical algorithms (visible, NIR, and SWIR spectral windows) are primarily affected by water vapor, and to a lesser extent by well-mixed gases. For water vapor, the above-cloud column amount, or precipitable water, provides adequate information for an atmospheric correction; details of the vertical vapor distribution are not typically necessary for the level of correction required. Cloud-top pressure has a secondary effect due to pressure-broadening influences. For well-mixed gases, cloud-top pressure is also required for estimates of above-cloud abundances. We present a method for obtaining above-cloud precipitable water over dark ocean surfaces using the MODIS 0.94 µm vapor absorption band. The retrieval includes an iterative procedure for establishing cloud-top temperature and pressure, and is useful for both single-layer water and ice clouds. Knowledge of cloud thermodynamic phase is fundamental in retrieving cloud optical and microphysical properties. However, in cases of optically thin cirrus overlapping lower water clouds, the concept of a single unique phase is ill-defined and depends, at least, on the spectral region of interest. We will present a method for multi-layer and multi-phase cloud detection which uses above-cloud precipitable water retrievals along with several existing MODIS operational cloud products (cloud-top pressure derived from a CO2 slicing algorithm, IR and SWIR phase retrievals). Results are categorized by whether the radiative signature in the MODIS solar bands is primarily that of a water cloud with ice cloud contamination, or vice versa. Examples in polar and mid-latitude regions will be shown.

  11. The Advanced Light Source (ALS) Slicing Undulator Beamline

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heimann, P. A.; Glover, T. E.; Plate, D.

    2007-01-19

    A beamline optimized for the bunch slicing technique has been constructed at the Advanced Light Source (ALS). This beamline includes an in-vacuum undulator, soft and hard x-ray beamlines, and a femtosecond laser system. The soft x-ray beamline may operate in spectrometer mode, where an entire absorption spectrum is accumulated at one time, or in monochromator mode. The femtosecond laser system has a high repetition rate of 20 kHz to improve the average slicing flux. The performance of the soft x-ray branch of the ALS slicing undulator beamline will be presented.

  12. Urea Biosynthesis Using Liver Slices

    ERIC Educational Resources Information Center

    Teal, A. R.

    1976-01-01

    Presented is a practical scheme to enable introductory biology students to investigate the mechanism by which urea is synthesized in the liver. The tissue-slice technique is discussed, and methods for the quantitative analysis of metabolites are presented. (Author/SL)

  13. Fresh Slice Self-Seeding and Fresh Slice Harmonic Lasing at LCLS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amann, J.W.

    We present results from the successful demonstration of fresh slice self-seeding at the Linac Coherent Light Source (LCLS).* The performance is compared with SASE and regular self-seeding at a photon energy of 5.5 keV, resulting in a relative average brightness increase of a factor of 12 and a factor of 2, respectively. Following this proof-of-principle we discuss the forthcoming plans to use the same technique** for fresh slice harmonic lasing in an upcoming experiment. The demonstration of fresh slice harmonic lasing provides an attractive solution for future XFELs aiming to achieve high-efficiency, high-brightness X-ray pulses at high photon energies (>12 keV).***

  14. A Comparison of Several Techniques to Assign Heights to Cloud Tracers.

    NASA Astrophysics Data System (ADS)

    Nieman, Steven J.; Schmetz, Johannes; Menzel, W. Paul

    1993-09-01

    Satellite-derived cloud-motion vector (CMV) production has been troubled by inaccurate height assignment of cloud tracers, especially in thin semitransparent clouds. This paper presents the results of an intercomparison of current operational height assignment techniques. Currently, heights are assigned by one of three techniques when the appropriate spectral radiance measurements are available. The infrared window (IRW) technique compares measured brightness temperatures to forecast temperature profiles and thus infers opaque cloud levels. In semitransparent or small subpixel clouds, the carbon dioxide (CO2) technique uses the ratio of radiances from different layers of the atmosphere to infer the correct cloud height. In the water vapor (H2O) technique, radiances influenced by upper-tropospheric moisture and IRW radiances are measured for several pixels viewing different cloud amounts, and their linear relationship is used to extrapolate the correct cloud height. The results presented in this paper suggest that the H2O technique is a viable alternative to the CO2 technique for inferring the heights of semitransparent cloud elements. This is important since future National Environmental Satellite, Data, and Information Service (NESDIS) operations will have to rely on H2O-derived cloud-height assignments in the wind field determinations with the next operational geostationary satellite. On a given day, the heights from the two approaches compare to within 60-110 hPa rms; drier atmospheric conditions tend to reduce the effectiveness of the H2O technique. By inference one can conclude that the present height algorithms used operationally at NESDIS (with the CO2 technique) and at the European Satellite Operations Center (ESOC) (with their version of the H2O technique) are providing similar results. Sample wind fields produced with the ESOC and NESDIS algorithms using Meteosat-4 data show good agreement.
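
    The CO2 (ratio) technique mentioned above compares an observed two-band ratio of clear-minus-measured radiances against modeled ratios as a function of cloud pressure, picking the best-matching level. A minimal sketch, assuming precomputed theoretical ratios on a pressure grid (all inputs illustrative):

    ```python
    import numpy as np

    def co2_slice_pressure(pressures, ratio_theory, r_clear, r_meas):
        """Cloud-top pressure whose modeled two-band ratio of
        (clear - measured) radiance differences best matches the
        observed ratio; index 0 and 1 are the two CO2-band channels."""
        obs_ratio = (r_clear[0] - r_meas[0]) / (r_clear[1] - r_meas[1])
        return pressures[np.argmin(np.abs(ratio_theory - obs_ratio))]
    ```

    A real implementation would compute `ratio_theory` from a forward radiative transfer model on the forecast profile; here a tiny lookup grid stands in for it.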

  15. Hyoid bone development: An assessment of optimal CT scanner parameters and 3D volume rendering techniques

    PubMed Central

    Cotter, Meghan M.; Whyms, Brian J.; Kelly, Michael P.; Doherty, Benjamin M.; Gentry, Lindell R.; Bersu, Edward T.; Vorperian, Houri K.

    2015-01-01

    The hyoid bone anchors and supports the vocal tract. Its complex shape is best studied in three dimensions, but it is difficult to capture on computed tomography (CT) images and three-dimensional volume renderings. The goal of this study was to determine the optimal CT scanning and rendering parameters to accurately measure the growth and developmental anatomy of the hyoid and to determine whether it is feasible and necessary to use these parameters in the measurement of hyoids from in vivo CT scans. Direct linear and volumetric measurements of skeletonized hyoid bone specimens were compared to corresponding CT images to determine the most accurate scanning parameters and three-dimensional rendering techniques. A pilot study was undertaken using in vivo scans from a retrospective CT database to determine feasibility of quantifying hyoid growth. Scanning parameters and rendering technique affected accuracy of measurements. Most linear CT measurements were within 10% of direct measurements; however, volume was overestimated when CT scans were acquired with a slice thickness greater than 1.25 mm. Slice-by-slice thresholding of hyoid images decreased volume overestimation. The pilot study revealed that the linear measurements tested correlate with age. A fine-tuned rendering approach applied to small slice thickness CT scans produces the most accurate measurements of hyoid bones. However, linear measurements can be accurately assessed from in vivo CT scans at a larger slice thickness. Such findings imply that investigation into the growth and development of the hyoid bone, and the vocal tract as a whole, can now be performed using these techniques. PMID:25810349

  16. Hyoid Bone Development: An Assessment Of Optimal CT Scanner Parameters and Three-Dimensional Volume Rendering Techniques.

    PubMed

    Cotter, Meghan M; Whyms, Brian J; Kelly, Michael P; Doherty, Benjamin M; Gentry, Lindell R; Bersu, Edward T; Vorperian, Houri K

    2015-08-01

    The hyoid bone anchors and supports the vocal tract. Its complex shape is best studied in three dimensions, but it is difficult to capture on computed tomography (CT) images and three-dimensional volume renderings. The goal of this study was to determine the optimal CT scanning and rendering parameters to accurately measure the growth and developmental anatomy of the hyoid and to determine whether it is feasible and necessary to use these parameters in the measurement of hyoids from in vivo CT scans. Direct linear and volumetric measurements of skeletonized hyoid bone specimens were compared with corresponding CT images to determine the most accurate scanning parameters and three-dimensional rendering techniques. A pilot study was undertaken using in vivo scans from a retrospective CT database to determine feasibility of quantifying hyoid growth. Scanning parameters and rendering technique affected accuracy of measurements. Most linear CT measurements were within 10% of direct measurements; however, volume was overestimated when CT scans were acquired with a slice thickness greater than 1.25 mm. Slice-by-slice thresholding of hyoid images decreased volume overestimation. The pilot study revealed that the linear measurements tested correlate with age. A fine-tuned rendering approach applied to small slice thickness CT scans produces the most accurate measurements of hyoid bones. However, linear measurements can be accurately assessed from in vivo CT scans at a larger slice thickness. Such findings imply that investigation into the growth and development of the hyoid bone, and the vocal tract as a whole, can now be performed using these techniques. © 2015 Wiley Periodicals, Inc.
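
    The slice-by-slice thresholding and volume estimate described in this record can be sketched as voxel counting; the thresholds, pixel area, and slice thickness below are illustrative placeholders, not the study's calibrated values.

    ```python
    import numpy as np

    def bone_volume_mm3(slices, px_area_mm2, slice_mm, thresholds):
        """Slice-by-slice thresholding: count bone pixels above each
        slice's own threshold and multiply by the voxel volume."""
        n_voxels = sum(int((s > t).sum()) for s, t in zip(slices, thresholds))
        return n_voxels * px_area_mm2 * slice_mm
    ```

    Choosing thresholds per slice, rather than one global cutoff, is what the study reports as reducing the volume overestimation at larger slice thicknesses.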

  17. SU-E-J-240: Development of a Novel 4D MRI Sequence for Real-Time Liver Tumor Tracking During Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhuang, L; Burmeister, J; Ye, Y

    2015-06-15

    Purpose: To develop a novel 4D MRI technique that is feasible for real-time liver tumor tracking during radiotherapy. Methods: A volunteer underwent an abdominal 2D fast EPI coronal scan on a 3.0T MRI scanner (Siemens Inc., Germany). An optimal set of parameters was determined based on image quality and scan time. A total of 23 slices were scanned to cover the whole liver in the test scan. For each scan position, the 2D images were retrospectively sorted into multiple phases based on a breathing signal extracted from the images. Consequently, the 2D slices with the same phase numbers were stacked to form one 3D image. Multiple phases of 3D images formed the 4D MRI sequence representing one breathing cycle. Results: The optimal set of scan parameters was: TR = 57 ms, TE = 19 ms, FOV read = 320 mm, and flip angle = 30°, which resulted in a total scan time of 14 s for 200 frames (FMs) per slice and an image resolution of (2.5 mm, 2.5 mm, 5.0 mm) in three directions. Ten phases of 3D images were generated, each of which had 23 slices. Based on our test scan, only 100 FMs were necessary for the phase sorting process, which may lower the scan time to 7 s/100 FMs/slice. For example, only 5 slices/35 s are necessary for a 4D MRI scan to cover a liver tumor of size ≤ 2 cm, leading to the possibility of tumor trajectory tracking every 35 s during treatment. Conclusion: The novel 4D MRI technique we developed can reconstruct a 4D liver MRI sequence representing one breathing cycle (7 s/slice) without an external monitor. This technique can potentially be used for real-time liver tumor tracking during radiotherapy.
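
    The retrospective phase-sorting step described in the Methods can be sketched as follows. This is a simple amplitude-based binning of frames by their breathing-signal value, an assumption for illustration; the abstract does not specify the exact sorting rule used.

    ```python
    import numpy as np

    def sort_into_phases(signal, n_phases):
        """Assign each 2D frame to a respiratory phase bin using the
        normalized amplitude of its breathing signal."""
        s = np.asarray(signal, dtype=float)
        a = (s - s.min()) / (s.max() - s.min())            # normalize to 0..1
        return np.minimum((a * n_phases).astype(int), n_phases - 1)

    def build_phase_image(frames, bins, phase):
        """Average all frames of one slice position that fall in the
        given phase bin, yielding that slice of the 3D phase image."""
        sel = [f for f, b in zip(frames, bins) if b == phase]
        return np.mean(sel, axis=0)
    ```

    Repeating `build_phase_image` for every slice position and stacking the results produces one 3D volume per phase, i.e. the 4D sequence over a breathing cycle.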

  18. Comparison of global cloud liquid water path derived from microwave measurements with CERES-MODIS

    NASA Astrophysics Data System (ADS)

    Yi, Y.; Minnis, P.; Huang, J.; Lin, B.; Ayers, K.; Sun-Mack, S.; Fan, A.

    Cloud liquid water path (LWP) is a crucial parameter for climate studies due to the link that it provides between the atmospheric hydrological and radiative budgets. Satellite-based visible-infrared techniques, such as the Visible Infrared Solar Split-Window Technique (VISST), can retrieve LWP for water clouds, assumed to be single-layer, over a variety of surfaces. If the water clouds are overlapped by ice clouds, the LWP of the underlying clouds cannot be retrieved by such techniques. However, microwave techniques may be used to retrieve the LWP underneath ice clouds due to the microwave's insensitivity to cloud ice particles. LWP is typically retrieved from satellite-observed microwave radiances only over ocean, due to variations of land surface temperature and emissivity. Recently, Deeter and Vivekanandan (2006) developed a new technique for retrieving LWP over land. In order to overcome the sensitivity to land surface temperature and emissivity, their technique is based on a parameterization of microwave polarization-difference signals. In this study, a similar regression-based technique for retrieving LWP over land and ocean using Advanced Microwave Scanning Radiometer - EOS (AMSR-E) measurements is developed. Furthermore, the microwave surface emissivities are also derived using clear-sky fields of view based on the Clouds and the Earth's Radiant Energy System Moderate-resolution Imaging Spectroradiometer (CERES-MODIS) cloud mask. These emissivities are used in an alternate form of the technique. The results are evaluated using independent measurements such
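
    The regression idea behind the over-land retrieval, fitting LWP against a microwave polarization-difference signal, can be sketched in its simplest linear form; the coefficients and inputs here are purely illustrative, not the AMSR-E parameterization.

    ```python
    import numpy as np

    def fit_lwp_regression(pol_diff_K, lwp_gm2):
        """Least-squares linear fit of LWP (g/m^2) against the microwave
        polarization difference (K); returns a callable predictor."""
        coeffs = np.polyfit(pol_diff_K, lwp_gm2, 1)
        return np.poly1d(coeffs)
    ```

    Trained on matchups where LWP is known (e.g., over ocean or from ground sites), the fitted predictor then maps new polarization-difference observations to LWP.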

  19. Bias Field Inconsistency Correction of Motion-Scattered Multislice MRI for Improved 3D Image Reconstruction

    PubMed Central

    Kim, Kio; Habas, Piotr A.; Rajagopalan, Vidya; Scott, Julia A.; Corbett-Detig, James M.; Rousseau, Francois; Barkovich, A. James; Glenn, Orit A.; Studholme, Colin

    2012-01-01

    A common solution to clinical MR imaging in the presence of large anatomical motion is to use fast multi-slice 2D studies to reduce slice acquisition time and provide clinically usable slice data. Recently, techniques have been developed which retrospectively correct large scale 3D motion between individual slices allowing the formation of a geometrically correct 3D volume from the multiple slice stacks. One challenge, however, in the final reconstruction process is the possibility of varying intensity bias in the slice data, typically due to the motion of the anatomy relative to imaging coils. As a result, slices which cover the same region of anatomy at different times may exhibit different sensitivity. This bias field inconsistency can induce artifacts in the final 3D reconstruction that can impact both clinical interpretation of key tissue boundaries and the automated analysis of the data. Here we describe a framework to estimate and correct the bias field inconsistency in each slice collectively across all motion corrupted image slices. Experiments using synthetic and clinical data show that the proposed method reduces intensity variability in tissues and improves the distinction between key tissue types. PMID:21511561
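
    A heavily simplified version of the per-slice intensity harmonization can be sketched as a global gain correction; the paper's method estimates smooth spatial bias fields jointly across all motion-scattered slices, which this sketch does not attempt.

    ```python
    import numpy as np

    def harmonize_slice_gains(slices):
        """Crude per-slice gain correction: scale each slice so its mean
        intensity matches the stack-wide mean, removing only the global
        sensitivity difference between acquisitions."""
        means = np.array([s.mean() for s in slices])
        target = means.mean()
        return [s * (target / m) for s, m in zip(slices, means)]
    ```

    Even this global rescaling illustrates the goal: slices covering the same anatomy at different coil positions should not carry systematically different intensities into the 3D reconstruction.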

  20. Comparative evaluation of polarimetric and bi-spectral cloud microphysics retrievals: Retrieval closure experiments and comparisons based on idealized and LES case studies

    NASA Astrophysics Data System (ADS)

    Miller, D. J.; Zhang, Z.; Ackerman, A. S.; Platnick, S. E.; Cornet, C.

    2016-12-01

    A remote sensing cloud retrieval simulator, created by coupling an LES cloud model with vector radiative transfer (RT) models, is the ideal framework for assessing cloud remote sensing techniques. This simulator serves as a tool for understanding bi-spectral and polarimetric retrievals by comparing them directly to LES cloud properties (retrieval closure comparison) and for comparing the retrieval techniques to one another. Our simulator utilizes the DHARMA LES [Ackerman et al., 2004] with cloud properties based on marine boundary layer (MBL) clouds observed during the DYCOMS-II and ATEX field campaigns. The cloud reflectances are produced by the vectorized RT models based on polarized doubling-adding and Monte Carlo techniques (PDA, MCPOL). Retrievals are performed utilizing techniques as similar as possible to those implemented on their corresponding well-known instruments; polarimetric retrievals are based on techniques implemented for polarimeters (POLDER, AirMSPI, and RSP) and bi-spectral retrievals are performed using the Nakajima-King LUT method utilized on a number of spectral instruments (MODIS and VIIRS). Retrieval comparisons focus on cloud droplet effective radius (re), effective variance (ve), and cloud optical thickness (τ). This work explores the sensitivities of these two retrieval techniques to various observation limitations, such as spatial resolution/cloud inhomogeneity, the impact of 3D radiative effects, and angular resolution requirements. With future remote sensing missions like NASA's Aerosols/Clouds/Ecosystems (ACE) planning to feature advanced polarimetric instruments, it is important to understand how these retrieval techniques compare to one another. The cloud retrieval simulator we have developed allows us to probe these important questions in a realistically relevant test bed.
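
    The bi-spectral Nakajima-King LUT inversion named above can be sketched as a nearest-neighbour search over a precomputed reflectance table; the tiny LUT below is illustrative, since real tables are computed with a radiative transfer model over fine (τ, re) grids.

    ```python
    import numpy as np

    def nakajima_king_invert(r_vis, r_swir, lut_vis, lut_swir, taus, res):
        """Crudest form of the bi-spectral LUT method: find the (tau, re)
        grid point whose modeled VIS/SWIR reflectance pair is closest to
        the observed pair."""
        d = (lut_vis - r_vis) ** 2 + (lut_swir - r_swir) ** 2
        i, j = np.unravel_index(np.argmin(d), d.shape)
        return taus[i], res[j]
    ```

    The method works because the visible reflectance is mainly sensitive to τ while the absorbing SWIR band is mainly sensitive to re, so the two-band pair nearly decouples the two unknowns.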

  1. A z-gradient array for simultaneous multi-slice excitation with a single-band RF pulse.

    PubMed

    Ertan, Koray; Taraghinia, Soheil; Sadeghi, Alireza; Atalar, Ergin

    2018-07-01

Multi-slice radiofrequency (RF) pulses have higher specific absorption rates, more peak RF power, and longer pulse durations than single-slice RF pulses. Gradient field design techniques using a z-gradient array are investigated for exciting multiple slices with a single-band RF pulse. Two different field design methods are formulated to solve for the required current values of the gradient array elements for the given slice locations. The method requirements are specified, optimization problems are formulated for the minimum current norm and an analytical solution is provided. A 9-channel z-gradient coil array driven by independent, custom-designed gradient amplifiers is used to validate the theory. Performance measures such as normalized slice thickness error, gradient strength per unit norm current, power dissipation, and maximum amplitude of the magnetic field are provided for various slice locations and numbers of slices. Two and 3 slices are excited by a single-band RF pulse in simulations and phantom experiments. The possibility of multi-slice excitation with a single-band RF pulse using a z-gradient array is validated in simulations and phantom experiments. Magn Reson Med 80:400-412, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
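The minimum-current-norm formulation mentioned in the abstract has a standard closed form. The sketch below illustrates it with a made-up sensitivity matrix: `A` (per-unit-current Bz of each of 9 channels at a few constraint points `z`) and the target field `b` are illustrative assumptions, not the paper's coil model:

```python
import numpy as np

# Given A I = b with more channels than field constraints, the
# minimum-L2-norm current vector is I = A^T (A A^T)^{-1} b, i.e.
# the Moore-Penrose pseudoinverse solution.
rng = np.random.default_rng(0)
z = np.linspace(-0.1, 0.1, 7)            # constraint points along z (m)
A = rng.normal(size=(len(z), 9))         # illustrative channel field profiles
b = 0.02 * z - 0.001                     # desired Bz at the constraints (T)

I = np.linalg.pinv(A) @ b                # minimum-norm exact solution
```

Because the pseudoinverse solution lies in the row space of `A`, adding any null-space current pattern satisfies the same constraints at a strictly larger norm — which is why this choice minimizes dissipation-like cost measures.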

  2. Tryptophan availability modulates serotonin release from rat hypothalamic slices

    NASA Technical Reports Server (NTRS)

    Schaechter, Judith D.; Wurtman, Richard J.

    1989-01-01

The relationship between tryptophan availability and serotonin release from the rat hypothalamus was investigated using a new in vitro technique for estimating the rates at which endogenous serotonin is released, spontaneously or upon electrical depolarization, from hypothalamic slices superfused with a solution containing various amounts of tryptophan. It was found that both the spontaneous and the electrically induced release of serotonin from the brain slices exhibited a dose-dependent relationship with the tryptophan concentration of the superfusion medium.

  3. Fresh-slice multicolour X-ray free-electron lasers

    DOE PAGES

    Lutman, Alberto A.; Maxwell, Timothy J.; MacArthur, James P.; ...

    2016-10-24

X-ray free-electron lasers (XFELs) provide femtosecond X-ray pulses with a narrow energy bandwidth and unprecedented brightness. Ultrafast physical and chemical dynamics, initiated with a site-specific X-ray pulse, can be explored using XFELs with a second ultrashort X-ray probe pulse. However, existing double-pulse schemes are complicated, difficult to customize or provide only low-intensity pulses. Here we present the novel fresh-slice technique for multicolour pulse production, wherein different temporal slices of an electron bunch lase to saturation in separate undulator sections. This method combines electron bunch tailoring from a passive wakefield device with trajectory control to provide multicolour pulses. The fresh-slice scheme outperforms existing techniques at soft X-ray wavelengths. It produces femtosecond pulses with a power of tens of gigawatts and flexible colour separation. The pulse delay can be varied from temporal overlap to almost one picosecond. As a result, we also demonstrate the first three-colour XFEL and variably polarized two-colour pulses.

  4. The luminosity function for the CfA redshift survey slices

    NASA Technical Reports Server (NTRS)

    De Lapparent, Valerie; Geller, Margaret J.; Huchra, John P.

    1989-01-01

The luminosity function for two complete slices of the extension of the CfA redshift survey is calculated. The nonparametric technique of Lynden-Bell (1971) and Turner (1979) is used to determine the shape of the luminosity function for the 12 deg slice of the redshift survey. The amplitude of the luminosity function is determined, taking large-scale inhomogeneities into account. The effects of the Malmquist bias on a magnitude-limited redshift survey are examined, showing that the random errors in the magnitudes for the 12 deg slice affect both the determination of the luminosity function and the spatial density contrast of large-scale structures.

  5. Slicing The 2010 Saturn's Storm: Upper Clouds And Hazes

    NASA Astrophysics Data System (ADS)

    Perez-Hoyos, Santiago; Sanz-Requena, J. F.; Sanchez-Lavega, A.; Hueso, R.

    2012-10-01

At the end of 2010 a small storm erupted in Saturn's northern mid-latitudes. Starting from a localized perturbation, it grew into a global-scale disturbance, covering the whole latitude band by February 2011 (Fletcher et al. 2011, Science 332; Sánchez-Lavega et al. 2011, Nature 475; Fischer et al. 2011, Nature 475). By June 2011 the storm was nearing its end and gradually disappeared (Sánchez-Lavega et al. 2012, Icarus 220). In this work we use the observations acquired by the Cassini ISS instrument during the whole process to investigate the vertical cloud and haze structure above the ammonia condensation level (roughly 1 bar). Cassini ISS observations cover visual wavelengths from the blue to the near-infrared, including two methane absorption bands. These observations have been modeled using a radiative transfer code, which reproduces the atmospheric reflectivity as a function of observation/illumination geometry and wavelength, together with a retrieval technique to find maximum-likelihood atmospheric models. This allows us to investigate several atmospheric parameters: cloud-top pressures, aerosol optical thickness and particle absorption, among others. We will focus on two aspects: (1) maximum-likelihood models for the undisturbed reference atmosphere in the 15°N to 45°N band before and after the disturbance; (2) models for particular structures during the development of the global-scale phenomenon. Our results show a general increase of particle density and single-scattering albedo inside the storm. However, some discrete features showing anomalous structure and related to the storm's peculiar dynamics will also be discussed. Acknowledgments: This work was supported by the Spanish MICIIN project AYA2009-10701 with FEDER funds, by Grupos Gobierno Vasco IT-464-07 and by Universidad País Vasco UPV/EHU through program UFI11/55.

  6. Comparison of Image Processing Techniques for Nonviable Tissue Quantification in Late Gadolinium Enhancement Cardiac Magnetic Resonance Images.

    PubMed

    Carminati, M Chiara; Boniotti, Cinzia; Fusini, Laura; Andreini, Daniele; Pontone, Gianluca; Pepi, Mauro; Caiani, Enrico G

    2016-05-01

The aim of this study was to compare the performance of quantitative methods, either semiautomated or automated, for left ventricular (LV) nonviable tissue analysis from cardiac magnetic resonance late gadolinium enhancement (CMR-LGE) images. The investigated segmentation techniques were: (i) n-standard deviations (n-SD) thresholding; (ii) full width at half maximum (FWHM) thresholding; (iii) Gaussian mixture model classification; and (iv) fuzzy c-means clustering. These algorithms were applied either in each short-axis slice (single-slice approach) or globally, considering the entire short-axis stack covering the LV (global approach). CMR-LGE images from 20 patients with ischemic cardiomyopathy were retrospectively selected, and results from each technique were assessed against manual tracing. All methods provided comparable accuracy in scar detection and computation of local transmurality, and high correlation in scar mass compared with the manual technique. In general, no significant difference between the single-slice and global approaches was noted. The reproducibility of the manual and investigated techniques was confirmed in all cases, with slightly lower results for the n-SD approach. Automated techniques resulted in accurate and reproducible evaluation of LV scars from CMR-LGE in ischemic patients, with performance similar to the manual technique. Their application could minimize user interaction and computational time, even when compared with semiautomated approaches.
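The first two threshold rules compared in that study can be sketched in a few lines. The region masks, the choice n = 5, and the function names below are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def scar_mask_nsd(img, myo_mask, remote_mask, n=5.0):
    # n-SD rule: myocardial voxels brighter than mean + n*SD of a
    # remote (healthy) myocardium sample are labelled nonviable.
    mu, sd = img[remote_mask].mean(), img[remote_mask].std()
    return (img > mu + n * sd) & myo_mask

def scar_mask_fwhm(img, myo_mask):
    # FWHM rule: threshold at half of the maximum enhanced
    # intensity found inside the myocardium.
    return (img >= 0.5 * img[myo_mask].max()) & myo_mask
```

The two rules differ mainly in what they anchor the threshold to (remote-region statistics vs. the brightest enhanced voxel), which is one source of the inter-method scar-mass differences such comparisons quantify.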

  7. Juno First Slice of Jupiter

    NASA Image and Video Library

    2016-10-19

This composite image depicts Jupiter's cloud formations as seen through the eyes of Juno's Microwave Radiometer (MWR) instrument, as compared to the top layer, a Cassini Imaging Science Subsystem image of the planet. The MWR can see a couple of hundred miles into Jupiter's atmosphere with its largest antenna. The belts and bands visible on the surface are also visible in modified form in each layer below. http://photojournal.jpl.nasa.gov/catalog/PIA21107

  8. Very high-resolution spectroscopy for extremely large telescopes using pupil slicing and adaptive optics.

    PubMed

    Beckers, Jacques M; Andersen, Torben E; Owner-Petersen, Mette

    2007-03-05

Under seeing-limited conditions, very high resolution spectroscopy becomes very difficult for extremely large telescopes (ELTs). With adaptive optics (AO), the stellar image size decreases in proportion to the telescope diameter, which makes the spectrograph optics, and hence its resolution, independent of the telescope diameter. However, AO for use with ELTs at visible wavelengths requires deformable mirrors with many elements, which are not likely to be available for quite some time. We propose to use the pupil slicing technique to create a number of sub-pupils, each of which has its own deformable mirror. The images from all sub-pupils are combined incoherently into an image whose diameter corresponds to the diffraction limit of a sub-pupil. The technique is referred to as "Pupil Slicing Adaptive Optics" or PSAO.

  9. Slicing of silicon into sheet material. Silicon sheet growth development for the large area silicon sheet task of the low cost silicon solar array project

    NASA Technical Reports Server (NTRS)

    Holden, S. C.; Fleming, J. R.

    1978-01-01

Fabrication of a prototype large-capacity multiple-blade slurry saw is considered. The design of the bladehead, which tensions up to 1000 blades and cuts a 45 cm long silicon ingot as large as 12 cm in diameter, is given. The large blade-tensioning force of 270,000 kg is applied through two bolts acting on a pair of scissor toggles, significantly reducing operator set-up time. Tests with an upside-down cutting technique resulted in 100% wafering yields and the highest wafer accuracy yet experienced with MS slicing. Variations in oil and abrasives resulted only in degraded slicing results. A technique of continuous abrasive slurry separation to remove silicon debris is described.

  10. Bias field inconsistency correction of motion-scattered multislice MRI for improved 3D image reconstruction.

    PubMed

    Kim, Kio; Habas, Piotr A; Rajagopalan, Vidya; Scott, Julia A; Corbett-Detig, James M; Rousseau, Francois; Barkovich, A James; Glenn, Orit A; Studholme, Colin

    2011-09-01

A common solution to clinical MR imaging in the presence of large anatomical motion is to use fast multislice 2D studies to reduce slice acquisition time and provide clinically usable slice data. Recently, techniques have been developed which retrospectively correct large-scale 3D motion between individual slices, allowing the formation of a geometrically correct 3D volume from the multiple slice stacks. One challenge, however, in the final reconstruction process is the possibility of varying intensity bias in the slice data, typically due to the motion of the anatomy relative to the imaging coils. As a result, slices which cover the same region of anatomy at different times may exhibit different sensitivity. This bias field inconsistency can induce artifacts in the final 3D reconstruction that can impact both the clinical interpretation of key tissue boundaries and the automated analysis of the data. Here we describe a framework to estimate and correct the bias field inconsistency of each slice collectively across all motion-corrupted image slices. Experiments using synthetic and clinical data show that the proposed method reduces intensity variability in tissues and improves the distinction between key tissue types.
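This is not the paper's estimator (which solves for the inconsistency collectively across all slices and a smooth field), but a deliberately oversimplified per-slice gain correction conveys the basic idea; the function name and the one-multiplicative-factor-per-slice assumption are illustrative:

```python
import numpy as np

def equalize_slices(stack, ref, eps=1e-9):
    # Treat the bias of each 2D slice as a single multiplicative gain
    # and rescale it so its median intensity matches a reference
    # reconstruction of the same anatomy (a gross simplification of
    # estimating a smooth 2D bias field per slice).
    out = np.empty_like(stack, dtype=float)
    for k in range(stack.shape[0]):
        gain = np.median(ref[k]) / (np.median(stack[k]) + eps)
        out[k] = stack[k] * gain
    return out
```

In the motion-corrupted setting the "reference" is itself built from the slices, so the real method must iterate between reconstruction and bias estimation rather than assume a clean reference as this toy does.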

  11. Examining the Complex Regulation and Drug-Induced Plasticity of Dopamine Release and Uptake Using Voltammetry in Brain Slices

    PubMed Central

    2013-01-01

    Fast scan cyclic voltammetry in brain slices (slice voltammetry) has been used over the last several decades to increase substantially our understanding of the complex local regulation of dopamine release and uptake in the striatum. This technique is routinely used for the study of changes that occur in the dopamine system associated with various disease states and pharmacological treatments, and to study mechanisms of local circuitry regulation of dopamine terminal function. In the context of this Review, we compare the relative advantages of voltammetry using striatal slice preparations versus in vivo preparations, and highlight recent advances in our understanding of dopamine release and uptake in the striatum specifically from studies that use slice voltammetry in drug-naïve animals and animals with a history of psychostimulant self-administration. PMID:23581570

  12. Selective encryption for H.264/AVC video coding

    NASA Astrophysics Data System (ADS)

    Shi, Tuo; King, Brian; Salama, Paul

    2006-02-01

Due to the ease with which digital data can be manipulated, and due to the ongoing advancements that have brought us closer to pervasive computing, the secure delivery of video and images has become a challenging problem. Despite the advantages and opportunities that digital video provides, illegal copying and distribution as well as plagiarism of digital audio, images, and video are still ongoing. In this paper we describe two techniques for securing H.264 coded video streams. The first technique, SEH264Algorithm1, groups the data into the following blocks: (1) a block that contains the sequence parameter set and the picture parameter set; (2) a block containing a compressed intra coded frame; (3) a block containing the slice header of a P slice, all the headers of the macroblocks within the same P slice, and all the luma and chroma DC coefficients belonging to all the macroblocks within the same slice; (4) a block containing all the AC coefficients; and (5) a block containing all the motion vectors. The first three are encrypted whereas the last two are not. The second method, SEH264Algorithm2, relies on the use of multiple slices per coded frame. The algorithm searches the compressed video sequence for start codes (0x000001) and then encrypts the next N bits of data.
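The SEH264Algorithm2 scan can be sketched as follows. This toy version is an assumption-laden stand-in for the paper's method: it encrypts N bytes rather than N bits, uses XOR with a keystream in place of a real cipher, and ignores H.264 emulation-prevention bytes:

```python
def selective_encrypt(stream: bytes, key: bytes, n: int = 8) -> bytes:
    """Find each start code (0x000001) and XOR the next n bytes."""
    out = bytearray(stream)
    i = 0
    while (i := stream.find(b"\x00\x00\x01", i)) != -1:
        start = i + 3
        for j in range(start, min(start + n, len(out))):
            out[j] ^= key[(j - start) % len(key)]
        i = start  # resume the search just past this start code
    return bytes(out)
```

Because only the bytes after each start code are touched, the bulk of the bitstream passes through untouched — which is the point of selective encryption: security on the headers, near-zero cost on the payload.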

  13. Comparison of cloud top heights derived from FY-2 meteorological satellites with heights derived from ground-based millimeter wavelength cloud radar

    NASA Astrophysics Data System (ADS)

    Wang, Zhe; Wang, Zhenhui; Cao, Xiaozhong; Tao, Fa

    2018-01-01

Clouds are currently observed by both ground-based and satellite remote sensing techniques. Each technique has its own strengths and weaknesses depending on the observation method, instrument performance and the methods used for retrieval. It is important to study synergistic cloud measurements to improve the reliability of the observations and to verify the different techniques. The FY-2 geostationary orbiting meteorological satellites continuously observe the sky over China. Their cloud top temperature product can be processed to retrieve the cloud top height (CTH). The ground-based millimeter wavelength cloud radar can acquire information about the vertical structure of clouds, such as the cloud base height (CBH), CTH and cloud thickness, and can continuously monitor changes in the vertical profiles of clouds. The CTHs were retrieved using both cloud top temperature data from the FY-2 satellites and the cloud radar reflectivity data for the same time period (June 2015 to May 2016), and the resulting datasets were compared in order to evaluate the accuracy of CTH retrievals using FY-2 satellites. The results show that the concordance rate of cloud detection between the two datasets was 78.1%. Higher consistency was obtained for thicker clouds with larger echo intensity and for more continuous clouds. The average difference in CTH between the two techniques was 1.46 km. The difference in CTH for low- and mid-level clouds was less than that for high-level clouds. The attenuation threshold of the cloud radar for rainfall was 0.2 mm/min; rainfall intensities below this threshold had no effect on the CTH. The satellite CTH can be used to compensate for the attenuation error in the cloud radar data.
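Turning a satellite cloud top temperature into a CTH requires a temperature profile to intersect it with. The sketch below substitutes a constant standard-atmosphere lapse rate for the sounding-based profile an operational retrieval would use, so the function and its default numbers are purely illustrative:

```python
def cth_from_ctt(ctt_k, t_surface_k=288.15, lapse_k_per_km=6.5):
    """Cloud top height (km) from cloud top temperature (K), assuming
    temperature falls linearly with height from the surface value."""
    return max((t_surface_k - ctt_k) / lapse_k_per_km, 0.0)
```

The constant-lapse-rate assumption breaks down near inversions and the tropopause, which is one reason satellite CTHs for high clouds disagree more with radar CTHs than those for low- and mid-level clouds, as the comparison above reports.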

  14. Implicit-Explicit Formulations of a Three-Dimensional Nonhydrostatic Unified Model of the Atmosphere (NUMA)

    DTIC Science & Technology

    2013-01-01

[Figure] Inertia-gravity wave: a slice of the potential temperature perturbation (at y = 50 km) after 700 s for 30 × 30 × 5 elements with 4th-order polynomials. Key words: cloud-resolving model; compressible flow; element-based Galerkin methods; Euler; global model; IMEX; Lagrange; Legendre. The methods are compared in terms of accuracy and efficiency for two types of geophysical fluid dynamics problems: buoyant convection and inertia-gravity waves.

  15. Demonstration of application-driven network slicing and orchestration in optical/packet domains: on-demand vDC expansion for Hadoop MapReduce optimization.

    PubMed

    Kong, Bingxin; Liu, Siqi; Yin, Jie; Li, Shengru; Zhu, Zuqing

    2018-05-28

    Nowadays, it is common for service providers (SPs) to leverage hybrid clouds to improve the quality-of-service (QoS) of their Big Data applications. However, for achieving guaranteed latency and/or bandwidth in its hybrid cloud, an SP might desire to have a virtual datacenter (vDC) network, in which it can manage and manipulate the network connections freely. To address this requirement, we design and implement a network slicing and orchestration (NSO) system that can create and expand vDCs across optical/packet domains on-demand. Considering Hadoop MapReduce (M/R) as the use-case, we describe the proposed architectures of the system's data, control and management planes, and present the operation procedures for creating, expanding, monitoring and managing a vDC for M/R optimization. The proposed NSO system is then realized in a small-scale network testbed that includes four optical/packet domains, and we conduct experiments in it to demonstrate the whole operations of the data, control and management planes. Our experimental results verify that application-driven on-demand vDC expansion across optical/packet domains can be achieved for M/R optimization, and after being provisioned with a vDC, the SP using the NSO system can fully control the vDC network and further optimize the M/R jobs in it with network orchestration.

  16. A Unified Approach to Diffusion Direction Sensitive Slice Registration and 3-D DTI Reconstruction From Moving Fetal Brain Anatomy

    PubMed Central

    Fogtmann, Mads; Seshamani, Sharmishtaa; Kroenke, Christopher; Cheng, Xi; Chapman, Teresa; Wilm, Jakob; Rousseau, François

    2014-01-01

This paper presents an approach to 3-D diffusion tensor image (DTI) reconstruction from multi-slice diffusion weighted (DW) magnetic resonance imaging acquisitions of the moving fetal brain. Motion scatters the slice measurements in the spatial and spherical diffusion domain with respect to the underlying anatomy. Previous image registration techniques have been described to estimate the between-slice fetal head motion, allowing the reconstruction of a 3-D diffusion estimate on a regular grid using interpolation. We propose an Approach to Unified Diffusion Sensitive Slice Alignment and Reconstruction (AUDiSSAR) that explicitly formulates a process for diffusion direction sensitive DW-slice-to-DTI-volume alignment. This also incorporates image resolution modeling to iteratively deconvolve the effects of the imaging point spread function using the multiple views provided by thick slices acquired in different anatomical planes. The algorithm is implemented using a multi-resolution iterative scheme, and multiple real and synthetic datasets are used to evaluate the performance of the technique. An accuracy experiment using synthetically created motion data of an adult head and an experiment using synthetic motion added to a sedated fetal monkey dataset show a significant improvement in motion-trajectory estimation compared to state-of-the-art approaches. The performance of the method is then evaluated on challenging but clinically typical in utero fetal scans of four different human cases, showing improved rendition of cortical anatomy and extraction of white matter tracts. While the experimental work focuses on DTI reconstruction (second-order tensor model), the proposed reconstruction framework can employ any 5-D diffusion volume model that can be represented by the spatial parameterizations of an orientation distribution function. PMID:24108711

  18. Inter-slice bidirectional registration-based segmentation of the prostate gland in MR and CT image sequences

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khalvati, Farzad, E-mail: farzad.khalvati@uwaterloo.ca; Tizhoosh, Hamid R.; Salmanpour, Aryan

    2013-12-15

Purpose: Accurate segmentation and volume estimation of the prostate gland in magnetic resonance (MR) and computed tomography (CT) images are necessary steps in the diagnosis, treatment, and monitoring of prostate cancer. This paper presents an algorithm for prostate gland volume estimation based on the semiautomated segmentation of individual slices in T2-weighted MR and CT image sequences. Methods: The proposed Inter-Slice Bidirectional Registration-based Segmentation (iBRS) algorithm relies on interslice image registration of volume data to segment the prostate gland without the use of an anatomical atlas. It requires the user to mark only three slices in a given volume dataset, i.e., the first, middle, and last slices. Next, the proposed algorithm uses a registration algorithm to autosegment the remaining slices. We conducted comprehensive experiments to measure the performance of the proposed algorithm using three registration methods (i.e., rigid, affine, and nonrigid techniques). Results: The results with the proposed technique were compared with manual marking using prostate MR and CT images from 117 patients. Manual marking was performed by an expert user for all 117 patients. The median accuracies for individual slices measured using the Dice similarity coefficient (DSC) were 92% and 91% for MR and CT images, respectively. The iBRS algorithm was also evaluated regarding user variability, which confirmed that the algorithm was robust to interuser variability when marking the prostate gland. Conclusions: The proposed algorithm exploits the interslice data redundancy of the images in a volume dataset of MR and CT images and eliminates the need for an atlas, minimizing the computational cost while producing highly accurate results which are robust to interuser variability.
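As a toy illustration of the interslice-propagation idea (not the iBRS implementation, which also evaluates affine and nonrigid registration), the sketch below estimates a rigid translation between neighbouring slices by FFT phase correlation and carries a contour mask forward to the next slice:

```python
import numpy as np

def estimate_shift(a, b):
    # Phase correlation: the peak of the normalized cross-power
    # spectrum gives the (circular) shift taking slice a into slice b.
    x = np.conj(np.fft.fft2(a)) * np.fft.fft2(b)
    corr = np.fft.ifft2(x / (np.abs(x) + 1e-12)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    if dy > a.shape[0] // 2:
        dy -= a.shape[0]          # map indices to signed shifts
    if dx > a.shape[1] // 2:
        dx -= a.shape[1]
    return dy, dx

def propagate_mask(mask_prev, img_prev, img_next):
    # Apply the estimated inter-slice motion to the previous slice's
    # segmentation to initialize the next slice's contour.
    dy, dx = estimate_shift(img_prev, img_next)
    return np.roll(mask_prev, (dy, dx), axis=(0, 1))
```

Running such a propagation forward from the first marked slice and backward from the last (meeting at the middle marked slice) mirrors the "bidirectional" aspect of the described algorithm.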

  19. Hexagonal undersampling for faster MRI near metallic implants.

    PubMed

    Sveinsson, Bragi; Worters, Pauline W; Gold, Garry E; Hargreaves, Brian A

    2015-02-01

Slice encoding for metal artifact correction acquires a three-dimensional image of each excited slice, with view-angle tilting, to reduce slice- and readout-direction artifacts respectively, but requires additional imaging time. The purpose of this study was to provide a technique for faster imaging around metallic implants by undersampling k-space. Assuming that areas of slice distortion are localized, hexagonal sampling can reduce imaging time by 50% compared with conventional scans. This work demonstrates the technique by comparing fully sampled images with undersampled images, either simulated from fully acquired data or actually undersampled during acquisition, in patients and phantoms. Hexagonal sampling is also shown to be compatible with parallel imaging and partial Fourier acquisitions. Image quality was evaluated using a structural similarity (SSIM) index. Images acquired with hexagonal undersampling had no visible difference in artifact suppression from fully sampled images. The SSIM index indicated high similarity to fully sampled images in all cases. The study demonstrates the ability to reduce scan time by undersampling without compromising image quality. © 2014 Wiley Periodicals, Inc.
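The 50% reduction comes from sampling the two phase-encode directions on a hexagonal lattice, which in index space is a checkerboard. A minimal mask generator, with made-up grid dimensions and function name, could look like:

```python
import numpy as np

def hex_mask(ny, nz):
    # Keep the phase-encode points where ky + kz is even: a
    # checkerboard in the ky-kz plane, i.e. a hexagonal lattice
    # retaining exactly half of the Cartesian samples.
    ky, kz = np.meshgrid(np.arange(ny), np.arange(nz), indexing="ij")
    return (ky + kz) % 2 == 0
```

Undersampling on this lattice shifts the aliased replicas diagonally in image space, so as long as the slice-distortion support is compact the replicas do not overlap the signal, which is the localization assumption the abstract states.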

  20. Extraction of convective cloud parameters from Doppler Weather Radar MAX(Z) product using Image Processing Technique

    NASA Astrophysics Data System (ADS)

    Arunachalam, M. S.; Puli, Anil; Anuradha, B.

    2016-07-01

In the present work, continuous extraction of convective cloud optical information and reflectivity (MAX(Z), in dBZ), using an online retrieval technique for time-series data production from the Doppler Weather Radar (DWR) located at the Indian Meteorological Department, Chennai, has been developed in MATLAB. Reflectivity measurements for different locations within the DWR range, a circular disc area of 250 km radius, can be retrieved using this technique. It gives both the time-series reflectivity of a point location and Range Time Intensity (RTI) maps of reflectivity for the corresponding location. The Graphical User Interface (GUI) developed for the cloud reflectivity is user friendly; it also provides convective cloud optical information such as cloud base height (CBH), cloud top height (CTH) and cloud optical depth (COD). This technique is also applicable to retrieving other DWR products such as Plan Position Indicator (Z, in dBZ), Plan Position Indicator (Z, in dBZ)-Close Range, Volume Velocity Processing (V, in knots), Plan Position Indicator (V, in m/s), Surface Rainfall Intensity (SRI, mm/hr) and Precipitation Accumulation (PAC, 24 hrs at 0300 UTC). Keywords: reflectivity, cloud top height, cloud base, cloud optical depth

  1. Airway mechanics and methods used to visualize smooth muscle dynamics in vitro.

    PubMed

    Cooper, P R; McParland, B E; Mitchell, H W; Noble, P B; Politi, A Z; Ressmeyer, A R; West, A R

    2009-10-01

    Contraction of airway smooth muscle (ASM) is regulated by the physiological, structural and mechanical environment in the lung. We review two in vitro techniques, lung slices and airway segment preparations, that enable in situ ASM contraction and airway narrowing to be visualized. Lung slices and airway segment approaches bridge a gap between cell culture and isolated ASM, and whole animal studies. Imaging techniques enable key upstream events involved in airway narrowing, such as ASM cell signalling and structural and mechanical events impinging on ASM, to be investigated.

  2. Separation of parallel encoded complex-valued slices (SPECS) from a single complex-valued aliased coil image.

    PubMed

    Rowe, Daniel B; Bruce, Iain P; Nencka, Andrew S; Hyde, James S; Kociuba, Mary C

    2016-04-01

Achieving a reduction in scan time with minimal inter-slice signal leakage is one of the significant obstacles in parallel MR imaging. In fMRI, multiband imaging techniques accelerate data acquisition by simultaneously magnetizing the spatial frequency spectra of multiple slices. The SPECS model eliminates the consequent inter-slice signal leakage from the slice unaliasing, while maintaining an optimal reduction in scan time and activation statistics in fMRI studies. When the combined k-space array is inverse Fourier reconstructed, the resulting aliased image is separated into the un-aliased slices through a least squares estimator. Without the additional spatial information from a phased array of receiver coils, slice separation in SPECS is accomplished with aliased images acquired in a shifted-FOV aliasing pattern and a bootstrapping approach that incorporates reference calibration images in an orthogonal Hadamard pattern. The aliased slices are effectively separated with minimal expense to spatial and temporal resolution. Functional activation is observed in the motor cortex, as the number of aliased slices is increased, in a bilateral finger-tapping fMRI experiment. The SPECS model incorporates calibration reference images together with coefficients of orthogonal polynomials into an un-aliasing estimator to achieve separated images, with virtually no residual artifacts, and functional activation detection in the separated images. Copyright © 2015 Elsevier Inc. All rights reserved.
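The least-squares separation step reduces, per aliased voxel, to a small linear system. The two-slice example below, with an assumed Hadamard-encoded reference acquisition as the second row of `A`, is a toy version of that estimator, not the SPECS implementation:

```python
import numpy as np

# Each aliased measurement is modeled as y = A x: x holds the true
# voxel values of the simultaneously excited slices, and the rows of
# A encode how each acquisition combines them (a plain aliased image
# plus a Hadamard-encoded calibration image, in this toy case).
A = np.array([[1.0, 1.0],
              [1.0, -1.0]])
x_true = np.array([3.0, 5.0])        # voxel value in slice 1 and slice 2
y = A @ x_true                       # observed aliased measurements
x_hat, *_ = np.linalg.lstsq(A, y, rcond=None)
```

With an orthogonal encoding matrix the least-squares solution is exact and leaves no residual mixing between slices, which is the mechanism behind the "minimal inter-slice signal leakage" claim.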

  3. Impacts of simultaneous multislice acquisition on sensitivity and specificity in fMRI.

    PubMed

    Risk, Benjamin B; Kociuba, Mary C; Rowe, Daniel B

    2018-05-15

    Simultaneous multislice (SMS) imaging can be used to decrease the time between acquisition of fMRI volumes, which can increase sensitivity by facilitating the removal of higher-frequency artifacts and boosting effective sample size. The technique requires an additional processing step in which the slices are separated, or unaliased, to recover the whole brain volume. However, this may result in signal "leakage" between aliased locations, i.e., slice "leakage," and lead to spurious activation (decreased specificity). SMS can also lead to noise amplification, which can reduce the benefits of decreased repetition time. In this study, we evaluate the original slice-GRAPPA (no leak block) reconstruction algorithm and acceleration factor (AF = 8) used in the fMRI data in the young adult Human Connectome Project (HCP). We also evaluate split slice-GRAPPA (leak block), which can reduce slice leakage. We use simulations to disentangle higher test statistics into true positives (sensitivity) and false positives (decreased specificity). Slice leakage was greatly decreased by split slice-GRAPPA. Noise amplification was decreased by using moderate acceleration factors (AF = 4). We examined slice leakage in unprocessed fMRI motor task data from the HCP. When data were smoothed, we found evidence of slice leakage in some, but not all, subjects. We also found evidence of SMS noise amplification in unprocessed task and processed resting-state HCP data. Copyright © 2018 Elsevier Inc. All rights reserved.

  4. Assessing variability in Orbiting Carbon Observatory-2 (OCO-2) XCO2 using high spatial resolution color slices and other retrieval parameters

    NASA Astrophysics Data System (ADS)

    Merrelli, A. J.; Taylor, T.; O'Dell, C.; Cronk, H. Q.; Eldering, A.; Crisp, D.

    2017-12-01

The Orbiting Carbon Observatory-2 (OCO-2) measures reflected sunlight in the Oxygen A-band (0.76 μm), Weak CO2 band (1.61 μm) and Strong CO2 band (2.06 μm) with resolving powers 18,000, 19,500 and 19,500, respectively. Soundings are collected at 3 Hz, yielding 8 contiguous <1.3 km × 2.3 km footprints across a narrow (<0.8°) swath. After cloud screening, these high-resolution spectra are used in an optimal estimation retrieval to produce estimates of the column averaged carbon dioxide dry air mole fraction (XCO2). In the absence of strong CO2 absorbers, e.g., intense agricultural regions, or strong emitters, e.g., mega-cities, the variability of XCO2 over small scales, e.g., tens of kilometers, is expected to be less than 1 ppm. However, deviations on the order of +/- 2 ppm, or more, are often observed in the production Version 7 (B7) data product. We hypothesize that most of this variability is spurious, with contributions from both retrieval errors and undetected cloud and aerosol contamination. The contiguous nature of the OCO-2 spatial sampling allows for analysis of the variability in XCO2 and correlation with variables, such as the full spatial resolution "color slices" and other retrieved parameters. Color slices avoid the on-board averaging across the detector focal plane array, providing increased spatial information compared to the nominal spectra. This work explores the new B8 production data set using MODIS visible imagery from the CSU Vistool to provide visual context to the OCO-2 parameters. The large volume of data that has been collected since September 2014 allows for statistical analysis of parameters in relation to XCO2 variability. Some detailed case studies are presented.
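
The small-scale variability screening described above amounts to computing XCO2 scatter over short along-track windows and flagging excursions beyond the ~1 ppm expectation. The sketch below uses synthetic soundings, and the window length and flag threshold are illustrative choices, not OCO-2 algorithm parameters.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic along-track XCO2 (ppm): smooth background plus a spurious
# ~2 ppm excursion mimicking undetected cloud/aerosol contamination.
n = 300
xco2 = (400.0 + 0.3 * np.sin(np.linspace(0.0, 2.0 * np.pi, n))
        + rng.normal(0.0, 0.1, n))
xco2[140:160] += 2.0

# Rolling standard deviation over a short along-track window; windows
# whose scatter greatly exceeds the expected <1 ppm level are flagged.
window = 20
stds = np.array([xco2[i:i + window].std() for i in range(n - window + 1)])
flagged = np.flatnonzero(stds > 0.8)
```

Only windows straddling the edges of the excursion show inflated scatter, so the flags localize the suspect soundings.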

  5. Sliceable transponders for metro-access transmission links

    NASA Astrophysics Data System (ADS)

    Wagner, C.; Madsen, P.; Spolitis, S.; Vegas Olmos, J. J.; Tafur Monroy, I.

    2015-01-01

    This paper presents a solution for upgrading optical access networks by reusing existing electronics or optical equipment: sliceable transponders using signal spectrum slicing and stitching back method after direct detection. This technique allows transmission of wide bandwidth signals from the service provider (OLT - optical line terminal) to the end user (ONU - optical network unit) over an optical distribution network (ODN) via low bandwidth equipment. We show simulation and experimental results for duobinary signaling of 1 Gbit/s and 10 Gbit/s waveforms. The number of slices is adjusted to match the lowest analog bandwidth of used electrical devices and scale from 2 slices to 10 slices. Results of experimental transmission show error free signal recovery by using post forward error correction with 7% overhead.
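
The slicing-and-stitching idea can be sketched in the frequency domain: the wideband spectrum is split into contiguous low-bandwidth slices, each narrow enough for low-bandwidth hardware, and the slices are reassembled before returning to the time domain. The sketch below only checks that an ideal split-and-stitch round trip is lossless; it omits the direct detection and channel impairments treated in the paper, and uses random data rather than a duobinary waveform.

```python
import numpy as np

rng = np.random.default_rng(2)

# Wideband baseband signal (illustrative random data).
n = 1024
signal = rng.normal(size=n)

# Slice the spectrum into contiguous low-bandwidth bands, each of which
# could be handled by equipment of lower analog bandwidth.
n_slices = 8
spectrum = np.fft.fft(signal)
bands = np.array_split(spectrum, n_slices)

# Stitch the slices back together and return to the time domain.
stitched = np.concatenate(bands)
recovered = np.fft.ifft(stitched).real

error = np.max(np.abs(recovered - signal))
```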

  6. The Radiative Consistency of Atmospheric Infrared Sounder and Moderate Resolution Imaging Spectroradiometer Cloud Retrievals

    NASA Technical Reports Server (NTRS)

    Kahn, Brian H.; Fishbein, Evan; Nasiri, Shaima L.; Eldering, Annmarie; Fetzer, Eric J.; Garay, Michael J.; Lee, Sung-Yung

    2007-01-01

The consistency of cloud top temperature (Tc) and effective cloud fraction (f) retrieved by the Atmospheric Infrared Sounder (AIRS)/Advanced Microwave Sounding Unit (AMSU) observation suite and the Moderate Resolution Imaging Spectroradiometer (MODIS) on the EOS-Aqua platform is investigated. Collocated AIRS and MODIS Tc and f are compared via an 'effective scene brightness temperature' (Tb,e). Tb,e is calculated with partial field of view (FOV) contributions from Tc and surface temperature (Ts), weighted by f and 1-f, respectively. AIRS reports up to two cloud layers while MODIS reports up to one. However, MODIS reports Tc, Ts, and f at a higher spatial resolution than AIRS. As a result, pixel-scale comparisons of Tc and f are difficult to interpret, demonstrating the need for alternatives such as Tb,e. AIRS-MODIS Tb,e differences ((Delta)Tb,e) for identical observing scenes are useful as a diagnostic for cloud quantity comparisons. The smallest values of (Delta)Tb,e are for high and opaque clouds, with increasing scatter in (Delta)Tb,e for clouds of smaller opacity and lower altitude. A persistent positive bias in (Delta)Tb,e is observed in warmer and low-latitude scenes, characterized by a mixture of MODIS CO2 slicing and 11-μm window retrievals. These scenes contain heterogeneous cloud cover, including mixtures of multilayered cloudiness and misplaced MODIS cloud top pressure. The spatial patterns of (Delta)Tb,e are systematic and do not correlate well with collocated AIRS-MODIS radiance differences, which are more random in nature and smaller in magnitude than (Delta)Tb,e. This suggests that the observed inconsistencies in AIRS and MODIS cloud fields are dominated by retrieval algorithm differences, instead of differences in the observed radiances. The results presented here have implications for the validation of cloudy satellite retrieval algorithms, and use of cloud products in quantitative analyses.
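
The Tb,e diagnostic can be written out directly: radiances, not temperatures, mix linearly with cloud fraction, so the effective radiance is f*B(Tc) + (1 - f)*B(Ts) and Tb,e is the inverse Planck function of that mixture. The 11 μm window wavelength and the scene temperatures below are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Physical constants (SI).
H = 6.626e-34   # Planck constant, J s
C = 2.998e8     # speed of light, m/s
K = 1.381e-23   # Boltzmann constant, J/K

WAVELENGTH = 11e-6  # 11 um infrared window channel, in meters


def planck(t):
    """Spectral radiance B(T) at WAVELENGTH (W m^-2 sr^-1 m^-1)."""
    return (2 * H * C**2 / WAVELENGTH**5
            / np.expm1(H * C / (WAVELENGTH * K * t)))


def inverse_planck(b):
    """Brightness temperature corresponding to radiance b at WAVELENGTH."""
    return H * C / (WAVELENGTH * K
                    * np.log1p(2 * H * C**2 / (WAVELENGTH**5 * b)))


# Illustrative scene: cloud top at 230 K covering fraction f of a 290 K surface.
tc, ts, f = 230.0, 290.0, 0.4

# Radiance mixes linearly with cloud fraction; temperature does not.
radiance_e = f * planck(tc) + (1 - f) * planck(ts)
tb_e = inverse_planck(radiance_e)
```

Because the Planck function is nonlinear in temperature, Tb,e differs from the naive linear mix f*Tc + (1 - f)*Ts, which is exactly why the radiance-weighted form is the right comparison quantity.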

  7. Application research of 3D additive manufacturing technology in the nail shell

    NASA Astrophysics Data System (ADS)

    Xiao, Shanhua; Yan, Ruiqiang; Song, Ning

    2018-04-01

Based on an analysis of hierarchical slicing algorithms, 3D scanning of an enterprise's nail-handle shell product is carried out, point cloud data processing is performed on the source file, and the surface modeling and innovative design of the nail-handle shell are completed. Layered samples are then 3D printed on a MakerBot Replicator 2X printer, providing reverse-modeling and rapid-prototyping support for the development of new nail products.
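
A hierarchical slicing algorithm for 3D printing ultimately reduces to intersecting mesh triangles with horizontal planes to obtain contour segments. A minimal sketch of that core step, for a single triangle and one plane (degenerate vertex-on-plane cases omitted for brevity), is:

```python
import numpy as np


def slice_triangle(tri, z0):
    """Return the segment where triangle `tri` (3x3 array of xyz
    vertices) crosses the plane z = z0, or None if it misses."""
    points = []
    for i in range(3):
        a, b = tri[i], tri[(i + 1) % 3]
        za, zb = a[2], b[2]
        if (za - z0) * (zb - z0) < 0:        # edge strictly crosses plane
            t = (z0 - za) / (zb - za)        # linear interpolation factor
            points.append(a + t * (b - a))
    return np.array(points) if len(points) == 2 else None


# Illustrative triangle spanning z in [0, 2], sliced at z = 1.
tri = np.array([[0.0, 0.0, 0.0],
                [2.0, 0.0, 2.0],
                [0.0, 2.0, 2.0]])
segment = slice_triangle(tri, 1.0)
```

A full slicer repeats this over every triangle and every layer height, then chains the resulting segments into closed contours for toolpath generation.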

  8. A search for embedded young stellar objects in and near the IC 1396 complex

    NASA Technical Reports Server (NTRS)

    Schwartz, Richard D.; Wilking, Bruce A.; Giulbudagian, Armen L.

    1991-01-01

The IRAS data base is used to locate young stellar object candidates in and near the IC 1396 complex located in the Cepheus OB2 association. Co-added survey data are used to identify all sources with a 100 μm flux density S(100) greater than 10 Jy and with S(100) greater than the 60 μm flux density S(60). The 15 sources located at the positions of globules and dark clouds are further analyzed using the in-scan slices to assess the source profiles.
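
The flux-density selection criterion is straightforward to express in code; the catalog entries below are hypothetical, for illustration only.

```python
# Hypothetical catalog entries: (name, S60, S100) flux densities in Jy.
catalog = [
    ("src-a", 4.0, 15.0),   # passes both criteria
    ("src-b", 20.0, 12.0),  # S(100) > 10 Jy, but S(100) < S(60)
    ("src-c", 1.0, 8.0),    # S(100) too faint
]

# Survey selection: S(100) greater than 10 Jy and S(100) greater than S(60).
candidates = [name for name, s60, s100 in catalog
              if s100 > 10.0 and s100 > s60]
```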

  9. Neural network classification technique and machine vision for bread crumb grain evaluation

    NASA Astrophysics Data System (ADS)

    Zayas, Inna Y.; Chung, O. K.; Caley, M.

    1995-10-01

Bread crumb grain was studied to develop a model for pattern recognition of bread baked at the Hard Winter Wheat Quality Laboratory (HWWQL), Grain Marketing and Production Research Center (GMPRC). Images of bread slices were acquired with a scanner in a 512 × 512 format. Subimages in the central part of the slices were evaluated by several features such as mean, determinant, eigenvalues, shape of a slice, and other crumb features. Derived features were used to describe slices and loaves. Neural network programs of the MATLAB package were used for data analysis. The learning vector quantization method and multivariate discriminant analysis were applied to bread slices from wheat of different sources. Training and test sets of different bread crumb texture classes were obtained. The ranking of subimages was well correlated with visual judgement. The performance of different models on slice recognition rate was studied to choose the best model. The recognition of classes created according to human judgement with image features was low. Recognition of arbitrarily created classes, according to porosity patterns, with several feature patterns was approximately 90%. The correlation coefficient was approximately 0.7 between slice shape features and loaf volume.

  10. Tectonic slicing of subducting oceanic crust along plate interfaces: Numerical modeling

    NASA Astrophysics Data System (ADS)

    Ruh, J. B.; Le Pourhiet, L.; Agard, Ph.; Burov, E.; Gerya, T.

    2015-10-01

    Multikilometer-sized slivers of high-pressure low-temperature metamorphic oceanic crust and mantle are observed in many mountain belts. These blueschist and eclogite units were detached from the descending plate during subduction. Large-scale thermo-mechanical numerical models based on finite difference marker-in-cell staggered grid technique are implemented to investigate slicing processes that lead to the detachment of oceanic slivers and their exhumation before the onset of the continental collision phase. In particular, we investigate the role of the serpentinized subcrustal slab mantle in the mechanisms of shallow and deep crustal slicing. Results show that spatially homogeneous serpentinization of the sub-Moho slab mantle leads to complete accretion of oceanic crust within the accretionary wedge. Spatially discontinuous serpentinization of the slab mantle in form of unconnected patches can lead to shallow slicing of the oceanic crust below the accretionary wedge and to its deep slicing at mantle depths depending on the patch length, slab angle, convergence velocity and continental geothermal gradient. P-T paths obtained in this study are compared to natural examples of shallow slicing of the Crescent Terrane below Vancouver Island and deeply sliced crust of the Lago Superiore and Saas-Zermatt units in the Western Alps.

  11. Locating knots by industrial tomography- A feasibility study

    Treesearch

    Fred W. Taylor; Francis G. Wagner; Charles W. McMillin; Ira L. Morgan; Forrest F. Hopkins

    1984-01-01

    Industrial photon tomography was used to scan four southern pine logs and one red oak log. The logs were scanned at 16 cross-sectional slice planes located 1 centimeter apart along their longitudinal axes. Tomographic reconstructions were made from the scan data collected at these slice planes, and a cursory image analysis technique was developed to locate the log...

  12. Techniques for Producing Coastal Land Water Masks from Landsat and Other Multispectral Satellite Data

    NASA Technical Reports Server (NTRS)

    Spruce, Joseph P.; Hall, Callie

    2005-01-01

    Coastal erosion and land loss continue to threaten many areas in the United States. Landsat data has been used to monitor regional coastal change since the 1970s. Many techniques can be used to produce coastal land water masks, including image classification and density slicing of individual bands or of band ratios. Band ratios used in land water detection include several variations of the Normalized Difference Water Index (NDWI). This poster discusses a study that compares land water masks computed from unsupervised Landsat image classification with masks from density-sliced band ratios and from the Landsat TM band 5. The greater New Orleans area is employed in this study, due to its abundance of coastal habitats and its vulnerability to coastal land loss. Image classification produced the best results based on visual comparison to higher resolution satellite and aerial image displays. However, density sliced NDWI imagery from either near infrared (NIR) and blue bands or from NIR and green bands also produced more effective land water masks than imagery from the density-sliced Landsat TM band 5. NDWI based on NIR and green bands is noteworthy because it allows land water masks to be generated from multispectral satellite sensors without a blue band (e.g., ASTER and Landsat MSS). NDWI techniques also have potential for producing land water masks from coarser scaled satellite data, such as MODIS.
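
The NDWI density-slicing approach can be sketched directly: the NDWI from green and NIR bands (the McFeeters form) is (Green - NIR)/(Green + NIR), and slicing at 0 is a common starting threshold for a land/water mask. The reflectance values and threshold below are illustrative, not tuned to the New Orleans study area.

```python
import numpy as np

# Tiny illustrative scene: surface reflectance in the green and NIR bands.
# Water reflects more green than NIR; vegetation and soil the opposite.
green = np.array([[0.10, 0.09, 0.30],
                  [0.11, 0.28, 0.32]])
nir = np.array([[0.02, 0.03, 0.45],
                [0.02, 0.40, 0.50]])

# NDWI (McFeeters): positive over water, negative over most land cover.
ndwi = (green - nir) / (green + nir)

# Density slicing: threshold the index to produce the land/water mask.
water_mask = ndwi > 0.0
```

In practice the threshold is usually adjusted per scene by inspecting the NDWI histogram against higher-resolution reference imagery, as the study does visually.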

  14. Dual-energy contrast enhanced digital breast tomosynthesis: concept, method, and evaluation on phantoms

    NASA Astrophysics Data System (ADS)

    Puong, Sylvie; Patoureaux, Fanny; Iordache, Razvan; Bouchevreau, Xavier; Muller, Serge

    2007-03-01

In this paper, we present the development of dual-energy Contrast-Enhanced Digital Breast Tomosynthesis (CEDBT). A method to produce background clutter-free slices from a set of low and high-energy projections is introduced, along with a scheme for the determination of the optimal low and high-energy techniques. Our approach consists of a dual-energy recombination of the projections, with an algorithm that has proven its performance in Contrast-Enhanced Digital Mammography (CEDM), followed by an iterative volume reconstruction. The aim is to eliminate the anatomical background clutter and to reconstruct slices where the gray level is proportional to the local iodine volumetric concentration. Optimization of the low and high-energy techniques is performed by minimizing the total glandular dose to reach a target iodine Signal Difference to Noise Ratio (SDNR) in the slices. In this study, we proved that this optimization could be done on the projections, by consideration of the SDNR in the projections instead of the SDNR in the slices, and verified this with phantom measurements. We also discuss some limitations of dual-energy CEDBT, due to the restricted angular range for the projection views, and to the presence of scattered radiation. Experiments on textured phantoms with iodine inserts were conducted to assess the performance of dual-energy CEDBT. Texture contrast was nearly completely removed and the iodine signal was enhanced in the slices.
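
The dual-energy recombination idea can be sketched with a Beer-Lambert model: a weighted log subtraction of the low- and high-energy projections, with the weight chosen to null the background material, leaves a signal proportional to the iodine thickness alone. The attenuation coefficients below are made-up illustrative values, and the sketch follows the generic dual-energy log-subtraction scheme rather than the authors' specific CEDM algorithm.

```python
import numpy as np

# Illustrative linear attenuation coefficients (1/cm); invented values
# for the sketch, not calibrated tissue or iodine data.
MU_BG_LOW, MU_BG_HIGH = 0.80, 0.50  # background tissue at low/high energy
MU_I_LOW, MU_I_HIGH = 5.0, 8.0      # iodine (K-edge boosts high-energy mu)


def intensities(t_bg, t_iodine, i0=1.0):
    """Transmitted low/high-energy intensities (Beer-Lambert law)."""
    low = i0 * np.exp(-MU_BG_LOW * t_bg - MU_I_LOW * t_iodine)
    high = i0 * np.exp(-MU_BG_HIGH * t_bg - MU_I_HIGH * t_iodine)
    return low, high


# Weight chosen so the background term cancels in the log recombination.
W = MU_BG_HIGH / MU_BG_LOW


def recombine(low, high):
    """Dual-energy log subtraction: background-free iodine signal."""
    return W * np.log(low) - np.log(high)


# Same iodine (0.1 cm) behind very different background thicknesses:
sig_thin = recombine(*intensities(t_bg=2.0, t_iodine=0.1))
sig_thick = recombine(*intensities(t_bg=6.0, t_iodine=0.1))
no_iodine = recombine(*intensities(t_bg=4.0, t_iodine=0.0))
```

The recombined signal is the same for both background thicknesses and zero without iodine, which is the clutter-cancellation property the paper exploits before the tomosynthesis reconstruction.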

  15. Brain Slice Staining and Preparation for Three-Dimensional Super-Resolution Microscopy

    PubMed Central

    German, Christopher L.; Gudheti, Manasa V.; Fleckenstein, Annette E.; Jorgensen, Erik M.

    2018-01-01

    Localization microscopy techniques – such as photoactivation localization microscopy (PALM), fluorescent PALM (FPALM), ground state depletion (GSD), and stochastic optical reconstruction microscopy (STORM) – provide the highest precision for single molecule localization currently available. However, localization microscopy has been largely limited to cell cultures due to the difficulties that arise in imaging thicker tissue sections. Sample fixation and antibody staining, background fluorescence, fluorophore photoinstability, light scattering in thick sections, and sample movement create significant challenges for imaging intact tissue. We have developed a sample preparation and image acquisition protocol to address these challenges in rat brain slices. The sample preparation combined multiple fixation steps, saponin permeabilization, and tissue clarification. Together, these preserve intracellular structures, promote antibody penetration, reduce background fluorescence and light scattering, and allow acquisition of images deep in a 30 μm thick slice. Image acquisition challenges were resolved by overlaying samples with a permeable agarose pad and custom-built stainless steel imaging adapter, and sealing the imaging chamber. This approach kept slices flat, immobile, bathed in imaging buffer, and prevented buffer oxidation during imaging. Using this protocol, we consistently obtained single molecule localizations of synaptic vesicle and active zone proteins in three-dimensions within individual synaptic terminals of the striatum in rat brain slices. These techniques may be easily adapted to the preparation and imaging of other tissues, substantially broadening the application of super-resolution imaging. PMID:28924666

  16. A new technique for the rapid screening and selection of large pieces of tissue for ultrastructural evaluation.

    PubMed

    Dalley, B K; Seliger, W G

    1980-05-01

A simple and rapid technique is described for the screening of Epon embedded organ slices for the location, isolation, and removal of small specific sites for ultrastructural study with the transmission electron microscope. This procedure consists of perfusion fixation followed by making 1 to 2.5 mm thick slices of relatively large pieces of the organs, control of the degree and evenness of the osmium staining by addition of 3% sodium iodate, and infiltration with a fluorescent dye prior to embedment in Epon. Tissue slices are embedded in wafer-shaped blocks, generally with several slices in one "wafer", and are examined in a controlled manner using a rapid form of serial surface polishing. Each level of the polished wafer is examined using an epi-illuminated fluorescence microscope, and selected sites are chosen at each level for ultrastructural study. Methods are also described for marking each selected site using a conventional slide marker, and for the removal of the selected site in the form of a small disc of Epon, after which the Epon wafer can be further serially polished and the examination continued. Areas to be thin-sectioned are removed using a core drill mounted on a model-maker's drill press. The technique is simple, does not require the destruction of remaining tissues to evaluate more critically a single small site, allows for the easy maintenance of tissue orientation, and the most time-consuming portions of the technique can be quickly taught to a person with no previous histological training.

  17. Directional Multi-scale Modeling of High-Resolution Computed Tomography (HRCT) Lung Images for Diffuse Lung Disease Classification

    NASA Astrophysics Data System (ADS)

    Vo, Kiet T.; Sowmya, Arcot

A directional multi-scale modeling scheme based on wavelet and contourlet transforms is employed to describe HRCT lung image textures for classifying four diffuse lung disease patterns: normal, emphysema, ground glass opacity (GGO) and honeycombing. Generalized Gaussian density parameters are used to represent the detail sub-band features obtained by wavelet and contourlet transforms. In addition, support vector machines (SVMs), with excellent performance in a variety of pattern classification problems, are used as the classifier. The method is tested on a collection of 89 slices from 38 patients, each slice of size 512 × 512, 16 bits/pixel in DICOM format. The dataset contains 70,000 ROIs of those slices marked by experienced radiologists. We employ this technique at different wavelet and contourlet transform scales for diffuse lung disease classification. The technique presented here achieves a best overall sensitivity of 93.40% and a specificity of 98.40%.
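
Fitting a generalized Gaussian density (GGD) to subband coefficients is commonly done by moment matching: the ratio E|x|/sqrt(E[x^2]) is a monotone function of the shape parameter, so the shape can be recovered by bisection. The sketch below demonstrates this on Gaussian samples (a GGD with shape 2); it is a simplified estimator, not necessarily the fitting procedure used by the authors, and in practice it would be applied to the wavelet or contourlet detail coefficients.

```python
import math
import random


def ggd_ratio(beta):
    """Theoretical E|x| / sqrt(E[x^2]) for a zero-mean GGD with shape beta."""
    return math.gamma(2 / beta) / math.sqrt(
        math.gamma(1 / beta) * math.gamma(3 / beta))


def estimate_shape(samples, lo=0.2, hi=5.0, iters=60):
    """Moment-matching GGD shape estimate via bisection on the
    monotone ratio E|x| / sqrt(E[x^2])."""
    n = len(samples)
    m1 = sum(abs(x) for x in samples) / n
    m2 = sum(x * x for x in samples) / n
    target = m1 / math.sqrt(m2)
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if ggd_ratio(mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)


# Gaussian data is a GGD with shape beta = 2; the estimate should land near 2.
random.seed(0)
data = [random.gauss(0.0, 1.0) for _ in range(20000)]
beta_hat = estimate_shape(data)
```

The estimated (shape, scale) pairs per subband then form the feature vector fed to the SVM classifier.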

  18. Secure Scientific Applications Scheduling Technique for Cloud Computing Environment Using Global League Championship Algorithm

    PubMed Central

    Abdulhamid, Shafi’i Muhammad; Abd Latiff, Muhammad Shafie; Abdul-Salaam, Gaddafi; Hussain Madni, Syed Hamid

    2016-01-01

A cloud computing system is a huge cluster of interconnected servers residing in a datacenter and dynamically provisioned to clients on-demand via a front-end interface. Scientific applications scheduling in the cloud computing environment is identified as an NP-hard problem due to the dynamic nature of heterogeneous resources. Recently, a number of metaheuristic optimization schemes have been applied to address the challenges of applications scheduling in the cloud system, without much emphasis on the issue of secure global scheduling. In this paper, a scientific applications scheduling technique using the Global League Championship Algorithm (GBLCA) optimization method is first presented for global task scheduling in the cloud environment. The experiment is carried out using the CloudSim simulator. The experimental results show that the proposed GBLCA technique produced a remarkable performance improvement rate on the makespan, ranging from 14.44% to 46.41%. It also shows a significant reduction in the time taken to securely schedule applications, as parametrically measured in terms of the response time. In view of the experimental results, the proposed technique provides a better-quality scheduling solution that is suitable for scientific applications task execution in the Cloud Computing environment than the MinMin, MaxMin, Genetic Algorithm (GA) and Ant Colony Optimization (ACO) scheduling techniques. PMID:27384239
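
For context on the baselines, the MinMin heuristic used in the comparison can be sketched for identical machines: repeatedly assign the task whose earliest possible completion time is smallest. This is a generic textbook rendering with illustrative task lengths, not the paper's implementation or workload.

```python
def minmin_schedule(task_lengths, n_machines):
    """Min-Min heuristic: repeatedly assign the unscheduled task whose
    earliest possible completion time is smallest.
    Returns (assignment: task -> machine, makespan)."""
    ready = [0.0] * n_machines             # machine ready times
    remaining = dict(enumerate(task_lengths))
    assignment = {}
    while remaining:
        # Best (completion time, task, machine) over all unscheduled tasks.
        completion, task, machine = min(
            (ready[m] + length, t, m)
            for t, length in remaining.items()
            for m in range(n_machines))
        ready[machine] = completion
        assignment[task] = machine
        del remaining[task]
    return assignment, max(ready)


# Illustrative task lengths (e.g., seconds of compute) on 3 machines.
assignment, makespan = minmin_schedule([4, 2, 7, 1, 5, 3], 3)
```

Note that Min-Min is greedy: on this instance it reaches a makespan of 10, while the optimal partition achieves 8, which is why metaheuristics such as GBLCA can improve on it.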

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnston, H; UT Southwestern Medical Center, Dallas, TX; Hilts, M

Purpose: To commission a multislice computed tomography (CT) scanner for fast and reliable readout of radiation therapy (RT) dose distributions using CT polymer gel dosimetry (PGD). Methods: Commissioning was performed for a 16-slice CT scanner using images acquired through a 1L cylinder filled with water. Additional images were collected using a single slice machine for comparison purposes. The variability in CT number associated with the anode heel effect was evaluated and used to define a new slice-by-slice background image subtraction technique. Image quality was assessed for the multislice system by comparing image noise and uniformity to that of the single slice machine. The consistency in CT number across slices acquired simultaneously using the multislice detector array was also evaluated. Finally, the variability in CT number due to increasing x-ray tube load was measured for the multislice scanner and compared to the tube load effects observed on the single slice machine. Results: Slice-by-slice background subtraction effectively removes the variability in CT number across images acquired simultaneously using the multislice scanner and is the recommended background subtraction method when using a multislice CT system. Image quality for the multislice machine was found to be comparable to that of the single slice scanner. Further study showed CT number was consistent across image slices acquired simultaneously using the multislice detector array for each detector configuration and slice thickness examined. In addition, the multislice system was found to eliminate variations in CT number due to increasing x-ray tube load and to reduce scanning time by a factor of 4 when compared to imaging a large volume using a single slice scanner. Conclusion: A multislice CT scanner has been commissioned for CT PGD, allowing images of an entire dose distribution to be acquired in a matter of minutes.
Funding support provided by the Natural Sciences and Engineering Research Council of Canada (NSERC)
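
The slice-by-slice background subtraction can be sketched with synthetic data: when each post-irradiation slice is corrected by its own pre-irradiation background slice, a heel-effect trend across slices cancels, whereas a single global background level leaves it behind. All array sizes, offsets, and noise levels below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic multislice CT data, shape (slice, y, x). The anode heel
# effect is mimicked as a background CT-number offset varying by slice.
n_slices, ny, nx = 16, 32, 32
heel = np.linspace(-5.0, 5.0, n_slices)[:, None, None]

pre_scan = 1000.0 + heel + rng.normal(0.0, 0.5, (n_slices, ny, nx))

dose = np.zeros((n_slices, ny, nx))
dose[:, 8:24, 8:24] = 30.0                 # irradiated region of the gel
post_scan = 1000.0 + heel + dose + rng.normal(0.0, 0.5, (n_slices, ny, nx))

# Slice-by-slice subtraction: each post slice uses its own background
# slice, cancelling the across-slice heel-effect trend.
diff = post_scan - pre_scan
slice_means = diff.mean(axis=(1, 2))
spread = slice_means.max() - slice_means.min()

# Naive alternative: one global background level leaves the trend behind.
global_diff = post_scan - pre_scan.mean()
global_spread = (global_diff.mean(axis=(1, 2)).max()
                 - global_diff.mean(axis=(1, 2)).min())
```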

  1. Effects of irradiation and fumaric acid treatment on the inactivation of Listeria monocytogenes and Salmonella typhimurium inoculated on sliced ham

    NASA Astrophysics Data System (ADS)

    Song, Hyeon-Jeong; Lee, Ji-Hye; Song, Kyung Bin

    2011-11-01

To examine the effects of fumaric acid and electron beam irradiation on the inactivation of foodborne pathogens in ready-to-eat meat products, sliced ham was inoculated with Listeria monocytogenes and Salmonella typhimurium. The inoculated ham slices were treated with 0.5% fumaric acid or electron beam irradiation at 2 kGy. Fumaric acid treatment reduced the populations of L. monocytogenes and S. typhimurium by approximately 1 log CFU/g compared to control populations. In contrast, electron beam irradiation decreased the populations of S. typhimurium and L. monocytogenes by 3.78 and 2.42 log CFU/g, respectively. These results suggest that electron beam irradiation is the better and more appropriate technique for improving the microbial safety of sliced ham.
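
The reported log reductions translate into surviving fractions by simple arithmetic, which makes the gap between the two treatments concrete:

```python
import math


def log_reduction(n_before, n_after):
    """Log10 reduction in viable counts (e.g., CFU/g)."""
    return math.log10(n_before / n_after)


def surviving_fraction(reduction_log10):
    """Fraction of the initial population surviving a given log reduction."""
    return 10.0 ** (-reduction_log10)


# The reported 3.78-log reduction of S. typhimurium by 2 kGy e-beam leaves
# fewer than 2 survivors per 10,000 initial cells.
frac_ebeam = surviving_fraction(3.78)
# The ~1-log reduction from 0.5% fumaric acid leaves ~10% surviving.
frac_acid = surviving_fraction(1.0)
```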

  2. Automated Detection of Clouds in Satellite Imagery

    NASA Technical Reports Server (NTRS)

    Jedlovec, Gary

    2010-01-01

Many different approaches have been used to automatically detect clouds in satellite imagery. Most approaches are deterministic and provide a binary cloud - no cloud product used in a variety of applications. Some of these applications require the identification of cloudy pixels for cloud parameter retrieval, while others require only an ability to mask out clouds for the retrieval of surface or atmospheric parameters in the absence of clouds. A few approaches estimate a probability of the presence of a cloud at each point in an image. These probabilities allow a user to select cloud information based on the tolerance of the application to uncertainty in the estimate. Many automated cloud detection techniques develop sophisticated tests using a combination of visible and infrared channels to determine the presence of clouds in both day and night imagery. Visible channels are quite effective in detecting clouds during the day, as long as test thresholds properly account for variations in surface features and atmospheric scattering. Cloud detection at night is more challenging, since only coarser-resolution infrared measurements are available. A few schemes use just two infrared channels for day and night cloud detection. The most influential factor in the success of a particular technique is the determination of the thresholds for each cloud test. The techniques which perform the best usually have thresholds that are varied based on the geographic region, time of year, time of day and solar angle.
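
A minimal threshold-test cloud mask of the kind described, combining a visible brightness test by day with an 11 μm cold test that also works at night, can be sketched as follows. The thresholds and pixel values are placeholders, since in practice the thresholds vary with region, season, time of day, and solar angle.

```python
import numpy as np

# Illustrative pixels: visible reflectance and 11 um brightness temperature (K).
vis = np.array([0.55, 0.08, 0.12, 0.60])
bt11 = np.array([230.0, 288.0, 255.0, 285.0])

REFLECTANCE_MAX = 0.30  # brighter than typical cloud-free surface
BT11_MIN = 265.0        # colder than typical cloud-free surface

# Daytime test: cloudy if bright in the visible OR cold in the 11 um window.
cloudy_day = (vis > REFLECTANCE_MAX) | (bt11 < BT11_MIN)

# At night only the infrared test is available, so warm low clouds
# that were caught by the visible test can be missed.
cloudy_night = bt11 < BT11_MIN
```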

  3. Collaborative Research: Simulation of Beam-Electron Cloud Interactions in Circular Accelerators Using Plasma Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Katsouleas, Thomas; Decyk, Viktor

Final Report for grant DE-FG02-06ER54888, "Simulation of Beam-Electron Cloud Interactions in Circular Accelerators Using Plasma Models" Viktor K. Decyk, University of California, Los Angeles, Los Angeles, CA 90095-1547. The primary goal of this collaborative proposal was to modify the code QuickPIC and apply it to study the long-time stability of beam propagation in low density electron clouds present in circular accelerators. The UCLA contribution to this collaborative proposal was in supporting the development of the pipelining scheme for the QuickPIC code, which extended the parallel scaling of this code by two orders of magnitude. The USC work described here was the PhD research of Ms. Bing Feng, lead author of reference [2] below, who performed the research at USC under the guidance of the PI, Tom Katsouleas, and in collaboration with Dr. Decyk. The QuickPIC code [1] is a multi-scale Particle-in-Cell (PIC) code. The outer 3D code contains a beam which propagates through a long region of plasma and evolves slowly. The plasma response to this beam is modeled by slices of a 2D plasma code. This plasma response then is fed back to the beam code, and the process repeats. The pipelining is based on the observation that once the beam has passed a 2D slice, its response can be fed back to the beam immediately without waiting for the beam to pass all the other slices. Thus independent blocks of 2D slices from different time steps can be running simultaneously. The major difficulty was when particles at the edges needed to communicate with other blocks. Two versions of the pipelining scheme were developed: one for the full quasi-static code and the other for the basic quasi-static code used by this e-cloud proposal. Details of the pipelining scheme were published in [2]. The new version of QuickPIC was able to run with more than 1,000 processors, and was successfully applied in modeling e-clouds by our collaborators in this proposal [3-8].
Jean-Luc Vay at Lawrence Berkeley National Lab later implemented a similar basic quasistatic scheme including pipelining in the code WARP [9] and found good to very good quantitative agreement between the two codes in modeling e-clouds. References [1] C. Huang, V. K. Decyk, C. Ren, M. Zhou, W. Lu, W. B. Mori, J. H. Cooley, T. M. Antonsen, Jr., and T. Katsouleas, "QUICKPIC: A highly efficient particle-in-cell code for modeling wakefield acceleration in plasmas," J. Computational Phys. 217, 658 (2006). [2] B. Feng, C. Huang, V. K. Decyk, W. B. Mori, P. Muggli, and T. Katsouleas, "Enhancing parallel quasi-static particle-in-cell simulations with a pipelining algorithm," J. Computational Phys, 228, 5430 (2009). [3] C. Huang, V. K. Decyk, M. Zhou, W. Lu, W. B. Mori, J. H. Cooley, T. M. Antonsen, Jr., and B. Feng, T. Katsouleas, J. Vieira, and L. O. Silva, "QUICKPIC: A highly efficient fully parallelized PIC code for plasma-based acceleration," Proc. of the SciDAC 2006 Conf., Denver, Colorado, June, 2006 [Journal of Physics: Conference Series, W. M. Tang, Editor, vol. 46, Institute of Physics, Bristol and Philadelphia, 2006], p. 190. [4] B. Feng, C. Huang, V. Decyk, W. B. Mori, T. Katsouleas, P. Muggli, "Enhancing Plasma Wakefield and E-cloud Simulation Performance Using a Pipelining Algorithm," Proc. 12th Workshop on Advanced Accelerator Concepts, Lake Geneva, WI, July, 2006, p. 201 [AIP Conf. Proceedings, vol. 877, Melville, NY, 2006]. [5] B. Feng, P. Muggli, T. Katsouleas, V. Decyk, C. Huang, and W. Mori, "Long Time Electron Cloud Instability Simulation Using QuickPIC with Pipelining Algorithm," Proc. of the 2007 Particle Accelerator Conference, Albuquerque, NM, June, 2007, p. 3615. [6] B. Feng, C. Huang, V. Decyk, W. B. Mori, G. H. Hoffstaetter, P. Muggli, T. Katsouleas, "Simulation of Electron Cloud Effects on Electron Beam at ERL with Pipelined QuickPIC," Proc. 13th Workshop on Advanced Accelerator Concepts, Santa Cruz, CA, July-August, 2008, p. 340 [AIP Conf. 
Proceedings, vol. 1086, Melville, NY, 2008]. [7] B. Feng, C. Huang, V. K. Decyk, W. B. Mori, P. Muggli, and T. Katsouleas, "Enhancing parallel quasi-static particle-in-cell simulations with a pipelining algorithm," J. Computational Phys. 228, 5430 (2009). [8] C. Huang, W. An, V. K. Decyk, W. Lu, W. B. Mori, F. S. Tsung, M. Tzoufras, S. Morshed, T. Antonsen, B. Feng, T. Katsouleas, R. A. Fonseca, S. F. Martins, J. Vieira, L. O. Silva, E. Esarey, C. G. R. Geddes, W. P. Leemans, E. Cormier-Michel, J.-L. Vay, D. L. Bruhwiler, B. Cowan, J. R. Cary, and K. Paul, "Recent results and future challenges for large scale particle-in-cell simulations of plasma-based accelerator concepts," Proc. of the SciDAC 2009 Conf., San Diego, CA, June, 2009 [Journal of Physics: Conference Series, vol. 180, Institute of Physics, Bristol and Philadelphia, 2009], p. 012005. [9] J.-L. Vay, C. M. Celata, M. A. Furman, G. Penn, M. Venturini, D. P. Grote, and K. G. Sonnad, "Update on Electron-Cloud Simulations Using the Package WARP-POSINST," Proc. of the 2009 Particle Accelerator Conference PAC09, Vancouver, Canada, June, 2009, paper FR5RFP078.
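    The scheduling gain from the pipelining scheme can be illustrated with a toy model (our own sketch, not the QuickPIC implementation): with B blocks of 2D slices and T beam time steps, serial execution takes B·T stages, while the pipeline finishes in T + B − 1 stages because blocks from different time steps run concurrently.

```python
# Toy model of software pipelining across blocks of 2D slices: once the beam
# has passed a block, that block may begin the next beam time step one stage
# after its upstream neighbor, instead of waiting for all other blocks.
def pipeline_schedule(n_blocks, n_steps):
    """Map each wall-clock stage to the (block, step) work units active in it.

    Block b can start step t one stage after block b-1 started step t,
    so work unit (b, t) executes at stage t + b.
    """
    schedule = {}
    for step in range(n_steps):
        for block in range(n_blocks):
            schedule.setdefault(step + block, []).append((block, step))
    return schedule

sched = pipeline_schedule(n_blocks=4, n_steps=6)
serial_stages = 4 * 6              # one (block, step) pair at a time
pipelined_stages = len(sched)      # n_steps + n_blocks - 1 = 9
```

    The same amount of work is done in both cases; the pipeline simply overlaps blocks from different time steps, which is what extended the code's parallel scaling.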

  4. Use of automatic exposure control in multislice computed tomography of the coronaries: comparison of 16‐slice and 64‐slice scanner data with conventional coronary angiography

    PubMed Central

    Deetjen, Anja; Möllmann, Susanne; Conradi, Guido; Rolf, Andreas; Schmermund, Axel; Hamm, Christian W; Dill, Thorsten

    2007-01-01

    Objective To evaluate the radiation‐dose‐reduction potential of automatic exposure control (AEC) in 16‐slice and 64‐slice multislice computed tomography (MSCT) of the coronary arteries (computed tomography angiography, CTA) in patients. The rapid growth in MSCT CTA emphasises the necessity of adjusting technique factors to reduce radiation dose exposure. Design A retrospective data analysis was performed for 154 patients who had undergone MSCT CTA. Group 1 (n = 56) had undergone 16‐slice MSCT without AEC, and group 2 (n = 51), with AEC. In group 1, invasive coronary angiography (ICA) had been performed in addition. Group 3 (n = 47) had been examined using a 64‐slice scanner (with AEC, without ECG‐triggered tube current modulation). Results In group 1, the mean (SD) effective dose (ED) for MSCT CTA was 9.76 (1.84) mSv and for ICA it was 2.6 (1.27) mSv. In group 2, the mean ED for MSCT CTA was 5.83 (1.73) mSv, which signifies a 42.8% dose reduction for CTA by the use of AEC. In comparison to ICA, MSCT CTA without AEC shows a 3.8‐fold increase in radiation dose, and the radiation dose of CTA with AEC was increased by a factor of 1.9. In group 3, the mean ED for MSCT CTA was 13.58 (2.80) mSv. Conclusions This is the first study to show the significant dose‐reduction potential (42.8%) of AEC in MSCT CTA in patients. This relatively new technique can be used to optimise the radiation dose levels in MSCT CTA. PMID:17395667

  5. Simultaneous multi-slice combined with PROPELLER.

    PubMed

    Norbeck, Ola; Avventi, Enrico; Engström, Mathias; Rydén, Henric; Skare, Stefan

    2018-08-01

    Simultaneous multi-slice (SMS) imaging is an advantageous method for accelerating MRI scans, allowing reduced scan time, increased slice coverage, or high temporal resolution with limited image quality penalties. In this work we combine the advantages of SMS acceleration with the motion correction and artifact reduction capabilities of the PROPELLER technique. A PROPELLER sequence was developed with support for CAIPIRINHA and phase optimized multiband radio frequency pulses. To minimize the time spent on acquiring calibration data, both in-plane generalized autocalibrating partial parallel acquisition (GRAPPA) and slice-GRAPPA weights for all PROPELLER blade angles were calibrated on a single fully sampled PROPELLER blade volume. Therefore, the proposed acquisition included a single fully sampled blade volume, with the remaining blades accelerated in both the phase and slice encoding directions without additional auto calibrating signal lines. Comparison to 3D RARE was performed as well as demonstration of 3D motion correction performance on the SMS PROPELLER data. We show that PROPELLER acquisitions can be efficiently accelerated with SMS using a short embedded calibration. The potential in combining these two techniques was demonstrated with a high quality 1.0 × 1.0 × 1.0 mm³ resolution T2-weighted volume, free from banding artifacts, and capable of 3D retrospective motion correction, with higher effective resolution compared to 3D RARE. With the combination of SMS acceleration and PROPELLER imaging, thin-sliced reformattable T2-weighted image volumes with 3D retrospective motion correction capabilities can be rapidly acquired with low sensitivity to flow and head motion. Magn Reson Med 80:496-506, 2018. © 2017 International Society for Magnetic Resonance in Medicine.

  6. Preparing Fresh Retinal Slices from Adult Zebrafish for Ex Vivo Imaging Experiments.

    PubMed

    Giarmarco, Michelle M; Cleghorn, Whitney M; Hurley, James B; Brockerhoff, Susan E

    2018-05-09

    The retina is a complex tissue that initiates and integrates the first steps of vision. Dysfunction of retinal cells is a hallmark of many blinding diseases, and future therapies hinge on fundamental understandings about how different retinal cells function normally. Gaining such information with biochemical methods has proven difficult because contributions of particular cell types are diminished in the retinal cell milieu. Live retinal imaging can provide a view of numerous biological processes on a subcellular level, thanks to a growing number of genetically encoded fluorescent biosensors. However, this technique has thus far been limited to tadpoles and zebrafish larvae, the outermost retinal layers of isolated retinas, or lower resolution imaging of retinas in live animals. Here we present a method for generating live ex vivo retinal slices from adult zebrafish for live imaging via confocal microscopy. This preparation yields transverse slices with all retinal layers and most cell types visible for performing confocal imaging experiments using perfusion. Transgenic zebrafish expressing fluorescent proteins or biosensors in specific retinal cell types or organelles are used to extract single-cell information from an intact retina. Additionally, retinal slices can be loaded with fluorescent indicator dyes, adding to the method's versatility. This protocol was developed for imaging Ca2+ within zebrafish cone photoreceptors, but with proper markers it could be adapted to measure Ca2+ or metabolites in Müller cells, bipolar and horizontal cells, microglia, amacrine cells, or retinal ganglion cells. The retinal pigment epithelium is removed from slices so this method is not suitable for studying that cell type. With practice, it is possible to generate serial slices from one animal for multiple experiments. This adaptable technique provides a powerful tool for answering many questions about retinal cell biology, Ca2+, and energy homeostasis.

  7. Performance evaluation of a 64-slice CT system with z-flying focal spot.

    PubMed

    Flohr, T; Stierstorfer, K; Raupach, R; Ulzheimer, S; Bruder, H

    2004-12-01

    The now-established generation of 16-slice CT systems enables routine sub-millimeter imaging at short breath-hold times. Clinical progress in the development of multidetector row CT (MDCT) technology beyond 16 slices can more likely be expected from further improvement in spatial and temporal resolution rather than from a mere increase in the speed of volume coverage. We present an evaluation of a recently introduced 64-slice CT system (SOMATOM Sensation 64, Siemens AG, Forchheim, Germany), which uses a periodic motion of the focal spot in the longitudinal direction (z-flying focal spot) to double the number of simultaneously acquired slices. This technique acquires 64 overlapping 0.6 mm slices per rotation. The sampling scheme corresponds to that of a 64 x 0.3 mm detector, with the goal of improved longitudinal resolution and reduced spiral artifacts. After an introduction to the detector design, we discuss the basics of z-flying focal spot technology (z-Sharp). We present phantom and specimen scans for performance evaluation. The measured full width at half maximum (FWHM) of the thinnest spiral slice is 0.65 mm. All spiral slice widths are almost independent of the pitch, with deviations of less than 0.1 mm from the nominal value. Using a high-resolution bar pattern phantom (CATPHAN, Phantom Laboratories, Salem, NY), the longitudinal resolution can be demonstrated to be up to 15 lp/cm at the isocenter independent of the pitch, corresponding to a bar diameter of 0.33 mm. Longitudinal resolution is only slightly degraded for off-center locations. At a distance of 100 mm from the isocenter, 14 lp/cm can be resolved in the z-direction, corresponding to a bar diameter of 0.36 mm. Spiral "windmill" artifacts presenting as hyper- and hypodense structures around osseous edges are effectively reduced by the z-flying focal spot technique.
Cardiac scanning benefits from the short gantry rotation time of 0.33 s, providing up to 83 ms temporal resolution with 2-segment ECG-gated reconstruction.
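    The lp/cm figures above convert to bar diameters via the definition of a line pair (one bar plus one equal-width gap). A quick check, with the helper name being ours:

```python
def bar_width_mm(lp_per_cm):
    """Bar diameter resolvable at a given spatial frequency: one line pair
    spans one bar plus one gap, so each bar is (10 mm/cm) / (2 * lp/cm)."""
    return 10.0 / (2.0 * lp_per_cm)

# The figures quoted above: 15 lp/cm -> 0.33 mm bars, 14 lp/cm -> 0.36 mm.
at_isocenter = round(bar_width_mm(15), 2)
off_center = round(bar_width_mm(14), 2)
```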

  8. Local diffusion and diffusion-T2 distribution measurements in porous media

    NASA Astrophysics Data System (ADS)

    Vashaee, S.; Newling, B.; MacMillan, B.; Marica, F.; Li, M.; Balcom, B. J.

    2017-05-01

    Slice-selective pulsed field gradient (PFG) and PFG-T2 measurements are developed to measure spatially-resolved molecular diffusion and diffusion-T2 distributions. A spatially selective adiabatic inversion pulse was employed for slice selection. The slice-selective pulse is able to select a coarse slice, on the order of 1 cm, at an arbitrary position in the sample. The new method can be employed to characterize oil-water mixtures in porous media. The new technique has an inherent sensitivity advantage over phase-encoding imaging-based methods because the signal is acquired from a thick slice. The method will be advantageous for magnetic resonance of porous media at low field, where sensitivity is problematic. Experimental CPMG data, following PFG diffusion measurement, were compromised by a transient ΔB0(t) field offset. The off-resonance effects of ΔB0(t) were examined by simulation. The ΔB0 offset artifact in D-T2 distribution measurements may be avoided by employing real data, instead of magnitude data.

  9. A Higher-Order Neural Network Design for Improving Segmentation Performance in Medical Image Series

    NASA Astrophysics Data System (ADS)

    Selvi, Eşref; Selver, M. Alper; Güzeliş, Cüneyt; Dicle, Oǧuz

    2014-03-01

    Segmentation of anatomical structures from medical image series is an ongoing field of research. Although organs of interest are three-dimensional in nature, slice-by-slice approaches are widely used in clinical applications because of their ease of integration with the current manual segmentation scheme. To use slice-by-slice techniques effectively, adjacent-slice information, which represents the likelihood of a region being the structure of interest, plays a critical role. Recent studies focus on using the distance transform directly as a feature or to increase the feature values in the vicinity of the search area. This study presents a novel approach by constructing a higher-order neural network, the input layer of which receives features together with their multiplications with the distance transform. This allows higher-order interactions between features through the non-linearity introduced by the multiplication. The application of the proposed method to 9 CT datasets for segmentation of the liver shows higher performance than well-known higher-order classification neural networks.
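    A minimal sketch of the input-layer construction described above: each pixel's features are concatenated with their products with the adjacent-slice distance transform, giving the multiplicative higher-order terms. The function name and flattened array layout are our own illustrative assumptions.

```python
import numpy as np

def higher_order_inputs(features, dist):
    """Augment per-pixel features with feature * distance-transform products.

    features: (n_pixels, n_features) array of per-pixel feature values.
    dist:     (n_pixels,) distance-transform values from the adjacent slice.
    Returns an (n_pixels, 2 * n_features) array: the raw features followed
    by their elementwise products with the distance transform.
    """
    prod = features * dist[:, None]            # multiplicative interaction terms
    return np.concatenate([features, prod], axis=1)

x = np.array([[1.0, 2.0], [3.0, 4.0]])         # 2 pixels, 2 features each
d = np.array([0.5, 2.0])                       # distance-transform values
augmented = higher_order_inputs(x, d)          # shape (2, 4)
```

    The augmented matrix would then feed a standard classifier; the multiplication is where the non-linearity between features and spatial prior enters.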

  10. [A modified intracellular labelling technique for high-resolution staining of neuron in 500 microm-thickness brain slice].

    PubMed

    Zhao, Ming-liang; Liu, Guo-long; Sui, Jian-feng; Ruan, Huai-zhen; Xiong, Ying

    2007-05-01

    To develop a simple but reliable intracellular labelling method for high-resolution visualization of the fine structure of single neurons in brain slices with a thickness of 500 microm. Biocytin was introduced into neurons in 500 microm-thickness brain slices during blind whole-cell recording. After processing for histochemistry using the avidin-biotin-complex method, stained slices were mounted in glycerol on special glass slides. Labelled cells were digitally photomicrographed every 30 microm and reconstructed with Adobe Photoshop software. After histochemistry, only limited background staining was produced. The resolution was high enough that fine structure, including branching and termination of individual axons and even spines of neurons, could be identified in exquisite detail with an optical microscope. With the help of software, the neurons of interest could be reconstructed from a stack of photomicrographs. The modified method provides an easy and reliable approach to revealing the detailed morphological properties of single neurons in 500 microm-thickness brain slices. Because it requires no special equipment, it is well suited to broad application.

  11. Linear measurements in 2-dimensional pelvic floor imaging: the impact of slice tilt angles on measurement reproducibility.

    PubMed

    Hoyte, L; Ratiu, P

    2001-09-01

    Magnetic resonance imaging techniques have improved the study of female pelvic dysfunction. However, disagreements between magnetic resonance measurements and their derived 3-dimensional reconstructions were noted. We tested the hypothesis that these discrepancies stemmed from variations in magnetic resonance acquisition angle. Images from the pelvis of the Visible Human Female (a thinly sliced cadaveric image data set) were obtained. Slices in the axial plane were rotated around pivot points in the pelvis to yield a set of similar-appearing para-axial images. A parameter that described the maximum anterior-posterior dimension of the levator hiatus was defined. This levator hiatus parameter was measured on all of the rotated images and compared with an expected value that was calculated from trigonometry. The levator hiatus was also measured on a group of similar-appearing slices rotated slightly around a defined point. In 1 group of slices, expected levator hiatus variation was 1.5 to 6.1%, whereas measured variation was 4% to 15%. Among the similar-appearing rotated slices, 4.8% to 16.0% variations were seen in the levator hiatus. Identical measurements made on radiologic images can vary widely. Slice acquisition must be standardized to avoid errors in data comparison.
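    The abstract does not state its trigonometric formula, but one simple geometric model reproduces the quoted expected range: a segment lying in the true axial plane, imaged on a slice tilted by θ about an axis perpendicular to the segment, appears lengthened by 1/cos θ. Under that assumption (ours, not necessarily the authors' exact geometry):

```python
import math

def apparent_overestimate_pct(tilt_deg):
    """Percent lengthening of an in-plane dimension when the imaging slice
    is tilted by tilt_deg about an axis perpendicular to the measured
    segment: the oblique cut stretches the apparent length by 1/cos(theta).
    (A simple illustrative model of slice-tilt error.)"""
    return (1.0 / math.cos(math.radians(tilt_deg)) - 1.0) * 100.0

# Tilts of roughly 10-20 degrees give about 1.5-6.4%, bracketing the
# 1.5-6.1% expected variation reported in the abstract.
low = apparent_overestimate_pct(10.0)
high = apparent_overestimate_pct(20.0)
```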

  12. Accelerated Slice Encoding for Metal Artifact Correction

    PubMed Central

    Hargreaves, Brian A.; Chen, Weitian; Lu, Wenmiao; Alley, Marcus T.; Gold, Garry E.; Brau, Anja C. S.; Pauly, John M.; Pauly, Kim Butts

    2010-01-01

    Purpose To demonstrate accelerated imaging with artifact reduction near metallic implants and different contrast mechanisms. Materials and Methods Slice-encoding for metal artifact correction (SEMAC) is a modified spin echo sequence that uses view-angle tilting and slice-direction phase encoding to correct both in-plane and through-plane artifacts. Standard spin echo trains and short-TI inversion recovery (STIR) allow efficient PD-weighted imaging with optional fat suppression. A completely linear reconstruction allows incorporation of parallel imaging and partial Fourier imaging. The SNR effects of all reconstructions were quantified in one subject. 10 subjects with different metallic implants were scanned using SEMAC protocols, all with scan times below 11 minutes, as well as with standard spin echo methods. Results The SNR using standard acceleration techniques is unaffected by the linear SEMAC reconstruction. In all cases with implants, accelerated SEMAC significantly reduced artifacts compared with standard imaging techniques, with no additional artifacts from acceleration techniques. The use of different contrast mechanisms allowed differentiation of fluid from other structures in several subjects. Conclusion SEMAC imaging can be combined with standard echo-train imaging, parallel imaging, partial-Fourier imaging and inversion recovery techniques to offer flexible image contrast with a dramatic reduction of metal-induced artifacts in scan times under 11 minutes. PMID:20373445

  13. Accelerated slice encoding for metal artifact correction.

    PubMed

    Hargreaves, Brian A; Chen, Weitian; Lu, Wenmiao; Alley, Marcus T; Gold, Garry E; Brau, Anja C S; Pauly, John M; Pauly, Kim Butts

    2010-04-01

    To demonstrate accelerated imaging with both artifact reduction and different contrast mechanisms near metallic implants. Slice-encoding for metal artifact correction (SEMAC) is a modified spin echo sequence that uses view-angle tilting and slice-direction phase encoding to correct both in-plane and through-plane artifacts. Standard spin echo trains and short-TI inversion recovery (STIR) allow efficient PD-weighted imaging with optional fat suppression. A completely linear reconstruction allows incorporation of parallel imaging and partial Fourier imaging. The signal-to-noise ratio (SNR) effects of all reconstructions were quantified in one subject. Ten subjects with different metallic implants were scanned using SEMAC protocols, all with scan times below 11 minutes, as well as with standard spin echo methods. The SNR using standard acceleration techniques is unaffected by the linear SEMAC reconstruction. In all cases with implants, accelerated SEMAC significantly reduced artifacts compared with standard imaging techniques, with no additional artifacts from acceleration techniques. The use of different contrast mechanisms allowed differentiation of fluid from other structures in several subjects. SEMAC imaging can be combined with standard echo-train imaging, parallel imaging, partial-Fourier imaging, and inversion recovery techniques to offer flexible image contrast with a dramatic reduction of metal-induced artifacts in scan times under 11 minutes. (c) 2010 Wiley-Liss, Inc.

  14. A Fourier approach to cloud motion estimation

    NASA Technical Reports Server (NTRS)

    Arking, A.; Lo, R. C.; Rosenfield, A.

    1977-01-01

    A Fourier technique is described for estimating cloud motion from pairs of pictures using the phase of the cross spectral density. The method allows motion estimates to be made for individual spatial frequencies, which are related to cloud pattern dimensions. Results obtained are presented and compared with the results of a Fourier-domain cross-correlation scheme. Experiments using both artificial and real cloud data show that the technique is relatively sensitive to the presence of mixtures of motions, changes in cloud shape, and edge effects.
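    The core idea can be sketched with phase correlation, a closely related technique: for a uniform translation, the phase of the cross spectral density varies linearly with spatial frequency, and its inverse transform peaks at the displacement. This sketch recovers a single whole-image shift, whereas the paper analyzes the phase per spatial frequency; the code is our illustration, not the authors' implementation.

```python
import numpy as np

def estimate_shift(img1, img2):
    """Estimate the translation taking img1 to img2 from the phase of the
    cross spectral density (phase correlation)."""
    f1, f2 = np.fft.fft2(img1), np.fft.fft2(img2)
    cross = np.conj(f1) * f2
    cross /= np.abs(cross) + 1e-12          # discard amplitude, keep phase only
    corr = np.fft.ifft2(cross).real         # delta-like peak at the displacement
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    ny, nx = corr.shape                     # wrap large indices to negative shifts
    return (dy - ny if dy > ny // 2 else dy,
            dx - nx if dx > nx // 2 else dx)
```

    For a cloud field that simply translates between two frames, the recovered peak location is the motion vector in pixels.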

  15. Extending MODIS Cloud Top and Infrared Phase Climate Records with VIIRS and CrIS

    NASA Astrophysics Data System (ADS)

    Heidinger, A. K.; Platnick, S. E.; Ackerman, S. A.; Holz, R.; Meyer, K.; Frey, R.; Wind, G.; Li, Y.; Botambekov, D.

    2015-12-01

    The MODIS imagers on the NASA EOS Terra and Aqua satellites have generated accurate and well-used cloud climate data records for 15 years. Both missions are expected to continue until the end of this decade and perhaps beyond. The Visible and Infrared Imaging Radiometer Suite (VIIRS) imagers on the Suomi-NPP (SNPP) mission (launched in October 2011) and future NOAA Joint Polar Satellite System (JPSS) platforms are the successors for imager-based cloud climate records from polar orbiting satellites after MODIS. To ensure product continuity across a broad suite of EOS products, NASA has funded a SNPP science team to develop EOS-like algorithms that can be used with SNPP and JPSS observations, including two teams to work on cloud products. Cloud data record continuity between MODIS and VIIRS is particularly challenging due to the lack of VIIRS CO2-slicing channels, which reduces information content for cloud detection and cloud-top property products, as well as downstream cloud optical products that rely on both. Here we report on our approach to providing continuity specifically for the MODIS/VIIRS cloud-top and infrared-derived thermodynamic phase products by combining elements of the NASA MODIS science team (MOD) and the NOAA Algorithm Working Group (AWG) algorithms. The combined approach is referred to as the MODAWG processing package. In collaboration with the NASA Atmospheric SIPS located at the University of Wisconsin Space Science and Engineering Center, the MODAWG code has been exercised on one year of SNPP VIIRS data. In addition to cloud-top and phase, MODAWG provides a full suite of cloud products that are physically consistent with MODIS and have a similar data format. Further, the SIPS has developed tools to allow use of Cross-track Infrared Sounder (CrIS) observations in the MODAWG processing that can ameliorate the loss of the CO2 absorption channels on VIIRS.
Examples will be given that demonstrate the positive impact that the CrIS data can provide when combined with VIIRS for cloud height and IR-phase retrievals.

  16. Imaging of Brain Slices with a Genetically Encoded Voltage Indicator.

    PubMed

    Quicke, Peter; Barnes, Samuel J; Knöpfel, Thomas

    2017-01-01

    Functional fluorescence microscopy of brain slices using voltage-sensitive fluorescent proteins (VSFPs) allows large-scale electrophysiological monitoring of neuronal excitation and inhibition. We describe the equipment and techniques needed to successfully record optical voltage signals from cells expressing a voltage indicator such as VSFP Butterfly 1.2. We also discuss the advantages of voltage imaging and the challenges it presents.

  17. Development of a high efficiency thin silicon solar cell

    NASA Technical Reports Server (NTRS)

    Lindmayer, J.; Wrigley, C. Y.

    1977-01-01

    A key to the success of this program was the breakthrough development of a technology for producing ultra-thin silicon slices which are very flexible, resilient, and tolerant of moderate handling abuse. Experimental topics investigated were thinning technology, gaseous junction diffusion, aluminum back alloying, internal reflectance, tantalum oxide anti-reflective coating optimization, slice flexibility, handling techniques, production rate limiting steps, low temperature behavior, and radiation tolerance.

  18. Daytime Cloud Property Retrievals Over the Arctic from Multispectral MODIS Data

    NASA Technical Reports Server (NTRS)

    Spangenberg, Douglas A.; Trepte, Qing; Minnis, Patrick; Uttal, Taneil

    2004-01-01

    Improving climate model predictions over Earth's polar regions requires a complete understanding of polar cloud properties. Passive satellite remote sensing techniques can be used to retrieve macro- and microphysical properties of polar cloud systems. However, over the Arctic there is minimal contrast between clouds and the background snow surface observed in satellite data, especially at visible wavelengths. This makes it difficult to identify clouds and retrieve their properties from space. Variable snow and ice cover, temperature inversions, and the predominance of mixed-phase clouds further complicate cloud property identification. For this study, the operational Clouds and the Earth's Radiant Energy System (CERES) cloud mask is first used to discriminate clouds from the background surface in Terra Moderate Resolution Imaging Spectroradiometer (MODIS) data. A solar-infrared, infrared, near-infrared technique (SINT), first used by Platnick et al. (2001), is used here to retrieve cloud properties over snow- and ice-covered regions.

  19. Pattern recognition of satellite cloud imagery for improved weather prediction

    NASA Technical Reports Server (NTRS)

    Gautier, Catherine; Somerville, Richard C. J.; Volfson, Leonid B.

    1986-01-01

    The major accomplishment was the successful development of a method for extracting time derivative information from geostationary meteorological satellite imagery. This research is a proof-of-concept study which demonstrates the feasibility of using pattern recognition techniques and a statistical cloud classification method to estimate time rate of change of large-scale meteorological fields from remote sensing data. The cloud classification methodology is based on typical shape function analysis of parameter sets characterizing the cloud fields. The three specific technical objectives, all of which were successfully achieved, are as follows: develop and test a cloud classification technique based on pattern recognition methods, suitable for the analysis of visible and infrared geostationary satellite VISSR imagery; develop and test a methodology for intercomparing successive images using the cloud classification technique, so as to obtain estimates of the time rate of change of meteorological fields; and implement this technique in a testbed system incorporating an interactive graphics terminal to determine the feasibility of extracting time derivative information suitable for comparison with numerical weather prediction products.

  20. Silicon Ingot Casting - Heat Exchanger Method (HEM). Multi-Wire Slicing - Fixed Abrasive Slicing Technique (Fast). Phase 4 Silicon Sheet Growth Development for the Large Area Sheet Task of the Low-Cost Solar Array Project

    NASA Technical Reports Server (NTRS)

    Schmid, F.

    1981-01-01

    The crystallinity of large HEM silicon ingots as a function of heat flow conditions is investigated. A balanced heat flow at the bottom of the ingot restricts spurious nucleation to the edge of the melted-back seed in contact with the crucible. Homogeneous resistivity distribution throughout the ingot has been achieved. The positioning of diamonds electroplated on wirepacks used to slice silicon crystals is considered. The electroplating of diamonds on only the cutting edge is described, and the improved slicing performance of these wires is evaluated. An economic analysis of value-added costs of HEM ingot casting and band saw sectioning indicates that the projected add-on cost of HEM is well below the 1986 allocation.

  1. A simple water-immersion condenser for imaging living brain slices on an inverted microscope.

    PubMed

    Prusky, G T

    1997-09-05

    Due to some physical limitations of conventional condensers, inverted compound microscopes are not optimally suited for imaging living brain slices with transmitted light. Herein is described a simple device that converts an inverted microscope into an effective tool for this application by utilizing an objective as a condenser. The device is mounted on a microscope in place of the condenser, is threaded to accept a water immersion objective, and has a slot for a differential interference contrast (DIC) slider. When combined with infrared video techniques, this device allows an inverted microscope to effectively image living cells within thick brain slices in an open perfusion chamber.

  2. Maintaining network activity in submerged hippocampal slices: importance of oxygen supply.

    PubMed

    Hájos, Norbert; Ellender, Tommas J; Zemankovics, Rita; Mann, Edward O; Exley, Richard; Cragg, Stephanie J; Freund, Tamás F; Paulsen, Ole

    2009-01-01

    Studies in brain slices have provided a wealth of data on the basic features of neurons and synapses. In the intact brain, these properties may be strongly influenced by ongoing network activity. Although physiologically realistic patterns of network activity have been successfully induced in brain slices maintained in interface-type recording chambers, they have been harder to obtain in submerged-type chambers, which offer significant experimental advantages, including fast exchange of pharmacological agents, visually guided patch-clamp recordings, and imaging techniques. Here, we investigated conditions for the emergence of network oscillations in submerged slices prepared from the hippocampus of rats and mice. We found that the local oxygen level is critical for generation and propagation of both spontaneously occurring sharp wave-ripple oscillations and cholinergically induced fast oscillations. We suggest three ways to improve the oxygen supply to slices under submerged conditions: (i) optimizing chamber design for laminar flow of superfusion fluid; (ii) increasing the flow rate of superfusion fluid; and (iii) superfusing both surfaces of the slice. These improvements to the recording conditions enable detailed studies of neurons under more realistic conditions of network activity, which are essential for a better understanding of neuronal network operation.

  3. TH-EF-BRA-08: A Novel Technique for Estimating Volumetric Cine MRI (VC-MRI) From Multi-Slice Sparsely Sampled Cine Images Using Motion Modeling and Free Form Deformation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, W; Yin, F; Wang, C

    Purpose: To develop a technique to estimate on-board VC-MRI using multi-slice sparsely-sampled cine images, patient prior 4D-MRI, motion modeling and free-form deformation for real-time 3D target verification of lung radiotherapy. Methods: A previous method has been developed to generate on-board VC-MRI by deforming prior MRI images based on a motion model (MM) extracted from prior 4D-MRI and a single-slice on-board 2D-cine image. In this study, free-form deformation (FD) was introduced to correct for errors in the MM when large anatomical changes exist. Multiple-slice sparsely-sampled on-board 2D-cine images located within the target are used to improve both the estimation accuracy and temporal resolution of VC-MRI. The on-board 2D-cine MRIs are acquired at 20–30 frames/s by sampling only 10% of the k-space on a Cartesian grid, with 85% of that taken at the central k-space. The method was evaluated using XCAT (computerized patient model) simulation of lung cancer patients with various anatomical and respiratory changes from prior 4D-MRI to on-board volume. The accuracy was evaluated using Volume-Percent-Difference (VPD) and Center-of-Mass-Shift (COMS) of the estimated tumor volume. Effects of region-of-interest (ROI) selection, 2D-cine slice orientation, slice number and slice location on the estimation accuracy were evaluated. Results: VC-MRI estimated using 10 sparsely-sampled sagittal 2D-cine MRIs achieved VPD/COMS of 9.07±3.54%/0.45±0.53mm among all scenarios based on estimation with ROI-MM-ROI-FD. The FD optimization improved estimation significantly for scenarios with anatomical changes. Using ROI-FD achieved better estimation than global-FD. Changing the multi-slice orientation to axial, coronal, and axial/sagittal orthogonal reduced the accuracy of VC-MRI to VPD/COMS of 19.47±15.74%/1.57±2.54mm, 20.70±9.97%/2.34±0.92mm, and 16.02±13.79%/0.60±0.82mm, respectively.
Reducing the number of cines to 8 enhanced the temporal resolution of VC-MRI by 25% while maintaining the estimation accuracy. Estimation using slices sampled uniformly through the tumor achieved better accuracy than slices sampled non-uniformly. Conclusions: Preliminary studies showed that it is feasible to generate VC-MRI from multi-slice sparsely-sampled 2D-cine images for real-time 3D target verification. This work was supported by the National Institutes of Health under Grant No. R01-CA184173 and a research grant from Varian Medical Systems.
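    The sparse Cartesian sampling described in the Methods (10% of k-space lines, with 85% of the samples at the central k-space) can be sketched as a phase-encode mask. The 10%/85% figures come from the abstract; the exact placement of lines (a contiguous central band plus randomly scattered outer lines) is our illustrative assumption.

```python
import numpy as np

def sparse_cine_mask(n_lines=256, frac=0.10, central_frac=0.85, seed=0):
    """Build a 1D Cartesian phase-encode sampling mask: `frac` of all lines
    are sampled, with `central_frac` of those samples concentrated in a
    contiguous central-k-space band and the rest scattered over outer lines."""
    rng = np.random.default_rng(seed)
    n_sampled = int(round(n_lines * frac))            # e.g. 26 of 256 lines
    n_central = int(round(n_sampled * central_frac))  # e.g. 22 central lines
    n_outer = n_sampled - n_central
    mask = np.zeros(n_lines, dtype=bool)
    start = n_lines // 2 - n_central // 2
    mask[start:start + n_central] = True              # contiguous central band
    outer = np.flatnonzero(~mask)                     # remaining candidate lines
    mask[rng.choice(outer, size=n_outer, replace=False)] = True
    return mask

mask = sparse_cine_mask()
```

    Acquiring only this fraction of lines per frame is what allows the quoted 20–30 frames/s cine rate.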

  4. Hurricane Alex as Observed by NASA's Spaceborne Atmospheric Infrared Sounder (AIRS)

    NASA Technical Reports Server (NTRS)

    2004-01-01

    [figure removed for brevity, see original site] Click on the image for August 3, 2004 movie, slicing down the atmosphere with the AIRS infrared sensor

    These images of hurricane Alex were captured on August 3, 2004 at 1:30pm EDT. Located in the Atlantic Ocean about 80 miles south-southeast of Charleston, South Carolina, Alex is now a category 2 hurricane with maximum sustained winds near 100 mph (161 kph). Alex's center was about 65 miles (104 kilometers) northeast of Cape Hatteras and moving away from the U.S. coast.

    The major contribution to the radiation (infrared light) that AIRS infrared channels sense comes from different levels in the atmosphere, depending upon the channel wavelength. To create the movies, a set of AIRS infrared channels was selected that probes the atmosphere at progressively deeper levels. If there were no clouds, the color in each frame would be nearly uniform until the Earth's surface was encountered. The tropospheric air temperature warms at a rate of 6 K (about 11 F) for each kilometer of descent toward the surface, so the colors would gradually change from cold to warm as the movie progresses.

    Clouds block the infrared radiation. Thus wherever there are clouds we can penetrate no deeper in infrared. The color remains fixed as the movie progresses, for that area of the image is 'stuck' to the cloud top temperature. The coldest temperatures around 220 K (about -65 F) come from altitudes of about 10 miles.

    We therefore see, in a 'surface channel' at the end of the movie, signals from clouds as cold as 220 K and from Earth's surface at 310 K (about 100 F). The very coldest clouds are seen in deep convective thunderstorms over land. Images [figure removed for brevity, see original site] August 2, 2004, 1:30am ET Frame from August 2 movie, slicing down the atmosphere with the AIRS infrared sensor. Alex, a tropical storm, with sustained winds at 60 mph. The storm is 115 miles southeast of Charleston, South Carolina, traveling northeast at 6 mph.
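The numbers in the text can be checked with the lapse-rate relation it describes: assuming a 310 K surface and the quoted 6 K per kilometer of descent, a 220 K cloud top sits near 15 km, consistent with the "about 10 miles" figure. A minimal sketch:

```python
def cloud_top_height_km(t_cloud_k, t_surface_k=310.0, lapse_k_per_km=6.0):
    """Estimate cloud-top altitude from an infrared brightness temperature,
    assuming the constant ~6 K/km tropospheric lapse rate quoted in the text."""
    return (t_surface_k - t_cloud_k) / lapse_k_per_km

h = cloud_top_height_km(220.0)                 # coldest cloud tops in the movie
print(f"{h:.1f} km (~{h * 0.621:.1f} miles)")  # 15.0 km (~9.3 miles)
```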

    [figure removed for brevity, see original site] August 1, 2004, 1:30am ET Daylight snapshot from AIRS visible/near-infrared. At the time AIRS made this observation, Alex was still a tropical depression and just getting organized.

    Movies Slice down the atmosphere with the AIRS infrared sensor.

    [figure removed for brevity, see original site] August 3, 2004, 1:30am ET Alex becomes the first hurricane of the 2004 North Atlantic season with sustained winds at 75 mph.

    [figure removed for brevity, see original site] August 2, 2004, 1:30pm ET Alex is located about 120 miles southeast of Charleston, South Carolina. Alex has now begun to move to the northeast and a general northeastward track is expected the next couple of days with a gradual acceleration in forward speed as it begins to interact with stronger upper level winds.

    [figure removed for brevity, see original site] August 2, 2004, 1:30am ET Alex now has sustained winds of 35 knots.

    [figure removed for brevity, see original site] August 1, 2004, 1:30pm ET Alex is a tropical depression and is beginning to get organized.

    The Atmospheric Infrared Sounder Experiment, with its visible, infrared, and microwave detectors, provides a three-dimensional look at Earth's weather. Working in tandem, the three instruments can make simultaneous observations all the way down to the Earth's surface, even in the presence of heavy clouds. With more than 2,000 channels sensing different regions of the atmosphere, the system creates a global, 3-D map of atmospheric temperature and humidity and provides information on clouds, greenhouse gases, and many other atmospheric phenomena. The AIRS Infrared Sounder Experiment flies onboard NASA's Aqua spacecraft and is managed by NASA's Jet Propulsion Laboratory, Pasadena, Calif., under contract to NASA. JPL is a division of the California Institute of Technology in Pasadena.

  5. Cochlear Implant Electrode Localization Using an Ultra-High Resolution Scan Mode on Conventional 64-Slice and New Generation 192-Slice Multi-Detector Computed Tomography.

    PubMed

    Carlson, Matthew L; Leng, Shuai; Diehn, Felix E; Witte, Robert J; Krecke, Karl N; Grimes, Josh; Koeller, Kelly K; Bruesewitz, Michael R; McCollough, Cynthia H; Lane, John I

    2017-08-01

    A new generation 192-slice multi-detector computed tomography (MDCT) clinical scanner provides enhanced image quality and superior electrode localization over conventional MDCT. Currently, accurate and reliable cochlear implant electrode localization using conventional MDCT scanners remains elusive. Eight fresh-frozen cadaveric temporal bones were implanted with full-length cochlear implant electrodes. Specimens were subsequently scanned with conventional 64-slice and new generation 192-slice MDCT scanners utilizing ultra-high resolution modes. Additionally, all specimens were scanned with micro-CT to provide a reference criterion for electrode position. Images were reconstructed according to routine temporal bone clinical protocols. Three neuroradiologists, blinded to scanner type, reviewed images independently to assess resolution of individual electrodes, scalar localization, and severity of image artifact. Serving as the reference standard, micro-CT identified scalar crossover in one specimen; imaging of all remaining cochleae demonstrated complete scala tympani insertions. The 192-slice MDCT scanner exhibited improved resolution of individual electrodes (p < 0.01), superior scalar localization (p < 0.01), and reduced blooming artifact (p < 0.05), compared with conventional 64-slice MDCT. There was no significant difference between platforms when comparing streak or ring artifact. The new generation 192-slice MDCT scanner offers several notable advantages for cochlear implant imaging compared with conventional MDCT. This technology provides important feedback regarding electrode position and course, which may help in future optimization of surgical technique and electrode design.

  6. The preliminary exploration of 64-slice volume computed tomography in the accurate measurement of pleural effusion.

    PubMed

    Guo, Zhi-Jun; Lin, Qiang; Liu, Hai-Tao; Lu, Jun-Ying; Zeng, Yan-Hong; Meng, Fan-Jie; Cao, Bin; Zi, Xue-Rong; Han, Shu-Ming; Zhang, Yu-Huan

    2013-09-01

    Using computed tomography (CT) to rapidly and accurately quantify pleural effusion volume benefits medical and scientific research. However, precise measurement of pleural effusion volume still involves many challenges, and there is currently no recognized accurate measuring method. The aims were to explore the feasibility of using 64-slice CT volume-rendering technology to accurately measure pleural fluid volume, and then to analyze the correlation between the volume of a free pleural effusion and its different diameters. The 64-slice CT volume-rendering technique was used for measurement and analysis in three parts. First, the fluid volume of a self-made thoracic model was measured and compared with the actual injected volume. Second, the pleural effusion volume was measured before and after pleural fluid drainage in 25 patients, and the volume reduction was compared with the actual volume of the liquid extracted. Finally, the free pleural effusion volume was measured in 26 patients to analyze the correlation between it and the diameters of the effusion, which was then used to calculate a regression equation. When the fluid volume of the self-made thoracic model measured by the 64-slice CT volume-rendering technique was compared with the actual injected volume, no significant difference was found (P = 0.836). For the 25 patients with drained pleural effusions, comparison of the volume reduction with the actual volume of the liquid extracted revealed no significant difference (P = 0.989). The following linear regression equation relates the pleural effusion volume (V), measured by the CT volume-rendering technique, to the greatest depth of the effusion (d): V = 158.16 × d - 116.01 (r = 0.91, P = 0.000). A second linear regression relates the volume to the product of the pleural effusion diameters (l × h × d): V = 0.56 × (l × h × d) + 39.44 (r = 0.92, P = 0.000).
The 64-slice CT volume-rendering technique can accurately measure the volume in pleural effusion patients, and a linear regression equation can be used to estimate the volume of the free pleural effusion.
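The two reported regression equations can be applied directly. The abstract does not state units; centimetres for the diameters and millilitres for the volume are assumed here, so treat the numbers as illustrative:

```python
def volume_from_depth(d):
    """Pleural effusion volume from its greatest depth d:
    V = 158.16 * d - 116.01 (r = 0.91), as reported in the abstract."""
    return 158.16 * d - 116.01

def volume_from_diameters(l, h, d):
    """Volume from the product of the three effusion diameters:
    V = 0.56 * (l * h * d) + 39.44 (r = 0.92), as reported in the abstract."""
    return 0.56 * (l * h * d) + 39.44

print(round(volume_from_depth(5.0), 2))                 # 674.79
print(round(volume_from_diameters(10.0, 8.0, 5.0), 2))  # 263.44
```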

  7. Using Word Clouds to Develop Proactive Learners

    ERIC Educational Resources Information Center

    Miley, Frances; Read, Andrew

    2011-01-01

    This article examines student responses to a technique for summarizing electronically available information based on word frequency. Students used this technique to create word clouds, using those word clouds to enhance personal and small group study. This is a qualitative study. Small focus groups were used to obtain student feedback. Feedback…

  8. Four dimensional observations of clouds from geosynchronous orbit using stereo display and measurement techniques on an interactive information processing system

    NASA Technical Reports Server (NTRS)

    Hasler, A. F.; Desjardins, M.; Shenk, W. E.

    1979-01-01

    Simultaneous Geosynchronous Operational Environmental Satellite (GOES) 1 km resolution visible image pairs can provide quantitative three dimensional measurements of clouds. These data have great potential for severe storms research and as a basic parameter measurement source for other areas of meteorology (e.g. climate). These stereo cloud height measurements are not subject to the errors and ambiguities caused by unknown cloud emissivity and temperature profiles that are associated with infrared techniques. This effort describes the display and measurement of stereo data using digital processing techniques.

  9. Neuronal network imaging in acute slices using Ca2+ sensitive bioluminescent reporter.

    PubMed

    Tricoire, Ludovic; Lambolez, Bertrand

    2014-01-01

    Genetically encoded indicators are valuable tools to study intracellular signaling cascades in real time using fluorescent or bioluminescent imaging techniques. Imaging of Ca(2+) indicators is widely used to record transient intracellular Ca(2+) increases associated with bioelectrical activity. The natural bioluminescent Ca(2+) sensor aequorin has been historically the first Ca(2+) indicator used to address biological questions. Aequorin imaging offers several advantages over fluorescent reporters: it is virtually devoid of background signal; it does not require light excitation and interferes little with intracellular processes. Genetically encoded sensors such as aequorin are commonly used in dissociated cultured cells; however it becomes more challenging to express them in differentiated intact specimen such as brain tissue. Here we describe a method to express a GFP-aequorin (GA) fusion protein in pyramidal cells of neocortical acute slices using recombinant Sindbis virus. This technique allows expressing GA in several hundreds of neurons on the same slice and to perform the bioluminescence recording of Ca(2+) transients in single neurons or multiple neurons simultaneously.

  10. Analysis and Evaluation of Processes and Equipment in Tasks 2 and 4 of the Low-cost Solar Array Project

    NASA Technical Reports Server (NTRS)

    Goldman, H.; Wolf, M.

    1978-01-01

    The significant economic data for the current production multiblade wafering and inner diameter slicing processes were tabulated and compared to data on the experimental and projected multiblade slurry, STC ID diamond coated blade, multiwire slurry and crystal systems fixed abrasive multiwire slicing methods. Cost calculations were performed for current production processes and for 1982 and 1986 projected wafering techniques.

  11. Enhancing Security by System-Level Virtualization in Cloud Computing Environments

    NASA Astrophysics Data System (ADS)

    Sun, Dawei; Chang, Guiran; Tan, Chunguang; Wang, Xingwei

    Many trends are opening up the era of cloud computing, which will reshape the IT industry. Virtualization techniques have become an indispensable ingredient of almost all cloud computing systems. Through virtual environments, a cloud provider is able to run the variety of operating systems needed by each cloud user. Virtualization can improve the reliability, security, and availability of applications through consolidation, isolation, and fault tolerance; in addition, workloads can be balanced using live migration techniques. In this paper, the definition of cloud computing is given, and the service and deployment models are introduced. Security issues and challenges in the implementation of cloud computing are then analyzed. Moreover, a system-level virtualization case is presented to enhance the security of cloud computing environments.

  12. Reconstruction of 3D Shapes of Opaque Cumulus Clouds from Airborne Multiangle Imaging: A Proof-of-Concept

    NASA Astrophysics Data System (ADS)

    Davis, A. B.; Bal, G.; Chen, J.

    2015-12-01

    Operational remote sensing of microphysical and optical cloud properties is invariably predicated on the assumption of plane-parallel slab geometry for the targeted cloud. The sole benefit of this often-questionable assumption about the cloud is that it leads to one-dimensional (1D) radiative transfer (RT)---a textbook, computationally tractable model. We present new results as evidence that, thanks to converging advances in 3D RT, inverse problem theory, algorithm implementation, and computer hardware, we are at the dawn of a new era in cloud remote sensing where we can finally go beyond the plane-parallel paradigm. Granted, the plane-parallel/1D RT assumption is reasonable for spatially extended stratiform cloud layers, as well as for smoothly distributed background aerosol layers. However, these 1D RT-friendly scenarios exclude cases that are critically important for climate physics. 1D RT---whence operational cloud remote sensing---fails catastrophically for cumuliform clouds that have fully 3D outer shapes and internal structures driven by shallow or deep convection. For these situations, the first order of business in a robust characterization by remote sensing is to abandon the slab-geometry framework and determine the 3D geometry of the cloud, as a first step toward bona fide 3D cloud tomography. With this specific goal in mind, we deliver a proof-of-concept for an entirely new kind of remote sensing applicable to 3D clouds. It is based on highly simplified 3D RT and exploits multi-angular suites of cloud images at high spatial resolution. Airborne sensors like AirMSPI readily acquire such data. The key element of the reconstruction algorithm is a sophisticated solution of the nonlinear inverse problem via linearization of the forward model and an iteration scheme supported, where necessary, by adaptive regularization.
Currently, the demo uses a 2D setting to show how either vertical profiles or horizontal slices of the cloud can be accurately reconstructed. Extension to 3D volumes is straightforward but the next challenge is to accommodate images at lower spatial resolution, e.g., from MISR/Terra. G. Bal, J. Chen, and A.B. Davis (2015). Reconstruction of cloud geometry from multi-angle images, Inverse Problems in Imaging (submitted).

  13. Biogenic Aerosols – Effects on Climate and Clouds. Cloud Optical Depth (COD) Sensor Three-Waveband Spectrally-Agile Technique (TWST) Field Campaign Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Niple, E. R.; Scott, H. E.

    2016-04-01

    This report describes the data collected by the Three-Waveband Spectrally-agile Technique (TWST) sensor deployed at Hyytiälä, Finland from 16 July to 31 August 2014 as a guest on the Biogenic Aerosols Effects on Climate and Clouds (BAECC) campaign. These data are currently available from the Atmospheric Radiation Measurement (ARM) Data Archive website and consist of Cloud Optical Depth (COD) measurements for the clouds directly overhead, taken approximately every second (with some dropouts, described below) during the daylight periods. A good range of cloud conditions was observed, from clear sky to heavy rainfall.

  14. Transmission of 2.5 Gbit/s Spectrum-sliced WDM System for 50 km Single-mode Fiber

    NASA Astrophysics Data System (ADS)

    Ahmed, Nasim; Aljunid, Sayed Alwee; Ahmad, R. Badlisha; Fadil, Hilal Adnan; Rashid, Mohd Abdur

    2011-06-01

    The transmission of a spectrum-sliced WDM channel at 2.5 Gbit/s over 50 km of single-mode fiber, using a channel spacing of only 0.4 nm, is reported. We investigated the system performance using the NRZ modulation format, and the proposed system is compared with a conventional system. The system performance is characterized by the received bit-error rate (BER) as a function of the system bit rate. Simulation results show that the NRZ modulation format performs well at a 2.5 Gbit/s bit rate. Using this narrow-channel spectrum-slicing technique, the total number of multiplexed channels in a WDM system can be increased greatly. Therefore, the 0.4 nm channel spacing spectrum-sliced WDM system is highly recommended for long-distance optical access networks such as the Metro Area Network (MAN), Fiber-to-the-Building (FTTB) and Fiber-to-the-Home (FTTH).
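As a back-of-the-envelope illustration of why narrow slicing "greatly increases" the channel count: dividing an amplifier band by the slice spacing gives the number of channels that fit. The 35 nm band width below (roughly the erbium C-band) and the 0.8 nm comparison spacing (the common 100 GHz grid near 1550 nm) are assumptions for illustration, not figures from the abstract:

```python
def channel_count(band_width_nm, spacing_nm):
    """Number of spectrum-sliced channels that fit in an optical band."""
    return int(band_width_nm / spacing_nm)

print(channel_count(35.0, 0.4))  # 87 channels at the reported 0.4 nm spacing
print(channel_count(35.0, 0.8))  # 43 channels at an assumed 0.8 nm spacing
```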

  15. ASSURED CLOUD COMPUTING UNIVERSITY CENTER OF EXCELLENCE (ACC UCOE)

    DTIC Science & Technology

    2018-01-18

    Research thrusts include infrastructure security; design of algorithms and techniques for real-time assuredness in cloud computing; and map-reduce task assignment with data locality.

  16. Evaluation of Satellite-Based Upper Troposphere Cloud Top Height Retrievals in Multilayer Cloud Conditions During TC4

    NASA Technical Reports Server (NTRS)

    Chang, Fu-Lung; Minnis, Patrick; Ayers, J. Kirk; McGill, Matthew J.; Palikonda, Rabindra; Spangenberg, Douglas A.; Smith, William L., Jr.; Yost, Christopher R.

    2010-01-01

    Upper troposphere cloud top heights (CTHs), restricted to cloud top pressures (CTPs) less than 500 hPa, inferred using four satellite retrieval methods applied to Twelfth Geostationary Operational Environmental Satellite (GOES-12) data are evaluated using measurements during the July–August 2007 Tropical Composition, Cloud and Climate Coupling Experiment (TC4). The four methods are the single-layer CO2-absorption technique (SCO2AT), a modified CO2-absorption technique (MCO2AT) developed for improving both single-layered and multilayered cloud retrievals, a standard version of the Visible Infrared Solar-infrared Split-window Technique (old VISST), and a new version of VISST (new VISST) recently developed to improve cloud property retrievals. They are evaluated by comparing with ER-2 aircraft-based Cloud Physics Lidar (CPL) data taken during 9 days having extensive upper troposphere cirrus, anvil, and convective clouds. Compared to the 89% coverage by upper tropospheric clouds detected by the CPL, the SCO2AT, MCO2AT, old VISST, and new VISST retrieved CTPs less than 500 hPa in 76, 76, 69, and 74% of the matched pixels, respectively. Most of the differences are due to subvisible and optically thin cirrus clouds occurring near the tropopause that were detected only by the CPL. The mean upper tropospheric CTHs for the 9 days are 14.2 (+/- 2.1) km from the CPL and 10.7 (+/- 2.1), 12.1 (+/- 1.6), 9.7 (+/- 2.9), and 11.4 (+/- 2.8) km from the SCO2AT, MCO2AT, old VISST, and new VISST, respectively. Compared to the CPL, the MCO2AT CTHs had the smallest mean biases for semitransparent high clouds in both single-layered and multilayered situations, whereas the new VISST CTHs had the smallest mean biases when upper clouds were opaque and optically thick. The biases for all techniques increased with increasing numbers of cloud layers. The transparency of the upper layer clouds tends to increase with the number of cloud layers.

  17. Remote Sensing of Multiple Cloud Layer Heights Using Multi-Angular Measurements

    NASA Technical Reports Server (NTRS)

    Sinclair, Kenneth; Van Diedenhoven, Bastiaan; Cairns, Brian; Yorks, John; Wasilewski, Andrzej; Mcgill, Matthew

    2017-01-01

    Cloud top height (CTH) affects the radiative properties of clouds. Improved CTH observations will allow for improved parameterizations in large-scale models, and accurate information on CTH is also important when studying variations in freezing point and cloud microphysics. NASA's airborne Research Scanning Polarimeter (RSP) is able to measure cloud top height using a novel multi-angular contrast approach. For the determination of CTH, a set of consecutive nadir reflectances is selected and the cross-correlations between this set and co-located sets at other viewing angles are calculated for a range of assumed cloud top heights, yielding a correlation profile. Under the assumption that cloud reflectances are isotropic, local peaks in the correlation profile indicate cloud layers. This technique can be applied to every RSP footprint, and we demonstrate that detection of multiple peaks in the correlation profile allows retrieval of the heights of multiple cloud layers within single RSP footprints. This paper provides an in-depth description of the architecture and performance of the RSP's CTH retrieval technique using data obtained during the Studies of Emissions and Atmospheric Composition, Clouds and Climate Coupling by Regional Surveys (SEAC4RS) campaign. RSP-retrieved cloud heights are evaluated using collocated data from the Cloud Physics Lidar (CPL). The dependence of the method's accuracy on the magnitude of correlation, optical thickness, cloud thickness and cloud height is explored. The technique is applied to measurements at wavelengths of 670 nm and 1880 nm and to their combination. The 1880-nm band is virtually insensitive to the lower troposphere due to strong water vapor absorption.
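The correlation-profile idea can be sketched in one dimension: for each assumed cloud height, shift the off-nadir reflectance trace by the parallax that height would produce and correlate it with the nadir trace; the true layer height maximizes the correlation. The viewing geometry, pixel size, and synthetic reflectances below are illustrative assumptions, not RSP specifics:

```python
import math

def pearson(xs, ys):
    """Pearson correlation of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / math.sqrt(vx * vy) if vx > 0 and vy > 0 else 0.0

def correlation_profile(nadir, off_nadir, view_angle_deg, pixel_km, heights_km):
    """For each assumed cloud height h, shift the off-nadir trace by the
    parallax h*tan(theta) (in pixels) and correlate it with the nadir trace."""
    tan_t = math.tan(math.radians(view_angle_deg))
    profile = []
    for h in heights_km:
        shift = round(h * tan_t / pixel_km)
        idx = [i for i in range(len(nadir)) if 0 <= i + shift < len(off_nadir)]
        profile.append(pearson([nadir[i] for i in idx],
                               [off_nadir[i + shift] for i in idx]))
    return profile

# Synthetic single layer at 8 km viewed at 45 degrees with 1-km pixels:
nadir = [0, 1, 0, 2, 0, 3, 1, 0, 4, 0, 1, 2, 0, 3, 0, 5, 0, 1, 0, 2]
off_nadir = [0] * 8 + nadir            # 8-pixel parallax displacement
profile = correlation_profile(nadir, off_nadir, 45.0, 1.0, list(range(13)))
best = max(range(13), key=lambda i: profile[i])
print(best)  # retrieved height: 8 km
```

Multiple cloud layers would appear as multiple local peaks in the same profile, which is how the abstract describes retrieving several heights per footprint.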

  18. Regulation of Axolotl (Ambystoma mexicanum) Limb Blastema Cell Proliferation by Nerves and BMP2 in Organotypic Slice Culture.

    PubMed

    Lehrberg, Jeffrey; Gardiner, David M

    2015-01-01

    We have modified and optimized the technique of organotypic slice culture in order to study the mechanisms regulating growth and pattern formation in regenerating axolotl limb blastemas. Blastema cells maintain many of the behaviors that are characteristic of blastemas in vivo when cultured as slices in vitro, including rates of proliferation that are comparable to what has been reported in vivo. Because the blastema slices can be cultured in basal medium without fetal bovine serum, it was possible to test the response of blastema cells to signaling molecules present in serum, as well as those produced by nerves. We also were able to investigate the response of blastema cells to experimentally regulated changes in BMP signaling. Blastema cells responded to all of these signals by increasing the rate of proliferation and the level of expression of the blastema marker gene, Prrx-1. The organotypic slice culture model provides the opportunity to identify and characterize the spatial and temporal co-regulation of pathways in order to induce and enhance a regenerative response.

  19. Regulation of Axolotl (Ambystoma mexicanum) Limb Blastema Cell Proliferation by Nerves and BMP2 in Organotypic Slice Culture

    PubMed Central

    Lehrberg, Jeffrey; Gardiner, David M.

    2015-01-01

    We have modified and optimized the technique of organotypic slice culture in order to study the mechanisms regulating growth and pattern formation in regenerating axolotl limb blastemas. Blastema cells maintain many of the behaviors that are characteristic of blastemas in vivo when cultured as slices in vitro, including rates of proliferation that are comparable to what has been reported in vivo. Because the blastema slices can be cultured in basal medium without fetal bovine serum, it was possible to test the response of blastema cells to signaling molecules present in serum, as well as those produced by nerves. We also were able to investigate the response of blastema cells to experimentally regulated changes in BMP signaling. Blastema cells responded to all of these signals by increasing the rate of proliferation and the level of expression of the blastema marker gene, Prrx-1. The organotypic slice culture model provides the opportunity to identify and characterize the spatial and temporal co-regulation of pathways in order to induce and enhance a regenerative response. PMID:25923915

  20. Mass detection in digital breast tomosynthesis data using convolutional neural networks and multiple instance learning.

    PubMed

    Yousefi, Mina; Krzyżak, Adam; Suen, Ching Y

    2018-05-01

    Digital breast tomosynthesis (DBT) was developed in the field of breast cancer screening as a new tomographic technique to minimize the limitations of conventional digital mammography screening methods. A computer-aided detection (CAD) framework for mass detection in DBT has been developed and is described in this paper. The proposed framework operates on a set of two-dimensional (2D) slices. With plane-to-plane analysis of corresponding 2D slices from each DBT, it automatically learns complex patterns of 2D slices through a deep convolutional neural network (DCNN). It then applies multiple instance learning (MIL) with a randomized trees approach to classify DBT images based on information extracted from the 2D slices. This CAD framework was developed and evaluated using 5040 2D image slices derived from 87 DBT volumes. The empirical results demonstrate that the proposed CAD framework achieves much better performance than CAD systems that use hand-crafted features and deep cardinality-restricted Boltzmann machines to detect masses in DBTs.
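The bag-level decision in multiple instance learning can be illustrated with a minimal sketch: each DBT volume is a bag of 2D slices, and under the standard MIL assumption the volume is called positive if any slice scores positive. The scores and threshold below are hypothetical stand-ins for the paper's DCNN outputs and randomized-trees classifier:

```python
def classify_volume(slice_scores, threshold=0.5):
    """Label a DBT volume (a bag of 2D slices) positive for a mass if at
    least one slice's score reaches the threshold (standard MIL assumption)."""
    return max(slice_scores) >= threshold

print(classify_volume([0.10, 0.05, 0.82, 0.30]))  # True: one suspicious slice
print(classify_volume([0.10, 0.20, 0.15]))        # False: no slice qualifies
```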

  1. Image Quality of 3rd Generation Spiral Cranial Dual-Source CT in Combination with an Advanced Model Iterative Reconstruction Technique: A Prospective Intra-Individual Comparison Study to Standard Sequential Cranial CT Using Identical Radiation Dose

    PubMed Central

    Wenz, Holger; Maros, Máté E.; Meyer, Mathias; Förster, Alex; Haubenreisser, Holger; Kurth, Stefan; Schoenberg, Stefan O.; Flohr, Thomas; Leidecker, Christianne; Groden, Christoph; Scharf, Johann; Henzler, Thomas

    2015-01-01

    Objectives To prospectively intra-individually compare image quality of a 3rd generation Dual-Source-CT (DSCT) spiral cranial CT (cCT) to a sequential 4-slice Multi-Slice-CT (MSCT) while maintaining identical intra-individual radiation dose levels. Methods 35 patients, who had a non-contrast enhanced sequential cCT examination on a 4-slice MDCT within the past 12 months, underwent a spiral cCT scan on a 3rd generation DSCT. CTDIvol identical to the initial 4-slice MDCT was applied. Data were reconstructed using filtered back projection (FBP) and a 3rd-generation iterative reconstruction (IR) algorithm at 5 different IR strength levels. Two neuroradiologists independently evaluated subjective image quality using a 4-point Likert-scale, and objective image quality was assessed in white matter and nucleus caudatus with signal-to-noise ratios (SNR) being subsequently calculated. Results Subjective image quality of all spiral cCT datasets was rated significantly higher compared to the 4-slice MDCT sequential acquisitions (p<0.05). Mean SNR was significantly higher in all spiral compared to sequential cCT datasets, with a mean SNR improvement of 61.65% (Bonferroni-corrected p < 0.0024). Subjective image quality improved with increasing IR levels. Conclusion Combination of 3rd-generation DSCT spiral cCT with an advanced model IR technique significantly improves subjective and objective image quality compared to a standard sequential cCT acquisition acquired at identical dose levels. PMID:26288186

  2. Image Quality of 3rd Generation Spiral Cranial Dual-Source CT in Combination with an Advanced Model Iterative Reconstruction Technique: A Prospective Intra-Individual Comparison Study to Standard Sequential Cranial CT Using Identical Radiation Dose.

    PubMed

    Wenz, Holger; Maros, Máté E; Meyer, Mathias; Förster, Alex; Haubenreisser, Holger; Kurth, Stefan; Schoenberg, Stefan O; Flohr, Thomas; Leidecker, Christianne; Groden, Christoph; Scharf, Johann; Henzler, Thomas

    2015-01-01

    To prospectively intra-individually compare image quality of a 3rd generation Dual-Source-CT (DSCT) spiral cranial CT (cCT) to a sequential 4-slice Multi-Slice-CT (MSCT) while maintaining identical intra-individual radiation dose levels. 35 patients, who had a non-contrast enhanced sequential cCT examination on a 4-slice MDCT within the past 12 months, underwent a spiral cCT scan on a 3rd generation DSCT. CTDIvol identical to the initial 4-slice MDCT was applied. Data were reconstructed using filtered back projection (FBP) and a 3rd-generation iterative reconstruction (IR) algorithm at 5 different IR strength levels. Two neuroradiologists independently evaluated subjective image quality using a 4-point Likert-scale, and objective image quality was assessed in white matter and nucleus caudatus with signal-to-noise ratios (SNR) being subsequently calculated. Subjective image quality of all spiral cCT datasets was rated significantly higher compared to the 4-slice MDCT sequential acquisitions (p<0.05). Mean SNR was significantly higher in all spiral compared to sequential cCT datasets, with a mean SNR improvement of 61.65% (Bonferroni-corrected p < 0.0024). Subjective image quality improved with increasing IR levels. Combination of 3rd-generation DSCT spiral cCT with an advanced model IR technique significantly improves subjective and objective image quality compared to a standard sequential cCT acquisition acquired at identical dose levels.

  3. Silicon Ingot Casting: Heat Exchanger Method. Multi-wire Slicing: Fixed Abrasive Slicing Technique, Phase 3

    NASA Technical Reports Server (NTRS)

    Schmid, F.; Khattak, C. P.

    1979-01-01

    Ingot casting was scaled up to 16 cm by 16 cm square cross section size and ingots weighing up to 8.1 kg were cast. The high degree of crystallinity was maintained in the large ingot. For large sizes, the nonuniformity of heat treatment causes chipping of the surface of the ingot. Progress was made in the development of a uniform graded structure in the silica crucibles. The high speed slicer blade-head weight was reduced to 37 pounds, allowing surface speeds of up to 500 feet per minute. Slicing of 10 cm diameter workpieces at these speeds increased the through-put of the machine to 0.145 mm/min.

  4. Volunteered Cloud Computing for Disaster Management

    NASA Astrophysics Data System (ADS)

    Evans, J. D.; Hao, W.; Chettri, S. R.

    2014-12-01

    Disaster management relies increasingly on interpreting earth observations and running numerical models, which require significant computing capacity, usually on short notice and at irregular intervals. Peak computing demand during event detection, hazard assessment, or incident response may exceed agency budgets; however, some of it can be met through volunteered computing, which distributes subtasks to participating computers via the Internet. This approach has enabled large projects in mathematics, basic science, and climate research to harness the slack computing capacity of thousands of desktop computers. This capacity is likely to diminish as desktops give way to battery-powered mobile devices (laptops, smartphones, tablets) in the consumer market; but as cloud computing becomes commonplace, it may offer significant slack capacity -- if its users are given an easy, trustworthy mechanism for participating. Such a "volunteered cloud computing" mechanism would also offer several advantages over traditional volunteered computing: tasks distributed within a cloud have fewer bandwidth limitations; granular billing mechanisms allow small slices of "interstitial" computing at no marginal cost; and virtual storage volumes allow in-depth, reversible machine reconfiguration. Volunteered cloud computing is especially suitable for "embarrassingly parallel" tasks, including ones requiring large data volumes: examples in disaster management include near-real-time image interpretation, pattern / trend detection, or large model ensembles. In the context of a major disaster, we estimate that cloud users (if suitably informed) might volunteer hundreds to thousands of CPU cores across a large provider such as Amazon Web Services. To explore this potential, we are building a volunteered cloud computing platform and targeting it to a disaster management context.
Using a lightweight, fault-tolerant network protocol, this platform helps cloud users join parallel computing projects; automates reconfiguration of their virtual machines; ensures accountability for donated computing; and optimizes the use of "interstitial" computing. Initial applications include fire detection from multispectral satellite imagery and flood risk mapping through hydrological simulations.
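The subtask-distribution idea behind such a platform can be sketched in a few lines. The round-robin scheme, the `redundancy` parameter (duplicating each subtask so results can be cross-checked for accountability), and the worker names below are illustrative assumptions, not the platform's actual protocol.

```python
import itertools

def assign_subtasks(tiles, workers, redundancy=2):
    """Assign independent subtasks (e.g., satellite image tiles) to
    volunteered workers round-robin, duplicating each subtask
    `redundancy` times so results can be cross-checked.
    Returns a dict {worker: [tile, ...]}."""
    assignment = {w: [] for w in workers}
    cycle = itertools.cycle(workers)
    for tile in tiles:
        for _ in range(redundancy):
            # consecutive cycle steps guarantee distinct workers
            # as long as redundancy <= len(workers)
            assignment[next(cycle)].append(tile)
    return assignment
```

With six tiles, three workers, and twofold redundancy, every tile lands on two distinct workers, so disagreeing results flag an unreliable volunteer.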

  5. Visualising the procedures in the influence of water on the ablation of dental hard tissue with erbium:yttrium-aluminium-garnet and erbium, chromium:yttrium-scandium-gallium-garnet laser pulses.

    PubMed

    Mir, Maziar; Gutknecht, Norbert; Poprawe, Reinhart; Vanweersch, Leon; Lampert, Friedrich

    2009-05-01

    The exact mechanism of the ablation of tooth hard tissue with the most common wavelengths, 2,940 nm and 2,780 nm, is not yet clear. There are several different theories, but none of them has yet been established. Concepts and methods of looking at these mechanisms have been based on heat formation and transformation, and on mathematical calculations evaluating the outcome of ablation, such as looking at the shape of cuts. This study provides a new concept: the monitoring of the direct interactions between laser light, water and enamel with a high-speed camera. For this purpose, both of the above-mentioned wavelengths were examined. Bovine anterior teeth were prepared as thin slices. Each imaged slice had a thickness close to that of the beam diameter so that the ablation effect could be shown in two-dimensional pictures. The single images were extracted from the video clips and then animated. The following steps, explaining the ablation procedure during each pulse, were seen and reported: (1) low output energy intensity in the first pulses that did not lead to an ablative effect; (2) bubble formation with higher output energy density; (3) during the pulse the tooth surface was covered with the plume of vapour (comparable with a cloud), and the margins of ablation on the tooth were not clear; (4) when the vapour bubble (cloud) was collapsing, an additional ablative process at the surface could be seen.

  6. Visual patch clamp recording of neurons in thick portions of the adult spinal cord.

    PubMed

    Munch, Anders Sonne; Smith, Morten; Moldovan, Mihai; Perrier, Jean-François

    2010-07-15

    The study of visually identified neurons in slice preparations from the central nervous system offers considerable advantages over in vivo preparations, including high mechanical stability in the absence of anaesthesia and full control of the extracellular medium. However, because of their relative thinness, slices are not appropriate for investigating how individual neurons integrate synaptic inputs generated by large numbers of neurons. Here we took advantage of the exceptional resistance of the turtle to anoxia to make slices of increasing thicknesses (from 300 to 3000 μm) from the lumbar enlargement of the spinal cord. With a conventional upright microscope in which the light condenser was carefully adjusted, we could visualize neurons present at the surface of the slice and record them with the whole-cell patch clamp technique. We show that neurons present in the middle of the preparation remain alive and capable of generating action potentials. By stimulating the lateral funiculus we can evoke intense synaptic activity associated with large increases in conductance of the recorded neurons. The conductance increases substantially more in neurons recorded in thick slices, suggesting that the size of the network recruited by the stimulation increases with the thickness of the slices. We also find that the number of spontaneous excitatory postsynaptic currents (EPSCs) is higher in thick slices compared with thin slices, while the number of spontaneous inhibitory postsynaptic currents (IPSCs) remains constant. These preliminary data suggest that inhibitory and excitatory synaptic connections are balanced locally while excitation dominates long-range connections in the spinal cord. Copyright 2010 Elsevier B.V. All rights reserved.

  7. Cumulus cloud base height estimation from high spatial resolution Landsat data - A Hough transform approach

    NASA Technical Reports Server (NTRS)

    Berendes, Todd; Sengupta, Sailes K.; Welch, Ron M.; Wielicki, Bruce A.; Navar, Murgesh

    1992-01-01

    A semiautomated methodology is developed for estimating cumulus cloud base heights on the basis of high spatial resolution Landsat MSS data, using various image-processing techniques to match cloud edges with their corresponding shadow edges. The cloud base height is then estimated by computing the separation distance between the corresponding generalized Hough transform reference points. The differences between the cloud base heights computed by these means and a manual verification technique are of the order of 100 m or less; accuracies of 50-70 m may soon be possible via EOS instruments.
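Once cloud edges are matched to their shadow edges, the height estimate itself is one line of trigonometry: the base height equals the horizontal cloud-to-shadow separation (e.g., between matched Hough-transform reference points) times the tangent of the solar elevation. The sketch below assumes near-nadir viewing; the function name and interface are ours, not the paper's.

```python
import math

def cloud_base_height(shadow_offset_m, solar_elevation_deg):
    """Estimate cloud base height (m) from the horizontal separation
    between a cloud and its shadow, assuming near-nadir viewing:
    h = d * tan(solar elevation)."""
    return shadow_offset_m * math.tan(math.radians(solar_elevation_deg))
```

At a 45-degree solar elevation, a 1 km cloud-to-shadow offset implies a base height of about 1 km; the ~100 m accuracy quoted above thus corresponds to locating the matched edges to within roughly one Landsat MSS pixel.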

  8. Cloud cover typing from environmental satellite imagery. Discriminating cloud structure with Fast Fourier Transforms (FFT)

    NASA Technical Reports Server (NTRS)

    Logan, T. L.; Huning, J. R.; Glackin, D. L.

    1983-01-01

    The use of two-dimensional Fast Fourier Transforms (FFTs) subjected to pattern recognition technology for the identification and classification of low altitude stratus cloud structure from Geostationary Operational Environmental Satellite (GOES) imagery was examined. The development of a scene-independent pattern recognition methodology, unconstrained by conventional cloud morphological classifications, was emphasized. A technique for extracting cloud shape, direction, and size attributes from GOES visual imagery was developed. These attributes were combined with two statistical attributes (cloud mean brightness, cloud standard deviation) and interrogated using unsupervised clustering and maximum likelihood classification techniques. Results indicate that: (1) the key cloud discrimination attributes are mean brightness, direction, shape, and minimum size; (2) cloud structure can be differentiated at given pixel scales; (3) cloud type may be identifiable at coarser scales; (4) there are positive indications of scene independence which would permit development of a cloud signature bank; (5) edge enhancement of GOES imagery does not appreciably improve cloud classification over the use of raw data; and (6) the GOES imagery must be apodized before generation of FFTs.
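The attribute-extraction step above can be sketched in NumPy. The attribute set below (mean brightness, standard deviation, and the dominant orientation of the 2D power spectrum) is an illustrative subset standing in for the GOES attributes, not the paper's exact definitions, and no apodization window is applied here.

```python
import numpy as np

def fft_texture_attributes(patch):
    """Extract simple texture attributes from an image patch:
    mean brightness, standard deviation, and the orientation (deg)
    of the strongest non-DC component of the 2D power spectrum."""
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(patch))) ** 2
    cy, cx = np.array(spectrum.shape) // 2
    spectrum[cy, cx] = 0.0  # suppress the DC term (mean brightness)
    ky, kx = np.unravel_index(np.argmax(spectrum), spectrum.shape)
    direction = np.degrees(np.arctan2(ky - cy, kx - cx))
    return float(patch.mean()), float(patch.std()), float(direction)
```

For a patch of vertical stripes (brightness varying along x), the spectral peak lies on the horizontal frequency axis, so the returned direction is 0 or 180 degrees; attributes like these would then feed the clustering and maximum likelihood steps described above.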

  9. Robust Transmission of H.264/AVC Streams Using Adaptive Group Slicing and Unequal Error Protection

    NASA Astrophysics Data System (ADS)

    Thomos, Nikolaos; Argyropoulos, Savvas; Boulgouris, Nikolaos V.; Strintzis, Michael G.

    2006-12-01

    We present a novel scheme for the transmission of H.264/AVC video streams over lossy packet networks. The proposed scheme exploits the error-resilient features of H.264/AVC codec and employs Reed-Solomon codes to protect effectively the streams. A novel technique for adaptive classification of macroblocks into three slice groups is also proposed. The optimal classification of macroblocks and the optimal channel rate allocation are achieved by iterating two interdependent steps. Dynamic programming techniques are used for the channel rate allocation process in order to reduce complexity. Simulations clearly demonstrate the superiority of the proposed method over other recent algorithms for transmission of H.264/AVC streams.

  10. Localized Spatio-Temporal Constraints for Accelerated CMR Perfusion

    PubMed Central

    Akçakaya, Mehmet; Basha, Tamer A.; Pflugi, Silvio; Foppa, Murilo; Kissinger, Kraig V.; Hauser, Thomas H.; Nezafat, Reza

    2013-01-01

    Purpose To develop and evaluate an image reconstruction technique for cardiac MRI (CMR) perfusion that utilizes localized spatio-temporal constraints. Methods CMR perfusion plays an important role in detecting myocardial ischemia in patients with coronary artery disease. Breath-hold k-t based image acceleration techniques are typically used in CMR perfusion for superior spatial/temporal resolution and improved coverage. In this study, we propose a novel compressed sensing based image reconstruction technique for CMR perfusion, with applicability to free-breathing examinations. This technique uses local spatio-temporal constraints by regularizing image patches across a small number of dynamics. The technique is compared to conventional dynamic-by-dynamic reconstruction, and sparsity regularization using a temporal principal-component (pc) basis, as well as zerofilled data in multi-slice 2D and 3D CMR perfusion. Qualitative image scores (1=poor, 4=excellent) are used to evaluate the technique in 3D perfusion in 10 patients and 5 healthy subjects. In 4 healthy subjects, the proposed technique was also compared to a breath-hold multi-slice 2D acquisition with parallel imaging in terms of signal intensity curves. Results The proposed technique results in images that are superior in terms of spatial and temporal blurring compared to the other techniques, even in free-breathing datasets. The image scores indicate a significant improvement compared to other techniques in 3D perfusion (2.8±0.5 vs. 2.3±0.5 for x-pc regularization, 1.7±0.5 for dynamic-by-dynamic, 1.1±0.2 for zerofilled). Signal intensity curves indicate similar dynamics of uptake between the proposed method with a 3D acquisition and the breath-hold multi-slice 2D acquisition with parallel imaging. 
Conclusion The proposed reconstruction utilizes sparsity regularization based on localized information in both spatial and temporal domains for highly-accelerated CMR perfusion with potential utility in free-breathing 3D acquisitions. PMID:24123058

  11. Automated Quantification of Myocardial Salvage in a Rat Model of Ischemia–Reperfusion Injury Using 3D High‐Resolution Magnetic Resonance Imaging (MRI)

    PubMed Central

    Grieve, Stuart M.; Mazhar, Jawad; Callaghan, Fraser; Kok, Cindy Y.; Tandy, Sarah; Bhindi, Ravinay; Figtree, Gemma A.

    2014-01-01

    Background Quantification of myocardial “area at risk” (AAR) and myocardial infarction (MI) zone is critical for assessing novel therapies targeting myocardial ischemia–reperfusion (IR) injury. Current “gold‐standard” methods perfuse the heart with Evan's Blue and stain with triphenyl tetrazolium chloride (TTC), requiring manual slicing and analysis. We aimed to develop and validate a high‐resolution 3‐dimensional (3D) magnetic resonance imaging (MRI) method for quantifying MI and AAR. Methods and Results Forty‐eight hours after IR was induced, rats were anesthetized and gadopentetate dimeglumine was administered intravenously. After 10 minutes, the coronary artery was re‐ligated and a solution containing iron oxide microparticles and Evan's Blue was infused (for comparison). Hearts were harvested and transversally sectioned for TTC staining. Ex vivo MR images of slices were acquired on a 9.4‐T magnet. T2* data allowed visualization of AAR, with microparticle‐associated signal loss in perfused regions. T1 data demonstrated gadolinium retention in infarcted zones. Close correlation (r=0.92 to 0.94; P<0.05) of MRI and Evan's Blue/TTC measures for both AAR and MI was observed when the combined techniques were applied to the same heart slice. However, 3D MRI acquisition and analysis of whole heart reduced intra‐observer variability compared to assessment of isolated slices, and allowed automated segmentation and analysis, thus reducing interobserver variation. Anatomical resolution of 81 μm3 was achieved (versus ≈2 mm with manual slicing). Conclusions This novel, yet simple, MRI technique allows precise assessment of infarct and AAR zones. It removes the need for tissue slicing and provides opportunity for 3D digital analysis at high anatomical resolution in a streamlined manner accessible for all laboratories already performing IR experiments. PMID:25146703

  12. A comparison of several techniques to assign heights to cloud tracers

    NASA Technical Reports Server (NTRS)

    Nieman, Steven J.; Schmetz, Johannes; Menzel, W. P.

    1993-01-01

    Experimental results are presented which suggest that the water-vapor technique of radiance measurement is a viable alternative to the CO2 technique for inferring the height of semitransparent cloud elements. Wind-field determinations with the next operational geostationary satellite will rely on H2O-derived cloud-height assignments. On a given day, the heights from the H2O and CO2 approaches compare to within 60-110 hPa rms.
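Both the CO2 and H2O height assignments rest on the same radiance-ratioing ("slicing") principle: the observed ratio of clear-minus-cloudy radiance differences in two absorption channels is matched against the same ratio computed from a forward model at each pressure level. The sketch below assumes the forward-model ratios are precomputed; all names and inputs are hypothetical.

```python
def cloud_top_pressure(obs_ratio, pressure_levels, model_ratios):
    """Radiance-ratio (slicing) height assignment, schematically:
    return the pressure level whose forward-model ratio of
    clear-minus-cloudy radiances best matches the observed ratio.
    `pressure_levels` (hPa) and `model_ratios` are parallel lists."""
    best = min(zip(pressure_levels, model_ratios),
               key=lambda pr: abs(pr[1] - obs_ratio))
    return best[0]
```

Because only a ratio is matched, the retrieval is insensitive to the (unknown) cloud emissivity as long as it is the same in both channels, which is what makes the method usable for semitransparent cloud.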

  13. Cloud properties inferred from 8-12 micron data

    NASA Technical Reports Server (NTRS)

    Strabala, Kathleen I.; Ackerman, Steven A.; Menzel, W. Paul

    1994-01-01

    A trispectral combination of observations at 8-, 11-, and 12-micron bands is suggested for detecting cloud and cloud properties in the infrared. Atmospheric ice and water vapor absorption peak in opposite halves of the window region so that positive 8-minus-11-micron brightness temperature differences indicate cloud, while near-zero or negative differences indicate clear regions. The absorption coefficient for water increases more between 11 and 12 microns than between 8 and 11 microns, while for ice, the reverse is true. Cloud phase is determined by a scatter diagram of 8-minus-11-micron versus 11-minus-12-micron brightness temperature differences; ice cloud shows a slope greater than 1 and water cloud less than 1. The trispectral brightness temperature method was tested on high-resolution interferometer data, resulting in clear-cloud and cloud-phase delineation. Simulations using differing 8-micron bandwidths revealed no significant degradation of cloud property detection. Thus, the 8-micron bandwidth for future satellites can be selected based on the requirements of other applications, such as surface characterization studies. Application of the technique to current polar-orbiting High-Resolution Infrared Sounder (HIRS)-Advanced Very High Resolution Radiometer (AVHRR) datasets is constrained by the nonuniformity of the cloud scenes sensed within the large HIRS field of view. Analysis of MAS (MODIS Airborne Simulator) high-spatial-resolution (500 m) data with all three 8-, 11-, and 12-micron bands revealed sharp delineation of differing cloud and background scenes, from which a simple automated threshold technique was developed. Cloud phase, clear sky, and qualitative differences in cloud emissivity and cloud height were identified on a case study segment from 24 November 1991, consistent with the scene. More rigorous techniques would allow further cloud parameter clarification. 
The opportunities for global cloud delineation with the Moderate-Resolution Imaging Spectrometer (MODIS) appear excellent. The spectral selection, the spatial resolution, and the global coverage are all well suited for significant advances.
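The trispectral decision logic follows directly from the brightness temperature differences described above. In this sketch the numeric thresholds (`cloud_thresh`, `phase_slope`) are illustrative placeholders, not the operational values derived in the paper.

```python
def classify_pixel(bt8, bt11, bt12, cloud_thresh=0.5, phase_slope=1.0):
    """Trispectral classification sketch: a positive 8-minus-11 um
    brightness temperature difference flags cloud; the slope of the
    8-11 vs. 11-12 um differences separates ice (> 1) from water (< 1).
    Brightness temperatures are in kelvin; thresholds are illustrative."""
    d_8_11 = bt8 - bt11
    d_11_12 = bt11 - bt12
    if d_8_11 <= cloud_thresh:
        return "clear"          # near-zero or negative difference
    if d_11_12 != 0 and d_8_11 / d_11_12 > phase_slope:
        return "ice cloud"      # steeper-than-unity slope
    return "water cloud"
```

Applied pixel by pixel to collocated 8-, 11-, and 12-micron brightness temperatures, a rule of this shape yields the clear/ice/water delineation discussed for the MAS case study.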

  14. Effects of chemical composite, puffing temperature and intermediate moisture content on physical properties of potato and apple slices

    NASA Astrophysics Data System (ADS)

    Tabtaing, S.; Paengkanya, S.; Tanthong, P.

    2017-09-01

    Puffing is a process that can improve the texture and volume of crisp fruit and vegetable products. However, the effect of the chemical composition of foods on puffing characteristics is still poorly studied. Therefore, the physical properties of potato and apple slices were studied comparatively. Potato and apple were sliced to 2.5 mm thickness and 2.5 cm diameter. Potato slices were treated with hot water for 2 min, while apple slices received no treatment. The slices were then dried in three steps. First, they were dried by hot air at 90°C until their moisture content reached 30, 40, or 50% dry basis. They were then puffed by hot air at 130, 150, or 170°C for 2 min. Finally, they were dried again by hot air at 90°C until their final moisture content reached 4% dry basis. The experimental results showed that the chemical composition of the food affected the physical properties of the puffed product. Puffed potato had a higher volume ratio than puffed apple because potato slices contain starch, and the higher starch content gave potato a harder texture than apple. Puffing temperature and moisture content strongly affected the color, volume ratio, and textural properties of puffed potato slices. In addition, higher drying rates of the puffed product were observed at higher puffing temperatures and higher moisture contents.

  15. Assimilation of Satellite Data to Improve Cloud Simulation in the WRF Model

    NASA Astrophysics Data System (ADS)

    Park, Y. H.; Pour Biazar, A.; McNider, R. T.

    2012-12-01

    A simple approach has been introduced to improve cloud simulation spatially and temporally in a meteorological model. The first step in this approach is to use Geostationary Operational Environmental Satellite (GOES) observations to identify clouds and estimate the cloud structure. Then, by comparing GOES observations to the model cloud field, we identify areas in which the model has under-predicted or over-predicted clouds. Next, by introducing subsidence in areas with over-prediction and lifting in areas with under-prediction, erroneous clouds are removed and new clouds are formed. The technique estimates the vertical velocity needed for the cloud correction and then uses a one-dimensional variational scheme (1D-Var) to calculate the horizontal divergence components and the consequent horizontal wind components needed to sustain such a vertical velocity. Finally, the new horizontal winds are provided as a nudging field to the model. This nudging provides the dynamical support needed to create/clear clouds in a sustainable manner. The technique was implemented and tested in the Weather Research and Forecasting (WRF) Model and resulted in substantial improvement in model-simulated clouds. Some of the results are presented here.
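For a single grid column, the correction decision reduces to a sign rule on the satellite/model cloud comparison. This sketch uses a placeholder magnitude `w_mag` rather than the 1D-Var-derived value, and the function interface is ours, not the paper's.

```python
def correction_vertical_velocity(sat_cloudy, model_cloudy, w_mag=0.05):
    """Sign of the corrective vertical velocity (m/s) for one grid
    column: lift where the satellite sees cloud the model missed,
    subside where the model made spurious cloud. `w_mag` is an
    illustrative placeholder magnitude."""
    if sat_cloudy and not model_cloudy:
        return +w_mag   # under-prediction: lifting to form cloud
    if model_cloudy and not sat_cloudy:
        return -w_mag   # over-prediction: subsidence to clear cloud
    return 0.0          # agreement: no correction needed
```

The actual scheme then solves for the horizontal divergence and winds that sustain this vertical velocity, which are what get nudged into WRF.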

  16. Sliding-slab three-dimensional TSE imaging with a spiral-In/Out readout.

    PubMed

    Li, Zhiqiang; Wang, Dinghui; Robison, Ryan K; Zwart, Nicholas R; Schär, Michael; Karis, John P; Pipe, James G

    2016-02-01

    T2-weighted imaging is of great diagnostic value in neuroimaging. Three-dimensional (3D) Cartesian turbo spin echo (TSE) scans provide high signal-to-noise ratio (SNR) and contiguous slice coverage. The purpose of this preliminary work is to implement a novel 3D spiral TSE technique with image quality comparable to 2D/3D Cartesian TSE. The proposed technique uses multislab 3D TSE imaging. To mitigate the slice boundary artifacts, a sliding-slab method is extended to spiral imaging. A spiral-in/out readout is adopted to minimize the artifacts that may be present with the conventional spiral-out readout. Phase errors induced by B0 eddy currents are measured and compensated to allow for the combination of the spiral-in and spiral-out images. A nonuniform slice encoding scheme is used to reduce the truncation artifacts while preserving the SNR performance. Preliminary results show that each of the individual measures contributes to the overall performance, and the image quality of the results obtained with the proposed technique is, in general, comparable to that of 2D or 3D Cartesian TSE. 3D sliding-slab TSE with a spiral-in/out readout provides good-quality T2-weighted images, and, therefore, may become a promising alternative to Cartesian TSE. © 2015 Wiley Periodicals, Inc.

  17. Fetal brain volumetry through MRI volumetric reconstruction and segmentation

    PubMed Central

    Estroff, Judy A.; Barnewolt, Carol E.; Connolly, Susan A.; Warfield, Simon K.

    2013-01-01

    Purpose Fetal MRI volumetry is a useful technique, but it is limited by a dependency upon motion-free scans, tedious manual segmentation, and spatial inaccuracy due to thick-slice scans. An image processing pipeline that addresses these limitations was developed and tested. Materials and methods The principal sequences acquired in fetal MRI clinical practice are multiple orthogonal single-shot fast spin echo scans. State-of-the-art image processing techniques were used for inter-slice motion correction and super-resolution reconstruction of high-resolution volumetric images from these scans. The reconstructed volume images were processed with intensity non-uniformity correction and the fetal brain extracted by using supervised automated segmentation. Results Reconstruction, segmentation and volumetry of the fetal brains for a cohort of twenty-five clinically acquired fetal MRI scans were performed. Performance metrics for volume reconstruction, segmentation and volumetry were determined by comparing to manual tracings in five randomly chosen cases. Finally, analysis of the fetal brain and parenchymal volumes was performed based on the gestational age of the fetuses. Conclusion The image processing pipeline developed in this study enables volume rendering and accurate fetal brain volumetry by addressing the limitations of current volumetry techniques, which include dependency on motion-free scans, manual segmentation, and inaccurate thick-slice interpolation. PMID:20625848

  18. The temperature of large dust grains in molecular clouds

    NASA Technical Reports Server (NTRS)

    Clark, F. O.; Laureijs, R. J.; Prusti, T.

    1991-01-01

    The temperature of the large dust grains is calculated for three molecular clouds ranging in visual extinction from 2.5 to 8 mag, by comparing I(100) with maps of either extinction derived from star counts or gas column density derived from molecular observations. Both techniques show the dust temperature declining into the clouds. The two techniques do not agree in absolute scale.

  19. The effect of propofol on CA1 pyramidal cell excitability and GABAA-mediated inhibition in the rat hippocampal slice.

    PubMed

    Albertson, T E; Walby, W F; Stark, L G; Joy, R M

    1996-05-24

    An in vitro paired-pulse orthodromic stimulation technique was used to examine the effects of propofol on excitatory afferent terminals, CA1 pyramidal cells and recurrent collateral evoked inhibition in the rat hippocampal slice. Hippocampal slices 400 microns thick were perfused with oxygenated artificial cerebrospinal fluid, and electrodes were placed in the CA1 region to record extracellular field population spike (PS) or excitatory postsynaptic potential (EPSP) responses to stimulation of Schaffer collateral/commissural fibers. Gamma-aminobutyric acid (GABA)-mediated recurrent inhibition was measured using a paired-pulse technique. The major effect of propofol (7-28 microM) was a dose- and time-dependent increase in the intensity and duration of GABA-mediated inhibition. This propofol effect could be rapidly and completely reversed by exposure to known GABAA antagonists, including picrotoxin, bicuculline and pentylenetetrazol. It was also reversed by the chloride channel antagonist 4,4'-diisothiocyanostilbene-2,2'-disulfonic acid (DIDS). It was not antagonized by central (flumazenil) or peripheral (PK11195) benzodiazepine antagonists. Reversal of endogenous inhibition was also noted with the antagonists picrotoxin and pentylenetetrazol. In input/output curves constructed over a range of stimulus intensities, propofol caused only a small enhancement of EPSPs at higher stimulus intensities but had no effect on PS amplitudes. These studies are consistent with propofol acting through a GABAA-chloride channel mechanism to produce its effect on recurrent collateral evoked inhibition in the rat hippocampal slice.

  20. Capitalizing Resolving Power of Density Gradient Ultracentrifugation by Freezing and Precisely Slicing Centrifuged Solution: Enabling Identification of Complex Proteins from Mitochondria by Matrix Assisted Laser Desorption/Ionization Time-of-Flight Mass Spectrometry

    PubMed Central

    Yu, Haiqing; Lu, Joann J.; Rao, Wei

    2016-01-01

    Density gradient centrifugation is widely utilized for various high-purity sample preparations, and density gradient ultracentrifugation (DGU) is often used for more resolution-demanding purification of organelles and protein complexes. Accurately locating different isopycnic layers and precisely extracting solutions from these layers play a critical role in achieving high-resolution DGU separations. In this technical note, we develop a DGU procedure by freezing the solution rapidly (but gently) after centrifugation to fix the resolved layers and by slicing the frozen solution to fractionate the sample. Because the thickness of each slice can be controlled to be as thin as 10 micrometers, we retain virtually all the resolution produced by DGU. To demonstrate the effectiveness of this method, we fractionate complex V from HeLa mitochondria using a conventional technique and this freezing-slicing (F-S) method. The comparison indicates that our F-S method can reduce complex V layer thicknesses by ~40%. After fractionation, we analyze complex V proteins directly on a matrix-assisted laser desorption/ionization time-of-flight mass spectrometer. Twelve out of fifteen subunits of complex V are positively identified. Our method provides a practical protocol to identify proteins from complexes, which is useful to investigate biomolecular complexes and pathways in various conditions and cell types. PMID:27668122

  1. Processing Uav and LIDAR Point Clouds in Grass GIS

    NASA Astrophysics Data System (ADS)

    Petras, V.; Petrasova, A.; Jeziorska, J.; Mitasova, H.

    2016-06-01

    Today's methods of acquiring Earth surface data, namely lidar and unmanned aerial vehicle (UAV) imagery, non-selectively collect or generate large amounts of points. Point clouds from different sources vary in their properties such as number of returns, density, or quality. We present a set of tools with applications for different types of point clouds obtained by a lidar scanner, the structure from motion technique (SfM), and a low-cost 3D scanner. To take advantage of the vertical structure of multiple-return lidar point clouds, we demonstrate tools to process them using 3D raster techniques which allow, for example, the development of custom vegetation classification methods. Dense point clouds obtained from UAV imagery, often containing redundant points, can be decimated using various techniques before further processing. We implemented and compared several decimation techniques in regard to their performance and the final digital surface model (DSM). Finally, we describe the processing of a point cloud from a low-cost 3D scanner, namely Microsoft Kinect, and its application for interaction with physical models. All the presented tools are open source and integrated in GRASS GIS, a multi-purpose open source GIS with remote sensing capabilities. The tools integrate with other open source projects, specifically the Point Data Abstraction Library (PDAL), the Point Cloud Library (PCL), and the OpenKinect libfreenect2 library, to benefit from the open source point cloud ecosystem. The implementation in GRASS GIS ensures long-term maintenance and reproducibility by the scientific community as well as by the original authors themselves.
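One of the simplest decimation strategies that can be compared on dense UAV point clouds is per-cell thinning: keep one representative point per grid cell. The sketch below is illustrative, not the GRASS GIS implementation, and keeps the first point encountered in each cell.

```python
def decimate(points, cell=1.0):
    """Grid-based point cloud decimation: keep one point per cubic
    cell of side `cell` (same units as the coordinates). Keeps the
    first point seen in each cell; returns the thinned list."""
    kept, seen = [], set()
    for x, y, z in points:
        key = (int(x // cell), int(y // cell), int(z // cell))
        if key not in seen:
            seen.add(key)
            kept.append((x, y, z))
    return kept
```

Varying `cell` trades point count against DSM fidelity, which is the comparison the abstract describes; other strategies (e.g., keeping the cell mean or the highest return) slot into the same loop.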

  2. Development of methods of producing large areas of silicon sheet by the slicing of silicon ingots using Inside Diameter (I.D.) saws

    NASA Technical Reports Server (NTRS)

    Aharonyan, P.

    1980-01-01

    Modifications to a 16 inch STC automated saw included: a programmable feed system; a crystal rotating system; and an STC Dynatrack blade boring and control system. By controlling the plating operation and by grinding the cutting edge, 16 inch I.D. blades were produced with a cutting edge thickness of 0.22 mm. The crystal rotation mechanism was used to slice 100 mm diameter crystals with a 16 inch blade down to a thickness of 0.20 mm. Cutting rates with crystal rotation were generally slower than with standard plunge I.D. slicing techniques. Using programmed feeds and programmed rotation, maximum cutting rates were from 0.3 to 1.0 inches per minute.

  3. Silicon Ingot Casting - Heat Exchanger Method Multi-wire Slicing - Fixed Abrasive Slicing Technique. Phase 3 Silicon Sheet Growth Development for the Large Area Sheet Task of the Low-cost Solar Array Project

    NASA Technical Reports Server (NTRS)

    Schmid, F.; Khattak, C. P.

    1979-01-01

    Several 20 cm diameter silicon ingots, up to 6.3 kg, were cast with good crystallinity. The graphite heat zone can be purified by heating it to high temperatures in vacuum; this is important in reducing costs and in the purification of large parts. Electroplated wires with 45 μm synthetic diamonds and 30 μm natural diamonds showed good cutting efficiency and lifetime. During slicing of a 10 cm x 10 cm workpiece, jerky motion occurred in the feed and rocking mechanisms. This problem was corrected, and modifications were made to reduce the weight of the blade head by 50%.

  4. In vivo free-breathing DTI and IVIM of the whole human heart using a real-time slice-followed SE-EPI navigator-based sequence: A reproducibility study in healthy volunteers.

    PubMed

    Moulin, Kevin; Croisille, Pierre; Feiweier, Thorsten; Delattre, Benedicte M A; Wei, Hongjiang; Robert, Benjamin; Beuf, Olivier; Viallon, Magalie

    2016-07-01

    In this study, we proposed an efficient free-breathing strategy for rapid and improved cardiac diffusion-weighted imaging (DWI) acquisition using a single-shot spin-echo echo planar imaging (SE-EPI) sequence. A real-time slice-following technique during free-breathing was combined with a sliding acquisition-window strategy prior to Principal Component Analysis temporal Maximum Intensity Projection (PCAtMIP) postprocessing of in-plane co-registered diffusion-weighted images. This methodology was applied to 10 volunteers to quantify the performance of the motion correction technique and the reproducibility of diffusion parameters. The slice-following technique offers a powerful head-foot respiratory motion management solution for SE-EPI cDWI with the advantage of a 100% duty cycle scanning efficiency. The level of co-registration was further improved using nonrigid motion corrections and was evaluated with a co-registration index. The vascular fraction f and the diffusion coefficients D and D* were determined to be 0.122 ± 0.013, 1.41 ± 0.09 × 10(-3) mm(2)/s and 43.6 ± 9.2 × 10(-3) mm(2)/s, respectively. From the multidirectional dataset, the measured mean diffusivity was 1.72 ± 0.09 × 10(-3) mm(2)/s and the fractional anisotropy was 0.36 ± 0.02. The slice-following DWI SE-EPI sequence is a promising solution for clinical implementation, offering a robust improved workflow for further evaluation of DWI in cardiology. Magn Reson Med 76:70-82, 2016. © 2015 Wiley Periodicals, Inc.

  5. Serial sectioning methods for 3D investigations in materials science.

    PubMed

    Zankel, Armin; Wagner, Julian; Poelt, Peter

    2014-07-01

    A variety of methods for the investigation and 3D representation of the inner structure of materials has been developed. In this paper, techniques based on slice and view using scanning microscopy for imaging are presented and compared. Three different methods of serial sectioning combined with either scanning electron or scanning ion microscopy or atomic force microscopy (AFM) were placed under scrutiny: serial block-face scanning electron microscopy, which uses an ultramicrotome built into the chamber of a variable pressure scanning electron microscope; three-dimensional (3D) AFM, which combines a (cryo-)ultramicrotome with an atomic force microscope; and 3D FIB, which delivers results by slicing with a focused ion beam. These three methods complement one another in many respects, e.g., in the type of materials that can be investigated, the resolution that can be obtained and the information that can be extracted from 3D reconstructions. A detailed review is given of preparation, the slice and view process itself, and the limitations of the methods and possible artifacts. Applications for each technique are also provided. Copyright © 2014 Elsevier Ltd. All rights reserved.

  6. [Virtual otoscopy--technique, indications and initial experiences with multislice spiral CT].

    PubMed

    Klingebiel, R; Bauknecht, H C; Lehmann, R; Rogalla, P; Werbs, M; Behrbohm, H; Kaschke, O

    2000-11-01

    We report the standardized postprocessing of high-resolution CT data acquired by incremental CT and multi-slice CT in patients with suspected middle ear disorders to generate three-dimensional endoluminal views known as virtual otoscopy. Subsequent to the definition of a postprocessing protocol, standardized endoluminal views of the middle ear were generated according to their otological relevance. The HRCT data sets of 26 ENT patients were transferred to a workstation and postprocessed to 52 virtual otoscopies. Generation of predefined endoluminal views from the HRCT data sets was possible in all patients. Virtual endoscopic views added meaningful information to the primary cross-sectional data in patients suffering from ossicular pathology, having contraindications for invasive tympanic endoscopy or being assessed for surgery of the tympanic cavity. Multi-slice CT improved the visualization of subtle anatomic details such as the stapes suprastructure and reduced the scanning time. Virtual endoscopy allows for the noninvasive endoluminal visualization of various tympanic lesions. Use of the multi-slice CT technique reduces the scanning time and improves image quality in terms of detail resolution.

  7. Acetylcholinesterase inhibition reveals endogenous nicotinic modulation of glutamate inputs to CA1 stratum radiatum interneurons in hippocampal slices

    PubMed Central

    Alkondon, Manickavasagom; Albuquerque, Edson X.; Pereira, Edna F.R.

    2013-01-01

    The involvement of brain nicotinic acetylcholine receptors (nAChRs) in the neurotoxicological effects of soman, a potent acetylcholinesterase (AChE) inhibitor and a chemical warfare agent, is not clear. This is partly due to a poor understanding of the role of AChE in brain nAChR-mediated functions. To test the hypothesis that AChE inhibition builds sufficient acetylcholine (ACh) in the brain and facilitates nAChR-dependent glutamate transmission, we used the whole-cell patch-clamp technique to record spontaneous glutamate excitatory postsynaptic currents (EPSCs) from CA1 stratum radiatum interneurons (SRI) in hippocampal slices. First, the frequency, amplitude and kinetics of EPSCs recorded from slices of control guinea pigs were compared to those recorded from slices of guinea pigs after a single injection of the irreversible AChE inhibitor soman (25.2 μg/kg, s.c.). Second, EPSCs were recorded from rat hippocampal slices before and after their superfusion with the reversible AChE inhibitor donepezil (100 nM). The frequency of EPSCs was significantly higher in slices taken from guinea pigs 24 h but not 7 days after the soman injection than in slices from control animals. In 52% of the rat hippocampal slices tested, bath application of donepezil increased the frequency of EPSCs. Further, exposure to donepezil increased both burst-like and large-amplitude EPSCs, and increased the proportion of short (20–100 ms) inter-event intervals. Donepezil’s effects were suppressed significantly in the presence of 10 μM mecamylamine or 10 nM methyllycaconitine. These results support the concept that AChE inhibition is able to recruit nAChR-dependent glutamate transmission in the hippocampus and such a mechanism can contribute to the acute neurotoxicological actions of soman. PMID:23511125

  8. On the Use of Deep Convective Clouds to Calibrate AVHRR Data

    NASA Technical Reports Server (NTRS)

    Doelling, David R.; Nguyen, Louis; Minnis, Patrick

    2004-01-01

    Remote sensing of cloud and radiation properties from National Oceanic and Atmospheric Administration (NOAA) Advanced Very High Resolution Radiometer (AVHRR) satellites requires constant monitoring of the visible sensors. NOAA satellites do not have onboard visible calibration and need to be calibrated vicariously in order to determine the calibration and the degradation rate. Deep convective clouds are extremely bright and cold, are at the tropopause, have nearly a Lambertian reflectance, and provide predictable albedos. The use of deep convective clouds as calibration targets is developed into a calibration technique and applied to NOAA-16 and NOAA-17. The technique computes the relative gain drift over the life-span of the satellite. This technique is validated by comparing the gain drifts derived from inter-calibration of coincident AVHRR and Moderate-Resolution Imaging Spectroradiometer (MODIS) radiances. A ray-matched technique, which uses collocated, coincident, and co-angled pixel satellite radiance pairs is used to intercalibrate MODIS and AVHRR. The deep convective cloud calibration technique was found to be independent of solar zenith angle, by using well calibrated Visible Infrared Scanner (VIRS) radiances onboard the Tropical Rainfall Measuring Mission (TRMM) satellite, which precesses through all solar zenith angles in 23 days.
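
    The ray-matching step described above amounts to a regression of collocated radiance pairs, and the degradation rate follows from a time series of such gain estimates. A minimal sketch, assuming the usual linear count-to-radiance model and a hypothetical space-count offset (the operational calibration involves many additional corrections):

```python
import numpy as np

def ray_matched_gain(counts, radiances, offset=40.0):
    """Estimate a sensor gain from collocated, coincident, co-angled pixel
    pairs, assuming radiance = gain * (counts - offset), by least squares
    through the origin."""
    x = np.asarray(counts, dtype=float) - offset
    y = np.asarray(radiances, dtype=float)
    return float(np.sum(x * y) / np.sum(x * x))

def degradation_rate(days, gains):
    """Relative gain drift per year from a time series of gain estimates
    (linear fit: slope normalized by the launch-time gain)."""
    slope, intercept = np.polyfit(days, gains, 1)
    return float(365.0 * slope / intercept)
```

With several gain estimates spread over the mission, the fitted relative drift per year is the quantity validated here against the MODIS intercalibration.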

  9. Detection of mesoscale convective complexes using multispectral RGB technique of Himawari-8 (Case Study: Jakarta, 20 February 2017)

    NASA Astrophysics Data System (ADS)

    Fatkhuroyan; Wati, T.

    2018-05-01

    A Mesoscale Convective Complex (MCC) is a large, long-lived, well-organized convective cloud system. The aim of this study is to detect and monitor the development of an MCC around Jakarta on 20 February 2017 using the Himawari-8 satellite. The study analyzes the infrared channel and multispectral RGB imagery to monitor the development of the radiative properties, morphology, and position of the clouds, which describe the cloud-top microphysics, structure, and movement of the MCC. For 20 February 2017, the Himawari-8 results show many dense clouds with small ice particles and cloud-top temperatures below -50°C, which appear as red and yellow in the RGB imagery. The MCC caused a severe storm in Jakarta and its surrounding area.

  10. Development of a Global Multilayered Cloud Retrieval System

    NASA Technical Reports Server (NTRS)

    Huang, J.; Minnis, P.; Lin, B.; Yi, Y.; Ayers, J. K.; Khaiyer, M. M.; Arduini, R.; Fan, T.-F

    2004-01-01

    A more rigorous multilayered cloud retrieval system (MCRS) has been developed to improve the determination of high cloud properties in multilayered clouds. The MCRS attempts a more realistic interpretation of the radiance field than earlier methods because it explicitly resolves the radiative transfer that would produce the observed radiances. A two-layer cloud model was used to simulate multilayered cloud radiative characteristics. Despite the use of a simplified two-layer cloud reflectance parameterization, the MCRS clearly produced a more accurate retrieval of ice water path than the simple differencing techniques used in the past. More satellite data and ground observations have to be used to test the MCRS. The MCRS methods are quite appropriate for interpreting the radiances when the high cloud has a relatively large optical depth (tau_I greater than 2). For thinner ice clouds, a more accurate retrieval might be possible using infrared methods. Selection of an ice cloud retrieval and a variety of other issues must be explored before a complete global application of this technique can be implemented. Nevertheless, the initial results look promising.

  11. Infrared experiments for spaceborne planetary atmospheres research. Full report

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The role of infrared sensing in atmospheric science is discussed and existing infrared measurement techniques are reviewed. Proposed techniques for measuring planetary atmospheres are criticized and recommended instrument developments for spaceborne investigations are summarized for the following phenomena: global and local radiative budget; radiative flux profiles; winds; temperature; pressure; transient and marginal atmospheres; planetary rotation and global atmospheric activity; abundances of stable constituents; vertical, lateral, and temporal distribution of abundances; composition of clouds and aerosols; radiative properties of clouds and aerosols; cloud microstructure; cloud macrostructure; and non-LTE phenomena.

  12. Application of the SRI cloud-tracking technique to rapid-scan GOES observations

    NASA Technical Reports Server (NTRS)

    Wolf, D. E.; Endlich, R. M.

    1980-01-01

    An automatic cloud tracking system was applied to multilayer clouds associated with severe storms. The method was tested using rapid-scan observations of Hurricane Eloise obtained by the GOES satellite on 22 September 1975. Cloud tracking was performed using clustering based on either visible or infrared data. The clusters were tracked using two different techniques. At both 4 km and 8 km resolution, the results of the automatic system were comparable in accuracy and coverage to those obtained by NASA analysts using the Atmospheric and Oceanographic Information Processing System.

  13. Monitoring gap junctional communication in astrocytes from acute adult mouse brain slices using the gap-FRAP technique.

    PubMed

    Yi, Chenju; Teillon, Jérémy; Koulakoff, Annette; Berry, Hugues; Giaume, Christian

    2018-06-01

    Intercellular communication through gap junction channels plays a key role in cellular homeostasis and in synchronizing physiological functions, a feature that is modified in a number of pathological situations. In the brain, astrocytes are the cell population that expresses the highest amount of gap junction proteins, named connexins. Several techniques have been used to assess the level of gap junctional communication in astrocytes, but so far they remain very difficult to apply in adult brain tissue. Here, using specific loading of astrocytes with sulforhodamine 101, we adapted the gap-FRAP (Fluorescence Recovery After Photobleaching) technique to acute hippocampal slices from 9-month-old adult mice. We show that gap junctional communication monitored in astrocytes with this technique was inhibited either by pharmacological treatment with a gap junctional blocker or in mice lacking the two main astroglial connexins, while a partial inhibition was measured when only one connexin was knocked out. We validate this approach using a mathematical model of sulforhodamine 101 diffusion in an elementary astroglial network and a quantitative analysis of the exponential fits to the fluorescence recovery curves. Consequently, we consider that the adaptation of the gap-FRAP technique to acute brain slices from adult mice provides an easy-to-implement and valuable approach that overcomes this age-dependent obstacle and will facilitate the investigation of gap junctional communication in the adult healthy or pathological brain. Copyright © 2018 Elsevier B.V. All rights reserved.
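
    The exponential fits to the recovery curves mentioned above can be linearized for a quick estimate of the recovery time constant. A minimal sketch on synthetic data, assuming a single-exponential recovery model F(t) = F_inf - (F_inf - F0)·exp(-t/tau); the paper's quantitative analysis is more involved:

```python
import numpy as np

def fit_frap_tau(t, F, F_inf):
    """Estimate the FRAP recovery time constant tau by log-linear
    regression: log(F_inf - F(t)) is linear in t with slope -1/tau."""
    y = np.log(F_inf - np.asarray(F, dtype=float))
    slope, _ = np.polyfit(t, y, 1)
    return -1.0 / slope
```

A shorter tau (faster recovery) indicates stronger gap junctional coupling; a pharmacological blocker or connexin knockout would flatten the recovery and lengthen the apparent tau.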

  14. The effects of cloud inhomogeneities upon radiative fluxes, and the supply of a cloud truth validation dataset

    NASA Technical Reports Server (NTRS)

    Welch, Ronald M.

    1996-01-01

    The ASTER polar cloud mask algorithm is currently under development. Several classification techniques have been developed and implemented. The merits and accuracy of each are being examined. The classification techniques under investigation include fuzzy logic, hierarchical neural network, and a pairwise histogram comparison scheme based on sample histograms called the Paired Histogram Method. Scene adaptive methods also are being investigated as a means to improve classifier performance. The feature, arctan of Band 4 and Band 5, and the Band 2 vs. Band 4 feature space are key to separating frozen water (e.g., ice/snow, slush/wet ice, etc.) from cloud over frozen water, and land from cloud over land, respectively. A total of 82 Landsat TM circumpolar scenes are being used as a basis for algorithm development and testing. Numerous spectral features are being tested and include the 7 basic Landsat TM bands, in addition to ratios, differences, arctans, and normalized differences of each combination of bands. A technique for deriving cloud base and top height is developed. It uses 2-D cross correlation between a cloud edge and its corresponding shadow to determine the displacement of the cloud from its shadow. The height is then determined from this displacement, the solar zenith angle, and the sensor viewing angle.
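
    The cloud-height step described above (cross-correlating a cloud edge with its shadow, then converting the displacement to height) can be sketched in a few lines. This is an illustrative simplification assuming binary cloud/shadow masks and a near-nadir view, so that height is approximately displacement divided by the tangent of the solar zenith angle; the full geometry also uses the sensor viewing angle:

```python
import numpy as np

def shadow_displacement(cloud_mask, shadow_mask, max_shift=5):
    """Brute-force 2-D cross correlation: find the (dy, dx) pixel shift
    that best aligns the shadow mask with the cloud mask."""
    best, best_score = (0, 0), -1.0
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(shadow_mask, -dy, axis=0), -dx, axis=1)
            score = float(np.sum(cloud_mask * shifted))
            if score > best_score:
                best, best_score = (dy, dx), score
    return best

def cloud_height(displacement_px, pixel_size_m, solar_zenith_deg):
    """Cloud height from the cloud-to-shadow displacement, assuming a
    near-nadir view: h ~= d / tan(solar zenith angle)."""
    d = displacement_px * pixel_size_m
    return d / np.tan(np.radians(solar_zenith_deg))
```

For example, a 3-pixel displacement at 30 m/pixel under a 45° solar zenith angle implies a cloud roughly 90 m above its shadow surface.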

  15. Organotypic slice cultures of human gastric and esophagogastric junction cancer.

    PubMed

    Koerfer, Justus; Kallendrusch, Sonja; Merz, Felicitas; Wittekind, Christian; Kubick, Christoph; Kassahun, Woubet T; Schumacher, Guido; Moebius, Christian; Gaßler, Nikolaus; Schopow, Nikolas; Geister, Daniela; Wiechmann, Volker; Weimann, Arved; Eckmann, Christian; Aigner, Achim; Bechmann, Ingo; Lordick, Florian

    2016-07-01

    Gastric and esophagogastric junction cancers are heterogeneous and aggressive tumors with an unpredictable response to cytotoxic treatment. New methods allowing for the analysis of drug resistance are needed. Here, we describe a novel technique by which human tumor specimens can be cultured ex vivo, preserving parts of the natural cancer microenvironment. Using a tissue chopper, fresh surgical tissue samples were cut in 400 μm slices and cultivated in 6-well plates for up to 6 days. The slices were processed for routine histopathology and immunohistochemistry. Cytokeratin stains (CK8, AE1/3) were applied for determining tumor cellularity, Ki-67 for proliferation, and cleaved caspase-3 staining for apoptosis. The slices were analyzed under naive conditions and following 2-4 days in vitro exposure to 5-FU and cisplatin. The slice culture technology allowed for a good preservation of tissue morphology and tumor cell integrity during the culture period. After chemotherapy exposure, a loss of tumor cellularity and an increase in apoptosis were observed. Drug sensitivity of the tumors could be assessed. Organotypic slice cultures of gastric and esophagogastric junction cancers were successfully established. Cytotoxic drug effects could be monitored. They may be used to examine mechanisms of drug resistance in human tissue and may provide a unique and powerful ex vivo platform for the prediction of treatment response. © 2016 The Authors. Cancer Medicine published by John Wiley & Sons Ltd.

  16. Depth extraction method with high accuracy in integral imaging based on moving array lenslet technique

    NASA Astrophysics Data System (ADS)

    Wang, Yao-yao; Zhang, Juan; Zhao, Xue-wei; Song, Li-pei; Zhang, Bo; Zhao, Xing

    2018-03-01

    In order to improve depth extraction accuracy, a method using the moving array lenslet technique (MALT) in the pickup stage is proposed, which can decrease the depth interval caused by pixelation. In this method, the lenslet array is moved along the horizontal and vertical directions simultaneously N times within a pitch to get N sets of elemental images. A computational integral-imaging reconstruction method for MALT is used to obtain the slice images of the 3D scene, and the sum modulus difference (SMD) blur metric is applied to these slice images to extract the depth information of the 3D scene. Simulation and optical experiments are carried out to verify the feasibility of this method.
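
    The blur-metric step above can be sketched directly: score each reconstructed slice with a sum-modulus-difference focus measure and take the depth of the sharpest slice. A minimal sketch under that assumption (the paper applies this to MALT-reconstructed slices):

```python
import numpy as np

def smd(img):
    """Sum-modulus-difference focus measure: total absolute difference
    between neighboring pixels. Sharper (in-focus) slices score higher."""
    img = np.asarray(img, dtype=float)
    return float(np.sum(np.abs(np.diff(img, axis=0))) +
                 np.sum(np.abs(np.diff(img, axis=1))))

def estimate_depth(slice_images, depths):
    """Pick the reconstruction depth whose slice image is sharpest."""
    scores = [smd(s) for s in slice_images]
    return depths[int(np.argmax(scores))]
```

A slice reconstructed at the true object depth shows sharp edges (high SMD), while out-of-focus slices are smooth (low SMD); MALT refines this by providing more, finer-spaced candidate depths.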

  17. Three-Dimensional Medical Image Registration Using a Patient Space Correlation Technique

    DTIC Science & Technology

    1991-12-01

    The context analysis for this development was conducted primarily to bound the image registration problem and to isolate the required … a series of 30 transverse slices. Each slice is composed of 240 voxels in the x-dimension and 164 voxels in the y-dimension. The dataset was provided

  18. Determination of Cloud Base Height, Wind Velocity, and Short-Range Cloud Structure Using Multiple Sky Imagers Field Campaign Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Dong; Schwartz, Stephen E.; Yu, Dantong

    Clouds are a central focus of the U.S. Department of Energy (DOE)’s Atmospheric System Research (ASR) program and Atmospheric Radiation Measurement (ARM) Climate Research Facility, and more broadly are the subject of much investigation because of their important effects on atmospheric radiation and, through feedbacks, on climate sensitivity. Significant progress has been made by moving from a vertically pointing (“soda-straw”) to a three-dimensional (3D) view of clouds by investing in scanning cloud radars through the American Recovery and Reinvestment Act of 2009. Yet, because of the physical nature of radars, there are key gaps in ARM's cloud observational capabilities. For example, cloud radars often fail to detect small shallow cumulus and thin cirrus clouds that are nonetheless radiatively important. Furthermore, it takes five to twenty minutes for a cloud radar to complete a 3D volume scan and clouds can evolve substantially during this period. Ground-based stereo-imaging is a promising technique to complement existing ARM cloud observation capabilities. It enables the estimation of cloud coverage, height, horizontal motion, morphology, and spatial arrangement over an extended area of up to 30 by 30 km at refresh rates greater than 1 Hz (Peng et al. 2015). With fine spatial and temporal resolution of modern sky cameras, the stereo-imaging technique allows for the tracking of a small cumulus cloud or a thin cirrus cloud that cannot be detected by a cloud radar. With support from the DOE SunShot Initiative, the Principal Investigator (PI)’s team at Brookhaven National Laboratory (BNL) has developed some initial capability for cloud tracking using multiple distinctly located hemispheric cameras (Peng et al. 2015). To validate the ground-based cloud stereo-imaging technique, the cloud stereo-imaging field campaign was conducted at the ARM Facility’s Southern Great Plains (SGP) site in Oklahoma from July 15 to December 24.
As shown in Figure 1, the cloud stereo-imaging system consisted of two inexpensive high-definition (HD) hemispheric cameras (each cost less than $1,500) and ARM’s Total Sky Imager (TSI). Together with other co-located ARM instrumentation, the campaign provides a promising opportunity to validate stereo-imaging-based cloud base height and, more importantly, to examine the feasibility of cloud thickness retrieval for low-view-angle clouds.

  19. Comparison of helical and cine acquisitions for 4D-CT imaging with multislice CT.

    PubMed

    Pan, Tinsu

    2005-02-01

    We proposed a data sufficiency condition (DSC) for four-dimensional-CT (4D-CT) imaging on a multislice CT scanner, designed a pitch factor for a helical 4D-CT, and compared the acquisition time, slice sensitivity profile (SSP), effective dose, ability to cope with an irregular breathing cycle, and gating technique (retrospective or prospective) of the helical 4D-CT and the cine 4D-CT on the General Electric (GE) LightSpeed RT (4-slice), Plus (4-slice), Ultra (8-slice) and 16 (16-slice) multislice CT scanners. To satisfy the DSC, a helical or cine 4D-CT acquisition has to collect data at each location for the duration of a breathing cycle plus the duration of data acquisition for an image reconstruction. The conditions for the comparison were 20 cm coverage in the cranial-caudal direction, a 4 s breathing cycle, and half-scan reconstruction. We found that the helical 4D-CT has the advantage of a scan time about 10% shorter than that of the cine 4D-CT, and the disadvantages of a 1.8-fold broadening of the SSP and of requiring an additional breathing cycle of scanning to ensure adequate sampling at the start and end locations. The cine 4D-CT has the advantages of maintaining the same SSP as the slice collimation (e.g., 8 x 2.5 mm slice collimation generates 2.5 mm SSP in the cine 4D-CT as opposed to 4.5 mm in the helical 4D-CT) and a lower dose by 4% on the 8- and 16-slice systems, and 8% on the 4-slice system. The advantage of faster scanning in the helical 4D-CT will diminish if a repeat scan at the location of a breathing irregularity becomes necessary. The cine 4D-CT performs better than the helical 4D-CT in the repeat scan because it can scan faster and is more dose efficient.
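
    The data sufficiency condition above reduces to simple arithmetic: each table position must be imaged for one breathing cycle plus one image-reconstruction data window. A rough sketch under stated assumptions (half-scan takes about half a gantry rotation; inter-position table-move time in cine mode is ignored):

```python
import math

def dsc_window(t_breath_s, t_rotation_s, half_scan=True):
    """Minimum acquisition time per location under the DSC: one breathing
    cycle plus the data window of one image reconstruction (about half a
    rotation for half-scan reconstruction)."""
    t_image = 0.5 * t_rotation_s if half_scan else t_rotation_s
    return t_breath_s + t_image

def cine_scan_time(coverage_cm, detector_cm, t_breath_s, t_rotation_s):
    """Approximate total cine 4D-CT time: one DSC window at each of the
    table positions needed to span the cranial-caudal coverage."""
    n_positions = math.ceil(coverage_cm / detector_cm)
    return n_positions * dsc_window(t_breath_s, t_rotation_s)
```

With the paper's comparison conditions (20 cm coverage, 4 s breathing cycle, half-scan) and a hypothetical 2 cm detector and 1 s rotation, this gives 10 positions at 4.5 s each, i.e. about 45 s of beam-on time.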

  20. Geometric Data Perturbation-Based Personal Health Record Transactions in Cloud Computing

    PubMed Central

    Balasubramaniam, S.; Kavitha, V.

    2015-01-01

    Cloud computing is a new delivery model for information technology services and it typically involves the provision of dynamically scalable and often virtualized resources over the Internet. However, cloud computing raises concerns on how cloud service providers, user organizations, and governments should handle such information and interactions. Personal health records represent an emerging patient-centric model for health information exchange, and they are outsourced for storage by third parties, such as cloud providers. With these records, it is necessary for each patient to encrypt their own personal health data before uploading them to cloud servers. Current techniques for encryption primarily rely on conventional cryptographic approaches. However, key management issues remain largely unsolved with these cryptographic-based encryption techniques. We propose that personal health record transactions be managed using geometric data perturbation in cloud computing. In our proposed scheme, the personal health record database is perturbed using geometric data perturbation and outsourced to the Amazon EC2 cloud. PMID:25767826
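
    Geometric data perturbation is, in essence, a secret random rotation plus translation (optionally with additive noise) applied to the numeric record matrix before outsourcing; distance-based mining on the perturbed data still works because orthogonal transforms preserve pairwise distances. An illustrative sketch, not the paper's exact scheme:

```python
import numpy as np

def geometric_perturbation(X, seed=0, noise_scale=0.0):
    """Perturb the rows of X with a random orthogonal transform R, a random
    translation t, and optional additive noise: Y = X R^T + t + noise.
    R and t act as the data owner's secret key."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    R, _ = np.linalg.qr(rng.standard_normal((d, d)))  # random orthogonal matrix
    t = rng.standard_normal(d)
    noise = noise_scale * rng.standard_normal(X.shape)
    return X @ R.T + t + noise
```

With noise_scale=0 the pairwise Euclidean distances between records are preserved exactly, so the cloud can run distance-based queries or clustering on the perturbed database without seeing the plaintext values; the noise term trades some of that fidelity for stronger privacy.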

  2. Adaptation of Microplate-based Respirometry for Hippocampal Slices and Analysis of Respiratory Capacity

    PubMed Central

    Schuh, Rosemary A.; Clerc, Pascaline; Hwang, Hyehyun; Mehrabian, Zara; Bittman, Kevin; Chen, Hegang; Polster, Brian M.

    2011-01-01

    Multiple neurodegenerative disorders are associated with altered mitochondrial bioenergetics. Although mitochondrial O2 consumption is frequently measured in isolated mitochondria, isolated synaptic nerve terminals (synaptosomes), or cultured cells, the absence of mature brain circuitry is a remaining limitation. Here we describe the development of a method that adapts the Seahorse Extracellular Flux Analyzer (XF24) for the microplate-based measurement of hippocampal slice O2 consumption. As a first evaluation of the technique, we compared whole slice bioenergetics to previous measurements made with synaptosomes or cultured neurons. We found that mitochondrial respiratory capacity and O2 consumption coupled to ATP synthesis could be estimated in cultured or acute hippocampal slices with preserved neural architecture. Mouse organotypic hippocampal slices oxidizing glucose displayed mitochondrial O2 consumption that was well-coupled, as determined by the sensitivity to the ATP synthase inhibitor oligomycin. However stimulation of respiration by uncoupler was modest (<120% of basal respiration) compared to previous measurements in cells or synaptosomes, although enhanced slightly (to ~150% of basal respiration) by the acute addition of the mitochondrial complex I-linked substrate pyruvate. These findings suggest a high basal utilization of respiratory capacity in slices and a limitation of glucose-derived substrate for maximal respiration. The improved throughput of microplate-based hippocampal respirometry over traditional O2 electrode-based methods is conducive to neuroprotective drug screening. When coupled with cell type-specific pharmacology or genetic manipulations, the ability to efficiently measure O2 consumption from whole slices should advance our understanding of mitochondrial roles in physiology and neuropathology. PMID:21520220

  3. Pole-Like Road Furniture Detection in Sparse and Unevenly Distributed Mobile Laser Scanning Data

    NASA Astrophysics Data System (ADS)

    Li, F.; Lehtomäki, M.; Oude Elberink, S.; Vosselman, G.; Puttonen, E.; Kukko, A.; Hyyppä, J.

    2018-05-01

    Pole-like road furniture detection has received much attention in recent years due to its traffic functionality. In this paper, we develop a framework to detect pole-like road furniture from sparse mobile laser scanning data. The framework is carried out in four steps. The unorganised point cloud is first partitioned. Then above-ground points are clustered and roughly classified after removing ground points. A slicing check in combination with cylinder masking is proposed to extract pole-like road furniture candidates. Pole-like road furniture is obtained after occlusion analysis in the last stage. The average completeness and correctness of pole-like road furniture detection in sparse and unevenly distributed mobile laser scanning data was above 0.83. This is comparable to the state of the art in pole-like road furniture detection in mobile laser scanning data of good quality and is potentially of practical use in the processing of point clouds collected by autonomous driving platforms.
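
    The slicing check in the pipeline above can be sketched as follows: cut a candidate cluster into horizontal height slices and accept it only if every slice is horizontally compact and the cluster is vertically continuous. The thresholds here are illustrative assumptions, and the paper combines this check with cylinder masking and occlusion analysis:

```python
import numpy as np

def is_pole_candidate(points, slice_height=0.5, max_radius=0.3, min_slices=4):
    """points: (N, 3) array of x, y, z coordinates of one cluster.
    Returns True if the cluster is tall enough, has no vertical gaps,
    and stays within max_radius of its center in every height slice."""
    z = points[:, 2]
    idx = ((z - z.min()) // slice_height).astype(int)
    n_slices = idx.max() + 1
    if n_slices < min_slices:
        return False                 # too short to be a pole
    for k in range(n_slices):
        s = points[idx == k]
        if len(s) == 0:
            return False             # vertical gap: not a continuous pole
        center = s[:, :2].mean(axis=0)
        if np.linalg.norm(s[:, :2] - center, axis=1).max() > max_radius:
            return False             # slice too wide to be pole-like
    return True
```

A thin vertical cluster (lamp post, sign pole) passes every slice check, while a wall or hedge fails on horizontal extent.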

  4. CloudSat system engineering: techniques that point to a future success

    NASA Technical Reports Server (NTRS)

    Basilio, R. R.; Boain, R. J.; Lam, T.

    2002-01-01

    Over the past three years the CloudSat Project, a NASA Earth System Science Pathfinder mission to provide from space the first global survey of cloud profiles and cloud physical properties, has implemented a successful project system engineering approach. Techniques learned through heuristic reasoning about past project events and professional experience were applied, along with select methods recently touted to increase effectiveness without compromising efficiency.

  5. Submillimeter-Wave Cloud Ice Radiometry

    NASA Technical Reports Server (NTRS)

    Walter, Steven J.

    1999-01-01

    Submillimeter-wave cloud ice radiometry is a new and innovative technique for characterizing cirrus ice clouds. Cirrus clouds affect Earth's climate and hydrological cycle by reflecting incoming solar energy, trapping outgoing IR radiation, sublimating into vapor, and influencing atmospheric circulation. Since uncertainties in the global distribution of cloud ice restrict the accuracy of both climate and weather models, successful development of this technique could provide a valuable tool for investigating how clouds affect climate and weather. Cloud ice radiometry could fill an important gap in the observational capabilities of existing and planned Earth-observing systems. Using submillimeter-wave radiometry to retrieve properties of ice clouds can be understood with a simple model. There are a number of submillimeter-wavelength spectral regions where the upper troposphere is transparent. At lower tropospheric altitudes water vapor emits a relatively uniform flux of thermal radiation. When cirrus clouds are present, they scatter a portion of the upwelling flux of submillimeter-wavelength radiation back towards the Earth as shown in the diagram, thus reducing the upward flux of energy. Hence, the power received by a down-looking radiometer decreases when a cirrus cloud passes through the field of view, causing the cirrus cloud to appear radiatively cool against the warm lower atmospheric thermal emissions. The reduction in upwelling thermal flux is a function of both the total cloud ice content and mean crystal size. Radiometric measurements made at multiple widely spaced frequencies permit flux variations caused by changes in crystal size to be distinguished from changes in ice content, and polarized measurements can be used to constrain mean crystal shape. The goal of the cloud ice radiometry program is to further develop and validate this technique of characterizing cirrus.
A multi-frequency radiometer is being designed to support airborne science and spacecraft validation missions. This program has already extended the initial millimeter-wave modeling studies to submillimeter-wavelengths and has improved the realism of the cloud scattering models. Additionally a proof-of-concept airborne submillimeter-wave radiometer was constructed and fielded. It measured a radiometric signal from cirrus confirming the basic technical feasibility of this technique. This program is a cooperative effort of the University of Colorado, Colorado State University, Swales Aerospace, and Jet Propulsion Laboratory. Additional information is contained in the original.

  6. Measurement of the distribution of ventilation-perfusion ratios in the human lung with proton MRI: comparison with the multiple inert-gas elimination technique.

    PubMed

    Sá, Rui Carlos; Henderson, A Cortney; Simonson, Tatum; Arai, Tatsuya J; Wagner, Harrieth; Theilmann, Rebecca J; Wagner, Peter D; Prisk, G Kim; Hopkins, Susan R

    2017-07-01

    We have developed a novel functional proton magnetic resonance imaging (MRI) technique to measure regional ventilation-perfusion (V̇ A /Q̇) ratio in the lung. We conducted a comparison study of this technique in healthy subjects ( n = 7, age = 42 ± 16 yr, Forced expiratory volume in 1 s = 94% predicted), by comparing data measured using MRI to that obtained from the multiple inert gas elimination technique (MIGET). Regional ventilation measured in a sagittal lung slice using Specific Ventilation Imaging was combined with proton density measured using a fast gradient-echo sequence to calculate regional alveolar ventilation, registered with perfusion images acquired using arterial spin labeling, and divided on a voxel-by-voxel basis to obtain regional V̇ A /Q̇ ratio. LogSDV̇ and LogSDQ̇, measures of heterogeneity derived from the standard deviation (log scale) of the ventilation and perfusion vs. V̇ A /Q̇ ratio histograms respectively, were calculated. On a separate day, subjects underwent study with MIGET and LogSDV̇ and LogSDQ̇ were calculated from MIGET data using the 50-compartment model. MIGET LogSDV̇ and LogSDQ̇ were normal in all subjects. LogSDQ̇ was highly correlated between MRI and MIGET (R = 0.89, P = 0.007); the intercept was not significantly different from zero (-0.062, P = 0.65) and the slope did not significantly differ from identity (1.29, P = 0.34). MIGET and MRI measures of LogSDV̇ were well correlated (R = 0.83, P = 0.02); the intercept differed from zero (0.20, P = 0.04) and the slope deviated from the line of identity (0.52, P = 0.01). We conclude that in normal subjects, there is a reasonable agreement between MIGET measures of heterogeneity and those from proton MRI measured in a single slice of lung. NEW & NOTEWORTHY We report a comparison of a new proton MRI technique to measure regional V̇ A /Q̇ ratio against the multiple inert gas elimination technique (MIGET). 
The study reports good relationships between measures of heterogeneity derived from MIGET and those derived from MRI. Although currently limited to a single slice acquisition, these data suggest that single sagittal slice measures of V̇ A /Q̇ ratio provide an adequate means to assess heterogeneity in the normal lung. Copyright © 2017 the American Physiological Society.
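    The voxel-by-voxel V̇A/Q̇ computation and the LogSD-style heterogeneity indices described above can be sketched as follows. This is a minimal illustration assuming one common definition of LogSDV̇/LogSDQ̇ (ventilation- and perfusion-weighted standard deviations of ln(V̇A/Q̇)); it is not the authors' full pipeline, which includes image registration and the MIGET 50-compartment model.

```python
import numpy as np

def va_q_heterogeneity(vent, perf):
    """Voxel-by-voxel V/Q ratio plus LogSDV/LogSDQ-style heterogeneity indices.

    vent, perf : 2-D arrays of regional alveolar ventilation and perfusion
    (arbitrary but consistent units). The indices are the ventilation- and
    perfusion-weighted standard deviations of ln(V/Q).
    """
    mask = (vent > 0) & (perf > 0)           # ignore unventilated/unperfused voxels
    log_ratio = np.log(vent[mask] / perf[mask])

    def weighted_sd(w):
        w = w / w.sum()
        mean = np.sum(w * log_ratio)
        return np.sqrt(np.sum(w * (log_ratio - mean) ** 2))

    log_sd_v = weighted_sd(vent[mask])       # ventilation-weighted spread
    log_sd_q = weighted_sd(perf[mask])       # perfusion-weighted spread
    ratio_map = np.full(vent.shape, np.nan)
    ratio_map[mask] = vent[mask] / perf[mask]
    return ratio_map, log_sd_v, log_sd_q
```

    A perfectly homogeneous lung (ventilation matched to perfusion everywhere) gives a ratio map of 1 and both indices equal to zero.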

  7. Sensitivity analysis of add-on price estimate for select silicon wafering technologies

    NASA Technical Reports Server (NTRS)

    Mokashi, A. R.

    1982-01-01

    The cost of producing wafers from silicon ingots is a major component of the add-on price of silicon sheet. Economic analyses of the add-on price estimates, and their sensitivity, for internal-diameter (ID) sawing, multiblade slurry (MBS) sawing, and the fixed-abrasive slicing technique (FAST) are presented. Interim price estimation guidelines (IPEG) are used for estimating a process add-on price. Sensitivity analysis of price is performed with respect to cost parameters such as equipment, space, direct labor, materials (blade life), and utilities, and to production parameters such as slicing rate, slices per centimeter, and process yield, using a computer program specifically developed to perform sensitivity analysis with IPEG. The results aid in identifying the important cost parameters and assist in deciding the direction of technology development efforts.
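    The kind of sensitivity analysis described above can be sketched as a one-at-a-time perturbation of a linear cost model. The coefficient values and cost figures below are illustrative placeholders, not the published IPEG constants or the study's ID/MBS/FAST data.

```python
# Sensitivity analysis in the spirit of IPEG. Coefficients `c` and all cost
# inputs are hypothetical, for illustration only.
def ipeg_price(eqpt, sqft, dlab, mats, utils, quantity,
               c=(0.49, 97.0, 2.1, 1.2, 1.2)):
    """Add-on price per unit: weighted annualized cost sum / annual quantity."""
    return (c[0] * eqpt + c[1] * sqft + c[2] * dlab
            + c[3] * mats + c[4] * utils) / quantity

def sensitivity(base, param, delta=0.10):
    """Elasticity estimate: fractional price change per fractional change in
    one parameter (for the linear cost terms this equals their cost share)."""
    p0 = ipeg_price(**base)
    bumped = dict(base)
    bumped[param] *= (1.0 + delta)
    return (ipeg_price(**bumped) - p0) / p0 / delta
```

    Ranking the elasticities over the cost and production parameters identifies which ones dominate the add-on price, which is the purpose the abstract describes.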

  8. Retrospective 4D MR image construction from free-breathing slice acquisitions: A novel graph-based approach.

    PubMed

    Tong, Yubing; Udupa, Jayaram K; Ciesielski, Krzysztof C; Wu, Caiyun; McDonough, Joseph M; Mong, David A; Campbell, Robert M

    2017-01-01

    Dynamic or 4D imaging of the thorax has many applications. Both prospective and retrospective respiratory gating and tracking techniques have been developed for 4D imaging via CT and MRI. For pediatric imaging, due to radiation concerns, MRI becomes the de facto modality of choice. In thoracic insufficiency syndrome (TIS), patients often suffer from extreme malformations of the chest wall, diaphragm, and/or spine, with inability of the thorax to support normal respiration or lung growth (Campbell et al., 2003, Campbell and Smith, 2007), so the patient cooperation needed by some of the gating and tracking techniques is difficult to realize without causing patient discomfort and interference with the breathing mechanism itself. Therefore, (ventilator-supported) free-breathing MRI acquisition is currently the best choice for imaging these patients. This, however, raises the question of how to create a consistent 4D image from such acquisitions. This paper presents a novel graph-based technique for compiling the best 4D image volume representing the thorax over one respiratory cycle from slice images acquired during unencumbered natural tidal breathing of pediatric TIS patients. In our approach, for each coronal (or sagittal) slice position, images are acquired at a rate of about 200-300 ms/slice over several natural breathing cycles, which yields over 2000 slices. A weighted graph is formed where each acquired slice constitutes a node and the weight of the arc between two nodes defines the degree of contiguity in space and time of the two slices. For each respiratory phase, an optimal 3D spatial image is constructed by finding the best path in the graph in the spatial direction. The set of all such 3D images for a given respiratory cycle constitutes a 4D image. Subsequently, the best 4D image among all such constructed images is found over all imaged respiratory cycles.
Two types of evaluation studies are carried out to understand the behavior of this algorithm and in comparison to a method called Random Stacking - a 4D phantom study and 10 4D MRI acquisitions from TIS patients and normal subjects. The 4D phantom was constructed by 3D printing the pleural spaces of an adult thorax, which were segmented in a breath-held MRI acquisition. Qualitative visual inspection via cine display of the slices in space and time and in 3D rendered form showed smooth variation for all data sets constructed by the proposed method. Quantitative evaluation was carried out to measure spatial and temporal contiguity of the slices via segmented pleural spaces. The optimal method showed smooth variation of the pleural space as compared to Random Stacking whose behavior was erratic. The volumes of the pleural spaces at the respiratory phase corresponding to end inspiration and end expiration were compared to volumes obtained from breath-hold acquisitions at roughly the same phase. The mean difference was found to be roughly 3%. The proposed method is purely image-based and post-hoc and does not need breath holding or external surrogates or instruments to record respiratory motion or tidal volume. This is important and practically warranted for pediatric patients. The constructed 4D images portray spatial and temporal smoothness that should be expected in a consistent 4D volume. We believe that the method can be routinely used for thoracic 4D imaging. Copyright © 2016 Elsevier B.V. All rights reserved.
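    The best-path search at the heart of this construction (one candidate slice chosen per spatial position so that neighbouring chosen slices are maximally contiguous) can be sketched as a Viterbi-style dynamic program over a layered graph. The arc weights here are abstract placeholders for the space-time contiguity measure defined in the paper.

```python
import numpy as np

def best_stack(dissim):
    """Pick one candidate slice per spatial position so that the summed
    dissimilarity between neighbouring chosen slices is minimal.

    dissim : list of 2-D arrays; dissim[i][a, b] is the arc weight between
    candidate a at position i and candidate b at position i+1 (e.g. image
    difference plus acquisition-time gap). Returns one candidate index per
    position, i.e. the shortest path through the layered graph.
    """
    cost = np.zeros(dissim[0].shape[0])      # accumulated cost per candidate
    back = []                                # back-pointers for path recovery
    for w in dissim:
        total = cost[:, None] + w            # extend paths into the next layer
        back.append(np.argmin(total, axis=0))
        cost = np.min(total, axis=0)
    path = [int(np.argmin(cost))]
    for b in reversed(back):                 # trace the optimal path backwards
        path.append(int(b[path[-1]]))
    return list(reversed(path))
```

    Running this once per respiratory phase yields one 3D stack per phase; the set of stacks forms the 4D image described above.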

  9. Development of lidar sensor for cloud-base measurements during convective conditions

    NASA Astrophysics Data System (ADS)

    Vishnu, R.; Bhavani Kumar, Y.; Rao, T. Narayana; Nair, Anish Kumar M.; Jayaraman, A.

    2016-05-01

    Atmospheric convection is a natural phenomenon associated with heat transport. Convection is strong during daylight periods and vigorous in the summer months, when severe ground heating associated with strong winds is experienced. The tropics are considered the source regions for strong convection, and the formation of thunderstorm clouds is common during this period. The location of the cloud base and its associated dynamics are important for understanding the influence of convection on the atmosphere. Lidars are sensitive to Mie scattering and are more suitable instruments for locating clouds in the atmosphere than instruments utilizing the radio frequency spectrum. Thunderstorm clouds are composed of hydrometeors and strongly scatter laser light. Recently, a lidar technique was developed at the National Atmospheric Research Laboratory (NARL), a Department of Space (DOS) unit located at Gadanki near Tirupati. The lidar technique employs slant-path operation and provides high-resolution measurements of the cloud base location in real time. The laser-based remote sensing technique allows measurement of the atmosphere every second at 7.5 m range resolution. The high-resolution data permit assessment of updrafts at the cloud base. The lidar also provides the real-time convective boundary layer height, using aerosols as tracers of atmospheric dynamics. The lidar sensor is planned to be upgraded with a scanning facility to understand cloud dynamics in the spatial dimension. In this presentation, we describe the lidar sensor technology and its utilization for high-resolution cloud base measurements during convective conditions over the lidar site at Gadanki.
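    A first-guess cloud-base detector for a single backscatter profile can be sketched with a simple threshold test on the range-corrected signal. This is one of several standard gradient/threshold approaches, not the specific NARL algorithm; the 7.5 m range resolution matches the abstract, but the threshold constant is an assumption.

```python
import numpy as np

def cloud_base(profile, range_res_m=7.5, k=5.0):
    """Range (m) of the first bin whose range-corrected signal exceeds the
    profile median by k robust deviations; None if no cloud is detected."""
    bins = np.arange(len(profile))
    rc = profile * ((bins + 1) * range_res_m) ** 2      # range-corrected signal
    med = np.median(rc)
    # robust spread (MAD) with a small floor to avoid a zero threshold
    mad = max(np.median(np.abs(rc - med)), 1e-6 * abs(med) + 1e-12)
    hits = np.where(rc > med + k * mad)[0]
    return None if hits.size == 0 else float(hits[0] * range_res_m)
```

    Applied once per second, this yields the kind of real-time cloud-base time series from which updrafts at cloud base can be assessed.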

  10. Multi-wavelength dual polarisation lidar for monitoring precipitation process in the cloud seeding technique

    NASA Astrophysics Data System (ADS)

    Sudhakar, P.; Sheela, K. Anitha; Ramakrishna Rao, D.; Malladi, Satyanarayana

    2016-05-01

    In recent years, weather modification activities have been pursued in many countries through cloud seeding techniques to facilitate increased and timely precipitation from clouds. To induce and accelerate the precipitation process, clouds are artificially seeded with suitable materials such as silver iodide, sodium chloride, or other hygroscopic materials. The success of cloud seeding can be predicted with confidence if the precipitation process involving aerosol, the ice-water balance, water vapor content, and the size of the seeding material in relation to aerosol in the cloud is monitored in real time and optimized. A project on the enhancement of rainfall through cloud seeding is being implemented jointly with Kerala State Electricity Board Ltd., Trivandrum, Kerala, India at the catchment areas of the reservoir of one of the hydroelectric projects. A dual-polarization lidar is being used to monitor and measure the microphysical properties, extinction coefficient, size distribution, and related parameters of the clouds. The lidar makes use of Mie, Rayleigh, and Raman scattering techniques for the various measurements proposed. The measurements with the dual-polarization lidar are being carried out in real time to obtain the various parameters during cloud seeding operations. In this paper we present the details of the multi-wavelength dual-polarization lidar being used and the methodology for monitoring the various cloud parameters involved in the precipitation process. The necessary retrieval algorithms for deriving the microphysical properties of clouds, aerosol characteristics, and water vapor profiles are incorporated as a software package running under LabVIEW for online and offline analysis. Details of the simulation studies and the theoretical model developed for the optimization of various parameters are also discussed.
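    The basic quantity a dual-polarization lidar adds over a single-channel system is the linear depolarization ratio, which can be computed channel-by-channel as below. This is the generic definition, not the instrument-specific calibration used in the study.

```python
import numpy as np

def depolarization_ratio(p_perp, p_par, eps=1e-12):
    """Volume linear depolarization ratio d = P_perp / P_par per range bin.

    Spherical liquid droplets give values near zero; ice crystals and irregular
    particles depolarize more, which is one way a dual-polarization lidar can
    follow the glaciation of a seeded cloud.
    """
    return np.asarray(p_perp, float) / (np.asarray(p_par, float) + eps)
```

    Profiles of this ratio, alongside the Mie/Rayleigh/Raman channels, are the kind of input the retrieval package described above would consume.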

  11. Myocardial perfusion MRI with sliding-window conjugate-gradient HYPR.

    PubMed

    Ge, Lan; Kino, Aya; Griswold, Mark; Mistretta, Charles; Carr, James C; Li, Debiao

    2009-10-01

    First-pass perfusion MRI is a promising technique for detecting ischemic heart disease. However, the diagnostic value of the method is limited by low spatial coverage, resolution, signal-to-noise ratio (SNR), and cardiac motion-related image artifacts. In this study we investigated the feasibility of using a method that combines sliding window and CG-HYPR methods (SW-CG-HYPR) to reduce the acquisition window for each slice while maintaining the temporal resolution of one frame per heartbeat in myocardial perfusion MRI. This method allows an increased number of slices, reduces motion artifacts, and preserves the relatively high SNR and spatial resolution of the "composite images." Results from eight volunteers demonstrate the feasibility of SW-CG-HYPR for accelerated myocardial perfusion imaging with accurate signal intensity changes of the left ventricular blood pool and myocardium. Using this method, the acquisition time per cardiac cycle was reduced by a factor of 4 and the number of slices was increased from 3 to 8 as compared to the conventional technique. The SNR of the myocardium at peak enhancement with SW-CG-HYPR (13.83 +/- 2.60) was significantly higher (P < 0.05) than with the conventional turbo-FLASH protocol (8.40 +/- 1.62). Also, the spatial resolution of the myocardial perfusion images was significantly improved. SW-CG-HYPR is a promising technique for myocardial perfusion MRI. (c) 2009 Wiley-Liss, Inc.
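    The core HYPR idea behind the method above is that a high-SNR composite image carries the spatial detail while each undersampled time frame only modulates it through a low-pass-filtered ratio. The 1-D sketch below shows only that basic HYPR-LR ratio step; the sliding-window composite and conjugate-gradient refinement that define SW-CG-HYPR are omitted, and the filter width is an assumption.

```python
import numpy as np

def hypr_lr(composite, frame, k=7):
    """HYPR-LR-style frame estimate (1-D sketch): the composite signal
    modulated by the ratio of low-pass-filtered frame data to the
    low-pass-filtered composite."""
    kern = np.ones(k) / k                       # simple boxcar low-pass filter
    num = np.convolve(frame, kern, mode="same")
    den = np.convolve(composite, kern, mode="same") + 1e-12
    return composite * (num / den)
```

    The ratio recovers each frame's contrast dynamics at the composite's spatial resolution, which is why the technique preserves SNR while shortening the per-slice acquisition window.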

  12. Optical property retrievals of subvisual cirrus clouds from OSIRIS limb-scatter measurements

    NASA Astrophysics Data System (ADS)

    Wiensz, J. T.; Degenstein, D. A.; Lloyd, N. D.; Bourassa, A. E.

    2012-08-01

    We present a technique for retrieving the optical properties of subvisual cirrus clouds detected by OSIRIS, a limb-viewing satellite instrument that measures scattered radiances from the UV to the near-IR. The measurement set is composed of a ratio of limb radiance profiles at two wavelengths that indicates the presence of cloud-scattering regions. Optical properties from an in-situ database are used to simulate scattering by cloud particles. With the appropriate configurations discussed in this paper, the SASKTRAN successive-orders-of-scatter radiative transfer model is able to accurately simulate the in-cloud radiances from OSIRIS. Configured in this way, the model is used with a multiplicative algebraic reconstruction technique (MART) to retrieve the cloud extinction profile for an assumed effective cloud particle size. The sensitivity of these retrievals to key auxiliary model parameters is shown, and it is demonstrated that the retrieved extinction profile accurately models the measured in-cloud radiances from OSIRIS. Since OSIRIS has an 11-yr record of subvisual cirrus cloud detections, the work described in this manuscript provides a useful basis for a long-term global record of the properties of these clouds.
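    MART itself is a small, standard iteration, shown below for a fixed linear forward model. This is an illustrative sketch of the reconstruction machinery only; the OSIRIS retrieval couples the multiplicative update to the SASKTRAN forward model rather than to a fixed matrix.

```python
import numpy as np

def mart(A, y, n_iter=50, relax=1.0):
    """Multiplicative algebraic reconstruction technique (MART).

    Solves y ~ A x for non-negative x via multiplicative row updates:
        x_j <- x_j * (y_i / (A x)_i) ** (relax * A_ij)
    Starting from a positive guess keeps the solution positive, which suits
    physical quantities like extinction.
    """
    m, n = A.shape
    x = np.ones(n)
    for _ in range(n_iter):
        for i in range(m):
            pred = A[i] @ x
            if pred > 0 and y[i] > 0:
                x *= (y[i] / pred) ** (relax * A[i])
    return x
```

    The multiplicative (rather than additive) update is the reason MART is popular for retrievals of strictly positive profiles such as cloud extinction.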

  13. Advanced Visualization and Interactive Display Rapid Innovation and Discovery Evaluation Research (VISRIDER) Program Task 6: Point Cloud Visualization Techniques for Desktop and Web Platforms

    DTIC Science & Technology

    2017-04-01

    This report (covering OCT 2013 – SEP 2014) evaluates various point cloud visualization techniques for viewing large-scale LiDAR datasets and assesses their potential use for thick-client desktop platforms.

  14. The Education Value of Cloud Computing

    ERIC Educational Resources Information Center

    Katzan, Harry, Jr.

    2010-01-01

    Cloud computing is a technique for supplying computer facilities and providing access to software via the Internet. Cloud computing represents a contextual shift in how computers are provisioned and accessed. One of the defining characteristics of cloud software service is the transfer of control from the client domain to the service provider.…

  15. A simple biota removal algorithm for 35 GHz cloud radar measurements

    NASA Astrophysics Data System (ADS)

    Kalapureddy, Madhu Chandra R.; Sukanya, Patra; Das, Subrata K.; Deshpande, Sachin M.; Pandithurai, Govindan; Pazamany, Andrew L.; Ambuj K., Jha; Chakravarty, Kaustav; Kalekar, Prasad; Krishna Devisetty, Hari; Annam, Sreenivas

    2018-03-01

    Cloud radar reflectivity profiles can be an important measurement for the investigation of cloud vertical structure (CVS). However, extracting the intended meteorological cloud content from the measurement often demands an effective technique or algorithm that can reduce error and observational uncertainties in the recorded data. In this work, a technique is proposed to identify and separate cloud and non-hydrometeor echoes using the radar Doppler spectral moments profile measurements. The point and volume target-based theoretical radar sensitivity curves are used for removing the receiver noise floor, and identified radar echoes are scrutinized according to the signal decorrelation period. Here, it is hypothesized that cloud echoes are temporally more coherent and homogeneous and have a longer correlation period than biota. This can be checked statistically using the ~4 s sliding mean and standard deviation of the reflectivity profiles. This step helps to critically screen out clouds by filtering out the biota. The final important step is the retrieval of cloud height. The proposed algorithm identifies cloud height through the systematic characterization of Z variability using knowledge of the local atmospheric vertical structure, in addition to the theoretical, statistical and echo-tracing tools. Thus, characterization of high-resolution cloud radar reflectivity profile measurements has been done with the theoretical echo sensitivity curves and observed echo statistics for true cloud height tracking (TEST). TEST showed superior performance in screening out clouds and filtering out isolated insects. TEST constrained with polarimetric measurements was found to be more promising under high-density biota, whereas TEST combined with the linear depolarization ratio and spectral width performs well in filtering out biota within the highly turbulent shallow cumulus clouds in the convective boundary layer (CBL).
This TEST technique is promisingly simple in realization but powerful in performance due to the flexibility in constraining, identifying and filtering out the biota and screening out the true cloud content, especially the CBL clouds. Therefore, the TEST algorithm is superior for screening out the low-level clouds that are strongly linked to the rainmaking mechanism associated with the Indian Summer Monsoon region's CVS.
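    The sliding mean/standard-deviation coherence test at the core of the screening step can be sketched as below for a single range gate. The window length follows the ~4 s figure in the abstract, but the variability threshold and the 1 s sampling are illustrative assumptions, not the TEST algorithm's tuned values.

```python
import numpy as np

def screen_biota(Z, win=4, max_std=3.0):
    """Flag temporally coherent (cloud-like) samples in a reflectivity series.

    Z : 1-D reflectivity time series (dBZ) at one range gate, sampled at 1 s;
    NaN marks no echo. Any fully finite window whose standard deviation stays
    below `max_std` is treated as a coherent cloud echo; isolated high-variance
    spikes (biota-like returns) are left unflagged.
    """
    z = np.asarray(Z, float)
    cloud = np.zeros(z.size, bool)
    for t in range(z.size - win + 1):
        seg = z[t:t + win]
        if np.all(np.isfinite(seg)) and np.std(seg) <= max_std:
            cloud[t:t + win] = True          # coherent run -> keep as cloud
    return cloud
```

    Applying this gate-by-gate over the profile, and then tracing the flagged region's top, corresponds to the screening and cloud-height-tracking stages described above.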

  16. Morphological diagnostics of star formation in molecular clouds

    NASA Astrophysics Data System (ADS)

    Beaumont, Christopher Norris

    Molecular clouds are the birth sites of all star formation in the present-day universe. They represent the initial conditions of star formation, and are the primary medium by which stars transfer energy and momentum back to parsec scales. Yet the physical evolution of molecular clouds remains poorly understood. This is not due to a lack of observational data, nor is it due to an inability to simulate the conditions inside molecular clouds. Instead, the physics and structure of the interstellar medium are sufficiently complex that interpreting molecular cloud data is very difficult. This dissertation mitigates this problem by developing more sophisticated ways to interpret morphological information in molecular cloud observations and simulations. In particular, I have focused on leveraging machine learning techniques to identify physically meaningful substructures in the interstellar medium, as well as techniques to inter-compare molecular cloud simulations and observations. These contributions make it easier to understand the interplay between molecular clouds and star formation. Specific contributions include: new insight about the sheet-like geometry of molecular clouds based on observations of stellar bubbles; a new algorithm to disambiguate overlapping yet morphologically distinct cloud structures; a new perspective on the relationship between molecular cloud column density distributions and the sizes of cloud substructures; a quantitative analysis of how projection effects affect measurements of cloud properties; and an automatically generated, statistically calibrated catalog of bubbles identified from their infrared morphologies.

  17. All-sky photogrammetry techniques to georeference a cloud field

    NASA Astrophysics Data System (ADS)

    Crispel, Pierre; Roberts, Gregory

    2018-01-01

    In this study, we present a novel method of identifying and geolocalizing cloud field elements from a portable all-sky camera stereo network based on the ground and oriented towards zenith. The methodology is mainly based on stereophotogrammetry, a 3-D reconstruction technique based on triangulation from corresponding stereo pixels in rectified images. In cases where clouds are horizontally separated, identifying individual positions is performed with segmentation techniques based on hue filtering and contour detection algorithms. Macroscopic cloud field characteristics such as cloud layer base heights and velocity fields are also deduced. In addition, the methodology is fitted to the context of measurement campaigns, which impose simplicity of implementation, auto-calibration, and portability. Camera internal geometry models are calibrated a priori in the laboratory and validated to ensure a certain accuracy in the peripheral parts of the all-sky image. Then, stereophotogrammetry with dense 3-D reconstruction is applied with cameras spaced 150 m apart for two validation cases. The first validation case is carried out with cumulus clouds having a cloud base height of 1500 m a.g.l. The second validation case is carried out with two cloud layers: a cumulus fractus layer with a base height of 1000 m a.g.l. and an altocumulus stratiformis layer with a base height of 2300 m a.g.l. Velocity fields at cloud base are computed by tracking rectangular image patterns through successive shots. The height uncertainty is estimated by comparison with a Vaisala CL31 ceilometer located on the site. The uncertainties on the horizontal coordinates and on the velocity field are theoretically quantified using the experimental uncertainties of the cloud base height and camera orientation. In the first cumulus case, segmentation of the image is performed to identify individual clouds in the cloud field and determine the horizontal positions of the cloud centers.
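    The triangulation step underlying the stereo reconstruction reduces, for a simplified rectified pinhole model, to height = focal length × baseline / disparity. The all-sky (fisheye) geometry in the study requires a full camera model and image rectification; the sketch below conveys only the triangulation idea, and the focal length and disparity values in the example are hypothetical.

```python
def cloud_height(baseline_m, focal_px, disparity_px):
    """Height of a cloud element above two zenith-pointing cameras under a
    rectified pinhole model: h = f * B / d, with the baseline B in metres and
    the focal length f and stereo disparity d in pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```

    The inverse dependence on disparity also explains the height uncertainty budget: a fixed pixel-matching error translates into a larger height error for higher (smaller-disparity) cloud layers.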

  18. Breath-hold imaging of the coronary arteries using Quiescent-Interval Slice-Selective (QISS) magnetic resonance angiography: pilot study at 1.5 Tesla and 3 Tesla.

    PubMed

    Edelman, Robert R; Giri, S; Pursnani, A; Botelho, M P F; Li, W; Koktzoglou, I

    2015-11-23

    Coronary magnetic resonance angiography (MRA) is usually obtained with a free-breathing navigator-gated 3D acquisition. Our aim was to develop an alternative breath-hold approach that would allow the coronary arteries to be evaluated in a much shorter time and without risk of degradation by respiratory motion artifacts. For this purpose, we implemented a breath-hold, non-contrast-enhanced, quiescent-interval slice-selective (QISS) 2D technique. Sequence performance was compared at 1.5 and 3 Tesla using both radial and Cartesian k-space trajectories. The left coronary circulation was imaged in six healthy subjects and two patients with coronary artery disease. Breath-hold QISS was compared with T2-prepared 2D balanced steady-state free-precession (bSSFP) and free-breathing, navigator-gated 3D bSSFP. Approximately ten 2.1-mm-thick slices were acquired in a single ~20-s breath-hold using two-shot QISS. QISS contrast-to-noise ratio (CNR) was 1.5-fold higher at 3 Tesla than at 1.5 Tesla. Cartesian QISS provided the best coronary-to-myocardium CNR, whereas radial QISS provided the sharpest coronary images. QISS image quality exceeded that of free-breathing 3D coronary MRA with few artifacts at either field strength. Compared with T2-prepared 2D bSSFP, multi-slice capability was not restricted by the specific absorption rate at 3 Tesla and pericardial fluid signal was better suppressed. In addition to depicting the coronary arteries, QISS could image intra-cardiac structures, pericardium, and the aortic root in arbitrary slice orientations. Breath-hold QISS is a simple, versatile, and time-efficient method for coronary MRA that provides excellent image quality at both 1.5 and 3 Tesla. Image quality exceeded that of free-breathing, navigator-gated 3D MRA in a much shorter scan time. QISS also allowed rapid multi-slice bright-blood, diastolic phase imaging of the heart, which may have complementary value to multi-phase cine imaging.
We conclude that, with further clinical validation, QISS might provide an efficient alternative to commonly used free-breathing coronary MRA techniques.

  19. Mesoscale weather and climate modeling with the global non-hydrostatic Goddard Earth Observing System Model (GEOS-5) at cloud-permitting resolutions

    NASA Astrophysics Data System (ADS)

    Putman, W. M.; Suarez, M.

    2009-12-01

    The Goddard Earth Observing System Model (GEOS-5), an earth system model developed in the NASA Global Modeling and Assimilation Office (GMAO), has integrated the non-hydrostatic finite-volume dynamical core on the cubed-sphere grid. The extension to a non-hydrostatic dynamical framework and the quasi-uniform cubed-sphere geometry permit the efficient exploration of global weather and climate modeling at cloud-permitting resolutions of 10- to 4-km on today's high performance computing platforms. We have explored a series of incremental increases in global resolution with GEOS-5 from its standard 72-level 27-km resolution (~5.5 million cells covering the globe from the surface to 0.1 hPa) down to 3.5-km (~3.6 billion cells). We will present results from a series of forecast experiments exploring the impact of the non-hydrostatic dynamics at transition resolutions of 14- to 7-km, and the influence of increased horizontal/vertical resolution on convection and physical parameterizations within GEOS-5. Regional and mesoscale features of 5- to 10-day weather forecasts will be presented and compared with satellite observations. Our results will highlight the impact of resolution on the structure of cloud features including tropical convection and tropical cyclone predictability, cloud streets, von Karman vortices, and the marine stratocumulus cloud layer. We will also present the experiment design and early results from climate impact experiments for global non-hydrostatic models using GEOS-5. Our climate experiments will focus on support for the Year of Tropical Convection (YOTC). We will also discuss a seasonal climate time-slice experiment design for downscaling coarse-resolution century-scale climate simulations to global non-hydrostatic resolutions of 14- to 7-km with GEOS-5.

  20. Lesion Detection in CT Images Using Deep Learning Semantic Segmentation Technique

    NASA Astrophysics Data System (ADS)

    Kalinovsky, A.; Liauchuk, V.; Tarasau, A.

    2017-05-01

    In this paper, the problem of automatic detection of tuberculosis lesions in 3D lung CT images is considered as a benchmark for testing algorithms based on the modern concept of Deep Learning. For training and testing of the algorithms, a domestic dataset of 338 3D CT scans of tuberculosis patients with manually labelled lesions was used. Algorithms based on Deep Convolutional Networks were implemented and applied in three different ways: slice-wise lesion detection in 2D images using semantic segmentation, slice-wise lesion detection in 2D images using a sliding window technique, and straightforward detection of lesions via semantic segmentation in whole 3D CT scans. The algorithms demonstrate superior performance compared to algorithms based on conventional image analysis methods.
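    The sliding-window variant of the slice-wise detection can be sketched as below: a patch classifier is scanned over each 2-D slice and windows scored above a threshold are reported as lesion candidates. The `classify` callable stands in for the trained convolutional network; window size, stride, and threshold are illustrative.

```python
import numpy as np

def sliding_window_detect(slice_2d, classify, win=64, stride=32, thr=0.5):
    """Slice-wise lesion search over a 2-D CT slice.

    classify : callable mapping a (win, win) patch to a lesion probability.
    Returns the top-left (row, col) corners of windows scored >= thr.
    """
    h, w = slice_2d.shape
    hits = []
    for y in range(0, h - win + 1, stride):
        for x in range(0, w - win + 1, stride):
            if classify(slice_2d[y:y + win, x:x + win]) >= thr:
                hits.append((y, x))
    return hits
```

    Running this over every slice of a scan and merging overlapping hits gives the volume-level detections that the slice-wise approaches produce.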

  1. A deep learning model integrating FCNNs and CRFs for brain tumor segmentation.

    PubMed

    Zhao, Xiaomei; Wu, Yihong; Song, Guidong; Li, Zhenye; Zhang, Yazhuo; Fan, Yong

    2018-01-01

    Accurate and reliable brain tumor segmentation is a critical component in cancer diagnosis, treatment planning, and treatment outcome evaluation. Building upon successful deep learning techniques, a novel brain tumor segmentation method is developed by integrating fully convolutional neural networks (FCNNs) and Conditional Random Fields (CRFs) in a unified framework to obtain segmentation results with appearance and spatial consistency. We train a deep learning based segmentation model using 2D image patches and image slices in the following steps: 1) training FCNNs using image patches; 2) training CRFs as Recurrent Neural Networks (CRF-RNN) using image slices with the parameters of the FCNNs fixed; and 3) fine-tuning the FCNNs and the CRF-RNN using image slices. In particular, we train 3 segmentation models using 2D image patches and slices obtained in axial, coronal and sagittal views, respectively, and combine them to segment brain tumors using a voting-based fusion strategy. Our method can segment brain images slice-by-slice, much faster than methods based on image patches. We have evaluated our method on imaging data provided by the Multimodal Brain Tumor Image Segmentation Challenge (BRATS) 2013, BRATS 2015 and BRATS 2016. The experimental results demonstrate that our method can build a segmentation model with Flair, T1c, and T2 scans and achieve competitive performance compared with models built with Flair, T1, T1c, and T2 scans. Copyright © 2017 Elsevier B.V. All rights reserved.
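    The voting-based fusion of the three view-specific models can be sketched as a per-voxel majority vote. The abstract does not spell out a tie-breaking rule, so falling back to the axial prediction when all three views disagree is an assumption of this sketch.

```python
import numpy as np

def fuse_views(axial, coronal, sagittal):
    """Per-voxel majority vote over three label maps (one model per view).

    A label wins when at least two views agree; if all three disagree, the
    axial prediction is kept. Note that whenever coronal and sagittal agree,
    their label has at least two votes, so the vote reduces to the rule below.
    """
    out = axial.copy()
    agree = coronal == sagittal
    out[agree] = coronal[agree]
    return out
```

    This compact rule is exactly the 3-way majority vote: if coronal and sagittal differ, then either the axial label matches one of them (two votes, and the winner equals the axial label) or all three differ (tie, axial kept).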

  2. Grading of Chinese Cantonese Sausage Using Hyperspectral Imaging Combined with Chemometric Methods

    PubMed Central

    Gong, Aiping; Zhu, Susu; He, Yong; Zhang, Chu

    2017-01-01

    Fast and accurate grading of Chinese Cantonese sausage is an important concern for customers, organizations, and the industry. Hyperspectral imaging in the spectral range of 874–1734 nm, combined with chemometric methods, was applied to grade Chinese Cantonese sausage. Three grades of intact and sliced Cantonese sausages were studied: the top, first, and second grades. Support vector machine (SVM) and random forest (RF) techniques were used to build two different models. Second-derivative spectra and RF were applied to select optimal wavelengths. The optimal wavelengths were the same for intact and sliced sausages when selected from second-derivative spectra, while the optimal wavelengths for intact and sliced sausages selected using RF were quite similar. The SVM and RF models, using full spectra and the optimal wavelengths, obtained acceptable results for intact and sliced sausages. Both models for intact sausages performed better than those for sliced sausages, with classification accuracies of over 90% in both the calibration and prediction sets. The overall results indicated that hyperspectral imaging combined with chemometric methods can be used to grade Chinese Cantonese sausages, with intact sausages being better suited for grading. This study will help to develop fast and accurate online grading of Cantonese sausages, as well as other sausages. PMID:28757578
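    Wavelength selection from second-derivative spectra can be sketched as below: bands where the class-mean second derivatives differ most are the most discriminative. The between-class spread criterion is a simple stand-in for the selection procedure in the study (which also uses RF importance), and the data in the example are synthetic.

```python
import numpy as np

def select_wavelengths(spectra, labels, n_select=5):
    """Rank spectral bands by between-grade spread of the second derivative.

    spectra : (n_samples, n_bands) reflectance spectra
    labels  : (n_samples,) grade identifiers
    Returns the indices of the n_select most discriminative bands.
    """
    # per-sample second derivative along the wavelength axis
    d2 = np.gradient(np.gradient(spectra, axis=1), axis=1)
    class_means = np.stack([d2[labels == c].mean(axis=0)
                            for c in np.unique(labels)])
    spread = class_means.max(axis=0) - class_means.min(axis=0)
    return np.argsort(spread)[::-1][:n_select]
```

    A classifier (SVM or RF, as in the study) is then trained on either the full spectra or only the selected bands, the latter being what makes fast online grading practical.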

  3. Increased glutamate-stimulated norepinephrine release from prefrontal cortex slices of spontaneously hypertensive rats.

    PubMed

    Russell, V A; Wiggins, T M

    2000-12-01

    Spontaneously hypertensive rats (SHR) have behavioral characteristics (hyperactivity, impulsiveness, poorly sustained attention) similar to the behavioral disturbances of children with attention-deficit hyperactivity disorder (ADHD). We have previously shown that dopaminergic and noradrenergic systems are disturbed in the prefrontal cortex of SHR compared to their normotensive Wistar-Kyoto (WKY) control rats. It was of interest to determine whether the underlying neural circuits that use glutamate as a neurotransmitter function normally in the prefrontal cortex of SHR. An in vitro superfusion technique was used to demonstrate that glutamate caused a concentration-dependent stimulation of [3H]norepinephrine release from rat prefrontal cortex slices. Glutamate (100 microM and 1 mM) caused significantly greater release of norepinephrine from prefrontal cortex slices of SHR than from control slices. The effect of glutamate was not mediated by NMDA receptors, since NMDA (10 and 100 microM) did not exert any effect on norepinephrine release and MK-801 (10 microM) did not antagonize the effect of 100 microM glutamate. These results demonstrate that glutamate stimulates norepinephrine release from rat prefrontal cortex slices and that this increase is enhanced in SHR. The results are consistent with the suggestion that the noradrenergic system is overactive in prefrontal cortex of SHR, the animal model for ADHD.

  4. A novel diagnostic aid for intra-abdominal adhesion detection in cine-MR imaging: Pilot study and initial diagnostic impressions.

    PubMed

    Randall, David; Joosten, Frank; ten Broek, Richard; Gillott, Richard; Bardhan, Karna Dev; Strik, Chema; Prins, Wiesje; van Goor, Harry; Fenner, John

    2017-07-14

    A non-invasive diagnostic technique for abdominal adhesions is not currently available. Capture of abdominal motion due to respiration in cine-MRI has shown promise, but is difficult to interpret. This article explores the value of a complementary diagnostic aid to facilitate the non-invasive detection of abdominal adhesions using cine-MRI. An image processing technique was developed to quantify the amount of sliding that occurs between the organs of the abdomen and the abdominal wall in sagittal cine-MRI slices. The technique produces a 'sheargram' which depicts the amount of sliding that has occurred over 1-3 respiratory cycles. A retrospective cohort of 52 patients, scanned for suspected adhesions, made 281 cine-MRI sagittal slices available for processing. The resulting sheargrams were reported by two operators and compared to expert clinical judgement of the cine-MRI scans. The sheargram matched clinical judgement in 84% of all sagittal slices, and 93-96% of positive adhesions were identified on the sheargram. The sheargram displayed a slight skew towards sensitivity over specificity, with a high positive adhesion detection rate but at the expense of false positives. Good correlation between the sheargram and the absence/presence of inferred adhesions indicates that quantification of sliding motion has potential to aid adhesion detection in cine-MRI. Advances in Knowledge: This is the first attempt to clinically evaluate a novel image processing technique quantifying the sliding motion of the abdominal contents against the abdominal wall. The results of this pilot study reveal its potential as a diagnostic aid for detection of abdominal adhesions.
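    The sliding quantification can be sketched as the difference between two frame-to-frame displacements: one estimated on the abdominal-wall side of the interface and one on the visceral side. The integer-shift matching below is a crude stand-in for the registration step behind the sheargram, which the abstract does not specify in detail.

```python
import numpy as np

def displacement(p0, p1, max_shift=10):
    """Integer shift of intensity profile p1 relative to p0 (in pixels) that
    minimizes the sum-of-squares mismatch."""
    shifts = list(range(-max_shift, max_shift + 1))
    errs = [np.sum((np.roll(p1, -s) - p0) ** 2) for s in shifts]
    return shifts[int(np.argmin(errs))]

def sliding(wall0, wall1, organ0, organ1, max_shift=10):
    """Sliding between two frames: organ displacement minus wall displacement.
    Sustained near-zero values along the respiratory cycle would suggest the
    tethered motion an adhesion produces."""
    return (displacement(organ0, organ1, max_shift)
            - displacement(wall0, wall1, max_shift))
```

    Accumulating this quantity along the interface and over 1-3 respiratory cycles yields a sheargram-like map of where the abdominal contents slide freely.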

  5. Sniffer patch laser uncaging response (SPLURgE): an assay of regional differences in allosteric receptor modulation and neurotransmitter clearance

    PubMed Central

    Christian, Catherine A.

    2013-01-01

    Allosteric modulators exert actions on neurotransmitter receptors by positively or negatively altering the effective response of these receptors to their respective neurotransmitter. γ-Aminobutyric acid (GABA) type A ionotropic receptors (GABAARs) are major targets for allosteric modulators such as benzodiazepines, neurosteroids, and barbiturates. Analysis of substances that produce similar effects has been hampered by the lack of techniques to assess the localization and function of such agents in brain slices. Here we describe measurement of the sniffer patch laser uncaging response (SPLURgE), which combines the sniffer patch recording configuration with laser photolysis of caged GABA. This methodology enables the detection of allosteric GABAAR modulators endogenously present in discrete areas of the brain slice and allows for the application of exogenous GABA with spatiotemporal control without altering the release and localization of endogenous modulators within the slice. Here we demonstrate the development and use of this technique for the measurement of allosteric modulation in different areas of the thalamus. Application of this technique will be useful in determining whether a lack of modulatory effect on a particular category of neurons or receptors is due to insensitivity to allosteric modulation or a lack of local release of endogenous ligand. We also demonstrate that this technique can be used to investigate GABA diffusion and uptake. This method thus provides a biosensor assay for rapid detection of endogenous GABAAR modulators and has the potential to aid studies of allosteric modulators that exert effects on other classes of neurotransmitter receptors, such as glutamate, acetylcholine, or glycine receptors. PMID:23843428

  6. Sniffer patch laser uncaging response (SPLURgE): an assay of regional differences in allosteric receptor modulation and neurotransmitter clearance.

    PubMed

    Christian, Catherine A; Huguenard, John R

    2013-10-01

    Allosteric modulators exert actions on neurotransmitter receptors by positively or negatively altering the effective response of these receptors to their respective neurotransmitter. γ-Aminobutyric acid (GABA) type A ionotropic receptors (GABAARs) are major targets for allosteric modulators such as benzodiazepines, neurosteroids, and barbiturates. Analysis of substances that produce similar effects has been hampered by the lack of techniques to assess the localization and function of such agents in brain slices. Here we describe measurement of the sniffer patch laser uncaging response (SPLURgE), which combines the sniffer patch recording configuration with laser photolysis of caged GABA. This methodology enables the detection of allosteric GABAAR modulators endogenously present in discrete areas of the brain slice and allows for the application of exogenous GABA with spatiotemporal control without altering the release and localization of endogenous modulators within the slice. Here we demonstrate the development and use of this technique for the measurement of allosteric modulation in different areas of the thalamus. Application of this technique will be useful in determining whether a lack of modulatory effect on a particular category of neurons or receptors is due to insensitivity to allosteric modulation or a lack of local release of endogenous ligand. We also demonstrate that this technique can be used to investigate GABA diffusion and uptake. This method thus provides a biosensor assay for rapid detection of endogenous GABAAR modulators and has the potential to aid studies of allosteric modulators that exert effects on other classes of neurotransmitter receptors, such as glutamate, acetylcholine, or glycine receptors.

  7. Data Processing Methods for 3D Seismic Imaging of Subsurface Volcanoes: Applications to the Tarim Flood Basalt.

    PubMed

    Wang, Lei; Tian, Wei; Shi, Yongmin

    2017-08-07

    The morphology and structure of plumbing systems can provide key information on the eruption rate and style of basalt lava fields. The most powerful way to study subsurface geo-bodies is to use industrial 3D reflection seismological imaging. However, strategies to image subsurface volcanoes are very different from those used for oil and gas reservoirs. In this study, we process seismic data cubes from the Northern Tarim Basin, China, to illustrate how to visualize sills through opacity rendering techniques and how to image conduits by time-slicing. In the first case, we isolated probes by the seismic horizons marking the contacts between sills and encasing strata, applying opacity rendering techniques to extract sills from the seismic cube. The resulting detailed sill morphology shows that the flow direction is from the dome center to the rim. In the second seismic cube, we use time-slices to image the conduits, which correspond to marked discontinuities within the encasing rocks. A set of time-slices obtained at different depths shows that the Tarim flood basalts erupted from central volcanoes, fed by separate pipe-like conduits.
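    The two visualization strategies reduce to simple operations on the seismic volume. A minimal NumPy sketch (hypothetical helper names, assuming the cube is indexed inline × crossline × two-way time):

```python
import numpy as np

def time_slice(cube, t_index):
    # A time-slice is one constant-time plane through the
    # (inline, crossline, time) volume.
    return cube[:, :, t_index]

def opacity_mask(cube, threshold):
    # Crude stand-in for opacity rendering: voxels whose absolute
    # amplitude exceeds the threshold stay opaque, the rest become
    # transparent (NaN) so only strong reflectors (e.g. sills) remain.
    out = cube.astype(float)
    out[np.abs(out) < threshold] = np.nan
    return out
```

In practice the opacity transfer function would be applied between the picked horizons bounding the sills, rather than over the whole cube as here.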

  8. Cloud cover over the equatorial eastern Pacific derived from July 1983 International Satellite Cloud Climatology Project data using a hybrid bispectral threshold method

    NASA Technical Reports Server (NTRS)

    Minnis, Patrick; Harrison, Edwin F.; Gibson, Gary G.

    1987-01-01

    A set of visible and IR data obtained with GOES from July 17-31, 1983 is analyzed using a modified version of the hybrid bispectral threshold method developed by Minnis and Harrison (1984). This methodology can be divided into a set of procedures or optional techniques to determine the proper clear-sky temperature or IR threshold. The various optional techniques are described; the options are: standard, low-temperature limit, high-reflectance limit, low-reflectance limit, coldest pixel and thermal adjustment limit, IR-only low-cloud temperature limit, IR clear-sky limit, and IR overcast limit. Variations in the cloud parameters and the characteristics and diurnal cycles of trade cumulus and stratocumulus clouds over the eastern equatorial Pacific are examined. It is noted that the new method produces substantial changes in about one third of the cloud amount retrievals, and that low-cloud retrievals are affected most by the new constraints.

  9. A new characterization of supercooled clouds below 10,000 feet AGL

    NASA Technical Reports Server (NTRS)

    Masters, C. O.

    1985-01-01

    Icing caused by supercooled clouds below 10,000 feet was characterized with a view toward revising FAA ice-protection standards for civil aircraft. Current techniques in cloud physics were employed.

  10. In vitro comparison of the effect of different slice thicknesses on the accuracy of linear measurements on cone beam computed tomography images in implant sites.

    PubMed

    Shokri, Abbas; Khajeh, Samira

    2015-01-01

    Use of dental implants in edentulous patients has become a common treatment modality. Such treatment requires radiographic evaluation, and in most cases several different imaging techniques are necessary to evaluate the height, width, and structure of the bone at the implant site. In the current study, an attempt was made to evaluate the accuracy of measurements on cone beam computed tomography (CBCT) images with different slice thicknesses so that accurate data can be collected for proper clinical applications. In the present in vitro study, 11 human dry mandibles were used. The width and height of bone at the central incisor, canine, and molar areas were measured on the left and right sides by using digital calipers (as the gold standard) and on CBCT images with 0.5-, 1-, 2-, 3-, 5-, and 10-mm slice thicknesses. Data were analyzed with SPSS 16, using the paired t-test, Tukey test, and intraclass correlation. Data were collected by evaluation of 11 skulls and 63 samples in all. There were no significant differences in bone width in any area (P > 0.05). There were significant differences in bone height in the central incisor and molar areas (P = 0.02). The measurements did not differ significantly from the gold standard only at the 4-mm slice thickness option for width and the 5-mm slice thickness option for height (P = 0.513 and 0.173, respectively). The results did not show any significant differences between the observers (P = 0.329). The highest measurement accuracy of the CBCT software program was observed at 4-mm slice thickness for bone width and 5-mm slice thickness for bone height.
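    For reference, the paired t-test used here compares matched caliper and CBCT measurements site by site. A minimal sketch of the statistic itself (illustrative only, not the SPSS routine):

```python
import math

def paired_t_statistic(x, y):
    """Paired t statistic for two matched samples (e.g. caliper vs CBCT
    measurements at the same sites): t = mean(d) / (sd(d) / sqrt(n)),
    where d holds the per-site differences and sd is the sample
    standard deviation (n - 1 denominator)."""
    d = [a - b for a, b in zip(x, y)]
    n = len(d)
    mean = sum(d) / n
    var = sum((di - mean) ** 2 for di in d) / (n - 1)
    return mean / math.sqrt(var / n)
```

The resulting t would then be compared against the t distribution with n - 1 degrees of freedom to obtain the P values quoted in the abstract.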

  11. Three-dimensional computed tomography analysis of a patient with an unusual anatomy of the maxillary second and third molars.

    PubMed

    Zhao, Jin; Li, Yan; Yang, Zhi-Wei; Wang, Wei; Meng, Yan

    2011-10-01

    We present a case of a patient with rare anatomy of a maxillary second molar with three mesiobuccal root canals and a maxillary third molar with four separate roots, identified using multi-slice computed tomography (CT) and three-dimensional reconstruction techniques. The described case might enrich our knowledge of possible anatomical aberrations of maxillary molars. In addition, we demonstrate the role of multi-slice CT as an objective tool for confirmatory diagnosis and successful endodontic management.

  12. Retrieval of subvisual cirrus cloud optical thickness from limb-scatter measurements

    NASA Astrophysics Data System (ADS)

    Wiensz, J. T.; Degenstein, D. A.; Lloyd, N. D.; Bourassa, A. E.

    2013-01-01

    We present a technique for estimating the optical thickness of subvisual cirrus clouds detected by OSIRIS (Optical Spectrograph and Infrared Imaging System), a limb-viewing satellite instrument that measures scattered radiances from the UV to the near-IR. The measurement set is composed of a ratio of limb radiance profiles at two wavelengths that indicates the presence of cloud-scattering regions. Cross-sections and phase functions from an in situ database are used to simulate scattering by cloud particles. With the appropriate configurations discussed in this paper, the SASKTRAN successive-orders-of-scatter radiative transfer model is able to accurately simulate the in-cloud radiances from OSIRIS. Configured in this way, the model is used with a multiplicative algebraic reconstruction technique (MART) to retrieve the cloud extinction profile for an assumed effective cloud particle size. The sensitivity of these retrievals to key auxiliary model parameters is shown, and the retrieved extinction profile, for an assumed effective cloud particle size, is shown to model well the measured in-cloud radiances from OSIRIS. The greatest sensitivity of the retrieved optical thickness is to the effective cloud particle size. Since OSIRIS has an 11-yr record of subvisual cirrus cloud detections, this work provides a useful method for building a long-term global record of the properties of these clouds.
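    MART itself is a simple iterative scheme: each measurement rescales the state estimate multiplicatively by the ratio of observed to predicted values, which keeps the solution non-negative. A generic sketch on a linear system (the actual retrieval couples this update to SASKTRAN forward-model radiances; all names here are illustrative):

```python
import numpy as np

def mart(A, y, n_iter=50, lam=1.0):
    """Multiplicative algebraic reconstruction technique.

    Solves y ~ A x for non-negative x, visiting one measurement (row)
    at a time. A: (m, n) weighting matrix, y: (m,) positive data.
    """
    A = np.asarray(A, dtype=float)
    y = np.asarray(y, dtype=float)
    x = np.ones(A.shape[1])  # positive initial guess
    for _ in range(n_iter):
        for i in range(A.shape[0]):
            pred = A[i] @ x
            if pred > 0:
                # Multiplicative update; the exponent lam * A_ij makes
                # the correction proportional to each element's
                # contribution to measurement i.
                x *= (y[i] / pred) ** (lam * A[i])
    return x
```

Because the update is a ratio raised to a weighted power, elements that do not contribute to a measurement (A_ij = 0) are left untouched by that row.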

  13. Characterizing a New Surface-Based Shortwave Cloud Retrieval Technique, Based on Transmitted Radiance for Soil and Vegetated Surface Types

    NASA Technical Reports Server (NTRS)

    Coddington, Odele; Pilewskie, Peter; Schmidt, K. Sebastian; McBride, Patrick J.; Vukicevic, Tomislava

    2013-01-01

    This paper presents an approach using the GEneralized Nonlinear Retrieval Analysis (GENRA) tool and general inverse theory diagnostics, including the maximum likelihood solution and the Shannon information content, to investigate the performance of a new spectral technique for the retrieval of cloud optical properties from surface-based transmittance measurements. The cumulative retrieval information over broad ranges in cloud optical thickness (tau), droplet effective radius (r(sub e)), and overhead sun angle is quantified under two conditions known to impact transmitted radiation: the variability in land surface albedo and atmospheric water vapor content. Our conclusions are: (1) the retrieved cloud properties are more sensitive to the natural variability in land surface albedo than to water vapor content; (2) the new spectral technique is more accurate (but still imprecise) than a standard approach, in particular for tau between 5 and 60 and r(sub e) less than approximately 20 micrometers; and (3) the retrieved cloud properties are dependent on sun angle for clouds of tau from 5 to 10 and r(sub e) less than 10 micrometers, with maximum sensitivity obtained for an overhead sun.
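    For Gaussian statistics, the Shannon information content used as a diagnostic here has a compact closed form: half the base-2 logarithm of the ratio of prior to posterior covariance determinants, i.e. how much the measurement shrinks the prior state uncertainty. A minimal sketch (illustrative function name, not the GENRA interface):

```python
import numpy as np

def shannon_information_content(S_prior, S_post):
    """Shannon information content of a retrieval, in bits:
    H = 0.5 * log2( det(S_prior) / det(S_post) ).
    slogdet is used for numerical stability with small determinants."""
    _, logdet_prior = np.linalg.slogdet(S_prior)
    _, logdet_post = np.linalg.slogdet(S_post)
    return 0.5 * (logdet_prior - logdet_post) / np.log(2.0)
```

For example, halving the prior standard deviation of each of two independent state elements yields H = 2 bits.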

  14. A New Heuristic Anonymization Technique for Privacy Preserved Datasets Publication on Cloud Computing

    NASA Astrophysics Data System (ADS)

    Aldeen Yousra, S.; Mazleena, Salleh

    2018-05-01

    Recent advancement in Information and Communication Technologies (ICT) has increased demand for cloud services that share users' private data. Data from various organizations are a vital information source for analysis and research. Generally, this sensitive or private information involves medical, census, voter registration, social network, and customer service records. A primary concern of cloud service providers in data publishing is to hide the sensitive information of individuals. One of the cloud services that addresses these confidentiality concerns is Privacy Preserving Data Mining (PPDM). The PPDM service in Cloud Computing (CC) enables data publishing with minimized distortion and absolute privacy. In this method, datasets are anonymized via generalization to accomplish the privacy requirements. However, the well-known privacy preserving data mining technique called K-anonymity suffers from several limitations. To surmount those shortcomings, we propose a new heuristic anonymization framework for preserving the privacy of sensitive datasets when publishing on the cloud. The advantages of the K-anonymity, L-diversity and (α, k)-anonymity methods for efficient information utilization and privacy protection are emphasized. Experimental results revealed the superiority of the developed technique over the K-anonymity, L-diversity, and (α, k)-anonymity measures.
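    K-anonymity, the baseline the proposed framework improves on, requires that every combination of quasi-identifier values be shared by at least k records, so no individual stands out within the published table. A minimal check (hypothetical field names; the generalization step that produces the starred values is not shown):

```python
from collections import Counter

def k_anonymity(records, quasi_identifiers):
    """Smallest equivalence-class size over the quasi-identifier
    attributes; the dataset is k-anonymous for any k up to this value."""
    groups = Counter(tuple(r[a] for a in quasi_identifiers) for r in records)
    return min(groups.values())

def is_k_anonymous(records, quasi_identifiers, k):
    return k_anonymity(records, quasi_identifiers) >= k
```

L-diversity and (α, k)-anonymity tighten this further by also constraining the distribution of the sensitive attribute within each equivalence class.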

  15. COLLABORATIVE RESEARCH:USING ARM OBSERVATIONS & ADVANCED STATISTICAL TECHNIQUES TO EVALUATE CAM3 CLOUDS FOR DEVELOPMENT OF STOCHASTIC CLOUD-RADIATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Somerville, Richard

    2013-08-22

    The long-range goal of several past and current projects in our DOE-supported research has been the development of new and improved parameterizations of cloud-radiation effects and related processes, using ARM data, and the implementation and testing of these parameterizations in global models. The main objective of the present project has been to develop and apply advanced statistical techniques, including Bayesian posterior estimates, to diagnose and evaluate features of both observed and simulated clouds. The research carried out under this project has been novel in two important ways. The first is that it is a key step in the development of practical stochastic cloud-radiation parameterizations, a new category of parameterizations that offers great promise for overcoming many shortcomings of conventional schemes. The second is that this work has brought powerful new tools to bear on the problem, because it has been a collaboration between a meteorologist with long experience in ARM research (Somerville) and a mathematician who is an expert on a class of advanced statistical techniques that are well-suited for diagnosing model cloud simulations using ARM observations (Shen).

  16. Imaging neuronal responses in slice preparations of vomeronasal organ expressing a genetically encoded calcium sensor.

    PubMed

    Ma, Limei; Haga-Yamanaka, Sachiko; Yu, Qingfeng Elden; Qiu, Qiang; Kim, Sangseong; Yu, C Ron

    2011-12-06

    The vomeronasal organ (VNO) detects chemosensory signals that carry information about the social, sexual and reproductive status of the individuals within the same species. These intraspecies signals, the pheromones, as well as signals from some predators, activate the vomeronasal sensory neurons (VSNs) with high levels of specificity and sensitivity. At least three distinct families of G-protein coupled receptors, V1R, V2R and FPR, are expressed in VNO neurons to mediate the detection of the chemosensory cues. To understand how pheromone information is encoded by the VNO, it is critical to analyze the response profiles of individual VSNs to various stimuli and identify the specific receptors that mediate these responses. The neuroepithelia of VNO are enclosed in a pair of vomer bones. The semi-blind tubular structure of VNO has one open end (the vomeronasal duct) connecting to the nasal cavity. VSNs extend their dendrites to the lumen part of the VNO, where the pheromone cues are in contact with the receptors expressed at the dendritic knobs. The cell bodies of the VSNs form pseudo-stratified layers with V1R and V2R expressed in the apical and basal layers respectively. Several techniques have been utilized to monitor responses of VSNs to sensory stimuli. Among these techniques, acute slice preparation offers several advantages. First, compared to dissociated VSNs, slice preparations maintain the neurons in their native morphology and the dendrites of the cells stay relatively intact. Second, the cell bodies of the VSNs are easily accessible in coronal slice of the VNO to allow electrophysiology studies and imaging experiments as compared to whole epithelium and whole-mount preparations. Third, this method can be combined with molecular cloning techniques to allow receptor identification. Sensory stimulation elicits strong Ca2+ influx in VSNs that is indicative of receptor activation. 
We thus develop transgenic mice that express G-CaMP2 in the olfactory sensory neurons, including the VSNs. The sensitivity and the genetic nature of the probe greatly facilitate Ca2+ imaging experiments. This method has eliminated the dye loading process used in previous studies. We also employ a ligand delivery system that enables application of various stimuli to the VNO slices. The combination of the two techniques allows us to monitor multiple neurons simultaneously in response to large numbers of stimuli. Finally, we have established a semi-automated analysis pipeline to assist image processing.
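    A core step in any such Ca2+ imaging pipeline is converting raw G-CaMP2 fluorescence into a normalized response. A minimal ΔF/F sketch (illustrative only; the authors' semi-automated pipeline is not specified in this abstract):

```python
import numpy as np

def delta_f_over_f(trace, baseline_frames=10):
    """ΔF/F for one ROI's fluorescence time series: the baseline F0 is
    the mean of the pre-stimulus frames, and the response is
    (F - F0) / F0, so a doubling of fluorescence reads as 1.0."""
    trace = np.asarray(trace, dtype=float)
    f0 = trace[:baseline_frames].mean()
    return (trace - f0) / f0
```

Applying this per neuron, with a response threshold on the peak ΔF/F, yields the stimulus-response profiles described above for many VSNs simultaneously.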

  17. Automatic segmentation of left ventricle in cardiac cine MRI images based on deep learning

    NASA Astrophysics Data System (ADS)

    Zhou, Tian; Icke, Ilknur; Dogdas, Belma; Parimal, Sarayu; Sampath, Smita; Forbes, Joseph; Bagchi, Ansuman; Chin, Chih-Liang; Chen, Antong

    2017-02-01

    In developing treatment of cardiovascular diseases, short axis cine MRI has been used as a standard technique for understanding the global structural and functional characteristics of the heart, e.g. ventricle dimensions, stroke volume and ejection fraction. To conduct an accurate assessment, heart structures need to be segmented from the cine MRI images with high precision, which could be a laborious task when performed manually. Herein a fully automatic framework is proposed for the segmentation of the left ventricle from the slices of short axis cine MRI scans of porcine subjects using a deep learning approach. For training the deep learning models, which generally requires a large set of data, a public database of human cine MRI scans is used. Experiments on the 3150 cine slices of 7 porcine subjects have shown that when comparing the automatic and manual segmentations the mean slice-wise Dice coefficient is about 0.930, the point-to-curve error is 1.07 mm, and the mean slice-wise Hausdorff distance is around 3.70 mm, which demonstrates the accuracy and robustness of the proposed inter-species translational approach.
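    The slice-wise Dice coefficient reported above measures the overlap of the automatic and manual left-ventricle masks. A minimal sketch:

```python
import numpy as np

def dice_coefficient(seg_a, seg_b):
    """Dice overlap between two binary masks: 2*|A∩B| / (|A| + |B|).
    1.0 means identical masks; 0.0 means no overlap."""
    a = np.asarray(seg_a, dtype=bool)
    b = np.asarray(seg_b, dtype=bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return 1.0  # both masks empty: perfect agreement by convention
    return 2.0 * np.logical_and(a, b).sum() / denom
```

The mean of this value over all 3150 slices is the 0.930 figure quoted in the abstract; point-to-curve error and Hausdorff distance instead measure contour-boundary disagreement in millimetres.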

  18. Pulmonary parenchyma segmentation in thin CT image sequences with spectral clustering and geodesic active contour model based on similarity

    NASA Astrophysics Data System (ADS)

    He, Nana; Zhang, Xiaolong; Zhao, Juanjuan; Zhao, Huilan; Qiang, Yan

    2017-07-01

    While the popular thin-layer scanning technology of spiral CT has helped to improve diagnoses of lung diseases, the large volumes of scanning images produced by the technology also dramatically increase the load on physicians in lesion detection. Computer-aided diagnosis techniques like lesion segmentation in thin CT sequences have been developed to address this issue, but it remains a challenge to achieve high segmentation efficiency and accuracy without much human manual intervention. In this paper, we present our research on automated segmentation of lung parenchyma with an improved geodesic active contour model based on similarity (GACBS). Combining a spectral clustering algorithm based on the Nyström method (SCN) with GACBS, this algorithm first extracts key image slices, then uses these slices to generate an initial contour of the pulmonary parenchyma of un-segmented slices with an interpolation algorithm, and finally segments the lung parenchyma of the un-segmented slices. Experimental results show that the segmentation results generated by our method are close to what manual segmentation can produce, with an average volume overlap ratio of 91.48%.

  19. A sandwich-like differential B-dot based on EACVD polycrystalline diamond slice

    NASA Astrophysics Data System (ADS)

    Xu, P.; Yu, Y.; Xu, L.; Zhou, H. Y.; Qiu, C. J.

    2018-06-01

    In this article, we present a method for mass production of a standardized high-performance differential B-dot magnetic probe, together with magnetic field measurements in a pulsed current device with currents up to hundreds of kiloamperes. A polycrystalline diamond slice produced in an Electron Assisted Chemical Vapor Deposition device is used as the base and insulating material on which to imprint two symmetric differential loops for the magnetic field measurement. The sp3 carbon bond in the cubic lattice structure of diamond is confirmed by Raman spectra. The thickness of this slice is 20 μm. A gold loop is imprinted onto each surface of the slice by using the photolithography technique. The inner diameter, width, and thickness of each loop are 0.8 mm, 50 μm, and 1 μm, respectively. This provides a way of measuring pulsed magnetic fields with high spatial and temporal resolution, especially in limited space. The differential magnetic probe has demonstrated a very good common-mode rejection ratio in pulsed magnetic field measurements.
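    A B-dot loop outputs a voltage proportional to dB/dt, so the field itself is recovered by time-integrating the trace and dividing by the turns-area product. A minimal sketch (the default area is only an order-of-magnitude placeholder for a loop of 0.8 mm inner diameter, not a calibrated value):

```python
import numpy as np

def b_from_bdot(v, dt, n_turns=1, area=5.0e-7):
    """Recover B(t) in tesla from a B-dot probe voltage trace.

    The probe output is V = N * A * dB/dt, so B is the running time
    integral of V divided by the effective turns-area product N * A.
    v: sampled voltages (V); dt: sample spacing (s); area: loop area (m^2).
    """
    v = np.asarray(v, dtype=float)
    # Trapezoidal running integral of V(t), starting from B = 0.
    integral = np.concatenate(([0.0], np.cumsum((v[1:] + v[:-1]) * dt / 2.0)))
    return integral / (n_turns * area)
```

In a differential probe, v would be the difference of the two loop signals, which cancels common-mode electrostatic pickup before this integration.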

  20. Non-destructive evaluation of laboratory scale hydraulic fracturing using acoustic emission

    NASA Astrophysics Data System (ADS)

    Hampton, Jesse Clay

    The primary objective of this research is to develop techniques to characterize hydraulic fractures and fracturing processes using acoustic emission (AE) monitoring based on laboratory scale hydraulic fracturing experiments. Individual microcrack AE source characterization is performed to understand the failure mechanisms associated with small failures along pre-existing discontinuities and grain boundaries. Individual microcrack analysis methods include moment tensor inversion techniques to elucidate the mode of failure, crack slip and crack normal direction vectors, and the relative volumetric deformation of an individual microcrack. Differentiation between individual microcrack analysis and AE cloud based techniques is studied in efforts to refine discrete fracture network (DFN) creation and regional damage quantification of densely fractured media. Regional damage estimations from combinations of individual microcrack analyses and AE cloud density plotting are used to investigate the usefulness of weighting cloud based AE analysis techniques with microcrack source data. Two granite types were used in several sample configurations, including multi-block systems. Laboratory hydraulic fracturing was performed with sample sizes ranging from 15 × 15 × 25 cm to 30 × 30 × 25 cm in both unconfined and true-triaxially confined stress states using different types of materials. Hydraulic fracture testing in rock block systems containing a large natural fracture was investigated in terms of the AE response throughout fracture interactions. Investigations at differing scales showed the usefulness of individual microcrack characterization as well as DFN and cloud based techniques. Individual microcrack characterization weighting cloud based techniques correlated well with post-test damage evaluations.

  1. Study to determine cloud motion from meteorological satellite data

    NASA Technical Reports Server (NTRS)

    Clark, B. B.

    1972-01-01

    Processing techniques were tested for deducing cloud motion vectors from overlapped portions of pairs of pictures made from meteorological satellites. This was accomplished by programming and testing techniques for estimating pattern motion by means of cross correlation analysis with emphasis placed upon identifying and reducing errors resulting from various factors. Techniques were then selected and incorporated into a cloud motion determination program which included a routine which would select and prepare sample array pairs from the preprocessed test data. The program was then subjected to limited testing with data samples selected from the Nimbus 4 THIR data provided by the 11.5 micron channel.
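    The cross-correlation step at the heart of this approach can be sketched compactly: the peak of the 2-D cross-correlation of two co-registered picture patches gives the cloud displacement between frames. A minimal FFT-based sketch (illustrative, not the original program; it assumes circular wrap-around and patches of the same size):

```python
import numpy as np

def cloud_motion(patch1, patch2):
    """Displacement (dy, dx) of patch2 relative to patch1, taken from
    the peak of their circular cross-correlation."""
    f1 = np.fft.fft2(patch1 - np.mean(patch1))
    f2 = np.fft.fft2(patch2 - np.mean(patch2))
    corr = np.fft.ifft2(f1.conj() * f2).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Unwrap circular lags into signed displacements.
    ny, nx = corr.shape
    if dy > ny // 2:
        dy -= ny
    if dx > nx // 2:
        dx -= nx
    return int(dy), int(dx)
```

Dividing the displacement by the inter-picture time and the pixel ground resolution converts it into a cloud motion vector, and the sharpness of the correlation peak gives a quality measure for rejecting ambiguous patch pairs.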

  2. Night Sky Weather Monitoring System Using Fish-Eye CCD

    NASA Astrophysics Data System (ADS)

    Tomida, Takayuki; Saito, Yasunori; Nakamura, Ryo; Yamazaki, Katsuya

    Telescope Array (TA) is an international joint experiment observing ultra-high energy cosmic rays. TA employs the fluorescence detection technique to observe cosmic rays. In this technique, the existence of cloud significantly affects the quality of the data. Therefore, cloud monitoring provides important information. We are developing two new methods for evaluating night sky weather with pictures taken by a charge-coupled device (CCD) camera. One evaluates the amount of cloud from pixel brightness. The other counts the number of stars with a contour detection technique. The results of these methods show clear correlation, and we conclude that both analyses are reasonable methods for weather monitoring. We discuss the reliability of the star counting method.
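    The brightness-based method reduces to thresholding sky pixels. A minimal sketch of the cloud-amount estimate (the threshold is a hypothetical calibration value; the contour-based star counting is not shown):

```python
import numpy as np

def cloud_fraction(image, threshold):
    """Fraction of sky pixels brighter than the threshold. On a
    moonless night, cloud scatters ambient light and reads brighter
    than clear sky in a fish-eye CCD frame."""
    img = np.asarray(image, dtype=float)
    return float((img > threshold).mean())
```

Comparing this fraction against the number of detected stars frame by frame is what produces the correlation reported above.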

  3. Field Impact Evaluation Process on Electronic Tabular Display Subsystem (ETABS).

    DTIC Science & Technology

    1979-10-01

    structural and process techniques are described. These include a diagonal slice approach to team formulation and several different methods of team building, process control and conflict management. (Author)

  4. Automatic online adaptive radiation therapy techniques for targets with significant shape change: a feasibility study.

    PubMed

    Court, Laurence E; Tishler, Roy B; Petit, Joshua; Cormack, Robert; Chin, Lee

    2006-05-21

    This work looks at the feasibility of an online adaptive radiation therapy concept that would detect the daily position and shape of the patient, and would then correct the daily treatment to account for any changes compared with planning position. In particular, it looks at the possibility of developing algorithms to correct for large complicated shape change. For co-planar beams, the dose in an axial plane is approximately associated with the positions of a single multi-leaf collimator (MLC) pair. We start with a primary plan, and automatically generate several secondary plans with gantry angles offset by regular increments. MLC sequences for each plan are calculated keeping monitor units (MUs) and number of segments constant for a given beam (fluences are different). Bulk registration (3D) of planning and daily CT images gives global shifts. Slice-by-slice (2D) registration gives local shifts and rotations about the longitudinal axis for each axial slice. The daily MLC sequence is then created for each axial slice/MLC leaf pair combination, by taking the MLC positions from the pre-calculated plan with the nearest rotation, and shifting using a beam's-eye-view calculation to account for local linear shifts. A planning study was carried out using two head and neck region MR images of a healthy volunteer which were contoured to simulate a base-of-tongue treatment: one with the head straight (used to simulate the planning image) and the other with the head tilted to the left (the daily image). Head and neck treatment was chosen to evaluate this technique because of its challenging nature, with varying internal and external contours, and multiple degrees of freedom. Shape change was significant: on a slice-by-slice basis, local rotations in the daily image varied from 2 to 31 degrees, and local shifts ranged from -0.2 to 0.5 cm and -0.4 to 0.0 cm in right-left and posterior-anterior directions, respectively. 
The adapted treatment gave reasonable target coverage (100%, 90% and 80% of the base-of-tongue, left nodes and right nodes, respectively, receiving the daily prescription dose), and kept the daily cord dose below the limit used in the original plan (65%, equivalent to 46 Gy over 35 fractions). Most of the loss of coverage was due to one shoulder being raised more superior relative to the other shoulder compared with the plan. This type of skew-like motion is not accounted for by the proposed ART technique. In conclusion, this technique has potential to correct for fairly extreme daily changes in patient setup, but some control of the daily position would still be necessary. Importantly, it was possible to combine treatments from different plans (MLC sequences) to correct for position and shape change.
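    The per-slice adaptation step — select the pre-calculated plan whose gantry offset is nearest the slice's measured rotation, then apply the slice's local shift to that plan's MLC leaf positions — can be sketched as follows (hypothetical data layout; the beam's-eye-view projection of the shift is omitted for brevity):

```python
def adapt_mlc(precalc_plans, slice_rotation_deg, local_shift_mm):
    """Daily MLC positions for one axial slice / leaf pair.

    precalc_plans: list of dicts, each with the plan's gantry 'rotation'
    offset (degrees) and its 'mlc_positions' (leaf coordinates, mm).
    Picks the plan with the nearest rotation, then translates its
    leaves by the slice's registered in-plane shift.
    """
    best = min(precalc_plans,
               key=lambda p: abs(p["rotation"] - slice_rotation_deg))
    return [leaf + local_shift_mm for leaf in best["mlc_positions"]]
```

Running this over every slice/leaf-pair combination assembles the full daily MLC sequence while keeping monitor units and segment counts fixed, as described above.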

  5. Feasibility of Quantifying Arterial Cerebral Blood Volume Using Multiphase Alternate Ascending/Descending Directional Navigation (ALADDIN).

    PubMed

    Kim, Ki Hwan; Choi, Seung Hong; Park, Sung-Hong

    2016-01-01

    Arterial cerebral blood volume (aCBV) is associated with many physiologic and pathologic conditions. Recently, multiphase balanced steady state free precession (bSSFP) readout was introduced to measure labeled blood signals in the arterial compartment, based on the fact that the signal difference between labeled and unlabeled blood decreases with the number of RF pulses, which is affected by blood velocity. In this study, we evaluated the feasibility of a new 2D inter-slice bSSFP-based arterial spin labeling (ASL) technique termed alternate ascending/descending directional navigation (ALADDIN) to quantify aCBV using multiphase acquisition in six healthy subjects. A new kinetic model considering bSSFP RF perturbations was proposed to describe the multiphase data and thus to quantify aCBV. Since the inter-slice time delay (TD) and gap affected the distribution of labeled blood spins in the arterial and tissue compartments, we performed the experiments with two TDs (0 and 500 ms) and two gaps (300% and 450% of slice thickness) to evaluate their roles in quantifying aCBV. Comparison studies using our technique and an existing method termed arterial volume using arterial spin tagging (AVAST) were also separately performed in five subjects. At 300% gap or 500-ms TD, significant tissue perfusion signals were demonstrated, while tissue perfusion signals were minimized and arterial signals were maximized at 450% gap and 0-ms TD. ALADDIN has the advantage of visualizing bi-directional flow effects (ascending/descending) in a single experiment. Labeling efficiency (α) of inter-slice blood flow effects could be measured in the superior sagittal sinus (SSS) (20.8±3.7%) and was used for aCBV quantification. As a result of fitting to the proposed model, aCBV values in gray matter (1.4-2.3 mL/100 mL) were in good agreement with those from the literature.
Our technique showed high correlation with AVAST, especially when arterial signals were accentuated (i.e., when TD = 0 ms) (r = 0.53). The bi-directional perfusion imaging with multiphase ALADDIN approach can be an alternative to existing techniques for quantification of aCBV.

  6. A Slice of Orion

    NASA Technical Reports Server (NTRS)

    2006-01-01

    [figure removed for brevity, see original site] Extended Orion Nebula Cloud

    This image composite shows a part of the Orion constellation surveyed by NASA's Spitzer Space Telescope. The shape of the main image was designed by astronomers to roughly follow the shape of Orion cloud A, an enormous star-making factory containing about 1,800 young stars. This giant cloud includes the famous Orion nebula (bright circular area in 'blade' part of hockey stick-shaped box at the bottom), which is visible to the naked eye on a clear, dark night as a fuzzy star in the hunter constellation's sword.

    The region that makes up the shaft part of the hockey-stick box stretches 70 light-years beyond the Orion nebula. This particular area does not contain massive young stars like those of the Orion nebula, but is filled with 800 stars about the same mass as the sun. These sun-like stars don't live in big 'cities,' or clusters, of stars like the one in the Orion nebula; instead, they can be found in small clusters (right inset) or in relative isolation (middle inset).

    In the right inset, developing stars are illuminating the dusty cloud, creating small wisps that appear greenish. The stars also power speedy jets of gas (also green), which glow as the jets ram into the cloudy material.

    Since infrared light can penetrate through dust, we see not only stars within the cloud, but thousands of stars many light-years behind it, which just happen to be in the picture like unwanted bystanders. Astronomers carefully separate the young stars in the Orion cloud complex from the bystanders by looking for their telltale infrared glow.

    The infrared image shows light captured by Spitzer's infrared array camera. Light with wavelengths of 8 and 5.8 microns (red and orange) comes mainly from dust that has been heated by starlight. Light of 4.5 microns (green) shows hot gas and dust; and light of 3.6 microns (blue) is from starlight.

  7. Graph-based retrospective 4D image construction from free-breathing MRI slice acquisitions

    NASA Astrophysics Data System (ADS)

    Tong, Yubing; Udupa, Jayaram K.; Ciesielski, Krzysztof C.; McDonough, Joseph M.; Mong, Andrew; Campbell, Robert M.

    2014-03-01

    4D or dynamic imaging of the thorax has many potential applications [1, 2]. CT and MRI offer sufficient speed to acquire motion information via 4D imaging, but they have different constraints and requirements. For both modalities, both prospective and retrospective respiratory gating and tracking techniques have been developed [3, 4]. For pediatric imaging, x-ray radiation becomes a primary concern and MRI remains the de facto choice. The pediatric subjects we deal with often suffer from extreme malformations of the chest wall, diaphragm, and/or spine, such that the patient cooperation needed by some gating and tracking techniques is difficult to obtain without causing discomfort. Moreover, we are interested in the mechanical function of the thorax in its natural form during tidal breathing. Free-breathing MRI acquisition is therefore the ideal imaging approach for these patients. In our setup, for each coronal (or sagittal) slice position, slice images are acquired at a rate of about 200-300 ms/slice over several natural breathing cycles. This typically produces several thousand slices containing both anatomic and dynamic information; however, it is not trivial to form a consistent and well-defined 4D volume from these data. In this paper, we present a novel graph-based combinatorial optimization solution for constructing the best possible 4D scene from such data entirely in the digital domain. The proposed method is purely image-based and does not need breath holding or any external surrogates or instruments to record respiratory motion or tidal volume. Data from both adult and pediatric patients are used to illustrate the performance of the proposed method. Experimental results show that the reconstructed 4D scenes are smooth and consistent spatially and temporally, agreeing with the known shape and motion of the lungs.
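
    The graph-based selection idea can be illustrated with a toy dynamic-programming sketch. This is not the authors' actual graph construction or cost function; as a stated assumption, each spatial slice position offers several candidate acquisitions (frames captured at different points of the breathing cycle), each summarized here by a single scalar feature, and we pick one candidate per position so that adjacent chosen slices are as consistent as possible.

```python
# Hypothetical layered-graph (Viterbi-style) slice selection. candidates[p]
# is a list of scalar features (e.g. a lung dimension) for the acquisitions
# available at slice position p; we minimize the summed feature difference
# between the chosen slices at adjacent positions.

def select_slices(candidates):
    """Returns (total_cost, chosen_indices) minimizing sum |f_p - f_{p+1}|."""
    n = len(candidates)
    # cost[p][i]: best cumulative cost ending with candidate i at position p
    cost = [[0.0] * len(candidates[0])]
    back = []
    for p in range(1, n):
        row, brow = [], []
        for f in candidates[p]:
            best_j = min(range(len(candidates[p - 1])),
                         key=lambda j: cost[-1][j] + abs(candidates[p - 1][j] - f))
            row.append(cost[-1][best_j] + abs(candidates[p - 1][best_j] - f))
            brow.append(best_j)
        cost.append(row)
        back.append(brow)
    # backtrack from the cheapest terminal candidate
    i = min(range(len(candidates[-1])), key=lambda k: cost[-1][k])
    total = cost[-1][i]
    chosen = [i]
    for brow in reversed(back):
        i = brow[i]
        chosen.append(i)
    chosen.reverse()
    return total, chosen
```

    A real implementation would replace the scalar feature with an image-similarity cost between neighboring slices, which is where the combinatorial-optimization machinery of the paper comes in.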

  8. Synthetic Minority Oversampling Technique and Fractal Dimension for Identifying Multiple Sclerosis

    NASA Astrophysics Data System (ADS)

    Zhang, Yu-Dong; Zhang, Yin; Phillips, Preetha; Dong, Zhengchao; Wang, Shuihua

    Multiple sclerosis (MS) is a severe brain disease. Early detection can provide timely treatment. Fractal dimension provides a statistical index of how patterns change with scale in a given brain image. In this study, our team used the susceptibility-weighted imaging technique to obtain 676 MS slices and 880 healthy slices. We used the synthetic minority oversampling technique to process the unbalanced dataset. Then, we used a Canny edge detector to extract distinguishing edges. The Minkowski-Bouligand dimension, a fractal dimension estimation method, was used to extract features from the edges. A single-hidden-layer neural network was used as the classifier. Finally, we proposed a three-segment-representation biogeography-based optimization to train the classifier. Our method achieved a sensitivity of 97.78±1.29%, a specificity of 97.82±1.60% and an accuracy of 97.80±1.40%. The proposed method is superior to seven state-of-the-art methods in terms of sensitivity and accuracy.
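
    The Minkowski-Bouligand (box-counting) dimension used as the feature extractor above can be sketched as follows. The SWI acquisition, SMOTE balancing and Canny edge step are not reproduced here, and the box sizes are illustrative; only the dimension estimator itself is shown.

```python
import math

# Box-counting estimate of the Minkowski-Bouligand (fractal) dimension of a
# set of edge pixels, such as the output of an edge detector: count occupied
# boxes at several box sizes, then fit the slope of log N(s) vs log(1/s).

def box_counting_dimension(points, sizes=(1, 2, 4, 8)):
    """points: iterable of (x, y) integer pixel coordinates of the edge set.
    Returns the least-squares slope of log N(s) vs log(1/s)."""
    xs, ys = [], []
    for s in sizes:
        boxes = {(x // s, y // s) for x, y in points}
        xs.append(math.log(1.0 / s))
        ys.append(math.log(len(boxes)))
    # ordinary least-squares slope
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
           sum((x - mx) ** 2 for x in xs)
```

    As sanity checks, a filled square of pixels yields a dimension near 2 and a straight line of pixels yields a dimension near 1; edge sets of MS lesions fall in between.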

  9. NOTE: Optimization of megavoltage CT scan registration settings for thoracic cases on helical tomotherapy

    NASA Astrophysics Data System (ADS)

    Woodford, Curtis; Yartsev, Slav; Van Dyk, Jake

    2007-08-01

    This study aims to investigate the settings that provide optimum registration accuracy when registering megavoltage CT (MVCT) studies acquired on tomotherapy with planning kilovoltage CT (kVCT) studies of patients with lung cancer. For each experiment, the systematic difference between the actual and planned positions of the thorax phantom was determined by setting the phantom up at the planning isocenter, then generating and registering an MVCT study. The phantom was translated by 5 or 10 mm, MVCT scanned, and registration was performed again. A root-mean-square equation that calculated the residual error of the registration based on the known shift and systematic difference was used to assess the accuracy of the registration process. The phantom study results for 18 combinations of different MVCT/kVCT registration options are presented and compared with clinical registration data from 17 lung cancer patients. MVCT studies acquired with coarse (6 mm), normal (4 mm) and fine (2 mm) slice spacings could all be registered with similar residual errors. No specific combination of resolution and fusion selection technique resulted in a lower residual error. A scan length of 6 cm with any slice spacing, registered with the full-image fusion selection technique and fine resolution, will result in a low residual error most of the time. Large corrections made manually by clinicians to the automatic registration values are infrequent. Small manual corrections within the residual error averages of the registration process occur, but their impact on the average patient position is small. Registrations using the full-image fusion selection technique and fine resolution of 6 cm MVCT scans with coarse slices have a low residual error, and this strategy can be used clinically for lung cancer patients treated on tomotherapy. Automatic registration values are accurate on average, and a quick verification on a sagittal MVCT slice should be enough to detect registration outliers.
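
    The root-mean-square residual-error metric described above can be sketched as follows. The exact equation is not given in the abstract, so this assumes the per-axis residual is the registration result minus the known applied shift and the systematic setup difference, averaged over the three translation axes; all names are illustrative.

```python
import math

# Hypothetical residual-error metric: the phantom is shifted by a known
# amount, the registration reports a correction, and the residual is the RMS
# difference after removing the systematic offset measured at the isocenter.

def residual_error(known_shift, registered, systematic):
    """Each argument is an (x, y, z) tuple in mm; returns the RMS residual."""
    return math.sqrt(sum(
        (r - s - k) ** 2
        for k, r, s in zip(known_shift, registered, systematic)
    ) / 3.0)
```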

  10. Spatially offset Raman spectroscopy for photon migration investigations in long bone

    NASA Astrophysics Data System (ADS)

    Sowoidnich, Kay; Churchwell, John H.; Buckley, Kevin; Kerns, Jemma G.; Goodship, Allen E.; Parker, Anthony W.; Matousek, Pavel

    2015-07-01

    Raman spectroscopy has become an important technique for assessing the composition of excised sections of bone, and is currently being developed as an in vivo tool for transcutaneous detection of bone disease using spatially offset Raman spectroscopy (SORS). The sampling volume of the Raman technique (and thus the amount of bone material interrogated by SORS) depends on the nature of photon scattering in the probed tissue. Bone is a complex hierarchical material, and to date little is known about its diffuse scattering properties, which are important for the development and optimization of SORS as a diagnostic tool for characterizing bone disease in vivo. SORS measurements at an 830 nm excitation wavelength are carried out on stratified samples to determine the depth from which the Raman signal originates within bone tissue. The measurements are made using a 0.38 mm thick Teflon slice, inserted between layers of stacked 0.60 mm thick equine bone slices to give a pronounced and well-defined spectral signature. Comparing the stack of bone slices with and without an underlying bone section below the Teflon slice showed that thin sections of bone can lose an appreciable number of photons through the unilluminated back surface. The results show that larger SORS offsets lead to progressively larger penetration depths into the sample; distinct Raman spectral signatures could be retrieved through up to 3.9 mm of overlying bone material with a 7 mm offset. These findings have direct impact on potential diagnostic medical applications, for instance the detection of bone tumors or areas of infected bone.

  11. Banking for the future: an Australian experience in brain banking.

    PubMed

    Sarris, M; Garrick, T M; Sheedy, D; Harper, C G

    2002-06-01

    The New South Wales (NSW) Tissue Resource Centre (TRC) has been set up to provide Australian and international researchers with fixed and frozen brain tissue from cases that are well characterised, both clinically and pathologically, for projects related to neuropsychiatric and alcohol-related disorders. A daily review of the Department of Forensic Medicine provides initial information regarding a potential collection. If the case adheres to the strict inclusion criteria, the pathologist performing the postmortem examination is approached regarding retention of the brain tissue. The next of kin of the deceased is then contacted requesting permission to retain the brain for medical research. Cases are also obtained through donor programmes, where donors are assessed and consent to donate their brain during life. Once removed at autopsy, the brain is photographed, weighed and its volume determined; the brainstem and cerebellum are removed. The two hemispheres are divided: one hemisphere is fresh frozen and one fixed (randomised). Prior to freezing, the hemisphere is sliced into 1-cm coronal slices and a set of critical-area blocks is taken. All frozen tissues are kept bagged at -80 degrees C. The other hemisphere is fixed in 15% buffered formalin for 2 weeks, embedded in agar and sliced at 3-mm intervals in the coronal plane. Tissue blocks from these slices are used for neuropathological analysis to exclude any other pathology. The TRC currently has 230 cases of both fixed and frozen material that have proven useful in a range of techniques in many research projects. These techniques include quantitative analyses of brain regions using neuropathological, neurochemical, neuropharmacological and gene expression assays.

  12. Regioselective Biolistic Targeting in Organotypic Brain Slices Using a Modified Gene Gun

    PubMed Central

    Arsenault, Jason; Nagy, Andras; Henderson, Jeffrey T.; O'Brien, John A.

    2014-01-01

    Transfection of DNA has been invaluable for the biological sciences, and with recent advances to organotypic brain slice preparations, the effects of various heterologous genes can be investigated easily while maintaining many aspects of in vivo biology. There has been increasing interest in transfecting terminally differentiated neurons, for which conventional transfection methods have been fraught with difficulties such as low yields and significant losses in viability. Biolistic transfection can circumvent many of these difficulties, yet only recently has this technique been modified so that it is amenable for use in mammalian tissues. New modifications to the accelerator chamber have enhanced the gene gun's firing accuracy and increased its depth of penetration, while also allowing the use of lower gas pressure (50 psi) without loss of transfection efficiency and permitting a focused regioselective spread of the particles to within 3 mm. In addition, this technique is straightforward and faster to perform than tedious microinjections. Both transient and stable expression are possible with nanoparticle bombardment, where episomal expression can be detected within 24 hr, and cell survival was shown to be better than, or at least equal to, that of conventional methods. This technique has, however, one crucial advantage: it permits the transfection to be localized within a single restrained radius, enabling the user to anatomically isolate the heterologous gene's effects. Here we present an in-depth protocol to prepare viable adult organotypic slices and submit them to regioselective transfection using an improved gene gun. PMID:25407047

  13. Spectral signatures of polar stratospheric clouds and sulfate aerosol

    NASA Technical Reports Server (NTRS)

    Massie, S. T.; Bailey, P. L.; Gille, J. C.; Lee, E. C.; Mergenthaler, J. L.; Roche, A. E.; Kumer, J. B.; Fishbein, E. F.; Waters, J. W.; Lahoz, W. A.

    1994-01-01

    Multiwavelength observations of Antarctic and midlatitude aerosol by the Cryogenic Limb Array Etalon Spectrometer (CLAES) experiment on the Upper Atmosphere Research Satellite (UARS) are used to demonstrate a technique that identifies the location of polar stratospheric clouds. The technique discussed uses the normalized area of the triangle formed by the aerosol extinctions at 925, 1257, and 1605/cm (10.8, 8.0, and 6.2 micrometers) to derive a spectral aerosol measure M. Mie calculations for spherical particles and T-matrix calculations for spheroidal particles are used to generate theoretical spectral extinction curves for sulfate and polar stratospheric cloud particles. The values of the spectral aerosol measure M for the sulfate and polar stratospheric cloud particles are shown to be different. Aerosol extinction data, corresponding to temperatures between 180 and 220 K at a pressure of 46 hPa (near 21-km altitude) for 18 August 1992, are used to demonstrate the technique. Thermodynamic calculations, based upon frost-point calculations and laboratory phase-equilibrium studies of nitric acid trihydrate, are used to predict the location of nitric acid trihydrate cloud particles.
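
    A minimal sketch of a triangle-area spectral measure of this kind is given below. The paper's exact normalization is not stated in the abstract; as a labeled assumption, the extinctions are divided by their three-channel mean and the wavenumber axis is rescaled to unit span before the (shoelace) triangle area is taken.

```python
# Illustrative triangle-area spectral measure from aerosol extinctions at
# the three CLAES wavenumbers. A flat (spectrally featureless) spectrum
# gives M = 0; a peaked or dipped spectrum gives a larger M, which is what
# separates sulfate from polar stratospheric cloud particles.

def spectral_measure(e925, e1257, e1605):
    wn = (925.0, 1257.0, 1605.0)
    mean_e = (e925 + e1257 + e1605) / 3.0
    xs = [(w - wn[0]) / (wn[2] - wn[0]) for w in wn]   # rescaled to 0..1
    ys = [e / mean_e for e in (e925, e1257, e1605)]
    # shoelace formula for the area of the triangle
    return abs(xs[0] * (ys[1] - ys[2]) + xs[1] * (ys[2] - ys[0])
               + xs[2] * (ys[0] - ys[1])) / 2.0
```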

  14. The role of global cloud climatologies in validating numerical models

    NASA Technical Reports Server (NTRS)

    Harshvardhan

    1991-01-01

    The net upward longwave surface radiation is exceedingly difficult to measure from space. A hybrid method using General Circulation Model (GCM) simulations and satellite data from the Earth Radiation Budget Experiment (ERBE) and the International Satellite Cloud Climatology Project (ISCCP) was used to produce global maps of this quantity over oceanic areas. An advantage of this technique is that no independent knowledge or assumptions regarding cloud cover for a particular month are required. The only information required is a relationship between the cloud radiation forcing (CRF) at the top of the atmosphere and that at the surface, which is obtained from the GCM simulation. A flow diagram of the technique and results are given.
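
    The hybrid method can be sketched under the simplifying assumption that the GCM supplies a single linear ratio between surface and top-of-atmosphere longwave cloud radiative forcing; the study's actual relationship may be more elaborate, and all names here are illustrative.

```python
# Minimal sketch of the hybrid idea: the satellite (ERBE/ISCCP) supplies the
# top-of-atmosphere longwave cloud radiative forcing, the GCM supplies the
# ratio of surface CRF to TOA CRF plus the clear-sky surface flux, and no
# independent cloud-cover information is needed. All values in W m^-2.

def surface_net_lw(clear_sky_surface_lw, toa_crf_lw, gcm_ratio):
    """All-sky net upward surface LW flux: clouds reduce the net upward
    flux at the surface by the (downward-warming) surface CRF."""
    surface_crf = gcm_ratio * toa_crf_lw
    return clear_sky_surface_lw - surface_crf
```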

  15. Investigating the Accuracy of Point Clouds Generated for Rock Surfaces

    NASA Astrophysics Data System (ADS)

    Seker, D. Z.; Incekara, A. H.

    2016-12-01

    Point clouds produced by different techniques are widely used to model rocks and to obtain properties of rock surfaces such as roughness, volume and area. These point clouds can be generated by laser scanning and by close-range photogrammetry. Laser scanning is the most common method of producing point clouds; the laser scanner device produces a 3D point cloud at regular intervals. In close-range photogrammetry, a point cloud can be produced from photographs taken under appropriate conditions, relying on developing hardware and software technology. Much photogrammetric software, open source or commercial, currently supports point cloud generation. The two methods are close to each other in terms of accuracy: sufficient accuracy in the mm-to-cm range can be obtained with a qualified digital camera or laser scanner. In both methods, field work is completed in less time than with conventional techniques. In close-range photogrammetry, any part of a rock surface can be completely represented owing to overlapping oblique photographs. Despite the similarity of the resulting data, the two methods differ considerably in cost. In this study, we investigate whether a point cloud produced from photographs can be used instead of a point cloud produced by a laser scanner device. For this purpose, rock surfaces with complex and irregular shapes located on the İstanbul Technical University Ayazaga Campus were selected as the study object. The selected object is a mixture of different rock types and consists of both partly weathered and fresh parts. The study was performed on a 30 m x 10 m section of rock surface. 2D (area-based) and 3D (volume-based) analyses were performed for several regions selected from the point clouds of the surface models. The analyses showed that the point clouds from the two methods are similar and can be used as alternatives to each other. This indicates that a point cloud produced from photographs, which is economical and can be generated in less time, can be used in several kinds of studies instead of a point cloud produced by a laser scanner.
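
    The area-based (2D) comparison can be illustrated with a shoelace-area sketch. Boundary extraction from the projected point-cloud patches is omitted, and the patch polygons here are illustrative; only the area computation and a percent-difference comparison are shown.

```python
# Compare corresponding patches from two point clouds (e.g. photogrammetric
# vs laser-scanned) by projecting each patch onto a plane, taking its
# boundary polygon, and comparing enclosed areas.

def polygon_area(pts):
    """Shoelace area of a simple polygon given as [(x, y), ...]."""
    n = len(pts)
    s = sum(pts[i][0] * pts[(i + 1) % n][1] - pts[(i + 1) % n][0] * pts[i][1]
            for i in range(n))
    return abs(s) / 2.0

def area_difference_pct(patch_a, patch_b):
    """Percent area difference between two boundary polygons."""
    a, b = polygon_area(patch_a), polygon_area(patch_b)
    return 100.0 * abs(a - b) / max(a, b)
```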

  16. Impact of varying lidar measurement and data processing techniques in evaluating cirrus cloud and aerosol direct radiative effects

    NASA Astrophysics Data System (ADS)

    Lolli, Simone; Madonna, Fabio; Rosoldi, Marco; Campbell, James R.; Welton, Ellsworth J.; Lewis, Jasper R.; Gu, Yu; Pappalardo, Gelsomina

    2018-03-01

    In the past 2 decades, ground-based lidar networks have drastically increased in scope and relevance, thanks primarily to the advent of lidar observations from space and their need for validation. Lidar observations of aerosol and cloud geometrical, optical and microphysical atmospheric properties are subsequently used to evaluate their direct radiative effects on climate. However, the retrievals are strongly dependent on the lidar instrument measurement technique and subsequent data processing methodologies. In this paper, we evaluate the discrepancies between the use of Raman and elastic lidar measurement techniques and corresponding data processing methods for two aerosol layers in the free troposphere and for two cirrus clouds with different optical depths. Results show that the different lidar techniques are responsible for discrepancies in the model-derived direct radiative effects for biomass burning (0.05 W m-2 at surface and 0.007 W m-2 at top of the atmosphere) and dust aerosol layers (0.7 W m-2 at surface and 0.85 W m-2 at top of the atmosphere). Data processing is further responsible for discrepancies in both thin (0.55 W m-2 at surface and 2.7 W m-2 at top of the atmosphere) and opaque (7.7 W m-2 at surface and 11.8 W m-2 at top of the atmosphere) cirrus clouds. Direct radiative effect discrepancies can be attributed to the larger variability of the lidar ratio for aerosols (20-150 sr) than for clouds (20-35 sr). For this reason, the influence of the applied lidar technique plays a more fundamental role in aerosol monitoring because the lidar ratio must be retrieved with relatively high accuracy. In contrast, for cirrus clouds, with the lidar ratio being much less variable, the data processing is critical because smoothing modifies the vertically resolved aerosol and cloud extinction profile that is used as input to the direct radiative effect calculations.

  17. Metabolic Therapy for Temporal Lobe Epilepsy in a Dish: Investigating Mechanisms of Ketogenic Diet using Electrophysiological Recordings in Hippocampal Slices

    PubMed Central

    Kawamura, Masahito Jr.; Ruskin, David N.; Masino, Susan A.

    2016-01-01

    The hippocampus is prone to epileptic seizures and is a key brain region and experimental platform for investigating mechanisms associated with the abnormal neuronal excitability that characterizes a seizure. Accordingly, the hippocampal slice is a common in vitro model to study treatments that may prevent or reduce seizure activity. The ketogenic diet is a metabolic therapy used to treat epilepsy in adults and children for nearly 100 years; it can reduce or eliminate even severe or refractory seizures. New insights into its underlying mechanisms have been revealed by diverse types of electrophysiological recordings in hippocampal slices. Here we review these reports and their relevant mechanistic findings. We acknowledge that a major difficulty in using hippocampal slices is the inability to reproduce precisely the in vivo condition of ketogenic diet feeding in any in vitro preparation, although progress has been made in this in vivo/in vitro transition. Thus far, at least three different approaches have been reported to reproduce relevant diet effects in hippocampal slices: (1) direct application of ketone bodies; (2) mimicking the ketogenic diet condition during whole-cell patch-clamp recordings; and (3) reduced-glucose incubation of hippocampal slices from ketogenic diet–fed animals. Significant results have been found with each of these methods and provide options for further study into short- and long-term mechanisms, including adenosine triphosphate (ATP)-sensitive potassium (KATP) channels, the vesicular glutamate transporter (VGLUT), pannexin channels and adenosine receptors, underlying the ketogenic diet and other forms of metabolic therapy. PMID:27847463

  18. Double-Pulsed 2-Micrometer Lidar Validation for Atmospheric CO2 Measurements

    NASA Technical Reports Server (NTRS)

    Singh, Upendra N.; Refaat, Tamer F.; Yu, Jirong; Petros, Mulugeta; Remus, Ruben

    2015-01-01

    A double-pulsed, 2-micron Integrated Path Differential Absorption (IPDA) lidar instrument for atmospheric carbon dioxide (CO2) measurements has been successfully developed at NASA Langley Research Center (LaRC). Based on the direct detection technique, the instrument can be operated on the ground or onboard a small aircraft. Key features of this compact, rugged and reliable IPDA lidar include high transmitted laser energy; wavelength tuning, switching and locking; and sensitive detection. As a proof of concept, ground and airborne CO2 measurements and their validation are presented. Ground validation of the IPDA lidar CO2 measurements was conducted at NASA LaRC using hard targets and a calibrated in-situ sensor. Airborne validation, conducted onboard the NASA B-200 aircraft, included CO2 plume detection from power-station incinerators, comparison with an in-flight CO2 in-situ sensor, and comparison with air sampling at different altitudes conducted by NOAA at the same site. Airborne measurements, spanning 20 hours, were obtained for different target conditions. Ground targets included soil, vegetation, sand, snow and ocean. In addition, cloud slicing was examined over the ocean. These flight validations were conducted at different altitudes, up to 7 km, with different wavelength-controlled weighting functions. CO2 measurement results agree with modeling conducted through the different sensors, as will be discussed.
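
    The measurement principle behind these validations follows the standard differential-absorption relation: energy-normalized returns at the on- and off-line wavelengths from a hard target (or cloud top, in the cloud-slicing case) yield the CO2 differential absorption optical depth. The generic textbook form is sketched below; instrument-specific calibration terms are omitted.

```python
import math

# Generic IPDA retrieval step: the ratio of energy-normalized off-line and
# on-line returns gives the two-way CO2 differential absorption optical
# depth; the factor 0.5 converts it to a one-way optical depth.

def ipda_daod(p_on, p_off, e_on, e_off):
    """One-way differential absorption optical depth from received powers
    p_on/p_off and transmitted pulse energies e_on/e_off."""
    return 0.5 * math.log((p_off / e_off) / (p_on / e_on))
```

    The retrieved optical depth, combined with the weighting function set by the on-line wavelength, is what the in-situ and air-sampling comparisons above validate.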

  19. AP-Cloud: Adaptive particle-in-cloud method for optimal solutions to Vlasov–Poisson equation

    DOE PAGES

    Wang, Xingyu; Samulyak, Roman; Jiao, Xiangmin; ...

    2016-04-19

    We propose a new adaptive Particle-in-Cloud (AP-Cloud) method for obtaining optimal numerical solutions to the Vlasov–Poisson equation. Unlike the traditional particle-in-cell (PIC) method, which is commonly used for solving this problem, the AP-Cloud adaptively selects computational nodes or particles to deliver higher accuracy and efficiency when the particle distribution is highly non-uniform. Unlike other adaptive techniques for PIC, our method balances the errors in PDE discretization and Monte Carlo integration, and discretizes the differential operators using a generalized finite difference (GFD) method based on a weighted least square formulation. As a result, AP-Cloud is independent of the geometric shapes of computational domains and is free of artificial parameters. Efficient and robust implementation is achieved through an octree data structure with 2:1 balance. We analyze the accuracy and convergence order of AP-Cloud theoretically, and verify the method using an electrostatic problem of a particle beam with halo. Simulation results show that the AP-Cloud method is substantially more accurate and faster than the traditional PIC, and it is free of artificial forces that are typical for some adaptive PIC techniques.
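
    The weighted-least-squares GFD discretization can be illustrated in one dimension: derivatives at a node are fitted from scattered neighbor nodes rather than a fixed stencil, which is why the method works on arbitrarily shaped domains. The weight function and neighbor set below are illustrative, not those of the paper.

```python
# One-dimensional generalized finite difference (GFD) sketch: fit
# f(x) - f(x0) ~ a1*dx + a2*dx^2 (so a1 = f', a2 = f''/2) over scattered
# neighbors by weighted least squares, solving the 2x2 normal equations.

def gfd_derivatives(x0, xs, fs, f0):
    """Return (f', f'') at x0 from scattered samples (xs, fs)."""
    s11 = s12 = s22 = b1 = b2 = 0.0
    for x, f in zip(xs, fs):
        dx, df = x - x0, f - f0
        w = 1.0 / (1.0 + dx * dx)          # illustrative distance weight
        s11 += w * dx ** 2
        s12 += w * dx ** 3
        s22 += w * dx ** 4
        b1 += w * dx * df
        b2 += w * dx ** 2 * df
    det = s11 * s22 - s12 * s12
    a1 = (b1 * s22 - b2 * s12) / det
    a2 = (s11 * b2 - s12 * b1) / det
    return a1, 2.0 * a2
```

    For a quadratic field the fit is exact regardless of how irregularly the neighbors are placed, which is the property a fixed-stencil finite difference loses on non-uniform particle distributions.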

  20. AP-Cloud: Adaptive Particle-in-Cloud method for optimal solutions to Vlasov–Poisson equation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Xingyu; Samulyak, Roman, E-mail: roman.samulyak@stonybrook.edu; Computational Science Initiative, Brookhaven National Laboratory, Upton, NY 11973

    We propose a new adaptive Particle-in-Cloud (AP-Cloud) method for obtaining optimal numerical solutions to the Vlasov–Poisson equation. Unlike the traditional particle-in-cell (PIC) method, which is commonly used for solving this problem, the AP-Cloud adaptively selects computational nodes or particles to deliver higher accuracy and efficiency when the particle distribution is highly non-uniform. Unlike other adaptive techniques for PIC, our method balances the errors in PDE discretization and Monte Carlo integration, and discretizes the differential operators using a generalized finite difference (GFD) method based on a weighted least square formulation. As a result, AP-Cloud is independent of the geometric shapes of computational domains and is free of artificial parameters. Efficient and robust implementation is achieved through an octree data structure with 2:1 balance. We analyze the accuracy and convergence order of AP-Cloud theoretically, and verify the method using an electrostatic problem of a particle beam with halo. Simulation results show that the AP-Cloud method is substantially more accurate and faster than the traditional PIC, and it is free of artificial forces that are typical for some adaptive PIC techniques.

  1. AP-Cloud: Adaptive particle-in-cloud method for optimal solutions to Vlasov–Poisson equation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Xingyu; Samulyak, Roman; Jiao, Xiangmin

    We propose a new adaptive Particle-in-Cloud (AP-Cloud) method for obtaining optimal numerical solutions to the Vlasov–Poisson equation. Unlike the traditional particle-in-cell (PIC) method, which is commonly used for solving this problem, the AP-Cloud adaptively selects computational nodes or particles to deliver higher accuracy and efficiency when the particle distribution is highly non-uniform. Unlike other adaptive techniques for PIC, our method balances the errors in PDE discretization and Monte Carlo integration, and discretizes the differential operators using a generalized finite difference (GFD) method based on a weighted least square formulation. As a result, AP-Cloud is independent of the geometric shapes of computational domains and is free of artificial parameters. Efficient and robust implementation is achieved through an octree data structure with 2:1 balance. We analyze the accuracy and convergence order of AP-Cloud theoretically, and verify the method using an electrostatic problem of a particle beam with halo. Simulation results show that the AP-Cloud method is substantially more accurate and faster than the traditional PIC, and it is free of artificial forces that are typical for some adaptive PIC techniques.

  2. Mash-up of techniques between data crawling/transfer, data preservation/stewardship and data processing/visualization technologies on a science cloud system designed for Earth and space science: a report of successful operation and science projects of the NICT Science Cloud

    NASA Astrophysics Data System (ADS)

    Murata, K. T.

    2014-12-01

    Data-intensive or data-centric science is the 4th paradigm, after observational and/or experimental science (1st paradigm), theoretical science (2nd paradigm) and numerical science (3rd paradigm). A science cloud is an infrastructure for this 4th methodology. The NICT Science Cloud is designed for big-data sciences of Earth, space and other fields, based on modern informatics and information technologies [1]. Data flow on the cloud passes through the following three stages: (1) data crawling and transfer, (2) data preservation and stewardship, and (3) data processing and visualization. Original tools and applications for these techniques have been designed and implemented. We mash up these tools and applications on the NICT Science Cloud to build customized systems for each project. In this paper, we discuss science data processing through these three steps. For big-data science, data file deployment on a distributed storage system should be well designed in order to save storage cost and transfer time. We developed a high-bandwidth virtual remote storage system (HbVRS) and data crawling tools, NICTY/DLA and the Wide-area Observation Network Monitoring (WONM) system, respectively. Data files are saved on the cloud storage system according to both the data preservation policy and the data processing plan. The storage system is built on distributed file system middleware (Gfarm: Grid Datafarm). It is effective, since disaster recovery (DR) and parallel data processing are carried out simultaneously without moving big data from storage to storage. Data files are managed via our Web application, WSDBank (World Science Data Bank). Big data on the cloud are processed via Pwrake, a workflow tool with high I/O bandwidth. There are several visualization tools on the cloud: VirtualAurora for the magnetosphere and ionosphere, VDVGE for Google Earth, STICKER for urban environment data and STARStouch for multi-disciplinary data. There are 30 projects running on the NICT Science Cloud for Earth and space science. In 2003, 56 refereed papers were published. Finally, we introduce a couple of successful Earth and space science results obtained using these three techniques on the NICT Science Cloud. [1] http://sc-web.nict.go.jp

  3. Comparison of MISR and Meteosat-9 cloud-motion vectors

    NASA Astrophysics Data System (ADS)

    Lonitz, Katrin; Horváth, Ákos

    2011-12-01

    Stereo motion vectors (SMVs) from the Multiangle Imaging SpectroRadiometer (MISR) were evaluated against Meteosat-9 cloud-motion vectors (CMVs) over a one-year period. In general, SMVs had weaker westerlies and southerlies than CMVs at all latitudes and levels. The E-W wind comparison showed small vertical variations with a mean difference of -0.4 m s-1, -1 m s-1, -0.7 m s-1 and corresponding rmsd of 2.4 m s-1, 3.8 m s-1, 3.5 m s-1 for low-, mid-, and high-level clouds, respectively. The N-S wind discrepancies were larger and steadily increased with altitude, having a mean difference of -0.8 m s-1, -2.9 m s-1, -4.4 m s-1 and rmsd of 3.5 m s-1, 6.9 m s-1, 9.5 m s-1 at low, mid, and high levels. The best overall agreement was found in marine stratocumulus off Namibia, while differences were larger in the Tropics and in convective clouds. The SMVs were typically assigned to higher altitudes than CMVs. Attributing each observed height difference to MISR and/or Meteosat-9 retrieval biases will require further research; nevertheless, we already identified a few regions and cloud types where CMV height assignment seemed to be the one in error. In thin mid- and high-level clouds over Africa and Arabia, as well as in broken marine boundary layer clouds, the 10.8-μm brightness temperature-based heights were often biased low due to radiance contributions from the warm surface. Conversely, low-level CMVs in the South Atlantic were frequently assigned to mid levels by the CO2-slicing method in multilayer situations. We also noticed an apparent cross-swath dependence in SMVs, whereby retrievals were less accurate on the eastern side of the MISR swath than on the western side. This artifact was traced back to sub-pixel MISR co-registration errors, which introduced cross-swath biases in E-W wind, N-S wind, and height of 0.6 m s-1, 2.6 m s-1, and 210 m.
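
    The CO2-slicing height assignment mentioned above works by comparing a measured two-channel radiance ratio (cloudy minus clear radiance in two CO2-band channels) against ratios computed from a forward model at candidate cloud-top pressures. The toy version below only shows the matching step, with illustrative inputs; the radiative-transfer forward model is omitted.

```python
# Toy CO2-slicing matching step: given forward-modeled two-channel radiance
# ratios tabulated at candidate cloud-top pressures, pick the pressure whose
# modeled ratio best matches the measured one.

def co2_slicing_pressure(pressures, modeled_ratios, measured_ratio):
    """pressures and modeled_ratios are parallel sequences; returns the
    cloud-top pressure (hPa) minimizing |modeled - measured| ratio."""
    return min(zip(pressures, modeled_ratios),
               key=lambda pr: abs(pr[1] - measured_ratio))[0]
```

    In multilayer scenes the measured ratio mixes contributions from both layers, which is one plausible route to the mid-level misassignments of low-level CMVs noted above.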

  4. Segmentation of lung nodules in computed tomography images using dynamic programming and multidirection fusion techniques.

    PubMed

    Wang, Qian; Song, Enmin; Jin, Renchao; Han, Ping; Wang, Xiaotong; Zhou, Yanying; Zeng, Jianchao

    2009-06-01

    The aim of this study was to develop a novel algorithm for segmenting lung nodules on three-dimensional (3D) computed tomographic images to improve the performance of computer-aided diagnosis (CAD) systems. The database used in this study consists of two data sets obtained from the Lung Imaging Database Consortium. The first data set, containing 23 nodules (22% irregular nodules, 13% nonsolid nodules, 17% nodules attached to other structures), was used for training. The second data set, containing 64 nodules (37% irregular nodules, 40% nonsolid nodules, 62% nodules attached to other structures), was used for testing. Two key techniques were developed in the segmentation algorithm: (1) a 3D extended dynamic programming model, with a newly defined internal cost function based on the information between adjacent slices, allowing parameters to be adapted to each slice, and (2) a multidirection fusion technique, which makes use of the complementary relationships among different directions to improve the final segmentation accuracy. The performance of this approach was evaluated by the overlap criterion, complemented by the true-positive fraction and the false-positive fraction criteria. The mean values of the overlap, true-positive fraction, and false-positive fraction for the first data set achieved using the segmentation scheme were 66%, 75%, and 15%, respectively, and the corresponding values for the second data set were 58%, 71%, and 22%, respectively. The experimental results indicate that this segmentation scheme can achieve better performance for nodule segmentation than two existing algorithms reported in the literature. The proposed 3D extended dynamic programming model is an effective way to segment sequential images of lung nodules. The proposed multidirection fusion technique is capable of reducing segmentation errors especially for no-nodule and near-end slices, thus resulting in better overall performance.
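The inter-slice coupling in the extended dynamic programming model can be illustrated with a 1-D toy version: each slice proposes candidate boundary positions with a local cost, and a smoothness term penalizes jumps between adjacent slices. The cost values and penalty below are invented for illustration and are not the paper's actual internal cost function:

```python
def dp_boundary(costs, smooth=1.0):
    """Choose one boundary index per slice minimizing the sum of
    local costs plus a smoothness penalty between adjacent slices:
    a 1-D analogue of an inter-slice internal cost."""
    n_slices, n_pos = len(costs), len(costs[0])
    best = list(costs[0])   # best cost of reaching each position in slice 0
    back = []               # back-pointers for path recovery
    for s in range(1, n_slices):
        cur, bp = [], []
        for j in range(n_pos):
            # best predecessor position in the previous slice
            t = min(range(n_pos), key=lambda t: best[t] + smooth * abs(j - t))
            cur.append(best[t] + smooth * abs(j - t) + costs[s][j])
            bp.append(t)
        best, back = cur, back + [bp]
    j = min(range(n_pos), key=lambda i: best[i])
    path = [j]
    for bp in reversed(back):
        j = bp[j]
        path.append(j)
    return path[::-1], min(best)
```

With `costs = [[0, 5], [5, 0], [0, 5]]` and `smooth=1.0`, the optimum tolerates the two unit jumps to follow the cheap positions, returning path `[0, 1, 0]` at total cost 2.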

  5. Multi-slice ptychography with large numerical aperture multilayer Laue lenses

    DOE PAGES

    Ozturk, Hande; Yan, Hanfei; He, Yan; ...

    2018-05-09

Here, the highly convergent x-ray beam focused by multilayer Laue lenses with large numerical apertures is used as a three-dimensional (3D) probe to image layered structures with an axial separation larger than the depth of focus. Instead of collecting weakly scattered high-spatial-frequency signals, the depth-resolving power is provided purely by the intense central cone diverged from the focused beam. Using the multi-slice ptychography method combined with the on-the-fly scan scheme, two layers of nanoparticles separated by 10 μm are successfully reconstructed with 8.1 nm lateral resolution and with a dwell time as low as 0.05 s per scan point. This approach obtains high-resolution images with extended depth of field, which paves the way for multi-slice ptychography as a high throughput technique for high-resolution 3D imaging of thick samples.

  6. Multi-slice ptychography with large numerical aperture multilayer Laue lenses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ozturk, Hande; Yan, Hanfei; He, Yan

Here, the highly convergent x-ray beam focused by multilayer Laue lenses with large numerical apertures is used as a three-dimensional (3D) probe to image layered structures with an axial separation larger than the depth of focus. Instead of collecting weakly scattered high-spatial-frequency signals, the depth-resolving power is provided purely by the intense central cone diverged from the focused beam. Using the multi-slice ptychography method combined with the on-the-fly scan scheme, two layers of nanoparticles separated by 10 μm are successfully reconstructed with 8.1 nm lateral resolution and with a dwell time as low as 0.05 s per scan point. This approach obtains high-resolution images with extended depth of field, which paves the way for multi-slice ptychography as a high throughput technique for high-resolution 3D imaging of thick samples.

  7. Minimization of Dead-Periods in MRI Pulse Sequences for Imaging Oblique Planes

    PubMed Central

    Atalar, Ergin; McVeigh, Elliot R.

    2007-01-01

    With the advent of breath-hold MR cardiac imaging techniques, the minimization of TR and TE for oblique planes has become a critical issue. The slew rates and maximum currents of gradient amplifiers limit the minimum possible TR and TE by adding dead-periods to the pulse sequences. We propose a method of designing gradient waveforms that will be applied to the amplifiers instead of the slice, readout, and phase encoding waveforms. Because this method ensures that the gradient amplifiers will always switch at their maximum slew rate, it results in the minimum possible dead-period for given imaging parameters and scan plane position. A GRASS pulse sequence has been designed and ultra-short TR and TE values have been obtained with standard gradient amplifiers and coils. For some oblique slices, we have achieved shorter TR and TE values than those for nonoblique slices. PMID:7869900

  8. Simultaneous Multi-Slice fMRI using Spiral Trajectories

    PubMed Central

    Zahneisen, Benjamin; Poser, Benedikt A.; Ernst, Thomas; Stenger, V. Andrew

    2014-01-01

Parallel imaging methods using multi-coil receiver arrays have been shown to be effective for increasing MRI acquisition speed. However, parallel imaging methods for fMRI with 2D sequences show only limited improvements in temporal resolution because of the long echo times needed for BOLD contrast. Recently, Simultaneous Multi-Slice (SMS) imaging techniques have been shown to increase fMRI temporal resolution by factors of four and higher. In SMS fMRI, multiple slices can be acquired simultaneously using Echo Planar Imaging (EPI) and the overlapping slices are un-aliased using a parallel imaging reconstruction with multiple receivers. The slice separation can be further improved using the “blipped-CAIPI” EPI sequence that provides a more efficient sampling of the SMS 3D k-space. In this paper a blipped-spiral SMS sequence for ultra-fast fMRI is presented. The blipped-spiral sequence combines the sampling efficiency of spiral trajectories with the SMS encoding concept used in blipped-CAIPI EPI. We show that blipped-spiral acquisition can achieve almost whole brain coverage at 3 mm isotropic resolution in 168 ms. It is also demonstrated that the high temporal resolution allows for dynamic BOLD lag time measurement using visual/motor and retinotopic mapping paradigms. The local BOLD lag time within the visual cortex following the retinotopic mapping stimulation of expanding flickering rings is directly measured and easily translated into an eccentricity map of the cortex. PMID:24518259

  9. Seismic attribute analysis for reservoir and fluid prediction, Malay Basin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mansor, M.N.; Rudolph, K.W.; Richards, F.B.

    1994-07-01

The Malay Basin is characterized by excellent seismic data quality, but complex clastic reservoir architecture. With these characteristics, seismic attribute analysis is a very important tool in exploration and development geoscience and is routinely used for mapping fluids and reservoir, recognizing and risking traps, assessment, depth conversion, well placement, and field development planning. Attribute analysis can be successfully applied to both 2-D and 3-D data as demonstrated by comparisons of 2-D and 3-D amplitude maps of the same area. There are many different methods of extracting amplitude information from seismic data, including amplitude mapping, horizon slice, summed horizon slice, isochron slice, and horizon slice from AVO (amplitude versus offset) cube. Within the Malay Basin, horizon/isochron slice techniques have several advantages over simply extracting amplitudes from a picked horizon: they are much faster, permit examination of the amplitude structure of the entire cube, yield better results for weak/variable signatures, and aid summation of amplitudes. Summation in itself often yields improved results because it incorporates the signature from the entire reservoir interval, reducing any effects due to noise, mispicking, or waveform variations. Dip and azimuth attributes have been widely applied by industry for fault identification. In addition, these attributes can also be used to map signature variations associated with hydrocarbon contacts or stratigraphic changes, and this must be considered when using these attributes for structural interpretation.
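A horizon slice is conceptually just an amplitude lookup along a picked time surface, and the summed variant adds amplitudes over a window around it. A toy sketch with a nested-list cube (indexing and windowing chosen for illustration, not a production seismic workflow):

```python
def horizon_slice(cube, horizon, window=0):
    """Extract amplitude along a picked horizon from a 3-D
    seismic cube cube[x][y][t]. With window > 0, amplitudes are
    summed over t in [t0-window, t0+window] (a summed horizon
    slice); window == 0 gives the plain horizon slice."""
    out = []
    for x, plane in enumerate(cube):
        row = []
        for y, trace in enumerate(plane):
            t0 = horizon[x][y]                      # picked time sample
            lo = max(0, t0 - window)
            hi = min(len(trace) - 1, t0 + window)   # clamp to the trace
            row.append(sum(trace[lo:hi + 1]))
        out.append(row)
    return out
```

Summing over the window is what lets the map incorporate the signature of the whole reservoir interval rather than a single picked sample.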

  10. Using virtual machine monitors to overcome the challenges of monitoring and managing virtualized cloud infrastructures

    NASA Astrophysics Data System (ADS)

    Bamiah, Mervat Adib; Brohi, Sarfraz Nawaz; Chuprat, Suriayati

    2012-01-01

Virtualization is one of the hottest research topics nowadays. Researchers in academia and developers in the IT industry are designing approaches for solving security and manageability issues of Virtual Machines (VMs) residing on virtualized cloud infrastructures. Moving an application from a physical to a virtual platform increases efficiency and flexibility while reducing management cost and effort. Cloud computing adopts the paradigm of virtualization: using this technique, memory, CPU and computational power are provided to clients' VMs by utilizing the underlying physical hardware. Besides these advantages, there are a few challenges in adopting virtualization, such as management of VMs and network traffic, unexpected additional costs and resource allocation. The Virtual Machine Monitor (VMM), or hypervisor, is the tool used by cloud providers to manage the VMs on the cloud. There are several heterogeneous hypervisors provided by various vendors, including VMware, Hyper-V, Xen and Kernel Virtual Machine (KVM). Considering the challenge of VM management, this paper describes several techniques to monitor and manage virtualized cloud infrastructures.

  11. Efficient operating system level virtualization techniques for cloud resources

    NASA Astrophysics Data System (ADS)

    Ansu, R.; Samiksha; Anju, S.; Singh, K. John

    2017-11-01

Cloud computing is an advancing technology that provides Infrastructure, Platform and Software as services. Virtualization and utility computing are the keys to cloud computing. The number of cloud users is increasing day by day, so making resources available on demand to satisfy user requirements has become essential. Virtualization is the technique by which resources, namely storage, processing power, memory and network or I/O, are abstracted. Various virtualization techniques are available for executing operating systems: Full System Virtualization and Para Virtualization. In Full Virtualization, the whole hardware architecture is duplicated virtually, and no modifications are required in the guest OS because the OS deals with the hypervisor directly. In Para Virtualization, the guest OS must be modified to run in parallel with other operating systems, and for the guest OS to access the hardware, the host OS must provide a Virtual Machine Interface. OS virtualization has many advantages, such as transparent application migration, server consolidation, online OS maintenance and improved security. This paper outlines both virtualization techniques and discusses the issues in OS-level virtualization.

  12. Water Ice Clouds in the Martian Atmosphere: A View from MGS TES

    NASA Technical Reports Server (NTRS)

    Hale, A. S.; Tamppari, L. K.; Christensen, P. R.; Smith, M. D.; Bass, Deborah; Qu, Zheng; Pearl, J. C.

    2005-01-01

We use the method of Tamppari et al. to map water ice clouds in the Martian atmosphere. This technique was originally developed to analyze the broadband Viking IRTM channels and we have now applied it to the TES data. To do this, the TES spectra are convolved to the IRTM bandshapes and spatial resolutions, enabling use of the same processing techniques as were used in Tamppari et al.. This retrieval technique relies on using the temperature difference recorded in the 20 micron and 11 micron IRTM bands (or IRTM-convolved TES bands) to map cold water ice clouds above the warmer Martian surface. Careful removal of surface contributions to the observed radiance is therefore necessary, and we have used both older Viking-derived basemaps of the surface emissivity and albedo, and new MGS-derived basemaps, in order to explore any possible differences in cloud retrieval due to differences in surface contribution removal. These results will be presented in our poster. Our previous work has concentrated primarily on comparing MGS TES to Viking data; that work saw that large-scale cloud features, such as the aphelion cloud belt, are quite repeatable from year to year, though small-scale behavior shows some variation. Comparison of Viking and MGS era cloud maps will be presented in our poster. In the current stage of our study, we have concentrated our efforts on close analysis of water ice cloud behavior in the northern summer of the three MGS mapping years on relatively small spatial scales, and present our results below. Additional information is included in the original extended abstract.
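The core of the retrieval is a two-band brightness-temperature test: cold ice clouds depress the 11-micron temperature relative to the warmer surface. A minimal sketch of such a test; the threshold values and the sign convention here are illustrative placeholders, not the published Tamppari et al. values:

```python
def ice_cloud_mask(t20, t11, dt_threshold=-2.0, t11_cold=230.0):
    """Flag pixels as water-ice cloud when the 20-micron minus
    11-micron brightness-temperature difference (K) falls below a
    threshold AND the 11-micron channel is itself cold, i.e. the
    warm surface is not dominating the radiance."""
    return [(a - b) < dt_threshold and b < t11_cold
            for a, b in zip(t20, t11)]
```

In practice the surface contribution would first be removed using the emissivity/albedo basemaps discussed above; this sketch assumes that step is already done.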

  13. Vertical Optical Scanning with Panoramic Vision for Tree Trunk Reconstruction

    PubMed Central

    Berveglieri, Adilson; Liang, Xinlian; Honkavaara, Eija

    2017-01-01

    This paper presents a practical application of a technique that uses a vertical optical flow with a fisheye camera to generate dense point clouds from a single planimetric station. Accurate data can be extracted to enable the measurement of tree trunks or branches. The images that are collected with this technique can be oriented in photogrammetric software (using fisheye models) and used to generate dense point clouds, provided that some constraints on the camera positions are adopted. A set of images was captured in a forest plot in the experiments. Weighted geometric constraints were imposed in the photogrammetric software to calculate the image orientation, perform dense image matching, and accurately generate a 3D point cloud. The tree trunks in the scenes were reconstructed and mapped in a local reference system. The accuracy assessment was based on differences between measured and estimated trunk diameters at different heights. Trunk sections from an image-based point cloud were also compared to the corresponding sections that were extracted from a dense terrestrial laser scanning (TLS) point cloud. Cylindrical fitting of the trunk sections allowed the assessment of the accuracies of the trunk geometric shapes in both clouds. The average difference between the cylinders that were fitted to the photogrammetric cloud and those to the TLS cloud was less than 1 cm, which indicates the potential of the proposed technique. The point densities that were obtained with vertical optical scanning were 1/3 less than those that were obtained with TLS. However, the point density can be improved by using higher resolution cameras. PMID:29207468
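The trunk-section comparison relies on fitting a circle/cylinder to cross-section points. A generic algebraic (Kasa) least-squares circle fit gives the flavour; this is a standard textbook method, not necessarily the authors' exact implementation:

```python
def fit_circle(points):
    """Algebraic (Kasa) circle fit: least-squares solve of
    x^2 + y^2 = a*x + b*y + c; returns centre and diameter."""
    sxx = sxy = syy = sx = sy = n = 0.0
    szx = szy = sz = 0.0
    for x, y in points:
        z = x * x + y * y
        sxx += x * x; sxy += x * y; syy += y * y
        sx += x; sy += y; n += 1.0
        szx += z * x; szy += z * y; sz += z
    # normal equations as an augmented 3x4 matrix
    M = [[sxx, sxy, sx, szx],
         [sxy, syy, sy, szy],
         [sx,  sy,  n,  sz]]
    # Gaussian elimination with partial pivoting
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, 3):
            f = M[r][i] / M[i][i]
            for c in range(i, 4):
                M[r][c] -= f * M[i][c]
    u = [0.0, 0.0, 0.0]
    for i in range(2, -1, -1):
        u[i] = (M[i][3] - sum(M[i][c] * u[c] for c in range(i + 1, 3))) / M[i][i]
    a, b, c = u
    cx, cy = a / 2.0, b / 2.0
    r = (c + cx * cx + cy * cy) ** 0.5
    return (cx, cy), 2.0 * r
```

Given a photogrammetric section and a TLS section of the same trunk, comparing the two fitted diameters yields exactly the kind of sub-centimetre difference statistic reported above.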

  14. Vertical Optical Scanning with Panoramic Vision for Tree Trunk Reconstruction.

    PubMed

    Berveglieri, Adilson; Tommaselli, Antonio M G; Liang, Xinlian; Honkavaara, Eija

    2017-12-02

    This paper presents a practical application of a technique that uses a vertical optical flow with a fisheye camera to generate dense point clouds from a single planimetric station. Accurate data can be extracted to enable the measurement of tree trunks or branches. The images that are collected with this technique can be oriented in photogrammetric software (using fisheye models) and used to generate dense point clouds, provided that some constraints on the camera positions are adopted. A set of images was captured in a forest plot in the experiments. Weighted geometric constraints were imposed in the photogrammetric software to calculate the image orientation, perform dense image matching, and accurately generate a 3D point cloud. The tree trunks in the scenes were reconstructed and mapped in a local reference system. The accuracy assessment was based on differences between measured and estimated trunk diameters at different heights. Trunk sections from an image-based point cloud were also compared to the corresponding sections that were extracted from a dense terrestrial laser scanning (TLS) point cloud. Cylindrical fitting of the trunk sections allowed the assessment of the accuracies of the trunk geometric shapes in both clouds. The average difference between the cylinders that were fitted to the photogrammetric cloud and those to the TLS cloud was less than 1 cm, which indicates the potential of the proposed technique. The point densities that were obtained with vertical optical scanning were 1/3 less than those that were obtained with TLS. However, the point density can be improved by using higher resolution cameras.

  15. Earthscape, a Multi-Purpose Interactive 3d Globe Viewer for Hybrid Data Visualization and Analysis

    NASA Astrophysics Data System (ADS)

    Sarthou, A.; Mas, S.; Jacquin, M.; Moreno, N.; Salamon, A.

    2015-08-01

The hybrid visualization and interaction tool EarthScape is presented here. The software is able to display simultaneously LiDAR point clouds, draped videos with moving footprint, volume scientific data (using volume rendering, isosurface and slice plane), raster data such as still satellite images, vector data and 3D models such as buildings or vehicles. The application runs on touch screen devices such as tablets. The software is based on open source libraries, such as OpenSceneGraph, osgEarth and OpenCV, and shader programming is used to implement volume rendering of scientific data. The next goal of EarthScape is to perform data analysis using ENVI Services Engine, a cloud data analysis solution. EarthScape is also designed to be a client of Jagwire, which provides multisource geo-referenced video fluxes. Once all these components are included, EarthScape will be a multi-purpose platform providing data analysis, hybrid visualization and complex interactions at the same time. The software is available on demand for free at france@exelisvis.com.

  16. McIDAS-V: Advanced Visualization for 3D Remote Sensing Data

    NASA Astrophysics Data System (ADS)

    Rink, T.; Achtor, T. H.

    2010-12-01

McIDAS-V is a Java-based, open-source, freely available software package for analysis and visualization of geophysical data. Its advanced capabilities provide very interactive 4-D displays, including 3D volumetric rendering and fast sub-manifold slicing, linked to an abstract mathematical data model with built-in metadata for units, coordinate system transforms and sampling topology. A Jython interface provides user-defined analysis and computation in terms of the internal data model. These powerful capabilities to integrate data, analysis and visualization are being applied to hyper-spectral sounding retrievals, e.g., AIRS and IASI, of moisture and cloud density to interrogate and analyze their 3D structure, as well as to validate them against instruments such as CALIPSO, CloudSat and MODIS. The object oriented framework design allows for specialized extensions for novel displays and new sources of data. Community-defined CF-conventions for gridded data are understood by the software, and such data can be immediately imported into the application. This presentation will show examples of how McIDAS-V is used in 3-dimensional data analysis, display and evaluation.

  17. A motion compensation technique using sliced blocks and its application to hybrid video coding

    NASA Astrophysics Data System (ADS)

    Kondo, Satoshi; Sasai, Hisao

    2005-07-01

This paper proposes a new motion compensation method using "sliced blocks" in DCT-based hybrid video coding. In H.264/MPEG-4 Advanced Video Coding, the new international video coding standard, motion compensation can be performed by splitting macroblocks into multiple square or rectangular regions. In the proposed method, on the other hand, macroblocks or sub-macroblocks are divided into two regions (sliced blocks) by an arbitrary line segment. The result is that the shapes of the segmented regions are not limited to squares or rectangles, allowing the shapes of the segmented regions to better match the boundaries between moving objects. Thus, the proposed method can improve the performance of the motion compensation. In addition, adaptive prediction of the shape according to the region shape of the surrounding macroblocks can reduce the overhead of describing shape information in the bitstream. The proposed method also has the advantage that conventional coding techniques such as mode decision using rate-distortion optimization can be utilized, since coding processes such as frequency transform and quantization are performed on a macroblock basis, similar to the conventional coding methods. The proposed method is implemented in an H.264-based P-picture codec and an improvement in bit rate of 5% is confirmed in comparison with H.264.
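Splitting a block by an arbitrary line segment amounts to classifying each pixel by which side of the line it falls on. A toy sketch using a cross-product sign test at pixel centres; this is our illustration of the geometry, not the codec's exact partition rule:

```python
def slice_block(size, p0, p1):
    """Partition a size x size block into two 'sliced' regions by
    the line through p0 and p1. Each pixel centre (x+0.5, y+0.5)
    is assigned by the sign of the 2-D cross product with the
    line's direction vector."""
    (x0, y0), (x1, y1) = p0, p1
    dx, dy = x1 - x0, y1 - y0
    region_a, region_b = [], []
    for y in range(size):
        for x in range(size):
            cross = dx * (y + 0.5 - y0) - dy * (x + 0.5 - x0)
            (region_a if cross >= 0 else region_b).append((x, y))
    return region_a, region_b
```

Each region would then get its own motion vector, while transform and quantization still operate on the whole block, as the abstract notes.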

  18. Photogrammetric Analysis of Rotor Clouds Observed during T-REX

    NASA Astrophysics Data System (ADS)

    Romatschke, U.; Grubišić, V.

    2017-12-01

    Stereo photogrammetric analysis is a rarely utilized but highly valuable tool for studying smaller, highly ephemeral clouds. In this study, we make use of data that was collected during the Terrain-induced Rotor Experiment (T-REX), which took place in Owens Valley, eastern California, in the spring of 2006. The data set consists of matched digital stereo photographs obtained at high temporal (on the order of seconds) and spatial resolution (limited by the pixel size of the cameras). Using computer vision techniques we have been able to develop algorithms for camera calibration, automatic feature matching, and ultimately reconstruction of 3D cloud scenes. Applying these techniques to images from different T-REX IOPs we capture the motion of clouds in several distinct mountain wave scenarios ranging from short lived lee wave clouds on an otherwise clear sky day to rotor clouds formed in an extreme turbulence environment with strong winds and high cloud coverage. Tracking the clouds in 3D space and time allows us to quantify phenomena such as vertical and horizontal movement of clouds, turbulent motion at the upstream edge of rotor clouds, the structure of the lifting condensation level, extreme wind shear, and the life cycle of clouds in lee waves. When placed into context with the existing literature that originated from the T-REX field campaign, our results complement and expand our understanding of the complex dynamics observed in a variety of different lee wave settings.

  19. An AVHRR Cloud Classification Database Typed by Experts

    DTIC Science & Technology

    1993-10-01

analysis. Naval Research Laboratory, Monterey, CA. 110 pp. Gallaudet, Timothy C. and James J. Simpson, 1991: Automated cloud screening of AVHRR imagery...1987) and Saunders and Kriebel (1988a,b) have used threshold techniques to classify clouds. Gallaudet and Simpson (1991) have used split-and-merge

  20. Nowcasting Cloud Fields for U.S. Air Force Special Operations

    DTIC Science & Technology

    2017-03-01

    application of Bayes’ Rule offers many advantages over Kernel Density Estimation (KDE) and other commonly used statistical post-processing methods...reflectance and probability of cloud. A statistical post-processing technique is applied using Bayesian estimation to train the system from a set of past...nowcasting, low cloud forecasting, cloud reflectance, ISR, Bayesian estimation, statistical post-processing, machine learning 15. NUMBER OF PAGES

  1. Applicability of ERTS-1 imagery to the study of suspended sediment and aquatic fronts

    NASA Technical Reports Server (NTRS)

    Klemas, V.; Srna, R.; Treasure, W.; Otley, M.

    1973-01-01

    Imagery from three successful ERTS-1 passes over the Delaware Bay and Atlantic Coastal Region have been evaluated to determine visibility of aquatic features. Data gathered from ground truth teams before and during the overflights, in conjunction with aerial photographs taken at various altitudes, were used to interpret the imagery. The overpasses took place on August 16, October 10, 1972, and January 26, 1973, with cloud cover ranging from about zero to twenty percent. (I.D. Nos. 1024-15073, 1079-15133, and 1187-15140). Visual inspection, density slicing and multispectral analysis of the imagery revealed strong suspended sediment patterns and several distinct types of aquatic interfaces or frontal systems.
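Density slicing, as applied to the ERTS-1 imagery, simply maps each pixel's brightness into one of several discrete level ranges so that subtle radiance patterns become visible as contoured bands. A minimal sketch; the boundary values are arbitrary examples:

```python
import bisect

def density_slice(pixels, boundaries):
    """Assign each pixel value to a density slice: slice i holds
    values from boundaries[i-1] up to (but not including)
    boundaries[i]; values >= the last boundary go in the top slice."""
    return [bisect.bisect_right(boundaries, p) for p in pixels]

# three cut levels give four slices
print(density_slice([10, 50, 150, 250], [50, 100, 200]))  # [0, 1, 2, 3]
```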

  2. Evaluation of terrestrial photogrammetric point clouds derived from thermal imagery

    NASA Astrophysics Data System (ADS)

    Metcalf, Jeremy P.; Olsen, Richard C.

    2016-05-01

    Computer vision and photogrammetric techniques have been widely applied to digital imagery producing high density 3D point clouds. Using thermal imagery as input, the same techniques can be applied to infrared data to produce point clouds in 3D space, providing surface temperature information. The work presented here is an evaluation of the accuracy of 3D reconstruction of point clouds produced using thermal imagery. An urban scene was imaged over an area at the Naval Postgraduate School, Monterey, CA, viewing from above as with an airborne system. Terrestrial thermal and RGB imagery were collected from a rooftop overlooking the site using a FLIR SC8200 MWIR camera and a Canon T1i DSLR. In order to spatially align each dataset, ground control points were placed throughout the study area using Trimble R10 GNSS receivers operating in RTK mode. Each image dataset is processed to produce a dense point cloud for 3D evaluation.

  3. Cloud cover estimation optical package: New facility, algorithms and techniques

    NASA Astrophysics Data System (ADS)

    Krinitskiy, Mikhail

    2017-02-01

Short- and long-wave radiation are important components of the surface heat budget over sea and land, and estimating them requires accurate observations of cloud cover. While cloud cover is widely observed visually, building accurate parameterizations also requires quantifying it with precise instrumental measurements. The major disadvantages of most existing cloud cameras are their complicated design and inaccurate post-processing algorithms, which typically result in uncertainties of 20% to 30% in camera-based estimates of cloud cover; the accuracy of these algorithms in terms of true scoring against human-observed values is typically less than 10%. We developed a new-generation package for cloud cover estimation which provides much more accurate results and also allows additional characteristics to be measured. A new algorithm, SAIL GrIx, was developed for this package. It uses a synthetic controlling index (the "grayness rate index") to suppress the background sunburn effect, which increases the reliability of detecting optically thin clouds; the accuracy of this algorithm in terms of true scoring reached 30%. To increase the accuracy further, a second algorithm, SAIL GrIx ML, applies machine learning together with other signal-processing techniques. The condition of the sun disk proves to be a strong feature in this kind of model, and an artificial neural network model demonstrates the best quality, with accuracy in terms of true scoring increasing to 95.5%. The new algorithm also let us modify the design of the optical sensing package and avoid the use of solar trackers, making the cloud camera much more compact.
The new cloud camera has already been tested in several missions across the Atlantic and Indian oceans on board IORAS research vessels.

  4. A Simple Technique for Securing Data at Rest Stored in a Computing Cloud

    NASA Astrophysics Data System (ADS)

    Sedayao, Jeff; Su, Steven; Ma, Xiaohao; Jiang, Minghao; Miao, Kai

"Cloud Computing" offers many potential benefits, including cost savings, the ability to deploy applications and services quickly, and the ease of scaling those applications and services once they are deployed. A key barrier for enterprise adoption is the confidentiality of data stored on cloud computing infrastructure. Our simple technique, implemented with open source software, solves this problem by using public key encryption to render stored data at rest unreadable by unauthorized personnel, including system administrators of the cloud computing service on which the data is stored. We validate our approach on a network measurement system implemented on PlanetLab. We then use it in a service where confidentiality is critical - a scanning application that validates external firewall implementations.

  5. Radiotherapy Monte Carlo simulation using cloud computing technology.

    PubMed

    Poole, C M; Cornelius, I; Trapp, J V; Langton, C M

    2012-12-01

Cloud computing allows for vast computational resources to be leveraged quickly and easily in bursts as and when required. Here we describe a technique that allows for Monte Carlo radiotherapy dose calculations to be performed using GEANT4 and executed in the cloud, with relative simulation cost and completion time evaluated as a function of machine count. As expected, simulation completion time decreases as 1/n for n parallel machines, and relative simulation cost is found to be optimal where n is a factor of the total simulation time in hours. Using the technique, we demonstrate the potential usefulness of cloud computing as a solution for rapid Monte Carlo simulation for radiotherapy dose calculation without the need for dedicated local computer hardware, as a proof of principle.
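The cost/time behaviour described (completion time proportional to 1/n, relative cost minimal when n divides the total simulation hours) follows directly from per-started-hour billing. A small sketch under that billing assumption:

```python
import math

def completion_time_hours(total_cpu_hours, n):
    """Wall-clock completion time scales as 1/n for n machines."""
    return total_cpu_hours / n

def relative_cost(total_cpu_hours, n):
    """Cloud instances are typically billed per started hour, so the
    cost is n * ceil(T/n) machine-hours; this is minimal (equal to T)
    exactly when n is a factor of T."""
    return n * math.ceil(total_cpu_hours / n)

# For a 12 CPU-hour simulation: n = 4 (a factor of 12) costs 12
# machine-hours, while n = 5 costs 5 * ceil(2.4) = 15.
print(relative_cost(12, 4), relative_cost(12, 5))  # 12 15
```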

  6. Multislice spiral CT simulator for dynamic cardiopulmonary studies

    NASA Astrophysics Data System (ADS)

    De Francesco, Silvia; Ferreira da Silva, Augusto M.

    2002-04-01

We've developed a Multi-slice Spiral CT Simulator modeling the acquisition process of a real tomograph over a 4-dimensional phantom (4D MCAT) of the human thorax. The simulator allows us to visually characterize artifacts due to insufficient temporal sampling and to evaluate a priori the quality of the images obtained in cardio-pulmonary studies (with single-/multi-slice and ECG-gated acquisition processes). The simulating environment allows for both conventional and spiral scanning modes and includes a model of noise in the acquisition process. In the case of spiral scanning, reconstruction facilities include longitudinal interpolation methods (360LI and 180LI, for both single and multi-slice). The reconstruction of the section is then performed through FBP. The reconstructed images/volumes are affected by distortion due to insufficient temporal sampling of the moving object, and the simulating environment allows us to investigate the nature of this distortion, characterizing it qualitatively and quantitatively (using, for example, Herman's measures). Much of our work is focused on the determination of adequate temporal sampling and sinogram regularization techniques. At the moment, the simulator is limited to the case of a multi-slice tomograph; extension to cone-beam or area detectors is planned as the next step of development.
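The longitudinal (z-)interpolation step can be sketched as plain linear interpolation between projection samples of the same view angle measured at different table positions; the function below is a generic illustration of the idea behind 360LI, not the simulator's actual implementation:

```python
def spiral_360li(z_samples, values, z_target):
    """Linearly interpolate projection data at slice position
    z_target from samples of the same view angle taken one gantry
    rotation apart (z_samples must be sorted ascending)."""
    for i in range(len(z_samples) - 1):
        z1, z2 = z_samples[i], z_samples[i + 1]
        if z1 <= z_target <= z2:
            w = (z_target - z1) / (z2 - z1)   # interpolation weight
            return (1 - w) * values[i] + w * values[i + 1]
    raise ValueError("z_target outside sampled range")
```

180LI halves the effective z-spacing by also using the complementary (opposed) rays, which is why it yields thinner slice profiles.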

  7. Supervised neural network classification of pre-sliced cooked pork ham images using quaternionic singular values.

    PubMed

    Valous, Nektarios A; Mendoza, Fernando; Sun, Da-Wen; Allen, Paul

    2010-03-01

    The quaternionic singular value decomposition is a technique to decompose a quaternion matrix (representation of a colour image) into quaternion singular vector and singular value component matrices exposing useful properties. The objective of this study was to use a small portion of uncorrelated singular values, as robust features for the classification of sliced pork ham images, using a supervised artificial neural network classifier. Images were acquired from four qualities of sliced cooked pork ham typically consumed in Ireland (90 slices per quality), having similar appearances. Mahalanobis distances and Pearson product moment correlations were used for feature selection. Six highly discriminating features were used as input to train the neural network. An adaptive feedforward multilayer perceptron classifier was employed to obtain a suitable mapping from the input dataset. The overall correct classification performance for the training, validation and test set were 90.3%, 94.4%, and 86.1%, respectively. The results confirm that the classification performance was satisfactory. Extracting the most informative features led to the recognition of a set of different but visually quite similar textural patterns based on quaternionic singular values. Copyright 2009 Elsevier Ltd. All rights reserved.
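    As a rough illustration of the underlying decomposition (not the authors' implementation), the singular values of a pure-quaternion colour matrix can be computed through its complex adjoint matrix, whose singular values occur in pairs:

```python
import numpy as np

def quaternion_singular_values(r, g, b):
    """Singular values of the pure-quaternion matrix r*i + g*j + b*k
    (one quaternion per pixel of an RGB image), computed via the complex
    adjoint. Writing Q = A + B*j with complex matrices A = i*r and
    B = g + i*b, the adjoint [[A, B], [-conj(B), conj(A)]] carries each
    quaternion singular value with multiplicity two."""
    A = 1j * r
    B = g + 1j * b
    adjoint = np.block([[A, B], [-np.conj(B), np.conj(A)]])
    s = np.linalg.svd(adjoint, compute_uv=False)
    return s[::2]  # singular values come in equal pairs; keep one of each
```

    Feature selection as in the study would then pick a small, discriminative subset of these values per image.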

  8. Navigation-supported diagnosis of the substantia nigra by matching midbrain sonography and MRI

    NASA Astrophysics Data System (ADS)

    Salah, Zein; Weise, David; Preim, Bernhard; Classen, Joseph; Rose, Georg

    2012-03-01

    Transcranial sonography (TCS) is a well-established neuroimaging technique that allows for visualizing several brainstem structures, including the substantia nigra, and aids in the diagnosis and differential diagnosis of various movement disorders, especially Parkinsonian syndromes. However, proximate brainstem anatomy can hardly be recognized due to the limited image quality of B-scans. In this paper, a visualization system for the diagnosis of the substantia nigra is presented, which utilizes neuronavigated TCS to reconstruct tomographical slices from registered MRI datasets and visualizes them simultaneously with the corresponding TCS planes in real time. To generate MRI tomographical slices, the tracking data of the calibrated ultrasound probe are passed to an optimized slicing algorithm, which computes cross sections at arbitrary positions and orientations from the registered MRI dataset. The extracted MRI cross sections are finally fused with the region of interest from the ultrasound image. The system allows for the computation and visualization of slices at a near real-time rate. Primary tests of the system show an added value over pure sonographic imaging. The system also allows for reconstructing volumetric (3D) ultrasonic data of the region of interest, and thus contributes to enhancing the diagnostic yield of midbrain sonography.
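    A minimal sketch of such a slicing step (our illustration; the paper's optimized algorithm and its interpolation scheme are not specified here) is nearest-neighbor sampling of the MRI volume on an arbitrarily oriented plane:

```python
import numpy as np

def extract_oblique_slice(volume, center, u, v, half_size, spacing=1.0):
    """Sample `volume` (a z, y, x array) on a plane through `center`
    spanned by orthonormal in-plane axes u and v (3-vectors in z/y/x
    order), with nearest-neighbor interpolation. Points falling outside
    the volume map to 0."""
    n = int(half_size / spacing)
    s = np.arange(-n, n + 1) * spacing
    a, b = np.meshgrid(s, s, indexing="ij")
    # volume coordinates of every slice pixel: center + a*u + b*v
    coords = (np.asarray(center, float)[:, None, None]
              + np.asarray(u, float)[:, None, None] * a
              + np.asarray(v, float)[:, None, None] * b)
    idx = np.rint(coords).astype(int)
    inside = np.all((idx >= 0) & (idx < np.array(volume.shape)[:, None, None]), axis=0)
    out = np.zeros(a.shape, dtype=volume.dtype)
    out[inside] = volume[idx[0][inside], idx[1][inside], idx[2][inside]]
    return out
```

    In a navigated setup, `center`, `u`, and `v` would come from the tracked, calibrated probe pose transformed into the registered MRI coordinate frame.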

  9. Application of cellular automata approach for cloud simulation and rendering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Christopher Immanuel, W.; Paul Mary Deborrah, S.; Samuel Selvaraj, R.

    Current techniques for creating clouds in games and other real-time applications produce static, homogeneous clouds. These clouds, while viable for real-time applications, do not exhibit the organic feel that clouds in nature exhibit. Clouds simulated with the cellular automata approach, when viewed over a period of time, deform their initial shape and move in a more organic and dynamic way. With this cloud-shape technology we should be able in the future to create even more cloud shapes in real time with more forces. Clouds are an essential part of any computer model of a landscape or an animation of an outdoor scene. A realistic animation of clouds is also important for creating scenes for flight simulators, movies, games, and other applications. Our goal was to create a realistic animation of clouds.
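    One widely used boolean cellular automaton for cloud growth (in the spirit of Dobashi et al.; not necessarily the authors' exact rules) can be sketched on a 2D grid as:

```python
import numpy as np

def step(hum, act, cld):
    """One update of a boolean cloud cellular automaton:
    hum -- humidity present, act -- phase transition about to occur,
    cld -- cloud present. A cell activates when it is humid and any
    4-neighbor is active; cloud persists once formed; humidity is
    consumed by activation."""
    neighbor_act = (np.roll(act, 1, 0) | np.roll(act, -1, 0) |
                    np.roll(act, 1, 1) | np.roll(act, -1, 1))
    hum_next = hum & ~act
    cld_next = cld | act
    act_next = ~act & hum & neighbor_act
    return hum_next, act_next, cld_next
```

    Repeated stepping from a seeded activation grows an irregular, organically deforming cloud region; extinction rules and external force fields can be layered on the same state variables.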

  10. Development of a Climate Record of Tropospheric and Stratospheric Column Ozone from Satellite Remote Sensing: Evidence of an Early Recovery of Global Stratospheric Ozone

    NASA Technical Reports Server (NTRS)

    Ziemke, Jerald R.; Chandra, Sushil

    2012-01-01

    Ozone data beginning October 2004 from the Aura Ozone Monitoring Instrument (OMI) and Aura Microwave Limb Sounder (MLS) are used to evaluate the accuracy of the Cloud Slicing technique in an effort to develop long data records of tropospheric and stratospheric ozone and to study their long-term changes. Using this technique, we have produced a 32-yr (1979-2010) record of tropospheric and stratospheric column ozone from the combined Total Ozone Mapping Spectrometer (TOMS) and OMI. Analyses of these time series suggest that the quasi-biennial oscillation (QBO) is the dominant source of inter-annual variability of stratospheric ozone and is clearest in the Southern Hemisphere during the Aura time record, with related inter-annual changes of 30-40 Dobson Units. Tropospheric ozone for the long record also indicates a QBO signal in the tropics, with peak-to-peak changes varying from 2 to 7 DU. The most important result of our study is that global stratospheric ozone shows the signature of a recovery, with ozone abundance now approaching the levels of year 1980 and earlier. The negative trends in stratospheric ozone in both hemispheres during the first 15 yr of the record have become positive over the last 15 yr, with nearly equal magnitudes. This turnaround in stratospheric ozone loss is occurring about 20 yr earlier than predicted by many chemistry climate models. This suggests that the Montreal Protocol, first signed in 1987 as an international agreement to reduce ozone-destroying substances, is working well and perhaps better than anticipated.

  11. Sensitivity analysis of brain morphometry based on MRI-derived surface models

    NASA Astrophysics Data System (ADS)

    Klein, Gregory J.; Teng, Xia; Schoenemann, P. T.; Budinger, Thomas F.

    1998-07-01

    Quantification of brain structure is important for evaluating changes in brain size with growth and aging and for characterizing neurodegenerative disorders. Previous quantification efforts using ex vivo techniques suffered considerable error due to shrinkage of the cerebrum after extraction from the skull, deformation of slices during sectioning, and numerous other factors. In vivo imaging studies of brain anatomy avoid these problems and allow repeated studies following the progression of brain structure changes due to disease or natural processes. We have developed a methodology for obtaining triangular mesh models of the cortical surface from MRI brain datasets. The cortex is segmented from nonbrain tissue using a 2D region-growing technique combined with occasional manual edits. Once segmented, thresholding and image morphological operations (erosions and openings) are used to expose the regions between adjacent surfaces in deep cortical folds. A 2D region-following procedure is then used to find a set of contours outlining the cortical boundary on each slice. The contours on all slices are tiled together to form a closed triangular mesh model approximating the cortical surface. This model can be used for calculation of cortical surface area and volume, as well as other parameters of interest. Except for the initial segmentation of the cortex from the skull, the technique is automatic and requires only modest computation time on modern workstations. Though the use of image data avoids many of the pitfalls of ex vivo and sectioning techniques, our MRI-based technique is still vulnerable to errors that may impact the accuracy of estimated brain structure parameters. Potential inaccuracies include segmentation errors due to incorrect thresholding, missed deep sulcal surfaces, falsely segmented holes due to image noise, and surface tiling artifacts. The focus of this paper is the characterization of these errors and how they affect measurements of cortical surface area and volume.
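    Once the tiled mesh is closed and consistently oriented, surface area and enclosed volume follow directly; a minimal sketch (our illustration, using the standard signed-tetrahedron / divergence-theorem formula):

```python
import numpy as np

def mesh_area_volume(vertices, faces):
    """Surface area and enclosed volume of a closed triangular mesh.
    vertices: (n, 3) float array; faces: (m, 3) int array with a
    consistent outward orientation. Area sums per-triangle areas;
    volume sums signed tetrahedra against the origin."""
    a, b, c = (vertices[faces[:, i]] for i in range(3))
    cross = np.cross(b - a, c - a)
    area = 0.5 * np.linalg.norm(cross, axis=1).sum()
    volume = np.abs(np.einsum("ij,ij->i", a, np.cross(b, c)).sum()) / 6.0
    return area, volume
```

    Mesh defects of the kinds listed above (holes, tiling artifacts) bias exactly these sums, which is why their characterization matters for the reported measurements.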

  12. Application of cell-surface engineering for visualization of yeast in bread dough: development of a fluorescent bio-imaging technique in the mixing process of dough.

    PubMed

    Maeda, Tatsuro; Shiraga, Seizaburo; Araki, Tetsuya; Ueda, Mitsuyoshi; Yamada, Masaharu; Takeya, Koji; Sagara, Yasuyuki

    2009-07-01

    Cell-surface engineering (Ueda et al., 2000) has been applied to develop a novel technique to visualize yeast in bread dough. Enhanced green fluorescent protein (EGFP) was bonded to the surface of yeast cells, and 0.5% EGFP yeasts were mixed into the dough samples at four different mixing stages. The samples were placed on a cryostat at -30 degrees C and sliced at 10 microm. The sliced samples were observed at an excitation wavelength of 480 nm and a fluorescent wavelength of 520 nm. The results indicated that the combination of the EGFP-displayed yeasts, rapid freezing, and cryo-sectioning made it possible to visualize 2-D distribution of yeast in bread dough to the extent that the EGFP yeasts could be clearly distinguished from the auto-fluorescent background of bread dough.

  13. Super-resolution reconstruction in frequency, image, and wavelet domains to reduce through-plane partial voluming in MRI.

    PubMed

    Gholipour, Ali; Afacan, Onur; Aganj, Iman; Scherrer, Benoit; Prabhu, Sanjay P; Sahin, Mustafa; Warfield, Simon K

    2015-12-01

    To compare and evaluate the use of super-resolution reconstruction (SRR), in frequency, image, and wavelet domains, to reduce through-plane partial voluming effects in magnetic resonance imaging. The reconstruction of an isotropic high-resolution image from multiple thick-slice scans has been investigated through techniques in frequency, image, and wavelet domains. Experiments were carried out with a thick-slice T2-weighted fast spin echo sequence on the American College of Radiology (ACR) MRI phantom, where the reconstructed images were compared to a reference high-resolution scan using peak signal-to-noise ratio (PSNR), structural similarity image metric (SSIM), mutual information (MI), and the mean absolute error (MAE) of image intensity profiles. The application of super-resolution reconstruction was then examined in retrospective processing of clinical neuroimages of ten pediatric patients with tuberous sclerosis complex (TSC) to reduce through-plane partial voluming for improved 3D delineation and visualization of thin radial bands of white matter abnormalities. Quantitative evaluation results show improvements in all evaluation metrics through super-resolution reconstruction in the frequency, image, and wavelet domains, with the highest values obtained from SRR in the image domain. The metric values for image-domain SRR versus the original axial, coronal, and sagittal images were PSNR = 32.26 vs 32.22, 32.16, 30.65; SSIM = 0.931 vs 0.922, 0.924, 0.918; MI = 0.871 vs 0.842, 0.844, 0.831; and MAE = 5.38 vs 7.34, 7.06, 6.19. All similarity metrics showed high correlations with expert ranking of image resolution, with MI showing the highest correlation at 0.943. Qualitative assessment of the neuroimages of ten TSC patients through in-plane and out-of-plane visualization of structures showed the extent of the partial voluming effect in a real clinical scenario and its reduction using SRR. Blinded expert evaluation of image resolution in resampled out-of-plane views consistently showed the superiority of SRR compared to the original axial and coronal image acquisitions. Thick-slice 2D T2-weighted MRI scans are part of many routine clinical protocols due to their high signal-to-noise ratio, but are often severely affected by through-plane partial voluming effects. This study shows that while radiologic assessment is performed in 2D on thick-slice scans, super-resolution MRI reconstruction techniques can be used to fuse those scans to generate a high-resolution image with reduced partial voluming for improved postacquisition processing. Qualitative and quantitative evaluation showed the efficacy of all SRR techniques, with the best results obtained from SRR in the image domain. The limitations of SRR techniques are uncertainties in modeling the slice profile, density compensation, quantization in resampling, and uncompensated motion between scans.
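    Two of the reported metrics are straightforward to sketch (our illustration; SSIM and MI require considerably more machinery):

```python
import numpy as np

def psnr(reference, test, max_val=None):
    """Peak signal-to-noise ratio in dB between a reference image and a
    reconstruction; max_val defaults to the reference's peak intensity."""
    mse = np.mean((reference.astype(float) - test.astype(float)) ** 2)
    peak = float(reference.max()) if max_val is None else max_val
    return 20.0 * np.log10(peak) - 10.0 * np.log10(mse)

def mae(reference, test):
    """Mean absolute error of intensities (applied to intensity
    profiles in the study)."""
    return np.mean(np.abs(reference.astype(float) - test.astype(float)))
```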

  14. New developments in surface technology and prototyping

    NASA Astrophysics Data System (ADS)

    Himmer, Thomas; Beyer, Eckhard

    2003-03-01

    Novel lightweight applications in the automotive and aircraft industries require advanced materials and techniques for surface protection as well as direct and rapid manufacturing of the related components and tools. The manufacturing processes presented in this paper are based on multiple additive and subtractive technologies such as laser cutting, laser welding, direct laser metal deposition, a laser/plasma hybrid spraying technique, and CNC milling. The process chain is similar to layer-based Rapid Prototyping techniques. In the first step, the 3D CAD geometry is sliced into layers by specially developed software. These slices are cut by high-speed laser cutting and then joined together. In this way laminated tools or parts are built. To improve surface quality and to increase wear resistance, a CNC machining center is used. The system consists of a CNC milling machine in which a 3 kW Nd:YAG laser, a coaxial powder nozzle, and a digitizing system are integrated. Using a new laser/plasma hybrid spraying technique, coatings can be deposited onto parts for surface protection. The layers show low porosity and high adhesion strength, with thicknesses up to 0.3 mm, and the reduced effort for preliminary surface preparation lowers the time and cost of the whole process.
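    The slicing step can be sketched as plane-triangle intersection at each layer height (a simplified illustration, not the authors' software; edges with a vertex exactly on the plane are ignored here):

```python
import numpy as np

def slice_triangle(tri, z):
    """Intersect one mesh triangle (3x3 array of xyz vertices) with the
    plane Z = z. Returns the endpoints of the intersection segment, or
    None if the plane misses the triangle. Applying this to every
    triangle of a CAD mesh yields the contour segments of one layer."""
    pts = []
    for i in range(3):
        p, q = tri[i], tri[(i + 1) % 3]
        if (p[2] - z) * (q[2] - z) < 0:  # edge crosses the plane
            t = (z - p[2]) / (q[2] - p[2])
            pts.append(p + t * (q - p))
    return (pts[0], pts[1]) if len(pts) == 2 else None
```

    Chaining the resulting segments into closed loops gives the cut paths for the laser at each layer height.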

  15. Three-dimensional inversion recovery manganese-enhanced MRI of mouse brain using super-resolution reconstruction to visualize nuclei involved in higher brain function.

    PubMed

    Poole, Dana S; Plenge, Esben; Poot, Dirk H J; Lakke, Egbert A J F; Niessen, Wiro J; Meijering, Erik; van der Weerd, Louise

    2014-07-01

    The visualization of activity in mouse brain using inversion recovery spin echo (IR-SE) manganese-enhanced MRI (MEMRI) provides unique contrast, but suffers from poor resolution in the slice-encoding direction. Super-resolution reconstruction (SRR) is a resolution-enhancing post-processing technique in which multiple low-resolution slice stacks are combined into a single volume of high isotropic resolution using computational methods. In this study, we investigated, first, whether SRR can improve the three-dimensional resolution of IR-SE MEMRI in the slice selection direction, whilst maintaining or improving the contrast-to-noise ratio of the two-dimensional slice stacks. Second, the contrast-to-noise ratio of SRR IR-SE MEMRI was compared with a conventional three-dimensional gradient echo (GE) acquisition. Quantitative experiments were performed on a phantom containing compartments of various manganese concentrations. The results showed that, with comparable scan times, the signal-to-noise ratio of three-dimensional GE acquisition is higher than that of SRR IR-SE MEMRI. However, the contrast-to-noise ratio between different compartments can be superior with SRR IR-SE MEMRI, depending on the chosen inversion time. In vivo experiments were performed in mice receiving manganese using an implanted osmotic pump. The results showed that SRR works well as a resolution-enhancing technique in IR-SE MEMRI experiments. In addition, the SRR image also shows a number of brain structures that are more clearly discernible from the surrounding tissues than in three-dimensional GE acquisition, including a number of nuclei with specific higher brain functions, such as memory, stress, anxiety and reward behavior. Copyright © 2014 John Wiley & Sons, Ltd.

  16. Static Memory Deduplication for Performance Optimization in Cloud Computing.

    PubMed

    Jia, Gangyong; Han, Guangjie; Wang, Hao; Yang, Xuan

    2017-04-27

    In a cloud computing environment, the number of virtual machines (VMs) on a single physical server and the number of applications running on each VM are continuously growing. This has led to an enormous increase in the demand of memory capacity and subsequent increase in the energy consumption in the cloud. Lack of enough memory has become a major bottleneck for scalability and performance of virtualization interfaces in cloud computing. To address this problem, memory deduplication techniques which reduce memory demand through page sharing are being adopted. However, such techniques suffer from overheads in terms of number of online comparisons required for the memory deduplication. In this paper, we propose a static memory deduplication (SMD) technique which can reduce memory capacity requirement and provide performance optimization in cloud computing. The main innovation of SMD is that the process of page detection is performed offline, thus potentially reducing the performance cost, especially in terms of response time. In SMD, page comparisons are restricted to the code segment, which has the highest shared content. Our experimental results show that SMD efficiently reduces memory capacity requirement and improves performance. We demonstrate that, compared to other approaches, the cost in terms of the response time is negligible.
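    The core idea of page sharing by content can be sketched as follows (our simplified illustration, not the SMD implementation; a real system would verify byte equality on a hash collision before sharing):

```python
import hashlib

def deduplicate_pages(pages):
    """Offline page-deduplication sketch. `pages` is a list of
    page-sized byte strings (e.g. a code segment split into 4 KiB
    pages). Returns (unique_pages, mapping), where mapping[i] is the
    index in unique_pages that page i shares."""
    unique, index_of, mapping = [], {}, []
    for page in pages:
        digest = hashlib.sha256(page).digest()
        if digest not in index_of:
            index_of[digest] = len(unique)
            unique.append(page)
        mapping.append(index_of[digest])
    return unique, mapping
```

    Restricting the scan to read-only code pages, as SMD does, keeps such a mapping valid without the copy-on-write machinery needed for writable pages.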

  17. New Perspectives of Point Clouds Color Management - the Development of Tool in Matlab for Applications in Cultural Heritage

    NASA Astrophysics Data System (ADS)

    Pepe, M.; Ackermann, S.; Fregonese, L.; Achille, C.

    2017-02-01

    The paper describes a method for point cloud color management and integration for data obtained from Terrestrial Laser Scanner (TLS) and Image-Based (IB) survey techniques. Especially in the Cultural Heritage (CH) environment, methods and techniques to improve the color quality of point clouds have a key role, because a homogeneous texture leads to a more accurate reconstruction of the investigated object and to a more pleasant perception of the object's color as well. A color management method for point clouds can be useful for a single dataset acquired by the TLS or IB technique, as well as in cases of chromatic heterogeneity resulting from merging different datasets. The latter condition can occur when the scans are acquired at different moments of the same day, or when scans of the same object are performed over a period of weeks or months and consequently under different environment/lighting conditions. In this paper, a procedure to balance the point cloud color in order to make the different datasets uniform, to improve the chromatic quality, and to highlight further details is presented and discussed.
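    One simple balancing strategy consistent with this goal (our illustration; the paper's actual procedure may differ) is a per-channel mean/standard-deviation transfer toward a reference dataset:

```python
import numpy as np

def match_colors(colors, reference):
    """Linearly map each RGB channel of `colors` (n x 3 array, 0-255)
    so its mean and standard deviation match `reference` (m x 3).
    A global statistics transfer; real pipelines may instead work per
    scan region or in a perceptual color space."""
    c, r = colors.astype(float), reference.astype(float)
    scale = r.std(axis=0) / np.maximum(c.std(axis=0), 1e-9)
    out = (c - c.mean(axis=0)) * scale + r.mean(axis=0)
    return np.clip(out, 0, 255)
```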

  18. Static Memory Deduplication for Performance Optimization in Cloud Computing

    PubMed Central

    Jia, Gangyong; Han, Guangjie; Wang, Hao; Yang, Xuan

    2017-01-01

    In a cloud computing environment, the number of virtual machines (VMs) on a single physical server and the number of applications running on each VM are continuously growing. This has led to an enormous increase in the demand of memory capacity and subsequent increase in the energy consumption in the cloud. Lack of enough memory has become a major bottleneck for scalability and performance of virtualization interfaces in cloud computing. To address this problem, memory deduplication techniques which reduce memory demand through page sharing are being adopted. However, such techniques suffer from overheads in terms of number of online comparisons required for the memory deduplication. In this paper, we propose a static memory deduplication (SMD) technique which can reduce memory capacity requirement and provide performance optimization in cloud computing. The main innovation of SMD is that the process of page detection is performed offline, thus potentially reducing the performance cost, especially in terms of response time. In SMD, page comparisons are restricted to the code segment, which has the highest shared content. Our experimental results show that SMD efficiently reduces memory capacity requirement and improves performance. We demonstrate that, compared to other approaches, the cost in terms of the response time is negligible. PMID:28448434

  19. Estimating GATE rainfall with geosynchronous satellite images

    NASA Technical Reports Server (NTRS)

    Stout, J. E.; Martin, D. W.; Sikdar, D. N.

    1979-01-01

    A method of estimating GATE rainfall from either visible or infrared images from geosynchronous satellites is described. Rain is estimated from cumulonimbus cloud area by the equation R = a_0 A + a_1 dA/dt, where R is volumetric rainfall, A is cloud area, t is time, and a_0 and a_1 are constants. Rainfall, calculated from 5.3-cm ship radar, and cloud area are measured from clouds in the tropical North Atlantic. The constants a_0 and a_1 are fit to these measurements by the least-squares method. Hourly estimates by the infrared version of this technique correlate well (correlation coefficient of 0.84) with rain totals derived from composited radar for an area of 100,000 sq km. The accuracy of this method is described and compared to that of another technique using geosynchronous satellite images. It is concluded that this technique provides useful estimates of tropical oceanic rainfall on a convective scale.
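    The least-squares fit of a_0 and a_1 can be sketched directly (our illustration, with a finite-difference estimate of dA/dt):

```python
import numpy as np

def fit_rain_coefficients(area, rain, dt=1.0):
    """Fit R = a0*A + a1*dA/dt by linear least squares.
    area: cloud-area time series; rain: matching volumetric rainfall
    from radar; dt: sampling interval. Returns (a0, a1)."""
    dA_dt = np.gradient(area, dt)
    X = np.column_stack([area, dA_dt])
    coeffs, *_ = np.linalg.lstsq(X, rain, rcond=None)
    return coeffs[0], coeffs[1]
```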

  20. Lidar

    NASA Technical Reports Server (NTRS)

    Collis, R. T. H.

    1969-01-01

    Lidar is an optical radar technique employing laser energy. Variations in signal intensity as a function of range provide information on atmospheric constituents, even when these are too tenuous to be normally visible. The theoretical and technical basis of the technique is described and typical values of the atmospheric optical parameters given. The significance of these parameters to atmospheric and meteorological problems is discussed. While the basic technique can provide valuable information about clouds and other material in the atmosphere, it is not possible to determine particle size and number concentrations precisely. There are also inherent difficulties in evaluating lidar observations. Nevertheless, lidar can provide much useful information as is shown by illustrations. These include lidar observations of: cirrus cloud, showing mountain wave motions; stratification in clear air due to the thermal profile near the ground; determinations of low cloud and visibility along an air-field approach path; and finally the motion and internal structure of clouds of tracer materials (insecticide spray and explosion-caused dust) which demonstrate the use of lidar for studying transport and diffusion processes.

  1. Evaluation of wind field statistics near and inside clouds using a coherent Doppler lidar

    NASA Astrophysics Data System (ADS)

    Lottman, Brian Todd

    1998-09-01

    This work proposes advanced techniques for measuring the spatial wind field statistics near and inside clouds using a vertically pointing solid-state coherent Doppler lidar on a fixed ground-based platform. The coherent Doppler lidar is an ideal instrument for high spatial and temporal resolution velocity estimates. The basic parameters of lidar are discussed, including a complete statistical description of the Doppler lidar signal. This description is extended to cases with simple functional forms for aerosol backscatter and velocity. An estimate for the mean velocity over a sensing volume is produced by estimating the mean spectra. There are many traditional spectral estimators, which are useful for conditions with slowly varying velocity and backscatter. A new class of (novel) estimators is introduced that produces reliable velocity estimates for conditions with large variations in aerosol backscatter and velocity with range, such as cloud conditions. Performance of the traditional and novel estimators is computed for a variety of deterministic atmospheric conditions using computer-simulated data. Wind field statistics are produced from actual data for a cloud deck and for multi-layer clouds. Unique results include detection of possible spectral signatures for rain, estimates of the structure function inside a cloud deck, reliable velocity estimation techniques near and inside thin clouds, and estimates of simple wind field statistics between cloud layers.
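    A traditional mean-spectrum velocity estimate can be sketched as the first moment of the periodogram, mapped to radial velocity via v = lambda*f/2 (our simplified illustration; the novel estimators for strongly range-varying backscatter are more involved):

```python
import numpy as np

def mean_velocity(signal, fs, wavelength):
    """Estimate mean radial velocity from complex coherent-lidar
    samples: first moment of the power spectrum, mapped through
    v = wavelength * f / 2. signal: complex baseband samples;
    fs: sampling rate (Hz); wavelength: laser wavelength (m)."""
    spectrum = np.abs(np.fft.fft(signal)) ** 2
    freqs = np.fft.fftfreq(len(signal), d=1.0 / fs)
    f_mean = np.sum(freqs * spectrum) / np.sum(spectrum)
    return 0.5 * wavelength * f_mean
```

    Such moment estimators work well when backscatter and velocity vary slowly across the sensing volume, which is exactly the assumption that breaks down in cloud conditions.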

  2. Assessment of left ventricular function and mass by MR imaging: a stereological study based on the systematic slice sampling procedure.

    PubMed

    Mazonakis, Michalis; Sahin, Bunyamin; Pagonidis, Konstantin; Damilakis, John

    2011-06-01

    The aim of this study was to combine the stereological technique with magnetic resonance (MR) imaging data for the volumetric and functional analysis of the left ventricle (LV). Cardiac MR examinations were performed in 13 consecutive subjects with known or suspected coronary artery disease. The end-diastolic volume (EDV), end-systolic volume, ejection fraction (EF), and mass were estimated by stereology using the entire slice set depicting LV and systematic sampling intensities of 1/2 and 1/3 that provided samples with every second and third slice, respectively. The repeatability of stereology was evaluated. Stereological assessments were compared with the reference values derived by manually tracing the endocardial and epicardial contours on MR images. Stereological EDV and EF estimations obtained by the 1/3 systematic sampling scheme were significantly different from those by manual delineation (P < .05). No difference was observed between the reference values and the LV parameters estimated by the entire slice set or a sampling intensity of 1/2 (P > .05). For these stereological approaches, a high correlation (r(2) = 0.80-0.93) and clinically acceptable limits of agreement were found with the reference method. Stereological estimations obtained by both sample sizes presented comparable coefficient of variation values of 2.9-5.8%. The mean time for stereological measurements on the entire slice set was 3.4 ± 0.6 minutes and it was reduced to 2.5 ± 0.5 minutes with the 1/2 systematic sampling scheme. Stereological analysis on systematic samples of MR slices generated by the 1/2 sampling intensity provided efficient and quick assessment of LV volumes, function, and mass. Copyright © 2011 AUR. Published by Elsevier Inc. All rights reserved.
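    The systematic-sampling estimate is essentially the Cavalieri principle; a minimal sketch (our illustration, with hypothetical units):

```python
def cavalieri_volume(slice_areas, slice_thickness, sampling_step=1):
    """Cavalieri volume estimate from systematically sampled slices:
    keep every `sampling_step`-th slice area and compensate by the same
    factor, so V ~= step * thickness * sum(sampled areas).
    slice_areas in mm^2, slice_thickness in mm; returns mm^3."""
    sampled = slice_areas[::sampling_step]
    return sampling_step * slice_thickness * sum(sampled)
```

    A sampling_step of 2 corresponds to the study's 1/2 sampling intensity; the estimate stays unbiased while roughly halving measurement time.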

  3. Characterization of cortical neuronal and glial alterations during culture of organotypic whole brain slices from neonatal and mature mice.

    PubMed

    Staal, Jerome A; Alexander, Samuel R; Liu, Yao; Dickson, Tracey D; Vickers, James C

    2011-01-01

    Organotypic brain slice culturing techniques are extensively used in a wide range of experimental procedures and are particularly useful in providing mechanistic insights into neurological disorders or injury. The cellular and morphological alterations associated with hippocampal brain slice cultures have been well established; however, the response of mouse cortical neurons to culture is not well documented. In the current study, we compared cell viability, as well as phenotypic and protein expression changes in cortical neurons, in whole brain slice cultures from mouse neonates (P4-6), adolescent animals (P25-28), and mature adults (P50+). Cultures were prepared using the membrane interface method. Propidium iodide labeling of nuclei (due to compromised cell membranes) and AlamarBlue™ (cell respiration) analysis demonstrated that neonatal tissue was significantly less vulnerable to long-term culture than the more mature brain tissues. Cultures from P6 animals showed a significant increase in the expression of synaptic markers and a decrease in growth-associated proteins over the entire culture period. However, morphological analysis of organotypic brain slices cultured from neonatal tissue demonstrated that there were substantial changes to neuronal and glial organization within the neocortex, with a distinct loss of cytoarchitectural stratification and increased GFAP expression (p<0.05). Additionally, cultures from neonatal tissue had no glia limitans and, after 14 DIV, displayed substantial cellular protrusions from slice edges, including cells that expressed both glial and neuronal markers. In summary, we present a substantial evaluation of the viability and morphological changes that occur in the neocortex of whole brain tissue cultures from different ages over an extended period of culture.

  4. Intensity-corrected Herschel Observations of Nearby Isolated Low-mass Clouds

    NASA Astrophysics Data System (ADS)

    Sadavoy, Sarah I.; Keto, Eric; Bourke, Tyler L.; Dunham, Michael M.; Myers, Philip C.; Stephens, Ian W.; Di Francesco, James; Webb, Kristi; Stutz, Amelia M.; Launhardt, Ralf; Tobin, John J.

    2018-01-01

    We present intensity-corrected Herschel maps at 100, 160, 250, 350, and 500 μm for 56 isolated low-mass clouds. We determine the zero-point corrections for Herschel Photodetector Array Camera and Spectrometer (PACS) and Spectral Photometric Imaging Receiver (SPIRE) maps from the Herschel Science Archive (HSA) using Planck data. Since these HSA maps are small, we cannot correct them using typical methods. Here we introduce a technique to measure the zero-point corrections for small Herschel maps. We use radial profiles to identify offsets between the observed HSA intensities and the expected intensities from Planck. Most clouds have reliable offset measurements with this technique. In addition, we find that roughly half of the clouds have underestimated HSA-SPIRE intensities in their outer envelopes relative to Planck, even though the HSA-SPIRE maps were previously zero-point corrected. Using our technique, we produce corrected Herschel intensity maps for all 56 clouds and determine their line-of-sight average dust temperatures and optical depths from modified blackbody fits. The clouds have typical temperatures of ∼14-20 K and optical depths of ∼10^-5-10^-3. Across the whole sample, we find an anticorrelation between temperature and optical depth. We also find lower temperatures than what was measured in previous Herschel studies, which subtracted out a background level from their intensity maps to circumvent the zero-point correction. Accurate Herschel observations of clouds are key to obtaining accurate density and temperature profiles. To make such future analyses possible, intensity-corrected maps for all 56 clouds are publicly available in the electronic version. Herschel is an ESA space observatory with science instruments provided by European-led Principal Investigator consortia and with important participation from NASA.
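    A modified-blackbody fit of temperature and optical depth can be sketched with a grid search (our illustration; the emissivity index beta and the reference frequency nu0 are assumed fixed here, which the paper may handle differently):

```python
import numpy as np

H, K, C = 6.626e-34, 1.381e-23, 2.998e8  # SI: Planck, Boltzmann, c

def planck(nu, T):
    """Planck function B_nu(T) in SI units."""
    return 2 * H * nu**3 / C**2 / np.expm1(H * nu / (K * T))

def fit_modified_blackbody(nu, intensity, nu0=1e12, beta=1.8):
    """Fit I_nu = tau0 * (nu/nu0)**beta * B_nu(T) over the observed
    frequencies by grid search in T; for each T the best tau0 is the
    closed-form linear least-squares solution."""
    best = (np.inf, None, None)
    for T in np.linspace(5.0, 50.0, 451):
        model = (nu / nu0) ** beta * planck(nu, T)
        tau0 = np.dot(model, intensity) / np.dot(model, model)
        err = np.sum((intensity - tau0 * model) ** 2)
        if err < best[0]:
            best = (err, T, tau0)
    return best[1], best[2]  # (temperature, optical depth at nu0)
```

    With the five Herschel bands as inputs, such a fit returns one line-of-sight average temperature and optical depth per pixel.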

  5. Liquid Water Cloud Measurements Using the Raman Lidar Technique: Current Understanding and Future Research Needs

    NASA Technical Reports Server (NTRS)

    Tetsu, Sakai; Whiteman, David N.; Russo, Felicita; Turner, David D.; Veselovskii, Igor; Melfi, S. Harvey; Nagai, Tomohiro; Mano, Yuzo

    2013-01-01

    This paper describes recent work in the Raman lidar liquid water cloud measurement technique. The range-resolved spectral measurements at the National Aeronautics and Space Administration Goddard Space Flight Center indicate that the Raman backscattering spectra measured in and below low clouds agree well with theoretical spectra for vapor and liquid water. The calibration coefficients of the liquid water measurement for the Raman lidar at the Atmospheric Radiation Measurement Program Southern Great Plains site of the U.S. Department of Energy were determined by comparison with the liquid water path (LWP) obtained with the Atmospheric Emitted Radiance Interferometer (AERI) and the liquid water content (LWC) obtained with the millimeter wavelength cloud radar and water vapor radiometer (MMCR-WVR) together. These comparisons were used to estimate the Raman liquid water cross-sectional value. The results indicate a bias consistent with an effective liquid water Raman cross-sectional value that is 28%-46% lower than published, which may be explained by the fact that the difference in the detectors' sensitivity has not been accounted for. The LWP of a thin altostratus cloud showed good qualitative agreement between lidar retrievals and AERI. However, the overall ensemble of comparisons of LWP showed considerable scatter, possibly because of the different fields of view of the instruments, the 350-m distance between the instruments, and the horizontal inhomogeneity of the clouds. The LWC profiles for a thick stratus cloud showed agreement between lidar retrievals and MMCR-WVR between the cloud base and 150 m above it, where the optical depth was less than 3. Areas requiring further research in this technique are discussed.

  6. Digital tomosynthesis rendering of joint margins for arthritis assessment

    NASA Astrophysics Data System (ADS)

    Duryea, Jeffrey W.; Neumann, Gesa; Yoshioka, Hiroshi; Dobbins, James T., III

    2004-05-01

    PURPOSE: Rheumatoid arthritis (RA) of the hand is a significant healthcare problem. Techniques to accurately quantify the structural changes from RA are crucial for the development and prescription of therapies. Analysis of radiographic joint space width (JSW) is widely used and has demonstrated promise. However, radiography presents a 2D view of the joint. In this study we performed tomosynthesis reconstructions of proximal interphalangeal (PIP), and metacarpophalangeal (MCP) joints to measure the 3D joint structure. METHODS: We performed a reader study using simulated radiographs of 12 MCP and 12 PIP joints from skeletal specimens imaged with micro-CT. The tomosynthesis technique provided images of reconstructed planes with 0.75 mm spacing, which were presented to 2 readers with a computer tool. The readers were instructed to delineate the joint surfaces on tomosynthetic slices where they could visualize the margins. We performed a quantitative analysis of 5 slices surrounding the central portion of each joint. Reader-determined JSW was compared to a gold standard. As a figure of merit we calculated the average root-mean-square deviation (RMSD). RESULTS: RMSD was 0.22 mm for both joints. For the individual joints, RMSD was 0.18 mm (MCP), and 0.26 mm (PIP). The reduced performance for the smaller PIP joints suggests that a slice spacing of less than 0.75 mm may be more appropriate. CONCLUSIONS: We have demonstrated the capability of limited 3D rendering of joint surfaces using digital tomosynthesis. This technique promises to provide an improved method to visualize the structural changes of RA.
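
    The figure of merit used above is a plain root-mean-square deviation between reader-measured and gold-standard JSW. A minimal sketch; the joint-space-width values below are hypothetical, not from the study:

```python
import math

def rmsd(measured, gold):
    """Root-mean-square deviation between reader JSW values and gold standard (mm)."""
    if len(measured) != len(gold):
        raise ValueError("length mismatch")
    return math.sqrt(sum((m - g) ** 2 for m, g in zip(measured, gold)) / len(measured))

# Hypothetical joint-space-width readings (mm) vs. gold standard
reader = [1.8, 2.1, 1.6, 2.4]
truth = [2.0, 2.0, 1.5, 2.2]
print(round(rmsd(reader, truth), 3))
```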

  7. A Dye-Tracer Technique for Experimentally Obtaining Impingement Characteristics of Arbitrary Bodies and a Method for Determining Droplet Size Distribution

    NASA Technical Reports Server (NTRS)

    VonGlahn, Uwe H.; Gelder, Thomas F.; Smyers, William H., Jr.

    1955-01-01

    A dye-tracer technique has been developed whereby the quantity of dyed water collected on a blotter-wrapped body exposed to an air stream containing a dyed-water spray cloud can be colorimetrically determined in order to obtain local collection efficiencies, total collection efficiency, and rearward extent of impingement on the body. In addition, a method has been developed whereby the impingement characteristics obtained experimentally for a body can be related to theoretical impingement data for the same body in order to determine the droplet size distribution of the impinging cloud. Several cylinders, a ribbon, and an aspirating device to measure cloud liquid-water content were used in the studies presented herein for the purpose of evaluating the dye-tracer technique. Although the dye-tracer technique requires careful experimental control, the methods presented herein should be applicable in any wind tunnel provided the humidity of the air stream can be maintained near saturation.
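
    The colorimetric reduction described above boils down to one ratio: water actually caught per unit blotter area divided by the free-stream water flux. A hedged sketch with made-up numbers; the variable names and values are illustrative, not from the NACA report:

```python
def local_collection_efficiency(dye_mass_mg, blotter_area_cm2, dye_conc_mg_per_g,
                                lwc_g_per_m3, airspeed_m_s, exposure_s):
    """Local collection efficiency: water caught per unit area divided by the
    free-stream water flux (LWC * airspeed * exposure time)."""
    # Convert colorimetrically measured dye mass back to grams of impinged water
    water_per_area = dye_mass_mg / dye_conc_mg_per_g / blotter_area_cm2   # g/cm^2
    free_stream_flux = lwc_g_per_m3 * airspeed_m_s * exposure_s * 1e-4    # g/cm^2 (1 m^2 = 1e4 cm^2)
    return water_per_area / free_stream_flux

# Hypothetical run: 0.2 mg dye on 10 cm^2 of blotter, 0.5 mg dye per g water,
# LWC 0.5 g/m^3, 100 m/s airspeed, 10 s exposure
print(round(local_collection_efficiency(0.2, 10.0, 0.5, 0.5, 100.0, 10.0), 3))
```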

  8. Experience of the JPL Exploratory Data Analysis Team at validating HIRS2/MSU cloud parameters

    NASA Technical Reports Server (NTRS)

    Kahn, Ralph; Haskins, Robert D.; Granger-Gallegos, Stephanie; Pursch, Andrew; Delgenio, Anthony

    1992-01-01

    Validation of the HIRS2/MSU cloud parameters began with the cloud/climate feedback problem. The derived effective cloud amount is less sensitive to surface temperature for higher clouds. This occurs because, as the cloud elevation increases, the difference between surface temperature and cloud temperature increases, so only a small change in cloud amount is needed to effect a large change in radiance at the detector. Validating the cloud parameters here means 'developing a quantitative sense for the physical meaning of the measured parameters', by: (1) identifying the assumptions involved in deriving parameters from the measured radiances, (2) testing the input data and derived parameters for statistical error, sensitivity, and internal consistency, and (3) comparing with similar parameters obtained from other sources using other techniques.

  9. Multilayered Clouds Identification and Retrieval for CERES Using MODIS

    NASA Technical Reports Server (NTRS)

    Sun-Mack, Sunny; Minnis, Patrick; Chen, Yan; Yi, Yuhong; Huang, Jianping; Lin, Bin; Fan, Alice; Gibson, Sharon; Chang, Fu-Lung

    2006-01-01

    Traditionally, analyses of satellite data have been limited to interpreting the radiances in terms of single-layer clouds. Generally, this results in significant errors in the retrieved properties for multilayered cloud systems. Two techniques for detecting overlapped clouds and retrieving the cloud properties using satellite data are explored to help address the need for better quantification of cloud vertical structure. The first technique was developed using multispectral imager data with secondary imager products (infrared brightness temperature differences, BTD). The other method uses microwave (MWR) data. The use of BTD, the 11-12 micrometer brightness temperature difference, in conjunction with tau, the retrieved visible optical depth, was suggested by Kawamoto et al. (2001) and used by Pavolonis et al. (2004) as a means to detect multilayered clouds. Combining visible (VIS; 0.65 micrometer) and infrared (IR) retrievals of cloud properties with microwave (MW) retrievals of cloud water temperature Tw and liquid water path LWP retrieved from satellite microwave imagers appears to be a fruitful approach for detecting and retrieving overlapped clouds (Lin et al., 1998; Ho et al., 2003; Huang et al., 2005). The BTD method is limited to optically thin cirrus over low clouds, while the MWR method is limited to ocean areas. With the availability of VIS and IR data from the Moderate Resolution Imaging Spectroradiometer (MODIS) and MW data from the Advanced Microwave Scanning Radiometer EOS (AMSR-E), both on Aqua, it is now possible to examine both approaches simultaneously. This paper explores the use of the BTD method as applied to MODIS and AMSR-E data taken from the Aqua satellite over non-polar ocean surfaces.
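
    The BTD screening logic described above can be pictured as a simple joint test: a large 11-12 micrometer brightness temperature difference over a scene whose visible optical depth looks like a thick low cloud suggests thin cirrus on top. A toy version with placeholder thresholds; the operational tests and their values differ:

```python
def flag_multilayer(t11_k, t12_k, tau_vis, btd_min=1.5, tau_range=(4.0, 40.0)):
    """Flag likely thin-cirrus-over-low-cloud overlap: an 11-12 um brightness
    temperature difference (BTD) that is large while the retrieved visible
    optical depth indicates a thick cloud hints at a semi-transparent ice
    layer above a water cloud. Thresholds here are illustrative placeholders."""
    btd = t11_k - t12_k
    return btd >= btd_min and tau_range[0] <= tau_vis <= tau_range[1]

print(flag_multilayer(265.0, 262.5, 12.0))   # large BTD over an optically thick scene
print(flag_multilayer(265.0, 264.8, 12.0))   # small BTD: consistent with a single layer
```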

  10. Automated cloud classification using a ground based infra-red camera and texture analysis techniques

    NASA Astrophysics Data System (ADS)

    Rumi, Emal; Kerr, David; Coupland, Jeremy M.; Sandford, Andrew P.; Brettle, Mike J.

    2013-10-01

    Clouds play an important role in influencing the dynamics of local and global weather and climate conditions. Continuous monitoring of clouds is vital for weather forecasting and for air-traffic control. Convective clouds such as Towering Cumulus (TCU) and Cumulonimbus clouds (CB) are associated with thunderstorms, turbulence and atmospheric instability. Human observers periodically report the presence of CB and TCU clouds during operational hours at airports and observatories; however such observations are expensive and time limited. Robust, automatic classification of cloud type using infrared ground-based instrumentation offers the advantage of continuous, real-time (24/7) data capture and the representation of cloud structure in the form of a thermal map, which can greatly help to characterise certain cloud formations. The work presented here utilised a ground based infrared (8-14 μm) imaging device mounted on a pan/tilt unit for capturing high spatial resolution sky images. These images were processed to extract 45 separate textural features using statistical and spatial frequency based analytical techniques. These features were used to train a weighted k-nearest neighbour (KNN) classifier in order to determine cloud type. Ground truth data were obtained by inspection of images captured simultaneously from a visible wavelength colour camera at the same installation, with approximately the same field of view as the infrared device. These images were classified by a trained cloud observer. Results from the KNN classifier gave an encouraging success rate. A Probability of Detection (POD) of up to 90% with a Probability of False Alarm (POFA) as low as 16% was achieved.
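
    The classifier stage described above can be sketched as a distance-weighted k-NN vote over texture feature vectors. The two-feature vectors and class labels below are invented for illustration; the actual system used 45 features:

```python
import math
from collections import defaultdict

def weighted_knn(train, query, k=3):
    """Inverse-distance-weighted k-nearest-neighbour vote.
    `train` is a list of (feature_vector, label); returns the winning class."""
    nearest = sorted((math.dist(f, query), lbl) for f, lbl in train)[:k]
    votes = defaultdict(float)
    for d, lbl in nearest:
        votes[lbl] += 1.0 / (d + 1e-9)   # closer neighbours vote more strongly
    return max(votes, key=votes.get)

# Toy 2-feature vectors (e.g. contrast, entropy) for two cloud classes
train = [((0.10, 0.20), "CB"), ((0.15, 0.25), "CB"),
         ((0.80, 0.90), "TCU"), ((0.85, 0.80), "TCU")]
print(weighted_knn(train, (0.12, 0.22)))
```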

  11. Some physical and thermodynamic properties of rocket exhaust clouds measured with infrared scanners

    NASA Technical Reports Server (NTRS)

    Gomberg, R. I.; Kantsios, A. G.; Rosensteel, F. J.

    1977-01-01

    Measurements using infrared scanners were made of the radiation from exhaust clouds from liquid- and solid-propellant rocket boosters. Field measurements from four launches were discussed. These measurements were intended to explore the physical and thermodynamic properties of these exhaust clouds during their formation and subsequent dispersion. Information was obtained concerning the initial cloud's buoyancy, the stabilized cloud's shape and trajectory, the cloud volume as a function of time, and its initial and stabilized temperatures. Differences in radiation intensities at various wavelengths from ambient and stabilized exhaust clouds were investigated as a method of distinguishing between the two types of clouds. The infrared remote sensing method can be used at night, when visible-range cameras are inadequate. Infrared scanning techniques developed in this project can be applied directly to natural clouds, clouds containing certain radionuclides, or clouds of industrial pollution.

  12. Development of methods for inferring cloud thickness and cloud-base height from satellite radiance data

    NASA Technical Reports Server (NTRS)

    Smith, William L., Jr.; Minnis, Patrick; Alvarez, Joseph M.; Uttal, Taneil; Intrieri, Janet M.; Ackerman, Thomas P.; Clothiaux, Eugene

    1993-01-01

    Cloud-top height is a major factor determining the outgoing longwave flux at the top of the atmosphere. The downwelling radiation from the cloud strongly affects the cooling rate within the atmosphere and the longwave radiation incident at the surface. Thus, determination of cloud-base temperature is important for proper calculation of fluxes below the cloud. Cloud-base altitude is also an important factor in aircraft operations. Cloud-top height or temperature can be derived in a straightforward manner using satellite-based infrared data. Cloud-base temperature, however, is not observable from the satellite, but is related to the height, phase, and optical depth of the cloud in addition to other variables. This study uses surface and satellite data taken during the First ISCCP Regional Experiment (FIRE) Phase-2 Intensive Field Observation (IFO) period (13 Nov. - 7 Dec. 1991) to improve techniques for deriving cloud-base height from conventional satellite data.

  13. Modelling operations and security of cloud systems using Z-notation and Chinese Wall security policy

    NASA Astrophysics Data System (ADS)

    Basu, Srijita; Sengupta, Anirban; Mazumdar, Chandan

    2016-11-01

    Enterprises are increasingly using cloud computing for hosting their applications. Availability of fast Internet and cheap bandwidth are causing greater numbers of people to use cloud-based services. This has the advantages of lower cost and minimal maintenance. However, ensuring security of user data and proper management of cloud infrastructure remain major areas of concern. Existing techniques are either too complex or fail to properly represent the actual cloud scenario. This article presents a formal cloud model using the constructs of Z-notation. Principles of the Chinese Wall security policy have been applied to design secure cloud-specific operations. The proposed methodology will enable users to safely host their services, as well as process sensitive data, in the cloud.
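
    The core of the Chinese Wall policy mentioned above is a history-based access check: once a subject touches one company's data, other companies in the same conflict-of-interest class become off-limits. A minimal sketch (the article formalizes this in Z-notation; the class and company names below are hypothetical):

```python
class ChineseWall:
    """Minimal Chinese Wall check: a subject may access an object unless it has
    already accessed an object of a different company in the same
    conflict-of-interest (COI) class."""
    def __init__(self):
        self.history = {}   # subject -> set of (coi_class, company)

    def can_access(self, subject, coi_class, company):
        for c, comp in self.history.get(subject, set()):
            if c == coi_class and comp != company:
                return False
        return True

    def access(self, subject, coi_class, company):
        if not self.can_access(subject, coi_class, company):
            raise PermissionError("conflict of interest")
        self.history.setdefault(subject, set()).add((coi_class, company))

wall = ChineseWall()
wall.access("tenant1", "banks", "BankA")
print(wall.can_access("tenant1", "banks", "BankB"))   # blocked: same COI class
print(wall.can_access("tenant1", "oil", "OilCo"))     # allowed: different class
```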

  14. Simultaneous and synergistic profiling of cloud and drizzle properties using ground-based observations

    NASA Astrophysics Data System (ADS)

    Rusli, Stephanie P.; Donovan, David P.; Russchenberg, Herman W. J.

    2017-12-01

    Despite the importance of radar reflectivity (Z) measurements in the retrieval of liquid water cloud properties, it remains nontrivial to interpret Z due to the possible presence of drizzle droplets within the clouds. So far, there has been no published work that utilizes Z to identify the presence of drizzle above the cloud base in an optimized and physically consistent manner. In this work, we develop a retrieval technique that exploits the synergy of different remote sensing systems to carry out this task and to subsequently profile the microphysical properties of the cloud and drizzle in a unified framework. This is accomplished by using ground-based measurements of Z, lidar attenuated backscatter below as well as above the cloud base, and microwave brightness temperatures. Fast physical forward models coupled to cloud and drizzle structure parameterization are used in an optimal-estimation-type framework in order to retrieve the best estimate for the cloud and drizzle property profiles. The cloud retrieval is first evaluated using synthetic signals generated from large-eddy simulation (LES) output to verify the forward models used in the retrieval procedure and the vertical parameterization of the liquid water content (LWC). From this exercise it is found that, on average, the cloud properties can be retrieved within 5 % of the mean truth. The full cloud-drizzle retrieval method is then applied to a selected ACCEPT (Analysis of the Composition of Clouds with Extended Polarization Techniques) campaign dataset collected in Cabauw, the Netherlands. An assessment of the retrieval products is performed using three independent methods from the literature; each was specifically developed to retrieve only the cloud properties, the drizzle properties below the cloud base, or the drizzle fraction within the cloud. 
One-to-one comparisons, taking into account the uncertainties or limitations of each retrieval, show that our results are consistent with what is derived using the three independent methods.
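
    The optimal-estimation-type framework referred to above balances the measurement misfit against a prior, each weighted by its error variance. A scalar Gauss-Newton step with toy numbers; the actual retrieval uses full forward models and profile state vectors:

```python
def oem_update(x, y, forward, jacobian, x_a, s_a, s_e):
    """One scalar Gauss-Newton step of an optimal-estimation retrieval:
    x_new = x_a + gain * (y - F(x) + K (x - x_a)), where the gain weighs the
    measurement error variance s_e against the prior variance s_a."""
    k = jacobian(x)
    gain = (k / s_e) / (k * k / s_e + 1.0 / s_a)
    return x_a + gain * (y - forward(x) + k * (x - x_a))

# Toy linear forward model y = 2x: one step lands between prior and data
f = lambda x: 2.0 * x
kf = lambda x: 2.0
x_hat = oem_update(1.0, 4.4, f, kf, x_a=1.0, s_a=1.0, s_e=0.04)
print(round(x_hat, 3))   # close to the measurement-implied 2.2, pulled toward the prior 1.0
```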

  15. Estimating nocturnal opaque ice cloud optical depth from MODIS multispectral infrared radiances using a neural network method

    NASA Astrophysics Data System (ADS)

    Minnis, Patrick; Hong, Gang; Sun-Mack, Szedung; Smith, William L.; Chen, Yan; Miller, Steven D.

    2016-05-01

    Retrieval of ice cloud properties using IR measurements has a distinct advantage over the visible and near-IR techniques by providing consistent monitoring regardless of solar illumination conditions. Historically, the IR bands at 3.7, 6.7, 11.0, and 12.0 µm have been used to infer ice cloud parameters by various methods, but the reliable retrieval of ice cloud optical depth τ is limited to nonopaque cirrus with τ < 8. The Ice Cloud Optical Depth from Infrared using a Neural network (ICODIN) method is developed in this paper by training Moderate Resolution Imaging Spectroradiometer (MODIS) radiances at 3.7, 6.7, 11.0, and 12.0 µm against CloudSat-estimated τ during the nighttime using 2 months of matched global data from 2007. An independent data set comprising observations from the same 2 months of 2008 was used to validate the ICODIN. One 4-channel and three 3-channel versions of the ICODIN were tested. The training and validation results show that IR channels can be used to estimate ice cloud τ up to 150 with correlations above 78% and 69% for all clouds and only opaque ice clouds, respectively. However, τ for the deepest clouds is still underestimated in many instances. The corresponding RMS differences relative to CloudSat are ~100% and ~72%. If the opaque clouds are properly identified with the IR methods, the RMS differences in the retrieved optical depths are ~62%. The 3.7 µm channel appears to be most sensitive to optical depth changes but is constrained by poor precision at low temperatures. A method for estimating total optical depth is explored for future estimation of cloud water path. Factors affecting the uncertainties and potential improvements are discussed. With improved techniques for discriminating between opaque and semitransparent ice clouds, the method can ultimately improve cloud property monitoring over the entire diurnal cycle.
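
    The idea of regressing IR radiances against a CloudSat-derived optical depth with a small neural network can be sketched as below. The data, network size, and training settings are stand-ins, not the ICODIN configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for matched training data: 4 IR brightness temperatures -> optical depth proxy
X = rng.uniform(200.0, 290.0, size=(256, 4))
y = 0.05 * (290.0 - X).sum(axis=1, keepdims=True) / 4.0   # smooth synthetic target

Xn = (X - X.mean(0)) / X.std(0)   # normalise inputs before training

# One hidden tanh layer trained by plain batch gradient descent
W1 = rng.normal(0.0, 0.1, (4, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.1, (8, 1)); b2 = np.zeros(1)

def forward(x):
    h = np.tanh(x @ W1 + b1)
    return h, h @ W2 + b2

loss0 = float(((forward(Xn)[1] - y) ** 2).mean())
lr = 0.05
for _ in range(500):
    h, p = forward(Xn)
    g = 2.0 * (p - y) / len(y)            # d(loss)/d(prediction)
    gh = (g @ W2.T) * (1.0 - h ** 2)      # backprop through tanh
    W2 -= lr * (h.T @ g); b2 -= lr * g.sum(0)
    W1 -= lr * (Xn.T @ gh); b1 -= lr * gh.sum(0)
loss1 = float(((forward(Xn)[1] - y) ** 2).mean())
print(loss1 < loss0)   # training reduced the mean-squared error
```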

  16. Implementation of a Message Passing Interface into a Cloud-Resolving Model for Massively Parallel Computing

    NASA Technical Reports Server (NTRS)

    Juang, Hann-Ming Henry; Tao, Wei-Kuo; Zeng, Xi-Ping; Shie, Chung-Lin; Simpson, Joanne; Lang, Steve

    2004-01-01

    The capability for massively parallel programming (MPP) using a message passing interface (MPI) has been implemented into a three-dimensional version of the Goddard Cumulus Ensemble (GCE) model. The design for the MPP with MPI uses the concept of maintaining similar code structure between the whole domain and the decomposed subdomains. Hence the model follows the same integration for single and multiple tasks (CPUs). Also, it provides for minimal changes to the original code, so it is easily modified and/or managed by the model developers and users who have little knowledge of MPP. The entire model domain can be sliced into a one- or two-dimensional decomposition with a halo regime overlaid on the partial domains. The halo regime requires that no data be fetched across tasks during the computational stage, but it must be updated before the next computational stage through data exchange via MPI. For reproducibility, transposing data among tasks is required for the spectral transform (Fast Fourier Transform, FFT), which is used in the anelastic version of the model for solving the pressure equation. The performance of the MPI-implemented codes (i.e., the compressible and anelastic versions) was tested on three different computing platforms. The major results are: 1) both versions achieve parallel efficiency of about 99% for up to 256 tasks but not for 512 tasks; 2) the anelastic version has better speedup and efficiency because it requires more computation than the compressible version; 3) equal or approximately equal numbers of slices in the x- and y-directions provide the fastest integration due to fewer data exchanges; and 4) one-dimensional slices in the x-direction result in the slowest integration due to the need for more memory relocation during computation.
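
    The halo rule described above, no cross-task fetches during computation and an explicit exchange between stages, can be mimicked without MPI for a 1-D decomposition. This is a conceptual sketch; the real model performs the exchange with MPI sends and receives:

```python
def exchange_halos(parts, halo=1):
    """Fill each task's halo cells from its neighbours' interior edge cells
    (1-D decomposition, periodic boundaries). Halo reads touch only interior
    cells, so the update order between tasks does not matter."""
    n = len(parts)
    for i, sub in enumerate(parts):
        left, right = parts[(i - 1) % n], parts[(i + 1) % n]
        sub[:halo] = left[-2 * halo:-halo]   # receive from left neighbour
        sub[-halo:] = right[halo:2 * halo]   # receive from right neighbour
    return parts

# Global 1-D field of 12 cells split across 3 tasks; -1 marks unfilled halo cells
cells = list(range(12))
parts = [[-1] + cells[i:i + 4] + [-1] for i in range((0), 12, 4)]
exchange_halos(parts)
print(parts[1])   # → [3, 4, 5, 6, 7, 8]: interior 4..7 with halos 3 and 8
```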

  17. The thin border between cloud and aerosol: Sensitivity of several ground based observation techniques

    NASA Astrophysics Data System (ADS)

    Calbó, Josep; Long, Charles N.; González, Josep-Abel; Augustine, John; McComiskey, Allison

    2017-11-01

    Cloud and aerosol are two manifestations of what is essentially the same physical phenomenon: a suspension of particles in the air. The differences between the two come from the different composition (e.g., a much higher amount of condensed water in the particles constituting a cloud) and/or particle size, and also from the different number of such particles (10-10,000 particles per cubic centimeter depending on conditions). However, there exist situations in which the distinction is far from obvious, and even when broken or scattered clouds are present in the sky, the borders between cloud and not-cloud are not always well defined, a transition area that has been coined the "twilight zone". The current paper presents a discussion of the definition of cloud and aerosol, the need for distinguishing them or for considering the continuum between the two, and suggests a quantification of the importance and frequency of such ambiguous situations, founded on several ground-based observing techniques. Specifically, sensitivity analyses are applied to sky camera images and broadband and spectral radiometric measurements taken at Girona (Spain) and Boulder (CO, USA). Results indicate that, at these sites, in more than 5% of the daytime hours the sky may be considered cloudless (but containing aerosols) or cloudy (with some kind of optically thin clouds) depending on the observing system and the thresholds applied. Similarly, at least 10% of the time the extension of scattered or broken clouds into clear areas is problematic to establish, and depends on where the limit is put between cloud and aerosol. These findings are relevant both to technical approaches for cloud screening and sky cover categorization algorithms and to radiative transfer studies, given the different effect of clouds and aerosols (and their different treatment in models) on the Earth's radiation balance.

  18. Rain estimation from satellites: An examination of the Griffith-Woodley technique

    NASA Technical Reports Server (NTRS)

    Negri, A. J.; Adler, R. F.; Wetzel, P. J.

    1983-01-01

    The Griffith-Woodley Technique (GWT) is an approach to estimating precipitation using infrared observations of clouds from geosynchronous satellites. It is examined in three ways: an analysis of the terms in the GWT equations; a case study of infrared imagery portraying convective development over Florida; and the comparison of a simplified equation set and resultant rain map to results using the GWT. The objective is to determine the dominant factors in the calculation of GWT rain estimates. Analysis of a single day's convection over Florida produced a number of significant insights into various terms in the GWT rainfall equations. Due to the definition of clouds by a threshold isotherm, the majority of clouds on this day did not go through an idealized life cycle before losing their identity through merger, splitting, etc. As a result, 85% of the clouds had a defined life of 0.5 or 1 h. For these clouds the terms in the GWT which are dependent on cloud life history become essentially constant. The empirically derived ratio of radar echo area to cloud area is given a singular value (0.02) for 43% of the sample, while the rain-rate term is 20.7 mm h-1 for 61% of the sample. For 55% of the sampled clouds the temperature weighting term is identically 1.0. Cloud area itself is highly correlated (r=0.88) with GWT-computed rain volume. An important, discriminating parameter in the GWT is the temperature defining the coldest 10% of cloud area. The analysis further shows that the two dominant parameters in rainfall estimation are the existence of cold cloud and the duration of cloud over a point.
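
    With the near-constant term values reported above for short-lived clouds (echo-area ratio 0.02, rain rate 20.7 mm h-1, temperature weighting 1.0), the rain-volume computation collapses to a sum over cold-cloud area snapshots. A simplified sketch with hypothetical areas, not the full GWT equation set:

```python
def gwt_rain_volume(cloud_areas_km2, interval_h=0.5,
                    echo_area_ratio=0.02, rain_rate_mm_h=20.7):
    """Simplified Griffith-Woodley-style estimate: rain volume =
    rain rate * (echo-area ratio * cloud area) * time step, summed over
    the cloud's IR-defined life. Returns mm*km^2 (1 mm over 1 km^2 = 1e3 m^3)."""
    return sum(rain_rate_mm_h * echo_area_ratio * a * interval_h
               for a in cloud_areas_km2)

# Hypothetical cold-cloud areas from three half-hourly IR images
print(round(gwt_rain_volume([500.0, 800.0, 400.0]), 1))
```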

  19. New Satellite Estimates of Mixed-Phase Cloud Properties: A Synergistic Approach for Application to Global Satellite Imager Data

    NASA Astrophysics Data System (ADS)

    Smith, W. L., Jr.; Spangenberg, D.; Fleeger, C.; Sun-Mack, S.; Chen, Y.; Minnis, P.

    2016-12-01

    Determining accurate cloud properties horizontally and vertically over a full range of time and space scales is currently next to impossible using data from either active or passive remote sensors or from modeling systems. Passive satellite imagers provide horizontal and temporal resolution of clouds, but little direct information on vertical structure. Active sensors provide vertical resolution but limited spatial and temporal coverage. Cloud models embedded in NWP can produce realistic clouds but often not at the right time or location. Thus, empirical techniques that integrate information from multiple observing and modeling systems are needed to more accurately characterize clouds and their impacts. Such a strategy is employed here in a new cloud water content profiling technique developed for application to satellite imager cloud retrievals based on VIS, IR and NIR radiances. Parameterizations are developed to relate imager retrievals of cloud top phase, optical depth, effective radius and temperature to ice and liquid water content profiles. The vertical structure information contained in the parameterizations is characterized climatologically from cloud model analyses, aircraft observations, ground-based remote sensing data, and from CloudSat and CALIPSO. Thus, realistic cloud-type dependent vertical structure information (including guidance on cloud phase partitioning) circumvents poor assumptions regarding vertical homogeneity that plague current passive satellite retrievals. This paper addresses mixed phase cloud conditions for clouds with glaciated tops including those associated with convection and mid-latitude storm systems. 
Novel outcomes of our approach include (1) simultaneous retrievals of ice and liquid water content and path, which are validated with active sensor, microwave and in-situ data, and yield improved global cloud climatologies, and (2) new estimates of super-cooled LWC, which are demonstrated in aviation safety applications and validated with icing PIREPS. The initial validation is encouraging for single-layer cloud conditions. More work is needed to test and refine the method for global application in a wider range of cloud conditions. A brief overview of our current method, applications, verification, and plans for future work will be presented.

  20. Temporal Analysis and Automatic Calibration of the Velodyne HDL-32E LiDAR System

    NASA Astrophysics Data System (ADS)

    Chan, T. O.; Lichti, D. D.; Belton, D.

    2013-10-01

    At the end of the first quarter of 2012, more than 600 Velodyne LiDAR systems had been sold worldwide for various robotic and high-accuracy survey applications. The ultra-compact Velodyne HDL-32E LiDAR has become a predominant sensor for many applications that require lower sensor size/weight and cost. For high-accuracy applications, cost-effective calibration methods with minimal manual intervention are always desired by users. However, the calibrations are complicated by the Velodyne LiDAR's narrow vertical field of view and the highly time-variant nature of its measurements. In this paper, the temporal stability of the HDL-32E is first analysed as the motivation for developing a new, automated calibration method. This is followed by a detailed description of the calibration method, which is driven by a novel segmentation method for extracting vertical cylindrical features from the Velodyne point clouds. The proposed segmentation method utilizes the Velodyne point cloud's slice-like nature and first decomposes the point clouds into 2D layers. The layers are then treated as 2D images and processed with the Generalized Hough Transform, which extracts the points distributed in circular patterns from the point cloud layers. Subsequently, the vertical cylindrical features can be readily extracted from the whole point clouds based on the previously extracted points. The points are passed to the calibration, which estimates the cylinder parameters and the LiDAR's additional parameters simultaneously by constraining the segmented points to fit the cylindrical geometric model in such a way that the weighted sum of the adjustment residuals is minimized. The proposed calibration is highly automated, which allows end users to obtain the time-variant additional parameters instantly and frequently whenever vertical cylindrical features are present in the scene. 
    The methods were verified with two different real datasets, and the results suggest that an accuracy improvement of up to 78.43% can be achieved for the HDL-32E using the proposed calibration method.
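
    The per-layer circle extraction can be illustrated with a classic fixed-radius circular Hough transform: every point votes for candidate centres one radius away, and the accumulator peak marks the cylinder axis in that slice. The grid size, radius, and synthetic layer below are invented for illustration:

```python
import math
from collections import Counter

def hough_circle_center(points, radius, grid=0.1, n_theta=72):
    """Vote for circle centres of a known radius in one 2-D point-cloud layer
    and return the accumulator cell with the most votes."""
    acc = Counter()
    for x, y in points:
        for i in range(n_theta):
            t = 2.0 * math.pi * i / n_theta
            cx = round((x - radius * math.cos(t)) / grid)
            cy = round((y - radius * math.sin(t)) / grid)
            acc[(cx, cy)] += 1                 # one vote per candidate centre
    (cx, cy), _ = acc.most_common(1)[0]
    return cx * grid, cy * grid

# Synthetic layer: 36 points on a vertical cylinder of radius 0.5 centred at (2, 3)
layer = [(2.0 + 0.5 * math.cos(a), 3.0 + 0.5 * math.sin(a))
         for a in (2.0 * math.pi * k / 36 for k in range(36))]
cx, cy = hough_circle_center(layer, 0.5)
print(round(cx, 1), round(cy, 1))
```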

  1. Use of computed tomography renal angiography for screening feline renal transplant donors.

    PubMed

    Bouma, Jennifer L; Aronson, Lillian R; Keith, Dennis G; Saunders, H Mark

    2003-01-01

    Preoperative knowledge of the renal vascular anatomy is important for selection of the appropriate feline renal donor. Intravenous urograms (IVUs) have been performed routinely to screen potential donors at the Veterinary Hospital of the University of Pennsylvania (VHUP), but the vascular phase views lack sufficient detail of the renal vascular anatomy. Computed tomography angiography (CTA), which requires a helical computed tomography (CT) scanner, has been found to provide superior renal vascular anatomic information for prospective human renal donors. The specific aims of this study were as follows: 1) develop the CTA technique for the feline patient; and 2) obtain preliminary information on feline renal vessel anatomy in potential renal donors. Ten healthy, potential feline renal donors were anesthetized and imaged using a third-generation helical CT scanner. The time delay between i.v. contrast medium injection and image acquisition, and other parameters of slice collimation, slice interval, pitch, exposure settings, and reconstruction algorithms were varied to maximize contrast medium opacification of the renal vascular anatomy. Optimal CTA acquisition parameters were determined to be: 1) 10-sec delay post-i.v. bolus of iodinated contrast medium; 2) two serially acquired (corresponding to arterial and venous phases) helical scans through the renal vasculature; 3) pitch of 2 (4 mm/sec patient translation, 2 mm slice collimation); and 4) 120-kVp, 160-mA, and 1-sec exposure settings. Retrospectively reconstructed CTA transverse images obtained at a 2-mm slice width and a 1-mm slice interval, in combination with two-dimensional reformatted images and three-dimensional reconstructed images, were qualitatively evaluated for vascular anatomy; vascular anatomy was confirmed at surgery. Four cats had single renal arteries and veins bilaterally; four cats had double renal veins. One cat had a small accessory artery supplying the caudal pole of the left kidney. 
    One cat had a left renal artery originating from the aorta at a 90-degree angle with the cranial mesenteric artery. CTA of the feline renal vascular anatomy is feasible, and reconstruction techniques provide excellent anatomic vascular detail. CTA is now used routinely at VHUP to screen all potential feline renal donors.

  2. Absorbing Aerosols Above Cloud: Detection, Quantitative Retrieval, and Radiative Forcing from Satellite-based Passive Sensors

    NASA Astrophysics Data System (ADS)

    Jethva, H.; Torres, O.; Remer, L. A.; Bhartia, P. K.

    2012-12-01

    Light-absorbing particles such as carbonaceous aerosols generated from biomass burning activities and windblown dust particles can exert a net warming effect on climate, the strength of which depends on the absorption capacity of the particles and the brightness of the underlying reflecting background. When advected over low-level bright clouds, these aerosols absorb the cloud-reflected radiation from the ultraviolet (UV) to the shortwave-IR (SWIR) and make the cloud scene darker, a phenomenon commonly known as "cloud darkening". The apparent darkening effect can be seen by eye in satellite images as well as quantitatively in the spectral reflectance measurements made by spaceborne sensors over regions where light-absorbing carbonaceous and dust aerosols overlay low-level cloud decks. Theoretical radiative transfer simulations support the observational evidence, and further reveal that the strength of the cloud darkening and its spectral signature (or color ratio) between measurements at two wavelengths are functions of both aerosol and cloud optical thickness (AOT and COT); these are measures of the total amount of light extinction caused by aerosols and cloud, respectively. Here, we developed a retrieval technique, named the "color ratio method", that uses the satellite measurements at two channels, one at a shorter wavelength in the visible and one at a longer wavelength in the shortwave-IR, for the simultaneous retrieval of AOT and COT. The present technique requires assumptions on the aerosol single-scattering albedo and the aerosol-cloud separation, which are supplemented by the Aerosol Robotic Network (AERONET) and spaceborne CALIOP lidar measurements. 
    The retrieval technique has been tested using the near-UV and visible reflectance observations made by the Ozone Monitoring Instrument (OMI) and the Moderate Resolution Imaging Spectroradiometer (MODIS) for distinct above-cloud smoke and dust aerosol events observed seasonally over the southeast and tropical Atlantic Ocean, respectively. This study constitutes the first attempt to retrieve above-cloud AOT from non-polarized, non-lidar reflectance observations, both of which have been shown to have above-cloud aerosol retrieval capability. The uncertainty analysis suggests that the present method should retrieve above-cloud AOT within -10% to 50%, which mainly arises from the uncertainty associated with the single-scattering albedo assumption. Although currently tested with OMI and MODIS measurements, the present color ratio method can be equally applied to other satellite measurements that carry similar or nearby channels in the VIS region of the spectrum, such as MISR and NPP/VIIRS. The capability of quantifying the above-cloud aerosol load will facilitate several aspects of cloud-aerosol interaction research, such as estimation of the direct radiative forcing of aerosols above clouds, the sign of which can be opposite (warming) to cloud-free aerosol forcing (cooling), aerosol transport, indirect effects of aerosols on clouds, and the hydrological cycle.
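
    The two-channel inversion logic can be sketched as a lookup-table match: simulate the reflectance pair over an (AOT, COT) grid and pick the closest node. The toy forward model below only mimics the qualitative darkening behaviour; it is not a radiative transfer code, and its coefficients are invented:

```python
import itertools
import math

def toy_reflectances(aot, cot):
    """Hypothetical forward model: the aerosol-transparent band sees only the
    cloud brightness; the absorbing band is darkened by above-cloud aerosol."""
    r_swir = cot / (cot + 8.0)               # cloud brightens with optical thickness
    r_vis = r_swir * math.exp(-0.6 * aot)    # absorbing aerosol darkens the short wavelength
    return r_vis, r_swir

def retrieve(r_vis_obs, r_swir_obs, aots, cots):
    """Color-ratio-style inversion: pick the (AOT, COT) grid node whose
    simulated two-channel reflectance pair best matches the observation."""
    return min(itertools.product(aots, cots),
               key=lambda p: (toy_reflectances(*p)[0] - r_vis_obs) ** 2 +
                             (toy_reflectances(*p)[1] - r_swir_obs) ** 2)

aots = [0.0, 0.25, 0.5, 0.75, 1.0]
cots = [2.0, 4.0, 8.0, 16.0]
obs = toy_reflectances(0.5, 8.0)    # simulate a scene with AOT=0.5, COT=8
print(retrieve(*obs, aots, cots))   # → (0.5, 8.0)
```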

  3. Impact of spatial resolution on cirrus infrared satellite retrievals in the presence of cloud heterogeneity

    NASA Astrophysics Data System (ADS)

    Fauchez, T.; Platnick, S. E.; Meyer, K.; Zhang, Z.; Cornet, C.; Szczap, F.; Dubuisson, P.

    2015-12-01

    Cirrus clouds are an important part of the Earth radiation budget, but an accurate assessment of their role remains highly uncertain. Cirrus optical properties such as Cloud Optical Thickness (COT) and ice crystal effective particle size are often retrieved with a combination of Visible/Near InfraRed (VNIR) and ShortWave-InfraRed (SWIR) reflectance channels. Alternatively, Thermal InfraRed (TIR) techniques, such as the Split Window Technique (SWT), have demonstrated better accuracy for retrieving the effective radius of thin cirrus with small ice crystals. However, current global operational algorithms for both retrieval methods assume that cloudy pixels are horizontally homogeneous (the Plane Parallel Approximation, PPA) and independent (the Independent Pixel Approximation, IPA). The impact of these approximations on ice cloud retrievals needs to be understood and, as far as possible, corrected. Horizontal heterogeneity effects in the TIR spectrum are dominated by the PPA bias, which primarily depends on the subpixel heterogeneity of COT; for solar reflectance channels, in addition to the PPA bias, the IPA can lead to significant retrieval errors due to substantial horizontal photon transport between cloudy columns, as well as brightening and shadowing effects that are more difficult to quantify. Because of its superior accuracy for thin cirrus, the TIR range is particularly relevant for characterizing these clouds as accurately as possible, and heterogeneity effects in the TIR are therefore evaluated as a function of spatial resolution in order to estimate the optimal spatial resolution for TIR retrieval applications. 
These investigations are performed using a cirrus 3D cloud generator (3DCloud), a 3D radiative transfer code (3DMCPOL), and two retrieval algorithms, namely the operational MODIS retrieval algorithm (MOD06) and a research-level SWT algorithm.
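The PPA bias mentioned above comes from the nonlinearity of TIR radiance in COT: averaging COT across a heterogeneous pixel before computing the radiance is not the same as averaging the subpixel radiances. A schematic illustration (the exponential emissivity and the radiance values are placeholders, not the MOD06 forward model):

```python
import numpy as np

# Schematic plane-parallel (PPA) bias: TIR radiance is a nonlinear (concave)
# function of COT, so the radiance of the pixel-mean COT differs from the
# mean of the subpixel radiances.
def radiance(cot, b_cloud=40.0, b_surf=100.0):
    emissivity = 1.0 - np.exp(-cot)        # schematic absorption-only emissivity
    return emissivity * b_cloud + (1.0 - emissivity) * b_surf

subpixel_cots = np.array([0.1, 0.3, 2.0, 5.0])   # heterogeneous cirrus pixel
true_radiance = radiance(subpixel_cots).mean()   # what the sensor actually measures
ppa_radiance = radiance(subpixel_cots.mean())    # homogeneous-pixel assumption

ppa_bias = ppa_radiance - true_radiance          # nonzero purely from heterogeneity
```

Because the curve is concave in COT, the homogeneous assumption here underestimates the radiance (the pixel looks colder and optically thicker than it is), and the bias grows with the subpixel COT spread.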

  4. Experimental 3-D residual stress measurement in rails with thermal annealing

    DOT National Transportation Integrated Search

    1999-07-01

    This report describes a novel method to determine residual stresses in railroad rails. The method uses thermal annealing to relieve the internal stresses in rail slices while advanced techniques (Moiré and Twyman-Green interferometry) are applied to ...

  5. Job Scheduling with Efficient Resource Monitoring in Cloud Datacenter

    PubMed Central

    Loganathan, Shyamala; Mukherjee, Saswati

    2015-01-01

    Cloud computing is an on-demand computing model that uses virtualization technology to provide cloud resources to users in the form of virtual machines over the internet. Being an adaptable technology, cloud computing is an excellent alternative for organizations forming their own private clouds. Since resources are limited in these private clouds, maximizing resource utilization and guaranteeing service to the user are the ultimate goals, and efficient scheduling is needed to achieve them. This research reports on an efficient data structure for resource management and a resource scheduling technique in a private cloud environment, and discusses a cloud model. The proposed scheduling algorithm considers the types of jobs and the resource availability in its scheduling decision. Finally, we conducted simulations using CloudSim and compared our algorithm with other existing methods, such as the V-MCT and priority scheduling algorithms. PMID:26473166
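As a rough illustration of availability-aware placement (not the paper's algorithm, whose internals the abstract does not give), a scheduler can track monitored free capacity per host and place each job on the feasible host that leaves the least slack, queueing jobs that fit nowhere:

```python
# Hypothetical data structures; this is a generic best-fit placement
# with a wait queue, sketched for illustration only.
def schedule(jobs, hosts):
    """Place each (name, cpu_demand) job on the feasible host with least slack."""
    placements, waiting = {}, []
    for name, demand in jobs:
        candidates = [h for h, free in hosts.items() if free >= demand]
        if not candidates:
            waiting.append(name)          # no host can take the job right now
            continue
        best = min(candidates, key=lambda h: hosts[h] - demand)   # best fit
        hosts[best] -= demand             # update monitored free capacity
        placements[name] = best
    return placements, waiting

hosts = {"h1": 4, "h2": 8}                               # free CPU cores per host
jobs = [("batch-a", 4), ("web-b", 2), ("batch-c", 8)]
placed, queued = schedule(jobs, hosts)
```

Here `batch-a` fills `h1` exactly, `web-b` goes to `h2`, and `batch-c` waits until capacity frees up, which is the kind of decision the resource-monitoring data structure must support efficiently.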

  6. Job Scheduling with Efficient Resource Monitoring in Cloud Datacenter.

    PubMed

    Loganathan, Shyamala; Mukherjee, Saswati

    2015-01-01

    Cloud computing is an on-demand computing model that uses virtualization technology to provide cloud resources to users in the form of virtual machines over the internet. Being an adaptable technology, cloud computing is an excellent alternative for organizations forming their own private clouds. Since resources are limited in these private clouds, maximizing resource utilization and guaranteeing service to the user are the ultimate goals, and efficient scheduling is needed to achieve them. This research reports on an efficient data structure for resource management and a resource scheduling technique in a private cloud environment, and discusses a cloud model. The proposed scheduling algorithm considers the types of jobs and the resource availability in its scheduling decision. Finally, we conducted simulations using CloudSim and compared our algorithm with other existing methods, such as the V-MCT and priority scheduling algorithms.

  7. Improved Thin Cirrus and Terminator Cloud Detection in CERES Cloud Mask

    NASA Technical Reports Server (NTRS)

    Trepte, Qing; Minnis, Patrick; Palikonda, Rabindra; Spangenberg, Doug; Haeffelin, Martial

    2006-01-01

    Thin cirrus clouds account for about 20-30% of the total cloud coverage and affect the global radiation budget by increasing the Earth's albedo and reducing infrared emissions. Thin cirrus, however, are often underestimated by traditional satellite cloud detection algorithms. This difficulty is caused by the lack of spectral contrast between optically thin cirrus and the surface in techniques that use visible (0.65 micron) and infrared (11 micron) channels. In the Clouds and the Earth's Radiant Energy System (CERES) Aqua Edition 1 (AEd1) and Terra Edition 3 (TEd3) Cloud Masks, thin cirrus detection is significantly improved over both land and ocean using a technique that combines MODIS high-resolution measurements from the 1.38 and 11 micron channels with brightness temperature differences (BTDs) of the 11-12, 8.5-11, and 3.7-11 micron channels. To account for humidity and view angle dependencies, empirical relationships were derived from observations of the 1.38 micron reflectance and the 11-12 and 8.5-11 micron BTDs using 70 granules of MODIS data in 2002 and 2003. Another challenge in global cloud detection algorithms occurs near the day/night terminator, where information from the visible 0.65 micron channel and the estimated solar component of the 3.7 micron channel becomes less reliable. As a result, clouds are often underestimated or misidentified near the terminator over land and ocean. Comparisons between CLAVR-x (Clouds from the Advanced Very High Resolution Radiometer [AVHRR]) cloud coverage and Geoscience Laser Altimeter System (GLAS) measurements north of 60 N indicate significant amounts of missing clouds in CLAVR-x because this part of the world was near the day/night terminator as viewed by AVHRR. Comparisons between MODIS cloud products (MOD06) and GLAS in the same region show similar difficulties with MODIS cloud retrievals. Consistent detection of clouds throughout the day is needed to provide reliable cloud and radiation products for CERES and other research efforts involving the modeling of clouds and their interaction with the radiation budget.

  8. Improved automatic optic nerve radius estimation from high resolution MRI

    NASA Astrophysics Data System (ADS)

    Harrigan, Robert L.; Smith, Alex K.; Mawn, Louise A.; Smith, Seth A.; Landman, Bennett A.

    2017-02-01

    The optic nerve (ON) is a vital structure in the human visual system and transports all visual information from the retina to the cortex for higher order processing. Due to the lack of redundancy in the visual pathway, measures of ON damage have been shown to correlate well with visual deficits. These measures are typically taken at an arbitrary anatomically defined point along the nerve and do not characterize changes along the length of the ON. We propose a fully automated, three-dimensionally consistent technique building upon a previous independent slice-wise technique to estimate the radius of the ON and surrounding cerebrospinal fluid (CSF) on high-resolution heavily T2-weighted isotropic MRI. We show that by constraining results to be three-dimensionally consistent this technique produces more anatomically viable results. We compare this technique with the previously published slice-wise technique using a short-term reproducibility data set, 10 subjects, follow-up <1 month, and show that the new method is more reproducible in the center of the ON. The center of the ON contains the most accurate imaging because it lacks confounders such as motion and frontal lobe interference. Long-term reproducibility, 5 subjects, follow-up of approximately 11 months, is also investigated with this new technique and shown to be similar to short-term reproducibility, indicating that the ON does not change substantially within 11 months. The increased accuracy of this new technique provides increased power when searching for anatomical changes in ON size amongst patient populations.
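The benefit of enforcing three-dimensional consistency over independent slice-wise estimates can be illustrated with a toy radius profile: fitting a smooth low-order curve along the nerve pulls a corrupted slice back toward an anatomically plausible value. The polynomial model and all numbers below are illustrative, not the authors' actual regularization:

```python
import numpy as np

def consistent_radii(z, r_slice, order=3):
    """Replace independent slice-wise radii with a smooth along-nerve fit."""
    coeffs = np.polyfit(z, r_slice, order)
    return np.polyval(coeffs, z)

z = np.arange(20, dtype=float)           # slice index along the nerve
true_r = 2.0 + 0.02 * z                  # slowly tapering radius, in mm (synthetic)
noisy = true_r + np.random.default_rng(2).normal(0.0, 0.05, 20)
noisy[7] += 1.0                          # one badly estimated slice
smooth = consistent_radii(z, noisy)      # outlier pulled back toward the trend
```

The fit spreads the single corrupted estimate over the whole profile, so the error at the bad slice drops by most of its magnitude, mirroring how a 3-D consistency constraint suppresses slice-wise failures.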

  9. Improved Automatic Optic Nerve Radius Estimation from High Resolution MRI.

    PubMed

    Harrigan, Robert L; Smith, Alex K; Mawn, Louise A; Smith, Seth A; Landman, Bennett A

    2017-02-11

    The optic nerve (ON) is a vital structure in the human visual system and transports all visual information from the retina to the cortex for higher order processing. Due to the lack of redundancy in the visual pathway, measures of ON damage have been shown to correlate well with visual deficits. These measures are typically taken at an arbitrary anatomically defined point along the nerve and do not characterize changes along the length of the ON. We propose a fully automated, three-dimensionally consistent technique building upon a previous independent slice-wise technique to estimate the radius of the ON and surrounding cerebrospinal fluid (CSF) on high-resolution heavily T2-weighted isotropic MRI. We show that by constraining results to be three-dimensionally consistent this technique produces more anatomically viable results. We compare this technique with the previously published slice-wise technique using a short-term reproducibility data set, 10 subjects, follow-up <1 month, and show that the new method is more reproducible in the center of the ON. The center of the ON contains the most accurate imaging because it lacks confounders such as motion and frontal lobe interference. Long-term reproducibility, 5 subjects, follow-up of approximately 11 months, is also investigated with this new technique and shown to be similar to short-term reproducibility, indicating that the ON does not change substantially within 11 months. The increased accuracy of this new technique provides increased power when searching for anatomical changes in ON size amongst patient populations.

  10. Convective and stratiform components of a Winter Monsoon Cloud Cluster determined from geosynchronous infrared satellite data

    NASA Technical Reports Server (NTRS)

    Goldenberg, Stanley B.; Houze, Robert A., Jr.; Churchill, Dean D.

    1990-01-01

    The horizontal precipitation structure of cloud clusters observed over the South China Sea during the Winter Monsoon Experiment (WMONEX) is analyzed using a convective-stratiform technique (CST) developed by Adler and Negri (1988). The technique was modified by altering the method for identifying convective cells in the satellite data, accounting for the extremely cold cloud tops characteristic of the WMONEX region, and modifying the threshold infrared temperature for the boundary of the stratiform rain area. The precipitation analysis was extended to the entire history of the cloud cluster by applying the modified CST to IR imagery from geosynchronous-satellite observations. The ship and aircraft data from the later period of the cluster's lifetime make it possible to check the locations of convective and stratiform precipitation identified by the CST using in situ observations. The extended CST is considered to be effective for determining the climatology of the convective-stratiform structure of tropical cloud clusters.
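The modified CST partition described above can be caricatured in a few lines: convective cores are local minima of the IR brightness-temperature field colder than a convective threshold, and the remaining cold pixels form the stratiform rain area. The thresholds below are placeholders, not the calibrated WMONEX values:

```python
import numpy as np

# Schematic convective-stratiform technique (CST) partition; thresholds
# here are illustrative, not Adler and Negri's calibrated values.
def cst_partition(tb, t_conv=220.0, t_strat=240.0):
    ny, nx = tb.shape
    convective = np.zeros_like(tb, dtype=bool)
    for i in range(1, ny - 1):
        for j in range(1, nx - 1):
            window = tb[i - 1:i + 2, j - 1:j + 2]
            # Convective core: cold local minimum of brightness temperature.
            if tb[i, j] < t_conv and tb[i, j] == window.min():
                convective[i, j] = True
    # Stratiform: cold enough, but not a convective core.
    stratiform = (tb < t_strat) & ~convective
    return convective, stratiform

tb = np.array([[250., 245., 250.],
               [238., 210., 236.],
               [250., 239., 250.]])      # toy IR brightness temperatures (K)
conv, strat = cst_partition(tb)
```

On this toy field, the 210 K pixel is flagged convective and its three cold neighbors become the surrounding stratiform area, the same anvil-around-core structure the CST exploits in real imagery.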

  11. Techniques for the measurements of the line of sight velocity of high altitude Barium clouds

    NASA Technical Reports Server (NTRS)

    Mende, S. B.

    1981-01-01

    It is demonstrated that maximizing the scientific output of future ion cloud release experiments requires a new type of instrument that measures the line-of-sight velocity of the ion cloud by the Doppler technique. A simple instrument was constructed using a 5 cm diameter solid Fabry-Perot etalon coupled to a low-light-level integrating television camera. It was demonstrated that the system has both the sensitivity and the spectral resolution for the detection of ion clouds and the measurement of their line-of-sight Doppler velocity. The tests consisted of (1) a field experiment using a rocket barium cloud release to check the sensitivity, and (2) laboratory experiments to show the spectral resolving capabilities of the system. The instrument was found to be operational if the source was brighter than about 1 kilorayleigh, and it had a wavelength resolution much better than 0.2 Å, which corresponds to about 12 km/s or an acceleration potential of 100 volts.
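The quoted figures are mutually consistent under the usual Doppler relation v = c·Δλ/λ, assuming the Ba II resonance line near 4554 Å (the abstract does not name the line, so the wavelength is an assumption):

```python
# Doppler relation: a wavelength shift dlambda at wavelength lam corresponds
# to a line-of-sight velocity v = c * dlambda / lam.
C_KM_S = 2.998e5        # speed of light, km/s
LAMBDA_A = 4554.0       # Ba II resonance line in angstroms (assumed here)
dlambda_A = 0.2         # quoted wavelength resolution, angstroms

v_km_s = C_KM_S * dlambda_A / LAMBDA_A   # about 13 km/s, matching the ~12 km/s quoted
```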

  12. A post-reconstruction method to correct cupping artifacts in cone beam breast computed tomography

    PubMed Central

    Altunbas, M. C.; Shaw, C. C.; Chen, L.; Lai, C.; Liu, X.; Han, T.; Wang, T.

    2007-01-01

    In cone beam breast computed tomography (CT), scattered radiation leads to nonuniform biasing of CT numbers known as a cupping artifact. Besides being visual distractions, cupping artifacts appear as background nonuniformities, which impair efficient gray scale windowing and pose a problem for threshold-based volume visualization/segmentation. To overcome this problem, we have developed a background nonuniformity correction method specifically designed for cone beam breast CT. With this technique, the cupping artifact is modeled as an additive background signal profile in the reconstructed breast images. Due to the largely circularly symmetric shape of a typical breast, the additive background signal profile was also assumed to be circularly symmetric. The radial variation of the background signal was estimated by measuring the spatial variation of adipose tissue signals in front-view breast images. To extract adipose tissue signals in an automated manner, a signal sampling scheme in polar coordinates and a background trend fitting algorithm were implemented. The background fits were compared with a target adipose tissue signal value (constant throughout the breast volume) to obtain an additive correction value for each tissue voxel. To test the accuracy, we applied the technique to cone beam CT images of mastectomy specimens. After correction, the images demonstrated significantly improved signal uniformity in both front- and side-view slices. The reduction of both intra-slice and inter-slice variations in adipose tissue CT numbers supports these observations. PMID:17822018
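The correction pipeline described above, sampling adipose-like signals versus radius, fitting a smooth radial trend, and subtracting the trend's departure from a target value, can be sketched as follows (the polynomial background model and the simple Cartesian geometry are illustrative simplifications of the authors' polar-coordinate scheme):

```python
import numpy as np

def correct_cupping(image, adipose_mask, target, poly_order=2):
    """Additive, circularly symmetric background (cupping) correction sketch."""
    ny, nx = image.shape
    yy, xx = np.mgrid[0:ny, 0:nx]
    r = np.hypot(yy - (ny - 1) / 2.0, xx - (nx - 1) / 2.0)
    # Fit a smooth radial trend through the adipose-like samples.
    coeffs = np.polyfit(r[adipose_mask], image[adipose_mask], poly_order)
    background = np.polyval(coeffs, r)
    # Subtract the trend's departure from the target adipose value per voxel.
    return image - (background - target)

# Synthetic demonstration: a flat 100-unit "breast" with an artificial
# radial cupping profile added, then removed by the correction.
ny = nx = 33
yy, xx = np.mgrid[0:ny, 0:nx]
r = np.hypot(yy - 16.0, xx - 16.0)
cupped = 100.0 + 0.8 * r                 # signal rises toward the periphery
flat = correct_cupping(cupped, np.ones((ny, nx), dtype=bool), target=100.0)
```

After correction the synthetic slice returns to a uniform value, which is the behavior the authors verify on mastectomy specimens via intra- and inter-slice adipose CT-number variation.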

  13. Chaos Control of Epileptiform Bursting in the Brain

    NASA Astrophysics Data System (ADS)

    Slutzky, M. W.; Cvitanovic, P.; Mogul, D. J.

    Epilepsy, defined as recurrent seizures, is a pathological state of the brain that afflicts over one percent of the world's population. Seizures occur as populations of neurons in the brain become overly synchronized. Although pharmacological agents are the primary treatment for preventing or reducing the incidence of these seizures, over 30% of epilepsy cases are not adequately helped by standard medical therapies. Several groups are exploring the use of electrical stimulation to terminate or prevent epileptic seizures. One experimental model used to test these algorithms is the brain slice where a select region of the brain is cut and kept viable in a well-oxygenated artificial cerebrospinal fluid. Under certain conditions, such slices may be made to spontaneously and repetitively burst, thereby providing an in vitro model of epilepsy. In this chapter, we discuss our efforts at applying chaos analysis and chaos control algorithms for manipulating this seizure-like behavior in a brain slice model. These techniques may provide a nonlinear control pathway for terminating or potentially preventing epileptic seizures in the whole brain.

  14. Multi-signal FIB/SEM tomography

    NASA Astrophysics Data System (ADS)

    Giannuzzi, Lucille A.

    2012-06-01

    Focused ion beam (FIB) milling coupled with scanning electron microscopy (SEM) on the same platform enables 3D microstructural analysis, using the FIB for serial sectioning and the SEM for imaging. Since FIB milling is a destructive technique, acquiring multiple signals from each slice is desirable. The feasibility of simultaneously collecting both an in-lens backscattered electron (BSE) signal and an in-lens secondary electron (SE) signal from a single scan of the electron beam over each FIB slice is demonstrated. The simultaneous acquisition of two different SE signals from two different detectors (in-lens vs. Everhart-Thornley (ET)) is also possible. Obtaining multiple signals from each FIB slice with one scan increases acquisition throughput. In addition, multi-signal acquisition optimizes the microstructural and morphological information obtained from the target. Examples of multi-signal FIB/SEM tomography of a dental implant are provided, in which both material contrast among the bone/ceramic coating/Ti substrate phases and porosity in the ceramic coating are characterized.

  15. IMPLEMENTING A NOVEL CYCLIC CO2 FLOOD IN PALEOZOIC REEFS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    James R. Wood; W. Quinlan; A. Wylie

    2003-07-01

    Recycled CO2 will be used in this demonstration project to produce bypassed oil from the Silurian Charlton 6 pinnacle reef (Otsego County) in the Michigan Basin. Contract negotiations by our industry partner to gain access to this CO2, which would otherwise be vented to the atmosphere, are near completion. A new method of subsurface characterization, log curve amplitude slicing, is being used to map facies distributions and reservoir properties in two reefs, the Belle River Mills and Chester 18 fields. The Belle River Mills and Chester 18 fields are being used as type fields because they have excellent log-curve and core data coverage. Amplitude slicing of the normalized gamma ray curves is showing trends that may indicate significant heterogeneity and compartmentalization in these reservoirs. Digital and hard copy data continue to be compiled for the Niagaran reefs in the Michigan Basin. Technology transfer took place through technical presentations on the log curve amplitude slicing technique and a booth at the Midwest PTTC meeting.

  16. AMBER: a PIC slice code for DARHT

    NASA Astrophysics Data System (ADS)

    Vay, Jean-Luc; Fawley, William

    1999-11-01

    The accelerator for the second axis of the Dual Axis Radiographic Hydrodynamic Test (DARHT) facility will produce a 4-kA, 20-MeV, 2-μs output electron beam with a design goal of less than 1000 π mm-mrad normalized transverse emittance and less than 0.5-mm beam centroid motion. In order to study the beam dynamics throughout the accelerator, we have developed a slice particle-in-cell code named AMBER, in which the beam is modeled as a time-steady flow subject to self-generated as well as external electrostatic and magnetostatic fields. The code follows the evolution of a slice of the beam as it propagates through the DARHT accelerator lattice, modeled as an assembly of pipes, solenoids, and gaps. In particular, we have paid careful attention to non-paraxial phenomena that can contribute to nonlinear forces and possible emittance growth. We present the model and the numerical techniques implemented, as well as some test cases and preliminary results obtained when studying emittance growth during beam propagation.

  17. Two-photon imaging in living brain slices.

    PubMed

    Mainen, Z F; Maletic-Savatic, M; Shi, S H; Hayashi, Y; Malinow, R; Svoboda, K

    1999-06-01

    Two-photon excitation laser scanning microscopy (TPLSM) has become the tool of choice for high-resolution fluorescence imaging in intact neural tissues. Compared with other optical techniques, TPLSM allows high-resolution imaging and efficient detection of fluorescence signal with minimal photobleaching and phototoxicity. The advantages of TPLSM are especially pronounced in highly scattering environments such as the brain slice. Here we describe our approaches to imaging various aspects of synaptic function in living brain slices. To combine several imaging modes together with patch-clamp electrophysiological recordings we found it advantageous to custom-build an upright microscope. Our design goals were primarily experimental convenience and efficient collection of fluorescence. We describe our TPLSM imaging system and its performance in detail. We present dynamic measurements of neuronal morphology of neurons expressing green fluorescent protein (GFP) and GFP fusion proteins as well as functional imaging of calcium dynamics in individual dendritic spines. Although our microscope is a custom instrument, its key advantages can be easily implemented as a modification of commercial laser scanning microscopes. Copyright 1999 Academic Press.

  18. Improving Scene Classifications with Combined Active/Passive Measurements

    NASA Astrophysics Data System (ADS)

    Hu, Y.; Rodier, S.; Vaughan, M.; McGill, M.

    The uncertainties in cloud and aerosol physical properties derived from passive instruments such as MODIS are not insignificant, and the uncertainty increases as the optical depth decreases. Lidar observations do much better for thin clouds and aerosols. Unfortunately, space-based lidar measurements, such as the one onboard the CALIPSO satellite, are limited to a nadir view only and thus have limited spatial coverage. To produce climatologically meaningful thin cloud and aerosol data products, it is necessary to combine the spatial coverage of MODIS with the highly sensitive CALIPSO lidar measurements. Can we improve the quality of cloud and aerosol remote sensing data products by extending the knowledge about thin clouds and aerosols learned from CALIPSO-type lidar measurements to a larger portion of the off-nadir, MODIS-like multi-spectral pixels? To answer this question, we studied collocated Cloud Physics Lidar (CPL) and MODIS Airborne Simulator (MAS) observations and established an effective data fusion technique that will be applied in the combined CALIPSO/MODIS cloud-aerosol product algorithms. This technique performs k-means and Kohonen self-organized map cluster analyses on the entire swath of MAS data as well as on the combined CPL/MAS data along the nadir track. Interestingly, the clusters generated by the two approaches are almost identical, indicating that the MAS multi-spectral data may already capture most of the cloud and aerosol scene types, such as cloud ice/water phase, multi-layer information, and aerosols.
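The clustering step can be illustrated with a plain k-means on synthetic two-band "pixels" (one reflectance-like and one brightness-temperature-like feature; these values are invented, whereas the study clustered MAS multi-spectral radiances and also used Kohonen self-organized maps):

```python
import numpy as np

rng = np.random.default_rng(0)
# Invented two-band "pixels": (reflectance, brightness temperature in K).
clear = rng.normal([0.05, 290.0], [0.01, 2.0], (200, 2))   # dark and warm
cloudy = rng.normal([0.60, 230.0], [0.05, 5.0], (200, 2))  # bright and cold
pixels = np.vstack([clear, cloudy])

def kmeans(x, k, n_iter=25):
    """Plain Lloyd's algorithm: assign to nearest center, recompute means."""
    centers = x[rng.choice(len(x), k, replace=False)]
    for _ in range(n_iter):
        labels = np.argmin(((x[:, None, :] - centers) ** 2).sum(-1), axis=1)
        centers = np.array([x[labels == c].mean(0) if np.any(labels == c)
                            else centers[c] for c in range(k)])
    return labels, centers

labels, centers = kmeans(pixels, k=2)
temps = sorted(centers[:, 1])            # cluster-mean brightness temperatures
```

With well-separated scene types, the recovered cluster centers land near the true "clear" and "cloudy" populations, which is the behavior that lets nadir-only lidar knowledge be propagated to the full passive swath via shared cluster membership.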

  19. The Atmospheric Chemistry and Climate Model Intercomparison Project (ACCMIP): Overview and Description of Models, Simulations and Climate Diagnostics

    NASA Technical Reports Server (NTRS)

    Lamarque, J.-F.; Shindell, D. T.; Naik, V.; Plummer, D.; Josse, B.; Righi, M.; Rumbold, S. T.; Schulz, M.; Skeie, R. B.; Strode, S.; hide

    2013-01-01

    The Atmospheric Chemistry and Climate Model Intercomparison Project (ACCMIP) consists of a series of time slice experiments targeting the long-term changes in atmospheric composition between 1850 and 2100, with the goal of documenting composition changes and the associated radiative forcing. In this overview paper, we introduce the ACCMIP activity, the various simulations performed (with a requested set of 14) and the associated model output. The 16 ACCMIP models have a wide range of horizontal and vertical resolutions, vertical extent, chemistry schemes and interaction with radiation and clouds. While anthropogenic and biomass burning emissions were specified for all time slices in the ACCMIP protocol, it is found that the natural emissions are responsible for a significant range across models, mostly in the case of ozone precursors. The analysis of selected present-day climate diagnostics (precipitation, temperature, specific humidity and zonal wind) reveals biases consistent with state-of-the-art climate models. The model-to-model comparison of changes in temperature, specific humidity and zonal wind between 1850 and 2000 and between 2000 and 2100 indicates mostly consistent results. However, models that are clear outliers are different enough from the other models to significantly affect their simulation of atmospheric chemistry.

  20. Realistic natural atmospheric phenomena and weather effects for interactive virtual environments

    NASA Astrophysics Data System (ADS)

    McLoughlin, Leigh

    Clouds and the weather are important aspects of any natural outdoor scene, but existing dynamic techniques within computer graphics offer only the simplest of cloud representations. The problem that this work addresses is how to provide a means of simulating clouds and weather features, such as precipitation, that are suitable for virtual environments. Techniques for cloud simulation are available within the area of meteorology, but numerical weather prediction systems are computationally expensive, give more numerical accuracy than we require for graphics, and are restricted to the laws of physics. Within computer graphics, we often need to direct and adjust physical features or to bend reality to meet artistic goals, which is a key difference between the subjects of computer graphics and physical science. Pure physically-based simulations, however, evolve their solutions according to pre-set rules and are notoriously difficult to control. The challenge then is for the solution to be computationally lightweight and able to be directed in some measure while at the same time producing believable results. This work presents a lightweight physically-based cloud simulation scheme that simulates the dynamic properties of cloud formation and weather effects. The system simulates water vapour, cloud water, cloud ice, rain, snow and hail. The water model incorporates control parameters, and the cloud model uses an arbitrary vertical temperature profile, with a tool described to allow the user to define this. The result of this work is that clouds can now be simulated in near real-time, complete with precipitation. The temperature profile and tool then provide a means of directing the resulting formation.

  1. The impact of cloud vertical profile on liquid water path retrieval based on the bispectral method: A theoretical study based on large-eddy simulations of shallow marine boundary layer clouds.

    PubMed

    Miller, Daniel J; Zhang, Zhibo; Ackerman, Andrew S; Platnick, Steven; Baum, Bryan A

    2016-04-27

    Passive optical retrievals of cloud liquid water path (LWP), like those implemented for Moderate Resolution Imaging Spectroradiometer (MODIS), rely on cloud vertical profile assumptions to relate optical thickness (τ) and effective radius (re) retrievals to LWP. These techniques typically assume that shallow clouds are vertically homogeneous; however, an adiabatic cloud model is plausibly more realistic for shallow marine boundary layer cloud regimes. In this study a satellite retrieval simulator is used to perform MODIS-like satellite retrievals, which in turn are compared directly to the large-eddy simulation (LES) output. This satellite simulator creates a framework for rigorous quantification of the impact that vertical profile features have on LWP retrievals, and it accomplishes this while also avoiding sources of bias present in previous observational studies. The cloud vertical profiles from the LES are often more complex than either of the two standard assumptions, and the favored assumption was found to be sensitive to cloud regime (cumuliform/stratiform). Confirming previous studies, drizzle and cloud top entrainment of dry air are identified as physical features that bias LWP retrievals away from adiabatic and toward homogeneous assumptions. The mean bias induced by drizzle-influenced profiles was shown to be on the order of 5-10 g/m2. In contrast, the influence of cloud top entrainment was found to be smaller by about a factor of 2. A theoretical framework is developed to explain variability in LWP retrievals by introducing modifications to the adiabatic re profile. In addition to analyzing bispectral retrievals, we also compare results with the vertical profile sensitivity of passive polarimetric retrieval techniques.
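The two profile assumptions differ only in the constant relating the retrieved τ and re to LWP: for a vertically homogeneous cloud LWP = (2/3)·ρw·τ·re, while the adiabatic model gives LWP = (5/9)·ρw·τ·re. A minimal numerical comparison:

```python
# Prefactors relating retrieved tau and re to LWP under the two standard
# vertical-profile assumptions (rho_w = density of liquid water):
#   homogeneous: LWP = (2/3) * rho_w * tau * re
#   adiabatic:   LWP = (5/9) * rho_w * tau * re
RHO_W = 1.0e6                     # g/m^3

def lwp_homogeneous(tau, re_m):   # re_m: effective radius in meters
    return (2.0 / 3.0) * RHO_W * tau * re_m

def lwp_adiabatic(tau, re_m):
    return (5.0 / 9.0) * RHO_W * tau * re_m

# For tau = 10 and re = 12 microns, the assumption alone moves the answer
# from 80 g/m^2 (homogeneous) to about 66.7 g/m^2 (adiabatic).
homog = lwp_homogeneous(10.0, 12.0e-6)
adiab = lwp_adiabatic(10.0, 12.0e-6)
```

The 2/3 versus 5/9 prefactor difference, about 17% of the homogeneous value, is comparable to the 5-10 g/m2 drizzle-induced bias the study reports, which is why the choice of vertical profile assumption matters for these retrievals.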

  2. The impact of cloud vertical profile on liquid water path retrieval based on the bispectral method: A theoretical study based on large-eddy simulations of shallow marine boundary layer clouds

    PubMed Central

    Miller, Daniel J.; Zhang, Zhibo; Ackerman, Andrew S.; Platnick, Steven; Baum, Bryan A.

    2018-01-01

    Passive optical retrievals of cloud liquid water path (LWP), like those implemented for Moderate Resolution Imaging Spectroradiometer (MODIS), rely on cloud vertical profile assumptions to relate optical thickness (τ) and effective radius (re) retrievals to LWP. These techniques typically assume that shallow clouds are vertically homogeneous; however, an adiabatic cloud model is plausibly more realistic for shallow marine boundary layer cloud regimes. In this study a satellite retrieval simulator is used to perform MODIS-like satellite retrievals, which in turn are compared directly to the large-eddy simulation (LES) output. This satellite simulator creates a framework for rigorous quantification of the impact that vertical profile features have on LWP retrievals, and it accomplishes this while also avoiding sources of bias present in previous observational studies. The cloud vertical profiles from the LES are often more complex than either of the two standard assumptions, and the favored assumption was found to be sensitive to cloud regime (cumuliform/stratiform). Confirming previous studies, drizzle and cloud top entrainment of dry air are identified as physical features that bias LWP retrievals away from adiabatic and toward homogeneous assumptions. The mean bias induced by drizzle-influenced profiles was shown to be on the order of 5–10 g/m2. In contrast, the influence of cloud top entrainment was found to be smaller by about a factor of 2. A theoretical framework is developed to explain variability in LWP retrievals by introducing modifications to the adiabatic re profile. In addition to analyzing bispectral retrievals, we also compare results with the vertical profile sensitivity of passive polarimetric retrieval techniques. PMID:29637042

  3. Phase-partitioning in mixed-phase clouds - An approach to characterize the entire vertical column

    NASA Astrophysics Data System (ADS)

    Kalesse, H.; Luke, E. P.; Seifert, P.

    2017-12-01

    The characterization of the entire vertical profile of phase-partitioning in mixed-phase clouds is a challenge that can be addressed by synergistic profiling measurements with ground-based polarization lidars and cloud radars. While lidars are sensitive to small particles and can thus detect supercooled liquid (SCL) layers, cloud radar returns are dominated by larger particles (like ice crystals). The maximum lidar observation height is determined by complete signal attenuation at a penetrated optical depth of about three. In contrast, cloud radars are able to penetrate multiple liquid layers and can thus be used to expand the identification of cloud phase to the entire vertical column beyond the lidar extinction height, if morphological features in the radar Doppler spectrum can be related to the existence of SCL. Relevant spectral signatures such as bimodalities and spectral skewness can be related to cloud phase by training a neural network appropriately in a supervised learning scheme, with lidar measurements serving as the supervisor. The neural network output (prediction of SCL location) derived using cloud radar Doppler spectra can be evaluated with several parameters, such as the liquid water path (LWP) detected by microwave radiometer (MWR) and the (liquid) cloud base detected by ceilometer or Raman lidar. The technique has been previously tested on data from Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) instruments in Barrow, Alaska, and is utilized in this study for observations from the Leipzig Aerosol and Cloud Remote Observations System (LACROS) during the Analysis of the Composition of Clouds with Extended Polarization Techniques (ACCEPT) field experiment in Cabauw, Netherlands in Fall 2014. Comparisons to supercooled-liquid layers as classified by CLOUDNET are provided.
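The supervised scheme can be miniaturized to a single logistic unit trained on synthetic spectrum features against a synthetic "lidar truth" label. All data below are invented for illustration; the actual network, features, and training sets are those of the ARM and LACROS observations:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 400
# Synthetic Doppler-spectrum features per radar range gate (invented):
skewness = rng.normal(0.0, 1.0, n)
bimodal = (rng.random(n) < 0.5).astype(float)
# Synthetic "lidar truth": supercooled liquid tends to accompany skewed
# and/or bimodal spectra, plus observation noise.
labels = ((skewness + 2.0 * bimodal + rng.normal(0.0, 0.5, n)) > 1.0).astype(float)

X = np.column_stack([skewness, bimodal, np.ones(n)])   # features + bias term
w = np.zeros(3)
for _ in range(2000):                                  # plain gradient descent
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w -= 0.1 * (X.T @ (p - labels)) / n

p = 1.0 / (1.0 + np.exp(-X @ w))
accuracy = ((p > 0.5) == (labels > 0.5)).mean()        # agreement with "lidar" labels
```

Once trained where lidar supervision exists, the same classifier is applied to Doppler spectra above the lidar extinction height, which is the core idea of extending phase identification to the full vertical column.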

  4. Retrieval of Aerosol Optical Depth Above Clouds from OMI Observations: Sensitivity Analysis, Case Studies

    NASA Technical Reports Server (NTRS)

    Torres, O.; Jethva, H.; Bhartia, P. K.

    2012-01-01

    A large fraction of the atmospheric aerosol load reaching the free troposphere is frequently located above low clouds. The most commonly observed aerosols above clouds are carbonaceous particles generally associated with biomass burning and boreal forest fires, and mineral aerosols originating in arid and semi-arid regions and transported across large distances, often above clouds. Because these aerosols absorb solar radiation, their role in the radiative transfer balance of the earth-atmosphere system is especially important. The generally negative (cooling) top-of-the-atmosphere direct effect of absorbing aerosols may turn into warming when the light-absorbing particles are located above clouds. The actual effect depends on the aerosol load and the single scattering albedo, and on the geometric cloud fraction. In spite of its potential significance, the role of aerosols above clouds is not adequately accounted for in the assessment of aerosol radiative forcing effects due to the lack of measurements. In this paper we discuss the basis of a simple technique that uses near-UV observations to simultaneously derive the optical depth of both the aerosol layer and the underlying cloud for overcast conditions. The two-parameter retrieval method described here makes use of the UV aerosol index and reflectance measurements at 388 nm. A detailed sensitivity analysis indicates that the measured radiances depend mainly on the aerosol absorption exponent and aerosol-cloud separation. The technique was applied to above-cloud aerosol events over the Southern Atlantic Ocean, yielding realistic results as indicated by indirect evaluation methods. An error analysis indicates that for typical overcast cloudy conditions and aerosol loads, the aerosol optical depth can be retrieved with an accuracy of approximately 54%, whereas the cloud optical depth can be derived within 17% of the true value.

  5. Detection of ground fog in mountainous areas from MODIS (Collection 051) daytime data using a statistical approach

    NASA Astrophysics Data System (ADS)

    Schulz, Hans Martin; Thies, Boris; Chang, Shih-Chieh; Bendix, Jörg

    2016-03-01

    The mountain cloud forest of Taiwan can be delimited from other forest types using a map of the ground fog frequency. In order to create such a frequency map from remotely sensed data, an algorithm able to detect ground fog is necessary. Common techniques for ground fog detection based on weather satellite data cannot be applied to fog occurrences in Taiwan as they rely on several assumptions regarding cloud properties. Therefore a new statistical method for the detection of ground fog in mountainous terrain from MODIS Collection 051 data is presented. Due to the sharpening of input data using MODIS bands 1 and 2, the method provides fog masks at a resolution of 250 m per pixel. The new technique is based on negative correlations between optical thickness and terrain height that can be observed if a cloud that is relatively plane-parallel is truncated by the terrain. A validation of the new technique using camera data has shown that the quality of fog detection is comparable to that of another modern fog detection scheme developed and validated for the temperate zones. The method is particularly applicable to optically thinner water clouds. Beyond a cloud optical thickness of ≈ 40, classification errors significantly increase.
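    The core statistical idea, a negative correlation between cloud optical thickness and terrain height where a roughly plane-parallel cloud is truncated by the terrain, can be sketched as a simple test. The correlation threshold below is an illustrative assumption, not the paper's calibrated value.

```python
def pearson(x, y):
    # Pearson correlation coefficient of two equal-length sequences.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / ((vx * vy) ** 0.5)

def looks_like_ground_fog(heights, optical_thickness, threshold=-0.5):
    # Plane-parallel cloud truncated by terrain: optically thinner where the
    # terrain is higher, i.e. a strong negative correlation within the cloud.
    return pearson(heights, optical_thickness) < threshold
```
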

  6. High-resolution Monte Carlo simulation of flow and conservative transport in heterogeneous porous media: 2. Transport results

    USGS Publications Warehouse

    Naff, R.L.; Haley, D.F.; Sudicky, E.A.

    1998-01-01

    In this, the second of two papers concerned with the use of numerical simulation to examine flow and transport parameters in heterogeneous porous media via Monte Carlo methods, results from the transport aspect of these simulations are reported on. Transport simulations contained herein assume a finite pulse input of conservative tracer, and the numerical technique endeavors to realistically simulate tracer spreading as the cloud moves through a heterogeneous medium. Medium heterogeneity is limited to the hydraulic conductivity field, and generation of this field assumes that the hydraulic-conductivity process is second-order stationary. Methods of estimating cloud moments, and the interpretation of these moments, are discussed. Techniques for estimation of large-time macrodispersivities from cloud second-moment data, and for the approximation of the standard errors associated with these macrodispersivities, are also presented. These moment and macrodispersivity estimation techniques were applied to tracer clouds resulting from transport scenarios generated by specific Monte Carlo simulations. Where feasible, moments and macrodispersivities resulting from the Monte Carlo simulations are compared with first- and second-order perturbation analyses. Some limited results concerning the possible ergodic nature of these simulations, and the presence of non-Gaussian behavior of the mean cloud, are reported on as well.
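    Cloud-moment estimation of the kind discussed above can be sketched in one dimension: concentration-weighted first and second spatial moments of the tracer cloud, and a longitudinal macrodispersivity from the growth of the second moment with centroid displacement. This is a textbook simplification, not the paper's estimator.

```python
def cloud_moments(xs, cs):
    # Concentration-weighted first (centroid) and second (variance) spatial
    # moments of a 1-D tracer cloud sampled at positions xs with values cs.
    m0 = sum(cs)
    mean = sum(x * c for x, c in zip(xs, cs)) / m0
    var = sum(c * (x - mean) ** 2 for x, c in zip(xs, cs)) / m0
    return mean, var

def macrodispersivity(var_t1, var_t2, x1, x2):
    # Longitudinal macrodispersivity A_L ~ 0.5 * d(sigma^2)/dx, approximated
    # by finite differences between two snapshots of the moving cloud.
    return 0.5 * (var_t2 - var_t1) / (x2 - x1)
```
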

  7. Integrated fMRI Preprocessing Framework Using Extended Kalman Filter for Estimation of Slice-Wise Motion

    PubMed Central

    Pinsard, Basile; Boutin, Arnaud; Doyon, Julien; Benali, Habib

    2018-01-01

    Functional MRI acquisition is sensitive to subjects' motion that cannot be fully constrained. Therefore, signal corrections have to be applied a posteriori in order to mitigate the complex interactions between changing tissue localization and magnetic fields, gradients and readouts. To circumvent the limitations of current preprocessing strategies, we developed an integrated method that corrects motion and spatial low-frequency intensity fluctuations at the level of each slice in order to better fit the acquisition processes. The registration of single or multiple simultaneously acquired slices is achieved online by an Iterated Extended Kalman Filter, favoring the robust estimation of continuous motion, while an intensity bias field is non-parametrically fitted. The proposed extraction of gray-matter BOLD activity from the acquisition space to an anatomical group template space, taking into account distortions, better preserves fine-scale patterns of activity. Importantly, the proposed unified framework generalizes to high-resolution multi-slice techniques. When tested on simulated and real data, the proposed framework shows a reduction of motion-explained variance and signal variability when compared to the conventional preprocessing approach. These improvements provide more stable patterns of activity, facilitating investigation of cerebral information representation in healthy and/or clinical populations where motion is known to impact fine-scale data. PMID:29755312

  8. Integrated fMRI Preprocessing Framework Using Extended Kalman Filter for Estimation of Slice-Wise Motion.

    PubMed

    Pinsard, Basile; Boutin, Arnaud; Doyon, Julien; Benali, Habib

    2018-01-01

    Functional MRI acquisition is sensitive to subjects' motion that cannot be fully constrained. Therefore, signal corrections have to be applied a posteriori in order to mitigate the complex interactions between changing tissue localization and magnetic fields, gradients and readouts. To circumvent the limitations of current preprocessing strategies, we developed an integrated method that corrects motion and spatial low-frequency intensity fluctuations at the level of each slice in order to better fit the acquisition processes. The registration of single or multiple simultaneously acquired slices is achieved online by an Iterated Extended Kalman Filter, favoring the robust estimation of continuous motion, while an intensity bias field is non-parametrically fitted. The proposed extraction of gray-matter BOLD activity from the acquisition space to an anatomical group template space, taking into account distortions, better preserves fine-scale patterns of activity. Importantly, the proposed unified framework generalizes to high-resolution multi-slice techniques. When tested on simulated and real data, the proposed framework shows a reduction of motion-explained variance and signal variability when compared to the conventional preprocessing approach. These improvements provide more stable patterns of activity, facilitating investigation of cerebral information representation in healthy and/or clinical populations where motion is known to impact fine-scale data.

  9. Image reconstruction of x-ray tomography by using image J platform

    NASA Astrophysics Data System (ADS)

    Zain, R. M.; Razali, A. M.; Salleh, K. A. M.; Yahya, R.

    2017-01-01

    A tomogram is a technical term for a CT image. It is also called a slice because it corresponds to what the object being scanned would look like if it were sliced open along a plane. A CT slice corresponds to a certain thickness of the object being scanned. So, while a typical digital image is composed of pixels, a CT slice image is composed of voxels (volume elements). In the case of x-ray tomography, as in x-ray radiography, the quantity being imaged is the distribution of the attenuation coefficient μ(x) within the object of interest. The difference lies only in the technique used to produce the tomogram. The image in x-ray radiography can be produced directly after exposure to x-rays, while the tomographic image is produced by combining radiographic images acquired at every projection angle. Researchers have produced a number of image reconstruction methods that convert x-ray attenuation data into a tomographic image. In this work, a Ramp filter in "filtered back projection" has been applied. The linear data acquired at each angular orientation are convolved with a specially designed filter and then back projected across a pixel field at the same angle. This paper describes the steps of using ImageJ software to produce an image reconstruction of x-ray tomography.
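    A minimal pure-Python sketch of the ramp-filtered back projection described above: each projection is convolved with the discrete Ram-Lak (ramp) kernel and then smeared back across the pixel field at its acquisition angle. This follows the standard textbook formulation (Kak and Slaney), not the ImageJ implementation used in the paper.

```python
import math

def ramlak_kernel(n):
    # Discrete Ram-Lak (ramp) filter kernel for offsets -n..n.
    h = []
    for k in range(-n, n + 1):
        if k == 0:
            h.append(0.25)
        elif k % 2:                               # odd offsets
            h.append(-1.0 / (math.pi * k) ** 2)
        else:                                     # even, nonzero offsets
            h.append(0.0)
    return h

def filter_projection(proj):
    # Convolve one projection with the ramp kernel (the "filtered" step).
    n = len(proj)
    h = ramlak_kernel(n)
    return [sum(proj[j] * h[i - j + n] for j in range(n)) for i in range(n)]

def backproject(projections, angles_rad, size):
    # Smear each filtered projection back across the pixel field at its angle,
    # using nearest-neighbor sampling along the detector axis.
    recon = [[0.0] * size for _ in range(size)]
    c = size // 2
    for proj, th in zip(projections, angles_rad):
        cs, sn = math.cos(th), math.sin(th)
        dc = len(proj) // 2
        for i in range(size):
            for j in range(size):
                t = int(round((j - c) * cs + (i - c) * sn)) + dc
                if 0 <= t < len(proj):
                    recon[i][j] += proj[t]
    return recon
```

    A point object at the rotation center, whose sinogram is a delta at the central detector for every angle, reconstructs to a peak at the central pixel.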

  10. Particle-in-cell/accelerator code for space-charge dominated beam simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2012-05-08

    Warp is a multidimensional discrete-particle beam simulation program designed to be applicable where the beam space-charge is non-negligible or dominant. It is being developed in a collaboration among LLNL, LBNL and the University of Maryland. It was originally designed and optimized for heavy ion fusion accelerator physics studies, but has received use in a broader range of applications, including for example laser wakefield accelerators, e-cloud studies in high energy accelerators, particle traps and other areas. At present it incorporates 3-D, axisymmetric (r,z), planar (x-z) and transverse slice (x,y) descriptions, with both electrostatic and electro-magnetic fields, and a beam envelope model. The code is built atop the Python interpreter language.

  11. Cloud Service Selection Using Multicriteria Decision Analysis

    PubMed Central

    Anuar, Nor Badrul; Shiraz, Muhammad; Haque, Israat Tanzeena

    2014-01-01

    Cloud computing (CC) has recently been receiving tremendous attention from the IT industry and academic researchers. CC leverages its unique services to cloud customers in a pay-as-you-go, anytime, anywhere manner. Cloud services provide dynamically scalable services through the Internet on demand. Therefore, service provisioning plays a key role in CC. The cloud customer must be able to select appropriate services according to his or her needs. Several approaches have been proposed to solve the service selection problem, including multicriteria decision analysis (MCDA). MCDA enables the user to choose from among a number of available choices. In this paper, we analyze the application of MCDA to service selection in CC. We identify and synthesize several MCDA techniques and provide a comprehensive analysis of this technology for general readers. In addition, we present a taxonomy derived from a survey of the current literature. Finally, we highlight several state-of-the-art practical aspects of MCDA implementation in cloud computing service selection. The contributions of this study are four-fold: (a) focusing on the state-of-the-art MCDA techniques, (b) highlighting the comparative analysis and suitability of several MCDA methods, (c) presenting a taxonomy through extensive literature review, and (d) analyzing and summarizing the cloud computing service selections in different scenarios. PMID:24696645
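    As a concrete example of one widely used MCDA technique of the kind surveyed here, the sketch below ranks hypothetical cloud services with TOPSIS (closeness to the ideal solution). The decision matrix, weights, and benefit/cost flags are invented for illustration, not taken from the paper.

```python
def topsis(matrix, weights, benefit):
    # matrix[i][j]: score of alternative i on criterion j.
    # benefit[j]: True if higher is better (e.g. availability),
    #             False for cost-type criteria (e.g. latency, price).
    ncol = len(matrix[0])
    norms = [sum(row[j] ** 2 for row in matrix) ** 0.5 for j in range(ncol)]
    V = [[row[j] / norms[j] * weights[j] for j in range(ncol)] for row in matrix]
    ideal = [max(col) if benefit[j] else min(col) for j, col in enumerate(zip(*V))]
    anti = [min(col) if benefit[j] else max(col) for j, col in enumerate(zip(*V))]

    def dist(v, ref):
        return sum((a - b) ** 2 for a, b in zip(v, ref)) ** 0.5

    # Relative closeness to the ideal solution: higher is better.
    return [dist(v, anti) / (dist(v, ideal) + dist(v, anti)) for v in V]
```

    For example, with criteria availability (benefit), latency (cost) and price (cost), a service that dominates on all three ranks first with closeness 1.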

  12. Cloud service selection using multicriteria decision analysis.

    PubMed

    Whaiduzzaman, Md; Gani, Abdullah; Anuar, Nor Badrul; Shiraz, Muhammad; Haque, Mohammad Nazmul; Haque, Israat Tanzeena

    2014-01-01

    Cloud computing (CC) has recently been receiving tremendous attention from the IT industry and academic researchers. CC leverages its unique services to cloud customers in a pay-as-you-go, anytime, anywhere manner. Cloud services provide dynamically scalable services through the Internet on demand. Therefore, service provisioning plays a key role in CC. The cloud customer must be able to select appropriate services according to his or her needs. Several approaches have been proposed to solve the service selection problem, including multicriteria decision analysis (MCDA). MCDA enables the user to choose from among a number of available choices. In this paper, we analyze the application of MCDA to service selection in CC. We identify and synthesize several MCDA techniques and provide a comprehensive analysis of this technology for general readers. In addition, we present a taxonomy derived from a survey of the current literature. Finally, we highlight several state-of-the-art practical aspects of MCDA implementation in cloud computing service selection. The contributions of this study are four-fold: (a) focusing on the state-of-the-art MCDA techniques, (b) highlighting the comparative analysis and suitability of several MCDA methods, (c) presenting a taxonomy through extensive literature review, and (d) analyzing and summarizing the cloud computing service selections in different scenarios.

  13. Sensor- and Scene-Guided Integration of TLS and Photogrammetric Point Clouds for Landslide Monitoring

    NASA Astrophysics Data System (ADS)

    Zieher, T.; Toschi, I.; Remondino, F.; Rutzinger, M.; Kofler, Ch.; Mejia-Aguilar, A.; Schlögel, R.

    2018-05-01

    Terrestrial and airborne 3D imaging sensors are well-suited data acquisition systems for the area-wide monitoring of landslide activity. State-of-the-art surveying techniques, such as terrestrial laser scanning (TLS) and photogrammetry based on unmanned aerial vehicle (UAV) imagery or terrestrial acquisitions have advantages and limitations associated with their individual measurement principles. In this study we present an integration approach for 3D point clouds derived from these techniques, aiming at improving the topographic representation of landslide features while enabling a more accurate assessment of landslide-induced changes. Four expert-based rules involving local morphometric features computed from eigenvectors, elevation and the agreement of the individual point clouds, are used to choose within voxels of selectable size which sensor's data to keep. Based on the integrated point clouds, digital surface models and shaded reliefs are computed. Using an image correlation technique, displacement vectors are finally derived from the multi-temporal shaded reliefs. All results show comparable patterns of landslide movement rates and directions. However, depending on the applied integration rule, differences in spatial coverage and correlation strength emerge.
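    A much-simplified sketch of voxel-wise sensor selection in the spirit of the integration approach above: within each voxel of selectable size, keep the point cloud that contributed more points. The real rules also weigh eigenvector-based morphometric features, elevation, and cross-cloud agreement; the count-based proxy here is an assumption for illustration only.

```python
from collections import defaultdict

def integrate_point_clouds(cloud_a, cloud_b, voxel=1.0):
    # cloud_a, cloud_b: iterables of (x, y, z) tuples from two sensors
    # (e.g. TLS and UAV photogrammetry). Within each voxel, keep the sensor
    # that contributed more points (a crude completeness criterion).
    def vox(p):
        return tuple(int(c // voxel) for c in p)

    buckets = defaultdict(lambda: ([], []))
    for p in cloud_a:
        buckets[vox(p)][0].append(p)
    for p in cloud_b:
        buckets[vox(p)][1].append(p)

    merged = []
    for a_pts, b_pts in buckets.values():
        merged.extend(a_pts if len(a_pts) >= len(b_pts) else b_pts)
    return merged
```
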

  14. Terahertz pulsed imaging for the monitoring of dental caries: a comparison with x-ray imaging

    NASA Astrophysics Data System (ADS)

    Karagoz, Burcu; Kamburoglu, Kıvanc; Altan, Hakan

    2017-07-01

    Dental caries in sliced samples are investigated using terahertz pulsed imaging. The frequency-domain terahertz response of these structures, consistent with X-ray imaging results, shows the potential of this technique for the detection of early caries.

  15. Operational implications of a cloud model simulation of space shuttle exhaust clouds in different atmospheric conditions

    NASA Technical Reports Server (NTRS)

    Zak, J. A.

    1989-01-01

    A three-dimensional cloud model was used to characterize the dominant influence of the environment on the Space Shuttle exhaust cloud. The model was modified to accept the actual heat and moisture from rocket exhausts and deluge water as initial conditions. An upper-air sounding determined the ambient atmosphere in which the cloud would grow. The model was validated by comparing simulated clouds with observed clouds from four actual Shuttle launches. Results are discussed with operational weather forecasters in mind. The model successfully produced clouds with dimensions, rise, decay, liquid water contents, and vertical motion fields very similar to observed clouds whose dimensions were calculated from 16 mm film frames. Once validated, the model was used in a number of different atmospheric conditions ranging from very unstable to very stable. Wind shear strongly affected the appearance of both the ground cloud and vertical column cloud. The ambient low-level atmospheric moisture governed the amount of cloud water in model clouds. Some dry atmospheres produced little or no cloud water. An empirical forecast technique for Shuttle cloud rise is presented and differences between natural atmospheric convection and exhaust clouds are discussed.

  16. CesrTA Retarding Field Analyzer Modeling Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Calvey, J.R.; Celata, C.M.; Crittenden, J.A.

    2010-05-23

    Retarding field analyzers (RFAs) provide an effective measure of the local electron cloud density and energy distribution. Proper interpretation of RFA data can yield information about the behavior of the cloud, as well as the surface properties of the instrumented vacuum chamber. However, due to the complex interaction of the cloud with the RFA itself, understanding these measurements can be nontrivial. This paper examines different methods for interpreting RFA data via cloud simulation programs. Techniques include postprocessing the output of a simulation code to predict the RFA response; and incorporating an RFA model into the cloud modeling program itself.

  17. Evaluation of Methods to Estimate the Surface Downwelling Longwave Flux during Arctic Winter

    NASA Technical Reports Server (NTRS)

    Chiacchio, Marc; Francis, Jennifer; Stackhouse, Paul, Jr.

    2002-01-01

    Surface longwave radiation fluxes dominate the energy budget of nighttime polar regions, yet little is known about the relative accuracy of existing satellite-based techniques to estimate this parameter. We compare eight methods to estimate the downwelling longwave radiation flux and to validate their performance with measurements from two field programs in the Arctic: the Coordinated Eastern Arctic Experiment (CEAREX) conducted in the Barents Sea during the autumn and winter of 1988, and the Lead Experiment performed in the Beaufort Sea in the spring of 1992. Five of the eight methods were developed for satellite-derived quantities, and three are simple parameterizations based on surface observations. All of the algorithms require information about cloud fraction, which is provided from the NASA-NOAA Television and Infrared Observation Satellite (TIROS) Operational Vertical Sounder (TOVS) polar pathfinder dataset (Path-P); some techniques ingest temperature and moisture profiles (also from Path-P); one-half of the methods assume that clouds are opaque and have a constant geometric thickness of 50 hPa, and three include no thickness information whatsoever. With a somewhat limited validation dataset, the following primary conclusions result: (1) all methods exhibit approximately the same correlations with measurements and rms differences, but the biases range from -34 W m^-2 (16% of the mean) to nearly 0; (2) the error analysis described here indicates that the assumption of a 50-hPa cloud thickness is too thin by a factor of 2 on average in polar nighttime conditions; (3) cloud-overlap techniques, which effectively increase mean cloud thickness, significantly improve the results; (4) simple Arctic-specific parameterizations performed poorly, probably because they were developed with surface-observed cloud fractions; and (5) the single algorithm that includes an estimate of cloud thickness exhibits the smallest differences from observations.
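    For readers unfamiliar with the class of bulk parameterizations being compared, here is a generic sketch of a surface-observation-based downwelling longwave estimate: a Brunt-type clear-sky emissivity driven by air temperature and vapor pressure, plus a simple cloud-fraction enhancement. The coefficients and the quadratic cloud correction are illustrative textbook-style values, not those of any of the eight evaluated methods.

```python
import math

SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def downwelling_lw(t_air_k, e_hpa, cloud_frac, a=0.22):
    # Illustrative bulk parameterization:
    #   clear-sky emissivity, Brunt-type:  eps = 0.52 + 0.065 * sqrt(e)
    #   cloud enhancement:                 F = F_clear * (1 + a * c^2)
    # The cloud-correction exponent and coefficient 'a' are assumptions.
    eps_clear = 0.52 + 0.065 * math.sqrt(e_hpa)
    f_clear = eps_clear * SIGMA * t_air_k ** 4
    return f_clear * (1.0 + a * cloud_frac ** 2)
```

    Such formulas make plain why cloud fraction (and implicitly cloud thickness) dominates the nighttime polar flux estimate: at -20 C the overcast value exceeds the clear-sky value by tens of W m^-2.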

  18. [Preparation of simulate craniocerebral models via three dimensional printing technique].

    PubMed

    Lan, Q; Chen, A L; Zhang, T; Zhu, Q; Xu, T

    2016-08-09

    A three dimensional (3D) printing technique was used to prepare simulated craniocerebral models, which were applied to preoperative planning and surgical simulation. The image data were collected from the PACS system. Image data of skull bone, brain tissue and tumors, cerebral arteries and aneurysms, and functional regions and relative neural tracts of the brain were extracted from thin-slice scans (slice thickness 0.5 mm) of computed tomography (CT), magnetic resonance imaging (MRI, slice thickness 1 mm), computed tomography angiography (CTA), and functional magnetic resonance imaging (fMRI) data, respectively. MIMICS software was applied to reconstruct colored virtual models by identifying and differentiating tissues according to their gray scales. The colored virtual models were then submitted to a 3D printer, which produced life-sized craniocerebral models for surgical planning and surgical simulation. 3D-printed craniocerebral models allowed neurosurgeons to perform complex procedures in specific clinical cases through detailed surgical planning. They offered great convenience for evaluating the size of the spatial fissure of the sellar region before surgery, which helped to optimize surgical approach planning. These 3D models also provided detailed information about the location of aneurysms and their parent arteries, which helped surgeons to choose appropriate aneurysm clips, as well as perform surgical simulation. The models further gave clear indications of the depth and extent of tumors and their relationship to eloquent cortical areas and adjacent neural tracts, which helped to avoid surgical damage to important neural structures. As a novel and promising technique, the application of 3D-printed craniocerebral models could improve surgical planning by converting virtual visualization into real life-sized models. It also contributes to the study of functional anatomy.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumjian, Matthew R.; Giangrande, Scott E.; Mishra, Subashree

    Polarimetric radar observations are increasingly used to understand cloud microphysical processes, which is critical for improving their representation in cloud and climate models. In particular, there has been recent focus on improving representations of ice collection processes (e.g., aggregation, riming), as these influence precipitation rate, heating profiles, and ultimately cloud life cycles. However, distinguishing these processes using conventional polarimetric radar observations is difficult, as they produce similar fingerprints. This necessitates improved analysis techniques and integration of complementary data sources. The Midlatitude Continental Convective Clouds Experiment (MC3E) provided such an opportunity.

  20. A new lightweight solar cell

    NASA Technical Reports Server (NTRS)

    Lindmayer, J.; Wrigley, C.

    1976-01-01

    Highly reproducible, very thin (40-80 microns thick) silicon solar cells are examined. These cells are the product of silicon thinning techniques that produce very flexible, resilient slices as compared to other techniques. Measurements on solar cells 2 cm by 2 cm by 50 microns thick producing 60 mW or more at AM0 are described. These cells have fine-line metallizations, tantalum oxide antireflection coatings and back-surface aluminum alloy.

  1. Identifying the arterial input function from dynamic contrast-enhanced magnetic resonance images using an apex-seeking technique

    NASA Astrophysics Data System (ADS)

    Martel, Anne L.

    2004-04-01

    In order to extract quantitative information from dynamic contrast-enhanced MR images (DCE-MRI) it is usually necessary to identify an arterial input function. This is not a trivial problem if there are no major vessels present in the field of view. Most existing techniques rely on operator intervention or use various curve parameters to identify suitable pixels, but these are often specific to the anatomical region or the acquisition method used. They also require the signal from several pixels to be averaged in order to improve the signal to noise ratio; however, this introduces errors due to partial volume effects. We have described previously how factor analysis can be used to automatically separate arterial and venous components from DCE-MRI studies of the brain, but although that method works well for single slice images through the brain when the blood brain barrier is intact, it runs into problems for multi-slice images with more complex dynamics. This paper will describe a factor analysis method that is more robust in such situations and is relatively insensitive to the number of physiological components present in the data set. The technique is very similar to that used to identify spectral end-members from multispectral remote sensing images.

  2. A framework for breast cancer visualization using augmented reality x-ray vision technique in mobile technology

    NASA Astrophysics Data System (ADS)

    Rahman, Hameedur; Arshad, Haslina; Mahmud, Rozi; Mahayuddin, Zainal Rasyid

    2017-10-01

    The number of breast cancer patients who require breast biopsy has increased over the past years. Augmented Reality guided core biopsy of the breast has become the method of choice for researchers. However, this cancer visualization has limitations to the extent of superimposing the 3D imaging data only. In this paper, we introduce an Augmented Reality visualization framework that enables breast cancer biopsy image guidance by using an X-ray vision technique on a mobile display. This framework consists of 4 phases: it initially acquires the image from CT/MRI and processes the medical images into 3D slices; secondly, it refines these 3D grayscale slices into a 3D breast tumor model using a 3D model reconstruction technique. Further, in visualization processing, this virtual 3D breast tumor model is enhanced using the X-ray vision technique to see through the skin of the phantom, and the final composition is displayed on a handheld device to optimize the accuracy of the visualization in six degrees of freedom. The framework is perceived as an improved visualization experience because the Augmented Reality X-ray vision allows direct understanding of the breast tumor beyond the visible surface and direct guidance towards accurate biopsy targets.

  3. Registration of in vivo MR to histology of rodent brains using blockface imaging

    NASA Astrophysics Data System (ADS)

    Uberti, Mariano; Liu, Yutong; Dou, Huanyu; Mosley, R. Lee; Gendelman, Howard E.; Boska, Michael

    2009-02-01

    Registration of MRI to histopathological sections can enhance bioimaging validation for use in pathobiologic, diagnostic, and therapeutic evaluations. However, commonly used registration methods fall short of this goal due to tissue shrinkage and tearing after brain extraction and preparation. In attempts to overcome these limitations we developed a software toolbox using 3D blockface imaging as the common space of reference. This toolbox includes a semi-automatic brain extraction technique using constraint level sets (CLS), 3D reconstruction methods for the blockface and MR volume, and a 2D warping technique using thin-plate splines with landmark optimization. Using this toolbox, the rodent brain volume is first extracted from the whole head MRI using CLS. The blockface volume is reconstructed, followed by 3D brain MRI registration to the blockface volume to correct the global deformations due to brain extraction and fixation. Finally, registered MRI and histological slices are warped to corresponding blockface images to correct slice-specific deformations. The CLS brain extraction technique was validated by comparison with manual results, showing 94% overlap. The image warping technique was validated by calculating the target registration error (TRE). Results showed a registration accuracy of TRE < 1 pixel. Lastly, the registration method and the software tools developed were used to validate cell migration in murine human immunodeficiency virus type one encephalitis.

  4. Computer-Aided Diagnostic (CAD) Scheme by Use of Contralateral Subtraction Technique

    NASA Astrophysics Data System (ADS)

    Nagashima, Hiroyuki; Harakawa, Tetsumi

    We developed a computer-aided diagnostic (CAD) scheme for detection of subtle image findings of acute cerebral infarction in brain computed tomography (CT) by using a contralateral subtraction technique. In our computerized scheme, the lateral inclination of the image was first corrected automatically by rotating and shifting. The contralateral subtraction image was then derived by subtracting the reversed image from the original image. Initial candidates for acute cerebral infarctions were identified using multiple-thresholding and image filtering techniques. As the 1st step for removing false positive candidates, fourteen image features were extracted from each of the initial candidates. Halfway candidates were detected by applying a rule-based test with these image features. At the 2nd step, five image features were extracted using the overlap of halfway candidates between the slice of interest and the upper/lower slice images. Finally, acute cerebral infarction candidates were detected by applying a rule-based test with these five image features. The sensitivity in detection for 74 training cases was 97.4% with 3.7 false positives per image. The performance of the CAD scheme for 44 testing cases was comparable to that for the training cases. Our CAD scheme using the contralateral subtraction technique can reveal suspected image findings of acute cerebral infarctions in CT images.
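    The contralateral subtraction step itself is simple to sketch: mirror each row left-right and subtract, so laterally symmetric anatomy cancels and asymmetric findings survive thresholding. The toy image and threshold below are illustrative; the full scheme adds inclination correction, multiple thresholds, and rule-based feature tests.

```python
def contralateral_subtraction(image):
    # Subtract the left-right mirrored image from the original; symmetric
    # structures cancel, asymmetric findings (infarct candidates) remain.
    return [[row[j] - row[len(row) - 1 - j] for j in range(len(row))]
            for row in image]

def candidates(diff, threshold):
    # Simple thresholding of the subtraction image to flag candidate pixels.
    return [(i, j) for i, row in enumerate(diff)
            for j, v in enumerate(row) if v > threshold]
```
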

  5. Software Support for Fully Distributed/Loosely Coupled Processing Systems. Volume 1.

    DTIC Science & Technology

    1984-01-01

    [Abstract OCR-garbled in the source; recoverable fragments only:] Appendix H: CLOUDS; Data Management; analytical instruction-monitoring techniques and mixes; Section 4, Operational Support Capabilities: the second approach, the "native" system, corresponds to the Clouds project.

  6. An enhanced technique for mobile cloudlet offloading with reduced computation using compression in the cloud

    NASA Astrophysics Data System (ADS)

    Moro, A. C.; Nadesh, R. K.

    2017-11-01

    The cloud computing paradigm has transformed the way we do business in today's world. Services on the cloud have come a long way since just providing basic storage or software on demand. One of the fastest growing factors in this is mobile cloud computing. With the option of offloading now available, mobile users can offload entire applications onto cloudlets. Given the problems regarding availability and the limited storage capacity of these mobile cloudlets, it becomes difficult for the mobile user to decide when to use local memory and when to use the cloudlets. Hence, we take a look at a fast algorithm that decides whether the mobile user should go for the cloudlet or rely on local memory, based on an offloading probability. We have partially implemented the algorithm, which decides whether the task can be carried out locally or given to a cloudlet. As performing the complete computation is a burden on the mobile device, we offload it to the cloud in this paper. We also use a file compression technique before sending the file to the cloud to further reduce the load.
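    The offloading decision plus pre-upload compression described above can be sketched with the standard library. The probability threshold and function names are assumptions for illustration, not the paper's algorithm.

```python
import gzip

def choose_target(offload_prob, cloudlet_available, threshold=0.5):
    # Decide between local execution and a cloudlet from an offloading
    # probability; fall back to local memory when no cloudlet is reachable.
    # The 0.5 threshold is an illustrative assumption.
    if cloudlet_available and offload_prob >= threshold:
        return "cloudlet"
    return "local"

def compress_for_upload(payload: bytes) -> bytes:
    # Compress the task file before shipping it to the cloud, cutting the
    # transfer load for redundant data such as logs or sensor traces.
    return gzip.compress(payload)
```
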

  7. Super-resolution reconstruction in frequency, image, and wavelet domains to reduce through-plane partial voluming in MRI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gholipour, Ali, E-mail: ali.gholipour@childrens.harvard.edu; Afacan, Onur; Scherrer, Benoit

    Purpose: To compare and evaluate the use of super-resolution reconstruction (SRR), in frequency, image, and wavelet domains, to reduce through-plane partial voluming effects in magnetic resonance imaging. Methods: The reconstruction of an isotropic high-resolution image from multiple thick-slice scans has been investigated through techniques in frequency, image, and wavelet domains. Experiments were carried out with a thick-slice T2-weighted fast spin echo sequence on the American College of Radiology (ACR) MRI phantom, where the reconstructed images were compared to a reference high-resolution scan using peak signal-to-noise ratio (PSNR), structural similarity image metric (SSIM), mutual information (MI), and the mean absolute error (MAE) of image intensity profiles. The application of super-resolution reconstruction was then examined in retrospective processing of clinical neuroimages of ten pediatric patients with tuberous sclerosis complex (TSC) to reduce through-plane partial voluming for improved 3D delineation and visualization of thin radial bands of white matter abnormalities. Results: Quantitative evaluation results show improvements in all evaluation metrics through super-resolution reconstruction in the frequency, image, and wavelet domains, with the highest values obtained from SRR in the image domain. The metric values for image-domain SRR versus the original axial, coronal, and sagittal images were PSNR = 32.26 vs 32.22, 32.16, 30.65; SSIM = 0.931 vs 0.922, 0.924, 0.918; MI = 0.871 vs 0.842, 0.844, 0.831; and MAE = 5.38 vs 7.34, 7.06, 6.19. All similarity metrics showed high correlations with expert ranking of image resolution, with MI showing the highest correlation at 0.943. Qualitative assessment of the neuroimages of ten TSC patients through in-plane and out-of-plane visualization of structures showed the extent of the partial voluming effect in a real clinical scenario and its reduction using SRR.
Blinded expert evaluation of image resolution in resampled out-of-plane views consistently showed the superiority of SRR compared to original axial and coronal image acquisitions. Conclusions: Thick-slice 2D T2-weighted MRI scans are part of many routine clinical protocols due to their high signal-to-noise ratio, but are often severely affected by through-plane partial voluming effects. This study shows that while radiologic assessment is performed in 2D on thick-slice scans, super-resolution MRI reconstruction techniques can be used to fuse those scans to generate a high-resolution image with reduced partial voluming for improved postacquisition processing. Qualitative and quantitative evaluation showed the efficacy of all SRR techniques with the best results obtained from SRR in the image domain. The limitations of SRR techniques are uncertainties in modeling the slice profile, density compensation, quantization in resampling, and uncompensated motion between scans.
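
    Image-domain SRR can be caricatured as fusing three orthogonal thick-slice stacks on a common isotropic grid. A minimal sketch, assuming a boxcar slice profile and plain averaging in place of the iterative model-based reconstruction actually used:

```python
import numpy as np

def upsample_thick_stack(stack, factor, axis):
    # Nearest-neighbour upsampling of a thick-slice stack along its
    # slice axis back to the isotropic grid (a boxcar slice profile is
    # assumed; slice-profile modeling is one of the uncertainties the
    # paper notes as a limitation).
    return np.repeat(stack, factor, axis=axis)

def srr_image_domain(axial, coronal, sagittal, factor):
    # Fuse three orthogonal thick-slice acquisitions by averaging them
    # on the common isotropic grid. Real SRR iterates with a forward
    # model; plain averaging only illustrates why orthogonal stacks
    # complement each other through-plane.
    fused = (upsample_thick_stack(axial, factor, 0)
             + upsample_thick_stack(coronal, factor, 1)
             + upsample_thick_stack(sagittal, factor, 2)) / 3.0
    return fused
```

    Each stack is sharp in-plane and blurred through-plane along a different axis, so their fusion recovers resolution in all three directions.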

  8. Super-resolution reconstruction in frequency, image, and wavelet domains to reduce through-plane partial voluming in MRI

    PubMed Central

    Gholipour, Ali; Afacan, Onur; Aganj, Iman; Scherrer, Benoit; Prabhu, Sanjay P.; Sahin, Mustafa; Warfield, Simon K.

    2015-01-01

    Purpose: To compare and evaluate the use of super-resolution reconstruction (SRR), in frequency, image, and wavelet domains, to reduce through-plane partial voluming effects in magnetic resonance imaging. Methods: The reconstruction of an isotropic high-resolution image from multiple thick-slice scans has been investigated through techniques in frequency, image, and wavelet domains. Experiments were carried out with a thick-slice T2-weighted fast spin echo sequence on the American College of Radiology (ACR) MRI phantom, where the reconstructed images were compared to a reference high-resolution scan using peak signal-to-noise ratio (PSNR), structural similarity image metric (SSIM), mutual information (MI), and the mean absolute error (MAE) of image intensity profiles. The application of super-resolution reconstruction was then examined in retrospective processing of clinical neuroimages of ten pediatric patients with tuberous sclerosis complex (TSC) to reduce through-plane partial voluming for improved 3D delineation and visualization of thin radial bands of white matter abnormalities. Results: Quantitative evaluation results show improvements in all evaluation metrics through super-resolution reconstruction in the frequency, image, and wavelet domains, with the highest values obtained from SRR in the image domain. The metric values for image-domain SRR versus the original axial, coronal, and sagittal images were PSNR = 32.26 vs 32.22, 32.16, 30.65; SSIM = 0.931 vs 0.922, 0.924, 0.918; MI = 0.871 vs 0.842, 0.844, 0.831; and MAE = 5.38 vs 7.34, 7.06, 6.19. All similarity metrics showed high correlations with expert ranking of image resolution, with MI showing the highest correlation at 0.943. Qualitative assessment of the neuroimages of ten TSC patients through in-plane and out-of-plane visualization of structures showed the extent of the partial voluming effect in a real clinical scenario and its reduction using SRR.
Blinded expert evaluation of image resolution in resampled out-of-plane views consistently showed the superiority of SRR compared to original axial and coronal image acquisitions. Conclusions: Thick-slice 2D T2-weighted MRI scans are part of many routine clinical protocols due to their high signal-to-noise ratio, but are often severely affected by through-plane partial voluming effects. This study shows that while radiologic assessment is performed in 2D on thick-slice scans, super-resolution MRI reconstruction techniques can be used to fuse those scans to generate a high-resolution image with reduced partial voluming for improved postacquisition processing. Qualitative and quantitative evaluation showed the efficacy of all SRR techniques with the best results obtained from SRR in the image domain. The limitations of SRR techniques are uncertainties in modeling the slice profile, density compensation, quantization in resampling, and uncompensated motion between scans. PMID:26632048

  9. Scalar localization of the electrode array after cochlear implantation: clinical experience using 64-slice multidetector computed tomography.

    PubMed

    Lane, John I; Witte, Robert J; Driscoll, Colin L W; Shallop, Jon K; Beatty, Charles W; Primak, Andrew N

    2007-08-01

    To use the improved resolution available with 64-slice multidetector computed tomography (MDCT) in vivo to localize the cochlear implant electrode array within the basal turn. Sixty-four-slice MDCT examinations of the temporal bones were retrospectively reviewed in 17 patients. Twenty-three implants were evaluated. Tertiary referral facility. All patients with previous cochlear implantation evaluated at our center between January 2004 and March 2006 were offered a computed tomographic examination as part of the study. In addition, preoperative computed tomographic examinations in patients being evaluated for a second bilateral device were included. Sixty-four-slice MDCT examination of the temporal bones. Localization of the electrode array within the basal turn from multiplanar reconstructions of the cochlea. Twenty-three implants were imaged in 17 patients. We were able to localize the electrode array within the scala tympani within the basal turn in 10 implants. In 3 implants, the electrode array was localized to the scala vestibuli. Migration of the electrode array from scala tympani to scala vestibuli was observed in three implants. Of the 7 implants in which localization of the electrode array was indeterminate, all had disease entities that obscured the definition of the normal cochlear anatomy. Sixty-four-slice MDCT with multiplanar reconstructions of the postoperative cochlea after cochlear implantation allows for accurate localization of the electrode array within the basal turn where normal cochlear anatomy is not obscured by the underlying disease process. Correlating the position of the electrode in the basal turn with surgical technique and implant design could be helpful in improving outcomes.

  10. Investigating Functional Regeneration in Organotypic Spinal Cord Co-cultures Grown on Multi-electrode Arrays.

    PubMed

    Heidemann, Martina; Streit, Jürg; Tscherter, Anne

    2015-09-23

    Adult higher vertebrates have a limited potential to recover from spinal cord injury. Recently, evidence has emerged that propriospinal connections are a promising target for interventions to improve functional regeneration. So far, no in vitro model exists that allows functional recovery of propriospinal fibers to be examined. Therefore, a model based on two organotypic spinal cord sections from embryonic rat, cultured next to each other on multi-electrode arrays (MEAs), was developed. These slices grow and, within a few days in vitro, fuse along the sides facing each other. The design of the MEAs used permits lesions to be made with a scalpel blade through this fusion site without damaging the MEAs. The slices show spontaneous activity, usually organized in network activity bursts, and spatial and temporal activity parameters such as the location of burst origins, the speed and direction of their propagation, and the latencies between bursts can be characterized. Using these features, it is also possible to assess the functional connection of the slices by calculating the proportion of bursts synchronized between the two sides. Furthermore, the slices can be analyzed morphologically by performing immunohistochemical staining after the recordings. Several advantages of the techniques used are combined in this model: the slices largely preserve the original tissue architecture with intact local synaptic circuitry, the tissue is easily and repeatedly accessible, and neuronal activity can be detected simultaneously and non-invasively at a large number of sites at high temporal resolution. These features allow the functional regeneration of intraspinal connections to be investigated in isolation in vitro in a sophisticated and efficient way.
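
    The synchronized-burst measure of functional connection can be sketched as follows; the matching window and the exact criterion are illustrative assumptions, not taken from the paper:

```python
def synchronized_fraction(bursts_a, bursts_b, window=0.1):
    # Fraction of bursts recorded on slice A (burst onset times, in
    # seconds) that have a burst on slice B within `window` seconds;
    # a high fraction suggests the two slices are functionally
    # connected across the fusion site.
    if not bursts_a:
        return 0.0
    matched = sum(any(abs(a - b) <= window for b in bursts_b)
                  for a in bursts_a)
    return matched / len(bursts_a)
```

    After a lesion, this score drops toward zero and recovers as fibers regrow across the fusion site, giving a scalar readout of functional regeneration.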

  11. Development of a method to evaluate glutamate receptor function in rat barrel cortex slices.

    PubMed

    Lehohla, M; Russell, V; Kellaway, L; Govender, A

    2000-12-01

    The rat is a nocturnal animal and uses its vibrissae extensively to navigate its environment. The vibrissae are linked to a highly organized part of the sensory cortex, called the barrel cortex, which contains spiny neurons that receive whisker-specific thalamic input and distribute their output mainly within the cortical column. The aim of the present study was to develop a method to evaluate glutamate receptor function in the rat barrel cortex. Long Evans rats (90-160 g) were killed by cervical dislocation and decapitated. The brain was rapidly removed, cooled in a continuously oxygenated, ice-cold Hepes buffer (pH 7.4) and sliced using a vibratome to produce 0.35 mm slices. The barrel cortex was dissected from slices corresponding to 8.6 to 4.8 mm anterior to the interaural line and divided into rostral, middle and caudal regions. Depolarization-induced uptake of 45Ca2+ was achieved by incubating test slices in a high-K+ (62.5 mM) buffer for 2 minutes at 35 degrees C. Potassium-stimulated uptake of 45Ca2+ into the rostral region was significantly lower than into the middle and caudal regions of the barrel cortex. Glutamate had no effect. NMDA significantly increased uptake of 45Ca2+ into all regions of the barrel cortex. The technique is useful for determining NMDA receptor function and will be applied to study differences between spontaneously hypertensive rats (SHR), which are used as a model for attention-deficit disorder, and their normotensive control rats.

  12. Venus' night side atmospheric dynamics using near infrared observations from VEx/VIRTIS and TNG/NICS

    NASA Astrophysics Data System (ADS)

    Mota Machado, Pedro; Peralta, Javier; Luz, David; Gonçalves, Ruben; Widemann, Thomas; Oliveira, Joana

    2016-10-01

    We present night-side Venus winds based on coordinated observations carried out with Venus Express' VIRTIS instrument and the Near Infrared Camera (NICS) of the Telescopio Nazionale Galileo (TNG). With the NICS camera we acquired images in the K-continuum filter at 2.28 μm, which allows motions to be monitored at Venus' lower cloud level, close to 48 km altitude. We will present final results of cloud-tracked winds from ground-based TNG observations and from coordinated space-based VEx/VIRTIS observations. The Venus lower cloud deck is centred at 48 km altitude, where fundamental dynamical exchanges that help maintain superrotation are thought to occur. The lower Venusian atmosphere is a strong source of thermal radiation, with the gaseous CO2 component allowing radiation to escape in windows at 1.74 and 2.28 μm. At these wavelengths the radiation originates below 35 km and unit opacity is reached at the lower cloud level, close to 48 km. It is therefore possible to observe the horizontal cloud structure, with thicker clouds seen silhouetted against the bright thermal background from the lower atmosphere. By continuously monitoring the horizontal cloud structure at 2.28 μm (NICS Kcont filter), wind fields can be determined using the cloud-tracking technique. We acquired a series of short exposures of the Venus disk, and cloud displacements on the night side were computed with a semi-automated phase-correlation technique. The Venus apparent diameter at the observation dates was greater than 32", allowing high spatial precision. The 0.13" pixel scale of the NICS narrow-field camera made it possible to resolve ~3-pixel displacements. The absolute spatial resolution on the disk was ~100 km/px at disk center, and the (0.8-1") seeing-limited resolution was ~400 km/px. By co-adding the best images and cross-correlating cloud regions, the effective resolution was significantly better than the seeing-limited resolution.
In order to correct for scattered light from the (saturated) day-side crescent into the night side, a set of observations with a Brackett-γ (Brγ) filter was performed. Cloud features are invisible at this wavelength, and this technique allowed a good correction for scattered light.
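
    The phase-correlation displacement estimate used for cloud tracking can be sketched with FFTs; this is the generic textbook form of the technique, not the authors' pipeline:

```python
import numpy as np

def phase_correlation_shift(img_a, img_b):
    # Phase correlation: the normalized cross-power spectrum of two
    # shifted images inverse-transforms to a sharp peak located at
    # the displacement of img_b relative to img_a.
    A = np.fft.fft2(img_a)
    B = np.fft.fft2(img_b)
    cross = np.conj(A) * B
    cross /= np.abs(cross) + 1e-12          # keep phase information only
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # map FFT wrap-around to signed shifts
    if dy > img_a.shape[0] // 2:
        dy -= img_a.shape[0]
    if dx > img_a.shape[1] // 2:
        dx -= img_a.shape[1]
    return int(dy), int(dx)
```

    In practice one estimates shifts between cloud-feature patches in successive night-side frames; dividing the shift by the time between frames and multiplying by the km-per-pixel scale converts it to a wind velocity.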

  13. Spatial and Temporal Varying Thresholds for Cloud Detection in Satellite Imagery

    NASA Technical Reports Server (NTRS)

    Jedlovec, Gary; Haines, Stephanie

    2007-01-01

    A new cloud detection technique has been developed and applied to both geostationary and polar orbiting satellite imagery having channels in the thermal infrared and short wave infrared spectral regions. The bispectral composite threshold (BCT) technique uses only the 11 micron and 3.9 micron channels, and composite imagery generated from these channels, in a four-step cloud detection procedure to produce a binary cloud mask at single-pixel resolution. A unique aspect of this algorithm is the use of 20-day composites of the 11 micron and the 11 - 3.9 micron channel difference imagery to represent spatially and temporally varying clear-sky thresholds for the bispectral cloud tests. The BCT cloud detection algorithm has been applied to GOES and MODIS data over the continental United States over the last three years with good success. The resulting products have been validated against "truth" datasets (generated by the manual determination of the sky conditions from available satellite imagery) for various seasons from the 2003-2005 period. The day and night algorithm has been shown to determine the correct sky conditions 80-90% of the time (on average) over land and ocean areas. Only a small variation in algorithm performance occurs between day and night, land and ocean, and between seasons. The algorithm performs least well during the winter season, with only 80% of the sky conditions determined correctly. The algorithm was found to under-determine clouds at night and during times of low sun angle (in geostationary satellite data) and tends to over-determine the presence of clouds during the day, particularly in the summertime. Since the spectral tests use only the short- and long-wave channels common to most multispectral scanners, the application of the BCT technique to a variety of satellite sensors, including SEVIRI, should be straightforward and produce similar performance results.
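
    The spirit of the BCT procedure can be condensed into two per-pixel tests against the 20-day clear-sky composites. A minimal sketch; the threshold offsets below are placeholders, not the operational values:

```python
import numpy as np

def bct_cloud_mask(t11, t11_minus_t39, clear_t11, clear_diff,
                   dt11=4.0, ddiff=2.0):
    # Cloudy if the 11 um brightness temperature falls well below the
    # composite clear-sky value (cold cloud tops), or if the
    # 11 - 3.9 um channel difference departs from its clear-sky
    # composite (low cloud / fog signature).
    cold_test = t11 < clear_t11 - dt11
    diff_test = np.abs(t11_minus_t39 - clear_diff) > ddiff
    return cold_test | diff_test
```

    The key idea is that `clear_t11` and `clear_diff` vary per pixel (and with time of day), having been composited over the previous 20 days, rather than being fixed global thresholds.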

  14. MRI segmentation by active contours model, 3D reconstruction, and visualization

    NASA Astrophysics Data System (ADS)

    Lopez-Hernandez, Juan M.; Velasquez-Aguilar, J. Guadalupe

    2005-02-01

    Advances in 3D data modelling methods are becoming increasingly popular in the areas of biology, chemistry and medical applications. The Nuclear Magnetic Resonance Imaging (NMRI) technique has progressed at a spectacular rate over the past few years, and its use has spread to many applications throughout the body in both anatomical and functional investigations. In this paper we present the application of Zernike polynomials to a 3D mesh model of the head, using contours acquired from cross-sectional slices by active-contour-model extraction, and we propose visualization with OpenGL 3D graphics of the 2D-3D (slice-surface) information as a diagnostic aid in medical applications.

  15. TU-A-12A-07: CT-Based Biomarkers to Characterize Lung Lesion: Effects of CT Dose, Slice Thickness and Reconstruction Algorithm Based Upon a Phantom Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, B; Tan, Y; Tsai, W

    2014-06-15

    Purpose: Radiogenomics promises the ability to study cancer tumor genotype from the phenotype obtained through radiographic imaging. However, little attention has been paid to the sensitivity of image features, the image-based biomarkers, to image acquisition techniques. This study explores the impact of CT dose, slice thickness and reconstruction algorithm on measured image features using a thorax phantom. Methods: Twenty-four phantom lesions of known volume (1 and 2 mm), shape (spherical, elliptical, lobular and spicular) and density (-630, -10 and +100 HU) were scanned on a GE VCT at four doses (25, 50, 100, and 200 mAs). For each scan, six image series were reconstructed at three slice thicknesses of 5, 2.5 and 1.25 mm with contiguous intervals, using the lung and standard reconstruction algorithms. The lesions were segmented with an in-house 3D algorithm. Fifty (50) image features representing lesion size, shape, edge, and density distribution/texture were computed. A regression method was employed to analyze the effect of CT dose, slice thickness and reconstruction algorithm on these features, adjusting for three confounding factors (size, density and shape of the phantom lesions). Results: The coefficients of CT dose, slice thickness and reconstruction algorithm are presented in Table 1 in the supplementary material. No significant difference was found between the image features calculated on low-dose CT scans (25 mAs and 50 mAs). About 50% of the texture features were found statistically different between low doses and high doses (100 and 200 mAs). Significant differences were found for almost all features when calculated on 1.25 mm, 2.5 mm, and 5 mm slice-thickness images. Reconstruction algorithms significantly affected all density-based image features, but not morphological features.
Conclusions: There is a great need to standardize CT imaging protocols for radiogenomics studies because CT dose, slice thickness and reconstruction algorithm impact quantitative image features to various degrees, as our study has shown.

  16. Synthesis of Calcium Phosphate Composite Organogels by Using Emulsion Method for Dentine Occlusion Materials

    NASA Astrophysics Data System (ADS)

    Nopteeranupharp, C.; Akkarachaneeyakorn, K.; Songsasaen, A.

    2018-03-01

    Dentinal hypersensitivity (DH) is a common problem caused by the erosion of enamel. There are many methods and materials for addressing it. Calcium phosphate is an excellent candidate for treating this condition because of its osteoconductivity and biocompatibility. A low-cost, low-toxicity calcium phosphate nanogel was fabricated by an emulsion method and characterized using TEM, EDX, and DLS techniques. The results showed that P123 (poly(ethylene oxide)19-block-poly(propylene oxide)69-block-poly(ethylene oxide)19) played a major role as a template and in gel formation, while SDS was used as a surfactant to form circular water-in-oil emulsion nanodroplets. Moreover, the ability of the synthesised organogel to occlude exposed dentine tubules was tested on a model of human dentine slices. The results showed that the calcium phosphate composite organogel efficiently occluded the dentine slices after 1 day, as characterized by the SEM technique.

  17. Comparison of ultrasonography, radiography and a single computed tomography slice for the identification of fluid within the tympanic bulla of rabbit cadavers.

    PubMed

    King, A M; Posthumus, J; Hammond, G; Sullivan, M

    2012-08-01

    Evaluation of the tympanic bulla (TB) in cases of otitis media in the rabbit can be a diagnostic challenge, although a feature often associated with the condition is the accumulation of fluid or material within the TB. Randomly selected TB from 40 rabbit cadavers were filled with a water-based, water-soluble jelly lubricant. A dorsoventral radiograph and single computed tomography (CT) slice were taken followed by an ultrasound (US) examination. Image interpretation was performed by blinded operators. The content of each TB was determined (fluid or gas) using each technique and the cadavers were frozen and sectioned for confirmation. CT was the most accurate diagnostic method, but US produced better results than radiography. Given the advantages of US over the other imaging techniques, the results suggest that further work is warranted to determine US applications in the evaluation of the rabbit TB and clinical cases of otitis media in this species.

  18. A Solar Reflectance Method for Retrieving Cloud Optical Thickness and Droplet Size Over Snow and Ice Surfaces

    NASA Technical Reports Server (NTRS)

    Platnick, S.; Li, J. Y.; King, M. D.; Gerber, H.; Hobbs, P. V.

    1999-01-01

    Cloud optical thickness and effective radius retrievals from solar reflectance measurements are traditionally implemented using a combination of spectral channels that are absorbing and non-absorbing for water particles. Reflectances in non-absorbing channels (e.g., 0.67, 0.86, 1.2 micron spectral window bands) are largely dependent on cloud optical thickness, while longer wavelength absorbing channels (1.6, 2.1, and 3.7 micron window bands) provide cloud particle size information. Cloud retrievals over ice and snow surfaces present serious difficulties. At the shorter wavelengths, ice is bright and highly variable, both characteristics acting to significantly increase cloud retrieval uncertainty. In contrast, reflectances at the longer wavelengths are relatively small and may be comparable to that of dark open water. A modification to the traditional cloud retrieval technique is devised. The new algorithm uses only a combination of absorbing spectral channels for which the snow/ice albedo is relatively small. Using this approach, retrievals have been made with the MODIS Airborne Simulator (MAS) imager flown aboard the NASA ER-2 during May-June 1998 in the Arctic FIRE-ACE field deployment. Data from several coordinated ER-2 and University of Washington CV-580 in situ aircraft observations of liquid water stratus clouds are examined. MAS retrievals of optical thickness, droplet effective radius, and liquid water path are shown to be in good agreement with the in situ measurements. The initial success of the technique has implications for future operational satellite cloud retrieval algorithms in polar and wintertime regions.

  19. The effect of spatial resolution upon cloud optical property retrievals. I - Optical thickness

    NASA Technical Reports Server (NTRS)

    Feind, Rand E.; Christopher, Sundar A.; Welch, Ronald M.

    1992-01-01

    High spectral and spatial resolution Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) imagery is used to study the effects of spatial resolution upon fair weather cumulus cloud optical thickness retrievals. As a preprocessing step, a variation of the Gao and Goetz three-band ratio technique is used to discriminate clouds from the background. The combination of the elimination of cloud shadow pixels and using the first derivative of the histogram allows for accurate cloud edge discrimination. The data are progressively degraded from 20 m to 960 m spatial resolution. The results show that retrieved cloud area increases with decreasing spatial resolution. The results also show that there is a monotonic decrease in retrieved cloud optical thickness with decreasing spatial resolution. It is also demonstrated that the use of a single, monospectral reflectance threshold is inadequate for identifying cloud pixels in fair weather cumulus scenes and presumably in any inhomogeneous cloud field. Cloud edges have a distribution of reflectance thresholds. The incorrect identification of cloud edges significantly impacts the accurate retrieval of cloud optical thickness values.

  20. Ubiquity and impact of thin mid-level clouds in the tropics

    PubMed Central

    Bourgeois, Quentin; Ekman, Annica M. L.; Igel, Matthew R.; Krejci, Radovan

    2016-01-01

    Clouds are crucial for Earth's climate and radiation budget. Great attention has been paid to low, high and vertically thick tropospheric clouds such as stratus, cirrus and deep convective clouds. However, much less is known about tropospheric mid-level clouds as these clouds are challenging to observe in situ and difficult to detect by remote sensing techniques. Here we use Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) satellite observations to show that thin mid-level clouds (TMLCs) are ubiquitous in the tropics. Supported by high-resolution regional model simulations, we find that TMLCs are formed by detrainment from convective clouds near the zero-degree isotherm. Calculations using a radiative transfer model indicate that tropical TMLCs have a cooling effect on climate that could be as large in magnitude as the warming effect of cirrus. We conclude that more effort has to be made to understand TMLCs, as their influence on cloud feedbacks, heat and moisture transport, and climate sensitivity could be substantial. PMID:27530236

  1. Ubiquity and impact of thin mid-level clouds in the tropics.

    PubMed

    Bourgeois, Quentin; Ekman, Annica M L; Igel, Matthew R; Krejci, Radovan

    2016-08-17

    Clouds are crucial for Earth's climate and radiation budget. Great attention has been paid to low, high and vertically thick tropospheric clouds such as stratus, cirrus and deep convective clouds. However, much less is known about tropospheric mid-level clouds as these clouds are challenging to observe in situ and difficult to detect by remote sensing techniques. Here we use Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) satellite observations to show that thin mid-level clouds (TMLCs) are ubiquitous in the tropics. Supported by high-resolution regional model simulations, we find that TMLCs are formed by detrainment from convective clouds near the zero-degree isotherm. Calculations using a radiative transfer model indicate that tropical TMLCs have a cooling effect on climate that could be as large in magnitude as the warming effect of cirrus. We conclude that more effort has to be made to understand TMLCs, as their influence on cloud feedbacks, heat and moisture transport, and climate sensitivity could be substantial.

  2. Global Single and Multiple Cloud Classification with a Fuzzy Logic Expert System

    NASA Technical Reports Server (NTRS)

    Welch, Ronald M.; Tovinkere, Vasanth; Titlow, James; Baum, Bryan A.

    1996-01-01

    An unresolved problem in remote sensing concerns the analysis of satellite imagery containing both single and multiple cloud layers. While cloud parameterizations are very important both in global climate models and in studies of the Earth's radiation budget, most cloud retrieval schemes, such as the bispectral method used by the International Satellite Cloud Climatology Project (ISCCP), have no way of determining whether overlapping cloud layers exist in any group of satellite pixels. Coakley (1983) used a spatial coherence method to determine whether a region contained more than one cloud layer. Baum et al. (1995) developed a scheme for detection and analysis of daytime multiple cloud layers using merged AVHRR (Advanced Very High Resolution Radiometer) and HIRS (High-resolution Infrared Radiometer Sounder) data collected during the First ISCCP Regional Experiment (FIRE) Cirrus 2 field campaign. Baum et al. (1995) explored the use of a cloud classification technique based on AVHRR data. This study examines the feasibility of applying the cloud classifier to global satellite imagery.

  3. The Evolution of 3D Microimaging Techniques in Geosciences

    NASA Astrophysics Data System (ADS)

    Sahagian, D.; Proussevitch, A.

    2009-05-01

    In the analysis of geomaterials, it is essential to be able to analyze internal structures on a quantitative basis. Techniques have evolved from rough qualitative methods to highly accurate quantitative methods coupled with 3-D numerical analysis. The earliest, primitive method for "seeing" what was inside a rock was multiple sectioning to produce a series of image slices; this technique typically completely destroyed the sample being analyzed. Another destructive method was developed to give more detailed quantitative information by forming plastic casts of internal voids in sedimentary and volcanic rocks: voids were filled with plastic and the rock was dissolved away with HF to reveal plastic casts of the internal vesicles. Later, new approaches to stereology were developed to extract 3D information from 2D cross-sectional images. This has long been possible for spheres because the probability distribution for cutting a sphere along any small circle is known analytically (the greatest probability is near the equator). However, large numbers of objects are required for statistical validity, and geomaterials are seldom spherical, so crystals, vesicles, and other inclusions needed a more sophisticated approach. Consequently, probability distributions were developed using numerical techniques for rectangular solids and various ellipsoids so that stereological techniques could be applied to these. The "holy grail" has always been to obtain 3D quantitative images non-destructively. A key method is Computed X-ray Tomography (CXT), in which attenuation of X-rays is recorded as a function of angular position in a cylindrical sample, providing a 2D "slice" of the interior. When a series of these "slices" is stacked (in increments equivalent to the resolution of the X-ray, to make cubic voxels), a 3D image results with quantitative information regarding internal structure, particle/void volumes, nearest neighbors, coordination numbers, preferred orientations, etc.
CXT can be done at three basic levels of resolution, with "normal" X-rays providing tens-of-microns resolution, synchrotron sources providing single to few microns, and emerging XuM techniques providing a practical 300 nm and a theoretical 60 nm. The main challenges in CXT imaging have been segmentation, which delineates material boundaries, and object recognition (registration), in which the individual objects within a material are identified. The former is critical in quantifying object volume, while the latter is essential for preventing the false appearance of individual objects as a continuous structure. In addition, new techniques are now being developed to enhance resolution and provide more detailed analysis without the complex infrastructure needed for CXT. One such method is Laser Scanning Confocal Microscopy, in which a laser is reflected from individual interior surfaces of a fluorescing material, providing a series of sharp images of internal slices with quantitative information available, just as in X-ray tomography, after "z-stacking" of planes of pixels. Another novel approach is the use of Stereo Scanning Electron Microscopy to create digital elevation models of 3D surficial features such as partial bubble margins on the surfaces of fine volcanic ash particles. As other novel techniques emerge, new opportunities will be presented to the geological research community to obtain ever more detailed and accurate information regarding the interior structure of geomaterials.
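
    The stereological point about spheres, that a random planar section most probably cuts near the equator, is easy to verify numerically. A quick Monte-Carlo check (our illustration, not from the abstract):

```python
import numpy as np

def section_radius_samples(R=1.0, n=100_000, seed=0):
    # A random plane cuts a sphere of radius R at a height h uniform
    # in (-R, R); the section circle then has radius r = sqrt(R^2 - h^2).
    # The resulting density of r diverges as r -> R, i.e. near-equatorial
    # (large) sections are by far the most probable.
    h = np.random.default_rng(seed).uniform(-R, R, n)
    return np.sqrt(R * R - h * h)
```

    The mean section radius works out to πR/4 ≈ 0.785R, and large sections dominate: far more samples fall above 0.9R than below 0.1R, which is why a population of sectioned spheres can be deconvolved back to a true size distribution.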

  4. Fast spectroscopic variations on rapidly-rotating, cool dwarfs. 3: Masses of circumstellar absorbing clouds on AB Doradus

    NASA Technical Reports Server (NTRS)

    Cameron, A. Collier; Duncan, D. K.; Ehrenfreund, P.; Foing, B. H.; Kuntz, K. D.; Penston, M. V.; Robinson, R. D.; Soderblom, D. R.

    1989-01-01

    New time-resolved H alpha, Ca II H and K, and Mg II h and k spectra of the rapidly-rotating K0 dwarf star AB Doradus (= HD 36705) are presented. The transient absorption features seen in the H alpha line are also present in the Ca II and Mg II resonance lines. New techniques are developed for measuring the average strength of the line absorption along lines of sight intersecting the cloud. These techniques also give a measure of the projected cloud area. The strength of the resonance line absorption provides useful new constraints on the column densities, projected surface areas, temperatures and internal turbulent velocity dispersions of the circumstellar clouds producing the absorption features. At any given time the star appears to be surrounded by at least 6 to 10 clouds with masses in the range 2 to 6 x 10(exp 17) g. The clouds appear to have turbulent internal velocity dispersions of order 3 to 20 km/s, comparable with the random velocities of discrete filamentary structures in solar quiescent prominences. Night-to-night changes in the amount of Ca II resonance line absorption can be explained by changes in the amplitude of turbulent motions in the clouds. The corresponding changes in the total energy of the internal motions are of order 10(exp 29) erg per cloud. Changes of this magnitude could easily be supplied by the frequent energetic (approximately 10(exp 34) erg) X-ray flares seen on this star.

  5. Advanced intensity-modulation continuous-wave lidar techniques for ASCENDS CO2 column measurements

    NASA Astrophysics Data System (ADS)

    Campbell, Joel F.; Lin, Bing; Nehrir, Amin R.; Harrison, F. W.; Obland, Michael D.; Meadows, Byron

    2015-10-01

    Global atmospheric carbon dioxide (CO2) measurements for the NASA Active Sensing of CO2 Emissions over Nights, Days, and Seasons (ASCENDS) space mission are critical for improving our understanding of global CO2 sources and sinks. Advanced Intensity-Modulated Continuous-Wave (IM-CW) lidar techniques are investigated as a means of facilitating CO2 measurements from space to meet the ASCENDS measurement requirements. In recent numerical, laboratory and flight experiments we have successfully used the Binary Phase Shift Keying (BPSK) modulation technique to uniquely discriminate surface lidar returns from intermediate aerosol and cloud contamination. We demonstrate the utility of BPSK to eliminate sidelobes in the range profile as a means of making Integrated Path Differential Absorption (IPDA) column CO2 measurements in the presence of optically thin clouds, thereby eliminating the need to correct for sidelobe bias errors caused by the clouds. Furthermore, high accuracy and precision ranging to the surface as well as to the top of intermediate cloud layers, which is a requirement for the inversion of column CO2 number density measurements to column CO2 mixing ratios, has been demonstrated using new hyperfine interpolation techniques that take advantage of the periodicity of the modulation waveforms. This approach works well for both BPSK and linear swept-frequency modulation techniques. The BPSK technique under investigation has excellent auto-correlation properties while possessing a finite bandwidth. A comparison of BPSK and linear swept-frequency is also discussed in this paper. These results are extended to include Richardson-Lucy deconvolution techniques to extend the resolution of the lidar beyond the limit implied by the bandwidth of the modulation, where it is shown useful for making tree canopy measurements.
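
    The "excellent auto-correlation properties" of BPSK codes mentioned above can be demonstrated with a tiny sketch (not the flight implementation): a maximal-length PN sequence mapped to +/-1 chips has a circular autocorrelation of N at zero lag and -1 at every other lag, which is why surface returns can be cleanly separated from intermediate cloud returns without range sidelobes. The LFSR taps below are one standard choice for a length-7 sequence.

```python
import numpy as np

def lfsr_msequence(taps, nbits):
    """Generate one period of a maximal-length sequence from a Fibonacci LFSR."""
    state = [1] * nbits
    seq = []
    for _ in range(2 ** nbits - 1):
        seq.append(state[-1])            # output the last register bit
        fb = 0
        for t in taps:                   # XOR the tapped bits as feedback
            fb ^= state[t - 1]
        state = [fb] + state[:-1]        # shift the register
    return np.array(seq)

# Taps (3, 1) correspond to the primitive polynomial x^3 + x + 1
chips = 1 - 2 * lfsr_msequence(taps=(3, 1), nbits=3)  # 0/1 -> +1/-1 BPSK chips
N = len(chips)                                        # 7 chips per period

# Circular autocorrelation at every lag
acf = np.array([np.dot(chips, np.roll(chips, k)) for k in range(N)])
print(acf.tolist())  # [7, -1, -1, -1, -1, -1, -1]: no sidelobes above -1
```

Longer codes used in practice have the same two-valued autocorrelation, with the off-peak level of -1 becoming negligible relative to the peak N.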

  6. Advanced IMCW Lidar Techniques for ASCENDS CO2 Column Measurements

    NASA Astrophysics Data System (ADS)

    Campbell, Joel; Lin, Bing; Nehrir, Amin; Harrison, Fenton; Obland, Michael

    2015-04-01

    Global atmospheric carbon dioxide (CO2) measurements for the NASA Active Sensing of CO2 Emissions over Nights, Days, and Seasons (ASCENDS) space mission are critical for improving our understanding of global CO2 sources and sinks. Advanced Intensity-Modulated Continuous-Wave (IM-CW) lidar techniques are investigated as a means of facilitating CO2 measurements from space to meet the ASCENDS measurement requirements. In recent numerical, laboratory and flight experiments we have successfully used the Binary Phase Shift Keying (BPSK) modulation technique to uniquely discriminate surface lidar returns from intermediate aerosol and cloud contamination. We demonstrate the utility of BPSK to eliminate sidelobes in the range profile as a means of making Integrated Path Differential Absorption (IPDA) column CO2 measurements in the presence of optically thin clouds, thereby eliminating the need to correct for sidelobe bias errors caused by the clouds. Furthermore, high accuracy and precision ranging to the surface as well as to the top of intermediate cloud layers, which is a requirement for the inversion of column CO2 number density measurements to column CO2 mixing ratios, has been demonstrated using new hyperfine interpolation techniques that take advantage of the periodicity of the modulation waveforms. This approach works well for both BPSK and linear swept-frequency modulation techniques. The BPSK technique under investigation has excellent auto-correlation properties while possessing a finite bandwidth. A comparison of BPSK and linear swept-frequency is also discussed in this paper. These results are extended to include Richardson-Lucy deconvolution techniques to extend the resolution of the lidar beyond the limit implied by the bandwidth of the modulation.

  7. Advanced Intensity-Modulation Continuous-Wave Lidar Techniques for ASCENDS CO2 Column Measurements

    NASA Technical Reports Server (NTRS)

    Campbell, Joel F.; Lin, Bing; Nehrir, Amin R.; Harrison, F. Wallace; Obland, Michael D.; Meadows, Byron

    2015-01-01

    Global atmospheric carbon dioxide (CO2) measurements for the NASA Active Sensing of CO2 Emissions over Nights, Days, and Seasons (ASCENDS) space mission are critical for improving our understanding of global CO2 sources and sinks. Advanced Intensity-Modulated Continuous-Wave (IM-CW) lidar techniques are investigated as a means of facilitating CO2 measurements from space to meet the ASCENDS measurement requirements. In recent numerical, laboratory and flight experiments we have successfully used the Binary Phase Shift Keying (BPSK) modulation technique to uniquely discriminate surface lidar returns from intermediate aerosol and cloud contamination. We demonstrate the utility of BPSK to eliminate sidelobes in the range profile as a means of making Integrated Path Differential Absorption (IPDA) column CO2 measurements in the presence of optically thin clouds, thereby eliminating the need to correct for sidelobe bias errors caused by the clouds. Furthermore, high accuracy and precision ranging to the surface as well as to the top of intermediate cloud layers, which is a requirement for the inversion of column CO2 number density measurements to column CO2 mixing ratios, has been demonstrated using new hyperfine interpolation techniques that take advantage of the periodicity of the modulation waveforms. This approach works well for both BPSK and linear swept-frequency modulation techniques. The BPSK technique under investigation has excellent auto-correlation properties while possessing a finite bandwidth. A comparison of BPSK and linear swept-frequency is also discussed in this paper. These results are extended to include Richardson-Lucy deconvolution techniques to extend the resolution of the lidar beyond the limit implied by the bandwidth of the modulation, where it is shown useful for making tree canopy measurements.

  8. Advanced Intensity-Modulation Continuous-Wave Lidar Techniques for Column CO2 Measurements

    NASA Astrophysics Data System (ADS)

    Campbell, J. F.; Lin, B.; Nehrir, A. R.; Obland, M. D.; Liu, Z.; Browell, E. V.; Chen, S.; Kooi, S. A.; Fan, T. F.

    2015-12-01

    Global and regional atmospheric carbon dioxide (CO2) measurements for the NASA Active Sensing of CO2 Emissions over Nights, Days, and Seasons (ASCENDS) space mission and Atmospheric Carbon and Transport (ACT) - America airborne investigation are critical for improving our understanding of global CO2 sources and sinks. Advanced Intensity-Modulated Continuous-Wave (IM-CW) lidar techniques are being investigated as a means of facilitating CO2 measurements from space and airborne platforms to meet the mission science measurement requirements. In recent numerical, laboratory and flight experiments we have successfully used the Binary Phase Shift Keying (BPSK) modulation technique to uniquely discriminate surface lidar returns from intermediate aerosol and cloud returns. We demonstrate the utility of BPSK to eliminate sidelobes in the range profile as a means of making Integrated Path Differential Absorption (IPDA) column CO2 measurements in the presence of intervening optically thin clouds, thereby minimizing bias errors caused by the clouds. Furthermore, high accuracy and precision ranging to the Earth's surface as well as to the top of intermediate cloud layers, which is a requirement for the inversion of column CO2 number density measurements to column CO2 mixing ratios, has been demonstrated using new hyperfine interpolation techniques that take advantage of the periodicity of the modulation waveforms. This approach works well for both BPSK and linear swept-frequency modulation techniques and provides very high (sub-meter level) range resolution. The BPSK technique under investigation has excellent auto-correlation properties while possessing a finite bandwidth. A comparison of BPSK and linear swept-frequency is also discussed in this paper. These techniques are used in a new data processing architecture to support the ASCENDS CarbonHawk Experiment Simulator (ACES) and ACT-America programs.
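
    The IPDA inversion step that these records refer to reduces, in its simplest form, to converting the ratio of on-line and off-line returns into a column number density via the one-way differential optical depth. The sketch below is a hedged illustration of that relation only; all numerical values (powers, energies, differential cross-section) are invented placeholders, not mission parameters.

```python
import math

def ipda_column_density(p_on, p_off, e_on, e_off, delta_sigma_cm2):
    """Column number density (molecules/cm^2) from on/off-line lidar returns.

    The two-way transmission ratio gives twice the one-way differential
    optical depth tau; dividing tau by the differential absorption
    cross-section yields the column number density.
    """
    tau = 0.5 * math.log((p_off * e_on) / (p_on * e_off))
    return tau / delta_sigma_cm2

n_col = ipda_column_density(
    p_on=0.6, p_off=1.0,        # received powers (arbitrary units)
    e_on=1.0, e_off=1.0,        # transmitted pulse energies (normalized)
    delta_sigma_cm2=3.0e-23,    # assumed differential cross-section
)
print(f"{n_col:.2e} molecules/cm^2")
```

Converting this column number density to a mixing ratio additionally requires the dry-air column, which is why the precise ranging to the surface and cloud tops discussed above matters.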

  9. Mesoscale mapping of available solar energy at the earth's surface by use of satellites

    NASA Technical Reports Server (NTRS)

    Hiser, H. W.; Senn, H. V.

    1980-01-01

    A method is presented for use of cloud images in the visual spectrum from the SMS/GOES geostationary satellites to determine the hourly distribution of sunshine on the mesoscale. Cloud coverage and density as a function of time of day and season are evaluated through the use of digital data processing techniques. Seasonal geographic distributions of cloud cover/sunshine are converted to joules of solar radiation received at the earth's surface through relationships developed from long-term measurements of these two parameters at six widely distributed stations. The technique can be used to generate maps showing the geographic distribution of total solar radiation on the mesoscale which is received at the earth's surface.
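
    The conversion step described above, from satellite-derived cloud cover to surface solar energy via station relationships, can be sketched as a simple regression. The six paired observations below are invented stand-ins for the long-term station records, not values from the study.

```python
import numpy as np

# Invented long-term station pairs: fractional cloud cover vs measured
# daily surface insolation (MJ/m^2/day).
cloud_cover = np.array([0.1, 0.2, 0.35, 0.5, 0.7, 0.9])
insolation = np.array([26.0, 24.1, 21.0, 18.2, 14.3, 10.4])

# Fit the station relationship: insolation as a linear function of cover
slope, intercept = np.polyfit(cloud_cover, insolation, 1)

def surface_insolation(cc):
    """Estimated daily insolation (MJ/m^2) for satellite-derived cover cc."""
    return slope * cc + intercept

# Apply the relationship to a satellite-derived cloud-cover value
print(round(float(surface_insolation(0.4)), 1))
```

Mapping this function over a grid of satellite-derived seasonal cloud-cover values would produce the mesoscale insolation maps the abstract describes.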

  10. The Use of Satellite Observed Cloud Patterns in Northern Hemisphere 300 mb and 1000/300 mb Numerical Analysis.

    DTIC Science & Technology

    1984-02-01

    Keywords: prediction, extratropical cyclones, objective analysis, bogus techniques. A quasi-objective statistical method for deriving 300 mb geopotential heights and 1000/300 mb thicknesses in the vicinity of extratropical cyclones with the aid of satellite imagery is presented. The technique utilizes satellite observed extratropical spiral cloud pattern parameters in conjunction

  11. A Comparative Study of YSO Classification Techniques using WISE Observations of the KR 120 Molecular Cloud

    NASA Astrophysics Data System (ADS)

    Kang, Sung-Ju; Kerton, C. R.

    2014-01-01

    KR 120 (Sh2-187) is a small Galactic HII region located at a distance of 1.4 kpc that shows evidence for triggered star formation in the surrounding molecular cloud. We present an analysis of the young stellar object (YSO) population of the molecular cloud as determined using a variety of classification techniques. YSO candidates are selected from the WISE all sky catalog and classified as Class I, Class II and Flat based on 1) spectral index, 2) color-color or color-magnitude plots, and 3) spectral energy distribution (SED) fits to radiative transfer models. We examine the discrepancies in YSO classification between the various techniques and explore how these discrepancies lead to uncertainty in such scientifically interesting quantities as the ratio of Class I/Class II sources and the surface density of YSOs at various stages of evolution.
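
    The first classification technique listed above, the infrared spectral index, can be sketched as follows: alpha is the slope of log(lambda * F_lambda) across the infrared, estimated here from two bands, and thresholds sort sources into the standard classes. The class boundaries follow the commonly used Greene et al. scheme, and the WISE-like wavelengths and fluxes below are invented for illustration.

```python
import math

def spectral_index(lam1_um, f1, lam2_um, f2):
    """Slope of log10(lambda * F_lambda) between two wavelengths (microns)."""
    return (math.log10(lam2_um * f2) - math.log10(lam1_um * f1)) / (
        math.log10(lam2_um) - math.log10(lam1_um)
    )

def classify(alpha):
    """Standard YSO classes from the infrared spectral index."""
    if alpha >= 0.3:
        return "Class I"
    if alpha >= -0.3:
        return "Flat"
    if alpha >= -1.6:
        return "Class II"
    return "Class III"

# A rising SED between ~3.4 um (W1-like) and ~22 um (W4-like) fluxes
# (invented values) indicates an embedded protostar.
alpha = spectral_index(3.4, 1.0e-13, 22.0, 5.0e-13)
print(classify(alpha))  # Class I
```

Discrepancies of the kind the abstract discusses arise because color-color cuts and SED fitting need not agree with these index thresholds near the class boundaries.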

  12. METSAT information content: Cloud screening and solar correction investigations on the influence of NOAA-6 advanced very high resolution radiometer derived vegetation assessment

    NASA Technical Reports Server (NTRS)

    Mathews, M. L.

    1983-01-01

    The development of the cloud indicator index (CII) for use with METSAT's advanced very high resolution radiometer (AVHRR) is described. The CII is very effective at identifying clouds. Also explored are different solar correction and standardization techniques, and the impact these corrections have on the information content of AVHRR data.

  13. Multiple two-dimensional versus three-dimensional PTV definition in treatment planning for conformal radiotherapy.

    PubMed

    Stroom, J C; Korevaar, G A; Koper, P C; Visser, A G; Heijmen, B J

    1998-06-01

    To demonstrate the need for a fully three-dimensional (3D) computerized expansion of the gross tumour volume (GTV) or clinical target volume (CTV), as delineated by the radiation oncologist on CT slices, to obtain the proper planning target volume (PTV) for treatment planning according to the ICRU-50 recommendations. For 10 prostate cancer patients two PTVs have been determined by expansion of the GTV with a 1.5 cm margin, i.e. a 3D PTV and a multiple 2D PTV. The former was obtained by automatically adding the margin while accounting in 3D for GTV contour differences in neighbouring slices. The latter was generated by automatically adding the 1.5 cm margin to the GTV in each CT slice separately; the resulting PTV is a computer simulation of the PTV that a radiation oncologist would obtain with (the still common) manual contouring in CT slices. For each patient the two PTVs were compared to assess the deviations of the multiple 2D PTV from the 3D PTV. For both PTVs conformal plans were designed using a three-field technique with fixed block margins. For each patient dose-volume histograms and tumour control probabilities (TCPs) of the (correct) 3D PTV were calculated, both for the plan designed for this PTV and for the treatment plan based on the (deviating) 2D PTV. Depending on the shape of the GTV, multiple 2D PTV generation could locally result in a 1 cm underestimation of the GTV-to-PTV margin. The deviations occurred predominantly in the cranio-caudal direction at locations where the GTV contour shape varies significantly from slice to slice. This could lead to serious underdosage and to a TCP decrease of up to 15%. A full 3D GTV-to-PTV expansion should be applied in conformal radiotherapy to avoid underdosage.
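
    The difference between the two expansion strategies compared above can be sketched with morphological dilation on a toy GTV whose contour shrinks abruptly between neighbouring slices. The voxel geometry and the margin expressed in voxels are arbitrary stand-ins for the 1.5 cm margin, not clinical values.

```python
import numpy as np
from scipy import ndimage

# Toy GTV on a (slice, row, col) grid: wide in the first two slices,
# abruptly narrow in the remaining three.
gtv = np.zeros((5, 30, 30), dtype=bool)
gtv[0:2, 10:20, 10:20] = True   # wide part of the GTV
gtv[2:5, 14:16, 14:16] = True   # abruptly narrower part

margin = 3  # GTV-to-PTV margin expressed in voxels

# Multiple-2D PTV: dilate each axial slice independently, mimicking
# slice-by-slice manual contouring.
ptv_2d = np.stack([ndimage.binary_dilation(s, iterations=margin) for s in gtv])

# 3D PTV: dilate the volume as a whole, so the margin also extends
# across slices in the cranio-caudal direction.
ptv_3d = ndimage.binary_dilation(gtv, iterations=margin)

# Voxels the slice-by-slice method misses where the contour changes
# abruptly from slice to slice.
missed = ptv_3d & ~ptv_2d
print(bool(missed.sum() > 0))  # True: the 2D expansion underestimates the PTV
```

As in the study, the missed voxels concentrate in the cranio-caudal direction around the slices where the GTV contour changes shape, which is exactly where the 2D margin can be locally too small.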

  14. Cloud-based adaptive exon prediction for DNA analysis.

    PubMed

    Putluri, Srinivasareddy; Zia Ur Rahman, Md; Fathima, Shaik Yasmeen

    2018-02-01

    Cloud computing offers significant research and economic benefits to healthcare organisations. Cloud services provide a safe place for storing and managing large amounts of sensitive data. Under the conventional flow of gene information, gene sequence laboratories send out raw and inferred information via the Internet to several sequence libraries. DNA sequencing storage costs will be minimised by use of a cloud service. In this study, the authors put forward a novel genomic informatics system using Amazon Cloud Services, where genomic sequence information is stored and accessed for processing. True identification of exon regions in a DNA sequence is a key task in bioinformatics, which helps in disease identification and drug design. The three-base periodicity property of exons forms the basis of all exon identification techniques. Adaptive signal processing techniques were found to be promising in comparison with several other methods. Several adaptive exon predictors (AEPs) are developed using variable normalised least mean square and its maximum normalised variants to reduce computational complexity. Finally, performance evaluation of the various AEPs is done based on measures such as sensitivity, specificity and precision using various standard genomic datasets taken from the National Center for Biotechnology Information genomic sequence database.
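
    The adaptive-filtering core of the AEPs discussed above can be sketched with a minimal normalised-LMS (NLMS) loop. In the genomic application the input would be a numerically mapped DNA sequence and period-3 power in the adapted output would flag exon regions; here the signals are synthetic placeholders used only to show the filter converging on an unknown system.

```python
import numpy as np

def nlms(x, d, order=8, mu=0.5, eps=1e-8):
    """Return the error sequence of a normalised-LMS adaptive filter."""
    w = np.zeros(order)
    e = np.zeros(len(x))
    for n in range(order - 1, len(x)):
        u = x[n - order + 1 : n + 1][::-1]    # regressor, newest sample first
        y = w @ u                             # current filter output
        e[n] = d[n] - y                       # prediction error
        w += mu * e[n] * u / (eps + u @ u)    # normalised weight update
    return e

rng = np.random.default_rng(0)
x = rng.standard_normal(2000)
# Desired signal: x passed through an unknown short FIR system
d = np.convolve(x, [0.5, -0.3, 0.2], mode="full")[: len(x)]

e = nlms(x, d)
# After adaptation the residual error power is far below the target power
print(bool(np.mean(e[-200:] ** 2) < 0.01 * np.mean(d[-200:] ** 2)))
```

The normalisation by the regressor energy (the `u @ u` term) is what keeps the step size stable for inputs of varying power, which is the motivation for the normalised variants the abstract mentions.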

  15. Full-Time, Eye-Safe Cloud and Aerosol Lidar Observation at Atmospheric Radiation Measurement Program Sites: Instruments and Data Analysis

    NASA Technical Reports Server (NTRS)

    Campbell, James R.; Hlavka, Dennis L.; Welton, Ellsworth J.; Flynn, Connor J.; Turner, David D.; Spinhirne, James D.; Scott, V. Stanley, III; Hwang, I. H.; Einaudi, Franco (Technical Monitor)

    2001-01-01

    Atmospheric radiative forcing, surface radiation budget, and top of the atmosphere radiance interpretation involve a knowledge of the vertical height structure of overlying cloud and aerosol layers. During the last decade, the U.S. Department of Energy through the Atmospheric Radiation Measurement (ARM) program has constructed four long-term atmospheric observing sites in strategic climate regimes (north central Oklahoma; Barrow, Alaska; and Nauru and Manus Islands in the tropical western Pacific). Micro Pulse Lidar (MPL) systems provide continuous, autonomous observation of all significant atmospheric cloud and aerosol at each of the central ARM facilities. Systems are compact and transmitted pulses are eye-safe. Eye-safety is achieved by expanding relatively low-powered outgoing pulse energy through a shared, coaxial transmit/receive telescope. ARM MPL system specifications and specific unit optical designs are discussed. Data normalization and calibration techniques are presented. A multiple cloud boundary detection algorithm is also described. These techniques in tandem represent an operational value-added processing package used to produce normalized data products for cloud and aerosol research and the historical ARM data archive.

  16. Determine precipitation rates from visible and infrared satellite images of clouds by pattern recognition technique. Progress Report, 1 Jul. 1985 - 31 Mar. 1987 Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Weinman, James A.; Garan, Louis

    1987-01-01

    A more advanced cloud pattern analysis algorithm was subsequently developed to take the shape and brightness of the various clouds into account in a manner that is more consistent with the human analyst's perception of GOES cloud imagery. The results of that classification scheme were compared with precipitation probabilities observed from ships of opportunity off the U.S. east coast to derive empirical regressions between cloud types and precipitation probability. The cloud morphology was then quantitatively and objectively used to map precipitation probabilities during two winter months during which severe cold air outbreaks were observed over the northwest Atlantic. Precipitation probabilities associated with various cloud types are summarized. Maps of precipitation probability derived from the cloud morphology analysis program for the two months were compared with the precipitation probability derived from thirty years of ship observations.

  17. Cloud Computing - A Unified Approach for Surveillance Issues

    NASA Astrophysics Data System (ADS)

    Rachana, C. R.; Banu, Reshma, Dr.; Ahammed, G. F. Ali, Dr.; Parameshachari, B. D., Dr.

    2017-08-01

    Cloud computing describes highly scalable resources provided as an external service via the Internet on a pay-per-use basis. From the economic point of view, the main attractiveness of cloud computing is that users only use what they need, and only pay for what they actually use. Resources are available for access from the cloud at any time, and from any location through networks. Cloud computing is gradually replacing the traditional Information Technology infrastructure. Securing data is one of the leading concerns and biggest issues for cloud computing. Privacy of information is always a crucial point, especially when an individual's personal or sensitive information is being stored in the organization. It is indeed true that today's cloud authorization systems are not robust enough. This paper presents a unified approach for analyzing the various security issues and techniques to overcome the challenges in the cloud environment.

  18. Outside-out "sniffer-patch" clamp technique for in situ measures of neurotransmitter release.

    PubMed

    Muller-Chrétien, Emilie

    2014-01-01

    The mechanism underlying neurotransmitter release is a critical research domain for the understanding of neuronal network function; however, few techniques are available for the direct detection and measurement of neurotransmitter release. To date, the sniffer-patch clamp technique is mainly used to investigate these mechanisms from individual cultured cells. In this study, we propose to adapt the sniffer-patch clamp technique to in situ detection of neurosecretion. Using outside-out patches from donor cells as specific biosensors plunged in acute cerebral slices, this technique allows for proper detection and quantification of neurotransmitter release at the level of the neuronal network.

  19. Security on Cloud Revocation Authority using Identity Based Encryption

    NASA Astrophysics Data System (ADS)

    Rajaprabha, M. N.

    2017-11-01

    In the era of cloud computing, most people save their documents, files and other data in cloud storage. Security over the cloud is therefore important, because all of this confidential material resides there. To overcome private key infrastructure (PKI) issues, revocable Identity Based Encryption (IBE) techniques have been introduced that eliminate the need for a PKI. One such technique, the key-update cloud service provider, has two drawbacks: high computation and communication costs, and poor scalability. To overcome these problems, we propose a system in which a Cloud Revocation Authority (CRA) holds the secret key for each user. The secret key is protected with Advanced Encryption Standard (AES) security: the key is encrypted and sent to the CRA, which authenticates the person who wants to share data or files or to communicate. Only with that key can another user access the file; if a user applies an invalid key to a particular file, the information about that user and file is sent to the administrator, who has the right to blacklist that person from using the system services.

  20. Vertical Photon Transport in Cloud Remote Sensing Problems

    NASA Technical Reports Server (NTRS)

    Platnick, S.

    1999-01-01

    Photon transport in plane-parallel, vertically inhomogeneous clouds is investigated and applied to cloud remote sensing techniques that use solar reflectance or transmittance measurements for retrieving droplet effective radius. Transport is couched in terms of weighting functions which approximate the relative contribution of individual layers to the overall retrieval. Two vertical weightings are investigated, including one based on the average number of scatterings encountered by reflected and transmitted photons in any given layer. A simpler vertical weighting based on the maximum penetration of reflected photons proves useful for solar reflectance measurements. These weighting functions are highly dependent on droplet absorption and solar/viewing geometry. A superposition technique, using adding/doubling radiative transfer procedures, is derived to accurately determine both weightings, avoiding time-consuming Monte Carlo methods. Superposition calculations are made for a variety of geometries and cloud models, and selected results are compared with Monte Carlo calculations. Effective radius retrievals from modeled vertically inhomogeneous liquid water clouds are then made using the standard near-infrared bands, and compared with size estimates based on the proposed weighting functions. Agreement between the two methods is generally within several tenths of a micrometer, much better than expected retrieval accuracy. Though the emphasis is on photon transport in clouds, the derived weightings can be applied to any multiple scattering plane-parallel radiative transfer problem, including arbitrary combinations of cloud, aerosol, and gas layers.
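
    The weighting-function idea above reduces, in its simplest form, to approximating the retrieved effective radius as a weighted average of the per-layer radii, with weights peaking near cloud top because reflected near-infrared photons rarely penetrate deep into the cloud. The profile and the exponential weight shape below are invented placeholders, not the paper's computed weightings.

```python
import numpy as np

# Assumed layer properties for a vertically inhomogeneous cloud:
# optical depth at each layer midpoint and the droplet radius there.
tau_mid = np.array([0.5, 2.0, 4.0, 7.0, 10.0])    # optical depth from cloud top
r_eff = np.array([12.0, 11.0, 10.0, 9.0, 8.0])    # layer droplet radii (um)

# Assumed penetration-style weighting: contribution decays with depth
# (the 1/3 decay scale is purely illustrative).
weights = np.exp(-tau_mid / 3.0)

# Weighted-average estimate of the retrieved effective radius
r_retrieved = np.sum(weights * r_eff) / np.sum(weights)
print(round(float(r_retrieved), 2))  # dominated by the uppermost layers
```

Because the weights decay with optical depth, the estimate sits well above the unweighted column mean and close to the cloud-top radius, which is the qualitative behaviour the weighting functions are designed to capture.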

  1. Determination of cloud fields from analysis of HIRS2/MSU sounding data. [20 channel infrared and 4 channel microwave atmospheric sounders

    NASA Technical Reports Server (NTRS)

    Susskind, J.; Reuter, D.

    1986-01-01

    IR and microwave remote sensing data collected with the HIRS2 and MSU sensors on the NOAA polar-orbiting satellites were evaluated for their effectiveness as bases for determining cloud cover and cloud physical characteristics. Techniques employed to adjust for day-night alterations in the radiance fields are described, along with computational procedures applied to compare scene pixel values with reference values for clear skies. Sample results are provided for the mean cloud coverage detected over South America and Africa in June 1979, with attention given to concurrent surface pressure and cloud top pressure values.

  2. An approximate dynamic programming approach to resource management in multi-cloud scenarios

    NASA Astrophysics Data System (ADS)

    Pietrabissa, Antonio; Priscoli, Francesco Delli; Di Giorgio, Alessandro; Giuseppi, Alessandro; Panfili, Martina; Suraci, Vincenzo

    2017-03-01

    The programmability and the virtualisation of network resources are crucial to deploy scalable Information and Communications Technology (ICT) services. The increasing demand for cloud services, mainly devoted to storage and computing, requires a new functional element, the Cloud Management Broker (CMB), aimed at managing multiple cloud resources to meet the customers' requirements and, simultaneously, to optimise their usage. This paper proposes a multi-cloud resource allocation algorithm that manages resource requests with the aim of maximising the CMB revenue over time. The algorithm is based on Markov decision process modelling and relies on reinforcement learning techniques to find an approximate solution online.
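
    The decision problem described above can be sketched with tabular Q-learning, a basic reinforcement-learning stand-in for the paper's approximate dynamic programming: the broker decides whether to accept each incoming resource request, trading immediate revenue against future capacity. Prices, capacities, the request mix, and the release dynamics below are all invented for illustration.

```python
import random
from collections import defaultdict

random.seed(0)

CAPACITY = 4                                      # total resource units
REQUESTS = [("small", 1, 1.0), ("large", 3, 2.0)]  # (type, units, revenue)

Q = defaultdict(float)          # Q[(free_units, request_type, action)]
alpha, gamma, epsilon = 0.1, 0.95, 0.1

def step(free, units, price, action):
    """Apply one accept/reject decision; sold units free up stochastically."""
    reward = 0.0
    if action == 1 and units <= free:
        reward, free = price, free - units
    if random.random() < 0.5:                     # a leased unit is released
        free = min(CAPACITY, free + 1)
    return reward, free

free = CAPACITY
for _ in range(50_000):
    name, units, price = random.choice(REQUESTS)  # next request arrives
    if random.random() < epsilon:                 # epsilon-greedy exploration
        a = random.randint(0, 1)
    else:
        a = max((0, 1), key=lambda act: Q[(free, name, act)])
    r, nxt = step(free, units, price, a)
    # The next request type is unknown at update time, so we take the best
    # value over possible arrivals (an expectation would be more precise).
    best_next = max(Q[(nxt, t, act)] for t, _, _ in REQUESTS for act in (0, 1))
    Q[(free, name, a)] += alpha * (r + gamma * best_next - Q[(free, name, a)])
    free = nxt

# With a full machine, accepting a small request should look profitable
print(Q[(CAPACITY, "small", 1)] > Q[(CAPACITY, "small", 0)])
```

The learned policy captures the revenue trade-off the abstract describes: acceptance values fall as free capacity shrinks, because admitting a low-revenue request can block a more lucrative one later.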

  3. Retrieval of Ice Cloud Properties Using Variable Phase Functions

    NASA Astrophysics Data System (ADS)

    Heck, Patrick W.; Minnis, Patrick; Yang, Ping; Chang, Fu-Lung; Palikonda, Rabindra; Arduini, Robert F.; Sun-Mack, Sunny

    2009-03-01

    An enhancement to NASA Langley's Visible Infrared Solar-infrared Split-window Technique (VISST) is developed to identify and account for situations when errors are induced by using smooth ice crystals. The retrieval scheme incorporates new ice cloud phase functions that utilize hexagonal crystals with roughened surfaces. In some situations, cloud optical depths are reduced, hence, cloud height is increased. Cloud effective particle size also changes with the roughened ice crystal models which results in varied effects on the calculation of ice water path. Once validated and expanded, the new approach will be integrated in the CERES MODIS algorithm and real-time retrievals at Langley.

  4. OPTIMIZATION OF A PRECISION-CUT TROUT LIVER TISSUE SLICE ASSAY AS A SCREEN FOR VITELLOGENIN INDUCTION: COMPARISON OF SLIDE INCUBATION TECHNIQUES

    EPA Science Inventory

    The egg-yolk precursor protein vitellogenin (VTG) is normally only produced in the liver of female rainbow trout (RBT) in response to increasing 17B-estradiol (E2) during sexual maturation. However, its...

  5. Legal issues in clouds: towards a risk inventory.

    PubMed

    Djemame, Karim; Barnitzke, Benno; Corrales, Marcelo; Kiran, Mariam; Jiang, Ming; Armstrong, Django; Forgó, Nikolaus; Nwankwo, Iheanyi

    2013-01-28

    Cloud computing technologies have reached a high level of development, yet a number of obstacles still exist that must be overcome before widespread commercial adoption can become a reality. In a cloud environment, end users requesting services and cloud providers negotiate service-level agreements (SLAs) that provide explicit statements of all expectations and obligations of the participants. If cloud computing is to experience widespread commercial adoption, then incorporating risk assessment techniques is essential during SLA negotiation and service operation. This article focuses on the legal issues surrounding risk assessment in cloud computing. Specifically, it analyses risk regarding data protection and security, and presents the requirements of an inherent risk inventory. The usefulness of such a risk inventory is described in the context of the OPTIMIS project.

  6. Infrared cloud imaging in support of Earth-space optical communication.

    PubMed

    Nugent, Paul W; Shaw, Joseph A; Piazzolla, Sabino

    2009-05-11

    The increasing need for high data return from near-Earth and deep-space missions is driving a demand for the establishment of Earth-space optical communication links. These links will require a nearly obstruction-free path to the communication platform, so there is a need to measure spatial and temporal statistics of clouds at potential ground-station sites. A technique is described that uses a ground-based thermal infrared imager to provide continuous day-night cloud detection and classification according to the cloud optical depth and potential communication channel attenuation. The benefit of retrieving cloud optical depth and corresponding attenuation is illustrated through measurements that identify cloudy times when optical communication may still be possible through thin clouds.
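
    The optical-depth-to-attenuation step the abstract relies on follows from Beer-Lambert extinction: a cloud of optical depth tau transmits exp(-tau) of the signal along the zenith, i.e. about 4.343*tau dB of one-way attenuation, scaled by the airmass off zenith. The link-margin threshold below is an assumed example value, not a figure from the study.

```python
import math

def cloud_attenuation_db(tau, zenith_deg=0.0):
    """One-way attenuation (dB) through a cloud of optical depth tau."""
    airmass = 1.0 / math.cos(math.radians(zenith_deg))  # slant-path factor
    return 10.0 * math.log10(math.e) * tau * airmass    # ~4.343 dB per unit tau

link_margin_db = 10.0  # assumed margin of the hypothetical optical link
for tau in (0.5, 1.0, 3.0):
    usable = cloud_attenuation_db(tau) < link_margin_db
    print(tau, round(cloud_attenuation_db(tau), 2), usable)
```

This is why retrieving the optical depth of thin clouds, rather than a binary cloud mask, lets the imager identify cloudy periods when communication may still be possible.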

  7. Determination of potential solar power sites in the United States based upon satellite cloud observations

    NASA Technical Reports Server (NTRS)

    Hiser, H. W.; Senn, H. V.; Bukkapatnam, S. T.; Akyuzlu, K.

    1977-01-01

    The use of cloud images in the visual spectrum from the SMS/GOES geostationary satellites to determine the hourly distribution of sunshine on a mesoscale in the continental United States excluding Alaska is presented. Cloud coverage and density as a function of time of day and season are evaluated through the use of digital data processing techniques. Low density cirrus clouds are less detrimental to solar energy collection than other types; and clouds in the morning and evening are less detrimental than those during midday hours of maximum insolation. Seasonal geographic distributions of cloud cover/sunshine are converted to langleys of solar radiation received at the earth's surface through relationships developed from long term measurements at six widely distributed stations.

  8. Simple Estimation of the Endolymphatic Volume Ratio after Intravenous Administration of a Single-dose of Gadolinium Contrast

    PubMed Central

    NAGANAWA, Shinji; KANOU, Mai; OHASHI, Toshio; KUNO, Kayao; SONE, Michihiko

    2016-01-01

    Purpose: To evaluate the feasibility of a simple estimation for the endolymphatic volume ratio (endolymph volume/total lymph volume = %ELvolume) from an area ratio obtained from only one slice (%EL1slice) or from three slices (%EL3slices). The %ELvolume, calculated from a time-consuming measurement on all magnetic resonance (MR) slices, was compared to the %EL1slice and the %EL3slices. Methods: In 40 ears of 20 patients with a clinical suspicion of endolymphatic hydrops, MR imaging was performed 4 hours after intravenous administration of a single dose of gadolinium-based contrast material (IV-SD-GBCM). Using previously reported HYDROPS2-Mi2 MR imaging, the %ELvolume values in the cochlea and the vestibule were measured separately by two observers. The correlations between the %EL1slice or the %EL3slices and the %ELvolume values were evaluated. Results: A strong linear correlation was observed between the %ELvolume and the %EL3slices or the %EL1slice in the cochlea. The Pearson correlation coefficient (r) was 0.968 (3 slices) and 0.965 (1 slice) for observer A, and 0.968 (3 slices) and 0.964 (1 slice) for observer B (P < 0.001, for all). A strong linear correlation was also observed between the %ELvolume and the %EL3slices or the %EL1slice in the vestibule. The Pearson correlation coefficient (r) was 0.980 (3 slices) and 0.953 (1 slice) for observer A, and 0.979 (3 slices) and 0.952 (1 slice) for observer B (P < 0.001, for all). The high intra-class correlation coefficients (0.991–0.997) between the endolymph volume ratios by two observers were observed in both the cochlea and the vestibule for values of the %ELvolume, the %EL3slices and the %EL1slice. Conclusion: The %ELvolume might be easily estimated from the %EL3slices or the %EL1slice. PMID:27001396

  9. Simple Estimation of the Endolymphatic Volume Ratio after Intravenous Administration of a Single-dose of Gadolinium Contrast.

    PubMed

    Naganawa, Shinji; Kanou, Mai; Ohashi, Toshio; Kuno, Kayao; Sone, Michihiko

    2016-10-11

    To evaluate the feasibility of a simple estimation for the endolymphatic volume ratio (endolymph volume/total lymph volume = %ELvolume) from an area ratio obtained from only one slice (%EL1slice) or from three slices (%EL3slices). The %ELvolume, calculated from a time-consuming measurement on all magnetic resonance (MR) slices, was compared to the %EL1slice and the %EL3slices. In 40 ears of 20 patients with a clinical suspicion of endolymphatic hydrops, MR imaging was performed 4 hours after intravenous administration of a single dose of gadolinium-based contrast material (IV-SD-GBCM). Using previously reported HYDROPS2-Mi2 MR imaging, the %ELvolume values in the cochlea and the vestibule were measured separately by two observers. The correlations between the %EL1slice or the %EL3slices and the %ELvolume values were evaluated. A strong linear correlation was observed between the %ELvolume and the %EL3slices or the %EL1slice in the cochlea. The Pearson correlation coefficient (r) was 0.968 (3 slices) and 0.965 (1 slice) for observer A, and 0.968 (3 slices) and 0.964 (1 slice) for observer B (P < 0.001, for all). A strong linear correlation was also observed between the %ELvolume and the %EL3slices or the %EL1slice in the vestibule. The Pearson correlation coefficient (r) was 0.980 (3 slices) and 0.953 (1 slice) for observer A, and 0.979 (3 slices) and 0.952 (1 slice) for observer B (P < 0.001, for all). The high intra-class correlation coefficients (0.991-0.997) between the endolymph volume ratios by two observers were observed in both the cochlea and the vestibule for values of the %ELvolume, the %EL3slices and the %EL1slice. The %ELvolume might be easily estimated from the %EL3slices or the %EL1slice.

  10. Precision Cut Mouse Lung Slices to Visualize Live Pulmonary Dendritic Cells

    PubMed Central

    Lyons-Cohen, Miranda R.; Thomas, Seddon Y.; Cook, Donald N.; Nakano, Hideki

    2017-01-01

    SHORT ABSTRACT: We describe a method for generating precision-cut lung slices (PCLS) and immunostaining them to visualize the localization of various immune cell types in the lung. Our protocol can be extended to visualize the location and function of many different cell types under a variety of conditions. LONG ABSTRACT: Inhalation of allergens and pathogens elicits multiple changes in a variety of immune cell types in the lung. Flow cytometry is a powerful technique for quantitative analysis of cell surface proteins on immune cells, but it provides no information on the localization and migration patterns of these cells within the lung. Similarly, in vitro chemotaxis assays can be performed to study the potential of cells to respond to chemotactic factors in vitro, but these assays do not reproduce the complex environment of the intact lung. In contrast to these aforementioned techniques, the location of individual cell types within the lung can be readily visualized by generating precision-cut lung slices (PCLS), staining them with commercially available, fluorescently tagged antibodies, and visualizing the sections by confocal microscopy. PCLS can be used for both live and fixed lung tissue, and the slices can encompass areas as large as a cross section of an entire lobe. We have used this protocol to successfully visualize the location of a wide variety of cell types in the lung, including distinct types of dendritic cells, macrophages, neutrophils, T cells and B cells, as well as structural cells such as lymphatic, endothelial, and epithelial cells. The ability to visualize cellular interactions, such as those between dendritic cells and T cells, in live, three-dimensional lung tissue, can reveal how cells move within the lung and interact with one another at steady state and during inflammation.
Thus, when used in combination with other procedures, such as flow cytometry and quantitative PCR, PCLS can contribute to a comprehensive understanding of cellular events that underlie allergic and inflammatory diseases of the lung. PMID:28448013

  11. Multi-slice Fractional Ventilation Imaging in Large Animals with Hyperpolarized Gas MRI

    PubMed Central

    Emami, Kiarash; Xu, Yinan; Hamedani, Hooman; Xin, Yi; Profka, Harrilla; Rajaei, Jennia; Kadlecek, Stephen; Ishii, Masaru; Rizi, Rahim R.

    2012-01-01

    Noninvasive assessment of regional lung ventilation is of critical importance in quantifying the severity of disease and evaluating response to therapy in many pulmonary diseases. This work presents for the first time the implementation of a hyperpolarized (HP) gas MRI technique for measuring whole-lung regional fractional ventilation (r) in Yorkshire pigs (n = 5) through the use of a gas mixing and delivery device in the supine position. The proposed technique utilizes a series of back-to-back HP gas breaths with images acquired during short end-inspiratory breath-holds. In order to decouple the RF pulse decay effect from ventilatory signal build-up in the airways, the regional distribution of flip angle (α) was estimated in the imaged slices by acquiring a series of back-to-back images with no inter-scan time delay during a breath-hold at the tail end of the ventilation sequence. Analysis was performed to assess the sensitivity of the multi-slice ventilation model to noise, oxygen and the number of flip angle images. The optimal α value was determined based on minimizing the error in r estimation; αopt = 5–6° for the set of acquisition parameters in pigs. The mean r values for the group of pigs were 0.27±0.09, 0.35±0.06 and 0.40±0.04 for ventral, middle and dorsal slices, respectively (excluding conductive airways, r > 0.9). A positive gravitational (ventral-dorsal) ventilation gradient was present in all animals. The trachea and major conductive airways showed a uniform near-unity r value, with progressively smaller values corresponding to smaller diameter airways, ultimately leading to lung parenchyma. The results demonstrate the feasibility of measuring fractional ventilation in large species, and provide a platform for addressing the technical challenges associated with long breathing time scales through the optimization of acquisition parameters in species with a pulmonary physiology very similar to that of human beings. PMID:22290603
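The need to decouple RF depletion from ventilatory build-up can be seen in a toy signal model. A sketch under simplifying assumptions (single gas compartment, no oxygen-induced relaxation; all parameter values illustrative, not the study's):

```python
import math

def signal_series(r, n_breaths, alpha_deg=5.0, n_pulses=64, m_src=1.0):
    """End-inspiratory HP-gas signal after each of n_breaths breaths.

    Each breath replaces fraction r of the voxel gas with fresh
    hyperpolarized gas; magnetization carried over from the previous
    breath is depleted by cos(alpha)^n_pulses from the imaging RF pulses.
    """
    d = math.cos(math.radians(alpha_deg)) ** n_pulses
    m, series = 0.0, []
    for _ in range(n_breaths):
        m = (1.0 - r) * d * m + r * m_src   # RF-decayed carryover + fresh gas
        series.append(m)
    return series

# Conductive airways (r near 1) saturate immediately; parenchyma builds up.
trachea = signal_series(0.95, 6)
parenchyma = signal_series(0.30, 6)
```

The flip-angle term d is why the protocol maps α per slice: with α unknown, the same build-up curve could be explained by different (r, α) pairs.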

  12. A novel cardiac MR chamber volume model for mechanical dyssynchrony assessment

    NASA Astrophysics Data System (ADS)

    Song, Ting; Fung, Maggie; Stainsby, Jeffrey A.; Hood, Maureen N.; Ho, Vincent B.

    2009-02-01

    A novel cardiac chamber volume model is proposed for the assessment of left ventricular mechanical dyssynchrony. The tool is potentially useful for assessment of regional cardiac function and identification of mechanical dyssynchrony on MRI. Dyssynchrony typically results from a contraction delay between one or more individual left ventricular segments, which in turn leads to inefficient ventricular function and ultimately heart failure. Cardiac resynchronization therapy has emerged as an electrical treatment of choice for heart failure patients with dyssynchrony. Prior MRI techniques have relied on assessments of actual cardiac wall changes, either using standard cine MR images or specialized pulse sequences. In this abstract, we detail a semi-automated method that evaluates dyssynchrony based on segmental volumetric analysis of the left ventricular (LV) chamber as illustrated on standard cine MR images. Twelve sectors each were chosen for the basal and mid-ventricular slices and 8 sectors were chosen for apical slices, for a total of 32 sectors. For each slice (i.e. basal, mid and apical), a systolic dyssynchrony index (SDI) was measured. SDI, a parameter used for 3D echocardiographic analysis of dyssynchrony, was defined as the corrected standard deviation of the time at which minimal volume is reached in each sector. The SDI measurement of a healthy volunteer was 3.54%. In a patient with acute myocardial infarction, the SDI measurements were 10.98%, 16.57% and 1.41% for basal, mid-ventricular and apical LV slices, respectively. Based on published 3D echocardiogram reference threshold values, the patient's SDI corresponds to moderate basal dysfunction, severe mid-ventricular dysfunction, and normal apical LV function, which were confirmed on echocardiography.
The LV chamber segmental volume analysis model and SDI is feasible using standard cine MR data and may provide more reliable assessment of patients with dyssynchrony especially if the LV myocardium is thin or if the MR images have spatial resolution insufficient for proper resolution of wall thickness-features problematic for dyssynchrony assessment using existing MR techniques.
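The SDI described above is a short computation once per-sector volume-time curves exist: take each sector's time of minimum volume and express the standard deviation of those times as a percentage of the cardiac cycle. A sketch (sector curves and frame timing are invented for illustration; the paper works from cine MR segmentations):

```python
def sdi(sector_curves, cycle_ms):
    """Systolic dyssynchrony index: SD of time-to-minimum-volume across
    sectors, expressed as a percentage of the cardiac cycle length."""
    n_frames = len(sector_curves[0])
    frame_ms = cycle_ms / n_frames
    # Time (ms) at which each sector reaches its minimal volume.
    t_min = [curve.index(min(curve)) * frame_ms for curve in sector_curves]
    mean = sum(t_min) / len(t_min)
    sd = (sum((t - mean) ** 2 for t in t_min) / len(t_min)) ** 0.5
    return 100.0 * sd / cycle_ms

# Four synchronous sectors: every sector bottoms out at the same frame.
sync = [[60, 40, 20, 35, 55]] * 4
# Dyssynchronous sectors: minima scattered across frames 1-3.
dys = [[60, 20, 35, 45, 55], [60, 45, 20, 35, 55],
       [60, 50, 40, 20, 55], [60, 45, 20, 40, 55]]
```

The "corrected" SD in the abstract refers to this normalization by cycle length; the mild/moderate/severe thresholds come from the cited 3D echo references.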

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hui, C; Beddar, S; Wen, Z

    Purpose: The purpose of this study is to develop a technique to obtain four-dimensional (4D) magnetic resonance (MR) images that are more representative of a patient’s typical breathing cycle by utilizing an extended acquisition time while minimizing the image artifacts. Methods: The 4D MR data were acquired with a balanced steady-state free precession sequence in a two-dimensional sagittal plane of view. Each slice was acquired repeatedly for about 15 s, thereby obtaining multiple images at each of the 10 phases in the respiratory cycle. This improves the probability that at least one of the images was acquired at the desired phase during a regular breathing cycle. To create optimal 4D MR images, an iterative approach was used to identify the set of images that yielded the highest slice-to-slice similarity. To assess the effectiveness of the approach, the data set was truncated into periods of 7 s (50 time points), 11 s (75 time points) and the full 15 s (100 time points). The 4D MR images were then sorted with data of the three different acquisition periods for comparison. Results: In general, the 4D MR images sorted using data from longer acquisition periods showed fewer mismatch artifacts. In addition, the normalized cross correlation (NCC) between slices of a 4D volume increases with increased acquisition period. The average NCC was 0.791 for the 7 s period, 0.794 for the 11 s period and 0.796 for the 15 s period. Conclusion: Our preliminary study showed that extending the acquisition time with the proposed sorting technique can improve image quality and reduce artifact presence in the 4D MR images. Data acquisition over two breathing cycles is a good trade-off between artifact reduction and scan time. This research was partially funded by the Center for Radiation Oncology Research at UT MD Anderson Cancer Center.
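The sorting step reduces to scoring candidate images by similarity to already-chosen neighbours. A sketch of the NCC metric and a greedy pick over flattened image vectors (the paper's iterative scheme revisits choices across the whole volume rather than picking once; names here are illustrative):

```python
def ncc(a, b):
    """Normalized cross-correlation of two equal-size image vectors."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = sum((x - ma) ** 2 for x in a) ** 0.5
    db = sum((y - mb) ** 2 for y in b) ** 0.5
    return num / (da * db)

def best_candidate(candidates, neighbor):
    """Among repeated acquisitions of one slice/phase, keep the image
    most similar to the already-chosen neighbouring slice."""
    return max(candidates, key=lambda img: ncc(img, neighbor))

# Toy 4-pixel "images": the second candidate best matches the neighbour.
best = best_candidate([[4, 3, 2, 1], [1, 2, 3, 5], [2, 2, 2, 3]], [1, 2, 3, 4])
```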

  14. Advancing Technologies for Climate Observation

    NASA Technical Reports Server (NTRS)

    Wu, D.; Esper, J.; Ehsan, N.; Johnson, T.; Mast, W.; Piepmeier, J.; Racette, P.

    2014-01-01

    Climate research needs: accurate global cloud ice measurements (cloud ice properties are fundamental controlling variables of radiative transfer and precipitation); cost-effective, sensitive instruments for diurnal and wide-swath coverage; and mature technology for space remote sensing. IceCube objectives: develop and validate a flight-qualified 883 GHz receiver for future use in ice cloud radiometer missions; raise the TRL (5 to 7) of 883 GHz receiver technology; reduce instrument cost and risk by developing a path to space for COTS sub-mm-wave receiver systems; and enable remote sensing of global cloud ice with advanced technologies and techniques.

  15. Effects of Atmospheric Water Vapor and Clouds on NOAA (National Oceanic and Atmospheric Administration) AVHRR (Advanced Very High Resolution Radiometer) Satellite Data.

    DTIC Science & Technology

    1984-07-01

    aerosols and sub pixel-sized clouds all tend to increase Channel 1 with respect to Channel 2 and reduce the computed VIN. Further, the Guide states that... computation of the VIN. Large scale cloud contamination of pixels, while difficult to correct for, can at least be monitored and affected pixels...techniques have been developed for computer cloud screening. See, for example, Horvath et al. (1982), Gray and McCrary (1981a) and Nixon et al. (1983

  16. Evaluation of a rule-based compositing technique for Landsat-5 TM and Landsat-7 ETM+ images

    NASA Astrophysics Data System (ADS)

    Lück, W.; van Niekerk, A.

    2016-05-01

    Image compositing is a multi-objective optimization process. Its goal is to produce a seamless, cloud- and artefact-free artificial image. This is achieved by aggregating image observations and by replacing poor and cloudy data with good observations from imagery acquired within the timeframe of interest. The compositing process aims to minimise the visual artefacts which could result from differing radiometric properties caused by atmospheric conditions, phenologic patterns and land cover changes. It has the following requirements: (1) the image composite must be cloud free, which requires the detection of clouds and shadows, and (2) the image composite must be seamless, minimising artefacts visible across inter-image seams. This study proposes a new rule-based compositing technique (RBC) that combines the strengths of several existing methods. A quantitative and qualitative evaluation is made of the RBC technique by comparing it to the maximum NDVI (MaxNDVI), minimum red (MinRed) and maximum ratio (MaxRatio) compositing techniques. A total of 174 Landsat TM and ETM+ images, covering three study sites and three different timeframes for each site, are used in the evaluation. A new set of quantitative and qualitative evaluation techniques for measuring compositing quality was developed and showed that the RBC technique outperformed all the others, with the MaxRatio, MaxNDVI and MinRed techniques following in order of performance from best to worst.
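The baseline rules the RBC is compared against are simple per-pixel reducers. A sketch of the MaxNDVI rule on a synthetic two-date, two-pixel scene (real composites operate on full rasters with cloud masks applied first; the data here are invented):

```python
def ndvi(red, nir):
    """Normalized difference vegetation index from red/NIR reflectances."""
    return (nir - red) / (nir + red)

def max_ndvi_composite(observations):
    """observations: list of per-date images, each a list of (red, nir)
    pixel tuples (None where a date has no valid observation).
    Keeps, per pixel, the observation with the highest NDVI."""
    n_pix = len(observations[0])
    out = []
    for i in range(n_pix):
        candidates = [obs[i] for obs in observations if obs[i] is not None]
        out.append(max(candidates, key=lambda p: ndvi(*p)))
    return out

# Date 1: pixel 0 clear/vegetated, pixel 1 hazy; date 2 the reverse.
date1 = [(0.1, 0.5), (0.4, 0.45)]
date2 = [(0.3, 0.35), (0.05, 0.4)]
composite = max_ndvi_composite([date1, date2])
```

MaxNDVI tends to reject clouds because their high red reflectance depresses NDVI, which is why it serves as a standard baseline here.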

  17. Investigation of cloud/water vapor motion winds from geostationary satellite

    NASA Technical Reports Server (NTRS)

    Nieman, Steve; Velden, Chris; Hayden, Kit; Menzel, Paul

    1993-01-01

    Work has primarily focused on three tasks: (1) comparison of wind fields produced at MSFC with the CO2 autowind/autoeditor system newly installed in NESDIS operations; (2) evaluation of techniques for improved tracer selection through the use of cloud classification predictors; and (3) development of a height assignment algorithm using water vapor channel radiances. The contract goal is to improve the CIMSS wind system by developing new techniques and better assimilating existing techniques. The work reported here was done in collaboration with the NESDIS scientists working on the operational winds software, so that NASA-funded research can benefit NESDIS operational algorithms.

  18. Ice Cloud Properties in Ice-Over-Water Cloud Systems Using TRMM VIRS and TMI Data

    NASA Technical Reports Server (NTRS)

    Minnis, Patrick; Huang, Jianping; Lin, Bing; Yi, Yuhong; Arduini, Robert F.; Fan, Tai-Fang; Ayers, J. Kirk; Mace, Gerald G.

    2007-01-01

    A multi-layered cloud retrieval system (MCRS) is updated and used to estimate ice water path in maritime ice-over-water clouds using Visible and Infrared Scanner (VIRS) and TRMM Microwave Imager (TMI) measurements from the Tropical Rainfall Measuring Mission spacecraft between January and August 1998. Lookup tables of top-of-atmosphere 0.65-μm reflectance are developed for ice-over-water cloud systems using radiative transfer calculations with various combinations of ice-over-water cloud layers. The liquid and ice water paths, LWP and IWP, respectively, are determined with the MCRS using these lookup tables with a combination of microwave (MW), visible (VIS), and infrared (IR) data. LWP, determined directly from the TMI MW data, is used to define the lower-level cloud properties to select the proper lookup table. The properties of the upper-level ice clouds, such as optical depth and effective size, are then derived using the Visible Infrared Solar-infrared Split-window Technique (VISST), which matches the VIRS IR, 3.9-μm, and VIS data to the multilayer-cloud lookup table reflectances and a set of emittance parameterizations. Initial comparisons with surface-based radar retrievals suggest that this enhanced MCRS can significantly improve the accuracy and decrease the IWP in overlapped clouds by 42% and 13% compared to using the single-layer VISST and an earlier simplified MW-VIS-IR (MVI) differencing method, respectively, for ice-over-water cloud systems. The tropical distribution of ice-over-water clouds is the same as derived earlier from combined TMI and VIRS data, but the new values of IWP and optical depth are slightly larger than the older MVI values, and exceed those of single-layered clouds by 7% and 11%, respectively. The mean IWP from the MCRS is 8-14% greater than that retrieved from radar retrievals of overlapped clouds over two surface sites and the standard deviations of the differences are similar to those for single-layered clouds.
Examples of a method for applying the MCRS over land without microwave data yield similar differences with the surface retrievals. By combining the MCRS with other techniques that focus primarily on optically thin cirrus over low water clouds, it will be possible to more fully assess the IWP in all conditions over ocean except for precipitating systems.

  19. Retrieval of cloud cover parameters from multispectral satellite images

    NASA Technical Reports Server (NTRS)

    Arking, A.; Childs, J. D.

    1985-01-01

    A technique is described for extracting cloud cover parameters from multispectral satellite radiometric measurements. Utilizing three channels from the AVHRR (Advanced Very High Resolution Radiometer) on NOAA polar orbiting satellites, it is shown that one can retrieve four parameters for each pixel: cloud fraction within the FOV, optical thickness, cloud-top temperature and a microphysical model parameter. The last parameter is an index representing the properties of the cloud particle and is determined primarily by the radiance at 3.7 microns. The other three parameters are extracted from the visible and 11 micron infrared radiances, utilizing the information contained in the two-dimensional scatter plot of the measured radiances. The solution is essentially one in which the distributions of optical thickness and cloud-top temperature are maximally clustered for each region, with cloud fraction for each pixel adjusted to achieve maximal clustering.

  20. Towards Dynamic Remote Data Auditing in Computational Clouds

    PubMed Central

    Sookhak, Mehdi; Akhunzada, Adnan; Gani, Abdullah; Khurram Khan, Muhammad; Anuar, Nor Badrul

    2014-01-01

    Cloud computing is a significant shift of computational paradigm where computing as a utility and storing data remotely have a great potential. Enterprise and businesses are now more interested in outsourcing their data to the cloud to lessen the burden of local data storage and maintenance. However, the outsourced data and the computation outcomes are not continuously trustworthy due to the lack of control and physical possession of the data owners. To better streamline this issue, researchers have now focused on designing remote data auditing (RDA) techniques. The majority of these techniques, however, are only applicable for static archive data and are not subject to audit the dynamically updated outsourced data. We propose an effectual RDA technique based on algebraic signature properties for cloud storage system and also present a new data structure capable of efficiently supporting dynamic data operations like append, insert, modify, and delete. Moreover, this data structure empowers our method to be applicable for large-scale data with minimum computation cost. The comparative analysis with the state-of-the-art RDA schemes shows that the proposed scheme is secure and highly efficient in terms of the computation and communication overhead on the auditor and server. PMID:25121114
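The "algebraic signature properties" such schemes build on amount to linearity over a Galois field: the signature of an XOR of blocks equals the XOR of their signatures, so an auditor can check a combined server response against combined signatures without the raw data. A sketch over GF(2^8) (the paper's field size, generator, and protocol details differ; this only illustrates the homomorphic property):

```python
def gf_mul(a, b):
    """Multiply two bytes in GF(2^8) modulo x^8 + x^4 + x^3 + x + 1."""
    p = 0
    for _ in range(8):
        if b & 1:
            p ^= a
        hi = a & 0x80
        a = (a << 1) & 0xFF
        if hi:
            a ^= 0x1B          # reduce by the field polynomial
        b >>= 1
    return p

def alg_sig(block, g=0x03):
    """Algebraic signature of a byte block: XOR-sum of block[i] * g^i."""
    sig, gi = 0, 1
    for byte in block:
        sig ^= gf_mul(byte, gi)
        gi = gf_mul(gi, g)
    return sig

# Linearity: sig(a XOR b) == sig(a) XOR sig(b), the basis of the audit check.
a, b = [1, 2, 3, 4], [5, 6, 7, 8]
combined = [i ^ j for i, j in zip(a, b)]
```

Because GF multiplication distributes over XOR, the property holds exactly, which is what lets the data structure support appends and updates by re-signing only the touched blocks.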

  1. Obfuscatable multi-recipient re-encryption for secure privacy-preserving personal health record services.

    PubMed

    Shi, Yang; Fan, Hongfei; Xiong, Guoyue

    2015-01-01

    With the rapid development of cloud computing techniques, it is attractive for personal health record (PHR) service providers to deploy their PHR applications and store the personal health data in the cloud. However, there could be a serious privacy leakage if the cloud-based system is intruded by attackers, which makes it necessary for the PHR service provider to encrypt all patients' health data on cloud servers. Existing techniques are insufficiently secure under circumstances where advanced threats are considered, or are inefficient when many recipients are involved. Therefore, the objectives of our solution are (1) providing a secure implementation of re-encryption in white-box attack contexts and (2) assuring the efficiency of the implementation even in multi-recipient cases. We designed the multi-recipient re-encryption functionality by randomness reuse and protected the implementation by obfuscation. The proposed solution is secure even in white-box attack contexts. Furthermore, a comparison with other related work shows that the computational cost of the proposed solution is lower. The proposed technique can serve as a building block for supporting secure, efficient and privacy-preserving personal health record service systems.

  2. Towards dynamic remote data auditing in computational clouds.

    PubMed

    Sookhak, Mehdi; Akhunzada, Adnan; Gani, Abdullah; Khurram Khan, Muhammad; Anuar, Nor Badrul

    2014-01-01

    Cloud computing is a significant shift of computational paradigm where computing as a utility and storing data remotely have a great potential. Enterprise and businesses are now more interested in outsourcing their data to the cloud to lessen the burden of local data storage and maintenance. However, the outsourced data and the computation outcomes are not continuously trustworthy due to the lack of control and physical possession of the data owners. To better streamline this issue, researchers have now focused on designing remote data auditing (RDA) techniques. The majority of these techniques, however, are only applicable for static archive data and are not subject to audit the dynamically updated outsourced data. We propose an effectual RDA technique based on algebraic signature properties for cloud storage system and also present a new data structure capable of efficiently supporting dynamic data operations like append, insert, modify, and delete. Moreover, this data structure empowers our method to be applicable for large-scale data with minimum computation cost. The comparative analysis with the state-of-the-art RDA schemes shows that the proposed scheme is secure and highly efficient in terms of the computation and communication overhead on the auditor and server.

  3. Non-uniform muscle fat replacement along the proximodistal axis in Duchenne muscular dystrophy.

    PubMed

    Hooijmans, M T; Niks, E H; Burakiewicz, J; Anastasopoulos, C; van den Berg, S I; van Zwet, E; Webb, A G; Verschuuren, J J G M; Kan, H E

    2017-05-01

    The progressive replacement of muscle tissue by fat in Duchenne muscular dystrophy (DMD) has been studied using quantitative MRI between, but not within, individual muscles. We studied fat replacement along the proximodistal muscle axis using the Dixon technique on a 3T MR scanner in 22 DMD patients and 12 healthy controls. Mean fat fractions per muscle per slice for seven lower and upper leg muscles were compared between and within groups assuming a parabolic distribution. The average fat fraction for a small central slice stack and a large coverage slice stack was compared to the value when the stack was shifted one slice (15 mm) up or down. Higher fat fractions were observed in distal and proximal muscle segments compared to the muscle belly in all muscles of the DMD subjects (p < 0.001). A shift of 15 mm resulted in a difference in mean fat fraction which was on average 1-2%, ranging up to 12% (p < 0.01). The muscle end regions are exposed to higher mechanical strain, which points towards mechanical disruption of the sarcolemma as one of the key factors in the pathophysiology. Overall, this non-uniformity in fat replacement needs to be taken into account to prevent sample bias when applying quantitative MRI as a biomarker in clinical trials for DMD.

  4. Contextual convolutional neural networks for lung nodule classification using Gaussian-weighted average image patches

    NASA Astrophysics Data System (ADS)

    Lee, Haeil; Lee, Hansang; Park, Minseok; Kim, Junmo

    2017-03-01

    Lung cancer is the most common cause of cancer-related death. To diagnose lung cancers in early stages, numerous studies and approaches have been developed for cancer screening with computed tomography (CT) imaging. In recent years, convolutional neural networks (CNN) have become one of the most common and reliable techniques in computer aided detection (CADe) and diagnosis (CADx) by achieving state-of-the-art-level performances for various tasks. In this study, we propose a CNN classification system for false positive reduction of initially detected lung nodule candidates. First, image patches of lung nodule candidates are extracted from CT scans to train a CNN classifier. To reflect the volumetric contextual information of lung nodules to 2D image patch, we propose a weighted average image patch (WAIP) generation by averaging multiple slice images of lung nodule candidates. Moreover, to emphasize central slices of lung nodules, slice images are locally weighted according to Gaussian distribution and averaged to generate the 2D WAIP. With these extracted patches, 2D CNN is trained to achieve the classification of WAIPs of lung nodule candidates into positive and negative labels. We used LUNA 2016 public challenge database to validate the performance of our approach for false positive reduction in lung CT nodule classification. Experiments show our approach improves the classification accuracy of lung nodules compared to the baseline 2D CNN with patches from single slice image.
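The WAIP step is a normalized Gaussian average across neighbouring slices. A sketch over flattened patches (sigma and slice count are illustrative, not the paper's settings):

```python
import math

def gaussian_weights(n_slices, sigma=1.0):
    """Gaussian weights centered on the middle slice, normalized to sum 1."""
    c = (n_slices - 1) / 2.0
    w = [math.exp(-((i - c) ** 2) / (2 * sigma ** 2)) for i in range(n_slices)]
    s = sum(w)
    return [x / s for x in w]

def waip(slices, sigma=1.0):
    """Weighted average image patch: per-pixel Gaussian-weighted average
    across slices, emphasizing the central slice of the nodule."""
    w = gaussian_weights(len(slices), sigma)
    n_pix = len(slices[0])
    return [sum(w[k] * slices[k][i] for k in range(len(slices)))
            for i in range(n_pix)]

# Three toy 2-pixel slices: the bright middle slice dominates the patch.
patch = waip([[1.0, 1.0], [5.0, 5.0], [1.0, 1.0]], sigma=0.5)
```

The weighting is what injects volumetric context into a 2D patch while still letting a plain 2D CNN consume it.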

  5. Vertical variation of ice particle size in convective cloud tops.

    PubMed

    van Diedenhoven, Bastiaan; Fridlind, Ann M; Cairns, Brian; Ackerman, Andrew S; Yorks, John E

    2016-05-16

    A novel technique is used to estimate derivatives of ice effective radius with respect to height near convective cloud tops (dr_e/dz) from airborne shortwave reflectance measurements and lidar. Values of dr_e/dz are about -6 μm/km for cloud tops below the homogeneous freezing level, increasing to near 0 μm/km above the estimated level of neutral buoyancy. Retrieved dr_e/dz compares well with previously documented remote sensing and in situ estimates. Effective radii decrease with increasing cloud top height, while cloud top extinction increases. This is consistent with weaker size sorting in high, dense cloud tops above the level of neutral buoyancy where fewer large particles are present, and with stronger size sorting in lower cloud tops that are less dense. The results also confirm that cloud-top trends of effective radius can generally be used as surrogates for trends with height within convective cloud tops. These results provide valuable observational targets for model evaluation.
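The reported dr_e/dz values are slopes of effective radius against height, and the fit itself is ordinary least squares. A toy example with an exact -6 μm/km profile (numbers invented for the arithmetic, not taken from the retrievals):

```python
def slope(zs, res):
    """Least-squares slope of effective radius (μm) vs. height (km)."""
    n = len(zs)
    mz, mr = sum(zs) / n, sum(res) / n
    num = sum((z - mz) * (r - mr) for z, r in zip(zs, res))
    den = sum((z - mz) ** 2 for z in zs)
    return num / den

# Effective radius shrinking by 3 μm every 0.5 km near cloud top:
dre_dz = slope([8.0, 8.5, 9.0, 9.5], [30.0, 27.0, 24.0, 21.0])
```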

  6. Vertical Variation of Ice Particle Size in Convective Cloud Tops

    NASA Technical Reports Server (NTRS)

    Van Diedenhoven, Bastiaan; Fridlind, Ann M.; Cairns, Brian; Ackerman, Andrew S.; Yorks, John E.

    2016-01-01

    A novel technique is used to estimate derivatives of ice effective radius with respect to height near convective cloud tops (dr(sub e)/dz) from airborne shortwave reflectance measurements and lidar. Values of dr(sub e)/dz are about -6 micrometer/km for cloud tops below the homogeneous freezing level, increasing to near 0 micrometer/km above the estimated level of neutral buoyancy. Retrieved dr(sub e)/dz compares well with previously documented remote sensing and in situ estimates. Effective radii decrease with increasing cloud top height, while cloud top extinction increases. This is consistent with weaker size sorting in high, dense cloud tops above the level of neutral buoyancy where fewer large particles are present and with stronger size sorting in lower cloud tops that are less dense. The results also confirm that cloud top trends of effective radius can generally be used as surrogates for trends with height within convective cloud tops. These results provide valuable observational targets for model evaluation.

  7. An Investigation of the Characterization of Cloud Contamination in Hyperspectral Radiances

    NASA Technical Reports Server (NTRS)

    McCarty, William; Jedlovec, Gary J.; LeMarshall, John

    2007-01-01

    In regions lacking direct observations, the assimilation of radiances from infrared and microwave sounders is the primary method for characterizing the atmosphere in the analysis process. In recent years, technological advances have led to the launching of more advanced sounders, particularly in the thermal infrared spectrum. With the advent of these hyperspectral sounders, the amount of data available for the analysis process has increased dramatically and will continue to do so. However, the utilization of infrared radiances in variational assimilation can be problematic in the presence of clouds; specifically, the assessment of the presence of clouds in an instantaneous field of view (IFOV) and the contamination in the individual channels within the IFOV. Various techniques have been developed to determine if a channel is contaminated by clouds. The work presented in this paper and subsequent presentation will investigate traditional techniques and compare them to a new technique, the CO2 sorting technique, which utilizes the high spectral resolution of the Atmospheric Infrared Sounder (AIRS) within the framework of the Gridpoint Statistical Interpolation (GSI) 3DVAR system. Ultimately, this work is done in preparation for the assessment of short-term forecast impacts with the regional assimilation of AIRS radiances within the analysis fields of the Weather Research and Forecast Nonhydrostatic Mesoscale Model (WRF-NMM) at the NASA Short-term Prediction Research and Transition (SPoRT) Center.

  8. Cloud radiative properties and aerosol - cloud interaction

    NASA Astrophysics Data System (ADS)

    Viviana Vladutescu, Daniela; Gross, Barry; Li, Clement; Han, Zaw

    2015-04-01

    The presented research discusses different techniques for improving the measurement and analysis of cloud properties. The need for these measurements and analyses arises from the high errors noticed in existing methods that are currently used in retrieving cloud properties and, implicitly, cloud radiative forcing. The properties investigated are cloud fraction (cf) and cloud optical thickness (COT), measured with a suite of collocated remote sensing instruments. The novel approach makes use of a ground-based "poor man's camera" to detect cloud and sky radiation in red, green, and blue with a high spatial resolution of 30 mm at 1 km. The surface-based high resolution photography provides a new and interesting view of clouds. As the cloud fraction cannot be uniquely defined or measured, it depends on threshold and resolution. However, as resolution decreases, cloud fraction tends to increase if the threshold is below the mean, and vice versa. Additionally, cloud fractal dimension also depends on threshold. These findings therefore raise concerns over the ability to characterize clouds by cloud fraction or fractal dimension. Our analysis indicates that Principal Component Analysis may lead to a robust means of quantifying the cloud contribution to radiance. The cloud images are analyzed in conjunction with a collocated CIMEL sky radiometer, microwave radiometer and lidar to determine homogeneity and heterogeneity. Additionally, MFRSR measurements are used to determine the cloud radiative properties as a validation of the results obtained from the other instruments and methods. The cloud properties to be further studied are aerosol-cloud interaction, cloud particle radii, and vertical homogeneity.

  9. Universal Keyword Classifier on Public Key Based Encrypted Multikeyword Fuzzy Search in Public Cloud

    PubMed Central

    Munisamy, Shyamala Devi; Chokkalingam, Arun

    2015-01-01

    Cloud computing has pioneered the emerging world by manifesting itself as a service through the internet, facilitating third-party infrastructure and applications. While customers have no visibility into how their data are stored on the service provider's premises, it offers great benefits in lowering infrastructure costs and delivering more flexibility and simplicity in managing private data. The opportunity to use cloud services on a pay-per-use basis gives private data owners comfort in managing costs and data. With the pervasive use of the internet, the focus has now shifted towards effective data utilization on the cloud without compromising security. In the pursuit of increasing data utilization on public cloud storage, the key is to make data access effective through fuzzy searching techniques. In this paper, we discuss the existing fuzzy searching techniques and focus on reducing the searching time on the cloud storage server for effective data utilization. Our proposed Asymmetric Classifier Multikeyword Fuzzy Search method provides a classifier search server that creates a universal keyword classifier for a multiple-keyword request, greatly reducing the searching time by learning the search path pattern for all the keywords in the fuzzy keyword set. The objective of using a BTree fuzzy searchable index is to resolve typos and representation inconsistencies and also to facilitate effective data utilization. PMID:26380364
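The fuzzy keyword set mentioned above can be illustrated with the wildcard construction commonly used in the fuzzy-search-over-encrypted-data literature (a sketch of the general technique, not necessarily the authors' exact index): each keyword expands to a set of edit-distance-1 variants with a '*' marking the edit position, and two words match if their sets intersect.

```python
def wildcard_fuzzy_set(word):
    """Edit-distance-1 fuzzy keyword set: '*' marks a single substitution,
    deletion, or insertion position."""
    variants = {word}
    for i in range(len(word)):
        variants.add(word[:i] + "*" + word[i + 1:])  # substitute/delete char i
    for i in range(len(word) + 1):
        variants.add(word[:i] + "*" + word[i:])      # insert before position i
    return variants

def fuzzy_match(query, keyword):
    """True when the two words are within edit distance 1 of each other."""
    return bool(wildcard_fuzzy_set(query) & wildcard_fuzzy_set(keyword))
```

In the encrypted-search setting, trapdoors of these variants (rather than the plaintext variants themselves) are what get stored in a searchable index such as a B-tree, so the server can answer typo-tolerant queries without seeing the keywords.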

  10. MODIS Snow Cover Mapping Decision Tree Technique: Snow and Cloud Discrimination

    NASA Technical Reports Server (NTRS)

    Riggs, George A.; Hall, Dorothy K.

    2010-01-01

    Accurate mapping of snow cover continues to challenge cryospheric scientists and modelers. The Moderate-Resolution Imaging Spectroradiometer (MODIS) snow data products have been used since 2000 by many investigators to map and monitor snow cover extent for various applications. Users have reported on the utility of the products and also on problems encountered. Three problems or hindrances in the use of the MODIS snow data products reported in the literature are: cloud obscuration, snow/cloud confusion, and snow omission errors in thin or sparse snow cover conditions. Implementation of the MODIS snow algorithm as a decision tree technique using surface reflectance input to mitigate these problems is being investigated. The objective of this work is to recast the snow algorithm as a decision tree. This should alleviate snow/cloud confusion and omission errors and provide a snow map with classes that convey how snow was detected, e.g., snow under clear sky or snow under cloud, giving users flexibility in interpreting and deriving a snow map. Results of a snow cover decision tree algorithm are compared to the standard MODIS snow map and found to exhibit an improved ability to alleviate snow/cloud confusion in some situations, allowing up to about a 5% increase in mapped snow cover extent, and thus accuracy, in some scenes.
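The decision-tree idea can be sketched around the heritage NDSI test that the MODIS snow products are built on (the NDSI > 0.4 snow threshold is documented for MODIS; the tree structure, the NIR test value, and the class labels below are purely illustrative):

```python
def ndsi(green_refl, swir_refl):
    """Normalized Difference Snow Index from green (MODIS band 4) and
    shortwave-infrared (MODIS band 6) reflectances."""
    return (green_refl - swir_refl) / (green_refl + swir_refl)

def classify_pixel(green_refl, swir_refl, nir_refl, cloudy):
    """Toy decision tree: each leaf records *how* snow was detected, which is
    the kind of per-class information the decision-tree output gives users."""
    if cloudy:
        # A tree path can still test for a snow signature under (thin) cloud.
        return "snow under cloud" if ndsi(green_refl, swir_refl) > 0.4 else "cloud"
    if ndsi(green_refl, swir_refl) > 0.4 and nir_refl > 0.11:
        return "snow under clear sky"
    return "no snow"
```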

  11. Universal Keyword Classifier on Public Key Based Encrypted Multikeyword Fuzzy Search in Public Cloud.

    PubMed

    Munisamy, Shyamala Devi; Chokkalingam, Arun

    2015-01-01

    Cloud computing has pioneered the emerging world by manifesting itself as a service through the internet, facilitating third-party infrastructure and applications. While customers have no visibility into how their data are stored on the service provider's premises, it offers great benefits in lowering infrastructure costs and delivering more flexibility and simplicity in managing private data. The opportunity to use cloud services on a pay-per-use basis gives private data owners comfort in managing costs and data. With the pervasive use of the internet, the focus has now shifted towards effective data utilization on the cloud without compromising security. In the pursuit of increasing data utilization on public cloud storage, the key is to make data access effective through fuzzy searching techniques. In this paper, we discuss the existing fuzzy searching techniques and focus on reducing the searching time on the cloud storage server for effective data utilization. Our proposed Asymmetric Classifier Multikeyword Fuzzy Search method provides a classifier search server that creates a universal keyword classifier for a multiple-keyword request, greatly reducing the searching time by learning the search path pattern for all the keywords in the fuzzy keyword set. The objective of using a BTree fuzzy searchable index is to resolve typos and representation inconsistencies and also to facilitate effective data utilization.

  12. Collaborative Research: Using ARM Observations to Evaluate GCM Cloud Statistics for Development of Stochastic Cloud-Radiation Parameterizations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shen, Samuel S. P.

    2013-09-01

    The long-range goal of several past and current projects in our DOE-supported research has been the development of new and improved parameterizations of cloud-radiation effects and related processes, using ARM data, and the implementation and testing of these parameterizations in global models. The main objective of the present project being reported on here has been to develop and apply advanced statistical techniques, including Bayesian posterior estimates, to diagnose and evaluate features of both observed and simulated clouds. The research carried out under this project has been novel in two important ways. The first is that it is a key step in the development of practical stochastic cloud-radiation parameterizations, a new category of parameterizations that offers great promise for overcoming many shortcomings of conventional schemes. The second is that this work has brought powerful new tools to bear on the problem, because it has been an interdisciplinary collaboration between a meteorologist with long experience in ARM research (Somerville) and a mathematician who is an expert on a class of advanced statistical techniques that are well-suited for diagnosing model cloud simulations using ARM observations (Shen). The motivation and long-term goal underlying this work is the utilization of stochastic radiative transfer theory (Lane-Veron and Somerville, 2004; Lane et al., 2002) to develop a new class of parametric representations of cloud-radiation interactions and closely related processes for atmospheric models. The theoretical advantage of the stochastic approach is that it can accurately calculate the radiative heating rates through a broken cloud layer without requiring an exact description of the cloud geometry.

  13. Marine Cloud Brightening

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Latham, John; Bower, Keith; Choularton, Tom

    2012-09-07

    The idea behind the marine cloud-brightening (MCB) geoengineering technique is that seeding marine stratocumulus clouds with copious quantities of roughly monodisperse sub-micrometre sea water particles might significantly enhance the cloud droplet number concentration, and thereby the cloud albedo and possibly longevity. This would produce a cooling, which general circulation model (GCM) computations suggest could, subject to satisfactory resolution of technical and scientific problems identified herein, have the capacity to balance global warming up to the carbon dioxide-doubling point. We describe herein an account of our recent research on a number of critical issues associated with MCB. This involves (i) GCM studies, which are our primary tools for evaluating globally the effectiveness of MCB, and assessing its climate impacts on rainfall amounts and distribution, and also polar sea-ice cover and thickness; (ii) high-resolution modelling of the effects of seeding on marine stratocumulus, which are required to understand the complex array of interacting processes involved in cloud brightening; (iii) microphysical modelling sensitivity studies, examining the influence of seeding amount, seed-particle salt-mass, air-mass characteristics, updraught speed and other parameters on cloud-albedo change; (iv) sea water spray-production techniques; (v) computational fluid dynamics studies of possible large-scale periodicities in Flettner rotors; and (vi) the planning of a three-stage limited-area field research experiment, with the primary objectives of technology testing and determining to what extent, if any, cloud albedo might be enhanced by seeding marine stratocumulus clouds on a spatial scale of around 100×100 km. We stress that there would be no justification for deployment of MCB unless it was clearly established that no significant adverse consequences would result. There would also need to be an international agreement firmly in favour of such action.
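The brightening mechanism is the Twomey effect; a standard back-of-envelope relation for it (from the general cloud-susceptibility literature, not this abstract) says that at fixed liquid water path the albedo change for a relative increase in droplet number N is d(albedo)/d(ln N) ≈ albedo(1 − albedo)/3:

```python
import math

def twomey_albedo_increase(albedo, droplet_number_ratio):
    """Estimate of the cloud-albedo increase when droplet number concentration
    is multiplied by droplet_number_ratio at fixed liquid water path, using
    the susceptibility approximation d(albedo)/d(ln N) = albedo*(1-albedo)/3."""
    return albedo * (1.0 - albedo) / 3.0 * math.log(droplet_number_ratio)

# Doubling N in a deck with albedo 0.5 raises the albedo by roughly 0.06,
# the kind of perturbation MCB seeding aims to produce over large ocean areas.
```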

  14. Marine cloud brightening.

    PubMed

    Latham, John; Bower, Keith; Choularton, Tom; Coe, Hugh; Connolly, Paul; Cooper, Gary; Craft, Tim; Foster, Jack; Gadian, Alan; Galbraith, Lee; Iacovides, Hector; Johnston, David; Launder, Brian; Leslie, Brian; Meyer, John; Neukermans, Armand; Ormond, Bob; Parkes, Ben; Rasch, Phillip; Rush, John; Salter, Stephen; Stevenson, Tom; Wang, Hailong; Wang, Qin; Wood, Rob

    2012-09-13

    The idea behind the marine cloud-brightening (MCB) geoengineering technique is that seeding marine stratocumulus clouds with copious quantities of roughly monodisperse sub-micrometre sea water particles might significantly enhance the cloud droplet number concentration, and thereby the cloud albedo and possibly longevity. This would produce a cooling, which general circulation model (GCM) computations suggest could, subject to satisfactory resolution of technical and scientific problems identified herein, have the capacity to balance global warming up to the carbon dioxide-doubling point. We describe herein an account of our recent research on a number of critical issues associated with MCB. This involves (i) GCM studies, which are our primary tools for evaluating globally the effectiveness of MCB, and assessing its climate impacts on rainfall amounts and distribution, and also polar sea-ice cover and thickness; (ii) high-resolution modelling of the effects of seeding on marine stratocumulus, which are required to understand the complex array of interacting processes involved in cloud brightening; (iii) microphysical modelling sensitivity studies, examining the influence of seeding amount, seed-particle salt-mass, air-mass characteristics, updraught speed and other parameters on cloud-albedo change; (iv) sea water spray-production techniques; (v) computational fluid dynamics studies of possible large-scale periodicities in Flettner rotors; and (vi) the planning of a three-stage limited-area field research experiment, with the primary objectives of technology testing and determining to what extent, if any, cloud albedo might be enhanced by seeding marine stratocumulus clouds on a spatial scale of around 100×100 km. We stress that there would be no justification for deployment of MCB unless it was clearly established that no significant adverse consequences would result. There would also need to be an international agreement firmly in favour of such action.

  15. Photovoltaics for the Defense Community through Manufacturing Advances

    DTIC Science & Technology

    2009-04-27

    the module, the inverter, and the balance of system (BOS) costs. The module is the "solar panel" component that generates electricity, the inverter... Silicon key areas and examples: ingot crystal structures (multicrystalline, monocrystalline); wafering techniques (wire sawing, pulling slices off the ingot).

  16. AUTORADIOGRAPHIC ANALYSIS ON AGAR PLATES OF ANTIGENS FROM SUB CELLULAR FRACTIONS OF RAT LIVER SLICES

    PubMed Central

    Morgan, W. S.; Perlmann, P.; Hultin, T.

    1961-01-01

    Slices of rat livers were incubated with 14C amino acids, homogenized, and subjected to differential centrifugation. The microsomes were further extracted with the non-ionic detergent Lubrol W and with EDTA. These extracts and the microsome-free "cell sap," freed from the pH 5-precipitable fraction, were subsequently reacted with antisera using agar diffusion techniques. The antisera employed were obtained from rabbits injected with different subcellular fractions of rat liver or with rat serum proteins. When the agar diffusion plates were autoradiographed it was found that some of the precipitates were radioactive while others were not. Control experiments indicated that this labeling was due to the specific incorporation of 14C amino acids into various rat liver antigens during incubation of the slices rather than to a non-specific adsorption of radioactive material to the immunological precipitates. When the slices were incubated with the isotope for up to 30 minutes, the serum proteins which could be extracted from the microsomes with the detergent were strongly labeled, as were a number of additional microsomal antigens of unknown significance. In contrast, the serum proteins present in the cell sap were only weakly labeled. Most of the typical cell sap proteins, both those precipitable and those soluble at pH 5, seemed to remain unlabeled. No consistently reproducible results were obtained with the EDTA extracts of the ribosomal residues remaining after extraction of the microsomes with the detergent. Incubation of the liver slices for longer periods (up to 120 minutes) led to a strong labeling of the serum proteins in the cell sap as well as to the appearance of labeling in additional cell sap proteins. The results are discussed with regard to the subcellular site of synthesis and the metabolism of the different antigens. PMID:13772607

  17. Survival, migration, and differentiation of Sox1-GFP embryonic stem cells in coculture with an auditory brainstem slice preparation.

    PubMed

    Glavaski-Joksimovic, Aleksandra; Thonabulsombat, Charoensri; Wendt, Malin; Eriksson, Mikael; Palmgren, Björn; Jonsson, Anna; Olivius, Petri

    2008-03-01

    The poor regeneration capability of the mammalian hearing organ has initiated different approaches to enhance its functionality after injury. To evaluate a potential neuronal repair paradigm in the inner ear and cochlear nerve we have previously used embryonic neuronal tissue and stem cells for implantation in vivo and in vitro. At present, we have used in vitro techniques to study the survival and differentiation of Sox1-green fluorescent protein (GFP) mouse embryonic stem (ES) cells as a monoculture or as a coculture with rat auditory brainstem slices. For the coculture, 300 µm-thick brainstem slices encompassing the cochlear nucleus and cochlear nerve were prepared from postnatal SD rats. The slices were propagated using the membrane interface method and the cochlear nuclei were prelabeled with DiI. After some days in culture a suspension of Sox1 cells was deposited next to the brainstem slice. Following deposition, Sox1 cells migrated toward the brainstem and onto the cochlear nucleus. GFP was not detectable in undifferentiated ES cells but became evident during neural differentiation. Up to 2 weeks after transplantation the cocultures were fixed. The undifferentiated cells were evaluated with antibodies against progenitor cells, whereas the differentiated cells were identified with neuronal and glial markers. The morphological and immunohistochemical data indicated that Sox1 cells in monoculture differentiated into a higher percentage of glial cells than neurons. However, when a coculture was used, a significantly lower percentage of Sox1 cells differentiated into glial cells. The results demonstrate that a coculture of Sox1 cells and auditory brainstem presents a useful model to study stem cell differentiation.

  18. Velocity encoding with the slice select refocusing gradient for faster imaging and reduced chemical shift-induced phase errors.

    PubMed

    Middione, Matthew J; Thompson, Richard B; Ennis, Daniel B

    2014-06-01

    To investigate a novel phase-contrast MRI velocity-encoding technique for faster imaging and reduced chemical shift-induced phase errors. Velocity encoding with the slice select refocusing gradient achieves the target gradient moment by time shifting the refocusing gradient, which enables the use of the minimum in-phase echo time (TE) for faster imaging and reduced chemical shift-induced phase errors. Net forward flow was compared in 10 healthy subjects (N = 10) within the ascending aorta (aAo), main pulmonary artery (PA), and right/left pulmonary arteries (RPA/LPA) using conventional flow compensated and flow encoded (401 Hz/px and TE = 3.08 ms) and slice select refocused gradient velocity encoding (814 Hz/px and TE = 2.46 ms) at 3 T. Improved net forward flow agreement was measured across all vessels for slice select refocused gradient compared to flow compensated and flow encoded: aAo vs. PA (1.7% ± 1.9% vs. 5.8% ± 2.8%, P = 0.002), aAo vs. RPA + LPA (2.1% ± 1.7% vs. 6.0% ± 4.3%, P = 0.03), and PA vs. RPA + LPA (2.9% ± 2.1% vs. 6.1% ± 6.3%, P = 0.04), while increasing temporal resolution (35%) and signal-to-noise ratio (33%). Slice select refocused gradient phase-contrast MRI with a high receiver bandwidth and minimum in-phase TE provides more accurate and less variable flow measurements through the reduction of chemical shift-induced phase errors and a reduced TE/repetition time, which can be used to increase the temporal/spatial resolution and/or reduce breath hold durations. Copyright © 2013 Wiley Periodicals, Inc.
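Independent of the specific gradient timing studied above, phase-contrast velocity maps follow the standard linear relation between the measured phase difference and the chosen velocity-encoding limit (VENC); this is a textbook relation, not code from the paper:

```python
import math

def velocity_from_phase(delta_phi_rad, venc_cm_per_s):
    """Standard phase-contrast MRI mapping: a phase difference of +/- pi
    between the two velocity encodings corresponds to a velocity of +/- VENC."""
    return venc_cm_per_s * delta_phi_rad / math.pi

# A measured phase difference of pi/2 with VENC = 150 cm/s maps to 75 cm/s.
# Flow faster than VENC wraps (aliases), which is why VENC is chosen above
# the expected peak velocity in the vessel of interest.
```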

  19. Diffusion weighted whole body imaging with background body signal suppression (DWIBS): technical improvement using free breathing, STIR and high resolution 3D display.

    PubMed

    Takahara, Taro; Imai, Yutaka; Yamashita, Tomohiro; Yasuda, Seiei; Nasu, Seiji; Van Cauteren, Marc

    2004-01-01

    To examine a new way of body diffusion weighted imaging (DWI) using the short TI inversion recovery-echo planar imaging (STIR-EPI) sequence and free breathing scanning (diffusion weighted whole body imaging with background body signal suppression; DWIBS) to obtain three-dimensional displays. 1) Apparent contrast-to-noise ratios (AppCNR) between lymph nodes and surrounding fat tissue were compared in three types of DWI with and without breath-holding, with variable lengths of scan time and slice thickness. 2) The STIR-EPI sequence and the spin echo-echo planar imaging (SE-EPI) sequence with chemical shift selective (CHESS) pulse were compared in terms of their degree of fat suppression. 3) Eleven patients with neck, chest, and abdominal malignancy were scanned with DWIBS to evaluate feasibility. Whole body imaging was done in a later stage of the study using the peripheral vascular coil. The AppCNR of 8 mm slice thickness images reconstructed from 4 mm slice thickness source images obtained in a free breathing scan of 430 sec was much better than that of 9 mm slice thickness breath-hold scans obtained in 25 sec. High resolution multi-planar reformat (MPR) and maximum intensity projection (MIP) images could be made from the data set of 4 mm slice thickness images. Fat suppression was much better with the STIR-EPI sequence than with SE-EPI with CHESS pulse. The feasibility of DWIBS was shown in clinical scans of 11 patients. Whole body images were successfully obtained with adequate fat suppression. Three-dimensional DWIBS can be obtained with this technique, which may allow us to screen for malignancies in the whole body.

  20. Mapping of spatial and temporal heterogeneity of plantar flexor muscle activity during isometric contraction: correlation of velocity-encoded MRI with EMG

    PubMed Central

    Csapo, Robert; Malis, Vadim; Sinha, Usha

    2015-01-01

    The aim of this study was to assess the correlation between contraction-associated muscle kinematics as measured by velocity-encoded phase-contrast (VE-PC) magnetic resonance imaging (MRI) and activity recorded via electromyography (EMG), and to construct a detailed three-dimensional (3-D) map of the contractile behavior of the triceps surae complex from the MRI data. Ten axial-plane VE-PC MRI slices of the triceps surae and EMG data were acquired during submaximal isometric contractions in 10 subjects. MRI images were analyzed to yield the degree of contraction-associated muscle displacement on a voxel-by-voxel basis and determine the heterogeneity of muscle movement within and between slices. Correlational analyses were performed to determine the agreement between EMG data and displacements. Pearson's coefficients demonstrated good agreement (0.84 < r < 0.88) between EMG data and displacements. Comparison between different slices in the gastrocnemius muscle revealed significant heterogeneity in displacement values both in-plane and along the cranio-caudal axis, with highest values in the mid-muscle regions. By contrast, no significant differences between muscle regions were found in the soleus muscle. Substantial differences among displacements were also observed within slices, with those in static areas being only 17–39% (maximum) of those in the most mobile muscle regions. The good agreement between EMG data and displacements suggests that VE-PC MRI may be used as a noninvasive, high-resolution technique for quantifying and modeling muscle activity over the entire 3-D volume of muscle groups. Application to the triceps surae complex revealed substantial heterogeneity of contraction-associated muscle motion both within slices and between different cranio-caudal positions. PMID:26112239

  1. MPTP-induced changes in hippocampal synaptic plasticity and memory are prevented by memantine through the BDNF-TrkB pathway

    PubMed Central

    Zhu, Guoqi; Li, Junyao; He, Ling; Wang, Xuncui; Hong, Xiaoqi

    2015-01-01

    Background and Purpose Mild cognitive deficit in early Parkinson's disease (PD) has been widely studied. Here we have examined the effects of memantine in preventing memory deficit in experimental PD models and elucidated some of the underlying mechanisms. Experimental Approaches I.p. injection of 1-methyl-4-phenyl-1,2,3,6-tetrahydropyridine (MPTP) in C57BL/6 mice was used to produce models of PD. We used behavioural tasks to test memory. In vitro, we used slices of hippocampus, with electrophysiological, Western blotting, real-time PCR, ELISA and immunochemical techniques. Key Results Following MPTP injection, long-term memory was impaired and these changes were prevented by pre-treatment with memantine. In hippocampal slices from MPTP-treated mice, long-term potentiation (LTP) induced by θ-burst stimulation (10 bursts, 4 pulses) was decreased, while long-term depression (LTD) induced by low-frequency stimulation (1 Hz, 900 pulses) was enhanced, compared with control values. A single dose of memantine (i.p., 10 mg·kg−1) reversed the decreased LTP and the increased LTD in this PD model. Activity-dependent changes in tyrosine kinase receptor B (TrkB), ERK and brain-derived neurotrophic factor (BDNF) expression were decreased in slices from mice after MPTP treatment. These effects were reversed by pretreatment with memantine. Incubation of slices in vitro with 1-methyl-4-phenylpyridinium (MPP+) decreased depolarization-induced expression of BDNF. This effect was prevented by pretreatment of slices with memantine or with calpain inhibitor III, suggesting the involvement of an overactivated calcium signalling pathway. Conclusions and Implications Memantine should be useful in preventing loss of memory and hippocampal synaptic plasticity in PD models. PMID:25560396

  2. Arctic PBL Cloud Height and Motion Retrievals from MISR and MINX

    NASA Technical Reports Server (NTRS)

    Wu, Dong L.

    2012-01-01

    How Arctic clouds respond and feed back to sea-ice loss is key to understanding the rapid climate change seen in the polar region. As more open water becomes available in the Arctic Ocean, cold air outbreaks (i.e., off-ice flow from polar lows) produce a vast sheet of roll clouds in the planetary boundary layer (PBL). The cold air temperature and wind velocity are the critical parameters for determining and understanding the PBL structure formed under these roll clouds. It has been challenging for nadir visible/IR sensors to detect Arctic clouds due to the lack of contrast between clouds and snowy/icy surfaces. In addition, the PBL temperature inversion creates a further problem for IR sensors in relating cloud top temperature to cloud top height. Here we explore a new method with the Multi-angle Imaging SpectroRadiometer (MISR) instrument to measure cloud height and motion over the Arctic Ocean. Employing a stereoscopic technique, MISR is able to measure cloud top height accurately and distinguish between clouds and snowy/icy surfaces with the measured height. We will use the MISR INteractive eXplorer (MINX) to quantify roll cloud dynamics during cold-air outbreak events and characterize PBL structures over water and over sea ice.
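The stereoscopic principle can be sketched with simple parallax geometry (an idealization, not the operational MISR retrieval, which must also separate cloud motion from parallax): the apparent along-track displacement of a cloud feature between two view angles is proportional to its height above the surface.

```python
import math

def stereo_cloud_height(parallax_km, view_angle1_deg, view_angle2_deg):
    """Cloud-top height from the along-track parallax of a (near-stationary)
    feature seen by two cameras at different view angles:
    h = parallax / (tan(theta1) - tan(theta2))."""
    t1 = math.tan(math.radians(view_angle1_deg))
    t2 = math.tan(math.radians(view_angle2_deg))
    return parallax_km / (t1 - t2)

# With MISR-like 45.6- and 26.1-degree cameras, ~1 km of along-track parallax
# corresponds to a cloud top a bit under 2 km high.
```

Because the height comes from geometry rather than temperature, the retrieval sidesteps the PBL inversion problem described above.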

  3. Improved Cloud and Snow Screening in MAIAC Aerosol Retrievals Using Spectral and Spatial Analysis

    NASA Technical Reports Server (NTRS)

    Lyapustin, A.; Wang, Y.; Laszlo, I.; Kokrkin, S.

    2012-01-01

    An improved cloud/snow screening technique in the Multi-Angle Implementation of Atmospheric Correction (MAIAC) algorithm is described. It is implemented as part of MAIAC aerosol retrievals and is based on analysis of spectral residuals and spatial variability. Comparisons with AERONET aerosol observations and a large-scale MODIS data analysis show strong suppression of aerosol optical thickness outliers due to unresolved clouds and snow. At the same time, the developed filter does not reduce the aerosol retrieval capability at high 1 km resolution in strongly inhomogeneous environments, such as near the centers of active fires. Despite the significant improvement, optical depth outliers in high-spatial-resolution data remain a problem, to be addressed by application-dependent specialized filtering techniques.

  4. Technique for ship/wake detection

    DOEpatents

    Roskovensky, John K [Albuquerque, NM

    2012-05-01

    An automated ship detection technique includes accessing data associated with an image of a portion of Earth. The data includes reflectance values. A first portion of pixels within the image are masked with a cloud and land mask based on spectral flatness of the reflectance values associated with the pixels. A given pixel selected from the first portion of pixels is unmasked when a threshold number of localized pixels surrounding the given pixel are not masked by the cloud and land mask. A spatial variability image is generated based on spatial derivatives of the reflectance values of the pixels which remain unmasked by the cloud and land mask. The spatial variability image is thresholded to identify one or more regions within the image as possible ship detection regions.
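The claimed pipeline can be sketched in a few lines of NumPy (all thresholds hypothetical, and the patent's neighborhood-unmasking refinement is omitted; the text above is the authority on the actual steps): mask spectrally flat pixels as cloud/land, compute spatial derivatives over the remaining water pixels, and threshold the resulting variability image.

```python
import numpy as np

def ship_candidate_mask(refl, flat_tol=0.05, grad_thresh=0.02):
    """Illustrative sketch of the described detection stages.
    refl: (rows, cols, bands) reflectance array; parameters are hypothetical."""
    # 1. Cloud/land mask: treat spectrally flat (white-looking) pixels as
    #    cloud or land and exclude them from the water search area.
    spread = refl.max(axis=2) - refl.min(axis=2)
    water = spread >= flat_tol
    # 2. Spatial variability: gradients of one band over water pixels only
    #    (masked pixels become NaN and drop out of the comparison below).
    band = np.where(water, refl[..., 0], np.nan)
    d_row, d_col = np.gradient(band)
    variability = np.hypot(d_row, d_col)
    # 3. Threshold: locally rough patches on otherwise smooth water are
    #    flagged as possible ship/wake detection regions.
    return variability > grad_thresh
```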

  5. Advanced Optical Diagnostics for Ice Crystal Cloud Measurements in the NASA Glenn Propulsion Systems Laboratory

    NASA Technical Reports Server (NTRS)

    Bencic, Timothy J.; Fagan, Amy; Van Zante, Judith F.; Kirkegaard, Jonathan P.; Rohler, David P.; Maniyedath, Arjun; Izen, Steven H.

    2013-01-01

    A light extinction tomography technique has been developed to monitor ice water clouds upstream of a direct-connected engine in the Propulsion Systems Laboratory (PSL) at NASA Glenn Research Center (GRC). The system consists of 60 laser diodes with sheet-generating optics and 120 detectors mounted around a 36-inch diameter ring. The sources are pulsed sequentially while the detectors acquire line-of-sight extinction data for each laser pulse. Using computed tomography algorithms, the extinction data are analyzed to produce a plot of the relative water content in the measurement plane. To target the low-spatial-frequency nature of ice water clouds, unique tomography algorithms were developed using filtered back-projection methods and direct inversion methods that use Gaussian basis functions. With a priori knowledge of the mean droplet size and the total water content at some point in the measurement plane, the tomography system can provide near real-time in-situ quantitative full-field total water content data at a measurement plane approximately 5 feet upstream of the engine inlet. Results from ice crystal clouds in the PSL are presented. In addition to the optical tomography technique, laser sheet imaging has also been applied in the PSL to provide planar ice cloud uniformity and relative water content data during facility calibration before the tomography system was available, and also as validation data for the tomography system. A comparison between data from the laser sheet system and the light extinction tomography is also presented. Very good agreement in imaged intensity and water content is demonstrated for both techniques. Comparative studies between the two techniques also show excellent agreement in the calculation of bulk total water content averaged over the center of the pipe.
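Each laser/detector sight line contributes one projection datum to the reconstruction: the path-integrated extinction obtained from the Beer–Lambert law (standard optics, not a detail taken from this abstract):

```python
import math

def path_optical_depth(detected_intensity, clear_air_intensity):
    """Line-of-sight optical depth tau = -ln(I / I0): the quantity each
    source/detector sight line supplies to the tomographic reconstruction
    of relative water content in the measurement plane."""
    return -math.log(detected_intensity / clear_air_intensity)

# 80% transmittance along a sight line corresponds to tau ≈ 0.22; denser
# cloud along the path pushes tau higher.
```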

  6. Rainbows, polarization, and the search for habitable planets.

    PubMed

    Bailey, Jeremy

    2007-04-01

    Current proposals for the characterization of extrasolar terrestrial planets rest primarily on the use of spectroscopic techniques. While spectroscopy is effective in detecting the gaseous components of a planet's atmosphere, it provides no way of detecting the presence of liquid water, the defining characteristic of a habitable planet. In this paper, I investigate the potential of an alternative technique for characterizing the atmosphere of a planet using polarization. By looking for a polarization peak at the "primary rainbow" scattering angle, it is possible to detect the presence of liquid droplets in a planet's atmosphere and constrain the nature of the liquid through its refractive index. Single scattering calculations are presented to show that a well-defined rainbow scattering peak is present over the full range of likely cloud droplet sizes and clearly distinguishes the presence of liquid droplets from solid particles such as ice or dust. Rainbow scattering has been used in the past to determine the nature of the cloud droplets in the Venus atmosphere and by the POLarization and Directionality of Earth Reflectances (POLDER) instrument to distinguish between liquid and ice clouds in the Earth atmosphere. While the presence of liquid water clouds does not guarantee the presence of water at the surface, this technique could complement spectroscopic techniques for characterizing the atmospheres of potential habitable planets. The disk-integrated rainbow peak for Earth is estimated to be at a degree of polarization of 12.7% or 15.5% for two different cloud cover scenarios. The observation of this rainbow peak is shown to be feasible with the proposed Terrestrial Planet Finder Coronograph mission in similar total integration times to those required for spectroscopic characterization.
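The geometric-optics angle of the primary-rainbow polarization peak follows from Descartes' minimum-deviation condition and depends only on the droplets' refractive index, which is exactly what makes the peak diagnostic of the liquid. This is a textbook calculation, not code from the paper:

```python
import math

def primary_rainbow_scattering_angle(n):
    """Scattering angle (degrees) of the primary rainbow for refractive index
    n, from the minimum of the deviation D = 2*theta_i - 4*theta_r + 180 deg,
    which occurs where cos(theta_i) = sqrt((n**2 - 1) / 3)."""
    theta_i = math.acos(math.sqrt((n * n - 1.0) / 3.0))
    theta_r = math.asin(math.sin(theta_i) / n)
    return math.degrees(2.0 * theta_i - 4.0 * theta_r) + 180.0

# Water droplets (n ≈ 1.333) put the peak near 138 degrees scattering angle,
# i.e. the familiar 42-degree rainbow cone about the antisolar point; a
# liquid with a different refractive index shifts the peak measurably.
```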

  7. Near Real Time Detection and Tracking of the EYJAFJÖLL (Iceland) Ash Cloud by the RST (Robust Satellite Technique) Approach

    NASA Astrophysics Data System (ADS)

    Tramutoli, V.; Filizzola, C.; Marchese, F.; Paciello, R.; Pergola, N.; Sannazzaro, F.

    2010-12-01

    Volcanic ash clouds, besides being an environmental issue, represent a serious problem for air traffic and an important economic threat to aviation companies. During the recent volcanic crisis due to the April-May 2010 eruption of Eyjafjöll (Iceland), ash clouds became a real problem for ordinary citizens as well: during the first days of the eruption thousands of flights were cancelled, disrupting hundreds of thousands of passengers. Satellite remote sensing confirmed itself to be a crucial tool for monitoring this kind of event, which spreads over thousands of kilometres with very rapid space-time dynamics. Weather satellites especially, thanks to their high temporal resolution, may furnish a fundamental contribution by providing frequently updated information. However, in this particular case the ash cloud was accompanied by a sudden and significant emission of water vapour, due to ice melting of the Eyjafjallajökull glacier, making satellite ash detection and discrimination very hard, especially in the first few days of the eruption, exactly when accurate information was most needed to support emergency management. Among the satellite-based techniques for near-real-time detection and tracking of ash clouds, the RST (Robust Satellite Technique) approach, formerly named RAT (Robust AVHRR Technique), has long since been proposed, demonstrating high performance both in terms of reliability and sensitivity. In this paper, results achieved by using RST-based detection schemes applied during the Eyjafjöll eruption are presented. MSG-SEVIRI (Meteosat Second Generation - Spinning Enhanced Visible and Infrared Imager) records, with a temporal sampling of 15 minutes, were used, applying a standard as well as an advanced RST configuration, which includes the use of the SO2 absorption band together with TIR and MIR channels. Main outcomes, limits and possible future improvements are also discussed.
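The core of the RST approach is a change-detection index computed pixel by pixel against multi-year reference statistics for the same location and observation time. A generic form of the local variation index (written here from the published RST literature, with hypothetical inputs) is:

```python
def rst_local_variation_index(signal, ref_mean, ref_std):
    """RST-style local variation index: the departure of the current signal
    (e.g., a TIR or MIR brightness temperature, or a channel difference) from
    its multi-year reference mean for the same place and time of day, in
    units of the reference standard deviation. Pixels with large |index|
    values are flagged as anomalous, e.g., ash-affected."""
    return (signal - ref_mean) / ref_std

# A 260 K observation where the multi-year reference is 270 +/- 5 K scores
# -2.0, i.e. two standard deviations colder than usual for that pixel.
```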

  8. Stabilization of Global Temperature and Polar Sea-ice cover via seeding of Maritime Clouds

    NASA Astrophysics Data System (ADS)

    Chen, Jack; Gadian, Alan; Latham, John; Launder, Brian; Neukermans, Armand; Rasch, Phil; Salter, Stephen

    2010-05-01

The marine cloud albedo enhancement (cloud whitening) geoengineering technique (Latham 1990, 2002; Bower et al. 2006; Latham et al. 2008; Salter et al. 2008; Rasch et al. 2009) involves seeding maritime stratocumulus clouds with seawater droplets of size (at creation) around 1 micrometer, causing the droplet number concentration to increase within the clouds, thereby enhancing their albedo and possibly longevity. GCM modeling indicates that (subject to satisfactory resolution of specified scientific and technological problems) the technique could produce a globally averaged negative forcing of up to about -4W/m2, adequate to hold the Earth's average temperature constant as the atmospheric carbon dioxide concentration increases to twice the current value. This idea is being examined using GCM modeling, LES cloud modeling, technological development (practical and theoretical), and analysis of data from the recent, extensive VOCALS field study of marine stratocumulus clouds. We are also formulating plans for a possible limited-area field test of the technique. Recent general circulation model computations using a fully coupled ocean-atmosphere model indicate that increasing cloud reflectivity by seeding maritime boundary layer clouds may compensate for some of the climate effects of increasing greenhouse gas concentrations. The chosen seeding strategy (one of many possible scenarios), when employed in an atmosphere where the CO2 concentration is doubled, can restore global averages of temperature, precipitation and polar sea-ice to present day values, but not simultaneously. The response varies nonlinearly with the extent of seeding, and geoengineering generates local changes to important climatic features. Our computations suggest that for the specimen cases examined there is no appreciable reduction of rainfall over land as a consequence of seeding. This result is in agreement with one separate study but not another. 
Much further work is required to explain these discrepancies and to address the crucially important issue of adverse ramifications associated with the possible deployment of this geoengineering technique. We envisage, should deployment occur, that wind-driven, unmanned Flettner spray vessels will sail back and forth perpendicular to the local prevailing wind, releasing seawater droplets into the boundary layer beneath marine stratocumulus clouds. In an effort to optimize vessel performance, computations of flow around a Flettner rotor with Thom fences are being conducted. An early result is that the lift coefficient on the rotating cylinder undergoes very large, slow variations in time, with a frequency an order of magnitude below the rotation frequency of the cylinder. The vessels will drag turbines resembling oversized propellers through the water to generate electrical energy, some of which will be used for rotor spin, but most for the creation of spray droplets. One promising spray production technique involves pumping carefully filtered water through banks of filters and then micro-nozzles with piezoelectric excitation to vary drop diameter. Another involves electro-spraying from Taylor cone-jets. The rotors offer convenient housing for spray nozzles, with fan assistance to aid initial dispersion of the droplets. This global cooling technique has the advantages that: (1) the only raw materials required are wind and seawater; (2) the amount of global cooling could be adjusted by switching the vessel-mounted seawater droplet generators on or off by remote control; (3) if necessary, the entire system could be switched off immediately, with conditions returning to normal within a few days; (4) since not all suitable clouds need to be seeded, there exists, in principle, flexibility to choose seeding locations so as to optimise beneficial effects and subdue or eliminate adverse ones. 
Bower, K., T.W. Choularton, J. Latham, J. Sahraei and S. Salter, 2006: Computational assessment of a proposed technique for global warming mitigation via albedo-enhancement of marine stratocumulus clouds. Atmos. Res., 82, 328-336. 
Latham, J., 1990: Nature, 347, 339-340. 
Latham, J., 2002: Atmos. Sci. Letters (doi:10.1006/Asle.2002.0048). 
Latham, J., P.J. Rasch, C.C. Chen, L. Kettles, A. Gadian, A. Gettelman, H. Morrison and S. Salter, 2008: Phil. Trans. Roy. Soc. A, 366, 3969-3987, doi:10.1098/rsta.2008.0137. 
Rasch, P.J., J. Latham and C.C. Chen, 2010: Environ. Res. Lett., 4, 045112 (8pp), doi:10.1088/1748-9326/4/4/045112. 
Salter, S., G. Sortino and J. Latham, 2008: Phil. Trans. Roy. Soc. A, 366, 2989-4006, doi:10.1098/rsta.2008.0136.

  9. PLIF Imaging of Capsule RCS Jets, Shear Layers, and Simulated Forebody Ablation

    NASA Technical Reports Server (NTRS)

    Inman, Jennifer A.; Danehy, Paul M.; Alderfer, David W.; Buck, Gregory M.; McCrea, Andrew

    2008-01-01

Planar laser-induced fluorescence (PLIF) has been used to investigate hypersonic flows associated with capsule reentry vehicles. These flows included reaction control system (RCS) jets, shear layer flow, and simulated forebody heatshield ablation. Pitch, roll, and yaw RCS jets were studied. PLIF obtained planar slices through these flowfields. These slices could be viewed individually, or they could be combined using computer visualization techniques to reconstruct the three-dimensional shape of the flow. The tests described herein were conducted in the 31-Inch Mach 10 Air Tunnel at NASA Langley Research Center. Improvements to many facets of the imaging system increased the efficiency and quality of data acquisition, in addition to increasing the overall robustness of the system.
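
    The slice-stacking step can be sketched generically, as a synthetic illustration rather than the authors' visualization pipeline: planar images acquired at successive spanwise positions are stacked into a 3-D intensity volume, with linear interpolation filling the gaps between measurement planes.

    ```python
    import numpy as np

    def stack_slices(slices, positions, n_out):
        """Stack 2-D slice images (all the same shape) taken at the given
        spanwise positions into a 3-D volume, linearly interpolating
        between planes onto n_out uniformly spaced output planes."""
        slices = np.asarray(slices, float)
        positions = np.asarray(positions, float)
        z_out = np.linspace(positions[0], positions[-1], n_out)
        # Index of the upper bracketing plane for each output plane
        idx = np.searchsorted(positions, z_out, side="right").clip(1, len(positions) - 1)
        z0, z1 = positions[idx - 1], positions[idx]
        w = ((z_out - z0) / (z1 - z0))[:, None, None]   # interpolation weights
        return (1 - w) * slices[idx - 1] + w * slices[idx]

    # Three synthetic 4x4 "PLIF images" at spanwise positions z = 0, 1, 2
    imgs = [np.full((4, 4), v) for v in (0.0, 1.0, 4.0)]
    vol = stack_slices(imgs, positions=[0.0, 1.0, 2.0], n_out=5)
    print(vol[:, 0, 0])    # [0.  0.5 1.  2.5 4. ] - interpolated between planes
    ```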

  10. Exploring high dimensional free energy landscapes: Temperature accelerated sliced sampling

    NASA Astrophysics Data System (ADS)

    Awasthi, Shalini; Nair, Nisanth N.

    2017-03-01

    Biased sampling of collective variables is widely used to accelerate rare events in molecular simulations and to explore free energy surfaces. However, computational efficiency of these methods decreases with increasing number of collective variables, which severely limits the predictive power of the enhanced sampling approaches. Here we propose a method called Temperature Accelerated Sliced Sampling (TASS) that combines temperature accelerated molecular dynamics with umbrella sampling and metadynamics to sample the collective variable space in an efficient manner. The presented method can sample a large number of collective variables and is advantageous for controlled exploration of broad and unbound free energy basins. TASS is also shown to achieve quick free energy convergence and is practically usable with ab initio molecular dynamics techniques.
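
    The umbrella-sampling ingredient of such biased-sampling schemes can be illustrated with a minimal one-dimensional sketch (a generic textbook construction, not the authors' TASS implementation): a harmonic restraint w(s) is added to the potential, the biased distribution P_b(s) is sampled, and the free energy inside the window is recovered from F(s) = -kT ln P_b(s) - w(s) + C.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    kT = 1.0

    def U(s):                        # double-well potential; in 1-D, F(s) = U(s)
        return s**4 - 2.0 * s**2

    def w(s, s0=0.0, k=10.0):        # harmonic umbrella restraint
        return 0.5 * k * (s - s0)**2

    # Metropolis sampling of the biased distribution exp(-(U + w)/kT)
    s, samples = 0.0, []
    for _ in range(200000):
        trial = s + rng.normal(0.0, 0.2)
        dE = (U(trial) + w(trial)) - (U(s) + w(s))
        if dE <= 0 or rng.random() < np.exp(-dE / kT):
            s = trial
        samples.append(s)
    samples = np.array(samples[20000:])          # discard burn-in

    # Unbias: F(s) = -kT ln P_b(s) - w(s) + const
    hist, edges = np.histogram(samples, bins=40, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    mask = hist > 0.05 * hist.max()              # keep well-populated bins
    F = -kT * np.log(hist[mask]) - w(centers[mask])
    F -= F.min()

    exact = U(centers[mask]) - U(centers[mask]).min()
    print(np.abs(F - exact).max())               # small within the sampled window
    ```

    A full TASS or umbrella-sampling calculation would tile many such windows along each collective variable and combine them (e.g. by WHAM); this sketch shows only the unbiasing step for one window.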

  11. Particle swarm optimization and its application in MEG source localization using single time sliced data

    NASA Astrophysics Data System (ADS)

    Lin, Juan; Liu, Chenglian; Guo, Yongning

    2014-10-01

The estimation of active neural sources from magnetoencephalography (MEG) data is a very critical issue for both clinical neurology and brain function research. A widely accepted source-modeling technique for MEG involves calculating a set of equivalent current dipoles (ECDs). Source depth in the brain is one of the difficulties in MEG source localization. Particle swarm optimization (PSO) is widely used to solve various optimization problems. In this paper we discuss its ability and robustness in finding the global optimum at different depths in the brain when using a single equivalent current dipole (sECD) model and single time-sliced data. The results show that PSO is an effective global optimization method for MEG source localization when given one dipole at different depths.
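
    The optimizer itself is generic; a minimal PSO sketch (not the authors' MEG code, with a simple quadratic misfit standing in for the dipole-fitting cost) looks like this:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def cost(x):                     # stand-in misfit; a real MEG cost would compare
        return np.sum((x - 0.3)**2, axis=-1)   # predicted vs measured field patterns

    n, dim, iters = 30, 3, 200
    w_in, c1, c2 = 0.7, 1.5, 1.5     # inertia, cognitive, social weights
    pos = rng.uniform(-1, 1, (n, dim))
    vel = np.zeros((n, dim))
    pbest, pbest_val = pos.copy(), cost(pos)
    g = pbest[np.argmin(pbest_val)].copy()   # global best

    for _ in range(iters):
        r1, r2 = rng.random((n, dim)), rng.random((n, dim))
        vel = w_in * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (g - pos)
        pos = pos + vel
        val = cost(pos)
        better = val < pbest_val
        pbest[better], pbest_val[better] = pos[better], val[better]
        g = pbest[np.argmin(pbest_val)].copy()

    print(g, cost(g))   # converges to the minimum at (0.3, 0.3, 0.3)
    ```

    For the sECD problem, the 3-vector x would be the dipole position (with moment fitted linearly at each candidate position), and the population-based search is what gives PSO its robustness against local minima at depth.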

  12. CORRECTING PHOTOLYSIS RATES ON THE BASIS OF SATELLITE OBSERVED CLOUDS

    EPA Science Inventory

Clouds can significantly affect photochemical activities in the boundary layer by altering radiation intensity, and therefore their correct specification in air quality models is of utmost importance. In this study we introduce a technique for using the satellite observed c...

  13. Comparing airborne and satellite retrievals of cloud optical thickness and particle effective radius using a spectral radiance ratio technique: two case studies for cirrus and deep convective clouds

    NASA Astrophysics Data System (ADS)

    Krisna, Trismono C.; Wendisch, Manfred; Ehrlich, André; Jäkel, Evelyn; Werner, Frank; Weigel, Ralf; Borrmann, Stephan; Mahnke, Christoph; Pöschl, Ulrich; Andreae, Meinrat O.; Voigt, Christiane; Machado, Luiz A. T.

    2018-04-01

    Solar radiation reflected by cirrus and deep convective clouds (DCCs) was measured by the Spectral Modular Airborne Radiation Measurement System (SMART) installed on the German High Altitude and Long Range Research Aircraft (HALO) during the Mid-Latitude Cirrus (ML-CIRRUS) and the Aerosol, Cloud, Precipitation, and Radiation Interaction and Dynamic of Convective Clouds System - Cloud Processes of the Main Precipitation Systems in Brazil: A Contribution to Cloud Resolving Modelling and to the Global Precipitation Measurement (ACRIDICON-CHUVA) campaigns. On particular flights, HALO performed measurements closely collocated with overpasses of the Moderate Resolution Imaging Spectroradiometer (MODIS) aboard the Aqua satellite. A cirrus cloud located above liquid water clouds and a DCC topped by an anvil cirrus are analyzed in this paper. Based on the nadir spectral upward radiance measured above the two clouds, the optical thickness τ and particle effective radius reff of the cirrus and DCC are retrieved using a radiance ratio technique, which considers the cloud thermodynamic phase, the vertical profile of cloud microphysical properties, the presence of multilayer clouds, and the heterogeneity of the surface albedo. For the cirrus case, the comparison of τ and reff retrieved on the basis of SMART and MODIS measurements yields a normalized mean absolute deviation of up to 1.2 % for τ and 2.1 % for reff. For the DCC case, deviations of up to 3.6 % for τ and 6.2 % for reff are obtained. The larger deviations in the DCC case are mainly attributed to the fast cloud evolution and three-dimensional (3-D) radiative effects. Measurements of spectral upward radiance at near-infrared wavelengths are employed to investigate the vertical profile of reff in the cirrus. The retrieved values of reff are compared with corresponding in situ measurements using a vertical weighting method. 
Compared to the MODIS observations, the SMART measurements provide more information on the vertical distribution of particle sizes, which allows the profile of reff close to the cloud top to be reconstructed. The comparison between retrieved and in situ reff yields a normalized mean absolute deviation ranging between 1.5 and 10.3 %, and a robust correlation coefficient of 0.82.
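
    The comparison metric quoted above, a normalized mean absolute deviation between two retrievals, can be computed as follows; normalizing by the mean of the reference retrieval is an assumption here, since the paper's exact definition is not reproduced in the abstract.

    ```python
    import numpy as np

    def nmad_percent(retrieved, reference):
        """Normalized mean absolute deviation, in percent.
        Normalization by the mean of the reference is an assumption here."""
        retrieved = np.asarray(retrieved, float)
        reference = np.asarray(reference, float)
        return 100.0 * np.mean(np.abs(retrieved - reference)) / np.mean(reference)

    # Illustrative optical-thickness values only, not data from the paper
    tau_smart = np.array([1.00, 1.05, 0.98])
    tau_modis = np.array([1.01, 1.04, 0.99])
    print(nmad_percent(tau_smart, tau_modis))   # ~1 %, i.e. close agreement
    ```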

  14. Retrieval of Cloud Properties for Partially Cloud-Filled Pixels During CRYSTAL-FACE

    NASA Astrophysics Data System (ADS)

    Nguyen, L.; Minnis, P.; Smith, W. L.; Khaiyer, M. M.; Heck, P. W.; Sun-Mack, S.; Uttal, T.; Comstock, J.

    2003-12-01

Partially cloud-filled pixels can be a significant problem for remote sensing of cloud properties. The derived optical depth and effective particle size are often too small and too large, respectively, when computed from radiances that are assumed to be overcast but contain radiation from both clear and cloudy areas within the satellite imager field of view. This study presents a method for reducing the impact of such partially cloud-filled pixels by estimating the cloud fraction within each pixel using higher resolution visible (VIS, 0.65 µm) imager data. Although the nominal resolutions for most channels on the Geostationary Operational Environmental Satellite (GOES) imager and the Moderate Resolution Imaging Spectroradiometer (MODIS) on Terra are 4 and 1 km, respectively, both instruments also take VIS channel data at 1 km and 0.25 km, respectively. Thus, it may be possible to obtain an improved estimate of cloud fraction within the lower resolution pixels by using the information contained in the higher resolution VIS data. GOES and MODIS multi-spectral data, taken during the Cirrus Regional Study of Tropical Anvils and Cirrus Layers - Florida Area Cirrus Experiment (CRYSTAL-FACE), are analyzed with the algorithm used for the Atmospheric Radiation Measurement Program (ARM) and the Clouds and the Earth's Radiant Energy System (CERES) to derive cloud amount, temperature, height, phase, effective particle size, optical depth, and water path. Normally, the algorithm assumes that each pixel is either entirely clear or cloudy. In this study, a threshold method is applied to the higher resolution VIS data to estimate the partial cloud fraction within each low-resolution pixel. The cloud properties are then derived from the observed low-resolution radiances using the cloud cover estimate to properly extract the radiances due only to the cloudy part of the scene. 
This approach is applied to both GOES and MODIS data to estimate the improvement in the retrievals for each resolution. Results are compared with the radar reflectivity techniques employed by the NOAA ETL MMCR and the PARSL 94 GHz radars located at the CRYSTAL-FACE Eastern & Western Ground Sites, respectively. This technique is most likely to yield improvements for low and midlevel layer clouds that have little thermal variability in cloud height.
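
    The pixel-level correction described above can be sketched with an assumed linear mixing model (a schematic illustration, not the ARM/CERES algorithm itself): high-resolution VIS sub-pixels are thresholded to estimate the cloud fraction f of each low-resolution pixel, and the cloudy-part radiance is extracted as (L_obs - (1 - f) L_clear) / f.

    ```python
    import numpy as np

    def cloud_fraction(vis_hi, threshold):
        """Fraction of high-res VIS sub-pixels exceeding a cloud threshold,
        for each 4x4 block corresponding to one low-resolution pixel."""
        h, w = vis_hi.shape
        blocks = vis_hi.reshape(h // 4, 4, w // 4, 4)
        return (blocks > threshold).mean(axis=(1, 3))

    def cloudy_radiance(obs_lo, f, clear):
        """Invert the linear mixing L_obs = f*L_cloud + (1-f)*L_clear."""
        return np.where(f > 0,
                        (obs_lo - (1.0 - f) * clear) / np.maximum(f, 1e-6),
                        np.nan)

    # Synthetic scene: one low-res pixel half covered by bright cloud
    vis_hi = np.full((4, 4), 0.05)        # clear-sky reflectance
    vis_hi[:, :2] = 0.60                  # cloudy half
    f = cloud_fraction(vis_hi, threshold=0.30)
    L_obs = f * 10.0 + (1.0 - f) * 2.0    # mixed low-res radiance (cloud=10, clear=2)
    print(f, cloudy_radiance(L_obs, f, clear=2.0))   # f = 0.5, recovered cloud radiance = 10
    ```

    The recovered cloudy-part radiance, rather than the mixed pixel radiance, is what would then be fed to the overcast retrieval.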

  15. Techniques A: continuous waves

    NASA Astrophysics Data System (ADS)

    Beuthan, J.

    1993-08-01

For a large number of medical diseases, the biochemical and physiological changes of soft tissues are hardly detectable by conventional techniques of diagnostic imaging (x-ray, ultrasound, computed tomography, and MRI). The detectivity is low and the technical effort is tremendous. On the other hand, these pathologic variations induce significant changes in the optical tissue parameters, which can be detected. The corresponding variations of the scattered light can most easily be detected and evaluated by infrared diaphanoscopy, even in optically thick tissue slices.

  16. Whole-organ atlas imaged by label-free high-resolution photoacoustic microscopy assisted by a microtome

    NASA Astrophysics Data System (ADS)

    Wong, Terence T. W.; Zhang, Ruiying; Hsu, Hsun-Chia; Maslov, Konstantin I.; Shi, Junhui; Chen, Ruimin; Shung, K. Kirk; Zhou, Qifa; Wang, Lihong V.

    2018-02-01

In biomedical imaging, all-optical techniques face a fundamental trade-off between spatial resolution and tissue penetration. Therefore, obtaining an organelle-level resolution image of a whole organ has remained a challenging and yet appealing scientific pursuit. Over the past decade, optical microscopy assisted by mechanical sectioning or chemical clearing of tissue has been demonstrated as a powerful technique to overcome this dilemma, one of particular use in imaging the neural network. However, this type of technique needs lengthy special preparation of the tissue specimen, which hinders broad application in the life sciences. Here, we propose a new label-free three-dimensional imaging technique, named microtomy-assisted photoacoustic microscopy (mPAM), for potentially imaging all biomolecules with 100% endogenous natural staining in whole organs with high fidelity. We demonstrate the first label-free mPAM, using UV light for label-free histology-like imaging, in whole organs (e.g., mouse brains), most of them formalin-fixed and paraffin- or agarose-embedded for minimal morphological deformation. Furthermore, mPAM with dual-wavelength illumination is also employed to image a mouse brain slice, demonstrating the potential for imaging multiple biomolecules without staining. With visible light illumination, mPAM also shows its deep-tissue imaging capability, which enables less slicing and hence reduces sectioning artifacts. mPAM could potentially provide new insights for understanding complex biological organs.

  17. Hyperspectrally-Resolved Surface Emissivity Derived Under Optically Thin Clouds

    NASA Technical Reports Server (NTRS)

    Zhou, Daniel K.; Larar, Allen M.; Liu, Xu; Smith, William L.; Strow, L. Larrabee; Yang, Ping

    2010-01-01

Surface spectral emissivity derived from current and future satellites can and will reveal critical information about the Earth's ecosystem and land surface type properties, which can be utilized as a means of long-term monitoring of global environment and climate change. Hyperspectrally-resolved surface emissivities are derived with an algorithm that utilizes a fast radiative transfer model (RTM) combining a molecular RTM and a cloud RTM, accounting for both atmospheric absorption and cloud absorption/scattering. Clouds are automatically detected and cloud microphysical parameters are retrieved; emissivity is then retrieved under clear and optically thin cloud conditions. This technique separates surface emissivity from skin temperature by representing the emissivity spectrum with eigenvectors derived from a laboratory-measured emissivity database; in other words, the eigenvector representation constrains the emissivity to vary smoothly across atmospheric absorption lines. Here we present the emissivity derived under optically thin clouds in comparison with that under clear conditions.
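
    The eigenvector constraint can be illustrated with a generic principal-component sketch using synthetic spectra (not the laboratory database or retrieval code used by the authors): representing a noisy emissivity spectrum by a few leading eigenvectors of a training set suppresses noise while preserving the smooth spectral shape.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    wn = np.linspace(800, 1300, 200)                 # wavenumber grid (cm^-1)

    # Synthetic "laboratory" emissivity database: smooth spectra built
    # from three smooth shape functions (stand-ins for real lab spectra)
    basis = np.stack([np.ones_like(wn),
                      (wn - 1050) / 250,
                      np.cos(np.pi * (wn - 800) / 500)])
    db = 0.95 + 0.03 * rng.normal(size=(50, 3)) @ basis

    # Leading eigenvectors of the database (via SVD about the mean)
    mean = db.mean(axis=0)
    _, _, Vt = np.linalg.svd(db - mean, full_matrices=False)
    E = Vt[:3]                                       # first 3 eigenvectors

    # A noisy retrieved spectrum projected onto the eigenvector subspace
    truth = mean + 0.02 * basis[2]
    noisy = truth + 0.01 * rng.normal(size=wn.size)
    coeffs = E @ (noisy - mean)
    smooth = mean + coeffs @ E

    print(np.abs(smooth - truth).max(), np.abs(noisy - truth).max())
    ```

    Because the spectrum is forced into the low-dimensional subspace of smooth laboratory shapes, channel-to-channel noise (and residual atmospheric line structure) is strongly damped, which is what allows emissivity to be separated from skin temperature.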

  18. Opportunity and Challenges for Migrating Big Data Analytics in Cloud

    NASA Astrophysics Data System (ADS)

    Amitkumar Manekar, S.; Pradeepini, G., Dr.

    2017-08-01

Big Data analytics is a buzzword nowadays. With ever more scalable data generation capabilities, data acquisition and storage have become crucial issues. Cloud storage is a widely used platform, and the technology will become crucial to executives handling data powered by analytics. The trend towards "big data-as-a-service" is now discussed everywhere. On one hand, cloud-based big data analytics directly tackles ongoing issues of scale, speed, and cost; on the other, researchers are still working to solve security and other real-time problems of big data migration to cloud-based platforms. This article is specially focused on finding possible ways to migrate big data to the cloud. Technology that supports coherent data migration and the possibility of doing big data analytics on a cloud platform is in demand for a new era of growth. This article also gives information about available technologies and techniques for migration of big data to the cloud.

  19. Methods of editing cloud and atmospheric layer affected pixels from satellite data

    NASA Technical Reports Server (NTRS)

    Nixon, P. R. (Principal Investigator); Wiegand, C. L.; Richardson, A. J.; Johnson, M. P.; Goodier, B. G.

    1981-01-01

    The location and migration of cloud, land and water features were examined in spectral space (reflective VIS vs. emissive IR). Daytime HCMM data showed two distinct types of cloud affected pixels in the south Texas test area. High altitude cirrus and/or cirrostratus and "subvisible cirrus" (SCi) reflected the same or only slightly more than land features. In the emissive band, the digital counts ranged from 1 to over 75 and overlapped land features. Pixels consisting of cumulus clouds, or of mixed cumulus and landscape, clustered in a different area of spectral space than the high altitude cloud pixels. Cumulus affected pixels were more reflective than land and water pixels. In August the high altitude clouds and SCi were more emissive than similar clouds were in July. Four-channel TIROS-N data were examined with the objective of developing a multispectral screening technique for removing SCi contaminated data.

  20. Quality of remote sensing measurements of cloud physical parameters in the cooperative convective precipitation experiment

    NASA Technical Reports Server (NTRS)

    Wu, M.-L.

    1985-01-01

    In order to develop the remote sensing techniques to infer cloud physical parameters, a multispectral cloud radiometer (MCR) was mounted on a NASA high-altitude aircraft in conjunction with the Cooperative Convective Precipitation Experiment in 1981. The MCR has seven spectral channels, of which three are centered near windows associated with water vapor bands in the near infrared, two are centered near the oxygen A band at 0.76 microns, one is centered at the 1.14-micron water vapor band, and one is centered in the thermal infrared. The reflectance and temperature measured on May 31, 1981, are presented together with theoretical calculations. The results indicate that the MCR produces quality measurements. Therefore several cloud parameters can be derived with good accuracy. The parameters are the cloud-scaled optical thickness, cloud top pressure, volume scattering coefficient, particle thermodynamic phase, effective mean particle size, and cloud-top temperature.

  1. Aerosol climatology using a tunable spectral variability cloud screening of AERONET data

    NASA Technical Reports Server (NTRS)

    Kaufman, Yoram J.; Gobbi, Gian Paolo; Koren, Ilan

    2005-01-01

Can cloud screening of an aerosol data set affect the aerosol optical thickness (AOT) climatology? Aerosols, humidity and clouds are correlated. Therefore, rigorous cloud screening can systematically bias the record towards less cloudy conditions, underestimating the average AOT. Here, using AERONET data, we show that systematic rejection of variable atmospheric optical conditions can generate such a bias in the average AOT. Therefore we recommend (1) introducing more powerful spectral variability cloud screening and (2) changing the philosophy behind present aerosol climatologies: instead of systematically rejecting all cloud contamination, we suggest intentionally allowing the presence of cloud contamination, estimating the statistical impact of the contamination, and correcting for it. The analysis, applied to 10 AERONET stations with approx. 4 years of data, shows almost no change for Rome (Italy), but up to a change in AOT of 0.12 in Beijing (PRC). A similar technique may be explored for satellite analyses, e.g. MODIS.

  2. New Techniques for Deep Learning with Geospatial Data using TensorFlow, Earth Engine, and Google Cloud Platform

    NASA Astrophysics Data System (ADS)

    Hancher, M.

    2017-12-01

    Recent years have seen promising results from many research teams applying deep learning techniques to geospatial data processing. In that same timeframe, TensorFlow has emerged as the most popular framework for deep learning in general, and Google has assembled petabytes of Earth observation data from a wide variety of sources and made them available in analysis-ready form in the cloud through Google Earth Engine. Nevertheless, developing and applying deep learning to geospatial data at scale has been somewhat cumbersome to date. We present a new set of tools and techniques that simplify this process. Our approach combines the strengths of several underlying tools: TensorFlow for its expressive deep learning framework; Earth Engine for data management, preprocessing, postprocessing, and visualization; and other tools in Google Cloud Platform to train TensorFlow models at scale, perform additional custom parallel data processing, and drive the entire process from a single familiar Python development environment. These tools can be used to easily apply standard deep neural networks, convolutional neural networks, and other custom model architectures to a variety of geospatial data structures. We discuss our experiences applying these and related tools to a range of machine learning problems, including classic problems like cloud detection, building detection, land cover classification, as well as more novel problems like illegal fishing detection. Our improved tools will make it easier for geospatial data scientists to apply modern deep learning techniques to their own problems, and will also make it easier for machine learning researchers to advance the state of the art of those techniques.

  3. Star counts and visual extinctions in dark nebulae

    NASA Technical Reports Server (NTRS)

    Dickman, R. L.

    1978-01-01

    Application of star count techniques to the determination of visual extinctions in compact, fairly high-extinction dark nebulae is discussed. Particular attention is devoted to the determination of visual extinctions for a cloud having a possibly anomalous ratio of total to selective extinction. The techniques discussed are illustrated in application at two colors to four well-known compact dust clouds or Bok globules: Barnard 92, B 133, B 134, and B 335. Minimum masses and lower limits to the central extinction of these objects are presented.
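
    The underlying star-count (Wolf) estimate of extinction can be sketched as follows; the slope value and counts are illustrative assumptions, not numbers from the paper.

    ```python
    import numpy as np

    def extinction_from_counts(n_field, n_cloud, slope_b):
        """Wolf-method visual extinction estimate (magnitudes) from star
        counts per unit area in an unobscured reference field and in the
        cloud. slope_b is the slope of log10 N(m) vs magnitude m in the
        reference field (values of roughly 0.3-0.4 are typical)."""
        return np.log10(n_field / n_cloud) / slope_b

    # Illustrative numbers (not from the paper): the cloud shows 10x fewer stars
    print(extinction_from_counts(n_field=200.0, n_cloud=20.0, slope_b=0.33))   # ~3 mag
    ```

    Counting at two colors, as in the paper, additionally constrains the reddening law and hence the ratio of total to selective extinction.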

  4. Study to perform preliminary experiments to evaluate particle generation and characterization techniques for zero-gravity cloud physics experiments

    NASA Technical Reports Server (NTRS)

    Katz, U.

    1982-01-01

    Methods of particle generation and characterization with regard to their applicability for experiments requiring cloud condensation nuclei (CCN) of specified properties were investigated. Since aerosol characterization is a prerequisite to assessing performance of particle generation equipment, techniques for characterizing aerosol were evaluated. Aerosol generation is discussed, and atomizer and photolytic generators including preparation of hydrosols (used with atomizers) and the evaluation of a flight version of an atomizer are studied.

  5. Now and Next-Generation Sequencing Techniques: Future of Sequence Analysis Using Cloud Computing

    PubMed Central

    Thakur, Radhe Shyam; Bandopadhyay, Rajib; Chaudhary, Bratati; Chatterjee, Sourav

    2012-01-01

    Advances in the field of sequencing techniques have resulted in the greatly accelerated production of huge sequence datasets. This presents immediate challenges in database maintenance at datacenters. It provides additional computational challenges in data mining and sequence analysis. Together these represent a significant overburden on traditional stand-alone computer resources, and to reach effective conclusions quickly and efficiently, the virtualization of the resources and computation on a pay-as-you-go concept (together termed “cloud computing”) has recently appeared. The collective resources of the datacenter, including both hardware and software, can be available publicly, being then termed a public cloud, the resources being provided in a virtual mode to the clients who pay according to the resources they employ. Examples of public companies providing these resources include Amazon, Google, and Joyent. The computational workload is shifted to the provider, which also implements required hardware and software upgrades over time. A virtual environment is created in the cloud corresponding to the computational and data storage needs of the user via the internet. The task is then performed, the results transmitted to the user, and the environment finally deleted after all tasks are completed. In this discussion, we focus on the basics of cloud computing, and go on to analyze the prerequisites and overall working of clouds. Finally, the applications of cloud computing in biological systems, particularly in comparative genomics, genome informatics, and SNP detection are discussed with reference to traditional workflows. PMID:23248640

  6. Now and next-generation sequencing techniques: future of sequence analysis using cloud computing.

    PubMed

    Thakur, Radhe Shyam; Bandopadhyay, Rajib; Chaudhary, Bratati; Chatterjee, Sourav

    2012-01-01

    Advances in the field of sequencing techniques have resulted in the greatly accelerated production of huge sequence datasets. This presents immediate challenges in database maintenance at datacenters. It provides additional computational challenges in data mining and sequence analysis. Together these represent a significant overburden on traditional stand-alone computer resources, and to reach effective conclusions quickly and efficiently, the virtualization of the resources and computation on a pay-as-you-go concept (together termed "cloud computing") has recently appeared. The collective resources of the datacenter, including both hardware and software, can be available publicly, being then termed a public cloud, the resources being provided in a virtual mode to the clients who pay according to the resources they employ. Examples of public companies providing these resources include Amazon, Google, and Joyent. The computational workload is shifted to the provider, which also implements required hardware and software upgrades over time. A virtual environment is created in the cloud corresponding to the computational and data storage needs of the user via the internet. The task is then performed, the results transmitted to the user, and the environment finally deleted after all tasks are completed. In this discussion, we focus on the basics of cloud computing, and go on to analyze the prerequisites and overall working of clouds. Finally, the applications of cloud computing in biological systems, particularly in comparative genomics, genome informatics, and SNP detection are discussed with reference to traditional workflows.

  7. Accuracy Assessment of a Canal-Tunnel 3d Model by Comparing Photogrammetry and Laserscanning Recording Techniques

    NASA Astrophysics Data System (ADS)

    Charbonnier, P.; Chavant, P.; Foucher, P.; Muzet, V.; Prybyla, D.; Perrin, T.; Grussenmeyer, P.; Guillemin, S.

    2013-07-01

With recent developments in the fields of technology and computer science, conventional surveying methods are being supplanted by laser scanning and digital photogrammetry. These two different surveying techniques generate 3-D models of real-world objects or structures. In this paper, we consider the application of terrestrial laser scanning (TLS) and photogrammetry to the surveying of canal tunnels. The inspection of such structures requires time, safe access, specific processing and professional operators. Therefore, a French partnership proposes to develop dedicated equipment based on image processing for the visual inspection of canal tunnels. A 3D model of the vault and side walls of the tunnel is constructed from images recorded onboard a boat moving inside the tunnel. To assess the accuracy of this photogrammetric model (PM), a reference model is built using static TLS. We here address the problem of comparing the resulting point clouds. Difficulties arise because of the highly differentiated acquisition processes, which result in very different point densities. We propose a new tool, designed to compute differences between pairs of point clouds or surfaces (triangulated meshes). Moreover, dealing with huge datasets requires the implementation of appropriate structures and algorithms. Several techniques are presented: point-to-point, cloud-to-cloud and cloud-to-mesh. In addition, farthest point resampling, an octree structure and the Hausdorff distance are adopted and described. Experimental results are shown for a 475 m long canal tunnel located in France.
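
    The cloud-to-cloud comparison and Hausdorff distance mentioned above can be sketched in a brute-force form (the authors' tool relies on octree structures and resampling to cope with huge, density-mismatched datasets):

    ```python
    import numpy as np

    def nn_dists(a, b):
        """For each point of cloud a, distance to its nearest neighbour
        in cloud b (brute force; fine for small clouds only)."""
        d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
        return d.min(axis=1)

    def hausdorff(a, b):
        """Symmetric Hausdorff distance between two point clouds."""
        return max(nn_dists(a, b).max(), nn_dists(b, a).max())

    rng = np.random.default_rng(3)
    dense = rng.random((500, 3))                 # dense TLS-like cloud
    sparse = dense[::10] + 0.001                 # sparse, slightly shifted subset
    print(hausdorff(dense, sparse))
    ```

    Note how the direction from the dense cloud to the sparse one dominates the symmetric distance when densities differ strongly; this is exactly why farthest-point resampling (and spatial indexing such as octrees) is needed before comparing TLS and photogrammetric clouds at scale.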

  8. The DC-8 Submillimeter-Wave Cloud Ice Radiometer

    NASA Technical Reports Server (NTRS)

    Walter, Steven J.; Batelaan, Paul; Siegel, Peter; Evans, K. Franklin; Evans, Aaron; Balachandra, Balu; Gannon, Jade; Guldalian, John; Raz, Guy; Shea, James

    2000-01-01

An airborne radiometer is being developed to demonstrate the capability of radiometry at submillimeter wavelengths to characterize cirrus clouds. At these wavelengths, cirrus clouds scatter upwelling radiation from water vapor in the lower troposphere. Radiometric measurements made at multiple widely spaced frequencies permit flux variations caused by changes in scattering due to crystal size to be distinguished from changes in cloud ice content. Measurements at dual polarizations can also be used to constrain the mean crystal shape. An airborne radiometer measuring the upwelling submillimeter-wave flux should then be able to retrieve both bulk and microphysical cloud properties. The radiometer is being designed to make measurements at four frequencies (183 GHz, 325 GHz, 448 GHz, and 643 GHz) with dual-polarization capability at 643 GHz. The instrument is being developed for flight on NASA's DC-8 and will scan cross-track through an aircraft window. Measurements with this radiometer, in combination with independent ground-based and airborne measurements, will validate the submillimeter-wave radiometer retrieval techniques. The goal of this effort is to develop a technique to enable spaceborne characterization of cirrus, which will meet a key climate measurement need. The development of an airborne radiometer to validate cirrus retrieval techniques is a critical step toward the development of space-based radiometers to investigate and monitor cirrus on a global scale. The radiometer development is a cooperative effort of the University of Colorado, Colorado State University, Swales Aerospace, and the Jet Propulsion Laboratory and is funded by the NASA Instrument Incubator Program.

  9. Weather radar equation and a receiver calibration based on a slice approach

    NASA Astrophysics Data System (ADS)

    Yurchak, B. S.

    2012-12-01

    Two circumstances are essential when exploiting radar measurements of precipitation. The first is a correct physical-mathematical model linking the parameters of the rainfall microstructure with the magnitude of the return signal (the weather radar equation (WRE)). The second is a precise measurement of received power, which is achieved by calibrating the radar receiver. A WRE for a spatially extended geophysical target (SEGT), such as a cloud or rain, has been derived based on the "slice" approach [1]. In this approach, the particles located close to the wavefront of the radar illumination are assumed to produce backscatter that is mainly coherent. This approach accounts for the contribution of the microphysical parameters of the scattering medium to the radar cross section more comprehensively than models based on the incoherent approach (e.g., the Probert-Jones equation (PJE)). In the particular case when the particle-number fluctuations within slices obey the Poisson law, the derived WRE reduces to the PJE. When the Poisson index (variance / mean number of particles) of a slice deviates from 1, the deviation of the return power estimated by the PJE from the actual value varies from +8 dB to -12 dB. In general, the backscatter depends on the mean, variance, and third moment of the particle size distribution function (PSDF), whereas the incoherent approach assumes dependence only on the sixth moment of the PSDF (the radar reflectivity Z). An additional difference from the classical estimate can be caused by correlation between slice field reflectivities [2]. Overall, deviation of a slice's particle statistics from the Poisson law is one of the main physical factors contributing to errors in radar precipitation measurements based on the Z-concept. One component of the calibration error is caused by the difference in how the weather radar receiver processes the calibration pulse and the actual return signal from the SEGT.
A receiver with a non-uniform amplitude-frequency response (AFR) processes these signals with the same input power but different radio-frequency spectra (RFS). This causes different output magnitudes, because the RFS are distorted differently as they pass through the receiver filter. To assess the calibration error, the RFS of signals from the SEGT has been studied in theoretical, experimental, and simulation stages [3]. It is shown that the return-signal carrier wave is phase modulated due to the overlapping of replicas of the RF probing pulse reflected from the SEGT's slices. The RFS depends on the phase statistics of the carrier wave and on the RFS of the probing pulse. The bandwidth of the SEGT's RFS is no greater than that of the probing pulse. The typical phase-correlation interval was found to be about the same as the probing-pulse duration. Application of a long calibration signal (proportional to the SEGT extension) causes an error of up to -1 dB for a conventional radar with a matched filter. To eliminate the calibration error, the power estimate of an individual return waveform should be corrected with a transformation-loss coefficient calculated from the RFS and AFR parameters. To cover both the high- and low-frequency parts of the receiver, the calibration should be performed with a long pulse composed of adjoining replicas of the probing pulse with random initial phases, all with the same magnitude governed by the power of the probing pulse.
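    The role of the Poisson index can be illustrated with a toy Monte Carlo (an assumption-laden sketch, not the derivation of [1]): if slices spaced a quarter-wavelength apart contribute to the two-way backscattered field with alternating sign, the mean coherent contribution cancels and the mean return power is set by the variance of the slice particle counts. With Poisson counts (variance = mean) the power matches the incoherent, PJE-like estimate; with over-dispersed counts it exceeds it.

```python
import numpy as np

rng = np.random.default_rng(0)

def mean_return_power(counts):
    """Mean backscattered power for slices spaced a quarter-wavelength apart.

    counts has shape (trials, n_slices); successive slices contribute to the
    two-way field with alternating sign (phase 0, pi, 0, pi, ...), so the
    field is sum_j (-1)^j N_j and the power is its square, averaged over trials.
    """
    signs = (-1.0) ** np.arange(counts.shape[1])
    field = (counts * signs).sum(axis=1)
    return (field ** 2).mean()

trials, n_slices, mu = 4000, 200, 50.0

# Poisson slice statistics (variance == mean): power matches the
# incoherent, PJE-like estimate (total mean particle count).
poisson = rng.poisson(mu, size=(trials, n_slices))
incoherent = mu * n_slices
ratio_poisson = mean_return_power(poisson) / incoherent

# Over-dispersed statistics (variance == 2 * mean, via a two-point mixture
# of Poisson means): power exceeds the incoherent estimate by ~3 dB.
mixed_mu = rng.choice([mu - np.sqrt(mu), mu + np.sqrt(mu)], size=(trials, n_slices))
overdispersed = rng.poisson(mixed_mu)
ratio_over = mean_return_power(overdispersed) / incoherent

print(round(ratio_poisson, 2), round(ratio_over, 2))  # ~1.0 and ~2.0
```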

  10. Estimation of Cirrus and Stratus Cloud Heights Using Landsat Imagery

    NASA Technical Reports Server (NTRS)

    Inomata, Yasushi; Feind, R. E.; Welch, R. M.

    1996-01-01

    A new method based upon high-spatial-resolution imagery is presented that matches cloud and shadow regions to estimate cirrus and stratus cloud heights. The distance between the cloud and its matching shadow pattern is determined using the 2D cross-correlation function, from which the cloud height is derived. The distance between the matching cloud-shadow patterns is verified manually. The derived heights also are validated through comparison with a temperature-based retrieval of cloud height. It is also demonstrated that an estimate of cloud thickness can be retrieved if both the sunside and anti-sunside of the cloud-shadow pair are apparent. The technique requires some interpretation to determine which cloud height level is retrieved (i.e., top, base, or mid-level). It is concluded that the method is accurate to within several pixels, equivalent to cloud height variations of about +/- 250 m. The results show that precise placement of the templates is unnecessary, so the development of a semi-automated procedure is possible. Cloud templates of about 64 pixels on a side or larger produce consistent results. The procedure was repeated for imagery degraded to simulate lower spatial resolutions. The results suggest that a spatial resolution of 150-200 m or better is necessary to obtain stable cloud height retrievals.
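    The height retrieval itself follows from simple shadow geometry (a sketch under the assumptions of flat terrain and near-nadir viewing; the paper's actual processing finds the offset by template matching):

```python
import math

def cloud_height(shadow_offset_m, sun_elevation_deg):
    """Cloud height from the horizontal cloud-to-shadow offset.

    For flat terrain and near-nadir viewing, the shadow of a cloud at
    height h is displaced by h / tan(elevation) along the solar azimuth,
    so h = offset * tan(elevation).
    """
    return shadow_offset_m * math.tan(math.radians(sun_elevation_deg))

# A cloud whose shadow lies 3 km away under a 45-degree sun is 3 km high.
print(cloud_height(3000.0, 45.0))  # ~3000.0 m
```

    With 30 m pixels and a sun elevation near 45 degrees, an offset error of several pixels maps to roughly the +/- 250 m height uncertainty quoted above.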

  11. Spatial and Temporal Distribution of Tropospheric Clouds and Aerosols Observed by MODIS Onboard the Terra and Aqua Satellites

    NASA Technical Reports Server (NTRS)

    King, Michael D.; Platnick, Steven; Remer, Lorraine A.; Kaufman, Yoram J.

    2004-01-01

    Remote sensing of cloud and aerosol optical properties is routinely obtained using the Moderate Resolution Imaging Spectroradiometer (MODIS) onboard the Terra and Aqua satellites. Techniques that are being used to enhance our ability to characterize the global distribution of cloud and aerosol properties include well-calibrated multispectral radiometers that rely on visible, near-infrared, and thermal infrared channels. The availability of thermal channels to aid in cloud screening for aerosol properties is an important additional piece of information that has not always been incorporated into sensor designs. In this paper, we describe the radiative properties of clouds as currently determined from satellites (cloud fraction, optical thickness, cloud top pressure, and cloud effective radius), and highlight the global and regional cloud microphysical properties currently available for assessing climate variability and forcing. These include the latitudinal distribution of cloud optical and radiative properties of both liquid water and ice clouds, as well as joint histograms of cloud optical thickness and effective radius for selected geographical locations around the world. In addition, we will illustrate the radiative and microphysical properties of aerosol particles that are currently available from space-based observations, and show selected cases in which aerosol particles are observed to modify the cloud optical properties.

  12. Multi-Objective Approach for Energy-Aware Workflow Scheduling in Cloud Computing Environments

    PubMed Central

    Kadima, Hubert; Granado, Bertrand

    2013-01-01

    We address the problem of scheduling workflow applications on heterogeneous computing systems such as cloud computing infrastructures. In general, cloud workflow scheduling is a complex optimization problem that requires considering multiple criteria to meet a large number of QoS (Quality of Service) requirements. Traditional research in workflow scheduling mainly focuses on optimization constrained by time or cost, without paying attention to energy consumption. The main contribution of this study is to propose a new approach for multi-objective workflow scheduling in clouds and to present a hybrid particle swarm optimization (PSO) algorithm to optimize scheduling performance. Our method is based on the Dynamic Voltage and Frequency Scaling (DVFS) technique to minimize energy consumption. This technique allows processors to operate at different supply-voltage levels by sacrificing clock frequency; operating at multiple voltages involves a compromise between schedule quality and energy consumption. Simulation results on synthetic and real-world scientific applications highlight the robust performance of the proposed approach. PMID:24319361

  13. A New Normalized Difference Cloud Retrieval Technique Applied to Landsat Radiances Over the Oklahoma ARM Site

    NASA Technical Reports Server (NTRS)

    Oreopoulos, Lazaros; Cahalan, Robert; Marshak, Alexander; Wen, Guoyong

    1999-01-01

    We suggest a new approach to cloud retrieval, using a normalized difference of nadir reflectivities (NDNR) constructed from a non-absorbing and an absorbing (with respect to liquid water) wavelength. Using Monte Carlo simulations, we show that this quantity has the potential to remove first-order scattering effects caused by cloud-side illumination and shadowing at oblique Sun angles. Application of the technique to TM (Thematic Mapper) radiance observations from Landsat-5 over the Southern Great Plains site of the ARM (Atmospheric Radiation Measurement) program gives very similar regional statistics and histograms, but significant differences at the pixel level. NDNR can also be combined with the inverse NIPA (Nonlocal Independent Pixel Approximation) of Marshak (1998), which is applied for the first time to overcast Landsat subscenes. We demonstrate the sensitivity of the NIPA-retrieved cloud fields to the parameters of the method and discuss practical issues related to the optimal choice of these parameters.
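    The abstract does not spell out the index, but by analogy with other normalized differences it presumably takes this form (the reflectivity values below are hypothetical):

```python
def ndnr(r_nonabs, r_abs):
    """Normalized difference of nadir reflectivities (assumed form).

    r_nonabs: reflectivity at a wavelength where liquid water is non-absorbing
    r_abs:    reflectivity at a liquid-water-absorbing wavelength
    """
    return (r_nonabs - r_abs) / (r_nonabs + r_abs)

# Stronger absorption contrast raises the index.
print(ndnr(0.8, 0.5), ndnr(0.6, 0.5))
```

    Because both wavelengths see nearly the same illumination and shadowing geometry, first-order geometric effects largely cancel in the ratio, which is the motivation stated above.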

  14. Multi-objective approach for energy-aware workflow scheduling in cloud computing environments.

    PubMed

    Yassa, Sonia; Chelouah, Rachid; Kadima, Hubert; Granado, Bertrand

    2013-01-01

    We address the problem of scheduling workflow applications on heterogeneous computing systems such as cloud computing infrastructures. In general, cloud workflow scheduling is a complex optimization problem that requires considering multiple criteria to meet a large number of QoS (Quality of Service) requirements. Traditional research in workflow scheduling mainly focuses on optimization constrained by time or cost, without paying attention to energy consumption. The main contribution of this study is to propose a new approach for multi-objective workflow scheduling in clouds and to present a hybrid particle swarm optimization (PSO) algorithm to optimize scheduling performance. Our method is based on the Dynamic Voltage and Frequency Scaling (DVFS) technique to minimize energy consumption. This technique allows processors to operate at different supply-voltage levels by sacrificing clock frequency; operating at multiple voltages involves a compromise between schedule quality and energy consumption. Simulation results on synthetic and real-world scientific applications highlight the robust performance of the proposed approach.
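    The DVFS trade-off described above can be sketched with the standard CMOS dynamic-power model (a toy model with hypothetical operating points, not the paper's scheduler):

```python
def energy_and_time(cycles, voltage, freq_hz, capacitance=1e-9):
    """Dynamic energy and runtime of a task under DVFS (CMOS toy model).

    Dynamic power scales as C * V^2 * f and runtime as cycles / f, so the
    dynamic energy per task is C * V^2 * cycles: lowering the voltage (and
    the frequency it can sustain) saves energy at the cost of a longer schedule.
    """
    time_s = cycles / freq_hz
    energy_j = capacitance * voltage ** 2 * freq_hz * time_s
    return energy_j, time_s

# The same 2e9-cycle task at two voltage/frequency operating points.
e_hi, t_hi = energy_and_time(2e9, voltage=1.2, freq_hz=2.0e9)
e_lo, t_lo = energy_and_time(2e9, voltage=0.9, freq_hz=1.0e9)
print(e_hi, t_hi)  # higher energy, shorter runtime
print(e_lo, t_lo)  # lower energy, longer runtime
```

    A multi-objective scheduler such as the hybrid PSO above searches over exactly this kind of per-task operating-point choice, trading makespan against total energy.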

  15. The Citizen CATE Experiment: Techniques to Determine Totality Coverage and Clouded Data Removal.

    NASA Astrophysics Data System (ADS)

    McKay, Myles A.; Ursache, Andrei; Penn, Matthew; Citizen CATE Experiment 2017 Team

    2018-01-01

    On August 21, 2017, the Citizen Continental-America Telescopic Eclipse (CATE) Experiment observed the 2017 total solar eclipse using a network of 68 identical telescope and camera systems along the path of totality. As a result, over 90% of all sites collected totality data on the day of the eclipse. Since the volunteers had to remove the solar filter manually, there is uncertainty about which exposures were acquired during totality. Some sites also experienced cloudy weather that obscured the eclipse in some exposures but had small breaks in the clouds during the observation, still collecting clear totality data. Before we can process and analyze the eclipse data, we must carefully determine which frames cover the time of totality for each site and remove exposures with clouds blocking the FOV. In this poster, we discuss the techniques used to determine the extent of totality at each location using the logged GPS data, and the removal of totality exposures contaminated by clouds.
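    The frame-selection step can be sketched as a simple timestamp filter (the contact times and margin below are hypothetical; the actual CATE processing works from the logged GPS clock):

```python
from datetime import datetime, timedelta

def totality_frames(frame_times, c2, c3, margin_s=1.0):
    """Select exposures that fall inside the totality window [c2, c3].

    frame_times: exposure timestamps (e.g. from the logged GPS clock)
    c2, c3:      second- and third-contact times for the site
    margin_s:    shrink the window to allow for clock/filter-removal slop
    """
    lo = c2 + timedelta(seconds=margin_s)
    hi = c3 - timedelta(seconds=margin_s)
    return [t for t in frame_times if lo <= t <= hi]

# Hypothetical site: totality from 17:20:10 to 17:22:30 UTC.
c2 = datetime(2017, 8, 21, 17, 20, 10)
c3 = datetime(2017, 8, 21, 17, 22, 30)
frames = [c2 + timedelta(seconds=s) for s in (-5, 0.5, 30, 139, 141)]
print(len(totality_frames(frames, c2, c3)))  # 2
```

    Cloud-contaminated exposures would then be culled from the surviving frames by an image-quality test before coronal processing.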

  16. Cloudnet Project

    DOE Data Explorer

    Hogan, Robin

    2008-01-15

    Cloudnet is a research project supported by the European Commission. The project aims to use quasi-continuously obtained data for the development and implementation of cloud remote sensing synergy algorithms. The use of active instruments (lidar and radar) yields detailed vertical profiles of important cloud parameters which cannot be derived from current satellite-sensing techniques. A network of three existing cloud remote sensing stations (CRS stations) will be operated for a two-year period; activities will be coordinated, data formats harmonised, and analysis of the data performed to evaluate the representation of clouds in four major European weather forecast models.

  17. Polarimetric radar and aircraft observations of saggy bright bands during MC3E

    DOE PAGES

    Kumjian, Matthew R.; Giangrande, Scott E.; Mishra, Subashree; ...

    2016-03-19

    Polarimetric radar observations are increasingly used to understand cloud microphysical processes, which is critical for improving their representation in cloud and climate models. In particular, there has been recent focus on improving representations of ice collection processes (e.g., aggregation, riming), as these influence precipitation rate, heating profiles, and ultimately cloud life cycles. However, distinguishing these processes using conventional polarimetric radar observations is difficult, as they produce similar fingerprints. This necessitates improved analysis techniques and the integration of complementary data sources. The Midlatitude Continental Convective Clouds Experiment (MC3E) provided such an opportunity.

  18. Deployment of the third-generation infrared cloud imager: A two-year study of Arctic clouds at Barrow, Alaska

    NASA Astrophysics Data System (ADS)

    Nugent, Paul Winston

    Cloud cover is an important but poorly understood component of current climate models, and although climate change is most easily observed in the Arctic, cloud data in the Arctic are unreliable or simply unavailable. Ground-based infrared cloud imaging has the potential to fill this gap. This technique uses a thermal infrared camera to observe cloud amount, cloud optical depth, and cloud spatial distribution at a particular location. The Montana State University Optical Remote Sensor Laboratory has developed the ground-based Infrared Cloud Imager (ICI) instrument to measure spatial and temporal cloud data. Building an ICI for Arctic sites required engineering the system to overcome the challenges of this environment. A particular challenge was keeping the system calibration and data processing accurate through severe temperature changes. Another significant challenge was that the weak emission from the cold, dry Arctic atmosphere pushed the camera used in the instrument to its operational limits. To gain an understanding of the operation of ICI systems in the Arctic and to gather critical data on Arctic clouds, a prototype Arctic ICI was deployed in Barrow, AK from July 2012 through July 2014. To understand its long-term operation, the ICI system's accuracy was studied against co-located active and passive sensors. Understanding the operation of this system in the Arctic environment required careful characterization of the full optical system, including the lens, filter, and detector. Alternative data processing techniques using decision trees and support vector machines were studied to improve data accuracy and reduce dependence on auxiliary instrument data, and the resulting accuracy is reported here. The work described in this project was part of the effort to develop a fourth-generation ICI ready to be deployed in the Arctic. 
This system will serve a critical role in developing our understanding of cloud cover in the Arctic, an important but poorly understood region of the world.
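    The basic radiometric idea behind ICI-style cloud detection can be sketched as a per-pixel brightness-temperature test (a toy illustration with hypothetical values; the actual processing also corrects for atmospheric water-vapour emission and, as noted above, uses trained classifiers):

```python
def classify_sky(brightness_temps_k, clear_sky_temp_k, threshold_k=10.0):
    """Toy per-pixel cloud mask for a ground-based thermal-IR sky image.

    Clouds are warmer (more emissive) than the clear-sky background, so a
    pixel is flagged cloudy when its brightness temperature exceeds the
    expected clear-sky value by more than a threshold.
    """
    return [[t - clear_sky_temp_k > threshold_k for t in row]
            for row in brightness_temps_k]

image = [[228.0, 231.0],   # cold pixels: clear sky
         [255.0, 262.0]]   # warm pixels: cloud
mask = classify_sky(image, clear_sky_temp_k=230.0)
print(mask)  # [[False, False], [True, True]]
```

    Cloud amount then follows as the fraction of flagged pixels, and the residual radiance above the clear-sky baseline gives a handle on optical depth.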

  19. The geometry and physical properties of exhaust clouds generated during the static firing of S-1C and S-2 rocket engines

    NASA Technical Reports Server (NTRS)

    Forbes, R. E.; Smith, M. R.; Farrell, R. R.

    1972-01-01

    An experimental program was conducted during the static firing of the S-1C stage 13, 14, and 15 rocket engines and the S-2 stage 13, 14, and 15 rocket engines. The data compiled during the program consisted of photographic recordings of the time-dependent growth and diffusion of the exhaust clouds, meteorological data collected in the ambient atmosphere, and data on the physical structure of the exhaust clouds obtained by flying instrumented aircraft through the clouds. A new technique was developed to verify previous measurements of the evaporation and entrainment of blast-deflector cooling water into the cloud. The results indicate that at lower altitudes the rocket exhaust cloud or plume closely resembles a free-jet type of flow. At upper altitudes, where the cloud approaches an equilibrium condition, its structure is very similar to that of a natural cumulus cloud.

  20. Cloud-based adaptive exon prediction for DNA analysis

    PubMed Central

    Putluri, Srinivasareddy; Fathima, Shaik Yasmeen

    2018-01-01

    Cloud computing offers significant research and economic benefits to healthcare organisations. Cloud services provide a safe place for storing and managing large amounts of sensitive data. Under the conventional flow of gene information, gene-sequencing laboratories send raw and inferred information via the Internet to several sequence libraries. DNA-sequence storage costs can be minimised by the use of cloud services. In this study, the authors put forward a novel genomic informatics system using Amazon Cloud Services, where genomic sequence information is stored and accessed for processing. Accurate identification of exon regions in a DNA sequence is a key task in bioinformatics, which helps in disease identification and drug design. The three-base periodicity property of exons forms the basis of all exon-identification techniques. Adaptive signal processing techniques were found to be promising in comparison with several other methods. Several adaptive exon predictors (AEPs) are developed using variable normalised least mean square and its maximum normalised variants to reduce computational complexity. Finally, the performance of the various AEPs is evaluated using measures such as sensitivity, specificity, and precision on standard genomic datasets taken from the National Center for Biotechnology Information genomic sequence database. PMID:29515813
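    A minimal normalized least-mean-square (NLMS) update, the base form from which the paper's variable and maximum normalised variants differ in their step-size control, can be sketched as follows (the period-3 test signal is a stand-in for the three-base periodicity of exons):

```python
import numpy as np

def nlms(x, d, n_taps=4, mu=0.5, eps=1e-8):
    """Normalized least-mean-square adaptive filter.

    x: input sequence, d: desired sequence. Returns the error signal; in
    exon prediction the filter is driven by a period-3 reference and the
    error power indicates where the three-base periodicity is present.
    """
    w = np.zeros(n_taps)
    err = np.zeros(len(x))
    for n in range(n_taps, len(x)):
        u = x[n - n_taps:n][::-1]             # most recent samples first
        y = w @ u                             # filter output
        err[n] = d[n] - y
        w += mu * err[n] * u / (eps + u @ u)  # normalized weight update
    return err

# The filter learns to track a delayed, scaled period-3 tone.
t = np.arange(600)
x = np.cos(2 * np.pi * t / 3)
d = 0.7 * np.cos(2 * np.pi * (t - 1) / 3)
err = nlms(x, d)
print(np.abs(err[-100:]).mean())  # small residual once converged
```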

  1. Toward a Big Data Science: A challenge of "Science Cloud"

    NASA Astrophysics Data System (ADS)

    Murata, Ken T.; Watanabe, Hidenobu

    2013-04-01

    Over the past 50 years, along with the appearance and development of high-performance computers (and supercomputers), numerical simulation has come to be considered a third methodology for science, following the theoretical (first) and experimental/observational (second) approaches. The variety of data yielded by the second approach has steadily increased, owing to progress in experimental and observational technologies, and the amount of data generated by the third methodology has grown ever larger, thanks to the tremendous development of supercomputers and their programming techniques. Most of the data files created by experiments/observations and numerical simulations are saved in digital formats and analyzed on computers. Researchers (domain experts) are interested not only in how to carry out experiments and/or observations or perform numerical simulations, but also in what information (new findings) to extract from the data. However, data do not usually reveal the science by themselves; the science is implicitly hidden in the data, and researchers have to extract information from the data files to find new science. This is the basic concept of data-intensive (data-oriented) science for Big Data. As the scales of experiments and/or observations and numerical simulations grow, new techniques and facilities are required to extract information from large numbers of data files. This technique, called informatics, is a fourth methodology for new sciences. Every methodology must work on its own facilities: in space science, for example, the space environment is observed via spacecraft and numerical simulations are performed on supercomputers. The facility for informatics, which deals with large-scale data, is a computational cloud system for science. This paper proposes a cloud system for informatics that has been developed at NICT (National Institute of Information and Communications Technology), Japan. 
The NICT science cloud, which we have named OneSpaceNet (OSN), is the first open cloud system for scientists who are going to carry out informatics for their own science. The science cloud is not for simple uses; many functions are expected of it, such as data standardization, data collection and crawling, large and distributed data storage, security and reliability, databases and meta-databases, data stewardship, long-term data preservation, data rescue, data mining, parallel processing, data publication and provision, the semantic web, 3D and 4D visualization, outreach and inreach, and capacity building. The figure (not shown here) is a schematic picture of the NICT science cloud. Both types of data, from observation and simulation, are stored in the storage system of the science cloud. Note that observational data are of two types: data from archive sites outside the cloud, downloaded through the Internet, and data from equipment directly connected to the science cloud (the latter arrangement is often called a sensor cloud). In the present talk, we first introduce the NICT science cloud and then demonstrate its efficiency, showing several scientific results achieved with this cloud system. Through these discussions and demonstrations, the potential of the science cloud for many research fields will be revealed.

  2. Precipitation Estimation from Remotely Sensed Information using Artificial Neural Network-Cloud Classification System

    NASA Astrophysics Data System (ADS)

    Hong, Yang

    Precipitation estimation from satellite information (visible, IR, or microwave) is becoming increasingly imperative because of its high spatial/temporal resolution and broad coverage, unparalleled by ground-based data. After decades of effort in rainfall estimation using IR imagery, the limitations and uncertainties of existing techniques have been identified as: (1) pixel-based, local-scale feature extraction; (2) an IR temperature threshold to define rain/no-rain clouds; (3) an indirect relationship between rain rate and cloud-top temperature; (4) lumped techniques for modeling the high variability of cloud-precipitation processes; (5) coarse scales of rainfall products. Continuing these studies, this dissertation develops a new version of Precipitation Estimation from Remotely Sensed Information using Artificial Neural Network (PERSIANN), called the Cloud Classification System (CCS), to cope with these limitations. CCS includes three consecutive components: (1) a hybrid segmentation algorithm, namely Hierarchically Topographical Thresholding and Stepwise Seeded Region Growing (HTH-SSRG), to segment satellite IR images into separate cloud patches; (2) a 3D feature extraction procedure to retrieve both pixel-based local-scale and patch-based large-scale features of cloud patches at various heights; (3) an ANN model, the Self-Organizing Nonlinear Output (SONO) network, to classify cloud patches into similarity-based clusters using a Self-Organizing Feature Map (SOFM), and then calibrate hundreds of multi-parameter nonlinear functions to identify the relationship between each cloud type and its underlying precipitation characteristics using the Probability Matching Method and Multi-Start Downhill Simplex optimization techniques. 
The model was first calibrated over the southwestern United States (100°-130°W, 25°-45°N) and then adaptively adjusted to the study region of the North American Monsoon Experiment (65°-135°W, 10°-50°N) using observations from Geostationary Operational Environmental Satellite (GOES) IR imagery, the Next Generation Radar (NEXRAD) rainfall network, and Tropical Rainfall Measuring Mission (TRMM) microwave rain-rate estimates. CCS functions as a distributed model that first identifies cloud patches and then dispatches the best-matching cloud-precipitation function to each cloud patch to estimate instantaneous rain rate at high spatial resolution (4 km) and the full temporal resolution of GOES IR images (every 30 minutes). Evaluated over a range of spatial and temporal scales, the performance of CCS consistently compared favorably with the GOES Precipitation Index (GPI), Universal Adjusted GPI (UAGPI), PERSIANN, and Auto-Estimator (AE) algorithms. In particular, the large number of nonlinear functions and optimum IR-rain-rate thresholds of the CCS model are highly variable, reflecting the complexity of the dominant cloud-precipitation processes from cloud patch to cloud patch over various regions. As a result, CCS captures variability in rain rate at small scales more successfully than existing algorithms and can potentially provide a rainfall product from GOES IR, NEXRAD, and TRMM TMI (SSM/I) at 0.12° x 0.12° and 3-hour resolution with relatively low standard error (~3.0 mm/hr) and high correlation coefficient (~0.65).
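    The first CCS step, cutting a thresholded IR image into separate cloud patches, can be illustrated with a toy connected-components pass (a stand-in for HTH-SSRG, which additionally varies the threshold incrementally and grows regions from seeds):

```python
from collections import deque

def cloud_patches(temps, threshold_k):
    """Segment an IR brightness-temperature grid into cold cloud patches.

    Pixels colder than threshold_k are cloudy, and 4-connected cloudy
    pixels are grouped into patches (via BFS) so that features can later
    be extracted per patch rather than per pixel.
    """
    rows, cols = len(temps), len(temps[0])
    seen = [[False] * cols for _ in range(rows)]
    patches = []
    for r in range(rows):
        for c in range(cols):
            if temps[r][c] < threshold_k and not seen[r][c]:
                patch, queue = [], deque([(r, c)])
                seen[r][c] = True
                while queue:
                    i, j = queue.popleft()
                    patch.append((i, j))
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if (0 <= ni < rows and 0 <= nj < cols
                                and temps[ni][nj] < threshold_k
                                and not seen[ni][nj]):
                            seen[ni][nj] = True
                            queue.append((ni, nj))
                patches.append(patch)
    return patches

grid = [[290, 210, 290],
        [290, 215, 290],
        [220, 290, 218]]
print([len(p) for p in cloud_patches(grid, 253.0)])  # [2, 1, 1]
```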

  3. Development of an improved high efficiency thin silicon solar cell

    NASA Technical Reports Server (NTRS)

    Lindmayer, J.

    1978-01-01

    Efforts were concerned with optimizing techniques for thinning silicon slices in NaOH etches, initial investigations of surface texturing, variation of furnace treatments to improve cell efficiency, initial efforts on optimization of gridline and cell sizes, and Pilot Line fabrication of quantities of 2 cm x 2 cm, 50-micron-thick cells.

  4. Application of an automatic cloud tracking technique to Meteosat water vapor and infrared observations

    NASA Technical Reports Server (NTRS)

    Endlich, R. M.; Wolf, D. E.

    1980-01-01

    The automatic cloud tracking system was applied to METEOSAT 6.7-micrometer water vapor measurements to learn whether the system can track the motions of water vapor patterns. Data for the midlatitudes, subtropics, and tropics were selected from a sequence of METEOSAT pictures for 25 April 1978. Trackable features in the water vapor patterns were identified using a clustering technique, and the features were tracked by two different methods. In flat (low-contrast) water vapor fields the automatic motion computations were not reliable, but in areas where the water vapor fields contained small-scale structure (such as in the vicinity of active weather phenomena) the computations were successful. Cloud motions were also computed using METEOSAT infrared observations (including tropical convective systems and midlatitude jet-stream cirrus).
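    The displacement estimate at the heart of such feature tracking can be sketched with an FFT-based circular cross-correlation (a toy version on synthetic frames; the operational system first selects trackable features by clustering):

```python
import numpy as np

def displacement(a, b):
    """Estimate the shift of pattern b relative to a by cross-correlation.

    The peak of the circular cross-correlation (computed via FFT) gives
    the (row, col) displacement; dividing by the frame interval would
    convert it to a motion vector, as in satellite feature tracking.
    """
    xcorr = np.fft.ifft2(np.conj(np.fft.fft2(a)) * np.fft.fft2(b)).real
    peak = np.unravel_index(np.argmax(xcorr), xcorr.shape)
    # map peak indices into the signed range [-n/2, n/2)
    return tuple(p - n if p > n // 2 else p for p, n in zip(peak, a.shape))

rng = np.random.default_rng(1)
frame1 = rng.random((32, 32))
frame2 = np.roll(frame1, shift=(3, -5), axis=(0, 1))  # feature moved 3 down, 5 left
print(displacement(frame1, frame2))  # (3, -5)
```

    Flat, low-contrast fields defeat this estimator for the reason noted above: without small-scale structure, the correlation surface has no sharp peak.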

  5. A survey of laser lightning rod techniques

    NASA Technical Reports Server (NTRS)

    Barnes, Arnold A., Jr.; Berthel, Robert O.

    1991-01-01

    The work done to create a laser lightning rod (LLR) is discussed, along with ongoing research that has the potential to achieve an operational laser lightning rod for use in the protection of missile launch sites, launch vehicles, and other property. Because of the ease with which a laser beam can be steered into any cloud overhead, an LLR could be used to ascertain whether there exists enough charge in the clouds to discharge to the ground as triggered lightning. This leads to the possibility of using LLRs to test clouds prior to launching missiles through them or prior to flying aircraft through them. LLRs could also be used to probe and discharge clouds before or during any hazardous ground operations. Thus, an operational LLR may be able both to detect sub-critical electric fields and to effectively neutralize them.

  6. Graphic correlation across a facies change marking the location of a middle to late Cenomanian ocean front

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fisher, C.G.

    1993-03-01

    An abrupt lithofacies change between calcareous shale and non-calcareous shale occurs in strata deposited in the mid-Cretaceous Greenhorn Seaway in the extreme southeastern corner of Montana, U.S.A. These strata, north of the Black Hills, have previously been miscorrelated owing to the extreme difficulty of locating unique continuous marker beds. The Supplemental Graphic Correlation techniques of Lucy Edwards, which expand on those of Shaw, were employed in the difficult task of correlating across the Little Missouri River Valley. Precise correlation was necessary in order to interpret the cause of the lithofacies change. Edwards's use of non-unique event marker beds and the side-by-side graph method proved to be invaluable. Precise correlation across the facies change was accomplished using a combination of bentonite beds, calcarenite beds, ammonite species, and foraminiferal and calcareous nannofossil assemblages. Supplemental Graphic Correlation techniques allowed the definition of twenty-five time slices and permitted the identification of an ocean front during each of these time slices.
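    Graphic correlation in the Shaw tradition boils down to fitting a line of correlation through the positions of shared events in two sections; a minimal least-squares sketch (with hypothetical event heights) is:

```python
def line_of_correlation(x, y):
    """Least-squares line y = a + b*x through shared-event positions.

    x, y: stratigraphic heights of the same events (bentonites, first/last
    appearances) in two measured sections; the fitted line lets a horizon
    in section X be projected into section Y.
    """
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = sum((xi - mx) ** 2 for xi in x)
    b = num / den
    a = my - b * mx
    return a, b

# Hypothetical event heights (m) in two sections; section Y accumulates
# sediment twice as fast and starts 5 m higher.
x = [1.0, 4.0, 7.0, 10.0]
y = [7.0, 13.0, 19.0, 25.0]
a, b = line_of_correlation(x, y)
print(a, b)         # 5.0 2.0
print(a + b * 6.0)  # a 6 m horizon in X projects to 17 m in Y
```

    The slope b is the relative accumulation rate; time slices correspond to bands along the fitted line.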

  7. Fast automatic correction of motion artifacts in shoulder MRI

    NASA Astrophysics Data System (ADS)

    Manduca, Armando; McGee, Kiaran P.; Welch, Edward B.; Felmlee, Joel P.; Ehman, Richard L.

    2001-07-01

    The ability to correct certain types of MR images for motion artifacts from the raw data alone, by iterative optimization of an image quality measure, has recently been demonstrated. In the first study on a large set of clinical images, we showed that such an autocorrection technique significantly improved the quality of clinical rotator cuff images and performed almost as well as navigator-echo correction while never degrading an image. One major criticism of such techniques is that they are computationally intensive, with reported processing times ranging from a few minutes to tens of minutes per slice. In this paper we describe a variety of improvements to our algorithm, as well as approaches to correct sets of adjacent slices efficiently. The resulting algorithm corrects 256x256x20 clinical shoulder data sets for motion at an effective rate of 1 second/image on a standard commercial workstation. Future improvements in processor speeds and/or the use of specialized hardware will translate directly into corresponding reductions in this calculation time.
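    The autocorrection idea, searching motion parameters to optimize an image-quality measure computed from the raw data alone, can be sketched in one dimension (a toy model: a single mid-scan translation and entropy as the metric; the clinical algorithm handles 2D multi-slice data and richer motion):

```python
import numpy as np

def entropy(mag):
    """Image-quality metric: entropy of the normalized magnitude image.

    Motion ghosting spreads signal energy, raising the entropy, so lower
    entropy indicates better-corrected data.
    """
    p = mag / mag.sum()
    return float(-(p * np.log(p + 1e-12)).sum())

def autocorrect(kspace, freqs, candidates):
    """Brute-force autocorrection of a mid-scan translation (1-D toy).

    The second half of k-space is assumed to have been acquired after the
    subject shifted; each candidate shift is undone via the Fourier shift
    theorem and scored by image entropy, keeping the best.
    """
    half = len(kspace) // 2
    best = None
    for c in candidates:
        trial = kspace.copy()
        trial[half:] *= np.exp(2j * np.pi * freqs[half:] * c)
        score = entropy(np.abs(np.fft.ifft(trial)))
        if best is None or score < best[0]:
            best = (score, c)
    return best[1]

n = 128
x = np.zeros(n)
x[40:60] = 1.0                # simple "anatomy"
freqs = np.fft.fftfreq(n)
k = np.fft.fft(x)
k[n // 2:] *= np.exp(-2j * np.pi * freqs[n // 2:] * 4)  # subject moved 4 pixels mid-scan
print(autocorrect(k, freqs, candidates=range(-8, 9)))   # recovers the 4-pixel shift
```

    The speedups reported above come from smarter search than this brute force, plus sharing corrections across adjacent slices.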

  8. Application of machine vision to pup loaf bread evaluation

    NASA Astrophysics Data System (ADS)

    Zayas, Inna Y.; Chung, O. K.

    1996-12-01

    Intrinsic end-use quality of hard winter wheat breeding lines is routinely evaluated at the USDA, ARS, USGMRL, Hard Winter Wheat Quality Laboratory. The experimental baking of pup loaves is the ultimate test of hard wheat quality. Computer vision was applied to develop an objective methodology for bread-quality evaluation of the 1994 and 1995 crop wheat breeding-line samples. Computer-extracted features of bread crumb grain were studied, using 32 by 32 pixel subimages and features computed for slices at different threshold settings. A subsampling grid was located with respect to the axis of symmetry of a slice to provide identical topological subimage information. Different ranking techniques were applied to the databases. Statistical analysis was run on the database of digital image and breadmaking features. Several ranking algorithms and data visualization techniques were employed to create a sensitive scale for porosity patterns of bread crumb. There were significant linear correlations between machine-vision-extracted features and breadmaking parameters. Crumb grain scores by human experts were correlated more highly with some image features than with breadmaking parameters.
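    The per-subimage crumb-grain measurement can be illustrated with a toy porosity feature (a hypothetical threshold and box size; the study used 32 by 32 pixel subimages aligned to the slice's axis of symmetry):

```python
def porosity_features(image, box=4, threshold=128):
    """Fraction of dark (pore) pixels in each box x box subimage.

    The slice image is cut into a grid of subimages and a porosity score
    is computed for each by thresholding, giving a feature vector that
    can then be ranked or correlated with breadmaking parameters.
    """
    rows, cols = len(image), len(image[0])
    feats = []
    for r0 in range(0, rows - box + 1, box):
        for c0 in range(0, cols - box + 1, box):
            dark = sum(1 for r in range(r0, r0 + box)
                         for c in range(c0, c0 + box)
                         if image[r][c] < threshold)
            feats.append(dark / (box * box))
    return feats

# 4 x 8 toy crumb image: left half porous (dark), right half dense (bright).
img = [[40] * 4 + [200] * 4 for _ in range(4)]
print(porosity_features(img))  # [1.0, 0.0]
```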

  9. An approach to localize the retinal blood vessels using bit planes and centerline detection.

    PubMed

    Fraz, M M; Barman, S A; Remagnino, P; Hoppe, A; Basit, A; Uyyanonvara, B; Rudnicka, A R; Owen, C G

    2012-11-01

    The change in morphology, diameter, branching pattern or tortuosity of retinal blood vessels is an important indicator of various clinical disorders of the eye and the body. This paper reports an automated method for segmentation of blood vessels in retinal images. A unique combination of techniques for vessel centerline detection and morphological bit plane slicing is presented to extract the blood vessel tree from retinal images. The centerlines are extracted by using the first order derivative of a Gaussian filter in four orientations, after which the derivative signs and average derivative values are evaluated. Mathematical morphology has emerged as a proficient technique for quantifying the blood vessels in the retina. The shape and orientation map of blood vessels is obtained by applying a multidirectional morphological top-hat operator with a linear structuring element, followed by bit plane slicing of the vessel-enhanced grayscale image. The centerlines are combined with these maps to obtain the segmented vessel tree. The methodology is tested on three publicly available databases: DRIVE, STARE, and MESSIDOR. The results demonstrate that the performance of the proposed algorithm is comparable with state-of-the-art techniques in terms of accuracy, sensitivity and specificity. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
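    Bit plane slicing itself is simple to sketch: an 8-bit image decomposes into eight binary planes, and the high-order planes carry the coarse structure. The choice of which planes to keep is an illustrative assumption here; the paper applies slicing to a vessel-enhanced grayscale image, not a raw one.

```python
import numpy as np

def bit_planes(gray):
    """Decompose an 8-bit image into its 8 binary bit planes
    (plane 7 = most significant)."""
    g = gray.astype(np.uint8)
    return [((g >> b) & 1).astype(np.uint8) for b in range(8)]

def reconstruct(planes, keep):
    """Rebuild the image from a subset of planes, e.g. the high-order
    planes that carry the dominant structure."""
    out = np.zeros(planes[0].shape, dtype=np.uint16)
    for b in keep:
        out += planes[b].astype(np.uint16) << b
    return out.astype(np.uint8)

img = np.array([[0, 255], [128, 200]], dtype=np.uint8)
planes = bit_planes(img)
top = reconstruct(planes, keep=[6, 7])   # keep only the two highest planes
```

    Keeping all eight planes reproduces the original image exactly, which is a handy sanity check for the decomposition.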

  10. Optical coherence tomography visualizes neurons in human entorhinal cortex

    PubMed Central

    Magnain, Caroline; Augustinack, Jean C.; Konukoglu, Ender; Frosch, Matthew P.; Sakadžić, Sava; Varjabedian, Ani; Garcia, Nathalie; Wedeen, Van J.; Boas, David A.; Fischl, Bruce

    2015-01-01

    Abstract. The cytoarchitecture of the human brain is of great interest in diverse fields: neuroanatomy, neurology, neuroscience, and neuropathology. Traditional histology is a method that has been historically used to assess cell and fiber content in the ex vivo human brain. However, this technique suffers from significant distortions. We used a previously demonstrated optical coherence microscopy technique to image individual neurons in several square millimeters of en-face tissue blocks from layer II of the human entorhinal cortex, over 50 μm in depth. The same slices were then sectioned and stained for Nissl substance. We registered the optical coherence tomography (OCT) images with the corresponding Nissl-stained slices using a nonlinear transformation. The neurons were then segmented in both images and we quantified the overlap. We show that OCT images contain information about neurons that is comparable to what can be obtained from Nissl staining, and thus can be used to assess the cytoarchitecture of the ex vivo human brain with minimal distortion. With the future integration of a vibratome into the OCT imaging rig, this technique can be scaled up to obtain undistorted volumetric data of centimeter cube tissue blocks in the near term, and entire human hemispheres in the future. PMID:25741528
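    The abstract quantifies the overlap between the OCT and Nissl segmentations; a standard measure for that kind of comparison is the Dice coefficient, sketched here on toy binary masks. The specific overlap metric the authors used is an assumption on our part.

```python
import numpy as np

def dice(a, b):
    """Dice coefficient between two binary segmentation masks:
    2*|A ∩ B| / (|A| + |B|), equal to 1 for identical masks."""
    a = a.astype(bool)
    b = b.astype(bool)
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

# Two 16-pixel square masks shifted by one row: 12 pixels overlap.
m1 = np.zeros((10, 10), bool); m1[2:6, 2:6] = True
m2 = np.zeros((10, 10), bool); m2[3:7, 2:6] = True
print(dice(m1, m2))  # → 0.75
```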

  11. On the Cloud Observations in JAXA's Next Coming Satellite Missions

    NASA Technical Reports Server (NTRS)

    Nakajima, Takashi Y.; Nagao, Takashi M.; Letu, Husi; Ishida, Haruma; Suzuki, Kentaroh

    2012-01-01

    The use of JAXA's next-generation satellites, EarthCARE and GCOM-C, for observing overall cloud systems on the Earth is discussed. The satellites will be launched in the mid-2010s and will contribute to observations of aerosols and clouds for climate change, environmental monitoring, weather forecasting, and cloud evolution process studies. This paper describes the role of these satellites and how their observational data will be used, illustrating the concepts with sample viewgraphs. Synergistic use of sensors is a key element of the study. Visible-to-infrared bands are used to discriminate cloudy from clear scenes in passively obtained satellite images. Cloud properties such as cloud optical thickness, effective particle radius, and cloud top temperature will be retrieved from the visible-to-infrared wavelengths of the imagers. Additionally, we are going to combine cloud properties obtained from passive imagers with radar reflectivities obtained from an active radar in order to improve our understanding of the cloud evolution process. This is one of the new techniques of satellite data analysis for cloud science in the next decade. Since climate change and cloud process studies have a mutually beneficial relationship, a multispectral wide-swath imager like the GCOM-C SGLI and a comprehensive cloud and aerosol observation package like the EarthCARE are both necessary.

  12. An Extension of the Split Window Technique for the Retrieval of Precipitable Water: Experimental Verification

    DTIC Science & Technology

    1988-09-23

    AFGL-TR-88-0237 … cloud contamination, aerosol problems, collocation … Collocations were performed on launch sites of the 1200 UT radiosondes on 25 Aug 1987. Statistics were … al. (1987) and Thomason (1987). In this imagery, opaque clouds appear white; low clouds and fog appear bright red against a brown …

  13. Element fracture technique for hypervelocity impact simulation

    NASA Astrophysics Data System (ADS)

    Zhang, Xiao-tian; Li, Xiao-gang; Liu, Tao; Jia, Guang-hui

    2015-05-01

    Hypervelocity impact dynamics is the theoretical support of spacecraft shielding against space debris. Numerical simulation has become an important approach for obtaining the ballistic limits of spacecraft shields. Currently, the most widely used algorithm for hypervelocity impact is smoothed particle hydrodynamics (SPH). Although the finite element method (FEM) is widely used in fracture mechanics and low-velocity impacts, the standard FEM can hardly simulate the debris cloud generated by a hypervelocity impact. This paper presents a successful application of the node-separation technique to hypervelocity impact debris cloud simulation. The node-separation technique assigns individual, coincident nodes to adjacent elements and applies constraints to the coincident node sets in the modeling step. In the explicit iteration, cracks are generated by releasing the constrained node sets that meet the fracture criterion. Additionally, distorted elements are identified from two aspects, self-piercing and phase change, and are deleted so that the constitutive computation can continue. FEM with the node-separation technique is used for thin-wall hypervelocity impact simulations. The internal structures of the debris cloud in the simulation output are compared with those in the test X-ray graphs under different material fracture criteria, which shows that the pressure criterion is more appropriate for hypervelocity impact. The internal structures of the debris cloud are also simulated and compared under different thickness-to-diameter ratios (t/D). The simulation outputs show the same spall pattern as the tests. Finally, the triple-plate impact case is simulated with node-separation FEM.
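    The node-separation bookkeeping described above (private coincident nodes per element, interface constraints released when a fracture criterion is met) can be sketched on a 1D toy mesh. This is only an illustration of the data structure and release rule, not a real explicit FEM solver; the "pressure" values and criterion are stand-ins.

```python
def build_mesh(n_elem):
    """Each 1D element owns two private nodes; interface i ties node 2i+1
    of element i to node 2(i+1) of element i+1 via a constraint."""
    elements = [(2 * i, 2 * i + 1) for i in range(n_elem)]
    constraints = {i: True for i in range(n_elem - 1)}  # True = still tied
    return elements, constraints

def release(constraints, pressures, p_crit):
    """Release (crack open) every interface whose pressure exceeds the
    fracture criterion, mimicking one explicit-iteration check."""
    for i, p in enumerate(pressures):
        if constraints.get(i) and p > p_crit:
            constraints[i] = False
    return constraints

elems, cons = build_mesh(4)                       # 4 elements, 3 interfaces
cons = release(cons, [0.5, 2.0, 0.8], p_crit=1.0)
# interface 1 cracks; interfaces 0 and 2 stay tied
```

    In a full implementation the released node pairs then move independently, which is what lets the mesh break apart into a debris cloud.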

  14. Cloud-point detection using a portable thickness shear mode crystal resonator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mansure, A.J.; Spates, J.J.; Germer, J.W.

    1997-08-01

    The Thickness Shear Mode (TSM) crystal resonator monitors crude oil by propagating a shear wave into the oil. The coupling between the shear wave and the crystal vibrations is a function of the viscosity of the oil. By driving the crystal with circuitry that incorporates feedback, it is possible to detect the change from Newtonian to non-Newtonian viscosity at the cloud point. A portable prototype TSM Cloud Point Detector (CPD) has performed flawlessly during field and lab tests, proving the technique is less subjective and operator-dependent than the ASTM standard. The TSM CPD, in contrast to standard viscosity techniques, makes the measurement in a closed container capable of maintaining up to 100 psi. The closed container minimizes losses of low-molecular-weight volatiles, allowing samples (25 ml) to be retested with the addition of chemicals. By thermally cycling and soaking the sample, the effects of thermal history can be investigated and eliminated as a source of confusion. The CPD is portable and suitable for shipping to field offices for use by personnel without special training or experience in cloud point measurements. As such, it can make cloud point data available without the delays and inconvenience of sending samples to special labs. The crystal resonator technology can be adapted to in-line monitoring of cloud point and deposition detection.
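    One generic way to turn a viscosity-versus-temperature cooling sweep into a cloud point estimate is to fit the Newtonian (Arrhenius-like) trend at high temperature and flag the first departure on cooling. This is a generic sketch of that idea, not the TSM instrument's signal chain; the synthetic data, the step-change wax model, and the 5% tolerance are assumptions.

```python
import numpy as np

def cloud_point(temps, visc, n_fit=5, tol=0.05):
    """Fit log(viscosity) vs 1/T on the hottest n_fit points, then walk
    toward lower T and return the first temperature whose relative
    deviation from the Newtonian fit exceeds tol (None if none does)."""
    order = np.argsort(temps)[::-1]          # hottest first
    T, mu = np.asarray(temps)[order], np.asarray(visc)[order]
    coef = np.polyfit(1.0 / T[:n_fit], np.log(mu[:n_fit]), 1)
    pred = np.exp(np.polyval(coef, 1.0 / T))
    dev = np.abs(mu - pred) / pred
    idx = np.argmax(dev > tol)
    return T[idx] if dev[idx] > tol else None

T = np.arange(320.0, 279.0, -1.0)            # cooling sweep, K
mu = np.exp(1000.0 / T)                      # Newtonian baseline
mu[T < 295.0] *= 1.5                         # wax appears below ~295 K
print(cloud_point(T, mu))  # → 294.0
```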

  15. Different Applications of FORTRACC: From Convective Clouds to thunderstorms and radar fields

    NASA Astrophysics Data System (ADS)

    Morales, C.; Machado, L. A.

    2009-09-01

    The algorithm Forecasting and Tracking the Evolution of Cloud Clusters (ForTraCC), Vila et al. (2008), has been employed operationally in Brazil since 2005 to track and forecast the development of convective clouds. This technique depicts the main morphological features of cloud systems and, most importantly, reconstructs their entire life cycle. Based on this information, several relationships that use the area expansion and the convective and stratiform fractions are employed to predict lifetime duration and cloud area. Because of these features, civil defense agencies and power companies use this information to mitigate damage to the population. Further developments in ForTraCC included the integration of satellite rainfall retrievals, radar fields, and thunderstorm initiation. These improvements address the following problems: a) most satellite rainfall retrievals do not take into account the life-cycle stage, which is a key element in defining the rain area and rain intensity; b) by using the life-cycle information it is possible to better predict the precipitation pattern observed in the radar fields; c) cloud signatures are associated with the development of systems with and without lightning activity. The presentation will give an overview of the different applications of ForTraCC, including case studies and an evaluation of the technique. Finally, the presentation will address how users can obtain the algorithm to implement at their institutes.
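    Tracking cloud clusters between successive images by maximal pixel overlap, with the area-expansion ratio as a life-cycle predictor, can be sketched as follows. This is a generic sketch of the overlap-tracking idea, not the ForTraCC implementation; it uses scipy.ndimage for connected-component labeling.

```python
import numpy as np
from scipy import ndimage

def match_clusters(mask_t0, mask_t1):
    """Label cloud clusters in two successive binary images, match each
    t0 cluster to the t1 cluster with maximal pixel overlap, and return
    {t0_label: (t1_label, area_expansion_ratio)}."""
    lab0, n0 = ndimage.label(mask_t0)
    lab1, n1 = ndimage.label(mask_t1)
    matches = {}
    for c in range(1, n0 + 1):
        overlap = lab1[lab0 == c]
        overlap = overlap[overlap > 0]       # t1 labels under this cluster
        if overlap.size == 0:
            continue                         # cluster dissipated
        tgt = np.bincount(overlap).argmax()  # most-overlapping t1 cluster
        a0 = (lab0 == c).sum()
        a1 = (lab1 == tgt).sum()
        matches[c] = (int(tgt), a1 / a0)
    return matches

t0 = np.zeros((8, 8), int); t0[1:3, 1:3] = 1   # 4-pixel cluster
t1 = np.zeros((8, 8), int); t1[1:4, 1:4] = 1   # grows to 9 pixels
print(match_clusters(t0, t1))  # → {1: (1, 2.25)}
```

    In ForTraCC-style systems the expansion ratio over the early life cycle feeds the empirical relationships that forecast lifetime duration and final cloud area.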

  16. The boiled-egg technique: a new method for obtaining femoral head autograft used in acetabular defect reconstruction.

    PubMed

    Bucknall, Vittoria; Mehdi, Ali

    2013-09-01

    Primary total hip arthroplasty can be complicated by acetabular bony defects, threatening the biomechanical integrity of the prosthesis. Traditionally, when autologous bone is used to pack these defects, it is obtained from thin slices of femoral head in addition to acetabular reamings. We report a novel technique for the acquisition of autologous femoral head bone graft used in the reconstruction of acetabular defects during primary total hip arthroplasty. Copyright © 2013 Elsevier Inc. All rights reserved.

  17. Cloud Processing of Secondary Organic Aerosol from Isoprene and Methacrolein Photooxidation.

    PubMed

    Giorio, Chiara; Monod, Anne; Brégonzio-Rozier, Lola; DeWitt, Helen Langley; Cazaunau, Mathieu; Temime-Roussel, Brice; Gratien, Aline; Michoud, Vincent; Pangui, Edouard; Ravier, Sylvain; Zielinski, Arthur T; Tapparo, Andrea; Vermeylen, Reinhilde; Claeys, Magda; Voisin, Didier; Kalberer, Markus; Doussin, Jean-François

    2017-10-12

    Aerosol-cloud interaction contributes to the largest uncertainties in the estimation and interpretation of the Earth's changing energy budget. The present study explores experimentally the impacts of water condensation-evaporation events, mimicking processes occurring in atmospheric clouds, on the molecular composition of secondary organic aerosol (SOA) from the photooxidation of methacrolein. A range of on- and off-line mass spectrometry techniques were used to obtain a detailed chemical characterization of SOA formed in control experiments in dry conditions, in triphasic experiments simulating gas-particle-cloud droplet interactions (starting from dry conditions and from 60% relative humidity (RH)), and in bulk aqueous-phase experiments. We observed that cloud events trigger fast SOA formation accompanied by evaporative losses. These evaporative losses decreased SOA concentration in the simulation chamber by 25-32% upon RH increase, while aqueous SOA was found to be metastable and slowly evaporated after cloud dissipation. In the simulation chamber, SOA composition, measured with a high-resolution time-of-flight aerosol mass spectrometer, did not change during cloud events compared with high-RH conditions (RH > 80%). In all experiments, off-line mass spectrometry techniques emphasize the critical role of 2-methylglyceric acid as a major product of isoprene chemistry, as an important contributor to the total SOA mass (15-20%) and as a key building block of oligomers found in the particulate phase. Interestingly, the comparison between the series of oligomers obtained from experiments performed under different conditions shows markedly different reactivity. In particular, long reaction times at high RH seem to create the conditions for aqueous-phase processing to occur in a more efficient manner than during two relatively short cloud events.

  18. Marine cloud brightening

    PubMed Central

    Latham, John; Bower, Keith; Choularton, Tom; Coe, Hugh; Connolly, Paul; Cooper, Gary; Craft, Tim; Foster, Jack; Gadian, Alan; Galbraith, Lee; Iacovides, Hector; Johnston, David; Launder, Brian; Leslie, Brian; Meyer, John; Neukermans, Armand; Ormond, Bob; Parkes, Ben; Rasch, Phillip; Rush, John; Salter, Stephen; Stevenson, Tom; Wang, Hailong; Wang, Qin; Wood, Rob

    2012-01-01

    The idea behind the marine cloud-brightening (MCB) geoengineering technique is that seeding marine stratocumulus clouds with copious quantities of roughly monodisperse sub-micrometre sea water particles might significantly enhance the cloud droplet number concentration, and thereby the cloud albedo and possibly longevity. This would produce a cooling, which general circulation model (GCM) computations suggest could—subject to satisfactory resolution of technical and scientific problems identified herein—have the capacity to balance global warming up to the carbon dioxide-doubling point. We describe herein an account of our recent research on a number of critical issues associated with MCB. This involves (i) GCM studies, which are our primary tools for evaluating globally the effectiveness of MCB, and assessing its climate impacts on rainfall amounts and distribution, and also polar sea-ice cover and thickness; (ii) high-resolution modelling of the effects of seeding on marine stratocumulus, which are required to understand the complex array of interacting processes involved in cloud brightening; (iii) microphysical modelling sensitivity studies, examining the influence of seeding amount, seed-particle salt-mass, air-mass characteristics, updraught speed and other parameters on cloud–albedo change; (iv) sea water spray-production techniques; (v) computational fluid dynamics studies of possible large-scale periodicities in Flettner rotors; and (vi) the planning of a three-stage limited-area field research experiment, with the primary objectives of technology testing and determining to what extent, if any, cloud albedo might be enhanced by seeding marine stratocumulus clouds on a spatial scale of around 100×100 km. We stress that there would be no justification for deployment of MCB unless it was clearly established that no significant adverse consequences would result. There would also need to be an international agreement firmly in favour of such action. 
PMID:22869798

  19. Cognitive Approaches for Medicine in Cloud Computing.

    PubMed

    Ogiela, Urszula; Takizawa, Makoto; Ogiela, Lidia

    2018-03-03

    This paper presents the application potential of the cognitive approach to data interpretation, with special reference to medical areas. The possibility of using a meaning-based approach to data description and analysis is proposed for data analysis tasks in Cloud Computing. The methods of cognitive data management in Cloud Computing are intended to support the processes of protecting data against unauthorised takeover and to enhance data management processes. The outcome of the proposed tasks will be the definition of algorithms for executing meaning-based data interpretation processes in secure Cloud Computing. • We propose cognitive methods for data description. • We propose techniques for securing data in Cloud Computing. • An application of cognitive approaches to medicine is described.

  20. Topologic analysis and comparison of brain activation in children with epilepsy versus controls: an fMRI study

    NASA Astrophysics Data System (ADS)

    Oweis, Khalid J.; Berl, Madison M.; Gaillard, William D.; Duke, Elizabeth S.; Blackstone, Kaitlin; Loew, Murray H.; Zara, Jason M.

    2010-03-01

    This paper describes the development of novel computer-aided analysis algorithms to identify language activation patterns in a given Region of Interest (ROI) in Functional Magnetic Resonance Imaging (fMRI). Previous analysis techniques have been used to compare typical and pathologic activation patterns in fMRI images resulting from identical tasks, but none of them analyzed activation topographically in a quantitative manner. This paper presents new analysis techniques and algorithms capable of identifying a pattern of language activation associated with localization-related epilepsy. fMRI images of 64 healthy individuals and 31 patients with localization-related epilepsy have been studied and analyzed on an ROI basis. All subjects are right-handed with normal MRI scans and have been classified into three age groups (4-6, 7-9, 10-12 years). Our initial efforts have focused on investigating activation in the Left Inferior Frontal Gyrus (LIFG). A number of volumetric features have been extracted from the data. The LIFG has been cut into slices and the activation has been investigated topographically on a slice-by-slice basis. Overall, a total of 809 features were extracted, and correlation analysis was applied to eliminate highly correlated features. Principal Component Analysis was then applied to retain only the major components in the data, and one-way Analysis of Variance (ANOVA) was applied to test for significantly different features between the normal and patient groups. Twenty-nine features were found to be significantly different (p<0.05) between the patient and control groups.
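    The correlation-based elimination step in the pipeline above can be sketched generically: keep a feature only if it is not too correlated with any feature already kept. The greedy rule, the 0.95 cutoff, and the synthetic data are assumptions for illustration, not the paper's exact procedure.

```python
import numpy as np

def drop_correlated(X, names, r_max=0.95):
    """Greedy elimination of highly correlated features: a feature is
    kept only if its |r| with every already-kept feature is below r_max."""
    R = np.corrcoef(X, rowvar=False)         # feature-by-feature correlations
    kept = []
    for j in range(X.shape[1]):
        if all(abs(R[j, k]) < r_max for k in kept):
            kept.append(j)
    return X[:, kept], [names[j] for j in kept]

rng = np.random.default_rng(0)
a = rng.normal(size=100)
X = np.column_stack([
    a,                                       # f0
    a * 2.0 + 1e-6 * rng.normal(size=100),   # f1: near-copy of f0
    rng.normal(size=100),                    # f2: independent
])
Xr, keptn = drop_correlated(X, ["f0", "f1", "f2"])
print(keptn)  # → ['f0', 'f2']
```

    The reduced matrix would then feed PCA and per-feature ANOVA, as in the pipeline the abstract describes.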
