Sample records for deep field hdf

  1. Large-scale fluctuations in the number density of galaxies in independent surveys of deep fields

    NASA Astrophysics Data System (ADS)

    Shirokov, S. I.; Lovyagin, N. Yu.; Baryshev, Yu. V.; Gorokhov, V. L.

    2016-06-01

    New arguments supporting the reality of large-scale fluctuations in the density of the visible matter in deep galaxy surveys are presented. A statistical analysis of the radial distributions of galaxies in the COSMOS and HDF-N deep fields is presented. Independent spectral and photometric surveys exist for each field, carried out in different wavelength ranges and using different observing methods. Catalogs of photometric redshifts in the optical (COSMOS-Zphot) and infrared (UltraVISTA) were used for the COSMOS field in the redshift interval 0.1 < z < 3.5, as well as the zCOSMOS (10kZ) spectroscopic survey and the XMM-COSMOS and ALHAMBRA-F4 photometric redshift surveys. The HDFN-Zphot and ALHAMBRA-F5 catalogs of photometric redshifts were used for the HDF-N field. The Pearson correlation coefficient for the fluctuations in the numbers of galaxies obtained for independent surveys of the same deep field reaches R = 0.70 ± 0.16. The presence of this positive correlation supports the reality of fluctuations in the density of visible matter with sizes of up to 1000 Mpc and amplitudes of up to 20% at redshifts z ~ 2. The absence of correlations between the fluctuations in different fields (the correlation coefficient between COSMOS and HDF-N is R = -0.20 ± 0.31) testifies to the independence of structures visible in different directions on the celestial sphere. This also indicates an absence of any influence from universal systematic errors (such as "spectral voids"), which could imitate the detection of correlated structures.
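    The correlation statistic quoted above can be illustrated with a short sketch. This is not the authors' pipeline; it simply computes a Pearson R between hypothetical fractional overdensities measured in matching radial bins of two independent surveys of the same field (all numbers invented).

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical fractional overdensities delta = (N - <N>)/<N> in matching
# radial (redshift) bins from two independent surveys of the same field.
survey_a = [0.12, -0.05, 0.20, -0.15, 0.08, -0.02]
survey_b = [0.10, -0.08, 0.15, -0.10, 0.05, 0.01]
print(round(pearson_r(survey_a, survey_b), 2))
```

    A strongly positive R between independent surveys of one field, together with R consistent with zero between different fields, is the signature the paper uses to argue the fluctuations are real rather than instrumental.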

  2. The Chandra Deep Field-North Survey and the cosmic X-ray background.

    PubMed

    Brandt, W Nielsen; Alexander, David M; Bauer, Franz E; Hornschemeier, Ann E

    2002-09-15

    Chandra has performed a 1.4 Ms survey centred on the Hubble Deep Field-North (HDF-N), probing the X-ray Universe 55-550 times deeper than was possible with pre-Chandra missions. We describe the detected point and extended X-ray sources and discuss their overall multi-wavelength (optical, infrared, submillimetre and radio) properties. Special attention is paid to the HDF-N X-ray sources, luminous infrared starburst galaxies, optically faint X-ray sources and high-to-extreme redshift active galactic nuclei. We also describe how stacking analyses have been used to probe the average X-ray-emission properties of normal and starburst galaxies at cosmologically interesting distances. Finally, we discuss plans to extend the survey and argue that a 5-10 Ms Chandra survey would lay key groundwork for future missions such as XEUS and Generation-X.

  3. REVIEWS OF TOPICAL PROBLEMS: Sky surveys and deep fields of ground-based and space telescopes

    NASA Astrophysics Data System (ADS)

    Reshetnikov, Vladimir P.

    2005-11-01

    Selected results obtained in major observational sky surveys (DSS, 2MASS, 2dF, SDSS) and deep field observations (HDF, GOODS, UHDF, etc.) are reviewed. Modern surveys provide information on the characteristics and space distribution of millions of galaxies. Deep fields allow one to study galaxies at the stage of formation and to trace their evolution over billions of years. The wealth of observational data is altering the face of modern astronomy: the formulation of problems and their solutions are changing and all the previous knowledge, from planetary studies in the solar system to the most distant galaxies and quasars, is being revised.

  4. My Most Memorable AAS Meeting, or How Stephen Hawking's Chauffeur and Chubby Wise's Fiddle Are Related to the Hubble Deep Field (At Least In My Mind and Experience!)

    NASA Astrophysics Data System (ADS)

    Lucas, R. A.

    1999-05-01

    Sometimes, in the most extraordinary conditions and times, strange things happen which remind us of just how small a world we really inhabit, and how so many varied things may suddenly be juxtaposed in our lives, and in the lives of others. My most memorable AAS meeting involves not only the meeting but events while getting there. It was January 1996, and we had just finished our observations and initial data reduction of the Hubble Deep Field, the members of the HDF working group doggedly coming in to the STScI by various means over the December holidays and the New Year, in the midst of several blizzards which even closed STScI for a number of days. Not surprisingly, work on the HDF AAS presentations was ongoing until the last minute, until people left snowy Baltimore for sunny San Antonio. My street was plowed for the first time in a week a few hours before my 6AM flight, so after digging out my car, with no time for sleep, between 3AM and 6AM on the morning I left, I soon discovered my own surprising connections between Stephen Hawking's chauffeur, Chubby Wise's fiddle, and the Hubble Deep Field. I'll elaborate in this paper if you're curious!

  5. ESO imaging survey: infrared observations of CDF-S and HDF-S

    NASA Astrophysics Data System (ADS)

    Olsen, L. F.; Miralles, J.-M.; da Costa, L.; Benoist, C.; Vandame, B.; Rengelink, R.; Rité, C.; Scodeggio, M.; Slijkhuis, R.; Wicenec, A.; Zaggia, S.

    2006-06-01

    This paper presents infrared data obtained from observations carried out at the ESO 3.5 m New Technology Telescope (NTT) of the Hubble Deep Field South (HDF-S) and the Chandra Deep Field South (CDF-S). These data were taken as part of the ESO Imaging Survey (EIS) program, a public survey conducted by ESO to promote follow-up observations with the VLT. In the HDF-S field the infrared observations cover an area of ~53 square arcmin, encompassing the HST WFPC2 and STIS fields, in the JHKs passbands. The seeing measured in the final stacked images ranges from 0.79 arcsec to 1.22 arcsec, and the median limiting magnitudes (AB system, 2'' aperture, 5σ detection limit) are J_AB˜23.0, H_AB˜22.8 and K_AB˜23.0 mag. Less complete data are also available in JKs for the adjacent HST NICMOS field. For CDF-S, the infrared observations cover a total area of ~100 square arcmin, reaching median limiting magnitudes (as defined above) of J_AB˜23.6 and K_AB˜22.7 mag. For one CDF-S field, H-band data are also available. This paper describes the observations and presents the results of new reductions carried out entirely with the unsupervised, high-throughput EIS Data Reduction System and its associated EIS/MVM C++-based image processing library, developed over the past 5 years by the EIS project and now publicly available. The paper also presents source catalogs extracted from the final co-added images, which are used to evaluate the scientific quality of the survey products, and hence the performance of the software. This is done by comparing the results obtained in the present work with those obtained by other authors from independent data and/or reductions carried out with different software packages and techniques. The final science-grade catalogs, together with the astrometrically and photometrically calibrated co-added images, are available at CDS.

  6. The Great Observatories Origins Deep Survey (GOODS): Overview and Status

    NASA Astrophysics Data System (ADS)

    Hook, R. N.; GOODS Team

    2002-12-01

    GOODS is a very large project to gather deep imaging data and spectroscopic follow-up of two fields, the Hubble Deep Field North (HDF-N) and the Chandra Deep Field South (CDF-S), with both space- and ground-based instruments, to create an extensive multiwavelength public data set for community research on the distant Universe. GOODS includes a SIRTF Legacy Program (PI: Mark Dickinson) and a Hubble Treasury Program of ACS imaging (PI: Mauro Giavalisco). The ACS imaging was also optimized for the detection of high-z supernovae, which are being followed up by a further target-of-opportunity Hubble GO Program (PI: Adam Riess). The bulk of the CDF-S ground-based data presently available comes from an ESO Large Programme (PI: Catherine Cesarsky), which includes both deep imaging and multi-object follow-up spectroscopy. This is currently complemented in the South by additional CTIO imaging. Currently available HDF-N ground-based data forming part of GOODS includes NOAO imaging. Although the SIRTF part of the survey will not begin until later in the year, the ACS imaging is well advanced, and there is also a huge body of complementary ground-based imaging and some follow-up spectroscopy which is already publicly available. We summarize the current status of GOODS, give an overview of the data products currently available, and present the timescales for the future. Many early science results from the survey are presented in other GOODS papers at this meeting. Support for the HST GOODS program presented here and in companion abstracts was provided by NASA through grant number GO-9425 from the Space Telescope Science Institute, which is operated by the Association of Universities for Research in Astronomy, Incorporated, under NASA contract NAS5-26555.

  7. PHOTOMETRIC REDSHIFTS IN THE HAWAII-HUBBLE DEEP FIELD-NORTH (H-HDF-N)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, G.; Xue, Y. Q.; Kong, X.

    2015-01-01

    We derive photometric redshifts (z_phot) for sources in the entire (~0.4 deg²) Hawaii-Hubble Deep Field-North (H-HDF-N) field with the EAzY code, based on point-spread-function-matched photometry of 15 broad bands from the ultraviolet (U band) to mid-infrared (IRAC 4.5 μm). Our catalog consists of a total of 131,678 sources. We evaluate the z_phot quality by comparing z_phot with spectroscopic redshifts (z_spec) when available, and find a normalized median absolute deviation σ_NMAD = 0.029 and an outlier fraction of 5.5% (outliers are defined as sources having |z_phot − z_spec|/(1 + z_spec) > 0.15) for non-X-ray sources. More specifically, we obtain σ_NMAD = 0.024 with 2.7% outliers for sources brighter than R = 23 mag, σ_NMAD = 0.035 with 7.4% outliers for sources fainter than R = 23 mag, σ_NMAD = 0.026 with 3.9% outliers for sources having z < 1, and σ_NMAD = 0.034 with 9.0% outliers for sources having z > 1. Our z_phot quality shows an overall improvement over an earlier z_phot work that focused only on the central H-HDF-N area. We also classify each object as a star or galaxy through template spectral energy distribution fitting and complementary morphological parameterization, resulting in 4959 stars and 126,719 galaxies. Furthermore, we match our catalog with the 2 Ms Chandra Deep Field-North main X-ray catalog. For the 462 matched non-stellar X-ray sources (281 having z_spec), we improve their z_phot quality by adding three additional active galactic nucleus templates, achieving σ_NMAD = 0.035 and an outlier fraction of 12.5%. We make our catalog publicly available, presenting both photometry and z_phot, and provide guidance on how to make use of our catalog.
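    The quality metrics quoted here (σ_NMAD and the outlier fraction) follow standard definitions in the photometric-redshift literature. A minimal sketch, assuming the common convention σ_NMAD = 1.48 × median(|Δz − median(Δz)|) with Δz = (z_phot − z_spec)/(1 + z_spec), and using invented example values:

```python
from statistics import median

def zphot_quality(z_phot, z_spec, outlier_cut=0.15):
    """Return (sigma_NMAD, outlier fraction) for paired redshift estimates."""
    # Normalized residuals between photometric and spectroscopic redshifts.
    dz = [(p - s) / (1 + s) for p, s in zip(z_phot, z_spec)]
    med = median(dz)
    # Normalized median absolute deviation, scaled to mimic a Gaussian sigma.
    sigma_nmad = 1.48 * median(abs(d - med) for d in dz)
    # Catastrophic outliers: |dz| exceeding the chosen cut (0.15 here).
    outlier_frac = sum(abs(d) > outlier_cut for d in dz) / len(dz)
    return sigma_nmad, outlier_frac

# Invented example: three good estimates and one catastrophic outlier.
sigma, frac = zphot_quality([1.0, 1.1, 0.9, 2.0], [1.0, 1.0, 1.0, 1.0])
```

    Normalizing by (1 + z_spec) is what makes a single σ_NMAD value meaningful across the whole redshift range.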

  8. The Nature of Radio Emission from Distant Galaxies: The 1.4 GHZ Observations

    NASA Astrophysics Data System (ADS)

    Richards, E. A.

    2000-04-01

    We have conducted a deep radio survey with the Very Large Array at 1.4 GHz of a region containing the Hubble Deep Field (HDF). This survey overlaps previous observations at 8.5 GHz, allowing us to investigate the radio spectral properties of microjansky sources to flux densities greater than 40 μJy at 1.4 GHz and greater than 8 μJy at 8.5 GHz. A total of 371 sources have been cataloged at 1.4 GHz as part of a complete sample within 20' of the HDF. The differential source count for this region is only marginally sub-Euclidean and is given by n(S) = (8.3 ± 0.4) S^(−2.4 ± 0.1) sr⁻¹ Jy⁻¹. Above about 100 μJy the radio source count is systematically lower in the HDF as compared to other fields. We conclude that there is clustering in our radio sample on size scales of 1'-40'. The 1.4 GHz-selected sample shows that the radio spectral indices are preferentially steep (α_1.4 = 0.85) and that the sources are moderately extended, with average angular size θ = 1.8". Optical identification with disk-type systems at z ~ 0.1-1 suggests that synchrotron emission, produced by supernova remnants, is powering the radio emission in the majority of sources. The 8.5 GHz sample contains primarily moderately flat-spectrum sources (α_8.5 = 0.35), with less than 15% inverted. We argue that we may be observing an increased fraction of optically thin bremsstrahlung over synchrotron radiation in these distant star-forming galaxies.
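    A hedged sketch of how such a differential count translates into expected source numbers: integrating n(S) = k S^−γ above a flux limit gives N(>S) = k/(γ−1) · S^(1−γ) per steradian. The error terms on k and γ are ignored here, and the field radius below is just an illustration, not a claim about the survey geometry.

```python
from math import pi

def cumulative_count_per_sr(s_jy, k=8.3, gamma=2.4):
    """N(>S) per steradian implied by n(S) = k * S**-gamma (requires gamma > 1)."""
    return k / (gamma - 1.0) * s_jy ** (1.0 - gamma)

# Steradian-to-square-arcmin conversion for small survey areas.
ARCMIN2_PER_SR = (180.0 / pi * 60.0) ** 2

def expected_sources(s_jy, radius_arcmin):
    """Expected number of sources above flux s_jy in a circular field."""
    area_sr = pi * radius_arcmin ** 2 / ARCMIN2_PER_SR
    return cumulative_count_per_sr(s_jy) * area_sr
```

    The steep slope (γ ≈ 2.4 > 2) means the integral count is dominated by sources near the flux limit, which is why the microjansky population grows so quickly in deeper images.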

  9. Wide Field Imaging of the Hubble Deep Field-South Region III: Catalog

    NASA Technical Reports Server (NTRS)

    Palunas, Povilas; Collins, Nicholas R.; Gardner, Jonathan P.; Hill, Robert S.; Malumuth, Eliot M.; Rhodes, Jason; Teplitz, Harry I.; Woodgate, Bruce E.

    2002-01-01

    We present 1/2 square degree uBVRI imaging around the Hubble Deep Field - South. These data have been used in earlier papers to examine the QSO population and the evolution of the correlation function in the region around the HDF-S. The images were obtained with the Big Throughput Camera at CTIO in September 1998. The images reach 5 sigma limits of u approx. 24.4, B approx. 25.6, V approx. 25.3, R approx. 24.9 and I approx. 23.9. We present a catalog of approx. 22,000 galaxies. We also present number-magnitude counts and a comparison with other observations of the same field. The data presented here are available over the world wide web.

  10. New insights on the accuracy of photometric redshift measurements

    NASA Astrophysics Data System (ADS)

    Massarotti, M.; Iovino, A.; Buzzoni, A.; Valls-Gabaud, D.

    2001-12-01

    We use the deepest and most complete redshift catalog currently available (the Hubble Deep Field (HDF) North supplemented by new HDF South redshift data) to minimize residuals between photometric and spectroscopic redshift estimates. The good agreement at zspec < 1.5 shows that model libraries provide a good description of the galaxy population. At zspec >= 2.0, the systematic shift between photometric and spectroscopic redshifts decreases when the modeling of the absorption by the interstellar and intergalactic media is refined. As a result, in the entire redshift range z in [0, 6], residuals between photometric and spectroscopic redshifts are roughly halved. For objects fainter than the spectroscopic limit, the main source of uncertainty in photometric redshifts is related to photometric errors, and can be assessed with Monte Carlo simulations.

  11. ISOCAM observations of the Hubble Deep Field reduced with the PRETI method

    NASA Astrophysics Data System (ADS)

    Aussel, H.; Cesarsky, C. J.; Elbaz, D.; Starck, J. L.

    1999-02-01

    We have developed a new ISOCAM data reduction technique based on wavelet analysis, especially designed for the detection of faint sources in mid-infrared surveys. This method, the Pattern REcognition Technique for Isocam data (PRETI), has been used to reduce the observations of the Hubble Deep Field (HDF) and flanking fields with ISOCAM at 6.75 μm (LW2) and 15 μm (LW3) (Rowan-Robinson et al.). Simulations of ISOCAM data allow us to test the photometric accuracy and completeness of the reduction. According to these simulations, the PRETI source list is 95% complete in the 15 μm band at 200 μJy and in the 6.75 μm band at 65 μJy, using detection thresholds which minimize the number of false detections. We detect 49 objects in the ISO-HDF at a secure, high-confidence level: 42 in the LW3 filter, 3 in the LW2 filter, and 4 in both filters. An additional, less secure, list of 100 sources is presented, of which 89 are detected at 15 μm only, 7 at 6.75 μm only and 4 in both filters. All ISO-HDF objects detected in the HDF itself have optical or infrared counterparts, except for one from the additional list. All except one of the radio sources detected in the field by Fomalont et al. are detected with ISOCAM. Using a precise correction for the field-of-view distortion of ISOCAM allows us to separate blended sources. This, together with the fact that PRETI allows data on the tails of cosmic-ray glitches to be corrected, has led us to produce deeper source lists than previous authors. Our list of bright sources agrees with that of Désert et al. in both filters, and with that of Goldschmidt et al. in the LW3 filter, with systematic differences in photometry. Number counts derived from our results show an excess by a factor of 10 with respect to the prediction of a no-evolution model (Franceschini) in the LW3 band. On the contrary, the number of sources in the LW2 band is compatible with the prediction of such a model, but with greater uncertainties, given the small number of detections. Based on observations with the Infrared Space Observatory (ISO). ISO is an ESA project with instruments funded by ESA Member States (especially the PI countries: France, Germany, The Netherlands and the United Kingdom) and with participation of ISAS and NASA.

  12. HDF-EOS Dump Tools

    NASA Astrophysics Data System (ADS)

    Prasad, U.; Rahabi, A.

    2001-05-01

    The following utilities, developed for dumping data in the HDF-EOS format, are of special use for Earth science data from NASA's Earth Observing System (EOS). This poster demonstrates their use and application. The first four tools take HDF-EOS data files as input. HDF-EOS Metadata Dumper (metadmp): extracts metadata from EOS data granules. It operates by simply copying blocks of metadata from the file to the standard output; it does not process the metadata in any way. Since all metadata in EOS granules is encoded in the Object Description Language (ODL), the output of metadmp will be in the form of complete ODL statements. EOS data granules may contain up to three different sets of metadata (Core, Archive, and Structural Metadata). HDF-EOS Contents Dumper (heosls): displays the contents of HDF-EOS files. This utility provides detailed information on the POINT, SWATH, and GRID data sets in the files; for example, it will list the geolocation fields, data fields, and objects. HDF-EOS ASCII Dumper (asciidmp): extracts fields from EOS data granules into plain ASCII text. The output from asciidmp should be easily human-readable; with minor editing, it can be made ingestible by any application with ASCII import capabilities. HDF-EOS Binary Dumper (bindmp): dumps HDF-EOS objects in binary format. This is useful for feeding its output into existing programs that do not understand HDF, such as custom software and COTS products. HDF-EOS User Friendly Metadata (UFM): a tool for viewing ECS metadata. UFM takes an EOSDIS ODL metadata file and produces an HTML report of the metadata for display in a web browser. HDF-EOS METCHECK: can be invoked from either a Unix or a DOS environment with a set of command-line options that direct the tool's inputs and output. METCHECK validates the inventory metadata (a .met file) using the descriptor file (.desc) as the reference. The tool takes a .desc file and a .met ODL file as inputs, and generates a simple output file containing the results of the checking process.
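    The abstract notes that metadmp works by "simply copying blocks of metadata from the file to the standard output." A rough, pure-Python sketch of that idea follows; it is not the real tool, and treating GROUP=…END spans as the block boundaries is a simplifying assumption about how ODL statements appear in a granule.

```python
def extract_odl_blocks(raw: bytes) -> str:
    """Locate ODL metadata text (GROUP= ... END) in a granule byte stream
    and return it verbatim, without parsing -- in the spirit of metadmp."""
    text = raw.decode("latin-1")  # tolerate arbitrary binary bytes
    blocks = []
    start = text.find("GROUP=")
    while start != -1:
        end = text.find("\nEND\n", start)
        if end == -1:
            break
        blocks.append(text[start:end + len("\nEND\n")])
        start = text.find("GROUP=", end)
    return "".join(blocks)
```

    In a real granule there may be up to three such metadata sets (Core, Archive, and Structural), all of which this scan would emit in file order.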

  13. The GISMO two-millimeter deep field in GOODS-N

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Staguhn, Johannes G.; Kovács, Attila; Arendt, Richard G.

    2014-07-20

    We present deep continuum observations using the GISMO camera at a wavelength of 2 mm, centered on the Hubble Deep Field in the GOODS-N field. These are the first deep field observations ever obtained at this wavelength. The 1σ sensitivity in the innermost ∼4' of the 7' diameter map is ∼135 μJy beam⁻¹, a factor of three higher in flux/beam sensitivity than the deepest available SCUBA 850 μm observations, and almost a factor of four higher in flux/beam sensitivity than the combined MAMBO/AzTEC 1.2 mm observations of this region. Our source extraction algorithm identifies 12 sources directly, and another 3 through correlation with known sources at 1.2 mm and 850 μm. Five of the directly detected GISMO sources have counterparts in the MAMBO/AzTEC catalog, and four of those also have SCUBA counterparts. HDF850.1, one of the first blank-field detected submillimeter galaxies, is now detected at 2 mm. The median redshift of all sources with counterparts of known redshifts is z ≈ 2.91 ± 0.94. Statistically, the detections are most likely real for five of the seven 2 mm sources without shorter wavelength counterparts, while the probability of none of them being real is negligible.

  14. The Drizzling Cookbook

    NASA Astrophysics Data System (ADS)

    Gonzaga, S.; Biretta, J.; Wiggs, M. S.; Hsu, J. C.; Smith, T. E.; Bergeron, L.

    1998-12-01

    The drizzle software combines dithered images while preserving photometric accuracy, enhancing resolution, and removing geometric distortion. A recent upgrade also allows removal of cosmic rays from single images at each dither pointing. This document gives detailed examples illustrating drizzling procedures for six cases: WFPC2 observations of a deep field, a crowded field, a large galaxy, and a planetary nebula; STIS/CCD observations of an HDF-North field; and NICMOS/NIC2 observations of the Egg Nebula. Command scripts and input images for each example are available on the WFPC2 website. Users are encouraged to retrieve the data for the case that most closely resembles their own, and then practice and experiment with drizzling the example.

  15. Making Data Mobile: The Hubble Deep Field Academy iPad app

    NASA Astrophysics Data System (ADS)

    Eisenhamer, Bonnie; Cordes, K.; Davis, S.; Eisenhamer, J.

    2013-01-01

    Many school districts are purchasing iPads for educators and students to use as learning tools in the classroom. Educators often prefer these devices to desktop and laptop computers because they offer portability and an intuitive design, while having a larger screen size than smart phones. As a result, we began investigating the potential of adapting online activities for use on Apple's iPad to enhance the dissemination and usage of these activities in instructional settings while continuing to meet educators' needs. As a pilot effort, we are developing an iPad app for the "Hubble Deep Field Academy" - an activity that is currently available online and commonly used by middle school educators. The Hubble Deep Field Academy app features the HDF-North image while centering on the theme of how scientists use light to explore and study the universe. It also includes features such as embedded links to vocabulary, images and videos, teacher background materials, and readings about Hubble's other deep field surveys. Our goal is to impact students' engagement in STEM-related activities, while enhancing educators' usage of NASA data via new and innovative media. We also hope to develop and share lessons learned with the E/PO community that can be used to support similar projects. We plan to test the Hubble Deep Field Academy app during the school year to determine if this new activity format is beneficial to the education community.

  16. On-line hemodiafiltration. Gold standard or top therapy?

    PubMed

    Passlick-Deetjen, Jutta; Pohlmeier, Robert

    2002-01-01

    In summary, on-line HDF is an extracorporeal blood purification therapy with increased convective removal of uremic toxins compared to the most frequently used low- or high-flux HD therapies. The clinical advantages of on-line HDF have been shown to be dose dependent, which makes on-line HDF superior to other therapies with less convective solute removal. Among the therapies with high convective solute removal, i.e. on-line HDF, on-line HF and double high-flux dialysis, it is difficult to decide definitively on the best therapy, as direct comparisons of these therapies have not been performed. Theoretical considerations, such as the lower achievable Kt/V_urea with on-line HF relative to on-line HDF, allow one to state that on-line HDF is the top therapy now available for patients with ESRD. A gold standard may be defined as that against which everything else in the respective field is compared. In order to declare on-line HDF the gold standard in renal replacement therapy, we need more direct comparisons of on-line HDF with other therapies, including mortality as an outcome parameter. However, based on our current knowledge, it does not seem too speculative that high-quality clinical studies will establish on-line HDF in the coming years as the new gold standard in renal replacement therapy.

  17. NASA Tech Briefs, December 2008

    NASA Technical Reports Server (NTRS)

    2008-01-01

    Topics covered include: Crew Activity Analyzer; Distributing Data to Hand-Held Devices in a Wireless Network; Reducing Surface Clutter in Cloud Profiling Radar Data; MODIS Atmospheric Data Handler; Multibeam Altimeter Navigation Update Using Faceted Shape Model; Spaceborne Hybrid-FPGA System for Processing FTIR Data; FPGA Coprocessor for Accelerated Classification of Images; SiC JFET Transistor Circuit Model for Extreme Temperature Range; TDR Using Autocorrelation and Varying-Duration Pulses; Update on Development of SiC Multi-Chip Power Modules; Radio Ranging System for Guidance of Approaching Spacecraft; Electromagnetically Clean Solar Arrays; Improved Short-Circuit Protection for Power Cells in Series; Logic Gates Made of N-Channel JFETs and Epitaxial Resistors; Communication Limits Due to Photon-Detector Jitter; System for Removing Pollutants from Incinerator Exhaust; Sealing and External Sterilization of a Sample Container; Converting EOS Data from HDF-EOS to netCDF; HDF-EOS 2 and HDF-EOS 5 Compatibility Library; HDF-EOS Web Server; HDF-EOS 5 Validator; XML DTD and Schemas for HDF-EOS; Converting from XML to HDF-EOS; Simulating Attitudes and Trajectories of Multiple Spacecraft; Specialized Color Function for Display of Signed Data; Delivering Alert Messages to Members of a Work Force; Delivering Images for Mars Rover Science Planning; Oxide Fiber Cathode Materials for Rechargeable Lithium Cells; Electrocatalytic Reduction of Carbon Dioxide to Methane; Heterogeneous Superconducting Low-Noise Sensing Coils; Progress toward Making Epoxy/Carbon-Nanotube Composites; Predicting Properties of Unidirectional-Nanofiber Composites; Deployable Crew Quarters; Nonventing, Regenerable, Lightweight Heat Absorber; Miniature High-Force, Long-Stroke SMA Linear Actuators; "Bootstrap" Configuration for Multistage Pulse-Tube Coolers; Reducing Liquid Loss during Ullage Venting in Microgravity; Ka-Band Transponder for Deep-Space Radio Science; Replication of Space-Shuttle Computers in FPGAs and ASICs; Demisable Reaction-Wheel Assembly; Spatial and Temporal Low-Dimensional Models for Fluid Flow; Advanced Land Imager Assessment System; Range Imaging without Moving Parts.

  18. The nature of radio emission from distant galaxies

    NASA Astrophysics Data System (ADS)

    Richards, Eric A.

    I describe an observational program aimed at understanding the radio emission from distant, rapidly evolving galaxy populations. These observations were carried out at 1.4 and 8.5 GHz with the VLA, centered on the Hubble Deep Field. Further MERLIN observations of the HDF region at 1.4 GHz provided an angular resolution of 0.2'' and, when combined with the VLA data, produced an image with an unprecedented rms noise of 4 μJy. All radio sources detected in the VLA complete sample are resolved, with a median angular size of 1-2''. The differential count of the radio sources is marginally sub-Euclidean (γ = −2.4 ± 0.1), and fluctuation analysis suggests nearly 60 sources per arcmin² are present at the 1 μJy level. A correlation analysis indicates spatial clustering among the 371 radio sources on angular scales of 1-40 arcmin. Optical identifications are made primarily with bright (I = 22) disk systems composed of irregulars, peculiars, interacting/merging galaxies, and a few isolated field spirals. Available redshifts span the range 0.2-3. These clues, coupled with the steep spectral index of the 1.4 GHz selected sample, are indicative of diffuse synchrotron radiation in distant galactic disks. Thus the evolution in the microjansky radio population is driven principally by star formation. I have isolated a number of optically faint radio sources (about 25% of the overall sample) which remain unidentified to I = 26-28 in the HDF and flanking optical fields. Several of these objects have extremely red counterparts and constitute a new class of radio sources which are candidate high-redshift dusty protogalaxies.

  19. HDF-EOS 2 and HDF-EOS 5 Compatibility Library

    NASA Technical Reports Server (NTRS)

    Ullman, Richard; Bane, Bob; Yang, Jingli

    2008-01-01

    The HDF-EOS 2 and HDF-EOS 5 Compatibility Library contains C-language functions that provide uniform access to HDF-EOS 2 and HDF-EOS 5 files through one set of application programming interface (API) calls. ("HDF-EOS 2" and "HDF-EOS 5" are defined in the immediately preceding article.) Without this library, differences between the APIs of HDF-EOS 2 and HDF-EOS 5 would necessitate writing different programs to cover HDF-EOS 2 and HDF-EOS 5. The API associated with this library is denoted "he25." For nearly every HDF-EOS 5 API call, there is a corresponding he25 API call. If a file in question is in the HDF-EOS 5 format, the code passes through to the corresponding HDF-EOS 5 call; if the file is in the HDF-EOS 2 format, the code translates the arguments to HDF-EOS 2 equivalents (if necessary), makes the HDF-EOS 2 call, and retranslates the results back to HDF-EOS 5 form (if necessary).
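    The dispatch behavior described above can be sketched in a few lines. This is an illustrative analogue in Python, not the real C library: the magic-number checks are the documented HDF5 and HDF4 file signatures, but every other name here (He25, open_swath, the backend objects) is hypothetical.

```python
def detect_format(path):
    """Toy format probe using the on-disk signatures: HDF5 files begin
    with b'\\x89HDF', HDF4 files with b'\\x0e\\x03\\x13\\x01'."""
    with open(path, "rb") as f:
        magic = f.read(4)
    if magic == b"\x89HDF":
        return 5
    if magic == b"\x0e\x03\x13\x01":
        return 2
    raise ValueError("not an HDF file: %r" % path)

class He25:
    """Uniform front end dispatching to two version-specific backends."""
    def __init__(self, backend2, backend5):
        self.backends = {2: backend2, 5: backend5}
    def open_swath(self, path, name):
        version = detect_format(path)
        # Argument translation for the v2 backend would happen here.
        return self.backends[version].open_swath(path, name)
```

    The key design point is that callers see one API; the format probe and any argument translation are hidden inside the wrapper, exactly as the abstract describes for he25.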

  20. Building the Case for SNAP: Creation of Multi-Band, Simulated Images With Shapelets

    NASA Technical Reports Server (NTRS)

    Ferry, Matthew A.

    2005-01-01

    Dark energy has simultaneously been the most elusive and most important phenomenon in the shaping of the universe. A case is being built for a proposed space telescope called SNAP (SuperNova Acceleration Probe), a crucial component of which is image simulation. One method for this is "Shapelets," developed at Caltech. Shapelets form an orthonormal basis and are uniquely able to represent realistic space images and create new images based on real ones. Previously, simulations were created using the Hubble Deep Field (HDF) as a basis set in one band. In this project, image simulations are created using the 4 bands of the Hubble Ultra Deep Field (UDF) as a basis set. This provides a better basis for simulations because (1) the survey is deeper, (2) it has higher resolution, and (3) this is a step closer to simulating the 9 bands of SNAP. Image simulations are achieved by detecting sources in the UDF, decomposing them into shapelets, tweaking their parameters in realistic ways, and recomposing them into new images. Morphological tests were also run to verify the realism of the simulations. The simulations have a wide variety of uses, including the creation of weak gravitational lensing simulations.

  21. Constraints on z~10 Galaxies from the Deepest Hubble Space Telescope NICMOS Fields

    NASA Astrophysics Data System (ADS)

    Bouwens, R. J.; Illingworth, G. D.; Thompson, R. I.; Franx, M.

    2005-05-01

    We use all available fields with deep NICMOS imaging to search for J110-dropouts (H160,AB<~28) at z~10. Our primary data set for this search is the two J110+H160 NICMOS fields taken in parallel with the Advanced Camera for Surveys (ACS) Hubble Ultra Deep Field (UDF). The 5 σ limiting magnitudes were ~28.6 in J110 and ~28.5 in H160 (0.6" apertures). Several shallower fields were also used: J110+H160 NICMOS frames available over the Hubble Deep Field (HDF) North, the HDF-South NICMOS parallel, and the ACS UDF (with 5 σ limiting magnitudes in J110 and H160 ranging from 27.0 to 28.2). The primary selection criterion was (J110-H160)AB>1.8. Eleven such sources were found in all search fields using this criterion. Eight of these are clearly ruled out as credible z~10 sources, either as a result of detections (>2 σ) blueward of J110 or their colors redward of the break (H160-K~1.5) (redder than >~98% of lower redshift dropouts). The nature of the three remaining sources could not be determined from the data. This number appears consistent with the expected contamination from low-redshift interlopers. Analysis of the stacked images for the three candidates also suggests some contamination. Regardless of their true redshifts, the actual number of z~10 sources must be three or fewer. To assess the significance of these results, two lower redshift samples (a z~3.8 B-dropout and z~6 i-dropout sample) were projected to z~7-13 using a (1+z)-1 size scaling (for fixed luminosity). They were added to the image frames and the selection was repeated, giving 15.6 and 4.8 J110-dropouts, respectively. This suggests that to the limit of this probe (~0.3L*z=3), there has been evolution from z~3.8 and possibly from z~6. This is consistent with the strong evolution already noted at z~6 and z~7.5 relative to z~3-4. 
Even assuming that three sources from this probe are at z~10, the rest-frame continuum UV (~1500 Å) luminosity density at z~10 (integrated down to 0.3 L*(z=3)) is just 0.19 (+0.13/-0.09) times that at z~3.8 (or 0.19 (+0.15/-0.10) times, including the small effect from cosmic variance). However, if none of our sources are at z~10, this ratio has a 1 σ upper limit of 0.07. Based on observations made with the NASA/ESA Hubble Space Telescope, which is operated by the Association of Universities for Research in Astronomy, Inc., under NASA contract NAS 5-26555.

  2. Hierarchical Data Formats (HDF) Update

    NASA Technical Reports Server (NTRS)

    Pourmal, Elena

    2017-01-01

In this presentation, we will talk about the latest releases of the HDF4 and HDF5 software and tools, new features available in HDF5, and the roadmap for the HDF software. We will also solicit feedback from users of HDF data and HDF application developers on new features and new tools. The talk will cover: the differences between the 1.8 and 1.10 releases, and how and when to move to the latest release; features of the recent HDF5 1.8.19, 1.10.1, and HDF 4.2.13 releases; an overview of HDFView 3.0 and other enhancements to the tools; supported compilers and systems; an open discussion of new requirements and a wish list of HDF features; a compression library for interoperability with h5py and Pandas; and better floating-point data compression.

  3. Statistical analysis of the horizontal divergent flow in emerging solar active regions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Toriumi, Shin; Hayashi, Keiji; Yokoyama, Takaaki, E-mail: shin.toriumi@nao.ac.jp

Solar active regions (ARs) are thought to be formed by magnetic fields from the convection zone. Our flux emergence simulations revealed that a strong horizontal divergent flow (HDF) of unmagnetized plasma appears at the photosphere before the flux begins to emerge. In our earlier study, we analyzed HMI data for a single AR and confirmed the presence of this precursor plasma flow in the actual Sun. In this paper, as an extension of our earlier study, we conducted a statistical analysis of the HDFs to further investigate their characteristics and better determine their properties. From SDO/HMI data, we picked up 23 flux emergence events over a period of 14 months, the total flux of which ranges from 10^20 to 10^22 Mx. Out of the 23 selected events, 6 clear HDFs were detected by the method we developed in our earlier study, and 7 HDFs detected by visual inspection were added to this statistical analysis. We found that the duration of the HDF is on average 61 minutes and the maximum HDF speed is on average 3.1 km s^-1. We also estimated the rising speed of the subsurface magnetic flux to be 0.6-1.4 km s^-1. These values are highly consistent with our previous one-event analysis as well as our simulation results. The observational results lead us to the conclusion that the HDF is a rather common feature in the earliest phase of AR emergence. Moreover, our HDF analysis has the capability of determining the subsurface properties of emerging fields that cannot be directly measured.

  4. Hemodiafiltration history, technology, and clinical results.

    PubMed

    Ronco, Claudio; Cruz, Dinna

    2007-07-01

    Hemodiafiltration (HDF) is an extracorporeal renal-replacement technique using a highly permeable membrane, in which diffusion and convection are conveniently combined to enhance solute removal in a wide spectrum of molecular weights. In this modality, ultrafiltration exceeds the desired fluid loss in the patient, and replacement fluid must be administered to achieve the target fluid balance. Over the years, various HDF variants have emerged, including acetate-free biofiltration, high-volume HDF, internal HDF, paired-filtration dialysis, middilution HDF, double high-flux HDF, push-pull HDF, and online HDF. Recent technology has allowed online production of large volumes of microbiologically ultrapure fluid for reinfusion, greatly simplifying the practice of HDF. Several advantages of HDF over purely diffusive hemodialysis techniques have been described in the literature, including a greater clearance of urea, phosphate, beta(2)-microglobulin and other larger solutes, reduction in dialysis hypotension, and improved anemia management. Although randomized controlled trials have failed to show a survival benefit of HDF, recent data from large observational studies suggest a positive effect of HDF on survival. This article provides a brief review of the history of HDF, the various HDF techniques, and summary of their clinical effects.

  5. MISR HDF-to-Binary Converter and Radiance/BRF Calculation Tools

    Atmospheric Science Data Center

    2013-04-01

Use of these tools requires the HDF and HDF-EOS libraries for the target computer; the HDF libraries are available from The HDF Group (THG). The HDF-EOS include and library files must also be installed on the target computer. The distribution tar file includes the remaining required files.

  6. An Approach Using Parallel Architecture to Storage DICOM Images in Distributed File System

    NASA Astrophysics Data System (ADS)

    Soares, Tiago S.; Prado, Thiago C.; Dantas, M. A. R.; de Macedo, Douglas D. J.; Bauer, Michael A.

    2012-02-01

Telemedicine is a very important area in the medical field that is expanding daily, motivated by many researchers interested in improving medical applications. In Brazil, development began in 2005, in the State of Santa Catarina, of a server called the CyclopsDCMServer, whose purpose is to use HDF for the manipulation of medical images (DICOM) on a distributed file system. Since then, many research efforts have been initiated in order to seek better performance. Our approach adds a parallel implementation of the server's I/O operations, since HDF version 5 has a feature essential to our work: support for parallel I/O based on the MPI paradigm. Early experiments using four parallel nodes show good performance when compared to the serial HDF implementation in the CyclopsDCMServer.

  7. Moving from HDF4 to HDF5/netCDF-4

    NASA Technical Reports Server (NTRS)

    Pourmal, Elena; Yang, Kent; Lee, Joe

    2017-01-01

In this presentation, we will go over the major differences between the two file formats and libraries, and will talk about the HDF5 features that users should consider when designing new products in HDF5/netCDF-4. We will also discuss the h4h5tools toolkit that can facilitate conversion of data in existing HDF4 files to HDF5 and netCDF-4, and we will engage the participants in a discussion of how The HDF Group can help with the transition to and adoption of HDF5 and netCDF-4.

  8. What software tools can I use to view ERBE HDF data products?

    Atmospheric Science Data Center

    2014-12-08

Visualize ERBE data with view_hdf, a visualization and analysis tool for accessing data stored in Hierarchical Data Format (HDF) and HDF-EOS. The files can also be browsed with HDFView: start HDFView, select File, select Open, and select the file to be viewed.

  9. History of Hubble Space Telescope (HST)

    NASA Image and Video Library

    1995-12-01

This deepest-ever view of the universe unveils myriad galaxies back to the beginning of time. Several hundred never-before-seen galaxies are visible in this view of the universe, called the Hubble Deep Field (HDF). Besides the classical spiral and elliptical shaped galaxies, there is a bewildering variety of other galaxy shapes and colors that are important clues to understanding the evolution of the universe. Some of the galaxies may have formed less than one billion years after the Big Bang. The image was assembled from many separate exposures taken with the Wide Field/Planetary Camera 2 (WF/PC2) over ten consecutive days, between December 18, 1995 and December 28, 1995. This true-color view was assembled from separate images taken in blue, red, and infrared light. By combining these separate images into a single color picture, astronomers will be able to infer, at least statistically, the distance, age, and composition of galaxies in the field. Blue objects contain young stars and/or are relatively close, while redder objects contain older stellar populations and/or are farther away.

  10. HDF Update

    NASA Technical Reports Server (NTRS)

    Pourmal, Elena

    2016-01-01

The HDF Group maintains and evolves the HDF software used by the NASA ESDIS program to manage remote sensing data. In this talk we will discuss new features of HDF (virtual datasets, Single Writer/Multiple Reader (SWMR) access, and community-supported HDF5 compression filters) that address the storage and I/O performance requirements of applications that work with ESDIS data products.
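The Single Writer/Multiple Reader access pattern mentioned above can be sketched with h5py (a third-party Python binding, assumed to be installed); the file and dataset names below are illustrative, and both handles live in one process purely for demonstration:

```python
import os
import tempfile

import numpy as np
import h5py

path = os.path.join(tempfile.mkdtemp(), "swmr.h5")

# Writer: SWMR requires the 'latest' HDF5 file-format version.
writer = h5py.File(path, "w", libver="latest")
dset = writer.create_dataset("samples", shape=(0,), maxshape=(None,), dtype="i8")
writer.swmr_mode = True  # from this point on, readers may attach safely

# Reader: opens the same file concurrently in SWMR read mode.
reader = h5py.File(path, "r", libver="latest", swmr=True)
rsamples = reader["samples"]

# The writer appends data and flushes it to disk...
dset.resize((4,))
dset[:] = np.arange(4)
dset.flush()

# ...and the reader picks up the new extent after refresh().
rsamples.refresh()
seen = rsamples.shape[0]

writer.close()
reader.close()
```

The key constraint is that the writer must create all objects before enabling `swmr_mode`; after that, readers never block the writer and always see a consistent (if slightly stale) view of the file.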

  11. Divert to ULTRA: differences in infused volumes and clearance in two on-line hemodiafiltration treatments.

    PubMed

    Panichi, Vincenzo; De Ferrari, Giacomo; Saffioti, Stefano; Sidoti, Antonino; Biagioli, Marina; Bianchi, Stefano; Imperiali, Patrizio; Gabbrielli, Claudio; Conti, Paolo; Patrone, Pietro; Falqui, Valeria; Rombolà, Giuseppe; Mura, Carlo; Icardi, Andrea; Mulas, Donatella; Rosati, Alberto; Santori, Francesco; Mannarino, Antonio; Tomei, Valeria; Bertucci, Andrea; Steckiph, Denis; Palla, Roberto

    2012-06-01

Mixed diffusive-convective dialysis therapies offer greater removal capabilities than conventional dialysis. The aim of this study was to compare two different on-line, post-dilution hemodiafiltration (HDF) treatments with regard to achieved convective volume and middle-molecule dialysis efficiency: standard volume control (sOL-HDF) and automated control of the transmembrane pressure (TMP) (UC-HDF). We enrolled 30 ESRD patients (55.9 ± 14.0 years, 20/10 M/F) in a randomized, prospective, cross-over study. The patients received a 3-month period of sOL-HDF followed by UC-HDF for a further 3 months, or vice versa, using the same dialysis machine. In sOL-HDF, fixed exchange volumes were set according to a filtration fraction greater than or equal to 25%. In UC-HDF therapy, the exchanged volume was driven by a biofeedback system controlling the TMP and its set point in a double loop. Patients maintained their treatment time, dialyzer, blood flow rate, and anticoagulant regimen unchanged throughout the study. Greater convective volumes were achieved in UC-HDF than in sOL-HDF (23.8 ± 3.9 vs. 19.8 ± 4.8 L; p<0.001) with comparably high pre-dialysis Ht values (sOL-HDF 34.0 ± 4.5% and UC-HDF 34.0 ± 4.4%; p = 0.91). The average clearance values of ß2m and P were higher in UC-HDF than in sOL-HDF (respectively 123 ± 24 vs. 111 ± 22 ml/min, p<0.002 and 158 ± 26 vs. 152 ± 25 ml/min, p<0.05). Moreover, the UC-HDF mode led to a significantly increased rate of call-free sessions from 88% to 97% (p<0.0001). This study showed that the biofeedback module, applied to the automatic control of TMP in on-line HDF, results in higher convective volumes and correspondingly higher ß2m and P clearances. By making the HDF treatment more automated and less complex to perform, it significantly reduced the staff workload.

  12. The MUSE Hubble Ultra Deep Field Survey. IX. Evolution of galaxy merger fraction since z ≈ 6

    NASA Astrophysics Data System (ADS)

    Ventou, E.; Contini, T.; Bouché, N.; Epinat, B.; Brinchmann, J.; Bacon, R.; Inami, H.; Lam, D.; Drake, A.; Garel, T.; Michel-Dansac, L.; Pello, R.; Steinmetz, M.; Weilbacher, P. M.; Wisotzki, L.; Carollo, M.

    2017-11-01

    We provide, for the first time, robust observational constraints on the galaxy major merger fraction up to z ≈ 6 using spectroscopic close pair counts. Deep Multi Unit Spectroscopic Explorer (MUSE) observations in the Hubble Ultra Deep Field (HUDF) and Hubble Deep Field South (HDF-S) are used to identify 113 secure close pairs of galaxies among a parent sample of 1801 galaxies spread over a large redshift range (0.2 < z < 6) and stellar masses (107-1011 M⊙), thus probing about 12 Gyr of galaxy evolution. Stellar masses are estimated from spectral energy distribution (SED) fitting over the extensive UV-to-NIR HST photometry available in these deep Hubble fields, adding Spitzer IRAC bands to better constrain masses for high-redshift (z ⩾ 3) galaxies. These stellar masses are used to isolate a sample of 54 major close pairs with a galaxy mass ratio limit of 1:6. Among this sample, 23 pairs are identified at high redshift (z ⩾ 3) through their Lyα emission. The sample of major close pairs is divided into five redshift intervals in order to probe the evolution of the merger fraction with cosmic time. Our estimates are in very good agreement with previous close pair counts with a constant increase of the merger fraction up to z ≈ 3 where it reaches a maximum of 20%. At higher redshift, we show that the fraction slowly decreases down to about 10% at z ≈ 6. The sample is further divided into two ranges of stellar masses using either a constant separation limit of 109.5 M⊙ or the median value of stellar mass computed in each redshift bin. Overall, the major close pair fraction for low-mass and massive galaxies follows the same trend. These new, homogeneous, and robust estimates of the major merger fraction since z ≈ 6 are in good agreement with recent predictions of cosmological numerical simulations. Based on observations made with ESO telescopes at the La Silla-Paranal Observatory under programmes 094.A-0289(B), 095.A-0010(A), 096.A-0045(A) and 096.A-0045(B).

  13. Hierarchical Data Format 5 v1.10

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    KOZIOL, QUINCEY

    2016-04-20

    HDF5 is a data model, library, and file format for storing and managing data. It supports an unlimited variety of datatypes, and is designed for flexible and efficient I/O and for high volume and complex data. HDF5 is portable and is extensible, allowing applications to evolve in their use of HDF5. The HDF5 Technology suite includes tools and applications for managing, manipulating, viewing, and analyzing data in the HDF5 format.
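The data model described above (a hierarchy of groups holding typed datasets with attached metadata) can be sketched with h5py, a widely used third-party Python binding for the HDF5 library (assumed to be installed); all names below are illustrative:

```python
import os
import tempfile

import numpy as np
import h5py

path = os.path.join(tempfile.mkdtemp(), "example.h5")

# Groups act like dictionaries, datasets like NumPy arrays on disk.
with h5py.File(path, "w") as f:
    grp = f.create_group("measurements")  # hierarchical grouping
    dset = grp.create_dataset("temperature", data=np.arange(5, dtype="f8"))
    dset.attrs["units"] = "K"  # metadata travels with the data as attributes

# Any HDF5-aware application can later navigate the same hierarchy by path.
with h5py.File(path, "r") as f:
    temps = f["/measurements/temperature"][...]
    units = f["/measurements/temperature"].attrs["units"]
```

Because type, shape, and metadata are self-described in the file, the reader needs no out-of-band knowledge of how the writer laid the data out.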

  14. Ten-Year Survival of End-Stage Renal Disease Patients Treated with High-Efficiency Online Hemodiafiltration: A Cohort Study of a Center in South East Asia.

    PubMed

    Tiranathanagul, Khajohn; Susantitaphong, Paweena; Srisawat, Nattachai; Mahatanan, Nanta; Tungsanga, Kriang; Praditpornsilpa, Kearkiat; Eiam-Ong, Somchai

    2018-03-07

Recently, in the first hemodiafiltration (HDF) experience report from South East Asia (SEA), we reported a 3-year prospective study demonstrating the various short-term benefits of high-efficiency online HDF (OL-HDF) over high-flux hemodialysis (HD). Very few long-term survival reports of high-efficiency OL-HDF are available and the data are heterogeneous and incomplete. The present historical cohort study was conducted to determine the long-term survival and outcome of high-efficiency OL-HDF-treated patients. Sixty-six high-efficiency OL-HDF treated patients at a center in SEA were included in the study. The prescription included blood and dialysis fluid flow rates of 400 and 800 mL/min, respectively. Post- or pre-dilution substitution fluid of 100 or 200 mL/min, respectively, was prescribed. Of the 66 HDF patients, whose age was 57.4 ± 14.0 years, 38 (58%) were female. The most common comorbidity was diabetes (36%). There were 33 (50%) incident HDF cases that were prescribed OL-HDF at dialysis initiation and 33 (50%) prevalent HDF cases that were switched from HD to OL-HDF. The 1-, 3-, 5-, and 10-year survival rates were 95.1, 83.4, 77.7, and 61.8%, respectively. The mean survival time was 8.99 ± 0.64 years. There were 15 transplantations and 15 deaths during the study period. The 2 major causes of death were cardiovascular (33.3%) and infectious diseases (20%). Serum ferritin was the only parameter that correlated with mortality (HR 1.004, p = 0.005). There was comparable survival between incident and prevalent HDF cases. The survival after transplantation of a sub-group of patients who received kidney transplantation (KT) was not different from that of the overall HDF patients (p = 0.93). High-efficiency OL-HDF could provide excellent long-term survival, nearly comparable to the KT sub-group. © 2018 S. Karger AG, Basel.

  15. Status of LOFAR Data in HDF5 Format

    NASA Astrophysics Data System (ADS)

    Alexov, A.; Schellart, P.; ter Veen, S.; van der Akker, M.; Bähren, L.; Greissmeier, J.-M.; Hessels, J. W. T.; Mol, J. D.; Renting, G. A.; Swinbank, J.; Wise, M.

    2012-09-01

The Hierarchical Data Format, version 5 (HDF5) is a data model, library, and file format for storing and managing data. It is designed for flexible and efficient I/O and for high volume, complex data. The Low Frequency Array (LOFAR) project is solving the challenge of data size and complexity using HDF5. Most of LOFAR's standard data products will be stored using HDF5; the beam-formed time-series data and transient buffer board data have already transitioned from a project-specific binary format to HDF5. We report on our effort to pave the way towards new astronomical data encapsulation using HDF5, which can be used by future ground and space projects. The LOFAR project has formed a collaboration with NRAO, the Virtual Astronomical Observatory (VAO) and the HDF Group to obtain funding for a full-time staff member to work on documenting and developing standards for astronomical data written in HDF5. We hope our effort will enhance HDF5 visibility and usage within the community, specifically for LSST, the SKA pathfinders (ASKAP, MeerKAT, MWA, LWA), and other major new radio telescopes such as EVLA, ALMA, and eMERLIN.

  16. HDF4 Maps: For Now and For the Future

    NASA Astrophysics Data System (ADS)

    Plutchak, J.; Aydt, R.; Folk, M. J.

    2013-12-01

    Data formats and access tools necessarily change as technology improves to address emerging requirements with new capabilities. This on-going process inevitably leaves behind significant data collections in legacy formats that are difficult to support and sustain. NASA ESDIS and The HDF Group currently face this problem with large and growing archives of data in HDF4, an older version of the HDF format. Indefinitely guaranteeing the ability to read these data with multi-platform libraries in many languages is very difficult. As an alternative, HDF and NASA worked together to create maps of the files that contain metadata and information about data types, locations, and sizes of data objects in the files. These maps are written in XML and have successfully been used to access and understand data in HDF4 files without the HDF libraries. While originally developed to support sustainable access to these data, these maps can also be used to provide access to HDF4 metadata, facilitate user understanding of files prior to download, and validate the files for compliance with particular conventions. These capabilities are now available as a service for HDF4 archives and users.
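The core idea of the maps (recording the type, location, and size of each data object so the bytes can be pulled from an HDF4 file with no HDF library at all) can be sketched in plain Python. The map entry below is a hypothetical simplification, not the actual HDF4 map XML schema, and the "HDF4 file" is a toy stand-in:

```python
import os
import struct
import tempfile

# Toy stand-in for a legacy HDF4 file: 16 header bytes, then four
# big-endian 32-bit integers that represent a stored dataset.
path = os.path.join(tempfile.mkdtemp(), "legacy.hdf")
with open(path, "wb") as f:
    f.write(b"\x00" * 16)
    f.write(struct.pack(">4i", 10, 20, 30, 40))

# Hypothetical map entry, standing in for what the XML map records:
# where the dataset's bytes live and how to decode them.
map_entry = {"offset": 16, "nbytes": 16, "datatype": ">4i"}

# With only the map in hand, plain file I/O recovers the data.
with open(path, "rb") as f:
    f.seek(map_entry["offset"])
    raw = f.read(map_entry["nbytes"])
values = list(struct.unpack(map_entry["datatype"], raw))
```

This is what makes the maps a sustainability tool: long after the HDF4 libraries become hard to build, offsets, sizes, and datatypes are enough to liberate the archived bytes.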

  17. Historical milestones of a long pathway.

    PubMed

    Roy, Thomas

    2011-01-01

    Hemodiafiltration (HDF), developed from the combination of hemodialysis and hemofiltration, is considered to be the most effective current procedure to remove uremic toxins from the blood of kidney patients. Historically, the clinical use of HDF was for many years limited due to the cost burden related to the large amount of sterile volume replacement fluid needed. The solution offered was on-line preparation of replacement fluid from standard dialysate by means of membrane filtration. Industry opened to this concept quite early and worked on various technical solutions between the early 1980s and the late 1990s before real state-of-the-art systems became commercially available on a broad basis. This article reviews in particular the activities of initially Fresenius and later Fresenius Medical Care in this field and identifies major concepts and prototypes up to today's commercially available high-end product--the 5008 therapy system--where on-line HDF finally became integrated as a standard component. Copyright © 2011 S. Karger AG, Basel.

  18. Software to Compare NPP HDF5 Data Files

    NASA Technical Reports Server (NTRS)

    Wiegand, Chiu P.; LeMoigne-Stewart, Jacqueline; Ruley, LaMont T.

    2013-01-01

This software was developed for the NPOESS (National Polar-orbiting Operational Environmental Satellite System) Preparatory Project (NPP) Science Data Segment. The purpose of this software is to compare HDF5 (Hierarchical Data Format) files specific to NPP and report whether the HDF5 files are identical. If the HDF5 files are different, users have the option of printing out the list of differences in the HDF5 data files. The user provides paths to two directories containing a list of HDF5 files to compare. The tool selects matching HDF5 file names from the two directories and runs the comparison on each file. The user can also select from three levels of detail. Level 0 is the basic level, which simply states whether the files match or not. Level 1 is the intermediate level, which lists the differences between the files. Level 2 lists all the details regarding the comparison, such as which objects were compared, and how and where they are different. The HDF5 tool is written specifically for the NPP project. As such, it ignores certain attributes (such as creation_date, creation_time, etc.) in the HDF5 files. This is because even though two HDF5 files could represent exactly the same granule, if they are created at different times, the creation date and time would be different. This tool is smart enough to ignore differences that are not relevant to NPP users.
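The comparison logic described above can be sketched in pure Python. Nested dictionaries stand in for the HDF5 object tree here (a real implementation would walk the files with an HDF5 library), and NPP-irrelevant attributes such as creation_date are skipped, as the text describes; all names are illustrative, not the actual NPP tool's API:

```python
# Attributes that differ between otherwise-identical granules are ignored.
IGNORED_ATTRS = {"creation_date", "creation_time"}

def diff_trees(a, b, path=""):
    """Return the paths at which two dict-based trees differ."""
    diffs = []
    for key in sorted(set(a) | set(b)):
        if key in IGNORED_ATTRS:
            continue  # same granule written at different times still matches
        here = f"{path}/{key}"
        if key not in a or key not in b:
            diffs.append(f"{here}: present in only one file")
        elif isinstance(a[key], dict) and isinstance(b[key], dict):
            diffs.extend(diff_trees(a[key], b[key], here))  # recurse into groups
        elif a[key] != b[key]:
            diffs.append(f"{here}: values differ")
    return diffs

def compare(a, b, level=0):
    """Level 0 reports match/no-match; level 1 lists the differing paths."""
    diffs = diff_trees(a, b)
    if level == 0:
        return "identical" if not diffs else "different"
    return diffs
```

With this sketch, two granules that differ only in creation_date compare as "identical", which is exactly the behavior the tool needs.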

  19. Interdisciplinary Research Scenario Testing of EOSDIS

    NASA Technical Reports Server (NTRS)

    Emmitt, G. D.

    1999-01-01

During the reporting period, the Principal Investigator (PI) has continued to serve on numerous review panels, task forces and committees with the goal of providing input and guidance for the Earth Observing System Data and Information System (EOSDIS) program at NASA Headquarters and NASA GSFC. In addition, the PI has worked together with personnel at the University of Virginia and the subcontractor (Simpson Weather Associates (SWA)) to continue to evaluate the latest releases of various versions of the user interfaces to the EOSDIS. Finally, as part of the subcontract, SWA has created an on-line Hierarchical Data Format (HDF) tutorial for non-HDF experts, particularly those that will be using EOSDIS and future EOS data products. A summary of these three activities is provided. The topics include: 1) Participation on EOSDIS Panels and Committees; 2) Evaluation and Tire Kicking of EOSDIS User Interfaces; and 3) An On-line HDF Tutorial. The report also includes attachments A, B, and C. Attachment A: Report From the May 1999 Science Data Panel. The topics include: 1) Summary of Data Panel Meeting; and 2) Panel's Comments/Recommendations. Attachment B: Survey Requesting Integrated Design Systems (IDS) Teams Input on the Descoping and Rescoping of the EOSDIS; and Attachment C: An HDF Tutorial for Beginners: EOSDIS Users and Small Data Providers (HTML Version). The topics include: 1) Tutorial Overview; 2) An Introduction to HDF; 3) The HDF Library: Software and Hardware; 4) Methods of Working with HDF Files; 5) Scientific Data API; 6) Attributes and Metadata; 7) Writing a SDS to an HDF File; 8) Obtaining Information on Existing HDF Files; 9) Reading a Scientific Data Set from an HDF File; 10) Example Programs; 11) Browsing and Visualizing HDF Data; and 12) Laboratory (Question and Answer).

  20. Post-fire "Hillslope Debris Flows": evidence of a distinct erosion process

    NASA Astrophysics Data System (ADS)

    Langhans, Christoph; Nyman, Petter; Noske, Phil; Vandersant, Rene; Lane, Patrick; Sheridan, Gary

    2017-04-01

    Debris flows occurring soon after fire have been associated with a somewhat mysterious erosion process upslope of their initiation zone that some authors have called 'miniature debris flows on hillslopes', and that leave behind levee-lined rills. Despite the unusual proposition of debris flow on planar hillslopes, the process has not received much attention. The objective of this study was to present evidence of this process from field observations, to analyse its initiation, movement and form through runoff experiments and video, explore the role of fire severity and runoff rate, and to propose a conceptual model of the process. Hillslope debris flows (HDF) consist of a lobe of gravel- to cobble-sized material 0.2 - 1 m wide that is pushed by runoff damming up behind it. During initiation, runoff moved individual particles that accumulated a small distance downslope until the accumulation of grains failed and formed the granular lobe of the HDF. They occur at relatively steep slope gradients (0.4 - 0.8), on a variety of geologies, and after fire of at least moderate intensity, where all litter is burnt and the soil surface becomes non-cohesive. HDF are a threshold process, and runoff rates of less than 0.5 L s-1 to more than 1 L s-1 were required for their initiation during the experiments. Char and ash lower the threshold considerably. Our conceptual model highlights HDF as a geomorphic process distinct from channel debris flows and classical rill erosion. On a matrix of slope and grain size, HDF are enveloped between purely gravity-driven dry ravel, and mostly runoff-driven bedload transport in rills.

  1. Hemodiafiltration Versus Hemodialysis and Survival in Patients With ESRD: The French Renal Epidemiology and Information Network (REIN) Registry.

    PubMed

    Mercadal, Lucile; Franck, Jeanna-Eve; Metzger, Marie; Urena Torres, Pablo; de Cornelissen, François; Edet, Stéphane; Béchade, Clémence; Vigneau, Cécile; Drüeke, Tilman; Jacquelinet, Christian; Stengel, Bénédicte

    2016-08-01

    Recent randomized trials report that mortality is lower with high-convection-volume hemodiafiltration (HDF) than with hemodialysis (HD). We used data from the French national Renal Epidemiology and Information Network (REIN) registry to investigate trends in HDF use and its relationship with mortality in the total population of incident dialysis patients. The study included those who initiated HD therapy from January 1, 2008, through December 31, 2011, and were dialyzed for more than 3 months; follow-up extended to the end of 2012. HDF use at the patient and facility level. All-cause and cardiovascular mortality, using Cox models to estimate HRs of HDF as time-dependent covariate at the patient level, with age as time scale and fully adjusted for comorbid conditions and laboratory data at baseline, catheter use, and facility type as time-dependent covariates. Analyses completed by Cox models for HRs of the facility-level exposure to HDF updated yearly. Of 28,407 HD patients, 5,526 used HDF for a median of 1.2 (IQR, 0.9-1.9) years; 2,254 of them used HDF exclusively. HRs for all-cause and cardiovascular mortality associated with HDF use were 0.84 (95% CI, 0.77-0.91) and 0.73 (95% CI, 0.61-0.88), respectively. In patients treated exclusively with HDF, these HRs were 0.77 (95% CI, 0.67-0.87) and 0.66 (95% CI, 0.50-0.86). At the facility level, increasing the percentage of patients using HDF from 0% to 100% was associated with HRs for all-cause and cardiovascular mortality of 0.87 (95% CI, 0.77-0.99) and 0.72 (95% CI, 0.54-0.96), respectively. Observational study. Whether analyzed as a patient- or facility-level predictor, HDF treatment was associated with better survival. Copyright © 2015 National Kidney Foundation, Inc. Published by Elsevier Inc. All rights reserved.

  2. Protein-bound uraemic toxins, dicarbonyl stress and advanced glycation end products in conventional and extended haemodialysis and haemodiafiltration.

    PubMed

    Cornelis, Tom; Eloot, Sunny; Vanholder, Raymond; Glorieux, Griet; van der Sande, Frank M; Scheijen, Jean L; Leunissen, Karel M; Kooman, Jeroen P; Schalkwijk, Casper G

    2015-08-01

    Protein-bound uraemic toxins (PBUT), dicarbonyl stress and advanced glycation end products (AGEs) associate with cardiovascular disease in dialysis. Intensive haemodialysis (HD) may have significant clinical benefits. The aim of this study was to evaluate the acute effects of conventional and extended HD and haemodiafiltration (HDF) on reduction ratio (RR) and total solute removal (TSR) of PBUT, dicarbonyl stress compounds and AGEs. Thirteen stable conventional HD patients randomly completed a single study of 4-h HD (HD4), 4-h HDF (HDF4), 8-h HD (HD8) and 8-h HDF (HDF8) with a 2-week interval between the study sessions. RR and TSR of PBUT [indoxyl sulphate (IS), p-cresyl sulphate (PCS), p-cresyl glucuronide, 3-carboxyl-4-methyl-5-propyl-2-furanpropionic acid (CMPF), indole-3-acetic acid (IAA) and hippuric acid] of free and protein-bound AGEs [N(ε)-(carboxymethyl)lysine (CML), N(ε)-(carboxyethyl)lysine (CEL), Nδ-(5-hydro-5-methyl-4-imidazolon-2-yl)-ornithine, pentosidine], as well as of dicarbonyl compounds [glyoxal, methylglyoxal, 3-deoxyglucosone], were determined. Compared with HD4, HDF4 resulted in increased RR of total and/or free fractions of IAA and IS as well as increased RR of free CML and CEL. HD8 and HDF8 showed a further increase in TSR and RR of PBUT (except CMPF), as well as of dicarbonyl stress and free AGEs compared with HD4 and HDF4. Compared with HD8, HDF8 only significantly increased RR of total and free IAA and free PCS, as well as RR of free CEL. Dialysis time extension (HD8 and HDF8) optimized TSR and RR of PBUT, dicarbonyl stress and AGEs, whereas HDF8 was superior to HD8 for only a few compounds. © The Author 2015. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.

  3. Deep CO(1-0) Observations of z = 1.62 Cluster Galaxies with Substantial Molecular Gas Reservoirs and Normal Star Formation Efficiencies

    NASA Astrophysics Data System (ADS)

    Rudnick, Gregory; Hodge, Jacqueline; Walter, Fabian; Momcheva, Ivelina; Tran, Kim-Vy; Papovich, Casey; da Cunha, Elisabete; Decarli, Roberto; Saintonge, Amelie; Willmer, Christopher; Lotz, Jennifer; Lentati, Lindley

    2017-11-01

We present an extremely deep CO(1-0) observation of a confirmed z = 1.62 galaxy cluster. We detect two spectroscopically confirmed cluster members in CO(1-0) with signal-to-noise ratio > 5. Both galaxies have log(M*/M⊙) > 11 and are gas rich, with Mmol/(M* + Mmol) ~ 0.17-0.45. One of these galaxies lies on the star formation rate (SFR)-M* sequence, while the other lies an order of magnitude below. We compare the cluster galaxies to other SFR-selected galaxies with CO measurements and find that they have CO luminosities consistent with expectations given their infrared luminosities. We also find that they have gas fractions and star formation efficiencies (SFE) comparable to what is expected from published field galaxy scaling relations. The galaxies are compact in their stellar light distribution, at the extreme end for all high-redshift star-forming galaxies. However, their SFE is consistent with other field galaxies at comparable compactness. This is similar to two other sources selected in a blind CO survey of the HDF-N. Despite living in a highly quenched protocluster core, the molecular gas properties of these two galaxies, one of which may be in the process of quenching, appear entirely consistent with field scaling relations between the molecular gas content, stellar mass, star formation rate, and redshift. We speculate that these cluster galaxies cannot have any further substantive gas accretion if they are to become members of the dominant passive population in z < 1 clusters.

  4. Photon-HDF5: Open Data Format and Computational Tools for Timestamp-based Single-Molecule Experiments

    PubMed Central

    Ingargiola, Antonino; Laurence, Ted; Boutelle, Robert; Weiss, Shimon; Michalet, Xavier

    2017-01-01

    Archival of experimental data in public databases has increasingly become a requirement for most funding agencies and journals. These data-sharing policies have the potential to maximize data reuse, and to enable confirmatory as well as novel studies. However, the lack of standard data formats can severely hinder data reuse. In photon-counting-based single-molecule fluorescence experiments, data are stored in a variety of vendor-specific or even setup-specific (custom) file formats, making data interchange prohibitively laborious unless the same hardware-software combination is used. Moreover, the number of available techniques and setup configurations makes it difficult to find a common standard. To address this problem, we developed Photon-HDF5 (www.photon-hdf5.org), an open data format for timestamp-based single-molecule fluorescence experiments. Building on the solid foundation of HDF5, Photon-HDF5 provides a platform- and language-independent, easy-to-use file format that is self-describing and supports rich metadata. Photon-HDF5 supports different types of measurements by separating raw data (e.g., photon timestamps, detectors) from measurement metadata. This approach allows representing several measurement types and setup configurations within the same core structure and makes it possible to extend the format in a backward-compatible way. Complementing the format specifications, we provide open source software to create and convert Photon-HDF5 files, together with code examples in multiple languages showing how to read Photon-HDF5 files. Photon-HDF5 allows sharing data in a format suitable for long-term archival, avoiding the effort of documenting custom binary formats and increasing interoperability with different analysis software. We encourage participation of the single-molecule community to extend interoperability and to help define future versions of Photon-HDF5. PMID:28649160
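    As a concrete illustration of the core layout described above, here is a minimal sketch, using h5py and NumPy, of writing and reading the raw photon data. The group and field names (photon_data/timestamps, photon_data/detectors, photon_data/timestamps_specs/timestamps_unit) follow the published Photon-HDF5 specification, but the required identity and setup metadata are omitted for brevity, so this is not a complete, valid Photon-HDF5 file.

```python
# Minimal sketch of the Photon-HDF5 photon_data layout. NOT a complete,
# spec-valid file: required metadata groups are omitted for brevity.
import numpy as np
import h5py

def write_minimal(path, timestamps, detectors, unit=12.5e-9):
    """Store raw photon timestamps (clock ticks) and detector indices."""
    with h5py.File(path, "w") as f:
        pd = f.create_group("photon_data")
        pd.create_dataset("timestamps", data=timestamps, dtype="int64")
        pd.create_dataset("detectors", data=detectors, dtype="uint8")
        # Scalar dataset giving the clock period in seconds.
        pd.create_dataset("timestamps_specs/timestamps_unit", data=unit)

def read_timestamps_seconds(path):
    """Return the photon timestamps converted from clock ticks to seconds."""
    with h5py.File(path, "r") as f:
        ts = f["photon_data/timestamps"][:]
        unit = f["photon_data/timestamps_specs/timestamps_unit"][()]
    return ts * unit
```

    A complete file would normally be produced with the reference phconvert library rather than assembled by hand as above.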

  5. Photon-HDF5: Open Data Format and Computational Tools for Timestamp-based Single-Molecule Experiments.

    PubMed

    Ingargiola, Antonino; Laurence, Ted; Boutelle, Robert; Weiss, Shimon; Michalet, Xavier

    2016-02-13

    Archival of experimental data in public databases has increasingly become a requirement for most funding agencies and journals. These data-sharing policies have the potential to maximize data reuse, and to enable confirmatory as well as novel studies. However, the lack of standard data formats can severely hinder data reuse. In photon-counting-based single-molecule fluorescence experiments, data are stored in a variety of vendor-specific or even setup-specific (custom) file formats, making data interchange prohibitively laborious unless the same hardware-software combination is used. Moreover, the number of available techniques and setup configurations makes it difficult to find a common standard. To address this problem, we developed Photon-HDF5 (www.photon-hdf5.org), an open data format for timestamp-based single-molecule fluorescence experiments. Building on the solid foundation of HDF5, Photon-HDF5 provides a platform- and language-independent, easy-to-use file format that is self-describing and supports rich metadata. Photon-HDF5 supports different types of measurements by separating raw data (e.g., photon timestamps, detectors) from measurement metadata. This approach allows representing several measurement types and setup configurations within the same core structure and makes it possible to extend the format in a backward-compatible way. Complementing the format specifications, we provide open source software to create and convert Photon-HDF5 files, together with code examples in multiple languages showing how to read Photon-HDF5 files. Photon-HDF5 allows sharing data in a format suitable for long-term archival, avoiding the effort of documenting custom binary formats and increasing interoperability with different analysis software. We encourage participation of the single-molecule community to extend interoperability and to help define future versions of Photon-HDF5.

  6. Photon-HDF5: open data format and computational tools for timestamp-based single-molecule experiments

    NASA Astrophysics Data System (ADS)

    Ingargiola, Antonino; Laurence, Ted; Boutelle, Robert; Weiss, Shimon; Michalet, Xavier

    2016-02-01

    Archival of experimental data in public databases has increasingly become a requirement for most funding agencies and journals. These data-sharing policies have the potential to maximize data reuse, and to enable confirmatory as well as novel studies. However, the lack of standard data formats can severely hinder data reuse. In photon-counting-based single-molecule fluorescence experiments, data are stored in a variety of vendor-specific or even setup-specific (custom) file formats, making data interchange prohibitively laborious unless the same hardware-software combination is used. Moreover, the number of available techniques and setup configurations makes it difficult to find a common standard. To address this problem, we developed Photon-HDF5 (www.photon-hdf5.org), an open data format for timestamp-based single-molecule fluorescence experiments. Building on the solid foundation of HDF5, Photon-HDF5 provides a platform- and language-independent, easy-to-use file format that is self-describing and supports rich metadata. Photon-HDF5 supports different types of measurements by separating raw data (e.g., photon timestamps, detectors) from measurement metadata. This approach allows representing several measurement types and setup configurations within the same core structure and makes it possible to extend the format in a backward-compatible way. Complementing the format specifications, we provide open source software to create and convert Photon-HDF5 files, together with code examples in multiple languages showing how to read Photon-HDF5 files. Photon-HDF5 allows sharing data in a format suitable for long-term archival, avoiding the effort of documenting custom binary formats and increasing interoperability with different analysis software. We encourage participation of the single-molecule community to extend interoperability and to help define future versions of Photon-HDF5.

  7. HDFITS: Porting the FITS data model to HDF5

    NASA Astrophysics Data System (ADS)

    Price, D. C.; Barsdell, B. R.; Greenhill, L. J.

    2015-09-01

    The FITS (Flexible Image Transport System) data format has been the de facto data format for astronomy-related data products since its inception in the late 1970s. While the FITS file format is widely supported, it lacks many of the features of more modern data serialization, such as the Hierarchical Data Format (HDF5). The HDF5 file format offers considerable advantages over FITS, such as improved I/O speed and compression, but has yet to gain widespread adoption within astronomy. One of the major holdbacks is that HDF5 is not well supported by data reduction software packages and image viewers. Here, we present a comparison of FITS and HDF5 as a format for storage of astronomy datasets. We show that the underlying data model of FITS can be ported to HDF5 in a straightforward manner, and that by doing so the advantages of the HDF5 file format can be leveraged immediately. In addition, we present a software tool, fits2hdf, for converting between FITS and a new 'HDFITS' format, where data are stored in HDF5 in a FITS-like manner. We show that HDFITS allows faster reading of data (up to 100x of FITS in some use cases), and improved compression (higher compression ratios and higher throughput). Finally, we show that by only changing the import lines in Python-based FITS utilities, HDFITS formatted data can be presented transparently as an in-memory FITS equivalent.
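    The "FITS-like manner" of storage can be sketched as follows: each HDU maps to an HDF5 group, header keywords become group attributes, and the image payload becomes a dataset. The h5py sketch below illustrates that mapping idea only; it is not the exact HDFITS layout produced by fits2hdf, and the group and dataset names are assumptions.

```python
# Illustrative FITS->HDF5 mapping: HDU -> group, header cards -> attributes,
# image -> dataset. Names ("PRIMARY", "DATA") are assumptions, not the
# actual HDFITS schema.
import numpy as np
import h5py

def hdu_to_hdf5(path, hdu_name, header, data):
    """Store one FITS-like HDU (header dict + ndarray) in an HDF5 file."""
    with h5py.File(path, "a") as f:
        grp = f.create_group(hdu_name)
        for key, value in header.items():  # header cards -> HDF5 attributes
            grp.attrs[key] = value
        # Image payload as a compressed dataset (one of HDF5's advantages).
        grp.create_dataset("DATA", data=data, compression="gzip")

def read_keyword(path, hdu_name, key):
    """Read a single header keyword back from the group attributes."""
    with h5py.File(path, "r") as f:
        return f[hdu_name].attrs[key]
```

    With a mapping like this, generic HDF5 tooling (chunking, compression, partial I/O) applies immediately to data that began life as FITS.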

  8. Subsetting and Formatting Landsat-7 LOR ETM+ and Data Products

    NASA Technical Reports Server (NTRS)

    Reid, Michael R.

    2000-01-01

    The Landsat-7 Processing System (LPS) processes Landsat-7 Enhanced Thematic Mapper Plus (ETM+) instrument data into large, contiguous segments called "subintervals" and stores them in Level 0R (L0R) data files. The LPS-processed subinterval products must be subsetted and reformatted before the Level 1 processing systems can ingest them. The initial full subintervals produced by the LPS are stored mainly in the HDF Earth Observing System (HDF-EOS) format, which is an extension of the Hierarchical Data Format (HDF). The final L0R products are stored in native HDF format. Primarily the EOS Core System (ECS), and alternately the DAAC Emergency System (DES), subset the subinterval data for the operational Landsat-7 data processing systems. The HDF and HDF-EOS application programming interfaces (APIs) can be used for extensive data subsetting and data reorganization. A stand-alone subsetter tool, based on some of the DES code, has been developed. This tool uses the HDF and HDF-EOS APIs to perform Landsat-7 L0R product subsetting and demonstrates how HDF and HDF-EOS can be used to create various configurations of full L0R products. How these APIs can be used to efficiently subset, format, and organize Landsat-7 L0R data, as demonstrated by the subsetter tool and the DES, is discussed.

  9. Utilizing HDF4 File Content Maps for the Cloud

    NASA Technical Reports Server (NTRS)

    Lee, Hyokyung Joe

    2016-01-01

    We demonstrate a prototype study showing that HDF4 file content maps can be used to organize data efficiently in a cloud object storage system and so facilitate cloud computing. This approach can be extended to any binary data format and to any existing big-data analytics solution powered by cloud computing, because the HDF4 file content map project started as a long-term preservation effort for NASA data that does not require the HDF4 APIs to access the data.

  10. Evaluation of Incremental Releases of ECS User Interfaces and the Development of HDF/HDF-EOS Tutorials

    NASA Technical Reports Server (NTRS)

    Emmitt, G. D.; Greco, S.

    2001-01-01

    During the reporting period, the PI has continued to serve on numerous review panels, task forces, and committees with the goal of providing input and guidance for the Earth Observing System Data and Information System (EOSDIS) program at NASA Headquarters and NASA Goddard Space Flight Center (GSFC). In addition, the PI has worked together with personnel at Simpson Weather Associates (SWA) to help create an on-line HDF/HDF-EOS tutorial for beginning and non-expert users of both the Hierarchical Data Format (HDF) and HDF-EOS data format and software libraries. Finally, the PI has worked together with personnel at SWA and the Information Technology and Systems Center (ITSC) at the University of Alabama in Huntsville (UAH) on a feasibility study regarding the use of data mining software to ascertain features from the gridded output from numerical meteorological forecast models. A summary of these activities is provided.

  11. Is HDF5 a Good Format to Replace UVFITS?

    NASA Astrophysics Data System (ADS)

    Price, D. C.; Barsdell, B. R.; Greenhill, L. J.

    2015-09-01

    The FITS (Flexible Image Transport System) data format was developed in the late 1970s for the storage and exchange of astronomy-related image data. Since then, it has become a standard file format not only for images, but also for radio interferometer data (e.g. UVFITS, FITS-IDI). But is FITS the right format for next-generation telescopes to adopt? The newer Hierarchical Data Format (HDF5) file format offers considerable advantages over FITS, but has yet to gain widespread adoption within radio astronomy. One of the major holdbacks is that HDF5 is not well supported by data reduction software packages. Here, we present a comparison of FITS, HDF5, and the MeasurementSet (MS) format for storage of interferometric data. In addition, we present a tool for converting between formats. We show that the underlying data model of FITS can be ported to HDF5, a first step toward achieving wider HDF5 support.

  12. Vector Data Model: A New Model of HDF-EOS to Support GIS Applications in EOS

    NASA Astrophysics Data System (ADS)

    Chi, E.; Edmonds, R d

    2001-05-01

    NASA's Earth Science Data Information System (ESDIS) project has an active program of research and development of systems for the storage and management of Earth science data for the Earth Observation System (EOS) mission, a key program of NASA's Earth Science Enterprise. EOS has adopted an extension of the Hierarchical Data Format (HDF) as the format of choice for standard product distribution. Three new EOS-specific datatypes - point, swath, and grid - have been defined within the HDF framework. The enhanced data format is named HDF-EOS. Geographic Information Systems (GIS) are used by Earth scientists in EOS data product generation, visualization, and analysis. There are two major data types in GIS applications: raster and vector. The current HDF-EOS handles only the raster type, in the swath data model. The vector data model is identified and developed as a new HDF-EOS format to meet the requirements of scientists working with EOS data products in vector format. The vector model is designed using a topological data structure, which defines the spatial relationships among points, lines, and polygons. The three major topological concepts that the vector model adopts are: (a) lines connect to each other at nodes (connectivity); (b) lines that connect to surround an area define a polygon (area definition); and (c) lines have direction and left and right sides (contiguity). The vector model is implemented in HDF by mapping the conceptual model to HDF internal data models and structures, viz. Vdata, Vgroup, and their associated attribute structures. The point, line, and polygon geometry and attribute data are stored in similar tables. Further, the vector model utilizes the structure and product metadata that characterize HDF-EOS. Both types of metadata are encoded in text format using the Object Description Language (ODL) and stored as global attributes in HDF-EOS files. 
EOS has developed a series of routines for storing, retrieving, and manipulating vector data, in the categories of access, definition, basic I/O, inquiry, and subsetting. The routines have been tested and form a package, HDF-EOS/Vector. The alpha version of HDF-EOS/Vector has been distributed through the HDF-EOS project web site at http://hdfeos.gsfc.nasa.gov. We are also developing translators between the HDF-EOS vector format and a variety of GIS formats, such as Shapefile. The HDF-EOS vector model enables EOS scientists to deliver EOS data in a form ready for Earth scientists to analyze using GIS software, and also provides the EOS project with a mechanism to store GIS data products in a meaningful vector format with significant economy of storage.
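    The three topological concepts described above (connectivity, area definition, contiguity) can be sketched as plain Python tables that loosely mirror the Vdata-style tables the vector model stores. All identifiers and the table layout here are illustrative, not the HDF-EOS/Vector API.

```python
# Topological vector model sketch: a unit square stored as nodes, lines
# (arcs), and polygons. Ids and layout are illustrative only.
nodes = {1: (0.0, 0.0), 2: (1.0, 0.0), 3: (1.0, 1.0), 4: (0.0, 1.0)}

# line id -> (from_node, to_node, left_polygon, right_polygon)
# Contiguity: each directed line knows the polygon on its left and right;
# polygon id 0 denotes the unbounded "outside" face.
lines = {
    10: (1, 2, 0, 100),
    11: (2, 3, 0, 100),
    12: (3, 4, 0, 100),
    13: (4, 1, 0, 100),
}

def polygon_boundary(poly_id):
    """Area definition: the set of lines that together enclose a polygon."""
    return sorted(lid for lid, (_, _, left, right) in lines.items()
                  if poly_id in (left, right))

def lines_at_node(node_id):
    """Connectivity: the lines meeting at a given node."""
    return sorted(lid for lid, (a, b, _, _) in lines.items()
                  if node_id in (a, b))
```

    For this unit-square example, polygon 100 is bounded by lines 10-13, and lines 10 and 13 meet at node 1; geometry coordinates and feature attributes would live in separate, parallel tables.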

  13. Photon-HDF5: An Open File Format for Timestamp-Based Single-Molecule Fluorescence Experiments.

    PubMed

    Ingargiola, Antonino; Laurence, Ted; Boutelle, Robert; Weiss, Shimon; Michalet, Xavier

    2016-01-05

    We introduce Photon-HDF5, an open and efficient file format to simplify exchange and long-term accessibility of data from single-molecule fluorescence experiments based on photon-counting detectors such as single-photon avalanche diode, photomultiplier tube, or arrays of such detectors. The format is based on HDF5, a widely used platform- and language-independent hierarchical file format for which user-friendly viewers are available. Photon-HDF5 can store raw photon data (timestamp, channel number, etc.) from any acquisition hardware, but also setup and sample description, information on provenance, authorship and other metadata, and is flexible enough to include any kind of custom data. The format specifications are hosted on a public website, which is open to contributions by the biophysics community. As an initial resource, the website provides code examples to read Photon-HDF5 files in several programming languages and a reference Python library (phconvert), to create new Photon-HDF5 files and convert several existing file formats into Photon-HDF5. To encourage adoption by the academic and commercial communities, all software is released under the MIT open source license. Copyright © 2016 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  14. Photon-HDF5: An Open File Format for Timestamp-Based Single-Molecule Fluorescence Experiments

    PubMed Central

    Ingargiola, Antonino; Laurence, Ted; Boutelle, Robert; Weiss, Shimon; Michalet, Xavier

    2016-01-01

    We introduce Photon-HDF5, an open and efficient file format to simplify exchange and long-term accessibility of data from single-molecule fluorescence experiments based on photon-counting detectors such as single-photon avalanche diode, photomultiplier tube, or arrays of such detectors. The format is based on HDF5, a widely used platform- and language-independent hierarchical file format for which user-friendly viewers are available. Photon-HDF5 can store raw photon data (timestamp, channel number, etc.) from any acquisition hardware, but also setup and sample description, information on provenance, authorship and other metadata, and is flexible enough to include any kind of custom data. The format specifications are hosted on a public website, which is open to contributions by the biophysics community. As an initial resource, the website provides code examples to read Photon-HDF5 files in several programming languages and a reference Python library (phconvert), to create new Photon-HDF5 files and convert several existing file formats into Photon-HDF5. To encourage adoption by the academic and commercial communities, all software is released under the MIT open source license. PMID:26745406

  15. Photon-HDF5: an open file format for single-molecule fluorescence experiments using photon-counting detectors

    DOE PAGES

    Ingargiola, A.; Laurence, T. A.; Boutelle, R.; ...

    2015-12-23

    We introduce Photon-HDF5, an open and efficient file format to simplify exchange and long-term accessibility of data from single-molecule fluorescence experiments based on photon-counting detectors such as single-photon avalanche diodes (SPAD), photomultiplier tubes (PMT), or arrays of such detectors. The format is based on HDF5, a widely used platform- and language-independent hierarchical file format for which user-friendly viewers are available. Photon-HDF5 can store raw photon data (timestamp, channel number, etc.) from any acquisition hardware, but also setup and sample descriptions, information on provenance, authorship, and other metadata, and is flexible enough to include any kind of custom data. The format specifications are hosted on a public website, which is open to contributions by the biophysics community. As an initial resource, the website provides code examples to read Photon-HDF5 files in several programming languages and a reference Python library (phconvert) to create new Photon-HDF5 files and convert several existing file formats into Photon-HDF5. To encourage adoption by the academic and commercial communities, all software is released under the MIT open source license.

  16. Developing a middleware to support HDF data access in ArcGIS

    NASA Astrophysics Data System (ADS)

    Sun, M.; Jiang, Y.; Yang, C. P.

    2014-12-01

    Hierarchical Data Format (HDF) is the standard data format for NASA Earth Observing System (EOS) data products, such as the MODIS level-3 data. These data have been widely used in long-term studies of the land surface, biosphere, atmosphere, and oceans of the Earth. Several toolkits have been developed to access HDF data, such as the HDF viewer and the Geospatial Data Abstraction Library (GDAL). ArcGIS integrates GDAL, providing data users a graphical user interface (GUI) to read HDF data. However, there are still some problems when using these toolkits: for example, (1) the projection information is not recognized correctly, (2) the image is displayed inverted, and (3) the tools lack the capability to read the third-dimension information stored in the data subsets. Accordingly, in this study we attempt to improve the current HDF toolkits to address the aforementioned issues. Considering the wide usage of ArcGIS, we develop a middleware for ArcGIS, based on GDAL, to solve the particular data access problems occurring in ArcGIS, so that data users can access HDF data successfully and perform further analysis with the ArcGIS geoprocessing tools.
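    Two of the problems listed above (the inverted image and the unrecognized projection) lend themselves to a small post-processing layer of the kind such a middleware might contain. The sketch below is illustrative only: it assumes the raster has already been read into a NumPy array by a generic reader such as GDAL, it is not the authors' middleware, and the sinusoidal projection string is simply the commonly used PROJ definition for MODIS level-3 grids.

```python
# Illustrative post-processing sketch (NOT the authors' middleware):
# fix a vertically inverted grid and attach a projection definition that
# a generic HDF reader failed to recover.
import numpy as np

# Commonly used PROJ string for the MODIS sinusoidal grid.
MODIS_SINUSOIDAL_PROJ4 = "+proj=sinu +R=6371007.181 +nadgrids=@null +wktext"

def fix_hdf_grid(band, north_up=False):
    """Return (array, metadata) with orientation and projection corrected."""
    arr = band if north_up else np.flipud(band)  # undo vertical inversion
    meta = {"projection": MODIS_SINUSOIDAL_PROJ4}
    return arr, meta
```

    In a real middleware these corrections would be applied between the GDAL read and the hand-off to ArcGIS, so downstream geoprocessing tools see a correctly oriented, correctly georeferenced raster.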

  17. Post-fire hillslope debris flows: Evidence of a distinct erosion process

    NASA Astrophysics Data System (ADS)

    Langhans, Christoph; Nyman, Petter; Noske, Philip J.; Van der Sant, Rene E.; Lane, Patrick N. J.; Sheridan, Gary J.

    2017-10-01

    After wildfire, a hitherto unexplained erosion process that some authors have called 'miniature debris flows on hillslopes', and that leaves behind levee-lined rills, has been observed in some regions of the world. Despite the unusual proposition of debris flow on planar hillslopes, the process has not received much attention. The objectives of this study were to (1) accumulate observational evidence of hillslope debris flows (HDF) as we have defined the process, (2) understand their initiation process by conducting runoff experiments on hillslopes, (3) propose a conceptual model of HDF, and (4) contrast and classify HDF relative to other erosion and transport processes in the post-wildfire hillslope domain. HDF have been observed at relatively steep slope gradients (0.4-0.8), on a variety of geologies, and after fire of at least moderate severity, and consist of a lobe of gravel- to cobble-sized material 0.2-1 m wide that is pushed by runoff damming up behind it. During initiation, runoff moved individual particles that accumulated a small distance downslope until the accumulation of grains failed and formed the granular lobe of the HDF. HDF are a threshold process, and runoff rates of 0.5-2 L s⁻¹ were required for their initiation during the experiments. The conceptual model highlights HDF as a geomorphic process distinct from channel debris flows, because they occur on planar, unconfined hillslopes rather than in confined channels. HDF can erode very coarse non-cohesive surface soil, which distinguishes them from rill erosion, with its suspended and bedload transport. On a matrix of slope and grain size, HDF are enveloped between purely gravity-driven dry ravel and mostly runoff-driven bedload transport in rills.

  18. Ghost Remains After Black Hole Eruption

    NASA Astrophysics Data System (ADS)

    2009-05-01

    NASA's Chandra X-ray Observatory has found a cosmic "ghost" lurking around a distant supermassive black hole. This is the first detection of such a high-energy apparition, and scientists think it is evidence of a huge eruption produced by the black hole. This discovery presents astronomers with a valuable opportunity to observe phenomena that occurred when the Universe was very young. The X-ray ghost, so called because a diffuse X-ray source has remained after other radiation from the outburst has died away, is in the Chandra Deep Field-North, one of the deepest X-ray images ever taken. The source, a.k.a. HDF 130, is over 10 billion light years away and existed at a time 3 billion years after the Big Bang, when galaxies and black holes were forming at a high rate. "We'd seen this fuzzy object a few years ago, but didn't realize until now that we were seeing a ghost", said Andy Fabian of Cambridge University in the United Kingdom. "It's not out there to haunt us, rather it's telling us something - in this case what was happening in this galaxy billions of years ago." Fabian and colleagues think the X-ray glow from HDF 130 is evidence of a powerful outburst from its central black hole in the form of jets of energetic particles traveling at almost the speed of light. When the eruption was ongoing, it produced prodigious amounts of radio and X-radiation, but after several million years the radio signal faded from view as the electrons radiated away their energy. [Chandra X-ray image of HDF 130] However, less energetic electrons can still produce X-rays by interacting with the pervasive sea of photons remaining from the Big Bang - the cosmic background radiation. Collisions between these electrons and the background photons can impart enough energy to the photons to boost them into the X-ray energy band. This process produces an extended X-ray source that lasts for another 30 million years or so. 
"This ghost tells us about the black hole's eruption long after it has died," said co-author Scott Chapman, also of Cambridge University. "This means we don't have to catch the black holes in the act to witness the big impact they have." This is the first X-ray ghost ever seen after the demise of radio-bright jets. Astronomers have observed extensive X-ray emission with a similar origin, but only from galaxies with radio emission on large scales, signifying continued eruptions. In HDF 130, only a point source is detected in radio images, coinciding with the massive elliptical galaxy seen in its optical image. This radio source indicates the presence of a growing supermassive black hole. "This result hints that the X-ray sky should be littered with such ghosts," said co-author Caitlin Casey, also of Cambridge, "especially if black hole eruptions are as common as we think they are in the early Universe." The power contained in the black hole eruption was likely considerable, equivalent to about a billion supernovas. The energy is dumped into the surroundings and transports and heats the gas. "Even after the ghost disappears, most of the energy from the black hole's eruption remains", said Fabian. "Because they're so powerful, these eruptions can have profound effects lasting for billions of years." The details of Chandra's data on HDF 130 helped secure its true nature. For example, in X-rays, HDF 130 has a cigar-like shape that extends for some 2.2 million light years. The linear shape of the X-ray source is consistent with the shape of radio jets and not with that of a galaxy cluster, which would be expected to be circular. The energy distribution of the X-rays is also consistent with the interpretation of an X-ray ghost. 
NASA's Marshall Space Flight Center in Huntsville, Ala., manages the Chandra program for NASA's Science Mission Directorate in Washington. The Smithsonian Astrophysical Observatory controls Chandra's science and flight operations from Cambridge, Mass.

  19. Effectiveness of Haemodiafiltration with Heat Sterilized High-Flux Polyphenylene HF Dialyzer in Reducing Free Light Chains in Patients with Myeloma Cast Nephropathy

    PubMed Central

    2015-01-01

    Introduction In cases of myeloma cast nephropathy in need of haemodialysis (HD), reduction of free light chains using HD with high-cut-off filters (HCO-HD), in combination with chemotherapy, may be associated with better renal recovery. The aim of the present study is to evaluate the effectiveness of haemodiafiltration (HDF) in reducing free light chain levels using a less expensive heat-sterilized high-flux polyphenylene HF dialyzer (HF-HDF). Methods In a single-centre prospective cohort study, 327 dialysis sessions were performed using a 2.2 m2 heat-sterilized high-flux polyphenylene HF dialyzer (Phylther HF22SD), or a small (1.1 m2) or large (2.1 m2) high-cut-off (HCO) dialyzer (HCOS and HCOL), in a cohort of 16 patients presenting with dialysis-dependent acute cast nephropathy and elevated free light chains (10 kappa, 6 lambda). The outcomes of the study were the mean reduction ratio (RR) of kappa and lambda, the proportion of treatments with an RR of at least 0.65, albumin loss, and a description of patient outcomes. Statistical analysis was performed using linear and logistic regression through generalized estimating equation analysis, so as to take into account repeated observations within subjects and adjust for session duration. Results There were no significant differences in the estimated marginal means of the kappa RR, which were respectively 0.67, 0.69, and 0.70 with HCOL-HD, HCOS-HDF, and HF-HDF (P = 0.950). The estimated marginal means of the proportions of treatments with a kappa RR ≥0.65 were 68%, 63%, and 71% with HCOL-HD, HCOS-HDF, and HF-HDF, respectively (P = 0.913). The estimated marginal means of the lambda RR were higher with HCOL-HDF (0.78), compared to HCOL-HD and HF-HDF (0.62 and 0.61, respectively). The estimated marginal mean proportion of treatments with a lambda RR ≥0.65 was higher with HCOL-HDF (81%), compared to 57% with HF-HDF (P = 0.042). The median albumin losses were 7, 21, and 63 g/session with HF-HDF, HCOL-HD, and HCOL-HDF, respectively (P = 0.044). 
Among survivors, 9 out of 10 episodes of acute kidney injury became dialysis-independent following a median time of renal replacement therapy of 40 days (range 7–181). Conclusion In patients with acute dialysis-dependent myeloma cast nephropathy, in addition to chemotherapy, HDF with a heat-sterilized high-flux polyphenylene HF dialyzer could offer an alternative to HCO dialysis for extracorporeal kappa reduction, with lower albumin loss. PMID:26466100

  20. The NeXus data format.

    PubMed

    Könnecke, Mark; Akeroyd, Frederick A; Bernstein, Herbert J; Brewster, Aaron S; Campbell, Stuart I; Clausen, Björn; Cottrell, Stephen; Hoffmann, Jens Uwe; Jemian, Pete R; Männicke, David; Osborn, Raymond; Peterson, Peter F; Richter, Tobias; Suzuki, Jiro; Watts, Benjamin; Wintersberger, Eugen; Wuttke, Joachim

    2015-02-01

    NeXus is an effort by an international group of scientists to define a common data exchange and archival format for neutron, X-ray, and muon experiments. NeXus is built on top of the scientific data format HDF5 and adds domain-specific rules for organizing data within HDF5 files, in addition to a dictionary of well-defined domain-specific field names. The NeXus data format has two purposes. First, it defines a format that can serve as a container for all relevant data associated with a beamline; this is a very important use case. Second, it defines standards, in the form of application definitions, for the exchange of data between applications. NeXus provides structures for raw experimental data as well as for processed data.
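    The domain-specific rules NeXus layers on top of HDF5 amount, at their simplest, to tagging ordinary HDF5 groups with an NX_class attribute and naming a default plottable dataset. A minimal h5py sketch, assuming nothing beyond the core NXentry/NXdata conventions (a real NeXus file would carry considerably more metadata):

```python
# Minimal NeXus-on-HDF5 sketch: plain HDF5 groups become NeXus classes via
# the NX_class attribute, and NXdata's "signal" attribute names the default
# plottable dataset. A real NeXus file carries much more metadata.
import numpy as np
import h5py

def write_nexus(path, counts):
    """Write a skeletal NXentry/NXdata hierarchy holding a counts array."""
    with h5py.File(path, "w") as f:
        entry = f.create_group("entry")
        entry.attrs["NX_class"] = "NXentry"
        data = entry.create_group("data")
        data.attrs["NX_class"] = "NXdata"
        data.attrs["signal"] = "counts"  # default plottable field
        data.create_dataset("counts", data=counts)
```

    Because the container is plain HDF5, any generic HDF5 viewer can open such a file; the NX_class tags are what let NeXus-aware tools find the beamline-specific structure.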

  1. Converting from XML to HDF-EOS

    NASA Technical Reports Server (NTRS)

    Ullman, Richard; Bane, Bob; Yang, Jingli

    2008-01-01

    A computer program recreates an HDF-EOS file from an Extensible Markup Language (XML) representation of the contents of that file. This program is one of two programs written to enable testing of the schemas described in the immediately preceding article to determine whether the schemas capture all details of HDF-EOS files.

  2. The AstroHDF Effort

    NASA Astrophysics Data System (ADS)

    Masters, J.; Alexov, A.; Folk, M.; Hanisch, R.; Heber, G.; Wise, M.

    2012-09-01

    Here we update the astronomy community on our effort to deal with the demands of ever-increasing astronomical data size and complexity, using the Hierarchical Data Format, version 5 (HDF5) format (Wise et al. 2011). NRAO, LOFAR and VAO have joined forces with The HDF Group to write an NSF grant, requesting funding to assist in the effort. This paper briefly summarizes our motivation for the proposed project, an outline of the project itself, and some of the material discussed at the ADASS Birds of a Feather (BoF) discussion. Topics of discussion included: community experiences with HDF5 and other file formats; toolsets which exist and/or can be adapted for HDF5; a call for development towards visualizing large (> 1 TB) image cubes; and, general lessons learned from working with large and complex data.

  3. Efficiently Serving HDF5 Products via OPeNDAP

    NASA Technical Reports Server (NTRS)

    Yang, Kent

    2017-01-01

    Hyrax OPeNDAP services are widely used by Earth Science data centers at NASA, NOAA and other organizations to serve end users. In this talk, we will present some key features added to the HDF5 Hyrax OPeNDAP handler that can help data centers better serve HDF5/netCDF-4 data products. Among these new features, we will focus on the following: 1) DAP4 support; 2) memory-cache and disk-cache support, which can reduce service access time; 3) an enhancement that makes swath-like HDF5 products visualizable by CF-aware client tools. We will also discuss in depth the role of the HDF5 handler in the recent study of the Hyrax service in a cloud environment.

  4. [Unconventional hemodiafiltration: double-high-flux and push-pull].

    PubMed

    Lentini, Paolo; Pellanda, Valentina; Contestabile, Andrea; Berlingo, Graziella; de Cal, Massimo; Ronco, Claudio; Dell'Aquila, Roberto

    2012-01-01

    Growing evidence demonstrates that morbidity and mortality in patients with end-stage renal disease correlate significantly with retention of larger uremic toxins, including β2-microglobulin. Even when hemodialysis is performed, complications such as dialysis-associated amyloidosis are likely to develop. These complications seem to be related to the retention and accumulation of larger uremic substances, only a small amount of which is removed by hemodialysis. On-line hemodiafiltration (OL-HDF) is popular but expensive; double-high-flux hemodiafiltration (DHF-HDF) and push-pull hemodiafiltration (PP-HDF), special types of HDF, are very efficient treatments without the need for ultrapure substitution fluid. In DHF-HDF, two high-flux dialyzers are connected in series by the blood and dialysate lines. In the first dialyzer, mixed diffusion-convection removes fluid and solutes; in the second dialyzer, backfiltration of sterile dialysate occurs, resembling the post-dilution OL-HDF mode. The PP-HDF method alternates rapid convection of body fluids and rapid backfiltration of sterile pyrogen-free dialysate using a high-flux membrane and a double-pump system. These treatments require an elevated blood flow and have the advantage that they use dialysis fluid instead of ultrapure fluid. Several studies have shown an elevated removal rate of middle molecules and a reduction of dialysis-related amyloidosis symptoms such as back and shoulder pain, restless legs syndrome, and carpal tunnel syndrome.

  5. The HDF Product Designer - Interoperability in the First Mile

    NASA Astrophysics Data System (ADS)

    Lee, H.; Jelenak, A.; Habermann, T.

    2014-12-01

    Interoperable data have been a long-time goal in many scientific communities. The recent growth in analysis, visualization and mash-up applications that expect data stored in a standardized manner has brought the interoperability issue to the fore. On the other hand, producing interoperable data is often regarded as a sideline task in a typical research team, for which resources are not readily available. The HDF Group is developing a software tool aimed at lessening the burden of creating data in standards-compliant, interoperable HDF5 files. The tool, named HDF Product Designer, lowers the threshold for designing such files by providing a user interface that combines the rich HDF5 feature set with applicable metadata conventions. Users can quickly devise new HDF5 files while seamlessly incorporating the latest best practices and conventions from their community. That is what the term "interoperability in the first mile" means: enabling the generation of interoperable data in HDF5 files from the onset of their production. The tool also incorporates collaborative features, allowing a team approach to file design, as well as easy transfer of best practices as they are developed. The current state of the tool and plans for future development will be presented. Constructive input from interested parties is always welcome.
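
A hedged sketch of the kind of file the Designer is meant to help produce: an HDF5 file whose attributes follow the CF metadata conventions, written directly with h5py. The attribute names are CF; the variable name and values are invented for illustration.

```python
# Sketch (not the HDF Product Designer itself): an HDF5 file carrying
# CF-convention attributes, the "first mile" interoperability the tool
# automates. The variable name and values here are invented.
import os
import tempfile

import h5py
import numpy as np

path = os.path.join(tempfile.mkdtemp(), "product.h5")
with h5py.File(path, "w") as f:
    f.attrs["Conventions"] = "CF-1.8"                    # file-level convention tag
    t = f.create_dataset("temperature", data=np.full((4, 5), 288.0, dtype="f4"))
    t.attrs["units"] = "K"                               # CF: udunits-compatible units
    t.attrs["long_name"] = "surface temperature"
    t.attrs["_FillValue"] = np.float32(-9999.0)          # CF: missing-data sentinel
```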

  6. Intermittent hemodialysis is superior to continuous veno-venous hemodialysis/hemodiafiltration to eliminate methanol and formate during treatment for methanol poisoning

    PubMed Central

    Zakharov, Sergey; Pelclova, Daniela; Navratil, Tomas; Belacek, Jaromir; Kurcova, Ivana; Komzak, Ondrej; Salek, Tomas; Latta, Jiri; Turek, Radovan; Bocek, Robert; Kucera, Cyril; Hubacek, Jaroslav A; Fenclova, Zdenka; Petrik, Vit; Cermak, Martin; Hovda, Knut Erik

    2014-01-01

    During an outbreak of methanol poisonings in the Czech Republic in 2012, we were able to study methanol and formate elimination half-lives during intermittent hemodialysis (IHD) and continuous veno-venous hemodialysis/hemodiafiltration (CVVHD/HDF) and the relative impact of dialysate and blood flow rates on elimination. Data were obtained from 11 IHD and 13 CVVHD/HDF patients. Serum methanol and formate concentrations were measured by gas chromatography and an enzymatic method. The groups were relatively comparable, but the CVVHD/HDF group was significantly more acidotic (mean pH 6.9 vs. 7.1 for IHD). The mean elimination half-life was 3.7 h for methanol and 1.6 h for formate with IHD, versus 8.1 and 3.6 h, respectively, with CVVHD/HDF (both significant). The 54% and 56% shorter methanol and formate elimination half-lives during IHD resulted from the higher blood and dialysate flow rates. Increased blood and dialysate flow on CVVHD/HDF also increased elimination significantly. Thus, IHD is superior to CVVHD/HDF for more rapid methanol and formate elimination, and if CVVHD/HDF is the only treatment available, then elimination is greater with greater blood and dialysate flow rates. PMID:24621917

  7. The Hierarchical Data Format as a Foundation for Community Data Sharing

    NASA Astrophysics Data System (ADS)

    Habermann, T.

    2017-12-01

    Hierarchical Data Format (HDF) formats and libraries have been used for several decades by individual researchers and major science programs across many Earth and Space Science disciplines and sectors to provide high-performance information storage and access. Generic group, dataset, and attribute objects in HDF have been combined in many ways to form domain objects that scientists understand and use. Well-known applications of HDF in the Earth Sciences include thousands of global satellite observations and products produced by NASA's Earth Observing System using the HDF-EOS conventions, navigation-quality bathymetry produced as Bathymetric Attributed Grids (BAGs) by the OpenNavigationSurface project and others, seismic wave collections written in the Adaptable Seismic Data Format (ASDF), and many oceanographic and atmospheric products produced using the Climate and Forecast (CF) conventions with the netCDF4 data model and API to HDF5. This is the modus operandi of these communities: 1) develop a model of scientific data objects and associated metadata used in a domain; 2) implement that model using HDF; 3) develop software libraries that connect that model to tools; and 4) encourage adoption of those tools in the community. Understanding these domain object implementations and facilitating communication across communities is an important goal of The HDF Group. We will discuss these examples and approaches to community outreach during this session.

  8. A comparison of self-reported quality of life for an Australian haemodialysis and haemodiafiltration cohort.

    PubMed

    Hill, Kathleen E; Kim, Susan; Crail, Susan; Elias, Tony J; Whittington, Tiffany

    2017-08-01

    Haemodiafiltration (HDF) has been widely studied for evidence of superior outcomes in comparison with conventional haemodialysis (HD), and there is increasing interest in determining whether HDF confers any benefit in relation to quality of life. Studies have been conducted with randomized incident patients; however, little is known regarding HDF and quality of life for prevalent patients. This study examined and compared self-reported quality of life at two time points, 12 months apart, in a cohort of satellite HD and HDF patients, using a disease-specific questionnaire to determine whether HDF conferred an advantage. A longitudinal study with a linear mixed-effect model measured quality of life in a cohort of 171 patients (HD, n = 85; HDF, n = 86) in seven South Australian satellite dialysis centres. Factors associated with significant reduction across the Kidney Disease Quality Of Life™ domains measured were younger age (-20 to -29) and comorbid diabetes (-4.8 to -11.1). HDF was not associated with moderation of this reduction at either time point (P > 0.05). Baseline physical functioning was reported as very low (median 33.9) and was further reduced at time point two. In addition, dialysing for more than 12 h per week in a satellite dialysis unit was associated with reduced quality of life in relation to the burden of kidney disease (-13.69). This study has demonstrated that younger age and comorbid diabetes were responsible for a statistically significant reduction in quality of life, and that HDF did not confer any advantage.

  9. Keck Spectroscopy of Redshift z ~ 3 Galaxies in the Hubble Deep Field

    NASA Astrophysics Data System (ADS)

    Lowenthal, James D.; Koo, David C.; Guzmán, Rafael; Gallego, Jesús; Phillips, Andrew C.; Faber, S. M.; Vogt, Nicole P.; Illingworth, Garth D.; Gronwall, Caryl

    1997-05-01

    We have obtained spectra with the 10 m Keck telescope of a sample of 24 galaxies having colors consistent with star-forming galaxies at redshifts 2 <~ z <~ 4.5 in the Hubble Deep Field (HDF). Eleven of these galaxies are confirmed to be at high redshift (z_med = 3.0), one is at z = 0.5, and the other 12 have uncertain redshifts but have spectra consistent with their being at z > 2. The spectra of the confirmed high-redshift galaxies show a diversity of features, including weak Lyα emission, strong Lyα breaks or damped Lyα absorption profiles, and the stellar and interstellar rest-UV absorption lines common to local starburst galaxies and high-redshift star-forming galaxies reported recently by others. The narrow profiles and low equivalent widths of C IV, Si IV, and N V absorption lines may imply low stellar metallicities. Combined with the five high-redshift galaxies in the HDF previously confirmed with Keck spectra by Steidel et al. (1996a), the 16 confirmed sources yield a comoving volume density of n >= 2.4 × 10^-4 h_50^3 Mpc^-3 for q0 = 0.05, or n >= 1.1 × 10^-3 h_50^3 Mpc^-3 for q0 = 0.5. These densities are 3-4 times higher than the recent estimates of Steidel et al. (1996b) based on ground-based photometry with slightly brighter limits and are comparable to estimates of the local volume density of galaxies brighter than L*. The high-redshift density measurement is only a lower limit and could be almost 3 times higher still if all 29 of the unconfirmed candidates in our original sample, including those not observed, are indeed also at high redshift. The galaxies are small but luminous, with half-light radii 1.8 < r_1/2 < 6.5 h_50^-1 kpc and absolute magnitudes -21.5 > M_B > -23. The HST images show a wide range of morphologies, including several with very close, small knots of emission embedded in wispy extended structures. Using rest-frame UV continuum fluxes with no dust correction, we calculate star formation rates in the range 7-24 or 3-9 h_50^-2 M_sun yr^-1 for q0 = 0.05 and q0 = 0.5, respectively. These rates overlap those for local spiral and H II galaxies today, although they could be more than twice as high if dust extinction in the UV is significant. If the objects at z = 3 were simply to fade by 5 mag (assuming a 10^7 yr burst and passive evolution) without mergers in the 14 Gyr between then and now (for q0 = 0.05, h_50 = 1.0), they would resemble average dwarf elliptical/spheroidal galaxies in both luminosity and size. However, the variety of morphologies and the high number density of z = 3 galaxies in the HDF suggest that they represent a range of physical processes and stages of galaxy formation and evolution, rather than any one class of object, such as massive ellipticals. A key issue remains the measurement of masses. These high-redshift objects are likely to be the low-mass, starbursting building blocks of more massive galaxies seen today. Based on observations obtained at the W. M. Keck Observatory, which is operated jointly by the University of California and the California Institute of Technology, and with the NASA/ESA Hubble Space Telescope, which is operated by AURA, Inc., under contract with NASA.

  10. MISR Level 2 Cloud Product Versioning

    Atmospheric Science Data Center

    2017-10-11

    ... New ancillary files: MISR_AM1_ASCT_BDAS_(WIN,SPR,SUM,FALL)_DCCAM_ T<901-932>_F02_0005.hdf MISR_AM1_ASCT_BDAS_(WIN,SPR,SUM,FALL)_DBCAM_ T<901-932>_F02_0005.hdf MISR_AM1_ASCT_BDAS_(WIN,SPR,SUM,FALL)_CBCAM_ T<901-932>_F02_0005.hdf ...

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ingargiola, A.; Laurence, T. A.; Boutelle, R.

    We introduce Photon-HDF5, an open and efficient file format to simplify the exchange and long-term accessibility of data from single-molecule fluorescence experiments based on photon-counting detectors such as single-photon avalanche diodes (SPADs), photomultiplier tubes (PMTs), or arrays of such detectors. The format is based on HDF5, a widely used platform- and language-independent hierarchical file format for which user-friendly viewers are available. Photon-HDF5 can store raw photon data (timestamps, channel numbers, etc.) from any acquisition hardware, but also setup and sample descriptions, information on provenance, authorship and other metadata, and is flexible enough to include any kind of custom data. The format specifications are hosted on a public website, which is open to contributions from the biophysics community. As an initial resource, the website provides code examples for reading Photon-HDF5 files in several programming languages and a reference Python library (phconvert) to create new Photon-HDF5 files and convert several existing file formats into Photon-HDF5. To encourage adoption by the academic and commercial communities, all software is released under the MIT open source license.
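
A minimal sketch of the layout described above, written with h5py. The field names (photon_data/timestamps, detectors, timestamps_specs/timestamps_unit) are assumptions based on the published spec; treat the specification on the project website and the phconvert reference library as authoritative.

```python
# Hedged sketch of a Photon-HDF5-style file written with h5py. Field names
# are assumptions based on the published spec; use phconvert for real files.
import os
import tempfile

import h5py
import numpy as np

rng = np.random.default_rng(0)
timestamps = np.sort(rng.integers(0, 10**8, size=1000))   # photon arrival clock ticks
detectors = rng.integers(0, 2, size=1000).astype("u1")    # channel number per photon

path = os.path.join(tempfile.mkdtemp(), "smfret.h5")
with h5py.File(path, "w") as f:
    pd = f.create_group("photon_data")
    pd.create_dataset("timestamps", data=timestamps)
    pd.create_dataset("detectors", data=detectors)
    specs = pd.create_group("timestamps_specs")
    specs.create_dataset("timestamps_unit", data=12.5e-9)  # seconds per clock tick
```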

  12. Cost-effectiveness analysis of online hemodiafiltration versus high-flux hemodialysis.

    PubMed

    Ramponi, Francesco; Ronco, Claudio; Mason, Giacomo; Rettore, Enrico; Marcelli, Daniele; Martino, Francesca; Neri, Mauro; Martin-Malo, Alejandro; Canaud, Bernard; Locatelli, Francesco

    2016-01-01

    Clinical studies suggest that hemodiafiltration (HDF) may lead to better clinical outcomes than high-flux hemodialysis (HF-HD), but concerns have been raised about the cost-effectiveness of HDF versus HF-HD. The aim of this study was to investigate whether the clinical benefits, in terms of longer survival and better health-related quality of life, are worth the possibly higher costs of HDF compared to HF-HD. The analysis comprised a simulation based on the combined results of previously published studies, with the following steps: 1) estimation of the survival function of HF-HD patients from a clinical trial and of HDF patients using the risk reduction estimated in a meta-analysis; 2) simulation of the survival of the same sample of patients as if allocated to HF-HD or HDF using three-state Markov models; and 3) application of state-specific health-related quality of life coefficients and differential costs derived from the literature. Several Monte Carlo simulations were performed, including simulations for patients with different risk profiles, for example, by age (patients aged 40, 50, and 60 years), sex, and diabetic status. Scatter plots of simulations in the cost-effectiveness plane were produced, incremental cost-effectiveness ratios were estimated, and cost-effectiveness acceptability curves were computed. An incremental cost-effectiveness ratio of €6,982/quality-adjusted life year (QALY) was estimated for the baseline cohort of 50-year-old male patients. Given the commonly accepted threshold of €40,000/QALY, HDF is cost-effective. The probabilistic sensitivity analysis showed that HDF is cost-effective with a probability of ~81% at a threshold of €40,000/QALY. It is fundamental to also measure outcomes in terms of quality of life. HDF is more cost-effective for younger patients. HDF can be considered cost-effective compared to HF-HD.

  13. New insights into the effect of haemodiafiltration on mortality: the Romanian experience.

    PubMed

    Siriopol, Dimitrie; Canaud, Bernard; Stuard, Stefano; Mircescu, Gabriel; Nistor, Ionut; Covic, Adrian

    2015-02-01

    Haemodiafiltration (HDF), by successfully removing the larger solutes and protein-bound compounds, may offer a feasible approach to improving dialysis outcomes. Recently, three large, randomized, controlled trials have tested this hypothesis, but only one showed an improved survival associated with HDF treatment when compared with haemodialysis (HD). This is a retrospective analysis of the entire Romanian dialysed population from the European Clinical Database (EUCLID) Fresenius Medical Care Database. We conducted two types of analysis. First, we used an intention-to-treat approach including all patients who were on dialysis (either HDF or HD) at 1 March 2010 (the 'prevalent cohort analysis'). We then considered only the incident patients who started dialysis (either HDF or HD) after 1 March 2010 (the 'incident cohort analysis'). In both analyses, patients were followed until 31 April 2013. In the prevalent cohort, we included 1546 patients who were already on dialysis at the first time point: 1322 on HD and 224 on HDF. When compared with HD, HDF treatment was associated with reduced mortality in both univariate and multivariate survival analysis (HR = 0.67, 95% CI 0.46-0.96 and HR = 0.58, 95% CI 0.36-0.93, respectively). In the incident cohort, 2447 patients started dialysis (2181 HD and 266 HDF) during the observation period. Patients in the HDF group maintained a reduced risk of all-cause mortality (HR = 0.20, 95% CI 0.11-0.38 for the univariate and HR = 0.24, 95% CI 0.13-0.46 for the fully adjusted model). This study suggests that HDF treatment could reduce all-cause mortality in incident and prevalent patients even after correction for different confounders. Interestingly, an additional survival benefit could be observed in incident patients. However, as with any observational study, there could have been other unmeasured confounders that could have influenced our final results.

  14. SV40-transformed human fibroblasts: evidence for cellular aging in pre-crisis cells.

    PubMed

    Stein, G H

    1985-10-01

    Pre-crisis SV40-transformed human diploid fibroblast (HDF) cultures have a finite proliferative lifespan, but they do not enter a viable senescent state at the end of that lifespan. Little is known about either the mechanism of this finite lifespan in SV40-transformed HDF or its relationship to finite lifespan in normal HDF. Recently we proposed that in normal HDF the phenomena of finite lifespan and arrest in a viable senescent state depend on two separate processes: 1) an age-related decrease in the ability of the cells to recognize or respond to serum and/or other mitogens, such that the cells become functionally mitogen-deprived at the end of lifespan; and 2) the ability of the cells to enter a viable, G1-arrested state whenever they experience mitogen deprivation. In this paper, data are presented suggesting that pre-crisis SV40-transformed HDF retain the first process described above but lack the second. It is shown that SV40-transformed HDF have a progressively decreasing ability to respond to serum as they age, but they continue to traverse the cell cycle at the end of lifespan. Concomitantly, the rate of cell death increases steadily toward the end of lifespan, thereby causing the total population to cease growing and ultimately to decline. Previous studies have shown that when SV40-transformed HDF are environmentally serum-deprived, they likewise exhibit continued cell cycle traverse coupled with increased cell death. Thus, these results support the hypothesis that pre-crisis SV40-transformed HDF still undergo the same aging process as normal HDF, but end their lifespan in crisis rather than in the normal G1-arrested senescent state because they have lost the ability to enter a viable, G1-arrested state in response to mitogen deprivation.

  15. SAGE III/ISS L2 Solar Event Species Profiles (HDF-EOS) V5 (g3bssp)

    Atmospheric Science Data Center

    2017-12-21

    SAGE III/ISS L2 Solar Event Species Profiles (HDF-EOS) V5 (g3bssp)   Project ... present Temporal Resolution:  1 file per event File Format:  HDF-4 Tools:  Earthdata ... Radiation Longwave Radiation Shortwave Radiation Event Tag Event Type Obs Beta Angle Order Data:  ...

  16. SAGE III/ISS L2 Lunar Event Species Profiles (HDF-EOS) V5 (g3blsp)

    Atmospheric Science Data Center

    2018-01-04

    SAGE III/ISS L2 Lunar Event Species Profiles (HDF-EOS) V5 (g3blsp)   Project ... present Temporal Resolution:  1 file per event File Format:  HDF-4 Tools:  Earthdata ... Radiation Longwave Radiation Shortwave Radiation Event Tag Event Type Obs Beta Angle Order Data:  ...

  17. The NeXus data format

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Könnecke, Mark; Akeroyd, Frederick A.; Bernstein, Herbert J.

    2015-01-30

    NeXus is an effort by an international group of scientists to define a common data exchange and archival format for neutron, X-ray and muon experiments. NeXus is built on top of the scientific data format HDF5 and adds domain-specific rules for organizing data within HDF5 files, in addition to a dictionary of well defined domain-specific field names. The NeXus data format has two purposes. First, it defines a format that can serve as a container for all relevant data associated with a beamline. This is a very important use case. Second, it defines standards in the form of application definitions for the exchange of data between applications. NeXus provides structures for raw experimental data as well as for processed data.

  18. The NeXus data format

    DOE PAGES

    Könnecke, Mark; Akeroyd, Frederick A.; Bernstein, Herbert J.; ...

    2015-01-30

    NeXus is an effort by an international group of scientists to define a common data exchange and archival format for neutron, X-ray and muon experiments. NeXus is built on top of the scientific data format HDF5 and adds domain-specific rules for organizing data within HDF5 files, in addition to a dictionary of well defined domain-specific field names. The NeXus data format has two purposes. First, it defines a format that can serve as a container for all relevant data associated with a beamline. This is a very important use case. Second, it defines standards in the form of application definitions for the exchange of data between applications. As a result, NeXus provides structures for raw experimental data as well as for processed data.

  19. HDF-EOS Web Server

    NASA Technical Reports Server (NTRS)

    Ullman, Richard; Bane, Bob; Yang, Jingli

    2008-01-01

    A shell script has been written as a means of automatically making HDF-EOS-formatted data sets available via the World Wide Web. ("HDF-EOS" and variants thereof are defined in the first of the two immediately preceding articles.) The shell script chains together software tools developed by the Data Usability Group at Goddard Space Flight Center to perform the following actions: extract metadata in Object Definition Language (ODL) from an HDF-EOS file; convert the metadata from ODL to Extensible Markup Language (XML); reformat the XML metadata into human-readable Hypertext Markup Language (HTML); publish the HTML metadata and the original HDF-EOS file to a Web server and an Open-source Project for a Network Data Access Protocol (OPeNDAP) server computer; and reformat the XML metadata and submit the resulting file to the EOS Clearinghouse, a Web-based metadata clearinghouse that facilitates searching for, and exchange of, Earth-science data.

  20. SAGE III/ISS L1B Solar Event Transmission Data (HDF-EOS) V5 (g3bt)

    Atmospheric Science Data Center

    2017-12-21

    SAGE III/ISS L1B Solar Event Transmission Data (HDF-EOS) V5 (g3bt)   Project Title:  ... present Temporal Resolution:  1 file per event File Format:  HDF-4 Tools:  Earthdata ... Radiation Longwave Radiation Shortwave Radiation Event Tag Event Type Obs Beta Angle Order Data:  ...

  1. Post-Dilution on Line Haemodiafiltration with Citrate Dialysate: First Clinical Experience in Chronic Dialysis Patients

    PubMed Central

    Panichi, Vincenzo; Fiaccadori, Enrico; Fanelli, Roberto; Bernabini, Giada; Pizzarelli, Francesco

    2013-01-01

    Background. Citrate has anticoagulant properties and favorable effects on inflammation, but carries the potential hazard of inducing hypocalcemia. Bicarbonate haemodialysis (BHD) with a dialysate in which citrate replaces acetate is now used in chronic haemodialysis, but this has never been tested in postdilution online haemodiafiltration (OL-HDF). Methods. Thirteen chronic stable dialysis patients were enrolled in a pilot, short-term study. Patients underwent one week (3 dialysis sessions) of BHD with 0.8 mmol/L citrate dialysate, followed by one week of postdilution high-volume OL-HDF with standard bicarbonate dialysate, and one week of high-volume OL-HDF with 0.8 mmol/L citrate dialysate. Results. In citrate OL-HDF, pretreatment plasma levels of C-reactive protein and β2-microglobulin were significantly reduced; intra-treatment plasma acetate levels increased in the former technique and decreased in the latter. During both citrate techniques (OL-HDF and HD), ionized calcium levels remained stable within the normal range. Conclusions. Should our promising results be confirmed in a long-term study on a wider population, OL-HDF with citrate dialysate may represent a further step in improving dialysis biocompatibility. PMID:24367243

  2. Bring NASA Scientific Data into GIS

    NASA Astrophysics Data System (ADS)

    Xu, H.

    2016-12-01

    NASA's Earth Observation System (EOS) and many other missions produce huge volumes of data in near real time, driving research into and understanding of climate change. Geographic Information System (GIS) is a technology used for the management, visualization and analysis of spatial data. Since its inception in the 1960s, GIS has been applied to many fields at the city, state, national, and world scales. People continue to use it today to analyze and visualize trends, patterns, and relationships in massive scientific datasets. There is great interest in both the scientific and GIS communities in improving technologies that can bring scientific data into a GIS environment, where scientific research and analysis can be shared through the GIS platform with the public. Most NASA scientific data are delivered in the Hierarchical Data Format (HDF), a format that is both flexible and powerful. However, this flexibility creates challenges for developing GIS software support: data stored in HDF formats lack a unified standard and convention across products. This presentation introduces an information model that enables ArcGIS software to ingest NASA scientific data and create a multidimensional raster (univariate and multivariate hypercubes) for scientific visualization and analysis. We will present the framework by which ArcGIS leverages the open-source GDAL (Geospatial Data Abstraction Library) to support its raster data access, discuss how we overcame the GDAL drivers' limitations in handling scientific products stored in HDF4 and HDF5 formats, and how we improved the modeling of multidimensionality with GDAL. In addition, we will talk about the direction of ArcGIS handling of NASA products and demonstrate how the multidimensional information model can help scientists work with various data products, such as MODIS, MOPITT, and SMAP, in a GIS environment.
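
Because HDF products follow no single naming convention, any GIS ingest path must first discover which datasets in a file are raster-like. The hedged sketch below shows that discovery step with h5py directly (not GDAL); the file layout it builds is invented for illustration.

```python
# Hedged sketch: walk an HDF5 file and collect raster-like datasets, the
# discovery step a GIS ingest layer needs when products follow no single
# convention. The file layout here is invented; real products differ.
import os
import tempfile

import h5py
import numpy as np

path = os.path.join(tempfile.mkdtemp(), "granule.h5")
with h5py.File(path, "w") as f:
    f.create_dataset("Grid/precipitation", data=np.zeros((3, 4), dtype="f4"))
    f.create_dataset("Grid/lat", data=np.zeros(3, dtype="f8"))  # 1-D: not a raster

def find_rasters(h5file, min_ndim=2):
    """Return (path, shape, dtype) for every dataset with >= min_ndim axes."""
    found = []
    def visit(name, obj):
        if isinstance(obj, h5py.Dataset) and obj.ndim >= min_ndim:
            found.append((name, obj.shape, str(obj.dtype)))
    h5file.visititems(visit)
    return found

with h5py.File(path, "r") as f:
    print(find_rasters(f))  # [('Grid/precipitation', (3, 4), 'float32')]
```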

  3. The Cloud Absorption Radiometer HDF Data User's Guide

    NASA Technical Reports Server (NTRS)

    Li, Jason Y.; Arnold, G. Thomas; Meyer, Howard G.; Tsay, Si-Chee; King, Michael D.

    1997-01-01

    The purpose of this document is to describe the Cloud Absorption Radiometer (CAR) Instrument, methods used in the CAR Hierarchical Data Format (HDF) data processing, the structure and format of the CAR HDF data files, and methods for accessing the data. Examples of CAR applications and their results are also presented. The CAR instrument is a multiwavelength scanning radiometer that measures the angular distributions of scattered radiation.

  4. ZFP compression plugin (filter) for HDF5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, Mark C.

    H5Z-ZFP is a compression plugin (filter) for the HDF5 library based on the ZFP 0.5.0 compression library. It supports 4- or 8-byte integer or floating-point HDF5 datasets of any dimension, partitioned into 1-, 2-, or 3-dimensional chunks. It supports ZFP's four fundamental modes of operation: rate, precision, accuracy, and expert. It is a lossy compression plugin.
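
Filters like H5Z-ZFP attach to HDF5's per-chunk filter pipeline. In this h5py sketch the built-in gzip filter stands in for ZFP, so the chunking-plus-filter mechanics are shown without assuming the plugin is installed; with H5Z-ZFP present, the dataset would instead be created with the plugin's registered filter ID.

```python
# Sketch of HDF5's chunked-dataset filter pipeline with h5py. gzip stands in
# for ZFP here; with H5Z-ZFP installed, the dataset would instead name the
# plugin's registered filter ID. Note gzip is lossless, while ZFP's rate,
# precision, and accuracy modes are lossy.
import os
import tempfile

import h5py
import numpy as np

rng = np.random.default_rng(1)
data = rng.random((64, 64))

path = os.path.join(tempfile.mkdtemp(), "compressed.h5")
with h5py.File(path, "w") as f:
    # The filter runs per 16x16 chunk, mirroring H5Z-ZFP's 1-3D chunk rule.
    f.create_dataset("field", data=data, chunks=(16, 16), compression="gzip")

with h5py.File(path, "r") as f:
    assert f["field"].compression == "gzip"
    assert np.array_equal(f["field"][...], data)  # gzip round-trips losslessly
```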

  5. Multiple Independent File Parallel I/O with HDF5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, M. C.

    2016-07-13

    The HDF5 library has supported the I/O requirements of HPC codes at Lawrence Livermore National Laboratory (LLNL) since the late 1990s. In particular, HDF5 used in the Multiple Independent File (MIF) parallel I/O paradigm has supported LLNL codes' scalable I/O requirements and has recently been gainfully used at scales as large as O(10^6) parallel tasks.
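
A stdlib-only sketch of the MIF bookkeeping (an illustration, not LLNL code): N tasks are divided among M files, the files are written concurrently, and within each file the member tasks write one at a time in a baton-passing order.

```python
# Hedged, stdlib-only sketch of Multiple Independent File (MIF) bookkeeping:
# concurrency across files, sequential "baton" ordering within each file.
# Real implementations coordinate the baton with MPI; this only computes
# the task-to-file assignment.
def mif_assignment(n_tasks, n_files):
    """Return {file_index: [task ranks in their within-file write order]}."""
    groups = {}
    for rank in range(n_tasks):
        groups.setdefault(rank % n_files, []).append(rank)
    return groups

# 8 tasks over 3 files: each list writes sequentially to its file while
# the 3 files are produced concurrently.
print(mif_assignment(8, 3))  # {0: [0, 3, 6], 1: [1, 4, 7], 2: [2, 5]}
```

Because only one task per file holds the baton at a time, the number of files trades off concurrency against file count, independent of the number of tasks.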

  6. A World Wide Web Human Dimensions Framework and Database for Wildlife and Forest Planning

    Treesearch

    Michael A. Tarrant; Alan D. Bright; H. Ken Cordell

    1999-01-01

    The paper describes a human dimensions framework (HDF) for application in wildlife and forest planning. The HDF is delivered via the world wide web and retrieves data on-line from the Social, Economic, Environmental, Leisure, and Attitudes (SEELA) database. The proposed HDF is guided by ten fundamental HD principles, and is applied to wildlife and forest planning using...

  7. The survey on data format of Earth observation satellite data at JAXA.

    NASA Astrophysics Data System (ADS)

    Matsunaga, M.; Ikehata, Y.

    2017-12-01

    JAXA's earth observation satellite data are distributed by a portal web site for search and delivery called "G-Portal". Users can download satellite data from GPM, TRMM, Aqua, ADEOS-II, ALOS (search only), ALOS-2 (search only), MOS-1, MOS-1b, ERS-1 and JERS-1 through G-Portal. However, the data formats differ by satellite (HDF4, HDF5, NetCDF4, CEOS, etc.), and these formats are unfamiliar to new data users. Although HDF-type self-describing formats are very convenient and useful for large datasets, older-type product formats are not readable by open GIS tools and do not follow OGC standards. Recently, satellite data have been widely applied to various needs such as disaster response, earth resources, monitoring of the global environment, Geographic Information Systems (GIS), and so on. In order to remove barriers to the use of Earth satellite data by new community users, JAXA has been providing format-converted products such as GeoTIFF or KMZ. In addition, JAXA provides the format conversion tool itself. We investigate trends in data formats for data archiving, dissemination, and utilization; we then study how to improve the current product formats for users in various application fields and make recommendations for new products.

  8. Wettability of graphene-laminated micropillar structures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bong, Jihye; Seo, Keumyoung; Ju, Sanghyun, E-mail: jrahn@skku.edu, E-mail: shju@kgu.ac.kr

    2014-12-21

    The wetting control of graphene is of great interest for electronic, mechanical, architectural, and bionic applications. In this study, the wettability of graphene-laminated micropillar structures was manipulated by changing the height of the graphene-laminated structures and employing a trichlorosilane (HDF-S)-based self-assembled monolayer. Graphene-laminated micropillar structures with HDF-S exhibited higher hydrophobicity (contact angle of 129.5°) than a pristine graphene thin film (78.8°), pristine graphene-laminated micropillar structures (97.5°), and an HDF-S self-assembled graphene thin film (98.5°). The wetting state of the graphene-laminated micropillar structure with HDF-S was also examined using a urea solution, which flowed across the surface without leaving any residue.

  9. [A new method of in vitro chemosensitivity test using multicellular spheroids of cholangiocarcinoma cell line cocultured with fibroblasts].

    PubMed

    Kubota, S; Takezawa, T; Mori, Y; Takakuwa, T

    1992-09-01

    We applied multicellular spheroids consisting of a cholangiocarcinoma cell line (MEC) and human dermal fibroblasts (HDF) to an in vitro chemosensitivity test. Five-day multicellular spheroids were incubated with 1.5 micrograms/ml of mitomycin C (MMC) for 24 hrs. The cell kinetics of MEC and HDF in a spheroid were then determined by flow cytometric analysis. Twenty-four hrs after treatment with MMC, both MEC and HDF had accumulated in S phase. Seven days after treatment, the DNA histogram of MEC returned to normal, but that of HDF disappeared. These results showed that the multicellular spheroid assay could serve as a more in vivo-like chemosensitivity test.

  10. High-Efficiency Postdilution Online Hemodiafiltration Reduces All-Cause Mortality in Hemodialysis Patients

    PubMed Central

    Moreso, Francesc; Pons, Mercedes; Ramos, Rosa; Mora-Macià, Josep; Carreras, Jordi; Soler, Jordi; Torres, Ferran; Campistol, Josep M.; Martinez-Castelao, Alberto

    2013-01-01

    Retrospective studies suggest that online hemodiafiltration (OL-HDF) may reduce the risk of mortality compared with standard hemodialysis in patients with ESRD. We conducted a multicenter, open-label, randomized controlled trial in which we assigned 906 chronic hemodialysis patients either to continue hemodialysis (n=450) or to switch to high-efficiency postdilution OL-HDF (n=456). The primary outcome was all-cause mortality, and secondary outcomes included cardiovascular mortality, all-cause hospitalization, treatment tolerability, and laboratory data. Compared with patients who continued on hemodialysis, those assigned to OL-HDF had a 30% lower risk of all-cause mortality (hazard ratio [HR], 0.70; 95% confidence interval [95% CI], 0.53–0.92; P=0.01), a 33% lower risk of cardiovascular mortality (HR, 0.67; 95% CI, 0.44–1.02; P=0.06), and a 55% lower risk of infection-related mortality (HR, 0.45; 95% CI, 0.21–0.96; P=0.03). The estimated number needed to treat suggested that switching eight patients from hemodialysis to OL-HDF may prevent one annual death. The incidence rates of dialysis sessions complicated by hypotension and of all-cause hospitalization were lower in patients assigned to OL-HDF. In conclusion, high-efficiency postdilution OL-HDF reduces all-cause mortality compared with conventional hemodialysis. PMID:23411788

  11. High-efficiency postdilution online hemodiafiltration reduces all-cause mortality in hemodialysis patients.

    PubMed

    Maduell, Francisco; Moreso, Francesc; Pons, Mercedes; Ramos, Rosa; Mora-Macià, Josep; Carreras, Jordi; Soler, Jordi; Torres, Ferran; Campistol, Josep M; Martinez-Castelao, Alberto

    2013-02-01

    Retrospective studies suggest that online hemodiafiltration (OL-HDF) may reduce the risk of mortality compared with standard hemodialysis in patients with ESRD. We conducted a multicenter, open-label, randomized controlled trial in which we assigned 906 chronic hemodialysis patients either to continue hemodialysis (n=450) or to switch to high-efficiency postdilution OL-HDF (n=456). The primary outcome was all-cause mortality, and secondary outcomes included cardiovascular mortality, all-cause hospitalization, treatment tolerability, and laboratory data. Compared with patients who continued on hemodialysis, those assigned to OL-HDF had a 30% lower risk of all-cause mortality (hazard ratio [HR], 0.70; 95% confidence interval [95% CI], 0.53-0.92; P=0.01), a 33% lower risk of cardiovascular mortality (HR, 0.67; 95% CI, 0.44-1.02; P=0.06), and a 55% lower risk of infection-related mortality (HR, 0.45; 95% CI, 0.21-0.96; P=0.03). The estimated number needed to treat suggested that switching eight patients from hemodialysis to OL-HDF may prevent one annual death. The incidence rates of dialysis sessions complicated by hypotension and of all-cause hospitalization were lower in patients assigned to OL-HDF. In conclusion, high-efficiency postdilution OL-HDF reduces all-cause mortality compared with conventional hemodialysis.

  12. Exploring for Galaxies in the First Billion Years with Hubble and Spitzer - Pathfinding for JWST

    NASA Astrophysics Data System (ADS)

    Illingworth, Garth D.

    2017-01-01

    Hubble has revolutionized the field of distant galaxies through its deep imaging surveys, starting with the Hubble Deep Field (HDF) in 1995. That first deep survey revealed galaxies at redshift z~1-3 that provided insights into the development of the Hubble sequence. Each new HST instrument has explored new regimes, through the peak of star formation at z~2-3, just 2-3 billion years after the Big Bang, to our first datasets at a billion years at z~6, and then earlier to z~11. HST's survey capabilities were enhanced by 40X with ACS, and then similarly with the WFC3/IR, which opened up the first billion years to an unforeseen degree. I will discuss what we have learned from the remarkable HST and Spitzer imaging surveys (HUDF, GOODS, HUDF09/12 and CANDELS), as well as surveys of clusters like the Hubble Frontier Fields (HFF). Lensing clusters provide extraordinary opportunities for characterizing the faintest earliest galaxies, but also present extraordinary challenges. Together these surveys have resulted in the measurement of the volume density of galaxies in the first billion years down to astonishingly faint levels. The role of faint galaxies in reionizing the universe is still much-discussed, but there is no doubt that such galaxies contribute greatly to the UV ionizing flux, as shown by deep luminosity function studies. Together Hubble and Spitzer have also established the stellar-mass buildup over 97% of cosmic history. Yet some of the greatest surprises have come from the discovery of very luminous galaxies at z~8-11, around 400-650 million years after the Big Bang. Spectroscopic followup by Keck of some of these very rare, bright galaxies has confirmed redshifts from z~7 to z~9, and revealed, surprisingly, strong Lyα emission near the peak of reionization when the HI fraction in the IGM is high. 
The recent confirmation of a z=11.1 galaxy, just 400 million years after the Big Bang, by a combination of Hubble and Spitzer data, moved Hubble into JWST territory, far beyond what we ever expected Hubble could do. Twenty years of astonishing progress with Hubble and Spitzer leave me looking to JWST to provide even more remarkable exploration of the realm of the first galaxies.

  13. Incorporating ISO Metadata Using HDF Product Designer

    NASA Technical Reports Server (NTRS)

    Jelenak, Aleksandar; Kozimor, John; Habermann, Ted

    2016-01-01

    The need to store increasing amounts of metadata of varying complexity in HDF5 files is rapidly outgrowing the capabilities of the Earth science metadata conventions currently in use. Until now, data producers have had little choice but to devise ad hoc solutions to this challenge. Such solutions, in turn, pose a wide range of issues for data managers, distributors, and, ultimately, data users. The HDF Group is experimenting with a novel approach that uses ISO 19115 metadata objects as a catch-all container for all the metadata that cannot be fitted into the current Earth science data conventions. This presentation will showcase how the HDF Product Designer software can be used to help data producers include various ISO metadata objects in their products.
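    As a rough illustration of the idea, an ISO 19115-style object can be serialized to XML and carried inside a file as a single string attribute. The element names and the plain dictionary standing in for HDF5 attributes below are hypothetical sketches, not the actual HDF Product Designer output or the full gmd/gmi schema:

```python
import xml.etree.ElementTree as ET

def iso_identification_xml(title: str, abstract: str) -> str:
    """Build a tiny ISO 19115-style identification fragment.
    Element names are illustrative only."""
    root = ET.Element("MD_DataIdentification")
    ET.SubElement(root, "title").text = title
    ET.SubElement(root, "abstract").text = abstract
    return ET.tostring(root, encoding="unicode")

# A plain dict stands in for an HDF5 file's attribute set:
# the whole ISO object travels as one self-contained string attribute.
attrs = {"iso19115": iso_identification_xml("Sea Surface Temp", "Daily L3 grid")}
print("MD_DataIdentification" in attrs["iso19115"])  # True
```

    Packing the ISO object as one serialized blob keeps it independent of whatever flat attribute conventions the rest of the product uses.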

  14. PH5 for integrating and archiving different data types

    NASA Astrophysics Data System (ADS)

    Azevedo, Steve; Hess, Derick; Beaudoin, Bruce

    2016-04-01

    PH5 is IRIS PASSCAL's file organization of HDF5 used for seismic data. The extensibility and portability of HDF5 allow the PH5 format to evolve and operate on a variety of platforms and interfaces. To make PH5 even more flexible, the seismic metadata is separated from the time-series data in order to achieve gains in performance as well as ease of use and to simplify user interaction. This separation affords easy updates to metadata after the data are archived, without having to access waveform data. PH5 is currently used for integrating and archiving active source, passive source, and onshore-offshore seismic data sets with the IRIS Data Management Center (DMC). Active development to make PH5 fully compatible with FDSN web services and deliver StationXML is near completion. We are also exploring the feasibility of utilizing QuakeML for active seismic source representation. The PH5 software suite, PIC KITCHEN, comprises in-field tools that include data ingestion (e.g., RefTek format, SEG-Y, and SEG-D), metadata management tools including QC, and a waveform review tool. These tools enable building archive-ready data in the field during active source experiments, greatly decreasing the time to produce research-ready data sets. Once archived, our online request page generates a unique web form and pre-populates much of it based on the metadata provided from the PH5 file. The data requester can then intuitively select the extraction parameters as well as the data subsets they wish to receive (current output formats include SEG-Y, SAC, and miniSEED). The web interface then passes this on to the PH5 processing tools to generate the requested seismic data and e-mail the requester a link to the data set automatically as soon as the data are ready. PH5 file organization was originally designed to hold seismic time-series data and metadata from controlled-source experiments using RefTek data loggers. 
The flexibility of HDF5 has enabled us to extend the use of PH5 in several areas, one of which is handling very large data sets. PH5 is also good at integrating data from various types of seismic experiments, such as OBS, onshore-offshore, controlled source, and passive recording. HDF5 is capable of holding practically any type of digital data, so integrating GPS data with seismic data is possible. Since PH5 is a common format and data contained in HDF5 are randomly accessible, it has been easy to extend PH5 to include new input and output data formats as community needs arise.

  15. The Effect of On-Line Hemodiafiltration, Vegetarian Diet, and Urine Volume on Advanced Glycosylation End Products Measured by Changes in Skin Auto-Fluorescence.

    PubMed

    Nongnuch, Arkom; Davenport, Andrew

    2018-04-02

    Increasing urea clearance by hemodialysis (HD) has not improved patient survival. Hemodiafiltration (HDF) has been reported to reduce cardiovascular mortality. HDF increases middle sized solute clearances. Advanced glycosylation end products (AGEs) are associated with increased cardiovascular mortality. We wished to determine whether HDF reduces AGEs. Skin auto-fluorescence (SAF) measures circulating AGEs deposited in the skin. We compared SAF measurements 12 months apart in high flux HD and HDF patients. At enrollment SAF was not different (HD 3.34 ± 0.71 vs. HDF 3.48 ± 1.05 AU). At seven months after completion of SAF measurement, one hemodiafiltration center returned to hemodialysis, and one hemodialysis center converted to hemodiafiltration. In the 66 patients treated solely by high flux HD, SAF increased (3.36 ± 0.71 to 3.82 ± 0.88 AU, P < 0.001), whereas there was no change for 47 exclusively treated by HDF (3.45 ± 1.13 to 3.44 ± 0.85 AU, P > 0.9). SAF increased in 34 patients switching from HDF to high flux HD (3.52 ± 0.94 vs. 3.88 ± 1.05, P < 0.05), with no significant change for 33 patients converting from high flux HD to HDF (3.32 ± 0.72 to 3.48 ± 1.07 AU, P > 0.3). On multivariate analysis, SAF was associated with older age (β coefficient 0.013, P = 0.002), prescription of insulin (β 0.29, P = 0.016), lanthanum (β 0.36, P = 0.004), and warfarin (β 0.62, P = 0.012), whereas vegetarian diet and > 250 mL/day residual urine volume were negatively associated with SAF (β -0.58, P = 0.002 and β -0.26, P = 0.033 respectively). Residual urine output and vegetarian diet were associated with lower AGE deposition. Whereas SAF increased over time in patients treated with high flux HD, there was no statistical change in SAF in those exclusively treated by HDF. © 2018 International Center for Artificial Organs and Transplantation and Wiley Periodicals, Inc.

  16. Central online hemodiafiltration in Japan: management of water quality and practice.

    PubMed

    Yamashita, Akihiro C; Sato, Takashi

    2009-01-01

    Hemodiafiltration (HDF) encompasses a variety of technologies, and the preparation of ultrapure dialysis fluid has made it possible to perform online HDF and its extensive alternatives. According to current statistics, 5.8% of ESRD patients in Japan are treated with HDF. The majority of these HDF treatments are performed using the central dialysis fluid delivery system (CDDS), because most Japanese clinicians and researchers consider it easier to prepare substitution fluid with CDDS; moreover, CDDS has economic advantages over single-patient dialysis machine (SPDM)-based counterparts. The water quality at each patient station (dialysis console) is regularly validated by bacterial culture (colony-forming units) and by measuring the endotoxin concentration (ET). Since ET measurement takes much less time than bacterial culture, ET is often used as the indicator to verify water quality for online use. Dialysis fluid with ET below the detection level (usually <0.001 EU/ml) is used for online substitution. In CDDS online HDF, since dialysis clinics must prepare not only the dialysis fluid but also the substitution fluid, they need to satisfy almost the same requirements as pharmaceutical water treatment factories do. The Japanese Society for Dialysis Therapy (JSDT), together with the Japanese Society for Hemodiafiltration (JS-HDF), is now preparing guidelines to meet all these necessary requirements on a worldwide basis. (c) 2009 S. Karger AG, Basel.

  17. Vitamin E-coated membrane dialyzer and beta2-microglobulin removal.

    PubMed

    Mandolfo, S; Bucci, R; Imbasciati, E

    2003-12-01

    This study was designed to test the removal of beta2-microglobulin (beta2M) by a vitamin E-modified membrane. We investigated in vivo the dialyzer (Excebrane, series EE, 1.8 m2) with respect to hydraulic permeability (Kuf), maximum ultrafiltration rate (UF max), sieving coefficient (Sc), and solute clearances in hemodialysis (HD) and in soft hemodiafiltration (HDF). Kuf was 18.4 ml/h/mmHg, UF max was 75 ml/min, and Sc for beta2M was 0.45. Clearance values at a Qb of 400 ml/min in HD were 258 ml/min for urea, 201 ml/min for creatinine, and 135 ml/min for phosphate. In soft HDF, clearances were slightly higher. beta2M clearance was 26 ml/min in HD and 43 ml/min in soft HDF. In conclusion, Excebrane (series EE) provides soft HDF with an amount of substitution fluid in post-dilution mode of over 60 ml/min. Remarkable small-solute clearances were obtained when the blood flow was raised to 400 ml/min. A significant reduction of beta2M is demonstrated by HDF.

  18. HDF-EOS 5 Validator

    NASA Technical Reports Server (NTRS)

    Ullman, Richard; Bane, Bob; Yang, Jingli

    2008-01-01

    A computer program partly automates the task of determining whether an HDF-EOS 5 file is valid in that it conforms to specifications for such characteristics as attribute names, dimensionality of data products, and ranges of legal data values. ["HDF-EOS" and variants thereof are defined in "Converting EOS Data From HDF-EOS to netCDF" (GSC-15007-1), which is the first of several preceding articles in this issue of NASA Tech Briefs.] Previously, the validity of a file was determined in a tedious and error-prone process in which a person examined human-readable dumps of data-file-format information. The present software helps a user to encode the specifications for an HDF-EOS 5 file, and then inspects the file for conformity with the specifications: First, the user writes the specifications in Extensible Markup Language (XML) by use of a document type definition (DTD) that is part of the program. Next, the portion of the program (denoted the validator) that performs the inspection is executed, using as inputs the specifications in XML and the HDF-EOS 5 file to be validated. Finally, the user examines the output of the validator.
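    The workflow can be sketched in miniature: encode required attribute names in a small XML specification, then check a file's attributes against it. The spec format, function names, and mock attribute dictionary below are illustrative assumptions; the real validator consumes a much richer DTD-based specification and reads actual HDF-EOS 5 files:

```python
import xml.etree.ElementTree as ET

# Hypothetical miniature spec: which attributes every data product must carry
SPEC = """<spec>
  <attribute name="units" required="true"/>
  <attribute name="valid_range" required="true"/>
</spec>"""

def required_attributes(spec_xml: str):
    """Parse the toy XML specification and list required attribute names."""
    root = ET.fromstring(spec_xml)
    return [a.get("name") for a in root.findall("attribute")
            if a.get("required") == "true"]

def validate(file_attrs: dict, spec_xml: str):
    """Report required attributes missing from a (mock) file's attribute set."""
    return [name for name in required_attributes(spec_xml)
            if name not in file_attrs]

print(validate({"units": "K"}, SPEC))  # ['valid_range']
```

    Separating the machine-readable spec from the checking code is the point of the design: updating the DTD-driven spec requires no change to the validator itself.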

  19. Experimental Directory Structure (Exdir): An Alternative to HDF5 Without Introducing a New File Format

    PubMed Central

    Dragly, Svenn-Arne; Hobbi Mobarhan, Milad; Lepperød, Mikkel E.; Tennøe, Simen; Fyhn, Marianne; Hafting, Torkel; Malthe-Sørenssen, Anders

    2018-01-01

    Natural sciences generate an increasing amount of data in a wide range of formats developed by different research groups and commercial companies. At the same time there is a growing desire to share data along with publications in order to enable reproducible research. Open formats have publicly available specifications which facilitate data sharing and reproducible research. Hierarchical Data Format 5 (HDF5) is a popular open format widely used in neuroscience, often as a foundation for other, more specialized formats. However, drawbacks related to HDF5's complex specification have initiated a discussion for an improved replacement. We propose a novel alternative, the Experimental Directory Structure (Exdir), an open specification for data storage in experimental pipelines which amends drawbacks associated with HDF5 while retaining its advantages. HDF5 stores data and metadata in a hierarchy within a complex binary file which, among other things, is not human-readable, not optimal for version control systems, and lacks support for easy access to raw data from external applications. Exdir, on the other hand, uses file system directories to represent the hierarchy, with metadata stored in human-readable YAML files, datasets stored in binary NumPy files, and raw data stored directly in subdirectories. Furthermore, storing data in multiple files makes it easier to track for version control systems. Exdir is not a file format in itself, but a specification for organizing files in a directory structure. Exdir uses the same abstractions as HDF5 and is compatible with the HDF5 Abstract Data Model. Several research groups are already using data stored in a directory hierarchy as an alternative to HDF5, but no common standard exists. This complicates and limits the opportunity for data sharing and development of common tools for reading, writing, and analyzing data. 
Exdir facilitates improved data storage, data sharing, reproducible research, and novel insight from interdisciplinary collaboration. With the publication of Exdir, we invite the scientific community to join the development to create an open specification that will serve as many needs as possible and as a foundation for open access to and exchange of data. PMID:29706879
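    The Exdir layout described above can be mimicked in a few lines: directories represent groups, each group carries a human-readable metadata file, and datasets are stored as NumPy binary files. This sketch uses JSON in place of Exdir's YAML to stay dependency-free, and the function and file names are illustrative rather than the exdir library's API:

```python
import json
import os
import tempfile
import numpy as np

def create_exdir_like(root: str) -> str:
    """Create a minimal Exdir-style hierarchy: a directory as a group,
    a human-readable metadata file, and a .npy file as a dataset."""
    group = os.path.join(root, "session1")
    os.makedirs(group, exist_ok=True)
    # Group-level metadata in a plain-text, diff-friendly file
    with open(os.path.join(group, "attributes.json"), "w") as f:
        json.dump({"experimenter": "example", "type": "group"}, f)
    # Dataset stored as a plain NumPy binary file
    np.save(os.path.join(group, "voltage.npy"),
            np.arange(10, dtype=np.float64))
    return group

root = tempfile.mkdtemp()
g = create_exdir_like(root)
loaded = np.load(os.path.join(g, "voltage.npy"))
print(loaded.sum())  # 45.0
```

    Because every object is an ordinary file or directory, version control systems, shell tools, and external applications can read the hierarchy without any special library, which is exactly the drawback of the single binary HDF5 container that Exdir addresses.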

  20. Experimental Directory Structure (Exdir): An Alternative to HDF5 Without Introducing a New File Format.

    PubMed

    Dragly, Svenn-Arne; Hobbi Mobarhan, Milad; Lepperød, Mikkel E; Tennøe, Simen; Fyhn, Marianne; Hafting, Torkel; Malthe-Sørenssen, Anders

    2018-01-01

    Natural sciences generate an increasing amount of data in a wide range of formats developed by different research groups and commercial companies. At the same time there is a growing desire to share data along with publications in order to enable reproducible research. Open formats have publicly available specifications which facilitate data sharing and reproducible research. Hierarchical Data Format 5 (HDF5) is a popular open format widely used in neuroscience, often as a foundation for other, more specialized formats. However, drawbacks related to HDF5's complex specification have initiated a discussion for an improved replacement. We propose a novel alternative, the Experimental Directory Structure (Exdir), an open specification for data storage in experimental pipelines which amends drawbacks associated with HDF5 while retaining its advantages. HDF5 stores data and metadata in a hierarchy within a complex binary file which, among other things, is not human-readable, not optimal for version control systems, and lacks support for easy access to raw data from external applications. Exdir, on the other hand, uses file system directories to represent the hierarchy, with metadata stored in human-readable YAML files, datasets stored in binary NumPy files, and raw data stored directly in subdirectories. Furthermore, storing data in multiple files makes it easier to track for version control systems. Exdir is not a file format in itself, but a specification for organizing files in a directory structure. Exdir uses the same abstractions as HDF5 and is compatible with the HDF5 Abstract Data Model. Several research groups are already using data stored in a directory hierarchy as an alternative to HDF5, but no common standard exists. This complicates and limits the opportunity for data sharing and development of common tools for reading, writing, and analyzing data. 
Exdir facilitates improved data storage, data sharing, reproducible research, and novel insight from interdisciplinary collaboration. With the publication of Exdir, we invite the scientific community to join the development to create an open specification that will serve as many needs as possible and as a foundation for open access to and exchange of data.

  1. XML DTD and Schemas for HDF-EOS

    NASA Technical Reports Server (NTRS)

    Ullman, Richard; Yang, Jingli

    2008-01-01

    An Extensible Markup Language (XML) document type definition (DTD) standard for the structure and contents of HDF-EOS files, and an equivalent standard in the form of schemas, have been developed.

  2. Tuning HDF5 for Lustre File Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Howison, Mark; Koziol, Quincey; Knaak, David

    2010-09-24

    HDF5 is a cross-platform parallel I/O library that is used by a wide variety of HPC applications for the flexibility of its hierarchical object-database representation of scientific data. We describe our recent work to optimize the performance of the HDF5 and MPI-IO libraries for the Lustre parallel file system. We selected three different HPC applications to represent the diverse range of I/O requirements, and measured their performance on three different systems to demonstrate the robustness of our optimizations across different file system configurations and to validate our optimization strategy. We demonstrate that the combined optimizations improve HDF5 parallel I/O performance by up to 33 times, in some cases running close to the achievable peak performance of the underlying file system, and demonstrate scalable performance up to 40,960-way concurrency.
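    One common ingredient of such tuning is aligning I/O sizes with the Lustre stripe size, so that each write maps onto whole stripes rather than straddling stripe boundaries (HDF5 exposes alignment via `H5Pset_alignment`). The helper below only illustrates the round-up arithmetic under that assumption; it is not code from the paper:

```python
def align_to_stripe(request_bytes: int, stripe_bytes: int) -> int:
    """Round an I/O request size up to a multiple of the Lustre stripe size,
    a common alignment tuning for parallel file systems."""
    if request_bytes % stripe_bytes == 0:
        return request_bytes
    return ((request_bytes // stripe_bytes) + 1) * stripe_bytes

# With 1 MiB stripes, a 2.5 MiB chunk rounds up to 3 MiB
print(align_to_stripe(2_621_440, 1_048_576))  # 3145728
```

    Aligned requests let each writer touch a disjoint set of object storage targets, which is one reason stripe-aware settings can yield the large speedups reported above.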

  3. Postmitotic human dermal fibroblasts preserve intact feeder properties for epithelial cell growth after long-term cryopreservation.

    PubMed

    Limat, A; Hunziker, T; Boillat, C; Noser, F; Wiesmann, U

    1990-07-01

    In vitro, human dermal fibroblasts (HDF) differentiate through morphologically and biochemically identified compartments. In the course of this spontaneous differentiation through mitotic and postmitotic states, a tremendous increase in cellular and nuclear size occurs. Induction of postmitotic states can be accelerated by chemical (e.g., mitomycin C) or physical (e.g., x-ray) treatments. Such experimentally induced postmitotic HDF cells very efficiently support the growth of cutaneous epithelial cells, i.e., interfollicular keratinocytes and follicular outer root sheath cells, especially in primary cultures started from very low cell seeding densities. The HDF feeder system provides both fundamental and practical advantages, i.e., the use of initially diploid human fibroblasts from known anatomic locations, easy handling and excellent reproducibility, and the possibility of long-term storage by incubation at 37 degrees C. Conditions for the cryogenic storage of postmitotic HDF cells in liquid nitrogen are presented and related to the feeder capacity for epithelial cell growth. Because postmitotic HDF cells preserve intact feeder properties after long-term storage, the immediate availability of feeder cells and the possibility of repeating experiments with identical materials further substantiate the usefulness of this feeder system.

  4. Design and patient characteristics of ESHOL study, a Catalonian prospective randomized study.

    PubMed

    Maduell, Francisco; Moreso, Francesc; Pons, Mercedes; Ramos, Rosa; Mora-Macià, Josep; Foraster, Andreu; Soler, Jordi; Galceran, Josep M; Martinez-Castelao, Alberto

    2011-01-01

    Retrospective studies showed that online hemodiafiltration (OL-HDF) is associated with a risk reduction of mortality over standard hemodialysis (HD) in patients with end-stage renal disease. Until now, no information was available from prospective randomized clinical trials. A prospective, randomized, multicenter, open study was designed to be conducted in HD units from Catalonia (Spain). The aim of the study is to compare 3-year survival in prevalent end-stage renal disease patients randomized to OL-HDF or to continue on standard HD. The minimum sample size was calculated according to Catalonian mortality of patients on dialysis and assuming a risk reduction associated with OL-HDF of 35% (1-sided p<0.05 and a statistical power of 0.8) and a rate of dropout due to renal transplantation or loss to follow-up of 30%. From May 2007 to September 2008, 906 patients were included and randomized to OL-HDF (n=456) or standard HD (n=450). Demographics and analytical data at the time of randomization were not different between both groups of patients. Patients will be followed during a 3-year period. The present study will contribute to evaluating the benefit for patient survival of OL-HDF over standard HD.

  5. Effects of different blood purification methods on serum cytokine levels and prognosis in patients with acute severe organophosphorus pesticide poisoning.

    PubMed

    Liu, Lunzhi; Ding, Guohua

    2015-04-01

    The aim of the present study was to investigate the impact of three different blood purification methods, hemoperfusion (HP), continuous blood purification (CBP), and on-line high-volume hemodiafiltration (OL-HDF), on the survival rate of patients with acute severe organophosphorus pesticide poisoning (ASOPP), as well as on major pro-inflammatory (interleukin [IL]-1, IL-6, tumor necrosis factor-α [TNF-α]) and anti-inflammatory (IL-10) cytokines in the serum. Eighty-one ASOPP patients were randomly divided into three groups: HP (N = 23), HP + CBP (N = 26), and HP + OL-HDF (N = 32). Serum IL-1, IL-6, TNF-α, and IL-10 levels were assessed by ELISA before treatment and at 24 and 48 h post-treatment, and survival rates were determined. Patient survival was significantly higher in the OL-HDF- and CBP-treated patients than in the HP group (P < 0.05). A significantly greater clearance of serum IL-1, IL-6, and TNF-α at 24 and 48 h post-treatment was observed in the CBP and OL-HDF groups compared with the HP group (P < 0.05). The levels of the serum anti-inflammatory cytokine IL-10 increased significantly in the CBP and OL-HDF groups compared with the HP group (P < 0.05 at 48 h post-treatment). In addition, OL-HDF treatment achieved changes in serum TNF-α, IL-1, IL-6, and IL-10 levels similar to those of CBP (P > 0.05). Compared with the HP method alone, CBP or OL-HDF combined with HP can rapidly clear inflammatory cytokines, reduce systemic inflammatory response syndrome, and improve the survival of ASOPP patients. Compared with CBP, OL-HDF is an economical and effective method to treat ASOPP, with less technical difficulty and more suitability for rural areas and primary hospitals. © 2014 The Authors. Therapeutic Apheresis and Dialysis © 2014 International Society for Apheresis.

  6. A randomized trial of hemodiafiltration and change in cardiovascular parameters.

    PubMed

    Mostovaya, Irina M; Bots, Michiel L; van den Dorpel, Marinus A; Grooteman, Muriel P C; Kamp, Otto; Levesque, Renée; Ter Wee, Piet M; Nubé, Menso J; Blankestijn, Peter J

    2014-03-01

    Increased left ventricular mass (LVM), low ventricular ejection fraction (EF), and high pulse-wave velocity (PWV) relate to overall and cardiovascular mortality in patients with ESRD. The aim of this study was to determine the effect of online hemodiafiltration (HDF) versus low-flux hemodialysis (HD) on LVM, EF, and PWV. Echocardiography was used to assess LVM and EF in 342 patients in the CONvective TRAnsport STudy followed for up to 4 years. PWV was measured in 189 patients for up to 3 years. Effect of HDF versus HD on LVM, EF, and PWV was evaluated using linear mixed models. Patients had a mean age of 63 years, and 61% were male. At baseline, median LVM was 227 g (interquartile range [IQR], 183-279 g), and median EF was 65% (IQR, 55%-72%). Median PWV was 9.8 m/s (IQR, 7.5-12.0 m/s). There was no significant difference between the HDF and HD treatment groups in rate of change in LVM (HDF: change, -0.9 g/yr [95% confidence interval (95% CI), -8.9 to 7.7 g]; HD: change, 12.5 g/yr [95% CI, -3.0 to 27.5 g]; P for difference=0.13), EF (HDF: change, -0.3%/yr [95% CI, -2.3% to 1.8%]; HD: change, -3.4%/yr [95% CI, -5.9% to -0.9%]; P=0.17), or PWV (HDF: change, -0.0 m/s per year [95% CI, -0.4 to 0.4 m/s); HD: change, 0.0 m/s per year [95% CI, -0.3 to 0.2 m/s]; P=0.89). No differences in rate of change between treatment groups were observed for subgroups of age, sex, residual kidney function, dialysis vintage, history of cardiovascular disease, diabetes, or convection volume. Treatment with online HDF did not affect changes in LVM, EF, or PWV over time compared with HD.

  7. Galaxy Evolution in the Reddest Possible Filter

    NASA Astrophysics Data System (ADS)

    Richards, E. A.

    We describe an observational programme aimed at understanding the radio emission from distant, rapidly evolving galaxy populations. These observations were carried out at 1.4 and 8.5 GHz with the VLA, centred on the Hubble Deep Field, obtaining limiting flux densities of 40 and 8 μJy respectively. The differential count of the radio sources is marginally sub-Euclidean to the completeness limits (γ = -2.4 ± 0.1) and fluctuation analysis suggests nearly 60 sources per arcmin^2 at the 1 μJy level. Using high-resolution 1.4 GHz observations obtained with MERLIN, we resolve all radio sources detected in the VLA complete sample and measure a median angular size for the microjansky radio population of 1-2 arcsec. This clue, coupled with the steep spectral index of the 1.4 GHz selected sample, suggests diffuse synchrotron radiation in z ~ 1 galactic discs. The wide-field HST and ground-based optical exposures show that the radio sources are identified primarily with disc systems composed of irregulars, peculiars, interacting/merging galaxies and a few isolated field spirals. Only 20% of the radio sources can be attributed to AGN - the majority are probably associated with starburst activity. The available redshifts range from 0.1 to 3, with a mean of about 0.8. We are probably witnessing a major episode of starburst activity in these luminous (L > L_*) systems, occasionally accompanied by an embedded AGN. About 20% of the radio sources remain unidentified to I = 26-28 in the HDF and flanking fields. Several of these objects have extremely red counterparts. We suggest that these are high-redshift dusty protogalaxies.

  8. Hemodiafiltration improves free light chain removal and normalizes κ/λ ratio in hemodialysis patients.

    PubMed

    Bourguignon, Chloé; Chenine, Leïla; Bargnoux, Anne Sophie; Leray-Moragues, Hélène; Canaud, Bernard; Cristol, Jean-Paul; Morena, Marion

    2016-04-01

    Serum free light chain (FLC) levels are correlated with chronic kidney disease (CKD) stages and are highest in patients on hemodialysis (HD). The aim of this study was to assess the FLC removal efficiency of the Elisio™-210H dialyzer using either high-flux HD or on line high efficiency hemodiafiltration (HDF) modalities in CKD-5D patients. In this prospective and comparative study, 20 CKD-5D patients free from multiple myeloma were randomized into two groups: HD versus on line HDF. All patients were dialyzed with the Elisio™-210H dialyzer. Serum samples were collected before and after the midweek dialysis session, before randomization and at the end of the study, to measure κ and λ FLC concentrations. Reduction ratios were corrected for net ultrafiltration. For both the HD and HDF modes, κ and λ FLC concentrations were significantly lower after dialysis than before, but median reductions in κ and λ FLC levels were higher in the HDF versus HD groups (κ 73.5 vs. 65.5 %, p = 0.04 and λ 51.0 vs. 36.6 %, p = 0.07). After dialysis, all κ/λ ratio values were between 0.26 and 1.65, which is the reference range described in subjects with normal kidney function, for both the HD and HDF groups (median κ/λ ratios were 0.80 [0.47-1.22] and 0.67 [0.50-0.79] respectively). This study shows the superiority of on line HDF compared with HD in removing both κ and λ FLC. Moreover, all post-dialysis κ/λ ratios reached the normal reference range.

  9. What Is the Optimal Target Convective Volume in On-Line Hemodiafiltration Therapy?

    PubMed

    Canaud, Bernard; Koehler, Katrin; Bowry, Sudhir; Stuard, Stefano

    2017-01-01

    Conventional diffusion-based dialysis modalities including high-flux hemodialysis are limited in their capacity to effectively remove large uremic toxins and to improve outcomes for end-stage chronic kidney disease (ESKD) patients. By increasing convective solute transport, hemodiafiltration (HDF) enhances solute removal capacity over a broad range of middle- and large-size uremic toxins implicated in the pathophysiology of chronic kidney disease. Furthermore, by offering flexible convection volume, on-line HDF permits customizing the treatment dose to the patient's needs. In addition, convective-based modalities have been shown to improve hemodynamic stability and to reduce patients' inflammation profile - both of which are implicated in CKD morbidity and mortality. Growing clinical evidence indicates that HDF-based modalities provide ESKD patients with a number of clinical and biological benefits, including improved outcomes. Interestingly, it has recently emerged that the clinical benefits associated with HDF are positively associated with the total ultrafiltered volume per session (and per week), namely convective dose. In this chapter, we revisit the concept of convective dose and discuss the threshold value above which an improvement in ESKD patient outcome can be expected. This particular point will be addressed by stratifying the level of efficacy of convective volumes, schematically defined as minimal, optimal, personalized, and maximal. In addition, factors and best clinical practices implicated in the achievement of an optimal convective dose are reviewed. To conclude, we show how HDF differs from standard hemodialysis and why HDF offers a paradigm shift in renal replacement therapy. © 2017 S. Karger AG, Basel.
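
    The convective-dose bookkeeping discussed above can be sketched as simple arithmetic. A hedged illustration: the definition used here (substitution volume plus net ultrafiltration per session, multiplied by the weekly session count) and the example numbers are assumptions for illustration, not thresholds from the chapter:

```python
# Hedged sketch of convective-dose bookkeeping in post-dilution on-line HDF.
# Assumed definition (not from the chapter): per-session convective dose =
# substitution volume + net ultrafiltration; weekly dose = per-session
# dose * sessions per week.

def convective_dose(substitution_l, net_uf_l, sessions_per_week=3):
    """Return (per-session, per-week) convective volume in litres."""
    per_session = substitution_l + net_uf_l
    return per_session, per_session * sessions_per_week

# Example with hypothetical values: 21 L substitution, 2 L net UF, 3x/week.
session_l, weekly_l = convective_dose(21.0, 2.0)
print(f"per session: {session_l:.1f} L, per week: {weekly_l:.1f} L")
```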

  10. MiR-30a-3p Negatively Regulates BAFF Synthesis in Systemic Sclerosis and Rheumatoid Arthritis Fibroblasts

    PubMed Central

    Philippe, Lucas; Gong, Ya-Zhuo; Bahram, Seiamak; Cetin, Semih; Pfeffer, Sébastien; Gottenberg, Jacques-Eric; Wachsmann, Dominique; Georgel, Philippe; Sibilia, Jean

    2014-01-01

    We evaluated micro (mi) RNA-mediated regulation of BAFF expression in fibroblasts using two concomitant models: (i) synovial fibroblasts (FLS) isolated from healthy controls (N) or Rheumatoid Arthritis (RA) patients; (ii) human dermal fibroblasts (HDF) isolated from healthy controls (N) or Systemic Sclerosis (SSc) patients. Using RT-qPCR and ELISA, we first showed that SScHDF synthesized and released BAFF in response to Poly(I:C) or IFN-γ treatment, as previously observed in RAFLS, whereas NHDF released BAFF preferentially in response to IFN-γ. Next, we demonstrated that miR-30a-3p expression was downregulated in RAFLS and SScHDF stimulated with Poly(I:C) or IFN-γ. Moreover, we demonstrated that transfecting a miR-30a-3p mimic into Poly(I:C)- and IFN-γ-activated RAFLS and SScHDF led to a strong decrease in BAFF synthesis and release, and thus in B cell survival, in our model. Interestingly, FLS and HDF isolated from healthy subjects express higher levels of miR-30a-3p and lower levels of BAFF than RAFLS and SScHDF. Transfection of a miR-30a-3p antisense into Poly(I:C)- and IFN-γ-activated NFLS and NHDF upregulated BAFF secretion, confirming that this microRNA is a basal repressor of BAFF expression in cells from healthy donors. Our data suggest a critical role of miR-30a-3p in the regulation of BAFF expression, which could have a major impact on the regulation of the autoimmune responses occurring in RA and SSc. PMID:25360821

  11. Haemodialysis or haemodiafiltration: that is the question.

    PubMed

    Locatelli, Francesco; Carfagna, Fabio; Del Vecchio, Lucia; La Milia, Vincenzo

    2018-04-24

    Despite the technological and pharmacological advancements in the last 30 years, morbidity and mortality of dialysis patients are still astonishingly high. Today, convective treatments, such as high-flux haemodialysis (hf-HD) and haemodiafiltration (HDF), are established techniques; the online production of fresh pure dialysate has provided clinical and economic advantages. Nevertheless, the actual benefits of HDF, even with high-convective-volume treatments, are still debatable. Three recent, randomized controlled trials compared survival outcomes in prevalent patients receiving conventional HD or post-dilution HDF and reported conflicting results. The meta-analyses of the published trials were ultimately incapable of providing a clear and definitive answer on the possible beneficial effects of choosing one treatment over the other. All-cause mortality, anaemia, phosphate control and clearance of small molecules seemed to be unaffected by the treatment modality. On the other hand, cardiovascular mortality, intradialytic vascular stability and the clearance of protein-bound molecules fared better in patients treated with HDF. These results were not consistent between the studies. Thus, there is still no conclusive answer to the question that nephrologists would like to have answered: 'Which is the best treatment for my patient?' In the age of evidence-based medicine, we need strong data to support the superiority of one treatment over another, however theoretically plausible that superiority may be. There is the need for a well-designed clinical trial comparing outcomes for patients randomly assigned to high- or moderate-convection-volume HDF versus hf-HD to clearly prove the clinical superiority of HDF, including the effect of different infusion volumes.

  12. Intradialytic Cardiac Magnetic Resonance Imaging to Assess Cardiovascular Responses in a Short-Term Trial of Hemodiafiltration and Hemodialysis

    PubMed Central

    Buchanan, Charlotte; Mohammed, Azharuddin; Cox, Eleanor; Köhler, Katrin; Canaud, Bernard; Taal, Maarten W.; Selby, Nicholas M.; Francis, Susan

    2017-01-01

    Hemodynamic stress during hemodialysis (HD) results in recurrent segmental ischemic injury (myocardial stunning) that drives cumulative cardiac damage. We performed a fully comprehensive study of the cardiovascular effect of dialysis sessions using intradialytic cardiac magnetic resonance imaging (MRI) to examine the comparative acute effects of standard HD versus hemodiafiltration (HDF) in stable patients. We randomly allocated 12 patients on HD (ages 32–72 years old) to either HD or HDF. Patients were stabilized on a modality for 2 weeks before undergoing serial cardiac MRI assessment during dialysis. Patients then crossed over to the other modality and were rescanned after 2 weeks. Cardiac MRI measurements included cardiac index, stroke volume index, global and regional contractile function (myocardial strain), coronary artery flow, and myocardial perfusion. Patients had mean±SEM ultrafiltration rates of 3.8±2.9 ml/kg per hour during HD and 4.4±2.5 ml/kg per hour during HDF (P=0.29), and both modalities provided a similar degree of cooling. All measures of systolic contractile function fell during HD and HDF, with partial recovery after dialysis. All patients experienced some degree of segmental left ventricular dysfunction, with severity proportional to ultrafiltration rate and BP reduction. Myocardial perfusion decreased significantly during HD and HDF. Treatment modality did not influence any of the cardiovascular responses to dialysis. In conclusion, in this randomized, crossover study, there was no significant difference in the cardiovascular response to HDF or HD with cooled dialysate as assessed with intradialytic MRI. PMID:28122851

  13. Implementing the HDF-EOS5 software library for data products in the UNAVCO InSAR archive

    NASA Astrophysics Data System (ADS)

    Baker, Scott; Meertens, Charles; Crosby, Christopher

    2017-04-01

    UNAVCO is a non-profit university-governed consortium that operates the U.S. National Science Foundation (NSF) Geodesy Advancing Geosciences and EarthScope (GAGE) facility and provides operational support to the Western North America InSAR Consortium (WInSAR). The seamless synthetic aperture radar archive (SSARA) is a seamless distributed access system for SAR data and higher-level data products. Under the NASA-funded SSARA project, a user-contributed InSAR archive for interferograms, time series, and other derived data products was developed at UNAVCO. The InSAR archive development has led to the adoption of the HDF-EOS5 data model, file format, and library. The HDF-EOS software library was designed to support NASA Earth Observation System (EOS) science data products and provides data structures for radar geometry (Swath) and geocoded (Grid) data based on the HDF5 data model and file format provided by The HDF Group. HDF-EOS5 inherits the benefits of HDF5 (open-source software support, internal compression, portability, support for structured data, self-describing file metadata, enhanced performance, and XML support) and provides a way to standardize InSAR data products. Instrument- and datatype-independent services, such as subsetting, can be applied to files across a wide variety of data products through the same library interface. The library allows integration with GIS software packages such as ArcGIS and GDAL, conversion to other data formats like NetCDF and GeoTIFF, and is extensible with new data structures to support future requirements. UNAVCO maintains a GitHub repository that provides example software for creating data products from popular InSAR processing software packages like GMT5SAR and ISCE, as well as examples for reading and converting the data products into other formats.
Digital object identifiers (DOI) have been incorporated into the InSAR archive, allowing users to assign a permanent identifier to their processed results and easily reference the final data products. A metadata attribute is added to the HDF-EOS5 file when a DOI is minted for a data product. These data products are searchable through the SSARA federated query, providing access to processed data for both expert and non-expert InSAR users. The archive facilitates timely distribution of processed data, with particular importance for geohazards and event response.
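
    Because HDF-EOS5 files are ordinary HDF5 files underneath, the conventional /HDFEOS group tree can be created and inspected with generic HDF5 tooling. The sketch below is a rough illustration using h5py; the group paths, dataset name, and DOI attribute name are hypothetical, and real archive products would be written with the HDF-EOS5 C library rather than by hand:

```python
# Hypothetical sketch: mimic a geocoded HDF-EOS5 Grid product with h5py.
# The path "interferogram", the dataset "unwrappedPhase", and the attribute
# "identifier_product_doi" are illustrative names, not archive conventions.
import h5py
import numpy as np

def write_insar_grid(path):
    with h5py.File(path, "w") as f:
        grid = f.create_group("/HDFEOS/GRIDS/interferogram/Data Fields")
        # small placeholder unwrapped-phase grid with internal compression,
        # one of the HDF5 benefits mentioned above
        phase = np.zeros((16, 16), dtype="f4")
        grid.create_dataset("unwrappedPhase", data=phase, compression="gzip")
        # record the minted DOI as a file-level metadata attribute
        f.attrs["identifier_product_doi"] = "10.0000/example-doi"

def list_datasets(path):
    """Walk the file and return the names of all datasets it contains."""
    names = []
    with h5py.File(path, "r") as f:
        def visitor(name, obj):
            if isinstance(obj, h5py.Dataset):
                names.append(name)
        f.visititems(visitor)
    return names
```

    Since the layout is plain HDF5, the same file remains readable by GDAL, h5dump, or format converters without any HDF-EOS-specific code.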

  14. Final Technical Report for grant entitled "New Horizons in C-F Activation by Main Group Electrophiles"

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ozerov, Oleg V; Ozerov, Oleg V.

    2014-01-16

    We became interested in developing new methods for hydrodefluorination (HDF) and other types of C-F bond conversion in polyfluoroalkanes under mild conditions. We were attracted to an approach to C-F activation, where the key C-F cleavage proceeds by a Lewis acid abstraction of fluoride rather than a redox event. The efforts during the previous period were aimed at a) advancing the HDF reactivity with improvement in scope and catalyst longevity; b) extending C-F activation beyond HDF; c) generating insight about the elementary steps of the reaction and potential intermediates.

  15. Comparison of hemodialysis with medium cut-off dialyzer and on-line hemodiafiltration on the removal of small and middle-sized molecules.

    PubMed

    Belmouaz, Mohamed; Diolez, Jeremy; Bauwens, Marc; Duthe, Fabien; Ecotiere, Laure; Desport, Estelle; Bridoux, Frank

    2018-01-01

    Recent data suggest that the use of medium cut-off (MCO) dialyzers in hemodialysis (HD) promotes greater clearance and reduction ratio (RR) for myoglobin and other large-sized molecules than on-line hemodiafiltration (ol-HDF), but its effects on β2-microglobulin are not clear. We compared the RR and clearances of small and middle-sized molecules, as well as nutritional parameters, between high-flux ol-HDF and HD with the MCO (Theranova) dialyzer (MCO-HD). We retrospectively analyzed 10 patients treated first with ol-HDF who were thereafter switched to MCO-HD over a 1-year period. Three dialysis sessions in each 6-month period were examined. We calculated the RR and clearance of small and middle-sized molecules. There was no significant difference between ol-HDF and MCO-HD in median serum albumin and prealbumin levels, mean Kt/V, mean urea and creatinine RR, mean β2-microglobulin (81 ± 5% vs. 81 ± 6%, p = 0.72) and myoglobin (60 ± 9% vs. 61 ± 7%, p = 0.59) RR, or clearances. The use of the MCO (Theranova) dialyzer in HD produces similar removal of urea, creatinine, β2-microglobulin and myoglobin as does ol-HDF, with a good tolerance profile and without modification of nutritional status.

  16. Data are from Mars, Tools are from Venus

    NASA Technical Reports Server (NTRS)

    Lee, H. Joe

    2017-01-01

    During the data production phase, data producers usually ensure that their products can be easily used by the specific power users those products serve. However, most data products are also posted for the general public to use, and it is not straightforward for data producers to anticipate which tools these general end users are likely to choose. In this talk, we try to help fill the gap by going over various tools related to Earth Science, how they work with existing NASA HDF (Hierarchical Data Format) data products, and the reasons why some products cannot be visualized or analyzed by existing tools. One goal is to give data producers insight into how to make their data products more interoperable; we also provide hints for end users on how to make tools work with existing HDF data products. Tool categories: HDF-EOS tools: HDFView HDF-EOS Plugin, HEG, h4tonccf, hdf-eos2 dumper, NCL, MATLAB, IDL, etc.; netCDF-Java tools: Panoply, IDV, toolsUI, NcML, etc.; netCDF-C tools: ArcGIS Desktop, GrADS, NCL, NCO, etc.; GDAL tools: ArcGIS Desktop, QGIS, Google Earth, etc.; CSV tools: ArcGIS Online, MS Excel, Tableau, etc.
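
    One practical reason a product "cannot be visualized or analyzed by existing tools" is missing per-dataset metadata. The following sketch is an assumption-laden illustration using h5py: it walks an HDF5 file and flags datasets lacking two commonly expected attributes. The attribute names checked are CF-style conventions chosen for the example, not a NASA requirement:

```python
# Hedged interoperability check: flag HDF5 datasets missing the metadata
# attributes that many generic tools (Panoply, GDAL-based viewers, ...)
# rely on. "units" and "_FillValue" are CF-style conventions, used here
# purely as an illustrative checklist.
import h5py

REQUIRED_ATTRS = ("units", "_FillValue")

def check_tool_readiness(path):
    """Return {dataset_name: [missing attribute names]} for one file."""
    problems = {}
    with h5py.File(path, "r") as f:
        def visitor(name, obj):
            if isinstance(obj, h5py.Dataset):
                missing = [a for a in REQUIRED_ATTRS if a not in obj.attrs]
                if missing:
                    problems[name] = missing
        f.visititems(visitor)
    return problems
```

    A producer could run such a check before release; an empty result means every dataset carries the attributes this (illustrative) checklist expects.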

  17. An Enhanced GINGER Simulation Code with Harmonic Emission and HDF5 IO Capabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fawley, William M.

    GINGER [1] is an axisymmetric, polychromatic (r-z-t) FEL simulation code originally developed in the mid-1980's to model the performance of single-pass amplifiers. Over the past 15 years GINGER's capabilities have been extended to include more complicated configurations such as undulators with drift spaces, dispersive sections, and vacuum chamber wakefield effects; multi-pass oscillators; and multi-stage harmonic cascades. Its coding base has been tuned to permit running effectively on platforms ranging from desktop PC's to massively parallel processors such as the IBM-SP. Recently, we have made significant changes to GINGER by replacing the original predictor-corrector field solver with a new direct implicit algorithm, adding harmonic emission capability, and switching to the HDF5 IO library [2] for output diagnostics. In this paper, we discuss some details regarding these changes and also present simulation results for LCLS SASE emission at λ = 0.15 nm and higher harmonics.
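
    The kind of HDF5 output diagnostics a simulation code like GINGER switches to can be sketched with extendable datasets, which let each integration step append its values without knowing the final run length. This is a hypothetical illustration in h5py: the dataset names and quantities are assumptions, not GINGER's actual output schema:

```python
# Hypothetical HDF5 diagnostics writer: append per-step values (here a z
# position and a radiated power) to resizable datasets. Dataset names and
# units are illustrative assumptions, not GINGER's real schema.
import h5py

class DiagnosticsWriter:
    def __init__(self, path):
        self.f = h5py.File(path, "w")
        # maxshape=(None,) makes the datasets extendable along axis 0
        self.z = self.f.create_dataset("z_m", shape=(0,),
                                       maxshape=(None,), dtype="f8")
        self.p = self.f.create_dataset("power_W", shape=(0,),
                                       maxshape=(None,), dtype="f8")

    def append(self, z, power):
        """Grow both datasets by one entry and record the new values."""
        n = self.z.shape[0]
        self.z.resize((n + 1,))
        self.p.resize((n + 1,))
        self.z[n] = z
        self.p[n] = power

    def close(self):
        self.f.close()
```

    The resulting file is self-describing and can be plotted directly by any HDF5-aware tool, one reason codes migrate their diagnostics to HDF5.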

  18. Influence of dialysis membrane composition on plasma bisphenol A levels during online hemodiafiltration.

    PubMed

    Mas, Sebastian; Bosch-Panadero, Enrique; Abaigar, Pedro; Camarero, Vanesa; Mahillo, Ignacio; Civantos, Esther; Sanchez-Ospina, Didier; Ruiz-Priego, Alberto; Egido, Jesus; Ortiz, Alberto; González-Parra, Emilio

    2018-01-01

    Bisphenol A (BPA) is a ubiquitous environmental toxin that is also found in dialyzers. Online hemodiafiltration (OL-HDF) more efficiently clears high molecular weight molecules, and this may improve BPA clearance. However, the BPA content of dialysis membranes may be a source of BPA loading during OL-HDF. A prospective study assessed plasma BPA levels in OL-HDF patients using BPA-free (polynephron) or BPA-containing (polysulfone) dialyzers in a crossover design with two arms, after a run-in OL-HDF period of at least 6 months with the same membrane: 31 patients on polynephron at baseline were switched to polysulfone membranes for 3 months (polynephron-to-polysulfone) and 29 patients on polysulfone were switched to polynephron for 3 months (polysulfone-to-polynephron). After the run-in period, baseline pre-dialysis BPA was lower in patients on polynephron (8.79±7.97 ng/mL) than in those on polysulfone (23.42±20.38 ng/mL, p<0.01), but still higher than in healthy controls (<2 ng/mL). After 3 months of the polynephron-to-polysulfone switch, BPA was unchanged (8.98±7.88 to 11.14±15.98 ng/mL, ns) while it decreased in the polysulfone-to-polynephron group (23.42±20.38 to 11.41±12.38 ng/mL, p<0.01). OL-HDF for 3 months with BPA-free dialyzer membranes was associated with a significant decrease in predialysis BPA levels when compared to baseline BPA levels while on a BPA-containing membrane.

  19. Influence of dialysis membrane composition on plasma bisphenol A levels during online hemodiafiltration

    PubMed Central

    Abaigar, Pedro; Camarero, Vanesa; Mahillo, Ignacio; Civantos, Esther; Sanchez-Ospina, Didier; Ruiz-Priego, Alberto; Egido, Jesus; Ortiz, Alberto; González-Parra, Emilio

    2018-01-01

    Introduction Bisphenol A (BPA) is a ubiquitous environmental toxin that is also found in dialyzers. Online hemodiafiltration (OL-HDF) more efficiently clears high molecular weight molecules, and this may improve BPA clearance. However, the BPA content of dialysis membranes may be a source of BPA loading during OL-HDF. Methods A prospective study assessed plasma BPA levels in OL-HDF patients using BPA-free (polynephron) or BPA-containing (polysulfone) dialyzers in a crossover design with two arms, after a run-in OL-HDF period of at least 6 months with the same membrane: 31 patients on polynephron at baseline were switched to polysulfone membranes for 3 months (polynephron-to-polysulfone) and 29 patients on polysulfone were switched to polynephron for 3 months (polysulfone-to-polynephron). Results After the run-in period, baseline pre-dialysis BPA was lower in patients on polynephron (8.79±7.97 ng/mL) than in those on polysulfone (23.42±20.38 ng/mL, p<0.01), but still higher than in healthy controls (<2 ng/mL). After 3 months of the polynephron-to-polysulfone switch, BPA was unchanged (8.98±7.88 to 11.14±15.98 ng/mL, ns) while it decreased in the polysulfone-to-polynephron group (23.42±20.38 to 11.41±12.38 ng/mL, p<0.01). Conclusion OL-HDF for 3 months with BPA-free dialyzer membranes was associated with a significant decrease in predialysis BPA levels when compared to baseline BPA levels while on a BPA-containing membrane. PMID:29529055

  20. Feasibility of intermittent back-filtrate infusion hemodiafiltration to reduce intradialytic hypotension in patients with cardiovascular instability: a pilot study.

    PubMed

    Koda, Yutaka; Aoike, Ikuo; Hasegawa, Shin; Osawa, Yutaka; Nakagawa, Yoichi; Iwabuchi, Fumio; Iwahashi, Chikara; Sugimoto, Tokuichiro; Kikutani, Toshihiko

    2017-04-01

    Intradialytic hypotension (IDH) is one of the major problems in performing safe hemodialysis (HD). As blood volume depletion by fluid removal is a major cause of hypotension, careful regulation of blood volume change is fundamental. This study examined the effect of intermittent back-filtrate infusion hemodiafiltration (I-HDF), which modifies the infusion and ultrafiltration pattern. Purified on-line quality dialysate was intermittently infused by back-filtration through the dialysis membrane with a programmed dialysis machine. A bolus of 200 ml of dialysate was infused at 30 min intervals. The volume infused was offset by increasing the fluid removal over the next 30 min by an equivalent amount. Seventy-seven hypotension-prone patients, with over 20-mmHg reductions of systolic blood pressure during dialysis or interventions required more than once a week, were included in a crossover study of 4 weeks' duration for each modality. In a total of 1632 sessions, the frequency of interventions, the blood pressure, and the pulse rate were documented. During I-HDF, interventions for symptomatic hypotension were reduced significantly from 4.5 to 3.0 (per person-month, median) and intradialytic systolic blood pressure was 4 mmHg higher on average. The heart rate was lower during I-HDF than during HD in the later part of the session. Older patients and those with greater interdialytic weight gain responded to I-HDF. I-HDF could reduce interventions for IDH. It was accompanied by higher intradialytic blood pressure and less tachycardia, suggesting that less sympathetic stimulation occurs. Thus, I-HDF could be beneficial for some hypotension-prone patients. 000013816.
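
    The infusion/ultrafiltration pattern described above (a 200 ml bolus every 30 min, offset by an equally increased removal rate over the following 30 min) can be sketched as a fluid-balance timeline. The session length (240 min) and total ultrafiltration goal below are assumptions for illustration, not figures from the study:

```python
# Sketch of the I-HDF fluid schedule from the abstract: boluses of 200 ml
# at 30 min intervals, each compensated by extra ultrafiltration over the
# next 30 min. Assumed for illustration: a 240 min session and a 2400 ml
# total ultrafiltration goal.

def uf_rate_profile(total_uf_ml=2400.0, session_min=240, bolus_ml=200.0,
                    interval_min=30):
    """Return the ultrafiltration rate (ml/min) for each 30 min interval.

    Boluses are modeled as instantaneous infusions at the start of every
    interval after the first, so every interval after the first removes
    the base amount plus the bolus just infused; net removal over the
    session still equals the prescribed total.
    """
    base_rate = total_uf_ml / session_min
    n_intervals = session_min // interval_min
    rates = [base_rate]
    for _ in range(n_intervals - 1):
        rates.append(base_rate + bolus_ml / interval_min)
    return rates
```

    With these assumed numbers the base rate is 10 ml/min, and each post-bolus interval runs at roughly 16.7 ml/min, illustrating how the pattern trades a steady rate for alternating refill and removal phases.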

  1. Structural and Evolutionary Aspects of Antenna Chromophore Usage by Class II Photolyases*

    PubMed Central

    Kiontke, Stephan; Gnau, Petra; Haselsberger, Reinhard; Batschauer, Alfred; Essen, Lars-Oliver

    2014-01-01

    Light-harvesting and resonance energy transfer to the catalytic FAD cofactor are key roles for the antenna chromophores of light-driven DNA photolyases, which remove UV-induced DNA lesions. So far, five chemically diverse chromophores have been described for several photolyases and related cryptochromes, but no correlation between phylogeny and the antenna used has been found. Despite a common protein topology, structural analysis of the distantly related class II photolyase from the archaeon Methanosarcina mazei (MmCPDII) as well as plant orthologues indicated several differences in terms of DNA and FAD binding and electron transfer pathways. For MmCPDII we identify 8-hydroxydeazaflavin (8-HDF) as the cognate antenna by in vitro and in vivo reconstitution, whereas the higher plant class II photolyase from Arabidopsis thaliana fails to bind any of the known chromophores. According to the 1.9 Å structure of the MmCPDII·8-HDF complex, its antenna binding site differs from other members of the photolyase-cryptochrome superfamily by an antenna loop that changes its conformation by 12 Å upon 8-HDF binding. Additionally, so-called N- and C-motifs contribute as conserved elements to the binding of deprotonated 8-HDF and allow prediction of 8-HDF binding for most of the class II photolyases in the whole phylome. The 8-HDF antenna is used throughout the viridiplantae ranging from green microalgae to bryophyta and pteridophyta, i.e. mosses and ferns, but interestingly not in higher plants. Overall, we suggest that 8-hydroxydeazaflavin is a crucial factor for the survival of most higher eukaryotes which depend on class II photolyases to cope with the genotoxic effects of solar UV exposure. PMID:24849603

  2. Beneficial effects of Plantago albicans on high-fat diet-induced obesity in rats.

    PubMed

    Samout, Noura; Ettaya, Amani; Bouzenna, Hafsia; Ncib, Sana; Elfeki, Abdelfattah; Hfaiedh, Najla

    2016-12-01

    Obesity is one of the main global public health problems, associated with chronic diseases such as coronary heart disease, diabetes and cancer. As a possible remedy for obesity, we suggest Plantago albicans, a medicinal plant with several biological effects. This study assesses the possible anti-obesity protective properties of Plantago albicans in high-fat-diet-fed rats. 28 male Wistar rats were divided into 4 groups: a group which received a normal diet (C), a second group fed the HDF diet (HDF), a third group given a normal diet supplemented with Plantago albicans (P.AL), and a fourth group which received the HDF diet supplemented with Plantago albicans (HDF+P.AL) (30 mg/kg/day) for 7 weeks. Our results showed an increase in the body weight of HDF rats by ∼16% as compared to the control group, with increased serum levels of total cholesterol (TC), LDL-cholesterol and triglycerides (TG). The concentration of TBARS also increased in the liver and heart of HDF-fed rats as compared to the control group. Oral gavage of Plantago albicans extract in obese rats reduced their body weight and the lipid accumulation in liver and heart tissue compared with the high-fat-diet control rats. The obtained results showed that the antioxidant potency of Plantago albicans extracts correlated with their phenolic and flavonoid contents. The antioxidant capacity of the extract was evaluated by the DPPH test (EC50 = 250 ± 2.12 μg/mL) and the FRAP test (EC50 = 27.77 ± 0.14 μg/mL). These results confirm the phytochemical and antioxidant impact of Plantago albicans extracts. Plantago albicans content was determined using a validated HPLC methodology. Copyright © 2016 Elsevier Masson SAS. All rights reserved.

  3. Role of residual kidney function and convective volume on change in beta2-microglobulin levels in hemodiafiltration patients.

    PubMed

    Penne, E Lars; van der Weerd, Neelke C; Blankestijn, Peter J; van den Dorpel, Marinus A; Grooteman, Muriel P C; Nubé, Menso J; Ter Wee, Piet M; Lévesque, Renée; Bots, Michiel L

    2010-01-01

    Removal of beta2-microglobulin (beta2M) can be increased by adding convective transport to hemodialysis (HD). The aim of this study was to investigate the change in beta2M levels after 6-mo treatment with hemodiafiltration (HDF) and to evaluate the role of residual kidney function (RKF) and the amount of convective volume with this change. Predialysis serum beta2M levels were evaluated in 230 patients with and 176 patients without RKF from the CONvective TRAnsport STudy (CONTRAST) at baseline and 6 mo after randomization for online HDF or low-flux HD. In HDF patients, potential determinants of change in beta2M were analyzed using multivariable linear regression models. Mean serum beta2M levels decreased from 29.5 +/- 0.8 (+/-SEM) at baseline to 24.3 +/- 0.6 mg/L after 6 mo in HDF patients and increased from 31.9 +/- 0.9 to 34.4 +/- 1.0 mg/L in HD patients, with the difference of change between treatment groups being statistically significant (regression coefficient -7.7 mg/L, 95% confidence interval -9.5 to -5.6, P < 0.001). This difference was more pronounced in patients without RKF as compared with patients with RKF. In HDF patients, beta2M levels remained unchanged in patients with GFR >4.2 ml/min/1.73 m2. The beta2M decrease was not related to convective volume. This study demonstrated effective lowering of beta2M levels by HDF, especially in patients without RKF. The role of the amount of convective volume on beta2M decrease appears limited, possibly because of resistance to beta2M transfer between body compartments.

  4. Comparison of intradialytic hemodynamic tolerance between on-line hemodiafiltration and acetate-free biofiltration with profiled potassium dialysate concentration.

    PubMed

    Kosmadakis, George; Correia, Enrique Da Costa; Somda, Frederic; Aguilera, Didier

    2017-01-01

    Intradialytic hypotensive episodes are deleterious for hemodialysis (HD) patients. Acetate-free biofiltration with a profiled potassium (AFBK) dialysate concentration may improve their cardiovascular stability. The aim of the present crossover study was to compare intradialytic hemodynamic tolerance and biological parameters between online hemodiafiltration (olHDF) and AFBK. Ten frail HD patients (8 males) with a mean age of 66.71 ± 12.31 years were studied for three months on olHDF and AFBK. There was a significant reduction of the hypotensive episodes during the AFBK period compared to the olHDF period. Mean intradialytic systolic and diastolic blood pressures were significantly higher during the AFBK period. There was a significant postdialytic increase in serum sodium concentration with AFBK compared to olHDF. The dry weight and ultrafiltration indices were significantly higher, and the Kt/V was significantly lower, during the AFBK period. Serum albumin concentration significantly increased during the AFBK period. AFBK leads to significantly improved intradialytic tolerance in hemodynamically unstable HD patients.

  5. A randomized controlled study on the effects of acetate-free biofiltration on organic anions and acid-base balance in hemodialysis patients.

    PubMed

    Sánchez-Canel, Juan J; Hernández-Jaras, Julio; Pons-Prades, Ramón

    2015-02-01

    Metabolic acidosis correction is achieved by the transfer of bicarbonate and other buffer anions in dialysis. The aim of this study was to evaluate changes in the main anions of intermediary metabolism on standard hemodiafiltration (HDF) and on acetate-free biofiltration (AFB). A prospective, in-center, crossover study was carried out with 22 patients on maintenance dialysis. Patients were randomly assigned to start with 12 successive sessions of standard HDF with bicarbonate (34 mmol/L) and acetate dialysate (3 mmol/L) or 12 successive sessions of AFB without base in the dialysate. Acetate increased significantly during the standard HDF session from 0.078 ± 0.062 mmol/L to 0.156 ± 0.128 mmol/L (P < 0.05) and remained unchanged at 0.044 ± 0.034 mmol/L and 0.055 ± 0.028 mmol/L in the AFB modality. Differences in the acetate levels were observed at two hours (P < 0.005), at the end (P < 0.005) and thirty minutes after the session between HDF and AFB (P < 0.05). There were significantly more patients above the normal range in the HDF group than the AFB group (68.1% vs. 4.5%, P < 0.005) postdialysis and 30 minutes later. Serum lactate and pyruvate concentrations decreased during the sessions without differences between modalities. Citrate decreased only in the AFB group (P < 0.05). Acetoacetate and betahydroxybutyrate increased in both modalities, but the highest betahydroxybutyrate values were detected in HDF (P < 0.05). The sum of postdialysis unmeasured organic anions (OA) was higher in HDF compared to AFB (P < 0.05). AFB achieves an optimal control of acid-base equilibrium through a bicarbonate substitution fluid. It also prevents hyperacetatemia and restores internal homeostasis with less production of intermediary metabolites. © 2014 The Authors. Therapeutic Apheresis and Dialysis © 2014 International Society for Apheresis.

  6. A Randomized Trial of Hemodiafiltration and Change in Cardiovascular Parameters

    PubMed Central

    Bots, Michiel L.; van den Dorpel, Marinus A.; Grooteman, Muriel P.C.; Kamp, Otto; Levesque, Renée; ter Wee, Piet M.; Nubé, Menso J.; Blankestijn, Peter J.

    2014-01-01

    Background and objective Increased left ventricular mass (LVM), low ventricular ejection fraction (EF), and high pulse-wave velocity (PWV) relate to overall and cardiovascular mortality in patients with ESRD. The aim of this study was to determine the effect of online hemodiafiltration (HDF) versus low-flux hemodialysis (HD) on LVM, EF, and PWV. Design, setting, participants, & measurements Echocardiography was used to assess LVM and EF in 342 patients in the CONvective TRAnsport STudy followed for up to 4 years. PWV was measured in 189 patients for up to 3 years. Effect of HDF versus HD on LVM, EF, and PWV was evaluated using linear mixed models. Results Patients had a mean age of 63 years, and 61% were male. At baseline, median LVM was 227 g (interquartile range [IQR], 183–279 g), and median EF was 65% (IQR, 55%–72%). Median PWV was 9.8 m/s (IQR, 7.5–12.0 m/s). There was no significant difference between the HDF and HD treatment groups in rate of change in LVM (HDF: change, −0.9 g/yr [95% confidence interval (95% CI), −8.9 to 7.7 g]; HD: change, 12.5 g/yr [95% CI, −3.0 to 27.5 g]; P for difference=0.13), EF (HDF: change, −0.3%/yr [95% CI, −2.3% to 1.8%]; HD: change, −3.4%/yr [95% CI, −5.9% to −0.9%]; P=0.17), or PWV (HDF: change, −0.0 m/s per year [95% CI, −0.4 to 0.4 m/s]; HD: change, 0.0 m/s per year [95% CI, −0.3 to 0.2 m/s]; P=0.89). No differences in rate of change between treatment groups were observed for subgroups of age, sex, residual kidney function, dialysis vintage, history of cardiovascular disease, diabetes, or convection volume. Conclusions Treatment with online HDF did not affect changes in LVM, EF, or PWV over time compared with HD. PMID:24408114

  7. Optimization of the convection volume in online post-dilution haemodiafiltration: practical and technical issues

    PubMed Central

    Chapdelaine, Isabelle; de Roij van Zuijdewijn, Camiel L.M.; Mostovaya, Ira M.; Lévesque, Renée; Davenport, Andrew; Blankestijn, Peter J.; Wanner, Christoph; Nubé, Menso J.; Grooteman, Muriel P.C.

    2015-01-01

    In post-dilution online haemodiafiltration (ol-HDF), a relationship has been demonstrated between the magnitude of the convection volume and survival. However, to achieve high convection volumes (>22 L per session), a detailed understanding of the determining factors is highly desirable. This manuscript summarizes practical problems and pitfalls that were encountered during the quest for high convection volumes. Specifically, it addresses issues such as type of vascular access, needles, blood flow rate, recirculation, filtration fraction, anticoagulation and dialysers. Finally, five of the main HDF systems in Europe are briefly described as far as HDF prescription and optimization of the convection volume are concerned. PMID:25815176

  8. Transforming the Geocomputational Battlespace Framework with HDF5

    DTIC Science & Technology

    2010-08-01

    layout level, dataset arrays can be stored in chunks or tiles, enabling fast subsetting of large datasets, including compressed datasets. HDF software...Image Base (CIB) image of the AOI: an orthophoto made from rectified grayscale aerial images b. An IKONOS satellite image made up of 3 spectral
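The chunk/tile layout mentioned in this snippet is what lets HDF5 read a small window of a large (even compressed) dataset without touching the whole array. A minimal stdlib-only sketch of the idea (a toy chunked store, not the real HDF5 layout; all names here are illustrative):

```python
# Toy chunked 2D store illustrating the chunk/tile layout idea from HDF5.
# (Hypothetical sketch; real HDF5 chunking is handled by the library.)

CHUNK = 4  # chunk edge length

def store_chunked(grid):
    """Split a 2D list into CHUNK x CHUNK tiles keyed by chunk index."""
    chunks = {}
    for r, row in enumerate(grid):
        for c, val in enumerate(row):
            key = (r // CHUNK, c // CHUNK)
            chunks.setdefault(key, {})[(r % CHUNK, c % CHUNK)] = val
    return chunks

def read_window(chunks, r0, r1, c0, c1):
    """Read rows r0..r1-1, cols c0..c1-1, touching only intersecting chunks."""
    touched = set()
    out = []
    for r in range(r0, r1):
        row = []
        for c in range(c0, c1):
            key = (r // CHUNK, c // CHUNK)
            touched.add(key)
            row.append(chunks[key][(r % CHUNK, c % CHUNK)])
        out.append(row)
    return out, touched

grid = [[r * 16 + c for c in range(16)] for r in range(16)]
chunks = store_chunked(grid)  # 16 chunks of 4x4 for a 16x16 grid
window, touched = read_window(chunks, 2, 6, 2, 6)
# A 4x4 window straddling chunk boundaries touches only 4 of the 16 chunks.
```

In a real HDF5 file the library resolves chunk addresses through an internal index and decompresses only the chunks the selection intersects, which is why subsetting stays fast even for compressed datasets.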

  9. Regulation of the Salmonella enterica std fimbrial operon by DNA adenine methylation, SeqA, and HdfR.

    PubMed

    Jakomin, Marcello; Chessa, Daniela; Bäumler, Andreas J; Casadesús, Josep

    2008-11-01

    DNA adenine methylase (dam) mutants of Salmonella enterica serovar Typhimurium grown under laboratory conditions express the std fimbrial operon, which is tightly repressed in the wild type. Here, we show that uncontrolled production of Std fimbriae in S. enterica serovar Typhimurium dam mutants contributes to attenuation in mice, as indicated by the observation that an stdA dam strain is more competitive than a dam strain upon oral infection. Dam methylation appears to regulate std transcription, rather than std mRNA stability or turnover. A genetic screen for std regulators showed that the GATC-binding protein SeqA directly or indirectly represses std expression, while the poorly characterized yifA gene product serves as an std activator. The yifA gene encodes a putative LysR-like protein and has been renamed hdfR, like its Escherichia coli homolog. Activation of std expression by HdfR is observed only in dam and seqA backgrounds. These data suggest that HdfR directly or indirectly activates std transcription. Since SeqA is unable to bind nonmethylated DNA, it is possible that std operon derepression in dam and seqA mutants may result from unconstrained HdfR-mediated activation of std transcription. Derepression of std in dam and seqA mutants of S. enterica occurs in only a fraction of the bacterial population, suggesting the occurrence of either bistable expression or phase variation.

  10. Visual three-dimensional representation of beat-to-beat electrocardiogram traces during hemodiafiltration.

    PubMed

    Rodriguez-Fernandez, Rodrigo; Infante, Oscar; Perez-Grovas, Héctor; Hernandez, Erika; Ruiz-Palacios, Patricia; Franco, Martha; Lerma, Claudia

    2012-06-01

    This study evaluated the usefulness of the three-dimensional representation of electrocardiogram traces (3DECG) to reveal acute and gradual changes during a full session of hemodiafiltration (HDF) in end-stage renal disease (ESRD) patients. Fifteen ESRD patients were included (six men, nine women, age 46 ± 19 years old). Serum electrolytes, blood pressure, heart rate, and blood urea nitrogen (BUN) were measured before and after HDF. Continuous electrocardiograms (ECGs) obtained by Holter monitoring during HDF were used to produce the 3DECG. Several major disturbances were identified by 3DECG images: increase in QRS amplitude (47%), decrease in T-wave amplitude (33%), increase in heart rate (33%), and occurrence of arrhythmia (53%). Different arrhythmia types were often concurrent and included isolated supraventricular premature beats (N = 5), atrial fibrillation or atrial bigeminy (N = 2), and isolated premature ventricular beats (N = 6). Patients with decrease in T-wave amplitude had higher potassium and BUN (both before HDF and total removal) than those without decrease in T-wave amplitude (P < 0.05). Concurrent acute and gradual ECG changes during HDF are identified by the 3DECG, which could be useful as a preventive and prognostic method. © 2011, Copyright the Authors. Artificial Organs © 2011, International Center for Artificial Organs and Transplantation and Wiley Periodicals, Inc.

  11. An analysis of image storage systems for scalable training of deep neural networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lim, Seung-Hwan; Young, Steven R; Patton, Robert M

    This study presents a principled empirical evaluation of image storage systems for training deep neural networks. We employ the Caffe deep learning framework to train neural network models for three different data sets, MNIST, CIFAR-10, and ImageNet. While training the models, we evaluate five different options to retrieve training image data: (1) PNG-formatted image files on local file system; (2) pushing pixel arrays from image files into a single HDF5 file on local file system; (3) in-memory arrays to hold the pixel arrays in Python and C++; (4) loading the training data into LevelDB, a log-structured merge tree based key-value storage; and (5) loading the training data into LMDB, a B+tree based key-value storage. The experimental results quantitatively highlight the disadvantage of using normal image files on local file systems to train deep neural networks and demonstrate reliable performance with key-value storage based storage systems. When training a model on the ImageNet dataset, the image file option was more than 17 times slower than the key-value storage option. Along with measurements on training time, this study provides in-depth analysis on the cause of performance advantages/disadvantages of each back-end to train deep neural networks. We envision the provided measurements and analysis will shed light on the optimal way to architect systems for training neural networks in a scalable manner.
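The record's central finding, that reading many small image files is far slower than reading from one consolidated, indexed container, can be sketched with the standard library alone. This is a toy pack format with hypothetical names, standing in for the HDF5/LevelDB/LMDB back-ends the study actually used:

```python
import os
import struct
import tempfile

def pack(samples, path):
    """Write all samples into one file as [len][bytes][len][bytes]...,
    returning a key -> (offset, size) index (the role LMDB/HDF5 play)."""
    index = {}
    with open(path, "wb") as f:
        for key, blob in samples.items():
            index[key] = (f.tell() + 4, len(blob))  # offset past the length prefix
            f.write(struct.pack("<I", len(blob)))
            f.write(blob)
    return index

def read_sample(f, index, key):
    """Random-access read of one sample through the index, reusing a
    single open file handle instead of one open() per training sample."""
    offset, size = index[key]
    f.seek(offset)
    return f.read(size)

# 100 tiny stand-in "images" packed into one container file.
samples = {f"img{i}": bytes([i]) * 8 for i in range(100)}
path = os.path.join(tempfile.mkdtemp(), "train.pack")
index = pack(samples, path)
with open(path, "rb") as f:
    blob = read_sample(f, index, "img42")
```

The speedup the study measured comes largely from avoiding per-file metadata and open/close overhead; a single container with an in-memory index amortizes that cost across the whole epoch.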

  12. SAGEIII-ISS L2 Lunar Data Release

    Atmospheric Science Data Center

    2018-01-12

    ... Space Station (SAGE III-ISS) Science Team and the NASA Langley Atmospheric Science Data Center (ASDC), announces the public ... Lunar Event Species Profiles (HDF-EOS) V5 (g3bssp)      doi: 10.5067/ISS/SAGEIII/LUNAR_HDF4_L2-V5.0 SAGE III/ISS L2 Lunar Event ...

  13. Online Hemodiafiltration Reduces Bisphenol A Levels.

    PubMed

    Quiroga, Borja; Bosch, Ricardo J; Fiallos, Ruth A; Sánchez-Heras, Marta; Olea-Herrero, Nuria; López-Aparicio, Pilar; Muñóz-Moreno, Carmen; Pérez-Alvarsan, Miguel Angel; De Arriba, Gabriel

    2017-02-01

    Several uremic toxins have been identified and related to higher rates of morbidity and mortality in dialysis patients. Bisphenol A (BPA) accumulates in patients with chronic kidney disease. The aim of this study is to demonstrate the usefulness of online hemodiafiltration (OL-HDF) in reducing BPA levels. Thirty stable hemodialysis patients were selected to participate in this paired study. During three periods of 3 weeks each, patients were switched from high-flux hemodialysis (HF-HD) to OL-HDF, and back to HF-HD. BPA levels were measured in the last session of each period (pre- and post-dialysis) using ELISA and HPLC. Twenty-two patients (mean age 73 ± 14 years; 86.4% males) were included. Measurements of BPA levels by HPLC and ELISA assays showed a weak but significant correlation (r = 0.218, P = 0.012). BPA levels decreased in the OL-HDF period of hemodialysis, in contrast to the HF-HD period when they remained stable (P = 0.002). In conclusion, OL-HDF reduced BPA levels in dialysis patients. © 2016 International Society for Apheresis, Japanese Society for Apheresis, and Japanese Society for Dialysis Therapy.

  14. On-line hemodiafiltration at home.

    PubMed

    Vega, Almudena; Abad, Soraya; Macías, Nicolás; Aragoncillo, Inés

    2018-04-01

    Survival with online hemodiafiltration (OL-HDF) is higher than with hemodialysis; frequent hemodialysis has also improved survival and quality of life. Home hemodialysis facilitates frequent therapy. We report our experience with 2 patients with stage 5 CKD who started home hemodialysis with OL-HDF in November 2016. After a training period at the hospital, they started home hemodialysis with OL-HDF after learning how to manage dialysis monitors and how to administer water treatment. We used the "5008-home" (FMC©) monitor, and the Acqua C© (Fresenius Medical Care) for water treatment. Water conductivity was always checked before and during dialysis sessions and was always 2.5 to 3 mS/cm. Water cultures always fulfilled the criteria for ultrapurity. As far as we know, this is the first report on patients receiving OL-HDF at home. The technique proved to be safe and valid for renal replacement therapy and transfers the benefits of hospital convective therapy to the home setting. Future data will enable us to determine whether survival has also improved. © 2017 International Society for Hemodialysis.

  15. Deepest Infrared View of the Universe

    NASA Astrophysics Data System (ADS)

    2002-12-01

    VLT Images Progenitors of Today's Large Galaxies Summary An international team of astronomers [2] has made the deepest-ever near-infrared Ks-band image of the sky, using the ISAAC multi-mode instrument on the 8.2-m VLT ANTU telescope. For this, the VLT was pointed for more than 100 hours under optimal observing conditions at the Hubble Deep Field South (HDF-S) and obtained images in three near-infrared filters. The resulting images reveal extremely distant galaxies, which appear at infrared wavelengths, but are barely detected in the deepest optical images acquired with the Hubble Space Telescope (HST). Astronomer Marijn Franx from the University of Leiden and leader of the team concludes: "These results demonstrate that very deep observations in the near-infrared are essential to obtain a proper census of the earliest phases of the universe. The new VLT images have opened a new research domain which has not been observationally accessible before". The HDF-S is a tiny field on the sky in the southern constellation Tucana (The Toucan) - only about 1% of the area of the full moon. The NASA/ESA Hubble Space Telescope (HST) observed it with a total exposure time of about 1 week, yielding the deepest optical images ever taken of the sky, similar to those made earlier on the Hubble Deep Field North (HDF-N). The VLT infrared images of the same field were obtained in the course of a major research project, the Faint InfraRed Extragalactic Survey (FIRES). They were made at wavelengths up to 2.3 µm where the HST is not competitive. Ivo Labbé, another team member from the University of Leiden, is certain: "Without the unique capabilities of the VLT and ISAAC we would never have been able to observe these very remote galaxies. In fact, the image in the Ks-band is the deepest which has ever been made at that wavelength". The optical light emitted by the distant galaxies has been redshifted to the near-infrared spectral region [3]. 
Indeed, some of the galaxies found in the new images are so remote that - due to the finite speed of light - they are observed as they were when the Universe was still extremely young, less than 2 billion years old. From these observations, two interesting conclusions have been drawn so far. One is that although the newly identified galaxies do not appear to form stars very actively, they probably account for about half the mass of normal matter present at this epoch. This is in sharp contrast to the galaxies at this early time found during optical surveys - they are very blue because of young and hot stars. Another is that galaxies already existed at that epoch which are clearly rather large, and some show spiral structure similar to that seen in very nearby galaxies. This important new insight is having a profound impact on current attempts to understand the formation and evolution of galaxies. PR Photo 28a/02: Composite colour image of the sky field observed by HST and VLT. PR Photo 28b/02: The ISAAC Ks-band image, the deepest of its kind ever obtained. PR Photo 28c/02: Images of very red, very distant compact galaxies in different wavebands. PR Photo 28d/02: Images of very distant extended galaxies in different wavebands. Formation and evolution of galaxies How did galaxies form in the early Universe? How did they evolve and when did the first stars form in those systems? These are some of the key questions in present-day astronomy. Thanks to powerful ground- and space-based telescopes, astronomers are now able to actively pursue studies in this direction. Recent front-line observational results are helping them to gain new insights into these fundamental issues. Light emitted by distant galaxies travels a long time before we observe it with our telescopes. In this way, astronomers can look back in time and directly study galaxies as they were when the universe was still very young.
However, this is technically difficult, as the galaxies are extremely faint. Another complication is that, due to the expansion of the universe, their light is shifted towards longer wavelengths [3]. In order to study those early galaxies in some detail, astronomers thus need to use the largest ground-based telescopes, collecting their faint light during very long integrations. And they must work in the infrared region of the spectrum which is not visible to the human eye. The Hubble Deep Field South (HDF-S) was selected to be studied in great detail with the Hubble Space Telescope (HST) and other powerful telescopes. The HST images of this field represent a total exposure time of 140 hours. Many ground-based telescopes have obtained additional photos and spectra, in particular telescopes at the European Southern Observatory in Chile. The ISAAC observations The sky field in the direction of HDF-S observed in the present study (the Faint InfraRed Extragalactic Survey (FIRES)), measures 2.5 x 2.5 arcmin2. It is slightly larger than the field covered by the WFPC2 camera on the HST, but still 100 times smaller than the full moon. Whenever the field was visible from Paranal and the atmospheric conditions were optimal, ESO astronomers pointed the 8.2-m VLT ANTU telescope in the direction of this field, taking near-infrared images with the ISAAC multi-mode instrument. The data were transmitted by Internet to the astronomers of the team in Europe, who then combined them to construct some of the deepest infrared astronomical images ever taken from the ground. Colours and distance A crucial feature of the new observations is that they were made in three infrared bands (Js, H, Ks), allowing a 3-dimensional view of a small region of the Universe. 
This is because, by comparing the brightness of the galaxies in these colours with that in optical light, as measured by the HST, it is possible to estimate their redshifts [3] and thus how long ago the light we now see has been emitted. For the reddest of the galaxies the answer is that we are seeing them as they were when the Universe was only about 2 billion years old. The nature of the galaxies Two conclusions drawn so far about the nature of these galaxies are therefore all the more important in the context of formation and evolution of galaxies. One is that a few of them are clearly rather large and show spiral structure similar to that seen in very nearby galaxies, cf. PR Photo 28d/02. It is not obvious that current theoretical models can easily account for such galaxies having evolved to this stage so early in the life of the Universe. Another conclusion is that, in contrast to the galaxies at similar redshifts (and hence, at this early epoch) found most commonly in surveys at optical wavelengths, most of the 'infrared-selected' galaxies show relatively little visible star-forming activity. They appear in fact to have already formed most of their stars and in quantities sufficient to account for at least half the total luminous mass of the Universe at that time. Given the time to reach this state they must clearly have formed even earlier in the life of the Universe and are thus probably amongst the "oldest" galaxies now known. Rather than being randomly distributed in space, these red galaxies are also found to prefer company, i.e., they tend to cluster close to each other. In general terms this can be taken as support for the latest theoretical models in which galaxies, which consist of "normal" matter, form in the highest-density regions of the much more pervasive "dark" matter. Although the latter accounts for most of the mass of the universe, its origin so far is completely unknown. 
These new observations may, therefore, also add new insight into one of the biggest mysteries currently confronting cosmologists. Marijn Franx agrees, but also cautions against drawing firm conclusions on this aspect too quickly: "We now need similar images of a considerably larger region of the sky. We will soon follow-up these first, tantalizing results with more observations of other sky fields." More information The information presented in this Press Release is based on a research article ("Ultradeep Near-Infrared ISAAC Observations of the Hubble Deep Field South: Observations, Reduction, Multicolor Catalog, and Photometric Redshifts" by Ivo Labbé et al.) that will soon appear in the research journal "Astronomical Journal" (cf. astro-ph/0212236). A shorter account will appear in the December 2002 issue of ESO's house journal "The Messenger". Information, including photos and reduced data, is also available at the website of the FIRES project. Notes [1]: This press release is issued in coordination between ESO, Leiden Observatory, the Netherlands Research School for Research in Astronomy (NOVA) and the Netherlands Foundation for Research (NWO). A Dutch-language version is available here. [2]: The team consists of Ivo Labbé, Marijn Franx, Natascha M. Förster Schreiber, Paul van der Werf, Huub Röttgering, Lottie van Starkenburg, Arjen van de Wel and Konrad Kuijken (Leiden Observatory, The Netherlands), Gregory Rudnick (Max-Planck-Institut für Astrophysik, Garching, Germany), Hans-Walter Rix (Max-Planck-Institut für Astronomie, Heidelberg, Germany), Alan Moorwood and Emanuele Daddi (ESO, Garching, Germany) and Pieter G. van Dokkum (California Institute of Technology, Pasadena, USA). [3]: In astronomy, the redshift denotes the fraction by which the lines in the spectrum of an object are shifted towards longer wavelengths. The observed redshift of a remote galaxy provides an estimate of its distance.

  16. Role of Residual Kidney Function and Convective Volume on Change in β2-Microglobulin Levels in Hemodiafiltration Patients

    PubMed Central

    Penne, E. Lars; van der Weerd, Neelke C.; Blankestijn, Peter J.; van den Dorpel, Marinus A.; Grooteman, Muriel P.C.; Nubé, Menso J.; Lévesque, Renée; Bots, Michiel L.

    2010-01-01

    Background and objectives: Removal of β2-microglobulin (β2M) can be increased by adding convective transport to hemodialysis (HD). The aim of this study was to investigate the change in β2M levels after 6-mo treatment with hemodiafiltration (HDF) and to evaluate the role of residual kidney function (RKF) and the amount of convective volume with this change. Design, setting, participants, & measurements: Predialysis serum β2M levels were evaluated in 230 patients with and 176 patients without RKF from the CONvective TRAnsport STudy (CONTRAST) at baseline and 6 mo after randomization for online HDF or low-flux HD. In HDF patients, potential determinants of change in β2M were analyzed using multivariable linear regression models. Results: Mean serum β2M levels decreased from 29.5 ± 0.8 (±SEM) at baseline to 24.3 ± 0.6 mg/L after 6 mo in HDF patients and increased from 31.9 ± 0.9 to 34.4 ± 1.0 mg/L in HD patients, with the difference of change between treatment groups being statistically significant (regression coefficient −7.7 mg/L, 95% confidence interval −9.5 to −5.6, P < 0.001). This difference was more pronounced in patients without RKF as compared with patients with RKF. In HDF patients, β2M levels remained unchanged in patients with GFR >4.2 ml/min per 1.73 m². The β2M decrease was not related to convective volume. Conclusions: This study demonstrated effective lowering of β2M levels by HDF, especially in patients without RKF. The role of the amount of convective volume on β2M decrease appears limited, possibly because of resistance to β2M transfer between body compartments. PMID:19965537

  17. Long-term clinical parameters after switching to nocturnal haemodialysis: a Dutch propensity-score-matched cohort study comparing patients on nocturnal haemodialysis with patients on three-times-a-week haemodialysis/haemodiafiltration.

    PubMed

    Jansz, Thijs Thomas; Özyilmaz, Akin; Grooteman, Muriel P C; Hoekstra, Tiny; Romijn, Marieke; Blankestijn, Peter J; Bots, Michiel L; van Jaarsveld, Brigit C

    2018-03-08

    Nocturnal haemodialysis (NHD), characterised by 8-hour sessions ≥3 times a week, is known to improve clinical parameters in the short term compared with conventional-schedule haemodialysis (HD), generally 3×3.5-4 hours a week. We studied long-term effects of NHD and used patients on conventional HD/haemodiafiltration (HDF) as controls. Four-year prospective follow-up of patients who switched to NHD; we compared them with patients on HD/HDF using propensity score matching. 28 Dutch dialysis centres. We included 159 patients starting with NHD any time since 2004, aged 56.7±12.9 years, with median dialysis vintage 2.3 (0.9-5.1) years. We propensity-score matched 100 patients on NHD to 100 on HD/HDF. Control of hypertension (predialysis blood pressure, number of antihypertensives), phosphate (phosphate, number of phosphate binders), nutritional status and inflammation (albumin, C reactive protein and postdialysis weight) and anaemia (erythropoiesis-stimulating agent (ESA) resistance). Switching to NHD was associated with a non-significant reduction of antihypertensives compared with HD/HDF (OR <2 types 2.17, 95% CI 0.86 to 5.50, P=0.11); and a prolonged lower need for phosphate binders (OR <2 types 1.83, 95% CI 1.10 to 3.03, P=0.02). NHD was not associated with significant changes in blood pressure or phosphate. NHD was associated with significantly higher albumin over time compared with HD/HDF (0.70 g/L/year, 95% CI 0.10 to 1.30, P=0.02). ESA resistance decreased significantly in NHD compared with HD/HDF, resulting in a 33% lower ESA dose in the long term. After switching to NHD, the lower need for antihypertensives, phosphate binders and ESA persists for at least 4 years. These sustained improvements in NHD contrast significantly with the course of these parameters during continued treatment with conventional-schedule HD and HDF. 
NHD provides an optimal form of dialysis, also suitable for patients expected to have a long waiting time for transplantation or those facing indefinite dialysis. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  18. Heterogeneous Dental Follicle Cells and the Regeneration of Complex Periodontal Tissues

    PubMed Central

    Guo, Weihua; Chen, Lei; Gong, Kun; Ding, Bofu

    2012-01-01

    Dental follicle cells (DFCs) are a heterogeneous population that exhibit a variety of phenotypes. However, it remains unclear whether DFCs can maintain stem cell characteristics, or mediate tissue regeneration to form single or complex tissues in the periodontium, after long-term culturing. Therefore, DFCs were isolated from human impacted molars (HIM-DFCs), passaged >30 times, and then evaluated for their heterogeneity and multipotential differentiation. Morphology, proliferation, epitope profile, and mineralization characteristics of clones derived from single HIM-DFCs in vitro were also assayed. HIM-DFCs (passage #30) were found to be positive for the heterogeneous markers Notch-1, Stro-1, alkaline phosphomonoesterase (ALP), type I collagen (COL-I), type III collagen (COL-III), and osteocalcin. Moreover, the passage #30 HDF1, 2, and 3 subclone classes identified in this study were found to express high levels of the mesenchymal stem cell markers CD146 and Stro-1. HDF3 subclones were also associated with the strongest ALP staining detected, and strongly expressed osteoblast and cementoblast markers, including COL-I, COL-III, bone sialoprotein (BSP), and Runx2. In contrast, the HDF1 subclone strongly expressed COL-I and COL-III, yet weakly expressed BSP and Runx2. The HDF2 subclone was associated with the strongest proliferative capacity. To evaluate differentiation characteristics in vivo, these various cell populations were combined with ceramic bovine bone and implanted into subcutaneous pockets of nude mice. The 30th-passage HDF1 and HDF3 subclones were observed to contribute to fiber collagens and the mineralized matrix present, respectively, whereas HDF2 subclones were found to have a minimal role in these formations. The formation of a cementum-periodontal ligament (PDL) complex was observed 6 weeks after HIM-DFCs (passage #30) were implanted in vivo, thus suggesting that these cells maintain stem cell characteristics. Therefore, subclones HDF1-3 may be related to the differentiation of fibroblasts in the PDL, undifferentiated cells, and osteoblasts and cementoblasts, respectively. Overall, this study is the first to amplify HIM-DFCs and associated subclones with the goal of reconstructing complex or single periodontal tissues. Moreover, our results demonstrate the potential for this treatment approach to address periodontal defects that result from periodontitis, or for the regeneration of teeth. PMID:21919800

  19. Heterogeneous dental follicle cells and the regeneration of complex periodontal tissues.

    PubMed

    Guo, Weihua; Chen, Lei; Gong, Kun; Ding, Bofu; Duan, Yinzhong; Jin, Yan

    2012-03-01

    Dental follicle cells (DFCs) are a heterogeneous population that exhibit a variety of phenotypes. However, it remains unclear whether DFCs can maintain stem cell characteristics, or mediate tissue regeneration to form single or complex tissues in the periodontium, after long-term culturing. Therefore, DFCs were isolated from human impacted molars (HIM-DFCs), passaged >30 times, and then evaluated for their heterogeneity and multipotential differentiation. Morphology, proliferation, epitope profile, and mineralization characteristics of clones derived from single HIM-DFCs in vitro were also assayed. HIM-DFCs (passage #30) were found to be positive for the heterogeneous markers Notch-1, Stro-1, alkaline phosphomonoesterase (ALP), type I collagen (COL-I), type III collagen (COL-III), and osteocalcin. Moreover, the passage #30 HDF1, 2, and 3 subclone classes identified in this study were found to express high levels of the mesenchymal stem cell markers CD146 and Stro-1. HDF3 subclones were also associated with the strongest ALP staining detected, and strongly expressed osteoblast and cementoblast markers, including COL-I, COL-III, bone sialoprotein (BSP), and Runx2. In contrast, the HDF1 subclone strongly expressed COL-I and COL-III, yet weakly expressed BSP and Runx2. The HDF2 subclone was associated with the strongest proliferative capacity. To evaluate differentiation characteristics in vivo, these various cell populations were combined with ceramic bovine bone and implanted into subcutaneous pockets of nude mice. The 30th-passage HDF1 and HDF3 subclones were observed to contribute to fiber collagens and the mineralized matrix present, respectively, whereas HDF2 subclones were found to have a minimal role in these formations. The formation of a cementum-periodontal ligament (PDL) complex was observed 6 weeks after HIM-DFCs (passage #30) were implanted in vivo, thus suggesting that these cells maintain stem cell characteristics. Therefore, subclones HDF1-3 may be related to the differentiation of fibroblasts in the PDL, undifferentiated cells, and osteoblasts and cementoblasts, respectively. Overall, this study is the first to amplify HIM-DFCs and associated subclones with the goal of reconstructing complex or single periodontal tissues. Moreover, our results demonstrate the potential for this treatment approach to address periodontal defects that result from periodontitis, or for the regeneration of teeth.

  20. Evolution of the LBT Telemetry System

    NASA Astrophysics Data System (ADS)

    Summers, K.; Biddick, C.; De La Peña, M. D.; Summers, D.

    2014-05-01

    The Large Binocular Telescope (LBT) Telescope Control System (TCS) records about 10GB of telemetry data per night. Additionally, the vibration monitoring system records about 9GB of telemetry data per night. Through 2013, we have amassed over 6TB of Hierarchical Data Format (HDF5) files and almost 9TB in a MySQL database of TCS and vibration data. The LBT telemetry system, in its third major revision since 2004, provides the mechanism to capture and store this data. The telemetry system has evolved from a simple HDF file system with MySQL stream definitions within the TCS, to a separate system using a MySQL database system for the definitions and data, and finally to no database use at all, using HDF5 files.

  1. A price and performance comparison of three different storage architectures for data in cloud-based systems

    NASA Astrophysics Data System (ADS)

    Gallagher, J. H. R.; Jelenak, A.; Potter, N.; Fulker, D. W.; Habermann, T.

    2017-12-01

    Providing data services based on cloud computing technology that are equivalent to those developed for traditional computing and storage systems is critical for successful migration to cloud-based architectures for data production, scientific analysis and storage. OPeNDAP Web-service capabilities (comprising the Data Access Protocol (DAP) specification plus open-source software for realizing DAP in servers and clients) are among the most widely deployed means for achieving data-as-service functionality in the Earth sciences. OPeNDAP services are especially common in traditional data center environments where servers offer access to datasets stored in (very large) file systems, and a preponderance of the source data for these services is stored in the Hierarchical Data Format Version 5 (HDF5). Three candidate architectures for serving NASA satellite Earth Science HDF5 data via Hyrax running on Amazon Web Services (AWS) were developed and their performance examined for a set of representative use cases. Performance was assessed on both runtime and incurred cost. The three architectures differ in how HDF5 files are stored in the Amazon Simple Storage Service (S3) and how the Hyrax server (as an EC2 instance) retrieves their data. Results for both serial and parallel access to HDF5 data in S3 will be presented. While the study focused on HDF5 data, OPeNDAP and the Hyrax data server, the architectures are generic and the analysis can be extrapolated to many different data formats, web APIs, and data servers.
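One reason such architectures can perform differently is how much of each stored object must be transferred per request: S3 supports HTTP range requests (a `Range: bytes=start-end` header on GET), so a server can fetch only the slice of an HDF5 file it needs. A local simulation of that access pattern (the file layout below is hypothetical; real S3 reads would go through an HTTP client):

```python
import os
import tempfile

def range_read(path, start, length):
    """Mimic an S3 GET with 'Range: bytes=start-(start+length-1)':
    transfer only the requested slice, not the whole object."""
    with open(path, "rb") as f:
        f.seek(start)
        return f.read(length)

# A stand-in "object" holding fixed-size records, as a dataset in S3 might.
RECORD = 16
path = os.path.join(tempfile.mkdtemp(), "object.bin")
with open(path, "wb") as f:
    for i in range(1000):
        f.write(i.to_bytes(4, "little") + b"\x00" * (RECORD - 4))

# Reading record 500 transfers 16 bytes instead of the full 16000-byte object.
data = range_read(path, 500 * RECORD, RECORD)
value = int.from_bytes(data[:4], "little")
```

Whether ranged reads beat downloading whole objects depends on request count versus bytes moved, which is exactly the runtime/cost trade-off the study examines.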

  2. Transformation of HDF-EOS metadata from the ECS model to ISO 19115-based XML

    NASA Astrophysics Data System (ADS)

    Wei, Yaxing; Di, Liping; Zhao, Baohua; Liao, Guangxuan; Chen, Aijun

    2007-02-01

    Nowadays, geographic data, such as NASA's Earth Observation System (EOS) data, are playing an increasingly important role in many areas, including academic research, government decisions and even people's everyday lives. As the quantity of geographic data becomes increasingly large, a major problem is how to fully make use of such data in a distributed, heterogeneous network environment. In order for a user to effectively discover and retrieve the specific information that is useful, the geographic metadata should be described and managed properly. Fortunately, the emergence of XML and Web Services technologies greatly promotes information distribution across the Internet. The research effort discussed in this paper presents a method and its implementation for transforming Hierarchical Data Format (HDF)-EOS metadata from the NASA ECS model to ISO 19115-based XML, which will be managed by the Open Geospatial Consortium (OGC) Catalogue Services—Web Profile (CSW). Using XML and international standards rather than domain-specific models to describe the metadata of those HDF-EOS data, and further using CSW to manage the metadata, can allow metadata information to be searched and interchanged more widely and easily, thus promoting the sharing of HDF-EOS data.
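
    The shape of such a transformation can be sketched with the standard library. This is a minimal sketch only: real ISO 19115 records use the gmd/gco namespaces and a far richer structure, and the element and field names below are simplified illustrations, not the paper's actual mapping:

```python
import xml.etree.ElementTree as ET

# Minimal sketch: map a few HDF-EOS ECS core-metadata fields into an
# ISO 19115-flavoured XML record. Element names are simplified stand-ins.
def ecs_to_iso(ecs):
    root = ET.Element("MD_Metadata")
    ident = ET.SubElement(root, "identificationInfo")
    ET.SubElement(ident, "title").text = ecs["ShortName"]
    ET.SubElement(ident, "abstract").text = ecs.get("LongName", "")
    extent = ET.SubElement(ident, "geographicElement")
    for k in ("WestBoundingCoordinate", "EastBoundingCoordinate",
              "SouthBoundingCoordinate", "NorthBoundingCoordinate"):
        ET.SubElement(extent, k).text = str(ecs[k])
    return ET.tostring(root, encoding="unicode")

xml_doc = ecs_to_iso({
    "ShortName": "MOD09GA",   # hypothetical example granule metadata
    "LongName": "MODIS Surface Reflectance Daily",
    "WestBoundingCoordinate": -180, "EastBoundingCoordinate": 180,
    "SouthBoundingCoordinate": -90, "NorthBoundingCoordinate": 90,
})
```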

  3. Expediting Scientific Data Analysis with Reorganization of Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Byna, Surendra; Wu, Kesheng

    2013-08-19

    Data producers typically optimize the layout of data files to minimize the write time. In most cases, data analysis tasks read these files in access patterns different from the write patterns, causing poor read performance. In this paper, we introduce Scientific Data Services (SDS), a framework for bridging the performance gap between writing and reading scientific data. SDS reorganizes data to match the read patterns of analysis tasks and enables transparent data reads from the reorganized data. We implemented an HDF5 Virtual Object Layer (VOL) plugin to redirect the HDF5 dataset read calls to the reorganized data. To demonstrate the effectiveness of SDS, we applied two parallel data organization techniques: a sort-based organization on plasma physics data and a transpose-based organization on mass spectrometry imaging data. We also extended the HDF5 data access API to allow selection of data based on their values through a query interface, called SDS Query. We evaluated the execution time in accessing various subsets of data through the existing HDF5 read API and SDS Query. We showed that reading the reorganized data using SDS is up to 55X faster than reading the original data.
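
    The intuition behind sort-based reorganization can be shown in a few lines: once a field is stored sorted, a value-range query becomes two binary searches plus one contiguous read instead of a full scan. SDS itself operates on HDF5 datasets through its VOL plugin; plain Python lists stand in for them in this sketch:

```python
import bisect
import random

random.seed(0)
energy = [random.uniform(0, 100) for _ in range(10000)]  # write-ordered data

def scan_query(values, lo, hi):
    """Baseline: full scan of the original, write-ordered data."""
    return [v for v in values if lo <= v <= hi]

reorganized = sorted(energy)  # one-time reorganization cost

def sds_style_query(sorted_values, lo, hi):
    """Two binary searches, then a contiguous read of the matching range."""
    i = bisect.bisect_left(sorted_values, lo)
    j = bisect.bisect_right(sorted_values, hi)
    return sorted_values[i:j]

hits_scan = scan_query(energy, 42.0, 43.0)
hits_sds = sds_style_query(reorganized, 42.0, 43.0)
```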

  4. A novel cell-stiffness-fingerprinting analysis by scanning atomic force microscopy: Comparison of fibroblasts and diverse cancer cell lines

    PubMed Central

    Zoellner, Hans; Paknejad, Navid; Manova, Katia; Moore, Malcolm

    2016-01-01

    Differing stimuli affect cell-stiffness while cancer metastasis further relates to cell-stiffness. Cell-stiffness determined by Atomic Force Microscopy (AFM) has been limited by measurement over nuclei to avoid spurious substratum effects in thin cytoplasmic domains, and we sought to develop a more complete approach including cytoplasmic areas. 90 μm square fields were recorded from 10 sites of cultured Human Dermal Fibroblasts (HDF), and 3 sites each for melanoma (MM39, WM175, MeIRMu), osteosarcoma (SAOS-2, U2OS), and ovarian carcinoma (COLO316, PEO4) cell lines, each site providing 1,024 measurements as 32x32 square grids. Stiffness recorded below 0.8 μm height was occasionally influenced by substratum, so only stiffness recorded above 0.8 μm was analyzed, but all sites were included for height and volume analysis. COLO316 had the lowest cell height and volume, followed by HDF (p<0.0001), and then PEO4, SAOS-2, MeIRMu, WM175, U2OS, and MM39. HDF were stiffer than all other cells (p < 0.0001), while in descending order of stiffness were PEO4, COLO316, WM175, SAOS-2, U2OS, MM39, and MeIRMu (p < 0.02). Stiffness-fingerprints comprised scattergrams of stiffness values plotted against the height at which each stiffness value was recorded, and appeared unique for each cell type studied, although in most cases the overall form of fingerprints was similar, with maximum stiffness at low height measurements and a second lower peak occurring at high height levels. We suggest our stiffness-fingerprint analytical method provides a more nuanced description than previously reported, and will facilitate study of the stiffness response to cell stimulation. PMID:26357955

  5. Nocturnal, every-other-day, online haemodiafiltration: an effective therapeutic alternative.

    PubMed

    Maduell, Francisco; Arias, Marta; Durán, Carlos E; Vera, Manel; Fontseré, Néstor; Azqueta, Manel; Rico, Nayra; Pérez, Nuria; Sentis, Alexis; Elena, Montserrat; Rodriguez, Néstor; Arcal, Carola; Bergadá, Eduardo; Cases, Aleix; Bedini, Jose Luis; Campistol, Josep M

    2012-04-01

    Longer and more frequent dialysis sessions have demonstrated excellent survival and clinical advantages, while online haemodiafiltration (OL-HDF) provides the most efficient form of dialysis treatment. The aim of this study was to evaluate the beneficial effects of a longer (nocturnal) and more frequent (every-other-day) dialysis schedule with OL-HDF at the same or the highest convective volume. This prospective, in-centre crossover study was carried out in 26 patients, 18 males and 8 females, 49.2±14 years old, on 4-5 h thrice-weekly post-dilution OL-HDF, switched to nocturnal every-other-day OL-HDF. Patient inclusion criteria consisted of stable patients with good vascular access and with good prospects for improved occupational, psychological and social rehabilitation. Patients were randomly assigned into two groups: Group A received the same convective volume as previously for 6 months followed by a higher convective volume for a further 6 months, while Group B received the same schedule in reverse order. Nocturnal every-other-day OL-HDF was well tolerated and 56% of patients who were working during the baseline period continued to work throughout the study with practically no absenteeism. The convective volume was 26.7±2 L at baseline, 27.5±2 L with the unchanged volume and 42.9±4 L with the higher volume. eKt/V increased from 1.75±0.4 to 3.37±0.9. Bicarbonate, blood urea nitrogen (BUN) and creatinine values decreased, while phosphate levels fell markedly with a 90% reduction in phosphate binders. Blood pressure and left ventricular hypertrophy (LVH) improved and the use of anti-hypertensive drugs decreased. In both groups, BUN, creatinine and β2-microglobulin reduction ratios improved. Different removal patterns were observed for myoglobin, prolactin and α1-acid glycoprotein. Nocturnal every-other-day OL-HDF could be an excellent therapeutic alternative since good tolerance and occupational rehabilitation, marked improvement in dialysis dose, nutritional status, LVH, phosphate and hypertension control and a substantial reduction in drug requirements were observed. In this crossover study, different removal patterns of large solutes were identified.

  6. Eight-Year Experience with Nocturnal, Every-Other-Day, Online Haemodiafiltration.

    PubMed

    Maduell, Francisco; Ojeda, Raquel; Arias-Guillen, Marta; Rossi, Florencia; Fontseré, Néstor; Vera, Manel; Rico, Nayra; Gonzalez, Leonardo Nicolás; Piñeiro, Gastón; Jiménez-Hernández, Mario; Rodas, Lida; Bedini, José Luis

    2016-01-01

    New haemodialysis therapeutic regimens are required to improve patient survival. Longer and more frequent dialysis sessions have produced excellent survival and clinical advantages, while online haemodiafiltration (OL-HDF) provides the most efficient form of dialysis treatment. In this single-centre observational study, 57 patients on 4-5-hour thrice-weekly OL-HDF were switched to nocturnal every-other-day OL-HDF. Inclusion criteria consisted of stable patients with good prospects for improved occupational, psychological and social rehabilitation. The aim of this study was to report our 8-year experience with this schedule and to evaluate analytical and clinical outcomes. Nocturnal, every-other-day OL-HDF was well tolerated and 56% of patients were working. The convective volume increased from 26.7 ± 2 litres at baseline to 46.6 ± 6.5 litres at 24 months (p < 0.01). Increasing the dialysis dose significantly decreased bicarbonate, blood-urea-nitrogen and creatinine values. Predialysis phosphate levels fell markedly with complete suspension of phosphate binders from the second year of follow-up. Although haemoglobin was unchanged, there was a 50.4% reduction in darbepoetin dose at 24 months and a significant decrease in the erythropoietin resistance index. Blood pressure significantly decreased within a few months. Antihypertensive medication requirements were decreased by 60% after 3 months and by 73% after 1 year, and this difference was maintained thereafter. Nocturnal, every-other-day OL-HDF could be an excellent therapeutic alternative since it is well tolerated and leads to clinical and social-occupational rehabilitation with satisfactory morbidity and mortality. These encouraging results encourage us to continue and to invite other clinicians to join this initiative. © 2016 S. Karger AG, Basel.

  7. Anti-proliferative and anti-migratory effects of hyperforin in 2D and 3D artificial constructs of human dermal fibroblasts - A new option for hypertrophic scar treatment?

    PubMed

    Füller, J; Müller-Goymann, C C

    2018-05-01

    Hyperforin (HYP), one of the main bioactive compounds in extracts of Hypericum perforatum, is a potential drug candidate for the treatment of skin diseases. Since such extracts have proven to support wound healing, in the present study the effects of HYP on human dermal fibroblasts (HDF) were evaluated in 2D and 3D in vitro dermal constructs. Viability and cytotoxicity assays as well as a live-dead cell staining were performed to test at which concentration HYP reduces viability and/or shows cytotoxicity. Furthermore, cytotoxic, anti-proliferative and anti-migratory effects were differentiated. For the latter purpose, a 2D migration assay was performed. HDF-induced contraction of a 3D artificial dermal (AD) construct was determined at given HYP concentrations. Induction of apoptosis was examined by determination of caspase 3/7 activities. HYP reduced viability of HDF down to 70% at concentrations of 5-10 µM. This decrease was not due to cytotoxicity but to a reduction in proliferation, as shown by the proliferation assay, the cytotoxicity assay and the live-dead cell staining. The 2D migration assay showed that HYP reduced migration activity of HDF cells at a concentration of 10 µM. At this concentration HYP also reduced the HDF-induced contraction of collagen gels as 3D AD constructs. Apoptotic effects of HYP were excluded by a caspase 3/7 activity assay. The results show for the first time that HYP may be a candidate for the treatment of hypertrophic scars rather than an agent promoting wound healing. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Accidental mold/termite testing of high density fiberboard (HDF) treated with borates and N’N-naphthaloylhydroxylamine (NHA)

    Treesearch

    S. Nami Kartal; Harold H. Burdsall; Frederick Green

    2003-01-01

    High density fibreboard (HDF) was made from beech and pine furnish (50:50) and treated with boric acid (0.1-3%), borax (0.1-3%) or N'-N-(1,8-naphthalyl)hydroxylamine (NHA) (0.1-1%) prior to gluing with urea formaldehyde (UF) resin in order to determine resistance to Eastern subterranean termites (Reticulitermes flavipes Kollar), the most economically important...

  9. The MODIS reprojection tool

    USGS Publications Warehouse

    Dwyer, John L.; Schmidt, Gail L.; Qu, J.J.; Gao, W.; Kafatos, M.; Murphy , R.E.; Salomonson, V.V.

    2006-01-01

    The MODIS Reprojection Tool (MRT) is designed to help individuals work with MODIS Level-2G, Level-3, and Level-4 land data products. These products are referenced to a global tiling scheme in which each tile is approximately 10° latitude by 10° longitude and non-overlapping (Fig. 9.1). If desired, the user may reproject only selected portions of the product (spatial or parameter subsetting). The software may also be used to convert MODIS products to file formats (generic binary and GeoTIFF) that are more readily compatible with existing software packages. The MODIS land products distributed by the Land Processes Distributed Active Archive Center (LP DAAC) are in the Hierarchical Data Format - Earth Observing System (HDF-EOS) format, developed by the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign for the NASA EOS Program. Each HDF-EOS file comprises one or more science data sets (SDSs) corresponding to geophysical or biophysical parameters. Metadata are embedded in the HDF file as well as contained in a .met file that is associated with each HDF-EOS file. The MRT supports 8-bit, 16-bit, and 32-bit integer data (both signed and unsigned), as well as 32-bit float data. The data type of the output is the same as the data type of each corresponding input SDS.
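
    The indexing idea behind a non-overlapping 10° x 10° tiling can be sketched on an equal-angle grid. Note this is only a sketch: the real MODIS land grid is sinusoidal (36 horizontal by 18 vertical tiles with equal-area geometry), so the function below illustrates the bookkeeping, not the actual MODIS tile mapping:

```python
import math

# Simplified equal-angle version of a 10-degree global tiling:
# h counts tiles west-to-east, v counts tiles north-to-south.
def tile_index(lat, lon, tile_deg=10.0):
    """Map (lat, lon) in degrees to non-overlapping (h, v) tile indices."""
    h = int(math.floor((lon + 180.0) / tile_deg))  # 0..35
    v = int(math.floor((90.0 - lat) / tile_deg))   # 0..17
    return h, v

h, v = tile_index(38.9, -77.0)   # a point near Washington, DC
```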

  10. Facing regulatory challenges of on-line hemodiafiltration.

    PubMed

    Kümmerle, Wolfgang

    2011-01-01

    On-line hemodiafiltration (on-line HDF) is the result of a vision that triggered multifarious changes in very different areas. Driven by the idea to offer better medical treatment for renal patients, technological innovations were developed and established that also constituted new challenges in the field of regulatory affairs. The existing regulations predominantly addressed the quality and safety of those products needed to perform dialysis treatment which were supplied by industrial manufacturers. However, the complexity of treatment system required for the provision of on-line fluids demanded a holistic approach encompassing all components involved. Hence, focus was placed not only on single products, but much more on their interfacing, and the clinical infrastructure, in particular, had to undergo substantial changes. The overall understanding of the interaction between such factors, quite different in their nature, was crucial to overcome the arising regulatory obstacles. This essay describes the evolution of the on-line HDF procedure from the regulatory point of view. A simplified diagram demonstrates the path taken from the former regulatory understanding to the realization of necessary changes. That achievement was only possible through 'management of preview' and consequent promotion of technical and medical innovations as well as regulatory re-evaluations. Copyright © 2011 S. Karger AG, Basel.

  11. High-Performance I/O: HDF5 for Lattice QCD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurth, Thorsten; Pochinsky, Andrew; Sarje, Abhinav

    2015-01-01

    Practitioners of lattice QCD/QFT have been some of the primary pioneer users of the state-of-the-art high-performance-computing systems, and contribute towards the stress tests of such new machines as soon as they become available. As with all aspects of high-performance computing, I/O is becoming an increasingly specialized component of these systems. In order to take advantage of the latest available high-performance I/O infrastructure, to ensure reliability and backwards compatibility of data files, and to help unify the data structures used in lattice codes, we have incorporated parallel HDF5 I/O into the SciDAC-supported USQCD software stack. Here we present the design and implementation of this I/O framework. Our HDF5 implementation outperforms optimized QIO at the 10-20% level and leaves room for further improvement by utilizing appropriate dataset chunking.
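
    Why "appropriate dataset chunking" leaves room for improvement: a hyperslab read touches every chunk it overlaps, and each touched chunk is a separate I/O unit. The pure-Python arithmetic below sketches that bookkeeping (HDF5 does the equivalent internally); the shapes are illustrative, not from the paper:

```python
from itertools import product

def chunks_touched(start, count, chunk_shape):
    """Chunk indices overlapped by the hyperslab [start, start+count) per dimension."""
    ranges = []
    for s, c, ch in zip(start, count, chunk_shape):
        first, last = s // ch, (s + c - 1) // ch
        ranges.append(range(first, last + 1))
    return list(product(*ranges))

# A 64x64 read aligned to 64x64 chunks touches one chunk...
aligned = chunks_touched((64, 64), (64, 64), (64, 64))
# ...while the same read shifted by one element touches four.
shifted = chunks_touched((65, 65), (64, 64), (64, 64))
```

Matching chunk boundaries to the lattice decomposition keeps each rank's read aligned, which is the kind of tuning the abstract alludes to.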

  12. High-Performance I/O: HDF5 for Lattice QCD

    DOE PAGES

    Kurth, Thorsten; Pochinsky, Andrew; Sarje, Abhinav; ...

    2017-05-09

    Practitioners of lattice QCD/QFT have been some of the primary pioneer users of the state-of-the-art high-performance-computing systems, and contribute towards the stress tests of such new machines as soon as they become available. As with all aspects of high-performance computing, I/O is becoming an increasingly specialized component of these systems. In order to take advantage of the latest available high-performance I/O infrastructure, to ensure reliability and backwards compatibility of data files, and to help unify the data structures used in lattice codes, we have incorporated parallel HDF5 I/O into the SciDAC-supported USQCD software stack. Here we present the design and implementation of this I/O framework. Our HDF5 implementation outperforms optimized QIO at the 10-20% level and leaves room for further improvement by utilizing appropriate dataset chunking.

  13. Reduced protein bound uraemic toxins in vegetarian kidney failure patients treated by haemodiafiltration.

    PubMed

    Kandouz, Sakina; Mohamed, Ali Shendi; Zheng, Yishan; Sandeman, Susan; Davenport, Andrew

    2016-10-01

    Introduction Indoxyl sulfate (IS) and p cresyl sulfate (PCS) are protein bound toxins which accumulate with chronic kidney disease. Haemodiafiltration (HDF) increases middle molecule clearances and has been suggested to increase IS and PCS clearance. We therefore wished to establish whether higher convective clearances with HDF would reduce IS and PCS concentrations. Methods We measured total plasma IS and PCS in a cohort of 138 CKD5d patients treated by On-line HDF (Ol-HDF), by high pressure liquid chromatography. Findings Mean patient age was 64.6 ± 16.5 years, 60.1% male, 57.3% diabetic, median dialysis vintage 25.9 months (12.4-62.0). The mean IS concentration was 79.8 ± 56.4 µmol/L and PCS 140.3 ± 101.8 µmol/L. On multivariate analysis, IS was associated with serum albumin (β 4.31, P < 0.001), and negatively with residual renal function (β-4.1, P = 0.02) and vegetarian diet (β-28.3, P = 0.048), and PCS negatively with log C reactive protein (β-75.8, P < 0.001) and vegetarian diet (β-109, P = 0.001). Vegetarian patients had lower IS and PCS levels (median 41.5 (24.2-71.9) vs. 78.1 (49.5-107.5) and PCS (41.6 (14.2-178.3) vs. 127.3 (77.4-205.6) µmol/L, respectively, P < 0.05. Vegetarian patients had lower pre-Ol-HDF serum urea and phosphate (13.8 ± 3.8 vs. 18.4 ± 5.2 mmol/L, and 1.33 ± 0.21 vs. 1.58 ± 0.45 mmol/L), and estimated urea nitrogen intake (1.25 ± 0.28 vs. 1.62 ± 0.5 g/kg/day), respectively, all P < 0.05. Discussion Plasma IS and PCS concentrations were not lower with Ol-HDF compared to previous studies in haemodialysis patients. However those eating a vegetarian diet had reduced IS and PCS concentrations. Although this could be due to differences in dietary protein intake, a vegetarian diet may also potentially reduce IS and PCS production by the intestinal microbiome. © 2016 International Society for Hemodialysis.

  14. Tuning HDF5 subfiling performance on parallel file systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Byna, Suren; Chaarawi, Mohamad; Koziol, Quincey

    Subfiling is a technique used on parallel file systems to reduce locking and contention issues when multiple compute nodes interact with the same storage target node. Subfiling provides a compromise between the single shared file approach, which instigates the lock contention problems on parallel file systems, and having one file per process, which results in generating a massive and unmanageable number of files. In this paper, we evaluate and tune the performance of the recently implemented subfiling feature in HDF5. Specifically, we explain the implementation strategy of the subfiling feature in HDF5, provide examples of using the feature, and evaluate and tune the parallel I/O performance of this feature with the parallel file systems of the Cray XC40 system at NERSC (Cori), which include a burst buffer storage and a Lustre disk-based storage. We also evaluate I/O performance on the Cray XC30 system, Edison, at NERSC. Our results show a performance advantage of 1.2X to 6X with subfiling compared to writing a single shared HDF5 file. We present our exploration of configurations, such as the number of subfiles and the number of Lustre storage targets for storing files, as optimization parameters to obtain superior I/O performance. Based on this exploration, we discuss recommendations for achieving good I/O performance as well as limitations of using the subfiling feature.
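
    The compromise subfiling strikes (N ranks sharing k subfiles, with 1 < k < N) can be sketched as a rank-to-subfile mapping. The round-robin block scheme below is a plausible illustration, not necessarily the assignment HDF5's subfiling implementation actually uses:

```python
def subfile_of(rank, n_ranks, n_subfiles):
    """Assign each MPI rank to one of n_subfiles, in contiguous blocks."""
    ranks_per_subfile = -(-n_ranks // n_subfiles)  # ceiling division
    return rank // ranks_per_subfile

# 1024 ranks sharing 8 subfiles: 128 consecutive ranks per subfile,
# instead of 1024 files (file explosion) or 1 file (lock contention).
assignments = [subfile_of(r, 1024, 8) for r in range(1024)]
```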

  15. Cost-Effectiveness Analysis of High-Efficiency Hemodiafiltration Versus Low-Flux Hemodialysis Based on the Canadian Arm of the CONTRAST Study.

    PubMed

    Lévesque, Renee; Marcelli, Daniele; Cardinal, Héloïse; Caron, Marie-Line; Grooteman, Muriel P C; Bots, Michiel L; Blankestijn, Peter J; Nubé, Menso J; Grassmann, Aileen; Canaud, Bernard; Gandjour, Afschin

    2015-12-01

    The aim of this study was to assess the cost effectiveness of high-efficiency on-line hemodiafiltration (OL-HDF) compared with low-flux hemodialysis (LF-HD) for patients with end-stage renal disease (ESRD) based on the Canadian (Centre Hospitalier de l'Université de Montréal) arm of a parallel-group randomized controlled trial (RCT), the CONvective TRAnsport STudy. An economic evaluation was conducted for the period of the RCT (74 months). In addition, a Markov state transition model was constructed to simulate costs and health benefits over lifetime. The primary outcome was costs per quality-adjusted life-year (QALY) gained. The analysis had the perspective of the Quebec public healthcare system. A total of 130 patients were randomly allocated to OL-HDF (n = 67) and LF-HD (n = 63). The cost-utility ratio of OL-HDF versus LF-HD was Can$53,270 per QALY gained over lifetime. This ratio was fairly robust in the sensitivity analysis. The cost-utility ratio was lower than that of LF-HD compared with no treatment (immediate death), which was Can$93,008 per QALY gained. High-efficiency OL-HDF can be considered a cost-effective treatment for ESRD in a Canadian setting. Further research is needed to assess cost effectiveness in other settings and healthcare systems.

  16. Performance of hemodialysis with novel medium cut-off dialyzers

    PubMed Central

    Lyko, Raphael; Nilsson, Lars-Göran; Beck, Werner; Amdahl, Michael; Lechner, Petra; Schneider, Andreas; Wanner, Christoph; Rosenkranz, Alexander R.; Krieter, Detlef H.

    2017-01-01

    Background. Compared to high-flux dialysis membranes, novel medium cut-off (MCO) membranes show greater permeability for larger middle molecules. Methods. In two prospective, open-label, controlled, randomized, crossover pilot studies, 39 prevalent hemodialysis (HD) patients were studied in four dialysis treatments as follows: study 1, three MCO prototype dialyzers (AA, BB and CC with increasing permeability) and one high-flux dialyzer in HD; and study 2, two MCO prototype dialyzers (AA and BB) in HD and high-flux dialyzers in HD and hemodiafiltration (HDF). Primary outcome was lambda free light chain (λFLC) overall clearance. Secondary outcomes included overall clearances and pre-to-post-reduction ratios of middle and small molecules, and safety of MCO HD treatments. Results. MCO HD provided greater λFLC overall clearance [least square mean (standard error)] as follows: study 1: MCO AA 8.5 (0.54), MCO BB 11.3 (0.51), MCO CC 15.0 (0.53) versus high-flux HD 3.6 (0.51) mL/min; study 2: MCO AA 10.0 (0.58), MCO BB 12.5 (0.57) versus high-flux HD 4.4 (0.57) and HDF 6.2 (0.58) mL/min. Differences between MCO and high-flux dialyzers were consistently significant in mixed model analysis (each P < 0.001). Reduction ratios of λFLC were greater for MCO. Clearances of α1-microglobulin, complement factor D, kappa FLC (κFLC) and myoglobin were generally greater with MCO than with high-flux HD and similar to or greater than clearances with HDF. Albumin loss was moderate with MCO, but greater than with high-flux HD and HDF. Conclusions. MCO HD removes a wide range of middle molecules more effectively than high-flux HD and even exceeds the performance of high-volume HDF for large solutes, particularly λFLC. PMID:27587605

  17. The star formation history of the Hubble sequence: spatially resolved colour distributions of intermediate-redshift galaxies in the Hubble Deep Field

    NASA Astrophysics Data System (ADS)

    Abraham, R. G.; Ellis, R. S.; Fabian, A. C.; Tanvir, N. R.; Glazebrook, K.

    1999-03-01

    We analyse the spatially resolved colours of distant galaxies of known redshift in the Hubble Deep Field, using a new technique based on matching resolved four-band colour data to the predictions of evolutionary synthesis models. Given some simplifying assumptions, we demonstrate how our technique is capable of probing the evolutionary history of high-redshift systems, noting the specific advantage of observing galaxies at an epoch closer to the time of their formation. We quantify the relative age, dispersion in age, on-going star formation rate and star formation history of distinct components. We explicitly test for the presence of dust and quantify its effect on our conclusions. To demonstrate the potential of the method, we study the spirals and ellipticals in the near-complete sample of 32 I_814<21.9 mag galaxies with z~0.5 studied by Bouwens, Broadhurst & Silk. The dispersion of the internal colours of a sample of 0.4

  18. Guided Tour of Pythonian Museum

    NASA Technical Reports Server (NTRS)

    Lee, H. Joe

    2017-01-01

    At http://hdfeos.org/zoo, we have a large collection of Python examples dealing with NASA HDF (Hierarchical Data Format) products. During this hands-on Python tutorial session, we'll present a few common hacks to access and visualize local NASA HDF data. We'll also cover how to access remote data served by OPeNDAP (Open-source Project for a Network Data Access Protocol). As a glue language, we will demonstrate how you can use Python for your data workflow, from searching for data to analyzing data with machine learning.
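
    Accessing remote data served by OPeNDAP ultimately comes down to building a constrained URL. The sketch below assembles a DAP2-style constraint expression; the endpoint and variable name are hypothetical examples, not addresses from the tutorial:

```python
# Build a DAP2-style subset URL: var[start:stride:stop] per dimension.
def dap_subset_url(base, var, slices):
    """slices: list of (start, stride, stop) tuples, stop inclusive."""
    idx = "".join(f"[{a}:{b}:{c}]" for a, b, c in slices)
    return f"{base}.dods?{var}{idx}"

url = dap_subset_url(
    "https://example.org/opendap/MOD08_D3.hdf",   # hypothetical endpoint
    "Cloud_Fraction_Mean",
    [(0, 1, 179), (0, 1, 359)],
)
```

A client such as pydap or netCDF4 issues equivalent requests under the hood; building the constraint by hand shows what travels over the wire.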

  19. Trade Study: Storing NASA HDF5/netCDF-4 Data in the Amazon Cloud and Retrieving Data via Hyrax Server / THREDDS Data Server

    NASA Technical Reports Server (NTRS)

    Habermann, Ted; Jelenak, Aleksander; Lee, Joe; Yang, Kent; Gallagher, James; Potter, Nathan

    2017-01-01

    As part of the overall effort to understand implications of migrating ESDIS data and services to the cloud we are testing several common OPeNDAP and HDF use cases against three architectures for general performance and cost characteristics. The architectures include retrieving entire files, retrieving datasets using HTTP range gets, and retrieving elements of datasets (chunks) with HTTP range gets. We will describe these architectures and discuss our approach to estimating cost.
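
    The third architecture above, retrieving individual chunks with HTTP range gets, relies on knowing each chunk's byte offset and size (e.g. from a sidecar index of the HDF5 file). Given those, the request is a standard Range header; the offset below is made up for illustration:

```python
# Format an HTTP Range header for one chunk of an object in S3.
def range_header(offset, nbytes):
    """Range header covering nbytes starting at offset (end is inclusive)."""
    return {"Range": f"bytes={offset}-{offset + nbytes - 1}"}

# e.g. a 1 MiB chunk recorded at byte offset 40960 in the object:
hdr = range_header(40960, 1 << 20)
```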

  20. History of the TOW Missile System

    DTIC Science & Technology

    1977-10-01

    pp. 1 .5 .4 & 1 .5 .5 . (2 ) L t r , DCG/LCS, I I C O M , t o CG, AMC, 29 Nar 65, s u b j : PEVA FY 65 APE P r o j AXMS 4220.X.32909...TOW). HDF. 3 8 ~ t r , C G , MICOM, t o CG, AMC, 23 J u l 65 , s u b j : PEVA FY 66 APE P r o j MCMS 4290.X.32947 (TOW). HDF. $1.7 m i l l

  1. Improving the Accessibility and Use of NASA Earth Science Data

    NASA Technical Reports Server (NTRS)

    Tisdale, Matthew; Tisdale, Brian

    2015-01-01

    Many of the NASA Langley Atmospheric Science Data Center (ASDC) Distributed Active Archive Center (DAAC) multidimensional tropospheric and atmospheric chemistry data products are stored in HDF4, HDF5 or NetCDF format, which traditionally have been difficult to analyze and visualize with geospatial tools. With the rising demand from the diverse end-user communities for geospatial tools to handle multidimensional products, several applications, such as ArcGIS, have refined their software. Many geospatial applications now have new functionalities that enable the end user to: store, serve, and perform analysis on each individual variable, its time dimension, and its vertical dimension; use NetCDF, GRIB, and HDF raster data formats directly across applications; and publish output within REST image services or WMS for time- and space-enabled web application development. During this webinar, participants will learn how to leverage geospatial applications such as ArcGIS, OPeNDAP and ncWMS in the production of Earth science information, and in increasing data accessibility and usability.

  2. Higher convection volume exchange with online hemodiafiltration is associated with survival advantage for dialysis patients: the effect of adjustment for body size.

    PubMed

    Davenport, Andrew; Peters, Sanne A E; Bots, Michiel L; Canaud, Bernard; Grooteman, Muriel P C; Asci, Gulay; Locatelli, Francesco; Maduell, Francisco; Morena, Marion; Nubé, Menso J; Ok, Ercan; Torres, Ferran; Woodward, Mark; Blankestijn, Peter J

    2016-01-01

    Mortality remains high for hemodialysis patients. Online hemodiafiltration (OL-HDF) removes more middle-sized uremic toxins, but the outcomes of individual trials comparing OL-HDF with hemodialysis have been discrepant. Secondary analyses reported that higher convective volumes, which are easier to achieve in larger patients, were associated with improved survival. Here we tested the effect of different methods of standardizing OL-HDF convection volume on all-cause and cardiovascular mortality compared with hemodialysis. Pooled individual patient analysis of four prospective trials compared thirds of delivered convection volume with hemodialysis. Convection volumes were either not standardized or standardized to weight, body mass index, body surface area, and total body water. Data were analyzed by multivariable Cox proportional hazards modeling from 2793 patients. All-cause mortality was reduced when the convective dose was unstandardized or standardized to body surface area and total body water: hazard ratios (95% confidence intervals) of 0.65 (0.51-0.82), 0.74 (0.58-0.93), and 0.71 (0.56-0.93) for those receiving higher convective doses. Standardization by body weight or body mass index gave no significant survival advantage. Higher convection volumes were generally associated with greater survival benefit with OL-HDF, but results varied across different ways of standardization for body size. Thus, further studies should take body size into account when evaluating the impact of delivered convection volume on mortality end points. Copyright © 2015 International Society of Nephrology. Published by Elsevier Inc. All rights reserved.
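
    Standardizing a delivered convection volume to body surface area requires a BSA estimate. As an illustration only, the Du Bois formula is one common BSA estimator; the abstract does not state which formula the pooled analysis used, and the session volume below is a made-up example:

```python
# Du Bois & Du Bois BSA estimate: 0.007184 * W^0.425 * H^0.725 (m^2),
# with weight in kg and height in cm. One of several common formulas.
def du_bois_bsa(weight_kg, height_cm):
    return 0.007184 * weight_kg ** 0.425 * height_cm ** 0.725

def litres_per_m2(convection_volume_l, weight_kg, height_cm):
    """Convection volume standardized to body surface area."""
    return convection_volume_l / du_bois_bsa(weight_kg, height_cm)

# A hypothetical 23 L session for a 70 kg, 170 cm patient:
dose = litres_per_m2(23.0, 70.0, 170.0)
```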

  3. Gastrodia elata Blume Extract Modulates Antioxidant Activity and Ultraviolet A-Irradiated Skin Aging in Human Dermal Fibroblast Cells.

    PubMed

    Song, Eunju; Chung, Haeyon; Shim, Eugene; Jeong, Jung-Ky; Han, Bok-Kyung; Choi, Hyuk-Joon; Hwang, Jinah

    2016-11-01

    Gastrodia elata Blume (GEB), a traditional herbal medicine, has been used to treat a wide range of neurological disorders (e.g., paralysis and stroke) and skin problems (e.g., atopic dermatitis and eczema) in oriental medicine. This study was designed to investigate the antioxidant ability of GEB and its antiaging effect on human dermal fibroblast cells (HDF). The total phenolic and flavonoid contents of GEB were 21.8 and 0.43 mg/g dry weight (DW), respectively. The ergothioneine content of GEB was 0.41 mg/mL DW. The DPPH and ABTS radical scavenging activities of GEB at 5 and 10 mg/mL ranged between approximately 31% and 44%. The superoxide dismutase activity of GEB at 10 and 25 mg/mL was 57% and 76%, respectively. GEB increased procollagen type 1 (PC1) production and inhibited matrix metalloproteinase-1 (MMP-1) production and elastase-1 activity in UVA-irradiated HDF. PC1 messenger RNA (mRNA) levels decreased upon UVA irradiation, but recovered in response to high doses of GEB in HDF. In contrast, GEB significantly decreased MMP-1 and elastase-1 mRNA levels, which were markedly induced in UVA-irradiated HDF. Collectively, these results suggest that GEB has sufficient antioxidant ability to prevent the signs of skin aging in UVA-irradiated human skin cells, indicating its potential as a natural antiaging product.

  4. U-Th-He dating of diamond-forming C-O-H fluids and mantle metasomatic events

    NASA Astrophysics Data System (ADS)

    Weiss, Y.; Class, C.; Goldstein, S. L.; Winckler, G.; Kiro, Y.

    2017-12-01

    Carbon- and water-rich (C-O-H) fluids play important roles in the global material circulation and deep Earth processes, and have major impacts on the sub-continental lithospheric mantle (SCLM). Yet the origin and composition of C-O-H fluids, and the timing of fluid-rock interaction, are poorly constrained. 'Fibrous' diamonds encapsulate C-O-H mantle fluids as μm-scale high-density fluid (HDF) inclusions. They can be directly sampled, and offer unique opportunities to investigate metasomatic events involving C-O-H fluids and the SCLM through Earth history. Until now no technique has provided reliable age constraints on HDFs. We applied a new in-vacuum crushing technique to determine the He abundances and 3He/4He ratios of HDFs in diamonds from the Kaapvaal lithosphere, South Africa. Three diamonds with saline HDFs have 3He/4He = 3-4 Ra. In 4He/3He vs. 238U/3He space they define an 'isochron' age of 96 ± 45 Ma, representing the first radiometric age reported for HDFs, and thus for C-O-H mantle fluids. In addition, a diamond with silicic HDFs and two that carry carbonatitic HDFs have low 3He/4He = 0.07-0.6 Ra. Using the measured U, Th, 4He and 3He contents of these diamonds, and the equation for 4He production from U-Th decay, we calculate 3He/4He as a function of time. Metasomatic fluids are derived from MORB, SCLM or subducted components with R/Ra = 3-10, and this is assumed as the HDFs' initial composition. The silicic and carbonatitic HDFs signify two older metasomatic events at 350 and 850 Ma, respectively. Thus, our new data reveal 3 metasomatic episodes in the Kaapvaal SCLM during the last 1 Ga, each by a different metasomatic agent. These 3 episodes correspond to the late-Mesozoic kimberlite eruptions at 85 Ma, and the regional Namaqua-Natal and Damara Orogenies at 1 Ga and 500 Ma. We propose that the radioactive U-Th-He system in HDF-bearing diamonds can be used as a tool to provide meaningful radiometric ages of deep C-O-H fluids, and the timing of SCLM metasomatic events.
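
    The "equation for 4He production from U-Th decay" is the standard radiogenic ingrowth relation: 4He* = 8·²³⁸U·(e^(λ238·t) − 1) + 7·²³⁵U·(e^(λ235·t) − 1) + 6·²³²Th·(e^(λ232·t) − 1), with parents measured at present day. The sketch below evolves a fluid's 3He/4He ratio under that relation; the decay constants are the standard values, while the sample abundances in the test case are hypothetical, not the paper's data:

```python
import math

# Standard decay constants (yr^-1)
L238 = 1.55125e-10   # 238U
L235 = 9.8485e-10    # 235U
L232 = 4.9475e-11    # 232Th

def he4_radiogenic(n238: float, n232: float, t_yr: float) -> float:
    """Radiogenic 4He (atoms) produced over t_yr from present-day 238U,
    235U and 232Th abundances. Each chain yields 8, 7 and 6 alpha
    particles; 235U is taken from the natural 238U/235U ratio of 137.88."""
    n235 = n238 / 137.88
    return (8 * n238 * (math.exp(L238 * t_yr) - 1)
            + 7 * n235 * (math.exp(L235 * t_yr) - 1)
            + 6 * n232 * (math.exp(L232 * t_yr) - 1))

def he3_he4_ratio(r0_ra: float, he4_0: float, n238: float, n232: float,
                  t_yr: float, ra: float = 1.39e-6) -> float:
    """3He/4He (in R/Ra units, Ra ~ atmospheric ratio) after t_yr of
    closed-system ingrowth, starting from an initial ratio r0_ra and
    trapped 4He of he4_0 atoms. Trapped 3He stays fixed while radiogenic
    4He accumulates, so the ratio decays toward zero with time."""
    he3 = r0_ra * ra * he4_0
    he4 = he4_0 + he4_radiogenic(n238, n232, t_yr)
    return he3 / he4 / ra
```

    Matching a measured low ratio (e.g. 0.07-0.6 Ra) against this curve, given an assumed initial R/Ra of 3-10, is what lets the authors place the silicic and carbonatitic metasomatic events in time.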

  5. U-Th-He dating of diamond-forming C-O-H fluids and mantle metasomatic events

    NASA Astrophysics Data System (ADS)

    Wasilewski, B.; O'Neil, J.; Rizo Garza, H. L.; Jean-Louis, P.; Gannoun, A.; Boyet, M.

    2016-12-01

  6. A-Train Data Depot (ATDD)

    NASA Technical Reports Server (NTRS)

    Smith, Peter M.; Kempler, Steven; Leptoukh, Gregory; Savtchenko, Andrey; Kummerer, Robert; Gopolan, Arun

    2008-01-01

    ATDD is a web-based tool that provides collocated data and display products for a number of A-Train instruments (CloudSat, CALIPSO, OMI, AIRS, MODIS, MLS, POLDER-3) together with ECMWF model data. Products provided include clouds, aerosols, water vapor, temperature, and trace gases. All input data are online, in HDF4 or HDF5 format. Display products include curtain images, horizontal strips, line-plot overlays, and Google Earth KMZ files. Sample products are shown for two types of events: Hurricane Norbert (Oct 8, 2008) and a dust storm over the Arabian Sea (Nov 13-14, 2008).
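
    Because ATDD ingests both HDF4 and HDF5 granules, and the two formats are unrelated on disk, a tool in this position has to tell them apart. The formats are distinguishable by their magic numbers: HDF4 files begin with the bytes 0E 03 13 01, while an HDF5 superblock starts with the 8-byte signature \x89HDF\r\n\x1a\n (usually at offset 0, though HDF5 also permits offsets at powers of two from 512). A minimal stdlib-only sketch using synthetic files:

```python
import os
import tempfile

HDF4_MAGIC = b"\x0e\x03\x13\x01"
HDF5_MAGIC = b"\x89HDF\r\n\x1a\n"

def hdf_flavor(path: str) -> str:
    """Classify a file as 'HDF4', 'HDF5' or 'unknown' by its magic bytes.
    Only offset 0 is checked here, which is by far the common case for
    HDF5; real files may also carry the superblock at 512, 1024, ..."""
    with open(path, "rb") as f:
        head = f.read(8)
    if head.startswith(HDF4_MAGIC):
        return "HDF4"
    if head == HDF5_MAGIC:
        return "HDF5"
    return "unknown"

# Demonstrate with synthetic files carrying each signature (real granules
# would of course contain actual data after the magic bytes).
with tempfile.TemporaryDirectory() as d:
    p4 = os.path.join(d, "a.hdf")
    p5 = os.path.join(d, "b.h5")
    with open(p4, "wb") as f:
        f.write(HDF4_MAGIC + b"\x00" * 16)
    with open(p5, "wb") as f:
        f.write(HDF5_MAGIC + b"\x00" * 16)
    flavors = (hdf_flavor(p4), hdf_flavor(p5))
```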

  7. Protective Effects of Triphala on Dermal Fibroblasts and Human Keratinocytes

    PubMed Central

    Varma, Sandeep R.; Sivaprakasam, Thiyagarajan O.; Mishra, Abheepsa; Kumar, L. M. Sharath; Prakash, N. S.; Prabhu, Sunil; Ramakrishnan, Shyam

    2016-01-01

    Human skin is a vital organ of the body, constantly exposed to abiotic oxidative stress. This can have deleterious effects on skin such as darkening, skin damage, and aging. Plant-derived products with skin-protective effects are well known in traditional medicine. Triphala, a formulation of three fruit products, is one of the most important rasayana drugs used in Ayurveda. Several skin care products based on Triphala are available that claim protective effects on facial skin. However, the skin-protective effects of Triphala extract (TE) and its mechanistic action on skin cells have not been elucidated in vitro. Gallic acid, ellagic acid, and chebulinic acid were identified by LC-MS as the major constituents of TE. The identified key compounds were docked with skin-related proteins to predict their binding affinity. The IC50 values for TE on human dermal fibroblasts (HDF) and human keratinocytes (HaCaT) were 204.90 ± 7.6 and 239.13 ± 4.3 μg/mL, respectively. The antioxidant capacity of TE was 481.33 ± 1.5 mM Trolox equivalents in HaCaT cells. Triphala extract inhibited hydrogen peroxide (H2O2)-induced RBC haemolysis (IC50 64.95 μg/mL), inhibited nitric oxide production by 48.62 ± 2.2%, and showed high reducing-power activity. TE also rescued HDF from H2O2-induced damage, inhibited H2O2-induced cellular senescence, and protected HDF from DNA damage. TE increased collagen-I, involucrin, and filaggrin synthesis by 70.72 ± 2.3%, 67.61 ± 2.1%, and 51.91 ± 3.5% in HDF or HaCaT cells, respectively. TE also exhibited anti-tyrosinase and melanin-inhibition properties in a dose-dependent manner. TE increased the mRNA expression of collagen-I, elastin, superoxide dismutase (SOD-2), aquaporin-3 (AQP-3), filaggrin, involucrin, and transglutaminase in HDF or HaCaT cells, and decreased the mRNA levels of tyrosinase in B16F10 cells. Thus, Triphala exhibits protective benefits on skin cells in vitro and can be used as a potential ingredient in skin care formulations. PMID:26731545

  8. Data Elevator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    BYNA, SUNRENDRA; DONG, BIN; WU, KESHENG

    Data Elevator: Efficient Asynchronous Data Movement in Hierarchical Storage Systems. Multi-layer storage subsystems, including SSD-based burst buffers and disk-based parallel file systems (PFS), are becoming part of HPC systems. However, software for this storage hierarchy is still in its infancy, and applications may have to explicitly move data among the storage layers. We propose Data Elevator for transparently and efficiently moving data between a burst buffer and a PFS. Users specify the final destination for their data, typically on the PFS; Data Elevator intercepts the I/O calls, stages data on the burst buffer, and then asynchronously transfers the data to their final destination in the background. This design allows extensive optimizations, such as overlapping read and write operations, choosing I/O modes, and aligning buffer boundaries. In tests with large-scale scientific applications, Data Elevator is as much as 4.2X faster than Cray DataWarp, the state-of-the-art software for burst buffers, and 4X faster than directly writing to the PFS. The Data Elevator library uses HDF5's Virtual Object Layer (VOL) for intercepting parallel I/O calls that write data to the PFS. The intercepted calls are redirected to the Data Elevator, which provides a handle to write the file in a faster, intermediate burst buffer system. Once the application finishes writing the data to the burst buffer, the Data Elevator job uses HDF5 to move the data to the final destination asynchronously. Hence, the Data Elevator library is currently useful for applications that call HDF5 for writing data files. The Data Elevator also depends on the HDF5 VOL functionality.
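
    The staging pattern described above - land writes on a fast tier, return to the application immediately, and drain to the final destination in the background - can be sketched generically with a worker thread. This is an illustration of the idea only, not the Data Elevator or HDF5 VOL API; the class and directory names are hypothetical:

```python
import os
import queue
import shutil
import tempfile
import threading

class Stager:
    """Toy burst-buffer stager: writes land in fast_dir immediately and a
    background thread migrates them to slow_dir (the 'PFS') asynchronously."""

    def __init__(self, fast_dir: str, slow_dir: str):
        self.fast_dir, self.slow_dir = fast_dir, slow_dir
        self.q: "queue.Queue[str | None]" = queue.Queue()
        self.worker = threading.Thread(target=self._drain, daemon=True)
        self.worker.start()

    def write(self, name: str, data: bytes) -> None:
        """Fast-tier write; migration to the slow tier is scheduled, not done."""
        with open(os.path.join(self.fast_dir, name), "wb") as f:
            f.write(data)
        self.q.put(name)

    def _drain(self) -> None:
        """Background loop: move each staged file to its final destination."""
        while True:
            name = self.q.get()
            if name is None:          # shutdown sentinel
                break
            shutil.move(os.path.join(self.fast_dir, name),
                        os.path.join(self.slow_dir, name))

    def close(self) -> None:
        """Flush remaining migrations, then stop the worker."""
        self.q.put(None)
        self.worker.join()

fast, slow = tempfile.mkdtemp(), tempfile.mkdtemp()
s = Stager(fast, slow)
s.write("step0.dat", b"checkpoint bytes")   # returns as soon as the fast copy lands
s.close()                                   # drain: file now lives on the slow tier
final = os.path.join(slow, "step0.dat")
```

    The real system layers the same idea under HDF5 so that applications need no code changes; the queue-plus-worker structure is where optimizations like request reordering and buffer alignment would live.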

  9. Optimal convection volume for improving patient outcomes in an international incident dialysis cohort treated with online hemodiafiltration

    PubMed Central

    Canaud, Bernard; Barbieri, Carlo; Marcelli, Daniele; Bellocchio, Francesco; Bowry, Sudhir; Mari, Flavio; Amato, Claudia; Gatti, Emanuele

    2015-01-01

    Online hemodiafiltration (OL-HDF), the most efficient renal replacement therapy, enables enhanced removal of small and large uremic toxins by combining diffusive and convective solute transport. Randomized controlled trials on prevalent chronic kidney disease (CKD) patients showed improved patient survival with high-volume OL-HDF, underlining the effect of convection volume (CV). This retrospective international study was conducted in a large cohort of incident CKD patients to determine the CV threshold and range associated with a survival advantage. Data were extracted from a cohort of adult CKD patients treated by post-dilution OL-HDF over a 101-month period. In total, 2293 patients with a minimum of 2 years of follow-up were analyzed using advanced statistical tools, including cubic spline analyses for determination of the CV range over which a survival increase was observed. The relative survival rate of OL-HDF patients, adjusted for age, gender, comorbidities, vascular access, albumin, C-reactive protein, and dialysis dose, was found to increase at about 55 l/week of CV and to stay increased up to about 75 l/week. Similar analysis of pre-dialysis β2-microglobulin (a marker of middle-molecule uremic toxins) concentrations found a nearly linear decrease in marker concentration as CV increased from 40 to 75 l/week. Analysis of log C-reactive protein levels showed a decrease over the same CV range. Thus, a convection dose target based on convection volume should be considered and needs to be confirmed by prospective trials as a new determinant of dialysis adequacy. PMID:25945407

  10. Deepest Wide-Field Colour Image in the Southern Sky

    NASA Astrophysics Data System (ADS)

    2003-01-01

    LA SILLA CAMERA OBSERVES CHANDRA DEEP FIELD SOUTH. ESO PR Photo 02a/03. Caption: PR Photo 02a/03 shows a three-colour composite image of the Chandra Deep Field South (CDF-S), obtained with the Wide Field Imager (WFI) camera on the 2.2-m MPG/ESO telescope at the ESO La Silla Observatory (Chile). It was produced by combining about 450 images with a total exposure time of nearly 50 hours. The field measures 36 x 34 arcmin²; North is up and East is left. Technical information is available below. The combined efforts of three European teams of astronomers, targeting the same sky field in the southern constellation Fornax (the Furnace), have enabled them to construct a very deep, true-colour image - opening an exceptionally clear view towards the distant universe. The image (PR Photo 02a/03) covers an area somewhat larger than the full moon. It displays more than 100,000 galaxies, several thousand stars and hundreds of quasars. It is based on images with a total exposure time of nearly 50 hours, collected under good observing conditions with the Wide Field Imager (WFI) on the MPG/ESO 2.2-m telescope at the ESO La Silla Observatory (Chile) - many of them extracted from the ESO Science Data Archive. The position of this southern sky field was chosen by Riccardo Giacconi (Nobel Laureate in Physics 2002) at a time when he was Director General of ESO, together with Piero Rosati (ESO). It was selected as a sky region towards which the NASA Chandra X-ray satellite observatory, launched in July 1999, would be pointed while carrying out a very long exposure (lasting a total of 1 million seconds, or 278 hours) in order to detect the faintest possible X-ray sources. The field is now known as the Chandra Deep Field South (CDF-S).
The new WFI photo of CDF-S does not reach quite as deep as the available images of the "Hubble Deep Fields" (HDF-N in the northern and HDF-S in the southern sky, cf. e.g. ESO PR Photo 35a/98), but the field-of-view is about 200 times larger. The present image displays about 50 times more galaxies than the HDF images, and therefore provides a more representative view of the universe. The WFI CDF-S image will now form a most useful basis for the very extensive and systematic census of the population of distant galaxies and quasars, allowing at once a detailed study of all evolutionary stages of the universe since it was about 2 billion years old. These investigations have started and are expected to provide information about the evolution of galaxies in unprecedented detail. They will offer insights into the history of star formation and how the internal structure of galaxies changes with time and, not least, throw light on how these two evolutionary aspects are interconnected. GALAXIES IN THE WFI IMAGE. ESO PR Photo 02b/03. Caption: PR Photo 02b/03 contains a collection of twelve subfields from the full WFI Chandra Deep Field South (WFI CDF-S), centred on (pairs or groups of) galaxies. Each of the subfields measures 2.5 x 2.5 arcmin² (635 x 658 pix²; 1 pixel = 0.238 arcsec). North is up and East is left. Technical information is available below. The WFI CDF-S colour image - of which the full field is shown in PR Photo 02a/03 - was constructed from all available observations in the optical B-, V- and R-bands obtained under good conditions with the Wide Field Imager (WFI) on the 2.2-m MPG/ESO telescope at the ESO La Silla Observatory (Chile), and now stored in the ESO Science Data Archive. It is the "deepest" image ever taken with this instrument.
It covers a sky field measuring 36 x 34 arcmin², i.e., an area somewhat larger than that of the full moon. The observations were collected during a period of nearly four years, beginning in January 1999 when the WFI instrument was first installed (cf. ESO PR 02/99) and ending in October 2002. Altogether, nearly 50 hours of exposure were collected in the three filters combined here, cf. the technical information below. Although it is possible to identify more than 100,000 galaxies in the image - some of which are shown in PR Photo 02b/03 - it is still remarkably "empty" by astronomical standards. Even the brightest stars in the field (of visual magnitude 9) can hardly be seen by human observers with binoculars. In fact, the area density of bright, nearby galaxies is only half of what it is in "normal" sky fields. Comparatively empty fields like this one provide an unusually clear view towards the distant regions of the universe and thus open a window towards the earliest cosmic times. Research projects in the Chandra Deep Field South. ESO PR Photos 02c/03 and 02d/03. Caption: PR Photos 02c-d/03 show two sky fields within the WFI image of CDF-S, reproduced at full (pixel) size to illustrate the exceptional information richness of these data. The subfields measure 6.8 x 7.8 arcmin² (1717 x 1975 pixels) and 10.1 x 10.5 arcmin² (2545 x 2635 pixels), respectively. North is up and East is left. Technical information is available below. Astronomers from different teams and disciplines have been quick to join forces in a world-wide co-ordinated effort around the Chandra Deep Field South.
Observations of this area are now being performed by some of the most powerful astronomical facilities and instruments. They include space-based X-ray and infrared observations by the ESA XMM-Newton , the NASA CHANDRA , Hubble Space Telescope (HST) and soon SIRTF (scheduled for launch in a few months), as well as imaging and spectroscopical observations in the infrared and optical part of the spectrum by telescopes at the ground-based observatories of ESO (La Silla and Paranal) and NOAO (Kitt Peak and Tololo). A huge database is currently being created that will help to analyse the evolution of galaxies in all currently feasible respects. All participating teams have agreed to make their data on this field publicly available, thus providing the world-wide astronomical community with a unique opportunity to perform competitive research, joining forces within this vast scientific project. Concerted observations The optical true-colour WFI image presented here forms an important part of this broad, concerted approach. It combines observations of three scientific teams that have engaged in complementary scientific projects, thereby capitalizing on this very powerful combination of their individual observations. The following teams are involved in this work: * COMBO-17 (Classifying Objects by Medium-Band Observations in 17 filters) : an international collaboration led by Christian Wolf and other scientists at the Max-Planck-Institut für Astronomie (MPIA, Heidelberg, Germany). This team used 51 hours of WFI observing time to obtain images through five broad-band and twelve medium-band optical filters in the visual spectral region in order to measure the distances (by means of "photometric redshifts") and star-formation rates of about 10,000 galaxies, thereby also revealing their evolutionary status. * EIS (ESO Imaging Survey) : a team of visiting astronomers from the ESO community and beyond, led by Luiz da Costa (ESO). 
They observed the CDF-S for 44 hours in six optical bands with the WFI camera on the MPG/ESO 2.2-m telescope and 28 hours in two near-infrared bands with the SOFI instrument at the ESO 3.5-m New Technology Telescope (NTT), both at La Silla. These observations form part of the Deep Public Imaging Survey that covers a total sky area of 3 square degrees. * GOODS (The Great Observatories Origins Deep Survey): another international team (on the ESO side, led by Catherine Cesarsky) that focusses on the coordination of deep space- and ground-based observations on a smaller, central area of the CDF-S in order to image the galaxies in many different spectral wavebands, from X-rays to radio. GOODS has contributed with 40 hours of WFI time for observations in three broad-band filters that were designed for the selection of targets to be spectroscopically observed with the ESO Very Large Telescope (VLT) at the Paranal Observatory (Chile), for which over 200 hours of observations are planned. About 10,000 galaxies will be spectroscopically observed in order to determine their redshift (distance), star formation rate, etc. Another important contribution to this large research undertaking will come from the GEMS project. This is an "HST treasury programme" (with Hans-Walter Rix from MPIA as Principal Investigator) which observes the 10,000 galaxies identified in COMBO-17 - and eventually the entire WFI field with HST - to show the evolution of their shapes with time. Great questions. With the combination of data from many wavelength ranges now at hand, the astronomers are embarking upon studies of the many different processes in the universe. They expect to shed more light on several important cosmological questions, such as: * How and when was the first generation of stars born? * When exactly was the neutral hydrogen in the universe ionized for the first time by powerful radiation emitted from the first stars and active galactic nuclei?
* How did galaxies and groups of galaxies evolve during the past 13 billion years? * What is the true nature of those elusive objects that are only seen at the infrared and submillimetre wavelengths (cf. ESO PR 23/02 )? * Which fraction of galaxies had an "active" nucleus (probably with a black hole at the centre) in their past, and how long did this phase last? Moreover, since these extensive optical observations were obtained in the course of a dozen observing periods during several years, it is also possible to perform studies of certain variable phenomena: * How many variable sources are seen and what are their types and properties? * How many supernovae are detected per time interval, i.e. what is the supernovae frequency at different cosmic epochs? * How do those processes depend on each other? This is just a short and very incomplete list of questions astronomers world-wide will address using all the complementary observations. No doubt that the coming studies of the Chandra Deep Field South - with this and other data - will be most exciting and instructive! Other wide-field images Other wide-field images from the WFI have been published in various ESO press releases during the past four years - they are also available at the WFI Photo Gallery . A collection of full-resolution files (TIFF-format) is available on a WFI CD-ROM . Technical Information The very extensive data reduction and colour image processing needed to produce these images were performed by Mischa Schirmer and Thomas Erben at the "Wide Field Expertise Center" of the Institut für Astrophysik und Extraterrestrische Forschung der Universität Bonn (IAEF) in Germany. It was done by means of a software pipeline specialised for reduction of multiple CCD wide-field imaging camera data. This pipeline is mainly based on publicly available software modules and algorithms ( EIS , FLIPS , LDAC , Terapix , Wifix ). 
The image was constructed from about 150 exposures in each of the following wavebands: B-band (centred at wavelength 456 nm; here rendered as blue, 15.8 hours total exposure time), V-band (540 nm; green, 15.6 hours) and R-band (652 nm; red, 17.8 hours). Only images taken under sufficiently good observing conditions (defined as seeing less than 1.1 arcsec) were included. In total, 450 images were assembled to produce this colour image, together with about as many calibration images (biases, darks and flats). More than 2 Terabyte (TB) of temporary files were produced during the extensive data reduction. Parallel processing of all data sets took about two weeks on a four-processor Sun Enterprise 450 workstation and a 1.8 GHz dual processor Linux PC. The final colour image was assembled in Adobe Photoshop. The observations were performed by ESO (GOODS, EIS) and the COMBO-17 collaboration in the period 1/1999-10/2002.

  11. Control of formaldehyde and TVOC emission from wood-based flooring composites at various manufacturing processes by surface finishing.

    PubMed

    Kim, Sumin

    2010-04-15

    This paper assesses the reproducibility of testing the formaldehyde and TVOC emission behavior of wood-flooring composites bonded with urea-formaldehyde resin at various manufacturing steps for surface-finishing materials. The surface adhesion step of laminate flooring for this research was divided into two steps: HDF only, and HDF with LPMs. In the case of engineered flooring, the manufacturing steps were divided into three steps: plywood only, fancy veneer bonded on plywood, and UV coating on fancy veneer with plywood. Formaldehyde and VOC emissions decreased at the final surface-finishing step, when LPMs were applied to the surface of the HDF for laminate flooring. Although emissions increased when fancy veneer was bonded onto plywood in the case of engineered flooring, emissions were dramatically reduced to a level similar to plywood alone when the final surface finish, a UV-curable coating, was applied to the fancy veneer. This study suggests that formaldehyde and VOC emissions from floorings can be controlled at the surface-finishing manufacturing steps. Copyright 2009 Elsevier B.V. All rights reserved.

  12. Trade Study: Storing NASA HDF5/netCDF-4 Data in the Amazon Cloud and Retrieving Data via the Hyrax Data Server

    NASA Technical Reports Server (NTRS)

    Habermann, Ted; Gallagher, James; Jelenak, Aleksandar; Potter, Nathan; Lee, Joe; Yang, Kent

    2017-01-01

    This study explored three candidate architectures, with different types of objects and access paths, for serving NASA Earth Science HDF5 data via Hyrax running on Amazon Web Services (AWS). We studied the cost and performance of each architecture using several representative use cases. The objectives of the study were to: (1) conduct a trade study to identify one or more high-performance integrated solutions for storing and retrieving NASA HDF5 and netCDF-4 data in a cloud (web object store) environment, the target environment being Amazon Web Services (AWS) Simple Storage Service (S3); (2) conduct the level of software development needed to properly evaluate solutions in the trade study and to obtain the benchmarking metrics required as input to a government decision on potential follow-on prototyping; and (3) develop a cloud cost model for the preferred data storage solution (or solutions) that accounts for different granulation and aggregation schemes as well as cost and performance trades. We describe the three architectures and the use cases, along with performance results and recommendations for further work.
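
    The granulation/aggregation trade the study's cost model captures is essentially arithmetic: many small objects keep per-retrieval transfer low but multiply request counts, while a few large objects do the opposite. A hedged sketch with hypothetical prices (the real S3 tariff varies by region and over time, and a real model would also account for range-GET subsetting):

```python
def monthly_cost(total_gb: float, object_size_mb: float, reads_per_month: int,
                 storage_per_gb: float = 0.023,    # hypothetical $/GB-month
                 get_per_1000: float = 0.0004,     # hypothetical $/1000 GET requests
                 egress_per_gb: float = 0.09) -> float:
    """Rough S3-style monthly cost for one dataset under a given granulation.
    Simplifying assumption: each read fetches one whole object."""
    storage = total_gb * storage_per_gb
    requests = reads_per_month * get_per_1000 / 1000
    egress = reads_per_month * (object_size_mb / 1024) * egress_per_gb
    return storage + requests + egress

# Same hypothetical 100 GB archive, read 10,000 times a month, at two granularities:
fine = monthly_cost(100, object_size_mb=2, reads_per_month=10_000)
coarse = monthly_cost(100, object_size_mb=512, reads_per_month=10_000)
```

    Under these illustrative prices the egress term dominates for coarse objects, which is why whole-granule storage can be far more expensive to serve than finer-grained chunked objects when access is frequent but partial.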

  13. MXA: a customizable HDF5-based data format for multi-dimensional data sets

    NASA Astrophysics Data System (ADS)

    Jackson, M.; Simmons, J. P.; De Graef, M.

    2010-09-01

    A new digital file format is proposed for the long-term archival storage of experimental data sets generated by serial sectioning instruments. The format is known as the multi-dimensional eXtensible Archive (MXA) format and is based on the public-domain Hierarchical Data Format (HDF5). The MXA data model and its description by means of an eXtensible Markup Language (XML) file with an associated Document Type Definition (DTD) are described in detail. The public-domain MXA package is available through a dedicated web site (mxa.web.cmu.edu), along with implementation details and example data files.
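
    The pairing of an XML descriptor with HDF5 group paths can be illustrated with the stdlib XML parser. The element and attribute names below are hypothetical stand-ins, not the published MXA schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical MXA-style descriptor: the dimensions of a serial-sectioning
# data set plus the HDF5 group each record type lives under. Tag and
# attribute names here are illustrative only, not the actual MXA DTD.
DESCRIPTOR = """
<mxa_model version="1.0">
  <dimension name="slice" start="0" end="99"/>
  <dimension name="channel" start="0" end="2"/>
  <record name="image" path="/Data/image"/>
  <record name="metadata" path="/Meta/acquisition"/>
</mxa_model>
"""

root = ET.fromstring(DESCRIPTOR)
dims = {d.get("name"): (int(d.get("start")), int(d.get("end")))
        for d in root.iter("dimension")}
records = {r.get("name"): r.get("path") for r in root.iter("record")}

# Expand the dimension grid into the HDF5 dataset paths an archive with
# this layout would contain, e.g. /Data/image/slice_0/channel_0
paths = [f"{records['image']}/slice_{s}/channel_{c}"
         for s in range(dims["slice"][0], dims["slice"][1] + 1)
         for c in range(dims["channel"][0], dims["channel"][1] + 1)]
```

    The design point is that the XML file is small, human-readable and validatable against a DTD, while the bulk data stays in HDF5; a reader first parses the descriptor and only then touches the archive.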

  14. TRIQS: A toolbox for research on interacting quantum systems

    NASA Astrophysics Data System (ADS)

    Parcollet, Olivier; Ferrero, Michel; Ayral, Thomas; Hafermann, Hartmut; Krivenko, Igor; Messio, Laura; Seth, Priyanka

    2015-11-01

    We present the TRIQS library, a Toolbox for Research on Interacting Quantum Systems. It is an open-source, computational physics library providing a framework for the quick development of applications in the field of many-body quantum physics, and in particular, strongly-correlated electronic systems. It supplies components to develop codes in a modern, concise and efficient way: e.g. Green's function containers, a generic Monte Carlo class, and simple interfaces to HDF5. TRIQS is a C++/Python library that can be used from either language. It is distributed under the GNU General Public License (GPLv3). State-of-the-art applications based on the library, such as modern quantum many-body solvers and interfaces between density-functional-theory codes and dynamical mean-field theory (DMFT) codes are distributed along with it.

  15. Wound healing potential of adipose tissue stem cell extract.

    PubMed

    Na, You Kyung; Ban, Jae-Jun; Lee, Mijung; Im, Wooseok; Kim, Manho

    2017-03-25

    Adipose tissue stem cells (ATSCs) are considered a promising source in the field of cell therapy and regenerative medicine. In addition to direct cell replacement using stem cells, intercellular molecule exchange by stem cell secretory factors has shown beneficial effects by reducing tissue damage and augmenting endogenous repair. Delayed cutaneous wound healing is implicated in many conditions such as diabetes, aging, stress, and alcohol consumption. However, the effects of cell-free extract of ATSCs (ATSC-Ex) containing the secretome on the wound healing process have not been investigated. In this study, ATSC-Ex was topically applied to cutaneous wounds and healing speed was examined. As a result, wound closure was much faster in extract-treated wounds than in control wounds at 4, 6, and 8 days after application of ATSC-Ex. Dermal fibroblast proliferation, migration, and extracellular matrix (ECM) production are critical aspects of wound healing, so the effects of ATSC-Ex on human dermal fibroblasts (HDF) were examined. ATSC-Ex augmented HDF proliferation in a dose-dependent manner, and migration ability was enhanced by extract treatment. Representative ECM proteins, collagen type I and matrix metalloproteinase-1, were significantly up-regulated by treatment with ATSC-Ex. Our results suggest that ATSC-Ex improves wound healing and is a potential therapeutic candidate for cutaneous wound healing. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. Total Convection Affects Serum β2 Microglobulin and C-Reactive Protein but Not Erythropoietin Requirement following Post-Dilutional Hemodiafiltration.

    PubMed

    Movilli, Ezio; Camerini, Corrado; Gaggia, Paola; Zubani, Roberto; Feller, Paolo; Salviani, Chiara; Facchini, Annalisa; Cancarini, Giovanni

    2015-01-01

    Inflammation and increased erythropoiesis-stimulating agent (ESA) requirement are frequently associated in patients on dialysis. On-line hemodiafiltration (ol-HDF), putting together high levels of diffusion and convection, could improve both conditions. However, it is still not known which depurative component plays the major role in determining this result. The aim of the study was to evaluate the role of convection and diffusion on long-term variations of serum β2 microglobulin (Δβ2M), high-sensitivity C-reactive protein (ΔhsCRP) concentrations, and ESA requirement (ΔESA) in ol-HDF. Seventy-three patients prevalent on high-flux HD (hfHD) were studied. Thirty-eight patients were switched from hfHD to post-dilutional ol-HDF (Study group); the other 35 patients were considered the Control group. At 6 and 12 months, the effects of ol-HDF and hfHD on ΔhsCRP, Δβ2M, and ΔESA (U/kg/week) were evaluated. Other variables considered were body weight (BW), serum albumin (sAlb), hemoglobin (Hb), and equilibrated Kt/V (eKt/V). Iron therapy and ESA were administered intravenously according to the K/DOQI guidelines in order to maintain transferrin saturation between 20 and 40%, serum ferritin between 150 and 500 ng/ml, and Hb between 11 and 12 g/dl. Qb, treatment time, and Qd remained constant. Ol-HDF and hfHD were performed using membranes of 1.9-2.1 m². Ultrapure dialysate and substitution fluid were employed in both HDF and HD treatments. Data are expressed as mean ± SD. Paired t test, Mann-Whitney U test, and simple and multiple regression analyses were employed for statistical evaluation. In the Study group, total convective volume (TCV) was 22.1 ± 1.9 l/session. A significant reduction of hsCRP (from 6.8 ± 7.1 to 2.3 ± 2.4 mg/dl, p < 0.001), β2M (from 36.5 ± 14.4 to 24.7 ± 8.6 mg/dl, p < 0.0001) and ESA dose (from 107 ± 67 to 65 ± 44 U/kg/week, p < 0.005) was observed. No significant variations of Hb, BW, and sAlb were seen.
A significant inverse correlation was found between TCV and Δβ2M (r = -0.627; p < 0.0001) and between TCV and ΔhsCRP (r = -0.514; p < 0.0001); no correlation between TCV and ΔESAdose was observed. No correlation was found between eKt/V and Δβ2M, ΔhsCRP, or ΔESAdose. Multiple regression analysis with ΔESAdose as the dependent variable showed ΔhsCRP as the only significantly associated independent factor (p < 0.01). In the Control group, no significant variations of hsCRP, β2M, or ESAdose were observed over time. Ol-HDF induces a long-term significant reduction in pre-dialysis β2M and hsCRP concentrations. The magnitude of the reduction correlates directly with the TCV achieved but not with eKt/V. The observed reduction in ESAdose requirement is independent of both convection and diffusion, but is directly associated with the concomitant reduction of inflammation.

  17. Antioxidants and NOX1/NOX4 inhibition blocks TGFβ1-induced CCN2 and α-SMA expression in dermal and gingival fibroblasts

    PubMed Central

    Murphy-Marshman, Hannah; Quensel, Katherine; Shi-wen, Xu; Barnfield, Rebecca; Kelly, Jacalyn; Peidl, Alex; Stratton, Richard J.

    2017-01-01

    TGFbeta induces fibrogenic responses in fibroblasts. Reactive oxygen species (ROS)/nicotinamide adenine dinucleotide phosphate (NADPH) oxidase (NOX) signaling may contribute to fibrogenic responses. Here, we examine whether the antioxidant N-acetylcysteine (NAC), the NOX inhibitor diphenyleneiodonium (DPI) and the selective NOX1/NOX4 inhibitor GKT-137831 impair the ability of TGFbeta to induce profibrotic gene expression in human gingival (HGF) and dermal (HDF) fibroblasts. We also assess whether GKT-137831 can block the persistent fibrotic phenotype of lesional scleroderma (SSc) fibroblasts. We use real-time polymerase chain reaction and Western blot analysis to evaluate whether NAC and DPI impair the ability of TGFbeta1 to induce expression of fibrogenic genes in fibroblasts. The effects of GKT-137831 on TGFbeta-induced protein expression and the persistent fibrotic phenotype of lesional SSc fibroblasts were tested using Western blot and collagen gel contraction analyses. In HDF and HGF, TGFbeta1 induces CCN2, CCN1, endothelin-1 and alpha-smooth muscle actin (alpha-SMA) in a fashion sensitive to NAC. Induction of COL1A1 mRNA was unaffected. Similar results were seen with DPI. NAC and DPI impaired the ability of TGFbeta1 to induce protein expression of CCN2 and alpha-SMA in HDF and HGF. GKT-137831 impaired TGFbeta-induced CCN2 and alpha-SMA protein expression in HGF and HDF. In lesional SSc dermal fibroblasts, GKT-137831 reduced alpha-SMA and CCN2 protein overexpression and collagen gel contraction. These results are consistent with the hypothesis that antioxidants or NOX1/4 inhibition may be useful in blocking the profibrotic effects of TGFbeta on dermal and gingival fibroblasts and warrant consideration for further development as potential antifibrotic agents. PMID:29049376

  18. Exploring New Methods of Displaying Bit-Level Quality and Other Flags for MODIS Data

    NASA Technical Reports Server (NTRS)

    Khalsa, Siri Jodha Singh; Weaver, Ron

    2003-01-01

    The NASA Distributed Active Archive Center (DAAC) at the National Snow and Ice Data Center (NSIDC) archives and distributes snow and sea ice products derived from the MODerate resolution Imaging Spectroradiometer (MODIS) on board NASA's Terra and Aqua satellites. All MODIS standard products are in the Earth Observing System version of the Hierarchical Data Format (HDF-EOS). The MODIS science team has packed a wealth of information into each HDF-EOS file. In addition to the science data arrays containing the geophysical product, there are often pixel-level Quality Assurance arrays which are important for understanding and interpreting the science data. Currently, researchers are limited in their ability to access and decode information stored as individual bits in many of the MODIS science products. Commercial and public domain utilities give users access, in varying degrees, to the elements inside MODIS HDF-EOS files. However, when attempting to visualize the data, users are confronted with the fact that many of the elements actually represent eight different 1-bit arrays packed into a single byte array. This project addressed the need for researchers to access bit-level information inside MODIS data files. In a previous NASA-funded project (ESDIS Prototype ID 50.0) we developed a visualization tool tailored to polar gridded HDF-EOS data sets. This tool, called PHDIS, allows researchers to access, geolocate, visualize, and subset data that originate from different sources and have different spatial resolutions but which are placed on a common polar grid. The bit-level visualization function developed under this project was added to PHDIS, resulting in a versatile tool that serves a variety of needs. We call this the EOS Imaging Tool.
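    The packing scheme described above — eight 1-bit flag arrays stored in a single byte array — can be unpacked with NumPy. A minimal sketch; the array values and bit assignments below are hypothetical, not the actual MODIS QA layout:

```python
import numpy as np

# Hypothetical 2x2 packed QA array: each uint8 holds eight 1-bit flags.
qa = np.array([[0b10110001, 0b00000000],
               [0b11111111, 0b00000100]], dtype=np.uint8)

# np.unpackbits expands each byte into eight bit planes (MSB first),
# turning the (2, 2) byte array into a (2, 2, 8) bit array.
bits = np.unpackbits(qa[..., np.newaxis], axis=-1)

# Bit 0 (the least significant flag) sits at index 7 because the
# unpacking order is MSB-first.
bit0_mask = bits[..., 7]  # -> [[1, 0], [1, 0]]
```

Each plane of `bits` can then be visualized or used as a per-pixel mask on the corresponding science data array.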

  19. The application of a new type of titanium mesh cage in hybrid anterior decompression and fusion technique for the treatment of continuously three-level cervical spondylotic myelopathy.

    PubMed

    Liu, Xiaowei; Chen, Yu; Yang, Haisong; Li, Tiefeng; Xu, Haidong; Xu, Bin; Chen, Deyu

    2017-01-01

    To evaluate the efficacy and safety of a new type of titanium mesh cage (NTMC) in a hybrid anterior decompression and fusion technique (HDF) for treating continuous three-level cervical spondylotic myelopathy (TCSM). Ninety-four patients with TCSM who underwent HDF between Jan 2007 and Jan 2010 were included. Clinical and radiological outcomes were compared between patients who received the NTMC (Group A, n = 45) and the traditional titanium mesh cage (TTMC, Group B, n = 49) after corpectomies. Each patient received one polyetheretherketone (PEEK) cage after discectomy. Mean follow-up was 74.4 and 77.3 months in Groups A and B, respectively (p > 0.05). Differences in cervical lordosis (CL), segmental lordosis (SL), anterior segmental height (ASH) and posterior segmental height (PSH) between the two groups were not significant preoperatively, 3 days postoperatively, or at the final visit. However, losses of CL, SL, ASH and PSH were all significantly larger in Group B at the final visit, as were the incidences of segmental subsidence and severe subsidence. Differences in preoperative Japanese Orthopedic Association (JOA) score, visual analog scale (VAS), neck disability index (NDI) and SF-36 between the two groups were not significant. At the final visit, fusion rate, JOA, and SF-36 were all comparable between the two groups, but VAS and NDI were both significantly greater in Group B. For patients with TCSM, HDF with either the NTMC or the TTMC provides comparable radiological and clinical improvements, but the NTMC has the advantage of reducing subsidence incidence, loss of lordosis correction, VAS, and NDI.

  20. High-definition flow Doppler ultrasonographic technique to assess hepatic vasculature compared with color or power Doppler ultrasonography: preliminary experience.

    PubMed

    Kim, Se Hyung; Lee, Jeong Min; Kim, Young Jun; Lee, Jae Young; Han, Joon Koo; Choi, Byung Ihn

    2008-10-01

    The purpose of this study was to introduce a new high-definition flow (HDF) Doppler technique and to compare its performance with those of color Doppler ultrasonography (CDU) and power Doppler ultrasonography (PDU) for assessment of hepatic vasculature in native and transplanted livers. High-definition flow was invented as a high-resolution bidirectional PDU technique. We obtained CDU, PDU, and HDF images of the hepatic artery (HA), portal vein (PV), and hepatic vein from 60 patients. They were divided into 2 groups: a liver transplantation group (group 1, n = 10) and a native liver group (group 2, n = 50). Two radiologists independently reviewed the cine images and graded them using a 4-point scale in terms of the clarity of the vessel margin and degree of depiction of the HA, flow filling, and flash artifacts. The degree of differentiation between the HA and PV was also evaluated. Flow directionality was recorded, and interobserver agreement was finally analyzed. Moderate to almost perfect agreement was achieved between radiologists for all parameters of each ultrasonographic technique. High-definition flow was significantly superior to both CDU and PDU with respect to all analyzed items except the degree of flash artifacts (P < .05). With regard to flash artifacts, CDU was significantly better than either PDU or HDF. High-definition flow provided directional information, as did CDU. The HDF technique provides better resolution for depicting hepatic vessels as well as their margins with less blooming compared with conventional Doppler ultrasonography in both native and transplanted liver. It also provides solid directional flow information. One point of concern, however, is the frequency of flash artifacts compared with that on CDU.

  1. Habitual dietary fibre intake influences gut microbiota response to an inulin-type fructan prebiotic: a randomised, double-blind, placebo-controlled, cross-over, human intervention study.

    PubMed

    Healey, Genelle; Murphy, Rinki; Butts, Christine; Brough, Louise; Whelan, Kevin; Coad, Jane

    2018-01-01

    Dysbiotic gut microbiota have been implicated in human disease. Diet-based therapeutic strategies have been used to manipulate the gut microbiota towards a more favourable profile. However, it has been demonstrated that large inter-individual variability exists in gut microbiota response to a dietary intervention. The primary objective of this study was to investigate whether habitually low dietary fibre (LDF) v. high dietary fibre (HDF) intakes influence gut microbiota response to an inulin-type fructan prebiotic. In this randomised, double-blind, placebo-controlled, cross-over study, thirty-four healthy participants were classified as LDF or HDF consumers. Gut microbiota composition (16S rRNA bacterial gene sequencing) and SCFA concentrations were assessed following 3 weeks of daily prebiotic supplementation (Orafti® Synergy 1; 16 g/d) or placebo (Glucidex® 29 Premium; 16 g/d), as well as after 3 weeks of the alternative intervention, following a 3-week washout period. In the LDF group, the prebiotic intervention led to an increase in Bifidobacterium (P=0·001). In the HDF group, the prebiotic intervention led to an increase in Bifidobacterium (P<0·001) and Faecalibacterium (P=0·010) and decreases in Coprococcus (P=0·010), Dorea (P=0·043) and Ruminococcus (Lachnospiraceae family) (P=0·032). This study demonstrates that those with HDF intakes have a greater gut microbiota response and are therefore more likely to benefit from an inulin-type fructan prebiotic than those with LDF intakes. Future studies aiming to modulate the gut microbiota and improve host health, using an inulin-type fructan prebiotic, should take habitual dietary fibre intake into account.

  2. Achieving high convection volumes in postdilution online hemodiafiltration: a prospective multicenter study

    PubMed Central

    Chapdelaine, Isabelle; Nubé, Menso J; Blankestijn, Peter J; Bots, Michiel L; Konings, Constantijn J A M; Kremer Hovinga, Ton K; Molenaar, Femke M; van der Weerd, Neelke C; Grooteman, Muriel P C

    2017-01-01

    Background. Available evidence suggests a reduced mortality risk for patients treated with high-volume postdilution hemodiafiltration (HDF) when compared with hemodialysis (HD) patients. As the magnitude of the convection volume depends on treatment-related factors rather than patient-related characteristics, we prospectively investigated whether a high convection volume (defined as ≥22 L/session) is feasible in the majority of patients (>75%). Methods. A multicenter study was performed in adult prevalent dialysis patients. Nonparticipating eligible patients formed the control group. Using a stepwise protocol, treatment time (up to 4 hours), blood flow rate (up to 400 mL/min) and filtration fraction (up to 33%) were optimized as much as possible. The convection volume was determined at the end of this optimization phase and at 4 and 8 weeks thereafter. Results. Baseline characteristics were comparable in participants (n = 86) and controls (n = 58). At the end of the optimization and 8 weeks thereafter, 71/86 (83%) and 66/83 (80%) of the patients achieved high-volume HDF (mean 25.5 ± 3.6 and 26.0 ± 3.4 L/session, respectively). While treatment time remained unaltered, mean blood flow rate increased by 27% and filtration fraction increased by 23%. Patients with <22 L/session had a higher percentage of central venous catheters (CVCs), a shorter treatment time and lower blood flow rate when compared with patients with ≥22 L/session. Conclusions. High-volume HDF is feasible in a clear majority of dialysis patients. Since none of the patients agreed to increase treatment time, these findings indicate that high-volume HDF is feasible just by increasing blood flow rate and filtration fraction. PMID:29225810

  3. PH5: HDF5 Based Format for Integrating and Archiving Seismic Data

    NASA Astrophysics Data System (ADS)

    Hess, D.; Azevedo, S.; Falco, N.; Beaudoin, B. C.

    2017-12-01

    PH5 is a seismic data format created by IRIS PASSCAL using HDF5. Building PH5 on HDF5 allows for portability and extensibility on a scale that is unavailable in older seismic data formats. PH5 is designed to evolve to accept new data types as they become available in the future and to operate on a variety of platforms (i.e. Mac, Linux, Windows). Exemplifying PH5's flexibility is the evolution from just handling active source seismic data to now including passive source, onshore-offshore, OBS and mixed source seismic data sets. In PH5, metadata is separated from the time series data and stored in a size and performance efficient manner that also allows for easy user interaction and output of the metadata in a format appropriate for the data set. PH5's full-fledged "Kitchen Software Suite" comprises tools for data ingestion (e.g. RefTek, SEG-Y, SEG-D, SEG-2, MSEED), meta-data management, QC, waveform viewing, and data output. This software suite not only includes command line and GUI tools for interacting with PH5, it is also a comprehensive Python package to support the creation of software tools by the community to further enhance PH5. The PH5 software suite is currently being used in multiple capacities, including in-field for creating archive ready data sets as well as by the IRIS Data Management Center (DMC) to offer an FDSN compliant set of web services for serving PH5 data to the community in a variety of standard data and meta-data formats (i.e. StationXML, QuakeML, EventXML, SAC + Poles and Zeroes, MiniSEED, and SEG-Y) as well as StationTXT and ShotText formats. These web services can be accessed via standard FDSN clients such as ObsPy, irisFetch.m, FetchData, and FetchMetadata. This presentation will highlight and demonstrate the benefits of PH5 as a next generation adaptable and extensible data format for use in both archiving and working with seismic data.

  4. A comparison of data interoperability approaches of fusion codes with application to synthetic diagnostics

    NASA Astrophysics Data System (ADS)

    Kruger, Scott; Shasharina, S.; Vadlamani, S.; McCune, D.; Holland, C.; Jenkins, T. G.; Candy, J.; Cary, J. R.; Hakim, A.; Miah, M.; Pletzer, A.

    2010-11-01

    As various efforts to integrate fusion codes proceed worldwide, standards for sharing data have emerged. In the U.S., the SWIM project has pioneered the development of the Plasma State, which has a flat hierarchy and is dominated by its use within 1.5D transport codes. The European Integrated Tokamak Modeling effort has developed a more ambitious data interoperability effort organized around the concept of Consistent Physical Objects (CPOs). CPOs have deep hierarchies as needed by an effort that seeks to encompass all of fusion computing. Here, we discuss ideas for implementing data interoperability that is complementary to both the Plasma State and CPOs. By making use of attributes within the netCDF and HDF5 binary file formats, the goals of data interoperability can be achieved with a more informal approach. In addition, a file can be simultaneously interoperable to several standards at once. As an illustration of this approach, we discuss its application to the development of synthetic diagnostics that can be used for multiple codes.
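    The attribute-based approach can be sketched with h5py: one HDF5 dataset is tagged with attributes serving two conventions at once. The attribute names below are illustrative placeholders, not names from the actual Plasma State or CPO schemas:

```python
import h5py
import numpy as np

# Write a dataset and attach attributes for two hypothetical metadata
# conventions simultaneously; a reader following either convention can
# interpret the same file.
with h5py.File("profile.h5", "w") as f:
    te = f.create_dataset("Te", data=np.linspace(1.0, 2.0, 5))
    te.attrs["units"] = "keV"                 # flat, Plasma-State-like style
    te.attrs["long_name"] = "electron temperature"
    te.attrs["cpo_path"] = "coreprof/te"      # deep-hierarchy, CPO-like style

# Each reader consults only the attributes it understands.
with h5py.File("profile.h5", "r") as f:
    units = f["Te"].attrs["units"]
```

Because attributes are cheap and namespaced per object, adding a second convention's metadata does not disturb readers that only know the first.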

  5. 2.05 µm holmium-doped all-fiber laser diode-pumped at 1.125 µm

    NASA Astrophysics Data System (ADS)

    Kir'yanov, A. V.; Barmenkov, Y. O.; Villegas Garcia, I.

    2017-08-01

    We report a holmium-doped all-fiber laser oscillating at ~2.05 µm in continuous wave under direct in-core pumping by a 1.125 µm laser diode. Two types of home-made holmium-doped alumino-germano-silicate fiber (HDF), differing in the Ho3+ doping level, were fabricated to implement the laser and to reveal the effect of Ho3+ concentration upon the laser output. First, the fibers were characterized thoroughly from the material and optical viewpoints. Then, laser action with both HDFs was assessed using the simplest Fabry-Perot cavity, assembled from a pair of spectrally matched fiber Bragg gratings, also made in-house. In the best case, when using the lower-doped HDF of the proper length (1.4 m), a low threshold (~370 mW) and moderate slope efficiency (~13%) of ~2.05 µm lasing were obtained at 1.125 µm diode pumping. Long-term stability, high brightness, low noise, and purely CW operation are shown to be the laser's attractive features. When utilizing the heavier-doped HDF, however, the laser output was overall worse, a possible reason being deteriorating Ho3+ concentration-related effects.

  6. Huntington's disease: advocacy driving science.

    PubMed

    Wexler, Nancy S

    2012-01-01

    My mother, Leonore, was diagnosed with Huntington's disease (HD) in 1968 at age 53. I was 23, my sister Alice 26, and our father, Milton Wexler, 60 years old. The same year, our father created the Hereditary Disease Foundation (HDF), dedicated to finding treatments and cures for HD. HD is an autosomal dominant, neurodegenerative disorder. Alice and I each have a 50% chance of inheriting and dying from the disorder. Over the past 43 years, we have been proud to change the face of science. Through Milton Wexler Interdisciplinary Workshops, judicious funding, and focusing on innovation and creativity, the HDF is an integral partner in key discoveries. The HDF recruited and supported >100 scientists worldwide who worked together as the Huntington's Disease Collaborative Research Group in a successful ten-year search for the HD gene. We found a DNA marker for the HD gene in 1983, the first marker to be found when the chromosomal location was unknown. We isolated the HD gene itself a decade later. These breakthroughs helped launch the Human Genome Project. We supported creating the first mouse model of HD and many other model systems. Currently, we focus on gene silencing, among other approaches, to create new treatments and cures.

  7. Common Data Format: New XML and Conversion Tools

    NASA Astrophysics Data System (ADS)

    Han, D. B.; Liu, M. H.; McGuire, R. E.

    2002-12-01

    Common Data Format (CDF) is a self-describing platform-independent data format for storing, accessing, and manipulating scalar and multidimensional scientific data sets. Significant benefit has accrued to specific science communities from their use of standard formats within those communities. Examples include the International Solar Terrestrial Physics (ISTP) community in using CDF for traditional space physics data (fields, particles and plasma, waves, and images), the worldwide astronomical community in using FITS (Flexible Image Transport System) for solar data (primarily spectral images), the NASA Planetary community in using Planetary Data System (PDS) Labels, and the earth science community in using Hierarchical Data Format (HDF). Scientific progress in solar-terrestrial physics continues to be impeded by the multiplicity of available data format standards and the dearth of general data format translators. As a result, scientists today spend a significant amount of time translating data into the format they are familiar with for their research. To minimize this unnecessary data translation time and to allow more research time, the CDF office located at the GSFC National Space Science Data Center (NSSDC) has developed HDF-to-CDF and FITS-to-CDF translators, and employed eXtensible Markup Language (XML) technology to facilitate and promote data interoperability within the space science community. We will present the current status of the CDF work, including the conversion tools that have been recently developed and those planned for the near future, share some of our XML experiences, and use the discussion to gain community feedback on our planned future work.

  8. Hemodiafiltration: Technical and Clinical Issues.

    PubMed

    Ronco, Claudio

    2015-01-01

    Hemodiafiltration (HDF) seems to represent the gold standard in the field of replacement of renal function by dialysis. High convective fluxes have been correlated with better clinical outcomes. Sometimes, however, there are technical barriers to achieving blood flows high enough to perform effective convective therapies. Even with optimized procedures, a progressive increase in transmembrane pressure (TMP), blood viscosity due to hemoconcentration, and blood-path resistance sometimes become inevitable. We propose two possible solutions that can be operated automatically via specific software in the dialysis machine: predilution on demand and backflush on demand. Predilution on demand consists of an automatic feedback of the machine, diverting part of the filtered dialysate into a predilution mode with an infusion of 200 ml in 30 s while the ultrafiltration pump stops. This produces a sudden hemodilution with a return of the parameters to acceptable values. The performance of the filter improves, and the pressure alterations are mitigated. Backflush on demand consists of an automatic feedback of the machine triggered by the TMP control, producing a positive pressure in the dialysate compartment by stopping filtration and rapidly infusing at least 100 ml of ultrapure dialysate into the hollow fiber. This not only produces a significant hemodilution, but also backflushes the membrane pores, detaching protein layers and improving membrane permeability. These are two examples of how technology will make it possible to overcome the technical barriers to a widespread adoption of HDF and adequate convective dose delivery. © 2015 S. Karger AG, Basel.

  9. U.S. Army Research Laboratory (ARL) multimodal signatures database

    NASA Astrophysics Data System (ADS)

    Bennett, Kelly

    2008-04-01

    The U.S. Army Research Laboratory (ARL) Multimodal Signatures Database (MMSDB) is a centralized collection of sensor data of various modalities that are co-located and co-registered. The signatures include ground and air vehicles, personnel, mortar, artillery, small arms gunfire from potential sniper weapons, explosives, and many other high value targets. This data is made available to the Department of Defense (DoD) and DoD contractors, intelligence agencies, other government agencies (OGA), and academia for use in developing target detection, tracking, and classification algorithms and systems to protect our Soldiers. A platform-independent Web interface disseminates the signatures to researchers and engineers within the scientific community. Hierarchical Data Format 5 (HDF5) signature models provide an excellent solution for the sharing of complex multimodal signature data for algorithmic development and database requirements. Many open source tools for viewing and plotting HDF5 signatures are available over the Web. Seamless integration of HDF5 signatures is possible in both proprietary computational environments, such as MATLAB, and Free and Open Source Software (FOSS) computational environments, such as Octave and Python, for performing signal processing, analysis, and algorithm development. Future developments include extending the Web interface into a portal system for accessing ARL algorithms and signatures, High Performance Computing (HPC) resources, and integrating existing database and signature architectures into sensor networking environments.
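    As the abstract notes, HDF5 signature files can be inspected and processed with open tools such as h5py in Python. A minimal sketch using a hypothetical file layout (the real MMSDB schema is not assumed):

```python
import h5py
import numpy as np

# Build a stand-in signature file: one acoustic channel, 4 cycles of a sine.
with h5py.File("signature.h5", "w") as f:
    t = np.linspace(0.0, 1.0, 1024)
    dset = f.create_dataset("acoustic/channel_0", data=np.sin(2 * np.pi * 4 * t))
    dset.attrs["sample_rate_hz"] = 1024.0

# Walk the file to discover datasets, then load one for signal analysis.
found = []
with h5py.File("signature.h5", "r") as f:
    def collect(name, obj):
        if isinstance(obj, h5py.Dataset):
            found.append(name)
    f.visititems(collect)
    data = f["acoustic/channel_0"][:]

# Dominant frequency bin via an FFT (expected near bin 4 for this signal).
peak_bin = int(np.argmax(np.abs(np.fft.rfft(data))))
```

The same traversal-then-load pattern works whether the file holds acoustic, seismic, or imaging modalities, which is what makes HDF5 convenient for a multimodal archive.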

  10. Wound healing potential of Spirulina platensis extracts on human dermal fibroblast cells

    PubMed Central

    Syarina, Pauzi Nur Aimi; Karthivashan, Govindarajan; Abas, Faridah; Arulselvan, Palanisamy; Fakurazi, Sharida

    2015-01-01

    Blue-green alga (Spirulina platensis) is a well-renowned nutritional supplement due to its high nutritional and medicinal properties. The aim of this study was to examine the wound healing efficiency of various solvent extracts of Spirulina platensis using an in vitro scratch assay on human dermal fibroblast (HDF) cells. Gradient solvent extracts (50 μg/ml of methanolic, ethanolic and aqueous extracts) from Spirulina platensis were applied to HDF cells to assess their wound healing properties through the scratch assay; allantoin was used as a positive control to compare efficacy among the phytoextracts. Interestingly, the aqueous extract was found to stimulate proliferation and migration of HDF cells at the given concentration and enhanced the closure rate of the wound area within 24 hours after treatment. The methanolic and ethanolic extracts showed a proliferative effect; however, they did not aid in the migration and closure of the wound area when compared to the aqueous extract. Based on the phytochemical profile of the extracts analyzed by LC-MS/MS, the compounds likely involved in accelerating wound healing are cinnamic acid, naringenin, kaempferol, temsirolimus, phosphatidylserine isomeric derivatives and sulphoquinovosyl diacylglycerol. Our findings suggest that blue-green algae may have potential biomedical applications in treating various chronic wounds, especially in diabetes mellitus patients. PMID:27004048

  11. Golgi polarization plays a role in the directional migration of neonatal dermal fibroblasts induced by the direct current electric fields.

    PubMed

    Kim, Min Sung; Lee, Mi Hee; Kwon, Byeong-Ju; Koo, Min-Ah; Seon, Gyeung Mi; Park, Jong-Chul

    2015-05-01

    Directional cell migration requires cell polarization. Reorganization of the Golgi apparatus is an important phenomenon in the polarization and migration of many types of cells. Direct current electric fields (dc EFs) induce directional cell migration in a wide variety of cells. Here, neonatal human dermal fibroblasts (nHDFs) migrated toward the cathode under a 1 V/cm dc EF; however, 1 μM of brefeldin A (BFA) inhibited the dc EF-induced directional migration. BFA (1 μM) did not cause complete Golgi dispersal within 2 h. When the Golgi polarization maintained its direction, the cells also kept migrating in the same direction as the Golgi polarization even when the dc EF was reversed. In this study, the importance of Golgi polarization in the directional migration of nHDFs under a dc EF was identified. Copyright © 2015 Elsevier Inc. All rights reserved.

  12. Redox-active cerium oxide nanoparticles protect human dermal fibroblasts from PQ-induced damage.

    PubMed

    von Montfort, Claudia; Alili, Lirija; Teuber-Hanselmann, Sarah; Brenneisen, Peter

    2015-01-01

    Recently, it has been reported that cerium (Ce) oxide nanoparticles (CNP; nanoceria) are able to downregulate tumor invasion in cancer cell lines. Redox-active CNP exhibit both selective pro-oxidative and antioxidative properties, the former being responsible for impairment of tumor growth and invasion. A non-toxic and even protective effect of CNP in human dermal fibroblasts (HDF) has already been observed. However, the effect on important parameters such as cell death, proliferation and the redox state of the cells needs further clarification. Here, we show that nanoceria protect HDF from reactive oxygen species (ROS)-induced cell death and stimulate proliferation due to the antioxidative property of these particles. Copyright © 2014 The Authors. Published by Elsevier B.V. All rights reserved.

  13. Seven years (2008-2014) of meteorological observations plus a synthetic El Nino drought for BCI Panama.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Powell, Thomas; Kueppers, Lara; Paton, Steve

    This dataset is a derivative product of raw meteorological data collected at Barro Colorado Island (BCI), Panama (see acknowledgements below). This dataset contains the following: 1) a seven-year record (2008-2014) of meteorological observations from BCI in comma-delimited text format, 2) an R-script that converts the observed meteorology into an hdf5 format that can be read by the ED2 model, 3) two decades of meteorological drivers in hdf5 format that are based on the 7-year record of observations and include a synthetic 2-yr El Nino drought, 4) a ReadMe.txt file that explains how the data in the hdf5 meteorological drivers correspond to the observations. The raw meteorological data were further QC'd as part of the NGEE-Tropics project to derive item 1 above. The R-script makes the appropriate unit conversions for all observed meteorological variables to be compatible with the ED2 model. The R-script also converts RH into specific humidity, splits total shortwave radiation into its 4-stream parts, and calculates longwave radiation from air temperature and RH. The synthetic El Nino drought is based on selected months from the observed meteorology in which precipitation (only) was modified to reflect the precipitation patterns of the 1982/83 El Nino observed at BCI.
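    The RH-to-specific-humidity conversion mentioned above can be sketched as follows. This is one common Magnus-type approximation; the coefficients and the exact formula used in the dataset's R-script are not assumed:

```python
import math

def specific_humidity(rh_pct, t_celsius, p_hpa):
    """Approximate specific humidity (kg/kg) from relative humidity (%),
    air temperature (deg C), and surface pressure (hPa)."""
    # Saturation vapor pressure over water (hPa), Magnus approximation
    es = 6.1094 * math.exp(17.625 * t_celsius / (t_celsius + 243.04))
    e = (rh_pct / 100.0) * es  # actual vapor pressure (hPa)
    # Convert vapor pressure to specific humidity (0.622 = Rd/Rv)
    return 0.622 * e / (p_hpa - 0.378 * e)

# Typical humid-tropics conditions: roughly 0.017-0.018 kg/kg
q = specific_humidity(80.0, 27.0, 1010.0)
```

A driver-preparation script would apply this per time step to the RH, temperature, and pressure columns before writing the hdf5 file.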

  14. Cultivation of human dermal fibroblasts and epidermal keratinocytes on keratin-coated silica bead substrates.

    PubMed

    Tan, Bee Yi; Nguyen, Luong T H; Kim, Hyo-Sop; Kim, Jae-Ho; Ng, Kee Woei

    2017-10-01

    Human hair keratin is promising as a bioactive material platform for various biomedical applications. To explore its versatility further, human hair keratin was coated onto monolayers of silica beads to produce film-like substrates. This combination was hypothesized to provide a synergistic effect in improving the biochemical properties of the resultant composite. Atomic force microscopy analysis showed uniform coatings of keratin on the silica beads with a slight increase in the resulting surface roughness. Keratin-coated silica beads had higher surface energy and relatively lower negative charge than those of bare silica beads. To investigate cell response, human dermal fibroblasts (HDFs), and human epidermal keratinocytes (HEKs) were cultured on the substrates over 4 days. Results showed that keratin coatings significantly enhanced the metabolic activity of HDFs and encouraged cell spreading but did not exert any significant effects on HEKs. HDF expression of collagen I was significantly more intense on the keratin-coated compared to the bare silica substrates. Furthermore, HDF secretion of various cytokines suggested that keratin coatings triggered active cell responses related to wound healing. Collectively, our study demonstrated that human hair keratin-coated silica bead monolayers have the potential to modulate HDF behavior in culture and may be exploited further. © 2017 Wiley Periodicals, Inc. J Biomed Mater Res Part A: 105A: 2789-2798, 2017. © 2017 Wiley Periodicals, Inc.

  15. Modulation of Myostatin/Hepatocyte Growth Factor Balance by Different Hemodialysis Modalities.

    PubMed

    Esposito, Pasquale; La Porta, Edoardo; Calatroni, Marta; Grignano, Maria Antonietta; Milanesi, Samantha; Verzola, Daniela; Battaglia, Yuri; Gregorini, Marilena; Libetta, Carmelo; Garibotto, Giacomo; Rampino, Teresa

    2017-01-01

    Background. In this study we investigated the relevance of myostatin and Hepatocyte Growth Factor (HGF) in patients undergoing hemodialysis (HD) and the influence of different HD modalities on their levels. Methods. We performed a prospective crossover study in which HD patients were randomized to undergo 3-month treatment periods with bicarbonate hemodialysis (BHD) followed by online hemodiafiltration (HDF). Clinical data, laboratory parameters, and myostatin and HGF serum levels were collected and compared. Results. Ten patients and six controls (C) were evaluated. In all experimental conditions, myostatin and HGF levels were higher in HD patients than in C. At enrollment and after BHD there were no significant correlations, whereas at the end of the HDF treatment period myostatin and HGF were inversely correlated (r = -0.65, p < 0.05), myostatin serum levels inversely correlated with transferrin (r = -0.73, p < 0.05), and HGF levels positively correlated with BMI (r = 0.67, p < 0.05). Moving from BHD to HDF, clinical and laboratory parameters were unchanged, as was serum HGF, whereas myostatin levels significantly decreased (6.3 ± 4.1 versus 4.3 ± 3.1 ng/ml, p < 0.05). Conclusions. Modulation of myostatin levels and myostatin/HGF balance by the use of different HD modalities might represent a novel approach to the prevention and treatment of HD-related muscle wasting syndrome.

  16. ArrayBridge: Interweaving declarative array processing with high-performance computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xing, Haoyuan; Floratos, Sofoklis; Blanas, Spyros

    Scientists are increasingly turning to datacenter-scale computers to produce and analyze massive arrays. Despite decades of database research that extols the virtues of declarative query processing, scientists still write, debug and parallelize imperative HPC kernels even for the most mundane queries. This impedance mismatch has been partly attributed to the cumbersome data loading process; in response, the database community has proposed in situ mechanisms to access data in scientific file formats. Scientists, however, desire more than a passive access method that reads arrays from files. This paper describes ArrayBridge, a bi-directional array view mechanism for scientific file formats, that aims to make declarative array manipulations interoperable with imperative file-centric analyses. Our prototype implementation of ArrayBridge uses HDF5 as the underlying array storage library and seamlessly integrates into the SciDB open-source array database system. In addition to fast querying over external array objects, ArrayBridge produces arrays in the HDF5 file format just as easily as it can read from it. ArrayBridge also supports time travel queries from imperative kernels through the unmodified HDF5 API, and automatically deduplicates between array versions for space efficiency. Our extensive performance evaluation in NERSC, a large-scale scientific computing facility, shows that ArrayBridge exhibits statistically indistinguishable performance and I/O scalability to the native SciDB storage engine.
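    The abstract above mentions that ArrayBridge deduplicates between array versions for space efficiency. The general idea behind chunk-level deduplication can be sketched as follows; this is a toy illustration in Python, not ArrayBridge's actual implementation, and all names (`VersionedArray`, `CHUNK`) are invented for this sketch. Each version stores only digests of its chunks, and a chunk shared by two versions is kept once in a content-addressed store.

```python
import hashlib

CHUNK = 4  # elements per chunk (toy size for illustration)

def chunks(arr):
    """Split a flat list into fixed-size chunks."""
    return [tuple(arr[i:i + CHUNK]) for i in range(0, len(arr), CHUNK)]

class VersionedArray:
    """Toy content-addressed store: identical chunks are stored once."""
    def __init__(self):
        self.store = {}      # digest -> chunk data
        self.versions = []   # each version is a list of chunk digests

    def commit(self, arr):
        digests = []
        for c in chunks(arr):
            d = hashlib.sha256(repr(c).encode()).hexdigest()
            self.store.setdefault(d, c)  # dedup: reuse an existing chunk
            digests.append(d)
        self.versions.append(digests)

    def materialize(self, version):
        """Reassemble a full array from its chunk digests."""
        out = []
        for d in self.versions[version]:
            out.extend(self.store[d])
        return out

va = VersionedArray()
v0 = list(range(12))
v1 = v0[:]
v1[0] = 99          # only the first chunk differs between versions
va.commit(v0)
va.commit(v1)
# two 3-chunk versions share 2 chunks, so only 4 distinct chunks are stored
```

Both versions remain fully reconstructible while the shared chunks occupy space only once, which is the essence of version deduplication in array stores.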

  17. WFIRST: Science from Deep Field Surveys

    NASA Astrophysics Data System (ADS)

    Koekemoer, Anton M.; Foley, Ryan; WFIRST Deep Field Working Group

    2018-06-01

    WFIRST will enable deep field imaging across much larger areas than those previously obtained with Hubble, opening up completely new areas of parameter space for extragalactic deep fields including cosmology, supernova and galaxy evolution science. The instantaneous field of view of the Wide Field Instrument (WFI) is about 0.3 square degrees, which would for example yield an Ultra Deep Field (UDF) reaching similar depths at visible and near-infrared wavelengths to that obtained with Hubble, over an area about 100-200 times larger, for a comparable investment in time. Moreover, wider fields on scales of 10-20 square degrees could achieve depths comparable to large HST surveys at medium depths such as GOODS and CANDELS, and would enable multi-epoch supernova science that could be matched in area to LSST Deep Drilling fields or other large survey areas. Such fields may benefit from being placed at locations in the sky that have ancillary multi-band imaging or spectroscopy from other facilities, from the ground or in space. The WFIRST Deep Fields Working Group has been examining the science considerations for various types of deep fields that may be obtained with WFIRST, and we present here a summary of the various properties of different locations in the sky that may be considered for future deep fields with WFIRST.

  18. WFIRST: Science from Deep Field Surveys

    NASA Astrophysics Data System (ADS)

    Koekemoer, Anton; Foley, Ryan; WFIRST Deep Field Working Group

    2018-01-01

    WFIRST will enable deep field imaging across much larger areas than those previously obtained with Hubble, opening up completely new areas of parameter space for extragalactic deep fields including cosmology, supernova and galaxy evolution science. The instantaneous field of view of the Wide Field Instrument (WFI) is about 0.3 square degrees, which would for example yield an Ultra Deep Field (UDF) reaching similar depths at visible and near-infrared wavelengths to that obtained with Hubble, over an area about 100-200 times larger, for a comparable investment in time. Moreover, wider fields on scales of 10-20 square degrees could achieve depths comparable to large HST surveys at medium depths such as GOODS and CANDELS, and would enable multi-epoch supernova science that could be matched in area to LSST Deep Drilling fields or other large survey areas. Such fields may benefit from being placed at locations in the sky that have ancillary multi-band imaging or spectroscopy from other facilities, from the ground or in space. The WFIRST Deep Fields Working Group has been examining the science considerations for various types of deep fields that may be obtained with WFIRST, and we present here a summary of the various properties of different locations in the sky that may be considered for future deep fields with WFIRST.

  19. ES9 Terra-Xtrk Ed3

    Atmospheric Science Data Center

    2018-05-16

    ... Instantaneous (Hourly Gridded), Monthly, Daily, Monthly Hourly File Format:  HDF Tools:  ... Aqua; Edition2 for TRMM; Edition1 for NPP) are approved for science publications. SCAR-B Block:  ...

  20. ES9 Aqua-Xtrk Ed3

    Atmospheric Science Data Center

    2018-05-16

    ... Instantaneous (Hourly Gridded), Monthly, Daily, Monthly Hourly File Format:  HDF Tools:  ... Aqua; Edition1 for NPP; Edition2 for TRMM) are approved for science publications. SCAR-B Block:  ...

  1. MOP03N (HDF)

    Atmospheric Science Data Center

    ... from Near Infrared Radiances, version 6, Hampton, VA, USA:NASA Atmospheric Science Data Center (ASDC), Accessed   at doi: 10.5067/TERRA/MOPITT/DATA302. Project Title:  MOPITT ...

    2016-10-05

  2. MOP03NM (HDF)

    Atmospheric Science Data Center

    ... from Near Infrared Radiances, version 6, Hampton, VA, USA:NASA Atmospheric Science Data Center (ASDC), Accessed   at doi: 10.5067/TERRA/MOPITT/DATA303. Project Title:  MOPITT ...

    2016-10-05

  3. MOP03J (HDF)

    Atmospheric Science Data Center

    ... and Thermal Infrared Radiances, version 6, Hampton, VA, USA:NASA Atmospheric Science Data Center (ASDC), Accessed   at doi: 10.5067/TERRA/MOPITT/DATA300. Project Title:  MOPITT ...

    2016-10-05

  4. MOP03JM (HDF)

    Atmospheric Science Data Center

    ... and Thermal Infrared Radiances, version 6, Hampton, VA, USA:NASA Atmospheric Science Data Center (ASDC), Accessed   at doi: 10.5067/TERRA/MOPITT/DATA301. Project Title:  MOPITT ...

    2016-10-05

  5. MOP02J (HDF)

    Atmospheric Science Data Center

    ... and Thermal Infrared Radiances, version 6, Hampton, VA, USA:NASA Atmospheric Science Data Center (ASDC), Accessed   at doi: 10.5067/TERRA/MOPITT/DATA200. Project Title:  MOPITT ...

    2016-10-04

  6. MOP03T (HDF)

    Atmospheric Science Data Center

    ... from Thermal Infrared Radiances, version 6, Hampton, VA, USA:NASA Atmospheric Science Data Center (ASDC), Accessed   at doi: 10.5067/TERRA/MOPITT/DATA304. Project Title:  MOPITT ...

    2016-10-05

  7. MOP03TM (HDF)

    Atmospheric Science Data Center

    ... from Thermal Infrared Radiances, version 6, Hampton, VA, USA:NASA Atmospheric Science Data Center (ASDC), Accessed   at doi: 10.5067/TERRA/MOPITT/DATA305. Project Title:  MOPITT ...

    2016-10-05

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurth, Thorsten; Pochinsky, Andrew; Sarje, Abhinav

    Practitioners of lattice QCD/QFT have been some of the primary pioneer users of the state-of-the-art high-performance-computing systems, and contribute towards the stress tests of such new machines as soon as they become available. As with all aspects of high-performance-computing, I/O is becoming an increasingly specialized component of these systems. In order to take advantage of the latest available high-performance I/O infrastructure, to ensure reliability and backwards compatibility of data files, and to help unify the data structures used in lattice codes, we have incorporated parallel HDF5 I/O into the SciDAC supported USQCD software stack. Here we present the design and implementation of this I/O framework. Our HDF5 implementation outperforms optimized QIO at the 10-20% level and leaves room for further improvement by utilizing appropriate dataset chunking.
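    In parallel HDF5 I/O of the kind described above, each MPI rank typically writes a contiguous hyperslab of the global lattice. The offset/count arithmetic for such a decomposition can be sketched as follows; this is an illustrative stand-alone Python sketch of the standard block-distribution pattern, not code from the USQCD stack, and the function name `hyperslab` is invented here.

```python
def hyperslab(global_shape, ranks, rank):
    """Split the slowest axis of a global array among `ranks` writers.

    Returns (offset, count) tuples per axis: each rank gets one
    contiguous block, the shape of a collective HDF5 hyperslab write.
    (Illustrative arithmetic only; not the USQCD implementation.)
    """
    n0 = global_shape[0]
    base, rem = divmod(n0, ranks)
    # the first `rem` ranks each take one extra row
    count0 = base + (1 if rank < rem else 0)
    offset0 = rank * base + min(rank, rem)
    offset = (offset0,) + (0,) * (len(global_shape) - 1)
    count = (count0,) + tuple(global_shape[1:])
    return offset, count

# A 10x4 lattice over 3 ranks: rows split 4/3/3 with contiguous offsets
slabs = [hyperslab((10, 4), 3, r) for r in range(3)]
```

In an actual parallel HDF5 code these (offset, count) pairs would be passed to the hyperslab selection call of the dataspace before a collective write, so the ranks cover the global dataset without overlap.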

  9. Drug selection in French university hospitals: analysis of formularies for nine competitive pharmacological classes.

    PubMed

    Gallini, Adeline; Juillard-Condat, Blandine; Saux, Marie-Claude; Taboulet, Florence

    2011-11-01

    To give a panorama of the selectivity and agreement of French university hospitals' drug formularies (HDF) for nine competitive classes. All university hospitals were asked to send their HDF and selection criteria as of January 2009 for nine competitive pharmacological classes (proton pump inhibitors, serotonin antagonists, low molecular weight heparins, erythropoietins, angiotensin converting enzyme inhibitors, angiotensin II receptor antagonists, statins, α-adrenoreceptor antagonists and selective serotonin re-uptake inhibitors). Selectivity of HDF was estimated by the percentage of drug entities selected by the hospital within the pharmacological class. Agreement between hospitals was assessed with modified kappa coefficients for multi-raters. Twenty-one out of the 29 hospitals agreed to participate. These hospitals selected between 34% and 63% of the drug entities available for the nine classes, which represented 18 to 35 agents. Regarding the nature of chosen drug entities, the overall level of agreement was 'fair' and varied with pharmacological classes. Selection criteria were sent by only 12 hospitals. The technical component was the most important element in all hospitals. The weight of the economic component varied between 20% and 40% in the tender's grade. Large variations were seen in the number and nature of drugs selected by university hospitals which can be attributable to two successive decision-making processes (evaluation by the Drug and Therapeutics Committee followed by the purchasing process). © 2011 The Authors. British Journal of Clinical Pharmacology © 2011 The British Pharmacological Society.
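    The abstract above reports agreement assessed with modified kappa coefficients for multiple raters. The standard (unmodified) Fleiss' kappa on which such measures are based can be sketched as follows; this is a generic illustration of the textbook formula, not the study's exact modified variant.

```python
def fleiss_kappa(table):
    """Fleiss' kappa for a subjects x categories count table.

    Each row is one subject (here: one drug entity), each column one
    category (e.g. selected / not selected), and every subject is rated
    by the same number of raters n (here: hospitals).
    """
    N = len(table)                    # number of subjects
    n = sum(table[0])                 # ratings per subject
    ncat = len(table[0])
    # proportion of all ratings falling in each category
    p_j = [sum(row[j] for row in table) / (N * n) for j in range(ncat)]
    # per-subject observed agreement
    P_i = [(sum(c * c for c in row) - n) / (n * (n - 1)) for row in table]
    P_bar = sum(P_i) / N              # mean observed agreement
    P_e = sum(p * p for p in p_j)     # chance agreement
    return (P_bar - P_e) / (1 - P_e)
```

For example, two subjects rated by three raters with perfect agreement (`[[3, 0], [0, 3]]`) give kappa = 1, while maximally split ratings (`[[2, 2], [2, 2]]`) give a negative kappa, i.e. agreement below chance.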

  10. ESHOL study reanalysis: All-cause mortality considered by competing risks and time-dependent covariates for renal transplantation.

    PubMed

    Maduell, Francisco; Moreso, Francesc; Mora-Macià, Josep; Pons, Mercedes; Ramos, Rosa; Carreras, Jordi; Soler, Jordi; Torres, Ferrán

    2016-01-01

    The ESHOL study showed that post-dilution online haemodiafiltration (OL-HDF) reduces all-cause mortality versus haemodialysis. However, during the observation period, 355 patients prematurely completed the study and, according to the study design, these patients were censored at the time of premature termination. The aim of this study was to investigate the outcome of patients who discontinued the study. During follow-up, 207 patients died while under treatment and 47 patients died after discontinuation of the study. Compared with patients maintained on haemodialysis, those randomised to OL-HDF had lower all-cause mortality (12.4 versus 9.46 per 100 patient-years; hazard ratio 0.76, 95% CI 0.59-0.98, P = 0.031). For all-cause mortality with time-dependent covariates and competing risks for transplantation, the time-dependent Cox analysis showed very similar results to the main analysis, with a hazard ratio of 0.77 (95% CI 0.60-0.99, P = 0.043). The results of this analysis of the ESHOL trial confirm that post-dilution OL-HDF reduces all-cause mortality versus haemodialysis in prevalent patients. The original results of the ESHOL study, which censored patients discontinuing the study for any reason, were confirmed in the present ITT population without censoring and when all-cause mortality was considered with time-dependent covariates and competing risks for transplantation. Copyright © 2015 Sociedad Española de Nefrología. Published by Elsevier España, S.L.U. All rights reserved.

  11. Dosimetric comparison of moderate deep inspiration breath-hold and free-breathing intensity-modulated radiotherapy for left-sided breast cancer.

    PubMed

    Chi, F; Wu, S; Zhou, J; Li, F; Sun, J; Lin, Q; Lin, H; Guan, X; He, Z

    2015-05-01

    This study performed a dosimetric comparison of moderate deep inspiration breath-hold using active breathing control and free-breathing intensity-modulated radiotherapy (IMRT) after breast-conserving surgery for left-sided breast cancer. Thirty-one patients were enrolled. One free-breathing and two moderate deep inspiration breath-hold images were obtained. A field-in-field-IMRT free-breathing plan and two field-in-field-IMRT moderate deep inspiration breath-holding plans were compared with respect to dosimetry of target volume coverage of the glandular breast tissue and organs at risk for each patient. The breath-holding time under moderate deep inspiration extended significantly after breathing training (P<0.05). There was no significant difference between the free-breathing and moderate deep inspiration breath-holding in the target volume coverage. The volume of the ipsilateral lung in the free-breathing technique was significantly smaller than in the moderate deep inspiration breath-holding techniques (P<0.05); however, there was no significant difference between the two moderate deep inspiration breath-holding plans. There were no significant differences in target volume coverage between the three plans for the field-in-field-IMRT (all P>0.05). The doses to the ipsilateral lung, coronary artery and heart in the field-in-field-IMRT were significantly lower for the free-breathing plan than for the two moderate deep inspiration breath-holding plans (all P<0.05); however, there was no significant difference between the two moderate deep inspiration breath-holding plans. The whole-breast field-in-field-IMRT under moderate deep inspiration breath-hold with active breathing control after breast-conserving surgery in left-sided breast cancer can reduce the irradiation volume and dose to organs at risk. There are no significant differences between various moderate deep inspiration breath-holding states in the dosimetry of irradiation to the field-in-field-IMRT target volume coverage and organs at risk. Copyright © 2015 Société française de radiothérapie oncologique (SFRO). Published by Elsevier SAS. All rights reserved.

  12. CATS-ISS_L1B_N-M7.2-V2-07

    Atmospheric Science Data Center

    2018-05-23

    ... The Cloud-Aerosol Transport System (CATS) is a three wavelength, polarization-sensitive lidar that provides ... Temporal Resolution:  .051 second File Format:  HDF-5 Tools:  Contact User Services ...

  13. CATS-ISS_L1B_D-M7.2-V2-06

    Atmospheric Science Data Center

    2018-04-04

    ... The Cloud-Aerosol Transport System (CATS) is a three wavelength, polarization-sensitive lidar that provides ... Temporal Resolution:  .051 second File Format:  HDF-5 Tools:  Contact User Services ...

  14. CATS-ISS_L1B_N-M7.1-V2-07

    Atmospheric Science Data Center

    2018-05-23

    ... The Cloud-Aerosol Transport System (CATS) is a three wavelength, polarization-sensitive lidar that provides ... Temporal Resolution:  .051 second File Format:  HDF-5 Tools:  Contact User Services ...

  15. CATS-ISS_L1B_D-M7.1-V2-07

    Atmospheric Science Data Center

    2018-05-23

    ... The Cloud-Aerosol Transport System (CATS) is a three wavelength, polarization-sensitive lidar that provides ... Temporal Resolution:  .051 second File Format:  HDF-5 Tools:  Contact User Services ...

  16. CATS-ISS_L1B_N-M7.2-V2-06

    Atmospheric Science Data Center

    2018-04-04

    ... The Cloud-Aerosol Transport System (CATS) is a three wavelength, polarization-sensitive lidar that provides ... Temporal Resolution:  .051 second File Format:  HDF-5 Tools:  Contact User Services ...

  17. CATS-ISS_L1B_D-M7.2-V2-07

    Atmospheric Science Data Center

    2018-05-23

    ... The Cloud-Aerosol Transport System (CATS) is a three wavelength, polarization-sensitive lidar that provides ... Temporal Resolution:  .051 second File Format:  HDF-5 Tools:  Contact User Services ...

  18. SAGE III L2 Monthly Cloud Presence Data (HDF-EOS)

    Atmospheric Science Data Center

    2016-06-14

    ... degrees South Spatial Resolution:  1 km vertical Temporal Coverage:  02/27/2002 - 12/31/2005 ... Parameters:  Cloud Amount/Frequency Cloud Height Cloud Vertical Distribution Order Data:  Search and ...

  19. First Images from VLT Science Verification Programme

    NASA Astrophysics Data System (ADS)

    1998-09-01

    Two Weeks of Intensive Observations Successfully Concluded After a period of technical commissioning tests, the first 8.2-m telescope of the ESO VLT (UT1) has successfully performed an extensive series of "real science" observations , yielding nearly 100 hours of precious data. They concern all possible types of astronomical objects, from distant galaxies and quasars to pulsars, star clusters and solar system objects. This intensive Science Verification (SV) Programme took place as planned from August 17 to September 1, 1998, and was conducted by the ESO SV Team at the VLT Observatory on Paranal (Chile) and at the ESO Headquarters in Garching (Germany). The new giant telescope lived fully up to the high expectations and worked with spectacular efficiency and performance through the entire period. All data will be released by September 30 via the VLT archive and the web (with some access restrictions - see below). The Science Verification period Just before the beginning of the SV period, the 8.2-m primary mirror in its cell was temporarily removed in order to install the "M3 tower" with the tertiary mirror [1]. The reassembly began on August 15 and included re-installation at the Cassegrain focus of the VLT Test Camera that was also used for the "First Light" images in May 1998. After careful optical alignment and various system tests, the UT1 was handed over to the SV Team on August 17 at midnight local time. The first SV observations began immediately thereafter and the SV Team was active 24 hours a day throughout the two-week period. Video-conferences between Garching and Paranal took place every day at about noon Garching time (6 o'clock in the morning on Paranal). Then, while the Paranal observers were sleeping, data from the previous night were inspected and reduced in Garching, with feedback on what was best to do during the following night being emailed to Paranal several hours in advance of the beginning of the observations. 
The campaign ended in the morning of September 1 when the telescope was returned to the Commissioning Team that has since continued its work. The FORS instrument is now being installed and the first images from this facility are expected shortly. Observational circumstances During the two-week SV period, a total of 154 hours were available for astronomical observations. Of these, 95 hours (62%) were used to collect scientific data, including calibrations, e.g. flat-fielding and photometric standard star observations. 15 hours (10%) were spent to solve minor technical problems, while another 44 hours (29%) were lost due to adverse meteorological conditions (clouds or wind exceeding 15 m/sec). The amount of telescope technical downtime is very small at this moment of the UT1 commissioning. This fact provides an impressive indication of high technical reliability that has been achieved and which will be further consolidated during the next months. The meteorological conditions that were encountered at Paranal during this period were unfortunately below average, when compared to data from the same calendar period in earlier years. There was an excess of bad seeing and fewer good seeing periods than normal; see, however, ESO PR Photo 35c/98 with 0.26 arcsec image quality. Nevertheless, the measured image quality on the acquired frames was often better than the seeing measured outside the enclosure by the Paranal seeing monitor. Part of this very positive effect is due to "active field stabilization" , now performed during all observations by rapid motion (10 - 70 times per second) of the 1.1-m secondary mirror of beryllium (M2) and compensating for the "twinkling" of stars. Science Verification data soon to be released A great amount of valuable data was collected during the SV programme. 
The available programme time was distributed as follows: Hubble Deep Field - South [HDF-S; NICMOS and STIS Fields] (37.1 hrs); Lensed QSOs (3.2 hrs); High-z Clusters (6.2 hrs); Host Galaxies of Gamma-Ray Bursters (2.1 hrs); Edge-on Galaxies (7.4 hrs); Globular cluster cores (6.7 hrs); QSO Hosts (4.4 hrs); TNOs (3.4 hrs); Pulsars (1.3 hrs); Calibrations (22.7 hrs). All of the SV data are now in the process of being prepared for public release by September 30, 1998 to the ESO and Chilean astronomical communities. It will be possible to retrieve the data from the VLT archive, and a set of CDs will be distributed to all astronomical research institutes within the ESO member states and Chile. Moreover, data obtained on the HDF-S will become publicly available worldwide, and retrievable from the VLT archive. Updated information on this data release can be found on the ESO web site at http://www.eso.org/vltsv/. It is expected that the first scientific results based on the SV data will become available in the course of October and November 1998. First images from the Science Verification programme This Press Release is accompanied by three photos that reproduce some of the images obtained during the SV period. ESO PR Photo 35a/98 ESO PR Photo 35a/98 [Preview - JPEG: 671 x 800 pix - 752k] [High-Res - JPEG: 2518 x 3000 pix - 5.8Mb] This colour composite was constructed from the U+B, R and I Test Camera Images of the Hubble Deep Field South (HDF-S) NICMOS field. These images are displayed as blue, green and red, respectively. The first photo is a colour composite of the HDF-S NICMOS sky field that combines exposures obtained in different wavebands: ultraviolet (U) + blue (B), red (R) and near-infrared (I). For all of them, the image quality is better than 0.9 arcsec. Most of the objects seen in the field are distant galaxies. 
    The image is reproduced in such a way that it shows the faintest features, while rendering the image of the star below the large spiral galaxy approximately white. The spiral galaxy is displayed in such a way that the internal structure is visible. A provisional analysis has shown that the limiting magnitudes that were predicted for the HDF-S observations (27.0 - 28.5, depending on the band) were in fact reached. Technical information: Photo 35a/98 is based on 16 U-frames (~370 nm; total exposure time 17800 seconds; mean seeing 0.71 arcsec) and 15 B-frames (~430 nm; 10200 seconds; 0.71 arcsec), which were added and combined with 8 R-frames (~600 nm; 7200 seconds; 0.49 arcsec) and 12 I-frames (~800 nm; 10150 seconds; 0.59 arcsec) to make this colour composite. Individual frames were flat-fielded and cleaned for cosmics before combination. The field shown measures 1.0 x 1.0 arcmin. North is up; East is to the left. ESO PR Photo 35b/98 ESO PR Photo 35b/98 [Preview - JPEG: 679 x 800 pix - 760k] [High-Res - JPEG: 2518 x 3000 pix - 5.7Mb] The colour composite of the HDF-S NICMOS field constructed by combining VLT Test Camera images in U+B and R bands with a HST NICMOS near-IR H-band exposure. These images are displayed as blue, green and red, respectively. The NICMOS image was smoothed to match the angular resolution of the R-band VLT image. The boundary of the NICMOS image is also shown. The next photo is similar to the first one, but uses a near-IR frame obtained with the Hubble Space Telescope NICMOS instrument instead of the VLT I-frame. The HST image has nearly the same total exposure time as the VLT images. Their combination is meaningful since the VLT and NICMOS images reach similar depths and show more or less the same faint objects. 
This is the result of several effects compensating each other: while more distant galaxies are redder and therefore better visible at the infrared waveband of the NICMOS image and this image has a better angular resolution than those from the VLT, the collecting area of the UT1 mirror is over 11 times larger than that of the HST. It is interesting to note that all objects in the NICMOS image are also visible in the VLT images, with the exception of the very red object just left of the face-on spiral. The bright red object near the bottom has not before been detected in optical images (to the limit of R ~ 26 mag), but is clearly present in all the VLT Test Camera coadded images, with the exception of the U-band image. Both of these very red objects are possibly extremely distant, elliptical galaxies [2]. The additional information that can be obtained from the combination of the VLT and the infrared NICMOS images has an immediate bearing on the future work with the VLT. When the infrared, multi-mode ISAAC instrument enters into operation in early 1999, it will be able to obtain spectra of such objects and, in general, to deliver very deep infrared images. Thus, the combination of visual (from FORS) and infrared (from ISAAC) images and spectra promises to become an extremely powerful tool that will allow the detection of very red and therefore exceedingly distant galaxies. Moreover, it is obvious that this sky field is not very crowded - much longer exposure times will thus be possible without encountering serious problems of overlapping objects at the "confusion limit". 
    Technical information: Photo 35b/98 is based on 16 U-frames (~370 nm; total exposure time 17800 seconds; mean seeing 0.71 arcsec) and 15 B-frames (~430 nm; 10200 seconds; 0.71 arcsec), which were added and combined with 8 R-frames (~600 nm; 7200 seconds; 0.49 arcsec) as well as an HST/NICMOS H-band frame from the ST-ECF public archive (~1600 nm; 7040 seconds; 0.2 arcsec) to make this colour composite. Individual frames were flat-fielded and cleaned for cosmics before combination. The field shown measures 1.0 x 1.0 arcmin. North is up; East is to the left. ESO PR Photo 35c/98 ESO PR Photo 35c/98 [Preview - JPEG: 654 x 800 pix - 280k] [High-Res - JPEG: 2489 x 3000 pix - 2.6Mb] Coaddition of two R-band images of the edge-on galaxy ESO342-G017, obtained with 0.26 arcsec image quality. The galaxy ESO342-G017 was observed on August 19, 1998 during a spell of excellent observing conditions. Two exposures, each lasting 120 seconds, were taken through a red filter to produce this photo. The quality of the original images is excellent, with seeing (FWHM) of only 0.26 arcsec measured on the stars in the frame. ESO342-G017 is an Sc-type spiral galaxy seen edge-on, and the Test Camera was rotated so that the disk of the galaxy appears horizontal in the figure. Thanks to the image quality, the photo shows much detail in the rather flat disk, including a very thin, obscuring dust band and some brighter knots, most probably star-forming regions. This galaxy is located well outside the Milky Way band in the southern constellation of Sagittarius. Its distance is about 400 million light-years (recession velocity about 7,700 km/sec). A number of more distant galaxies are seen in the background of this short exposure. Technical information: Photo 35c/98 is reproduced from a composite of two 120-second exposures in the red R-band (~600 nm) of the edge-on galaxy ESO342-G017, both with 0.26 arcsec image quality. 
    The frames were flat-fielded and cleaned for cosmics before combination. The field shown measures 1.5 x 1.5 arcmin. North is inclined 38° clockwise from the top; East is to the left. Notes: [1] The flat and elliptically shaped tertiary mirror M3 is mounted on top of the M3 Tower, which is fixed in the center of the M1 Cell. The tower can rotate about its axis and deflects the light coming from the M2 mirror to the astronomical instruments on either Nasmyth platform. A mechanism at the top of the M3 Tower is used to move the M3 mirror away from the optical path when the instrument at the Cassegrain focus is used, e.g. the Test Camera during the SV observations. [2] This effect is due to the fact that the more distant a galaxy is, the larger is the velocity with which it recedes from us (Hubble's law). The larger the velocity, the further its emitted light will be shifted redwards in the observed spectrum (the Doppler effect) and the redder its image will appear to us. By comparing the brightness of a distant galaxy in different wavebands (measuring its colour), it is therefore in practice possible to estimate its redshift and thus its distance (the "photometric redshift" method). How to obtain ESO Press Information ESO Press Information is made available on the World-Wide Web (URL: http://www.eso.org ). ESO Press Photos may be reproduced, if credit is given to the European Southern Observatory.

  20. Wound healing potential of adipose tissue stem cell extract

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Na, You Kyung; Ban, Jae-Jun; Lee, Mijung

    Adipose tissue stem cells (ATSCs) are considered a promising source in the field of cell therapy and regenerative medicine. In addition to direct cell replacement using stem cells, intercellular molecule exchange by stem cell secretory factors has shown beneficial effects by reducing tissue damage and augmenting endogenous repair. Delayed cutaneous wound healing is implicated in many conditions such as diabetes, aging, stress and alcohol consumption. However, the effects of cell-free extract of ATSCs (ATSC-Ex) containing secretome on the wound healing process have not been investigated. In this study, ATSC-Ex was topically applied on the cutaneous wound and healing speed was examined. As a result, wound closure was much faster in the cell-free extract treated wound than the control wound at 4, 6 and 8 days after application of ATSC-Ex. Dermal fibroblast proliferation, migration and extracellular matrix (ECM) production are critical aspects of wound healing, and the effects of ATSC-Ex on human dermal fibroblasts (HDF) were examined. ATSC-Ex augmented HDF proliferation in a dose-dependent manner and migration ability was enhanced by extract treatment. Representative ECM proteins, collagen type I and matrix metalloproteinase-1, were significantly up-regulated by treatment with ATSC-Ex. Our results suggest that ATSC-Ex has an improving effect on wound healing and can be a potential therapeutic candidate for cutaneous wound healing. - Highlights: • Topical application of ATSC-Ex results in faster wound closure than normal wound in vivo. • ATSC-Ex enhances dermal fibroblast proliferation, migration and extracellular matrix production. • This study suggests that ATSC-Ex is an effective source to augment wound healing.

  1. Discovery of Marine Datasets and Geospatial Metadata Visualization

    NASA Astrophysics Data System (ADS)

    Schwehr, K. D.; Brennan, R. T.; Sellars, J.; Smith, S.

    2009-12-01

    NOAA's National Geophysical Data Center (NGDC) provides the deep archive of US multibeam sonar hydrographic surveys. NOAA stores the data as Bathymetric Attributed Grids (BAG; http://www.opennavsurf.org/), which are HDF5 formatted files containing gridded bathymetry, gridded uncertainty, and XML metadata. While NGDC provides the deep store and a basic ESRI ArcIMS interface to the data, additional tools need to be created to increase the frequency with which researchers discover hydrographic surveys that might be beneficial for their research. Using Open Source tools, we have created a draft of a Google Earth visualization of NOAA's complete collection of BAG files as of March 2009. Each survey is represented as a bounding box, an optional preview image of the survey data, and a pop-up placemark. The placemark contains a brief summary of the metadata and links to directly download the BAG survey files and the complete metadata file. Each survey is time tagged so that users can search both in space and time for surveys that meet their needs. By creating this visualization, we aim to make the entire process of data discovery, validation of relevance, and download much more efficient for research scientists who may not be familiar with NOAA's hydrographic survey efforts or the BAG format. In the process of creating this demonstration, we have identified a number of improvements that can be made to the hydrographic survey process in order to make the results easier to use, especially with respect to metadata generation. With the combination of the NGDC deep archiving infrastructure, a Google Earth virtual globe visualization, and GeoRSS feeds of updates, we hope to increase the utilization of these high-quality gridded bathymetry datasets. This workflow applies equally well to LIDAR topography and bathymetry.
Additionally, with proper referencing and geotagging in journal publications, we hope to close the loop and help the community create a true “Geospatial Scholar” infrastructure.
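
The BAG layout described above can be exercised with a short h5py sketch. The group and dataset names (`BAG_root`, `elevation`, `uncertainty`, `metadata`) follow the Open Navigation Surface BAG convention, but treat those details as an assumption here; the file built below is a tiny synthetic stand-in, not a real survey:

```python
import numpy as np
import h5py

def read_bag(path):
    """Read gridded bathymetry from a BAG-style HDF5 file.

    BAG files keep their contents under a "BAG_root" group holding
    "elevation" and "uncertainty" grids plus an XML "metadata" block
    (treated here simply as raw bytes).
    """
    with h5py.File(path, "r") as f:
        root = f["BAG_root"]
        elevation = root["elevation"][:]
        uncertainty = root["uncertainty"][:]
        metadata = bytes(root["metadata"][:]).decode("utf-8", "replace")
    return elevation, uncertainty, metadata

# Build a tiny synthetic BAG-like file so the sketch is self-contained.
with h5py.File("demo.bag", "w") as f:
    root = f.create_group("BAG_root")
    root.create_dataset("elevation", data=np.full((4, 4), -25.0, dtype=np.float32))
    root.create_dataset("uncertainty", data=np.full((4, 4), 0.3, dtype=np.float32))
    xml = b"<gmi:MI_Metadata>...</gmi:MI_Metadata>"
    root.create_dataset("metadata", data=np.frombuffer(xml, dtype=np.uint8))

elev, unc, meta = read_bag("demo.bag")
print(elev.shape, float(elev.min()))   # (4, 4) -25.0
```

Real BAG files also carry tracking lists and, in later versions, variable-resolution layers; a production reader would parse the XML metadata rather than just decoding it.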

  2. The field representation language.

    PubMed

    Tsafnat, Guy

    2008-02-01

    The complexity of quantitative biomedical models, and the rate at which they are published, is increasing to a point where managing the information has become all but impossible without automation. International efforts are underway to standardise representation languages for a number of mathematical entities that represent a wide variety of physiological systems. This paper presents the Field Representation Language (FRL), a portable representation of values that change over space and/or time. FRL is an extensible mark-up language (XML) derivative with support for large numeric data sets in Hierarchical Data Format version 5 (HDF5). Components of FRL can be reused through uniform resource identifiers (URIs) that point to external resources such as custom basis functions, boundary geometries and numerical data. To demonstrate the use of FRL as an interchange format, we present three models that study hyperthermia cancer treatment: a fractal model of liver tumour microvasculature; a probabilistic model simulating the deposition of magnetic microspheres throughout it; and a finite element model of hyperthermic treatment. The microsphere distribution field was used to compute the heat generation rate field around the tumour. We used FRL to convey results from the microsphere simulation to the treatment model. FRL facilitated the conversion of the coordinate systems and approximated the integral over regions of the microsphere deposition field.
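
The pattern FRL describes, bulk numbers in HDF5 referenced from an XML document by URI, can be miniaturized as follows. The element and attribute names below are invented for illustration and are not the actual FRL schema:

```python
import numpy as np
import h5py
import xml.etree.ElementTree as ET

# Bulk numeric data lives in HDF5, as FRL prescribes for large value sets.
heat = np.linspace(0.0, 4.2e4, 8)          # e.g. heat generation rate samples
with h5py.File("tumour_field.h5", "w") as f:
    f.create_dataset("heat_rate", data=heat)

# A small XML document references the HDF5 payload by URI.
# (Hypothetical element/attribute names, not the FRL schema.)
field = ET.Element("field", name="heat_generation_rate", units="W.m-3")
ET.SubElement(field, "values", href="tumour_field.h5#heat_rate")
xml_text = ET.tostring(field, encoding="unicode")

# A consumer resolves the reference back to the numeric data.
ref = ET.fromstring(xml_text).find("values").get("href")
path, dset = ref.split("#")
with h5py.File(path, "r") as f:
    assert np.allclose(f[dset][:], heat)
```

Keeping the numbers out of the XML is what makes the representation practical for large fields: the XML stays small and diffable while HDF5 handles the bulk storage.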

  3. Replacement of acetate with citrate in dialysis fluid: a randomized clinical trial of short term safety and fluid biocompatibility

    PubMed Central

    2013-01-01

    Background The majority of bicarbonate based dialysis fluids are acidified with acetate. Citrate, a well known anticoagulant and antioxidant, has been suggested as a biocompatible alternative. The objective of this study was to evaluate short term safety and biocompatibility of a citrate containing acetate-free dialysis fluid. Methods Twenty four (24) patients on maintenance dialysis three times per week, 13 on on-line hemodiafiltration (HDF) and 11 on hemodialysis (HD), were randomly assigned to start with either citrate dialysis fluid (1 mM citrate, 1.5 mM calcium) or control fluid (3 mM acetate, 1.5 mM calcium) in an open-labeled cross-over trial (6 + 6 weeks with 8 treatments wash-out in between). Twenty (20) patients, 11 on HDF and 9 on HD were included in the analyses. Main objective was short term safety assessed by acid–base status, plasma ionized calcium and parathyroid hormone (PTH). In addition, biocompatibility was assessed by markers of inflammation (pentraxin 3 (PTX-3), CRP, IL-6, TNF-α and IL-1β) and thrombogenicity (activated partial thromboplastin time (APTT) and visual clotting scores). Results No differences dependent on randomization order or treatment mode (HD vs. HDF) were detected. Citrate in the dialysis fluid reduced the intra-dialytic shift in pH (+0.04 week 6 vs. +0.06 week 0, p = 0.046) and base excess (+3.9 mM week 6 vs. +5.6 mM week 0, p = 0.006) over the study period. Using the same calcium concentration (1.5 mM), citrate dialysis fluid resulted in lower post-dialysis plasma ionized calcium level (1.10 mM vs. 1.27 mM for control, p < 0.0001) and higher post-dialysis PTH level (28.8 pM vs. 14.7 pM for control, p < 0.0001) while pre-dialysis levels were unaffected. Citrate reduced intra-dialytic induction of PTX-3 (+1.1 ng/ml vs. +1.4 ng/ml for control, p = 0.04) but had no effect on other markers of inflammation or oxidative stress. Citrate reduced visual clotting in the arterial air chamber during HDF (1.0 vs. 
1.8 for control, p = 0.03) and caused an intra-dialytic increase in APTT (+6.8 s, p = 0.003) without affecting post-dialysis values compared to control. Conclusions During this small short term study citrate dialysis fluid was apparently safe to use in HD and on-line HDF treatments. Indications of reduced treatment-induced inflammation and thrombogenicity suggest citrate as a biocompatible alternative to acetate in dialysis fluid. However, the results need to be confirmed in long term studies. Trial registration ISRCTN: ISRCTN28536511 PMID:24103587

  4. hdfscan

    Atmospheric Science Data Center

    2013-04-01

    ... free of charge from JPL, upon completion of a license agreement. hdfscan software consists of two components - a core hdf file ... at the Jet Propulsion Laboratory. To obtain the license agreement, go to the  MISR Science Software web page , read the introductory ...

  5. Bitshuffle: Filter for improving compression of typed binary data

    NASA Astrophysics Data System (ADS)

    Masui, Kiyoshi

    2017-12-01

    Bitshuffle rearranges typed, binary data to improve compression; the algorithm is implemented in a python/C package within the Numpy framework. The library can be used alongside HDF5 to compress and decompress datasets and is integrated through the dynamically loaded filters framework. Algorithmically, Bitshuffle is closely related to HDF5's Shuffle filter except that it operates at the bit level instead of the byte level. Arranging a typed data array into a matrix with the elements as the rows and the bits within the elements as the columns, Bitshuffle "transposes" the matrix, such that all the least significant bits are in a row, and so on. This transposition is performed within blocks of data roughly 8 kB long; it does not in itself compress the data, but rearranges it for more efficient compression. A compression library is necessary to perform the actual compression. This scheme has been used for compression of radio data in high performance computing.
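
The bit-level transpose can be sketched in a few lines of NumPy. This is only an illustration of the idea, not the vectorized C implementation the package actually ships:

```python
import numpy as np

def bitshuffle(block: np.ndarray) -> np.ndarray:
    """Bit-level transpose of a typed array: gather the same bit position
    of every element together, so low-entropy bit planes become long runs."""
    elem_bits = block.dtype.itemsize * 8
    # Bit matrix: one row per element, one column per bit position.
    bits = np.unpackbits(block.view(np.uint8)).reshape(-1, elem_bits)
    return np.packbits(bits.T)           # bytes, one bit plane after another

def bitunshuffle(buf: np.ndarray, dtype, count: int) -> np.ndarray:
    """Inverse transform: undo the transpose and reinterpret as `dtype`."""
    elem_bits = np.dtype(dtype).itemsize * 8
    bits = np.unpackbits(buf)[: count * elem_bits].reshape(elem_bits, count)
    return np.packbits(bits.T).view(dtype)

data = (np.arange(1024) % 7).astype(np.uint32)   # small values: sparse high bits
shuffled = bitshuffle(data)
assert np.array_equal(bitunshuffle(shuffled, np.uint32, data.size), data)
```

For data like the example above, the high bits of every element are zero, so after the transpose most of the output consists of all-zero runs that a downstream compressor (LZ4, zlib, ...) handles very efficiently.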

  6. Scientific Data Storage for Cloud Computing

    NASA Astrophysics Data System (ADS)

    Readey, J.

    2014-12-01

    Traditionally, data storage used for geophysical software systems has centered on file-based systems and libraries such as NetCDF and HDF5. In contrast, cloud-based infrastructure providers such as Amazon AWS, Microsoft Azure, and the Google Cloud Platform generally provide storage technologies based on an object-based storage service (for large binary objects) complemented by a database service (for small objects that can be represented as key-value pairs). These systems have been shown to be highly scalable, reliable, and cost effective. We will discuss a proposed system that leverages these cloud-based storage technologies to provide an API-compatible library for traditional NetCDF and HDF5 applications. This system will enable cloud storage suitable for geophysical applications that can scale up to petabytes of data and thousands of users. We'll also cover other advantages of this system, such as enhanced metadata search.
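
The chunk-per-object pattern this architecture implies can be sketched with a plain dict standing in for the object store and another for the key-value metadata service. All names here are hypothetical; a real system would use an S3 or Blob client and a managed database instead:

```python
import json
import numpy as np

object_store = {}   # stand-in for S3/Azure Blob: key -> bytes
kv_index = {}       # stand-in for a database service: small metadata records

def put_array(name, arr, chunk_rows):
    """Split an array into fixed-size row chunks, one object per chunk,
    and record shape/dtype/chunking in the key-value index."""
    for i, start in enumerate(range(0, arr.shape[0], chunk_rows)):
        object_store[f"{name}/chunk/{i}"] = arr[start:start + chunk_rows].tobytes()
    kv_index[name] = json.dumps(
        {"shape": list(arr.shape), "dtype": str(arr.dtype), "chunk_rows": chunk_rows})

def get_array(name):
    """Reassemble the array from its chunk objects using the index record."""
    meta = json.loads(kv_index[name])
    nchunks = -(-meta["shape"][0] // meta["chunk_rows"])   # ceiling division
    raw = b"".join(object_store[f"{name}/chunk/{i}"] for i in range(nchunks))
    return np.frombuffer(raw, dtype=meta["dtype"]).reshape(meta["shape"])

temps = np.linspace(270.0, 300.0, 120).reshape(12, 10)
put_array("era/temperature", temps, chunk_rows=5)
assert np.array_equal(get_array("era/temperature"), temps)
```

Because each chunk is an independent object, readers can fetch only the chunks they need and many clients can read in parallel, which is where the scalability claim in the abstract comes from.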

  7. Task 28: Web Accessible APIs in the Cloud Trade Study

    NASA Technical Reports Server (NTRS)

    Gallagher, James; Habermann, Ted; Jelenak, Aleksandar; Lee, Joe; Potter, Nathan; Yang, Muqun

    2017-01-01

    This study explored three candidate architectures for serving NASA Earth Science Hierarchical Data Format Version 5 (HDF5) data via Hyrax running on Amazon Web Services (AWS). We studied the cost and performance of each architecture using several representative use cases. The objectives of the project are to: conduct a trade study to identify one or more high performance integrated solutions for storing and retrieving NASA HDF5 and Network Common Data Format Version 4 (netCDF4) data in a cloud (web object store) environment, the target environment being Amazon Web Services (AWS) Simple Storage Service (S3); conduct the needed level of software development to properly evaluate solutions in the trade study and to obtain the required benchmarking metrics for input into a government decision on potential follow-on prototyping; and develop a cloud cost model for the preferred data storage solution (or solutions) that accounts for different granulation and aggregation schemes as well as cost and performance trades.

  8. PFLOTRAN: Reactive Flow & Transport Code for Use on Laptops to Leadership-Class Supercomputers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hammond, Glenn E.; Lichtner, Peter C.; Lu, Chuan

    PFLOTRAN, a next-generation reactive flow and transport code for modeling subsurface processes, has been designed from the ground up to run efficiently on machines ranging from leadership-class supercomputers to laptops. Based on an object-oriented design, the code is easily extensible to incorporate additional processes. It can interface seamlessly with Fortran 9X, C and C++ codes. Domain decomposition parallelism is employed, with the PETSc parallel framework used to manage parallel solvers, data structures and communication. Features of the code include a modular input file, implementation of high-performance I/O using parallel HDF5, the ability to perform multiple-realization simulations with multiple processors per realization in a seamless manner, and multiple modes for multiphase flow and multicomponent geochemical transport. Chemical reactions currently implemented in the code include homogeneous aqueous complexing reactions and heterogeneous mineral precipitation/dissolution, ion exchange, surface complexation and a multirate kinetic sorption model. PFLOTRAN has demonstrated petascale performance using 2^17 processor cores with over 2 billion degrees of freedom. Accomplishments achieved to date include applications to the Hanford 300 Area and modeling CO2 sequestration in deep geologic formations.

  9. Protein Secondary Structure Prediction Using Deep Convolutional Neural Fields.

    PubMed

    Wang, Sheng; Peng, Jian; Ma, Jianzhu; Xu, Jinbo

    2016-01-11

    Protein secondary structure (SS) prediction is important for studying protein structure and function. When only the sequence (profile) information is used as input feature, currently the best predictors can obtain ~80% Q3 accuracy, which has not been improved in the past decade. Here we present DeepCNF (Deep Convolutional Neural Fields) for protein SS prediction. DeepCNF is a Deep Learning extension of Conditional Neural Fields (CNF), which is an integration of Conditional Random Fields (CRF) and shallow neural networks. DeepCNF can model not only complex sequence-structure relationship by a deep hierarchical architecture, but also interdependency between adjacent SS labels, so it is much more powerful than CNF. Experimental results show that DeepCNF can obtain ~84% Q3 accuracy, ~85% SOV score, and ~72% Q8 accuracy, respectively, on the CASP and CAMEO test proteins, greatly outperforming currently popular predictors. As a general framework, DeepCNF can be used to predict other protein structure properties such as contact number, disorder regions, and solvent accessibility.
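
The CRF layer that distinguishes DeepCNF from a plain deep network can be illustrated with a Viterbi decode over emission and transition scores. The emission scores below are random stand-ins for the output of the convolutional stack, and the transition matrix simply favors runs of the same label; both are hypothetical, not DeepCNF's trained parameters:

```python
import numpy as np

def viterbi(emissions, transitions):
    """Most likely label path given per-residue emission scores (L, K) and
    label-to-label transition scores (K, K), as in a linear-chain CRF."""
    L, K = emissions.shape
    score = emissions[0].copy()
    back = np.zeros((L, K), dtype=int)
    for t in range(1, L):
        cand = score[:, None] + transitions        # (K, K): prev label -> next
        back[t] = cand.argmax(axis=0)              # best predecessor per label
        score = cand.max(axis=0) + emissions[t]
    path = [int(score.argmax())]
    for t in range(L - 1, 0, -1):
        path.append(int(back[t][path[-1]]))
    return path[::-1]

labels = ["H", "E", "C"]                  # helix, strand, coil (Q3 labels)
rng = np.random.default_rng(0)
emissions = rng.normal(size=(10, 3))      # stand-in for the deep network's output
transitions = np.full((3, 3), -1.0) + 2.0 * np.eye(3)   # favor label runs
path = viterbi(emissions, transitions)
print("".join(labels[k] for k in path))
```

This joint decoding over adjacent labels is the "interdependency between adjacent SS labels" the abstract credits for DeepCNF's advantage over per-position classifiers.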

  10. Protein Secondary Structure Prediction Using Deep Convolutional Neural Fields

    NASA Astrophysics Data System (ADS)

    Wang, Sheng; Peng, Jian; Ma, Jianzhu; Xu, Jinbo

    2016-01-01

    Protein secondary structure (SS) prediction is important for studying protein structure and function. When only the sequence (profile) information is used as input feature, currently the best predictors can obtain ~80% Q3 accuracy, which has not been improved in the past decade. Here we present DeepCNF (Deep Convolutional Neural Fields) for protein SS prediction. DeepCNF is a Deep Learning extension of Conditional Neural Fields (CNF), which is an integration of Conditional Random Fields (CRF) and shallow neural networks. DeepCNF can model not only complex sequence-structure relationship by a deep hierarchical architecture, but also interdependency between adjacent SS labels, so it is much more powerful than CNF. Experimental results show that DeepCNF can obtain ~84% Q3 accuracy, ~85% SOV score, and ~72% Q8 accuracy, respectively, on the CASP and CAMEO test proteins, greatly outperforming currently popular predictors. As a general framework, DeepCNF can be used to predict other protein structure properties such as contact number, disorder regions, and solvent accessibility.

  11. perf-dump

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gamblin, T.

    perf-dump is a library for dumping performance data in much the same way physics simulations dump checkpoints. It records per-process, per-timestep, per-phase, and per-thread performance counter data and dumps this large data periodically into an HDF5 data file.
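
The checkpoint-style dumping described above can be sketched with h5py: an HDF5 dataset left extensible along the timestep axis and grown by one slab per dump. The layout and counter choices here are illustrative assumptions, not perf-dump's actual schema:

```python
import numpy as np
import h5py

N_PROCS, N_COUNTERS = 4, 2   # e.g. FLOP count and cache misses per process

with h5py.File("perf.h5", "w") as f:
    # Extensible along the timestep axis, like a checkpoint-style dump.
    dset = f.create_dataset(
        "counters", shape=(0, N_PROCS, N_COUNTERS),
        maxshape=(None, N_PROCS, N_COUNTERS), dtype="f8", chunks=True)
    for step in range(3):
        sample = np.random.rand(N_PROCS, N_COUNTERS)   # stand-in for counter reads
        dset.resize(dset.shape[0] + 1, axis=0)         # grow by one timestep
        dset[-1] = sample

with h5py.File("perf.h5", "r") as f:
    print(f["counters"].shape)   # (3, 4, 2)
```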

  12. MODIS Atmospheric Data Handler

    NASA Technical Reports Server (NTRS)

    Anantharaj, Valentine; Fitzpatrick, Patrick

    2008-01-01

    The Moderate Resolution Imaging Spectroradiometer (MODIS) Atmosphere Data Handler software converts the HDF data to ASCII format, and outputs: (1) atmospheric profiles of temperature and dew point and (2) total precipitable water. Quality-control data are also considered in the export procedure.

  13. Enhancement of Carbon Sequestration in west coast Douglas-fir Forests with Nitrogen Fertilization

    NASA Astrophysics Data System (ADS)

    Chen, B.; Jassal, R.; Black, A.; Brummer, C.; Spittlehouse, D.; Nesic, Z.

    2008-12-01

    Fertilization is one of the eligible management practices for C sequestering and hence reducing CO2 emissions under Article 3.4 of the Kyoto Protocol. In the coastal regions of British Columbia, which have very little nitrogen (N) deposition from pollution sources owing to their remote location, and soils deficient in N (Hanley et al., 1996), Douglas-fir stands respond to N fertilization (Brix, 1981; Fisher and Binkley, 2000; Chapin et al., 2002). However, a major concern with N fertilization is the potential loss from the soil surface of the highly potent greenhouse gas N2O, and little is known about such losses in N-fertilized forest soils. While it is necessary to determine and quantify the effects of N fertilization on stand C sequestration, it is also important to address environmental concerns by measuring N2O emissions to determine the net greenhouse gas (GHG) global warming potential (GWP). The GWP of N2O is 296 times (100-year time horizon) greater than that of CO2 (Ehhalt and Prather, 2001), yet there is little information on its net radiative forcing as a result of forest fertilization. We report two years of results on the effects of N fertilization in a chronosequence of three Douglas-fir stands (7, 19 and 58 years old, hereafter referred to as HDF00, HDF88 and DF49, respectively) on net C sequestration or net primary productivity measured using the eddy-covariance technique. DF49 (110 ha) and HDF88 (20 ha) were aerially fertilized with urea at 200 kg N ha-1 on Jan 13 and Feb 17, 2007, respectively, while due to its young age and competing understory, fertilizer to HDF00 (5 ha) was manually applied at 80 g urea/tree (60 kg N ha-1) along the tree drip line on Feb 13-14, 2007. Additionally, we calculate the net change in GHG GWP resulting from fertilization of DF49 by accounting for N2O emissions and energy costs of fertilizer production, transport, and application. 
We also compare polymer-coated slow-release urea (Environmentally Smart Nitrogen (ESN), Agrium Inc., Calgary, AB, Canada) with regular urea for its potential effectiveness in reducing N2O emissions from the forest-floor.

  14. Adipose-derived stem cells cooperate with fractional carbon dioxide laser in antagonizing photoaging: a potential role of Wnt and β-catenin signaling.

    PubMed

    Xu, Xiao; Wang, Hong-Yi; Zhang, Yu; Liu, Yang; Li, Yan-Qi; Tao, Kai; Wu, Chu-Tse; Jin, Ji-de; Liu, Xiao-Yan

    2014-01-01

    It is well established that adipose-derived stem cells (ADSCs) produce and secrete cytokines/growth factors that antagonize UV-induced photoaging of the skin. However, the exact molecular basis underlying the anti-photoaging effects exerted by ADSCs is not well understood, and whether ADSCs cooperate with fractional carbon dioxide (CO2) laser treatment to facilitate the healing of photoaged skin has not been explored. Here, we investigated the effects of ADSCs on photoaging in an animal model, the associated mechanisms, and their functional cooperation with fractional CO2 laser treatment of photoaged skin. We showed that ADSCs increased dermal thickness and activated the proliferation of dermal fibroblasts. We further demonstrated that the combined treatment of ADSCs and fractional CO2 laser, the latter of which is often used to resurface skin and treat wrinkles, had more beneficial effects on photoaged skin than either individual treatment. In our UVB-induced human dermal fibroblast (HDF) photoaging model, flow cytometry showed that co-culture with adipose-derived stem cell conditioned medium (ADSC-CM) yielded a higher cell proliferation rate than UVB irradiation alone (p < 0.05). Additionally, the expression of β-catenin and Wnt3a was up-regulated after transplantation of ADSCs alone or in combination with fractional CO2 laser treatment, and Wnt3a and β-catenin expression correlated positively with the photoaging-related proteins TGF-β2 and collagen type I. We also verified these protein expression changes at the tissue level. In addition, injection of SFRP2, a Wnt3a inhibitor, into the ADSC-CM co-cultured HDF photoaging model significantly decreased Wnt3a and β-catenin protein levels compared with the untreated group. Both ADSCs and fractional CO2 laser treatment improved photoaged skin, at least partially by targeting dermal fibroblast activity, which was increased in photoaged skin. The combined use of ADSCs and fractional CO2 laser synergistically improved the healing of photoaged skin. Thus, we provide a strong rationale for the combined use of ADSCs and fractional CO2 laser in the clinical treatment of photoaged skin. Moreover, we provide evidence that the Wnt/β-catenin signaling pathway may contribute to the activation of dermal fibroblasts by transplanted ADSCs in both in vitro and in vivo experiments.

  15. Adipose-derived stem cells cooperate with fractional carbon dioxide laser in antagonizing photoaging: a potential role of Wnt and β-catenin signaling

    PubMed Central

    2014-01-01

    Background It is well established that adipose-derived stem cells (ADSCs) produce and secrete cytokines/growth factors that antagonize UV-induced photoaging of the skin. However, the exact molecular basis underlying the anti-photoaging effects exerted by ADSCs is not well understood, and whether ADSCs cooperate with fractional carbon dioxide (CO2) laser treatment to facilitate the healing of photoaged skin has not been explored. Here, we investigated the effects of ADSCs on photoaging in an animal model, the associated mechanisms, and their functional cooperation with fractional CO2 laser treatment of photoaged skin. Results We showed that ADSCs increased dermal thickness and activated the proliferation of dermal fibroblasts. We further demonstrated that the combined treatment of ADSCs and fractional CO2 laser, the latter of which is often used to resurface skin and treat wrinkles, had more beneficial effects on photoaged skin than either individual treatment. In our UVB-induced human dermal fibroblast (HDF) photoaging model, flow cytometry showed that co-culture with adipose-derived stem cell conditioned medium (ADSC-CM) yielded a higher cell proliferation rate than UVB irradiation alone (p < 0.05). Additionally, the expression of β-catenin and Wnt3a was up-regulated after transplantation of ADSCs alone or in combination with fractional CO2 laser treatment, and Wnt3a and β-catenin expression correlated positively with the photoaging-related proteins TGF-β2 and collagen type I. We also verified these protein expression changes at the tissue level. In addition, injection of SFRP2, a Wnt3a inhibitor, into the ADSC-CM co-cultured HDF photoaging model significantly decreased Wnt3a and β-catenin protein levels compared with the untreated group. Conclusion Both ADSCs and fractional CO2 laser treatment improved photoaged skin, at least partially by targeting dermal fibroblast activity, which was increased in photoaged skin. The combined use of ADSCs and fractional CO2 laser synergistically improved the healing of photoaged skin. Thus, we provide a strong rationale for the combined use of ADSCs and fractional CO2 laser in the clinical treatment of photoaged skin. Moreover, we provide evidence that the Wnt/β-catenin signaling pathway may contribute to the activation of dermal fibroblasts by transplanted ADSCs in both in vitro and in vivo experiments. PMID:24917925

  16. HUBBLE'S TOP TEN GRAVITATIONAL LENSES

    NASA Technical Reports Server (NTRS)

    2002-01-01

    The NASA Hubble Space Telescope serendipitous survey of the sky has uncovered exotic patterns, rings, arcs and crosses that are all optical mirages produced by gravitational lenses, nature's equivalent of a giant magnifying glass in space. Shown are the top 10 lens candidates uncovered in the deepest 100 Hubble fields. Hubble's sensitivity and high resolution allow it to see faint and distant lenses that cannot be detected with ground-based telescopes, whose images are blurred by Earth's atmosphere. [Top Left] - HST 01248+0351 is a lensed pair on either side of the edge-on disk lensing galaxy. [Top Center] - HST 01247+0352 is another pair of bluer lensed source images around the red spherical elliptical lensing galaxy. Two much fainter images can be seen near the detection limit, which might make this a quadruple system. [Top Right] - HST 15433+5352 is a very good lens candidate with a bluer lensed source in the form of an extended arc about the redder elliptical lensing galaxy. [Middle Far Left] - HST 16302+8230 could be an 'Einstein ring' and is the most intriguing lens candidate. It has been nicknamed 'the London Underground' since it resembles that logo. [Middle Near Left] - HST 14176+5226 is the first and brightest lens system discovered in 1995 with the Hubble telescope. This lens candidate has now been confirmed spectroscopically using large ground-based telescopes. The elliptical lensing galaxy is located 7 billion light-years away, and the lensed quasar is about 11 billion light-years distant. [Middle Near Right] - HST 12531-2914 is the second quadruple lens candidate discovered with Hubble. It is similar to the first, but appears smaller and fainter. [Middle Far Right] - HST 14164+5215 is a pair of bluish lensed images symmetrically placed around a brighter, redder galaxy. [Bottom Left] - HST 16309+8230 is an edge-on disk-like galaxy (blue arc) which has been significantly distorted by the redder lensing elliptical galaxy.
[Bottom Center] - HST 12368+6212 is a blue arc in the Hubble Deep Field (HDF). [Bottom Right] - HST 18078+4600 is a blue arc caused by the gravitational potential of a small group of 4 galaxies. Credit: Kavan Ratnatunga (Carnegie Mellon Univ.) and NASA

  17. Tools and strategies for instrument monitoring, data mining and data access

    NASA Astrophysics Data System (ADS)

    van Hees, R. M., ,, Dr

    2009-04-01

    The ever growing size of data sets produced by various satellite instruments creates a challenge in data management. Three main tasks were identified: instrument performance monitoring, data mining by users and data deployment. In this presentation, I will discuss the three tasks and our solution. As a practical example to illustrate the problem and make the discussion less abstract, I will use Sciamachy on-board the ESA satellite Envisat. Since the launch of Envisat, in March 2002, Sciamachy has performed nearly a billion science measurements and daily calibration measurements. The total size of the data set (not including reprocessed data) is over 30 TB, distributed over 150,000 files. [Instrument Monitoring] Most instruments produce house-keeping data, which may include time, geo-location, temperature of different parts of the instrument, and instrument settings and configuration. In addition, many instruments perform calibration measurements. Instrument performance monitoring requires automated analysis of critical parameters for events, and the option to inspect the behavior of various parameters over time off-line. We chose to extract the necessary information from the Sciamachy data products and store everything in one file, with house-keeping data separated from calibration measurements. Due to the large volume and the need for quick random access, the Hierarchical Data Format (HDF5) was the obvious choice. The HDF5 format is self-describing and designed to organize different types of data in one file. For example, one data set may contain the metadata of the calibration measurements: time, geo-location, instrument settings and quality parameters (temperature of the instrument), while a second, large data set contains the actual measurements. The HDF5 high-level packet table API is ideal for tables that only grow (by appending rows), while the HDF5 table API is better suited for tables where rows need to be updated, inserted or replaced.
In particular, the packet table API allows very compact storage of compound data sets and very fast read/write access. Details about this implementation and its pitfalls will be given in the presentation. [Data Mining] The ability to select relevant data is a capability that all data centers have to offer. The NL-SCIA-DC allows users to select data using several criteria, including time, geo-location, type of observation and data quality. The results of a query are [i] the location and name of relevant data products (files), [ii] a listing of the metadata of the relevant measurements, or [iii] a listing of the measurements themselves (level 2 or higher). For this application, we need the power of a relational database, the SQL language, and the availability of spatial functions. PostgreSQL, extended with postGIS support, turned out to be a good choice. Common queries on tables with millions of rows can be executed within seconds. [Data Deployment] The dissemination of scientific data is often made cumbersome by the use of many different formats to store the products. Time-consuming and inefficient conversions are therefore needed to use data products of different origin. Within the Atmospheric Data Access for the Geospatial User Community (ADAGUC) project we provide selected space-borne atmospheric and land data sets in the same data format and with a consistent internal structure, so that users can easily use and combine data. The common format for storage is HDF5, but the netCDF-4 API is used to create the data sets. The standards for metadata and dataset attributes follow the netCDF Climate and Forecast conventions; in addition, metadata complying with the ISO 19115:2003 INSPIRE profile is added. The advantage of netCDF-4 is that the API is essentially equal to netCDF-3 (with a few extensions), while the data format is HDF5 (recognized by many scientific tools). The added metadata ensures product traceability. Details will be given in the presentation and several posters.
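
h5py does not wrap the HDF5 packet-table (H5PT) API directly, but the same append-only pattern can be sketched with an extensible compound-dtype dataset. The field names below are illustrative, not Sciamachy's actual monitoring schema:

```python
import numpy as np
import h5py

# Compound row type: one calibration measurement's metadata per "packet".
row_t = np.dtype([("time", "f8"), ("lat", "f4"), ("lon", "f4"),
                  ("det_temp", "f4"), ("quality", "u1")])

def append_rows(dset, rows):
    """Append-only access, as with the HDF5 packet-table API."""
    n = dset.shape[0]
    dset.resize(n + len(rows), axis=0)
    dset[n:] = rows

with h5py.File("monitoring.h5", "w") as f:
    meta = f.create_dataset("calib_meta", shape=(0,), maxshape=(None,),
                            dtype=row_t, chunks=True)
    rows = np.array([(1046822400.0, 52.3, 4.9, 253.1, 0),
                     (1046822460.0, 52.4, 5.0, 253.2, 0)], dtype=row_t)
    append_rows(meta, rows)

with h5py.File("monitoring.h5", "r") as f:
    print(f["calib_meta"]["det_temp"])   # [253.1 253.2]
```

Reading a single field of a compound dataset, as in the last line, touches only that column's bytes per chunk, which is what makes time-series plots of one house-keeping parameter fast even over millions of rows.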

  18. Overview of deep learning in medical imaging.

    PubMed

    Suzuki, Kenji

    2017-09-01

    The use of machine learning (ML) has been increasing rapidly in the medical imaging field, including computer-aided diagnosis (CAD), radiomics, and medical image analysis. Recently, an ML area called deep learning emerged in the computer vision field and became very popular in many fields. It started from an event in late 2012, when a deep-learning approach based on a convolutional neural network (CNN) won an overwhelming victory in the best-known worldwide computer vision competition, ImageNet Classification. Since then, researchers in virtually all fields, including medical imaging, have started actively participating in the explosively growing field of deep learning. In this paper, the area of deep learning in medical imaging is overviewed, including (1) what was changed in machine learning before and after the introduction of deep learning, (2) what is the source of the power of deep learning, (3) two major deep-learning models: a massive-training artificial neural network (MTANN) and a convolutional neural network (CNN), (4) similarities and differences between the two models, and (5) their applications to medical imaging. This review shows that ML with feature input (or feature-based ML) was dominant before the introduction of deep learning, and that the major and essential difference between ML before and after deep learning is the learning of image data directly without object segmentation or feature extraction; thus, it is the source of the power of deep learning, although the depth of the model is an important attribute. The class of ML with image input (or image-based ML) including deep learning has a long history, but recently gained popularity due to the use of the new terminology, deep learning. There are two major models in this class of ML in medical imaging, MTANN and CNN, which have similarities as well as several differences. 
In our experience, MTANNs were substantially more efficient to develop, achieved higher performance, and required fewer training cases than CNNs. "Deep learning", or ML with image input, in medical imaging is an explosively growing, promising field. It is expected that ML with image input will be the mainstream area in the field of medical imaging in the next few decades.

  19. MISR Regional VBBE Map Projection

    Atmospheric Science Data Center

    2013-03-26

    ... Imagery:   Overview  |  Products  |  Data Quality  | Map Projection |  File Format  |  View Data ... is needed if you are doing high precision work. The packages mentioned above (HDF-EOS library, GCTP, and IDL) all convert to and ...

  20. MISR Regional UAE2 Map Projection

    Atmospheric Science Data Center

    2013-03-26

    ... Imagery:  Overview  |  Products  |  Data Quality  | Map Projection |  File Format  |  View Data ... is needed if you are doing high precision work. The packages mentioned above (HDF-EOS library, GCTP, and IDL) all convert to and ...

  1. APPHi: Automated Photometry Pipeline for High Cadence Large Volume Data

    NASA Astrophysics Data System (ADS)

    Sánchez, E.; Castro, J.; Silva, J.; Hernández, J.; Reyes, M.; Hernández, B.; Alvarez, F.; García, T.

    2018-04-01

    APPHi (Automated Photometry Pipeline) carries out aperture and differential photometry of TAOS-II project data. It is computationally efficient and can also be used with other astronomical wide-field image data. APPHi works with large volumes of data and handles both FITS and HDF5 formats. Due to the large number of stars that the software has to handle in an enormous number of frames, it is optimized to automatically find the best parameter values for carrying out the photometry, such as the mask size for the aperture, the size of the window for extraction of a single star, and the count threshold for detecting a faint star. Although intended to work with TAOS-II data, APPHi can analyze any set of astronomical images and is a robust and versatile tool for performing stellar aperture and differential photometry.
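At its core, the aperture photometry that APPHi automates reduces to summing background-subtracted counts inside a circular mask. A minimal numpy sketch on a synthetic frame (the function and parameter names are illustrative, not the actual APPHi API):

```python
import numpy as np

def aperture_flux(cutout, r_ap, r_in, r_out):
    # Background-subtracted aperture flux for a star-centred cutout:
    # sum the counts inside radius r_ap and subtract the median sky
    # level estimated in the annulus r_in..r_out.
    h, w = cutout.shape
    y, x = np.mgrid[:h, :w]
    cy, cx = (h - 1) / 2, (w - 1) / 2
    d = np.hypot(y - cy, x - cx)
    aperture = d <= r_ap
    sky = cutout[(d >= r_in) & (d <= r_out)]
    return float(np.sum(cutout[aperture] - np.median(sky)))

# Synthetic 21x21 frame: flat sky of 10 counts plus a 500-count star
# concentrated in the central pixel.
frame = np.full((21, 21), 10.0)
frame[10, 10] += 500.0
flux = aperture_flux(frame, r_ap=3, r_in=6, r_out=9)
```

Differential photometry then follows by dividing the target's background-subtracted flux by that of a comparison star measured in the same frame.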

  2. The First Pan-Starrs Medium Deep Field Variable Star Catalog

    NASA Astrophysics Data System (ADS)

    Flewelling, Heather

    2013-01-01

    We present the first Pan-Starrs 1 Medium Deep Field Variable Star Catalog (PS1-MDF-VSC). The Pan-Starrs 1 (PS1) telescope is a 1.8 meter survey telescope with a 1.4 Gigapixel camera, and is located in Haleakala, Hawaii. The Medium Deep survey, which consists of 10 fields located uniformly across the sky, totalling 70 square degrees, is observed each night, in 2-3 filters per field, with 8 exposures per filter. We have located and classified several hundred periodic variable stars within the Medium Deep fields, and we present the first catalog listing the properties of these variable stars.

  3. Global transcriptomic analysis of model human cell lines exposed to surface-modified gold nanoparticles: the effect of surface chemistry

    NASA Astrophysics Data System (ADS)

    Grzincic, E. M.; Yang, J. A.; Drnevich, J.; Falagan-Lotsch, P.; Murphy, C. J.

    2015-01-01

    Gold nanoparticles (Au NPs) are attractive for biomedical applications not only for their remarkable physical properties, but also for the ease with which their surface chemistry can be manipulated. Many applications involve functionalization of the Au NP surface in order to improve biocompatibility, attach targeting ligands or carry drugs. However, changes in cells exposed to Au NPs of different surface chemistries have been observed, and little is known about how Au NPs and their surface coatings may impact cellular gene expression. The gene expression of two model human cell lines, human dermal fibroblasts (HDF) and prostate cancer cells (PC3), was interrogated by microarray analysis of over 14 000 human genes. The cell lines were exposed to four differently functionalized Au NPs: citrate, poly(allylamine hydrochloride) (PAH), and lipid coatings combined with alkanethiols or PAH. Gene functional annotation categories and weighted gene correlation network analysis were used to connect gene expression changes to common cellular functions and to elucidate expression patterns between Au NP samples. Coated Au NPs affect genes implicated in proliferation, angiogenesis, and metabolism in HDF cells, and in inflammation, angiogenesis, proliferation, apoptosis regulation, survival and invasion in PC3 cells. Subtle changes in surface chemistry, such as the initial net charge, lability of the ligand, and underlying layers, greatly influence the degree of expression change and the type of cellular pathway affected. Electronic supplementary information (ESI) available: UV-Vis spectra of Au NPs, the most significantly changed genes of HDF cells after Au NP incubation under GO accession number GO:0007049 ``cell cycle'', detailed information about the primer/probe sets used for RT-PCR validation of results. See DOI: 10.1039/c4nr05166a

  4. UAEMIAAE

    Atmospheric Science Data Center

    2013-12-19

    UAEMIAAE Aerosol product. ( File version details ) File version  F07_0015  has better ... properties. File version  F08_0016  has improved cloud screening procedure resulting in better aerosol optical depth. ... Coverage:  August - October 2004 File Format:  HDF-EOS Tools:  FTP Access: Data Pool ...

  5. FIRE_AX_ER2_MAS

    Atmospheric Science Data Center

    2015-11-24

    ... Transition Experiment (ASTEX) ER-2 MODIS Airborne Simulator (MAS) Data Project Title:  MAS ... Temporal Resolution:  Each granule contains one flight track File Format:  HDF Tools:  ... Additional Info:  NASA ER-2 MODIS Airborne Simulator (MAS) SCAR-B Block:  SCAR-B ...

  6. MISR Regional SAMUM Map Projection

    Atmospheric Science Data Center

    2017-03-29

    ... Regional Imagery:  Overview  |  Products  |  Data Quality  | Map Projection |  File Format  |  View Data  |  ... is needed if you are doing high precision work. The packages mentioned above (HDF-EOS library, GCTP, and IDL) all convert to and ...

  7. Swimming in Sculptor

    NASA Image and Video Library

    2016-03-07

    Peering deep into the early Universe, this picturesque parallel field observation from the NASA/ESA Hubble Space Telescope reveals thousands of colourful galaxies swimming in the inky blackness of space. A few foreground stars from our own galaxy, the Milky Way, are also visible. In October 2013 Hubble’s Wide Field Camera 3 (WFC3) and Advanced Camera for Surveys (ACS) began observing this portion of sky as part of the Frontier Fields programme. This spectacular skyscape was captured during the study of the giant galaxy cluster Abell 2744, otherwise known as Pandora’s Box. While one of Hubble’s cameras concentrated on Abell 2744, the other camera viewed this adjacent patch of sky near to the cluster. Containing countless galaxies of various ages, shapes and sizes, this parallel field observation is nearly as deep as the Hubble Ultra-Deep Field. In addition to showcasing the stunning beauty of the deep Universe in incredible detail, this parallel field — when compared to other deep fields — will help astronomers understand how similar the Universe looks in different directions

  8. Three dimensional amorphous silicon/microcrystalline silicon solar cells

    DOEpatents

    Kaschmitter, James L.

    1996-01-01

    Three dimensional deep contact amorphous silicon/microcrystalline silicon (a-Si/µc-Si) solar cells which use deep (high aspect ratio) p and n contacts to create high electric fields within the carrier collection volume material of the cell. The deep contacts are fabricated using repetitive pulsed laser doping. These contacts carry the electric field deep into the material, where the high field strength can collect many of the carriers, resulting in a high-efficiency solar cell.

  9. Three dimensional amorphous silicon/microcrystalline silicon solar cells

    DOEpatents

    Kaschmitter, J.L.

    1996-07-23

    Three dimensional deep contact amorphous silicon/microcrystalline silicon (a-Si/µc-Si) solar cells are disclosed which use deep (high aspect ratio) p and n contacts to create high electric fields within the carrier collection volume material of the cell. The deep contacts are fabricated using repetitive pulsed laser doping. These contacts carry the electric field deep into the material, where the high field strength can collect many of the carriers, resulting in a high-efficiency solar cell. 4 figs.

  10. Effects of conditioned medium from LL-37 treated adipose stem cells on human fibroblast migration.

    PubMed

    Yang, Eun-Jung; Bang, Sa-Ik

    2017-07-01

    Adipose stem cell-conditioned medium may promote human dermal fibroblast (HDF) proliferation and migration by activating paracrine peptides during the re-epithelization phase of wound healing. Human antimicrobial peptide LL-37 is upregulated in the skin epithelium as part of the normal response to injury. The effects of conditioned medium (CM) from LL-37-treated adipose stem cells (ASCs) on cutaneous wound healing, including the mediation of fibroblast migration, remain to be elucidated; therefore, the aim of the present study was to determine how ASCs react to an LL-37-rich microenvironment and whether CM from LL-37-treated ASCs influences the migration of HDFs. The present study conducted migration assays with HDFs treated with CM from LL-37-treated ASCs. Expression of CXC chemokine receptor 4 (CXCR4), which controls the recruitment of HDFs, was analyzed at the mRNA and protein levels. To further characterize the stimulatory effects of LL-37 on ASCs, the expression of stromal cell-derived factor-1α (SDF-1α), a CXC chemokine, was investigated. CM from LL-37-treated ASCs induced migration of HDFs in a time- and dose-dependent manner, with a maximum difference in migration observed 24 h following stimulation with LL-37 at a concentration of 10 µg/ml. HDF migration and the expression of CXCR4 in fibroblasts were markedly increased upon treatment with CM from LL-37-treated ASCs compared with CM from untreated ASCs. SDF-1α expression was markedly increased in CM from LL-37-treated ASCs. It was additionally observed that SDF-1α blockade significantly reduced HDF migration. These findings suggest the feasibility of CM from LL-37-treated ASCs as a potential therapeutic for promoting human dermal fibroblast migration.

  11. Dialysate bicarbonate variation in maintenance hemodiafiltration patients: Impact on serum bicarbonate, intradialytic hypotension and interdialytic weight gain.

    PubMed

    Viegas, Márcio; Cândido, Cristina; Felgueiras, Joana; Clemente, José; Barros, Sara; Farbota, Rostislav; Vera, Filipa; Matos, Antero; Sousa, Francisco

    2017-07-01

    The dialysate bicarbonate (DB) influences the acid-base balance in dialysis patients. Very low and high serum bicarbonate (SB) have been related with a higher mortality. Acid-base balance has also been associated with hemodynamic effects in these patients. The trial aim was to compare the effect of DB concentration variation on SB levels in maintenance hemodiafiltration (HDF) patients and the effect on intradialytic hypotension and interdialytic weight gain. A prospective study, with 9 months of follow-up, involving 93 patients, divided into two groups: group 1 and group 2 with a DB of 34 mmol/L and 30 mmol/L, respectively, with monitoring of pre- and post-HDF SB, intradialytic hypotension, and interdialytic weight gain. Pre-dialysis SB was higher in group 1: median concentration of 22.7 mmol/L vs. 21.1 mmol/L (P < 0.001). Post-dialysis SB levels were higher in group 1: median concentration of 28.0 mmol/L vs. 25.3 mmol/L (P < 0.001). Post-dialysis SB in the alkalotic range was only detected in group 1 (51.2% of the patients). No significant differences were detected in intradialytic hypotension rate [28.0 vs. 27.4 episodes per 1000 sessions in group 1 and 2, respectively, (P = 0.906)] or in average interdialytic weight gain [2.9% vs. 3.0% in group 1 and 2, respectively, (P = 0.710)]. DB of 30 mmol/L appears to be associated with SB levels closer to physiological levels than 34 mmol/L. The bicarbonate dialysate, in the tested concentrations, did not appear to have a significant impact on intradialytic hypotension and interdialytic weight gain in maintenance HDF patients. © 2016 International Society for Hemodialysis.

  12. Genetic and Molecular Characterization of the Caenorhabditis Elegans Spermatogenesis-Defective Gene Spe-17

    PubMed Central

    L'Hernault, S. W.; Benian, G. M.; Emmons, R. B.

    1993-01-01

    Two self-sterile mutations that define the spermatogenesis-defective gene spe-17 have been analyzed. These mutations affect unc-22 and fail to complement each other for both Unc-22 and spermatogenesis defects. Both of these mutations are deficiencies (hcDf1 and hDf13) that affect more than one transcription unit. Genomic DNA adjacent to and including the region deleted by the smaller deficiency (hcDf1) has been sequenced and four mRNAs (including unc-22) have been localized to this sequenced region. The three non-unc-22 mRNAs are shown to be sex-specific: a 1.2-kb mRNA that can be detected in sperm-free hermaphrodites and 1.2- and 0.56-kb mRNAs found in males. hDf13 deletes at least 55 kb of chromosome IV, including all of unc-22, both male-specific mRNAs and at least part of the female-specific mRNA. hcDf1, which is approximately 15.6 kb, deletes only the 5' end of unc-22 and the gene that encodes the 0.56-kb male-specific mRNA. The common defect that apparently accounts for the defective sperm in hcDf1 and hDf13 homozygotes is deletion of the spe-17 gene, which encodes the 0.56-kb mRNA. Strains carrying two copies of either deletion are self-fertile when they are transgenic for any of four extrachromosomal arrays that include spe-17. We have sequenced two spe-17 cDNAs, and the deduced 142 amino acid protein sequence is highly charged and rich in serine and threonine, but shows no significant homology to any previously determined protein sequence. PMID:8349108

  13. MISR Regional INTEX-B Map Projection

    Atmospheric Science Data Center

    2016-09-28

    ... Regional Imagery:  Overview  |  Products  |  Data Quality  | Map Projection |  File Format  |  View Data  |  ... is needed if you are doing high precision work. The packages mentioned above (HDF-EOS library, GCTP, and IDL) all convert to and ...

  14. MISR Regional GoMACCS Map Projection

    Atmospheric Science Data Center

    2017-03-29

    ... Regional Imagery:  Overview  |  Products  |  Data Quality  | Map Projection |  File Format  |  View Data  |  ... is needed if you are doing high precision work. The packages mentioned above (HDF-EOS library, GCTP, and IDL) all convert to and ...

  15. Odor-active constituents in fresh pineapple (Ananas comosus [L.] Merr.) by quantitative and sensory evaluation.

    PubMed

    Tokitomo, Yukiko; Steinhaus, Martin; Büttner, Andrea; Schieberle, Peter

    2005-07-01

    By application of aroma extract dilution analysis (AEDA) to an aroma distillate prepared from fresh pineapple using solvent-assisted flavor evaporation (SAFE), 29 odor-active compounds were detected in the flavor dilution (FD) factor range of 2 to 4,096. Quantitative measurements performed by stable isotope dilution assays (SIDA) and a calculation of odor activity values (OAVs) of 12 selected odorants revealed the following compounds as key odorants in fresh pineapple flavor: 4-hydroxy-2,5-dimethyl-3(2H)-furanone (HDF; sweet, pineapple-like, caramel-like), ethyl 2-methylpropanoate (fruity), ethyl 2-methylbutanoate (fruity) followed by methyl 2-methylbutanoate (fruity, apple-like) and (E,Z)-1,3,5-undecatriene (fresh, pineapple-like). A mixture of these 12 odorants in concentrations equal to those in the fresh pineapple resulted in an odor profile similar to that of the fresh juice. Furthermore, the results of omission tests using the model mixture showed that HDF and ethyl 2-methylbutanoate are character impact odorants in fresh pineapple.

  16. Boron nitride nanotubes enhance properties of chitosan-based scaffolds.

    PubMed

    Emanet, Melis; Kazanç, Emine; Çobandede, Zehra; Çulha, Mustafa

    2016-10-20

    With their low toxicity, high mechanical strength and chemical stability, boron nitride nanotubes (BNNTs) are good candidates to enhance the properties of polymers, composites and scaffolds. Chitosan-based scaffolds are exhaustively investigated in tissue engineering because of their biocompatibility and antimicrobial activity. However, their spontaneous degradation prevents their use in a range of tissue engineering applications. In this study, hydroxylated BNNTs (BNNT-OH) were included into a chitosan scaffold and tested for their mechanical strength, swelling behavior and biodegradability. The results show that inclusion of BNNTs-OH into the chitosan scaffold increases the mechanical strength and pore size at values optimal for high cellular proliferation and adhesion. The chitosan/BNNT-OH scaffold was also found to be non-toxic to Human Dermal Fibroblast (HDF) cells due to its slow degradation rate. HDF cell proliferation and adhesion were increased as compared to the chitosan-only scaffold as observed by scanning electron microscopy (SEM) and fluorescent microscopy images. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Gelatin/chondroitin sulfate nanofibrous scaffolds for stimulation of wound healing: In-vitro and in-vivo study.

    PubMed

    Pezeshki-Modaress, Mohamad; Mirzadeh, Hamid; Zandi, Mojgan; Rajabi-Zeleti, Sareh; Sodeifi, Niloofar; Aghdami, Nasser; Mofrad, Mohammad R K

    2017-07-01

    In this research, fabrication of gelatin/chondroitin sulfate (GAG) nanofibrous scaffolds using the electrospinning technique for skin tissue engineering was studied. The influence of GAG content on the chemical, physical, mechanical and biological properties of the scaffolds was investigated. Human dermal fibroblast (HDF) cells were cultured and the bioactivity of electrospun gelatin/GAG scaffolds for skin tissue engineering was assayed. Biological results illustrated that HDF cells attached and spread well on gelatin/GAG nanofibrous scaffolds, displaying spindle-like shapes and stretching. MTS assay was performed to evaluate cell proliferation on electrospun gelatin/GAG scaffolds. The results confirmed the influence of GAG content as well as the nanofibrous structure on cell proliferation and attachment of substrates. The gelatin/GAG nanofibrous scaffolds with the desired thickness for in-vivo evaluations were used on full-thickness wounds. Pathobiological results showed that cell-loaded gelatin/GAG scaffolds significantly accelerated wound healing. © 2017 Wiley Periodicals, Inc. J Biomed Mater Res Part A: 105A: 2020-2034, 2017.

  18. An HDF5-based framework for the distribution and analysis of ultrasonic concrete data

    NASA Astrophysics Data System (ADS)

    Prince, Luke; Clayton, Dwight; Santos-Villalobos, Hector

    2017-02-01

    There are many commercial ultrasonic tomography devices (UTDs) available for use in nondestructive evaluation (NDE) of reinforced concrete structures. These devices emit, measure, and store ultrasonic signals typically in the 25 kHz to 5 MHz frequency range. UTDs are characterized by a composition of multiple transducers, also known as a transducer array or phased array. Often, UTD data are in a proprietary format. Consequently, NDE research data is limited to those who have prior non-disclosure agreements or the appropriate licenses. Thus, there is a need for a universal data framework so that proprietary file datasets for different concrete specimens can be converted, organized, and stored with the relevant metadata for individual or collaborative NDE research. Building upon the Hierarchical Data Format (HDF5) model, we have developed a UTD data management framework and Graphical User Interface (GUI) to promote the algorithmic reconstruction of ultrasonic data in a controlled environment for easily reproducible and publishable results.
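The hierarchical organization such a framework builds on can be sketched with h5py: a group per specimen and scan, a waveform dataset, and acquisition metadata stored as attributes. All group, dataset and attribute names below are hypothetical, not the framework's actual schema:

```python
import os
import tempfile

import h5py
import numpy as np

# Hypothetical layout: one group per specimen and scan, a waveform
# dataset per scan, and acquisition metadata stored as HDF5 attributes
# alongside the data it describes.
path = os.path.join(tempfile.mkdtemp(), "utd_scans.h5")
with h5py.File(path, "w") as f:
    scan = f.create_group("specimen_A/scan_001")
    wf = scan.create_dataset("waveforms",
                             data=np.zeros((64, 2048), dtype="f4"))
    wf.attrs["sample_rate_hz"] = 10e6    # digitizer rate (invented value)
    wf.attrs["center_freq_hz"] = 50e3    # within the 25 kHz - 5 MHz band
    wf.attrs["transducer_count"] = 64    # phased-array elements

# Any HDF5-aware tool can now read the data and its metadata together.
with h5py.File(path, "r") as f:
    dset = f["specimen_A/scan_001/waveforms"]
    shape = dset.shape
    fc = float(dset.attrs["center_freq_hz"])
```

Because the file is self-describing, collaborators need no proprietary reader: the group path, dataset shape, and attributes travel with the data.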

  19. An HDF5-Based Framework for the Distribution and Analysis of Ultrasonic Concrete Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prince, Luke J; Clayton, Dwight A; Santos-Villalobos, Hector J

    There are many commercial ultrasonic tomography devices (UTDs) available for use in nondestructive evaluation (NDE) of reinforced concrete structures. These devices emit, measure, and store ultrasonic signals typically in the 25 kHz to 5 MHz frequency range. UTDs are characterized by a composition of multiple transducers, also known as a transducer array or phased array. Often, UTD data are in a proprietary format. Consequently, NDE research data is limited to those who have prior non-disclosure agreements or the appropriate licenses. Thus, there is a need for a universal data framework so that proprietary file datasets for different concrete specimens can be converted, organized, and stored with the relevant metadata for individual or collaborative NDE research. Building upon the Hierarchical Data Format (HDF5) model, we have developed a UTD data management framework and Graphical User Interface (GUI) to promote the algorithmic reconstruction of ultrasonic data in a controlled environment for easily reproducible and publishable results.

  20. Chitosan-hyaluronan/nano chondroitin sulfate ternary composite sponges for medical use.

    PubMed

    Anisha, B S; Sankar, Deepthi; Mohandas, Annapoorna; Chennazhi, K P; Nair, Shantikumar V; Jayakumar, R

    2013-02-15

    In this work chitosan-hyaluronan composite sponge incorporated with chondroitin sulfate nanoparticle (nCS) was developed. The fabrication of hydrogel was based on simple ionic cross-linking using EDC, followed by lyophilization to obtain the composite sponge. nCS suspension was characterized using DLS and SEM and showed a size range of 100-150 nm. The composite sponges were characterized using SEM, FT-IR and TG-DTA. Porosity, swelling, biodegradation, blood clotting and platelet activation of the prepared sponges were also evaluated. Nanocomposites showed a porosity of 67% and showed enhanced swelling and blood clotting ability. Cytocompatibility and cell adhesion studies of the sponges were done using human dermal fibroblast (HDF) cells and the nanocomposite sponges showed more than 90% viability. Nanocomposite sponges also showed enhanced proliferation of HDF cells within two days of study. These results indicated that this nanocomposite sponges would be a potential candidate for wound dressing. Copyright © 2012 Elsevier Ltd. All rights reserved.

  1. SAGE III-ISS

    Atmospheric Science Data Center

    2017-12-27

    SAGE III-ISS Data and Information Launched on February 19, 2017 on a SpaceX ... vertical profiles of the stratosphere and mesosphere. The data provided by SAGE III-ISS includes key components of atmospheric ... Additional Info:  Data Format: HDF4 or Big Endian/IEEE Binary SCAR-B Block:  ...

  2. The importance of considering competing treatment affecting prognosis in the evaluation of therapy in trials: the example of renal transplantation in hemodialysis trials.

    PubMed

    Hazelbag, C Marijn; Peters, Sanne A E; Blankestijn, Peter J; Bots, Michiel L; Canaud, Bernard; Davenport, Andrew; Grooteman, Muriel P C; Kircelli, Fatih; Locatelli, Francesco; Maduell, Francisco; Morena, Marion; Nubé, Menso J; Ok, Ercan; Torres, Ferran; Hoes, Arno W; Groenwold, Rolf H H

    2017-04-01

    During the follow-up in a randomized controlled trial (RCT), participants may receive additional (non-randomly allocated) treatment that affects the outcome. Typically such additional treatment is not taken into account in the evaluation of the results. Two pivotal trials of the effects of hemodiafiltration (HDF) versus hemodialysis (HD) on mortality in patients with end-stage renal disease reported differing results. We set out to evaluate to what extent methods to take other treatments (i.e. renal transplantation) into account may explain the difference in findings between RCTs. This is illustrated using a clinical example of two RCTs estimating the effect of HDF versus HD on mortality. Using individual patient data from the Estudio de Supervivencia de Hemodiafiltración On-Line (ESHOL; n = 902) and the Dutch CONvective TRAnsport STudy (CONTRAST; n = 714) trials, five methods for estimating the effect of HDF versus HD on all-cause mortality were compared: intention-to-treat (ITT) analysis (i.e. not taking renal transplantation into account), per-protocol exclusion (PPexcl; exclusion of patients who receive transplantation), per-protocol censoring (PPcens; censoring patients at the time of transplantation), transplantation-adjusted (TA) analysis, and an extension of the TA analysis (TAext) with additional adjustment for variables related to both the risk of receiving a transplant and the risk of an outcome (transplantation-outcome confounders). Cox proportional hazards models were applied. Unadjusted ITT analysis of all-cause mortality led to differing results between CONTRAST and ESHOL: hazard ratio (HR) 0.95 (95% CI 0.75-1.20) and HR 0.76 (95% CI 0.59-0.97), respectively, i.e. risk reductions of 5% and 24%. 
Similar differences between the two trials were observed for the other unadjusted analytical methods (PPcens, PPexcl, TA). The HRs of HDF versus HD treatment became more similar after adding transplantation as a time-varying covariate and including transplantation-outcome confounders: HR 0.89 (95% CI 0.69-1.13) in CONTRAST and HR 0.80 (95% CI 0.62-1.02) in ESHOL. The apparent differences in estimated treatment effects between the two dialysis trials were to a large extent attributable to differences in the methodology applied for taking renal transplantation into account in their final analyses. Our results exemplify the necessity of careful consideration of the treatment effect of interest when estimating the therapeutic effect in RCTs in which participants may receive additional treatments. © The Author 2017. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.

  3. Long-Range Effect of the M7.8 April 2015 Nepal Earthquake on Deep Groundwater Outflow in a Thousand-Mile-Away Geothermal Field in Southern China's Guangdong

    NASA Astrophysics Data System (ADS)

    Lu, G.; Yu, S.; Xu, F.; Wang, X.; Yan, K.; Yuen, D. A.

    2015-12-01

    Deep ground waters sustain high temperature and pressure and are susceptible to impact from an earthquake. How an earthquake can exert a long-range effect on the geological environment of deep groundwater is a question of interest to the scientific community and the general public. The massive Richter 8.1 Nepal Earthquake (on April 25, 2015) provided a rare opportunity to test the response of deep groundwater systems. Deep ground waters at elevated temperature naturally flow to the ground surface along preferential flow paths such as deep faults, forming geothermal water flows. Geothermal water flows are susceptible to stress variation and can reflect the physical conditions of supercritical hot water kilometers deep inside the crust. This paper introduces the monitoring work on the outflow in the Xijiang Geothermal Field of Xinyi City, Guangdong Province, in southern China. The geothermal field is typical of the deep-fault geothermal fields in Guangdong. The geothermal spring has a characteristic daily variation of up to 72% in flow rate, which results from its association with a north-south-running deep fault susceptible to earthquake events. We use year-long monitoring data to illustrate how the Nepal earthquake affected the flows at the field site over 2.5 thousand kilometers away. The irregularity of flow is judged by deviation from the otherwise good correlation of geothermal spring flow with solid-earth tidal waves. This work could potentially provide the basis for further study of deep groundwater systems and insight into earthquake prediction.
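The irregularity criterion, deviation from an otherwise good correlation between spring flow and the solid-earth tide, can be illustrated with a synthetic series (every number below is invented for illustration):

```python
import numpy as np

# Synthetic illustration: spring flow normally tracks the semidiurnal
# solid-earth tide, so a high Pearson correlation is the baseline; an
# earthquake-related disturbance shows up as a drop in that correlation.
t = np.arange(0, 30, 1 / 24)                    # 30 days, hourly samples
tide = np.sin(2 * np.pi * t / 0.5175)           # ~12.42 h tidal constituent
flow = 100 + 20 * tide                          # flow follows the tide
r_normal = np.corrcoef(flow, tide)[0, 1]        # essentially 1.0

rng = np.random.default_rng(0)
flow_disturbed = flow + rng.normal(0, 40, t.size)   # post-event irregularity
r_event = np.corrcoef(flow_disturbed, tide)[0, 1]   # correlation degrades
```

In practice the baseline correlation would be estimated from the monitoring record itself, and a sustained drop below it would flag an anomalous interval for closer inspection.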

  4. SEDS: The Spitzer Extended Deep Survey. Survey Design, Photometry, and Deep IRAC Source Counts

    NASA Technical Reports Server (NTRS)

    Ashby, M. L. N.; Willner, S. P.; Fazio, G. G.; Huang, J.-S.; Arendt, A.; Barmby, P.; Barro, G.; Bell, E. F.; Bouwens, R.; Cattaneo, A.; et al.

    2013-01-01

    The Spitzer Extended Deep Survey (SEDS) is a very deep infrared survey within five well-known extragalactic science fields: the UKIDSS Ultra-Deep Survey, the Extended Chandra Deep Field South, COSMOS, the Hubble Deep Field North, and the Extended Groth Strip. SEDS covers a total area of 1.46 deg² to a depth of 26 AB mag (3σ) in both of the warm Infrared Array Camera (IRAC) bands at 3.6 and 4.5 micron. Because of its uniform depth of coverage in so many widely-separated fields, SEDS is subject to roughly 25% smaller errors due to cosmic variance than a single-field survey of the same size. SEDS was designed to detect and characterize galaxies from intermediate to high redshifts (z = 2-7) with a built-in means of assessing the impact of cosmic variance on the individual fields. Because the full SEDS depth was accumulated in at least three separate visits to each field, typically with six-month intervals between visits, SEDS also furnishes an opportunity to assess the infrared variability of faint objects. This paper describes the SEDS survey design, processing, and publicly-available data products. Deep IRAC counts for the more than 300,000 galaxies detected by SEDS are consistent with models based on known galaxy populations. Discrete IRAC sources contribute 5.6 ± 1.0 and 4.4 ± 0.8 nW m⁻² sr⁻¹ at 3.6 and 4.5 micron to the diffuse cosmic infrared background (CIB). IRAC sources cannot contribute more than half of the total CIB flux estimated from DIRBE data. Barring an unexpected error in the DIRBE flux estimates, half the CIB flux must therefore come from a diffuse component.

  5. A Photometric redshift galaxy catalog from the Red-Sequence Cluster Survey

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hsieh, Bau-Ching (Natl. Central U., Taiwan; Inst. Astron. Astrophys., Taipei); Yee, H.K.C.

    2005-02-01

    The Red-Sequence Cluster Survey (RCS) provides a large and deep photometric catalog of galaxies in the z' and Rc bands for 90 square degrees of sky, and supplemental V and B data have been obtained for 33.6 deg². They compile a photometric redshift catalog from these 4-band data by utilizing the empirical quadratic polynomial photometric redshift fitting technique in combination with CNOC2 and GOODS/HDF-N redshift data. The training set includes 4924 spectroscopic redshifts. The resulting catalog contains more than one million galaxies with photometric redshifts < 1.5 and Rc < 24, giving an rms scatter δ(Δz)
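The empirical quadratic polynomial fitting technique mentioned above amounts to a least-squares fit of spectroscopic redshifts against polynomial terms in the observed colours. A toy numpy sketch under that reading (the colours and the redshift relation are synthetic; this is not the RCS pipeline):

```python
import numpy as np

# Toy sketch: fit spectroscopic redshifts with a least-squares
# quadratic polynomial in two colours, then apply the fit to get
# photometric redshifts.
rng = np.random.default_rng(1)
c1 = rng.uniform(0.0, 2.0, 500)                 # e.g. a B - V colour
c2 = rng.uniform(0.0, 2.0, 500)                 # e.g. an Rc - z' colour
z_spec = 0.2 + 0.3 * c1 + 0.1 * c2 + 0.05 * c1 ** 2   # invented relation

def design(c1, c2):
    # Constant, linear, and quadratic terms in the two colours.
    return np.column_stack([np.ones_like(c1), c1, c2,
                            c1 ** 2, c2 ** 2, c1 * c2])

coef, *_ = np.linalg.lstsq(design(c1, c2), z_spec, rcond=None)
z_phot = design(c1, c2) @ coef                  # photometric redshifts
rms = float(np.sqrt(np.mean((z_phot - z_spec) ** 2)))   # scatter in Δz
```

With real data the training set (here, the role played by the CNOC2 and GOODS/HDF-N redshifts) would be separate from the galaxies the fit is applied to, and the rms scatter would be quoted on a held-out validation sample.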

  6. SAGE III

    Atmospheric Science Data Center

    2017-10-27

    SAGE III Data and Information The Stratospheric Aerosol and Gas ... on the spacecraft. SAGE III produced L1 and L2 scientific data from 5/07/2002 until 12/31/2005. The second instrument is an ... Additional Info:  Data Format: HDF-EOS or Big Endian/IEEE Binary SCAR-B Block:  ...

  7. Unleashing Geophysics Data with Modern Formats and Services

    NASA Astrophysics Data System (ADS)

    Ip, Alex; Brodie, Ross C.; Druken, Kelsey; Bastrakova, Irina; Evans, Ben; Kemp, Carina; Richardson, Murray; Trenham, Claire; Wang, Jingbo; Wyborn, Lesley

    2016-04-01

    Geoscience Australia (GA) is the national steward of large volumes of geophysical data extending over the entire Australasian region and spanning many decades. The volume and variety of data which must be managed, coupled with the increasing need to support machine-to-machine data access, mean that the old "click-and-ship" model delivering data as downloadable files for local analysis is rapidly becoming unviable - a "big data" problem not unique to geophysics. The Australian Government, through the Research Data Services (RDS) Project, recently funded the Australian National Computational Infrastructure (NCI) to organize a wide range of Earth Systems data from diverse collections including geoscience, geophysics, environment, climate, weather, and water resources onto a single High Performance Data (HPD) Node. This platform, which now contains over 10 petabytes of data, is called the National Environmental Research Data Interoperability Platform (NERDIP), and is designed to facilitate broad user access, maximise reuse, and enable integration. GA has contributed several hundred terabytes of geophysical data to the NERDIP. Historically, geophysical datasets have been stored in a range of formats, with metadata of varying quality and accessibility, and without standardised vocabularies. This has made it extremely difficult to aggregate original data from multiple surveys (particularly un-gridded geophysics point/line data) into standard formats suited to High Performance Computing (HPC) environments. To address this, it was decided to use the NERDIP-preferred Hierarchical Data Format (HDF) 5, which is a proven, standard, open, self-describing and high-performance format supported by extensive software tools, libraries and data services. 
The Network Common Data Form (NetCDF) 4 API facilitates the use of data in HDF5, whilst the NetCDF Climate & Forecasting conventions (NetCDF-CF) further constrain NetCDF4/HDF5 data so as to provide greater inherent interoperability. The first geophysical data collection selected for transformation by GA was Airborne ElectroMagnetics (AEM) data which was held in proprietary-format files, with associated ISO 19115 metadata held in a separate relational database. Existing NetCDF-CF metadata profiles were enhanced to cover AEM and other geophysical data types, and work is underway to formalise the new geophysics vocabulary as a proposed extension to the Climate & Forecasting conventions. The richness and flexibility of HDF5's internal indexing mechanisms have allowed lossless restructuring of the AEM data for efficient storage, subsetting and access via either the NetCDF4/HDF5 APIs or Open-source Project for a Network Data Access Protocol (OPeNDAP) data services. This approach not only supports large-scale HPC processing, but also interactive access to a wide range of geophysical data in user-friendly environments such as iPython notebooks and more sophisticated cloud-enabled portals such as the Virtual Geophysics Laboratory (VGL). As multidimensional AEM datasets are relatively complex compared to other geophysical data types, the general approach employed in this project for modernizing AEM data is likely to be applicable to other geophysics data types. When combined with the use of standards-based data services and APIs, a coordinated, systematic modernisation will result in vastly improved accessibility to, and usability of, geophysical data in a wide range of computational environments both within and beyond the geophysics community.
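HDF5's chunked, compressed storage is what makes the restructuring-for-subsetting described above pay off. A minimal h5py sketch (dataset name, chunk shape, and attribute are illustrative, not GA's actual AEM schema):

```python
import os
import tempfile

import h5py
import numpy as np

# Illustrative line-oriented survey data: 100 flight lines x 5000 samples.
data = np.arange(100 * 5000, dtype=np.float32).reshape(100, 5000)

path = os.path.join(tempfile.mkdtemp(), "aem_survey.h5")
with h5py.File(path, "w") as f:
    # Chunk by line so a single-line subset touches exactly one chunk,
    # and compress losslessly with gzip.
    dset = f.create_dataset(
        "conductivity", data=data, chunks=(1, 5000),
        compression="gzip", compression_opts=4,
    )
    dset.attrs["units"] = "S/m"  # CF-style attribute (illustrative)

with h5py.File(path, "r") as f:
    line42 = f["conductivity"][42, :]  # reads and decompresses one chunk

print(line42[:3])
```

The same file is readable through the NetCDF4 API or served via OPeNDAP, which is the interoperability argument the abstract makes.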

  8. GES DAAC HDF Data Processing and Visualization Tools

    NASA Astrophysics Data System (ADS)

    Ouzounov, D.; Cho, S.; Johnson, J.; Li, J.; Liu, Z.; Lu, L.; Pollack, N.; Qin, J.; Savtchenko, A.; Teng, B.

    2002-12-01

    The Goddard Earth Sciences (GES) Distributed Active Archive Center (DAAC) plays a major role in enabling basic scientific research and providing access to scientific data to the general user community. Several GES DAAC Data Support Teams provide expert assistance to users in accessing data, including information on visualization tools and documentation for data products. To provide easy access to the science data, the data support teams have additionally developed many online and desktop tools for data processing and visualization. This presentation is an overview of major HDF tools implemented at the GES DAAC and aimed at optimizing access to EOS data for the Earth Sciences community. GES DAAC ONLINE TOOLS: The MODIS and AIRS on-demand Channel/Variable Subsetters are web-based, on-the-fly/on-demand subsetters that perform channel/variable subsetting and restructuring for Level 1B and Level 2 data products. Users can specify criteria to subset data files with desired channels and variables and then download the subsetted file. AIRS QuickLook is a CGI/IDL combo package that allows users to view AIRS/HSB/AMSU Level-1B data online by specifying a channel prior to obtaining data. A global map is also provided along with the image to show geographic coverage of the granule and flight direction of the spacecraft. OASIS (Online data AnalySIS) is an IDL-based HTML/CGI interface for search, selection, and simple analysis of earth science data. It supports binary and GRIB formatted data, such as TOVS, Data Assimilation products, and some NCEP operational products. TRMM Online Analysis System is designed for quick exploration, analyses, and visualization of TRMM Level-3 and other precipitation products. The products consist of the daily (3B42), monthly (3B43), near-real-time (3B42RT), and Willmott's climate data.
The system is also designed to be simple and easy to use - users can plot the average or accumulated rainfall over their region of interest for a given time period, or plot the time series of regional rainfall average. WebGIS is web-based software that implements the Open GIS Consortium (OGC) standards for mapping requests and rendering. It allows users access to TRMM, MODIS, SeaWiFS, and AVHRR data from several DAAC map servers, as well as externally served data such as political boundaries, population centers, lakes, rivers, and elevation. GES DAAC DESKTOP TOOLS: HDFLook-MODIS is a new, multifunctional, data processing and visualization tool for Radiometric and Geolocation, Atmosphere, Ocean, and Land MODIS HDF-EOS data. Features include (1) accessing and visualization of all swath (Levels 1 and 2) MODIS and AIRS products, and gridded (Levels 3 and 4) MODIS products; (2) re-mapping of swath data to a world map; (3) geo-projection conversion; (4) interactive and batch mode capabilities; (5) subsetting and multi-granule processing; and (6) data conversion. SIMAP is an IDL-based script that is designed to read and map MODIS Level 1B (L1B) and Level 2 (L2) Ocean and Atmosphere products. It is a non-interactive, command line executed tool. The resulting maps are scaled to physical units (e.g., radiances, concentrations, brightness temperatures) and saved in binary files. TRMM HDF (in C and Fortran) reads in TRMM HDF data files and writes out user-selected SDS arrays and Vdata tables as separate flat binary files.
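At its core, the channel subsetting these tools perform reduces to selecting a slice of channels from a granule's array and writing out the smaller result. An illustrative NumPy sketch (the granule shape and channel list are invented, not an actual MODIS/AIRS layout):

```python
import numpy as np

# Illustrative Level-1B-like granule: scanlines x pixels x channels.
granule = np.random.default_rng(3).normal(size=(135, 90, 36))

# User-requested channel subset (0-based indices, invented).
channels = [0, 1, 2, 19, 25, 30]

# Fancy indexing along the channel axis produces the subsetted granule,
# which a subsetter would then restructure and return for download.
subset = granule[:, :, channels]

print(subset.shape)
```

The real services also carry the matching metadata (channel wavelengths, scale factors) into the subsetted file, which is the part that keeps the output self-describing.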

  9. Deepest X-Rays Ever Reveal universe Teeming With Black Holes

    NASA Astrophysics Data System (ADS)

    2001-03-01

    For the first time, astronomers believe they have proof black holes of all sizes once ruled the universe. NASA's Chandra X-ray Observatory provided the deepest X-ray images ever recorded, and those pictures deliver a novel look at the past 12 billion years of black holes. Two independent teams of astronomers today presented images that contain the faintest X-ray sources ever detected, which include an abundance of active supermassive black holes. "The Chandra data show us that giant black holes were much more active in the past than at present," said Riccardo Giacconi, of Johns Hopkins University and Associated Universities, Inc., Washington, DC. The exposure is known as "Chandra Deep Field South" since it is located in the Southern Hemisphere constellation of Fornax. "In this million-second image, we also detect relatively faint X-ray emission from galaxies, groups, and clusters of galaxies". The images, known as Chandra Deep Fields, were obtained during many long exposures over the course of more than a year. Data from the Chandra Deep Field South will be placed in a public archive for scientists beginning today. "For the first time, we are able to use X-rays to look back to a time when normal galaxies were several billion years younger," said Ann Hornschemeier, Pennsylvania State University, University Park. The group's 500,000-second exposure included the Hubble Deep Field North, allowing scientists the opportunity to combine the power of Chandra and the Hubble Space Telescope, two of NASA's Great Observatories. The Penn State team recently acquired an additional 500,000 seconds of data, creating another one-million-second Chandra Deep Field, located in the constellation of Ursa Major.
Chandra Deep Field North/Hubble Deep Field North Press Image and Caption The images are called Chandra Deep Fields because they are comparable to the famous Hubble Deep Field in being able to see further and fainter objects than any image of the universe taken at X-ray wavelengths. Both Chandra Deep Fields are comparable in observation time to the Hubble Deep Fields, but cover a much larger area of the sky. "In essence, it is like seeing galaxies similar to our own Milky Way at much earlier times in their lives," Hornschemeier added. "These data will help scientists better understand star formation and how stellar-sized black holes evolve." Combining infrared and X-ray observations, the Penn State team also found veils of dust and gas are common around young black holes. Another discovery to emerge from the Chandra Deep Field South is the detection of an extremely distant X-ray quasar, shrouded in gas and dust. "The discovery of this object, some 12 billion light years away, is key to understanding how dense clouds of gas form galaxies, with massive black holes at their centers," said Colin Norman of Johns Hopkins University. The Chandra Deep Field South results were complemented by the extensive use of deep optical observations supplied by the Very Large Telescope of the European Southern Observatory in Garching, Germany. The Penn State team obtained optical spectroscopy and imaging using the Hobby-Eberly Telescope in Ft. Davis, TX, and the Keck Observatory atop Mauna Kea, HI. Chandra's Advanced CCD Imaging Spectrometer was developed for NASA by Penn State and Massachusetts Institute of Technology under the leadership of Penn State Professor Gordon Garmire. NASA's Marshall Space Flight Center, Huntsville, AL, manages the Chandra program for the Office of Space Science, Washington, DC. TRW, Inc., Redondo Beach, California, is the prime contractor for the spacecraft. The Smithsonian's Chandra X-ray Center controls science and flight operations from Cambridge, MA. 
More information is available on the Internet at: http://chandra.harvard.edu AND http://chandra.nasa.gov

  10. FLASH_SSF_Aqua-FM3-MODIS_Version3C

    Atmospheric Science Data Center

    2018-04-04

    ... Tool:  CERES Order Tool  (netCDF) Subset Data:  CERES Search and Subset Tool (HDF4 & netCDF) ... Cloud Layer Area Cloud Infrared Emissivity Cloud Base Pressure Surface (Radiative) Flux TOA Flux Surface Types TOT ... Radiance SW Filtered Radiance LW Flux Order Data:  Earthdata Search:  Order Data Guide Documents:  ...

  11. FLASH_SSF_Terra-FM1-MODIS_Version3C

    Atmospheric Science Data Center

    2018-04-04

    ... Tool:  CERES Order Tool  (netCDF) Subset Data:  CERES Search and Subset Tool (HDF4 & netCDF) ... Cloud Layer Area Cloud Infrared Emissivity Cloud Base Pressure Surface (Radiative) Flux TOA Flux Surface Types TOT ... Radiance SW Filtered Radiance LW Flux Order Data:  Earthdata Search:  Order Data Guide Documents:  ...

  12. Electronic Warfare M-on-N Digital Simulation Logging Requirements and HDF5: A Preliminary Analysis

    DTIC Science & Technology

    2017-04-12

    LOGGING STREAM The goal of this report is to investigate logging of EW simulations not at the level of implementation in a database management ...differences of the logging stream and relational models.  A hierarchical navigation query style appears very natural for our application. Yet the

  13. Progress on H5Part: A Portable High Performance Parallel DataInterface for Electromagnetics Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adelmann, Andreas; Gsell, Achim; Oswald, Benedikt

    Significant problems facing all experimental and computational sciences arise from growing data size and complexity. Common to all these problems is the need to perform efficient data I/O on diverse computer architectures. In our scientific application, the largest parallel particle simulations generate vast quantities of six-dimensional data. Such a simulation run produces data for an aggregate data size up to several TB per run. Motivated by the need to address data I/O and access challenges, we have implemented H5Part, an open source data I/O API that simplifies the use of the Hierarchical Data Format v5 library (HDF5). HDF5 is an industry standard for high performance, cross-platform data storage and retrieval that runs on all contemporary architectures, from large parallel supercomputers to laptops. H5Part, which is oriented to the needs of the particle physics and cosmology communities, provides support for parallel storage and retrieval of particles, structured and, in the future, unstructured meshes. In this paper, we describe recent work focusing on I/O support for particles and structured meshes and provide data showing performance on modern supercomputer architectures like the IBM POWER 5.
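H5Part files are plain HDF5 underneath: one group per timestep, one 1-D dataset per particle attribute. A sketch of that layout using h5py (the `Step#N` group and attribute names follow the H5Part convention as commonly described; treat the details as illustrative, and note H5Part itself is a C API over parallel HDF5):

```python
import os
import tempfile

import h5py
import numpy as np

path = os.path.join(tempfile.mkdtemp(), "particles.h5")
n = 1000
rng = np.random.default_rng(1)

with h5py.File(path, "w") as f:
    for step in range(3):
        # One group per timestep, one 1-D dataset per particle attribute.
        g = f.create_group(f"Step#{step}")
        for name in ("x", "y", "z", "px", "py", "pz"):
            g.create_dataset(name, data=rng.normal(size=n))

with h5py.File(path, "r") as f:
    nsteps = len(f)            # number of timestep groups at the root
    x0 = f["Step#0/x"][:]      # positions at the first step

print(nsteps, x0.shape)
```

Keeping the layout this simple is what lets generic HDF5 tools (h5dump, VisIt, h5py) read H5Part output without the H5Part library.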

  14. Taming parallel I/O complexity with auto-tuning

    DOE PAGES

    Behzad, Babak; Luu, Huong Vu Thanh; Huchette, Joseph; ...

    2013-11-17

    We present an auto-tuning system for optimizing I/O performance of HDF5 applications and demonstrate its value across platforms, applications, and at scale. The system uses a genetic algorithm to search a large space of tunable parameters and to identify effective settings at all layers of the parallel I/O stack. The parameter settings are applied transparently by the auto-tuning system via dynamically intercepted HDF5 calls. To validate our auto-tuning system, we applied it to three I/O benchmarks (VPIC, VORPAL, and GCRM) that replicate the I/O activity of their respective applications. We tested the system with different weak-scaling configurations (128, 2048, and 4096 CPU cores) that generate 30 GB to 1 TB of data, and executed these configurations on diverse HPC platforms (Cray XE6, IBM BG/P, and Dell Cluster). In all cases, the auto-tuning framework identified tunable parameters that substantially improved write performance over default system settings. In conclusion, we consistently demonstrate I/O write speedups between 2x and 100x for test configurations.
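The genetic-algorithm search over the I/O parameter space can be sketched in a few lines. The parameter names and the mock cost function below are illustrative stand-ins for real Lustre/MPI-IO/HDF5 tunables and actually measured write times:

```python
import random

random.seed(42)

# Illustrative tunable-parameter space (stand-ins for stripe count,
# stripe size, collective-buffer nodes, HDF5 alignment, ...).
SPACE = {
    "stripe_count": [4, 8, 16, 32, 64],
    "stripe_size_mb": [1, 4, 16, 64],
    "cb_nodes": [2, 4, 8, 16],
}

def cost(cfg):
    # Mock "measured write time"; a real tuner runs the benchmark here.
    return (abs(cfg["stripe_count"] - 32) + abs(cfg["stripe_size_mb"] - 16)
            + abs(cfg["cb_nodes"] - 8))

def random_cfg():
    return {k: random.choice(v) for k, v in SPACE.items()}

def mutate(cfg):
    child = dict(cfg)
    k = random.choice(list(SPACE))
    child[k] = random.choice(SPACE[k])
    return child

# Simple (mu + lambda)-style loop: keep the best half, mutate to refill.
pop = [random_cfg() for _ in range(8)]
for _ in range(20):
    pop.sort(key=cost)
    pop = pop[:4] + [mutate(random.choice(pop[:4])) for _ in range(4)]

best = min(pop, key=cost)
print(best, cost(best))
```

The paper's system additionally applies the winning settings transparently by intercepting the application's HDF5 calls, so no source changes are needed.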

  15. Viral Organization of Human Proteins

    PubMed Central

    Wuchty, Stefan; Siwo, Geoffrey; Ferdig, Michael T.

    2010-01-01

    Although maps of intracellular interactions are increasingly well characterized, little is known about large-scale maps of host-pathogen protein interactions. The investigation of host-pathogen interactions can reveal features of pathogenesis and provide a foundation for the development of drugs and disease prevention strategies. A compilation of experimentally verified interactions between HIV-1 and human proteins and a set of HIV-dependency factors (HDF) allowed insights into the topology and intricate interplay between viral and host proteins on a large scale. We found that targeted and HDF proteins appear predominantly in rich-clubs, groups of human proteins that are strongly intertwined among each other. These assemblies of proteins may serve as an infection gateway, allowing the virus to take control of the human host by reaching protein pathways and diversified cellular functions in a pronounced and focused way. Particular transcription factors and protein kinases facilitate indirect interactions between HDFs and viral proteins. Discerning the entanglement of directly targeted and indirectly interacting proteins may uncover molecular and functional sites that can provide novel perspectives on the progression of HIV infection and highlight new avenues to fight this virus. PMID:20827298
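The rich-club structure mentioned above is commonly quantified with the rich-club coefficient φ(k): the density of links among nodes of degree greater than k. A small NetworkX illustration on a toy graph (not the HIV-human interaction network):

```python
import networkx as nx

# Toy network: a densely interconnected "club" of hubs (a 5-clique)
# plus low-degree peripheral nodes attached to the hubs.
G = nx.complete_graph(5)                  # hubs 0..4
for i, leaf in enumerate(range(5, 15)):
    G.add_edge(i % 5, leaf)               # two leaves per hub

# phi(k) = 2 * E_k / (N_k * (N_k - 1)), over nodes with degree > k.
rc = nx.rich_club_coefficient(G, normalized=False)
print(rc)
```

Here φ(1) = 1.0 because the hubs form a clique; on real networks one normalizes against degree-preserving random rewirings (the `normalized=True` mode) before claiming a rich club.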

  16. [Deep learning and neuronal networks in ophthalmology : Applications in the field of optical coherence tomography].

    PubMed

    Treder, M; Eter, N

    2018-04-19

    Deep learning is increasingly becoming the focus of various imaging methods in medicine. Due to the large number of different imaging modalities, ophthalmology is particularly suitable for this field of application. This article gives a general overview on the topic of deep learning and its current applications in the field of optical coherence tomography. For the benefit of the reader it focuses on the clinical rather than the technical aspects.

  17. Deep learning for computational chemistry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goh, Garrett B.; Hodas, Nathan O.; Vishnu, Abhinav

    The rise and fall of artificial neural networks is well documented in the scientific literature of both the fields of computer science and computational chemistry. Yet almost two decades later, we are now seeing a resurgence of interest in deep learning, a machine learning algorithm based on “deep” neural networks. Within the last few years, we have seen the transformative impact of deep learning in the computer science domain, notably in speech recognition and computer vision, to the extent that the majority of practitioners in those fields are now regularly eschewing prior established models in favor of deep learning models. In this review, we provide an introductory overview into the theory of deep neural networks and their unique properties as compared to traditional machine learning algorithms used in cheminformatics. By providing an overview of the variety of emerging applications of deep neural networks, we highlight their ubiquity and broad applicability to a wide range of challenges in the field, including QSAR, virtual screening, protein structure modeling, QM calculations, materials synthesis and property prediction. In reviewing the performance of deep neural networks, we observed consistent outperformance of non-neural-network state-of-the-art models across disparate research topics, and deep neural network based models often exceeded the “glass ceiling” expectations of their respective tasks. Coupled with the maturity of GPU-accelerated computing for training deep neural networks and the exponential growth of chemical data on which to train these networks, we anticipate that deep learning algorithms will be a useful tool and may grow into a pivotal role for various challenges in the computational chemistry field.

  18. Improving Seismic Data Accessibility and Performance Using HDF Containers

    NASA Astrophysics Data System (ADS)

    Evans, B. J. K.; Wang, J.; Yang, R.

    2017-12-01

    The performance of computational geophysical data processing and forward modelling relies on both computation and data. Significant efforts on developing new data formats and libraries have been made by the community, such as IRIS/PASSCAL and ASDF for data, and programs and utilities such as ObsPy and SPECFEM. The National Computational Infrastructure hosts a nationally significant geophysical data collection that is co-located with a high performance computing facility, providing an opportunity to investigate how to improve the data formats from both a data management and a performance point of view. This paper investigates how to enhance data usability from several perspectives: 1) propose a convention for the seismic (both active and passive) community to improve data accessibility and interoperability; 2) recommend the convention be used in the HDF container when data is made available in PH5 or ASDF formats; 3) provide tools to convert between various seismic data formats; 4) provide performance benchmark cases using the ObsPy library and SPECFEM3D to demonstrate how different data organization, in terms of chunk size and compression, impacts performance, by comparing new data formats such as PH5 and ASDF to traditional formats such as SEGY, SEED, SAC, etc. In this work we apply our knowledge and experience of data standards and conventions, such as CF and ACDD from the climate community, to the seismology community. The generic global attributes widely used in the climate community are combined with the existing conventions in the seismology community, such as CMT, QuakeML, StationXML, and the SEGY header convention. We also extend the convention by including provenance and benchmarking records so that the user can learn the footprint of the data together with its baseline performance. In practice we convert example wide-angle reflection seismic data from SEGY to PH5 or ASDF by using the ObsPy and pyasdf libraries.
This quantitatively demonstrates how accessibility can be improved when the seismic data are stored in the HDF container.
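Point 4 above - the effect of chunking and compression on access - can be illustrated directly with h5py: storing one trace per chunk means a single-trace read touches exactly one chunk. Names, shapes, and the group path are illustrative, not the PH5 or ASDF schemas:

```python
import os
import tempfile

import h5py
import numpy as np

nt, ns = 200, 4096  # traces x samples (illustrative sizes)
traces = np.random.default_rng(2).normal(size=(nt, ns)).astype(np.float32)

path = os.path.join(tempfile.mkdtemp(), "shots.h5")
with h5py.File(path, "w") as f:
    # One chunk per trace: a single-trace read decompresses one chunk,
    # instead of scanning a monolithic SEGY file.
    f.create_dataset("Waveforms/gather0", data=traces,
                     chunks=(1, ns), compression="gzip")

with h5py.File(path, "r") as f:
    tr = f["Waveforms/gather0"][17]   # reads exactly one chunk

print(tr.shape)
```

Benchmarking the same read against chunk shapes like (nt, 1) is a quick way to reproduce the kind of access-pattern comparison the abstract describes.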

  19. SCORPIO: A Scalable Two-Phase Parallel I/O Library With Application To A Large Scale Subsurface Simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sreepathi, Sarat; Sripathi, Vamsi; Mills, Richard T

    2013-01-01

    Inefficient parallel I/O is known to be a major bottleneck among scientific applications employed on supercomputers as the number of processor cores grows into the thousands. Our prior experience indicated that parallel I/O libraries such as HDF5 that rely on MPI-IO do not scale well beyond 10K processor cores, especially on parallel file systems (like Lustre) with a single point of resource contention. Our previous optimization efforts for a massively parallel multi-phase and multi-component subsurface simulator (PFLOTRAN) led to a two-phase I/O approach at the application level where a set of designated processes participate in the I/O process by splitting the I/O operation into a communication phase and a disk I/O phase. The designated I/O processes are created by splitting the MPI global communicator into multiple sub-communicators. The root process in each sub-communicator is responsible for performing the I/O operations for the entire group and then distributing the data to the rest of the group. This approach resulted in over 25X speedup in HDF I/O read performance and 3X speedup in write performance for PFLOTRAN at over 100K processor cores on the ORNL Jaguar supercomputer. This research describes the design and development of a general purpose parallel I/O library, SCORPIO (SCalable block-ORiented Parallel I/O) that incorporates our optimized two-phase I/O approach. The library provides a simplified higher level abstraction to the user, sitting atop existing parallel I/O libraries (such as HDF5) and implements optimized I/O access patterns that can scale on larger numbers of processors. Performance results with standard benchmark problems and PFLOTRAN indicate that our library is able to maintain the same speedups as before with the added flexibility of being applicable to a wider range of I/O intensive applications.
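The sub-communicator layout described above can be sketched without MPI: each rank gets a colour (its group), and the group's root aggregates the members' buffers before issuing a single write. A pure-Python simulation of that grouping logic (on a real run, mpi4py's `comm.Split(color, key)` and a gather would replace the dict bookkeeping):

```python
def two_phase_layout(nranks, group_size):
    """Assign each rank to an I/O group; rank color*group_size is the root."""
    layout = {}
    for rank in range(nranks):
        color = rank // group_size    # sub-communicator id
        root = color * group_size     # designated I/O process for the group
        layout[rank] = (color, root)
    return layout

def simulate_write(data_per_rank, group_size):
    """Phase 1: 'gather' each group's data at its root.
    Phase 2: each root performs one disk write (returned here as a dict)."""
    layout = two_phase_layout(len(data_per_rank), group_size)
    writes = {}
    for rank, chunk in enumerate(data_per_rank):
        _, root = layout[rank]
        writes.setdefault(root, []).extend(chunk)
    return writes

# 8 ranks, groups of 4 -> only ranks 0 and 4 touch the file system.
data = [[r] * 2 for r in range(8)]
writes = simulate_write(data, group_size=4)
print(sorted(writes))
```

Concentrating I/O on a few roots is what relieves the single point of contention on Lustre at large core counts.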

  20. SciSpark's SRDD : A Scientific Resilient Distributed Dataset for Multidimensional Data

    NASA Astrophysics Data System (ADS)

    Palamuttam, R. S.; Wilson, B. D.; Mogrovejo, R. M.; Whitehall, K. D.; Mattmann, C. A.; McGibbney, L. J.; Ramirez, P.

    2015-12-01

    Remote sensing data and climate model output are multi-dimensional arrays of massive sizes locked away in heterogeneous file formats (HDF5/4, NetCDF 3/4) and metadata models (HDF-EOS, CF), making it difficult to perform multi-stage, iterative science processing since each stage requires writing and reading data to and from disk. We have developed SciSpark, a robust Big Data framework, that extends Apache Spark for scaling scientific computations. Apache Spark improves the map-reduce implementation in Apache Hadoop for parallel computing on a cluster, by emphasizing in-memory computation, "spilling" to disk only as needed, and relying on lazy evaluation. Central to Spark is the Resilient Distributed Dataset (RDD), an in-memory distributed data structure that extends the functional paradigm provided by the Scala programming language. However, RDDs are ideal for tabular or unstructured data, and not for highly dimensional data. The SciSpark project introduces the Scientific Resilient Distributed Dataset (sRDD), a distributed-computing array structure which supports iterative scientific algorithms for multidimensional data. SciSpark processes data stored in NetCDF and HDF files by partitioning them across time or space and distributing the partitions among a cluster of compute nodes. We show usability and extensibility of SciSpark by implementing distributed algorithms for geospatial operations on large collections of multi-dimensional grids. In particular we address the problem of scaling an automated method for finding Mesoscale Convective Complexes. SciSpark provides a tensor interface to support the pluggability of different matrix libraries. We evaluate performance of the various matrix libraries in distributed pipelines, such as Nd4j and Breeze. We detail the architecture and design of SciSpark, our efforts to integrate climate science algorithms, and parallel ingest and partitioning (sharding) of A-Train satellite observations from model grids.
These solutions are encompassed in SciSpark, an open-source software framework for distributed computing on scientific data.
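The sRDD's partitioning of a gridded variable across time can be sketched with NumPy: shard a (time, lat, lon) cube into per-time-range partitions, apply the same grid operation on each, then reduce. Shapes are illustrative, and the list comprehension stands in for the cluster's parallel map:

```python
import numpy as np

# Illustrative (time, lat, lon) cube, e.g. monthly fields for a year.
cube = np.arange(12 * 4 * 5, dtype=float).reshape(12, 4, 5)

# Shard along the time axis into 4 partitions (what an sRDD would
# scatter across compute nodes).
partitions = np.array_split(cube, 4, axis=0)

# Each "node" applies the same grid operation to its partition ...
partial_means = [p.mean(axis=0) for p in partitions]

# ... and a reduce step combines them, weighting by partition length.
weights = [p.shape[0] for p in partitions]
combined = np.average(partial_means, axis=0, weights=weights)

assert np.allclose(combined, cube.mean(axis=0))
print(len(partitions), combined.shape)
```

The point of the sRDD is that these partitions live in memory across iterations, so iterative algorithms avoid the write-read-to-disk cycle between stages.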

  1. Data Quality Screening Service

    NASA Technical Reports Server (NTRS)

    Strub, Richard; Lynnes, Christopher; Hearty, Thomas; Won, Young-In; Fox, Peter; Zednik, Stephan

    2013-01-01

    A report describes the Data Quality Screening Service (DQSS), which is designed to help automate the filtering of remote sensing data on behalf of science users. Whereas this process often involves much research through quality documents followed by laborious coding, the DQSS is a Web Service that provides data users with data pre-filtered to their particular criteria, while at the same time guiding the user with filtering recommendations of the cognizant data experts. The DQSS design is based on a formal semantic Web ontology that describes data fields and the quality fields for applying quality control within a data product. The accompanying code base handles several remote sensing datasets and quality control schemes for data products stored in Hierarchical Data Format (HDF), a common format for NASA remote sensing data. Together, the ontology and code support a variety of quality control schemes through the implementation of the Boolean expression with simple, reusable conditional expressions as operands. Additional datasets are added to the DQSS simply by registering instances in the ontology if they follow a quality scheme that is already modeled in the ontology. New quality schemes are added by extending the ontology and adding code for each new scheme.
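The screening step DQSS automates - a Boolean expression over a product's quality fields applied to its science fields - reduces to a mask. A minimal NumPy sketch (the field names, quality scheme, and thresholds are invented for illustration; DQSS drives this from an ontology rather than hard-coded expressions):

```python
import numpy as np

# Illustrative science field and its quality fields.
temperature = np.array([290.1, 291.4, 288.0, 295.2, 289.9])
qc_flag     = np.array([0, 1, 0, 2, 0])        # 0 = best quality (invented)
cloud_frac  = np.array([0.1, 0.0, 0.9, 0.2, 0.3])

# User criteria as simple, reusable conditional expressions,
# combined with a Boolean expression (here: AND).
good = (qc_flag == 0) & (cloud_frac < 0.5)

# Screened product: pixels failing the criteria become a fill value.
screened = np.where(good, temperature, np.nan)
print(screened)
```

Registering a new dataset then amounts to mapping its quality fields onto such expressions, which is the role the ontology plays in the report.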

  2. Deep Extragalactic VIsible Legacy Survey (DEVILS): Motivation, Design and Target Catalogue

    NASA Astrophysics Data System (ADS)

    Davies, L. J. M.; Robotham, A. S. G.; Driver, S. P.; Lagos, C. P.; Cortese, L.; Mannering, E.; Foster, C.; Lidman, C.; Hashemizadeh, A.; Koushan, S.; O'Toole, S.; Baldry, I. K.; Bilicki, M.; Bland-Hawthorn, J.; Bremer, M. N.; Brown, M. J. I.; Bryant, J. J.; Catinella, B.; Croom, S. M.; Grootes, M. W.; Holwerda, B. W.; Jarvis, M. J.; Maddox, N.; Meyer, M.; Moffett, A. J.; Phillipps, S.; Taylor, E. N.; Windhorst, R. A.; Wolf, C.

    2018-06-01

    The Deep Extragalactic VIsible Legacy Survey (DEVILS) is a large spectroscopic campaign at the Anglo-Australian Telescope (AAT) aimed at bridging the near and distant Universe by producing the highest completeness survey of galaxies and groups at intermediate redshifts (0.3 < z < 1.0). Our sample consists of ~60,000 galaxies to Y < 21.2 mag, over ~6 deg^2 in three well-studied deep extragalactic fields (the Cosmic Origins Survey field, COSMOS; the Extended Chandra Deep Field South, ECDFS; and the X-ray Multi-Mirror Mission Large-Scale Structure region, XMM-LSS - all Large Synoptic Survey Telescope deep-drill fields). This paper presents the broad experimental design of DEVILS. Our target sample has been selected from deep Visible and Infrared Survey Telescope for Astronomy (VISTA) Y-band imaging (VISTA Deep Extragalactic Observations, VIDEO and UltraVISTA), with photometry measured by ProFound. Photometric star/galaxy separation is done on the basis of NIR colours, and has been validated by visual inspection. To maximise our observing efficiency for faint targets we employ a redshift feedback strategy, which continually updates our target lists, feeding back the results from the previous night's observations. We also present an overview of the initial spectroscopic observations undertaken in late 2017 and early 2018.

  3. Development and application of deep convolutional neural network in target detection

    NASA Astrophysics Data System (ADS)

    Jiang, Xiaowei; Wang, Chunping; Fu, Qiang

    2018-04-01

    With the development of big data and algorithms, deep convolutional neural networks with more hidden layers have more powerful feature learning and feature expression ability than traditional machine learning methods, enabling artificial intelligence to surpass human-level performance in many fields. This paper first reviews the development and application of deep convolutional neural networks in the field of object detection in recent years, then briefly summarizes some open problems in current research, and finally offers prospects for the future development of deep convolutional neural networks.

  4. Incorporating deep learning with convolutional neural networks and position specific scoring matrices for identifying electron transport proteins.

    PubMed

    Le, Nguyen-Quoc-Khanh; Ho, Quang-Thai; Ou, Yu-Yen

    2017-09-05

    In recent years, deep learning has become a modern machine learning technique used in a variety of fields with state-of-the-art performance. Utilization of deep learning to enhance performance is therefore also an important avenue for the bioinformatics field. In this study, we use deep learning via convolutional neural networks and position specific scoring matrices to identify electron transport proteins, which perform an important molecular function in transmembrane proteins. Our deep learning method achieves a precise model for identifying electron transport proteins, with a sensitivity of 80.3%, specificity of 94.4%, and accuracy of 92.3%, with MCC of 0.71 on an independent dataset. The proposed technique can serve as a powerful tool for identifying electron transport proteins and can help biologists understand their function. Moreover, this study provides a basis for further research applying deep learning in bioinformatics. © 2017 Wiley Periodicals, Inc.

  5. Deep Learning in Nuclear Medicine and Molecular Imaging: Current Perspectives and Future Directions.

    PubMed

    Choi, Hongyoon

    2018-04-01

    Recent advances in deep learning have impacted various scientific and industrial fields. Owing to the rapid adoption of deep learning for biomedical data, molecular imaging has also started to embrace the technique. In this regard, deep learning is expected to affect the roles of molecular imaging experts as well as clinical decision making. This review first offers a basic overview of deep learning, particularly for image data analysis, aimed at nuclear medicine physicians and researchers. Because of the unique characteristics and distinctive aims of the various types of molecular imaging, deep learning applications can differ from those in other fields. In this context, the review covers current perspectives on deep learning in molecular imaging, particularly in terms of biomarker development. Finally, future challenges of applying deep learning to molecular imaging and the future roles of experts in the field are discussed.

  6. Spatial Vertical Directionality and Correlation of Low-Frequency Ambient Noise in Deep Ocean Direct-Arrival Zones.

    PubMed

    Yang, Qiulong; Yang, Kunde; Cao, Ran; Duan, Shunli

    2018-01-23

    Wind-driven and distant shipping noise sources contribute to the total noise field in deep ocean direct-arrival zones, and both may significantly and simultaneously affect its spatial characteristics. In this work, a ray approach and a parabolic equation solution method were jointly utilized to model the low-frequency ambient noise field in a range-dependent deep ocean environment, chosen for their respective accuracy and efficiency in the near-field wind-driven and far-field distant shipping regimes. The reanalysis databases of the National Centers for Environmental Prediction (NCEP) and the Volunteer Observation System (VOS) were used to model the ambient noise source intensity and distribution. Spatial vertical directionality and correlation were analyzed in three scenarios corresponding to three wind-speed conditions. When the wind speed was less than 3 m/s, the noise field was dominated by distant shipping sources, and the vertical directionality and vertical correlation of the total field were nearly consistent with those of the distant shipping field. When the wind speed was greater than 12 m/s at 150 Hz, the total field was dominated by near-field wind-generated sources, and its vertical correlation coefficient and directionality pattern were approximately consistent with those of the wind-driven field. For wind speeds between 3 m/s and 12 m/s, the spatial characteristics of the total field were the weighted results of the wind-driven and distant shipping fields. Furthermore, the spatial characteristics of the low-frequency ambient noise field were compared with the classical Cron/Sherman deep-water noise field coherence function. Simulation results with the described modeling method showed good agreement with experimental measurements from a vertical line array deployed near the bottom in deep ocean direct-arrival zones.
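    The "weighted results" in the intermediate wind-speed regime can be made concrete: for statistically independent noise fields the cross-spectra add, so the coherence of the total field is a power-weighted average of the component coherences. A minimal sketch (function and variable names are ours, not the paper's):

```python
def combined_coherence(coh_wind, coh_ship, p_wind, p_ship):
    """Vertical coherence of the superposition of two independent noise
    fields, weighted by their mean-square pressures at the receiver pair.

    Cross-spectra of independent fields add, so:
        coh_total = (P_wind * coh_wind + P_ship * coh_ship) / (P_wind + P_ship)
    """
    return (p_wind * coh_wind + p_ship * coh_ship) / (p_wind + p_ship)
```

    In the limits p_ship >> p_wind (calm seas) or p_wind >> p_ship (high wind), the expression reduces to the coherence of the dominant field, matching the abstract's sub-3 m/s and above-12 m/s regimes.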

  7. Spatial Vertical Directionality and Correlation of Low-Frequency Ambient Noise in Deep Ocean Direct-Arrival Zones

    PubMed Central

    Yang, Qiulong; Yang, Kunde; Cao, Ran; Duan, Shunli

    2018-01-01

    Wind-driven and distant shipping noise sources contribute to the total noise field in deep ocean direct-arrival zones, and both may significantly and simultaneously affect its spatial characteristics. In this work, a ray approach and a parabolic equation solution method were jointly utilized to model the low-frequency ambient noise field in a range-dependent deep ocean environment, chosen for their respective accuracy and efficiency in the near-field wind-driven and far-field distant shipping regimes. The reanalysis databases of the National Centers for Environmental Prediction (NCEP) and the Volunteer Observation System (VOS) were used to model the ambient noise source intensity and distribution. Spatial vertical directionality and correlation were analyzed in three scenarios corresponding to three wind-speed conditions. When the wind speed was less than 3 m/s, the noise field was dominated by distant shipping sources, and the vertical directionality and vertical correlation of the total field were nearly consistent with those of the distant shipping field. When the wind speed was greater than 12 m/s at 150 Hz, the total field was dominated by near-field wind-generated sources, and its vertical correlation coefficient and directionality pattern were approximately consistent with those of the wind-driven field. For wind speeds between 3 m/s and 12 m/s, the spatial characteristics of the total field were the weighted results of the wind-driven and distant shipping fields. Furthermore, the spatial characteristics of the low-frequency ambient noise field were compared with the classical Cron/Sherman deep-water noise field coherence function. Simulation results with the described modeling method showed good agreement with experimental measurements from a vertical line array deployed near the bottom in deep ocean direct-arrival zones. PMID:29360793

  8. Hubble Sees a Legion of Galaxies

    NASA Image and Video Library

    2017-12-08

    Peering deep into the early universe, this picturesque parallel field observation from the NASA/ESA Hubble Space Telescope reveals thousands of colorful galaxies swimming in the inky blackness of space. A few foreground stars from our own galaxy, the Milky Way, are also visible. In October 2013 Hubble’s Wide Field Camera 3 (WFC3) and Advanced Camera for Surveys (ACS) began observing this portion of sky as part of the Frontier Fields program. This spectacular skyscape was captured during the study of the giant galaxy cluster Abell 2744, otherwise known as Pandora’s Box. While one of Hubble’s cameras concentrated on Abell 2744, the other camera viewed this adjacent patch of sky near the cluster. Containing countless galaxies of various ages, shapes and sizes, this parallel field observation is nearly as deep as the Hubble Ultra-Deep Field. In addition to showcasing the stunning beauty of the deep universe in incredible detail, this parallel field — when compared to other deep fields — will help astronomers understand how similar the universe looks in different directions. Image credit: NASA, ESA and the HST Frontier Fields team (STScI). NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission.

  9. Quantum neuromorphic hardware for quantum artificial intelligence

    NASA Astrophysics Data System (ADS)

    Prati, Enrico

    2017-08-01

    The development of machine learning methods based on deep learning has boosted the field of artificial intelligence towards unprecedented achievements and applications in several fields. These prominent results came in parallel with the first successful demonstrations of fault-tolerant hardware for quantum information processing. To what extent deep learning can take advantage of hardware based on qubits behaving as a universal quantum computer is an open question under investigation. Here I review the convergence between the two fields towards the implementation of advanced quantum algorithms, including quantum deep learning.

  10. Deep Borehole Field Test Research Activities at LBNL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dobson, Patrick; Tsang, Chin-Fu; Kneafsey, Timothy

    The goal of the U.S. Department of Energy Used Fuel Disposition’s (UFD) Deep Borehole Field Test is to drill two 5 km large-diameter boreholes: a characterization borehole with a bottom-hole diameter of 8.5 inches and a field test borehole with a bottom-hole diameter of 17 inches. These boreholes will be used to demonstrate the ability to drill such holes in crystalline rocks, effectively characterize the bedrock repository system using geophysical, geochemical, and hydrological techniques, and emplace and retrieve test waste packages. These studies will be used to test the deep borehole disposal concept, which requires a hydrologically isolated environment characterized by low permeability, stable fluid density, reducing fluid chemistry conditions, and an effective borehole seal. During FY16, Lawrence Berkeley National Laboratory scientists conducted a number of research studies to support the UFD Deep Borehole Field Test effort. This work included providing supporting data for the Los Alamos National Laboratory geologic framework model for the proposed deep borehole site, conducting an analog study using an extensive suite of geoscience data and samples from a deep (2.5 km) research borehole in Sweden, conducting laboratory experiments and coupled process modeling related to borehole seals, and developing a suite of potential techniques that could be applied to the characterization and monitoring of the deep borehole environment. The results of these studies are presented in this report.

  11. VizieR Online Data Catalog: Spitzer-CANDELS catalog within 5 deep fields (Ashby+, 2015)

    NASA Astrophysics Data System (ADS)

    Ashby, M. L. N.; Willner, S. P.; Fazio, G. G.; Dunlop, J. S.; Egami, E.; Faber, S. M.; Ferguson, H. C.; Grogin, N. A.; Hora, J. L.; Huang, J.-S.; Koekemoer, A. M.; Labbe, I.; Wang, Z.

    2015-08-01

    We chose to locate S-CANDELS inside the wider and shallower fields already covered by Spitzer Extended Deep Survey (SEDS), in regions that enjoy deep optical and NIR imaging from HST/CANDELS. These S-CANDELS fields are thus the Extended GOODS-south (aka the GEMS field, hereafter ECDFS; Rix et al. 2004ApJS..152..163R; Castellano et al. 2010A&A...511A..20C), the Extended GOODS-north (HDFN; Giavalisco et al. 2004, II/261; Wang et al. 2010, J/ApJS/187/251; Hathi et al. 2012ApJ...757...43H; Lin et al. 2012ApJ...756...71L), the UKIDSS UDS (aka the Subaru/XMM Deep Field, Ouchi et al. 2001ApJ...558L..83O; Lawrence et al. 2007, II/319), a narrow field within the EGS (Davis et al. 2007ApJ...660L...1D; Bielby et al. 2012A&A...545A..23B), and a strip within the UltraVista deep survey of the larger COSMOS field (Scoville et al. 2007ApJS..172...38S; McCracken et al. 2012, J/A+A/544/A156). The S-CANDELS observing strategy was designed to maximize the area covered to full depth within the CANDELS area. Each field was visited twice with six months separating the two visits. Table 1 lists the epochs for each field. All of the IRAC full-depth coverage is within the SEDS area (Ashby et al. 2013, J/ApJ/769/80), and almost all is within the area covered by HST for CANDELS. (6 data files).

  12. Low-cost, high-precision micro-lensed optical fiber providing deep-micrometer to deep-nanometer-level light focusing.

    PubMed

    Wen, Sy-Bor; Sundaram, Vijay M; McBride, Daniel; Yang, Yu

    2016-04-15

    A new type of micro-lensed optical fiber, formed by stacking appropriate high-refractive-index microspheres at designed locations with respect to the cleaved end of an optical fiber, is numerically and experimentally demonstrated. This micro-lensed fiber can be precisely constructed at low cost and high speed. Deep-micrometer-scale and submicrometer-scale far-field light spots can be achieved when the optical fibers are multimode and single-mode, respectively. By placing an appropriate teardrop dielectric nanoscale scatterer at the far-field spot of this micro-lensed fiber, a deep-nanometer near-field spot can also be generated with high intensity and minimal Joule heating, which is valuable for high-speed, high-resolution, and high-power nanoscale detection compared with traditional near-field optical fibers containing a significant portion of metallic material.

  13. Deep CFHT Y-band Imaging of VVDS-F22 Field. II. Quasar Selection and Quasar Luminosity Function

    NASA Astrophysics Data System (ADS)

    Yang, Jinyi; Wu, Xue-Bing; Liu, Dezi; Fan, Xiaohui; Yang, Qian; Wang, Feige; McGreer, Ian D.; Fan, Zuhui; Yuan, Shuo; Shan, Huanyuan

    2018-03-01

    We report the results of a faint quasar survey in a one-square-degree field. The aim is to test the Y-K/g-z and J-K/i-Y color selection criteria for quasars at faint magnitudes, to obtain a complete sample of quasars based on deep optical and near-infrared color-color selection, and to measure the faint end of the quasar luminosity function (QLF) over a wide redshift range. We carried out a quasar survey based on the Y-K/g-z and J-K/i-Y quasar selection criteria, using the deep Y-band data obtained from our CFHT/WIRCam Y-band images in a two-degree field within the F22 field of the VIMOS VLT deep survey, optical co-added data from Sloan Digital Sky Survey Stripe 82, and deep near-infrared data from the UKIDSS Deep Extragalactic Survey in the same field. We discovered 25 new quasars at 0.5 < z < 4.5 and i < 22.5 mag within a one-square-degree field. The survey significantly increases the number of faint quasars in this field, especially at z ∼ 2–3. It confirms that our color selections are highly complete over a wide redshift range (z < 4.5), especially over the quasar number density peak at z ∼ 2–3, even for faint quasars. Combining all previously known quasars and new discoveries, we construct a sample of 109 quasars and measure the binned and parametric QLF. Although the sample is small, our results agree with pure luminosity evolution at lower redshift and a luminosity plus density evolution model at redshift z > 2.5.
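    A color-cut selection of this kind amounts to keeping sources above a line in a color-color plane. The sketch below illustrates the idea for the (g-z, Y-K) plane; the slope and intercept are placeholders, not the paper's actual criteria:

```python
def select_quasar_candidates(catalog, slope=0.5, intercept=0.5):
    """Keep sources lying above a straight-line cut in the (g-z, Y-K)
    color-color plane, where quasars tend to separate from stars.

    `catalog` is a list of dicts with magnitudes in bands g, z, Y, K.
    The default slope/intercept are illustrative placeholders only.
    """
    return [src for src in catalog
            if (src["Y"] - src["K"]) > slope * (src["g"] - src["z"]) + intercept]
```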

  14. AUC-Maximized Deep Convolutional Neural Fields for Protein Sequence Labeling.

    PubMed

    Wang, Sheng; Sun, Siqi; Xu, Jinbo

    2016-09-01

    Deep Convolutional Neural Networks (DCNNs) have shown excellent performance in a variety of machine learning tasks. This paper presents Deep Convolutional Neural Fields (DeepCNF), an integration of DCNNs with Conditional Random Fields (CRFs), for sequence labeling with an imbalanced label distribution. The widely used training methods, such as maximum-likelihood and maximum labelwise accuracy, do not work well on imbalanced data. To handle this, we present a new training algorithm called maximum-AUC for DeepCNF: we train DeepCNF by directly maximizing the empirical Area Under the ROC Curve (AUC), which is an unbiased measure for imbalanced data. To do so, we formulate AUC in a pairwise ranking framework, approximate it by a polynomial function, and then apply a gradient-based procedure to optimize it. Our experimental results confirm that maximum-AUC greatly outperforms the other two training methods on 8-state secondary structure prediction and disorder prediction, whose label distributions are highly imbalanced, and performs comparably to them on solvent accessibility prediction, which has three equally distributed labels. Furthermore, our AUC-trained DeepCNF models greatly outperform existing popular predictors on these three tasks. The data and software related to this paper are available at https://github.com/realbigws/DeepCNF_AUC.
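    The pairwise-ranking formulation can be sketched as follows: empirical AUC is the fraction of (positive, negative) score pairs ranked correctly, and the non-differentiable 0/1 step is replaced by a smooth polynomial so gradients can flow. The cubic ramp below is an illustrative surrogate, not the paper's actual polynomial:

```python
def empirical_auc(pos_scores, neg_scores):
    """Exact pairwise AUC: fraction of (positive, negative) pairs ranked
    correctly, counting ties as half-correct."""
    correct = sum(1.0 if p > n else 0.5 if p == n else 0.0
                  for p in pos_scores for n in neg_scores)
    return correct / (len(pos_scores) * len(neg_scores))

def poly_surrogate_auc(pos_scores, neg_scores):
    """Differentiable surrogate: replace the step 1[p > n] with a cubic
    ramp on the score difference (an illustrative polynomial choice)."""
    def ramp(d):
        # monotone cubic with ramp(-1) = 0, ramp(0) = 0.5, ramp(1) = 1
        d = max(-1.0, min(1.0, d))
        return 0.5 + 0.75 * d - 0.25 * d ** 3
    total = sum(ramp(p - n) for p in pos_scores for n in neg_scores)
    return total / (len(pos_scores) * len(neg_scores))
```

    Because the surrogate is a polynomial in the score differences, its gradient with respect to the model outputs is straightforward, which is what makes a gradient-based training procedure possible.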

  15. AUC-Maximized Deep Convolutional Neural Fields for Protein Sequence Labeling

    PubMed Central

    Wang, Sheng; Sun, Siqi

    2017-01-01

    Deep Convolutional Neural Networks (DCNNs) have shown excellent performance in a variety of machine learning tasks. This paper presents Deep Convolutional Neural Fields (DeepCNF), an integration of DCNNs with Conditional Random Fields (CRFs), for sequence labeling with an imbalanced label distribution. The widely used training methods, such as maximum-likelihood and maximum labelwise accuracy, do not work well on imbalanced data. To handle this, we present a new training algorithm called maximum-AUC for DeepCNF: we train DeepCNF by directly maximizing the empirical Area Under the ROC Curve (AUC), which is an unbiased measure for imbalanced data. To do so, we formulate AUC in a pairwise ranking framework, approximate it by a polynomial function, and then apply a gradient-based procedure to optimize it. Our experimental results confirm that maximum-AUC greatly outperforms the other two training methods on 8-state secondary structure prediction and disorder prediction, whose label distributions are highly imbalanced, and performs comparably to them on solvent accessibility prediction, which has three equally distributed labels. Furthermore, our AUC-trained DeepCNF models greatly outperform existing popular predictors on these three tasks. The data and software related to this paper are available at https://github.com/realbigws/DeepCNF_AUC. PMID:28884168

  16. Microsurgical robotic system for the deep surgical field: development of a prototype and feasibility studies in animal and cadaveric models.

    PubMed

    Morita, Akio; Sora, Shigeo; Mitsuishi, Mamoru; Warisawa, Shinichi; Suruman, Katopo; Asai, Daisuke; Arata, Junpei; Baba, Shoichi; Takahashi, Hidechika; Mochizuki, Ryo; Kirino, Takaaki

    2005-08-01

    To enhance the surgeon's dexterity and maneuverability in the deep surgical field, the authors developed a master-slave microsurgical robotic system. This concept and the results of preliminary experiments are reported in this paper. The system has a master control unit, which conveys motion commands in six degrees of freedom (X, Y, and Z directions; rotation; tip flexion; and grasping) to two arms. The slave manipulator has a hanging base with an additional six degrees of freedom; it holds a motorized operating unit with two manipulators (5 mm in diameter, 18 cm in length). The accuracy of the prototype in both shallow and deep surgical fields was compared with routine freehand microsurgery. Closure of a partial arteriotomy and complete end-to-end anastomosis of the carotid artery (CA) in the deep operative field were performed in 20 Wistar rats. Three routine surgical procedures were also performed in cadavers. The accuracy of pointing with the nondominant hand in the deep surgical field was significantly improved through the use of robotics. The authors successfully closed the partial arteriotomy and completely anastomosed the rat CAs in the deep surgical field. The time needed for stitching was significantly shortened over the course of the first 10 rat experiments. The robotic instruments also moved satisfactorily in cadavers, but the manipulators still need to be smaller to fit into the narrow intracranial space. Computer-controlled surgical manipulation will be an important tool for neurosurgery, and preliminary experiments involving this robotic system demonstrate its promising maneuverability.

  17. Effects of Dose Frequency of Early Communication Intervention in Young Children with and without Down Syndrome

    ERIC Educational Resources Information Center

    Yoder, Paul; Woynaroski, Tiffany; Fey, Marc; Warren, Steven

    2014-01-01

    Children with intellectual disability were randomly assigned to receive Milieu Communication Teaching (MCT) at one 1-hr session per week (low dose frequency, LDF) or five 1-hr sessions per week (high dose frequency, HDF) over 9 months (Fey, Yoder, Warren, & Bredin-Oja, 2013). Non-Down syndrome (NDS) and Down syndrome (DS) subgroups were matched…

  18. NREL Releases Major Update to Wind Energy Dataset | News | NREL

    Science.gov Websites

    The updated Toolkit made 2 terabytes (TB) of information available, covering about 120,000 locations. NREL is using the AWS cloud to provide users with easy access to the data, which is stored as a series of HDF5 files. The information can be narrowed to a specific site or time and analyzed using either a custom

  19. Mid-dilution hemodiafiltration: a comparison with pre- and postdilution modes using the same polyphenylene membrane.

    PubMed

    Maduell, Francisco; Arias, Marta; Vera, Manel; Fontseré, Néstor; Blasco, Miquel; Barros, Xoana; Garro, Julia; Elena, Montserrat; Bergadá, Eduardo; Cases, Aleix; Bedini, Jose Luis; Campistol, Josep M

    2009-01-01

    As a mid-dilution filter in which the Diapes membrane has been replaced by a polyphenylene membrane has recently been developed, the aim of this study was to compare mid-dilution using this new dialyzer versus pre- and postdilution. The prospective study included 20 patients who underwent 4 hemodiafiltration (HDF) sessions: 1.7 m(2) polyphenylene with predilution infusion flow (Qi) 200 ml/min, 1.7 m(2) with postdilution Qi 100 ml/min, and 1.9 and 2.2 m(2) mid-dilution, both with Qi 200 ml/min. The urea and creatinine reduction ratios were slightly higher in postdilution. The beta(2)-microglobulin (85.8%), myoglobin (73.6%), prolactin (67.8%) and retinol-binding protein (29.2%) reduction ratios with 1.9 m(2) mid-dilution, which were similar to those with 2.2 m(2) mid-dilution, were significantly higher than with the post- and predilution modes. Mid-dilution appears to be a good HDF alternative that allows better removal of larger molecules than postdilution and, especially, predilution. Mid-dilution using 1.9 or 2.2 m(2) dialyzers at the same convective volume showed similar removal. Copyright 2009 S. Karger AG, Basel.
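    The reduction ratios compared above are simple pre/post-session arithmetic. A minimal sketch (the example values in the test are illustrative, not the study's measurements):

```python
def reduction_ratio(pre, post):
    """Percent reduction of a solute concentration over a dialysis session:
    RR = 100 * (C_pre - C_post) / C_pre."""
    return 100.0 * (pre - post) / pre
```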

  20. catsHTM: A Tool for Fast Accessing and Cross-matching Large Astronomical Catalogs

    NASA Astrophysics Data System (ADS)

    Soumagnac, Maayane T.; Ofek, Eran O.

    2018-07-01

    Fast access to large catalogs is required for some astronomical applications. Here we introduce the catsHTM tool, consisting of several large catalogs reformatted into an HDF5-based file format, which can be downloaded and used locally. To allow fast access, the catalogs are partitioned into hierarchical triangular meshes and stored in HDF5 files. Several tools are provided to perform efficient cone searches, at resolutions spanning from a few arcseconds to degrees, within a few milliseconds. The first released version includes the following catalogs (in alphabetical order): 2MASS, 2MASS extended sources, AKARI, APASS, Cosmos, DECaLS/DR5, FIRST, GAIA/DR1, GAIA/DR2, GALEX/DR6Plus7, HSC/v2, IPHAS/DR2, NED redshifts, NVSS, Pan-STARRS1/DR1, PTF photometric catalog, ROSAT faint source, SDSS sources, SDSS/DR14 spectroscopy, SkyMapper, Spitzer/SAGE, Spitzer/IRAC galactic center, UCAC4, UKIDSS/DR10, VST/ATLAS/DR3, VST/KiDS/DR3, WISE and XMM. We provide Python code for performing cone searches, as well as MATLAB code for performing cone searches, catalog cross-matching, and general searches, and for loading and creating these catalogs.
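    The speed comes from partitioning: a cone search only needs to open the sky partitions that overlap the cone, rather than scanning the whole catalog. A simplified sketch, using declination bands as a stand-in for the HDF5-stored HTM trixels that catsHTM actually indexes (all names are ours):

```python
import math

def ang_sep(ra1, dec1, ra2, dec2):
    """Angular separation in degrees between two sky positions (degrees)."""
    r1, d1, r2, d2 = map(math.radians, (ra1, dec1, ra2, dec2))
    cos_sep = (math.sin(d1) * math.sin(d2)
               + math.cos(d1) * math.cos(d2) * math.cos(r1 - r2))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_sep))))

def cone_search(partitions, ra0, dec0, radius):
    """Cone search over a partitioned catalog.

    `partitions` maps a declination-band index to a list of (ra, dec)
    sources; in catsHTM the partitions are HTM trixels stored as HDF5
    datasets, but the open-only-overlapping-partitions idea is the same."""
    band = 10.0                                   # band height in degrees
    lo = int((dec0 - radius + 90.0) // band)
    hi = int((dec0 + radius + 90.0) // band)
    hits = []
    for b in range(lo, hi + 1):                   # skip non-overlapping bands
        for ra, dec in partitions.get(b, []):
            if ang_sep(ra0, dec0, ra, dec) <= radius:
                hits.append((ra, dec))
    return hits
```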

  1. [Hospitalization due to skin diseases at Hôtel-Dieu de France Hospital (Beirut), 1998-2007].

    PubMed

    Maatouk, Ismaël; Moutran, Roy; Tomb, Roland

    2012-01-01

    This study aims to retrospectively determine the nature and frequency of dermatological diseases leading to hospitalization at Hôtel-Dieu de France Hospital (HDF) in Beirut between 1998 and 2007, and to compare them with literature data. For the patients hospitalized in dermatology at HDF, we studied demographics, diagnosis at hospitalization, length of stay, service, mode of financial coverage, in-hospital evolution, diagnostic tests, and treatment. The data were processed with the SPSS program. Alopecia areata, psoriatic erythroderma, acute urticaria and vasculitic purpura were the top four diagnoses (85% of hospitalizations). A third of the patients were admitted to same-day care. The financial coverage of hospitalization was based primarily on public insurance (57.6%). Corticosteroids were the most widely used treatment for dermatology inpatients, with a frequency of 59.8%. The number of hospitalizations peaked at 44 in 2002 and has been declining since then (11 hospitalizations in 2007). Pathologies encountered in hospital differ from those encountered in consultation. Outpatient management of skin diseases is often insufficient. In the literature, no profile of skin diseases leading to hospitalization is similar to that of our study.

  2. DeepAnomaly: Combining Background Subtraction and Deep Learning for Detecting Obstacles and Anomalies in an Agricultural Field.

    PubMed

    Christiansen, Peter; Nielsen, Lars N; Steen, Kim A; Jørgensen, Rasmus N; Karstoft, Henrik

    2016-11-11

    Convolutional neural network (CNN)-based systems are increasingly used in autonomous vehicles for detecting obstacles. CNN-based object detection and per-pixel classification (semantic segmentation) algorithms are trained to detect and classify a predefined set of object types. These algorithms have difficulties detecting distant and heavily occluded objects and are, by definition, not capable of detecting unknown object types or unusual scenarios. The visual characteristics of an agricultural field are homogeneous, and obstacles, such as people and animals, occur rarely and are of distinct appearance compared to the field. This paper introduces DeepAnomaly, an algorithm combining deep learning and anomaly detection that exploits the homogeneous characteristics of a field to perform anomaly detection. We demonstrate DeepAnomaly as a fast state-of-the-art detector for obstacles that are distant, heavily occluded or unknown. DeepAnomaly is compared to state-of-the-art obstacle detectors including "Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks" (RCNN). In a human-detection test case, we demonstrate that DeepAnomaly detects humans at longer ranges (45-90 m) than RCNN, while RCNN performs similarly at short range (0-30 m). However, DeepAnomaly has far fewer model parameters and (182 ms/25 ms =) a 7.28-times faster per-image processing time. Unlike most CNN-based methods, its high accuracy, low computation time and low memory footprint make it suitable for a real-time system running on an embedded GPU (Graphics Processing Unit).
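    The core idea of combining background modeling with anomaly detection can be sketched in miniature: maintain running statistics of the "background" appearance per spatial cell and flag cells that deviate strongly. The real method operates on CNN feature maps; this toy version (all names and thresholds are ours) treats each frame as a grid of scalar features:

```python
class BackgroundAnomalyDetector:
    """Toy sketch of the background-subtraction idea behind DeepAnomaly:
    model per-cell background statistics over a homogeneous field and
    flag cells that deviate by many standard deviations."""

    def __init__(self, alpha=0.05, threshold=3.0):
        self.alpha = alpha            # exponential moving-average rate
        self.threshold = threshold    # anomaly cutoff, in std deviations
        self.mean = None
        self.var = None

    def update(self, frame):
        """Fold one background frame (list of rows of floats) into the model."""
        if self.mean is None:
            self.mean = [[v for v in row] for row in frame]
            self.var = [[1e-6 for _ in row] for row in frame]
            return
        a = self.alpha
        for i, row in enumerate(frame):
            for j, v in enumerate(row):
                d = v - self.mean[i][j]
                self.mean[i][j] += a * d
                self.var[i][j] = (1 - a) * (self.var[i][j] + a * d * d)

    def anomalies(self, frame):
        """Return (row, col) of cells deviating beyond the threshold."""
        return [(i, j)
                for i, row in enumerate(frame)
                for j, v in enumerate(row)
                if abs(v - self.mean[i][j]) / (self.var[i][j] ** 0.5 + 1e-9)
                > self.threshold]
```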

  3. DeepAnomaly: Combining Background Subtraction and Deep Learning for Detecting Obstacles and Anomalies in an Agricultural Field

    PubMed Central

    Christiansen, Peter; Nielsen, Lars N.; Steen, Kim A.; Jørgensen, Rasmus N.; Karstoft, Henrik

    2016-01-01

    Convolutional neural network (CNN)-based systems are increasingly used in autonomous vehicles for detecting obstacles. CNN-based object detection and per-pixel classification (semantic segmentation) algorithms are trained to detect and classify a predefined set of object types. These algorithms have difficulties detecting distant and heavily occluded objects and are, by definition, not capable of detecting unknown object types or unusual scenarios. The visual characteristics of an agricultural field are homogeneous, and obstacles, such as people and animals, occur rarely and are of distinct appearance compared to the field. This paper introduces DeepAnomaly, an algorithm combining deep learning and anomaly detection that exploits the homogeneous characteristics of a field to perform anomaly detection. We demonstrate DeepAnomaly as a fast state-of-the-art detector for obstacles that are distant, heavily occluded or unknown. DeepAnomaly is compared to state-of-the-art obstacle detectors including “Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks” (RCNN). In a human-detection test case, we demonstrate that DeepAnomaly detects humans at longer ranges (45–90 m) than RCNN, while RCNN performs similarly at short range (0–30 m). However, DeepAnomaly has far fewer model parameters and (182 ms/25 ms =) a 7.28-times faster per-image processing time. Unlike most CNN-based methods, its high accuracy, low computation time and low memory footprint make it suitable for a real-time system running on an embedded GPU (Graphics Processing Unit). PMID:27845717

  4. Deep Learning and Its Applications in Biomedicine.

    PubMed

    Cao, Chensi; Liu, Feng; Tan, Hai; Song, Deshou; Shu, Wenjie; Li, Weizhong; Zhou, Yiming; Bo, Xiaochen; Xie, Zhi

    2018-02-01

    Advances in biological and medical technologies have been providing us with explosive volumes of biological and physiological data, such as medical images, electroencephalography, and genomic and protein sequences. Learning from these data facilitates the understanding of human health and disease. Developed from artificial neural networks, deep learning-based algorithms show great promise in extracting features and learning patterns from complex data. The aim of this paper is to provide an overview of deep learning techniques and some of their state-of-the-art applications in the biomedical field. We first introduce the development of artificial neural networks and deep learning. We then describe two main components of deep learning, i.e., deep learning architectures and model optimization. Subsequently, examples of deep learning applications are presented, including medical image classification, genomic sequence analysis, and protein structure classification and prediction. Finally, we offer our perspectives on future directions in the field of deep learning. Copyright © 2018. Production and hosting by Elsevier B.V.

  5. Estimates of deep percolation beneath native vegetation, irrigated fields, and the Amargosa-River Channel, Amargosa Desert, Nye County, Nevada

    USGS Publications Warehouse

    Stonestrom, David A.; Prudic, David E.; Laczniak, Randell J.; Akstin, Katherine C.; Boyd, Robert A.; Henkelman, Katherine K.

    2003-01-01

    The presence and approximate rates of deep percolation beneath areas of native vegetation, irrigated fields, and the Amargosa-River channel in the Amargosa Desert of southern Nevada were evaluated using the chloride mass-balance method and inferred downward velocities of chloride and nitrate peaks. Estimates of deep-percolation rates in the Amargosa Desert are needed for the analysis of regional ground-water flow and transport. An understanding of regional flow patterns is important because ground water originating on the Nevada Test Site may pass through the area before discharging from springs at lower elevations in the Amargosa Desert and in Death Valley. Nine boreholes 10 to 16 meters deep were cored nearly continuously using a hollow-stem auger designed for gravelly sediments. Two boreholes were drilled in each of three irrigated fields in the Amargosa-Farms area, two in the Amargosa-River channel, and one in an undisturbed area of native vegetation. Data from previously cored boreholes beneath undisturbed, native vegetation were compared with the new data to further assess deep percolation under current climatic conditions and provide information on spatial variability.The profiles beneath native vegetation were characterized by large amounts of accumulated chloride just below the root zone with almost no further accumulation at greater depths. This pattern is typical of profiles beneath interfluvial areas in arid alluvial basins of the southwestern United States, where salts have been accumulating since the end of the Pleistocene. The profiles beneath irrigated fields and the Amargosa-River channel contained more than twice the volume of water compared to profiles beneath native vegetation, consistent with active deep percolation beneath these sites. Chloride profiles beneath two older fields (cultivated since the 1960’s) as well as the upstream Amargosa-River site were indicative of long-term, quasi-steady deep percolation. 
Chloride profiles beneath the newest field (cultivated since 1993), the downstream Amargosa-River site, and the edge of an older field were indicative of recently active deep percolation moving previously accumulated salts from the upper profile to greater depths. Results clearly indicate that deep percolation and ground-water recharge occur not only beneath areas of irrigation but also beneath ephemeral stream channels, despite the arid climate and infrequency of runoff. Rates of deep percolation beneath irrigated fields ranged from 0.1 to 0.5 m/yr. Estimated rates of deep percolation beneath the Amargosa-River channel ranged from 0.02 to 0.15 m/yr. Only a few decades are needed for excess irrigation water to move through the unsaturated zone and recharge ground water. Assuming vertical, one-dimensional flow, the estimated time for irrigation-return flow to reach the water table beneath the irrigated fields ranged from about 10 to 70 years. In contrast, infiltration from present-day runoff takes centuries to move through the unsaturated zone and reach the water table. The estimated time for water to reach the water table beneath the channel ranged from 140 to 1000 years. These values represent minimum times, as they do not take lateral flow into account. The estimated fraction of irrigation water becoming deep percolation averaged 8 to 16 percent. Similar fractions of infiltration from ephemeral flow events were estimated to become deep percolation beneath the normally dry Amargosa-River channel. In areas where flood-induced channel migration occurs at sub-centennial frequencies, residence times in the unsaturated zone beneath the Amargosa channel could be longer. Estimates of deep percolation presented herein provide a basis for evaluating the importance of recharge from irrigation and channel infiltration in models of ground-water flow from the Nevada Test Site.
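The chloride mass-balance method used above rests on a simple steady-state budget: the chloride flux deposited by precipitation equals the flux carried downward by deep percolation. A minimal sketch follows; the input values are hypothetical desert-like numbers for illustration, not values reported by the study:

```python
def cmb_percolation_rate(precip_mm_yr, cl_precip_mg_l, cl_pore_mg_l):
    """Chloride mass balance: steady-state deep-percolation rate.

    P * Cl_p = R * Cl_pw  =>  R = P * Cl_p / Cl_pw
    where P is mean annual precipitation, Cl_p the chloride
    concentration in bulk precipitation (wet plus dry fallout), and
    Cl_pw the chloride concentration in pore water below the root zone.
    """
    return precip_mm_yr * cl_precip_mg_l / cl_pore_mg_l

# Hypothetical values: 110 mm/yr precipitation, 0.4 mg/L chloride in
# bulk precipitation, 700 mg/L in pore water beneath native vegetation
# -> deep percolation on the order of 0.06 mm/yr.
rate = cmb_percolation_rate(110.0, 0.4, 700.0)
```

The logic matches the profiles described above: high pore-water chloride implies salts are accumulating and percolation is negligible, while the dilute profiles beneath irrigated fields imply active flushing.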

  6. [Advantages and Application Prospects of Deep Learning in Image Recognition and Bone Age Assessment].

    PubMed

    Hu, T H; Wan, L; Liu, T A; Wang, M W; Chen, T; Wang, Y H

    2017-12-01

Deep learning and neural network models have been new research directions and hot issues in the fields of machine learning and artificial intelligence in recent years. Deep learning has achieved breakthroughs in image and speech recognition, and has also been used extensively in face recognition and information retrieval because of its particular advantages. Bone X-ray images exhibit variations in black-white-gray gradation, with image features of strong black-white contrast and distinct gray levels. Building on these strengths of deep learning in image recognition, we combine it with research on bone age assessment to provide a basis for constructing an automatic forensic system for bone age assessment. This paper reviews the basic concepts and network architectures of deep learning, describes its recent research progress in image recognition across different research fields in China and abroad, and explores its advantages and application prospects in bone age assessment. Copyright© by the Editorial Department of Journal of Forensic Medicine.

  7. VizieR Online Data Catalog: Improved multi-band photometry from SERVS (Nyland+, 2017)

    NASA Astrophysics Data System (ADS)

    Nyland, K.; Lacy, M.; Sajina, A.; Pforr, J.; Farrah, D.; Wilson, G.; Surace, J.; Haussler, B.; Vaccari, M.; Jarvis, M.

    2017-07-01

    The Spitzer Extragalactic Representative Volume Survey (SERVS) sky footprint includes five well-studied astronomical deep fields with abundant multi-wavelength data spanning an area of ~18deg2 and a co-moving volume of ~0.8Gpc3. The five deep fields included in SERVS are the XMM-LSS field, Lockman Hole (LH), ELAIS-N1 (EN1), ELAIS-S1 (ES1), and Chandra Deep Field South (CDFS). SERVS provides NIR, post-cryogenic imaging in the 3.6 and 4.5um Spitzer/IRAC bands to a depth of ~2uJy. IRAC dual-band source catalogs generated using traditional catalog extraction methods are described in Mauduit+ (2012PASP..124..714M). The Spitzer IRAC data are complemented by ground-based NIR observations from the VISTA Deep Extragalactic Observations (VIDEO; Jarvis+ 2013MNRAS.428.1281J) survey in the south in the Z, Y, J, H, and Ks bands and UKIRT Infrared Deep Sky Survey (UKIDSS; Lawrence+ 2007, see II/319) in the north in the J and K bands. SERVS also provides substantial overlap with infrared data from SWIRE (Lonsdale+ 2003PASP..115..897L) and the Herschel Multitiered Extragalactic Survey (HerMES; Oliver+ 2012, VIII/95). As shown in Figure 1, one square degree of the XMM-LSS field overlaps with ground-based optical data from the Canada-France-Hawaii Telescope Legacy Survey Deep field 1 (CFHTLS-D1). The CFHTLS-D1 region is centered at RAJ2000=02:25:59, DEJ2000=-04:29:40 and includes imaging through the filter set u', g', r', i', and z'. Thus, in combination with the NIR data from SERVS and VIDEO that overlap with the CFHTLS-D1 region, multi-band imaging over a total of 12 bands is available. (2 data files).

  8. Bit Grooming: statistically accurate precision-preserving quantization with compression, evaluated in the netCDF Operators (NCO, v4.4.8+)

    NASA Astrophysics Data System (ADS)

    Zender, Charles S.

    2016-09-01

    Geoscientific models and measurements generate false precision (scientifically meaningless data bits) that wastes storage space. False precision can mislead (by implying noise is signal) and be scientifically pointless, especially for measurements. By contrast, lossy compression can be both economical (save space) and heuristic (clarify data limitations) without compromising the scientific integrity of data. Data quantization can thus be appropriate regardless of whether space limitations are a concern. We introduce, implement, and characterize a new lossy compression scheme suitable for IEEE floating-point data. Our new Bit Grooming algorithm alternately shaves (to zero) and sets (to one) the least significant bits of consecutive values to preserve a desired precision. This is a symmetric, two-sided variant of an algorithm sometimes called Bit Shaving that quantizes values solely by zeroing bits. Our variation eliminates the artificial low bias produced by always zeroing bits, and makes Bit Grooming more suitable for arrays and multi-dimensional fields whose mean statistics are important. Bit Grooming relies on standard lossless compression to achieve the actual reduction in storage space, so we tested Bit Grooming by applying the DEFLATE compression algorithm to bit-groomed and full-precision climate data stored in netCDF3, netCDF4, HDF4, and HDF5 formats. Bit Grooming reduces the storage space required by initially uncompressed and compressed climate data by 25-80 and 5-65 %, respectively, for single-precision values (the most common case for climate data) quantized to retain 1-5 decimal digits of precision. The potential reduction is greater for double-precision datasets. When used aggressively (i.e., preserving only 1-2 digits), Bit Grooming produces storage reductions comparable to other quantization techniques such as Linear Packing. 
Unlike Linear Packing, whose guaranteed precision rapidly degrades within the relatively narrow dynamic range of values that it can compress, Bit Grooming guarantees the specified precision throughout the full floating-point range. Data quantization by Bit Grooming is irreversible (i.e., lossy) yet transparent, meaning that no extra processing is required by data users/readers. Hence Bit Grooming can easily reduce data storage volume without sacrificing scientific precision or imposing extra burdens on users.
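The alternating shave/set quantization described above can be sketched directly on IEEE-754 single-precision bit patterns. This is a minimal illustration; the `bit_groom` helper and its `keep_bits` parameter are assumptions for this sketch, not NCO's actual interface:

```python
import struct

def bit_groom(values, keep_bits):
    """Quantize floats by alternately shaving (zeroing) and setting
    (to one) the trailing 23 - keep_bits mantissa bits, so the biases
    of shaving and setting roughly cancel in array means."""
    groom_bits = 23 - keep_bits              # float32 has 23 mantissa bits
    set_mask = (1 << groom_bits) - 1
    shave_mask = ~set_mask & 0xFFFFFFFF
    out = []
    for i, v in enumerate(values):
        bits = struct.unpack("<I", struct.pack("<f", v))[0]
        # Even-indexed values are shaved, odd-indexed values are set.
        bits = bits & shave_mask if i % 2 == 0 else bits | set_mask
        out.append(struct.unpack("<f", struct.pack("<I", bits))[0])
    return out

vals = [3.14159, 2.71828, 1.41421, 1.73205]
groomed = bit_groom(vals, 12)  # keep ~12 mantissa bits (~3-4 decimal digits)
```

Shaved values can only decrease in magnitude and set values can only increase, which is what removes the one-sided bias of pure Bit Shaving; the resulting runs of identical trailing bits are what DEFLATE then compresses effectively.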

  9. eddy4R 0.2.0: a DevOps model for community-extensible processing and analysis of eddy-covariance data based on R, Git, Docker, and HDF5

    NASA Astrophysics Data System (ADS)

    Metzger, Stefan; Durden, David; Sturtevant, Cove; Luo, Hongyan; Pingintha-Durden, Natchaya; Sachs, Torsten; Serafimovich, Andrei; Hartmann, Jörg; Li, Jiahong; Xu, Ke; Desai, Ankur R.

    2017-08-01

Large differences in instrumentation, site setup, data format, and operating system stymie the adoption of a universal computational environment for processing and analyzing eddy-covariance (EC) data. This results in limited software applicability and extensibility in addition to often substantial inconsistencies in flux estimates. Addressing these concerns, this paper presents the systematic development of portable, reproducible, and extensible EC software achieved by adopting a development and systems operation (DevOps) approach. This software development model is used for the creation of the eddy4R family of EC code packages in the open-source R language for statistical computing. These packages are community developed, iterated via the Git distributed version control system, and wrapped into a portable and reproducible Docker filesystem that is independent of the underlying host operating system. The HDF5 hierarchical data format then provides a streamlined mechanism for highly compressed and fully self-documented data ingest and output. The usefulness of the DevOps approach was evaluated for three test applications. First, the resultant EC processing software was used to analyze standard flux tower data from the first EC instruments installed at a National Ecological Observatory Network (NEON) field site. Second, through an aircraft test application, we demonstrate the modular extensibility of eddy4R to analyze EC data from other platforms. Third, an intercomparison with commercial-grade software showed excellent agreement (R2 = 1.0 for CO2 flux). In conjunction with this study, a Docker image containing the first two eddy4R packages and an executable example workflow, as well as first NEON EC data products are released publicly. We conclude by describing the work remaining to arrive at the automated generation of science-grade EC fluxes and benefits to the science community at large.
This software development model is applicable beyond EC and more generally builds the capacity to deploy complex algorithms developed by scientists in an efficient and scalable manner. In addition, modularity permits meeting project milestones while retaining extensibility with time.

  10. Albedo Neutron Dosimetry in a Deep Geological Disposal Repository for High-Level Nuclear Waste.

    PubMed

    Pang, Bo; Becker, Frank

    2017-04-28

The albedo neutron dosemeter is the official German personal neutron dosemeter for mixed radiation fields in which neutrons contribute to the personal dose. In deep geological repositories for high-level nuclear waste, where neutrons can dominate the radiation field, it is of interest to investigate the performance of the albedo neutron dosemeter in such facilities. In this study, the deep geological repository is represented by a shielding cask loaded with spent nuclear fuel placed inside a rock salt emplacement drift. Because of the backscattering of neutrons in the drift, issues concerning calibration of the dosemeter arise. Field-specific calibration of the albedo neutron dosemeter was hence performed with Monte Carlo simulations. In order to assess the applicability of the albedo neutron dosemeter in a deep geological repository over a long time scale, spent nuclear fuel with ages of 50, 100 and 500 years was investigated. It was found that the neutron radiation field in a deep geological repository can be assigned to application area 'N1' of the albedo neutron dosemeter, which is typical of reactors and accelerators with heavy shielding. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  11. Primary experimental study on safety of deep brain stimulation in RF electromagnetic field.

    PubMed

    Jun, Xu; Luming, Li; Hongwei, Hao

    2009-01-01

With the rapid growth of clinical application of Deep Brain Stimulation (DBS), concerns about its safety and function in electromagnetic fields, a form of pollution that is becoming ever more serious, have grown significantly. Meanwhile, measurement standards on Electromagnetic Compatibility (EMC) for DBS are still incomplete. In particular, little is known about the signals that electromagnetic fields induce on the implanted lead, although informal reports mention side effects. This paper briefly summarizes the status of EMC standards for implantable medical devices. Building on EMC experiments with a DBS device we developed, two experiments measuring the voltage induced on the deep brain stimulator in an RF electromagnetic field are reported. The measured data showed that the induced voltage at some frequencies was prominent, for example exceeding 2 V. As a preliminary study, these results should prompt researchers to pay more attention to the EMC safety problem and the biological effects of induced voltages in deep brain stimulation and other implantable devices.

  12. Current oscillations in semi-insulating GaAs associated with field-enhanced capture of electrons by the major deep donor EL2

    NASA Technical Reports Server (NTRS)

    Kaminska, M.; Parsey, J. M.; Lagowski, J.; Gatos, H. C.

    1982-01-01

Current oscillations thermally activated by the release of electrons from deep levels in undoped semi-insulating GaAs were observed for the first time. They were attributed to electric-field-enhanced capture of electrons by the dominant deep donor EL2 (antisite AsGa defect). This enhanced capture is due to the configurational energy barrier of EL2, which is readily penetrated by hot electrons.

  13. How Do Young Students with Different Profiles of Reading Skill Mastery, Perceived Ability, and Goal Orientation Respond to Holistic Diagnostic Feedback?

    ERIC Educational Resources Information Center

    Jang, Eunice Eunhee; Dunlop, Maggie; Park, Gina; van der Boom, Edith H.

    2015-01-01

    One critical issue with cognitive diagnostic assessment (CDA) lies in its lack of research evidence that shows how diagnostic feedback from CDA is interpreted and used by young students. This mixed methods research examined how holistic diagnostic feedback (HDF) is processed by young learners with different profiles of reading skills, goal…

  14. Validation Test Report for the Automated Optical Processing System (AOPS) Version 4.12

    DTIC Science & Technology

    2015-09-03

    the Geostationary Ocean Color Imager (GOCI) sensor, aboard the Communication Ocean and Meteorological Satellite (COMS) satellite. Additionally, this...this capability works in conjunction with AOPS • Improvements to the AOPS mosaicking capability • Prepare the NRT Geostationary Ocean Color Imager...Warfare (EXW) Geostationary Ocean Color Imager (GOCI) Gulf of Mexico (GOM) Hierarchical Data Format (HDF) Integrated Data Processing System (IDPS

  15. A spatial epidemiological analysis of self-rated mental health in the slums of Dhaka

    PubMed Central

    2011-01-01

    Background The deprived physical environments present in slums are well-known to have adverse health effects on their residents. However, little is known about the health effects of the social environments in slums. Moreover, neighbourhood quantitative spatial analyses of the mental health status of slum residents are still rare. The aim of this paper is to study self-rated mental health data in several slums of Dhaka, Bangladesh, by accounting for neighbourhood social and physical associations using spatial statistics. We hypothesised that mental health would show a significant spatial pattern in different population groups, and that the spatial patterns would relate to spatially-correlated health-determining factors (HDF). Methods We applied a spatial epidemiological approach, including non-spatial ANOVA/ANCOVA, as well as global and local univariate and bivariate Moran's I statistics. The WHO-5 Well-being Index was used as a measure of self-rated mental health. Results We found that poor mental health (WHO-5 scores < 13) among the adult population (age ≥15) was prevalent in all slum settlements. We detected spatially autocorrelated WHO-5 scores (i.e., spatial clusters of poor and good mental health among different population groups). Further, we detected spatial associations between mental health and housing quality, sanitation, income generation, environmental health knowledge, education, age, gender, flood non-affectedness, and selected properties of the natural environment. Conclusions Spatial patterns of mental health were detected and could be partly explained by spatially correlated HDF. We thereby showed that the socio-physical neighbourhood was significantly associated with health status, i.e., mental health at one location was spatially dependent on the mental health and HDF prevalent at neighbouring locations. Furthermore, the spatial patterns point to severe health disparities both within and between the slums. 
In addition to examining health outcomes, the methodology used here is also applicable to residuals of regression models, for example to help avoid violating the assumption of data independence that underlies many statistical approaches. We assume that similar spatial structures can be found in other studies focussing on neighbourhood effects on health, and therefore argue for a more widespread incorporation of spatial statistics in epidemiological studies. PMID:21599932
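Global Moran's I, the core autocorrelation statistic applied above, is compact enough to sketch. The weights matrix below is a toy one-dimensional neighbour chain standing in for the study's actual slum geography; positive I means similar WHO-5 scores cluster in space, negative I means dissimilar scores are adjacent:

```python
import numpy as np

def morans_i(x, w):
    """Global Moran's I for values x under spatial weights matrix w
    (w[i, j] > 0 when locations i and j are neighbours)."""
    x = np.asarray(x, dtype=float)
    z = x - x.mean()                       # deviations from the mean
    return (len(x) / w.sum()) * (z @ w @ z) / (z @ z)

# Toy chain of four locations with rook-style binary neighbour weights.
w = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
clustered = morans_i([1, 1, 9, 9], w)    # similar values adjacent -> I > 0
alternating = morans_i([1, 9, 1, 9], w)  # dissimilar values adjacent -> I < 0
```

The local (per-location) variant used for cluster maps decomposes the same cross-product term location by location.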

  16. Protein Losses and Urea Nitrogen Underestimate Total Nitrogen Losses in Peritoneal Dialysis and Hemodialysis Patients.

    PubMed

    Salame, Clara; Eaton, Simon; Grimble, George; Davenport, Andrew

    2018-04-28

    Muscle wasting is associated with increased mortality and is commonly reported in dialysis patients. Hemodialysis (HD) and peritoneal dialysis (PD) treatments lead to protein losses in effluent dialysate. We wished to determine whether changes in current dialysis practice had increased therapy-associated nitrogen losses. Cross-sectional cohort study. Measurement of total protein, urea and total nitrogen in effluent dialysate from 24-hour collections from PD patients, and during haemodiafiltration (HDF) and haemodialysis (HD) sessions. One hundred eight adult dialysis patients. Peritoneal dialysis, high-flux haemodialysis and haemodiafiltration. Total nitrogen and protein losses. Dialysate protein losses were measured in 68 PD and 40 HD patients. Sessional losses of urea (13.9 [9.2-21.1] vs. 4.8 [2.8-7.8] g); protein (8.6 [7.2-11.1] vs. 6.7 [3.9-11.1] g); and nitrogen (11.5 [8.7-17.7] vs. 4.9 [2.6-9.5] g) were all greater for HD than PD, P < .001. Protein-derived nitrogen was 71.9 (54.4-110.4) g for HD and 30.8 (16.1-59.6) g for PD. Weekly protein losses were lower with HD 25.9 (21.5-33.4) versus 46.6 (27-77.6) g/week, but nitrogen losses were similar. We found no difference between high-flux HD and HDF: urea (13.5 [8.8-20.6] vs. 15.3 [10.5-25.5] g); protein (8.8 [7.3-12.2] vs. 7.6 [5.8-9.0] g); and total nitrogen (11.6 [8.3-17.3] vs. 10.8 [8.9-22.5] g). Urea nitrogen (UN) only accounted for 45.1 (38.3-51.0)% PD and 63.0 (55.3-62.4)% HD of total nitrogen losses. Although sessional losses of protein and UN were greater with HD, weekly losses were similar between modalities. We found no differences between HD and HDF. However, total nitrogen losses were much greater than the combination of protein and UN, suggesting greater nutritional losses with dialysis than previously reported. Copyright © 2018 National Kidney Foundation, Inc. Published by Elsevier Inc. All rights reserved.

  17. SciSpark: Highly Interactive and Scalable Model Evaluation and Climate Metrics

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Mattmann, C. A.; Waliser, D. E.; Kim, J.; Loikith, P.; Lee, H.; McGibbney, L. J.; Whitehall, K. D.

    2014-12-01

Remote sensing data and climate model output are multi-dimensional arrays of massive sizes locked away in heterogeneous file formats (HDF5/4, NetCDF 3/4) and metadata models (HDF-EOS, CF), making it difficult to perform multi-stage, iterative science processing since each stage requires writing and reading data to and from disk. We are developing a lightning-fast Big Data technology called SciSpark based on Apache Spark. Spark implements the map-reduce paradigm for parallel computing on a cluster, but emphasizes in-memory computation, "spilling" to disk only as needed, and so outperforms the disk-based Apache Hadoop by 100x in memory and by 10x on disk, and makes iterative algorithms feasible. SciSpark will enable scalable model evaluation by executing large-scale comparisons of A-Train satellite observations to model grids on a cluster of 100 to 1000 compute nodes. This 2nd generation capability for NASA's Regional Climate Model Evaluation System (RCMES) will compute simple climate metrics at interactive speeds, and extend to quite sophisticated iterative algorithms such as machine-learning (ML) based clustering of temperature PDFs, and even graph-based algorithms for searching for Mesoscale Convective Complexes. The goals of SciSpark are to: (1) Decrease the time to compute comparison statistics and plots from minutes to seconds; (2) Allow for interactive exploration of time-series properties over seasons and years; (3) Decrease the time for satellite data ingestion into RCMES to hours; (4) Allow for Level-2 comparisons with higher-order statistics or PDFs in minutes to hours; and (5) Move RCMES into a near real time decision-making platform.
We will report on: the architecture and design of SciSpark, our efforts to integrate climate science algorithms in Python and Scala, parallel ingest and partitioning (sharding) of A-Train satellite observations from HDF files and model grids from netCDF files, first parallel runs to compute comparison statistics and PDFs, and first metrics quantifying parallel speedups and memory & disk usage.
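The map-reduce pattern Spark implements can be illustrated in plain Python: map each partition of model-minus-observation differences to partial statistics, then reduce the partials into global bias and RMSE. The data and partitioning below are hypothetical, standing in for sharded satellite/model grids:

```python
from functools import reduce

# Hypothetical model-minus-observation differences, pre-partitioned
# the way Spark would shard them across workers.
partitions = [[0.5, -0.2, 0.1], [0.3, -0.4], [0.0, 0.2, -0.1]]

def map_stats(part):
    """Per-partition partial statistics: (count, sum, sum of squares)."""
    return (len(part), sum(part), sum(d * d for d in part))

def merge(a, b):
    """Associative reduce step, safe to apply in any order across workers."""
    return (a[0] + b[0], a[1] + b[1], a[2] + b[2])

n, total, total_sq = reduce(merge, map(map_stats, partitions))
bias = total / n              # mean model-observation difference
rmse = (total_sq / n) ** 0.5  # root-mean-square difference
```

Because `merge` is associative and commutative, the reduction can run tree-wise across a cluster without ever materializing the full dataset on one node, which is the property Spark's in-memory `reduce` exploits.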

  18. Magnetothermal genetic deep brain stimulation of motor behaviors in awake, freely moving mice

    PubMed Central

    Zhang, Qian; Castellanos Rubio, Idoia; del Pino, Pablo

    2017-01-01

Establishing how neurocircuit activation causes particular behaviors requires modulating the activity of specific neurons. Here, we demonstrate that magnetothermal genetic stimulation provides tetherless deep brain activation sufficient to evoke motor behavior in awake mice. The approach uses alternating magnetic fields to heat superparamagnetic nanoparticles on the neuronal membrane. Neurons, heat-sensitized by expressing TRPV1, are activated with magnetic field application. Magnetothermal genetic stimulation in the motor cortex evoked ambulation, deep brain stimulation in the striatum caused rotation around the body-axis, and stimulation near the ridge between ventral and dorsal striatum caused freezing-of-gait. The duration of the behavior correlated tightly with field application. This approach provides genetically and spatially targetable, repeatable and temporally precise activation of deep-brain circuits without the need for surgical implantation of any device. PMID:28826470

  19. Accurate segmentation of lung fields on chest radiographs using deep convolutional networks

    NASA Astrophysics Data System (ADS)

    Arbabshirani, Mohammad R.; Dallal, Ahmed H.; Agarwal, Chirag; Patel, Aalpan; Moore, Gregory

    2017-02-01

Accurate segmentation of lung fields on chest radiographs is the primary step for computer-aided detection of various conditions such as lung cancer and tuberculosis. The size, shape and texture of lung fields are key parameters for chest X-ray (CXR) based lung disease diagnosis, in which lung field segmentation is a significant primary step. Although many methods have been proposed for this problem, lung field segmentation remains a challenge. In recent years, deep learning has shown state-of-the-art performance in many visual tasks such as object detection, image classification and semantic image segmentation. In this study, we propose a deep convolutional neural network (CNN) framework for segmentation of lung fields. The algorithm was developed and tested on 167 clinical posterior-anterior (PA) CXR images collected retrospectively from the picture archiving and communication system (PACS) of Geisinger Health System. The proposed multi-scale network is composed of five convolutional and two fully connected layers. The framework achieved an IOU (intersection over union) of 0.96 on the testing dataset as compared to manual segmentation. The suggested framework outperforms state-of-the-art registration-based segmentation by a significant margin. To our knowledge, this is the first deep learning based study of lung field segmentation on CXR images developed on a heterogeneous clinical dataset. The results suggest that convolutional neural networks could be employed reliably for lung field segmentation.
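The IOU metric reported above is straightforward to compute from binary masks. The toy 4x4 masks below are illustrative stand-ins for full-size lung-field segmentations:

```python
import numpy as np

def iou(pred, truth):
    """Intersection over union of two binary segmentation masks."""
    pred = np.asarray(pred, dtype=bool)
    truth = np.asarray(truth, dtype=bool)
    union = np.logical_or(pred, truth).sum()
    if union == 0:
        return 1.0  # both masks empty: define as perfect agreement
    return np.logical_and(pred, truth).sum() / union

# Toy masks: the prediction misses one true pixel per edge row and
# adds one false pixel; real CXR masks are just larger arrays.
truth = np.array([[0, 1, 1, 0],
                  [1, 1, 1, 1],
                  [1, 1, 1, 1],
                  [0, 1, 1, 0]])
pred = np.array([[0, 1, 1, 0],
                 [1, 1, 1, 0],
                 [1, 1, 1, 1],
                 [1, 1, 1, 0]])
score = iou(pred, truth)  # 11 overlapping pixels over 13 in the union
```

An IOU of 0.96, as reported, means intersection and union differ by only a few percent of the lung-field area.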

  20. Deep-turbulence wavefront sensing using digital holography in the on-axis phase shifting recording geometry

    NASA Astrophysics Data System (ADS)

    Thornton, Douglas E.; Spencer, Mark F.; Perram, Glen P.

    2017-09-01

The effects of deep turbulence in long-range imaging applications present unique challenges to properly measuring and correcting for aberrations incurred along the atmospheric path. In practice, digital holography can detect the path-integrated wavefront distortions caused by deep turbulence, and different recording geometries offer different benefits depending on the application of interest. Previous studies have evaluated the performance of the off-axis image and pupil plane recording geometries for deep-turbulence sensing. This study models digital holography in the on-axis phase shifting recording geometry using wave-optics simulations. In particular, the analysis models spherical-wave propagation through varying deep-turbulence conditions to estimate the complex optical field, and performance is evaluated by calculating the field-estimated Strehl ratio and RMS wavefront error. Altogether, the results show that digital holography in the on-axis phase shifting recording geometry is an effective wavefront-sensing method in the presence of deep turbulence.
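The two metrics above are linked: Strehl ratio is commonly related to RMS wavefront error through the Maréchal approximation. A sketch of that generic relation (not necessarily the paper's exact field-estimated Strehl estimator):

```python
import math

def strehl_marechal(rms_wfe_waves):
    """Marechal approximation: Strehl ratio from RMS wavefront error
    sigma expressed in waves, S ~ exp(-(2*pi*sigma)^2)."""
    return math.exp(-(2.0 * math.pi * rms_wfe_waves) ** 2)

# lambda/14 RMS wavefront error gives S ~ 0.8, the conventional
# threshold for diffraction-limited imaging.
s = strehl_marechal(1.0 / 14.0)
```

The approximation holds for small residual aberrations; in strong deep-turbulence conditions the directly computed field-estimated Strehl ratio is the more robust figure of merit.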

  1. A comparative study of approaches to compute the field distribution of deep brain stimulation in the Hemiparkinson rat model.

    PubMed

    Bohme, Andrea; van Rienen, Ursula

    2016-08-01

Computational modeling of the stimulating field distribution during Deep Brain Stimulation provides an opportunity to advance our knowledge of this neurosurgical therapy for Parkinson's disease. Several approaches exist to model the target region for Deep Brain Stimulation in Hemiparkinson rats with volume conductor models. We have described and compared the normalized mapping approach as well as modeling with three-dimensional structures, which include curvilinear coordinates to ensure an anatomically realistic conductivity tensor orientation.

  2. The Chandra Deep Wide-Field Survey: Completing the new generation of Chandra extragalactic surveys

    NASA Astrophysics Data System (ADS)

    Hickox, Ryan

    2016-09-01

    Chandra X-ray surveys have revolutionized our view of the growth of black holes across cosmic time. Recently, fundamental questions have emerged about the connection of AGN to their host large scale structures that clearly demand a wide, deep survey over a large area, comparable to the recent extensive Chandra surveys in smaller fields. We propose the Chandra Deep Wide-Field Survey (CDWFS) covering the central 6 sq. deg in the Bootes field, totaling 1.025 Ms (building on 550 ks from the HRC GTO program). CDWFS will efficiently probe a large cosmic volume, allowing us to carry out accurate new investigations of the connections between black holes and their large-scale structures, and will complete the next generation surveys that comprise a key part of Chandra's legacy.

  3. Deep Seawater Intrusion Enhanced by Geothermal Through Deep Faults in Xinzhou Geothermal Field in Guangdong, China

    NASA Astrophysics Data System (ADS)

    Lu, G.; Ou, H.; Hu, B. X.; Wang, X.

    2017-12-01

This study investigates anomalous seawater intrusion from depth, riding an inland-directed deep groundwater flow that is enhanced by deep faults and geothermal processes. The study site, the Xinzhou geothermal field, is 20 km from the coastline on southern China's Guangdong coast, part of China's long coastal geothermal belt. The geothermal water is salty, which has fueled speculation that it is retained ancient seawater. However, the perpetual "pumping" of the self-flowing outflow of geothermal waters might alter the deep underground flow to favor large-scale or long-distance seawater intrusion. We studied the geochemical characteristics of the geothermal water and found it to be a mixture of seawater with rain water or pore water, with no indication of dilution. We also conducted numerical studies of the buoyancy-driven geothermal flow in the deep subsurface and found that, at depths of a thousand meters, the hydraulic gradient favors inland-directed groundwater flow, allowing seawater to intrude inland over an unusually long distance of tens of kilometers in a granitic groundwater flow system. This work is a first step toward understanding the geo-environment of deep groundwater flow.

  4. ARC-2012-ACD12-0020-001

    NASA Image and Video Library

    2012-02-02

Stein_Sun: Visualization of the complex magnetic field produced as magnetic flux rises toward the Sun's surface from the deep convection zone. The image shows a snapshot of how the magnetic field has evolved two days from the time a uniform, untwisted, horizontal magnetic field started to be advected by inflows at the bottom (20 megameters deep). Axes are in megameters, and the color scale shows the log of the magnetic field strength. Credit: Robert Stein, Michigan State University; Tim Sandstrom, NASA/Ames

  5. Adding the missing piece: Spitzer imaging of the HSC-Deep/PFS fields

    NASA Astrophysics Data System (ADS)

    Sajina, Anna; Bezanson, Rachel; Capak, Peter; Egami, Eiichi; Fan, Xiaohui; Farrah, Duncan; Greene, Jenny; Goulding, Andy; Lacy, Mark; Lin, Yen-Ting; Liu, Xin; Marchesini, Danilo; Moutard, Thibaud; Ono, Yoshiaki; Ouchi, Masami; Sawicki, Marcin; Strauss, Michael; Surace, Jason; Whitaker, Katherine

    2018-05-01

We propose to observe a total of 7 sq. deg. to complete the Spitzer-IRAC coverage of the HSC-Deep survey fields. These fields are the sites of the Prime Focus Spectrograph (PFS) galaxy evolution survey, which will provide spectra of wide wavelength range and resolution for almost all M* galaxies at z ~ 0.7-1.7, extending out to z ~ 7 for targeted samples. Our fields already have deep broadband and narrowband photometry in 12 bands spanning u through K and a wealth of other ancillary data. We propose completing the matching-depth IRAC observations in the extended COSMOS, ELAIS-N1 and Deep2-3 fields. By complementing existing Spitzer coverage, this program will produce a dataset of unprecedented spectro-photometric coverage across a total of 15 sq. deg. This dataset will have significant legacy value, as it samples a cosmic volume large enough to be representative of the full range of environments while providing sufficient information content per galaxy to confidently derive stellar population characteristics. This enables detailed studies of the growth and quenching of galaxies and their supermassive black holes in the context of a galaxy's local and large-scale environment.

  6. Deep Zonal Flow and Time Variation of Jupiter’s Magnetic Field

    NASA Astrophysics Data System (ADS)

    Cao, Hao; Stevenson, David J.

    2017-10-01

All four giant planets in the Solar System feature zonal flows on the order of 100 m/s in the cloud deck, and large-scale intrinsic magnetic fields on the order of 1 Gauss near the surface. The vertical structure of the zonal flows remains obscure. The end-member scenarios are shallow flows confined to the radiative atmosphere and deep flows extending throughout the entire planet. The electrical conductivity increases rapidly yet smoothly as a function of depth inside Jupiter and Saturn. Deep zonal flows will advect the non-axisymmetric component of the magnetic field at depths with even modest electrical conductivity, and create time variations in the magnetic field. The observed time variations of the geomagnetic field have been used to derive surface flows of the Earth's outer core. The same principle applies to Jupiter; however, the connection between the time variation of the magnetic field (dB/dt) and deep zonal flow (Uphi) at Jupiter is not well understood due to the strong radial variation of electrical conductivity. Here we perform a quantitative analysis of the connection between dB/dt and Uphi for Jupiter, adopting a realistic interior electrical conductivity profile and taking the likely presence of alkali metals into account. This provides a tool to translate expected measurements of the time variation of Jupiter's magnetic field to deep zonal flows. We show that the current upper limit on the dipole drift rate of Jupiter (3 degrees per 20 years) is compatible with 10 m/s zonal flows with < 500 km vertical scale height below 0.972 Rj. We further demonstrate that fast drift of resolved magnetic features (e.g. magnetic spots) at Jupiter is a possibility.
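The advective generation of secular variation described above follows from the magnetic induction equation; a sketch, written with a radially varying magnetic diffusivity $\eta(r) = 1/(\mu_0 \sigma(r))$ to reflect Jupiter's smooth conductivity profile:

```latex
\frac{\partial \mathbf{B}}{\partial t}
  = \nabla \times \left( \mathbf{u} \times \mathbf{B} \right)
  - \nabla \times \left( \eta \, \nabla \times \mathbf{B} \right)
```

Where $\sigma$ is even modest, the advection term $\nabla \times (\mathbf{u} \times \mathbf{B})$ dominates, so a zonal flow $\mathbf{u} = U_\phi \hat{\mathbf{e}}_\phi$ shearing the non-axisymmetric field produces an observable $\partial \mathbf{B}/\partial t$, which is the dB/dt-to-Uphi connection the analysis quantifies.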

  7. VizieR Online Data Catalog: Galaxy samples rest-frame ultraviolet structure (Bond+, 2014)

    NASA Astrophysics Data System (ADS)

    Bond, N. A.; Gardner, J. P.; de Mello, D. F.; Teplitz, H. I.; Rafelski, M.; Koekemoer, A. M.; Coe, D.; Grogin, N.; Gawiser, E.; Ravindranath, S.; Scarlata, C.

    2017-03-01

    In this paper, we use data taken as part of a program (GO 11563, PI: Teplitz) to obtain UV imaging of the Hubble Ultra Deep Field (hereafter UVUDF) and study intermediate-redshift galaxy structure in the F336W, F275W, and F225W filters, complementing existing optical and near-IR measurements from the 2012 Hubble Ultra Deep Field (HUDF12; Ellis et al. 2013ApJ...763L...7E) survey. We use AB magnitudes throughout and assume a concordance cosmology with H0=71 km/s/Mpc, Ωm=0.27, and ΩΛ=0.73 (Spergel et al. 2007ApJS..170..377S). The UVUDF data and the optical Hubble Ultra Deep Field (UDF; Beckwith et al. 2006, J/AJ/132/1729) are both contained within a single deep field in the Great Observatories Origins Deep Survey South. The new UVUDF data include imaging in three filters (F336W, F275W, and F225W), obtained in 10 visits, for a total of 30 orbits per filter. In addition, from the UDF, we make use of deep drizzled images taken in the observed optical with the F435W, F606W, and F775W filters. (1 data file).
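    For readers who want to reproduce distances under the concordance cosmology stated above (H0 = 71 km/s/Mpc, Ωm = 0.27, ΩΛ = 0.73), a minimal numerical sketch follows. It assumes a flat universe and simple trapezoidal integration; it is an illustration, not code from the catalog.

```python
import math

# Concordance cosmology quoted in the abstract (flat universe assumed)
H0 = 71.0            # km/s/Mpc
OMEGA_M = 0.27
OMEGA_L = 0.73
C_KM_S = 299792.458  # speed of light, km/s

def comoving_distance(z, steps=10000):
    """Line-of-sight comoving distance in Mpc via trapezoidal integration
    of dz / E(z), with E(z) = sqrt(Om (1+z)^3 + OL) for a flat universe."""
    dz = z / steps
    total = 0.0
    for i in range(steps + 1):
        zi = i * dz
        e = math.sqrt(OMEGA_M * (1 + zi) ** 3 + OMEGA_L)
        w = 0.5 if i in (0, steps) else 1.0  # trapezoid endpoint weights
        total += w / e
    return (C_KM_S / H0) * total * dz

def luminosity_distance(z):
    """Luminosity distance in Mpc (flat universe: D_L = (1+z) * D_C)."""
    return (1 + z) * comoving_distance(z)

print(round(luminosity_distance(1.0)), "Mpc")
```

Under these parameters the luminosity distance at z = 1 comes out to roughly 6.6 Gpc.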

  8. DC3 Data and Information Page

    Atmospheric Science Data Center

    2015-03-16

    Deep Convective Clouds and Chemistry (DC3) Data and Information The Deep Convective Clouds and Chemistry (DC3) field campaign is investigating the impact of deep, ... processes, on upper tropospheric (UT) composition and chemistry. The primary science objectives are: To quantify and ...

  9. A web-based subsetting service for regional scale MODIS land products

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    SanthanaVannan, Suresh K; Cook, Robert B; Holladay, Susan K

    2009-12-01

    The Moderate Resolution Imaging Spectroradiometer (MODIS) sensor has provided valuable information on various aspects of the Earth System since March 2000. The spectral, spatial, and temporal characteristics of MODIS products have made them an important data source for analyzing key science questions relating to Earth System processes at regional, continental, and global scales. The size of the MODIS product and native HDF-EOS format are not optimal for use in field investigations at individual sites (100 - 100 km or smaller). In order to make MODIS data readily accessible for field investigations, the NASA-funded Distributed Active Archive Center (DAAC) for Biogeochemical Dynamics at Oak Ridge National Laboratory (ORNL) has developed an online system that provides MODIS land products in an easy-to-use format and in file sizes more appropriate to field research. This system provides MODIS land products data in a nonproprietary comma-delimited ASCII format and in GIS-compatible formats (GeoTIFF and ASCII grid). Web-based visualization tools are also available as part of this system and these tools provide a quick snapshot of the data. Quality control tools and a multitude of data delivery options are available to meet the demands of various user communities. This paper describes the important features and design goals for the system, particularly in the context of data archive and distribution for regional scale analysis. The paper also discusses the ways in which data from this system can be used for validation, data intercomparison, and modeling efforts.
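    The core operation the abstract describes — cutting a small, site-centered window out of a continental-scale gridded product and delivering it as comma-delimited ASCII — can be sketched in a few lines. The grid, pixel indices, and helper names below are illustrative assumptions, not the ORNL DAAC's actual implementation.

```python
def subset_grid(grid, center_row, center_col, half_width):
    """Return a square window of (2*half_width + 1) cells around a site
    pixel, clipped at the grid edges."""
    r0 = max(0, center_row - half_width)
    r1 = min(len(grid), center_row + half_width + 1)
    c0 = max(0, center_col - half_width)
    c1 = min(len(grid[0]), center_col + half_width + 1)
    return [row[c0:c1] for row in grid[r0:r1]]

def to_csv(window):
    """Serialize the subset in a nonproprietary comma-delimited form,
    like the ASCII output the service delivers."""
    return "\n".join(",".join(str(v) for v in row) for row in window)

# Toy 6x6 "tile" of pixel values; a real MODIS tile is far larger.
tile = [[10 * r + c for c in range(6)] for r in range(6)]
window = subset_grid(tile, 2, 2, 1)  # 3x3 cells around the site pixel
print(to_csv(window))
```

A real service would first map the site's latitude/longitude to the row/column of the product's projection before extracting the window.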

  10. An Overview of ARL’s Multimodal Signatures Database and Web Interface

    DTIC Science & Technology

    2007-12-01

    ActiveX components, which hindered distribution due to license agreements and run-time license software to use such components. g. Proprietary...Overview The database consists of multimodal signature data files in the HDF5 format. Generally, each signature file contains all the ancillary...only contains information in the database, Web interface, and signature files that is releasable to the public. The Web interface consists of static

  11. Deep brain stimulation as a functional scalpel.

    PubMed

    Broggi, G; Franzini, A; Tringali, G; Ferroli, P; Marras, C; Romito, L; Maccagnano, E

    2006-01-01

    Since 1995, at the Istituto Nazionale Neurologico "Carlo Besta" in Milan (INNCB), 401 deep brain electrodes were implanted to treat several drug-resistant neurological syndromes (Fig. 1). More than 200 patients are still available for follow-up and therapeutic considerations. In this paper our experience is reviewed and pioneered fields are highlighted. The reported series of patients extends the use of deep brain stimulation beyond the field of Parkinson's disease to new fields such as cluster headache, disruptive behaviour, SUNCT, epilepsy and tardive dystonia. The low complication rate, the reversibility of the procedure and the available image-guided surgery tools will further increase the therapeutic applications of DBS. New therapeutic applications are expected for this functional scalpel.

  12. The role of hemocytes in A. gambiae antiplasmodial immunity

    PubMed Central

    Ramirez, Jose Luis; Garver, Lindsey S.; Brayner, Fábio André; Alves, Luiz Carlos; Rodrigues, Janneth; Molina-Cruz, Alvaro; Barillas-Mury, Carolina

    2013-01-01

    Hemocytes synthesize key components of the mosquito complement-like system, but their role in the activation of antiplasmodial responses has not been established. The effect of activating Toll signaling in hemocytes on Plasmodium survival was investigated by transferring hemocytes or cell-free hemolymph from donor mosquitoes in which the suppressor cactus was silenced. These transfers greatly enhanced antiplasmodial immunity, indicating that hemocytes are active players in the activation of the complement-like system, through an effector(s) regulated by the Toll pathway. A comparative analysis of hemocyte populations between susceptible (S) G3 and the refractory (R) L3-5 A. gambiae mosquito strains did not reveal significant differences under basal conditions or in response to Plasmodium berghei infection. The response of S mosquitoes to different Plasmodium species revealed similar kinetics following infection with P. berghei, P. yoelii or P. falciparum, but the strength of the priming response was stronger in less compatible mosquito-parasite pairs. The Toll, Imd, STAT or JNK signaling cascades were not essential for the production of hemocyte differentiation factor (HDF) in response to P. berghei infection, but disruption of Toll, STAT or JNK abolished hemocyte differentiation in response to HDF. We conclude that hemocytes are key mediators of A. gambiae antiplasmodial responses. PMID:23886925

  13. Spark and HPC for High Energy Physics Data Analyses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sehrish, Saba; Kowalkowski, Jim; Paterno, Marc

    A full High Energy Physics (HEP) data analysis is divided into multiple data reduction phases. Processing within these phases is extremely time consuming, therefore intermediate results are stored in files held in mass storage systems and referenced as part of large datasets. This processing model limits what can be done with interactive data analytics. Growth in size and complexity of experimental datasets, along with emerging big data tools are beginning to cause changes to the traditional ways of doing data analyses. Use of big data tools for HEP analysis looks promising, mainly because extremely large HEP datasets can be represented and held in memory across a system, and accessed interactively by encoding an analysis using high-level programming abstractions. The mainstream tools, however, are not designed for scientific computing or for exploiting the available HPC platform features. We use an example from the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) in Geneva, Switzerland. The LHC is the highest energy particle collider in the world. Our use case focuses on searching for new types of elementary particles explaining Dark Matter in the universe. We use HDF5 as our input data format, and Spark to implement the use case. We show the benefits and limitations of using Spark with HDF5 on Edison at NERSC.
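    The analysis pattern described — holding a dataset in memory and expressing a selection with high-level functional abstractions rather than file-by-file batch jobs — reduces to something like the following. The event fields and cut values are invented for illustration, and Python's built-ins stand in for Spark's distributed filter/reduce operations.

```python
from functools import reduce

# Toy event records standing in for columns read from an HDF5 file.
events = [
    {"missing_et": 120.0, "n_jets": 3},
    {"missing_et": 310.0, "n_jets": 2},
    {"missing_et": 45.0,  "n_jets": 1},
    {"missing_et": 280.0, "n_jets": 4},
]

# A dark-matter-style selection: large missing transverse energy.
selected = [e for e in events if e["missing_et"] > 200.0]          # "filter"
total_met = reduce(lambda acc, e: acc + e["missing_et"], selected, 0.0)  # "reduce"

print(len(selected), total_met)
```

In Spark the same two steps would run in parallel across the cluster's memory, which is what makes the selection interactive even for very large datasets.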

  14. Applying n-bit floating point numbers and integers, and the n-bit filter of HDF5 to reduce file sizes of remote sensing products in memory-sensitive environments

    NASA Astrophysics Data System (ADS)

    Zinke, Stephan

    2017-02-01

    Memory-sensitive applications for remote sensing data require memory-optimized data types in remote sensing products. Hierarchical Data Format version 5 (HDF5) offers user-defined floating point numbers and integers and the n-bit filter to create data types optimized for memory consumption. The European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT) applies a compaction scheme to the disseminated products of the Day and Night Band (DNB) data of the Suomi National Polar-orbiting Partnership (S-NPP) satellite's instrument Visible Infrared Imager Radiometer Suite (VIIRS) through the EUMETSAT Advanced Retransmission Service, converting the original 32-bit floating point numbers to user-defined floating point numbers in combination with the n-bit filter for the radiance dataset of the product. The radiance dataset requires a floating point representation due to the high dynamic range of the DNB. A compression factor of 1.96 is reached by using an automatically determined exponent size and an 8-bit trailing significand, thus reducing the bandwidth requirements for dissemination. It is shown how the parameters needed for user-defined floating point numbers are derived or determined automatically based on the data present in a product.
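    The parameter derivation the abstract alludes to — choosing the smallest exponent field that still spans the data's dynamic range, then adding a fixed-width trailing significand — can be sketched as follows. This mirrors the idea only; it is not EUMETSAT's actual code, and a real HDF5 n-bit datatype also involves bias and bit-offset bookkeeping (e.g. via H5Tset_fields/H5Tset_precision in the C API).

```python
import math

def min_exponent_bits(vmin, vmax):
    """Smallest exponent field width covering the binary exponents of the
    data. Assumes strictly positive values, as for radiances."""
    e_lo = math.floor(math.log2(vmin))
    e_hi = math.floor(math.log2(vmax))
    span = e_hi - e_lo + 1          # number of distinct binary exponents
    bits = 1
    while (1 << bits) < span:
        bits += 1
    return bits

def packed_bits(vmin, vmax, significand_bits=8):
    """Total width: sign bit + exponent field + trailing significand."""
    return 1 + min_exponent_bits(vmin, vmax) + significand_bits

# DNB radiances span many decades; the bounds here are illustrative only.
nbits = packed_bits(1e-11, 1e-2)
print(nbits, "bits per value; raw packing ratio vs 32-bit:", 32 / nbits)
```

The achieved file-level compression factor (1.96 in the paper) differs from the raw bit ratio because of chunking, metadata, and padding overheads in the stored product.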

  15. SDS: A Framework for Scientific Data Services

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dong, Bin; Byna, Surendra; Wu, Kesheng

    2013-10-31

    Large-scale scientific applications typically write their data to parallel file systems with organizations designed to achieve fast write speeds. Analysis tasks frequently read the data in a pattern that is different from the write pattern, and therefore experience poor I/O performance. In this paper, we introduce a prototype framework for bridging the performance gap between write and read stages of data access from parallel file systems. We call this framework Scientific Data Services, or SDS for short. This initial implementation of SDS focuses on reorganizing previously written files into data layouts that benefit read patterns, and transparently directs read calls to the reorganized data. SDS follows a client-server architecture. The SDS Server manages partial or full replicas of reorganized datasets and serves SDS Clients' requests for data. The current version of the SDS client library supports the HDF5 programming interface for reading data. The client library intercepts HDF5 calls and transparently redirects them to the reorganized data. The SDS client library also provides a querying interface for reading part of the data based on user-specified selective criteria. We describe the design and implementation of the SDS client-server architecture, and evaluate the response time of the SDS Server and the performance benefits of SDS.
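    The client-side redirection described above — intercept a read call, ask the server whether a reorganized replica exists, and serve from it transparently — has this basic shape. The class and method names are invented for illustration and are not the SDS API; a dict stands in for files on disk.

```python
class ReplicaCatalog:
    """Maps original file paths to reorganized replicas (the server's job)."""
    def __init__(self):
        self._replicas = {}

    def register(self, original, replica):
        self._replicas[original] = replica

    def resolve(self, original):
        # Fall back to the original path if no replica has been built yet.
        return self._replicas.get(original, original)

class RedirectingReader:
    """Stands in for the client library intercepting an HDF5 read call."""
    def __init__(self, catalog, storage):
        self.catalog = catalog
        self.storage = storage  # path -> dataset; a dict standing in for disk

    def read(self, path):
        return self.storage[self.catalog.resolve(path)]

catalog = ReplicaCatalog()
catalog.register("/data/run1.h5", "/replicas/run1_rowsorted.h5")
storage = {
    "/data/run1.h5": [3, 1, 2],
    "/replicas/run1_rowsorted.h5": [1, 2, 3],  # layout optimized for reads
}
reader = RedirectingReader(catalog, storage)
print(reader.read("/data/run1.h5"))  # transparently served from the replica
```

The caller keeps using the original path; only the catalog knows that the bytes now come from a read-optimized replica, which is what makes the redirection transparent.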

  16. A warm Spitzer survey of the LSST/DES 'Deep drilling' fields

    NASA Astrophysics Data System (ADS)

    Lacy, Mark; Farrah, Duncan; Brandt, Niel; Sako, Masao; Richards, Gordon; Norris, Ray; Ridgway, Susan; Afonso, Jose; Brunner, Robert; Clements, Dave; Cooray, Asantha; Covone, Giovanni; D'Andrea, Chris; Dickinson, Mark; Ferguson, Harry; Frieman, Joshua; Gupta, Ravi; Hatziminaoglou, Evanthia; Jarvis, Matt; Kimball, Amy; Lubin, Lori; Mao, Minnie; Marchetti, Lucia; Mauduit, Jean-Christophe; Mei, Simona; Newman, Jeffrey; Nichol, Robert; Oliver, Seb; Perez-Fournon, Ismael; Pierre, Marguerite; Rottgering, Huub; Seymour, Nick; Smail, Ian; Surace, Jason; Thorman, Paul; Vaccari, Mattia; Verma, Aprajita; Wilson, Gillian; Wood-Vasey, Michael; Cane, Rachel; Wechsler, Risa; Martini, Paul; Evrard, August; McMahon, Richard; Borne, Kirk; Capozzi, Diego; Huang, Jiashang; Lagos, Claudia; Lidman, Chris; Maraston, Claudia; Pforr, Janine; Sajina, Anna; Somerville, Rachel; Strauss, Michael; Jones, Kristen; Barkhouse, Wayne; Cooper, Michael; Ballantyne, David; Jagannathan, Preshanth; Murphy, Eric; Pradoni, Isabella; Suntzeff, Nicholas; Covarrubias, Ricardo; Spitler, Lee

    2014-12-01

    We propose a warm Spitzer survey to microJy depth of the four predefined Deep Drilling Fields (DDFs) for the Large Synoptic Survey Telescope (LSST) (three of which are also deep drilling fields for the Dark Energy Survey (DES)). Imaging these fields with warm Spitzer is a key component of the overall success of these projects, which address the 'Physics of the Universe' theme of the Astro2010 decadal survey. With deep, accurate, near-infrared photometry from Spitzer in the DDFs, we will generate photometric redshift distributions to apply to the surveys as a whole. The DDFs are also the areas where the supernova searches of DES and LSST are concentrated, and deep Spitzer data are essential to obtain photometric redshifts, stellar masses and constraints on ages and metallicities for the >10000 supernova host galaxies these surveys will find. This 'DEEPDRILL' survey will also address the 'Cosmic Dawn' goal of Astro2010 by being deep enough to find all the >10^11 solar mass galaxies within the survey area out to z~6. DEEPDRILL will complete the final 24.4 square degrees of imaging in the DDFs, which, when added to the 14 square degrees already imaged to this depth, will map a volume of 1 Gpc^3 at z>2. It will find ~100 > 10^11 solar mass galaxies at z~5 and ~40 protoclusters at z>2, providing targets for JWST that can be found in no other way. The Spitzer data, in conjunction with the multiwavelength surveys in these fields, ranging from X-ray through far-infrared and cm-radio, will comprise a unique legacy dataset for studies of galaxy evolution.

  17. FIELD TEST OF AIR SPARGING COUPLED WITH SOIL VAPOR EXTRACTION

    EPA Science Inventory

    A controlled field study was designed and conducted to assess the performance of air sparging for remediation of petroleum fuel and solvent contamination in a shallow (3-m deep) groundwater aquifer. Sparging was performed in an isolation test cell (5 m by 3 m by 8-m deep). A soi...

  18. Data to Support Development of Geologic Framework Models for the Deep Borehole Field Test

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perry, Frank Vinton; Kelley, Richard E.

    This report summarizes work conducted in FY2017 to identify and document publicly available data for developing a Geologic Framework Model (GFM) for the Deep Borehole Field Test (DBFT). Data were collected for all four of the sites under consideration in 2017 as the DBFT site.

  19. Miniature ingestible telemeter devices to measure deep-body temperature

    NASA Technical Reports Server (NTRS)

    Pope, J. M.; Fryer, T. B. (Inventor)

    1976-01-01

    A telemetry device comprised of a pill-size ingestible transmitter developed to obtain deep body temperature measurements of a human is described. The device has particular utility in the medical field where deep body temperatures provide an indication of general health.

  20. Comparison of the induced fields using different coil configurations during deep transcranial magnetic stimulation.

    PubMed

    Lu, Mai; Ueno, Shoogo

    2017-01-01

    Stimulation of deeper brain structures by transcranial magnetic stimulation (TMS) plays a role in the study of reward and motivation mechanisms, which may be beneficial in the treatment of several neurological and psychiatric disorders. However, electric field distributions induced in the brain by deep transcranial magnetic stimulation (dTMS) are still unknown. In this paper, the double cone coil, H-coil and Halo-circular assembly (HCA) coil, which have been proposed for dTMS, were numerically designed. The distributions of magnetic flux density and induced electric field in an anatomically based realistic head model produced by the dTMS coils were numerically calculated by the impedance method. Results were compared with those of a standard figure-of-eight (Fo8) coil. Simulation results show that the double cone, H- and HCA coils have significantly deeper field penetration than the conventional Fo8 coil, at the expense of higher and more widely spread electric fields induced in superficial cortical regions. The double cone and HCA coils have a better ability to stimulate deep brain subregions than the H-coil. At the same time, both the double cone and HCA coils increase the risk of optic nerve excitation. Our results suggest that although the dTMS coils offer a new tool with potential for both research and clinical applications for psychiatric and neurological disorders associated with dysfunctions of deep brain regions, the selection of the most suitable coil settings for a specific clinical application should be based on a balanced evaluation between stimulation depth and focality.

  1. Unfolding the atmospheric and deep internal flows on Jupiter and Saturn using the Juno and Cassini gravity measurements

    NASA Astrophysics Data System (ADS)

    Galanti, Eli; Kaspi, Yohai

    2016-10-01

    In light of the first orbits of Juno at Jupiter, we discuss the Juno gravity experiment and possible initial results. Relating the flow on Jupiter and Saturn to perturbations in their density field is key to the analysis of the gravity measurements expected from both the Juno (Jupiter) and Cassini (Saturn) spacecraft during 2016-17. Both missions will provide latitude-dependent gravity fields, which in principle could be inverted to calculate the vertical structure of the observed cloud-level zonal flow on these planets. Current observations of the flow on these planets exist only at the cloud level (0.1-1 bar). The observed cloud-level wind might be confined to the upper layers, or be a manifestation of deep cylindrical flows. Moreover, it is possible that in the case where the observed wind is superficial, there exists deep interior flow that is completely decoupled from the observed atmospheric flow. In this talk, we present a new adjoint-based inverse model for inversion of the gravity measurements into flow fields. The model is constructed to be as general as possible, allowing for both cloud-level wind extending inward, and a decoupled deep flow that is constructed to produce cylindrical structures with variable width and magnitude, or can even be set to be completely general. The deep flow is also set to decay when approaching the upper levels so it has no manifestation there. The two sources of flow are then combined into a total flow field that is related to the density anomalies and gravity moments via a dynamical model. Given the measured gravitational moments from Jupiter and Saturn, the dynamical model, together with the adjoint inverse model, is used for optimizing the control parameters and thereby unfolding the deep and surface flows.
Several scenarios are examined, including cases in which the surface wind and the deep flow have comparable effects on the gravity field, cases in which the deep flow dominates over the surface wind, and an extreme case where the deep flow can have an unconstrained pattern. The method also enables the calculation of the uncertainties associated with each solution. We discuss the physical limitations of the method in view of the measurement uncertainties.

  2. The accurate particle tracer code

    NASA Astrophysics Data System (ADS)

    Wang, Yulei; Liu, Jian; Qin, Hong; Yu, Zhi; Yao, Yicun

    2017-11-01

    The Accurate Particle Tracer (APT) code is designed for systematic large-scale applications of geometric algorithms for particle dynamical simulations. Based on a large variety of advanced geometric algorithms, APT possesses long-term numerical accuracy and stability, which are critical for solving multi-scale and nonlinear problems. To provide a flexible and convenient I/O interface, the libraries of Lua and HDF5 are used. Following a three-step procedure, users can efficiently extend the libraries of electromagnetic configurations, external non-electromagnetic forces, particle pushers, and initialization approaches by use of the extendible module. APT has been used in simulations of key physical problems, such as runaway electrons in tokamaks and energetic particles in the Van Allen belt. As an important realization, the APT-SW version has been successfully deployed on the world's fastest computer, the Sunway TaihuLight supercomputer, by supporting the master-slave architecture of the Sunway many-core processors. Based on large-scale simulations of a runaway beam under parameters of the ITER tokamak, it is revealed that the magnetic ripple field can disperse the pitch-angle distribution significantly and at the same time improve the confinement of the energetic runaway beam.

  3. Schnek: A C++ library for the development of parallel simulation codes on regular grids

    NASA Astrophysics Data System (ADS)

    Schmitz, Holger

    2018-05-01

    A large number of algorithms across the field of computational physics are formulated on grids with a regular topology. We present Schnek, a library that enables fast development of parallel simulations on regular grids. Schnek contains a number of easy-to-use modules that greatly reduce the amount of administrative code for large-scale simulation codes. The library provides an interface for reading simulation setup files with a hierarchical structure. The structure of the setup file is translated into a hierarchy of simulation modules that the developer can specify. The reader parses and evaluates mathematical expressions and initialises variables or grid data. This enables developers to write modular and flexible simulation codes with minimal effort. Regular grids of arbitrary dimension are defined as well as mechanisms for defining physical domain sizes, grid staggering, and ghost cells on these grids. Ghost cells can be exchanged between neighbouring processes using MPI with a simple interface. The grid data can easily be written into HDF5 files using serial or parallel I/O.
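    The ghost-cell mechanism described above can be illustrated without MPI: each subdomain carries one extra cell per side holding a copy of its neighbour's edge value, refreshed once per step so that stencils near the boundary see valid data. The sketch below is a serial, two-subdomain illustration of that copy step only; it does not use Schnek's actual interface.

```python
def exchange_ghost_cells(left, right):
    """One exchange between two 1-D subdomains with one ghost cell per side.
    Layout of each subdomain: [ghost, interior..., ghost]. In a parallel
    code these two assignments become MPI send/receive pairs."""
    left[-1] = right[1]    # left's right ghost <- right's first interior cell
    right[0] = left[-2]    # right's left ghost <- left's last interior cell

left = [0, 1.0, 2.0, 3.0, 0]   # ghosts initialized to 0
right = [0, 4.0, 5.0, 6.0, 0]
exchange_ghost_cells(left, right)
print(left, right)
```

After the exchange, each subdomain's ghost cells mirror the neighbouring interior values, so a stencil update can run over the interior cells without any special boundary cases.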

  4. Hubble Space Telescope Deep Field Lesson Package. Teacher's Guide, Grades 6-8. Amazing Space: Education On-Line from the Hubble Space Telescope.

    ERIC Educational Resources Information Center

    National Aeronautics and Space Administration, Washington, DC.

    This lesson guide accompanies the Hubble Deep Field set of 10 lithographs and introduces 4 astronomy lesson plans for middle school students. Lessons include: (1) "How Many Objects Are There?"; (2) "Classifying and Identifying"; (3) "Estimating Distances in Space"; and (4) "Review and Assessment." Appendices…

  5. Deep-Earth reactor: nuclear fission, helium, and the geomagnetic field.

    PubMed

    Hollenbach, D F; Herndon, J M

    2001-09-25

    Geomagnetic field reversals and changes in intensity are understandable from an energy standpoint as natural consequences of intermittent and/or variable nuclear fission chain reactions deep within the Earth. Moreover, deep-Earth production of helium, having (3)He/(4)He ratios within the range observed from deep-mantle sources, is demonstrated to be a consequence of nuclear fission. Numerical simulations of a planetary-scale geo-reactor were made by using the SCALE sequence of codes. The results clearly demonstrate that such a geo-reactor (i) would function as a fast-neutron fuel breeder reactor; (ii) could, under appropriate conditions, operate over the entire period of geologic time; and (iii) would function in such a manner as to yield variable and/or intermittent output power.

  6. Finding the First Galaxies

    NASA Technical Reports Server (NTRS)

    Gardner, Jonathan P.

    2009-01-01

    Astronomers study distant galaxies by taking long exposures in deep survey fields. They choose fields that are empty of known sources, so that they are statistically representative of the Universe as a whole. Astronomers can compare the distribution of the detected galaxies in brightness, color, morphology and redshift to theoretical models, in order to puzzle out the processes of galaxy evolution. In 2004, the Hubble Space Telescope was pointed at a small, deep-survey field in the southern constellation Fornax for more than 500 hours of exposure time. The resulting Hubble Ultra Deep Field could see the faintest and most distant galaxies that the telescope is capable of viewing. These galaxies emitted their light less than 1 billion years after the Big Bang. From the Ultra Deep Field and other galaxy surveys, astronomers have built up a history of star formation in the universe. The peak occurred about 7 billion years ago, at about half the current age of the universe, when stars were forming at about 15 times the rate today. Going backward in time to when the very first stars and galaxies formed, the average star-formation rate should drop to zero, but when looking at the most distant galaxies in the Ultra Deep Field, the star formation rate is still higher than it is today. The faintest galaxies seen by Hubble are therefore not the first galaxies that formed in the early universe. To detect these galaxies NASA is planning the James Webb Space Telescope for launch in 2013. Webb will have a 6.5-meter diameter primary mirror, much bigger than Hubble's 2.4-meter primary, and will be optimized for infrared observations to see the highly redshifted galaxies.

  7. VERAView

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Ronald W.; Collins, Benjamin S.; Godfrey, Andrew T.

    2016-12-09

    In order to support engineering analysis of Virtual Environment for Reactor Analysis (VERA) model results, the Consortium for Advanced Simulation of Light Water Reactors (CASL) needs a tool that provides visualizations of HDF5 files that adhere to the VERAOUT specification. VERAView provides an interactive graphical interface for the visualization and engineering analyses of output data from VERA. The Python-based software provides instantaneous 2D and 3D images, 1D plots, and alphanumeric data from VERA multi-physics simulations.

  8. NASA Briefing for Unidata

    NASA Technical Reports Server (NTRS)

    Lynnes, Christopher

    2016-01-01

    The NASA representative to the Unidata Strategic Committee presented a semiannual update on NASA's work with and use of Unidata technologies. The talk covered the program of cloud computing prototypes being undertaken for the Earth Observing System Data and Information System (EOSDIS). Also discussed were dataset interoperability recommendations ratified via the EOSDIS Standards Office and the HDF Product Designer tool with respect to its possible applicability to data in network Common Data Form (NetCDF) version 4.

  9. Monthly average polar sea-ice concentration

    USGS Publications Warehouse

    Schweitzer, Peter N.

    1995-01-01

    The data contained in this CD-ROM depict monthly averages of sea-ice concentration in the modern polar oceans. These averages were derived from the Scanning Multichannel Microwave Radiometer (SMMR) and Special Sensor Microwave/Imager (SSM/I) instruments aboard satellites of the U.S. Air Force Defense Meteorological Satellite Program from 1978 through 1992. The data are provided as 8-bit images using the Hierarchical Data Format (HDF) developed by the National Center for Supercomputing Applications.
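    Storing concentrations as 8-bit integers implies a fixed scale factor on read-back. The decoding sketch below assumes a linear 0.4-percent-per-count scaling (0-250 mapping to 0-100 %) with higher values reserved for masks, which is a common convention for passive-microwave sea-ice products but not necessarily the exact one used on this CD-ROM.

```python
def decode_concentration(raw):
    """Convert an 8-bit stored value to ice concentration in percent.
    Values above 250 are assumed reserved for masks (land, missing)."""
    if raw > 250:
        return None
    return raw * 0.4

decoded = [decode_concentration(v) for v in (0, 125, 250, 254)]
print(decoded)
```

The consumer of such a product should always take the scale factor and mask values from the product documentation or the HDF attributes rather than assuming them.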

  10. Development of the EM tomography system by the vertical electromagnetic profiling (VEMP) method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miura, Y.; Osato, K.; Takasugi, S.

    1995-12-31

    As a part of the "Deep-Seated Geothermal Resources Survey" project being undertaken by the NEDO, the Vertical ElectroMagnetic Profiling (VEMP) method is being developed to accurately obtain deep resistivity structure. The VEMP method acquires multi-frequency three-component magnetic field data in an open hole well using controlled sources (loop sources or grounded-wire sources) emitted at the surface. Numerical simulation using EM3D demonstrated that phase data of the VEMP method is very sensitive to resistivity structure and the phase data will also indicate the presence of deep anomalies. Forward modelling was also used to determine required transmitter moments for various grounded-wire and loop sources for a field test using the WD-1 well in the Kakkonda geothermal area. Field logging of the well was carried out in May 1994 and the processed field data matches the simulated data well.

  11. Computational analysis of transcranial magnetic stimulation in the presence of deep brain stimulation probes

    NASA Astrophysics Data System (ADS)

    Syeda, F.; Holloway, K.; El-Gendy, A. A.; Hadimani, R. L.

    2017-05-01

    Transcranial Magnetic Stimulation is an emerging non-invasive treatment for depression, Parkinson's disease, and a variety of other neurological disorders. Many Parkinson's patients receive the treatment known as Deep Brain Stimulation, but often require additional therapy for speech and swallowing impairment. Transcranial Magnetic Stimulation has been explored as a possible treatment by stimulating the mouth motor area of the brain. We have calculated induced electric field, magnetic field, and temperature distributions in the brain using finite element analysis and anatomically realistic heterogeneous head models fitted with Deep Brain Stimulation leads. A Figure of 8 coil, current of 5000 A, and frequency of 2.5 kHz are used as simulation parameters. Results suggest that Deep Brain Stimulation leads cause surrounding tissues to experience slightly increased E-field (Δ Emax =30 V/m), but not exceeding the nominal values induced in brain tissue by Transcranial Magnetic Stimulation without leads (215 V/m). The maximum temperature in the brain tissues surrounding leads did not change significantly from the normal human body temperature of 37 °C. Therefore, we ascertain that Transcranial Magnetic Stimulation in the mouth motor area may stimulate brain tissue surrounding Deep Brain Stimulation leads, but will not cause tissue damage.

  12. Deep Borehole Field Test Requirements and Controlled Assumptions.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hardin, Ernest

    2015-07-01

    This document presents design requirements and controlled assumptions intended for use in the engineering development and testing of: 1) prototype packages for radioactive waste disposal in deep boreholes; 2) a waste package surface handling system; and 3) a subsurface system for emplacing and retrieving packages in deep boreholes. Engineering development and testing is being performed as part of the Deep Borehole Field Test (DBFT; SNL 2014a). This document presents parallel sets of requirements for a waste disposal system and for the DBFT, showing the close relationship. In addition to design, it will also inform planning for drilling, construction, and scientific characterization activities for the DBFT. The information presented here follows typical preparations for engineering design. It includes functional and operating requirements for handling and emplacement/retrieval equipment, waste package design and emplacement requirements, borehole construction requirements, sealing requirements, and performance criteria. Assumptions are included where they could impact engineering design. Design solutions are avoided in the requirements discussion. ACKNOWLEDGEMENTS: This set of requirements and assumptions has benefited greatly from reviews by Gordon Appel, Geoff Freeze, Kris Kuhlman, Bob MacKinnon, Steve Pye, David Sassani, Dave Sevougian, and Jiann Su.

  13. Model United Nations and Deep Learning: Theoretical and Professional Learning

    ERIC Educational Resources Information Center

    Engel, Susan; Pallas, Josh; Lambert, Sarah

    2017-01-01

    This article demonstrates that the purposeful subject design, incorporating a Model United Nations (MUN), facilitated deep learning and professional skills attainment in the field of International Relations. Deep learning was promoted in subject design by linking learning objectives to Anderson and Krathwohl's (2001) four levels of knowledge or…

  14. Evaluation of Resuspension from Propeller Wash in DoD Harbors

    DTIC Science & Technology

    2016-05-01

    [Report excerpt fragments] Resuspension characterization; deep-draft resuspension study in Pearl Harbor; resuspension from a deep-draft vessel; field observations using ADCP. The field event resulted in validation of the FANS model for prediction of sediment resuspension by a deep-draft vessel.

  15. Comment on 'Deep convolutional neural network with transfer learning for rectum toxicity prediction in cervical cancer radiotherapy: a feasibility study'.

    PubMed

    Valdes, Gilmer; Interian, Yannet

    2018-03-15

    The application of machine learning (ML) presents tremendous opportunities for the field of oncology, thus we read 'Deep convolutional neural network with transfer learning for rectum toxicity prediction in cervical cancer radiotherapy: a feasibility study' with great interest. In this article, the authors used state-of-the-art techniques: a pre-trained convolutional neural network (VGG-16 CNN), transfer learning, data augmentation, dropout and early stopping, all of which are directly responsible for the success and the excitement that these algorithms have created in other fields. We believe that the use of these techniques can offer tremendous opportunities in the field of Medical Physics, and as such we would like to praise the authors for their pioneering application to the field of Radiation Oncology. That being said, given that the field of Medical Physics has unique characteristics that differentiate it from those fields where these techniques have been applied successfully, we would like to raise some points for future discussion and follow-up studies that could help the community understand the limitations and nuances of deep learning techniques.

  16. Comparison of the induced fields using different coil configurations during deep transcranial magnetic stimulation

    PubMed Central

    Ueno, Shoogo

    2017-01-01

    Stimulation of deeper brain structures by transcranial magnetic stimulation (TMS) plays a role in the study of reward and motivation mechanisms, which may be beneficial in the treatment of several neurological and psychiatric disorders. However, the electric field distributions induced in the brain by deep transcranial magnetic stimulation (dTMS) are still unknown. In this paper, the double cone coil, H-coil and Halo-circular assembly (HCA) coil, which have been proposed for dTMS, were numerically designed. The distributions of magnetic flux density and induced electric field in an anatomically based realistic head model under the dTMS coils were numerically calculated by the impedance method. Results were compared with those of a standard figure-of-eight (Fo8) coil. Simulation results show that the double cone, H- and HCA coils have significantly deeper field penetration than the conventional Fo8 coil, at the expense of higher and more widely spread electric fields induced in superficial cortical regions. The double cone and HCA coils have a better ability to stimulate deep brain subregions than the H-coil. Meanwhile, both the double cone and HCA coils increase the risk of optic nerve excitation. Our results suggest that although the dTMS coils offer a new tool with potential for both research and clinical applications in psychiatric and neurological disorders associated with dysfunctions of deep brain regions, the selection of the most suitable coil settings for a specific clinical application should be based on a balanced evaluation of stimulation depth and focality. PMID:28586349
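The depth/focality trade-off described in this abstract can be illustrated with a textbook toy model (this is not the paper's impedance-method FEM pipeline): the on-axis magnetic field of a single circular loop decays more slowly with depth for a larger coil radius, so larger coils reach deeper at the cost of a broader superficial footprint. The radii, depths, and unit current below are hypothetical illustration values.

```python
# Toy sketch (NOT the paper's FEM computation): on-axis field of a circular
# current loop, B(z) = mu0*I*a^2 / (2*(a^2 + z^2)**1.5). A larger radius a
# gives a field that falls off more slowly with depth z, mirroring the
# depth-vs-focality trade-off reported for dTMS coils.
MU0 = 4e-7 * 3.141592653589793  # vacuum permeability, T*m/A


def on_axis_field(current_a, radius_m, depth_m):
    """On-axis magnetic flux density (tesla) of a circular loop."""
    return MU0 * current_a * radius_m**2 / (2 * (radius_m**2 + depth_m**2) ** 1.5)


small, large = 0.02, 0.06  # hypothetical coil radii in metres
for depth in (0.0, 0.04, 0.08):
    ratio = on_axis_field(1.0, large, depth) / on_axis_field(1.0, small, depth)
    print(f"depth {depth*100:.0f} cm: large/small field ratio = {ratio:.2f}")
```

At the scalp (depth 0) the small coil is actually stronger per ampere, but with increasing depth the large coil's relative advantage grows steadily, which is the qualitative behavior the abstract describes.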

  17. Near Infrared Imaging of the Hubble Deep Field with Keck Telescope

    NASA Technical Reports Server (NTRS)

    Hogg, David W.; Neugebauer, G.; Armus, Lee; Matthews, K.; Pahre, Michael A.; Soifer, B. T.; Weinberger, A. J.

    1997-01-01

    Two deep K-band (2.2 micrometer) images, with point-source detection limits of K=25.2 mag (one sigma), taken with the Keck Telescope in subfields of the Hubble Deep Field, are presented and analyzed. A sample of objects to K=24 mag is constructed and V(sub 606)-I(sub 814) and I(sub 814)-K colors are measured. By stacking visually selected objects, mean I(sub 814)-K colors can be measured to very faint levels; the mean I(sub 814)-K color is constant with apparent magnitude down to V(sub 606)=28 mag.

  18. VizieR Online Data Catalog: Redshifts of 65 CANDELS supernovae (Rodney+, 2014)

    NASA Astrophysics Data System (ADS)

    Rodney, S. A.; Riess, A. G.; Strolger, L.-G.; Dahlen, T.; Graur, O.; Casertano, S.; Dickinson, M. E.; Ferguson, H. C.; Garnavich, P.; Hayden, B.; Jha, S. W.; Jones, D. O.; Kirshner, R. P.; Koekemoer, A. M.; McCully, C.; Mobasher, B.; Patel, B.; Weiner, B. J.; Cenko, S. B.; Clubb, K. I.; Cooper, M.; Filippenko, A. V.; Frederiksen, T. F.; Hjorth, J.; Leibundgut, B.; Matheson, T.; Nayyeri, H.; Penner, K.; Trump, J.; Silverman, J. M.; U, V.; Azalee Bostroem, K.; Challis, P.; Rajan, A.; Wolff, S.; Faber, S. M.; Grogin, N. A.; Kocevski, D.

    2015-01-01

    In this paper we present a measurement of the Type Ia supernova explosion rate as a function of redshift (SNR(z)) from a sample of 65 supernovae discovered in the Cosmic Assembly Near-infrared Deep Extragalactic Legacy Survey (CANDELS) supernova program. This supernova survey is a joint operation of two Hubble Space Telescope (HST) Multi-Cycle Treasury (MCT) programs: CANDELS (PIs: Faber and Ferguson; Grogin et al., 2011ApJS..197...35G; Koekemoer et al., 2011ApJS..197...36K), and the Cluster Lensing and Supernovae search with Hubble (CLASH; PI: Postman; Postman et al. 2012, cat. J/ApJS/199/25). The supernova discovery and follow-up for both programs were allocated to the HST MCT supernova program (PI: Riess). The results presented here are based on the full five fields and ~0.25deg2 of the CANDELS program, observed from 2010 to 2013. A companion paper presents the SN Ia rates from the CLASH sample (Graur et al., 2014ApJ...783...28G). A composite analysis that combines the CANDELS+CLASH supernova sample and revisits past HST surveys will be presented in a future paper. The three-year CANDELS program was designed to probe galaxy evolution out to z~8 with deep infrared and optical imaging of five well-studied extragalactic fields: GOODS-S, GOODS-N (the Great Observatories Origins Deep Survey South and North; Giavalisco et al. 2004, cat. II/261), COSMOS (the Cosmic Evolution Survey, Scoville et al., 2007ApJS..172....1S; Koekemoer et al., 2007ApJS..172..196K), UDS (the UKIDSS Ultra Deep Survey; Lawrence et al. 2007, cat. II/314; Cirasuolo et al., 2007MNRAS.380..585C), EGS (the Extended Groth Strip; Davis et al. 2007, cat. III/248). As described fully in Grogin et al. (2011ApJS..197...35G), the CANDELS program includes both "wide" and "deep" fields. The wide component of CANDELS comprises the COSMOS, UDS, and EGS fields, plus one-third of the GOODS-S field and one half of the GOODS-N field--a total survey area of 730 arcmin2. 
The "deep" component of CANDELS came from the central 67arcmin2 of each of the GOODS-S and GOODS-N fields. The CANDELS fields analyzed in this work are described in Table 1. (6 data files).
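As a quick consistency check on the survey areas quoted above (730 arcmin² for the wide component plus the two 67 arcmin² deep regions), the total is about 0.24 deg², matching the quoted ~0.25 deg²; a minimal sketch:

```python
# Sanity-check the CANDELS survey areas quoted in the abstract.
ARCMIN2_PER_DEG2 = 60 * 60  # 3600 arcmin^2 per square degree

wide_arcmin2 = 730       # COSMOS + UDS + EGS + partial GOODS-S/GOODS-N
deep_arcmin2 = 2 * 67    # central 67 arcmin^2 of each of GOODS-S and GOODS-N

total_deg2 = (wide_arcmin2 + deep_arcmin2) / ARCMIN2_PER_DEG2
print(f"total area: {total_deg2:.2f} deg^2")  # 0.24 deg^2, consistent with ~0.25 deg^2
```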

  19. Vitamins (A, C and E) and oxidative status of hemodialysis patients treated with HFR and HFR-Supra.

    PubMed

    Palleschi, Simonetta; Ghezzi, Paolo M; Palladino, Giuseppe; Rossi, Barbara; Ganadu, Marino; Casu, Domenica; Cossu, Maria; Mattana, Giovanni; Pinna, Antonio Maria; Contu, Bruno; Ghisu, Tonina; Monni, Alessandro; Gazzanelli, Luana; Mereu, Maria Cristina; Logias, Franco; Passaghe, Mario; Amore, Alessandro; Bolasco, Piergiorgio

    2016-08-26

    Hemodiafiltration with on-line endogenous reinfusion (HFR) is an extracorporeal dialytic method that combines diffusion, convection and adsorption. HFR-Supra (HFR-S) is a second-generation system with increased convective permeability and adsorption capability. Previous studies suggested that HFR reduces oxidative stress compared to standard haemodialysis. The principal aim of the present study was to compare the behavior of antioxidant vitamins and the oxidative status of hemodialysis patients treated with HFR and HFR-S. The study was designed as a multicenter, randomized, crossover trial. Forty-one patients were recruited from 19 dialysis centers and, after a 4-month washout stabilization period in on-line hemodiafiltration (ol-HDF), each patient was randomized to a sequence of treatments (HFR-S followed by HFR or vice versa), with each treatment applied over 6 months. Plasma levels of Advanced Oxidation Protein Products, Total Antioxidant Status, vitamins C, A and E and their ligands (Retinol Binding Protein and total lipids) were measured at baseline and at the end of each treatment period. Results show that the higher convective permeability of HFR-S with respect to HFR did not produce additional beneficial effects on the patients' oxidative status, a slight decrease of both Vitamin A and Retinol Binding Protein being the only difference registered in the long term. However, as compared to ol-HDF, both re-infusive techniques reduced the intradialytic loss of Vitamin C and, in the long term, improved the patients' oxidative status and increased Retinol Binding Protein plasma values. No significant differences were found between the Vitamin C concentrations of pre- and post-cartridge UF in either HFR-S or HFR, showing that the sorbent resin does not adsorb Vitamin C. HFR-S and HFR are almost equivalent in terms of impact on antioxidant vitamins and oxidative status of hemodialysis patients.
Nonetheless, as compared to ol-HDF, both treatments produced an appreciable sparing of Vitamin C and may represent a new approach for reducing oxidative stress and related complications in dialysis patients. Long-term effects of re-infusive treatments on patients' cardiovascular morbidity and mortality need to be evaluated. ClinicalTrials.gov Identifier NCT01492491, retrospectively registered on 10 December 2011.

  20. Improvement of autonomic nervous regulation by blood purification therapy using acetate-free dialysis fluid - clinical evaluation by laser Doppler flowmetry.

    PubMed

    Sato, Takashi; Taoka, Masahiro; Miyahara, Takaaki

    2011-01-01

    In Japan, acetate-free biofiltration (AFBF) became commercially available in the year 2000, and these products have been reported to be clinically effective for controlling the decrease of blood pressure during dialysis and various types of dialysis intolerance. Furthermore, since acetate-free dialysis fluid became clinically available in 2007, acetate-free hemodialysis (AFHD) is expected to inhibit the malnutrition-inflammation-atherosclerosis syndrome, improve anemia and the nutritional status of patients, stabilize hemodynamics, and reduce inflammation and oxidative stress. In a broad sense, AFBF can be classified as hemodiafiltration (HDF), and its clinical effects seem to be associated with multiple factors, including the use of acetate-free dialysis fluid, massive removal of low-molecular-weight proteins by convection, and the sodium concentration of the replacement fluid. Therefore, the clinical significance of acetate-free dialysis fluid could be demonstrated more clearly by comparing AFHD with conventional hemodialysis (conv. HD) using dialysis fluid containing about 10 mEq/l acetate. Since 2005, we have been investigating the efficacy of various modalities of blood purification therapy by continuously monitoring changes of tissue blood flow in the lower limbs and earlobes (head) using a non-invasive continuous monitoring method (NICOMM). In this report, we assess the clinical effectiveness of AFHD on the basis of clinical findings and the head stability index (head SI) obtained by NICOMM, particularly with respect to the influence on autonomic regulation. After switching from conv. HD to AFHD, anemia, stored iron utilization, and the frequency of treatments for dialysis hypotension and of muscle cramps were significantly improved. Further, the head SI was also significantly smaller with AFHD than with conv. HD. This finding suggests that AFHD improved the maintenance of homeostasis by the autonomic nervous regulation system.
In addition, we could not find clinical features of excessive alkalosis during an observation period of about 1 year, even when online HDF was performed using acetate-free dialysis fluid as the substitution fluid. Our conclusion is that the advent of acetate-free dialysis fluid has led to investigations into the new clinical effectiveness of AFHD or online HDF/HF using ultrapurified acetate-free dialysis fluid as the substitution fluid. Copyright © 2011 S. Karger AG, Basel.

  1. An overview of regular dialysis treatment in Japan (as of 31 December 2012).

    PubMed

    Nakai, Shigeru; Hanafusa, Norio; Masakane, Ikuto; Taniguchi, Masatomo; Hamano, Takayuki; Shoji, Tetsuo; Hasegawa, Takeshi; Itami, Noritomo; Yamagata, Kunihiro; Shinoda, Toshio; Kazama, Junichiro James; Watanabe, Yuzo; Shigematsu, Takashi; Marubayashi, Seiji; Morita, Osamu; Wada, Atsushi; Hashimoto, Seiji; Suzuki, Kazuyuki; Nakamoto, Hidetomo; Kimata, Naoki; Wakai, Kenji; Fujii, Naohiko; Ogata, Satoshi; Tsuchida, Kenji; Nishi, Hiroshi; Iseki, Kunitoshi; Tsubakihara, Yoshiharu

    2014-12-01

    A nationwide statistical survey of 4279 dialysis facilities was conducted at the end of 2012, among which 4238 responded (99.0%). The number of new dialysis patients was 38055 in 2012. Since 2008, the number of new dialysis patients has remained almost the same without any marked increase or decrease. The number of dialysis patients who died in 2012 was 30710; a slight decrease from 2011 (30743). The dialysis patient population has been growing every year in Japan; it was 310007 at the end of 2012, which exceeded 310000 for the first time. The number of dialysis patients per million at the end of 2012 was 2431.2. The crude death rate of dialysis patients in 2012 was 10.0%, a slight decrease from that in 2011 (10.2%). The mean age of new dialysis patients was 68.5 years and the mean age of the entire dialysis patient population was 66.9 years. The most common primary cause of renal failure among new dialysis patients was diabetic nephropathy (44.2%). The actual number of new dialysis patients with diabetic nephropathy has been approximately 16000 for the last few years. Diabetic nephropathy was also the most common primary disease among the entire dialysis patient population (37.1%), followed by chronic glomerulonephritis (33.6%). The percentage of dialysis patients with diabetic nephropathy has been continuously increasing, whereas not only the percentage but also the actual number of dialysis patients with chronic glomerulonephritis has decreased. The number of patients who underwent hemodiafiltration (HDF) at the end of 2012 was 21725, a marked increase from that in 2011 (14115). In particular, the number of patients who underwent on-line HDF increased threefold from 4890 in 2011 to 14069 in 2012. From the results of the facility survey, the number of patients who underwent peritoneal dialysis (PD) was 9514 and that of patients who did not undergo PD despite having a PD catheter in the abdominal cavity was 347. 
From the results of the patient survey, among the PD patients, 1932 also underwent another dialysis method using extracorporeal circulation, such as hemodialysis (HD) and HDF. The number of patients who underwent HD at home in 2012 was 393, a marked increase from that in 2011 (327). © 2014 Japanese Society for Dialysis Therapy. Reproduced with permission.
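The headline statistics in this survey can be cross-checked with simple arithmetic: the patients-per-million figure implies a national population of about 127.5 million, and the end-of-year death-to-patient ratio approximates the quoted 10.0% crude death rate (the survey's own rate presumably uses a different denominator, such as a mid-year patient count, so exact agreement is not expected):

```python
# Cross-check of the headline 2012 dialysis statistics quoted in the abstract.
patients = 310_007      # dialysis patients at the end of 2012
per_million = 2431.2    # dialysis patients per million population

implied_population = patients / per_million * 1e6
print(f"implied population: {implied_population/1e6:.1f} million")  # ~127.5 million

deaths = 30_710         # dialysis patients who died in 2012
print(f"end-of-year death ratio: {deaths / patients:.1%}")  # ~9.9%, near the quoted 10.0%
```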

  2. SciSpark: Highly Interactive and Scalable Model Evaluation and Climate Metrics

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Palamuttam, R. S.; Mogrovejo, R. M.; Whitehall, K. D.; Mattmann, C. A.; Verma, R.; Waliser, D. E.; Lee, H.

    2015-12-01

    Remote sensing data and climate model output are multi-dimensional arrays of massive size locked away in heterogeneous file formats (HDF5/4, NetCDF 3/4) and metadata models (HDF-EOS, CF), making it difficult to perform multi-stage, iterative science processing, since each stage requires writing and reading data to and from disk. We are developing a lightning-fast Big Data technology called SciSpark based on Apache Spark under a NASA AIST grant (PI Mattmann). Spark implements the map-reduce paradigm for parallel computing on a cluster, but emphasizes in-memory computation, "spilling" to disk only as needed, and so outperforms the disk-based Apache Hadoop by 100x in memory and by 10x on disk. SciSpark will enable scalable model evaluation by executing large-scale comparisons of A-Train satellite observations to model grids on a cluster of 10 to 1000 compute nodes. This 2nd-generation capability for NASA's Regional Climate Model Evaluation System (RCMES) will compute simple climate metrics at interactive speeds, and extend to quite sophisticated iterative algorithms such as machine-learning-based clustering of temperature PDFs, and even graph-based algorithms for searching for Mesoscale Convective Complexes. We have implemented a parallel data ingest capability in which the user specifies desired variables (arrays) as several time-sorted lists of URLs (i.e. using OPeNDAP model.nc?varname, or local files). The specified variables are partitioned by time/space and then each Spark node pulls its bundle of arrays into memory to begin a computation pipeline. We also investigated the performance of several N-dim. array libraries (scala breeze, java jblas & netlib-java, and ND4J). We are currently developing science codes using ND4J and studying memory behavior on the JVM. On the pyspark side, many of our science codes already use the numpy and SciPy ecosystems.
The talk will cover: the architecture of SciSpark, the design of the scientific RDD (sRDD) data structure, our efforts to integrate climate science algorithms in Python and Scala, parallel ingest and partitioning of A-Train satellite observations from HDF files and model grids from netCDF files, first parallel runs to compute comparison statistics and PDF's, and first metrics quantifying parallel speedups and memory & disk usage.
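The ingest pattern the abstract describes — partition arrays by time, let each node compute over its bundle in memory, then merge the partial results — can be sketched in plain Python. This is an illustrative stand-in, not the actual SciSpark/sRDD API; the record structure and the statistic computed are hypothetical.

```python
# Illustrative map-reduce sketch (NOT the SciSpark sRDD API): partition
# time-sorted (timestamp, grid-values) records into per-node bundles, compute
# a partial statistic on each bundle, then merge partials into a global result.
from functools import reduce

# Hypothetical stand-in data: one variable as (timestamp, grid cell values).
records = [(t, [float(t + i) for i in range(4)]) for t in range(12)]

# Partition by time into bundles, one per (simulated) worker node.
n_partitions = 3
bundles = [records[i::n_partitions] for i in range(n_partitions)]


def partial_stats(bundle):
    """Map step: per-partition running (count, sum) over all grid cells."""
    count = sum(len(vals) for _, vals in bundle)
    total = sum(sum(vals) for _, vals in bundle)
    return (count, total)


def merge(a, b):
    """Reduce step: combine two partial (count, sum) pairs."""
    return (a[0] + b[0], a[1] + b[1])


count, total = reduce(merge, map(partial_stats, bundles))
print("global mean:", total / count)  # → 7.0 for this stand-in data
```

In Spark itself the bundles would be RDD partitions and the map/reduce steps would run in parallel across cluster nodes; the point of the sketch is only the shape of the computation, with no disk round-trips between stages.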

  3. The effect of external magnetic field changing on the correlated quantum dot dynamics

    NASA Astrophysics Data System (ADS)

    Mantsevich, V. N.; Maslova, N. S.; Arseyev, P. I.

    2018-06-01

    The non-stationary response of the local magnetic moment to abrupt switching "on" and "off" of an external magnetic field was studied for a single-level quantum dot (QD) coupled to a reservoir. We found that transient processes look different for shallow and deep localized energy levels. It was demonstrated that for a deep energy level the relaxation rates of the local magnetic moment strongly differ between the cases of magnetic field switching "on" and "off". The obtained results can be applied in the area of dynamic memory device stabilization in the presence of a magnetic field.

  4. Near-UV Sources in the Hubble Ultra Deep Field: The Catalog

    NASA Technical Reports Server (NTRS)

    Gardner, Jonathan P.; Voyrer, Elysse; de Mello, Duilia F.; Siana, Brian; Quirk, Cori; Teplitz, Harry I.

    2009-01-01

    The catalog from the first high-resolution U-band image of the Hubble Ultra Deep Field, taken with Hubble's Wide Field Planetary Camera 2 through the F300W filter, is presented. We detect 96 U-band objects and compare and combine this catalog with a Great Observatories Origins Deep Survey (GOODS) B-selected catalog that provides B, V, i, and z photometry, spectral types, and photometric redshifts. We have also obtained Far-Ultraviolet (FUV, 1614 Angstroms) data with Hubble's Advanced Camera for Surveys Solar Blind Channel (ACS/SBC) and with the Galaxy Evolution Explorer (GALEX). We detected 31 sources with ACS/SBC, 28 with GALEX/FUV, and 45 with GALEX/NUV. The methods of observation, image processing, object identification, catalog preparation, and catalog matching are presented.

  5. Deep geothermal processes acting on faults and solid tides in coastal Xinzhou geothermal field, Guangdong, China

    NASA Astrophysics Data System (ADS)

    Lu, Guoping; Wang, Xiao; Li, Fusi; Xu, Fangyiming; Wang, Yanxin; Qi, Shihua; Yuen, David

    2017-03-01

    This paper investigated the deep fault thermal flow processes in the Xinzhou geothermal field in the Yangjiang region of Guangdong Province. Deep faults channel geothermal energy to the shallow ground, but their hidden nature makes them difficult to study. We conducted numerical experiments in order to investigate the physical state of the geothermal water inside the fault zone. We view the deep fault as a fast flow path for thermal water driven up from the deep crust by buoyancy. Temperature measurements at the springs or wells constrain the upper boundary, and the temperature inferred from the Curie temperature interface bounds the bottom. The deepened boundary allows the thermal reservoir to evolve rather than remain at a fixed temperature. The results detail the concept of a thermal reservoir in terms of its formation and heat distribution. The concept also reconciles the discrepancy in reservoir temperatures predicted from quartz and from Na-K-Mg. The downward displacement of the crust increases the pressure at depth and leads to an elevated temperature and a lighter water density. Ultimately, our results are a first step in implementing numerical studies of deep faults through geothermal water flows; future work needs to extend to cases of supercritical states. This approach is applicable to general deep-fault thermal flows and dissipation paths for seismic energy from the deep crust.

  6. SMUVS: Spitzer Matching survey of the UltraVISTA ultra-deep Stripes

    NASA Astrophysics Data System (ADS)

    Caputi, Karina; Ashby, Matthew; Fazio, Giovanni; Huang, Jiasheng; Dunlop, James; Franx, Marijn; Le Fevre, Olivier; Fynbo, Johan; McCracken, Henry; Milvang-Jensen, Bo; Muzzin, Adam; Ilbert, Olivier; Somerville, Rachel; Wechsler, Risa; Behroozi, Peter; Lu, Yu

    2014-12-01

    We request 2026.5 hours to homogenize the matching ultra-deep IRAC data of the UltraVISTA ultra-deep stripes, producing a final area of ~0.6 square degrees with the deepest near- and mid-IR coverage existing in any such large area of the sky (H, Ks, [3.6], [4.5] ~ 25.3-26.1 AB mag; 5 sigma). The UltraVISTA ultra-deep stripes are contained within the larger COSMOS field, which has a rich collection of multi-wavelength, ancillary data, making it ideal to study different aspects of galaxy evolution with high statistical significance and excellent redshift accuracy. The UltraVISTA ultra-deep stripes are the region of the COSMOS field where these studies can be pushed to the highest redshifts, but securely identifying high-z galaxies, and determining their stellar masses, will only be possible if ultra-deep mid-IR data are available. Our IRAC observations will allow us to: 1) extend the galaxy stellar mass function at redshifts z=3 to z=5 to the intermediate mass regime (M~5x10^9-10^10 Msun), which is critical to constrain galaxy formation models; 2) gain a factor of six in the area where it is possible to effectively search for z>=6 galaxies and study their properties; 3) measure, for the first time, the large-scale structure traced by an unbiased galaxy sample at z=5 to z=7, and make the link to their host dark matter haloes. This cannot be done in any other field of the sky, as the UltraVISTA ultra-deep stripes form a quasi-contiguous, regular-shape field, which has a unique combination of large area and photometric depth. 4) provide a unique resource for the selection of secure z>5 targets for JWST and ALMA follow up. Our observations will have an enormous legacy value which amply justifies this new observing-time investment in the COSMOS field. Spitzer cannot miss this unique opportunity to open up a large 0.6 square-degree window to the early Universe.

  7. Deep-Earth reactor: Nuclear fission, helium, and the geomagnetic field

    PubMed Central

    Hollenbach, D. F.; Herndon, J. M.

    2001-01-01

    Geomagnetic field reversals and changes in intensity are understandable from an energy standpoint as natural consequences of intermittent and/or variable nuclear fission chain reactions deep within the Earth. Moreover, deep-Earth production of helium, having 3He/4He ratios within the range observed from deep-mantle sources, is demonstrated to be a consequence of nuclear fission. Numerical simulations of a planetary-scale geo-reactor were made by using the SCALE sequence of codes. The results clearly demonstrate that such a geo-reactor (i) would function as a fast-neutron fuel breeder reactor; (ii) could, under appropriate conditions, operate over the entire period of geologic time; and (iii) would function in such a manner as to yield variable and/or intermittent output power. PMID:11562483

  8. Identification of Telomerase Components and Telomerase Regulating Factors in Yeast

    DTIC Science & Technology

    1998-07-01

    subunit of telomerase in S. cerevisiae is encoded by TLC1 (7). Recently, through sequence comparison with the telomerase catalytic subunit from Euplotes...length maintenance has been unclear, although very recent data has shown that Ku80p can be found specifically associated with telomeric DNA in vivo...chromatin structure. It has recently been observed that loss of either YKU80 or HDF1 results in altered telomere end structure, such that there appears to

  9. Modeling and Simulating Transitions from Authoritarian Rule

    DTIC Science & Technology

    1993-01-01

    perceived to be the underdog. Another example is the Christian Democrats. None of those interviewed was a Roman Catholic, the main interest group...is because of their extreme brand of nationalism. At the time of their departure, you were glad to see them go. Their leaders have talked of the need...reason they spun off is because of their extreme brand of nationalism. At the time of their departure, the HDF was glad to see them go. Their leaders have

  10. Soil coring at multiple field environments can directly quantify variation in deep root traits to select wheat genotypes for breeding.

    PubMed

    Wasson, A P; Rebetzke, G J; Kirkegaard, J A; Christopher, J; Richards, R A; Watt, M

    2014-11-01

    We aim to incorporate deep root traits into future wheat varieties to increase access to stored soil water during grain development, which is twice as valuable for yield as water captured at younger stages. Most root phenotyping efforts have been indirect studies in the laboratory, at young plant stages, or using indirect shoot measures. Here, soil coring to 2 m depth was used across three field environments to directly phenotype deep root traits at grain development (depth, descent rate, density, length, and distribution). Shoot phenotypes at coring included canopy temperature depression (CTD), chlorophyll reflectance, and green leaf scoring, with developmental stage, biomass, and yield. Current varieties, and genotypes with breeding histories and plant architectures expected to promote deep roots, were used to maximize identification of variation due to genetics. Variation was observed for deep root traits (e.g. 111.4-178.5 cm (60%) for depth; 0.09-0.22 cm per °C day (144%) for descent rate) using soil coring in the field environments. There was significant variation for root traits between sites, and variation in the relative performance of genotypes between sites. However, genotypes were identified that performed consistently well or poorly at both sites. Furthermore, high-performing genotypes were statistically superior in root traits to low-performing genotypes and commercial varieties. Rooting depth showed a weak but significant negative correlation with green leaf score (-0.5), and positive correlations with CTD (0.45) and chlorophyll reflectance (0.32). Shoot phenotypes did not predict other root traits. This study suggests that field coring can directly identify variation in deep root traits to speed up selection of genotypes for breeding programmes. © The Author 2014. Published by Oxford University Press on behalf of the Society for Experimental Biology.
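The percentage variation quoted for each root trait is consistent with the range expressed relative to the minimum observed value; a minimal check, assuming that definition:

```python
# Check the trait-variation percentages quoted in the abstract, interpreting
# them as (max - min) / min for each observed range.
def range_variation(lo, hi):
    """Relative spread of an observed range, as a fraction of the minimum."""
    return (hi - lo) / lo


print(f"depth: {range_variation(111.4, 178.5):.0%}")       # → 60%
print(f"descent rate: {range_variation(0.09, 0.22):.0%}")  # → 144%
```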

  11. The Water Fraction of Calendula officinalis Hydroethanol Extract Stimulates In Vitro and In Vivo Proliferation of Dermal Fibroblasts in Wound Healing.

    PubMed

    Dinda, Manikarna; Mazumdar, Swagata; Das, Saurabh; Ganguly, Durba; Dasgupta, Uma B; Dutta, Ananya; Jana, Kuladip; Karmakar, Parimal

    2016-10-01

    The active fraction and/or compounds of Calendula officinalis responsible for wound healing are not yet known. In this work we studied the molecular target of C. officinalis hydroethanol extract (CEE) and its active fraction (the water fraction of the hydroethanol extract, WCEE) on primary human dermal fibroblasts (HDF). In vivo, CEE or WCEE was topically applied to excisional wounds of BALB/c mice and the rate of wound contraction and immunohistological studies were carried out. We found that CEE, and of its fractions only WCEE, significantly stimulated the proliferation as well as the migration of HDF cells. They also up-regulated the expression of connective tissue growth factor (CTGF) and α-smooth muscle actin (α-SMA) in vitro. In vivo, CEE- or WCEE-treated mice groups showed faster wound healing and increased expression of CTGF and α-SMA compared to the placebo control group. The increased expression of both proteins during the granulation phase of wound repair demonstrates the potential role of C. officinalis in wound healing. In addition, HPLC-ESI MS analysis of the active water fraction revealed the presence of two major compounds, rutin and quercetin-3-O-glucoside. Thus, our results showed that C. officinalis potentiated wound healing by stimulating the expression of CTGF and α-SMA, and we further identified its active compounds. Copyright © 2016 John Wiley & Sons, Ltd.

  12. Lithospermum erythrorhizon extract protects keratinocytes and fibroblasts against oxidative stress.

    PubMed

    Yoo, Hee Geun; Lee, Bong Han; Kim, Wooki; Lee, Jong Suk; Kim, Gun Hee; Chun, Ock K; Koo, Sung I; Kim, Dae-Ok

    2014-11-01

    Oxidative stress damages dermal and epidermal cells and degrades extracellular matrix proteins, such as collagen, ultimately leading to skin aging. The present study evaluated the potential protective effect of the aqueous methanolic extract obtained from Lithospermum erythrorhizon (LE) against oxidative stress, induced by H2O2 and ultraviolet (UV) irradiation, on human keratinocyte (HaCaT) and human dermal fibroblast-neonatal (HDF-n) cells. Exposure of cells to H2O2 or UVB irradiation markedly increased oxidative stress and reduced cell viability. However, pretreatment of cells with the LE extract not only increased cell viability (up to 84.5%) but also significantly decreased oxidative stress. Further, the LE extract downregulated the expression of matrix metalloproteinase-1, an endopeptidase that degrades extracellular matrix collagen. In contrast, treatment with the LE extract did not affect the expression of procollagen type 1 in HDF-n cells exposed to UVA irradiation. Thirteen phenolic compounds, including derivatives of shikonin and caffeic acid, were identified by ultrahigh-performance liquid chromatography-electrospray ionization-tandem mass spectrometry. These results suggest that LE-derived extracts may protect against oxidative-stress-induced skin aging by inhibiting degradation of skin collagen, and that this protection may derive at least in part from the antioxidant phenolics present in these extracts. Further studies are warranted to determine the potential utility of LE-derived extracts in both therapeutic and cosmetic applications.

  13. An 8 cm long holmium-doped fiber saturable absorber for Q-switched fiber laser generation at 2-μm region

    NASA Astrophysics Data System (ADS)

    Rahman, M. F. A.; Dhar, A.; Das, S.; Dutta, D.; Paul, M. C.; Rusdi, M. F. M.; Latiff, A. A.; Dimyati, K.; Harun, S. W.

    2018-07-01

    We demonstrate a Q-switched all-fiber laser operating in the 2-μm region by adding a piece of 8 cm long holmium-doped fiber (HDF) as a fiber saturable absorber (SA) in a thulium-doped fiber laser (TDFL) ring cavity. Doping of Ho ions into yttria-alumina silica glass was performed by the conventional Modified Chemical Vapor Deposition (MCVD) technique in conjunction with a solution doping process. The fabricated HDF has a linear absorption of 3 dB, with a core diameter of 10 μm and a numerical aperture of 0.18. Self-started Q-switching begins at a pump level of 418 mW and remains dominant up to a pump level of 564 mW. As the pump power increases, the repetition rate of the stable pulse train rises from 30.61 kHz to 38.89 kHz while the pulse width narrows from 3.18 μs to 2.27 μs. The maximum output power and maximum peak power are 5.05 mW and 57.2 mW, respectively, while the maximum pulse energy is calculated to be 129 nJ. The signal-to-noise ratio (SNR) of the fundamental frequency is 50 dB. Our work may contribute to the development of stable, robust, and economical SAs for pulsed fiber laser generation in the 2-μm region.

  14. Hemadsorption with Adult CytoSorb® in a Low Weight Pediatric Case

    PubMed Central

    Barascu, Ileana; Mc Kenzie Stancu, Samantha

    2017-01-01

    The cytokine adsorber CytoSorb has been used successfully as an adjunctive treatment for adult patients with elevated cytokine levels in the setting of severe sepsis and septic shock, and to reduce blood myoglobin, unconjugated bilirubin, and conjugated bilirubin. In this article we present the case of a nine-month-old male infant who was admitted to the NICU due to sepsis after cardiac surgery for tetralogy of Fallot, with multisystem organ failure (MSOF) including liver failure and renal failure, and who was successfully treated with a combination of continuous hemodiafiltration (HDF) and hemadsorption with CytoSorb. HDF was safe and effective from the first day for urea removal, but the patient's bilirubin levels kept increasing gradually, culminating on the 9th day in maximum values of 54 mg/dL total bilirubin and 31.67 mg/dL direct bilirubin, at which point we performed hemadsorption with CytoSorb. Over the 49-hour period of hemadsorption, total bilirubin decreased from 54 to 14 mg/dL and the patient's general status improved considerably, accompanied by a rapid drop in aminotransferases. Hemodynamic status improved as well, and inotrope requirements dropped rapidly. The patient's ventilation settings improved during CytoSorb treatment, permitting weaning from mechanical ventilation after five days of hemadsorption. The patient was discharged home in good general status after 34 days of hospitalization. PMID:28127473

  15. Hemadsorption with Adult CytoSorb® in a Low Weight Pediatric Case.

    PubMed

    Cirstoveanu, Catalin Gabriel; Barascu, Ileana; Mc Kenzie Stancu, Samantha

    2017-01-01

    The cytokine adsorber CytoSorb has been used successfully as an adjunctive treatment for adult patients with elevated cytokine levels in the setting of severe sepsis and septic shock, and to reduce blood myoglobin, unconjugated bilirubin, and conjugated bilirubin. In this article we present the case of a nine-month-old male infant who was admitted to the NICU due to sepsis after cardiac surgery for tetralogy of Fallot, with multisystem organ failure (MSOF) including liver failure and renal failure, and who was successfully treated with a combination of continuous hemodiafiltration (HDF) and hemadsorption with CytoSorb. HDF was safe and effective from the first day for urea removal, but the patient's bilirubin levels kept increasing gradually, culminating on the 9th day in maximum values of 54 mg/dL total bilirubin and 31.67 mg/dL direct bilirubin, at which point we performed hemadsorption with CytoSorb. Over the 49-hour period of hemadsorption, total bilirubin decreased from 54 to 14 mg/dL and the patient's general status improved considerably, accompanied by a rapid drop in aminotransferases. Hemodynamic status improved as well, and inotrope requirements dropped rapidly. The patient's ventilation settings improved during CytoSorb treatment, permitting weaning from mechanical ventilation after five days of hemadsorption. The patient was discharged home in good general status after 34 days of hospitalization.

  16. Landsat surface reflectance quality assurance extraction (version 1.7)

    USGS Publications Warehouse

    Jones, J.W.; Starbuck, M.J.; Jenkerson, Calli B.

    2013-01-01

    The U.S. Geological Survey (USGS) Land Remote Sensing Program is developing an operational capability to produce Climate Data Records (CDRs) and Essential Climate Variables (ECVs) from the Landsat Archive to support a wide variety of science and resource management activities from regional to global scale. The USGS Earth Resources Observation and Science (EROS) Center is charged with prototyping systems and software to generate these high-level data products. Various USGS Geographic Science Centers are charged with particular ECV algorithm development and (or) selection as well as the evaluation and application demonstration of various USGS CDRs and ECVs. Because it is a foundation for many other ECVs, the first CDR in development is the Landsat Surface Reflectance Product (LSRP). The LSRP incorporates data quality information in a bit-packed structure that is not readily accessible without postprocessing services performed by the user. This document describes two general methods of LSRP quality-data extraction for use in image processing systems. Helpful hints for the installation and use of software originally developed for manipulation of Hierarchical Data Format (HDF) produced through the National Aeronautics and Space Administration (NASA) Earth Observing System are first provided for users who wish to extract quality data into separate HDF files. Next, steps follow to incorporate these extracted data into an image processing system. Finally, an alternative example is illustrated in which the data are extracted within a particular image processing system.
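    Bit-packed quality flags like those in the LSRP quality band are typically unpacked with shifts and masks. The sketch below is a generic Python illustration of that extraction step; the bit positions shown in the usage comment are hypothetical, and the actual bit layout must be taken from the LSRP product documentation.

```python
import numpy as np

def extract_qa_bits(qa_band, start_bit, bit_count=1):
    """Unpack a bit-packed quality field from a QA band array.

    qa_band   : integer array of packed QA words (one per pixel)
    start_bit : position of the field's lowest bit (0 = least significant)
    bit_count : width of the field in bits

    Returns an array of the extracted field values.
    """
    mask = (1 << bit_count) - 1          # e.g. bit_count=2 -> 0b11
    return (qa_band >> start_bit) & mask # shift field down, mask off the rest

# Hypothetical usage: suppose a 2-bit "cloud confidence" field at bits 1-2.
qa = np.array([0b0110, 0b0001], dtype=np.uint16)
cloud_conf = extract_qa_bits(qa, start_bit=1, bit_count=2)
```

    The extracted array can then be written to a separate file or used directly as a mask in an image processing system, mirroring the two workflows the document describes.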

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Joon No; Dutta, Raghbendra Kumar; Kim, Seul-Gi

    Highlights: •A fasting–refeeding high-fat diet (HFD) model mimics irregular eating habits. •A fasting–refeeding HFD induces liver ballooning injury. •A fasting–refeeding HFD process elicits hepatic triglyceride accumulation. •Fenofibrate, a PPARα ligand, prevents liver damage induced by refeeding an HFD. -- Abstract: Fenofibrate, a peroxisome proliferator-activated receptor α (PPARα) agonist, is an anti-hyperlipidemic agent that has been widely used in the treatment of dyslipidemia. In this study, we examined the effect of fenofibrate on liver damage caused by refeeding a high-fat diet (HFD) in mice after 24 h of fasting. Here, we show that refeeding an HFD after fasting causes liver damage in mice, as determined by liver morphology and liver cell death. A detailed analysis revealed that hepatic lipid droplet formation is enhanced and liver triglyceride levels are increased by refeeding an HFD after starvation for 24 h. NF-κB is also activated and consequently induces the expression of TNF-α, IL1-β, COX-2, and NOS2. However, treatment with fenofibrate attenuates the liver damage and triglyceride accumulation caused by the fasting–refeeding HFD process. Fenofibrate reduces the expression of NF-κB target genes but induces genes for peroxisomal fatty acid oxidation, peroxisome biogenesis, and mitochondrial fatty acid oxidation. These results strongly suggest that fenofibrate treatment ameliorates the liver damage induced by fasting–refeeding HFD, possibly through the activation of fatty acid oxidation.

  18. Fully glutathione degradable waterborne polyurethane nanocarriers: Preparation, redox-sensitivity, and triggered intracellular drug release.

    PubMed

    Omrani, Ismail; Babanejad, Niloofar; Shendi, Hasan Kashef; Nabid, Mohammad Reza

    2017-01-01

    Polyurethanes are an important class of biomaterials that are extensively used in medical devices. In spite of their easy synthesis, polyurethanes that are fully degradable in response to the intracellular reducing environment are less explored for controlled drug delivery. Herein, a novel glutathione-degradable waterborne polyurethane (WPU) nanocarrier for redox-triggered intracellular delivery of a model lipophilic anticancer drug, doxorubicin (DOX), is reported. The WPU was prepared by polyaddition of isophorone diisocyanate (IPDI), a novel linear polyester polyol containing a disulfide linkage, a disulfide-labeled chain extender, and dimethylolpropionic acid (DMPA), using dibutyltin dilaurate (DBTDL) as a catalyst. The resulting polyurethane self-assembles into nanocarriers in water. Dynamic light scattering (DLS) measurements and scanning electron microscopy (SEM) revealed fast swelling and disruption of the nanocarriers under an intracellular reduction-mimicking environment. In vitro release studies showed that DOX was released in a controlled and redox-dependent manner. MTT assays showed that DOX-loaded WPU had high in vitro antitumor activity against both HDF noncancer cells and MCF-7 cancer cells. In addition, the blank WPU nanocarriers were found to be nontoxic to HDF and MCF-7 cells even at a high concentration of 2 mg/mL. Hence, nanocarriers based on disulfide-labeled WPU emerge as a new class of biocompatible, redox-degradable nanovehicles for efficient intracellular drug delivery. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. Passiflora tarminiana fruits reduce UVB-induced photoaging in human skin fibroblasts.

    PubMed

    Bravo, Karent; Duque, Luisa; Ferreres, Federico; Moreno, Diego A; Osorio, Edison

    2017-03-01

    Skin aging is a complex process that is strongly affected by UV radiation, which stimulates the production of reactive oxygen species (ROS) in the epidermis and dermis and subsequently causes skin damage. Among the major consequences are increased collagen degradation and reduced collagen synthesis. Previous reports have demonstrated the beneficial effects of polyphenols for healthy skin. Passiflora tarminiana Coppens & V.E. Barney, a species of the Passifloraceae family, is widely distributed in South America and is rich in flavonoids. We show that UVB radiation increases metalloproteinase 1 (MMP-1) and reduces procollagen production in human dermal fibroblast (HDF) cells in a dose- and time-dependent manner. We examined the antioxidant and antiaging effects of the extract and fractions of P. tarminiana fruits. The fractions showed high polyphenol content (620 mg EAG/g) and antioxidant activity, as measured by ORAC (4097 μmol ET/g) and ABTS (2992 μmol ET/g) assays. The aqueous fraction drastically inhibited the collagenase enzyme (IC50 0.43 μg/mL). The extract and fractions presented photoprotective effects by reducing UVB-induced MMP-1 production, increasing UVB-inhibited procollagen production, and decreasing ROS production after UVB irradiation in HDF. Finally, the polyphenol contents of the extracts and fractions from P. tarminiana were analyzed by HPLC-DAD-ESI-MSn, and procyanidins and glycosylated flavonoids were identified. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Scientific analogs and the development of human mission architectures for the Moon, deep space and Mars

    NASA Astrophysics Data System (ADS)

    Lim, D. S. S.; Abercromby, A.; Beaton, K.; Brady, A. L.; Cardman, Z.; Chappell, S.; Cockell, C. S.; Cohen, B. A.; Cohen, T.; Deans, M.; Deliz, I.; Downs, M.; Elphic, R. C.; Hamilton, J. C.; Heldmann, J.; Hillenius, S.; Hoffman, J.; Hughes, S. S.; Kobs-Nawotniak, S. E.; Lees, D. S.; Marquez, J.; Miller, M.; Milovsoroff, C.; Payler, S.; Sehlke, A.; Squyres, S. W.

    2016-12-01

    Analogs are destinations on Earth that allow researchers to approximate operational and/or physical conditions on other planetary bodies and within deep space. Over the past decade, our team has been conducting geobiological field science studies under simulated deep space and Mars mission conditions. Each of these missions integrates scientific and operational research with the goal of identifying concepts of operations (ConOps) and capabilities that will enable and enhance scientific return during human and human-robotic missions to the Moon, into deep space, and on Mars. Working under these simulated mission conditions presents a number of unique challenges that are not encountered during typical scientific field expeditions. However, there are significant benefits to this working model from the perspective of the human space flight and scientific operations research community. Specifically, by applying human (and human-robotic) mission architectures to real field science endeavors, we create a unique operational litmus test for those ConOps and capabilities that have otherwise been vetted under circumstances that did not necessarily demand scientific data return meeting the rigors of peer-review standards. The presentation will give an overview of our team's recent analog research, with a focus on scientific operations research. The intent is to encourage collaborative dialog with a broader set of analog research community members, with an eye towards future scientific field endeavors that will have a significant impact on how we design human and human-robotic missions to the Moon, into deep space, and to Mars.

  1. Galaxies in the Diffuse Baryon Field Approaching Reionization: A Joint Study with JWST, HST, and Large Telescopes

    NASA Astrophysics Data System (ADS)

    Simcoe, Robert

    2017-08-01

    Our team is conducting a dedicated survey for emission-line galaxies at 5 < z < 7 in six fields containing the best and brightest z > 6 quasars, using JWST/NIRCAM's slitless grism in a 110 hour GTO allocation. We have acquired deep near-IR spectra of the QSOs, revealing multiple heavy-element absorption systems and probing the HI optical depth within each object's survey volume. These data will provide the first systematic view of the circumgalactic medium at z > 4, allowing us to study early metal enrichment, correlations of the intergalactic HI optical depth with galaxy density, and the environment of the quasar hosts. These fields generally do not have deep multicolor photometry that would facilitate selection of broadband dropout galaxies for future observation with JWST/NIRSPEC. However during long spectroscopic integrations with NIRCAM's long channel we will obtain deep JWST photometry in F115W and F200W, together with F356W for wavelength calibration. Here we request 30 orbits with HST/ACS to acquire deep optical photometry that (together with the JWST IR bands) will constrain SED models and enable dropout selection of fainter objects. For lower redshift objects the rest-UV ACS data will improve estimates of star formation rate and stellar mass. Within a Small-GO program scope we will obtain sensitivity similar to CANDELS-Deep in all six fields, and approximately double the size of our galaxy sample appropriate for JWST/NIRSPEC followup at redshifts approaching the reionization epoch.

  2. Photometric redshifts for the CFHTLS T0004 deep and wide fields

    NASA Astrophysics Data System (ADS)

    Coupon, J.; Ilbert, O.; Kilbinger, M.; McCracken, H. J.; Mellier, Y.; Arnouts, S.; Bertin, E.; Hudelot, P.; Schultheis, M.; Le Fèvre, O.; Le Brun, V.; Guzzo, L.; Bardelli, S.; Zucca, E.; Bolzonella, M.; Garilli, B.; Zamorani, G.; Zanichelli, A.; Tresse, L.; Aussel, H.

    2009-06-01

    Aims: We compute photometric redshifts in the fourth public release of the Canada-France-Hawaii Telescope Legacy Survey. This unique multi-colour catalogue comprises u^*, g', r', i', z' photometry in four deep fields of 1 deg2 each and 35 deg2 distributed over three wide fields. Methods: We used a template-fitting method to compute photometric redshifts calibrated with a large catalogue of 16 983 high-quality spectroscopic redshifts from the VVDS-F02, VVDS-F22, DEEP2, and zCOSMOS surveys. The method includes correction of systematic offsets, template adaptation, and the use of priors. We also separated stars from galaxies using both size and colour information. Results: Comparing with galaxy spectroscopic redshifts, we find a photometric redshift dispersion, σ_Δz/(1+z_s), of 0.028-0.030 and an outlier rate, |Δz| ≥ 0.15 × (1+z_s), of 3-4% in the deep fields at i'_AB < 24. In the wide fields, we find a dispersion of 0.037-0.039 and an outlier rate of 3-4% at i'_AB < 22.5. Beyond i'_AB = 22.5 in the wide fields the number of outliers rises from 5% to 10% at i'_AB < 23 and i'_AB < 24, respectively. For the wide sample the systematic redshift bias stays below 1% to i'_AB < 22.5, whereas we find no significant bias in the deep fields. We investigated the effect of tile-to-tile photometric variations and demonstrated that the accuracy of our photometric redshifts is reduced by at most 21%. Application of our star-galaxy classifier reduced the contamination by stars in our catalogues from 60% to 8% at i'_AB < 22.5 in our field with the highest stellar density while keeping a complete galaxy sample. Our CFHTLS T0004 photometric redshifts are distributed to the community. Our release includes 592891 (i'_AB < 22.5) and 244701 (i'_AB < 24) reliable galaxy photometric redshifts in the wide and deep fields, respectively.
Based on observations obtained with MegaPrime/MegaCam, a joint project of CFHT and CEA/DAPNIA, at the Canada-France-Hawaii Telescope (CFHT) which is operated by the National Research Council (NRC) of Canada, the Institut National des Sciences de l'Univers of the Centre National de la Recherche Scientifique (CNRS) of France, and the University of Hawaii. This work is based in part on data products produced at Terapix and the Canadian Astronomy Data Centre as part of the Canada-France-Hawaii Telescope Legacy Survey, a collaborative project of NRC and CNRS.
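    The dispersion and outlier statistics quoted in this abstract can be reproduced from a matched photometric/spectroscopic sample. Below is a minimal sketch, assuming the commonly used normalized-median-absolute-deviation (NMAD) estimator for σ_Δz/(1+z_s); the survey's exact estimator may differ.

```python
import numpy as np

def photoz_metrics(z_phot, z_spec, outlier_thresh=0.15):
    """Photo-z quality metrics for a matched sample.

    Outliers are defined as in the abstract: |dz| >= 0.15*(1+z_spec).
    The dispersion here is the NMAD-style scatter of dz/(1+z_spec),
    sigma = 1.48 * median(|dz|/(1+z_spec))  (assumed estimator).
    """
    z_phot = np.asarray(z_phot, dtype=float)
    z_spec = np.asarray(z_spec, dtype=float)
    norm = np.abs(z_phot - z_spec) / (1.0 + z_spec)  # normalized residuals
    outlier_rate = np.mean(norm >= outlier_thresh)   # fraction of outliers
    sigma = 1.48 * np.median(norm)                   # robust dispersion
    return sigma, outlier_rate
```

    The NMAD form is preferred over a plain standard deviation because the catastrophic outliers it flags would otherwise dominate the scatter estimate.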

  3. Static Electric Fields and Lightning Over Land and Ocean in Florida Thunderstorms

    NASA Technical Reports Server (NTRS)

    Wilson, J. G.; Cummins, K. L.; Simpson, A. A.; Hinckley, A.

    2017-01-01

    Natural cloud-to-ground (CG) lightning and the charge structure of the associated clouds behave differently over land and ocean. Existing literature has raised questions over the years about the behavior of thunderstorms and lightning over the ocean, and open scientific questions remain. We expand on the observational datasets by obtaining identical electric field observations over coastal land, near-shore, and deep ocean regions during both clear-air and thunderstorm periods. Oceanic observations were obtained using two 3-meter NOAA buoys instrumented with Campbell Scientific electric field mills to measure the static electric fields. These data were compared to selected electric field records from the existing on-shore suite of 31 electric field mill sensors at Kennedy Space Center (KSC). CG lightning occurrence times, locations, and peak current values for both on-shore and ocean were provided by the U.S. National Lightning Detection Network. The buoy instruments were first evaluated on-shore at the Florida coast, to calibrate field enhancements and to confirm proper behavior of the system in elevated-field environments. The buoys were then moored 20 NM and 120 NM off the coast of KSC, in February 2014 (20 NM) and August 2014 (120 NM). Statistically larger CG peak currents were reported over the deep ocean for first strokes and for subsequent strokes with new contact points. Storm-related static fields were significantly larger at both oceanic sites, likely due to decreased screening by nearby space charge. Time-evolution of the static field during storm development and propagation indicated weak or missing lower positive charge regions in most storms that initiated over the deep ocean, supporting one mechanism for the observed high peak currents in negative first strokes over the deep ocean. This project also demonstrated the practicality of off-shore electric field measurements for safety-related decision making at KSC.

  4. Frontier Fields: Bringing the Distant Universe into View

    NASA Astrophysics Data System (ADS)

    Eisenhamer, Bonnie; Lawton, Brandon L.; Summers, Frank; Ryer, Holly

    2014-06-01

    The Frontier Fields is a multi-cycle program of six deep-field observations of strong-lensing galaxy clusters that will be taken in parallel with six deep “blank fields.” The three-year long collaborative program centers on observations from NASA’s Great Observatories, who will team up to look deeper into the universe than ever before, and potentially uncover galaxies that are as much as 100 times fainter than what the telescopes can typically see. Because of the unprecedented views of the universe that will be achieved, the Frontier Fields science program is ideal for informing audiences about scientific advances and topics in STEM. For example, the program provides an opportunity to look back on the history of deep field observations and how they changed (and continue to change) astronomy, while exploring the ways astronomers approach big science problems. As a result, the Space Telescope Science Institute’s Office of Public Outreach has initiated an education and public outreach (E/PO) project to follow the progress of the Frontier Fields program - providing a behind-the-scenes perspective of this observing initiative. This poster will highlight the goals of the Frontier Fields E/PO project and the cost-effective approach being used to bring the program’s results to both the public and educational audiences.

  5. Development of deep eutectic solvents applied in extraction and separation.

    PubMed

    Li, Xiaoxia; Row, Kyung Ho

    2016-09-01

    Deep eutectic solvents, an alternative to ionic liquids with greener credentials, have attracted considerable attention in chemical research, particularly for the extraction and separation of target compounds from natural products. This review highlights the preparation of deep eutectic solvents, their unique properties, and the synthesis of deep-eutectic-solvent-based materials, and also surveys their applications in extraction and separation. The available data and references in this field are reviewed to summarize the applications and developments of deep eutectic solvents. Building on these developments, the exploitation of new deep eutectic solvents and deep-eutectic-solvent-based materials is expected to further diversify extraction and separation. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. Bit Grooming: Statistically accurate precision-preserving quantization with compression, evaluated in the netCDF operators (NCO, v4.4.8+)

    DOE PAGES

    Zender, Charles S.

    2016-09-19

    Geoscientific models and measurements generate false precision (scientifically meaningless data bits) that wastes storage space. False precision can mislead (by implying noise is signal) and be scientifically pointless, especially for measurements. By contrast, lossy compression can be both economical (saving space) and heuristic (clarifying data limitations) without compromising the scientific integrity of the data. Data quantization can thus be appropriate regardless of whether space limitations are a concern. We introduce, implement, and characterize a new lossy compression scheme suitable for IEEE floating-point data. Our new Bit Grooming algorithm alternately shaves (to zero) and sets (to one) the least significant bits of consecutive values to preserve a desired precision. This is a symmetric, two-sided variant of an algorithm sometimes called Bit Shaving that quantizes values solely by zeroing bits. Our variation eliminates the artificial low bias produced by always zeroing bits, and makes Bit Grooming more suitable for arrays and multi-dimensional fields whose mean statistics are important. Bit Grooming relies on standard lossless compression to achieve the actual reduction in storage space, so we tested Bit Grooming by applying the DEFLATE compression algorithm to bit-groomed and full-precision climate data stored in netCDF3, netCDF4, HDF4, and HDF5 formats. Bit Grooming reduces the storage space required by initially uncompressed and compressed climate data by 25–80% and 5–65%, respectively, for single-precision values (the most common case for climate data) quantized to retain 1–5 decimal digits of precision. The potential reduction is greater for double-precision datasets. When used aggressively (i.e., preserving only 1–2 digits), Bit Grooming produces storage reductions comparable to other quantization techniques such as Linear Packing.
    Unlike Linear Packing, whose guaranteed precision rapidly degrades within the relatively narrow dynamic range of values that it can compress, Bit Grooming guarantees the specified precision throughout the full floating-point range. Data quantization by Bit Grooming is irreversible (i.e., lossy) yet transparent, meaning that no extra processing is required by data users/readers. Hence Bit Grooming can easily reduce data storage volume without sacrificing scientific precision or imposing extra burdens on users.
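    The alternating shave/set quantization described above can be illustrated for float32 data. The following is a minimal sketch of the idea, not the NCO implementation (which, among other things, maps the requested number of decimal digits to the number of mantissa bits to retain):

```python
import numpy as np

def bit_groom(data, keep_bits):
    """Alternately shave (zero) and set (one) the trailing mantissa bits
    of float32 values, retaining keep_bits explicit mantissa bits.

    Shaving alone biases values low; alternating shave/set on consecutive
    values roughly cancels that bias, preserving mean statistics.
    """
    ints = np.asarray(data, dtype=np.float32).view(np.uint32)
    drop = 23 - keep_bits                 # float32 has 23 explicit mantissa bits
    set_mask = (1 << drop) - 1            # trailing bits -> 1
    shave_mask = 0xFFFFFFFF ^ set_mask    # trailing bits -> 0
    out = ints.copy()
    out[0::2] &= np.uint32(shave_mask)    # even-index values: shave
    out[1::2] |= np.uint32(set_mask)      # odd-index values: set
    return out.view(np.float32)
```

    The quantized trailing bits form long runs of zeros and ones, which is what lets a downstream lossless coder such as DEFLATE achieve the storage reductions reported above.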

  7. Deep convolutional neural network based antenna selection in multiple-input multiple-output system

    NASA Astrophysics Data System (ADS)

    Cai, Jiaxin; Li, Yan; Hu, Ying

    2018-03-01

    Antenna selection in wireless communication systems has attracted increasing attention due to the challenge of balancing communication performance against computational complexity in large-scale Multiple-Input Multiple-Output antenna systems. Recently, deep learning based methods have achieved promising performance for large-scale data processing and analysis in many application fields. This paper is the first attempt to introduce the deep learning technique into the field of Multiple-Input Multiple-Output antenna selection in wireless communications. First, labels for the channel matrix of attenuation coefficients are generated by minimizing the key performance indicator over the training antenna systems. Then, a deep convolutional neural network that explicitly exploits the massive latent cues in the attenuation coefficients is learned on the training antenna systems. Finally, we use the trained deep convolutional neural network to classify the channel matrix labels of test antennas and select the optimal antenna subset. Simulation results demonstrate that our method achieves better performance than state-of-the-art baselines for data-driven wireless antenna selection.

  8. Resolution enhancement of wide-field interferometric microscopy by coupled deep autoencoders.

    PubMed

    Işil, Çağatay; Yorulmaz, Mustafa; Solmaz, Berkan; Turhan, Adil Burak; Yurdakul, Celalettin; Ünlü, Selim; Ozbay, Ekmel; Koç, Aykut

    2018-04-01

    Wide-field interferometric microscopy is a highly sensitive, label-free, and low-cost biosensing imaging technique capable of visualizing individual biological nanoparticles such as viral pathogens and exosomes. However, further resolution enhancement is necessary to increase detection and classification accuracy of subdiffraction-limited nanoparticles. In this study, we propose a deep-learning approach, based on coupled deep autoencoders, to improve resolution of images of L-shaped nanostructures. During training, our method utilizes microscope image patches and their corresponding manual truth image patches in order to learn the transformation between them. Following training, the designed network reconstructs denoised and resolution-enhanced image patches for unseen input.

  9. The Deep Space Network as an instrument for radio science research

    NASA Technical Reports Server (NTRS)

    Asmar, S. W.; Renzetti, N. A.

    1993-01-01

    Radio science experiments use radio links between spacecraft and sensor instrumentation that is implemented in the Deep Space Network. The deep space communication complexes along with the telecommunications subsystem on board the spacecraft constitute the major elements of the radio science instrumentation. Investigators examine small changes in the phase and/or amplitude of the radio signal propagating from a spacecraft to study the atmospheric and ionospheric structure of planets and satellites, planetary gravitational fields, shapes, masses, planetary rings, ephemerides of planets, solar corona, magnetic fields, cometary comae, and such aspects of the theory of general relativity as gravitational waves and gravitational redshift.

  10. A Comparison of Peak Electric Fields and GICs in the Pacific Northwest Using 1-D and 3-D Conductivity

    NASA Astrophysics Data System (ADS)

    Gannon, J. L.; Birchfield, A. B.; Shetye, K. S.; Overbye, T. J.

    2017-11-01

    Geomagnetically induced currents (GICs) are a result of the changing magnetic fields during a geomagnetic disturbance interacting with the deep conductivity structures of the Earth. When assessing GIC hazard, it is a common practice to use layer-cake or one-dimensional conductivity models to approximate deep Earth conductivity. In this paper, we calculate the electric field and estimate GICs induced in the long lines of a realistic system model of the Pacific Northwest, using the traditional 1-D models, as well as 3-D models represented by Earthscope's Electromagnetic transfer functions. The results show that the peak electric field during a given event has considerable variation across the analysis region in the Pacific Northwest, but the 1-D physiographic approximations may accurately represent the average response of an area, although corrections are needed. Rotations caused by real deep Earth conductivity structures greatly affect the direction of the induced electric field. This effect may be just as, or more, important than peak intensity when estimating GICs induced in long bulk power system lines.

  11. Deep Vadose Zone Treatability Test for the Hanford Central Plateau. Interim Post-Desiccation Monitoring Results, Fiscal Year 2015

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Truex, Michael J.; Strickland, Christopher E.; Oostrom, Martinus

    A field test of desiccation is being conducted as an element of the Deep Vadose Zone Treatability Test Program. The active desiccation portion of the test has been completed. Monitoring data have been collected at the field test site during the post-desiccation period and are reported herein. This is an interim data summary report that includes about 4 years of post-desiccation monitoring data. The DOE field test plan prescribes a total of 5 years of post-desiccation monitoring.

  12. The scheme of LLSST based on inter-satellite link for planet gravity field measurement in deep-space mission

    NASA Astrophysics Data System (ADS)

    Yang, Yikang; Li, Xue; Liu, Lei

    2009-12-01

    Gravity field measurement of planets of interest and their moons in the solar system, such as the Moon and Mars, is an important task in the next stage of deep-space missions. In this paper, similarly to the GRACE mission, LLSST and DOWR technology with common-orbit master–slave satellites around the target planet is adopted. Furthermore, via an inter-satellite 2-way UQPSK-DSSS link, time synchronization and data processing are implemented autonomously by the master–slave satellites instead of relying on GPS and ground-facility support. We conclude that ISL DOWR based on 2-way incoherent time synchronization achieves the same precision level as GRACE DOWR based on GPS time synchronization. Moreover, because of the inter-satellite link, the proposed scheme is largely autonomous for gravity field measurement of the target planet in deep-space missions.

  13. Comment on ‘Deep convolutional neural network with transfer learning for rectum toxicity prediction in cervical cancer radiotherapy: a feasibility study’

    NASA Astrophysics Data System (ADS)

    Valdes, Gilmer; Interian, Yannet

    2018-03-01

The application of machine learning (ML) presents tremendous opportunities for the field of oncology, and so we read ‘Deep convolutional neural network with transfer learning for rectum toxicity prediction in cervical cancer radiotherapy: a feasibility study’ with great interest. In this article, the authors used state-of-the-art techniques: a pre-trained convolutional neural network (VGG-16 CNN), transfer learning, data augmentation, dropout, and early stopping, all of which are directly responsible for the success and excitement that these algorithms have created in other fields. We believe that the use of these techniques can offer tremendous opportunities in the field of Medical Physics, and as such we would like to praise the authors for their pioneering application to the field of Radiation Oncology. That being said, given that the field of Medical Physics has unique characteristics that differentiate it from the fields where these techniques have been applied successfully, we would like to raise some points for future discussion and follow-up studies that could help the community understand the limitations and nuances of deep learning techniques.

  14. Photometric redshifts for the next generation of deep radio continuum surveys - I. Template fitting

    NASA Astrophysics Data System (ADS)

    Duncan, Kenneth J.; Brown, Michael J. I.; Williams, Wendy L.; Best, Philip N.; Buat, Veronique; Burgarella, Denis; Jarvis, Matt J.; Małek, Katarzyna; Oliver, S. J.; Röttgering, Huub J. A.; Smith, Daniel J. B.

    2018-01-01

    We present a study of photometric redshift performance for galaxies and active galactic nuclei detected in deep radio continuum surveys. Using two multiwavelength data sets, over the NOAO Deep Wide Field Survey Boötes and COSMOS fields, we assess photometric redshift (photo-z) performance for a sample of ∼4500 radio continuum sources with spectroscopic redshifts relative to those of ∼63 000 non-radio-detected sources in the same fields. We investigate the performance of three photometric redshift template sets as a function of redshift, radio luminosity and infrared/X-ray properties. We find that no single template library is able to provide the best performance across all subsets of the radio-detected population, with variation in the optimum template set both between subsets and between fields. Through a hierarchical Bayesian combination of the photo-z estimates from all three template sets, we are able to produce a consensus photo-z estimate that equals or improves upon the performance of any individual template set.
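A minimal sketch of the kind of hierarchical Bayesian combination described above, under the simplifying assumption that each template library's redshift PDF is either reliable (with probability beta) or should be replaced by a flat prior, with beta marginalised over. The function name, grids, and toy Gaussian PDFs are ours for illustration, not the paper's:

```python
import numpy as np

def combine_photoz(pdfs, z, betas=np.linspace(0.5, 1.0, 26)):
    """Consensus photo-z PDF from several template libraries.

    Each input PDF is treated as reliable with probability beta and
    otherwise replaced by a flat prior; beta is marginalised over.
    """
    dz = z[1] - z[0]                             # uniform grid assumed
    pdfs = [p / (p.sum() * dz) for p in pdfs]    # normalise the inputs
    flat = np.full_like(z, 1.0 / (z[-1] - z[0]))
    combined = np.zeros_like(z)
    for beta in betas:                           # marginalise over beta
        prod = np.ones_like(z)
        for p in pdfs:
            prod *= beta * p + (1.0 - beta) * flat
        combined += prod
    return combined / (combined.sum() * dz)      # renormalise

z = np.linspace(0.0, 4.0, 801)
gauss = lambda mu, sig: np.exp(-0.5 * ((z - mu) / sig) ** 2)
# Two libraries agree near z ~ 1; a third prefers z ~ 2.5 and is outvoted.
consensus = combine_photoz([gauss(1.0, 0.1), gauss(1.05, 0.15), gauss(2.5, 0.3)], z)
```

The multiplicative combination rewards redshifts where the libraries agree, which is why the consensus peak lands near z ~ 1 rather than splitting the difference with the outlier.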

  15. Deep rooting conferred by DEEPER ROOTING 1 enhances rice yield in paddy fields.

    PubMed

    Arai-Sanoh, Yumiko; Takai, Toshiyuki; Yoshinaga, Satoshi; Nakano, Hiroshi; Kojima, Mikiko; Sakakibara, Hitoshi; Kondo, Motohiko; Uga, Yusaku

    2014-07-03

    To clarify the effect of deep rooting on grain yield in rice (Oryza sativa L.) in an irrigated paddy field with or without fertilizer, we used the shallow-rooting IR64 and the deep-rooting Dro1-NIL (a near-isogenic line homozygous for the Kinandang Patong allele of DEEPER ROOTING 1 (DRO1) in the IR64 genetic background). Although total root length was similar in both lines, more roots were distributed within the lower soil layer of the paddy field in Dro1-NIL than in IR64, irrespective of fertilizer treatment. At maturity, Dro1-NIL showed approximately 10% higher grain yield than IR64, irrespective of fertilizer treatment. Higher grain yield of Dro1-NIL was mainly due to the increased 1000-kernel weight and increased percentage of ripened grains, which resulted in a higher harvest index. After heading, the uptake of nitrogen from soil and leaf nitrogen concentration were higher in Dro1-NIL than in IR64. At the mid-grain-filling stage, Dro1-NIL maintained higher cytokinin fluxes from roots to shoots than IR64. These results suggest that deep rooting by DRO1 enhances nitrogen uptake and cytokinin fluxes at late stages, resulting in better grain filling in Dro1-NIL in a paddy field in this study.

  16. Deep learning for computational chemistry.

    PubMed

    Goh, Garrett B; Hodas, Nathan O; Vishnu, Abhinav

    2017-06-15

The rise and fall of artificial neural networks is well documented in the scientific literature of both computer science and computational chemistry. Yet almost two decades later, we are now seeing a resurgence of interest in deep learning, a machine learning algorithm based on multilayer neural networks. Within the last few years, we have seen the transformative impact of deep learning in many domains, particularly in speech recognition and computer vision, to the extent that the majority of expert practitioners in those fields are now regularly eschewing prior established models in favor of deep learning models. In this review, we provide an introductory overview of the theory of deep neural networks and their unique properties that distinguish them from traditional machine learning algorithms used in cheminformatics. By providing an overview of the variety of emerging applications of deep neural networks, we highlight their ubiquity and broad applicability to a wide range of challenges in the field, including quantitative structure-activity relationships, virtual screening, protein structure prediction, quantum chemistry, materials design, and property prediction. In reviewing the performance of deep neural networks, we observed consistent outperformance of non-neural-network state-of-the-art models across disparate research topics, and deep neural network-based models often exceeded the "glass ceiling" expectations of their respective tasks. Coupled with the maturity of GPU-accelerated computing for training deep neural networks and the exponential growth of chemical data on which to train these networks, we anticipate that deep learning algorithms will be a valuable tool for computational chemistry. © 2017 Wiley Periodicals, Inc.

  17. Characterization of deep level defects and thermally stimulated depolarization phenomena in La-doped TlInS{sub 2} layered semiconductor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seyidov, MirHasan Yu., E-mail: smirhasan@gyte.edu.tr; Suleymanov, Rauf A.; Mikailzade, Faik A.

    2015-06-14

Lanthanum-doped, high-quality TlInS{sub 2} (TlInS{sub 2}:La) ferroelectric-semiconductor was characterized by photo-induced current transient spectroscopy (PICTS). Different impurity centers were resolved and identified, and the experimental data were analyzed to determine the characteristic parameters of the extrinsic and intrinsic defects. The energies and capture cross sections of deep traps were obtained using the heating rate method. The observed changes in the Thermally Stimulated Depolarization Currents (TSDC) near the phase transition points in TlInS{sub 2}:La are interpreted as a result of self-polarization of the crystal due to the internal electric field caused by charged defects. The TSDC spectra show depolarization peaks attributed to defects of dipolar origin; these peaks provide important information on the defect structure and localized energy states in TlInS{sub 2}:La. Thermal treatment of TlInS{sub 2}:La under an external electric field applied at different temperatures allowed us to identify a TSDC peak originating from the La dopant. It was established that the deep energy level trap BTE43, which is active at low temperature (T ≤ 156 K) and has an activation energy of 0.29 eV and a capture cross section of 2.2 × 10{sup −14} cm{sup 2}, corresponds to the La dopant. According to the PICTS results, the deep level trap center B5 is activated in the temperature region of the incommensurate (IC) phases of TlInS{sub 2}:La, which have a giant static dielectric constant due to structural disorder. From the PICTS simulation results for B5, a native deep level trap with an activation energy of 0.3 eV and a capture cross section of 1.8 × 10{sup −16} cm{sup 2} was established. A substantial amount of residual space charge is trapped by the deep level localized energy states of B5 in the IC phase.
While the external electric field is applied, permanent dipoles originating from the charged B5 deep level defects are aligned in the direction of the applied field, and the equilibrium polarization can be reached in a relatively short time. When the polarizing field is maintained while the sample is cooled to a sufficiently low temperature, the relaxation times of the aligned dipoles increase drastically. In practice, a frozen internal electric field, or electret state, remains inside the TlInS{sub 2}:La when the applied bias field is switched off. The influence of deep level defects on the TSDC spectra of TlInS{sub 2}:La has been revealed for the first time.
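The heating rate method used above can be sketched as an Arrhenius analysis: for first-order kinetics, the peak temperature T_m at heating rate beta satisfies beta·E_a/(k·T_m²) = s·exp(−E_a/(k·T_m)), so ln(T_m²/beta) is linear in 1/T_m with slope E_a/k. The synthetic example below (the attempt frequency s and the peak temperatures are invented) recovers the 0.29 eV trap depth quoted for BTE43:

```python
import numpy as np

k_B = 8.617e-5                      # Boltzmann constant, eV/K
E_a, s = 0.29, 1e12                 # assumed trap depth (eV) and attempt frequency (1/s)

# Generate self-consistent (heating rate, peak temperature) pairs from
# the first-order peak condition beta*E_a/(k*Tm^2) = s*exp(-E_a/(k*Tm)).
T_m = np.array([120.0, 125.0, 130.0, 135.0, 140.0])          # peak temps, K
beta = s * k_B * T_m**2 * np.exp(-E_a / (k_B * T_m)) / E_a   # implied rates, K/s

# Arrhenius fit: ln(Tm^2/beta) versus 1/Tm has slope E_a / k_B.
slope, _ = np.polyfit(1.0 / T_m, np.log(T_m**2 / beta), 1)
E_fit = slope * k_B                 # recovered activation energy, eV
```

In a real measurement the (beta, T_m) pairs come from repeated TSDC/PICTS scans at different heating rates, and the intercept additionally constrains the attempt frequency s.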

  18. Dro1, a major QTL involved in deep rooting of rice under upland field conditions.

    PubMed

    Uga, Yusaku; Okuno, Kazutoshi; Yano, Masahiro

    2011-05-01

    Developing a deep root system is an important strategy for avoiding drought stress in rice. Using the 'basket' method, the ratio of deep rooting (RDR; the proportion of total roots that elongated through the basket bottom) was calculated to evaluate deep rooting. A new major quantitative trait locus (QTL) controlling RDR was detected on chromosome 9 by using 117 recombinant inbred lines (RILs) derived from a cross between the lowland cultivar IR64, with shallow rooting, and the upland cultivar Kinandang Patong (KP), with deep rooting. This QTL explained 66.6% of the total phenotypic variance in RDR in the RILs. A BC(2)F(3) line homozygous for the KP allele of the QTL had an RDR of 40.4%, compared with 2.6% for the homozygous IR64 allele. Fine mapping of this QTL was undertaken using eight BC(2)F(3) recombinant lines. The RDR QTL Dro1 (Deeper rooting 1) was mapped between the markers RM24393 and RM7424, which delimit a 608.4 kb interval in the reference cultivar Nipponbare. To clarify the influence of Dro1 in an upland field, the root distribution in different soil layers was quantified by means of core sampling. A line homozygous for the KP allele of Dro1 (Dro1-KP) and IR64 did not differ in root dry weight in the shallow soil layers (0-25 cm), but root dry weight of Dro1-KP in deep soil layers (25-50 cm) was significantly greater than that of IR64, suggesting that Dro1 plays a crucial role in increased deep rooting under upland field conditions.

  19. FRONTIER FIELDS: HIGH-REDSHIFT PREDICTIONS AND EARLY RESULTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coe, Dan; Bradley, Larry; Zitrin, Adi, E-mail: DCoe@STScI.edu

    2015-02-20

The Frontier Fields program is obtaining deep Hubble and Spitzer Space Telescope images of new ''blank'' fields and nearby fields gravitationally lensed by massive galaxy clusters. The Hubble images of the lensed fields are revealing nJy sources (AB mag > 31), the faintest galaxies yet observed. The full program will transform our understanding of galaxy evolution in the first 600 million years (z > 9). Previous programs have yielded a dozen or so z > 9 candidates, including perhaps fewer than expected in the Ultra Deep Field and more than expected in shallower Hubble images. In this paper, we present high-redshift (z > 6) number count predictions for the Frontier Fields and candidates in three of the first Hubble images. We show that the full Frontier Fields program may yield up to ∼70 z > 9 candidates (∼6 per field). We base this estimate on an extrapolation of luminosity functions observed between 4 < z < 8 and gravitational lensing models submitted by the community. However, in the first two deep infrared Hubble images obtained to date, we find z ∼ 8 candidates but no strong candidates at z > 9. We defer quantitative analysis of the z > 9 deficit (including detection completeness estimates) to future work including additional data. At these redshifts, cosmic variance (field-to-field variation) is expected to be significant (greater than ±50%) and include clustering of early galaxies formed in overdensities. The full Frontier Fields program will significantly mitigate this uncertainty by observing six independent sightlines, each with a lensing cluster and nearby blank field.
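Count predictions of this kind rest on extrapolating a luminosity function and applying lensing magnification. A toy version, with all Schechter parameters and the distance modulus chosen for illustration only (not the paper's values), shows the two competing effects: magnification mu reaches 2.5·log10(mu) magnitudes deeper but dilutes the source-plane area by mu:

```python
import numpy as np

def counts_brighter_than(m_lim, phi_star=1e-3, M_star=-20.5, alpha=-2.2,
                         dist_mod=47.0):
    """Surface density (arbitrary normalisation) of galaxies brighter than
    apparent magnitude m_lim, for a Schechter luminosity function."""
    M = np.linspace(m_lim - dist_mod - 15.0, m_lim - dist_mod, 4000)
    L = 10.0 ** (-0.4 * (M - M_star))                  # luminosity in L* units
    phi = 0.4 * np.log(10.0) * phi_star * L ** (alpha + 1) * np.exp(-L)
    return phi.sum() * (M[1] - M[0])                   # integrate over M

mu = 5.0                                               # lensing magnification
unlensed = counts_brighter_than(29.0)
# Deeper effective limit, but the source-plane area shrinks by mu:
lensed = counts_brighter_than(29.0 + 2.5 * np.log10(mu)) / mu
boost = lensed / unlensed
```

For a steep faint-end slope (alpha steeper than −2 in this parameterisation), the deeper limit wins and lensed fields yield more candidates per unit image area, which is the rationale for pairing each cluster with a blank field.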

  20. First Results from the UT1 Science Verification Programme

    NASA Astrophysics Data System (ADS)

    1998-11-01

Performance verification is a step that has regularly been employed in space missions to assess and qualify the scientific capabilities of an instrument. Within this framework, it was the goal of the Science Verification programme to submit the VLT Unit Telescope No. 1 (UT1) to the scrutiny that can only be achieved in an actual attempt to produce scientifically valuable results. To this end, an attractive and diversified set of observations was planned in advance to be executed at the VLT. These Science Verification observations at VLT UT1 took place as planned in the period from August 17 to September 1, 1998; cf. the September issue of the ESO Messenger (No. 93, p. 1) and ESO PR 12/98 for all details. Although the meteorological conditions on Paranal were definitely below average, the telescope worked with spectacular efficiency and performance throughout the entire period, and very valuable data were gathered. After completion of all observations, the Science Verification Team prepared all of the datasets for the public release that took place on October 2, 1998. The data related to the Hubble Deep Field South (now extensively observed by the Hubble Space Telescope) were made public world-wide, while the release of other data was restricted to ESO member states. With this public release, ESO intended to achieve two specific goals: to offer the scientific community an early opportunity to work on valuable VLT data, and at the same time to submit the VLT to the widest possible scrutiny. Following the release, many scientists began to analyse the VLT data scientifically, and the following few examples of research programmes are meant to give a sample of the work that has been carried out on the Science Verification data during the past two months. They represent typical investigations that will be carried out in the future with the VLT.
Many of these will be directed towards the distant universe, in order to gain insight into the formation and evolution of galaxies, galaxy clusters, and large-scale structure. Others will concentrate on more nearby objects, including stars and nebulae in the Milky Way galaxy, and some will attempt to study our own solar system. The following six research programmes were presented at the Press Conference that took place at the ESO Headquarters in Garching (Germany) today. Deep Galaxy Counts and Photometric Redshifts in the HDF-S NIC3 Field The goal of this programme was to verify the capability of the VLT by obtaining the deepest possible ground-based images and using multicolour information to derive the redshifts (and hence the distances) of the faintest galaxies. The space distribution, luminosity and colour of these extreme objects may provide crucial information on the initial phases of the evolution of the universe. The method is known as photometric redshift determination. The VLT Test Camera was used to collect CCD images for a total of 16.6 hours in five spectral filters (U, B, V, R and I) in the so-called HDF-S NIC3 field. This is a small area (about 1 arcmin square) of the southern sky where very deep observations in the infrared bands J, H and K (1.1, 1.6 and 2.2 µm, respectively) have been obtained by the Hubble Space Telescope (HST). The observations were combined and analyzed by a team of astronomers at ESO and the Observatory of Rome (Italy). Galaxies were detected in the field down to magnitude ~27-28. In most colours, the planned limiting values of the fluxes were successfully reached. ESO PR Photo 48a/98 shows some examples of photometric redshift determination for faint galaxies in the HDF-S NIC3 field.
The filled points are the fluxes measured in the five colours observed with the VLT Test Camera (U, B, V, R and I) and in the infrared H spectral band with the NICMOS instrument on the Hubble Space Telescope. The curves constitute the best fit to the points obtained from a library of more than 400,000 synthetic spectra of galaxies at various redshifts (Fontana et al., in preparation). For most of these very faint sources, it is not possible to collect enough photons to measure the recession velocity (the redshift) by spectroscopy, even with an 8-m telescope. The redshifts and the main galaxy properties are then determined by comparing the colour observations with synthetic spectra (see PR Photo 48a/98). This has been done for more than one hundred galaxies in the field brighter than magnitude 26.5. Around 20 are found to be at redshifts larger than 2. The brighter ones are excellent candidates for future detailed studies with the UT1 instruments FORS1 and ISAAC. The scientists involved in this study are: Sandro D'Odorico, Richard Hook, Alvio Renzini, Piero Rosati, Rodolfo Viezzer (ESO) and Adriano Fontana, Emanuele Giallongo, Francesco Poli (Rome Observatory, Italy). A Gravitational Einstein Ring Because the gravitational pull of matter bends the path of light rays, astronomical objects - stars, galaxies and galaxy clusters - can act like lenses, which magnify and severely distort the images of galaxies behind them, producing weird pictures as in a hall of mirrors. In the most extreme case, where the foreground lensing galaxy and the background galaxy are perfectly lined up, the image of the background galaxy is stretched into a ring. Such an image is known as an Einstein ring, because the correct formula for the bending of light was first described by the famous physicist Albert Einstein.
PR Photo 48b/98 (left) shows a new, true colour image of an Einstein ring (upper centre of photo), first discovered at ESO in 1995. The ring, which is the stretched image of a galaxy far out in the Universe, stands out clearly in green, and the red galaxy inside the ring is the lens. The discovery image was very faint, but this new picture, taken with the VLT during the Science Verification Programme, allows a much clearer view of the ring because of the great light-gathering capacity of the telescope and, not least, because of the superb image quality. In Photo 48c/98 (right), four images illustrate the deduced model of the lensing effect. In the upper left, the observed ring has been enlarged and the image of the lensing galaxy removed by image processing. Below it is a model of the gravitational field (potential) around this galaxy, along with the "true" image of the background galaxy. At the lower right is the resulting gravitationally magnified and distorted image of the background galaxy, which to the upper right has been de-sharpened to the same image quality as the observed image. The similarity between the two is most convincing.
Gravitational lensing provides a very useful tool with which to study the Universe. As "weighing scales", it provides a measure of the mass within the lensing body, and as a "magnifying glass", it allows us to see details in objects which would otherwise be beyond the reach of current telescopes. This new detailed picture has allowed a much more accurate measurement of the mass of the lensing galaxy, revealing the presence of vast quantities of "unseen" matter, five times more than if just the light from the galaxy is taken into account. This additional material represents some of the Universe's dark matter. The gravitational lens action is also magnifying the background object by a factor of ten, providing an unparalleled view of this very distant galaxy, which is in a stage of active star formation. The scientists involved in this study are: Palle Møller (ESO), Stephen J. Warren (Blackett Laboratory, Imperial College, UK), Paul C. Hewett (Institute of Astronomy, Cambridge, UK) and Geraint F. Lewis (Dept. of Physics and Astronomy, University of Victoria, Canada). An Extremely Red Galaxy One of the main goals of modern cosmology is to understand when and how the galaxies formed. In recent years, many high-redshift (i.e. very distant) galaxies have been found, suggesting that some galaxies were already assembled when the Universe was much younger than now. None of these high-redshift galaxies has ever been found to be a bona-fide red elliptical galaxy. The VLT, however, with its very good capabilities for infrared observations, is an ideal instrument to investigate when and how the red elliptical galaxies formed. The VLT Science Verification images have provided unique multicolour information about an extremely red galaxy that was originally identified (Treu et al., 1998, A&A Letters, Vol. 340, p. 10) on the Hubble Deep Field South (HDF-S) Test Image. This galaxy is shown in PR Photo 48d/98, which is an enlargement from ESO PR Photo 35b/98.
It was detected on near-IR images and also on images obtained in the optical part of the spectrum, at the very faint limit of magnitude B ~ 29 in the blue. However, this galaxy has not been detected in the near-ultraviolet band. PR Photo 48d/98 (left) shows the very red galaxy (at the arrow) in the Hubble Deep Field South, discussed here. Photo 48e/98 (right) is the spectrum of a typical elliptical galaxy, redshifted to z = 1.8 and compared with the brightness of the galaxy in different wavebands (crosses), as measured during the VLT SV programme and the Hubble Deep Field South Test Program (the cross to the right). The arrow indicates the upper limit set by the VLT SV in the ultraviolet band. It can be seen that these observations are fully consistent with the object being an old, elliptical galaxy at the high redshift of z = 1.8, i.e. at an epoch when the Universe was much younger than now. The new ISAAC instrument at VLT UT1 will be able to obtain an infrared spectrum of this galaxy and thus to affirm or refute this provisional conclusion. The colours measured at the VLT and on the HST Test Image are very well matched by those of an old elliptical galaxy at redshift z ~ 1.8; see Photo 48e/98. All the available evidence is thus consistent with this object being an elliptical galaxy with the highest known redshift for this galaxy type. A preliminary analysis of Hubble Deep Field South data, just released, seems to support this hypothesis. If these conclusions are confirmed by direct measurement of its spectrum, this galaxy must already have been "old" (i.e. significantly evolved) when the Universe had an age of only about one fifth of its present value.
A spectroscopic confirmation is still outstanding, but is now possible with the ISAAC instrument at VLT UT1. A positive result would demonstrate that elliptical galaxies can form very early in the history of the Universe. The scientists involved in this study are: Massimo Stiavelli, Tommaso Treu (also Scuola Normale Superiore, Italy), Stefano Casertano, Mark Dickinson, Henry Ferguson, Andrew Fruchter, Crystal Martin (STScI, Baltimore, USA), Piero Rosati and Rodolfo Viezzer (ESO), Marcella Carollo (Johns Hopkins University, Baltimore, USA) and Henry Tieplitz (NASA, Goddard Space Flight Center, Greenbelt, USA). Lyman-alpha Companions and Extended Nebulosity around a Quasar at Redshift z=2.2 In current theories of galaxy formation, the luminous galaxies we see today were built up through repeated merging of smaller protogalactic clumps. Quasars, prodigious sources pouring out 100 to 1000 times as much light as an entire galaxy, have been used as markers of galaxy formation activity and have guided astronomers in their hunt for primeval galaxies and large-scale structures at high redshift. A supermassive black hole, swallowing stars, gas and dust, is thought to be the engine powering a quasar, and the interaction of the galaxy hosting the black hole with neighboring galaxies is expected to play a key role in "feeding the monster". At intermediate redshift, a large fraction of radio-loud quasars and radio galaxies inhabit rich clusters of galaxies, whereas radio-quiet quasars are rarely found in very rich environments. Furthermore, tidal interaction between quasars and their nearby companions is also the favoured explanation for the presence of large gaseous nebulosities associated with radio-loud quasars and radio galaxies. At high redshift, searches for Lyman-alpha quasar companions and emission-line nebulosities show strong similarities with those seen at lower redshift, although the detection rate is lower.
PR Photo 48f/98 (left) is a false-colour reproduction of a B-band image of the field around the radio-weak quasar J2233-606 in the Hubble Deep Field South (HDF-S). Photo 48g/98 (right) represents emission from the same direction at a wavelength that corresponds to Lyman-alpha emission at the redshift (z = 2.2) of the quasar. Three Lyman-alpha candidate companions are indicated with arrows. Note also the extended nebulosity around the quasar. A search for Lyman-alpha companions to the radio-weak quasar J2233-606 in the Hubble Deep Field South (HDF-S) was conducted during the VLT UT1 SV programme in a small field of 1.2 x 1.3 arcmin^2, centered on the quasar. Candidate Lyman-alpha companions were identified by subtracting a broad-band B (blue) image, which traces the galaxy stellar populations, from a narrow-band image, spectrally centered on the redshifted, narrow Lyman-alpha emission line of the quasar (z = 2.2). Three Lyman-alpha candidate companions were discovered at angular distances of 15 to 23 arcsec, or 200 to 300 kpc (650,000 to 1,000,000 light-years) at the distance corresponding to the quasar redshift. The emission lines are very strong relative to the continuum emission of the galaxies - this could be a consequence of the strong ionizing radiation field of the quasar. These companions to the quasar may trace a large-scale structure which would extend over larger distances beyond the observed, small field. Even more striking is the presence of a very extended nebulosity whose size (120 kpc x 160 kpc) and Lyman-alpha luminosity (3 x 10^44 erg/cm^2/s) are among the largest observed around radio galaxies and radio-loud quasars, but rarely seen around a radio-weak quasar.
Tidal interaction between the northern, very nearby companion and the quasar is clearly present: the companion is embedded in the quasar nebulosity, and most of its gas has been stripped and lies in a tail westwards of the galaxy. The scientists involved in this study are: Jacqueline Bergeron (ESO), Stefano Cristiani, Stephane Arnouts, Gianni Fasano (Padova, Italy) and Patrick Petitjean (Institut d'Astrophysique, Paris, France). Very Distant Galaxy Clusters During the past years, it has become possible to detect and subsequently study progressively more distant clusters of galaxies. For this research programme, UT1 Science Verification data were used, in combination with data obtained with the SOFI instrument at the ESO New Technology Telescope (NTT) at La Silla, to confirm the existence of two very distant galaxy clusters at redshift z ~ 1, which had originally been detected in the ESO Imaging Survey. This redshift corresponds to an epoch when the age of the Universe was only two-thirds of its present value. PR Photo 48h/98 is a colour composite that shows the now-confirmed cluster EIS0046-2930. The image has been produced by combining the V (green-yellow), R (red) and I (near-IR) exposures with the Test Camera obtained during the VLT-UT1 Science Verification. The yellow-orange galaxies are the cluster members and the bluer objects are galaxies belonging to the general field population. The cluster center is at the location of the largest (yellow-orange) cluster galaxy to the left of the center of the image. The field measures 90 x 90 arcsec. The confirmation was achieved by the detection of a spatial excess density of galaxies, with measured colour equal to that of elliptical galaxies at this redshift, as established by counts in the respective sky areas. The field of one of these clusters is shown in PR Photo 48h/98.
These new data show that the VLT will most certainly play a major role in studies of the cluster galaxy population in such distant systems. This will help shed important new light on the evolution of galaxies. Furthermore, the VLT clearly has the potential to identify and confirm the reality of many more such clusters and thereby to increase considerably the number of known objects. This will be important for determining more accurate values of the basic cosmological constants, and thus for our understanding of the evolution of the Universe as a whole. The presentation was made by Lisbeth Fogh Olsen (Copenhagen Observatory, Denmark, and ESO) on behalf of the scientists involved in this study. Icy Planets in the Outer Solar System Observations with large optical telescopes during the past years have begun to cast more light on the still very little known, distant icy planets in the outer solar system. Until November 1998, about 70 of these had been discovered outside the orbit of Neptune (between 30 and 50 AU, or 4,500 to 7,500 million km, from the Sun). They are accordingly referred to as Trans-Neptunian Objects (TNOs). Those found so far are believed to represent the "tip of the iceberg" of a large population of such objects belonging to the so-called Kuiper Belt. This is a roughly disk-shaped region between about 50 and 120 AU (about 7,500 to 18,000 million km) from the Sun, in which remnant bodies from the formation of the solar system are thought to be present. From their measured brightness and distance, it is found that most known TNOs have diameters of the order of a few hundred kilometres. About half of those known move in elongated Pluto-like orbits; the others move somewhat further out in stable, circular orbits. During the two-week Science Verification programme, approximately 200 minutes were spent on a small observing programme aimed at obtaining images of some TNOs in different wavebands (B, V, R and I).
Since this programme was primarily designed as a back-up to be executed during less favourable atmospheric conditions, some of the observations could not be used. However, images of three faint TNOs were recorded during an excellent series of 1-10 min exposures. From these data, it was possible to measure quite accurate magnitudes (and thus approximate sizes) and to determine their colours. One of them, 1996 TL66, was among the bluest TNOs ever observed. It is believed that this is because its surface has undergone recent transformation, possibly due to collisions with other objects or the breaking-off of small pieces from the surface, in both cases revealing "fresh" layers below. The combination of all available exposures made it possible to look for faint and tenuous atmospheres around these TNOs, but none were found. These results show that it is possible, with little effort and even under quite unfavourable observing conditions, to obtain valuable information with the VLT about icy objects in the outer solar system. Of even greater interest will be future spectroscopic observations with FORS and ISAAC, which will make it possible to study the surface composition in some detail, with the potential of providing direct information about (nearly?) pristine material from the early phases of the solar system. The scientists involved in this study are: Olivier Hainaut, Hermann Boehnhardt, Catherine Delahodde and Richard West (ESO) and Karen Meech (Institute of Astronomy, Hawaii, USA).

  1. Hello World Deep Learning in Medical Imaging.

    PubMed

    Lakhani, Paras; Gray, Daniel L; Pett, Carl R; Nagy, Paul; Shih, George

    2018-05-03

    Applying machine learning to medical imaging has recently surged in popularity, most notably deep learning, which has achieved state-of-the-art performance in image analysis and processing. The rapid adoption of deep learning may be attributed to the availability of machine learning frameworks and libraries that simplify its use. In this tutorial, we provide a high-level overview of how to build a deep neural network for medical image classification, and provide code that can help those new to the field begin their informatics projects.
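
The tutorial itself relies on a high-level framework; as a miniature, framework-free illustration of the same train-then-classify workflow, here is a hedged sketch that fits a logistic-regression "single neuron" to toy 3x3 images (all data, sizes and hyperparameters are invented for the example, not taken from the paper):

```python
import math, random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(samples, labels, lr=0.5, epochs=200):
    """Logistic regression fitted by batch gradient descent."""
    n = len(samples[0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        gw, gb = [0.0] * n, 0.0
        for x, y in zip(samples, labels):
            err = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b) - y
            for i in range(n):
                gw[i] += err * x[i]
            gb += err
        for i in range(n):
            w[i] -= lr * gw[i] / len(samples)
        b -= lr * gb / len(samples)
    return w, b

def predict(w, b, x):
    return 1 if sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b) >= 0.5 else 0

# Toy 3x3 "images": class 1 has a bright centre pixel, class 0 does not.
random.seed(0)
def make_image(label):
    img = [random.uniform(0.0, 0.3) for _ in range(9)]
    if label:
        img[4] = random.uniform(0.7, 1.0)
    return img

train_x = [make_image(i % 2) for i in range(40)]
train_y = [i % 2 for i in range(40)]
w, b = train(train_x, train_y)

test_x = [make_image(i % 2) for i in range(10)]
acc = sum(predict(w, b, x) == (i % 2) for i, x in enumerate(test_x)) / 10
# fraction of correct predictions on held-out toy images
```

A real medical-imaging pipeline replaces the single neuron with a convolutional network and the toy arrays with DICOM-derived images, but the loop (forward pass, loss gradient, weight update, held-out evaluation) is the same.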

  2. High-resolution Topography of PACMANUS and DESMOS Hydrothermal Fields in the Manus Basin through ROV "FAXIAN"

    NASA Astrophysics Data System (ADS)

    Luan, Z.; Ma, X.; Yan, J.; Zhang, X.; Zheng, C.; Sun, D.

    2016-12-01

    High-resolution topography can help us understand in depth the seabed and related geological processes (e.g. hydrothermal and cold-seep systems) in deep-sea areas. However, such studies are rare in China owing to the limits of deep-sea detection technology. Here, we report advances in the application of ROVs in China and newly measured high-resolution topographical data from the PACMANUS and DESMOS hydrothermal fields. In June 2015, the ROV "FAXIAN", carrying a multibeam system (Kongsberg EM2040), was deployed to measure the topography of the PACMANUS and DESMOS hydrothermal fields in the Manus Basin. A composite positioning system on the ROV provided long-baseline (LBL) navigation and positioning during the measurements, giving a high positioning accuracy (better than 0.5 m). The raw bathymetric data were processed using CARIS HIPS (version 8.1). Based on the high-resolution data, we can describe the topographical details of the PACMANUS and DESMOS hydrothermal fields. The high-resolution terrain clearly shows the detailed character of the topography in the PACMANUS hydrothermal field, and some cones correspond to previously discovered hydrothermal vents and volcanic areas. Most hydrothermal vents in the PACMANUS field developed on steep slopes with gradients exceeding 30°. In contrast, the DESMOS field is a caldera that is approximately 250 m deep in the center, with an E-W diameter of approximately 1 km and a N-S diameter of approximately 2 km. The seafloor is much steeper on the inner side of the circular fracture. Two highlands occur on the northern and southern flanks of the caldera. Video records indicate that pillow lava, sulfide talus, breccia, anhydrite, outcrops, and sediment all appear in the DESMOS field. This is the first time the ROV "FAXIAN" has been used for near-bottom topographic measurements in hydrothermal fields, opening a window for deep-sea research in China.

  3. Importance of extended spatial coverage for quantitative susceptibility mapping of iron-rich deep gray matter.

    PubMed

    Elkady, Ahmed M; Sun, Hongfu; Wilman, Alan H

    2016-05-01

    Quantitative Susceptibility Mapping (QSM) is an emerging area of brain research with clear application to brain iron studies in deep gray matter. However, acquisition of standard whole-brain QSM can be time-consuming. One means to reduce scan time is to use a focal acquisition restricted to the regions of interest, such as deep gray matter. However, the non-local dipole field necessary for QSM reconstruction extends far beyond the structure of interest. We demonstrate the practical implications of these non-local fields for the choice of brain volume for QSM. In an illustrative numerical simulation and then in human brain experiments, we examine the effect on QSM of volume reduction in each dimension. For the globus pallidus, as an example of iron-rich deep gray matter, we demonstrate that substantial errors can arise even when the field-of-view far exceeds the physical structural boundaries. Thus, QSM reconstruction requires a non-local field-of-view prescription to ensure minimal errors. An axial QSM acquisition, centered on the globus pallidus, should encompass at least 76 mm in the superior-inferior direction to conserve susceptibility values from the globus pallidus. This dimension exceeds the physical coronal extent of this structure by at least five-fold. As QSM sees wider use in the neuroscience community, its unique requirement for an extended field-of-view needs to be considered. Copyright © 2016 Elsevier Inc. All rights reserved.
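
The non-locality the authors exploit follows from the physics: a magnetic point dipole's field falls off only as 1/r^3, with no sharp cutoff, so voxels far outside a susceptibility source still carry its signature. A stdlib-only sketch of that scaling (constants and units simplified; not code from the paper):

```python
import math

def dipole_field(r, theta):
    """Angular pattern of a unit point dipole's field component,
    proportional to (3 cos^2 theta - 1) / (4 pi r^3)."""
    return (3.0 * math.cos(theta) ** 2 - 1.0) / (4.0 * math.pi * r ** 3)

# Field on the dipole axis at 1 and 5 length units: only a 125-fold drop,
# a power law rather than an exponential cutoff, so a field-of-view several
# times the structure's extent still sees a non-negligible field.
near = dipole_field(1.0, 0.0)
far = dipole_field(5.0, 0.0)
ratio = near / far   # = 5**3 = 125
```

This slow decay is why truncating the field-of-view close to the globus pallidus discards dipole-field information that the QSM inversion needs.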

  4. The ROSAT Deep Survey. 2; Optical Identification, Photometry and Spectra of X-Ray Sources in the Lockman Field

    NASA Technical Reports Server (NTRS)

    Schmidt, M.; Hasinger, G.; Gunn, J.; Schneider, D.; Burg, R.; Giacconi, R.; Lehmann, I.; MacKenty, J.; Truemper, J.; Zamorani, G.

    1998-01-01

    The ROSAT Deep Survey includes a complete sample of 50 X-ray sources with fluxes in the 0.5-2 keV band larger than 5.5 x 10^-15 erg cm^-2 s^-1 in the Lockman field (Hasinger et al., Paper 1). We have obtained deep broad-band CCD images of the field and spectra of many optical objects near the positions of the X-ray sources. We define systematically the process leading to the optical identifications of the X-ray sources. For this purpose, we introduce five identification (ID) classes that characterize the process in each case. Among the 50 X-ray sources, we identify 39 AGNs, 3 groups of galaxies, 1 galaxy and 3 galactic stars. Four X-ray sources remain unidentified so far; two of these objects may have an unusually large ratio of X-ray to optical flux.

  5. Hubble Team Unveils Most Colorful View of Universe Captured by Space Telescope

    NASA Image and Video Library

    2014-06-04

    Astronomers using NASA's Hubble Space Telescope have assembled a comprehensive picture of the evolving universe – among the most colorful deep-space images ever captured by the 24-year-old telescope. Researchers say the image, from a new study called the Ultraviolet Coverage of the Hubble Ultra Deep Field, provides the missing link in star formation. The Hubble Ultra Deep Field 2014 image is a composite of separate exposures taken from 2003 to 2012 with Hubble's Advanced Camera for Surveys and Wide Field Camera 3. Credit: NASA/ESA Read more: 1.usa.gov/1neD0se NASA Goddard Space Flight Center enables NASA's mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA's accomplishments by contributing compelling scientific knowledge to advance the Agency's mission.

  6. Reconstruction of Horizontal Plasma Motions at the Photosphere from Intensitygrams: A Comparison Between DeepVel, LCT, FLCT, and CST

    NASA Astrophysics Data System (ADS)

    Tremblay, Benoit; Roudier, Thierry; Rieutord, Michel; Vincent, Alain

    2018-04-01

    Direct measurements of plasma motions in the photosphere are limited to the line-of-sight component of the velocity. Several algorithms have therefore been developed to reconstruct the transverse components from observed continuum images or magnetograms. We compare the space and time averages of horizontal velocity fields in the photosphere inferred from pairs of consecutive intensitygrams by the LCT, FLCT, and CST methods and the DeepVel neural network in order to identify the method that is best suited for generating synthetic observations to be used for data assimilation. The Stein and Nordlund ( Astrophys. J. Lett. 753, L13, 2012) magnetoconvection simulation is used to generate synthetic SDO/HMI intensitygrams and reference flows to train DeepVel. Inferred velocity fields show that DeepVel performs best at subgranular and granular scales and is second only to FLCT at mesogranular and supergranular scales.
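
The correlation-tracking family of methods (LCT/FLCT) infers horizontal flows by finding, for each local region, the displacement that best maps one frame onto the next. Below is a deliberately simplified toy version of that idea, using integer shifts and sum-of-squared-differences matching on invented test images; none of the cited codes works exactly this way:

```python
def best_shift(img1, img2, max_shift=3):
    """Return the integer (dy, dx) that minimizes the mean squared
    difference between img1 and img2 sampled at (y + dy, x + dx)."""
    h, w = len(img1), len(img1[0])
    best, best_err = (0, 0), float("inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            err, n = 0.0, 0
            for y in range(h):
                for x in range(w):
                    y2, x2 = y + dy, x + dx
                    if 0 <= y2 < h and 0 <= x2 < w:
                        err += (img1[y][x] - img2[y2][x2]) ** 2
                        n += 1
            err /= n
            if err < best_err:
                best_err, best = err, (dy, dx)
    return best

# A bright 3x3 blob that moves one pixel down and two right between frames.
def blob(cy, cx, size=12):
    return [[1.0 if abs(y - cy) <= 1 and abs(x - cx) <= 1 else 0.0
             for x in range(size)] for y in range(size)]

frame1 = blob(5, 5)
frame2 = blob(6, 7)
shift = best_shift(frame1, frame2)   # recovers (1, 2)
```

Production codes refine this with apodized windows, FFT-based cross-correlation, and sub-pixel interpolation, and DeepVel replaces the matching step entirely with a trained neural network, which is why it can recover sub-granular flows that correlation windows smear out.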

  7. The Great Easter Egg Hunt: The Void's Incredible Richness

    NASA Astrophysics Data System (ADS)

    2006-04-01

    An image made of about 300 million pixels is being released by ESO, based on more than 64 hours of observations with the Wide-Field Camera on the 2.2-m telescope at La Silla (Chile). The image covers an 'empty' region of the sky five times the size of the full moon, opening an exceptionally clear view towards the most distant parts of our universe. It reveals objects that are 100 million times fainter than what the unaided eye can see. Easter is in many countries a time of great excitement for children, who are on the big hunt for chocolate eggs hidden all about the place. Astronomers, however, do not need to wait for this special day to get such excitement: it is indeed daily that they look for faraway objects concealed in deep images of the sky. And as with chocolate eggs, deep-sky objects, such as galaxies, quasars or gravitational lenses, come in the wildest variety of colours and shapes. ESO PR Photo 14a/06: The Deep 3 'Empty' Field. The image presented here is one such very deep image of the sky. It is the combination of 714 frames for a total exposure time of 64.5 hours, obtained through four different filters (B, V, R, and I)! It consists of four adjacent Wide-Field Camera pointings (each 33 x 34 arcmin), covering a total area larger than one square degree. Yet, if you were to look at this large portion of the firmament with the unaided eye, you would just see... nothing. The area, named Deep 3, was chosen to be a random but empty, high-galactic-latitude field, positioned in such a way that it can be observed from the La Silla observatory throughout the year. Together with two other regions, Deep 1 and Deep 2, Deep 3 is part of the Deep Public Survey (DPS), based on ideas submitted by the ESO community and covering a total sky area of 3 square degrees. Deep 1 and Deep 2 were selected because they overlapped with regions of other scientific interest.
For instance, Deep 1 was chosen to complement the deep ATESP radio survey carried out with the Australia Telescope Compact Array (ATCA), covering the region surveyed by the ESO Slice Project, while Deep 2 included the CDF-S field. Each region is observed in the optical with the WFI and in the near-infrared with SOFI on the 3.5-m New Technology Telescope, also at La Silla. Deep 3 is located in Crater ('The Cup'), an inconspicuous southern constellation (its brightest star is of fourth magnitude, i.e. only a factor of six brighter than what a keen observer can see with the unaided eye), between the Virgo, Corvus and Hydra constellations. Such comparatively empty fields provide an unusually clear view towards the distant regions of the Universe and thus open a window towards the earliest cosmic times. The deep imaging data can, for example, be used to pre-select objects by colour for follow-up spectroscopy with ESO's Very Large Telescope instruments. ESO PR Photo 14b/06: Galaxy ESO 570-19 and Variable Star UW Crateris. But being empty is only a relative notion. True, on the whole image the SIMBAD astronomical database references fewer than 50 objects, clearly a tiny number compared to the myriad of anonymous stars and galaxies that can be seen in the deep image obtained by the survey! Among the objects catalogued is the galaxy visible in the top middle right (see also PR Photo 14b/06), named ESO 570-19. Located 60 million light-years away, this spiral galaxy is the largest in the image. It is located not so far - on the image! - from the brightest star in the field, UW Crateris. This red giant is a variable star that is about 8 times fainter than what the unaided eye can see. The second and third brightest stars in this image are visible in the lower far right and the lower middle left. The first is a star slightly more massive than the Sun, HD 98081, while the other is another red giant, HD 98507.
ESO PR Photo 14c/06: The DPS Deep 3 Field (Detail). In the image, a vast number of stars and galaxies are there to be studied and compared. The stars come in a variety of colours and form amazing asterisms (groups of stars forming a pattern), while the galaxies, to be counted by the tens of thousands, come in different shapes, and some even interact or form part of a cluster. The image and the other associated data will certainly provide a plethora of new results in the years to come. In the meantime, why don't you explore the image with the zoom-in facility and start your own journey into infinity? Just be careful not to get lost. And remember: don't eat too many of those chocolate eggs! High-resolution images and their captions are available on this page.

  8. Water resources data of the Seward area, Alaska

    USGS Publications Warehouse

    Dearborn, Larry L.; Anderson, Gary S.; Zenone, Chester

    1979-01-01

    Seward, Alaska, obtains a water supply of about 2 million gallons per day, primarily from Marathon Springs and the Fort Raymond well field. The springs have supplied up to 800 gallons per minute, and the city's deep wells currently have a combined capacity of about 3,000 gallons per minute. Freshwater is abundant in the area; future public supplies could be derived from both shallow and deep ground water and from stream impoundment with diversion. High deep-aquifer transmissivity at the Fort Raymond well field indicates that additional wells could be developed there. Water quality is generally not a problem for public consumption. A flood potential exists along several streams having broad alluvial fans. (Woodard-USGS)

  9. Fifty years of computer analysis in chest imaging: rule-based, machine learning, deep learning.

    PubMed

    van Ginneken, Bram

    2017-03-01

    Half a century ago, the term "computer-aided diagnosis" (CAD) was introduced in the scientific literature. Pulmonary imaging, with chest radiography and computed tomography, has always been one of the focus areas in this field. In this study, I describe how machine learning became the dominant technology for tackling CAD in the lungs, generally producing better results than do classical rule-based approaches, and how the field is now rapidly changing: in the last few years, we have seen how even better results can be obtained with deep learning. The key differences among rule-based processing, machine learning, and deep learning are summarized and illustrated for various applications of CAD in the chest.

  10. Project DEEP STEAM

    NASA Astrophysics Data System (ADS)

    Aeschliman, D. P.; Clay, R. G.; Donaldson, A. B.; Eisenhawer, S. W.; Fox, R. L.; Johnson, D. R.; Mulac, A. J.

    1982-01-01

    The objective of Project DEEP STEAM is to develop the technology to economically produce heavy oils from deep reservoirs. The tasks included in this project are the development of thermally efficient delivery systems and downhole steam generation systems. During the period January 1-March 31, 1981, effort has continued on a low pressure combustion downhole generator (Rocketdyne), and on two high pressure designs (Foster-Miller Associates, Sandia National Laboratories). The Sandia design was prepared for deployment in the Wilmington Field at Long Beach, California. Progress continued on the Min-Stress II packer concept at L'Garde, Inc., and on the extruded metal packer at Foster-Miller. Initial bare string field data are reported on the insulated tubular test at Lloydminster, Saskatchewan, Canada.

  11. Execution of deep dipole geoelectrical soundings in areas of geothermal interest. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patella, D.

    It is suggested that deep geoelectrical problems may be resolved by carrying out dipole soundings in the field and applying a quantitative interpretation in the Schlumberger domain. The 'transformation' of original field dipole sounding curves into equivalent Schlumberger curves is outlined for the cases of layered structures and arbitrary underground structures. Theoretical apparent-resistivity curves are derived for soundings over two-dimensional structures. Following a summary of the geological features of the Travale-Radicondoli geothermal area of Italy, the dipole sounding method employed for this field study and the means of collecting and analyzing the data are outlined.

  12. Electromagnetic sounding of the moon using Apollo 16 and Lunokhod 2 surface magnetometer observations /preliminary results/

    NASA Technical Reports Server (NTRS)

    Vanian, L. L.; Vnutchokova, T. A.; Fainberg, E. B.; Eroschenko, E. A.; Dyal, P.; Parkin, C. W.; Daily, W. D.

    1977-01-01

    A technique of deep electromagnetic sounding of the moon using simultaneous magnetic-field measurements at two lunar surface sites is described. The method, used with the assumption that deep electrical conductivity is a function only of lunar radius, has the advantage of allowing calculation of the external driving field from two surface-site measurements only and therefore does not require data from a lunar orbiting satellite. A transient-response calculation is presented for the example of a magnetic-field discontinuity, measured simultaneously by Apollo 16 and Lunokhod 2 surface magnetometers.

  13. Electromagnetic Sounding of the Moon Using Apollo 16 and Lunokhod 2 Surface Magnetometer Observations (Preliminary Results)

    NASA Technical Reports Server (NTRS)

    Vanyan, L. L.; Vnutchokova, T. A.; Fainberg, E. B.; Eroschenko, E. A.; Dyal, P.; Parkin, C. W.; Daily, W. D.

    1977-01-01

    A new technique of deep electromagnetic sounding of the Moon using simultaneous magnetic field measurements at two lunar surface sites is described. The method, used with the assumption that deep electrical conductivity is a function only of lunar radius, has the advantage of allowing calculation of the external driving field from two surface site measurements only, and therefore does not require data from a lunar orbiting satellite. A transient response calculation is presented for the example of a magnetic field discontinuity of February 13, 1973, measured simultaneously by Apollo 16 and Lunokhod 2 surface magnetometers.

  14. Using Jupiter's gravitational field to probe the Jovian convective dynamo.

    PubMed

    Kong, Dali; Zhang, Keke; Schubert, Gerald

    2016-03-23

    Convective motion in the deep metallic hydrogen region of Jupiter is believed to generate its magnetic field, the strongest in the solar system. The amplitude, structure and depth of the convective motion are unknown. A promising way of probing the Jovian convective dynamo is to measure its effect on the external gravitational field, a task to be soon undertaken by the Juno spacecraft. We calculate the gravitational signature of non-axisymmetric convective motion in the Jovian metallic hydrogen region and show that with sufficiently accurate measurements it can reveal the nature of the deep convection.

  15. Using Jupiter’s gravitational field to probe the Jovian convective dynamo

    PubMed Central

    Kong, Dali; Zhang, Keke; Schubert, Gerald

    2016-01-01

    Convective motion in the deep metallic hydrogen region of Jupiter is believed to generate its magnetic field, the strongest in the solar system. The amplitude, structure and depth of the convective motion are unknown. A promising way of probing the Jovian convective dynamo is to measure its effect on the external gravitational field, a task to be soon undertaken by the Juno spacecraft. We calculate the gravitational signature of non-axisymmetric convective motion in the Jovian metallic hydrogen region and show that with sufficiently accurate measurements it can reveal the nature of the deep convection. PMID:27005472

  16. A tribute to Peter A. Rona: A Russian Perspective

    NASA Astrophysics Data System (ADS)

    Sagalevich, Anatoly; Lutz, Richard A.

    2015-11-01

    In July 1985 Peter Rona led a cruise of the National Oceanic and Atmospheric Administration (NOAA) ship Researcher as part of the NOAA Vents Program and discovered, for the first time, black smokers, massive sulfide deposits and vent biota in the Atlantic Ocean. The site of the venting phenomena was the Trans-Atlantic Geotraverse (TAG) Hydrothermal Field on the east wall of the rift valley of the Mid-Atlantic Ridge at 26°08′N, 44°50′W (Rona, 1985; Rona et al., 1986). In 1986, Peter and an international research team carried out multidisciplinary investigations of both active and inactive hydrothermal zones of the TAG field using the R/V Atlantis and DSV Alvin, discovering two new species of shrimp (Rimicaris exoculata and Chorocaris chacei) (Williams and Rona, 1986) and a hexagonal-shaped form (Paleodictyon nodosum) previously thought to be extinct (Rona et al., 2009). In 1991 a Russian crew aboard the R/V Akademik Mstislav Keldysh, with two deep-diving, human-occupied submersibles (Mir-1 and Mir-2) (Fig. 1), had the honor of having Peter Rona and a Canadian IMAX film crew from the Stephen Low Company on board to visit the TAG hydrothermal vent field. This was the first of many deep-sea interactions between Russian deep-sea scientists and their colleagues from both the U.S. and Canada. This expedition to the TAG site was part of a major Russian undersea program aimed at exploring extreme deep-sea environments; between 1988 and 2005, the Mir submersibles visited hydrothermal vents and cold-seep areas in 20 deep-sea regions throughout the world's oceans (Sagalevich, 2002).
Images of several of these areas (the TAG, Snake Pit, Lost City and 9°50′N vent fields) were obtained using an IMAX camera system emplaced for the first time within the spheres of the Mir submersibles and DSV Alvin in conjunction with the filming of science documentaries (e.g., "Volcanoes of the Deep Sea") produced by the Stephen Low Company together with Emory Kristof of National Geographic and Peter Rona. The initial test of this submersible-emplaced camera system was conducted during the 1991 expedition to the TAG hydrothermal vent field.

  17. Quantification of deep percolation from two flood-irrigated alfalfa fields, Roswell Basin, New Mexico

    USGS Publications Warehouse

    Roark, D. Michael; Healy, D.F.

    1998-01-01

    For many years water management in the Roswell ground-water basin (Roswell Basin) and other declared basins in New Mexico has been the responsibility of the State of New Mexico. One of the water management issues requiring better quantification is the amount of deep percolation from applied irrigation water. Two adjacent fields, planted in alfalfa, were studied to determine deep percolation by the water-budget, volumetric-moisture, and chloride mass-balance methods. Components of the water-budget method were measured, in study plots called borders, for both fields during the 1996 irrigation season. The amount of irrigation water applied in the west border was 95.8 centimeters and in the east border was 169.8 centimeters. The total amount of precipitation that fell during the irrigation season was 21.9 centimeters. The increase in soil-moisture storage from the beginning to the end of the irrigation season was 3.2 centimeters in the west border and 8.8 centimeters in the east border. Evapotranspiration, as estimated by the Bowen ratio energy balance technique, in the west border was 97.8 centimeters and in the east border was 101.0 centimeters. Deep percolation determined using the water-budget method was 16.4 centimeters in the west border and 81.6 centimeters in the east border. An average deep percolation of 22.3 centimeters in the west border and 31.6 centimeters in the east border was determined using the volumetric-moisture method. The chloride mass-balance method determined the multiyear deep percolation to be 15.0 centimeters in the west border and 38.0 centimeters in the east border. Large differences in the amount of deep percolation between the two borders calculated by the water-budget method are due to differences in the amount of water that was applied to each border. More water was required to flood the east border because of the greater permeability of the soils in that field and the smaller rate at which water could be applied.
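
The water-budget method reported above is an arithmetic residual: deep percolation is what remains after evapotranspiration and the change in soil-moisture storage are subtracted from applied irrigation plus precipitation. A sketch using the abstract's own numbers (the residuals, 16.7 and 81.9 cm, land close to but not exactly on the reported 16.4 and 81.6 cm, presumably because of rounding or small terms not quoted in the abstract):

```python
def deep_percolation(irrigation, precipitation, evapotranspiration, delta_storage):
    """Seasonal water-budget residual, all terms in centimetres:
    DP = I + P - ET - dS."""
    return irrigation + precipitation - evapotranspiration - delta_storage

# West border: 95.8 cm irrigation, 21.9 cm precipitation,
# 97.8 cm ET, +3.2 cm soil-moisture storage change.
west = deep_percolation(95.8, 21.9, 97.8, 3.2)   # 16.7 cm

# East border: 169.8 cm irrigation, 21.9 cm precipitation,
# 101.0 cm ET, +8.8 cm storage change.
east = deep_percolation(169.8, 21.9, 101.0, 8.8)  # 81.9 cm
```

The large east-west difference in the residual mirrors the difference in applied water, which is exactly the point the authors make about the more permeable east-border soils.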

  18. New Earth Science Data and Access Methods

    NASA Technical Reports Server (NTRS)

    Moses, John F.; Weinstein, Beth E.; Farnham, Jennifer

    2004-01-01

    NASA's Earth Science Enterprise, working with its domestic and international partners, provides scientific data and analysis to improve life here on Earth. NASA provides science data products that cover a wide range of physical, geophysical, biochemical and other parameters, as well as services for interdisciplinary Earth science studies. Management and distribution of these products is administered through the Earth Observing System Data and Information System (EOSDIS) Distributed Active Archive Centers (DAACs), each of which holds data within a different Earth science discipline. This paper will highlight selected EOS datasets and will focus on how these observations contribute to the improvement of essential services such as weather forecasting, climate prediction, air quality, and agricultural efficiency. Emphasis will be placed on new data products derived from instruments on board Terra, Aqua and ICESat, as well as new regional data products and field campaigns. A variety of data tools and services are available to the user community. This paper will introduce primary and specialized DAAC-specific methods for finding, ordering and using these data products. Special sections will focus on orienting users unfamiliar with DAAC resources, HDF-EOS formatted data and the use of desktop research and application tools.

  19. mz5: space- and time-efficient storage of mass spectrometry data sets.

    PubMed

    Wilhelm, Mathias; Kirchner, Marc; Steen, Judith A J; Steen, Hanno

    2012-01-01

    Across a host of MS-driven-omics fields, researchers witness the acquisition of ever increasing amounts of high throughput MS data and face the need for their compact yet efficiently accessible storage. Addressing the need for an open data exchange format, the Proteomics Standards Initiative and the Seattle Proteome Center at the Institute for Systems Biology independently developed the mzData and mzXML formats, respectively. In a subsequent joint effort, they defined an ontology and associated controlled vocabulary that specifies the contents of MS data files, implemented as the newer mzML format. All three formats are based on XML and are thus not particularly efficient in either storage space requirements or read/write speed. This contribution introduces mz5, a complete reimplementation of the mzML ontology that is based on the efficient, industrial strength storage backend HDF5. Compared with the current mzML standard, this strategy yields an average file size reduction to ∼54% and increases linear read and write speeds ∼3-4-fold. The format is implemented as part of the ProteoWizard project and is available under a permissive Apache license. Additional information and download links are available from http://software.steenlab.org/mz5.
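
Part of mz5's size advantage is generic: mzML carries its peak arrays as base64 text inside XML, which inflates the raw binary by roughly a third before any markup is counted. A stdlib-only sketch of that overhead (it uses struct and base64 rather than HDF5 itself, and ignores the compression that also contributes to the reported ~54% figure; element names are simplified):

```python
import struct, base64

# 10,000 simulated m/z values as 64-bit floats.
mz = [400.0 + 0.01 * i for i in range(10000)]

# mzML-style: base64-encode the packed floats and wrap them in XML markup.
raw = struct.pack("<%dd" % len(mz), *mz)
xml = ("<binaryDataArray><binary>%s</binary></binaryDataArray>"
       % base64.b64encode(raw).decode("ascii"))

# mz5-style: the same floats stored as a raw binary dataset.
binary_size = len(raw)              # 10,000 * 8 bytes
xml_size = len(xml.encode("ascii"))
ratio = binary_size / xml_size      # ~0.75 before any compression
```

On top of avoiding the base64 penalty, a binary container like HDF5 supports chunked, seekable reads, which is where the 3-4-fold read/write speedups come from.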

  20. The accurate particle tracer code

    DOE PAGES

    Wang, Yulei; Liu, Jian; Qin, Hong; ...

    2017-07-20

    The Accurate Particle Tracer (APT) code is designed for systematic large-scale applications of geometric algorithms for particle dynamical simulations. Based on a large variety of advanced geometric algorithms, APT possesses long-term numerical accuracy and stability, which are critical for solving multi-scale and nonlinear problems. To provide a flexible and convenient I/O interface, the libraries of Lua and HDF5 are used. Following a three-step procedure, users can efficiently extend the libraries of electromagnetic configurations, external non-electromagnetic forces, particle pushers, and initialization approaches by use of the extendible module. APT has been used in simulations of key physical problems, such as runaway electrons in tokamaks and energetic particles in the Van Allen belt. As an important realization, the APT-SW version has been successfully deployed on the world's fastest computer, the Sunway TaihuLight supercomputer, by supporting the master-slave architecture of Sunway many-core processors. Here, based on large-scale simulations of a runaway beam under parameters of the ITER tokamak, it is revealed that the magnetic ripple field can disperse the pitch-angle distribution significantly and at the same time improve the confinement of the energetic runaway beam.

  1. The SAMI Galaxy Survey: A prototype data archive for Big Science exploration

    NASA Astrophysics Data System (ADS)

    Konstantopoulos, I. S.; Green, A. W.; Foster, C.; Scott, N.; Allen, J. T.; Fogarty, L. M. R.; Lorente, N. P. F.; Sweet, S. M.; Hopkins, A. M.; Bland-Hawthorn, J.; Bryant, J. J.; Croom, S. M.; Goodwin, M.; Lawrence, J. S.; Owers, M. S.; Richards, S. N.

    2015-11-01

    We describe the data archive and database for the SAMI Galaxy Survey, an ongoing observational program that will cover ≈3400 galaxies with integral-field (spatially-resolved) spectroscopy. Amounting to some three million spectra, this is the largest sample of its kind to date. The data archive and built-in query engine use the versatile Hierarchical Data Format (HDF5), which precludes the need for external metadata tables and hence the setup and maintenance overhead those carry. The code produces simple outputs that can easily be translated to plots and tables, and the combination of these tools makes for a light system that can handle heavy data. This article acts as a contextual companion to the SAMI Survey Database source code repository, samiDB, which is freely available online and written entirely in Python. We also discuss the decisions related to the selection of tools and the creation of data visualisation modules. It is our aim that the work presented in this article (descriptions, rationale, and source code) will be of use to scientists looking to set up a maintenance-light data archive for a Big Science data load.
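
The design point here, that a hierarchical format can carry its own metadata and so needs no external lookup tables, can be illustrated with a toy path-keyed store (plain Python dicts standing in for HDF5 groups and attributes; the galaxy IDs and product names below are invented, not SAMI's actual layout):

```python
# Paths play the role of HDF5 group/dataset names; per-node attribute
# dicts play the role of HDF5 attributes, so a query can walk the
# hierarchy itself instead of consulting an external metadata table.
archive = {}

def write(path, data, **attrs):
    """Store a dataset with inline attributes under a hierarchical path."""
    archive[path] = {"data": data, "attrs": attrs}

def query(prefix, **conditions):
    """Return paths under `prefix` whose attributes match all conditions."""
    return sorted(
        path for path, node in archive.items()
        if path.startswith(prefix)
        and all(node["attrs"].get(k) == v for k, v in conditions.items())
    )

write("/sami/91999/blue_cube", [0.1, 0.2], arm="blue", redshift=0.05)
write("/sami/91999/red_cube", [0.3, 0.4], arm="red", redshift=0.05)
write("/sami/105600/blue_cube", [0.5], arm="blue", redshift=0.08)

hits = query("/sami/", arm="blue")
# → ['/sami/105600/blue_cube', '/sami/91999/blue_cube']
```

In real HDF5 (e.g. via h5py) the same pattern holds: groups nest like directories and attributes travel with the data, which is what lets samiDB skip a separately maintained metadata database.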

  2. The accurate particle tracer code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Yulei; Liu, Jian; Qin, Hong

    The Accurate Particle Tracer (APT) code is designed for systematic large-scale applications of geometric algorithms for particle dynamical simulations. Based on a large variety of advanced geometric algorithms, APT possesses long-term numerical accuracy and stability, which are critical for solving multi-scale and nonlinear problems. To provide a flexible and convenient I/O interface, the Lua and HDF5 libraries are used. Following a three-step procedure, users can efficiently extend the libraries of electromagnetic configurations, external non-electromagnetic forces, particle pushers, and initialization approaches by use of the extendible module. APT has been used in simulations of key physical problems, such as runaway electrons in tokamaks and energetic particles in the Van Allen belt. As an important application, the APT-SW version has been successfully deployed on the world's fastest computer, the Sunway TaihuLight supercomputer, by supporting the master-slave architecture of the Sunway many-core processors. Here, based on large-scale simulations of a runaway beam under ITER tokamak parameters, it is revealed that the magnetic ripple field can disperse the pitch-angle distribution significantly while at the same time improving the confinement of the energetic runaway beam.
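
To give a flavor of the "particle pusher" modules mentioned above, here is a minimal non-relativistic Boris pusher, a standard structure-preserving scheme of the general kind geometric codes such as APT implement. This is a hedged sketch, not APT code; the function and variable names are illustrative.

```python
# Minimal non-relativistic Boris particle pusher (illustrative, not APT code).
import numpy as np

def boris_push(x, v, E, B, dt, qm=1.0):
    """Advance position and velocity by one step; qm is the charge-to-mass ratio."""
    v_minus = v + 0.5 * qm * dt * E              # first half electric kick
    t = 0.5 * qm * dt * B                        # rotation vector
    s = 2.0 * t / (1.0 + np.dot(t, t))
    v_prime = v_minus + np.cross(v_minus, t)
    v_plus = v_minus + np.cross(v_prime, s)      # norm-preserving magnetic rotation
    v_new = v_plus + 0.5 * qm * dt * E           # second half electric kick
    return x + dt * v_new, v_new

# Pure gyration in a uniform B field: the rotation step conserves |v| exactly,
# which is why such schemes keep their accuracy over long integration times.
x, v = np.zeros(3), np.array([1.0, 0.0, 0.0])
E, B = np.zeros(3), np.array([0.0, 0.0, 1.0])
for _ in range(1000):
    x, v = boris_push(x, v, E, B, 0.1)
speed = np.linalg.norm(v)
print(speed)
```

The magnetic rotation is applied as two cross products rather than a matrix exponential, yet it is exactly energy-conserving for E = 0; that property, not raw order of accuracy, is what makes geometric pushers attractive for multi-scale runs.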

  3. Long-term detection of Parkinsonian tremor activity from subthalamic nucleus local field potentials.

    PubMed

    Houston, Brady; Blumenfeld, Zack; Quinn, Emma; Bronte-Stewart, Helen; Chizeck, Howard

    2015-01-01

    Current deep brain stimulation paradigms deliver continuous stimulation to deep brain structures to ameliorate the symptoms of Parkinson's disease. This continuous stimulation has undesirable side effects and decreases the lifespan of the unit's battery, necessitating earlier replacement. A closed-loop deep brain stimulator that uses brain signals to determine when to deliver stimulation based on the occurrence of symptoms could potentially address these drawbacks of current technology. Attempts to detect Parkinsonian tremor using brain signals recorded during the implantation procedure have been successful. However, the ability of these methods to accurately detect tremor over extended periods of time is unknown. Here we use local field potentials recorded during a deep brain stimulation clinical follow-up visit 1 month after initial programming to build a tremor detection algorithm, and use this algorithm to detect tremor in subsequent visits up to 8 months later. Using this method, we detected the occurrence of tremor with accuracies between 68% and 93%. These results demonstrate the potential of tremor detection methods for efficacious closed-loop deep brain stimulation over extended periods of time.
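
A detector of the general kind described, spectral power in a tremor band compared against a threshold calibrated on earlier recordings, can be sketched as follows. The 4-8 Hz band, sampling rate, and threshold factor are illustrative assumptions, not the authors' parameters, and the LFP traces here are synthetic.

```python
# Illustrative tremor-band power detector (not the authors' algorithm).
import numpy as np

fs = 250.0                                        # sampling rate in Hz (assumed)
t = np.arange(0, 2.0, 1.0 / fs)
rng = np.random.default_rng(0)
quiet = rng.normal(0, 0.5, t.size)                # synthetic tremor-free LFP epoch
tremor = quiet + 2.0 * np.sin(2 * np.pi * 5.0 * t)  # same epoch plus a 5 Hz oscillation

def tremor_band_power(x, fs, band=(4.0, 8.0)):
    """Mean spectral power inside the tremor band, via a plain FFT periodogram."""
    freqs = np.fft.rfftfreq(x.size, 1.0 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2 / x.size
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

# Calibrate the threshold on an earlier, tremor-free recording, then apply it
# to a later epoch -- mirroring the cross-visit detection idea in the abstract.
threshold = 5.0 * tremor_band_power(quiet, fs)
detected = tremor_band_power(tremor, fs) > threshold
print(detected)
```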

  4. Magnetically targeted delivery through cartilage

    NASA Astrophysics Data System (ADS)

    Jafari, Sahar; Mair, Lamar O.; Chowdhury, Sagar; Nacev, Alek; Hilaman, Ryan; Stepanov, Pavel; Baker-McKee, James; Ijanaten, Said; Koudelka, Christian; English, Bradley; Malik, Pulkit; Weinberg, Irving N.

    2018-05-01

    In this study, we have developed a method of delivering drugs deep into articular cartilage using shaped dynamic magnetic fields acting on small metallic magnetic nanoparticles with a polyethylene glycol coating and an average diameter of 30 nm. It was shown that transport of magnetic nanoparticles through the entire thickness of bovine articular cartilage can be controlled by a combined alternating magnetic field at 100 Hz and a static magnetic field of 0.8 tesla (T) generated by a 1" dia. x 2" thick permanent magnet. Nanoparticle transport through bovine articular cartilage samples was investigated at various magnetic field settings and time durations. Combined application of an alternating magnetic field and the static field gradient resulted in a nearly 50-fold increase in nanoparticle transport through bovine articular cartilage tissue compared with static field conditions. This method can be applied to locally deliver therapeutic-loaded magnetic nanoparticles deep into articular cartilage to prevent cartilage degeneration and promote cartilage repair in osteoarthritis.

  5. Effectiveness of evaluating tumor vascularization using 3D power Doppler ultrasound with high-definition flow technology in the prediction of the response to neoadjuvant chemotherapy for T2 breast cancer: a preliminary report

    NASA Astrophysics Data System (ADS)

    Shia, Wei-Chung; Chen, Dar-Ren; Huang, Yu-Len; Wu, Hwa-Koon; Kuo, Shou-Jen

    2015-10-01

    The aim of this study was to evaluate the effectiveness of advanced ultrasound (US) imaging of vascular flow and morphological features in the prediction of a pathologic complete response (pCR) and a partial response (PR) to neoadjuvant chemotherapy for T2 breast cancer. Twenty-nine consecutive patients with T2 breast cancer treated with six courses of anthracycline-based neoadjuvant chemotherapy were enrolled. Three-dimensional (3D) power Doppler US with high-definition flow (HDF) technology was used to investigate the blood flow in and morphological features of the tumors. Six vascularity quantization features, three morphological features, and two vascular direction features were selected and extracted from the US images. A support vector machine was used to evaluate the changes in vascularity after neoadjuvant chemotherapy, and pCR and PR were predicted on the basis of these changes. The most accurate prediction of pCR was achieved after the first chemotherapy cycle, with an accuracy of 93.1% and a specificity of 85.5%, while that of a PR was achieved after the second cycle, with an accuracy of 79.31% and a specificity of 72.22%. Vascularity data can be useful to predict the effects of neoadjuvant chemotherapy. Determination of changes in vascularity after neoadjuvant chemotherapy using 3D power Doppler US with HDF can generate accurate predictions of the patient response, facilitating early decision-making.
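
The classification step the abstract describes, an SVM separating responders from non-responders on vascularity-change features, can be sketched as below. The feature values are synthetic and the three-feature layout is a stand-in for the study's eleven US-derived features; this is not the authors' pipeline.

```python
# Hedged sketch of SVM-based response prediction on synthetic feature vectors.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
# 20 synthetic patients x 3 features: change in vascularization indices after
# one chemotherapy cycle (values and separation are made up for illustration).
responders = rng.normal(-0.6, 0.15, (10, 3))      # strong drop in flow features
non_responders = rng.normal(-0.1, 0.15, (10, 3))  # little change
X = np.vstack([responders, non_responders])
y = np.array([1] * 10 + [0] * 10)                 # 1 = pCR, 0 = no pCR

clf = SVC(kernel="rbf").fit(X, y)                 # default RBF-kernel SVM
train_acc = clf.score(X, y)                       # training accuracy only
print(train_acc)
```

In a real setting the model would of course be evaluated with held-out patients (e.g. cross-validation) rather than training accuracy; the sketch only shows the fit/predict mechanics.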

  6. Using ICESat/GLAS Data Produced in a Self-Describing Format

    NASA Astrophysics Data System (ADS)

    Fowler, D. K.; Webster, D.; Fowler, C.; McAllister, M.; Haran, T. M.

    2015-12-01

    For the life of the ICESat mission and beyond, GLAS data have been distributed in binary format by NASA's National Snow and Ice Data Center Distributed Active Archive Center (NSIDC DAAC) at the University of Colorado in Boulder. These data have been extremely useful but, depending on the user, not always easy to use. Recently, with releases 33 and 34, GLAS data have been produced in an HDF5 format. The NSIDC User Services Office has found that most users find this HDF5 format more user friendly than the original binary format. Among the advantages is the ability to view the actual data using HDFView or any of a number of freely available open-source tools. With this format, the NSIDC DAAC has also been able to provide more selective and specific services, including spatial subsetting, file stitching, and the much-sought-after parameter subsetting, through the use of Reverb, the next-generation Earth science discovery tool. The final release of GLAS data in 2014, and ongoing user questions not just about the data but about the mission, satellite platform, and instrument, have also spurred NSIDC DAAC efforts to make all of the mission documents and information available to the public in one location. Thus was born the ICESat/GLAS Long Term Archive, now available online. The data and specifics from this mission are archived and made available to the public at NASA's NSIDC DAAC.

  7. Data Container Study for Handling Array-based Data Using Rasdaman, Hive, Spark, and MongoDB

    NASA Astrophysics Data System (ADS)

    Xu, M.; Hu, F.; Yu, M.; Scheele, C.; Liu, K.; Huang, Q.; Yang, C. P.; Little, M. M.

    2016-12-01

    Geoscience communities have come up with various big data storage solutions, such as Rasdaman and Hive, to address the grand challenges of massive Earth observation data management and processing. To examine the readiness of current solutions to support big Earth observation data, we propose to investigate and compare four popular data container solutions: Rasdaman, Hive, Spark, and MongoDB. Using different types of spatial and non-spatial queries, datasets stored in common scientific data formats (e.g., NetCDF and HDF), and two applications (i.e., dust storm simulation data mining and MERRA data analytics), we systematically compare and evaluate the features and performance of these four data containers in terms of data discovery and access. The computing resources (e.g., CPU, memory, hard drive, network) consumed while performing various queries and operations are monitored and recorded for the performance evaluation. The initial results show that 1) Rasdaman has the best performance for queries on statistical and operational functions, and supports the NetCDF data format better than HDF; 2) Rasdaman's clustering configuration is more complex than the others'; 3) Hive performs better on single pixel extraction from multiple images; and 4) except for the single pixel extractions, Spark performs better than Hive, and its performance is close to Rasdaman's. A comprehensive report will detail the experimental results and compare their pros and cons regarding system performance, ease of use, accessibility, scalability, compatibility, and flexibility.
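
The "single pixel extraction" workload compared across the containers reduces to pulling one grid cell's time series out of a stack of scenes. A minimal array-level illustration (with a synthetic stand-in for a gridded variable; the real benchmark ran inside each container's own query engine):

```python
# Single-pixel time-series extraction from a (time, lat, lon) array stack.
import numpy as np

# Synthetic stand-in for e.g. a MERRA variable: 5 time steps on a 4x6 grid.
data = np.arange(5 * 4 * 6, dtype="f4").reshape(5, 4, 6)

lat_idx, lon_idx = 2, 3              # pixel of interest (array indices)
series = data[:, lat_idx, lon_idx]   # one value per time step
print(series.tolist())
```

Each container in the study effectively has to answer this same strided-access pattern, which is why it stresses storage layout so differently from whole-image statistics.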

  8. Calcium balance in pediatric online hemodiafiltration: Beware of sodium and bicarbonate in the dialysate.

    PubMed

    Bacchetta, Justine; Sellier-Leclerc, Anne-Laure; Bertholet-Thomas, Aurélia; Carlier, Marie-Christine; Cartier, Régine; Cochat, Pierre; Ranchin, Bruno

    2015-11-01

    Online hemodiafiltration (oHDF) is increasingly used in children; we have treated 28 children since 2009, adapting this technique to pediatric patients. In this service evaluation audit, we assessed plasma electrolytes to evaluate the evolution of total (tCa) and ionized calcium (iCa) during a session, as well as dialysate calcium (dCa) concentrations. Using a 1.25 mmol Ca/L dialysate, both tCa and iCa decreased during the session, with iCa falling below 1.1 mmol/L in 4/5 patients. In contrast, using a 1.5 mmol Ca/L dialysate, iCa remained normal in all patients. Major discrepancies were observed between the expected and measured dCa: 1.25 vs. 1.01 (0.83-1.04) and 1.5 vs. 1.47 (0.85-1.75) mmol/L, respectively (results presented as median [range]). These differences were explained by the way the dialysate is reconstituted: increasing the bicarbonate and/or decreasing the sodium requested in the dialysate decreases calcium extraction from the acid preparation. Proof of concept was obtained in an ex vivo setting by requesting modifications of the sodium and bicarbonate in the dialysate directly on the Fresenius machine. Nephrologists should be aware that "high bicarbonate and/or low sodium" requirements in oHDF decrease calcium in the dialysate. Copyright © 2015 Association Société de néphrologie. Published by Elsevier SAS. All rights reserved.

  9. A drug-induced accelerated senescence (DIAS) is a possibility to study aging in time lapse.

    PubMed

    Alili, Lirija; Diekmann, Johanna; Giesen, Melanie; Holtkötter, Olaf; Brenneisen, Peter

    2014-06-01

    Currently, the oxidative stress (or free radical) theory of aging is the most popular explanation of how aging occurs at the molecular level. Accordingly, a stress-induced senescence-like phenotype of human dermal fibroblasts can be induced in vitro by exposing human diploid fibroblasts to subcytotoxic concentrations of hydrogen peroxide. However, several biomarkers of replicative senescence, e.g., cell cycle arrest and enlarged morphology, are abrogated 14 days after treatment, indicating that reactive oxygen species (ROS) act as a trigger for short-term senescence (1-3 days) rather than maintaining the senescence-like phenotype. DNA-damaging factors are therefore discussed as producing a permanently senescent cell type. To induce long-term premature senescence and to understand the molecular alterations occurring during the aging process, we analyzed mitomycin C (MMC) as an alkylating DNA-damaging agent and ROS producer. Human dermal fibroblasts (HDF), used as a model for skin aging, were exposed to non-cytotoxic concentrations of MMC and analyzed for potential markers of cellular aging, for example enlarged morphology, senescence-associated β-galactosidase activity, cell cycle arrest, increased ROS production, and MMP1 activity, which are well documented for HDF in replicative senescence. Our data show that mitomycin C treatment results in a drug-induced accelerated senescence (DIAS) with long-term expression of senescence markers, demonstrating that a combination of different susceptibility factors, here ROS and DNA alkylation, is necessary to induce a permanent senescent cell type.

  10. A SOAP Web Service for accessing MODIS land product subsets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    SanthanaVannan, Suresh K; Cook, Robert B; Pan, Jerry Yun

    2011-01-01

    Remote sensing data from satellites have provided valuable information on the state of the earth for several decades. Since March 2000, the Moderate Resolution Imaging Spectroradiometer (MODIS) sensor on board NASA's Terra and Aqua satellites has been providing estimates of several land parameters useful in understanding earth system processes at global, continental, and regional scales. However, the HDF-EOS file format, the specialized software needed to process HDF-EOS files, the data volume, and the high spatial and temporal resolution of MODIS data make it difficult for users wanting to extract small but valuable amounts of information from the MODIS record. To overcome this usability issue, the NASA-funded Distributed Active Archive Center (DAAC) for Biogeochemical Dynamics at Oak Ridge National Laboratory (ORNL) developed a Web service that provides subsets of MODIS land products using Simple Object Access Protocol (SOAP). The ORNL DAAC MODIS subsetting Web service is a unique way of serving satellite data that exploits a well-established and popular Internet protocol to give users access to massive amounts of remote sensing data. The Web service provides MODIS land product subsets up to 201 x 201 km in a non-proprietary comma-delimited text file format. Users can programmatically query the Web service to extract MODIS land parameters for real-time data integration into models and decision support tools, or connect it to workflow software. Information regarding the MODIS SOAP subsetting Web service is available on the World Wide Web (WWW) at http://daac.ornl.gov/modiswebservice.
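
Because the service returns plain comma-delimited text, the client side needs nothing beyond a CSV parser. A minimal sketch of consuming such a response, with an entirely hypothetical column layout (see daac.ornl.gov/modiswebservice for the actual response format):

```python
# Parsing a comma-delimited MODIS subset response.
# The columns below (product, date, value) are hypothetical, for illustration.
import csv
import io

response = io.StringIO(
    "MOD13Q1,2011-01-01,0.62\n"
    "MOD13Q1,2011-01-17,0.58\n"
)
rows = list(csv.reader(response))
values = [float(r[2]) for r in rows]   # hypothetical: third column holds the value
print(values)
```

The non-proprietary text format is the design point here: any language with string splitting can ingest the subsets without HDF-EOS libraries.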

  11. Anti-photoaging potential of propolis extract in UVB-irradiated human dermal fibroblasts through increasing the expression of FOXO3A and NGF genes.

    PubMed

    Ebadi, Parimah; Fazeli, Mehdi

    2017-11-01

    Propolis is a resinous compound that has been widely used in folk medicine. Various biological activities and therapeutic applications of propolis have been studied previously. However, the effects of propolis on the expression of longevity-associated genes in the prevention of skin photoaging remain unclear. Therefore, in this study the protective effects of propolis on the expression of two longevity-associated genes, FOXO3A and NGF, against UVB-induced photoaging in human dermal fibroblasts (HDF) were investigated. Propolis extract demonstrated a concentration-dependent free radical scavenging activity, determined by the 2,2-diphenyl-1-picrylhydrazyl (DPPH) assay. Also, the Folin-Ciocalteu method was used to measure the total phenolic content of the extract. The viability of HDF cells decreased gradually with increasing UVB radiation doses, and 248 mJ/cm² was selected as the sub-cytotoxic dose. Pre-treatment with propolis extract increased the viability of UVB-irradiated human dermal fibroblasts and decreased the number of β-galactosidase-positive (senescent) cells among them. It also increased the expression of the FOXO3A and NGF genes in irradiated and non-irradiated cells. Consequently, these findings suggest that propolis extract has anti-photoaging potential and that this property, in addition to its strong antioxidant activity, may be due to its upregulation of longevity-associated genes. Copyright © 2017 Elsevier Masson SAS. All rights reserved.

  12. In vitro and in vivo evaluation of microporous chitosan hydrogel/nanofibrin composite bandage for skin tissue regeneration.

    PubMed

    Sudheesh Kumar, P T; Raj, N Mincy; Praveen, G; Chennazhi, Krishna Prasad; Nair, Shantikumar V; Jayakumar, R

    2013-02-01

    In this work, we have developed chitosan hydrogel/nanofibrin composite bandages (CFBs) and characterized them using Fourier transform infrared spectroscopy and scanning electron microscopy. The homogeneous distribution of nanofibrin in the prepared chitosan hydrogel matrix was confirmed by phosphotungstic acid-hematoxylin staining. Mechanical strength, swelling, biodegradation, porosity, whole-blood clotting, and platelet activation studies were carried out. In addition, the cell viability, cell attachment, and infiltration of the prepared CFBs were evaluated using human umbilical vein endothelial cells (HUVECs) and human dermal fibroblast (HDF) cells. The CFBs were found to be microporous, flexible, and biodegradable, and showed enhanced blood clotting and platelet activity compared to bandages without nanofibrin. The prepared CFBs were capable of absorbing fluid, as confirmed by immersion in phosphate buffered saline. Cell viability studies on HUVECs and HDF cells proved the nontoxic nature of the CFBs. Cell attachment and infiltration studies showed that cells attached to and proliferated on the CFBs. In vivo experiments carried out in Sprague-Dawley rats found that wound healing occurred within 2 weeks when treated with CFBs, compared to the bare wound and the wound treated with Kaltostat. Collagen deposition was greater on CFB-treated wounds than on the control. These results support the use of CFBs as an ideal candidate for skin tissue regeneration and wound healing.

  13. A hydrogel-based versatile screening platform for specific biomolecular recognition in a well plate format.

    PubMed

    Beer, Meike V; Rech, Claudia; Diederichs, Sylvia; Hahn, Kathrin; Bruellhoff, Kristina; Möller, Martin; Elling, Lothar; Groll, Jürgen

    2012-04-01

    Precise determination of biomolecular interactions in high throughput crucially depends on a surface coating technique that allows immobilization of a variety of interaction partners in a non-interacting environment. We present a one-step hydrogel coating system based on isocyanate functional six-arm poly(ethylene oxide)-based star polymers for commercially available 96-well microtiter plates that combines a straightforward and robust coating application with versatile bio-functionalization. This system generates resistance to unspecific protein adsorption and cell adhesion, as demonstrated with fluorescently labeled bovine serum albumin and primary human dermal fibroblasts (HDF), and high specificity for the assessment of biomolecular recognition processes when ligands are immobilized on this surface. One particular advantage is the wide range of biomolecules that can be immobilized and convert the per se inert coating into a specifically interacting surface. We here demonstrate the immobilization and quantification of a broad range of biochemically important ligands, such as peptide sequences GRGDS and GRGDSK-biotin, the broadly applicable coupler molecule biocytin, the protein fibronectin, and the carbohydrates N-acetylglucosamine and N-acetyllactosamine. A simplified protocol for an enzyme-linked immunosorbent assay was established for the detection and quantification of ligands on the coating surface. Cell adhesion on the peptide and protein-modified surfaces was assessed using HDF. All coatings were applied using a one-step preparation technique, including bioactivation, which makes the system suitable for high-throughput screening in a format that is compatible with the most routinely used testing systems.

  14. Nanostructured biomaterials from electrospun demineralized bone matrix: a survey of processing and crosslinking strategies.

    PubMed

    Leszczak, Victoria; Place, Laura W; Franz, Natalee; Popat, Ketul C; Kipper, Matt J

    2014-06-25

    In the design of scaffolds for tissue engineering, biochemical function and nanoscale features are of particular interest. Natural polymers provide a wealth of biochemical function, but do not have the processability of synthetic polymers, limiting their ability to mimic the hierarchy of structures in the natural extracellular matrix. Thus, they are often combined with synthetic carrier polymers to enable processing. Demineralized bone matrix (DBM), a natural polymer, is allograft bone with the inorganic material removed. DBM contains the protein components of bone, which include adhesion ligands and osteoinductive signals, such as important growth factors. Herein we describe a novel method for tuning the nanostructure of DBM through electrospinning without the use of a carrier polymer. This work surveys solvents and solvent blends for electrospinning DBM. Blends of hexafluoroisopropanol and trifluoroacetic acid are studied in detail. The effects of DBM concentration and dissolution time on solution viscosity are also reported and correlated to observed differences in electrospun fiber morphology. We also present a survey of techniques to stabilize the resultant fibers with respect to aqueous environments. Glutaraldehyde vapor treatment is successful at maintaining both the macroscopic and microscopic structure of the electrospun DBM fibers. Finally, we report results from tensile testing of stabilized DBM nanofiber mats, and a preliminary evaluation of their cytocompatibility. The DBM nanofiber mats exhibit good cytocompatibility toward human dermal fibroblasts (HDF) in a 4-day culture; neither the electrospinning solvents nor the cross-linking leaves any measurable residual cytotoxicity toward HDF.

  15. The Local Group: the ultimate deep field

    NASA Astrophysics Data System (ADS)

    Boylan-Kolchin, Michael; Weisz, Daniel R.; Bullock, James S.; Cooper, Michael C.

    2016-10-01

    Near-field cosmology - using detailed observations of the Local Group and its environs to study wide-ranging questions in galaxy formation and dark matter physics - has become a mature and rich field over the past decade. There are lingering concerns, however, that the relatively small size of the present-day Local Group (˜2 Mpc diameter) imposes insurmountable sample-variance uncertainties, limiting its broader utility. We consider the region spanned by the Local Group's progenitors at earlier times and show that it reaches 3 arcmin ≈ 7 comoving Mpc in linear size (a volume of ≈350 Mpc3) at z = 7. This size at early cosmic epochs is large enough to be representative in terms of the matter density and counts of dark matter haloes with Mvir(z = 7) ≲ 2 × 109 M⊙. The Local Group's stellar fossil record traces the cosmic evolution of galaxies with 103 ≲ M⋆(z = 0)/M⊙ ≲ 109 (reaching M1500 > -9 at z ˜ 7) over a region that is comparable to or larger than the Hubble Ultra-Deep Field (HUDF) for the entire history of the Universe. In the JWST era, resolved stellar populations will probe regions larger than the HUDF and any deep JWST fields, further enhancing the value of near-field cosmology.
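
As a quick consistency check, the quoted comoving volume is simply the cube of the quoted linear size (assuming a roughly cubical region):

```latex
V \approx L^{3} \approx (7\,\mathrm{Mpc})^{3} = 343\,\mathrm{Mpc^{3}} \approx 350\,\mathrm{Mpc^{3}} \quad (z = 7)
```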

  16. Assessing the temporal stability of spatial patterns of soil apparent electrical conductivity using geophysical methods

    NASA Astrophysics Data System (ADS)

    De Caires, Sunshine A.; Wuddivira, Mark N.; Bekele, Isaac

    2014-10-01

    Cocoa remains in the same field for decades, resulting in plantations dominated by aging trees growing on variable and depleted soils. We determined the spatio-temporal variability of key soil properties in a 5.81 ha field from the International Cocoa Genebank, Trinidad, using geophysical methods. Multi-year (2008-2009) measurements of apparent electrical conductivity at 0-0.75 m (shallow) and 0.75-1.5 m (deep) were conducted. Apparent electrical conductivity at the deep and shallow depths gave the strongest linear correlations with clay-silt content (R = 0.67 and R = 0.78, respectively) and soil solution electrical conductivity (R = 0.76 and R = 0.60, respectively). Spearman rank correlation coefficients ranged between 0.89-0.97 and 0.81-0.95 for apparent electrical conductivity at the deep and shallow depths, respectively, signifying a strong dependence between measurement days. Thus, in the humid tropics, cocoa fields with a thick organic litter layer and relatively dense understory cover experience minimal fluctuations in the transient properties of soil water and temperature in the topsoil, resulting in similarly stable apparent electrical conductivity at the shallow and deep depths. Therefore, apparent electrical conductivity at the shallow depth, which covers the depth where cocoa feeder roots concentrate, can be used as a fertility indicator and to develop soil zones for efficient application of inputs and management of cocoa fields.

  17. Smoldering Remediation of Coal-Tar-Contaminated Soil: Pilot Field Tests of STAR.

    PubMed

    Scholes, Grant C; Gerhard, Jason I; Grant, Gavin P; Major, David W; Vidumsky, John E; Switzer, Christine; Torero, Jose L

    2015-12-15

    Self-sustaining treatment for active remediation (STAR) is an emerging, smoldering-based technology for nonaqueous-phase liquid (NAPL) remediation. This work presents the first in situ field evaluation of STAR. Pilot field tests were performed at 3.0 m (shallow test) and 7.9 m (deep test) below ground surface within distinct lithological units contaminated with coal tar at a former industrial facility. Self-sustained smoldering (i.e., after the in-well ignition heater was terminated) was demonstrated below the water table for the first time. The outward propagation of a NAPL smoldering front was mapped, and the NAPL destruction rate was quantified in real time. A total of 3700 kg of coal tar was destroyed over 12 days in the shallow test, and 860 kg over 11 days in the deep test; less than 2% of the total mass removed was volatilized. Self-sustaining propagation was relatively uniform radially outward in the deep test, achieving a radius of influence of 3.7 m; strong permeability contrasts and installed barriers influenced the front propagation geometry in the shallow test. Overall, reductions in soil hydrocarbon concentrations of 99.3% and 97.3% were achieved in the shallow and deep tests, respectively. This work provides the first field evaluation of STAR, demonstrates that it is effective in situ under a variety of conditions, and provides the information necessary for designing the full-scale site treatment.

  18. Quantifying the influence of deep soil moisture on ecosystem albedo: The role of vegetation

    NASA Astrophysics Data System (ADS)

    Sanchez-Mejia, Zulia Mayari; Papuga, Shirley Anne; Swetish, Jessica Blaine; van Leeuwen, Willem Jan Dirk; Szutu, Daphne; Hartfield, Kyle

    2014-05-01

    As changes in precipitation dynamics continue to alter the water availability in dryland ecosystems, understanding the feedbacks between the vegetation and the hydrologic cycle and their influence on the climate system is critically important. We designed a field campaign to examine the influence of two-layer soil moisture control on bare and canopy albedo dynamics in a semiarid shrubland ecosystem. We conducted this campaign during 2011 and 2012 within the tower footprint of the Santa Rita Creosote Ameriflux site. Albedo field measurements fell into one of four Cases within a two-layer soil moisture framework based on permutations of whether the shallow and deep soil layers were wet or dry. Using these Cases, we identified differences in how shallow and deep soil moisture influence canopy and bare albedo. Then, by varying the number of canopy and bare patches within a gridded framework, we explore the influence of vegetation and soil moisture on ecosystem albedo. Our results highlight the importance of deep soil moisture in land surface-atmosphere interactions through its influence on aboveground vegetation characteristics. For instance, we show how green-up of the vegetation is triggered by deep soil moisture, and link deep soil moisture to a decrease in canopy albedo. Understanding relationships between vegetation and deep soil moisture will provide important insights into feedbacks between the hydrologic cycle and the climate system.

  19. The JWST North Ecliptic Pole Survey Field for Time-domain Studies

    NASA Astrophysics Data System (ADS)

    Jansen, Rolf A.; Alpaslan, Mehmet; Ashby, Matthew; Ashcraft, Teresa; Cohen, Seth H.; Condon, James J.; Conselice, Christopher; Ferrara, Andrea; Frye, Brenda L.; Grogin, Norman A.; Hammel, Heidi B.; Hathi, Nimish P.; Joshi, Bhavin; Kim, Duho; Koekemoer, Anton M.; Mechtley, Matt; Milam, Stefanie N.; Rodney, Steven A.; Rutkowski, Michael J.; Strolger, Louis-Gregory; Trujillo, Chadwick A.; Willmer, Christopher; Windhorst, Rogier A.; Yan, Haojing

    2017-01-01

    The JWST North Ecliptic Pole (NEP) Survey field is located within JWST's northern Continuous Viewing Zone, will span ~14′ in diameter (~10′ with NIRISS coverage) and will be roughly circular in shape (initially sampled during Cycle 1 at 4 distinct orientations with JWST/NIRCam's 4.4′×2.2′ FoV —the JWST “windmill”) and will have NIRISS slitless grism spectroscopy taken in parallel, overlapping an alternate NIRCam orientation. This is the only region in the sky where JWST can observe a clean extragalactic deep survey field (free of bright foreground stars and with low Galactic foreground extinction AV) at arbitrary cadence or at arbitrary orientation. This will crucially enable a wide range of new and exciting time-domain science, including high redshift transient searches and monitoring (e.g., SNe), variability studies from Active Galactic Nuclei to brown dwarf atmospheres, as well as proper motions of extreme scattered Kuiper Belt and Oort Cloud Objects, and of nearby Galactic brown dwarfs, low-mass stars, and ultracool white dwarfs. We therefore welcome and encourage follow-up through GO programs of the initial GTO observations to realize its potential as a JWST time-domain community field. The JWST NEP Survey field was selected from an analysis of WISE 3.4+4.6 micron, 2MASS JHKs, and SDSS ugriz source counts and of Galactic foreground extinction, and is one of very few such ~10′ fields that are devoid of sources brighter than mAB = 16 mag. We have secured deep (mAB ~ 26 mag) wide-field (~23′×25′) Ugrz images of this field and its surroundings with LBT/LBC. We also expect that deep MMT/MMIRS YJHK images, deep 8-12 GHz VLA radio observations (pending), and possibly HST ACS/WFC and WFC3/UVIS ultraviolet-visible images will be available before JWST launches in Oct 2018.

  20. The JWST North Ecliptic Pole Survey Field for Time-domain Studies

    NASA Astrophysics Data System (ADS)

    Jansen, Rolf A.; Webb Medium Deep Fields IDS GTO Team, the NEPTDS-VLA/VLBA Team, and the NEPTDS-Chandra Team

    2017-06-01

    The JWST North Ecliptic Pole (NEP) Survey field is located within JWST's northern Continuous Viewing Zone, will span ~14′ in diameter (~10′ with NIRISS coverage) and will be roughly circular in shape (initially sampled during Cycle 1 at 4 distinct orientations with JWST/NIRCam's 4.4′×2.2′ FoV —the JWST "windmill") and will have NIRISS slitless grism spectroscopy taken in parallel, overlapping an alternate NIRCam orientation. This is the only region in the sky where JWST can observe a clean extragalactic deep survey field (free of bright foreground stars and with low Galactic foreground extinction AV) at arbitrary cadence or at arbitrary orientation. This will crucially enable a wide range of new and exciting time-domain science, including high redshift transient searches and monitoring (e.g., SNe), variability studies from Active Galactic Nuclei to brown dwarf atmospheres, as well as proper motions of extreme scattered Kuiper Belt and Oort Cloud Objects, and of nearby Galactic brown dwarfs, low-mass stars, and ultracool white dwarfs. We therefore welcome and encourage follow-up through GO programs of the initial GTO observations to realize its potential as a JWST time-domain community field. The JWST NEP Survey field was selected from an analysis of WISE 3.4+4.6 μm, 2MASS JHKs, and SDSS ugriz source counts and of Galactic foreground extinction, and is one of very few such ~10′ fields that are devoid of sources brighter than mAB = 16 mag. We have secured deep (mAB ~ 26 mag) wide-field (~23′×25′) Ugrz images of this field and its surroundings with LBT/LBC. We also expect that deep MMT/MMIRS YJHK images, deep 3-4.5 GHz VLA and VLBA radio observations, and possibly HST ACS/WFC and WFC3/UVIS ultraviolet-visible (pending) and Chandra/ACIS X-ray (pending) images will be available before JWST launches in Oct 2018.

  1. THE MULTIWAVELENGTH SURVEY BY YALE-CHILE (MUSYC): DEEP MEDIUM-BAND OPTICAL IMAGING AND HIGH-QUALITY 32-BAND PHOTOMETRIC REDSHIFTS IN THE ECDF-S

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cardamone, Carolin N.; Van Dokkum, Pieter G.; Urry, C. Megan

    2010-08-15

We present deep optical 18-medium-band photometry from the Subaru telescope over the ~30' x 30' Extended Chandra Deep Field-South, as part of the Multiwavelength Survey by Yale-Chile (MUSYC). This field has a wealth of ground- and space-based ancillary data, and contains the GOODS-South field and the Hubble Ultra Deep Field. We combine the Subaru imaging with existing UBVRIzJHK and Spitzer IRAC images to create a uniform catalog. Detecting sources in the MUSYC 'BVR' image, we find ~40,000 galaxies with R_AB < 25.3, the median 5σ limit of the 18 medium bands. Photometric redshifts are determined using the EAzY code and compared to ~2000 spectroscopic redshifts in this field. The medium-band filters provide very accurate redshifts for the (bright) subset of galaxies with spectroscopic redshifts, particularly at 0.1 < z < 1.2 and at z ≳ 3.5. For 0.1 < z < 1.2, we find a 1σ scatter in Δz/(1 + z) of 0.007, similar to results obtained with a similar filter set in the COSMOS field. As a demonstration of the data quality, we show that the red sequence and blue cloud can be cleanly identified in rest-frame color-magnitude diagrams at 0.1 < z < 1.2. We find that ~20% of the red sequence galaxies show evidence of dust emission at longer rest-frame wavelengths. The reduced images, photometric catalog, and photometric redshifts are provided through the public MUSYC Web site.
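The quoted 1σ scatter in Δz/(1+z) can be reproduced from matched photometric and spectroscopic catalogs. A minimal sketch, assuming the scatter is measured with the outlier-resistant normalized median absolute deviation (NMAD) common in photometric-redshift work; the paper itself may use a different estimator, and the function name is hypothetical:

```python
import numpy as np

def photz_scatter(z_spec, z_phot):
    """Normalized redshift residuals dz/(1+z_spec) and their 1-sigma scatter.

    The scatter is estimated with the NMAD statistic (1.48 * median
    absolute deviation), an assumption here; a plain standard deviation
    would give a similar answer for well-behaved residuals.
    """
    z_spec = np.asarray(z_spec, dtype=float)
    z_phot = np.asarray(z_phot, dtype=float)
    dz = (z_phot - z_spec) / (1.0 + z_spec)
    nmad = 1.48 * np.median(np.abs(dz - np.median(dz)))
    return dz, nmad
```

A scatter of 0.007 in this statistic means a typical medium-band redshift is good to better than 1% in (1+z).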

  2. SELFI: an object-based, Bayesian method for faint emission line source detection in MUSE deep field data cubes

    NASA Astrophysics Data System (ADS)

    Meillier, Céline; Chatelain, Florent; Michel, Olivier; Bacon, Roland; Piqueras, Laure; Bacher, Raphael; Ayasso, Hacheme

    2016-04-01

We present SELFI, the Source Emission Line FInder, a new Bayesian method optimized for the detection of faint galaxies in Multi Unit Spectroscopic Explorer (MUSE) deep fields. MUSE is the new panoramic integral field spectrograph at the Very Large Telescope (VLT) that has unique capabilities for spectroscopic investigation of the deep sky. It has provided data cubes with 324 million voxels over a single 1 arcmin2 field of view. To address the challenge of faint-galaxy detection in these large data cubes, we developed a new method that processes the 3D data for modeling and for the estimation and extraction of source configurations. This object-based approach yields a natural sparse representation of the sources in massive data fields, such as MUSE data cubes. In the Bayesian framework, the parameters that describe the observed sources are considered random variables. The Bayesian model leads to a general and robust algorithm in which the parameters are estimated in a fully data-driven way. This detection algorithm was applied to the MUSE observation of the Hubble Deep Field-South. With 27 h of total integration time, these observations provide a catalog of 189 sources of various categories with secure redshifts. The algorithm retrieved 91% of the galaxies with only a 9% false-detection rate. The method also allowed the discovery of three new Lyα emitters and one [OII] emitter, all without any Hubble Space Telescope counterpart. We analyzed the reasons for failure for some targets, and found that the most important limitation of the method arises when faint sources are located in the vicinity of bright spatially resolved galaxies that cannot be approximated by a Sérsic elliptical profile. The software and its documentation are available on the MUSE science web service (muse-vlt.eu/science).
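The Sérsic profile named above as the limiting approximation near bright galaxies has a simple closed form. A minimal circular-profile sketch (the actual SELFI model fits an elliptical version; `sersic_b` uses the standard Ciotti & Bertin asymptotic approximation for the half-light constant b_n):

```python
import math

def sersic_b(n):
    """Approximate b_n so that R_e encloses half the total light
    (Ciotti & Bertin 1999 expansion, accurate for n greater than ~0.36)."""
    return 2.0 * n - 1.0 / 3.0 + 4.0 / (405.0 * n) + 46.0 / (25515.0 * n ** 2)

def sersic_intensity(r, I_e, R_e, n):
    """Sersic surface-brightness profile:
    I(r) = I_e * exp(-b_n * ((r / R_e)^(1/n) - 1)),
    where I_e is the intensity at the effective radius R_e."""
    b = sersic_b(n)
    return I_e * math.exp(-b * ((r / R_e) ** (1.0 / n) - 1.0))
```

By construction I(R_e) = I_e; bright cluster-like galaxies with extended wings (high n) fall off much more slowly than exponential disks (n = 1), which is why a single Sérsic component can fail near resolved neighbors.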

  3. A MULTIWAVELENGTH STUDY OF TADPOLE GALAXIES IN THE HUBBLE ULTRA DEEP FIELD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Straughn, Amber N.; Eufrasio, Rafael T.; Gardner, Jonathan P.

    2015-12-01

Multiwavelength data are essential in order to provide a complete picture of galaxy evolution and to inform studies of galaxies' morphological properties across cosmic time. Here we present the results of a multiwavelength investigation of the morphologies of "tadpole" galaxies at intermediate redshift (0.314 < z < 3.175) in the Hubble Ultra Deep Field. These galaxies were previously selected from deep Hubble Space Telescope (HST) F775W data based on their distinct asymmetric knot-plus-tail morphologies. Here we use deep Wide Field Camera 3 near-infrared imaging in addition to the HST optical data in order to study the rest-frame UV/optical morphologies of these galaxies across the redshift range 0.3 < z < 3.2. This study reveals that the majority of these galaxies do retain their general asymmetric morphology in the rest-frame optical over this redshift range, if not the distinct "tadpole" shape. The average stellar mass of tadpole galaxies is lower than that of field galaxies, with the effect being slightly greater at higher redshift within the errors. Estimated from spectral energy distribution fits, the average age of tadpole galaxies is younger than that of field galaxies in the lower-redshift bin, and the average metallicity is lower (whereas the specific star formation rate for tadpoles is roughly the same as for field galaxies across the redshift range probed here). These average effects combined support the conclusion that this subset of galaxies is in an active phase of assembly, either late-stage merging or cold gas accretion causing localized clumpy star formation.

  4. Magnetically tunable oil droplet lens of deep-sea shrimp

    NASA Astrophysics Data System (ADS)

    Iwasaka, M.; Hirota, N.; Oba, Y.

    2018-05-01

In this study, the tunable properties of a bio-lens from a deep-sea shrimp were investigated for the first time using magnetic fields. The skin of the shrimp exhibited a brilliantly colored reflection of incident white light. The light-reflecting parts and the oil droplets in the shrimp's skin were observed in a glass slide sample cell using a digital microscope operated in the bore of two superconducting magnets (maximum strengths of 5 and 13 T). The ventral skin of the shrimp, which contained many oil droplets, included some comparatively large oil droplets (50 to 150 μm in diameter). A distinct response to magnetic fields was found in these large oil droplets. Further, the application of the magnetic fields to the sample cell caused a change in the size of the oil droplets. The phenomena observed in this work indicate that the oil droplets of deep-sea shrimp can act as lenses whose optical focusing can be modified via the application of external magnetic fields. The results of this study will make it possible to fabricate bio-inspired soft optical devices in the future.

  5. The Charging of Composites in the Space Environment

    NASA Technical Reports Server (NTRS)

    Czepiela, Steven A.

    1997-01-01

Deep dielectric charging and subsequent electrostatic discharge in composite materials used on spacecraft have become greater concerns as composite materials are used more extensively as main structural components. Deep dielectric charging occurs when high-energy particles penetrate and deposit themselves in the insulating material of spacecraft components. These deposited particles induce an electric field in the material, which causes the particles to move and thus changes the electric field. The electric field continues to change until a steady state is reached between the incoming particles from the space environment and the particles moving away due to the electric field. An electrostatic discharge occurs when the electric field exceeds the dielectric strength of the composite material. The goal of the current investigation is to characterize deep dielectric charging in composite materials and ascertain what modifications must be made to the composite properties to alleviate any breakdown issues. A 1-D model was created. The space environment, which is calculated using the Environmental Workbench software, the composite material properties, and the electric field and voltage boundary conditions are input into the model. The output from the model is the charge density, electric field, and voltage distributions as functions of depth into the material and time. Analysis using the model shows that there should be no deep dielectric charging problem with conductive composites such as carbon fiber/epoxy. With insulating materials such as glass fiber/epoxy, Kevlar, and polymers, there is also no concern about deep dielectric charging under average day-to-day particle fluxes. However, problems can arise during geomagnetic substorms and solar particle events, when particle flux levels increase by several orders of magnitude and thus increase the electric field in the material by several orders of magnitude.
Therefore, the second part of this investigation was an experimental attempt to measure the continuum electrical properties of a carbon fiber/epoxy composite, and to create a composite with tailorable conductivity without affecting its mechanical properties. The measurement of the conductivity and dielectric strength of carbon fiber/epoxy composites showed that these properties are surface-layer dominated and difficult to measure. In the second experimental task, by adding conductive carbon black to the epoxy, the conductivity of a glass fiber/epoxy composite was increased by three orders of magnitude and the dielectric constant by a factor of approximately 16, with minimal change to the mechanical properties.
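The charge-to-field step of the 1-D model described above can be illustrated by integrating Gauss's law over a deposited charge-density profile. A hedged sketch, not the Environmental Workbench model itself: the zero-field boundary condition at the front face and the default relative permittivity are assumptions for illustration.

```python
import numpy as np

EPS0 = 8.854e-12  # vacuum permittivity [F/m]

def efield_from_charge(rho, dx, eps_r=4.0):
    """Electric field through a 1-D dielectric slab from deposited charge.

    Integrates Gauss's law, dE/dx = rho / (eps_r * eps0), from the front
    face inward, assuming E = 0 there (a simplification; the study's model
    also imposes voltage boundary conditions).

    rho : charge density per cell [C/m^3]
    dx  : cell thickness [m]
    """
    eps = eps_r * EPS0
    return np.cumsum(rho) * dx / eps

def exceeds_strength(E, E_breakdown):
    """Flag cells where |E| exceeds the material's dielectric strength."""
    return np.abs(E) > E_breakdown
```

This makes the abstract's point concrete: because the field accumulates with integrated charge, a several-orders-of-magnitude jump in trapped charge during a substorm raises the field by the same factor, which is what pushes an insulator past its dielectric strength.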

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Constable, S.A.; Orange, Arnold S.; Hoversten, G. Michael

Induction in electrically conductive seawater attenuates the magnetotelluric (MT) fields and, coupled with a minimum around 1 Hz in the natural magnetic field spectrum, leads to a dramatic loss of electric and magnetic field power on the sea floor at periods shorter than 1000 s. For this reason the marine MT method traditionally has been used only at periods of 10^3 to 10^5 s to probe deep mantle structure; rarely does a sea-floor MT response extend to a 100-s period. To be useful for mapping continental shelf structure at depths relevant to petroleum exploration, however, MT measurements need to be made at periods between 1 and 1000 s. This can be accomplished using ac-coupled sensors, induction coils for the magnetic field, and an electric field amplifier developed for marine controlled-source applications. The electrically quiet sea floor allows the attenuated electric field to be amplified greatly before recording; in deep (1-km) water, motional noise in magnetic field sensors appears not to be a problem. In shallower water, motional noise does degrade the magnetic measurement, but sea-floor magnetic records can be replaced by land recordings, producing an effective sea-surface MT response. Field trials of such equipment in 1-km-deep water produced good-quality MT responses at periods of 3 to 1000 s; in shallower water, responses to a few hertz can be obtained. Using an autonomous sea-floor data logger developed at Scripps Institution of Oceanography, marine surveys of 50 to 100 sites are feasible.
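The attenuation described above follows from the electromagnetic skin depth in conductive seawater. A small sketch, assuming a typical seawater conductivity of ~3.2 S/m (a textbook value, not one stated in the abstract):

```python
import math

MU0 = 4.0e-7 * math.pi  # vacuum permeability [H/m]

def skin_depth(conductivity, frequency):
    """EM skin depth in a conductor, delta = sqrt(2 / (omega * mu0 * sigma)),
    in metres; fields decay by 1/e over one skin depth."""
    omega = 2.0 * math.pi * frequency
    return math.sqrt(2.0 / (omega * MU0 * conductivity))
```

At 1 Hz the skin depth in seawater is only ~280 m, so a 1-km water column strongly attenuates the short-period fields, while at 1000-s periods the skin depth approaches 9 km and the sea floor sees nearly the full signal.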

  7. Project DEEP STEAM: Fourth meeting of the technical advisory panel

    NASA Astrophysics Data System (ADS)

    Fox, R. L.; Donaldson, A. B.; Eisenhawer, S. W.; Hart, C. M.; Johnson, D. R.; Mulac, A. J.; Wayland, J. R.; Weirick, L. J.

    1981-07-01

The status of project DEEP STEAM was reviewed. The proceedings are divided into five main sections: (1) the injection string modification program; (2) the downhole steam generator program; (3) supporting activities; (4) field testing; and (5) recommendations and discussion.

  8. ESA Atmospheric Toolbox

    NASA Astrophysics Data System (ADS)

    Niemeijer, Sander

    2017-04-01

    The ESA Atmospheric Toolbox (BEAT) is one of the ESA Sentinel Toolboxes. It consists of a set of software components to read, analyze, and visualize a wide range of atmospheric data products. In addition to the upcoming Sentinel-5P mission it supports a wide range of other atmospheric data products, including those of previous ESA missions, ESA Third Party missions, Copernicus Atmosphere Monitoring Service (CAMS), ground based data, etc. The toolbox consists of three main components that are called CODA, HARP and VISAN. CODA provides interfaces for direct reading of data from earth observation data files. These interfaces consist of command line applications, libraries, direct interfaces to scientific applications (IDL and MATLAB), and direct interfaces to programming languages (C, Fortran, Python, and Java). CODA provides a single interface to access data in a wide variety of data formats, including ASCII, binary, XML, netCDF, HDF4, HDF5, CDF, GRIB, RINEX, and SP3. HARP is a toolkit for reading, processing and inter-comparing satellite remote sensing data, model data, in-situ data, and ground based remote sensing data. The main goal of HARP is to assist in the inter-comparison of datasets. By appropriately chaining calls to HARP command line tools one can pre-process datasets such that two datasets that need to be compared end up having the same temporal/spatial grid, same data format/structure, and same physical unit. The toolkit comes with its own data format conventions, the HARP format, which is based on netcdf/HDF. Ingestion routines (based on CODA) allow conversion from a wide variety of atmospheric data products to this common format. In addition, the toolbox provides a wide range of operations to perform conversions on the data such as unit conversions, quantity conversions (e.g. number density to volume mixing ratios), regridding, vertical smoothing using averaging kernels, collocation of two datasets, etc. 
VISAN is a cross-platform visualization and analysis application for atmospheric data and can be used to visualize and analyze the data that you retrieve using the CODA and HARP interfaces. The application uses the Python language as the means through which you provide commands to the application. The Python interfaces for CODA and HARP are included so you can directly ingest product data from within VISAN. Powerful visualization functionality for 2D plots and geographical plots in VISAN will allow you to directly visualize the ingested data. All components from the ESA Atmospheric Toolbox are Open Source and freely available. Software packages can be downloaded from the BEAT website: http://stcorp.nl/beat/

  9. Chemical Composition of Moringa oleifera Ethyl Acetate Fraction and Its Biological Activity in Diabetic Human Dermal Fibroblasts

    PubMed Central

    Gothai, Sivapragasam; Muniandy, Katyakyini; Zarin, Mazni Abu; Sean, Tan Woan; Kumar, S. Suresh; Munusamy, Murugan A.; Fakurazi, Sharida; Arulselvan, Palanisamy

    2017-01-01

Background: Moringa oleifera (MO), commonly known as the drumstick tree, is used in folklore medicine for the treatment of skin disease. Objective: The objective of this study is to evaluate the ethyl acetate (EtOAc) fraction of MO leaves for in vitro antibacterial, antioxidant, and wound healing activities and conduct gas chromatography-mass spectrometry (GC-MS) analysis. Materials and Methods: Antibacterial activity was evaluated against six Gram-positive bacteria and 10 Gram-negative bacteria by the disc diffusion method. Antioxidant activity was assessed by 1,1-diphenyl-2-picrylhydrazyl (DPPH) radical scavenging, hydrogen peroxide scavenging, and total phenolic content (TPC) assays. Wound healing efficiency was studied using cell viability, proliferation, and scratch assays in diabetic human dermal fibroblast (HDF-D) cells. Results: The EtOAc fraction showed moderate activity against all bacterial strains tested, and the maximum inhibition zone was observed against Streptococcus pyogenes (30 mm in diameter). The fraction showed higher sensitivity to Gram-positive strains than Gram-negative strains. In the quantitative analysis of antioxidant content, the EtOAc fraction was found to have a TPC of 65.81 ± 0.01. The DPPH scavenging activity and the hydrogen peroxide assay correlated with the TPC value, with IC50 values of 18.21 ± 0.06 and 59.22 ± 0.04, respectively. The wound healing experiment revealed a significant enhancement of cell proliferation and migration of HDF-D cells. GC-MS analysis confirmed the presence of 17 bioactive constituents that may be the principal factors in the significant antibacterial, antioxidant, and wound healing activity. Conclusion: The EtOAc fraction of MO leaves possesses remarkable wound healing properties, which can be attributed to the antibacterial and antioxidant activities of the fraction.
SUMMARY Moringa oleifera (MO) leaf ethyl acetate (EtOAc) fraction possesses antibacterial activity toward Gram-positive bacteria such as Streptococcus pyogenes, Streptococcus faecalis, Bacillus subtilis, Bacillus cereus, and Staphylococcus aureus, and Gram-negative bacteria such as Proteus mirabilis and Salmonella typhimurium. The MO leaf EtOAc fraction contained a phenolic content of 65.81 ± 0.01 and a flavonoid content of 37.1 ± 0.03. In addition, the fraction contained 17 bioactive constituents associated with the antibacterial, antioxidant, and wound healing properties, identified using gas chromatography-mass spectrometry analysis. The MO leaf EtOAc fraction supports a wound closure rate of about 80% for treatments when compared with the control group. Abbreviations used: MO: Moringa oleifera; EtOAc: Ethyl acetate; GC-MS: Gas Chromatography-Mass Spectrometry; HDF-D: Diabetic Human Dermal Fibroblast cells. PMID:29142400

  10. Chemical Composition of Moringa oleifera Ethyl Acetate Fraction and Its Biological Activity in Diabetic Human Dermal Fibroblasts.

    PubMed

    Gothai, Sivapragasam; Muniandy, Katyakyini; Zarin, Mazni Abu; Sean, Tan Woan; Kumar, S Suresh; Munusamy, Murugan A; Fakurazi, Sharida; Arulselvan, Palanisamy

    2017-10-01

Moringa oleifera (MO), commonly known as the drumstick tree, is used in folklore medicine for the treatment of skin disease. The objective of this study is to evaluate the ethyl acetate (EtOAc) fraction of MO leaves for in vitro antibacterial, antioxidant, and wound healing activities and conduct gas chromatography-mass spectrometry (GC-MS) analysis. Antibacterial activity was evaluated against six Gram-positive bacteria and 10 Gram-negative bacteria by the disc diffusion method. Antioxidant activity was assessed by 1,1-diphenyl-2-picrylhydrazyl (DPPH) radical scavenging, hydrogen peroxide scavenging, and total phenolic content (TPC) assays. Wound healing efficiency was studied using cell viability, proliferation, and scratch assays in diabetic human dermal fibroblast (HDF-D) cells. The EtOAc fraction showed moderate activity against all bacterial strains tested, and the maximum inhibition zone was observed against Streptococcus pyogenes (30 mm in diameter). The fraction showed higher sensitivity to Gram-positive strains than Gram-negative strains. In the quantitative analysis of antioxidant content, the EtOAc fraction was found to have a TPC of 65.81 ± 0.01. The DPPH scavenging activity and the hydrogen peroxide assay correlated with the TPC value, with IC50 values of 18.21 ± 0.06 and 59.22 ± 0.04, respectively. The wound healing experiment revealed a significant enhancement of cell proliferation and migration of HDF-D cells. GC-MS analysis confirmed the presence of 17 bioactive constituents that may be the principal factors in the significant antibacterial, antioxidant, and wound healing activity. The EtOAc fraction of MO leaves possesses remarkable wound healing properties, which can be attributed to the antibacterial and antioxidant activities of the fraction.
Moringa oleifera (MO) leaf ethyl acetate (EtOAc) fraction possesses antibacterial activity toward Gram-positive bacteria such as Streptococcus pyogenes, Streptococcus faecalis, Bacillus subtilis, Bacillus cereus, and Staphylococcus aureus, and Gram-negative bacteria such as Proteus mirabilis and Salmonella typhimurium. The MO leaf EtOAc fraction contained a phenolic content of 65.81 ± 0.01 and a flavonoid content of 37.1 ± 0.03. In addition, the fraction contained 17 bioactive constituents associated with the antibacterial, antioxidant, and wound healing properties, identified using gas chromatography-mass spectrometry analysis. The MO leaf EtOAc fraction supports a wound closure rate of about 80% for treatments when compared with the control group. Abbreviations used: MO: Moringa oleifera; EtOAc: Ethyl acetate; GC-MS: Gas Chromatography-Mass Spectrometry; HDF-D: Diabetic Human Dermal Fibroblast cells.

  11. The JPL roadmap for Deep Space navigation

    NASA Technical Reports Server (NTRS)

    Martin-Mur, Tomas J.; Abraham, Douglas S.; Berry, David; Bhaskaran, Shyam; Cesarone, Robert J.; Wood, Lincoln

    2006-01-01

    This paper reviews the tentative set of deep space missions that will be supported by NASA's Deep Space Mission System in the next twenty-five years, and extracts the driving set of navigation capabilities that these missions will require. There will be many challenges including the support of new mission navigation approaches such as formation flying and rendezvous in deep space, low-energy and low-thrust orbit transfers, precise landing and ascent vehicles, and autonomous navigation. Innovative strategies and approaches will be needed to develop and field advanced navigation capabilities.

  12. MUSE deep-fields: the Ly α luminosity function in the Hubble Deep Field-South at 2.91 < z < 6.64

    NASA Astrophysics Data System (ADS)

    Drake, Alyssa B.; Guiderdoni, Bruno; Blaizot, Jérémy; Wisotzki, Lutz; Herenz, Edmund Christian; Garel, Thibault; Richard, Johan; Bacon, Roland; Bina, David; Cantalupo, Sebastiano; Contini, Thierry; den Brok, Mark; Hashimoto, Takuya; Marino, Raffaella Anna; Pelló, Roser; Schaye, Joop; Schmidt, Kasper B.

    2017-10-01

    We present the first estimate of the Ly α luminosity function using blind spectroscopy from the Multi Unit Spectroscopic Explorer, MUSE, in the Hubble Deep Field-South. Using automatic source-detection software, we assemble a homogeneously detected sample of 59 Ly α emitters covering a flux range of -18.0 < log10 (F) < -16.3 (erg s-1 cm-2), corresponding to luminosities of 41.4 < log10 (L) < 42.8 (erg s-1). As recent studies have shown, Ly α fluxes can be underestimated by a factor of 2 or more via traditional methods, and so we undertake a careful assessment of each object's Ly α flux using a curve-of-growth analysis to account for extended emission. We describe our self-consistent method for determining the completeness of the sample, and present an estimate of the global Ly α luminosity function between redshifts 2.91 < z < 6.64 using the 1/Vmax estimator. We find that the luminosity function is higher than many number densities reported in the literature by a factor of 2-3, although our result is consistent at the 1σ level with most of these studies. Our observed luminosity function is also in good agreement with predictions from semi-analytic models, and shows no evidence for strong evolution between the high- and low-redshift halves of the data. We demonstrate that one's approach to Ly α flux estimation does alter the observed luminosity function, and caution that accurate flux assessments will be crucial in measurements of the faint-end slope. This is a pilot study for the Ly α luminosity function in the MUSE deep-fields, to be built on with data from the Hubble Ultra Deep Field that will increase the size of our sample by almost a factor of 10.
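The 1/Vmax estimator named above can be sketched in a few lines. A hedged illustration, not the authors' pipeline: the function and variable names are hypothetical, and the real analysis folds the self-consistent completeness correction described in the abstract into each source's accessible volume.

```python
import numpy as np

def lf_vmax(log_L, V_max, bins):
    """Binned 1/Vmax luminosity function estimator.

    log_L : log10 luminosities of the detected sources
    V_max : per-source maximum comoving volume (ideally completeness-
            corrected) within which the source could still be detected
    bins  : edges of the log10(L) bins

    Returns phi [number per volume per dex] and a Poisson-style error.
    """
    log_L = np.asarray(log_L, dtype=float)
    V_max = np.asarray(V_max, dtype=float)
    dlogL = np.diff(bins)
    phi = np.zeros(len(bins) - 1)
    err = np.zeros_like(phi)
    for i in range(len(bins) - 1):
        sel = (log_L >= bins[i]) & (log_L < bins[i + 1])
        phi[i] = np.sum(1.0 / V_max[sel]) / dlogL[i]          # sum of 1/Vmax weights
        err[i] = np.sqrt(np.sum(1.0 / V_max[sel] ** 2)) / dlogL[i]
    return phi, err
```

Each source contributes 1/Vmax rather than 1, so intrinsically faint emitters, detectable only in a small volume, are up-weighted; this is also why underestimated fluxes (and hence Vmax) propagate directly into the normalization of the luminosity function.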

  13. Deep Flaw Detection with Giant Magnetoresistive (GMR) Based Self-Nulling Probe

    NASA Technical Reports Server (NTRS)

    Wincheski, Buzz; Namkung, Min

    2004-01-01

In this paper, a design modification to the Very-Low-Frequency GMR-Based Self-Nulling Probe is presented that improves the signal-to-noise ratio for deeply buried flaws. The design change consists of incorporating a feedback coil in the center of the flux-focusing lens. The feedback coil enables cancellation of the leakage fields in the center of the probe and biasing of the GMR sensor to a location of high magnetic field sensitivity. The effect of the feedback on the probe output was examined, and experimental results for deep flaw detection are presented. The experimental results show that the modified probe is capable of clearly identifying flaws up to 1 cm deep in aluminum alloy structures.

  14. A Comparison between Deep and Shallow Stress Fields in Korea Using Earthquake Focal Mechanism Inversions and Hydraulic Fracturing Stress Measurements

    NASA Astrophysics Data System (ADS)

    Lee, Rayeon; Chang, Chandong; Hong, Tae-kyung; Lee, Junhyung; Bae, Seong-Ho; Park, Eui-Seob; Park, Chan

    2016-04-01

We are characterizing stress fields in Korea using two types of stress data: earthquake focal mechanism inversions (FMF) and hydraulic fracturing stress measurements (HF). The earthquake focal mechanism inversion data represent stress conditions at 2-20 km depths, whereas the hydraulic fracturing stress measurements, mostly conducted for geotechnical purposes, have been carried out at depths shallower than 1 km. We classified individual stress data based on the World Stress Map quality ranking scheme. A total of 20 FMF data were classified into A-B quality, possibly representing tectonic stress fields. A total of 83 HF data out of the 226 compiled were classified into B-C quality, which we use for shallow stress field characterization. The tectonic stress, revealed from the FMF data, is characterized by a remarkable consistency in its maximum stress (σ1) directions in and around Korea (N79±2°E), indicating a quite uniform deep stress field throughout. On the other hand, the shallow stress field, represented by the HF data, exhibits local variations in σ1 directions, possibly due to effects of topography and geologic structures such as faults. Nonetheless, there is a general similarity in σ1 directions between the deep and shallow stress fields. To investigate the shallow stress field statistically, we follow 'the mean orientation and wavelength analysis' suggested by Reiter et al. (2014). After the stress pattern analysis, the resulting stress points are distributed sporadically over the country, not covering the entire region evenly. In the western part of Korea, the shallow σ1 directions are generally uniform with their search radius reaching 100 km, and the average stress direction agrees well with that of the deep tectonic stress. We note two noticeable differences between shallow and deep stresses in the eastern part of Korea. First, the shallow σ1 orientations are markedly non-uniform in the southeastern part of Korea, with their search radius less than 25 km.
In this region, the average σ1 orientation based on the entire B-C quality stress data is calculated to be 77±37°; however, the average orientation is of limited meaning because of the high standard deviation. The southeastern part of Korea consists mainly of a Cretaceous sedimentary basin, geologically younger than the rest of the country, where regional-scale faults are intensely populated. The highly scattered stress directions in this region may represent the effect of the geologic structures on the shallow stress field. Second, shallow σ1 directions in the northeastern part of Korea strike consistently at 135±12°, deviated by as much as 56° from the deep tectonic stress direction. This region is characterized by high-altitude mountainous topography (elevations of the order of 1 km) with its major ridge axis in the NW-SE direction. We interpret, as a rule of thumb, that the ridge-perpendicular shallow horizontal stress components may be weak, leaving the ridge-parallel components as the maximum. Overall, there are both similarities and differences between the shallow and deep stress fields. Thus, it will be necessary to differentiate strategies for tackling stress-related problems based on their nature.
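Averaging stress orientations such as the 77±37° and 135±12° values above requires axial statistics, because σ1 directions are defined only modulo 180° (a 1° and a 179° azimuth are nearly parallel, not 178° apart). A minimal sketch of the standard doubled-angle mean, not the full distance-weighted Reiter et al. procedure:

```python
import math

def mean_orientation(azimuths_deg):
    """Mean of axial orientation data (azimuths defined modulo 180 deg).

    Doubles each azimuth, takes the circular (vector) mean of the doubled
    angles, then halves the result -- the standard trick for axial data.
    Returns the mean azimuth in [0, 180) degrees.
    """
    s = sum(math.sin(math.radians(2.0 * a)) for a in azimuths_deg)
    c = sum(math.cos(math.radians(2.0 * a)) for a in azimuths_deg)
    return (0.5 * math.degrees(math.atan2(s, c))) % 180.0
```

A naive arithmetic mean of 10° and 170° would give 90°; the axial mean correctly returns an azimuth along 0°/180°, which is why the high standard deviation in the southeastern data renders the average there unreliable.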

  15. Deep circulations under simple classes of stratification

    NASA Technical Reports Server (NTRS)

    Salby, Murry L.

    1989-01-01

    Deep circulations where the motion field is vertically aligned over one or more scale heights are studied under barotropic and equivalent barotropic stratifications. The study uses two-dimensional equations reduced from the three-dimensional primitive equations in spherical geometry. A mapping is established between the full primitive equations and general shallow water behavior and the correspondence between variables describing deep atmospheric motion and those of shallow water behavior is established.

  16. Space Science

    NASA Image and Video Library

    2002-04-01

This picture of the galaxy UGC 10214 was taken by the Advanced Camera for Surveys (ACS), which was installed aboard the Hubble Space Telescope (HST) in March 2002 during HST Servicing Mission 3B (the STS-109 mission). Dubbed the "Tadpole," this spiral galaxy is unlike the textbook images of stately galaxies. Its distorted shape was caused by a small interloper, a very blue, compact galaxy visible in the upper left corner of the more massive Tadpole. The Tadpole resides about 420 million light-years away in the constellation Draco. Seen shining through the Tadpole's disk, the tiny intruder is likely a hit-and-run galaxy that is now leaving the scene of the accident. Strong gravitational forces from the interaction created the long tail of debris, consisting of stars and gas, that stretches out more than 280,000 light-years. The galactic carnage and torrent of star birth are playing out against a spectacular backdrop: a "wallpaper pattern" of 6,000 galaxies. These galaxies represent twice the number discovered in the legendary Hubble Deep Field, the orbiting observatory's "deepest" view of the heavens, taken in 1995 by the Wide Field and Planetary Camera 2. The ACS picture, however, was taken in one-twelfth of the time it took to observe the original HST Deep Field. In blue light, ACS sees even fainter objects than were seen in the "deep field." The galaxies in the ACS picture, like those in the deep field, stretch back to nearly the beginning of time. Credit: NASA, H. Ford (JHU), G. Illingworth (UCSC/LO), M. Clampin (STScI), G. Hartig (STScI), the ACS Science Team, and ESA.

  17. HST/ACS Observations of RR Lyrae Stars in Six Ultra-Deep Fields of M31

    NASA Technical Reports Server (NTRS)

    Jeffery, E. J.; Smith, E.; Brown, T. M.; Sweigart, A. V.; Kalirai, J. S.; Ferguson, H. C.; Guhathakurta, P.; Renzini, A.; Rich, R. M.

    2010-01-01

We present HST/ACS observations of RR Lyrae variable stars in six ultra-deep fields of the Andromeda galaxy (M31), including parts of the halo, disk, and giant stellar stream. Past work on the RR Lyrae stars in M31 has focused on various aspects of the stellar populations that make up the galaxy's halo, including their distances and metallicities. This study builds upon this previous work by increasing the spatial coverage (something that has been lacking in previous studies) and by searching for these variable stars in constituents of the galaxy not yet explored. Besides the 55 RR Lyrae stars we found in our initial field located 11 kpc from the galactic nucleus, we find additional RR Lyrae stars in four of the remaining five ultra-deep fields as follows: 21 in the disk, 24 in the giant stellar stream, 3 in the halo field 21 kpc from the galactic nucleus, and 5 in one of the halo fields at 35 kpc. No RR Lyrae were found in the second halo field at 35 kpc. The RR Lyrae populations of these fields appear to be mostly of Oosterhoff I type, although the 11 kpc field appears to be intermediate or mixed. We discuss the properties of these stars, including their period and reddening distributions. We calculate metallicities and distances for the stars in each of these fields using different methods and compare the results, to an extent that has not yet been done. We compare these methods not just on RR Lyrae in our M31 fields, but also on a data set of Milky Way field RR Lyrae stars.

  18. A deep redshift survey of field galaxies. Comments on the reality of the Butcher-Oemler effect

    NASA Technical Reports Server (NTRS)

    Koo, David C.; Kron, Richard G.

    1987-01-01

    A spectroscopic survey of over 400 field galaxies has been completed in three fields for which we have deep UBVI photographic photometry. The galaxies typically range from B = 20 to 22 and possess redshifts z from 0.1 to 0.5 whose distributions are often quite spiky. Little, if any, luminosity evolution is observed up to redshifts z ≈ 0.5. By such redshifts, however, an unexpectedly large fraction of luminous galaxies has very blue intrinsic colors that suggest extensive star formation; in contrast, the reddest galaxies still have colors that match those of present-day ellipticals.

  19. An extended moderate-depth contiguous layer of the Chandra Bootes field - additional pointings

    NASA Astrophysics Data System (ADS)

    Kraft, Ralph

    2016-09-01

    We propose 150 ks (6 × 25 ks) ACIS-I observations to supplement existing X-ray data in XBootes. These new observations will allow the expansion of a relatively large contiguous (~2 deg²) region in Bootes covered at ~40 ks, i.e., 5-8x deeper than the nominal Bootes field. In concert with the recently approved 1.025 Ms Chandra Deep Wide-Field Survey, this additional deep layer of Bootes will (1) provide new insights into the dark matter halos and large-scale structures that host AGN; and (2) allow new measurements of the distribution of X-ray luminosities and connections to host galaxy evolution.

  20. Detection of a possible superluminous supernova in the epoch of reionization

    NASA Astrophysics Data System (ADS)

    Mould, Jeremy; Abbott, Tim; Cooke, Jeff; Curtin, Chris; Katsiani, Antonios; Koekemoer, Anton; Tescari, Edoardo; Uddin, Syed; Wang, Lifan; Wyithe, Stuart

    2017-04-01

    An interesting transient has been detected in one of our three Dark Energy Camera deep fields. Observations of these deep fields take advantage of the high red sensitivity of DECam on the Cerro Tololo Inter-American Observatory Blanco telescope. The survey includes the Y band, with rest wavelength 1430 Å at z = 6. Survey fields (the Prime field 0555-6130, the 16hr field 1600-75, and the SUDSS New Southern Field) are deeper in Y than other infrared surveys. They are circumpolar, allowing all night to be used efficiently, exploiting the moon tolerance of 1-micron observations to minimize conflict with the Dark Energy Survey. As an i-band dropout (meaning that the flux decrement shortward of Lyman alpha is in the i bandpass), the transient we report here is a supernova candidate with z ≈ 6, with a luminosity comparable to the brightest known current-epoch superluminous supernovae (i.e., 2 × 10^11 solar luminosities).
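
    The i-band-dropout argument is a direct consequence of cosmological redshift, λ_obs = λ_rest(1 + z). A minimal numerical check (the Lyman-α rest wavelength used below is a standard value, not quoted in the record):

```python
def observed_wavelength(lam_rest_angstrom, z):
    """Cosmological redshift: lam_obs = lam_rest * (1 + z)."""
    return lam_rest_angstrom * (1.0 + z)

# The record's quoted Y-band rest wavelength of 1430 A at z = 6 corresponds
# to an observed wavelength of 10010 A, i.e. ~1.0 micron:
y_band_obs = observed_wavelength(1430.0, 6.0)

# Lyman-alpha (rest 1215.67 A, a standard value not quoted in the record)
# lands near 8510 A at z = 6 -- inside the i band, which is why a z ~ 6
# source "drops out" of i while remaining visible in Y:
lya_obs = observed_wavelength(1215.67, 6.0)
```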

  1. Deep Defects Seen on Visual Fields Spatially Correspond Well to Loss of Retinal Nerve Fiber Layer Seen on Circumpapillary OCT Scans.

    PubMed

    Mavrommatis, Maria A; Wu, Zhichao; Naegele, Saskia I; Nunez, Jason; De Moraes, Carlos; Ritch, Robert; Hood, Donald C

    2018-02-01

    To examine the structure-function relationship in glaucoma between deep defects on visual fields (VF) and deep losses in the circumpapillary retinal nerve fiber layer (cpRNFL) on optical coherence tomography (OCT) circle scans. Thirty-two glaucomatous eyes with deep VF defects, defined as at least one test location of -15 dB or worse on the 10-2 and/or 24-2 VF pattern deviation (PD) plots, were included from 87 eyes with "early" glaucoma (i.e., 24-2 mean deviation better than -6 dB). Using the location of the deep VF points and a schematic model, the location of local damage on an OCT circle scan was predicted. The thinnest location of cpRNFL (i.e., deepest loss) was also determined. In 19 of 32 eyes, a region of complete or near complete cpRNFL loss was observed. All 19 of these had deep VF defects on the 24-2 and/or 10-2. All of the 32 eyes with deep VF defects had abnormal cpRNFL regions (red, 1%) and all but 2 had a region of cpRNFL thickness <21 μm. The midpoint of the VF defect and the location of deepest cpRNFL loss had a 95% limit of agreement within approximately two-thirds of a clock-hour (or 30°) sector (between -22.1° and 25.2°). Individual fovea-to-disc angle (FtoDa) adjustment improved agreement in one eye with an extreme FtoDa. Although studies relating local structural (OCT) and functional (VF) measures typically show poor to moderate correlations, there is good qualitative agreement between the location of deep cpRNFL loss and deep defects on VFs.
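
    The "95% limit of agreement" reported above is conventionally the Bland-Altman interval: the mean of the paired differences ± 1.96 standard deviations of those differences. A minimal sketch on hypothetical angle data (the numbers below are illustrative, not the study's):

```python
import math

def limits_of_agreement(diffs):
    """Bland-Altman 95% limits of agreement: mean difference +/- 1.96 SD."""
    n = len(diffs)
    mean = sum(diffs) / n
    sd = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (n - 1))
    return mean - 1.96 * sd, mean + 1.96 * sd

# Hypothetical per-eye differences (degrees) between the VF-defect midpoint
# and the location of deepest cpRNFL loss:
diffs = [-10.0, -4.0, 0.0, 2.0, 5.0, 8.0, -6.0, 3.0, 1.0, -2.0]
lo, hi = limits_of_agreement(diffs)   # roughly (-10.8, 10.2) for these data
```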

  2. The MUSE Hubble Ultra Deep Field Survey. II. Spectroscopic redshifts and comparisons to color selections of high-redshift galaxies

    NASA Astrophysics Data System (ADS)

    Inami, H.; Bacon, R.; Brinchmann, J.; Richard, J.; Contini, T.; Conseil, S.; Hamer, S.; Akhlaghi, M.; Bouché, N.; Clément, B.; Desprez, G.; Drake, A. B.; Hashimoto, T.; Leclercq, F.; Maseda, M.; Michel-Dansac, L.; Paalvast, M.; Tresse, L.; Ventou, E.; Kollatschny, W.; Boogaard, L. A.; Finley, H.; Marino, R. A.; Schaye, J.; Wisotzki, L.

    2017-11-01

    We have conducted a two-layered spectroscopic survey (1' × 1' ultra deep and 3' × 3' deep regions) in the Hubble Ultra Deep Field (HUDF) with the Multi Unit Spectroscopic Explorer (MUSE). The combination of a large field of view, high sensitivity, and wide wavelength coverage provides an order of magnitude improvement in spectroscopically confirmed redshifts in the HUDF; i.e., 1206 secure spectroscopic redshifts for Hubble Space Telescope (HST) continuum selected objects, which corresponds to 15% of the total (7904). The redshift distribution extends well beyond z> 3 and to HST/F775W magnitudes as faint as ≈ 30 mag (AB, 1σ). In addition, 132 secure redshifts were obtained for sources with no HST counterparts that were discovered in the MUSE data cubes by a blind search for emission-line features. In total, we present 1338 high quality redshifts, which is a factor of eight increase compared with the previously known spectroscopic redshifts in the same field. We assessed redshifts mainly with the spectral features [O II] at z< 1.5 (473 objects) and Lyα at 2.9

  3. Field Placement Treatments: A Comparative Study

    ERIC Educational Resources Information Center

    Parkison, Paul T.

    2008-01-01

    Field placement within teacher education represents a topic of interest for all preservice teacher programs. Present research addresses a set of important questions regarding field placement: (1) What pedagogical methodologies facilitate deep learning during field experiences? (2) Is there a significant difference in treatment effect for…

  4. WHATS-3: An improved flow-through multi-bottle fluid sampler for deep-sea geofluid research

    NASA Astrophysics Data System (ADS)

    Miyazaki, Junichi; Makabe, Akiko; Matsui, Yohei; Ebina, Naoya; Tsutsumi, Saki; Ishibashi, Jun-ichiro; Chen, Chong; Kaneko, Sho; Takai, Ken; Kawagucci, Shinsuke

    2017-06-01

    Deep-sea geofluid systems, such as hydrothermal vents and cold seeps, are key to understanding subseafloor environments of Earth. Fluid chemistry, especially, provides crucial information towards elucidating the physical, chemical, and biological processes that occur in these ecosystems. To accurately assess fluid and gas properties of deep-sea geofluids, well-designed pressure-tight fluid samplers are indispensable, and as such they are important assets of deep-sea geofluid research. Here, the development of a new flow-through, pressure-tight fluid sampler capable of four independent sampling events (two subsamples for liquid and gas analyses from each) is reported. This new sampler, named WHATS-3, is a new addition to the WHATS-series samplers and a major upgrade from the previous WHATS-2 sampler, with improvements in sample number, valve operational time, physical robustness, and ease of maintenance. Routine laboratory-based pressure tests proved that it is suitable for operation up to 35 MPa pressure. Successful field tests of the new sampler were also carried out in five hydrothermal fields, two in the Indian Ocean and three in the Okinawa Trough (max. depth 3,300 m). Relations of Mg and major ion species demonstrated bimodal mixing trends between a hydrothermal fluid and seawater, confirming the high quality of the fluids sampled. The newly developed WHATS-3 sampler is well-balanced in sampling capability, field usability, and maintenance feasibility, and can serve as one of the best geofluid samplers available at present to conduct efficient research of deep-sea geofluid systems.
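
    The Mg-vs-major-ion mixing analysis mentioned above rests on the fact that hot hydrothermal endmember fluids are essentially Mg-free, so a sample's seawater fraction follows directly from its Mg concentration. A sketch assuming a typical deep-seawater Mg of 52.7 mmol/kg (an assumed reference value, not taken from the paper):

```python
# Two-endmember mixing: the hydrothermal endmember is taken as Mg-free,
# so the seawater fraction of a sample is just Mg_sample / Mg_seawater.
MG_SEAWATER = 52.7   # mmol/kg, typical deep-seawater value (assumed here)
MG_FLUID = 0.0       # Mg of the hydrothermal endmember

def seawater_fraction(mg_sample):
    return (mg_sample - MG_FLUID) / (MG_SEAWATER - MG_FLUID)

def endmember_concentration(c_sample, mg_sample, c_seawater):
    """Extrapolate a measured species concentration to the zero-Mg endmember."""
    f = seawater_fraction(mg_sample)
    return (c_sample - f * c_seawater) / (1.0 - f)

# A sample at half the seawater Mg is a 50/50 mix; a species measured at
# 300 units in that sample, with 100 units in seawater, extrapolates to a
# 500-unit endmember concentration:
frac = seawater_fraction(26.35)
cl_end = endmember_concentration(300.0, 26.35, 100.0)
```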

  5. Mechanisms of Exhaust Pollutants and Plume Formation in Continuous Combustion.

    DTIC Science & Technology

    1984-06-01

    A swirl-stabilized Dilute Swirl Combustor (DSC) geometry was developed to address the deficiencies observed with the swirl CBC geometry. Certain deficiencies were apparent in the ability of the model to predict experimental trends; for example, the velocity profiles (Figure 10a) show discrepancies between measured (LA), predicted, and flow-visualization values tabulated for the HDF and LCF configurations at Re = 25,000 and 50,000.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    North, Michael J.

    SchemaOnRead provides tools for implementing schema-on-read, including a single function call (e.g., schemaOnRead("filename")) that reads text (TXT), comma-separated value (CSV), raster image (BMP, PNG, GIF, TIFF, and JPG), R data (RDS), HDF5, NetCDF, spreadsheet (XLS, XLSX, ODS, and DIF), Weka Attribute-Relation File Format (ARFF), Epi Info (REC), Pajek network (PAJ), R network (NET), Hypertext Markup Language (HTML), SPSS (SAV), Systat (SYS), and Stata (DTA) files. It also recursively reads folders (e.g., schemaOnRead("folder")), returning a nested list of the contained elements.
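
    SchemaOnRead itself is an R package, but the schema-on-read idea it describes, dispatching the reader on the observed file type and recursing into folders, can be sketched in a few lines of Python. This is an illustrative analogue only; the function name and the handful of formats handled here only loosely mirror the R API:

```python
import csv
import json
import tempfile
from pathlib import Path

def schema_on_read(path):
    """Read a file based on its observed type; recurse into folders."""
    p = Path(path)
    if p.is_dir():
        # Folders come back as a nested mapping, mirroring schemaOnRead("folder").
        return {child.name: schema_on_read(child) for child in sorted(p.iterdir())}
    suffix = p.suffix.lower()
    if suffix == ".csv":
        with p.open(newline="") as f:
            return list(csv.reader(f))
    if suffix == ".json":
        with p.open() as f:
            return json.load(f)
    return p.read_text()   # fall back to plain text

# Tiny demonstration on a throwaway folder:
root = Path(tempfile.mkdtemp())
(root / "a.csv").write_text("x,y\n1,2\n")
(root / "b.json").write_text('{"k": 1}')
(root / "c.txt").write_text("hello")
result = schema_on_read(root)
```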

  7. Advanced graphical user interface for multi-physics simulations using AMST

    NASA Astrophysics Data System (ADS)

    Hoffmann, Florian; Vogel, Frank

    2017-07-01

    Numerical modelling of particulate matter has gained much popularity in recent decades. Advanced Multi-physics Simulation Technology (AMST) is a state-of-the-art three-dimensional numerical modelling technique combining the eXtended Discrete Element Method (XDEM) with Computational Fluid Dynamics (CFD) and Finite Element Analysis (FEA) [1]. One major limitation of this code is the lack of a graphical user interface (GUI), meaning that all pre-processing has to be done directly in an HDF5 file. This contribution presents the first graphical pre-processor developed for AMST.

  8. Opportunities and obstacles for deep learning in biology and medicine.

    PubMed

    Ching, Travers; Himmelstein, Daniel S; Beaulieu-Jones, Brett K; Kalinin, Alexandr A; Do, Brian T; Way, Gregory P; Ferrero, Enrico; Agapow, Paul-Michael; Zietz, Michael; Hoffman, Michael M; Xie, Wei; Rosen, Gail L; Lengerich, Benjamin J; Israeli, Johnny; Lanchantin, Jack; Woloszynek, Stephen; Carpenter, Anne E; Shrikumar, Avanti; Xu, Jinbo; Cofer, Evan M; Lavender, Christopher A; Turaga, Srinivas C; Alexandari, Amr M; Lu, Zhiyong; Harris, David J; DeCaprio, Dave; Qi, Yanjun; Kundaje, Anshul; Peng, Yifan; Wiley, Laura K; Segler, Marwin H S; Boca, Simina M; Swamidass, S Joshua; Huang, Austin; Gitter, Anthony; Greene, Casey S

    2018-04-01

    Deep learning describes a class of machine learning algorithms that are capable of combining raw inputs into layers of intermediate features. These algorithms have recently shown impressive results across a variety of domains. Biology and medicine are data-rich disciplines, but the data are complex and often ill-understood. Hence, deep learning techniques may be particularly well suited to solve problems of these fields. We examine applications of deep learning to a variety of biomedical problems-patient classification, fundamental biological processes and treatment of patients-and discuss whether deep learning will be able to transform these tasks or if the biomedical sphere poses unique challenges. Following from an extensive literature review, we find that deep learning has yet to revolutionize biomedicine or definitively resolve any of the most pressing challenges in the field, but promising advances have been made on the prior state of the art. Even though improvements over previous baselines have been modest in general, the recent progress indicates that deep learning methods will provide valuable means for speeding up or aiding human investigation. Though progress has been made linking a specific neural network's prediction to input features, understanding how users should interpret these models to make testable hypotheses about the system under study remains an open challenge. Furthermore, the limited amount of labelled data for training presents problems in some domains, as do legal and privacy constraints on work with sensitive health records. Nonetheless, we foresee deep learning enabling changes at both bench and bedside with the potential to transform several areas of biology and medicine. © 2018 The Authors.

  9. Opportunities and obstacles for deep learning in biology and medicine

    PubMed Central

    2018-01-01

    Deep learning describes a class of machine learning algorithms that are capable of combining raw inputs into layers of intermediate features. These algorithms have recently shown impressive results across a variety of domains. Biology and medicine are data-rich disciplines, but the data are complex and often ill-understood. Hence, deep learning techniques may be particularly well suited to solve problems of these fields. We examine applications of deep learning to a variety of biomedical problems—patient classification, fundamental biological processes and treatment of patients—and discuss whether deep learning will be able to transform these tasks or if the biomedical sphere poses unique challenges. Following from an extensive literature review, we find that deep learning has yet to revolutionize biomedicine or definitively resolve any of the most pressing challenges in the field, but promising advances have been made on the prior state of the art. Even though improvements over previous baselines have been modest in general, the recent progress indicates that deep learning methods will provide valuable means for speeding up or aiding human investigation. Though progress has been made linking a specific neural network's prediction to input features, understanding how users should interpret these models to make testable hypotheses about the system under study remains an open challenge. Furthermore, the limited amount of labelled data for training presents problems in some domains, as do legal and privacy constraints on work with sensitive health records. Nonetheless, we foresee deep learning enabling changes at both bench and bedside with the potential to transform several areas of biology and medicine. PMID:29618526

  10. Field performance of self-siphon sediment cleansing set for sediment removal in deep CSO chamber.

    PubMed

    Zhou, Yongchao; Zhang, Yiping; Tang, Ping

    2013-01-01

    This paper presents a study of the self-siphon sediment cleansing set (SSCS), a system designed to remove sediment from the deep combined sewer overflow (CSO) chamber during dry-weather periods. In order to get a better understanding of the sediment removal effectiveness and operational conditions of the SSCS system, we carried out a full-scale field study and comparison analysis of the sediment depth changes in the deep CSO chambers under conditions with and without the SSCS. The field investigation results demonstrated that the SSCS drains the dry-weather flow that accumulated for 50-57 min from the sewer channel to the intercepting system in about 10 min. It is estimated that the bed shear stress in the CSO chamber and sewer channel is improved almost 25 times on average. The SSCS acts to remove the near-bed solids with high pollution load efficiently. Moreover, it cleans up not only the new sediment layer but also part of the previously accumulated sediment.

  11. Field testing of stiffened deep cement mixing piles under lateral cyclic loading

    NASA Astrophysics Data System (ADS)

    Raongjant, Werasak; Jing, Meng

    2013-06-01

    Construction of seaside and underground wall bracing often uses stiffened deep cement mixing (SDCM) piles. This research investigates methods used to improve the bearing capacity of these SDCM piles when subjected to cyclic lateral loading via various types of stiffer cores. Eight piles, two deep cement mixing (DCM) piles and six stiffened deep cement mixing piles with three different types of cores (H-shaped cross-section prestressed concrete, steel pipe, and H-beam steel), were embedded through soft clay into medium-hard clay on site in Thailand. Cyclic horizontal loading was gradually applied until pile failure, and the hysteresis loops of lateral load vs. lateral deformation were recorded. The lateral carrying capacities of the SDCM piles with an H-beam steel core were 3-4 times those of the DCM piles. This field research clearly shows that using H-beam steel as a stiffer core for SDCM piles is the best method to improve their lateral carrying capacity, ductility, and energy dissipation capacity.

  12. Laminar Neural Field Model of Laterally Propagating Waves of Orientation Selectivity

    PubMed Central

    2015-01-01

    We construct a laminar neural-field model of primary visual cortex (V1) consisting of a superficial layer of neurons that encode the spatial location and orientation of a local visual stimulus coupled to a deep layer of neurons that only encode spatial location. The spatially-structured connections in the deep layer support the propagation of a traveling front, which then drives propagating orientation-dependent activity in the superficial layer. Using a combination of mathematical analysis and numerical simulations, we establish that the existence of a coherent orientation-selective wave relies on the presence of weak, long-range connections in the superficial layer that couple cells of similar orientation preference. Moreover, the wave persists in the presence of feedback from the superficial layer to the deep layer. Our results are consistent with recent experimental studies that indicate that deep and superficial layers work in tandem to determine the patterns of cortical activity observed in vivo. PMID:26491877

  13. Deep subsurface drip irrigation using coal-bed sodic water: part I. water and solute movement

    USGS Publications Warehouse

    Bern, Carleton R.; Breit, George N.; Healy, Richard W.; Zupancic, John W.; Hammack, Richard

    2013-01-01

    Water co-produced with coal-bed methane (CBM) in the semi-arid Powder River Basin of Wyoming and Montana commonly has relatively low salinity and high sodium adsorption ratios that can degrade soil permeability where used for irrigation. Nevertheless, a desire to derive beneficial use from the water and a need to dispose of large volumes of it have motivated the design of a deep subsurface drip irrigation (SDI) system capable of utilizing that water. Drip tubing is buried 92 cm deep and irrigates at a relatively constant rate year-round, while evapotranspiration by the alfalfa and grass crops grown is seasonal. We use field data from two sites and computer simulations of unsaturated flow to understand water and solute movements in the SDI fields. Combined irrigation and precipitation exceed potential evapotranspiration by 300-480 mm annually. Initially, excess water contributes to increased storage in the unsaturated zone, and then drainage causes cyclical rises in the water table beneath the fields. Native chloride and nitrate below 200 cm depth are leached by the drainage. Some CBM water moves upward from the drip tubing, drawn by drier conditions above. Chloride from CBM water accumulates there as root uptake removes the water. Year over year accumulations indicated by computer simulations illustrate that infiltration of precipitation water from the surface only partially leaches such accumulations away. Field data show that 7% and 27% of added chloride has accumulated above the drip tubing in an alfalfa and grass field, respectively, following 6 years of irrigation. Maximum chloride concentrations in the alfalfa field are around 45 cm depth but reach the surface in parts of the grass field, illustrating differences driven by crop physiology. Deep SDI offers a means of utilizing marginal quality irrigation waters and managing the accumulation of their associated solutes in the crop rooting zone.

  14. Do Students Develop towards More Deep Approaches to Learning during Studies? A Systematic Review on the Development of Students' Deep and Surface Approaches to Learning in Higher Education

    ERIC Educational Resources Information Center

    Asikainen, Henna; Gijbels, David

    2017-01-01

    The focus of the present paper is on the contribution of the research in the student approaches to learning tradition. Several studies in this field have started from the assumption that students' approaches to learning develop towards more deep approaches to learning in higher education. This paper reports on a systematic review of longitudinal…

  15. Deep Learning for Computer Vision: A Brief Review

    PubMed Central

    Doulamis, Nikolaos; Doulamis, Anastasios; Protopapadakis, Eftychios

    2018-01-01

    Over the last few years, deep learning methods have been shown to outperform previous state-of-the-art machine learning techniques in several fields, with computer vision being one of the most prominent cases. This review paper provides a brief overview of some of the most significant deep learning schemes used in computer vision problems, that is, Convolutional Neural Networks, Deep Boltzmann Machines and Deep Belief Networks, and Stacked Denoising Autoencoders. A brief account of their history, structure, advantages, and limitations is given, followed by a description of their applications in various computer vision tasks, such as object detection, face recognition, action and activity recognition, and human pose estimation. Finally, a brief overview is given of future directions in designing deep learning schemes for computer vision problems and the challenges involved therein. PMID:29487619

  16. Deep-subwavelength imaging of both electric and magnetic localized optical fields by plasmonic campanile nanoantenna

    DOE PAGES

    Caselli, Niccolò; La China, Federico; Bao, Wei; ...

    2015-06-05

    Tailoring the electromagnetic field at the nanoscale has led to artificial materials exhibiting fascinating optical properties unavailable in naturally occurring substances. Besides having fundamental implications for classical and quantum optics, nanoscale metamaterials provide a platform for developing disruptive novel technologies, in which a combination of both the electric and magnetic radiation field components at optical frequencies is relevant to engineer the light-matter interaction. Thus, an experimental investigation of the spatial distribution of the photonic states at the nanoscale for both field components is of crucial importance. Here we experimentally demonstrate a concomitant deep-subwavelength near-field imaging of the electric and magnetic intensities of the optical modes localized in a photonic crystal nanocavity. We take advantage of the "campanile tip", a plasmonic near-field probe that efficiently combines broadband field enhancement with strong far-field to near-field coupling. In conclusion, by exploiting the electric and magnetic polarizability components of the campanile tip along with the perturbation imaging method, we are able to map in a single measurement both the electric and magnetic localized near-field distributions.

  17. Vorticity and Vertical Motions Diagnosed from Satellite Deep-Layer Temperatures. Revised

    NASA Technical Reports Server (NTRS)

    Spencer, Roy W.; Lapenta, William M.; Robertson, Franklin R.

    1994-01-01

    Spatial fields of satellite-measured deep-layer temperatures are examined in the context of quasigeostrophic theory. It is found that midtropospheric geostrophic vorticity and quasigeostrophic vertical motions can be diagnosed from microwave temperature measurements of only two deep layers. The lower- (1000-400 hPa) and upper- (400-50 hPa) layer temperatures are estimated from limb-corrected TIROS-N Microwave Sounding Unit (MSU) channel 2 and 3 data, spatial fields of which can be used to estimate the midtropospheric thermal wind and geostrophic vorticity fields. Together with Trenberth's simplification of the quasigeostrophic omega equation, these two quantities can then be used to estimate the geostrophic vorticity advection by the thermal wind, which is related to the quasigeostrophic vertical velocity in the midtroposphere. Critical to the technique is the observation that geostrophic vorticity fields calculated from the channel 3 temperature features are very similar to those calculated from traditional, "bottom-up" integrated height fields from radiosonde data. This suggests a lack of cyclone-scale height features near the top of the channel 3 weighting function, making the channel 3 cyclone-scale "thickness" features approximately the same as height features near the bottom of the weighting function. Thus, the MSU data provide observational validation of the LID (level of insignificant dynamics) assumption of Hirshberg and Fritsch.
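
    The diagnostic chain in the abstract (two-layer thickness → thermal wind → geostrophic vorticity → vertical motion) can be written compactly in standard quasigeostrophic notation. These are textbook forms, not equations reproduced from the paper:

```latex
% Thermal wind and "thermal vorticity" from the 1000-400 hPa thickness field,
% to which the limb-corrected MSU channel-2 deep-layer temperature is
% hydrostatically proportional:
\mathbf{v}_T = \frac{g}{f_0}\,\hat{\mathbf{k}}\times\nabla\left(Z_{400}-Z_{1000}\right),
\qquad
\zeta_T = \frac{g}{f_0}\,\nabla^{2}\left(Z_{400}-Z_{1000}\right)

% Trenberth's simplification of the quasigeostrophic omega equation:
% mid-tropospheric vertical motion is diagnosed from the advection of
% geostrophic vorticity by the thermal wind, with ascent (omega < 0)
% where the thermal wind advects cyclonic vorticity:
\omega_{500} \propto \mathbf{v}_T\cdot\nabla\zeta_g
```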

  18. The Pan-STARRS 1 Medium Deep Field Variable Star Catalog

    NASA Astrophysics Data System (ADS)

    Flewelling, Heather

    2016-01-01

    We present the first Pan-STARRS 1 Medium Deep Field Variable Star Catalog (PS1-MDF-VSC). The Pan-STARRS 1 (PS1) telescope is a 1.8 meter survey telescope with a 1.4 Gigapixel camera, located on Haleakala, Hawaii. The Medium Deep survey, which consists of 10 fields located uniformly across the sky, totaling 70 square degrees, is observed each night, in 2-3 filters per field, with 8 exposures per filter, resulting in 3000-4000 data points per star over a time span of 3.5 years. To find the variables, we select objects with > 200 detections, and remove those flagged as saturated. No other cuts are used. There are approximately 2.4 million objects that fit this criterion, with magnitudes between 13 and 24. These objects are then passed through a Lomb-Scargle fitting routine to determine periodicity. After a periodicity cut, the candidates are classified by eye into different types of variable stars. We have identified several thousand periodic variable stars, with periods ranging between a few minutes and a few days. We compare our findings to the variable star catalogs within VizieR and AAVSO. In particular, for field MD02, we recover all the variables that are faint in VizieR, and we find good agreement with the periods reported in VizieR.
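
    The Lomb-Scargle periodogram used above is the standard tool for period searches in irregularly sampled light curves. A textbook pure-Python illustration of the classical formula on a synthetic light curve (this is a sketch of the general technique, not the PS1 pipeline):

```python
import math

def lomb_scargle_power(t, y, freq):
    """Classical Lomb-Scargle power at one trial frequency (cycles per day)."""
    omega = 2.0 * math.pi * freq
    ybar = sum(y) / len(y)
    yc = [v - ybar for v in y]
    # The time offset tau makes the cosine and sine sums independent.
    s2 = sum(math.sin(2.0 * omega * ti) for ti in t)
    c2 = sum(math.cos(2.0 * omega * ti) for ti in t)
    tau = math.atan2(s2, c2) / (2.0 * omega)
    cos_t = [math.cos(omega * (ti - tau)) for ti in t]
    sin_t = [math.sin(omega * (ti - tau)) for ti in t]
    yc_cos = sum(v * c for v, c in zip(yc, cos_t))
    yc_sin = sum(v * s for v, s in zip(yc, sin_t))
    return 0.5 * (yc_cos ** 2 / sum(c * c for c in cos_t)
                  + yc_sin ** 2 / sum(s * s for s in sin_t))

# Irregularly sampled noiseless sinusoid with a true period of 0.5 days:
t = [0.173 * i + 0.08 * math.sin(7.0 * i) for i in range(200)]
y = [math.sin(2.0 * math.pi * ti / 0.5) for ti in t]

freqs = [0.2 + 0.01 * k for k in range(500)]   # trial grid, 0.2-5.19 cycles/day
best = max(freqs, key=lambda f: lomb_scargle_power(t, y, f))
period = 1.0 / best                            # recovers ~0.5 days
```

A real search would also need a significance threshold (e.g., a false-alarm probability) before the by-eye classification step described in the abstract.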

  19. The Great Observatories Origins Deep Survey

    NASA Astrophysics Data System (ADS)

    Dickinson, Mark

    2008-05-01

    Observing the formation and evolution of ordinary galaxies at early cosmic times requires data at many wavelengths in order to recognize, separate and analyze the many physical processes which shape galaxies' history, including the growth of large scale structure, gravitational interactions, star formation, and active nuclei. Extremely deep data, covering an adequately large volume, are needed to detect ordinary galaxies in sufficient numbers at such great distances. The Great Observatories Origins Deep Survey (GOODS) was designed for this purpose as an anthology of deep field observing programs that span the electromagnetic spectrum. GOODS targets two fields, one in each hemisphere. Some of the deepest and most extensive imaging and spectroscopic surveys have been carried out in the GOODS fields, using nearly every major space- and ground-based observatory. Many of these data have been taken as part of large, public surveys (including several Hubble Treasury, Spitzer Legacy, and ESO Large Programs), which have produced large data sets that are widely used by the astronomical community. I will review the history of the GOODS program, highlighting results on the formation and early growth of galaxies and their active nuclei. I will also describe new and upcoming observations, such as the GOODS Herschel Key Program, which will continue to fill out our portrait of galaxies in the young universe.

  20. Deep nightside photoelectron observations by MAVEN SWEA: Implications for Martian northern hemispheric magnetic topology and nightside ionosphere source

    NASA Astrophysics Data System (ADS)

    Xu, Shaosui; Mitchell, David; Liemohn, Michael; Dong, Chuanfei; Bougher, Stephen; Fillingim, Matthew; Lillis, Robert; McFadden, James; Mazelle, Christian; Connerney, Jack; Jakosky, Bruce

    2016-09-01

    The Mars Atmosphere and Volatile EvolutioN (MAVEN) mission samples the Mars ionosphere down to altitudes of ~150 km over a wide range of local times and solar zenith angles. On 5 January 2015 (Orbit 520), when the spacecraft was in darkness at high northern latitudes (solar zenith angle, SZA > 120°; latitude > 60°), the Solar Wind Electron Analyzer (SWEA) instrument observed photoelectrons at altitudes below 200 km. Such observations imply the presence of closed crustal magnetic field loops that cross the terminator and extend thousands of kilometers to the deep nightside. This occurs over the weak northern crustal magnetic source regions, where the magnetic field has been thought to be dominated by draped interplanetary magnetic fields (IMF). Such a day-night magnetic connectivity also provides a source of plasma and energy to the deep nightside. Simulations with the SuperThermal Electron Transport (STET) model show that photoelectron fluxes measured by SWEA precipitating onto the nightside atmosphere provide a source of ionization that can account for the O2+ density measured by the Suprathermal and Thermal Ion Composition (STATIC) instrument below 200 km. This finding indicates another channel for Martian energy redistribution to the deep nightside and consequently localized ionosphere patches and potentially aurora.

  1. Van Allen Probes Measurements of Energetic Particle Deep Penetration Into the Low L Region (L < 4) During the Storm on 8 April 2016

    NASA Astrophysics Data System (ADS)

    Zhao, H.; Baker, D. N.; Califf, S.; Li, X.; Jaynes, A. N.; Leonard, T.; Kanekal, S. G.; Blake, J. B.; Fennell, J. F.; Claudepierre, S. G.; Turner, D. L.; Reeves, G. D.; Spence, H. E.

    2017-12-01

    Using measurements from the Van Allen Probes, a penetration event of tens to hundreds of keV electrons and tens of keV protons into the low L shells (L < 4) is studied. Timing and magnetic local time (MLT) differences of energetic particle deep penetration are unveiled and underlying physical processes are examined. During this event, both proton and electron penetrations are MLT asymmetric. The observed MLT difference of proton penetration is consistent with convection of plasma sheet protons, suggesting enhanced convection during geomagnetic active times to be the cause of energetic proton deep penetration during this event. The observed MLT difference of tens to hundreds of keV electron penetration is completely different from tens of keV protons and cannot be well explained by inward radial diffusion, convection of plasma sheet electrons, or transport of trapped electrons by enhanced convection electric field represented by the Volland-Stern model or a uniform dawn-dusk electric field model based on the electric field measurements. It suggests that the underlying physical mechanism responsible for energetic electron deep penetration, which is very important for fully understanding energetic electron dynamics in the low L shells, should be MLT localized.
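
    The L values in this record refer to magnetic drift shells. For a pure dipole field, the McIlwain L of a point follows from its radial distance and magnetic latitude; this is a standard textbook relation, not a computation from the paper:

```python
import math

def dipole_L(r_re, mag_lat_deg):
    """McIlwain L for a pure dipole: L = r / cos^2(lambda_m),
    with r in Earth radii and lambda_m the magnetic latitude."""
    lam = math.radians(mag_lat_deg)
    return r_re / math.cos(lam) ** 2

# A field line crossing the magnetic equator at r = 4 RE is the L = 4 shell:
L_eq = dipole_L(4.0, 0.0)
# The same field line returns to r = 1 RE (the surface) at 60 deg magnetic
# latitude, since cos^2(60 deg) = 1/4:
L_surf = dipole_L(1.0, 60.0)
```

So "penetration into L < 4" describes particles appearing on drift shells whose equatorial crossing lies inside four Earth radii.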

  2. Anomalously deep polarization in SrTiO3 (001) interfaced with an epitaxial ultrathin manganite film

    DOE PAGES

    Wang, Zhen; Tao, Jing; Yu, Liping; ...

    2016-10-17

    Using atomically-resolved imaging and spectroscopy, we reveal a remarkably deep polarization in non-ferroelectric SrTiO3 near its interface with an ultrathin nonmetallic film of La2/3Sr1/3MnO3. Electron holography shows an electric field near the interface in SrTiO3, yielding a surprising spontaneous polarization density of ~21 μC/cm2. Combining the experimental results with first-principles calculations, we propose that the observed deep polarization is induced by the electric field originating from oxygen vacancies that extend beyond a dozen unit cells from the interface, thus providing important evidence of the role of defects in the emergent interface properties of transition metal oxides.

  3. Radio Identification of Millimeter-Bright Galaxies Detected in the AzTEC/ASTE Blank Field Survey

    NASA Astrophysics Data System (ADS)

    Hatsukade, Bunyo; Kohno, Kotaro; White, Glenn; Matsuura, Shuji; Hanami, Hitoshi; Shirahata, Mai; Nakanishi, Kouichiro; Hughes, David; Tamura, Yoichi; Iono, Daisuke; Wilson, Grant; Yun, Min

    2008-10-01

    We propose deep 1.4-GHz imaging of millimeter-bright sources in the AzTEC/ASTE 1.1-mm blank-field survey of AKARI Deep Field-South. The AzTEC/ASTE survey uncovered 37 sources, which are possibly at z > 2. We have obtained multi-wavelength data in this field, but the large beam size of AzTEC/ASTE (30 arcsec) prevents us from identifying counterparts. The aim of this proposal is to identify radio counterparts at higher angular resolution. This enables us (i) to identify optical/IR counterparts, enabling optical spectroscopy to determine precise redshifts, from which we can derive SFRs, luminosity functions, clustering properties, dark matter halo masses, etc.; (ii) to constrain the luminosity evolution of SMGs by comparing 1.4-GHz number counts (and luminosity functions) with luminosity-evolution models; and (iii) to estimate photometric redshifts from the 1.4-GHz and 1.1-mm data using the radio-FIR flux correlation. In the case of non-detection, we can place deep lower limits (3-sigma limit of z > 3). This information leads to the study of the evolutionary history of SMGs, their relationship with other galaxy populations, and their contribution to the cosmic star formation history and the infrared background.

  4. The Pan-STARRS 1 Medium Deep Field Variable Star Catalog

    NASA Astrophysics Data System (ADS)

    Flewelling, Heather

    2015-01-01

    We present the first Pan-STARRS 1 Medium Deep Field Variable Star Catalog (PS1-MDF-VSC). The Pan-STARRS 1 (PS1) telescope is a 1.8-meter survey telescope with a 1.4-gigapixel camera, located on Haleakala, Hawaii. The Medium Deep survey, which consists of 10 fields distributed uniformly across the sky and totalling 70 square degrees, is observed each night in 2-3 filters per field, with 8 exposures per filter, resulting in 3000-4000 data points per star over a time span of 3.5 years. To find the variables, we select stars with > 200 detections between 16th and 21st magnitude. Approximately 500k stars fit these criteria; they then go through a Lomb-Scargle fitting routine to determine periodicity. After a periodicity cut, the ~400 candidates are classified by eye into different types of variable stars. We have identified several hundred variable stars, with periods ranging from a few minutes to a few days, and about half are not previously identified in the literature. We compare our results to the Stripe 82 variable catalog, which overlaps part of the sky covered by the PS1 catalog.
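
    The period search described above can be sketched with a stdlib-only classical Lomb-Scargle periodogram. The catalog's actual fitting routine is not given in the abstract, so the function and variable names below are ours:

    ```python
    import math

    def lomb_scargle(t, y, freqs):
        """Normalized Lomb-Scargle power at each trial frequency (cycles per unit t)."""
        n = len(y)
        mean = sum(y) / n
        var = sum((v - mean) ** 2 for v in y) / (n - 1)
        dy = [v - mean for v in y]
        powers = []
        for f in freqs:
            w = 2.0 * math.pi * f
            # the offset tau makes the result invariant to time-origin shifts
            tau = math.atan2(sum(math.sin(2 * w * ti) for ti in t),
                             sum(math.cos(2 * w * ti) for ti in t)) / (2 * w)
            cs = [math.cos(w * (ti - tau)) for ti in t]
            sn = [math.sin(w * (ti - tau)) for ti in t]
            ct = sum(d * c for d, c in zip(dy, cs))
            st = sum(d * s for d, s in zip(dy, sn))
            p = 0.5 * (ct * ct / sum(c * c for c in cs)
                       + st * st / sum(s * s for s in sn))
            powers.append(p / var)
        return powers
    ```

    A candidate period is then 1/f at the frequency of maximum power; production pipelines additionally compute false-alarm statistics before the by-eye classification stage.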

  6. PHAST Version 2-A Program for Simulating Groundwater Flow, Solute Transport, and Multicomponent Geochemical Reactions

    USGS Publications Warehouse

    Parkhurst, David L.; Kipp, Kenneth L.; Charlton, Scott R.

    2010-01-01

    The computer program PHAST (PHREEQC And HST3D) simulates multicomponent, reactive solute transport in three-dimensional saturated groundwater flow systems. PHAST is a versatile groundwater flow and solute-transport simulator with capabilities to model a wide range of equilibrium and kinetic geochemical reactions. The flow and transport calculations are based on a modified version of HST3D that is restricted to constant fluid density and constant temperature. The geochemical reactions are simulated with the geochemical model PHREEQC, which is embedded in PHAST. Major enhancements in PHAST Version 2 allow spatial data to be defined in a combination of map and grid coordinate systems, independent of a specific model grid (without node-by-node input). At run time, aquifer properties are interpolated from the spatial data to the model grid; regridding requires only redefinition of the grid without modification of the spatial data. PHAST is applicable to the study of natural and contaminated groundwater systems at a variety of scales ranging from laboratory experiments to local and regional field scales. PHAST can be used in studies of migration of nutrients, inorganic and organic contaminants, and radionuclides; in projects such as aquifer storage and recovery or engineered remediation; and in investigations of the natural rock/water interactions in aquifers. PHAST is not appropriate for unsaturated-zone flow, multiphase flow, or density-dependent flow. A variety of boundary conditions are available in PHAST to simulate flow and transport, including specified-head, flux (specified-flux), and leaky (head-dependent) conditions, as well as the special cases of rivers, drains, and wells. 
Chemical reactions in PHAST include (1) homogeneous equilibria using an ion-association or Pitzer specific interaction thermodynamic model; (2) heterogeneous equilibria between the aqueous solution and minerals, ion exchange sites, surface complexation sites, solid solutions, and gases; and (3) kinetic reactions with rates that are a function of solution composition. The aqueous model (elements, chemical reactions, and equilibrium constants), minerals, exchangers, surfaces, gases, kinetic reactants, and rate expressions may be defined or modified by the user. A number of options are available to save results of simulations to output files. The data may be saved in three formats: a format suitable for viewing with a text editor; a format suitable for exporting to spreadsheets and postprocessing programs; and in Hierarchical Data Format (HDF), which is a compressed binary format. Data in the HDF file can be visualized on Windows computers with the program Model Viewer and extracted with the utility program PHASTHDF; both programs are distributed with PHAST.

  7. Real-Time Very High-Resolution Regional 4D Assimilation in Supporting CRYSTAL-FACE Experiment

    NASA Technical Reports Server (NTRS)

    Wang, Donghai; Minnis, Patrick

    2004-01-01

    To better understand tropical cirrus cloud physical properties and formation processes, with a view toward the successful modeling of the Earth's climate, the CRYSTAL-FACE (Cirrus Regional Study of Tropical Anvils and Cirrus Layers - Florida Area Cirrus Experiment) field experiment took place over southern Florida from 1 July to 29 July 2002. During the entire field campaign, a very high-resolution numerical weather prediction (NWP) and assimilation system was run in support of the mission, with supercomputing resources provided by the NASA Center for Computational Sciences (NCCS). Using the NOAA NCEP Eta forecast for boundary conditions and as a first guess for initial conditions assimilated with all available observations, two nested 15/3-km grids were employed over the CRYSTAL-FACE experiment area. The 15-km grid covers the southeastern US domain and is run twice daily for a 36-hour forecast starting at 0000 UTC and 1200 UTC. The nested 3-km grid, covering only southern Florida, is used for 9-hour and 18-hour forecasts starting at 1500 and 0600 UTC, respectively. The forecasting system provided more accurate and higher spatial- and temporal-resolution forecasts of 4-D atmospheric fields over the experiment area than were available from standard weather forecast models. These forecasts were essential for flight planning during both the afternoon prior to a flight day and the morning of a flight day, and were used to help decide takeoff times and the optimal flight areas for accomplishing the mission objectives. More detailed products are available on the web site http://asd-www.larc.nasa.gov/mode/crystal. The model/assimilation output gridded data are archived on the NCCS UniTree system in HDF format at 30-min intervals for real-time forecasts or 5-min intervals for the post-mission case studies. In particular, the data set includes the 3-D cloud fields (cloud liquid water, rain water, cloud ice, snow, and graupel/hail).

  8. Using Gaia as an Astrometric Tool for Deep Ground-based Surveys

    NASA Astrophysics Data System (ADS)

    Casetti-Dinescu, Dana I.; Girard, Terrence M.; Schriefer, Michael

    2018-04-01

    Gaia DR1 positions are used to astrometrically calibrate three epochs' worth of Subaru SuprimeCam images in the fields of globular cluster NGC 2419 and the Sextans dwarf spheroidal galaxy. Distortion-correction ``maps'' are constructed from a combination of offset dithers and reference to Gaia DR1. These are used to derive absolute proper motions in the field of NGC 2419. Notably, we identify the photometrically-detected Monoceros structure in the foreground of NGC 2419 as a kinematically-cold population of stars, distinct from Galactic-field stars. This project demonstrates the feasibility of combining Gaia with deep, ground-based surveys, thus extending high-quality astrometry to magnitudes beyond the limits of Gaia.

  9. A physical data model for fields and agents

    NASA Astrophysics Data System (ADS)

    de Jong, Kor; de Bakker, Merijn; Karssenberg, Derek

    2016-04-01

    Two approaches exist in simulation modeling: agent-based and field-based modeling. In agent-based (or individual-based) simulation modeling, the entities representing the system's state are represented by objects, which are bounded in space and time. Individual objects, like an animal, a house, or a more abstract entity like a country's economy, have properties representing their state. In an agent-based model this state is manipulated. In field-based modeling, the entities representing the system's state are represented by fields. Fields capture the state of a continuous property within a spatial extent, examples of which are elevation, atmospheric pressure, and water flow velocity. With respect to the technology used to create these models, the domains of agent-based and field-based modeling have often been separate worlds. In environmental modeling, widely used logical data models include feature data models for point, line and polygon objects, and the raster data model for fields. Simulation models are often either agent-based or field-based, even though the modeled system might contain both entities that are better represented by individuals and entities that are better represented by fields. We think that the reason for this dichotomy in kinds of models might be that the traditional object and field data models underlying those models are relatively low level. We have developed a higher level conceptual data model for representing both non-spatial and spatial objects, and spatial fields (De Bakker et al. 2016). Based on this conceptual data model we designed a logical and physical data model for representing many kinds of data, including the kinds used in earth system modeling (e.g. hydrological and ecological models). The goal of this work is to be able to create high level code and tools for the creation of models in which entities are representable by both objects and fields. 
Our conceptual data model is capable of representing the traditional feature data models and the raster data model, among many other data models. Our physical data model is capable of storing a first set of kinds of data, like omnipresent scalars, mobile spatio-temporal points and property values, and spatio-temporal rasters. With our poster we will provide an overview of the physical data model expressed in HDF5 and show examples of how it can be used to capture both object- and field-based information. References De Bakker, M, K. de Jong, D. Karssenberg. 2016. A conceptual data model and language for fields and agents. European Geosciences Union, EGU General Assembly, 2016, Vienna.
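
    A stdlib-only sketch of the object/field coupling the abstract argues for might look like the following; the class and function names are ours for illustration, not the De Bakker et al. data model:

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class Raster:
        """A field: one continuous property discretized over a spatial extent."""
        name: str
        origin: tuple      # (x, y) of the lower-left corner
        cell_size: float
        values: list       # row-major 2D grid of property values

        def sample(self, x, y):
            """Nearest-cell lookup of the field value at a point."""
            col = int((x - self.origin[0]) / self.cell_size)
            row = int((y - self.origin[1]) / self.cell_size)
            return self.values[row][col]

    @dataclass
    class Agent:
        """An individual: a bounded object with its own state and location."""
        name: str
        x: float
        y: float
        state: dict = field(default_factory=dict)

    # One model step coupling both representations: each agent reads the
    # field value at its location and stores it as part of its own state.
    def couple(agents, raster):
        for a in agents:
            a.state[raster.name] = raster.sample(a.x, a.y)
    ```

    A model built on such a data model can manipulate individuals and continuous properties in the same simulation loop, which is the dichotomy the authors aim to dissolve.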

  10. 24. EXTERIOR VIEW, SHOWING AIRPLANES IN VERY DEEP SNOW. Photographic ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    24. EXTERIOR VIEW, SHOWING AIRPLANES IN VERY DEEP SNOW. Photographic copy of historic photograph. July-Dec. 1948 OAMA (original print located at Ogden Air Logistics Center, Hill Air Force Base, Utah). Photographer unknown. - Hill Field, Airplane Repair Hangars No. 1-No. 4, 5875 Southgate Avenue, Layton, Davis County, UT

  11. Mechanic: The MPI/HDF code framework for dynamical astronomy

    NASA Astrophysics Data System (ADS)

    Słonina, Mariusz; Goździewski, Krzysztof; Migaszewski, Cezary

    2015-01-01

    We introduce Mechanic, a new open-source code framework. It is designed to reduce the development effort of scientific applications by providing a unified API (Application Programming Interface) for configuration, data storage, and task management. The communication layer is based on the well-established Message Passing Interface (MPI) standard, which is widely used on a variety of parallel computers and CPU clusters. Data storage is performed within the Hierarchical Data Format (HDF5). The design of the code follows a core-module approach, which reduces the user's codebase and makes it portable across single- and multi-CPU environments. The framework may be used in a local user environment, without administrative access to the cluster, under the PBS or Slurm job schedulers. It may become a helper tool for a wide range of astronomical applications, particularly those focused on processing large data sets, such as dynamical studies of the long-term orbital evolution of planetary systems with Monte Carlo methods, dynamical maps, or evolutionary algorithms. It has already been applied in numerical experiments conducted for the Kepler-11 (Migaszewski et al., 2012) and ν Octantis (Goździewski et al., 2013) planetary systems. In this paper we describe the basics of the framework, including code listings for the implementation of a sample user module. The code is illustrated on a model Hamiltonian introduced by Froeschlé et al. (2000) presenting Arnold diffusion. The Arnold web is shown with the help of the MEGNO (Mean Exponential Growth of Nearby Orbits) fast indicator (Goździewski et al., 2008a) applied to the symplectic SABAn integrator family (Laskar and Robutel, 2001).
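
    The core-module split can be illustrated with a stdlib-only sketch in which the "core" farms a grid of tasks out to a user-supplied function, as one would for a dynamical map. This is a conceptual sketch only: Mechanic itself distributes tasks over MPI and stores results in HDF5, and none of the names below come from its API.

    ```python
    # Conceptual core-module task pool (names are illustrative, not
    # Mechanic's API). The "core" owns iteration and result storage;
    # the user "module" supplies only the per-task computation.
    def run_pool(task_fn, nx, ny):
        """Evaluate task_fn over an nx-by-ny grid and collect the results."""
        grid = [[None] * nx for _ in range(ny)]
        for j in range(ny):
            for i in range(nx):
                # one task, e.g. one pixel of a dynamical map
                grid[j][i] = task_fn(i, j)
        return grid

    # A user "module": a trivial stand-in for, e.g., a MEGNO evaluation
    # at initial conditions indexed by (i, j).
    def sample_task(i, j):
        return i + j

    result = run_pool(sample_task, 3, 2)  # 3 columns, 2 rows
    ```

    The point of the split is that swapping `sample_task` for a real integrator changes nothing in the core, which is why the user's codebase stays small.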

  12. Microzooplankton herbivory and community structure in the Amundsen Sea, Antarctica

    NASA Astrophysics Data System (ADS)

    Yang, Eun Jin; Jiang, Yong; Lee, SangHoon

    2016-01-01

    We examined microzooplankton abundance, community structure, and grazing impact on phytoplankton in the Amundsen Sea, Western Antarctica, during the early austral summer from December 2010 to January 2011. Our study area was divided into three regions based on topography, hydrographic properties, and trophic conditions: (1) the Oceanic Zone (OZ), free of sea ice, with low phytoplankton biomass dominated by diatoms; (2) the Sea Ice Zone (SIZ), covered by heavy sea ice, with colder, lower-salinity water and dominated by diatoms; and (3) the Amundsen Sea Polynya (ASP), with high phytoplankton biomass dominated by Phaeocystis antarctica. Microzooplankton biomass and communities varied among regions in association with phytoplankton biomass and composition. Heterotrophic dinoflagellates (HDF) were the most significant grazers in the ASP and OZ, whereas ciliates co-dominated with HDF in the SIZ. Microzooplankton grazing impact was significant in our study area, particularly in the ASP, where microzooplankton consumed 55.4-107.6% of phytoplankton production (average 77.3%), with grazing impact increasing with prey and grazer biomass. This result implies that a significant proportion of the phytoplankton production is not removed by sinking or other grazers but is grazed by microzooplankton. Compared with diatom-based systems, Phaeocystis-based production would be largely remineralized and/or channeled through the microbial food web via microzooplankton grazing. In these waters, the major herbivorous fate of phytoplankton is likely mediated by the microzooplankton population. Our study confirms the importance of herbivorous protists in the planktonic ecosystems of high latitudes. In conclusion, microzooplankton herbivory may be a driving force controlling phytoplankton growth in early summer in the Amundsen Sea, particularly in the ASP.

  13. Curcumin loaded chitin nanogels for skin cancer treatment via the transdermal route

    NASA Astrophysics Data System (ADS)

    Mangalathillam, Sabitha; Rejinold, N. Sanoj; Nair, Amrita; Lakshmanan, Vinoth-Kumar; Nair, Shantikumar V.; Jayakumar, Rangasamy

    2011-12-01

    In this study, curcumin-loaded chitin nanogels (CCNGs) were developed using biocompatible and biodegradable chitin with the anticancer drug curcumin. Chitin, as well as curcumin, is insoluble in water; however, the developed CCNGs form a very good and stable dispersion in water. The CCNGs were analyzed by DLS, SEM and FTIR, and showed spherical particles in a size range of 70-80 nm. The CCNGs showed higher release at acidic pH compared to neutral pH. The cytotoxicity of the nanogels was analyzed on human dermal fibroblast (HDF) and A375 (human melanoma) cell lines, and the results show that CCNGs have specific toxicity toward melanoma in a concentration range of 0.1-1.0 mg mL-1, but less toxicity toward HDF cells. Confocal analysis confirmed the uptake of CCNGs by A375 cells. The apoptotic effect of CCNGs was analyzed by a flow-cytometric assay; the results indicate that CCNGs at the higher end of the cytotoxic concentration range showed apoptosis comparable to that of control curcumin, whereas negligible apoptosis was induced by the control chitin nanogels. The CCNGs showed a 4-fold increase in the steady-state transdermal flux of curcumin compared to that of a control curcumin solution. Histopathology studies of porcine skin samples treated with the prepared materials showed loosening of the horny layer of the epidermis, facilitating penetration with no observed signs of inflammation. These results suggest that the formulated CCNGs offer a specific advantage for the treatment of melanoma, the most common and serious type of skin cancer, by effective transdermal penetration.

  14. Development and evaluation of an algorithm to facilitate drug prescription for inpatients with feeding tubes.

    PubMed

    Lohmann, Kristina; Freigofas, Julia; Leichsenring, Julian; Wallenwein, Chantal Marie; Haefeli, Walter Emil; Seidling, Hanna Marita

    2015-04-01

    We aimed to develop and evaluate an algorithm to facilitate drug switching between primary and tertiary care for patients with feeding tubes. An expert consortium developed an algorithm and applied it manually to 267 preadmission drugs of 46 patients admitted to a surgical ward of a tertiary care university hospital between June 12 and December 2, 2013, who required a feeding tube during their inpatient stay. The new algorithm considered the following principles: drugs should ideally be listed on the hospital drug formulary (HDF), and they should contain the same ingredient rather than a therapeutic equivalent. Preferred dosage forms were appropriate liquids, followed by solid drugs with a liquid administration form, and then solid drugs that could be crushed and/or suspended. Of all evaluated drugs, 83.5% could be switched to suitable drugs listed on the HDF and another 6.0% to drugs available on the German drug market. Additionally, for 4.1% of the drugs, the integration of individual switching rules allowed the switch from enteric-coated to immediate-release drugs. Consequently, 6.4% of the drugs could not be automatically switched and required a case-by-case decision by a clinical professional (e.g., from sustained-release to immediate-release). The predefined principles were successfully integrated in the new algorithm. Thus, the algorithm switched more than 90% of the evaluated preadmission drugs to suitable drugs for inpatients with feeding tubes. This finding suggests that the algorithm can readily be transferred to an electronic format and integrated into a clinical decision support system.
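
    The switching cascade can be sketched as a rule-ordered lookup. The rule order follows the published principles (formulary first, same ingredient, preferred dosage forms, then individual rules), but all attribute names and the catalogs below are our own illustration, not the authors' implementation:

    ```python
    # Tube-compatible dosage forms, best first (illustrative labels).
    FORM_PREFERENCE = ["liquid", "solid_with_liquid_form", "crushable"]

    def switch_drug(drug, hdf_catalog, market_catalog, special_rules):
        """Return an (action, candidate) pair for one preadmission drug."""
        for catalog, action in ((hdf_catalog, "switch_to_hdf"),
                                (market_catalog, "switch_to_market")):
            # same ingredient, not a therapeutic equivalent
            candidates = [d for d in catalog
                          if d["ingredient"] == drug["ingredient"]]
            for form in FORM_PREFERENCE:
                for cand in candidates:
                    if cand["form"] == form:
                        return action, cand
        # individual switching rules, e.g. enteric-coated -> immediate-release
        key = (drug["ingredient"], drug.get("coating"))
        if key in special_rules:
            return "switch_by_rule", special_rules[key]
        # everything else needs a clinical professional
        return "manual_review", None
    ```

    Encoding the cascade this way makes the reported outcome categories (HDF switch, market switch, rule-based switch, manual review) fall out of the control flow directly, which is what would make the transfer to a clinical decision support system straightforward.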

  15. Antimicrobial activity, cytotoxicity and chemical analysis of lemongrass essential oil (Cymbopogon flexuosus) and pure citral.

    PubMed

    Adukwu, Emmanuel C; Bowles, Melissa; Edwards-Jones, Valerie; Bone, Heather

    2016-11-01

    The aim of this study was to determine the antimicrobial effects of lemongrass essential oil (C. flexuosus) and the cytotoxic effects of both test compounds on human dermal fibroblasts. Antimicrobial susceptibility screening was carried out using the disk diffusion method. Antimicrobial resistance was observed in four of five Acinetobacter baumannii strains, with two strains confirmed as multi-drug-resistant (MDR). All the strains tested were susceptible to both lemongrass and citral, with zones of inhibition varying between 17 and 80 mm. The mean minimum inhibitory concentration (MIC) and minimum bactericidal concentration (MBC) of citral (MIC 0.14% and MBC 0.3% v/v) were lower than those of lemongrass (MIC 0.65% and MBC 1.1% v/v), as determined using the microtitre plate method. Cell viability using human dermal fibroblasts (HDF; 106-05a) was determined following exposure to both compounds and a control (grapeseed oil) using the XTT assay, and the IC50 was determined at 0.095% (v/v) for citral and 0.126% (v/v) for lemongrass. Grapeseed oil had no effect on cell viability. Live cell imaging was performed using the LumaScope 500 imaging equipment, and changes in HDF cell morphology such as necrotic features and shrinkage were observed. The ability of lemongrass essential oil (EO) and citral to inhibit and kill MDR A. baumannii highlights their potential for use in the management of drug-resistant infections; however, the in vitro cytotoxicity does suggest further tests are needed before in vivo or ex vivo human exposure.

  16. Six Years of TRMM Precipitation Data at the GES DISC DAAC

    NASA Astrophysics Data System (ADS)

    Rui, H.; Teng, B.; Liu, Z.; Chiu, L.; Hrubiak, P.; Bonk, J.; Lu, L.

    2004-05-01

    The Tropical Rainfall Measuring Mission (TRMM), a joint mission between NASA and the Japan Aerospace Exploration Agency (JAXA), has been acquiring data from shortly after its launch in November 1997 to the present. All TRMM data, including those from the first and, thus far, only space-borne Precipitation Radar (PR), are archived at and distributed by the GES DISC DAAC. As of January 2004, more than six million files, with a total volume of 105 TB, of TRMM data had been distributed to thousands of users from 37 different countries around the world. With the PR, TRMM has been able to produce more accurate measurements of rainfall type, intensity, and three-dimensional distribution, all of which contribute to improved tropical cyclone forecasts and better preparation for hurricanes/typhoons, and to reduction in economic loss. TRMM data have also been widely used for climate, health, environment, agriculture, and interdisciplinary research and applications. The TRMM six-year precipitation climatology is a benchmark for other tropical rainfall measurement, and for estimating tropical contributions to global water and energy cycles. As a data information and services center, the GES DISC DAAC has consistently been providing customer-focused support to the TRMM user community. These include (1) TRMM Data Search and Order System (http://lake.nascom.nasa.gov/data/dataset/TRMM/); (2) online documentation; (3) TRMM HDF Data Read Software (ftp://lake.nascom.nasa.gov/software/trmm_software/Read_HDF/); (4) TRMM Online Visualization and Analysis System (TOVAS, http://lake.nascom.nasa.gov/tovas/); and (5) TRMM data mining (http://daac.gsfc.nasa.gov/hydrology/hd_datamin_intro.shtml).

  17. Black rice (Oryza sativa L.) extract modulates ultraviolet-induced expression of matrix metalloproteinases and procollagen in a skin cell model.

    PubMed

    Han, Mira; Bae, Jung-Soo; Ban, Jae-Jun; Shin, Hee Soon; Lee, Dong Hun; Chung, Jin Ho

    2018-05-01

    Exposure of the skin to ultraviolet (UV) radiation causes extracellular matrix (ECM) collapse in the dermis, owing to an increase in matrix metalloproteinase (MMP) production in both the epidermis and dermis, and a decrease in type I collagen expression in the dermis. Recently, black rice (Oryza sativa L.) was reported to have a wide range of pharmacological effects in various settings; however, the effects of black rice extract (BRE) on UV-irradiated skin cells have not yet been characterized. BRE treatment did not affect the cell morphology or viability of HaCaT cells and human dermal fibroblasts (HDF). We demonstrated that BRE downregulated basal and UV-induced MMP-1 expression in HaCaT cells. Furthermore, BRE significantly increased type I procollagen expression, and decreased MMP-1 and MMP-3 expression, in UV-irradiated HDF. The underlying mechanisms of these results involve a decrease in p38 and c-Jun N-terminal kinase activity, and suppression of UV-induced activation of activator protein-1 (AP-1). BRE reduced UV-induced reactive oxygen species production in HaCaT cells in a dose-dependent manner. Indeed, mass spectrometry revealed that BRE contains antioxidative flavonoid components such as cyanidin-3-O-β-D-glycoside and taxifolin-7-O-glucoside. These findings suggest that BRE attenuates UV-induced ECM damage by modulating mitogen-activated protein kinase and AP-1 signaling, and could be used as an active ingredient for preventing photoaging of the skin.

  18. The effect of red-allotrope selenium nanoparticles on head and neck squamous cell viability and growth

    PubMed Central

    Hassan, Christopher E; Webster, Thomas J

    2016-01-01

    Given their low toxicity and natural presence in the human diet, selenium nanoparticles have been established as potential candidates for the treatment of numerous cancers. Red-allotrope selenium nanoparticles (rSeNPs) were synthesized and characterized in this study. Head and neck squamous cell carcinoma (HNSCC) and human dermal fibroblast (HDF) cells were cultured and exposed to rSeNPs at concentrations ranging from 0.01 to 100 μg rSeNP/mL media for 1–3 days. The toxicity of rSeNP toward HNSCC and HDFs was analyzed. Results indicated that the particles were approximately four times as cytotoxic toward HNSCC compared to HDFs, with their respective IC50 values at 19.22 and 59.61 μg rSeNP/mL media. Using statistical analysis, an effective dosage range for killing HNSCC cells while simultaneously minimizing damage to HDFs over a 3-day incubation period was established at 20–55 μg rSeNP/mL media. Observations showed that doses of rSeNP <5 μg rSeNP/mL media resulted in cell proliferation. Transmission electron microscopy images of HNSCC and HDF cells, both treated with rSeNPs, revealed that the rSeNPs became localized in the cytoplasm near the lysosomes and mitochondria. Analysis of cell morphology showed that the rSeNPs primarily induced HNSCC apoptosis. Collectively, these results indicated that rSeNPs are a promising option for treating HNSCC without adversely affecting healthy cells and without resorting to the use of harmful chemotherapeutics. PMID:27536104
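
    IC50 values like those reported above are commonly read off a dose-response curve. The sketch below interpolates the 50%-viability dose in log-dose space between the two measured doses that bracket it; the doses and viabilities used as examples are invented for illustration and are not the study's data.

    ```python
    import math

    def ic50(doses, viability_pct):
        """Log-linear interpolation of the dose giving 50% viability.

        doses must be ascending; viability_pct is the measured % viability
        at each dose. Returns None if 50% is never crossed.
        """
        pairs = list(zip(doses, viability_pct))
        for (d0, v0), (d1, v1) in zip(pairs, pairs[1:]):
            if v0 >= 50.0 >= v1 and v0 != v1:
                f = (v0 - 50.0) / (v0 - v1)  # fraction of the bracketing interval
                log_d = math.log10(d0) + f * (math.log10(d1) - math.log10(d0))
                return 10 ** log_d
        return None
    ```

    Interpolating in log-dose space rather than linear dose reflects the usual sigmoidal shape of dose-response curves plotted against log concentration; a full analysis would fit a four-parameter logistic model instead.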

  19. Effectiveness of a Crocus sativus Extract on Burn Wounds in Rats.

    PubMed

    Alemzadeh, Esmat; Oryan, Ahmad

    2018-05-23

    Crocus sativus is a spice with various pharmacological properties. Crocin, picrocrocin, and safranal are the main constituents of saffron that have recently been considered in the therapy of many diseases; high-performance liquid chromatography analysis confirmed the presence of these compounds in our saffron extract. This study was carried out to evaluate the effect of saffron on burn wound healing in an in vivo model. Saffron was topically applied to burn wounds in rats, and the percentage of wound closure, wound contraction, and the levels of the main cytokines and growth factors were measured. The saffron extract was also applied to evaluate the proliferation and migration of human dermal fibroblast (HDF) cells using an in vitro scratch assay, and it promoted active proliferation and migration of the HDF cells in a dose-dependent manner. Clearly enhanced healing was observed in the saffron-treated wounds compared to the silver sulfadiazine and negative control groups. Decreased expression of interleukin-1β and transforming growth factor-β1 (TGF-β1) during the inflammatory phase demonstrated the role of saffron in promoting wound healing. In addition, enhanced TGF-β1 expression during the proliferative phase and basic fibroblast growth factor expression during the remodeling phase represented the regenerative and anti-scarring roles of saffron, respectively. Our histological and biochemical findings also confirmed that saffron significantly stimulated burn wound healing by modulating the healing phases. Therefore, saffron can be an optimal option for promoting skin repair and regeneration, and application of this herbal medicinal drug should be encouraged because of its availability and negligible side effects. Georg Thieme Verlag KG Stuttgart · New York.

  20. An Adaptable Seismic Data Format

    NASA Astrophysics Data System (ADS)

    Krischer, Lion; Smith, James; Lei, Wenjie; Lefebvre, Matthieu; Ruan, Youyi; de Andrade, Elliott Sales; Podhorszki, Norbert; Bozdağ, Ebru; Tromp, Jeroen

    2016-11-01

    We present ASDF, the Adaptable Seismic Data Format, a modern and practical data format for all branches of seismology and beyond. The growing volume of freely available data, coupled with ever-expanding computational power, opens avenues to tackle larger and more complex problems. Current bottlenecks include inefficient resource usage and insufficient data organization. Properly scaling a problem requires the resolution of both these challenges, and existing data formats are no longer up to the task. ASDF stores any number of synthetic, processed, or unaltered waveforms in a single file. A key improvement compared to existing formats is the inclusion of comprehensive meta information, such as event or station information, in the same file. Additionally, it is also usable for any non-waveform data, for example, cross-correlations, adjoint sources, or receiver functions. Last but not least, full provenance information can be stored alongside each item of data, thereby enhancing reproducibility and accountability. Any data set in our proposed format is self-describing and can be readily exchanged with others, facilitating collaboration. The utilization of the HDF5 container format grants efficient and parallel I/O operations, integrated compression algorithms, and checksums to guard against data corruption. To avoid reinventing the wheel and to build upon past developments, we use existing standards like QuakeML, StationXML, W3C PROV, and HDF5 wherever feasible. Usability and tool support are crucial for any new format to gain acceptance; we have developed mature C/Fortran and Python based APIs coupling ASDF to the widely used SPECFEM3D_GLOBE and ObsPy toolkits.
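
    The self-describing idea — waveforms, station metadata, and provenance traveling together, guarded by a checksum — can be shown with a stdlib-only sketch. Real ASDF uses an HDF5 container with QuakeML/StationXML/W3C PROV; the field names below are ours:

    ```python
    import hashlib
    import json

    def make_record(station, samples, provenance):
        """Bundle waveform samples with metadata and provenance, plus a checksum."""
        payload = {"station": station, "samples": samples,
                   "provenance": provenance}
        blob = json.dumps(payload, sort_keys=True).encode()
        payload["sha256"] = hashlib.sha256(blob).hexdigest()
        return payload

    def verify(record):
        """Recompute the checksum over everything except the stored digest."""
        body = {k: v for k, v in record.items() if k != "sha256"}
        blob = json.dumps(body, sort_keys=True).encode()
        return hashlib.sha256(blob).hexdigest() == record["sha256"]
    ```

    Because the metadata and provenance are inside the checksummed record, a consumer can detect corruption or silent edits to either the data or its description, which is the accountability property the format aims for.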

Top