Science.gov

Sample records for luminance mapping technique

  1. Evaluation of High Dynamic Range Photography as a Luminance Mapping Technique

    SciTech Connect

    Inanici, Mehlika; Galvin, Jim

    2004-12-30

    The potential, limitations, and applicability of the High Dynamic Range (HDR) photography technique are evaluated as a luminance mapping tool. Multiple exposure photographs of static scenes are taken with a Nikon 5400 digital camera to capture the wide luminance variation within the scenes. The camera response function is computationally derived using the Photosphere software, and is used to fuse the multiple photographs into HDR images. The vignetting effect and point spread function of the camera and lens system are determined. Laboratory and field studies have shown that the pixel values in the HDR photographs can correspond to the physical quantity of luminance with reasonable precision and repeatability.
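
    As a hedged illustration of the workflow described (bracketed exposures, a recovered camera response function, fusion into an HDR radiance map, and a final luminance calibration), the sketch below uses OpenCV's Debevec routines in place of the Photosphere software named in the abstract; the synthetic exposures, the Rec. 709 luminance weights and the scale factor k are illustrative assumptions, not the authors' pipeline.

```python
import cv2
import numpy as np

# Synthetic stand-ins for bracketed exposures of a static scene; in practice these
# would be the multiple-exposure photographs taken with the camera.
exposure_times = np.array([1/250, 1/60, 1/15, 1/4, 1.0], dtype=np.float32)
scene = np.random.default_rng(0).uniform(0.01, 50.0, (240, 320, 3)).astype(np.float32)
images = [np.clip(scene * t * 255.0, 0, 255).astype(np.uint8) for t in exposure_times]

calibrate = cv2.createCalibrateDebevec()               # recover the camera response function
response = calibrate.process(images, exposure_times)

merge = cv2.createMergeDebevec()                       # fuse the exposures into a radiance map
hdr = merge.process(images, exposure_times, response)  # float32 BGR, relative radiance

# Relative luminance from linear RGB (Rec. 709 weights), scaled to cd/m^2 with a single
# factor k that would normally be fitted against a spot luminance-meter reading.
b, g, r = cv2.split(hdr)
k = 1.0                                                # hypothetical calibration factor
luminance_map = k * (0.2126 * r + 0.7152 * g + 0.0722 * b)
```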

  2. Mapping luminance onto lightness in vision and art

    NASA Astrophysics Data System (ADS)

    Gilchrist, Alan

    2013-03-01

    Most natural images span a large range of luminance values, often a thousand-to-one and sometimes a million-to-one (Heckaman and Fairchild, 2009). This luminance range must be mapped by the visual system onto a scale of perceived gray shades (called lightness) with a range of roughly thirty-to-one (90% to 3% reflectance). For the painter who wants to represent this scene on a canvas, the challenge is a bit different. The high-range scene must be represented using pigments with a range of only thirty-to-one. Let's begin with vision. Even without a high range in the scene, understanding the mapping of luminance onto lightness has proven elusive. But we can think of the problem as having two parts: Anchoring and scaling.
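
    To make the anchoring-plus-scaling idea concrete, the sketch below anchors the highest scene luminance to white and maps relative luminance onto a lightness-like scale with the CIE L* formula; this is only a convenient stand-in for the perceptual mapping discussed in the abstract, not the author's model.

```python
import numpy as np

def cie_lstar(Y, Y_white):
    """Map luminance Y to CIE L* (0-100), anchoring Y_white as white."""
    t = np.asarray(Y, dtype=float) / Y_white
    return np.where(t > (6 / 29) ** 3, 116.0 * np.cbrt(t) - 16.0, (29 / 3) ** 3 * t)

# A scene spanning a 1000:1 luminance range (cd/m^2); the 90% and 3% reflectances
# quoted in the abstract correspond to roughly L* 96 and L* 20.
scene = np.array([10000.0, 1000.0, 100.0, 10.0])
print(cie_lstar(scene, Y_white=scene.max()))
```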

  3. Nonlinear mapping of the luminance in dual-layer high dynamic range displays

    NASA Astrophysics Data System (ADS)

    Guarnieri, Gabriele; Ramponi, Giovanni; Bonfiglio, Silvio; Albani, Luigi

    2009-02-01

    It has long been known that the human visual system (HVS) has a nonlinear response to luminance. This nonlinearity can be quantified using the concept of just noticeable difference (JND), which represents the minimum amplitude of a specified test pattern an average observer can discern from a uniform background. The JND depends on the background luminance following a threshold versus intensity (TVI) function. It is possible to define a curve which maps physical luminances into a perceptually linearized domain. This mapping can be used to optimize a digital encoding, by minimizing the visibility of quantization noise. It is also commonly used in medical applications to display images adapting to the characteristics of the display device. High dynamic range (HDR) displays, which are beginning to appear on the market, can display luminance levels outside the range in which most standard mapping curves are defined. In particular, dual-layer LCD displays are able to extend the gamut of luminance offered by conventional liquid crystals towards the black region; in such areas suitable and HVS-compliant luminance transformations need to be determined. In this paper we propose a method, which is primarily targeted to the extension of the DICOM curve used in medical imaging, but also has a more general application. The method can be modified in order to compensate for the ambient light, which can be significantly greater than the black level of an HDR display and consequently reduce the visibility of the details in dark areas.
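
    The perceptual-linearization step can be pictured as building a lookup table whose successive digital levels are one JND apart in luminance. The Weber-like threshold model below is a deliberately crude stand-in for the DICOM GSDF / TVI curves the paper extends, and the numeric constants are assumptions for illustration only.

```python
import numpy as np

def jnd_step(L, weber_fraction=0.01, floor=1e-4):
    """Crude JND model: detection threshold at background luminance L (cd/m^2)."""
    return max(weber_fraction * L, floor)

def build_lut(L_min, L_max, n_levels):
    """Luminance per digital level, spaced one modelled JND apart, resampled to n_levels."""
    levels = [L_min]
    while levels[-1] < L_max:
        levels.append(levels[-1] + jnd_step(levels[-1]))
    jnd_index = np.linspace(0, len(levels) - 1, n_levels)
    return np.interp(jnd_index, np.arange(len(levels)), levels)

lut = build_lut(L_min=0.01, L_max=500.0, n_levels=256)   # e.g. an 8-bit display
```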

  4. C IV and C III] reverberation mapping of the luminous quasar PG 1247+267

    SciTech Connect

    Trevese, D.; Saturni, F. G.; Perna, M.; Dadina, M.

    2014-11-10

    So far the masses of about 50 active galactic nuclei (AGNs) have been measured through the reverberation mapping (RM) technique. Most measurements have been performed for objects of moderate luminosity and redshift, based on Hβ, which is also used to calibrate the scaling relation that allows single-epoch (SE) mass determination based on AGN luminosity and the width of different emission lines. Due to the complex structure and gas dynamics of the relevant emission region, the SE masses obtained from the C IV(1549 Å) line show a large spread around the mean values. Direct RM measures of C IV exist for only six AGNs of low luminosity and redshift, and only one luminous quasar. Since 2003, we have collected photometric and spectroscopic observations of PG 1247+267, the most luminous quasar ever analyzed for RM. We provide light curves for the continuum and for C IV(1549 Å) and C III](1909 Å), and measures of the reverberation time lags based on the SPEAR method. The sizes of the line emission regions are in a ratio R(C III])/R(C IV) ∼ 2, similar to the case of Seyfert galaxies, indicating for the first time a similar ionization stratification in a luminous quasar and in low-luminosity nuclei. Due to the relatively small size of the broad line region and the relatively narrow line widths, we estimate a small mass and an anomalously high Eddington ratio. We discuss the possibility that either the shape of the emission region or an amplification of the luminosity caused by gravitational lensing may be partly responsible for the result.
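
    For orientation, the virial mass estimate that reverberation mapping enables is M_BH = f R (ΔV)^2 / G with R = c τ; the short sketch below evaluates it for placeholder numbers, which are not the lag, line width or virial factor measured for PG 1247+267.

```python
# Back-of-the-envelope virial mass from a reverberation lag.
G = 6.674e-11          # m^3 kg^-1 s^-2
c = 2.998e8            # m s^-1
M_sun = 1.989e30       # kg

tau_days = 100.0       # hypothetical rest-frame lag of an emission line
dv_kms = 3000.0        # hypothetical line velocity width
f = 1.0                # virial factor, order unity and calibration dependent

R = c * tau_days * 86400.0                   # emission-region radius (m)
M_bh = f * R * (dv_kms * 1e3) ** 2 / G       # kg
print(f"M_BH ~ {M_bh / M_sun:.1e} solar masses")
```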

  5. Covariance mapping techniques

    NASA Astrophysics Data System (ADS)

    Frasinski, Leszek J.

    2016-08-01

    Recent technological advances in the generation of intense femtosecond pulses have made covariance mapping an attractive analytical technique. The laser pulses available are so intense that often thousands of ionisation and Coulomb explosion events will occur within each pulse. To understand the physics of these processes the photoelectrons and photoions need to be correlated, and covariance mapping is well suited for operating at the high counting rates of these laser sources. Partial covariance is particularly useful in experiments with x-ray free electron lasers, because it is capable of suppressing pulse fluctuation effects. A variety of covariance mapping methods is described: simple, partial (single- and multi-parameter), sliced, contingent and multi-dimensional. The relationship to coincidence techniques is discussed. Covariance mapping has been used in many areas of science and technology: inner-shell excitation and Auger decay, multiphoton and multielectron ionisation, time-of-flight and angle-resolved spectrometry, infrared spectroscopy, nuclear magnetic resonance imaging, stimulated Raman scattering, directional gamma ray sensing, welding diagnostics and brain connectivity studies (connectomics). This review gives practical advice for implementing the technique and interpreting the results, including its limitations and instrumental constraints. It also summarises recent theoretical studies, highlights unsolved problems and outlines a personal view on the most promising research directions.
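
    The basic covariance map, and its single-parameter partial variant, reduce to a few lines of numpy: C(x, y) = <X(x)X(y)> - <X(x)><X(y)>, with the partial form subtracting the component correlated with a fluctuating parameter I. The synthetic Poisson "spectra" and the total-yield proxy for I below are illustrative assumptions, not an FEL data set.

```python
import numpy as np

rng = np.random.default_rng(0)
n_shots, n_bins = 2000, 256
X = rng.poisson(lam=5.0, size=(n_shots, n_bins)).astype(float)  # shot-by-shot spectra

mean = X.mean(axis=0)
cov_map = X.T @ X / n_shots - np.outer(mean, mean)              # simple covariance map

# Single-parameter partial covariance: remove fluctuations driven by a common
# parameter I (here a crude total-yield proxy for pulse intensity).
I = X.sum(axis=1)
cov_xi = (X * I[:, None]).mean(axis=0) - mean * I.mean()
pcov_map = cov_map - np.outer(cov_xi, cov_xi) / I.var()
```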

  6. Luminance and contrast ideal balancing based tone mapping algorithm

    NASA Astrophysics Data System (ADS)

    Besrour, Amine; Abdelkefi, Fatma; Siala, Mohamed; Snoussi, Hichem

    2015-09-01

    Tone mapping remains a challenge for HDR researchers, and it is important because it determines the display quality offered to the end user. This paper details the design of a recent tone mapping operator used in high dynamic range imaging systems. The proposed operator is a local method that uses an adaptive factor combining the average neighbouring contrast and the brightness difference. Thanks to this, the solution provides good results with better brightness, contrast, and visibility, without producing undesired artifacts or shadow effects.

  7. Does luminance-contrast contribute to a saliency map for overt visual attention?

    PubMed

    Einhäuser, Wolfgang; König, Peter

    2003-03-01

    In natural environments, humans select a subset of visual stimuli by directing their gaze to attended locations. Previous studies have found that luminance-contrast is higher than average at fixation points. This led to the hypothesis that luminance-contrast makes a major contribution to a saliency map of visual overt attention, consistent with a computation of stimulus saliency in early visual cortical areas. We re-evaluate this hypothesis by using natural and modified natural images to uncover the causal effect of luminance-contrast on human overt visual attention: (i) we confirm that when viewing natural images, contrasts are elevated at fixation points; this, however, only holds for low spatial frequencies and in a limited temporal window after stimulus onset; (ii) despite this correlation between overt attention and luminance-contrast, moderate modifications of contrast in natural images do not measurably affect the selection of fixation points; furthermore, strong local reductions of luminance-contrast do not repel but attract fixation; (iii) neither contrast nor contrast modification is correlated with fixation duration; and (iv) even the moderate contrast modifications used fall within the physiologically relevant range, and subjects are well able to detect them in a forced-choice paradigm. In summary, no causal contribution of luminance-contrast to a saliency map of human overt attention is detectable. In conjunction with recent results on the relation of contrast sensitivity of neuronal activity to the level in the visual cortical hierarchy, the present study provides evidence that, for natural scenes, saliency is computed not early but late during processing.
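
    The core analysis can be sketched as comparing local luminance contrast at fixated locations with contrast at control locations; the patch size, the RMS-contrast definition and the synthetic "image" and fixation coordinates below are illustrative choices, not those of the study.

```python
import numpy as np

def local_rms_contrast(image, x, y, half=16):
    patch = image[max(y - half, 0):y + half, max(x - half, 0):x + half]
    return patch.std() / (patch.mean() + 1e-9)

rng = np.random.default_rng(1)
image = rng.random((480, 640))                        # stand-in for a natural image
fixations = [(320, 240), (100, 120)]                  # hypothetical fixation points (x, y)
controls = [(int(rng.integers(16, 624)), int(rng.integers(16, 464))) for _ in fixations]

fix_c = np.mean([local_rms_contrast(image, x, y) for x, y in fixations])
ctl_c = np.mean([local_rms_contrast(image, x, y) for x, y in controls])
print(f"contrast at fixations: {fix_c:.3f}  at controls: {ctl_c:.3f}")
```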

  8. A methodology for luminance map retrieval using airborne hyperspectral and photogrammetric data

    NASA Astrophysics Data System (ADS)

    Pipia, Luca; Alamús, Ramon; Tardà, Anna; Pérez, Fernando; Palà, Vicenç; Corbera, Jordi

    2014-10-01

    This paper puts forward a methodology developed at the Institut Cartogràfic i Geològic de Catalunya (ICGC) to quantify upwelling light flux using hyperspectral and photogrammetric airborne data. The work was carried out in the frame of a demonstrative study requested by the municipality of Sant Cugat del Vallès, in the vicinity of Barcelona (Spain), and aimed to devise a new approach for assessing artificial lighting policies and actions as an alternative to field campaigns. Hyperspectral and high-resolution multispectral/panchromatic data were acquired simultaneously over urban areas. In order to avoid moonlight contributions, data were acquired during the first days of the new moon phase. Hyperspectral data were radiometrically calibrated. Then, National Center for Environmental Prediction (NCEP) atmospheric profiles were employed to estimate the actual Column Water Vapor (CWV) to be passed to ModTran5.0 for the atmospheric transmissivity τ calculation. At-the-ground radiance was finally integrated using the photopic sensitivity curve to generate a luminance map (cd/m²) of the flown area by mosaicking the different flight tracks. To improve the spatial resolution and enhance the dynamic range of the luminance map, a sensor-fusion strategy was then investigated. DMC photogrammetric data acquired simultaneously with the hyperspectral information were converted into at-the-ground radiance and upscaled to CASI spatial resolution. High-resolution (HR) luminance maps with enhanced dynamic range were finally generated by linearly fitting up-scaled DMC mosaics to the CASI-based luminance information. In the end, a preliminary assessment of the methodology is carried out using non-simultaneous in-situ measurements.
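
    The radiance-to-luminance step mentioned (integrating at-the-ground spectral radiance against the photopic sensitivity curve) is sketched below; the Gaussian stand-in for V(lambda), the band grid and the uniform synthetic radiance are assumptions, not the calibrated CASI data or the CIE tabulation the authors would have used.

```python
import numpy as np

wavelength_nm = np.linspace(400, 720, 161)                 # hyperspectral band centres
V = np.exp(-0.5 * ((wavelength_nm - 555.0) / 45.0) ** 2)   # crude photopic curve approximation

# Spectral radiance in W m^-2 sr^-1 nm^-1 per pixel and band (synthetic, uniform here).
spectral_radiance = np.full((100, 100, wavelength_nm.size), 1e-3)

# Luminance in cd/m^2: 683 lm/W times the V(lambda)-weighted spectral integral.
luminance = 683.0 * np.trapz(spectral_radiance * V, wavelength_nm, axis=-1)
```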

  9. Reverberation Mapping of the most luminous Quasars at z~2-3.

    NASA Astrophysics Data System (ADS)

    Lira, Paulina; Botti, Ismael; Netzer, Hagai; Kaspi, Shai

    2016-08-01

    Reverberation Mapping (RM) provides the only tool to determine black hole (BH) masses directly, but so far this method has been applied only to small and intermediate luminosity systems (L<10^46 ergs/s). We are extending these studies by two orders of magnitude, probing the BH masses of luminous AGN at redshift 2-3, obtaining measurements of the largest BHs and extending our knowledge of the physics of AGN and their hosts into the most crucial epoch in galaxy evolution. Since 2005 we have been monitoring very luminous quasars using broad-band imaging with the SMARTS telescopes, and in 2007 we started the RM campaign of our most variable targets, obtaining spectroscopic follow-up with the du Pont telescope. After 10 years of data gathering we are finally able to report on the RM for Quasar J221516 as well as several other systems that should yield results in the near future.

  10. Infrared and color visible image fusion system based on luminance-contrast transfer technique

    NASA Astrophysics Data System (ADS)

    Wang, Bo; Gong, Wenfeng; Wang, Chensheng

    2012-12-01

    In this paper, an infrared and color visible image fusion algorithm based on the luminance-contrast transfer technique is presented. The algorithm applies a YCbCr transform to the color visible image to obtain its luminance component. Grey-scale image fusion methods are then used to fuse the luminance component of the visible image with the infrared image into a grey-scale fusion image. After that, the grey-scale fusion image and the visible image are combined into a color fusion image through the inverse YCbCr transform. To achieve a more natural appearance of details, a color transfer fusion algorithm based on a reference image is proposed. Furthermore, a real-time infrared/visible image fusion system based on an FPGA is realized. Finally, the design is verified experimentally, and the experimental results show that the system can produce a color fusion image with good image quality and real-time performance.
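
    The luminance-replacement scheme described can be sketched as: take the luminance channel of the colour image, fuse it with the infrared image in grey-scale, and invert the transform. OpenCV's YCrCb conversion (same components as YCbCr, different channel order) and a plain weighted average are used below as stand-ins for the system's actual fusion rule; the random input images are placeholders.

```python
import cv2
import numpy as np

rng = np.random.default_rng(1)
visible = rng.integers(0, 256, (240, 320, 3), dtype=np.uint8)   # stand-in colour image (BGR)
infrared = rng.integers(0, 256, (240, 320), dtype=np.uint8)     # stand-in infrared image

ycrcb = cv2.cvtColor(visible, cv2.COLOR_BGR2YCrCb)
y, cr, cb = cv2.split(ycrcb)

# Grey-scale fusion of the luminance channel with the IR image (placeholder rule).
fused_y = cv2.addWeighted(y, 0.5, infrared, 0.5, 0)

color_fusion = cv2.cvtColor(cv2.merge([fused_y, cr, cb]), cv2.COLOR_YCrCb2BGR)
```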

  11. New technique maps bioremediation

    NASA Astrophysics Data System (ADS)

    Showstack, Randy

    Before researchers can effectively harness microbial populations for bioremediation, they first need to understand why some microbes are attracted to specific minerals. A new tool for studying the feasibility of in situ bioremediation has now been shown to be able to map mineral crystals and bacterial growth on basalt, according to Mary Kauffman, a geo-microbiologist with the U.S. Department of Energy's Idaho National Engineering and Environmental Laboratory (INEEL) in Idaho Falls. Kauffman described her research on December 11 at the 2001 AGU Fall Meeting.

  12. Visuoauditory mappings between high luminance and high pitch are shared by chimpanzees (Pan troglodytes) and humans.

    PubMed

    Ludwig, Vera U; Adachi, Ikuma; Matsuzawa, Tetsuro

    2011-12-20

    Humans share implicit preferences for certain cross-sensory combinations; for example, they consistently associate higher-pitched sounds with lighter colors, smaller size, and spikier shapes. In the condition of synesthesia, people may experience such cross-modal correspondences to a perceptual degree (e.g., literally seeing sounds). So far, no study has addressed the question whether nonhuman animals share cross-modal correspondences as well. To establish the evolutionary origins of cross-modal mappings, we tested whether chimpanzees (Pan troglodytes) also associate higher pitch with higher luminance. Thirty-three humans and six chimpanzees were required to classify black and white squares according to their color while hearing irrelevant background sounds that were either high-pitched or low-pitched. Both species performed better when the background sound was congruent (high-pitched for white, low-pitched for black) than when it was incongruent (low-pitched for white, high-pitched for black). An inherent tendency to pair high pitch with high luminance hence evolved before the human lineage split from that of chimpanzees. Rather than being a culturally learned or a linguistic phenomenon, this mapping constitutes a basic feature of the primate sensory system.

  13. A MAP OF THE INTEGRATED SACHS-WOLFE SIGNAL FROM LUMINOUS RED GALAXIES

    SciTech Connect

    Granett, Benjamin R.; Neyrinck, Mark C.; Szapudi, Istvan

    2009-08-10

    We construct a map of the time derivative of the gravitational potential traced by Sloan Digital Sky Survey luminous red galaxies (LRGs). The potential decays on large scales due to cosmic acceleration, leaving an imprint on cosmic microwave background (CMB) radiation through the integrated Sachs-Wolfe (ISW) effect. With a template fit, we directly measure this signature on the CMB at a 2σ confidence level. The measurement is consistent with the cross-correlation statistic, strengthening the claim that dark energy is indeed the cause of the correlation. This new approach potentially simplifies the cosmological interpretation. Our constructed linear ISW map shows no evidence for degree-scale cold and hot spots associated with supervoid and supercluster structures. This suggests that the linear ISW effect in a concordance ΛCDM cosmology is insufficient to explain the strong CMB imprints from these structures that we previously reported.

  14. Brain Friendly Techniques: Mind Mapping

    ERIC Educational Resources Information Center

    Goldberg, Cristine

    2004-01-01

    Mind Mapping can be called the Swiss Army Knife for the brain, a total visual thinking tool or a multi-handed thought catcher. Invented by Tony Buzan in the early 1970s and used by millions around the world, it is a method that can be a part of a techniques repertoire when teaching information literacy, planning, presenting, thinking, and so…

  16. Mapping luminous blue compact galaxies with VIRUS-P. Morphology, line ratios, and kinematics

    NASA Astrophysics Data System (ADS)

    Cairós, L. M.; Caon, N.; García Lorenzo, B.; Kelz, A.; Roth, M.; Papaderos, P.; Streicher, O.

    2012-11-01

    Context. Blue compact galaxies (BCG) are narrow emission-line systems that undergo a violent burst of star formation. They are compact, low-luminosity galaxies, with blue colors and low chemical abundances, which offer us a unique opportunity to investigate collective star formation and its effects on galaxy evolution in a relatively simple, dynamically unperturbed environment. Spatially resolved spectrophotometric studies of BCGs are essential for a better understanding of the role of starburst-driven feedback processes on the kinematical and chemical evolution of low-mass galaxies near and far. Aims: We carry out an integral field spectroscopy (IFS) study of a sample of luminous BCGs, with the aim to probe the morphology, kinematics, dust extinction, and excitation mechanisms of their warm interstellar medium (ISM). Methods: We obtained IFS data for five luminous BCGs with VIRUS-P, the prototype instrument for the Visible Integral Field Replicable Unit Spectrograph, attached to the 2.7 m Harlan J. Smith Telescope at the McDonald Observatory. VIRUS-P consists of a square array of 247 optical fibers, which covers a 109″ × 109″ field of view, with a spatial sampling of 4.2″ and a 0.3 filling factor. We observed in the 3550-5850 Å spectral range, with a resolution of 5 Å FWHM. From these data we built two-dimensional maps of the continuum and the most prominent emission lines ([O ii] λ3727, Hγ, Hβ and [O iii] λ5007), and investigated the morphology of diagnostic emission-line ratios and the extinction patterns in the ISM as well as stellar and gas kinematics. Additionally, from integrated spectra we inferred total line fluxes and luminosity-weighted extinction coefficients and gas-phase metallicities. Results: All galaxies exhibit an overall regular morphology in the stellar continuum, while their warm ISM morphology is more complex: in II Zw 33 and Mrk 314, the star-forming regions are aligned along a chain-structure; Haro 1, NGC 4670 and III Zw 102

  17. Window defect planar mapping technique

    NASA Technical Reports Server (NTRS)

    Minton, F. R.; Minton, U. O. (Inventor)

    1976-01-01

    A method of planar mapping defects in a window having an edge surface and a planar surface. The method is comprised of steps for mounting the window on a support surface. Then a light sensitive paper is placed adjacent to the window surface. A light source is positioned adjacent to the window edge. The window is then illuminated with the source of light for a predetermined interval of time. Defects on the surface of the glass, as well as in the interior of the glass are detected by analyzing the developed light sensitive paper. The light source must be in the form of optical fibers or a light tube whose light transmitting ends are placed near the edge surface of the window.

  18. Digital Mapping Techniques '07 - Workshop Proceedings

    USGS Publications Warehouse

    Soller, David R.

    2008-01-01

    The Digital Mapping Techniques '07 (DMT'07) workshop was attended by 85 technical experts from 49 agencies, universities, and private companies, including representatives from 27 state geological surveys. This year's meeting, the tenth in the annual series, was hosted by the South Carolina Geological Survey, from May 20-23, 2007, on the University of South Carolina campus in Columbia, South Carolina. Each DMT workshop has been coordinated by the U.S. Geological Survey's National Geologic Map Database Project and the Association of American State Geologists (AASG). As in previous years' meetings, the objective was to foster informal discussion and exchange of technical information, principally in order to develop more efficient methods for digital mapping, cartography, GIS analysis, and information management. At this meeting, oral and poster presentations and special discussion sessions emphasized: 1) methods for creating and publishing map products (here, 'publishing' includes Web-based release); 2) field data capture software and techniques, including the use of LIDAR; 3) digital cartographic techniques; 4) migration of digital maps into ArcGIS Geodatabase format; 5) analytical GIS techniques; and 6) continued development of the National Geologic Map Database.

  19. Mapping techniques for atrial fibrillation ablation.

    PubMed

    Sra, Jasbir; Akhtar, Masood

    2007-12-01

    Atrial fibrillation (AF) is a common arrhythmia. Although significant work still needs to be done, recent advances in understanding the mechanism of AF have led to the development of elegant catheter mapping techniques for ablation of AF. These improved mapping techniques are complemented by an evolution in various imaging and navigational technologies, several of which can now be combined in a process called registration, so that the physician no longer needs to rely solely on a mental image of the anatomy of the left atrium and the pulmonary vein while attempting to ablate the region. Ongoing advances in mapping technique will increase safety and efficacy and it is likely that AF ablation will become the first-line therapy in most patients with this complicated arrhythmia.

  20. Principles and techniques of polarimetric mapping.

    NASA Technical Reports Server (NTRS)

    Halajian, J.; Hallock, H.

    1973-01-01

    This paper introduces the concept and potential value of polarimetric maps and the techniques for generating these maps in operational remote sensing. The application-oriented polarimetric signature analyses in the literature are compiled, and several optical models are illustrated to bring out requirements of a sensor system for polarimetric mapping. By use of the concepts of Stokes parameters the descriptive specification of one sensor system is refined. The descriptive specification for a multichannel digital photometric-polarimetric mapper is based upon our experience with the present single channel device which includes the generation of polarimetric maps and pictures. High photometric accuracy and stability coupled with fast, accurate digital output has enabled us to overcome the handicap of taking sequential data from the same terrain.
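
    With Stokes parameters in hand, the per-pixel polarimetric quantities usually mapped are the degree and angle of linear polarization; the sketch below computes both from synthetic I, Q, U images, which are placeholders rather than sensor output.

```python
import numpy as np

rng = np.random.default_rng(2)
I = rng.uniform(80.0, 120.0, (64, 64))     # Stokes I (total intensity), synthetic
Q = rng.uniform(-10.0, 10.0, (64, 64))     # Stokes Q, synthetic
U = rng.uniform(-10.0, 10.0, (64, 64))     # Stokes U, synthetic

dolp = np.sqrt(Q**2 + U**2) / I            # degree of linear polarization
aolp = 0.5 * np.arctan2(U, Q)              # angle of linear polarization (radians)
```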

  1. Multiscale mapping: Physical concepts and mathematical techniques

    USDA-ARS?s Scientific Manuscript database

    This is an introductory summary for papers either invited or a part of a symposium at the 18th World Congress of Soil Science, July 2006 in Philadelphia. The symposium, titled "Multiscale Mapping of Soil Properties for Environmental Studies, Agriculture, and Decision Making," focused on techniques u...

  2. Design and evaluation of a freeform lens by using a method of luminous intensity mapping and a differential equation

    NASA Astrophysics Data System (ADS)

    Essameldin, Mahmoud; Fleischmann, Friedrich; Henning, Thomas; Lang, Walter

    2017-02-01

    Freeform optical systems are playing an important role in the field of illumination engineering for redistributing the light intensity, because of their capability of achieving accurate and efficient results. The authors have presented the basic idea of the freeform lens design method at the 117th annual meeting of the German Society of Applied Optics (DGAO Proceedings). Now, we demonstrate the feasibility of the design method by designing and evaluating a freeform lens. The concepts of luminous intensity mapping, energy conservation and differential equation are combined in designing a lens for non-imaging applications. The required procedures to design a lens including the simulations are explained in detail. The optical performance is investigated by using a numerical simulation of optical ray tracing. For evaluation, the results are compared with another recently published design method, showing the accurate performance of the proposed method using a reduced number of mapping angles. As a part of the tolerance analyses of the fabrication processes, the influence of the light source misalignments (translation and orientation) on the beam-shaping performance is presented. Finally, the importance of considering the extended light source while designing a freeform lens using the proposed method is discussed.
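
    The energy-conservation part of such designs (here "luminous intensity mapping") amounts to matching cumulative fluxes: each source angle is sent to the target position enclosing the same fraction of the total flux. The rotationally symmetric 1-D sketch below uses a placeholder Lambertian-like source and a uniform target and is not the authors' differential-equation formulation.

```python
import numpy as np

theta = np.linspace(0.0, np.radians(60.0), 500)       # source emission angles
I_src = np.cos(theta)                                 # placeholder source intensity
flux_src = np.cumsum(I_src * np.sin(theta))           # cumulative flux over the cone
flux_src /= flux_src[-1]

x = np.linspace(0.0, 50.0, 500)                       # radial positions on the target (mm)
E_tgt = np.ones_like(x)                               # placeholder uniform target irradiance
flux_tgt = np.cumsum(E_tgt * x)                       # cumulative flux over the disc
flux_tgt /= flux_tgt[-1]

# Equal-flux mapping: target radius hit by each source angle.
x_of_theta = np.interp(flux_src, flux_tgt, x)
```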

  3. EEG source analysis using space mapping techniques

    NASA Astrophysics Data System (ADS)

    Crevecoeur, G.; Hallez, H.; van Hese, P.; D'Asseler, Y.; Dupre, L.; van de Walle, R.

    2008-06-01

    The electroencephalogram (EEG) measures potential differences, generated by electrical activity in brain tissue, between scalp electrodes. The EEG potentials can be calculated by the quasi-static Poisson equation in a certain head model. It is well known that the electrical dipole (source) which best fits the measured EEG potentials is obtained by an inverse problem. The dipole parameters are obtained by finding the global minimum of the relative residual energy (RRE). For the first time, the space mapping technique (SM technique) is used for minimizing the RRE. The SM technique aims at aligning two different simulation models: a fine model, accurate but CPU-time expensive, and a coarse model, computationally fast but less accurate than the fine one. The coarse model is a semi-analytical model, the so-called three-shell concentric sphere model. The fine model numerically solves the Poisson equation in a realistic head model. If we use the aggressive space mapping (ASM) algorithm, the errors on the dipole location are too large. The hybrid aggressive space mapping (HASM) on the other hand has better convergence properties, yielding a reduction in dipole location errors. The computational effort of HASM is greater than ASM but smaller than using direct optimization techniques.
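
    The space-mapping loop itself (coarse-model optimum, parameter extraction, quasi-Newton update) is compact; the scalar toy below only shows that mechanics with made-up one-parameter "models", not EEG forward models, dipole parameters or the HASM safeguards discussed in the paper.

```python
import numpy as np
from scipy.optimize import minimize_scalar

measured = 5.0                                   # stand-in for measured data

def coarse(z):                                   # fast, approximate forward model
    return 2.0 * z

def fine(x):                                     # slow, accurate forward model
    return 2.0 * x + 1.0 + 0.1 * x ** 2

def extract(response):
    """Parameter extraction: coarse-model parameter reproducing a given response."""
    return minimize_scalar(lambda z: (coarse(z) - response) ** 2,
                           bounds=(-10, 10), method="bounded").x

z_star = extract(measured)                       # coarse-model optimum
x, B = z_star, 1.0                               # start at the coarse optimum
for _ in range(20):
    f = extract(fine(x)) - z_star                # space-mapping residual
    if abs(f) < 1e-6:
        break
    h = -f / B                                   # quasi-Newton (Broyden) step
    x = x + h
    B = B + (extract(fine(x)) - z_star) / h      # rank-one Broyden update (scalar case)
print(round(float(x), 4), fine(x))               # fine(x) is now close to `measured`
```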

  4. Digital mapping techniques '06 - Workshop proceedings

    USGS Publications Warehouse

    Soller, David R.

    2007-01-01

    The Digital Mapping Techniques '06 (DMT'06) workshop was attended by more than 110 technical experts from 51 agencies, universities, and private companies, including representatives from 27 state geological surveys (see Appendix A of these Proceedings). This workshop was similar in nature to the previous nine meetings, which were held in Lawrence, Kansas (Soller, 1997), Champaign, Illinois (Soller, 1998), Madison, Wisconsin (Soller, 1999), Lexington, Kentucky (Soller, 2000), Tuscaloosa, Alabama (Soller, 2001), Salt Lake City, Utah (Soller, 2002), Millersville, Pennsylvania (Soller, 2003), Portland, Oregon (Soller, 2004), and Baton Rouge, Louisiana (Soller, 2005). This year's meeting was hosted by the Ohio Geological Survey, from June 11-14, 2006, on the Ohio State University campus in Columbus, Ohio. As in the previous meetings, the objective was to foster informal discussion and exchange of technical information. It is with great pleasure that I note that the objective was successfully met, as attendees continued to share and exchange knowledge and information, and renew friendships and collegial work begun at past DMT workshops. Each DMT workshop has been coordinated by the Association of American State Geologists (AASG) and U.S. Geological Survey (USGS) Data Capture Working Group, the latter of which was formed in August 1996 to support the AASG and the USGS in their effort to build a National Geologic Map Database (see Soller, this volume, and http://ngmdb.usgs.gov/info/standards/datacapt/). The Working Group was formed because increased production efficiencies, standardization, and quality of digital map products were needed for the database - and for the State and Federal geological surveys - to provide more high-quality digital maps to the public. At the 2006 meeting, oral and poster presentations and special discussion sessions emphasized: 1) methods for creating and publishing map products (here, "publishing" includes Web-based release); 2) field data

  5. Investigation of light scattering as a technique for detecting discrete soot particles in a luminous flame

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The practicability of using a classical light-scattering technique, involving comparison of angular scattering intensity patterns with theoretically determined Mie and Rayleigh patterns, to detect discrete soot particles (diameter less than 50 nm) in premixed propane/air and propane/oxygen-helium flames is considered. The experimental apparatus employed in this investigation included a laser light source, a flat-flame burner, specially coated optics, a cooled photomultiplier detector, and a lock-in voltmeter readout. Although large, agglomerated soot particles were detected and sized, it was not possible to detect small, discrete particles. The limiting factor appears to be background scattering by the system's optics.

  6. Episodic Mapping: A Technique To Help Students Understand Stories.

    ERIC Educational Resources Information Center

    Schmelzer, Ronald; Henson, Kenneth

    Semantic mapping is effective with expository prose but not as effective with narrative prose. To achieve a better understanding of narrative prose, yet still keep the benefits of semantic mapping, the traditional approach can be modified into a technique called "episodic mapping." Episodic mapping is based on the idea that most stories…

  7. Structural Modeling Using "Scanning and Mapping" Technique

    NASA Technical Reports Server (NTRS)

    Amos, Courtney L.; Dash, Gerald S.; Shen, J. Y.; Ferguson, Frederick; Noga, Donald F. (Technical Monitor)

    2000-01-01

    Supported by NASA Glenn Center, we are in the process of developing a structural damage diagnostic and monitoring system for rocket engines, which consists of five modules: Structural Modeling, Measurement Data Pre-Processor, Structural System Identification, Damage Detection Criterion, and Computer Visualization. The function of the system is to detect damage as it is incurred by the engine structures. The scientific principle used to identify damage is to utilize the changes in the vibrational properties between the pre-damaged and post-damaged structures. The vibrational properties of the pre-damaged structure can be obtained based on an analytic computer model of the structure. Thus, as the first stage of the whole research plan, we currently focus on the first module - Structural Modeling. Three computer software packages are selected, and will be integrated for this purpose. They are PhotoModeler-Pro, AutoCAD-R14, and MSC/NASTRAN. AutoCAD is the most popular PC-CAD system currently available in the market. For our purpose, it serves as an interface to generate structural models of any particular engine parts or assembly, which is then passed to MSC/NASTRAN for extracting structural dynamic properties. Although AutoCAD is a powerful structural modeling tool, the complexity of engine components requires a further improvement in structural modeling techniques. We are working on a so-called "scanning and mapping" technique, which is a relatively new technique. The basic idea is to produce a full and accurate 3D structural model by tracing on multiple overlapping photographs taken from different angles. There is no need to input point positions, angles, distances or axes. Photographs can be taken by any type of camera with different lenses. With the integration of such a modeling technique, the capability of structural modeling will be enhanced. The prototypes of any complex structural components will be produced by PhotoModeler first based on existing similar

  9. Luminous supernovae.

    PubMed

    Gal-Yam, Avishay

    2012-08-24

    Supernovae, the luminous explosions of stars, have been observed since antiquity. However, various examples of superluminous supernovae (SLSNe; luminosities > 7 × 10^43 ergs per second) have only recently been documented. From the accumulated evidence, SLSNe can be classified as radioactively powered (SLSN-R), hydrogen-rich (SLSN-II), and hydrogen-poor (SLSN-I, the most luminous class). The SLSN-II and SLSN-I classes are more common, whereas the SLSN-R class is better understood. The physical origins of the extreme luminosity emitted by SLSNe are a focus of current research.

  10. Digital Mapping Techniques '11–12 workshop proceedings

    USGS Publications Warehouse

    Soller, David R.

    2014-01-01

    At these meetings, oral and poster presentations and special discussion sessions emphasized: (1) methods for creating and publishing map products (here, "publishing" includes Web-based release); (2) field data capture software and techniques, including the use of LiDAR; (3) digital cartographic techniques; (4) migration of digital maps into ArcGIS Geodatabase formats; (5) analytical GIS techniques; and (6) continued development of the National Geologic Map Database.

  11. Color-gamut mapping techniques for color hardcopy images

    NASA Astrophysics Data System (ADS)

    Hoshino, Toru; Berns, Roy S.

    1993-08-01

    Color gamut mapping is required whenever two imaging devices do not have coincident color gamuts or viewing conditions. Two major gamut mapping techniques include lightness and chroma manipulations. Lightness mapping accounts for differences in white level, black level, and viewing conditions while chroma mapping accounts for differences in gamut volume. As a three dimensional space in which color gamut mapping is implemented, the 1991 Hunt model of color appearance was used utilizing dimensions of lightness, chroma, and hue. This model accounts for viewing conditions in addition to the usual device independent specification. The mapping techniques were applied to back-lit photographic transparencies in order to reproduce images using a dye diffusion thermal transfer printer. As the first experiment, a lightness mapping experiment was performed. Three different lightness mapping techniques, a linear technique and two non-linear techniques, were tested for four images. The psychophysical method of paired comparison was used to generate interval scales of preferred color reproduction. In general, the preferred technique depended on the amount of lightness mapping required and on the original image's lightness histograms. For small amounts of compression, the preferred technique was a clipping type. For large amounts of compression, the preferred technique was image dependent; low preference was caused by loss of detail or apparent fluorescence of high chroma image areas.
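
    Two of the lightness-mapping flavours discussed (a linear compression into the destination lightness range and a clipping variant that preserves mid-tones) are sketched below; the lightness ranges are illustrative, not those of the transparencies or the dye diffusion printer used.

```python
import numpy as np

def lightness_linear(L, src=(2.0, 98.0), dst=(8.0, 92.0)):
    """Linearly compress the source lightness range into the destination range."""
    return dst[0] + (np.asarray(L) - src[0]) * (dst[1] - dst[0]) / (src[1] - src[0])

def lightness_clip(L, dst=(8.0, 92.0)):
    """Leave in-gamut lightness untouched and clip the extremes."""
    return np.clip(L, dst[0], dst[1])

L_image = np.array([2.0, 30.0, 60.0, 98.0])
print(lightness_linear(L_image))   # compresses everything proportionally
print(lightness_clip(L_image))     # preserves mid-tones, clips the ends
```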

  12. Concept Mapping: A Critical Thinking Technique

    ERIC Educational Resources Information Center

    Harris, Charles M.; Zha, Shenghua

    2013-01-01

    Concept mapping, graphically depicting the structure of abstract concepts, is based on the observation that pictures and line drawings are often more easily comprehended than the words that represent an abstract concept. The efficacy of concept mapping for facilitating critical thinking was assessed in four sections of an introductory psychology…

  13. Process Mapping: Tools, Techniques, & Critical Success Factors.

    ERIC Educational Resources Information Center

    Kalman, Howard K.

    2002-01-01

    Explains process mapping as an analytical tool and a process intervention that performance technologists can use to improve human performance by reducing error variance. Highlights include benefits of process mapping; and critical success factors, including organizational readiness, time commitment by participants, and the availability of a…

  15. A Novel Tsunami Inundation Mapping Technique

    NASA Astrophysics Data System (ADS)

    Watts, P.; Walder, J. S.; Waythomas, C. F.

    2003-12-01

    Traditional tsunami inundation maps have focused on maximum on-land inundation. Occasionally, the inundation maps are supplemented with information about maximum water velocity or timing. We demonstrate using several case studies the utility of producing maps showing 1) on-land inundation area, 2) maximum water velocity, 3) maximum water flux, 4) time of wave arrival, 5) time of wave maximum, and 6) time of wave departure. Map attributes 1-3 are related to water motion and are of particular value to scientists involved in tsunami hazard assessment. Attributes 4-6 are related to the timing of wave events and are of particular value to emergency managers. However, this raw form of inundation mapping is not easily interpreted within the usual GIS context because water velocities and fluxes are not readily understood by lay people. To improve on the GIS approach to portraying tsunami information, we employ simple engineering models illustrating the consequences of tsunami attack in a way that lay audiences can readily understand. In particular, we present maps that depict regions where a human would be knocked down, where cars would be moved, where various sizes of sediment would be moved, where anchors would break, and where ships would be moved laterally by a significant amount. Each map is separated into three distinct layers, each layer representing a different level of hazard. For example, the car movement map shows results for three sizes of automobile. Each region is depicted as a separate GIS layer. Given the uncertainty involved in numerical simulations, as well as the many local variables that can alter the outcome during a real event, these maps are meant to provide general guidelines as to the outcome of tsunami attack.
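
    The threshold-layer idea can be sketched by turning modelled maximum depth and speed into discrete hazard layers; the criteria below (a depth-times-speed limit for a person being knocked down, a momentum-flux limit for cars being moved) are illustrative placeholders, not the engineering models used in the study.

```python
import numpy as np

rng = np.random.default_rng(3)
depth = rng.uniform(0.0, 3.0, (50, 50))     # modelled maximum inundation depth (m)
speed = rng.uniform(0.0, 5.0, (50, 50))     # modelled maximum water speed (m/s)

momentum_flux = depth * speed ** 2          # per unit width, density omitted
person_down = depth * speed > 0.5           # illustrative threshold
car_moved = momentum_flux > 2.0             # illustrative threshold

# 0 = below both thresholds, 1 = person knocked down, 2 = car also moved.
hazard_layers = np.select([car_moved, person_down], [2, 1], default=0)
```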

  16. Achromatic Emission Velocity Measurements in Luminous Flows

    NASA Technical Reports Server (NTRS)

    Schneider, S. J.; Fulghum, S. F.; Rostler, P. S.

    1997-01-01

    A new velocity measurement instrument for luminous flows was developed by Science Research Laboratory for NASA. The SIEVE (Segmented Image Emission VElocimeter) instrument uses broadband light emitted by the flow for the velocity measurement. This differs from other velocimetry techniques in that it does not depend on laser illumination and/or light scattering from particles in the flow. The SIEVE is a passive, non-intrusive diagnostic. By moving and adjusting the imaging optics, the SIEVE can provide three-dimensional mapping of a flow field and determine turbulence scale size. A SIEVE instrument was demonstrated on an illuminated rotating disk to evaluate instrument response and noise and on an oxy-acetylene torch to measure flame velocities. The luminous flow in rocket combustors and plumes is an ideal subject for the SIEVE velocity measurement technique.

  17. Evaluation of Techniques Used to Estimate Cortical Feature Maps

    PubMed Central

    Katta, Nalin; Chen, Thomas L.; Watkins, Paul V.; Barbour, Dennis L.

    2011-01-01

    Functional properties of neurons are often distributed nonrandomly within a cortical area and form topographic maps that reveal insights into neuronal organization and interconnection. Some functional maps, such as in visual cortex, are fairly straightforward to discern with a variety of techniques, while other maps, such as in auditory cortex, have resisted easy characterization. In order to determine appropriate protocols for establishing accurate functional maps in auditory cortex, artificial topographic maps were probed under various conditions, and the accuracy of estimates formed from the actual maps was quantified. Under these conditions, low-complexity maps such as sound frequency can be estimated accurately with as few as 25 total samples (e.g., electrode penetrations or imaging pixels) if neural responses are averaged together. More samples are required to achieve the highest estimation accuracy for higher complexity maps, and averaging improves map estimate accuracy even more than increasing sampling density. Undersampling without averaging can result in misleading map estimates, while undersampling with averaging can lead to the false conclusion of no map when one actually exists. Uniform sample spacing only slightly improves map estimation over nonuniform sample spacing typical of serial electrode penetrations. Tessellation plots commonly used to visualize maps estimated using nonuniform sampling are always inferior to linearly interpolated estimates, although differences are slight at higher sampling densities. Within primary auditory cortex, then, multiunit sampling with at least 100 samples would likely result in reasonable feature map estimates for all but the highest complexity maps and the highest variability that might be expected. PMID:21889537
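
    The comparison described (tessellation-style nearest-neighbour estimates versus linearly interpolated estimates from sparse samples) can be reproduced in outline with scipy; the synthetic feature map, the sample count and the noise level are placeholders, not the artificial maps used in the paper.

```python
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(4)
xx, yy = np.meshgrid(np.linspace(0, 1, 100), np.linspace(0, 1, 100))
true_map = xx + 0.2 * np.sin(3 * np.pi * yy)          # low-complexity feature map

n_samples = 25                                        # e.g. 25 electrode penetrations
pts = rng.random((n_samples, 2))
vals = pts[:, 0] + 0.2 * np.sin(3 * np.pi * pts[:, 1]) + rng.normal(0, 0.05, n_samples)

tess = griddata(pts, vals, (xx, yy), method="nearest")  # tessellation-style estimate
lin = griddata(pts, vals, (xx, yy), method="linear")    # NaN outside the convex hull

for name, est in [("tessellation", tess), ("linear interpolation", lin)]:
    print(name, "mean squared error:", round(float(np.nanmean((est - true_map) ** 2)), 4))
```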

  18. Software for genome mapping by fingerprinting techniques.

    PubMed

    Sulston, J; Mallett, F; Staden, R; Durbin, R; Horsnell, T; Coulson, A

    1988-03-01

    A genome mapping package has been developed for reading and assembling data from clones analysed by restriction enzyme fragmentation and polyacrylamide gel electrophoresis. The package comprises: data entry; matching; assembly; statistical analysis; modelling. Data entry can be either manual or by a semiautomatic system based on a scanning densitometer. The primary emphasis in the analytical routines is on flexibility and interactive convenience, so that the operator has full knowledge of and control over the growing map, but a variety of automatic options are included. The package continually grows to meet the needs of the Caenorhabditis project.

  19. Thermal Diffusivity Mapping of Solids by Scanning Photoacoustic Piezoelectric Technique

    NASA Astrophysics Data System (ADS)

    Zhao, Binxing; Gao, Chunming; Yan, Laijun; Wang, Yafei

    2016-12-01

    Quantitative thermal diffusivity mapping of solid samples was achieved using the scanning photoacoustic piezoelectric (PAPE) technique. Based on the frequency-domain PAPE theoretical model, the methodology of the scanning PAPE thermal diffusivity mapping is introduced. An experimental setup capable of spatial and frequency scanning was established. Thermal diffusivity mapping of homogeneous and inhomogeneous samples was carried out. The obtained thermal diffusivity images are consistent with the optical images in image contrast and consistent with the reference values in thermal diffusivity. Results show that the scanning PAPE technique is able to determine the thermal diffusivity distribution of solids, hence providing an effective method for thermal diffusivity mapping.

  20. Ratio techniques for geochemical remote sensing. [geological mapping technique

    NASA Technical Reports Server (NTRS)

    Vincent, R. K.

    1974-01-01

    The present work discusses spectral ratio methods for extracting information on the composition of minerals and rocks from 0.4-14-micron multispectral scanner data. Ratio images and automatic recognition maps based on spectral ratios have the advantages of covering nearly all wavelength regions of important geochemical information, being relatively insensitive to atmospheric and solar illumination variations, and having definable limits of accuracy concerning the capability of discriminating between important geochemical targets and background materials.
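
    A band-ratio image and a threshold-based recognition map take only a few lines; the synthetic cube, the band pair and the threshold below are illustrative assumptions, not a calibrated scanner product.

```python
import numpy as np

cube = np.random.default_rng(5).random((200, 200, 6)) + 0.1   # stand-in multispectral cube
band_i, band_j = 4, 2                                         # hypothetical band pair

# Ratios largely cancel multiplicative illumination/topography effects,
# leaving spectral (compositional) contrast.
ratio = cube[..., band_i] / (cube[..., band_j] + 1e-6)

recognition_map = ratio > 1.2                                 # illustrative threshold
```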

  1. Luminous presence

    NASA Astrophysics Data System (ADS)

    Dawson, Paula

    2008-02-01

    The Luminous Presence project examines the use of standard film language in the framing, angles, and points of view of holographic subjects through eight digital holographic stereograms: seven 25 x 25 cm works, Hail, Water, Rain, Snow, Sun, Text, Imprint, and the 1.5 x 1 m Luminous Presences. However, before embarking on a discussion of how filmic language can be used in digital holograms it is first important to explain why this line of investigation could be fruitful. Undoubtedly several of the compositional practices which sprang up and evolved throughout the development of the diverse forms of the holographic medium have contributed to a unique hologram pictorial language; however, it is well known that the reading of visual imagery of any type relies a great deal on the viewer's knowledge and experience of other images. The lens-recorded imagery of film is a far more familiar language than that of holograms, and the correlation between certain filmic pictorial conventions and emotional responses is well documented and understood. In short, the language of film contains a highly nuanced vocabulary of shot types and lens types (which may be criticised as being formulaic) yet is effective in lending emotion to figures.

  2. Local mapping of interface traps using contactless capacitance transient technique

    NASA Astrophysics Data System (ADS)

    Yoshida, Haruhiko; Mori, Hidenobu

    2016-10-01

    Contactless capacitance transient techniques have been applied to local mapping of interface traps of a semiconductor wafer. In contactless capacitance transient techniques, a Metal-Air gap-Oxide-Semiconductor (MAOS) structure is used instead of a conventional Metal-Oxide-Semiconductor (MOS) structure. The local mapping of interface traps was obtained by using a contactless Isothermal Capacitance Transient Spectroscopy (ICTS), which is one of the contactless capacitance transient techniques. The validity of the contactless ICTS was demonstrated by characterizing a partially Au-doped Si wafer. The results revealed that local mapping of interface traps using contactless capacitance transient techniques is effective in wafer inspection and is a promising technique for the development of MOS devices and solar cells with high reliability and high performance.

  3. A satellite remote sensing technique for geological structure horizon mapping

    SciTech Connect

    Fraser, A.; Huggins, P.; Rees, J.

    1996-08-01

    A satellite remote sensing technique is demonstrated for generating near-surface geological structure data. This technique enables the screening of large areas and the targeting of seismic acquisition during hydrocarbon exploration. This is of particular advantage in terrains where surveying is logistically difficult. Landsat Thematic Mapper (TM) data and a high-resolution Digital Elevation Model (DEM) are used to identify and map outcropping horizons. These are used to reconstruct the near-surface structure. The technique is applied in Central Yemen, which is characterised by 'layer-cake' geology and low-dipping terrain. The results are validated using 2D seismic data. The near-surface map images faults and structure not apparent in the raw data. Comparison with the structure map generated from 2D seismic data indicates very good structural and fault correlation. The near-surface map successfully highlights areas of potential closure at reservoir depths.
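
    The underlying geometric idea (points digitized along an outcropping horizon, with DEM elevations, constrain a plane whose orientation gives the local dip of the horizon) is sketched below with placeholder coordinates; it is a generic three-point-style plane fit, not the authors' workflow.

```python
import numpy as np

# (easting, northing, DEM elevation) in metres along a mapped horizon trace (placeholders).
pts = np.array([[0.0, 0.0, 500.0], [800.0, 100.0, 520.0],
                [300.0, 900.0, 470.0], [900.0, 950.0, 488.0]])

# Least-squares plane z = a*x + b*y + c through the trace points.
A = np.c_[pts[:, 0], pts[:, 1], np.ones(len(pts))]
(a, b, c), *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)

dip = np.degrees(np.arctan(np.hypot(a, b)))              # dip angle of the horizon
dip_azimuth = np.degrees(np.arctan2(-a, -b)) % 360.0     # downslope (dip) direction
print(f"dip {dip:.1f} deg towards azimuth {dip_azimuth:.0f} deg")
```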

  4. Mapping the climate: guidance on appropriate techniques to map climate variables and their uncertainty

    NASA Astrophysics Data System (ADS)

    Kaye, N. R.; Hartley, A.; Hemming, D.

    2011-08-01

    Maps are a crucial asset in communicating climate science to a diverse audience, and there is a wealth of software available to analyse and visualise climate information. However, this availability makes it easy to create poor maps as users often lack an underlying cartographic knowledge. Furthermore, communicating and visualising uncertainties in climate data and climate change projections, using for example ensemble based approaches, presents additional challenges for mapping that require careful consideration. This paper assesses a range of techniques for mapping uncertainties, comparing "intrinsic" approaches that use colour in much the same way as conventional thematic maps, and "extrinsic" approaches that incorporate additional geometry such as points or features. We propose that, unlike traditional cartography, where many known standards allow maps to be interpreted easily, there is no standard mapping approach used to represent uncertainty (in climate or other information). Consequently, a wide range of techniques have been applied for this purpose, and users may spend unnecessary time trying to understand the mapping approach rather than interpreting the information presented. We use cartographic knowledge and lessons learned from mapping other information to propose a suitable mapping technique that represents both magnitude and uncertainty in climate data. This technique adjusts the hue of a small palette of colours to show the mean or median of a climate variable, and the saturation of the colour to illustrate a measure of uncertainty. It is designed to be easy to replicate, visible to colour blind people and intuitive to understand. This technique may be utilised to map a wide range of climate data, and it is proposed that it would provide a consistent approach suitable for mapping information for the Fifth Assessment Report of the Intergovernmental Panel on Climate Change (IPCC AR5).
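
    The hue-for-magnitude, saturation-for-uncertainty encoding can be sketched directly in HSV space; the synthetic "change" field, its spread, and the chosen hue range are placeholders rather than the paper's palette.

```python
import numpy as np
import matplotlib.colors as mcolors

rng = np.random.default_rng(6)
mean = rng.uniform(0.0, 4.0, (50, 50))       # e.g. ensemble-mean projected change
spread = rng.uniform(0.0, 2.0, (50, 50))     # e.g. ensemble standard deviation

hue = 0.7 - 0.7 * (mean - mean.min()) / np.ptp(mean)      # blue (low) to red (high)
sat = 1.0 - np.clip(spread / spread.max(), 0.0, 1.0)      # more uncertain = more washed out
val = np.ones_like(hue)

rgb = mcolors.hsv_to_rgb(np.dstack([hue, sat, val]))      # (50, 50, 3) image for plotting
```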

  5. Historical shoreline mapping (I): improving techniques and reducing positioning errors

    USGS Publications Warehouse

    Thieler, E. Robert; Danforth, William W.

    1994-01-01

    A critical need exists among coastal researchers and policy-makers for a precise method to obtain shoreline positions from historical maps and aerial photographs. A number of methods that vary widely in approach and accuracy have been developed to meet this need. None of the existing methods, however, address the entire range of cartographic and photogrammetric techniques required for accurate coastal mapping. Thus, their application to many typical shoreline mapping problems is limited. In addition, no shoreline mapping technique provides an adequate basis for quantifying the many errors inherent in shoreline mapping using maps and air photos. As a result, current assessments of errors in air photo mapping techniques generally (and falsely) assume that errors in shoreline positions are represented by the sum of a series of worst-case assumptions about digitizer operator resolution and ground control accuracy. These assessments also ignore altogether other errors that commonly approach ground distances of 10 m. This paper provides a conceptual and analytical framework for improved methods of extracting geographic data from maps and aerial photographs. We also present a new approach to shoreline mapping using air photos that revises and extends a number of photogrammetric techniques. These techniques include (1) developing spatially and temporally overlapping control networks for large groups of photos; (2) digitizing air photos for use in shoreline mapping; (3) preprocessing digitized photos to remove lens distortion and film deformation effects; (4) simultaneous aerotriangulation of large groups of spatially and temporally overlapping photos; and (5) using a single-ray intersection technique to determine geographic shoreline coordinates and express the horizontal and vertical error associated with a given digitized shoreline. As long as historical maps and air photos are used in studies of shoreline change, there will be a considerable amount of error (on the

  6. Mapping the climate: guidance on appropriate techniques to map climate variables and their uncertainty

    NASA Astrophysics Data System (ADS)

    Kaye, N. R.; Hartley, A.; Hemming, D.

    2012-02-01

    Maps are a crucial asset in communicating climate science to a diverse audience, and there is a wealth of software available to analyse and visualise climate information. However, this availability makes it easy to create poor maps as users often lack an underlying cartographic knowledge. Unlike traditional cartography, where many known standards allow maps to be interpreted easily, there is no standard mapping approach used to represent uncertainty (in climate or other information). Consequently, a wide range of techniques have been applied for this purpose, and users may spend unnecessary time trying to understand the mapping approach rather than interpreting the information presented. Furthermore, communicating and visualising uncertainties in climate data and climate change projections, using for example ensemble based approaches, presents additional challenges for mapping that require careful consideration. The aim of this paper is to provide background information and guidance on suitable techniques for mapping climate variables, including uncertainty. We assess a range of existing and novel techniques for mapping variables and uncertainties, comparing "intrinsic" approaches that use colour in much the same way as conventional thematic maps with "extrinsic" approaches that incorporate additional geometry such as points or features. Using cartographic knowledge and lessons learned from mapping in different disciplines we propose the following 6 general mapping guidelines to develop a suitable mapping technique that represents both magnitude and uncertainty in climate data:

    - use a sensible sequential or diverging colour scheme;

    - use appropriate colour symbolism if it is applicable;

    - ensure the map is usable by colour blind people;

    - use a data classification scheme that does not misrepresent the data;

    - use a map

  7. A calibration-free evapotranspiration mapping technique

    NASA Astrophysics Data System (ADS)

    Szilagyi, J.

    2010-12-01

    With the availability of Moderate Resolution Imaging Spectroradiometer (MODIS) data the spatial distribution of the resulting daytime land surface temperature (Ts) can be tracked at a resolution of about 1-km. A simple, self-calibrating linear transformation of the Ts values into evapotranspiration (ET) rates is possible if the following criteria are met: a) the vertical gradient of the air temperature near the surface is directly proportional to Ts; b) net energy available for sensible and latent heat transfer at the surface is quasi-constant in space; c) heat conduction into the soil is negligible, and; d) land-surface properties do not change drastically over space. The validity of a) has been proved by such models as SEBAL and METRIC. Requirement b) is fulfilled over a flat or rolling terrain provided the probability distribution of the surface albedo values of the MODIS cells has a narrow spread, which is the case for the two study areas (Hungary and Nebraska) with a characteristic vegetation-period mean of about 16% and a standard deviation of 1.4%. Heat conduction into the soil can be considered negligible for periods longer than a day, thus the 8-day composited Ts values employed in the present study comply with this requirement. Finally, for periods longer than a day, the assumption of near-neutral atmospheric conditions is justified which entails that spatial variations in surface properties have a significantly dampened effect on the flux-transfer coefficient (i.e., aerodynamic resistance) value which therefore can be considered as quasi-constant in space. The linear transformation of the Ts values into ET rates in this study has been performed on a monthly basis. The transformation requires specifying two anchor points in the Ts - ET plane with the help of standard atmospheric variables, such as air temperature and humidity, as well as incident global radiation, or in lieu of it, sunshine duration. From March to November ET has been mapped for Hungary
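
    The transformation step described above amounts to a linear interpolation between a wet and a dry anchor point in the Ts - ET plane. The short sketch below illustrates that idea only; the anchor values, units and clipping are illustrative assumptions, not the calibration used in the study.

```python
# Minimal sketch (not the authors' code): apply a linear Ts -> ET transformation
# defined by two hypothetical anchor points to a grid of MODIS surface temperatures.
import numpy as np

def ts_to_et(ts_grid, ts_wet, et_wet, ts_dry, et_dry):
    """Linearly map land surface temperature (K) to an ET rate (mm/month).

    (ts_wet, et_wet) and (ts_dry, et_dry) are the two anchor points in the
    Ts-ET plane; cooler pixels evaporate more, warmer pixels less.
    """
    slope = (et_dry - et_wet) / (ts_dry - ts_wet)
    et = et_wet + slope * (ts_grid - ts_wet)
    return np.clip(et, 0.0, None)  # ET cannot be negative

# Example with made-up numbers: wet anchor (295 K, 120 mm/month), dry anchor (315 K, 10 mm/month)
ts = np.array([[296.0, 300.5], [310.2, 318.0]])
print(ts_to_et(ts, 295.0, 120.0, 315.0, 10.0))
```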

  8. Map Your Way to Speech Success! Employing Mind Mapping as a Speech Preparation Technique

    ERIC Educational Resources Information Center

    Paxman, Christina G.

    2011-01-01

    Mind mapping has gained considerable credibility recently in corporations such as Boeing and Nabisco, as well as in the classroom in terms of preparing for examinations and preparing for speeches. A mind map is a graphic technique for organizing an individual's thoughts and other information. It harnesses the full range of cortical skills--word,…


  10. Luminal non-specific cationic channels in cultured strial marginal cells of guinea pig and gerbil as determined by patch clamp technique.

    PubMed

    Liu, D S; Achouche, J; Wu, A H; Trotier, D; Tran Ba Huy, P

    1991-01-01

    Using primary cultures of marginal cells of stria vascularis from guinea pig and gerbil, ionic channels located on the luminal membrane were investigated by means of patch clamp technique. Recordings were performed in cell-attached and inside-out configurations. In cell-attached configuration, single channel activity was identified with a conductance of about 25 pS. I-V curve was linear. The probability of opening was increased upon depolarization. Up to 7 channels could be present in the same patch, indicating a rather high density. In inside-out configuration, the reversal potential was 0 mV, suggesting a non-specific cationic channel. These luminal non-specific cationic channels would allow the passive K+ efflux and Na+ influx across the apical membrane of marginal cells. This finding is consistent with the "one-pump" model of strial activity. The present study suggests that culture of strial marginal cells may be a suitable model for in-depth investigation of endolymph physiology.

  11. Automated thermal mapping techniques using chromatic image analysis

    NASA Technical Reports Server (NTRS)

    Buck, Gregory M.

    1989-01-01

    Thermal imaging techniques are introduced using a chromatic image analysis system and temperature sensitive coatings. These techniques are used for thermal mapping and surface heat transfer measurements on aerothermodynamic test models in hypersonic wind tunnels. Measurements are made on complex vehicle configurations in a timely manner and at minimal expense. The image analysis system uses separate wavelength filtered images to analyze surface spectral intensity data. The system was initially developed for quantitative surface temperature mapping using two-color thermographic phosphors but was found useful in interpreting phase change paint and liquid crystal data as well.
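
    As a rough illustration of the two-color principle mentioned above, the sketch below converts the pixel-wise ratio of two wavelength-filtered images to temperature through a calibration lookup. The calibration points and image values are invented for demonstration and do not represent the phosphors or facility data of the study.

```python
# Illustrative sketch only: convert a two-color phosphor intensity ratio to
# surface temperature via a monotonic calibration curve. The calibration
# points below are invented for illustration.
import numpy as np

# Hypothetical calibration: intensity ratio (green/red) vs. temperature (K)
cal_ratio = np.array([0.2, 0.4, 0.8, 1.5, 2.5])
cal_temp  = np.array([300., 350., 400., 450., 500.])

def ratio_to_temperature(img_green, img_red):
    """Map the pixel-wise ratio of two filtered images to temperature."""
    ratio = img_green / np.maximum(img_red, 1e-6)   # avoid divide-by-zero
    return np.interp(ratio, cal_ratio, cal_temp)    # piecewise-linear lookup

green = np.array([[120., 300.], [480., 900.]])
red   = np.array([[600., 600.], [600., 600.]])
print(ratio_to_temperature(green, red))
```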

  12. Technique for mapping the spectral uniformity of luminescent semiconducting material.

    PubMed

    Yang, J; Cassidy, D T

    1995-08-01

    A technique to map the spectral uniformity of luminescent semiconducting materials at room temperature is described. This technique is based on spatially resolved and polarization-resolved measurements of the photoluminescence and requires a polarizing beam splitter with a splitting ratio that has a linear dependence on wavelength. Measurements on a quantum-well sample that was patterned by intermixing with a focused ion beam are used to demonstrate the technique. With a spectral resolution of better than 1 nm and a spatial resolution of ≊1 µm, as well as the ability to map concurrently the strain field through the measurement of the degree of polarization of the photoluminescence and the photoluminescence yield, this technique provides a simple, nondestructive method of assessing luminescent materials.
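
    The degree-of-polarization measurement described above can be illustrated with a few lines of array arithmetic. The sketch below assumes two co-registered intensity maps taken through orthogonal polarizations and uses the common definition DOP = (I∥ − I⊥)/(I∥ + I⊥); the array values are placeholders.

```python
# Hedged sketch: compute the degree of polarization (DOP) of photoluminescence
# from two polarization-resolved intensity maps, as used to infer strain.
# The simple DOP definition and array values below are assumptions for illustration.
import numpy as np

def degree_of_polarization(i_parallel, i_perpendicular):
    """DOP = (I_par - I_perp) / (I_par + I_perp), computed pixel by pixel."""
    total = i_parallel + i_perpendicular
    dop = (i_parallel - i_perpendicular) / np.maximum(total, 1e-12)
    return np.where(total > 0, dop, 0.0)

i_par  = np.array([[1.00, 0.90], [1.10, 1.00]])
i_perp = np.array([[1.00, 1.00], [0.90, 1.05]])
print(degree_of_polarization(i_par, i_perp))
```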

  13. Comparison of the Reliability and Validity of Scores from Two Concept-Mapping Techniques.

    ERIC Educational Resources Information Center

    Ruiz-Primo, Maria Araceli; Schultz, Susan E.; Li, Min; Shavelson, Richard J.

    2001-01-01

    Reports the results of a study that compared two concept-mapping techniques, one high-directed, "fill-in-the-map" and one low-directed, "construct-a-map-from-scratch". Examines whether: (1) skeleton map scores were sensitive to the sample; (2) the two types of skeleton maps were equivalent; and (3) the two mapping techniques provided similar…

  15. Linking human factors to corporate strategy with cognitive mapping techniques.

    PubMed

    Village, Judy; Greig, Michael; Salustri, Filippo A; Neumann, W Patrick

    2012-01-01

For human factors (HF) to avoid being considered of "side-car" status, it needs to be positioned within the organization in such a way that it affects business strategies and their implementation. Tools are needed to support this effort. This paper explores the feasibility of applying a technique from operational research called cognitive mapping to link HF to corporate strategy. Using a single case study, a cognitive map is drawn to reveal the complex relationships between human factors and achieving an organization's strategic goals. Analysis of the map for central concepts and reinforcing loops enhances understanding that can lead to discrete initiatives to facilitate integration of HF. It is recommended that this technique be used with senior managers to understand the organization's strategic goals and enhance understanding of the potential for HF to contribute to the strategic goals.

  16. Extended mapping and characteristics techniques for inverse aerodynamic design

    NASA Technical Reports Server (NTRS)

    Sobieczky, H.; Qian, Y. J.

    1991-01-01

    Some ideas for using hodograph theory, mapping techniques and methods of characteristics to formulate typical aerodynamic design boundary value problems are developed. The inverse method of characteristics is shown to be a fast tool for design of transonic flow elements as well as supersonic flows with given shock waves.

  17. Techniques of stapler-based navigational thoracoscopic segmentectomy using virtual assisted lung mapping (VAL-MAP)

    PubMed Central

    Murayama, Tomonori; Nakajima, Jun

    2016-01-01

    Anatomical segmentectomies play an important role in oncological lung resection, particularly for ground-glass types of primary lung cancers. This operation can also be applied to metastatic lung tumors deep in the lung. Virtual assisted lung mapping (VAL-MAP) is a novel technique that allows for bronchoscopic multi-spot dye markings to provide “geometric information” to the lung surface, using three-dimensional virtual images. In addition to wedge resections, VAL-MAP has been found to be useful in thoracoscopic segmentectomies, particularly complex segmentectomies, such as combined subsegmentectomies or extended segmentectomies. There are five steps in VAL-MAP-assisted segmentectomies: (I) “standing” stitches along the resection lines; (II) cleaning hilar anatomy; (III) confirming hilar anatomy; (IV) going 1 cm deeper; (V) step-by-step stapling technique. Depending on the anatomy, segmentectomies can be classified into linear (lingular, S6, S2), V- or U-shaped (right S1, left S3, S2b + S3a), and three dimensional (S7, S8, S9, S10) segmentectomies. Particularly three dimensional segmentectomies are challenging in the complexity of stapling techniques. This review focuses on how VAL-MAP can be utilized in segmentectomy, and how this technique can assist the stapling process in even the most challenging ones. PMID:28066675

  18. Techniques of stapler-based navigational thoracoscopic segmentectomy using virtual assisted lung mapping (VAL-MAP).

    PubMed

    Sato, Masaaki; Murayama, Tomonori; Nakajima, Jun

    2016-10-01

    Anatomical segmentectomies play an important role in oncological lung resection, particularly for ground-glass types of primary lung cancers. This operation can also be applied to metastatic lung tumors deep in the lung. Virtual assisted lung mapping (VAL-MAP) is a novel technique that allows for bronchoscopic multi-spot dye markings to provide "geometric information" to the lung surface, using three-dimensional virtual images. In addition to wedge resections, VAL-MAP has been found to be useful in thoracoscopic segmentectomies, particularly complex segmentectomies, such as combined subsegmentectomies or extended segmentectomies. There are five steps in VAL-MAP-assisted segmentectomies: (I) "standing" stitches along the resection lines; (II) cleaning hilar anatomy; (III) confirming hilar anatomy; (IV) going 1 cm deeper; (V) step-by-step stapling technique. Depending on the anatomy, segmentectomies can be classified into linear (lingular, S6, S2), V- or U-shaped (right S1, left S3, S2b + S3a), and three dimensional (S7, S8, S9, S10) segmentectomies. Particularly three dimensional segmentectomies are challenging in the complexity of stapling techniques. This review focuses on how VAL-MAP can be utilized in segmentectomy, and how this technique can assist the stapling process in even the most challenging ones.

  19. Uncertainty in mapping urban air quality using crowdsourcing techniques

    NASA Astrophysics Data System (ADS)

    Schneider, Philipp; Castell, Nuria; Lahoz, William; Bartonova, Alena

    2016-04-01

    Small and low-cost sensors measuring various air pollutants have become available in recent years owing to advances in sensor technology. Such sensors have significant potential for improving high-resolution mapping of air quality in the urban environment as they can be deployed in comparatively large numbers and therefore are able to provide information at unprecedented spatial detail. However, such sensor devices are subject to significant and currently little understood uncertainties that affect their usability. Not only do these devices exhibit random errors and biases of occasionally substantial magnitudes, but these errors may also shift over time. In addition, there often tends to be significant inter-sensor variability even when supposedly identical sensors from the same manufacturer are used. We need to quantify accurately these uncertainties to make proper use of the information they provide. Furthermore, when making use of the data and producing derived products such as maps, the measurement uncertainties that propagate throughout the analysis need to be clearly communicated to the scientific and non-scientific users of the map products. Based on recent experiences within the EU-funded projects CITI-SENSE and hackAIR we discuss the uncertainties along the entire processing chain when using crowdsourcing techniques for mapping urban air quality. Starting with the uncertainties exhibited by the sensors themselves, we present ways of quantifying the error characteristics of a network of low-cost microsensors and show suitable statistical metrics for summarizing them. Subsequently, we briefly present a data-fusion-based method for mapping air quality in the urban environment and illustrate how we propagate the uncertainties of the individual sensors throughout the mapping system, resulting in detailed maps that document the pixel-level uncertainty for each concentration field. Finally, we present methods for communicating the resulting spatial uncertainty

  20. MR measurement of luminal water in prostate gland: Quantitative correlation between MRI and histology.

    PubMed

    Sabouri, Shirin; Fazli, Ladan; Chang, Silvia D; Savdie, Richard; Jones, Edward C; Goldenberg, S Larry; Black, Peter C; Kozlowski, Piotr

    2017-09-01

To determine the relationship between parameters measured from luminal water imaging (LWI), a new magnetic resonance imaging (MRI) T2 mapping technique, and the corresponding tissue composition in prostate. In all, 17 patients with prostate cancer were examined with a 3D multiecho spin echo sequence at 3T prior to undergoing radical prostatectomy. Maps of seven MR parameters, called N, T2-short, T2-long, Ashort, Along, geometric mean T2 time (gmT2), and luminal water fraction (LWF), were generated using nonnegative least squares (NNLS) analysis of the T2 decay curves. MR parametric maps were correlated to digitized whole-mount histology sections. Percentage area of tissue components, including luminal space, nuclei, and cytoplasm plus stroma, was measured on the histology sections by using color-based image segmentation. Spearman's rank correlation test was used to evaluate the correlation between MR parameters and the corresponding tissue components, with particular attention paid to the correlation between LWF and percentage area of luminal space. N, T2-short, Along, gmT2, and LWF showed significant correlation (P < 0.05) with percentage area of luminal space and stroma plus cytoplasm. T2-short and gmT2 also showed significant correlation (P < 0.05) with percentage area of nuclei. Overall, the strongest correlation was observed between LWF and luminal space (Spearman's coefficient of rank correlation = 0.75, P < 0.001). Results of this study show that LWF measured with MRI is strongly correlated with the fractional amount of luminal space in prostatic tissue. This result suggests that LWI can potentially be applied for evaluation of prostatic diseases in which the extent of luminal space differs between normal and abnormal tissues. Technical Efficacy: Stage 1. J. MAGN. RESON. IMAGING 2017;46:861-869. © 2017 International Society for Magnetic Resonance in Medicine.
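
    As an illustration of the NNLS analysis summarized above, the sketch below fits a synthetic multi-echo decay against a fixed grid of T2 values and reports the fraction of signal above a long-T2 cut-off as a stand-in for the luminal water fraction. The echo spacing, T2 grid and 200 ms cut-off are assumptions for demonstration, not the study's protocol.

```python
# Minimal sketch of the kind of NNLS T2 analysis described above (not the
# authors' implementation): fit a multi-echo decay with a fixed grid of T2
# values and report the long-T2 ("luminal water") fraction.
import numpy as np
from scipy.optimize import nnls

te = np.arange(1, 65) * 6.0                                # 64 echoes, 6 ms spacing (assumed)
t2_grid = np.logspace(np.log10(10), np.log10(2000), 60)    # candidate T2 values, ms
A = np.exp(-te[:, None] / t2_grid[None, :])                # dictionary of decay curves

# Synthetic two-compartment decay: 70% at T2 = 60 ms, 30% at T2 = 500 ms, plus noise
rng = np.random.default_rng(0)
signal = 0.7 * np.exp(-te / 60.0) + 0.3 * np.exp(-te / 500.0) + rng.normal(0, 0.005, te.size)

amplitudes, _ = nnls(A, signal)                            # non-negative T2 spectrum
lwf = amplitudes[t2_grid > 200.0].sum() / amplitudes.sum() # long-T2 fraction (200 ms cut-off assumed)
print(f"estimated luminal water fraction: {lwf:.2f}")
```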

  1. Geological mapping in northwestern Saudi Arabia using LANDSAT multispectral techniques

    NASA Technical Reports Server (NTRS)

    Blodget, H. W.; Brown, G. F.; Moik, J. G.

    1975-01-01

    Various computer enhancement and data extraction systems using LANDSAT data were assessed and used to complement a continuing geologic mapping program. Interactive digital classification techniques using both the maximum-likelihood and thresholding statistical approaches achieve very limited success in areas of highly dissected terrain. Computer enhanced imagery developed by color compositing stretched MSS ratio data was constructed for a test site in northwestern Saudi Arabia. Initial results indicate that several igneous and sedimentary rock-types can be discriminated.

  2. Geological mapping in northwestern Saudi Arabia using LANDSAT multispectral techniques

    NASA Technical Reports Server (NTRS)

    Blodget, H. W.; Brown, G. F.; Moik, J. G.

    1975-01-01

    Various computer enhancement and data extraction systems using LANDSAT data were assessed and used to complement a continuing geologic mapping program. Interactive digital classification techniques using both the parallel-piped and maximum-likelihood statistical approaches achieve very limited success in areas of highly dissected terrain. Computer enhanced imagery developed by color compositing stretched MSS ratio data was constructed for a test site in northwestern Saudi Arabia. Initial results indicate that several igneous and sedimentary rock types can be discriminated.

  3. Satellite SAR interferometric techniques applied to emergency mapping

    NASA Astrophysics Data System (ADS)

    Stefanova Vassileva, Magdalena; Riccardi, Paolo; Lecci, Daniele; Giulio Tonolo, Fabio; Boccardo Boccardo, Piero; Chiesa, Giuliana; Angeluccetti, Irene

    2017-04-01

This paper aims to investigate the capabilities of the currently available SAR interferometric algorithms in the field of emergency mapping. Several tests have been performed exploiting Copernicus Sentinel-1 data using the COTS software ENVI/SARscape 5.3. Emergency Mapping can be defined as "creation of maps, geo-information products and spatial analyses dedicated to providing situational awareness for emergency management and immediate crisis information for response by means of extraction of reference (pre-event) and crisis (post-event) geographic information/data from satellite or aerial imagery". The conventional differential SAR interferometric technique (DInSAR) and the two currently available multi-temporal SAR interferometric approaches, i.e. Permanent Scatterer Interferometry (PSI) and Small BAseline Subset (SBAS), have been applied to provide crisis information useful for emergency management activities. Depending on the Emergency Management phase considered, a distinction can be made between rapid mapping, i.e. the fast provision of geospatial data on the affected area for the immediate emergency response, and monitoring mapping, i.e. the detection of phenomena for risk prevention and mitigation activities. In order to evaluate the potential and limitations of the aforementioned SAR interferometric approaches for the specific rapid and monitoring mapping applications, five main factors have been taken into account: crisis information extracted, input data required, processing time and expected accuracy. The results highlight that DInSAR has the capacity to delineate areas affected by large and sudden deformations and fulfills most of the immediate response requirements. The main limiting factor of interferometry is the availability of a suitable SAR acquisition immediately after the event (e.g. the Sentinel-1 mission, characterized by a 6-day revisit time, may not always satisfy the immediate emergency request). PSI and SBAS techniques are suitable to produce

  4. Influence of pansharpening techniques in obtaining accurate vegetation thematic maps

    NASA Astrophysics Data System (ADS)

    Ibarrola-Ulzurrun, Edurne; Gonzalo-Martin, Consuelo; Marcello-Ruiz, Javier

    2016-10-01

In recent decades there has been a decline in natural resources, making it important to develop reliable methodologies for their management. The appearance of very high resolution sensors has offered a practical and cost-effective means for good environmental management. In this context, improvements are needed to obtain higher-quality information and reliable classified images. Pansharpening enhances the spatial resolution of the multispectral bands by incorporating information from the panchromatic image. The main goal of the study is to apply pixel-based and object-based classification techniques to imagery fused with different pansharpening algorithms, and to evaluate the resulting thematic maps, which serve to obtain accurate information for the conservation of natural resources. A vulnerable, heterogeneous ecosystem in the Canary Islands (Spain), Teide National Park, was chosen, and WorldView-2 high-resolution imagery was employed. The classes considered of interest were set by the National Park conservation managers. Seven pansharpening techniques (GS, FIHS, HCS, MTF-based, Wavelet 'à trous' and Weighted Wavelet 'à trous' through Fractal Dimension Maps) were chosen in order to improve the data quality with the goal of analysing the vegetation classes. Next, different classification algorithms were applied with both pixel-based and object-based approaches, and an accuracy assessment of the resulting thematic maps was performed. The highest classification accuracy was obtained by applying a Support Vector Machine classifier with the object-based approach to the image fused with the Weighted Wavelet 'à trous' through Fractal Dimension Maps method. Finally, we highlight the difficulty of classification in the Teide ecosystem due to the heterogeneity and small size of the species; accurate thematic maps are therefore important for further studies in the management and conservation of natural resources.
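
    For readers unfamiliar with pansharpening, the sketch below shows a simplified fast-IHS-style detail injection, one of the simplest members of the family of algorithms compared in the study. It is illustrative only; the actual GS, HCS, MTF-based and wavelet methods evaluated in the paper are considerably more involved.

```python
# Simplified fast-IHS-style pansharpening sketch (an illustration only, not the
# exact algorithms compared in the paper): each upsampled multispectral band is
# sharpened by injecting the difference between the panchromatic band and the
# band mean. Inputs are assumed to be co-registered arrays of the same size.
import numpy as np

def fihs_pansharpen(ms, pan):
    """ms: (bands, H, W) multispectral image upsampled to pan resolution.
    pan: (H, W) panchromatic image. Returns a sharpened (bands, H, W) array."""
    intensity = ms.mean(axis=0)           # crude intensity component
    detail = pan - intensity              # spatial detail to inject
    return ms + detail[None, :, :]

ms = np.random.rand(4, 64, 64)            # toy 4-band image
pan = np.random.rand(64, 64)
print(fihs_pansharpen(ms, pan).shape)
```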

  5. The Use of Electromagnetic Induction Techniques for Soil Mapping

    NASA Astrophysics Data System (ADS)

    Brevik, Eric C.; Doolittle, Jim

    2015-04-01

Soils have high natural spatial variability. This has been recognized for a long time, and many methods of mapping that spatial variability have been investigated. One technique that has received considerable attention over the last ~30 years is electromagnetic induction (EMI). Particularly when coupled with modern GPS and GIS systems, EMI techniques have allowed the rapid and relatively inexpensive collection of large spatially-related data sets that can be correlated to soil properties that either directly or indirectly influence electrical conductance in the soil. Soil electrical conductivity is directly controlled by soil water content, soluble salt content, clay content and mineralogy, and temperature. A wide range of indirect controls have been identified, such as soil organic matter content and bulk density; both influence water relationships in the soil. EMI techniques work best in areas where there are large changes in one soil property that influences soil electrical conductance, and don't work as well when soil properties that influence electrical conductance are largely homogeneous. This presentation gives examples of situations where EMI techniques were successful as well as a couple of examples of situations where EMI was not so useful in mapping the spatial variability of soil properties. Reasons for both the successes and failures will be discussed.

  6. Strain mapping accuracy improvement using super-resolution techniques.

    PubMed

    Bárcena-González, G; Guerrero-Lebrero, M P; Guerrero, E; Fernández-Reyes, D; González, D; Mayoral, A; Utrilla, A D; Ulloa, J M; Galindo, P L

    2016-04-01

Super-resolution (SR) software-based techniques aim at generating a final image by combining several noisy, lower-resolution frames of the same scene. A comparative study on high-resolution high-angle annular dark field images of InAs/GaAs QDs has been carried out in order to evaluate the performance of the SR technique. The obtained SR images present enhanced resolution and higher signal-to-noise ratio (SNR) and sharpness relative to the experimental images. In addition, SR is also applied in the field of strain analysis using digital image processing applications such as geometrical phase analysis and peak pairs analysis. The precision of the strain mappings can be improved when SR methodologies are applied to experimental images. © 2015 The Authors Journal of Microscopy © 2015 Royal Microscopical Society.

  7. Vibration/Acoustic Isolation Techniques for spectroscopic mapping STS

    NASA Astrophysics Data System (ADS)

    Alldredge, Jacob; Hoskinson, Emile; Hoffmann, Joan; Harrd, Thomas; Packard, Richard; Davis, J. C.

    2004-03-01

    Previous studies of spectroscopic mapping STS systems indicate that the coupling of mechanical, acoustic, and electrical building noise to experiments can be the limiting noise factor when making measurements. With this in mind we have set out to construct new labs that reduce or eliminate noise from the building. Here we describe the design considerations and techniques used in the construction of new, extremely low vibration lab space, to be used initially for scanning tunneling spectroscopy experiments. The lab space is decoupled from the building by placing it upon 25 ton inertial blocks (30 tons including sound room and equipment) resting on air springs. The experimental apparatus itself rests on another set of air springs. The entire space is surrounded by two nested acoustic rooms. We report on the degree of isolation achieved by these techniques.

  8. Artificial Intelligence Techniques for Predicting and Mapping Daily Pan Evaporation

    NASA Astrophysics Data System (ADS)

    Arunkumar, R.; Jothiprakash, V.; Sharma, Kirty

    2017-08-01

    In this study, Artificial Intelligence techniques such as Artificial Neural Network (ANN), Model Tree (MT) and Genetic Programming (GP) are used to develop daily pan evaporation time-series (TS) prediction and cause-effect (CE) mapping models. Ten years of observed daily meteorological data such as maximum temperature, minimum temperature, relative humidity, sunshine hours, dew point temperature and pan evaporation are used for developing the models. For each technique, several models are developed by changing the number of inputs and other model parameters. The performance of each model is evaluated using standard statistical measures such as Mean Square Error, Mean Absolute Error, Normalized Mean Square Error and correlation coefficient (R). The results showed that daily TS-GP (4) model predicted better with a correlation coefficient of 0.959 than other TS models. Among various CE models, CE-ANN (6-10-1) resulted better than MT and GP models with a correlation coefficient of 0.881. Because of the complex non-linear inter-relationship among various meteorological variables, CE mapping models could not achieve the performance of TS models. From this study, it was found that GP performs better for recognizing single pattern (time series modelling), whereas ANN is better for modelling multiple patterns (cause-effect modelling) in the data.
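
    A minimal example of a time-series (TS) prediction model of the kind compared above is sketched below, using scikit-learn's MLPRegressor as a generic ANN. The synthetic evaporation series, lag order of four and network size are arbitrary illustrative choices.

```python
# Hedged illustration of a TS(4)-style model: predict daily pan evaporation
# from the four previous days using a small neural network. Data are synthetic.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
evap = 4.0 + 2.0 * np.sin(np.arange(1200) * 2 * np.pi / 365) + rng.normal(0, 0.3, 1200)

lags = 4                                          # TS(4): predict from 4 previous days
X = np.column_stack([evap[i:i - lags] for i in range(lags)])
y = evap[lags:]

model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
model.fit(X[:1000], y[:1000])                     # train on the first ~1000 days
r = np.corrcoef(model.predict(X[1000:]), y[1000:])[0, 1]
print(f"correlation coefficient on held-out days: {r:.3f}")
```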

  9. Toward accelerating landslide mapping with interactive machine learning techniques

    NASA Astrophysics Data System (ADS)

    Stumpf, André; Lachiche, Nicolas; Malet, Jean-Philippe; Kerle, Norman; Puissant, Anne

    2013-04-01

Despite important advances in the development of more automated methods for landslide mapping from optical remote sensing images, the elaboration of inventory maps after major triggering events still remains a tedious task. Image classification with expert-defined rules typically still requires significant manual labour for the elaboration and adaptation of rule sets for each particular case. Machine learning algorithms, by contrast, have the ability to learn and identify complex image patterns from labelled examples but may require relatively large amounts of training data. In order to reduce the amount of required training data, active learning has evolved as a key concept to guide the sampling for applications such as document classification, genetics and remote sensing. The general underlying idea of most active learning approaches is to initialize a machine learning model with a small training set, and to subsequently exploit the model state and/or the data structure to iteratively select the most valuable samples that should be labelled by the user and added to the training set. With relatively few queries and labelled samples, an active learning strategy should ideally yield at least the same accuracy as an equivalent classifier trained with many randomly selected samples. Our study was dedicated to the development of an active learning approach for landslide mapping from VHR remote sensing images with special consideration of the spatial distribution of the samples. The developed approach is a region-based query heuristic that guides the user's attention towards a few compact spatial batches rather than scattered points, resulting in time savings of 50% and more compared to standard active learning techniques. The approach was tested with multi-temporal and multi-sensor satellite images capturing recent large-scale triggering events in Brazil and China and demonstrated balanced user's and producer's accuracies between 74% and 80%. The assessment also
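
    The pool-based query loop outlined above can be illustrated with a generic uncertainty-sampling sketch. The example below uses synthetic data and least-confidence sampling; the paper's region-based, spatially batched heuristic is more elaborate.

```python
# Sketch of a generic pool-based active-learning loop with uncertainty sampling
# (not the paper's region-based heuristic). Data are synthetic placeholders.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=2000, n_features=10, random_state=0)
labeled = list(range(20))                          # small initial training set
pool = [i for i in range(len(y)) if i not in labeled]

clf = RandomForestClassifier(n_estimators=100, random_state=0)
for _ in range(10):                                # 10 query rounds
    clf.fit(X[labeled], y[labeled])
    proba = clf.predict_proba(X[pool])
    uncertainty = 1.0 - proba.max(axis=1)          # least-confident sampling
    query = [pool[j] for j in np.argsort(uncertainty)[-5:]]   # 5 most uncertain samples
    labeled.extend(query)                          # the "oracle" labels them (here: known y)
    pool = [i for i in pool if i not in query]

print(f"labeled samples: {len(labeled)}, accuracy on remaining pool: "
      f"{clf.score(X[pool], y[pool]):.3f}")
```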

  10. Duodenal luminal nutrient sensing

    PubMed Central

    Rønnestad, Ivar; Akiba, Yasutada; Kaji, Izumi; Kaunitz, Jonathan D

    2016-01-01

    The gastrointestinal mucosa is exposed to numerous chemical substances and microorganisms, including macronutrients, micronutrients, bacteria, endogenous ions, and proteins. The regulation of mucosal protection, digestion, absorption and motility is signaled in part by luminal solutes. Therefore, luminal chemosensing is an important mechanism enabling the mucosa to monitor luminal conditions, such as pH, ion concentrations, nutrient quantity, and microflora. The duodenal mucosa shares luminal nutrient receptors with lingual taste receptors in order to detect the five basic tastes, in addition to essential nutrients, and unwanted chemicals. The recent ‘de-orphanization’ of nutrient sensing G protein-coupled receptors provides an essential component of the mechanism by which the mucosa senses luminal nutrients. In this review, we will update the mechanisms of and underlying physiological and pathological roles in luminal nutrient sensing, with a main focus on the duodenal mucosa. PMID:25113991

  11. Interaction-Point Phase-Space Characterization using Single-Beam and Luminous-Region Measurements at PEP-II

    SciTech Connect

Kozanecki, W.; Bevan, A. J.; Viaud, B. F.; Cai, Y.; Fisher, A. S.; O'Grady, C.; Lindquist, B.; Roodman, A.; Thompson, J. M.; Weaver, M. (SLAC)

    2008-09-09

We present an extensive experimental characterization of the e± phase space at the interaction point of the SLAC PEP-II B-Factory that combines a detailed mapping of luminous-region observables using the BABAR detector with stored-beam measurements by accelerator techniques.

  12. Applications of Mapping and Tomographic Techniques in Gem Sciences

    NASA Astrophysics Data System (ADS)

    Shen, A. H.

    2014-12-01

Gem Sciences are scientific studies of gemstones - their genesis, provenance, synthesis, enhancement, treatment and identification. As high-quality forms of specific minerals, gemstones exhibit unusual physical properties that are rarely seen in their ordinary counterparts. Most gemstones are colored by trace elements incorporated in the crystal lattice during various growth stages, forming coloration zones of various scales. Studying the spectral and chemical contrast across color zones helps elucidate the origins of the colors. In modern gemological laboratories this is done with UV-visible spectrometers equipped with microscopes and with LA-ICPMS. In the case of diamonds, the colored zones arise from various structural defects incorporated in different growth zones and are studied with FTIR spectrometers equipped with IR microscopes and with laser photoluminescence spectrometers. Advances in modern synthesis techniques such as chemical vapor deposition (CVD) have created some problems for identification. Some exploratory experiments in carbon isotope mapping were done on diamonds using SIMS. The most important issue with pearls is to identify whether a particular pearl is cultured or natural; the price difference can be enormous. The classical way of making this identification is with x-ray radiographs, which clearly show the bead and the nacre. Modern cultured-pearl techniques have eliminated the need for an artificial bead, using a small piece of tissue instead. Nowadays, computed x-ray tomography (CT) scanners are used to produce a clear image of the interior of a pearl. In the Chinese jade market, filling fissures with epoxy and/or wax is very commonly seen. We are currently exploring Magnetic Resonance Imaging (MRI) techniques to map the distribution of artificial resin within polycrystalline aggregates.

  13. Optical coherence tomography investigations of ceramic lumineers

    NASA Astrophysics Data System (ADS)

    Fernandes, Luana O.; Graça, Natalia D. R. L.; Melo, Luciana S. A.; Silva, Claudio H. V.; Gomes, Anderson S. L.

    2016-02-01

Lumineers are veneer laminates used as an alternative for aesthetic dental solutions of the highest quality, but the only current means of assessing their performance is visual inspection. The objective of this study was to use the Optical Coherence Tomography (OCT) technique, working in the spectral domain, to analyze in vivo, in a single patient, 14 lumineers 180 days after cementation. It was possible to observe various kinds of changes in the cementing line and the laminate. It was concluded that OCT is an effective and promising method for clinical evaluation of the cementing line in lumineers.

  14. Cognitive Mapping Techniques: Implications for Research in Engineering and Technology Education

    ERIC Educational Resources Information Center

    Dixon, Raymond A.; Lammi, Matthew

    2014-01-01

    The primary goal of this paper is to present the theoretical basis and application of two types of cognitive maps, concept map and mind map, and explain how they can be used by educational researchers in engineering design research. Cognitive mapping techniques can be useful to researchers as they study students' problem solving strategies…

  15. Comparison of the Reliability and Validity of Scores from Two Concept-Mapping Techniques. Draft.

    ERIC Educational Resources Information Center

    Ruiz-Primo, Maria Araceli; Schultz, Susan E.; Li, Min; Shavelson, Richard J.

    A concept map is a graph in which the nodes represent concepts, the lines between the nodes represent relations, and the labels on the lines represent the nature of the relations. Concept maps have been used to assess students' knowledge structures, especially in science education. Two concept mapping techniques, constructing a map and filling in…

  16. Large areas elemental mapping by ion beam analysis techniques

    NASA Astrophysics Data System (ADS)

    Silva, T. F.; Rodrigues, C. L.; Curado, J. F.; Allegro, P.; Moro, M. V.; Campos, P. H. O. V.; Santos, S. B.; Kajiya, E. A. M.; Rizzutto, M. A.; Added, N.; Tabacniks, M. H.

    2015-07-01

The external beam line of the Laboratory for Material Analysis with Ion Beams (LAMFI) is a versatile setup for multi-technique analysis. X-ray detectors for Particle Induced X-ray Emission (PIXE) measurements, a Gamma-ray detector for Particle Induced Gamma-ray Emission (PIGE), and a particle detector for scattering analysis, such as Rutherford Backscattering Spectrometry (RBS), were already installed. In this work, we present some results using a large (60-cm range) XYZ computer-controlled sample positioning system, completely developed and built in our laboratory. The XYZ stage was installed at the external beam line, and its high spatial resolution (better than 5 μm over the full range) enables positioning the sample with high accuracy and high reproducibility. The combination of a sub-millimeter beam with the large-range XYZ robotic stage is being used to produce elemental maps of large areas in samples such as paintings, ceramics, stones, fossils, and many other kinds of objects. Due to its particular characteristics, this is a unique device for multi-technique analysis of large areas. With the continuous development of the external beam line at LAMFI, coupled to the robotic XYZ stage, it is becoming a robust and reliable option for regular analysis of trace elements (Z > 5), competing with traditional in-vacuum ion beam analysis and offering the advantage of automatic rastering.

  17. Gluing for Raman lidar systems using the lamp mapping technique.

    PubMed

    Walker, Monique; Venable, Demetrius; Whiteman, David N

    2014-12-20

    In the context of combined analog and photon counting (PC) data acquisition in a Lidar system, glue coefficients are defined as constants used for converting an analog signal into a virtual PC signal. The coefficients are typically calculated using Lidar profile data taken under clear, nighttime conditions since, in the presence of clouds or high solar background, it is difficult to obtain accurate glue coefficients from Lidar backscattered data. Here we introduce a new method in which we use the lamp mapping technique (LMT) to determine glue coefficients in a manner that does not require atmospheric profiles to be acquired and permits accurate glue coefficients to be calculated when adequate Lidar profile data are not available. The LMT involves scanning a halogen lamp over the aperture of a Lidar receiver telescope such that the optical efficiency of the entire detection system is characterized. The studies shown here involve two Raman lidar systems; the first from Howard University and the second from NASA/Goddard Space Flight Center. The glue coefficients determined using the LMT and the Lidar backscattered method agreed within 1.2% for the water vapor channel and within 2.5% for the nitrogen channel for both Lidar systems. We believe this to be the first instance of the use of laboratory techniques for determining the glue coefficients for Lidar data analysis.
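
    The gluing idea summarized above can be illustrated with a simple linear fit of the photon-counting channel against the analog channel over the range where the counting channel remains linear. The synthetic profiles and the linearity cut-off in the sketch below are assumptions for demonstration, not the LMT procedure itself.

```python
# Minimal sketch of signal gluing: fit coefficients that map the analog channel
# onto the photon-counting (PC) channel where the PC channel is still linear.
# The 10 MHz cut-off and synthetic signals below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)
analog = np.linspace(0.5, 200.0, 400)                  # analog signal (arbitrary units)
true_rate = 0.4 * analog                               # "true" count rate (MHz)
pc = true_rate / (1.0 + 0.02 * true_rate)              # PC channel with dead-time roll-off
pc += rng.normal(0, 0.05, pc.size)                     # measurement noise

linear = pc < 10.0                                     # keep only the linear PC regime (assumed)
slope, intercept = np.polyfit(analog[linear], pc[linear], 1)
virtual_pc = slope * analog + intercept                # "glued" virtual PC signal
print(f"glue coefficients: slope={slope:.4f}, intercept={intercept:.4f}")
```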

  18. Semantic Data And Visualization Techniques Applied To Geologic Field Mapping

    NASA Astrophysics Data System (ADS)

    Houser, P. I. Q.; Royo-Leon, M.; Munoz, R.; Estrada, E.; Villanueva-Rosales, N.; Pennington, D. D.

    2015-12-01

Geologic field mapping involves the use of technology before, during, and after visiting a site. Geologists utilize hardware such as Global Positioning Systems (GPS) connected to mobile computing platforms such as tablets, together with software such as ESRI's ArcPad and other packages, to produce maps and figures for a final analysis and report. Hand-written field notes contain important information and drawings or sketches of specific areas within the field study. Our goal is to collect and geo-tag final and raw field data into a cyber-infrastructure environment with an ontology that allows for large data processing, visualization, sharing, and searching, aiding in connecting field research with prior research in the same area and/or aiding with experiment replication. Online searches of a specific field area return results such as weather data from NOAA and QuakeML seismic data from USGS. These results can then be saved to a field mobile device and searched while in the field where there is no Internet connection. To accomplish this we created the GeoField ontology service using the Web Ontology Language (OWL) and Protégé software. Advanced queries on the dataset can be made using reasoning capabilities that go beyond those of a standard database service. These improvements include the automated discovery of data relevant to a specific field site and visualization techniques aimed at enhancing analysis and collaboration while in the field by draping data over mobile views of the site using augmented reality. A case study is being performed at the University of Texas at El Paso's Indio Mountains Research Station located near Van Horn, Texas, an active multi-disciplinary field study site. The user can interactively move the camera around the study site and view their data digitally. Geologists can check their data against the site in real time and improve collaboration with another person, as both parties have the same interactive view of the data.

  19. Analyzing the Effects of Various Concept Mapping Techniques on Learning Achievement under Different Learning Styles

    ERIC Educational Resources Information Center

    Chiou, Chei-Chang; Lee, Li-Tze; Tien, Li-Chu; Wang, Yu-Min

    2017-01-01

    This study explored the effectiveness of different concept mapping techniques on the learning achievement of senior accounting students and whether achievements attained using various techniques are affected by different learning styles. The techniques are computer-assisted construct-by-self-concept mapping (CACSB), computer-assisted…

  20. Photodissociation dynamics of 2-bromopropane using velocity map imaging technique.

    PubMed

    Zhu, Rongshu; Tang, Bifeng; Zhang, Xiu; Zhang, Bing

    2010-06-03

Photodissociation dynamics of 2-bromopropane in the A band was investigated at several wavelengths between 232 and 267 nm using the resonance-enhanced multiphoton ionization technique combined with velocity map ion-imaging detection. The ion images of Br(²P₃/₂) and Br*(²P₁/₂) were analyzed to yield the corresponding total translational energy and angular distributions. The total translational energy distributions showed that the channel leading to Br carried more internal energy in the 2-C₃H₇ moiety than the channel leading to Br*. The anisotropy parameters β(Br) were found to be between 0.68 and 1.49, and β(Br*) between 0.73 and 1.96, indicating that the Br* product originates from direct excitation of the ³Q₀ state and the ¹Q₁ → ³Q₀ nonadiabatic transition, and the Br product from direct excitation of the ¹Q₁ or ³Q₁ state and the ³Q₀ → ¹Q₁ nonadiabatic transition. The curve-crossing probabilities were determined to increase with wavelength. Compared with the case of CH₃Br, the two heavier branched CH₃ groups significantly enhance Br(²P₃/₂) production from the nonadiabatic contribution. The curve-crossing probability from the ³Q₀ to the ¹Q₁ surface is much higher than that of the reverse crossing from the ¹Q₁ to the ³Q₀ surface, which may result from the difference in shape between the potential energy surfaces of the ³Q₀ and ¹Q₁ states. Finally, based on the experimental data, the partial absorption cross sections of the A band for the ³Q₀, ³Q₁, and ¹Q₁ states were extracted.

  1. Mapping ecological risks with a portfolio-based technique: incorporating uncertainty and decision-making preferences

    Treesearch

    Denys Yemshanov; Frank H. Koch; Mark Ducey; Klaus Koehler

    2013-01-01

    Geographic mapping of risks is a useful analytical step in ecological risk assessments and in particular, in analyses aimed to estimate risks associated with introductions of invasive organisms. In this paper, we approach invasive species risk mapping as a portfolio allocation problem and apply techniques from decision theory to build an invasion risk map that combines...

  2. Digital Mapping Techniques '09-Workshop Proceedings, Morgantown, West Virginia, May 10-13, 2009

    USGS Publications Warehouse

    Soller, David R.

    2011-01-01

    As in the previous years' meetings, the objective was to foster informal discussion and exchange of technical information, principally in order to develop more efficient methods for digital mapping, cartography, GIS analysis, and information management. At this meeting, oral and poster presentations and special discussion sessions emphasized (1) methods for creating and publishing map products (here, "publishing" includes Web-based release); (2) field data capture software and techniques, including the use of LiDAR; (3) digital cartographic techniques; (4) migration of digital maps into ArcGIS Geodatabase format; (5) analytical GIS techniques; and (6) continued development of the National Geologic Map Database.

  3. Techniques, problems and uses of mega-geomorphological mapping

    NASA Technical Reports Server (NTRS)

    Embleton, C.

    1985-01-01

    A plea for a program of global geomorphological mapping based on remote sensing data is presented. It is argued that the program is a necessary step in bringing together the rapidly evolving concepts of plate tectonics with the science of geomorphology. Geomorphologists are urged to bring temporal scales into their subject and to abandon their recent isolation from tectonics and geological history. It is suggested that a start be made with a new geomorphological map of Europe, utilizing the latest space technology.

  4. Myocardial T1 Mapping: Techniques and Potential Applications

    PubMed Central

    Burt, Jeremy R.; Zimmerman, Stefan L.; Kamel, Ihab R.; Halushka, Marc

    2014-01-01

    Myocardial fibrosis is a common endpoint in a variety of cardiac diseases and a major independent predictor of adverse cardiac outcomes. Short of histopathologic analysis, which is limited by sampling bias, most diagnostic modalities are limited in their depiction of myocardial fibrosis. Cardiac magnetic resonance (MR) imaging has the advantage of providing detailed soft-tissue characterization, and a variety of novel quantification methods have further improved its usefulness. Contrast material–enhanced cardiac MR imaging depends on differences in signal intensity between regions of scarring and adjacent normal myocardium. Diffuse myocardial fibrosis lacks these differences in signal intensity. Measurement of myocardial T1 times (T1 mapping) with gadolinium-enhanced inversion recovery–prepared sequences may depict diffuse myocardial fibrosis and has good correlation with ex vivo fibrosis content. T1 mapping calculates myocardial T1 relaxation times with image-based signal intensities and may be performed with standard cardiac MR imagers and radiologic workstations. Myocardium with diffuse fibrosis has greater retention of contrast material, resulting in T1 times that are shorter than those in normal myocardium. Early studies have suggested that diffuse myocardial fibrosis may be distinguished from normal myocardium with T1 mapping. Large multicenter studies are needed to define the role of T1 mapping in developing prognoses and therapeutic assessments. However, given its strengths as a noninvasive method for direct quantification of myocardial fibrosis, T1 mapping may eventually play an important role in the management of cardiac disease. © RSNA, 2014 PMID:24617686
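
    As background to the T1 mapping described above, the sketch below fits a generic three-parameter inversion-recovery model to a synthetic signal curve and applies the commonly used correction from the apparent T1* to T1 (as in Look-Locker/MOLLI-type acquisitions). The inversion times and amplitudes are illustrative assumptions, not a specific clinical sequence.

```python
# Illustrative pixel-wise T1 fit for an inversion-recovery style acquisition
# (a generic textbook model, not a specific vendor sequence):
#   S(TI) = A - B * exp(-TI / T1*),  with  T1 = T1* * (B/A - 1).
# Assumes phase-sensitive (signed) data; all values below are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def ir_signal(ti, a, b, t1_star):
    """Three-parameter inversion-recovery model."""
    return a - b * np.exp(-ti / t1_star)

ti = np.array([100., 200., 400., 800., 1600., 3200.])     # inversion times, ms (assumed)
rng = np.random.default_rng(3)
signal = ir_signal(ti, 1.0, 2.0, 1200.0) + rng.normal(0, 0.01, ti.size)

(a, b, t1_star), _ = curve_fit(ir_signal, ti, signal, p0=(1.0, 2.0, 1000.0))
t1 = t1_star * (b / a - 1.0)          # Look-Locker style correction of the apparent T1*
print(f"fitted T1 ≈ {t1:.0f} ms")
```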

  5. A comparison of two conformal mapping techniques applied to an aerobrake body

    NASA Technical Reports Server (NTRS)

    Hommel, Mark J.

    1987-01-01

    Conformal mapping is a classical technique which has been utilized for solving problems in aerodynamics and hydrodynamics. Conformal mapping has been successfully applied in the construction of grids around airfoils, engine inlets and other aircraft configurations. Conformal mapping techniques were applied to an aerobrake body having an axis of symmetry. Two different approaches were utilized: (1) Karman-Trefftz transformation; and (2) Point Wise Schwarz Christoffel transformation. In both cases, the aerobrake body was mapped onto a near circle, and a grid was generated in the mapped plane. The mapped body and grid were then mapped back into physical space and the properties of the associated grids were examined. Advantages and disadvantages of both approaches are discussed.
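
    For reference, the Karman-Trefftz transformation named above can be written in a few lines; the sketch below maps a circle to an airfoil-like contour in its classical form. The circle centre, radius and trailing-edge angle are demonstration values and have nothing to do with the aerobrake geometry of the paper, which maps the body onto a near circle rather than the reverse.

```python
# Hedged illustration of the classical Karman-Trefftz transformation:
#   z = n * (r**n + 1) / (r**n - 1),  r = (zeta + 1) / (zeta - 1),
# written as a ratio to avoid branch-cut artifacts. Parameters are arbitrary.
import numpy as np

def karman_trefftz(zeta, n):
    r = (zeta + 1.0) / (zeta - 1.0)
    rn = r ** n
    return n * (rn + 1.0) / (rn - 1.0)

n = 2.0 - 10.0 / 180.0            # exponent for a 10-degree trailing-edge angle (n = 2 - tau/pi)
center = -0.1 + 0.1j              # circle centre chosen so the circle passes through zeta = +1
radius = abs(1.0 - center)
theta = np.linspace(0.0, 2.0 * np.pi, 200)
circle = center + radius * np.exp(1j * theta)

airfoil = karman_trefftz(circle, n)
print(airfoil.real.min(), airfoil.real.max())   # chordwise extent of the mapped contour
```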

  6. Boson mapping techniques applied to constant gauge fields in QCD

    NASA Technical Reports Server (NTRS)

    Hess, Peter Otto; Lopez, J. C.

    1995-01-01

    Pairs of coordinates and derivatives of the constant gluon modes are mapped to new gluon-pair fields and their derivatives. Applying this mapping to the Hamiltonian of constant gluon fields results for large coupling constants into an effective Hamiltonian which separates into one describing a scalar field and another one for a field with spin two. The ground state is dominated by pairs of gluons coupled to color and spin zero with slight admixtures of color zero and spin two pairs. As color group we used SU(2).

  7. THE TOTAL LUMINOUS EFFICIENCY OF LUMINOUS BACTERIA

    PubMed Central

    Harvey, E. Newton

    1925-01-01

Methods are described for measuring the light emitted by an emulsion of luminous bacteria of given thickness, and calculating the light emitted by a single bacterium, measuring 1.1 x 2.2 micra, provided there is no absorption of light in the emulsion. At the same time, the oxygen consumed by a single bacterium was measured by recording the time for the bacteria to use up .9 of the oxygen dissolved in sea water from air (20 per cent oxygen). The luminescence intensity does not diminish until the oxygen concentration falls below 2 per cent, when the luminescence diminishes rapidly. Above 2 per cent oxygen (when the oxygen dissolving in sea water from pure oxygen at 760 mm. Hg pressure = 100 per cent) the bacteria use equal amounts of oxygen in equal times, while below 2 per cent oxygen it seems very likely that rate of oxygen absorption is proportional to oxygen concentration. By measuring the time for a tube of luminous bacteria of known concentration saturated with air (20 per cent oxygen) to begin to darken (2 per cent oxygen) we can calculate the oxygen absorbed by one bacterium per second. The bacteria per cc. are counted on a blood counting slide or by a centrifugal method, after measuring the volume of a single bacterium (1.695 x 10⁻¹² cc.). Both methods gave results in good agreement with each other. The maximum value for the light from a single bacterium was 24 x 10⁻¹⁴ lumens or 1.9 x 10⁻¹⁴ candles. The maximum value for lumen-seconds per mg. of oxygen absorbed was 14. The average value for lumen-seconds per mg. O2 was 9.25. The maximum values were selected in calculating the efficiency of light production, since some of the bacteria counted may not be producing light, although they may still be using oxygen. The "diet" of the bacteria was 60 per cent glycerol and 40 per cent peptone. To oxidize this mixture each mg. of oxygen would yield 3.38 gm. calories or 14.1 watts per second. 1 lumen per watt is therefore produced by a normal bacterium which

  8. Simulating and mapping spatial complexity using multi-scale techniques

    USGS Publications Warehouse

    De Cola, L.

    1994-01-01

    A central problem in spatial analysis is the mapping of data for complex spatial fields using relatively simple data structures, such as those of a conventional GIS. This complexity can be measured using such indices as multi-scale variance, which reflects spatial autocorrelation, and multi-fractal dimension, which characterizes the values of fields. These indices are computed for three spatial processes: Gaussian noise, a simple mathematical function, and data for a random walk. Fractal analysis is then used to produce a vegetation map of the central region of California based on a satellite image. This analysis suggests that real world data lie on a continuum between the simple and the random, and that a major GIS challenge is the scientific representation and understanding of rapidly changing multi-scale fields. -Author
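
    One of the complexity indices mentioned above, the fractal dimension, can be estimated for a binary spatial pattern with a simple box-counting sketch like the one below. This is only the basic idea on synthetic data; the multi-scale variance and multifractal measures used in the paper are more general.

```python
# Much-simplified illustration: box-counting dimension estimate for a 2-D
# boolean pattern. Box sizes and the synthetic "vegetation" mask are arbitrary.
import numpy as np

def box_counting_dimension(mask, sizes=(1, 2, 4, 8, 16, 32)):
    """Estimate the fractal dimension of a 2-D boolean array by box counting."""
    n = mask.shape[0]
    counts = []
    for s in sizes:
        # count boxes of side s that contain at least one "on" cell
        trimmed = mask[:n - n % s, :n - n % s]
        boxes = trimmed.reshape(n // s, s, -1, s).any(axis=(1, 3))
        counts.append(boxes.sum())
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    return -slope          # dimension is minus the log-log slope

rng = np.random.default_rng(4)
pattern = rng.random((128, 128)) < 0.3          # random binary mask
print(f"box-counting dimension ≈ {box_counting_dimension(pattern):.2f}")
```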

  9. Knowledge mapping as a technique to support knowledge translation.

    PubMed Central

    Ebener, S.; Khan, A.; Shademani, R.; Compernolle, L.; Beltran, M.; Lansang, Ma; Lippman, M.

    2006-01-01

    This paper explores the possibility of integrating knowledge mapping into a conceptual framework that could serve as a tool for understanding the many complex processes, resources and people involved in a health system, and for identifying potential gaps within knowledge translation processes in order to address them. After defining knowledge mapping, this paper presents various examples of the application of this process in health, before looking at the steps that need to be taken to identify potential gaps, to determine to what extent these gaps affect the knowledge translation process and to establish their cause. This is followed by proposals for interventions aimed at strengthening the overall process. Finally, potential limitations on the application of this framework at the country level are addressed. PMID:16917651

  10. Mapping ionospheric observations using combined techniques for Europe region

    NASA Astrophysics Data System (ADS)

    Tomasik, Lukasz; Gulyaeva, Tamara; Stanislawska, Iwona; Swiatek, Anna; Pozoga, Mariusz; Dziak-Jankowska, Beata

A k-nearest-neighbours (KNN) algorithm for filling gaps in missing F2-layer critical frequency data is proposed and applied. The method uses TEC data calculated from the EGNOS Vertical Delay Estimate (VDE ≈ 0.78 TECU) and several GNSS stations, together with its spatial correlation with data from selected ionosondes. For mapping purposes, a two-dimensional similarity function was used within the KNN method.
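
    A minimal sketch of KNN gap filling in this spirit is shown below, using scikit-learn's KNeighborsRegressor on synthetic station coordinates and critical-frequency values; the station layout, target values and k = 5 are illustrative assumptions only.

```python
# Hedged sketch of KNN-based gap filling on synthetic station data
# (not the authors' similarity function or real ionosonde/GNSS data).
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(5)
lon = rng.uniform(-10, 40, 200)                    # station longitudes (deg)
lat = rng.uniform(35, 70, 200)                     # station latitudes (deg)
fof2 = 6.0 + 0.05 * (45.0 - lat) + rng.normal(0, 0.2, 200)   # synthetic critical frequency (MHz)

have_data = rng.random(200) > 0.3                  # ~30% of stations treated as "missing"
coords = np.column_stack([lon, lat])
knn = KNeighborsRegressor(n_neighbors=5, weights="distance")
knn.fit(coords[have_data], fof2[have_data])

filled = knn.predict(coords[~have_data])           # fill the gaps from the 5 nearest stations
print(f"filled {filled.size} gaps, example value: {filled[0]:.2f} MHz")
```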

  11. Research Techniques Made Simple: Immunofluorescence Antigen Mapping in Epidermolysis Bullosa.

    PubMed

    Has, Cristina; He, Yinghong

    2016-07-01

    Inherited epidermolysis bullosa is a group of genetic blistering diseases with a broad spectrum of clinical severity and molecular defects. Epidermolysis bullosa results from mutations in genes encoding proteins involved in cell-cell and cell-matrix adhesion in the epidermis. Immunofluorescence antigen mapping makes use of monoclonal antibodies against proteins of the dermal-epidermal junction zone to determine the layer of skin where cleavage occurs and the relative protein abundance. It allows the diagnosis of the type and subtype of inherited epidermolysis bullosa and sheds light on molecular mechanisms underlying the disease. Immunofluorescence mapping steps include obtaining a skin biopsy sample, processing the biopsy material, antigen-antibody interaction on tissue, washing, incubation with fluorescently conjugated secondary antibodies, mounting, observation under a fluorescence microscope, and interpretation. A minimal antibody panel allows discrimination of the main epidermolysis bullosa subtypes. Extended panels can be used depending on the diagnostic or scientific question to be addressed. Immunofluorescence mapping contributed to significant progress in understanding epidermolysis bullosa, including identification of new underlying genetic mutations, mutation mechanisms, and the presence of revertant mosaicism. It is also an important tool in the assessment of the efficacy of experimental therapeutic approaches. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  12. DNA fiber mapping techniques for the assembly of high-resolution physical maps.

    PubMed

    Weier, H U

    2001-08-01

    High-resolution physical maps are indispensable for directed sequencing projects or the finishing stages of shotgun sequencing projects. These maps are also critical for the positional cloning of disease genes and genetic elements that regulate gene expression. Typically, physical maps are based on ordered sets of large insert DNA clones from cosmid, P1/PAC/BAC, or yeast artificial chromosome (YAC) libraries. Recent technical developments provide detailed information about overlaps or gaps between clones and precisely locate the position of sequence tagged sites or expressed sequences, and thus support efforts to determine the complete sequence of the human genome and model organisms. Assembly of physical maps is greatly facilitated by hybridization of non-isotopically labeled DNA probes onto DNA molecules that were released from interphase cell nuclei or recombinant DNA clones, stretched to some extent and then immobilized on a solid support. The bound DNA, collectively called "DNA fibers," may consist of single DNA molecules in some experiments or bundles of chromatin fibers in others. Once released from the interphase nuclei, the DNA fibers become more accessible to probes and detection reagents. Hybridization efficiency is therefore increased, allowing the detection of DNA targets as small as a few hundred base pairs. This review summarizes different approaches to DNA fiber mapping and discusses the detection sensitivity and mapping accuracy as well as recent achievements in mapping expressed sequence tags and DNA replication sites.

  13. Mapping Diffuse Seismicity Using Empirical Matched Field Processing Techniques

    SciTech Connect

    Wang, J; Templeton, D C; Harris, D B

    2011-01-21

    The objective of this project is to detect and locate more microearthquakes using the empirical matched field processing (MFP) method than can be detected using only conventional earthquake detection techniques. We propose that empirical MFP can complement existing catalogs and techniques. We test our method on continuous seismic data collected at the Salton Sea Geothermal Field during November 2009 and January 2010. In the Southern California Earthquake Data Center (SCEDC) earthquake catalog, 619 events were identified in our study area during this time frame, and our MFP technique identified 1094 events. Therefore, we believe that the empirical MFP method, combined with conventional methods, significantly improves the network detection ability in an efficient manner.
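
    A much-simplified, single-channel stand-in for this kind of template-based event detection: the full empirical MFP method works on multi-station array data, but the core idea of sliding a reference waveform along a continuous record and flagging high-similarity windows can be illustrated with a normalized cross-correlation detector. The waveform, template, and threshold below are synthetic.

    ```python
    # Simplified single-channel template detector (not the full empirical MFP).
    import numpy as np

    def correlation_detector(trace, template, threshold=0.7):
        """Return sample indices where the normalized cross-correlation
        between the template and the continuous trace exceeds threshold."""
        n = len(template)
        t = (template - template.mean()) / template.std()
        detections = []
        for i in range(len(trace) - n):
            w = trace[i:i + n]
            if w.std() == 0:
                continue
            cc = np.dot((w - w.mean()) / w.std(), t) / n
            if cc > threshold:
                detections.append(i)
        return detections

    rng = np.random.default_rng(0)
    template = np.sin(np.linspace(0, 6 * np.pi, 100)) * np.hanning(100)
    trace = rng.normal(0, 0.2, 5000)
    trace[1200:1300] += template          # bury one synthetic "event"
    print(correlation_detector(trace, template))
    ```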

  14. Middle Atmosphere Program. Handbook for MAP. Volume 15: Balloon techniques

    NASA Technical Reports Server (NTRS)

    Murcray, D. G. (Editor)

    1985-01-01

    Some techniques employed by investigators using balloons to obtain data on the properties of the middle atmosphere are discussed. Much effort has gone into developing instruments which could be used on small balloons to measure temperature and variable species. These efforts are discussed. Remote sensing techniques used to obtain data on atmospheric composition are described. Measurement of stratospheric ions and stratospheric aerosols are also discussed.

  15. Development of predictive mapping techniques for soil survey and salinity mapping

    NASA Astrophysics Data System (ADS)

    Elnaggar, Abdelhamid A.

    Conventional soil maps represent a valuable source of information about soil characteristics; however, they are subjective, very expensive, and time-consuming to prepare. They also lack explicit information about the conceptual mental model used in developing them, about their accuracy, and about the error associated with them. Decision tree analysis (DTA) was successfully used in retrieving the expert knowledge embedded in old soil survey data. This knowledge was efficiently used in developing predictive soil maps for the study areas in Benton and Malheur Counties, Oregon, and in assessing their consistency. A retrieved soil-landscape model from a reference area in Harney County was extrapolated to develop a preliminary soil map for the neighboring unmapped part of Malheur County. The developed map had a low prediction accuracy, and only a few soil map units (SMUs) were predicted with significant accuracy, mostly shallow SMUs that either have a lithic contact with the bedrock or developed on a duripan. On the other hand, the soil map developed from field data was predicted with very high accuracy (overall about 97%). Salt-affected areas of the Malheur County study area are indicated by their high spectral reflectance and are easily discriminated in the remote sensing data. However, remote sensing data fail to distinguish between the different classes of soil salinity. Using the DTA method, five classes of soil salinity were successfully predicted with an overall accuracy of about 99%. Moreover, the calculated area of salt-affected soil was overestimated when mapped using remote sensing data compared with that predicted by DTA. Hence, DTA could be a very helpful approach for developing soil survey and soil salinity maps in a more objective, effective, less expensive, and quicker way based on field data.
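
    A minimal, hedged sketch of decision tree analysis for predicting soil map units from terrain attributes, using scikit-learn rather than the study's own software; the feature columns and class labels are invented placeholders, not the Oregon survey data.

    ```python
    # Decision tree analysis (DTA) sketch: predict soil map units from terrain.
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier, export_text

    # Columns: elevation (m), slope (deg), wetness index -- illustrative only.
    X = np.array([[350, 2, 8.5],
                  [420, 15, 4.2],
                  [600, 25, 3.1],
                  [340, 1, 9.0],
                  [580, 22, 3.5],
                  [410, 12, 5.0]])
    y = ["floodplain", "hillslope", "ridge", "floodplain", "ridge", "hillslope"]

    tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
    print(export_text(tree, feature_names=["elev", "slope", "twi"]))
    print(tree.predict([[390, 10, 5.5]]))  # predicted soil map unit
    ```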

  16. S.M.A.R.T. map: Site map attribute retrieval technique

    SciTech Connect

    Brown-Rall, M.

    1995-03-29

    Plant Engineering's Space and Site Planning (S&SP) organization at Lawrence Livermore National Laboratory (LLNL) has developed a new tool: a computerized mapping system that can graphically illustrate facility characteristics. The current "base" map being used is the LLNL Site Map prepared by Plant Engineering's CADD Support group. Using current information in the Facility Information Tracking System (FITS) database, a Microsoft Excel spreadsheet, an electronic sort can be made, tying the AutoCAD-generated site map to specific database fields. This link is accomplished by using a software overlay called the CadPLUS InfoEngine. The fields in the database include such things as facility number, occupant program, population, facility age, facility quality, and security level. By selecting one or a combination of the fields, a map is generated, illustrating in color and hatch patterns the facilities or entities that are associated with the chosen fields. This process can be very useful for seeing the LLNL site at a glance, with highlighted characteristics for particular areas of interest. The generation of large complex graphics, using large-scale databases selectively, can be accomplished quickly. These extractions and links between data and graphics create a S.M.A.R.T. Map.

  17. Quantification of Geologic Lineaments by Manual and Machine Processing Techniques. [Landsat satellites - mapping/geological faults

    NASA Technical Reports Server (NTRS)

    Podwysocki, M. H.; Moik, J. G.; Shoup, W. C.

    1975-01-01

    The effect of operator variability and subjectivity in lineament mapping and methods to minimize or eliminate these problems by use of several machine preprocessing methods was studied. Mapped lineaments of a test landmass were used and the results were compared statistically. The total number of fractures mapped by the operators and their average lengths varied considerably, although comparison of lineament directions revealed some consensus. A summary map (785 linears) produced by overlaying the maps generated by the four operators shows that only 0.4 percent were recognized by all four operators, 4.7 percent by three, 17.8 percent by two, and 77 percent by one operator. Similar results were obtained in comparing these results with another independent group. This large amount of variability suggests a need for the standardization of mapping techniques, which might be accomplished by a machine aided procedure. Two methods of machine aided mapping were tested, both simulating directional filters.

  18. Reference Map Technique for Incompressible Fluid-Structure Interaction Problems

    NASA Astrophysics Data System (ADS)

    Rycroft, Chris; Wu, Chen-Hung; Yu, Yue; Kamrin, Ken

    2016-11-01

    We present a fully Eulerian approach to simulate soft structures immersed in an incompressible fluid. The flow is simulated on a fixed grid using a second order projection method to solve the incompressible Navier-Stokes equations, and the fluid-structure interfaces are modeled using the level set method. By introducing a reference map variable to model finite-deformation constitutive relations in the structure on the same grid as the fluid, the interfacial coupling is highly simplified. This fully Eulerian approach provides a computationally efficient alternative to moving mesh approaches. Example simulations featuring many-body contacts and flexible swimmers will be presented.

  19. A technique for conducting point pattern analysis of cluster plot stem-maps

    Treesearch

    C.W. Woodall; J.M. Graham

    2004-01-01

    Point pattern analysis of forest inventory stem-maps may aid interpretation and inventory estimation of forest attributes. To evaluate the techniques and benefits of conducting point pattern analysis of forest inventory stem-maps, Ripley's K(t) was calculated for simulated tree spatial distributions and for over 600 USDA Forest Service Forest...
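
    A minimal sketch of an uncorrected Ripley's K(t) estimate for a stem-map; real analyses would add an edge correction and compare against complete spatial randomness. The plot size and stem coordinates are synthetic.

    ```python
    # Naive Ripley's K(t): mean number of neighbours within distance t,
    # scaled by intensity, without edge correction.
    import numpy as np

    def ripley_k(points, t_values, area):
        n = len(points)
        lam = n / area
        d = np.sqrt(((points[:, None, :] - points[None, :, :]) ** 2).sum(-1))
        np.fill_diagonal(d, np.inf)          # exclude self-pairs
        return np.array([(d < t).sum() / (n * lam) for t in t_values])

    rng = np.random.default_rng(1)
    pts = rng.uniform(0, 50, size=(200, 2))  # 200 stems on a 50 m x 50 m plot
    t = np.linspace(1, 10, 10)
    print(ripley_k(pts, t, area=50 * 50))    # compare with pi*t**2 for CSR
    ```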

  20. The Random-Map Technique: Enhancing Mind-Mapping with a Conceptual Combination Technique to Foster Creative Potential

    ERIC Educational Resources Information Center

    Malycha, Charlotte P.; Maier, Günter W.

    2017-01-01

    Although creativity techniques are highly recommended in working environments, their effects have been scarcely investigated. Two cognitive processes are often considered to foster creative potential and are, therefore, taken as a basis for creativity techniques: knowledge activation and conceptual combination. In this study, both processes were…

  1. Photoinduced force microscopy: A technique for hyperspectral nanochemical mapping

    NASA Astrophysics Data System (ADS)

    Murdick, Ryan A.; Morrison, William; Nowak, Derek; Albrecht, Thomas R.; Jahng, Junghoon; Park, Sung

    2017-08-01

    Advances in nanotechnology have intensified the need for tools that can characterize newly synthesized nanomaterials. A variety of techniques combining atomic force microscopy (AFM) with optical illumination has recently been demonstrated, including tip-enhanced Raman spectroscopy (TERS), scattering-type scanning near-field optical microscopy (sSNOM), and photothermal induced resonance microscopy (PTIR). To varying degrees, these existing techniques enable optical spectroscopy with the nanoscale spatial resolution inherent to AFM, thereby providing nanochemical interrogation of a specimen. Here we discuss photoinduced force microscopy (PiFM), a recently developed technique for nanoscale optical spectroscopy that exploits image forces acting between an AFM tip and sample to detect wavelength-dependent polarization within the sample and generate absorption spectra. This approach enables ∼10 nm spatial resolution with spectra that show correlation with macroscopic optical absorption spectra. Unlike other techniques, PiFM achieves this high resolution with virtually no constraints on sample or substrate properties. The applicability of PiFM to a variety of archetypal systems is reported here, highlighting the potential of PiFM as a useful tool for a wide variety of industrial and academic investigations, including semiconducting nanoparticles, nanocellulose, block copolymers, and low-dimensional systems, as well as chemical and morphological mixing at interfaces.

  2. Value-by-alpha maps: An alternative technique to the cartogram

    PubMed Central

    Roth, Robert E.; Woodruff, Andrew W.; Johnson, Zachary F.

    2011-01-01

    The cartogram, or value-by-area map, is a popular technique for cartographically representing social data. Such maps visually equalize a basemap prior to mapping a social variable by adjusting the size of each enumeration unit by a second, related variable. However, to scale the basemap units according to an equalizing variable, cartograms must distort the shape and/or topology of the original geography. Such compromises reduce the effectiveness of the visualization for elemental and general map-reading tasks. Here we describe a new kind of representation, termed a value-by-alpha map, which visually equalizes the basemap by adjusting the alpha channel, rather than the size, of each enumeration unit. Although not without its own limitations, the value-by-alpha map is able to circumvent the compromise inherent to the cartogram form, perfectly equalizing the basemap while preserving both shape and topology. PMID:21927062
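
    A minimal sketch of the value-by-alpha idea: the mapped variable is encoded as colour while the equalizing variable (e.g. population) drives the alpha channel, leaving shape and topology untouched. Plain rectangles stand in for enumeration units, and all values are invented.

    ```python
    # Value-by-alpha sketch: colour = variable of interest, alpha = equalizer.
    import numpy as np
    import matplotlib.pyplot as plt
    import matplotlib.patches as mpatches

    rate = np.array([0.2, 0.5, 0.8, 0.35])        # variable of interest
    population = np.array([1000, 50000, 20000, 5000])
    alpha = population / population.max()          # equalizing variable -> alpha

    fig, ax = plt.subplots()
    cmap = plt.cm.viridis
    for i, (r, a) in enumerate(zip(rate, alpha)):
        ax.add_patch(mpatches.Rectangle((i, 0), 0.9, 1,
                                        facecolor=cmap(r), alpha=a))
    ax.set_xlim(0, 4)
    ax.set_ylim(0, 1)
    plt.savefig("value_by_alpha_demo.png")
    ```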

  3. Comparing interpolation techniques for annual temperature mapping across Xinjiang region

    NASA Astrophysics Data System (ADS)

    Ren-ping, Zhang; Jing, Guo; Tian-gang, Liang; Qi-sheng, Feng; Aimaiti, Yusupujiang

    2016-11-01

    Interpolating climatic variables such as temperature is challenging due to the highly variable nature of meteorological processes and the difficulty in establishing a representative network of stations. In this paper, based on monthly temperature data obtained from 154 official meteorological stations in the Xinjiang region and surrounding areas, we compared five spatial interpolation techniques: inverse distance weighting (IDW), ordinary kriging, cokriging, thin-plate smoothing splines (ANUSPLIN) and empirical Bayesian kriging (EBK). Error metrics were used to validate the interpolations against independent data. Results indicated that ANUSPLIN performed better than the other four interpolation methods.
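
    A minimal sketch of inverse distance weighting, the simplest of the five interpolators compared above; the station coordinates and temperatures are synthetic placeholders, not the Xinjiang data.

    ```python
    # Inverse distance weighting (IDW) interpolation sketch.
    import numpy as np

    def idw(xy_obs, z_obs, xy_query, power=2.0):
        d = np.sqrt(((xy_query[:, None, :] - xy_obs[None, :, :]) ** 2).sum(-1))
        d = np.maximum(d, 1e-12)              # avoid division by zero
        w = 1.0 / d ** power
        return (w * z_obs).sum(axis=1) / w.sum(axis=1)

    stations = np.array([[87.6, 43.8], [82.9, 41.7], [85.5, 44.3]])  # lon, lat
    temps = np.array([7.2, 9.8, 6.1])            # annual mean temperature, deg C
    grid = np.array([[84.0, 43.0], [86.0, 42.5]])
    print(idw(stations, temps, grid))
    ```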

  4. An image morphing technique based on optimal mass preserving mapping.

    PubMed

    Zhu, Lei; Yang, Yan; Haker, Steven; Tannenbaum, Allen

    2007-06-01

    Image morphing, or image interpolation in the time domain, deals with the metamorphosis of one image into another. In this paper, a new class of image morphing algorithms is proposed based on the theory of optimal mass transport. The L(2) mass moving energy functional is modified by adding an intensity penalizing term, in order to reduce the undesired double exposure effect. It is an intensity-based approach and, thus, is parameter free. The optimal warping function is computed using an iterative gradient descent approach. This proposed morphing method is also extended to doubly connected domains using a harmonic parameterization technique, along with finite-element methods.

  5. An Image Morphing Technique Based on Optimal Mass Preserving Mapping

    PubMed Central

    Zhu, Lei; Yang, Yan; Haker, Steven; Tannenbaum, Allen

    2013-01-01

    Image morphing, or image interpolation in the time domain, deals with the metamorphosis of one image into another. In this paper, a new class of image morphing algorithms is proposed based on the theory of optimal mass transport. The L2 mass moving energy functional is modified by adding an intensity penalizing term, in order to reduce the undesired double exposure effect. It is an intensity-based approach and, thus, is parameter free. The optimal warping function is computed using an iterative gradient descent approach. This proposed morphing method is also extended to doubly connected domains using a harmonic parameterization technique, along with finite-element methods. PMID:17547128

  6. Luminous pulses during triggered lightning

    NASA Astrophysics Data System (ADS)

    Winn, W. P.; Eastvedt, E. M.; Trueblood, J. J.; Eack, K. B.; Edens, H. E.; Aulich, G. D.; Hunyady, S. J.; Murray, W. C.

    2012-05-01

    A triggered lightning flash that transferred negative charge to ground in central New Mexico produced more than three levels of branching above the main channel to ground in a 1 km vertical field of view. A high-speed video recording shows that the main channel had about 50 brief luminous pulses, many of which were superimposed on a slowly changing persistent luminosity. In contrast, superposition was rare in the uppermost visible branches because luminous pulses first appeared on preexisting dark channels before merging into a luminous channel. This observation suggests that luminous pulses in triggered and natural lightning originate only on dark branches and that the complexity of the main channel to ground is the result of multiple mergers of dark branches with pulses into luminous branches without pulses. This suggestion is contrary to an earlier conclusion that there are two kinds of luminous pulses. We also observe behavior characteristic of electromagnetic waves on transmission lines: when a downward propagating luminous pulse reaches a junction with another initially dark branch, it travels both upward and downward along that branch. Upon reaching the ground the downward propagating wave produces a bright reflection which also splits at the junctions, producing luminosity for a short distance upward in one direction while propagating much farther upward along the path charged by the downward propagating wave. However, when a downward moving luminous pulse reaches a junction with an initially luminous branch, splitting is not evident, probably due to the greater conductivity of the luminous channel.

  7. An Electronic Engineering Curriculum Design Based on Concept-Mapping Techniques

    ERIC Educational Resources Information Center

    Toral, S. L.; Martinez-Torres, M. R.; Barrero, F.; Gallardo, S.; Duran, M. J.

    2007-01-01

    Curriculum design is a concern in European Universities as they face the forthcoming European Higher Education Area (EHEA). This process can be eased by the use of scientific tools such as Concept-Mapping Techniques (CMT) that extract and organize the most relevant information from experts' experience using statistics techniques, and helps a…

  8. My Solar System: A Developmentally Adapted Eco-Mapping Technique for Children

    ERIC Educational Resources Information Center

    Curry, Jennifer R.; Fazio-Griffith, Laura J.; Rohr, Shannon N.

    2008-01-01

    Counseling children requires specific skills and techniques, such as play therapy and expressive arts, to address developmental manifestations and to facilitate the understanding of presenting problems. This article outlines an adapted eco-mapping activity that can be used as a creative counseling technique with children in order to promote…

  9. Digitizing zone maps, using modified LARSYS program. [computer graphics and computer techniques for mapping

    NASA Technical Reports Server (NTRS)

    Giddings, L.; Boston, S.

    1976-01-01

    A method for digitizing zone maps is presented, starting with colored images and producing a final one-channel digitized tape. This method automates the work previously done interactively on the Image-100 and Data Analysis System computers of the Johnson Space Center (JSC) Earth Observations Division (EOD). A color-coded map was digitized through color filters on a scanner to form a digital tape in LARSYS-2 or JSC Universal format. The taped image was classified by the EOD LARSYS program on the basis of training fields included in the image. Numerical values were assigned to all pixels in a given class, and the resulting coded zone map was written on a LARSYS or Universal tape. A unique spatial filter option permitted zones to be made homogeneous and edges of zones to be abrupt transitions from one zone to the next. A zoom option allowed the output image to have arbitrary dimensions in terms of number of lines and number of samples on a line. Printouts of the computer program are given and the images that were digitized are shown.

  10. Technique to 'Map' Chromosomal Mosaicism at the Blastocyst Stage.

    PubMed

    Taylor, Tyl H; Griffin, Darren K; Katz, Seth L; Crain, Jack L; Johnson, Lauren; Gitlin, Susan

    2016-01-01

    The purpose of this study was to identify a technique that allows for comprehensive chromosome screening (CCS) of individual cells within human blastocysts along with the approximation of their location in the trophectoderm relative to the inner cell mass (ICM). This proof-of-concept study will allow for a greater understanding of chromosomal mosaicism at the blastocyst stage and the mechanisms by which mosaicism arises. One blastocyst was held by a holding pipette and the ICM was removed. While still being held, the blastocyst was further biopsied into quadrants. To separate the individual cells from the biopsied sections, the sections were placed in calcium/magnesium-free medium with serum for 20 min. A holding pipette was used to aspirate the sections until individual cells were isolated. Individual cells from each section were placed into PCR tubes and prepped for aCGH. A total of 18 cells were used for analysis, of which 15 (83.3%) amplified and provided a result and 3 (16.7%) did not. Fifteen cells were isolated from the trophectoderm; 13 (86.7%) provided an aCGH result, while 2 (13.3%) did not amplify. Twelve cells were euploid (46,XY), while 1 was complex abnormal (44,XY), presenting with monosomy 7, 10, 11, 13, and 19, and trisomy 14, 15, and 21. A total of 3 cells were isolated from the ICM; 2 were euploid (46,XY) and 1 did not amplify. Here, we expand on a previously published technique which disassociates biopsied sections of the blastocyst into individual cells. Since the blastocyst sections were biopsied in regard to the position of the ICM, it was possible to reconstruct a virtual image of the blastocyst while presenting each cell's individual CCS results.

  11. Digital Mapping Techniques '08—Workshop Proceedings, Moscow, Idaho, May 18–21, 2008

    USGS Publications Warehouse

    Soller, David R.

    2009-01-01

    The Digital Mapping Techniques '08 (DMT'08) workshop was attended by more than 100 technical experts from 40 agencies, universities, and private companies, including representatives from 24 State geological surveys. This year's meeting, the twelfth in the annual series, was hosted by the Idaho Geological Survey, from May 18-21, 2008, on the University of Idaho campus in Moscow, Idaho. Each DMT workshop has been coordinated by the U.S. Geological Survey's National Geologic Map Database Project and the Association of American State Geologists (AASG). As in previous years' meetings, the objective was to foster informal discussion and exchange of technical information, principally in order to develop more efficient methods for digital mapping, cartography, GIS analysis, and information management. At this meeting, oral and poster presentations and special discussion sessions emphasized (1) methods for creating and publishing map products (here, "publishing" includes Web-based release); (2) field data capture software and techniques, including the use of LiDAR; (3) digital cartographic techniques; (4) migration of digital maps into ArcGIS Geodatabase format; (5) analytical GIS techniques; and (6) continued development of the National Geologic Map Database.

  12. Improving predictive mapping of deep-water habitats: Considering multiple model outputs and ensemble techniques

    NASA Astrophysics Data System (ADS)

    Robert, Katleen; Jones, Daniel O. B.; Roberts, J. Murray; Huvenne, Veerle A. I.

    2016-07-01

    In the deep sea, biological data are often sparse; hence models capturing relationships between observed fauna and environmental variables (acquired via acoustic mapping techniques) are often used to produce full-coverage species assemblage maps. Many statistical modelling techniques are being developed, but there remains a need to determine the most appropriate mapping techniques. Predictive habitat modelling approaches (redundancy analysis, maximum entropy and random forest) were applied to a heterogeneous section of seabed on Rockall Bank, NE Atlantic, for which landscape indices describing the spatial arrangement of habitat patches were calculated. The predictive maps were based on remotely operated vehicle (ROV) imagery transects and high-resolution autonomous underwater vehicle (AUV) sidescan backscatter maps. Area under the curve (AUC) and accuracy indicated similar performances for the three models tested, but performance varied by species assemblage, with the transitional species assemblage showing the weakest predictive performance. Spatial predictions of habitat suitability differed between statistical approaches, but niche similarity metrics showed redundancy analysis and random forest predictions to be most similar. As no single statistical technique outperformed the others when all assemblages were considered, ensemble mapping techniques, in which the outputs of many models are combined, were applied. They showed higher accuracy than any single model. Different statistical approaches for predictive habitat modelling possess varied strengths and weaknesses, and by examining the outputs of a range of modelling techniques and their differences, more robust predictions, with better described variation and areas of uncertainty, can be achieved. As improvements to prediction outputs can be achieved without additional costly data collection, ensemble mapping approaches have clear value for spatial management.
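
    A minimal sketch of an unweighted ensemble: several fitted models each produce a habitat-suitability probability and the ensemble prediction is their average. The models and data are generic scikit-learn placeholders, not the Rockall Bank analysis itself.

    ```python
    # Ensemble sketch: average the class probabilities of several models.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    X = rng.normal(size=(300, 4))            # terrain/backscatter covariates
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=300) > 0).astype(int)

    models = [RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y),
              LogisticRegression(max_iter=1000).fit(X, y)]

    X_new = rng.normal(size=(5, 4))          # unsampled grid cells
    ensemble_prob = np.mean([m.predict_proba(X_new)[:, 1] for m in models], axis=0)
    print(ensemble_prob)                     # ensemble habitat suitability
    ```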

  13. A novel triangle mapping technique to study the h-index based citation distribution.

    PubMed

    Zhang, Chun-Ting

    2013-01-01

    The h-index has received wide attention in recent years. The area under the citation function is divided by the h-index into three parts, representing h-squared, excess and h-tail citations. The h-index by itself does not carry information about excess and h-tail citations, which can play an even more dominant role than the h-index in determining the citation curve, and it is therefore necessary to examine the relations among them. A triangle mapping technique is proposed here to map the three percentages of these citations onto a point within a regular triangle. By viewing the distribution of mapping points, the shapes of the citation functions can be studied in a perceivable form. As an example, the distribution of the mapping points for the 100 most prolific economists is studied with this technique.
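
    A minimal sketch of the triangle mapping: three citation fractions that sum to one are mapped to a point inside an equilateral triangle via barycentric coordinates. The example fractions are invented, and the vertex assignment is an assumption for illustration.

    ```python
    # Barycentric triangle mapping of three citation shares.
    import numpy as np

    # Vertices of a unit equilateral triangle (h-squared, excess, h-tail corners).
    V = np.array([[0.0, 0.0],
                  [1.0, 0.0],
                  [0.5, np.sqrt(3) / 2]])

    def triangle_map(fractions):
        f = np.asarray(fractions, dtype=float)
        f = f / f.sum()              # normalize so the three shares sum to 1
        return f @ V                 # barycentric combination of the vertices

    print(triangle_map([0.25, 0.15, 0.60]))  # e.g. h^2, excess, h-tail shares
    ```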

  14. Caustics mapping: an image-space technique for real-time caustics.

    PubMed

    Shah, Musawir A; Konttinen, Jaakko; Pattanaik, Sumanta

    2007-01-01

    In this paper, we present a simple and practical technique for real-time rendering of caustics from reflective and refractive objects. Our algorithm, conceptually similar to shadow mapping, consists of two main parts: creation of a caustic map texture, and utilization of the map to render caustics onto nonshiny surfaces. Our approach avoids performing any expensive geometric tests, such as ray-object intersection, and involves no precomputation; both of which are common features in previous work. The algorithm is well suited for the standard rasterization pipeline and runs entirely on the graphics hardware.

  15. Development and Comparison of Techniques for Generating Permeability Maps using Independent Experimental Approaches

    NASA Astrophysics Data System (ADS)

    Hingerl, Ferdinand; Romanenko, Konstantin; Pini, Ronny; Balcom, Bruce; Benson, Sally

    2014-05-01

    We have developed and evaluated methods for creating voxel-based 3D permeability maps of a heterogeneous sandstone sample using independent experimental data from single-phase flow (magnetic resonance imaging, MRI) and two-phase flow (X-ray computed tomography, CT) measurements. Fluid velocities computed from the generated permeability maps using computational fluid dynamics simulations fit the measured velocities very well and significantly outperform empirical porosity-permeability relations, such as the Kozeny-Carman equation. Acquiring meso-scale images of porous rocks using MRI had until recently been a great challenge, due to short spin relaxation times and large field gradients within the sample. The combination of the 13-interval Alternating-Pulsed-Gradient Stimulated-Echo (APGSTE) scheme with three-dimensional Single Point Ramped Imaging with T1 Enhancement (SPRITE) - a technique recently developed at the UNB MRI Center - can overcome these challenges and enables quantitative three-dimensional maps of porosities and fluid velocities to be obtained. Using porosity and (single-phase) velocity maps from MRI and (multi-phase) saturation maps from CT measurements, we employed three different techniques to obtain permeability maps. In the first approach, we applied the Kozeny-Carman relationship to porosities measured using MRI. In the second approach, we computed permeabilities using a J-Leverett scaling method, which is based on saturation maps obtained from N2-H2O multi-phase experiments. The third set of permeabilities was generated using a new inverse iterative-updating technique, which is based on porosities and measured velocities obtained in single-phase flow experiments. The resulting three permeability maps then provided input for computational fluid dynamics simulations - employing the Stanford CFD code AD-GPRS - to generate velocity maps, which were compared to velocity maps measured by MRI. The J-Leverett scaling method and the iterative-updating method
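
    A minimal sketch of the first approach above: a Kozeny-Carman style porosity-permeability relation applied voxel by voxel to a porosity map. The grain diameter and porosity values are illustrative, and this is one common form of the relation, not necessarily the exact formulation used in the study.

    ```python
    # Voxel-wise Kozeny-Carman permeability from a porosity map.
    import numpy as np

    def kozeny_carman(porosity, grain_diameter=2e-4):
        """Permeability in m^2 for porosity (fraction) and grain diameter (m)."""
        phi = np.asarray(porosity, dtype=float)
        return (phi ** 3 * grain_diameter ** 2) / (180.0 * (1.0 - phi) ** 2)

    porosity_map = np.array([[0.18, 0.22], [0.25, 0.20]])  # toy 2x2 voxel map
    k_map = kozeny_carman(porosity_map)
    print(k_map / 9.869e-13)   # convert m^2 to darcy for readability
    ```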

  16. Digital Mapping Techniques '10-Workshop Proceedings, Sacramento, California, May 16-19, 2010

    USGS Publications Warehouse

    Soller, David R.; Soller, David R.

    2012-01-01

    The Digital Mapping Techniques '10 (DMT'10) workshop was attended by 110 technical experts from 40 agencies, universities, and private companies, including representatives from 19 State geological surveys (see Appendix A). This workshop, hosted by the California Geological Survey, May 16-19, 2010, in Sacramento, California, was similar in nature to the previous 13 meetings (see Appendix B). The meeting was coordinated by the U.S. Geological Survey's (USGS) National Geologic Map Database project. As in the previous meetings, the objective was to foster informal discussion and exchange of technical information. It is with great pleasure that I note that the objective was again successfully met, as attendees continued to share and exchange knowledge and information, and renew friendships and collegial work begun at past DMT workshops. At this meeting, oral and poster presentations and special discussion sessions emphasized (1) methods for creating and publishing map products ("publishing" includes Web-based release); (2) field data capture software and techniques, including the use of LiDAR; (3) digital cartographic techniques; (4) migration of digital maps into ArcGIS Geodatabase format; (5) analytical GIS techniques; and (6) continued development of the National Geologic Map Database.

  17. Lidar Luminance Quantizer

    NASA Technical Reports Server (NTRS)

    Quilligan, Gerard; DeMonthier, Jeffrey; Suarez, George

    2011-01-01

    This innovation addresses challenges in lidar imaging, particularly with the detection scheme and the shapes of the detected signals. Ideally, the echoed pulse widths should be extremely narrow to resolve fine detail at high event rates. However, narrow pulses require wideband detection circuitry with increased power dissipation to minimize thermal noise. Filtering is also required to shape each received signal into a form suitable for processing by a constant fraction discriminator (CFD) followed by a time-to-digital converter (TDC). As the intervals between the echoes decrease, the finite bandwidth of the shaping circuits blends the pulses into an analog signal (luminance) with multiple modes, reducing the ability of the CFD to discriminate individual events.

  18. Evaluation of Landslide Mapping Techniques and LiDAR-based Conditioning Factors

    NASA Astrophysics Data System (ADS)

    Mahalingam, R.; Olsen, M. J.

    2014-12-01

    Landslides are a major geohazard, which result in significant human, infrastructure, and economic losses. Landslide susceptibility mapping can help communities plan and prepare for these damaging events. Mapping landslide-susceptible locations using GIS and remote sensing techniques has gained popularity over the past three decades. These efforts use a wide variety of procedures and consider a wide range of factors. Unfortunately, each study is often completed differently and independently of others. Further, the quality of the datasets used varies in terms of source, data collection, and generation, which can propagate errors or inconsistencies into the resulting output maps. Light detection and ranging (LiDAR) has proved to have higher accuracy in representing the continuous topographic surface, which can help minimize this uncertainty. The primary objectives of this paper are to investigate the applicability and performance of terrain factors in landslide hazard mapping, determine if LiDAR-derived datasets (slope, slope roughness, terrain roughness, stream power index and compound topographic index) can be used for predictive mapping without data representing other common landslide conditioning factors, and evaluate the differences in landslide susceptibility mapping using widely-used statistical approaches. The aforementioned factors were used to produce landslide susceptibility maps for a 140 km2 study area in northwest Oregon using six representative techniques: frequency ratio, weights of evidence, logistic regression, discriminant analysis, artificial neural network, and support vector machine. Most notably, the research showed an advantage in selecting fewer critical conditioning factors. The most reliable factors could all be derived from a single LiDAR DEM, reducing the need for laborious and costly data gathering. Most of the six techniques showed similar statistical results; however, ANN showed less accuracy for predictive mapping. Keywords : Li

  19. Detecting humans using luminance saliency in thermal images.

    PubMed

    Ko, ByoungChul; Kim, DeokYeon; Nam, JaeYeal

    2012-10-15

    This Letter introduces an efficient human detection method in thermal images, using a center-symmetric local binary pattern (CS-LBP) with a luminance saliency map and a random forest (RF) classifier scheme. After detecting a candidate human region, we crop only the head and shoulder region, which has a higher thermal spectrum than the legs or trunk. The CS-LBP feature is then extracted from the luminance saliency map of a hotspot and applied to the RF classifier, which is an ensemble of randomized decision trees. We demonstrate that our detection method is more robust than conventional feature descriptors and classifiers in thermal images.
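
    A minimal sketch of the CS-LBP descriptor for the 8-neighbour case: the four centre-symmetric pixel pairs are compared and combined into a 4-bit code per pixel. The threshold, neighbour ordering, and the tiny test patch are illustrative assumptions; the saliency-map and random-forest stages of the paper are not reproduced here.

    ```python
    # Center-symmetric LBP (CS-LBP) code for an image patch.
    import numpy as np

    def cs_lbp(img, threshold=0.01):
        img = img.astype(float)
        # Neighbour offsets (dy, dx): E, NE, N, NW and their opposites W, SW, S, SE.
        offsets = [(0, 1), (-1, 1), (-1, 0), (-1, -1),
                   (0, -1), (1, -1), (1, 0), (1, 1)]
        nb = [np.roll(img, (-dy, -dx), axis=(0, 1)) for dy, dx in offsets]
        code = np.zeros(img.shape, dtype=int)
        for i in range(4):                   # compare centre-symmetric pairs
            code += ((nb[i] - nb[i + 4]) > threshold) * (1 << i)
        return code[1:-1, 1:-1]              # drop the wrap-around border

    patch = np.random.default_rng(0).random((6, 6))
    print(cs_lbp(patch))                     # 4-bit codes in the range 0..15
    ```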

  20. Accuracy of vertical radial plume mapping technique in measuring lagoon gas emission

    USDA-ARS?s Scientific Manuscript database

    Recently, the U.S. Environmental Protection Agency (USEPA) posted a ground-based optical remote sensing method on its website called OTM 10 for measuring fugitive gas emission flux from area sources such as closed landfills. The OTM 10 utilizes the vertical radial plume mapping (VRPM) technique to c...

  1. Information Landscaping: Information Mapping, Charting, Querying and Reporting Techniques for Total Quality Knowledge Management.

    ERIC Educational Resources Information Center

    Tsai, Bor-sheng

    2003-01-01

    Total quality management and knowledge management are merged and used as a conceptual model to direct and develop information landscaping techniques through the coordination of information mapping, charting, querying, and reporting. Goals included: merge citation analysis and data mining, and apply data visualization and information architecture…

  2. A technique for determining the deuterium/hydrogen contrast map in neutron macromolecular crystallography.

    PubMed

    Chatake, Toshiyuki; Fujiwara, Satoru

    2016-01-01

    A difference in the neutron scattering length between hydrogen and deuterium leads to a high density contrast in neutron Fourier maps. In this study, a technique for determining the deuterium/hydrogen (D/H) contrast map in neutron macromolecular crystallography is developed and evaluated using ribonuclease A. The contrast map between the D2O-solvent and H2O-solvent crystals is calculated in real space, rather than in reciprocal space as performed in previous neutron D/H contrast crystallography. The present technique can thus utilize all of the amplitudes of the neutron structure factors for both D2O-solvent and H2O-solvent crystals. The neutron D/H contrast maps clearly demonstrate the powerful detectability of H/D exchange in proteins. In fact, alternative protonation states and alternative conformations of hydroxyl groups are observed at medium resolution (1.8 Å). Moreover, water molecules can be categorized into three types according to their tendency towards rotational disorder. These results directly indicate improvement in the neutron crystal structure analysis. This technique is suitable for incorporation into the standard structure-determination process used in neutron protein crystallography; consequently, more precise and efficient determination of the D-atom positions is possible using a combination of this D/H contrast technique and standard neutron structure-determination protocols.

  3. Data Maps: A Hypertext Technique for Visualizing, Analyzing and Presenting Qualitative Data.

    ERIC Educational Resources Information Center

    Horney, Mark

    Making diagrams of the relationships among data is not a new idea; however, the utility of computerized hypertext techniques makes the task more feasible, on a wider scale of data. Data maps using "EntryWay" (a hypertext editing program) were made based on qualitative data from eight different research projects, including: a discourse…

  4. Techniques for Down-Sampling a Measured Surface Height Map for Model Validation

    NASA Technical Reports Server (NTRS)

    Sidick, Erkin

    2012-01-01

    This software allows one to down-sample a measured surface map for model validation, not only without introducing any re-sampling errors, but also while eliminating the existing measurement noise and measurement errors. The software tool implementing the two new techniques can be used in all optical model validation processes involving large space optical surfaces.
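
    Two simple down-sampling sketches for a surface height map: plain decimation, which keeps original samples exactly and so introduces no re-sampling error, and block averaging, which suppresses uncorrelated measurement noise. These are illustrative stand-ins; the report does not spell out the tool's actual algorithms.

    ```python
    # Down-sampling a measured surface height map (illustrative only).
    import numpy as np

    def decimate(height_map, factor):
        return height_map[::factor, ::factor]

    def block_average(height_map, factor):
        h, w = height_map.shape
        h, w = h - h % factor, w - w % factor          # crop to a multiple
        blocks = height_map[:h, :w].reshape(h // factor, factor,
                                            w // factor, factor)
        return blocks.mean(axis=(1, 3))

    surface = np.random.default_rng(0).normal(size=(8, 8))   # toy 8x8 map
    print(decimate(surface, 2).shape, block_average(surface, 2).shape)
    ```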

  5. Comparing Computer-Supported Dynamic Modeling and "Paper & Pencil" Concept Mapping Technique in Students' Collaborative Activity

    ERIC Educational Resources Information Center

    Komis, Vassilis; Ergazaki, Marida; Zogza, Vassiliki

    2007-01-01

    This study aims at highlighting the collaborative activity of two high school students (age 14) in the cases of modeling the complex biological process of plant growth with two different tools: the "paper & pencil" concept mapping technique and the computer-supported educational environment "ModelsCreator". Students' shared activity in both cases…

  7. Acoustic emission source location in complex structures using full automatic delta T mapping technique

    NASA Astrophysics Data System (ADS)

    Al-Jumaili, Safaa Kh.; Pearson, Matthew R.; Holford, Karen M.; Eaton, Mark J.; Pullin, Rhys

    2016-05-01

    An easy to use, fast to apply, cost-effective, and very accurate non-destructive testing (NDT) technique for damage localisation in complex structures is key for the uptake of structural health monitoring systems (SHM). Acoustic emission (AE) is a viable technique that can be used for SHM and one of the most attractive features is the ability to locate AE sources. The time of arrival (TOA) technique is traditionally used to locate AE sources, and relies on the assumption of constant wave speed within the material and uninterrupted propagation path between the source and the sensor. In complex structural geometries and complex materials such as composites, this assumption is no longer valid. Delta T mapping was developed in Cardiff in order to overcome these limitations; this technique uses artificial sources on an area of interest to create training maps. These are used to locate subsequent AE sources. However operator expertise is required to select the best data from the training maps and to choose the correct parameter to locate the sources, which can be a time consuming process. This paper presents a new and improved fully automatic delta T mapping technique where a clustering algorithm is used to automatically identify and select the highly correlated events at each grid point whilst the "Minimum Difference" approach is used to determine the source location. This removes the requirement for operator expertise, saving time and preventing human errors. A thorough assessment is conducted to evaluate the performance and the robustness of the new technique. In the initial test, the results showed excellent reduction in running time as well as improved accuracy of locating AE sources, as a result of the automatic selection of the training data. Furthermore, because the process is performed automatically, this is now a very simple and reliable technique due to the prevention of the potential source of error related to manual manipulation.
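
    A minimal sketch of the "minimum difference" location step in delta T mapping: each training grid point stores the expected arrival-time differences between sensor pairs, and a new AE event is assigned to the grid point whose stored delta T vector is closest to the measured one. The grid points, sensor-pair delta T values, and distance metric are synthetic assumptions, and the automatic clustering of training events described in the paper is not reproduced here.

    ```python
    # Delta T mapping, minimum-difference source location sketch.
    import numpy as np

    # Training map: grid point (x, y) -> delta T vector for three sensor pairs (s).
    training = {
        (0.0, 0.0): np.array([12e-6, -4e-6, 8e-6]),
        (0.0, 0.1): np.array([10e-6, -1e-6, 6e-6]),
        (0.1, 0.0): np.array([ 5e-6, -6e-6, 9e-6]),
        (0.1, 0.1): np.array([ 3e-6, -2e-6, 5e-6]),
    }

    def locate(measured_dt):
        """Return the grid point minimising the summed |delta T| mismatch."""
        return min(training,
                   key=lambda gp: np.abs(training[gp] - measured_dt).sum())

    event_dt = np.array([9e-6, -1.5e-6, 6.5e-6])   # measured for a new AE event
    print(locate(event_dt))                        # best-matching grid point
    ```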

  8. A novel technique for visualizing high-resolution 3D terrain maps

    NASA Astrophysics Data System (ADS)

    Dammann, John

    2007-02-01

    A new technique is presented for visualizing high-resolution terrain elevation data. It produces realistic images at small scales on the order of the data resolution and works particularly well when natural objects are present. Better visualization at small scales opens up new applications, like site surveillance for security and Google Earth-type local search and exploration tasks that are now done with 2-D maps. The large 3-D maps are a natural for high-resolution stereo display. The traditional technique drapes a continuous surface over the regularly spaced elevation values. This technique works well when displaying large areas or in cities with large buildings, but falls apart at small scales or for natural objects like trees. The new technique visualizes the terrain as a set of disjoint square patches. It is combined with an algorithm that identifies smooth areas within the scene. Where the terrain is smooth, such as in grassy areas, roads, parking lots and rooftops, it warps the patches to create a smooth surface. For trees or shrubs or other areas where objects are under-sampled, however, the patches are left disjoint. This has the disadvantage of leaving gaps in the data, but the human mind is very adept at filling in this missing information. It has the strong advantage of making natural terrain look realistic, trees and bushes look stylized but still look natural and are easy to interpret. Also, it does not add artifacts to the map, like filling in blank vertical walls where there are alcoves and other structure and extending bridges and overpasses down to the ground. The new technique is illustrated using very large 1-m resolution 3-D maps from the Rapid Terrain Visualization (RTV) program, and comparisons are made with traditional visualizations using these maps.

  9. A wafer mapping technique for residual stress in surface micromachined films

    NASA Astrophysics Data System (ADS)

    Schiavone, G.; Murray, J.; Smith, S.; Desmulliez, M. P. Y.; Mount, A. R.; Walton, A. J.

    2016-09-01

    The design of MEMS devices employing movable structures is crucially dependent on the mechanical behaviour of the deposited materials. It is therefore important to be able to fully characterize the micromachined films and predict with confidence the mechanical properties of patterned structures. This paper presents a characterization technique that enables the residual stress in MEMS films to be mapped at the wafer level by using microstructures released by surface micromachining. These dedicated MEMS test structures and the associated measurement techniques are used to extract localized information on the strain and Young’s modulus of the film under investigation. The residual stress is then determined by numerically coupling this data with a finite element analysis of the structure. This paper illustrates the measurement routine and demonstrates it with a case study using electrochemically deposited alloys of nickel and iron, particularly prone to develop high levels of residual stress. The results show that the technique enables wafer mapping of film non-uniformities and identifies wafer-to-wafer differences. A comparison between the results obtained from the mapping technique and conventional wafer bow measurements highlights the benefits of using a procedure tailored to films that are non-uniform, patterned and surface-micromachined, as opposed to simple standard stress extraction methods. The presented technique reveals detailed information that is generally unexplored when using conventional stress extraction methods such as wafer bow measurements.

  10. Parametric techniques for characterizing myocardial tissue by magnetic resonance imaging (part 1): T1 mapping.

    PubMed

    Perea Palazón, R J; Ortiz Pérez, J T; Prat González, S; de Caralt Robira, T M; Cibeira López, M T; Solé Arqués, M

    2016-01-01

    The development of myocardial fibrosis is a common process in the appearance of ventricular dysfunction in many heart diseases. Magnetic resonance imaging makes it possible to accurately evaluate the structure and function of the heart, and its role in the macroscopic characterization of myocardial fibrosis by late enhancement techniques has been widely validated clinically. Recent studies have demonstrated that T1-mapping techniques can quantify diffuse myocardial fibrosis and the expansion of the myocardial extracellular space in absolute terms. However, further studies are necessary to validate the usefulness of this technique in the early detection of tissue remodeling at a time when implementing early treatment would improve a patient's prognosis. This article reviews the state of the art for T1 mapping of the myocardium, its clinical applications, and its limitations. Copyright © 2016 SERAM. Published by Elsevier España, S.L.U. All rights reserved.

  11. Team activity analysis and recognition based on Kinect depth map and optical imagery techniques

    NASA Astrophysics Data System (ADS)

    Elangovan, Vinayak; Bandaru, Vinod K.; Shirkhodaie, Amir

    2012-06-01

    Kinect cameras produce low-cost depth map video streams applicable for conventional surveillance systems. However, commonly applied image processing techniques are not directly applicable for depth map video processing. Kinect depth map images contain range measurements of objects at the expense of having the spatial features of objects suppressed. For example, typical object attributes such as textures, color tones, intensity, and other characteristic attributes cannot be fully realized by processing depth map imagery. In this paper, we demonstrate the application of Kinect depth map and optical imagery for the characterization of indoor and outdoor group activities. A Casual-Events State Inference (CESI) technique is proposed for spatiotemporal recognition and reasoning of group activities. CESI uses an ontological scheme for representation of casual distinctiveness of a priori known group activities. By tracking and serializing distinctive atomic group activities, CESI allows discovery of more complex group activities. A Modified Sequential Hidden Markov Model (MS-HMM) is implemented for trail analysis of atomic events representing correlated group activities. CESI reasons about five levels of group activities including: Merging, Planning, Cooperation, Coordination, and Dispersion. In this paper, we present results on the capability of the CESI approach for characterization of group activities taking place both indoors and outdoors. Based on spatiotemporal pattern matching of atomic activities representing known group activities, CESI is able to discriminate suspicious group activity from normal activities. This paper also presents technical details of the imagery techniques implemented for detection, tracking, and characterization of atomic events based on Kinect depth map and optical imagery data sets. Various experimental scenarios indoors and outdoors (e.g., loading and unloading of objects, human-vehicle interactions, etc.) were carried out to demonstrate effectiveness and

  12. MODIS Snow Cover Mapping Decision Tree Technique: Snow and Cloud Discrimination

    NASA Technical Reports Server (NTRS)

    Riggs, George A.; Hall, Dorothy K.

    2010-01-01

    Accurate mapping of snow cover continues to challenge cryospheric scientists and modelers. The Moderate-Resolution Imaging Spectroradiometer (MODIS) snow data products have been used since 2000 by many investigators to map and monitor snow cover extent for various applications. Users have reported on the utility of the products and also on problems encountered. Three problems or hindrances in the use of the MODIS snow data products that have been reported in the literature are: cloud obscuration, snow/cloud confusion, and snow omission errors in thin or sparse snow cover conditions. Implementation of the MODIS snow algorithm in a decision tree technique using surface reflectance input to mitigate those problems is being investigated. The objective of this work is to use a decision tree structure for the snow algorithm. This should alleviate snow/cloud confusion and omission errors and provide a snow map with classes that convey information on how snow was detected, e.g. snow under clear sky, snow under cloud, to give users flexibility in interpreting and deriving a snow map. Results of a snow cover decision tree algorithm are compared to the standard MODIS snow map and found to exhibit improved ability to alleviate snow/cloud confusion in some situations, allowing up to about a 5% increase in mapped snow cover extent, and thus accuracy, in some scenes.

  13. Digital Mapping Techniques '05--Workshop Proceedings, Baton Rouge, Louisiana, April 24-27, 2005

    USGS Publications Warehouse

    Soller, David R.

    2005-01-01

    Introduction: The Digital Mapping Techniques '05 (DMT'05) workshop was attended by more than 100 technical experts from 47 agencies, universities, and private companies, including representatives from 25 state geological surveys (see Appendix A). This workshop was similar in nature to the previous eight meetings, held in Lawrence, Kansas (Soller, 1997), in Champaign, Illinois (Soller, 1998), in Madison, Wisconsin (Soller, 1999), in Lexington, Kentucky (Soller, 2000), in Tuscaloosa, Alabama (Soller, 2001), in Salt Lake City, Utah (Soller, 2002), in Millersville, Pennsylvania (Soller, 2003), and in Portland, Oregon (Soller, 2004). This year's meeting was hosted by the Louisiana Geological Survey, from April 24-27, 2005, on the Louisiana State University campus in Baton Rouge, Louisiana. As in the previous meetings, the objective was to foster informal discussion and exchange of technical information. It is with great pleasure I note that the objective was successfully met, as attendees continued to share and exchange knowledge and information, and to renew friendships and collegial work begun at past DMT workshops. Each DMT workshop has been coordinated by the Association of American State Geologists (AASG) and U.S. Geological Survey (USGS) Data Capture Working Group, which was formed in August 1996, to support the AASG and the USGS in their effort to build a National Geologic Map Database (see Soller and Berg, this volume, and http://ngmdb.usgs.gov/info/standards/datacapt/). The Working Group was formed because increased production efficiencies, standardization, and quality of digital map products were needed for the database, and for the State and Federal geological surveys, to provide more high-quality digital maps to the public. At the 2005 meeting, oral and poster presentations and special discussion sessions emphasized: 1) methods for creating and publishing map products (here, 'publishing' includes Web-based release); 2) field data capture software and

  14. Topsoil moisture mapping using geostatistical techniques under different Mediterranean climatic conditions.

    PubMed

    Martínez-Murillo, J F; Hueso-González, P; Ruiz-Sinoga, J D

    2017-10-01

    Soil mapping has been considered an important factor in the widening of Soil Science and in responding to many different environmental questions. Geostatistical techniques, through kriging and co-kriging, have made it possible to improve the understanding of eco-geomorphologic variables, e.g., soil moisture. This study is focused on mapping topsoil moisture using geostatistical techniques under different Mediterranean climatic conditions (humid, dry and semiarid) in three small watersheds, considering topography and soil properties as key factors. A Digital Elevation Model (DEM) with a resolution of 1×1 m was derived from a topographical survey, and soils were sampled to analyze the properties controlling topsoil moisture, which was measured over 4 years. Afterwards, topography attributes were derived from the DEM, the soil properties were analyzed in the laboratory, and the topsoil moisture was modeled for the entire watersheds by applying three geostatistical techniques: i) ordinary kriging; ii) co-kriging considering topography attributes as co-variates; and iii) co-kriging considering topography attributes and gravel content as co-variates. The results indicated topsoil moisture was more accurately mapped in the dry and semiarid watersheds when the co-kriging procedure was performed. The study is a contribution to improving the efficiency and accuracy of studies of the Mediterranean eco-geomorphologic system and soil hydrology in field conditions. Copyright © 2017 Elsevier B.V. All rights reserved.
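
    A minimal sketch of procedure (i), ordinary kriging of point topsoil-moisture samples onto a grid, assuming the PyKrige package is available; the coordinates, moisture values, and variogram model choice are invented placeholders, and the co-kriging variants are not shown.

    ```python
    # Ordinary kriging of topsoil moisture samples (illustrative only).
    import numpy as np
    from pykrige.ok import OrdinaryKriging

    x = np.array([2.0, 10.0, 25.0, 40.0, 55.0])          # easting (m)
    y = np.array([5.0, 30.0, 12.0, 45.0, 20.0])          # northing (m)
    moisture = np.array([0.18, 0.24, 0.15, 0.28, 0.20])  # volumetric fraction

    ok = OrdinaryKriging(x, y, moisture, variogram_model="spherical")
    gridx = np.arange(0.0, 60.0, 1.0)
    gridy = np.arange(0.0, 50.0, 1.0)
    z, variance = ok.execute("grid", gridx, gridy)   # kriged map + kriging variance
    print(z.shape)
    ```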

  15. A satellite remote-sensing technique for geological horizon structure mapping

    SciTech Connect

    Fraser, A.J.; Huggins, P.; Cleverley, P.H.; Rees, J.L.

    1995-12-31

    A Satellite Remote Sensing technique is demonstrated which provides accurate and cost effective near-surface geological structure data. In the exploration phase the technique enables the rapid and inexpensive screening of open licences and the targeting of seismic acquisition, particularly important in terrains of difficult data acquisition. This paper describes the satellite data used, the technique of horizon surface data extraction and the analysis of a case study from Yemen. Landsat Thematic Mapper (TM) data and a high resolution digital elevation model (DEM), generated from stereo SPOT panchromatic images, are used in conjunction to identify a number of outcropping horizons and map their spatial position and height. Geological contacts are identified and digitised from the Landsat TM data and the elevations of these points taken from the digital elevation data. The extracted x,y,z co-ordinates are then gridded to construct a horizon structure map. The technique is applied to an area of central Yemen which is characterised by a near-surface 'layer cake' geological structure in an extremely low dipping terrain (less than 1°). The remote sensing interpretation is validated by comparison with 2D seismic across the area. Regional flexural structures with bed dips of as little as 0.25° can be mapped. Trend analysis and residual calculations on the horizon structure map show the technique's ability to identify and quantify horizon deformation related to faulting. Surface geological structure was successfully interpolated into the subsurface indicating potential fault closure at reservoir target depths.

  16. Contemporary Mapping Techniques of Complex Cardiac Arrhythmias – Identifying and Modifying the Arrhythmogenic Substrate

    PubMed Central

    Koutalas, Emmanuel; Rolf, Sascha; Dinov, Borislav; Richter, Sergio; Arya, Arash; Bollmann, Andreas; Hindricks, Gerhard

    2015-01-01

    Cardiac electrophysiology has moved a long way forward during recent decades in the comprehension and treatment of complex cardiac arrhythmias. Contemporary electroanatomical mapping systems, along with state-of-the-art technology in the manufacture of electrophysiology catheters and cardiac imaging modalities, have significantly enriched our armamentarium, enabling the implementation of various mapping strategies and techniques in electrophysiology procedures. Beyond conventional mapping strategies, ablation of complex fractionated electrograms and rotor ablation in atrial fibrillation ablation procedures, the identification and modification of the underlying arrhythmogenic substrate has emerged as a strategy that leads to improved outcomes. Arrhythmogenic substrate modification also has a major role in ventricular tachycardia ablation procedures. Optimisation of contact between tissue and catheter and image integration are a further step forward to augment our precision and effectiveness. Hybridisation of existing technologies with a reasonable cost should be our goal over the next few years. PMID:26835095

  17. Using a Concept Mapping Tool with a Photograph Association Technique (CoMPAT) to Elicit Children's Ideas about Microbial Activity

    ERIC Educational Resources Information Center

    Byrne, Jenny; Grace, Marcus

    2010-01-01

    Concept mapping is a technique used to provide a visual representation of an individual's ideas about a concept or set of related concepts. This paper describes a concept mapping tool using a photograph association technique (CoMPAT) that is considered to be a novel way of eliciting children's ideas. What children at 11 years of age know about…

  18. Beam rate influence on dose distribution and fluence map in IMRT dynamic technique.

    PubMed

    Slosarek, Krzysztof; Grządziel, Aleksandra; Osewski, Wojciech; Dolla, Lukasz; Bekman, Barbara; Petrovic, Borislava

    2012-01-01

    To examine the impact of beam rate on dose distribution in IMRT plans and then to evaluate the agreement of calculated and measured dose distributions for various beam rate values. Accelerators used in radiotherapy offer several beam rate modes which can shorten irradiation time and thus reduce the opportunity for patient movement during a treatment session. This aspect should be considered in highly conformal dynamic techniques. Dose calculation was done for two different beam rates (100 MU/min and 600 MU/min) in an IMRT plan. For both, a comparison of the Radiation Planning Index (RPI) and MU was conducted. Secondly, the optimal fluence maps were compared with the corresponding actual fluence maps. Next, actual fluence maps were measured and compared with the calculated ones; the gamma index was used for that assessment. Additionally, the positions of each leaf of the MLC were checked with in-house software. The dose distribution obtained for the lower beam rate was slightly better than for the higher beam rate in terms of target coverage and risk-structure protection. Fewer MUs were required in 100 MU/min plans than in 600 MU/min plans. Actual fluence maps converted from optimal ones demonstrated more similarity in 100 MU/min plans. Better conformity of the measured maps to the calculated ones was obtained when the lower beam rate was applied. However, these differences were small. No correlation was found between the quality of fluence map conversion and leaf motion accuracy. Execution of dynamic techniques depends on beam rate, but the differences are minor; the analysis shows a slight superiority of the lower beam rate, which does not significantly affect treatment accuracy.
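
    A brute-force sketch of a global 2D gamma-index comparison, in the spirit of the agreement test mentioned above. The 3%/3 mm criteria, the pixel spacing and the synthetic maps are assumptions; clinical implementations use faster search strategies.

```python
import numpy as np

def gamma_index(measured, calculated, spacing_mm=1.0, dd=0.03, dta_mm=3.0):
    """Return the gamma map; gamma <= 1 means the point passes."""
    ny, nx = measured.shape
    yy, xx = np.mgrid[0:ny, 0:nx] * spacing_mm
    norm = measured.max()                      # global dose-difference normalisation
    gamma = np.full(measured.shape, np.inf)
    for j in range(ny):
        for i in range(nx):
            dist2 = (xx - xx[j, i]) ** 2 + (yy - yy[j, i]) ** 2
            dose2 = ((calculated - measured[j, i]) / (dd * norm)) ** 2
            gamma[j, i] = np.sqrt(np.min(dist2 / dta_mm ** 2 + dose2))
    return gamma

# Example: passing rate between two noisy copies of the same synthetic map
meas = np.random.rand(32, 32)
calc = meas + 0.01 * np.random.randn(32, 32)
passing_rate = np.mean(gamma_index(meas, calc) <= 1.0)
```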

  19. Digital mapping techniques '00, workshop proceedings - May 17-20, 2000, Lexington, Kentucky

    USGS Publications Warehouse

    Soller, David R.

    2000-01-01

    Introduction: The Digital Mapping Techniques '00 (DMT'00) workshop was attended by 99 technical experts from 42 agencies, universities, and private companies, including representatives from 28 state geological surveys (see Appendix A). This workshop was similar in nature to the first three meetings, held in June, 1997, in Lawrence, Kansas (Soller, 1997), in May, 1998, in Champaign, Illinois (Soller, 1998a), and in May, 1999, in Madison, Wisconsin (Soller, 1999). This year's meeting was hosted by the Kentucky Geological Survey, from May 17 to 20, 2000, on the University of Kentucky campus in Lexington. As in the previous meetings, the objective was to foster informal discussion and exchange of technical information. When, based on discussions at the workshop, an attendee adopts or modifies a newly learned technique, the workshop clearly has met that objective. Evidence of learning and cooperation among participating agencies continued to be a highlight of the DMT workshops (see example in Soller, 1998b, and various papers in this volume). The meeting's general goal was to help move the state geological surveys and the USGS toward development of more cost-effective, flexible, and useful systems for digital mapping and geographic information systems (GIS) analysis. Through oral and poster presentations and special discussion sessions, emphasis was given to: 1) methods for creating and publishing map products (here, 'publishing' includes Web-based release); 2) continued development of the National Geologic Map Database; 3) progress toward building a standard geologic map data model; 4) field data-collection systems; and 5) map citation and authorship guidelines. Four representatives of the GIS hardware and software vendor community were invited to participate. The four annual DMT workshops were coordinated by the AASG/USGS Data Capture Working Group, which was formed in August, 1996, to support the Association of American State Geologists and the USGS in their effort

  20. Technique of sentinel lymph node biopsy and lymphatic mapping during laparoscopic colon resection for cancer

    PubMed Central

    Bianchi, PP; Andreoni, B; Rottoli, M; Celotti, S; Chiappa, A; Montorsi, M

    2007-01-01

    Background: The utility of lymph node mapping to improve staging in colon cancer is still under evaluation. Laparoscopic colectomy for colon cancer has been validated in multi-centre trials. This study assessed the feasibility and technical aspects of lymph node mapping in laparoscopic colectomy for colon cancer. Methods: A total of 42 patients with histologically proven colon cancer were studied from January 2006 to September 2007. Exclusion criteria were: advanced disease (clinical stage III), rectal cancer, previous colon resection and contraindication to laparoscopy. Lymph-nodal status was assessed preoperatively by computed tomography (CT) scan and intra-operatively with the aid of laparoscopic ultrasound. Before resection, 2–3 ml of Patent Blue V dye was injected sub-serosally around the tumour. Coloured lymph nodes were marked as sentinel (SN) with metal clips or suture, and laparoscopic colectomy with lymphadenectomy was completed as normal. In case of failure of the intra-operative procedure, an ex vivo SN biopsy was performed on the colectomy specimen after resection. Results: A total of 904 lymph nodes were examined, with a median of 22 lymph nodes harvested per patient. The SN detection rate was 100%; ex vivo lymph node mapping was necessary in four patients. Eleven (26.2%) patients had lymph-nodal metastases, and in five (45.5%) of these patients the SN was the only positive lymph node. There were two (18.2%) false-negative SNs. In three cases (7.1%) with aberrant lymphatic drainage, the lymphadenectomy was extended. The accuracy of SN mapping was 95.2% and the negative predictive value was 93.9%. Conclusions: Laparoscopic lymphatic mapping and SN removal are feasible in laparoscopic colectomy for colon cancer. The ex vivo technique is useful as a salvage technique in case of failure of the intra-operative procedure. Prospective studies are justified to determine the true accuracy and false-negative rate of the technique. PMID:22275957

  1. Breast density mapping based upon system calibration, x-ray techniques, and FFDM images

    NASA Astrophysics Data System (ADS)

    Chen, Biao; Smith, Andrew P.; Jing, Zhenxue; Wu, Tao

    2007-03-01

    Clinical studies have correlated high breast density with a woman's risk of breast cancer. A breast density measurement that can quantitatively depict the volume distribution and percentage of dense tissues in breasts would be very useful for risk-factor assessment of breast cancer, and might be more predictive of risk than the common but subjective and coarse 4-point BIRADS scale. This paper proposes to use a neural-network mapping to compute breast density information based upon system calibration data, x-ray techniques, and Full Field Digital Mammography (FFDM) images. The mapping consists of four modules, namely, system calibration, a generator of beam quality, a generator of normalized absorption, and a multi-layer feed-forward neural network. As the core of the breast density mapping, the network accepts the x-ray target/filter combination, normalized x-ray absorption, a pixel-wise breast thickness map, and the x-ray beam quality during image acquisition as input elements, and exports a pixel-wise breast density distribution and a single breast density percentage for the imaged breast. Training and testing data sets for the design and verification of the network were formulated from calibrated x-ray beam quality, imaging data acquired with a step-wedge phantom under a variety of x-ray imaging techniques, and nominal breast densities of tissue-equivalent materials. The network was trained using a Levenberg-Marquardt back-propagation learning method. Various thickness and glandular density phantom studies were performed with clinical x-ray techniques. Preliminary results showed that the neural network mapping is promising in accurately computing glandular density distribution and breast density percentage.
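
    A minimal sketch of a feed-forward network mapping acquisition parameters to a per-pixel glandular-density estimate, loosely following the inputs listed in the abstract. The synthetic training rows, network size and the use of scikit-learn (rather than the Levenberg-Marquardt training described) are assumptions.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Synthetic feature rows: [target/filter code, normalised absorption,
#                          breast thickness (mm), beam quality (HVL, mm Al)]
X = rng.uniform([0, 0.1, 20, 0.3], [1, 1.0, 80, 0.6], size=(500, 4))
y = rng.uniform(0.0, 1.0, size=500)          # nominal glandular density fraction (placeholder)

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=2000, random_state=0),
)
model.fit(X, y)

# Per-pixel use: build one feature row per pixel and reshape predictions back to a map
density_map = model.predict(X[:100]).reshape(10, 10)
```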

  2. Estimation of flood environmental effects using flood zone mapping techniques in Halilrood Kerman, Iran.

    PubMed

    Boudaghpour, Siamak; Bagheri, Majid; Bagheri, Zahra

    2014-01-01

    Flood occurrences with large environmental damages show a growing trend in Iran. The dynamic movement of water during a flood causes different environmental damages in geographical areas with different characteristics, such as topographic conditions. In general, the environmental effects and damages caused by a flood in an area can be investigated from different points of view. The current study aims to detect the environmental effects of flood occurrences in the Halilrood catchment area of Kerman province in Iran using flood zone mapping techniques. The flood zone map was produced in four steps: steps 1 to 3 calculate and estimate the flood zone map for the study area, while step 4 estimates the environmental effects of the flood occurrence. Based on our studies, a wide range of accuracy in estimating the environmental effects of flood occurrence was obtained using flood zone mapping techniques. Moreover, it was identified that the existence of the Jiroft dam in the study area can decrease the flood zone from 260 hectares to 225 hectares and can also decrease flood peak intensity by 20%. As a result, 14% of the flood zone in the study area can be saved environmentally.

  3. Aura Tropospheric Ozone Columns Derived Using the TOR Approach and Mapping Techniques

    NASA Astrophysics Data System (ADS)

    Yang, Q.; Cunnold, D. M.; Wang, H.; Jing, P.

    2005-12-01

    A two-predictor (PV and geopotential height) interpolation/mapping technique has been applied to Aura MLS measurements combined with GEOS-4 meteorological fields to produce stratospheric ozone columns between the 300 K and 800 K isentropic surfaces. Subtraction of these columns from OMI total ozone column measurements under clear-sky conditions results in tropospheric ozone columns derived by the Tropospheric Ozone Residual (TOR) technique. The precision and accuracy of the resulting TORs at mid-latitudes are assessed by comparisons against tropospheric ozonesonde columns and TORs derived from SAGE measurements. It is found that the inclusion of total ozone column as a third predictor in the interpolation increases the precision of the derived TORs. The use of trajectory mapping is also being evaluated.
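
    A minimal sketch of the residual step itself: the mapped stratospheric column is subtracted from the total column to give the tropospheric column. The array values and units (Dobson units) are illustrative assumptions.

```python
import numpy as np

omi_total_column = np.array([[285.0, 290.0], [300.0, 295.0]])   # DU, clear-sky total ozone
mls_strat_column = np.array([[255.0, 258.0], [262.0, 260.0]])   # DU, mapped 300 K-and-above column

tor = omi_total_column - mls_strat_column    # tropospheric ozone residual (DU)
```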

  4. Myocardial T1 and T2 Mapping: Techniques and Clinical Applications

    PubMed Central

    Kim, Pan Ki; Im, Dong Jin; Suh, Young Joo; Park, Chul Hwan; Kim, Jin Young; Chang, Suyon; Lee, Hye-Jeong; Hur, Jin; Kim, Young Jin; Choi, Byoung Wook

    2017-01-01

    Cardiac magnetic resonance (CMR) imaging is widely used in various medical fields related to cardiovascular diseases. Rapid technological innovations in magnetic resonance imaging in recent times have resulted in the development of new techniques for CMR imaging. T1 and T2 image mapping sequences enable the direct quantification of T1, T2, and extracellular volume fraction (ECV) values of the myocardium, leading to the progressive integration of these sequences into routine CMR settings. Currently, T1, T2, and ECV values are being recognized as not only robust biomarkers for diagnosis of cardiomyopathies, but also predictive factors for treatment monitoring and prognosis. In this study, we have reviewed various T1 and T2 mapping sequence techniques and their clinical applications. PMID:28096723

  5. Data driven ionospheric mapping technique for wide area nowcasting. Scientific report no. 4

    SciTech Connect

    Reinisch, B.W.; Huang, X.; Sales, G.S.

    1991-09-01

    A new multidimensional mapping technique is being developed which determines the three-dimensional electron density distribution of the ionosphere over a wide area of several thousand kilometers at any time, from sounder data recorded during the immediately preceding period. Electron density profiles from five Digisonde stations in the northeastern regions of the North American continent are used to describe the electron distribution in a 30 deg longitude by 30 deg latitude region. The 32-hour time histories of the profiles (the polynomial coefficients) from each station are Fourier transformed to determine the spectral components. For each spectral component a damped plane wave is fitted to the amplitudes and phases at the five stations. This analysis is performed for each of the profile coefficients, and zonal maps are constructed as sums of these plane waves. This nowcasting technique can find applications in ray tracing for OTH radar, HF communication and HF direction finding.

  6. Arsenic risk mapping in Bangladesh: a simulation technique of cokriging estimation from regional count data.

    PubMed

    Hassan, M Manzurul; Atkins, Peter J

    2007-10-01

    Risk analysis with spatial interpolation methods from a regional database on to a continuous surface is of contemporary interest. Groundwater arsenic poisoning in Bangladesh and its impact on human health has been one of the "biggest environmental health disasters" of recent years. It is ironic that so many tubewells have been installed in recent times for pathogen-free drinking water but the water pumped is often contaminated with toxic levels of arsenic. This paper seeks to analyse the spatial pattern of arsenic risk by mapping composite "problem regions" in southwest Bangladesh. It also examines the cokriging interpolation method in analysing the suitability of isopleth maps for different risk areas. GIS-based data processing and spatial analysis were used for this research, along with state-of-the-art decision-making techniques. Apart from the GIS-based buffering and overlay mapping operations, a cokriging interpolation method was adopted because of its exact interpolation capacity. The paper presents an interpolation of regional estimates of arsenic data for spatial risk mapping that overcomes the areal bias problem for administrative boundaries. Moreover, the functionality of the cokriging method demonstrates the suitability of isopleth maps that are easy to read.

  7. Radon risk mapping in southern Belgium: an application of geostatistical and GIS techniques.

    PubMed

    Zh, H C; Charlet, J M; Poffijn, A

    2001-05-14

    A data set of long-term radon measurements in approximately 2200 houses in southern Belgium has been collected in an on-going national radon survey. The spatial variation of indoor Rn concentrations is modelled by variograms. A radon distribution map is produced using the log-normal kriging technique. A GIS is used to digitise, process and integrate a variety of data, including geological maps, Rn concentrations associated with house locations and an administrative map, etc. It also allows evaluation of the relationships between various spatial data sets with the goal of producing radon risk maps. Based on geostatistical mapping and spatial analysis, we define three categories of risk areas: high-risk, medium-risk and low-risk areas. The correlation between radon concentrations and geological features is demonstrated in this study. High and medium Rn risk zones are dominantly situated in bedrock from the Cambrian to Lower Devonian, although a few medium risk zones are within the Jurassic. It is evident that high-risk zones are related to a strongly folded and fractured context.
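
    A minimal sketch of log-normal kriging in the spirit of the study above, assuming the pykrige package is available. The station coordinates, radon values, variogram model and the naive exponential back-transform (which ignores the lognormal bias correction) are illustrative choices.

```python
import numpy as np
from pykrige.ok import OrdinaryKriging

x = np.array([0.5, 1.2, 2.8, 3.9, 1.7])             # easting (km)
y = np.array([0.3, 2.1, 1.4, 3.2, 0.9])             # northing (km)
rn = np.array([60.0, 145.0, 300.0, 90.0, 520.0])    # indoor radon (Bq/m3)

# Krige the log-transformed concentrations
ok = OrdinaryKriging(x, y, np.log(rn), variogram_model="spherical")
gx = np.linspace(0.0, 4.0, 40)
gy = np.linspace(0.0, 4.0, 40)
log_map, variance = ok.execute("grid", gx, gy)

rn_map = np.exp(log_map)   # naive back-transform; risk classes can then be thresholded on this map
```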

  8. Southern European ionospheric TEC maps based on Kriging technique to monitor ionosphere behavior

    NASA Astrophysics Data System (ADS)

    Rodríguez-Bouza, Marta; Paparini, Claudia; Otero, Xurxo; Herraiz, Miguel; Radicella, Sandro M.; Abe, Oladipo E.; Rodríguez-Caderot, Gracia

    2017-10-01

    Global or regional maps of the ionospheric Total Electron Content (TEC) are an efficient tool to monitor the delay introduced by the ionosphere in satellite signals. Ionospheric disturbance periods are of particular interest because these conditions can strongly affect satellite navigation range measurements. This work presents post-processing regional vertical TEC maps over Southern Europe ([35°N-50°N] latitude) obtained by applying Kriging interpolation to GPS-derived TEC over more than 100 Global Navigation Satellite System (GNSS) stations. These maps are used to study the behavior of the ionosphere during space weather events and their effects. To validate these maps, hereafter called Southern European Ionospheric Maps (SEIMs), their TEC values have been compared with those obtained from the EGNOS Message Server (EMS) and with direct experimental TEC data from GNSS stations. Ionospheric space weather events related to the geomagnetic storms of March 17th, 2013, February 19th, 2014 and March 17th, 2015 have been selected. To test the methodology, one period of quiet days has also been analyzed. TEC values obtained by SEIMs at the Ionospheric Grid Points (IGPs) defined by EGNOS are very close to those given by EMS, and during the major geomagnetic storms the difference does not exceed 6 TEC units. These results confirm the good performance of the technique used for obtaining the SEIMs, which can be a useful tool to study ionosphere behavior during geomagnetic storms and their effects in the region of interest.

  9. Mapping land slide occurrence zones using Remote Sensing and GIS techniques in Kelantan state, Malaysia

    NASA Astrophysics Data System (ADS)

    Hashim, M.; Pour, A. B.; Misbari, S.

    2017-05-01

    Integration of satellite remote sensing data and Geographic Information System (GIS) techniques is one of the most applicable approaches for landslide mapping and the identification of high-potential-risk and susceptible zones in tropical environments. Every year, several landslides occur during heavy monsoon rainfall in the Kelantan river basin, Peninsular Malaysia. In this investigation, Landsat-8 and Phased Array type L-band Synthetic Aperture Radar-2 (PALSAR-2) remote sensing data sets were integrated with GIS analysis to detect, map and characterize landslide occurrences during the December 2014 flooding period in the Kelantan river basin. Landslides were detected by tracking changes in vegetation pixel data using Landsat-8 images acquired before and after the December 2014 flooding of the study area. The PALSAR-2 data were used for mapping major geological structures and for detailed characterization of lineaments in the state of Kelantan. An Analytical Hierarchy Process (AHP) approach was used for landslide susceptibility mapping. Several factors, such as slope, aspect, soil, lithology, Normalized Difference Vegetation Index (NDVI), land cover, distance to drainage, precipitation, distance to fault, and distance to road, were extracted from remote sensing satellite data and fieldwork to apply the AHP approach. The two main outputs of this study were a landslide inventory map of occurrences during the 2014 flooding episode and a landslide susceptibility map for the entire state of Kelantan. The modelled/predicted landslides, together with the susceptibility maps generated before and after the flood episode, confirmed that intense rainfall in Kelantan contributed to numerous landslides of various sizes. It is concluded that precipitation is the most influential factor contributing to landslide events.
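
    A minimal sketch of the AHP weighting step: a pairwise comparison matrix for three of the factors listed above is reduced to weights via its principal eigenvector, and the consistency ratio is checked. The comparison values are illustrative assumptions, not the weights from the study.

```python
import numpy as np

# Pairwise comparisons (Saaty scale) for: precipitation, slope, distance to drainage
A = np.array([
    [1.0,   3.0, 5.0],
    [1/3.0, 1.0, 2.0],
    [1/5.0, 1/2.0, 1.0],
])

vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)
weights = np.abs(vecs[:, k].real)
weights /= weights.sum()            # normalised factor weights for the susceptibility overlay

# Consistency check (random index RI = 0.58 for n = 3); CR < 0.1 is usually taken as acceptable
ci = (vals.real[k] - len(A)) / (len(A) - 1)
cr = ci / 0.58
```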

  10. Mineral classification map using MF and SAM techniques: A case study in the Nohwa Island, Korea

    SciTech Connect

    Son, Young-Sun; Yoon, Wang-Jung

    2015-03-10

    The purpose of this study is to map the pyrophyllite distribution at the surface of the Nohwa deposit, Korea, using Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) data. For this, a combined Spectral Angle Mapper (SAM) and Matched Filtering (MF) technique was applied. The regional distributions of high-grade and low-grade pyrophyllite in the Nohwa deposit area could be differentiated by this method. The results of this study show that ASTER data analysis using a combination of the SAM and MF techniques will assist in exploration for pyrophyllite at the exposed surface.
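
    A minimal sketch of the SAM part of the workflow: the angle between each pixel spectrum and a reference spectrum is computed, with small angles indicating a close match. The band values, reference spectrum and threshold are placeholders, not ASTER library spectra.

```python
import numpy as np

def spectral_angle(pixels, reference):
    """pixels: (n_pixels, n_bands); reference: (n_bands,). Returns angles in radians."""
    num = pixels @ reference
    den = np.linalg.norm(pixels, axis=1) * np.linalg.norm(reference)
    return np.arccos(np.clip(num / den, -1.0, 1.0))

cube = np.random.rand(100, 9)            # e.g. 9 ASTER bands, flattened pixels (synthetic)
ref = np.random.rand(9)                  # reference (endmember) spectrum, assumed
angles = spectral_angle(cube, ref)
match_mask = angles < 0.1                # angle threshold in radians, tuned per scene
```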

  11. A new technique for processing airborne gamma ray spectrometry data for mapping low level contaminations.

    PubMed

    Aage, H K; Korsbech, U; Bargholz, K; Hovgaard, J

    1999-12-01

    A new technique for processing airborne gamma ray spectrometry data has been developed. It is based on the noise-adjusted singular value decomposition (NASVD) method introduced by Hovgaard in 1997. The new technique opens the way for mapping very low contamination levels. It was tested with data from Latvia, where the remaining contamination from the 1986 Chernobyl accident, together with fallout from the atmospheric nuclear weapon tests, includes 137Cs at levels often well below 1 kBq/m2 equivalent surface contamination. The limiting factors for obtaining reliable results are radon in the air, spectrum stability and accurate altitude measurements.
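
    A minimal sketch of NASVD-style spectral smoothing in the spirit of Hovgaard's method: spectra are scaled by the expected Poisson noise, decomposed with an SVD, and rebuilt from the leading components only. The synthetic spectra and the number of retained components are assumptions.

```python
import numpy as np

# Rows are short-integration airborne spectra (counts per channel), synthetic here
spectra = np.random.poisson(lam=50.0, size=(200, 256)).astype(float)

mean_spec = spectra.mean(axis=0)
noise = np.sqrt(np.maximum(mean_spec, 1e-6))       # Poisson noise estimate per channel
scaled = spectra / noise                           # noise adjustment

U, s, Vt = np.linalg.svd(scaled, full_matrices=False)
k = 8                                              # retain only the leading components
smooth = (U[:, :k] * s[:k]) @ Vt[:k, :] * noise    # denoised spectra, back in count space
```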

  12. Mapping Based Technique for Distance Relays Setting in Presence of IPFC

    NASA Astrophysics Data System (ADS)

    Ahmadi, Iraj; Varjani, Ali Yazdian; Khederzadeh, Mojtaba

    In this paper, a new technique is proposed for setting distance relays in the presence of an Interline Power Flow Controller (IPFC). The presence of an IPFC in a power network affects conventional protection systems, particularly distance relays. The main impact of an IPFC on distance relays is that the impedance region seen by the relay deviates from the relay's tripping characteristic. The proposed technique employs the active and reactive power injected by the series converter to adjust the distance relay's first-zone boundaries by mapping, so that the new boundary has the greatest overlap with the impedance region. Since the technique relies on off-line calculations, it does not impose a significant computational burden on the relay during a fault. The efficiency of the proposed technique is shown by simulation results for a four-bus system containing a two-converter IPFC.

  13. A semi-quantitative technique for mapping potential aquifer productivity on the national scale: example of England and Wales (UK)

    NASA Astrophysics Data System (ADS)

    Abesser, Corinna; Lewis, Melinda

    2015-12-01

    The development and validation of aquifer productivity and depth-to-source maps for England and Wales are described. Aquifer productivity maps can provide valuable support for the assessment, planning and management of groundwater and renewable heat energy resources. Aquifer productivity is often mapped using geostatistical interpolation techniques such as kriging, but these techniques tend to be unsuitable for mapping at the national scale due to the high data (and time) demands. A methodology is outlined for mapping aquifer productivity at the national scale using existing national-scale data sets. Pumping test data are used to characterise the potential borehole yields that different geological formations of varying lithologies and ages can provide. Based on this analysis and using expert knowledge, the corresponding map codes on the geological map are assigned to potential productivity classes. The subsurface (concealed) extent of aquifer units is mapped from geophysical data and, together with the attributed geological map, provides the bedrock-aquifer productivity map. Drilling and pumping costs can be an important consideration when evaluating the feasibility of developing a groundwater source. Thus, a map of the approximate depth to source is developed alongside the aquifer productivity map. The maps are validated using independent data sets, and map performance is compared against performance from maps derived by random and uniform attribution. The results show that the maps successfully predict potential productivity and approximate depth to the water source, although the utility of the depth-to-source map could be improved by increasing the vertical discretisation at which depth intervals are mapped.

  14. [May Medically Assisted Procreation (MAP) be relevant for homosexual women? Study among 147 gynaeco-logists involved in MAP techniques].

    PubMed

    Sintes, R; Darves-Bornoz, J-M

    2002-01-01

    The second half of the twentieth century has seen societal changes as well as an evolution of medical techniques that now allow human procreation to be thought of in terms of choices or even rights. Some voices occasionally call for Medically Assisted Procreation (MAP) for lesbians. Even though French society did not allow such a possibility, it seemed interesting to question professionals actively involved in the use of MAP techniques about it. Through systematic internet queries, we obtained a list of one hundred private or public French medical institutions with a unit for the treatment of sterility. A telephone call to their secretaries allowed us to identify those doctors who practised MAP. A sample of 147 medical doctors practising MAP was then drawn. They were questioned with a clinical instrument including 20 closed-ended questions in order to assess their opinions on: homosexual women with a desire for a child; the possibility for these clinicians to intervene with a donor insemination in such situations; and the developmental risk for such children. One hundred and twenty-five (85%) agreed to answer. Nine percent of these gynaecologists still consider homosexuality as pathological, and 10% as deviant - contrary to international classifications of mental disorders - and 5% deny good maternal abilities to homosexual women. Before the so-called French bioethics laws of 1994, none of them had performed a donor insemination for a lesbian couple, though 4% had done so for single homosexual women. Two-thirds of them do not agree with opening donor insemination to homosexual women, though for half of them the anonymity of the donor is not perceived as prejudicial to the child. Eighty-seven percent of these gynaecologists think that a child brought up by homosexual parents is at risk of developmental disorder, the configuration supposed to be most pathogenic being when the birth results from donor insemination. The supposedly most important risk factors are thought to be the

  15. Investigation of Lamp Mapping Technique for Calibration and Diagnostics of Raman LIDAR Systems

    NASA Astrophysics Data System (ADS)

    Walker, Monique

    Raman Lidar systems provide water vapor measurements that can be used for weather forecasting and atmospheric modeling. Much of the accuracy of Raman Lidar water vapor data depends on a secondary instrument such as a radiosonde. Here we discuss the use of the standard lamp mapping technique as it applies to Raman Lidar data. Using the standard lamp mapping technique makes the Raman Lidar water vapor data rely on the accuracy of the fundamental Raman cross sections and the accuracy of the detection system. More specifically, we discuss how the lamp mapping technique (LMT) is used to determine a Raman Lidar water vapor mixing ratio calibration constant, correct a look-up table that could potentially be used to determine atmospheric temperatures based on Lidar measurements, determine a Lidar overlap function, determine Lidar glue coefficients and serve as a Raman Lidar diagnostic test. The mapping technique discussed here involves scanning a standard lamp over the aperture of a Raman Lidar telescope. From the mapping technique we are able to determine a system efficiency for the Lidar detection system, which allows us to perform the functions mentioned above. These various studies were conducted on two Raman Lidar systems with different optical detection systems and configurations. The water vapor mixing ratio calibration determined using the LMT and the glue coefficients determined using the LMT showed good agreement with the traditional methods. In addition, the LMT has been shown to be an excellent diagnostic tool for Lidar systems. Furthermore, we were able to determine overlap functions for the Lidar system's single optical channels and also the overlap function for the water vapor mixing ratio (WVMR) using the LMT. Lastly, there was a limitation that did not allow us to obtain temperature calibration constants for Lidar-based temperature measurements using a look-up table corrected by the LMT. However, we were able to determine the detector response of the two

  16. Bioclimatic and vegetation mapping of a topographically complex oceanic island applying different interpolation techniques.

    PubMed

    Garzón-Machado, Víctor; Otto, Rüdiger; del Arco Aguilar, Marcelino José

    2014-07-01

    Different spatial interpolation techniques have been applied to construct objective bioclimatic maps of La Palma, Canary Islands. Interpolation of climatic data on this topographically complex island with strong elevation and climatic gradients represents a challenge. Furthermore, meteorological stations are not evenly distributed over the island, with few stations at high elevations. We carried out spatial interpolations of the compensated thermicity index (Itc) and the annual ombrothermic Index (Io), in order to obtain appropriate bioclimatic maps by using automatic interpolation procedures, and to establish their relation to potential vegetation units for constructing a climatophilous potential natural vegetation map (CPNV). For this purpose, we used five interpolation techniques implemented in a GIS: inverse distance weighting (IDW), ordinary kriging (OK), ordinary cokriging (OCK), multiple linear regression (MLR) and MLR followed by ordinary kriging of the regression residuals. Two topographic variables (elevation and aspect), derived from a high-resolution digital elevation model (DEM), were included in OCK and MLR. The accuracy of the interpolation techniques was examined by the results of the error statistics of test data derived from comparison of the predicted and measured values. Best results for both bioclimatic indices were obtained with the MLR method with interpolation of the residuals showing the highest R2 of the regression between observed and predicted values and lowest values of root mean square errors. MLR with correction of interpolated residuals is an attractive interpolation method for bioclimatic mapping on this oceanic island since it permits one to fully account for easily available geographic information but also takes into account local variation of climatic data.
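
    A minimal sketch of the best-performing method reported above (multiple linear regression followed by ordinary kriging of the residuals), assuming scikit-learn and pykrige are available. The station data, grid, covariate stand-ins and variogram model are illustrative, not values from the study.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from pykrige.ok import OrdinaryKriging

# Station observations: easting, northing (km), elevation (m), aspect (deg), index value
x = np.array([2.0, 5.5, 9.1, 3.3, 7.8])
y = np.array([1.0, 4.2, 2.7, 8.5, 6.1])
elev = np.array([150.0, 900.0, 1800.0, 450.0, 1200.0])
aspect = np.array([90.0, 180.0, 270.0, 45.0, 135.0])
itc = np.array([420.0, 300.0, 160.0, 390.0, 250.0])     # compensated thermicity index (placeholder)

# Step 1: regression of the index on the topographic covariates
X = np.column_stack([elev, aspect])
mlr = LinearRegression().fit(X, itc)
resid = itc - mlr.predict(X)

# Step 2: ordinary kriging of the regression residuals
ok = OrdinaryKriging(x, y, resid, variogram_model="spherical")
gx, gy = np.linspace(0, 10, 20), np.linspace(0, 10, 20)
resid_grid, _ = ok.execute("grid", gx, gy)

# Step 3: final map = regression trend on gridded covariates + kriged residuals
elev_grid = np.full((20, 20), 800.0)                    # stand-ins for DEM-derived grids
aspect_grid = np.full((20, 20), 180.0)
trend = mlr.predict(np.column_stack([elev_grid.ravel(), aspect_grid.ravel()])).reshape(20, 20)
itc_map = trend + resid_grid
```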

  17. Advanced Ecosystem Mapping Techniques for Large Arctic Study Domains Using Calibrated High-Resolution Imagery

    NASA Astrophysics Data System (ADS)

    Macander, M. J.; Frost, G. V., Jr.

    2015-12-01

    Regional-scale mapping of vegetation and other ecosystem properties has traditionally relied on medium-resolution remote sensing such as Landsat (30 m) and MODIS (250 m). Yet, the burgeoning availability of high-resolution (<=2 m) imagery and ongoing advances in computing power and analysis tools raise the prospect of performing ecosystem mapping at fine spatial scales over large study domains. Here we demonstrate cutting-edge mapping approaches over a ~35,000 km² study area on Alaska's North Slope using calibrated and atmospherically-corrected mosaics of high-resolution WorldView-2 and GeoEye-1 imagery: (1) an a priori spectral approach incorporating the Satellite Imagery Automatic Mapper (SIAM) algorithms; (2) image segmentation techniques; and (3) texture metrics. The SIAM spectral approach classifies radiometrically-calibrated imagery into general vegetation density categories and non-vegetated classes. The SIAM classes were developed globally and their applicability in arctic tundra environments has not been previously evaluated. Image segmentation, or object-based image analysis, automatically partitions high-resolution imagery into homogeneous image regions that can then be analyzed based on spectral, textural, and contextual information. We applied eCognition software to delineate waterbodies and vegetation classes, in combination with other techniques. Texture metrics were evaluated to determine the feasibility of using high-resolution imagery to algorithmically characterize periglacial surface forms (e.g., ice-wedge polygons), which are an important physical characteristic of permafrost-dominated regions but which cannot be distinguished by medium-resolution remote sensing. These advanced mapping techniques yield products that supply essential information supporting a broad range of ecosystem science and land-use planning applications in northern Alaska and elsewhere in the circumpolar Arctic.

  18. Testing the PV-Theta Mapping Technique in a 3-D CTM Model Simulation

    NASA Technical Reports Server (NTRS)

    Frith, Stacey M.

    2004-01-01

    Mapping lower stratospheric ozone into potential vorticity (PV)- potential temperature (Theta) coordinates is a common technique employed to analyze sparse data sets. Ozone transformed into a flow-following dynamical coordinate system is insensitive to meteorological variations. Therefore data from a wide range of times/locations can be compared, so long as the measurements were made in the same airmass (as defined by PV). Moreover, once a relationship between ozone and PV/Theta is established, a full 3D ozone field can be estimated from this relationship and the 3D analyzed PV field. However, ozone data mapped in this fashion can be hampered by noisy PV fields, or "mis-matches" in the resolution and/or exact location of the ozone and PV measurements. In this study, we investigate the PV-ozone relationship using output from a recent 50-year run of the Goddard 3D chemical transport model (CTM). Model constituents are transported using off-line dynamics from the finite volume general circulation model (FVGCM). By using the internally consistent model PV and ozone fields, we minimize noise due to mis-matching and resolution issues. We calculate correlations between model ozone and PV throughout the stratosphere, and test the sensitivity of the technique to initial data resolution. To do this we degrade the model data to that of various satellite instruments, then compare the mapped fields derived from the sub-sampled data to the full resolution model data. With these studies we can determine appropriate limits for the PV-theta mapping technique in latitude, altitude, and as a function of original data resolution.
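
    A minimal sketch of the PV-theta mapping idea described above: ozone samples are binned by potential vorticity and potential temperature, the bin means form a lookup table, and a field is reconstructed from analysed PV on a theta surface. All data here are synthetic placeholders, and empty bins would need filling in practice.

```python
import numpy as np
from scipy.stats import binned_statistic_2d

rng = np.random.default_rng(1)
pv = rng.uniform(0.0, 10.0, 5000)          # potential vorticity samples (PVU)
theta = rng.uniform(320.0, 800.0, 5000)    # potential temperature samples (K)
ozone = 50.0 + 20.0 * pv + 0.1 * theta + rng.normal(0, 5, 5000)   # synthetic ozone (ppbv)

# Build the ozone(PV, theta) lookup table from binned means
lookup, pv_edges, th_edges, _ = binned_statistic_2d(
    pv, theta, ozone, statistic="mean", bins=[20, 12])

# Reconstruct ozone on an analysed PV field at a single theta level (e.g. 500 K)
pv_field = rng.uniform(0.0, 10.0, (30, 40))
i = np.clip(np.digitize(pv_field, pv_edges) - 1, 0, 19)
j = np.clip(np.digitize(np.full_like(pv_field, 500.0), th_edges) - 1, 0, 11)
ozone_field = lookup[i, j]
```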

  19. Bioclimatic and vegetation mapping of a topographically complex oceanic island applying different interpolation techniques

    NASA Astrophysics Data System (ADS)

    Garzón-Machado, Víctor; Otto, Rüdiger; del Arco Aguilar, Marcelino José

    2014-07-01

    Different spatial interpolation techniques have been applied to construct objective bioclimatic maps of La Palma, Canary Islands. Interpolation of climatic data on this topographically complex island with strong elevation and climatic gradients represents a challenge. Furthermore, meteorological stations are not evenly distributed over the island, with few stations at high elevations. We carried out spatial interpolations of the compensated thermicity index (Itc) and the annual ombrothermic Index (Io), in order to obtain appropriate bioclimatic maps by using automatic interpolation procedures, and to establish their relation to potential vegetation units for constructing a climatophilous potential natural vegetation map (CPNV). For this purpose, we used five interpolation techniques implemented in a GIS: inverse distance weighting (IDW), ordinary kriging (OK), ordinary cokriging (OCK), multiple linear regression (MLR) and MLR followed by ordinary kriging of the regression residuals. Two topographic variables (elevation and aspect), derived from a high-resolution digital elevation model (DEM), were included in OCK and MLR. The accuracy of the interpolation techniques was examined by the results of the error statistics of test data derived from comparison of the predicted and measured values. Best results for both bioclimatic indices were obtained with the MLR method with interpolation of the residuals showing the highest R 2 of the regression between observed and predicted values and lowest values of root mean square errors. MLR with correction of interpolated residuals is an attractive interpolation method for bioclimatic mapping on this oceanic island since it permits one to fully account for easily available geographic information but also takes into account local variation of climatic data.

  20. X-ray elemental mapping techniques for elucidating the ecophysiology of hyperaccumulator plants.

    PubMed

    van der Ent, Antony; Przybyłowicz, Wojciech J; de Jonge, Martin D; Harris, Hugh H; Ryan, Chris G; Tylko, Grzegorz; Paterson, David J; Barnabas, Alban D; Kopittke, Peter M; Mesjasz-Przybyłowicz, Jolanta

    2017-10-10

    Hyperaccumulators are attractive models for studying metal(loid) homeostasis, and probing the spatial distribution and coordination chemistry of metal(loid)s in their tissues is important for advancing our understanding of their ecophysiology. X-ray elemental mapping techniques are unique in providing in situ information, and with appropriate sample preparation offer results true to biological conditions of the living plant. The common platform of these techniques is a reliance on characteristic X-rays of elements present in a sample, excited either by electrons (scanning/transmission electron microscopy), protons (proton-induced X-ray emission) or X-rays (X-ray fluorescence microscopy). Elucidating the cellular and tissue-level distribution of metal(loid)s is inherently challenging and accurate X-ray analysis places strict demands on sample collection, preparation and analytical conditions, to avoid elemental redistribution, chemical modification or ultrastructural alterations. We compare the merits and limitations of the individual techniques, and focus on the optimal field of applications for inferring ecophysiological processes in hyperaccumulator plants. X-ray elemental mapping techniques can play a key role in answering questions at every level of metal(loid) homeostasis in plants, from the rhizosphere interface, to uptake pathways in the roots and shoots. Further improvements in technological capabilities offer exciting perspectives for the study of hyperaccumulator plants into the future.

  1. Induced contrast asynchronies may be useful for luminance photometry.

    PubMed

    Shapiro, Arthur G; D'Antona, Anthony; Smith, Jared B; Belano, Lindsay A; Charles, Justin P

    2004-01-01

    Shapiro et al. (2004) introduced a new visual effect (the induced contrast asynchrony) that demonstrates a perceptual separation between the response to a modulated light and the response to contrast of the light relative to background. The effect is composed of two physically identical disks, one surrounded by a dark annulus and the other by a light annulus. The luminance levels of both central disks were modulated in time, producing a stimulus with in-phase luminance modulation and antiphase contrast modulation. Observers primarily perceived the disks to be modulating asynchronously (i.e. they perceived the contrast), but at low temporal frequencies could also track the luminance level. Here we document that the induced contrast asynchrony disappears when the surrounds are achromatic and the center lights are modulated near the equiluminant axis. Observers viewed 1-deg-diameter disks embedded in 2-deg-diameter achromatic surrounds. The chromaticity of the disks was modulated in time (1 Hz) along lines in an S versus Luminance cardinal color plane and an L-M versus Luminance cardinal color plane; observers responded as to whether the modulation appeared in phase. For all observers and both color planes, the lights appeared in phase most frequently at angles near the standard observer's equiluminant line and out of phase at angles further away from that line. Observers differed in the range of angles that produce the appearance of in-phase modulation. The results suggest that induced contrast asynchronies may be useful as a technique for equating luminance of disparate lights.

  2. Applying data fusion techniques for benthic habitat mapping and monitoring in a coral reef ecosystem

    NASA Astrophysics Data System (ADS)

    Zhang, Caiyun

    2015-06-01

    Accurate mapping and effective monitoring of benthic habitat in the Florida Keys are critical in developing management strategies for this valuable coral reef ecosystem. For this study, a framework was designed for automated benthic habitat mapping by combining multiple data sources (hyperspectral, aerial photography, and bathymetry data) and four contemporary image processing techniques (data fusion, Object-based Image Analysis (OBIA), machine learning, and ensemble analysis). In the framework, the 1-m digital aerial photography was first merged with 17-m hyperspectral imagery and 10-m bathymetry data using a pixel/feature-level fusion strategy. The fused dataset was then preclassified by three machine learning algorithms (Random Forest, Support Vector Machines, and k-Nearest Neighbor). Final object-based habitat maps were produced through ensemble analysis of the outcomes from the three classifiers. The framework was tested for classifying group-level (3-class) and code-level (9-class) habitats in a portion of the Florida Keys. Informative and accurate habitat maps were achieved with overall accuracies of 88.5% and 83.5% for the group-level and code-level classifications, respectively.
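
    A minimal sketch of the ensemble step in the framework above: Random Forest, SVM and k-NN are trained on fused features and their predictions are combined by majority vote. The feature matrix and habitat labels are synthetic placeholders, not the Florida Keys data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 12))             # fused spectral + bathymetry features per object (synthetic)
y = rng.integers(0, 3, size=300)           # 3 group-level habitat classes (synthetic labels)

ensemble = VotingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
        ("svm", SVC(random_state=0)),
        ("knn", KNeighborsClassifier(n_neighbors=5)),
    ],
    voting="hard",                         # simple majority vote across the three classifiers
)
ensemble.fit(X, y)
habitat_pred = ensemble.predict(X)
```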

  3. Mapping Ionospheric Electrodynamics with SuperDARN Data: An Assimilative Technique

    NASA Astrophysics Data System (ADS)

    Cousins, E. D.; Matsuo, T.; Richmond, A. D.

    2013-12-01

    Mapping of electrodynamic parameters in Earth's high-latitude ionosphere has been performed widely for many years. Knowledge of the complete pattern of ionospheric plasma convection enables studies of solar wind-magnetosphere-ionosphere coupling, allows for the placing of localized observations in a larger context, and can be used to drive coupled ionosphere-thermosphere first-principles models. In the past, an assimilative procedure has been developed to map ionospheric electrodynamics based on combining various types of observations (e.g., magnetometers, satellites, and incoherent or coherent scatter radars) [Richmond and Kamide, 1988], and a simple regression technique has been developed to use exclusively data from the Super Dual Auroral Radar Network (SuperDARN) coherent backscatter radars [Ruohoniemi and Baker, 1998; Shepherd and Ruohoniemi, 2000]. In this study, an assimilative procedure is developed to combine exclusively SuperDARN data with the empirical plasma convection model of Cousins and Shepherd [2010] (CS10, also based on SuperDARN data). The statistical properties of the CS10 model errors are investigated using the method of empirical orthogonal functions (EOFs) applied to an ~2 yr data set, and a parameterization of the CS10 model error covariance matrix is obtained. The new SuperDARN assimilative procedure, which takes into account realistic error covariances, is shown to significantly improve the mapping of electrodynamic parameters (as compared to the existing regression-based SuperDARN mapping procedure). The application of this assimilative procedure is demonstrated for a geomagnetic storm.

  4. Two-dimensional wavelet mapping techniques for damage detection in structural systems

    NASA Astrophysics Data System (ADS)

    Amizic, Bruno; Amaravadi, Venkat; Rao, Vittal S.; Derriso, Mark M.

    2002-07-01

    The application of wavelet transforms to measured vibration data provides a new tool for damage detection analysis of two-dimensional structural systems. In this paper, a novel two-dimensional wavelet mapping technique for damage detection, based on wavelet transforms and residual mode shapes, is proposed. After the vibration data were collected, wavelet de-noising shrinkage was performed in order to reduce measurement noise. By performing wavelet decomposition of the residuals of the mode shapes and taking only the detail coefficients, wavelet energy maps are constructed for all decomposition levels. The orthogonality property of the wavelet transforms has been utilized to correlate energy at the decomposition levels with the measured vibrational energy. After the wavelet maps of interest are determined, they are overlaid on one another to identify damaged areas of the two-dimensional structural system. The energy segmentation procedure is performed using minimum-homogeneity and uncertainty-based thresholding methods. It has been shown that the proposed method can clearly locate multiple damage locations on two-dimensional structures. The method requires few sampling points and is robust and independent of the type of damage or the material damaged. The proposed method is applied to detect multiple damage locations on a two-dimensional plate. The results are very satisfactory.
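
    A minimal sketch of the wavelet energy-map idea, assuming the pywt package: a 2-D mode-shape residual is decomposed and the detail-coefficient energy at each level is used as a damage indicator. The residual field, wavelet choice and decomposition depth are assumptions.

```python
import numpy as np
import pywt

residual = np.random.randn(64, 64)                 # residual mode shape (healthy minus damaged), synthetic
residual[30:34, 30:34] += 5.0                      # synthetic local anomaly standing in for damage

coeffs = pywt.wavedec2(residual, wavelet="db2", level=3)
energy_maps = []
for cH, cV, cD in coeffs[1:]:                      # skip the approximation, keep detail sub-bands
    energy_maps.append(cH ** 2 + cV ** 2 + cD ** 2)   # detail energy map at this level

# Thresholding the (upsampled, overlaid) energy maps then highlights candidate damage areas
```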

  5. Making Accurate Topographic Maps of the Schoolyard Using Ideas and Techniques Learned and Adapted from Multi-beam Sonar Mapping of the Arctic Ocean

    NASA Astrophysics Data System (ADS)

    Fuerst, S. I.; Roberts, J. D.

    2010-12-01

    Having participated in a University of Rhode Island Project Armada expedition to join the University of New Hampshire Center for Coastal and Oceanographic Studies in making multi-beam sonar contour maps of the Arctic Ocean floor, I was able to bring the principles learned from this trip to my earth science high school students and create a project in our "mapping the earth" unit. Students learn basic surveying techniques and create authentic, accurately detailed topographic maps of the schoolyard. Models of their maps are then constructed of either Styrofoam or wood which enables them to make the transition from a 2-dimensional map to a 3-dimensional representation. Even though our maps are created using sticks, line levels, compasses and GPS, the scientific concepts of using location and elevation data to draw contour lines are identical to those used in underwater mapping. Once the students understand the science in mapping and creating contour maps to scale on graph paper by hand, they are able to easily relate this knowledge to what I was doing onboard ship using multi-beam sonar and computer mapping programs. We would like to share with you the lab and techniques that we have developed to make this activity possible with minimal materials and simple technology. As a background extension, it is also possible to replicate sonar measurements using an aquarium, food coloring, and a surface grid to map the topography of a teacher created landscape on the aquarium bottom. Earth Science students using simple tools to accurately map the topography of the school grounds

  6. Spectropolarimetry of hot, luminous stars

    NASA Technical Reports Server (NTRS)

    Schulte-Ladbeck, Regina E.

    1994-01-01

    I review polarimetric observations of presumably single, hot luminous stars. The stellar types discussed are OB stars. B(e) supergiants, Luminous Blue Variables (LBV), Wolf-Rayet (W-R) stars, and type II supernovae (SN). It is shown that variable, intrinsic polarization is a common phenomenon in that part of the Hertzsprung-Russell (HR) diagram which these stars occupy. However, much observational work remains to be done before we can answer the most basic, statistical questions about the polarimetric properties of different groups of hot, luminous stars. Insight into the diagnostic power of polarization observations has been gained, but cannot be exploited without detailed models. Thus, while polarimetric observations do tell us that the mass-loss processes of all types of massive stars are time-dependent and anisotropic, the significance that this might have for the accuracy of their stellar parameters and evolutionary paths remains elusive.

  7. Urban Mapping and Growth Prediction using Remote Sensing and GIS Techniques, Pune, India

    NASA Astrophysics Data System (ADS)

    Sivakumar, V.

    2014-11-01

    This study aims to map the urban area in and around the Pune region between 1991 and 2010, and to predict its probable future growth using remote sensing and GIS techniques. Landsat TM and ETM+ satellite images from 1991, 2001 and 2010 were used to analyse the urban land-use class. The urban class was extracted/mapped using a supervised classification technique with a maximum likelihood classifier. An accuracy assessment was carried out for the classified maps; the overall accuracy and Kappa statistic achieved were 86.33% and 0.76, respectively. A transition probability matrix and area changes were obtained from the different classified images. A plug-in based on a Markov chain model algorithm was developed in the open-source QGIS software for predicting probable urban growth for the year 2021. Based on the available data set, the results show that the urban area is expected to grow substantially by 2021 compared with 2010. This study provides an insight into understanding urban growth and aids subsequent infrastructure planning, management and decision-making.
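
    A minimal sketch of the Markov-chain projection step: class areas from two classified dates define a transition probability matrix, which is applied to the latest class-area vector to project the next epoch. The class order, areas and matrix values are illustrative assumptions, not the Pune results.

```python
import numpy as np

# Classes: [urban, vegetation, other]; each row gives the probability of staying in
# or moving out of that class over one epoch (rows sum to 1)
P = np.array([
    [0.95, 0.02, 0.03],
    [0.10, 0.85, 0.05],
    [0.12, 0.06, 0.82],
])
area_2010 = np.array([250.0, 600.0, 400.0])        # km^2 per class, placeholder values

# One Markov step (2010 -> 2021 treated as a single transition here)
area_2021 = area_2010 @ P
```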

  8. Performance analysis of the Microsoft Kinect sensor for 2D Simultaneous Localization and Mapping (SLAM) techniques.

    PubMed

    Kamarudin, Kamarulzaman; Mamduh, Syed Muhammad; Shakaff, Ali Yeon Md; Zakaria, Ammar

    2014-12-05

    This paper presents a performance analysis of two open-source, laser scanner-based Simultaneous Localization and Mapping (SLAM) techniques (i.e., Gmapping and Hector SLAM) using a Microsoft Kinect to replace the laser sensor. Furthermore, the paper proposes a new system integration approach whereby a Linux virtual machine is used to run the open source SLAM algorithms. The experiments were conducted in two different environments: a small room with no features and a typical office corridor with desks and chairs. Using the data logged from real-time experiments, each SLAM technique was simulated and tested with different parameter settings. The results show that the system is able to achieve real-time SLAM operation. The system implementation offers a simple and reliable way to compare the performance of a Windows-based SLAM algorithm with the algorithms typically implemented in a Robot Operating System (ROS). The results also indicate that certain modifications to the default laser scanner-based parameters are able to improve the map accuracy. However, the limited field of view and range of the Kinect's depth sensor often cause the map to be inaccurate, especially in featureless areas; therefore the Kinect sensor is not a direct replacement for a laser scanner, but rather offers a feasible alternative for 2D SLAM tasks.

  9. Novel Techniques of Real-time Blood Flow and Functional Mapping: Technical Note

    PubMed Central

    KAMADA, Kyousuke; OGAWA, Hiroshi; SAITO, Masato; TAMURA, Yukie; ANEI, Ryogo; KAPELLER, Christoph; HAYASHI, Hideaki; PRUECKL, Robert; GUGER, Christoph

    2014-01-01

    There are two main approaches to intraoperative monitoring in neurosurgery. One approach is related to fluorescent phenomena and the other is related to oscillatory neuronal activity. We developed novel techniques to visualize blood flow (BF) conditions in real time, based on indocyanine green videography (ICG-VG) and the electrophysiological phenomenon of high gamma activity (HGA). We investigated the use of ICG-VG in four patients with moyamoya disease and two with arteriovenous malformation (AVM), and we investigated the use of real-time HGA mapping in four patients with brain tumors who underwent lesion resection with awake craniotomy. Real-time data processing of ICG-VG was based on perfusion imaging, which generated parameters including arrival time (AT), mean transit time (MTT), and BF of brain surface vessels. During awake craniotomy, we analyzed the frequency components of brain oscillation and performed real-time HGA mapping to identify functional areas. Processed results were projected on a wireless monitor linked to the operating microscope. After revascularization for moyamoya disease, AT and BF were significantly shortened and increased, respectively, suggesting hyperperfusion. Real-time fusion images on the wireless monitor provided anatomical, BF, and functional information simultaneously, and allowed the resection of AVMs under the microscope. Real-time HGA mapping during awake craniotomy rapidly indicated the eloquent areas of motor and language function and significantly shortened the operation time. These novel techniques, which we have introduced, might improve the reliability of intraoperative monitoring and enable the development of rational and objective surgical strategies. PMID:25263624

  10. Performance Analysis of the Microsoft Kinect Sensor for 2D Simultaneous Localization and Mapping (SLAM) Techniques

    PubMed Central

    Kamarudin, Kamarulzaman; Mamduh, Syed Muhammad; Shakaff, Ali Yeon Md; Zakaria, Ammar

    2014-01-01

    This paper presents a performance analysis of two open-source, laser scanner-based Simultaneous Localization and Mapping (SLAM) techniques (i.e., Gmapping and Hector SLAM) using a Microsoft Kinect to replace the laser sensor. Furthermore, the paper proposes a new system integration approach whereby a Linux virtual machine is used to run the open source SLAM algorithms. The experiments were conducted in two different environments: a small room with no features and a typical office corridor with desks and chairs. Using the data logged from real-time experiments, each SLAM technique was simulated and tested with different parameter settings. The results show that the system is able to achieve real-time SLAM operation. The system implementation offers a simple and reliable way to compare the performance of a Windows-based SLAM algorithm with the algorithms typically implemented in a Robot Operating System (ROS). The results also indicate that certain modifications to the default laser scanner-based parameters are able to improve the map accuracy. However, the limited field of view and range of the Kinect's depth sensor often cause the map to be inaccurate, especially in featureless areas; therefore the Kinect sensor is not a direct replacement for a laser scanner, but rather offers a feasible alternative for 2D SLAM tasks. PMID:25490595

  11. Novel techniques of real-time blood flow and functional mapping: technical note.

    PubMed

    Kamada, Kyousuke; Ogawa, Hiroshi; Saito, Masato; Tamura, Yukie; Anei, Ryogo; Kapeller, Christoph; Hayashi, Hideaki; Prueckl, Robert; Guger, Christoph

    2014-01-01

    There are two main approaches to intraoperative monitoring in neurosurgery. One approach is related to fluorescent phenomena and the other is related to oscillatory neuronal activity. We developed novel techniques to visualize blood flow (BF) conditions in real time, based on indocyanine green videography (ICG-VG) and the electrophysiological phenomenon of high gamma activity (HGA). We investigated the use of ICG-VG in four patients with moyamoya disease and two with arteriovenous malformation (AVM), and we investigated the use of real-time HGA mapping in four patients with brain tumors who underwent lesion resection with awake craniotomy. Real-time data processing of ICG-VG was based on perfusion imaging, which generated parameters including arrival time (AT), mean transit time (MTT), and BF of brain surface vessels. During awake craniotomy, we analyzed the frequency components of brain oscillation and performed real-time HGA mapping to identify functional areas. Processed results were projected on a wireless monitor linked to the operating microscope. After revascularization for moyamoya disease, AT and BF were significantly shortened and increased, respectively, suggesting hyperperfusion. Real-time fusion images on the wireless monitor provided anatomical, BF, and functional information simultaneously, and allowed the resection of AVMs under the microscope. Real-time HGA mapping during awake craniotomy rapidly indicated the eloquent areas of motor and language function and significantly shortened the operation time. These novel techniques, which we have introduced, might improve the reliability of intraoperative monitoring and enable the development of rational and objective surgical strategies.

  12. Micro-scale strain mapping technique: a tool to quantify strain partitioning during creep deformation

    NASA Astrophysics Data System (ADS)

    Quintanilla-Terminel, Alejandra; Zimmerman, Mark; Evans, Brian; Kohlstedt, David

    2016-04-01

    Several deformation mechanisms interact to accommodate plastic deformation. Quantifying the contribution of each to the total strain is necessary for establishing a better link between observed microstructures and mechanical data, as well as to allow more confident extrapolation from laboratory to natural conditions. In this contribution, we present the experimental and computational techniques involved in micro-scale strain mapping (MSSM). The MSSM technique relies on analyzing the relative displacement of initially regularly spaced markers after deformation. We present several microfabrication techniques that permit us to pattern various rocks with micrometric and nanometric metal markers, as well as the challenges faced in working at high temperatures and pressures. A Hough transform algorithm was used to detect the markers and automate the strain analysis as much as possible. The von Mises strain is calculated for a set of n points and their relative displacements, which allows us to map the strain at different length scales. We applied the MSSM technique to study strain partitioning during creep deformation of Carrara marble and San Carlos olivine at a confining pressure, Pc, of 300 MPa and homologous temperatures of 0.3 to 0.6. We measured the local strain and strain heterogeneity produced during creep deformation of split cylinders of Carrara marble under conventional triaxial loading to inelastic strains of 11 to 36% at a strain rate of 3×10⁻⁵ s⁻¹, Pc = 300 MPa and 400 °C < T < 700 °C. We conclude that the evolution of deformation structures in marble takes place over a substantial interval in strain and that the duration of this interval depends on strain rate, temperature, and pressure. Our first results on strain mapping of olivine deformed at T = 1150 °C and Pc = 300 MPa demonstrate promise for characterizing intragranular strain and better defining the contribution of grain boundary sliding to the total strain.
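
    As a schematic illustration of the displacement-to-strain step described above (not the authors' code), the sketch below fits a single affine deformation gradient to matched marker positions and reports an equivalent (von Mises) strain; the plane-strain convention and the synthetic marker grid are assumptions.

      import numpy as np

      def equivalent_strain(markers_before, markers_after):
          """Equivalent (von Mises) strain from matched 2-D marker positions.

          markers_before, markers_after : (N, 2) arrays of marker coordinates.
          A single best-fit affine deformation over the marker set is assumed.
          """
          X = np.asarray(markers_before, float)
          x = np.asarray(markers_after, float)
          Xc, xc = X - X.mean(0), x - x.mean(0)
          # Least-squares deformation gradient F such that xc ~ Xc @ F.T
          A, *_ = np.linalg.lstsq(Xc, xc, rcond=None)
          F = A.T
          # Small-strain tensor and its deviatoric part (plane-strain convention)
          eps = 0.5 * (F + F.T) - np.eye(2)
          dev = eps - np.trace(eps) / 3.0 * np.eye(2)
          return np.sqrt(2.0 / 3.0 * np.sum(dev * dev))

      # Illustrative grid of markers sheared by 10 %
      X = np.array([[i, j] for i in range(5) for j in range(5)], float)
      x = X @ np.array([[1.0, 0.10], [0.0, 1.0]]).T
      print(equivalent_strain(X, x))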

  13. Accuracy of vertical radial plume mapping technique in measuring lagoon gas emissions.

    PubMed

    Viguria, Maialen; Ro, Kyoung S; Stone, Kenneth C; Johnson, Melvin H

    2015-04-01

    Recently, the U.S. Environmental Protection Agency (EPA) posted on its Web site a ground-based optical remote sensing method called Other Test Method (OTM) 10 for measuring fugitive gas emission flux from area sources such as closed landfills. OTM 10 utilizes the vertical radial plume mapping (VRPM) technique to calculate fugitive gas emission mass rates based on measured wind speed profiles and path-integrated gas concentrations (PICs). This study evaluates the accuracy of the VRPM technique in measuring gas emissions from animal waste treatment lagoons. A field trial was designed to evaluate the accuracy of the VRPM technique. Controlled releases of methane (CH4) were made from a 45 m × 45 m floating perforated pipe network located on an irrigation pond that resembled typical treatment lagoon environments. The accuracy of the VRPM technique was expressed by the ratio of the calculated emission rates (QVRPM) to actual emission rates (Q). Under an ideal condition of having mean wind directions mostly normal to a downwind vertical plane, the average VRPM accuracy was 0.77±0.32. However, when the mean wind direction was mostly not normal to the downwind vertical plane, the emission plume was not adequately captured, resulting in lower accuracies. The accuracies under these nonideal wind conditions could be significantly improved when we relaxed the VRPM wind direction criteria and combined the emission rates determined from two adjacent downwind vertical planes surrounding the lagoon. With this modification, the VRPM accuracy improved to 0.97±0.44, while the number of valid data sets also increased from 113 to 186. The need for developing accurate and feasible measuring techniques for fugitive gas emissions from animal waste lagoons is vital for livestock gas inventories and implementation of mitigation strategies. This field lagoon gas emission study demonstrated that the EPA's vertical radial plume mapping (VRPM) technique can be used to accurately measure lagoon gas
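
    For orientation, the VRPM idea of combining a wind speed profile with a concentration field reconstructed on the downwind vertical plane reduces to a plane flux integral; the sketch below evaluates such an integral on an idealized Gaussian plume, which is only a stand-in for the OTM 10 reconstruction (all values are synthetic).

      import numpy as np

      # Simplified sketch of the VRPM flux idea: emission rate = integral over the
      # downwind vertical plane of (wind speed normal to the plane) x (concentration).
      y = np.linspace(-50.0, 50.0, 201)          # crosswind coordinate (m)
      z = np.linspace(0.0, 20.0, 81)             # height (m)
      Y, Z = np.meshgrid(y, z)

      # Stand-in Gaussian plume concentration field (g/m^3) and a wind speed profile (m/s)
      conc = 1e-3 * np.exp(-Y**2 / (2 * 15.0**2)) * np.exp(-(Z - 2.0)**2 / (2 * 3.0**2))
      u = 2.5 * (np.maximum(Z, 0.1) / 10.0) ** 0.15

      flux = np.trapz(np.trapz(u * conc, y, axis=1), z)   # g/s
      print(f"estimated emission rate: {flux:.2f} g/s")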

  14. Mapping forest vegetation with ERTS-1 MSS data and automatic data processing techniques

    NASA Technical Reports Server (NTRS)

    Messmore, J.; Copeland, G. E.; Levy, G. F.

    1975-01-01

    This study was undertaken with the intent of elucidating the forest mapping capabilities of ERTS-1 MSS data when analyzed with the aid of LARS' automatic data processing techniques. The site for this investigation was the Great Dismal Swamp, a 210,000-acre wilderness area located on the Middle Atlantic coastal plain. Due to inadequate ground truth information on the distribution of vegetation within the swamp, an unsupervised classification scheme was utilized. Initially, pictureprints resembling low-resolution photographs were generated in each of the four ERTS-1 channels. Data found within rectangular training fields were then clustered into 13 spectral groups and defined statistically. Using a maximum likelihood classification scheme, the unknown data points were subsequently classified into one of the designated training classes. Training field data were classified with a high degree of accuracy (greater than 95 percent), and progress is being made towards identifying the mapped spectral classes.
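
    The maximum likelihood rule used for the final classification can be written compactly: each pixel is assigned to the Gaussian class with the highest log-likelihood given the class mean and covariance estimated from the training clusters. The sketch below uses synthetic four-band class statistics as placeholders, not the LARS implementation.

      import numpy as np

      def max_likelihood_classify(pixels, means, covs):
          """Assign each pixel (N, bands) to the Gaussian class with the highest log-likelihood."""
          scores = []
          for mu, cov in zip(means, covs):
              diff = pixels - mu
              inv = np.linalg.inv(cov)
              mahal = np.einsum('ij,jk,ik->i', diff, inv, diff)     # squared Mahalanobis distance
              scores.append(-0.5 * (mahal + np.log(np.linalg.det(cov))))
          return np.argmax(np.stack(scores, axis=1), axis=1)

      # Two illustrative spectral classes in four ERTS-1-like bands
      rng = np.random.default_rng(0)
      means = [np.array([30., 25., 60., 40.]), np.array([35., 30., 20., 15.])]
      covs = [np.eye(4) * 4.0, np.eye(4) * 6.0]
      pixels = rng.normal(means[1], 2.0, size=(10, 4))
      print(max_likelihood_classify(pixels, means, covs))   # mostly class 1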

  15. Erasing the Milky Way: new cleaning technique applied to GBT intensity mapping data

    NASA Astrophysics Data System (ADS)

    Wolz, L.; Blake, C.; Abdalla, F. B.; Anderson, C. J.; Chang, T.-C.; Li, Y.-C.; Masui, K. W.; Switzer, E.; Pen, U.-L.; Voytek, T. C.; Yadav, J.

    2017-02-01

    We present the first application of a new foreground removal pipeline to the current leading H I intensity mapping data set, obtained by the Green Bank Telescope (GBT). We study the 15- and 1-h-field data of the GBT observations previously presented in Masui et al. and Switzer et al., covering about 41 deg2 at 0.6 < z < 1.0, for which cross-correlations may be measured with the galaxy distribution of the WiggleZ Dark Energy Survey. In the presented pipeline, we subtract the Galactic foreground continuum and the point-source contamination using an independent component analysis technique (FASTICA), and develop a Fourier-based optimal estimator to compute the temperature power spectrum of the intensity maps and the cross-correlation with the galaxy survey data. We show that FASTICA is a reliable tool for subtracting diffuse and point-source emission through the non-Gaussian nature of their probability distributions. The temperature power spectra of the intensity maps are dominated by instrumental noise on small scales, which FASTICA, as a conservative subtraction technique for non-Gaussian signals, cannot mitigate. However, we determine similar GBT-WiggleZ cross-correlation measurements to those obtained by the singular value decomposition (SVD) method, and confirm that foreground subtraction with FASTICA is robust against 21 cm signal loss, as seen by the converged amplitude of these cross-correlation measurements. We conclude that SVD and FASTICA are complementary methods for investigating the foregrounds and noise systematics present in intensity mapping data sets.
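
    As a toy illustration of the FASTICA-style foreground subtraction step (using scikit-learn's FastICA rather than the pipeline's own implementation), the sketch below estimates a few independent components from a frequency-by-pixel data matrix and subtracts their reconstruction; treating all retained components as foreground, and the synthetic data, are simplifying assumptions.

      import numpy as np
      from sklearn.decomposition import FastICA

      # Toy data cube: n_freq frequency maps of n_pix pixels each, containing a bright,
      # spectrally smooth "foreground" plus a faint Gaussian "signal".
      rng = np.random.default_rng(1)
      n_freq, n_pix = 64, 4096
      freqs = np.linspace(0.0, 1.0, n_freq)[:, None]
      foreground = 100.0 * (freqs ** 2.2) * rng.standard_normal(n_pix)[None, :]
      signal = 0.1 * rng.standard_normal((n_freq, n_pix))
      data = foreground + signal

      # Estimate a few independent components along the frequency direction and
      # subtract their reconstruction, leaving a foreground-cleaned residual.
      ica = FastICA(n_components=4, random_state=0)
      sources = ica.fit_transform(data)              # (n_freq, n_components)
      reconstruction = ica.inverse_transform(sources)
      residual = data - reconstruction
      print(residual.std(), signal.std())            # residual should approach the signal level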

  16. Erasing the Milky Way: New Cleaning Technique Applied to GBT Intensity Mapping Data

    NASA Technical Reports Server (NTRS)

    Wolz, L.; Blake, C.; Abdalla, F. B.; Anderson, C. J.; Chang, T.-C.; Li, Y.-C.; Masui, K. W.; Switzer, E.; Pen, U.-L.; Voytek, T. C.

    2016-01-01

    We present the first application of a new foreground removal pipeline to the current leading HI intensity mapping dataset, obtained by the Green Bank Telescope (GBT). We study the 15- and 1-h field data of the GBT observations previously presented in Masui et al. (2013) and Switzer et al. (2013), covering about 41 square degrees at 0.6 < z < 1.0, for which cross-correlations may be measured with the galaxy distribution of the WiggleZ Dark Energy Survey. In the presented pipeline, we subtract the Galactic foreground continuum and the point source contamination using an independent component analysis technique (fastica), and develop a Fourier-based optimal estimator to compute the temperature power spectrum of the intensity maps and cross-correlation with the galaxy survey data. We show that fastica is a reliable tool to subtract diffuse and point-source emission through the non-Gaussian nature of their probability distributions. The temperature power spectra of the intensity maps are dominated by instrumental noise on small scales which fastica, as a conservative subtraction technique of non-Gaussian signals, cannot mitigate. However, we determine similar GBT-WiggleZ cross-correlation measurements to those obtained by the Singular Value Decomposition (SVD) method, and confirm that foreground subtraction with fastica is robust against 21 cm signal loss, as seen by the converged amplitude of these cross-correlation measurements. We conclude that SVD and fastica are complementary methods to investigate the foregrounds and noise systematics present in intensity mapping datasets.

  17. A New Active Cavitation Mapping Technique for Pulsed HIFU Applications – Bubble Doppler

    PubMed Central

    Li, Tong; Khokhlova, Tatiana; Sapozhnikov, Oleg; Hwang, Joo Ha; O’Donnell, Matthew

    2015-01-01

    In this work, a new active cavitation mapping technique for pulsed high-intensity focused ultrasound (pHIFU) applications, termed bubble Doppler, is proposed and its feasibility tested in tissue-mimicking gel phantoms. pHIFU therapy uses short pulses, delivered at low pulse repetition frequency, to cause transient bubble activity that has been shown to enhance drug and gene delivery to tissues. The current gold standard for detecting and monitoring cavitation activity during pHIFU treatments is passive cavitation detection (PCD), which provides minimal information on the spatial distribution of the bubbles. B-mode imaging can detect hyperecho formation, but has very limited sensitivity, especially to small, transient microbubbles. The bubble Doppler method proposed here is based on a fusion of adaptations of three Doppler techniques that had previously been developed for imaging of ultrasound contrast agents: color Doppler, pulse inversion Doppler, and decorrelation Doppler. Doppler ensemble pulses were interleaved with therapeutic pHIFU pulses using three different pulse sequences, and standard Doppler processing was applied to the received echoes. The information yielded by each of the techniques on the distribution and characteristics of pHIFU-induced cavitation bubbles was evaluated separately, and found to be complementary. The unified approach, bubble Doppler, was then proposed both to spatially map the presence of transient bubbles and to estimate their sizes and the degree of nonlinearity. PMID:25265178

  18. Tropical Cyclone Vulnerability Mapping Using Geospatial Techniques: Application to a Coastal Upazila in Bangladesh

    NASA Astrophysics Data System (ADS)

    Hoque, M. A. A.; Phinn, S. R.; Roelfsema, C. M.; Childs, I.

    2015-12-01

    Cyclones are one of the most catastrophic natural disasters. Globally, many coastal regions are vulnerable to different categories of cyclones. In Bangladesh, disasters from tropical cyclones are annual occurrences in coastal areas, and the intensity and extent of damage due to tropical cyclones are very high. An appropriate mapping approach is essential for producing detailed vulnerability assessments that deliver useful information for reducing the impacts of cyclones on people, property and the environment. The present study developed and tested a vulnerability mapping approach for tropical cyclone impacts in Sarankhola upazila, a 151 km2 local government area located in coastal Bangladesh. The study applied the approach by integrating remote sensing, field data and multi-criteria evaluation at regional scales covering <1000 km2. Seven criteria related to cyclone impacts were considered: elevation, slope, geomorphology, proximity to coastline, proximity to cyclone track, land use and population density. Thematic raster map layers were prepared for every criterion and weighted using the Analytical Hierarchy Process (AHP) with sensitivity analysis. The Weighted Linear Combination (WLC) technique was used to overlay the standardized criterion maps with their weights to produce the vulnerability map. Our results indicated that 15% of the study area had very high vulnerability, mostly close to the river and in densely populated areas, with a further 40% of the area, largely cropland, classified as high vulnerability. Around 25% of the area, covering most of the forests, was classified as moderate vulnerability. The low and very low vulnerability areas account for 12% and 8%, respectively. The approach provided very promising results, which were verified by field survey, and demonstrates strong applicability for assessing the vulnerability of coastal Bangladesh to tropical cyclones.
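
    The weighted linear combination overlay at the core of this approach is straightforward to sketch: standardized criterion rasters are multiplied by their AHP weights and summed, then sliced into vulnerability classes. The rasters, weights and class breaks below are placeholders, not values from the study.

      import numpy as np

      # Seven standardized criterion rasters (values scaled to 0-1, higher = more vulnerable)
      rng = np.random.default_rng(42)
      criteria = {name: rng.random((100, 100)) for name in
                  ["elevation", "slope", "geomorphology", "dist_coast",
                   "dist_track", "land_use", "pop_density"]}

      # Illustrative AHP-style weights (must sum to 1); not the weights from the study
      weights = {"elevation": 0.25, "slope": 0.10, "geomorphology": 0.15, "dist_coast": 0.20,
                 "dist_track": 0.10, "land_use": 0.10, "pop_density": 0.10}

      # Weighted linear combination: vulnerability = sum_i w_i * x_i
      vulnerability = sum(weights[k] * criteria[k] for k in criteria)

      # Classify into five classes (very low ... very high) by equal quantiles
      classes = np.digitize(vulnerability, np.quantile(vulnerability, [0.2, 0.4, 0.6, 0.8]))
      print(np.bincount(classes.ravel()) / classes.size)   # share of area in each class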

  19. Study of lumineers' interfaces by means of optical coherence tomography

    NASA Astrophysics Data System (ADS)

    de Andrade Borges, Erica; Fernandes Cassimiro-Silva, Patrícia; Osório Fernandes, Luana; Leônidas Gomes, Anderson Stevens

    2015-06-01

    OCT has been used to evaluate dental materials, and it is employed here to evaluate lumineers for the first time. Lumineers are used as esthetic indirect restorations; after wear and aging, several undesirable features such as gaps, bubbles and mismatch can appear, which would otherwise only be seen by invasive analysis. OCT (spectral domain SD-OCT, 930 nm central wavelength) was used to evaluate the lumineer-cement-tooth interface noninvasively. We analyzed 20 lumineer-tooth specimens that were prepared in bovine teeth and randomly allocated to 4 experimental groups (n=5) with two different cementation techniques and two different types of cementing agent (RelyX U200 and RelyX Veneer, 3M ESPE, with the adhesive recommended by the manufacturer). The lumineers were made of lithium disilicate and obtained using a vacuum injection technique. The analysis was performed using 2D and 3D OCT images obtained before and after cementing and after thermal cycling to simulate thermal stress in an oral cavity. Initial measurements showed that the SD-OCT was able to see through the 500 μm thick lumineer, as supplied by the manufacturer, and internal stress was observed. Failures were found in the cementing process and also after aging simulation by thermal cycling. Adhesive failures such as bubbles, gaps and degradation of the cementation line are the natural precursors of other defects reported by several clinical follow-up studies (detachments, fractures and cracks). Bubble dimensions ranging from 146 μm to 1427 μm were measured, and OCT was validated as a precise investigative tool for evaluation of the lumineer-cement-tooth interface.

  20. Brain MAPS: an automated, accurate and robust brain extraction technique using a template library

    PubMed Central

    Leung, Kelvin K.; Barnes, Josephine; Modat, Marc; Ridgway, Gerard R.; Bartlett, Jonathan W.; Fox, Nick C.; Ourselin, Sébastien

    2011-01-01

    Whole brain extraction is an important pre-processing step in neuro-image analysis. Manual or semi-automated brain delineations are labour-intensive and thus not desirable in large studies, meaning that automated techniques are preferable. The accuracy and robustness of automated methods are crucial because human expertise may be required to correct any sub-optimal results, which can be very time consuming. We compared the accuracy of four automated brain extraction methods: Brain Extraction Tool (BET), Brain Surface Extractor (BSE), Hybrid Watershed Algorithm (HWA) and a Multi-Atlas Propagation and Segmentation (MAPS) technique we have previously developed for hippocampal segmentation. The four methods were applied to extract whole brains from 682 1.5T and 157 3T T1-weighted MR baseline images from the Alzheimer’s Disease Neuroimaging Initiative database. Semi-automated brain segmentations with manual editing and checking were used as the gold standard to compare with the results. The median Jaccard index of MAPS was higher than HWA, BET and BSE in 1.5T and 3T scans (p < 0.05, all tests), and the 1st-99th centile range of the Jaccard index of MAPS was smaller than HWA, BET and BSE in 1.5T and 3T scans (p < 0.05, all tests). HWA and MAPS were found to be best at including all brain tissues (median false negative rate ≤ 0.010% for 1.5T scans and ≤ 0.019% for 3T scans, both methods). The median Jaccard index of MAPS was similar in both 1.5T and 3T scans, whereas those of BET, BSE and HWA were higher in 1.5T scans than 3T scans (p < 0.05, all tests). We found that the diagnostic group had a small effect on the median Jaccard index of all four methods. In conclusion, MAPS had relatively high accuracy and low variability compared to HWA, BET and BSE in MR scans with and without atrophy. PMID:21195780
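
    For reference, the Jaccard index used to score the extractions is the intersection-over-union of the automated and gold standard masks; a minimal numpy version (toy boolean masks assumed) is:

      import numpy as np

      def jaccard(mask_a, mask_b):
          """Jaccard index (intersection over union) of two boolean segmentation masks."""
          a, b = np.asarray(mask_a, bool), np.asarray(mask_b, bool)
          return np.logical_and(a, b).sum() / np.logical_or(a, b).sum()

      # Illustrative check on two overlapping toy masks
      a = np.zeros((10, 10, 10), bool); a[2:8, 2:8, 2:8] = True
      b = np.zeros((10, 10, 10), bool); b[3:9, 3:9, 3:9] = True
      print(round(jaccard(a, b), 3))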

  1. Mapping land subsidence in Jakarta, Indonesia using persistent scatterer interferometry (PSI) technique with ALOS PALSAR

    NASA Astrophysics Data System (ADS)

    Ng, Alex Hay-Man; Ge, Linlin; Li, Xiaojing; Abidin, Hasanuddin Z.; Andreas, Heri; Zhang, Kui

    2012-08-01

    Jakarta is the capital of Indonesia. It is one of the largest cities, with a population of around 10 million as of 2010, covering an area of about 661 square kilometers. The land subsidence phenomena in several areas of Jakarta have been well known for many years. Land subsidence damages infrastructure in the city; therefore, it has to be closely monitored and analysed. In this paper, the authors demonstrate the capability of the PALSAR sensor aboard the Japanese ALOS satellite for large-scale mapping of land subsidence in Jakarta. Mapping results were produced from 17 ALOS PALSAR L-band radar images using persistent scatterer radar interferometry (PSI). The GEOS-PSI software, developed by the Geodesy and Earth Observing Systems group at UNSW for PSI analysis, was used to detect and map the land subsidence. The results showed that the land in the Jakarta area was deforming at different rates across several zones. The land subsidence measured from ALOS PALSAR imagery was cross-examined against the subsidence values obtained from 4 GPS campaign surveys conducted by the authors at 19 stations between 2007 and 2010. The magnitudes and trends of the deformation obtained from the two techniques agreed well in general at the GPS locations. After the removal of 2 suspected outliers in the GPS measurements, the subsidence rate difference between the two techniques ranged from -29 to 6 mm/year, with a standard deviation of 9 mm/year and an average absolute difference of 8 mm/year. This research suggests that PSI with L-band ALOS PALSAR data can be a promising technique to complement GPS surveying for monitoring land subsidence in very large cities like Jakarta.

  2. Spatial Field Variability Mapping of Rice Crop using Clustering Technique from Space Borne Hyperspectral Data

    NASA Astrophysics Data System (ADS)

    Moharana, S.; Dutta, S.

    2015-12-01

    Precision farming refers to field-specific management of an agricultural crop at a spatial scale, with the aim of obtaining the highest achievable yield; to achieve this, spatial information on field variability is essential. The spatial variability occurring within an agricultural field, which is difficult to map otherwise, can be revealed by employing spectral techniques on hyperspectral imagery rather than multispectral imagery. However, an advanced algorithm needs to be developed to make full use of the rich information content of hyperspectral data. In the present study, the potential of hyperspectral data acquired from a space platform was examined to map the field variation of a paddy crop and to discriminate its species. This high-dimensional data set, comprising 242 narrow spectral bands at 30 m ground resolution (Hyperion L1R product), was acquired for Assam, India (30th Sept and 3rd Oct, 2014) and subjected to the necessary pre-processing steps, followed by geometric correction using the Hyperion L1GST product. Finally, an atmospherically corrected and spatially subset image consisting of 112 bands was obtained. By employing an advanced clustering algorithm, 12 different clusters of crop spectral waveforms were generated from six paddy fields for each image. The findings showed that some clusters were well discriminated, representing specific rice genotypes, while some clusters were mixed and were treated as a single rice genotype. As vegetation indices (VIs) are good indicators for vegetation mapping, three ratio-based VI maps were also generated and unsupervised classification was performed on them. The 12 clusters of paddy crop so obtained were mapped spatially onto the derived VI maps. From these findings, heterogeneity was clearly captured in one of the 6 rice plots (rice plot no. 1), and heterogeneity was also observed in the remaining 5 rice plots. The degree of heterogeneity was found to be higher in rice plot no. 6 than in the other plots. Subsequently, spatial variability of the paddy field was
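
    The record does not spell out the clustering algorithm, so the sketch below uses ordinary k-means from scikit-learn as a stand-in to show the generic workflow: per-pixel spectra of a 112-band cube are grouped into 12 spectral clusters and mapped back to image space; the synthetic cube and parameters are assumptions.

      import numpy as np
      from sklearn.cluster import KMeans

      # Toy stand-in for an atmospherically corrected Hyperion subset:
      # rows x cols pixels, 112 retained bands.
      rng = np.random.default_rng(7)
      rows, cols, bands = 60, 60, 112
      cube = rng.random((rows, cols, bands)).astype(np.float32)

      # Cluster per-pixel spectra into 12 spectral classes and map them back to image space.
      spectra = cube.reshape(-1, bands)
      labels = KMeans(n_clusters=12, n_init=10, random_state=0).fit_predict(spectra)
      cluster_map = labels.reshape(rows, cols)
      print(np.bincount(labels))   # pixel count per spectral cluster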

  3. Mapping Glauconite Units Using Remote Sensing Techniques in North East of Iran

    NASA Astrophysics Data System (ADS)

    Ahmadirouhani, R.; Samiee, S.

    2014-10-01

    Glauconite is a greenish ferric-iron silicate mineral with a micaceous structure, characteristically formed in shallow marine environments. Glauconite has been used as a pigmentation agent for oil paint, as a contaminant remover in environmental studies, as a source of potassium in plant fertilizers, and in other industries. The Koppeh-dagh basin extends across Iran, Afghanistan and Turkmenistan, and glauconite units exist in this basin. In this research, remote sensing techniques such as Spectral Angle Mapper (SAM) classification, band ratios and band composition methods were applied in three steps to SPOT, ASTER and Landsat data to enhance and map glauconitic units in the Koppeh-dagh structural zone in north-east Iran.
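
    Spectral Angle Mapper classification assigns each pixel to the reference spectrum with the smallest spectral angle; a compact numpy sketch (synthetic spectra, illustrative only) follows.

      import numpy as np

      def spectral_angle_mapper(pixels, references):
          """Index of the reference spectrum with the smallest spectral angle, per pixel.

          pixels     : (N, bands) array of pixel spectra
          references : (K, bands) array of reference (e.g., glauconite) spectra
          """
          p = pixels / np.linalg.norm(pixels, axis=1, keepdims=True)
          r = references / np.linalg.norm(references, axis=1, keepdims=True)
          angles = np.arccos(np.clip(p @ r.T, -1.0, 1.0))   # (N, K) spectral angles in radians
          return np.argmin(angles, axis=1), np.min(angles, axis=1)

      # Illustrative use with two synthetic reference spectra over 6 bands
      refs = np.array([[0.1, 0.2, 0.4, 0.5, 0.3, 0.2],
                       [0.3, 0.3, 0.2, 0.2, 0.4, 0.5]])
      pix = refs[1] * 0.8 + 0.01   # a pixel resembling the second reference
      best, angle = spectral_angle_mapper(pix[None, :], refs)
      print(best, angle)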

  4. Characterization of a seeded pulsed molecular beam using the velocity map imaging technique

    NASA Astrophysics Data System (ADS)

    Lietard, Aude; Poisson, Lionel; Mestdagh, Jean-Michel; Gaveau, Marc-André

    2016-11-01

    An experimental study has been performed to characterize the density and the velocity distribution in a pulsed molecular beam generated by a source combining a pulsed valve with an oven placed just downstream. In its operating mode, the flow alternates between a supersonic and an effusive regime. The Velocity Map Imaging (VMI) technique, associated with laser ionization, allows measurement of the velocity distribution and the density of molecules as a function of time during the expansion. It gives a very precise insight into the structure of the molecule bunch, and therefore into the nature of the expansion from which the molecular beam is extracted.

  5. Microscale and nanoscale strain mapping techniques applied to creep of rocks

    NASA Astrophysics Data System (ADS)

    Quintanilla-Terminel, Alejandra; Zimmerman, Mark E.; Evans, Brian; Kohlstedt, David L.

    2017-07-01

    Usually several deformation mechanisms interact to accommodate plastic deformation. Quantifying the contribution of each to the total strain is necessary to bridge the gaps from observations of microstructures, to geomechanical descriptions, to extrapolating from laboratory data to field observations. Here, we describe the experimental and computational techniques involved in microscale strain mapping (MSSM), which allows strain produced during high-pressure, high-temperature deformation experiments to be tracked with high resolution. MSSM relies on the analysis of the relative displacement of initially regularly spaced markers after deformation. We present two lithography techniques used to pattern rock substrates at different scales: photolithography and electron-beam lithography. Further, we discuss the challenges of applying the MSSM technique to samples used in high-temperature and high-pressure experiments. We applied the MSSM technique to a study of strain partitioning during creep of Carrara marble and grain boundary sliding in San Carlos olivine, synthetic forsterite, and Solnhofen limestone at a confining pressure, Pc, of 300 MPa and homologous temperatures, T/Tm, of 0.3 to 0.6. The MSSM technique works very well up to temperatures of 700 °C. The experimental developments described here show promising results for higher-temperature applications.

  6. Reference map technique for finite-strain elasticity and fluid-solid interaction

    NASA Astrophysics Data System (ADS)

    Kamrin, Ken; Rycroft, Chris H.; Nave, Jean-Christophe

    2012-11-01

    The reference map, defined as the inverse motion function, is utilized in an Eulerian-frame representation of continuum solid mechanics, leading to a simple, explicit finite-difference method for solids undergoing finite deformations. We investigate the accuracy and applicability of the technique for a range of finite-strain elasticity laws under various geometries and loadings. Capacity to model dynamic, static, and quasi-static conditions is shown. Specifications of the approach are demonstrated for handling irregularly shaped and/or moving boundaries, as well as shock solutions. The technique is also integrated within a fluid-solid framework using a level-set to discern phases and using a standard explicit fluid solver for the fluid phases. We employ a sharp-interface method to institute the interfacial conditions, and the resulting scheme is shown to efficiently capture fluid-solid interaction solutions in several examples.
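
    A one-dimensional cartoon of the reference map idea may help: the reference coordinate xi(x, t) is advected by the flow (d(xi)/dt + v d(xi)/dx = 0) on a fixed Eulerian grid, and the deformation gradient follows from F = (d(xi)/dx)^(-1). The grid, velocity field and first-order upwind scheme below are illustrative choices, not the paper's discretization.

      import numpy as np

      # 1-D sketch: advect the reference coordinate xi with the flow, then recover F.
      n, L, dt, steps = 201, 1.0, 5e-4, 800
      x = np.linspace(0.0, L, n)
      dx = x[1] - x[0]
      xi = x.copy()                                  # undeformed at t = 0: xi(x, 0) = x
      v = 0.2 * np.sin(np.pi * x / L) ** 2           # smooth velocity, zero at both walls

      for _ in range(steps):
          # first-order upwind advection: d(xi)/dt + v d(xi)/dx = 0  (v >= 0 here)
          dxi_dx = np.empty_like(xi)
          dxi_dx[1:] = (xi[1:] - xi[:-1]) / dx       # backward difference for v >= 0
          dxi_dx[0] = 0.0                            # wall: v = 0, no transport
          xi -= dt * v * dxi_dx

      F = 1.0 / np.gradient(xi, dx)                  # deformation gradient along the bar
      print(F.min(), F.max())                        # F < 1 where material is compressed, F > 1 where stretched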

  7. The creation of a digital soil map for Cyprus using decision-tree classification techniques

    NASA Astrophysics Data System (ADS)

    Camera, Corrado; Zomeni, Zomenia; Bruggeman, Adriana; Noller, Joy; Zissimos, Andreas

    2014-05-01

    Considering the increasing threats soils are experiencing, especially in semi-arid, Mediterranean environments like Cyprus (erosion, contamination, sealing and salinisation), producing a high-resolution, reliable soil map is essential for further soil conservation studies. This study aims to create a 1:50,000 soil map covering the area under the direct control of the Republic of Cyprus (5,760 km2). The study consists of two major steps. The first is the creation of a raster database of predictive variables selected according to the scorpan formula (McBratney et al., 2003). Of particular interest is the possibility of using, as soil properties, data from three older island-wide soil maps and the recently published geochemical atlas of Cyprus (Cohen et al., 2011); ten highly characterizing elements were selected and used as predictors in the present study. For the other factors, the usual variables were used: temperature and aridity index for climate; total loss on ignition and vegetation and forestry type maps for organic matter; the DEM and related relief derivatives (slope, aspect, curvature, landscape units); bedrock, surficial geology and geomorphology (Noller, 2009) for parent material and age; and a sub-watershed map to better constrain location in relation to parent material sources. In the second step, the digital soil map is created using the Random Forests package in R. Random Forests is a decision-tree classification technique in which many trees, instead of a single one, are developed and compared to increase the stability and the reliability of the prediction. The model is trained and verified on areas where a published 1:25,000 soil map obtained from field work is available, and it is then applied for predictive mapping to the other areas. Preliminary results obtained in a small area of the plain around the city of Lefkosia, where eight different soil classes are present, show very good capabilities of the method. The Random Forest approach leads to reproduce soil
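
    As a schematic stand-in for the Random Forests step (the study works in R; scikit-learn is used here), the sketch below trains a forest on scorpan covariates sampled where soil classes are known and applies it to the full covariate stack; the covariate names, synthetic data and hyperparameters are placeholders.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      rng = np.random.default_rng(3)

      # Synthetic training table: one row per calibration pixel from the 1:25,000 mapped areas,
      # columns are scorpan covariates (climate, geochemistry, relief, geology, ...).
      n_train, n_covariates, n_classes = 2000, 12, 8
      X_train = rng.random((n_train, n_covariates))
      y_train = rng.integers(0, n_classes, n_train)          # known soil classes (placeholder)

      forest = RandomForestClassifier(n_estimators=500, oob_score=True, random_state=0)
      forest.fit(X_train, y_train)
      print("out-of-bag accuracy:", round(forest.oob_score_, 3))

      # Predict the soil class for every pixel of the covariate stack (flattened here)
      X_map = rng.random((50 * 50, n_covariates))
      predicted_soil_map = forest.predict(X_map).reshape(50, 50)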

  8. Delineating the Rattlesnake Springs, New Mexico Watershed Using Shallow Subsurface Geophysical Techniques and Geologic Mapping

    NASA Astrophysics Data System (ADS)

    Doser, D. I.; Langford, R. P.; Boykov, N. D.; Baker, M. R.; Kaip, G. M.

    2007-12-01

    Rattlesnake Springs serves as the sole water source for Carlsbad Caverns National Park. The recent development of oil and gas leases and agricultural lands surrounding the springs has led to concern about contamination of the karst aquifer. We have used geophysical techniques, combined with geologic mapping, to delineate possible fracture systems in the gypsum and carbonate bedrock that feed the spring system. Our initial work has focused on a 700 m by 700 m region surrounding the springs. We conducted a series of ground conductivity surveys with follow-up DC resistivity surveys (Wenner array vertical electrical soundings and a pole- pole survey) to determine variations in soil grain size and moisture content. Surface geologic mapping was used to identify a series of Holocene terraces and valleys that incise the terraces. Our combined results suggest that northwest-southeast and north-south trending fractures and dissolution features control regional water flow. Relict spring valleys are found to the west of the present springs. A pole-pole survey conducted around the perimeter of the springs suggests main water flow into the springs occurs from the northwest. We plan to complete a precision gravity survey in September and October 2007 to map bedrock topography and determine its relation to structural and dissolution features. Ground penetrating radar data will be collected on the northwestern side of the springs in an attempt to better delineate structures controlling inflow into the springs.

  9. Coupling technique of random amplified polymorphic DNA and nanoelectrochemical sensor for mapping pancreatic cancer genetic fingerprint.

    PubMed

    Liu, Qicai; Liu, Ailin; Gao, Feng; Weng, Shaohuang; Zhong, Guangxian; Liu, Jingfeng; Lin, Xinhua; Lin, Jian-hua; Chen, Xuhai

    2011-01-01

    To review the feasibility of coupling the techniques of random amplified polymorphic DNA (RAPD) with carbon nanotube-based modified electrode for guanine/deoxyguanine triphosphate (dGTP) electrochemical sensing for mapping of the pancreatic cancer genetic fingerprint and screening of genetic alterations. We developed a new method to study the electrochemical behavior of dGTP utilizing carbon multiwalled nanotube (MWNT)-modified glassy carbon electrodes (GCEs). RAPD was applied for amplification of DNA samples from healthy controls and patients with pancreatic cancer under the same conditions to determine the different surplus quantity of dGTP in the polymerase chain reaction (PCR), thereby determining the difference/quantity of PCR products or template strands. Using this method we generated a genetic fingerprint map of pancreatic cancer through the combination of electrochemical sensors and gel electrophoresis to screen for genetic alterations. Cloning and sequencing were then performed to verify these gene alterations. dGTP showed favorable electrochemical behavior on the MWNTs/GCE. The results indicated that the electrical signal and dGTP had a satisfactory linear relationship with the dGTP concentration within the conventional PCR concentration range. The MWNTs/GCE could distinguish between different products of RAPD. This experiment successfully identified a new pancreatic cancer-associated mutant gene fragment, consisting of a cyclin-dependent kinase 4 gene 3' terminal mutation. The coupling of RAPD and nanoelectrochemical sensors was successfully applied to the screening of genetic alterations in pancreatic cancer and for mapping of DNA fingerprints.

  10. Using GIS Mapping to Target Public Health Interventions: Examining Birth Outcomes Across GIS Techniques.

    PubMed

    MacQuillan, E L; Curtis, A B; Baker, K M; Paul, R; Back, Y O

    2017-08-01

    With advances in spatial analysis techniques, there has been a trend in recent public health research to assess the contribution of area-level factors to health disparity for a number of outcomes, including births. Although it is widely accepted that health disparity is best addressed by targeted, evidence-based and data-driven community efforts, and despite national and local focus in the U.S. to reduce infant mortality and improve maternal-child health, there is little work exploring how choice of scale and specific GIS visualization technique may alter the perception of analyses focused on health disparity in birth outcomes. Retrospective cohort study. Spatial analysis of individual-level vital records data for low birthweight and preterm births to black women from 2007 to 2012 in one mid-sized Midwest city using different geographic information systems (GIS) visualization techniques [geocoded address records were aggregated at two levels of scale and additionally mapped using kernel density estimation (KDE)]. GIS analyses in this study support our hypothesis that choice of geographic scale (neighborhood or census tract) for aggregated birth data can alter programmatic decision-making. Results indicate that the relative merits of aggregated visualization or the use of the KDE technique depend on the scale of intervention. The KDE map proved useful in targeting specific areas for interventions in cities with smaller populations and larger census tracts, where it allows for greater specificity in identifying intervention areas. When public health programmers seek to inform intervention placement in highly populated areas, however, aggregated data at the census tract level may be preferred, since it requires lower investments in terms of time and cartographic skill and, unlike neighborhoods, census tracts are standardized in that they become smaller as the population density of an area increases.

  11. 78 FR 69710 - Luminant Generation Company, LLC

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-20

    ... From the Federal Register Online via the Government Publishing Office NUCLEAR REGULATORY COMMISSION Luminant Generation Company, LLC AGENCY: U.S. Nuclear Regulatory Commission (NRC). ACTION... consecutive weeks of a combined license (COL) application from Luminant Generation Company, LLC....

  12. 78 FR 66785 - Luminant Generation Company, LLC

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-06

    ... From the Federal Register Online via the Government Publishing Office NUCLEAR REGULATORY COMMISSION Luminant Generation Company, LLC AGENCY: Nuclear Regulatory Commission. ACTION: Notice of receipt... consecutive weeks of a combined license (COL) application from Luminant Generation Company, LLC....

  13. 78 FR 70964 - Luminant Generation Company, LLC

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-27

    ... From the Federal Register Online via the Government Publishing Office NUCLEAR REGULATORY COMMISSION Luminant Generation Company, LLC AGENCY: Nuclear Regulatory Commission. ACTION: Combined license... for four consecutive weeks of a combined license (COL) application from Luminant Generation...

  14. 78 FR 68100 - Luminant Generation Company, LLC

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-13

    ... From the Federal Register Online via the Government Publishing Office NUCLEAR REGULATORY COMMISSION Luminant Generation Company, LLC AGENCY: U.S. Nuclear Regulatory Commission (NRC). ACTION... consecutive weeks of a combined license (COL) application from Luminant Generation Company, LLC....

  15. Mapping.

    ERIC Educational Resources Information Center

    Kinney, Douglas M.; McIntosh, Willard L.

    1979-01-01

    The area of geological mapping in the United States in 1978 increased greatly over that reported in 1977; state geological maps were added for California, Idaho, Nevada, and Alaska last year. (Author/BB)

  16. Creep deformation behavior in eutectic Sn-Ag solder joints using a novel mapping technique

    SciTech Connect

    Lucas, J.P.; Guo, F.; McDougall, J.; Bieler, T.R.; Subramanian, K.N.; Park, J.K.

    1999-11-01

    Creep deformation behavior was measured for 60–100 μm thick solder joints. The solder joints investigated consisted of: (1) non-composite solder joints made with eutectic Sn-Ag solder, and (2) composite solder joints with eutectic Sn-Ag solder containing 20 vol.% of 5 μm diameter in-situ Cu6Sn5 intermetallic reinforcements. All creep testing in this study was carried out at room temperature. Qualitative and quantitative assessment of creep deformation was characterized on the solder joints. Creep deformation was analyzed using a novel mapping technique where a geometrically regular line pattern was etched over the entire solder joint using excimer laser ablation. During creep, the laser-ablation (LA) pattern becomes distorted due to deformation in the solder joint. By imaging the distortion of laser-ablation patterns using the SEM, actual deformation mapping for the entire solder joint is revealed. The technique involves sequential optical/digital imaging of the deformation versus time history during creep. By tracing and recording the deformation of the LA patterns on the solder over intervals of time, local creep data are obtained in many locations in the joint. This analysis enables global and localized creep shear strains and strain rates to be determined.

  17. A novel multifrequency technique for the detection of point sources in cosmic microwave background maps

    NASA Astrophysics Data System (ADS)

    Herranz, D.; López-Caniego, M.; Sanz, J. L.; González-Nuevo, J.

    2009-03-01

    In this work we address the problem of simultaneous multifrequency detection of extragalactic point sources in the maps of the cosmic microwave background. We apply a new linear filtering technique, the `matched matrix filters', that incorporates full spatial information, including the cross-correlation among channels, without making any a priori assumption about the spectral behaviour of the sources. A substantial reduction of the background is achieved thanks to the optimal combination of filtered maps. We describe the new technique in detail and apply it to the detection of radio sources and estimation of their parameters in realistic all-sky Planck simulations at 30, 44, 70 and 100 GHz. Then, we compare the results with the single-frequency approach based on the standard matched filter, in terms of reliability, completeness and flux accuracy of the resulting point source catalogues. The new filters outperform the standard matched filters for all these indexes at 30, 44 and 70 GHz, whereas at 100 GHz both kinds of filters have a similar performance. We find a notable increment of the number of true detections for a fixed reliability level. In particular, for a 95 per cent reliability we practically double the number of detections at 30, 44 and 70 GHz.
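
    For context, the single-frequency matched filter that the new matrix filters are compared against can be written in a few lines: the map is filtered in Fourier space with the source profile weighted by the inverse background power spectrum. The flat-sky, white-noise toy below is an illustration, not the Planck pipeline.

      import numpy as np

      def matched_filter(map2d, profile2d, noise_power):
          """Single-frequency matched filter on a flat-sky map (white background noise assumed).

          map2d       : observed map containing sources plus background
          profile2d   : beam/source profile centred in the map (same shape)
          noise_power : constant background power spectrum (white-noise toy case)
          """
          tau = np.fft.fft2(np.fft.ifftshift(profile2d))
          psi = np.conj(tau) / noise_power
          psi /= np.sum(np.abs(tau) ** 2 / noise_power) / map2d.size   # normalise to unit gain
          return np.real(np.fft.ifft2(np.fft.fft2(map2d) * psi))

      # Toy example: a Gaussian source of amplitude 5 buried in unit-variance noise
      n = 128
      yy, xx = np.mgrid[:n, :n] - n // 2
      profile = np.exp(-(xx**2 + yy**2) / (2 * 3.0**2))
      rng = np.random.default_rng(0)
      sky = 5.0 * np.roll(np.roll(profile, 20, 0), -15, 1) + rng.standard_normal((n, n))
      filtered = matched_filter(sky, profile, noise_power=1.0)
      print(np.unravel_index(np.argmax(filtered), filtered.shape))   # peak near the source position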

  18. Computer controlled techniques for high emission density mapping of thermionic cathodes

    NASA Astrophysics Data System (ADS)

    Gibson, J. W.; Thomas, R. E.

    1985-12-01

    Some of the techniques commonly used (e.g. SLEEP and thermionic emission microscope) for measuring emission or work function uniformity of thermionic cathode surfaces require the use of very low or near zero current densities, thus the cathode is characterized at current densities and temperatures much lower than that of a normally operating cathode. The system reported on here uses a high voltage pulse technique and is capable of measuring emission densities in the range 1 to 80 A/cm² at normal cathode operating temperatures. The cathode surface is scanned with an anode having a 0.025 mm aperture whose position is controlled by computer operated stepping motors. The current through the aperture to a collector electrode is measured using a sample-and-hold amplifier. Pulsing and sampling are computer synchronized with the scanning, and data for each pulse are accumulated and can be processed and displayed in several ways using the computer, including a detailed "three-dimensional" map of either the electron emission density or work function variations. The entire surface of the cathode or any portion of it can be mapped in steps as small as 0.001 mm (1 μm), but typically steps of 5-100 μm were used. Measurements are presented illustrating the uniformity or nonuniformity of the electron emission densities and work functions for type-B and type-M cathodes.

  19. Fingerprint detection and mapping using a phase shifted coherent gradient sensing technique.

    PubMed

    Dhanotia, Jitendra; Prakash, Satya; Bhatia, Vimal; Prakash, Shashi

    2016-07-10

    In this paper, a full field technique for mapping a latent fingerprint using a coherent gradient sensing (CGS) sensor is proposed. Collimated light from an He-Ne laser illuminates a specimen comprising a fingerprint implanted onto a reflecting surface. Reflected light from the specimen is analyzed using the CGS sensor comprising a pair of gratings. Reflected light carries information regarding the depth and orientation of furrows and ridges in the fingerprint. The topological information of the fingerprint is retrieved using four-step phase shifting interferometry. Well-defined 2D and 3D phase plots have been reconstructed to map the topography of the human fingerprint. The recorded slope data reconstructs the information regarding the separation and depth of the ridges in the latent fingerprint. The proposed technique is noninvasive and full field and does not require any kind of chemical or physical treatment. The sensor is very simple, yields interferometric sensitivity, and has the advantages of easy alignment, compactness, and low cost.
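
    The four-step phase-shifting reconstruction has a standard closed form: with frames I1..I4 recorded at phase shifts of 0, pi/2, pi and 3*pi/2, the wrapped phase is atan2(I4 - I2, I1 - I3). A minimal numpy check on synthetic fringes (unwrapping omitted) follows; the fringe pattern is illustrative only.

      import numpy as np

      def four_step_phase(i1, i2, i3, i4):
          """Wrapped phase from four frames taken at phase shifts 0, pi/2, pi, 3*pi/2."""
          return np.arctan2(i4 - i2, i1 - i3)

      # Synthetic check: build four fringe patterns from a known phase map and recover it
      x = np.linspace(0, 4 * np.pi, 256)
      true_phase = np.outer(np.sin(np.linspace(0, np.pi, 256)), x) * 0.2
      frames = [1.0 + 0.5 * np.cos(true_phase + k * np.pi / 2) for k in range(4)]
      wrapped = four_step_phase(*frames)
      print(np.allclose(wrapped, np.angle(np.exp(1j * true_phase))))   # equal up to 2*pi wrapping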

  20. Electromechanical probe and automated indentation maps are sensitive techniques in assessing early degenerated human articular cartilage.

    PubMed

    Sim, Sotcheadt; Chevrier, Anik; Garon, Martin; Quenneville, Eric; Lavigne, Patrick; Yaroshinsky, Alex; Hoemann, Caroline D; Buschmann, Michael D

    2016-06-09

    Recent advances in the development of new drugs to halt or even reverse the progression of Osteoarthritis at an early-stage requires new tools to detect early degeneration of articular cartilage. We investigated the ability of an electromechanical probe and an automated indentation technique to characterize entire human articular surfaces for rapid non-destructive discrimination between early degenerated and healthy articular cartilage. Human cadaveric asymptomatic articular surfaces (4 pairs of distal femurs and 4 pairs of tibial plateaus) were used. They were assessed ex vivo: macroscopically, electromechanically (maps of the electromechanical quantitative parameter, QP, reflecting streaming potentials), mechanically (maps of the instantaneous modulus, IM) and through cartilage thickness. Osteochondral cores were also harvested from healthy and degenerated regions for histological assessment, biochemical analyses and unconfined compression tests. The macroscopic visual assessment delimited three distinct regions on each articular surface: region I was macroscopically degenerated, region II was macroscopically normal but adjacent to region I and region III was the remaining normal articular surface. Thus, each extracted core was assigned to one of the three regions. A mixed effect model revealed that only the QP (p < 0.0001) and IM (p < 0.0001) were able to statistically discriminate the three regions. Effect size was higher for QP and IM than other assessments, indicating greater sensitivity to distinguish early degeneration of cartilage. When considering the mapping feature of the QP and IM techniques, it also revealed bilateral symmetry in a moderately similar distribution pattern between bilateral joints. This article is protected by copyright. All rights reserved.

  1. Mapping aboveground woody biomass using forest inventory, remote sensing and geostatistical techniques.

    PubMed

    Yadav, Bechu K V; Nandy, S

    2015-05-01

    Mapping forest biomass is fundamental for estimating CO₂ emissions and for planning and monitoring of forests and ecosystem productivity. The present study attempted to map aboveground woody biomass (AGWB) by integrating forest inventory, remote sensing and geostatistical techniques, viz., direct radiometric relationships (DRR), k-nearest neighbours (k-NN) and cokriging (CoK), and to evaluate their accuracy. A part of the Timli Forest Range of Kalsi Soil and Water Conservation Division, Uttarakhand, India was selected for the present study. Stratified random sampling was used to collect biophysical data from 36 sample plots of 0.1 ha (31.62 m × 31.62 m) size. Species-specific volumetric equations were used for calculating volume, which was multiplied by specific gravity to get biomass. Three forest-type density classes, viz. 10-40, 40-70 and >70% of Shorea robusta forest, and four non-forest classes were delineated using on-screen visual interpretation of IRS P6 LISS-III data of December 2012. The volume in the different forest-type density strata ranged from 189.84 to 484.36 m³ ha⁻¹. The total growing stock of the forest was found to be 2,024,652.88 m³. The AGWB ranged from 143 to 421 Mg ha⁻¹. Spectral bands and vegetation indices were used as independent variables and biomass as the dependent variable for DRR, k-NN and CoK. After validation and comparison, the k-NN method with Mahalanobis distance (root mean square error (RMSE) = 42.25 Mg ha⁻¹) was found to be the best method, followed by fuzzy distance and Euclidean distance with RMSE of 44.23 and 45.13 Mg ha⁻¹ respectively. DRR was found to be the least accurate method, with an RMSE of 67.17 Mg ha⁻¹. The study highlighted the potential of integrating forest inventory, remote sensing and geostatistical techniques for forest biomass mapping.
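
    As a schematic sketch of the k-NN imputation step (plain Euclidean k-NN from scikit-learn is used here for simplicity; the study found a Mahalanobis-distance variant most accurate), plot-level AGWB is predicted for unsampled pixels from their nearest neighbours in predictor space; the synthetic plots, predictors and k are placeholders.

      import numpy as np
      from sklearn.neighbors import KNeighborsRegressor

      rng = np.random.default_rng(5)

      # 36 field plots: predictor features (spectral bands + vegetation indices) and measured AGWB
      X_plots = rng.random((36, 6))
      agwb_plots = 143.0 + 278.0 * X_plots[:, 0] + 10.0 * rng.standard_normal(36)   # Mg/ha, synthetic

      # Euclidean k-NN regression; the study's best variant used a Mahalanobis distance instead
      knn = KNeighborsRegressor(n_neighbors=5)
      knn.fit(X_plots, agwb_plots)

      # Predict AGWB for every image pixel (flattened stack of the same 6 predictors)
      X_pixels = rng.random((100 * 100, 6))
      agwb_map = knn.predict(X_pixels).reshape(100, 100)
      print(round(agwb_map.min(), 1), round(agwb_map.max(), 1))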

  2. Calibration of imaging luminance measuring devices (ILMD)

    NASA Astrophysics Data System (ADS)

    Liu, Liying; Zheng, Feng; Zhu, Lingxi; Li, Ye; Huan, Kewei; Shi, Xiaoguang

    2015-11-01

    A method for the calibration of imaging luminance measuring devices has been studied. Through a device-independent color space transformation, the color image captured by the digital camera is converted to the CIE Lab color space. A calibration model is then fitted between ln(L/t) and luminance. Finally, a luminance image is obtained, and the dynamic range of the luminance image can be adjusted via the shutter speed.
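
    A minimal sketch of the kind of calibration fit described, assuming L is the lightness of a reference patch and t the exposure time, and using a log-log polynomial purely for numerical conditioning, could look like the following (all values are synthetic, not measurements from the paper).

      import numpy as np

      # Calibration pairs: lightness L of a reference patch, exposure time t (s), and the
      # luminance measured with a reference luminance meter (cd/m^2). Values are synthetic.
      L_patch = np.array([22.0, 35.0, 48.0, 61.0, 74.0, 88.0])
      t_exp   = np.array([1/125, 1/125, 1/250, 1/250, 1/500, 1/500])
      lum_ref = np.array([40.0, 95.0, 310.0, 620.0, 1700.0, 3900.0])

      # Fit luminance as a low-order polynomial in x = ln(L / t)
      x = np.log(L_patch / t_exp)
      coeffs = np.polyfit(x, np.log(lum_ref), deg=1)

      def pixel_to_luminance(L_value, t_value):
          """Estimate luminance (cd/m^2) for a pixel lightness and exposure time."""
          return np.exp(np.polyval(coeffs, np.log(L_value / t_value)))

      print(round(pixel_to_luminance(55.0, 1/250), 1))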

  3. The Development of Luminance- and Texture-Defined Form Perception during the School-Aged Years

    ERIC Educational Resources Information Center

    Bertone, Armando; Hanck, Julie; Guy, Jacalyn; Cornish, Kim

    2010-01-01

    The objective of the present study was to assess the development of luminance- and texture-defined static form perception in school-aged children. This was done using an adapted Landolt-C technique where C-optotypes were defined by either luminance or texture information, the latter necessitating extra-striate neural processing to be perceived.…

  4. A Hybrid Technique for Thickness-Map Visualization of the Hip Cartilages in MRI

    NASA Astrophysics Data System (ADS)

    Khanmohammadi, Mahdieh; Zoroofi, Reza Aghaiezadeh; Nishii, Takashi; Tanaka, Hisashi; Sato, Yoshinobu

    Quantification of the hip cartilages is clinically important. In this study, we propose an automatic technique for segmentation and visualization of the acetabular and femoral head cartilages based on clinically obtained multi-slice T1-weighted MR data and a hybrid approach. We follow a knowledge-based approach by employing several features such as the anatomical shapes of the hip femoral and acetabular cartilages and the corresponding image intensities. We estimate the center of the femoral head by a Hough transform and then automatically select the volume of interest. We then automatically segment the hip bones by a self-adaptive vector quantization technique. Next, we localize the articular central line by a modified Canny edge detector based on first and second derivative filters along radial lines originating from the femoral head center, together with an anatomical constraint. We then roughly segment the acetabular and femoral head cartilages using the derivative images obtained in the previous step and a top-hat filter. Final masks of the acetabular and femoral head cartilages are generated automatically by employing the rough results, the estimated articular central line and the anatomical knowledge. Next, we generate a thickness map for each cartilage in the radial direction based on a Euclidean distance. Three-dimensional pelvic bones, acetabular and femoral cartilages and the corresponding thicknesses are overlaid and visualized. The techniques have been implemented in C++ and the MATLAB environment. We have evaluated and clarified the usefulness of the proposed techniques on 40 clinical multi-slice hip MR data sets.
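
    As an illustration of the first step only, the OpenCV sketch below locates a femoral-head-like disc in a synthetic slice with a Hough circle transform; the image and all parameter values are assumptions rather than settings from the paper.

      import numpy as np
      import cv2

      # Synthetic stand-in for one T1-weighted slice: a bright disc as the femoral head.
      slice_img = np.zeros((256, 256), np.uint8)
      cv2.circle(slice_img, center=(140, 120), radius=45, color=180, thickness=-1)
      slice_img = cv2.GaussianBlur(slice_img, (9, 9), 0)

      # Hough circle transform to estimate the femoral head centre and radius.
      circles = cv2.HoughCircles(slice_img, cv2.HOUGH_GRADIENT, dp=1, minDist=100,
                                 param1=80, param2=20, minRadius=30, maxRadius=80)
      if circles is not None:
          cx, cy, r = circles[0][0]
          print(f"femoral head centre ~ ({cx:.0f}, {cy:.0f}), radius ~ {r:.0f} px")
          # A volume of interest around (cx, cy) would then be cropped for the later steps.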

  5. Use of Technology-Assisted Techniques of Mind Mapping and Concept Mapping in Science Education: A Constructivist Study

    ERIC Educational Resources Information Center

    Balim, Ali Günay

    2013-01-01

    The study aims to investigate the effects of using mind maps and concept maps on students' learning of concepts in science courses. A total of 51 students participated in this study which used a quasi-experimental research design with pre-test/post-test control groups. The constructivist-inspired study was carried out in the sixth-grade science…

  7. Optical mapping of site-directed cleavages on single DNA molecules by the RecA-assisted restriction endonuclease technique.

    PubMed Central

    Wang, Y K; Huff, E J; Schwartz, D C

    1995-01-01

    Fluorescence in situ hybridization (FISH) resolution has advanced because newer techniques use increasingly decondensed chromatin. FISH cannot analyze restriction enzyme cutting sites due to limitations of the hybridization and detection technologies. The RecA-assisted restriction endonuclease (RARE) technique cleaves chromosomal DNA at a single EcoRI site within a given gene or selected sequence. We recently described a mapping technique, optical mapping, which uses fluorescence microscopy to produce high-resolution restriction maps rapidly by directly imaging restriction digestion cleavage events occurring on single deproteinized DNA molecules. Ordered maps are then constructed by noting fragment order and size, using several optically based techniques. Since we also wanted to map arbitrary sequences and gene locations, we combined RARE with optical mapping to produce site-specific visible EcoRI restriction cleavage sites on single DNA molecules. Here we describe this combined method, named optical RARE, and its initial application to mapping gene locations on yeast chromosomes. PMID:7816810

  8. MRI technique for the snapshot imaging of quantitative velocity maps using RARE

    NASA Astrophysics Data System (ADS)

    Shiko, G.; Sederman, A. J.; Gladden, L. F.

    2012-03-01

    A quantitative PGSE-RARE pulse sequence was developed and successfully applied to the in situ dissolution of two pharmaceutical formulations dissolving over a range of timescales. The new technique was chosen over other existing fast velocity imaging techniques because it is T2 weighted, not T2* weighted, and is, therefore, robust for imaging time-varying interfaces and flow in magnetically heterogeneous systems. The complex signal was preserved intact by separating odd and even echoes to obtain two phase maps which are then averaged in post-processing. Initially, the validity of the technique was shown when imaging laminar flow in a pipe. Subsequently, the dissolution of two drugs was followed in situ, where the technique enables the imaging and quantification of changes in the form of the tablet and the flow field surrounding it at high spatial and temporal resolution. First, the complete 3D velocity field around an eroding salicylic acid tablet was acquired at a resolution of 98 × 49 μm², within 20 min, and monitored over ~13 h. The tablet was observed to experience a heterogeneous flow field and, hence a heterogeneous shear field, which resulted in the non-symmetric erosion of the tablet. Second, the dissolution of a fast dissolving immediate release tablet was followed using one-shot 2D velocity images acquired every 5.2 s at a resolution of 390 × 390 μm². The quantitative nature of the technique and fast acquisition times provided invaluable information on the dissolution behaviour of this tablet, which had not been attainable previously with conventional quantitative MRI techniques.

  9. Rethinking Program Assessment through the Use of Program Alignment Mapping Technique

    ERIC Educational Resources Information Center

    Liu, Min; Wrobbel, Duff; Blankson, Isaac

    2010-01-01

    Curriculum mapping is a well-known assessment tool used to articulate and align a curriculum. The authors present an improved method of curriculum alignment that combines the traditional curriculum mapping with what is often called prerequisite mapping. This improved method of curriculum alignment mapping we label Program Alignment Mapping (PAM)…

  10. Chromospheres of Luminous Cool Stars

    NASA Astrophysics Data System (ADS)

    Dupree, Andrea K.; Avrett, Eugene

    2015-08-01

    Ultraviolet imaging of Alpha Orionis (Betelgeuse) reveals a complex, variable chromospheric structure. Such atmospheres in luminous cool stars can affect features in the optical spectrum. Constructing semi-empirical model atmospheres of luminous stars, including the temperature rise due to a chromosphere, allows us to predict potential effects on optical transitions. The radiative transfer code PANDORA calculates line strengths in an LTE or non-LTE formulation with spherical symmetry, and includes velocity fields when present. Various aspects of the line calculations and their impact on equivalent widths will be discussed, including the development of appropriate chromospheric models, comparison to a pure radiative equilibrium model, transitions sensitive to non-LTE effects, and the effects of a realistic spherical non-LTE approximation as compared to a plane-parallel approximation. We discuss the extent to which a chromosphere can impact the determination of stellar abundances.

  11. Molecular Imaging of Tumors Using a Quantitative T1 Mapping Technique via Magnetic Resonance Imaging

    PubMed Central

    Herrmann, Kelsey; Johansen, Mette L.; Craig, Sonya E.; Vincent, Jason; Howell, Michael; Gao, Ying; Lu, Lan; Erokwu, Bernadette; Agnes, Richard S.; Lu, Zheng-Rong; Pokorski, Jonathan K.; Basilion, James; Gulani, Vikas; Griswold, Mark; Flask, Chris; Brady-Kalnay, Susann M.

    2015-01-01

    Magnetic resonance imaging (MRI) of glioblastoma multiforme (GBM) with molecular imaging agents would allow for the specific localization of brain tumors. Prior studies using T1-weighted MR imaging demonstrated that the SBK2-Tris-(Gd-DOTA)3 molecular imaging agent labeled heterotopic xenograft models of brain tumors more intensely than non-specific contrast agents using conventional T1-weighted imaging techniques. In this study, we used a dynamic quantitative T1 mapping strategy to more objectively compare intra-tumoral retention of the SBK2-Tris-(Gd-DOTA)3 agent over time in comparison to non-targeted control agents. Our results demonstrate that the targeted SBK2-Tris-(Gd-DOTA)3 agent, a scrambled-Tris-(Gd-DOTA)3 control agent, and the non-specific clinical contrast agent Optimark™ all enhanced flank tumors of human glioma cells with similar maximal changes on T1 mapping. However, the retention of the agents differs. The non-specific agents show significant recovery within 20 min by an increase in T1 while the specific agent SBK2-Tris-(Gd-DOTA)3 is retained in the tumors and shows little recovery over 60 min. The retention effect is demonstrated by percent change in T1 values and slope calculations as well as by calculations of gadolinium concentration in tumor compared to muscle. Quantitative T1 mapping demonstrates the superior binding and retention in tumors of the SBK2-Tris-(Gd-DOTA)3 agent over time compared to the non-specific contrast agent currently in clinical use. PMID:26435847

  12. Probabilistic flood inundation mapping of ungauged rivers: Linking GIS techniques and frequency analysis

    NASA Astrophysics Data System (ADS)

    Sarhadi, Ali; Soltani, Saeed; Modarres, Reza

    2012-08-01

    This study presents an exhaustive methodology for floodplain mapping of ungauged rivers. To present our methodology, we selected the Halilrud basin and Jiroft city in southeastern Iran as an example of hazardous regions. To estimate flood quantiles in different return periods at ungauged reaches, we used regional flood frequency analysis. By using the well-known L-moments approach and related criteria, a homogeneous region was formed and the three-parameter log-normal distribution was identified as the robust regional frequency distribution. The hydro-geomorphic characteristics and the land use properties of the catchments were then extracted using RS&GIS techniques to establish multivariate regional regression models between hydro-geomorphic characteristics and flood quantiles. After delineation of the catchments for the ungauged reaches, flood quantiles, an important factor in flood inundation, were estimated at the outlets of these reaches for different probabilities using the regional regression models. To delineate flood hazard maps and to enhance the accuracy of the hydraulic modeling, we applied satellite stereoscopic images of Cartosat-1 along with the Rational Polynomial Coefficients to extract a high resolution DTM and detailed parameterization of the channel required by the 1D hydraulic model HEC-RAS. The GIS-based HEC-GeoRAS pre- and post-processors were also used to carefully optimize the geometry features for realistic visualization of the flood-prone areas. Information on some historical flood events was also used to evaluate the hydraulic model performance in predicting flood inundations. Finally, vulnerable areas were crossed with extracted land use mapping from IRS-P6 satellite images to differentiate the critical infrastructures and the valuable land use classes affected by floods in different return periods.

  13. An Improved Map-Matching Technique Based on the Fréchet Distance Approach for Pedestrian Navigation Services

    PubMed Central

    Bang, Yoonsik; Kim, Jiyoung; Yu, Kiyun

    2016-01-01

    Wearable and smartphone technology innovations have propelled the growth of Pedestrian Navigation Services (PNS). PNS need a map-matching process to project a user’s locations onto maps. Many map-matching techniques have been developed for vehicle navigation services. These techniques are inappropriate for PNS because pedestrians move, stop, and turn in different ways compared to vehicles. In addition, the base map data for pedestrians are more complicated than for vehicles. This article proposes a new map-matching method for locating Global Positioning System (GPS) trajectories of pedestrians onto road network datasets. The theory underlying this approach is based on the Fréchet distance, one of the measures of geometric similarity between two curves. The Fréchet distance approach can provide reasonable matching results because two linear trajectories are parameterized with the time variable. Then we improved the method to be adaptive to the positional error of the GPS signal. We used an adaptation coefficient to adjust the search range for every input signal, based on the assumption of auto-correlation between consecutive GPS points. To reduce errors in matching, the reliability index was evaluated in real time for each match. To test the proposed map-matching method, we applied it to GPS trajectories of pedestrians and the road network data. We then assessed the performance by comparing the results with reference datasets. Our proposed method performed better with test data when compared to a conventional map-matching technique for vehicles. PMID:27782091
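
    A minimal illustration of the measure underlying this approach is the discrete Fréchet distance between two polylines, computable with a simple dynamic program. The sketch below is not the authors' implementation; the coordinates, segment names, and matching rule are hypothetical, and the adaptive search range and reliability index described above are omitted.

```python
# Minimal sketch of the discrete Frechet distance between two polylines,
# the similarity measure underlying Frechet-based map matching.
# Illustrative only; not the authors' implementation.
from math import hypot

def discrete_frechet(P, Q):
    """P, Q: lists of (x, y) points. Returns the discrete Frechet distance."""
    n, m = len(P), len(Q)
    d = lambda i, j: hypot(P[i][0] - Q[j][0], P[i][1] - Q[j][1])
    ca = [[-1.0] * m for _ in range(n)]
    ca[0][0] = d(0, 0)
    for i in range(1, n):                       # first column
        ca[i][0] = max(ca[i - 1][0], d(i, 0))
    for j in range(1, m):                       # first row
        ca[0][j] = max(ca[0][j - 1], d(0, j))
    for i in range(1, n):                       # fill the coupling table
        for j in range(1, m):
            ca[i][j] = max(min(ca[i - 1][j], ca[i - 1][j - 1], ca[i][j - 1]),
                           d(i, j))
    return ca[n - 1][m - 1]

# A GPS trajectory is matched to the candidate road segment with the
# smallest Frechet distance (hypothetical coordinates).
gps = [(0, 0), (1, 0.2), (2, -0.1), (3, 0.1)]
roads = {"segment_A": [(0, 0), (1, 0), (2, 0), (3, 0)],
         "segment_B": [(0, 1), (1, 1), (2, 1), (3, 1)]}
best = min(roads, key=lambda k: discrete_frechet(gps, roads[k]))
print(best)  # -> segment_A
```

    Matching the trajectory to the segment with the smallest distance captures only the basic idea; the method described above additionally parameterizes both curves in time and adapts the search range to the GPS positional error.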

  14. Performance prediction of four-contact vertical Hall-devices using a conformal mapping technique

    NASA Astrophysics Data System (ADS)

    Yang, Huang; Yue, Xu; Yufeng, Guo

    2015-12-01

    Instead of the conventional design with five contacts in the sensor active area, innovative vertical Hall devices (VHDs) with four contacts and six contacts are asymmetrical in structural design but symmetrical in current flow, which makes them well suited to the spinning-current technique for offset elimination. In this article, a conformal mapping calculation method is used to predict the performance of an asymmetrical VHD with four contacts embedded in a deep n-well. Furthermore, to make the calculation more accurate, the junction field effect is also incorporated into the conformal mapping method. The error between calculated and simulated results is less than 5% for the current-related sensitivity, and approximately 13% for the voltage-related sensitivity. This shows that such calculations can be used to predict the optimal structure of vertical Hall devices. Project supported by the Natural Science Foundation of Jiangsu Province, China (Nos. BK20131379, BK20141431) and the Graduate Research and Innovation Projects of Jiangsu Province (No. SJLX_0373).

  15. Inventory and mapping of flood inundation using interactive digital image analysis techniques

    USGS Publications Warehouse

    Rohde, Wayne G.; Nelson, Charles A.; Taranik, J.V.

    1979-01-01

    LANDSAT digital data and color-infrared photographs were used in a multiphase sampling scheme to estimate the area of agricultural land affected by a flood. The LANDSAT data were classified with a maximum likelihood algorithm. Stratification of the LANDSAT data, prior to classification, greatly reduced misclassification errors. The classification results were used to prepare a map overlay showing the areal extent of flooding. These data also provided statistics required to estimate sample size in a two-phase sampling scheme, and provided quick, accurate estimates of areas flooded for the first phase. The measurements made in the second phase, based on ground data and photo-interpretation, were used with two-phase sampling statistics to estimate the area of agricultural land affected by flooding. These results show that LANDSAT digital data can be used to prepare map overlays showing the extent of flooding on agricultural land and, with two-phase sampling procedures, can provide acreage estimates with sampling errors of about 5 percent. This procedure provides a technique for rapidly assessing the areal extent of flood conditions on agricultural land and would provide a basis for designing a sampling framework to estimate the impact of flooding on crop production.
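
    The two-phase estimate can be illustrated with a simple ratio estimator: a large first-phase sample carries only the Landsat classification, and a small second-phase subsample adds photo-interpreted or ground reference values. The sketch below uses made-up numbers and is a generic double-sampling example, not the study's statistics.

```python
# Minimal sketch of a two-phase (double) sampling ratio estimator, with
# hypothetical numbers; not the original study's statistics.
import numpy as np

# Phase 1: large sample of units classified from Landsat only
# (fraction of each unit mapped as flooded).
x_phase1 = np.array([0.42, 0.30, 0.55, 0.61, 0.28, 0.47, 0.50, 0.38])

# Phase 2: small subsample with both the Landsat fraction (x) and the
# reference fraction from photo-interpretation / ground visits (y).
x_phase2 = np.array([0.42, 0.55, 0.28])
y_phase2 = np.array([0.45, 0.50, 0.31])

# Ratio estimator: correct the cheap phase-1 mean by the phase-2 ratio y/x.
ratio = y_phase2.mean() / x_phase2.mean()
flooded_fraction = ratio * x_phase1.mean()

total_cropland_ha = 120_000            # hypothetical region size
print(f"estimated flooded area: {flooded_fraction * total_cropland_ha:,.0f} ha")
```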

  16. Mapping the temporal pole with a specialized electrode array: technique and preliminary results

    PubMed Central

    Abel, Taylor J.; Rhone, Ariane E.; Nourski, Kirill V.; Granner, Mark A.; Oya, Hiroyuki; Griffiths, Timothy D.; Tranel, Daniel T.; Kawasaki, Hiroto; Howard, Matthew A.

    2014-01-01

    Temporopolar cortex plays a crucial role in the pathogenesis of temporal lobe epilepsy and subserves important cognitive functions. Because of its shape and position in the middle cranial fossa, complete electrode coverage of the temporal pole (TP) is difficult to achieve using existing devices. We designed a novel TP electrode array that conforms to the surface of temporopolar cortex and achieves dense electrode coverage of this important brain region. A multi-pronged electrode array was designed that can be placed over the surface of the TP using a straightforward insertion technique. Twelve patients with medically intractable epilepsy were implanted with the TP electrode array for purposes of seizure localization. Select patients underwent cognitive mapping by electrocorticographic (ECoG) recording from the TP during a naming task. Use of the array resulted in excellent TP electrode coverage in all patients. High quality ECoG data were consistently obtained for purposes of delineating seizure activity and functional mapping. During a naming task, significant increases in ECoG power were observed within localized subregions of the TP. One patient developed a transient neurological deficit thought to be related to the mass effect of multiple intracranial recording arrays, including the TP array. This deficit resolved following removal of all electrodes. The TP electrode array overcomes limitations of existing devices and enables clinicians and researchers to obtain optimal multi-site recordings from this important brain region. PMID:24480831

  17. Quantitative Mapping of Pore Fraction Variations in Silicon Nitride Using an Ultrasonic Contact Scan Technique

    NASA Technical Reports Server (NTRS)

    Roth, Don J.; Kiser, James D.; Swickard, Suzanne M.; Szatmary, Steven A.; Kerwin, David P.

    1993-01-01

    An ultrasonic scan procedure using the pulse-echo contact configuration was employed to obtain maps of pore fraction variations in sintered silicon nitride samples in terms of ultrasonic material properties. Ultrasonic velocity, attenuation coefficient, and reflection coefficient images were obtained simultaneously over a broad band of frequencies (e.g., 30 to 110 MHz) by using spectroscopic analysis. Liquid and membrane (dry) coupling techniques and longitudinal and shear-wave energies were used. The major results include the following: Ultrasonic velocity (longitudinal and shear wave) images revealed and correlated with the extent of average through-thickness pore fraction variations in the silicon nitride disks. Attenuation coefficient images revealed pore fraction nonuniformity due to the scattering that occurred at boundaries between regions of high and low pore fraction. Velocity and attenuation coefficient images were each nearly identical for machined and polished disks, making the method readily applicable to machined materials. Velocity images were similar for wet and membrane coupling. Maps of apparent Poisson's ratio constructed from longitudinal and shear-wave velocities quantified Poisson's ratio variations across a silicon nitride disk. Thermal wave images of a disk indicated transient thermal behavior variations that correlated with observed variations in pore fraction and velocity and attenuation coefficients.

  18. Supersonic shear imaging: a new technique for soft tissue elasticity mapping.

    PubMed

    Bercoff, Jérémy; Tanter, Mickaël; Fink, Mathias

    2004-04-01

    Supersonic shear imaging (SSI) is a new ultrasound-based technique for real-time visualization of soft tissue viscoelastic properties. Using ultrasonic focused beams, it is possible to remotely generate mechanical vibration sources radiating low-frequency, shear waves inside tissues. Relying on this concept, SSI proposes to create such a source and make it move at a supersonic speed. In analogy with the "sonic boom" created by a supersonic aircraft, the resulting shear waves will interfere constructively along a Mach cone, creating two intense plane shear waves. These waves propagate through the medium and are progressively distorted by tissue heterogeneities. An ultrafast scanner prototype is able to both generate this supersonic source and image (5000 frames/s) the propagation of the resulting shear waves. Using inversion algorithms, the shear elasticity of the medium can be mapped quantitatively from this propagation movie. SSI enables tissue elasticity mapping in less than 20 ms, even in a strongly viscous medium like the breast. Modalities such as shear compounding are implementable by tilting shear waves in different directions, improving the elasticity estimation. Results validating SSI in heterogeneous phantoms are presented. The first in vivo investigations made on healthy volunteers emphasize the potential clinical applicability of SSI for breast cancer detection.
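
    The quantitative step from a shear-wave propagation movie to an elasticity map rests on the standard relation between shear-wave speed and shear modulus; the equations below are a generic summary of that relation under the usual soft-tissue assumptions, not material taken from the paper.

```latex
% Generic relation (not from the paper): shear-wave speed c_s gives the
% shear modulus \mu and, in the nearly incompressible soft-tissue limit,
% the Young's modulus E.
\[
  c_s = \sqrt{\mu/\rho}
  \quad\Longrightarrow\quad
  \mu = \rho\, c_s^{2},
  \qquad
  E \approx 3\mu \quad (\nu \to 0.5).
\]
% Example: with \rho \approx 1000\ \mathrm{kg\,m^{-3}} and
% c_s = 3\ \mathrm{m\,s^{-1}}, one obtains \mu \approx 9\ \mathrm{kPa}.
```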

  19. Mapping to Irregular Torus Topologies and Other Techniques for Petascale Biomolecular Simulation

    PubMed Central

    Phillips, James C.; Sun, Yanhua; Jain, Nikhil; Bohm, Eric J.; Kalé, Laxmikant V.

    2014-01-01

    Currently deployed petascale supercomputers typically use toroidal network topologies in three or more dimensions. While these networks perform well for topology-agnostic codes on a few thousand nodes, leadership machines with 20,000 nodes require topology awareness to avoid network contention for communication-intensive codes. Topology adaptation is complicated by irregular node allocation shapes and holes due to dedicated input/output nodes or hardware failure. In the context of the popular molecular dynamics program NAMD, we present methods for mapping a periodic 3-D grid of fixed-size spatial decomposition domains to 3-D Cray Gemini and 5-D IBM Blue Gene/Q toroidal networks to enable hundred-million atom full machine simulations, and to similarly partition node allocations into compact domains for smaller simulations using multiple-copy algorithms. Additional enabling techniques are discussed and performance is reported for NCSA Blue Waters, ORNL Titan, ANL Mira, TACC Stampede, and NERSC Edison. PMID:25594075

  20. Mapping to Irregular Torus Topologies and Other Techniques for Petascale Biomolecular Simulation.

    PubMed

    Phillips, James C; Sun, Yanhua; Jain, Nikhil; Bohm, Eric J; Kalé, Laxmikant V

    Currently deployed petascale supercomputers typically use toroidal network topologies in three or more dimensions. While these networks perform well for topology-agnostic codes on a few thousand nodes, leadership machines with 20,000 nodes require topology awareness to avoid network contention for communication-intensive codes. Topology adaptation is complicated by irregular node allocation shapes and holes due to dedicated input/output nodes or hardware failure. In the context of the popular molecular dynamics program NAMD, we present methods for mapping a periodic 3-D grid of fixed-size spatial decomposition domains to 3-D Cray Gemini and 5-D IBM Blue Gene/Q toroidal networks to enable hundred-million atom full machine simulations, and to similarly partition node allocations into compact domains for smaller simulations using multiple-copy algorithms. Additional enabling techniques are discussed and performance is reported for NCSA Blue Waters, ORNL Titan, ANL Mira, TACC Stampede, and NERSC Edison.

  1. Mapping of thermal injury in biologic tissues using quantitative pathologic techniques

    NASA Astrophysics Data System (ADS)

    Thomsen, Sharon L.

    1999-05-01

    Qualitative and quantitative pathologic techniques can be used for (1) mapping of thermal injury, (2) comparisons of lesion sizes and configurations for different instruments or heating sources, and (3) comparisons of treatment effects. Concentric zones of thermal damage form around a single volume heat source. The boundaries between some of these zones are distinct and measurable. Depending on the energy deposition, heating times and tissue type, the zones can include the following, beginning at the hotter center and progressing to the cooler periphery: (1) tissue ablation, (2) carbonization, (3) tissue water vaporization, (4) structural protein denaturation (thermal coagulation), (5) vital enzyme protein denaturation, (6) cell membrane disruption, (7) hemorrhage, hemostasis and hyperemia, (8) tissue necrosis and (9) wound organization and healing.

  2. DIGITAL PROCESSING TECHNIQUES FOR IMAGE MAPPING WITH LANDSAT TM AND SPOT SIMULATOR DATA.

    USGS Publications Warehouse

    Chavez, Pat S.; ,

    1984-01-01

    To overcome certain problems associated with the visual selection of Landsat TM bands for image mapping, the author used a quantitative technique that ranks the 20 possible three-band combinations based upon their information content. Standard deviations and correlation coefficients can be used to compute a value called the Optimum Index Factor (OIF) for each of the 20 possible combinations. SPOT simulator images were digitally processed and compared with Landsat-4 Thematic Mapper (TM) images covering a semi-arid region in northern Arizona and a highly vegetated urban area near Washington, D. C. Statistical comparisons indicate that more radiometric or color information exists in certain TM three-band combinations than in the three SPOT bands.
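
    Assuming the usual definition of the OIF (the sum of the three band standard deviations divided by the sum of the three pairwise absolute correlation coefficients), the ranking can be sketched as follows; the image cube here is a random stand-in, not TM or SPOT data.

```python
# Sketch of an Optimum Index Factor (OIF) ranking for three-band
# combinations, assuming OIF = sum(band std devs) / sum(|pairwise corr|).
# Illustrative only, with a random stand-in for a six-band image cube.
from itertools import combinations
import numpy as np

bands = np.random.rand(6, 200, 200)          # hypothetical 6-band image
flat = bands.reshape(len(bands), -1)
std = flat.std(axis=1)
corr = np.corrcoef(flat)

def oif(i, j, k):
    return (std[i] + std[j] + std[k]) / (
        abs(corr[i, j]) + abs(corr[i, k]) + abs(corr[j, k]))

ranking = sorted(combinations(range(len(bands)), 3),
                 key=lambda t: oif(*t), reverse=True)
for combo in ranking[:3]:                    # top three-band combinations
    print(combo, round(oif(*combo), 2))
```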

  3. Application of the lamp mapping technique for overlap function for Raman lidar systems.

    PubMed

    Walker, Monique; Venable, Demetrius; Whiteman, David N; Sakai, Tetsu

    2016-04-01

    Traditionally, the lidar water vapor mixing ratio (WVMR) is corrected for overlap using data from another instrument, such as a radiosonde. Here we introduce a new experimental method to determine the overlap function using the lamp mapping technique (LMT), which relies on the lidar optics and detection system. The LMT discussed here involves a standard halogen lamp being scanned over the aperture of a Raman lidar telescope in synchronization with the lidar detection system [Appl. Opt. 50, 4622 (2011), 10.1364/AO.50.004622; Appl. Opt. 53, 8538 (2014), 10.1364/AO.53.008535]. In this paper, we show results for an LMT-determined overlap function for individual channels, as well as a WVMR overlap function. We found that the LMT-determined WVMR overlap functions agree to within 5% with the traditional radiosonde-determined overlap.

  4. Mining for diagnostic information in body surface potential maps: A comparison of feature selection techniques

    PubMed Central

    Finlay, Dewar D; Nugent, Chris D; McCullagh, Paul J; Black, Norman D

    2005-01-01

    Background In body surface potential mapping, increased spatial sampling is used to allow more accurate detection of a cardiac abnormality. Although diagnostically superior to more conventional electrocardiographic techniques, the perceived complexity of the Body Surface Potential Map (BSPM) acquisition process has prohibited its acceptance in clinical practice. For this reason there is an interest in striking a compromise between minimizing the number of electrocardiographic recording sites and sampling the maximum electrocardiographic information. Methods In the current study, several techniques widely used in the domains of data mining and knowledge discovery have been employed to mine for diagnostic information in 192-lead BSPMs. In particular, the Single Variable Classifier (SVC) based filter and Sequential Forward Selection (SFS) based wrapper approaches to feature selection have been implemented and evaluated. Using a set of recordings from 116 subjects, the diagnostic ability of subsets of 3, 6, 9, 12, 24 and 32 electrocardiographic recording sites has been evaluated based on their ability to correctly assess the presence or absence of Myocardial Infarction (MI). Results It was observed that the wrapper approach, using sequential forward selection and a 5 nearest neighbour classifier, was capable of choosing a set of 24 recording sites that could correctly classify 82.8% of BSPMs. Although the filter method performed slightly less favourably, the performance was comparable with a classification accuracy of 79.3%. In addition, experiments were conducted to show how (a) features chosen using the wrapper approach were specific to the classifier used in the selection model, and (b) lead subsets chosen were not necessarily unique. Conclusion It was concluded that both the filter and wrapper approaches adopted were suitable for guiding the choice of recording sites useful for determining the presence of MI. It should be noted however that in this study
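
    A generic sketch of the wrapper approach described above, using scikit-learn's sequential forward selection with a 5-nearest-neighbour classifier on synthetic stand-in data; the array shapes mirror the 116 subjects and 192 leads, but the values are random and the printed score is not the paper's result.

```python
# Sketch of an SFS wrapper with a 5-NN classifier on synthetic data;
# not the authors' code or dataset.
import numpy as np
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(116, 192))       # 116 subjects x 192 BSPM leads (synthetic)
y = rng.integers(0, 2, size=116)      # MI present / absent (synthetic labels)

knn = KNeighborsClassifier(n_neighbors=5)
sfs = SequentialFeatureSelector(knn, n_features_to_select=24,
                                direction="forward", cv=5)
sfs.fit(X, y)                          # greedy search; may take a short while
selected_leads = np.flatnonzero(sfs.get_support())
score = cross_val_score(knn, X[:, selected_leads], y, cv=5).mean()
print(selected_leads, round(score, 3))
```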

  5. Effect of display polarity and luminance contrast on visual lobe shape characteristics.

    PubMed

    Tsang, Steve N H; Chan, Alan H S; Yu, R F

    2012-01-01

    The effect of display polarity and luminance contrast on visual lobe (effective visual field) shape characteristics was studied using three levels of luminance contrast with combinations of positive and negative polarities. The binocular effective visual field for a detection task, with a peripherally presented target (V) embedded in a homogeneous competing background (Xs), was mapped on 24 imaginary axes passing through the fixation point. The results showed that visual lobes mapped using positive polarity were statistically larger in area, rounder and more regular in shape than those for negative polarity. The medium contrast condition lobes were more symmetric and regular than low contrast condition lobes, and lobe area and perimeter increased with increasing luminance contrast ratio. Under the interaction of positive polarity and high luminance contrast, visual lobes were found to be larger, smoother and rounder. The high level of luminance and contrast however resulted in a higher degree of visual discomfort. The results indicated that positive polarity and contrast of medium (26:1) to high (41:1) levels are possible display settings for better visual lobe characteristics and better anticipated search performance. Practitioner Summary: The effect of display polarity and luminance contrast on visual lobe shape characteristics was examined with uniform stimulus materials in this study. The results help to identify the optimum display settings for luminance contrast and display polarity to enhance lobe shape characteristics and hence search performance in industrial inspection tasks.

  6. OLED lightings with optical feedback for luminance difference compensation

    NASA Astrophysics Data System (ADS)

    Shin, D. K.; Park, J. W.

    2013-12-01

    We have employed an optical feedback circuit in an organic light-emitting diode (OLED) lighting system to ensure uniform light output across large-area OLED lighting tiles. In a lighting system with several large-area OLED lighting tiles involved, the panel aging (luminance decrease) may appear differently in each, resulting in a falling-off in lighting quality. To tackle this, light output from each OLED tile is monitored by the optical feedback circuit that consists of a photodetector, I-V converter, 10-bit analogue-digital converter (ADC), and comparator. A photodetector mounted on a glass side generates a feedback signal (current) by detecting side-emitting OLED light. To monitor bottom-emitting output light by detecting side-emitting OLED light, a mapping table between the ADC value and the luminance of bottom emission has been established. If the ADC value is lower or higher than the reference one corresponding to the target luminance of OLED tiles, a micro controller unit adjusts the pulse width modulation used for the control of the power supplied to OLED tiles in such a way that the ADC value obtained from optical feedback is the same as the reference one. As a result, the target luminance of each individual OLED tile is kept unchanged. With the optical feedback circuit included in the lighting system, we have observed less than 2% difference in relative intensity of neighboring OLED tiles.
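
    The control loop can be pictured as a simple proportional adjustment of the PWM duty cycle toward a reference ADC value. The sketch below is conceptual: the hardware-access functions, gain, and ADC numbers are hypothetical placeholders, not the system's firmware.

```python
# Conceptual sketch of an optical-feedback loop: compare the ADC reading
# from the photodetector with the reference value for the target luminance
# and nudge the PWM duty cycle accordingly. All hardware functions are
# hypothetical placeholders, not a real driver API.

TARGET_ADC = 612          # hypothetical 10-bit ADC reference for target luminance
KP = 0.0005               # proportional gain (duty-cycle units per ADC count)

def read_side_emission_adc(tile):
    """Placeholder for the photodetector -> I-V converter -> ADC chain."""
    return tile["adc"]

def set_pwm_duty(tile, duty):
    """Placeholder for the micro controller's PWM output."""
    tile["duty"] = min(max(duty, 0.0), 1.0)

def regulate(tile):
    error = TARGET_ADC - read_side_emission_adc(tile)
    set_pwm_duty(tile, tile["duty"] + KP * error)   # raise power if too dim
    return tile["duty"]

tile = {"adc": 580, "duty": 0.70}   # an aged tile reading below the reference
print(regulate(tile))
```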

  7. Application of Geologic Mapping Techniques and Autonomous Feature Detection to Future Exploration of Europa

    NASA Astrophysics Data System (ADS)

    Bunte, M. K.; Tanaka, K. L.; Doggett, T.; Figueredo, P. H.; Lin, Y.; Greeley, R.; Saripalli, S.; Bell, J. F.

    2013-12-01

    Europa's extremely young surface age, evidence for extensive resurfacing, and indications of a sub-surface ocean elevate its astrobiological potential for habitable environments and make it a compelling focus for study. Knowledge of the global distribution and timing of Europan geologic units is a key step in understanding the history of the satellite and in identifying areas relevant for exploration. I have produced a 1:15M scale global geologic map of Europa which represents a proportionate distribution of four unit types and associated features: plains, linea, chaos, and crater materials. Mapping techniques differ somewhat from those used for other planetary maps but provide a method to establish stratigraphic markers and to illustrate the surface history through four periods of formation as a function of framework lineament cross-cutting relationships. Correlations of observed features on Europa with Earth analogs support a multi-process theory of formation rather than the typical reliance on the principle of parsimony. Lenticulae and microchaos are genetically similar and most likely form by diapirism. Platy and blocky chaos units, endmembers of archetypical chaos, are best explained by brine mobilization. Ridges account for the majority of lineaments and may form by a number of methods indicative of local conditions; most form by either tidal pumping or shear heating. The variety of morphologies exhibited by bands indicates that multiple formation mechanisms apply once fracturing of the brittle surface over a ductile subsurface is initiated. Mapping results support the interpretation that Europa's shell has thickened over time resulting in changes in the style and intensity of deformation. Mapping serves as an index for change detection and classification, aids in pre-encounter targeting, and supports the selection of potential landing sites. Highest priority target areas are those which indicate geophysical activity by the presence of volcanic plumes, outgassing, or

  8. Mapping coral reefs using consumer-grade drones and structure from motion photogrammetry techniques

    NASA Astrophysics Data System (ADS)

    Casella, Elisa; Collin, Antoine; Harris, Daniel; Ferse, Sebastian; Bejarano, Sonia; Parravicini, Valeriano; Hench, James L.; Rovere, Alessio

    2017-03-01

    We propose a novel technique to measure the small-scale three-dimensional features of a shallow-water coral reef using a small drone equipped with a consumer-grade camera, a handheld GPS and structure from motion (SfM) algorithms. We used a GoPro HERO4 with a modified lens mounted on a DJI Phantom 2 drone (maximum total take-off weight <2 kg) to perform a 10 min flight and collect 306 aerial images with an overlap equal or greater than 90%. We mapped an area of 8380 m2, obtaining as output an ortho-rectified aerial photomosaic and a bathymetric digital elevation model (DEM) with a resolution of 0.78 and 1.56 cm pixel-1, respectively. Through comparison with airborne LiDAR data for the same area, we verified that the location of the ortho-rectified aerial photomosaic is accurate within 1.4 m. The bathymetric difference between our DEM and the LiDAR dataset is -0.016 ± 0.45 m (1σ). Our results show that it is possible, in conditions of calm waters, low winds and minimal sun glint, to deploy consumer-grade drones as a relatively low-cost and rapid survey technique to produce multispectral and bathymetric data on shallow-water coral reefs. We discuss the utility of such data to monitor temporal changes in topographic complexity of reefs and associated biological processes.

  9. A self-trained classification technique for producing 30 m percent-water maps from Landsat data

    USGS Publications Warehouse

    Rover, Jennifer R.; Wylie, Bruce K.; Ji, Lei

    2010-01-01

    Small bodies of water can be mapped with moderate-resolution satellite data using methods where water is mapped as subpixel fractions using field measurements or high-resolution images as training datasets. A new method, developed from a regression-tree technique, uses a 30 m Landsat image for training the regression tree that, in turn, is applied to the same image to map subpixel water. The self-trained method was evaluated by comparing the percent-water map with three other maps generated from established percent-water mapping methods: (1) a regression-tree model trained with a 5 m SPOT 5 image, (2) a regression-tree model based on endmembers and (3) a linear unmixing classification technique. The results suggest that subpixel water fractions can be accurately estimated when high-resolution satellite data or intensively interpreted training datasets are not available, which increases our ability to map small water bodies or small changes in lake size at a regional scale.
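
    The self-training idea can be sketched as follows: label confidently pure water and land pixels from the same 30 m image (here with a simple NDWI threshold, an assumption made only for illustration), fit a regression tree on the spectral bands, and predict subpixel water fractions everywhere. Synthetic reflectances stand in for Landsat data; this is not the authors' implementation.

```python
# Hedged sketch of a self-trained subpixel-water regression tree.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(1)
n_pixels, n_bands = 5000, 6
X = rng.random((n_pixels, n_bands))              # synthetic reflectances

green, nir = X[:, 1], X[:, 3]
ndwi = (green - nir) / (green + nir + 1e-9)

# Self-training labels from the same image:
# confident water pixels -> 1.0, confident land pixels -> 0.0
pure_water = ndwi > 0.3
pure_land = ndwi < -0.3
train_mask = pure_water | pure_land
y_train = pure_water[train_mask].astype(float)

tree = DecisionTreeRegressor(max_depth=8, min_samples_leaf=20)
tree.fit(X[train_mask], y_train)

# Leaf means act as subpixel water fractions for all pixels.
percent_water = np.clip(tree.predict(X), 0.0, 1.0) * 100.0
print(percent_water[:5])
```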

  10. Modified technique for differential mode delay map measurement by scanning of input/output ends of tested multimode fiber

    NASA Astrophysics Data System (ADS)

    Bourdine, Anton V.; Burdin, Vladimir A.; Pashin, Stanislav S.; Praporshchikov, Denis E.; Sevruk, Nikita L.

    2016-03-01

    We present a modified technique for differential mode delay map measurement. As in well-known methods, a fast laser pulse is launched into a tested multimode fiber (MMF) via a single-mode fiber (SMF) that scans the MMF core at precise offset positions. However, unlike the known technique formalized in ratified standards, the proposed modification also scans the output end of the tested MMF with a short tail of SMF. Therefore, for each radial offset position at the input/output MMF ends, the shape of the pulse response of the launched optical signal is recorded, which yields a more informative differential mode delay map. This work presents some results of an experimental validation of the proposed modified technique for differential mode delay map measurement.

  11. Development And Assessment Of Bias Correction Techniques Based On Distribution Mapping Method For Future Climate Projections

    NASA Astrophysics Data System (ADS)

    P S, S.; Narasimhan, B.; Sudheer, K. P.; Annamalai, H.

    2016-12-01

    Climate change has received much attention recently, owing to the serious threat it poses to the hydrological water balance of our planet. Regional Climate Model (RCM) simulations used for climate change impact assessment studies are subject to model biases, which result in unrealistic hydrologic simulations; hence, bias correction methods are adopted. Among the bias correction methods, the distribution mapping (DM) method, which modifies the shape of the distribution function of precipitation and thereby generates realistic rainfall simulations, is considered efficient. The study adopted a unique sliding-window technique around each day to define a population of data for fitting the probability distribution, based on which the climate data were bias corrected on a daily time scale by the DM method. The present study focuses on the development and assessment of four bias correction techniques based on various combinations of additive and multiplicative bias correction of change factors incorporated within the DM method to account for future climate projections. The study is carried out in two watersheds, the Achankovil and Vaippar basins, which belong to different topographical divisions. In order to assess the performance of the proposed bias correction techniques, plots of the average number of rainy days and rainfall volume for observed, uncorrected, and bias-corrected rainfall simulations were compared for both basins under current and future climate conditions. The effect of the different bias correction methods was found to be almost similar in the Vaippar basin, dominated mostly by the northeast monsoon. However, the Achankovil basin, which experiences both the southwest and northeast monsoons, exhibited differences among bias correction methods. The results of the study demonstrate that bias correction methods can have a considerable effect on assessing the impacts of climate change and that a single method of bias correction is not applicable across all climates.
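
    The core of distribution mapping can be sketched as empirical quantile mapping: each model value is replaced by the observed value at the same quantile of the baseline distributions. The sketch below uses synthetic gamma-distributed rainfall and omits the sliding-window fitting and the additive/multiplicative change-factor variants developed in the study.

```python
# Minimal sketch of distribution-mapping (quantile-mapping) bias correction.
import numpy as np

def quantile_map(model_hist, obs_hist, model_future):
    """Empirical quantile mapping of model_future onto the observed distribution."""
    model_sorted = np.sort(model_hist)
    obs_sorted = np.sort(obs_hist)
    # quantile of each future value within the historical model distribution
    q = np.searchsorted(model_sorted, model_future) / len(model_sorted)
    q = np.clip(q, 0, 1 - 1e-9)
    # read the observed value at the same quantile
    return obs_sorted[(q * len(obs_sorted)).astype(int)]

rng = np.random.default_rng(2)
obs = rng.gamma(shape=0.9, scale=8.0, size=3000)        # observed daily rainfall (mm)
rcm_hist = rng.gamma(shape=0.7, scale=12.0, size=3000)  # biased RCM, baseline
rcm_fut = rng.gamma(shape=0.7, scale=14.0, size=3000)   # biased RCM, future

corrected = quantile_map(rcm_hist, obs, rcm_fut)
print(round(rcm_fut.mean(), 1), round(corrected.mean(), 1))
```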

  12. Threading Dislocation Characterization and Stress Mapping Depth Profiling via Ray Tracing Technique

    NASA Astrophysics Data System (ADS)

    Zhou, Tianyi

    Zinc oxide (ZnO) is well known as a transparent, dielectric, piezoelectric, wide-band-gap material. Its potential has been demonstrated for a wide range of applications such as piezoelectric transducers, gas sensors, optical waveguides and transparent electrodes. It can also be applied as a substrate material for GaN-based devices. However, while some applications have already been realized, issues relating to crystalline defects remain a barrier to the successful realization of several others. In this thesis, the central focus of Chapter II is to characterize threading dislocations in hydrothermally grown ZnO substrates through simulation work as well as other techniques. The goal of this study is to find the origin of threading dislocations and to design strategies that mitigate their negative effects by either reducing their densities or completely eliminating them. In Chapter III, the technique of SMART (stress mapping analysis via ray tracing) is discussed in detail as a means to measure residual stress in packaged silicon circuits. Residual stress plays an important role in the performance and lifetime of single-crystal device material. SMART has two main advantages compared with other techniques: (a) all six components of the stress tensor can be evaluated; (b) it is non-destructive and leaves no damaging trace on the sample. In this study, our goal is to build a relationship between stress distribution and depth. The concept of penetration depth is critically important here, and its value can greatly change the real-space stress distribution. A new function is applied to obtain better fitting curves. Data in this study are obtained at various penetration depths and represent an exponentially decaying weighted average of the actual stress; in other words, the measured stress profile is the Laplace transform of the real stress profile. A mathematical procedure is described to determine the real stress profile from the Laplace profile. Experiment

  13. Spatial epidemiological techniques in cholera mapping and analysis towards a local scale predictive modelling

    NASA Astrophysics Data System (ADS)

    Rasam, A. R. A.; Ghazali, R.; Noor, A. M. M.; Mohd, W. M. N. W.; Hamid, J. R. A.; Bazlan, M. J.; Ahmad, N.

    2014-02-01

    Cholera spatial epidemiology is the study of the spread and control of the disease's spatial pattern and epidemics. Previous studies have shown that multi-factorial causation, such as human behaviour, ecology and other infectious risk factors, influences disease outbreaks. Thus, the spatial pattern of the outbreaks and the factors possibly interrelated with them are crucial to explore in an in-depth study. This study focuses on the integration of geographical information system (GIS) and epidemiological techniques in an exploratory analysis of the cholera spatial pattern and distribution in selected districts of Sabah. Spatial statistic and pattern tools in ArcGIS and Microsoft Excel were utilized to map and analyze the reported cholera cases and other data used. Meanwhile, a cohort study, an epidemiological technique, was applied to investigate multiple outcomes of disease exposure. The general spatial pattern of cholera was highly clustered, showing that the disease spreads easily from a place or person to others, especially within 1500 meters of the infected person and locations. Although the cholera outbreaks in the districts are not critical, the disease could become endemic in crowded areas, unhygienic environments, and areas close to contaminated water. It was also strongly believed that the coastal water of the study areas has a possible relationship with cholera transmission and phytoplankton blooms, since those areas recorded more cases. GIS proves to be a vital spatial epidemiological technique for determining the distribution pattern and generating hypotheses about the disease. Future research will apply advanced geo-analysis methods and other disease risk factors to produce a significant local-scale predictive risk model of the disease in Malaysia.

  14. The derivation of tropospheric column ozone using the TOR approach and mapping technique

    NASA Astrophysics Data System (ADS)

    Yang, Qing

    2007-12-01

    Tropospheric ozone columns (TCOs) derived from differences between the Dutch-Finnish Aura Ozone Monitoring Instrument (OMI) measurements of the total atmospheric ozone column and the Aura Microwave Limb Sounder (MLS) measurements of stratospheric ozone columns are discussed. Because the measurements by these two instruments are not spatially coincident, interpolation techniques, with emphasis on mapping the stratospheric columns in space and time using the relationships between lower stratospheric ozone and potential vorticity (PV) and geopotential heights (Z), are evaluated at mid-latitudes. It is shown that this PV mapping procedure produces somewhat better agreement in comparisons with ozonesonde measurements, particularly in winter, than does simple linear interpolation of the MLS stratospheric columns or the use of typical coincidence criteria at mid-latitudes. The OMI/MLS derived tropospheric columns are calculated to be 4 Dobson units (DU) smaller than the sonde measured columns at mid-latitudes. This mean difference is consistent with the MLS (version 1.5) stratospheric ozone columns being high relative to Stratospheric Aerosol and Gas Experiment (SAGE II) columns by 3 DU. Standard deviations between the derived tropospheric columns and those measured by ozonesondes are 9 DU (30%) annually but they are just 6 DU (15%) in summer. Uncertainties in the interpolated MLS stratospheric columns are likely to be the primary cause of these standard deviations. An important advantage of the PV mapping approach is that it works well when MLS data are missing (e.g., when an orbit of measurements is missing). In the comparisons against ozonesonde measurements, it provides up to twice as many comparisons compared to the other techniques. The OMI/MLS derived tropospheric ozone columns have been compared with corresponding columns based on the Tropospheric Emission Spectrometer (TES) measurements, and Regional chEmical trAnsport Model (REAM) simulations. The variability of
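
    The residual idea itself is simple: subtract a stratospheric column, mapped to the location and time of the total-column measurement, from the OMI total column. In the sketch below the PV/geopotential mapping is reduced to a nearest-neighbour placeholder and all values are hypothetical.

```python
# Conceptual sketch of a residual (TOR-style) tropospheric column:
# total column minus a mapped stratospheric column. Hypothetical values;
# the PV-based mapping is replaced by a simple nearest-neighbour lookup.
import numpy as np

def strat_column_at(lat, lon, mls_lats, mls_lons, mls_strat_du):
    """Placeholder for PV-based mapping: nearest MLS stratospheric column."""
    i = np.argmin((mls_lats - lat) ** 2 + (mls_lons - lon) ** 2)
    return mls_strat_du[i]

mls_lats = np.array([38.0, 40.0, 42.0])
mls_lons = np.array([-105.0, -100.0, -95.0])
mls_strat_du = np.array([285.0, 290.0, 288.0])   # hypothetical DU

omi_total_du = 318.0                              # hypothetical OMI total column
tco = omi_total_du - strat_column_at(40.2, -99.5, mls_lats, mls_lons, mls_strat_du)
print(f"tropospheric column: {tco:.1f} DU")
```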

  15. A Riemannian geometric mapping technique for identifying incompressible equivalents to subsonic potential flows

    NASA Astrophysics Data System (ADS)

    German, Brian Joseph

    This research develops a technique for the solution of incompressible equivalents to planar steady subsonic potential flows. Riemannian geometric formalism is used to develop a gauge transformation of the length measure followed by a curvilinear coordinate transformation to map the given subsonic flow into a canonical Laplacian flow with the same boundary conditions. The effect of the transformation is to distort both the immersed profile shape and the domain interior nonuniformly as a function of local flow properties. The method represents the full nonlinear generalization of the classical methods of Prandtl-Glauert and Karman-Tsien. Unlike the classical methods which are "corrections," this method gives exact results in the sense that the inverse mapping produces the subsonic full potential solution over the original airfoil, up to numerical accuracy. The motivation for this research was provided by an observed analogy between linear potential flow and the special theory of relativity that emerges from the invariance of the d'Alembert wave equation under Lorentz transformations. This analogy is well known in an operational sense, being leveraged widely in linear unsteady aerodynamics and acoustics, stemming largely from the work of Kussner. Whereas elements of the special theory can be invoked for compressibility effects that are linear and global in nature, the question posed in this work was whether other mathematical techniques from the realm of relativity theory could be used to similar advantage for effects that are nonlinear and local. This line of thought led to a transformation leveraging Riemannian geometric methods common to the general theory of relativity. A gauge transformation is used to geometrize compressibility through the metric tensor of the underlying space to produce an equivalent incompressible flow that lives not on a plane but on a curved surface. In this sense, forces owing to compressibility can be ascribed to the geometry of space in

  16. MAPS

    Atmospheric Science Data Center

    2014-07-03

    ... Measurement of Air Pollution from Satellites (MAPS) data were collected during Space Shuttle flights in 1981, ...

  17. Low Altitude AVIRIS Data for Mapping Land Cover in Yellowstone National Park: Use of Isodata Clustering Techniques

    NASA Technical Reports Server (NTRS)

    Spruce, Joe

    2001-01-01

    Yellowstone National Park (YNP) contains a diversity of land cover. YNP managers need site-specific land cover maps, which may be produced more effectively using high-resolution hyperspectral imagery. ISODATA clustering techniques have aided operational multispectral image classification and may benefit certain hyperspectral data applications if optimally applied. In response, a study was performed for an area in northeast YNP using 11 select bands of low-altitude AVIRIS data calibrated to ground reflectance. These data were subjected to ISODATA clustering and Maximum Likelihood Classification techniques to produce a moderately detailed land cover map. The latter has good apparent overall agreement with field surveys and aerial photo interpretation.
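
    In the spirit of the clustering step described above, the sketch below runs k-means on synthetic 11-band pixels and then merges spectrally similar clusters; a full ISODATA implementation would also split high-variance clusters and iterate, and the thresholds here are arbitrary.

```python
# Sketch of an unsupervised clustering step in the spirit of ISODATA,
# implemented as k-means plus a simple cluster-merge rule; synthetic
# pixels stand in for the 11 calibrated AVIRIS bands.
import numpy as np
from scipy.spatial.distance import pdist, squareform
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
pixels = rng.random((20000, 11))                 # rows = pixels, cols = bands

km = KMeans(n_clusters=20, n_init=5, random_state=0).fit(pixels)
centers, labels = km.cluster_centers_, km.labels_

# Merge spectrally similar clusters (centre distance below a chosen threshold).
merge_thresh = 0.15
dists = squareform(pdist(centers))
for i in range(len(centers)):
    for j in range(i + 1, len(centers)):
        if dists[i, j] < merge_thresh:
            labels[labels == j] = i              # fold cluster j into i

print(len(np.unique(labels)), "spectral classes after merging")
```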

  18. Low Altitude AVIRIS Data for Mapping Land Cover in Yellowstone National Park: Use of Isodata Clustering Techniques

    NASA Technical Reports Server (NTRS)

    Spruce, Joe

    2001-01-01

    Yellowstone National Park (YNP) contains a diversity of land cover. YNP managers need site-specific land cover maps, which may be produced more effectively using high-resolution hyperspectral imagery. ISODATA clustering techniques have aided operational multispectral image classification and may benefit certain hyperspectral data applications if optimally applied. In response, a study was performed for an area in northeast YNP using 11 select bands of low-altitude AVIRIS data calibrated to ground reflectance. These data were subjected to ISODATA clustering and Maximum Likelihood Classification techniques to produce a moderately detailed land cover map. The latter has good apparent overall agreement with field surveys and aerial photo interpretation.

  19. Image-based continental shelf habitat mapping using novel automated data extraction techniques

    NASA Astrophysics Data System (ADS)

    Seiler, Jan; Friedman, Ariell; Steinberg, Daniel; Barrett, Neville; Williams, Alan; Holbrook, Neil J.

    2012-08-01

    We automatically mapped the distribution of temperate continental shelf rocky reef habitats with a high degree of confidence using colour, texture, rugosity and patchiness features extracted from images in conjunction with machine-learning algorithms. This demonstrated the potential of novel automation routines to expedite the complex and time-consuming process of seabed mapping. The random forests ensemble classifier outperformed other tree-based algorithms and also offered some valuable built-in model performance assessment tools. Habitat prediction using random forests performed most accurately when all 26 image-derived predictors were included in the model. This produced an overall habitat prediction accuracy of 84% (with a kappa statistic of 0.793) when compared to nine distinct habitat classes assigned by a human annotator. Predictions for three habitat classes were all within the 95% confidence intervals, indicating close agreement between observed and predicted habitat classes. Misclassified images were mostly unevenly, partially or insufficiently illuminated and came mostly from rugged terrains and during the autonomous underwater vehicle's obstacle avoidance manoeuvres. The remaining misclassified images were wrongly or inconsistently labelled by the human annotator. This study demonstrates the suitability of autonomous underwater vehicles to effectively sample benthic habitats and the ability of automated data handling techniques to extract and reliably process large volumes of seabed image data. Our methods for image feature extraction and classification are repeatable, cost-effective and well suited to studies that require non-extractive and/or co-located sampling, e.g. in marine reserves and for monitoring the recovery from physical impacts, e.g. from bottom fishing activities. The methods are transferable to other continental shelf areas and to other disciplines such as seabed geology.
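
    A minimal sketch of the classification step, assuming scikit-learn's random forests and synthetic stand-ins for the 26 image-derived predictors and nine habitat classes; the scores printed are for the random data, not the study's 84% accuracy.

```python
# Random forests habitat classification sketch with synthetic features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
X = rng.normal(size=(3000, 26))                  # 26 image-derived predictors
y = rng.integers(0, 9, size=3000)                # 9 habitat classes (synthetic)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
rf = RandomForestClassifier(n_estimators=500, oob_score=True, random_state=0)
rf.fit(X_tr, y_tr)

# The built-in out-of-bag estimate is one of the ensemble's handy
# performance assessment tools mentioned above.
print("out-of-bag score:", round(rf.oob_score_, 3))
print("hold-out accuracy:", round(rf.score(X_te, y_te), 3))
```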

  20. Comparison of the Biological Impacts of the Fluoride Compounds by Graphical Risk Visualization Map Technique.

    PubMed

    Usuda, Kan; Kono, Rei; Ueno, Takaaki; Ito, Yuichi; Dote, Tomotaro; Yokoyama, Hirotaka; Kono, Koichi; Tamaki, Junko

    2015-09-01

    Various fluoride compounds are widely used in industry. The present risk assessment study was conducted using a series of inorganic binary fluorides of the type XFn, where X(n) = Na(+), K(+), Li(+), Mg(2+), Ca(2+), Sr(2+), Ba(2+), Al(3+), Nd(3+), La(3+), Ce(3+), Sm(3+), Gd(3+), Y(3+), Yb(2+), and Zn(2+). The aqueous solutions of these salts were orally administered to 16 experimental groups (one for each of the salts tested). The levels of fluoride and N-acetyl-β-D-glucosaminidase in cumulative 24-h urine samples, along with creatinine clearance, were measured to assess possible acute renal damage. The levels of fluoride, alanine aminotransferase, and aspartate aminotransferase were also determined in serum samples to assess possible acute hepatic damage. The results reveal that sodium fluoride (NaF), potassium fluoride (KF), and zinc fluoride tetrahydrate (ZnF2·4H2O) can carry the fluoride ion into the bloodstream and that it is excreted via urine more readily than the other compounds tested. These fluorides were assigned the highest risk impact factor. Most of the rare earth fluorides are insoluble in water, while those of groups 2 and 13 of the periodic table are slightly soluble, so that they do not have a significant negative risk. These findings suggest that the biological impact of fluoride depends on the accompanying counter ion and its solubility. The risk map obtained in the present study shows that the graphical visualization map technique employed is a valuable new tool to assess the toxicological risk of chemical compounds.

  1. Comparing and Reconciling Traditional Field and Photogeologic Mapping Techniques: Lessons from the San Francisco Volcanic Field, Arizona

    NASA Technical Reports Server (NTRS)

    Skinner, J. A., Jr.; Eppler, D. B.; Bleacher, J. E.; Evans, C. A.; Feng, W.; Gruener, J.; Hurwitz, D. M.; Janoiko, B.; Whitson, P.

    2014-01-01

    Cartographic products and - specifically - geologic maps provide critical assistance for establishing physical and temporal frameworks of planetary surfaces. The technical methods that result in the creation of geologic maps vary depending on how observations are made as well as the overall intent of the final products [1-3]. These methods tend to follow a common linear work flow, including the identification and delineation of spatially and temporally discrete materials (units), the documentation of their primary (emplacement) and secondary (erosional) characteristics, analysis of the relative and absolute age relationships between these materials, and the collation of observations and interpretations into an objective map product. The "objectivity" of a map is critical for cross-comparison with overlapping maps and topical studies, as well as for its relevance to scientific posterity. However, the "accuracy" and "correctness" of a geologic map remains very much subject to debate. This can be evidenced by comparison of existing geologic maps at various scales, particularly those compiled through field- and remote-based mapping efforts. Our study focuses on comparing the fidelity of (1) "Apollo-style" geologic investigations, where typically non-geologist crew members follow static traverse routes established through pre-mission planning, and (2) "traditional" field-based investigations, where geologists are given free rein to observe without preplanned routes. This abstract summarizes the regional geology wherein our study was conducted, presents the geologic map created from traditional field mapping techniques, and offers basic insights into how geologic maps created from different tactics can be reconciled in support of exploratory missions. Additional abstracts [4-6] from this study discuss various exploration and science results of these efforts.

  2. Can Social Semantic Web Techniques Foster Collaborative Curriculum Mapping In Medicine?

    PubMed Central

    Finsterer, Sonja; Cremer, Jan; Schenkat, Hennig

    2013-01-01

    Background Curriculum mapping, which is aimed at the systematic realignment of the planned, taught, and learned curriculum, is considered a challenging and ongoing effort in medical education. Second-generation curriculum managing systems foster knowledge management processes including curriculum mapping in order to give comprehensive support to learners, teachers, and administrators. The large quantity of custom-built software in this field indicates a shortcoming of available IT tools and standards. Objective The project reported here aims at the systematic adoption of techniques and standards of the Social Semantic Web to implement collaborative curriculum mapping for a complete medical model curriculum. Methods A semantic MediaWiki (SMW)-based Web application has been introduced as a platform for the elicitation and revision process of the Aachen Catalogue of Learning Objectives (ACLO). The semantic wiki uses a domain model of the curricular context and offers structured (form-based) data entry, multiple views, structured querying, semantic indexing, and commenting for learning objectives (“LOs”). Semantic indexing of learning objectives relies on both a controlled vocabulary of international medical classifications (ICD, MeSH) and a folksonomy maintained by the users. An additional module supporting the global checking of consistency complements the semantic wiki. Statements of the Object Constraint Language define the consistency criteria. We evaluated the application by a scenario-based formative usability study, where the participants solved tasks in the (fictional) context of 7 typical situations and answered a questionnaire containing Likert-scaled items and free-text questions. Results At present, ACLO contains roughly 5350 operational (ie, specific and measurable) objectives acquired during the last 25 months. The wiki-based user interface uses 13 online forms for data entry and 4 online forms for flexible searches of LOs, and all the forms are

  3. Can social semantic web techniques foster collaborative curriculum mapping in medicine?

    PubMed

    Spreckelsen, Cord; Finsterer, Sonja; Cremer, Jan; Schenkat, Hennig

    2013-08-15

    Curriculum mapping, which is aimed at the systematic realignment of the planned, taught, and learned curriculum, is considered a challenging and ongoing effort in medical education. Second-generation curriculum managing systems foster knowledge management processes including curriculum mapping in order to give comprehensive support to learners, teachers, and administrators. The large quantity of custom-built software in this field indicates a shortcoming of available IT tools and standards. The project reported here aims at the systematic adoption of techniques and standards of the Social Semantic Web to implement collaborative curriculum mapping for a complete medical model curriculum. A semantic MediaWiki (SMW)-based Web application has been introduced as a platform for the elicitation and revision process of the Aachen Catalogue of Learning Objectives (ACLO). The semantic wiki uses a domain model of the curricular context and offers structured (form-based) data entry, multiple views, structured querying, semantic indexing, and commenting for learning objectives ("LOs"). Semantic indexing of learning objectives relies on both a controlled vocabulary of international medical classifications (ICD, MeSH) and a folksonomy maintained by the users. An additional module supporting the global checking of consistency complements the semantic wiki. Statements of the Object Constraint Language define the consistency criteria. We evaluated the application by a scenario-based formative usability study, where the participants solved tasks in the (fictional) context of 7 typical situations and answered a questionnaire containing Likert-scaled items and free-text questions. At present, ACLO contains roughly 5350 operational (ie, specific and measurable) objectives acquired during the last 25 months. The wiki-based user interface uses 13 online forms for data entry and 4 online forms for flexible searches of LOs, and all the forms are accessible by standard Web browsers. The

  4. Characterising Nearby Luminous Infrared Galaxies

    NASA Astrophysics Data System (ADS)

    Ramphul, R.; Vaisanen, P.; Van der Heyden, K.

    2017-06-01

    Luminous InfraRed Galaxies (LIRGs) in the local universe are known to be highly interacting galaxies with strong star formation in obscured environments. LIRGs are diverse in terms of morphology and the mode and location of SF, while their even more energetic counterparts, the ultra-luminous IR galaxies (ULIRGs, LIR ≥ 10^12 Lsol), are normally (remnants of) gas-rich major mergers with centralised starbursts and AGN. I will present ongoing work on a survey of >40 (U)LIRGs, in a distance range of 40 to 300 Mpc, observed with SALT/RSS in long-slit mode. The galaxies in the sample are in various stages of interaction and merging, some with a strong AGN contribution. The reduction of the SALT/RSS data was performed efficiently with our custom-built pipeline, written in python/iraf/pyraf, which handles error-frame propagation. We are performing a rigorous stellar populations analysis of our sample using Starlight (Cid Fernandes, 2005), which will ultimately lead to understanding the star formation history of these galaxies. We also use automatic line intensity measurements to derive chemical abundances, star formation rates, metallicity, and emission-line diagnostics. The talk will showcase the latest results that we just obtained for this dataset and discuss some of the future work.

  5. Mapping Tamarix: New techniques for field measurements, spatial modeling and remote sensing

    NASA Astrophysics Data System (ADS)

    Evangelista, Paul H.

    peak growing months. These studies demonstrate that new techniques can further our understanding of tamarisk's impacts on ecosystem processes, predict potential distribution and new invasions, and improve our ability to detect occurrence using remote sensing techniques. Collectively, the results of my studies may increase our ability to map tamarisk distributions and better quantify its impacts over multiple spatial and temporal scales.

  6. Potassium selectivity of frog gastric luminal membrane.

    PubMed

    Kasbekar, D K

    1986-06-01

    Transmural potential difference (PD) and resistance (R) changes after luminal or serosal instillation of K+ were determined under various conditions in chambered preparations of frog gastric mucosae. Potassium selectivity of the luminal membrane is indicated by the rapid reversal of the inverted PD of mucosae bathed in NaCl-free, choline sulfate (Ch2SO4)-Ringer on the serosal side and unbuffered hypertonic Ch2SO4 solution on the luminal side on luminal K+ instillation. The delta PD responses are significantly attenuated, however, in histamine-stimulated mucosae bathed in hypotonic or in burimamide-inhibited mucosae bathed in hyper- and hypotonic luminal media, which suggests that the K+ selectivity of the luminal membrane resides largely in the tubular cell apical membrane. Imposing a serosal-to-luminal transmucosal K+ gradient in both histamine-stimulated and omeprazole-inhibited mucosae also reversed the normal orientation of PD but not in those inhibited with burimamide. In the latter, the PD inversion was attenuated but maintained its normal orientation. These data suggest that burimamide, but not omeprazole, acts by blocking luminal membrane K+ conductance. The inverted PD in mucosae bathed in Cl-free media may thus be due partially or fully to K+ diffusion driven by the cell-to-lumen K+ gradient via the luminal K+ conductance pathway. These findings have implications for the controversy surrounding the postulated electrogenicity of the gastric proton pump.

  7. Coal fire mapping of East Basuria Colliery, Jharia coalfield using vertical derivative technique of magnetic data

    NASA Astrophysics Data System (ADS)

    Pal, S. K.; Vaish, Jitendra; Kumar, Sahadev; Bharti, Abhay Kumar

    2016-02-01

    The present study deals with the coal fire mapping of East Basuria Colliery, Jharia coalfield, India, using the magnetic method. It is based on the fact that a rise in temperature results in significant changes in the magnetic susceptibility and thermo-remanent magnetization (TRM) of the overlying rocks. Magnetism increases slowly with the rise of temperature until the Curie temperature is reached. Generally, rock/overburden loses magnetization and becomes paramagnetic when heated to the Curie temperature, which results in a significant reduction in magnetic susceptibility. However, magnetism increases significantly after cooling below the Curie temperature. Several data processing methods, such as diurnal correction, reduction to pole (RTP), and first and second vertical derivatives, have been used for the analysis and interpretation of the magnetic data. It is observed that the total magnetic field intensity anomaly of the area varies approximately from 44850 to 47460 nT and the residual magnetic anomaly varies approximately from -1323 to 1253 nT. The range of the magnetic anomaly after RTP is approximately 1050-1450 nT. Using vertical derivative techniques, about 20 low magnetic anomaly zones associated with active coal fire regions and 11 high magnetic anomaly zones associated with non-coal fire regions have been identified.
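
    The vertical-derivative filter used for such maps is conventionally applied in the wavenumber domain, where the n-th vertical derivative corresponds to multiplying the Fourier transform of the gridded anomaly by |k|^n. The sketch below applies this standard potential-field filter to a synthetic anomaly grid; it is not the survey's processing chain, and the grid values are invented.

```python
# Hedged sketch of a wavenumber-domain vertical derivative of a gridded
# magnetic anomaly: FFT, multiply by |k|**n, inverse FFT. Synthetic grid.
import numpy as np

def vertical_derivative(grid, dx, dy, order=1):
    ny, nx = grid.shape
    kx = 2 * np.pi * np.fft.fftfreq(nx, d=dx)
    ky = 2 * np.pi * np.fft.fftfreq(ny, d=dy)
    KX, KY = np.meshgrid(kx, ky)
    k = np.sqrt(KX**2 + KY**2)
    return np.real(np.fft.ifft2(np.fft.fft2(grid) * k**order))

# Synthetic residual magnetic anomaly (nT) on a 10 m grid.
y, x = np.mgrid[0:256, 0:256] * 10.0
anomaly = 800.0 * np.exp(-(((x - 1200) ** 2 + (y - 1400) ** 2) / 2e5))

fvd = vertical_derivative(anomaly, dx=10.0, dy=10.0, order=1)   # first derivative
svd = vertical_derivative(anomaly, dx=10.0, dy=10.0, order=2)   # second derivative
print(fvd.max(), svd.max())
```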

  8. Evaluation of the potentials of the ERTS techniques for geological mapping purposes

    NASA Technical Reports Server (NTRS)

    Mohr, H. E. C. V. (Principal Investigator)

    1973-01-01

    The author has identified the following significant results. Preliminary analysis of the full frame images shows the following: (1) The techniques employed were found suitable for study purposes. (2) The positional accuracy of the system corrected bulk processed MSS images is about 100-400 m. (3) Drainage is well depicted and morphotectonic features could be recognized. (4) Water bodies are clearly shown in MSS band 7. (5) Major roads, railway lines, and large towns are more or less detectable on bands 6 and 7. (6) Vegetation and agricultural areas are recognizable at many places. (7) Major rock formation boundaries, in general, are more or less clear. (8) Some of the structural trends including folded and linear structural elements are clear. (9) MSS bands 5 and 7 appear to be most useful for visual analysis. (10) Quick look analysis suggests that regional geological mapping on ERTS-1 imagery of the study area will probably be quite feasible. However, detailed analysis, visual and quantitative, on subframes, which is in progress, may yield more significant results.

  9. New Techniques for Heat Flow Calculations and Mapping Temperature-At

    NASA Astrophysics Data System (ADS)

    Frone, Z.; Blackwell, D. D.; Batir, J.; Park, J.; Richards, M.

    2010-12-01

    The results from a new geothermal resource assessment of the US, including for the first time detailed data from much of the eastern US, are summarized along with the techniques used to accomplish the assessment. BHT data were incorporated in the eastern US, where conventional heat flow data are sparse, using BHT corrections and conductivities calculated from a regional lithology model; results were calibrated for error against locations where overlapping conventional heat flow and BHT data exist. A total of 5,000 points are now available in the northeast, as opposed to the 1,000 used to produce the 2004 Geothermal Map of North America. Where neither heat flow nor BHT data were available, geophysical data (regional gravity and magnetics) were used as an ancillary predictor for areas with sedimentary cover; the effectiveness of that process is demonstrated. This study uses the new heat flow data to improve the calculated heat in place to 10 km for the US. Based on the preliminary results from this work, the Appalachian Basin may contain some of the most favorable potential targets for EGS geothermal exploration in the eastern 1/3 of the United States, especially in eastern West Virginia, where temperatures of at least 150°C exist at a depth of 4.5 km.
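
    A minimal sketch of how temperature at depth can be estimated from surface heat flow by one-dimensional steady-state conduction through a layered column; this is a generic textbook calculation rather than the assessment's workflow, it ignores radiogenic heat production, and the layer thicknesses and conductivities below are hypothetical.

        def temperature_at_depth(q_surface, layers, t_surface=10.0):
            """q_surface: surface heat flow in W/m^2; layers: list of
            (thickness_m, conductivity_W_per_mK); returns temperature in deg C."""
            t = t_surface
            for thickness, conductivity in layers:
                t += q_surface * thickness / conductivity   # dT = q * dz / k
            return t

        # e.g. 65 mW/m^2 through 2.5 km of sediments and 2 km of basement:
        print(temperature_at_depth(0.065, [(2500.0, 1.5), (2000.0, 2.5)]))  # ~170 deg C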

  10. Saturation of Alfvén modes in tokamak plasmas investigated by Hamiltonian mapping techniques

    NASA Astrophysics Data System (ADS)

    Briguglio, S.; Schneller, M.; Wang, X.; Di Troia, C.; Hayward-Schneider, T.; Fusco, V.; Vlad, G.; Fogaccia, G.

    2017-07-01

    Nonlinear dynamics of single-toroidal-number Alfvén eigenmodes destabilised by the resonant interaction with fast ions is investigated, in tokamak equilibria, by means of Hamiltonian mapping techniques. The results obtained by two different simulation codes, XHMGC and HAGIS, are presented for n = 2 beta-induced Alfvén eigenmodes and n = 6 toroidal Alfvén eigenmodes, respectively. Simulations of the bump-on-tail instability performed by a 1-dimensional code, PIC1DP, are also analysed for comparison. As a general feature, modes saturate as the resonant-particle distribution function is flattened over the whole region where mode-particle power transfer can take place in the linear phase. This region is limited by the narrower of the resonance width and the mode width. In the former case, the mode amplitude at saturation exhibits a quadratic scaling with the linear growth rate; in the latter case, the scaling is linear. These results are explained in terms of the approximate analytic solution of a nonlinear pendulum model. They are also used to prove that the radial width of the single poloidal harmonic sets an upper limit to the radial displacement of circulating fast ions produced by a single-toroidal-number gap mode in the large-n limit, irrespective of the possible existence of a large global mode structure formed by many harmonics.
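
    As a hedged aside (a textbook trapping estimate, not reproduced from the paper itself), the quadratic scaling quoted above follows from the nonlinear pendulum picture: the bounce frequency of resonantly trapped ions grows as the square root of the mode amplitude, and saturation occurs when it becomes comparable to the linear growth rate,

        \omega_b \propto \sqrt{A}, \qquad
        \omega_b(A_{\mathrm{sat}}) \sim \gamma_L
        \;\Longrightarrow\; A_{\mathrm{sat}} \propto \gamma_L^{2},

    whereas the linear scaling corresponds to the case in which the radial mode width, rather than the resonance width, bounds the flattened region.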

  11. Analysis of resonant responses of split ring resonators using conformal mapping techniques

    NASA Astrophysics Data System (ADS)

    McMeekin, Scott G.; Khokhar, Ali Z.; Lahiri, Basudev; De La Rue, Richard M.; Johnson, Nigel P.

    2007-05-01

    We report a novel method for modeling the resonant frequency response of infra-red light, in the range of 2 to 10 microns, reflected from metallic split ring resonators (SRRs) fabricated on a silicon substrate. The calculated positions of the TM and TE peaks are determined from the plasma frequency associated with the filling fraction of the metal array and the equivalent LC circuit defined by the SRR elements. The capacitance of the equivalent circuit is calculated using conformal mapping techniques to determine the co-planar capacitance associated with both the individual and the neighbouring elements. The inductance of the equivalent circuit is based on the self-inductance of the individual elements and the mutual inductance of the neighbouring elements. The results obtained from the method are in good agreement with experimental results and with simulation results obtained from a commercial FDTD simulation software package. The method allows the frequency response of an SRR to be readily calculated without complex computational methods and enables new designs to be optimised for a particular frequency response by tuning the LC circuit.
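
    A minimal sketch of the equivalent-LC estimate described above, assuming a coplanar-strip approximation for the gap capacitance (the standard conformal-mapping result with complete elliptic integrals) and a lumped loop inductance; the geometry, effective permittivity, and inductance values are hypothetical, and this is not the authors' model.

        import numpy as np
        from scipy.special import ellipk

        EPS0 = 8.854e-12  # F/m

        def coplanar_capacitance_per_length(gap, width, eps_eff):
            """Capacitance per unit length (F/m) of two coplanar strips of
            width `width` separated by `gap`, via conformal mapping."""
            k = gap / (gap + 2.0 * width)
            kp = np.sqrt(1.0 - k**2)
            return EPS0 * eps_eff * ellipk(kp**2) / ellipk(k**2)  # scipy takes m = k^2

        def srr_resonance(gap, width, gap_length, inductance, eps_eff):
            c = coplanar_capacitance_per_length(gap, width, eps_eff) * gap_length
            return 1.0 / (2.0 * np.pi * np.sqrt(inductance * c))  # f0 = 1/(2*pi*sqrt(LC))

        # Hypothetical nanoscale SRR on silicon (eps_eff ~ 6), loop inductance ~0.1 pH:
        f0 = srr_resonance(gap=100e-9, width=200e-9, gap_length=500e-9,
                           inductance=1e-13, eps_eff=6.0)
        print(f0, 3e8 / f0)   # resonant frequency (Hz) and free-space wavelength (m)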

  12. Real-time PCR mapping of DNaseI-hypersensitive sites using a novel ligation-mediated amplification technique

    PubMed Central

    Follows, George A.; Janes, Mary E.; Vallier, Ludovic; Green, Anthony R.; Gottgens, Berthold

    2007-01-01

    Mapping sites within the genome that are hypersensitive to digestion with DNaseI is an important method for identifying DNA elements that regulate transcription. The standard approach to locating these DNaseI-hypersensitive sites (DHSs) has been to use Southern blotting techniques, although we, and others, have recently published alternative methods using a range of technologies including high-throughput sequencing and genomic array tiling paths. In this article, we describe a novel protocol that uses real-time PCR to map DHSs. Advantages of the technique reported here include the small cell numbers required for each analysis and the rapid, relatively low-cost experiments with minimal need for specialist equipment. Presented examples include comparative DHS mapping of known TAL1/SCL regulatory elements between human embryonic stem cells and K562 cells. PMID:17389645

  13. Impact of Intestinal Microbiota on Intestinal Luminal Metabolome

    PubMed Central

    Matsumoto, Mitsuharu; Kibe, Ryoko; Ooga, Takushi; Aiba, Yuji; Kurihara, Shin; Sawaki, Emiko; Koga, Yasuhiro; Benno, Yoshimi

    2012-01-01

    Low-molecular-weight metabolites produced by intestinal microbiota play a direct role in health and disease. In this study, we analyzed the colonic luminal metabolome using capillary electrophoresis mass spectrometry with time-of-flight (CE-TOFMS), a novel technique for analyzing and differentially displaying metabolic profiles, in order to clarify the metabolite profiles in the intestinal lumen. CE-TOFMS identified 179 metabolites from the colonic luminal metabolome; 48 metabolites were present in significantly higher concentrations and/or incidence in the germ-free (GF) mice than in the Ex-GF mice (p < 0.05), 77 metabolites were present in significantly lower concentrations and/or incidence in the GF mice than in the Ex-GF mice (p < 0.05), and 56 metabolites showed no differences in concentration or incidence between GF and Ex-GF mice. These results indicate that the intestinal microbiota strongly influences the colonic luminal metabolome and that a comprehensive understanding of the intestinal luminal metabolome is critical for clarifying host-intestinal bacterial interactions. PMID:22724057

  14. Investigation of driving forces of cerebrospinal fluid motion by power and frequency mapping based on asynchronous phase contrast technique.

    PubMed

    Sunohara, Saeko; Yatsushiro, Satoshi; Takizawa, Ken; Matsumae, Mitsunori; Kajihara, Nao; Kuroda, Kagayaki

    2016-08-01

    To classify the cardiac- and respiratory-driven cerebrospinal fluid (CSF) motions, asynchronous 2D phase contrast (PC) magnetic resonance imaging (MRI) with 217 ms time resolution, in conjunction with power and frequency mapping, was performed for 7 healthy subjects under respiration guidance. In the frequency domain, the cardiac-driven motion was at around 1.29±0.21 Hz and the respiratory-driven motion was at 0.16±0.01 Hz under a 6 s respiratory cycle. Two different techniques were proposed for characterizing the motions: one was a power map (P-map) depicting the integrated power spectrum in a selected band, and the other was a frequency map (F-map) delineating the frequency of the maximum peak in the power spectral density (PSD). These maps visualized the anatomical distributions of the two motions. The portions of the cardiac- and respiratory-driven CSF motions in the spinal subarachnoid space were 58.1±22.2 and 9.50±3.83 %, respectively. Power and frequency mapping clearly classified the cardiac-driven and respiratory-driven CSF motions.
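
    A minimal sketch of the two maps described above: per-voxel power spectral densities are integrated over a chosen band for the P-map, and the frequency of the PSD maximum gives the F-map. The 217 ms sampling interval is taken from the abstract; the voxel time series, band limits, and array sizes below are hypothetical, and this is not the authors' implementation.

        import numpy as np
        from scipy.signal import periodogram

        DT = 0.217                # s, asynchronous PC-MRI frame interval
        FS = 1.0 / DT             # Hz, sampling frequency (~4.6 Hz)

        def p_and_f_maps(series, band):
            """series: (ny, nx, nt) voxel time courses; band: (f_lo, f_hi) in Hz."""
            freqs, psd = periodogram(series, fs=FS, axis=-1)
            df = freqs[1] - freqs[0]
            in_band = (freqs >= band[0]) & (freqs <= band[1])
            p_map = psd[..., in_band].sum(axis=-1) * df          # integrated band power
            f_map = freqs[1:][np.argmax(psd[..., 1:], axis=-1)]  # peak frequency, DC excluded
            return p_map, f_map

        # Hypothetical data; cardiac (~1.3 Hz) and respiratory (~0.17 Hz) bands:
        data = np.random.default_rng(1).normal(size=(32, 32, 256))
        p_cardiac, f_peak = p_and_f_maps(data, band=(1.0, 1.6))
        p_resp, _ = p_and_f_maps(data, band=(0.10, 0.25))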

  15. Fast and accurate multi-channel B1+ mapping based on the TIAMO technique for 7T UHF body MRI.

    PubMed

    Brunheim, Sascha; Gratz, Marcel; Johst, Sören; Bitz, Andreas Karsten; Fiedler, Thomas M; Ladd, Mark Edward; Quick, Harald Hartmut; Orzada, Stephan

    2017-10-09

    Current methods for mitigation of transmit field B1+ inhomogeneities at ultrahigh field (UHF) MRI by multi-channel radiofrequency (RF) shimming rely on accurate B1+ mapping. This can be time consuming when many RF channels have to be mapped for in vivo body MRI, where the B1 maps should ideally be acquired within a single breath-hold. Therefore, a new B1+ mapping technique (B1TIAMO) is proposed. The performance of this technique is validated against an established method (DREAM) in phantom measurements for a cylindrical head phantom with an 8-channel transmit/receive (Tx/Rx) array. Furthermore, measurements for a 32-channel Tx/Rx remote array are conducted in a large body phantom and the |B1+| map reliability is validated against simulations of the transmit RF field distribution. Finally, in vivo results of this new mapping technique for the human abdomen are presented. For the head phantom (8-channel Tx/Rx coil), the single |B1+| comparison between B1TIAMO, the direct DREAM measurements, and simulation data showed good agreement, with 10-19% difference. For the large body phantom (32-channel Tx/Rx coil), B1TIAMO matched the RF field simulations well. The results demonstrate the potential to acquire 32 accurate single-channel B1+ maps for large field-of-view body imaging within only a single breath-hold of 16 s at 7T UHF MRI. © 2017 International Society for Magnetic Resonance in Medicine.

  16. Use of Portfolio Assessment Technique in Teaching Map Sketching and Location in Secondary School Geography in Jos, Nigeria

    ERIC Educational Resources Information Center

    Ugodulunwa, Christiana; Wakjissa, Sayita

    2015-01-01

    This study investigated the use of portfolio assessment technique in teaching map sketching and location in geography in Jos, Nigeria. It adopted a quasi-experimental design. Two schools were selected using a table of random numbers from a population of 51 schools in Jos South and assigned to each of experimental and control group. The…

  17. A first look at the application of signal extraction techniques to the analysis of body surface potential maps

    NASA Technical Reports Server (NTRS)

    Weinstein, S. B.; Mcneel, M. L.; Matthews, E.; Fischmann, E. J.

    1976-01-01

    Partial body surface potential maps from both normal subjects and subjects with independently diagnosed myocardial infarcts are visually compared from superimposed plots. A correlation test is devised to distinguish the two groups, with the reference waveform determined by means of a gradient-search algorithm. The results are encouraging, and suggest further investigation of these techniques as a future diagnostic tool.

  18. Using Zaltman Metaphor Elicitation Technique to Map Beneficiaries' Experiences and Values: A Case Example From the Sanitation Sector

    ERIC Educational Resources Information Center

    Kokko, Suvi; Lagerkvist, Carl Johan

    2017-01-01

    Using a case example of an innovative sanitation solution in a slum setting, this study explores the usefulness of the Zaltman Metaphor Elicitation Technique in a program planning and evaluation context. Using a qualitative image-based method to map people's mental models of ill-structured problems such as sanitation can aid program planners and…

  19. The Effect of Using the Story-Mapping Technique on Developing Tenth Grade Students' Short Story Writing Skills in EFL

    ERIC Educational Resources Information Center

    Ibnian, Salem Saleh Khalaf

    2010-01-01

    The current study aimed at investigating the effect of using the story-mapping technique on developing tenth grade students' short story writing skills in EFL. The study attempted to answer the following questions: (1) What are the short story writing skills needed for tenth grade students in EFL?; and (2) What is the effect of using the…

  20. Notes for the improvement of a remote sensing multispectral data non-supervised classification and mapping technique

    NASA Technical Reports Server (NTRS)

    Dalton, C. C.

    1973-01-01

    Examined are: (1) the sequential clustering technique for the unsupervised automatic classification and mapping of earth resources satellite data, (2) theoretical analysis of the tests which were used, and (3) derivation of an alternative set of tests and their necessary algorithm.

  1. [The effectiveness of RF ablation of ventricular ectopic beats made using selected mapping techniques].

    PubMed

    Senderek, Tomasz; Bednarek, Jacek; Lelakowski, Jacek

    2015-11-01

    Only a few world-leading centers have summarized outcomes of invasive therapy of ventricular arrhythmia. The aim of this work was to compare the effectiveness of RF ablation of ventricular arrhythmia. 183 patients (111 males, mean age 50 ± 17) underwent RF ablation of ventricular ectopic beats (VEB). A retrospective analysis of procedural protocols and in- and outpatient medical records was performed. RF ablation was done using the electroanatomical CARTO system, Pacemapping, or both methods (CARTO + Pacemapping). Long-term ablation effectiveness was as follows: CARTO - success rate assessed during the ablation procedure was 84.4%, during post-operative follow-up 70.3%, and in long-term follow-up 71.1%; Pacemapping - success rate assessed during the ablation procedure was 91.7%, during post-operative follow-up 83.3%, and in long-term follow-up 75.0%; CARTO + Pacemapping - success rate assessed during the ablation procedure was 85.4%, during post-operative follow-up 70.8%, and in long-term follow-up 77.1%. The mean number of VEBs per day before ablation was 18750 ± 12560 (2435 to 50000) and after ablation 575 ± 428 (0 to 1550), p < 0.001. The best results were achieved in cases where both mapping techniques were used in combination. Among clinical parameters affecting long-term ablation effectiveness, only hypertension was found to significantly decrease the long-term effectiveness of VEB ablation. Only ablation temperature and energy significantly affected the long-term therapy effect (p < 0.0014; HR = 0.84). After the ablation, there was improvement of the left-ventricular end-diastolic diameter and ejection fraction. Long-term success of ventricular extrasystole ablation with the combined method (CARTO + Pacemapping) was slightly higher compared with the CARTO technique and the Pacemapping technique alone. Classic RF ablation is effective and safe; therefore, it can be considered as first-line therapy. In ablation, precise localization of the arrhythmic focus is the most important factor. Ablation

  2. The impact of two-dimensional pulsed-field gel electrophoresis techniques for the consistent and complete mapping of bacterial genomes: refined physical map of Pseudomonas aeruginosa PAO.

    PubMed Central

    Römling, U; Tümmler, B

    1991-01-01

    The SpeI/DpnI map of the 5.9 Mb Pseudomonas aeruginosa PAO (DSM 1707) genome was refined by two-dimensional (2D) pulsed-field gel electrophoresis techniques (PFGE) which allow the complete and consistent physical mapping of any bacterial genome of interest. Single restriction digests were repetitively separated by PFGE employing different pulse times and ramps in order to detect all bands with optimum resolution. Fragment order was evaluated from the pattern of 2D PFGE gels: 1. Partial-complete digestion. A partial restriction digest was separated in the first dimension, redigested to completion, and subsequently perpendicularly resolved in the second dimension. 2D-gel comparison of the ethidium bromide stain of all fragments and of the autoradiogram of end-labeled partial digestion fragments was nearly sufficient for the construction of the macrorestriction map. 2. Reciprocal gels. A complete restriction digest with enzyme A was run in the first dimension, redigested with enzyme B, and separated in the second orthogonal direction. The order of restriction digests was reversed on the second gel. In the case of two rare-cutters, fragments were visualized by ethidium bromide staining or hybridization with genomic DNA. If a frequent and a rare cutter were employed, linking fragments were identified by end-labeling of the first digest. 3. A few small fragments were isolated by preparative PFGE and used as a probe for Southern analysis. 38 SpeI and 15 DpnI fragments were positioned on the map. The zero point was relocated to the 'origin of replication'. The anonymous mapping techniques described herein are unbiased by repetitive DNA, unclonable genomic regions, unfavourable location of restriction sites, or cloning artifacts as frequently encountered in other top-down or bottom-up approaches. PMID:1905802

  3. Evaluation of EREP techniques for geological mapping. [southern Pyrenees and Ebro basin in Spain

    NASA Technical Reports Server (NTRS)

    Vandermeermohr, H. E. C.; Srivastava, G. S. (Principal Investigator)

    1975-01-01

    The author has identified the following significant results. Skylab photographs may be successfully utilized for preparing a reconnaissance geological map in areas where no maps or only semi-detailed maps exist. The large area coverage and regional perspective of Skylab photographs can support better coordination in regional mapping. It is possible to delineate major structural trends and other features like mega-lineaments, geofractures, and faults, which have evaded detection by conventional methods. The photointerpretability is better in areas dominated by sedimentary rocks. Rock units of smaller extent and having poor geomorphic expression are difficult to map. Demarcation of quaternary river alluvium can be made with better precision and ease with the Skylab photographs. Stereoscopic viewing greatly helps in the interpretation of area structures. Skylab photographs are not good for preparing geological maps larger than 1:270,000 scale.

  4. A Cognitive Approach to Instructional Techniques and Color Selection in Mapping.

    DTIC Science & Technology

    1985-01-01

    Perry, Thomas M. (AFIT student at: University of...). From the introduction: We see, today, that cartographic instruction consists of teaching map notation and teaching about the elements that can be combined to form a map. For instance, consider the following map making topics most extensively treated in the common cartographic text, Elements of Cartography

  5. Luminous Binary Supersoft X-Ray Sources

    NASA Technical Reports Server (NTRS)

    Oliversen, Ronald J. (Technical Monitor); DiStefano, Rosanne

    2003-01-01

    We have made remarkable progress in the study of luminous supersoft X-ray sources during the past year. We have begun to discover a population of ultraluminous SSSs (e.g., in NGC 300 [Kong & Di Stefano 2003] as well as in M101 [Di Stefano & Kong 2003]), which may be accreting intermediate-mass (50-100 solar mass) black holes. This work follows from an algorithm we have developed (Di Stefano & Kong 2003) to identify SSSs in external galaxies, selecting them from among each galaxy's total population of X-ray sources. We have applied the algorithm to approximately one dozen galaxies and will make it public after it has been published in its entirety. Through our own application of the algorithm, we have discovered SSSs in every galaxy, mapping their spatial distribution, to obtain important clues to their fundamental natures. We have discovered that there is a large population of X-ray sources which are slightly hotter (100-250 eV) than standard SSSs. Some of these may be accreting BHs with masses between roughly 50 and 100 solar masses. To explore this possibility, we are working on theoretical models for the formation and evolution of such systems (Di Stefano 2003).

  6. Luminous Binary Supersoft X-Ray Sources

    NASA Technical Reports Server (NTRS)

    Oliversen, Ronald J. (Technical Monitor); DiStefano, Rosanne

    2003-01-01

    We have made remarkable progress in the study of luminous supersoft X-ray sources during the past year. We have begun to discover a population of ultraluminous SSSs (e.g., in NGC 300 [Kong & Di Stefano 2003] as well as in M101 [Di Stefano & Kong 2003]), which may be accreting intermediate-mass (50-100 solar mass) black holes. This work follows from an algorithm we have developed (Di Stefano & Kong 2003) to identify SSSs in external galaxies, selecting them from among each galaxy's total population of X-ray sources. We have applied the algorithm to approximately one dozen galaxies and will make it public after it has been published in its entirety. Through our own application of the algorithm, we have discovered SSSs in every galaxy, mapping their spatial distribution, to obtain important clues to their fundamental natures. We have discovered that there is a large population of X-ray sources which are slightly hotter (100-250 eV) than standard SSSs. Some of these may be accreting BHs with masses between roughly 50 and 100 solar masses. To explore this possibility, we are working on theoretical models for the formation and evolution of such systems (Di Stefano 2003).

  7. Apparent speed increases at low luminance

    PubMed Central

    Vaziri-Pashkam, Maryam; Cavanagh, Patrick

    2009-01-01

    To investigate the effect of luminance on apparent speed, subjects adjusted the speed of a low-luminance rotating grating (0.31 cd/m2) to match that of a high-luminance one (1260 cd/m2). Above 4 Hz, subjects overestimated the speed of the low-luminance grating. This overestimation increased as a function of temporal rate and reached 30% at around 10 Hz. The speed overestimation became significant once the lower luminance was 2.4 log units below the high-luminance comparison. Next, the role of motion smear in speed overestimation was examined. First, it was shown that the length of the perceived motion smear increased at low luminances. Second, the length of the visible smear was manipulated by changing the presentation time of the stimuli. Speed overestimation was reduced at shorter presentation times. Third, the speed of a blurred stimulus was compared to that of a stimulus with sharp edges, and the blurred stimulus was judged to move faster. These results indicate that the length of motion smear following a target contributes to its perceived speed and that this leads to speed overestimation at low luminance, where motion traces lengthen because of increased persistence. PMID:19146275

  8. A protein molecular weight map of ES2 clear cell ovarian carcinoma cells using a two-dimensional liquid separations/mass mapping technique.

    PubMed

    Wang, Haixing; Kachman, Maureen T; Schwartz, Donald R; Cho, Kathleen R; Lubman, David M

    2002-09-01

    A molecular weight map of the protein content of ES2 human clear cell ovarian carcinoma cells has been produced using a two-dimensional (2-D) liquid separations/mass mapping technique. This method uses a 2-D liquid separation of proteins from whole cell lysates coupled on-line to an electrospray ionization-time of flight (ESI-TOF) mass spectrometer to map the accurate intact molecular weight (M(r)) of the protein content of the cells. The two separation dimensions involve the use of liquid isoelectric focusing as the first phase and nonporous silica reversed-phase high-performance liquid chromatography (HPLC) as the second phase of separation. The detection by ESI-TOF-MS provides an image of pI versus M(r) analogous to 2-D gel electrophoresis. Each protein is then identified based upon matrix-assisted laser desorption/ionization (MALDI)-TOF-MS peptide mapping and intact M(r) so that a standard map is produced against which other ovarian carcinoma cell lines can be compared. The accurate intact M(r) together with the pI fraction, and peptide map serve to tag the protein for future interlysate comparisons. An internal standard is also used to provide a means for quantitation for future interlysate studies. In the ES2 cell line under study it is shown that nearly 900 M(r) bands are detected over 17 pI fractions from pH 4 to 12 and a M(r) range up to 85 kDa and that around 290 of these bands can be identified using mass spectrometric based techniques. The protein M(r) is detected within an accuracy of 150 ppm and it is shown that many of the proteins in this human cancer sample are modified compared to the database. The protein M(r) map may serve as a highly reproducible standard Web-based method for comparing proteins from related human cell lines.

  9. Some new luminance-gradient effects.

    PubMed

    Zavagno, D

    1999-01-01

    Three compelling luminance-gradient effects are described. The first effect concerns a brightness enhancement and a luminous mist spreading out from a central area having the same luminance as the white background and surrounded by four rectangular inducers shaded with a linear luminance gradient. The second effect is perceived with a photographically reversed configuration, and concerns what may be considered a brightness reduction or the enhancement of a darkness quality of a target area of the visual scene. The third effect concerns the perception of a self-luminous disk inside a somewhat foggy medium. The effects are worthy of further examination because they challenge current theories of luminosity perception and brightness perception in general.

  10. Perception of chromatic motion requires luminance interaction.

    PubMed

    Baraas, Rigmor C

    2005-01-01

    There is an ongoing debate related to whether chromatic motion perception arises as a consequence of a chromatic signal only (eg Wandell et al 1999 Neuron 24 901-909) or a signal that is essentially based on luminance processes (luminance artifacts) (Mullen et al 2003 Vision Research 43 1235-1247). These two views conform to the idea that colour and luminance processes are physiologically independent (Livingstone and Hubel 1988 Science 240 740-749), but according to other reports many primary cortical 'V1' cells respond to both colour and luminance contrast (eg Vidyasagar et al 2002 European Journal of Neuroscience 16 945-956). A psychophysical task was designed to test whether possible interaction between luminance and chromatic contrast could account for perception of chromatic motion. It is shown that subjects respond in a manner that reflects involvement of both processes.

  11. Soil zymography - A novel technique for mapping enzyme activity in the rhizosphere

    NASA Astrophysics Data System (ADS)

    Spohn, Marie

    2014-05-01

    The effect of plant roots on microbial activity in soil at the millimeter scale is poorly understood. One reason for this is that spatially explicit methods for the study of microbial activity in soil are limited. Here we present a quantitative in situ technique for mapping the distribution of exoenzymes in soil, along with some results on the effects of roots on exoenzyme activity in soil. In the first study we showed that both acid and alkaline phosphatase activity were up to 5.4-times larger in the rhizosphere of Lupinus albus than in the bulk soil. While acid phosphatase activity (produced by roots and microorganisms) was closely associated with roots, alkaline phosphatase activity (produced only by microorganisms) was more widely distributed, leading to a 2.5-times larger area of activity of alkaline than of acid phosphatase. These results indicate a spatial differentiation of different ecophysiological groups of organic phosphorus mineralizing organisms in the rhizosphere, which might alleviate a potential competition for phosphorus between them. In a second study, cellulase, chitinase and phosphatase activities were analyzed in the presence of living Lupinus polyphyllus roots and dead/dying roots (in the same soils 10, 20 and 30 days after cutting the L. polyphyllus shoots). The activity of all three enzymes was 9.0 to 13.9-times higher at the living roots compared to the bulk soil. Microhotspots of cellulase, chitinase and phosphatase activity in the soil were found up to 60 mm away from the living roots. Ten days after shoot cutting, the areas of high cellulase and phosphatase activity extended up to 55 mm away from the nearest root, while the extension of the area of chitinase activity did not change significantly. At the root, cellulase and chitinase activity increased first at the root tips after shoot cutting and showed maximal activity 20 days after shoot cutting. The number and activity of microhotspots of chitinase activity was maximal 10

  12. Field Evaluation of Two Geophysical Techniques for Real-Time Mapping of Smouldering Remediation (STAR)

    NASA Astrophysics Data System (ADS)

    Trento, L. M.; Tsourlos, P.; McMaster, M.; Liefl, D.; Sims, A.; Dominguez, J. L. G.; Vidumsky, J.; Gerhard, J.

    2016-12-01

    Self-sustaining Treatment for Active Remediation (STAR) technology destroys non-aqueous phase liquid (NAPL) in situ using principles of smouldering combustion. It involves propagating an exothermic (400-1000 °C) oxidation reaction outwards from an ignition well. A full-scale STAR system is currently being applied at an industrial site contaminated with coal tar below the water table in New Jersey, USA. STAR is typically tracked using multi-level thermocouples, which are discrete and sparse in space and time. This study evaluates two surface-based geophysical methods, Electrical Resistivity Tomography (ERT) and Self-Potential (SP), for the ability to map the STAR reaction in real time at the New Jersey site. Both techniques involve placing electrode arrays on the surface and monitoring electrical signals over time (i.e., time-lapse). It is hypothesized that ERT should be able to monitor the resistive dry zone that precedes the reaction front and/or the growing NAPL-depleted zone. SP is expected to be able to detect the potential difference associated with thermal gradients generated by the reaction. Approximately 72 ERT electrodes in a "Swiss cross" pattern plus 10 SP electrodes will be emplaced over a single STAR treatment cell (six ignition wells). This setup will be employed to monitor both deep (25 feet) and shallow (8 feet) STAR treatments. The geophysics will be complemented by in situ temperature measurements, continuous gas measurements, and pre- and post-treatment coring. The primary goal of this research is to evaluate the effectiveness of using ERT and SP for STAR under field conditions. The tests will be conducted in August 2016.

  13. Side-scan sonar mapping: Pseudo-real-time processing and mosaicking techniques

    SciTech Connect

    Danforth, W.W.; Schwab, W.C.; O'Brien, T.F.; Karl, H.

    1990-05-01

    The US Geological Survey (USGS) surveyed 1,000 km² of the continental shelf off San Francisco during a 17-day cruise, using a 120-kHz side-scan sonar system, and produced a digitally processed sonar mosaic of the survey area. The data were processed and mosaicked in real time using software developed at the Lamont-Doherty Geological Observatory and modified by the USGS, a substantial task due to the enormous amount of data produced by high-resolution side-scan systems. Approximately 33 megabytes of data were acquired every 1.5 hr. The real-time sonar images were displayed on a PC-based workstation and the data were transferred to a UNIX minicomputer where the sonar images were slant-range corrected, enhanced using an averaging method of desampling and a linear-contrast stretch, merged with navigation, geographically oriented at a user-selected scale, and finally output to a thermal printer. The hard-copy output was then used to construct a mosaic of the survey area. The final product of this technique is a UTM-projected map-mosaic of sea-floor backscatter variations, which could be used, for example, to locate appropriate sites for sediment sampling to ground truth the sonar imagery while still at sea. More importantly, reconnaissance surveys of this type allow for the analysis and interpretation of the mosaic during a cruise, thus greatly reducing the preparation time needed for planning follow-up studies of a particular area.
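
    A minimal sketch of two of the image-processing steps named above, desampling by block averaging and a linear contrast stretch to an 8-bit display range; this is a generic illustration with hypothetical array sizes, not the USGS/Lamont software.

        import numpy as np

        def block_average(img, factor):
            """Desample by averaging non-overlapping factor x factor blocks."""
            ny = (img.shape[0] // factor) * factor
            nx = (img.shape[1] // factor) * factor
            return img[:ny, :nx].reshape(ny // factor, factor,
                                         nx // factor, factor).mean(axis=(1, 3))

        def linear_stretch(img, lo_pct=2.0, hi_pct=98.0):
            """Map the [lo_pct, hi_pct] percentile range linearly onto 0-255."""
            lo, hi = np.percentile(img, [lo_pct, hi_pct])
            return np.clip((img - lo) / (hi - lo) * 255.0, 0, 255).astype(np.uint8)

        # Hypothetical backscatter swath:
        swath = np.random.default_rng(2).integers(0, 4096, size=(2048, 1024)).astype(float)
        display = linear_stretch(block_average(swath, 4))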

  14. Effectiveness of Mind Mapping Technique in Information Retrieval Among Medical College Students in Puducherry-A Pilot Study

    PubMed Central

    Kalyanasundaram, Madhanraj; Abraham, Sherin Billy; Ramachandran, Divija; Jayaseelan, Venkatachalam; Bazroy, Joy; Singh, Zile; Purty, Anil Jacob

    2017-01-01

    Background: Traditional teaching and learning methods involve a one-way process of knowledge transmission, leaving students lagging behind in creative abilities. Medical schools need to change their teaching strategies to keep the interest of students and empower them for future self-learning and critical thinking. Objective: To assess the impact of the mind mapping technique on information retrieval among medical college students in Puducherry. Methods: A pilot study was conducted using an experimental study design among sixth semester MBBS students (n = 64) at a medical college in Puducherry, India. One group (n = 32) followed the text reading method and another group (n = 32) followed the mind mapping technique to learn the same passage given to them. Knowledge about the topic was assessed using a pre-designed questionnaire at baseline, day 0 and day 7. The knowledge gain is the primary outcome variable and is compared between the two groups. Feedback regarding the teaching methods was obtained from the participants. Results: The mean knowledge score in the text group was lower than in the mind map group at baseline (2.6 vs 3.5; p = 0.08). On Day 0, the mean score in the text group was slightly lower than in the mind map group (8.7 vs 9.0; p = 0.26). On Day 7, the mean score in the mind map group was significantly higher than in the text group (8.9 vs 8.5; p = 0.03). Conclusion: The mind mapping technique is an innovative and effective method for remembering information, better than the routine way of reading texts. PMID:28331249

  15. Effectiveness of Mind Mapping Technique in Information Retrieval Among Medical College Students in Puducherry-A Pilot Study.

    PubMed

    Kalyanasundaram, Madhanraj; Abraham, Sherin Billy; Ramachandran, Divija; Jayaseelan, Venkatachalam; Bazroy, Joy; Singh, Zile; Purty, Anil Jacob

    2017-01-01

    Traditional teaching and learning methods involve a one-way process of knowledge transmission, leaving students lagging behind in creative abilities. Medical schools need to change their teaching strategies to keep the interest of students and empower them for future self-learning and critical thinking. The aim was to assess the impact of the mind mapping technique on information retrieval among medical college students in Puducherry. A pilot study was conducted using an experimental study design among sixth semester MBBS students (n = 64) at a medical college in Puducherry, India. One group (n = 32) followed the text reading method and another group (n = 32) followed the mind mapping technique to learn the same passage given to them. Knowledge about the topic was assessed using a pre-designed questionnaire at baseline, day 0 and day 7. The knowledge gain is the primary outcome variable and is compared between the two groups. Feedback regarding the teaching methods was obtained from the participants. The mean knowledge score in the text group was lower than in the mind map group at baseline (2.6 vs 3.5; p = 0.08). On Day 0, the mean score in the text group was slightly lower than in the mind map group (8.7 vs 9.0; p = 0.26). On Day 7, the mean score in the mind map group was significantly higher than in the text group (8.9 vs 8.5; p = 0.03). The mind mapping technique is an innovative and effective method for remembering information, better than the routine way of reading texts.

  16. A regional ionospheric TEC mapping technique over China and adjacent areas on the basis of data assimilation

    NASA Astrophysics Data System (ADS)

    Aa, Ercha; Huang, Wengeng; Yu, Shimei; Liu, Siqing; Shi, Liqin; Gong, Jiancun; Chen, Yanhong; Shen, Hua

    2015-06-01

    In this paper, a regional total electron content (TEC) mapping technique over China and adjacent areas (70°E-140°E and 15°N-55°N) is developed on the basis of a Kalman filter data assimilation scheme driven by Global Navigation Satellite Systems (GNSS) data from the Crustal Movement Observation Network of China and International GNSS Service. The regional TEC maps can be generated accordingly with the spatial and temporal resolution being 1°×1° and 5 min, respectively. The accuracy and quality of the TEC mapping technique have been validated through the comparison with GNSS observations, the International Reference Ionosphere model values, the global ionosphere maps from Center for Orbit Determination of Europe, and the Massachusetts Institute of Technology Automated Processing of GPS TEC data from Madrigal database. The verification results indicate that great systematic improvements can be obtained when data are assimilated into the background model, which demonstrates the effectiveness of this technique in providing accurate regional specification of the ionospheric TEC over China and adjacent areas.
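
    A minimal sketch of a single Kalman analysis step of the kind underlying such an assimilation scheme: a background TEC grid (flattened to a state vector) is blended with a few TEC observations through the standard update equations. The state size, covariances, and observation values are hypothetical, and this is not the paper's assimilation system.

        import numpy as np

        def kalman_update(x_b, P_b, y, H, R):
            """x_b: background state (n,); P_b: background covariance (n, n);
            y: observations (m,); H: observation operator (m, n); R: obs covariance."""
            S = H @ P_b @ H.T + R                      # innovation covariance
            K = P_b @ H.T @ np.linalg.inv(S)           # Kalman gain
            x_a = x_b + K @ (y - H @ x_b)              # analysis state
            P_a = (np.eye(len(x_b)) - K @ H) @ P_b     # analysis covariance
            return x_a, P_a

        # Hypothetical 5 x 5 grid flattened to 25 states, two vertical TEC observations:
        n = 25
        x_b = np.full(n, 20.0)                         # background TEC (TECU)
        P_b = np.eye(n) * 9.0                          # background error variance
        H = np.zeros((2, n)); H[0, 3] = 1.0; H[1, 17] = 1.0
        y = np.array([26.0, 15.0])                     # observed TEC (TECU)
        R = np.eye(2) * 1.0                            # observation error variance
        x_a, P_a = kalman_update(x_b, P_b, y, H, R)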

  17. Brain tumors in eloquent areas: A European multicenter survey of intraoperative mapping techniques, intraoperative seizures occurrence, and antiepileptic drug prophylaxis.

    PubMed

    Spena, Giannantonio; Schucht, Philippe; Seidel, Kathleen; Rutten, Geert-Jan; Freyschlag, Christian Franz; D'Agata, Federico; Costi, Emanule; Zappa, Francesca; Fontanella, Marco; Fontaine, Denys; Almairac, Fabien; Cavallo, Michele; De Bonis, Pasquale; Conesa, Gerardo; Foroglou, Nicholas; Gil-Robles, Santiago; Mandonnet, Emanuel; Martino, Juan; Picht, Thomas; Viegas, Catarina; Wager, Michel; Pallud, Johan

    2017-04-01

    Intraoperative mapping and monitoring techniques for eloquent area tumors are routinely used worldwide. Very few data are available regarding mapping and monitoring methods and preferences, intraoperative seizure occurrence, and perioperative antiepileptic drug management. A questionnaire was sent to 20 European centers with experience in intraoperative mapping or neurophysiological monitoring for the treatment of eloquent area tumors. Fifteen centers returned the completed questionnaires. Data were available on 2098 patients. 863 patients (41.1%) were operated on through awake surgery and intraoperative mapping, while 1235 patients (58.8%) received asleep surgery and intraoperative electrophysiological monitoring or mapping. There was great heterogeneity between centers, with some totally awake-surgery oriented (up to 100%) and others almost totally asleep-surgery oriented (up to 92%) (31% SD). For awake surgery, 79.9% of the centers preferred an asleep-awake-asleep anesthesia protocol. Only 53.3% of the centers used ECoG or transcutaneous EEG. The incidence of intraoperative seizures varied significantly between centers, ranging from 2.5% to 54% (p < 0.001). There appears to be a statistically significant link between the mastery of the mapping technique and the risk of intraoperative seizures. Moreover, a history of preoperative seizures can significantly increase the risk of intraoperative seizures (p < 0.001). Intraoperative seizure occurrence was similar in patients with or without perioperative antiepileptic drugs (12% vs. 12%, p = 0.2). This is the first European survey to assess intraoperative functional mapping and monitoring protocols and the management of peri- and intraoperative seizures. These data can help identify specific aspects that need to be investigated in prospective and controlled studies.

  18. Enhancing Local Climate Projections of Precipitation: Assets and Limitations of Quantile Mapping Techniques for Statistical Downscaling

    NASA Astrophysics Data System (ADS)

    Ivanov, Martin; Kotlarski, Sven; Schär, Christoph

    2015-04-01

    The Swiss CH2011 scenarios provide a portfolio of climate change scenarios for the region of Switzerland, specifically tailored for use in climate impact research. Although widely applied by a variety of end-users, these scenarios are subject to several limitations related to the underlying delta change methodology. Examples are difficulties to appropriately account for changes in the spatio-temporal variability of meteorological fields and for changes in extreme events. The recently launched ELAPSE project (Enhancing local and regional climate change projections for Switzerland) is connected to the EU COST Action VALUE (www.value-cost.eu) and aims at complementing CH2011 by further scenario products, including a bias-corrected version of daily scenarios at the site scale. For this purpose the well-established empirical quantile mapping (QM) methodology is employed. Here, daily temperature and precipitation output of 15 GCM-RCM model chains of the ENSEMBLES project is downscaled and bias-corrected to match observations at weather stations in Switzerland. We consider established QM techniques based on all empirical quantiles or linear interpolation between the empirical percentiles. In an attempt to improve the downscaling of extreme precipitation events, we also apply a parametric approximation of the daily precipitation distribution by a dynamically weighted mixture of a Gamma distribution for the bulk and a Pareto distribution for the right tail for the first time in the context of QM. All techniques are evaluated and intercompared in a cross-validation framework. The statistical downscaling substantially improves virtually all considered distributional and temporal characteristics as well as their spatial distribution. The empirical methods have in general very similar performances. The parametric method does not show an improvement over the empirical ones. Critical sites and seasons are highlighted and discussed. Special emphasis is placed on investigating the
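
    A minimal sketch of the basic empirical quantile-mapping step referred to above: model values are passed through a transfer function built from matching empirical percentiles of the model and the observations, with linear interpolation between percentiles. The distributions and sample sizes below are hypothetical, and this is not the ELAPSE implementation.

        import numpy as np

        def quantile_map(model_hist, obs_hist, model_values, n_quantiles=99):
            """Empirical QM: map model values onto the observed distribution."""
            p = np.linspace(1.0, 99.0, n_quantiles)
            q_mod = np.percentile(model_hist, p)       # model percentiles
            q_obs = np.percentile(obs_hist, p)         # observed percentiles
            # values outside the calibration range are clamped to the end points
            return np.interp(model_values, q_mod, q_obs)

        rng = np.random.default_rng(3)
        obs = rng.gamma(shape=0.8, scale=6.0, size=5000)   # station precipitation (mm/day)
        mod = rng.gamma(shape=0.8, scale=4.0, size=5000)   # biased RCM precipitation
        corrected = quantile_map(mod, obs, mod)            # bias-corrected series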

  19. Reducing surgical levels by paraspinal mapping and diffusion tensor imaging techniques in lumbar spinal stenosis.

    PubMed

    Chen, Hua-Biao; Wan, Qi; Xu, Qi-Feng; Chen, Yi; Bai, Bo

    2016-04-25

    Correlating symptoms and physical examination findings with surgical levels based on common imaging results is not reliable. In patients who have no concordance between radiological findings and clinical symptoms, the surgical levels determined by conventional magnetic resonance imaging (MRI) and neurogenic examination (NE) may lead to more extensive surgery and significant complications. We aimed to confirm whether the use of diffusion tensor imaging (DTI) and paraspinal mapping (PM) techniques can further prevent the false positives that occur with conventional MRI, distinguish clinically relevant levels among the cauda equina and/or nerve root lesions seen on MRI, and determine and reduce the decompression levels for lumbar spinal stenosis compared with MRI + NE, while ensuring or improving surgical outcomes. We compared the data between patients who underwent MRI + (PM or DTI) and patients who underwent conventional MRI + NE to determine levels of decompression for the treatment of lumbar spinal stenosis. Outcome measures were assessed at 2 weeks, 3 months, 6 months, and 12 months postoperatively. One hundred fourteen patients (59 in the control group, 54 in the experimental group) underwent decompression. The levels of decompression determined by MRI + (PM or DTI) in the experimental group were significantly fewer than those determined by MRI + NE in the control group (p = 0.000). The surgical time, blood loss, and surgical transfusion were significantly less in the experimental group (p = 0.001, p = 0.011, p = 0.001, respectively). There were no differences in improvement of the visual analog scale back and leg pain (VAS-BP, VAS-LP) scores and Oswestry Disability Index (ODI) scores at 2 weeks, 3 months, 6 months, and 12 months after operation between the experimental and control groups. MRI + (PM or DTI) showed clear benefits in determining decompression levels of lumbar spinal stenosis compared with MRI + NE. In patients with lumbar spinal

  20. The Art of Teaching Map and Compass: Instructional Techniques, Curricular Formats and Practical Field Exercises.

    ERIC Educational Resources Information Center

    Watters, Ron

    This paper discusses the value of teaching map and compass skills in the classroom or an outdoor situation. Navigation is the most basic of all outdoor skills. A map and compass curriculum can be taught to anyone, is inexpensive, and is easily incorporated in a variety of educational situations. General teaching principles are outlined: (1) start…

  1. Wood transportation systems-a spin-off of a computerized information and mapping technique

    Treesearch

    William W. Phillips; Thomas J. Corcoran

    1978-01-01

    A computerized mapping system originally developed for planning the control of the spruce budworm in Maine has been extended into a tool for planning road network development and optimizing transportation costs. A budgetary process and a mathematical linear programming routine are used interactively with the mapping and information retrieval capabilities of the system...

  2. Investigating the Use of ICT-Based Concept Mapping Techniques on Creativity in Literacy Tasks

    ERIC Educational Resources Information Center

    Riley, Nigel R.; Ahlberg, Mauri

    2004-01-01

    The key research question in this small-scale study focuses on the effects that an ICT (information and communications technologies)-based concept mapping intervention has on creativity and writing achievement in 10-11-year-old primary age pupils. The data shows that pupils using a concept mapping intervention significantly improve their NFER…

  3. Applications of remote sensing techniques to county land use and flood hazard mapping

    NASA Technical Reports Server (NTRS)

    Clark, R. B.; Conn, J. S.; Miller, D. A.; Mouat, D. A.

    1975-01-01

    The application of remote sensing in Arizona is discussed. Land use and flood hazard mapping completed by the Applied Remote Sensing Program is described. Areas subject to periodic flood inundation are delineated and land use maps monitoring the growth within specific counties are provided.

  4. Facilitating the exploitation of ERTS-1 imagery using snow enhancement techniques. [geological fault maps of Massachusetts and Connecticut

    NASA Technical Reports Server (NTRS)

    Wobber, F. J. (Principal Investigator); Martin, K. R.; Amato, R. V.; Leshendok, T.

    1973-01-01

    The author has identified the following significant results. The application of ERTS-1 imagery to geological fracture mapping, regardless of season, has been repeatedly confirmed. The enhancement provided by a differential cover of snow increases the number and length of fracture-lineaments which can be detected with ERTS-1 data and accelerates the fracture mapping process for a variety of practical applications. The geological mapping benefits of the program will be realized in geographic areas where data are most needed - complex glaciated terrain and areas of deep residual soils. ERTS-1 derived fracture-lineament maps which provide detail well in excess of existing geological maps are not available in the Massachusetts-Connecticut area. The large quantity of new data provided by ERTS-1 may accelerate and improve field mapping now in progress in the area. Numerous other user groups have requested data on the techniques. This represents a major change in operating philosophy for groups who to date judged that snow obscured geological detail.

  5. Moderately luminous Type II supernovae

    NASA Astrophysics Data System (ADS)

    Inserra, C.; Pastorello, A.; Turatto, M.; Pumo, M. L.; Benetti, S.; Cappellaro, E.; Botticella, M. T.; Bufano, F.; Elias-Rosa, N.; Harutyunyan, A.; Taubenberger, S.; Valenti, S.; Zampieri, L.

    2013-07-01

    Context. Core-collapse supernovae (CC-SNe) descend from progenitors more massive than about 8 M⊙. Because of the young age of the progenitors, the ejecta may eventually interact with the circumstellar medium (CSM) via highly energetic processes detectable in the radio, X-ray, ultraviolet (UV) and, sometimes, in the optical domains. Aims: In this paper we present ultraviolet, optical and near infrared observations of five Type II SNe, namely SNe 2009dd, 2007pk, 2010aj, 1995ad, and 1996W. Together with a few other SNe, they form a group of moderately luminous Type II events. We investigate the photometric similarities and differences among these bright objects. We also attempt to characterise them by analysing the spectral evolutions, in order to find some traces of CSM-ejecta interaction. Methods: We collected photometry and spectroscopy with several telescopes in order to construct well-sampled light curves and spectral evolutions from the photospheric to the nebular phases. Both photometry and spectroscopy indicate a degree of heterogeneity in this sample. Modelling the data of SNe 2009dd, 2010aj and 1995ad allows us to constrain the explosion parameters and the properties of the progenitor stars. Results: The light curves have luminous peak magnitudes (-16.95 < MB < -18.70). The ejected masses of 56Ni for three SNe span a wide range of values (2.8 × 10^-2 M⊙ < M(56Ni) < 1.4 × 10^-1 M⊙), while for a fourth (SN 2010aj) we could determine a stringent upper limit (7 × 10^-3 M⊙). Clues of interaction, such as the presence of high velocity (HV) features of the Balmer lines, are visible in the photospheric spectra of SNe 2009dd and 1996W. For SN 2007pk we observe a spectral transition from a Type IIn to a standard Type II SN. Modelling the observations of SNe 2009dd, 2010aj and 1995ad with radiation hydrodynamics codes, we infer kinetic plus thermal energies of about 0.2-0.5 foe, initial radii of 2-5 × 10^13 cm and ejected masses of ~5.0-9.5 M⊙. Conclusions: These

  6. Comparison of manually produced and automated cross country movement maps using digital image processing techniques

    NASA Technical Reports Server (NTRS)

    Wynn, L. K.

    1985-01-01

    The Image-Based Information System (IBIS) was used to automate the cross country movement (CCM) mapping model developed by the Defense Mapping Agency (DMA). Existing terrain factor overlays and a CCM map, produced by DMA for the Fort Lewis, Washington area, were digitized and reformatted into geometrically registered images. Terrain factor data from Slope, Soils, and Vegetation overlays were entered into IBIS, and were then combined utilizing IBIS-programmed equations to implement the DMA CCM model. The resulting IBIS-generated CCM map was then compared with the digitized manually produced map to test similarity. The numbers of pixels comprising each CCM region were compared between the two map images, and percent agreement between each pair of regional counts was computed. The mean percent agreement equalled 86.21%, with an areally weighted standard deviation of 11.11%. Calculation of Pearson's correlation coefficient yielded +0.997. In some cases, the IBIS-calculated map code differed from the DMA codes: analysis revealed that IBIS had calculated the codes correctly. These highly positive results demonstrate the power and accuracy of IBIS in automating models which synthesize a variety of thematic geographic data.
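
    A minimal sketch of the comparison described above, assuming two co-registered, categorically coded CCM rasters: per-class pixel counts are tallied in each map, percent agreement is computed per class, and the regional counts are correlated. The class codes and array sizes are hypothetical, and IBIS itself is not reproduced here.

        import numpy as np

        def regional_agreement(map_a, map_b):
            """Per-class pixel counts and percent agreement of two class rasters."""
            classes = np.union1d(np.unique(map_a), np.unique(map_b))
            counts_a = np.array([(map_a == c).sum() for c in classes])
            counts_b = np.array([(map_b == c).sum() for c in classes])
            pct = 100.0 * np.minimum(counts_a, counts_b) / np.maximum(counts_a, counts_b)
            return classes, counts_a, counts_b, pct

        rng = np.random.default_rng(4)
        manual = rng.integers(1, 6, size=(200, 200))       # manually produced map
        automated = manual.copy()                          # automated map with ~10% noise
        automated[rng.random((200, 200)) < 0.1] = 3
        classes, ca, cb, pct = regional_agreement(manual, automated)
        print(pct.mean(), np.corrcoef(ca, cb)[0, 1])       # mean agreement, Pearson r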

  7. Analysis of terrain map matching using multisensing techniques for applications to autonomous vehicle navigation

    NASA Technical Reports Server (NTRS)

    Page, Lance; Shen, C. N.

    1991-01-01

    This paper describes skyline-based terrain matching, a new method for locating the vantage point of laser range-finding measurements on a global map previously prepared by satellite or aerial mapping. Skylines can be extracted from the range-finding measurements and modelled from the global map, and are represented in parametric, cylindrical form with azimuth angle as the independent variable. The three translational parameters of the vantage point are determined with a three-dimensional matching of these two sets of skylines.
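
    A minimal sketch of one ingredient of such an approach: modelling the skyline from a gridded elevation map as the maximum elevation angle seen along each azimuth from a candidate vantage point, i.e. the parametric cylindrical representation mentioned above. Matching would then compare this curve with the skyline extracted from the range-finder data; the DEM, cell size, and vantage point below are hypothetical, and this is not the authors' algorithm.

        import numpy as np

        def model_skyline(dem, cell, x0, y0, z0, n_azimuth=360, max_range=2000.0):
            """dem: (ny, nx) elevations; cell: grid spacing (m); (x0, y0, z0): vantage
            point; returns azimuths and skyline elevation angles (radians)."""
            az = np.linspace(0.0, 2.0 * np.pi, n_azimuth, endpoint=False)
            ranges = np.arange(cell, max_range, cell)
            skyline = np.empty(n_azimuth)
            for i, a in enumerate(az):
                xs = x0 + ranges * np.cos(a)            # x maps to columns (assumption)
                ys = y0 + ranges * np.sin(a)            # y maps to rows (assumption)
                cols = np.clip((xs / cell).astype(int), 0, dem.shape[1] - 1)
                rows = np.clip((ys / cell).astype(int), 0, dem.shape[0] - 1)
                skyline[i] = np.arctan2(dem[rows, cols] - z0, ranges).max()
            return az, skyline

        dem = np.abs(np.random.default_rng(6).normal(50.0, 20.0, (256, 256)))
        az, sky = model_skyline(dem, cell=10.0, x0=1280.0, y0=1280.0, z0=60.0)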

  8. Analysis of terrain map matching using multisensing techniques for applications to autonomous vehicle navigation

    NASA Technical Reports Server (NTRS)

    Page, Lance; Shen, C. N.

    1991-01-01

    This paper describes skyline-based terrain matching, a new method for locating the vantage point of laser range-finding measurements on a global map previously prepared by satellite or aerial mapping. Skylines can be extracted from the range-finding measurements and modelled from the global map, and are represented in parametric, cylindrical form with azimuth angle as the independent variable. The three translational parameters of the vantage point are determined with a three-dimensional matching of these two sets of skylines.

  9. Dust near luminous ultraviolet stars

    NASA Astrophysics Data System (ADS)

    Henry, Richard C.

    1992-03-01

    More than 700 luminous stars in the infrared astronomical satellite (IRAS) Skyflux plates were examined for the presence of dust heated by a nearby star. This dust may be distinguished from the ubiquitous cool cirrus by its higher temperature and thus enhanced 60 micron emission. More than 120 dust clouds were found around only 106 of the stars with a volume filling factor of 0.006 and an intercloud separation of 46 pc. A region of dust smoothly distributed through the volume of space heated by the star could not be found and hence an upper limit of 0.05 cm-3 is placed on the equivalent gas density in the intercloud regions. The clouds have an average density of 0.22 cm-3 and a radius of 1.9 pc, albeit with wide variations in their properties. Two different scale heights of 140 and 540 pc were found. This was interpreted as evidence for different distributions of dust in and out of the galactic disk.

  10. Dust near luminous ultraviolet stars

    NASA Technical Reports Server (NTRS)

    Henry, Richard C.

    1992-01-01

    More than 700 luminous stars in the infrared astronomical satellite (IRAS) Skyflux plates were examined for the presence of dust heated by a nearby star. This dust may be distinguished from the ubiquitous cool cirrus by its higher temperature and thus enhanced 60 micron emission. More than 120 dust clouds were found around only 106 of the stars with a volume filling factor of 0.006 and an intercloud separation of 46 pc. A region of dust smoothly distributed through the volume of space heated by the star could not be found and hence an upper limit of 0.05 cm(exp -3) is placed on the equivalent gas density in the intercloud regions. The clouds have an average density of 0.22 cm(exp -3) and a radius of 1.9 pc, albeit with wide variations in their properties. Two different scale heights of 140 and 540 pc were found. This was interpreted as evidence for different distributions of dust in and out of the galactic disk.

  11. Dust near luminous ultraviolet stars

    NASA Technical Reports Server (NTRS)

    Henry, Richard C.

    1993-01-01

    This report describes research activities related to the Infrared Astronomical Satellite (IRAS) sky survey. About 745 luminous stars were examined for the presence of interstellar dust heated by a nearby star. The 'cirrus' discovered by IRAS is thermal radiation from interstellar dust at moderate and high galactic latitudes. IRAS locates the dust, which must (at some level) scatter ultraviolet starlight. Although it was expected that thermal emission would be found around virtually every star, most stars showed no detectable emission, and the emission that was found is not uniform. The stars are not embedded in a smooth interstellar medium; rather, what is found are discrete clouds heated by starlight. An exception is the dearth of clouds near the very hottest stars, implying that these stars play an active role in destroying or substantially modifying the dust clouds over time. The other possibility is simply that the hottest stars are located in regions lacking in dust, which is counter-intuitive. A bibliography of related journal articles is attached.

  12. Field methods and data processing techniques associated with mapped inventory plots

    Treesearch

    William A. Bechtold; Stanley J. Zarnoch

    1999-01-01

    The U.S. Forest Inventory and Analysis (FIA) and Forest Health Monitoring (FHM) programs utilize a fixed-area mapped-plot design as the national standard for extensive forest inventories. The mapped-plot design is explained, as well as the rationale for its selection as the national standard. Ratio-of-means estimators are presented as a method to process data from...

  13. Reconstructions Using RIF in Motion Mapping Technique Have Substantially Less Arrhythmogenic Artifacts in Dual-source Coronary CTA.

    PubMed

    Reinartz, Sebastian Daniel; Winkler, Markus; Diefenbach, Sascha B; Allmendinger, Thomas; Penzkofer, Tobias; Kuhl, Christiane K; Mahnken, Andreas H

    2017-02-01

    Particularly for patients with heart arrhythmias, conventional BestSystole (BS) and BestDiastole (BD) reconstruction techniques in computed tomography (CT) frequently show artifacts that hinder the readability of the coronary tree. To address this problem, this paper presents an alternative reconstruction method that combines the technique "reconstructions with identical filling" (RIF) with motion mapping; the combined technique is called "RIF in motion mapping" (RIMM). This study compares the diagnostic quality of images generated with RIMM to that of the other reconstruction techniques. CT datasets of 23 patients with suspected coronary artery disease or prior to transcatheter aortic valve replacement, all showing major artifacts in the standard reconstructions, were selected manually. Each dataset was evaluated with four reconstruction techniques: BS, BD, RIF, and RIMM. Two radiologists, blinded to the applied reconstruction type, then evaluated the entire coronary tree of each sample using the 15-segment American Heart Association model and a six-grade Likert scale. Of the 345 analyzed coronary segments, the RIMM technique yielded significantly more images of reliable diagnostic quality (n = 228, 66%) than RIF (P = 0.002) and the BS/BD reconstructions (P < 0.001). Per coronary segment, vessel, and patient, RIMM scored significantly better than the conventional BS/BD reconstructions (P = 0.003) and better than RIF with regard to the right coronary artery (P = 0.041). Using RIMM on the worst CT images substantially reduced these artifacts, enabling the radiologists to clearly visualize the affected segments. Because RIMM considerably reduces artifacts, this new CT reconstruction technique can support a fast, reliable evaluation of a patient's coronary tree and thus avoids the need for further invasive diagnostic

  14. Thermal mapping of mountain slopes on Mars by application of a Differential Apparent Thermal Inertia technique

    NASA Astrophysics Data System (ADS)

    Kubiak, Marta; Mège, Daniel; Gurgurewicz, Joanna; Ciazela, Jakub

    2015-04-01

    Thermal inertia (P) is an important property of geologic surfaces that essentially describes the resistance to temperature (T) change as heat is added. Most remote sensing data describe the surface only. P is a volume property that is sensitive to the composition of the subsurface, down to a depth reached by the diurnal heating wave. As direct measurement of P is not possible on Mars, thermal inertia models (Fergason et al., 2006) and deductive methods (Apparent Thermal Inertia, ATI, and Differential Apparent Thermal Inertia, DATI) are used to estimate it. ATI is computed as (1 - A) / (Tday - Tnight), where A is albedo. Due to the lack of thermal daytime images at maximum land surface temperature (LST) and nighttime images at minimum LST in the Valles Marineris region, the ATI method is difficult to apply. Instead, we have explored the DATI technique (Sabol et al., 2006). DATI is calculated over shorter time (t) intervals with a high |ΔT/Δt| gradient (in the morning or in the afternoon) and is proportional to the day/night temperature difference (ATI), and hence to P. Mars, which exhibits exceptionally high |ΔT/Δt| gradients owing to the absence of vegetation and its thin atmosphere, is especially suitable for the DATI approach. Here we present a new deductive method for high-resolution DATI mapping of areas of highly contrasted relief (e.g., Valles Marineris). In contrast to the thermal inertia models, our method takes local relief characteristics (slopes and aspects) into account. This is crucial because topography strongly influences the A and ΔT measurements. In spite of the different approach, DATI values in flat areas are in the same range as the values obtained by Fergason et al. (2006); they provide, however, more accurate information for geological interpretations of hilly or mountainous terrains. Sabol, D. E., Gillespie, A. R., McDonald, E., and Danilina, I., 2006. Differential Thermal Inertia of Geological Surfaces. In
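
    The arithmetic behind the ATI and DATI estimates is compact enough to sketch. The snippet below is a minimal illustration assuming co-registered albedo and brightness-temperature arrays are already in hand; the array names, values, and the two-hour interval are placeholders, and the actual DATI mapping additionally corrects for slope and aspect as described above.

      import numpy as np

      def apparent_thermal_inertia(albedo, t_day, t_night):
          """ATI = (1 - A) / (Tday - Tnight), as defined in the abstract."""
          return (1.0 - albedo) / (t_day - t_night)

      def differential_apparent_thermal_inertia(albedo, t_early, t_late, dt_hours):
          """DATI uses a shorter interval with a steep |dT/dt| gradient
          (e.g. two afternoon observations dt_hours apart); it is
          proportional to ATI and hence to thermal inertia P."""
          return (1.0 - albedo) / np.abs((t_late - t_early) / dt_hours)

      # Placeholder 2x2 scene: albedo and surface temperatures in kelvin.
      A = np.array([[0.15, 0.20], [0.25, 0.30]])
      T_1 = np.array([[260.0, 255.0], [250.0, 245.0]])   # earlier afternoon pass
      T_2 = np.array([[250.0, 247.0], [244.0, 240.0]])   # later afternoon pass

      print(differential_apparent_thermal_inertia(A, T_1, T_2, dt_hours=2.0))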

  15. The Most Luminous Galaxies Found by WISE

    NASA Astrophysics Data System (ADS)

    Eisenhardt, Peter

    2012-08-01

    We have used photometry from NASA's Wide-field Infrared Survey Explorer (WISE) to select an all-sky sample of objects which are extremely luminous. Herschel far-IR follow-up observations of these sources are underway. We find most are brighter than 10 trillion solar luminosities, and about 10% exceed 100 trillion solar luminosities. We request one night with LRIS-ADC to obtain redshifts for 20 candidate WISE Ultra-Luminous IR Galaxies which are targets of the Herschel program. The redshifts are essential to one of the two primary science objectives for WISE: to identify the most luminous galaxies in the Universe.

  16. Using mind mapping techniques for rapid qualitative data analysis in public participation processes.

    PubMed

    Burgess-Allen, Jilla; Owen-Smith, Vicci

    2010-12-01

    In a health service environment where timescales for patient participation in service design are short and resources scarce, a balance needs to be achieved between research rigour and the timeliness and utility of the findings of patient participation processes. To develop a pragmatic mind mapping approach to managing the qualitative data from patient participation processes. While this article draws on experience of using mind maps in a variety of participation processes, a single example is used to illustrate the approach. In this example mind maps were created during the course of patient participation focus groups. Two group discussions were also transcribed verbatim to allow comparison of the rapid mind mapping approach with traditional thematic analysis of qualitative data. The illustrative example formed part of a local alcohol service review which included consultation with local alcohol service users, their families and staff groups. The mind mapping approach provided a pleasing graphical format for representing the key themes raised during the focus groups. It helped stimulate and galvanize discussion and keep it on track, enhanced transparency and group ownership of the data analysis process, allowed a rapid dynamic between data collection and feedback, and was considerably faster than traditional methods for the analysis of focus groups, while resulting in similar broad themes. This study suggests that the use of a mind mapping approach to managing qualitative data can provide a pragmatic resolution of the tension between limited resources and quality in patient participation processes. © 2010 The Authors. Health Expectations © 2010 Blackwell Publishing Ltd.

  17. A hydrogeologic map of the Death Valley region, Nevada, and California, developed using GIS techniques

    SciTech Connect

    Faunt, C.C.; D'Agnese, F.A.; Turner, A.K.

    1997-12-31

    In support of Yucca Mountain site characterization studies, a hydrogeologic framework was developed, and a hydrogeologic map was constructed for the Death Valley region. The region, covering approximately 100,000 km² along the Nevada-California border near Las Vegas, is characterized by isolated mountain ranges juxtaposed against broad, alluvium-filled valleys. Geologic conditions are typical of the Basin and Range Province; a variety of sedimentary and igneous intrusive and extrusive rocks have been subjected to both compressional and extensional deformation. The regional ground-water flow system can best be described as a series of connected intermontane basins in which ground-water flow occurs in basin-fill deposits, carbonate rocks, clastic rocks, and volcanic rocks. Previous investigations have developed more site-specific hydrogeologic relationships; however, few have described all the lithologies within the Death Valley regional ground-water flow system. Information required to characterize the hydrogeologic units in the region was obtained from regional geologic maps and reports. Map data were digitized from regional geologic maps and combined into a composite map using a geographic information system. This map was simplified to show 10 laterally extensive hydrogeologic units with distinct hydrologic properties. The hydraulic conductivity values for the hydrogeologic units range over 15 orders of magnitude due to the variability in burial depth and degree of fracturing.

  18. A Hydrogeologic Map of the Death Valley Region, Nevada and California, Developed Using GIS Techniques

    USGS Publications Warehouse

    Faunt, Claudia C.; D'Agnese, Frank A.; Turner, A. Keith

    1997-01-01

    In support of Yucca Mountain site characterization studies, a hydrogeologic framework was developed, and a hydrogeologic map was constructed for the Death Valley region. The region, covering approximately 100,000 km² along the Nevada-California border near Las Vegas, is characterized by isolated mountain ranges juxtaposed against broad, alluvium-filled valleys. Geologic conditions are typical of the Basin and Range Province; a variety of sedimentary and igneous intrusive and extrusive rocks have been subjected to both compressional and extensional deformation. The regional ground-water flow system can best be described as a series of connected intermontane basins in which ground-water flow occurs in basin-fill deposits, carbonate rocks, clastic rocks, and volcanic rocks. Previous investigations have developed more site-specific hydrogeologic relationships; however, few have described all the lithologies within the Death Valley regional ground-water flow system. Information required to characterize the hydrogeologic units in the region was obtained from regional geologic maps and reports. Map data were digitized from regional geologic maps and combined into a composite map using a geographic information system. This map was simplified to show 10 laterally extensive hydrogeologic units with distinct hydrologic properties. The hydraulic conductivity values for the hydrogeologic units range over 15 orders of magnitude due to the variability in burial depth and degree of fracturing.

  19. Using a Concept Mapping Tool with a Photograph Association Technique (CoMPAT) to Elicit Children's Ideas about Microbial Activity

    NASA Astrophysics Data System (ADS)

    Byrne, Jenny; Grace, Marcus

    2010-03-01

    Concept mapping is a technique used to provide a visual representation of an individual's ideas about a concept or set of related concepts. This paper describes a concept mapping tool using a photograph association technique (CoMPAT) that is considered to be a novel way of eliciting children's ideas. What children at 11 years of age know about particular concepts related to microbial activity is presented and discussed in relation to the effectiveness of CoMPAT as a tool to draw out their ideas. It is proposed that this tool could be used to explore ideas about this and other science concepts from children in other age groups, and where language may be a barrier to participation.

  20. An iso-integral mapping technique using magnetocardiogram, and its possible use for diagnosis of ischemic heart disease.

    PubMed

    Tsukada, K; Miyashita, T; Kandori, A; Mitsui, T; Terada, Y; Sato, M; Shiono, J; Horigome, H; Yamada, S; Yamaguchi, I

    2000-02-01

    We have developed an iso-integral mapping technique that uses magnetocardiogram (MCG) data to obtain a map of the total current image from the heart projected onto the torso. We have also investigated the applicability of iso-integral mapping to the diagnosis of ischemic heart disease. We simulated and measured the characteristics of two types of iso-integral maps: one using the tangential (Bxy) components and one using the normal component (Bz). Each vector component was measured by two types of superconducting quantum interference device (SQUID) systems to determine the tangential and normal components. The tangential component of the magnetic field appeared to be equivalent to the current image in the myocardium projected onto the observing plane, and we were able to obtain a projected total current image by integrating the tangential components during the depolarization and repolarization processes. We found that the iso-integral maps of normal hearts showed similar patterns in both processes, whereas those of ischemic hearts showed different patterns.
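
    The iso-integral computation itself can be sketched in a few lines: for each sensor, the tangential field magnitude is integrated over a chosen time window (for example the depolarization, QRS, interval). The grid size, sampling rate, windows, and random data below are placeholders; real MCG processing uses calibrated SQUID recordings.

      import numpy as np

      def iso_integral_map(bx, by, t, t_start, t_end):
          """Integrate the tangential field magnitude sqrt(Bx^2 + By^2) over
          [t_start, t_end] for every sensor of an (ny, nx, nt) grid,
          giving an (ny, nx) map analogous to a projected total current image."""
          window = (t >= t_start) & (t <= t_end)
          dt = t[1] - t[0]                        # uniform sampling assumed
          tangential = np.sqrt(bx[..., window] ** 2 + by[..., window] ** 2)
          return tangential.sum(axis=-1) * dt     # simple rectangle-rule integral

      # Placeholder 8x8 sensor grid sampled at 1 kHz for 400 ms.
      t = np.arange(0.0, 0.4, 0.001)
      rng = np.random.default_rng(0)
      bx = rng.normal(size=(8, 8, t.size))
      by = rng.normal(size=(8, 8, t.size))

      depol_map = iso_integral_map(bx, by, t, 0.10, 0.20)   # depolarization window
      repol_map = iso_integral_map(bx, by, t, 0.25, 0.40)   # repolarization window
      print(depol_map.shape, repol_map.shape)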

  1. Nonlinear mapping technique for data visualization and clustering assessment of LIBS data: application to ChemCam data

    NASA Astrophysics Data System (ADS)

    Lasue, Jeremie; Wiens, Roger; Stepinski, Tom; Forni, Olivier; Clegg, Samuel; Maurice, Sylvestre; Chemcam Team

    2011-02-01

    ChemCam is a remote laser-induced breakdown spectroscopy (LIBS) instrument that will arrive on Mars in 2012, on-board the Mars Science Laboratory Rover. The LIBS technique is crucial to accurately identify samples and quantify elemental abundances at various distances from the rover. In this study, we compare different linear and nonlinear multivariate techniques to visualize and discriminate clusters in two dimensions (2D) from the data obtained with ChemCam. We have used principal components analysis (PCA) and independent components analysis (ICA) for the linear tools and compared them with the nonlinear Sammon's map projection technique. We demonstrate that the Sammon's map gives the best 2D representation of the data set, with optimization values from 2.8% to 4.3% (0% is a perfect representation), together with an entropy value of 0.81 for the purity of the clustering analysis. The linear 2D projections result in three (ICA) and five (PCA) times more stress, and their clustering is markedly less pure, with entropy values of about 1.8, more than twice that of the Sammon's map. We show that the Sammon's map algorithm is faster and gives a slightly better representation of the data set if the initial conditions are taken from the ICA projection rather than the PCA projection. We conclude that the nonlinear Sammon's map projection is the best technique for combining data visualization and clustering assessment of the ChemCam LIBS data in 2D. PCA and ICA projections on more dimensions would improve on these numbers at the cost of the intuitive interpretation of the 2D projection by a human operator.
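
    Sammon's mapping is a generic technique and is easy to sketch. The implementation below is a minimal version of Sammon's pseudo-Newton iteration, initialized from a 2-D PCA projection (the abstract reports that an ICA initialization works slightly better); the blob data, iteration count, function name, and magic factor are placeholders, and this is not the ChemCam processing code.

      import numpy as np
      from scipy.spatial.distance import pdist, squareform
      from sklearn.datasets import make_blobs
      from sklearn.decomposition import PCA

      def sammon_map(X, n_iter=200, magic=0.3, eps=1e-9):
          """Sammon's nonlinear mapping to 2-D: minimize
          E = (1/c) * sum_{i<j} (D_ij - d_ij)^2 / D_ij
          with Sammon's pseudo-Newton update, starting from a PCA projection."""
          D = squareform(pdist(X)) + eps                  # input-space distances
          iu = np.triu_indices_from(D, k=1)
          c = D[iu].sum()
          Y = PCA(n_components=2).fit_transform(X)
          for _ in range(n_iter):
              d = squareform(pdist(Y)) + eps              # output-space distances
              delta = D - d
              inv = 1.0 / (D * d)
              np.fill_diagonal(inv, 0.0)
              diff = Y[:, None, :] - Y[None, :, :]
              grad = (-2.0 / c) * np.sum((inv * delta)[:, :, None] * diff, axis=1)
              hess = (-2.0 / c) * np.sum(
                  inv[:, :, None] * (delta[:, :, None] - D[:, :, None] * diff ** 2 / (d ** 2)[:, :, None]),
                  axis=1)
              Y -= magic * grad / np.maximum(np.abs(hess), eps)
          d = squareform(pdist(Y)) + eps
          return Y, np.sum((D[iu] - d[iu]) ** 2 / D[iu]) / c

      X, _ = make_blobs(n_samples=150, n_features=6, centers=4, random_state=1)
      Y, stress = sammon_map(X)
      print(Y.shape, round(stress, 4))     # 2-D coordinates and final Sammon stress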

  2. Nonlinear mapping technique for data visualization and clustering assessment of LIBS data: application to ChemCam data.

    PubMed

    Lasue, J; Wiens, R C; Stepinski, T F; Forni, O; Clegg, S M; Maurice, S

    2011-07-01

    ChemCam is a remote laser-induced breakdown spectroscopy (LIBS) instrument that will arrive on Mars in 2012, on-board the Mars Science Laboratory Rover. The LIBS technique is crucial to accurately identify samples and quantify elemental abundances at various distances from the rover. In this study, we compare different linear and nonlinear multivariate techniques to visualize and discriminate clusters in two dimensions (2D) from the data obtained with ChemCam. We have used principal components analysis (PCA) and independent components analysis (ICA) for the linear tools and compared them with the nonlinear Sammon's map projection technique. We demonstrate that the Sammon's map gives the best 2D representation of the data set, with optimization values from 2.8% to 4.3% (0% is a perfect representation), together with an entropy value of 0.81 for the purity of the clustering analysis. The linear 2D projections result in three (ICA) and five (PCA) times more stress, and their clustering is markedly less pure, with entropy values of about 1.8, more than twice that of the Sammon's map. We show that the Sammon's map algorithm is faster and gives a slightly better representation of the data set if the initial conditions are taken from the ICA projection rather than the PCA projection. We conclude that the nonlinear Sammon's map projection is the best technique for combining data visualization and clustering assessment of the ChemCam LIBS data in 2D. PCA and ICA projections on more dimensions would improve on these numbers at the cost of the intuitive interpretation of the 2D projection by a human operator.

  3. General comparison of ozone vertical profiles obtained by various techniques during the 1983 MAP/GLOBUS campaign

    NASA Technical Reports Server (NTRS)

    Matthews, W. A.; Aimedieu, P.; Megie, G.; Pelon, J.; Attmannspacher, W.; Komhyr, W.; Marche, P.; De La Noe, J.; Rigaud, P.; Robbins, D. E.

    1987-01-01

    As part of the 1983 MAP/GLOBUS campaign, atmospheric ozone profile measurements were made using a large variety of techniques, both from balloon platforms and from the ground. It is shown that, for most techniques, the measured height distributions agree to within ±5 percent, with the exception of the remote visible absorption method. This ±5 percent uncertainty is of the order of the individual intersystem accuracy. It is suggested that, since the differences with the visible absorption method are in magnitude rather than in form, the absorption cross-section data could be the cause of the discrepancy.

  4. Mapping the petroleum system - An investigative technique to explore the hydrocarbon fluid system

    USGS Publications Warehouse

    Magoon, L.B.; Dow, W.G.

    2000-01-01

    Creating a petroleum system map includes a series of logical steps that require specific information to explain the origin in time and space of discovered hydrocarbon occurrences. If used creatively, this map provides a basis on which to develop complementary plays and prospects. The logical steps include the characterization of a petroleum system (that is, to identify, map, and name the hydrocarbon fluid system) and the summary of these results on a folio sheet. A petroleum system map is based on the understanding that there are several levels of certainty from "guessing" to "knowing" that specific oil and gas accumulations emanated from a particular pod of active source rock. Levels of certainty start with the close geographic proximity of two or more accumulations, continue with close stratigraphic proximity, followed by similarities in bulk properties and then in detailed geochemical properties. The highest level of certainty includes the positive geochemical correlation of the hydrocarbon fluid in the accumulations to the extract of the active source rock. A petroleum system map is created when the following logic is implemented. Implementation starts when the oil and gas accumulations of a petroleum province are grouped stratigraphically and geographically. Bulk and geochemical properties are used to further refine the groups through the determination of genetically related oil and gas types. To this basic map, surface seeps and well shows are added. Similarly, the active source rock responsible for these hydrocarbon occurrences is mapped to further define the extent of the system. A folio sheet constructed for a hypothetical case study of the Deer-Boar(.) petroleum system illustrates this methodology.

  5. Applying Value Stream Mapping Technique for Production Improvement in a Manufacturing Company: A Case Study

    NASA Astrophysics Data System (ADS)

    Jeyaraj, K. L.; Muralidharan, C.; Mahalingam, R.; Deshmukh, S. G.

    2013-01-01

    The purpose of this paper is to explain how value stream mapping (VSM) is helpful in lean implementation and to develop a road map for tackling improvement areas to bridge the gap between the existing state and the proposed state of a manufacturing firm. Through this case study, the existing state of manufacturing is mapped with the help of VSM process symbols, and the biggest improvement areas, such as excessive TAKT time, production, and lead time, are identified. Some modifications to the current state map are suggested, and with these modifications the future state map is prepared. Further, TAKT time is calculated to set the pace of the production processes. Comparing the current state and future state of the manufacturing firm, the study reports a 20 % reduction in TAKT time, a 22.5 % reduction in processing time, a 4.8 % reduction in lead time, a 20 % improvement in production, a 9 % improvement in machine utilization, a 7 % improvement in manpower utilization, an objective improvement in workers' skill level, and no change in the product and semi-finished product inventory levels. The findings are limited due to the focused nature of the case study. This case study shows that VSM is a powerful tool for lean implementation and allows the industry to understand and continuously improve towards lean manufacturing.
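
    TAKT time itself is a simple ratio of available production time to customer demand, which is easy to illustrate. The figures below are placeholders rather than the case-study data, and the function names are ours.

      def takt_time(available_minutes_per_shift, demand_units_per_shift):
          """Takt time = net available production time / customer demand;
          it sets the pace at which one unit must be completed."""
          return available_minutes_per_shift / demand_units_per_shift

      def percent_change(before, after):
          return 100.0 * (before - after) / before

      # Placeholder current-state vs. future-state figures.
      current = takt_time(available_minutes_per_shift=450, demand_units_per_shift=90)
      future = takt_time(available_minutes_per_shift=450, demand_units_per_shift=112)

      print(f"current takt: {current:.2f} min/unit")
      print(f"future takt:  {future:.2f} min/unit")
      print(f"reduction:    {percent_change(current, future):.1f} %")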

  6. Mapping technique for VLSI/WSI implementation of multidimensional systolic arrays

    NASA Astrophysics Data System (ADS)

    Abdelrazik, Mohamed B.

    1990-09-01

    This paper describes a mapping technique for transforming a linear systolic array into multidimensional systolic arrays in order to achieve high speed with less overhead. The technique is systematic and therefore useful for logic synthesis. Its application in DSP and numerical computations reduces the design time, which results in low design cost. The technique produces various structures (semi-systolic, quasi-systolic, and pure systolic arrays) that can be considered application-specific array processors.

  7. Compact radio sources in luminous infrared galaxies

    NASA Astrophysics Data System (ADS)

    Parra, Rodrigo

    2007-08-01

    Radio interferometry is an observational technique of high sensitivity and incomparably high spatial resolution. Moreover, because radio waves can freely propagate through interstellar dust and gas, it allows the study of regions of the universe completely obscured at other wavelengths. This thesis reports the observational and theoretical results of my research during the past four years which are mostly based on interferometric radio data. The COLA sample is an infrared selected sample of active star forming galaxies. We conducted 6 cm VLA and VLBI snapshot observations of the northern half of this sample. The radio emission seen at VLA scales is consistent with being powered by star formation activity because it follows the far infrared to radio correlation. We detect 22% of the sample sources in our VLBI snapshots. Based on luminosity arguments, we argue that these sub-parsec VLBI sources are powered by AGN activity. Furthermore, we find that VLBI detections are preferentially found in sources whose VLA scale structures have the highest peak brightnesses suggesting a strong correlation between compact starburst and AGN activity. This observational result is consistent with the theoretical picture of an Eddington-limited nuclear starburst acting as the last valve in the pipeline transporting the gas from kiloparsec scales onto the accretion disc of a buried AGN. Arp 220 is the archetypical ultra luminous infrared galaxy. For many years this source has been known to harbour a compact (~100 pc) cluster of unresolved 18 cm bright sources believed to be bright core collapse supernovae. Using multiwavelength VLBI observations, we obtained for the first time radio spectra for 18 of these sources. We find that over a half of them have spectra consistent with young supernovae. The rest can be better explained as older supernova remnants interacting with the high density starburst ISM. This finding allowed us to constrain the number of possible scenarios for the Arp 220

  8. Predicting a Luminous Red Nova

    NASA Astrophysics Data System (ADS)

    Van Noord, Daniel; Molnar, Larry; Kinemuchi, Karen; Steenwyk, Steven; Alexander, Cara; Spedden, Chris; Kobulnicky, Henry

    2016-05-01

    Luminous Red Novae (LRN) are rare transient events believed to be caused by the merger of a main-sequence contact binary. Since the discovery of the prototype, V838 Mon, only a handful of LRN events have been observed. Tylenda et al. (2011) analyzed the OGLE data preceding the 2008 nova of V1309 Sco and found that it exhibited a light curve similar to that of a contact binary, with one interesting exception: the orbital period of V1309 Sco showed an exponential period change going to zero. Unfortunately, the system was discovered to be a binary only after the merger, preventing any targeted observations to narrow down how the system entered this unusual state. However, the extreme period change observed in V1309 Sco gives us a signature to look for in other contact binaries, allowing the discovery of merger candidates for follow-up. We will present an analysis of light curves and spectra of KIC 9832227 (NSVS 5597755) that show it is a contact binary system with a negative period derivative that is becoming more extreme with time. These data span more than 15 years and are taken from the NSVS, ASAS, WASP, and Kepler surveys, with ongoing measurements from the Calvin College Observatory and the Apache Point Observatory. The ongoing period change observed in the system is consistent with the exponential model fit from V1309 Sco, and the rate of period change has surpassed that of all other measured contact binaries with the exception of V1309 Sco. If the exponential period decay continues, the system will likely merge between 2019 and 2022, resulting in a naked-eye nova. If this event occurs, this star will present an unprecedented opportunity to study an LRN progenitor and to follow the evolution of the merger.

  9. STAR FORMATION IN TWO LUMINOUS SPIRAL GALAXIES

    SciTech Connect

    Hunter, Deidre A.; Ashburn, Allison; Wright, Teresa; Elmegreen, Bruce G.; Rubin, Vera C.; Józsa, Gyula I. G.; Struve, Christian

    2013-10-01

    We examined star formation in two very luminous (M_V = –22 to –23) Sc-type spiral galaxies, NGC 801 and UGC 2885, using ultra-deep Hα images. We combine these Hα images with UBV and Two-Micron All-Sky Survey JHK images and H I maps to explore the star formation characteristics of disk galaxies at high luminosity. Hα traces star formation in these galaxies to 4-6 disk scale lengths, but the lack of detection of Hα further out is likely due to the loss of Lyman continuum photons. Considering gravitational instabilities alone, we find that the gas and stars in the outer regions are marginally stable in an average sense, but considering dissipative gas and radial and azimuthal forcing, the outer regions are marginally unstable to forming spiral arms. Star formation is taking place in spiral arms, which are regions of locally higher gas densities. Furthermore, we have traced smooth exponential stellar disks over four magnitudes in V-band surface brightness and 4-6 disk scale lengths, in spite of a highly variable gravitational instability parameter. Thus, gravitational instability thresholds do not seem relevant to the stellar disk. One possibility for creating an exponential disk is that the molecular cloud densities and star formation rates have exponential profiles and this fact forces the stellar disk to build up such a profile. Another possibility is that the stellar disk is continuously adjusted to an exponential shape regardless of the star formation profile, for example, through global dynamical processes that scatter stars. However, such scattering processes are only known to operate in spiral systems, in which case they cannot explain the same dilemma of smooth exponential disks observed in dwarf irregular galaxies.

  10. Mapping of Primary Instructional Methods and Teaching Techniques for Regularly Scheduled, Formal Teaching Sessions in an Anesthesia Residency Program.

    PubMed

    Vested Madsen, Matias; Macario, Alex; Yamamoto, Satoshi; Tanaka, Pedro

    2016-06-01

    In this study, we examined the regularly scheduled, formal teaching sessions in a single anesthesiology residency program to (1) map the most common primary instructional methods, (2) map the use of 10 known teaching techniques, and (3) assess if residents scored sessions that incorporated active learning as higher quality than sessions with little or no verbal interaction between teacher and learner. A modified Delphi process was used to identify useful teaching techniques. A representative sample of each of the formal teaching session types was mapped, and residents anonymously completed a 5-question written survey rating the session. The most common primary instructional methods were computer slides-based classroom lectures (66%), workshops (15%), simulations (5%), and journal club (5%). The number of teaching techniques used per formal teaching session averaged 5.31 (SD, 1.92; median, 5; range, 0-9). Clinical applicability (85%) and attention grabbers (85%) were the 2 most common teaching techniques. Thirty-eight percent of the sessions defined learning objectives, and one-third of sessions engaged in active learning. The overall survey response rate equaled 42%, and passive sessions had a mean score of 8.44 (range, 5-10; median, 9; SD, 1.2) compared with a mean score of 8.63 (range, 5-10; median, 9; SD, 1.1) for active sessions (P = 0.63). Slides-based classroom lectures were the most common instructional method, and faculty used an average of 5 known teaching techniques per formal teaching session. The overall education scores of the sessions as rated by the residents were high.

  11. Indirect ultrasonication for protein quantification and peptide mass mapping through mass spectrometry-based techniques.

    PubMed

    Carreira, R J; Lodeiro, C; Reboiro-Jato, M; Glez-Peña, D; Fdez-Riverola, F; Capelo, J L

    2010-07-15

    We report in this work a fast protocol for protein quantification and for peptide mass mapping that relies on (18)O isotopic labeling through the decoupling procedure. It is demonstrated that the purity and source of trypsin do not compromise the labeling degree and efficiency of the decoupled labeling reaction, and that the pH of the labeling reaction is a critical factor to obtain a significant (18)O double labeling. We also show that the same calibration curve can be used for MALDI protein quantification over several days while maintaining a reasonable accuracy, thus simplifying the handling of the quantification process. In addition, we demonstrate that (18)O isotopic labeling through the decoupling procedure can be successfully used to elaborate peptide mass maps. BSA was successfully quantified using the same calibration curve on different days, and plasma from a freshwater fish, Cyprinus carpio, was used to elaborate the peptide mass maps. Copyright 2010 Elsevier B.V. All rights reserved.

  12. Mapping snow avalanche risk using GIS technique and 3D modeling in Ceahlau Mountain

    NASA Astrophysics Data System (ADS)

    Covasnianu, A.; Grigoras, I. R.; State, L. E.; Balin, D.; Hogas, S.; Balin, I.

    2009-04-01

    This study consisted of a precise mapping project (a GPS field campaign and on-screen digitization of topographic maps at 1:5,000 scale) of the Ceahlau mountain area in the Romanian Carpathians, carried out to support snow avalanche risk management, surveying, and monitoring. We considered slope, aspect, altitude, landforms, and roughness derived from a high-resolution numerical terrain model (31 km² at 1:5,000 scale, yielding a spatial resolution of 3 m with the help of the Topo to Raster tool). These parameters were classified according to a model applied in the Tatra Mountains and transferred to the Ceahlau Massif. The results were adapted and interpreted with reference to the European Avalanche Hazard Scale. This work was carried out in the context of the elaboration of the Risk Map and directly concerns both the safety of tourism activities and the management of the Ceahlau Natural Park. The extension of this method to similar mountain areas is ongoing.
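
    The terrain-parameter step can be sketched with NumPy alone: slope and aspect are derived from the DEM and binned into susceptibility classes. The DEM, the slope thresholds, and the class labels below are illustrative placeholders, not the classification transferred from the Tatra Mountains model, and a north-up raster is assumed.

      import numpy as np

      def slope_aspect(dem, cell_size_m):
          """Slope (degrees) and downslope aspect (degrees clockwise from north)
          from a DEM array; assumes a north-up raster (rows increase southward)."""
          dz_drow, dz_dcol = np.gradient(dem, cell_size_m)
          slope = np.degrees(np.arctan(np.hypot(dz_dcol, dz_drow)))
          aspect = (np.degrees(np.arctan2(-dz_dcol, dz_drow)) + 360.0) % 360.0
          return slope, aspect

      def slope_susceptibility_class(slope_deg):
          """Placeholder classes: 0 = low (<28 deg), 1 = moderate (28-35 deg),
          2 = high (35-45 deg), 3 = very high (>45 deg)."""
          return np.digitize(slope_deg, bins=[28.0, 35.0, 45.0])

      # Toy 3 m resolution DEM tile standing in for the Ceahlau terrain model.
      rng = np.random.default_rng(42)
      dem = np.cumsum(np.cumsum(rng.normal(scale=0.5, size=(200, 200)), axis=0), axis=1)
      slope, aspect = slope_aspect(dem, cell_size_m=3.0)
      classes = slope_susceptibility_class(slope)
      print(np.bincount(classes.ravel(), minlength=4))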

  13. Eta Carinae and Other Luminous Blue Variables

    NASA Technical Reports Server (NTRS)

    Corcoran, M. F.

    2006-01-01

    Luminous Blue Variables (LBVs) are believed to be evolved, extremely massive stars close to the Eddington Limit and hence prone to bouts of large-scale, unstable mass loss. I discuss current understanding of the evolutionary state of these objects, the role duplicity may play and known physical characteristics of these stars using the X-ray luminous LBVs Eta Carinae and HD 5980 as test cases.

  14. Low-Altitude AVIRIS Data for Mapping Land Cover in Yellowstone National Park: Use of Isodata Clustering Techniques

    NASA Technical Reports Server (NTRS)

    Spruce, Joseph P.

    2001-01-01

    Northeast Yellowstone National Park (YNP) has a diversity of forest, range, and wetland cover types. Several remote sensing studies have recently been done in this area, including the NASA Earth Observations Commercial Applications Program (EOCAP) hyperspectral project conducted by Yellowstone Ecosystems Studies (YES) on the use of hyperspectral imaging for assessing riparian and in-stream habitats. In 1999, YES and NASA's Commercial Remote Sensing Program Office began collaborative study of this area, assessing the potential of synergistic use of hyperspectral, synthetic aperture radar (SAR), and multiband thermal data for mapping forest, range, and wetland land cover. Since the beginning, a quality 'reference' land cover map has been desired as a tool for developing and validating other land cover maps produced during the project. This paper recounts an effort to produce such a reference land cover map using low-altitude Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) data and unsupervised classification techniques. The main objective of this study is to assess ISODATA classification for mapping land cover in Northeast YNP using select bands of low-altitude AVIRIS data. A secondary, more long-term objective is to assess the potential for improving ISODATA-based classification of land cover through use of principal components analysis and minimum noise fraction (MNF) techniques. This paper will primarily report on work regarding the primary research objective. This study focuses on an AVIRIS cube acquired on July 23, 1999, by the confluence of Soda Butte Creek with the Lamar River. Range and wetland habitats dominate the image with forested habitats being a comparatively minor component of the scene. The scene generally tracks from southwest to northeast. Most of the scene is valley bottom with some lower side slopes occurring on the western portion. Elevations within the AVIRIS scene range from approximately 1998 to 2165 m above sea level, based on US
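
    The unsupervised-classification step can be sketched as follows. ISODATA is not available in scikit-learn, so plain k-means is used here as a simplified stand-in (ISODATA additionally splits and merges clusters between iterations); the cube, band indices, and class count are placeholders rather than the AVIRIS data used in the study.

      import numpy as np
      from sklearn.cluster import KMeans

      def unsupervised_classify(cube, band_idx, n_classes=12, random_state=0):
          """Cluster selected bands of a (rows, cols, bands) reflectance cube
          and return a (rows, cols) class map."""
          rows, cols, _ = cube.shape
          X = cube[:, :, band_idx].reshape(rows * cols, len(band_idx))
          labels = KMeans(n_clusters=n_classes, n_init=10,
                          random_state=random_state).fit_predict(X)
          return labels.reshape(rows, cols)

      # Placeholder AVIRIS-like cube: 100 x 100 pixels, 224 bands.
      rng = np.random.default_rng(7)
      cube = rng.random((100, 100, 224)).astype(np.float32)
      class_map = unsupervised_classify(cube, band_idx=[20, 35, 50, 90, 130, 180])
      print(np.unique(class_map))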

  16. Identification of metastasis-associated proteins in a human tumor metastasis model using the mass-mapping technique

    PubMed Central

    Kreunin, Paweena; Urquidi, Virginia; Lubman, David M; Goodison, Steve

    2005-01-01

    For most cancer cell types, the acquisition of metastatic ability leads to clinically incurable disease. The identification of molecules whose expression is specifically correlated with the metastatic spread of cancer would facilitate the design of therapeutic interventions to inhibit this lethal process. In order to facilitate metastasis gene discovery we have previously characterized a pair of monoclonal cell lines from the human breast carcinoma cell line MDA-MB-435 that have different metastatic phenotypes in immune-compromised mice. In this study, serum-free conditioned media was collected from the cultured monoclonal cell lines and a mass mapping technique was applied in order to profile a component of each cell line proteome. We utilized chromatofocusing in the first dimension to obtain a high resolution separation based on protein pI, and nonporous silica reverse-phase high performance liquid chromatography was used for the second dimension. Selected proteins were identified on the basis of electrospray ionization time of flight mass spectrometry (ESI-TOF MS) intact protein mapping and matrix-assisted laser desorption/ionization time of flight mass spectrometry (MALDI-TOF MS) peptide mass fingerprinting. Using this approach we were able to map over 400 proteins and plot them as a 2-D map of pI versus accurate Mr. This was performed over a pI range of 4.0–6.2, and a mass range of 6–80 kDa. ESI-TOF MS data and further analysis using MALDI-TOF MS confirmed and identified 27 differentially expressed proteins. Proteins associated with the metastatic phenotype included osteopontin and extracellular matrix protein 1, whereas the matrix metalloproteinase-1 and annexin 1 proteins were associated with the non-metastatic phenotype. These findings demonstrate that the mass mapping technique is a powerful tool for the detection and identification of proteins in complex biological samples and which are specifically associated with a cellular phenotype. PMID:15352249

  17. Identification of metastasis-associated proteins in a human tumor metastasis model using the mass-mapping technique.

    PubMed

    Kreunin, Paweena; Urquidi, Virginia; Lubman, David M; Goodison, Steve

    2004-09-01

    For most cancer cell types, the acquisition of metastatic ability leads to clinically incurable disease. The identification of molecules whose expression is specifically correlated with the metastatic spread of cancer would facilitate the design of therapeutic interventions to inhibit this lethal process. In order to facilitate metastasis gene discovery we have previously characterized a pair of monoclonal cell lines from the human breast carcinoma cell line MDA-MB-435 that have different metastatic phenotypes in immune-compromised mice. In this study, serum-free conditioned media was collected from the cultured monoclonal cell lines and a mass mapping technique was applied in order to profile a component of each cell line proteome. We utilized chromatofocusing in the first dimension to obtain a high resolution separation based on protein pI, and nonporous silica reverse-phase high performance liquid chromatography was used for the second dimension. Selected proteins were identified on the basis of electrospray ionization time of flight mass spectrometry (ESI-TOF MS) intact protein mapping and matrix-assisted laser desorption/ionization time of flight mass spectrometry (MALDI-TOF MS) peptide mass fingerprinting. Using this approach we were able to map over 400 proteins and plot them as a 2-D map of pI versus accurate M(r). This was performed over a pI range of 4.0-6.2, and a mass range of 6-80 kDa. ESI-TOF MS data and further analysis using MALDI-TOF MS confirmed and identified 27 differentially expressed proteins. Proteins associated with the metastatic phenotype included osteopontin and extracellular matrix protein 1, whereas the matrix metalloproteinase-1 and annexin 1 proteins were associated with the non-metastatic phenotype. These findings demonstrate that the mass mapping technique is a powerful tool for the detection and identification of proteins in complex biological samples and which are specifically associated with a cellular phenotype.

  18. Processing techniques for the production of an experimental computer-generated shaded-relief map

    USGS Publications Warehouse

    Judd, Damon D.

    1986-01-01

    The data consisted of forty-eight 1° by 1° blocks of resampled digital elevation model (DEM) data. These data were digitally mosaicked and assigned colors based on intervals of elevation values. The color-coded data set was then used to create a shaded-relief image that was photographically composited with cartographic line information to produce a shaded-relief map. The majority of the processing was completed at the National Mapping Division EROS Data Center in Sioux Falls, South Dakota.
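
    The shaded-relief step follows the standard Lambertian hillshade formula, illustrated below on a toy DEM with NumPy; the illumination angles, elevation breaks, and DEM are placeholders, and this is a generic sketch rather than the production process used for the map described above.

      import numpy as np

      def hillshade(dem, cell_size, azimuth_deg=315.0, altitude_deg=45.0):
          """Illumination of each DEM cell by a light source at the given
          azimuth (clockwise from north) and altitude; north-up raster assumed."""
          dz_drow, dz_dcol = np.gradient(dem, cell_size)
          slope = np.arctan(np.hypot(dz_dcol, dz_drow))
          aspect = np.arctan2(-dz_dcol, dz_drow)          # downslope direction from north
          alt = np.radians(altitude_deg)
          az = np.radians(azimuth_deg)
          shade = np.sin(alt) * np.cos(slope) + np.cos(alt) * np.sin(slope) * np.cos(az - aspect)
          return np.clip(shade, 0.0, 1.0)

      def elevation_classes(dem, breaks):
          """Assign a color-class index to each cell from elevation intervals."""
          return np.digitize(dem, bins=breaks)

      rng = np.random.default_rng(3)
      dem = np.cumsum(np.cumsum(rng.normal(size=(300, 300)), axis=0), axis=1)   # toy DEM
      relief = hillshade(dem, cell_size=30.0)
      classes = elevation_classes(dem, breaks=[-50.0, 0.0, 50.0, 100.0, 200.0])
      print(relief.min(), relief.max(), np.unique(classes))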

  19. Constructive techniques for zeros of monotone mappings in certain Banach spaces.

    PubMed

    Diop, C; Sow, T M M; Djitte, N; Chidume, C E

    2015-01-01

    Let E be a 2-uniformly convex real Banach space with uniformly Gâteaux differentiable norm, and E* its dual space. Let A : E → E* be a bounded strongly monotone mapping such that A^{-1}(0) ≠ ∅. For given x_1 ∈ E, let {x_n} be generated by the algorithm: x_{n+1} = J^{-1}(Jx_n - α_n Ax_n), n ≥ 1, where J is the normalized duality mapping from E into E* and {α_n} is a real sequence in (0, 1) satisfying suitable conditions. Then it is proved that {x_n} converges strongly to the unique point x* ∈ A^{-1}(0). Finally, our theorems are applied to the convex minimization problem.
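
    In a real Hilbert space the normalized duality mapping J is the identity, so the iteration reduces to x_{n+1} = x_n - α_n A(x_n). The sketch below runs that special case on a toy strongly monotone affine map; the map, step sizes, and function names are ours, and this only illustrates the iteration rather than the Banach-space construction proved in the paper.

      import numpy as np

      def zero_of_monotone_map(A, x1, step_sizes):
          """Iterate x_{n+1} = x_n - alpha_n * A(x_n) (Hilbert-space case,
          where the duality mapping J is the identity)."""
          x = np.asarray(x1, dtype=float)
          for alpha in step_sizes:
              x = x - alpha * A(x)
          return x

      # Toy strongly monotone map on R^2: A(x) = M x + b with M positive definite.
      M = np.array([[3.0, 1.0], [1.0, 2.0]])
      b = np.array([1.0, -2.0])
      A = lambda x: M @ x + b

      alphas = (0.1 / np.sqrt(n + 1) for n in range(5000))    # alpha_n in (0, 1), decreasing
      x_star = zero_of_monotone_map(A, x1=[0.0, 0.0], step_sizes=alphas)
      print(x_star, np.linalg.solve(M, -b))                   # iterate vs. exact zero of A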

  20. Simultaneous chromatic and luminance human electroretinogram responses

    PubMed Central

    Parry, Neil R A; Murray, Ian J; Panorgias, Athanasios; McKeefry, Declan J; Lee, Barry B; Kremers, Jan

    2012-01-01

    The parallel processing of information forms an important organisational principle of the primate visual system. Here we describe experiments which use a novel chromatic–achromatic temporal compound stimulus to simultaneously identify colour and luminance specific signals in the human electroretinogram (ERG). Luminance and chromatic components are separated in the stimulus; the luminance modulation has twice the temporal frequency of the chromatic modulation. ERGs were recorded from four trichromatic and two dichromatic subjects (1 deuteranope and 1 protanope). At isoluminance, the fundamental (first harmonic) response was elicited by the chromatic component in the stimulus. The trichromatic ERGs possessed low-pass temporal tuning characteristics, reflecting the activity of parvocellular post-receptoral mechanisms. There was very little first harmonic response in the dichromats’ ERGs. The second harmonic response was elicited by the luminance modulation in the compound stimulus and showed, in all subjects, band-pass temporal tuning characteristic of magnocellular activity. Thus it is possible to concurrently elicit ERG responses from the human retina which reflect processing in both chromatic and luminance pathways. As well as providing a clear demonstration of the parallel nature of chromatic and luminance processing in the human retina, the differences that exist between ERGs from trichromatic and dichromatic subjects point to the existence of interactions between afferent post-receptoral pathways that are in operation from the earliest stages of visual processing. PMID:22586211
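
    The frequency-tagging logic of the compound stimulus can be illustrated with a toy Fourier analysis: because the luminance component is modulated at twice the chromatic frequency, the first and second harmonics of the response separate the two contributions. The signal below is synthetic, with made-up frequencies and amplitudes, not recorded ERG data.

      import numpy as np

      fs = 1000.0                              # sampling rate (Hz)
      t = np.arange(0.0, 2.0, 1.0 / fs)        # 2 s epoch
      f_chromatic = 4.0                        # chromatic modulation frequency (Hz)

      rng = np.random.default_rng(0)
      erg = (1.5 * np.sin(2 * np.pi * f_chromatic * t)        # chromatic pathway -> 1st harmonic
             + 0.8 * np.sin(2 * np.pi * 2 * f_chromatic * t)  # luminance pathway -> 2nd harmonic
             + 0.2 * rng.normal(size=t.size))                 # noise

      amp = np.abs(np.fft.rfft(erg)) / (t.size / 2.0)         # single-sided amplitude spectrum
      freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)

      first = amp[np.argmin(np.abs(freqs - f_chromatic))]
      second = amp[np.argmin(np.abs(freqs - 2 * f_chromatic))]
      print(f"1st harmonic (chromatic): {first:.2f}   2nd harmonic (luminance): {second:.2f}")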

  1. Ratio model serves suprathreshold color- luminance discrimination

    NASA Astrophysics Data System (ADS)

    Sankeralli, Marcel J.; Mullen, Kathy T.; Hine, Trevor J.

    2002-03-01

    We extended earlier results [J. Opt. Soc. Am. A 16, 2625 (1999)] to examine how the responses of the three postreceptoral mechanisms are combined to subserve discrimination of suprathreshold stimuli. Test thresholds were obtained in the presence of suprathreshold pedestals selected in different quadrants of the red-green/luminance and blue-yellow/luminance planes of cardinal color space. We showed that (1) test threshold was directly proportional to pedestal contrast for pedestal contrasts exceeding five times pedestal contrast threshold, and (2) there were exceptions to this proportionality, notably when the test and pedestal directions were fixed in the cardinal directions. Results support a ratio model of suprathreshold color-luminance discrimination, in which discrimination depends on a ratio of outputs of the postreceptoral mechanisms. We also observed that when test threshold was measured as a function of test color-space direction, masking by the achromatic component of the pedestal was less than that by the chromatic component. In addition, masking by a dark (negative luminance component) pedestal was lower than masking by a light (positive luminance) pedestal of a similar contrast. Our results demonstrated that (1) there is no fundamental difference between discrimination in the isoluminant and in the two chromoluminant cardinal planes, (2) there exists the possibility that discrimination in cardinal directions differs from that in noncardinal (intermediate) directions, and (3) suprathreshold discrimination of luminance differences may be more sensitive than that of chromatic differences for a given suprathreshold pedestal.

  2. Intravascular photoacoustic imaging of exogenously labeled atherosclerotic plaque through luminal blood

    NASA Astrophysics Data System (ADS)

    Yeager, Doug; Karpiouk, Andrei; Wang, Bo; Amirian, James; Sokolov, Konstantin; Smalling, Richard; Emelianov, Stanislav

    2012-10-01

    Combined intravascular ultrasound and intravascular photoacoustic (IVUS/IVPA) imaging has been previously established as a viable means for assessing atherosclerotic plaque morphological and compositional characteristics using both endogenous and exogenous contrast. In this study, IVUS/IVPA imaging of atherosclerotic rabbit aortas following systemic injection of gold nanorods (AUNRs) with peak absorbance within the tissue optical window is performed. Ex vivo imaging results reveal a high photoacoustic signal from localized AUNRs in regions with atherosclerotic plaques. Corresponding histological staining further confirms the preferential extravasation of AUNRs in atherosclerotic regions with compromised luminal endothelium and acute inflammation. The ability to detect AUNRs using combined IVUS and photoacoustic imaging in the presence of luminal saline and luminal blood is evaluated using both spectroscopic and single wavelength IVPA imaging techniques. Results demonstrate that AUNR detection within the arterial wall can be achieved using both methods, even in the case of imaging through luminal blood.

  3. Classification of cancer cell lines using an automated two-dimensional liquid mapping method with hierarchical clustering techniques.

    PubMed

    Wang, Yanfei; Wu, Rong; Cho, Kathleen R; Shedden, Kerby A; Barder, Timothy J; Lubman, David M

    2006-01-01

    A two-dimensional liquid mapping method was used to map the protein expression of eight ovarian serous carcinoma cell lines and three immortalized ovarian surface epithelial cell lines. Maps were produced using pI as the separation parameter in the first dimension and hydrophobicity based upon reversed-phase HPLC separation in the second dimension. The method can be reproducibly used to produce protein expression maps over a pH range from 4.0 to 8.5. A dynamic programming method was used to correct for minor shifts in peaks during the HPLC gradient between sample runs. The resulting corrected maps can then be compared using hierarchical clustering to produce dendrograms indicating the relationship between different cell lines. It was found that several of the ovarian surface epithelial cell lines clustered together, whereas specific groups of serous carcinoma cell lines clustered with each other. Although there is limited information on the current biology of these cell lines, it was shown that the protein expression of certain cell lines is closely related to each other. Other cell lines, including one ovarian clear cell carcinoma cell line, two endometrioid carcinoma cell lines, and three breast epithelial cell lines, were also mapped for comparison to show that their protein profiles cluster differently than the serous samples and to study how they cluster relative to each other. In addition, comparisons can be made between proteins differentially expressed between cell lines that may serve as markers of ovarian serous carcinomas. The automation of the method allows reproducible comparison of many samples, and the use of differential analysis limits the number of proteins that might require further analysis by mass spectrometry techniques.
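
    The clustering-and-comparison step can be sketched with SciPy's hierarchical clustering. The profiles below are synthetic stand-ins for the shift-corrected 2-D liquid map intensities (the real maps come from the chromatofocusing/HPLC separations described above), and the labels, group sizes, and distance metric are placeholders.

      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster, dendrogram
      from scipy.spatial.distance import pdist

      # Synthetic stand-ins: 11 "cell lines", each a flattened, shift-corrected
      # protein-expression map of 500 intensity features.
      rng = np.random.default_rng(1)
      serous = rng.normal(loc=1.0, size=(8, 500))
      ose = rng.normal(loc=0.0, size=(3, 500))
      profiles = np.vstack([serous, ose])
      labels = [f"serous_{i}" for i in range(8)] + [f"OSE_{i}" for i in range(3)]

      # Average-linkage hierarchical clustering on correlation distance.
      Z = linkage(pdist(profiles, metric="correlation"), method="average")
      print(dict(zip(labels, fcluster(Z, t=2, criterion="maxclust"))))
      # dendrogram(Z, labels=labels) would draw the tree with matplotlib.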

  4. Implementation of Concept Mapping to Novices: Reasons for Errors, a Matter of Technique or Content?

    ERIC Educational Resources Information Center

    Conradty, Catherine; Bogner, Franz X.

    2010-01-01

    Concept mapping is discussed as a means to promote meaningful learning and in particular progress in reading comprehension skills. Its increasing implementation necessitates the acquisition of adequate knowledge about frequent errors in order to make available an effective introduction to the new learning method. To analyse causes of errors, 283…

  5. Metateaching and the Instructional Map. Teaching Techniques/Strategies Series, Volume 1.

    ERIC Educational Resources Information Center

    Timpson, William M.

    This book describes a conceptual framework, the "Instructional Map"--a metaphor for envisioning the interconnectedness of teacher and student, teaching and learning, and content and process--that can help teachers plan, sort their way through course material and instructional options, interact with students, and reflect upon progress made and what…

  6. Creating High-Resolution Maps of Leaf Water Isotopes Using IM-CRDS and IRMS Techniques

    NASA Astrophysics Data System (ADS)

    Gerlein-Safdi, C.; Sinkler, C. J.; Caylor, K. K.

    2014-12-01

    Since the development of isotope ratio infrared spectroscopy (IRIS), the applications of water isotope analysis have been increasing. Here, we present a new protocol to create high-resolution maps of leaf water isotopes 18O and 2H. We use the Picarro induction module (IM-CRDS) combined with an isotope analyzer (L2130-i) to sample up to 25 locations in one half of each leaf. Each sampling location corresponds to four samples (6 mm outside-diameter punched holes) punched next to each other. In the induction module, an induction coil heats a metal holder containing the leaf sample. The sample releases water vapor, which is then sent to the isotope analyzer. The IM-CRDS allows us to significantly reduce the sample size and therefore increase the sample density, compared to the traditional cryogenic extraction method. Using spatial analysis tools, we create high-resolution spatial maps of each isotope as well as d-excess maps. The water in the second half of the leaf is extracted by cryogenic extraction and analyzed using both IRIS and isotope ratio mass spectrometry. The isotopic composition of the extracted water is compared to the average composition calculated from the maps and used for calibration. We present applications of this protocol to the analysis of the spatio-temporal evolution of foliar uptake in Colocasia esculenta under laboratory conditions.

  7. Colony mapping: A new technique for monitoring crevice-nesting seabirds

    USGS Publications Warehouse

    Renner, H.M.; Renner, M.; Reynolds, J.H.; Harping, A.M.A.; Jones, I.L.; Irons, D.B.; Byrd, G.V.

    2006-01-01

    Monitoring populations of auklets and other crevice-nesting seabirds remains problematic, although numerous methods have been attempted since the mid-1960s. Anecdotal evidence suggests several large auklet colonies have recently decreased in both abundance and extent, concurrently with vegetation encroachment and succession. Quantifying changes in the geographical extent of auklet colonies may be a useful alternative to monitoring population size directly. We propose a standardized method for colony mapping using a randomized systematic grid survey with two components: a simple presence/absence survey and an auklet evidence density survey. A quantitative auklet evidence density index was derived from the frequency of droppings and feathers. This new method was used to map the colony on St. George Island in the southeastern Bering Sea and results were compared to previous colony mapping efforts. Auklet presence was detected in 62 of 201 grid cells (each grid cell = 2,500 m²) by sampling a randomly placed 16 m² plot in each cell; estimated colony area = 155,000 m². The auklet evidence density index varied by two orders of magnitude across the colony and was strongly correlated with means of replicated counts of birds socializing on the colony surface. Quantitatively mapping all large auklet colonies is logistically feasible using this method and would provide an important baseline for monitoring colony status. Regularly monitoring select colonies using this method may be the best means of detecting changes in distribution and population size of crevice-nesting seabirds. © The Cooper Ornithological Society 2006.

  9. Mapping texts through dimensionality reduction and visualization techniques for interactive exploration of document collections

    NASA Astrophysics Data System (ADS)

    de Andrade Lopes, Alneu; Minghim, Rosane; Melo, Vinícius; Paulovich, Fernando V.

    2006-01-01

    The sheer volume of information currently available often impairs the tasks of searching, browsing and analyzing information pertinent to a topic of interest. This paper presents a methodology to create a meaningful graphical representation of document corpora targeted at supporting exploration of correlated documents. The purpose of such an approach is to produce a map of a document body on a research topic or field based on the analysis of their contents and the similarities amongst articles. The document map is generated, after text pre-processing, by projecting the data in two dimensions using Latent Semantic Indexing. The projection is followed by hierarchical clustering to support sub-area identification. The map can be interactively explored, helping to narrow down the search for relevant articles. Tests were performed using a collection of documents pre-classified into three research subject classes: Case-Based Reasoning, Information Retrieval, and Inductive Logic Programming. The map produced was capable of separating the main areas, grouping documents by their similarity, revealing possible topics, and identifying boundaries between them. The tool supports the exploration of inter-topic and intra-topic relationships and is useful in many contexts that require deciding on relevant articles to read, such as scientific research, education, and training.
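
    A minimal sketch of the pipeline just described, with TF-IDF weighting, a two-dimensional truncated-SVD projection standing in for the Latent Semantic Indexing step, and hierarchical clustering; scikit-learn is assumed here and is not necessarily the authors' tooling:

        # Sketch: documents -> TF-IDF -> 2-D LSI projection -> hierarchical clusters.
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.decomposition import TruncatedSVD
        from sklearn.cluster import AgglomerativeClustering

        docs = ["case based reasoning retrieval of prior cases",
                "information retrieval ranking of documents",
                "inductive logic programming learns clauses"]   # placeholder corpus

        X = TfidfVectorizer(stop_words="english").fit_transform(docs)
        xy = TruncatedSVD(n_components=2).fit_transform(X)      # document map coordinates
        labels = AgglomerativeClustering(n_clusters=3).fit_predict(xy)
        print(xy)
        print(labels)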

  10. Water and Soil Salinity Mapping for Southern Everglades Using Remote Sensing Techniques and In Situ Observations

    NASA Astrophysics Data System (ADS)

    Khadim, Fahad Khan

    Everglades National Park is a hydro-ecologically significant wetland experiencing salinity ingress over the years. This motivated our study to map water salinity using a spatially weighted optimization model (SWOM), and soil salinity using land cover classes and EC thresholds. SWOM was calibrated and validated at 3-km grids with actual salinity for 1998-2001, and yielded acceptable R2 (0.89-0.92) and RMSE (1.73-1.92 ppt). Afterwards, seasonal water salinity mapping for 1996-97, 2004-05, and 2016 was carried out. For soil salinity mapping, supervised land cover classification was first carried out for 1996, 2000, 2006, 2010 and 2015, with the first four providing average accuracies of 82%-94% against existing NLCD classifications. The land cover classes and EC thresholds were used to map four soil salinity classes: non-saline (EC = 0-2 dS/m), low-salinity (EC = 2-4 dS/m), moderately saline (EC = 4-8 dS/m) and highly saline (EC > 8 dS/m) areas.
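
    The EC-threshold classification quoted above is a simple binning step; a minimal sketch with assumed raster values rather than the study's data:

        # Bin an electrical-conductivity raster into the four salinity classes above.
        import numpy as np

        ec = np.array([[0.5, 3.1], [5.7, 9.2]])            # hypothetical EC values, dS/m
        bounds = [2.0, 4.0, 8.0]                           # class edges from the abstract
        labels = ["non-saline", "low", "moderate", "high"]
        classes = np.digitize(ec, bounds)                  # indices 0..3
        print([[labels[i] for i in row] for row in classes])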

  11. Implementation of Concept Mapping to Novices: Reasons for Errors, a Matter of Technique or Content?

    ERIC Educational Resources Information Center

    Conradty, Catherine; Bogner, Franz X.

    2010-01-01

    Concept mapping is discussed as a means to promote meaningful learning and in particular progress in reading comprehension skills. Its increasing implementation necessitates the acquisition of adequate knowledge about frequent errors in order to make available an effective introduction to the new learning method. To analyse causes of errors, 283…

  12. Ionic Current Mapping Techniques and Applications to Aluminum-Copper Corrosion

    SciTech Connect

    Isaacs, H. S.; Jeffcoate, C. S.; Missert, N. A.; Barbour, J. C.

    1999-10-17

    Measurements have been made of the aluminum/metal galvanic couple. A wide range of geometries was investigated, varying the areas of anodic and cathodic surfaces and employing specially designed galvanic cells with crevices. In situ ionic current density mapping was used to monitor galvanic corrosion, and the currents flowing between separated metals were measured.

  13. ERTS-1 imagery interpretation techniques in the Tennessee Valley. [land use and soil mapping

    NASA Technical Reports Server (NTRS)

    Bodenheimer, R. E. (Principal Investigator)

    1974-01-01

    The author has identified the following significant results. The feasibility of delineating major soil associations and land uses through computerized analyses is discussed. Useful and potential applications in detecting landscape change and land use mapping are described. Recommendations for improving the data processing effort in a multidisciplinary program are presented.

  14. Current limitations about the cleaning of luminal endoscopes.

    PubMed

    Hervé, R; Keevil, C W

    2013-01-01

    The presence and potential build-up of patient material such as proteins in endoscope lumens can have significant implications, including toxic reactions, device damage, inadequate disinfection/sterilization, increased risk of biofilm development and potential transmission of pathogens. The aim was to evaluate potential protein deposition and removal in the channels of flexible luminal endoscopes during a simple contamination/cleaning cycle. The level of contamination present on disposable endoscopy forceps, which come into contact with the lumen of biopsy channels, was evaluated. Following observations in endoscopy units, factors influencing protein adsorption inside luminal endoscope channels and the action of current initial cleaning techniques were evaluated using a proteinaceous test soil and very sensitive fluorescence epimicroscopy. Disposable endoscope accessories appear likely to contribute to the contamination of lumens, and were useful indicators of the amount of proteinaceous soil transiting through the channels of luminal endoscopes. Enzymatic cleaning according to the manufacturer's recommendations and brushing of the channels were ineffective at removing all proteinaceous residues from new endoscope channels after a single contamination. Rinsing immediately after contamination led to only a slight improvement in the decontamination outcome. The limited action of current decontamination procedures and the lack of applicable quality control methods to assess the cleanliness of channels between patients contribute to an increased risk of cross-infection with potentially harmful micro-organisms and molecules during endoscopy procedures. Copyright © 2012 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.

  15. Detection and mapping of illicit drugs and their metabolites in fingermarks by MALDI MS and compatibility with forensic techniques

    PubMed Central

    Groeneveld, G.; de Puit, M.; Bleay, S.; Bradshaw, R.; Francese, S.

    2015-01-01

    Despite the proven capabilities of Matrix Assisted Laser Desorption Ionisation Mass Spectrometry (MALDI MS) in laboratory settings, research is still needed to integrate this technique into current forensic fingerprinting practice. Optimised protocols enabling the compatible application of MALDI to developed fingermarks will allow additional intelligence to be gathered around a suspect’s lifestyle and activities prior to the deposition of their fingermarks while committing a crime. The detection and mapping of illicit drugs and metabolites in latent fingermarks would provide intelligence that is beneficial for both police investigations and court cases. This study investigated MALDI MS detection and mapping capabilities for a large range of drugs of abuse and their metabolites in fingermarks; the detection and mapping of a mixture of these drugs in marks, with and without prior development with cyanoacrylate fuming or Vacuum Metal Deposition, was also examined. Our findings indicate the versatility of MALDI technology and its ability to retrieve chemical intelligence either by detecting the compounds investigated or by using their ion signals to reconstruct 2D maps of fingermark ridge details. PMID:26118853

  16. Detection and mapping of illicit drugs and their metabolites in fingermarks by MALDI MS and compatibility with forensic techniques

    NASA Astrophysics Data System (ADS)

    Groeneveld, G.; de Puit, M.; Bleay, S.; Bradshaw, R.; Francese, S.

    2015-06-01

    Despite the proven capabilities of Matrix Assisted Laser Desorption Ionisation Mass Spectrometry (MALDI MS) in laboratory settings, research is still needed to integrate this technique into current forensic fingerprinting practice. Optimised protocols enabling the compatible application of MALDI to developed fingermarks will allow additional intelligence to be gathered around a suspect’s lifestyle and activities prior to the deposition of their fingermarks while committing a crime. The detection and mapping of illicit drugs and metabolites in latent fingermarks would provide intelligence that is beneficial for both police investigations and court cases. This study investigated MALDI MS detection and mapping capabilities for a large range of drugs of abuse and their metabolites in fingermarks; the detection and mapping of a mixture of these drugs in marks, with and without prior development with cyanoacrylate fuming or Vacuum Metal Deposition, was also examined. Our findings indicate the versatility of MALDI technology and its ability to retrieve chemical intelligence either by detecting the compounds investigated or by using their ion signals to reconstruct 2D maps of fingermark ridge details.

  17. Detection and mapping of illicit drugs and their metabolites in fingermarks by MALDI MS and compatibility with forensic techniques.

    PubMed

    Groeneveld, G; de Puit, M; Bleay, S; Bradshaw, R; Francese, S

    2015-06-29

    Despite the proven capabilities of Matrix Assisted Laser Desorption Ionisation Mass Spectrometry (MALDI MS) in laboratory settings, research is still needed to integrate this technique into current forensic fingerprinting practice. Optimised protocols enabling the compatible application of MALDI to developed fingermarks will allow additional intelligence to be gathered around a suspect's lifestyle and activities prior to the deposition of their fingermarks while committing a crime. The detection and mapping of illicit drugs and metabolites in latent fingermarks would provide intelligence that is beneficial for both police investigations and court cases. This study investigated MALDI MS detection and mapping capabilities for a large range of drugs of abuse and their metabolites in fingermarks; the detection and mapping of a mixture of these drugs in marks, with and without prior development with cyanoacrylate fuming or Vacuum Metal Deposition, was also examined. Our findings indicate the versatility of MALDI technology and its ability to retrieve chemical intelligence either by detecting the compounds investigated or by using their ion signals to reconstruct 2D maps of fingermark ridge details.

  18. Phase Space Tomography: A Simple, Portable and Accurate Technique to Map Phase Spaces of Beams with Space Charge

    SciTech Connect

    Stratakis, D.; Kishek, R. A.; Bernal, S.; Walter, M.; Haber, I.; Fiorito, R.; Thangaraj, J. C. T.; Quinn, B.; Reiser, M.; O'Shea, P. G.; Li, H.

    2006-11-27

    In order to understand the charged particle dynamics, e.g. the halo formation, emittance growth, x-y energy transfer and coupling, knowledge of the actual phase space is needed. Over the past decade there has been an increasing number of articles that use tomography to map the beam phase space and measure the beam emittance. These studies were performed at high-energy facilities where the effect of space charge was negligible and therefore not considered in the analysis. This work extends the tomography technique to beams with space charge. In order to simplify the analysis, linear forces were assumed. By carefully modeling the tomography process using the particle-in-cell code WARP we test the validity of our assumptions and the accuracy of the reconstructed phase space. Finally, we report experimental results of phase space mapping at the University of Maryland Electron Ring (UMER) using tomography.

  19. On extracting brightness temperature maps from scanning radiometer data. [techniques for algorithm design

    NASA Technical Reports Server (NTRS)

    Argentiero, P.; Garza-Robles, R.

    1980-01-01

    The extraction of brightness temperature maps from scanning radiometer data is described as a typical linear inverse problem. Spatial quantization and parameter estimation are described and suggested as an advantageous approach to a solution. Since this approach takes explicit account of the multivariate nature of the problem, it permits an accurate determination of the most detailed resolution extractable from the data as well as explicitly defining the possible compromises between accuracy and resolution. To illustrate the usefulness of the method for algorithm design and accuracy prediction, it was applied to the problem of providing brightness temperature maps during the NOSS flight segment. The most detailed possible resolution was determined and a curve displaying the possible compromises between accuracy and resolution was provided.
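
    Treating the retrieval as a linear inverse problem with spatial quantization amounts to a least-squares estimate of cell-averaged brightness temperatures; a minimal sketch with a hypothetical observation operator:

        # Radiometer samples d are modelled as d = A @ t + noise, where t holds the
        # quantized brightness temperatures and A is an assumed antenna-weighting matrix.
        import numpy as np

        rng = np.random.default_rng(0)
        A = rng.random((50, 10))                   # 50 scan samples, 10 map cells (assumed)
        t_true = 250.0 + 30.0 * rng.random(10)     # "true" brightness temperatures, K
        d = A @ t_true + rng.normal(0.0, 0.5, 50)  # noisy observations

        t_est, *_ = np.linalg.lstsq(A, d, rcond=None)   # least-squares estimate
        print(float(np.abs(t_est - t_true).max()))      # retrieval error, K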

  20. Techniques for the creation of land use maps and tabulations from Landsat imagery

    NASA Technical Reports Server (NTRS)

    Angelici, G. L.; Bryant, N. A.

    1977-01-01

    Methods for creating color thematic maps and land use tabulations, employing both Landsat imagery and computer image processing, are discussed. The system, the Multiple Input Land Use System (MILUS), has been tested in the metropolitan section of Dayton, Ohio. Training areas for land use were first digitized by coordinates and then transformed into an image of white lines on a black background. This image was added to a Landsat image of the same area, and multispectral classification was then performed. A tape of digitized census tract boundaries was interfaced by computer to yield an image of tract boundaries on a background registered to the thematic land-use map. Using a data management system, the data were then used to produce figures for the area and percent of land use in each tract. Future work is expected to convert most of the steps into interactive processing, which would greatly reduce the time needed to edit and register the data sets.

  1. Application of New Liquefaction Hazard Mapping Techniques to the Sacramento-San Joaquin Delta

    NASA Astrophysics Data System (ADS)

    Real, C. R.; Knudsen, K. L.; Woods, M. O.

    2010-12-01

    Earthquake-induced liquefaction hazard maps have been prepared for the central Sacramento-San Joaquin Delta region. The hazard assessment is based on deformation potential of naturally occurring sediments (we have not assessed levees) expressed as the capacity for horizontal displacement from limiting shear-strains caused by earthquake-induced cyclic shear stresses during earthquake shaking. For shaking corresponding to 10% in 50-year exceedance probability, the computed lateral displacement index ranges from zero to 9 ft, with the greatest deformation potential between Brentwood and Bethel Island. The new method allows for more knowledge of the three-dimensional spatial distribution of liquefiable sediments to reduce epistemic uncertainty in the mapping of liquefaction hazard. Recently improved relationships between fines-corrected penetration resistance, induced cyclic stress ratio, and shear- and volumetric reconsolidation strain form the basis of the methodology. Quaternary geologic map units are reclassified into liquefaction-deformation susceptibility units based on statistical similarity of strain-depth functions calculated for calibration conditions of 100% saturation and 40% g for sediment samples from over 3,000 borehole penetration tests. Areal grids of tops and bottoms of liquefaction-prone sediments were prepared by stratigraphic correlation of several hundred boring logs. Strain-depth density functions are then re-computed by grouping all borehole information within a defined liquefaction-deformation susceptibility unit, and parametrically modeling liquefaction response by varying shaking input assuming 100% saturation of sediments. At each grid cell the strain-depth density curve corresponding to the shaking level and liquefaction-deformation susceptibility unit is then integrated over the thickness of subsurface liquefiable sediments to estimate displacement potential. The final maps represent hazard as lateral displacement and volumetric

  2. Quantifying Uncertainty in Flood Inundation Mapping Using Streamflow Ensembles and Multiple Hydraulic Modeling Techniques

    NASA Astrophysics Data System (ADS)

    Hosseiny, S. M. H.; Zarzar, C.; Gomez, M.; Siddique, R.; Smith, V.; Mejia, A.; Demir, I.

    2016-12-01

    The National Water Model (NWM) provides a platform for operationalizing nationwide flood inundation forecasting and mapping. The ability to model flood inundation on a national scale will provide invaluable information to decision makers and local emergency officials. Often, forecast products use deterministic model output to provide a visual representation of a single inundation scenario, which is subject to uncertainty from various sources. While this provides a straightforward representation of the potential inundation, the inherent uncertainty associated with the model output should be considered to optimize this tool for decision making support. The goal of this study is to produce ensembles of future flood inundation conditions (i.e. extent, depth, and velocity) to spatially quantify and visually assess uncertainties associated with the predicted flood inundation maps. The setting for this study is a highly urbanized watershed along Darby Creek in Pennsylvania. A forecasting framework coupling the NWM with multiple hydraulic models was developed to produce a suite of ensembles of future flood inundation predictions. Time-lagged ensembles from the NWM short range forecasts were used to account for uncertainty associated with the hydrologic forecasts. The forecasts from the NWM were input to the iRIC and HEC-RAS two-dimensional software packages, from which water extent, depth, and flow velocity were output. Quantifying the agreement between output ensembles for each forecast grid provided the uncertainty metrics for predicted flood water inundation extent, depth, and flow velocity. For visualization, a series of flood maps displaying flood extent, water depth, and flow velocity, along with the underlying uncertainty associated with each of the forecasted variables, were produced. The results from this study demonstrate the potential to incorporate and visualize model uncertainties in flood inundation maps in order to identify high flood risk zones.
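
    One way to read the agreement between output ensembles for each forecast grid is as a per-cell exceedance frequency and spread; a minimal sketch with hypothetical depth ensembles rather than the authors' post-processing code:

        # depth[i, r, c]: water depth (m) for ensemble member i at grid cell (r, c).
        import numpy as np

        depth = np.random.default_rng(1).gamma(1.0, 0.3, size=(20, 4, 4))   # assumed values

        prob_flooded = (depth > 0.05).mean(axis=0)   # fraction of members exceeding 5 cm
        depth_spread = depth.std(axis=0)             # ensemble spread as an uncertainty metric
        print(prob_flooded)
        print(depth_spread)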

  3. Mosquito habitat and dengue risk potential in Kenya: alternative methods to traditional risk mapping techniques.

    PubMed

    Attaway, David F; Jacobsen, Kathryn H; Falconer, Allan; Manca, Germana; Rosenshein Bennett, Lauren; Waters, Nigel M

    2014-11-01

    Outbreaks, epidemics and endemic conditions make dengue a disease that has emerged as a major threat in tropical and sub-tropical countries over the past 30 years. Dengue fever creates a growing burden for public health systems and has the potential to affect over 40% of the world population. The problem being investigated is to identify the highest and lowest areas of dengue risk. This paper presents "Similarity Search", a geospatial analysis aimed at identifying these locations within Kenya. Similarity Search develops a risk map by combining environmental susceptibility analysis and geographical information systems, and then compares areas with dengue prevalence to all other locations. Kenya has had outbreaks of dengue during the past 3 years, and we identified areas with the highest susceptibility to dengue infection using bioclimatic variables, elevation and mosquito habitat as input to the model. Comparison of the modelled risk map with the reported dengue epidemic cases obtained from the open source reporting ProMED and Government news reports from 1982-2013 confirmed the high-risk locations that were used as the Similarity Search presence cells. Developing the risk model based upon the bioclimatic variables, elevation and mosquito habitat increased the efficiency and effectiveness of the dengue fever risk mapping process.

  4. Mapping and evaluation of snow avalanche risk using GIS technique in Rodnei National Park

    NASA Astrophysics Data System (ADS)

    Covǎsnianu, Adrian; Grigoraş, Ioan-Rǎducu; Covǎsnianu, Liliana-Elena; Iordache, Iulian; Balin, Daniela

    2010-05-01

    The study consisted of a precise mapping project (GPS field campaign, on-screen digitization of the topographic maps at 1:25,000 scale, updated with ASTER mission data) of the Rodnei National Park area (Romanian Carpathians) with a focus on snow avalanche risk survey. Parameters taken into account were slope, aspect, altitude, landforms and roughness, derived from a high-resolution numerical terrain model obtained from the ASTER (Advanced Spaceborne Thermal Emission and Reflection Radiometer) mission. The resulting digital surface model, with a spatial resolution of 10 m, covered a total area of 187 square kilometers and was improved with the help of the Topo to Raster tool. All these parameters were calibrated after a model applied to the Tatra Massif and also Ceahlău Mountain. The results were adapted and interpreted in accordance with the European avalanche hazard scale. This work was carried out in the context of the elaboration of the Risk Map and directly concerns both the security of tourism activities and the management of Rodnei National Park. The extension of this method to similar mountain areas is ongoing.

  5. Displaying and evaluating engineering properties and natural hazards using geomorphic mapping techniques: Telluride, Colorado

    SciTech Connect

    Gunawan, I.; Giardino, J.R.; Tchakerian, V.P. . Geography Dept.)

    1992-01-01

    Telluride, located in the San Juan mountains of southwestern Colorado, is situated in a glacially carved, fluvially modified alpine valley. Today this chic setting is experiencing rapid urban development resulting from flourishing tourist traffic during both the winter ski season and the summer vacation period. A new development, Mountain Village, is being built on an extensive and complex landslide that has only received superficial scrutiny. Recent fast growth is placing considerable pressure on pristine, undeveloped land. This timely quandary incorporates the interaction between prospective development, geomorphic processes, engineering factors, economic feasibility, and landuse adjudication. In an attempt to respond to these issues the State of Colorado enacted Senate Bill 35 (1972) and House Bills 1034 (1974) and 1041 (1974), all mandating assessment of the natural hazards of an area, preparatory to development. The key to evaluating the natural hazards is to comprehend the geomorphic processes. The area is highly-faulted with associated mineralization. Whereas the upper slopes are composed of massive rhyodacitic-tuff breccias and flows, the valley is sculpted from shales, sandstones, and conglomerates. Several periods of glaciation occurred in the area. Glacial till, talus slopes, avalanche chutes and cones, rock glaciers, alluvium, and landslides have been identified in the field and mapped on aerial photographs. Many of the slopes in the area are active. The authors have constructed a geomorphic map (1:12,500) that shows geology, landforms, geomorphic processes and engineering properties. This map can be used by regulatory agencies in identifying areas of natural hazards potentially sensitive to development.

  6. Evaluation of historical museum interior lighting system using fully immersive virtual luminous environment

    NASA Astrophysics Data System (ADS)

    Navvab, Mojtaba; Bisegna, Fabio; Gugliermetti, Franco

    2013-05-01

    Saint Rocco Museum, a historical building in Venice, Italy, is used as a case study to explore the performance of its lighting system and the impact of visible light on viewing the large-size artworks. The transition from three-dimensional architectural rendering to three-dimensional virtual luminance mapping and visualization within a virtual environment is described as an integrated optical method for its application toward preservation of the cultural heritage of the space. Lighting simulation programs represent color as RGB triplets in a device-dependent color space such as ITU-R BT.709. A prerequisite for this is a 3D model which can be created within this computer-aided virtual environment. The onsite measured surface luminance, chromaticity and spectral data were used as input to established real-time indirect illumination and physically based algorithms to produce the best approximation of RGB values for generating images of the objects. Conversion of RGB to and from spectra has been a major undertaking in order to match the infinite number of spectra that can create the same colors defined by RGB in the program. The ability to simulate light intensity, candle power and spectral power distributions provides the opportunity to examine the impact of color inter-reflections on historical paintings. VR offers an effective technique to quantify the visible light impact on human visual performance under precisely controlled representation of the light spectrum that can be experienced in 3D format in a virtual environment as well as in historical visual archives. The system can easily be expanded to include other measurements and stimuli.
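
    Since the abstract refers to RGB triplets in the device-dependent ITU-R BT.709 space, the standard relative-luminance combination for linear Rec. 709 components is sketched below; gamma decoding is omitted and the inputs are assumed to be already linearized:

        # Relative luminance from linear Rec. 709 RGB using the standard weights.
        def rec709_luminance(r, g, b):
            return 0.2126 * r + 0.7152 * g + 0.0722 * b

        print(rec709_luminance(1.0, 1.0, 1.0))   # 1.0 for reference white
        print(rec709_luminance(0.5, 0.2, 0.1))   # a dim reddish grey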

  7. An endoscope for simultaneous macroscopic navigation and microscopic inspection of luminal sidewalls

    NASA Astrophysics Data System (ADS)

    Leavesley, Silas; Sturgis, Jennifer; Robinson, J. Paul

    2008-02-01

    Endoscopic techniques are commonly used for esophageal and gastrointestinal screening. In this process, atypical regions are identified by gross visual and morphological changes. These regions are then biopsied for pathological confirmation prior to determining treatment. In an effort to increase the sensitivity of endoscopic screening, many groups have performed work in developing microscopic endoscopes capable of inspecting tissues on a cellular level. These microscopic endoscopes are generally implemented as either a stand-alone fiber or through the working channel of a traditional endoscope, and are oriented in a manner similar to traditional flexible endoscopes, imaging the region directly ahead of the endoscope with a wide-angle lens. However, this may not be the optimum configuration for microscopic inspection of luminal sidewalls. We present a novel optical configuration for an endoscope that can simultaneously function as a traditional forward-viewing macroscopic endoscope and as a sidewall-viewing microscopic endoscope. With the first prototype, we have realized a water-immersion microscope that is capable of imaging tissues at the single-cell level. In addition, the microscopic side-port configuration enables efficient mapping of the luminal wall. Utilizing simultaneous macroscopic and microscopic imaging, we are developing software for image registration and analysis that will enable localization of microscopic features within a macroscopic frame of reference. Through a combination of microscopic sidewall imaging and software for image analysis, we aim to provide the clinician with the equivalent of an in vivo biopsy, increasing screening effectiveness and decreasing discomfort and costs related to performing multiple biopsies of suspected regions.

  8. Object based technique for delineating and mapping 15 tree species using VHR WorldView-2 imagery

    NASA Astrophysics Data System (ADS)

    Mustafa, Yaseen T.; Habeeb, Hindav N.

    2014-10-01

    Monitoring and analyzing forests and trees are required tasks for managing and establishing a good plan for forest sustainability. To achieve such a task, information and data collection on the trees is required. The fastest and relatively low-cost technique is satellite remote sensing. In this study, we proposed an approach to identify and map 15 tree species in the Mangish sub-district, Kurdistan Region-Iraq. Image-objects (IOs) were used as the tree species mapping unit. This is achieved using the shadow index, normalized difference vegetation index and texture measurements. Four classification methods (Maximum Likelihood, Mahalanobis Distance, Neural Network, and Spectral Angle Mapper) were used to classify IOs using selected IO features derived from WorldView-2 imagery. Results showed that overall accuracy increased by 5-8% using the Neural Network method compared with the other methods, with a Kappa coefficient of 69%. This technique gives reasonable results for various tree species classifications by applying the Neural Network method with IO techniques to WorldView-2 imagery.

  9. A new strategy for snow-cover mapping using remote sensing data and ensemble based systems techniques

    NASA Astrophysics Data System (ADS)

    Roberge, S.; Chokmani, K.; De Sève, D.

    2012-04-01

    The snow cover plays an important role in the hydrological cycle of Quebec (Eastern Canada). Consequently, evaluating its spatial extent interests the authorities responsible for the management of water resources, especially hydropower companies. The main objective of this study is the development of a snow-cover mapping strategy using remote sensing data and ensemble-based systems techniques. Planned to be tested in a near real-time operational mode, this snow-cover mapping strategy has the advantage of providing the probability of a pixel being snow covered together with its uncertainty. Ensemble systems are made of two key components. First, a method is needed to build an ensemble of classifiers that is as diverse as possible. Second, an approach is required to combine the outputs of the individual classifiers that make up the ensemble in such a way that correct decisions are amplified and incorrect ones are cancelled out. In this study, we demonstrate the potential of ensemble systems for snow-cover mapping using remote sensing data. The chosen classifier is a sequential thresholds algorithm using NOAA-AVHRR data adapted to conditions over Eastern Canada. Its special feature is the use of a combination of six sequential thresholds varying according to the day in the winter season. Two versions of the snow-cover mapping algorithm have been developed: one specific to autumn (from October 1st to December 31st) and the other to spring (from March 16th to May 31st). In order to build the ensemble-based system, different versions of the algorithm are created by randomly varying its parameters. One hundred versions are included in the ensemble. The probability of a pixel being snow, no-snow or cloud covered corresponds to the fraction of classifiers that assign the pixel to that class. The overall performance of ensemble-based mapping is compared to the overall performance of the chosen classifier, and also with ground observations at meteorological stations.
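
    The voting step described above can be sketched as follows, with hypothetical labels standing in for the operational classifier outputs:

        # Class frequencies across the ensemble act as per-pixel probabilities.
        import numpy as np

        CLASSES = ("snow", "no-snow", "cloud")
        votes = np.random.default_rng(2).integers(0, 3, size=100)   # labels from 100 classifier versions

        counts = np.bincount(votes, minlength=len(CLASSES))
        probs = counts / votes.size
        print(dict(zip(CLASSES, probs)))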

  10. The INSAR technique: its principle and applications to mapping the deformation field of earthquakes

    NASA Astrophysics Data System (ADS)

    Shan, Xin-Jian; Ye, Hong

    1998-11-01

    The development, state of the art and prospects of application of the radar remote sensing technique are presented. The principle of the INSAR (Interferometric Synthetic Aperture Radar) technique is expounded in more detail. Some applications of this technique to measuring seismic dislocations are given. Finally, it is pointed out that INSAR has irreplaceable application potential for observing ground surface vertical deformations; it would provide an entirely new means and method for monitoring the dynamic field of earthquakes and give an extremely great impetus to future earthquake prediction work.
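
    The basic repeat-pass InSAR relation between an interferometric phase change and line-of-sight displacement can be sketched as follows; the wavelength and phase value are assumed, and the sign convention varies between processors:

        # Line-of-sight displacement from interferometric phase: d = lambda * dphi / (4 * pi).
        import math

        wavelength_m = 0.056            # assumed C-band radar wavelength
        dphi = math.pi                  # half a fringe of unwrapped phase change

        d_los = wavelength_m * dphi / (4.0 * math.pi)
        print(d_los)                    # 0.014 m of line-of-sight motion for half a fringe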

  11. Elemental X-ray mapping of agglutinated foraminifer tests: a non- destructive technique for determining compositional characteristics.

    USGS Publications Warehouse

    Commeau, R.F.; Reynolds, Leslie A.; Poag, C.W.

    1985-01-01

    The composition of agglutinated foraminiferal tests varies remarkably in response to local substrate characteristics, physiochemical properties of the water column and species-dependent selectivity of test components. We have employed a technique that combines a scanning electron microscope with an energy dispersive X-ray spectrometer system to identify major and minor elemental constituents of agglutinated foraminiferal walls. As a sample is bombarded with a beam of high-energy electrons, X-rays are generated that are characteristic of the elements present. As a result, X-ray density maps can be produced for each of several elements present in the tests of agglutinated foraminifers. -Authors

  12. Elemental X-ray mapping of agglutinated foraminifer tests: A non- destructive technique for determining compositional characteristics.

    USGS Publications Warehouse

    Commeau, R.F.; Reynolds, Leslie A.; Poag, C.W.

    1985-01-01

    The composition of agglutinated foraminiferal tests varies remarkably in response to local substrate characteristics, physiochemical properties of the water column and species-dependent selectivity of test components. We have employed a technique that combines a scanning electron microscope with an energy dispersive X-ray spectrometer system to identify major and minor elemental constituents of agglutinated foraminiferal walls. As a sample is bombarded with a beam of high-energy electrons, X-rays are generated that are characteristic of the elements present. As a result, X-ray density maps can be produced for each of several elements present in the tests of agglutinated foraminifers.

  13. Village Level Tsunami Threat Maps for Tamil Nadu, SE Coast of India: Numerical Modeling Technique

    NASA Astrophysics Data System (ADS)

    MP, J.; Kulangara Madham Subrahmanian, D.; V, R. M.

    2014-12-01

    The Indian Ocean tsunami (IOT) devastated several countries of the North Indian Ocean. India is one of the worst affected countries after Indonesia and Sri Lanka. In India, Tamil Nadu suffered the most, with fatalities exceeding 8,000 people. Historical records show that tsunamis have invaded the shores of Tamil Nadu in the past, and have made people realize that the tsunami threat looms over Tamil Nadu and that it is necessary to evolve strategies for tsunami threat management. The IOT has brought to light that tsunami inundation and runup varied within short distances, and for tsunami disaster management, large-scale maps showing areas that are likely to be affected by future tsunamis are needed. Therefore, threat assessment for six villages, including Mamallapuram (also called Mahabalipuram), which is famous for its rock-cut temples, from the northern part of the Tamil Nadu state of India has been carried out, and threat maps categorizing the coast into areas of different degrees of threat have been prepared. The threat was assessed by numerical modeling using the TUNAMI N2 code considering different tsunamigenic sources along the Andaman-Sumatra trench. GEBCO and C-Map data were used for bathymetry, while land elevation data were generated by an RTK-GPS survey for a distance of 1 km from the shore and SRTM for the inland areas. The model results show that in addition to the Sumatra source which generated the IOT in 2004, earthquakes originating in Car Nicobar and the North Andaman can inflict more damage. The North Andaman source can generate a massive tsunami, and an earthquake of magnitude greater than Mw 9 can affect not only Tamil Nadu but also the entire southeast coast of India. The runup water level is used to demarcate the tsunami threat zones in the villages using GIS.

  14. An Evaluation of Image Based Techniques for Early Wildfire Detection and Fuel Mapping

    SciTech Connect

    Gabbert, Dustin W.

    2015-05-01

    Few events can cause the catastrophic impact to ecology, infrastructure, and human safety of a wildland fire along the wildland urban interface. The suppression of natural wildland fires over the past decade has caused a buildup of dry, dead surface fuels: a condition that, coupled with the right weather conditions, can cause large destructive wildfires that are capable of threatening both ancient tree stands and manmade infrastructure. Firefighters use fire danger models to determine staffing needs on high fire risk days; however, models are only as effective as the spatial and temporal density of their observations. OKFIRE, an Oklahoma initiative created by a partnership between Oklahoma State University and the University of Oklahoma, has proven that fire danger assessments close to the fire – both geographically and temporally – can give firefighters a significant increase in their situational awareness while fighting a wildland fire. This paper investigates several possible solutions for a small Unmanned Aerial System (UAS) which could gather information useful for detecting ground fires and constructing fire danger maps. Multiple fire detection and fuel mapping programs utilize satellites, manned aircraft, and large UAS equipped with hyperspectral sensors to gather useful information. Their success provides convincing proof of the utility that could be gained from low-altitude UAS gathering information at the exact time and place firefighters and land managers are interested in. Close proximity, both geographically and operationally, to the end user can reduce latency times below what could ever be possible with satellite observation. This paper expands on recent advances in computer vision, photogrammetry, and infrared and color imagery to develop a framework for a next-generation UAS which can assess fire danger and aid firefighters in real time as they observe, contain, or extinguish wildland fires. It also investigates the impact information gained by this

  15. An evaluation of image based techniques for wildfire detection and fuel mapping

    NASA Astrophysics Data System (ADS)

    Gabbert, Dustin W.

    Few events can cause the catastrophic impact to ecology, infrastructure, and human safety of a wildland fire along the wildland urban interface. The suppression of natural wildland fires over the past decade has caused a buildup of dry, dead surface fuels: a condition that, coupled with the right weather conditions, can cause large destructive wildfires that are capable of threatening both ancient tree stands and manmade infrastructure. Firefighters use fire danger models to determine staffing needs on high fire risk days; however, models are only as effective as the spatial and temporal density of their observations. OKFIRE, an Oklahoma initiative created by a partnership between Oklahoma State University and the University of Oklahoma, has proven that fire danger assessments close to the fire - both geographically and temporally - can give firefighters a significant increase in their situational awareness while fighting a wildland fire. This paper investigates several possible solutions for a small Unmanned Aerial System (UAS) which could gather information useful for detecting ground fires and constructing fire danger maps. Multiple fire detection and fuel mapping programs utilize satellites, manned aircraft, and large UAS equipped with hyperspectral sensors to gather useful information. Their success provides convincing proof of the utility that could be gained from low-altitude UAS gathering information at the exact time and place firefighters and land managers are interested in. Close proximity, both geographically and operationally, to the end user can reduce latency times below what could ever be possible with satellite observation. This paper expands on recent advances in computer vision, photogrammetry, and infrared and color imagery to develop a framework for a next-generation UAS which can assess fire danger and aid firefighters in real time as they observe, contain, or extinguish wildland fires. It also investigates the impact information gained by this

  16. Conformal mapping technique for two-dimensional porous media and jet impingement heat transfer

    NASA Technical Reports Server (NTRS)

    Siegel, R.

    1974-01-01

    Transpiration cooling and liquid metals both provide highly effective heat transfer. Using Darcy's law in porous media and the inviscid approximation for liquid metals, the local fluid velocity in these flows equals the gradient of a potential. The energy equation and flow region are simplified when transformed into potential plane coordinates. In these coordinates, the present problems are reduced to heat conduction solutions which are mapped into the physical geometry. Results are obtained for a porous region with simultaneously prescribed surface temperature and heat flux, heat transfer in a two-dimensional porous bed, and heat transfer for two liquid metal slot jets impinging on a heated plate.

  17. Conformal mapping technique for two-dimensional porous media and jet impingement heat transfer

    NASA Technical Reports Server (NTRS)

    Siegel, R.

    1973-01-01

    Transpiration cooling and liquid metals both provide highly effective heat transfer. Using Darcy's law in porous media, and the inviscid approximation for liquid metals, the local fluid velocity in these flows equals the gradient of a potential. The energy equation and flow region are simplified when transformed into potential plane coordinates. In these coordinates the present problems are reduced to heat conduction solutions which are mapped into the physical geometry. Results are obtained for a porous region with simultaneously prescribed surface temperature and heat flux, heat transfer in a two-dimensional porous bed, and heat transfer for two liquid metal slot jets impinging on a heated plate.

  18. Conformal mapping technique for two-dimensional porous media and jet impingement heat transfer

    NASA Technical Reports Server (NTRS)

    Siegel, R.

    1974-01-01

    Transpiration cooling and liquid metals both provide highly effective heat transfer. Using Darcy's law in porous media and the inviscid approximation for liquid metals, the local fluid velocity in these flows equals the gradient of a potential. The energy equation and flow region are simplified when transformed into potential plane coordinates. In these coordinates, the present problems are reduced to heat conduction solutions which are mapped into the physical geometry. Results are obtained for a porous region with simultaneously prescribed surface temperature and heat flux, heat transfer in a two-dimensional porous bed, and heat transfer for two liquid metal slot jets impinging on a heated plate.

  19. Building perceptual color maps for visualizing interval data

    NASA Astrophysics Data System (ADS)

    Kalvin, Alan D.; Rogowitz, Bernice E.; Pelah, Adar; Cohen, Aron

    2000-06-01

    In visualization, a 'color map' maps a range of data values onto a scale of colors. However, unless a color map is carefully constructed, visual artifacts can be produced. This problem has stimulated considerable interest in creating perceptually based color maps, that is, color maps where equal steps in data value are perceived as equal steps in the color map [Robertson, 1988; Pizer, 1981; Green, 1992; Lefkowitz and Herman, 1992]. In Rogowitz and Treinish (1996, 1998) and in Bergman, Treinish and Rogowitz (1995), we demonstrated that color maps based on luminance or saturation could be good candidates for satisfying this requirement. This work is based on the seminal work of S.S. Stevens (1966), who measured the perceived magnitude of different magnitudes of physical stimuli. He found that for many physical scales, including luminance (cd/m2) and saturation (the 'redness' of a long-wavelength light source), equal ratios in stimulus value produced equal ratios in perceptual magnitude. He interpreted this as indicating that there exists in human cognition a common scale for representing magnitude, and that we scale the effects of different physical stimuli to this internal scale. In Rogowitz, Kalvin, Pelah and Cohen (1999), we used a psychophysical technique to test this hypothesis as it applies to the creation of perceptually uniform color maps. We constructed color maps as trajectories through three color spaces: a common computer graphics standard (uncalibrated HSV), a common perceptually based engineering standard for creating visual stimuli (L*a*b*), and a space commonly used in the graphic arts (Munsell). For each space, we created color scales that varied linearly in hue, saturation, or luminance and measured the detectability of increments in hue, saturation or luminance for each of these color scales. We measured the amplitude of the just-detectable Gaussian increments at 20 different values along the range of each color map. For all three color spaces
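
    As an aside on the luminance-based color maps discussed above, a gray ramp with equal steps in CIE L*, an approximately perceptual lightness scale, can be converted back to relative luminance as sketched below; this mirrors the goal of perceptual uniformity and is not the authors' procedure:

        # Equal steps in CIE L* mapped back to relative luminance Y (CIE inverse).
        import numpy as np

        L_star = np.linspace(0.0, 100.0, 9)              # equal perceptual steps
        Y = np.where(L_star > 8.0,
                     ((L_star + 16.0) / 116.0) ** 3,     # cube-law branch of the CIE inverse
                     L_star / 903.3)                     # linear branch near black
        print(Y)                                         # luminance values for a perceptual gray ramp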

  20. SUB-LUMINOUS γ-RAY PULSARS

    SciTech Connect

    Romani, R. W.; Kerr, M.; Craig, H. A.; Johnston, S.; Cognard, I.; Smith, D. A.

    2011-09-01

    Most pulsars observed by the Fermi Large Area Telescope have γ-ray luminosities scaling with spin-down power Ė as L_γ ≈ (Ė × 10^33 erg s^-1)^(1/2). However, there exist one detection and several upper limits an order of magnitude or more fainter than this trend. We describe these 'sub-luminous' γ-ray pulsars and discuss the case for this being an orientation effect. Of the 12 known young radio pulsars with Ė > 10^34 erg s^-1 and d ≤ 2 kpc, several are substantially sub-luminous. The limited available geometrical constraints favor aligned geometries for these pulsars, although no one case for alignment is compelling. In this scenario GeV emission detected from such sub-luminous pulsars can be due to a lower altitude, lower-power accelerator gap.

  1. Sub-luminous γ-ray pulsars

    SciTech Connect

    Romani, R. W.; Kerr, M.; Craig, H. A.; Johnston, S.; Cognard, I.; Smith, D. A.

    2011-08-17

    Here, most pulsars observed by the Fermi Large Area Telescope have γ-ray luminosities scaling with spin-down power $\dot{E}$ as $L_\gamma \approx (\dot{E} \times 10^{33}\,\mathrm{erg\,s^{-1}})^{1/2}$. However, there exist one detection and several upper limits an order of magnitude or more fainter than this trend. We describe these "sub-luminous" γ-ray pulsars and discuss the case for this being an orientation effect. Of the 12 known young radio pulsars with $\dot{E} > 10^{34}\,\mathrm{erg\,s^{-1}}$ and d ≤ 2 kpc, several are substantially sub-luminous. The limited available geometrical constraints favor aligned geometries for these pulsars, although no one case for alignment is compelling. In this scenario GeV emission detected from such sub-luminous pulsars can be due to a lower altitude, lower-power accelerator gap.
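
    As a quick numerical illustration of the quoted trend, with an example spin-down power that is assumed rather than taken from the paper:

        # Heuristic trend: L_gamma ~ (Edot * 1e33 erg/s)**0.5.
        E_dot = 1.0e35                         # erg/s, hypothetical spin-down power
        L_gamma = (E_dot * 1.0e33) ** 0.5      # erg/s
        print(f"{L_gamma:.1e}")                # 1.0e+34 erg/s, i.e. about 10% of E_dot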

  2. Three-dimensional compositional mapping using double-pulse micro-laser-induced breakdown spectroscopy technique

    NASA Astrophysics Data System (ADS)

    Grassi, R.; Grifoni, E.; Gufoni, S.; Legnaioli, S.; Lorenzetti, G.; Macro, N.; Menichetti, L.; Pagnotta, S.; Poggialini, F.; Schiavo, C.; Palleschi, V.

    2017-01-01

    In this communication, we present the development of the first double-pulse micro-LIBS (DP-μLIBS) instrument for three-dimensional compositional mapping of materials. The system allows for high-resolution three-dimensional scanning of materials; its advantages with respect to conventional single-pulse micro-LIBS systems are described and discussed. As a test example, we analyzed three Euro coins to show the performance of the system on homogeneous samples (20 Eurocents), heterogeneous samples (1 Euro) and layered samples (5 Eurocents). DP-μLIBS cannot provide isotopic information and typically has limits of detection for the elements of interest much higher than those of Laser Ablation-Inductively Coupled Plasma-Mass Spectrometry (LA-ICP-MS). However, when isotopic information and extreme sensitivity to trace elements are not necessary, the results obtained show that DP-μLIBS can be a viable alternative to LA-ICP-MS for the acquisition of high-resolution three-dimensional compositional maps.

  3. High-Resolution Debris Flow Volume Mapping with Unmanned Aerial Systems (uas) and Photogrammetric Techniques

    NASA Astrophysics Data System (ADS)

    Adams, M. S.; Fromm, R.; Lechner, V.

    2016-06-01

    Debris flows cause an average of €30 million in damages and 1-2 fatalities every year in Austria. Detailed documentation of their extent and magnitude is essential for understanding, preventing and mitigating these natural hazard events. The recent development of unmanned aerial systems (UAS) has provided a new possibility for on-demand high-resolution monitoring and mapping. Here, we present a study where the spatial extent and volume of a large debris flow event were mapped with different UAS, fitted with commercial off-the-shelf sensors. Orthophotos and digital terrain models (DTM) were calculated using structure-from-motion photogrammetry software. Terrain height differences caused by the debris flow in the catchment and on the valley floor were derived by subtracting the pre-event airborne laser scanning (ALS) DTM from a post-event UAS-DTM. The analysis of the volumetric sediment budget showed that approximately 265,000 m³ of material was mobilised in the catchment, of which 45,000 m³ settled there; of the material which reached the valley floor, 120,000 m³ was deposited, while another 10,000 m³ was eroded from there. The UAS results were validated against ALS data and imagery from a traditional manned-aircraft photogrammetry campaign. In conclusion, UAS data can reach an accuracy and precision comparable to manned aircraft data, but with the added benefits of higher flexibility, easier repeatability, fewer operational constraints and higher spatial resolution.
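
    The volumetric sediment budget reduces to DTM differencing summed over the cell area; a minimal sketch with entirely hypothetical elevation changes and cell size:

        # Split the post-minus-pre elevation change into deposition and erosion volumes.
        import numpy as np

        cell_area_m2 = 0.25                                         # e.g. a 0.5 m DTM grid (assumed)
        dz = np.random.default_rng(3).normal(0.0, 0.5, (100, 100))  # post-event minus pre-event DTM, m

        deposition_m3 = float(dz[dz > 0].sum() * cell_area_m2)
        erosion_m3 = float(-dz[dz < 0].sum() * cell_area_m2)
        print(deposition_m3, erosion_m3)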

  4. Nanoscale structural and functional mapping of nacre by scanning probe microscopy techniques

    NASA Astrophysics Data System (ADS)

    Zhou, Xilong; Miao, Hongchen; Li, Faxin

    2013-11-01

    Nacre has received great attention due to its nanoscale hierarchical structure and extraordinary mechanical properties. Meanwhile, the nanoscale piezoelectric properties of nacre have also been investigated but the structure-function relationship has never been addressed. In this work, firstly we realized quantitative nanomechanical mapping of nacre of a green abalone using atomic force acoustic microscopy (AFAM). The modulus of the mineral tablets is determined to be ~80 GPa and that of the organic biopolymer no more than 23 GPa, and the organic-inorganic interface width is determined to be about 34 +/- 9 nm. Then, we conducted both AFAM and piezoresponse force microscopy (PFM) mapping in the same scanning area to explore the correlations between the nanomechanical and piezoelectric properties. The PFM testing shows that the organic biopolymer exhibits a significantly stronger piezoresponse than the mineral tablets, and they permeate each other, which is very difficult to reproduce in artificial materials. Finally, the phase hysteresis loops and amplitude butterfly loops were also observed using switching spectroscopy PFM, implying that nacre may also be a bio-ferroelectric material. The obtained nanoscale structural and functional properties of nacre could be very helpful in understanding its deformation mechanism and designing biomimetic materials of extraordinary properties.

  5. Comparison of VTEC from ground-based space geodetic techniques based on ray-traced mapping factors

    NASA Astrophysics Data System (ADS)

    Heinkelmann, Robert; Alizadeh, M. Mahdi; Schuh, Harald; Deng, Zhiguo; Zus, Florian; Etemadfard, M. Hossein

    2016-07-01

    For the derivation of vertical total electron content (VTEC) from slant total electron content (STEC), usually a standard approach is used based on mapping functions that assume a single-layer model of the ionosphere (e.g. IERS Conventions 2010). In our study we test the standard approach against a recently developed alternative which is based on station specific ray-traced mapping factors. For the evaluation of this new mapping concept, we compute VTEC at selected Very Long Baseline Interferometry (VLBI) stations using the dispersive delays and the corresponding formal errors obtained by observing extra-galactic radio sources at two radio frequencies in S- and X-bands by the permanent geodetic/astrometric program organized by the IVS (International VLBI Service for Geodesy and Astrometry). Additionally, by applying synchronous sampling and a consistent analysis configuration, we determine VTEC at Global Navigation Satellite System (GNSS) antennas using GPS (Global Positioning System) and/or GLONASS (Globalnaja nawigazionnaja sputnikowaja Sistema) observations provided by the IGS (International GNSS Service) that are operated in the vicinity of the VLBI antennas. We compare the VTEC time series obtained by the individual techniques over a period of about twenty years and describe their characteristics qualitatively and statistically. The length of the time series allows us to assess the long-term climatology of ionospheric VTEC during the last twenty years.
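
    The standard single-layer approach mentioned above maps STEC to VTEC with a thin-shell geometric factor; a minimal sketch, with the shell height and all numerical values assumed for illustration:

        # Thin-shell mapping: VTEC = STEC * cos(z'), with sin(z') = R / (R + H) * sin(z).
        import math

        R = 6371.0          # mean Earth radius, km
        H = 450.0           # assumed single-layer (shell) height, km
        stec = 40.0         # slant TEC in TECU (hypothetical observation)
        z = math.radians(60.0)                        # zenith angle at the station

        sin_zp = R / (R + H) * math.sin(z)
        vtec = stec * math.sqrt(1.0 - sin_zp ** 2)    # equals stec * cos(z')
        print(vtec)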

  6. Experimental and Automated Analysis Techniques for High-resolution Electrical Mapping of Small Intestine Slow Wave Activity

    PubMed Central

    Angeli, Timothy R; O'Grady, Gregory; Paskaranandavadivel, Niranchan; Erickson, Jonathan C; Du, Peng; Pullan, Andrew J; Bissett, Ian P

    2013-01-01

    Background/Aims Small intestine motility is governed by an electrical slow wave activity, and abnormal slow wave events have been associated with intestinal dysmotility. High-resolution (HR) techniques are necessary to analyze slow wave propagation, but progress has been limited by few available electrode options and laborious manual analysis. This study presents novel methods for in vivo HR mapping of small intestine slow wave activity. Methods Recordings were obtained from along the porcine small intestine using flexible printed circuit board arrays (256 electrodes; 4 mm spacing). Filtering options were compared, and analysis was automated through adaptations of the falling-edge variable-threshold (FEVT) algorithm and graphical visualization tools. Results A Savitzky-Golay filter was chosen with polynomial-order 9 and window size 1.7 seconds, which maintained 94% of slow wave amplitude, 57% of gradient and achieved a noise correction ratio of 0.083. Optimized FEVT parameters achieved 87% sensitivity and 90% positive-predictive value. Automated activation mapping and animation successfully revealed slow wave propagation patterns, and frequency, velocity, and amplitude were calculated and compared at 5 locations along the intestine (16.4 ± 0.3 cpm, 13.4 ± 1.7 mm/sec, and 43 ± 6 µV, respectively, in the proximal jejunum). Conclusions The methods developed and validated here will greatly assist small intestine HR mapping, and will enable experimental and translational work to evaluate small intestine motility in health and disease. PMID:23667749
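
    The filtering step quoted above (Savitzky-Golay, polynomial order 9, roughly 1.7 s window) can be sketched as follows; the sampling rate, and hence the 51-sample window, is an assumption for illustration, and SciPy is used rather than the authors' own pipeline:

        # Smooth a synthetic ~16 cpm slow-wave trace with a Savitzky-Golay filter.
        import numpy as np
        from scipy.signal import savgol_filter

        fs = 30.0                                      # assumed sampling rate, Hz
        t = np.arange(0.0, 60.0, 1.0 / fs)
        rng = np.random.default_rng(4)
        raw = np.sin(2 * np.pi * (16.4 / 60.0) * t) + 0.2 * rng.normal(size=t.size)

        window = int(round(1.7 * fs)) // 2 * 2 + 1     # nearest odd sample count (= 51 here)
        smoothed = savgol_filter(raw, window_length=window, polyorder=9)
        print(window, smoothed[:3])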

  7. Preliminary Detection Model of Rapid Mapping Technique for Landslide Susceptibility Zone Using Multi Sensor Imagery (Case Study in Banjarnegara Regency)

    NASA Astrophysics Data System (ADS)

    Yanuarsyah, I.; Khairiah, R. N.

    2017-01-01

    This study is a preliminary stage of landslide disaster mitigation in Banjarnegara Regency, utilizing a combination of multi-sensor imagery to provide an overview of forest cover change patterns, supported by other parameters such as rainfall, slope, aspect, and hill curvature. The objective is to develop a detection model as a rapid mapping technique for delineating landslide susceptibility zones. This information is used as a basis for early detection of landslides that may potentially happen in the future. There are four main processes: optical image processing, SAR image processing, DEM processing and scoring geoprocessing. The final zones can be verified by checking whether particular landslide event locations fall within them on the resulting map. The “big five” districts with the highest landslide susceptibility were obtained: Batur, Pejawaran, Wanayasa, Kalibening and Rakit districts. The total susceptibility zone in Banjarnegara Regency covers approximately 604.79 ha, with 15,250 prone point locations, classified as 14.16 ha of low zone, 286.41 ha of moderate zone and 304.22 ha of high zone. This study demonstrates rapid mapping of the enormous potential landslide occurrences investigated through susceptibility zones. In terms of landslide prone points, the combination of optical and SAR imagery is sufficient to capture post-event forest cover changes and can also be overlaid with other causative parameters.
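
    A minimal sketch of the scoring geoprocessing step described above; the factor weights and class breaks below are hypothetical and are not those used in the study:

        # Weighted overlay of normalized causative-factor rasters, binned into
        # low / moderate / high susceptibility classes.
        import numpy as np

        rng = np.random.default_rng(5)
        rainfall, slope, curvature = (rng.random((4, 4)) for _ in range(3))   # normalized 0..1
        weights = {"rainfall": 0.4, "slope": 0.4, "curvature": 0.2}           # hypothetical weights

        score = (weights["rainfall"] * rainfall
                 + weights["slope"] * slope
                 + weights["curvature"] * curvature)
        classes = np.digitize(score, [0.33, 0.66])    # 0 = low, 1 = moderate, 2 = high
        print(classes)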

  8. Mesopic luminance assessed with minimum motion photometry.

    PubMed

    Raphael, Sabine; MacLeod, Donald I A

    2011-08-25

    We measured the relative contribution of rods and cones to luminance across a range of photopic, mesopic, and scotopic adaptation levels and at various retinal eccentricities. We isolated the luminance channel by setting motion-based luminance nulls (minimum motion photometry) using annular stimuli. Luminance nulls between differently colored stimuli require equality in a weighted sum of rod and cone excitations. The relative cone weight increases smoothly from the scotopic range, where rods dominate, to photopic levels, where rod influence becomes negligible. The change from rod to cone vision does not occur uniformly over the visual field. The more peripheral the stimulus location, the higher is the light level required for cones to participate strongly. The relative cone contribution can be described by a sigmoid function of intensity, with two parameters that each depend on the eccentricity and spatial frequency of the stimulus. One parameter determines the "meso-mesopic" luminance--the center of the mesopic range, at which rod and cone contributions are balanced. This increases with eccentricity, reflecting an increase in the meso-mesopic luminance from 0.04 scotopic cd/m(2) at 2° eccentricity to 0.44 scotopic cd/m(2) at 18°. The second parameter represents the slope of the log-log threshold-versus-intensity curve (TVI curve) for rod vision. This parameter inversely scales the width of the mesopic range and increases only slightly with eccentricity (from 0.73 at 2° to 0.78 for vision at 18° off-axis).
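
    As a hedged illustration of the two-parameter sigmoid described above, the sketch below models the relative cone weight as a logistic function of luminance centred on the meso-mesopic level. The exact parameterization used in the paper may differ; the default values are the 2° figures quoted in the abstract.

      import numpy as np

      def relative_cone_weight(I, I0=0.04, s=0.73):
          """Relative cone contribution (0 = rods only, 1 = cones only).

          I  : luminance in scotopic cd/m^2
          I0 : meso-mesopic luminance (balance point; 0.04 at 2 deg eccentricity)
          s  : slope parameter (larger s -> narrower mesopic range)
          """
          return 1.0 / (1.0 + (I0 / np.asarray(I, dtype=float)) ** s)

      levels = np.logspace(-4, 2, 7)     # scotopic through photopic, scotopic cd/m^2
      print(np.round(relative_cone_weight(levels), 3))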

  9. The metallicities of luminous, massive field galaxies at intermediate redshifts

    NASA Astrophysics Data System (ADS)

    Mouhcine, M.; Bamford, S. P.; Aragón-Salamanca, A.; Nakamura, O.; Milvang-Jensen, B.

    2006-06-01

    We derive oxygen abundances for a sample of 40 luminous (MB <~ -19), star-forming, mostly disc, field galaxies with redshifts in the range 0.2 <~ z <~ 0.8, with a median of z = 0.45. Oxygen abundances, relative to hydrogen, of the interstellar emitting gas are estimated by means of the empirically calibrated strong emission-line ratio technique. The derived 12 + log (O/H) values range from 8.4 to 9.0, with a median of 8.7. Twenty of these galaxies have securely measured rotation velocities, in the range 50-244 km s-1. The measured emission-line equivalent widths and diagnostic ratios for the intermediate redshift galaxies cover similar ranges to those observed across a large sample of local galaxies. The estimated oxygen abundances for our luminous star-forming intermediate redshift galaxies cover the same range as their local counterparts. However, at a given galaxy luminosity, many of our galaxies have significantly lower oxygen abundances, i.e. 12 + log (O/H) ~8.6, than local galaxies with similar luminosities. Interestingly, these luminous, massive, intermediate redshift, star-forming galaxies with low oxygen abundances exhibit physical conditions, i.e. emission-line equivalent width and ionization state, very similar to those of local faint and metal-poor star-forming galaxies. The oxygen abundance of the interstellar gas does not seem to correlate with the maximum rotation velocity or the emission scalelength of the parent galaxy. This suggests that there is diversity in the intrinsic properties of the massive field galaxy population at intermediate redshifts. The distribution of the colour excess, derived from the ratio of extinction-uncorrected Hβ and [OII]λ3727 star formation rate indicators, covers a similar range to that observed locally, but exhibits a lower mean than is observed for local optically selected star-forming galaxies. Luminous field galaxies at intermediate redshifts show similar star formation rates to their local counterparts. However

  10. The luminous starburst galaxy UGC 8387

    NASA Technical Reports Server (NTRS)

    Smith, Denise A.; Herter, Terry; Haynes, Martha P.; Beichman, C. A.; Gautier, T. N. Iii

    1995-01-01

    We present broad-band J, H, and K images and K-band spectroscopy of the luminous starburst galaxy UGC 8387. The images show a disturbed morphology, tidal tails, and a single elongated nucleus. Near-infrared color maps constructed from the images reveal that the nuclear region is highly reddened. Strong emission from the central 3 arcseconds in the 2.166 micrometer Brackett gamma, 2.122 micrometer H2 v = 1-0 S(1), and 2.058 micrometer He I lines is present in the K-band spectrum. From the Brackett gamma and published radio fluxes, we find an optical depth toward the nucleus of tau(sub V) approximately 24. The CO band heads produce strong absorption in the spectral region longward of 2.3 micrometers. We measure a 'raw' CO index of 0.17 +/- 0.02 mag, consistent with a population of K2 supergiants or K4 giants. The nuclear colors, however, are not consistent with an obscured population of evolved stars. Instead, the red colors are best explained by an obscured mixture of stellar and warm dust emission. The amount of dust emission predicted by the near-infrared colors exceeds that expected from comparisons to galactic H II regions. After correcting the spectrum of UGC 8387 for dust emission and extinction, we obtain a CO index of greater than or equal to 0.25 mag. This value suggests the stellar component of the 2.2 micrometer light is dominated by young supergiants. The infrared excess, L(sub IR)/L(sub Ly alpha), derived for UGC 8387 is lower than that observed in galactic H II regions and M82. This implies that either the lower or upper mass cutoff of the initial mass function must be higher than those of local star-forming regions and M82. The intense nuclear starburst in this galaxy is presumably the result of merger activity, and we estimate the starburst age to be at least a few times 10(exp 7) yr.

  11. Osteochondritis dissecans of the capitellum: lesion size and pattern analysis using quantitative 3-dimensional computed tomography and mapping technique.

    PubMed

    Bexkens, Rens; Oosterhoff, Jacobien H; Tsai, Tsung-Yuan; Doornberg, Job N; van den Bekerom, Michel P J; Eygendaal, Denise; Oh, Luke S

    2017-09-01

    The goals of this study were to evaluate the reliability of a quantitative 3-dimensional computed tomography (Q3DCT) technique for measurement of the capitellar osteochondritis dissecans (OCD) surface area, to analyze OCD distribution using a mapping technique, and to investigate associations between Q3DCT lesion quantification and demographic characteristics and/or clinical examination findings. We identified patients with capitellar OCD who presented to our orthopedic sports medicine practice between January 2001 and January 2016 and who had undergone a preoperative computed tomography scan (slice thickness ≤1.25 mm). A total of 17 patients with a median age of 15 years (range, 12-23 years) were included in this study. Three-dimensional polygon models were reconstructed after osseous structures were marked in 3 planes. Surface areas of the OCD lesion as well as the capitellum were measured. Observer agreement was assessed with the intraclass correlation coefficient (ICC). Heat maps were created to visualize OCD distribution. Measurements of the OCD surface area showed almost perfect intraobserver agreement (ICC, 0.99; confidence interval [CI], 0.98-0.99) and interobserver agreement (ICC, 0.93; CI, 0.86-0.97). Measurements of the capitellar surface area also showed almost perfect intraobserver agreement (ICC, 0.97;CI, 0.91-0.99) and interobserver agreement (ICC, 0.86; CI, 0.46-0.96). The median OCD surface area was 101 mm(2) (range, 49-217 mm(2)). On the basis of OCD heat mapping, the posterolateral zone of the capitellum was most frequently affected. OCDs in which the lateral wall was involved were associated with larger lesion size (P = .041), longer duration of symptoms (P = .030), and worse elbow extension (P = .013). The ability to quantify the capitellar OCD surface area and lesion location in a reliable manner using Q3DCT and a mapping technique should be considered when detailed knowledge of lesion size and location is desired

  12. Spinal cord mapping as an adjunct for resection of intramedullary tumors: surgical technique with case illustrations.

    PubMed

    Quinones-Hinojosa, Alfredo; Gulati, Mittul; Lyon, Russell; Gupta, Nalin; Yingling, Charles

    2002-11-01

    Resection of intramedullary spinal cord tumors may result in transient or permanent neurological deficits. Intraoperative somatosensory evoked potentials (SSEPs) and motor evoked potentials are commonly used to limit complications. We used both antidromically elicited SSEPs for planning the myelotomy site and direct mapping of spinal cord tracts during tumor resection to reduce the risk of neurological deficits and increase the extent of tumor resection. In two patients, 3 and 12 years of age, with tumors of the thoracic and cervical spinal cord, respectively, antidromically elicited SSEPs were evoked by stimulation of the dorsal columns and were recorded with subdermal electrodes placed at the medial malleoli bilaterally. Intramedullary spinal cord mapping was performed by stimulating the resection cavity with a handheld Ojemann stimulator (Radionics, Burlington, MA). In addition to visual observation, subdermal needle electrodes inserted into the abductor pollicis brevis-flexor digiti minimi manus, tibialis anterior-gastrocnemius, and abductor halluces-abductor digiti minimi pedis muscles bilaterally recorded responses that identified motor pathways. The midline of the spinal cord was anatomically identified by visualizing branches of the dorsal medullary vein penetrating the median sulcus. Antidromic responses were obtained by stimulation at 1-mm intervals on either side of the midline, and the region where no response was elicited was selected for the myelotomy. The anatomic and electrical midlines did not precisely overlap. Stimulation of abnormal tissue within the tumor did not elicit electromyographic activity. Approaching the periphery of the tumor, stimulation at 1 mA elicited an electromyographic response before normal spinal cord was visualized. Restimulation at lower currents by use of 0.25-mA increments identified the descending motor tracts adjacent to the tumor. After tumor resection, the tracts were restimulated to confirm functional integrity. Both

  13. X-ray mapping technique: a preliminary study in discriminating gunshot residue particles from aggregates of environmental occupational origin.

    PubMed

    Cardinetti, Bruno; Ciampini, Claudio; D'Onofrio, Carlo; Orlando, Giovanni; Gravina, Luciano; Ferrari, Francesco; Di Tullio, Donatello; Torresi, Luca

    2004-06-30

    The possibility of detecting lead-antimony-barium aggregates from non-firearm sources is confirmed by tests performed on brake pads and on firework and automobile workers. Moreover, information on particles taken from cartridge cases shows the relatively minor importance of morphology in distinguishing gunshot residues (GSRs). Furthermore, the presence of other elements (e.g., iron) in the spectrum is not conclusive either. Against this background, the possibility of discriminating gunshot residue particles from other non-firearm lead-antimony-barium aggregates is investigated: the proposed method is based on the X-ray mapping technique--currently used in the Reparto Carabinieri Investigazioni Scientifiche in Rome, the forensic service of the Italian Carabinieri--according to which the spatial distribution of the emission energy of each element of the sample is pictured. Gunshot residues present the same lead-antimony-barium distribution (or at least the same antimony-barium distribution with lead nodules), whereas some other environmental occupational aggregates do not (showing separate plaques of lead, antimony, and barium). Thus, the X-ray mapping technique can offer a new fundamental evaluation parameter in the analysis of gunshot residues with scanning electron microscopy/energy-dispersive spectrometry (SEM/EDS), and new standards could be considered.

  14. Detection of terrain indices related to soil salinity and mapping salt-affected soils using remote sensing and geostatistical techniques.

    PubMed

    Triki Fourati, Hela; Bouaziz, Moncef; Benzina, Mourad; Bouaziz, Samir

    2017-04-01

    Traditional surveying methods of soil properties over landscapes are dramatically costly and time-consuming. Thus, remote sensing is a proper choice for monitoring environmental problems. This research aims to study the effect of environmental factors on soil salinity and to map the spatial distribution of this salinity over the south-eastern part of Tunisia by means of remote sensing and geostatistical techniques. For this purpose, we used Advanced Spaceborne Thermal Emission and Reflection Radiometer data to depict geomorphological parameters: elevation, slope, plan curvature (PLC), profile curvature (PRC), and aspect. Pearson correlation between these parameters and soil electrical conductivity (ECsoil) showed that mainly slope and elevation affect the concentration of salt in soil. Moreover, spectral analysis illustrated the high potential of short-wave infrared (SWIR) bands to identify saline soils. To map soil salinity in southern Tunisia, ordinary kriging (OK), minimum distance (MD) classification, and simple regression (SR) were used. The findings showed that the ordinary kriging technique provides the most reliable performance for identifying and classifying saline soils over the study area, with a root mean square error of 1.83 and a mean error of 0.018.

  15. Land use/ cover mapping of the dry and wet season of Kikuletwa catchment using GIS and remote sensing techniques

    NASA Astrophysics Data System (ADS)

    Msigwa, Anna; vangriensven, Ann; Komakech, Hans; Verbeiren, Boud

    2017-04-01

    Management of water resources has become complicated due to the lack of reliable information on the water uses of different sectors. The quantification of water consumption has concentrated on modified and cultivated areas, but often lacks a correct representation of agricultural water management practices (crop rotations, drip irrigation) while leaving out the water consumption of natural ecosystems (forest, barren land, grazed grassland and shrubland or thickets). A detailed land use map can help water resources scientists and managers to better quantify the water uses of these ecosystems. However, the hydrological seasons are most of the time not considered in developing land use maps. The objective of this study was to develop land use maps for the two main seasons (dry and wet) of the semi-arid Kikuletwa catchment, Tanzania. Three Landsat 8 images from March, August and November 2016 were obtained and cloud masked. Ground truthing points and questionnaire surveys regarding the cropping system were collected during August 2016. Unsupervised and supervised techniques in ArcMap, together with the ground truthing points and the cropping calendar, were used to classify the three images. About 20 land use/land cover classes were obtained. The dry season images appear to have higher accuracy than the wet season images, which have a maximum NDVI of 0.6. The results showed a clear difference in how the land is being used in the dry and wet seasons. The image obtained in March, representing the wet season, showed that 74% of the total cultivated land is rainfed with supplemental irrigation, while 60% of the cultivated land is irrigated in the dry season. Additionally, the results show differences in the area of natural ecosystems such as grazed grassland. The total grazed grassland for the dry month of August was 5.3% of the total catchment area, while that of November was 5.1%. The change seen during the dry season between the months of August and November is due to

  16. Joint Application Mapping/Interconnect Synthesis Techniques for Embedded Chip-Scale Multiprocessors

    DTIC Science & Technology

    2005-02-01

    ...is advantageous because it allows us to leverage a large library of useful scheduling techniques. The feasibility/flexibility framework, introduced

  17. Spatial pattern of soil and soybean crop: an assessment using digital mapping techniques

    NASA Astrophysics Data System (ADS)

    Castro Franco, Mauricio; Cordoba, Mariano; Costa, Jose Luis; Aparicio, Virginia; Domenech, Marisa

    2017-04-01

    The aim of this study was to analyze the relationships between the spatial patterns of soil properties and the soybean crop. The study was carried out in three provinces of Argentina: (i) Buenos Aires (BA), (ii) Entre Rios (ER) and (iii) Cordoba (COR). In each province, 2 agricultural fields were selected. Ancillary information related to soil forming factors in each field was gathered, for example apparent electrical conductivity (ECa), NDVI and yield maps. We used principal component spatial analysis (MULTISPATI-PCA) to delimit zones by soil type in each field. For zone validation, 4 sampling sites were located, in which we collected soil samples, grain yield and soybean crop quality data. Random Forest (RF) was used to determine the importance of soil properties for soybean crop properties. To compare soil properties in each zone between fields, a mixed linear model and ANOVA were fitted. Our results suggest that MULTISPATI-PCA was efficient in delimiting zones by soil type. Relationships between soil properties and crop yield were identified and understood; however, this was not the case for crop quality patterns. Topography did not prove to be an accurate indicator of the spatial pattern relations between soil properties and crop, whereas ECa, yield maps and NDVI proved to be effective indicators. Grains m-2 and NDVI were affected homogeneously and showed spatial correspondence according to soil limitations. Percentage of protein did not show spatial correspondence with the delimited zones in saline soils, particularly in ER. In such fields, OM and pH were important for the percentage of protein. The results indicate that a direct relation exists between the complex relationships of soil and crop properties and soil degradation.

  18. Inter-comparison of remote sensing-based shoreline mapping techniques at different coastal stretches of India.

    PubMed

    Sunder, Swathy; Ramsankaran, Raaj; Ramakrishnan, Balaji

    2017-06-01

    Many techniques are available for detection of shorelines from multispectral satellite imagery, but the choice of a certain technique for a particular study area can be tough. Hence, for the first time in literature, an inter-comparison of the most widely used shoreline mapping techniques such as Normalized Difference Water Index (NDWI), Modified NDWI (MNDWI), Improved Band Ratio (IBR) Method, and Automatic Water Extraction Index (AWEI) has been done along four different coastal stretches of India using multitemporal Landsat data. The obtained results have been validated with the high-resolution images of Cartosat-2 (panchromatic) and multispectral images from Google Earth. Performance of the above indices has been analyzed based on the statistics, such as overall accuracy, kappa coefficient, user's accuracy, producer's accuracy, and the average deviation from the reference line. It is observed that the performance of NDWI and IBR techniques are dependent on the physical characteristics of the sites, and therefore, it varies from one site to another. Results indicate that unlike these two indices, the AWEI algorithm performs consistently well followed by MNDWI irrespective of the land cover types.
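
    For orientation only, the commonly cited forms of three of the indices compared in this record are sketched below (McFeeters NDWI, Xu MNDWI, and the Feyisa AWEI "no-shadow" variant); the IBR method is not reproduced. Band arrays are assumed to hold Landsat surface reflectance, and shoreline extraction would then threshold the chosen index near zero.

      import numpy as np

      def water_indices(green, nir, swir1, swir2):
          green, nir, swir1, swir2 = (np.asarray(b, dtype=float)
                                      for b in (green, nir, swir1, swir2))
          ndwi = (green - nir) / (green + nir)           # McFeeters NDWI
          mndwi = (green - swir1) / (green + swir1)      # Xu modified NDWI
          awei_nsh = 4.0 * (green - swir1) - (0.25 * nir + 2.75 * swir2)
          return ndwi, mndwi, awei_nsh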

  19. The extremely metal-poor galaxy DDO 68: the luminous blue variable, Hα shells and the most luminous stars

    NASA Astrophysics Data System (ADS)

    Pustilnik, S. A.; Makarova, L. N.; Perepelitsyna, Y. A.; Moiseev, A. V.; Makarov, D. I.

    2017-03-01

    This paper presents new results from the ongoing study of the unusual Lynx-Cancer void galaxy DDO 68, which has star-forming regions of record low metallicity [12+log (O/H) ∼7.14]. The results include the following. (i) A new spectrum and photometry have been obtained with the 6-m SAO RAS telescope (BTA) for the luminous blue variable (LBV = DDO68-V1). Photometric data sets were complemented with others based on the Sloan Digital Sky Survey (SDSS) and the Hubble Space Telescope (HST) archive images. (ii) We performed an analysis of the DDO 68 supergiant shell (SGS) and the prominent smaller Hα arcs/shells visible in the HST image coupled with kinematic maps in Hα obtained with the Fabry-Perot interferometer (FPI) at the BTA. (iii) We compiled a list of about 50 of the most luminous stars (-9.1 mag < MV < -6.0 mag) identified from the HST images associated with the star-forming regions with known extremely low O/H. This is intended to pave the path for the current science to be investigated with the next generation of giant telescopes. We have confirmed earlier hints of significant variation of the LBV optical light, deriving its amplitude as ΔV ≳ 3.7 mag for the first time. New data suggest that in 2008-2010 the LBV reached MV = -10.5 mag and probably underwent a giant eruption. We argue that the structure of star-forming complexes along the SGS ('Northern Ring') perimeter provides evidence for sequential induced star-formation episodes caused by the shell gas instabilities and gravitational collapse. The variability of some luminous extremely metal-poor stars in DDO 68 can currently be monitored with medium-size telescopes at sites with superb seeing.

  20. Application of satellite data and LARS's data processing techniques to mapping vegetation of the Dismal Swamp. M.S. Thesis - Old Dominion Univ.

    NASA Technical Reports Server (NTRS)

    Messmore, J. A.

    1976-01-01

    The feasibility of using digital satellite imagery and automatic data processing techniques as a means of mapping swamp forest vegetation was considered, using multispectral scanner data acquired by the LANDSAT-1 satellite. The site for this investigation was the Dismal Swamp, a 210,000 acre swamp forest located south of Suffolk, Va. on the Virginia-North Carolina border. Two basic classification strategies were employed. The initial classification utilized unsupervised techniques which produced a map of the swamp indicating the distribution of thirteen forest spectral classes. These classes were later combined into three informational categories: Atlantic white cedar (Chamaecyparis thyoides), Loblolly pine (Pinus taeda), and deciduous forest. The subsequent classification employed supervised techniques which mapped Atlantic white cedar, Loblolly pine, deciduous forest, water and agriculture within the study site. A classification accuracy of 82.5% was produced by unsupervised techniques compared with 89% accuracy using supervised techniques.

  1. Influence of spatial luminance interactions on perceived brightness

    NASA Astrophysics Data System (ADS)

    Eisemann, Ulrich

    2000-12-01

    Most color appearance models that have been published so far require a simplified description of the viewing field which is subdivided into a small number of homogeneous regions. The tristimulus values and luminance levels of these regions serve as input parameters for the models. The purpose of this paper, however, is to study brightness perception in a complex, achromatic surround using a more detailed description of the entire viewing field. Therefore, a number of psychophysical experiments were carried out using a CRT display on which relatively complex images were presented. Several observers were asked to judge the perceived brightness by adjusting the luminance level of a reference grey for a perfect brightness match. All psychophysical data obtained in this way were used to develop a new brightness appearance model that takes all objects in the entire visual field into account. The model includes an object feature extraction stage, in which object properties like area and position are extracted, a stage in which the characteristic object data are sorted into the appropriate classes of a histogram, and a multivariate mapping, in the form of a feedforward neural network, to predict the perceived brightness.

  2. Remote sensing techniques for mapping range sites and estimating range yield

    NASA Technical Reports Server (NTRS)

    Benson, L. A.; Frazee, C. J.; Waltz, F. A.; Reed, C.; Carey, R. L.; Gropper, J. L.

    1974-01-01

    Image interpretation procedures for determining range yield and for extrapolating range information were investigated for an area of the Pine Ridge Indian Reservation in southwestern South Dakota. Soil and vegetative data collected in the field utilizing a grid sampling design and digital film data from color infrared film and black and white films were analyzed statistically using correlation and regression techniques. The pattern recognition techniques used were K-class, mode seeking, and thresholding. The herbage yield equation derived for the detailed test site was used to predict yield for an adjacent similar field. The herbage yield estimate for the adjacent field was 1744 lbs. of dry matter per acre and was favorably compared to the mean yield of 1830 lbs. of dry matter per acre based upon ground observations. Also an inverse relationship was observed between vegetative cover and the ratio of MSS 5 to MSS 7 of ERTS-1 imagery.

  3. Development of a stereo analysis algorithm for generating topographic maps using interactive techniques of the MPP

    NASA Technical Reports Server (NTRS)

    Strong, James P.

    1987-01-01

    A local area matching algorithm was developed on the Massively Parallel Processor (MPP). It is an iterative technique that first matches coarse or low resolution areas and at each iteration performs matches of higher resolution. Results so far show that when good matches are possible in the two images, the MPP algorithm matches corresponding areas as well as a human observer. To aid in developing this algorithm, a control or shell program was developed for the MPP that allows interactive experimentation with various parameters and procedures to be used in the matching process. (This would not be possible without the high speed of the MPP). With the system, optimal techniques can be developed for different types of matching problems.
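
    The coarse-to-fine idea described above can be sketched serially as follows; this is an illustrative single-patch matcher using normalized cross-correlation and assumed parameters (patch size, pyramid depth, search radius), not the MPP implementation, which ran such matches massively in parallel.

      import numpy as np

      def ncc(a, b):
          a = a - a.mean()
          b = b - b.mean()
          return (a * b).sum() / (np.sqrt((a * a).sum() * (b * b).sum()) + 1e-12)

      def downsample(img):
          return img[::2, ::2]

      def coarse_to_fine_match(left, right, row, col, half=8, levels=3, search=4):
          """Disparity of the patch at (row, col), which must lie well inside both images."""
          pyr_l, pyr_r = [left], [right]
          for _ in range(levels - 1):
              pyr_l.append(downsample(pyr_l[-1]))
              pyr_r.append(downsample(pyr_r[-1]))
          disp = 0
          for lev in reversed(range(levels)):        # coarsest level first
              disp *= 2                              # carry the estimate to the finer grid
              r, c = row >> lev, col >> lev
              patch = pyr_l[lev][r - half:r + half, c - half:c + half]
              best, best_score = disp, -np.inf
              for d in range(disp - search, disp + search + 1):
                  cand = pyr_r[lev][r - half:r + half, c + d - half:c + d + half]
                  if cand.shape == patch.shape:
                      score = ncc(patch, cand)
                      if score > best_score:
                          best, best_score = d, score
              disp = best
          return disp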

  4. Mapping of cerebral perfusion territories using territorial arterial spin labeling: techniques and clinical application.

    PubMed

    Hartkamp, Nolan S; Petersen, Esben T; De Vis, Jill B; Bokkers, Reinoud P H; Hendrikse, Jeroen

    2013-08-01

    A knowledge of the exact cerebral perfusion territory which is supplied by any artery is of great importance in the understanding and diagnosis of cerebrovascular disease. The development and optimization of territorial arterial spin labeling (T-ASL) MRI techniques in the past two decades have made it possible to visualize and determine the cerebral perfusion territories in individual patients and, more importantly, to do so without contrast agents or otherwise invasive procedures. This review provides an overview of the development of ASL techniques that aim to visualize the general cerebral perfusion territories or the territory of a specific artery of interest. The first efforts of T-ASL with pulsed, continuous and pseudo-continuous techniques are summarized and subsequent clinical studies using T-ASL are highlighted. In the healthy population, the perfusion territories of the brain-feeding arteries are highly variable. This high variability requires special consideration in specific patient groups, such as patients with cerebrovascular disease, stroke, steno-occlusive disease of the large arteries and arteriovenous malformations. In the past, catheter angiography with selective contrast injection was the only available method to visualize the cerebral perfusion territories in vivo. Several T-ASL methods, sometimes referred to as regional perfusion imaging, are now available that can easily be combined with conventional brain MRI examinations to show the relationship between the cerebral perfusion territories, vascular anatomy and brain infarcts or other pathology. Increased availability of T-ASL techniques on clinical MRI scanners will allow radiologists and other clinicians to gain further knowledge of the relationship between vasculature and patient diagnosis and prognosis. Treatment decisions, such as surgical revascularization, may, in the near future, be guided by information provided by T-ASL MRI in close correlation with structural MRI and quantitative

  5. Three-dimensional electron paramagnetic resonance imaging technique for mapping porosity in ceramics

    SciTech Connect

    Kordas, G.; Kang, Y.H.

    1991-04-01

    This paper reports on a three-dimensional (3D) electron paramagnetic resonance imaging (EPRI) method which was developed to probe the structure and size of pores in ceramic materials. The imaging device that was added to the EPR instrument consisted of a computer-controlled current source and magnetic field gradient. This add-on facility was tested using a well-defined diphenylpicrylhydrazyl phantom sample. Pumice was then used to demonstrate the potential of the technique. This stone was immersed in a 0.5 mM ¹⁵N-substituted perdeuterated tempone water solution to fill the pores with spin labels. Images were reconstructed using a filtered back-projection technique. A two-dimensional (2D) imaging plane was constructed by collecting 33 projection planes over 180°. A 3D image was derived from 22 planes, each constructed from 22 projections. At present, the facility allows a resolution of 69 and 46 µm for 2D and 3D imaging, respectively. Advancements of the imaging apparatus, software, and line width of the spin labels will be needed to enhance the resolution of this technique.
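
    As a generic illustration of the reconstruction step (not the authors' code), the sketch below inverts a sinogram of 33 projections collected over 180° with a ramp-filtered back-projection using scikit-image; the projection data are placeholders, and the filter_name keyword assumes a recent scikit-image release.

      import numpy as np
      from skimage.transform import iradon

      angles = np.linspace(0.0, 180.0, 33, endpoint=False)   # projection angles (degrees)
      sinogram = np.random.rand(128, angles.size)            # placeholder: one column per projection
      image = iradon(sinogram, theta=angles, filter_name="ramp")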

  6. Mapping the mean annual precipitation of China using local interpolation techniques

    NASA Astrophysics Data System (ADS)

    Sun, Wei; Zhu, Yunqiang; Huang, Shengli; Guo, Chunxia

    2015-01-01

    Spatially explicit precipitation data are required in the research of hydrology, agriculture, ecology, and environmental sciences. In this study, two established techniques of local ordinary linear regression (OLR) and geographically weighted regression (GWR) and two new local hybrid interpolation techniques of local regression-kriging (LRK) and geographically weighted regression kriging (GWRK) were compared to predict the spatial distribution of mean annual precipitation of China. Precipitation data from 684 meteorological stations were used in the analysis, and a stepwise regression analysis was used to select six covariates, including longitude, latitude, elevation, slope, surface roughness, and river density. The four spatial prediction methods (OLR, GWR, LRK, and GWRK) were implemented with local regression techniques with different number of neighbors (50, 100, 150, and 200). The prediction accuracy was assessed at validation sites with the root mean squared deviation, mean estimation error, and R-square values. The results showed that LRK outperforms OLR and GWRK outperforms GWR, indicating that adding the kriging of regression residuals can help improve the prediction performance. GWRK gives the best prediction but the accuracy of estimation varies with the number of neighborhood points used for modeling. Although LRK is outperformed by GWRK, LRK is still recommended as a powerful and practical interpolation method given its computation efficiency. However, if LRK and GWRK are used to extrapolate prediction values, post-processing in the areal interpolation will be needed.
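
    A condensed sketch of the regression-kriging idea behind LRK and GWRK follows: regress precipitation on the covariates, then krige the residuals and add them back. For brevity the regression here is global, whereas the study's local and geographically weighted variants restrict both steps to the 50-200 nearest stations; scikit-learn and pykrige are assumed available, and all inputs are placeholders.

      import numpy as np
      from sklearn.linear_model import LinearRegression
      from pykrige.ok import OrdinaryKriging

      def regression_kriging(lon, lat, covariates, precip, lon_new, lat_new, cov_new):
          """Trend (linear regression on covariates) plus ordinary kriging of residuals."""
          trend = LinearRegression().fit(covariates, precip)
          residuals = precip - trend.predict(covariates)
          ok = OrdinaryKriging(lon, lat, residuals, variogram_model="spherical")
          res_pred, _ = ok.execute("points", lon_new, lat_new)
          return trend.predict(cov_new) + res_pred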

  7. Mapping rainfall variability using different GIS interpolation techniques over Siret Basin

    NASA Astrophysics Data System (ADS)

    Nertan, A. T.; Panaitescu, V.

    2010-09-01

    The analysis of spatial variability of precipitation processes has a major importance for water resources planners and managers, the precipitation data being important for many applications in hydrology or agriculture that determines the availability of drinking water and the level of the soil moisture. It is very important to have adequate information about precipitation and its spatial distribution and variability, not only at local scale but also over a specific region or territory. Spatial interpolation methods are techniques that predict the value at a given location by using values from sample points (meteorological station). A key difference among these approaches is the criterion which is used to weigh the values of the sample points. These criteria may include simple distance relations (e.g., inverse distance methods), minimization of variance (e.g., Kriging and co-Kriging techniques) and minimization of curvature or enforcement of smoothness criteria (Spline interpolation). The main aim of this study is to demonstrate the importance and the implementation of different GIS interpolation techniques for the spatial interpretation of mean annual precipitation for the Siret basin: to test and evaluate the accuracy of different GIS interpolation methods; to compute and compare the average of mean annual precipitation for Siret basin using GIS tools.
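
    As a point of reference for the simplest of the interpolators mentioned above, a minimal inverse-distance-weighting routine is sketched below; kriging and spline methods replace these purely distance-based weights with variogram-based or smoothness-based ones. Inputs are placeholder station and grid coordinates.

      import numpy as np

      def idw(x, y, values, xi, yi, power=2.0, eps=1e-12):
          """Interpolate station values at points (xi, yi) by inverse distance weighting."""
          x, y, values = (np.asarray(a, dtype=float) for a in (x, y, values))
          xi, yi = np.asarray(xi, dtype=float), np.asarray(yi, dtype=float)
          d = np.hypot(xi[:, None] - x[None, :], yi[:, None] - y[None, :])
          w = 1.0 / (d + eps) ** power          # eps guards against zero distance
          return (w * values).sum(axis=1) / w.sum(axis=1)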

  8. CO excitation in four IR luminous galaxies

    NASA Astrophysics Data System (ADS)

    Radford, Simon J. E.; Solomon, P. M.; Downes, Dennis

    1990-07-01

    The correlation between the CO and far infrared luminosities of spiral galaxies is well established. The luminosity ratio, L sub FIR/L sub CO, in IR luminous active galaxies is, however, systematically five to ten times higher than in ordinary spirals and molecular clouds in our Galaxy. Furthermore, the masses of molecular hydrogen in luminous galaxies are large, M (H2) approx. equals 10^10 solar masses, which indicates the observed luminosity ratios are due to an excess of infrared output, rather than a deficiency of molecular gas. These large amounts of molecular gas may fuel luminous galaxies through either star formation or nuclear activity. This interpretation rests on applying the M (H2)/L sub CO ratio calibrated in our Galaxy to galaxies with strikingly different luminosity ratios. But are the physical conditions of the molecular gas different in galaxies with different luminosity ratios? And, if so, does the proportionality between CO and H2 also vary among galaxies? To investigate these questions, researchers observed CO (2 to 1) and (1 to 0) emission from four luminous galaxies with the Institute for Radio Astronomy in the Millimeter range (IRAM) 30 m telescope. Researchers conclude that most of the CO emission from these galaxies (Arp 193, Arp 220, and Mrk 231) arises in regions with moderate ambient densities similar to the clouds in the Milky Way molecular ring. The emission is neither from dense hot cloud cores nor from the cold low density gas characteristic of the envelopes of dark clouds.

  9. CO excitation in four IR luminous galaxies

    NASA Technical Reports Server (NTRS)

    Radford, Simon J. E.; Solomon, P. M.; Downes, Dennis

    1990-01-01

    The correlation between the CO and far infrared luminosities of spiral galaxies is well established. The luminosity ratio, L sub FIR/L sub CO, in IR luminous active galaxies is, however, systematically five to ten times higher than in ordinary spirals and molecular clouds in our Galaxy. Furthermore, the masses of molecular hydrogen in luminous galaxies are large, M (H2) approx. equals 10(exp 10) solar masses, which indicates the observed luminosity ratios are due to an excess of infrared output, rather than a deficiency of molecular gas. These large amounts of molecular gas may fuel luminous galaxies through either star formation or nuclear activity. This interpretation rests on applying the M (H2)/L sub CO ratio calibrated in our Galaxy to galaxies with strikingly different luminosity ratios. But are the physical conditions of the molecular gas different in galaxies with different luminosity ratios? And, if so, does the proportionality between CO and H2 also vary among galaxies? To investigate these questions, researchers observed CO (2 to 1) and (1 to 0) emission from four luminous galaxies with the Institute for Radio Astronomy in the Millimeter range (IRAM) 30 m telescope. Researchers conclude that most of the CO emission from these galaxies (Arp 193, Arp 220, and Mrk 231) arises in regions with moderate ambient densities similar to the clouds in the Milky Way molecular ring. The emission is neither from dense hot cloud cores nor from the cold low density gas characteristic of the envelopes of dark clouds.

  10. THE EFFECTS OF LIGHT ON LUMINOUS BACTERIA

    PubMed Central

    Harvey, E. Newton

    1925-01-01

    A conservative statement would therefore be that luminous bacteria show no changes in luminescence as a result of illumination by 625 foot candles for 1.5 minutes when examined 1/200 of a second after exposure, and none as the result of illumination by 15,000 foot candles for 6 minutes when examined ⅙ of a second after exposure. PMID:19872170

  11. Luminous Binary Supersoft X-Ray Sources

    NASA Technical Reports Server (NTRS)

    DiStefano, Rosanne; Oliversen, Ronald J. (Technical Monitor)

    2002-01-01

    This grant was for the study of Luminous Supersoft X-Ray Sources (SSSs). During the first year a number of projects were completed and new projects were started. The projects include: 1) Time variability of SSSs 2) SSSs in M31; 3) Binary evolution scenarios; and 4) Acquiring new data.

  12. Soil salinity mapping and hydrological drought indices assessment in arid environments based on remote sensing techniques

    NASA Astrophysics Data System (ADS)

    Elhag, Mohamed; Bahrawi, Jarbou A.

    2017-03-01

    Vegetation indices are mostly described as crop water derivatives. The normalized difference vegetation index (NDVI) is one of the oldest remote sensing applications that is widely used to evaluate crop vigor directly and crop water relationships indirectly. Recently, several NDVI derivatives were exclusively used to assess crop water relationships. Four hydrological drought indices are examined in the current research study. The water supply vegetation index (WSVI), the soil-adjusted vegetation index (SAVI), the moisture stress index (MSI) and the normalized difference infrared index (NDII) are implemented in the current study as an indirect tool to map the effect of different soil salinity levels on crop water stress in arid environments. In arid environments, such as Saudi Arabia, water resources are under pressure, especially groundwater levels. Groundwater wells are rapidly depleted due to the heavy abstraction of the reserved water. Heavy abstractions of groundwater, which exceed crop water requirements in most of the cases, are powered by high evaporation rates in the designated study area because of the long days of extremely hot summer. Landsat 8 OLI data were extensively used in the current research to obtain several vegetation indices in response to soil salinity in Wadi ad-Dawasir. Principal component analyses (PCA) and artificial neural network (ANN) analyses are complementary tools used to understand the regression pattern of the hydrological drought indices in the designated study area.
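
    For reference, the standard forms of three of the indices named in this record (SAVI, MSI, NDII) are sketched below from Landsat 8 OLI reflectance bands; WSVI additionally requires a thermal band and is omitted, and the band arrays are assumed inputs.

      import numpy as np

      def drought_indices(red, nir, swir1, L=0.5):
          red, nir, swir1 = (np.asarray(b, dtype=float) for b in (red, nir, swir1))
          savi = (nir - red) / (nir + red + L) * (1.0 + L)   # soil-adjusted vegetation index
          msi = swir1 / nir                                  # moisture stress index
          ndii = (nir - swir1) / (nir + swir1)               # normalized difference infrared index
          return savi, msi, ndii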

  13. New non-Doppler remote sensing technique for 3D wind field mapping

    NASA Astrophysics Data System (ADS)

    Belen'kii, Mikhail S.; Gimmestad, Gary G.; Gurvich, Alexander V.

    1994-06-01

    A new approach to the statistical analysis of fluctuating, photon-limited signals that permits us to accumulate and process the lidar returns without averaging the reflected energy fluctuations is developed. This approach requires recording the photocounts for each pulse in a series of pulses and then determining photocount statistics. Based on the semiclassical theory of photodetection and Mandel's formula, a relationship has been obtained between the time-space cross correlation function and the cross spectrum of the lidar returns and the corresponding photocount statistics. It is shown that the relative uncertainties of measuring the cross correlation or the cross spectrum of the lidar returns are determined by the total number of photocounts, not by their mean value. A fast-scanning lidar system, which is based on the new photocounting analysis approach, is described for 3D wind field mapping in the atmosphere at altitudes up to 5 km. A program for the experimental verification of the new approach is presented.

  14. Using potential flow theory and conformal mapping technique to measure pressure differential on airfoil

    NASA Astrophysics Data System (ADS)

    Mughal, Umair Najeeb

    2017-01-01

    Computing the flow around an airfoil to obtain pressure coefficient variations at different relative velocities has always been a basic part of aerodynamic study. Potential flow theory is used to study the flow behavior over a Rankine half body and over non-rotating and rotating cylinders, as it is more tractable. The Falkner-Skan similarity solution is used to simulate the flow behavior over a wedge. However, to apply potential flow theory to practical airfoils, the author has used conformal mapping to relate realistic airfoil shapes to the knowledge gained from the flow about cylinders. This method can further be used in the design of an airfoil section. The author has used the Joukowski transform to generate the flow around airfoils of various geometries and then applied the Kutta condition to force the stagnation point to the trailing edge. Pressure coefficients over the entire airfoil surface were calculated and corrected using the Karman-Tsien compressibility correction. On this basis, locations for the ports of a flush pressure measurement system are suggested.
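
    The workflow this record describes can be sketched compactly: potential flow past a circle, mapped to a Joukowski airfoil, with circulation fixed by the Kutta condition and the surface pressure coefficient corrected with the Karman-Tsien rule. The geometry, free-stream speed, angle of attack and Mach number below are illustrative assumptions, not values from the paper.

      import numpy as np

      U, alpha, M_inf = 1.0, np.deg2rad(5.0), 0.3    # assumed free-stream conditions
      c = 1.0                                        # Joukowski transform constant
      zeta_c = -0.10 + 0.10j                         # circle centre (sets thickness/camber)
      R = abs(c - zeta_c)                            # circle must pass through zeta = c

      # Kutta condition: stagnation point at the trailing-edge image zeta = c
      theta_te = np.angle(c - zeta_c)
      gamma = 4.0 * np.pi * U * R * np.sin(theta_te - alpha)

      # Surface points, excluding the trailing edge where dz/dzeta vanishes
      theta = np.linspace(theta_te + 1e-3, theta_te + 2.0 * np.pi - 1e-3, 400)
      zeta = zeta_c + R * np.exp(1j * theta)
      z = zeta + c**2 / zeta                         # airfoil contour in the physical plane

      # Complex velocity in the circle plane, transformed to the airfoil plane
      dw_dzeta = (U * np.exp(-1j * alpha)
                  - U * np.exp(1j * alpha) * R**2 / (zeta - zeta_c)**2
                  - 1j * gamma / (2.0 * np.pi * (zeta - zeta_c)))
      dz_dzeta = 1.0 - c**2 / zeta**2
      W = dw_dzeta / dz_dzeta

      cp = 1.0 - np.abs(W)**2 / U**2                 # incompressible pressure coefficient

      # Karman-Tsien compressibility correction
      beta = np.sqrt(1.0 - M_inf**2)
      cp_corrected = cp / (beta + (M_inf**2 / (1.0 + beta)) * cp / 2.0)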

  15. Karst features detection and mapping using airphotos, DSMs and GIS techniques

    NASA Astrophysics Data System (ADS)

    Kakavas, M. P.; Nikolakopoulos, K. G.; Zagana, E.

    2015-10-01

    The aim of this work is to detect and qualify natural karst depressions in the Aitoloakarnania Prefecture, Western Greece, using remote sensing data in conjunction with Geographical Information Systems (GIS). The study area is part of the Ionian geotectonic zone, and its geological background consists of the Triassic evaporites. The Triassic carbonate breccias were formed as a result of the tectonic and orogenic setting of the external Hellenides and the diapiric phenomena of the Triassic evaporites. The landscape is characterized by exokarst features, namely closed depressions in the Triassic carbonate breccias. At the outset of this study, in situ observations were performed in order to identify dolines and swallow holes. The creation of sinkholes, in general, is based on the collapse of the surface layer due to chemical dissolution of carbonate rocks. In the current study, airphoto stereopairs, DSMs and GIS were combined in order to detect and map the karst features. Thirty-seven airphotos were imported into Leica Photogrammetry Suite and a stereo model of the study area was created. Possible karst features were then detected and digitized in 3D view. Those sites were verified during the in situ survey. The ASTER GDEM, the SRTM DEM, a high resolution airphoto DSM created from the Greek Cadastral, and a DEM from digitized contours of the 1/50,000 topographic map were also evaluated in a GIS environment for the automatic detection of the karst depressions. The results are presented in this study.

  16. Lessons in modern digital field geology: Open source software, 3D techniques, and the new world of digital mapping

    NASA Astrophysics Data System (ADS)

    Pavlis, Terry; Hurtado, Jose; Langford, Richard; Serpa, Laura

    2014-05-01

    Although many geologists refuse to admit it, it is time to put paper-based geologic mapping into the historical archives and move to the full potential of digital mapping techniques. For our group, flat map digital geologic mapping is now a routine operation in both research and instruction. Several software options are available, and basic proficiency with the software can be learned in a few hours of instruction and practice. The first practical field GIS software, ArcPad, remains a viable, stable option on Windows-based systems. However, the vendor seems to be moving away from ArcPad in favor of mobile software solutions that are difficult to implement without GIS specialists. Thus, we have pursued a second software option based on the open source program QGIS. Our QGIS system uses the same shapefile-centric data structure as our ArcPad system, including similar pop-up data entry forms and generic graphics for easy data management in the field. The advantage of QGIS is that the same software runs on virtually all common platforms except iOS, although the Android version remains unstable as of this writing. A third software option we are experimenting with for flat map-based field work is Fieldmove, a derivative of the 3D-capable program Move developed by Midland Valley. Our initial experiments with Fieldmove are positive, particularly with the new, inexpensive (<300Euros) Windows tablets. However, the lack of flexibility in data structure makes for cumbersome workflows when trying to interface our existing shapefile-centric data structures to Move. Nonetheless, in spring 2014 we will experiment with full-3D immersion in the field using the full Move software package in combination with ground based LiDAR and photogrammetry. One new workflow suggested by our initial experiments is that field geologists should consider using photogrammetry software to capture 3D visualizations of key outcrops. This process is now straightforward in several software packages, and

  17. Basic forest cover mapping using digitized remote sensor data and automated data processing techniques

    NASA Technical Reports Server (NTRS)

    Coggeshall, M. E.; Hoffer, R. M.

    1973-01-01

    Remote sensing equipment and automatic data processing techniques were employed as aids in the institution of improved forest resource management methods. On the basis of automatically calculated statistics derived from manually selected training samples, the feature selection processor of LARSYS selected, upon consideration of various groups of the four available spectral regions, a series of channel combinations whose automatic classification performances (for six cover types, including both deciduous and coniferous forest) were tested, analyzed, and further compared with automatic classification results obtained from digitized color infrared photography.

  18. Evolution of luminous IRAS galaxies: Radio imaging

    NASA Technical Reports Server (NTRS)

    Neff, S. G.; Hutchings, J. B.

    1993-01-01

    In a recent study of IRAS galaxies' optical morphologies, we found that luminous IR sources lie in the IR color-luminosity plane in groups which separate out by optical spectroscopic type and also by degree of tidal disturbance. We found that the most luminous steep-IR-spectrum sources are generally galaxies in the initial stages of a major tidal interaction. Galaxies with active nuclei were generally found to have flatter IR spectra, to cover a range of IR luminosity, and to be in the later stages of a tidal interaction. We proposed a sequence of events by which luminous IR sources evolve: they start as interacting or merging galaxies, some develop active nuclei, and most undergo extensive star-formation in their central regions. Another way to study these objects and their individual evolution is to study their radio morphologies. Radio emission may arise at a detectable level from supernovae in star-forming regions and/or the appearance of an active nucleus can be accompanied by a nuclear radio source (which may develop extended structure). Therefore, the compact radio structure may trace the evolution of the inner regions of IRAS-luminous sources. If the radio sources are triggered by the interactions, we would expect to find the radio morphology related to the optical 'interactivity' of the systems. Here, we explore using the radio emission of IRAS galaxies as a possible tracer of galaxy evolution. We present and discuss observations of the compact radio morphology of 111 luminous IRAS-selected active galaxies covering a wide range of IR and optical properties.

  19. West Java Snack Mapping based on Snack Types, Main Ingredients, and Processing Techniques

    NASA Astrophysics Data System (ADS)

    Nurani, A. S.; Subekti, S.; Ana

    2016-04-01

    The research was motivated by the lack of literature on archipelago snacks, especially those from West Java. It aims to explore the snack types, the processing techniques, and the main ingredients in order to plan learning material on archipelago cakes, especially from West Java. The research methods used are descriptive observations and interviews. The samples were randomly chosen from all regions in West Java. The findings identify traditional snacks from West Java, including: 1. snack types which are similar in all sampled regions, namely: opak, rangginang, nagasari, aliagrem, cuhcur, keripik, semprong, wajit, dodol, kecimpring, combro, tape ketan, and surabi; the region-typical snack types include burayot (Garut), simping kaum (Purwakarta), surabi hejo (Karawang), papais cisaat (Subang), papais moyong, opak bakar (Kuningan), opak oded, ranggesing (Sumedang), gapit, tapel (Cirebon), gulampo, kue aci (Tasikmalaya), wajit cililin, gurilem (West Bandung), and borondong (Bandung District); 2. various processing techniques, namely: steaming, boiling, frying, caramelizing, baking, grilling, roasting, and sugaring; 3. various main ingredients, namely rice, local glutinous rice, rice flour, glutinous rice flour, starch, wheat flour, hunkue flour, cassava, sweet potato, banana, nuts, and corn; 4. snack classifications in West Java, namely (1) traditional snacks, (2) creation snacks, (3) modification snacks, and (4) outside-influence snacks.

  20. Fluorescence polarization of tetracycline derivatives as a technique for mapping nonmelanoma skin cancers.

    PubMed

    Yaroslavsky, Anna N; Salomatina, Elena V; Neel, Victor; Anderson, Rox; Flotte, Thomas

    2007-01-01

    Nonmelanoma skin cancer is the most common form of human cancer, often resulting in high morbidity. Low visual contrast of these tumors makes their delineation a challenging problem. Employing a linearly polarized monochromatic light source and a wide-field CCD camera, we have developed a technique for fluorescence polarization imaging of the nonmelanoma cancers stained using antibiotics from the tetracycline family. To determine the feasibility of the method, fluorescence polarization images of 86 thick, fresh cancer excisions were studied. We found that the level of endogenous fluorescence polarization was much lower than that of exogenous, and that the average values of fluorescence polarization of tetracycline derivatives were significantly higher in cancerous as compared to normal tissue. Out of 86 tumors [54 stained in demeclocycline (DMN) and 32 in tetracycline (TCN)], in 79 cases (51-DMN, 28-TCN) the location, size, and shape of the lesions were identified accurately. The results of this trial indicate that nonmelanoma skin tumors can be distinguished from healthy tissue based on the differences in exogenous fluorescence polarization of TCN and/or DMN. Therefore, the developed technique can provide an important new tool for image-guided cancer surgery.

  1. Redox and speciation mapping of rock thin sections using high spatial resolution full-field imaging technique

    NASA Astrophysics Data System (ADS)

    de Andrade, V.; Susini, J.; Salomé, M.; Beraldin, O.; Heymes, T.; Lewin, E.

    2009-04-01

    Because of their complex genesis, natural rocks are most often heterogeneous systems, with heterogeneities in both chemistry and structure at various scales. In the last decade, the dramatic improvements of hyperspectral imaging techniques provided new tools for accurate material characterisation. Most of these micro- and nano-analytical techniques rely on scanning instruments, which offer high spatial resolution but suffer from long acquisition times imposing practical limits on the field of view. Conversely, full-field imaging techniques rely on a fast parallel acquisition but have limited resolution. Although soft X-ray full-field microscopes based on Fresnel zone plates are commonly used for high resolution imaging, their combination with spectroscopy is challenging and 2D chemical mapping is still difficult. For harder X-rays, a lensless X-ray microscope based on a simple propagation geometry is easier and can be readily used for 2D spectro-microscopy. A full-field experimental setup was optimized at the ESRF-ID21 beamline to image iron redox and speciation distributions in rock thin sections. The setup comprises a Si111 or Si220 (ΔE = 0.4 eV) monochromator, a special sample stage and a sensitive camera associated with a brand new GGG:Eu light conversion scintillator and high magnification visible light optics. The pixel size ranges from 1.6 to 0.16 µm according to the optics used. This instrument was used to analyse phyllosilicates and oxides of metamorphic sediments coming from the Aspromonte nappe pile in Calabria. Iron chemical state distributions were derived - from images of 1000 × 2000 × 30 µm3 rock thin sections - by subtraction of absorption images above and below the Fe K-edge. Using an automatic stitching reconstruction, a wide field image (4 × 3 mm2 with a 1 µm2 resolution, for a total of about 12 million pixels) of the total Fe elemental distribution was produced. Moreover, µ-XANES analyses (more than 1 million individual µ-XANES spectra) were performed

  2. Measuring trace gas emission from multi-distributed sources using vertical radial plume mapping (VRPM) and backward Lagrangian stochastic (bLS) techniques

    USDA-ARS?s Scientific Manuscript database

    Two micrometeorological techniques for measuring trace gas emission rates from distributed area sources were evaluated using a variety of synthetic area sources. The accuracy of the vertical radial plume mapping (VRPM) and the backward Lagrangian (bLS) techniques with an open-path optical spectrosco...

  3. The Views of the Teachers about the Mind Mapping Technique in the Elementary Life Science and Social Studies Lessons Based on the Constructivist Method

    ERIC Educational Resources Information Center

    Seyihoglu, Aysegul; Kartal, Ayca

    2010-01-01

    The purpose of this study is to reveal the opinions of teachers on using the mind mapping technique in Life Science and Social Studies lessons. The participants of the study are 20 primary education teachers. In this study, a semi-structured interview technique was used. For content analysis, the themes and codes were defined, based on the views…

  4. A comparison of two-dimensional techniques for converting magnetocardiogram maps into effective current source distributions

    NASA Astrophysics Data System (ADS)

    Ogata, K.; Kandori, A.; Miyashita, T.; Sekihara, K.; Tsukada, K.

    2011-01-01

    The aim of this study was to develop a method for converting the pseudo two-dimensional current given by a current-arrow map (CAM) into the physical current. The physical current distribution is obtained by the optimal solution in a least mean square sense with Tikhonov regularization (LMSTR). In the current dipole simulation, the current pattern differences (ΔJ) between the results of the CAM and the LMSTR with several regularization parameters (α = 10^-1 to 10^-15) are calculated. In magnetocardiographic (MCG) analysis, the depth (zd) of a reconstruction plane is chosen by using the coordinates of the sinus node, which is estimated from MCG signals at the early p-wave. The ΔJs at p-wave peaks, QRS-complex peaks, and T-wave peaks of MCG signals for healthy subjects are calculated. Furthermore, correlation coefficients and regression lines are also calculated from the current values of the CAM and the LMSTR during p-waves, QRS-complex, and T-waves of MCG signals. In the simulation, the ΔJs (α ≈ 10^-10) had a minimal value. The ΔJs (α = 10^-10) at p-wave peaks, QRS-complex peaks, and T-wave peaks of MCG signals for healthy subjects also had a minimal value. The correlation coefficients of the current values given by the CAM and the LMSTR (α = 10^-10) were greater than 0.9. Furthermore, slopes (y) of the regression lines are correlated with the depth (zd) (r = -0.93). Consequently, the CAM value can be transformed into the LMSTR current value by multiplying it by the slope (y) obtained from the depth (zd). In conclusion, the result given by the CAM can be converted into an effective physical current distribution by using the depth (zd).
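
    A generic sketch of the Tikhonov-regularized least-squares step (labelled LMSTR above) follows: given a lead-field matrix A that maps currents on the reconstruction plane to the measured MCG signals b, the solution minimizes ||Ax - b||^2 + α||x||^2. The lead-field construction and the estimation of the depth zd from the sinus node are specific to the paper and are not reproduced; A and b are placeholders.

      import numpy as np

      def tikhonov_solve(A, b, alpha=1e-10):
          """Solve (A^T A + alpha I) x = A^T b for the regularized source currents x."""
          n = A.shape[1]
          return np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ b)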

  5. Over-Luminous Elliptical Galaxies

    NASA Technical Reports Server (NTRS)

    Mushotzky, Richard (Technical Monitor); Forman, William

    2004-01-01

    We have completed a first draft of a paper on the galaxy group ESO3060170, the hottest known fossil group. We have submitted a first draft of the paper but the final completion is delayed due to several issues mentioned by the referee that we wish to revisit and discuss in more detail. The XMM data was combined with Chandra data which allowed a rich set of projects. The paper discusses the north-south elongation which is similar to that of the central dominant galaxy as well as the galaxy distribution. We detect an X-ray 'finger' or small tail emanating from the central galaxy to the north, suggesting motion of the galaxy within the elongated gravitational potential. The overall agreement between the XMM and Chandra data is excellent (although the XMM data extend to larger radii). Both data sets show a cool core centered on the dominant galaxy. Surprisingly, the temperature maps and detailed spectra indicate that the finger of gas is NOT cool, but has the same temperature as the ambient gas. We extracted surface brightness profiles, deprojected gas density profiles, cooling time profiles, and entropy profiles. There is a sharp discontinuity in gas temperature where the surface brightness profile starts to rise rapidly at 10 kpc. This produces a decrease in the cooling time and the gas entropy within 10 kpc. The central cooling time (within 10 kpc) is less than 10⁹ years and falls to almost half that value in the inner 5 kpc. Despite the very short cooling time, we find no evidence (even with the excellent statistics from XMM-Newton) for multi-phased gas, i.e., a cooling flow. We find two 'edges' associated with the gas distribution (common in peaked X-ray groups and galaxies). On large scales, the temperature profile is flat and disagrees with the profile predicted by Loken et al. (2003) from detailed numerical simulations. We studied the galaxy distribution within one virial radius. The galaxy concentration associated with the group is detectable only within 0.3 of

  6. Over-Luminous Elliptical Galaxies

    NASA Technical Reports Server (NTRS)

    Mushotzky, Richard (Technical Monitor); Forman, William

    2004-01-01

    We have completed a first draft of a paper on the galaxy group ESO3060170, the hottest known fossil group. We have submitted a first draft of the paper but the final completion is delayed due to several issues mentioned by the referee that we wish to revisit and discuss in more detail. The XMM data was combined with Chandra data which allowed a rich set of projects. The paper discusses the north-south elongation which is similar to that of the central dominant galaxy as well as the galaxy distribution. We detect an X-ray "finger" or small tail emanating from the central galaxy to the north, suggesting motion of the galaxy within the elongated gravitational potential. The overall agreement between the XMM and Chandra data is excellent (although the XMM data extend to larger radii). Both data sets show a cool core centered on the dominant galaxy. Surprisingly, the temperature maps and detailed spectra indicate that the finger of gas is NOT cool, but has the same temperature as the ambient gas. We extracted surface brightness profiles, deprojected gas density profiles, cooling time profiles, and entropy profiles. There is a sharp discontinuity in gas temperature where the surface brightness profile starts to rise rapidly at 10 kpc. This produces a decrease in the cooling time and the gas entropy within 10 kpc. The central cooling time (within 10 kpc) is less than 10⁹ years and falls to almost half that value in the inner 5 kpc. Despite the very short cooling time, we find no evidence (even with the excellent statistics from XMM-Newton) for multi-phased gas, i.e., a cooling flow. We find two "edges" associated with the gas distribution (common in peaked X-ray groups and galaxies). On large scales, the temperature profile is flat and disagrees with the profile predicted by Loken et al. (2003) from detailed numerical simulations. We studied the galaxy distribution within one virial radius. The galaxy concentration associated with the group is detectable only within 0

  7. A Microwave Technique for Mapping Ice Temperature in the Arctic Seasonal Sea Ice Zone

    NASA Technical Reports Server (NTRS)

    St.Germain, Karen M.; Cavalieri, Donald J.

    1997-01-01

    A technique for deriving ice temperature in the Arctic seasonal sea ice zone from passive microwave radiances has been developed. The algorithm operates on brightness temperatures derived from the Special Sensor Microwave/Imager (SSM/I) and uses ice concentration and type from a previously developed thin ice algorithm to estimate the surface emissivity. Comparisons of the microwave derived temperatures with estimates derived from infrared imagery of the Bering Strait yield a correlation coefficient of 0.93 and an RMS difference of 2.1 K when coastal and cloud contaminated pixels are removed. SSM/I temperatures were also compared with a time series of air temperature observations from Gambell on St. Lawrence Island and from Point Barrow, AK weather stations. These comparisons indicate that the relationship between the air temperature and the ice temperature depends on ice type.
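
    To make the emissivity-correction step above concrete, here is a toy Python sketch that converts an SSM/I 19V brightness temperature into an ice temperature using a concentration-weighted surface emissivity; the emissivity values, channel choice and function name are illustrative assumptions, not the published algorithm coefficients.

```python
import numpy as np

# Illustrative surface emissivities (placeholders, not published coefficients)
EMISS = {"open_water": 0.60, "thin_ice": 0.92, "first_year_ice": 0.95}

def ice_temperature(tb_19v, c_thin, c_fy):
    """Toy emissivity-corrected ice temperature: divide the 19V brightness
    temperature by a surface emissivity estimated from the thin-ice and
    first-year-ice concentrations (open water is the remainder)."""
    c_water = 1.0 - c_thin - c_fy
    emissivity = (c_water * EMISS["open_water"]
                  + c_thin * EMISS["thin_ice"]
                  + c_fy * EMISS["first_year_ice"])
    return np.asarray(tb_19v) / emissivity

# Example: TB = 250 K over a pixel that is 30% thin ice and 60% first-year ice
print(ice_temperature(250.0, c_thin=0.3, c_fy=0.6))
```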

  8. Mapping the developing human brain in utero using quantitative MR imaging techniques.

    PubMed

    Studholme, Colin

    2015-03-01

    Magnetic resonance imaging of the human fetal brain has been a clinical tool for many years and provides valuable additional information to complement more common ultrasound studies. Advances in both MRI acquisition and post processing over the last 10 years have enabled full 3D imaging and the accurate combination of data acquired in different head positions to create improved geometric integrity, tissue contrast, and resolution. This research is now motivating the development of new quantitative MRI-based techniques for clinical imaging that can more accurately characterize brain development and detect abnormalities. In this article, we will review some of the key areas that are driving changes in our understanding of fetal brain growth using quantitative measures derived from in utero MRI and the possible directions for its increased use in improving the evaluation of pregnancies and the accurate characterization of abnormal brain growth.

  9. Mapping the developing human brain in utero using quantitative MR imaging techniques

    PubMed Central

    Studholme, Colin

    2015-01-01

    Magnetic resonance imaging of the human fetal brain has been a clinical tool for many years and provides valuable additional information to complement more common ultrasound studies. Advances in both MRI acquisition and post processing over the last 10 years have enabled full 3D imaging and the accurate combination of data acquired in different head positions to create improved geometric integrity, tissue contrast and resolution. This research is now motivating the development of new quantitative MRI-based techniques for clinical imaging that can more accurately characterize brain development and detect abnormalities. In this paper we will review some of the key areas that are driving changes in our understanding of fetal brain growth using quantitative measures derived from in utero MRI, and possible directions for its increased use in improving the evaluation of pregnancies and the accurate characterization of abnormal brain growth. PMID:25813665

  10. Mapping molecular orientation in solids by rotating-frame NQR techniques

    PubMed

    Casanova; Robert; Pusiol

    1998-07-01

    A multi-dimensional NQR technique to image both the spatial distribution of quadrupolar nuclei and the local orientation of the electric field gradient tensor at the quadrupole sites in solids is reported. The encoding procedure is based on the irradiation of the sample by a pulse sequence composed of spatially homogeneous and inhomogeneous radiofrequency fields. A method that encodes the spatial and orientation information in the amplitudes of the free-induction decay signals and a proper three-dimensional reconstruction procedure that yields the space-orientation-dependent NQR spectra are described. A two-dimensional variant allows rapid measurement of the spatially dependent orientation distribution of molecules, disregarding the spectroscopic information. Copyright 1998 Academic Press.

  11. Updating the 1:50.000 geological maps of Rhodes Island using remote sensing data and GIS techniques

    NASA Astrophysics Data System (ADS)

    Tsombos, Panagiotis I.; Nikolakopoulos, Konstantinos G.; Photiades, Adonis; Psonis, Konstantinos

    2007-10-01

    The island of Rhodes represents an uplifted easternmost segment of the Hellenic forearc extending between Greece and Turkey, which is associated with the subduction of the African plate below the Aegean. Middle Miocene-Pleistocene sedimentary basins are separated by a stack of Alpine nappes of the Hellenide orogen exposed in uplifted fault blocks, such as the Plattenkalk series (Attaviros Group), Gavrovo-Tripolitza series (Lindos Group), Pindos-Olonos series (Profitis Ilias Group) and several outcrops of the Pelagonian series with ophiolitic mélanges. These pre-Neogene formations are dominated by low-grade metamorphic sediments, which were folded and faulted during several phases of the alpine orogeny. The Rhodes region was uplifted in the Middle Miocene, after which subsidence and deposition of sediments took place in the Upper Pliocene and Lower Pleistocene. In this paper we present the combined use of remote sensing and GIS techniques for the geological mapping of Rhodes Island at a 1/50.000 scale. The geological formations, geotectonic units and the tectonic structure were recognized in situ and mapped. Interpretation of medium-resolution satellite images (Landsat 7 ETM and Terra ASTER) has been carried out in order to detect the linear and other structures of the study area. The in situ mapping was enhanced with data from the digital processing of the satellite data. All the analogue and digital data were imported into a geodatabase specially designed for geological data. After the necessary topological control and corrections, the data were unified and processed in order to create the final layout at 1/50.000 scale.

  12. Ecosystem discovery: Measuring clean energy innovation ecosystems through knowledge discovery and mapping techniques

    DOE PAGES

    Lin, Jessica; Chinthavali, Supriya; Stahl, Chelsey Dunivan; ...

    2016-10-01

    The term ‘innovation ecosystem' is often utilized, but rarely quantified. Oak Ridge National Lab conducted a ground-breaking application of natural language processing, link analysis and other computational techniques to transform text and numerical data into metrics on clean energy innovation activity and geography for the U.S. Department of Energy. The project demonstrates that a machine-assisted methodology gives the user a replicable method to rapidly identify, quantify and characterize clean energy innovation ecosystems. EPSA advanced a novel definition for clean energy innovation ecosystem as the overlap of five Ecosystem Components: 1) nascent clean energy indicators, 2) investors, 3) enabling environment, 4) networking assets and 5) large companies. The tool was created with the flexibility to allow the user to choose the weights of each of the five ecosystem components and the subcomponents. This flexibility allows the user to visualize different subsets of data as well as the composite IE rank. In an independent parallel effort, a DOE analyst in EPSA developed a short list of 22 top US clean energy innovation ecosystems; the Ecosystem Discovery tool was able to identify over 90% of the analyst-reported ecosystems. Full validation and calibration remain outstanding tasks. The tool and the underlying datasets have the potential to address a number of important policy questions. The initial broad list of U.S. clean energy innovation ecosystems, with geographic area, technology focus, and list and types of involved organizations can help describe regional technology activities and capabilities. The implementation of knowledge discovery techniques also revealed both the potential and limitations of an automatic machine extraction methodology to gather ecosystem component data. The project demonstrates that a machine-assisted methodology gives the user a replicable method to rapidly identify, quantify, and characterize clean energy innovation ecosystems.

  13. Mapping patient safety: a large-scale literature review using bibliometric visualisation techniques

    PubMed Central

    Rodrigues, S P; van Eck, N J; Waltman, L; Jansen, F W

    2014-01-01

    Background The amount of scientific literature available is often overwhelming, making it difficult for researchers to have a good overview of the literature and to see relations between different developments. Visualisation techniques based on bibliometric data are helpful in obtaining an overview of the literature on complex research topics, and have been applied here to the topic of patient safety (PS). Methods On the basis of title words and citation relations, publications in the period 2000–2010 related to PS were identified in the Scopus bibliographic database. A visualisation of the most frequently cited PS publications was produced based on direct and indirect citation relations between publications. Terms were extracted from titles and abstracts of the publications, and a visualisation of the most important terms was created. The main PS-related topics studied in the literature were identified using a technique for clustering publications and terms. Results A total of 8480 publications were identified, of which the 1462 most frequently cited ones were included in the visualisation. The publications were clustered into 19 clusters, which were grouped into three categories: (1) magnitude of PS problems (42% of all included publications); (2) PS risk factors (31%) and (3) implementation of solutions (19%). In the visualisation of PS-related terms, five clusters were identified: (1) medication; (2) measuring harm; (3) PS culture; (4) physician; (5) training, education and communication. Both analysis at publication and term level indicate an increasing focus on risk factors. Conclusions A bibliometric visualisation approach makes it possible to analyse large amounts of literature. This approach is very useful for improving one's understanding of a complex research topic such as PS and for suggesting new research directions or alternative research priorities. For PS research, the approach suggests that more research on implementing PS improvement initiatives might be needed.

  14. Mapping patient safety: a large-scale literature review using bibliometric visualisation techniques.

    PubMed

    Rodrigues, S P; van Eck, N J; Waltman, L; Jansen, F W

    2014-03-13

    The amount of scientific literature available is often overwhelming, making it difficult for researchers to have a good overview of the literature and to see relations between different developments. Visualisation techniques based on bibliometric data are helpful in obtaining an overview of the literature on complex research topics, and have been applied here to the topic of patient safety (PS). On the basis of title words and citation relations, publications in the period 2000-2010 related to PS were identified in the Scopus bibliographic database. A visualisation of the most frequently cited PS publications was produced based on direct and indirect citation relations between publications. Terms were extracted from titles and abstracts of the publications, and a visualisation of the most important terms was created. The main PS-related topics studied in the literature were identified using a technique for clustering publications and terms. A total of 8480 publications were identified, of which the 1462 most frequently cited ones were included in the visualisation. The publications were clustered into 19 clusters, which were grouped into three categories: (1) magnitude of PS problems (42% of all included publications); (2) PS risk factors (31%) and (3) implementation of solutions (19%). In the visualisation of PS-related terms, five clusters were identified: (1) medication; (2) measuring harm; (3) PS culture; (4) physician; (5) training, education and communication. Both analysis at publication and term level indicate an increasing focus on risk factors. A bibliometric visualisation approach makes it possible to analyse large amounts of literature. This approach is very useful for improving one's understanding of a complex research topic such as PS and for suggesting new research directions or alternative research priorities. For PS research, the approach suggests that more research on implementing PS improvement initiatives might be needed.

  15. A snapshot of radiation therapy techniques and technology in Queensland: An aid to mapping undergraduate curriculum

    SciTech Connect

    Bridge, Pete; Carmichael, Mary-Ann; Brady, Carole; Dry, Allison

    2013-03-15

    Undergraduate students studying the Bachelor of Radiation Therapy at Queensland University of Technology (QUT) attend clinical placements in a number of department sites across Queensland. To ensure that the curriculum prepares students for the most common treatments and current techniques in use in these departments, a curriculum matching exercise was performed. A cross-sectional census was performed on a pre-determined “Snapshot” date in 2012. This was undertaken by the clinical education staff in each department who used a standardized proforma to count the number of patients as well as prescription, equipment, and technique data for a list of tumour site categories. This information was combined into aggregate anonymized data. All 12 Queensland radiation therapy clinical sites participated in the Snapshot data collection exercise to produce a comprehensive overview of clinical practice on the chosen day. A total of 59 different tumour sites were treated on the chosen day and as expected the most common treatment sites were prostate and breast, comprising 46% of patients treated. Data analysis also indicated that intensity-modulated radiotherapy (IMRT) use is relatively high with 19.6% of patients receiving IMRT treatment on the chosen day. Both IMRT and image-guided radiotherapy (IGRT) indications matched recommendations from the evidence. The Snapshot method proved to be a feasible and efficient method of gathering useful data to inform curriculum matching. Frequency of IMRT use in Queensland matches or possibly exceeds that indicated in the literature. It is recommended that future repetition of the study be undertaken in order to monitor trends in referral patterns and new technology implementation.

  16. Mapping Proxy Sensitivity: A New Technique for Compositional Analysis of Cultured Biominerals and Inorganically Precipitated Materials

    NASA Astrophysics Data System (ADS)

    Gagnon, A. C.; DePaolo, D. J.; DeYoreo, J.; Spero, H. J.; Russell, A. D.

    2011-12-01

    Mineral composition is controlled by a host of environmental factors during precipitation. To build accurate paleo-reconstructions we need to separate the impact of each parameter on proxy behavior and use these data to build a chemical-scale understanding of mineral growth. Biomineral culture and inorganic precipitation experiments, where growth parameters can be manipulated independently, are uniquely suited to calibrate proxies and probe mechanism. Culture and precipitation experiments often involve overgrowth of an initial material. For example, seed crystals are used to control mineralogy and avoid nucleation during inorganic precipitation, while culture experiments in marine organisms typically start with wild specimens. New growth corresponding to the experimental conditions must be resolved from the initial material. Separation is typically achieved using microanalysis, skeletal dissection, or estimates of the initial mass and composition. Each approach imposes limits on the accuracy, precision or types of materials that can be analyzed. Slow growth rates and complicated geometries can make these techniques especially challenging when applied to biominerals. We present a method of compositional analysis for use in biological culture and inorganic growth experiments that overcomes many of these challenges. This method relies on growth in a mixed element stable isotope spike, requires neither the initial mass nor the initial composition to be known, harnesses the precision and sensitivity of bulk analysis, and applies even when it is impossible to physically identify newly grown material. Error analysis suggests this method can significantly improve the precision of metal/calcium measurements in experimentally grown material compared to current methods. Furthermore, the method can isolate different events through time, separating, for example, the impact of day and night cycles on biomineral composition. We will present metal/calcium ratios measured using the

  17. Ecosystem discovery: Measuring clean energy innovation ecosystems through knowledge discovery and mapping techniques

    SciTech Connect

    Lin, Jessica; Chinthavali, Supriya; Stahl, Chelsey Dunivan; Stahl, Christopher; Lee, Sangkeun; Shankar, Mallikarjun

    2016-10-01

    The term ‘innovation ecosystem’ is often utilized, but rarely quantified. Oak Ridge National Lab conducted a ground-breaking application of natural language processing, link analysis and other computational techniques to transform text and numerical data into metrics on clean energy innovation activity and geography for the U.S. Department of Energy. The project demonstrates that a machine-assisted methodology gives the user a replicable method to rapidly identify, quantify and characterize clean energy innovation ecosystems. EPSA advanced a novel definition for clean energy innovation ecosystem as the overlap of five Ecosystem Components: 1) nascent clean energy indicators, 2) investors, 3) enabling environment, 4) networking assets and 5) large companies. The tool was created with the flexibility to allow the user to choose the weights of each of the five ecosystem components and the subcomponents. This flexibility allows the user to visualize different subsets of data as well as the composite IE rank. In an independent parallel effort, a DOE analyst in EPSA developed a short list of 22 top US clean energy innovation ecosystems; the Ecosystem Discovery tool was able to identify over 90% of the analyst-reported ecosystems. Full validation and calibration remain outstanding tasks. The tool and the underlying datasets have the potential to address a number of important policy questions. The initial broad list of U.S. clean energy innovation ecosystems, with geographic area, technology focus, and list and types of involved organizations can help describe regional technology activities and capabilities. The implementation of knowledge discovery techniques also revealed both the potential and limitations of an automatic machine extraction methodology to gather ecosystem component data. The project demonstrates that a machine-assisted methodology gives the user a replicable method to rapidly identify, quantify, and characterize clean energy innovation ecosystems.

  18. Luminous efficiency estimates of meteors -I. Uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Subasinghe, Dilini; Campbell-Brown, Margaret; Stokan, Edward

    2017-09-01

    The luminous efficiency of meteors is poorly known, but critical for determining the meteoroid mass. We present an uncertainty analysis of the luminous efficiency as determined by the classical ablation equations, and suggest a possible method for determining the luminous efficiency of real meteor events. We find that a two-term exponential fit to simulated lag data is able to reproduce simulated luminous efficiencies reasonably well.
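
    As an illustration of the suggested approach, the sketch below fits a two-term exponential curve to synthetic meteor lag data with scipy.optimize.curve_fit; the functional form and parameter values are assumptions for illustration and may differ from the parameterization used in the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def two_term_exp(t, a1, b1, a2, b2):
    """Two-term exponential model for meteor lag versus time
    (functional form assumed for illustration)."""
    return a1 * (np.exp(b1 * t) - 1.0) + a2 * (np.exp(b2 * t) - 1.0)

# Hypothetical noisy lag data (metres) over ~0.5 s of observation
rng = np.random.default_rng(1)
t = np.linspace(0.0, 0.5, 60)
lag = 5.0 * (np.exp(3.0 * t) - 1.0) + 0.5 * (np.exp(8.0 * t) - 1.0)
lag += rng.normal(scale=2.0, size=t.size)

popt, pcov = curve_fit(two_term_exp, t, lag, p0=[1.0, 1.0, 1.0, 5.0], maxfev=10000)
# The smooth fit can then be differentiated twice to estimate deceleration
decel = np.gradient(np.gradient(two_term_exp(t, *popt), t), t)
```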

  19. Scatterer size and concentration estimation technique based on a 3D acoustic impedance map from histologic sections

    NASA Astrophysics Data System (ADS)

    Mamou, Jonathan; Oelze, Michael L.; O'Brien, William D.; Zachary, James F.

    2004-05-01

    Accurate estimates of scatterer parameters (size and acoustic concentration) are beneficial adjuncts to characterize disease from ultrasonic backscatter measurements. An estimation technique was developed to obtain parameter estimates from the Fourier transform of the spatial autocorrelation function (SAF). A 3D impedance map (3DZM) is used to obtain the SAF of tissue. 3DZMs are obtained by aligning digitized light microscope images from histologic preparations of tissue. Estimates were obtained for simulated 3DZMs containing randomly located spherical scatterers: relative errors were less than 3%. Estimates were also obtained from a rat fibroadenoma and a 4T1 mouse mammary tumor (MMT). Tissues were fixed (10% neutral-buffered formalin), embedded in paraffin, serially sectioned and stained with H&E. 3DZM results were compared to estimates obtained independently from ultrasonic backscatter measurements. For the fibroadenoma and MMT, average scatterer diameters were 91 and 31.5 μm, respectively. Ultrasonic measurements yielded average scatterer diameters of 105 and 30 μm, respectively. The 3DZM estimation scheme showed results similar to those obtained by the independent ultrasonic measurements. The 3D impedance maps show promise as a powerful tool to characterize ultrasonic scattering sites of tissue. [Work supported by the University of Illinois Research Board.]
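
    The first step of the estimation technique above - the Fourier transform of the spatial autocorrelation of a 3D impedance map - can be sketched in a few lines of NumPy via the Wiener-Khinchin relation; the toy 3DZM and the impedance contrast below are placeholders, and the subsequent model fit that yields scatterer size and concentration is omitted.

```python
import numpy as np

def saf_power_spectrum(zmap):
    """Power spectrum of the spatial autocorrelation function (SAF) of a
    3-D acoustic impedance map: |FFT|^2 of the zero-mean map equals the
    FFT of its autocorrelation (Wiener-Khinchin)."""
    z = zmap - zmap.mean()                   # remove the DC component
    spectrum = np.abs(np.fft.fftn(z)) ** 2   # FFT of the spatial autocorrelation
    return np.fft.fftshift(spectrum)

# Toy 3DZM: random spherical "scatterers" in a uniform background
rng = np.random.default_rng(2)
zmap = np.ones((64, 64, 64))
x, y, z = np.indices(zmap.shape)
for cx, cy, cz in rng.integers(8, 56, size=(20, 3)):
    zmap[(x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2 <= 9] = 1.1  # impedance contrast
spec = saf_power_spectrum(zmap)
```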

  20. Mapping and hazard assessment of atmospheric pollution in a medium sized urban area using the Rasch model and geostatistics techniques

    NASA Astrophysics Data System (ADS)

    Moral, Francisco J.; Álvarez, Pedro; Canito, José L.

    Researchers or decision-makers frequently need information about atmospheric pollution patterns in urbanized areas. The preparation of this type of information is a complex task, due to the influence of several individual pollutants, with different units, on the global air pollution (e.g. nitrogen dioxide concentrations, ppm, and noise, dB). In this work, a new methodology based on the formulation of the Rasch model is proposed to obtain a measure of the atmospheric pollution. Two main results were obtained after applying this method: (1) A classification of all locations according to the pollution level, which was the value of the Rasch measure; (2) The influence on the environmental deterioration of each individual pollutant (particularly, in this work, NO2, NO, CO2, CO and noise). Finally, pollution at locations where no measurements were available was estimated with the optimum interpolation technique, kriging. Kriged estimates were subsequently used to map atmospheric pollution. To illustrate the application of this two-step method (Rasch model plus interpolation), which is useful to generate hazard assessment maps based on the spatial distribution of atmospheric pollution, an example is shown.
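
    The interpolation step described above can be illustrated with a compact ordinary-kriging routine; the exponential variogram, its parameters, and the synthetic monitoring sites below are placeholders rather than values from the study.

```python
import numpy as np

def exp_variogram(h, sill=1.0, rng_=10.0, nugget=0.0):
    """Exponential semivariogram model (parameters are illustrative)."""
    return nugget + sill * (1.0 - np.exp(-h / rng_))

def ordinary_kriging(xy, values, xy0, **vario):
    """Estimate the pollution measure at xy0 from scattered observations
    by ordinary kriging (minimal textbook formulation)."""
    n = len(values)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = exp_variogram(d, **vario)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = exp_variogram(np.linalg.norm(xy - xy0, axis=1), **vario)
    w = np.linalg.solve(A, b)                # kriging weights + Lagrange multiplier
    return w[:n] @ values

# Toy example: 30 monitoring sites with a synthetic pollution measure
rng = np.random.default_rng(3)
sites = rng.uniform(0, 100, size=(30, 2))
measure = np.sin(sites[:, 0] / 20) + 0.1 * rng.normal(size=30)
print(ordinary_kriging(sites, measure, np.array([50.0, 50.0]), sill=1.0, rng_=30.0))
```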

  1. Characterisation of Damaged Tubular Composites by Acoustic Emission, Thermal Diffusivity Mapping and TSR-RGB Projection Technique

    NASA Astrophysics Data System (ADS)

    Chandarana, Neha; Lansiaux, Henri; Gresil, Matthieu

    2016-11-01

    An increase in the use of composite materials, owing to improved design and fabrication processes, has led to cost reductions in many industries. Resistance to corrosion, high specific strength, and stiffness are just a few of their many attractive properties. However, damage tolerance remains a major concern in the implementation of composites and uncertainty regarding component lifetimes can lead to over-design and under-use of such materials. A combination of non-destructive evaluation (NDE) and structural health monitoring (SHM) have shown promise in improving confidence by enabling data collection in-situ and in real time. In this work, infrared thermography (IRT) is employed for NDE of tubular composite specimens before and after impact. Four samples are impacted with energies of 5 J, 7.5 J, and 10 J by an un-instrumented falling weight set-up. Acoustic emissions (AE) are monitored using bonded piezoelectric sensors during one of the four impact tests. IRT data is used to generate diffusivity and thermal depth mappings of each sample using the thermographic signal reconstruction (TSR) red green blue (RGB) projection technique. Analysis of AE data alone for a 10 J impact suggests significant damage to the fibres and matrix; this is in good agreement with the generated thermal depth mappings for each sample, which indicate damage through multiple fibre layers. IRT and AE data are correlated and validated by optical micrographs taken along the cross section of damage.

  2. Characterisation of Damaged Tubular Composites by Acoustic Emission, Thermal Diffusivity Mapping and TSR-RGB Projection Technique

    NASA Astrophysics Data System (ADS)

    Chandarana, Neha; Lansiaux, Henri; Gresil, Matthieu

    2017-04-01

    An increase in the use of composite materials, owing to improved design and fabrication processes, has led to cost reductions in many industries. Resistance to corrosion, high specific strength, and stiffness are just a few of their many attractive properties. However, damage tolerance remains a major concern in the implementation of composites and uncertainty regarding component lifetimes can lead to over-design and under-use of such materials. A combination of non-destructive evaluation (NDE) and structural health monitoring (SHM) have shown promise in improving confidence by enabling data collection in-situ and in real time. In this work, infrared thermography (IRT) is employed for NDE of tubular composite specimens before and after impact. Four samples are impacted with energies of 5 J, 7.5 J, and 10 J by an un-instrumented falling weight set-up. Acoustic emissions (AE) are monitored using bonded piezoelectric sensors during one of the four impact tests. IRT data is used to generate diffusivity and thermal depth mappings of each sample using the thermographic signal reconstruction (TSR) red green blue (RGB) projection technique. Analysis of AE data alone for a 10 J impact suggests significant damage to the fibres and matrix; this is in good agreement with the generated thermal depth mappings for each sample, which indicate damage through multiple fibre layers. IRT and AE data are correlated and validated by optical micrographs taken along the cross section of damage.

  3. Proteomic analysis of estrogen response of premalignant human breast cells using a 2-D liquid separation/mass mapping technique.

    PubMed

    Zhao, Jia; Zhu, Kan; Lubman, David M; Miller, Fred R; Shekhar, Malathy P V; Gerard, Brigitte; Barder, Timothy J

    2006-07-01

    A 2-D liquid-phase separation method based on chromatofocusing and nonporous silica RP-HPLC followed by ESI-TOF-MS was used to analyze proteins in whole cell lysates from estrogen-treated and untreated cells of the premalignant, estrogen-responsive MCF10AT1 cell line. 2-D mass maps in the pH range 4.6-6.0 were generated with good correlation to theoretical M(r) values for intact proteins. Proteins were identified based on intact M(r), pI and PMF, or MS/MS sequencing. About 300 unique proteins were identified, and 120 proteins in the mass range 5-75 kDa were quantified upon estrogen treatment. Around 40 proteins were found to be more highly expressed (>four-fold) and 17 were down-regulated (>four-fold) in treated cells. In our study, we found that many altered proteins have characteristics consistent with the development of a malignant phenotype. Some of them have a role in the ras pathway or play an important role in signal pathways. These changed proteins might be essential in the estrogen regulation mechanism. Our study highlights the use of the MCF10AT1 cell line to examine estrogen-induced changes in premalignant breast cells and the ability of the 2-D mass mapping technique to quantitatively study protein expression changes on a proteomic scale.

  4. Explore the interrelationships of the WFE Nexus by relational topological mapping techniques - a case in northern Taiwan

    NASA Astrophysics Data System (ADS)

    Liao, Y. Y.; Tsai, W. P.; Chang, F. J.

    2016-12-01

    In recent years, the Water-Food-Energy Nexus (WFE Nexus) has become a global issue due to the scarcity of natural resources. Besides, the resource management of water, energy and food at spatial and temporal scales raises multiple interwoven concerns ranging from co-opetitive relationships to resource accessibility. Over the past few decades, the socio-economic development of Taiwan has experienced drastic transformation because of population growth, urbanization and excessive consumption of natural resources. Therefore, our research intends to explore the interrelationships of the WFE Nexus by means of data mining techniques, such as artificial intelligence (AI). Our study area is the Tamsui River watershed in northern Taiwan. First, we collect and analyze long-term WFE-related multidimensional heterogeneous observational data, such as water demand, food consumption and energy requirements. Then, in order to explore the interrelationships and interactions of the WFE Nexus, the self-organizing map (SOM) is used to classify the multidimensional heterogeneous data and build two-dimensional topological maps for clustering purposes. Consequently, we aim to establish the interrelationships in the WFE Nexus based on the obtained SOM topologies. This research will also explore the collaborative synergies of resource allocation and utilization that may influence the development of urbanization in the future. Ultimately, based on our proposed methodology and findings, the considerable impacts of the WFE Nexus on resource supply and demand, and how the potential collaborative synergies would promote sustainable socio-economic development, will be illuminated.
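
    A minimal NumPy sketch of a self-organizing map like the one described above is given below; the grid size, learning schedule and synthetic indicator data are illustrative assumptions, not the configuration used in the study.

```python
import numpy as np

def train_som(data, grid=(10, 10), n_iter=2000, lr0=0.5, sigma0=3.0, seed=0):
    """Minimal self-organizing map for clustering standardized
    water-food-energy indicators onto a 2-D topological grid."""
    rng = np.random.default_rng(seed)
    h, w = grid
    weights = rng.normal(size=(h, w, data.shape[1]))
    yy, xx = np.mgrid[0:h, 0:w]
    for t in range(n_iter):
        lr = lr0 * np.exp(-t / n_iter)                        # decaying learning rate
        sigma = sigma0 * np.exp(-t / n_iter)                  # shrinking neighbourhood
        x = data[rng.integers(len(data))]                     # random training sample
        dist = np.linalg.norm(weights - x, axis=-1)
        bi, bj = np.unravel_index(dist.argmin(), dist.shape)  # best-matching unit
        nb = np.exp(-((yy - bi) ** 2 + (xx - bj) ** 2) / (2.0 * sigma ** 2))
        weights += lr * nb[..., None] * (x - weights)
    return weights

# Hypothetical monthly WFE indicators: water demand, food consumption, energy use, ...
data = np.random.default_rng(4).normal(size=(240, 5))
som = train_som(data)
```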

  5. An easy-to-use technique to characterize cardiodynamics from first-return maps on ΔRR-intervals

    NASA Astrophysics Data System (ADS)

    Fresnel, Emeline; Yacoub, Emad; Freitas, Ubiratan; Kerfourn, Adrien; Messager, Valérie; Mallet, Eric; Muir, Jean-François; Letellier, Christophe

    2015-08-01

    Heart rate variability analysis using 24-h Holter monitoring is frequently performed to assess the cardiovascular status of a patient. The present retrospective study is based on the beat-to-beat interval variations or ΔRR, which offer a better view of the underlying structures governing the cardiodynamics than the common RR-intervals. By investigating data for three groups of adults (with normal sinus rhythm, congestive heart failure, and atrial fibrillation, respectively), we showed that the first-return maps built on ΔRR can be classified according to three structures: (i) a moderate central disk, (ii) a reduced central disk with well-defined segments, and (iii) a large triangular shape. These three very different structures can be distinguished by computing a Shannon entropy based on a symbolic dynamics and an asymmetry coefficient, here introduced to quantify the balance between accelerations and decelerations in the cardiac rhythm. The probability P111111 of successive heart beats without large beat-to-beat fluctuations allows one to assess the regularity of the cardiodynamics. A characteristic time scale, corresponding to the partition inducing the largest Shannon entropy, was also introduced to quantify the ability of the heart to modulate its rhythm: it was significantly different for the three structures of first-return maps. A blind validation of the technique was also performed.
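
    The quantities discussed above can be illustrated with a short NumPy sketch that builds ΔRR from an RR series, forms the first-return map, and computes a symbolic Shannon entropy and a simple acceleration/deceleration balance; the symbol thresholds, word length and coefficient definitions are placeholders and differ from the exact formulations in the paper.

```python
import numpy as np

def delta_rr(rr_ms):
    """Beat-to-beat interval variations (ΔRR) from RR intervals in ms."""
    return np.diff(np.asarray(rr_ms, dtype=float))

def first_return_map(drr):
    """Points (ΔRR_n, ΔRR_{n+1}) of the first-return map."""
    return np.column_stack([drr[:-1], drr[1:]])

def symbolic_shannon_entropy(drr, thresholds=(-25.0, 25.0), word_len=3):
    """Shannon entropy (bits) of a simple symbolic dynamics on ΔRR:
    each ΔRR becomes a symbol (large deceleration / small change / large
    acceleration; thresholds in ms are illustrative), consecutive symbols
    are grouped into words, and the word-distribution entropy is returned."""
    symbols = np.digitize(drr, thresholds)
    words = [tuple(symbols[i:i + word_len]) for i in range(len(symbols) - word_len + 1)]
    _, counts = np.unique(words, axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def asymmetry_coefficient(drr):
    """Illustrative acceleration/deceleration balance: fraction of
    negative ΔRR (accelerations) minus fraction of positive ΔRR."""
    return (np.sum(drr < 0) - np.sum(drr > 0)) / max(len(drr), 1)

# Synthetic RR series (ms)
rr = 800 + 40 * np.random.default_rng(5).standard_normal(500)
drr = delta_rr(rr)
print(symbolic_shannon_entropy(drr), asymmetry_coefficient(drr))
```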

  6. A new technique for landslide mapping from a large-scale remote sensed image: A case study of Central Nepal

    NASA Astrophysics Data System (ADS)

    Yu, Bo; Chen, Fang

    2017-03-01

    This paper presents a new technique for landslide mapping from large-scale Landsat8 images. The method introduces saliency enhancement to enhance the landslide regions, making the landslides salient objects in the image. Morphological operations are applied to the enhanced image to remove most background objects. Afterwards, a digital elevation model is applied to further remove ground objects in plain areas according to the landscape height, since most landslides occur in mountainous areas. Final landslides are extracted using the proposal regions from selective search. The study area covers 2° × 2°, making it more representative of practical cases, such as emergency response and landslide inventory mapping. The proposed method performs satisfactorily, detecting 99.1% of the landslides in the image and obtaining an overall accuracy of 99.8% in the landslide/background classification problem; this is further validated on another Landsat8 image of a different site. The experiment shows that the proposed method is feasible for landslide detection over large areas, which may contribute to further landslide-related research.
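
    A rough sketch of the post-enhancement steps (thresholding, morphological cleaning, DEM masking of plain areas, and removal of small components) is given below using scipy.ndimage; the thresholds and the synthetic inputs are placeholders, and the saliency enhancement and selective-search stages are omitted.

```python
import numpy as np
from scipy import ndimage

def candidate_landslide_mask(enhanced, dem, intensity_thresh=0.6,
                             elevation_thresh=500.0, min_pixels=50):
    """Threshold a saliency-enhanced band, clean it with morphological
    opening/closing, drop low-elevation (plain) areas using the DEM, and
    remove tiny connected components. All thresholds are placeholders."""
    mask = enhanced > intensity_thresh
    mask = ndimage.binary_opening(mask, iterations=1)
    mask = ndimage.binary_closing(mask, iterations=1)
    mask &= dem > elevation_thresh                      # keep mountainous terrain only
    labels, n = ndimage.label(mask)
    sizes = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
    return np.isin(labels, 1 + np.flatnonzero(sizes >= min_pixels))

# Toy inputs standing in for a saliency-enhanced Landsat8 band and a DEM
rng = np.random.default_rng(6)
enhanced = rng.random((256, 256))
dem = 1000.0 * rng.random((256, 256))
mask = candidate_landslide_mask(enhanced, dem)
```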

  7. An interpretation of a geologic map of Fannin County, Texas, prepared by ADP techniques from ERTS MSS data

    NASA Technical Reports Server (NTRS)

    Henderson, J. A., Jr.; Gardner, J. J.; Cipra, J. E.

    1973-01-01

    Multispectral scanner data, collected by ERTS-1 from an area in northeast Texas underlain by structurally simple Cretaceous sedimentary rocks, were analyzed using several automatic data processing techniques. Training areas for the computer analysis were selected in three ways using: (1) an unsupervised classifier, (2) the unsupervised classifier to further refine training areas, and (3) available ground information. The first two methods utilize the capability of the unsupervised classifier to group resolution elements with similar spectral properties. With the imagery produced by these procedures, several geologic features can be identified. The most easily recognizable features are streams and other water bodies. The approximate location of contacts between several rock units can be mapped.

  8. Modelling of high-frequency structure-borne sound transmission on FEM grids using the Discrete Flow Mapping technique

    NASA Astrophysics Data System (ADS)

    Hartmann, Timo; Tanner, Gregor; Xie, Gang; Chappell, David; Bajars, Janis

    2016-09-01

    Dynamical Energy Analysis (DEA) combined with the Discrete Flow Mapping technique (DFM) has recently been introduced as a mesh-based high frequency method modelling structure borne sound for complex built-up structures. This has proven to enhance vibro-acoustic simulations considerably by making it possible to work directly on existing finite element meshes circumventing time-consuming and costly re-modelling strategies. In addition, DFM provides detailed spatial information about the vibrational energy distribution within a complex structure in the mid-to-high frequency range. We will present here progress in the development of the DEA method towards handling complex FEM-meshes including Rigid Body Elements. In addition, structure borne transmission paths due to spot welds are considered. We will present applications for a car floor structure.

  9. Digital snow mapping technique using LANDSAT data and General Electric IMAGE 100 system. [Dinwoody Creek in Wind River Mountains, Wyoming

    NASA Technical Reports Server (NTRS)

    Dallam, W. C.; Foster, J. L.

    1975-01-01

    In this study, a technique and procedures using General Electric IMAGE 100 system were derived for performing a snow cover analysis of small watersheds for quasi-operational application. The study area was the Wind River Mountains of west central Wyoming. A small watershed, namely, Dinwoody Creek was selected as a test site. LANDSAT data and U-2 imagery were used in the analysis. From a minimal snowcover LANDSAT scene, multispectral analysis was performed yielding the distribution of forest, bare rock, grassland, water, and snow within the watershed. The forest and bare rock themes were saved and registered with other scenes containing greater snow cover. Likewise, elevation contours from a digitized map were stored and superimposed over the snowpack areas.

  10. Epitope mapping by epitope excision, hydrogen/deuterium exchange, and peptide-panning techniques combined with in silico analysis.

    PubMed

    Clementi, Nicola; Mancini, Nicasio; Criscuolo, Elena; Cappelletti, Francesca; Clementi, Massimo; Burioni, Roberto

    2014-01-01

    The fine characterization of protective B cell epitopes plays a pivotal role in the development of novel vaccines. The development of epitope-based vaccines, in fact, is not possible without a clear definition of the antigenic regions involved in the binding between the protective antibody (Ab) and its molecular target. To achieve this result, different epitope-mapping approaches have been widely described (Clementi et al. Drug Discov Today 18(9-10):464-471, 2013). Nowadays, the best way to characterize an Ab-bound region is still the resolution of the Ab-antigen (Ag) co-crystal structure. Unfortunately, crystallization approaches are not always feasible. However, different experimental strategies aimed at predicting the Ab-Ag interaction, followed by in silico analysis of the results, may be good surrogate approaches. Here, we review a few experimental techniques followed by the use of "basic" informatics tools for the analysis of the results.

  11. [Working hypothesis of simplified techniques for the first mapping of occupational hazards in handicrafts. Part two: physical and chemical risks].

    PubMed

    Montomoli, L; Di Leone, G; Aprea, M C; Sciarra, G; Pinto, I; Sartorelli, P

    2009-01-01

    Craftsmanship features a strong link with tradition, which preserves and perpetuates techniques created and consolidated over time in a particular historical, cultural and geographical situation. Italy is home to excellent craftsmanship: fashion, furniture, giftware, design. The framework law for the craft sector, n. 433/1985, defines a craft business as one carried on by a craftsman with the predominant objective of producing goods, including semi-finished goods, or services. The D.Lgs 81/2008 defines the manner in which the employer must conduct the risk assessment in the workplace and prepare the risk assessment document. The aim of the study is to provide employers with a simple methodology that allows the drafting of a first mapping of hazards/dangers by identifying risk sources in the working process. The developed methodology is expressed as a model of information gathering, which leads to a first overview of all the major risk indicators that may arise in craftsmanship.

  12. A Technique for Mapping Characteristic Lengths to Preserve Energy Dissipated via Strain Softening in a Multiscale Analysis

    NASA Technical Reports Server (NTRS)

    Pineda, Evan J.; Bednarcyk, Brett A.; Arnold, Steven M.

    2014-01-01

    It is often advantageous to account for the microstructure of the material directly using multiscale modeling. For computational tractability, an idealized repeating unit cell (RUC) is used to capture all of the pertinent features of the microstructure. Typically, the RUC is dimensionless and depends only on the relative volume fractions of the different phases in the material. This works well for non-linear and inelastic behavior exhibiting a positive-definite constitutive response. However, once the material exhibits strain softening, or localization, mesh-objective failure theories, such as smeared fracture theories, nodal and element enrichment theories (XFEM), cohesive elements, or the virtual crack closure technique (VCCT), can be utilized at the microscale, but the dimensions of the RUC must then be defined. One major challenge in multiscale progressive damage modeling is relating the characteristic lengths across the scales in order to preserve the energy that is dissipated via localization at the microscale. If there is no effort to relate the size of the macroscale element to the microscale RUC, then the energy that is dissipated will remain mesh dependent at the macroscale, even if it is regularized at the microscale. Here, a technique for mapping characteristic lengths across the scales is proposed. The RUC will be modeled using the generalized method of cells (GMC) micromechanics theory, and local failure in the matrix constituent subcells will be modeled using the crack band theory. The subcell characteristic lengths used in the crack band calculations will be mapped to the macroscale finite element in order to regularize the local energy in a manner consistent with the global length scale. Examples will be provided with and without the regularization, and they will be compared to a baseline case where the size and shape of the element and RUC are coincident (ensuring energy is preserved across the scales).

  13. A Comparison of Techniques for Mapping the Distribution of Sediment on the Bed of the Colorado River in Grand Canyon

    NASA Astrophysics Data System (ADS)

    Fuller, E.; Kaplinski, M.; Rubin, D. M.

    2004-12-01

    The Grand Canyon Monitoring and Research Center is charged with establishing and implementing monitoring projects to provide scientific information to the Glen Canyon Dam Adaptive Management Program (GCDAMP) on the effects of operating Glen Canyon Dam on the downstream resources of the Colorado River ecosystem. One primary resource of concern to the GCDAMP is fine-grained sediment. Glen Canyon Dam traps approximately 94% of the pre-dam sand supply to the Colorado River in Grand Canyon, resulting in a decline in the size of eddy sand bars (25% decline in surface area over the past 15 years). Sand bars are an important resource because they provide habitat for endangered native fish, protect archeological sites, provide substrate for vegetation, are used as campsites and are a distinctive feature of the pre-dam environment. A combination of traditional survey techniques and multi-beam bathymetry has been used to determine the size and elevation of sandbars and to obtain topographic maps of the riverbed. These techniques have proven useful in evaluating the spatial changes and channel morphology along the Colorado River ecosystem. While previous studies have been very effective in measuring volumetric and spatial changes, a method is needed to map the distribution of sediment along the submerged portion of the river channel. The distribution of fine-grained sediment is needed to evaluate the potential for deposition onto high elevation sand bars during proposed experimental high flows. This study used high-resolution multibeam bathymetry, acoustic backscatter and underwater video images collected on expeditions in 2002 and 2004 to evaluate the different methodologies. The purpose of this study was to evaluate possible technologies to be used in determining the distribution of sediment along the bed of the Colorado River in Grand Canyon. These technologies include: 1) visual interpretation of shaded relief images produced from multibeam bathymetry; 2) visual

  14. Visualizing Mars Using Virtual Reality: A State of the Art Mapping Technique Used on Mars Pathfinder

    NASA Technical Reports Server (NTRS)

    Stoker, C.; Zbinden, E.; Blackmon, T.; Nguyen, L.

    1999-01-01

    We describe an interactive terrain visualization system which rapidly generates and interactively displays photorealistic three-dimensional (3-D) models produced from stereo images. This product, first demonstrated in Mars Pathfinder, is interactive, 3-D, and can be viewed in an immersive display which qualifies it for the name Virtual Reality (VR). The use of this technology on Mars Pathfinder was the first use of VR for geologic analysis. A primary benefit of using VR to display geologic information is that it provides an improved perception of depth and spatial layout of the remote site. The VR aspect of the display allows an operator to move freely in the environment, unconstrained by the physical limitations of the perspective from which the data were acquired. Virtual Reality offers a way to archive and retrieve information in a way that is intuitively obvious. Combining VR models with stereo display systems can give the user a sense of presence at the remote location. The capability to interactively perform measurements from within the VR model offers unprecedented ease in performing operations that are normally time consuming and difficult using other techniques. Thus, Virtual Reality can be a powerful cartographic tool. Additional information is contained in the original extended abstract.

  15. Visualizing Mars Using Virtual Reality: A State of the Art Mapping Technique Used on Mars Pathfinder

    NASA Technical Reports Server (NTRS)

    Stoker, C.; Zbinden, E.; Blackmon, T.; Nguyen, L.

    1999-01-01

    We describe an interactive terrain visualization system which rapidly generates and interactively displays photorealistic three-dimensional (3-D) models produced from stereo images. This product, first demonstrated in Mars Pathfinder, is interactive, 3-D, and can be viewed in an immersive display which qualifies it for the name Virtual Reality (VR). The use of this technology on Mars Pathfinder was the first use of VR for geologic analysis. A primary benefit of using VR to display geologic information is that it provides an improved perception of depth and spatial layout of the remote site. The VR aspect of the display allows an operator to move freely in the environment, unconstrained by the physical limitations of the perspective from which the data were acquired. Virtual Reality offers a way to archive and retrieve information in a way that is intuitively obvious. Combining VR models with stereo display systems can give the user a sense of presence at the remote location. The capability to interactively perform measurements from within the VR model offers unprecedented ease in performing operations that are normally time consuming and difficult using other techniques. Thus, Virtual Reality can be a powerful cartographic tool. Additional information is contained in the original extended abstract.

  16. Techniques for precise mapping of 226Ra and 228Ra in the ocean

    NASA Astrophysics Data System (ADS)

    Moore, Willard S.; Key, Robert M.; Sarmiento, Jorge L.

    1985-01-01

    Improvements in the analyses of 226Ra and 228Ra in seawater made possible by better extraction and processing techniques reduce significantly the errors associated with these measurements. These improvements and the extensive sampling for Ra isotopes conducted on the TTO North Atlantic Study should enable us to use the distribution of 228Ra to study mixing processes on a 3-15 year time scale in both the upper and deep North Atlantic. The 228Ra profiles already analyzed show a closer resemblance to GEOSECS tritium data than to TTO tritium data in the upper ocean. This is because the transient tracer tritium was responding on a 10-year time scale during GEOSECS and a 20-year time scale during TTO. The steady state tracer 228Ra should always respond on a time scale of 8 years. Thus the 228Ra data obtained on TTO should provide a means to extend the features of the GEOSECS tritium field to the regions of the TTO study. The 226Ra data are of high enough quality to identify features associated with different water masses. Changes in the positions of the deep-water masses since the GEOSECS cruise are revealed by the 226Ra data.

  17. The anti-CMS technique for genome-wide mapping of 5-hydroxymethylcytosine.

    PubMed

    Huang, Yun; Pastor, William A; Zepeda-Martínez, Jorge A; Rao, Anjana

    2012-10-01

    5-Hydroxymethylcytosine (5hmC) is a recently discovered base in the mammalian genome, produced upon oxidation of 5-methylcytosine (5mC) in a process catalyzed by TET proteins. The biological functions of 5hmC and further oxidation products of 5mC are under intense investigation, as they are likely intermediates in DNA demethylation pathways. Here we describe a novel protocol to profile 5hmC at a genome-wide scale. This approach is based on sodium bisulfite-mediated conversion of 5hmC to cytosine-5-methylenesulfonate (CMS); CMS-containing DNA fragments are then immunoprecipitated using a CMS-specific antiserum. The anti-CMS technique is highly specific with a low background, and is much less dependent on 5hmC density than anti-5hmC immunoprecipitation (IP). Moreover, it does not enrich for CA and CT repeats, as noted for 5hmC DNA IP using antibodies to 5hmC. The anti-CMS protocol takes 3 d to complete.

  18. Intersubject variability in the analysis of diffusion tensor images at the group level: fractional anisotropy mapping and fiber tracking techniques.

    PubMed

    Müller, Hans-Peter; Unrath, Alexander; Riecker, Axel; Pinkhardt, Elmar H; Ludolph, Albert C; Kassubek, Jan

    2009-04-01

    Diffusion tensor imaging (DTI) provides comprehensive information about quantitative diffusion and connectivity in the human brain. Transformation into stereotactic standard space is a prerequisite for group studies and requires thorough data processing to preserve directional inter-dependencies. The objective of the present study was to optimize technical approaches for this preservation of quantitative and directional information during spatial normalization in data analyses at the group level. Different averaging methods for mean diffusion-weighted images containing DTI information were compared, i.e., region of interest-based fractional anisotropy (FA) mapping, fiber tracking (FT) and corresponding tractwise FA statistics (TFAS). The novel technique of intersubject FT that takes into account directional information of single data sets during the FT process was compared to standard FT techniques. Application of the methods was shown in the comparison of normal subjects and subjects with defined white matter pathology (alterations of the corpus callosum). Fiber tracking was applied to averaged data sets and showed similar results compared with FT on single subject data. The application of TFAS to averaged data showed averaged FA values around 0.4 for normal controls. The values were in the range of the standard deviation for averaged FA values for TFAS applied to single subject data. These results were independent of the applied averaging technique. A significant reduction of the averaged FA values was found in comparison to TFAS applied to data from subjects with defined white matter pathology (FA around 0.2). The applicability of FT techniques in the analysis of different subjects at the group level was demonstrated. Group comparisons as well as FT on group averaged data were shown to be feasible. The objective of this work was to identify the most appropriate method for intersubject averaging and group comparison which incorporates intersubject variability of
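
    For reference, the fractional anisotropy values discussed above come from the standard formula on the diffusion tensor eigenvalues; a minimal NumPy sketch is shown below with illustrative example tensors.

```python
import numpy as np

def fractional_anisotropy(eigvals):
    """Fractional anisotropy from the three eigenvalues of a diffusion
    tensor (standard textbook formula); eigvals has shape (..., 3)."""
    l = np.asarray(eigvals, dtype=float)
    md = l.mean(axis=-1, keepdims=True)                     # mean diffusivity
    num = np.sqrt(((l - md) ** 2).sum(axis=-1))
    den = np.sqrt((l ** 2).sum(axis=-1))
    return np.sqrt(1.5) * np.divide(num, den, out=np.zeros_like(num), where=den > 0)

# Example: an isotropic tensor (FA ~ 0) and a strongly anisotropic one (FA ~ 0.87)
print(fractional_anisotropy([[1.0e-3, 1.0e-3, 1.0e-3], [1.7e-3, 0.2e-3, 0.2e-3]]))
```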

  19. Mapping of groundwater potential zones in Salem Chalk Hills, Tamil Nadu, India, using remote sensing and GIS techniques.

    PubMed

    Thilagavathi, N; Subramani, T; Suresh, M; Karunanidhi, D

    2015-04-01

    This study proposes to introduce the remote sensing and geographic information system (GIS) techniques in mapping the groundwater potential zones. Remote sensing and GIS techniques have been used to map the groundwater potential zones in Salem Chalk Hills, Tamil Nadu, India. Charnockites and fissile hornblende biotite gneiss are the major rock types in this region. Dunites and peridotites are the ultramafic rocks which cut across the foliation planes of the gneisses and are highly weathered. It comprises magnesite and chromite deposits which are excavated by five mining companies using bench mining. The thickness of the weathered and fracture zone varies from 2.2 to 50 m in the gneissic formation and 5.8 to 55 m in the charnockite. At the contacts of gneiss and charnockite, the thickness ranges from 9.0 to 90.8 m, favoring good groundwater potential. The mine lease area is underlain by fractured and sheared hornblende biotite gneiss where groundwater potential is good. Water catchment tanks in this area of 5 km radius are small to moderate in size and are only seasonal. They remain dry during summer seasons. As perennial water resources are remote, the domestic and agricultural activities in this region depend mainly upon the groundwater resources. The mines are located in a gently sloping area, and accumulation of water is not observed except in mine pits even during the monsoon period. Therefore, it is essential to map the groundwater potential zones for proper management of the aquifer system. Satellite imageries were also used to extract lineaments, hydrogeomorphic landforms, drainage patterns, and land use, which are the major controlling factors for the occurrence of groundwater. Various thematic layers pertaining to groundwater existence such as geology, geomorphology, land use/land cover, lineament, lineament density, drainage, drainage density, slope, and soil were generated using GIS tools. By integrating all the above thematic layers based on the ranks and
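
    The integration step described above (combining ranked thematic layers with weights) is essentially a weighted overlay; the sketch below shows a minimal NumPy version in which the layer ranks and weights are placeholders, not the values assigned in the study.

```python
import numpy as np

def groundwater_potential_index(layers, weights):
    """Weighted overlay of ranked thematic rasters (geology, geomorphology,
    lineament density, drainage density, slope, ...): each layer holds
    integer ranks and the index is their normalized weighted sum."""
    layers = np.asarray(layers, dtype=float)        # shape (n_layers, rows, cols)
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()               # normalize the weights
    return np.tensordot(weights, layers, axes=1)    # weighted sum over the layer axis

# Toy example: three 100x100 rank rasters (ranks 1-5) with unequal weights
rng = np.random.default_rng(7)
ranks = rng.integers(1, 6, size=(3, 100, 100))
gwpi = groundwater_potential_index(ranks, weights=[0.5, 0.3, 0.2])
```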

  20. Color and luminance spatial tuning estimated by noise masking in the absence of off-frequency looking

    NASA Astrophysics Data System (ADS)

    Angeles Losada, M.; Mullen, Kathy T.

    1995-02-01

    We assessed the contribution of off-frequency looking to pattern detection and obtained bandwidths for chromatic and luminance mechanisms under conditions free from this effect. We used a simultaneous spatial masking technique with Gaussian-enveloped sinusoidal test stimuli (0.5 cycle/deg) and filtered one-dimensional static-noise masks whose spectral power was uniformly distributed per octave. Stimuli were modulated in the chromatic (isoluminant red-green) or the luminance (yellow-black) domain. Color and luminance detection thresholds were compared for low-pass, high-pass, and notch-filtered (band-stop) noise. We obtained the following results: (1) At high noise power densities, masking by notched noise is greater than the summed masking of the high- and low-pass noise, indicating the presence of off-frequency looking for both color and luminance detection; there is no evidence for off-frequency looking at lower power densities. (2) Using notch-filtered noise, which avoids the problem of off-frequency looking, we found that color processing is subserved by bandpass channels with bandwidths similar to those revealed for luminance processing. (3) Both color and luminance mechanisms appear to have bandwidths proportional to their center frequency (constant in octaves). (4) The lower and upper sides of the color and luminance tuning functions were estimated individually by use of high-pass and low-pass noise of a low power density and are revealed to

  1. Two motion systems with common and separate pathways for color and luminance.

    PubMed Central

    Gorea, A; Papathomas, T V; Kovacs, I

    1993-01-01

    We present psychophysical experiments that reveal two motion systems, a specific and an unspecific one. The specific system prevails at medium to high temporal frequencies. It comprises at least two separate motion pathways that are selective for color and for luminance and that do not interact until after the motion signal is extracted separately in each. By contrast, the unspecific system prevails at low temporal frequencies and combines color and luminance signals at an earlier stage, before motion extraction. The successful implementation of an efficient and accurate technique for assessing equiluminance further corroborates the main findings. These results offer a general framework for understanding the nature of interactions between color and luminance signals in motion perception and suggest that previously proposed dichotomies in motion processing may be encompassed by the specific/unspecific dichotomy proposed here. PMID:8248227

  2. De-escalating and escalating treatment beyond endocrine therapy in patients with luminal breast cancer.

    PubMed

    McCartney, Amelia; Vignoli, Alessia; Hart, Christopher; Tenori, Leonardo; Luchinat, Claudio; Biganzoli, Laura; Di Leo, Angelo

    2017-08-01

    Luminal breast cancers demonstrate significant molecular and clinical heterogeneity, despite sharing expression of the estrogen receptor (ER). To date, no clinical trial has prospectively investigated the optimal chemotherapy regimen according to luminal type, highlighting a paucity of data with which to guide treatment decisions. Current methods of predicting benefit from adjuvant chemotherapy lack refinement and can over-estimate the risk of relapse, inevitably leading to a proportion of patients being unnecessarily exposed to chemotherapy. This paper explores the evidence behind modalities which may add further value to existing clinicopathological and molecular profiling techniques in predicting clinical benefit from chemotherapy. The choice of adjuvant chemotherapy regimen in the context of early luminal breast cancer is discussed, and areas for further research and debate are identified. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. White LEDs with limit luminous efficacy

    SciTech Connect

    Lisitsyn, V. M.; Stepanov, S. A. Yangyang, Ju; Lukash, V. S.

    2016-01-15

    In the most promising and widespread gallium nitride based LEDs, emission is generated in the blue spectral region with a maximum at about 450 nm and is converted to visible light with the desired spectrum by means of a phosphor. The thermal energy released in the conversion is determined by the difference between the energies of the excitation and emission quanta and by the phosphor quantum yield. Heat losses manifest themselves as a decrease in the luminous efficacy. Heating significantly reduces the efficiency and lifetime of an LED. In addition, as the temperature rises, the emission output and the efficiency of the emission conversion decrease. Therefore, reducing the energy losses caused by heating is crucial for LED development. In this paper, heat losses in phosphor-converted LEDs (hereinafter chips) during spectrum conversion are estimated, and the limit values of the luminous efficacy for white LEDs are evaluated.

  4. The Least Luminous Galaxies in the Universe

    NASA Astrophysics Data System (ADS)

    Willman, Beth

    2011-05-01

    In the past six years, more than two dozen dwarf galaxies have been discovered around the Milky Way and M31. Many of these discoveries are 100 times less luminous than any galaxy previously known, and a million times less luminous than the Milky Way itself. These discoveries have made astronomers question the very meaning of the word "galaxy", and hint that such ultra-faint dwarf galaxies may be the most numerous type of galaxy in the universe. This talk will highlight i. how we can see galaxies that are effectively invisible in images of the sky, ii. the brewing controversy over the definition of the term "galaxy", and iii. what ultra-faint galaxies can reveal about the distribution of dark matter in our Universe.

  5. Evolution of luminous IRAS sources - CCD imaging

    NASA Technical Reports Server (NTRS)

    Hutchings, J. B.; Neff, S. G.

    1991-01-01

    The paper presents optical imaging of a sample of 64 luminous IRAS galaxies which cover a wide range of IRAS spectra and luminosity and also include a range of optical types. The objects are almost all in current or recent tidal interaction. The connections between the strength and age of the interaction, the IR spectrum and luminosity, and the optical colors, and other properties of the galaxies are discussed. The principal result is that the powerful IR sources with steep IR spectra are stronger and dynamically younger interacting systems, while the flat IR spectrum objects are older. Star formation, dust obscuration, and the timescales for nuclear activity compared with IR and tidal events are discussed, and a self-consistent evolution scenario connecting the luminous IR sources is described.

  6. Evolution of luminous IRAS sources - Radio imaging

    NASA Technical Reports Server (NTRS)

    Neff, S. G.; Hutchings, J. B.

    1992-01-01

    Observations of the compact radio morphology of 111 luminous IRAS-selected active galaxies covering a wide range of IR and optical properties are presented and discussed. Of these sources, 72 are observed for the first time with the VLA A configuration. The circumnuclear radio sources are generally small and weak, with 15 nondetections to a limit of about 0.4 mJy. Comparison with the IR and optical properties of the objects indicates that the radio sources turn on within 10 exp 8 yr of the tidal encounter that is presumed to generate the IR activity. However, the radio sources do not all appear at the same time. The radio observations are consistent with the evolution scenario for luminous IRAS galaxies suggested by Hutchings and Neff (1991).

  7. A novel contactless technique for thermal field mapping and thermal conductivity determination: two-laser Raman thermometry.

    PubMed

    Reparaz, J S; Chavez-Angel, E; Wagner, M R; Graczykowski, B; Gomis-Bresco, J; Alzina, F; Sotomayor Torres, C M

    2014-03-01

    We present a novel contactless technique for thermal conductivity determination and thermal field mapping based on creating a thermal distribution of phonons with a heating laser, while a second laser probes the local temperature through the spectral position of a Raman active mode. The spatial resolution can be as small as 300 nm, and the temperature accuracy is ±2 K. We validate this technique by investigating the thermal properties of three free-standing single-crystalline Si membranes with thicknesses of 250, 1000, and 2000 nm. We show that for two-dimensional materials such as free-standing membranes or thin films, and for small temperature gradients, the thermal field decays as T(r) ∝ ln(r) in the diffusive limit. The case of large temperature gradients within the membranes leads to an exponential decay of the thermal field, T ∝ exp[-A·ln(r)]. The results demonstrate the full potential of this new contactless method for quantitative determination of thermal properties. The range of materials to which the method is applicable reaches far beyond the case of Si demonstrated here, as the only requirement is the presence of a Raman active mode.
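
    The logarithmic decay quoted above for the small-gradient, diffusive case is what steady-state radial heat conduction in a thin membrane predicts. As a reminder (a standard two-dimensional result, not taken from the paper), for a membrane of thickness d and thermal conductivity κ heated with power P at radius r₀:

```latex
% Steady-state radial conduction in a thin membrane (standard 2D result, shown only
% to motivate the ln(r) decay quoted in the abstract; symbols are generic):
% P = heating power, \kappa = thermal conductivity, d = membrane thickness.
\[
T(r) \;=\; T(r_0) \;-\; \frac{P}{2\pi \kappa d}\,\ln\!\frac{r}{r_0},
\qquad r \ge r_0 .
\]
```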

  8. Orbital masses of nearby luminous galaxies

    SciTech Connect

    Karachentsev, Igor D.; Kudrya, Yuri N. E-mail: yukudrya@gmail.com

    2014-09-01

    We use observational properties of galaxies accumulated in the Updated Nearby Galaxy Catalog to derive the dark matter mass of luminous galaxies via the motions of their companions. Data on the orbital-to-stellar mass ratio are presented for 15 luminous galaxies situated within 11 Mpc from us: the Milky Way, M31, M81, NGC 5128, IC342, NGC 253, NGC 4736, NGC 5236, NGC 6946, M101, NGC 4258, NGC 4594, NGC 3115, NGC 3627, and NGC 3368, as well as for a composite suite around other nearby galaxies of moderate and low luminosity. The typical ratio for these galaxies is M_orb/M_* = 31, corresponding to a mean local density of matter Ω_m = 0.09, i.e., one-third of the global cosmic density. This quantity seems to be rather an upper limit on the dark matter density, since the peripheral population of the suites may suffer from the presence of fictitious unbound members. We note that the Milky Way and M31 halos have lower dimensions and lower stellar masses than those of the other 13 nearby luminous galaxies. However, the dark-to-stellar mass ratio for both the Milky Way and M31 is typical of the other neighboring luminous galaxies. The distortion in the Hubble flow observed around the Local Group and five other neighboring groups yields their total masses within the radius of the zero-velocity surface, R_0; these masses are slightly lower than the orbital and virial values. This difference may be due to the effect of dark energy producing a kind of 'mass defect' within R_0.

  9. Techniques for Revealing 3d Hidden Archeological Features: Morphological Residual Models as Virtual-Polynomial Texture Maps

    NASA Astrophysics Data System (ADS)

    Pires, H.; Martínez Rubio, J.; Elorza Arana, A.

    2015-02-01

    The recent developments in 3D scanning technologies have not been accompanied by comparable developments in visualization interfaces. We are still using the same types of visual codes as when maps and drawings were made by hand, and the information available in 3D scanning data sets is not being fully exploited by current visualization techniques. In this paper we present recent developments regarding the use of 3D scanning data sets for revealing invisible information from archaeological sites. These sites are affected by a common problem: decay processes, such as erosion, that never cease and endanger the persistence of the last vestiges of some peoples and cultures. Rock art engravings and epigraphic inscriptions are among the most affected, because they are, by their very nature, carved on the surface of rocks often exposed to climatic agents. The study and interpretation of these motifs and texts is strongly conditioned by the degree of conservation of the imprints left by our ancestors, and every single detail in the remaining carvings can make a huge difference in the conclusions drawn by specialists. We have selected two case studies severely affected by erosion to present the results of ongoing work dedicated to exploring the information contained in 3D scanning data sets in new ways. A new method for depicting subtle morphological features on the surface of objects or sites has been developed. It makes it possible to bring out human-made patterns still present on the surface but invisible to the naked eye or to any other archaeological inspection technique. It is called the Morphological Residual Model (MRM) because of its ability to contrast the shallowest morphological details, to which we refer as residuals, contained in the wider forms of the backdrop. We have then simulated the process of building Polynomial Texture Maps, a widespread technique that has been contributing to archaeological studies for some years, in a 3D virtual environment using the results of MRM

  10. Numerical model calibration with the use of an observed sediment mobility mapping technique.

    NASA Astrophysics Data System (ADS)

    Javernick, Luke; Redolfi, Marco; Bertoldi, Walter

    2017-04-01

    2 mm) and ii) a novel time-lapse imagery technique used to identify areas of incipient motion. Using the numerical model Delft3D Flow, the experiments were simulated, and observed incipient motion and modeled shear stress were compared to evaluate the model's ability to predict sediment transport. Comparison of the observed and modeled results identified a motion threshold and allowed the model's performance to be evaluated. To quantify model performance, the ratios of correctly predicted area to total area were calculated, yielding 75% inundation accuracy and 71% incipient-motion accuracy. The inundation accuracy is comparable to reported field studies of braided rivers with highly accurate topographic acquisition. Nevertheless, 75% inundation accuracy is less than ideal and likely suffers from the complicated topography, the shallow water depth (average 1 cm), and the corresponding model inaccuracies that could derive from even subtle 2 mm elevation errors. As shear stress calculations depend upon inundation and depth, the sediment transport accuracies likely suffer from the same issues. Regardless, the sediment transport accuracies are very comparable to the inundation accuracies, which is an encouraging result. Marie Sklodowska-Curie Individual Fellowship: River-HMV, 656917
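
    The performance metric described above, correctly predicted area divided by total area, reduces to a per-cell agreement ratio between observed and modeled binary maps. A minimal sketch, with invented arrays standing in for the observed incipient-motion map and the modeled shear-stress field, and an assumed threshold value:

```python
import numpy as np

def area_accuracy(observed: np.ndarray, predicted: np.ndarray) -> float:
    """Fraction of cells where a binary observed map and a binary predicted map agree."""
    observed = observed.astype(bool)
    predicted = predicted.astype(bool)
    return float(np.mean(observed == predicted))

# Toy example: observed incipient motion vs. modeled shear stress above a threshold.
rng = np.random.default_rng(1)
observed_motion = rng.random((50, 50)) > 0.7   # invented observation mask
shear_stress = rng.random((50, 50))            # invented model output
tau_crit = 0.7                                 # assumed motion threshold
predicted_motion = shear_stress > tau_crit

print(f"incipient-motion accuracy: {area_accuracy(observed_motion, predicted_motion):.2%}")
```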

  11. Sub-luminous γ-ray pulsars

    DOE PAGES

    Romani, R. W.; Kerr, M.; Craig, H. A.; ...

    2011-08-17

    Here, most pulsars observed by the Fermi Large Area Telescope have γ-ray luminosities scaling with spin-down power Ė as L_γ ≈ (Ė × 10^33 erg s^-1)^(1/2). However, there exist one detection and several upper limits an order of magnitude or more fainter than this trend. We describe these "sub-luminous" γ-ray pulsars and discuss the case for this being an orientation effect. Of the 12 known young radio pulsars with Ė > 10^34 erg s^-1 and d ≤ 2 kpc, several are substantially sub-luminous. The limited available geometrical constraints favor aligned geometries for these pulsars, although no one case for alignment is compelling. In this scenario, GeV emission detected from such sub-luminous pulsars can be due to a lower-altitude, lower-power accelerator gap.

  12. Pulsations and outbursts of luminous blue variables

    SciTech Connect

    Cox, A.N.; Guzik, J.A.; Soukup, M.S.; Despain, K.M.

    1997-06-01

    We propose an outburst mechanism for the most luminous stars in our and other galaxies. These million-solar-luminosity stars, with masses (after earlier mass loss) of between 20 and maybe 70 solar masses, are pulsationally unstable for both radial and low-degree nonradial modes. Some of these modes are "strange," meaning mostly that the pulsations are concentrated near the stellar surface and have very rapid growth rates in linear theory. The pulsation driving is by both the high iron line opacity (near 150,000 K) and the helium opacity (near 30,000 K) kappa effects. Periods range from 5 to 40 days. Depending on the composition, pulsations periodically produce luminosities above the Eddington limit for deep layers. The radiative luminosity creates an outward push that readily eases the very low gamma envelope to very large outburst radii. A key point is that a super-Eddington luminosity cannot be taken up by the sluggish convection rapidly enough to prevent an outward acceleration of much of the envelope. As the helium abundance in the envelope stellar material increases by ordinary wind mass loss and the luminous blue variable outbursts, the opacity in the deep pulsation driving layers decreases. This makes the current Eddington luminosity even higher, so that pulsations can then no longer give radiative luminosities exceeding the limit. For the lower mass and luminosity luminous blue variables there is considerably less iron line opacity driving, and pulsations are almost all caused by the helium ionization kappa effect.

  13. Luminal breast cancer: from biology to treatment.

    PubMed

    Ignatiadis, Michail; Sotiriou, Christos

    2013-09-01

    Oestrogen receptor (ER)-positive, or luminal, tumours represent around two-thirds of all breast cancers. Luminal breast cancer is a highly heterogeneous disease comprising different histologies, gene-expression profiles and mutational patterns, with very varied clinical courses and responses to systemic treatment. Despite adjuvant endocrine therapy and chemotherapy for patients at high risk of relapse, both early and late relapses still occur, a fact that highlights the unmet medical needs of these patients. Ongoing research aims to identify those patients who can be spared adjuvant chemotherapy and those who will benefit from extended adjuvant hormone therapy. This research also aims to explore the role of adjuvant bisphosphonates, to interrogate new agents for targeting minimal residual disease, and to address endocrine resistance. Data from next-generation sequencing studies have given us new insight into the biology of luminal breast cancer and, together with advances in preclinical models and the availability of newer targeted agents, have led to the testing of rationally chosen combination treatments in clinical trials. However, a major challenge will be to make sense of the large amount of patient genomic data that is becoming increasingly available; this analysis will be critical to understanding how intertumour and intratumour heterogeneity can influence treatment response and resistance.

  14. Relation between acid back-diffusion and luminal surface hydrophobicity in canine gastric mucosa: Effects of salicylate and prostaglandin

    SciTech Connect

    Goddard, P.J.

    1989-01-01

    The stomach is thought to be protected from luminal acid by a gastric mucosal barrier that restricts the diffusion of acid into tissue. This study tested the hypothesis that the hydrophobic luminal surface of canine gastric mucosa, incubated in Ussing chambers, impedes the back-diffusion of luminal acid into the tissue. Isolated sheets of mucosa were treated with cimetidine to inhibit spontaneous acid secretion and incubated under conditions that prevented significant secretion of luminal bicarbonate. By measuring acid loss from the luminal compartment using the pH-stat technique, acid back-diffusion was continuously monitored; potential difference (PD) was measured as an index of tissue viability. Tissue luminal surface hydrophobicity was estimated by contact angle analysis at the end of each experiment. Addition of 16,16-dimethyl prostaglandin E2 to the nutrient compartment enhanced luminal surface hydrophobicity but did not reduce acid back-diffusion in tissues that maintained a constant PD. 10 mM salicylate at pH 4.00 in the luminal compartment reduced surface hydrophobicity, but this decrease did not occur if 1 ug/ml prostaglandin was present in the nutrient solution. Despite possessing relatively hydrophilic and relatively hydrophobic surface properties, respectively, acid back-diffusion in the absence of salicylate was not significantly different between these two groups. Neither group maintained a PD after incubation with salicylate. Lastly, radiolabeled salicylate was used to calculate the free (non-salicylate-associated) acid loss in tissues incubated with salicylate and/or prostaglandin. No significant correlation was found between free acid back-diffusion and luminal surface hydrophobicity. These data do not support the hypothesis that acid back-diffusion is impeded by the hydrophobic surface presented by isolated canine gastric mucosa.

  15. Mapping forested wetlands in the Great Zhan River Basin through integrating optical, radar, and topographical data classification techniques.

    PubMed

    Na, X D; Zang, S Y; Wu, C S; Li, W L

    2015-11-01

    Knowledge of the spatial extent of forested wetlands is essential to many studies, including wetland functioning assessment, greenhouse gas flux estimation, and identification of suitable wildlife habitat. For discriminating forested wetlands from adjacent land cover types, researchers have resorted to image analysis techniques applied to numerous remotely sensed data. While these have had some success, there is still no consensus on the optimal approach for mapping forested wetlands. To address this problem, we examined two machine learning approaches, random forest (RF) and K-nearest neighbor (KNN) algorithms, and applied them within both pixel-based and object-based classification frameworks. The RF and KNN algorithms were constructed using predictors derived from Landsat 8 imagery, Radarsat-2 advanced synthetic aperture radar (SAR), and topographical indices. The results show that the object-based classifications performed better than per-pixel classifications using the same algorithm (RF) in terms of overall accuracy, and the difference in their kappa coefficients is statistically significant (p<0.01). There were noticeable omissions of forested and herbaceous wetlands in the per-pixel classifications using the RF algorithm. For the object-based image analysis, there was also a statistically significant difference (p<0.01) in kappa coefficient between results based on the RF and KNN algorithms. The object-based classification using RF provided a more visually adequate distribution of the land cover types of interest, while the object-based classification using KNN showed noticeable commissions of forested wetlands and omissions of agricultural land. This research proves that object-based classification with RF using optical, radar, and topographical data improves the mapping accuracy of land covers and provides a feasible approach to discriminating forested wetlands from other land cover types in forested areas.
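
    As a rough illustration of the two classifiers compared above (a sketch, not the authors' processing chain), scikit-learn provides both algorithms. Here each row of an invented feature table stands for a pixel or image object described by optical, SAR and topographic predictors, and agreement is summarized with overall accuracy and the kappa coefficient:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, cohen_kappa_score

# Invented feature table: each row is a pixel (or image object) with predictors
# derived from optical bands, SAR backscatter and topographic indices.
rng = np.random.default_rng(42)
X = rng.normal(size=(2000, 12))
y = rng.integers(0, 4, size=2000)  # e.g. forested wetland, herbaceous wetland, forest, agriculture

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

for name, clf in [("RF", RandomForestClassifier(n_estimators=200, random_state=0)),
                  ("KNN", KNeighborsClassifier(n_neighbors=5))]:
    clf.fit(X_train, y_train)
    pred = clf.predict(X_test)
    print(name,
          "overall accuracy:", round(accuracy_score(y_test, pred), 3),
          "kappa:", round(cohen_kappa_score(y_test, pred), 3))
```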

  16. Lineament Mapping using Remote Sensing Techniques and Structural Geology for Carbon Dioxide Sequestration Site Characterization in Central New York State

    NASA Astrophysics Data System (ADS)

    Zelazny, Melissa Maria

    2011-12-01

    This study identified lineaments from satellite images and Digital Elevation Models (DEMs) using image processing techniques that enhance variations in spectral and spatial reflectivity and topography. In flat-lying sedimentary sequences, lineaments are commonly surface expressions of tectonic fractures and faults in the bedrock, emphasized on the surface by topography, drainage, and vegetation, and many can be identified from remotely sensed data. Knowledge of fault locations can prevent unsuitable site selection for CO2 sequestration where CO2 could migrate up fault systems. Lineament patterns also give insight into the fracture fabric of the region, an important consideration for CO2 sequestration. Various data sets, including multispectral satellite imagery (Landsat and ASTER) and DEMs, as well as geological data describing fractures, faults, and hydrology, were used to map and validate the lineament distribution in the study area. Linear features were enhanced with tonal, topographic and textural changes by digital image processing of the satellite imagery and DEMs. Lineaments were then extracted manually using ArcMap (ArcGIS 9.2, ESRI). Lineaments longer than 1 km were identified, digitized and stored in a geodatabase together with attributes describing their length, orientation and other characteristics. Lineament categories included vegetation, drainage, and topography. Rose diagrams and statistics of the length and number of lineaments in each 10° orientation bin were used to characterize the lineament distribution in each remotely sensed data set. The primary lineament orientations from both ASTER imagery and topography trended northeast and northwest in the study area of central New York State. These trends agreed with some of the EarthSat (1997) lineament sets from Landsat images and also corresponded to some published fracture and fault systems, but they do not reflect the most abundant sets.
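
    The rose-diagram statistics mentioned above amount to binning lineament azimuths, and summing lengths, within fixed orientation bins. A small sketch with invented azimuth/length pairs and an assumed 10° bin width:

```python
import numpy as np

# Invented lineament attributes: azimuth in degrees (0-180, since a lineament and
# its reciprocal direction are equivalent) and length in km.
rng = np.random.default_rng(7)
azimuth_deg = rng.uniform(0.0, 180.0, size=300)
length_km = rng.uniform(1.0, 12.0, size=300)

bin_width = 10.0                                   # assumed bin width
edges = np.arange(0.0, 180.0 + bin_width, bin_width)

counts, _ = np.histogram(azimuth_deg, bins=edges)
total_length, _ = np.histogram(azimuth_deg, bins=edges, weights=length_km)

for lo, n, l in zip(edges[:-1], counts, total_length):
    print(f"{lo:5.0f}-{lo + bin_width:3.0f} deg: {n:3d} lineaments, {l:6.1f} km")
```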

  17. BIA interpretation techniques for vegetation mapping using thematic mapper false color composites (interim report for San Carlos Reservation)

    USGS Publications Warehouse

    Bonner, W.J.; English, T.C.; Haas, R.H.; Feagan, T.R.; McKinley, R.A.

    1987-01-01

    The Bureau of Indian Affairs (BIA) is responsible for the natural resource management of approximately 52 million acres of Trust lands in the contiguous United States. The lands are distributed in a "patchwork" fashion throughout the country. Management responsibilities on these areas include minerals, range, timber, fish and wildlife, agricultural, cultural, and archaeological resources. In an age of decreasing natural resources and increasing natural resource values, effective multiple resource management is critical. BIA has adopted a "systems approach" to natural resource management which utilizes Geographic Information System (GIS) technology. The GIS encompasses a continuum of spatial and relational data elements and includes functional capabilities such as data collection, data entry, database development, data analysis, database management, display, and report generation. In support of database development activities, BIA and BLM/TGS conducted a cooperative effort to investigate the potential of 1:100,000-scale Thematic Mapper (TM) False Color Composites (FCCs) for providing vegetation information suitable for input to the GIS and for later incorporation into a generalized Bureau-wide land cover map. Land cover information is critical, as the majority of reservations currently have no land cover information in either map or digital form. This poster outlines an approach which includes the manual interpretation of land cover using TM FCCs, the digitizing of interpreted polygons, and the editing of digital data based upon ground-truthing exercises. An efficient and cost-effective methodology for generating large-area land cover information is illustrated for the Mineral Strip area on the San Carlos Indian Reservation in Arizona. Techniques which capitalize on the knowledge of local natural resources professionals, while minimizing machine processing requirements, are suggested.

  18. Glacier Facies Mapping and Movement Estimation using Remote Sensing Techniques: A Case Study at Samudra Tapu Glacier

    NASA Astrophysics Data System (ADS)

    Sood, S.; Thakur, P. K.

    2016-12-01

    Glaciers are directly affected by the recent trends of global warming. Himalayan glaciers are located near the Tropic of Cancer, a belt that receives more heat, which makes them particularly sensitive to climate change. Because of the highly rugged terrain and the inaccessibility of certain areas, satellite-derived information can be used to monitor glaciers. The Samudra Tapu glacier, used in this study, is located in the Great Himalayan range of the north-west Himalaya. Distinct glacier facies are visible using multi-temporal SAR datasets representing different seasons. Fully polarimetric SAR data were used to identify the different glacier facies: percolation facies, ice walls, ice facies, refrozen snow and supraglacial debris. Object-oriented classification was used to map the various glacier facies. Using the classified maps, the altitudes of the snow line and firn line were detected. More than 50% of the total glacier area was found to be accumulation region. An Interferometric Synthetic Aperture Radar (InSAR) technique was used for glacier surface velocity estimation using European Remote Sensing Satellite (ERS-1/2) tandem data. A high value of coherence was obtained from the SAR return signal for the one-day temporal difference. A mean velocity of 24 cm/day was estimated for the month of May, with the highest flow rates seen in the high-accumulation area of the northern branch. Spatial analysis of the velocity patterns with respect to slope and aspect shows that high rates of flow were found on southern slopes and that movement rates generally increase with increasing slope. A feature tracking approach was used to estimate the glacier flow on long-term and seasonal bases using SAR and optical datasets. The results clearly suggest that glacier flow varies with season and that the rate of ice flow has changed over the years. Mapping the extent of accumulation and ablation areas and also the rate at which the ice flows in these regions as these are important factors directly related to

  19. Lithostratigraphic interpretation from joint analysis of seismic tomography and magnetotelluric resistivity models using self-organizing map techniques

    NASA Astrophysics Data System (ADS)

    Bauer, K.; Muñoz, G.; Moeck, I.

    2012-12-01

    The combined interpretation of different models as derived from seismic tomography and magnetotelluric (MT) inversion represents a more efficient approach to determine the lithology of the subsurface compared with the separate treatment of each discipline. Such models can be developed independently or by application of joint inversion strategies. After the step of model generation using different geophysical methodologies, a joint interpretation work flow includes the following steps: (1) adjustment of a joint earth model based on the adapted, identical model geometry for the different methods, (2) classification of the model components (e.g. model blocks described by a set of geophysical parameters), and (3) re-mapping of the classified rock types to visualise their distribution within the earth model, and petrophysical characterization and interpretation. One possible approach for the classification of multi-parameter models is based on statistical pattern recognition, where different models are combined and translated into probability density functions. Classes of rock types are identified in these methods as isolated clusters with high probability density function values. Such techniques are well-established for the analysis of two-parameter models. Alternatively we apply self-organizing map (SOM) techniques, which have no limitations in the number of parameters to be analysed in the joint interpretation. Our SOM work flow includes (1) generation of a joint earth model described by so-called data vectors, (2) unsupervised learning or training, (3) analysis of the feature map by adopting image processing techniques, and (4) application of the knowledge to derive a lithological model which is based on the different geophysical parameters. We show the usage of the SOM work flow for a synthetic and a real data case study. Both tests rely on three geophysical properties: P velocity and vertical velocity gradient from seismic tomography, and electrical resistivity
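
    To make the SOM work flow above concrete, the sketch below trains a tiny self-organizing map from scratch on synthetic three-parameter data vectors (P velocity, vertical velocity gradient, log resistivity). It is a bare-bones illustration of unsupervised SOM training with invented parameters, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic joint "data vectors": [P velocity (km/s), velocity gradient (1/s), log10 resistivity].
data = np.vstack([
    rng.normal([4.5, 0.5, 1.0], 0.2, size=(200, 3)),   # invented sediment-like cluster
    rng.normal([6.0, 0.1, 3.0], 0.2, size=(200, 3)),   # invented basement-like cluster
])
data = (data - data.mean(axis=0)) / data.std(axis=0)    # standardize each parameter

# Initialize a small 2D map of weight vectors.
rows, cols, dim = 6, 6, data.shape[1]
weights = rng.normal(size=(rows, cols, dim))
grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"), axis=-1)

n_iter, lr0, sigma0 = 2000, 0.5, 2.0
for t in range(n_iter):
    x = data[rng.integers(len(data))]
    # Best-matching unit (BMU): the node whose weight vector is closest to the sample.
    dists = np.linalg.norm(weights - x, axis=-1)
    bmu = np.unravel_index(np.argmin(dists), dists.shape)
    # Learning rate and neighborhood radius decay over the iterations.
    frac = t / n_iter
    lr = lr0 * (1.0 - frac)
    sigma = sigma0 * np.exp(-3.0 * frac)
    # A Gaussian neighborhood around the BMU pulls nearby nodes toward the sample.
    grid_dist2 = np.sum((grid - np.array(bmu)) ** 2, axis=-1)
    h = np.exp(-grid_dist2 / (2.0 * sigma ** 2))[..., None]
    weights += lr * h * (x - weights)

# Map each data vector to its BMU; clusters of BMUs play the role of rock classes.
bmu_index = [np.unravel_index(np.argmin(np.linalg.norm(weights - v, axis=-1)), (rows, cols))
             for v in data]
print("occupied SOM nodes:", len({tuple(int(k) for k in ij) for ij in bmu_index}))
```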

  20. Light Echoes from Luminous Transients - Status of the Field and Future Prospects

    NASA Astrophysics Data System (ADS)

    Welch, Douglas L.; Rest, Armin

    2014-06-01

    Scattered-light echoes from luminous transients (supernovae, luminous blue variables/supernova impostors) are unique in their ability to preserve an observable record of the state of a transient in outburst for centuries. Unlike conventional (i.e. intentional) time-series records, the presence of interstellar dust on the light echo ellipsoid from a given outburst allows photometric and spectroscopic study at any epoch where the scattered-light signal reaches a useful signal-to-noise threshold. Additionally, dust located at different positions on an outburst's light-echo ellipsoid will preserve a record of the hemispheric emission of light from the outburst as seen from each dust location, allowing the degree of asymmetry of an outburst to be studied. A more recent application of scattered-light echoes employs the fact that multiple outbursts of high-amplitude, unstable, luminous variables each produce their own light-echo ellipsoids, and dust located on such ellipsoids provides an opportunity for multiple outbursts to be studied at the same (modern) epoch. This talk will provide a brief summary of the evolution of light echoes from use as a tool to map out the interstellar medium around luminous events to their richer and more recent use of investigating the outbursts themselves. Important findings from observations of scattered-light echoes regarding SN1987A, Cas A, and eta Car will be highlighted, and near-term prospects for new work and new observational opportunities will be presented.
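
    For orientation, the light-echo ellipsoid invoked above is usually written in its single-scattering form; the relation below is the generic geometric identity, not a result specific to this talk. Here ρ is the projected distance of the scattering dust from the source, z its line-of-sight distance in front of the source plane, and t the time elapsed since the outburst light reached the observer.

```latex
% Standard single-scattering light-echo geometry (generic relation, included only
% for orientation): rho = projected distance of the dust from the source,
% z = line-of-sight distance of the dust in front of the source plane,
% t = time since the outburst light reached the observer, c = speed of light.
\[
\rho^{2} \;=\; 2\,z\,c\,t \;+\; (c\,t)^{2}
\qquad\Longleftrightarrow\qquad
z \;=\; \frac{\rho^{2}}{2\,c\,t} \;-\; \frac{c\,t}{2}.
\]
```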

  1. The Formation of Primordial Luminous Objects

    SciTech Connect

    Ripamonti, Emanuele; Abel, Tom; /KIPAC, Menlo Park

    2005-08-04

    structure formation. In these notes we will leave the discussion of feedback to the lecture notes by Ferrara & Salvaterra and by Madau & Haardt in this same book and focus only on the formation of the first objects. The advent of cosmological numerical hydrodynamics in particular allows a fresh new look at these questions. Hence, these notes will touch on aspects ranging from theoretical cosmology to chemistry, computer science, hydrodynamics and atomic physics. For further reading and more references on the subject we refer the reader to other relevant reviews such as Barkana & Loeb 2001, and more recently Ciardi & Ferrara 2004, Glover 2004 and Bromm & Larson 2004. In these notes, we try to give a brief introduction to only the most relevant aspects. We will start with a brief overview of the relevant cosmological concepts in section 2, followed by a discussion of the properties of primordial material (with particular emphasis on its cooling and its chemistry) in section 3. We will then review the techniques and results of numerical simulations in sections 4 and 5: the former will deal with detailed 3D simulations of the formation of gaseous clouds which are likely to transform into luminous objects, while the latter will examine results (mostly from 1D codes) about the modalities of such transformation. Finally, in section 6 we will critically discuss the results of the previous sections, examining their consequences and comparing them to our present knowledge of the universe.

  2. A bias-correction and downscaling technique for operational extended range forecasts based on self organizing map

    NASA Astrophysics Data System (ADS)

    Sahai, A. K.; Borah, N.; Chattopadhyay, R.; Joseph, S.; Abhilash, S.

    2017-04-01

    If a coarse-resolution dynamical model captures the large-scale patterns well, even if it is biased at smaller scales, the spatial information in smaller domains may still be retrievable. Based on this hypothesis, a method has been proposed to downscale dynamical model forecasts of monsoon intraseasonal oscillations in the extended range and thus reduce the forecast spatial biases at smaller spatial scales. A hybrid of clustering and analog techniques, used in a self-organizing map (SOM)-based algorithm, is applied to correct the bias in the model-predicted rainfall. The novelty of this method is that the bias correction and downscaling can be done at any resolution at which observation/reanalysis data are available, independent of the model resolution at which the forecast is generated. A set of composite rainfall patterns is identified by clustering the high-resolution observed rainfall using SOM. The composite patterns for the clustered days at each cluster center or node are saved, and the model forecast for any day is compared with these patterns. The closest historical pattern is identified by calculating the minimum Euclidean distance between the model rainfall forecast and the observed clustered patterns and is termed the bias-corrected SOM-based post-processed forecast. The bias-corrected and SOM-based reconstructed forecasts are shown to improve the annual cycle and the skill of deterministic as well as probabilistic forecasts. Use of the high-resolution observational data improves the spatial pattern over smaller domains, as seen in a case study of the Mahanadi basin flood during September 2011. Thus, downscaling and bias correction are both achieved by this technique.
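
    The matching step described above, assigning a model forecast to the closest observed SOM composite by minimum Euclidean distance, is a one-line lookup once the cluster composites are available. A minimal sketch with invented arrays in place of the SOM composites and the regridded forecast:

```python
import numpy as np

rng = np.random.default_rng(11)

# Invented SOM cluster composites: each row is a high-resolution observed rainfall
# pattern (flattened grid) associated with one SOM node.
n_nodes, n_gridpoints = 20, 500
composites = rng.gamma(shape=2.0, scale=3.0, size=(n_nodes, n_gridpoints))

# Invented model forecast for one day, regridded to the same high-resolution grid.
forecast = rng.gamma(shape=2.0, scale=3.0, size=n_gridpoints)

# Bias-corrected, downscaled forecast = composite of the node with minimum
# Euclidean distance to the raw forecast.
distances = np.linalg.norm(composites - forecast, axis=1)
best_node = int(np.argmin(distances))
corrected_forecast = composites[best_node]

print(f"closest SOM node: {best_node}, distance: {distances[best_node]:.2f}")
```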

  3. Optimal Electromagnetic (EM) Geophysical Techniques to Map the Concentration of Subsurface Ice and Adsorbed Water on Mars and the Moon

    NASA Astrophysics Data System (ADS)

    Stillman, D. E.; Grimm, R. E.

    2013-12-01

    Water ice is ubiquitous in our Solar System and is a probable target for planetary exploration. Mapping the lateral and vertical concentration of subsurface ice from or near the surface could determine the origin of lunar and martian ice and quantify a much-needed resource for human exploration. Determining subsurface ice concentration on Earth is not trivial and has been attempted previously with electrical resistivity tomography (ERT), ground penetrating radar (GPR), airborne EM (AEM), and nuclear magnetic resonance (NMR). These EM geophysical techniques do not actually detect ice, but rather the absence of unfrozen water, which causes a non-unique interpretation of frozen versus dry subsurface sediments. This works well in the Arctic because most locations are not dry. However, for planetary exploration, liquid water is exceedingly rare and subsurface mapping must discriminate between an ice-rich and a dry subsurface. Luckily, nature has provided a unique electrical signature of ice: its dielectric relaxation. The dielectric relaxation of ice creates a temperature and frequency dependence of the electrical properties and varies the relative dielectric permittivity from ~3.1 at radar frequencies to >100 at low frequencies. On Mars, sediments smaller than silt size can hold enough adsorbed unfrozen water to complicate the measurement, because the presence of adsorbed water also creates frequency-dependent electrical properties. The dielectric relaxations of adsorbed water and ice can be separated, as they have different shapes and frequency ranges, as long as a spectrum spanning the two relaxations is measured. The volume concentration of ice and adsorbed water is a function of the strength of their relaxations. Therefore, we suggest that capacitively-coupled dielectric spectroscopy (a.k.a. spectral induced polarization or complex resistivity) can detect the concentration of both ice and adsorbed water in the subsurface. To prove this concept we have collected
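
    Separating the ice and adsorbed-water relaxations as described above is typically done by fitting each with a relaxation model; the Cole-Cole form is one common parameterization in spectral induced polarization work. The sketch below only evaluates the superposition of two such relaxations with invented parameters; it is not the authors' measurement or inversion procedure.

```python
import numpy as np

def cole_cole(freq_hz, eps_inf, delta_eps, tau_s, alpha):
    """Complex relative permittivity of a single Cole-Cole relaxation."""
    omega = 2.0 * np.pi * freq_hz
    return eps_inf + delta_eps / (1.0 + (1j * omega * tau_s) ** (1.0 - alpha))

freq = np.logspace(-2, 6, 200)  # 0.01 Hz to 1 MHz

# Invented parameters: a strong "ice-like" relaxation plus a weaker, broader
# "adsorbed-water-like" relaxation (values chosen only for illustration).
ice = cole_cole(freq, eps_inf=3.1, delta_eps=95.0, tau_s=1e-4, alpha=0.05)
adsorbed = cole_cole(freq, eps_inf=0.0, delta_eps=15.0, tau_s=1e-1, alpha=0.4)
total = ice + adsorbed

# The relative strengths of the two relaxations stand in for the relative
# amounts of ice and adsorbed water in this toy picture.
for f, e in zip(freq[::50], total[::50]):
    print(f"{f:10.3g} Hz  eps' = {e.real:7.2f}  eps'' = {-e.imag:7.2f}")
```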

  4. Combination of techniques for mapping structural and functional connectivity of soil erosion processes: a case study in a small watershed

    NASA Astrophysics Data System (ADS)

    Seeger, Manuel; Taguas, Encarnación; Brings, Christine; Wirtz, Stefan; Rodrigo Comino, Jesus; Albert, Enrique; Ries, Johabbes B.

    2016-04-01

    experiment). The measurements were conducted at representative areas identified in advance by prior mapping. Preliminary results show that the rills are highly effective in producing sediments, but also in rapidly connecting the different sources with the catchment's outlet. However, they also act as a disconnecting feature with respect to the areas of observation, as they may route the runoff (and the transported sediments) outside the catchment. On the other hand, the experiments showed that the evidently degraded areas produce only very delayed runoff, and thus also sediments, whereas the areas with stable deep soils show evidence of fast runoff and erosive responses. The preliminary results of the combination of mapping and experimental techniques demonstrate the different levels at which functional and structural connectivity have to be evaluated. The latter, as a geomorphological feature, may be the result of former process distributions, whereas the directly observable (functional) connectivity may shift in time due to internal feedbacks, such as those resulting from soil degradation.

  5. Luminal B breast cancer subtype displays a dicotomic epigenetic pattern.

    PubMed

    Bediaga, Naiara G; Beristain, Elena; Calvo, Borja; Viguri, María A; Gutierrez-Corres, Borja; Rezola, Ricardo; Ruiz-Diaz, Irune; Guerra, Isabel; de Pancorbo, Marian M

    2016-01-01

    Luminal B breast tumors have aggressive clinical and biological features and constitute the most heterogeneous molecular subtype, both clinically and molecularly. Unfortunately, the immunohistochemistry correlate of the luminal B subtype remains imprecise, and it has become of paramount importance to define a classification scheme capable of segregating luminal tumors into clinically meaningful subgroups that may be used to guide patient management. With the aim of unraveling the DNA methylation profiles of the luminal subtypes currently most used in the clinical setting, we quantified the DNA methylation level of 27,578 CpG sites in 17 luminal B (ER+, Ki67 ≥ 20% or PgR < 20%, and HER2-), 8 luminal A (ER+ and Ki67 < 20%) and 4 luminal B-HER2+ (ER+ and HER2+) breast cancer samples using the Illumina Infinium methylation microarray approach. Unsupervised hierarchical clustering revealed that DNA methylation stratifies luminal B samples into two categories with differing epigenetic and clinical features. One subgroup of luminal B samples showed a methylator phenotype and clustered with the luminal B-HER2+ tumors, while the other showed fewer methylation events and clustered with the luminal A samples. A 3-CpG marker panel capable of discriminating methylator versus non-methylator luminal B samples was identified and further validated in an independent cohort of patients. Our results provide evidence that DNA methylation and, more specifically, a panel of 3 CpG markers enables the stratification of luminal B samples into two categories with differing epigenetic and clinical features, and they support the use of this panel for therapeutic stratification of patients with luminal breast cancer.
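
    The unsupervised stratification described above can be reproduced in outline with standard hierarchical clustering on methylation beta values. The sketch below uses an invented beta-value matrix and a generic three-marker layout; it is not the published panel or cohort.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(5)

# Invented methylation beta values (0-1) for 29 tumours x 3 CpG markers:
# one subgroup simulated as "methylator" (high betas), the other as "non-methylator".
methylator = rng.beta(8, 2, size=(13, 3))
non_methylator = rng.beta(2, 8, size=(16, 3))
betas = np.vstack([methylator, non_methylator])

# Unsupervised hierarchical clustering (Ward linkage on Euclidean distances),
# cut into two groups.
Z = linkage(betas, method="ward")
groups = fcluster(Z, t=2, criterion="maxclust")

print("samples per cluster:", np.bincount(groups)[1:])
print("mean beta per cluster:",
      [float(betas[groups == g].mean().round(2)) for g in np.unique(groups)])
```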

  6. Evaluation of SLAR and simulated thematic mapper MSS data for forest cover mapping using computer-aided analysis techniques

    NASA Technical Reports Server (NTRS)

    Hoffer, R. M.; Dean, M. E.; Knowlton, D. J.; Latty, R. S.

    1982-01-01

    Kershaw County, South Carolina was selected as the study site for analyzing simulated thematic mapper MSS data and dual-polarized X-band synthetic aperture radar (SAR) data. The impact of the improved spatial and spectral characteristics of the LANDSAT D thematic mapper data on computer-aided analysis for forest cover type mapping was examined, as was the value of synthetic aperture radar data for differentiating forest and other cover types. The utility of pattern recognition techniques for analyzing SAR data was assessed. Topics covered include: (1) collection of TMS and reference data; (2) reformatting, geometric and radiometric rectification, and spatial resolution degradation of TMS data; (3) development of training statistics and test data sets; (4) evaluation of different numbers and combinations of wavelength bands on classification performance; (5) comparison among three classification algorithms; and (6) the effectiveness of the principal component transformation in data analysis. The collection, digitization, reformatting, and geometric adjustment of SAR data are also discussed. Image interpretation results and classification results are presented.

  7. Dynamic pH mapping in microfluidic devices by integrating adaptive coatings based on polyaniline with colorimetric imaging techniques.

    PubMed

    Florea, Larisa; Fay, Cormac; Lahiff, Emer; Phelan, Thomas; O'Connor, Noel E; Corcoran, Brian; Diamond, Dermot; Benito-Lopez, Fernando

    2013-03-21

    In this paper we present a microfluidic device that has integrated pH optical sensing capabilities based on polyaniline. The optical properties of polyaniline coatings change in response to the pH of the solution that is flushed inside the microchannel offering the possibility of monitoring pH in continuous flow over a wide pH range throughout the entire channel length. This work also features an innovative detection system for spatial localisation of chemical pH gradients along microfluidic channels through the use of a low cost optical device. Specifically, the use of a microfluidic channel coated with polyaniline is shown to respond colorimetrically to pH and that effect is detected by the detection system, even when pH gradients are induced within the channel. This study explores the capability of detecting this gradient by means of imaging techniques and the mapping of the camera's response to its corresponding pH after a successful calibration process. The provision of an inherently responsive channel means that changes in the pH of a sample moving through the system can be detected dynamically using digital imaging along the entire channel length in real time, without the need to add reagents to the sample. This approach is generic and can be applied to other chemically responsive coatings immobilised on microchannels.
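
    Mapping the camera's colorimetric response to pH after calibration, as described above, amounts to building a monotonic calibration curve and inverting it for new readings. A minimal sketch, using hue as an assumed response channel and invented calibration values:

```python
import numpy as np

# Invented calibration data: mean hue of the polyaniline-coated channel recorded
# while buffers of known pH are flushed through (values are illustrative only).
calib_ph = np.array([2.0, 4.0, 6.0, 8.0, 10.0, 12.0])
calib_hue = np.array([35.0, 60.0, 95.0, 140.0, 180.0, 210.0])  # assumed monotonic response

def hue_to_ph(hue):
    """Invert the calibration curve by piecewise-linear interpolation."""
    return np.interp(hue, calib_hue, calib_ph)

# New readings along the channel length (e.g. one mean hue per channel segment).
measured_hue = np.array([50.0, 90.0, 150.0, 205.0])
print("estimated pH along channel:", np.round(hue_to_ph(measured_hue), 2))
```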

  8. Semi-automatic 10/20 Identification Method for MRI-Free Probe Placement in Transcranial Brain Mapping Techniques.

    PubMed

    Xiao, Xiang; Zhu, Hao; Liu, Wei-Jie; Yu, Xiao-Ting; Duan, Lian; Li, Zheng; Zhu, Chao-Zhe

    2017-01-01

    The International 10/20 system is an important head-surface-based positioning system for transcranial brain mapping techniques, e.g., fNIRS and TMS. As guidance for probe placement, the 10/20 system permits both proper ROI coverage and spatial consistency among multiple subjects and experiments in a MRI-free context. However, the traditional manual approach to the identification of 10/20 landmarks faces problems in reliability and time cost. In this study, we propose a semi-automatic method to address these problems. First, a novel head surface reconstruction algorithm reconstructs head geometry from a set of points uniformly and sparsely sampled on the subject's head. Second, virtual 10/20 landmarks are determined on the reconstructed head surface in computational space. Finally, a visually-guided real-time navigation system guides the experimenter to each of the identified 10/20 landmarks on the physical head of the subject. Compared with the traditional manual approach, our proposed method provides a significant improvement both in reliability and time cost and thus could contribute to improving both the effectiveness and efficiency of 10/20-guided MRI-free probe placement.

  9. Applying value stream mapping techniques to eliminate non-value-added waste for the procurement of endovascular stents.

    PubMed

    Teichgräber, Ulf K; de Bucourt, Maximilian

    2012-01-01

    OBJECTIVES: To eliminate non-value-adding (NVA) waste for the procurement of endovascular stents in interventional radiology services by applying value stream mapping (VSM). The Lean manufacturing technique was used to analyze the process of material and information flow currently required to direct endovascular stents from external suppliers to patients. Based on a decision point analysis for the procurement of stents in the hospital, a present-state VSM was drawn. After assessment of the present-state VSM and progressive elimination of unnecessary NVA waste, a future-state VSM was drawn. The present-state VSM demonstrated that, of 13 processes for the procurement of stents, only 2 were value-adding. Of the NVA processes, 5 were unnecessary NVA activities that could be eliminated. The decision point analysis demonstrated that the procurement of stents was mainly a forecast-driven push system. The future-state VSM applies a pull inventory control system to trigger the movement of a unit after withdrawal by using a consignment stock. VSM is a visualization tool for the supply chain and value stream, based on the Toyota Production System, and greatly assists in successfully implementing a Lean system. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  10. A technique for visualization and mapping of local cartilage thickness changes in MR images of osteoarthritic knee.

    PubMed

    Ge, Quanxu; Cheng, Yuanzhi; Bi, Kesen; Guo, Changyong; Bai, Jing; Tamura, Shinichi

    2012-11-01

    The aim of this paper is to describe a technique for the visualization and mapping of focal, local cartilage thickness changes over time in magnetic resonance images of the osteoarthritic knee. Magnetic resonance imaging was performed on 25 fresh-frozen pig knee joints and 15 knees of patients with borderline to mild osteoarthritis (51.2±6.3 years). Cartilage and the corresponding bone structures were extracted by semi-automatic segmentation. Cartilage thicknesses were computed for each point in the bone-cartilage interfaces, and each point of the bone surface that was part of the bone-cartilage interface was assigned a cartilage thickness value transferred to the bone surface. Moreover, we developed a three-dimensional registration method for the identification of anatomically corresponding points of the bone surface in order to quantify local cartilage thickness changes. One of the main advantages of our method compared with other registration studies is a global optimization algorithm that does not require any initialization. The registration accuracy was 0.93±0.05 mm (less than a voxel of the magnetic resonance data). Local cartilage thickness changes could then be followed over time in the clinical study, allowing local changes in cartilage thickness to be detected. The experimental results suggest that our method is sufficiently accurate and effective for monitoring knee joint diseases. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
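
    One simple way to assign a thickness value to each bone-cartilage interface point, in the spirit of the approach above, is the distance from the interface to the nearest point on the outer cartilage surface. The sketch below does this for invented point clouds and is not the authors' algorithm.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(9)

# Invented point clouds (in mm): points on the bone-cartilage interface and on the
# outer (articular) cartilage surface of one compartment.
interface_pts = rng.uniform(0, 30, size=(2000, 3))
surface_pts = interface_pts + np.array([0.0, 0.0, 2.0]) + rng.normal(0, 0.3, size=(2000, 3))

# Thickness estimate at each interface point = distance to the nearest surface point.
tree = cKDTree(surface_pts)
thickness, _ = tree.query(interface_pts)

print(f"mean thickness {thickness.mean():.2f} mm, "
      f"range {thickness.min():.2f}-{thickness.max():.2f} mm")
```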

  11. Semi-automatic 10/20 Identification Method for MRI-Free Probe Placement in Transcranial Brain Mapping Techniques

    PubMed Central

    Xiao, Xiang; Zhu, Hao; Liu, Wei-Jie; Yu, Xiao-Ting; Duan, Lian; Li, Zheng; Zhu, Chao-Zhe

    2017-01-01

    The International 10/20 system is an important head-surface-based positioning system for transcranial brain mapping techniques, e.g., fNIRS and TMS. As guidance for probe placement, the 10/20 system permits both proper ROI coverage and spatial consistency among multiple subjects and experiments in a MRI-free context. However, the traditional manual approach to the identification of 10/20 landmarks faces problems in reliability and time cost. In this study, we propose a semi-automatic method to address these problems. First, a novel head surface reconstruction algorithm reconstructs head geometry from a set of points uniformly and sparsely sampled on the subject's head. Second, virtual 10/20 landmarks are determined on the reconstructed head surface in computational space. Finally, a visually-guided real-time navigation system guides the experimenter to each of the identified 10/20 landmarks on the physical head of the subject. Compared with the traditional manual approach, our proposed method provides a significant improvement both in reliability and time cost and thus could contribute to improving both the effectiveness and efficiency of 10/20-guided MRI-free probe placement. PMID:28190997

  12. Evolution Of Map Display Optical Systems

    NASA Astrophysics Data System (ADS)

    Boot, Alan

    1983-06-01

    It is now over 20 years since Ferranti plc introduced optically projected map displays into operational aircraft navigation systems. Then, as now, it was the function of the display to present an image of a topographical map to a pilot or navigator with his present position clearly identified. Then, as now, the map image was projected from a reduced image stored on colour microfilm. Then, as now, the fundamental design problems are the same. In the exposed environment of an aircraft cockpit, where brightness levels may vary from those associated with direct sunlight on the one hand to starlight on the other, how does one design an optical system with sufficient luminance, contrast and resolution, when in the daytime sunlight may fall on the display or in the pilot's eyes, and at night the display luminance must not detract from the pilot's ability to pick up external cues? This paper traces the development of Ferranti plc optically projected map displays from the early V Bomber and the ill-fated TSR2 displays to the Harrier and Concorde displays. It then goes on to the development of combined map and electronic displays (COMED), showing how an earlier design, as fitted to Tornado, has been developed into the current COMED design which is fitted to the F-18 and Jaguar aircraft. In each of the above display systems, particular features of optical design interest are identified and their impact on the design as a whole is discussed. The use of prisms both for optical rotation and translation, techniques for the maximisation of luminance, the problems associated with contrast enhancement, particularly with polarising filters in the presence of optically active materials, the use of aerial image combining systems and the impact of the pilot interface on the system parameters are all included. Perhaps the most interesting result in considering the evolution of map displays has not been so much the designer's solutions in overcoming the various design problems but

  13. Constraining Emission Models of Luminous Blazar Sources

    SciTech Connect

    Sikora, Marek; Stawarz, Lukasz; Moderski, Rafal; Nalewajko, Krzysztof; Madejski, Greg; /KIPAC, Menlo Park /SLAC

    2009-10-30

    Many luminous blazars which are associated with quasar-type active galactic nuclei display broad-band spectra characterized by a large luminosity ratio of their high-energy (γ-ray) and low-energy (synchrotron) spectral components. This large ratio, reaching values up to 100, challenges the standard synchrotron self-Compton models by requiring substantial departures from the minimum power condition. Luminous blazars also typically have very hard X-ray spectra, and those in turn seem to challenge hadronic scenarios for the high-energy blazar emission. As shown in this paper, no such problems are faced by models which involve Comptonization of radiation provided by a broad-line region or a dusty molecular torus. The lack or weakness of bulk-Compton and Klein-Nishina features indicated by the presently available data favors production of γ-rays via up-scattering of infrared photons from hot dust. This implies that the blazar emission zone is located at parsec-scale distances from the nucleus and, as such, is possibly associated with the extended, quasi-stationary reconfinement shocks formed in relativistic outflows. This scenario predicts characteristic timescales for flux changes in luminous blazars of days to weeks, consistent with the variability patterns observed in such systems at infrared, optical and γ-ray frequencies. We also propose that the parsec-scale blazar activity can occasionally be accompanied by dissipative events taking place at sub-parsec distances and powered by internal shocks and/or reconnection of magnetic fields. These could account for the multiwavelength intra-day flares occasionally observed in powerful blazar sources.

  14. Blur tolerance for luminance and chromatic stimuli.

    PubMed

    Wuerger, S M; Owens, H; Westland, S

    2001-06-01

    We investigated the blur tolerance of human observers for stimuli modulated along the isoluminant red-green, the isoluminant yellow-blue, and the luminance (black-white) direction in color space. We report the following results: (i) Blur difference thresholds for red-green and luminance stimuli (of equal cone contrast) are very similar and as low as 0.5 min of visual angle; for yellow-blue the lowest blur thresholds are much higher (1.5 min of visual angle). (ii) The smallest blur thresholds are found for slightly blurred square waves (reference blur of 1 arc min) and not for sharp edges. (iii) Blur thresholds for red-green and black-white follow a Weber law for reference (pedestal) blurs greater than the optimum blur. (iv) Using the model proposed by Watt and Morgan [Vision Res. 24, 1387 (1984)] we estimated the internal blur of the visual system for the black-white and the red-green color directions and arrived at the following estimates: 1.2 arc min for black-white stimuli at 10% contrast and 0.9 arc min for red-green stimuli at 10% cone contrast. Blur tolerance for yellow-blue is independent of external blur and cannot be predicted by the model. (v) The contrast dependence of blur sensitivity is similar for red-green and luminance modulations (slopes of -0.15 and -0.16 in log-log coordinates, respectively) and slightly stronger for yellow-blue (slope = -0.75). Blur discrimination thresholds are not predicted by the contrast sensitivity function of the visual system. Our findings are useful for predicting blur tolerance for complex images and provide a spatial frequency cutoff point when Gaussian low-pass filters are used for noise removal in colored images. They are also useful as a baseline for the study of visual disorders such as amblyopia.

  15. A new methodology for atrial flutter ablation by direct visualization of cavotricuspid conduction with voltage gradient mapping: a comparison to standard techniques.

    PubMed

    Bailin, Steven J; Johnson, William Ben; Jumrussirikul, Pitayadet; Sorentino, Denise; West, Robert

    2013-07-01

    To demonstrate that critical conduction within the cavotricuspid isthmus (CTI) can be directly visualized by voltage gradient mapping and can facilitate efficient ablation compared to standard techniques. Group 1 (one operator, n = 11) ablated based upon contact voltage measurements and voltage gradient mapping; ablation targeted low-voltage bridges (LVBs) within the CTI, and repeat maps were obtained following ablation. Group 2 (operators 2, 3, and 4; n = 35) utilized electroanatomic navigation and ablated by creating linear lesions from the tricuspid valve to the inferior vena cava. Demonstration of bidirectional block (BDB) was required in both groups. LVBs were associated with CTI conduction in all Group 1 patients, and LVB ablation terminated flutter or created BDB. Following ablation, CTI voltage connections were absent in all patients. Compared with Group 2, Group 1 had fewer radiofrequency (RF) lesions to atrial flutter (AFL) termination (P = 0.001), fewer total RF lesions (P = 0.0001), and less total RF time (P = 0.001). Group 1 had no recurrent AFL, whereas Group 2 had three recurrences (median follow-up of 231 ± 181 days). In summary: (i) voltage gradient mapping visualized regions of critical CTI conduction; (ii) ablation of LVBs terminated AFL and resulted in BDB; (iii) repeat mapping confirmed the absence of trans-isthmus voltage; and (iv) compared with standard ablation, voltage gradient mapping decreased total RF lesions, lesions to AFL termination, and total RF time. Use of voltage gradient mapping can facilitate successful AFL ablation.

  16. Evolving Our Evaluation of Luminous Environments

    NASA Technical Reports Server (NTRS)

    Clark, Toni

    2016-01-01

    Advances in solid-state light-emitting technologies and optics for lighting and visual communication necessitate evaluating how NASA envisions spacecraft lighting architectures and how NASA applies industry standards to the design and evaluation of lighting systems. Current NASA lighting standards and requirements for existing architectures address separately the ability of a lighting system to throw light onto a surface and the ability of a display system to provide appropriate visual contrast. This project investigated large luminous surface lamps as an alternative or supplement to overhead lighting, evaluating the technology for uniformity and power consumption.
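    As an illustrative sketch only (the metric choices, function name, and numbers below are our assumptions, not figures from the NASA project), a measured luminance map of a luminous surface lamp can be reduced to conventional uniformity and efficacy figures along these lines:

        # Illustrative sketch: summarize a measured luminance map of a surface lamp.
        # The uniformity metric (min/mean) and efficacy (lm/W) are conventional
        # lighting measures and are assumptions here, not taken from the project.
        import numpy as np

        def summarize_lamp(luminance_map_cd_m2, input_power_w, luminous_flux_lm):
            lum = np.asarray(luminance_map_cd_m2, dtype=float)
            uniformity = lum.min() / lum.mean()           # 1.0 = perfectly uniform
            max_to_min = lum.max() / lum.min()            # spread across the panel
            efficacy = luminous_flux_lm / input_power_w   # overall efficiency, lm/W
            return {"uniformity": uniformity,
                    "max_to_min": max_to_min,
                    "efficacy_lm_per_w": efficacy}

        # Hypothetical 3x3 grid of spot luminance measurements (cd/m^2):
        print(summarize_lamp([[410, 420, 405], [415, 430, 418], [400, 412, 408]],
                             input_power_w=18.0, luminous_flux_lm=1500.0))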

  17. Photometric monitoring of Luminous Blue Variables

    NASA Astrophysics Data System (ADS)

    Buemi, Carla; Distefano, Elisa; Leto, Paolo; Schillirò, Francesco; Trigilio, Corrado; Umana, Grazia; Bernabei, Stefano; Cutispoto, Giuseppe; Messina, Sergio

    2011-01-01

    We present some preliminary results from our program of intensive near-infrared photometric monitoring of a sample of confirmed and candidate Luminous Blue Variables (LBVs) conducted from 2008 to 2010. Clear long-term variability has been observed for Wray 17-96 and V481 Sct, with overall brightness variations greater than 1 mag in the J band. Other sources, such as LBV 1806-20, showed detectable variability with amplitudes of a few tenths of a magnitude on a time-scale of about 60 days.

  18. Comprehensive analysis of TEM methods for LiFePO4/FePO4 phase mapping: spectroscopic techniques (EFTEM, STEM-EELS) and STEM diffraction techniques (ACOM-TEM).

    PubMed

    Mu, X; Kobler, A; Wang, D; Chakravadhanula, V S K; Schlabach, S; Szabó, D V; Norby, P; Kübel, C

    2016-11-01

    Transmission electron microscopy (TEM) has been used intensively to investigate battery materials, e.g. to obtain phase maps of partially (dis)charged (lithium) iron phosphate (LFP/FP), which is one of the most promising cathode materials for next-generation lithium-ion (Li-ion) batteries. Because of the weak interaction between Li atoms and fast electrons, mapping the Li distribution is not straightforward. In this work, we revisited the issue of TEM measurements of Li distribution maps for LFP/FP. Different TEM techniques, including spectroscopic techniques (energy-filtered (EF)TEM in the energy range from low-loss to core-loss) and a STEM diffraction technique (automated crystal orientation mapping (ACOM)), were applied to map the lithiation of the same location in the same sample, enabling a direct comparison of the results. The maps obtained by all methods showed excellent agreement with each other. Because the imaging mechanisms differ strongly, this agreement demonstrates the reliability of both the spectroscopic and the STEM diffraction phase mapping. A comprehensive comparison of all methods is given in terms of information content, dose level, acquisition time, and signal quality; the latter three are crucial for the design of in-situ experiments with beam-sensitive Li-ion battery materials. Furthermore, we demonstrated the power of STEM diffraction (ACOM-STEM) in providing additional crystallographic information, which can be analyzed to gain a deeper understanding of the LFP/FP interface properties, such as statistical information on phase boundary orientation and misorientation between domains.

  19. Luminal-B breast cancer and novel therapeutic targets

    PubMed Central

    2011-01-01

    Gene expression profiling has led to a new molecular classification of breast cancer characterized by four intrinsic subtypes: basal-like, HER2-positive, luminal A, and luminal B. Despite expressing estrogen receptor, the luminal-B subtype confers increased risk of early relapse with endocrine therapy compared with the luminal-A subtype. Although luminal-B definitions vary, the hallmark appears to be increased expression of proliferation-related genes. Several biological pathways are identified as possible contributors to the poor outcomes, and novel agents targeting these pathways are being developed with aims to improve survival. We review the definition of luminal-B breast cancer, its pathological and clinical features, and potential targets for treatment. PMID:22217398

  20. Luminance cues constrain chromatic blur discrimination in natural scene stimuli.

    PubMed

    Sharman, Rebecca J; McGraw, Paul V; Peirce, Jonathan W

    2013-03-22

    Introducing blur into the color components of a natural scene has very little effect on its percept, whereas blur introduced into the luminance component is very noticeable. Here we quantify the dominance of luminance information in blur detection and examine a number of potential causes. We show that the interaction between chromatic and luminance information is not explained by reduced acuity or spatial resolution limitations for chromatic cues, the effective contrast of the luminance cue, or chromatic and achromatic statistical regularities in the images. Regardless of the quality of chromatic information, the visual system gives primacy to luminance signals when determining edge location. In natural viewing, luminance information appears to be specialized for detecting object boundaries while chromatic information may be used to determine surface properties.
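    As an illustrative sketch of the kind of manipulation described here (not the authors' code; the function name, sigma value, and the simple BT.601-style luma/chroma split are our assumptions), the following Python snippet blurs only the chromatic channels of an RGB image while leaving the luminance channel untouched:

        # Minimal sketch: blur the chromatic channels of a float RGB image in [0, 1]
        # while leaving the luminance (luma) channel intact.
        import numpy as np
        from scipy.ndimage import gaussian_filter

        def blur_chroma_only(rgb, sigma=4.0):
            r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
            y = 0.299 * r + 0.587 * g + 0.114 * b       # BT.601 luma
            cb, cr = b - y, r - y                       # color-difference channels
            cb_blur = gaussian_filter(cb, sigma)        # Gaussian blur on chroma only
            cr_blur = gaussian_filter(cr, sigma)
            r2 = y + cr_blur
            b2 = y + cb_blur
            g2 = (y - 0.299 * r2 - 0.114 * b2) / 0.587  # recover G so luma is unchanged
            # Clipping may slightly alter out-of-gamut pixels after heavy chroma blur.
            return np.clip(np.stack([r2, g2, b2], axis=-1), 0.0, 1.0)

    Blurring the luminance channel instead (or the full RGB image) would follow the same pattern with the Gaussian filter applied to y rather than to cb and cr.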

  1. Facilitating the exploitation of ERTS imagery using snow enhancement techniques. [geological mapping of New England test area]

    NASA Technical Reports Server (NTRS)

    Wobber, F. J.; Martin, K. R. (Principal Investigator); Amato, R. V.; Leshendok, T.

    1974-01-01

    The author has identified the following significant results. The procedure for conducting a regional geological mapping program utilizing snow-enhanced ERTS-1 imagery has been summarized. While it is recognized that mapping procedures in geological programs will vary from area to area and from geologist to geologist, it is believed that the procedure tested in this project is applicable over a wide range of mapping programs. The procedure is designed to maximize the utility and value of ERTS-1 imagery and aerial photography within the initial phase of geological mapping programs. Sample products which represent interim steps in the mapping formula (e.g. the ERTS Fracture-Lineament Map) have been prepared. A full account of these procedures and products will be included within the Snow Enhancement Users Manual.

  2. Peripheral visual response time and retinal luminance-area relations

    NASA Technical Reports Server (NTRS)

    Haines, R. F.

    1975-01-01

    Experiments were undertaken to elucidate the stimulus luminance-retinal area relationship that underlies response time (RT) behavior. Mean RT was significantly faster to stimuli imaged beyond about 70 deg of arc from the fovea when their luminance was increased by an amount equal to the foveal stimulus luminance multiplied by the cosine of the angle between the peripheral stimuli and the line of sight. This and additional data are discussed in relation to previous psychophysical data and to possible response mechanisms.
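    Stated as a formula (a restatement of the sentence above, not additional data from the report), the compensating luminance for a peripheral stimulus at eccentricity \theta is

        L_{p}(\theta) \;=\; L_{f} + L_{f}\cos\theta \;=\; L_{f}\,(1 + \cos\theta),
        \qquad \theta = 70^{\circ}: \;\; L_{p} \approx 1.34\,L_{f}

    where L_{f} is the foveal stimulus luminance.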

  4. A new interpretation of luminous blue stars

    NASA Technical Reports Server (NTRS)

    Stothers, R.

    1976-01-01

    A major revision of current theoretical ideas about the brightest blue stars must be made if Carson's (1976) radiative opacities are adopted in stellar models. Unlike earlier opacities, these exhibit a large 'bump' due to CNO ionization, which leads to very strong central condensation, convective instability, and pulsational instability in hot diffuse stellar envelopes. Despite a number of theoretical uncertainties, the new picture of the structure of very luminous stars is reasonably successful in accounting for a variety of previously unexplained observations. The stellar models for the phase of core hydrogen burning predict large radii and rather cool effective temperatures for O stars and a spreading out of the main-sequence band in the H-R diagram toward luminous cool supergiants for masses higher than about 20 solar masses. In massive X-ray binary systems, circular orbits and supergiant-like visual companions are expected to be quite common. Long-period variability is predicted to exist for massive blue supergiants of luminosity class Ia. The models for helium stars predict large radii and rather cool effective temperatures for Wolf-Rayet stars, as well as multimodal pulsational instability and, possibly, surface turbulence for these stars.

  5. The Effect of Learner Constructed, Fill in the Map Concept Map Technique, and Summarizing Strategy on Iranian Pre-University Students' Reading Comprehension

    ERIC Educational Resources Information Center

    Soleimani, Hassan; Nabizadeh, Fatemeh

    2012-01-01

    Concept maps (CM) are powerful tools with a variety of uses in educational contexts; this study, however, limited its scope to exploring their impact on the reading comprehension skill of Iranian EFL students. For this purpose, a proficiency test was administered and 90 intermediate pre-university students were selected and divided into three groups:…

  6. Popliteal-to-Dorsalis Pedis In-Situ Small Saphenous Vein Bypass under Planning with Mapping Using Computed Tomography Volume Rendering Technique.

    PubMed

    Byun, Seung-Jae

    2015-09-01

    The small saphenous vein (SSV) is an important graft in limb salvage surgery and is frequently translocated for bypass surgery. In some cases, using the SSV as an in-situ graft for posterior tibial or peroneal artery reconstruction offers the advantages of reduced vein graft injury and improved patency. Recently, saphenous vein mapping with the computed tomography (CT) volume rendering technique has provided the surgeon with high-quality views. We treated a patient in whom a volume-rendered CT image revealed an aberrant SSV connected to the great saphenous vein at the level of the medial malleolus. This case indicates that an aberrant SSV may be successfully used as an in-situ conduit for bypass to the dorsalis pedis artery. Here, we present the case of a popliteal-to-dorsalis pedis in-situ vein bypass performed with a LeMaitre valvulotome (LeMaitre Vascular Inc., USA) under mapping of the aberrant SSV by the CT volume rendering technique.

  7. Covariation of Color and Luminance Facilitate Object Individuation in Infancy

    PubMed Central

    Woods, Rebecca J.; Wilcox, Teresa

    2013-01-01

    The ability to individuate objects is one of our most fundamental cognitive capacities. Recent research has revealed that when objects vary in color or luminance alone, infants fail to individuate those objects until 11.5 months. However, color and luminance frequently covary in the natural environment, thus providing a more salient and reliable indicator of distinct objects. For this reason, we propose that infants may be more likely to individuate when objects vary in both color and luminance. Using the narrow-screen task of Wilcox and Baillargeon (1998a), in Experiment 1 we assessed 7.5-month-old infants' ability to individuate uniformly colored objects that varied in both color and luminance or luminance alone. Experiment 2 further explored the link between color and luminance by assessing infants' ability to use pattern differences that included luminance or color to individuate objects. Results indicated that infants individuated objects only when covariations in c