Science.gov

Sample records for luminance mapping technique

  1. Evaluation of High Dynamic Range Photography as a Luminance Mapping Technique

    SciTech Connect

    Inanici, Mehlika; Galvin, Jim

    2004-12-30

    The potential, limitations, and applicability of the High Dynamic Range (HDR) photography technique are evaluated as a luminance mapping tool. Multiple exposure photographs of static scenes are taken with a Nikon 5400 digital camera to capture the wide luminance variation within the scenes. The camera response function is computationally derived using the Photosphere software, and is used to fuse the multiple photographs into HDR images. The vignetting effect and point spread function of the camera and lens system are determined. Laboratory and field studies have shown that the pixel values in the HDR photographs can correspond to the physical quantity of luminance with reasonable precision and repeatability.
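
    To make the fusion step concrete, the sketch below shows one way multiple exposures could be combined into a luminance map once a camera response function has been recovered; the weighting scheme and the absolute calibration factor k are illustrative assumptions, not the Photosphere implementation.

        import numpy as np

        def fuse_exposures(images, exposure_times, inv_response, k=179.0):
            """Fuse multiple exposures of a static scene into a luminance map.

            images         : list of 2-D uint8 arrays (same scene, different exposures)
            exposure_times : exposure times in seconds, one per image
            inv_response   : 256-element array mapping pixel value -> relative radiance
                             (the camera response function recovered beforehand)
            k              : assumed scale factor to absolute luminance (cd/m^2),
                             obtained by calibration against a luminance meter
            """
            num = np.zeros(images[0].shape, dtype=float)
            den = np.zeros_like(num)
            for img, t in zip(images, exposure_times):
                # trust mid-range pixels most; near-saturated or near-black pixels get low weight
                w = np.minimum(img, 255 - img).astype(float) + 1e-3
                num += w * inv_response[img] / t
                den += w
            return k * num / den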

  2. Mapping luminance onto lightness in vision and art

    NASA Astrophysics Data System (ADS)

    Gilchrist, Alan

    2013-03-01

    Most natural images span a large range of luminance values, often a thousand-to-one and sometimes a million-to-one (Heckaman and Fairchild, 2009). This luminance range must be mapped by the visual system onto a scale of perceived gray shades (called lightness) with a range of roughly thirty-to-one (90% to 3% reflectance). For the painter who wants to represent this scene on a canvas, the challenge is a bit different. The high-range scene must be represented using pigments with a range of only thirty-to-one. Let's begin with vision. Even without a high range in the scene, understanding the mapping of luminance onto lightness has proven elusive. But we can think of the problem as having two parts: Anchoring and scaling.
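
    As a toy illustration of the anchoring idea described above, the sketch below assumes the simplest rule (the highest luminance in the scene is assigned white at 90% reflectance) and clips the result to the roughly thirty-to-one lightness range; it is only a sketch, not Gilchrist's full anchoring theory.

        import numpy as np

        def luminance_to_lightness(luminance):
            """Toy anchoring rule: anchor the scene maximum to white (90% reflectance),
            scale everything else proportionally, and clip to the ~30:1 range of
            perceived gray shades (90% down to 3% reflectance)."""
            luminance = np.asarray(luminance, dtype=float)
            reflectance = 0.90 * luminance / luminance.max()
            return np.clip(reflectance, 0.03, 0.90)

        # a 1000:1 scene collapses onto the ~30:1 lightness scale
        print(luminance_to_lightness([1.0, 10.0, 1000.0]))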

  3. A new device for dynamic luminance mapping and glare risk assessment in buildings

    NASA Astrophysics Data System (ADS)

    Borisuit, Apiparn; Münch, Mirjam; Deschamps, Laurent; Kämpf, Jérôme; Scartezzini, Jean-Louis

    2012-10-01

    High dynamic range imaging has been shown to be a reliable tool to assess luminance maps and glare risk probability in buildings. However, there are limitations in image capturing time, especially when dealing with highly dynamic and contrasted daylight situations. We used a newly developed prototype of a digital camera which contains a high dynamic range pixel array chip with a logarithmic encoding scale. This type of camera allows luminance, contrast, and contrast directions to be assessed effectively by taking only a single image or by performing real-time recordings. The device was equipped with a fisheye lens and V-lambda filters to adapt the camera's spectral sensitivity to the human eye. After spectral as well as photometric calibration and vignetting correction, the device was tested to perform luminance mapping of real scenes. The results showed that luminance maps of a room can be efficiently assessed under dynamic daylight and mixed day- and electric lighting conditions in a very short time (i.e., 100 ms) when compared to classical HDR imaging techniques. This allows glare indexes of a scene to be calculated simultaneously. The camera opens a variety of new applications as a useful tool for architects, building designers and lighting experts. The device can be used to easily monitor daylight availability and glare indexes in existing buildings, and further developments for advanced (day-)lighting control can be envisaged.
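
    As a hedged sketch of the logarithmic encoding mentioned above, the snippet below inverts an assumed encoding p = a * log10(L) + b to recover luminance from pixel values; the constants a and b stand in for whatever the photometric calibration of such a chip would provide.

        import numpy as np

        def decode_log_pixels(pixel_values, a, b):
            """Invert an assumed logarithmic pixel encoding p = a * log10(L) + b.
            a and b are hypothetical constants from photometric calibration;
            returns luminance in the units used during calibration (e.g. cd/m^2)."""
            p = np.asarray(pixel_values, dtype=float)
            return 10.0 ** ((p - b) / a)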

  4. Covariance mapping techniques

    NASA Astrophysics Data System (ADS)

    Frasinski, Leszek J.

    2016-08-01

    Recent technological advances in the generation of intense femtosecond pulses have made covariance mapping an attractive analytical technique. The laser pulses available are so intense that often thousands of ionisation and Coulomb explosion events will occur within each pulse. To understand the physics of these processes the photoelectrons and photoions need to be correlated, and covariance mapping is well suited for operating at the high counting rates of these laser sources. Partial covariance is particularly useful in experiments with x-ray free electron lasers, because it is capable of suppressing pulse fluctuation effects. A variety of covariance mapping methods is described: simple, partial (single- and multi-parameter), sliced, contingent and multi-dimensional. The relationship to coincidence techniques is discussed. Covariance mapping has been used in many areas of science and technology: inner-shell excitation and Auger decay, multiphoton and multielectron ionisation, time-of-flight and angle-resolved spectrometry, infrared spectroscopy, nuclear magnetic resonance imaging, stimulated Raman scattering, directional gamma ray sensing, welding diagnostics and brain connectivity studies (connectomics). This review gives practical advice for implementing the technique and interpreting the results, including its limitations and instrumental constraints. It also summarises recent theoretical studies, highlights unsolved problems and outlines a personal view on the most promising research directions.
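
    The simple covariance map named in this list can be written in a few lines; the sketch below computes C(x, y) = <X(x) Y(y)> - <X(x)> <Y(y)> over a set of laser shots, assuming each shot has been reduced to a one-dimensional spectrum (e.g. a time-of-flight trace).

        import numpy as np

        def covariance_map(spectra):
            """Simple covariance map over many shots.

            spectra : (n_shots, n_channels) array, one spectrum per shot.
            Returns an (n_channels, n_channels) map whose positive islands mark
            channels whose signals fluctuate together from shot to shot."""
            X = np.asarray(spectra, dtype=float)
            mean = X.mean(axis=0)
            return X.T @ X / X.shape[0] - np.outer(mean, mean)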

  5. Nonlinear mapping of the luminance in dual-layer high dynamic range displays

    NASA Astrophysics Data System (ADS)

    Guarnieri, Gabriele; Ramponi, Giovanni; Bonfiglio, Silvio; Albani, Luigi

    2009-02-01

    It has long been known that the human visual system (HVS) has a nonlinear response to luminance. This nonlinearity can be quantified using the concept of just noticeable difference (JND), which represents the minimum amplitude of a specified test pattern an average observer can discern from a uniform background. The JND depends on the background luminance following a threshold versus intensity (TVI) function. It is possible to define a curve which maps physical luminances into a perceptually linearized domain. This mapping can be used to optimize a digital encoding by minimizing the visibility of quantization noise. It is also commonly used in medical applications to display images while adapting to the characteristics of the display device. High dynamic range (HDR) displays, which are beginning to appear on the market, can display luminance levels outside the range in which most standard mapping curves are defined. In particular, dual-layer LCD displays are able to extend the gamut of luminance offered by conventional liquid crystals towards the black region; in such areas suitable, HVS-compliant luminance transformations need to be determined. In this paper we propose a method which is primarily targeted at the extension of the DICOM curve used in medical imaging but also has a more general application. The method can be modified to compensate for the ambient light, which can be significantly greater than the black level of an HDR display and can consequently reduce the visibility of details in dark areas.
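
    A minimal sketch of the perceptual linearization described above, assuming a constant Weber fraction as a crude stand-in for a full TVI model; the actual DICOM grayscale standard display function is tabulated differently.

        import numpy as np

        def jnd_curve(l_min, l_max, weber_fraction=0.01):
            """Accumulate just noticeable differences from l_min to l_max (cd/m^2).
            A constant Weber fraction is a simplification of a real TVI function."""
            levels = [l_min]
            while levels[-1] < l_max:
                levels.append(levels[-1] * (1.0 + weber_fraction))
            return np.array(levels)          # levels[i] = luminance at JND index i

        def encode(luminance, curve):
            """Map physical luminance into the perceptually linearized (JND) domain,
            the domain in which quantization steps are least visible."""
            return np.searchsorted(curve, luminance)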

  6. C IV and C III] reverberation mapping of the luminous quasar PG 1247+267

    SciTech Connect

    Trevese, D.; Saturni, F. G.; Perna, M.; Dadina, M.

    2014-11-10

    So far the masses of about 50 active galactic nuclei (AGNs) have been measured through the reverberation mapping technique (RM). Most measurements have been performed for objects of moderate luminosity and redshift, based on Hβ, which is also used to calibrate the scaling relation that allows single-epoch (SE) mass determination based on AGN luminosity and the width of different emission lines. Due to the complex structure and gas dynamics of the relevant emission region, the SE masses obtained from the C IV(1549 Å) line show a large spread around the mean values. Direct RM measures of C IV exist for only six AGNs of low luminosity and redshift, and only one luminous quasar. Since 2003, we have collected photometric and spectroscopic observations of PG 1247+267, the most luminous quasar ever analyzed for RM. We provide light curves for the continuum and for C IV(1549 Å) and C III](1909 Å), and measures of the reverberation time lags based on the SPEAR method. The sizes of the line emission regions are in a ratio of R(C III])/R(C IV) ∼ 2, similar to the case of Seyfert galaxies, indicating for the first time a similar ionization stratification in a luminous quasar and in low-luminosity nuclei. Due to the relatively small size of the broad line region and the relatively narrow line widths, we estimate a small mass and an anomalously high Eddington ratio. We discuss the possibility that either the shape of the emission region or an amplification of the luminosity caused by gravitational lensing may be partly responsible for the result.
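
    The lags above are derived with the SPEAR method; as a simpler, hedged illustration of how a reverberation lag can be estimated, the sketch below picks the shift that maximizes the cross-correlation of continuum and emission-line light curves already interpolated onto a uniform time grid.

        import numpy as np

        def ccf_lag(continuum, line, dt, max_lag_steps):
            """Cross-correlation lag estimate between two uniformly sampled light curves.
            max_lag_steps must be much smaller than the number of samples; this is a
            simplification of interpolation-CCF or SPEAR-style methods."""
            c = np.asarray(continuum, dtype=float)
            l = np.asarray(line, dtype=float)
            c = (c - c.mean()) / c.std()
            l = (l - l.mean()) / l.std()
            lags = range(-max_lag_steps, max_lag_steps + 1)
            r = [np.mean(c[max(0, -k):len(c) - max(0, k)] *
                         l[max(0, k):len(l) + min(0, k)]) for k in lags]
            return lags[int(np.argmax(r))] * dt   # positive lag: line responds after continuum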

  7. A methodology for luminance map retrieval using airborne hyperspectral and photogrammetric data

    NASA Astrophysics Data System (ADS)

    Pipia, Luca; Alamús, Ramon; Tardà, Anna; Pérez, Fernando; Palà, Vicenç; Corbera, Jordi

    2014-10-01

    This paper puts forward a methodology developed at the Institut Cartogràfic i Geològic de Catalunya (ICGC) to quantify upwelling light flux using hyperspectral and photogrammetric airborne data. The work was carried out in the frame of a demonstrative study requested by the municipality of Sant Cugat del Vallès, in the vicinity of Barcelona (Spain), and aimed to envisage a new approach to assess artificial lighting policies and actions as an alternative to field campaigns. Hyperspectral and high resolution multispectral/panchromatic data were acquired simultaneously over urban areas. In order to avoid moon light contributions, data were acquired during the first days of the new moon phase. Hyperspectral data were radiometrically calibrated. Then, National Center for Environmental Prediction (NCEP) atmospheric profiles were employed to estimate the actual Column Water Vapor (CWV) to be passed to ModTran5.0 for the atmospheric transmissivity τ calculation. At-the-ground radiance was finally integrated using the photopic sensitivity curve to generate a luminance map (cd m^-2) of the flown area by mosaicking the different flight tracks. In an attempt to improve the spatial resolution and enhance the dynamic range of the luminance map, a sensor-fusion strategy was finally looked into. DMC photogrammetric data acquired simultaneously with the hyperspectral information were converted into at-the-ground radiance and upscaled to CASI spatial resolution. High-resolution (HR) luminance maps with enhanced dynamic range were finally generated by linearly fitting up-scaled DMC mosaics to the CASI-based luminance information. In the end, a preliminary assessment of the methodology is carried out using non-simultaneous in-situ measurements.
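
    The final integration step described above (at-the-ground spectral radiance weighted by the photopic curve) can be illustrated as follows; the 683 lm/W constant is standard photometry, while the common wavelength grid for radiance and V(lambda) is an assumption of this sketch.

        import numpy as np

        def luminance_from_radiance(wavelengths_nm, spectral_radiance, v_lambda):
            """Luminance (cd m^-2) from spectral radiance (W m^-2 sr^-1 nm^-1)
            integrated against the photopic sensitivity curve V(lambda).
            v_lambda must be sampled at the same wavelengths as the radiance."""
            return 683.0 * np.trapz(spectral_radiance * v_lambda, wavelengths_nm)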

  8. Visuoauditory mappings between high luminance and high pitch are shared by chimpanzees (Pan troglodytes) and humans

    PubMed Central

    Ludwig, Vera U.; Adachi, Ikuma; Matsuzawa, Tetsuro

    2011-01-01

    Humans share implicit preferences for certain cross-sensory combinations; for example, they consistently associate higher-pitched sounds with lighter colors, smaller size, and spikier shapes. In the condition of synesthesia, people may experience such cross-modal correspondences to a perceptual degree (e.g., literally seeing sounds). So far, no study has addressed the question whether nonhuman animals share cross-modal correspondences as well. To establish the evolutionary origins of cross-modal mappings, we tested whether chimpanzees (Pan troglodytes) also associate higher pitch with higher luminance. Thirty-three humans and six chimpanzees were required to classify black and white squares according to their color while hearing irrelevant background sounds that were either high-pitched or low-pitched. Both species performed better when the background sound was congruent (high-pitched for white, low-pitched for black) than when it was incongruent (low-pitched for white, high-pitched for black). An inherent tendency to pair high pitch with high luminance hence evolved before the human lineage split from that of chimpanzees. Rather than being a culturally learned or a linguistic phenomenon, this mapping constitutes a basic feature of the primate sensory system. PMID:22143791

  10. A MAP OF THE INTEGRATED SACHS-WOLFE SIGNAL FROM LUMINOUS RED GALAXIES

    SciTech Connect

    Granett, Benjamin R.; Neyrinck, Mark C.; Szapudi, Istvan

    2009-08-10

    We construct a map of the time derivative of the gravitational potential traced by Sloan Digital Sky Survey luminous red galaxies (LRGs). The potential decays on large scales due to cosmic acceleration, leaving an imprint on cosmic microwave background (CMB) radiation through the integrated Sachs-Wolfe (ISW) effect. With a template fit, we directly measure this signature on the CMB at a 2σ confidence level. The measurement is consistent with the cross-correlation statistic, strengthening the claim that dark energy is indeed the cause of the correlation. This new approach potentially simplifies the cosmological interpretation. Our constructed linear ISW map shows no evidence for degree-scale cold and hot spots associated with supervoid and supercluster structures. This suggests that the linear ISW effect in a concordance ΛCDM cosmology is insufficient to explain the strong CMB imprints from these structures that we previously reported.

  11. Brain Friendly Techniques: Mind Mapping

    ERIC Educational Resources Information Center

    Goldberg, Cristine

    2004-01-01

    Mind Mapping can be called the Swiss Army Knife for the brain, a total visual thinking tool or a multi-handed thought catcher. Invented by Tony Buzan in the early 1970s and used by millions around the world, it is a method that can be part of a repertoire of techniques when teaching information literacy, planning, presenting, thinking, and so…

  12. Mapping luminous blue compact galaxies with VIRUS-P. Morphology, line ratios, and kinematics

    NASA Astrophysics Data System (ADS)

    Cairós, L. M.; Caon, N.; García Lorenzo, B.; Kelz, A.; Roth, M.; Papaderos, P.; Streicher, O.

    2012-11-01

    Context. Blue compact galaxies (BCG) are narrow emission-line systems that undergo a violent burst of star formation. They are compact, low-luminosity galaxies, with blue colors and low chemical abundances, which offer us a unique opportunity to investigate collective star formation and its effects on galaxy evolution in a relatively simple, dynamically unperturbed environment. Spatially resolved spectrophotometric studies of BCGs are essential for a better understanding of the role of starburst-driven feedback processes on the kinematical and chemical evolution of low-mass galaxies near and far. Aims: We carry out an integral field spectroscopy (IFS) study of a sample of luminous BCGs, with the aim to probe the morphology, kinematics, dust extinction, and excitation mechanisms of their warm interstellar medium (ISM). Methods: We obtained IFS data for five luminous BCGs with VIRUS-P, the prototype instrument for the Visible Integral Field Replicable Unit Spectrograph, attached to the 2.7 m Harlan J. Smith Telescope at the McDonald Observatory. VIRUS-P consists of a square array of 247 optical fibers, which covers a 109″ × 109″ field of view, with a spatial sampling of 4.2″ and a 0.3 filling factor. We observed in the 3550-5850 Å spectral range, with a resolution of 5 Å FWHM. From these data we built two-dimensional maps of the continuum and the most prominent emission lines ([O ii] λ3727, Hγ, Hβ and [O iii] λ5007), and investigated the morphology of diagnostic emission-line ratios and the extinction patterns in the ISM as well as stellar and gas kinematics. Additionally, from integrated spectra we inferred total line fluxes and luminosity-weighted extinction coefficients and gas-phase metallicities. Results: All galaxies exhibit an overall regular morphology in the stellar continuum, while their warm ISM morphology is more complex: in II Zw 33 and Mrk 314, the star-forming regions are aligned along a chain-structure; Haro 1, NGC 4670 and III Zw 102

  13. Simple techniques for thematic mapping

    NASA Astrophysics Data System (ADS)

    Derenyi, Eugene E.

    A digital image restitution and mensuration software package was developed and installed in a geographic information system. Mapping is performed by monoscopic digitization on the image display screen. Two of the three image restitution schemes employ a rigorous mathematical model for generating the object space coordinates, while the third one is based on a close approximation. No photogrammetric or cartographic experience is necessary to follow these procedures. The primary application of this mapping tool is seen as thematic mapping.

  14. Window defect planar mapping technique

    NASA Technical Reports Server (NTRS)

    Minton, F. R.; Minton, U. O. (Inventor)

    1976-01-01

    A method of planar mapping defects in a window having an edge surface and a planar surface. The method comprises steps for mounting the window on a support surface. Then a light sensitive paper is placed adjacent to the window surface. A light source is positioned adjacent to the window edge. The window is then illuminated with the source of light for a predetermined interval of time. Defects on the surface of the glass, as well as in the interior of the glass, are detected by analyzing the developed light sensitive paper. The light source must be in the form of optical fibers or a light tube whose light transmitting ends are placed near the edge surface of the window.

  15. Digital Mapping Techniques '07 - Workshop Proceedings

    USGS Publications Warehouse

    Soller, David R.

    2008-01-01

    The Digital Mapping Techniques '07 (DMT'07) workshop was attended by 85 technical experts from 49 agencies, universities, and private companies, including representatives from 27 state geological surveys. This year's meeting, the tenth in the annual series, was hosted by the South Carolina Geological Survey, from May 20-23, 2007, on the University of South Carolina campus in Columbia, South Carolina. Each DMT workshop has been coordinated by the U.S. Geological Survey's National Geologic Map Database Project and the Association of American State Geologists (AASG). As in previous years' meetings, the objective was to foster informal discussion and exchange of technical information, principally in order to develop more efficient methods for digital mapping, cartography, GIS analysis, and information management. At this meeting, oral and poster presentations and special discussion sessions emphasized: 1) methods for creating and publishing map products (here, 'publishing' includes Web-based release); 2) field data capture software and techniques, including the use of LIDAR; 3) digital cartographic techniques; 4) migration of digital maps into ArcGIS Geodatabase format; 5) analytical GIS techniques; and 6) continued development of the National Geologic Map Database.

  16. Partial covariance mapping techniques at FELs

    NASA Astrophysics Data System (ADS)

    Frasinski, Leszek

    2014-05-01

    The development of free-electron lasers (FELs) is driven by the desire to access the structure and chemical dynamics of biomolecules with atomic resolution. Short, intense FEL pulses have the potential to record x-ray diffraction images before the molecular structure is destroyed by radiation damage. However, even during the shortest, few-femtosecond pulses currently available, there are some significant changes induced by massive ionisation and the onset of Coulomb explosion. To interpret the diffraction images it is vital to gain insight into the electronic and nuclear dynamics during multiple core and valence ionisations that compete with Auger cascades. This paper focuses on a technique that is capable of probing these processes. The covariance mapping technique is well suited to the high intensity and low repetition rate of FEL pulses. While the multitude of charges ejected at each pulse overwhelms conventional coincidence methods, an improved technique of partial covariance mapping can cope with hundreds of photoelectrons or photoions detected at each FEL shot. The technique, however, often reveals spurious, uninteresting correlations that spoil the maps. This work will discuss the strengths and limitations of various forms of covariance mapping techniques. Quantitative information extracted from the maps will be linked to theoretical modelling of ionisation and fragmentation paths. Special attention will be given to critical experimental parameters, such as counting rate, FEL intensity fluctuations, vacuum impurities or detector efficiency and nonlinearities. Methods of assessing and optimising signal-to-noise ratio will be described. Emphasis will be put on possible future developments such as multidimensional covariance mapping, compensation for various experimental instabilities and improvements in the detector response. This work has been supported by the EPSRC, UK (grants EP/F021232/1 and EP/I032517/1).
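
    For readers unfamiliar with the formula, single-parameter partial covariance subtracts from the plain covariance map the part that is driven by a fluctuating parameter such as the FEL pulse energy; the sketch below is a generic illustration of that correction, not the analysis code used in the experiments described here.

        import numpy as np

        def partial_covariance_map(spectra, pulse_energy):
            """Single-parameter partial covariance:
            pcov(X, Y; I) = cov(X, Y) - cov(X, I) cov(I, Y) / var(I)

            spectra      : (n_shots, n_channels) array, one spectrum per shot
            pulse_energy : (n_shots,) array of the fluctuating parameter I."""
            X = np.asarray(spectra, dtype=float)
            I = np.asarray(pulse_energy, dtype=float)
            n = X.shape[0]
            mean = X.mean(axis=0)
            cov_xy = X.T @ X / n - np.outer(mean, mean)
            cov_xi = (X * I[:, None]).mean(axis=0) - mean * I.mean()
            return cov_xy - np.outer(cov_xi, cov_xi) / I.var()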

  17. Three-dimensional spectroscopy of local luminous compact blue galaxies: kinematic maps of a sample of 22 objects

    NASA Astrophysics Data System (ADS)

    Pérez-Gallego, J.; Guzmán, R.; Castillo-Morales, A.; Gallego, J.; Castander, F. J.; Garland, C. A.; Gruel, N.; Pisano, D. J.; Zamorano, J.

    2011-12-01

    We use three-dimensional optical spectroscopy observations of a sample of 22 local luminous compact blue galaxies (LCBGs) to create kinematic maps. By means of these, we classify the kinematics of these galaxies into three different classes: rotating disc (RD), perturbed rotation (PR) and complex kinematics (CK). We find 48 per cent are RDs, 28 per cent are PRs and 24 per cent are CKs. RDs show rotational velocities that range between ˜50 and ˜200 km s^-1, and dynamical masses that range between ˜1 × 10^9 and ˜3 × 10^10 M⊙. We also address the following two fundamental questions through the study of the kinematic maps: (i) What processes are triggering the current starburst in LCBGs? We search our maps of the galaxy velocity fields for signatures of recent interactions and close companions that may be responsible for the enhanced star formation in our sample. We find that 5 per cent of objects show evidence of a recent major merger, 10 per cent of a minor merger and 45 per cent of a companion. This argues in favour of ongoing interactions with close companions as a mechanism for the enhanced star formation activity in these galaxies. (ii) What processes may eventually quench the current starburst in LCBGs? Velocity and velocity width maps, together with emission line ratio maps, can reveal signatures of active galactic nuclei (AGNs) activity or supernova (SN)-driven galactic winds that could halt the current burst. We find only 5 per cent of objects with clear evidence of AGN activity and 27 per cent with kinematics consistent with SN-driven galactic winds. Therefore, a different mechanism may be responsible for quenching the star formation in LCBGs. Finally, from our analysis, we find that the velocity widths of RDs, rather than accounting exclusively for the rotational nature of these objects, may account as well for other kinematic components and may not be good tracers of their dynamical masses.

  18. Multiscale mapping: Physical concepts and mathematical techniques

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This is an introductory summary for papers either invited or a part of a symposium at the 18th World Congress of Soil Science, July 2006 in Philadelphia. The symposium, titled "Multiscale Mapping of Soil Properties for Environmental Studies, Agriculture, and Decision Making," focused on techniques u...

  19. Digital Mapping Techniques '06 - Workshop Proceedings

    USGS Publications Warehouse

    Soller, David R.

    2007-01-01

    The Digital Mapping Techniques '06 (DMT'06) workshop was attended by more than 110 technical experts from 51 agencies, universities, and private companies, including representatives from 27 state geological surveys (see Appendix A of these Proceedings). This workshop was similar in nature to the previous nine meetings, which were held in Lawrence, Kansas (Soller, 1997), Champaign, Illinois (Soller, 1998), Madison, Wisconsin (Soller, 1999), Lexington, Kentucky (Soller, 2000), Tuscaloosa, Alabama (Soller, 2001), Salt Lake City, Utah (Soller, 2002), Millersville, Pennsylvania (Soller, 2003), Portland, Oregon (Soller, 2004), and Baton Rouge, Louisiana (Soller, 2005). This year's meeting was hosted by the Ohio Geological Survey, from June 11-14, 2006, on the Ohio State University campus in Columbus, Ohio. As in the previous meetings, the objective was to foster informal discussion and exchange of technical information. It is with great pleasure that I note that the objective was successfully met, as attendees continued to share and exchange knowledge and information, and renew friendships and collegial work begun at past DMT workshops. Each DMT workshop has been coordinated by the Association of American State Geologists (AASG) and U.S. Geological Survey (USGS) Data Capture Working Group, the latter of which was formed in August 1996 to support the AASG and the USGS in their effort to build a National Geologic Map Database (see Soller, this volume, and http://ngmdb.usgs.gov/info/standards/datacapt/). The Working Group was formed because increased production efficiencies, standardization, and quality of digital map products were needed for the database, and for the State and Federal geological surveys, to provide more high-quality digital maps to the public. At the 2006 meeting, oral and poster presentations and special discussion sessions emphasized: 1) methods for creating and publishing map products (here, 'publishing' includes Web-based release); 2) field data capture

  20. Investigation of light scattering as a technique for detecting discrete soot particles in a luminous flame

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The practicability of using a classical light-scattering technique, involving comparison of angular scattering intensity patterns with theoretically determined Mie and Rayleigh patterns, to detect discrete soot particles (diameter less than 50 nm) in premixed propane/air and propane/oxygen-helium flames is considered. The experimental apparatus employed in this investigation included a laser light source, a flat-flame burner, specially coated optics, a cooled photomultiplier detector, and a lock-in voltmeter readout. Although large, agglomerated soot particles were detected and sized, it was not possible to detect small, discrete particles. The limiting factor appears to be background scattering by the system's optics.

  1. Structural Modeling Using "Scanning and Mapping" Technique

    NASA Technical Reports Server (NTRS)

    Amos, Courtney L.; Dash, Gerald S.; Shen, J. Y.; Ferguson, Frederick; Noga, Donald F. (Technical Monitor)

    2000-01-01

    Supported by NASA Glenn Center, we are in the process of developing a structural damage diagnostic and monitoring system for rocket engines, which consists of five modules: Structural Modeling, Measurement Data Pre-Processor, Structural System Identification, Damage Detection Criterion, and Computer Visualization. The function of the system is to detect damage as it is incurred by the engine structures. The scientific principle used to identify damage is to utilize the changes in the vibrational properties between the pre-damaged and post-damaged structures. The vibrational properties of the pre-damaged structure can be obtained based on an analytic computer model of the structure. Thus, as the first stage of the whole research plan, we currently focus on the first module - Structural Modeling. Three computer software packages are selected, and will be integrated for this purpose. They are PhotoModeler-Pro, AutoCAD-R14, and MSC/NASTRAN. AutoCAD is the most popular PC-CAD system currently available in the market. For our purpose, it serves as an interface to generate structural models of any particular engine parts or assembly, which are then passed to MSC/NASTRAN for extracting structural dynamic properties. Although AutoCAD is a powerful structural modeling tool, the complexity of engine components requires a further improvement in structural modeling techniques. We are working on a so-called "scanning and mapping" technique, which is a relatively new technique. The basic idea is to produce a full and accurate 3D structural model by tracing on multiple overlapping photographs taken from different angles. There is no need to input point positions, angles, distances or axes. Photographs can be taken by any type of camera with different lenses. With the integration of such a modeling technique, the capability of structural modeling will be enhanced. The prototypes of any complex structural components will be produced by PhotoModeler first based on existing similar

  2. Digital Mapping Techniques '11–12 workshop proceedings

    USGS Publications Warehouse

    Soller, David R.

    2014-01-01

    At these meetings, oral and poster presentations and special discussion sessions emphasized: (1) methods for creating and publishing map products (here, "publishing" includes Web-based release); (2) field data capture software and techniques, including the use of LiDAR; (3) digital cartographic techniques; (4) migration of digital maps into ArcGIS Geodatabase formats; (5) analytical GIS techniques; and (6) continued development of the National Geologic Map Database.

  3. Process Mapping: Tools, Techniques, & Critical Success Factors.

    ERIC Educational Resources Information Center

    Kalman, Howard K.

    2002-01-01

    Explains process mapping as an analytical tool and a process intervention that performance technologists can use to improve human performance by reducing error variance. Highlights include benefits of process mapping; and critical success factors, including organizational readiness, time commitment by participants, and the availability of a…

  4. Concept Mapping: A Critical Thinking Technique

    ERIC Educational Resources Information Center

    Harris, Charles M.; Zha, Shenghua

    2013-01-01

    Concept mapping, graphically depicting the structure of abstract concepts, is based on the observation that pictures and line drawings are often more easily comprehended than the words that represent an abstract concept. The efficacy of concept mapping for facilitating critical thinking was assessed in four sections of an introductory psychology…

  5. A Novel Tsunami Inundation Mapping Technique

    NASA Astrophysics Data System (ADS)

    Watts, P.; Walder, J. S.; Waythomas, C. F.

    2003-12-01

    Traditional tsunami inundation maps have focused on maximum on-land inundation. Occasionally, the inundation maps are supplemented with information about maximum water velocity or timing. We demonstrate using several case studies the utility of producing maps showing 1) on-land inundation area, 2) maximum water velocity, 3) maximum water flux, 4) time of wave arrival, 5) time of wave maximum, and 6) time of wave departure. Map attributes 1-3 are related to water motion and are of particular value to scientists involved in tsunami hazard assessment. Attributes 4-6 are related to the timing of wave events and are of particular value to emergency managers. However, this raw form of inundation mapping is not easily interpreted within the usual GIS context because water velocities and fluxes are not readily understood by lay people. To improve on the GIS approach to portraying tsunami information, we employ simple engineering models illustrating the consequences of tsunami attack in a way that lay audiences can readily understand. In particular, we present maps that depict regions where a human would be knocked down, where cars would be moved, where various sizes of sediment would be moved, where anchors would break, and where ships would be moved laterally by a significant amount. Each map is separated into three distinct layers, each layer representing a different level of hazard. For example, the car movement map shows results for three sizes of automobile. Each region is depicted as a separate GIS layer. Given the uncertainty involved in numerical simulations, as well as the many local variables that can alter the outcome during a real event, these maps are meant to provide general guidelines as to the outcome of tsunami attack.

  6. Achromatic Emission Velocity Measurements in Luminous Flows

    NASA Technical Reports Server (NTRS)

    Schneider, S. J.; Fulghum, S. F.; Rostler, P. S.

    1997-01-01

    A new velocity measurement instrument for luminous flows was developed by Science Research Laboratory for NASA. The SIEVE (Segmented Image Emission VElocimeter) instrument uses broadband light emitted by the flow for the velocity measurement. This differs from other velocimetry techniques in that it does not depend on laser illumination and/or light scattering from particles in the flow. The SIEVE is a passive, non-intrusive diagnostic. By moving and adjusting the imaging optics, the SIEVE can provide three-dimensional mapping of a flow field and determine turbulence scale size. A SIEVE instrument was demonstrated on an illuminated rotating disk to evaluate instrument response and noise and on an oxy-acetylene torch to measure flame velocities. The luminous flow in rocket combustors and plumes is an ideal subject for the SIEVE velocity measurement technique.

  7. Luminous presence

    NASA Astrophysics Data System (ADS)

    Dawson, Paula

    2008-02-01

    The Luminous Presence project examines the use of standard film language in the framing, angle, and points of view of holographic subjects through eight digital holographic stereograms: seven at 25 x 25 cm (Hail, Water, Rain, Snow, Sun, Text, Imprint) and one at 1.5 x 1 m (Luminous Presences). However, before embarking on a discussion of how filmic language can be used in digital holograms it is first important to explain why this line of investigation could be fruitful. Undoubtedly several of the compositional practices which sprung up and evolved throughout the development of the diverse forms of the holographic medium have contributed to a unique hologram pictorial language; however, it is well known that the reading of visual imagery of any type relies a great deal on the viewer's knowledge and experience of other images. The lens-recorded imagery of film is a far more familiar language than that of holograms, and the correlation between certain filmic pictorial conventions and emotional responses is well documented and understood. In short, the language of film contains a highly nuanced vocabulary of shot types and lens types (which may be criticised as formulaic) yet is effective in lending emotion to figures.

  8. A satellite remote sensing technique for geological structure horizon mapping

    SciTech Connect

    Fraser, A.; Huggins, P.; Rees, J.

    1996-08-01

    A satellite remote sensing technique is demonstrated for generating near surface geological structure data. This technique enables the screening of large areas and the targeting of seismic acquisition during hydrocarbon exploration. This is of particular advantage in terrains where surveying is logistically difficult. Landsat Thematic Mapper (TM) data and a high resolution Digital Elevation Model (DEM) are used to identify and map outcropping horizons. These are used to reconstruct the near surface structure. The technique is applied in Central Yemen, which is characterised by 'layer-cake' geology and low-dipping terrain. The results are validated using 2D seismic data. The near surface map images faults and structure not apparent in the raw data. Comparison with the structure map generated from 2D seismic data indicates very good structural and fault correlation. The near surface map successfully highlights areas of potential closure at reservoir depths.

  9. Mapping the climate: guidance on appropriate techniques to map climate variables and their uncertainty

    NASA Astrophysics Data System (ADS)

    Kaye, N. R.; Hartley, A.; Hemming, D.

    2012-02-01

    Maps are a crucial asset in communicating climate science to a diverse audience, and there is a wealth of software available to analyse and visualise climate information. However, this availability makes it easy to create poor maps as users often lack an underlying cartographic knowledge. Unlike traditional cartography, where many known standards allow maps to be interpreted easily, there is no standard mapping approach used to represent uncertainty (in climate or other information). Consequently, a wide range of techniques have been applied for this purpose, and users may spend unnecessary time trying to understand the mapping approach rather than interpreting the information presented. Furthermore, communicating and visualising uncertainties in climate data and climate change projections, using for example ensemble based approaches, presents additional challenges for mapping that require careful consideration. The aim of this paper is to provide background information and guidance on suitable techniques for mapping climate variables, including uncertainty. We assess a range of existing and novel techniques for mapping variables and uncertainties, comparing "intrinsic" approaches that use colour in much the same way as conventional thematic maps with "extrinsic" approaches that incorporate additional geometry such as points or features. Using cartographic knowledge and lessons learned from mapping in different disciplines we propose the following 6 general mapping guidelines to develop a suitable mapping technique that represents both magnitude and uncertainty in climate data:

    - use a sensible sequential or diverging colour scheme;

    - use appropriate colour symbolism if it is applicable;

    - ensure the map is usable by colour blind people;

    - use a data classification scheme that does not misrepresent the data;

    - use a map

  10. Map Your Way to Speech Success! Employing Mind Mapping as a Speech Preparation Technique

    ERIC Educational Resources Information Center

    Paxman, Christina G.

    2011-01-01

    Mind mapping has gained considerable credibility recently in corporations such as Boeing and Nabisco, as well as in the classroom in terms of preparing for examinations and preparing for speeches. A mind map is a graphic technique for organizing an individual's thoughts and other information. It harnesses the full range of cortical skills--word,…

  11. Mapping techniques for aligning sulci across multiple brains

    PubMed Central

    Tosun, Duygu; Rettmann, Maryam E.; Prince, Jerry L.

    2015-01-01

    Visualization and mapping of function on the cortical surface is difficult because of its sulcal and gyral convolutions. Methods to unfold and flatten the cortical surface for visualization and measurement have been described in the literature. This makes visualization and measurement possible, but comparison across multiple subjects is still difficult because of the lack of a standard mapping technique. In this paper, we describe two methods that map each hemisphere of the cortex to a portion of a sphere in a standard way. To quantify how accurately the geometric features of the cortex – i.e., sulci and gyri – are mapped into the same location, sulcal alignment across multiple brains is analyzed, and probabilistic maps for different sulcal regions are generated to be used in automatic labelling of segmented sulcal regions. PMID:15450224

  12. Automated thermal mapping techniques using chromatic image analysis

    NASA Technical Reports Server (NTRS)

    Buck, Gregory M.

    1989-01-01

    Thermal imaging techniques are introduced using a chromatic image analysis system and temperature sensitive coatings. These techniques are used for thermal mapping and surface heat transfer measurements on aerothermodynamic test models in hypersonic wind tunnels. Measurements are made on complex vehicle configurations in a timely manner and at minimal expense. The image analysis system uses separate wavelength filtered images to analyze surface spectral intensity data. The system was initially developed for quantitative surface temperature mapping using two-color thermographic phosphors but was found useful in interpreting phase change paint and liquid crystal data as well.
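
    A minimal sketch of the two-color intensity-ratio idea behind such measurements, assuming a monotonic calibration table relating the ratio of the wavelength-filtered images to surface temperature; the calibration table itself is hypothetical.

        import numpy as np

        def temperature_from_ratio(img_color1, img_color2, calib_ratio, calib_temp):
            """Per-pixel temperature map from two wavelength-filtered phosphor images.
            calib_ratio must be increasing; calib_temp gives the temperature measured
            at each calibration ratio (a hypothetical lookup table)."""
            ratio = np.asarray(img_color1, dtype=float) / np.asarray(img_color2, dtype=float)
            return np.interp(ratio, calib_ratio, calib_temp)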

  13. Comparison of the Reliability and Validity of Scores from Two Concept-Mapping Techniques.

    ERIC Educational Resources Information Center

    Ruiz-Primo, Maria Araceli; Schultz, Susan E.; Li, Min; Shavelson, Richard J.

    2001-01-01

    Reports the results of a study that compared two concept-mapping techniques, one high-directed, "fill-in-the-map" and one low-directed, "construct-a-map-from-scratch". Examines whether: (1) skeleton map scores were sensitive to the sample; (2) the two types of skeleton maps were equivalent; and (3) the two mapping techniques provided similar…

  14. Extended mapping and characteristics techniques for inverse aerodynamic design

    NASA Technical Reports Server (NTRS)

    Sobieczky, H.; Qian, Y. J.

    1991-01-01

    Some ideas for using hodograph theory, mapping techniques and methods of characteristics to formulate typical aerodynamic design boundary value problems are developed. The inverse method of characteristics is shown to be a fast tool for design of transonic flow elements as well as supersonic flows with given shock waves.

  15. Uncertainty in mapping urban air quality using crowdsourcing techniques

    NASA Astrophysics Data System (ADS)

    Schneider, Philipp; Castell, Nuria; Lahoz, William; Bartonova, Alena

    2016-04-01

    Small and low-cost sensors measuring various air pollutants have become available in recent years owing to advances in sensor technology. Such sensors have significant potential for improving high-resolution mapping of air quality in the urban environment as they can be deployed in comparatively large numbers and therefore are able to provide information at unprecedented spatial detail. However, such sensor devices are subject to significant and currently little understood uncertainties that affect their usability. Not only do these devices exhibit random errors and biases of occasionally substantial magnitudes, but these errors may also shift over time. In addition, there often tends to be significant inter-sensor variability even when supposedly identical sensors from the same manufacturer are used. We need to quantify accurately these uncertainties to make proper use of the information they provide. Furthermore, when making use of the data and producing derived products such as maps, the measurement uncertainties that propagate throughout the analysis need to be clearly communicated to the scientific and non-scientific users of the map products. Based on recent experiences within the EU-funded projects CITI-SENSE and hackAIR we discuss the uncertainties along the entire processing chain when using crowdsourcing techniques for mapping urban air quality. Starting with the uncertainties exhibited by the sensors themselves, we present ways of quantifying the error characteristics of a network of low-cost microsensors and show suitable statistical metrics for summarizing them. Subsequently, we briefly present a data-fusion-based method for mapping air quality in the urban environment and illustrate how we propagate the uncertainties of the individual sensors throughout the mapping system, resulting in detailed maps that document the pixel-level uncertainty for each concentration field. Finally, we present methods for communicating the resulting spatial uncertainty
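
    One common way to propagate per-sensor uncertainties into a fused map value is inverse-variance weighting; the sketch below is a generic illustration under that assumption, not the CITI-SENSE or hackAIR processing chain.

        import numpy as np

        def fuse_observations(values, sigmas):
            """Inverse-variance weighted mean of several sensor readings for one map
            pixel, with the propagated standard deviation of the fused estimate."""
            w = 1.0 / np.asarray(sigmas, dtype=float) ** 2
            fused = np.sum(w * np.asarray(values, dtype=float)) / np.sum(w)
            return fused, np.sqrt(1.0 / np.sum(w))

        # e.g. three sensors reporting 42, 55 and 48 ug/m3 with different uncertainties
        print(fuse_observations([42.0, 55.0, 48.0], [10.0, 20.0, 5.0]))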

  16. Geological mapping in northwestern Saudi Arabia using LANDSAT multispectral techniques

    NASA Technical Reports Server (NTRS)

    Blodget, H. W.; Brown, G. F.; Moik, J. G.

    1975-01-01

    Various computer enhancement and data extraction systems using LANDSAT data were assessed and used to complement a continuing geologic mapping program. Interactive digital classification techniques using both the maximum-likelihood and thresholding statistical approaches achieve very limited success in areas of highly dissected terrain. Computer enhanced imagery developed by color compositing stretched MSS ratio data was constructed for a test site in northwestern Saudi Arabia. Initial results indicate that several igneous and sedimentary rock-types can be discriminated.

  17. Geological mapping in northwestern Saudi Arabia using LANDSAT multispectral techniques

    NASA Technical Reports Server (NTRS)

    Blodget, H. W.; Brown, G. F.; Moik, J. G.

    1975-01-01

    Various computer enhancement and data extraction systems using LANDSAT data were assessed and used to complement a continuing geologic mapping program. Interactive digital classification techniques using both the parallel-piped and maximum-likelihood statistical approaches achieve very limited success in areas of highly dissected terrain. Computer enhanced imagery developed by color compositing stretched MSS ratio data was constructed for a test site in northwestern Saudi Arabia. Initial results indicate that several igneous and sedimentary rock types can be discriminated.

  18. Luminous efficiency functions at higher intensities

    NASA Astrophysics Data System (ADS)

    Harrington, Lawrence Kent

    Two psychophysical measurement techniques, flicker photometry and successive heterochromatic brightness matching, were used to measure changes in luminous efficiency functions with increasing levels of light adaptation. Both measurement techniques were performed using the same optical system and the same seven healthy adults as subjects. Measurements were taken at four reference stimulus intensities: 1, 10, 100 and 1000 foot-lamberts. Luminous efficiency was found to depend on both the technique and the reference stimulus intensity with which the measurements were taken. For heterochromatic brightness matching, luminous efficiency increased for longer wavelengths as reference intensity increased. Peak luminous efficiency shifted from approximately 540 nm to greater than 600 nm with increasing intensity for all seven subjects. Peak luminous efficiency was constant for flicker photometry across all intensities, but the function narrowed slightly at 100 foot-lamberts.

  19. Strain mapping accuracy improvement using super-resolution techniques.

    PubMed

    Bárcena-González, G; Guerrero-Lebrero, M P; Guerrero, E; Fernández-Reyes, D; González, D; Mayoral, A; Utrilla, A D; Ulloa, J M; Galindo, P L

    2016-04-01

    Super-resolution (SR) software-based techniques aim at generating a final image by combining several noisy frames with lower resolution from the same scene. A comparative study on high-resolution high-angle annular dark field images of InAs/GaAs QDs has been carried out in order to evaluate the performance of the SR technique. The obtained SR images present enhanced resolution and a higher signal-to-noise ratio (SNR) and sharpness compared with the experimental images. In addition, SR is also applied in the field of strain analysis using digital image processing applications such as geometrical phase analysis and peak pairs analysis. The precision of the strain mappings can be improved when SR methodologies are applied to experimental images. PMID:26501744

  20. Applying field mapping refractive beam shapers to improve holographic techniques

    NASA Astrophysics Data System (ADS)

    Laskin, Alexander; Williams, Gavin; McWilliam, Richard; Laskin, Vadim

    2012-03-01

    The performance of various holographic techniques can be substantially improved by homogenizing the intensity profile of the laser beam using beam shaping optics, for example achromatic field mapping refractive beam shapers like πShaper. The operational principle of these devices presumes transformation of the laser beam intensity from a Gaussian to a flattop profile with high flatness of the output wavefront, preservation of beam consistency, a collimated output beam of low divergence, high transmittance, extended depth of field, and negligible residual wave aberration, while an achromatic design provides the capability to work with several laser sources of different wavelengths simultaneously. Applying these beam shapers brings serious benefits to Spatial Light Modulator based techniques like Computer Generated Holography or Dot-Matrix mastering of security holograms, since uniform illumination of an SLM allows simplifying mathematical calculations and increasing the predictability and reliability of the imaging results. Another example is multicolour Denisyuk holography, where the achromatic πShaper provides uniform illumination of a field at various wavelengths simultaneously. This paper will describe some design basics of the field mapping refractive beam shapers and optical layouts for applying them in holographic systems. Examples of real implementations and experimental results will be presented as well.

  1. Duodenal luminal nutrient sensing

    PubMed Central

    Rønnestad, Ivar; Akiba, Yasutada; Kaji, Izumi; Kaunitz, Jonathan D

    2016-01-01

    The gastrointestinal mucosa is exposed to numerous chemical substances and microorganisms, including macronutrients, micronutrients, bacteria, endogenous ions, and proteins. The regulation of mucosal protection, digestion, absorption and motility is signaled in part by luminal solutes. Therefore, luminal chemosensing is an important mechanism enabling the mucosa to monitor luminal conditions, such as pH, ion concentrations, nutrient quantity, and microflora. The duodenal mucosa shares luminal nutrient receptors with lingual taste receptors in order to detect the five basic tastes, in addition to essential nutrients and unwanted chemicals. The recent 'de-orphanization' of nutrient sensing G protein-coupled receptors provides an essential component of the mechanism by which the mucosa senses luminal nutrients. In this review, we update the mechanisms of luminal nutrient sensing and their underlying physiological and pathological roles, with a main focus on the duodenal mucosa. PMID:25113991

  2. Applications of Mapping and Tomographic Techniques in Gem Sciences

    NASA Astrophysics Data System (ADS)

    Shen, A. H.

    2014-12-01

    Gem Sciences are scientific studies of gemstones - their genesis, provenance, synthesis, enhancement, treatment and identification. As high quality forms of specific minerals, gemstones exhibit unusual physical properties that are usually unseen in their regular counterparts. Most gemstones are colored by trace elements incorporated in the crystal lattice during various growth stages, forming coloration zones of various scales. Studying the spectral and chemical contrast across color zones helps elucidate the origins of colors. This is done with UV-visible spectrometers equipped with a microscope, and with LA-ICPMS, in modern gemological laboratories. In the case of diamonds, the colored zones arise from various structural defects incorporated in different growth zones and are studied with FTIR spectrometers fitted with an IR microscope and with laser photoluminescence spectrometers. Advancement in modern synthetic techniques such as chemical vapor deposition (CVD) has created some problems for identification. Some exploratory experiments in carbon isotope mapping were done on diamonds using SIMS. The most important issue with pearls is to determine whether a particular pearl is cultured or natural; the price difference can be enormous. The classical way of making such an identification is by x-ray radiographs, which clearly show the bead and the nacre. Modern cultured pearl advancement has eliminated the need for an artificial bead, using a small piece of tissue instead. Nowadays, computer x-ray tomography (CT) scanning devices are used to depict a clear image of the interior of a pearl. In the Chinese jade market, filling fissures with epoxy and/or wax is very commonly seen. We are currently exploring the Magnetic Resonance Imaging (MRI) technique to map the distribution of artificial resin within polycrystalline aggregates.

  3. Cognitive Mapping Techniques: Implications for Research in Engineering and Technology Education

    ERIC Educational Resources Information Center

    Dixon, Raymond A.; Lammi, Matthew

    2014-01-01

    The primary goal of this paper is to present the theoretical basis and application of two types of cognitive maps, concept map and mind map, and explain how they can be used by educational researchers in engineering design research. Cognitive mapping techniques can be useful to researchers as they study students' problem solving strategies…

  4. Gluing for Raman lidar systems using the lamp mapping technique.

    PubMed

    Walker, Monique; Venable, Demetrius; Whiteman, David N

    2014-12-20

    In the context of combined analog and photon counting (PC) data acquisition in a Lidar system, glue coefficients are defined as constants used for converting an analog signal into a virtual PC signal. The coefficients are typically calculated using Lidar profile data taken under clear, nighttime conditions since, in the presence of clouds or high solar background, it is difficult to obtain accurate glue coefficients from Lidar backscattered data. Here we introduce a new method in which we use the lamp mapping technique (LMT) to determine glue coefficients in a manner that does not require atmospheric profiles to be acquired and permits accurate glue coefficients to be calculated when adequate Lidar profile data are not available. The LMT involves scanning a halogen lamp over the aperture of a Lidar receiver telescope such that the optical efficiency of the entire detection system is characterized. The studies shown here involve two Raman lidar systems; the first from Howard University and the second from NASA/Goddard Space Flight Center. The glue coefficients determined using the LMT and the Lidar backscattered method agreed within 1.2% for the water vapor channel and within 2.5% for the nitrogen channel for both Lidar systems. We believe this to be the first instance of the use of laboratory techniques for determining the glue coefficients for Lidar data analysis. PMID:25608203
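
    For context, glue coefficients are conventionally obtained from a linear fit between the analog and photon-counting signals over the range where the counting channel is still linear; the sketch below illustrates that conventional fit (the count-rate thresholds are illustrative), which is the step the LMT replaces with a lamp-based characterization of the detection system.

        import numpy as np

        def glue_coefficients(analog, photon_counts, pc_min=0.5, pc_max=10.0):
            """Fit virtual_PC = a * analog + b over the range where the photon-counting
            signal (e.g. in MHz) is above the noise floor and below saturation."""
            analog = np.asarray(analog, dtype=float)
            pc = np.asarray(photon_counts, dtype=float)
            mask = (pc > pc_min) & (pc < pc_max)
            a, b = np.polyfit(analog[mask], pc[mask], 1)
            return a, b

        def glue(analog, photon_counts, a, b, pc_max=10.0):
            """Use the photon-counting signal where it is unsaturated and the scaled
            analog signal (the virtual photon-counting signal) elsewhere."""
            virtual_pc = a * np.asarray(analog, dtype=float) + b
            return np.where(np.asarray(photon_counts, dtype=float) < pc_max,
                            photon_counts, virtual_pc)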

  5. Large areas elemental mapping by ion beam analysis techniques

    NASA Astrophysics Data System (ADS)

    Silva, T. F.; Rodrigues, C. L.; Curado, J. F.; Allegro, P.; Moro, M. V.; Campos, P. H. O. V.; Santos, S. B.; Kajiya, E. A. M.; Rizzutto, M. A.; Added, N.; Tabacniks, M. H.

    2015-07-01

    The external beam line of the Laboratory for Material Analysis with Ion Beams (LAMFI) is a versatile setup for multi-technique analysis. X-ray detectors for Particle Induced X-ray Emission (PIXE) measurements, a gamma-ray detector for Particle Induced Gamma-ray Emission (PIGE), and a particle detector for scattering analysis, such as Rutherford Backscattering Spectrometry (RBS), were already installed. In this work, we present some results using a large (60-cm range) XYZ computer-controlled sample positioning system, completely developed and built in our laboratory. The XYZ stage was installed at the external beam line, and its high spatial resolution (better than 5 μm over the full range) enables positioning the sample with high accuracy and high reproducibility. The combination of a sub-millimeter beam with the large-range XYZ robotic stage is being used to produce elemental maps of large areas in samples like paintings, ceramics, stones, fossils, and all sorts of samples. Due to its particular characteristics, this is a unique device for multi-technique analysis of large areas. With the continuous development of the external beam line at LAMFI, coupled to the robotic XYZ stage, it is becoming a robust and reliable option for regular analysis of trace elements (Z > 5), competing with traditional in-vacuum ion beam analysis with the advantage of automatic rastering.

  6. Interaction-Point Phase-Space Characterization using Single-Beam and Luminous-Region Measurements at PEP-II

    SciTech Connect

    Kozanecki, W.; Bevan, A. J.; Viaud, B. F.; Cai, Y.; Fisher, A. S.; O'Grady, C.; Lindquist, B.; Roodman, A.; Thompson, J. M.; Weaver, M.; /SLAC

    2008-09-09

    We present an extensive experimental characterization of the e± phase space at the interaction point of the SLAC PEP-II B-Factory that combines a detailed mapping of luminous-region observables using the BABAR detector with stored-beam measurements by accelerator techniques.

  7. Hydrocarbon microseepage mapping using signature based target detection techniques

    NASA Astrophysics Data System (ADS)

    Soydan, Hilal; Koz, Alper; Şebnem Düzgün, H.; Aydin Alatan, A.

    2015-10-01

    In this paper, we compare conventional methods for detecting hydrocarbon seepage anomalies with signature-based detection algorithms. The Crosta technique [1] is selected as the baseline for the experimental comparisons with the conventional approach. The Crosta technique utilizes the characteristic bands of the searched target for principal component transformation in order to determine the components characterizing the target of interest. The Desired Target Detection and Classification Algorithm (DTDCA), Spectral Matched Filter (SMF), and Normalized Correlation (NC) are employed for signature-based target detection. Signature-based target detection algorithms are applied to the whole spectrum, benefiting from the information stored in all spectral bands. The selected methods are applied to a multispectral Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) image of the study region, with an atmospheric correction prior to the application of the algorithms. ASTER provides multispectral bands covering the visible, shortwave, and thermal infrared regions, which serves as a useful tool for the interpretation of areas with hydrocarbon anomalies. The exploration area is the Gemrik Anticline, located in South East Anatolia, Adıyaman, in the Bozova Oil Field, where microseeps can be observed with almost no vegetation cover. The spectral signatures collected with an Analytical Spectral Devices Inc. (ASD) spectrometer from the reference valley [2] have been utilized as an input to the signature-based detection algorithms. The experiments have indicated that DTDCA and SMF outperform the Crosta technique by locating the microseepage patterns along the migration pathways with better contrast. On the other hand, NC has not been able to map the searched target with a visible distinction. It is concluded that the signature-based algorithms can be more effective than the conventional methods for the detection of microseepage-induced anomalies.
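
    The spectral matched filter named above has a standard closed form: the per-pixel score is the background-whitened projection of the pixel spectrum onto the target signature. The sketch below is a generic SMF under the usual assumptions (global background mean and covariance), not the authors' processing chain; the cube layout and the resampled ASD signature are assumptions.

```python
import numpy as np

def spectral_matched_filter(cube, target):
    """Normalized spectral matched filter (SMF) score for every pixel.

    cube   : reflectance array of shape (rows, cols, bands)
    target : reference signature of shape (bands,), e.g. an ASD field
             spectrum resampled to the sensor bands
    """
    rows, cols, bands = cube.shape
    X = cube.reshape(-1, bands).astype(float)
    mu = X.mean(axis=0)                         # background mean spectrum
    cov_inv = np.linalg.pinv(np.cov(X, rowvar=False))
    d = np.asarray(target, float) - mu
    w = cov_inv @ d / (d @ cov_inv @ d)         # normalized filter weights
    return ((X - mu) @ w).reshape(rows, cols)   # high score = target-like pixel
```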

  8. Semantic Data And Visualization Techniques Applied To Geologic Field Mapping

    NASA Astrophysics Data System (ADS)

    Houser, P. I. Q.; Royo-Leon, M.; Munoz, R.; Estrada, E.; Villanueva-Rosales, N.; Pennington, D. D.

    2015-12-01

    Geologic field mapping involves the use of technology before, during, and after visiting a site. Geologists utilize hardware such as Global Positioning Systems (GPS) connected to mobile computing platforms such as tablets that include software such as ESRI's ArcPad and other software to produce maps and figures for a final analysis and report. Hand written field notes contain important information and drawings or sketches of specific areas within the field study. Our goal is to collect and geo-tag final and raw field data into a cyber-infrastructure environment with an ontology that allows for large data processing, visualization, sharing, and searching, aiding in connecting field research with prior research in the same area and/or aid with experiment replication. Online searches of a specific field area return results such as weather data from NOAA and QuakeML seismic data from USGS. These results that can then be saved to a field mobile device and searched while in the field where there is no Internet connection. To accomplish this we created the GeoField ontology service using the Web Ontology Language (OWL) and Protégé software. Advanced queries on the dataset can be made using reasoning capabilities can be supported that go beyond a standard database service. These improvements include the automated discovery of data relevant to a specific field site and visualization techniques aimed at enhancing analysis and collaboration while in the field by draping data over mobile views of the site using augmented reality. A case study is being performed at University of Texas at El Paso's Indio Mountains Research Station located near Van Horn, Texas, an active multi-disciplinary field study site. The user can interactively move the camera around the study site and view their data digitally. Geologist's can check their data against the site in real-time and improve collaboration with another person as both parties have the same interactive view of the data.
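
    As a toy illustration of the storage-and-query pattern described above (not the actual GeoField ontology, whose terms are not given in the abstract), the fragment below builds a tiny RDF graph with rdflib and runs a SPARQL query over it; the namespace, properties, and coordinates are hypothetical.

```python
from rdflib import Graph, Namespace, Literal, URIRef
from rdflib.namespace import RDF

GEO = Namespace("http://example.org/geofield#")   # hypothetical namespace

g = Graph()
site = URIRef("http://example.org/sites/indio-mountains")
g.add((site, RDF.type, GEO.FieldSite))
g.add((site, GEO.hasNote, Literal("Fault gouge exposed in trench 3")))
g.add((site, GEO.latitude, Literal(30.78)))        # illustrative coordinates
g.add((site, GEO.longitude, Literal(-104.77)))

# SPARQL query: every geo-tagged note attached to a field site
for row in g.query("""
        PREFIX geo: <http://example.org/geofield#>
        SELECT ?site ?note ?lat ?lon WHERE {
            ?site a geo:FieldSite ;
                  geo:hasNote ?note ;
                  geo:latitude ?lat ;
                  geo:longitude ?lon .
        }"""):
    print(row.site, row.note, row.lat, row.lon)
```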

  9. Phase reversal technique decreases cortical stimulation time during motor mapping.

    PubMed

    Simon, Mirela V; Sheth, Sameer A; Eckhardt, Christine A; Kilbride, Ronan D; Braver, Diana; Williams, Ziv; Curry, William; Cahill, Dan; Eskandar, Emad N

    2014-06-01

    Neurophysiologic mapping of the primary motor cortex (PMC) is commonly used in supratentorial surgery. Electrical cortical stimulation is guided by anatomic landmarks towards the precentral gyrus, with recording of the triggered primary motor responses (TPMR) in the contralateral hemibody. Thus, factors such as distortion of the pericentral anatomy, small surgical fields, brain shifts, and miscalibrated neuronavigational systems may lengthen the process and result in unnecessary stimulations, increasing the probability of triggering seizures. We hypothesized that central sulcus localization via the median somatosensory evoked potentials phase reversal technique (MSSEP PRT) accurately guides the surgeon, resulting in prompt identification of the PMC with minimal electrical stimulation. Multivariate Cox regression was used to study the impact of MSSEP PRT on the time spent performing electrical cortical stimulation to TPMR. The analysis was adjusted for presence of increased cortical excitability, high motor thresholds, lesions close to the PMC, and fMRI data, in 100 consecutive standardized motor mapping procedures for brain tumor resection and epilepsy surgery. Phase reversal and a change in morphology of the recorded somatosensory evoked potentials quadrupled (hazard ratio [HR] 4.13, p<0.0001) and doubled (HR 2.14, p=0.02) the rate of obtaining TPMR, respectively. A 1 mA increase in motor threshold decreased the rate by 9% (HR 0.91, p=0.0002). Afterdischarges triggered before TPMR and lesions in close proximity to the PMC decreased the rate of TPMR by 76% (HR 0.23, p<0.0001) and 48% (HR 0.52, p=0.04), respectively. Informative PRT decreases stimulation time. Afterdischarges triggered before TPMR, high motor thresholds, and lesions close to the PMC increase it. PMID:24679940

  10. Optical coherence tomography investigations of ceramic lumineers

    NASA Astrophysics Data System (ADS)

    Fernandes, Luana O.; Graça, Natalia D. R. L.; Melo, Luciana S. A.; Silva, Claudio H. V.; Gomes, Anderson S. L.

    2016-02-01

    Lumineers are veneer laminates used as an alternative for aesthetic dental solutions of the highest quality, but the only current means of assessing their performance is visual inspection. The objective of this study was to use the Optical Coherence Tomography (OCT) technique, working in the spectral domain, to analyze in vivo, in a single patient, 14 lumineers 180 days after cementation. It was possible to observe various kinds of changes in the cementing line and the laminate. It was concluded that OCT is an effective and promising method for clinical evaluation of the cementing line in lumineers.

  11. Digital Mapping Techniques '09-Workshop Proceedings, Morgantown, West Virginia, May 10-13, 2009

    USGS Publications Warehouse

    Soller, David R.

    2011-01-01

    As in the previous years' meetings, the objective was to foster informal discussion and exchange of technical information, principally in order to develop more efficient methods for digital mapping, cartography, GIS analysis, and information management. At this meeting, oral and poster presentations and special discussion sessions emphasized (1) methods for creating and publishing map products (here, "publishing" includes Web-based release); (2) field data capture software and techniques, including the use of LiDAR; (3) digital cartographic techniques; (4) migration of digital maps into ArcGIS Geodatabase format; (5) analytical GIS techniques; and (6) continued development of the National Geologic Map Database.

  12. Techniques, problems and uses of mega-geomorphological mapping

    NASA Technical Reports Server (NTRS)

    Embleton, C.

    1985-01-01

    A plea for a program of global geomorphological mapping based on remote sensing data is presented. It is argued that the program is a necessary step in bringing together the rapidly evolving concepts of plate tectonics with the science of geomorphology. Geomorphologists are urged to bring temporal scales into their subject and to abandon their recent isolation from tectonics and geological history. It is suggested that a start be made with a new geomorphological map of Europe, utilizing the latest space technology.

  13. A close-range photogrammetric technique for mapping neotectonic features in trenches

    USGS Publications Warehouse

    Fairer, G.M.; Whitney, J.W.; Coe, J.A.

    1989-01-01

    Close-range photogrammetric techniques and newly available computerized plotting equipment were used to map exploratory trench walls that expose Quaternary faults in the vicinity of Yucca Mountain, Nevada. Small-scale structural, lithologic, and stratigraphic features can be rapidly mapped by the photogrammetric method. This method is more accurate and significantly more rapid than conventional trench-mapping methods, and the analytical plotter is capable of producing cartographic definition of high resolution when detailed trench maps are necessary. -from Authors

  14. Tumor mapping: new rationales, adjusted techniques, expanded applications.

    PubMed

    Cinberg, J Z; Katlein, S B; Vogl, S E

    1983-10-01

    Eleven consecutive patients with cancers of the oropharynx (4), hypopharynx (4), and oral cavity (3) were endoscoped and "tumor mapped" with a modified tattoo solution before beginning non-surgical antineoplastic therapy. The tattooed outlines were clearly visible at 7 weeks (5 patients after induction chemotherapy); at 14 weeks (2 patients after induction chemotherapy and radiotherapy); and between 12 and 16 weeks (4 patients after radiotherapy). The "tumor mapping" aided both the establishment of appropriate resection margins in cancers that had diminished in response to non-surgical therapy and the recognition of a tumor's lack of response to nonsurgical antineoplastic treatment. PMID:6194395

  15. Boson mapping techniques applied to constant gauge fields in QCD

    NASA Technical Reports Server (NTRS)

    Hess, Peter Otto; Lopez, J. C.

    1995-01-01

    Pairs of coordinates and derivatives of the constant gluon modes are mapped to new gluon-pair fields and their derivatives. Applying this mapping to the Hamiltonian of constant gluon fields results, for large coupling constants, in an effective Hamiltonian which separates into one part describing a scalar field and another describing a field with spin two. The ground state is dominated by pairs of gluons coupled to color and spin zero, with slight admixtures of color-zero, spin-two pairs. SU(2) was used as the color group.

  16. A comparison of two conformal mapping techniques applied to an aerobrake body

    NASA Technical Reports Server (NTRS)

    Hommel, Mark J.

    1987-01-01

    Conformal mapping is a classical technique which has been utilized for solving problems in aerodynamics and hydrodynamics. Conformal mapping has been successfully applied in the construction of grids around airfoils, engine inlets, and other aircraft configurations. Conformal mapping techniques were applied to an aerobrake body having an axis of symmetry. Two different approaches were utilized: (1) the Karman-Trefftz transformation; and (2) the point-wise Schwarz-Christoffel transformation. In both cases, the aerobrake body was mapped onto a near circle and a grid was generated in the mapped plane. The mapped body and grid were then mapped back into physical space and the properties of the associated grids were examined. Advantages and disadvantages of both approaches are discussed.
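
    For reference, the Karman-Trefftz transformation has a simple closed form; the sketch below shows the classical forward map of an offset circle (n = 2 recovers the Joukowski transform). The paper works in the opposite direction, mapping the aerobrake body to a near circle, and does not give its parameters, so the values here are purely illustrative.

```python
import numpy as np

def karman_trefftz(zeta, b=1.0, n=1.94):
    """Classical Karman-Trefftz conformal map from the circle plane to the
    physical plane.  n = 2 - delta/pi sets the trailing-edge angle delta;
    n = 2 reduces to the Joukowski transform.  The ratio form keeps the
    principal branch continuous for circles enclosing both +b and -b."""
    f = ((zeta - b) / (zeta + b)) ** n
    return n * b * (1.0 + f) / (1.0 - f)

# Image of an offset circle: a smooth, airfoil-like closed contour
theta = np.linspace(0.0, 2.0 * np.pi, 400)
center = -0.08 + 0.08j                  # offset controls thickness and camber
radius = 1.02 * abs(center - 1.0)       # circle slightly encloses zeta = +b (rounded edge)
zeta = center + radius * np.exp(1j * theta)
body = karman_trefftz(zeta)             # complex points of the mapped contour
```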

  17. Simulating and mapping spatial complexity using multi-scale techniques

    USGS Publications Warehouse

    De Cola, L.

    1994-01-01

    A central problem in spatial analysis is the mapping of data for complex spatial fields using relatively simple data structures, such as those of a conventional GIS. This complexity can be measured using such indices as multi-scale variance, which reflects spatial autocorrelation, and multi-fractal dimension, which characterizes the values of fields. These indices are computed for three spatial processes: Gaussian noise, a simple mathematical function, and data for a random walk. Fractal analysis is then used to produce a vegetation map of the central region of California based on a satellite image. This analysis suggests that real world data lie on a continuum between the simple and the random, and that a major GIS challenge is the scientific representation and understanding of rapidly changing multi-scale fields. -Author

  18. Knowledge mapping as a technique to support knowledge translation.

    PubMed Central

    Ebener, S.; Khan, A.; Shademani, R.; Compernolle, L.; Beltran, M.; Lansang, Ma; Lippman, M.

    2006-01-01

    This paper explores the possibility of integrating knowledge mapping into a conceptual framework that could serve as a tool for understanding the many complex processes, resources and people involved in a health system, and for identifying potential gaps within knowledge translation processes in order to address them. After defining knowledge mapping, this paper presents various examples of the application of this process in health, before looking at the steps that need to be taken to identify potential gaps, to determine to what extent these gaps affect the knowledge translation process and to establish their cause. This is followed by proposals for interventions aimed at strengthening the overall process. Finally, potential limitations on the application of this framework at the country level are addressed. PMID:16917651

  19. Research Techniques Made Simple: Immunofluorescence Antigen Mapping in Epidermolysis Bullosa.

    PubMed

    Has, Cristina; He, Yinghong

    2016-07-01

    Inherited epidermolysis bullosa is a group of genetic blistering diseases with a broad spectrum of clinical severity and molecular defects. Epidermolysis bullosa results from mutations in genes encoding proteins involved in cell-cell and cell-matrix adhesion in the epidermis. Immunofluorescence antigen mapping makes use of monoclonal antibodies against proteins of the dermal-epidermal junction zone to determine the layer of skin where cleavage occurs and the relative protein abundance. It allows the diagnosis of the type and subtype of inherited epidermolysis bullosa and sheds light on molecular mechanisms underlying the disease. Immunofluorescence mapping steps include obtaining a skin biopsy sample, processing the biopsy material, antigen-antibody interaction on tissue, washing, incubation with fluorescently conjugated secondary antibodies, mounting, observation under a fluorescence microscope, and interpretation. A minimal antibody panel allows discrimination of the main epidermolysis bullosa subtypes. Extended panels can be used depending on the diagnostic or scientific question to be addressed. Immunofluorescence mapping contributed to significant progress in understanding epidermolysis bullosa, including identification of new underlying genetic mutations, mutation mechanisms, and the presence of revertant mosaicism. It is also an important tool in the assessment of the efficacy of experimental therapeutic approaches. PMID:27342035

  20. Mapping Diffuse Seismicity Using Empirical Matched Field Processing Techniques

    SciTech Connect

    Wang, J; Templeton, D C; Harris, D B

    2011-01-21

    The objective of this project is to detect and locate more microearthquakes using the empirical matched field processing (MFP) method than can be detected using only conventional earthquake detection techniques. We propose that empirical MFP can complement existing catalogs and techniques. We test our method on continuous seismic data collected at the Salton Sea Geothermal Field during November 2009 and January 2010. In the Southern California Earthquake Data Center (SCEDC) earthquake catalog, 619 events were identified in our study area during this time frame, and our MFP technique identified 1094 events. Therefore, we believe that the empirical MFP method combined with conventional methods significantly improves the network detection ability in an efficient manner.

  1. Development of predictive mapping techniques for soil survey and salinity mapping

    NASA Astrophysics Data System (ADS)

    Elnaggar, Abdelhamid A.

    Conventional soil maps represent a valuable source of information about soil characteristics; however, they are subjective, very expensive, and time-consuming to prepare. They also do not include explicit information about the conceptual mental model used in developing them, about their accuracy, or about the error associated with them. Decision tree analysis (DTA) was successfully used in retrieving the expert knowledge embedded in old soil survey data. This knowledge was efficiently used in developing predictive soil maps for the study areas in Benton and Malheur Counties, Oregon, and in assessing their consistency. A soil-landscape model retrieved from a reference area in Harney County was extrapolated to develop a preliminary soil map for the neighboring unmapped part of Malheur County. The developed map had a low prediction accuracy, and only a few soil map units (SMUs) were predicted with significant accuracy, mostly shallow SMUs that either have a lithic contact with the bedrock or developed on a duripan. On the other hand, the soil map developed from field data was predicted with very high accuracy (overall about 97%). Salt-affected areas of the Malheur County study area are indicated by their high spectral reflectance and are easily discriminated in the remote sensing data. However, remote sensing data fail to distinguish between the different classes of soil salinity. Using the DTA method, five classes of soil salinity were successfully predicted with an overall accuracy of about 99%. Moreover, the calculated area of salt-affected soil was overestimated when mapped using remote sensing data compared with that predicted using DTA. Hence, DTA can be a very helpful approach for developing soil survey and soil salinity maps in a more objective, effective, less expensive, and quicker way based on field data.
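
    As a minimal sketch of the DTA step, assuming a training table of predictor layers sampled at surveyed points (the paper's actual predictors and class scheme are not listed in the abstract), a scikit-learn decision tree can be fitted and scored as follows; the random data stand in for real field observations.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Schematic training table: one row per surveyed point.
# Columns (illustrative only): reflectance bands and terrain attributes.
X = np.random.rand(500, 5)                # stand-in predictor values
y = np.random.randint(0, 5, size=500)     # five salinity classes (0-4)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                    random_state=0)
tree = DecisionTreeClassifier(max_depth=6, random_state=0).fit(X_train, y_train)
print("overall accuracy:", accuracy_score(y_test, tree.predict(X_test)))

# The fitted rules can then be applied pixel by pixel to co-registered
# raster layers to produce the predictive salinity map.
```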

  2. Middle Atmosphere Program. Handbook for MAP. Volume 15: Balloon techniques

    NASA Technical Reports Server (NTRS)

    Murcray, D. G. (Editor)

    1985-01-01

    Some techniques employed by investigators using balloons to obtain data on the properties of the middle atmosphere are discussed. Much effort has gone into developing instruments which could be used on small balloons to measure temperature and variable species. These efforts are discussed. Remote sensing techniques used to obtain data on atmospheric composition are described. Measurement of stratospheric ions and stratospheric aerosols are also discussed.

  3. Quantification of Geologic Lineaments by Manual and Machine Processing Techniques. [Landsat satellites - mapping/geological faults

    NASA Technical Reports Server (NTRS)

    Podwysocki, M. H.; Moik, J. G.; Shoup, W. C.

    1975-01-01

    The effect of operator variability and subjectivity in lineament mapping and methods to minimize or eliminate these problems by use of several machine preprocessing methods was studied. Mapped lineaments of a test landmass were used and the results were compared statistically. The total number of fractures mapped by the operators and their average lengths varied considerably, although comparison of lineament directions revealed some consensus. A summary map (785 linears) produced by overlaying the maps generated by the four operators shows that only 0.4 percent were recognized by all four operators, 4.7 percent by three, 17.8 percent by two, and 77 percent by one operator. Similar results were obtained in comparing these results with another independent group. This large amount of variability suggests a need for the standardization of mapping techniques, which might be accomplished by a machine aided procedure. Two methods of machine aided mapping were tested, both simulating directional filters.

  4. THE TOTAL LUMINOUS EFFICIENCY OF LUMINOUS BACTERIA.

    PubMed

    Harvey, E N

    1925-09-18

    Methods are described for measuring the light emitted by an emulsion of luminous bacteria of given thickness, and calculating the light emitted by a single bacterium, measuring 1.1 x 2.2 micra, provided there is no absorption of light in the emulsion. At the same time, the oxygen consumed by a single bacterium was measured by recording the time for the bacteria to use up .9 of the oxygen dissolved in sea water from air (20 per cent oxygen). The luminescence intensity does not diminish until the oxygen concentration falls below 2 per cent, when the luminescence diminishes rapidly. Above 2 per cent oxygen (when the oxygen dissolving in sea water from pure oxygen at 760 mm. Hg pressure = 100 per cent) the bacteria use equal amounts of oxygen in equal times, while below 2 per cent oxygen it seems very likely that rate of oxygen absorption is proportional to oxygen concentration. By measuring the time for a tube of luminous bacteria of known concentration saturated with air (20 per cent oxygen) to begin to darken (2 per cent oxygen) we can calculate the oxygen absorbed by one bacterium per second. The bacteria per cc. are counted on a blood counting slide or by a centrifugal method, after measuring the volume of a single bacterium (1.695 x 10(-12) cc.). Both methods gave results in good agreement with each other. The maximum value for the light from a single bacterium was 24 x 10(-14) lumens or 1.9 x 10(-14) candles. The maximum value for lumen-seconds per mg. of oxygen absorbed was 14. The average value for lumen-seconds per mg. O(2) was 9.25. The maximum values were selected in calculating the efficiency of light production, since some of the bacteria counted may not be producing light, although they may still be using oxygen. The "diet" of the bacteria was 60 per cent glycerol and 40 per cent peptone. To oxidize this mixture each mg. of oxygen would yield 3.38 gm. calories or 14.1 watts per second. 1 lumen per watt is therefore produced by a normal bacterium which

  5. MetricMap: an embedding technique for processing distance-based queries in metric spaces.

    PubMed

    Wang, Jason T L; Wang, Xiong; Shasha, Dennis; Zhang, Kaizhong

    2005-10-01

    In this paper, we present an embedding technique, called MetricMap, which is capable of estimating distances in a pseudometric space. Given a database of objects and a distance function for the objects, which is a pseudometric, we map the objects to vectors in a pseudo-Euclidean space with a reasonably low dimension while preserving the distance between two objects approximately. Such an embedding technique can be used as an approximate oracle to process a broad class of distance-based queries. It is also adaptable to data mining applications such as data clustering and classification. We present the theory underlying MetricMap and conduct experiments to compare MetricMap with other methods including MVP-tree and M-tree in processing the distance-based queries. Experimental results on both protein and RNA data show the good performance and the superiority of MetricMap over the other methods. PMID:16240772
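
    MetricMap's own sampling-based construction is not reproduced here; as a related, well-known illustration of embedding objects into a low-dimensional space while approximately preserving pairwise distances, the sketch below uses classical multidimensional scaling, keeping the sign of each retained eigenvalue so that an indefinite (pseudo-Euclidean) geometry can be represented.

```python
import numpy as np

def classical_mds(D, k=2):
    """Embed n objects with pairwise distance matrix D (n x n) into k dimensions.

    Classical MDS: double-center the squared distances, eigendecompose, and
    scale the leading eigenvectors.  Negative eigenvalues indicate an
    indefinite (pseudo-Euclidean) geometry; their signs are returned
    alongside the coordinates."""
    D = np.asarray(D, dtype=float)
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D ** 2) @ J
    evals, evecs = np.linalg.eigh(B)
    order = np.argsort(-np.abs(evals))[:k]            # strongest components first
    coords = evecs[:, order] * np.sqrt(np.abs(evals[order]))
    return coords, np.sign(evals[order])

# Distances between the embedded points then approximate the original ones
# (exactly, when D is Euclidean and k is large enough).
```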

  6. Digitizing zone maps, using modified LARSYS program. [computer graphics and computer techniques for mapping

    NASA Technical Reports Server (NTRS)

    Giddings, L.; Boston, S.

    1976-01-01

    A method for digitizing zone maps is presented, starting with colored images and producing a final one-channel digitized tape. This method automates the work previously done interactively on the Image-100 and Data Analysis System computers of the Johnson Space Center (JSC) Earth Observations Division (EOD). A color-coded map was digitized through color filters on a scanner to form a digital tape in LARSYS-2 or JSC Universal format. The taped image was classified by the EOD LARSYS program on the basis of training fields included in the image. Numerical values were assigned to all pixels in a given class, and the resulting coded zone map was written on a LARSYS or Universal tape. A unique spatial filter option permitted zones to be made homogeneous and edges of zones to be abrupt transitions from one zone to the next. A zoom option allowed the output image to have arbitrary dimensions in terms of number of lines and number of samples on a line. Printouts of the computer program are given and the images that were digitized are shown.

  7. My Solar System: A Developmentally Adapted Eco-Mapping Technique for Children

    ERIC Educational Resources Information Center

    Curry, Jennifer R.; Fazio-Griffith, Laura J.; Rohr, Shannon N.

    2008-01-01

    Counseling children requires specific skills and techniques, such as play therapy and expressive arts, to address developmental manifestations and to facilitate the understanding of presenting problems. This article outlines an adapted eco-mapping activity that can be used as a creative counseling technique with children in order to promote…

  8. An Electronic Engineering Curriculum Design Based on Concept-Mapping Techniques

    ERIC Educational Resources Information Center

    Toral, S. L.; Martinez-Torres, M. R.; Barrero, F.; Gallardo, S.; Duran, M. J.

    2007-01-01

    Curriculum design is a concern in European Universities as they face the forthcoming European Higher Education Area (EHEA). This process can be eased by the use of scientific tools such as Concept-Mapping Techniques (CMT) that extract and organize the most relevant information from experts' experience using statistics techniques, and helps a…

  9. Feasibility of a borehole VHF radar technique for fracture mapping

    SciTech Connect

    Chang, H.T.

    1984-01-01

    Experiments were conducted to establish the feasibility of a downhole high-frequency electromagnetic technique for location of fractures in the vicinity of boreholes. An existing flame-cut slot in granite was filled with salt water to simulate a brine-filled fracture. A transmitter consisting of a phased dual-dipole array arranged to provide a directional signal toward the fracture was installed in a borehole opposite the fracture. A receiver operated at 30 to 300 MHz was also located in the same borehole. The radar returns from the simulated fracture were detectable in boreholes located at distances of up to 12 meters from the fracture. These results indicate for the first time the feasibility of a downhole VHF radar for use in a single borehole for detection of fractures located away from the borehole.

  10. Digital Mapping Techniques '08—Workshop Proceedings, Moscow, Idaho, May 18–21, 2008

    USGS Publications Warehouse

    Soller, David R.

    2009-01-01

    The Digital Mapping Techniques '08 (DMT'08) workshop was attended by more than 100 technical experts from 40 agencies, universities, and private companies, including representatives from 24 State geological surveys. This year's meeting, the twelfth in the annual series, was hosted by the Idaho Geological Survey, from May 18-21, 2008, on the University of Idaho campus in Moscow, Idaho. Each DMT workshop has been coordinated by the U.S. Geological Survey's National Geologic Map Database Project and the Association of American State Geologists (AASG). As in previous years' meetings, the objective was to foster informal discussion and exchange of technical information, principally in order to develop more efficient methods for digital mapping, cartography, GIS analysis, and information management. At this meeting, oral and poster presentations and special discussion sessions emphasized (1) methods for creating and publishing map products (here, "publishing" includes Web-based release); (2) field data capture software and techniques, including the use of LiDAR; (3) digital cartographic techniques; (4) migration of digital maps into ArcGIS Geodatabase format; (5) analytical GIS techniques; and (6) continued development of the National Geologic Map Database.

  11. Improving predictive mapping of deep-water habitats: Considering multiple model outputs and ensemble techniques

    NASA Astrophysics Data System (ADS)

    Robert, Katleen; Jones, Daniel O. B.; Roberts, J. Murray; Huvenne, Veerle A. I.

    2016-07-01

    In the deep sea, biological data are often sparse; hence, models capturing relationships between observed fauna and environmental variables (acquired via acoustic mapping techniques) are often used to produce full-coverage species assemblage maps. Many statistical modelling techniques are being developed, but there remains a need to determine the most appropriate mapping techniques. Predictive habitat modelling approaches (redundancy analysis, maximum entropy and random forest) were applied to a heterogeneous section of seabed on Rockall Bank, NE Atlantic, for which landscape indices describing the spatial arrangement of habitat patches were calculated. The predictive maps were based on remotely operated vehicle (ROV) imagery transects and high-resolution autonomous underwater vehicle (AUV) sidescan backscatter maps. Area under the curve (AUC) and accuracy indicated similar performances for the three models tested, but performance varied by species assemblage, with the transitional species assemblage showing the weakest predictive performance. Spatial predictions of habitat suitability differed between statistical approaches, but niche similarity metrics showed redundancy analysis and random forest predictions to be most similar. As no single statistical technique could be found to outperform the others when all assemblages were considered, ensemble mapping techniques, where the outputs of many models are combined, were applied. They showed higher accuracy than any single model. Different statistical approaches for predictive habitat modelling possess varied strengths and weaknesses, and by examining the outputs of a range of modelling techniques and their differences, more robust predictions, with better described variation and areas of uncertainty, can be achieved. As improvements to prediction outputs can be achieved without additional costly data collection, ensemble mapping approaches have clear value for spatial management.
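
    The simplest ensemble is an unweighted or performance-weighted mean of the per-model suitability maps; the paper's exact combination rule is not stated in the abstract, so the sketch below should be read as one plausible implementation, with AUC-style weights as an assumption.

```python
import numpy as np

def ensemble_suitability(prediction_maps, weights=None):
    """Combine habitat-suitability maps (values in [0, 1]) from several models
    into one ensemble map by a (weighted) mean.

    prediction_maps : list of 2-D arrays, one per model (e.g. redundancy
                      analysis, maximum entropy, random forest)
    weights         : optional per-model weights, e.g. AUC scores
    """
    stack = np.stack([np.asarray(p, dtype=float) for p in prediction_maps])
    if weights is None:
        return stack.mean(axis=0)
    w = np.asarray(weights, dtype=float)
    return np.tensordot(w / w.sum(), stack, axes=1)

# Example with three dummy model outputs on a 100 x 120 grid
maps = [np.random.rand(100, 120) for _ in range(3)]
combined = ensemble_suitability(maps, weights=[0.82, 0.79, 0.85])  # e.g. AUCs
```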

  12. Criss-cross mapping BD+30 3639: a new kinematic analysis technique

    NASA Astrophysics Data System (ADS)

    Steffen, W.; Tamayo, F.; Koning, N.

    2011-10-01

    We introduce a new method to analyze proper motion data. The method is called "criss-cross" mapping. It emphasizes regions where proper motion vector extensions cross or converge. From a superposition of lines through the vectors, a map is generated which helps to interpret the kinematic data. The new mapping technique is applied to the young planetary nebula BD+30 3639. The data are approximately 200 internal proper motion measurements from Li, Harrington, & Borkowski (2002). From the criss-cross mapping of BD+30 3639, we conclude that the kinematic center is offset by approximately 0.5 arcsec to the south-east of the central star. The mapping also shows evidence for a non-homologous expansion of the nebula that is consistent with a disturbance aligned with the bipolar molecular bullets.
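
    Reading the description literally, a criss-cross map can be built by drawing the full line through each proper-motion vector onto an accumulator grid and looking for pixels crossed by many lines. The sketch below implements that simplified reading; the grid extent and line sampling are arbitrary choices, not taken from the paper.

```python
import numpy as np

def criss_cross_map(positions, velocities, extent=20.0, n=400):
    """Accumulate the straight lines defined by each proper-motion vector on a
    grid; bright regions mark where many line extensions cross (e.g. the
    kinematic centre of an expanding nebula).

    positions  : (N, 2) array of (x, y) offsets in arcsec
    velocities : (N, 2) array of proper-motion components
    extent     : half-width of the square map in arcsec
    n          : grid size in pixels
    """
    image = np.zeros((n, n))
    t = np.linspace(-3.0 * extent, 3.0 * extent, 8 * n)  # dense samples along each line
    for (x0, y0), (vx, vy) in zip(positions, velocities):
        norm = np.hypot(vx, vy)
        if norm == 0.0:
            continue
        xs = x0 + t * vx / norm
        ys = y0 + t * vy / norm
        i = np.round((ys + extent) / (2 * extent) * (n - 1)).astype(int)
        j = np.round((xs + extent) / (2 * extent) * (n - 1)).astype(int)
        ok = (i >= 0) & (i < n) & (j >= 0) & (j < n)
        image[i[ok], j[ok]] += 1.0      # at most one count per line per pixel
    return image
```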

  13. Mapping electrodynamic features of the high-latitude ionosphere from localized observations - Technique

    NASA Technical Reports Server (NTRS)

    Richmond, A. D.; Kamide, Y.

    1988-01-01

    This paper describes a novel procedure for mapping high-latitude electric fields and currents and their associated magnetic variations, using sets of localized observational data derived from different types of measurements. The technique provides a formalism for incorporating simultaneously such different classes of data as electric fields from radars and satellites, electric currents from radars, and magnetic perturbations at the ground and at satellite heights; the technique also uses available statistical information on the averages and variances of electrodynamic fields. The technique provides a more rigorous way of quantitatively estimating high-latitude electric field and current patterns than other methods and has a capability to quantify the errors in the mapped fields, based on the distribution of available data, their errors, and the statistical variances of the fields. The technique is illustrated by an application to a substorm which was analyzed by Kamide et al. (1982) by an earlier technique.

  14. Digital Mapping Techniques '10-Workshop Proceedings, Sacramento, California, May 16-19, 2010

    USGS Publications Warehouse

    Soller, David R.

    2012-01-01

    The Digital Mapping Techniques '10 (DMT'10) workshop was attended by 110 technical experts from 40 agencies, universities, and private companies, including representatives from 19 State geological surveys (see Appendix A). This workshop, hosted by the California Geological Survey, May 16-19, 2010, in Sacramento, California, was similar in nature to the previous 13 meetings (see Appendix B). The meeting was coordinated by the U.S. Geological Survey's (USGS) National Geologic Map Database project. As in the previous meetings, the objective was to foster informal discussion and exchange of technical information. It is with great pleasure that I note that the objective was again successfully met, as attendees continued to share and exchange knowledge and information, and renew friendships and collegial work begun at past DMT workshops. At this meeting, oral and poster presentations and special discussion sessions emphasized (1) methods for creating and publishing map products ("publishing" includes Web-based release); (2) field data capture software and techniques, including the use of LiDAR; (3) digital cartographic techniques; (4) migration of digital maps into ArcGIS Geodatabase format; (5) analytical GIS techniques; and (6) continued development of the National Geologic Map Database.

  15. The Most Luminous Supernovae

    NASA Astrophysics Data System (ADS)

    Sukhbold, Tuguldur; Woosley, S. E.

    2016-04-01

    Recent observations have revealed a stunning diversity of extremely luminous supernovae, seemingly increasing in radiant energy without bound. We consider simple approximate limits for what existing models can provide for the peak luminosity and total radiated energy for non-relativistic, isotropic stellar explosions. The brightest possible supernova is a Type I explosion powered by a sub-millisecond magnetar with field strength B ~ a few × 10^13 G. In extreme cases, such models might reach a peak luminosity of 2 × 10^46 erg s^-1 and radiate a total energy of up to 4 × 10^52 erg. Other less luminous models are also explored, including prompt hyper-energetic explosions in red supergiants, pulsational pair-instability supernovae, pair-instability supernovae, and colliding shells. Approximate analytic expressions and limits are given for each case. Excluding magnetars, the peak luminosity is near 3 × 10^44 erg s^-1 for the brightest models and the corresponding limit on total radiated energy is 3 × 10^51 erg. Barring new physics, supernovae with a light output over 3 × 10^51 erg must be rotationally powered, either during the explosion itself or after, the most obvious candidate being a rapidly rotating magnetar. A magnetar-based model for the recent transient event ASASSN-15lh is presented that strains, but does not exceed, the limits of what the model can provide.

  16. Evaluation of Landslide Mapping Techniques and LiDAR-based Conditioning Factors

    NASA Astrophysics Data System (ADS)

    Mahalingam, R.; Olsen, M. J.

    2014-12-01

    Landslides are a major geohazard, resulting in significant human, infrastructure, and economic losses. Landslide susceptibility mapping can help communities plan and prepare for these damaging events. Mapping landslide-susceptible locations using GIS and remote sensing techniques has gained popularity over the past three decades. These efforts use a wide variety of procedures and consider a wide range of factors. Unfortunately, each study is often completed differently and independently of others. Further, the quality of the datasets used varies in terms of source, data collection, and generation, which can propagate errors or inconsistencies into the resulting output maps. Light detection and ranging (LiDAR) has proved to have higher accuracy in representing the continuous topographic surface, which can help minimize this uncertainty. The primary objectives of this paper are to investigate the applicability and performance of terrain factors in landslide hazard mapping, determine whether LiDAR-derived datasets (slope, slope roughness, terrain roughness, stream power index and compound topographic index) can be used for predictive mapping without data representing other common landslide conditioning factors, and evaluate the differences in landslide susceptibility mapping using widely used statistical approaches. The aforementioned factors were used to produce landslide susceptibility maps for a 140 km2 study area in northwest Oregon using six representative techniques: frequency ratio, weights of evidence, logistic regression, discriminant analysis, artificial neural network, and support vector machine. Most notably, the research showed an advantage in selecting fewer critical conditioning factors. The most reliable factors could all be derived from a single LiDAR DEM, reducing the need for laborious and costly data gathering. Most of the six techniques showed similar statistical results; however, ANN showed less accuracy for predictive mapping. Keywords: Li
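
    Of the six techniques listed, the frequency-ratio method is the simplest to illustrate: each class of a conditioning factor is weighted by the ratio of its share of landslide pixels to its share of all pixels. The sketch below assumes a raster factor (e.g. LiDAR-derived slope) and a landslide inventory mask; the quantile binning is an arbitrary choice, not the paper's.

```python
import numpy as np

def frequency_ratio(factor, landslide_mask, n_bins=10):
    """Frequency-ratio weights for one conditioning factor (e.g. LiDAR slope).

    For each factor bin:  FR = (% of landslide pixels in the bin) /
                               (% of all pixels in the bin).
    FR > 1 marks conditions relatively favourable to sliding."""
    factor = np.asarray(factor, float).ravel()
    slides = np.asarray(landslide_mask, bool).ravel()
    edges = np.quantile(factor, np.linspace(0.0, 1.0, n_bins + 1))
    bins = np.clip(np.digitize(factor, edges[1:-1]), 0, n_bins - 1)
    fr = np.zeros(n_bins)
    for b in range(n_bins):
        in_bin = bins == b
        pct_slide = slides[in_bin].sum() / max(slides.sum(), 1)
        pct_total = in_bin.sum() / factor.size
        fr[b] = pct_slide / pct_total if pct_total > 0 else 0.0
    return edges, fr

# A susceptibility map is then the sum, over all conditioning factors, of the
# per-factor FR values assigned back to each pixel.
```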

  17. Lidar Luminance Quantizer

    NASA Technical Reports Server (NTRS)

    Quilligan, Gerard; DeMonthier, Jeffrey; Suarez, George

    2011-01-01

    This innovation addresses challenges in lidar imaging, particularly with the detection scheme and the shapes of the detected signals. Ideally, the echoed pulse widths should be extremely narrow to resolve fine detail at high event rates. However, narrow pulses require wideband detection circuitry with increased power dissipation to minimize thermal noise. Filtering is also required to shape each received signal into a form suitable for processing by a constant fraction discriminator (CFD) followed by a time-to-digital converter (TDC). As the intervals between the echoes decrease, the finite bandwidth of the shaping circuits blends the pulses into an analog signal (luminance) with multiple modes, reducing the ability of the CFD to discriminate individual events.

  18. Techniques for Down-Sampling a Measured Surface Height Map for Model Validation

    NASA Technical Reports Server (NTRS)

    Sidick, Erkin

    2012-01-01

    This software allows one to down-sample a measured surface map for model validation not only without introducing any re-sampling errors, but while also reducing the existing measurement noise and measurement errors. The two new techniques implemented in this software tool can be used in all optical model validation processes involving large space optical surfaces.
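
    The abstract does not describe the two techniques themselves; as one standard way to down-sample a measured surface height map without interpolation, while averaging down random measurement noise at the same time, integer-factor block averaging can be used. The sketch below is that generic approach, not necessarily the software's own algorithm.

```python
import numpy as np

def block_downsample(surface, factor):
    """Down-sample a measured surface height map by an integer factor using
    block means, so no interpolation (re-sampling) is involved and random
    measurement noise is averaged down within each block."""
    surface = np.asarray(surface, dtype=float)
    r = (surface.shape[0] // factor) * factor
    c = (surface.shape[1] // factor) * factor
    blocks = surface[:r, :c].reshape(r // factor, factor, c // factor, factor)
    return np.nanmean(blocks, axis=(1, 3))   # NaN-aware: tolerates masked pixels

# Example: reduce a 1024 x 1024 interferometer map to 128 x 128
coarse = block_downsample(np.random.rand(1024, 1024), 8)
```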

  19. The Impact on Incorporating Collaborative Concept Mapping with Coteaching Techniques in Elementary Science Classes

    ERIC Educational Resources Information Center

    Jang, Syh-Jong

    2010-01-01

    The purpose of this research was to evaluate a collaborative concept-mapping technique that was integrated into coteaching in fourth-grade science classes in order to examine students' performance and attitudes toward the experimental teaching method. There are two fourth-grade science teachers and four classes with a total of 114 students…

  20. USING HYPERSPECTRAL IMAGERY AND SPECTRAL UNMIXING TECHNIQUES FOR MAPPING GRAIN SORGHUM YIELD VARIABILITY

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Vegetation indices derived from remotely sensed imagery are commonly used to estimate crop yields. Spectral unmixing techniques provide an alternative approach to quantifying crop canopy abundance within each pixel and have the potential for mapping crop yield variability. The objective of this stud...
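
    Linear spectral unmixing models each pixel spectrum as a non-negative combination of endmember signatures (e.g. crop canopy and soil), and the resulting canopy-abundance map can then be related to yield. The sketch below solves the per-pixel problem with non-negative least squares; the endmember inputs and the absence of a sum-to-one constraint are assumptions, not details from this record.

```python
import numpy as np
from scipy.optimize import nnls

def unmix(cube, endmembers):
    """Per-pixel linear spectral unmixing with non-negative least squares.

    cube       : image of shape (rows, cols, bands)
    endmembers : signatures of shape (n_endmembers, bands), e.g. crop canopy
                 and bare soil spectra
    Returns abundance maps of shape (rows, cols, n_endmembers).
    """
    rows, cols, bands = cube.shape
    E = np.asarray(endmembers, float).T            # (bands, n_endmembers)
    X = cube.reshape(-1, bands).astype(float)
    abund = np.empty((X.shape[0], E.shape[1]))
    for k, pixel in enumerate(X):
        abund[k], _ = nnls(E, pixel)               # non-negative abundances
    return abund.reshape(rows, cols, E.shape[1])
```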

  1. A technique for determining the deuterium/hydrogen contrast map in neutron macromolecular crystallography.

    PubMed

    Chatake, Toshiyuki; Fujiwara, Satoru

    2016-01-01

    A difference in the neutron scattering length between hydrogen and deuterium leads to a high density contrast in neutron Fourier maps. In this study, a technique for determining the deuterium/hydrogen (D/H) contrast map in neutron macromolecular crystallography is developed and evaluated using ribonuclease A. The contrast map between the D2O-solvent and H2O-solvent crystals is calculated in real space, rather than in reciprocal space as performed in previous neutron D/H contrast crystallography. The present technique can thus utilize all of the amplitudes of the neutron structure factors for both D2O-solvent and H2O-solvent crystals. The neutron D/H contrast maps clearly demonstrate the powerful detectability of H/D exchange in proteins. In fact, alternative protonation states and alternative conformations of hydroxyl groups are observed at medium resolution (1.8 Å). Moreover, water molecules can be categorized into three types according to their tendency towards rotational disorder. These results directly indicate improvement in the neutron crystal structure analysis. This technique is suitable for incorporation into the standard structure-determination process used in neutron protein crystallography; consequently, more precise and efficient determination of the D-atom positions is possible using a combination of this D/H contrast technique and standard neutron structure-determination protocols. PMID:26894536

  2. Concept Mapping as a Technique for Program Theory Development: An Illustration Using Family Support Programs. Notes

    ERIC Educational Resources Information Center

    Rosas, Scott R.

    2005-01-01

    The emergence of theory-driven evaluation has drawn attention to the need for practical ways for evaluators to construct program theory to achieve the benefits of the approach. This article illustrates the use of concept mapping as a technique to explicate underlying program theory in the context of family support programs. In this example, 29…

  3. Comparing Computer-Supported Dynamic Modeling and "Paper & Pencil" Concept Mapping Technique in Students' Collaborative Activity

    ERIC Educational Resources Information Center

    Komis, Vassilis; Ergazaki, Marida; Zogza, Vassiliki

    2007-01-01

    This study aims at highlighting the collaborative activity of two high school students (age 14) in the cases of modeling the complex biological process of plant growth with two different tools: the "paper & pencil" concept mapping technique and the computer-supported educational environment "ModelsCreator". Students' shared activity in both cases…

  4. Accuracy of vertical radial plume mapping technique in measuring lagoon gas emission

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Recently, the U.S. Environmental Protection Agency (USEPA) posted a ground-based optical remote sensing method on its website called OTM 10 for measuring fugitive gas emission flux from area sources such as closed landfills. The OTM 10 utilizes the vertical radial plume mapping (VRPM) technique to c...

  5. Information Landscaping: Information Mapping, Charting, Querying and Reporting Techniques for Total Quality Knowledge Management.

    ERIC Educational Resources Information Center

    Tsai, Bor-sheng

    2003-01-01

    Total quality management and knowledge management are merged and used as a conceptual model to direct and develop information landscaping techniques through the coordination of information mapping, charting, querying, and reporting. Goals included: merge citation analysis and data mining, and apply data visualization and information architecture…

  6. Acoustic emission source location in complex structures using full automatic delta T mapping technique

    NASA Astrophysics Data System (ADS)

    Al-Jumaili, Safaa Kh.; Pearson, Matthew R.; Holford, Karen M.; Eaton, Mark J.; Pullin, Rhys

    2016-05-01

    An easy to use, fast to apply, cost-effective, and very accurate non-destructive testing (NDT) technique for damage localisation in complex structures is key for the uptake of structural health monitoring systems (SHM). Acoustic emission (AE) is a viable technique that can be used for SHM and one of the most attractive features is the ability to locate AE sources. The time of arrival (TOA) technique is traditionally used to locate AE sources, and relies on the assumption of constant wave speed within the material and uninterrupted propagation path between the source and the sensor. In complex structural geometries and complex materials such as composites, this assumption is no longer valid. Delta T mapping was developed in Cardiff in order to overcome these limitations; this technique uses artificial sources on an area of interest to create training maps. These are used to locate subsequent AE sources. However operator expertise is required to select the best data from the training maps and to choose the correct parameter to locate the sources, which can be a time consuming process. This paper presents a new and improved fully automatic delta T mapping technique where a clustering algorithm is used to automatically identify and select the highly correlated events at each grid point whilst the "Minimum Difference" approach is used to determine the source location. This removes the requirement for operator expertise, saving time and preventing human errors. A thorough assessment is conducted to evaluate the performance and the robustness of the new technique. In the initial test, the results showed excellent reduction in running time as well as improved accuracy of locating AE sources, as a result of the automatic selection of the training data. Furthermore, because the process is performed automatically, this is now a very simple and reliable technique due to the prevention of the potential source of error related to manual manipulation.
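
    The location step of the delta-T approach can be summarized as follows: each grid node of the training map stores the expected arrival-time differences for every sensor pair, and a new event is assigned to the node whose stored values differ least from the measured ones. The sketch below implements only that "Minimum Difference" comparison; the automatic clustering of training events described in the paper is omitted.

```python
import numpy as np

def locate_min_difference(measured_dt, training_maps):
    """Locate an AE source on a delta-T training grid by the minimum-difference rule.

    measured_dt   : (n_pairs,) arrival-time differences of the new event,
                    one value per sensor pair
    training_maps : (n_pairs, ny, nx) stored delta-T values at every grid node
                    (built from artificial sources, e.g. pencil-lead breaks)
    Returns the (row, col) index of the best-matching grid node.
    """
    measured_dt = np.asarray(measured_dt, float).reshape(-1, 1, 1)
    diff = np.abs(np.asarray(training_maps, float) - measured_dt).sum(axis=0)
    return np.unravel_index(np.argmin(diff), diff.shape)
```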

  7. A novel technique for visualizing high-resolution 3D terrain maps

    NASA Astrophysics Data System (ADS)

    Dammann, John

    2007-02-01

    A new technique is presented for visualizing high-resolution terrain elevation data. It produces realistic images at small scales on the order of the data resolution and works particularly well when natural objects are present. Better visualization at small scales opens up new applications, like site surveillance for security and Google Earth-type local search and exploration tasks that are now done with 2-D maps. The large 3-D maps are a natural for high-resolution stereo display. The traditional technique drapes a continuous surface over the regularly spaced elevation values. This technique works well when displaying large areas or in cities with large buildings, but falls apart at small scales or for natural objects like trees. The new technique visualizes the terrain as a set of disjoint square patches. It is combined with an algorithm that identifies smooth areas within the scene. Where the terrain is smooth, such as in grassy areas, roads, parking lots and rooftops, it warps the patches to create a smooth surface. For trees or shrubs or other areas where objects are under-sampled, however, the patches are left disjoint. This has the disadvantage of leaving gaps in the data, but the human mind is very adept at filling in this missing information. It has the strong advantage of making natural terrain look realistic, trees and bushes look stylized but still look natural and are easy to interpret. Also, it does not add artifacts to the map, like filling in blank vertical walls where there are alcoves and other structure and extending bridges and overpasses down to the ground. The new technique is illustrated using very large 1-m resolution 3-D maps from the Rapid Terrain Visualization (RTV) program, and comparisons are made with traditional visualizations using these maps.

  8. A wafer mapping technique for residual stress in surface micromachined films

    NASA Astrophysics Data System (ADS)

    Schiavone, G.; Murray, J.; Smith, S.; Desmulliez, M. P. Y.; Mount, A. R.; Walton, A. J.

    2016-09-01

    The design of MEMS devices employing movable structures is crucially dependent on the mechanical behaviour of the deposited materials. It is therefore important to be able to fully characterize the micromachined films and predict with confidence the mechanical properties of patterned structures. This paper presents a characterization technique that enables the residual stress in MEMS films to be mapped at the wafer level by using microstructures released by surface micromachining. These dedicated MEMS test structures and the associated measurement techniques are used to extract localized information on the strain and Young’s modulus of the film under investigation. The residual stress is then determined by numerically coupling this data with a finite element analysis of the structure. This paper illustrates the measurement routine and demonstrates it with a case study using electrochemically deposited alloys of nickel and iron, particularly prone to develop high levels of residual stress. The results show that the technique enables wafer mapping of film non-uniformities and identifies wafer-to-wafer differences. A comparison between the results obtained from the mapping technique and conventional wafer bow measurements highlights the benefits of using a procedure tailored to films that are non-uniform, patterned and surface-micromachined, as opposed to simple standard stress extraction methods. The presented technique reveals detailed information that is generally unexplored when using conventional stress extraction methods such as wafer bow measurements.

  9. Parametric techniques for characterizing myocardial tissue by magnetic resonance imaging (part 1): T1 mapping.

    PubMed

    Perea Palazón, R J; Ortiz Pérez, J T; Prat González, S; de Caralt Robira, T M; Cibeira López, M T; Solé Arqués, M

    2016-01-01

    The development of myocardial fibrosis is a common process in the appearance of ventricular dysfunction in many heart diseases. Magnetic resonance imaging makes it possible to accurately evaluate the structure and function of the heart, and its role in the macroscopic characterization of myocardial fibrosis by late enhancement techniques has been widely validated clinically. Recent studies have demonstrated that T1-mapping techniques can quantify diffuse myocardial fibrosis and the expansion of the myocardial extracellular space in absolute terms. However, further studies are necessary to validate the usefulness of this technique in the early detection of tissue remodeling at a time when implementing early treatment would improve a patient's prognosis. This article reviews the state of the art for T1 mapping of the myocardium, its clinical applications, and its limitations. PMID:26944850

  10. MODIS Snow Cover Mapping Decision Tree Technique: Snow and Cloud Discrimination

    NASA Technical Reports Server (NTRS)

    Riggs, George A.; Hall, Dorothy K.

    2010-01-01

    Accurate mapping of snow cover continues to challenge cryospheric scientists and modelers. The Moderate-Resolution Imaging Spectroradiometer (MODIS) snow data products have been used since 2000 by many investigators to map and monitor snow cover extent for various applications. Users have reported on the utility of the products and also on problems encountered. Three problems or hindrances in the use of the MODIS snow data products that have been reported in the literature are: cloud obscuration, snow/cloud confusion, and snow omission errors in thin or sparse snow cover conditions. Implementation of the MODIS snow algorithm in a decision tree technique using surface reflectance input to mitigate those problems is being investigated. The objective of this work is to use a decision tree structure for the snow algorithm. This should alleviate snow/cloud confusion and omission errors and provide a snow map with classes that convey information on how snow was detected, e.g., snow under clear sky, snow under cloud, to give users flexibility in interpreting and deriving a snow map. Results of a snow cover decision tree algorithm are compared to the standard MODIS snow map and found to exhibit improved ability to alleviate snow/cloud confusion in some situations, allowing up to about a 5% increase in mapped snow cover extent, and thus accuracy, in some scenes.

  11. Interrogation of a cascaded FBG sensor using a wavelength-to-delay mapping technique

    NASA Astrophysics Data System (ADS)

    Clement, Juan; Hervás, Javier; Fernández-Pousa, Carlos R.; Barrera, David; Pastor, Daniel; Sales, Salvador; Capmany, José

    2015-09-01

    An optical fiber sensor composed of six standard FBGs in cascade is interrogated using a technique based on wavelength-to-delay mapping. A microwave-modulated optical signal enters the FBG cascade and, after being sliced and reflected, experiences dispersion in standard single-mode fiber. The Bragg wavelengths of the FBGs are mapped into the delays that determine the peaks in the system's electrical impulse response. The Bragg wavelength shifts are calculated from the differences between the delays undergone by the FBG samples. A resolution of 9.2 pm in Bragg wavelength shift is demonstrated.
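
    The wavelength-to-delay conversion rests on chromatic dispersion: a Bragg-wavelength shift Δλ travelling through a dispersive fibre of dispersion D and length L arrives with a delay shift of roughly D·L·Δλ. The numbers below are assumptions chosen only to show the arithmetic; the paper's fibre length and dispersion value are not given in the abstract.

```python
# Minimal numeric illustration of the wavelength-to-delay mapping (assumed values):
# dispersion D = 17 ps/(nm*km) over L = 10 km of standard single-mode fibre
# converts a Bragg-wavelength shift into D * L picoseconds of delay per nm.
D_ps_per_nm_km = 17.0      # typical SMF dispersion near 1550 nm
L_km = 10.0                # assumed dispersive fibre length
delta_lambda_nm = 0.0092   # 9.2 pm Bragg shift (the reported resolution)

delta_tau_ps = D_ps_per_nm_km * L_km * delta_lambda_nm
print(f"delay shift = {delta_tau_ps:.3f} ps")   # about 1.6 ps for these values
```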

  12. Digital Mapping Techniques '05--Workshop Proceedings, Baton Rouge, Louisiana, April 24-27, 2005

    USGS Publications Warehouse

    Soller, David R.

    2005-01-01

    Introduction: The Digital Mapping Techniques '05 (DMT'05) workshop was attended by more than 100 technical experts from 47 agencies, universities, and private companies, including representatives from 25 state geological surveys (see Appendix A). This workshop was similar in nature to the previous eight meetings, held in Lawrence, Kansas (Soller, 1997), in Champaign, Illinois (Soller, 1998), in Madison, Wisconsin (Soller, 1999), in Lexington, Kentucky (Soller, 2000), in Tuscaloosa, Alabama (Soller, 2001), in Salt Lake City, Utah (Soller, 2002), in Millersville, Pennsylvania (Soller, 2003), and in Portland, Oregon (Soller, 2004). This year's meeting was hosted by the Louisiana Geological Survey, from April 24-27, 2005, on the Louisiana State University campus in Baton Rouge, Louisiana. As in the previous meetings, the objective was to foster informal discussion and exchange of technical information. It is with great pleasure I note that the objective was successfully met, as attendees continued to share and exchange knowledge and information, and to renew friendships and collegial work begun at past DMT workshops. Each DMT workshop has been coordinated by the Association of American State Geologists (AASG) and U.S. Geological Survey (USGS) Data Capture Working Group, which was formed in August 1996 to support the AASG and the USGS in their effort to build a National Geologic Map Database (see Soller and Berg, this volume, and http://ngmdb.usgs.gov/info/standards/datacapt/). The Working Group was formed because increased production efficiencies, standardization, and quality of digital map products were needed, both for the database and for the State and Federal geological surveys, to provide more high-quality digital maps to the public. At the 2005 meeting, oral and poster presentations and special discussion sessions emphasized: 1) methods for creating and publishing map products (here, 'publishing' includes Web-based release); 2) field data capture software and

  13. A novel technique for mapping the disparity of off-terrain objects

    NASA Astrophysics Data System (ADS)

    Suliman, A.; Zhang, Y.; Al-Tahir, R.

    2016-04-01

    Third-dimension information is of great importance for several remote sensing applications, such as building detection. The main data source for these applications is very high resolution (VHR) satellite images, which allow detailed mapping of complex environments. Stereo VHR satellite images allow the extraction of two correlated types of third-dimension information: disparity and elevation information. While the disparity is measured directly, the elevation information is derived computationally. To measure the disparity information, two overlapping images are matched. However, for backward and forward off-nadir VHR stereo images, building facades occlude areas and hence create many data gaps. When the disparity is required to represent only the off-terrain objects, interpolation and normalization techniques are typically used. However, in dense urban environments, these techniques degrade the quality of the generated data. Therefore, this paper proposes a registration-based technique to measure the disparity of above-ground objects. The technique includes constructing epipolar images and registering them using common terrain-level features to allow direct disparity mapping for the off-terrain objects. After implementation, the negative effects of occlusion in the off-nadir VHR stereo images are mitigated through direct disparity mapping of the above-ground objects, bypassing the interpolation and normalization steps.

  14. A satellite remote-sensing technique for geological horizon structure mapping

    SciTech Connect

    Fraser, A.J.; Huggins, P.; Cleverley, P.H.; Rees, J.L.

    1995-12-31

    A Satellite Remote Sensing technique is demonstrated which provides accurate and cost effective near-surface geological structure data. In the exploration phase the technique enables the rapid and inexpensive screening of open licences and the targeting of seismic acquisition, particularly important in terrains of difficult data acquisition. This paper describes the satellite data used, the technique of horizon surface data extraction and the analysis of a case study from Yemen. Landsat Thematic Mapper (TM) data and a high resolution digital elevation model (DEM), generated from stereo SPOT panchromatic images, are used in conjunction to identify a number of outcropping horizons and map their spatial position and height. Geological contacts are identified and digitised from the Landsat TM data and the elevations of these points taken from the digital elevation data. The extracted x,y,z co-ordinates are then gridded to construct a horizon structure map. The technique is applied to an area of central Yemen which is characterised by a near-surface 'layer cake' geological structure in an extremely low dipping terrain (less than 1°). The remote sensing interpretation is validated by comparison with 2D seismic across the area. Regional flexural structures with bed dips of as little as 0.25° can be mapped. Trend analysis and residual calculations on the horizon structure map show the technique's ability to identify and quantify horizon deformation related to faulting. Surface geological structure was successfully interpolated into the subsurface indicating potential fault closure at reservoir target depths.
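
    The gridding step of the workflow (digitised contact points with DEM elevations interpolated to a structure grid, followed by a trend/residual analysis) can be sketched as follows; the synthetic points and the simple planar trend are assumptions made for illustration only.

        # Sketch: grid x,y,z horizon picks and compute residuals from a planar trend.
        import numpy as np
        from scipy.interpolate import griddata

        rng = np.random.default_rng(0)
        x, y = rng.uniform(0, 10_000, 500), rng.uniform(0, 10_000, 500)   # contact positions (m)
        z = 1200 + 0.004 * x - 0.002 * y + rng.normal(0, 2, 500)          # DEM elevations (m)

        xi, yi = np.meshgrid(np.linspace(0, 10_000, 200), np.linspace(0, 10_000, 200))
        structure = griddata((x, y), z, (xi, yi), method="cubic")         # horizon structure map

        # trend analysis: fit a regional plane and map the residuals
        A = np.c_[x, y, np.ones_like(x)]
        coef, *_ = np.linalg.lstsq(A, z, rcond=None)
        residual = structure - (coef[0] * xi + coef[1] * yi + coef[2])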

  15. Mapping of terrain by computer clustering techniques using multispectral scanner data and using color aerial film

    NASA Technical Reports Server (NTRS)

    Smedes, H. W.; Linnerud, H. J.; Woolaver, L. B.; Su, M. Y.; Jayroe, R. R.

    1972-01-01

    Two clustering techniques were used for terrain mapping by computer of test sites in Yellowstone National Park. One test was made with multispectral scanner data using a composite technique which consists of (1) a strictly sequential statistical clustering which is a sequential variance analysis, and (2) a generalized K-means clustering. In this composite technique, the output of (1) is a first approximation of the cluster centers. This is the input to (2), which consists of steps to improve the determination of cluster centers by iterative procedures. Another test was made using the three emulsion layers of color-infrared aerial film as a three-band spectrometer. Relative film densities were analyzed using a simple clustering technique in three-color space. Important advantages of the clustering technique over conventional supervised computer programs are (1) human intervention, preparation time, and manipulation of data are reduced, (2) the computer map gives an unbiased indication of where best to select the reference ground control data, (3) use of inexpensive, easy-to-obtain film, and (4) the geometric distortions can be easily rectified by simple standard photogrammetric techniques.
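
    A minimal sketch of the composite clustering idea follows: a single sequential pass produces first-approximation cluster centres, which then seed an iterative K-means refinement (scikit-learn here). The sequential pass below is a simplified stand-in for the sequential variance analysis, and the data and distance threshold are invented.

        # Composite clustering: a sequential first pass seeds an iterative K-means.
        import numpy as np
        from sklearn.cluster import KMeans

        def sequential_centres(samples, dist_threshold):
            """Start a new cluster whenever a sample is far from every existing centre."""
            centres = [samples[0]]
            for s in samples[1:]:
                if np.linalg.norm(np.asarray(centres) - s, axis=1).min() > dist_threshold:
                    centres.append(s)
            return np.asarray(centres)

        samples = np.random.rand(10_000, 4)                    # 4-band multispectral samples
        init = sequential_centres(samples, dist_threshold=0.6) # first approximation of centres
        km = KMeans(n_clusters=len(init), init=init, n_init=1).fit(samples)
        class_map = km.labels_                                 # refined cluster label per sample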

  16. Contemporary Mapping Techniques of Complex Cardiac Arrhythmias - Identifying and Modifying the Arrhythmogenic Substrate.

    PubMed

    Koutalas, Emmanuel; Rolf, Sascha; Dinov, Borislav; Richter, Sergio; Arya, Arash; Bollmann, Andreas; Hindricks, Gerhard; Sommer, Philipp

    2015-05-01

    Cardiac electrophysiology has moved a long way forward during recent decades in the comprehension and treatment of complex cardiac arrhythmias. Contemporary electroanatomical mapping systems, along with state-of-the-art technology in the manufacture of electrophysiology catheters and cardiac imaging modalities, have significantly enriched our armamentarium, enabling the implementation of various mapping strategies and techniques in electrophysiology procedures. Beyond conventional mapping strategies, ablation of complex fractionated electrograms and rotor ablation in atrial fibrillation ablation procedures, the identification and modification of the underlying arrhythmogenic substrate has emerged as a strategy that leads to improved outcomes. Arrhythmogenic substrate modification also has a major role in ventricular tachycardia ablation procedures. Optimisation of contact between tissue and catheter and image integration are a further step forward to augment our precision and effectiveness. Hybridisation of existing technologies with a reasonable cost should be our goal over the next few years. PMID:26835095

  17. Contemporary Mapping Techniques of Complex Cardiac Arrhythmias – Identifying and Modifying the Arrhythmogenic Substrate

    PubMed Central

    Koutalas, Emmanuel; Rolf, Sascha; Dinov, Borislav; Richter, Sergio; Arya, Arash; Bollmann, Andreas; Hindricks, Gerhard

    2015-01-01

    Cardiac electrophysiology has moved a long way forward during recent decades in the comprehension and treatment of complex cardiac arrhythmias. Contemporary electroanatomical mapping systems, along with state-of-the-art technology in the manufacture of electrophysiology catheters and cardiac imaging modalities, have significantly enriched our armamentarium, enabling the implementation of various mapping strategies and techniques in electrophysiology procedures. Beyond conventional mapping strategies, ablation of complex fractionated electrograms and rotor ablation in atrial fibrillation ablation procedures, the identification and modification of the underlying arrhythmogenic substrate has emerged as a strategy that leads to improved outcomes. Arrhythmogenic substrate modification also has a major role in ventricular tachycardia ablation procedures. Optimisation of contact between tissue and catheter and image integration are a further step forward to augment our precision and effectiveness. Hybridisation of existing technologies with a reasonable cost should be our goal over the next few years. PMID:26835095

  18. Using a Concept Mapping Tool with a Photograph Association Technique (CoMPAT) to Elicit Children's Ideas about Microbial Activity

    ERIC Educational Resources Information Center

    Byrne, Jenny; Grace, Marcus

    2010-01-01

    Concept mapping is a technique used to provide a visual representation of an individual's ideas about a concept or set of related concepts. This paper describes a concept mapping tool using a photograph association technique (CoMPAT) that is considered to be a novel way of eliciting children's ideas. What children at 11 years of age know about…

  19. Digital mapping techniques '00, workshop proceedings - May 17-20, 2000, Lexington, Kentucky

    USGS Publications Warehouse

    Soller, David R.

    2000-01-01

    Introduction: The Digital Mapping Techniques '00 (DMT'00) workshop was attended by 99 technical experts from 42 agencies, universities, and private companies, including representatives from 28 state geological surveys (see Appendix A). This workshop was similar in nature to the first three meetings, held in June, 1997, in Lawrence, Kansas (Soller, 1997), in May, 1998, in Champaign, Illinois (Soller, 1998a), and in May, 1999, in Madison, Wisconsin (Soller, 1999). This year's meeting was hosted by the Kentucky Geological Survey, from May 17 to 20, 2000, on the University of Kentucky campus in Lexington. As in the previous meetings, the objective was to foster informal discussion and exchange of technical information. When, based on discussions at the workshop, an attendee adopts or modifies a newly learned technique, the workshop clearly has met that objective. Evidence of learning and cooperation among participating agencies continued to be a highlight of the DMT workshops (see example in Soller, 1998b, and various papers in this volume). The meeting's general goal was to help move the state geological surveys and the USGS toward development of more cost-effective, flexible, and useful systems for digital mapping and geographic information systems (GIS) analysis. Through oral and poster presentations and special discussion sessions, emphasis was given to: 1) methods for creating and publishing map products (here, 'publishing' includes Web-based release); 2) continued development of the National Geologic Map Database; 3) progress toward building a standard geologic map data model; 4) field data-collection systems; and 5) map citation and authorship guidelines. Four representatives of the GIS hardware and software vendor community were invited to participate. The four annual DMT workshops were coordinated by the AASG/USGS Data Capture Working Group, which was formed in August, 1996, to support the Association of American State Geologists and the USGS in their effort

  20. Breast density mapping based upon system calibration, x-ray techniques, and FFDM images

    NASA Astrophysics Data System (ADS)

    Chen, Biao; Smith, Andrew P.; Jing, Zhenxue; Wu, Tao

    2007-03-01

    Clinical studies have correlated a high breast density to a woman's risk of breast cancer. A breast density measurement that can quantitatively depict the volume distribution and percentage of dense tissues in breasts would be very useful for risk factor assessment of breast cancer, and might be more predictive of risks than the common but subjective and coarse 4-point BIRADS scale. This paper proposes to use a neural-network mapping to compute the breast density information based upon system calibration data, x-ray techniques, and Full Field Digital Mammography (FFDM) images. The mapping consists of four modules, namely, system calibration, generator of beam quality, generator of normalized absorption, and a multi-layer feed-forward neural network. As the core of breast density mapping, the network accepts x-ray target/filter combination, normalized x-ray absorption, pixel-wise breast thickness map, and x-ray beam quality during image acquisition as input elements, and exports a pixel-wise breast density distribution and a single breast density percentage for the imaged breast. Training and testing data sets for the design and verification of the network were formulated from calibrated x-ray beam quality, imaging data with a step wedge phantom under a variety of x-ray imaging techniques, and nominal breast densities of tissue equivalent materials. The network was trained using a back-propagation learning method based on the Levenberg-Marquardt algorithm. Various thickness and glandular density phantom studies were performed with clinical x-ray techniques. Preliminary results showed that the neural network mapping is promising in accurately computing glandular density distribution and breast density percentage.
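
    A minimal stand-in for the neural-network mapping is sketched below: the inputs are beam quality, normalised absorption and breast thickness, and the output is a glandular density value. The training data are synthetic, and scikit-learn's lbfgs solver replaces the Levenberg-Marquardt back-propagation used in the paper.

        # Feed-forward network mapping acquisition parameters to breast density (sketch).
        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(1)
        X = rng.uniform(size=(2000, 3))                       # [beam quality, norm. absorption, thickness]
        y = 0.2 * X[:, 0] + 0.6 * X[:, 1] + 0.2 * X[:, 2]     # invented stand-in calibration relationship

        net = MLPRegressor(hidden_layer_sizes=(16, 16), solver="lbfgs", max_iter=5000).fit(X, y)
        density = net.predict(X[:5])                          # pixel-wise density estimates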

  1. Technique of sentinel lymph node biopsy and lymphatic mapping during laparoscopic colon resection for cancer

    PubMed Central

    Bianchi, PP; Andreoni, B; Rottoli, M; Celotti, S; Chiappa, A; Montorsi, M

    2007-01-01

    Background: The utility of lymph node mapping to improve staging in colon cancer is still under evaluation. Laparoscopic colectomy for colon cancer has been validated in multi-centric trials. This study assessed the feasibility and technical aspects of lymph node mapping in laparoscopic colectomy for colon cancer. Methods: A total of 42 patients with histologically proven colon cancer were studied from January 2006 to September 2007. Exclusion criteria were: advanced disease (clinical stage III), rectal cancer, previous colon resection and contraindication to laparoscopy. Lymph-nodal status was assessed preoperatively by computed tomography (CT) scan and intra-operatively with the aid of laparoscopic ultrasound. Before resection, 2–3 ml of Patent Blue V dye was injected sub-serosally around the tumour. Coloured lymph nodes were marked as sentinel (SN) with metal clips or suture and laparoscopic colectomy with lymphadenectomy completed as normal. In case of failure of the intra-operative procedure, an ex vivo SN biopsy was performed on the colectomy specimen after resection. Results: A total number of 904 lymph nodes were examined, with a median number of 22 lymph nodes harvested per patient. The SN detection rate was 100%, an ex vivo lymph node mapping was necessary in four patients. Eleven (26.2%) patients had lymph-nodal metastases and in five (45.5%) of these patients, SN was the only positive lymph node. There were two (18.2%) false-negative SN. In three cases (7.1%) with aberrant lymphatic drainage, lymphadenectomy was extended. The accuracy of SN mapping was 95.2% and negative predictive value was 93.9%. Conclusions: Laparoscopic lymphatic mapping and SN removal is feasible in laparoscopic colectomy for colon cancer. The ex vivo technique is useful as a salvage technique in case of failure of the intra-operative procedure. Prospective studies are justified to determine the real accuracy and false-negative rate of the technique. PMID:22275957

  2. Estimation of flood environmental effects using flood zone mapping techniques in Halilrood Kerman, Iran.

    PubMed

    Boudaghpour, Siamak; Bagheri, Majid; Bagheri, Zahra

    2014-01-01

    High flood occurrences with large environmental damages have a growing trend in Iran. Dynamic movements of water during a flood cause different environmental damages in geographical areas with different characteristics such as topographic conditions. In general, the environmental effects and damages caused by a flood in an area can be investigated from different points of view. The current study aims to detect the environmental effects of flood occurrences in the Halilrood catchment area of Kerman province in Iran using flood zone mapping techniques. The intended flood zone map was produced in four steps. Steps 1 to 3 pave the way to calculate and estimate the flood zone map in the study area, while step 4 estimates the environmental effects of flood occurrence. Based on our studies, a wide range of accuracy in estimating the environmental effects of flood occurrence was obtained using flood zone mapping techniques. Moreover, it was identified that the existence of the Jiroft dam in the study area can decrease the flood zone from 260 hectares to 225 hectares and can also decrease flood peak intensity by 20%. As a result, 14% of the flood zone in the study area can be saved environmentally. PMID:25649059

  3. Development of two-dimensional mapping technique by in-air-PIXE with metal capillary

    NASA Astrophysics Data System (ADS)

    Fujita, N.; Ishii, K.; Ogawa, H.

    2011-05-01

    We have developed a two-dimensional mapping technique with in-air-PIXE (2D-PIXE) using a metal capillary as a guide to extract the ion beam into air. The metal capillary is a conventional injection needle with a 200 μm inside diameter. For a target consisting of a character made of copper wires on an aluminum base, 2D-PIXE measurements were performed by irradiating a 3 MeV proton beam. As a result, the character tended to be restored clearly by this method. We discuss the resulting two-dimensional map from the viewpoint of signal-to-noise ratio and resolution. This technique is expected to be applicable to various fields such as biology, nano-technology, archeology and so on.

  4. GIS-based statistical mapping technique for block-and-ash pyroclastic flow and surge hazards

    NASA Astrophysics Data System (ADS)

    Widiwijayanti, C.; Voight, B.; Hidayat, D.; Schilling, S.

    2008-12-01

    Assessments of pyroclastic flow (PF) hazards are commonly based on mapping of PF and surge deposits and estimations of inundation limits, and/or computer models of varying degrees of sophistication. In volcanic crises a PF hazard map may be sorely needed, but limited time, exposures, or safety aspects may preclude fieldwork, and insufficient time or baseline data may be available for reliable dynamic simulations. We have developed a statistically constrained simulation model for block-and-ash PFs to estimate potential areas of inundation by adapting methodology from Iverson et al. (1998) for lahars. The predictive equations for block-and-ash PFs are calibrated with data from many volcanoes and given by A = (0.05-0.1)V^(2/3), B = (35-40)V^(2/3), where A is cross-sectional area of inundation, B is planimetric area and V is deposit volume. The proportionality coefficients were obtained from regression analyses and comparison of simulations to mapped deposits. The method embeds the predictive equations in a GIS program coupled with DEM topography, using the LAHARZ program of Schilling (1998). Although the method is objective and reproducible, any PF hazard zone so computed should be considered as an approximate guide only, due to uncertainties on coefficients applicable to individual PFs, DEM details, and release volumes. Gradational nested hazard maps produced by these simulations reflect in a sense these uncertainties. The model does not explicitly consider dynamic behavior, which can be important. Surge impacts must be extended beyond PF hazard zones and we have explored several approaches to do this. The method has been used to supply PF hazard maps in two crises: Merapi 2006; and Montserrat 2006-2007. We have also compared our hazard maps to actual recent PF deposits and to maps generated by several other model techniques.
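
    The calibrated predictive equations quoted above lend themselves to a short worked example; the release volume used below is arbitrary.

        # Worked example of A = (0.05-0.1) V^(2/3) and B = (35-40) V^(2/3).
        def inundation_areas(volume_m3):
            v23 = volume_m3 ** (2.0 / 3.0)
            cross_section_m2 = (0.05 * v23, 0.1 * v23)    # range for A
            planimetric_m2 = (35 * v23, 40 * v23)         # range for B
            return cross_section_m2, planimetric_m2

        A, B = inundation_areas(1e6)   # a 10^6 m^3 block-and-ash flow volume
        print(A)   # approximately (500, 1000) m^2
        print(B)   # approximately (350000, 400000) m^2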

  5. New mapping techniques help assess the health of Hawaii's coral reefs

    USGS Publications Warehouse

    Field, M.E.; Chavez, P.S., Jr.; Evans, K.R.; Cochran, S.A.

    2001-01-01

    The U.S. Geological Survey (USGS) is working closely with academic institutions and state and Federal agencies to assess the factors that affect the health of Hawaii's and our Nation's coral reefs. In order to establish a basis from which scientists can objectively detect changes in reef health, the USGS and its cooperators are applying many new techniques to the mapping and monitoring of coral reefs in Hawaii.

  6. Mineral classification map using MF and SAM techniques: A case study in the Nohwa Island, Korea

    SciTech Connect

    Son, Young-Sun; Yoon, Wang-Jung

    2015-03-10

    The purpose of this study is to map pyrophyllite distribution at the surface of the Nohwa deposit, Korea, by using Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) data. For this, a combined Spectral Angle Mapper (SAM) and Matched Filtering (MF) technique based on a mathematical algorithm was applied. The regional distribution of high-grade and low-grade pyrophyllite in the Nohwa deposit area could be differentiated by this method. The results of this study show that ASTER data analysis using a combination of SAM and MF techniques will assist in exploration for pyrophyllite at the exposed surface.
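
    The SAM half of the combined approach reduces to an angle computation between each pixel spectrum and a reference spectrum; the sketch below shows that step only (the matched-filtering stage is omitted), with random arrays standing in for the ASTER bands and the library spectrum.

        # Spectral Angle Mapper: small angles indicate spectra similar to the reference.
        import numpy as np

        def spectral_angle(pixels, reference):
            """pixels: (n, bands); reference: (bands,). Returns angles in radians."""
            num = pixels @ reference
            den = np.linalg.norm(pixels, axis=1) * np.linalg.norm(reference)
            return np.arccos(np.clip(num / den, -1.0, 1.0))

        pixels = np.random.rand(1000, 9)                    # placeholder spectra for image pixels
        ref = np.random.rand(9)                             # placeholder pyrophyllite library spectrum
        candidate_mask = spectral_angle(pixels, ref) < 0.1  # threshold is illustrative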

  7. Remote Sensing and GIS Techniques to map Groundwater Recharge and Discharge

    NASA Astrophysics Data System (ADS)

    Tweed, S.; Leblanc, M.; Webb, J.; Lubczynski, M.; Stagnitti, F.

    2006-12-01

    In salinity prone catchments, effective management of water resources involves balancing the requirements of sustainable resource allocation with salinity mitigation programs and ecosystem protection. Understanding groundwater recharge and discharge processes is a key component to achieving this balance. In this study, surface and sub-surface features are linked with recharge and discharge processes by incorporating GIS and remote sensing mapping techniques for an unconfined basalt aquifer, in a salinity and drought prone region of southeast Australia. The basalt aquifer covers ~11,500 km2 in an agriculturally intensive region. The approach requires (a) knowledge of local hydrogeological processes, to choose appropriate surface/sub-surface indicators, (b) adequate remote sensing and GIS techniques as well as necessary datasets to map these indicators, and (c) validation of the results at test sites and at the catchment-scale. This approach, applied systematically across a catchment, provides a framework for mapping recharge and discharge areas. A key component in assigning surface and sub-surface indicators is the relevance to the dominant recharge and discharge processes occurring, and using appropriate remote sensing and GIS techniques with the capacity to identify these.

  8. A semi-quantitative technique for mapping potential aquifer productivity on the national scale: example of England and Wales (UK)

    NASA Astrophysics Data System (ADS)

    Abesser, Corinna; Lewis, Melinda

    2015-12-01

    The development and validation of aquifer productivity and depth-to-source maps for England and Wales are described. Aquifer productivity maps can provide valuable support for the assessment, planning and management of groundwater and renewable heat energy resources. Aquifer productivity is often mapped using geostatistical interpolation techniques such as kriging, but these techniques tend to be unsuitable for mapping at the national scale due to the high data (and time) demands. A methodology is outlined for mapping aquifer productivity at the national scale using existing national-scale data sets. Pumping test data are used to characterise the potential borehole yields that different geological formations of varying lithologies and ages can provide. Based on this analysis and using expert knowledge, the corresponding map codes on the geological map are assigned to potential productivity classes. The subsurface (concealed) extent of aquifer units is mapped from geophysical data, and together with the attributed geological map, provide the bedrock-aquifer productivity map. Drilling and pumping costs can be an important consideration when evaluating the feasibility of developing a groundwater source. Thus, a map of the approximate depth to source is developed alongside the aquifer productivity map. The maps are validated using independent data sets, and map performance is compared against performance from maps derived by random and uniform attribution. The results show that the maps successfully predict potential productivity and approximate depth to the water source, although utility of the depth-to-source map could be improved by increasing the vertical discretisation at which depth intervals are mapped.

  9. Luminous-flux measurements by an absolute integrating sphere

    NASA Astrophysics Data System (ADS)

    Rastello, Maria Luisa; Miraldi, Elio; Pisoni, Paolo

    1996-08-01

    We present an original implementation of the absolute-sphere method recently proposed by Ohno. The luminous-flux unit, the lumen, is realized by means of an integrating sphere with an opening calibrated by a luminous-intensity standard placed outside. The adapted experimental setup permits one to measure luminous-flux values between 5 and 2500 lm with a significant improvement with respect to the simulated performances reported in the literature. Traditionally, the luminous-flux unit, the lumen, is realized by goniophotometric techniques in which the luminous-intensity distribution is measured and integrated over the whole solid angle. Thus sphere results are compared with those obtained with the Istituto Elettrotecnico Nazionale goniophotometer. In particular, a set of standards, characterized by luminous-flux values of approximately 2000 lm, has been calibrated with both techniques. We highlight some of the problems encountered. Experimental results show that the agreement between the two methods is within the estimated uncertainty and suggest promising areas for future research.

  10. Testing the PV-Theta Mapping Technique in a 3-D CTM Model Simulation

    NASA Technical Reports Server (NTRS)

    Frith, Stacey M.

    2004-01-01

    Mapping lower stratospheric ozone into potential vorticity (PV)- potential temperature (Theta) coordinates is a common technique employed to analyze sparse data sets. Ozone transformed into a flow-following dynamical coordinate system is insensitive to meteorological variations. Therefore data from a wide range of times/locations can be compared, so long as the measurements were made in the same airmass (as defined by PV). Moreover, once a relationship between ozone and PV/Theta is established, a full 3D ozone field can be estimated from this relationship and the 3D analyzed PV field. However, ozone data mapped in this fashion can be hampered by noisy PV fields, or "mis-matches" in the resolution and/or exact location of the ozone and PV measurements. In this study, we investigate the PV-ozone relationship using output from a recent 50-year run of the Goddard 3D chemical transport model (CTM). Model constituents are transported using off-line dynamics from the finite volume general circulation model (FVGCM). By using the internally consistent model PV and ozone fields, we minimize noise due to mis-matching and resolution issues. We calculate correlations between model ozone and PV throughout the stratosphere, and test the sensitivity of the technique to initial data resolution. To do this we degrade the model data to that of various satellite instruments, then compare the mapped fields derived from the sub-sampled data to the full resolution model data. With these studies we can determine appropriate limits for the PV-theta mapping technique in latitude, altitude, and as a function of original data resolution.
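
    The core of the mapping step is a binned ozone(PV, theta) relationship; a minimal sketch is given below, with the bin edges and input arrays left as user-supplied placeholders.

        # Bin ozone samples by potential vorticity and potential temperature and
        # take the bin means as the ozone(PV, theta) relationship.
        import numpy as np

        def pv_theta_map(ozone, pv, theta, pv_edges, theta_edges):
            table = np.full((len(pv_edges) - 1, len(theta_edges) - 1), np.nan)
            i = np.digitize(pv, pv_edges) - 1
            j = np.digitize(theta, theta_edges) - 1
            for a in range(table.shape[0]):
                for b in range(table.shape[1]):
                    sel = (i == a) & (j == b)
                    if sel.any():
                        table[a, b] = ozone[sel].mean()
            return table   # look up with analyzed PV/theta fields to rebuild a 3-D ozone field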

  11. Bioclimatic and vegetation mapping of a topographically complex oceanic island applying different interpolation techniques

    NASA Astrophysics Data System (ADS)

    Garzón-Machado, Víctor; Otto, Rüdiger; del Arco Aguilar, Marcelino José

    2014-07-01

    Different spatial interpolation techniques have been applied to construct objective bioclimatic maps of La Palma, Canary Islands. Interpolation of climatic data on this topographically complex island with strong elevation and climatic gradients represents a challenge. Furthermore, meteorological stations are not evenly distributed over the island, with few stations at high elevations. We carried out spatial interpolations of the compensated thermicity index (Itc) and the annual ombrothermic index (Io), in order to obtain appropriate bioclimatic maps by using automatic interpolation procedures, and to establish their relation to potential vegetation units for constructing a climatophilous potential natural vegetation map (CPNV). For this purpose, we used five interpolation techniques implemented in a GIS: inverse distance weighting (IDW), ordinary kriging (OK), ordinary cokriging (OCK), multiple linear regression (MLR) and MLR followed by ordinary kriging of the regression residuals. Two topographic variables (elevation and aspect), derived from a high-resolution digital elevation model (DEM), were included in OCK and MLR. The accuracy of the interpolation techniques was examined by the results of the error statistics of test data derived from comparison of the predicted and measured values. Best results for both bioclimatic indices were obtained with the MLR method with interpolation of the residuals, showing the highest R² of the regression between observed and predicted values and the lowest values of root mean square errors. MLR with correction of interpolated residuals is an attractive interpolation method for bioclimatic mapping on this oceanic island since it permits one to fully account for easily available geographic information but also takes into account local variation of climatic data.
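
    The best-performing method (MLR followed by interpolation of the regression residuals) can be sketched compactly. In the sketch the kriging of residuals is replaced by a simple inverse-distance weighting so that only numpy is required, and all inputs are assumed to be user-supplied arrays.

        # Regression interpolation with residual correction (kriging replaced by IDW).
        import numpy as np

        def mlr_with_residual_idw(stations_xy, station_elev, values, grid_xy, grid_elev, power=2):
            # 1) multiple linear regression of the bioclimatic index on x, y and elevation
            A = np.c_[stations_xy, station_elev, np.ones(len(values))]
            coef, *_ = np.linalg.lstsq(A, values, rcond=None)
            residuals = values - A @ coef
            # 2) inverse-distance interpolation of the regression residuals onto the grid
            d = np.linalg.norm(grid_xy[:, None, :] - stations_xy[None, :, :], axis=2)
            w = 1.0 / np.maximum(d, 1e-6) ** power
            res_grid = (w * residuals).sum(axis=1) / w.sum(axis=1)
            # 3) regression prediction on the grid plus interpolated residuals
            G = np.c_[grid_xy, grid_elev, np.ones(len(grid_xy))]
            return G @ coef + res_grid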

  12. Bioclimatic and vegetation mapping of a topographically complex oceanic island applying different interpolation techniques.

    PubMed

    Garzón-Machado, Víctor; Otto, Rüdiger; del Arco Aguilar, Marcelino José

    2014-07-01

    Different spatial interpolation techniques have been applied to construct objective bioclimatic maps of La Palma, Canary Islands. Interpolation of climatic data on this topographically complex island with strong elevation and climatic gradients represents a challenge. Furthermore, meteorological stations are not evenly distributed over the island, with few stations at high elevations. We carried out spatial interpolations of the compensated thermicity index (Itc) and the annual ombrothermic Index (Io), in order to obtain appropriate bioclimatic maps by using automatic interpolation procedures, and to establish their relation to potential vegetation units for constructing a climatophilous potential natural vegetation map (CPNV). For this purpose, we used five interpolation techniques implemented in a GIS: inverse distance weighting (IDW), ordinary kriging (OK), ordinary cokriging (OCK), multiple linear regression (MLR) and MLR followed by ordinary kriging of the regression residuals. Two topographic variables (elevation and aspect), derived from a high-resolution digital elevation model (DEM), were included in OCK and MLR. The accuracy of the interpolation techniques was examined by the results of the error statistics of test data derived from comparison of the predicted and measured values. Best results for both bioclimatic indices were obtained with the MLR method with interpolation of the residuals showing the highest R2 of the regression between observed and predicted values and lowest values of root mean square errors. MLR with correction of interpolated residuals is an attractive interpolation method for bioclimatic mapping on this oceanic island since it permits one to fully account for easily available geographic information but also takes into account local variation of climatic data. PMID:23686111

  13. Advanced Ecosystem Mapping Techniques for Large Arctic Study Domains Using Calibrated High-Resolution Imagery

    NASA Astrophysics Data System (ADS)

    Macander, M. J.; Frost, G. V., Jr.

    2015-12-01

    Regional-scale mapping of vegetation and other ecosystem properties has traditionally relied on medium-resolution remote sensing such as Landsat (30 m) and MODIS (250 m). Yet, the burgeoning availability of high-resolution (<=2 m) imagery and ongoing advances in computing power and analysis tools raise the prospect of performing ecosystem mapping at fine spatial scales over large study domains. Here we demonstrate cutting-edge mapping approaches over a ~35,000 km² study area on Alaska's North Slope using calibrated and atmospherically-corrected mosaics of high-resolution WorldView-2 and GeoEye-1 imagery: (1) an a priori spectral approach incorporating the Satellite Imagery Automatic Mapper (SIAM) algorithms; (2) image segmentation techniques; and (3) texture metrics. The SIAM spectral approach classifies radiometrically-calibrated imagery to general vegetation density categories and non-vegetated classes. The SIAM classes were developed globally and their applicability in arctic tundra environments has not been previously evaluated. Image segmentation, or object-based image analysis, automatically partitions high-resolution imagery into homogeneous image regions that can then be analyzed based on spectral, textural, and contextual information. We applied eCognition software to delineate waterbodies and vegetation classes, in combination with other techniques. Texture metrics were evaluated to determine the feasibility of using high-resolution imagery to algorithmically characterize periglacial surface forms (e.g., ice-wedge polygons), which are an important physical characteristic of permafrost-dominated regions but which cannot be distinguished by medium-resolution remote sensing. These advanced mapping techniques provide products which can provide essential information supporting a broad range of ecosystem science and land-use planning applications in northern Alaska and elsewhere in the circumpolar Arctic.

  14. Applying data fusion techniques for benthic habitat mapping and monitoring in a coral reef ecosystem

    NASA Astrophysics Data System (ADS)

    Zhang, Caiyun

    2015-06-01

    Accurate mapping and effective monitoring of benthic habitat in the Florida Keys are critical in developing management strategies for this valuable coral reef ecosystem. For this study, a framework was designed for automated benthic habitat mapping by combining multiple data sources (hyperspectral, aerial photography, and bathymetry data) and four contemporary imagery processing techniques (data fusion, Object-based Image Analysis (OBIA), machine learning, and ensemble analysis). In the framework, 1-m digital aerial photograph was first merged with 17-m hyperspectral imagery and 10-m bathymetry data using a pixel/feature-level fusion strategy. The fused dataset was then preclassified by three machine learning algorithms (Random Forest, Support Vector Machines, and k-Nearest Neighbor). Final object-based habitat maps were produced through ensemble analysis of outcomes from three classifiers. The framework was tested for classifying a group-level (3-class) and code-level (9-class) habitats in a portion of the Florida Keys. Informative and accurate habitat maps were achieved with an overall accuracy of 88.5% and 83.5% for the group-level and code-level classifications, respectively.
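
    The classifier-ensemble stage of the framework can be sketched with scikit-learn's majority-vote ensemble over the three algorithms named in the abstract; the fused feature array and labels below are placeholders, not the study's data.

        # Majority-vote ensemble of Random Forest, SVM and k-NN on fused features.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier, VotingClassifier
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.svm import SVC

        X = np.random.rand(500, 12)          # fused hyperspectral/aerial/bathymetry features
        y = np.random.randint(0, 3, 500)     # 3 group-level habitat classes (placeholder labels)

        ensemble = VotingClassifier(
            estimators=[("rf", RandomForestClassifier(n_estimators=200)),
                        ("svm", SVC()),
                        ("knn", KNeighborsClassifier(n_neighbors=5))],
            voting="hard",
        ).fit(X, y)
        habitat_labels = ensemble.predict(X[:10])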

  15. Mapping small wetlands of Kenya and Tanzania using remote sensing techniques

    NASA Astrophysics Data System (ADS)

    Mwita, E.; Menz, G.; Misana, S.; Becker, M.; Kisanga, D.; Boehme, B.

    2013-04-01

    Although wetlands in Tanzania and Kenya have great potential for agricultural production and a multitude of uses, many of them are not even documented on official maps. Lack of official recognition has done little to prevent their overutilization. As the wetlands continue to play remarkable roles in the movement of people and terrestrial species in the region, it is important that they are monitored and properly managed. This study was undertaken in the Usambara highlands and the Pangani floodplain in Tanzania, and the Mount Kenya highlands and Laikipia floodplain in Kenya, to map the different types of wetlands in terms of their size, density, spatial distribution and use patterns. Remote sensing techniques and field surveys were adopted, and 51 wetlands were identified in flood plains within the semi-arid and sub-humid lowlands, and inland valleys in the region. The detailed maps generated showed the intensity of wetland use, inland valleys being the most intensively used, and are useful in monitoring changes in wetlands for their effective management. The use of multi-spatial resolution imagery, combined with field survey and GIS, produced satisfactory results for the delineation and mapping of small wetlands and their uses.

  16. Making Accurate Topographic Maps of the Schoolyard Using Ideas and Techniques Learned and Adapted from Multi-beam Sonar Mapping of the Arctic Ocean

    NASA Astrophysics Data System (ADS)

    Fuerst, S. I.; Roberts, J. D.

    2010-12-01

    Having participated in a University of Rhode Island Project Armada expedition to join the University of New Hampshire Center for Coastal and Oceanographic Studies in making multi-beam sonar contour maps of the Arctic Ocean floor, I was able to bring the principles learned from this trip to my earth science high school students and create a project in our "mapping the earth" unit. Students learn basic surveying techniques and create authentic, accurately detailed topographic maps of the schoolyard. Models of their maps are then constructed of either Styrofoam or wood, which enables them to make the transition from a 2-dimensional map to a 3-dimensional representation. Even though our maps are created using sticks, line levels, compasses and GPS, the scientific concepts of using location and elevation data to draw contour lines are identical to those used in underwater mapping. Once the students understand the science in mapping and creating contour maps to scale on graph paper by hand, they are able to easily relate this knowledge to what I was doing onboard ship using multi-beam sonar and computer mapping programs. We would like to share with you the lab and techniques that we have developed to make this activity possible with minimal materials and simple technology. As a background extension, it is also possible to replicate sonar measurements using an aquarium, food coloring, and a surface grid to map the topography of a teacher-created landscape on the aquarium bottom. Earth Science students using simple tools to accurately map the topography of the school grounds

  17. Urban Mapping and Growth Prediction using Remote Sensing and GIS Techniques, Pune, India

    NASA Astrophysics Data System (ADS)

    Sivakumar, V.

    2014-11-01

    This study aims to map the urban area in and around the Pune region between the years 1991 and 2010, and to predict its probable future growth using remote sensing and GIS techniques. Landsat TM and ETM+ satellite images from 1991, 2001 and 2010 were used for analyzing the urban land-use class. The urban class was extracted and mapped using a supervised classification technique with a maximum likelihood classifier. An accuracy assessment was carried out for the classified maps; the achieved overall accuracy and Kappa statistic were 86.33% and 0.76, respectively. A transition probability matrix and area change were obtained from the different classified images. A plug-in based on a Markov chain model was developed in the open-source QGIS software for predicting probable urban growth for the year 2021. Based on the available data set, the results show that the urban area is expected to grow much larger by 2021 compared to 2010. This study provides insight into urban growth and aids subsequent infrastructure planning, management and decision-making.
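
    The Markov-chain projection step amounts to applying a transition probability matrix, estimated from the earlier classified maps, to the most recent class proportions; the matrix and proportions in this sketch are invented, not the study's values.

        # One-step Markov-chain projection of land-use class proportions.
        import numpy as np

        classes = ["urban", "vegetation", "other"]
        P = np.array([[0.95, 0.03, 0.02],     # rows: from-class, columns: to-class
                      [0.10, 0.85, 0.05],
                      [0.08, 0.07, 0.85]])
        proportions_2010 = np.array([0.30, 0.50, 0.20])

        proportions_2021 = proportions_2010 @ P     # projected shares after one transition step
        print(dict(zip(classes, proportions_2021.round(3))))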

  18. Validating Flood Mapping Products Using a Digital Elevation Model Comparison Technique

    NASA Astrophysics Data System (ADS)

    Fayne, J.

    2014-12-01

    This preliminary study assessed the validity of a pixel analysis elevation comparison technique and determined necessary steps for improvement. The pixel analysis sought to assess the probability of a flood occurring in a particular area by comparing the spatial extent of flood mapping products to the local elevation. The method was developed to determine if the physical relationship between elevation and floods as shown in satellite images is accurately represented in a flood mapping product. The data incorporated in this study are raster digital elevation model (DEM) tiles, a scene from Landsat 5 during a flood period, and a scene from the NASA DEVELOP Flood Disasters Team Flood Product. Pixels representing flooded areas were compared to the elevation height pixels using horizontal transect lines to create pixel value profiles across a 727 km transect of Vietnam and Cambodia. The elevation model comparison validates the Flood Product by depicting water presence in alignment with areas of low elevation. Initial findings indicate that the technique can be used to improve the assessment of flood mapping products in transects less than 10 km. Future research will focus on streamlining the pixel analysis process to yield comprehensive results for larger areas.

  19. Performance Analysis of the Microsoft Kinect Sensor for 2D Simultaneous Localization and Mapping (SLAM) Techniques

    PubMed Central

    Kamarudin, Kamarulzaman; Mamduh, Syed Muhammad; Shakaff, Ali Yeon Md; Zakaria, Ammar

    2014-01-01

    This paper presents a performance analysis of two open-source, laser scanner-based Simultaneous Localization and Mapping (SLAM) techniques (i.e., Gmapping and Hector SLAM) using a Microsoft Kinect to replace the laser sensor. Furthermore, the paper proposes a new system integration approach whereby a Linux virtual machine is used to run the open source SLAM algorithms. The experiments were conducted in two different environments; a small room with no features and a typical office corridor with desks and chairs. Using the data logged from real-time experiments, each SLAM technique was simulated and tested with different parameter settings. The results show that the system is able to achieve real time SLAM operation. The system implementation offers a simple and reliable way to compare the performance of Windows-based SLAM algorithm with the algorithms typically implemented in a Robot Operating System (ROS). The results also indicate that certain modifications to the default laser scanner-based parameters are able to improve the map accuracy. However, the limited field of view and range of Kinect's depth sensor often causes the map to be inaccurate, especially in featureless areas, therefore the Kinect sensor is not a direct replacement for a laser scanner, but rather offers a feasible alternative for 2D SLAM tasks. PMID:25490595

  20. Performance analysis of the Microsoft Kinect sensor for 2D Simultaneous Localization and Mapping (SLAM) techniques.

    PubMed

    Kamarudin, Kamarulzaman; Mamduh, Syed Muhammad; Shakaff, Ali Yeon Md; Zakaria, Ammar

    2014-01-01

    This paper presents a performance analysis of two open-source, laser scanner-based Simultaneous Localization and Mapping (SLAM) techniques (i.e., Gmapping and Hector SLAM) using a Microsoft Kinect to replace the laser sensor. Furthermore, the paper proposes a new system integration approach whereby a Linux virtual machine is used to run the open source SLAM algorithms. The experiments were conducted in two different environments; a small room with no features and a typical office corridor with desks and chairs. Using the data logged from real-time experiments, each SLAM technique was simulated and tested with different parameter settings. The results show that the system is able to achieve real time SLAM operation. The system implementation offers a simple and reliable way to compare the performance of Windows-based SLAM algorithm with the algorithms typically implemented in a Robot Operating System (ROS). The results also indicate that certain modifications to the default laser scanner-based parameters are able to improve the map accuracy. However, the limited field of view and range of Kinect's depth sensor often causes the map to be inaccurate, especially in featureless areas, therefore the Kinect sensor is not a direct replacement for a laser scanner, but rather offers a feasible alternative for 2D SLAM tasks. PMID:25490595

  1. Micro-scale strain mapping technique: a tool to quantify strain partitioning during creep deformation

    NASA Astrophysics Data System (ADS)

    Quintanilla-Terminel, Alejandra; Zimmerman, Mark; Evans, Brian; Kohlstedt, David

    2016-04-01

    Several deformation mechanisms interact to accommodate plastic deformation. Quantifying the contribution of each to the total strain is necessary for establishing a better link between observed microstructures and mechanical data, as well as to allow more confident extrapolation from laboratory to natural conditions. In this contribution, we present the experimental and computational technique involved in micro-scale strain mapping (MSSM). The MSSM technique relies on analyzing the relative displacement of initially regularly spaced markers after deformation. We present several microfabrication techniques that permit us to pattern various rocks with micrometric and nanometric metal markers, as well as the challenges faced in working at high temperatures and pressures. A Hough transform algorithm was used to detect the markers and automate as much as possible the strain analysis. The von Mises strain is calculated for a set of n-points and their relative displacements, which allows us to map the strain at different length scales. We applied the MSSM technique to study strain partitioning during creep deformation of Carrara marble and San Carlos olivine at a confining pressure, Pc, of 300 MPa and homologous temperatures of 0.3 to 0.6. We measured the local strain and strain heterogeneity produced during creep deformation of split cylinders of Carrara marble under conventional triaxial loading to inelastic strains of 11 to 36% at a strain rate of 3×10^-5 s^-1, Pc = 300 MPa and 400 °C < T < 700 °C. We conclude that the evolution of deformation structures in marble takes place over a substantial interval in strain and that the duration of this interval depends on strain rate, temperature, and pressure. Our first results on strain mapping of olivine deformed at T = 1150 °C and Pc = 300 MPa demonstrate promise for characterizing intragranular strain and better defining the contribution of grain boundary sliding to the total strain.
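
    The strain-map computation behind this kind of analysis can be sketched as follows: marker displacements on an initially regular grid give a displacement gradient, from which the small-strain tensor and a von Mises equivalent strain are formed. The plane-strain assumption and the grid convention (rows = y, columns = x) are choices made for the sketch, not details taken from the paper.

        # Von Mises equivalent strain from a regular grid of marker displacements.
        import numpy as np

        def von_mises_strain(ux, uy, spacing):
            """ux, uy: displacement fields on a regular grid (rows = y, columns = x)."""
            dux_dy, dux_dx = np.gradient(ux, spacing)
            duy_dy, duy_dx = np.gradient(uy, spacing)
            exx, eyy = dux_dx, duy_dy
            exy = 0.5 * (dux_dy + duy_dx)
            mean = (exx + eyy) / 3.0                   # plane strain: e_zz = 0 assumed
            dxx, dyy, dzz = exx - mean, eyy - mean, -mean
            return np.sqrt(2.0 / 3.0 * (dxx**2 + dyy**2 + dzz**2 + 2.0 * exy**2))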

  2. Stacked reverberation mapping

    NASA Astrophysics Data System (ADS)

    Fine, S.; Shanks, T.; Green, P.; Kelly, B. C.; Croom, S. M.; Webster, R. L.; Berger, E.; Chornock, R.; Burgett, W. S.; Chambers, K. C.; Kaiser, N.; Price, P. A.

    2013-07-01

    Over the past 20 years reverberation mapping has proved one of the most successful techniques for studying the local (<1 pc) environment of supermassive black holes that drive active galactic nuclei. Key successes of reverberation mapping have been direct black hole mass estimates, the radius-luminosity relation for the Hβ line and the calibration of single-epoch mass estimators commonly employed up to z ~ 7. However, observing constraints mean that few studies have been successful at z > 0.1, or for the more-luminous quasars that make up the majority of current spectroscopic samples, or for rest-frame ultraviolet emission lines available in optical spectra of z > 0.5 objects. Previously, we described a technique for stacking cross-correlations to obtain reverberation mapping results at high z. Here, we present the first results from a campaign designed for this purpose. We construct stacked cross-correlation functions for the C IV and Mg II lines and find a clear peak in both. We find that the peak in the Mg II correlation is at longer lags than C IV consistent with previous results at low redshift. For the C IV sample, we are able to bin by luminosity and find evidence for increasing lags for more-luminous objects. This C IV radius-luminosity relation is consistent with previous studies but with a fraction of the observational cost.

  3. Mapping forest vegetation with ERTS-1 MSS data and automatic data processing techniques

    NASA Technical Reports Server (NTRS)

    Messmore, J.; Copeland, G. E.; Levy, G. F.

    1975-01-01

    This study was undertaken with the intent of elucidating the forest mapping capabilities of ERTS-1 MSS data when analyzed with the aid of LARS' automatic data processing techniques. The site for this investigation was the Great Dismal Swamp, a 210,000 acre wilderness area located on the Middle Atlantic coastal plain. Due to inadequate ground truth information on the distribution of vegetation within the swamp, an unsupervised classification scheme was utilized. Initially pictureprints, resembling low resolution photographs, were generated in each of the four ERTS-1 channels. Data found within rectangular training fields was then clustered into 13 spectral groups and defined statistically. Using a maximum likelihood classification scheme, the unknown data points were subsequently classified into one of the designated training classes. Training field data was classified with a high degree of accuracy (greater than 95%), and progress is being made towards identifying the mapped spectral classes.

  4. Mapping forest vegetation with ERTS-1 MSS data and automatic data processing techniques

    NASA Technical Reports Server (NTRS)

    Messmore, J.; Copeland, G. E.; Levy, G. F.

    1975-01-01

    This study was undertaken with the intent of elucidating the forest mapping capabilities of ERTS-1 MSS data when analyzed with the aid of LARS' automatic data processing techniques. The site for this investigation was the Great Dismal Swamp, a 210,000 acre wilderness area located on the Middle Atlantic coastal plain. Due to inadequate ground truth information on the distribution of vegetation within the swamp, an unsupervised classification scheme was utilized. Initially pictureprints, resembling low resolution photographs, were generated in each of the four ERTS-1 channels. Data found within rectangular training fields was then clustered into 13 spectral groups and defined statistically. Using a maximum likelihood classification scheme, the unknown data points were subsequently classified into one of the designated training classes. Training field data was classified with a high degree of accuracy (greater than 95 percent), and progress is being made towards identifying the mapped spectral classes.

  5. Tropical Cyclone Vulnerability Mapping Using Geospatial Techniques: Application to a Coastal Upazila in Bangladesh

    NASA Astrophysics Data System (ADS)

    Hoque, M. A. A.; Phinn, S. R.; Roelfsema, C. M.; Childs, I.

    2015-12-01

    Cyclones are among the most catastrophic natural disasters. Globally, many coastal regions are vulnerable to cyclones of different categories. In Bangladesh, disasters from tropical cyclones are annual occurrences in coastal areas, and the intensity and extent of damage due to tropical cyclones are very high. An appropriate mapping approach is essential for producing detailed vulnerability assessments that deliver useful information for reducing the impacts of cyclones on people, property and the environment. The present study developed and tested a vulnerability mapping approach for tropical cyclone impacts in Sarankhola upazila, a 151 km² local government area located in coastal Bangladesh. The study applied the approach by integrating remote sensing, field data and multi-criteria evaluation at regional scales covering <1000 km². Seven criteria related to cyclone impacts were considered: elevation, slope, geomorphology, proximity to coastline, proximity to cyclone track, land use and population density. Thematic raster map layers were prepared for every criterion and weighted using the Analytical Hierarchy Process (AHP) with sensitivity analysis. The Weighted Linear Combination (WLC) technique was used to overlay the standardized criterion maps with their weights and produce the vulnerability map. Our results indicated that 15% of the study area had very high vulnerability, mostly close to the river and in densely populated areas, and 40% of the area, largely cropland, had high vulnerability. Around 25% of the area was classified as moderate vulnerability, covering most of the forests, and the low and very low vulnerability areas account for 12% and 8%, respectively. The approach produced very promising results that were verified by field survey, demonstrating its applicability for assessing the vulnerability of coastal Bangladesh to tropical cyclones.
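
    The Weighted Linear Combination overlay used in the final step is simply a weighted sum of the standardised criterion rasters; the toy rasters and AHP-style weights below are placeholders rather than the study's values.

        # Weighted Linear Combination of standardised criterion rasters.
        import numpy as np

        criteria = {                                   # standardised 0-1 rasters (toy 2x2 grids)
            "elevation":  np.array([[0.9, 0.4], [0.2, 0.7]]),
            "proximity":  np.array([[0.8, 0.3], [0.1, 0.6]]),
            "population": np.array([[0.5, 0.9], [0.4, 0.2]]),
        }
        weights = {"elevation": 0.5, "proximity": 0.2, "population": 0.3}   # AHP weights, sum to 1

        vulnerability = sum(weights[name] * raster for name, raster in criteria.items())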

  6. Brain MAPS: an automated, accurate and robust brain extraction technique using a template library

    PubMed Central

    Leung, Kelvin K.; Barnes, Josephine; Modat, Marc; Ridgway, Gerard R.; Bartlett, Jonathan W.; Fox, Nick C.; Ourselin, Sébastien

    2011-01-01

    Whole brain extraction is an important pre-processing step in neuro-image analysis. Manual or semi-automated brain delineations are labour-intensive and thus not desirable in large studies, meaning that automated techniques are preferable. The accuracy and robustness of automated methods are crucial because human expertise may be required to correct any sub-optimal results, which can be very time consuming. We compared the accuracy of four automated brain extraction methods: Brain Extraction Tool (BET), Brain Surface Extractor (BSE), Hybrid Watershed Algorithm (HWA) and a Multi-Atlas Propagation and Segmentation (MAPS) technique we have previously developed for hippocampal segmentation. The four methods were applied to extract whole brains from 682 1.5T and 157 3T T1-weighted MR baseline images from the Alzheimer’s Disease Neuroimaging Initiative database. Semi-automated brain segmentations with manual editing and checking were used as the gold-standard to compare with the results. The median Jaccard index of MAPS was higher than HWA, BET and BSE in 1.5T and 3T scans (p < 0.05, all tests), and the 1st-99th centile range of the Jaccard index of MAPS was smaller than HWA, BET and BSE in 1.5T and 3T scans (p < 0.05, all tests). HWA and MAPS were found to be best at including all brain tissues (median false negative rate ≤ 0.010% for 1.5T scans and ≤ 0.019% for 3T scans, both methods). The median Jaccard index of MAPS were similar in both 1.5T and 3T scans, whereas those of BET, BSE and HWA were higher in 1.5T scans than 3T scans (p < 0.05, all tests). We found that the diagnostic group had a small effect on the median Jaccard index of all four methods. In conclusion, MAPS had relatively high accuracy and low variability compared to HWA, BET and BSE in MR scans with and without atrophy. PMID:21195780
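
    The accuracy metric used throughout the comparison, the Jaccard index between an automated brain mask and the gold-standard mask, is shown below as a short sketch with user-supplied binary masks.

        # Jaccard index (intersection over union) between two binary masks.
        import numpy as np

        def jaccard(mask_a, mask_b):
            a, b = mask_a.astype(bool), mask_b.astype(bool)
            union = np.logical_or(a, b).sum()
            return np.logical_and(a, b).sum() / union if union else 1.0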

  7. A New Active Cavitation Mapping Technique for Pulsed HIFU Applications – Bubble Doppler

    PubMed Central

    Li, Tong; Khokhlova, Tatiana; Sapozhnikov, Oleg; Hwang, Joo Ha; O’Donnell, Matthew

    2015-01-01

    In this work, a new active cavitation mapping technique for pulsed high-intensity focused ultrasound (pHIFU) applications termed bubble Doppler is proposed and its feasibility tested in tissue-mimicking gel phantoms. pHIFU therapy uses short pulses, delivered at low pulse repetition frequency, to cause transient bubble activity that has been shown to enhance drug and gene delivery to tissues. The current gold standard for detecting and monitoring cavitation activity during pHIFU treatments is passive cavitation detection (PCD), which provides minimal information on the spatial distribution of the bubbles. B-mode imaging can detect hyperecho formation, but has very limited sensitivity, especially to small, transient microbubbles. The bubble Doppler method proposed here is based on a fusion of the adaptations of three Doppler techniques that had been previously developed for imaging of ultrasound contrast agents – color Doppler, pulse inversion Doppler, and decorrelation Doppler. Doppler ensemble pulses were interleaved with therapeutic pHIFU pulses using three different pulse sequences and standard Doppler processing was applied to the received echoes. The information yielded by each of the techniques on the distribution and characteristics of pHIFU-induced cavitation bubbles was evaluated separately, and found to be complementary. The unified approach - bubble Doppler – was then proposed to both spatially map the presence of transient bubbles and to estimate their sizes and the degree of nonlinearity. PMID:25265178

  8. A technique for the reconstruction of a map of continuous curves from interference fringes

    NASA Astrophysics Data System (ADS)

    Palevicius, Paulius; Ragulskis, Minvydas; Janušas, Giedrius; Palevicius, Arvydas

    2014-08-01

    The well-known phase-shifting approach for three-dimensional surface measurement uses multiple fringe patterns along with a phase-shifting algorithm to obtain 3-D profiles with high accuracy; however, this approach is not applicable to dynamic object measurement techniques such as time-averaged holography, or to cases in which only a single interference fringe pattern is available. In such cases the fringe tracing method, based on localization of the centers of the interference fringes, can be used. We propose a technique for reconstructing the contour map from fringe patterns which comprises standard image processing techniques and a scheme for reconstructing the map of continuous curves from the binary matrix of pixels representing fringe centers. The image is divided into grid cells, and problems such as deriving approximate line equations in each cell using the Hough transformation, grouping contacting cells into curves, and interpolating between curves with fractures are solved. The functionality of this approach is demonstrated for a demanding optical image containing fractures and noise.

  9. Equivalent Dynamic Stiffness Mapping technique for identifying nonlinear structural elements from frequency response functions

    NASA Astrophysics Data System (ADS)

    Wang, X.; Zheng, G. T.

    2016-02-01

    A simple and general Equivalent Dynamic Stiffness Mapping technique is proposed for identifying the parameters or the mathematical model of a nonlinear structural element from steady-state primary harmonic frequency response functions (FRFs). The Equivalent Dynamic Stiffness is defined as the complex ratio between the internal force and the displacement response of the unknown element. Obtained from test data of response frequencies and amplitudes, the real and imaginary parts of the Equivalent Dynamic Stiffness are plotted as discrete points in a three-dimensional space over displacement amplitude and frequency; these are called the real and imaginary Equivalent Dynamic Stiffness maps, respectively. The points form a repeatable surface, since the Equivalent Dynamic Stiffness is only a function of the corresponding data, as derived in the paper. The mathematical model of the unknown element can then be obtained by surface-fitting these points with special functions selected from a priori knowledge of the nonlinearity type, or with ordinary polynomials if the type of nonlinearity is not known in advance. An important merit of this technique is its capability of dealing with strong nonlinearities exhibiting complicated frequency response behaviors such as jumps and breaks in resonance curves. In addition, the technique greatly simplifies the test procedure: there is no need to pre-identify the underlying linear parameters, and the method uses the measured excitation forces and responses without requiring strict control of the excitation force during the test. The proposed technique is demonstrated and validated with four classical single-degree-of-freedom (SDOF) numerical examples and one experimental example. An application of the technique to identification of nonlinearity in multiple-degree-of-freedom (MDOF) systems is also illustrated.
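
    As a rough illustration of the quantity being mapped, the sketch below computes the complex Equivalent Dynamic Stiffness from steady-state phasor data and collects the real and imaginary map points over displacement amplitude and frequency. It assumes the unknown element is attached to a known mass driven harmonically; the SDOF setup, the numbers and the variable names are illustrative assumptions, not data or code from the paper.

        import numpy as np

        # Illustrative steady-state test data: excitation force phasor F and
        # displacement response phasor X at several drive frequencies.
        omega = np.array([8.0, 10.0, 12.0])                              # rad/s
        X = np.array([0.010 - 0.002j, 0.006 - 0.006j, 0.003 - 0.001j])   # m
        F = np.array([1.0 + 0.0j, 1.0 + 0.0j, 1.0 + 0.0j])               # N
        m = 1.0                        # assumed known attached mass, kg

        # Internal force carried by the unknown element = excitation minus inertial force
        # (the inertial force phasor is -m * omega**2 * X for harmonic motion).
        F_int = F + m * omega**2 * X
        K_eq = F_int / X               # complex Equivalent Dynamic Stiffness

        # Points of the real and imaginary stiffness maps over (|X|, omega).
        amplitude = np.abs(X)
        real_map = np.column_stack([amplitude, omega, K_eq.real])
        imag_map = np.column_stack([amplitude, omega, K_eq.imag])
        print(real_map)
        print(imag_map)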

  10. Spatial Field Variability Mapping of Rice Crop using Clustering Technique from Space Borne Hyperspectral Data

    NASA Astrophysics Data System (ADS)

    Moharana, S.; Dutta, S.

    2015-12-01

    Precision farming refers to field-specific management of an agricultural crop at a spatial scale, with the aim of achieving the highest attainable yield; spatial information on field variability is essential to achieve this. The variability occurring within an agricultural field can be revealed better by spectral techniques applied to hyperspectral imagery than to multispectral imagery, but advanced algorithms are needed to make full use of the rich information content of hyperspectral data. In the present study, the potential of hyperspectral data acquired from a space platform was examined to map the field variation of paddy crop and to discriminate its species. The high-dimensional data, a Hyperion L1R product comprising 242 narrow spectral bands at 30 m ground resolution acquired over Assam, India (30th Sept and 3rd Oct, 2014), underwent the necessary pre-processing steps followed by geometric correction using the Hyperion L1GST product. An atmospherically corrected and spatially subset image of 112 bands was finally obtained. By employing an advanced clustering algorithm, 12 clusters of spectral waveforms of the crop were generated from six paddy fields for each image. The findings showed that some clusters were well discriminated, representing specific rice genotypes, while other clusters were mixed and treated as a single rice genotype. As vegetation indices (VIs) are good indicators for vegetation mapping, three ratio-based VI maps were also generated and unsupervised classification was performed on them. The 12 clusters of paddy crop so obtained were mapped spatially onto the derived VI maps. From these findings, heterogeneity was clearly captured in one of the six rice plots (rice plot no. 1), while heterogeneity was also observed in the remaining five rice plots. The degree of heterogeneity was found to be higher in rice plot no. 6 than in the other plots. Subsequently, spatial variability of paddy field was
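
    The study used an advanced clustering algorithm; as a simplified stand-in, the sketch below clusters per-pixel spectra of a hyperspectral cube into 12 groups with plain k-means and reshapes the labels back into a spatial cluster map. The random cube, band count and cluster count are illustrative assumptions.

        import numpy as np
        from sklearn.cluster import KMeans

        # Illustrative stand-in for an atmospherically corrected Hyperion subset:
        # rows x cols x bands reflectance cube (the study retained 112 bands).
        rng = np.random.default_rng(0)
        cube = rng.random((60, 60, 112)).astype(np.float32)

        pixels = cube.reshape(-1, cube.shape[-1])        # one spectrum per row
        labels = KMeans(n_clusters=12, n_init=10, random_state=0).fit_predict(pixels)
        cluster_map = labels.reshape(cube.shape[:2])     # spatial map of spectral clusters
        print(np.bincount(labels))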

  11. Utility of Close-Range Remote Sensing Techniques for Mapping Topography and Bathymetry in Small Streams

    NASA Astrophysics Data System (ADS)

    Fonstad, M. A.; Dietrich, J.

    2012-12-01

    River science is in the midst of a methodological revolution, in which various remote sensing approaches have been applied to the problem of mapping river forms and processes at high resolution and over large areas. Most of these advances have been driven by airborne and spaceborne remote sensing platforms, yielding ground resolutions at the meter scale or larger. Close-range remote sensing approaches (helicopters, UAS platforms, helikites) have yielded much higher resolutions, but with increased costs, range limitations, and/or technical complexity. Many river researchers, however, still measure riverscapes in small study areas and desire high-precision measurements. As such, it is reasonable to infer that ground-based remote sensing techniques might yield highly precise, low-cost, and easy-to-acquire information in small streams and be useful to a large user base. Some of the small-stream features that are potentially mappable with close-range imaging include particle sizes, riparian vegetation, water depth, water surface elevations and superelevations, and riverbank and near-floodplain topography. Some of the important comparative metrics include (a) spatial resolution, (b) precision and accuracy, (c) cost, (d) ease of use, (e) speed and ability to map large areas, (f) the importance of the specific observation platform, (g) software and hardware considerations, and (h) systematic vs. random errors. In order to compare the utility of mapping approaches against these metrics, we have mapped a 100 m stream test reach in eastern Oregon with both a high-quality dSLR camera and a Kinect active near-IR distance sensor. The camera can map topography through the Structure from Motion (SfM) photogrammetric approach, and can map bathymetry using either SfM or spectral depth approaches. The Kinect can potentially map above-water topography for land and water. We test the camera-based methods and the Kinect sensor on two platforms. First, by hand

  12. Spectropolarimetry of hot, luminous stars

    NASA Technical Reports Server (NTRS)

    Schulte-Ladbeck, Regina E.

    1994-01-01

    I review polarimetric observations of presumably single, hot luminous stars. The stellar types discussed are OB stars, B(e) supergiants, Luminous Blue Variables (LBV), Wolf-Rayet (W-R) stars, and type II supernovae (SN). It is shown that variable, intrinsic polarization is a common phenomenon in the part of the Hertzsprung-Russell (HR) diagram which these stars occupy. However, much observational work remains to be done before we can answer the most basic, statistical questions about the polarimetric properties of different groups of hot, luminous stars. Insight into the diagnostic power of polarization observations has been gained, but cannot be exploited without detailed models. Thus, while polarimetric observations do tell us that the mass-loss processes of all types of massive stars are time-dependent and anisotropic, the significance that this might have for the accuracy of their stellar parameters and evolutionary paths remains elusive.

  13. Hotspot decorations map plasmonic patterns with the resolution of scanning probe techniques.

    PubMed

    Valev, V K; Silhanek, A V; Jeyaram, Y; Denkova, D; De Clercq, B; Petkov, V; Zheng, X; Volskiy, V; Gillijns, W; Vandenbosch, G A E; Aktsipetrov, O A; Ameloot, M; Moshchalkov, V V; Verbiest, T

    2011-06-01

    In high definition mapping of the plasmonic patterns on the surfaces of nanostructures, the diffraction limit of light remains an important obstacle. Here we demonstrate that this diffraction limit can be completely circumvented. We show that upon illuminating nanostructures made of nickel and palladium, the resulting surface-plasmon pattern is imprinted on the structures themselves; the hotspots (regions of local field enhancement) are decorated with overgrowths, allowing for their subsequent imaging with scanning-probe techniques. The resulting resolution of plasmon pattern imaging is correspondingly improved. PMID:21702624

  14. Mapping Glauconite Units Using Remote Sensing Techniques in North East of Iran

    NASA Astrophysics Data System (ADS)

    Ahmadirouhani, R.; Samiee, S.

    2014-10-01

    Glauconite is a greenish ferric-iron silicate mineral with a micaceous structure, characteristically formed in shallow marine environments. Glauconite has been used as a pigmentation agent for oil paint, as a contaminant remover in environmental studies, as a source of potassium in plant fertilizers, and in other industries. The Koppeh-dagh basin extends across Iran, Afghanistan and Turkmenistan, and glauconite units exist in this basin. In this research, to enhance and map glauconitic units in the Koppeh-dagh structural zone in northeast Iran, remote sensing techniques such as Spectral Angle Mapper (SAM) classification, band ratios and band composition methods were applied to SPOT, ASTER and Landsat data in three steps.
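
    The Spectral Angle Mapper classification mentioned above compares each pixel spectrum to a reference spectrum by the angle between them, treated as vectors in band space. A minimal sketch follows; the library spectrum, pixel values and threshold are illustrative assumptions, not values from the study.

        import numpy as np

        def spectral_angle(pixel, reference):
            """Angle (radians) between a pixel spectrum and a reference spectrum;
            smaller angles mean a closer spectral shape, independent of illumination scale."""
            p = np.asarray(pixel, dtype=float)
            r = np.asarray(reference, dtype=float)
            cos = np.dot(p, r) / (np.linalg.norm(p) * np.linalg.norm(r))
            return np.arccos(np.clip(cos, -1.0, 1.0))

        # classify a pixel as glauconite if its angle to a library spectrum is small
        glauconite_ref = np.array([0.12, 0.18, 0.25, 0.22, 0.30, 0.28])   # illustrative
        pixel = np.array([0.10, 0.17, 0.24, 0.20, 0.29, 0.27])
        is_glauconite = spectral_angle(pixel, glauconite_ref) < 0.10      # radians threshold
        print(is_glauconite)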

  15. Brain-Mapping Techniques for Evaluating Poststroke Recovery and Rehabilitation: A Review

    PubMed Central

    Eliassen, James C.; Boespflug, Erin L.; Lamy, Martine; Allendorfer, Jane; Chu, Wen-Jang; Szaflarski, Jerzy P.

    2009-01-01

    Brain-mapping techniques have proven to be vital in understanding the molecular, cellular, and functional mechanisms of recovery after stroke. This article briefly summarizes the current molecular and functional concepts of stroke recovery and addresses how various neuroimaging techniques can be used to observe these changes. The authors provide an overview of various techniques including diffusion-tensor imaging (DTI), magnetic resonance spectroscopy (MRS), ligand-based positron emission tomography (PET), single-photon emission computed tomography (SPECT), regional cerebral blood flow (rCBF) and regional metabolic rate of glucose (rCMRglc) PET and SPECT, functional magnetic resonance imaging (fMRI), near infrared spectroscopy (NIRS), electroencephalography (EEG), magnetoencephalography (MEG), and transcranial magnetic stimulation (TMS). Discussion in the context of poststroke recovery research informs about the applications and limitations of the techniques in the area of rehabilitation research. The authors also provide suggestions on using these techniques in tandem to more thoroughly address the outstanding questions in the field. PMID:19008203

  16. The creation of a digital soil map for Cyprus using decision-tree classification techniques

    NASA Astrophysics Data System (ADS)

    Camera, Corrado; Zomeni, Zomenia; Bruggeman, Adriana; Noller, Joy; Zissimos, Andreas

    2014-05-01

    Considering the increasing threats soils are experiencing, especially in semi-arid Mediterranean environments like Cyprus (erosion, contamination, sealing and salinisation), producing a high-resolution, reliable soil map is essential for further soil conservation studies. This study aims to create a 1:50,000 soil map covering the area under the direct control of the Republic of Cyprus (5,760 km2). The study consists of two major steps. The first is the creation of a raster database of predictive variables selected according to the scorpan formula (McBratney et al., 2003). Of particular interest is the possibility of using, as soil properties, data from three older island-wide soil maps and the recently published geochemical atlas of Cyprus (Cohen et al., 2011); ten highly characterizing elements were selected and used as predictors in the present study. For the other factors the usual variables were used: temperature and aridity index for climate; total loss on ignition, vegetation and forestry type maps for organic matter; the DEM and related relief derivatives (slope, aspect, curvature, landscape units); bedrock, surficial geology and geomorphology (Noller, 2009) for parent material and age; and a sub-watershed map to better bound location related to parent material sources. In the second step, the digital soil map is created using the Random Forests package in R. Random Forests is a decision-tree classification technique in which many trees, instead of a single one, are developed and compared to increase the stability and reliability of the prediction. The model is trained and verified on areas where a published 1:25,000 soil map obtained from field work is available, and is then applied for predictive mapping to the other areas. Preliminary results obtained in a small area on the plain around the city of Lefkosia, where eight different soil classes are present, show very good capabilities of the method. The Random Forests approach leads to reproduce soil
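
    The study builds its map with the Random Forests package in R; the sketch below shows the same train-on-mapped-area, predict-on-unmapped-pixels pattern using scikit-learn in Python. Predictor counts, class counts and the random data are illustrative assumptions.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(1)

        # Illustrative predictor stack sampled at training pixels: columns could be
        # geochemical elements, climate, terrain derivatives, geology codes, etc.
        X_train = rng.random((500, 15))
        y_train = rng.integers(0, 8, 500)        # eight soil classes, as in the test area

        # Train on pixels covered by the published 1:25,000 soil map ...
        clf = RandomForestClassifier(n_estimators=500, oob_score=True, random_state=0)
        clf.fit(X_train, y_train)
        print("out-of-bag accuracy:", clf.oob_score_)

        # ... then predict the class of every unmapped pixel from the same predictors.
        X_unmapped = rng.random((1000, 15))
        predicted_classes = clf.predict(X_unmapped)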

  17. Mapping.

    ERIC Educational Resources Information Center

    Kinney, Douglas M.; McIntosh, Willard L.

    1979-01-01

    The area of geological mapping in the United States in 1978 increased greatly over that reported in 1977; state geological maps were added for California, Idaho, Nevada, and Alaska last year. (Author/BB)

  18. Nonuniformity correction using a flood technique and 1:1 mapping

    NASA Astrophysics Data System (ADS)

    Olson, Eric M.; Bergin, Thomas P.

    2001-08-01

    Although the sparse grid Non-Uniformity Correction (NUC) technique can accurately correct individual emitters on a resistor array, it is not a good solution for all projection applications. Due to busbar robbing, the sparse grid NUC breaks down when a large number of emitters are turned on simultaneously. For this case, a more appropriate NUC data collection method is needed. This method involves measuring the entire resistor array at once with a 1:1 mapping between the projector and NUC sensor. Then busbar effects, measured during the NUC data collection, can be accounted for and corrected. This paper presents details pertaining to the flood NUC technique and results. This NUC system is implemented at the Kinetic Kill Vehicle Hardware In the Loop Simulator (KHILS) at Eglin AFB, Florida.

  19. Mapping of a river using close range photogrammetry technique and unmanned aerial vehicle system

    NASA Astrophysics Data System (ADS)

    Room, M. H. M.; Ahmad, A.

    2014-02-01

    Photogrammetry is a technique that can be used to record information about any feature without direct contact. Nowadays, a combination of photogrammetry and Unmanned Aerial Vehicle (UAV) systems is widely used for various applications, especially for large-scale mapping. UAV systems offer several advantages in terms of cost and image resolution compared to terrestrial photogrammetry and remote sensing systems. The combination of photogrammetry and UAVs has therefore created a new term, UAV photogrammetry. The aim of this study is to investigate the ability of a UAV system to map a river at very close distance. A digital camera was attached to a Hexacopter UAV and flown at 2 m above the ground surface to produce aerial photos. The aerial photos were then processed to create two photogrammetric products: a mosaicked orthophoto and a digital image. Both products were assessed in terms of root mean square error (RMSE). The RMSE of the X and Y coordinates are ±0.009 m and ±0.033 m respectively. In conclusion, photogrammetry and the UAV system offer reliable accuracy for mapping a river model, together with advantages in terms of cost-efficiency, high ground resolution and rapid data acquisition.

  20. Robust biological parametric mapping: an improved technique for multimodal brain image analysis

    NASA Astrophysics Data System (ADS)

    Yang, Xue; Beason-Held, Lori; Resnick, Susan M.; Landman, Bennett A.

    2011-03-01

    Mapping the quantitative relationship between structure and function in the human brain is an important and challenging problem. Numerous volumetric, surface, region of interest and voxelwise image processing techniques have been developed to statistically assess potential correlations between imaging and non-imaging metrics. Recently, biological parametric mapping has extended the widely popular statistical parametric approach to enable application of the general linear model to multiple image modalities (both for regressors and regressands) along with scalar valued observations. This approach offers great promise for direct, voxelwise assessment of structural and functional relationships with multiple imaging modalities. However, as presented, the biological parametric mapping approach is not robust to outliers and may lead to invalid inferences (e.g., artifactual low p-values) due to slight mis-registration or variation in anatomy between subjects. To enable widespread application of this approach, we introduce robust regression and robust inference in the neuroimaging context of application of the general linear model. Through simulation and empirical studies, we demonstrate that our robust approach reduces sensitivity to outliers without substantial degradation in power. The robust approach and associated software package provides a reliable way to quantitatively assess voxelwise correlations between structural and functional neuroimaging modalities.
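
    As a toy illustration of why robust regression matters here, the sketch below fits one voxel-like structure-function relationship with ordinary least squares and with an M-estimator (Huber weights); a few outlying subjects pull the OLS slope but barely move the robust one. The data are synthetic and the specific robust estimator is an assumption, not necessarily the one used in the authors' package.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(2)

        # Illustrative voxel: regress a functional value on a structural value across
        # subjects, with a few outliers such as mis-registered subjects would produce.
        structure = rng.normal(size=40)
        function = 0.8 * structure + rng.normal(scale=0.3, size=40)
        function[:3] += 5.0                                   # outlying subjects

        X = sm.add_constant(structure)
        ols_fit = sm.OLS(function, X).fit()
        robust_fit = sm.RLM(function, X, M=sm.robust.norms.HuberT()).fit()
        print(ols_fit.params, robust_fit.params)              # robust slope stays near 0.8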

  1. Delineating the Rattlesnake Springs, New Mexico Watershed Using Shallow Subsurface Geophysical Techniques and Geologic Mapping

    NASA Astrophysics Data System (ADS)

    Doser, D. I.; Langford, R. P.; Boykov, N. D.; Baker, M. R.; Kaip, G. M.

    2007-12-01

    Rattlesnake Springs serves as the sole water source for Carlsbad Caverns National Park. The recent development of oil and gas leases and agricultural lands surrounding the springs has led to concern about contamination of the karst aquifer. We have used geophysical techniques, combined with geologic mapping, to delineate possible fracture systems in the gypsum and carbonate bedrock that feed the spring system. Our initial work has focused on a 700 m by 700 m region surrounding the springs. We conducted a series of ground conductivity surveys with follow-up DC resistivity surveys (Wenner array vertical electrical soundings and a pole- pole survey) to determine variations in soil grain size and moisture content. Surface geologic mapping was used to identify a series of Holocene terraces and valleys that incise the terraces. Our combined results suggest that northwest-southeast and north-south trending fractures and dissolution features control regional water flow. Relict spring valleys are found to the west of the present springs. A pole-pole survey conducted around the perimeter of the springs suggests main water flow into the springs occurs from the northwest. We plan to complete a precision gravity survey in September and October 2007 to map bedrock topography and determine its relation to structural and dissolution features. Ground penetrating radar data will be collected on the northwestern side of the springs in an attempt to better delineate structures controlling inflow into the springs.

  2. Proceedings of a workshop on digital mapping techniques; methods for geologic map data capture, management, and publication - June 2 - 5, 1997, Lawrence, Kansas

    USGS Publications Warehouse

    Soller, David R.

    1997-01-01

    Introduction: From June 2-5, 1997, selected technical representatives of the USGS and State geological surveys participated in the 'AASG/USGS Digital Mapping Techniques' workshop in Lawrence, Kansas. The workshop was initiated by the AASG/USGS Data Capture Working Group, and was hosted by the Kansas Geological Survey (KGS). With a focus on methods for data capture and digital map production, the goal was to help move the state surveys and the USGS toward development of more cost-effective, flexible, and useful systems for digital mapping and GIS analysis.

  3. Study of lumineers' interfaces by means of optical coherence tomography

    NASA Astrophysics Data System (ADS)

    de Andrade Borges, Erica; Fernandes Cassimiro-Silva, Patrícia; Osório Fernandes, Luana; Leônidas Gomes, Anderson Stevens

    2015-06-01

    OCT has been used to evaluate dental materials and is employed here to evaluate lumineers for the first time. Lumineers are used as aesthetic indirect restorations; after wear and aging, several undesirable features such as gaps, bubbles and mismatch can appear which would otherwise only be seen by invasive analysis. OCT (spectral-domain SD-OCT, 930 nm central wavelength) was used to evaluate the lumineer-cement-tooth interface noninvasively. We analyzed 20 lumineer-tooth specimens that were prepared on bovine teeth and randomly allocated to 4 experimental groups (n=5) with two different cementation techniques and two different cementing agents (RelyX U200 and RelyX Veneer, 3M ESPE, with the adhesive recommended by the manufacturer). The lumineers were made of lithium disilicate and obtained using a vacuum injection technique. The analysis was performed using 2D and 3D OCT images obtained before and after cementation and after thermal cycling to simulate thermal stress in an oral cavity. Initial measurements showed that the SD-OCT was able to see through the 500 μm thick lumineer, as delivered by the manufacturer, and internal stress was observed. Failures were found in the cementation process and also after ageing simulation by thermal cycling. Adhesive failures such as bubbles, gaps and degradation of the cementation line are the natural precursors of other defects reported by several clinical follow-up studies (detachments, fractures and cracks). Bubble dimensions ranging from 146 μm to 1427 μm were measured, and OCT was validated as a precise investigative tool for evaluation of the lumineer-cement-tooth interface.

  4. Multiple mapping techniques of glaciovolcanic regions: Remote sensing and field mapping of Askja (Dyngjufjöll), Iceland

    NASA Astrophysics Data System (ADS)

    Ellis, M. K.; Graettinger, A. H.; Reath, K. A.; Skilling, I. P.; Ramsey, M. S.; Hughes, C. G.

    2011-12-01

    The Icelandic landscape is dominated by basaltic glaciovolcanic and Holocene post-glacial landforms. The lack of chemical diversity has led to significant simplification of regional maps, including areas of historic volcanic activity such as Askja (Dyngjufjöll) in central Iceland. The purpose of this study is to improve the resolution of the regional map through the identification of variations within glaciovolcanic units and to more accurately constrain the products of historic eruptions. As part of a larger project to map the 45 km2 of glaciovolcanic deposits of Askja volcano, Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) based mapping of the region surrounding the Dyngjufjöll massif was undertaken using previous datasets and recent field work for validation. Maps were based on 3 ASTER scenes obtained during overpasses in August 2010 and 2011. Field mapping of the glaciovolcanic deposits conducted in August 2010 and 2011, previous geologic maps of the Holocene deposits (Annertz 1985, Sigvaldason 1992) and previous remote sensing based investigations (Hjartardottir et al. 2009) were used to validate the Visible/Near-Infrared (VNIR) and Thermal Infrared (TIR) based maps. Laboratory TIR data were collected from samples of representative lithologic units within the field area, including glaciovolcanic deposits (pillow lavas, hyaloclastite, etc.), historical deposits (1875 pumice, 1960s basaltic lavas), and Holocene basaltic lavas from Askja. This project produced a highly detailed map of the distribution of the remaining eroded 1875 deposit within the proximal (<50 km) vent area, as well as of the complexity of neighboring glaciovolcanic massifs. Additionally, the products of this investigation demonstrate the potential of remote sensing based mapping of glaciovolcanic deposits throughout Iceland and other well-exposed geologic regions (e.g. British Columbia). Remote sensing aided mapping will benefit glaciovolcanic studies by

  5. Creep deformation behavior in eutectic Sn-Ag solder joints using a novel mapping technique

    SciTech Connect

    Lucas, J.P.; Guo, F.; McDougall, J.; Bieler, T.R.; Subramanian, K.N.; Park, J.K.

    1999-11-01

    Creep deformation behavior was measured for 60-100 μm thick solder joints. The solder joints investigated consisted of: (1) non-composite solder joints made with eutectic Sn-Ag solder, and (2) composite solder joints with eutectic Sn-Ag solder containing 20 vol.%, 5 μm diameter in-situ Cu6Sn5 intermetallic reinforcements. All creep testing in this study was carried out at room temperature. Qualitative and quantitative assessment of creep deformation was characterized on the solder joints. Creep deformation was analyzed using a novel mapping technique in which a geometrically regular line pattern was etched over the entire solder joint using excimer laser ablation. During creep, the laser-ablation (LA) pattern becomes distorted due to deformation in the solder joint. By imaging the distortion of the laser-ablation patterns in the SEM, actual deformation mapping for the entire solder joint is revealed. The technique involves sequential optical/digital imaging of the deformation-versus-time history during creep. By tracing and recording the deformation of the LA patterns on the solder over intervals of time, local creep data are obtained at many locations in the joint. This analysis enables global and localized creep shear strains and strain rates to be determined.

  6. Fingerprint detection and mapping using a phase shifted coherent gradient sensing technique.

    PubMed

    Dhanotia, Jitendra; Prakash, Satya; Bhatia, Vimal; Prakash, Shashi

    2016-07-10

    In this paper, a full field technique for mapping a latent fingerprint using a coherent gradient sensing (CGS) sensor is proposed. Collimated light from an He-Ne laser illuminates a specimen comprising a fingerprint implanted onto a reflecting surface. Reflected light from the specimen is analyzed using the CGS sensor comprising a pair of gratings. Reflected light carries information regarding the depth and orientation of furrows and ridges in the fingerprint. The topological information of the fingerprint is retrieved using four-step phase shifting interferometry. Well-defined 2D and 3D phase plots have been reconstructed to map the topography of the human fingerprint. The recorded slope data reconstructs the information regarding the separation and depth of the ridges in the latent fingerprint. The proposed technique is noninvasive and full field and does not require any kind of chemical or physical treatment. The sensor is very simple, yields interferometric sensitivity, and has the advantages of easy alignment, compactness, and low cost. PMID:27409305
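
    The four-step phase-shifting step referred to above recovers the wrapped phase from four interferograms shifted by quarter-period increments. A minimal numerical check is sketched below; the intensities are synthetic and the 0, pi/2, pi, 3pi/2 shift convention is assumed.

        import numpy as np

        def four_step_phase(i1, i2, i3, i4):
            """Wrapped phase from four interferograms shifted by 0, pi/2, pi, 3*pi/2."""
            return np.arctan2(i4 - i2, i1 - i3)

        # synthetic check: intensity I_k = a + b*cos(phi + k*pi/2) recovers phi
        phi_true = 1.2
        frames = [2.0 + 0.7 * np.cos(phi_true + k * np.pi / 2) for k in range(4)]
        print(four_step_phase(*frames))   # approximately 1.2 (wrapped to [-pi, pi])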

  7. Computer controlled techniques for high emission density mapping of thermionic cathodes

    NASA Astrophysics Data System (ADS)

    Gibson, J. W.; Thomas, R. E.

    1985-12-01

    Some of the techniques commonly used (e.g. SLEEP and thermionic emission microscope) for measuring emission or work function uniformity of thermionic cathode surfaces require the use of very low or near zero current densities, thus the cathode is characterized at current densities and temperatures much lower than that of a normally operating cathode. The system reported on here uses a high voltage pulse technique and is capable of measuring emission densities in the range 1 to 80 A/cm² at normal cathode operating temperatures. The cathode surface is scanned with an anode having a 0.025 mm aperture whose position is controlled by computer operated stepping motors. The current through the aperture to a collector electrode is measured using a sample-and-hold amplifier. Pulsing and sampling are computer synchronized with the scanning, and data for each pulse are accumulated and can be processed and displayed in several ways using the computer, including a detailed "three-dimensional" map of either the electron emission density or work function variations. The entire surface of the cathode or any portion of it can be mapped in steps as small as 0.001 mm (1 μm), but typically steps of 5-100 μm were used. Measurements are presented illustrating the uniformity or nonuniformity of the electron emission densities and work functions for type-B and type-M cathodes.

  8. Surface three-dimensional velocity map imaging: A new technique for the study of photodesorption dynamics

    SciTech Connect

    Ji Yuanyuan; Koehler, Sven P. K.; Auerbach, Daniel J.; Wodtke, Alec M.

    2010-07-15

    The authors describe the application of a combination of velocity map imaging and time-of-flight (TOF) techniques to obtain three-dimensional velocity distributions for surface photodesorption. They have established a systematic alignment procedure to achieve correct and reproducible experimental conditions. It includes four steps: (1) optimization of the velocity map imaging ion optics' voltages to achieve optimum velocity map imaging conditions; (2) alignment of the surface normal with the symmetry axis (ion flight axis) of the ion optics; (3) determination of the TOF distance between the surface and the ionizing laser beam; (4) alignment of the position of the ionizing laser beam with respect to the ion optics. They applied this set of alignment procedures and then measured Br(²P₃/₂) (Br) and Br(²P₁/₂) (Br*) atoms photodesorbing from a single crystal of KBr after exposure to 193 nm light. They analyzed the velocity flux and energy flux distributions for motion normal to the surface. The Br* normal energy distribution shows two clearly resolved peaks at approximately 0.017 and 0.39 eV, respectively. The former is slightly faster than expected for thermal desorption at the surface temperature and the latter is hyperthermal. The Br normal energy distribution shows a single broad peak that is likely composed of two hyperthermal components. The capability that surface three-dimensional velocity map imaging provides for measuring state-specific velocity distributions in all three dimensions separately and simultaneously for the products of surface photodesorption or surface reactions holds great promise to contribute to our understanding of these processes.

  9. Use of Technology-Assisted Techniques of Mind Mapping and Concept Mapping in Science Education: A Constructivist Study

    ERIC Educational Resources Information Center

    Balim, Ali Günay

    2013-01-01

    The study aims to investigate the effects of using mind maps and concept maps on students' learning of concepts in science courses. A total of 51 students participated in this study which used a quasi-experimental research design with pre-test/post-test control groups. The constructivist-inspired study was carried out in the sixth-grade…

  10. A Hybrid Technique for Thickness-Map Visualization of the Hip Cartilages in MRI

    NASA Astrophysics Data System (ADS)

    Khanmohammadi, Mahdieh; Zoroofi, Reza Aghaiezadeh; Nishii, Takashi; Tanaka, Hisashi; Sato, Yoshinobu

    Quantification of the hip cartilages is clinically important. In this study, we propose an automatic technique for segmentation and visualization of the acetabular and femoral head cartilages based on clinically obtained multi-slice T1-weighted MR data and a hybrid approach. We follow a knowledge-based approach by employing several features such as the anatomical shapes of the hip femoral and acetabular cartilages and the corresponding image intensities. We estimate the center of the femoral head by a Hough transform and then automatically select the volume of interest. We then automatically segment the hip bones by a self-adaptive vector quantization technique. Next, we localize the articular central line by a modified Canny edge detector based on first- and second-derivative filters along radial lines originating from the femoral head center, together with an anatomical constraint. We then roughly segment the acetabular and femoral head cartilages using the derivative images obtained in the previous step and a top-hat filter. Final masks of the acetabular and femoral head cartilages are then generated automatically by employing the rough results, the estimated articular central line and anatomical knowledge. Next, we generate a thickness map for each cartilage in the radial direction based on a Euclidean distance. The three-dimensional pelvic bones, acetabular and femoral cartilages and corresponding thicknesses are overlaid and visualized. The techniques have been implemented in C++ and the MATLAB environment. We have evaluated and clarified the usefulness of the proposed techniques on 40 clinical multi-slice MR images of hips.

  11. Rethinking Program Assessment through the Use of Program Alignment Mapping Technique

    ERIC Educational Resources Information Center

    Liu, Min; Wrobbel, Duff; Blankson, Isaac

    2010-01-01

    Curriculum mapping is a well-known assessment tool used to articulate and align a curriculum. The authors present an improved method of curriculum alignment that combines the traditional curriculum mapping with what is often called prerequisite mapping. This improved method of curriculum alignment mapping we label Program Alignment Mapping (PAM)…

  12. MRI technique for the snapshot imaging of quantitative velocity maps using RARE

    NASA Astrophysics Data System (ADS)

    Shiko, G.; Sederman, A. J.; Gladden, L. F.

    2012-03-01

    A quantitative PGSE-RARE pulse sequence was developed and successfully applied to the in situ dissolution of two pharmaceutical formulations dissolving over a range of timescales. The new technique was chosen over other existing fast velocity imaging techniques because it is T2 weighted, not T2∗ weighted, and is, therefore, robust for imaging time-varying interfaces and flow in magnetically heterogeneous systems. The complex signal was preserved intact by separating odd and even echoes to obtain two phase maps which are then averaged in post-processing. Initially, the validity of the technique was shown when imaging laminar flow in a pipe. Subsequently, the dissolution of two drugs was followed in situ, where the technique enables the imaging and quantification of changes in the form of the tablet and the flow field surrounding it at high spatial and temporal resolution. First, the complete 3D velocity field around an eroding salicylic acid tablet was acquired at a resolution of 98 × 49 μm2, within 20 min, and monitored over ˜13 h. The tablet was observed to experience a heterogeneous flow field and, hence a heterogeneous shear field, which resulted in the non-symmetric erosion of the tablet. Second, the dissolution of a fast dissolving immediate release tablet was followed using one-shot 2D velocity images acquired every 5.2 s at a resolution of 390 × 390 μm2. The quantitative nature of the technique and fast acquisition times provided invaluable information on the dissolution behaviour of this tablet, which had not been attainable previously with conventional quantitative MRI techniques.

  13. 78 FR 70964 - Luminant Generation Company, LLC

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-27

    ... From the Federal Register Online via the Government Publishing Office NUCLEAR REGULATORY COMMISSION Luminant Generation Company, LLC AGENCY: Nuclear Regulatory Commission. ACTION: Combined license... for four consecutive weeks of a combined license (COL) application from Luminant Generation...

  14. 78 FR 66785 - Luminant Generation Company, LLC

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-06

    ... From the Federal Register Online via the Government Publishing Office NUCLEAR REGULATORY COMMISSION Luminant Generation Company, LLC AGENCY: Nuclear Regulatory Commission. ACTION: Notice of receipt... consecutive weeks of ] a combined license (COL) application from Luminant Generation Company, LLC....

  15. 78 FR 68100 - Luminant Generation Company, LLC

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-13

    ... From the Federal Register Online via the Government Publishing Office NUCLEAR REGULATORY COMMISSION Luminant Generation Company, LLC AGENCY: U.S. Nuclear Regulatory Commission (NRC). ACTION... consecutive weeks of a combined license (COL) application from Luminant Generation Company, LLC....

  16. 78 FR 69710 - Luminant Generation Company, LLC

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-20

    ... From the Federal Register Online via the Government Publishing Office NUCLEAR REGULATORY COMMISSION Luminant Generation Company, LLC AGENCY: U.S. Nuclear Regulatory Commission (NRC). ACTION... consecutive weeks of a combined license (COL) application from Luminant Generation Company, LLC....

  17. The Development of Luminance- and Texture-Defined Form Perception during the School-Aged Years

    ERIC Educational Resources Information Center

    Bertone, Armando; Hanck, Julie; Guy, Jacalyn; Cornish, Kim

    2010-01-01

    The objective of the present study was to assess the development of luminance- and texture-defined static form perception in school-aged children. This was done using an adapted Landolt-C technique where C-optotypes were defined by either luminance or texture information, the latter necessitating extra-striate neural processing to be perceived.…

  18. Updating Landsat-derived land-cover maps using change detection and masking techniques

    NASA Technical Reports Server (NTRS)

    Likens, W.; Maw, K.

    1982-01-01

    The California Integrated Remote Sensing System's San Bernardino County Project was devised to study the utilization of a data base at a number of jurisdictional levels. The present paper discusses the implementation of change-detection and masking techniques in the updating of Landsat-derived land-cover maps. A baseline landcover classification was first created from a 1976 image, then the adjusted 1976 image was compared with a 1979 scene by the techniques of (1) multidate image classification, (2) difference image-distribution tails thresholding, (3) difference image classification, and (4) multi-dimensional chi-square analysis of a difference image. The union of the results of methods 1, 3 and 4 was used to create a mask of possible change areas between 1976 and 1979, which served to limit analysis of the update image and reduce comparison errors in unchanged areas. The techniques of spatial smoothing of change-detection products, and of combining results of difference change-detection algorithms are also shown to improve Landsat change-detection accuracies.
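
    Method 2 above, thresholding the tails of the difference-image distribution, can be sketched in a few lines; the threshold of two standard deviations and the synthetic scenes are illustrative assumptions, not the project's parameters.

        import numpy as np

        def change_mask_from_difference(image_t1, image_t2, k=2.0):
            """Flag pixels whose brightness difference lies in the tails of the
            difference-image distribution, as a simple change-detection sketch."""
            diff = image_t2.astype(float) - image_t1.astype(float)
            mu, sigma = diff.mean(), diff.std()
            return np.abs(diff - mu) > k * sigma      # True where change is suspected

        rng = np.random.default_rng(3)
        t1 = rng.normal(100, 5, (50, 50))
        t2 = t1 + rng.normal(0, 2, (50, 50))
        t2[10:15, 10:15] += 40                         # a patch of genuine land-cover change
        print(change_mask_from_difference(t1, t2).sum())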

  19. Probabilistic flood inundation mapping of ungauged rivers: Linking GIS techniques and frequency analysis

    NASA Astrophysics Data System (ADS)

    Sarhadi, Ali; Soltani, Saeed; Modarres, Reza

    2012-08-01

    This study presents an exhaustive methodology for floodplain mapping of ungauged rivers. To present our methodology, we selected the Halilrud basin and Jiroft city in southeastern Iran as an example of hazardous regions. To estimate flood quantiles for different return periods at ungauged reaches, we used regional flood frequency analysis. Using the well-known L-moments approach and related criteria, a homogeneous region was formed and the 3-parameter log-normal distribution was identified as the robust regional frequency distribution. The hydro-geomorphic characteristics and land use properties of the catchments were then extracted using RS and GIS techniques to establish multivariate regional regression models between hydro-geomorphic characteristics and flood quantiles. After delineation of the catchments for the ungauged reaches, flood quantiles with different probabilities, as an important factor in flood inundation at the outlets of these reaches, were estimated using the regional regression models. To delineate flood hazard maps and to enhance the accuracy of the hydraulic modeling, we applied satellite stereoscopic images from Cartosat-1 along with the Rational Polynomial Coefficients to extract a high-resolution DTM and detailed parameterization of the channel required by the 1D hydraulic model HEC-RAS. The GIS-based HEC-GeoRAS pre- and post-processor was also used for careful optimization of the geometry features for realistic visualization of the flood-prone areas. Information on some historical flood events was also used to evaluate the hydraulic model's performance in predicting flood inundation. Finally, vulnerable areas were crossed with land use mapping extracted from IRS-P6 satellite images to differentiate the critical infrastructures and valuable land use classes affected by floods in different return periods.

  20. [Endoscopic management of luminal stenosis in inflammatory bowel disease].

    PubMed

    Lorenzo-Zúñiga, Vicente; García-Planella, Esther; Moreno De Vega, Vicente; Domènech, Eugeni; Boix, Jaume

    2012-01-01

    Luminal stenosis is frequent in Crohn's disease (CD) due to transmural involvement. Before any endoscopic treatment, the presence of neoplastic stenosis should always be excluded. Endoscopic balloon dilatation has been used in several series to treat benign stenosis, mainly in CD with involvement of the distal ileum, colon or surgical anastomosis, with success rates of 51% to 85%, although recurrence is high. The concomitant use of injected steroids (triamcinolone) after endoscopic dilatation produces longer-lasting results, but there are few published reports. In patients with luminal stenosis refractory to conventional endoscopy, three emerging techniques may be useful: self-expanding metallic stents, biodegradable endoprostheses and intralesional infliximab injection. PMID:22341673

  1. Inventory and mapping of flood inundation using interactive digital image analysis techniques

    USGS Publications Warehouse

    Rohde, Wayne G.; Nelson, Charles A.; Taranik, J.V.

    1979-01-01

    LANDSAT digital data and color infrared photographs were used in a multiphase sampling scheme to estimate the area of agricultural land affected by a flood. The LANDSAT data were classified with a maximum likelihood algorithm. Stratification of the LANDSAT data prior to classification greatly reduced misclassification errors. The classification results were used to prepare a map overlay showing the areal extent of flooding. These data also provided the statistics required to estimate sample size in a two-phase sampling scheme, and provided quick, accurate estimates of flooded areas for the first phase. The measurements made in the second phase, based on ground data and photo-interpretation, were used with two-phase sampling statistics to estimate the area of agricultural land affected by flooding. These results show that LANDSAT digital data can be used to prepare map overlays showing the extent of flooding on agricultural land and, with two-phase sampling procedures, can provide acreage estimates with sampling errors of about 5 percent. This procedure provides a technique for rapidly assessing the areal extent of flood conditions on agricultural land and would provide a basis for designing a sampling framework to estimate the impact of flooding on crop production.

  2. Quantitative mapping of pore fraction variations in silicon nitride using an ultrasonic contact scan technique

    NASA Technical Reports Server (NTRS)

    Roth, Don J.; Kiser, James D.; Swickard, Suzanne M.; Szatmary, Steven A.; Kerwin, David P.

    1993-01-01

    An ultrasonic scan procedure using the pulse-echo contact configuration was employed to obtain maps of pore fraction variations in sintered silicon nitride samples in terms of ultrasonic material properties. Ultrasonic velocity, attenuation coefficient, and reflection coefficient images were obtained simultaneously over a broad band of frequencies (e.g., 30 to 110 MHz) by using spectroscopic analysis. Liquid and membrane (dry) coupling techniques and longitudinal and shear-wave energies were used. The major results include the following: Ultrasonic velocity (longitudinal and shear wave) images revealed and correlated with the extent of average through-thickness pore fraction variations in the silicon nitride disks. Attenuation coefficient images revealed pore fraction nonuniformity due to the scattering that occurred at boundaries between regions of high and low pore fraction. Velocity and attenuation coefficient images were each nearly identical for machined and polished disks, making the method readily applicable to machined materials. Velocity images were similar for wet and membrane coupling. Maps of apparent Poisson's ratio constructed from longitudinal and shear-wave velocities quantified Poisson's ratio variations across a silicon nitride disk. Thermal wave images of a disk indicated transient thermal behavior variations that correlated with observed variations in pore fraction and velocity and attenuation coefficients.

  3. Supersonic shear imaging: a new technique for soft tissue elasticity mapping.

    PubMed

    Bercoff, Jérémy; Tanter, Mickaël; Fink, Mathias

    2004-04-01

    Supersonic shear imaging (SSI) is a new ultrasound-based technique for real-time visualization of soft tissue viscoelastic properties. Using ultrasonic focused beams, it is possible to remotely generate mechanical vibration sources radiating low-frequency, shear waves inside tissues. Relying on this concept, SSI proposes to create such a source and make it move at a supersonic speed. In analogy with the "sonic boom" created by a supersonic aircraft, the resulting shear waves will interfere constructively along a Mach cone, creating two intense plane shear waves. These waves propagate through the medium and are progressively distorted by tissue heterogeneities. An ultrafast scanner prototype is able to both generate this supersonic source and image (5000 frames/s) the propagation of the resulting shear waves. Using inversion algorithms, the shear elasticity of medium can be mapped quantitatively from this propagation movie. The SSI enables tissue elasticity mapping in less than 20 ms, even in strongly viscous medium like breast. Modalities such as shear compounding are implementable by tilting shear waves in different directions and improving the elasticity estimation. Results validating SSI in heterogeneous phantoms are presented. The first in vivo investigations made on healthy volunteers emphasize the potential clinical applicability of SSI for breast cancer detection. PMID:15139541

  4. Mapping air pollution using Earth observation techniques for cultural heritage sites

    NASA Astrophysics Data System (ADS)

    Agapiou, Athos; Nisantzi, Argyro; Lysandrou, Vasiliki; Mamouri, Rodanthi; Alexakis, Dimitrios D.; Themistocleous, Kyriacos; Sarris, Apostolos; Hadjimitsis, Diofantos G.

    2013-08-01

    Air pollutants, together with climatic parameters, are of major importance for the deterioration of cultural heritage monuments. Atmospheric pollution is widely recognized as one of the major anthropogenic threats to architectural cultural heritage, in particular when associated with water absorption phenomena. Atmospheric particle deposition on the surfaces of monuments of cultural heritage interest may cause an aesthetic impact induced by a series of chemical reactions. Therefore, there is a need for systematic monitoring and mapping of air pollution in areas where important archaeological sites and monuments are found. Earth observation techniques, such as the use of satellite images for the retrieval of Aerosol Optical Thickness (AOT), are ideal for this purpose. In this paper, all important monuments of the Paphos District, listed by the Department of Antiquities of Cyprus, have been mapped using Geographical Information Systems. Several recent (2012) MODIS satellite images (both Aqua and Terra) have been used to extract AOT values in this area. Multi-temporal analysis was performed to identify high-risk areas where AOT values are considered to be high. In situ observations have also been carried out to verify the results.

  5. Mapping the temporal pole with a specialized electrode array: technique and preliminary results

    PubMed Central

    Abel, Taylor J.; Rhone, Ariane E.; Nourski, Kirill V.; Granner, Mark A.; Oya, Hiroyuki; Griffiths, Timothy D.; Tranel, Daniel T.; Kawasaki, Hiroto; Howard, Matthew A.

    2014-01-01

    Temporopolar cortex plays a crucial role in the pathogenesis of temporal lobe epilepsy and subserves important cognitive functions. Because of its shape and position in the middle cranial fossa, complete electrode coverage of the temporal pole (TP) is difficult to achieve using existing devices. We designed a novel TP electrode array that conforms to the surface of temporopolar cortex and achieves dense electrode coverage of this important brain region. A multi-pronged electrode array was designed that can be placed over the surface of the TP using a straightforward insertion technique. Twelve patients with medically intractable epilepsy were implanted with the TP electrode array for purposes of seizure localization. Select patients underwent cognitive mapping by electrocorticographic (ECoG) recording from the TP during a naming task. Use of the array resulted in excellent TP electrode coverage in all patients. High quality ECoG data were consistently obtained for purposes of delineating seizure activity and functional mapping. During a naming task, significant increases in ECoG power were observed within localized subregions of the TP. One patient developed a transient neurological deficit thought to be related to the mass effect of multiple intracranial recording arrays, including the TP array. This deficit resolved following removal of all electrodes. The TP electrode array overcomes limitations of existing devices and enables clinicians and researchers to obtain optimal multi-site recordings from this important brain region. PMID:24480831

  6. Pilot Application of 3d Underwater Imaging Techniques for Mapping Posidonia Oceanica (L.) Delile Meadows

    NASA Astrophysics Data System (ADS)

    Rende, F. S.; Irving, A. D.; Lagudi, A.; Bruno, F.; Scalise, S.; Cappa, P.; Montefalcone, M.; Bacci, T.; Penna, M.; Trabucco, B.; Di Mento, R.; Cicero, A. M.

    2015-04-01

    Seagrass communities are considered one of the most productive and complex marine ecosystems. Seagrasses belong to a small group of 66 species that can form extensive meadows in all coastal areas of our planet. Posidonia oceanica beds are the most characteristic ecosystem of the Mediterranean Sea, and should be constantly monitored, preserved and maintained, as specified by EU Habitats Directive for priority habitats. Underwater 3D imaging by means of still or video cameras can allow a detailed analysis of the temporal evolution of these meadows, but also of the seafloor morphology and integrity. Video-photographic devices and open source software for acquiring and managing 3D optical data rapidly became more and more effective and economically viable, making underwater 3D mapping an easier task to carry out. 3D reconstruction of the underwater scene can be obtained with photogrammetric techniques that require just one or more digital cameras, also in stereo configuration. In this work we present the preliminary results of a pilot 3D mapping project applied to the P. oceanica meadow in the Marine Protected Area of Capo Rizzuto (KR, Calabria Region - Italy).

  7. Performance prediction of four-contact vertical Hall-devices using a conformal mapping technique

    NASA Astrophysics Data System (ADS)

    Yang, Huang; Yue, Xu; Yufeng, Guo

    2015-12-01

    Instead of the conventional design with five contacts in the sensor active area, innovative vertical Hall devices (VHDs) with four contacts and six contacts are asymmetrical in structural design but symmetrical in current flow, which makes them well suited to the spinning-current technique for offset elimination. In this article, a conformal mapping calculation method is used to predict the performance of an asymmetrical VHD with four contacts embedded in a deep n-well. Furthermore, to make the calculation more accurate, the junction field effect is also incorporated into the conformal mapping method. The error between calculated and simulated results is less than 5% for the current-related sensitivity, and approximately 13% for the voltage-related sensitivity. This proves that such calculations can be used to predict the optimal structure of vertical Hall devices. Project supported by the Natural Science Foundation of Jiangsu Province, China (Nos. BK20131379, BK20141431) and the Graduate Research and Innovation Projects of Jiangsu Province (No. SJLX_0373).

  8. Application of the lamp mapping technique for overlap function for Raman lidar systems.

    PubMed

    Walker, Monique; Venable, Demetrius; Whiteman, David N; Sakai, Tetsu

    2016-04-01

    Traditionally, the lidar water vapor mixing ratio (WVMR) is corrected for overlap using data from another instrument, such as a radiosonde. Here we introduce a new experimental method to determine the overlap function using the lamp mapping technique (LMT), which relies on the lidar optics and detection system. The LMT discussed here involves a standard halogen lamp being scanned over the aperture of a Raman lidar telescope in synchronization with the lidar detection system [Appl. Opt. 50, 4622 (2011); Appl. Opt. 53, 8535 (2014)]. In this paper, we show results for an LMT-determined overlap function for individual channels, as well as a WVMR overlap function. We found that the LMT-determined WVMR overlap functions deviate within 5% of the traditional radiosonde-determined overlap. PMID:27139656

  9. A Multi-Resolution Nonlinear Mapping Technique for Design and Analysis Application

    NASA Technical Reports Server (NTRS)

    Phan, Minh Q.

    1997-01-01

    This report describes a nonlinear mapping technique where the unknown static or dynamic system is approximated by a sum of dimensionally increasing functions (one-dimensional curves, two-dimensional surfaces, etc.). These lower dimensional functions are synthesized from a set of multi-resolution basis functions, where the resolutions specify the level of details at which the nonlinear system is approximated. The basis functions also cause the parameter estimation step to become linear. This feature is taken advantage of to derive a systematic procedure to determine and eliminate basis functions that are less significant for the particular system under identification. The number of unknown parameters that must be estimated is thus reduced and compact models obtained. The lower dimensional functions (identified curves and surfaces) permit a kind of "visualization" into the complexity of the nonlinearity itself.
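
    Because the model is a sum of basis functions and is therefore linear in its parameters, the estimation step reduces to least squares, and small coefficients can be pruned afterwards to obtain a compact model. The sketch below illustrates that idea for a one-dimensional static nonlinearity using Gaussian bumps at three resolutions; the basis choice, resolutions and test function are assumptions, not the report's.

        import numpy as np

        rng = np.random.default_rng(6)

        # Unknown static nonlinearity to be identified (illustrative): y = f(x) + noise.
        x = np.linspace(-1.0, 1.0, 200)
        y = np.tanh(3 * x) + 0.05 * rng.normal(size=x.size)

        # Multi-resolution basis: Gaussian bumps at coarse, medium and fine spacings.
        def gaussian_basis(x, centres, width):
            return np.exp(-((x[:, None] - centres[None, :]) / width) ** 2)

        blocks = [gaussian_basis(x, np.linspace(-1, 1, n), 2.0 / n) for n in (3, 7, 15)]
        Phi = np.hstack(blocks)                    # design matrix: linear in the parameters

        # Linear-in-parameters estimation; columns with tiny coefficients could then
        # be pruned to keep only the significant basis functions.
        coeffs, *_ = np.linalg.lstsq(Phi, y, rcond=None)
        y_hat = Phi @ coeffs
        print(np.sqrt(np.mean((y_hat - y) ** 2)))  # fit residual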

  10. Mapping to Irregular Torus Topologies and Other Techniques for Petascale Biomolecular Simulation

    PubMed Central

    Phillips, James C.; Sun, Yanhua; Jain, Nikhil; Bohm, Eric J.; Kalé, Laxmikant V.

    2014-01-01

    Currently deployed petascale supercomputers typically use toroidal network topologies in three or more dimensions. While these networks perform well for topology-agnostic codes on a few thousand nodes, leadership machines with 20,000 nodes require topology awareness to avoid network contention for communication-intensive codes. Topology adaptation is complicated by irregular node allocation shapes and holes due to dedicated input/output nodes or hardware failure. In the context of the popular molecular dynamics program NAMD, we present methods for mapping a periodic 3-D grid of fixed-size spatial decomposition domains to 3-D Cray Gemini and 5-D IBM Blue Gene/Q toroidal networks to enable hundred-million atom full machine simulations, and to similarly partition node allocations into compact domains for smaller simulations using multiple-copy algorithms. Additional enabling techniques are discussed and performance is reported for NCSA Blue Waters, ORNL Titan, ANL Mira, TACC Stampede, and NERSC Edison. PMID:25594075

  11. A Multi-Resolution Nonlinear Mapping Technique for Design and Analysis Applications

    NASA Technical Reports Server (NTRS)

    Phan, Minh Q.

    1998-01-01

    This report describes a nonlinear mapping technique where the unknown static or dynamic system is approximated by a sum of dimensionally increasing functions (one-dimensional curves, two-dimensional surfaces, etc.). These lower dimensional functions are synthesized from a set of multi-resolution basis functions, where the resolutions specify the level of details at which the nonlinear system is approximated. The basis functions also cause the parameter estimation step to become linear. This feature is taken advantage of to derive a systematic procedure to determine and eliminate basis functions that are less significant for the particular system under identification. The number of unknown parameters that must be estimated is thus reduced and compact models obtained. The lower dimensional functions (identified curves and surfaces) permit a kind of "visualization" into the complexity of the nonlinearity itself.

  12. DIGITAL PROCESSING TECHNIQUES FOR IMAGE MAPPING WITH LANDSAT TM AND SPOT SIMULATOR DATA.

    USGS Publications Warehouse

    Chavez, Pat S., Jr.

    1984-01-01

    To overcome certain problems associated with the visual selection of Landsat TM bands for image mapping, the author used a quantitative technique that ranks the 20 possible three-band combinations based upon their information content. Standard deviations and correlation coefficients can be used to compute a value called the Optimum Index Factor (OIF) for each of the 20 possible combinations. SPOT simulator images were digitally processed and compared with Landsat-4 Thematic Mapper (TM) images covering a semi-arid region in northern Arizona and a highly vegetated urban area near Washington, D. C. Statistical comparisons indicate that more radiometric or color information exists in certain TM three-band combinations than in the three SPOT bands.
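
    The Optimum Index Factor is commonly computed as the sum of the standard deviations of the three candidate bands divided by the sum of the absolute values of their pairwise correlation coefficients. The Python sketch below ranks all three-band combinations this way; the formula is the standard OIF definition and the array shapes are assumptions, so treat it as an illustration rather than the report's exact procedure:

        # Rank all three-band combinations of a multiband image by the Optimum Index
        # Factor, OIF = (sum of band standard deviations) / (sum of |pairwise r|).
        # 'bands' is assumed to be an array of shape (n_bands, rows, cols).
        import numpy as np
        from itertools import combinations

        def oif_ranking(bands):
            flat = bands.reshape(bands.shape[0], -1)
            std = flat.std(axis=1)
            corr = np.corrcoef(flat)                       # pairwise correlation matrix
            scores = []
            for i, j, k in combinations(range(bands.shape[0]), 3):
                denom = abs(corr[i, j]) + abs(corr[i, k]) + abs(corr[j, k])
                scores.append(((i, j, k), (std[i] + std[j] + std[k]) / denom))
            return sorted(scores, key=lambda t: t[1], reverse=True)

        # Example with random data standing in for the six reflective TM bands,
        # which gives the 20 combinations mentioned above:
        print(oif_ranking(np.random.rand(6, 100, 100))[:3])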

  13. A new context-model for the pseudo-distance technique in lossless compression of color-mapped images

    NASA Astrophysics Data System (ADS)

    Koc, Basar; Arnavut, Ziya

    2012-10-01

    In this work, we propose a method that utilizes a new context model along with a pseudo-distance technique in compression of color-mapped images. Graphic Interchange Format (GIF) and Portable Network Graphics (PNG) are two of the well-known and frequently used techniques for the compression of color-mapped images. There are several techniques that achieve better compression results than GIF and PNG; however, most of these techniques need two passes on the image data, while others do not run in linear time. The pseudo-distance technique runs in linear time and requires only one pass. We show that using the proposed context model along with the pseudo-distance technique yields better results than both PNG and GIF.

  14. Chromospheres of Luminous Cool Stars

    NASA Astrophysics Data System (ADS)

    Dupree, Andrea K.; Avrett, Eugene

    2015-08-01

    Ultraviolet imaging of Alpha Orionis (Betelgeuse) reveals a complex variable chromospheric structure. Such atmospheres in luminous cool stars can affect features in the optical spectrum. Constructing semi-empirical model atmospheres of luminous stars including the temperature rise due to a chromosphere allows us to predict potential effects on optical transitions. The radiative transfer code, PANDORA, calculates line strengths in an LTE or non-LTE formulation, spherical symmetry, and includes velocity fields when present. Various aspects of the line calculations and their impact on equivalent widths will be discussed, including developing appropriate chromospheric models, comparison to a pure radiative equilibrium model, transitions sensitive to non-LTE, and the effects of a realistic spherical non-LTE approximation as compared to a plane-parallel approximation. We discuss the extent to which a chromosphere can impact the determination of stellar abundances.

  15. Mapping aboveground woody biomass using forest inventory, remote sensing and geostatistical techniques.

    PubMed

    Yadav, Bechu K V; Nandy, S

    2015-05-01

    Mapping forest biomass is fundamental for estimating CO₂ emissions, and planning and monitoring of forests and ecosystem productivity. The present study attempted to map aboveground woody biomass (AGWB) integrating forest inventory, remote sensing and geostatistical techniques, viz., direct radiometric relationships (DRR), k-nearest neighbours (k-NN) and cokriging (CoK), and to evaluate their accuracy. A part of the Timli Forest Range of Kalsi Soil and Water Conservation Division, Uttarakhand, India was selected for the present study. Stratified random sampling was used to collect biophysical data from 36 sample plots of 0.1 ha (31.62 m × 31.62 m) size. Species-specific volumetric equations were used for calculating volume and multiplied by specific gravity to get biomass. Three forest-type density classes, viz. 10-40, 40-70 and >70% of Shorea robusta forest and four non-forest classes were delineated using on-screen visual interpretation of IRS P6 LISS-III data of December 2012. The volume in different strata of forest-type density ranged from 189.84 to 484.36 m³ ha⁻¹. The total growing stock of the forest was found to be 2,024,652.88 m³. The AGWB ranged from 143 to 421 Mg ha⁻¹. Spectral bands and vegetation indices were used as independent variables and biomass as the dependent variable for DRR, k-NN and CoK. After validation and comparison, the k-NN method with Mahalanobis distance (root mean square error (RMSE) = 42.25 Mg ha⁻¹) was found to be the best method, followed by fuzzy distance and Euclidean distance with RMSE of 44.23 and 45.13 Mg ha⁻¹ respectively. DRR was found to be the least accurate method, with an RMSE of 67.17 Mg ha⁻¹. The study highlighted the potential of integrating forest inventory, remote sensing and geostatistical techniques for forest biomass mapping. PMID:25930205
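
    As a rough illustration of the best-performing approach above (k-NN imputation with a Mahalanobis distance), the Python sketch below predicts biomass for new pixels as the mean biomass of the k spectrally most similar field plots. The value of k, the predictor dimensionality, and the synthetic data are assumptions for illustration only:

        # Hedged sketch of k-NN biomass estimation with a Mahalanobis distance:
        # predict aboveground woody biomass for a target pixel as the average biomass
        # of the k spectrally most similar field plots.
        import numpy as np

        def knn_mahalanobis(X_plots, y_biomass, X_targets, k=5):
            cov_inv = np.linalg.pinv(np.cov(X_plots, rowvar=False))
            preds = []
            for x in X_targets:
                d = X_plots - x
                dist = np.sqrt(np.einsum('ij,jk,ik->i', d, cov_inv, d))   # Mahalanobis
                nearest = np.argsort(dist)[:k]
                preds.append(y_biomass[nearest].mean())
            return np.array(preds)

        # 36 plots with (bands + vegetation indices) as predictors, biomass in Mg/ha:
        rng = np.random.default_rng(1)
        X_plots = rng.normal(size=(36, 6))
        y_biomass = rng.uniform(143, 421, size=36)
        print(knn_mahalanobis(X_plots, y_biomass, rng.normal(size=(4, 6))))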

  16. Application of Geologic Mapping Techniques and Autonomous Feature Detection to Future Exploration of Europa

    NASA Astrophysics Data System (ADS)

    Bunte, M. K.; Tanaka, K. L.; Doggett, T.; Figueredo, P. H.; Lin, Y.; Greeley, R.; Saripalli, S.; Bell, J. F.

    2013-12-01

    Europa's extremely young surface age, evidence for extensive resurfacing, and indications of a sub-surface ocean elevate its astrobiological potential for habitable environments and make it a compelling focus for study. Knowledge of the global distribution and timing of Europan geologic units is a key step in understanding the history of the satellite and for identifying areas relevant for exploration. I have produced a 1:15M scale global geologic map of Europa which represents a proportionate distribution of four unit types and associated features: plains, linea, chaos, and crater materials. Mapping techniques differ somewhat from other planetary maps but do provide a method to establish stratigraphic markers and to illustrate the surface history through four periods of formation as a function of framework lineament cross-cutting relationships. Correlations of observed features on Europa with Earth analogs enforce a multi-process theory for formation rather than the typical reliance on the principle of parsimony. Lenticulae and microchaos are genetically similar and most likely form by diapirism. Platy and blocky chaos units, endmembers of archetypical chaos, are best explained by brine mobilization. Ridges account for the majority of lineaments and may form by a number of methods indicative of local conditions; most form by either tidal pumping or shear heating. The variety of morphologies exhibited by bands indicates that multiple formation mechanisms apply once fracturing of the brittle surface over a ductile subsurface is initiated. Mapping results support the interpretation that Europa's shell has thickened over time resulting in changes in the style and intensity of deformation. Mapping serves as an index for change detection and classification, aids in pre-encounter targeting, and supports the selection of potential landing sites. Highest priority target areas are those which indicate geophysical activity by the presence of volcanic plumes, outgassing, or

  17. Effect of display polarity and luminance contrast on visual lobe shape characteristics.

    PubMed

    Tsang, Steve N H; Chan, Alan H S; Yu, R F

    2012-01-01

    The effect of display polarity and luminance contrast on visual lobe (effective visual field) shape characteristics was studied using three levels of luminance contrast with combinations of positive and negative polarities. The binocular effective visual field for a detection task, with a peripherally presented target (V) embedded in a homogeneous competing background (Xs), was mapped on 24 imaginary axes passing through the fixation point. The results showed that visual lobes mapped using positive polarity were statistically larger in area, rounder and more regular in shape than those for negative polarity. The medium contrast condition lobes were more symmetric and regular than low contrast condition lobes, and lobe area and perimeter increased with increasing luminance contrast ratio. Under the interaction of positive polarity and high luminance contrast, visual lobes were found to be larger, smoother and rounder. The high level of luminance and contrast however resulted in a higher degree of visual discomfort. The results indicated that positive polarity and contrast of medium (26:1) to high (41:1) levels are possible display settings for better visual lobe characteristics and better anticipated search performance. Practitioner Summary: The effect of display polarity and luminance contrast on visual lobe shape characteristics was examined with uniform stimulus materials in this study. The results help to identify the optimum display settings for luminance contrast and display polarity to enhance lobe shape characteristics and hence search performance in industrial inspection tasks. PMID:22676836

  18. A self-trained classification technique for producing 30 m percent-water maps from Landsat data

    USGS Publications Warehouse

    Rover, Jennifer R.; Wylie, Bruce K.; Ji, Lei

    2010-01-01

    Small bodies of water can be mapped with moderate-resolution satellite data using methods where water is mapped as subpixel fractions using field measurements or high-resolution images as training datasets. A new method, developed from a regression-tree technique, uses a 30 m Landsat image for training the regression tree that, in turn, is applied to the same image to map subpixel water. The self-trained method was evaluated by comparing the percent-water map with three other maps generated from established percent-water mapping methods: (1) a regression-tree model trained with a 5 m SPOT 5 image, (2) a regression-tree model based on endmembers and (3) a linear unmixing classification technique. The results suggest that subpixel water fractions can be accurately estimated when high-resolution satellite data or intensively interpreted training datasets are not available, which increases our ability to map small water bodies or small changes in lake size at a regional scale.
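
    A minimal Python sketch of the self-training idea, under stated assumptions: obviously pure water and pure land pixels are labelled from the same 30 m image (here with a simple NDWI threshold, used as a stand-in for the paper's training-pixel selection), a regression tree is trained on their spectra with targets of 1.0 and 0.0, and the tree is applied back to every pixel to estimate subpixel water fractions:

        # Hedged sketch of a self-trained subpixel water mapper.  The NDWI thresholds,
        # band order, and tree settings are illustrative assumptions only.
        import numpy as np
        from sklearn.tree import DecisionTreeRegressor

        def self_trained_percent_water(image, green=1, nir=3):
            # image: array of shape (bands, rows, cols), reflectance values.
            bands, rows, cols = image.shape
            X = image.reshape(bands, -1).T
            ndwi = (X[:, green] - X[:, nir]) / (X[:, green] + X[:, nir] + 1e-9)
            pure_water, pure_land = ndwi > 0.4, ndwi < -0.2      # assumed thresholds
            X_train = np.vstack([X[pure_water], X[pure_land]])
            y_train = np.concatenate([np.ones(pure_water.sum()), np.zeros(pure_land.sum())])
            tree = DecisionTreeRegressor(max_depth=8).fit(X_train, y_train)
            return np.clip(tree.predict(X), 0, 1).reshape(rows, cols)  # percent water

        # Usage with synthetic 6-band data standing in for a Landsat scene:
        water_fraction = self_trained_percent_water(np.random.rand(6, 50, 50))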

  19. Threading Dislocation Characterization and Stress Mapping Depth Profiling via Ray Tracing Technique

    NASA Astrophysics Data System (ADS)

    Zhou, Tianyi

    Zinc oxide (ZnO) has been well known as a transparent, dielectric, piezoelectric and wide band gap material. Its potential has been demonstrated for a wide range of applications such as piezoelectric transducers, gas sensors, optical waveguides and transparent electrodes. It could also be applied as a substrate material for GaN-based devices. However, while some applications have already been realized, issues relating to crystalline defects remain a barrier to the successful realization of several others. In this thesis, the central focus of Chapter II is to characterize threading dislocations in hydrothermally grown ZnO substrates through simulation work as well as other techniques. The goal of this study is to find the origin of threading dislocations and design strategies to mitigate their negative effects by either reducing their densities or completely eliminating them. In Chapter III, the technique of SMART (stress mapping analysis via ray tracing) is discussed in detail to measure residual stress in packaged silicon circuits. Residual stress plays an important role in the performance and lifetime of single crystal device material. There are two main advantages of SMART compared with other techniques: (a) all six components of the stress tensor can be evaluated; (b) it is non-destructive and leaves no damaging trace on the sample. In this study, our goal is to build a relationship between stress distribution and depth. The concept of penetration depth is critically important here, and its value strongly affects the real-space stress distribution. A new function is applied to obtain better fitting curves. Data in this study are obtained at various penetration depths; each measurement represents an exponentially decaying weighted average of the actual stress, i.e. the measured stress profile is the Laplace transform of the real stress profile. A mathematical procedure is described to determine the real stress profile from the Laplace profile. Experiment
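
    The statement that each measurement represents an exponentially decaying weighted average of the true stress can be written compactly. The notation below is an illustrative formalization of that statement (symbols assumed, not quoted from the thesis):

        \bar{\sigma}(\tau) = \frac{1}{\tau} \int_{0}^{\infty} \sigma(z)\, e^{-z/\tau}\, dz ,

    where \sigma(z) is the real stress at depth z, \tau is the X-ray penetration depth, and \bar{\sigma}(\tau) is the measured stress at that penetration depth; recovering \sigma(z) from the measured profile then amounts to an inverse Laplace transform.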

  20. Spatial epidemiological techniques in cholera mapping and analysis towards a local scale predictive modelling

    NASA Astrophysics Data System (ADS)

    Rasam, A. R. A.; Ghazali, R.; Noor, A. M. M.; Mohd, W. M. N. W.; Hamid, J. R. A.; Bazlan, M. J.; Ahmad, N.

    2014-02-01

    Cholera spatial epidemiology is the study of the spread and control of the disease's spatial pattern and epidemics. Previous studies have shown that multi-factorial causation such as human behaviour, ecology and other infectious risk factors influences the disease outbreaks. Thus, understanding the spatial pattern and the possible interrelated factors of the outbreaks is crucial and warrants in-depth study. This study focuses on the integration of geographical information system (GIS) and epidemiological techniques in an exploratory analysis of the cholera spatial pattern and distribution in the selected district of Sabah. Spatial Statistic and Pattern tools in ArcGIS and Microsoft Excel software were utilized to map and analyze the reported cholera cases and other data used. Meanwhile, a cohort study, an epidemiological technique, was applied to investigate multiple outcomes of the disease exposure. The general spatial pattern of cholera was highly clustered, showing that the disease spreads easily at a place or from person to person, especially within 1500 meters of an infected person or location. Although the cholera outbreaks in the districts are not critical, the disease could be endemic in crowded areas, unhygienic environments, and areas close to contaminated water. It was also strongly believed that the coastal water of the study areas has a possible relationship with cholera transmission and phytoplankton blooms, since those areas recorded higher numbers of cases. GIS proves to be a vital spatial epidemiological technique for determining the distribution pattern of the disease and generating hypotheses about it. The next step of this research will be to apply advanced geo-analysis methods and other disease risk factors to produce a local-scale predictive risk model of the disease in Malaysia.

  1. The derivation of tropospheric column ozone using the TOR approach and mapping technique

    NASA Astrophysics Data System (ADS)

    Yang, Qing

    2007-12-01

    Tropospheric ozone columns (TCOs) derived from differences between the Dutch-Finnish Aura Ozone Monitoring Instrument (OMI) measurements of the total atmospheric ozone column and the Aura Microwave Limb Sounder (MLS) measurements of stratospheric ozone columns are discussed. Because the measurements by these two instruments are not spatially coincident, interpolation techniques, with emphasis on mapping the stratospheric columns in space and time using the relationships between lower stratospheric ozone and potential vorticity (PV) and geopotential heights (Z), are evaluated at mid-latitudes. It is shown that this PV mapping procedure produces somewhat better agreement in comparisons with ozonesonde measurements, particularly in winter, than does simple linear interpolation of the MLS stratospheric columns or the use of typical coincidence criteria at mid-latitudes. The OMI/MLS derived tropospheric columns are calculated to be 4 Dobson units (DU) smaller than the sonde measured columns at mid-latitudes. This mean difference is consistent with the MLS (version 1.5) stratospheric ozone columns being high relative to Stratospheric Aerosol and Gas Experiment (SAGE II) columns by 3 DU. Standard deviations between the derived tropospheric columns and those measured by ozonesondes are 9 DU (30%) annually but they are just 6 DU (15%) in summer. Uncertainties in the interpolated MLS stratospheric columns are likely to be the primary cause of these standard deviations. An important advantage of the PV mapping approach is that it works well when MLS data are missing (e.g., when an orbit of measurements is missing). In the comparisons against ozonesonde measurements, it provides up to twice as many comparisons compared to the other techniques. The OMI/MLS derived tropospheric ozone columns have been compared with corresponding columns based on the Tropospheric Emission Spectrometer (TES) measurements, and Regional chEmical trAnsport Model (REAM) simulations. The variability of
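
    In schematic form, the residual approach described above amounts to subtracting a PV/Z-mapped MLS stratospheric column from the OMI total column at each location and time. The notation below is an assumed summary of that step, not the paper's own equations:

        \Omega_{trop}(x, t) = \Omega_{tot}^{OMI}(x, t) - \Omega_{strat}^{MLS}\big(PV(x, t), Z(x, t)\big),

    where the stratospheric term is evaluated at the OMI footprint from regression relationships between MLS lower-stratospheric ozone and the potential vorticity and geopotential height fields.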

  2. A Riemannian geometric mapping technique for identifying incompressible equivalents to subsonic potential flows

    NASA Astrophysics Data System (ADS)

    German, Brian Joseph

    This research develops a technique for the solution of incompressible equivalents to planar steady subsonic potential flows. Riemannian geometric formalism is used to develop a gauge transformation of the length measure followed by a curvilinear coordinate transformation to map the given subsonic flow into a canonical Laplacian flow with the same boundary conditions. The effect of the transformation is to distort both the immersed profile shape and the domain interior nonuniformly as a function of local flow properties. The method represents the full nonlinear generalization of the classical methods of Prandtl-Glauert and Karman-Tsien. Unlike the classical methods which are "corrections," this method gives exact results in the sense that the inverse mapping produces the subsonic full potential solution over the original airfoil, up to numerical accuracy. The motivation for this research was provided by an observed analogy between linear potential flow and the special theory of relativity that emerges from the invariance of the d'Alembert wave equation under Lorentz transformations. This analogy is well known in an operational sense, being leveraged widely in linear unsteady aerodynamics and acoustics, stemming largely from the work of Kussner. Whereas elements of the special theory can be invoked for compressibility effects that are linear and global in nature, the question posed in this work was whether other mathematical techniques from the realm of relativity theory could be used to similar advantage for effects that are nonlinear and local. This line of thought led to a transformation leveraging Riemannian geometric methods common to the general theory of relativity. A gauge transformation is used to geometrize compressibility through the metric tensor of the underlying space to produce an equivalent incompressible flow that lives not on a plane but on a curved surface. In this sense, forces owing to compressibility can be ascribed to the geometry of space in
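
    For context, the classical corrections that this method generalizes relate the compressible pressure coefficient C_p to its incompressible counterpart C_{p,0} through the freestream Mach number M_\infty. These are the standard textbook forms, quoted for orientation rather than from the thesis:

        C_p = \frac{C_{p,0}}{\sqrt{1 - M_\infty^2}} \quad \text{(Prandtl-Glauert)}, \qquad
        C_p = \frac{C_{p,0}}{\sqrt{1 - M_\infty^2} + \dfrac{M_\infty^2}{1 + \sqrt{1 - M_\infty^2}}\,\dfrac{C_{p,0}}{2}} \quad \text{(Karman-Tsien)}.

    The Riemannian mapping described above replaces these global algebraic corrections with a local, nonlinear transformation of the metric and coordinates.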

  3. Comparison of the Biological Impacts of the Fluoride Compounds by Graphical Risk Visualization Map Technique.

    PubMed

    Usuda, Kan; Kono, Rei; Ueno, Takaaki; Ito, Yuichi; Dote, Tomotaro; Yokoyama, Hirotaka; Kono, Koichi; Tamaki, Junko

    2015-09-01

    Various fluoride compounds are widely used in industry. The present risk assessment study was conducted using a series of inorganic binary fluorides of the type XFn, where Xⁿ⁺ = Na⁺, K⁺, Li⁺, Mg²⁺, Ca²⁺, Sr²⁺, Ba²⁺, Al³⁺, Nd³⁺, La³⁺, Ce³⁺, Sm³⁺, Gd³⁺, Y³⁺, Yb²⁺, and Zn²⁺. The aqueous solutions of these salts were orally administered to 16 experimental groups (one for each of the salts tested). The levels of fluoride and N-acetyl-β-D-glucosaminidase in cumulative 24-h urine samples and creatinine clearance were measured to assess possible acute renal damage. The levels of fluoride, alanine aminotransferase, and aspartate aminotransferase were also determined in serum samples to assess possible acute hepatic damage. The results reveal that sodium fluoride (NaF), potassium fluoride (KF), and zinc fluoride tetrahydrate (ZnF₂·4H₂O) can carry the fluoride ion into the bloodstream and that it is excreted via urine more readily than with the other compounds tested. These fluorides were assigned the highest risk impact factor. Most of the rare earth fluorides are insoluble in water, while those of groups 2 and 13 of the periodic table are slightly soluble, so that they do not pose a significant risk. These findings suggest that the biological impact of fluoride depends on the accompanying counter ion and its solubility. The risk map obtained in the present study shows that the graphical visualization map technique employed is a valuable new tool to assess the toxicological risk of chemical compounds. PMID:25749926

  4. Low Altitude AVIRIS Data for Mapping Land Cover in Yellowstone National Park: Use of Isodata Clustering Techniques

    NASA Technical Reports Server (NTRS)

    Spruce, Joe

    2001-01-01

    Yellowstone National Park (YNP) contains a diversity of land cover. YNP managers need site-specific land cover maps, which may be produced more effectively using high-resolution hyperspectral imagery. ISODATA clustering techniques have aided operational multispectral image classification and may benefit certain hyperspectral data applications if optimally applied. In response, a study was performed for an area in northeast YNP using 11 select bands of low-altitude AVIRIS data calibrated to ground reflectance. These data were subjected to ISODATA clustering and Maximum Likelihood Classification techniques to produce a moderately detailed land cover map. The latter has good apparent overall agreement with field surveys and aerial photo interpretation.
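
    ISODATA can be thought of as k-means augmented with heuristic split and merge passes. The Python sketch below is a bare-bones illustration of that idea with assumed thresholds; it is not the operational classifier used in the study (which also involved a Maximum Likelihood step):

        # Hedged ISODATA-style clustering sketch: k-means iterations plus crude
        # split (large within-cluster spread) and merge (close centroids) passes.
        import numpy as np

        def isodata(X, k_init=6, iters=10, split_std=0.5, merge_dist=0.3, seed=0):
            rng = np.random.default_rng(seed)
            centers = X[rng.choice(len(X), k_init, replace=False)]
            for _ in range(iters):
                labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
                present = np.unique(labels)
                centers = np.array([X[labels == c].mean(0) for c in present])
                new_centers = []
                for idx, mu in enumerate(centers):          # split wide clusters
                    members = X[labels == present[idx]]
                    spread = members.std(0)
                    if spread.max() > split_std and len(members) > 4:
                        shift = np.eye(X.shape[1])[spread.argmax()] * spread.max() / 2
                        new_centers += [mu + shift, mu - shift]
                    else:
                        new_centers.append(mu)
                kept = []                                   # merge close centroids
                for mu in new_centers:
                    if all(np.linalg.norm(mu - v) > merge_dist for v in kept):
                        kept.append(mu)
                centers = np.array(kept)
            return centers

        # 11 spectral bands, as in the AVIRIS subset used above:
        print(len(isodata(np.random.rand(500, 11))))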

  5. Can Social Semantic Web Techniques Foster Collaborative Curriculum Mapping In Medicine?

    PubMed Central

    Finsterer, Sonja; Cremer, Jan; Schenkat, Hennig

    2013-01-01

    Background Curriculum mapping, which is aimed at the systematic realignment of the planned, taught, and learned curriculum, is considered a challenging and ongoing effort in medical education. Second-generation curriculum managing systems foster knowledge management processes including curriculum mapping in order to give comprehensive support to learners, teachers, and administrators. The large quantity of custom-built software in this field indicates a shortcoming of available IT tools and standards. Objective The project reported here aims at the systematic adoption of techniques and standards of the Social Semantic Web to implement collaborative curriculum mapping for a complete medical model curriculum. Methods A semantic MediaWiki (SMW)-based Web application has been introduced as a platform for the elicitation and revision process of the Aachen Catalogue of Learning Objectives (ACLO). The semantic wiki uses a domain model of the curricular context and offers structured (form-based) data entry, multiple views, structured querying, semantic indexing, and commenting for learning objectives (“LOs”). Semantic indexing of learning objectives relies on both a controlled vocabulary of international medical classifications (ICD, MeSH) and a folksonomy maintained by the users. An additional module supporting the global checking of consistency complements the semantic wiki. Statements of the Object Constraint Language define the consistency criteria. We evaluated the application by a scenario-based formative usability study, where the participants solved tasks in the (fictional) context of 7 typical situations and answered a questionnaire containing Likert-scaled items and free-text questions. Results At present, ACLO contains roughly 5350 operational (ie, specific and measurable) objectives acquired during the last 25 months. The wiki-based user interface uses 13 online forms for data entry and 4 online forms for flexible searches of LOs, and all the forms are

  6. Comparing and Reconciling Traditional Field and Photogeologic Mapping Techniques: Lessons from the San Francisco Volcanic Field, Arizona

    NASA Technical Reports Server (NTRS)

    Skinner, J. A., Jr.; Eppler, D. B.; Bleacher, J. E.; Evans, C. A.; Feng, W.; Gruener, J.; Hurwitz, D. M.; Janoiko, B.; Whitson, P.

    2014-01-01

    Cartographic products and - specifically - geologic maps provide critical assistance for establishing physical and temporal frameworks of planetary surfaces. The technical methods that result in the creation of geologic maps vary depending on how observations are made as well as the overall intent of the final products [1-3]. These methods tend to follow a common linear work flow, including the identification and delineation of spatially and temporally discrete materials (units), the documentation of their primary (emplacement) and secondary (erosional) characteristics, analysis of the relative and absolute age relationships between these materials, and the collation of observations and interpretations into an objective map product. The "objectivity" of a map is critical for cross comparison with overlapping maps and topical studies, as well as for its relevance to scientific posterity. However, the "accuracy" and "correctness" of a geologic map are very much subject to debate. This can be evidenced by comparison of existing geologic maps at various scales, particularly those compiled through field- and remote-based mapping efforts. Our study focuses on comparing the fidelity of (1) "Apollo-style" geologic investigations, where typically non-geologist crew members follow static traverse routes established through pre-mission planning, and (2) "traditional" field-based investigations, where geologists are given free rein to observe without preplanned routes. This abstract summarizes the regional geology wherein our study was conducted, presents the geologic map created from traditional field mapping techniques, and offers basic insights into how geologic maps created from different tactics can be reconciled in support of exploratory missions. Additional abstracts [4-6] from this study discuss various exploration and science results of these efforts.

  7. Studying AGN Feedback with Galactic Outflows in Luminous Obscured Quasar

    NASA Astrophysics Data System (ADS)

    Sun, Ai-Lei

    2016-01-01

    Feedback from active galactic nuclei (AGN) has been proposed as an important quenching mechanism to suppress star formation in massive galaxies. We investigate the most direct form of AGN feedback - galactic outflows - in the most luminous obscured AGN (L>10^45 erg/s) from the SDSS sample in the nearby universe (z<0.2). Using ALMA and Magellan observations to target molecular and ionized outflows, we find that luminous AGN can impact the dynamics and phase of the galactic medium, and confirm the complex multi-phase and multi-scaled nature of the feedback phenomenon. In particular, we found that most of these luminous AGN host ionized outflows. The outflow size, velocity, and energetics correlate with the AGN luminosity, and the outflows can be very extended (r > 10 kpc) and fast (v > 1000 km/s) for the most luminous objects. I end by presenting a new technique to find extended ionized outflows using broadband imaging surveys, and to characterize their occurrence rate, morphology, size distribution, and their dependence on the AGN luminosity. This technique will open a new window for feedback studies in the era of large-scale optical imaging surveys, e.g., HSC and then LSST.

  8. Classification of a set of vectors using self-organizing map- and rule-based technique

    NASA Astrophysics Data System (ADS)

    Ae, Tadashi; Okaniwa, Kaishirou; Nosaka, Kenzaburou

    2005-02-01

    There exist various objects, such as pictures, music, texts, etc., in our environment. We form a view of these objects by looking, reading or listening. Our view is deeply connected with our behaviors and is very important for understanding them. We form a view of an object and decide the next action (data selection, etc.) based on that view. Such a series of actions constructs a sequence. Therefore, we propose a method which acquires a view as a vector built from several words describing that view, and apply the vector to sequence generation. We focus on sequences of data that a user selects from a multimedia database containing pictures, music, movies, etc. These data cannot be stereotyped because each user's view of them is different. Therefore, we represent the structure of the multimedia database by the vector representing the user's view together with a stereotyped vector, and acquire sequences containing this structure as elements. Such vectors can be classified by a SOM (Self-Organizing Map). The Hidden Markov Model (HMM) is a method to generate sequences. Therefore, we use an HMM in which each state corresponds to a representative vector of the user's view, and acquire sequences that capture the change of the user's view. We call it the Vector-state Markov Model (VMM). We introduce rough set theory as a rule-based technique, which plays the role of classifying sets of data, such as the set "Tour".
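
    A minimal self-organizing map sketch in Python makes the "view vector to map node" idea concrete; the grid size, learning schedule, and the suggestion of using each trained node as a candidate VMM state are illustrative assumptions, not the authors' implementation:

        # Hedged sketch of a small self-organizing map (SOM) that clusters user "view"
        # vectors; each trained node could then serve as a representative state for a
        # Vector-state Markov Model.  Sizes and schedules are illustrative assumptions.
        import numpy as np

        def train_som(data, grid=(6, 6), epochs=200, lr0=0.5, sigma0=2.0, seed=0):
            rng = np.random.default_rng(seed)
            weights = rng.random((grid[0], grid[1], data.shape[1]))
            gy, gx = np.mgrid[0:grid[0], 0:grid[1]]
            for t in range(epochs):
                lr = lr0 * np.exp(-t / epochs)
                sigma = sigma0 * np.exp(-t / epochs)
                for x in rng.permutation(data):
                    d = ((weights - x) ** 2).sum(-1)
                    by, bx = np.unravel_index(d.argmin(), d.shape)   # best matching unit
                    h = np.exp(-((gy - by) ** 2 + (gx - bx) ** 2) / (2 * sigma ** 2))
                    weights += lr * h[..., None] * (x - weights)     # pull neighbourhood
            return weights

        # Map each view vector to its winning node (a discrete "state" index):
        views = np.random.rand(100, 8)
        som = train_som(views)
        states = [np.unravel_index(((som - v) ** 2).sum(-1).argmin(), som.shape[:2])
                  for v in views]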

  9. Coal fire mapping of East Basuria Colliery, Jharia coalfield using vertical derivative technique of magnetic data

    NASA Astrophysics Data System (ADS)

    Pal, S. K.; Vaish, Jitendra; Kumar, Sahadev; Bharti, Abhay Kumar

    2016-02-01

    The present study deals with the coal fire mapping of East Basuria Colliery, Jharia coalfield, India, using the magnetic method. It is based on the fact that a rise in temperature results in significant changes in the magnetic susceptibility and thermo-remanent magnetization (TRM) of the overlying rocks. Magnetism increases slowly with the rise of temperature until the Curie temperature is reached. Generally, rock/overburden loses magnetization and becomes paramagnetic when heated to the Curie temperature, which results in a significant reduction in magnetic susceptibility. However, magnetism increases significantly after cooling below the Curie temperature. Several data processing methods such as diurnal correction, reduction to pole (RTP), and first and second vertical derivatives have been used for analysis of the magnetic data and its interpretation. It is observed that the total magnetic field intensity anomaly of the area varies approximately from 44850 to 47460 nT and the residual magnetic anomaly varies approximately from -1323 to 1253 nT. The range of the magnetic anomaly after RTP is approximately 1050-1450 nT. About 20 low magnetic anomaly zones associated with active coal fire regions and 11 high magnetic anomaly zones associated with non-coal fire regions have been identified using vertical derivative techniques.
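
    The first vertical derivative used to sharpen such anomalies is conventionally computed in the wavenumber domain by multiplying the field's spectrum by the radial wavenumber. The Python sketch below follows that standard recipe; the grid spacing and array names are assumptions:

        # Hedged sketch: first vertical derivative of a gridded magnetic anomaly via FFT.
        # For a potential field, F[dT/dz] = |k| * F[T], with |k| the radial wavenumber.
        import numpy as np

        def first_vertical_derivative(grid, dx=25.0, dy=25.0):
            ny, nx = grid.shape
            kx = 2 * np.pi * np.fft.fftfreq(nx, d=dx)
            ky = 2 * np.pi * np.fft.fftfreq(ny, d=dy)
            k = np.sqrt(kx[None, :] ** 2 + ky[:, None] ** 2)     # radial wavenumber
            return np.real(np.fft.ifft2(np.fft.fft2(grid) * k))

        # Example on a synthetic residual anomaly grid (nT), 25 m cell size:
        anomaly = np.random.randn(128, 128)
        fvd = first_vertical_derivative(anomaly)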

  10. Mapping of explosive contamination using GC/chemiluminescence and ion mobility spectrometry techniques

    NASA Astrophysics Data System (ADS)

    Miller, Carla J.; Glenn, D. F.; Hartenstein, Steven D.; Hallowell, Susan F.

    1998-12-01

    Recent efforts at the Idaho National Engineering and Environmental Laboratory (INEEL) have included mapping explosive contamination resulting from manufacturing and carrying improvised explosive devices (IEDs). Two types of trace detection equipment were used to determine levels of contamination from designated sampling areas. A total of twenty IEDs were constructed: ten using TNT and ten using C-4. Two test scenarios were used. The first scenario tracked the activities of a manufacturer who straps the device onto an independent courier. The courier then performed a series of activities to simulate waiting in an airport. The second scenario tracked the activities of a manufacturer who also served as the courier. A sample set for each test consisted of thirty samples from various locations on each IED manufacturer, thirty from each IED courier, twenty-five from the manufacturing area, and twenty-five from the courier area. Pre-samples and post-samples were collected for analysis with each detection technique. Samples analyzed by gc/chemiluminescence were taken by swiping a teflon- coated sampling swipe across the surface of the sampling area to pick up any explosive particles. Samples analyzed by ion mobility spectrometry (IMS) were taken from the clothing of the manufacturer and courier by vacuuming the surface and collecting particulates on a fiberglass filter. Samples for IMS analysis from the manufacturing and courier rooms were taken by wiping a cotton sampling swipe across the surface area. Currently, building IEDs and monitoring the explosive contamination is being directed toward detection with portal monitors.

  11. Real-time PCR mapping of DNaseI-hypersensitive sites using a novel ligation-mediated amplification technique

    PubMed Central

    Follows, George A.; Janes, Mary E.; Vallier, Ludovic; Green, Anthony R.; Gottgens, Berthold

    2007-01-01

    Mapping sites within the genome that are hypersensitive to digestion with DNaseI is an important method for identifying DNA elements that regulate transcription. The standard approach to locating these DNaseI-hypersensitive sites (DHSs) has been to use Southern blotting techniques, although we, and others, have recently published alternative methods using a range of technologies including high-throughput sequencing and genomic array tiling paths. In this article, we describe a novel protocol to use real-time PCR to map DHS. Advantages of the technique reported here include the small cell numbers required for each analysis, rapid, relatively low-cost experiments with minimal need for specialist equipment. Presented examples include comparative DHS mapping of known TAL1/SCL regulatory elements between human embryonic stem cells and K562 cells. PMID:17389645

  12. Unveiling the hearts of luminous and ultra-luminous infrared galaxy mergers with laser guide star adaptive optics

    NASA Astrophysics Data System (ADS)

    Medling, Anne M.

    2013-03-01

    Gas-rich galaxies across cosmic time exhibit one or both of two phenomena: ongoing star formation and an active galactic nucleus indicating current black hole accretion. These two processes are important mechanisms through which galaxies evolve and grow, but their effects are difficult to disentangle. Both will use up some available gas, and both are capable of producing winds strong enough to eject remaining gas from the galaxy. One must look at high spatial resolutions in order to separate the dynamical effects of star formation going on near the nucleus of a galaxy from the black hole growth going on in the nucleus. We present high spatial resolution integral field spectroscopy of fifteen nearby luminous and ultra-luminous infrared galaxies. These systems are extremely bright in the infrared exactly because they host powerful starbursts and active nuclei, which in turn heat the surrounding dust. Our data provide resolved stellar and gaseous kinematics of the central kiloparsec of each of these systems by removing atmospheric blurring with adaptive optics, an observing technique that measures the turbulence in the Earth's atmosphere and then uses a deformable mirror to correct the resulting distortions. Our kinematic maps reveal nuclear disks of gas and stars with radii of ~ a few hundred parsecs surrounding the central black holes. Because the stellar and gas kinematics match well, we conclude that the stars are forming in situ from the gas in the disks. These disks may be the progenitors of kinematically decoupled cores seen in many isolated elliptical galaxies, and may have a significant effect on the merger rate of binary black holes. Additionally, these disks may be used to measure black hole masses which, when combined with host galaxy properties and placed on scaling relations, indicate that black holes grow as or more quickly than their host galaxies during a merger. This suggests that a sudden burst of black hole growth in the final stages of the merger

  13. The Effect of Using the Story-Mapping Technique on Developing Tenth Grade Students' Short Story Writing Skills in EFL

    ERIC Educational Resources Information Center

    Ibnian, Salem Saleh Khalaf

    2010-01-01

    The current study aimed at investigating the effect of using the story-mapping technique on developing tenth grade students' short story writing skills in EFL. The study attempted to answer the following questions: (1) What are the short story writing skills needed for tenth grade students in EFL?; and (2) What is the effect of using the…

  14. Notes for the improvement of a remote sensing multispectral data non-supervised classification and mapping technique

    NASA Technical Reports Server (NTRS)

    Dalton, C. C.

    1973-01-01

    Examined are: (1) the sequential clustering technique for the unsupervised automatic classification and mapping of earth resources satellite data, (2) theoretical analysis of the tests which were used, and (3) derivation of an alternative set of tests and their necessary algorithm.

  15. Observing Equatorial Thermospheric Winds and Temperatures with a New Mapping Technique

    NASA Astrophysics Data System (ADS)

    Faivre, M. W.; Meriwether, J. W.; Sherwood, P.; Veliz, O.

    2005-12-01

    Application of the Fabry-Perot interferometer (FPI) at Arequipa, Peru (16.4S, 71.4 W) to measure the Doppler shifts and Doppler broadenings in the equatorial O(1D) 630-nm nightglow has resulted in numerous detections of a large-scale thermospheric phenomenon called the Midnight Temperature Maximum (MTM). A recent detector upgrade with a CCD camera has improved the accuracy of these measurements by a factor of 5. Temperature increases of 50 to 150K have been measured during nights in April and July, 2005, with error bars less than 10K after averaging in all directions. Moreover, the meridional wind measurements show evidence for a flow reversal from equatorward to poleward near local midnight for such events. A new observing strategy based upon the pioneering work of Burnside et al.[1981] maps the equatorial wind and temperature fields by observing in eight equally-spaced azimuth directions, each with a zenith angle of 60 degrees. Analysis of the data obtained with this technique gives the mean wind velocities in the meridional and zonal directions as well as the horizontal gradients of the wind field for these directions. Significant horizontal wind gradients are found for the meridional direction but not for the zonal direction. The zonal wind blows eastward throughout the night with a maximum speed of ~150 m/s near the middle of the night and then decreases towards zero just before dawn. In general, the fastest poleward meridional wind is observed near mid-evening. By the end of the night, the meridional flow tends to be more equatorward at speeds of about 50 m/s. Using the assumption that local time and longitude are equivalent over a period of 30 minutes, a map of the horizontal wind field vector field is constructed over a range of 12 degrees latitude centered at 16.5 S. Comparison between MTM nights and quiet nights (no MTM) revealed significant differences in the horizontal wind fields. Using the method of Fourier decomposition of the line-of-sight winds

  16. Evaluation of EREP techniques for geological mapping. [southern Pyrenees and Ebro basin in Spain

    NASA Technical Reports Server (NTRS)

    Vandermeermohr, H. E. C.; Srivastava, G. S. (Principal Investigator)

    1975-01-01

    The author has identified the following significant results. Skylab photographs may be successfully utilized for preparing a reconnaissance geological map in the areas where no maps or semi-detailed maps exist. Large coverage of area and regional perspective from Skylab photographs can help better coordination in regional mapping. It is possible to delineate major structural trends and other features like mega-lineaments, geofractures, and faults, which have evaded their detection by conventional methods. The photointerpretability is better in areas dominated by sedimentary rocks. Rock units of smaller extent and having poor geomorphic expressions are difficult to map. Demarcation of quaternary river alluvium can be made with better precision and ease with the Skylab photographs. Stereoscopic viewing greatly helps in interpretation of area structures. Skylab photographs are not good for preparing geological maps larger than 1:270,000 scale.

  17. Soil zymography - A novel technique for mapping enzyme activity in the rhizosphere

    NASA Astrophysics Data System (ADS)

    Spohn, Marie

    2014-05-01

    The effect of plant roots on microbial activity in soil at the millimeter scale is poorly understood. One reason for this is that spatially explicit methods for the study of microbial activity in soil are limited. Here we present a quantitative in situ technique for mapping the distribution of exoenzymes in soil, along with some results about the effects of roots on exoenzyme activity in soil. In the first study we showed that both acid and alkaline phosphatase activity were up to 5.4-times larger in the rhizosphere of Lupinus albus than in the bulk soil. While acid phosphatase activity (produced by roots and microorganisms) was closely associated with roots, alkaline phosphatase activity (produced only by microorganisms) was more widely distributed, leading to a 2.5-times larger area of activity of alkaline than of acid phosphatase. These results indicate a spatial differentiation of different ecophysiological groups of organic phosphorus mineralizing organisms in the rhizosphere, which might alleviate a potential competition for phosphorus between them. In a second study cellulase, chitinase and phosphatase activities were analyzed in the presence of living Lupinus polyphyllus roots and dead/dying roots (in the same soils 10, 20 and 30 days after cutting the L. polyphyllus shoots). The activity of all three enzymes was 9.0 to 13.9-times higher at the living roots compared to the bulk soil. Microhotspots of cellulase, chitinase and phosphatase activity in the soil were found up to 60 mm away from the living roots. Ten days after shoot cutting, the areas of high cellulase and phosphatase activity extended up to 55 mm away from the nearest root, while the extension of the area of chitinase activity did not change significantly. At the root, cellulase and chitinase activity increased first at the root tips after shoot cutting and showed maximal activity 20 days after shoot cutting. The number and activity of microhotspots of chitinase activity was maximal 10

  18. Side-scan sonar mapping: Pseudo-real-time processing and mosaicking techniques

    SciTech Connect

    Danforth, W.W.; Schwab, W.C.; O'Brien, T.F.; Karl, H.

    1990-05-01

    The US Geological Survey (USGS) surveyed 1,000 km² of the continental shelf off San Francisco during a 17-day cruise, using a 120-kHz side-scan sonar system, and produced a digitally processed sonar mosaic of the survey area. The data were processed and mosaicked in real time using software developed at the Lamont-Doherty Geological Observatory and modified by the USGS, a substantial task due to the enormous amount of data produced by high-resolution side-scan systems. Approximately 33 megabytes of data were acquired every 1.5 hr. The real-time sonar images were displayed on a PC-based workstation and the data were transferred to a UNIX minicomputer where the sonar images were slant-range corrected, enhanced using an averaging method of desampling and a linear-contrast stretch, merged with navigation, geographically oriented at a user-selected scale, and finally output to a thermal printer. The hard-copy output was then used to construct a mosaic of the survey area. The final product of this technique is a UTM-projected map-mosaic of sea-floor backscatter variations, which could be used, for example, to locate appropriate sites for sediment sampling to ground truth the sonar imagery while still at sea. More importantly, reconnaissance surveys of this type allow for the analysis and interpretation of the mosaic during a cruise, thus greatly reducing the preparation time needed for planning follow-up studies of a particular area.

  19. Enhancing Local Climate Projections of Precipitation: Assets and Limitations of Quantile Mapping Techniques for Statistical Downscaling

    NASA Astrophysics Data System (ADS)

    Ivanov, Martin; Kotlarski, Sven; Schär, Christoph

    2015-04-01

    The Swiss CH2011 scenarios provide a portfolio of climate change scenarios for the region of Switzerland, specifically tailored for use in climate impact research. Although widely applied by a variety of end-users, these scenarios are subject to several limitations related to the underlying delta change methodology. Examples are difficulties to appropriately account for changes in the spatio-temporal variability of meteorological fields and for changes in extreme events. The recently launched ELAPSE project (Enhancing local and regional climate change projections for Switzerland) is connected to the EU COST Action VALUE (www.value-cost.eu) and aims at complementing CH2011 by further scenario products, including a bias-corrected version of daily scenarios at the site scale. For this purpose the well-established empirical quantile mapping (QM) methodology is employed. Here, daily temperature and precipitation output of 15 GCM-RCM model chains of the ENSEMBLES project is downscaled and bias-corrected to match observations at weather stations in Switzerland. We consider established QM techniques based on all empirical quantiles or linear interpolation between the empirical percentiles. In an attempt to improve the downscaling of extreme precipitation events, we also apply a parametric approximation of the daily precipitation distribution by a dynamically weighted mixture of a Gamma distribution for the bulk and a Pareto distribution for the right tail for the first time in the context of QM. All techniques are evaluated and intercompared in a cross-validation framework. The statistical downscaling substantially improves virtually all considered distributional and temporal characteristics as well as their spatial distribution. The empirical methods have in general very similar performances. The parametric method does not show an improvement over the empirical ones. Critical sites and seasons are highlighted and discussed. Special emphasis is placed on investigating the
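
    The empirical quantile-mapping step itself can be summarized in a few lines: each simulated value is corrected by locating its quantile in the model's calibration-period distribution and replacing it with the observed value at the same quantile. The Python sketch below (percentile grid, linear interpolation, synthetic gamma-distributed data) is a generic illustration rather than the ELAPSE implementation:

        # Hedged sketch of empirical quantile mapping (QM) bias correction: map each
        # simulated value to the observed value with the same empirical quantile,
        # interpolating linearly between percentiles.
        import numpy as np

        def quantile_map(sim_future, sim_calib, obs_calib, n_quantiles=99):
            p = np.linspace(1, 99, n_quantiles)
            sim_q = np.percentile(sim_calib, p)        # model percentiles (calibration)
            obs_q = np.percentile(obs_calib, p)        # observed percentiles (calibration)
            # Values outside the calibrated range are clamped to the outermost quantiles.
            return np.interp(sim_future, sim_q, obs_q)

        # Example: correct a biased, too-dry precipitation series toward station data.
        rng = np.random.default_rng(0)
        obs = rng.gamma(2.0, 4.0, 5000)               # "observed" daily precipitation
        sim = rng.gamma(2.0, 3.0, 5000)               # biased model counterpart
        print(quantile_map(sim[:5], sim, obs))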

  20. Properties of unusually luminous supernovae

    NASA Astrophysics Data System (ADS)

    Pan, Tony Shih Arng

    This thesis is a theoretical study of the progenitors, event rates, and observational properties of unusually luminous supernovae (SNe), and aims to identify promising directions for future observations. In Chapter 2, we present model light curves and spectra of pair-instability supernovae (PISNe) over a range of progenitor masses and envelope structures for Pop III stars. We calculate the rates and detectability of PISNe, core-collapse supernovae (CCSNe), and Type Ia SNe at the Epoch of Reionization with the James Webb Space Telescope (JWST), which can be used to determine the contribution of Pop III versus Pop II stars toward ionizing the universe. Although CCSNe are the least intrinsically luminous supernovae, Chapter 5 shows that a JWST survey targeting known galaxy clusters with Einstein radii > 35" should discover gravitationally lensed CCSNe at redshifts exceeding z = 7--8. In Chapter 3, we explain how the Pop II/I progenitors of observed PISNe in the local universe can be created via mergers in runaway collisions in young, dense star clusters, despite copious mass loss via line-driven winds. The PISN rate from this mechanism is consistent with the observed volumetric rate, and the Large Synoptic Survey Telescope could discover ~10² such PISNe per year. In Chapter 4, we identify 10 star clusters which may host PISN progenitors with masses up to 600 solar masses formed via runaway collisions. We estimate the probabilities of these very massive stars being in eclipsing binaries to be ≳ 30%, and find that their transits can be detected even under the contamination of the background cluster light, due to mean transit depths of ~10⁶ solar luminosities. In Chapter 6, we show that there could be X-ray analogues of optically super-luminous SNe that are powered by the conversion of the kinetic energy of SN ejecta into radiation upon its collision with a dense but optically thin circumstellar shell. We find shell configurations that can convert a large fraction of the SN

  1. Applications of remote sensing techniques to county land use and flood hazard mapping

    NASA Technical Reports Server (NTRS)

    Clark, R. B.; Conn, J. S.; Miller, D. A.; Mouat, D. A.

    1975-01-01

    The application of remote sensing in Arizona is discussed. Land use and flood hazard mapping completed by the Applied Remote Sensing Program is described. Areas subject to periodic flood inundation are delineated and land use maps monitoring the growth within specific counties are provided.

  2. The Art of Teaching Map and Compass: Instructional Techniques, Curricular Formats and Practical Field Exercises.

    ERIC Educational Resources Information Center

    Watters, Ron

    This paper discusses the value of teaching map and compass skills in the classroom or an outdoor situation. Navigation is the most basic of all outdoor skills. A map and compass curriculum can be taught to anyone, is inexpensive, and is easily incorporated in a variety of educational situations. General teaching principles are outlined: (1) start…

  3. Investigating the Use of ICT-Based Concept Mapping Techniques on Creativity in Literacy Tasks

    ERIC Educational Resources Information Center

    Riley, Nigel R.; Ahlberg, Mauri

    2004-01-01

    The key research question in this small-scale study focuses on the effects that an ICT (information and communications technologies)-based concept mapping intervention has on creativity and writing achievement in 10-11-year-old primary age pupils. The data shows that pupils using a concept mapping intervention significantly improve their NFER…

  4. Benthic habitat mapping: A review of progress towards improved understanding of the spatial ecology of the seafloor using acoustic techniques

    NASA Astrophysics Data System (ADS)

    Brown, Craig J.; Smith, Stephen J.; Lawton, Peter; Anderson, John T.

    2011-05-01

    This review examines the various strategies and methods used to produce benthic habitat maps using acoustic remote sensing techniques, coupled with in situ sampling. The applications of three acoustic survey techniques are examined in detail: single-beam acoustic ground discrimination systems, sidescan sonar systems, and multi-beam echo sounders. Over the past decade we have witnessed the nascence of the field of benthic habitat mapping and, on the evidence of the literature reviewed in this paper, have seen a rapid evolution in the level of sophistication in our ability to image and thus map seafloor habitats. As acoustic survey tools have become ever more complex, new methods have been tested to segment, classify and combine these data with biological ground truth sample data. Although the specific methods used to derive habitat maps vary considerably, the review indicates that studies can generally be categorized into one of three over-arching strategies; 1) Abiotic surrogate mapping; 2) Assemble first, predict later (unsupervised classification); 3) Predict first, assemble later (supervised classification). Whilst there is still no widely accepted agreement on the best way to produce benthic habitat maps, all three strategies provide valuable map resources to support management objectives. Whilst there is still considerable work to be done before we can answer many of the outstanding technological, methodological, ecological and theoretical questions that have been raised here, the review concludes that the advent of spatial ecological studies founded on high-resolution environmental data sets will undoubtedly help us to examine patterns in community and species distributions. This is a vital first step in unraveling ecological complexities and thus providing improved spatial information for management of marine systems.

  5. A regional ionospheric TEC mapping technique over China and adjacent areas on the basis of data assimilation

    NASA Astrophysics Data System (ADS)

    Aa, Ercha; Huang, Wengeng; Yu, Shimei; Liu, Siqing; Shi, Liqin; Gong, Jiancun; Chen, Yanhong; Shen, Hua

    2015-06-01

    In this paper, a regional total electron content (TEC) mapping technique over China and adjacent areas (70°E-140°E and 15°N-55°N) is developed on the basis of a Kalman filter data assimilation scheme driven by Global Navigation Satellite Systems (GNSS) data from the Crustal Movement Observation Network of China and International GNSS Service. The regional TEC maps can be generated accordingly with the spatial and temporal resolution being 1°×1° and 5 min, respectively. The accuracy and quality of the TEC mapping technique have been validated through the comparison with GNSS observations, the International Reference Ionosphere model values, the global ionosphere maps from Center for Orbit Determination of Europe, and the Massachusetts Institute of Technology Automated Processing of GPS TEC data from Madrigal database. The verification results indicate that great systematic improvements can be obtained when data are assimilated into the background model, which demonstrates the effectiveness of this technique in providing accurate regional specification of the ionospheric TEC over China and adjacent areas.
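
    At its core, the assimilation step is a Kalman analysis that blends a background TEC map with GNSS-derived TEC observations. The toy Python update below uses an identity observation operator on selected grid cells and assumed error covariances purely to illustrate the mechanics; it is not the paper's actual observation operator or error model:

        # Hedged sketch of a single Kalman-filter analysis step for a gridded TEC map.
        # x_b: background TEC (flattened grid), y: TEC observations at known grid cells.
        import numpy as np

        def kalman_tec_update(x_b, P_b, y, obs_cells, obs_var=1.0):
            n = x_b.size
            H = np.zeros((len(obs_cells), n))
            H[np.arange(len(obs_cells)), obs_cells] = 1.0        # observe selected cells
            R = obs_var * np.eye(len(obs_cells))                  # observation error cov.
            K = P_b @ H.T @ np.linalg.inv(H @ P_b @ H.T + R)      # Kalman gain
            x_a = x_b + K @ (y - H @ x_b)                         # analysis TEC
            P_a = (np.eye(n) - K @ H) @ P_b                       # analysis covariance
            return x_a, P_a

        # Toy 10x10 grid with a 20 TECU background and three GNSS-derived observations:
        x_b = np.full(100, 20.0)
        x_a, _ = kalman_tec_update(x_b, 4.0 * np.eye(100),
                                   np.array([25.0, 18.0, 22.0]), obs_cells=[12, 47, 88])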

  6. Groundwater vulnerability and risk mapping in a geologically complex area by using stable isotopes, remote sensing and GIS techniques

    NASA Astrophysics Data System (ADS)

    Dimitriou, Elias; Zacharias, Ierotheos

    2006-11-01

    Groundwater vulnerability and risk mapping is a relatively new scientific approach for facilitating planning and decision making processes in order to protect this valuable resource. A pan-European methodology for aquifer vulnerability has recently been developed by assessing all the existing relevant techniques, with emphasis on karstic environments. In the particular study, state-of-the-art methods and tools such as remote sensing, isotopic investigations and GIS have been implemented to map the groundwater vulnerability and pollution risk in a geologically complex area of W. Greece. The updated land use map has been developed from a Landsat 7+TM image elaborated with image analysis software, while the detailed hydrogeologic properties of the area have been recorded with an intensive isotopic study. The local groundwater vulnerability map has been produced following the aforementioned pan-European method in a GIS environment, while the risk map, which was the final product of the study, has been developed by combining the vulnerability and the land use maps. The results indicated that the areas comprised of highly tectonized calcareous formations represented high vulnerability and risk zones, while forested areas away from the karstic aquifer illustrated moderate to low vulnerability. Moreover, human activities increase the pollution risk in lowland areas consisting of sedimentary deposits that have been classified as moderate vulnerability. The particular methodology operated efficiently in this study and, due to its accuracy and relatively easy implementation, can be used as a decision support tool for local authorities.

  7. Facilitating the exploitation of ERTS-1 imagery using snow enhancement techniques. [geological fault maps of Massachusetts and Connecticut

    NASA Technical Reports Server (NTRS)

    Wobber, F. J. (Principal Investigator); Martin, K. R.; Amato, R. V.; Leshendok, T.

    1973-01-01

    The author has identified the following significant results. The applications of ERTS-1 imagery for geological fracture mapping regardless of season have been repeatedly confirmed. The enhancement provided by a differential cover of snow increases the number and length of fracture-lineaments which can be detected with ERTS-1 data and accelerates the fracture mapping process for a variety of practical applications. The geological mapping benefits of the program will be realized in geographic areas where data are most needed - complex glaciated terrain and areas of deep residual soils. ERTS-1 derived fracture-lineament maps which provide detail well in excess of existing geological maps are not available in the Massachusetts-Connecticut area. The large quantity of new data provided by ERTS-1 may accelerate and improve field mapping now in progress in the area. Numerous other user groups have requested data on the techniques. This represents a major change in operating philosophy for groups who to date had judged that snow obscured geological detail.

  8. Apparent speed increases at low luminance

    PubMed Central

    Vaziri-Pashkam, Maryam; Cavanagh, Patrick

    2009-01-01

    To investigate the effect of luminance on apparent speed, subjects adjusted the speed of a low-luminance rotating grating (0.31 cd/m²) to match that of a high-luminance one (1260 cd/m²). Above 4 Hz, subjects overestimated the speed of the low-luminance grating. This overestimation increased as a function of temporal rate and reached 30% around 10 Hz. The speed overestimation became significant once the lower luminance was 2.4 log units below the high-luminance comparison. Next, the role of motion smear in speed overestimation was examined. First, it was shown that the length of the perceived motion smear increased at low luminances. Second, the length of the visible smear was manipulated by changing the presentation time of the stimuli: speed overestimation was reduced at shorter presentation times. Third, the speed of a blurred stimulus was compared to that of a stimulus with sharp edges, and the blurred stimulus was judged to move faster. These results indicate that the length of the motion smear following a target contributes to its perceived speed and that this leads to speed overestimation at low luminance, where motion traces lengthen because of increased persistence. PMID:19146275

  9. Comparison of manually produced and automated cross country movement maps using digital image processing techniques

    NASA Technical Reports Server (NTRS)

    Wynn, L. K.

    1985-01-01

    The Image-Based Information System (IBIS) was used to automate the cross country movement (CCM) mapping model developed by the Defense Mapping Agency (DMA). Existing terrain factor overlays and a CCM map, produced by DMA for the Fort Lewis, Washington area, were digitized and reformatted into geometrically registered images. Terrain factor data from Slope, Soils, and Vegetation overlays were entered into IBIS and combined using IBIS-programmed equations to implement the DMA CCM model. The resulting IBIS-generated CCM map was then compared with the digitized manually produced map to test similarity. The number of pixels comprising each CCM region was compared between the two map images, and the percent agreement between each pair of regional counts was computed. The mean percent agreement equalled 86.21%, with an areally weighted standard deviation of 11.11%. Calculation of Pearson's correlation coefficient yielded +0.997. In some cases, the IBIS-calculated map code differed from the DMA codes; analysis revealed that IBIS had calculated the codes correctly. These highly positive results demonstrate the power and accuracy of IBIS in automating models which synthesize a variety of thematic geographic data.
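
    The comparison above reduces to counting pixels per CCM region in two classified rasters, computing a per-region percent agreement, and correlating the regional counts. A minimal sketch of that bookkeeping follows; the per-region agreement definition (smaller count over larger count) and the toy rasters are assumptions, since the abstract does not spell out the exact formula.

```python
import numpy as np

def compare_ccm_maps(map_a, map_b):
    """Compare two classified CCM rasters (integer region codes): per-region
    pixel counts, a per-region percent agreement, and Pearson's correlation
    between the two sets of counts. Illustrative only."""
    regions = np.union1d(np.unique(map_a), np.unique(map_b))
    counts_a = np.array([(map_a == r).sum() for r in regions], float)
    counts_b = np.array([(map_b == r).sum() for r in regions], float)
    # Assumed agreement definition: smaller regional count over larger count
    agreement = 100.0 * np.minimum(counts_a, counts_b) / np.maximum(counts_a, counts_b)
    r = np.corrcoef(counts_a, counts_b)[0, 1]   # Pearson correlation of counts
    return agreement.mean(), r

# Toy stand-ins for the digitized manual and IBIS-generated maps
rng = np.random.default_rng(0)
manual = rng.integers(1, 5, size=(100, 100))
automated = np.where(rng.random((100, 100)) < 0.9,
                     manual, rng.integers(1, 5, size=(100, 100)))
print(compare_ccm_maps(manual, automated))
```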

  10. Techniques and applications of SAR interferometry for ERS-1: Topographic mapping, change detection, and slope measurement

    NASA Technical Reports Server (NTRS)

    Werner, Charles L.; Hensley, Scott; Goldstein, Richard M.; Rosen, Paul A.; Zebker, Howard A.

    1993-01-01

    SAR (Synthetic Aperture Radar) interferometry allows the measurement of high resolution topography of terrain from repeat orbit SAR data sets obtained by the ERS-1 AMI (Active Microwave Instrument). The system parameters which affect the accuracy of the topographic maps are errors in determination of the interferometric baseline, non parallel orbit tracks, decorrelation caused by baseline length, thermal noise, and surface change. Surface change can be observed in the interferograms either through decorrelation of the interferometric phase, or coherent phase shifts caused by locally uniform surface displacements. Phase gradient maps can be derived directly from the complex interferograms and can be transformed into surface slopes mapped onto a geometrically corrected grid.

  11. Analysis of terrain map matching using multisensing techniques for applications to autonomous vehicle navigation

    NASA Technical Reports Server (NTRS)

    Page, Lance; Shen, C. N.

    1991-01-01

    This paper describes skyline-based terrain matching, a new method for locating the vantage point of laser range-finding measurements on a global map previously prepared by satellite or aerial mapping. Skylines can be extracted from the range-finding measurements and modelled from the global map, and are represented in parametric, cylindrical form with azimuth angle as the independent variable. The three translational parameters of the vantage point are determined with a three-dimensional matching of these two sets of skylines.
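
    A brute-force stand-in for the matching step described above is sketched below: a skyline (maximum elevation angle versus azimuth) is modelled from a gridded map for each candidate vantage point and compared to an observed skyline in a least-squares sense. The grid search, azimuth binning and helper names are assumptions, not the authors' algorithm.

```python
import numpy as np

def model_skyline(dem, cellsize, vx, vy, vz, n_az=90):
    """Skyline elevation angle versus azimuth as seen from a candidate
    vantage point (column vx, row vy, height vz) on a gridded map."""
    ny, nx = dem.shape
    yy, xx = np.mgrid[0:ny, 0:nx]
    dx, dy = (xx - vx) * cellsize, (yy - vy) * cellsize
    dist = np.hypot(dx, dy)
    dist[dist == 0] = np.nan                      # ignore the vantage cell itself
    elev = np.arctan2(dem - vz, dist)             # elevation angle to each cell
    az = np.mod(np.arctan2(dx, dy), 2 * np.pi)    # one azimuth convention
    bins = (az / (2 * np.pi) * n_az).astype(int) % n_az
    skyline = np.full(n_az, -np.pi / 2)
    for b in range(n_az):
        mask = (bins == b) & np.isfinite(elev)
        if mask.any():
            skyline[b] = elev[mask].max()         # highest angle in this azimuth bin
    return skyline

def match_vantage(dem, cellsize, observed_skyline, candidates):
    """Pick the candidate (x, y, z) whose modelled skyline best matches the
    observed one in a least-squares sense."""
    errs = [np.sum((model_skyline(dem, cellsize, *c, len(observed_skyline))
                    - observed_skyline) ** 2) for c in candidates]
    return candidates[int(np.argmin(errs))]

# Toy map and a synthetic "observed" skyline taken from a known vantage point
rng = np.random.default_rng(4)
dem = rng.uniform(0, 50, (60, 60)) + 200
true = (30, 25, 205.0)
obs = model_skyline(dem, 10.0, *true)
cands = [(x, y, 205.0) for x in range(20, 40, 5) for y in range(15, 35, 5)]
print(match_vantage(dem, 10.0, obs, cands))       # -> (30, 25, 205.0)
```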

  12. An automated technique for detailed μ-FTIR mapping of diamond and spectral deconvolution

    NASA Astrophysics Data System (ADS)

    Howell, Dan; Griffin, Bill; O'Neill, Craig; O'Reilly, Suzanne; Pearson, Norman; Handley, Heather

    2010-05-01

    other commonly found defects and impurities, whether these are intrinsic defects like platelets, extrinsic defects like hydrogen or boron atoms, or inclusions of minerals or fluids. Recent technological developments in the field of spectroscopy allow detailed μ-FTIR analysis to be performed rapidly in an automated fashion. The Nicolet iN10 microscope has an integrated design that maximises signal throughput and allows spectra to be collected with greater efficiency than is possible with conventional μ-FTIR spectrometer-microscope systems. Combining this with a computer-controlled x-y stage allows the automated measurement of several thousand spectra in only a few hours. This affords us the ability to record 2D IR maps of diamond plates with minimal effort, but has created the need for an automated technique to process the large quantities of IR spectra and obtain quantitative data from them. We will present new software routines that can process large batches of IR spectra, including baselining, conversion to absorption coefficient, and deconvolution to identify and quantify the various nitrogen components. Possible sources of error in each step of the process will be highlighted so that the data produced can be critically assessed. The end result will be the production of various false colour 2D maps that show the distribution of nitrogen concentrations and aggregation states, as well as other identifiable components.
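
    For the per-spectrum processing chain named above (baselining, conversion to absorption coefficient, deconvolution into nitrogen components), a hedged sketch is given below. The reference component spectra, the sample thickness, and the use of non-negative least squares are assumptions; the abstract does not describe the authors' actual routines in detail.

```python
import numpy as np
from scipy.optimize import nnls

def deconvolve_nitrogen(spectrum, baseline, references, thickness_cm):
    """Sketch of one per-spectrum step: subtract a baseline, convert absorbance
    to an absorption coefficient, then fit a non-negative mixture of reference
    nitrogen-defect spectra (e.g. A-centre, B-centre). References are
    hypothetical inputs, not values from the paper."""
    absorbance = spectrum - baseline
    alpha = 2.303 * absorbance / thickness_cm       # absorption coefficient (cm^-1)
    coeffs, residual = nnls(references.T, alpha)    # one column per component
    return coeffs, residual

# Toy data: 500 wavenumber points, two synthetic reference components
wn = np.linspace(1000, 1400, 500)
ref_a = np.exp(-0.5 * ((wn - 1282) / 10) ** 2)
ref_b = np.exp(-0.5 * ((wn - 1175) / 12) ** 2)
references = np.vstack([ref_a, ref_b])
true_mix = 3.0 * ref_a + 1.5 * ref_b
spectrum = true_mix * 0.05 / 2.303 + 0.02           # synthetic raw absorbance + flat baseline
coeffs, _ = deconvolve_nitrogen(spectrum, 0.02, references, thickness_cm=0.05)
print(coeffs)                                       # approximately [3.0, 1.5]
```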

  13. Language Mapping with Navigated Repetitive TMS: Proof of Technique and Validation

    PubMed Central

    Tarapore, Phiroz E.; Findlay, Anne M.; Honma, Susanne M.; Mizuiri, Danielle; Houde, John F.; Berger, Mitchel S.; Nagarajan, Srikantan S.

    2013-01-01

    Objective Lesion-based mapping of speech pathways has been possible only during invasive neurosurgical procedures using direct cortical stimulation (DCS). However, navigated transcranial magnetic stimulation (nTMS) may allow for lesion-based interrogation of language pathways noninvasively. Although not lesion-based, magnetoencephalographic imaging (MEGI) is another noninvasive modality for language mapping. In this study, we compare the accuracy of nTMS and MEGI with DCS. Methods Subjects with lesions around cortical language areas underwent preoperative nTMS and MEGI for language mapping. nTMS maps were generated using a repetitive TMS protocol to deliver trains of stimulations during a picture naming task. MEGI activation maps were derived from adaptive spatial filtering of beta-band power decreases prior to overt speech during picture naming and verb generation tasks. The subjects subsequently underwent awake language mapping via intraoperative DCS. The language maps obtained from each of the 3 modalities were recorded and compared. Results nTMS and MEGI were performed on 12 subjects. nTMS yielded 21 positive language disruption sites (11 speech arrest, 5 anomia, and 5 other) while DCS yielded 10 positive sites (2 speech arrest, 5 anomia, and 3 other). MEGI isolated 32 sites of peak activation with language tasks. Positive language sites were most commonly found in the pars opercularis for all three modalities. In 9 instances the positive DCS site corresponded to a positive nTMS site, while in 1 instance it did not. In 4 instances, a positive nTMS site corresponded to a negative DCS site, while 169 instances of negative nTMS and DCS were recorded. The sensitivity of nTMS was therefore 90%, specificity was 98%, the positive predictive value was 69% and the negative predictive value was 99% as compared with intraoperative DCS. MEGI language sites for verb generation and object naming correlated with nTMS sites in 5 subjects, and with DCS sites in 2 subjects
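
    The sensitivity, specificity and predictive values quoted above follow directly from the reported counts (9 true positives, 1 false negative, 4 false positives, 169 true negatives). A small sketch reproducing that arithmetic:

```python
def diagnostic_metrics(tp, fn, fp, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 confusion table,
    reproducing the nTMS-vs-DCS figures quoted in the abstract."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return sensitivity, specificity, ppv, npv

# Counts reported in the study: 9 TP, 1 FN, 4 FP, 169 TN
print([round(100 * v) for v in diagnostic_metrics(9, 1, 4, 169)])
# -> [90, 98, 69, 99]
```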

  14. A Hydrogeologic Map of the Death Valley Region, Nevada and California, Developed Using GIS Techniques

    USGS Publications Warehouse

    Faunt, Claudia C.; D'Agnese, Frank A.; Turner, A. Keith

    1997-01-01

    In support of Yucca Mountain site characterization studies, a hydrogeologic framework was developed, and a hydrogeologic map was constructed for the Death Valley region. The region, covering approximately 100,000 km² along the Nevada-California border near Las Vegas, is characterized by isolated mountain ranges juxtaposed against broad, alluvium-filled valleys. Geologic conditions are typical of the Basin and Range Province; a variety of sedimentary and igneous intrusive and extrusive rocks have been subjected to both compressional and extensional deformation. The regional ground-water flow system can best be described as a series of connected intermontane basins in which ground-water flow occurs in basin-fill deposits, carbonate rocks, clastic rocks, and volcanic rocks. Previous investigations have developed more site-specific hydrogeologic relationships; however, few have described all the lithologies within the Death Valley regional ground-water flow system. Information required to characterize the hydrogeologic units in the region was obtained from regional geologic maps and reports. Map data were digitized from regional geologic maps and combined into a composite map using a geographic information system. This map was simplified to show 10 laterally extensive hydrogeologic units with distinct hydrologic properties. The hydraulic conductivity values for the hydrogeologic units range over 15 orders of magnitude due to the variability in burial depth and degree of fracturing.

  15. A hydrogeologic map of the Death Valley region, Nevada, and California, developed using GIS techniques

    SciTech Connect

    Faunt, C.C.; D'Agnese, F.A.; Turner, A.K.

    1997-12-31

    In support of Yucca Mountain site characterization studies, a hydrogeologic framework was developed, and a hydrogeologic map was constructed for the Death Valley region. The region, covering approximately 100,000 km² along the Nevada-California border near Las Vegas, is characterized by isolated mountain ranges juxtaposed against broad, alluvium-filled valleys. Geologic conditions are typical of the Basin and Range Province; a variety of sedimentary and igneous intrusive and extrusive rocks have been subjected to both compressional and extensional deformation. The regional ground-water flow system can best be described as a series of connected intermontane basins in which ground-water flow occurs in basin-fill deposits, carbonate rocks, clastic rocks, and volcanic rocks. Previous investigations have developed more site-specific hydrogeologic relationships; however, few have described all the lithologies within the Death Valley regional ground-water flow system. Information required to characterize the hydrogeologic units in the region was obtained from regional geologic maps and reports. Map data were digitized from regional geologic maps and combined into a composite map using a geographic information system. This map was simplified to show 10 laterally extensive hydrogeologic units with distinct hydrologic properties. The hydraulic conductivity values for the hydrogeologic units range over 15 orders of magnitude due to the variability in burial depth and degree of fracturing.

  16. Thermal mapping of mountain slopes on Mars by application of a Differential Apparent Thermal Inertia technique

    NASA Astrophysics Data System (ADS)

    Kubiak, Marta; Mège, Daniel; Gurgurewicz, Joanna; Ciazela, Jakub

    2015-04-01

    Thermal inertia (P) is an important property of geologic surfaces that essentially describes the resistance to temperature (T) change as heat is added. Most remote sensing data describe the surface only. P is a volume property that is sensitive to the composition of the subsurface, down to a depth reached by the diurnal heating wave. As direct measurement of P is not possible on Mars, thermal inertia models (Fergason et al., 2006) and deductive methods (the Apparent Thermal Inertia: ATI and Differential Apparent Thermal Inertia: DATI) are used to estimate it. ATI is computed as (1 - A) / (Tday - Tnight), where A is albedo. Due to the lack of thermal daytime images with maximum land surface temperature (LST) and nighttime images with minimum LST in the Valles Marineris region, the ATI method is difficult to apply. Instead, we have explored the DATI technique (Sabol et al., 2006). DATI is calculated based on shorter time (t) intervals with a high |ΔT/Δt| gradient (in the morning or in the afternoon) and is proportional to the day/night temperature difference (ATI), and hence P. Mars, which exhibits exceptionally high |ΔT/Δt| gradients due to the lack of vegetation and the thin atmosphere, is especially suitable for the DATI approach. Here we present a new deductive method for high-resolution differential apparent thermal inertia (DATI) mapping for areas of highly contrasted relief (e.g., Valles Marineris). Contrary to the thermal inertia models, our method takes local relief characteristics (slopes and aspects) into account. This is crucial as topography highly influences A and ΔT measurements. In spite of the different approach, DATI values in the flat areas are in the same range as the values obtained by Fergason et al. (2006). They provide, however, more accurate information for geological interpretations of hilly or mountainous terrains. Sabol, D. E., Gillespie, A. R., McDonald, E., and Danilina, I., 2006. Differential Thermal Inertia of Geological Surfaces. In
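
    The ATI formula is stated explicitly above, and DATI is described as the analogous quantity computed over a short interval with a steep |ΔT/Δt| gradient. A rough sketch of both, on toy rasters, is given below; the exact normalisation of the DATI denominator is an assumption, since the abstract only states proportionality to ATI.

```python
import numpy as np

def apparent_thermal_inertia(albedo, t_day, t_night):
    """ATI as defined in the abstract: (1 - A) / (Tday - Tnight)."""
    return (1.0 - albedo) / (t_day - t_night)

def differential_ati(albedo, t1, t2, dt_hours):
    """DATI sketch: same numerator, but the denominator uses the temperature
    change over a short morning or afternoon interval, normalised by the
    interval length. The normalisation choice is an assumption."""
    return (1.0 - albedo) / (np.abs(t2 - t1) / dt_hours)

# Toy 2x2 rasters: albedo and two morning temperature snapshots (K)
albedo = np.array([[0.15, 0.20], [0.25, 0.30]])
t_0800 = np.array([[245.0, 248.0], [250.0, 252.0]])
t_1000 = np.array([[262.0, 266.0], [265.0, 263.0]])
print(differential_ati(albedo, t_0800, t_1000, dt_hours=2.0))
```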

  17. Using a Concept Mapping Tool with a Photograph Association Technique (CoMPAT) to Elicit Children's Ideas about Microbial Activity

    NASA Astrophysics Data System (ADS)

    Byrne, Jenny; Grace, Marcus

    2010-03-01

    Concept mapping is a technique used to provide a visual representation of an individual's ideas about a concept or set of related concepts. This paper describes a concept mapping tool using a photograph association technique (CoMPAT) that is considered to be a novel way of eliciting children's ideas. What children at 11 years of age know about particular concepts related to microbial activity is presented and discussed in relation to the effectiveness of CoMPAT as a tool to draw out their ideas. It is proposed that this tool could be used to explore ideas about this and other science concepts from children in other age groups, and where language may be a barrier to participation.

  18. Nonlinear mapping technique for data visualization and clustering assessment of LIBS data: application to ChemCam data.

    PubMed

    Lasue, J; Wiens, R C; Stepinski, T F; Forni, O; Clegg, S M; Maurice, S

    2011-07-01

    ChemCam is a remote laser-induced breakdown spectroscopy (LIBS) instrument that will arrive on Mars in 2012, on-board the Mars Science Laboratory Rover. The LIBS technique is crucial to accurately identify samples and quantify elemental abundances at various distances from the rover. In this study, we compare different linear and nonlinear multivariate techniques to visualize and discriminate clusters in two dimensions (2D) from the data obtained with ChemCam. We have used principal components analysis (PCA) and independent components analysis (ICA) for the linear tools and compared them with the nonlinear Sammon's map projection technique. We demonstrate that the Sammon's map gives the best 2D representation of the data set, with optimization values from 2.8% to 4.3% (0% is a perfect representation), together with an entropy value of 0.81 for the purity of the clustering analysis. The linear 2D projections result in three (ICA) and five (PCA) times more stress, and their clustering entropy is more than twice as high, at about 1.8, indicating poorer purity. We show that the Sammon's map algorithm is faster and gives a slightly better representation of the data set if the initial conditions are taken from the ICA projection rather than the PCA projection. We conclude that the nonlinear Sammon's map projection is the best technique for combining data visualization and clustering assessment of the ChemCam LIBS data in 2D. PCA and ICA projections onto more dimensions would improve on these numbers at the cost of the intuitive interpretation of the 2D projection by a human operator. PMID:21331488
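
    Sammon's mapping minimises a distance-preservation stress between the high-dimensional spectra and a 2D configuration, and the abstract notes that initialising from a linear projection (ICA or PCA) matters. A minimal sketch, assuming a generic optimiser rather than the authors' implementation and toy data in place of ChemCam spectra:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.distance import pdist

def sammon_stress(flat_y, d_star, n, eps=1e-12):
    """Sammon stress between input-space distances d_star and the pairwise
    distances of the flattened 2-D configuration flat_y."""
    d = pdist(flat_y.reshape(n, 2)) + eps
    return np.sum((d_star - d) ** 2 / (d_star + eps)) / np.sum(d_star)

def sammon_map(X, init_2d):
    """Minimal Sammon projection: minimise the stress starting from a linear
    2-D projection (e.g. PCA or ICA scores), as the abstract suggests."""
    n = X.shape[0]
    d_star = pdist(X)
    res = minimize(sammon_stress, init_2d.ravel(), args=(d_star, n), method="L-BFGS-B")
    return res.x.reshape(n, 2), res.fun

# Toy stand-in for LIBS spectra: 60 samples x 40 channels in three clusters,
# initialised from the first two principal components
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(loc=m, scale=0.3, size=(20, 40)) for m in (0.0, 1.0, 2.0)])
Xc = X - X.mean(axis=0)
pca_scores = Xc @ np.linalg.svd(Xc, full_matrices=False)[2][:2].T
Y, stress = sammon_map(X, pca_scores)
print(round(100 * stress, 2), "% stress")
```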

  19. Mapping the petroleum system - An investigative technique to explore the hydrocarbon fluid system

    USGS Publications Warehouse

    Magoon, L.B.; Dow, W.G.

    2000-01-01

    Creating a petroleum system map includes a series of logical steps that require specific information to explain the origin in time and space of discovered hydrocarbon occurrences. If used creatively, this map provides a basis on which to develop complementary plays and prospects. The logical steps include the characterization of a petroleum system (that is, to identify, map, and name the hydrocarbon fluid system) and the summary of these results on a folio sheet. A petroleum system map is based on the understanding that there are several levels of certainty from "guessing" to "knowing" that specific oil and gas accumulations emanated from a particular pod of active source rock. Levels of certainty start with the close geographic proximity of two or more accumulations, continue with the close stratigraphic proximity, followed by the similarities in bulk properties, and then detailed geochemical properties. The highest level of certainty includes the positive geochemical correlation of the hydrocarbon fluid in the accumulations to the extract of the active source rock. A petroleum system map is created when the following logic is implemented. Implementation starts when the oil and gas accumulations of a petroleum province are grouped stratigraphically and geographically. Bulk and geochemical properties are used to further refine the groups through the determination of genetically related oil and gas types. To this basic map, surface seeps and well shows are added. Similarly, the active source rock responsible for these hydrocarbon occurrences is mapped to further define the extent of the system. A folio sheet constructed for a hypothetical case study of the Deer-Boar(.) petroleum system illustrates this methodology.

  20. Moderately luminous Type II supernovae

    NASA Astrophysics Data System (ADS)

    Inserra, C.; Pastorello, A.; Turatto, M.; Pumo, M. L.; Benetti, S.; Cappellaro, E.; Botticella, M. T.; Bufano, F.; Elias-Rosa, N.; Harutyunyan, A.; Taubenberger, S.; Valenti, S.; Zampieri, L.

    2013-07-01

    Context. Core-collapse Supernovae (CC-SNe) descend from progenitors more massive than about 8 M⊙. Because of the young age of the progenitors, the ejecta may eventually interact with the circumstellar medium (CSM) via highly energetic processes detectable in the radio, X-ray, ultraviolet (UV) and, sometimes, in the optical domains. Aims: In this paper we present ultraviolet, optical and near infrared observations of five Type II SNe, namely SNe 2009dd, 2007pk, 2010aj, 1995ad, and 1996W. Together with a few other SNe they form a group of moderately luminous Type II events. We investigate the photometric similarities and differences among these bright objects. We also attempt to characterise them by analysing the spectral evolutions, in order to find some traces of CSM-ejecta interaction. Methods: We collected photometry and spectroscopy with several telescopes in order to construct well-sampled light curves and spectral evolutions from the photospheric to the nebular phases. Both photometry and spectroscopy indicate a degree of heterogeneity in this sample. Modelling the data of SNe 2009dd, 2010aj and 1995ad allows us to constrain the explosion parameters and the properties of the progenitor stars. Results: The light curves have luminous peak magnitudes (-18.70 < M_B < -16.95). The ejected masses of ⁵⁶Ni for three SNe span a wide range of values (2.8 × 10⁻² M⊙ < M(⁵⁶Ni) < 1.4 × 10⁻¹ M⊙), while for a fourth (SN 2010aj) we could determine a stringent upper limit (7 × 10⁻³ M⊙). Clues of interaction, such as the presence of high velocity (HV) features of the Balmer lines, are visible in the photospheric spectra of SNe 2009dd and 1996W. For SN 2007pk we observe a spectral transition from a Type IIn to a standard Type II SN. Modelling the observations of SNe 2009dd, 2010aj and 1995ad with radiation hydrodynamics codes, we infer kinetic plus thermal energies of about 0.2-0.5 foe, initial radii of 2-5 × 10¹³ cm and ejected masses of ~5.0-9.5 M⊙. Conclusions: These

  1. General comparison of ozone vertical profiles obtained by various techniques during the 1983 MAP/GLOBUS campaign

    NASA Technical Reports Server (NTRS)

    Matthews, W. A.; Aimedieu, P.; Megie, G.; Pelon, J.; Attmannspacher, W.; Komhyr, W.; Marche, P.; De La Noe, J.; Rigaud, P.; Robbins, D. E.

    1987-01-01

    As part of the 1983 MAP/GLOBUS campaign, atmospheric ozone profile measurements were made using a large variety of different techniques both from balloon platforms and the ground. It is shown that, for most techniques, the measured height distributions agree to within ±5 percent, with the exception of the remote visible absorption method. This ±5 percent uncertainty is of the order of the individual intersystem accuracy. It is suggested that since the differences with the visible absorption method are in magnitude rather than in form, the absorption cross-section data could be the possible cause for the discrepancy.

  2. Applying Value Stream Mapping Technique for Production Improvement in a Manufacturing Company: A Case Study

    NASA Astrophysics Data System (ADS)

    Jeyaraj, K. L.; Muralidharan, C.; Mahalingam, R.; Deshmukh, S. G.

    2013-01-01

    The purpose of this paper is to explain how value stream mapping (VSM) is helpful in lean implementation and to develop a road map to tackle improvement areas and bridge the gap between the existing state and the proposed state of a manufacturing firm. Through this case study, the existing state of manufacturing is mapped with the help of VSM process symbols and the biggest improvement areas, such as excessive TAKT time, production, and lead time, are identified. Some modifications to the current state map are suggested, and with these modifications the future state map is prepared. Further, TAKT time is calculated to set the pace of the production processes. This paper compares the current state and future state of a manufacturing firm and reports a 20 % reduction in TAKT time, 22.5 % reduction in processing time, 4.8 % reduction in lead time, 20 % improvement in production, 9 % improvement in machine utilization, 7 % improvement in man power utilization, objective improvement in workers' skill level, and no change in the product and semi-finished product inventory level. The findings are limited due to the focused nature of the case study. This case study shows that VSM is a powerful tool for lean implementation and allows the industry to understand and continuously improve towards lean manufacturing.
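
    TAKT time, referred to above, is conventionally computed as available production time divided by customer demand; the figures in the example below are illustrative, not the case-study values:

```python
def takt_time(available_time_per_shift_min, customer_demand_per_shift):
    """Takt time as used in value stream mapping: available production time
    divided by customer demand over the same period."""
    return available_time_per_shift_min / customer_demand_per_shift

# e.g. 450 min of available time and 300 units demanded per shift
print(takt_time(450, 300), "min per unit")   # -> 1.5
```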

  3. Frequency analysis of a task-evoked pupillary response: Luminance-independent measure of mental effort.

    PubMed

    Peysakhovich, Vsevolod; Causse, Mickaël; Scannella, Sébastien; Dehais, Frédéric

    2015-07-01

    Pupil diameter is a widely studied cognitive load measure which, despite its convenience for non-intrusive operator state monitoring in complex environments, is still not available for in situ measurements because of numerous methodological limitations. The most important of these limitations is the influence of the pupillary light reflex. Hence, there is a need for a pupil-based cognitive load measure that is independent of light conditions. In this paper, we present a promising technique of pupillary signal analysis resulting in a luminance-independent measure of mental effort that could be used in real time without a priori knowledge of luminance conditions. Twenty-two participants performed a short-term memory task under different screen luminance conditions. Our results showed that the amplitude of pupillary dilation due to load on memory was luminance-dependent, with higher amplitude corresponding to the lower-luminance condition. Furthermore, our experiment showed that the load-on-memory and luminance factors express themselves differently across frequency. Therefore, as our statistical analysis revealed, the ratio between the low-frequency (0-1.6 Hz) and high-frequency (1.6-4 Hz) bands (LF/HF ratio) of the power spectral density of the pupillary signal is sensitive to cognitive load but not to luminance. Our results are promising for the measurement of load on memory in ecological settings. PMID:25941013
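
    The measure proposed above is a ratio of pupillary power in a low-frequency band (0-1.6 Hz) to that in a high-frequency band (1.6-4 Hz). A sketch of that computation follows; the Welch PSD estimator, window length and toy signal are assumptions, as the abstract does not specify the estimator.

```python
import numpy as np
from scipy.signal import welch

def lf_hf_ratio(pupil_signal, fs, lf=(0.0, 1.6), hf=(1.6, 4.0)):
    """Ratio of low- to high-frequency power in a pupil-diameter trace,
    using the bands quoted in the abstract."""
    nperseg = min(len(pupil_signal), 4 * int(fs))
    f, psd = welch(pupil_signal, fs=fs, nperseg=nperseg)
    lf_mask = (f >= lf[0]) & (f < lf[1])
    hf_mask = (f >= hf[0]) & (f < hf[1])
    lf_power = np.trapz(psd[lf_mask], f[lf_mask])
    hf_power = np.trapz(psd[hf_mask], f[hf_mask])
    return lf_power / hf_power

# Toy 60 s pupil trace at 60 Hz: a slow dilation component plus a faster one
fs = 60.0
t = np.arange(0, 60, 1 / fs)
signal = 0.3 * np.sin(2 * np.pi * 0.4 * t) + 0.05 * np.sin(2 * np.pi * 2.5 * t)
print(lf_hf_ratio(signal, fs))
```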

  4. Mapping of Primary Instructional Methods and Teaching Techniques for Regularly Scheduled, Formal Teaching Sessions in an Anesthesia Residency Program.

    PubMed

    Vested Madsen, Matias; Macario, Alex; Yamamoto, Satoshi; Tanaka, Pedro

    2016-06-01

    In this study, we examined the regularly scheduled, formal teaching sessions in a single anesthesiology residency program to (1) map the most common primary instructional methods, (2) map the use of 10 known teaching techniques, and (3) assess if residents scored sessions that incorporated active learning as higher quality than sessions with little or no verbal interaction between teacher and learner. A modified Delphi process was used to identify useful teaching techniques. A representative sample of each of the formal teaching session types was mapped, and residents anonymously completed a 5-question written survey rating the session. The most common primary instructional methods were computer slides-based classroom lectures (66%), workshops (15%), simulations (5%), and journal club (5%). The number of teaching techniques used per formal teaching session averaged 5.31 (SD, 1.92; median, 5; range, 0-9). Clinical applicability (85%) and attention grabbers (85%) were the 2 most common teaching techniques. Thirty-eight percent of the sessions defined learning objectives, and one-third of sessions engaged in active learning. The overall survey response rate equaled 42%, and passive sessions had a mean score of 8.44 (range, 5-10; median, 9; SD, 1.2) compared with a mean score of 8.63 (range, 5-10; median, 9; SD, 1.1) for active sessions (P = 0.63). Slides-based classroom lectures were the most common instructional method, and faculty used an average of 5 known teaching techniques per formal teaching session. The overall education scores of the sessions as rated by the residents were high. PMID:27243580

  5. Processing techniques for the production of an experimental computer-generated shaded-relief map

    USGS Publications Warehouse

    Judd, Damon D.

    1986-01-01

    The data consisted of forty-eight 1° by 1° blocks of resampled digital elevation model (DEM) data. These data were digitally mosaicked and assigned colors based on intervals of elevation values. The color-coded data set was then used to create a shaded-relief image that was photographically composited with cartographic line information to produce a shaded-relief map. The majority of the processing was completed at the National Mapping Division EROS Data Center in Sioux Falls, South Dakota.
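
    The shaded-relief step described above can be illustrated with a standard Lambertian hillshade computed from a gridded DEM. The formula and conventions below are a common textbook variant, not necessarily the procedure used for the experimental map:

```python
import numpy as np

def hillshade(dem, cellsize, azimuth_deg=315.0, altitude_deg=45.0):
    """Lambertian shaded relief from a gridded DEM, with a commonly used
    azimuth/aspect convention (other conventions differ in sign only)."""
    az = np.radians(360.0 - azimuth_deg + 90.0)     # illumination azimuth, math convention
    alt = np.radians(altitude_deg)
    dz_dy, dz_dx = np.gradient(dem, cellsize)
    slope = np.arctan(np.hypot(dz_dx, dz_dy))
    aspect = np.arctan2(-dz_dx, dz_dy)
    shade = (np.sin(alt) * np.cos(slope)
             + np.cos(alt) * np.sin(slope) * np.cos(az - aspect))
    return np.clip(255 * shade, 0, 255).astype(np.uint8)

# Toy DEM: a smooth dome on a 200 x 200 grid with 30 m cells
y, x = np.mgrid[0:200, 0:200]
dem = 500.0 * np.exp(-((x - 100) ** 2 + (y - 100) ** 2) / 4000.0)
print(hillshade(dem, cellsize=30.0).shape)
```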

  6. Dust near luminous ultraviolet stars

    NASA Technical Reports Server (NTRS)

    Henry, Richard C.

    1992-01-01

    More than 700 luminous stars in the infrared astronomical satellite (IRAS) Skyflux plates were examined for the presence of dust heated by a nearby star. This dust may be distinguished from the ubiquitous cool cirrus by its higher temperature and thus enhanced 60 micron emission. More than 120 dust clouds were found around only 106 of the stars with a volume filling factor of 0.006 and an intercloud separation of 46 pc. A region of dust smoothly distributed through the volume of space heated by the star could not be found and hence an upper limit of 0.05 cm⁻³ is placed on the equivalent gas density in the intercloud regions. The clouds have an average density of 0.22 cm⁻³ and a radius of 1.9 pc, albeit with wide variations in their properties. Two different scale heights of 140 and 540 pc were found. This was interpreted as evidence for different distributions of dust in and out of the galactic disk.

  7. Dust near luminous ultraviolet stars

    NASA Technical Reports Server (NTRS)

    Henry, Richard C.

    1993-01-01

    This report describes research activities related to the Infrared Astronomical Satellite (IRAS) sky survey. About 745 luminous stars were examined for the presence of interstellar dust heated by a nearby star. The 'cirrus' discovered by IRAS is thermal radiation from interstellar dust at moderate and high galactic latitudes. IRAS locates the dust, which must (at some level) scatter ultraviolet starlight. Although it was expected that thermal emission would be found around virtually every star, most stars show no detectable emission, and the emission that is found is not uniform. It is not that the star is embedded in 'an interstellar medium'; rather, what is found are discrete clouds that are heated by starlight. An exception is the dearth of clouds near the very hottest stars, implying that the very hottest stars play an active role in destroying or substantially modifying the dust clouds over time. The other possibility is simply that the hottest stars are located in regions lacking in dust, which is counter-intuitive. A bibliography of related journal articles is attached.

  8. Low-Altitude AVIRIS Data for Mapping Land Cover in Yellowstone National Park: Use of Isodata Clustering Techniques

    NASA Technical Reports Server (NTRS)

    Spruce, Joseph P.

    2001-01-01

    Northeast Yellowstone National Park (YNP) has a diversity of forest, range, and wetland cover types. Several remote sensing studies have recently been done in this area, including the NASA Earth Observations Commercial Applications Program (EOCAP) hyperspectral project conducted by Yellowstone Ecosystems Studies (YES) on the use of hyperspectral imaging for assessing riparian and in-stream habitats. In 1999, YES and NASA's Commercial Remote Sensing Program Office began collaborative study of this area, assessing the potential of synergistic use of hyperspectral, synthetic aperture radar (SAR), and multiband thermal data for mapping forest, range, and wetland land cover. Since the beginning, a quality 'reference' land cover map has been desired as a tool for developing and validating other land cover maps produced during the project. This paper recounts an effort to produce such a reference land cover map using low-altitude Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) data and unsupervised classification techniques. The main objective of this study is to assess ISODATA classification for mapping land cover in Northeast YNP using select bands of low-altitude AVIRIS data. A secondary, more long-term objective is to assess the potential for improving ISODATA-based classification of land cover through use of principal components analysis and minimum noise fraction (MNF) techniques. This paper will primarily report on work regarding the primary research objective. This study focuses on an AVIRIS cube acquired on July 23, 1999, by the confluence of Soda Butte Creek with the Lamar River. Range and wetland habitats dominate the image with forested habitats being a comparatively minor component of the scene. The scene generally tracks from southwest to northeast. Most of the scene is valley bottom with some lower side slopes occurring on the western portion. Elevations within the AVIRIS scene range from approximately 1998 to 2165 m above sea level, based on US

  9. Comparison of regression and kriging techniques for mapping the average annual precipitation of Turkey

    NASA Astrophysics Data System (ADS)

    Bostan, P. A.; Heuvelink, G. B. M.; Akyurek, S. Z.

    2012-10-01

    Accurate mapping of the spatial distribution of annual precipitation is important for many applications in hydrology, climatology, agronomy, ecology and other environmental sciences. In this study, we compared five different statistical methods to predict spatially the average annual precipitation of Turkey using point observations of annual precipitation at meteorological stations and spatially exhaustive covariate data (i.e. elevation, aspect, surface roughness, distance to coast, land use and eco-region). The methods compared were multiple linear regression (MLR), ordinary kriging (OK), regression kriging (RK), universal kriging (UK), and geographically weighted regression (GWR). Average annual precipitation of Turkey from 1970 to 2006 was measured at 225 meteorological stations that are fairly uniformly distributed across the country, with a somewhat higher spatial density along the coastline. The observed annual precipitation varied between 255 mm and 2209 mm with an average of 628 mm. The annual precipitation was highest along the southern and northern coasts and low in the centre of the country, except for the area near the Van Lake, Keban and Ataturk Dams. To compare the performance of the interpolation techniques the total dataset was first randomly split in ten equally sized test datasets. Next, for each test data set the remaining 90% of the data comprised the training dataset. Each training dataset was then used to calibrate and apply the spatial prediction model. Predictions at the test dataset locations were compared with the observed test data. Validation was done by calculating the Root Mean Squared Error (RMSE), R-square and Standardized MSE (SMSE) values. According to these criteria, universal kriging is the most accurate with an RMSE of 178 mm, an R-square of 0.61 and an SMSE of 1.06, whilst multiple linear regression performed worst (RMSE of 222 mm, R-square of 0.39, and SMSE of 1.44). Ordinary kriging, UK using only elevation and geographically
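
    The validation scheme above (ten-fold splits scored with RMSE and R-square) is easy to sketch. In the example below, scikit-learn's GaussianProcessRegressor stands in for a kriging-type interpolator and LinearRegression for MLR; the kernel, covariates and toy station data are assumptions, not the study's configuration:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import KFold
from sklearn.metrics import mean_squared_error, r2_score

def cross_validate(model, X, y, n_splits=10, seed=0):
    """Ten-fold split-and-score loop mirroring the validation in the abstract:
    mean RMSE and mean R^2 over held-out station subsets."""
    rmse, r2 = [], []
    for train, test in KFold(n_splits, shuffle=True, random_state=seed).split(X):
        model.fit(X[train], y[train])
        pred = model.predict(X[test])
        rmse.append(np.sqrt(mean_squared_error(y[test], pred)))
        r2.append(r2_score(y[test], pred))
    return np.mean(rmse), np.mean(r2)

# Toy stand-in for 225 stations: x, y coordinates and elevation predicting precipitation
rng = np.random.default_rng(2)
X = rng.uniform(0, 1000, size=(225, 3))
y = 600 + 0.8 * X[:, 2] + 0.2 * X[:, 1] + rng.normal(0, 60, 225)

mlr = LinearRegression()
gp = GaussianProcessRegressor(kernel=RBF(length_scale=200.0) + WhiteKernel(noise_level=1.0),
                              normalize_y=True)
print("MLR:", cross_validate(mlr, X, y))
print("GP :", cross_validate(gp, X, y))
```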

  10. Metateaching and the Instructional Map. Teaching Techniques/Strategies Series, Volume 1.

    ERIC Educational Resources Information Center

    Timpson, William M.

    This book describes a conceptual framework, the "Instructional Map"--a metaphor for envisioning the interconnectedness of teacher and student, teaching and learning, and content and process--that can help teachers plan, sort their way through course material and instructional options, interact with students, and reflect upon progress made and what…

  11. Creating High-Resolution Maps of Leaf Water Isotopes Using IM-CRDS and IRMS Techniques

    NASA Astrophysics Data System (ADS)

    Gerlein-Safdi, C.; Sinkler, C. J.; Caylor, K. K.

    2014-12-01

    Since the development of isotope ratio infrared spectroscopy (IRIS), the applications of water isotope analysis have been increasing. Here, we present a new protocol to create high-resolution maps of the leaf water isotopes ¹⁸O and ²H. We use the Picarro induction module (IM-CRDS) combined with an isotope analyzer (L2130-i) to sample up to 25 locations in one half of each leaf. Each sampling location corresponds to four samples (6 mm outside diameter punched holes) punched next to each other. In the induction module, an induction coil heats a metal holder containing the leaf sample. The sample releases water vapor that is then sent to the isotope analyzer. The IM-CRDS allows us to significantly reduce the sample size and therefore increase the sample density, compared to the traditional cryogenic extraction method. Using spatial analysis tools, we create high-resolution spatial maps of each isotope as well as d-excess maps. The water in the second half of the leaf is extracted by cryogenic extraction and analyzed using both IRIS and isotope ratio mass spectrometry. The isotopic composition of the extracted water is compared to the average composition calculated from the maps and used for calibration. We present applications of this protocol to the analysis of the spatio-temporal evolution of foliar uptake in Colocasia esculenta under laboratory conditions.

  12. Implementation of Concept Mapping to Novices: Reasons for Errors, a Matter of Technique or Content?

    ERIC Educational Resources Information Center

    Conradty, Catherine; Bogner, Franz X.

    2010-01-01

    Concept mapping is discussed as a means to promote meaningful learning and in particular progress in reading comprehension skills. Its increasing implementation necessitates the acquisition of adequate knowledge about frequent errors in order to make available an effective introduction to the new learning method. To analyse causes of errors, 283…

  13. Colony mapping: A new technique for monitoring crevice-nesting seabirds

    USGS Publications Warehouse

    Renner, H.M.; Renner, M.; Reynolds, J.H.; Harping, A.M.A.; Jones, I.L.; Irons, D.B.; Byrd, G.V.

    2006-01-01

    Monitoring populations of auklets and other crevice-nesting seabirds remains problematic, although numerous methods have been attempted since the mid-1960s. Anecdotal evidence suggests several large auklet colonies have recently decreased in both abundance and extent, concurrently with vegetation encroachment and succession. Quantifying changes in the geographical extent of auklet colonies may be a useful alternative to monitoring population size directly. We propose a standardized method for colony mapping using a randomized systematic grid survey with two components: a simple presence/absence survey and an auklet evidence density survey. A quantitative auklet evidence density index was derived from the frequency of droppings and feathers. This new method was used to map the colony on St. George Island in the southeastern Bering Sea and results were compared to previous colony mapping efforts. Auklet presence was detected in 62 of 201 grid cells (each grid cell = 2500 m²) by sampling a randomly placed 16 m² plot in each cell; estimated colony area = 155 000 m². The auklet evidence density index varied by two orders of magnitude across the colony and was strongly correlated with means of replicated counts of birds socializing on the colony surface. Quantitatively mapping all large auklet colonies is logistically feasible using this method and would provide an important baseline for monitoring colony status. Regularly monitoring select colonies using this method may be the best means of detecting changes in distribution and population size of crevice-nesting seabirds. © The Cooper Ornithological Society 2006.
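
    The extent estimate quoted above is simple design-based arithmetic: the number of grid cells with detected presence multiplied by the 2500 m² cell area. A minimal sketch reproducing the reported figure:

```python
def colony_area_estimate(cells_with_presence, total_cells, cell_area_m2=2500):
    """Extent estimate implied by the abstract: occupied cells times the area
    of one grid cell, plus the occupied fraction of surveyed cells."""
    occupied_fraction = cells_with_presence / total_cells
    return cells_with_presence * cell_area_m2, occupied_fraction

area, frac = colony_area_estimate(62, 201)
print(area, round(frac, 3))   # -> 155000 m^2 from 62 of 201 cells
```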

  14. ERTS-1 imagery interpretation techniques in the Tennessee Valley. [land use and soil mapping

    NASA Technical Reports Server (NTRS)

    Bodenheimer, R. E. (Principal Investigator)

    1974-01-01

    The author has identified the following significant results. The feasibility of delineating major soil associations and land uses through computerized analyses is discussed. Useful and potential applications in detecting landscape change and land use mapping are described. Recommendations for improving the data processing effort in a multidisciplinary program are presented.

  15. Using Concept Mapping Techniques to Compare Stakeholder Groups' Perceptions of Tech Prep.

    ERIC Educational Resources Information Center

    Roegge, Chris A.; And Others

    A study was conducted to develop a conceptual framework for tech prep programs based on the perceptions of personnel involved in the planning and implementation of local tech prep programs. A structured conceptualization process called concept mapping was used to develop a pictorial representation of stakeholders' perceptions of tech prep. Using a…

  16. Detection and mapping of illicit drugs and their metabolites in fingermarks by MALDI MS and compatibility with forensic techniques.

    PubMed

    Groeneveld, G; de Puit, M; Bleay, S; Bradshaw, R; Francese, S

    2015-01-01

    Despite the proven capabilities of Matrix Assisted Laser Desorption Ionisation Mass Spectrometry (MALDI MS) in laboratory settings, research is still needed to integrate this technique into current forensic fingerprinting practice. Optimised protocols enabling the compatible application of MALDI to developed fingermarks will allow additional intelligence to be gathered around a suspect's lifestyle and activities prior to the deposition of their fingermarks while committing a crime. The detection and mapping of illicit drugs and metabolites in latent fingermarks would provide intelligence that is beneficial for both police investigations and court cases. This study investigated MALDI MS detection and mapping capabilities for a large range of drugs of abuse and their metabolites in fingermarks; the detection and mapping of a mixture of these drugs in marks, with and without prior development with cyanoacrylate fuming or Vacuum Metal Deposition, was also examined. Our findings indicate the versatility of MALDI technology and its ability to retrieve chemical intelligence either by detecting the compounds investigated or by using their ion signals to reconstruct 2D maps of fingermark ridge details. PMID:26118853

  17. Detection and mapping of illicit drugs and their metabolites in fingermarks by MALDI MS and compatibility with forensic techniques

    PubMed Central

    Groeneveld, G.; de Puit, M.; Bleay, S.; Bradshaw, R.; Francese, S.

    2015-01-01

    Despite the proven capabilities of Matrix Assisted Laser Desorption Ionisation Mass Spectrometry (MALDI MS) in laboratory settings, research is still needed to integrate this technique into current forensic fingerprinting practice. Optimised protocols enabling the compatible application of MALDI to developed fingermarks will allow additional intelligence to be gathered around a suspect’s lifestyle and activities prior to the deposition of their fingermarks while committing a crime. The detection and mapping of illicit drugs and metabolites in latent fingermarks would provide intelligence that is beneficial for both police investigations and court cases. This study investigated MALDI MS detection and mapping capabilities for a large range of drugs of abuse and their metabolites in fingermarks; the detection and mapping of a mixture of these drugs in marks, with and without prior development with cyanoacrylate fuming or Vacuum Metal Deposition, was also examined. Our findings indicate the versatility of MALDI technology and its ability to retrieve chemical intelligence either by detecting the compounds investigated or by using their ion signals to reconstruct 2D maps of fingermark ridge details. PMID:26118853

  18. Detection and mapping of illicit drugs and their metabolites in fingermarks by MALDI MS and compatibility with forensic techniques

    NASA Astrophysics Data System (ADS)

    Groeneveld, G.; de Puit, M.; Bleay, S.; Bradshaw, R.; Francese, S.

    2015-06-01

    Despite the proven capabilities of Matrix Assisted Laser Desorption Ionisation Mass Spectrometry (MALDI MS) in laboratory settings, research is still needed to integrate this technique into current forensic fingerprinting practice. Optimised protocols enabling the compatible application of MALDI to developed fingermarks will allow additional intelligence to be gathered around a suspect’s lifestyle and activities prior to the deposition of their fingermarks while committing a crime. The detection and mapping of illicit drugs and metabolites in latent fingermarks would provide intelligence that is beneficial for both police investigations and court cases. This study investigated MALDI MS detection and mapping capabilities for a large range of drugs of abuse and their metabolites in fingermarks; the detection and mapping of a mixture of these drugs in marks, with and without prior development with cyanoacrylate fuming or Vacuum Metal Deposition, was also examined. Our findings indicate the versatility of MALDI technology and its ability to retrieve chemical intelligence either by detecting the compounds investigated or by using their ion signals to reconstruct 2D maps of fingermark ridge details.

  19. Planning and Measuring Luminance Contrast in Staircases.

    PubMed

    Houck, Leif D; Gundersen, Kristoffer; Strengen, Ola

    2016-01-01

    Norwegian legislation has requirements concerning luminance contrast for different elements in staircases. This paper investigates how architects work to meet the requirements and how the actual built luminance contrasts can be measured; finally, 21 staircases are measured using two different methods. The results show that some architects do not reflect on luminance contrasts at all, some use their "experience", and some try to measure the reflectance values of different materials during planning. The investigation also reveals that there is no official predefined way to verify luminance contrast, and that different approaches will give different results. To measure the built staircases, it has been necessary to develop a defined measuring method. The measurement results generally show that only a few of the staircases studied fully meet the legislation requirements. PMID:27534331

  20. Compact radio sources in luminous infrared galaxies

    NASA Astrophysics Data System (ADS)

    Parra, Rodrigo

    2007-08-01

    Radio interferometry is an observational technique of high sensitivity and incomparably high spatial resolution. Moreover, because radio waves can freely propagate through interstellar dust and gas, it allows the study of regions of the universe completely obscured at other wavelengths. This thesis reports the observational and theoretical results of my research during the past four years which are mostly based on interferometric radio data. The COLA sample is an infrared selected sample of active star forming galaxies. We conducted 6 cm VLA and VLBI snapshot observations of the northern half of this sample. The radio emission seen at VLA scales is consistent with being powered by star formation activity because it follows the far infrared to radio correlation. We detect 22% of the sample sources in our VLBI snapshots. Based on luminosity arguments, we argue that these sub-parsec VLBI sources are powered by AGN activity. Furthermore, we find that VLBI detections are preferentially found in sources whose VLA scale structures have the highest peak brightnesses suggesting a strong correlation between compact starburst and AGN activity. This observational result is consistent with the theoretical picture of an Eddington-limited nuclear starburst acting as the last valve in the pipeline transporting the gas from kiloparsec scales onto the accretion disc of a buried AGN. Arp 220 is the archetypical ultra luminous infrared galaxy. For many years this source has been known to harbour a compact (~100 pc) cluster of unresolved 18 cm bright sources believed to be bright core collapse supernovae. Using multiwavelength VLBI observations, we obtained for the first time radio spectra for 18 of these sources. We find that over a half of them have spectra consistent with young supernovae. The rest can be better explained as older supernova remnants interacting with the high density starburst ISM. This finding allowed us to constrain the number of possible scenarios for the Arp 220

  1. STAR FORMATION IN TWO LUMINOUS SPIRAL GALAXIES

    SciTech Connect

    Hunter, Deidre A.; Ashburn, Allison; Wright, Teresa; Elmegreen, Bruce G.; Rubin, Vera C.; Józsa, Gyula I. G.; Struve, Christian

    2013-10-01

    We examined star formation in two very luminous (M_V = –22 to –23) Sc-type spiral galaxies, NGC 801 and UGC 2885, using ultra-deep Hα images. We combine these Hα images with UBV and Two-Micron All-Sky Survey JHK images and H I maps to explore the star formation characteristics of disk galaxies at high luminosity. Hα traces star formation in these galaxies to 4-6 disk scale lengths, but the lack of detection of Hα further out is likely due to the loss of Lyman continuum photons. Considering gravitational instabilities alone, we find that the gas and stars in the outer regions are marginally stable in an average sense, but considering dissipative gas and radial and azimuthal forcing, the outer regions are marginally unstable to forming spiral arms. Star formation is taking place in spiral arms, which are regions of locally higher gas densities. Furthermore, we have traced smooth exponential stellar disks over four magnitudes in V-band surface brightness and 4-6 disk scale lengths, in spite of a highly variable gravitational instability parameter. Thus, gravitational instability thresholds do not seem relevant to the stellar disk. One possibility for creating an exponential disk is that the molecular cloud densities and star formation rates have exponential profiles and this fact forces the stellar disk to build up such a profile. Another possibility is that the stellar disk is continuously adjusted to an exponential shape regardless of the star formation profile, for example, through global dynamical processes that scatter stars. However, such scattering processes are only known to operate in spiral systems, in which case they cannot explain the same dilemma of smooth exponential disks observed in dwarf irregular galaxies.

  2. Eta Carinae and Other Luminous Blue Variables

    NASA Technical Reports Server (NTRS)

    Corcoran, M. F.

    2006-01-01

    Luminous Blue Variables (LBVs) are believed to be evolved, extremely massive stars close to the Eddington Limit and hence prone to bouts of large-scale, unstable mass loss. I discuss current understanding of the evolutionary state of these objects, the role duplicity may play and known physical characteristics of these stars using the X-ray luminous LBVs Eta Carinae and HD 5980 as test cases.

  3. Mapping the pharyngeal and intestinal pH of Caenorhabditis elegans and real-time luminal pH oscillations using extended dynamic range pH-sensitive nanosensors.

    PubMed

    Chauhan, Veeren M; Orsi, Gianni; Brown, Alan; Pritchard, David I; Aylott, Jonathan W

    2013-06-25

    Extended dynamic range pH-sensitive ratiometric nanosensors, capable of accurately mapping the full physiological pH range, have been developed and used to characterize the pH of the pharyngeal and intestinal lumen of Caenorhabditis elegans in real-time. Nanosensors, 40 nm in diameter, were prepared by conjugating pH-sensitive fluorophores, carboxyfluorescein (FAM) and Oregon Green (OG) in a 1:1 ratio, and a reference fluorophore, 5-(and-6)-carboxytetramethylrhodamine (TAMRA) to an inert polyacrylamide matrix. Accurate ratiometric pH measurements were calculated through determination of the fluorescence ratio between the pH-sensitive and reference fluorophores. Nanosensors were calibrated with an automated image analysis system and validated to demonstrate a pH measurement resolution of ±0.17 pH units. The motility of C. elegans populations, as an indicator for viability, showed nematodes treated with nanosensors, for concentrations ranging from 50.00 to 3.13 mg/mL, were not statistically different to nematodes not challenged with nanosensors up to a period of 4 days (p < 0.05). The nanosensors were also found to remain in the C. elegans lumen >24 h after nanosensor challenge was removed. The pH of viable C. elegans lumen was found to range from 5.96 ± 0.31 in the anterior pharynx to 3.59 ± 0.09 in the posterior intestine. The pharyngeal pumping rate, which dictates the transfer of ingested material from the pharynx to the intestine, was found to be temperature dependent. Imaging C. elegans at 4 °C reduced the pharyngeal pumping rate to 7 contractions/min and enabled the reconstruction of rhythmic pH oscillations in the intestinal lumen in real-time with fluorescence microscopy. PMID:23668893

  4. On extracting brightness temperature maps from scanning radiometer data. [techniques for algorithm design

    NASA Technical Reports Server (NTRS)

    Argentiero, P.; Garza-Robles, R.

    1980-01-01

    The extraction of brightness temperature maps from scanning radiometer data is described as a typical linear inverse problem. Spatial quantization and parameter estimation are described and suggested as an advantageous approach to a solution. Since this approach takes explicit account of the multivariate nature of the problem, it permits an accurate determination of the most detailed resolution extractable from the data as well as explicitly defining the possible compromises between accuracy and resolution. To illustrate the usefulness of the method for algorithm design and accuracy prediction, it was applied to the problem of providing brightness temperature maps during the NOSS flight segment. The most detailed possible resolution was determined, and a curve which displays the possible compromises between accuracy and resolution was provided.

  5. Techniques for the creation of land use maps and tabulations from Landsat imagery

    NASA Technical Reports Server (NTRS)

    Angelici, G. L.; Bryant, N. A.

    1977-01-01

    Methods for creating color thematic maps and land use tabulations, employing both Landsat imagery and computer image processing, are discussed. The system, the Multiple Input Land Use System (MILUS) has been tested in the metropolitan section of Dayton, Ohio. Training areas for land use were first digitized by coordinates and then transformed onto an image of white lines on a black background. This image was added to a Landsat image of the same area. Then multispectral classification was performed. A tape of digitized census tract boundaries was computer interfaced to yield an image of tract boundaries on a background registered to the thematic land-use map. Using a data management system, the data were then used to produce figures for the area and percent of land use in each tract. Future work is expected to convert most of the steps into interactive processing. This would greatly reduce the time needed to edit and register the data sets.

  6. Simultaneous chromatic and luminance human electroretinogram responses

    PubMed Central

    Parry, Neil R A; Murray, Ian J; Panorgias, Athanasios; McKeefry, Declan J; Lee, Barry B; Kremers, Jan

    2012-01-01

    The parallel processing of information forms an important organisational principle of the primate visual system. Here we describe experiments which use a novel chromatic–achromatic temporal compound stimulus to simultaneously identify colour and luminance specific signals in the human electroretinogram (ERG). Luminance and chromatic components are separated in the stimulus; the luminance modulation has twice the temporal frequency of the chromatic modulation. ERGs were recorded from four trichromatic and two dichromatic subjects (1 deuteranope and 1 protanope). At isoluminance, the fundamental (first harmonic) response was elicited by the chromatic component in the stimulus. The trichromatic ERGs possessed low-pass temporal tuning characteristics, reflecting the activity of parvocellular post-receptoral mechanisms. There was very little first harmonic response in the dichromats’ ERGs. The second harmonic response was elicited by the luminance modulation in the compound stimulus and showed, in all subjects, band-pass temporal tuning characteristic of magnocellular activity. Thus it is possible to concurrently elicit ERG responses from the human retina which reflect processing in both chromatic and luminance pathways. As well as providing a clear demonstration of the parallel nature of chromatic and luminance processing in the human retina, the differences that exist between ERGs from trichromatic and dichromatic subjects point to the existence of interactions between afferent post-receptoral pathways that are in operation from the earliest stages of visual processing. PMID:22586211
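
    Because the luminance component is modulated at exactly twice the chromatic frequency, the two responses can be separated as the first and second harmonics of the averaged ERG. A sketch of that separation using a plain FFT is given below; the authors' exact analysis pipeline is not specified in the abstract, and the epoch used here is synthetic:

```python
import numpy as np

def harmonic_amplitudes(erg, fs, f_chromatic):
    """Amplitude of the first harmonic (at the chromatic modulation frequency)
    and the second harmonic (at twice that frequency, the luminance component)
    from an averaged ERG epoch."""
    n = len(erg)
    spectrum = np.fft.rfft(erg) / n * 2.0          # single-sided amplitude spectrum
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    first = np.abs(spectrum[np.argmin(np.abs(freqs - f_chromatic))])
    second = np.abs(spectrum[np.argmin(np.abs(freqs - 2 * f_chromatic))])
    return first, second

# Toy epoch: 2 s at 1 kHz, a 4 Hz "chromatic" and an 8 Hz "luminance" component
fs, f_c = 1000.0, 4.0
t = np.arange(0, 2, 1 / fs)
erg = 1.5 * np.sin(2 * np.pi * f_c * t) + 0.8 * np.sin(2 * np.pi * 2 * f_c * t)
print(harmonic_amplitudes(erg, fs, f_c))   # approximately (1.5, 0.8)
```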

  7. Phase Space Tomography: A Simple, Portable and Accurate Technique to Map Phase Spaces of Beams with Space Charge

    SciTech Connect

    Stratakis, D.; Kishek, R. A.; Bernal, S.; Walter, M.; Haber, I.; Fiorito, R.; Thangaraj, J. C. T.; Quinn, B.; Reiser, M.; O'Shea, P. G.; Li, H.

    2006-11-27

    In order to understand charged-particle dynamics, e.g. halo formation, emittance growth, x-y energy transfer and coupling, knowledge of the actual phase space is needed. Over the past decade an increasing number of articles have used tomography to map the beam phase space and measure the beam emittance. These studies were performed at high-energy facilities where the effect of space charge was negligible and therefore not considered in the analysis. This work extends the tomography technique to beams with space charge. To simplify the analysis, linear forces were assumed. By carefully modeling the tomography process using the particle-in-cell code WARP, we test the validity of our assumptions and the accuracy of the reconstructed phase space. Finally, we report experimental results of phase-space mapping at the University of Maryland Electron Ring (UMER) using tomography.
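
    The reconstruction step common to such studies can be sketched with a standard filtered back-projection, as below (assuming scikit-image is available; this toy omits the space-charge modelling that is the paper's actual contribution): 1D beam profiles measured at several effective phase-space rotation angles are combined into a 2D (x, x') density.

      import numpy as np
      from skimage.transform import radon, iradon

      # Synthetic "true" phase-space density: a tilted Gaussian on a 128 x 128 grid.
      n = 128
      x, xp = np.meshgrid(np.linspace(-1, 1, n), np.linspace(-1, 1, n))
      density = np.exp(-((x + 0.5 * xp) ** 2 / 0.05 + xp ** 2 / 0.1))

      angles = np.linspace(0.0, 180.0, 60, endpoint=False)   # effective rotation angles (deg)
      profiles = radon(density, theta=angles)                # simulated 1D beam profiles
      reconstruction = iradon(profiles, theta=angles)        # recovered (x, x') density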

  8. Subsurface object recognition by means of regularization techniques for mapping coastal waters floor

    NASA Astrophysics Data System (ADS)

    Jiménez-Rodríguez, Luis O.; Umana-Diaz, Alejandra; Diaz-Santos, Jose; Neira-Carolina, Gerardino; Morales-Morales, Javier; Rodriguez, Eladio

    2005-10-01

    A fundamental challenge for remote sensing is mapping the ocean floor in coastal shallow waters, where variability due to the interaction between the coast and the sea can produce significant disparity in the optical properties of the water column. The objects to be detected, coral reefs, sands and submerged aquatic vegetation, have weak signals with temporal and spatial variation. In real scenarios the absorption and backscattering coefficients vary spatially because of different sources of variability (river discharge, different depths of shallow waters, water currents) and also fluctuate temporally. This paper presents the development of algorithms for retrieving information and their application to the recognition, classification and mapping of objects under coastal shallow waters. A mathematical model that simplifies the radiative transfer equation was used to quantify the interaction between the object of interest, the medium and the sensor. The retrieval of information requires the development of mathematical models and processing tools in the areas of inversion, image reconstruction and detection. The algorithms developed were applied to one set of remotely sensed data: high-resolution HYPERION hyperspectral imagery. An inverse problem arises when this spectral data is used to map the floor of shallow coastal waters. The Tikhonov method of regularization was used in the inversion process to estimate the bottom albedo of the ocean floor, using a priori information in the form of stored spectral signatures, previously measured, of objects of interest such as sand, corals and sea grass.
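
    A minimal sketch of the Tikhonov step, under the strong simplifying assumption that the radiative-transfer model can be linearised into an operator K acting on the bottom-albedo spectrum rho (all names and values below are illustrative): the prior rho0 plays the role of the stored library signatures mentioned above.

      import numpy as np

      def tikhonov_albedo(K, y, rho0, lam=0.1):
          """Minimise ||K rho - y||^2 + lam ||rho - rho0||^2 for the bottom albedo rho."""
          n = K.shape[1]
          return np.linalg.solve(K.T @ K + lam * np.eye(n), K.T @ y + lam * rho0)

      # Toy usage: 50 sensor bands, 30 albedo components, flat sand-like prior spectrum.
      rng = np.random.default_rng(0)
      K = rng.random((50, 30)) * 0.1            # assumed linearised water-column operator
      rho_true = np.linspace(0.2, 0.6, 30)      # "true" bottom albedo
      y = K @ rho_true + rng.normal(0.0, 0.005, size=50)
      rho0 = np.full(30, 0.4)                   # prior from a stored library signature
      rho_est = tikhonov_albedo(K, y, rho0)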

  9. Mapping extreme rainfall in a mountainous region using geostatistical techniques: a case study in Scotland

    NASA Astrophysics Data System (ADS)

    Prudhomme, Christel; Reed, Duncan W.

    1999-10-01

    The spatial pattern of precipitation is known to be highly dependent on meteorological conditions and relief. However, the relationships between precipitation and topography in mountainous areas are not very well known, partly because of the complex topography in these regions, and partly because of the sparsity of information available to study such relationships in high elevation areas. The purpose of the investigation was to find a method of mapping extreme rainfall in the mountainous region of Scotland, which was easy to use and to understand, and which gave satisfactory results both in terms of statistical performance and consistency with meteorological mechanisms. Among the interpolation methods described in the literature, ordinary kriging and modified residual kriging have been found attractive by reason of their simplicity and ease of use. Both methods have been applied to map an index of extreme rainfall, the median of the annual maximum daily rainfall (RMED), in the mountainous region of Scotland. Rainfall records from a network of 1003 raingauges are used, covering Scotland with uneven density. A 4-parameter regression equation developed in a previous study, relating a transformed variable of RMED to topographical variables, is used in the modified residual kriging method. Comparing the relative performances of ordinary kriging and modified residual kriging shows that the use of topographical information helps to compensate for the lack of local data from which any interpolation method, such as ordinary kriging, might suffer, thus improving the final mapping.
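
    For readers unfamiliar with the interpolation step, the sketch below implements a bare-bones ordinary-kriging predictor with an assumed exponential semivariogram; the sill and range values are placeholders, not those fitted in the study, and modified residual kriging would apply the same machinery to the residuals of the 4-parameter RMED regression rather than to RMED itself.

      import numpy as np

      def variogram(h, sill=1.0, vrange=50.0):
          """Assumed exponential semivariogram."""
          return sill * (1.0 - np.exp(-h / vrange))

      def ordinary_krige(xy_obs, z_obs, xy_new, sill=1.0, vrange=50.0):
          """Ordinary-kriging prediction at xy_new from observations (xy_obs, z_obs)."""
          n = len(z_obs)
          d = np.linalg.norm(xy_obs[:, None, :] - xy_obs[None, :, :], axis=-1)
          G = np.ones((n + 1, n + 1))
          G[:n, :n] = variogram(d, sill, vrange)          # gauge-to-gauge semivariances
          G[-1, -1] = 0.0                                  # Lagrange-multiplier row/column
          d0 = np.linalg.norm(xy_obs - xy_new, axis=-1)    # gauge-to-target distances
          b = np.append(variogram(d0, sill, vrange), 1.0)
          weights = np.linalg.solve(G, b)[:n]
          return weights @ z_obs

      # Toy usage: predict RMED (mm) at one point from three gauges (coordinates in km).
      gauges = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
      rmed = np.array([40.0, 55.0, 60.0])
      print(ordinary_krige(gauges, rmed, np.array([4.0, 4.0])))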

  10. Displaying and evaluating engineering properties and natural hazards using geomorphic mapping techniques: Telluride, Colorado

    SciTech Connect

    Gunawan, I.; Giardino, J.R.; Tchakerian, V.P. . Geography Dept.)

    1992-01-01

    Telluride, located in the San Juan Mountains of southwestern Colorado, is situated in a glacially carved, fluvially modified alpine valley. Today this chic setting is experiencing rapid urban development resulting from flourishing tourist traffic during both the winter ski season and the summer vacation period. A new development, Mountain Village, is being built on an extensive and complex landslide that has received only superficial scrutiny. Recent rapid growth is placing considerable pressure on pristine, undeveloped land. This timely quandary involves the interaction between prospective development, geomorphic processes, engineering factors, economic feasibility, and land-use adjudication. In an attempt to respond to these issues, the State of Colorado enacted Senate Bill 35 (1972) and House Bills 1034 (1974) and 1041 (1974), all mandating assessment of the natural hazards of an area preparatory to development. The key to evaluating the natural hazards is to comprehend the geomorphic processes. The area is highly faulted with associated mineralization. Whereas the upper slopes are composed of massive rhyodacitic-tuff breccias and flows, the valley is sculpted from shales, sandstones, and conglomerates. Several periods of glaciation occurred in the area. Glacial till, talus slopes, avalanche chutes and cones, rock glaciers, alluvium, and landslides have been identified in the field and mapped on aerial photographs. Many of the slopes in the area are active. The authors have constructed a geomorphic map (1:12,500) that shows geology, landforms, geomorphic processes, and engineering properties. This map can be used by regulatory agencies to identify areas of natural hazards potentially sensitive to development.